https://github.com/stormid/robotify-netcore
Provides robots.txt middleware for .NET core
- Host: GitHub
- URL: https://github.com/stormid/robotify-netcore
- Owner: stormid
- License: mit
- Created: 2017-07-20T08:05:40.000Z (almost 8 years ago)
- Default Branch: master
- Last Pushed: 2022-08-26T20:57:47.000Z (almost 3 years ago)
- Last Synced: 2025-03-10T10:03:51.603Z (4 months ago)
- Topics: netcore, robots-txt
- Language: C#
- Size: 70.3 KB
- Stars: 16
- Watchers: 13
- Forks: 5
- Open Issues: 2
Metadata Files:
- Readme: README.md
- License: LICENSE
README
# Robotify.AspNetCore
Robotify - robots.txt middleware for .NET Core (netstandard2.0)
## Installation
Install package via NuGet:
```powershell
Install-Package Robotify.AspNetCore
```

## Configuration

Add Robotify to the middleware pipeline like any other component. First, register it in the `ConfigureServices` method so that the middleware and its configuration are added to the services collection:
```c#
public void ConfigureServices(IServiceCollection services)
{
services.AddOptions();
services.AddRobotify();
services.AddMvc();
}
```

To configure which robots directives appear in your robots file, create and register an `IRobotifyRobotGroupProvider`. Multiple providers can be registered in the standard DI container; each provider is then iterated when generating the contents of the robots file.

An `IRobotifyRobotGroupProvider` contains a single method that returns a list of `RobotGroup`. Below is an example of a custom provider:
```c#
public class YourOwnProvider : IRobotifyRobotGroupProvider
{
public IEnumerable<RobotGroup> Get()
{
yield return new RobotGroup()
{
UserAgent = "something",
Disallow = new[] { "/" }
};
}
}
```

To register a provider, use the `.AddRobotGroupProvider()` extension method when adding Robotify:
```c#
public void ConfigureServices(IServiceCollection services)
{
services.AddRobotify(c => c
.AddDisallowAllRobotGroupProvider()
.AddRobotGroupsFromAppSettings()
.AddRobotGroupProvider()
);
}
```

Then add the middleware to the request pipeline. I suggest placing it after the `.UseStaticFiles()` middleware, so that if a static robots.txt file exists it will be served instead:
```c#
public void Configure(IApplicationBuilder app, IHostingEnvironment env, ILoggerFactory loggerFactory)
{
app.UseStaticFiles();
app.UseRobotify();
app.UseMvc(routes =>
{
routes.MapRoute(
name: "default",
template: "{controller=Home}/{action=Index}/{id?}");
});
}
```

Robotify can be configured within the `.UseRobotify()` method or via JSON configuration:
```json
{
"Robotify": {
"Enabled": true,
"SitemapUrl": "https://www.example.com/sitemap.xml",
"CrawlDelay": 10,
"Groups": [
{
"UserAgent": "*",
"Disallow": "/"
},
{
"UserAgent": "Googlebot",
"Disallow": ["/admin"]
},
{
"UserAgent": "AnotherBot",
"Allow": ["/search"]
}
]
}
}
```

JSON configuration and code configuration will be merged at boot time. If no groups are specified, a default deny-all group will be added.
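As an illustration (this sample is not taken from the project's documentation, and exact ordering, directive placement, and formatting may differ), a configuration like the JSON above would produce a robots.txt along these lines:

```
Sitemap: https://www.example.com/sitemap.xml

User-agent: *
Disallow: /
Crawl-delay: 10

User-agent: Googlebot
Disallow: /admin

User-agent: AnotherBot
Allow: /search
```

You can check the actual output by requesting `/robots.txt` from your running application.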