# Laravel Robots

Laravel package to manage robots easily.

If you need a detailed explanation of how the robots.txt file works, visit http://www.robotstxt.org/robotstxt.html
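For reference, a robots.txt file is a plain-text list of directives served at the root of a site. A minimal example (the rules below are purely illustrative) looks like this:

```txt
User-agent: *
Disallow: /admin
Sitemap: https://example.com/sitemap.xml
```

This package builds exactly this kind of output programmatically, so you never have to hand-edit the file per environment.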

Buy Me A Coffee

[![Scrutinizer Code Quality](https://scrutinizer-ci.com/g/mguinea/laravel-robots/badges/quality-score.png?b=master)](https://scrutinizer-ci.com/g/mguinea/laravel-robots/?branch=master)
[![Code Coverage](https://scrutinizer-ci.com/g/mguinea/laravel-robots/badges/coverage.png?b=master)](https://scrutinizer-ci.com/g/mguinea/laravel-robots/?branch=master)
[![Build Status](https://scrutinizer-ci.com/g/mguinea/laravel-robots/badges/build.png?b=master)](https://scrutinizer-ci.com/g/mguinea/laravel-robots/build-status/master)
[![StyleCI](https://styleci.io/repos/143919791/shield?branch=master)](https://styleci.io/repos/143919791)
[![License MIT](https://img.shields.io/badge/License-MIT-yellow.svg)](https://opensource.org/licenses/MIT)
[![Laravel](https://img.shields.io/badge/Laravel-8-orange.svg)](http://laravel.com)

This package allows you to manage your site's robots.txt dynamically, letting you differentiate between environments or configurations.

The migration that persists the configuration is optional; you can swap in a different data source.

Once the package is installed you can do things like this:

```php
Route::get('robots.txt', function() {
    $robots = new \Mguinea\Robots\Robots;

    // If on the live server
    if (App::environment() == 'production') {
        $robots->addUserAgent('*')->addSitemap('sitemap.xml');
    } else {
        // If you're on any other server, tell everyone to go away.
        $robots->addDisallow("/");
    }

    return response($robots->generate(), 200)->header('Content-Type', 'text/plain');
});
```

### Installing

You can install via Composer.

```bash
composer require mguinea/laravel-robots
```

## Running the tests

Just execute

```bash
vendor/bin/phpunit
```

The unit tests cover all methods of the Robots class and its related facade.

## Usage

### 1. Dynamically

You can use Robots in the routes file to generate a dynamic response:

```php
Route::get('robots.txt', function() {
    $robots = new \Mguinea\Robots\Robots;

    // If on the live server
    if (App::environment() == 'production') {
        $robots->addUserAgent('*')->addSitemap('sitemap.xml');
    } else {
        // If you're on any other server, tell everyone to go away.
        $robots->addDisallow("/");
    }

    return response($robots->generate(), 200)->header('Content-Type', 'text/plain');
});
```
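In production, the route above would return a plain-text response along these lines (exact formatting may vary with the package version):

```txt
User-agent: *
Sitemap: sitemap.xml
```

On any other environment, the response would instead contain a single `Disallow: /` rule, keeping crawlers away from staging and development sites.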

### 1.1. Dynamically with facade

You can use the Robots facade in the routes file to generate a dynamic response:

```php
use Mguinea\Robots\Facades\Robots;

Route::get('robots.txt', function() {
    Robots::addUserAgent('*')->addSitemap('sitemap.xml');

    return response(Robots::generate(), 200)->header('Content-Type', 'text/plain');
});
```

### 2. To robots.txt default file

If you prefer to write the default robots.txt file to disk, just use the generator as you have seen, for example from a command or scheduled job:

```php
use Illuminate\Support\Facades\File;
use Mguinea\Robots\Robots;

class GenerateRobotsFile
{
    public function handle(Robots $robots)
    {
        $robots->addUserAgent('*')->addSitemap('sitemap.xml');

        File::put(public_path('robots.txt'), $robots->generate());
    }
}
```

### 3. Building from Data Source

You may prefer to build it from a data source. To do so, instantiate the Robots object with an array of key-value parameters, as shown below.

Note that comments and spacers have been removed.

```php
use Mguinea\Robots\Robots;

class RobotsController
{
    public function index()
    {
        $robots = new Robots([
            'allows' => [
                'foo', 'bar'
            ],
            'disallows' => [
                'foo', 'bar'
            ],
            'hosts' => [
                'foo', 'bar'
            ],
            'sitemaps' => [
                'foo', 'bar'
            ],
            'userAgents' => [
                'foo', 'bar'
            ],
            'crawlDelay' => 10
        ]);

        return response($robots->generate(), 200)->header('Content-Type', 'text/plain');
    }
}
```
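Each array value maps to one directive in the output, so a configuration like the one above would generate content of this shape (exact ordering and spacing depend on the package):

```txt
User-agent: foo
User-agent: bar
Allow: foo
Allow: bar
Disallow: foo
Disallow: bar
Host: foo
Host: bar
Sitemap: foo
Sitemap: bar
Crawl-delay: 10
```

This makes it straightforward to load the rules from a database table or config file and feed them straight into the constructor.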

### Methods

You can use the Robots class methods individually or chained.

Remember that you can use the facade to avoid instantiation.

```php
// Add an allow rule to the robots. Allow: foo
$robots->addAllow('foo');

// Add multiple allow rules to the robots. Allow: foo Allow: bar
$robots->addAllow(['foo', 'bar']);
```

```php
// Add a comment line to the robots. # foo
$robots->addComment('foo');
```

```php
// Add a disallow rule to the robots. Disallow: foo
$robots->addDisallow('foo');

// Add multiple disallow rules to the robots. Disallow: foo Disallow: bar
$robots->addDisallow(['foo', 'bar']);
```

```php
// Add a host to the robots. Host: foo
$robots->addHost('foo');

// Add multiple hosts to the robots. Host: foo Host: bar
$robots->addHost(['foo', 'bar']);
```

```php
// Add a sitemap to the robots. Sitemap: foo
$robots->addSitemap('foo');

// Add multiple sitemaps to the robots. Sitemap: foo Sitemap: bar
$robots->addSitemap(['foo', 'bar']);
```

```php
// Add a blank line to the robots.
$robots->addSpacer();
```

```php
// Add a User-agent to the robots. User-agent: foo
$robots->addUserAgent('foo');

// Add multiple User-agents to the robots. User-agent: foo User-agent: bar
$robots->addUserAgent(['foo', 'bar']);
```

```php
// Add a crawl delay to the robots. Crawl-delay: 10
$robots->addCrawlDelay(10);
```

```php
// Generate the robots.txt content as a string.
$robots->generate();
```

```php
// Reset the robots content.
$robots->reset();
```

## Built With

* [Laravel](https://laravel.com/) - The web framework
* [Composer](https://getcomposer.org/) - Dependency manager

## Contributing

Please read [CONTRIBUTING.md](CONTRIBUTING.md) for details on our code of conduct, and the process for submitting pull requests.

## Security

If you discover any security related issues, please email [email protected] instead of using the issue tracker.

## Versioning

We use [SemVer](http://semver.org/) for versioning. For the versions available, see the [tags on this repository](https://github.com/mguinea/laravel-robots/tags).

## License

This project is licensed under the MIT License - see the [LICENSE](LICENSE.md) file for details.

## Authors

* **Marc Guinea** [MarcGuinea](https://www.marcguinea.com)