robots.txt

Add default robots.txt that allows bots access to all paths.
Add mix task to generate robots.txt that allows bots access to no paths.
Document custom emojis, MRF and static_dir
static_dir documentation includes docs for the robots.txt Mix task.
William Pearson 2019-01-20 01:44:00 +00:00
commit 3dadaa4432
7 changed files with 191 additions and 2 deletions

docs/static_dir.md Normal file

@@ -0,0 +1,20 @@
# Static Directory
Static frontend files are shipped in `priv/static/` and tracked by version control in this repository. If you want to overwrite or update these without the possibility of merge conflicts, you can write your custom versions to `instance/static/`.
```
config :pleroma, :instance,
static_dir: "instance/static/",
```
You can override this value in your configuration to point Pleroma at a different instance static directory.
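For example, if you keep your overrides somewhere like `/var/lib/pleroma/static/` (an illustrative path, not a requirement), the setting would look like this:
```
config :pleroma, :instance,
  static_dir: "/var/lib/pleroma/static/",
```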
## robots.txt
By default, the `robots.txt` that ships in `priv/static/` is permissive. It allows well-behaved search engines to index all of your instance's URIs.
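In standard `robots.txt` syntax, a permissive policy like this boils down to an empty `Disallow` rule, roughly equivalent to the snippet below (check `priv/static/robots.txt` for the exact file shipped with your version):
```
User-Agent: *
Disallow:
```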
If you want to generate a restrictive `robots.txt` instead, you can run the following mix task. The generated `robots.txt` will be written to your instance static directory.
```
mix pleroma.robots_txt disallow_all
```
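The task writes a disallow-all policy in standard `robots.txt` syntax, so the generated file should be equivalent to the following (shown for illustration):
```
User-Agent: *
Disallow: /
```
Keep in mind that `robots.txt` is advisory: it keeps well-behaved crawlers away but does not restrict access for clients that ignore it.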