Mastodon: Add a robots.txt to prevent indexing

Created on 11 Apr 2017 · 2 comments · Source: tootsuite/mastodon

Using a decentralized social network is good for privacy: no central company can exploit users' private data. But if the various instances allow search engines to index them, that can be a problem. So I propose adding a "robots.txt" file at the root of each instance to forbid all common search engines from indexing it. Toots would then not be searchable from outside.
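A minimal sketch of what the proposed file might contain (the standard robots.txt syntax for disallowing all crawlers; whether Mastodon should ship this by default is exactly what the comments below dispute):

```
# Served at https://example.instance/robots.txt
# Asks all well-behaved crawlers not to index any path on this instance.
User-agent: *
Disallow: /
```

Note that robots.txt is advisory only: compliant crawlers such as Googlebot honor it, but it does not technically prevent access to toots.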

not actionable


All 2 comments

I think this is probably something that each instance admin should decide for their own instance, and then manually add a robots.txt in their nginx config, if they choose to have one. Doing it globally would probably just result in some admins having to manually remove it again, for example I prefer to have my toots in the Google index.
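For admins who do want this, the manual nginx approach the commenter describes could look something like the following (a hypothetical server-block excerpt; the location and response body are illustrative, not part of Mastodon's shipped config):

```nginx
# Inside the instance's server { } block: serve a blanket-disallow
# robots.txt directly from nginx, without touching the Mastodon app.
location = /robots.txt {
    add_header Content-Type text/plain;
    return 200 "User-agent: *\nDisallow: /\n";
}
```

Because this lives in the admin's own nginx config, each instance can opt in or out independently, which is the point being made here.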

I'm pretty sure this should be up to the instance admin, yeah. I'm marking this as invalid and closing.
