Using a decentralized social network is good for privacy: no central company can exploit users' private data. But if the various instances allow search engines to index them, that can be a problem. So I propose adding a "robots.txt" file at the root of every Mastodon instance to forbid all common search engines from indexing it. Toots would then not be searchable from outside.
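For reference, a minimal robots.txt that asks every well-behaved crawler to skip the whole instance would look like this:

```
# robots.txt served at the instance root
User-agent: *
Disallow: /
```

Note that robots.txt is advisory: compliant crawlers like Googlebot honor it, but it does not technically prevent access to public toots.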
I think this is probably something that each instance admin should decide for their own instance, and then manually add a robots.txt in their nginx config, if they choose to have one. Doing it globally would probably just result in some admins having to manually remove it again; for example, I prefer to have my toots in the Google index.
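For admins who do want this, a sketch of the nginx approach mentioned above (assuming a typical Mastodon `server` block; adjust to your setup) might be:

```nginx
# Inside the instance's server { } block: serve a robots.txt
# that disallows all crawlers, without modifying Mastodon itself.
location = /robots.txt {
    default_type text/plain;
    return 200 "User-agent: *\nDisallow: /\n";
}
```

With `return`, nginx takes the response's Content-Type from `default_type`, so setting it to `text/plain` keeps crawlers happy.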
I'm pretty sure this should be up to the instance admin, yeah. I'm marking this as invalid and closing.