Consider Adding Basic robots.txt to DNN root
This isn't a bug fix, and nothing is broken, but how about adding a basic robots.txt file to the DNN root? This is an old item that has been discussed before, and a generic version would be a good addition to the basics. The file is essentially the setup from the dnnsoftware.com site.
I know that sitemap.aspx and the sitemap provider are preferred, but in every project we add a basic starter robots.txt file, especially on a single-domain/portal site where we can reference sitemap.aspx.
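For illustration, a minimal starter file might look like the sketch below. The disallowed paths are typical DNN system folders and the domain is a placeholder; this is an assumption on my part, not the exact dnnsoftware.com file.

```
User-agent: *
Disallow: /admin/
Disallow: /App_Code/
Disallow: /App_Data/
Disallow: /bin/
Disallow: /Install/
Disallow: /Providers/

Sitemap: https://www.example.com/sitemap.aspx
```

On a single-domain site the Sitemap line would point at the default portal alias; on a multi-portal instance it would likely be omitted or commented out.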
Posted as DNN-3909 in DNN Tracker
So, my thought is to include a basic, generic robots.txt file in the root by default. This covers the basics so that one is always present. It would help non-technical administrators with basic site management (SEO companies reviewing a DNN site often request one as well), and it would also serve as a starting point for more advanced host users to edit and build on.
To take it a step further, it would be awesome if the system could edit this file automatically. On a single-portal DNN instance, the Admin Settings page could update robots.txt whenever the default portal alias changes:
If there is one portal, set the default portal alias domain automatically in the URL on the sitemap.aspx line of robots.txt.
If there are multiple portals, or when a "create new portal" step is taken, comment out the sitemap.aspx line in robots.txt.
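The toggling logic described above could be sketched as follows. This is an illustrative Python sketch of the idea, not DNN's actual API or implementation; the function name and parameters are my own assumptions.

```python
from pathlib import Path

def update_robots_sitemap(robots_path, default_alias, portal_count):
    """Rewrite the Sitemap line of robots.txt: set it to the default
    portal alias for a single-portal install, or comment it out when
    multiple portals share the instance (hypothetical helper)."""
    lines = Path(robots_path).read_text().splitlines()
    out = []
    for line in lines:
        # Detect the Sitemap line even if it was previously commented out.
        stripped = line.lstrip("# ").strip()
        if stripped.lower().startswith("sitemap:"):
            url = f"Sitemap: https://{default_alias}/sitemap.aspx"
            # Single portal: point at the default alias; otherwise comment out.
            out.append(url if portal_count == 1 else "# " + url)
        else:
            out.append(line)
    Path(robots_path).write_text("\n".join(out) + "\n")
```

The Admin Settings save handler (or a "create new portal" step) would call this with the current portal count, so the file stays consistent without manual edits.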