Why are some pages blocked in the robots.txt file?
The robots.txt file we provide is optimized to achieve the highest possible search engine ranking for your forum. It tells search engines not to crawl certain pages that would negatively affect placement: pages with very little content (like the Compose New Topic page), URLs that differ slightly but point to the same content (which search engines would count as duplicate content), and transactional member-only pages (like following a topic or editing your profile). This ensures that search engines primarily see the highly relevant, content-rich pages of your forum, so it ranks higher in search results.
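As an illustration, the kinds of rules described above might look like the following in a robots.txt file. The paths here are hypothetical examples, not the actual rules used for your forum (the wildcard `*` in paths is an extension supported by major crawlers such as Googlebot, not part of the original robots.txt standard):

```
User-agent: *
# Thin-content page (e.g. Compose New Topic)
Disallow: /post/new
# Transactional member-only pages
Disallow: /profile/edit
Disallow: /topic/follow
# Alternate URLs that point to the same content (duplicate content)
Disallow: /*?sort=
```

`User-agent: *` applies the rules to all crawlers, and each `Disallow` line tells them not to fetch URLs beginning with that path.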