How to create a simple robots.txt file for your website
You might be amazed to hear that the small text file known as “robots.txt” could be the downfall of your website. If you get the file wrong, you could end up telling search engine robots not to crawl your site at all, meaning your pages won’t appear in search results.
The robots.txt file is a simple text file (no HTML) that must be placed at the root of your web server.
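For example, for a site on a hypothetical domain, the file would be reachable at the root URL:

```
https://www.example.com/robots.txt
```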
Benefits of using a robots.txt file
- It can prevent search engines from finding the restricted pages of your website.
- It can increase the importance of the pages that are allowed to be indexed.
- You can tell search engines where your sitemap file is located.
- It saves bandwidth and keeps your logs clean.
- It can stop duplicate content from being indexed.
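As a quick sketch of the first benefit, a robots.txt can ask crawlers to stay out of a restricted area of the site (the directory name here is a placeholder):

```
User-agent: *
Disallow: /members-area/
```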
How to create and upload robots.txt
- Create a blank file in Notepad called robots and save it as plain text (*.txt).
- Write or copy in the rules you want and save.
- Upload the robots.txt file to the root of your server via FTP (File Transfer Protocol).
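The steps above can be sketched from the command line as well. This is only an illustration: the rules are a minimal placeholder, and the host and credentials in the upload line are hypothetical, so that line is left commented out:

```shell
# Create the robots.txt file locally with a minimal placeholder rule set
printf 'User-agent: *\nDisallow:\n' > robots.txt

# Upload it to the site root over FTP -- replace host and credentials with your own
# curl --user USERNAME:PASSWORD -T robots.txt ftp://www.example.com/
```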
Allow your robots access to everything
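A minimal robots.txt that allows all crawlers to access everything looks like this (an empty Disallow line means nothing is blocked):

```
User-agent: *
Disallow:
```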
Referencing a sitemap in your robots.txt
If you have a sitemap, you can tell search engines where it is with a Sitemap directive.
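For example, assuming the sitemap sits at the site root (the URL is a placeholder):

```
Sitemap: https://www.example.com/sitemap.xml
```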
Content management systems and robots.txt
If you are using WordPress or another content management system, you can use plugins to create your robots.txt and sitemap files. However, because every CMS works differently, no single robots.txt file can cover them all.
Some common mistakes and things to remember
- The robots.txt file is case sensitive, and the file name is all lowercase.
- Keep one rule per line.
- Anyone can view your robots.txt file, so be careful about what you include.
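For instance, this sketch keeps one directive per line and matches the exact, case-sensitive paths on the server; the directory names are hypothetical:

```
User-agent: *
Disallow: /private/
Disallow: /old-site/
```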