Robots.txt is a special text file that web developers use to tell search engines how to crawl and index their site correctly.
Most websites have directories and pages that should not be indexed by search engines: for example, printable versions of pages, security-related pages (registration, authentication), administrator areas, and various technical folders.
In addition, webmasters may want to give search engines extra information about indexing, such as the location of the sitemap.xml file.
All these tasks are handled by the robots.txt file. It is a plain text file in a specific format that you place in the root directory of your website so that web crawlers know how to index the site's contents properly. The full specification of the file format can be found on the Google Developers portal.
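As an illustration, a minimal robots.txt covering the cases above might look like this (the directory names and sitemap URL are hypothetical examples, not required values):

```
# Rules for all crawlers
User-agent: *
# Keep technical and security-related areas out of the index
Disallow: /admin/
Disallow: /auth/
# Exclude printable versions of pages
Disallow: /print/

# Point crawlers to the sitemap
Sitemap: https://www.example.com/sitemap.xml
```

Each `User-agent` group applies to the crawlers it names (`*` matches all of them), and `Disallow` rules list path prefixes those crawlers should not fetch.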
To learn more, read about Robots.txt in our blog.