Why do we use a robots.txt file?

02-15-2017, 04:14 PM
Why do we use a robots.txt file?

02-15-2017, 07:11 PM
The robots.txt file is found in the root directory of a website. As its name indicates, it is made for robots, also called crawlers or web robots.

Dennis Miller
02-22-2017, 12:10 PM
The content of a robots.txt file consists of so-called "records". Each record has a User-agent line followed by one or more Disallow lines, and the Disallow value is matched as a path prefix. So with "Disallow: /support", both "/support-desk/index.html" and "/support/index.html", as well as all other files in the "support" directory, would not be crawled by search engines. If you leave the Disallow line blank, you're telling the search engine that all files may be crawled.
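A minimal example of such a record might look like the following (the "/support" path is just an illustration, not from any particular site). Because matching is by prefix, this single rule covers "/support/", "/support-desk/", and anything else beginning with "/support":

```text
# Applies to all robots
User-agent: *
# Blocks every URL whose path starts with /support
Disallow: /support
```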

03-31-2017, 12:45 AM
To tell bots which directories they may or may not access.

04-26-2017, 05:42 PM
One of the main uses of the robots.txt file is to tell web crawlers which files they may access, so that private files are kept out of the crawl.

05-01-2017, 08:23 PM
In a nutshell: website owners use the /robots.txt file to give instructions about their site to web robots; this is called the Robots Exclusion Protocol. The line "User-agent: *" means the section applies to all robots, and "Disallow: /" tells the robot that it should not visit any pages on the site.
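To make the two directives concrete, here are the two extreme robots.txt files this post describes. These are alternative files, not two records to combine in one file:

```text
# File 1: block all robots from the whole site
User-agent: *
Disallow: /

# File 2: allow all robots everywhere (empty Disallow)
User-agent: *
Disallow:
```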

05-19-2017, 11:57 PM
It's a simple text file that tells crawlers such as Google which pages they may crawl and which they should skip.
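You can check how a crawler would interpret these rules with Python's standard-library `urllib.robotparser`. This is a sketch using an inline robots.txt string (the "/support" rule is a made-up example, matching the record discussed earlier in the thread):

```python
from urllib import robotparser

# Example robots.txt content (hypothetical rule, for illustration only)
rules = """User-agent: *
Disallow: /support
"""

# Parse the rules as a crawler would
rp = robotparser.RobotFileParser()
rp.parse(rules.splitlines())

# Prefix matching: both /support/ and /support-desk/ are blocked
print(rp.can_fetch("*", "/index.html"))          # allowed
print(rp.can_fetch("*", "/support/index.html"))  # blocked
print(rp.can_fetch("*", "/support-desk/index.html"))  # blocked
```

In real use you would call `rp.set_url("https://example.com/robots.txt")` and `rp.read()` instead of parsing an inline string.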