Question:

How can I block my page in Google searches using a robots.txt file?

by Guest4961  |  12 years, 9 month(s) ago


I don’t want Google to index or crawl my pages, and I’ve heard that a robots.txt file can be used for that, but I don’t know how. Please help.

 Tags: block, File, Google, page, robots.txt, searches, Using


1 ANSWER

  1. Guest9336
    A robots.txt file restricts access to your site by search engine robots (crawlers). These bots are automated; before they access the pages of a site, they check whether a robots.txt file exists at the site's root that tells them which pages they are not allowed to crawl.
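As a sketch of what such a file might look like (the paths here are only placeholders), a robots.txt that blocks Google's crawler from a specific directory while blocking all other bots from the whole site could read:

```
# Block Googlebot from one directory (example path)
User-agent: Googlebot
Disallow: /private/

# Block all other crawlers from the entire site
User-agent: *
Disallow: /
```

The file must be named exactly robots.txt and placed at the root of the site (e.g. https://example.com/robots.txt). Note that robots.txt only asks well-behaved bots not to crawl; a page can still appear in Google's index if other sites link to it, so to keep a page out of search results entirely Google also supports a `noindex` robots meta tag on the page itself.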


Question Stats

Latest activity: 13 years, 2 month(s) ago.
This question has 1 answer.
