Web site owners can hide pages from search engines using directives in robots.txt. robots.txt is a plain text file located in the root directory of a site. It controls which pages a robot may index, i.e. you can tell search-engine crawlers not to spider a particular page or directory. By using the 'Disallow' directive you can block any URL on your blog from being crawled by search engines.
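As a sketch of what such a file might look like (the paths here are hypothetical, just for illustration), a minimal robots.txt could be:

```
User-agent: *
Disallow: /private/
Disallow: /admin.html
```

The `User-agent: *` line means the rules apply to all crawlers; each `Disallow` line names a path that compliant robots should not fetch.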
We will use the robots.txt file itself to discover the pages and content a site owner has tried to hide, since the file publicly lists exactly which paths crawlers are told to avoid.
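To see how a crawler interprets these rules, Python's standard library includes `urllib.robotparser`. The snippet below parses a hypothetical set of rules (not from any real site) and checks which URLs a robot is allowed to fetch:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt rules, for illustration only
rules = """User-agent: *
Disallow: /private/
""".splitlines()

rp = RobotFileParser()
rp.parse(rules)

# A path under /private/ is blocked for all user agents
print(rp.can_fetch("*", "http://example.com/private/page.html"))  # False

# Anything else is allowed
print(rp.can_fetch("*", "http://example.com/public/page.html"))   # True
```

Note that robots.txt is only a convention: well-behaved crawlers honor it, but nothing technically prevents access to the disallowed paths, which is exactly why the file can reveal "hidden" content.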