Half Million Dollars On Robots.txt File
By: Andy Beal
Good grief, Reuters is reporting that a group of publishers is planning to spend $583,700 to find a way to prevent Google... from indexing their content!
"Since search engine operators rely on robotic 'spiders' to manage their automated processes, publishers' Web sites need to start speaking a language which the operators can teach their robots to understand," according to a document seen by Reuters that outlines the publishers' plans.
Google already offers a way for publishers to have their content removed from its index. Better yet, simply modify your robots.txt file to disallow the search engines, and that $583,700 can go to a charity that helps paranoid publishers! ;-)
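For the record, keeping compliant crawlers out does not require a research budget. Two lines in a robots.txt file at the site root tell every well-behaved spider to stay away, and a named User-agent group targets a single crawler such as Googlebot (the Robots Exclusion Protocol is advisory, so it works because the major engines honor it):

```
# Block every compliant crawler from the entire site
User-agent: *
Disallow: /

# Or, to block only Google's crawler while allowing others:
User-agent: Googlebot
Disallow: /
```

The file must live at the top level of the domain (e.g. example.com/robots.txt) for crawlers to find it.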
About the Author:
Andy Beal is an internet marketing consultant and is considered one of the world's most respected search engine marketing experts. Andy has worked with many Fortune 1000 companies, such as Motorola, CitiFinancial, Lowe's, Alaska Air, DeWALT, NBC and Experian.
You can read his internet marketing blog at Marketing Pilgrim and reach him at [email protected].
is an iEntry, Inc. ® publication
All Rights Reserved.