Internet Financial News



Half Million Dollars On Robots.txt File

By: Andy Beal
2006-09-25

Good grief, Reuters is reporting that a group of publishers is planning to spend $583,700 to find a way to prevent Google from indexing their content!


"Since search engine operators rely on robotic 'spiders' to manage their automated processes, publishers' Web sites need to start speaking a language which the operators can teach their robots to understand," according to a document seen by Reuters that outlines the publishers' plans.

What the..?

Google already offers a way for publishers to have their content removed. Better yet, modify your robots.txt file to disallow the search engines and donate that money to a charity that helps paranoid publishers! ;-)
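For the record, the robots.txt fix costs nothing: it's a plain text file sitting at the root of your site. A minimal example (assuming the goal is to keep Googlebot, or every compliant crawler, away from the entire site) would look something like this:

    # Block Google's crawler from the whole site
    User-agent: Googlebot
    Disallow: /

    # Or block every well-behaved crawler
    User-agent: *
    Disallow: /

Drop that at yoursite.com/robots.txt and any spider that respects the standard will stay out. Total cost: $0.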


About the Author:
Andy Beal is an internet marketing consultant and is considered one of the world's most respected and interactive search engine marketing experts. Andy has worked with many Fortune 1000 companies such as Motorola, CitiFinancial, Lowe's, Alaska Air, DeWALT, NBC and Experian.

You can read his internet marketing blog at Marketing Pilgrim and reach him at [email protected].
