Hackers turn to Google to find weakest links

The following article appeared in New Scientist magazine, dated 2.8.03.

Computer hackers have adopted a startling strategy in their attempts to break into websites. By using the popular search engine Google, they don't have to visit a site to plan an attack. Instead they can get all the information they need from Google's cached versions of web pages, say experts in the US.

One way that hackers can break into a website is by hunting for private pages that contain the user names and passwords required to access secure parts of the site. These pages are usually hidden from the casual browser because there are no hyperlinks to them on the web. But sometimes websites contain hidden hyperlinks or indexes that point to these private sites. These links may be inserted by faulty software, or they may be created by the owner for temporary use and later forgotten or not properly deleted. Either way, they are serious security loopholes.

Hackers usually hunt for these private pages by trial and error, an activity that an alert webmaster can spot by monitoring traffic on supposedly private parts of the site. But search engines can now make this kind of trawling unnecessary, says Johnny Long, a professional hacker based in the US who is hired by companies to test their security.

Search engines build their databases by systematically following the links they find on web pages. The search engine then records the contents of each page. So if a website contains a link to a sensitive page, a search engine will record it.
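The link-following behaviour described above can be sketched in a few lines of Python. This is an illustrative toy, not a real crawler: it parses a hard-coded page rather than fetching anything over the network, and the page content and filenames are invented.

```python
# Toy illustration of how a crawler discovers pages: it extracts every
# link on a page and would queue each target for fetching and caching.
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collects the href of every <a> tag encountered on a page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

# A page whose author left behind a stray link to a "private" area
# (hypothetical filenames, for illustration only).
page = """
<html><body>
  <a href="/products.html">Products</a>
  <a href="/admin/passwords.html">TEMP - remove before launch!</a>
</body></html>
"""

parser = LinkExtractor()
parser.feed(page)
# Every link found -- including the forgotten private one -- ends up
# recorded, which is how a sensitive page reaches a search index.
print(parser.links)
```

A real crawler would then fetch each discovered URL, store the page contents, and repeat, which is why a single forgotten link is enough to expose a private page.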

These pages would still be hard to find if web servers did not often use the same names for files that contain passwords and other sensitive information. For example, one common source of passwords is the shell history file ".bash_history", which records the commands a user has typed, sometimes including passwords entered on the command line. Long says an obvious combination of search terms would include "bash history", "temporary" and "password".

Since Google makes its cached pages available, hackers can access this information without alerting a webmaster, even if the data has since been removed from the web. Long plans to outline the technique this week at Defcon, the annual hackers' conference in Las Vegas.

Google says it bears no responsibility for the way the information it collects is used. "Our search tools are very useful to researchers. There is not a lot we can do to prevent hacking," says a company spokesman.

The responsibility for securing a site lies with the people operating it, says Danny Sullivan, editor of the website SearchEngineWatch.com: "Search engines make it easier for everyone to gain information, hackers included."

http://www.newscientist.com


(NB) Editor's Note -
There are many commands and comments that can be embedded in the head section of a page. These two META tags should suffice to start the budding web builder off on the right track.

META name="revisit-after" content="14 days" - Tells robots/spiders to revisit the page after the specified time period (note that many modern crawlers ignore this tag).

META name="robots" content="noarchive" - Tells robots/spiders NOT to cache the page bearing this line.
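Written out in full HTML syntax, here is how those two tags would sit in a page's head section (the page title and body are placeholders; the "noarchive" directive is the standard way to ask search engines not to keep a cached copy, while "revisit-after" is honoured by few crawlers today):

```html
<html>
<head>
  <title>Example page</title>
  <!-- Ask robots/spiders to revisit after a specific time period. -->
  <meta name="revisit-after" content="14 days">
  <!-- Ask robots/spiders not to keep a cached copy of this page. -->
  <meta name="robots" content="noarchive">
</head>
<body>
  ...
</body>
</html>
```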
