The following tip is one of a series on why and how to perform security scans against your public-facing servers using Google. Return to the main series page for the complete list of tips.
Putting countermeasures in place can help you keep sensitive information away from Google and out of the reach of Google hackers. Here are four critical steps to follow:
1. Harden your public servers from the elements.
It's sad but true -- many critical servers are still completely exposed on the Internet. Tighten down your server access controls, and get those critical servers behind a firewall.
2. Set your robots.txt file to disallow Google.
You can protect Web server files and directories from Google hackers by setting the "User-agent:" line to "googlebot" and listing each directory you want kept out of the index on its own "Disallow:" line.
Or, if you'd like to keep all Web robots off your site, set "User-agent:" to "*" -- keeping in mind that the bad guys poking around on your Web server can read this file and see exactly what you don't want indexed. If this seems like a Web security weakness, it is. A robots.txt file isn't required, but if you use one, either allow robots to crawl only specific public pages or disallow crawling altogether by entering "Disallow: /", which blocks everything starting at the root directory.
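As a quick sanity check, Python's standard-library urllib.robotparser module can confirm that a robots.txt file actually blocks Googlebot from the paths you care about. This is a minimal sketch -- the /admin/ and /backups/ directories are placeholders, not recommendations:

```python
import urllib.robotparser

# Hypothetical robots.txt content -- the Disallow paths below are
# placeholders for whatever you actually want kept out of Google.
ROBOTS_TXT = """\
User-agent: googlebot
Disallow: /admin/
Disallow: /backups/
"""

parser = urllib.robotparser.RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

# Googlebot is blocked from the listed directories...
print(parser.can_fetch("googlebot", "/admin/config.php"))  # False
# ...but public pages remain crawlable.
print(parser.can_fetch("googlebot", "/index.html"))        # True
```

To test the file you actually serve, point the parser at the live copy with parser.set_url("https://example.com/robots.txt") followed by parser.read().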
3. Keep sensitive information off of public servers.
Make it an organizational policy to keep confidential information (such as passwords, sensitive files, and so on) off of publicly accessible servers. If it must reside there, protect it using common-sense access controls wherever possible, and make sure management follows through when these policies are violated.
4. Make sure your servers remain secure.
To maintain server security, perform ongoing ethical hacks using the Google testing tools and queries I've noted in this tips series.
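For repeatable checks, it helps to script the queries themselves. The site:, filetype:, intitle: and inurl: operators are standard Google search syntax, but this particular helper and its query list are just an illustration of the idea:

```python
def google_dork_queries(domain):
    """Return example Google queries that surface exposed content on a domain."""
    return [
        f"site:{domain} filetype:xls",          # spreadsheets left on the server
        f'site:{domain} intitle:"index of"',    # open directory listings
        f"site:{domain} inurl:admin",           # admin interfaces in URLs
    ]

for query in google_dork_queries("example.com"):
    print(query)
```

Paste each query into Google against your own domain, and keep the list under version control so every scan covers the same ground.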
Keep in mind that although these tests are good for digging through Google, they're not the be-all and end-all of ethical hacking or Internet security. There is no single best tool for finding all system vulnerabilities. Instead, you must use a "layered" testing approach: Use Google along with other freeware, open source and -- most comprehensive and dependable in my opinion -- commercial tools like SPI Dynamics Inc.'s WebInspect (for Web applications), Application Security Inc.'s AppDetective (for Web databases) and Qualys Inc.'s QualysGuard (for OS and network-level vulnerabilities).
If ethical hacking, penetration testing and general network security auditing are part of your job duties, these Google hacking techniques and tools need to be part of your security toolbox. Do it now and do it often -- for security's sake.
About the author: Kevin Beaver is an independent information security consultant, author and speaker with Atlanta-based Principle Logic, LLC, where he specializes in information security assessments for those who take security seriously and incident response for those who don't. He is author of the book Hacking For Dummies and co-author of the upcoming book Hacking Wireless For Dummies, both by Wiley Publishing. Send your ethical hacking questions to Kevin today.