"Josh Breckman worked for a company that landed a contract to develop a content management system for a fairly large government website. Much of the project involved developing a content management system so that employees would be able to build and maintain the ever-changing content for their site.
Things went pretty well for a few days after going live. But, on day six, things went not-so-well: all of the content on the website had completely vanished and all pages led to the default "please enter content" page. Whoops.
Josh was called in to investigate and noticed that one particularly troublesome external IP had gone in and deleted *all* of the content on the system. The IP didn't belong to some overseas hacker bent on destroying helpful government information. It resolved to googlebot.com, Google's very own web crawling spider. Whoops.
As it turns out, Google's spider doesn't use cookies, which means that it can easily bypass a check for the "isLoggedOn" cookie to be "false". It also doesn't pay attention to Javascript, which would normally prompt and redirect users who are not logged on. It does, however, follow every hyperlink on every page it finds, including those with "Delete Page" in the title. Whoops."
So next time, don't assume every visitor has JavaScript enabled: validate actions on both the client side and the server side. Client-side checks are only a convenience; if you want to validate your data for accuracy and security, you must check every request with server-side code.
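To make that concrete, here is a minimal sketch in Python/Flask of what a real server-side check looks like. The route names and the "logged_on" session key are illustrative assumptions, not the original CMS's code:

```python
# Minimal sketch (Python/Flask); routes and session keys are hypothetical.
from flask import Flask, redirect, request, session, url_for

app = Flask(__name__)
app.secret_key = "replace-with-a-real-secret"  # signs the session cookie

@app.before_request
def require_login():
    # This runs on the server for every request. A client that ignores
    # cookies (like Googlebot) simply has no session and is turned away
    # here; a JavaScript redirect, by contrast, never even executes for
    # such a client.
    if request.path.startswith(("/login", "/static/")):
        return None  # public paths stay open
    if not session.get("logged_on"):
        return redirect(url_for("login"))

@app.route("/login")
def login():
    return "login form goes here"
```

Unlike a check on an "isLoggedOn" cookie value performed in the browser, this check cannot be skipped by a client that never runs your JavaScript or never sends your cookie.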
And something else: according to the HTTP 1.1 specification (RFC 2616), the GET method is defined as a safe method that "SHOULD NOT have the significance of taking an action other than retrieval." If a request changes state (deletes content, replaces data), it should use POST.
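Continuing the same hedged Flask sketch: the page is retrieved with GET, while the destructive action is reachable only through POST, rendered as a form button rather than a plain hyperlink, so a link-following crawler can look but not touch:

```python
# Sketch continued (Flask; route names are illustrative).
from flask import Flask

app = Flask(__name__)

@app.route("/pages/<int:page_id>", methods=["GET"])
def show_page(page_id):
    # GET only retrieves. The delete control is a form that submits a
    # POST, not an <a href> hyperlink that a crawler would follow.
    return f"""
        <h1>Page {page_id}</h1>
        <form method="post" action="/pages/{page_id}/delete">
            <button type="submit">Delete Page</button>
        </form>
    """

@app.route("/pages/<int:page_id>/delete", methods=["POST"])
def delete_page(page_id):
    # Because this route accepts only POST, a GET to the same URL gets
    # a 405 Method Not Allowed from Flask instead of deleting anything.
    return f"page {page_id} deleted"
```

This alone would have saved the site in the story: Googlebot issues GET requests as it follows links, and a delete endpoint that refuses GET never fires.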
Related posts about security:
Google Deleted... Google Blog
GMail vulnerability: GMail runs javascript in body
Get sensitive information using Google
What?! You could access the admin area of the CMS with cookies and JS turned off? And you landed a large government contract?! Hats off to your Marketing Dept...
Talk about an application flaw, not the Googlebot's fault.
You got the title wrong. It should read "Bad Coding Can Destroy Sites". Let us know which government gave you the contract, will ya? ROFLOL
Title should be... "Clueless Programmers Can Destroy Important Data" -- no surprise there!
This can happen with your own bots, too. I ran an FTP link-checking program designed to seek out every page to find dead links and orphan pages. It identified the directory containing phpMyAdmin and proceeded to wipe a database clean.