Google has fixed a flaw in its audio CAPTCHA software that could have given scammers a way to automatically set up phoney accounts with the company's services.
The flaw was described in a post to the Full Disclosure mailing list on Monday. According to the post, anyone could pass a Google audio CAPTCHA (Completely Automated Public Turing test to tell Computers and Humans Apart) test by typing in any 10 words as the response.
CAPTCHA tests are used by many websites to cut down on online fraud. Sites often use CAPTCHA systems to make sure that new accounts are created by human beings, instead of automated scripts. Typically a CAPTCHA test presents a hard-to-read image of a word, which the user must then type in to prove they are not a machine. The audio version gives visually impaired users a way to use CAPTCHA, by playing a recording of the test word instead.
According to Harry Strongburg, the Full Disclosure poster who reported the issue, typing "google google google google google google google google google google", for example, would yield a correct response, no matter what the test word.
He stumbled on the issue recently after typing what he suspected was an incorrect answer to a barely audible audio CAPTCHA message. "I clicked it, typed in what it sounded like, and it worked correctly," Strongburg said. "Intrigued by this, I tried it again with another random sentence of the same length. To my surprise, it worked again."
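The behaviour Strongburg describes is consistent with a validator that checked only the word count of a response rather than the words themselves. A minimal sketch of that class of bug, entirely hypothetical and not based on Google's actual code, might look like:

```python
# Hypothetical illustration of the kind of validation bug described above.
# The challenge words and function names are invented for this sketch;
# nothing here reflects Google's real implementation.

def broken_validate(expected_words, response):
    # Bug: accepts any answer containing the right NUMBER of words,
    # without ever comparing them to the expected words.
    return len(response.split()) == len(expected_words)

def fixed_validate(expected_words, response):
    # Correct check: the response words must match the expected words.
    return response.split() == expected_words

challenge = ["seven", "three", "nine", "one", "four",
             "eight", "two", "six", "five", "zero"]

print(broken_validate(challenge, "google " * 10))      # True: any 10 words pass
print(fixed_validate(challenge, "google " * 10))       # False
print(fixed_validate(challenge, " ".join(challenge)))  # True
```

With the broken check, "google google google google google google google google google google" passes any 10-word challenge, exactly the symptom reported on Full Disclosure.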
Google moved quickly to fix the bug after it was disclosed.
"We fixed a bug in our audio CAPTCHA validation last night within a few hours," said spokesman Jay Nancarrow on Tuesday. "Audio CAPTCHAs continue to function normally."
That's a good thing, because, in theory, scammers could have leveraged the bug to quickly create thousands of malicious Google accounts. Google's Gmail service has been used by spammers, said Paul Ferguson, a security researcher with Trend Micro, and Blogger and Google Groups have been used to spread malware, he added in an instant-message interview.