The discovery of what have been described as serious vulnerabilities in industrial control systems built by manufacturing giant Siemens AG -- and the subsequent nixing of a presentation about those very vulnerabilities -- has raised questions about how the rules of vulnerability disclosure should, or shouldn't, change when it comes to security flaws in industrial systems.
As covered earlier this week in our story "A botched fix, not legal demands, nixed SCADA security talk," NSS Labs researchers pulled a presentation after a fix Siemens offered failed to mitigate the attack. A day after that story, Dillon Beresford, the NSS Labs researcher who discovered and reported the flaws, took aim at Siemens on the SCADASec mailing list for downplaying the seriousness of the vulnerabilities. As reported in "Siemens says it will fix SCADA bugs," the company has minimized the risk: "While NSS Labs has demonstrated a high level of professional integrity by providing Siemens access to its data, these vulnerabilities were discovered while working under special laboratory conditions with unlimited access to protocols and controllers," Siemens said.
Beresford countered: "The flaws are not difficult for a typical hacker to exploit. Also there were no special laboratory conditions with unlimited access to the protocols. My personal apartment on the wrong side of town where I can hear gunshots at night hardly defines a special laboratory. I purchased the controllers with money my company so graciously provided me with."
In a prior interview, NSS Labs Chief Technology Officer Vikram Phatak told CSOonline that the cost of the equipment was roughly $2,500. That's certainly a lower bar to uncovering SCADA-related flaws than has generally been discussed.
With that in mind -- and the stakes higher with the security of factories, power plants, and other industrial systems in question -- the issue must be raised: What should the rules of disclosure for SCADA vulnerabilities be?
The answer from experts -- surprising to some -- is: not much different at all, with some caveats. "When it comes to medium risk and lower vulnerabilities, I don't think the disclosure rules should be different with SCADA systems than traditional software," says Phatak. "The researcher should contact the vendors and give them a reasonable period of time to remedy their flaw. With more serious flaws, perhaps the vendor needs more time. But it's crucial to keep the heat on the vendors to fix these issues."
Chris Wysopal, chief technology officer at Veracode and one of the early members of the software security research group L0pht, agrees, and doesn't see the situation with SCADA vulnerability disclosure as much different from that with traditional software. "Today, when we see more serious issues with DNS and other core issues, researchers take the issue of responsible disclosure more seriously," Wysopal says. "They'll be more likely to coordinate the disclosure with US-CERT, and make sure all of the parties involved are fully briefed about what's going on."
Gartner security analyst John Pescatore isn't surprised by the flaws, and believes the pressure is helpful in effecting change. "One of the reasons why SCADA and process control software so often are horrible from a security perspective is that there has been a lack of disclosure driving the vendors to get better," Pescatore says.
There's also the collision of two very different worlds here: the often chaotic world of IT security research and the stoic world of industrial engineering firms. "It's just like the culture clash we had in the late 1990s, when the early-wave vulnerability exposers were criticized by Microsoft and Oracle. Every software vendor would rather see no disclosure of vulnerabilities, just as no restaurant wants to see health department violations exposed in the newspaper," says Pescatore. "But operational technology software needs to emphasize security during development, and the best way for them to avoid embarrassing and risky disclosures is to have way, way fewer vulnerabilities (like hard-coded passwords, for example) in their software."
That's not to say there won't be bumps ahead as more researchers learn their way around industrial systems, and the risk rises that a rogue researcher pushes a significant zero-day SCADA vulnerability onto a security mailing list without warning. "There are essentially no rules. If you are a white hat, you generally follow guidelines, but there's nothing forcing them to do so," says Pete Lindstrom, research director at Spire Security. "And should a researcher disclose a bug that is used in a successful attack, I wouldn't be surprised to see legal action taken against the researcher. I don't think the real world will be as tolerant of irresponsible disclosure shenanigans," he says.
George V. Hulme writes about security and technology from his home in Minneapolis. You can also find him tweeting about those topics on Twitter at @georgevhulme.