Wanted: Software S.O.U.P. Nazis

Remember The Soup Nazi? He was the cantankerous New York City soup-stand operator portrayed by the actor Larry Thomas in a now-famous episode of the sitcom Seinfeld back in 1995. Thomas’s character was loosely based on Al Yeganeh, owner of Soup Kitchen International, who one writer described as “the meanest man in America.”

The Soup Nazi made fantastic soup – but he was a stickler for protocol. Customers had to queue up just so, and order promptly and in just the right manner – no questions asked – or they were out. “No soup for you!” It made for great comedy, but it isn’t a good recipe for success as a small business owner. It’s “the customer is always right,” after all, not “I am always right, and the customer better listen!”

The software industry isn’t the restaurant business, however, and, lately, I’ve been thinking that we could use a Soup Nazi or, more precisely, a S.O.U.P. Nazi for application development. And by “S.O.U.P.,” I’m referring to Software of Unknown Pedigree. As this blog noted in a recent post, S.O.U.P. is a long-used acronym that’s in need of a second act.

Why now? As Veracode has noted, the use of third-party code within enterprises is more common than ever, as enterprises shift to cloud-based applications run in private or public clouds. When applications are developed internally, it’s almost always with the help of outsourced development shops and/or third-party development tools and frameworks like .NET, Adobe ColdFusion or Ruby on Rails. Despite that, around 84% of vendor-supplied software is deployed without a security audit by customers, even though 82% of auditors inquire about the security of third-party software that’s running within the enterprise. (For more S.O.U.P. stats, see Veracode’s nifty and Warhol-esque infographic.) In other words: software buyers give their suppliers a pass, even though they will pay the cost of non-compliance down the road.

The problems with this approach are on display everywhere that software is performing critical and sensitive work.

Just this week, we read the story of Ahmed Al-Khabaz, the 20-year-old computer science student at Dawson College in Montreal. Al-Khabaz was expelled from Dawson in November for running a vulnerability scan on Omnivox, a web-based student portal that’s used by Dawson and a slew of other two-year vocational colleges, or “CEGEPs.” Al-Khabaz said he conducted the scan, using a free tool from the firm Acunetix, to verify that the company that makes and manages Omnivox, a local software firm called Skytech, had patched a critical vulnerability he had disclosed to them. The security hole would have allowed anyone who knew the unique ID of a Dawson student or faculty member to access that person’s Omnivox account, which contains sensitive information and personal communications.
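As described, that hole is a textbook missing-authorization flaw: the application trusts whatever ID the client supplies instead of checking it against the authenticated session. Here is a minimal, hypothetical Python sketch of the difference, with toy data and invented names – this is not Skytech’s actual code:

# Hypothetical illustration only; not Omnivox/Skytech code.
RECORDS = {
    "2012345": {"owner": "2012345", "transcript": "..."},  # toy student record
}

def get_record_vulnerable(requested_id):
    # Anyone who knows (or guesses) an ID gets the record back.
    return RECORDS[requested_id]

def get_record_safer(requested_id, session_user_id):
    # Only return the record if it belongs to the logged-in user.
    record = RECORDS[requested_id]
    if record["owner"] != session_user_id:
        raise PermissionError("not authorized to view this record")
    return record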

The treatment of Al-Khabaz by Dawson, which deemed his audit of Omnivox “unprofessional,” is a subject unto itself. The larger question is how the Province of Quebec managed to buy and deploy a web-based portal to 250,000 students and faculty without spotting what Al-Khabaz called “sloppy coding” in the application itself. Chris Wysopal, Veracode’s CTO, said that Dawson and the provincial government should “try to learn” from the Al-Khabaz case – not about how to treat future students who engage in altruistic hacking, but about “how we bought software to manage sensitive information for hundreds of thousands of students in which the security quality was not assessed.”

And, unfortunately, the story is no better in areas like medicine, in which lives truly hang in the balance.

At a presentation at the S4 security conference in Miami last week, two researchers, Billy Rios and Terry McCorkle, demonstrated a slew of security holes in critical medical instruments like X-ray machines and medical device management software. The two researchers purchased second-hand devices online, then tested them for security vulnerabilities.

What they found was eye-opening: the software that ran these devices was rife with serious flaws like heap overflows and directory traversal vulnerabilities. Many of these security holes could be exploited by a remote attacker and used to disable the device or take control of it, siphoning off sensitive data or planting malicious code.
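To make one of those flaw classes concrete, here is a minimal, hypothetical Python sketch of a directory traversal bug: a file-serving routine that trusts an attacker-supplied filename, next to a variant that confines requests to its export directory. The paths and function names are invented for illustration and are not drawn from the devices in the talk.

import os

BASE_DIR = "/opt/device/exports"  # hypothetical folder of exported reports

def read_report_vulnerable(filename):
    # A request for "../../etc/shadow" walks right out of BASE_DIR.
    path = os.path.join(BASE_DIR, filename)
    with open(path, "rb") as f:
        return f.read()

def read_report_safer(filename):
    # Resolve the path and refuse anything that escapes BASE_DIR.
    path = os.path.realpath(os.path.join(BASE_DIR, filename))
    if not path.startswith(os.path.realpath(BASE_DIR) + os.sep):
        raise ValueError("requested path escapes the export directory")
    with open(path, "rb") as f:
        return f.read()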

Rios and McCorkle also found plenty of evidence of poor design practices. “These were straightforward security issues, like having a service running as a privileged user. All the bad stuff you’re not supposed to do – they’re doing it,” Rios told me in an interview.
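The “service running as a privileged user” problem Rios describes has a standard remedy: acquire any privileged resources early, then switch to an unprivileged account before touching untrusted input. A rough POSIX-only Python sketch, assuming a dedicated “devicesvc” account exists (the account name is a placeholder, not anything from the presentation):

import os
import pwd

def drop_privileges(username="devicesvc"):
    # If the service was started as root, shed that privilege before real work begins.
    if os.geteuid() != 0:
        return  # already unprivileged; nothing to do
    user = pwd.getpwnam(username)
    os.setgroups([])         # drop supplementary groups inherited from root
    os.setgid(user.pw_gid)   # change group first, while we still have permission
    os.setuid(user.pw_uid)   # then the user id; this step is irreversible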

What was interesting about the presentation by Rios and McCorkle was that it took place at a conference dedicated to discussing security issues in the industrial control sector. The two researchers weren’t there to make an argument that medical devices were critical infrastructure (though I think you could make that argument).

They were there because the vendors responsible for the buggy medical devices also made industrial control systems used to manage critical infrastructure like the power grid, water treatment plants and assembly lines. These weren’t mom-and-pop shops, either, but multinational corporations like Siemens, Philips and GE – companies with valuations in the hundreds of billions of dollars.

But all that capital doesn’t buy security. Indeed, the lesson of both Dawson College and S4 is that vendors will only make their products as good as their customers demand – and no better. And, as it stands today, customers are still too dazzled by features and functionality – the bright shiny objects of the technology world – to ask the hard questions about security and code quality. Many, I suspect, don’t even know what questions to ask.

I know what you’re thinking: that’s like buying a house because you like what color it’s painted, without bothering to see if there are cracks in the foundation. Or, more to the point, it’s like buying a can of soup because you like the label, but not bothering to read the ingredients. But that’s why organizations desperately need S.O.U.P. Nazis – sticklers for detail who hold everyone to account and aren’t afraid to lay down the law: “No S.O.U.P. for you!!”
