Could We Reduce Data Breaches With Better Open Source Funding? (marketwatch.com) 60
The CEO of Wireline -- a cloud application marketplace and serverless architecture platform -- is pushing for an open source development fund to help sustain projects, funded by an initial coin offering. "Developers like me know that there are a lot of weak spots in the modern internet," he writes on MarketWatch, suggesting more Equifax-sized data breaches may lie in our future.
In fact, many companies are not fully aware of all of the software components they are using from the open-source community. And vulnerabilities can be left open for years, giving hackers opportunities to do their worst. Take, for instance, the Heartbleed bug of 2014... Among the known hacks: 4.5 million health-care records were compromised, and 900 Canadians' social insurance numbers were stolen. It was deemed "catastrophic." And yet many servers today -- two years later! -- still carry the vulnerability, leaving whole caches of personal data exposed...
[T]hose of us who are on the back end, stitching away, often feel a sense of dread. For instance, did you know that much of the software that underpins the entire cloud ecosystem is written by developers who are essentially volunteers? And that the open-source software that underpins 70% of corporate America is vastly underfunded? The Heartbleed bug, for instance, was created by an error in some code submitted in 2011 to a core developer on the team that maintained OpenSSL at the time. The team was made up of only one full-time developer and three other part-timers. Many of us are less surprised that a bug had gotten through than that it doesn't happen more often.
The article argues that "the most successful open-source initiatives have corporate sponsors or an umbrella foundation (such as the Apache and Linux foundations). Yet we still have a lot of very deeply underfunded open-source projects creating a lot of the underpinnings of the enterprise cloud."
I doubt it (Score:3, Interesting)
Here, I'll solve this problem for you in one sentence, instead of a cloaked Ponzi scheme: strict legal liability for data breaches, extending *personally* to C-level executives of the companies at fault. Management generally doesn't care about security, and the only way to make them care is hitting them in the wallet directly. When they can't hide behind the corporate veil anymore and suffer direct financial consequences for their short-term thinking, even the most dimwitted MBA will start to wake up and take notice.
Re: (Score:2)
In business it is more complex than that.
To survive in the market you need to get your product out before the competition, and/or you need more products. Failing to survive as a business is worse than the expense of a security glitch.
It is a chicken-and-egg problem. We need a security commitment from the whole industry versus just one brave little company, which would go out of business rather quickly. It isn't an issue of bad programmers or management not wanting to ship a quality product, but the restr
Re: (Score:2)
Issue 3? Do you just hate SQL, or do you have a real explanation why you shouldn't use SQL for real work?
Granted, you can use SQL poorly, which opens the door to SQL injection. However, properly parameterized commands, along with well-optimized stored procedures and views and proper access controls, can offer a very secure method of protecting your data and preventing extra information from leaking to the outside world.
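To illustrate the parameterization point the comment makes, here is a minimal sketch using Python's sqlite3 module (the table, column, and payload are hypothetical, invented for the example):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, role TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 'admin')")

user_input = "alice' OR '1'='1"  # a classic injection payload

# Vulnerable: string concatenation lets the payload rewrite the query,
# turning the WHERE clause into a condition that is always true.
vulnerable = conn.execute(
    "SELECT role FROM users WHERE name = '" + user_input + "'"
).fetchall()

# Safe: the ? placeholder makes the driver treat the payload as a
# literal value to match, never as SQL syntax.
safe = conn.execute(
    "SELECT role FROM users WHERE name = ?", (user_input,)
).fetchall()

print(vulnerable)  # leaks every row: [('admin',)]
print(safe)        # matches nothing: []
```

The same placeholder discipline applies whether the statement lives in application code or inside a stored procedure that takes typed parameters.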
Re: (Score:2)
We need a security commitment from the whole industry vs just one brave little company who would go out of business rather quickly.
A commitment from the whole industry won't happen. Fortunately, such a commitment isn't the only possible solution; the GP has already provided an alternative. Where the industry won't act voluntarily, legislation can force them to.
If breaches in security can be proven to be due to corner cutting, laziness or negligence (such as the Equifax fiasco) the Cxx managers of companies at fault should be made personally responsible. And not just monetarily, because they can push the expense on to the company and im
Re: (Score:2)
The legislation will only happen in countries where the industry (as big players or as a group) doesn't have enough influence over the legislature to keep such a thing from happening.
I don't know which countries could meet that bar, but the set is small enough not to matter, especially when companies will just make sure they don't legally exist in those countries.
It's cynical thinking, yes, but pragmatic, I'm afraid.
That's ignoring the decades of legal challenges if it actually did happen, or any fallout on open sou
Re: (Score:2)
But the parent you're replying to is suggesting to increase the expense of a security glitch.
We need a security commitment from the whole industry vs just one brave little company who would go out of business rather quickly.
But that's his point, isn't it? By targeting the C-level executives and making them liable for security breaches, then you're effectively solving the problem for everyone involved, from the small companies to the huge companies.
The hackers will get around it anyway (Score:1)
If top tier companies like Sony can get pwned, not to mention government agencies, in reality, there isn't much companies can do. Security doesn't bring income, and you can throw your entire fiscal budget at it, only to get breached anyway because someone in receiving got a RAT from browsing the web on a machine there, and one privilege escalation vulnerability later, the attacker now has domain admin rights across the AD forest.
It really is a losing battle, as you can't win any engagement by defending onl
Re: (Score:2)
Was the thing that was compromised the latest version, exploited with a zero-day vulnerability? If so, lower penalty.
Was the thing that was compromised able to access only data that the component actually needed to function? If so, lower penalty. Higher penalties for anything that was leaked beyond the minimum that the attacked component needed to access.
Did you retain data beyond what the originators of that data
Re: (Score:1)
When they can't hide behind the corporate veil anymore and suffer direct financial consequences for their short-term thinking, even the most dimwitted MBA will start to wake up and take notice.
If they could be held liable they would simply get personal liability insurance and pass the cost through to the customers.
Re: (Score:2)
Re: (Score:2)
Moving the question, the answer is no to both (Score:1)
This is a classic scheme of moving the question in order to obtain the desired conclusion. In this case, the real question they want people to assume the answer to is "is the open source model to blame for security breaches?" By essentially stating as fact that it is, and then making the question "should we throw money at it to fix the problem?", they get readers to take the answer to the first question for granted.
No, the open source model is not the cause of security woes. Microsoft, with one of th
Short answer: No (Score:1)
Re:Short answer: No (Score:4, Interesting)
Note that in some countries (e.g. Germany) the agency responsible for protecting domestic computer infrastructure and the agency responsible for attacking foreign computer infrastructure are different. In the USA, the NSA has dual missions, which puts them in a difficult position because if they find a bug in X and X is used both by the US and North Korea (or whoever) in critical positions, they have to decide whether it's more important to keep their attack tool or prevent their enemies from exploiting the vulnerability.
One of the interesting results of the Snowden disclosures has been that the NSA and their rivals have found largely disjoint sets of vulnerabilities, so it's not even clear that if you fixed all of the things the NSA found that you'd be less vulnerable to attack from (for example) China or Russia.
Two factors to weigh. (Score:3)
There are two main factors to weigh here, IMO.
The first is that a lot of vital yet unsexy projects have inadequate funding and testing. Funding can help mitigate problems stemming from that.
The second factor is sysadmins being incompetent or not being given the tools, knowledge, and power to actually fix problems. Funding can't help that.
Re: (Score:2)
Yes, the main problem isn't a lack of software[*]. It's that those who make decisions have no understanding of security, and their bosses in turn are looking at short term ROI.
[*]: Nor do I believe that funding would have helped if that were the cause. A great programmer doesn't become more productive if you toss more money at him. He'd be happy, and may deserve it, but likely you'd just finance more managers and get less done.
Re: (Score:3)
There are two main factors to weigh here, IMO. The first is that a lot of vital yet unsexy projects have inadequate funding and testing. Funding can help mitigate problems stemming from that. The second factor is sysadmins being incompetent or not being given the tools, knowledge, and power to actually fix problems. Funding can't help that.
I'd add a lot of attitude to that: developers who just bang on it until it works, and management who say "if it works, don't break it." And they go hand in hand; if the new intranet is working, we're done. The PHB and the cheap offshore subcontractor both think so. Firewall? Access controls? SQL injection? URL guessing? View source? Never heard of them. And it'll keep running unpatched and out of support, because it works, until the shit hits the fan and a scapegoat must be found; then the circle begins anew.
The problem is
Re: (Score:2)
No. Best practices are the only way. (Score:3)
For the story: These people want to get rich on the current blockchain craze, nothing else. Ignore them.
As to the problem, best practices and liability are the only way. Yes, I am advocating jailing the CEO and CISO, and possibly the board, of companies that have large amounts of customer data stolen through negligence. As an alternative, I would also accept insurance that automatically pays out $1000 to every customer whose data is stolen (regardless of how much data it was and whether it was misused), and triple the actual damage to any customer who had their data stolen and can prove actual damage (losses plus cost to fix) larger than $1000.
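Read literally, the proposed insurance rule could be sketched like this (my reading only; the comment doesn't say whether the tripled damages stack with the flat payment, so I assume they replace it):

```python
FLAT_PAYOUT = 1000  # automatic payment per affected customer, in dollars

def payout(proven_damage: float = 0.0) -> float:
    """Payout for one affected customer under the proposed rule.

    proven_damage: actual damage (losses + cost to fix) the customer
    can prove; 0 if they prove nothing.
    """
    if proven_damage > FLAT_PAYOUT:
        return 3 * proven_damage  # triple proven damage above the floor
    return FLAT_PAYOUT            # otherwise the flat $1000

print(payout())      # 1000
print(payout(2500))  # 7500
```

Even at the flat rate, a breach of a million records would cost a billion dollars, which is the kind of number that makes boards fund security.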
In order to be not negligent (note that I use simple negligence, not gross negligence) they will have to:
- Develop security-critical software only with architects, designers, and coders who understand security (no more paying peanuts for coders...)
- Have external reviews of all security critical code by qualified security experts
- Have careful and adequate white-box penetration testing performed
- Not only fix the issues found in code reviews and pen tests, but also investigate and fix the root causes, such as firing incompetent coders or outsourcers
Do this and the problem vanishes. The human race knows how to produce software that is extremely hard to break into. There are just no incentives to spend the money for it, and, despite my list above looking a bit bombastic, it would not actually be that expensive.
Re: (Score:2)
Re: (Score:2)
Anything to do with an "initial coin offering" is a scam.
Re: (Score:2)
Not everything, but it is a good general assumption and usually quite true.
Re: (Score:2)
Develop security critical software only with architects, designers and coders that are understand security
One of the big problems, and a large part of the reason that we're in this mess, is that a lot of security-critical software wasn't security critical when it was written. Here's a simple example: libjpeg. This library was written as a reference implementation of the JPEG standard, back in 1991. It was expected to be used to compress photographs from scans and render the compressed photographs on the screen. It's not security critical, because it's dealing only with data that it produced for the user.
Start giving a damn (Score:2)
Re: (Score:2)
No, we can't. (Score:3)
More open-source funding won't help reduce breaches. It'd be good to have more funding for development of the basic software, but most of these breaches happen because, despite a patch for the vulnerability being available, these companies simply don't apply it. Until that stops being the case, more funding for the software will merely mean the breaches happen in different places than they would've otherwise.
Oh, and don't hold the sysadmins responsible. They're at the mercy of the instructions they're given. The people who need to be held accountable are the executives who classify IT security as a cost center whose budget needs to be minimized, who treat breaches as a public-relations problem instead of a security issue, and who refuse to give the IT people enough budget, resources, and authority to apply fixes promptly.
No (Score:2)
What do I win?
Re: (Score:2)
Re: (Score:2)
Fuck that.
I want a fur-lined dookey pot.
OSS had a fix for Equifax. They didn't apply it. (Score:2)
* It also bugs me that I generally
Certification ? (Score:2)
Is there a security equivalent of a UL certification?
If not, should we require one before a product can be sold (for IoT stuff) in the US? Or a mandatory periodic security audit of corporate systems housing sensitive personal data?
Development frameworks (Score:3)
When I was involved in high-security software development, we built the web sites around multiple layers each of which was secured and access was limited, reducing the attack surfaces. If a hacker ever got past all our layers to hit the database, then frankly, I wouldn't argue with them as they would be the NSA or KGB.
But then I started working with new Microsoft frameworks designed to make web building nice and easy (even though it's a right over-engineered mess), and I see everything stuck in the web-server tier with full, open, direct access to the DB via an ORM. All designed to be written as quickly and easily as possible, with security a very distant concern.
And yet said framework could easily split its MVC architecture into a service tier and a web tier, could ship comments or a text file with security-hardening information, could partition the database into secured schemas, and it'd be just as easy to write as the monolithic version but far, far more securable.
The current ASP.NET Core framework is almost insecure by design, built as though everything may be exposed once a hacker gets past the first (and only) level of security. All it takes is one zero-day exploit and all your data belongs to someone else. (And yes, other web frameworks are just as bad.)
So yes, open-source projects could help: not by compiling a database or package manager of updates and security fixes, but by providing templates and architectures for project defaults that are built around layers of protection.
There will always be some weakness, flaw, or bug in software; the only way to mitigate them is to work assuming they're already there.
Not funding, quality of educated people (Score:2)
If money solved all computer problems, a few top US consumer OS brands would have the most secure OSes ever.
They do not, due to the low skill sets and the lack of education found in many of their workers.
Consider how an open source project responds to a person who shows security issues.
Do they have a person in place to accept the errors and communicate with the person who found the errors/bugs/backdoor/trapdoor?
That they can communicate back that the errors are understood,
No. Capability Based Operating Systems are needed (Score:2)
Until we get systems like Genode or Hurd to the point where they can be used by most of us, and especially on servers, this is going to keep happening. The idea of trusting an application or service to voluntarily restrict its own actions is idiotic (at best).
Imagine getting a check from the bank of Windows... where after checking your ID very carefully, then handed you all of the funds for the account, and trusted you (the person delegated a small amount of the account holders money) to only take/remove th
Re: (Score:2)
Or you can use FreeBSD right now. Capsicum turns file descriptors into capabilities and as soon as you call cap_enter you lose all access to the global namespace and can only interact with external resources via existing capabilities (or ones that are given to you dynamically by another process).
You can also more or less view iOS as a capability system if you squint hard enough. They write ACLs dynamically to try to emulate a capability system (one of the motivations for Capsicum was looking at what Ap
open source cryptocurrency ponzi scheme? (Score:2)
There's nothing wrong with the Internet that needs fixing; the problem resides in certain computers at either end. Is this article an attempt to tarnish open source with some kind of cryptocurrency Ponzi scheme?
Probably not (Score:2)
Good "Open Source" funding leads to companies like Mozilla who, instead of trying to make the web better, mostly work on keeping the browser engine oligopoly alive.
A far better solution would be actual FOSS with the additional rule of being as simple as humanly possible. Simple code is shorter and therefore likely contains fewer errors. Fewer errors mean fewer security-critical errors. Also, it's easier to maintain a 1k-line program than a 20-megaline program.
Considering that most things companies do
Re: (Score:2)