Why Chinese Hacking Is Only Part of the U.S. Security Problem 101
An anonymous reader writes "'Cyber espionage, crime, and warfare are possible only because of poor application or system design, implementation, and/or configuration,' argues a U.S. Air Force cyber security researcher. 'It is technological vulnerabilities that create the ability for actors to exploit the information system and gain illicit access to sensitive national security secrets, as the previous examples highlight. Yet software and hardware developers are not regulated in the same way as, say, the auto or pharmaceutical industries. The truth is that we should no longer accept a patch/configuration-management culture that promotes a laissez-faire approach to cyber security.'"
Because... (Score:1)
US security sucks? Now, now, there's no need to become all yoddle! After all, the US has been propagating that which is unseen to the foreign admissive. Why don't we all just get all along, and become brothers in rancid?
Re: (Score:3)
Yes and the Chinese hackers know it. Seems the US has some chinks in its cyber-armor.
Was that the best way of stating this?
Re: (Score:1)
Yes and the Chinese hackers know it. Seems the US has some chinks in its cyber-armor.
Was that the best way of stating this?
Oh I agree. The prefix "cyber-" is completely overused.
Re: (Score:2)
It needs to have an "i" or "e" in front of it to make it better... iCyberSecurity or eCyberSecurity will make it all better.
Our buildings are vulnerable to Chinese missiles too (Score:3)
HOWEVER our buildings are also quite vulnerable to Chinese missiles. We haven't secured our shopping centers, our sports stadiums, or our power plants. China could very easily wipe out any of them. Does t
Re: (Score:2)
US security sucks? Now, now, there's no need to become all yoddle! After all, the US has been propagating that which is unseen to the foreign admissive. Why don't we all just get all along, and become brothers in rancid?
US security sucks or ALL electronic security sucks?
Re: (Score:1)
So start demanding changes. (Score:5, Interesting)
First off, demand that every software vendor provide a list of files that their product installs, where those files are installed by default and different checksums/hashes/etc for them.
It should be possible to boot a machine with a live CD (or PXE) and inventory every single file on that machine and identify the origin of each of them.
At least you'd know whether a machine was cracked or not.
Right now, with existing anti-virus, all you can say is that a machine does not have anything that matches the signatures that you have right now.
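The vendor-manifest idea above can be sketched in a few lines: given a manifest mapping expected file paths to their SHA-256 hashes, walk the filesystem and flag anything modified, unaccounted for, or missing. (A hypothetical sketch only; real integrity checkers such as Tripwire or AIDE also protect the baseline database itself, which this does not.)

```python
import hashlib
import os

def sha256_of(path):
    """Hash a file in chunks so large files don't exhaust memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

def audit(root, manifest):
    """Compare files under `root` against {path: expected_hash}.
    Returns (modified, unknown, missing) path lists."""
    modified, unknown = [], []
    seen = set()
    for dirpath, _dirs, files in os.walk(root):
        for name in files:
            path = os.path.join(dirpath, name)
            seen.add(path)
            expected = manifest.get(path)
            if expected is None:
                unknown.append(path)       # no vendor claims this file
            elif sha256_of(path) != expected:
                modified.append(path)      # contents differ from manifest
    missing = [p for p in manifest if p not in seen]
    return modified, unknown, missing
```

Booted from a live CD, pointing this at a mounted system volume would give exactly the "identify the origin of every file" inventory the parent asks for, provided vendors actually published the manifests.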
Re: (Score:3)
Re: (Score:1)
Right now, with the Microsoft Windows way of doing things, all you can say is that a machine does not have anything that matches the signatures that you have right now.
Fixed that for you.
Turns out, the principle of least-privilege isn't so easily bolted-on in a haphazard manner after decades of doing it wrong, teaching software authors to expect it done wrong, and users to not know the difference.
For *nix, there are Tripwire and AIDE and similar programs that do exactly what you're proposing, without support from the vendor. Isn't that easier than getting hundreds of independent companies to all cooperate with your scheme?
Re: (Score:1)
First off, demand that every software vendor provide a list of files that their product installs, where those files are installed by default and different checksums/hashes/etc for them.
It should be possible to boot a machine with a live CD (or PXE) and inventory every single file on that machine and identify the origin of each of them.
At least you'd know whether a machine was cracked or not.
Right now, with existing anti-virus, all you can say is that a machine does not have anything that matches the signatures that you have right now.
My days would be much simpler if all devs turned out software that is hospital grade. I am not really sure about the point Khasim is making; most software and drivers are digitally signed. True, zero-day attacks are more frequent these days; however, I believe that is more related to streaming media, which is harder for AV software to track because it comes through torrents and proxies, is embedded, and gets deciphered "on the fly".
You're kidding, of course (Score:3)
Start by designing operating systems that are secure and language environments that are secure, rather than feature-rich marketing shows. Don't put the blame on the programmers who have to work with shoddily designed infrastructure. Change the infrastructure.
Re:You're kidding, of course (Score:5, Insightful)
You may be over-estimating the will of developers who actually intend to build something secure out of the box. Sure, you've got the chunk of folks that require fine-grained security in their day-to-day, but the rest of them that take security for granted (we're not big enough yet to make things secure, we'll wait until revenue hits $xxx and then "do it right") are just going to worry about making their stuff function according to the spec.
I have left some code lying around before that I am not particularly proud of; not that anyone important would notice, as it tends to be the kind of thing only another developer would recognize. Few occupations are unaffected by this type of thinking, which is why we have to send the Dept. of Health around to restaurants to make sure the kitchens are clean, the pedagogists around to the elementary school to make sure learning is happening, and aviation officials to enforce maintenance standards...
Of course there needs to be accountability for code that does important things; that much is obvious. Too many people now interact with code in occupations that previously wouldn't have done so. At some point it's going to be a good idea to have a proper audit trail.
Re: (Score:2)
Re: (Score:2)
You may be over-estimating the will of developers who actually intend to build something secure out of the box.
Not only that, but he and the article are hugely overestimating the amount of control the developer has over a project that is done on contract for an agency. "Do it this way", "make it work that way", "no, remove that annoying button"...
The story forgets to mention that, while the hackers did indeed use bugs in software to gain entry, the technology to secure the targets already exists; someone with decision-making power simply decided not to use it, and they are now trying to palm off the responsibility onto the poor guys on the floor.
Re: (Score:1)
Re: (Score:2)
One of the best security measures would be to give each application its own directory where it can read and write, with nothing outside it available. The Open File dialog should be separate from the application, and any file selected through it gives the program a descriptor for that file and nothing else.
Basically, programs should run chrooted and not be allowed even read access to the entire hard disk.
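The broker idea above (a trusted file picker that hands the application only a descriptor) can be sketched in Python. This is a hypothetical illustration of the pattern, not a real sandbox: a production design would also confine the application process itself (chroot, namespaces, seccomp), which this sketch does not show.

```python
import os

def broker_open(path):
    """Stand-in for a trusted file-picker process: it validates and
    opens the file the user chose, then hands back only a read-only
    descriptor. The application never needs filesystem access."""
    return os.open(path, os.O_RDONLY)

def confined_read(fd):
    """The 'application' side: it can read from the descriptor it was
    given, but it has no way to open anything else by path."""
    data = b""
    while True:
        chunk = os.read(fd, 4096)
        if not chunk:
            os.close(fd)
            return data
        data += chunk
```

The design point is capability-style access: possession of the descriptor is the whole grant, so a compromised application can leak only the files the user explicitly picked.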
s/technological/human (Score:4, Insightful)
Re: (Score:2)
It is clear that they are talking specifically about technological vulnerabilities. Also, in the given context of a military/national security type of system, only trained personnel are allowed to access such systems. However imperfect, that's as good as it gets in terms of dealing with social engineering or the dumb-user problem.
Ever hear of Mata Hari?
http://en.wikipedia.org/wiki/Mata_Hari [wikipedia.org]
Re: (Score:2)
Planting a spy on the inside is not a social engineering attack.
Actually, yes, it is.
Patch Code is like Chinese Food.... (Score:3)
.....In an hour, you'll be hungry again.
Just plain silly (Score:5, Insightful)
Can you imagine, during the Cold War, the US President going to Stalin and saying "please stop spying on us"? Because that's exactly what's being suggested here.
Re: (Score:2)
The whole idea that China should be 'held responsible' for the hacking is just plain silly on its face. Governments and private corporations have been spying on each other ever since the first caveman tried to keep a secret.
It's a form of sabre-rattling. Although, it is useful to note the difference between spying as in passive information gathering, versus something intended to cause material damage like Stuxnet. The latter actually is a form of attack.
Can you imagine, during the Cold War, the US President going to Stalin and saying "please stop spying on us"? Because that's exactly what's being suggested here.
I imagine the Soviets were pissed off about this one [wikipedia.org].
The Trans-Siberian Pipeline, as planned, would have a level of complexity that would require advanced automated control software, Supervisory Control And Data Acquisition (SCADA). The pipeline used plans for a sophistica
Exactly, plus .... (Score:1)
more certifications? oversight? (Score:2)
take a look at the IT/data security invested in the automotive/pharm industry, and then ask yourself, "well, why are they so secure?"
Oh, I'm Sorry (Score:4, Insightful)
Do you expect medical professionals to be able to cure every disease and infection ever? Do you expect automotive engineers to be able to build mechanically perfect vehicles? No. Of course the attitude the majority of people take towards online security is a joke, but no more so than saying "Cyber espionage, crime, and warfare are possible only because of poor application or system design, implementation, and/or configuration."
Cyber espionage, crime, and warfare exist through the same mechanisms that allow viruses to become resistant to treatment: adaptation. Systems can be designed to be harder to break, but they can't be made impenetrable. The language used in this article is just the same old IT-focused yellow journalism we've all come to expect on the subject.
Re: (Score:2, Insightful)
Your analogy is not accurate; the majority of vulnerabilities are variations on the same dozen sloppy coding mistakes. A proper analogy would be most car manufacturers, in some hypothetical right-hand-side driving country with many highway ramps, not putting bolts on the right front wheel and not having a problem most of the time, because most turns are to the right and not the left, and the occasional left turn is almost always followed by a right that reseats the wheel.
Re: (Score:1)
the majority of vulnerabilities are due to variations on the same dozen sloppy coding mistakes
I don't doubt that.
A proper analogy would be most car manufacturers, in some hypothetical right-hand-side driving country with many highway ramps, not putting bolts on the right front wheel and not having a problem most of the time, because most turns are to the right and not the left, and the occasional left turn is almost always followed by a right that reseats the wheel.
That would be a proper analogy if that's what was being argued. While the article did call for stricter security standards for commercially produced code, something that I agree with, it also said that breaches of security would not happen if such were the case. Hence the analogy: you can make the system better, but you can't make it perfect.
Re: (Score:2)
Do you expect automotive engineers to be able to build mechanically perfect vehicles? No.
Vehicles that never fail? No. Vehicles that have a reasonable failure mode? Yes.
Consider the air brakes on a tractor trailer. The air is what keeps the brakes apart. If some mechanical failure caused a loss of air pressure, the failure mode would be stopping the vehicle. That is acceptable. If they did it the other way, with the air pressure being used to apply the brakes, the first sign of failure could be the inability to stop the vehicle at highway speed. That is not acceptable.
Either way,
Re: (Score:1)
You should read my comment again, because your reply essentially repeats what my post said to begin with. Do people treat security poorly in the IT industry? Yes. Can security be strengthened by more rigid standards and harsher penalties for failure? Yes.
What I responded to, and I'll quote it again, was "Cyber espionage, crime, and warfare are possible only because of poor application or system design, implementation, and/or configuration." The implication here is that these things are NOT possible if
Re: (Score:3)
You should read my comment again, because your reply essentially repeats what my post said to begin with. Do people treat security poorly in the IT industry? Yes. Can security be strengthened by more rigid standards and harsher penalties for failure? Yes.
What I responded to, and I'll quote it again, was "Cyber espionage, crime, and warfare are possible only because of poor application or system design, implementation, and/or configuration." The implication here is that these things are NOT possible if systems are not poorly designed, implemented, and configured. That's a load of bullshit. Even with the best security advancements available you are simply not immune. To suggest otherwise is to display ignorance of the subject.
Would you concede that (say, by using managed languages) eliminating all buffer overflows would be a huge step in the right direction? We have the capability of doing that. There is still the impossibility of ever conclusively proving that a given piece of software is completely free of all possible bugs, but that's a lofty and unrealistic goal. There are many feasible steps we could take that are realistic. We generally don't take those steps because the trade-offs involved don't fit our priorities. T
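The managed-language point above is concrete: bounds-checked languages turn what would be a silent memory-corrupting buffer overflow into a clean failure. A hypothetical sketch of parsing a field from an untrusted packet (the function name and layout are illustrative, not any real protocol):

```python
def read_field(packet, offset, length):
    """Extract a field from an untrusted packet. In a managed language
    a malformed offset/length pair cannot read or write adjacent
    memory: slicing is bounds-checked, so the worst case is an
    exception or short data, never a classic buffer overflow."""
    if offset < 0 or length < 0 or offset + length > len(packet):
        raise ValueError("malformed header: field exceeds packet")
    return packet[offset:offset + length]
```

In C, the equivalent unchecked `memcpy` with an attacker-supplied length is the canonical exploit primitive; here the same malformed input is rejected before any access happens.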
Re: (Score:3, Insightful)
I think that with the latter case, you're going to an absurd extreme that no one is realistically suggesting. That was my point.
Except it was suggested. The premise given was that should "poor application or system design, implementation, and/or configuration" be eliminated, so too would "cyber espionage, crime, and warfare". My argument was that tasking engineers with eradicating all of those problems would be like tasking doctors with curing every disease. I'm not the one going to an absurd extreme; it's a direct quote taken from TFA. I'm merely pointing it out.
Re: (Score:2)
Do you expect medical professionals to be able to cure every disease and infection ever? Do you expect automotive engineers to be able to build mechanically perfect vehicles? No. Of course the attitude the majority of people take towards online security is a joke, but no more so than saying "Cyber espionage, crime, and warfare are possible only because of poor application or system design, implementation, and/or configuration."
No, but cyber espionage, crime and warfare are made enormously easier and more productive by shoddy security design.
Re: (Score:1)
Except, again, that's not what's being argued. What was said in TFA was the ONLY reason cyber crime, espionage etc etc exist is because of shoddy security design. This is not only completely false, it unnecessarily burdens engineers and sysadmins with the task of somehow managing the impossible.
is there anyone who takes the opposite position? (Score:4, Interesting)
That is: someone who actually argues that Chinese hacking is the entirety of the U.S. security problem?
Re: (Score:2)
That is: someone who actually argues that Chinese hacking is the entirety of the U.S. security problem?
Yea - Sergei from totallylegitbankwebsite.ru
Re: (Score:1)
No real repercussions, no incentive. (Score:1)
In mainstream corporations none of this is going to happen until security issues impact the bottom line. And then it will be the typical corporate approach of addressing specific instances. The military too; Adobe and Windows are used all over the place.
Re: (Score:1)
In mainstream corporations none of this is going to happen until security issues impact the bottom line. And then it will be the typical corporate approach of addressing specific instances. The military too; Adobe and Windows are used all over the place.
Mainstream corporations? What corporation does not use computers, phones, or networks... Candy Land?
Re: (Score:2)
In mainstream corporations none of this is going to happen until security issues impact the bottom line. And then it will be corps typical approach, of addressing specific instances. ...
Nah; it'll be the typical approach of finding the "hackers" that expose security holes, then firing and/or prosecuting them to teach them (and their friends) the traditional lesson: We don't want to hear about our security problems. We'll continue to punish you hackers until you give up.
In my experience, this approach is the usual one that corporations use with their own developers. It's why the smarter developers tend to avoid working on security-related stuff. They want to keep their jobs, stay out
Re: (Score:2)
Re: (Score:2)
. . .U.S. Air Force cyber security researcher. . .
So, is Captain Obvious an actual captain?
No, he's an Air Force civilian employee, probably a GS-13.
Re: (Score:2)
But to know the side effects, wouldn't you need the code? Which is exactly the opposite of what you are saying about releasing betas.
Cue (Score:1)
Cue the "But software is hard and we can't do it well" cries from the incompetent.
Sudden burst of common sense (Score:2)
Forget the "software is a non-regulated industry" argument; that's noise. The reality is:
- Businesses: make hacking illegal and offload the cost of keeping us secure to the govt; a business's purpose is to make money, not security
- Army: buddy, it worked for LulzSec. But now you're on your own; we can't do it
Outsourcing plays a role. (Score:4, Insightful)
Hiring certain political persuasions to do mission-critical work for mega-corporations is something I would look out for. I specifically mean that hiring anti-U.S. personalities to perform work on U.S. infrastructure has its weaknesses.
When mega-corporations implement critical infrastructure (e.g. login credentials) they should be using sympathetic professional contractors, probably from the U.S., the U.K., France, Germany, Japan, Australia, New Zealand, and of course Canada. Not BRIC. That's my 2c.
Re: (Score:2, Insightful)
In one example I saw, the, um, mistake in security implementation was committed by a Belarusian contractor who had strong feelings against U.S. oil interests in Georgia (Eastern Europe) and was working at a U.S. mega-corporation... Hiring certain political persuasions to do mission-critical work for mega-corporations is something I would look out for. I specifically mean that hiring anti-U.S. personalities to perform work on U.S. infrastructure has its weaknesses. When mega-corporations implement critical infrastructure (e.g. login credentials) they should be using sympathetic professional contractors, probably from the U.S., the U.K., France, Germany, Japan, Australia, New Zealand, and of course Canada. Not BRIC. That's my 2c.
This is common sense. But it has one major political problem: as soon as you try to implement it, the large numbers of people who prefer emotion over thinking are going to scream RACISM. It is how the small-minded feel righteous and noble (instead of, you know, getting off their asses and doing something they believe in).
God help you if any of the work was going to be outsourced to people with some melanin in their skin. It won't matter how critical the project is or how hostile to the US the outsourc
Re: (Score:2)
Yeah, next you'll tell me that the Romans trusted a German named Hermann (Arminius) who then betrayed three entire legions at Teutoburger Wald.
Oops
http://en.wikipedia.org/wiki/Arminius [wikipedia.org]
Re: (Score:2)
Hello??? (Score:1)
outsourcing, lack of QA, golf course meetings, etc. (Score:2)
Outsourcing, lack of QA, golf course meetings, etc. also play a role, even more so when IT is out of the loop and the PHB makes the calls.
Life has risks, deal with it (Score:1)
A thug with a crowbar in meat-space is no different than some hacker on the Internet with a SQL injection.
Automobiles, airplanes, nuclear power plants, bank vaults, and other physical constructions are regularly identified with security flaws or weaknesses.
You know how to hack an armored Humvee full of infantry? With an IED. Life is dangerous. So is the Internet.
Most people don't live in bunkers. We accept the risk that all types of horrible things can happen, and we worry not. Wood and brick houses a
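The SQL-injection crowbar mentioned above is one of the few risks that is genuinely cheap to eliminate: parameterized queries keep attacker input out of the query text entirely. A minimal sketch using Python's built-in sqlite3 module (the `users` table and `find_user` helper are illustrative assumptions):

```python
import sqlite3

def find_user(conn, username):
    """Parameterized query: the driver passes `username` as data,
    never as SQL text, so input like "x' OR '1'='1" is matched
    literally as a name and returns nothing."""
    cur = conn.execute(
        "SELECT id, name FROM users WHERE name = ?",  # placeholder, not string concatenation
        (username,),
    )
    return cur.fetchall()
```

Contrast with `"... WHERE name = '" + username + "'"`, which is exactly the unlocked jewelry-store window: the thug still needs to show up, but the smash-and-grab takes one crafted string.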
Re: (Score:1)
Agreed, let's stop blaming the victims.
That Windows XP unpatched PC is "secure" until some knucklehead throws malware at it, just like the jewelry store with bars and an alarm is secure until three thugs show up with crow-bars and perform a smash-and-grab.
As security becomes more problematic for consumers, the market adjusts. In large part, we're already seeing some of this... Unconscious social movement "to the cloud" has a lot to do with putting our heads in the sand. Get the data off the box in front
Punching holes (Score:2)
It is hard to be secure when you exploit 0-day holes without warning the vendor in order to build Stuxnet [wikipedia.org] and similar tools, or when you force companies to leave holes [slashdot.org] for you to enter. Those two policies are incompatible with being secure.
Also, giving people access to virtually everything (even the private communications of companies and individuals) adds an especially weak point to the security. If politicians are so easy to bribe, why wouldn't FBI/NSA agents or middle management be?
Re: (Score:2)
Most holes today are opened by poor network management, poor patch management, poor password management, and most of all the users. Social engineering is the leading vector for most malware today.
Re: (Score:2)
Companies don't want to pay for security (Score:2)
Almost no company cares about anything that no one notices. Their MBAs weigh the cost of building something secure against the perceived chance of a security breach (or the chance that they will have moved to a different company before a breach occurs), and rarely are they willing to pay.
Outsourcing hurts security, and every big company does it. Why? Because it's cheap. You may argue about the knowledge level of the employees overseas, but that isn't the point. If you want it secure, you want your own employees wo
Re: (Score:1)
Here's the problem... (Score:1)
So, they regulate a software manufacturer to the point where very little in the way of features gets accomplished because of the focus on security fixes. Costs skyrocket for made-in-the-U.S., absolutely-secure software, while software made in India, Russia, China, etc. isn't beholden to the same regulations. Their software is cheaper, done sooner, and has all the features customers need. Software firms beholden to the regulations die off in droves. Problem solved, right?
I think there is a point here (Score:1)
The pharmaceutical industries have a lot of rules and procedures that need to be followed, to minimize risk to patients, and these rules are largely effective (sure, not completely, but killer drugs are pretty rare). The idea of 'release it now and fix it later' would never be tolerated in the pharmaceutical industry. Why can't the software industry aspire to similar safety standards? The idea that it is impossible to write perfectly secure code, where does that come from? Is that really true?
None of the above (Score:2)
It's not outsourcing, developers, lazy users, the Chinese or any other of the above mentioned causes that are at the root here. The root cause is the operating systems we all run aren't secure by design.
Linux, OS X, Windows, Android, and all the phones run systems based on the idea of users who can be trusted. This was a great idea for the computer science departments of the 1970s, prior to wide-scale networking and mobile code. The idea is just stupid in today's environment, and has just led to a ton
Secure Software Engineering is rarely taught (Score:3)
Server software that is very, very secure is possible. Look at, e.g. postfix, openssh, apache w/o modules, etc. It costs more, but the real issue is it has to be designed and implemented by people with strong secure software engineering skills. Today, secure software engineering is still rarely taught, and almost never as mandatory subject. As long as that continues, most software will suck security-wise, as secure software engineering requires a quite different mind-set from ordinary software engineering. It is however quite clear how to do it today. Techniques like privilege-separation, marking and tagging, secure containers, full input validation, etc. are well understood and cause massive increases in the difficulty to hack a system and can make it impossible. The problem is just that they are not used because so few people understand them.
My proposal: make secure software engineering courses mandatory for any SW engineering or CompSci qualification. Then add high liability risks for all those who do not use these techniques, to force management into abandoning shoddy practices.
Re: (Score:1)
"Server software that is very, very secure is possible." I have never heard of "beta" business-class server software licensing...
Re: (Score:1)
Re: (Score:2)
I have no idea what you are talking about. Care to clarify?
poor application (Score:2)
The premise of this posting is stoooopid! (Score:1)
open systems exploded into commerce (Score:2)