
Why Chinese Hacking Is Only Part of the U.S. Security Problem

An anonymous reader writes: "'Cyber espionage, crime, and warfare are possible only because of poor application or system design, implementation, and/or configuration,' argues a U.S. Air Force cyber security researcher. 'It is technological vulnerabilities that create the ability for actors to exploit the information system and gain illicit access to sensitive national security secrets, as the previous examples highlight. Yet software and hardware developers are not regulated in the same way as, say, the auto or pharmaceutical industries.' 'The truth is that we should no longer accept a patch/configuration management culture that promotes a laissez-faire approach to cyber security.'"
  • by Anonymous Coward

    US security sucks? Now, now, there's no need to become all yoddle! After all, the US has been propagating that which is unseen to the foreign admissive. Why don't we all just get all along, and become brothers in rancid?

    • US security sucks? Now, now, there's no need to become all yoddle! After all, the US has been propagating that which is unseen to the foreign admissive. Why don't we all just get all along, and become brothers in rancid?

      US security sucks or ALL electronic security sucks?

    • The internet is complete anarchy, and given that business is conducted globally, hardening the security landscape is daunting. Being a capitalist society, we point the finger at the government rather than at businesses collectively working toward some set of standards and holding accountable those who harm the infrastructure and weaken the economy. Lastly, the US has to protect its interests... and if that means waging a cyber war against rogue nations, so be it. "All's fair in love and in...
  • by khasim ( 1285 ) <brandioch.conner@gmail.com> on Friday June 07, 2013 @05:07PM (#43941261)

    First off, demand that every software vendor provide a list of files that their product installs, where those files are installed by default and different checksums/hashes/etc for them.

    It should be possible to boot a machine with a live CD (or PXE) and inventory every single file on that machine and identify the origin of each of them.

    At least you'd know whether a machine was cracked or not.

    Right now, with existing anti-virus, all you can say is that a machine does not have anything that matches the signatures that you have right now.
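
    A minimal sketch of what that verification could look like, assuming a hypothetical vendor manifest of "path<TAB>sha256" lines (the manifest format and any filenames here are made up for illustration):

        import hashlib
        import sys

        def sha256_of(path):
            # Hash in chunks so large binaries don't exhaust memory.
            h = hashlib.sha256()
            with open(path, "rb") as f:
                for chunk in iter(lambda: f.read(65536), b""):
                    h.update(chunk)
            return h.hexdigest()

        def verify(manifest_path):
            # Compare each "path<TAB>sha256" entry against the file on disk.
            problems = []
            with open(manifest_path) as manifest:
                for line in manifest:
                    path, expected = line.rstrip("\n").split("\t")
                    try:
                        actual = sha256_of(path)
                    except OSError:
                        problems.append((path, "missing/unreadable"))
                        continue
                    if actual != expected:
                        problems.append((path, "modified"))
            return problems

        if __name__ == "__main__":
            for path, reason in verify(sys.argv[1]):
                print(reason + ": " + path)

    Run from a live CD against the mounted target filesystem, so a rootkit on the suspect machine can't lie about its own files.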

    • Nowadays, folks try to do as much as possible in RAM -- by that I mean no patching files or writing to the FS at all. So keeping track of modifications to any sort of executable file (even indirectly executable; hell, even if it's not executable) will certainly be a handy tool, but not as much as you'd think. Also, debsums [ubuntu.com] already does this, and I'm sure other package managers support similar functionality. Now, if there's no such utility for your system (even a commercial 3rd-party one) then you may have chosen/s...
    • by Anonymous Coward

      Right now, with the Microsoft Windows way of doing things, all you can say is that a machine does not have anything that matches the signatures that you have right now.

      Fixed that for you.

      Turns out, the principle of least privilege isn't so easily bolted on in a haphazard manner after decades of doing it wrong, teaching software authors to expect it done wrong, and users not to know the difference.

      For *nix, there are Tripwire and AIDE and similar programs to do exactly what you're proposing, without support from the vendor. Isn't that easier than getting hundreds of independent companies to all cooperate with your scheme?

    • First off, demand that every software vendor provide a list of files that their product installs, where those files are installed by default and different checksums/hashes/etc for them.

      It should be possible to boot a machine with a live CD (or PXE) and inventory every single file on that machine and identify the origin of each of them.

      At least you'd know whether a machine was cracked or not.

      Right now, with existing anti-virus, all you can say is that a machine does not have anything that matches the signatures that you have right now.

      My days would be much simpler if all devs turned out software that is hospital grade. I am not really sure about the point khasim is making; most software and drivers are digitally signed. True, zero-day attacks are more frequent these days; however, I believe that is more related to streaming media that is harder for AV software to track because of torrents and proxies and embedding, and gets deciphered "on the fly".

  • by StillNeedMoreCoffee ( 123989 ) on Friday June 07, 2013 @05:08PM (#43941277)

    Start with designing operating systems that are secure and language environments that are secure, rather than feature-rich marketing shows. Don't put the blame on the programmers who have to work with shoddily designed infrastructure. Change the infrastructure.

    • by pspahn ( 1175617 ) on Friday June 07, 2013 @05:28PM (#43941465)

      You may be over-estimating the will of developers who actually intend to build something secure out of the box. Sure, you've got the chunk of folks who require fine-grained security in their day-to-day, but the rest of them, who take security for granted (we're not big enough yet to make things secure; we'll wait until revenue hits $xxx and then "do it right"), are just going to worry about making their stuff function according to the spec.

      I have left some code lying around before that I am not particularly proud of; not that anyone important would notice, as it tends to be things only another developer would recognize. It's difficult to think of occupations that are unaffected by this type of thinking, either; otherwise we wouldn't have to send the Dept. of Health around to restaurants to make sure the kitchens are clean, or pedagogists around to the elementary schools to make sure learning is happening, or aviation officials to enforce maintenance standards...

      Of course there needs to be accountability for code that does important things. That much is obvious. There are too many people interacting with code in occupations that previously wouldn't have done so. At some point it's going to be a good idea to have a nice audit trail.

      • by Cenan ( 1892902 )

        You may be over-estimating the will of developers who actually intend to build something secure out of the box.

        Not only that, but he and the article are hugely overestimating the amount of control the developer has over a project done on contract for an agency. "Do it this way", "make it work that way", "no, remove that annoying button"...
        The story also forgets to mention that, while the hackers did exploit bugs in software to gain entry, the tech to secure the targets already exists; someone with decision-making power just decided not to use it, and is now trying to palm off the responsibility onto the poor guys on the floor.

      • Damn right you better not be lazy... people who work on the "back end" and deal with your shit... will be paying you a little visit... If you got time, check your code, f-uckers... I like my wife and kids and weekends.
    • by satuon ( 1822492 )

      One of the best security measures would be to give each application its own directory where it can read/write, with nothing outside it available. The Open File dialog should then be separate from the application, and any file selected through it gives the program a descriptor for that file and nothing else.

      Basically, programs should run chrooted and not be allowed even read access to the entire hard disk.
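
      A rough sketch of the chroot part, for Unix and requiring root. The paths are made up for illustration, and chroot alone is not a complete sandbox (a root process can escape it, and the separate file-dialog broker would still need something like SELinux, AppArmor, or a capability system):

          import os

          def run_sandboxed(app_dir, argv):
              # Confine a child process to app_dir before exec'ing it.
              pid = os.fork()
              if pid == 0:
                  os.chroot(app_dir)   # the app's own directory becomes its "/"
                  os.chdir("/")
                  os.setgid(65534)     # drop privileges (nobody); group first,
                  os.setuid(65534)     # then user, or the setgid would fail
                  os.execv(argv[0], argv)
              os.waitpid(pid, 0)

          # run_sandboxed("/sandbox/myapp", ["/bin/app"])  # hypothetical paths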

  • by Midnight_Falcon ( 2432802 ) on Friday June 07, 2013 @05:11PM (#43941305)
    I find the summary quite myopic in terms of security -- it assumes there's a technological solution for every security problem. In reality, as long as humans have access to data, they can be deceived, tricked, or otherwise made to inadvertently disclose said information to a third party. I doubt there will ever be a technological solution that addresses this 100% -- you can build walls and try to idiot-proof your network, but then you will discover that someone has invented a better idiot.
  • by Bob_Who ( 926234 ) on Friday June 07, 2013 @05:11PM (#43941309) Journal

    .....In an hour, you'll be hungry again.

  • Just plain silly (Score:5, Insightful)

    by Gorshkov ( 932507 ) <AdmiralGorshkov@ ... Dcom minus punct> on Friday June 07, 2013 @05:13PM (#43941321)
    The whole idea that China should be "held responsible" for the hacking is just plain silly on its face. Governments and private corporations have been spying on each other ever since the first caveman tried to keep a secret.

    Can you imagine if, during the Cold War, the US President had gone to Stalin and said "please stop spying on us"? Because that's exactly what's being suggested here.
    • The whole idea that China should be "held responsible" for the hacking is just plain silly on its face. Governments and private corporations have been spying on each other ever since the first caveman tried to keep a secret.

      It's a form of sabre-rattling. Though it is useful to note the difference between spying, as in passive information gathering, and something intended to cause material damage, like Stuxnet. The latter actually is a form of attack.

      Can you imagine if, during the Cold War, the US President had gone to Stalin and said "please stop spying on us"? Because that's exactly what's being suggested here.

      I imagine the Soviets were pissed off about this one [wikipedia.org].

      The Trans-Siberian Pipeline, as planned, would have a level of complexity that would require advanced automated control software, Supervisory Control And Data Acquisition (SCADA). The pipeline used plans for a sophistica...

    • . . . as I mention in a later comment, when all those tech jobs, technology, and investment have been shipped to China, this is the likely result, with generations of American students/workers rendered almost obsolete in their pursuit of IT employment.
  • Sounds like an excuse to spend more money on more stuff that they already have / don't need.

    Take a look at the IT/data security investment in the automotive/pharma industries, and then ask yourself, "well, why are they so secure?"
  • Oh, I'm Sorry (Score:4, Insightful)

    by doctor woot ( 2779597 ) on Friday June 07, 2013 @05:14PM (#43941329)

    Do you expect medical professionals to be able to cure every disease and infection ever? Do you expect automotive engineers to be able to build mechanically perfect vehicles? No. Of course the attitude the majority of people take towards online security is a joke, but no more so than saying "Cyber espionage, crime, and warfare are possible only because of poor application or system design, implementation, and/or configuration."

    Cyber espionage, crime, and warfare exist through the same mechanisms that allow viruses to become resistant to treatment: adaptation. Systems can be designed to be harder to break; they can't be made impenetrable. The language used in this article is just the same old IT-focused yellow journalism we've all come to expect on the subject.

    • Re: (Score:2, Insightful)

      by iggymanz ( 596061 )

      Your analogy is not accurate; the majority of vulnerabilities are due to variations on the same dozen sloppy coding mistakes. A proper analogy would be most car manufacturers, in some hypothetical country that drives on the right and has many highway ramps, not putting bolts on the right front wheel and not having a problem most of the time, because most turns are to the right and not the left, and the occasional left turn is almost always followed by a right that reseats the wheel.

      • the majority of vulnerabilities are due to variations on the same dozen sloppy coding mistakes

        I don't doubt that.

        A proper analogy would be most car manufacturers, in some hypothetical country that drives on the right and has many highway ramps, not putting bolts on the right front wheel and not having a problem most of the time, because most turns are to the right and not the left, and the occasional left turn is almost always followed by a right that reseats the wheel.

        That would be a proper analogy if that's what was being argued. While the article did call for stricter security standards for commercially produced code, something I agree with, it also said that breaches of security would not happen if such standards were in place. Hence the analogy: you can make the system better; you can't make it perfect.

    • Do you expect automotive engineers to be able to build mechanically perfect vehicles? No.

      Vehicles that never fail? No. Vehicles that have a reasonable failure mode? Yes.

      Consider the air brakes on a tractor trailer. The air is what keeps the brakes apart. If some mechanical failure caused a loss of air pressure, the failure mode would be stopping the vehicle. That is acceptable. If they did it the other way, with the air pressure being used to apply the brakes, the first sign of failure could be the inability to stop the vehicle at highway speed. That is not acceptable.

      Either way,...
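
      In code, the same principle is usually called "failing closed": if the safety check itself breaks, deny rather than allow. A minimal sketch (the auth_service object here is hypothetical):

          def is_allowed(user, resource, auth_service):
              # Like air brakes engaging on pressure loss: an error in the
              # checking machinery must deny access, never grant it.
              try:
                  return auth_service.check(user, resource)
              except Exception:
                  return False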

      • You should read my comment again, because your reply is essentially repeating what my post said to begin with. Do people treat security poorly in the IT industry? Yes. Can security be strengthened by more rigid standards and harsher penalties for failure? Yes.

        What I responded to, and I'll quote it again, was "Cyber espionage, crime, and warfare are possible only because of poor application or system design, implementation, and/or configuration." The implication here is that these things are NOT possible if systems are not poorly designed, implemented, and configured. That's a load of bullshit. Even with the best security advancements available you are simply not immune. To suggest otherwise is to display ignorance of the subject.

        • You should read my comment again, because your reply is essentially repeating what my post said to begin with. Do people treat security poorly in the IT industry? Yes. Can security be strengthened by more rigid standards and harsher penalties for failure? Yes.

          What I responded to, and I'll quote it again, was "Cyber espionage, crime, and warfare are possible only because of poor application or system design, implementation, and/or configuration." The implication here is that these things are NOT possible if systems are not poorly designed, implemented, and configured. That's a load of bullshit. Even with the best security advancements available you are simply not immune. To suggest otherwise is to display ignorance of the subject.

          Would you concede that (say, by using managed languages) eliminating all buffer overflows would be a huge step in the right direction? We have the capability of doing that. There is still the impossibility of ever conclusively proving that a given piece of software is completely free of all possible bugs, but that's a lofty and unrealistic goal. There are many feasible steps we could take that are realistic. We generally don't take those steps because the trade-offs involved don't fit our priorities. T...
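
          For what it's worth, the bounds-checking half of that trade-off is easy to see in any managed language: an out-of-range access raises an error instead of silently reading or writing adjacent memory, which is the raw material of a classic buffer overflow. A toy illustration in Python:

              buf = bytearray(16)
              try:
                  buf[32] = 0xFF   # out of bounds: raises IndexError instead of corrupting memory
              except IndexError:
                  print("out-of-bounds write rejected")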

          • Re: (Score:3, Insightful)

            I think that with the latter case, you're going to an absurd extreme that no one is realistically suggesting. That was my point.

            Except it was suggested. The premise given was that should "poor application or system design, implementation, and/or configuration" be eliminated, so too would "cyber espionage, crime, and warfare". My argument was that tasking engineers with eradicating all of those problems would be like tasking doctors with curing every disease. I'm not the one going to an absurd extreme; it's a direct quote taken from TFA. I'm merely pointing it out.

    • Do you expect medical professionals to be able to cure every disease and infection ever? Do you expect automotive engineers to be able to build mechanically perfect vehicles? No. Of course the attitude the majority of people take towards online security is a joke, but no more so than saying "Cyber espionage, crime, and warfare are possible only because of poor application or system design, implementation, and/or configuration."

      No, but cyber espionage, crime and warfare are made enormously easier and more productive by shoddy security design.

      • Except, again, that's not what's being argued. What was said in TFA was that the ONLY reason cyber crime, espionage, etc. exist is shoddy security design. This is not only completely false, it unnecessarily burdens engineers and sysadmins with the task of somehow managing the impossible.

  • That is: someone who actually argues that Chinese hacking is the entirety of the U.S. security problem?

    • That is: someone who actually argues that Chinese hacking is the entirety of the U.S. security problem?

      Yea - Sergei from totallylegitbankwebsite.ru

    • I am a geek, so yesterday's revelations did not surprise me, because this kind of bullshit has gone on for years now, and I assumed all of the "hulla-balloo" that went viral was from people who have never gone on the internet, or used a cell phone, or taken a high school history course, or have any knowledge of WWII. The buzz created yesterday was quite unnerving to me, because I never assumed that so many people were oblivious to this. IT students run software (I would imagine) like PRISM for lear...
  • by Anonymous Coward

    In mainstream corporations none of this is going to happen until security issues impact the bottom line. And then it will be the corps' typical approach of addressing specific instances. The military too; Adobe and Windows are used all over the place.

    • In mainstream corporations none of this is going to happen until security issues impact the bottom line. And then it will be the corps' typical approach of addressing specific instances. The military too; Adobe and Windows are used all over the place.

      Mainstream corporations... What corporation does not use computers, phones, or networks? Candy Land?

    • by jc42 ( 318812 )

      In mainstream corporations none of this is going to happen until security issues impact the bottom line. And then it will be the corps' typical approach of addressing specific instances. ...

      Nah; it'll be the typical approach of finding the "hackers" that expose security holes, then firing and/or prosecuting them to teach them (and their friends) the traditional lesson: We don't want to hear about our security problems. We'll continue to punish you hackers until you give up.

      In my experience, this approach is the usual one that corporations use with their own developers. It's why the smarter developers tend to avoid working on security-related stuff. They want to keep their jobs, stay out...

  • by Anonymous Coward

    Cue the "But software is hard and we can't do it well" cries from the incompetent.

  • Forget the arguments about "software - a non-regulated industry"; that's noise. The reality is:

    - Businesses: make hacking illegal and unload the cost of keeping us secure onto the govt; a business's purpose is to make money, not security
    - Army: buddy, it worked for LulzSec. But now you're on your own; we can't do it

  • by TwineLogic ( 1679802 ) on Friday June 07, 2013 @05:30PM (#43941487)
    In one example I saw, the, um, mistake in security implementation was committed by a Belarusian contractor who had strong feelings against the U.S. oil interests in Georgia (Eastern Europe) and was working at a U.S. mega-corporation...

    Hiring certain political persuasions to do mission-critical work for mega-corporations is something I would look out for. I specifically mean that hiring anti-U.S. personalities to perform work on U.S. infrastructure has its weaknesses.

    When mega-corporations implement critical infrastructure (e.g. login credentials) they should be using sympathetic professional contractors, probably from the U.S., the U.K., France, Germany, Japan, Australia, New Zealand, and Canada, of course. Not BRIC. That's my 2c /.
    • Re: (Score:2, Insightful)

      by Anonymous Coward

      In one example I saw, the, um, mistake in security implementation was committed by a Belarusian contractor who had strong feelings against the U.S. oil interests in Georgia (Eastern Europe) and was working at a U.S. mega-corporation... Hiring certain political persuasions to do mission-critical work for mega-corporations is something I would look out for. I specifically mean that hiring anti-U.S. personalities to perform work on U.S. infrastructure has its weaknesses. When mega-corporations implement critical infrastructure (e.g. login credentials) they should be using sympathetic professional contractors, probably from the U.S., the U.K., France, Germany, Japan, Australia, New Zealand, and Canada, of course. Not BRIC. That's my 2c /.

      This is common sense. But it has one major political problem: as soon as you try to implement it, the large numbers of people who prefer emotion over thinking are going to scream RACISM. It is how the small-minded feel righteous and noble (instead of, you know, getting off their asses and doing something they believe in).

      God help you if any of the work was going to be outsourced to people with some melanin in their skin. It won't matter how critical the project is or how hostile to the US the outsourc...

    • Yeah, next you'll tell me that the Romans trusted a German named Hermann (Arminius) who then betrayed three entire legions at Teutoburger Wald.

      Oops

      http://en.wikipedia.org/wiki/Arminius [wikipedia.org]

  • Every piece of technology we use is made in China. And we're just now thinking about this??? Duh!!!
  • Outsourcing, lack of QA, golf-course meetings, etc. also play a role, even more so when IT is out of the loop and the PHB makes the calls.

  • A thug with a crowbar in meat-space is no different than some hacker on the Internet with a SQL injection.

    Automobiles, airplanes, nuclear power plants, bank vaults, and other physical constructions are regularly identified with security flaws or weaknesses.

    You know how to hack an armored Humvee full of infantry? With an IED. Life is dangerous. So is the Internet.

    Most people don't live in bunkers. We accept the risk that all types of horrible things can happen, and we worry not. Wood and brick houses a...

  • It's hard to be secure when you exploit 0-day holes without warning the vendor in order to make Stuxnet [wikipedia.org] and similar, or when you force companies to leave holes [slashdot.org] for you to enter. Those two policies are incompatible with being secure.

    Also, giving some people access to virtually everything (even the private communications of companies/individuals) adds an especially weak point to the security. If politicians are so easy to bribe, why shouldn't FBI/NSA agents or middle management be?

    • Most holes today are opened by poor network management, poor patch management, poor password management, and most of all the users. Social engineering is the leading vector of most malware today.

      • by gmuslera ( 3436 )
        The best approach to closing holes is education, not opening even more holes, forbidding the closing of existing ones, or making people accept living with holes (after all, if we have no privacy because of an evil government, what more could evil hackers do?)
  • Almost no company cares about anything that no one notices. Their MBAs weigh the cost of building something secure against the perceived chance of a security breach (or the chance that they'll be at a different company when a breach occurs) and are rarely willing to pay.

    Outsourcing hurts security, and every big company does it. Why? Because it's cheap. You may argue about the knowledge level of the employees overseas, but that isn't the point. If you want it secure, you want your own employees wo...

    • Correct on some levels. I do not mind calling India, though. Unreliable hardware, apathy, untrained staff, and stupid trash security. Small businesses or ISPs in my small town (10 ) can get by easily with outsourcing, remote config, and call centers like MSFT's.
  • by Anonymous Coward

    So, they regulate a software manufacturer to the point where very little in the way of features gets accomplished in favor of focusing on security fixes. Costs skyrocket for made-in-the-U.S.-absolutely-secure software; meanwhile, software made in India, Russia, China, etc. isn't beholden to the same regulations. Their software is cheaper, done sooner, and has all the features customers need. Software firms beholden to the regulations die off in droves. Problem solved, right?

  • The pharmaceutical industry has a lot of rules and procedures that need to be followed to minimize risk to patients, and these rules are largely effective (sure, not completely, but killer drugs are pretty rare). The idea of "release it now and fix it later" would never be tolerated in the pharmaceutical industry. Why can't the software industry aspire to similar safety standards? And the idea that it is impossible to write perfectly secure code: where does that come from? Is it really true?

  • It's not outsourcing, developers, lazy users, the Chinese, or any of the other above-mentioned causes that are at the root here. The root cause is that the operating systems we all run aren't secure by design.

    Linux, OS X, Windows, Android, and all the phones run systems based on the idea of users who can be trusted. This was a great idea for the computer science departments of the 1970s, prior to wide-scale networking and mobile code. The idea is just stupid in today's environment, and has led to a ton...

  • by gweihir ( 88907 ) on Friday June 07, 2013 @11:25PM (#43943883)

    Server software that is very, very secure is possible. Look at, e.g., postfix, openssh, apache w/o modules, etc. It costs more, but the real issue is that it has to be designed and implemented by people with strong secure-software-engineering skills. Today, secure software engineering is still rarely taught, and almost never as a mandatory subject. As long as that continues, most software will suck security-wise, as secure software engineering requires quite a different mind-set from ordinary software engineering. It is, however, quite clear how to do it today. Techniques like privilege separation, marking and tagging, secure containers, full input validation, etc. are well understood; they cause massive increases in the difficulty of hacking a system and can make it impossible. The problem is just that they are not used, because so few people understand them.

    My proposal: make secure software engineering courses mandatory for any SW-engineering and CompSci qualification. Then add high liability risks for all those who do not use these techniques, to force management into abandoning shoddy practices.
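
    As one small illustration of the "full input validation" technique mentioned above, validating against an explicit whitelist before any use (the username rule here is an assumption for the sketch, not a standard):

        import re

        # Whitelist what a username may look like; reject everything else.
        USERNAME_RE = re.compile(r"[a-z][a-z0-9_]{2,31}")

        def parse_username(raw):
            # fullmatch: the whole string must fit the pattern, so nothing
            # can be smuggled in before or after a valid-looking prefix.
            if not USERNAME_RE.fullmatch(raw):
                raise ValueError("invalid username")
            return raw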

    • "

      Server software that is very, very secure is possible. " I have never heard of "Beta" business class Server software licensing...

    • I have never heard of "beta" business-class server software licensing... (Don't mean to be a smartass here, but you are talking business domain-controller software. If you are talking mission-critical, bug-level software, then my bad.)
  • Poor application doesn't come from lack of familiarity or poor training, however. It comes from tools which do not adequately expose functionality to the end users. Every time a tech argues "but technology X can do this, you just need to learn how to do Y", he is dropping the ball. This argument was only appropriate when interfaces were limited by technological capacities (first due to being done in hardware, such as radio knobs, and then due to lack of computing power to do both interfaces and main application log...
  • When the DoD (that would be Dept. of Defense, for the dummies who regularly read this site) issues the top security level (O-Ring) to Micro$oft's operating systems, and MS hands over their OS source code to the Chinese gov't, that could be a major cause of the problem. Another major cause would be offshoring all those jobs to China --- offshoring all that technology to China --- offshoring all that investment to China (instead of corporate amerika amortizing into the country from which they are based, and shou...
  • Hackers did not want to develop on closed systems like DEC VMS, with its deep levels of security. That was very painful for the few months I had to work with it. Now we are paying for this.
