Open Source Security

Are There Gaps in Training for Secure Software Development? (linuxfoundation.org)

A new report "explores the current state of secure software development," according to an announcement from the Linux Foundation, "and underscores the urgent need for formalized industry education and training programs," noting that many developers "lack the essential knowledge and skills to effectively implement secure software development."

The report analyzes a survey of nearly 400 software development professionals performed by the Open Source Security Foundation (OpenSSF) and Linux Foundation Research: Survey findings outlined in the report show nearly one-third of all professionals directly involved in development and deployment — system operations, software developers, committers, and maintainers — self-report feeling unfamiliar with secure software development practices. This is of particular concern, as they are the ones at the forefront of creating and maintaining the code that runs a company's applications and systems.

"Time and again we've seen the exploitation of software vulnerabilities lead to catastrophic consequences, highlighting the critical need for developers at all levels to be armed with adequate knowledge and skills to write secure code," said David A. Wheeler, director of open source supply chain security for the Linux Foundation. "Our research found that a key challenge is the lack of education in secure software development. Practitioners are unsure where to start and instead are learning as they go. It is clear that an industry-wide effort to bring secure development education to the forefront must be a priority." OpenSSF offers a free course on developing secure software (LFD121) and encourages developers to start with this course.

Survey results indicate that the lack of security awareness is likely due to most current educational programs prioritizing functionality and efficiency while often neglecting essential security training. Additionally, most professionals (69%) rely on on-the-job experience as a main learning resource, yet it takes at least five years of such experience to achieve a minimum level of security familiarity.

"The top reason (44%) for not taking a course on secure software development is lack of knowledge about a good course on the topic," according to the announcement — which includes this follow-up quote from Intel's Christopher Robinson (co-chair of the OpenSSF Education SIG).

"Based on these findings, OpenSSF will create a new course on security architecture which will be available later this year which will help promote a 'security by design' approach to software developer education."
Comments:
  • by Narcocide ( 102829 ) on Saturday July 20, 2024 @02:45PM (#64641152) Homepage

    Sure, you have to understand the fundamentals and how to properly evaluate and configure solutions and mitigate risk, but the primary responsibility of an already competent security engineer is telling people "no." Invariably, these are people who don't understand technology or security even the tiniest bit and don't care about it either, outrank you in the corporate hierarchy, and in general in business and in life aren't used to getting told "no." So, of course at the first opportunity to outsource the responsibility to a 3rd party that says "yes" to everything and claims they can 100% automate the task with no gatekeepers involved, they're gonna lay off all those pesky argumentative nerds and jump right on the no-responsibilities-ever bandwagon. And that's what leads to a situation like what happened yesterday.

    • In my experience, many developers are either self-taught in development and programming, or self-taught about security, crypto, defensive coding, etc. On the embedded side, *many* developers have degrees in EE rather than Computer Science, and arrive without formalized training in software practices.

      I have some slight background in crypto, but only at the undergrad level. I did manage the security experts for a while, people with the proper mathematical background, patents under their belt, and prior experience. I can certain...

      • Gap 1 - Basic software security training for developers in college
        Gap 2 - Using security-lax protocols - HTTP - and things built on connectionless protocols, which send data in plain text with lots of boundary cases for sending/parsing data incorrectly
        Gap 3 - Using defect-prone languages - JavaScript - (hoisting, anyone?): lacking compilation beforehand, lacking type checking before compilation, having known language defects (return ends the statement in only some cases), having multiple ways to define the sam...
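        As a concrete aside on Gap 3 (not part of the original comment): a minimal TypeScript sketch of the hoisting and return-statement pitfalls named above. The function names are invented for the example.

        ```typescript
        // Hoisting: `var` is hoisted to function scope, so the read below
        // silently yields `undefined` instead of failing fast. Block-scoped
        // `let`/`const` would turn the same mistake into a compile-time error.
        function hoistingPitfall(): void {
          console.log(count); // prints "undefined" -- no error anywhere
          var count = 1;
        }

        // Automatic semicolon insertion: a newline after `return` ends the
        // statement, so the object literal below is never evaluated.
        function returnPitfall(): unknown {
          return          // a semicolon is silently inserted here
            { ok: true }; // dead code
        }

        hoistingPitfall();
        console.log(returnPitfall()); // undefined, not { ok: true }
        ```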

    • by gweihir ( 88907 )

      Yes. Profits trump safety. In all established engineering disciplines it is understood that this is exceptionally costly for society and completely unacceptable. In software and systems, it is not. And that has to stop. Hence we need strict product liability and automatic culpability for anybody not following or keeping current on the state of the art. It can be done. There are enough people who have the skills, knowledge, and experience. Obviously a lot of the peddlers of crap software and bad IT services...

  • What? (Score:4, Informative)

    by The Cat ( 19816 ) on Saturday July 20, 2024 @02:51PM (#64641158)

    What training? There hasn't been any "training" in the workplace for decades.

    See yesterday's headlines for an example.

  • by ctilsie242 ( 4841247 ) on Saturday July 20, 2024 @02:52PM (#64641162)

    One of the things I've encountered in the "real world" is that people will tell developers, "security has no ROI", or "the only one that profits from a lock is the lock maker". To the point where, at the standup meeting, someone delayed on a deliverable will be excoriated because they are "wasting time" on doing security "right" rather than just doing the bare minimum.

    This shows with companies as well. If a security breach happens, it may hurt their stock, but it will be back where it was next quarter. For example, CrowdStrike may take a hit today, but if one looks at how it is doing YoY, it is doing quite well, even with the hit on Friday. This likely will just be a blip and things will be back to normal in a month, since signed contracts mean they will be getting revenue no matter how loud the outcry is, and by mid-August this will all but be forgotten, except a year from now when it pops up in Facebook Memories.

    For the most part, companies don't care about security. Right now, they barely care about features either. They get their customers on a subscription, do next to nothing other than maintain the existing code base, apply exponential price hikes, and they are Wall Street darlings.

    The exceptions are relatively few. Government comes to mind, because they might actually demand an audit, or even yank an ATO (authority to operate). Hollywood is arguably the most secure: if someone screws up even in the slightest with regard to the rules spelled out by the MPA, that contractor will be tossed off the set immediately, no appeals, no wrist-slaps. Divulging a movie ending can lose megabucks (or even worse, full-res footage hitting torrents), so they actually take security seriously when it comes to film production. Other than those two, pretty much any company that is big enough can provide lip service and get away with things without consequence.

    Security is needed, and it is only a matter of time before a Warhol event happens that actually gets governments scrambling to pay attention to cyber security. Something like the late 1990s/early 2000s, when viruses started destroying monitors and computers, and that made businesses actually take AV seriously, because they had skin in the game, rather than just paying an offshore "consultant" company the ransom plus a fee so ransomware actors get their money, since paying off ransomware is cheaper than a security focus.

    • CrowdStrike recently made a huge unforced error, most likely due to a lack of QA. If anyone dies because they lost access to emergency services or healthcare was delayed, expect litigation that CrowdStrike would lose in court.

      • We have had companies like that in the past. I will be truly amazed if anything sticks, because a EULA/ToS is pretty much carte blanche nowadays. At most there might be some settlements out of court, but I don't think much is going to happen... at least not something significant enough to get many places to change from the current "it builds, ship it!" mentality that is predominant in the industry.

      • by Q-Hack! ( 37846 )

        The part of this that really scares me is: how did all of the Fortune 500 companies get away from using test environments before deploying to operational systems? It's like everybody shifted away at the same time.

        • Money. It takes time and money to do development "right". The same reason why many (though not all) companies have a product manager from Marketing as the Scrum master, acting as a screaming, brutal boatswain, yelling at the devs like they are dazed, chained prisoners to move faster. In these environments, code gets done, but security goes out the window. If a dev does something that causes a major crash, there are layers and layers of company bureaucracy between them and consequences, while if they don'...

          • by narcc ( 412956 )

            It takes time and money to do development "right"

            Indeed it does, but what few seem to realize is that Agile and Scrum make things worse. The current CI/CD trend just compounds the problem.

            security goes out the window

            As anyone who thought about it for a few seconds would have realized immediately. Agile trades quality for (initial) speed and quantity.

            especially when combined with a LoC (lines of code) requirement

            Does anyone really still do that?

            Now add AI

            Just when you thought things couldn't get any more absurd...

            • From what I have seen, I have worked at good places that did waterfall development. The product was, in general, just of higher quality, because the requirements were laid out, people had stuff to do, and they got their pieces together. Agile may be "cheaper", and I'm sure it makes the PM's ego big, being the Scrum master who gets to play high-and-mighty judge in daily kangaroo court, gleefully smirking when devs point at each other, saying, "they are blocking me!"

              In those environments, I can...

        • by micheas ( 231635 )

          The part of this that really scares me is: how did all of the Fortune 500 companies get away from using test environments before deploying to operational systems? It's like everybody shifted away at the same time.

          Because this isn't "software", it is "ransomware prevention", which was installed because cyber insurance companies demanded something, and this was the cheapest, least-effort thing they could do to make the cyber insurance companies happy.

          And this probably isn't covered by their insurance company as it probably falls under one of the many exclusions of the policies.

          My solution was having the latest backups fully restored on a daily basis via a cron job, which the insurance company was happy with.

    • One of the things I've encountered in the "real world" is that people will tell developers, "security has no ROI", or "the only one that profits from a lock is the lock maker". To the point where, at the standup meeting, someone delayed on a deliverable will be excoriated because they are "wasting time" on doing security "right" rather than just doing the bare minimum.

      Really depends on the context. I work at a certain company where we go by the saying: don't let perfect be the enemy of good enough. I had a situation where I was asked to allow users to filter data with regex (usually not a good idea, but it was necessary). I was going to restrict the number of times that a pattern could recurse (which needed more work on my part) so that users can't DoS the server. But somebody else interjected to suggest that we not bother limiting it unless a problem shows up. And ul...
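      For what it's worth, a guard like the one described can be sketched in a few lines. This is a hypothetical TypeScript illustration, not the poster's actual code; the limits and the nested-quantifier heuristic are invented for the example.

      ```typescript
      // Cap what a user-supplied regex filter is allowed to do before it
      // ever runs, instead of hoping no hostile pattern shows up.
      const MAX_PATTERN_LENGTH = 200;
      const MAX_INPUT_LENGTH = 10_000;

      // Crude check for patterns like (a+)+ that invite catastrophic
      // backtracking; a real guard would use a regex-complexity analyzer.
      const NESTED_QUANTIFIER = /\([^)]*[+*][^)]*\)[+*{]/;

      function safeFilter(pattern: string, rows: string[]): string[] {
        if (pattern.length > MAX_PATTERN_LENGTH) {
          throw new Error("pattern too long");
        }
        if (NESTED_QUANTIFIER.test(pattern)) {
          throw new Error("pattern rejected: nested quantifiers");
        }
        const re = new RegExp(pattern);
        return rows.filter((row) => row.length <= MAX_INPUT_LENGTH && re.test(row));
      }

      // safeFilter("^(a+)+$", ["aaaa"]) throws instead of letting a
      // hostile pattern pin the server CPU.
      ```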

      • by gtall ( 79522 )

        And when other systems start relying upon your system, how will they know this could screw them? And those other systems might be in other companies that actually do critical work. Learn to think.

    • by gweihir ( 88907 )

      Yes. And that needs to change. It is no accident that we require, for example, a UL certification on an electrical appliance. Time to make the peddlers of crappy software pay in full for the damage they do. Yes, some large names (like MicroShit) will most certainly not survive that. But the human race as a whole would benefit tremendously.

      • Yes. And that needs to change. It is no accident that we require, for example, a UL certification on an electrical appliance. Time to make the peddlers of crappy software pay in full for the damage they do. Yes, some large names (like MicroShit) will most certainly not survive that. But the human race as a whole would benefit tremendously.

        I would expect the opposite -- the human race would suffer on the whole.

        Development costs and costs to insure these companies go through the roof. These costs would pass down to their customers.

        Time between release dates slips to multiple years. Features are pared back to bare bones. Now everyone is not only paying more but getting less.

        Companies become inordinately risk-averse and pare back efforts at innovation. Advancements are stifled. Smaller firms just give up. Developers at firms of any siz...

        • Depending on how the regs are written, who knows what the blowback to the free software community would be. You'd like to think Microsoft would take the brunt, but the Apache Software Foundation is just as likely to go under.

          This is the part where I get concerned, and I think you're understating the case. Copyleft is a hack; it's not how the copyright system was intended to work. Copyright was supposed to be a tax revenue generator, not a way for a bunch of people to protect their Software Freedom. But now the copyright cartel is an extremely strong lobby (and I will freely admit that is certainly an understatement), and the most likely outcome is probably damage to Open Source and Free Software.

          On the other hand, it is arguably corre...

      • We already had that back in 2001-2002 -- Sarbanes-Oxley. We had a lot of suit-wearing chatter monkeys tearing out Linux installs and forcing Windows 2000 Server because "Linux was not compliant". This was eventually mitigated by people showing that Red Hat was Red Book, FIPS, and Common Criteria compliant, but it allowed a lot of people who didn't care about and didn't know Linux to have larger companies pay for big changes.

        Something like this will not do anything. It just means the big guys pay to game...

  • The classic tech "engineer" mantra is "move fast and break things." That's the opposite of engineering. To create secure and reliable products, one must use sound, deliberative engineering: test, refine, and release a product. In other words, a commitment to quality, where quality means stable, secure programming.

    • The two are not as much in conflict as you might suppose. MFABT means being willing to try new things; don't get so attached to the status quo that you stagnate or bog everything down with process. Sometimes that can lead to problems. Sometimes it's exactly what you need in order to escape a local maximum and find new, more secure ways of doing things.
    • by gweihir ( 88907 )

      Indeed. But that stands in the way of getting filthy rich on crappy tech. Unless and until we have full vendor liability, nothing will change. All the crapware producers are basically ripping off all of society. And, looking at the profits MicroShit makes, for example, that scam is going well.

  • by mukundajohnson ( 10427278 ) on Saturday July 20, 2024 @03:44PM (#64641232)

    the critical need for developers at all levels to be armed with adequate knowledge and skills to write secure code

    Firstly, many companies don't see it as a critical need - see ctilsie242's post. Given that many companies want to take advantage of your commitment to this trade for as little money as possible, paying extra for security experts is a difficult decision. Exceptions apply.

    Secondly, "teaching everyone" is not how you fix a problem. There are always going to be gaps. Teaching everyone sounds excessively tedious, up front and in the long run. Imagine a looming threat as a developer—especially a junior—because you need to constantly apply caution to not screw up.

    What you need is competent developer leadership to make screwing up difficult. Personally I don't think there are a lot of important rules for writing secure software, especially in modern languages and frameworks. Get yourself some experienced developers and:

    1. Lay out some code guidelines to follow.
    2. Code review as necessary.
    3. Automated checks/tests - don't skimp on these, especially if working in an uglier language. Figure out those static analyzers.
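    To make item 3 concrete, here is a toy sketch (invented for this writeup, not from the comment) of a homegrown automated check that fails the build when it spots SQL assembled by string concatenation. The directory name and the pattern are illustrative; a real project would lean on an off-the-shelf static analyzer.

    ```typescript
    // Walk the source tree and flag files that appear to build SQL by
    // concatenating strings, e.g. "SELECT ... '" + name. Wire this into
    // CI so the build fails on a hit.
    import { readdirSync, readFileSync } from "node:fs";
    import { join } from "node:path";

    const SUSPECT = /(SELECT|INSERT|UPDATE|DELETE)[^"'`]*["'`]\s*\+/i;

    function scan(dir: string): string[] {
      const hits: string[] = [];
      for (const entry of readdirSync(dir, { withFileTypes: true })) {
        const path = join(dir, entry.name);
        if (entry.isDirectory()) hits.push(...scan(path));
        else if (entry.name.endsWith(".ts") && SUSPECT.test(readFileSync(path, "utf8"))) {
          hits.push(path);
        }
      }
      return hits;
    }

    const findings = scan("src");
    if (findings.length > 0) {
      console.error("possible concatenated SQL in:", findings);
      process.exit(1); // fail the build
    }
    ```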

    That said, see the first point. AT&T emailed me recently about me being affected by the recent breach, but do I care enough to start a fire? I'm part of the problem too.

    If we are interested in increasing security for our country, we need to give companies a real reason. For example, in construction work, you have to follow building codes and have inspections/audits, especially to prevent fires. There are similar codes/controls for secure software, e.g., the FedRAMP/ATO program or SOC-2 audits, but many vendors don't need to follow these to sell their product. I have seen many sketchy products being used in government systems.

    • 2. Code review as necessary.

      I always found resistance to code review. I always wanted my code reviewed and everybody looked at me funny, like "Why?"

      • I know the pain... I feel super uncomfortable when working with an unfamiliar project alone. IMO it's a lack of empathy. We're all humans susceptible to mistakes and people tend to forget that all the time. I always say that one perspective can only cover so much. The better engineers out there have a better sense of our limits and are more cooperative.

        Sadly, we go back to the first point, where most companies aren't punished enough to deem better engineers necessary in the budget.

      • Code review processes have a tendency to degrade into style-enforcement mechanisms for the most office-politics-driven member of the team to try to be seen. People who have experienced that will likely resist extremely formal, strict, and byzantine code review rules. The same forces that prevent thinking about software security will corrupt any cargo-cult list of checkboxes meant to fix your security. This is a culture problem, a bureaucracy problem, an office politics problem; this isn't a developer...

        • Yes, the style BS. That was the crap I'd have to endure, even when there was an official company standard that nobody followed anyway. My ethic there was to code in the same format/style when working on other people's code, even if it was stupid and ugly. (I was the Linux / K&R stylist, not the stupid and ugly Whitesmiths style ;) I liked straightforward code, not tricky.
          I found that during code reviews, while presenting, I'd be the one finding the bugs, not the reviewers.
  • It's having to rely on devs to implement security in the first place. You don't expect a developer writing db access code to include logic that determines whether or not the end user has sufficient privileges for the data in the tables and records. Sure, you can handle the various scenarios where access is or isn't granted - but it would be dumb to have the access policy implemented separately in every application that talks to the database. In an ideal world it simply shouldn't be possible for a dev to writ...
    • It isn't the language's job to provide security. It can help in some ways, but that isn't sufficient. You need a well-maintained authentication layer and you need people to use it.
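    One hedged sketch of what such a shared layer can look like at the application tier. Express-style middleware is assumed purely for illustration; the role model and the header-based lookup are invented for the example.

    ```typescript
    // Authorization enforced once, in middleware, so individual route
    // handlers cannot forget it.
    import express, { Request, Response, NextFunction } from "express";

    const app = express();

    // Hypothetical session lookup; a real system would verify a signed
    // token or a server-side session, never a bare header.
    function currentRole(req: Request): "admin" | "user" | "anon" {
      return (req.header("x-role") as "admin" | "user" | undefined) ?? "anon";
    }

    function requireRole(role: "admin" | "user") {
      return (req: Request, res: Response, next: NextFunction) => {
        const have = currentRole(req);
        if (have === "admin" || have === role) return next();
        res.status(403).json({ error: "forbidden" });
      };
    }

    // Handlers never re-implement the policy; they only run once the
    // shared check has passed.
    app.get("/reports", requireRole("user"), (_req, res) => {
      res.json({ reports: [] });
    });
    app.delete("/users/:id", requireRole("admin"), (_req, res) => {
      res.status(204).end();
    });

    app.listen(3000);
    ```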
    • Development tools and runtimes can and in many cases arguably should do as much as possible to make security easy. Things like buffer overflows and the like can be all but eliminated with a good development and runtime environment.

      That said, there are security issues that aren't so much a "computer problem" as a problem inherent in the task the computer is trying to do.

      Obligatory xkcd [xkcd.com].
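      As a small hedged illustration of that point in TypeScript: in a bounds-checked runtime the classic overflow has nowhere to go.

      ```typescript
      // Out-of-bounds writes to a typed array are silently dropped; no
      // neighboring memory is ever touched.
      const buf = new Uint8Array(4);
      buf[10] = 0xff;
      console.log(buf[10]); // undefined

      // Ordinary arrays are bounds-checked too: reading past the end
      // yields undefined, not whatever happened to sit there.
      const arr = [1, 2, 3];
      console.log(arr[99]); // undefined

      // The C equivalent (buf[10] = 0xff on a char buf[4]) is undefined
      // behavior and the raw material of countless real exploits.
      ```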

  • I learned programming in the early 70s with punchcards and mainframes. Everything I have learned since then has been self-taught. It has not been easy. Attempting to learn by reading the documentation is maddening and frustrating. Much documentation is either poorly written or written to serve as a memory aid for someone who already knows the subject and just needs to refresh their memory.

    Training is fine, but it is often promoted by those who sell training. We need better documentation for those who are self-ta...

  • Gaps? (Score:2, Funny)

    by quonset ( 4839537 )

    You mean with all the varieties of programming languages [imgur.com], there are gaps in security? I'm shocked!

  • by gweihir ( 88907 ) on Saturday July 20, 2024 @05:50PM (#64641406)

    You can still study CS without ever having had any secure software engineering courses, and in many places what is on offer is not good. In addition, "management" is clueless and greed drives everything. If you have to compete economically with competitors that do not care, and even have "market leaders" that do not care at all (MicroShit, I am looking at you), the whole discipline just produces crap.

  • by bob_jenkins ( 144606 ) on Saturday July 20, 2024 @06:17PM (#64641486) Homepage Journal

    ... says no.

  • Gaps you could drive the Titanic through. Up-and-comings can only do better if they know better, but all the CS courses teach actively-insecure software development from the very beginning.

    We'll never be rid of passwords because every CS course teaches people to create a Users database table with plaintext UserName and Password columns.

    We'll never be rid of SQL Injection because every CS course teaches people to concatenate their SQL command strings together with tainted user data instead of using strongly...
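    A minimal sketch of the two fixes this thread points toward: store a salted hash instead of the password, and bind user input as parameters instead of concatenating it. Node's built-in crypto and the better-sqlite3 package are assumed here purely for illustration.

    ```typescript
    import { scryptSync, randomBytes, timingSafeEqual } from "node:crypto";
    import Database from "better-sqlite3";

    const db = new Database(":memory:");
    db.exec("CREATE TABLE users (name TEXT PRIMARY KEY, salt TEXT, hash TEXT)");

    function register(name: string, password: string): void {
      const salt = randomBytes(16).toString("hex");
      const hash = scryptSync(password, salt, 32).toString("hex");
      // Parameterized: the driver binds `name` as data, never as SQL.
      db.prepare("INSERT INTO users (name, salt, hash) VALUES (?, ?, ?)")
        .run(name, salt, hash);
    }

    function login(name: string, password: string): boolean {
      const row = db.prepare("SELECT salt, hash FROM users WHERE name = ?")
        .get(name) as { salt: string; hash: string } | undefined;
      if (!row) return false;
      const candidate = scryptSync(password, row.salt, 32);
      return timingSafeEqual(candidate, Buffer.from(row.hash, "hex"));
    }

    register("alice", "hunter2");
    console.log(login("alice", "hunter2"));     // true
    console.log(login("alice", "' OR '1'='1")); // false -- data, not SQL
    ```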

  • When I interview programmer candidates, I always ask them to describe SQL injection and how to prevent such an attack. Less than half are able to correctly explain how the attack works, even though it is as simple as typing a single apostrophe in a data entry field. And most can't explain that the solution is to use parameterized queries. Many will say things like "You need to sanitize the input" or some such.

    If programmers aren't even aware of the details of such an elementary type of attack, how on earth can we hope they will understand the dangers of buffer overruns or cross-site scripting?
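    For reference, a hedged sketch of the expected interview answer (the table and inputs are invented): one apostrophe closes the string literal, and everything after it is parsed as SQL.

    ```typescript
    // Vulnerable: the query is assembled by concatenation, so input is
    // parsed as SQL rather than treated as data.
    function vulnerableQuery(userInput: string): string {
      return "SELECT * FROM accounts WHERE name = '" + userInput + "'";
    }

    console.log(vulnerableQuery("alice"));
    // SELECT * FROM accounts WHERE name = 'alice'

    console.log(vulnerableQuery("' OR '1'='1"));
    // SELECT * FROM accounts WHERE name = '' OR '1'='1'
    // -> the WHERE clause is now always true: every row comes back.

    // "Sanitizing the input" chases an open-ended list of escapes; a
    // parameterized query closes the hole structurally, e.g.:
    //   db.prepare("SELECT * FROM accounts WHERE name = ?").get(userInput)
    ```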

  • Basically, the issue is organizations not providing the time, tools, and resources to harden systems. It's easier to just punt the problem onto the OS or security software. It's not a matter of training developers to do better: most developers know about the obvious blunders, and there are tools that will find them easily.

    Many software vulnerabilities come from complex interactions that are really hard to find without a lot of expert knowledge, dedicated resources and time.

    Basically, the amount of code out there...

    • this is just the nature of the beast.

      No. Those days are gone. CrowdStrike needs to give a detailed analysis of what created this mistake.

      The fact is (and I have been chastised for saying it) that there needs to be a move away from less secure programming languages (like C++) and over to Rust or Go. It's not like 15 years ago, when there was no choice. Today there's no excuse.

      I'm pretty sure the lawsuits could be a motivating factor in this.

      • less secure programming languages

        Trusting Trust dictates that there cannot be a secure programming language. Trusting the damn tool is just as bad as the CEOs shoving all responsibility onto the OS and security "experts." It's literally the same act of "I don't want to take responsibility for this, while massively profiting off of it," just punted down another rung on the ladder.

        If you're going to do that, you may as well dictate that no command may be executed nor software written without the king's explicit permission, and have some clippe...
