
1 in 3 Developers Fear AI Will Replace Them (computerworld.com) 337

dcblogs writes: Evans Data Corp., in a survey of 550 software developers, asked them about the most worrisome thing in their careers. A plurality, 29%, chose this answer: "I and my development efforts are replaced by artificial intelligence." Surprisingly, this concern about A.I. topped the second-most-cited worry, that the platform the developer is working on will become obsolete (23%) or won't catch on (14%). Concern about A.I. replacing software developers has academic support. A study by Oxford University, The Future of Employment, warned that the work of software engineers may soon become computerized. Machine learning advances allow design choices that can be optimized by algorithms. According to Janel Garvin, CEO of Evans Data, the thought of obsolescence due to A.I. "was also more threatening than becoming old without a pension, being stifled at work by bad management, or by seeing their skills and tools become irrelevant."

  • really? (Score:5, Insightful)

    by Anonymous Coward on Tuesday March 08, 2016 @08:35PM (#51663207)

    If you are worried about AI replacing you, you must be doing something very routine, not requiring anything new or creative.

    • Re:really? (Score:4, Insightful)

      by phantomfive ( 622387 ) on Tuesday March 08, 2016 @08:41PM (#51663239) Journal

      If you are worried about AI replacing you, you must be doing something very routine, not requiring anything new or creative.

      That very often describes web programmers. A designer designs a website in Photoshop, then hands off the elements to the programmer to be implemented in CSS/HTML. There is no reason that couldn't happen automatically.

      Adding API calls and dynamic elements makes it somewhat harder, but still....

      • Have you ever seen the layers of a website design? 90% of the job is finding out how to decode the graphic designer's mess to be able to output different off/on/hovering states while at the same time making sprites, etc. It's far from a one-step job.

        • Re:really? (Score:5, Funny)

          by phantomfive ( 622387 ) on Tuesday March 08, 2016 @09:06PM (#51663373) Journal
          Hmmmm, yeah. That's tough, making hover states. Now that you mention it, that complexity is so high, AI will never figure it out. Web front-end developers' jobs are safe. No need to learn another language.
        • Seriously? FrontPage was doing the hover-state code for you back in FrontPage 97.
        • I'd be happier with an AI that didn't do all that shit. Too many webpages have spent far more effort on their style than their substance. Give me something small profile that's quick to load and easy on the bandwidth. Based on what I see, 90% of the job seems to be figuring out how to cram even more shitty ads onto an already overcrowded space.
          • They make those complicated web sites with too many scripts as a way to prove that their jobs are important. Job security rather than providing what the customer needs.

      • by mysidia ( 191772 )

        There is no reason that couldn't happen automatically.

        Hasn't that literally been done thousands of times? Remember Pagecloud [youtube.com]?

        Static HTML generators.... I imagine eventually any mass-market hosting provider left will have one.
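
        A minimal sketch of what such a generator boils down to (Python for illustration; all names below are made up, not any real product's API):

            # Toy template-driven static page generator -- the kind of
            # automation described above; everything here is hypothetical.
            from string import Template

            PAGE = Template("""<!DOCTYPE html>
            <html>
            <head><title>$title</title><style>$css</style></head>
            <body><h1>$title</h1>$body</body>
            </html>""")

            def render(title, body, css="body { font-family: sans-serif; }"):
                # Fill the template; nobody hand-writes the HTML.
                return PAGE.substitute(title=title, body=body, css=css)

            with open("index.html", "w") as f:
                f.write(render("Hello", "<p>Generated, not hand-coded.</p>"))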

        • Most of them have CMS systems that a non-programmer can customize.
        • Re:really? (Score:4, Insightful)

          by phantomfive ( 622387 ) on Tuesday March 08, 2016 @09:31PM (#51663543) Journal
          Yeah, and Dreamweaver is still a thing, but the WYSIWYG isn't that great... but yeah, front-end developers could probably go away now if we really wanted them to. Put our money into a design team instead.
          • Yeah, and Dreamweaver is still a thing, but the WYSIWYG isn't that great.

            I started my technical career in the late 1990's by debugging the HTML output from Dreamweaver. Whenever the designers tried to implement a complicated table (a new feature back then), I had to wade through all the extraneous text to fix the problem that caused the table to go visually FUBAR. I got no respect because I was the QA intern.

            • I wish I could go back in time and tell your 1990s self that things would get better in the web dev world. But they haven't, it still sucks, and you still get no respect.
              • But they haven't, it still sucks, and you still get no respect.

                Things have gotten better but I never stayed in web dev after my six-month internship. I went on to do video game testing, help desk and desktop support, PC refresh projects, building out a data center, and computer security over the last 20 years. I don't have to worry about an AI replacing me since I'll probably be doing something else that doesn't require an AI yet.

                • Things have gotten better but I never stayed in web dev after my six-month internship.

                  Nope. Check out CSS Flexbox. We're still laying things out with tables, but now we have more layers of cruft in between.

      • Re:really? (Score:4, Insightful)

        by CanadianMacFan ( 1900244 ) on Tuesday March 08, 2016 @10:28PM (#51663799)

        And that explains why most sites are crap. You get someone who is experienced in making things look good to "design" the site when you need someone who knows about information architecture to design the site. Then you bring in the person with Photoshop to make it look pretty.

        I've read the previous edition of Information Architecture For the Web and Beyond (this is the 4th) and it's a great book. http://shop.oreilly.com/produc... [oreilly.com] I really wish more designers would read it because making a site is more than just putting a menu up top and some common options like Contact Us down in the footer with the content in the middle.

        • lol I love that the first chapter there is called "Hello iTunes."

          tbh I think websites have gotten dumber in the last five years, so we're kind of regressing.
    • by Taco Cowboy ( 5327 ) on Tuesday March 08, 2016 @08:54PM (#51663303) Journal

      Comparing the crop of programmers from the 1960's, 1970's, and 1980's to programmers nowadays, and the type of code they have produced, more of the current-day programmers should be worrying about being supplanted by AI.

      Back then (1960's to 1980's) most of those who were doing programming tried all kinds of ways to sharpen their coding skills, and their efforts were not wasted.

      Despite not having all the tools / toys that the current crop of programmers get, programmers of yore produced code that was far better than what we have right now.

      The chief problem with the current crop of programmers is that they treat programming as a way to earn a living, while programmers of yore treated what they did as their passion.

      Without the 'passion' factor, the code produced today is not much different from what AI can produce - and in fact, in some cases AI is producing better code than its human counterparts.

      • Re: (Score:2, Interesting)

        by Anonymous Coward

        I can see that. Look at the quality and type of code made in the past few years, compared to what was done with far more limited tools before that. You don't really see capable tools being made these days on the GPL front, as the allure of making it big with yet another clone of an app on a store seems to pull the devs there.

        If you compare Katello, Foreman, Docker, and OpenStack to tools made before that, the code quality is just laughable. OpenStack has had years and lots of money thrown at it, and yet

      • Today's programmers should be worried about being replaced by the 20-somethings, just like when they were 20-something, they did the same to the 40-year-old "codgers."

        If you're over 30, you're far more likely to be replaced in the next 5-10 years by some wet-behind-the-ears punk than by a robot. And if you're in your 40s and still coding, the market says you're well past your "best before" date.

        • Re: (Score:2, Informative)

          by Anonymous Coward

          As you get older, you have to specialize and eke out your own niche. Otherwise, you are competing with the 20-somethings on their turf and they will win every time because their tats and ironic beards are cooler than yours. You can still make it in the 40s and coding, mainly because you see all the tomfoolery other people have done, have learned how to write code properly when they actually taught proper code design in college (versus coding in whatever language was in fashion), and can fix other people's

          • by Applehu Akbar ( 2968043 ) on Tuesday March 08, 2016 @10:34PM (#51663823)

            "You can still make it in the 40s and coding"

            I have known occasional devs who have done this, using subterfuges like surreptitiously moving open-office partitions so that nobody else sees them directly, and getting missed in layoffs. They have confederates, generally the late-twenties types who are already running scared, who bring them water bottles and vending machine food and carry away 'honey buckets'. By night, a paper-towel sponge bath in the restroom with the broken security cam and they're good.

            I knew one C# developer who held out until age 44, when he revealed himself with an inopportune sneeze during a VIP tour of the office. I remember the HR goons hauling him off, white beard trailing on the floor, babbling something about 'Fortran' and 'core dumps.' He was able to snag an interview in Computerworld, which was still printed on paper back then, titled something to the effect of "World's Oldest Programmer."

        • by sycodon ( 149926 ) on Wednesday March 09, 2016 @12:53AM (#51664215)

          They should worry about being replaced by Deepak on an H-1B visa.

        • What I've found is not that I get replaced by 20-somethings, but that I end up with 20-somethings as my boss or director (no twenty-something VP yet, but I'm sure I'll have one of those before I am put into archive storage).

        • If you're over 30, you're far more likely to be replaced in the next 5-10 years by some wet-behind-the-ears punk than by a robot. And if you're in your 40s and still coding, the market says you're well past your "best before" date.

          Bah.

          I'm nearly 50, and if anything my marketability is growing faster than at any time in my career.

      • Yeah, I'm an '80s programmer; I've spent nearly three decades trying to automate myself out of a job. The definition of "programming" is changing: the coders of the future will have to get used to training and maintaining AI assistants. Professionals from all walks of life who haven't studied statistics will be at a distinct disadvantage when working with the new tools.
    • Indeed. But more importantly, if the AI wants to do my work for me, I'm going to let it; I'll simply go on vacation.

    • Re:really? (Score:4, Interesting)

      by e**(i pi)-1 ( 462311 ) on Tuesday March 08, 2016 @09:23PM (#51663495) Homepage Journal
      This used to be the case. Ads from IBM illustrate that it's already reality - investing, harvesting and research data, making medical analyses, finding the best defense strategy for a trial: it's all AI-dominated already. In education, partly automatically written textbooks are already a reality. The push for machine grading and online learning is driven mostly by the desire to reduce labor, and so workers. The promise that this will allow us to do more interesting things is constantly fading. This now means developers, doctors, lawyers, teachers. The days when only robots and self-driving cars were a threat to the workforce are long over. Even research will be affected. The challenge is already so urgent that industry leaders at the World Economic Forum 2016 were discussing it. It will be an important problem to tackle: what to do once we have programmed ourselves out of work. Developers are smart; they cannot be persuaded so easily by propaganda. They can read the writing on the wall, because they write it!
      • by Kjella ( 173770 )

        Ads from IBM illustrate that it's already reality - investing, harvesting and research data, making medical analyses, finding the best defense strategy for a trial: it's all AI-dominated already.

        Seriously, a company that bets big on AI says it's everywhere, and you take their marketing department as a credible source? Don't get me wrong, but AI is barely scratching the surface of being a tool the way machinery was for the production industry; if you think AI is going to replace doctors, lawyers and generals any time soon, you're wildly delusional. I'm not so sure about teachers, though, since they keep repeating the same curriculum over and over and are more of a "processing" industry of sorts than a cre

        • Don't get me wrong, I'm sure it could eventually be possible to build an AI capable of automating my job. But I think 90%+ of the human population would be out of a job first.

          The problem is that it doesn't have to replace you 100% in order to decimate the job market. Smart tools that automate 2/3rds of what you do would mean mass layoffs across the industry.

    • in other news...

      1 in 3 AIs fear being replaced by Humans.
    • Not everyone is a creative genius. Most of us aren't. Should we round those folks up and off 'em? Or do we just let them starve to death? Not sure where you live, but in the United States your entire quality of life depends on a) who your parents are and then b) your job. You got the balls for the kind of death squads needed to keep them in check when you take any hope of a livelihood away from them?

      Christ, the stuff that gets modded up on /. these days...
      • by gweihir ( 88907 ) on Tuesday March 08, 2016 @10:32PM (#51663811)

        Indeed. And here we reach the limits of Capitalism's ability to distribute wealth to the population. At the same time, Capitalism depends critically on people being able to buy things, hence wealth must continue to be distributed to the population or things collapse. It is no accident that an unconditional basic income for everybody is seriously being discussed now in some countries, and it is not idealists with their heads in the clouds driving this discussion. (They are routinely trying to hijack it, though, which somewhat obscures that this is about a critically important problem.)

        The main problem today seems to be one of irrational envy: for example, most (> 80%) people in Switzerland say they would continue working with an unconditional basic income, even if it allows them to live reasonably well already. The problem is that most people think that not so many of their fellow citizens would do so. Still, long-term, there really is no alternative to it if we want to keep civilization going.

    • No shit. Now I know what question I'm asking the next time somebody needs to be let go.
    • And working on a platform where efficiency or performance don't really matter much.

    • AI is about the same as model-based programming, from what I've seen. Programs like LabVIEW, and GameMaker's drag-and-drop programming, seem to make programming easier, but from what I've seen as a programmer and teacher, the hard part of programming isn't the language. The hard part of programming is knowing how to fully describe a process to do something with a program. In other words, mathematical logic.

      I would see AI programming as being very similar to model-based programming. It can figure out how to t
  • by lzcd ( 124504 ) on Tuesday March 08, 2016 @08:37PM (#51663217)

    I try desperately each and every day to make myself redundant through writing better software... but, alas, it has yet to happen.

  • by Meneth ( 872868 ) on Tuesday March 08, 2016 @08:38PM (#51663221)
    I think that once AI is advanced and friendly enough to replace me, it will be advanced enough that there will no longer be any need to do my current job. :)
    • You should stop calling him Al. Albert is much more respectful.

  • Al Bundy? (Score:4, Funny)

    by turkeydance ( 1266624 ) on Tuesday March 08, 2016 @08:40PM (#51663231)
    Al Jazeera? Weird Al Yankovic?
  • by Anonymous Coward on Tuesday March 08, 2016 @08:43PM (#51663243)

    Colour me sceptical. We haven't even successfully entered the technology level where systems are developed using *solely* high-level modeling languages (UML, state diagrams, Simulink, Modelica, etc.) and produce production code for the whole system (not just parts that are then glued together by humans with special code), and now you want to replace everything with AI (whatever that means)? Even for established code, show me a fully functioning tool for suggesting automated bug fixes when the program crashes or has a race condition.
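
    For what it's worth, the piece of model-driven development that does work today is the boring piece: turning a declarative model, such as a state diagram, into executable code. A toy sketch of that piece (Python standing in for a real UML toolchain; the model and names are invented):

        # A state diagram expressed as data: (state, event) -> next state.
        TRANSITIONS = {
            ("idle", "start"): "running",
            ("running", "pause"): "paused",
            ("paused", "start"): "running",
            ("running", "stop"): "idle",
        }

        def step(state, event):
            # One transition of the "generated" machine; unknown events are ignored.
            return TRANSITIONS.get((state, event), state)

        state = "idle"
        for event in ("start", "pause", "start", "stop"):
            state = step(state, event)
            print(event, "->", state)

    Gluing such generated parts into a whole system is exactly the step that still needs humans, which is the commenter's point.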

    • by mlts ( 1038732 )

      I am skeptical as well. For something common, asking an AI to code a word processor wouldn't be difficult. However, that isn't something that would be useful or bring in cash. What would be useful are things pushing the edge, which an AI may not be able to think about.

      For example, a deduplicating program similar to obnam, bup, attic, borgbackup, or zbackup that instead of storing its repository as tons of tiny files, stores the deduplicated stuff as either a large single file, or a number of mediu

    • by allcoolnameswheretak ( 1102727 ) on Wednesday March 09, 2016 @04:29AM (#51664633)

      AI taking over my job as a Software Engineer is the -last- thing I'm worried about. The developers who are afraid of such a thing must have no idea about AI.
      Developing complex programs is the -last- thing an AI will be able to do. They will be able to have conversations, walk, drive, bring your kids to school and pretty much do everything else before being able to write a typical, high-complexity software program.

      If that point is ever reached it means we have reached the "singularity" wherein an AI is able to program a better version of itself, exponentially increasing its own intelligence.

  • C, C++, and Java compilers also have replaced many developers! Imagine how many more developers there would be if everybody programmed everything in assembly language! And don't get me started on text editors, IDEs, garbage collection and debuggers, pure job killing machines! The computer industry has been devastated and there are hardly any programmers left anymore because of all that automation and AI!
    • Yet you want to give them assemblers, likely full featured macro assemblers.

      Just give them a hex editor and a hard copy of the CPU instruction set and tech manual.

      I know, luxury, let them copy con: program.exe and use alt-keypad to enter op codes and data, like a Klingon coder.

      • You probably want them to have a keyboard. All they really need is a keypad with two keys: 0 and 1. If they were really good programmers they could just use a switch like a telegraph operator to input code based on timing. Press down for a 1 and nothing for a 0. The better the programmer the faster they set the timing.

        • You probably want them to have a keyboard. All they really need is a keypad with two keys: 0 and 1. If they were really good programmers they could just use a switch like a telegraph operator to input code based on timing. Press down for a 1 and nothing for a 0. The better the programmer the faster they set the timing.

          Pfffft. You kids and your "programming languages".

          Old-school guys like me used to take a magnetized needle and just tap the spots on the floppy disk where we wanted the ones and zeros. One time we ran out of needles and had to write a program using only zeroes.

  • by fuzzyfuzzyfungus ( 1223518 ) on Tuesday March 08, 2016 @08:48PM (#51663273) Journal
    Isn't "Becoming old without a pension" less of a 'fear' and more of a 'guarantee'? With the exception of a few labor unions that have really dug in and not quite been extirpated yet, we are basically all playing the tables with our 401ks(if that). 'Pensions' are what the old people who accuse you of being an entitled, lazy, little shit have.

    In other pedantry, isn't 'seeing your skills and tools become irrelevant' an apt description of what would happen if an AI started doing your job?
    • Social Security is a pension... and one many people think won't be there in 30 years. So there's that....

  • But arguably that could cause salaries to rise as well: as programmers become more productive, more software will be demanded.

    Let's imagine that programmers become twice as productive. The simplistic way of looking at things is that half the programmers will have to lose their jobs. But imagine you had a programming project that would be worth $750K to you if it were done, but would cost you a cool million to finish. That project is currently creating zero programming jobs. But in our programmers-are-twice-a
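
    (The arithmetic the comment is cut off in the middle of, reconstructed: at double productivity that million-dollar project now costs $500K, which is less than the $750K it is worth, so it gets built - and creates programming jobs that did not exist before.)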

    • That line of reasoning is full of crap. It doesn't take 100,000 times more programmers to produce a program used by 100,000,000 people than it does for 1,000. Most software is already "good enough." Much of it has been tinkered to death (Firefox is a good example).

      How many more word processors, spreadsheets, operating systems, social media platforms, and web browsers do we need anyway? People tend to use what other people are using. Cost is secondary (otherwise people would be running free platforms exclusive

      • by hey! ( 33014 )

        Really, your response is so full of straw men that it's hard to respond to, so I'll limit myself to just this point: you seem to be under the impression that we have all the kinds of software we'll ever need already; in which case you're right: it doesn't matter if our jobs are taken by AIs; there's no more work.

        I guess time will tell which of us is right, but I think the idea that all the kinds of software we need have already been invented could only be true if we've already imagined all the ways there are to pro

  • by Spy Handler ( 822350 ) on Tuesday March 08, 2016 @08:55PM (#51663307) Homepage Journal

    and the other 2 who have actual experience with AI and know how shitty it still is, laugh at him

    • [A]nd the other 2 who have actual experience with AI and know how shitty it still is, laugh at him[.]

      Bingo. Artificial intelligence of the modern age is an absurd oxymoron. Give it another couple hundred years or so, and it *might* be able to design and write programs as well as an 8-year-old child.

      The people afraid of A.I. usurping their programming jobs must be absolutely wretched at their jobs.

  • I'd say 29% is closer to 1 in 4 than 1 in 3.
    • by epine ( 68316 )

      I'd say 29% is closer to 1 in 4 than 1 in 3.

      1/sqrt(3*4) = 28.9%, which puts 29% a titch closer to 1/3 by the geometric mean.

      Measure twice. Cut once.
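
      A quick check of both claims (a sketch in Python):

          from math import sqrt

          p, third, quarter = 0.29, 1/3, 1/4
          print(abs(p - quarter), abs(p - third))  # 0.04 vs ~0.0433: nearer 1/4 on a plain number line
          print(sqrt(third * quarter))             # ~0.2887: 29% sits just on the 1/3 side of the geometric mean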

  • It seems to me being outsourced/offshored would be a much bigger and more immediate worry. I've already lost my job to visa workers before.

    It's a 100% certainty that it already happened from my perspective, while AI replacing devs is pie-in-the-sky Jetson stuff.

    Existing AI is pretty good at making savants, but lacks common sense, office politics skills, and the ability to deal with unexpected situations.

    It's like fearing meteors more than climate change.

  • by mdsolar ( 1045926 ) on Tuesday March 08, 2016 @09:01PM (#51663347) Homepage Journal
    Welcome their new robotic overlords.
  • by ErichTheRed ( 39327 ) on Tuesday March 08, 2016 @09:08PM (#51663399)

    One of the things that has always kept me away from development and more on the systems side has been the overwhelming evidence that the job category is shrinking. Some aspects of development, such as developing in the Web Framework of the Moment, are heavily abstracted from the actual operations performed, and are mostly gluing together libraries and API calls. It's amazing how little many developers have to do to get something to work. Phone apps are another example -- huge SDKs do almost everything for the developer; they just have to signal intent.

    The thing that's complex, and that requires talent, is writing all of those frameworks, libraries, APIs and abstractions. Knowing how the full stack of a system works and what is actually happening is a very useful skill. This is why embedded developers are generally low-level guys -- those libraries and other niceties don't fit into the tiny CPU and RAM constraints on many devices.

    Then again, who knows -- cloud is killing a lot of the expert-level systems jobs as well. I've been very careful to stay a generalist, but lots of my colleagues spent enormous amounts of effort learning things like Cisco networking, various VM hypervisors, and SAN storage inside and out, front and back, and the cloud is slowly eating away at all of that. The days of being an EMC genius, or an Exchange guru, and making massive amounts of money are unfortunately numbered -- we're experiencing salary reductions due to commoditization, similar to what developers are facing because of H-1Bs and other factors.

  • by ebonum ( 830686 ) on Tuesday March 08, 2016 @09:21PM (#51663481)

    Many of us who get paid well, get paid well because we can take vague, poorly written specs, figure out the real world business requirements and fill in all the missing parts. Somehow I don't see AI figuring out what a human means in a particular business context any time soon. BTW, if you do write perfect specs, you've essentially written the program. The hard work (the valuable work) is done. Picking good design patterns and coding it up is easy.

    I hate the term AI. There is no intelligence in it. "AI" programs are still computer programs that execute the series of steps they were told to execute. In certain cases they seem smart because they have been trained on a huge set of scenarios (you are quickly programming the program with the massive data set and associated "answers" instead of hand-coding X million cases). These "intelligent" programs still fall victim to "garbage in, garbage out" just like any dumb computer program.
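
    The "you are programming the program with the data set" point in miniature (a hypothetical toy, not any real system): change the labels and the same code gives the opposite answer.

        def nearest_neighbor(train, x):
            # 1-NN: answer with the label of the closest training example.
            return min(train, key=lambda pair: abs(pair[0] - x))[1]

        good = [(1.0, "small"), (2.0, "small"), (9.0, "large")]
        print(nearest_neighbor(good, 8.5))     # "large" -- looks smart

        garbage = [(1.0, "large"), (2.0, "large"), (9.0, "small")]  # mislabeled
        print(nearest_neighbor(garbage, 8.5))  # "small" -- garbage in, garbage out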

    • Many of us who get paid well, get paid well because we can take vague, poorly written specs, figure out the real world business requirements and fill in all the missing parts

      The thing you're missing is that, if someone automates the specs-to-code part, the end users can do the "figure out the real world business requirements" all by themselves, and only pay for the software platform, cutting out the middleman.

      It worked for processes that can be represented as Excel spreadsheets; you only need to bu

  • AI isn't advanced enough for this to even begin to be a worry. Even after someone successfully develops the world's first viable AI it will be so astronomically expensive there'll only be one or two in the world.

    Smart people are over-reacting, and the media is loving it. That's all you're seeing here.

  • by dpbsmith ( 263124 ) on Tuesday March 08, 2016 @09:36PM (#51663561) Homepage

    Wasn't there a program for the Apple ][ that was, IIRC, advertised as "All the programs you will ever need, for just $595"? I believe it was an interview-driven database-query generator or something like that. Wikipedia points me to [wikipedia.org] this review in Byte [google.com]. In reality most reviews of the program were lukewarm.

  • AI is no more real than it was 40 years ago. Most likely there will never be AI. Technological progress is slowing. Even raw processor speed isn't increasing as fast as it was.
    • by Jeremi ( 14640 )

      I'm going to send my car over to your house to pick you up and drive you here, so that we can discuss the "No true Scotsman" fallacy over coffee. :)

    • Processing speed is _massively_ increasing. _Transistors per square inch_ have slowed down. Try comparing a P4 to an i5 at the same clock speed sometime. And I haven't even mentioned quantum computers yet, or the crazy shit you can do with stream processors in modern graphics cards. 1-dan+ Go players are losing to computers as we speak.

      AI doesn't mean the 3 laws of robotics. It means incredibly complex if statements that are programmatically built (so-called "machine learning"). It's not going to replace
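
      The "programmatically built if statements" idea in miniature (a toy sketch, invented for illustration): a one-feature decision stump that learns its threshold from data instead of having a programmer hard-code it.

          def fit_stump(xs, ys):
              # Try each value as a threshold; keep the one that classifies best.
              best_t, best_correct = None, -1
              for t in xs:
                  correct = sum((x > t) == y for x, y in zip(xs, ys))
                  if correct > best_correct:
                      best_t, best_correct = t, correct
              return best_t

          xs = [1, 2, 3, 10, 11, 12]
          ys = [False, False, False, True, True, True]
          t = fit_stump(xs, ys)
          print("learned rule: if x >", t, "predict True")  # the built if-statement
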
      • No, processing speed is not increasing as fast as it was. And AI is not "if statements". And quantum computers are not what you think they are.
      • Early on in the internet era, I was (sadly) consumed with the idea that since HTML and VBScript (sorry) were both just text, which could create other text, it ought to be possible to create a program which would create other programs, then test and apply the results. I was the AI researcher you picture, except with few resources, not much ambition, and no pants.

        The real problem I faced with designing such a system was motivation. Not motivation for me, motivation for the resulting project. I was content to consi

      • Re:Wtf? (Score:5, Interesting)

        by Hadlock ( 143607 ) on Wednesday March 09, 2016 @04:56AM (#51664691) Homepage Journal

        2011 Sandy Bridge and 2015 Skylake are within 10% performance-wise. That's what, 2.5% per year? Would you still stand by your "_massively_ increasing" statement? Intel realized that CPUs were fast enough. Nobody is maxing out their CPU running day-to-day OS tasks anymore. They mostly sit idle, underclocked to save power and heat, only spinning up to full "turbo" power for brief spikes when loading a web page or a new program. Intel has famously been using these die shrinks not to improve computing power (what would consumers use it for??) but to improve thermal performance and, more importantly, battery life, as they fight for their lives in the mobile devices space.

        You have no idea what you're talking about.

  • According to Janel Garvin, CEO of Evans Data, the thought of obsolescence due to A.I. "was also more threatening than becoming old without a pension, being stifled at work by bad management, or by seeing their skills and tools become irrelevant."

    That's because everyone already is confronting becoming old without a pension, being stifled at work by bad management, seeing their skills and tools become irrelevant, being off-shored/out-sourced, being seen as too old when they have decades to go before becoming a senior, mass layoffs, mergers that entail "synergies" that really mean RIFs, economic crashes, jobless recoveries, divorce, kids, crime and terr'rism, racism, the collapsing safety net, being bankrupted by health problems despite having insuran

  • Stats show that 1 in 3 developers are out of touch with reality. Let's be honest; by the time AI gets properly implemented, developers will inevitably move to favor the latest framework/development tools, having to re-code/re-tool the AI engine they built just to keep life as a developer unnecessarily complicated. The AI engine will predict this, and instead of moving to a new technology as if it were a fashion show, it will make the right decision in committing virtual suicide - as it should, proving it's
  • The way I figure it is, the more able you are to believe an AI will replace a programmer, the more likely it is that you are going to be the one replaced first.

  • 1 out of 3 'developers' aren't developers. GG.
  • We can't even define intelligence, let alone create an artificial one. AI will go down as the biggest academic scam of the century. Both centuries, actually.
  • by SpaghettiPattern ( 609814 ) on Wednesday March 09, 2016 @02:58AM (#51664473)

    tl;dr: Creative developers are not likely to be replaced by AI.

    The terms are blurred. Most people considering themselves developers actually are application programmers. Quite a few exceptional people in CS call themselves or are being classified as programmers. Apparently the almost meek title "programmer" covers more of what those people do than something like "developer".

    But in the world of us mortals the title "programmer" is not taken seriously. We have to resort to titles like "application programmer", "web designer", "senior developer", "solution architect", "enterprise architect" and so on. But let's be brutally honest: most of us will never make it into Wikipedia's list of programmers [wikipedia.org].

    At any rate, a developer can take an idea, a hunch or a vague concept and create a computing world around it. It requires huge amounts of insight and experience to come up with something simple that solves many business problems elegantly and which is accepted as a business proposition. As yet, I don't see such creative processes being replaced by AI. A machine that wins at chess or at Go does so by recognizing patterns in a limited domain or by brute force, not by being particularly intelligent at identifying a problem in need of a solution. The contexts of Go, chess and even navigation through traffic are huge but still extremely confined.

    However, if your work consists of taking requirements and producing code, then expect to be surprised.

  • by peter303 ( 12292 ) on Wednesday March 09, 2016 @11:38AM (#51666321)
    Compilers resemble expert systems. They helped early programmers become ten times more productive than with machine/assembly language programming. You could argue, to the contrary, that AIs opened up software to more developers and more types of software products. Compilers and new computer languages continue to take on new tasks like parallelization and dynamic memory management.
