The Almighty Buck IT Technology

2016 Has Been an Ugly Year For Tech Layoffs, and It's Going To Get Worse, Says Analyst (ieee.org) 272

IEEE Spectrum writer Tekla Perry writes: Early this year, analyst Trip Chowdhry from Global Equities Research predicted that the tech world was going to see big layoffs in 2016 -- some 330,000 in all at major tech companies. At the time, these numbers seemed way over the top. Then IBM started slashing jobs in March -- and continued to wield the ax over and over as the year progressed. Yahoo began layoffs of some 15 percent of its employees in February. Intel announced in April that it would lay off 12,000 this year. So, was Chowdhry right? "Yes," he told me when I asked him this week. "The layoffs I predicted have been occurring." And worse, he says, these laid-off workers are never again going to find tech jobs: "They will always remain unemployed," at least in tech, he said. "Their skills will be obsolete." Some of these layoffs are due to a sea change in the industry, as it transforms to the world of mobile and cloud. But some are signs of a bubble about to pop. It's all going to get worse in 2017, he predicts, because that's when the tech bubble will burst. Chowdhry, someone who has never been reluctant to go out on a limb, is predicting that'll happen in March.
This discussion has been archived. No new comments can be posted.

Comments Filter:
  • by lgw ( 121541 ) on Friday October 14, 2016 @02:46PM (#53077203) Journal

    Traditional IT roles are getting automated out of existence - it's all devops and cloud these days. Great time to be working for a cloud provider, though. AWS and Azure are rocking, Oracle is trying desperately (and I hear they have to pay well to get anyone to come), Apple seems to be working on their own thing, rather than using Azure. Google is trying to make that business work for them.

    Heck, lots of normal online businesses are hiring devs to make their stuff work in the cloud, so that's another path.

    • Where it runs is irrelevant. You still need architects and user support. You still need migration consultants. And you still need custom code written...
      • by MightyMartian ( 840721 ) on Friday October 14, 2016 @02:58PM (#53077315) Journal

        It's a strange claim, because up here in British Columbia, particularly in Vancouver and Victoria, there's a huge demand for IT workers, to the point where the industry in BC is starting to get very worried that the lack of tech workers could cause serious problems.

        • by RedCard ( 302122 ) *

          It's a pretty big world, and part of it's always going to be gloomy, and part of it's always going to be booming.

          That's how slashdot's always been...

          • Indeed!
            And when the company is big enough that it spans these GEOs you find things like what happened to me.
            Intel slashed the lion's share of its Chipset Firmware ME/CSME team in the US and moved the tech wholesale to Israel.
            There were many on my team that said it wouldn't happen, couldn't happen; pointing at the crypto sourcecode as a prime example of what couldn't be moved outside the US.
            Of course they neglected to realize that there is no law about re-implementing the same feature/API outside the US and m

            • by ArmoredDragon ( 3450605 ) on Friday October 14, 2016 @04:18PM (#53077867)

              This is why I decided to be a network engineer. You can outsource everything, but if your network is down, what are you going to do?

              Sure, I could probably make more as a developer, but meh, I'm happy with my current salary.

              • As I'm entertaining myself re-reading the BOFH series (up to 2001), you wouldn't happen to be in operations, would you?

                Yeah, I think that I may have done better in networking as well, but when the fork in the road presented itself I chose the money. (started as validation at a networking company, when I moved I could have gone to either team). Though Intel has eviscerated their IT department as well; last I heard complaints about SLA were being met with laughter and there was a deathmarch just to keep the c

                • I still use the BOFH Excuse Generator at least twice a week. Our new hire for helpdesk analyst is bilingual, due to our plant in Mexico. I jumped from ITSM Analyst at HPE to Network Admin at a smaller (but rock-solid) corp and got a 15K per year raise. I've also stopped us from buying ANY HP equipment, because you have no idea when HP will "spin off" that department and your "support" evaporates. 4-hour On-Site Mission Critical Dell Enterprise support only! Plus all their call centers for Enterprise
            • by haruchai ( 17472 )

              "There were many on my team that said it wouldn't happen, couldn't happen; pointing at the crypto sourcecode as a prime example of what couldn't be moved outside the US.
              Of course they neglected to realize that there is no law about re-implementing the same feature/API outside the US and making sure it works"

              That sounds like a wonderful way to introduce a bunch of new bugs. Sure, they'll be fixed, in time, but until then it's a huge liability.

        • by iCEBaLM ( 34905 )

          If it didn't cost so bloody much to live there, maybe people would show up.

        • by PRMan ( 959735 )
          I can't find a city where this isn't true. So this guy has no idea what he's talking about. Most of those laid-off employees probably had a new job in under 2 weeks.
        • Know of any remote jobs located there?

        • by Hizonner ( 38491 )

          "The industry" is always saying that. I've been doing this for over 30 years, and they have never stopped saying that. That doesn't make it true.

          The thing is that it's always to their advantage if more people go to school in those fields, if governments make it easier to immigrate with those skills, etc. It doesn't necessarily mean there aren't enough people qualified to do the work. It may mean that they would prefer a glut of such people so they don't have to pay very much. And it's no skin off a CxO's nose

      • by BigBuckHunter ( 722855 ) on Friday October 14, 2016 @03:03PM (#53077365)

        Where it runs is irrelevant. You still need architects and user support. You still need migration consultants. And you still need custom code written...

        Indeed, and all of those positions rule out a huge percentage of IT workers. If you can't code, can't interface with humans, and aren't an SME, then you're out of the game. IT workers with the initiative to evolve their skills into devops survive. I personally want to thank AWS for making the EC2 API so convoluted that it is too challenging for normal humans to operate.

      • It's not a question of "if you still need them." You do still need them; you just need fewer of them. I can't speak for all the other parts, but I can speak for the coding part. There are just too many things to list that have made making programs way easier today. Heck, even some of our engineers will do up BPMN and we'll just take it and run it through a processor to produce a lot of the skeletons.

      • A tenth as many architects, since it's all in one place, on one platform, and managed using repeatable and low-effort techniques.

        • A tenth as many architects, since it's all in one place, on one platform, and managed using repeatable and low-effort techniques.

          Yes and No.

          That's fine and dandy if all you do is use AWS, or all you do is use Azure, or all you do is...

          Now, if you want to actually have some vendor redundancy (and what CxO with a brain doesn't?), that's where it gets a bit hinky.

          (cue someone shouting "but teh Docker! Docker will save us all!!!111!!")

      • by haruchai ( 17472 )

        "And you still need custom code written"
        Those laid-off workers had best start learning COBOL

    • by mu51c10rd ( 187182 ) on Friday October 14, 2016 @03:01PM (#53077343)

      Traditional IT roles are not being automated out of existence. For one, any company providing a "cloud" needs server guys, network guys, hardware techs, and so on. Two, many small/midsize businesses don't have the cash flow to support what cloud providers are charging and prefer to keep infrastructure as a capital expense. Over the decades I've noticed that IT changes, but never goes away.

      • by roman_mir ( 125474 ) on Friday October 14, 2016 @03:32PM (#53077541) Homepage Journal

        It's amazing that this story is posted so close to the HP firing story because the same comments [slashdot.org] apply [slashdot.org].

        People may say that there is no need for anything new, that the old stuff works great and doesn't need to be replaced, and such nonsense. It all comes from a major misunderstanding of the economics. People don't necessarily buy new shoes or suits or mattresses, etc. because the old ones can no longer be used. People do it because they follow trends, because other people do it, but most importantly because they have the ability to do it; they have the means, the money, to do it.

        The real reason for all of these firings and shut downs is the stagflationary environment - high inflation and high unemployment, low savings and low rate of business formation, inability to put together the capital needed to start new businesses, to evolve existing businesses, to get the people to spend where they are already spending whatever they are making on food, energy and shelter. The growing expense on these items (food, energy and shelter) is completely tied to the level of inflation (expansion of the money supply without an underlying expansion of productivity).

        The people are unproductive, they are becoming more unproductive by the hour and so they are unable to earn enough to replace their toys because they are struggling to make ends meet.

        The reason the people are unproductive is lack of savings and thus lack of capital investment that is inherent in the system that destroys savings and prevents capital formation. The money comes into existence not because of business ventures but because the governments guarantee that money will be created and propagated through the system and they do so by suppressing the interest rates and yields, they do so by growing expenses paid with borrowing rather than with production, they do so by destroying the future of the economy by sacrificing the seed corn to pay for out of bounds consumption today.

        You have an economy that is in a massive depression actually and you are not even noticing it because of the artificial money creation that pushes up stock and housing prices. You are being bombarded with nonsensical 'economic' data that crowds out the real economic data that is important: the real negative interest rates, the real negative GDP, the real double digit unemployment numbers, the real lack of productivity due to the government intervention in the economy in every way, from businesses laws and regulations to income and wealth taxation to massive spending on government initiatives done with money that is borrowed from the future, thus crowding out legitimate business borrowing.

        With all the USA job numbers that come out, the only silver lining is that some government jobs also are sometimes cut, a thousand here, another thousand over there. That's the only beam of light in this otherwise bleak situation.

      • by swb ( 14022 ) on Friday October 14, 2016 @03:33PM (#53077545)

        I've worked on projects where cloud-outsourced systems got brought back on premise because cloud pricing was just too expensive. A capital outlay to migrate the entire environment on-premise paid for itself in 2 years.

        I'm currently working with a customer to finish an on-premise server refresh that they wanted to do cloud with (hey, cloud is free, right?) but the pricing on deploying their application in the cloud was double annually what the capital refresh was on premise.

        This specific client has a home-grown application for which there is no commercial replacement; its only real failing is that the architecture is too monolithic, but the economics of a top-down rewrite for cloud-friendly deployment don't work, especially on the timelines that would be needed.

        I don't think this is some weird exception, either, I think it's extremely common. I'm sure there is a whole world of cloud-oriented applications out there, but there's decades worth of on-premise IT that's working well and isn't moving into any kind of cloud environment any time soon, either.

        • Yep. Go research what your business will be charged. I worked at CA and argued strongly against cloud (AWS/Azure), partly for self-interest reasons, partly because it was vastly more expensive. But it was much more scalable. Getting charged roughly $50/month/VM is a massive waste of money when a $5k server can hold 100+ of them for a decade, or a cost of about $0.50/VM/month. Yeah, SAN & network costs are more, but you can see that it's not even on the same scale.

          • by Archangel Michael ( 180766 ) on Friday October 14, 2016 @04:49PM (#53078077) Journal

            People have no idea how much of their business is at risk when it is in the cloud. Cloud works for BIG Corps, not your average small business. The problem is that a Big Corp has the resources to build out its own tailor-made "Cloud" infrastructure, and doesn't need the cost of Azure or whatever. Unless you need the scaling on demand that the Cloud offers, you're likely not going to see a benefit cost-wise.

            • by SQLGuru ( 980662 )

              I would say cloud is best for smaller companies, but as the company grows large enough and can afford to have specialized people, you bring it back on-prem. A small business can't afford to have a team of expensive engineers (network, storage, server, etc.), so you "outsource" that job to the cloud. Then, once the business is established, you can bring on those types. Large companies already have the experts and don't necessarily need the cloud.

            • And they don't seem to split their environments.

              Heavy duty in house.
              Lightweight, scaling in the cloud.

              Because mgmt still doesn't get it. Even in tech companies. Especially in tech companies.

          • by Penguinisto ( 415985 ) on Friday October 14, 2016 @05:05PM (#53078193) Journal

            Getting charged roughly $50/month/VM is a massive waste of money...

            *Only* $50/month? Heh. Welcome to the wonderful world of being charged per GB/hour for storage, charged for every 1,000 PUTs, every 10,000 GETs, every CPU/hr, etc etc etc...

            I saw an AWS rig with four servers - that a previous employer wound up paying $20k/mo for until I cleaned it the hell up for them (and even then got it down to $4k/mo before I started asking them why the hell we don't just drag the VMs into our existing datacenter)...

            • Well, my numbers were at scale (CA to MSFT) so definitely cheaper than any smaller (less than 10,000 employees!) customer.
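The back-of-the-envelope math in this subthread can be sketched in a few lines. All figures ($50/VM/month for cloud, a $5k server hosting 100+ VMs for a decade) are the rough numbers quoted above, not real price quotes:

```python
# Rough cost comparison from the thread: cloud per-VM pricing vs. an
# amortized on-prem server. All figures are illustrative, taken from
# the posts above, not actual vendor pricing.

CLOUD_PER_VM_MONTH = 50.0      # $/VM/month, the thread's cloud figure
SERVER_COST = 5_000.0          # one-time hardware outlay
VMS_PER_SERVER = 100           # consolidation ratio claimed above
LIFETIME_MONTHS = 10 * 12      # "a decade"

onprem_per_vm_month = SERVER_COST / (VMS_PER_SERVER * LIFETIME_MONTHS)
print(f"on-prem: ${onprem_per_vm_month:.2f}/VM/month")   # ~$0.42

# Months until the cloud bill for one server's worth of VMs
# exceeds the hardware outlay:
breakeven = SERVER_COST / (CLOUD_PER_VM_MONTH * VMS_PER_SERVER)
print(f"cloud exceeds hardware cost after {breakeven:.1f} months")  # 1.0
```

As the posters note, this leaves out SAN, network, power, and staffing costs, so it understates the on-prem side, but the order-of-magnitude gap is what the thread is arguing about.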

      • by lgw ( 121541 )

        For one, any company providing a "cloud" need server guys, network guys, hardware techs, and so on.

        Datacenter tech, sure, though there's some additional economies of scale for the big guys.

        For everything else - it's not e.g. "network guys", but "full time software devs with a background in networking". Not everyone can make the change, and the economies of scale there are huge - you don't need many more devs to support 1 million servers than 30k, when you're building automation regardless.

        many small/midsize businesses don't have the cash flow to support what cloud providers are charging and prefer to keep infrastructure as a capital expense.

        Only for capital they've already expended. For small companies, large capital outlays are a real challenge, and a m

        • I don't think "the cloud" really has it right yet for very small companies with just a couple of servers and "the guy who also does IT", but in time it will.

          One of my businesses is not a million miles past that point today, and I think perhaps you're understating how poor cloud offerings are today for that kind of business.

          Every now and then someone suggests we go cloud-hosted instead, and I read around again, trying to work out why so many people seem to love the whole cloud idea. Usually the conclusion is that it would literally add an extra zero to our infrastructure costs and dramatically increase the complexity and number of potential failure points. This

    • Re: (Score:2, Informative)

      by PRMan ( 959735 )

      And worse, he says, these laid-off workers are never again going to find tech jobs

      Ha ha ha ha ha. Is he for real? My company currently has 94 open IT requisitions, up from 74 a month ago. And we can barely find anybody to hire anymore, without overpaying the market like crazy. And my recruiters (always maintain good relationships with them) are telling me that every company in Orange County is saying the same thing. They WANT to hire. There's just nobody TO hire.

      • And we can barely find anybody to hire anymore, without overpaying the market like crazy. ... There's just nobody TO hire.

        You mean there's nobody to hire at the wage you want to pay? Shocking!

      • Are the "must haves" legit? Requiring 10 years of experience with the Go programming language might weed out the honest applicants.

      • by fluffernutter ( 1411889 ) on Friday October 14, 2016 @04:48PM (#53078069)

        without overpaying the market like crazy

        Interesting comment. What does 'overpaying' the market mean? If there are no people, and you need people, you pay what it takes to get them there. You seem to be insinuating that there is some cap on pay below what it takes to get people there, but that signifies an unfair employment market right there.

    • by elrous0 ( 869638 )

      And yet all the tech companies are still telling Congress that they can't find American workers and desperately need more H1B visas. And Obama is telling everyone that they need to take Computer Science courses to take advantage of this massive tech labor shortage that supposedly exists (even though an awful lot of highly skilled programmers who aren't in on this scam are saying they can't find a job these days).

    • The point of "the cloud" *air quotes to show how much better I am than silly plebs who use weird street phrases they don't understand* *more air quotes* *smug look* is that it takes 20 IT workers to manage the infrastructure at 5 businesses, or it takes 5 IT workers to manage the infrastructure for 20 businesses.

      In other words: per unit of useful output, we want fewer IT workers. Running high-availability Web servers, databases, accounting systems, payroll systems, groupware backbone, user management (

  • One man's loss... (Score:5, Interesting)

    by Anonymous Coward on Friday October 14, 2016 @02:48PM (#53077215)

    I wonder how many of these are actually cut positions versus outsourced replacements (H1Bs, etc)?

    • In my case (as I posted above), my entire team was replaced by another team in a country that is cheaper to operate in, with *friendlier* contract laws.
      There are several tech employers that are big enough that they span multiple Geos, so they are using this as a way to shift costs out of the US and away from an increasingly unstable business/political climate here.
      -nb

    • by Anonymous Coward on Friday October 14, 2016 @05:03PM (#53078183)

      I find it interesting that there is a critical shortage of STEM workers coinciding with a tech bubble. If that's true, what's the point of encouraging STEM paths in college? Those jobs won't be there anymore if/when the bubble bursts.

  • by houstonbofh ( 602064 ) on Friday October 14, 2016 @02:48PM (#53077221)
    A lot of the IT spend was because new versions were better. Faster, more features, new shiny! Now people are waxing nostalgic for the older systems. Win7 is still king. And CPUs have been fast enough for a while. The only thing improving is drive speed as SSDs get larger and cheaper, but you can stick that in old kit and keep Win7! And who wants to drop a grand every 2 years on a phone that is not much better than the old one?
    • Desktop support is a very tiny percentage of IT. We're talking racks of servers, multi-gigabit per second links between data centers, centralized configuration management, server monitoring as a service, custom web applications with published APIs for customers to use, individual servers with hundreds of gigabytes of RAM, true virtualization, containers, on-demand spin-up of new VMs, automated testing and deployment of new code, and single web queries across all that which trigger communication across multi

      • I did almost all of that first paragraph by myself, for a dev team of 200+ folks that grew its headcount at about 25%/year.

        If you're good at your job, you outsource others.

    • by iCEBaLM ( 34905 )

      Except the old kit is out of warranty and will break and/or die, then what? You still need some form of IT worker to maintain the gear.

  • by ITRambo ( 1467509 ) on Friday October 14, 2016 @02:49PM (#53077223)
    "They will always remain unemployed", referring to laid-off tech workers, neglects to give most of them enough credit for being able to start their own businesses or to find a position at a smaller firm that needs their skills, probably for less money. But to imply that they'll be obsolete is disingenuous.
    • by __aaclcg7560 ( 824291 ) on Friday October 14, 2016 @02:57PM (#53077307)

      But, to imply that they'll be obsolete is disingenuous.

      Tell that to the recruiters and hiring managers. I was out of work for two years (2009-10), where hiring managers told me I was overqualified for minimum wage jobs and recruiters told me I was unemployable for everything else. When the number of applicants per job opening dropped from seven in 2009 to three in 2011 (full employment is two), I got hired for a full time job the day after my Chapter Seven bankruptcy got finalized.

      • Of course 2009-2010 was part of the "Great Recession". You had a lot of company in unemployment, techie and non-techie alike. The recruiters were probably also worried about keeping their own paychecks, since almost nobody was hiring.

        (Please understand it is not my intent to make light of what you went through back then.)

    • Re: (Score:2, Insightful)

      by Anonymous Coward

      ". . . . . or to find a position at a smaller firm that needs their skills, probably for less money."

      I am living this reality right now. It's challenging because the clients are in companies that have short-handed staffs, outdated environments and DR plans that are years out of date and/or never tested. If they ever had one at all. They don't really want the new shiny stuff. They want to make the old stuff last and last. And run the newest software, even on the older platforms. 'Not supported on Window

    • by bluefoxlucid ( 723572 ) on Friday October 14, 2016 @03:47PM (#53077665) Homepage Journal

      "Starting your own business" only works if there's a market. To be more complete (but still horribly undetailed to the point of inaccuracy): there's a total amount of income each year and, thus, a total amount of spendable money in one year's time frame, and what can be sold is what can be produced for less than that amount of money. Products are made by labor time, labor time incurs wages, wages are paid out of revenues, revenues come from consumer purchasing, purchasing comes from income.

      A new business either draws purchasing away from an old business or it competes for new purchasing available as population or productivity (amount of stuff made per labor-hour, thus decrease in cost and thus price relative to number of spendable consumer dollars) increase.

      More to the point: unemployment has decreased during 2016. [bls.gov] The labor force participation rate has increased by 0.5% since September 2015, and the number of employed Americans has increased by 3 million while the number of unemployed Americans has increased by 0.014 million (we didn't add as many jobs as we added people in the workforce, but we added more jobs than the proportional employment rate, thus the unemployment rate went down, the number of employed went up, and the number of unemployed also went up).

      Per the past three months, the number of unemployed increased by 0.169 million since July 2016, while the number of employed increased by 0.469 million.

      [...]some 330,000 in all at major tech companies.[...]

      [...]"The layoffs I predicted have been occurring." And worse, he says, these laid-off workers are never again going to find tech jobs: "They will always remain unemployed," at least in tech, he said. "Their skills will be obsolete."[...]

      I don't see 330,000 newly forever-unemployed IT workers. That would be literally 8.25% of the computer and mathematics workforce. We'd be talking about 13% unemployment today.

      Unfortunately, I don't have occupational employment statistics newer than May 2015. At that time [bls.gov], employment in computer and mathematical occupations numbered 4,005,250. Some longer-term analysis [dpeaflcio.org] includes charts that show only normal fluctuation, although computer and mathematical occupations have a long-running up-trend and a current down-trend not substantially different from 2008-2009.
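The percentage claim in this comment is easy to sanity-check against the two figures cited in the thread (330,000 predicted layoffs, and the May 2015 BLS count of 4,005,250 workers in computer and mathematical occupations):

```python
# Sanity-check the share of the workforce implied by the prediction,
# using only the figures cited in the posts above.
predicted_layoffs = 330_000      # Chowdhry's 2016 prediction
workforce = 4_005_250            # BLS, computer & mathematical, May 2015

share = predicted_layoffs / workforce
print(f"{share:.2%} of the workforce")   # 8.24%
```

That's where the "literally 8.25%" figure comes from; the 13%-unemployment claim additionally assumes those workers stay in the labor force and are all counted as unemployed.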

    • find a position at a smaller firm that needs their skills, probably for less money.

      35% less to be precise :(

  • by Anonymous Coward on Friday October 14, 2016 @02:49PM (#53077225)

    The tech bubble has popped before, and it will pop again. That is just the natural respiration of our economy.

    I am sure many of the laid-off workers can update their skillsets and find new jobs. Or some can go ahead and retire. The remaining ones can accept a non-tech job at lower pay, or unemployment.

    And life will go on for everyone.

  • Hmm, from the fancy website, http://www.globalequitiesresea... [globalequi...search.com] I'm sure this guy and his team know what they are talking about. Especially when it's regarding the tech industry. :)

  • PT Barnum disagrees [wikipedia.org] ...
  • Because the need for people in the field who can find out why Granny's printer isn't connecting and who can do a sideload on a TV streaming box is bigger than ever. This is a job that can't be outsourced away from you.

  • H1B (Score:5, Insightful)

    by Anonymous Coward on Friday October 14, 2016 @02:56PM (#53077291)

    Yet 'we can not find anyone'

    Most of the layoffs you are seeing are not material but cost cutting. Meaning the job just went to someone 10x cheaper. Companies will again learn 'you get what you pay for'.

  • From TFA, the one thing that is death to a tech career: >> “They will always remain unemployed,” at least in tech, he said. “Their skills will be obsolete.”
    • by __aaclcg7560 ( 824291 ) on Friday October 14, 2016 @03:09PM (#53077397)
      I was out of work for two years (2009-10), told I was unemployable by recruiters, and got a full-time IT support job in 2011. As far as job skills were concerned, nothing changed during those two years.
  • March (Score:4, Interesting)

    by DarthVain ( 724186 ) on Friday October 14, 2016 @03:05PM (#53077379)

    Well, March isn't that much of a stretch or a limb. March is easily the most likely time for a tech crash to happen. March 31st is the end of the fiscal reporting period, when companies, corporations, etc. all do their end-of-fiscal reporting. Seeing that bubbles tend to last as long as possible, to the last minute before they finally pop, that would put it firmly in March, particularly if you have companies trying to save face for Q1.

    As to 2017, who knows. I went into Computer Science and graduated just prior to the last bubble, which sucked.

    That said, this isn't exactly like last time. This time the layoffs are in big old gigantic corporations that have been around forever, were built up on one product, expanded into everything, and have been collapsing under their own weight for a very long time now after failing to innovate and being surpassed by other more competitive companies. In most cases the layoffs are all part of restructuring deals as they spin off and sell the undesirable departments to others hoping to pick the bones clean (and in some cases are just buying IP and assets, not working employees).

  • by sethstorm ( 512897 ) on Friday October 14, 2016 @03:08PM (#53077389) Homepage

    If there's a war on citizens wrt tech employment, then remove the means to exclude citizens.

  • by Pseudonymous Powers ( 4097097 ) on Friday October 14, 2016 @03:22PM (#53077471)

    Early this year, analyst Trip Chowdhry from Global Equities Research predicted that the tech world was going to see big layoffs in 2016 -- some 330,000 in all at major tech companies... So, was Chowdhry right? "Yes," he told me when I asked him this week. "The layoffs I predicted have been occurring."

    I call bullshit on this guy. There's a table of predicted versus actual layoffs in the article. Of the 14 companies he made predictions about, only four of them seem to have any actual layoffs, and the actual numbers are generally a fraction of the predicted number. Layoffs for companies that he didn't even make predictions about are given, in an apparent effort to buff his credibility, but some of those companies are not exactly household names (WTF is "Zenefits"?), and as for the rest, their predicted layoffs are in the 5-to-10 percent range.

    Not that further layoffs are necessarily unlikely, especially with this presidential election coming up. But it's already October, and it's been nowhere near as bad as this clown predicted. You don't get to say you were right when the data proves you wrong. You don't get credit for predicting layoffs in 2016 if they happen in 2017. You don't get to just make a blanket prediction like "layoffs will happen in the future" and get points when they eventually happen.

  • by King_TJ ( 85913 ) on Friday October 14, 2016 @03:27PM (#53077509) Journal

    First off, the person who commented that it's "predictable when newer isn't better" is correct. Right now, I.T. technology is stagnant. IMO, that's not a bad thing either. What's happening is, just as many people are using computers and electronic devices as ever -- but the market has matured. There's a very low level of "techno lust" for the latest and greatest, because truthfully, what's already readily available is good enough for everything people want to do with their machines right now.

    There's finally some sanity in corporate America, where technology is getting replaced on a schedule that reflects the time-frame it takes for the old gear to physically wear out, instead of demands for 2-3 year replacement cycles just because "the new thing is already old, since it can't run the cool new OS and new versions of apps X and Y".

    And really, HP's situation is a different/unique one in the current round of layoffs. HP split their company into two, not long ago, realizing they had to jettison the "boat anchor" of printing/imaging so it didn't weigh down profits of the rest of the business. The HP doing all the layoffs now is the one holding the bag selling printers and scanners. They've also tried to acquire other printer manufacturers like Fujitsu, but ultimately, they're selling a product that's gone from a "must have" companion purchase with every new PC to a niche need. Their investment in 3D printing didn't appear to pan out either. (I think that's turned out to be a niche hobbyist interest, since it's still a very slow process to print a 3D object, the size of said object is really limited, AND it won't realize its full potential until there are much larger collections of downloadable projects to print. If all the major manufacturers of appliances offered a way to print repair parts on their "support" web sites, for example? Then you'd see 3D printing really take off.)

    But claiming the people who get laid off have no future in I.T.? That's FUD, plain and simple. The trend to cloudify everything is still strong, but I've worked in the field long enough to be confident it's going to trend back the other direction in the next decade or so. Just look at the ridiculous number of data breaches in the news over the last year or two and you quickly see the problem with concentrating a large number of customers' data in one place.

    But hacking aside, cloud services amount to giving up direct control over big chunks of your business operations. When one of these services has an outage, you can't do anything but sit around and hope they actually provide some status updates to pass along to your users and to management. Everyone's first question is "When will it be back up?" and you're stuck shrugging and saying, "Beats me! We're at the mercy of the provider." When this happens enough times, companies start demanding more accountability and control.

    And you tend to get locked in to these services too. If they raise prices or change pricing structures, you can't do much besides pay the new, higher bill (or go through a huge, unplanned project to migrate all the data elsewhere and retrain everyone on how to access it).

    I'm not sure how a cloud provider decides that someone with many years of general I.T. experience -- administering servers, troubleshooting networks, supporting staff -- wouldn't have skills applicable to its business anyway. Either way, "cloud" is an overrated buzzword.

  • Which Skills? (Score:4, Interesting)

    by Elf Sternberg ( 3499985 ) on Friday October 14, 2016 @03:40PM (#53077599)
    None of the articles I've read on this theme list the skills that are going to be out of date. Which skills? What disciplines?

    In 2008, I was laid off after 8 years at a large company, and I'd been using the same tools for those 8 years. As a front-end developer for dev-ops shops, I had skills that were woefully out of date: we'd been using Sencha (JS) and Webware (PY), with some Python 2 Python-to-C libraries. I knew nothing about what the cool kids were doing. I sat down and in a few days taught myself Django and jQuery; I rebooted my SQL knowledge from my 90s-era experience with Oracle and taught myself the ins and outs of PostgreSQL.

    And then, in the bottom of the recession, I took shit contracts that paid very little (in one mistake, nothing) but promised to teach me something. I worked for a Netflix clone startup; I traded my knowledge of video transcoding for the promise of learning AWS. I worked for a genetic engineering startup, trading my knowledge of C++ for the promise of learning Node, Backbone, SMS messaging, and credit card processing; a textbook startup, trading my knowledge of LaTeX for the promise of learning Java; an advertising startup trading my basic Django skills to learn modern unit testing; a security training startup, trading my knowledge of assembly language in order to learn Websockets.

    The market improved. I never stopped learning. I gave speeches at Javascript and Python meet-ups. Recruiters sought me out.

    I've been at another big company for four years now. Will things go to hell in March? I don't care. I have the one skill that matters.

    • This is really the key. If your CV reads like, "I do this thing", it's going to be very difficult to get a new job unless the job is exactly what you previously did. If your CV reads, "I can do whatever you want, in whatever language you want, on whatever platform you want and have proven it in a variety of domains", getting a new job is fairly trivial. You are going to have some overlap with any job you apply for and can just buff up on it a bit before the interview and you're fine. Being a generalist

  • by Joe_Dragon ( 2206452 ) on Friday October 14, 2016 @03:47PM (#53077667)

    OK, time to stop H-1Bs! Clearly they are not needed when US citizens are being laid off.

  • by WillAffleckUW ( 858324 ) on Friday October 14, 2016 @03:52PM (#53077695) Homepage Journal

    Seriously, "never ever going to work in tech", give me a break.

    The point of a college degree is to teach you how to learn, and to have basic skills in a field.

    You don't get rid of MDs just because they aren't "up to the latest tech"; they take a few CMEs and go do a slightly different version of being a doctor.

    Same with Nurses.

    Same with Biologists.

    Same with Computer Scientists.

    The problem is the employers being too lazy to train people, and using it as an excuse to outsource.

    • The problem is the employers being too lazy to train people, and using it as an excuse to outsource.

      I worked at Cisco for nine months in 2013. My manager said he could train me, but that it would be a waste of his time, since I'd just take the training and get a job at a competitor that would pay more than Cisco. Never mind that many people were studying for Cisco certifications on company time, getting certified, and finding higher-paid jobs at competitors. I was subsequently laid off along with 10% of the workforce (my contract came up for renewal and my supervisor got locked out of the HR system) and the CEO got a 60% raise.

  • by Anonymous Coward on Friday October 14, 2016 @04:24PM (#53077929)

    "They will always remain unemployed," at least in tech, he said. "Their skills will be obsolete." Some of these layoffs are due to a sea change in the industry, as it transforms to the world of mobile and cloud.

    The fact that someone sees a transformation to "mobile and the cloud" as a significant change in the skills required, makes me wonder just WTF skills these workers really had, even in their old jobs. Programming is programming.

    It's true that sometimes a different environment has different constraints (e.g. I have more RAM on a phone than I have inside the Apache process on a much bigger server) but it's not really all that big a deal to adjust for, is it? So you're taking different approaches to caching -- you might have to unlearn some habits and learn some new ones (suddenly, CPU is expensive and RAM is cheap -- or vice versa).
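    To make the RAM-vs-CPU point concrete, here's a toy sketch (mine, not the parent's -- it uses only the standard library's functools.lru_cache and a stand-in fib function) of the same computation tuned two ways: cache everything when memory is cheap, or bound the cache and recompute when it isn't:

    ```python
    from functools import lru_cache

    # RAM is cheap, CPU is expensive: cache every result, never recompute.
    @lru_cache(maxsize=None)
    def fib_cached(n: int) -> int:
        return n if n < 2 else fib_cached(n - 1) + fib_cached(n - 2)

    # RAM is scarce (say, inside a constrained server process): bound the
    # cache to 32 entries and accept some recomputation in exchange.
    @lru_cache(maxsize=32)
    def fib_bounded(n: int) -> int:
        return n if n < 2 else fib_bounded(n - 1) + fib_bounded(n - 2)
    ```

    Same code, same answers; the only thing that changed is which resource you chose to spend -- which is the sort of adjustment, not wholesale re-skilling, that moving between environments usually demands.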

    How are skills being obsoleted? I don't get it. The more tech advances I see, the more I'm convinced that hardly anything is ever really changing much at all.

  • IBM is not a good example, as IBM has been imploding for years, with layoffs every year. See Robert X. Cringely for details.
