America's Cities Are Running on Software From the '80s (bloomberg.com) 211

Even San Francisco's tech chops can't save it from relying on computers that belong in a museum. From a report: The only place in San Francisco still pricing real estate like it's the 1980s is the city assessor's office. Its property tax system dates back to the dawn of the floppy disk. City employees appraising the market work with software that runs on a dead programming language and can't be used with a mouse. Assessors are prone to make mistakes when using the vintage software because it can't display all the basic information for a given property on one screen. The staffers have to open and exit several menus to input stuff as simple as addresses. To put it mildly, the setup "doesn't reflect business needs now," says the city's assessor, Carmen Chu.

San Francisco rarely conjures images of creaky, decades-old technology, but that's what's running a key swath of its government, as well as those of cities across the U.S. Politicians can often score relatively easy wins with constituents by borrowing money to pay for new roads and bridges, but the digital equivalents of such infrastructure projects generally don't draw the same enthusiasm. "Modernizing technology is not a top issue that typically comes to mind when you talk to taxpayers and constituents on the street," Chu says. It took her office almost four years to secure $36 million for updated assessors' hardware and software that can, among other things, give priority to cases in which delays may prove costly. The design requirements are due to be finalized this summer.

This discussion has been archived. No new comments can be posted.

  • by Medievalist ( 16032 ) on Thursday February 28, 2019 @01:32PM (#58194474)

    It's Java, right? Tell me it's Java!

    • by juniorkindergarten ( 662101 ) on Thursday February 28, 2019 @01:46PM (#58194570)
      Cobol.
      • MicroFocus COBOL, I bet.

        Which can indeed be modernized with GUI and all, but probably not cost-effective if the salient complaints are true.

        • by sconeu ( 64226 )

          It's running on an AS/400.

          • by spudnic ( 32107 ) on Thursday February 28, 2019 @02:28PM (#58194880)

That still doesn't mean you couldn't put a web-based front end on it that talks back to the COBOL running on the AS/400 (iSeries). We do this all the time. The COBOL is probably very fast and efficient. It just needs a better user interface.
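The pattern described above (a modern front end translating to and from a legacy fixed-width-record backend) can be sketched roughly like this. Everything here is invented for illustration: the record layouts, field positions, and parcel data are hypothetical, and a real AS/400 bridge would go through a vendor connector rather than plain string handling.

```python
# Hypothetical sketch of the "thin web front end over legacy COBOL" pattern.
# The positions below are made up; real layouts live in the COBOL copybooks.

def build_request(parcel_id: str, op: str) -> str:
    """Pack a request into the fixed-width record the legacy program expects
    (invented layout: cols 1-10 parcel ID, cols 11-12 operation code)."""
    return f"{parcel_id:<10.10}{op:<2.2}"

def parse_reply(record: str) -> dict:
    """Unpack the fixed-width reply record (again, invented positions)."""
    return {
        "parcel_id": record[0:10].strip(),
        "assessed_value": int(record[10:22]),
        "status": record[22:24].strip(),
    }

if __name__ == "__main__":
    # A web handler would be a few more lines: decode the HTTP request,
    # send build_request() to the legacy program, return parse_reply() as JSON.
    print(build_request("3512-017", "RD"))
    print(parse_reply("3512-017  000001250000OK"))
```

The point of the sketch is that the translation layer is small and mechanical; the business logic stays where it already works.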

            • Re: (Score:3, Insightful)

              by bferrell ( 253291 )

Yes, but that needs to have money, SCARCE TAX DOLLARS, allocated. And it's next to impossible to OBTAIN enough to do the basics... like public transit, housing the homeless, paying teachers, etc, etc, etc.

The comment reminds me of when my three-year-old sister was told there was no money to buy what she wanted. She told our parents to just "write some" (write a check). We all laughed.

How about if we make Apple, Google, Facebook et al. pay what they should have instead of stashing it in overseas accounts?

Even better, since it probably came from the S/3x midrange line, they could have looked into a Java GUI to front-end it.

            Of course, a decade ago, the GUI tools in RPG would have been very attractive.

            And also, since this is undoubtedly a vendor package, the vendor would prefer to hold them hostage for real money. Time to move to AIX. Oh, wait...

      • by Anonymous Coward on Thursday February 28, 2019 @02:11PM (#58194724)

There are probably millions of retired COBOL programmers out there who wouldn't mind working again for a little while. But they don't have "current" experience.

        • Re: (Score:3, Funny)

          by Tablizer ( 95088 )

retired COBOL programmers out there who wouldn't mind working again for a little while. But they don't have "current" experience.

          Knowing how HR "works", they require "100 years or more of production COBOL experience."

Could be PowerBuilder too, you never know. The back-end applications that do batch processing are COBOL; the front end that business people see is in PowerBuilder.
    • by DeBaas ( 470886 )

As the budget for the new system is $36 million, it seems that they're moving to Java, not from...

    • #Basica
    • Java programmers with experience from the 1980s.
    • Here's a perhaps apocryphal story I heard from an old friend. At the time he was with a consultancy that did work for the City.

      About 15 years ago the City of SF felt that their old mainframe-based financial software was showing its age. This was 100% bespoke software, iirc written in COBOL. Lots of lawful-corruption programmed into it, of course. So the City asked Oracle for an estimate on what it would cost to "upgrade" to Oracle Financials.

      So Oracle sends in a pair of consultants to examine the old softw

  • by Anonymous Coward on Thursday February 28, 2019 @01:35PM (#58194496)

You can read the headline as a denigration of governments (which is always valid, because they get paid regardless of their performance), but you can also read it as proof that the programmers of the 1980s produced some pretty solid work.

    • by thereddaikon ( 5795246 ) on Thursday February 28, 2019 @02:25PM (#58194856)
The average professional developer of the '80s was probably better at their job than the average one today. For one, they actually had a working understanding of the hardware on a conceptual level and could relate how their code would interact with the system. Ask your average JS coder how a computer works and grab some popcorn. Due to the limitations of distribution methods and storage media of the time, software was also far less bloated and much more stable at launch. You couldn't easily patch something post launch like you can now, so you had to get it mostly right the first time. There is no such thing as bug-free code, but there is code with some bugs, and then there is code that is mostly bugs.
      • by djinn6 ( 1868030 )

For one, they actually had a working understanding of the hardware on a conceptual level and could relate how their code would interact with the system. Ask your average JS coder how a computer works and grab some popcorn.

        There's definitely a lot more going on nowadays with both the OS and application layer libraries. I doubt anyone can understand the entire stack the way people did back then.

        In fact, I'll bet you can't figure out what machine code a simple line of JS code turns into either. And even if you did, that answer would be wrong in a few weeks when the next release of V8 comes out.

Due to the limitations of distribution methods and storage media of the time, software was also far less bloated and much more stable at launch. You couldn't easily patch something post launch like you can now, so you had to get it mostly right the first time. There is no such thing as bug-free code, but there is code with some bugs, and then there is code that is mostly bugs.

        I seem to recall a lot of older software that had a final version named something like 1.0.4. And those were only made available months

        • by sfcat ( 872532 ) on Thursday February 28, 2019 @03:34PM (#58195330)

For one, they actually had a working understanding of the hardware on a conceptual level and could relate how their code would interact with the system. Ask your average JS coder how a computer works and grab some popcorn.

          There's definitely a lot more going on nowadays with both the OS and application layer libraries. I doubt anyone can understand the entire stack the way people did back then.

          In fact, I'll bet you can't figure out what machine code a simple line of JS code turns into either. And even if you did, that answer would be wrong in a few weeks when the next release of V8 comes out.

Due to the limitations of distribution methods and storage media of the time, software was also far less bloated and much more stable at launch. You couldn't easily patch something post launch like you can now, so you had to get it mostly right the first time. There is no such thing as bug-free code, but there is code with some bugs, and then there is code that is mostly bugs.

          I seem to recall a lot of older software that had a final version named something like 1.0.4. And those were only made available months after the initial release. In other words, you'd have software with glaring flaws that you'd have to put up with for months. Versus now, any major bug would be quickly patched up post launch and it's a matter of a few days, or even just a few hours of waiting.

          Hahahahaha, modern software that is quickly patched is also broken about 99% of the time.

I recall products and code actually working as designed in the 80s. They had computers where you could literally pull a running CPU out of the board and it would still continue to work correctly. If you think today's software is anywhere near as well written or reliable as software of the past, you are a fool. Today's software business is built on cheap coders pushing out crap as fast as possible. In the past we did actual engineering and testing to make sure things worked before we gave them to users. Today, we let the users beta test software live, and 'let it crash' is the motto of the ops folks.

I don't remember ever running into a show-stopping bug as a kid playing computer games. The first game I remember patching was MOO2, and even then it was mostly balance changes and some minor bug fixes that I hadn't even noticed. I can't remember the last time I played a major release game that worked so flawlessly out of the gate.

        • In fact, I'll bet you can't figure out what machine code a simple line of JS code turns into either.

          Of course you can't, nobody can - feedback-based inlining and optimization make it impossible without inspection tools.

        • Today, few people even TRY to understand the whole stack (if there is a stack, not everything is web). I see way too many programmers who treat the whole thing like black magic, from the electronics to the libraries and even the algorithms.

Text-only interfaces or limited GUIs. Simple, non-multithreaded environments. Bloody Green Screens.

        JavaScript apps do a hell of a lot more than the apps of the 80s. You had to _train_ folks on using those apps from 1980 because they were pretty basic. Let me write software designed for a highly trained operator who's expected to be in the job for decades and it'll be a lot better/different than what I write for somebody who's gonna be in the job for 6 months before they get canned/look for more pay.
        • No problem, those sorts of systems still exist today. The GUI can be a layer on top, it doesn't need to be integrated at all levels like many of today's paradigms do. I mean, Javascript is barely a language and yet no one seems to care that it's unsuited for the purpose it's being used for, because people just don't know anymore that programming isn't just snapping lego parts together.

Due to the limitations of distribution methods and storage media of the time, software was also far less bloated and much more stable at launch. You couldn't easily patch something post launch like you can now, so you had to get it mostly right the first time

        The amazing thing is that they are able to do it without at least three different noSQL databases.

It's that the systems were debugged before the nonstop drive to cut costs. They launched just as buggy, but there was time and money to fix them. Nobody remembers the bugs; they just know the current system (mostly) works.
    • by MBGMorden ( 803437 ) on Thursday February 28, 2019 @03:49PM (#58195440)

Depends. I work in such a government office. Our property tax billing software is written in COBOL, was first written in the very early 1980s, and does generally work well.

      HOWEVER - whew boy is that some weird code. Nothing is driven by configuration tables. Pretty much every behavior is hard coded into the program. Even for security permissions like who can access which screens, there's literally a user ID HARDCODED into the source.

And god help you if you need to do something like widen a text field (all of which are notoriously small, having been defined back when disk space was much more expensive). You'll have to recompile dozens of programs that hit that file, and oftentimes when it's absolutely necessary we have to cobble together fields to make a bigger one (i.e., positions 1 through 30 of a field might be in the middle of a line, 30 through 40 towards the end, and then 40 through 50 in another field at the very end of each line).

Rather than being an example of how good the code is, I think it's more an example of the fact that even for bad (or even terrible) code, if you've had 30 years to debug it, it'll still function fine. It's an unmanageable mess, but it does indeed do exactly what it's supposed to.
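The field-cobbling kludge described above, one logical field scattered across several non-contiguous slices of a fixed-width record, can be illustrated like this. The positions and field name are entirely invented; in a real system they would come from the COBOL copybooks.

```python
# Hypothetical illustration of a "wide" logical field stored as several
# non-contiguous slices of a fixed-width record. Positions are invented.

# (start, end) slices of the record that together hold one logical field
OWNER_NAME_SLICES = [(0, 30), (55, 65), (90, 100)]

def read_spliced(record: str, slices) -> str:
    """Reassemble a logical field from its scattered pieces."""
    return "".join(record[a:b] for a, b in slices).rstrip()

def write_spliced(record: str, slices, value: str) -> str:
    """Scatter a value back across the pieces, space-padding or truncating
    to the combined width of the slices."""
    total = sum(b - a for a, b in slices)
    value = f"{value:<{total}.{total}}"
    out, pos = list(record), 0
    for a, b in slices:
        out[a:b] = value[pos:pos + b - a]
        pos += b - a
    return "".join(out)

if __name__ == "__main__":
    rec = write_spliced(" " * 100, OWNER_NAME_SLICES, "JOHN Q PUBLIC")
    print(repr(read_spliced(rec, OWNER_NAME_SLICES)))  # 'JOHN Q PUBLIC'
```

Every program touching the record has to agree on the slice table, which is exactly why widening a field means recompiling everything that reads the file.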

Back then, like today, programming was often ad-hoc. Everyone wanted to be a programmer in the 80s; it was promised as the best way to make money, and college computer departments were overloaded. So ya, a lot of people programmed in a very dumb way, back then and today. But the few good programs did stand out. It seems rarer today though, with everyone praising how awesome Microsoft Office 365 is despite it being worse in every way than the desktop versions.

        To be fair though, financial software is always

    • by Darinbob ( 1142669 ) on Thursday February 28, 2019 @04:07PM (#58195586)

You can read that as some people just being dumb, criticizing governments for spending money in the first place while simultaneously criticizing governments for not spending money to upgrade old systems. This is an age-old problem: if you get the budget once, you can spend it to buy useful stuff, and then you have to deal with the same thing for years or decades before you'll get budget to properly improve or maintain it. The fact that slashdotters are still surprised and amazed that governments use outdated equipment and software just proves that they've been stuck too long in the basement and need to get out more.

    • by Livius ( 318358 ) on Thursday February 28, 2019 @05:54PM (#58196350)

A lot of software reaches a level of being essentially feature-complete and mostly bug-free, and can only go downhill from there. Some software was at that point in the '80s, and most of the rest reached that level in the '90s.

      If you have software that
      1) works, and
      2) physically cannot connect to the Internet (which seems to be the only way to genuinely guarantee something is secure)
then replacing it would have costs and risks that outweigh the benefits.

  • by TWX ( 665546 ) on Thursday February 28, 2019 @01:36PM (#58194502)

...and is optimized to run on extremely slow hardware. It was often written by people who were extremely talented at optimizing their code, because hardware limitations forced them to be.

I won't deny that advances in computer languages have made it possible to write better programs than used to be written when machines were strictly procedural and single-threaded, but at the same time, the amount of bloat in modern programming afforded by modern hardware has more than made up for it.

I've seen the progression of software for simple things like work-order systems, asset management, and auditing get worse over time. The only 'improvement' is access: going from an 80x25 text console on a remote system with terminal emulation, to a full-fledged text-mode program on a specific architecture, to a GUI program on a specific architecture, to 'applet'-type programs using cross-platform runtime libraries, to web-based access that is theoretically platform-independent given a minimum browser version. In just about all cases, the further they've gone, the slower and clunkier it gets for experienced users who use the system frequently. It might not be any better for inexperienced users either, if the vendor hasn't taken the time to look at workflow from an outside point of view.

    • California and HP have collaborated [latimes.com] to keep the shit show at the DMV ongoing for years. Outages abound. Probably a good 20% of the time I go to the DMV, they're having some kind of outage. And then there's the training issue — the system is antiquated, and has to be massaged just so to get it to work, so a huge amount of training is required and employees clearly aren't getting it. The system is actually from the sixties, as I understand it. The ID system has been modernized, but vehicle registration

    • by Suren Enfiajyan ( 4600031 ) on Thursday February 28, 2019 @02:29PM (#58194890)
Yeah, today's bloat is incredible, and this era of shitty programmers with their toy programming languages is very disappointing. One can only imagine how many CPU cycles and how much electricity is wasted. Just a fun fact: Windows 95 required less RAM (sometimes by an order of magnitude) than a basic calculator in any modern OS.
      • by Anonymous Coward on Thursday February 28, 2019 @02:41PM (#58194976)

It blew my mind when I ran the Windows 10 calc.exe for the first time on an HDD system. It had a measurable loading time.

A program as old as Windows itself, which was snappy on a 386, now has a human-noticeable loading time, and a link to a privacy statement! But that's another rant.

      • by djinn6 ( 1868030 )

        Would you use Windows 95 as your primary OS though? Why not?

Is it the fact that it constantly crashes for no reason? Or that it was limited to 16 bits of color and one monitor? Or that you had such great file system choices as FAT12 and FAT16? Or that it didn't come with a networking stack or a 3D graphics library?

        All of those cost memory and CPU time. It's very easy to write hello world in assembly and have it come out to 400 bytes of machine code and the same amount of memory. It's not very easy to write a br

I agree somewhat. The 4MB version probably had some limitations; as far as I know IE wasn't included, and after IE was bundled the memory requirements rose. Anyway, even with its limitations, Windows 95 is still incomparably more complex than the calc.exe of any modern Windows.
    • Yep! I think this has generally been the case with most of the business Financial software packages out on the market, even though they're "boxed" software and not custom written.

      With few exceptions, they seem to usually have their roots in MS-DOS or Unix -- but were converted into a Windows app for the sake of the GUI (and often sold to another company before that changeover took place).

      Microsoft Great Plains ERP would be a great example. It's gone through at least 4 or 5 major revisions since Microsoft t

      • by TWX ( 665546 )

Most of our older stuff ran on an AS/400. It's sad that a platform older than I am, on physical hardware that's a decade old and isn't even supported by the manufacturer anymore, can do financials faster and better than modern Windows-based software.

  • Awesome! (Score:5, Insightful)

    by DogDude ( 805747 ) on Thursday February 28, 2019 @01:41PM (#58194526)
    I think this is awesome. It was good software that hasn't needed to be "updated" every other day like modern software is.
    • Re: (Score:2, Insightful)

      by Anonymous Coward

Oh noes! The software has been running since (some_random_date_in_the_past). We must update now!

One second there, pardner. This isn't a case of wooden sewer pipes or streetlights still running on gas. Hell, this isn't even a case of two-cylinder cars running on super-highways. This stuff has been running since the 1980s, and as old as that sounds to many of you youngsters out there, that was only 30 to 40 years ago.

      My point here is that this is the very first iteration of the modern software cycle. We really h

    • by tomhath ( 637240 )

      It doesn't need to be updated because (I'm guessing) it isn't connected to the internet. Air-gapped systems can run for years without being updated.

    • by antdude ( 79039 )

      Yep, it's totally rad. ;)

  • This is because they budget for the acquisition of the equipment and software but somehow it never occurs to anyone to budget for future improvements and upgrades to keep the technology modern on an ongoing basis. So they end up with systems that work as originally designed but fail to keep pace with improvement in technology. There are a lot of these sorts of systems in the military. The military to this day still uses 8" floppy disks [cnbc.com] which have been obsolete technology for 40 years. My car is a pickup

    • This is because they budget for the acquisition of the equipment and software but somehow it never occurs to anyone to budget for future improvements and upgrades to keep the technology modern on an ongoing basis.

      You can't expect to provide a beautiful new quilted maple or cocobolo desk, real leather chair, new curtains, and office redecoration for every new incoming boss if you go spending money on lame stuff like tech. Geez, don't you know *anything*? I'll bet you advocate for other unnecessary stuff l

    • So they end up with systems that work as originally designed but fail to keep pace with improvement in technology.

      And this matters why, exactly?

      My car is a pickup truck but my state's outdated registration system lists it as a station wagon because it relies on old and hard to fix technology and that was the best it could do.

      Are they using bits of leather and bird wings dipped in soot? The term "pickup truck" predates computers.

Now you'll hear some idiots saying "ain't broke don't fix it" which is

      • And this matters why, exactly?

Because requirements change with society. Because other software changes. Because machines improve and can do a better job. Because software has advanced a LOT in the last 30 years. Because computer hardware has improved a LOT in the last 30 years. Because the design of the original system was based on the limitations of the hardware of the day, and a lot of compromises were made that hurt efficiency. Because it's expensive to maintain legacy systems. Because if it fails it might be really hard to res

      • by djinn6 ( 1868030 )

        So they end up with systems that work as originally designed but fail to keep pace with improvement in technology.

        And this matters why, exactly?

        Because no software is ever complete in the sense that you wouldn't ever want new features added to it. At some point, the requirements change, and if your system is so old that you can't find developers, or even development toolchains for it, you won't be able to update the software. And that's assuming it's bug free.

        Now imagine trying to update a machine that ran on punch cards. Even if you knew which holes to punch, can you even find the tools they used to punch those cards? Where do you even buy blank c

"Now you'll hear some idiots saying 'ain't broke don't fix it', which is a poor argument for technology that is clearly so far behind the state of the art."

The point here is that the "state of the art" is so lame that the big expenditure ($36M!!! Do they want to buy one laptop per building they register!?), at high risk (it seems that, once you dig beyond the corpocrap, there are more failed projects than successful ones), for a very limited benefit even if the project goes OK, strongly calls for a "if ain't broken do

  • While San Francisco is a big city, that just feels like a big number to me. I imagine the assessor's office has a huge database to deal with, but there's still a finite number of employee users and whatever outward facing public interface. And while every city wants to think they are unique, it's also hard to imagine their ultimate needs are radically different than Chicago, Kansas City, Las Vegas, Phoenix or any other random big city. So I guess the number is big if they start wanting a completely bes

    • by Shotgun ( 30919 )

      I doubt that processing their entire database would make my phone sweat. I can't imagine that the city has more than a few million properties to deal with.

    • I imagine the assessor's office has a huge database to deal with,

      By modern standards, I'm not so sure. About a million people, one megabyte per person, twelve backups, and it'll all fit on ONE $500 HD.

      Yeah, a properly designed system won't be that cheap or easy. Of course, you won't be keeping a MB of data on each person, either. You won't be keeping a MB on each piece of property, much less each person....

      Sounds like a budget that includes bleeding edge equipment (that'll still be obsolete in ten ye
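The back-of-envelope estimate above can be written out explicitly. The population and per-record size are the commenter's round numbers, not real figures for San Francisco's assessment roll.

```python
# Back-of-envelope version of the storage estimate in the comment above.
# All inputs are the commenter's round numbers, not real data.

people = 1_000_000            # rough city population
bytes_per_record = 1_000_000  # a generous 1 MB per person/property
copies = 1 + 12               # live copy plus twelve backups

total_bytes = people * bytes_per_record * copies
total_tb = total_bytes / 1e12
print(f"{total_tb:.0f} TB")   # 13 TB: within a single large modern drive
```

Even with a very generous megabyte per record and a dozen full backups, the raw data is tiny by modern standards; the $36M is clearly going to people and process, not storage.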

    • by DarkOx ( 621550 )

It probably is a bespoke solution though. I am sure the original software was modeled on an existing paper process. Said paper process probably had a lot of embedded rules, which are defined by local statute.

This is one of those cases where there is probably quite a bit of nitty-gritty that most people don't realize. It's one of those cases where yes, there is probably some off-the-shelf animal that might meet their needs, but it probably also requires an army of consultants to customize and develop rules for. No

    • And while every city wants to think they are unique, it's also hard to imagine their ultimate needs are radically different than Chicago, Kansas City, Las Vegas, Phoenix or any other random big city.

I defy you to find another city which considers whether you have free-range, organic, artisanal, non-GMO, LGBTQXTY-friendly handrails on your staircase as part of your property tax assessment. Except maybe Portland. :-p

    • "While San Francisco is a big city, that just feels like a big number to me. I imagine the assessor's office has a huge database to deal with, but there's still a finite number of employee users and whatever outward facing public interface. And while every city wants to think they are unique, it's also hard to imagine their ultimate needs are radically different than Chicago, Kansas City, Las Vegas, Phoenix or any other random big city. "

      Sum that up with the "surprised no one said 'open source'" from anothe

      • The problem with getting more cities involved? More sets of different rules and regulations. Now instead of trying to code for one municipality's weird laws, you have to code for ten different (and possibly conflicting) requirements.
    • While San Francisco is a big city, that just feels like a big number to me.

      Not really to me. My department uses three systems and just to upgrade one of them with the same vendor from one version to the next was around 12-15 million. That's for new servers, deployment team, trainers, vendor support, etc. Any changes needed to the actual server room because of need for better HVAC, user computers, or something like that would be extra. Now, that's for a pretty large system of servers handling the main cluster, reporting, back ups, web serving, file server and archives, etc. Start t

  • by rickb928 ( 945187 ) on Thursday February 28, 2019 @01:49PM (#58194584) Homepage Journal

    'can't be used with a mouse'

Which describes an entire era of software. Was it useful before a mouse was considered so vital, actually before the mouse was even used, or existed? Well, back then it was specified, purchased, and used. Same as now; this is a specious argument.

    'Assessors are prone to make mistakes when using the vintage software because it can't display all the basic information for a given property on one screen'

Dear, it seems as if either this software was NEVER usable, or users aren't taking the necessary care to do their work accurately...?

    'The staffers have to open and exit several menus to input stuff as simple as addresses'

    Ah, the slings and arrows.

    'To put it mildly, the setup "doesn't reflect business needs now'

As in ease of use, etc., sure. But it has always worked like this, so why do I read this as 'it's old and clunky, and it's the fault of the software that I make so many mistakes'? Where I work, we do have a lot of this. Because we care, and work in private industry, we understand the software, make the necessary adjustments to our habits, and take the time to do it right.

    'It took her office almost four years to secure $36 million for updated assessors' hardware and software...'
    'The design requirements are due to be finalized this summer.'

    What comes first, the chicken, the egg, the funding, or the requirements?

After all that, has no one in San Francisco government made a cost-benefit (CBA) case for replacing it? I'm betting they are leaving revenue on the table by not having accurate data. And I'm betting they need to build reasonable, achievable requirements. So many government IT projects fail because the project was designed poorly from the start. Examples abound.

    • 'It took her office almost four years to secure $36 million for updated assessors' hardware and software...' 'The design requirements are due to be finalized this summer.'

      What comes first, the chicken, the egg, the funding, or the requirements?

It typically starts with a budgetary quote, ±30%, based on the current install base and a multitude of assumptions, plus a basic wish list of what the updated system should do. At this point they give them the budgetary number, which then allows the agency to go for funding for the project. Once funding has been approved for the project, a full audit is done on the system, all of the detailed pieces are looked at, and all of the i's are dotted and the t's crossed. During this time the final system layout is r

Speaking as someone who has actually used SF's assessor software in SF city hall (as a member of the public, to look up property records), I think they're making the common mistake of not realizing that introducing a mouse always hinders productivity (people are always faster keeping their hands on a keyboard and navigating via menu).

The software is 80x25 and does look like a DOS app, though it behaves more like a unix or mainframe terminal app. The multi-user aspect might contribute to the replacement cost

And to be fair, a text-based solution demands care be taken to design a 'perfect' tab order, fields laid out like the paper forms coming in, etc. My brother THE programmer bemoans the effort he puts into entry screens that goes ignored because users never bother to use the TAB key, as instructed, to avoid taking their hands off the keyboard and dealing with a mouse. His stuff is forced into GUIs by OS and platform limitations, thank you IBM.

      • by djinn6 ( 1868030 )

        The “prone to mistakes” is a hiring issue (gov’t at its best), not a software issue. They’ve got a lot of errors in how docs are recorded and indexed that reveal sloppy and lazy work.

        If people are making different mistakes all the time, then it's a problem with the people. If they're consistently making the same mistakes, then there's a problem with the process.

        Good software can catch a lot of issues with the process. Not the least of which is automatically verifying the values appear reasonable, e.g. a 1500 sqft. apartment does not cost 5x the median house in the area, or showing a warning about what appears to be a very creative first name. It can eliminate a lot of the error-prone ma
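The kind of automated sanity check described in the comment above might look like the sketch below. The field names and the five-times-median threshold are invented for illustration; a real assessor's system would compare against comparable properties, not a citywide median.

```python
# Sketch of an automated plausibility check on assessed values, flagging
# outliers for human review. Fields and threshold are invented.

from statistics import median

def suspicious_assessments(parcels, ratio=5.0):
    """Flag parcels whose assessed value is wildly out of line with the
    median of the batch: candidates for review, not automatic rejections."""
    med = median(p["value"] for p in parcels)
    return [p["id"] for p in parcels
            if p["value"] > ratio * med or p["value"] <= 0]

if __name__ == "__main__":
    parcels = [
        {"id": "A1", "value": 900_000},
        {"id": "A2", "value": 1_100_000},
        {"id": "A3", "value": 1_000_000},
        {"id": "A4", "value": 95_000_000},  # likely a data-entry slip
    ]
    print(suspicious_assessments(parcels))  # ['A4']
```

A check like this doesn't fix sloppy data entry, but it turns a silent error into a reviewable exception, which is exactly the difference between a people problem and a process problem.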

    • Dear, it seems as if either this software was NEVER usable, or the users aren't able to take the necessary care to do their work accurately...?

      "Usable" isn't a binary choice, it's a spectrum. I have no doubt this software was considered "usable" at the time and it probably beat the paper system it replaced. Standards have gone up since then and now it seems barely functional.

      I'm sure at the time, tradeoffs were made based on having a 24x80 character based screen, or maybe very low resolution graphics. I can just hear the conversation: "the clerk needs to enter a street address and a city name but I don't have room for both at once. I guess I'll ad

  • A mouse? (Score:5, Insightful)

    by Hugh Jorgen ( 4906427 ) on Thursday February 28, 2019 @01:52PM (#58194602)
    A mouse would not increase efficiency; look at the horrid inefficiencies of web applications. The lost productivity when IBM forced one of its large financial clients to move from a 3270 green-screen app for change management to a web app caused outrage. Most UI/UX people do not use the tools they design. Is there "technical debt" here? Likely. Does the new app need to support a mouse or be web based? Likely not.
    • Thing is, at least some of these applications are literally just using PCs as terminals with a 3270 emulator. Why can't the emulator let them select fields with the mouse? The inefficiencies of using a web application vs. a 3270 emulator are already in place, because they're not using 3270s any more.

  • by Anonymous Coward

    The only reason for San Francisco to upgrade is if they want to be like Atlanta and get to try out some ransomware code on their production systems

  • That will be the new old standard.
  • Quote (Score:2, Insightful)

    by Anonymous Coward

    "We’re dealing with an irrational public who wants greater and greater service delivery at the same time they want their taxes to be lower." That's not irrational. That's optimistic. Improved service should decrease costs. If the cost of improved traffic management is less than the savings of improved road maintenance, then the solution works. City planners work on long-term solutions with tech that likely won't change for decades. Saying something was made in the 80s makes it sound old, but doesn't me

  • This kind of project screams career-ending budget overruns, graft, missed deadlines, etc. And when it's finally "done", the system fails to work in just one way that everyone notices. Paychecks aren't direct deposited. Vacation hours screwed up. Tax figures are off by a decimal point. Cron jobs fail when some SQL job fails due to some unexpected character in someone's newly created name.

    All of the above costs directors/managers in public positions their comfy jobs (which they earned through years of unenvia

  • Just because something has been in use for a long time doesn't mean it is in any way good.

    If you can extract the existing data and use it with a "newer" application, it may be a good thing.

    There should be some oversight concerning software purchasing/provisioning for civil purposes.

    There isn't a real reason to have multiple municipalities using different software packages for the same purposes. This seems like a valid reason for state wide single sourcing.

  • by plague911 ( 1292006 ) on Thursday February 28, 2019 @01:59PM (#58194660)
    Finance, cities, airlines, pharma, the whole defense industry. What kind of news article is this, on a news aggregation portal biased toward software developers, when even this most basic fact is presented as some kind of surprise?
    • Yep. Building those systems was incredibly expensive and difficult. Replacing such a system is orders of magnitude more difficult...and expensive. For one thing you'll need to retain/import all the old data that no one wants to lose, just in case they need it some day. And every stakeholder who puts up with the limitations of the existing system will have a laundry list of new bells and whistles they want implemented in the new system. "If it works don't fix it" may sound like an excuse to be lazy, but
  • https://en.wikipedia.org/wiki/... [wikipedia.org]

    No wonder our cities are such disasters. It's deliberate!

  • by slipped_bit ( 2842229 ) on Thursday February 28, 2019 @02:05PM (#58194696) Homepage

    ... Minnesota spent about a decade and $100 million to replace its ancient vehicle-licensing and registration software, but the new version arrived with so many glitches in 2017 that Governor Tim Walz has asked for an additional $16 million to fix it. ...

    Must have bought the same software that my state purchased. For over a decade they collected a "modernization" fee while using the old system of IBM 3270 terminals, and when they hired the company to develop the new system it was a colossal failure. Wait times went up significantly. Years later and the system still hasn't improved. Oh, and they still keep collecting that modernization fee. The only improvement is that you can now reserve a spot in line via a text message and they will then send you a text message when you're near the front of the queue, so you can go to work or whatever and don't have to sit in the waiting room for hours.

    I pay the "convenience" fee to handle as much as I can on-line and via mail, but some times that can't be done such as when purchasing a vehicle.

    I bought a utility trailer a while back. They had to fill out my name and address on no less than four different pieces of paper. Seriously? Even if they need to use paper for some strange reason, they can't just type it in once and print out the four different forms they need?

  • Leave it that way. It's likely not directly connected to the Internet, so it's secure. Or, if it is, it's not running anything that anyone remembers how to hack or is targeting these days.
  • I don't know what my software version/revision is -- it updates automatically every night between 2-6am -- but I'm running on hardware from the 60s -- walking too.

  • I'm still using Lotus 123 -- and WordPro (still a better editor than Word). I have it installed, and they work fine, on both my Windows 7 and 10 systems. I use them for old docs and my current financial/budget spreadsheet. Now, I do *also* have Word 2010 and LibreOffice 6.1.5.2 installed, but have been too lazy to migrate my Lotus docs to the newer applications.

  • by magarity ( 164372 ) on Thursday February 28, 2019 @02:25PM (#58194860)

    The design requirements are due to be finalized this summer.

    I can almost guarantee the project will be years over schedule and billions over budget because, no, the design requirements were not actually "finalized" and city managers with enough clout constantly interrupt development with additional feature requests.

  • by TRRosen ( 720617 ) on Thursday February 28, 2019 @02:28PM (#58194882)

    If you wait too long you get trapped by technology.
    The big problem here is that while the data on these systems is trivial today, it pushed the limits of 8-16K minicomputers. Thus there were a lot of sneaky calls directly accessing memory or even processor registers. The software was integrated into the hardware with bits of assembly and whatnot. This makes it pretty much impossible for a young modern programmer to figure out. The old ones are dead or want $500/hr to touch a COBOL system, and let's face it, the guys who bid on government contracts won't hire them anyway. So these systems become black boxes... they still work but no one knows how.
    There is a local business here that, until being bought out, still depended on an '80s Wang system running software built for a '70s Wang system. Their only source of parts was trading with the Air Force, and the Air Force only used theirs to train people on what tech might be hooked up to old Soviet-bloc weapon systems.

  • I will say no more. Vendor lock-in, in perpetuity.

    Unless you have $10-20 million to blow on a roll of the dice.

  • by Deathlizard ( 115856 ) on Thursday February 28, 2019 @02:29PM (#58194898) Homepage Journal

    I never will understand this obsessive need to try and kill systems that just work.

    AS/400-based systems are some of the most reliable systems money can buy. They can handle insane workloads and can scale from small systems to complete mainframes. The only real issue with them is terminals (which is a minor issue with modern terminal emulation) and the hardware and software maintenance costs, but you do get what you pay for.

    I'll almost guarantee that they will switch to some "modern" system running either a Java or web-based backend that will either get hacked, crash under excessive load, or both, and they'll probably pay twice as much as they're currently paying to switch versus upgrading their current setup.

    • Re:AS400 isn't dead (Score:4, Interesting)

      by smoot123 ( 1027084 ) on Thursday February 28, 2019 @03:56PM (#58195508)

      AS/400-based systems are some of the most reliable systems money can buy. They can handle insane workloads and can scale from small systems to complete mainframes.

      OK, so I was actually looking into this a few days ago. I'm working on modernizing our build system, part of which includes running Jenkins jobs on AIX servers (to build an AIX client module. Whatever.) We fire off around 10 parallel jobs to build on AIX, Solaris/Sparc, Solaris/x86, Windows, Linux/x86, Linux/x64, and so on. I wanted to understand which ones take the longest because that's what I have to optimize.

      Funny, the AIX, PA-RISC, and SPARC systems get absolutely crushed by any x86 system. They may get a lot of work done per CPU cycle but the newer systems have just so many cycles it doesn't matter. AIX, HP-UX, and Solaris might be heroes at getting lots of transactions done on a 100 MHz processor, yay for Big Blue, HP and SunSoft, but if I care about actual absolute throughput, just nope.

  • "but the digital equivalents of such infrastructure projects generally don't draw the same enthusiasm. " -- it's not difficult to see why. Stories abound of government software upgrade attempts that turn into overpriced fiascos. (I was personally involved with one, where a utility was royally screwed over by a certain unnamed vendor.)

    Government agencies don't appear to understand how to manage the process, and vendors tend to take advantage of that. The budget tends to bleed out to the tune of five or t

    • by tomhath ( 637240 )

      Government agencies don't appear to understand how to manage the process, and vendors tend to take advantage of that.

      The fundamental problem is that people don't want to accept limitations in their shiny new software. If the buyer would accept a simple reliable system that has some limitations it wouldn't cost five or ten times what it should, but no project manager wants to have their name on a system that doesn't do X and Y in three different languages with 99.999% uptime and runs on OSX, Windows, iPhones, Android phones, and can easily be ported to Linux...oh, and it must have a voice interface and be handicapped frien

  • by rnturn ( 11092 ) on Thursday February 28, 2019 @03:24PM (#58195242)

    It's COBOL. Running on an AS/400. Big effin' deal. I'm guessing that the county assessor's opinion of computer technology is "If I haven't used it, it's got to go." [1] COBOL is not exactly dead, but it isn't the programming language du jour. (I've seen recent job ads looking for people with COBOL experience. I could forward some of them to her.) I suspect she'd be much happier if the whole system consisted of a single, shared Excel spreadsheet. At least she'd have her mouse to move around.

    I predict we'll be seeing a future post about the huge cost overruns, inoperable software, and the crisis in the assessor's office when the conversion to a trendy "non-dead" language on new, overpriced hardware founders and the lawsuits begin.

    [1] - She seems like she might be a kindred spirit with the guy I once worked with whose title of "Director of Computing Technology" would never have prepared you for his daily criticism of each and every computer technology he came in contact with. It was really quite amazing. Nobody--and I mean nobody--in the department could figure out what technology would be graced with his approval.

  • It just appears bad to us because we've all experienced software from the 1990s, a time when software was _really_ bad.

  • "It took her office almost four years to secure $36 million for updated assessors' hardware and software that can, among other things, give priority to cases in which delays may prove costly. The design requirements are due to be finalized this summer." $36 million? For some software and computer hardware that records simple information in a database? No wonder it took four years to get that funding. I'd have given them about a tenth of that, told them to go hire 2-3 programmers, get some servers and PCs from Dell, and stop bugging me for ridiculous amounts of money...

  • by Sqreater ( 895148 ) on Thursday February 28, 2019 @04:47PM (#58195956)
    You have to double the cost from $36 million to $72 million, expect a long delay in implementation, and it won't work the way they said it would, or at all, and they'll have to scrap it in the end and go back to paper and pencils, because they tossed out the old system. Real world.

Beware of Programmers who carry screwdrivers. -- Leonard Brandwein

Working...