The N+1 Problem (Score:2)

You've all heard the adage/joke, right? "That's the nice thing about Standards - there are so many to choose from!"
Back in November 2017, in response to a Slashdot story about Eric Raymond's view that there were three viable alternatives to the C programming language, I wrote a post titled "The Problem With Me-Too Languages" [slashdot.org].
I think part of that post might be useful in this discussion:
"So whilst I'm always interested in learning about developments in programming language design, I think it hel
Re: (Score:2)

I appreciated your final comments.

Any line of code that makes it to Production is technical debt, regardless of how "new" the technology is.
When it comes to legacy replacement, I like to ask how many person-years of development are sunk into the system. Assume 60-75% of that cost to replace it. For risk, add 1% per person-year as the chance of some level of failure (including complete failure).
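As a back-of-envelope sketch, that rule of thumb could be expressed as follows. The function name, the midpoint cost factor, and the linear, capped reading of the "1% per person-year" risk figure are all my assumptions, not something the rule itself pins down:

```python
def replacement_estimate(person_years, cost_factor=0.675, risk_per_py=0.01):
    """Hypothetical sketch of the legacy-replacement rule of thumb above.

    Takes the 60-75% cost range at its midpoint (cost_factor=0.675) and
    reads "1% per person-year" as a simple additive probability of some
    level of failure, capped at 100%.
    """
    cost_py = person_years * cost_factor         # replacement effort, in person-years
    risk = min(1.0, person_years * risk_per_py)  # chance of some level of failure
    return cost_py, risk

# A hypothetical system with 40 person-years sunk into it:
cost, risk = replacement_estimate(40)
print(f"replacement effort ~{cost:.0f} person-years, failure risk ~{risk:.0%}")
```

Note that under this naive linear reading, any system past 100 person-years is estimated as certain to suffer some level of failure during replacement, which is arguably the point of the heuristic.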
I've been expecting Cobol and other legacy jobs to see pay spikes due to lack of talent as people retire. What's hap
Re:The N+1 Problem (Score:2)

I don't have very much experience of working with large AD Teams - all of my experience with them has been earned within just two organizations.
But one observation I would make is that, within the broader community of technologists, Developers are among the earliest of early adopters - a community of people who love new and emerging technologies. This carries into the workplace, with the result that the moment a new language, IDE, toolkit, or platform comes along, they want to be using it. In fact, my current employer even has a term for it - we call it "Résumé Architecture" - the idea being that if some new piece of technology would look good on your Résumé, you just put it in the architecture for your next project and declare it "essential". (It's how we've ended up with such a mass of junk throughout our architecture.)
This is the attitude that creates the problem; thousands of organizations across the country allow this to happen until it becomes "normal", especially when individuals move between organizations and take the attitude with them. It might satisfy the egos of those developers, but it absolutely contributes to this problem.
The best way I've seen of dealing with this was a manager who persuaded her superiors to let her set up a dedicated "lab". She had a security lock placed on a former conference room; inside she put a partitioned network and a bunch of machines [we're talking in the age of Sun SPARC "pizza boxes": so, a while ago]. At the end of each project, when the code went live, she had the entire project team cast ballots to identify the best-performing team members. Being voted into the top 2-5 participants [depending on the team size] got your ID card added to the "lab" access list for a fixed length of time: you were allowed to spend up to 1 day per week in the lab for 12 weeks, and in those 12 days you could work on anything you wanted.
At the end of your 12 days, you could either simply return full-time to the project where you had been spending the remaining 4 days of your working week, or you could give a presentation on what you had developed during your 12 lab days.
This was considered a golden opportunity to get a speaking slot at a Town Hall, which in turn was a way to get noticed for career opportunities.
But best of all, it allowed the same manager to outright prohibit any deviations from the core technology stack her architects had specified. Others tried to replicate this with limited success, but for her it worked. The message was clear: if you want to play with cool new tech, prove you're good enough, or do it on your own time. An unreasonable employer will bleed good staff, but an overly accommodating employer will be left with unsupportable technology. This is all about finding a safe middle ground.