I write software for a living and I last considered the algorithmic complexity (Big-O / Theta / Omega) of what I wrote:
Meta-suggestion for poll (Score:2)
Can't recall if any of my poll suggestions were ever used, so how about a meta-suggestion? How about some poll topics with more potential for "funny" comments? I think that's the biggest thing lacking on today's Slashdot. Or perhaps all the members with humorous wits have already left the building?
Re: (Score:2)
Considering is different from doing something (Score:5, Interesting)
Considering Big-O is different from doing something about it. In my experience, for most applications, you are far more likely to be bound by network speed, local I/O, or something else entirely out of your control. That said, it is OK to look at something and say, "gee, if I take path A, it will be O(n^3), but if I take path B it will be O(n log n), so I should go with B since it will only add 10% to my implementation time and will prevent it from becoming a bottleneck later on."
What I tend to resist quite strongly is the urge to re-implement working code because it looks inefficient to me. For that I need concrete evidence that the code is inefficient (e.g., profiler output, or a reliable report from users/developers accompanied by data to back it up) and a solid idea of how to change it for the better. Otherwise, it is simply not worth the risk of breaking working code that isn't causing a problem.
The other thing I try to avoid is spending lots of time up front on the algorithms when I don't even know where the bottlenecks will actually be.
Remember: "early optimization is the root of all evil."
Please, No Exponential Algorithms! (Score:5, Insightful)
It was only three years ago that we found out Windows Update in Windows XP used an exponential algorithm [arstechnica.com] that silently wasted hours, if not days, of CPU cycles on at least a billion machines worldwide every Patch Tuesday. Please don't forget the lesson.
Maybe the exponential algorithm was fast when N was small, but you never know what kind of data the code will be tasked with in the future. Implementing suboptimal algorithms is technical debt, and if your code is part of Windows, multiply that debt by a billion. Speaking from experience, most common algorithms are O(n log n), with O(1) attainable through clever use of a hash table, so I see no excuse for implementing anything n^2 or worse. Such code needs to be documented with a prominent warning.
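To make the hash-table point concrete, here is a minimal sketch (names invented for illustration) of how a pairwise scan becomes quadratic while a set-based check stays linear:

```python
# Hypothetical sketch: detecting duplicates in a list, two ways.

def has_duplicates_quadratic(items):
    """O(n^2): compares every pair of elements."""
    for i in range(len(items)):
        for j in range(i + 1, len(items)):
            if items[i] == items[j]:
                return True
    return False

def has_duplicates_hashed(items):
    """O(n) expected: one hash-table lookup per element."""
    seen = set()
    for item in items:
        if item in seen:       # average O(1) membership test
            return True
        seen.add(item)
    return False
```

Both return the same answers; only the growth rate differs, which is exactly the kind of thing that stays invisible while N is small.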
I'd argue that any code that runs on more than a handful of machines deserves some thought and polish. The impact is multiplied by the number of users, and I think we all want our code to be generally useful to as many people as possible. You should also have some idea where the bottlenecks are before writing a single line of code. Don't worry about premature optimization; the concern is overrated. It sounds like you are just averse to criticism of your code and want the critiques to carry a heavier burden of proof. Criticism is good: it means your code is getting some attention, so be glad!
Re: (Score:2)
You make a good point, but the specific example which you cite is a result of poor architecture. That is to say, part of the architect's (or lead developer's if there is no designated architect) responsibility on a project is to identify those parts of the code which need special attention.
Of course, with the Windows XP bug we don't know if it was implemented by a junior developer or a senior developer. In any event, you are right that code intended for very wide use should be handled more carefully.
Re:Please, No Exponential Algorithms! (Score:4, Insightful)
I sometimes work with people who are slow in delivering code, and although "I'm spending time optimizing" has never really come up as a direct excuse, there are variations on the theme of spending time on something they shouldn't have. I think having visibility into their development process early is the key. It could be that the developer thinks the problem is more complex than it actually is. It's useful to find out early and clarify the problem statement. If there are roadblocks, it's better for the team to help these individuals early.
A lot of the time it's a question of "should we be writing this piece of code in the first place?" rather than "how much should we optimize this code?" If we decide we need the code, we try to deliver it at high quality. Even internal applications are meant to automate paper-pushing jobs away, and they will be used more as the company grows. Using a polished internal application is actually a nice morale boost. It shows that someone cares, and it encourages other employees to bring the same efficiency and attention to their work.
Re: (Score:2)
A lot of the time it's a question of "should we be writing this piece of code in the first place?" rather than "how much should we optimize this code?" If we decide we need the code, we try to deliver it at high quality. Even internal applications are meant to automate paper-pushing jobs away, and they will be used more as the company grows. Using a polished internal application is actually a nice morale boost. It shows that someone cares, and it encourages other employees to bring the same efficiency and attention to their work.
I have been developing software for the enterprise for nearly 40 years (back before we called it "the enterprise") and I can count on one hand the number of times we have been asked, required, or had the time to do ANY Big-O analysis of algorithms. In the early days we wrote a solution to the problem and then examined how it performed. For some of these solutions some pre-coding analysis would have been helpful but the demands of the data requirements in the old days were such that the only bottleneck was e
Re:Please, No Exponential Algorithms! (Score:5, Interesting)
I have seen both sides of the story about junior programmers, having served 10 years in academia and several years now in industry. The problem of failing to adapt to new problems doesn't happen only after students graduate. At school, we try to expose our students to a variety of problems. They either aren't interested or simply get overwhelmed. I think only about 10% of the students really took advantage of the education they got. The really good ones are self-motivated. The rest you simply have to spoon-feed.
Since we can't just fail 90% of the students, to make the best of the situation there are a few key messages that we try to hammer into students' heads: memory hierarchy (amortizing the storage bottleneck), queuing theory (service load modeling), and of course big-O analysis. I can comfortably say these are universal principles that remain true even as technology evolves. We didn't do critical path analysis, but that has become important since parallel and distributed computing took off; I think this is likely the change in data requirements and bottlenecks you observed.
Now in the industry, I still see limits of this mostly theoretical approach to computer science. We hired a fresh college graduate who delivered code where all unit tests pass, but the code doesn't run. It's just not doing the things real code is supposed to do; the unit tests mocked everything out. But we hired an industry veteran who also wrote code like that. I'm not sure what the problem is. Maybe some people really just shouldn't be programmers.
The really good ones design optimal algorithms on their first attempt, and their optimizations are about how to make the code understandable to others who don't know programming as well as they do. But even the best programmers fall into the temptation of writing code they shouldn't have written in the first place. :)
Re: (Score:2)
"Since we can't just fail 90% of all students"
Sure we can. I had a teacher like that - first day of class he would kick out two students and fail them just to make sure everyone got the message. As well as anyone else screwing around the rest of the year. No appeal. No explanation. I almost got the boot for helping the kid at the next desk fill in his school health form by telling him how to spell asthma. He was THAT strict.
But if you survived his class, you knew your sh*t. Bit of a shock for home room first day in high school, but abut half of us mad
Re: (Score:2)
It's pretty rare for an administration to permit that these days. In the last few years, we've crossed an important threshold where non-teaching administrators rather blatantly run higher education and teachers do not. For example, at CUNY we had a 92% faculty no-confidence vote against the watered-down "Pathways" general-education requirements from the administration, but they refused to alter the dictate. Deans can override grades. These being the people who care about money and enrollments (watch for the code word:
Re: (Score:2)
So you had a narcissistic ass as a teacher and he's so good at manipulating that you still think you know "your shit" instead of "his shit"? ;)
Considering he's the teacher, the point of education was to know "his shit." Not churn out special snowflakes who get a gold star just for showing up.
Re: (Score:2)
"Since we can't just fail 90% of all students"
Sure we can. I had a teacher like that - first day of class he would kick out two students and fail them just to make sure everyone got the message.
The first day? I doubt the truth of this story for that reason alone.
Technical drawing - Mr. Stacey. He walked into class while two kids were using their T-squares as swords. Kicked out within seconds, no appeal. He made a point of kicking out a few kids on the first day of class every year, which gave him the reputation of being a tough teacher you didn't screw around with. Even the principal didn't dare challenge him.
Re: (Score:2)
that is ridiculous and it serves no purpose
To the contrary - it made the students focus, and not expect to be rewarded if they didn't produce. Every single detail that wasn't perfect cost you marks - stuff that looked fine at first glance got you docked. The most minor flaw cost 1 point out of 10. By the end of the year, those who survived had nothing to be ashamed of.
Not like today's "ship it even if it's totally fucked up - we'll maybe patch it at some point, or if we're lazy, just remove the feature entirely and tell the customers that it's
Re: (Score:3)
Re: (Score:2)
Like the (numerous) private colleges that recently crashed and burned where most of their marketing was on how many of their students got jobs when they graduated? "110%* of our students get jobs!1!!" (*60% got jobs flipping burgers. 50% got jobs scrubbing toilets nightly. 20% got both.)
I'm kind of with the AC on this one, though I'm sure his way is just as unworkable (everyone
Re: (Score:2)
You seemed to miss the part where there are no student loans - those diploma mills' main function is milking student loans and grants. Maybe you need to go back to school and learn to read? Not at a university level - just at a late elementary grade school level (or college level in the USA).
Re: (Score:2)
And you seemed to miss the part where that doesn't matter. Take away the loans and what do you get? You save up your (or your daddy's) pennies and go to State U, Ivy U, or take one of the dozens of discount degrees that suddenly popped up to capitalize on people who can't afford either. I suppose you could go to your local Community College (whose degree/"certificate" list [hccs.edu] has a lot of options that sound like they could have been read by Sally Struthers in one of those 3AM infomercials [youtube.com]). How do you d
Re: (Score:2)
YOU missed the point - people who aren't getting "free money" do their research. They won't spend the bucks on a useless paper-mill degree, because it's costing them real money - not credit. And it costs more for those diploma mills than some regular universities that offer far superior education, and are, unlike the mills, accredited.
Those mills came into prominence because of funny-money student loans being so easy to get. They catered to the very people who would never get into a regular university in t
Re: (Score:2)
Re: (Score:2)
If universities didn't charge up front with fixed tuition costs guaranteed by government, but instead worked out agreements where students pay 10% of their income for a set number of years after leaving school, it would completely flip the education system from paper mill to the skills society needs, in the numbers the job market supports. It would also let industries, be it software or journalism, define their needs. Universities would have to focus on teaching what is needed and useful, not what is comfortable and easy for professors or simply fun for students. In such a model, good teaching programs and involved students win, while poor professors and useless degree programs fail, which is what you want. It's really interesting to me how poorly educated college graduates are today. It isn't just that they don't know things; it's that they believe things that are no longer true or only ever worked in theory. It's awful when students fail to understand that some applications differ greatly from the conceptual models they were taught.
In that case your optimisation goal is not skill, or what society needs, it's x-year (discounted) earning potential. In other words, you don't optimise actual skills, you optimise marketability. And you prefer short-lived technologies to deeper understanding. You'd also completely gut long term basic research.
A university should teach universal skills and knowledge. General problem solving, not Java Enterprise Edition 3.1415 or SharePoint 2.71. Relational models, not Oracle. Emacs and make, not Eclipse or
Re: (Score:2)
"should we be writing this piece of code in the first place?"
The safe answer is always NO. That's why I've now chosen "I don't write software for a living (any more)." Well, that, plus I can no longer "get into it" - it just seems so pointless, why bother, 99% of everything out there is useless crap anyway, but that's what people seem to want.
Re: (Score:2)
I sometimes work with people who are slow in delivering code, and although "I'm spending time optimizing" has never really come up as a direct excuse, there are variations on the theme of spending time on something they shouldn't have.
There is no clear answer to whether the time being invested in what you're referring to is gold plating or not. It's all relative to the circumstances, your organization's needs, and the level of quality you're willing to tolerate. The iron triangle comes to mind: fast, good, cheap - pick two. It still applies today, even though some idealists in this space seem to think you can have all three, although achieving that seems to be as elusive as capturing a unicorn. I don't see any evidence that would suggest
Re: (Score:2)
Right, I think it's useful for managers to have a technical background.
Although I'm not a manager, just having served as tech lead on some projects, the kinds of issues we've dealt with are more about whether we should have written the code in the first place. Using your websocket example, the developer might be working on acknowledgement of batched and out-of-order updates, allowing individual ones to be retransmitted if lost or corrupted. I would point out that websocket runs over TCP, so the transmission is already reliable and in order
Re:Please, No Exponential Algorithms! (Score:4, Interesting)
Excellent post. I see that an AC has taken what was written out of context.
The important point is to not have the hubris to think that you are such a good coder that, a) there's no point analyzing the approach because you're an experienced god and b) you understand the requirements so well that what you do will stand the test of time.
When I was at RIM, I was given the chance to look at the BES scheduler, which hadn't been touched since it was originally written, and it was discovered that its running time was somewhat worse than n^2 and could not support more than 45 million accounts. The solution to the performance issues up to that time was to buy more servers (this was when RIM had more money than God, so pouring money into servers was possible, and the marketeers loved it, since the number of servers being installed was proof of how great things were at RIM), but things started to collapse when the number of accounts reached 38 million. I left the company before the problem was addressed and a project put in place.
But, there is a lesson here regarding the necessity of spending a bit of time thinking about how your code operates and making sure that it can grow with you.
Re: (Score:2)
Windows XP? I'm pretty sure they used that all the way through Windows 7. Update checks on Windows 7 can cripple lower end machines (where lower end is something pretty normal for 2010 when Windows 7 was released, say a dual core with 2GB of ram), and starting with a fresh install of the Windows 7 original release and working through the updates can take days, almost all of which is svchost.exe pegging the CPU at 100% while nothing appears to be happening.
Microsoft seemed to have finally fixed that nonsen
Re: (Score:2)
Re: (Score:2)
After they switched to monthly patching, my computer wasn't able to update anymore as it would get stuck looking for updates at 0%.
This is probably why.
That's a feature, not a bug, you lucky dog!
Re: (Score:2)
starting with a fresh install of the Windows 7 original release and working through the updates can take days,
You can no longer use Windows Update with a fresh install of Windows 7. You have to manually install a rollup from spring 2016, when they broke backwards compatibility for no good reason.
Re: (Score:2)
These days though, software development is all about get it done fast and cheap. If it actually works, that's a bonus.
Re: (Score:2)
You clearly have never worked with an old-school Assembly programmer. Yes, there are people who will drop an O(n^3) algorithm in without even realizing what they've done, and those people need to take up an exciting career in fast food preparation. But at the other extreme, there are also people who will count every clock cycle and hand-unroll the inner loop to minimize the constant factor of a rarely-used O(log n) operation.
Avoiding both of those is optimal i
Re: (Score:2)
hand-unroll the inner loop to minimize the constant factor
Hope they're not still doing that. Unrolling loops can actually slow things down because you need to load more instructions from main memory into cache (or worse, it doesn't all fit into cache and then you really screwed yourself).
Re: (Score:2)
i remember trying to use rhythmbox to retag my mp3s. it had the functionality, and would have been ideal except that the algorithm went something like this:
foreach song in list:
    1. process song
    2. mark song as processed
    3. foreach song in list marked processed:
        a. refresh information for UI
so yeah, that's a trivial for-loop which was accidentally made O(n^2) and, yes, it was a problem in practice since the "refresh" step was not particularly cheap. i estimated that fin
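The accidental-quadratic pattern described above, and its obvious fix, can be sketched like this (refresh_ui stands in for the expensive UI update; all names are invented):

```python
# Accidental O(n^2): refresh the UI for *every* processed song on each pass.
def retag_quadratic(songs, refresh_ui):
    processed = []
    for song in songs:
        processed.append(song)          # process and mark as processed
        for done in processed:          # re-refresh everything so far
            refresh_ui(done)            # n*(n+1)/2 refreshes total

# The fix: refresh only the song that was just processed, O(n) total.
def retag_linear(songs, refresh_ui):
    for song in songs:
        refresh_ui(song)
```

With 10 songs the quadratic version performs 55 refreshes versus 10 for the linear one; at a few thousand MP3s, the difference is hours versus seconds if each refresh is expensive.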
Re: (Score:2)
On the other hand, I can remember a time when the most computationally intensive task most people's PCs ran was the screen saver with the animated fish. If a process is properly scheduled or demoted in priority, who cares?
Re: (Score:2)
Although you were putting words in my mouth, it's actually not such a bad idea, and I have to give you credit for it. There is no point repeating the same exponential-time computation on every device in the world; it's just a waste of time. One should coordinate the computation on a supercomputer, where the results could be memoized and then shared with the people who need them. The same problem only needs to be computed once.
Like the weather forecast service.
Also, dynamic programming could make the al
Re: (Score:3)
Remember: "early optimization is the root of all evil."
I always go by "make it right, then make it tight" ;-)
Re: (Score:3)
I believe the correct quote is: 'PREMATURE optimization is the root of all evil.'
Premature is similar to early, but doesn't mean the same thing. You can know early on that a piece of low-level, tight looping code needs to be optimized, in which case optimizing it is not premature.
Re: (Score:2)
I'm not as worried about the low-level, tight looping code. Sure, it probably needs to be optimized (don't do stupid things). But with the various bottlenecks that can exist, knowing where the worst case is ahead of time seems unlikely. Heck, even knowing you're CPU-bound ahead of time seems unlikely for many applications (although if you're in the cloud, CPU cycles are money, so....)
But the real place to optimize early on is the architecture. Those changes are expensive to make. Rewriting a loop is usually not.
Re: (Score:2)
Follow the first three rules of optimization:
1 - Don't do it
2 - Don't do it yet (only for experts)
3 - Profile, then optimize. I never see people follow this order.
http://wiki.c2.com/?RulesOfOpt... [c2.com]
In any case, readability is typically much more desirable than faster execution times. Only in a few cases will you want to sacrifice even the smallest bit of readability for better execution times. At least if you are not John Carmack.
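The "profile, then optimize" rule above can be sketched with the standard-library profiler; the workload here is made up, but the workflow (measure first, then decide what to rewrite) is the point:

```python
# Minimal "profile, then optimize" sketch using cProfile from the
# standard library. The functions are invented stand-ins.
import cProfile
import io
import pstats

def slow_part(n):
    return sum(i * i for i in range(n))   # the actual hot spot

def fast_part(n):
    return n - 1                          # looks suspicious, costs nothing

def workload():
    for _ in range(50):
        slow_part(10_000)
    for _ in range(50):
        fast_part(10_000)

profiler = cProfile.Profile()
profiler.enable()
workload()
profiler.disable()

stream = io.StringIO()
pstats.Stats(profiler, stream=stream).sort_stats("cumulative").print_stats(5)
print(stream.getvalue())   # shows where the time actually went
```

The report makes it obvious that slow_part dominates; gut feelings about which function "looks" slow are frequently wrong, which is exactly why rule 3 exists.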
Re: (Score:2)
Considering Big-O is different from doing something about it. In my experience, for most applications, you are far more likely to be bound by network speed, local I/O, or something else entirely out of your control. That said, it is OK to look at something and say, "gee, if I take path A, it will be O(n^3), but if I take path B it will be O(n log n), so I should go with B since it will only add 10% to my implementation time and will prevent it from becoming a bottleneck later on."
I'm glad at least some developers understand this. I'm on the infrastructure side of things, and we constantly deal with devs who don't take that into account. Yes, your code is technically O(n), but when you're doing something stupid like querying a DB for 200k record IDs, then going back and making 200k separate, tiny queries for each row individually, your real cost is more like n x (network RTT + DB query time), which is three to four orders of magnitude slower than your code by itself. And th
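The "200k tiny queries" anti-pattern and the batched alternative can be sketched with sqlite3 from the standard library (table and column names are invented; with a real networked DB, each per-row query would also pay a round trip):

```python
# Sketch of per-row queries vs. one batched query. Illustrative only.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE records (id INTEGER PRIMARY KEY, payload TEXT)")
conn.executemany("INSERT INTO records VALUES (?, ?)",
                 [(i, f"row-{i}") for i in range(500)])

ids = [row[0] for row in conn.execute("SELECT id FROM records")]

# Anti-pattern: one query per id -- n round trips, each paying latency.
rows_slow = [conn.execute("SELECT payload FROM records WHERE id = ?",
                          (i,)).fetchone()[0] for i in ids]

# Better: fetch everything in one query. (Kept to 500 placeholders here,
# since SQLite caps bound variables; very large id lists would need
# chunking or a temp table.)
placeholders = ",".join("?" * len(ids))
rows_fast = [r[0] for r in conn.execute(
    f"SELECT payload FROM records WHERE id IN ({placeholders})", ids)]
```

Both return the same rows; only the number of round trips differs, and over a network that factor dominates everything the Big-O of the surrounding code says.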
Re: (Score:2)
The big evil from premature optimization comes from attempts to optimize the code that don't impact the computational complexity.
My usual path to writing non-trivial code is:
1. Find the lowest-complexity method of implementing the problem that is still readable.
2. Write the lowest-complexity method in the most readable way I can, even if it adds operations that may lower performance but don't increase the complexity (e.g. extra copies of variables, redoing simple computations).
3. Re-evaluate the above if
Re: (Score:2)
This.
I (no longer) write test code for device and platform firmware.
My focus was power management and security of a particular platform subsystem people here love to get up in arms about.
I didn't care (within very broad limits) how slow the code was for about 95% of my test cases because they were only going to be run once per iteration of the firmware (if that, could be once per dev generation: alpha, beta, etc.). I much more cared that the code was highly adaptable to changes in the firmware it was testi
Re: (Score:2)
I'll admit that I probably do worry too much about how something will perform if the dataset gets to be large. Reading articles like What every programmer should know about memory [lwn.net] does not help; it is easy to become seduced by the idea of trying to keep everything fast, when most of the time most of the code is quick enough, and the rare times when it is a bit slow are probably not worth the effort and complexity of dealing with. Identifying the 3% can be really hard.
Almost Pointless (Score:3)
In the amount of time it takes to construct an O / Theta representation of your function or code, you can usually have finished the code and gotten into the testing phase. Unless you're working with something like a DSP, or you have very tight requirements on performance and resources, it's much more effective and worthwhile to develop your code than to compute its O / Theta complexity.
Not even a little pointless (Score:5, Insightful)
In the amount of time it takes to construct an O / Theta representation of your function or code
Come on. To "consider" big O you don't need to build an exact O representation.
It means taking just a moment to say, "I'm building an array of these things; will linear access speed matter here, or should I have a dictionary to access items directly?"
Or to simply think about ways a query could spiral into being very slow just by the order of comparisons.
Or even to consider the complexity of the user completing some task they are supposed to be able to do with your application - even users can be subject to complexity, having to go in and out of screens to get something done they could do in batch on a single screen is a consideration...
None of these things have to take much time at all. Simply being aware of the monstrous effect that increased complexity can have, and the general abilities of most collection classes in whatever system you are working with... that alone with a few moments of consideration as you build out code can make a huge difference. It's so easy to slip some seemingly harmless yet very slow code into a system because you just are not thinking about the ramifications...
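The "array vs. dictionary" moment of consideration above looks something like this in practice (record shape and names invented for illustration):

```python
# Looking up records by id: linear scan vs. keyed dictionary.
users = [{"id": i, "name": f"user{i}"} for i in range(1_000)]

def find_by_scan(records, uid):
    """O(n) per lookup: walks the list until it finds a match."""
    for rec in records:
        if rec["id"] == uid:
            return rec
    return None

# O(n) once to build, then O(1) average per lookup.
users_by_id = {rec["id"]: rec for rec in users}

def find_by_key(uid):
    return users_by_id.get(uid)
```

One lookup either way is harmless; a lookup inside a loop over another big collection is where the scan quietly turns into accidental O(n^2).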
Re: Not even a little pointless (Score:2)
Re: (Score:2)
CPU-related performance for most code these days just comes down to "don't be silly". The stuff any senior engineer looks out for without really thinking about it. Having spent the last year on a performance optimization project: it's (almost) never CPU-bound. The problems I've seen recently in that area are "accidental N^2" on large data sets, where developers who knew better got stung by Java library classes that obscured a bit too much. E.g., set union or difference operations, except done on lists i
Re: (Score:2)
I do mobile development as a consultant, so I'm looking at a lot of code from different people all the time, some of it now rather old...
I agree that the biggest problem at the moment is considerations around large data sets; that is also true in mobile development. That said, I still see, from time to time, poorly used frameworks that introduce quite large slowdowns when assembling data for a screen, because people didn't stop to think about code being called repeatedly or just simply how much work
Re: (Score:2)
While writing out the tedious models and actually proving the run time is a ridiculous waste of time in many circumstances, most good developers (with an actual CS background) can come up with a Big O, Theta, and Omega with only a little bit of analysis on the algorithm in question. I personally consider it at all times when writing and modifying code, but I do the complexity in my head for a good ballpark more than anything. Is it always necessary? Probably not always, but especially given the high avail
Re: (Score:2)
In the amount of time it takes to construct an O / Theta representation of your function or code, you can usually have finished the code and gotten into the testing phase.
No lol. Unless you're doing some complex recursion, big-O is easy to calculate.
Re: (Score:2)
well gosh gollywumpers, I use a library for such things written by smart people so I don't have to worry
Re: (Score:2)
Re: (Score:2)
well gosh gollywumpers, I use a library for such things written by smart people so I don't have to worry
Can't tell if serious, but this has been an increasing concern. I've recently seen several cases of
thingsToProcess.removeAll(thingsToDefer);
Where thingsToDefer was a set, but oops thingsToProcess was a list, and suddenly we're CPU-bound. It's easy enough to overlook that sort of mistake in coding and code review, where it never would have been checked in if it were more explicit than nifty polymorphic library code.
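A Python analog of the same trap (variable names mirror the Java snippet above but are otherwise invented): bulk-removing from a list one element at a time is O(n*m), while a single pass with a set membership test stays O(n).

```python
# Bulk removal: the quadratic-ish way vs. the linear way.
things_to_process = list(range(2_000))
things_to_defer = set(range(0, 2_000, 2))   # defer the even ids

def remove_all_slow(items, to_remove):
    """O(n*m): each remove() scans and shifts the list."""
    items = list(items)
    for x in to_remove:
        if x in items:         # O(n) scan
            items.remove(x)    # another O(n) scan plus element shifting
    return items

def remove_all_fast(items, to_remove):
    """O(n): one pass, one O(1) set lookup per element."""
    return [x for x in items if x not in to_remove]
```

As in the Java case, nothing about the call site hints at the cost; the structure of the collections decides it, which is why this kind of mistake slips through code review.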
Re: (Score:2)
About the only time I consider big O is whether list.size() runs in O(1) or O(n). Been bitten by that too many times in the past.
Re: (Score:2)
I transcended algorithmic complexity? (Score:2, Insightful)
I stopped writing code many years ago, but in those days we didn't have such helpful profilers to figure out what was going on. I rarely did explicit analysis of the complexity, but I often did experiments to look for faster approaches. Since most of my programming involved databases, just watching the disk activity could reveal memory thrashing in many cases. Anything that eliminated lots of potential answers fast was a big win...
Later on I was supporting researchers, and they were often evaluating the com
Re: (Score:2)
Thanks for the favorable mod, whoever it was. Still disappointed by the lack of "funny" comments, but maybe the topic was unsuitable for such?
Optimization -- only if necessary (Score:2)
I just finished writing a group of ETL scripts in Perl. The longest-running script took eight hours to run, and had been optimized a great deal, with caching and other tricks. At the other end of the scale was a more recent script that ran in six seconds -- so I didn't waste any time trying to speed that one up.
In any case, these are throw-away scripts, since our product was the documentation about how the transformations were achieved, and once our output files matched the standard files, no further work w
"consider" (Score:4)
I consider big-o every time I write code but after decades in the field I only stop to evaluate the big-o if my gut warns me of a problem. I rarely have to resort to a profiler, and when I do it usually finds a problem in an underlying library that I must either replace or work around.
Re: (Score:2)
Yes, I meant it seriously. If you write efficient code all the time, you don't have to resort to a profiler to warn you when your sloppy code is too slovenly.
Pick your data structures (Score:2)
At the very least you should be considering what kinds of operations you are going to be using on your data structures. A huge number of performance problems start with people not giving a shit what the cost of an add/delete/find/whatever is on a given data structure.
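As a quick sketch of that point: consuming a work queue from the front is the classic case where the cost of one operation on the structure decides everything. In Python, list.pop(0) shifts every remaining element (O(n) per pop), while collections.deque.popleft() is O(1); the example below is illustrative only.

```python
# Draining a queue from the front: wrong structure vs. right structure.
from collections import deque

def drain_list(items):
    """list.pop(0) is O(n) each -> O(n^2) to drain the whole queue."""
    items = list(items)
    out = []
    while items:
        out.append(items.pop(0))
    return out

def drain_deque(items):
    """deque.popleft() is O(1) each -> O(n) to drain the whole queue."""
    q = deque(items)
    out = []
    while q:
        out.append(q.popleft())
    return out
```

Same results, same code shape; the only difference is asking "what does a front-removal cost on this structure?" before picking it.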
embedded dev (Score:2)
Some of the first production code I wrote was a series of embedded engines - mainly for hand-held devices. I had to learn all the tricks like embedded ASM, compact lookup tables, manual memory swapping, even shaving and customizing the C runtime. I got used to making sure there were only so many operations in my main loop so the refresh interrupt wouldn't cut anything off. Now I work on much higher-level code, but I still make it a point to try and be as efficient as possible and still use many of my old tr
Re: (Score:3)
Why someone would need assembly for web client-side coding befuddles me.
When I need to write something fast I use Perl. When I need something to run fast, I use C.
Different languages are good for different situations.
Re: (Score:2)
1. I'm saying there's no proper language available within the browser.
2. With WebAssembly you can run *compiled* C code or a Perl interpreter on a semi-native [in a very meta sort of way] VM within the browser/a web page.
Coupled with things like web sockets there's a lot of opportunity to do some very low level stuff inside a web page. Sort of like what Java for the web and flash were supposed to do - but never got it right partially due to lack of proper open standardization and partially due to forcing a
Re: (Score:2)
And they're still slow as shit. It's not the language - it's the whole "web" thing, as well as GUIs in general. 8 gigahertz of cpu among 4 cores, and 8 gigabytes of ram to do what a 286 used to be able to do locally without working up a sweat with 20mhz of cpu, 2 meg of ram, a pair of 85 meg hard drives with 8k of cache, and 64k of video ram.
Everything was optimized because it HAD to be, so even meager hardware upgrades had an immediate impact. The opposite of today, where every hardware advance is swallowed
Re: (Score:2)
Why there isn't a proper alternative [WebAssembly?] absolutely befuddles me.
Because the web didn't have enough security vulnerabilities?
Re: (Score:2)
You know, I can't actually argue with that point. I would have at least hoped that in the last 20 years someone would have come up with something. The fact that even the alternative solutions are just syntactic-sugar transpilers that clumsily try to figure out how to maintain a meaningful "this" context, or shoehorn in classes and objects that work like actual classes and objects, is just sad. And don't give me "Oh, but ES6..." - I just rewrote a mid-sized CoffeeScript project and let me tell you, ES6 fat arrows still
Depends on what you're doing (Score:2)
For me, I can probably count on one hand the number of times this has been a consideration in any kind of code I've written. Then again, almost all the apps I've done are user-driven, where 99.9% of the time the app is idling, waiting on the user to do something. Then they initiate some action which takes a millisecond to execute, and it's back to idling.
Big-O is only a small part of performance (Score:5, Interesting)
While I have a pretty good gut feel for the performance characteristics of my code these days, I know from experience that Big-O and company are only part of the picture. Often, choosing an algorithm in one place influences the performance of algorithms in other parts of the code. Across the various data structures, insertion, iteration, and indexing performance, memory use, etc. all get traded off.
A lot of times what profilers point out can be surprising. I remember, a couple decades ago, finding a bottleneck that was a simple function compiling down to a single 1-clock-cycle inclusive-or instruction, yet it accounted for more of the run time after user inputs than the rest of the code. Yes, one clock cycle per call! The real problem turned out to be that an O(n log n) loop calling the inlined function had too large an n. I ended up switching to an O(n^2) algorithm that was much faster because I was able to use a vastly smaller n. Even when some algorithm has a better Big-O form, the size of the data and the constants involved can make another algorithm much better in practice, even if it looks worse purely by Big-O.
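The arithmetic behind that anecdote is easy to sketch. This is purely illustrative (the numbers are hypothetical, constants are dropped, and the functions just count abstract "work"), but it shows how an O(n^2) algorithm on a much smaller n can do less total work than an O(n log n) one on a big n:

```python
import math

# Illustrative only: if restructuring the problem shrinks n enough,
# a "worse" O(n^2) algorithm does less total work than a "better"
# O(n log n) one. Numbers below are hypothetical.
def work_nlogn(n):
    return n * math.log2(n)

def work_quadratic(n):
    return n * n

big_n = 1_000_000    # the n the O(n log n) version was stuck with
small_n = 1_000      # the much smaller n the O(n^2) rewrite allowed

# ~20 million "operations" for n log n vs 1 million for n^2
assert work_quadratic(small_n) < work_nlogn(big_n)
```

The asymptotic form only dominates once n is large enough for it to matter; the crossover point is exactly what Big-O notation throws away.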
Re: (Score:2)
I know from experience Big-O and company are only part of the picture. Often choosing an algorithm in one place influences the performance of algorithms in another parts of the code.
But sometimes it's a waste of time. You have to consider the value of what you're doing. If you spend copious amounts of time using some modern language's functional programming syntax so that a function takes microseconds vs. milliseconds to evaluate, have you really done anything useful? I suppose if you were working for Netflix, trying to scale an application to a gagillion users where every CPU cycle counted, this might be useful. But in other applications, it's not. If you spend all your time optimizing that for loop while some critical business algorithm remains unoptimized and brittle, you haven't really done a good job. This is why I'm saying formal evaluation of these things is useless: the weighting of what is important can't be taught in any kind of formal education, because it's all contextual.
Re: (Score:2)
I know from experience Big-O and company are only part of the picture.
Completely agreed. For the software that I write, disk I/O is a much bigger bottleneck than algorithmic complexity, even with fast SSDs. Oftentimes performance optimization for me is all about intelligently deciding what to keep in RAM; the actual algorithms themselves are not a very big deal.
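The "decide what to keep in RAM" idea can be as simple as a cache in front of the slow read. A hedged Python sketch (the record format and the `load_record` name are made up; the real slow path would be a disk or database read):

```python
from functools import lru_cache

reads = {"count": 0}  # track how often we actually hit "disk"

@lru_cache(maxsize=1024)
def load_record(key):
    # Stand-in for a slow disk/database read (hypothetical).
    reads["count"] += 1
    return f"record-{key}"

# Repeated requests for the same key hit RAM, not disk.
for _ in range(100):
    load_record(7)

assert reads["count"] == 1  # one real read despite 100 calls
```

The algorithmic complexity of the lookup never changed; the win comes entirely from keeping hot data in memory.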
Re: (Score:2)
Re: Big-O is only a small part of performance (Score:2)
Then that's not O(n log n) vs O(n^2), that's O(n log n) vs O(m^2), where n >> m.
I'd hate to maintain your code if all your variables are named n.
Re: (Score:2)
Yes, for the calling code the profiler found the O(n log n) was actually closer to O((n+m) log (n+p)) and the O(n^2) was more like O((n+q) * (n+r)), where m was approximately a very large multiple of n; m and p were approximately equal; q and r were also approximately equal, and q was approximately a small multiple of n. It was actually a bit more complicated than that, but compared to the main loops most of the rest of the algorithm was a constant. And for the record, the function that highlighted as the CPU hog was
Re: (Score:2)
I know from experience Big-O and company are only part of the picture.
Sedgewick actually argues in Analysis of Algorithms that they may be crap for useful performance analysis, and argues for the use of analytic combinatorics to obtain more detailed estimates instead. So the whole premise of this poll may be wrong.
Re: (Score:2)
Yes there are domains where performance is likely to be more of an issue; though not every program targets generic devices.
I work on tools (compilers, profilers, and the like) and OSes for real-time systems, some of which had way less power and memory than modern smart phones. O(n) is awesome when you can get it, but it's not required or possible everywhere. It's very easy to have one apparently O(n) algorithm call another O(n) algorithm, even indirectly, ending up with O(n^2). In any large system, complexity
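A hedged sketch of how that accidental O(n^2) happens (function names are made up for illustration): an innocent-looking O(n) loop calls an O(n) helper and quietly becomes quadratic, while the same result is available in O(n) with a different structure.

```python
# Each piece looks O(n) in isolation; combined, the loop is O(n^2).
def contains(items, x):          # O(n) linear scan
    return x in items

def unique(items):               # O(n) loop * O(n) helper = O(n^2)
    seen = []
    for x in items:
        if not contains(seen, x):   # hidden scan on every iteration
            seen.append(x)
    return seen

def unique_fast(items):          # same result in O(n) using a set
    seen = set()
    out = []
    for x in items:
        if x not in seen:
            seen.add(x)
            out.append(x)
    return out

assert unique([3, 1, 3, 2, 1]) == unique_fast([3, 1, 3, 2, 1]) == [3, 1, 2]
```

Nothing in `unique` looks expensive at the call site, which is exactly why this pattern slips through review in large systems.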
Emergent Complexity (Score:3)
Emergent Complexity is a real problem in software engineering. Many of the software developers I work with don't even consider complexity when writing code. They are hackers. They hack away to make complex algorithms work and when they find the first solution that satisfies the unit tests, they ship it. To be fair, usually they are under resource and time constraints that are unreasonable for the complexity of the problems at hand so a lot of brittle algorithms get shipped that accrue technical debt at astounding rates.
I do consider complexity when I write any algorithm of sufficient complexity. I can't really tell you how I identify said algorithms because it comes more from experience not formal evaluation learned in academics. When you get burned enough times by brittle code and technical debt you start to see patterns in problems and solutions that enable you to have an intuition about this sort of thing.
When I do identify a risky candidate that could benefit from simplification or reduction of a solution, I do not find that any academic method of evaluating complexity is particularly useful. Even measuring cyclomatic complexity with static code analysis has limited usefulness. It's better than nothing, I suppose. The level of complexity a company can tolerate is directly related to the amount of resources and time available to deal with it. In a very fast-moving organization that wants to deliver new features like hotcakes, you almost always have to write "better" code that doesn't require as much maintenance and is extendable in a way that won't break a gazillion other things. If you are limited on resources like maintenance developers, because you sell your product in a very competitive market space, you have even more constraints.
We truly lack sophisticated methods to deal with these types of problems. In that sense, software engineering is still quite immature. However, much progress has been made in the past 30 years. We ship more working software on time and within budget than we used to.
big big o (Score:2)
Re: (Score:2)
Discovered the Big O anime series a few years ago on Youtube. Somebody described it as "Batman the animated series" meets "Gundam"
https://www.youtube.com/watch?... [youtube.com]
https://www.youtube.com/watch?... [youtube.com]
The correct answer isn't listed (Score:2)
I rotate my tires (Score:2)
I rotate my tires every 10k miles.
Not sure what this has to do with writing code.
Re: (Score:2)
I rotate my tires every 10k miles. Not sure what this has to do with writing code.
Tires are like big Os.
Oh FFS (Score:2)
When was the last time you went and performed static and/or dynamic code analysis and performance profiling to fully optimize the toUpper function of the String class that you wrote, in order to squeeze 3 more clock cycles out of it over the version provided with your SDK?
But that seems to be the question most are reacting to. This was not a question about premature optimization; we all (hopefully) understand the problems with that. But if you don
Re: (Score:2)
A professional CONSIDERS computational complexity EVERY time they write a line of code.
No, they don't. A professional knows when the computational complexity doesn't matter so they don't bother considering it.
A professional also knows that big-O doesn't tell the whole story. Bubble sort is probably faster than a fancy n log n algorithm unless you're sorting thousands of items.
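This is in fact what real library sorts do: Timsort and introsort fall back to an O(n^2) insertion sort below a small cutoff, where its low overhead wins. A hedged sketch (the cutoff of 16 is illustrative, not any library's actual value):

```python
# Hybrid merge sort: switch to insertion sort below a small cutoff,
# the trick real library sorts (Timsort, introsort) use.
CUTOFF = 16  # illustrative; real libraries tune this empirically

def insertion_sort(a):           # O(n^2), but tiny constant factor
    for i in range(1, len(a)):
        x = a[i]
        j = i - 1
        while j >= 0 and a[j] > x:
            a[j + 1] = a[j]
            j -= 1
        a[j + 1] = x
    return a

def hybrid_sort(a):              # O(n log n) overall
    if len(a) <= CUTOFF:
        return insertion_sort(a)     # cheap path for small n
    mid = len(a) // 2
    left = hybrid_sort(a[:mid])
    right = hybrid_sort(a[mid:])
    out, i, j = [], 0, 0             # standard merge step
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            out.append(left[i]); i += 1
        else:
            out.append(right[j]); j += 1
    return out + left[i:] + right[j:]

assert hybrid_sort([5, 2, 9, 1]) == [1, 2, 5, 9]
```

The asymptotically "worse" algorithm handles every small subproblem; the "better" one only coordinates.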
Only if it seems slow for the intended use-case. (Score:2)
No longer code (Score:2)
If you don't consider complexity, your "quick and dirty" solution may not be as good as you think.
yes, and 1000000n and 5n are not the same (Score:2)
I am a sysadmin who also handles software deployment. My job is to look after network traffic, software deployments, tweaks, and scripts. Scale and breadth of impact are always a concern. Furthermore, unlike O-notation, I do care about the multiplier. In the real world, 1000000n and 5n are not the same to me!
If you move some log files for all your users to a network instead of a local drive, your network traffic might be ok, but how will disk access cope? If you write a script to move the old
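The multiplier point is easy to put in numbers. A hedged sketch (the costs are invented; think "network round-trip per item" vs "local in-memory operation per item"): both functions are O(n), yet one budget is 200,000 times the other.

```python
# Both workloads are O(n); only the constant factor differs,
# and at sysadmin scale that constant is everything.
COST_A = 1_000_000   # hypothetical: e.g. network round-trip per item
COST_B = 5           # hypothetical: e.g. local memory op per item

def total_cost(per_item, n):
    return per_item * n

n = 10_000
assert total_cost(COST_A, n) == 10_000_000_000
assert total_cost(COST_B, n) == 50_000
# Identical Big-O; wildly different real-world budgets.
```

Big-O deliberately discards exactly the factor that separates these two lines.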
After bug reports (Score:2)
Database Optimization Effort (Score:2)
other : only at the lowest level (Score:2)
I only consider it at the method level. The legacy Java app I support has loads of spaghetti in it; GOTO is not the only way to write spaghetti. I will shoot everybody that says methods should be small.
I write "apps" mostly (Score:2)
The trick is knowing which platform operations take long.
Newbies never consider it. They even add libraries so their code looks small, but they pretty much never seem to consider what code their function(a) call causes to run that they didn't write.
It pretty much doesn't matter how tight I make my loops anymore or what calculations I use in my logic, except when sorting large lists or what have you. It's more about what kind of layout work scrolling and such causes "invisibly" most of the time, how the layout i
Re: (Score:2)
Re: (Score:2)
This.
An anecdote: I used to support an AI/natural language processing app that took English language systems specifications and automatically generated code. That was about 20 years ago. It worked. Quite well in fact. But seeing as how the app was written by a bunch of electrical and mechanical engineers, our CS group had a fit. The task was too complex, too computationally intensive. Terms like O and NP-hard were thrown around to scare management into taking the task away from engineering and giving it to
Re: (Score:2)
Re: (Score:2)
Sometimes I have solved calculus problems in closed form, that is, with no loops. A no-O solution is better than a big-O one.
That still has a Big-O, it's O(1), usually pronounced "Order one."
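The classic example of trading a loop for a closed form is Gauss's formula for the sum 1 + 2 + ... + n: the loop is O(n), the formula is O(1). A small sketch for comparison:

```python
# Same answer, two costs: O(n) loop vs O(1) closed form.
def sum_loop(n):                 # O(n): n additions
    total = 0
    for i in range(1, n + 1):
        total += i
    return total

def sum_closed_form(n):          # O(1): Gauss's formula n(n+1)/2
    return n * (n + 1) // 2

assert sum_loop(100) == sum_closed_form(100) == 5050
```

"Order one" here just means the work doesn't grow with n at all.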
Re: (Score:2)
The performance is rarely a concern.
Performance for me is a concern, but it's usually not a fault of algorithmic complexity. Usually when I find a performance problem, it's where someone is retrieving data from MySQL, or doing some other operation, one row at a time for 100 records instead of doing it all at once. In a case like MySQL, 100 half-second queries ends up being 50 seconds, versus maybe 3 seconds to do it all at once.
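The batching arithmetic can be sketched directly. This is a hedged model, not real database code: the half-second round-trip and the per-row cost are the assumed numbers from the comment above, and the point is that per-query overhead, not complexity, dominates.

```python
# Model: N single-row queries vs one batched query.
# Latencies below are assumptions, echoing the comment's numbers.
PER_QUERY_LATENCY = 0.5      # seconds of round-trip overhead (assumed)
ROW_COST = 0.0001            # per-row processing cost (assumed)

def one_at_a_time(n_rows):
    # e.g. "SELECT ... WHERE id = ?" issued n_rows times
    return n_rows * (PER_QUERY_LATENCY + ROW_COST)

def batched(n_rows):
    # e.g. "SELECT ... WHERE id IN (...)" issued once
    return PER_QUERY_LATENCY + n_rows * ROW_COST

assert one_at_a_time(100) > 50   # ~50 s, dominated by round-trips
assert batched(100) < 1          # well under a second
```

Both versions touch the same 100 rows; the fix is structural (fewer round-trips), not asymptotic.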