The Insidious Creep of Latency Hell 297
Twinbee writes "Gamers often find 'input lag' annoying, but over the years, delay has crept into many other gadgets with equally painful results. Something as simple as mobile communication or changing TV channels can suffer. Software too is far from innocent (Java or Visual Studio 2010 anyone?), and even the desktop itself is riddled with 'invisible' latencies which can frustrate users (take the new Launcher bar in Ubuntu 11 for example). More worryingly, Bufferbloat is a problem that plagues the internet, but has only recently hit the news. Half of the problem is that it's often difficult to pin down unless you look out for it. As Mick West pointed out: 'Players, and sometimes even designers, cannot always put into words what they feel is wrong with a particular game's controls ... Or they might not be able to tell you anything, and simply say the game sucked, without really understanding why it sucked.'"
I noticed this (Score:2)
I thought things were supposed to get faster with newer technology, but it does indeed look like the newer devices run slower because of bloated apps and firmware. Maybe this has to do with programming being "bestshored" =D nowadays.
Re: (Score:3)
It's fine to write lean, highly performant code in a small application with a single developer. Once you have teams of people working on something, then you have to worry about something called "maintainability". According to various studies, 80% of a piece of software's total lifetime cost is maintenance and support. Nobody wants that to become 90 or 95 or even 99% because people wrote code that others couldn't understand very easily.
Standardized frameworks, patterns, and components improve code
Re:I noticed this (Score:5, Insightful)
I'm not buying those excuses.
Why is it that Microsoft Word 97 fits on my 8-megabyte 386 laptop, has 99% of the same functions as modern Word, and is quick and responsive? Why can't they bring that level of efficiency to today's Word 2010?
Because they aren't trying.
Because they don't care.
Because it's easier for management to tell users, "Go buy a new computer with 8x more RAM," than to pay programmers to make the code more efficient/responsive.
Re: (Score:3)
Introduction to Algorithms, data structures and advanced algorithms, and the math behind them. Unfortunately, the barrier to entry is so low in today's environment (hey look, I can code --- that is enough), and that, coupled with the quest to recruit brute-force programmers and not thinkers, has led to this mess toda
Re: (Score:2)
If Word 5.1a ran ok under OS X, yeah, I wouldn't have shelled out my $20 for Office 2011 last month.
Hell, it might be faster to launch a Mac OS VM with Word 5.1a on it than to launch Word 14/2011.
Re:I noticed this (Score:4, Insightful)
to
It all of a sudden holds more merit
:)
While I'm not going to hunt down statistics to prove my point, I think it's safe to say there's absolutely some truth to lazy programming becoming the norm rather than the exception in certain areas.
Do you really think some ratio of perceived/available functionality over ram usage has remained constant or significantly decreased?
Don't you hesitate when you're about to download an application for a simple task, but the first one you find is, inexplicably, 100 megabytes compressed, whereas the 2nd one you find (and download) is 10 or 15?
I know I do
Re: (Score:3)
Actually, it's closer to 50%. Which means that Office 2010 contains around twice as many features, and those features are almost certainly more complex than the first 50%. Consider a feature like Live Preview. This single feature touches most of the other features, and likely requires significantly more overhead than a version of Office that did not have live preview.
As computers get more capable, the software that uses them is written to do more.
Re:I noticed this (Score:5, Interesting)
The only personal experience I have with this is that, among many other things I do professionally, usually at any organization I'm at (as a consultant or otherwise), I'm embarrassingly the "Excel guru".
Using Excel a moderate amount, I do try to use VBA pretty sparingly given its obvious slowness compared to other methods of calculation, but on occasion it's markedly more efficient than other designs (e.g. testing for a series of involved conditions which would otherwise process extremely slowly using formulas alone).
I've found that even on modern hardware, Excel 2007 VBA execution, of identical code, is much slower than Excel 2003.
For that reason, along with the sheer inefficiency of the ribbon design in terms of responsiveness and usage of screen real estate, I keep all Office 2007 usage relegated to a VM which I rarely even need.
To further clarify the issue, I have developed a personal library of macros I use in simply navigating spreadsheets efficiently. Even on modern hardware, Excel 2007 cannot keep up with my usage of these macros and throws errors repeatedly whereas I never see such errors in Excel 2003. Keep in mind, this is on modern hardware.
My interpretation of this is that speed and efficiency were low priorities for the Office development team. Given that their interface redesign was, I believe admittedly, largely geared towards novice users, these alleged low priorities make more sense.
To appeal to your sense of empiricism, which I appreciate, please see (perhaps not of the greatest quality...)
http://www.wilmott.com/messageview.cfm?catid=10&threadid=81967 [wilmott.com]
http://www.excelbanter.com/showthread.php?t=137875 [excelbanter.com]
http://www.ozgrid.com/forum/showthread.php?t=78673&page=1 [ozgrid.com]
for what it's worth, I always disable screen refresh and calculation during a macro (except in rare circumstances when that behavior is necessary)
Re:I noticed this (Score:5, Insightful)
Re: (Score:2)
Firefox's model enables this [mozilla.org], and I find it a fine reason to keep it that way.
Changing TV channels (Score:5, Interesting)
And here I thought I was the only one complaining that changing channels gets slower and slower with every new receiver box.
On analog it was basically instant, less than 100ms.
First digital box took half a second. Full HD box sometimes takes a whole second or more (and it's not even deterministic anymore)
That SUCKS big time!
Re:Changing TV channels (Score:5, Interesting)
That's down to compressed stream buffering. An analog box could be instant because every frame was transmitted uncompressed. With digital TV, you have to wait for a keyframe at least.
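The parent's point can be put in numbers with a quick back-of-the-envelope model (the GOP length and frame rate below are illustrative assumptions, not figures from any real broadcast):

```python
# A channel change lands at a uniformly random point inside a group of
# pictures (GOP), so on average the decoder waits half a GOP for the next
# keyframe before it can display anything at all.

def keyframe_wait_ms(gop_frames, fps):
    gop_ms = gop_frames / fps * 1000.0
    return gop_ms / 2.0, gop_ms        # (average wait, worst-case wait)

avg, worst = keyframe_wait_ms(gop_frames=30, fps=30)   # a 1-second GOP
print(f"avg {avg:.0f} ms, worst {worst:.0f} ms")       # avg 500 ms, worst 1000 ms
```

A one-second GOP alone already explains a half-second average delay, before demodulation, demultiplexing, or decryption costs are added on top.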
Re: (Score:3)
Re: (Score:2)
It might if it's getting the channel's feed in the background. It is annoying though. They really ought to consider a "just the listings, I don't want to see the channel at the same time" option. That would be a TON faster.
-l
Re: (Score:2)
That's due to how digital cable works. I'll speak of DVB since that's what I know (I don't know what the USA is using, but surely it will be similar). DVB sends a big stream composed of several smaller streams; some of those are video/audio streams, some are channel information (the guide, the stream IDs (audio/video/cc) of the channel, etc.), others are info on the stream itself (carrier frequencies) or general information (time).
For the video stream, as the parent poster said, you'll have to wait to get a key
Re: (Score:2)
TV Guide is instant on over-the-air television. Mainly because the data is immediately available from the box's memory. Anything past 3 hours takes a little longer to fill-in, since those are only sent every 5 seconds. Past 1 day, the data is only sent every 30 seconds.
As for channel changing, I've noticed some boxes are slow as snails, while others are instant and display the picture almost as fast as analog. It all depends on the manufacturer.
Re: (Score:2)
This does not explain the > 500ms delay when trying to use the guide.
That's probably due to 'keyframe' compression. The way that works is you start with frame 0, which is the entire picture. The computer looks at frame 1, compares it to frame 0, and determines the difference. Frame 1 is now a partial frame that is overlaid on top of frame 0. Then frame 2 repeats the process; you end up with a partial frame which looks a lot like an old cel of animation missing its background, yadda yadda yadda. Then, when you reach a certain threshold (like 96 frames, maybe?) a
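The keyframe-plus-deltas scheme described above can be sketched in a few lines (toy frames as flat pixel lists, not real video coding):

```python
# Toy illustration of keyframe + delta encoding. Every `interval`-th frame
# is stored whole (a keyframe); the rest store only the (index, new_value)
# pairs that changed relative to the previous frame.

def encode(frames, interval=4):
    stream = []
    for i, frame in enumerate(frames):
        if i % interval == 0:
            stream.append(("key", list(frame)))          # full picture
        else:
            prev = frames[i - 1]
            deltas = [(j, v) for j, (p, v) in enumerate(zip(prev, frame)) if p != v]
            stream.append(("delta", deltas))             # changes only
    return stream

def decode(stream):
    frames, current = [], None
    for kind, payload in stream:
        if kind == "key":
            current = list(payload)      # decoding can start here
        else:
            for j, v in payload:         # meaningless without the keyframe
                current[j] = v
        frames.append(list(current))
    return frames

frames = [[0, 0, 0], [0, 1, 0], [0, 1, 2], [3, 1, 2], [3, 3, 3]]
assert decode(encode(frames)) == frames
```

A decoder that tunes in at a "delta" entry has nothing to apply the deltas to, which is exactly why the box must sit and wait for the next "key" entry.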
Re:Changing TV channels (Score:5, Interesting)
It is the encryption technique utilized that introduces this lag. There is a key change roughly every second, so the box must wait until the next key to decode the stream.
Re: (Score:2)
i should have posted in the parent post
Re: (Score:2)
Re:Changing TV channels (Score:4, Informative)
Since many of these technologies transmit the data for all channels simultaneously, why not just scan for key frames and store the last key frame received for each channel? Even just scanning for key frames might be too CPU intensive to do for all channels. But most people, when channel flipping, do so in a fairly predictable order. You could start doing this for the most likely targets for channel flipping when channel-flipping behavior is detected.
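A prefetch policy along the lines suggested could be sketched like this (the class name, the "neighbours first" heuristic, and the prefetch budget are all made up for illustration):

```python
from collections import Counter

# Hypothetical sketch: track how often each channel is visited and prefetch
# keyframes only for the few most likely next channels -- adjacent channels
# first (up/down flipping), then historically popular ones.

class FlipPredictor:
    def __init__(self, prefetch=3):
        self.counts = Counter()
        self.prefetch = prefetch

    def visited(self, channel):
        self.counts[channel] += 1

    def likely_next(self, current):
        candidates = [current - 1, current + 1]
        popular = [ch for ch, _ in self.counts.most_common()
                   if ch not in candidates + [current]]
        return (candidates + popular)[: self.prefetch]

p = FlipPredictor()
for ch in [5, 6, 7, 42, 7, 6]:
    p.visited(ch)
print(p.likely_next(6))   # neighbours 5 and 7 first, then popular 42
```

The point is that the expensive part (decoding and caching a keyframe) only has to be done for a handful of channels, not all of them.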
Re: (Score:2)
A cable box needs to have a separate tuner for each stream. Live TV takes one, each recording show takes one, etc. and such keyframe prefetching would also take at least one. Most cable boxes only have 2-3 tuners, and inevitably they're all going to be taken. So while this could definitely hide the latency in some situations, it's not a perfect solution.
Cable companies are also beginning to use SDV, which broadcasts channels on demand instead of all at once. Some latency will be involved here too, becau
Re: (Score:2)
In the case of a request, things get a little easier. The entity serving up the response can do a little extra work to send an immediate key frame and tailor the rest of the stream, up until the next key frame, for the requestor before shifting them to just getting a duplicate of the data everybody else is getting. You still have the request latency, but I bet you could still manage to keep the latency under 50ms if you tried hard.
Re: (Score:3)
A cable box needs one wideband tuner and an SDR that can demodulate all the channels at once. There's obviously no need to do video decoding (decompression), just buffering of some subset of channels (say most often recently used). One QAM channel brings about 34.5 Mbit/s. A modern ASIC should have no problem easily dealing with demodulating and buffering twenty such channels (700 Mbit/s) when presented with a wideband IF output. The buffering only needs 90 Mbytes/s for 20 channels. That's not much if you
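The figures in the comment above check out arithmetically (all the numbers are that poster's assumptions, not measurements):

```python
# Sanity check: 20 QAM channels at ~34.5 Mbit/s each.

mbit_per_channel = 34.5
channels = 20
total_mbit = mbit_per_channel * channels   # 690 Mbit/s -- "about 700"
total_mbyte_s = total_mbit / 8             # ~86 MB/s   -- roughly the "90" quoted
print(total_mbit, round(total_mbyte_s, 1))
```

So the claimed buffering load is in the right ballpark for a modern ASIC, as the poster says.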
Channels on each frequency (Score:2)
Since many of these technologies transmit the data for all channels simultaneously, why not just scan for key frames and store the last key frame received for each channel?
Only a few HD channels are transmitted on each 6 MHz carrier. Scanning for keyframes on other carriers would need multiple tuners, which increases the cost of the cable box.
Re: (Score:2)
I'm guessing Microsoft does something like this for their IPTV platform (used in U-Verse in the US). You receive a unicast stream until the multicast stream is joined (and presumably another keyframe is reached), although this isn't always seamless and you can't rewind to what you received in the unicast stream.
Re: (Score:2)
Saving the last key frame is also a waste, unless you save all the deltas up to the current time, too. That's the same as trying to play all the channels at once.
You wait for the key frame, display it, and start viewing the stream from there. Using an old key frame and viewing the deltas somewhere midstream would result in weird looking corruption until the next key frame was displayed. Best just to wait.
Re: (Score:3)
In my opinion, showing just a still keyframe, or a keyframe with weird corruption is still preferable to channel switching taking a full 500-2000ms. Though the weird corruption would probably be a customer education issue.
Re: (Score:2)
No, in my case I can trace the worst of it directly to the Time Warner Cable channel guide. It was slowly rolled out in Austin over a few days; I immediately noticed a huge delay when changing channels, and was able to confirm with friends and neighbors across the city (some who had HD boxes, some not, some with the new guide, some without) that those who had the new channel guide were noticeably slowed (~1s), and those with the old guide didn't see anything unusual... until they too received the "upgrade".
The t
Re: (Score:2)
So, I guess what I'm saying is: have you tried rebooting it?
Re: (Score:2)
And here I thought I was the only one complaining that changing channels gets slower and slower with every new receiver box.
On analog it was basically instant, less than 100ms.
First digital box took half a second. Full HD box sometimes takes a whole second or more (and it's not even deterministic anymore)
That SUCKS big time!
Or, a more positive spin: maybe this will result in less TV watched. I've noticed that since we have gone Dish (wife must have TV5Monde), we hardly watch TV at all (the little tot watches the most, probably 30m-1hr a day avg, but this is all Netflix streaming/DVR/AppleTV content).
If some startup or established company is keen on solving this, they could be highly disruptive in the TV space.
Re:Changing TV channels (Score:5, Interesting)
Re: (Score:2)
Re: (Score:2)
I suspect the half mute is in case the commercials are too loud, but you still kinda want to hear them.
N900 (Score:5, Interesting)
The N900 suffers from this, alas.
I can't comprehend why the phone app isn't in memory on boot. It's a PHONE. Instead, when the phone rings you have to wait several seconds for the phone application to load.
In contrast, my wife's new HTC Z snaps and zings along with Android, even though it's "bloaty" Java / Dalvik.
Re: (Score:2)
Is it a phone?
I use my "phone" for phoning and receiving phone calls and talking on the phone maybe...I dunno, 0.6% of the time it's lit up?
It's a computer. If it happens to have a phone in there, then that saves me from also having to carry a phone.
Though I will get a burner if I think Jack Bauer is on to me. Maybe a six-pack. I ain't stupid.
Re: (Score:2)
That's a hell of an argument: it's not a phone, it is just a shitty computer with some cellular and audio hardware built in.
But at least they made a computer with a shitty phone application, and not a just a shitty phone.
Re:N900 (Score:4, Informative)
As Nokia put it, it's a device designed around "mobile computing." The phone app is entirely secondary (unlike on most Symbian devices, which have physical buttons) and is mostly there as a convenience thing. The primary reason the circuitry for it is even there is for 3G data.
Which lies totally outside my experience, where it comes up instantly.
90% of the lag on the N900 is if you've just done something memory intensive (notably the browser) that caused stuff to be paged out to the eMMC, which isn't exactly fast. But the crippling problems that used to cause were resolved back in September or so with the PR1.3 update.
That's because it's a phone, and not a more general purpose sort of device. In fact, they don't even want you messing with it which is why you have to root the damned thing.
Re: (Score:2)
What do you mean about the messaging? My Dell Streak has been used a lot more for texting, web browsing, ebook reading and YouTube than it has as a phone. Likewise, my Xoom doesn't even have anything but wifi. Android is no more a phone OS than iOS is.
Re: (Score:2)
messing, not messaging. Regardless, they don't want you stepping outside the carefully set bounds of the Android OS and they try to enforce it by making you root the system.
The N900, by contrast, is a much more typical Linux system on which you can freely enable root access via official methods. Tuning has definitely been needed, but my point against the GP was that I had not seen the latency he talks about in response to phone calls, and why the phone application wasn't loa
Re: (Score:2)
Re: (Score:2)
iReport (Score:3)
Whenever I use iReport there is about a sixth of a second delay between interacting with it and it doing what I asked. It's barely noticeable but still drives me insane.
RTFA (Score:5, Funny)
I was going to read the article, but it took too long to load.
Re: (Score:3)
Oh, Windows has antialiasing, it's just that it tries too hard to force fonts into the pixel grid so while it does make them readable it also makes them look sort of ugly and broken (unless specifically designed for ClearType).
Then of course as you pointed out there is the weird window tearing thing going on, which for some reason will happen even on a top-of-the-line Windows machine with no software running except the one explorer.exe window you're moving around.
And finally there's the general UI design wh
Worst Offender: Satellite Receivers (Score:3)
I don't personally have Satellite, but it seems that almost every satellite receiver is underpowered to run the GUI.
Re: (Score:2)
I've had a slightly different experience. I use DirecTV; while it's not the snappiest at changing channels, the channel listing menu is fairly responsive. In comparison, using my in-laws' (or my dad's) digital cable is like trying to use the application through a VPN into a slow Windows ME box in Elbonia. Not only was it slow to change channels, the channel list itself had noticeable rendering latency and artifacts.
Re: (Score:2)
At least then you get the product slightly cheaper. I find products with plenty of cpu far more annoying. Take the Xbox 360 UI. A lot of the menu navigation has a delay of about 1 sec, just for doing some animation. And input is disabled in that time.
slashdot (Score:5, Insightful)
Slashdot's Javascript.
The new slashdot interface (Score:5, Insightful)
It actually drives me insane; it is markedly worse. I read fewer stories because of it (because I do not like the feel of the site so much).
I would bet most of the actual slashdot users feel (think ?) the same way. Why is there no mass appeal to change it back / forward in a more reasonable (i.e., simpler) direction ?
Re:The new slashdot interface (Score:4)
r
e
a
d
t
e
x
t
l
i
k
e
t
h
i
s
And yes, the posts lower in the chain would often look like that with the most recent javascript fiasco. I have no idea whether it's still that ugly.
Re: (Score:3)
It actually drives me insane; it is markedly worse. I read fewer stories because of it (because I do not like the feel of the site so much).
The "Many more" button (to get more stories) has never worked for me. This is across multiple computers and operating systems. I read less Slashdot because I simply can't easily get to the older stories.
I'm sure it has something to do with my account setup, but -- bah. The new Slashdot is a train wreck.
Re: (Score:3)
Strange! /. has always worked for me, smooth as silk, except that very long threads can take some time to open. I am a lame Windows user and it worked just fine with XP+Firefox and now 7+Firefox.
BUT, on the phone (HTC Desire) - tragedy!! The stock browser does not scale the text properly. Opera Mini scales OK, but refuses to display more than 250 comments. Nothing helps - open in new tab, open in new window, the same window - it just does not work. View more stories sometimes works, sometimes not, but never works from t
Re: (Score:3)
Strange! /. has always worked for me, smooth as silk, except that very long threads can take some time to open. I am a lame Windows user and it worked just fine with XP+Firefox and now 7+Firefox.
My computer is a single core 1.66 GHz Atom, so I'm somewhat used to things being kind of slow, but Slashdot takes the cake. (Running Windows 7)
I am not very good with computers and programming - am I missing something?
I don't think you're missing anything. The new Slashdot is just poorly written.
Re: (Score:3)
Any popular story, such as the recent Bin Laden killing story, opens SSLLOOWWLLYY... It takes maybe 2-3 minutes to open, and during that time the browser is unresponsive. (I have a single-core HT laptop with 3 gigs of RAM.) Once the page loads, actually scrolling is a joke. The lag there is in the 5-10 second range. C'mon
Re: (Score:2)
Use firefox instead of chrome. ;-) Works like a charm for me and I don't have your specs.
Re: (Score:2)
Re: (Score:3)
Then block it. I use NoScript. Set your discussion preferences to "Classic Discussion System (D1)" for best effect.
the olden days. (Score:2)
same as it ever was (Score:2)
Linus Torvalds has rejected patch after patch after patch that would give Linux BeOS-style latency. He has rejected them all. Why?
Because he wants to goose the server performance numbers. That's basically the story of the last 20 years of Linux.
Re: (Score:3)
Hi, I wrote the article in the story.
Can you give me some verification and links to your comment, and I might see if I can fit it into the article. As you say, there should always be a sensible balance between latency and throughput.
VS2010 and smartphones (Score:3)
I have definitely noticed additional latency in the UI with VS2010, and it has always bugged me. I can't be certain if it is VS itself or if the underlying WPF is to blame. My current belief, and I have no facts to back this up, is that the VS UI is simply much more complicated than the "typical" use case that was targeted for WPF, and as a result low-to-mid range computers fail to meet a human's expectations of UI latency.
On a separate beef, why is it that so many people put up with the AWFUL latency on smartphones? Especially when browsing the web, it can sometimes become unusable. The phones get more and more powerful, but instead of making the existing UI work well, the designers just add more and more flashy crap. The UI is just as slow, but it looks cool in the ads.
When I don't have a real computer nearby, I can use the android browser when I need to, because I recognize the flaws and can patiently work around them. It kills me to watch my girlfriend attempt to navigate the interface of her nook color. Usually goes something like this...
1. press UI element
2. no response, press harder (because that's what humans do when a button doesn't work)
3. harder has no effect, start tapping repeatedly
4. UI finally changes, menu pops up and immediately goes away, having tapped on an unknown selection.
5. human gets mad, claims unit is broken, goes to check email on 5 year old PC.
As an added bonus, the PC doesn't require you to put your fingers in the way of what you're trying to look at.
-d
Re: (Score:2)
Re: (Score:2)
How much of that is connection lag and how much is UI though?
If touching doesn't immediately produce a visual or audio response that the device is trying to load something over the connection, then it's perceived as UI lag. That's part of why Slashdot has that little "Working" box with a throbber that pops up at the bottom of the screen when I preview a comment.
Re: (Score:2)
Re: (Score:2)
The phone I use does indeed lag sometimes, but this has less to do with the languages or frameworks used and a lot to do with processors and batteries not being able to cope
My Super NES with a 3.6 MHz 65C816 CPU doesn't lag when I play Super NES games. So why should a phone with a CPU 1000 times faster lag?
Re: (Score:2)
Slashdot (Score:5, Insightful)
Re: (Score:3)
What the h&*)) is going on here? (Score:4, Insightful)
I read about this right here in January [slashdot.org]. And February [slashdot.org].
Seriously, how many times can you recycle the same story with slightly (and I mean slightly) different nuances, but the same frakking story?
Next thing you know, I'll be reading another story on this with the angle 'women and children affected the most [google.com]'.
Slashdot is becoming the USA Today of the Internet. Isn't it time for another site upgrade?
strace / truss (Score:2)
Re: (Score:2)
Re: (Score:2)
Modern platforms have dtrace.
Which ones? I can only find it on stuff based on 4.3BSD and SVR4.
Skytopia article (Score:3)
The first article in the summary has some silly ideas on how to fix this:
What can we do reverse the onslaught of latency in all its forms? ... First off, we can make LCD (or future OLED/QLED) monitors run at 120 or even 240 frames per second.
What?!?! 1) That is not possible and 2) That would not help. There are so many reasons for this I can't even list them all.
First, there isn't an LCD panel in existence that actually responds that fast. Those phony 240 Hz screens don't actually change the pixels 240 times per second. We don't need faster displays to solve a software problem.
If the software provided a 1/60 second delay, that would be just fine. The issue is that we are not taking advantage of the refresh rates we have now - so increasing those rates doesn't help. Heck -- that's probably faster than the debouncing circuits in the controller!
Compensating for a longer pipeline by increasing clock speed doesn't help anyway, because to get the higher rate you need to increase the pipeline more...
Re: (Score:2)
Well your retina doesn't react instantaneously either, but your brain knows how to compensate for it. My personal theory is that the slow response of the LCD is also an analog or analog-appearing process and your brain will find it less offensive than other types of timing distortions.
But even if the LCD itself doesn't respond instantaneously, increasing the frame rate can potentially decrease every frame-count-denominated so
Re: (Score:2)
I think he's talking about the image processing. LCD controllers need to see multiple frames in advance so they can set the control signals right to minimize ghosting. This "response time" is a marketing number, so they try to make it as small as possible. Unfortunately, that means delaying one or more input frames plus some extra processing time. At 60Hz, one frame is 16ms, which is a large unit for input lag purposes. Changing to 120Hz means the next frame comes faster, which in theory means less lag.
I'm
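The arithmetic behind the frame-time discussion above is easy to sketch (the two-frame buffer below is a hypothetical pipeline depth for illustration, not a measured one):

```python
# Each frame of buffering a display's scaler adds costs one full frame time
# of input lag, so halving the frame time halves that portion of the lag.

def frame_ms(hz):
    return 1000.0 / hz

def pipeline_lag_ms(hz, frames_buffered):
    return frames_buffered * frame_ms(hz)

print(round(frame_ms(60), 1))        # 16.7 ms per frame at 60 Hz
print(round(frame_ms(120), 1))       # 8.3 ms per frame at 120 Hz
print(round(pipeline_lag_ms(60, 2), 1))   # two buffered frames at 60 Hz: ~33 ms
```

This is why 120 Hz can reduce lag even if the panel itself doesn't truly switch any faster: the same number of buffered frames simply takes half as long to drain.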
Re: (Score:2)
Going from 60Hz to 120Hz on my CRT helped latency considerably. However, I'm one of those old-school Quake heads who sees stuff like that. Today's world of laggy GUIs and clunky virtualized binaries (Java/.NET et al) drives me insane... especially since the CPUs we have today are roughly 10x faster than they were in 1998.
I covered it in detail in this month's column (Score:2)
Re: (Score:2)
Unless someone can give me a 'do that and you will see it' test that works, I remain skeptical that bufferbloat is the cause.
It sounds more like confirmation bias tied to a different issue.
If someone running a twitch game (usually TF2) and downloading a torrent (a WoW patch or ISO) on a computer (AMD quad, 4 gigs) that may also be serving up media to my TV (10/100/1000) can't experience this, does it really exist?
I have 25/20 Mbps.
Lightning (Score:3)
Sometimes in a thunderstorm you see lightning, but it can take several seconds till you hear thunder.
Pretty bad design, imho.
Re: (Score:2)
Really (Score:5, Informative)
"(Java or Visual Studio 2010 anyone?"
Really? Do you even know what the hell you are talking about? Or did these two thoughts pop into your skull, and you used your meaty finger to pound out some sort of text in a vain effort to stay relevant?
Replace:
Java or Visual Studio 2010 anyone?
With:
Crappy programmers.
And has anyone documented a repeatable real world test for 'bufferbloat' or is this still an academic issue?
Impatience... (Score:3)
This is one of the reasons why on all the devices and computers I use I disable animations when I'm permitted.
I think the real problem, however, is that we've grown increasingly impatient. I notice it in myself. Some application takes over 3 seconds to open and I start getting agitated, wondering what the hell is taking so long.
Think back to the earlier days of computing when it might take 30 seconds or more to get an application started. I recall on my PCjr booting up games and sitting there for a good minute while the drive ground away. That said, some games nowadays have some rather appalling load times. But look at what has to be loaded compared to those games from the 80s.
I also recall trying to play games that were more demanding than my computer could handle. I'd hit a key and get an action a fraction of a second later. I got used to it, but it's something that would be intolerable today.
Even using early versions of Windows and Mac OS was an exercise in patience, especially if something was going on in the background. And what about waiting for a simple website to load, especially over 56kbps? Hell, what about loading up text-only menus on your favorite BBS via 2400 baud?
The simple fact is that we've been spoiled and as such have grown increasingly impatient over shrinking timescales. That said, given all the computing power at the tips of my fingers I do expect everything to be instant.
Re: (Score:2)
Think back to the earlier days of computing when it might take 30 seconds or more to get an application started
Back when I wrote one of my first gui programs, I added sleep() calls into the load process, because otherwise the splash screen disappeared before you saw the progress bar move and that wasn't so "cool"...
Re: (Score:2)
We aren't spoiled; we're more dependent. In the 80s you used your computer to play simplistic video games and maybe type a one-page school assignment. If you were learning programming, you were coding simple routines in assembly, maybe C if you were lucky enough to have a 68000-based machine with sufficient RAM to run a full suite of tools... even then you were still coding a lot of assembly.
Today, programmers are pumping out bloated binaries using point-and-stick development environments. this problem is ad
Remember when... (Score:2)
.....I want to say it was around 2003 or so around here, when there were all sorts of discussions about "the cost of hardware and bandwidth is so cheap now, we don't need to optimize for the machine, we should instead optimize for the programmer". This came up again and again, and all the n00b programmers danced around their shiny Powermacs and sang and held hands and ratified it.
Notice how everything has exploded exponentially in size, and each revision of any bit of software has been noticeably more slug
Dell Axim X5 (Score:2)
It's all in engineering (Score:4, Informative)
The latency problem in non-networked applications is ultimately caused by poor software engineering, starting with system-provided APIs. Most "bog standard" system libraries were designed for something entirely different than what they end up being used for. The "normal" C I/O paradigm is used everywhere, yet it was really designed for batch applications, not for interactive use. The only way to do almost any filesystem and network interaction should be to submit a request, then react when the results come back, all the while being receptive to other "results" (events) coming in. Unfortunately, designing things this way requires a certain discipline and a mindset, and default APIs and "industry practices" simply don't encourage it at all.
A correctly engineered system API should not have any blocking calls apart from the "wait for events" call, it's as simple as that. It's very rare that an application is only waiting for one thing to happen. Even something as simple as a UNIX cat has two file descriptors open, and simultaneously waits for stuff to come in and for stuff to finish going out, with a buffer in-between (I'm ignoring no-copy APIs for a moment). Coding it up as a read followed by a write is, at best, wishful thinking. Of course event-based programming is something that seems like a lot of extra work, but it's just a matter of getting used to doing things the right way.
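The cat-with-a-buffer example above can be sketched in the style described, where the only blocking call is "wait for events" (a toy illustration assuming POSIX-ish nonblocking file descriptors, not a real API):

```python
import os
import selectors

# A cat-like pump that watches its input and output simultaneously with a
# buffer in between, instead of doing a blocking read followed by a blocking
# write. The selector's select() is the ONLY place the code blocks.

def pump(src_fd, dst_fd, bufsize=4096):
    sel = selectors.DefaultSelector()
    os.set_blocking(src_fd, False)
    os.set_blocking(dst_fd, False)
    sel.register(src_fd, selectors.EVENT_READ)
    buf, eof, dst_watched = b"", False, False
    while buf or not eof:
        if buf and not dst_watched:            # only watch output when we have data
            sel.register(dst_fd, selectors.EVENT_WRITE)
            dst_watched = True
        for key, _ in sel.select():            # the only blocking call
            if key.fd == src_fd:
                chunk = os.read(src_fd, bufsize)
                if chunk:
                    buf += chunk
                else:                          # EOF on input
                    eof = True
                    sel.unregister(src_fd)
            else:                              # output became writable
                n = os.write(dst_fd, buf)      # may be a partial write
                buf = buf[n:]
                if not buf:
                    sel.unregister(dst_fd)
                    dst_watched = False
    sel.close()
```

Because the loop reacts to whichever side is ready, a slow consumer never prevents the producer's data from being drained into the buffer, and vice versa.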
In fact, if you decide to code up your whole system in an entirely reactive way, you gain other benefits. By reactive I mean you could reduce every application thread's interface to a single processEvent(event), run-to-completion function that you implement. As it turns out, it becomes almost natural to get the guts of the processEvent() function implemented as a state machine. The state machine formalism often helps in producing better quality code, and it certainly makes it very easy to trace interaction with the outside world. Miro Samek shows a striking example [drdobbs.com]: the supposedly "so simple it couldn't possibly go wrong" calculator example from old Visual Basic has several bugs that stem from its bug-prone yet commonplace design. The calculator's state is spread out in an ad-hoc manner in various variables, and the tests done on those variables in response to external UI events pretty much amount to a buggy reconstruction of a single state variable to drive a state machine.
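The calculator point can be made concrete with a toy in the run-to-completion processEvent() style: one explicit state variable instead of ad-hoc flags scattered across GUI handlers. (This only illustrates the pattern; it is not Samek's actual example code, and the state names are made up.)

```python
# States: "operand1" (entering first number) -> "op_entered" (operator
# pressed) -> "operand2" (entering second number). Events are single
# characters: digits, "+", "-", "=", "C".

class Calc:
    def __init__(self):
        self.state, self.acc, self.entry, self.op = "operand1", 0, 0, None

    def _apply(self):
        self.acc = self.acc + self.entry if self.op == "+" else self.acc - self.entry

    def process_event(self, ev):
        # Each event is handled to completion; its meaning depends on state.
        if ev == "C":
            self.__init__()
        elif ev.isdigit():
            if self.state == "op_entered":       # start the second operand
                self.entry, self.state = 0, "operand2"
            self.entry = self.entry * 10 + int(ev)
        elif ev in "+-":
            if self.state == "operand1":
                self.acc = self.entry
            elif self.state == "operand2":       # chained ops fold as you go
                self._apply()
            self.op, self.state = ev, "op_entered"
        elif ev == "=" and self.state == "operand2":
            self._apply()
            self.entry, self.state = self.acc, "operand1"
        return self.acc if self.state == "op_entered" else self.entry

c = Calc()
for ev in "12+34=":
    shown = c.process_event(ev)
print(shown)   # 46
```

The dispatch on (state, event) makes every legal transition explicit, which is exactly the property the ad-hoc on_fooBar style loses.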
The state machine paradigm is in somewhat stark contrast to the way a typical GUI application is designed, where you have on_fooBar methods that get invoked when the fooBar event happens. In the fooBar method it's up to you to verify that the application is in the correct state to do whatever fooBar calls for. This requires forethought, and the status quo indicates that it's easy to get wrong. Perhaps that's the reason the de-facto mode of implementing reactions to external events is so broken that it's not used for much besides the GUI. Perhaps this is also why "quick" system calls are usually done inline and end up blocking the whole application, or at least one thread (and threads are not free either, so why waste them on blocking APIs?). Apart from perhaps querying the current time or the current username, there are really no "quick" system calls. Simple things like listing a directory or getting a key's value from the registry can take seconds if your drive is thrashing under high I/O demand, or if the network happens to be slow.
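The "no quick system calls" point suggests treating even a directory listing as a request/response pair rather than an inline call. A minimal sketch of that, assuming a thread-per-request worker and a queue as the event channel (the names `events` and `request_listing` are mine, for illustration):

```python
import os
import queue
import tempfile
import threading

events = queue.Queue()   # the event loop's single blocking point

def request_listing(path):
    """Submit a listing request; the reply arrives later as an event."""
    def worker():
        try:
            result = sorted(os.listdir(path))   # may block for seconds
        except OSError as exc:
            result = exc                        # errors come back as events too
        events.put(("listing_done", path, result))
    threading.Thread(target=worker, daemon=True).start()

# demo: the main loop itself never sits inside os.listdir()
with tempfile.TemporaryDirectory() as tmp:
    for name in ("a.txt", "b.txt"):
        open(os.path.join(tmp, name), "w").close()
    request_listing(tmp)
    kind, path, result = events.get()           # the only place we wait
    print(kind, result)  # listing_done ['a.txt', 'b.txt']
```

While that listing crawls along on a thrashing disk, the loop around `events.get()` stays free to handle keystrokes, repaints, or other completed requests, which is the whole point.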
Of course the line has to be drawn somewhere, so let's assume that paging of code, libraries and heap is something we should not worry about, because it cannot be helped much. At that point one realizes that indiscriminate memmapping of data files can be problematic in itself: a memory-mapped file is, after all, supposed to hide the fact that you are doing a request-response that can be either very fast or very, very slow. The latter is something you should explicitly handle, and with memmap it's at best cumbersome: you have to use some API to check if a given page is available.
Re: (Score:3)
I believe they were talking about the Visual Studio 2010 interface. If you've used a version from some years back, you'll understand how insanely slow they've managed to make even drawing text. I think it's all incredibly hilarious watching everyone squirm. It makes me feel better when I think I must be getting old, because people seem to program without any thought involved... just library calls.
Re:Java or Visual Studio 2010 anyone? (Score:4, Insightful)
Wow, I've seen no input lag on VS2010 at all - and I run it in a VM hosted on my slow-ass laptop, so it's pretty starved for resources. I wonder what I'm doing right here?
Re: (Score:2)
"program without any thought involved...just library calls."
That's actually a good thing. Software will only progress when basic, routine things can simply be called from a library.
And there is a difference between using libraries and not thinking. You use known measurements and techniques to build a bridge; you don't recreate everything from scratch every time.
Re: (Score:2)
What about the interface? I use it for hours on end daily on a 3-year-old laptop and I notice absolutely zero lag in drawing or anything else. Maybe you need to ditch the 15-year-old dumpster-dived computer?
Re: (Score:2)
Re: (Score:2)
Visual Studio tend to be used by less skilled developer
Since when?
Re: (Score:2)
Also, Java and Visual Studio tend to be used by less skilled developers and students (disclosure: I'm a student). Poor responsiveness of programs written in Java or using VS is more a factor of who is writing them than anything to do with the language / VM / IDE.
Java and Visual Studio less skilled developers? Nope. No it doesn't.
But thank you for your disclosure, that explains a lot.
Re: (Score:2)
Visual Basic 4.0 starts up in three seconds on a ten-year-old computer. I have much more advanced IDEs on my machine nowadays, but if I just want to code something up for fun and expect quick results, I'll still fire up VB4 from 1995.
Wow, really? Maybe you should try using notepad and vbc.exe to compile. Super quick!
Re: (Score:2)
This isn't so much about computers doing things too slowly, or trying to compute too much. It's about computers sitting idle at the wrong times, or computing things too early or too late. It's about buffering -- which trades latency for throughput -- making it more and more difficult to coordinate those timings correctly.
In gaming you have screen lag, video card lag, render lag, input lag, and network lag. They all add up, and synchronizing their timings to make things appear fluid and lag-free is so difficult
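A back-of-envelope sum makes the "they all add up" point concrete. The per-stage figures below are illustrative assumptions, not measurements of any real pipeline:

```python
# Each buffering stage in the input-to-photon path adds latency, even when
# throughput is fine. Figures are illustrative assumptions at 60 Hz.
FRAME_MS = 1000 / 60                    # one frame at 60 Hz, about 16.7 ms
stages = {
    "input sampling":  1 * FRAME_MS,    # input polled once per frame
    "game simulation": 1 * FRAME_MS,
    "render queue":    2 * FRAME_MS,    # driver buffers frames ahead
    "display/vsync":   1 * FRAME_MS,
}
total = sum(stages.values())
print(f"total input-to-photon lag ~= {total:.0f} ms")  # ~= 83 ms
```

Five frames of perfectly healthy-looking buffering already puts you over 80 ms, which is well into "the controls feel mushy" territory without any single stage being obviously broken.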
Re: (Score:2)
Read what he has to say about Bradley Manning and you'll wet your kecks.
Re: (Score:2)
That's why I made him a foe in the first place. Not Bradley Manning per se (he's a traitor, one I happen not to be all that upset with, but still), but Wikileaks in general.
Re: (Score:2)
And then things took a turn downhill, I could speculate why but I doubt we'll ever really know.
It happened right about when you purchased a plasma TV, right? There you go.