Most of the Web Really Sucks If You Have a Slow Connection (danluu.com) 325
Dan Luu, hardware/software engineer at Microsoft, writes in a blog post: While it's easy to blame page authors because there's a lot of low-hanging fruit on the page side, there's just as much low-hanging fruit on the browser side. Why does my browser open up 6 TCP connections to try to download six images at once when I'm on a slow satellite connection? That just guarantees that all six images will time out! I can sometimes get some images to load by refreshing the page a few times (and waiting ten minutes each time), but why shouldn't the browser handle retries for me? If you think about it for a few minutes, there are a lot of optimizations that browsers could do for people on slow connections, but because they don't, the best current solution for users appears to be: use w3m when you can, and then switch to a browser with ad-blocking when that doesn't work. But why should users have to use two entirely different programs, one of which has a text-based interface only computer nerds will find palatable?
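For illustration, here is a minimal sketch, in plain browser JavaScript, of the "one image at a time, with retries" behaviour the post is asking for. The URLs, retry count, and timeout are placeholders, and this is not how any real browser schedules requests; it just shows how simple the policy is.

async function fetchSequentially(urls, retries = 3, timeoutMs = 60000) {
  const images = [];
  for (const url of urls) {                         // strictly one at a time
    for (let attempt = 1; attempt <= retries; attempt++) {
      const ctrl = new AbortController();
      const timer = setTimeout(() => ctrl.abort(), timeoutMs);
      try {
        const resp = await fetch(url, { signal: ctrl.signal });
        images.push(await resp.blob());             // finish this image...
        break;                                      // ...before starting the next
      } catch (err) {
        if (attempt === retries) console.warn("gave up on " + url);
      } finally {
        clearTimeout(timer);
      }
    }
  }
  return images;
}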
Most of the web really sucks (Score:5, Insightful)
No idea how your connection speed adds anything to this.
Re: (Score:3)
Most of the web sucks, but with a slow connection it takes you a really long time to reach that point of understanding.
An ad blocker can help you reach that conclusion faster, though.
Re: (Score:3, Insightful)
If you go to a page with a lot of images, the common sense approach would be to load them in order, one at a time. Instead, all browsers do the same stupid thing - try to load all 30 images at the same time, leaving you sitting there looking at 30 partial images.
So you'd rather wait 6 times as long to load a page one component at a time?
Re: (Score:2)
On a slow connection, yes. And especially if the images never load.
But more importantly, the data transferred should be about the same either way, so it won't take noticeably longer. And you will get usable information faster, since you will see a complete image sooner.
If the browser adjusts to a bottleneck with different behavior, that's the best of both worlds.
Re: (Score:2)
That doesn't even make sense unless you assume that the webserver is limiting how much bandwidth each requested item is allotted, and not how much bandwidth each requesting IP is allotted.
because my ISP would never throttle my bandwidth
Re: (Score:3)
If your ISP throttles your bandwidth it STILL won't make a difference whether you download 6 x 500 KB one at a time (3 MB) or simultaneously (3 MB). In fact there are several use cases where downloading them one at a time can reduce the bandwidth needed, because you can stop loading the page once the one or two images you actually need are done. That is the same reason progressive JPEG images load the way they do, giving you a chance to see early on whether it's the image you're after or not.
Re: (Score:3)
Re: (Score:2)
I see you have no clue as to how networking works.
So you're running this over your own private internet with no contention?
Re: (Score:2)
Re: (Score:2)
>> Instead, all browsers do the same stupid thing - try to load all 30 images at the same time, leaving you sitting there looking at 30 partial images.
You're free to use a browser that doesn't do that.
For me, I'd rather reduce the 30 components to 5 by using an ad blocker and a tracker-script blocker.
Re: (Score:2)
I'd rather reduce the 30 components to 5 by using an ad blocker and a tracker-script blocker
Which doesn't work so well once the majority of commercial news-editorial sites start to make everything past the abstract JavaScript-dependent [blockadblock.com] and use the DMCA [blockadblock.com], CFAA [blockadblock.com], or foreign counterparts against developers and users of filter lists that use anti-anti-adblock.
Re:Most of the web really sucks (Score:5, Insightful)
Autoplaying videos are the bane of my existence. Nothing should autoplay, ever, and it shouldn't require a browser plugin to prevent it.
Re: (Score:3)
Indeed, anyone thinking of adding that crap to a page is just a horrible person. A close second is the delayed pop-up. Fuckers. Then they cry about ad-blockers. I've been testing Pi-hole with very promising results; it seems to even bypass the "Hey, you are using an ad-blocker" message.
Re:Most of the web really sucks (Score:5, Informative)
If you use firefox, or a derivative, put this in your user.js file (or set it through about:config).
user_pref("media.autoplay.enabled", false);
user_pref("image.animation_mode", "once");
Re:Most of the web really sucks (Score:4, Informative)
Also, in about:config, enter network.http.max in the filter box.
Reduce these to sane numbers.
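For example, a user.js along these lines lowers the limits that filter turns up. The pref names are real Firefox prefs; the values are only guesses for a slow link, not recommendations:

// Example values only - tune them for your connection.
user_pref("network.http.max-connections", 8);
user_pref("network.http.max-persistent-connections-per-server", 2);
user_pref("network.http.max-persistent-connections-per-proxy", 2);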
Re:Most of the web really sucks (Score:4, Informative)
Well, it does make sense to load the pictures simultaneously, because they may come from different sources, and what if one of them stalls? The whole page would sit there and wait.
What gets my piss to a boil is that browsers still cannot load text first and ignore pictures until the rest of the page is done. Gimme something to read while your slow ass server eventually, maybe, finally manages to send the picture I don't give a shit about.
Re: (Score:2)
And this is why writing the software is left to the experts. Why would you load images in series when a huge amount of the time they're coming from different sources and you can parallelize the work?
Re: (Score:3)
Parallelism is fine for wired broadband but starts to break when you deal with the conditions described in the featured article. Satellite has 1000 ms pings and up to 10 percent packet loss. And trying to multiplex 36 TCP connections over one 0.05 Mbps dial-up connection might cause each of the connections to time out, particularly if a server is using slowloris mitigation [wikipedia.org].
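A quick back-of-the-envelope check with those figures (the 100 KB image size is an assumption) shows why:

// 0.05 Mbit/s shared evenly across 36 connections:
const bytesPerSec = 0.05e6 / 36 / 8;                  // ~174 bytes/s per connection
const secondsPerImage = (100 * 1024) / bytesPerSec;   // ~590 s to finish one 100 KB image
// That is far past any default request timeout, so every connection dies before completing.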
Re: (Score:2)
By doing the work in parallel, your one PC will hit many servers; the ones that are busy will delay sending you data until they are free, and the others will send you data quickly. It's basically like queueing theory: multiple tellers (servers), one queue (your PC).
Even if you have ONE server to multiple PCs, doing parallel requests will make more efficient use of all the caches between the parties. The middleware or even server will know that PC5 wants the same resource that PC2 is currently requesting so it can ch
Re: (Score:2)
Instead, all browsers do the same stupid thing - try to load all 30 images at the same time, leaving you sitting there looking at 30 partial images.
On a higher-bandwidth connection this would be faster. Loading small (say, 100 KB) images in series on a connection with a high ping to the server (say, I have a fiber-optic connection, but the server is on the other side of the planet) is slower than loading them in parallel, because of the time spent waiting for the replies (setting up the TCP connection, etc.).
I also remember download accelerators that would split up a file in multiple parts and download them in parallel, this would usually make the
Do you Know WHY it sucks? (Score:5, Informative)
1. Sites that play videos when they load.
2. Sites that display the entire page for three seconds and then cover it with a full screen ad.
3. Sites that constantly reorganize the page as it loads new ads.
4. Sites that load ads FIRST instead of the actual content.
Bottom line is the web sucks because Madison Avenue got a hold of it. They aren't content with placing an ad like they do in papers or magazines. They're all in your face and FORCE your participation in message delivery. And before you even mention Ad Block, more and more sites simply refuse to load when you have that installed/enabled.
Re: (Score:3)
"So don't ad-block. Script-block instead"
False dichotomy :-)
Re: (Score:2)
The universe doesn't suck. The gravity of black holes just make it look like it does.
Comment removed (Score:5, Interesting)
Re: (Score:2)
Lynx? You really should upgrade to eLinks. Lynx is superior for piping to a speech converter; on a real terminal, eLinks formats the page a lot better.
Re: (Score:2)
I don't know how blind people cope, because they use text-mode web browsers as well. There must be a lot of the web they can't get to, despite web design 101 from the early days of the net insisting that you must have some way for blind people to navigate your site.
Slashdot is down to stating the obvious (Score:2)
FTFY (Score:3)
Everything Internet-related really sucks if you have a slow connection.
Umm, no. (Score:5, Informative)
SSH and telnet work just fine over a slow connection, and so does email as long as it doesn't have a load of attachments, plus other protocols such as gopher. People did manage to use the internet over dial-up before HTTP/HTML came along and sucked up as much bandwidth as it could!
Re: (Score:3)
Re: (Score:2)
Sadly, insight and the ability to read simple English are things you seem to be lacking.
Re: (Score:2)
I can read just fine. Saying that "things are just fine" unless you do things that are absolutely commonplace is a stupid statement.
Timeout (Score:5, Interesting)
Why does my browser open up 6 TCP connections to try to download six images at once when I'm on a slow satellite connection? That just guarantees that all six images will time out!
The problem is not opening 6 connections, or failure to retry, but a timeout that's too short.
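In Firefox those give-up points are themselves prefs. The names below are real prefs; the values are just examples of stretching them for a slow link, not recommendations:

user_pref("network.http.connection-timeout", 300);   // seconds to wait for the TCP connect
user_pref("network.http.response.timeout", 900);     // seconds to wait for the server's response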
Re: (Score:2)
And it's interesting to see how different browsers handle this.
I have a cable modem with a web admin interface that's *extremely* slow to respond to any request. It works fine via Firefox, if I'm very patient. It's totally unusable via Chrome.
(It's Comcast's high-end "wireless business gateway" device, and something I'm basically stuck with if I want my current service package.)
Re: (Score:2)
Which one? I'm on Comcast Business and while the modem's UI isn't particularly zippy, it's not terribly slow either. I don't use its wifi though; I have my own router and gateway.
Re: (Score:2)
The problem is that HTTP is a shitty protocol. It uses a unique TCP connection for every request. For each page of text and every image, HTTP opens a new TCP connection and tears it down after the transfer. This causes a lot of latency, partly because TCP is designed to start slow and ramp up to the available bandwidth, and partly because of the extra signalling for new TCP handshakes, authentication tokens, encryption renegotiation, etc. As a result, your web browser spends way more time than it should
Even HTTP 1.1 has keep-alive (Score:2)
For each page of text and every image, HTTP requests a new TCP connection and tears it down after transfer.
Which version of HTTP are you using? Even HTTP 1.1 has keep-alive.
Re: (Score:2)
That page is about HTTP 1.0.
1.1 (currently the dominant version) allows connection keep-alive and pipelining, which were supposed to solve those issues. Unfortunately pipelining has its own problem: one slow request (a large amount of data, a slow CGI script, etc.) can block the whole pipeline. So AFAICT most clients use connection keep-alive but not pipelining.
2.0 allows multiple simultaneous requests on the same TCP connection, but has the downside of being much more complicated to implement.
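As a rough illustration of what keep-alive buys you, here is a small Node.js sketch (example.com and the paths are placeholders). With keepAlive set on the agent, the three requests share one TCP connection instead of paying a fresh handshake and slow-start ramp each time:

const http = require('http');

// One pooled, keep-alive socket for everything below.
const agent = new http.Agent({ keepAlive: true, maxSockets: 1 });

function get(path) {
  return new Promise((resolve, reject) => {
    http.get({ host: 'example.com', path, agent }, res => {
      res.resume();              // drain the body; we only care about connection reuse
      res.on('end', resolve);
    }).on('error', reject);
  });
}

(async () => {
  await get('/a.png');           // opens the socket
  await get('/b.png');           // reuses it (keep-alive)
  await get('/c.png');
  agent.destroy();               // close the pooled socket when done
})();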
Slight irony detected (Score:2, Insightful)
Tell that to all the people with slow connections, whose operating system is constantly phoning home to a dozen different servers and hijacking their bandwidth to spread itself on a P2P network.
Re: (Score:2)
I think you can mark the dial-up connection as a "metered" connection and updates will not use it to download.
No one tests software on a slow connection (Score:5, Insightful)
Re: (Score:2)
My transit prediction webapp, TransSee [transsee.ca] should work fine on slow networks. All that bandwidth sucking still just means more work for me to create it.
Re: (Score:2)
It's closer to lunch in this timezone.
Re:No one tests software on a slow connection (Score:4, Interesting)
I remember WordPerfect 6. All we, as users and support techs, asked was that they not make it slow with new features and debris.
They failed. Why?
We find out in one of the responses from the dev team, via support: "All our developers use it too, and they have no problems with performance. Perhaps you need to upgrade your computer?"
Well, we also find out that the dev team had 486DX2-66 machines with 16MB RAM, in an age when most attorneys' assistants were using 386 machines with maybe 4MB RAM and Windows 3.0/3.1. Yep, hanging on. A busy assistant would reboot twice a day. Had the WP dev team been testing even minimally on mainstream workstations, they would have known, and had a better answer, like "that's the price for state-of-the-art features, so suck it up and upgrade."
How is this related to smartphone apps and network performance?
First, the issue has nothing to do with user satisfaction. It is about 0. app infrastructure and 1. marketing.
These apps that require a server infrastructure aren't going to be built to serve 'slow' users. Open connections spanning minutes are expensive. Adapting to network speed by stretching timeouts and optimizing data flow doesn't make any money for advertisers or the app itself; the devs are focused on new features, gadgets, and monetizing the product. The rural market is literally worthless to them.
And since the apps are marketed to majorities, smaller populations out in the slow network space have no voice. And no money.
Lightweight apps like Twitter (how much data DOES it take to send a 140 character message?) and SMS manage, but even maps fail spectacularly when they are used in 2G network spaces, and there are still a lot of those.
My complaint is that in a metro area, during lunch, the congestion is so bad I lose data entirely. LTE/4G is available, as well as UMTS, but when I lose it I don't even get 2G - they just drop connections. Reboot my phone and I get service for a few minutes, and then it's gone again. It's just 5500+ employees going out for lunch in a square mile. I have asked, and no carrier is immune to this. And building out capacity there isn't going to happen, because the congestion occurs for 3 hours a day, Monday-Friday. Nope, not worth it. When voice starts failing, maybe.
Surprisingly, the PSTN had minimum service levels dictated by state legislation, and while the penalties were often minimal, there is NOTHING like this in cell service that I am aware of. AMPS and NAMPS didn't really have them, and CDMA/GSM apparently did not. Now that cell service is the only service for many, SLAs for service would be useful. I doubt they will be proposed, enacted, or enforced.
And app performance will *never* be enforceable. Too many variables. I hear ya, rural performance is terrible, and every claim that it will be improved hinges on the cost benefit analysis. That will not change for a while, even with Band 12/700MHz/600MHz spectrum being deployed. Nope.
Now Gigabit LTE and the next generation give us hope that the cell companies will build out and try to displace the incumbent wired ISPs, but I'm dreaming here. Or am I? Since rural America doesn't have wired ISPs, it may be overlooked during the transition to 'copious bandwidth'. Glad I don't live in the woods any more, though if I didn't have a great job I would move there and disconnect.
Most Web Browser Engines Are Open Source (Score:2, Insightful)
Most web browser engines are open source. Go and modify one or many of them to handle slow connections better.
Re:Most Web Browser Engines Are Open Source (Score:5, Insightful)
Most web browser engines are open source. Go and modify one or many of them to handle slow connections better.
And after you get done with that, you can take your car's engine apart and redesign it to get 400 miles per gallon.
Re:Most Web Browser Engines Are Open Source (Score:5, Funny)
And after you get done with that, you can take your car's engine apart and redesign it to get 400 miles per gallon.
It's not hard to get a 400 MPG average. The trouble is, getting back to the top of the hill afterwards.
Re: (Score:2)
The question is, do you want your browser doing a speed test, or an OS with an API that browsers can pick up and adjust accordingly? I usually have Chrome and Safari open. Chrome eats up less memory and auto-connects to my Android tablet, but Safari is needed to connect to my phone.
Since they won't build a bridge between the two I have to link them the hard way.
But bandwidth and speed tests should be on a much lower level than web browser. That way other apps can adjust accordingly. Also I always first
Re: (Score:2)
That was rather my point in a somewhat underhanded way - the article moans about "why don't browsers do this", when the reason is "because everyone writing it found that it was a useless feature for one reason or another". It's not like guys writing browsers aren't thinking about how to deal with this well, it's just not a problem that they can solve magically themselves.
Re: (Score:2)
Labor to trust an HTTPS proxy's root cert (Score:2)
Providers don't offer a proxy because an HTTPS proxy requires each user of each browser on each device to add the proxy's root certificate to the certificate store used by the profile associated with that user, browser, and device. Non-technical users are unlikely to successfully complete this process for both the OS-wide certificate store and the separate NSS store used by Firefox.
Firefox max concurrent connections setting (Score:5, Informative)
You can configure this setting in Firefox. It doesn't look like Chrome has a similar configuration.
http://kb.mozillazine.org/Abou... [mozillazine.org]
network.http.max-persistent-connections-per-server - default = 6
Try setting this to 1.
Source:
https://support.mozilla.org/t5... [mozilla.org]
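The same setting in user.js form, if you'd rather not click through about:config (the value 1 is the suggestion above, not a universal recommendation):

user_pref("network.http.max-persistent-connections-per-server", 1);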
Re:Firefox max concurrent connections setting (Score:4, Insightful)
The point is, browsers can do this intelligently. Automatically. But they don't.
When I leave wi-fi and fall back to 3G, I should not have to tell my browser to behave optimally at each transition.
Follow the money (Score:5, Interesting)
A lot of what drives modern internet design is e-commerce. If you're on a slow connection, you probably don't have much money to spend, so why should anyone care? Or so the thinking goes....
Re: (Score:3)
I do have money to spend and I maxed out my connection with dual 3Mbit connections into a load balancer. That is the best I could get, period. By the way, I only live about 15 miles from Portland, OR. Internet coverage in this country is very uneven.
Re: (Score:2)
Re: (Score:2)
Of course there are exceptions, but in aggregate, faster internet = more disposable income.
Though I agree with the article that most web pages are needlessly bloated. I'm not a web developer, but I've seen a few demos of web sites where the customer comes to the developer's shop and is shown a demo of the new site. The page is hosted locally and many elements are cached on the developer's computer, so of course it's much faster than what the average home user is going to experience. The customer rarely under
Re: (Score:2)
I am on a 448/96 kbps ADSL connection which is the fastest possible available to the house that I own.
So which should be the biggest indicator of my available money to spend? My connection speed or owning the house I live in?
Re: (Score:2)
A website can figure out the speed of your connection, it doesn't know if you own your home.
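For what it's worth, a page can also just ask the browser. The Network Information API below is only exposed by Chromium-based browsers, and the threshold is an arbitrary example:

// Treat this as a hint, not a guarantee; navigator.connection is Chromium-only.
const conn = navigator.connection;
if (conn && (conn.effectiveType === 'slow-2g' || conn.effectiveType === '2g')) {
  // serve low-res images, skip autoplaying video, defer non-essential scripts
}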
Most of the web sucks period... (Score:5, Interesting)
Even with fast Internet connections, websites are so bloated with ancillary scripts, tracking code, and cross-linking to 20 different advertising and content servers that you get stuck waiting no matter what. CDNs helped, but you're still hostage to some advertising company's one slow server because it isn't on that CDN.
Re: (Score:3)
Unless you run a local DNS server that spoofs a lot of those and points them to something local that responds "instantly". I use a Pi for this at home; with 1.5 Mbps DSL it is a big saver, and even though I've finally been able to upgrade to 6 Mbps it is still a saver.
Re: (Score:3)
It's even worse if you are using your ISP's DNS servers. Each request involves a high-latency hostname lookup before the HTTP portion can even start. And for some reason, CDNs use a different hostname for every client's files - so the DNS lookup is never cached.
Re: (Score:2)
Web page designers (Score:5, Insightful)
all show which side their bread is buttered on.
When the advertising content loads first and the page rebuilds/rearranges itself 6 or 8 times and finally the content you want to see becomes visible or stabilizes enough to click a link.
I think some of the pages are designed to do this on purpose.
You get a glimpse of the content you are looking for and click on a link just as the page rebuilds itself, the link changes to an ad, and the cash register rings on a click-through.
Re: (Score:2)
You think the page designers make the decisions to add the advertising & tracking pixels? No.
Those decisions are made by the people who pay them. Some of us cranks sit in meetings and point out the drawbacks to their incestuous, ignorant greed, but again, we don't have the final decision.
Re: (Score:2)
You think the page designers make the decisions to add the advertising & tracking pixels? No.
Those decisions are made by the people who pay them. Some of us cranks sit in meetings and point out the drawbacks to their incestuous, ignorant greed, but again, we don't have the final decision.
Yet those developers don't seem to be able to come up with alternative ideas to "monetise" that web site - a necessary evil, as otherwise no-one is going to pay their salaries.
Re: (Score:3)
Ad networks in general try to make it "easy" for web designers by giving them just a JavaScript snippet to include. Then you can't pre-define the dimensions of that content block, which makes it block rendering until it loads.
Annoying as fuck (Score:2)
I hate pages that look like they stopped loading and decide to do one last refresh AFTER I've started scrolling. That and images that don't load until you get near them. All that does is rearrange text while you're trying to read.
Re: (Score:2)
all show which side their bread is buttered on.
When the advertising content loads first and the page rebuilds/rearranges itself 6 or 8 times and finally the content you want to see becomes visible or stabilizes enough to click a link.
I think some of the pages are designed to do this on purpose.
You get a glimpse of the content you are looking for and click on a link just as the page rebuilds itself and the link has changed to an ad and the cash register rings on a click through.
I was getting ready to say fuck that shit; I'm out in the boonies on HughesNet metered connections and we go over our bandwidth every month. What I was going to do is record the domains hosting any ads about herbal viagra, why a celeb doesn't talk about an offspring, or pictures being banned or embarrassing, dingus.tv or the "dear HughesNet subscriber" pop-under, and then edit the hosts file so they would resolve to my router, whose web server would send back a nice clean 404 error. Somehow Win10 isn't allowing me to
Because Browsers (Score:2)
Opera used to handle this nicely (Score:3)
It had a number of options where you could set the number of connections, globally and per page. And yes, it was useful during the early days of dialup.
But anyway, most of the web sucks in this regard. Period. Sites are horribly designed these days - a gazillion JS dependencies, unnecessarily large images which get scaled, zero concern for mobile devices, etc.
Re: (Score:3)
Re: (Score:2)
A shitload of sites simply stop working if you disable JS. We're long past the days of content + presentation.
Re: (Score:3)
unnecessarily large images which get scaled
Understatement. I've seen 20MP images scaled down to under 200 pixels high. And worse, they're straight off the camera and not even recompressed for the web. 6MB images like this really eat through my mobile data, as I try to stay under 1GB.
Welcome to the Digital Divide. (Score:5, Insightful)
The digital divide [wikipedia.org] in the US became most evident (to me) in this last election cycle.
If you look at the page weights of 'conservative' vs 'liberal' news sites, the former are much smaller and tailored to people even on dial-up, in large part because they know their demographic. Rural internet in the US flat out sucks. We have counties in my state, not more than 3 hours outside of Chicago, where dial-up is still a viable option.
Drudge Report [drudgereport.com] loads amazingly fast. Huffington Post [huffingtonpost.com] does not. Drudge was 1.13 MB in size with 44% of that images [imgix.com]. (The site I used to analyze them was done with Drudge's 14 assets long before Huffington Post stalled at 220/222 assets.)
The art of optimization seems to have disappeared, it made a small resurgence when web developers tried to optimize for the mobile web, but it doesn't look like most developers ever tried that hard.
It's a closed feedback loop. Developers live in places with fast Internet, test in places with fast Internet and then don't understand what it's like anywhere else. Students on college campuses live with gigabit internet and Internet2 connections to peer universities. They move to cities that Comcast pays attention to.
The best suggestion I have: Turn off images, configure the browser not to thread connections, and get involved in local government to get faster internet to your area.
Re: (Score:3)
The art of optimization seems to have disappeared...
Because shiny and swoopy is more important than performance.
Re: (Score:3)
It's not that it has disappeared. But if I have to choose between making my website shiny and making it small, nine times out of ten I will prioritise the UX for users with high bandwidth (HiDPI icons, etc.).
Re: (Score:3)
The art of optimization seems to have disappeared, it made a small resurgence when web developers tried to optimize for the mobile web, but it doesn't look like most developers ever tried that hard.
They never tried to "optimize" for the mobile web, they just removed half the content while leaving all the ads and tracking in place. It means that the mobile version ALWAYS sucks in comparison to the full version, even when viewed on a mobile device.
I have NEVER seen a single "mobile" web page that was better on my phone than the full version of the same site. I never, under any circumstances, ever want to see the mobile version of any web page.
Important fix (Score:5, Insightful)
Be sure to run adblockers, stripping out adverts makes a big difference.
But even Slashdot is a big fat bloated pig; there's no reason at all for it to load everything plus a giant pile of JS.
Re: (Score:2)
Slashdot even gets crazy timeouts if you're lagging TOO much. Won't load all comments, maybe won't even load the main page or stories ... And looking at this site it really should be just friggin' text.
Re: (Score:2)
An adblocker and Noscript on Linux with Firefox and my internet is blazingly fast.
https://alterslash.org (Score:4, Informative)
Be sure to run adblockers, stripping out adverts makes a big difference.
But even slashdot is a big fat bloated pig. no reason at all to load everything and a giant pile of JS.
I highly recommend https://alterslash.org [alterslash.org] - it removes all the bloat from slashdot.org. Thanks to Jonathan Hedley!
Re: (Score:2)
As much as I'm not a fan of APK and his hosts-file spam, this is one place where he's right.
The only way to really use the internet these days is with either a hosts file blocking thousands upon thousands of entries, or better yet, a DNS server that does the same for all your devices at once (I use the latter).
Without that, many ads still waste your time and bandwidth, even if you don't have to see them at the end.
Of course the downside is that it makes it even easier for sites to detect when you're blocking ads,
Re: (Score:3)
oh boy....
Your DNS can easily be local, on your network, and it means that everyone connecting to your network, regardless of device, has the advantage of ad blocking.
I don't need to jailbreak my mother in law's iPad for her to get the advantage of the DNS blocking, I don't have to root my wife's android, every device that comes in to the house and connects to the network automatically gets the advantage of ad blocking just by virtue of connecting. And best of all, I can make updates to one device instead o
No shit, Sherlock! (Score:2)
Faster pages (Score:2)
Crime vs. Capitalism. (Score:2)
When one of the tactics for those on slow connections is to employ ad-blocking technology, one doesn't have to look far to see the impact on bandwidth. Couple that with the fucking telemetry bullshit infecting software these days, and the problem is obvious in both directions.
I find it odd that our society will label physically tracking everything a human does a crime, but virtually tracking and selling everything a human does is called capitalism.
Opera middle-man. (Score:2)
I think Opera came up with an ideal compromise. A middle-man proxy that did filtering and compression. Fast pipe on their end, slow on yours.
It isn't just advertising and Javascript (Score:5, Informative)
There is a hilarious (and sad) commentary on website bloat at http://idlewords.com/talks/web... [idlewords.com] that shows truly outrageous examples of this sin.
Metered Users Also Suffer (Score:2)
Users like myself who are on a metered connection also suffer, but in a slightly different way. All those pesky videos that (partially) download on page load when you're just trying to read a news article. The bandwidth is plenty fast, which actually exacerbates the problem. How is it that Chrome provides a switch to turn off images, but not videos?
Another commenter here talked about pipelining. Again, this comes back to a lack of proper browsing features in our de-facto web browser, Chrome. We want power o
Easy solutions (Score:4, Informative)
Most browsers open 6 connections because most connections these days can handle it. If you live in a time warp (like most of the US) with tech that's 20 years old, then you have to use 20-year-old tricks of the trade.
"Back in the day" we had 56k at home or for the rich folk, 128k (ISDN), double ISDN if you were really lucky. We had the same problems, the 'web' was getting fancy with things like Flash, video and high-def images because all the 'work' was being done at places that had either access through an institution with at least Fractional T1's and things like 100Mbps home-internet copper/fiber connections for $10 were being promised as less than a decade away by the ISP's which were then repeatedly subsidized by the governments to do just that.
How did we do it:
a) Set up your own DNS caching servers
b) Set up your own HTTP/HTTPS proxy caching servers with giant caches
c) Proper QoS to make sure certain traffic had priority over others, small packets and small buffers
d) Set up your own servers (such as IMAP/SMTP, gaming etc) and have them sync during times of lesser activity. These days, with a bit of API tinkering, even social media can be done that way.
We had LAN parties with hundreds of computers behind a 1Mbps cable modem, no problem.
Opera 12 (Score:2)
Re: (Score:2)
Ahh, the Presto days. I wish they'd open-source that engine someday...
Comment removed (Score:3)
Re: (Score:3)
There are several reasons why things are this way (Score:2)
1. People are egocentric. Everyone thinks (by default) that everyone else is in the exact same relationship with their environment as they are. I've been in corporate IT for many years (in the past) and have seen a large number of major initiatives designed from the point of view of the HQ folks who are on top of the data center, network-wise. You know, millisecond latency to the servers. Then the app is deployed out to the remote masses, and... it sucks. So the app developers demand to know when the
Sprites (Score:2)
Why does my browser open up 6 TCP connections to try to download six images at once
The website could easily be serving those 6 images as 1 image and use CSS to display them individually.
So it's not only a browser issue.
Actually, it's a protocol issue.
I believe this sort of thing is being changed for HTTP/2, with multiplexing.
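A rough Node.js sketch of that multiplexing (the host and paths are placeholders): all three requests travel over one TCP connection as separate HTTP/2 streams, so there is no per-image handshake.

const http2 = require('http2');

const session = http2.connect('https://example.com');

let pending = 3;
for (const p of ['/a.png', '/b.png', '/c.png']) {
  const req = session.request({ ':path': p });     // one HTTP/2 stream per image
  req.resume();                                    // discard the body for this demo
  req.on('end', () => {
    if (--pending === 0) session.close();          // tear down the single connection at the end
  });
}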
Re: HTTP Spec (Score:2)
Opera started it. Other browser makers had to follow suit to keep up with Opera's speed.
Re: (Score:2)
Of course the normal way of writing web pages these days includes at least 10 different servers (mostly ads and tracking), so it's easy to only open 2 connections to any one server while still having your browser opening 20 or more connections at a time.
Re: (Score:2)
That's AdSense (most likely). AdChoices is offered by multiple advertising networks as a way to set "preferences" to get more "relevant" ads (provide advertisers more data for their profile on you).