News

Future of the PC on NPR's Science Friday 41

EMS writes "NPR's Science Friday news radio program today has a discussion on the Future of the PC at 3-4pm EST. Guest speakers Tim O'Reilly, Steven Levy, and Scott Bradner discuss how PCs are evolving in a "decentralized, network driven world". Contributing forces to be discussed include Linux and network appliances." It's worth checking the NPR site later on; they typically put these shows up in their archives.
This discussion has been archived. No new comments can be posted.

  • Every time I log into my ISP I'm part of a 'network'. When I want to balance my checkbook, I use my 'local' hardware. Not sure how this is going to change in the foreseeable...

    Breathe in...Clear Mind
    Breathe out...Don't Know

  • You put way too much faith in technology, and in the so-called free market.

    Once some company knows they've got you by the balls, they'll screw you, and send you the bill for services rendered. QoS guarantees? You've got to be joking. Any other service I rely on today? Like my cable? Like my phone service? You're talking about adding costs for consumers and decreasing reliability and options. You think Microsoft has you by the balls now? Just wait until they bill you per OS boot!




    "The number of suckers born each minute doubles every 18 months."
    -jafac's law
  • SCSI? ATA? 100 Mbps Ethernet? $30 dialups? Slashdot effect? Remember, we are talking about the future, not the present. 100 years ago anyone would have regarded it as outright silliness to sell the horses that basically live off the fields by the river for something that costs a few bucks at each refill to go a few miles...

    The future of computing has only just begun to form; MS and all the big dinosaurs want their part of it, but this time, with global cooperation, there's a chance to get everything right. Really everything; don't think the current drawbacks are bothering only you.

    I'm, for example, pretty surprised by how much joshv thinks the same way as me; and chances are I'm about 8-10 hours away in a different time zone, and 8-10 thousand miles away. With helpers finding each other across that distance, turning the whole world around for the better is a snap. I mean it.

    (Well, assuming my English makes sense :)

  • Forget network computers, network devices, whatever. The future is here and most of us are using it already. Can you say cash machine?

    Wintel has pretty much saturated the PC market. Now they're busy changing the fins on next year's model. The biggest remaining market is "the rest of us" who didn't buy Apples because even that was too hard, or because we're still working through issues like potable water and reliable electric service. This market is beyond huge: it's going to be everybody on the planet with enough money to ever make an advertiser care about them, i.e. a majority of humans.

    E-commerce is too highfalutin a term to describe what's coming. Neither Windows nor anything else that advertises itself as an operating system is appropriate for this image of the future. Do you know what operating system your cash machine is running? How about the last point-of-sale terminal you used?

    Think unlimited bandwidth, ubiquitous devices, and oh, a few trillion transactions a day, taking a max of a few minutes each (not including the time to actually watch the movie) among millions of networked servers/clients, initiated by people who really don't care about saving stuff or whether they're at home, the office, or a fast food franchise waiting area when they do it.

    Just to make sure I offend somebody (and you know who you are ;-) a dollar to a doughnut says it's going to be based on a protocol and not a binary format.
  • Casting my vote in your boat :-)

    I felt a surge of agreement with the woman who called in to talk about her Linux machine and how it had taught her to be a better computer user. But the panelists were right -- in a few years the people who know intimately how computers and networks work will be as rare as people who know how cars and cell phones work today. Steve Sales and Mary Marketing don't give a shit how the box does what it does, they just want stable, speedy function for a low low cost packaged in a stylish container.
  • Why have multipurpose computers, and especially PCs, become so popular? Because for a relatively low price, one person can do many many many many tasks without using a time-share device (anyone ever use one of these? I mean the REALLY old ones from the 70's). Software is what made the computer industry take off: you can word process, build a database, run spreadsheets, play pong, all on the same machine. This is what has made the PC (Personal Computer) so popular recently. PCs will continue to exist because of the wide variety of things that can be done on them with little or no modification. Many people have forgotten this; they see their computer as just another tool, when it's really a toolbox. Devices like WebTV are for people who want something quick and easy and aren't worried about productivity, and that's fine with me, it's better they get something user friendly than calling me at 7 am asking me what a general protection error is. The PC will be around for a bit longer because of its wide range of uses. Sure, Sony can release a console system that can serve up graphics as fast as some PCs, but when you really look at it, it's not that spectacular. Look at the original Nintendo: it had some pretty rad games on it for its time, and Rob the robot. Did PCs of the time have any of that? No, most were still CLI boxes used for word processing. Did the PCs quickly become more powerful at graphical applications than the Nintendo? Yes.

    Another matter is what you actually define as a PC (Personal Computer). Is a PalmPilot a PC? It's personal and it computes. Or do you see a PC as a desktop tower with an obscenely heavy monitor and a bunch of cables sticking out the back? It's personal (for the most part) and it computes. Devices like tablets, handhelds, set top boxes etc. can all be considered Personal Computers. Ever seen the Toshiba Libretto? It's a 2 lb sub-notebook that can run Win98 or NT; its innards could easily fit into a tablet-sized device, and you could have a fully functional workstation the size and weight of a notebook (a paper one, remember those?). Why hasn't this been done quite yet? Expense. The LCD monitors for portable devices are really nice... but expensive. Someone (I think it was Cyrix) released a tablet for surfing the web using their MediaGX processor; this is a step in the right direction for tablet devices. But is that considered a PC? Hell, I would love to have one of those, I could take notes in class on an appropriately sized screen instead of a little PalmIII. That would make it much easier to convert my scribble into a .doc file for inclusion in a thesis paper. I would say that's a personal computer, a type which we're going to see more of as hardware prices drop.

    I would rather not rely on a remote server to store files; I like keeping them local, thank you. I dislike all the "everything on the internet" comments I have read. Don't you pay attention to the news? Do you read the articles on Slashdot? Very large companies, who don't give a rat's ass about you the consumer and who only see you as a potential dollar value, are buying up as many forms of media as they can. They want to control the flow of information coming into your PSX2s and HDTVs. If a handful of large overpowering companies own a majority of media forms, or just the majority of distribution mediums, what makes you think that your rights of free speech and all those other nifty ones are going to count anymore? I can almost assure you that they won't count for jack crap. And what about privacy policies: do the companies hosting your data have access to it? If I write up a text file describing the ultimate domathingy, and I store it on a remote server that's owned by someone, say Microsoft, can they take my idea and use it without me seeing a dime of the profits or without getting any credit for its invention? No, I would rather have my own hard drive.

    Anyone here ever hear of Jini? I can almost see some Java developers' ears perk up. It's a little toy created by Bill Joy of Sun to enable anything with a network interface and a processor of some sort to become a shared device (a toy sketch of that idea, using plain Java RMI, follows at the end of this post). That means Jonny Jackoff can log onto my home network (saying I give him permission) and turn on my microwave, and watch my fishtank with a Java-enabled webcam, and all sorts of other stuff. That's a very good technology and, no offence to its creators, I won't use it. I won't for the same reason I don't want my personal diskspace to be on some remote server: I don't want any script kiddies playing with my files or some dipshit admin spilling his Frappuccino on the RAID controller, frying both my data and his friend who beat him in a D&D match that weekend. If I do something stupid, or forget to set up a firewall and my data is lost, that's fine, but I don't want some bonehead to do it for me and then be the victim of some subparagraphsectionarticlefineprintEncyclopediaBritannicaontheheadofapin legalese part of my license agreement that says the server was not responsible for my data.

    So to finally stop ranting: yes, I believe the desktop as we know it is slowly going to die out in the mainstream consumer market; the iMac and IBM's latest little iCandy treats are giving us an idea where the consumer market is going, Cuddletech. Do I think the Personal Computer is going to die out? Not within the next couple of decades; we're just going to see them become a lot more streamlined with the way we work, the power of a desktop in the size of a pad of paper that DOESN'T use a keyboard. As for infinite bandwidth and all that unhip lingo, read Fahrenheit 451. Wall-sized TVs, "interactive" television that makes you want to take one too many sleeping pills, it's all streaming mass media. No one sits down to read anything. Reading may seem like some sort of archaism in this MTV-You'vegotmail-Starbuckscoffee-acidtechno-everyone'sgotissues freak show that we find ourselves embroiled in, but it's one of the most effective forms of communication. Reading requires no burning of fossil fuels in order to pump up your 100 watt stereo speakers, depending on the type of paper used a book has an uptime of several hundred years, it's not terribly harsh on the eyes, it can be done almost anywhere, and anyone can write. You don't need to know a technical language; you can write in the vernacular about anything your underloved, overstressed, hypercaffeinated heart desires.
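
    (To make the Jini idea above a bit more concrete, here is a hedged, minimal sketch of a household device publishing itself as a network service. It deliberately uses plain Java RMI rather than the actual Jini API, and the MicrowaveService interface, method names, and registry URL are invented for illustration only.)

        import java.rmi.Naming;
        import java.rmi.Remote;
        import java.rmi.RemoteException;
        import java.rmi.server.UnicastRemoteObject;

        // Remote interface a networked device might publish (hypothetical names).
        interface MicrowaveService extends Remote {
            void startTimer(int seconds) throws RemoteException;
            boolean isRunning() throws RemoteException;
        }

        // Device-side implementation that registers itself so other machines on
        // the home network can look it up and call it.
        class Microwave extends UnicastRemoteObject implements MicrowaveService {
            private volatile boolean running = false;

            Microwave() throws RemoteException { super(); }

            public void startTimer(int seconds) throws RemoteException { running = true; }
            public boolean isRunning() throws RemoteException { return running; }

            public static void main(String[] args) throws Exception {
                // Assumes an RMI registry is already running on this host (rmiregistry).
                Naming.rebind("rmi://localhost/kitchen/microwave", new Microwave());
                System.out.println("Microwave service registered.");
            }
        }

    (A client on another box would then call Naming.lookup("rmi://the-server/kitchen/microwave") and invoke startTimer() remotely; real Jini adds spontaneous discovery and leasing on top of this basic remote-object idea.)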

  • Try using your local computers if the power company screws up with billing and deactivates your 'account'.

    And try using any kind of electronic device at all, whether the data/processing is local or not if the power is off.

    And try using your remotely served software if the ASP (Application Service Provider, which appears to be the de facto term for this) "screws up with billing and deactivates your 'account'." Really, this is far more likely than the power company screwing up. The electrical utilities have been billing people for decades; is some startup in a completely new field really going to be more reliable? And how are you going to argue that they screwed up? With the power company, if they overbill you for something, you can go outside and look at the numbers on the meter. With remote programs (assuming we're paying by the minute for usage), it's basically your word vs. theirs. Their logs show you used 90 minutes of time, while yours say only 30. So who's right? (A sketch of keeping your own local usage log, just so you have something to argue with, follows at the end of this post.)

    I am not talking about today, I am talking about the future. Things will presumably be much more reliable in the technological realm.

    And presumably, we'd be living on Mars by now. Making presumptions about the future is one of those funny things, just because somebody thinks it will happen, doesn't necessarily mean it will.

    Perceived reliability is a big problem. If there's a failure at your end, there's always something you could have done about it. Whether it's doing daily backups or buying a UPS, the safety of your data is in your hands. If something goes wrong at your service provider, you're screwed. This is especially a problem for the "stick the AOL disk in and get online" crowd, as they lack the experience or the intelligence to figure out who's a good SP and who isn't. And don't assume the public at large is going to become magically knowledgeable about how the system works, and what factors affect their data's safety.

    Now, just to be fair, I don't think the idea of remote storage and processing is necessarily a bad one. It's just that I don't think it's ready for primetime, nor is it a be-all and end-all solution to the problem of computing. To paraphrase jwz, you can't take a complex project, sprinkle it with the magic pixie dust of 'in the future it'll work', and have everything magically work out. The issues aren't that simple. Somebody has to get whatever follows ADSL and cable modems into Joe User's house. Somebody has to manufacture these "smart TVs" (the issue of whether or not a TV is the best hardware to use a web browser on is another issue entirely). Somebody has to build the application servers, and establish some sort of standard for the equipment. Somebody has to convince software companies that it's a good idea to let service providers rent their software out to end users. And most importantly, somebody has to convince the users that it's a good idea. If it doesn't make sense to the people who are supposed to pay money for the service, it ain't gonna happen.
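
    (As promised above, a minimal sketch of the "your word vs. theirs" point: assuming a hypothetical per-minute-billed service, the client could at least keep its own timestamped record of every session, so there is an independent log to hold up against the provider's bill. The file name and service name below are invented for illustration.)

        import java.io.FileWriter;
        import java.io.IOException;
        import java.io.PrintWriter;
        import java.util.Date;

        // Appends a timestamped record of each metered session to a local file,
        // so the user has an independent log to compare against the provider's bill.
        public class UsageLog {
            private static final String LOG_FILE = "usage-log.txt"; // hypothetical location

            public static void record(String service, String event) throws IOException {
                try (PrintWriter out = new PrintWriter(new FileWriter(LOG_FILE, true))) {
                    out.println(new Date() + "\t" + service + "\t" + event);
                }
            }

            public static void main(String[] args) throws IOException {
                record("remote-wordprocessor", "session-start");
                // ... do the metered work ...
                record("remote-wordprocessor", "session-end");
            }
        }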

  • The funny thing is that network backbone bandwidth is doubling every 6 months, thus actually growing faster than Moore's Law of transistor count doubling every 18 months. Of course eventually network speeds will catch up with processor speeds, since you need a processor to route packets. Thought: a 32-bit CPU at 500 MHz moves data at (around) 16 Gbps on-chip, the network backbone speed that should be attained in 2001 (a couple of OC-192s).
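    (For what it's worth, that on-chip figure is just datapath width times clock: 32 bits x 500,000,000 cycles per second = 16,000,000,000 bits per second, i.e. 16 Gbps; and a couple of OC-192s at roughly 10 Gbps apiece is about 20 Gbps, the same ballpark.)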
  • With the power company, if they overbill you for something, you can go outside and look at the numbers on the meter. With remote programs (assuming we're paying by the minute for usage), it's basically your word vs. theirs. Their logs show you used 90 minutes of time, while yours say only 30. So who's right?

    I think you just answered your own question (albeit a rhetorical one)...

  • It's all OK, but I see two mistakes: first, you might have more "chips" (whose transistor count doubled) for a given node; that might buy a few years before bumping into limits.

    Second, I don't think system "processing" speed (highly variable anyway, depending on the task), system memory/link bandwidth, and the transistor count of a single chip are really all that closely coupled. And of course, to be repetitive, you don't know the future :) BTW, this article is a nice friendly place; I guess that NPR thing turns off most of the "interested" people :)

  • Drop by http://www.npr.org/programs/totn/ [npr.org] and visit the archives. They have all their programs for the last several years online in Real Audio.

  • I think the future of the PC is secure. There are just too many creative things you can do with one. In any event, I'm not giving up mine!

    However, with the power of the PC comes responsibility. Responsibility for a complex device that can only be a reflection of each user's habits, expertise, and organization. The power to mold the PC can be a curse in the hands of those not paying attention or lacking interest -- as the latest MS-macro-virus shows.

    It is for that reason that business will move to some sort of networked device, lacking significant user configurability, for well defined tasks like OLTP, email, and word processing. Of course, they will move slowly and cautiously, but economics and risk will push them inexorably toward a simpler, centrally-managed computer for most employees.
  • by ChrisRijk ( 1818 ) on Friday June 11, 1999 @08:56AM (#1854723)
    Whenever I see people discussing the "end of the PC", they seem to be getting misquoted a bit, or people are just latching onto the headline. What I mean is that they're really predicting "the end of the era where a user's Personal Computer dominates where they do their work."

    Putting it another way, a particular bit of hardware becomes less important, and instead the data and services become more important. So, say within a business, you can log onto any (or most, at least) computers around, and you get the same data, same functions, same customised desktop at each and every place. Or, instead of taking a laptop or handheld, you log in remotely (at the hotel, airport, general kiosk, whatever) and access your stuff like that. Or you just borrow a handheld/laptop from anywhere, and providing it has the required client stuff, it takes little effort to 'customise' it, because your particular settings and files are exportable (a tiny sketch of that idea appears at the end of this post).

    An example of this sort of thing is Sun's recently released i-Planet software. All you need is a bit of hardware with Java and internet access, and (without taking any software with you at all, just a password) you can remotely, and securely, get access to your mail, files, etc. from it. (Of course, you do need the required server stuff set up first.)

    Windows is very badly suited to doing this sort of thing (in general, not just the i-Planet example), though Microsoft have been kludging in some hacks around this.

    To workers/members of the public, it probably won't make that big a difference for a while, unless they do lots of 'remote' work. It will make a much bigger difference to the IT departments and sysadmins, though...
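
    (A tiny, hedged sketch of the "exportable settings" idea above: a borrowed machine pulls the user's preferences from a profile server at login, so the desktop looks the same everywhere. The profile URL, property names, and server are invented for illustration, and a real system would obviously add authentication.)

        import java.io.InputStream;
        import java.net.URL;
        import java.util.Properties;

        // Minimal sketch: a borrowed machine fetches the user's settings from a
        // (hypothetical) profile server, so the environment follows the user.
        public class RoamingSettings {
            public static Properties fetch(String user) throws Exception {
                URL profile = new URL("http://profiles.example.com/" + user + ".properties");
                Properties prefs = new Properties();
                try (InputStream in = profile.openStream()) {
                    prefs.load(in);
                }
                return prefs;
            }

            public static void main(String[] args) throws Exception {
                Properties prefs = fetch("alice");
                System.out.println("wallpaper = " + prefs.getProperty("wallpaper"));
            }
        }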

  • It'll be interesting to see how NPR frames the conversation with these folks, especially since Steven Levy was the primary author of Newsweek [newsweek.com]'s recent cover story on "The New Digital Galaxy" [newsweek.com], a fairly mainstreamed, fluffified and narrow view of the future of PCs and pervasive computing. I know Levy is capable of much better, but he does a good job of bringing it to the masses wrapped in a candy coating. The issue also included an article by Bill Gates on "Why the PC Will Not Die" [newsweek.com]. Possibly good background reading for the NPR bit.
  • fsck that, my desktop machine's SCSI or ATA (or FireWire) bus gets me my data MUCH more quickly than, say, even my stored data on my server across the 100 Mbps Ethernet LAN - and you expect me to trade that for a dialup line that I gotta pay $30 a month for?

    If the power goes out that's one thing; then say there's a 1:1000 chance I can't use my computer on a given afternoon. But add onto that the chance that the network is down, congested, infected with or dealing with ExploreZip's descendants, etc. Again I say FSCK that!

    Then, on top of that, perhaps I forget my password to the network, perhaps someone hacks my account, perhaps their billing program barfs on a Y2K bug and cancels my account, so I have to call their customer service rep during business hours on a weekday to get the account reinstated, and it's Saturday at 2am and I have a project deadline for Monday morning?

    There are just way too many things that can go wrong in the "networked future". So again, I say FSCK that: I want MY data, and MY software, on MY machine, on MY desktop. I don't need or want to be reliant on someone else's network service, or their goodwill to keep their prices down once they've monopolized the market, or their altruistic desire to improve reliability and speed, or to hire competent people -

    Forget it man, this is what the PC revolution was all about in the first place. Now that we've got network connections on desktop PCs, we have available the best of BOTH worlds, but take your elements of 1960s centralized-control mainframes running dumb terminals, and shove them up your USB port.




    "The number of suckers born each minute doubles every 18 months."
    -jafac's law
  • Intelligence will eventually become distributed among the currently 'dumb' devices that we use each day. Hardware innovation will continue to drive down the price and size of devices while upping the performance, making sitting at a 'workstation' to do computer work seem quite anachronistic in the decades to come.

    The people that say "you will pry my PC out of my cold dead fingers" I think lack imagination.

    What do you need a PC for now?

    -Games - Dedicated 3D rendering hardware is so cheap it can be embedded in $100 consoles - why not in TVs, VCRs, etc.

    -Surfing the Net - Your HDTV purchased for $1000 in the year 2005 will have this built in.

    -Local storage (I gotta lotta stuff and I don't trust it on the internet) - So you buy a storage unit for your local wireless network. The minute you plug it in it announces its presence to all networked devices and voila, your VCR has a place to store movies, your TV a place to archive web pages, etc...

    But why do you need local storage if your ISP can offer terabytes of movies/audio/programs all over a broadband connection?

    -Word processing, DTP, graphic design
    (general purpose applications) - broadband will most likely make it possible to rent the usage of such apps from your ISP and run them using a thin client. Might sound expensive now, but hardware and bandwidth prices keep falling. Do you want to buy Photoshop with x plugins for $1000, or rent it at 50 cents an hour? (See the quick break-even arithmetic after this post.)

    Thin clients run nicely on near notebook-like (as in a paper notebook) tablets now. Wait a couple years and you will have a notebook thin 'thin client' with screen resolution approaching that of the printed page.

    Finally corporate drones will stop printing their email and schedules and just be able to carry their damned computer everywhere they go.

    -Compiling, rendering, other processor intensive tasks - Two words: server farms. Access to high speed/distributed hardware will be commoditized to the point where you can cheaply lease time on hardware with astounding capabilities.

    Some of this stuff is 30 years off, but it will eventually come. Some dork will always want a tremendously expensive box that does it all, but it will become less and less economically justified when most of the common devices in your home do 90% of your silicon based processing already, and the other 10% can be cheaply bought on demand.

    -josh
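    (Taking josh's own hypothetical numbers for the Photoshop example above: at $1000 to buy versus 50 cents an hour to rent, the break-even point is $1000 / $0.50 per hour = 2,000 hours of use. Rent if you'll use it less than that, buy if you'll use it more, ignoring upgrades and plugin churn on both sides.)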
  • I didn't read the Newsweek, but I have always been impressed by Levy's articles. Back when he wrote for Macworld, The Iconoclast was the first article I read each month.

    -awc
  • 1) Privacy: I don't want to store my data and programs somewhere that I don't have ultimate control over.


    You don't have privacy now. The police can kick down your door and take everything, eavesdrop on your phone calls, etc... Encryption is the answer. And it keeps getting better/easier to use. It will make sure that your data can only be read by you whether or not it is stored locally.

    2) Accessibility: I want to be able to get at my data even when the network (say, to my ISP) goes down. Same for processing.

    When the power goes off you can't do anything with your computer. The Internet will eventually approach the reliability of the power grid. If you are really fanatical about it, store a local copy of the really important things.

    3) Flexibility: Look how difficult it is to perform non-mainstream operations with proprietary software now. Now think about how much more difficult it will be when you don't even have your own copy of the software; you'll have to rely on your "Processing Provider" supporting 3D rendering or whatever.


    No, you will be able to buy the capability from anyone that provides it, not just your ISP, and it won't cost you the huge upfront capital expenditures that come with high-end software these days. Renting apps provides much more flexibility than buying/installing/maintaining code for every single computer-based function you perform.

    It really all boils down to autonomy. I want to do my thing, my way. And so do a lot of other people.

    The scenario I offer presents no barriers to your autonomy. True, with current technology this scenario cannot be realized in any practical sense today. But given 20 or 30 years, we will be there.

    Furthermore, there's no pressure for me to join the collective. If I need more storage or processing power, I can build a beowulf cluster out of all the PCs discarded by the lemmings.


    And the rest of us will be renting Beowulf clusters for pennies a mega-MIP while you chase down Ethernet cable faults.

    -josh






  • by gavinhall ( 33 )
    Posted by FascDot Killed My Previous Use:

    And here's why:

    1) Privacy: I don't want to store my data and programs somewhere that I don't have ultimate control over.

    2) Accessibility: I want to be able to get at my data even when the network (say, to my ISP) goes down. Same for processing.

    3) Flexibility: Look how difficult it is to perform non-mainstream operations with proprietary software now. Now think about how much more difficult it will be when you don't even have your own copy of the software; you'll have to rely on your "Processing Provider" supporting 3D rendering or whatever.

    It really all boils down to autonomy. I want to do my thing, my way. And so do a lot of other people.

    Furthermore, there's no pressure for me to join the collective. If I need more storage or processing power, I can build a beowulf cluster out of all the PCs discarded by the lemmings.
    --
    "Please remember that how you say something is often more important than what you say." - Rob Malda
  • They exist. They are called UN*X workstations. All file systems are pushed out from a central server and the user has no admin privileges (except on the scratch drive). Applying patches or upgrades is much easier: one administrator can load the application/patch/upgrade on the server side and then push it out to the users. So in other words, one admin can make a piece of software available to hundreds or even thousands of users in a day or even an afternoon. Economy of scale was never what PCs were about in the first place.
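    (One concrete, hedged example of the "file systems pushed out from a central server" model: Linux-style NFS exports. The paths and hostnames below are invented for illustration; Solaris and other UNIXes use slightly different commands for the same idea.)

        # On the central file server, /etc/exports:
        /export/apps   *.example.edu(ro,sync)
        /export/home   *.example.edu(rw,sync)

        # On each workstation, mount the shared application tree read-only,
        # so an admin who updates /export/apps updates everyone at once:
        mount -t nfs fileserver:/export/apps /usr/local/apps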
  • I just want to add:

    the perfect example of why you don't want to rely on a network for your critical apps: The dreaded Slashdot effect.

    Anybody try to go to Symantec today to download a virus update for the latest viral atrocity? Forget it. Wait until next week. Now, I'm smart enough to resist the temptation of opening an unsolicited attachment while there are warnings flying all over the press and from every person I know who has a computer. So I can wait a week or two for the update to become "reasonably available". But when there's a mission critical application I need to use, it's gotta be local if at all possible. If I need to get something done, I don't want to sit on my ass waiting while 200,000 other users are pegging the server of someone I'm paying to rent per-use software from. I'm certainly not going to store data, encrypted or otherwise, on someone else's machine across the internet. Not only is it not secure (anyone can bust in and decode it at their leisure), it's risky. How does one handle a situation where your Internet Storage Provider says, "sorry, our system crashed and when we tried to restore your data from backup it was corrupted"?
    I know, all of these things can happen with data stored locally, but with local data, you are answerable to yourself, and you know exactly what is done to your data, and you are in charge of how well protected it is.


    "The number of suckers born each minute doubles every 18 months."
    -jafac's law
  • 2) Accessibility: I want to be able to get at my data even when the network (say, to my ISP) goes down.

    (This was the first one I wanted to comment on.) Sure; but you are locked into thinking in the present. The majority of people, even computing people, depend heavily on the power grid functioning (although someone said that in the US, this situation is worse than in Europe, where I am). The network could really get so reliable (erhm.. available) that you won't think about its downtime; of course, there will be people who will have backup "connections" (not necessarily physical lines) to different providers, and have local storage farms, just as some people have UPSes and/or generators.

    Privacy: it's just a matter of protocols. If the ISP can store a thousand times more data than my disks for the subscription fee (which I'll need anyway), there will also be protocols to do this storage in a way that really only you can access your data (not even the ISP).

    Autonomy: in this case, however, I also think you are right; everyone needs to notice that the only *good*, reliable parts of the internet are the ones that are *not* centralized. Slashdot is great, but freshmeat's mirroring system is much more reliable :) And ultimately, the domain registration should also change to a seemingly anarchic one (compared to the current one), and you know what? It will be better for *us*.

  • However, the ISP will have to buy the hardware to support that, along with the broadband connections, and the connections to the users' homes. In addition the ISP will have to keep upgrading to keep in line with the times (just imagine when a new game comes out that requires a lot of resources) and find a way to administer and manage all of these resources.

    Anyway, I think this is going to cause a large amount of pain with end-users (especially when they get those monthly bills) and administrators alike, and will take a long time to be implemented, if ever.


    Granted, but the technology is in its infancy. Think of what a PC is capable of now compared to ten years ago. Way back when, I could run only one app at a time. There were hacks (DESQview) that could multitask - kinda - and task switchers, but it was complicated and there was a lot of manual intervention/tuning required.

    Now with Linux we think nothing of running hundreds of concurrent processes, and it requires almost no user/admin intervention. Why? Because CPU cycles have become so cheap the software can do a heck of a lot more for us.

    Yes right now managing a SAN or keeping millions of broadband connections at 99.999% uptime might seem an impossible administrative chore but smart hardware will do most of it for us in the future.

    Look at the evolution in dialup ISP technology. The first ISPs were guys with an expensive, flaky T1 who crammed 16 modems onto a unix box and signed people up.

    Now we have network-manageable, hot-swappable, remotely upgradable modem pools. I have had dialup connections last for almost a week. Unthinkable even 2 years ago.


    -josh
  • I need to point out that most ISPs have a tough enough time just providing TCP/IP service. I don't think they are really prepared to handle all of the needs you outlined above. That sounds a lot like those SANs everybody has been talking about recently (Storage Area Networks, for those of you who haven't). It sounds great in theory, but what it boils down to is a way to convert hardware limitations into software limitations and admin overhead. The scheme you propose above allows the users to rent as much CPU time/disk space/memory/etc... as they want (and as much as the ISP has to offer that is not in use by the thousands of other people currently using the system), whereas before they would have had to buy the hardware. However, the ISP will have to buy the hardware to support that, along with the broadband connections, and the connections to the users' homes. In addition the ISP will have to keep upgrading to keep in line with the times (just imagine when a new game comes out that requires a lot of resources) and find a way to administer and manage all of these resources.

    Anyway, I think this is going to cause a large amount of pain with end-users (especially when they get those monthly bills) and administrators alike, and will take a long time to be implemented, if ever.
  • Go to Here [npr.org] to listen to the show. (requires RealAudio)
  • Try using your local computers if the power company screws up with billing and deactivates your 'account'.

    I am not talking about today, I am talking about the future. Things will presumably be much more reliable in the technological realm.

    Remember how hard it used to be to get on the Internet? Now any idiot with an AOL CD can be spreading email virii within minutes of sticking the CD in the slot.


    -josh
  • Darn them for not carrying ToTN Hour 2 in the Washington, D.C. area!
  • You don't have privacy now. The police can kick down your door and take everything, eavesdrop on your phone calls, etc...

    At least when they kick your door down and take your stuff, it leaves a fairly obvious sign that they did, in fact, take your stuff. This provides a fairly powerful mechanism to prevent widespread abuses of the power. When your data is on a server, they can just copy it and snoop through it without you even knowing. Hence, it is a lot more likely that the power will be abused if you can't tell it's being done.

    People would be mad as hell if every letter they got from the Post Office had been opened and read, but things like ECHELON [slashdot.org] prove just how easy it is to get away with the same things when you are dealing with bits instead of atoms.



    Encryption is the answer. And it keeps getting better/easier to use. It will make sure that your data can only be read by you whether or not it is stored locally.

    Provided the encryption is done on your end. If all your software is run off the server, it doesn't matter whether you encrypt the data that you store with unbreakable encryption, because they can just use a Trojan version of the encryption software to grab the data before it is encrypted (or grab a copy of your private encryption keys).
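
    (A minimal, hedged sketch of the "encrypt it on your end" point, using the standard Java crypto API: the key is generated and kept locally, and only ciphertext would ever be handed to a storage provider. A real setup would use a proper cipher mode with an IV and a way to back up the key; this is just the shape of the idea.)

        import javax.crypto.Cipher;
        import javax.crypto.KeyGenerator;
        import javax.crypto.SecretKey;

        // Client-side encryption: the data is encrypted locally, with a key that
        // never leaves this machine, before anything is handed to a storage provider.
        public class LocalEncrypt {
            public static byte[] seal(byte[] plaintext, SecretKey key) throws Exception {
                Cipher cipher = Cipher.getInstance("AES");   // default mode; fine for a sketch
                cipher.init(Cipher.ENCRYPT_MODE, key);
                return cipher.doFinal(plaintext);            // only this ciphertext is uploaded
            }

            public static void main(String[] args) throws Exception {
                SecretKey key = KeyGenerator.getInstance("AES").generateKey(); // stays local
                byte[] ciphertext = seal("my private notes".getBytes(), key);
                System.out.println(ciphertext.length + " encrypted bytes ready to upload");
            }
        }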

  • Network computing is like public transport; it's more efficient, convenient and rational, but....
  • -Games - Dedicated 3D rendering hardware is so cheap it can be embedded in $100 consoles - why not in TVs, VCRs, etc.

    And what is it about dedicated console gaming that sucks? It's targeted at the lowest common denominator of player; it's non-upgradable; it's non-networked; platforms are tightly, centrally controlled (Hello, Sony and Nintendo!)... I could bitch for ages about consoles, but the point is that the last thing I want to do is introduce these factors into my choice of a TV (and why do I want to play games using a TV's embedded hardware? The lifespan of the display is vastly longer than the lifecycle of a game platform.)


    -Surfing the Net - Your HDTV purchased for $1000 in the year 2005 will have this built in.

    Whoopee. And that will integrate with junkbuster coupled with my gig of shared local cache... how, again? That cache will reside locally on a... what? And I'll be able to script, automate, and customize browsing with... what? I'll publish my weblog and surfing habits... how?

    A navigation interface for browsing and information collection is vastly different from a navigation and preference interface for a large public display device (like a TV). Or will I tune the characteristics of my display using some awful local web form / Java applet / ActiveX control / whatsit?

    Sorry, but a TV with browsing functionality that I want *is* a PC with a really huge display. Never mind that I have no use for browsing in my home theater! Integration rocks, convergence sucks.

    -Local storage (I gotta lotta stuff and I don't trust it on the internet) - So you buy a storage unit for your local wireless network. The minute you plug it in it announces its presence to all networked devices and voila, your VCR has a place to store movies, your TV a place to archive web pages, etc...

    Fair enough. That's what I do with a PC server now. I rack up a bunch of disk in a box, throw it in the back room and hide it behind the couch. Every other networkable device uses it for persistent storage (even the Sun3/50 and the Amiga 500!). The household gig of squid sits on it.

    Every CD I buy gets encoded to MP3 and stored on the disk array. It serves all but one audio output device in the house (the theater -- and only because I don't want disk & fan noise from a PC in there; I'm likely to fix that with a diskless MP3 client of some kind). If storage were cheap enough, and closed formats weren't, I'd archive my DVD collection, too, and stream that around the house.

    I may not capture video to it, but that's partly because I don't capture video. (I don't even own a VCR!) If I wanted to store video, that's where it would go, then spool it out to DAT on an LRU basis of some kind.

    A nitpick: if I'm that concerned about not sharing my local storage, the hell if I'm going to translate it to RF broadcast and squirt it around the house!

    But why do you need local storage if your ISP can offer terabytes of movies/audio/programs all over a broadband connection?

    Because I barely trust my ISP to get packets to me, let alone store my data. I want my data on media I physically control, with access control at my sole discretion. Period.

    Video/audio on demand is a waste of my time. Why should I download (even "instantly") data that I can store locally on LD, DVD, disk array, or whatnot? Even assuming infinite free bandwidth, the "I own this" factor is pretty strong here, too. This is the reason Divx failed -- people pay money to own something, and they own it. A transaction-based universe is overrated and assumes some things about privacy that I don't particularly like.

    -Word processing, DTP, graphic design (general purpose applications) - broad band will most likely make it possible to rent the usage of such apps from your ISP and run them using a thin client. Might sound expensive now, but hardware and bandwidth prices keep falling. Do you want to buy photoshop with x plugins for $1000, or rent it at 50 cents an hour?

    I apologize, but I could poke holes in the vision of a transaction-based outsourcing world all day. For one, I barely trust my ISP with my traffic, let alone with renting functionality to me. Why rent Photoshop at 50c an hour when I can "apt-get gimp" and be running in 20 seconds, for nothing?

    For that matter, suppose I'm cleaning up my nude girlfriend snapshot library (assuming you don't banish secure, physical local storage with your fantasy of infinite bandwidth to outsourced servers, of course), do I want my usage tracked in that way?

    "Oh, look, he's fired up Photoshop to touch up his girlfriend's nipples again!"

    Do I want to rent an email client every time I interact with someone? What about PGP? Do I have to rent time on some master server to decrypt my mail? And is it then somehow transmitted to my thin client in a trusted way?

    Assuming the existence and prevalence of Universal Thin Client Hardware (a long, long shot to say the least!) that runs whatever software I'm renting, that still isn't about the death of the PC in the first place. Thin clients have certain tasks they're great at (manipulating data on remote servers using rich interfaces) and some things they suck at (being a PC and doing PC tasks).

    Finally corporate drones will stop printing their email and schedules and just be able to carry their damned computer everywhere they go.

    That would rock. I'd love to see that.

    But mobile clients have limits given their lack of local storage, the requirement for wireless networking (which doesn't work well over long hauls, from in caves, on airplanes, or with data you don't want transmitted), and (presumably) limits on integration with other devices.

    Compiling, rendering, other processor intensive tasks - Two words: server farms.

    I'm with you 100%. Infinite bandwidth will make it much, much more cost effective to buy or borrow CPU time for intensive tasks from somewhere else.

    Although if I'm rendering nudes of that girlfriend, I might still want some local CPU...

    Some dork will always want a tremendously expensive box that does it all, but it will become less and less economically justified when most of the common devices in your home do 90% of your silicon based processing already, and the other 10% can be cheaply bought on demand.

    I want a small number of cheapish boxes that are highly flexible, which sit at the center of a network of even cheaper dedicated devices. I want to have a high degree of control and flexibility over the application-specific satellite units through a powerful programmable system. I want my satellite devices to be able to access the greater resources (local storage/caching/connectivity/IPC and device coordination) of a central network of servers and fat clients.

    Your home can be a network of microsmart devices that talk to each other. Mine'll be a network of smart, programmable PCs that integrate with dumb dedicated devices.

    You distribute your computing tasks -- I'll distribute usability, accessibility, and flexibility.

    I agree that a lot of what you say'll happen will happen, and probably in less than the 30 years you quote.

    All I'm saying is that I like my way of doing things better.

    -josh

    Nice name. =)
  • It seems to me that, at least in the short term, the generality of the PC is ending, with the PSX2 being as powerful as a PC without all the hassle, a neon chip being built into a DVD player for as little as $30 extra allowing browsing the web, and tablets being used for data storage/manipulation.

    I'm sure the PC will survive in its workstation form for a long while yet, but its appearance and function will surely change completely in the long term (in the end, a PSX2, a neon DVD player, and a tablet are all just PCs designed for specific tasks).
  • > What do you need a PC for now?

    A better question would be "What do you need a TV/VCR/phone/fax/whatever for now?"

    Why should I clutter up my home with inflexible, single-purpose electronic doohickeys, when I can get a few PCs to do it all? Who needs a TV when you can put a tuner card in a computer, and use it for something useful when you aren't watching the tube? Why get a VCR, when DVDs are available?

    A computer is an all-purpose machine. No matter how many special-purpose devices you build, they won't provide all of the possibilities that you have with a fully-functional computer.

    Besides, who wants a lobotomized PC, which is basically what all of the NetPC/WebTV devices really amount to?

  • Listening to it now... about 19 minutes into the show, Philip Greenspun [photo.net] called in. He's written a book called 'Philip and Alex's Guide to Web Publishing' [photo.net], available for free on the web at the previous link. This is probably the best text on creating high value community sites and web applications such as www.scorecard.org [scorecard.org] using open source software.
  • I know, all of these things can happen with data stored locally, but with local data, you are answerable to yourself, and you know exactly what is done to your data, and you are in charge of how well protected it is.

    Hmmm... seems like you should go join the survivalists who don't trust anything they don't make or kill themselves. You depend on a remarkably intricate web of services as it is.

    To answer your 'slashdot' critique: there will be QoS guarantees for critical apps, and apps will be distributed. Sure, crappy providers might skimp on server hardware or oversell their service, but this is no different than any other service you rely on today.

    -josh
