
The Greatest Battle of the Personal Computing Revolution Lies Ahead

As tablets and computer-phones flood the market, the headlines read: "The Personal Computer is Dying." But they are only half true: an artifact of the PC is dying, but the essence of the PC revolution is closer to realization than ever before, while also being closer to loss than ever before.

Certainly one way to define the Personal Computer stems from the era of the IBM PC: a gray box with a monitor, mouse, and keyboard (or a laptop). But the idea of the Personal Computer dates back quite a while — back to Alan Kay's Dynabook, the Lisp Machine, etc.

The Apple Knowledge Navigator provided a vision of personal computing far more dynamic than that dull gray box. Although still a pale imitation, tablet and phone platforms are beginning to look awfully similar.

The essence of those pre-PC Personal Computers was that of the user controlling the device. You control the data, you control the software; the Personal Computer is a uniquely personal artifact that the user adapts to his own working style. One consequence of this is that creating content is as easy as (perhaps easier than) consuming it. Another nice side effect is that your data remains private by virtue of local storage.

In many ways, then, a tablet or phone comes significantly closer to a personal computer than that dull gray box under your desk. For example, on Android, the screen ceases to be a place to throw icons and becomes a rich canvas of widgets. Additionally, my phone fits into my pocket and is always there. Ubiquitous cellular coverage gives me access to my data from most anywhere. The touchscreen and interface conventions make direct manipulation shine in a way you just can't get from a screen two feet away on a desk.

And those are just superficial improvements over the desktop. Albeit tied to proprietary services, Google's voice search and Siri are inching closer to the dream of personal Intelligent Agents reminding us all that our mothers called us earlier today and want us to pick up the birthday cake for the surprise party. With a few taps I can search basically all of my data, not to mention the collective knowledge of mankind.

But the software running on these devices has a dark side. Want to access your music collection on the go? You have to get it from Google Play. Want to have lightweight instant messaging? You have to use GTalk. Or take ebook readers (certainly personal devices): that book you just downloaded to your Kindle is DRMed and stuck there! That intelligent agent? Apple records everything you bark at her and can take her away at a moment's notice.

Furthermore, the software on these devices is geared almost exclusively toward content consumption. You can listen to music all day long, but don't try multi-track recording. That ebook reader is great for reading, but you can't scratch notes in the margins of any of your books or sit down with one and scrawl out your latest manuscript. Clearly, some of this stems from the youth of these new systems, but it is distressing to see them geared first toward consumption (the Newton, for example, was designed from the start as a device for creation).

The "cloud" as implemented by Amazon, Google, Apple, et al. is a distinct threat to the personal computer. Loss of control over our own data is perhaps the worst part of the cloud. We're easily seduced by genuinely useful features like access to our contacts and music from any device without having to manually sync anything. It's certainly more convenient to purchase a digital movie on Amazon Prime than to hunt down a DVD, and Netflix is definitely nicer for most people than cable television. But when you buy a movie on Amazon, you don't really own it.

Underlying many of these cloud services (especially media-related ones) is Digital Restrictions Management. Whether it be the files themselves or the protocol used to transmit data, DRM is used to control what you can do with your data, restricting even what programs you can use to interact with seemingly neutral files. Worse, networked DRM services can and have led to lost data when it is no longer profitable for the company to run the verification servers.

The only copying that DRM discourages effectively is the sneakernet. And, given that the sneakernet has existed since recordable media has existed, it doesn't seem like the sneakernet is really much of a threat to creative business. I might lend a friend a CD (or even let her copy a few files), but just as I don't unwrap that CD and torrent it through The Pirate Bay, I'm not going to download a movie from Amazon and do the same. There's really no incentive to do so, for most people — most people pirate because that's what you have to do to get the media you want, not because you have a compulsive desire to share things with your closest 10,000 friends.

In order to prevent what is effectively sharing between actual friends, pushers of DRM-infected data want us to completely cede control of our own data!

And they have made people accept it: Steam, Netflix, and Amazon Prime are wildly popular. All of those services are great ideas, but all of them treat you as if you were a criminal.

Worse yet, the spread of Software-as-a-Service is returning us to the bad old days: that powerful PC in your pocket is quickly becoming no more than a glorified terminal. The open peer-to-peer network is being subverted from an enabler of collaboration never before seen into yet another scheme to tether users to proprietary, centralized services. And, as SaaS expands, privacy recedes. No longer is it implicit that your documents are yours alone; now you write and store things in Google Docs with no legal expectation of privacy, however much privacy you may expect. Amazon knows what you read; Netflix knows what you watch; Google knows what you visit.

Control over the programs you run, and more importantly can write, is key to a personal computer being personal. It seems absurd that such a right might be taken away, but behold: the iPhone and, soon, the Mac App Store are these mythical walled gardens. You have to subvert your device to gain real control! And the natural path for Apple is to restrict Macs just as it restricts iOS devices.

And so we are all-too-near an Orwellian nightmare where vendors dictate what we can do with and how we can use our own data.

But what about the hardware itself? It could be argued that a device isn't really personal for some set of people if they can't change all of the software. Here too we see some promise, and some pitfalls.

The shift to tablet and phone hardware has meant a shift from x86 machines running PC BIOS to thousands of ARM boards, each with its own peculiar way of being programmed. Things you take for granted on x86, like being able to even boot, require custom code. And let's not even begin talking about all of the DSPs and co-processors. Vendors aren't always forthcoming with documentation for their boards, and, even worse, those that do port Linux to their hardware often blatantly violate the GPL and do not distribute kernel sources. This restricts the utility of perfectly fine hardware: often to the detriment of the user and to the benefit of the manufacturer.

Anyone who finds they can't upgrade to the latest version of Android because their vendor won't support it, and the community cannot support it because of non-free drivers, knows what losing control over their hardware is like (RIP HTC Dream).

It might seem like a minor setback ("I guess I have to buy a new phone"), but the lack of specifications or support marginalizes alternative operating systems. There's Meego, Tizen, Open webOS, Firefox OS, SHR, etc., but experimenting with them on your device is a non-starter. Imagine if the x86 were so closed (something we may not have to only imagine much longer): it is doubtful that GNU/Linux or the multitude of "alternative" OSes would exist (Atheos, Haiku, L4Linux, even the Hurd). Ever more closed hardware is putting us into a position where two or three companies will dictate everything about the computing experience going forward, with no room for freethinking tinkerers to revolutionize how we interact with our devices.

We are staring at a bleak future, and living in a bleak present in some ways. But there is hope for the battle to be won by the Personal Computer instead of the Terminal.

The Internet is not yet merely glorified cable television. Hypertext, email, instant messaging, trivial file transfer, etc. have revolutionized how mankind communicates (understatement of the decade). Once upon a time the dream was that everyone would be a first-class netizen: your IP was publicly routable and, with a bit of know-how, you had a server. Instead, thanks to grossly asymmetric pipes and heavy NATing, it is rare for any individual to run their own servers; we turn to Google, Amazon, et al. and cede control over our data.

But now broadband connections are spreading fast (I've gone from 100Kbit/s to 2Mbit/s upstream in three years just with basic service), IPv6 is really here, and software is being written to challenge the centralized "cloud" model being pushed on us from above.

We've had a few victories already: SMTP is still in use, XMPP is the dominant chat protocol (and IRC refuses to die), RSS/Atom aggregation decentralizes news, and the core network protocols are developed in the open.

But Google still controls Android, and myriad services control your data. Part of this is because they have easy client and server interfaces; sure, you could run Gallery2 and WordPress on your own server, but I can just snap a photo on my phone and it's up on Facebook 40 seconds later (well, if their app worked, it would be).

Luckily, there are people working on easy-to-use "cloud" services of our own; in particular, ownCloud. ownCloud provides a framework for hosting and syncing data between your devices and for sharing data with others. The important part is not so much the central server as the clients they are writing. Eventually, it should be possible to replace, e.g., Google contact/mail/calendar sync and Google Drive, while adding these features to the desktop. Integration into KDE is already underway.

Imagine: instead of being tied to Google, you could move the central server to the hosting provider of your choice (or pack up your data and move it to greener pastures if you're not running your own). And, perhaps more subtle (but the real liberation): your data would be freely movable between all operating systems (interesting that you have to jump through hoops to sync your Gmail contacts with anything else, and abandon all hope, ye who want to share between an Apple device and anything else). Additionally, the server is designed to respect your privacy (you can, e.g., store only encrypted data server-side).
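
Because ownCloud speaks standard protocols (WebDAV for files, CardDAV/CalDAV for contacts and calendars), even a small script can talk to your own server with no vendor lock-in. The following is only an illustrative sketch: the server URL, credentials, and remote path are placeholders, and the exact WebDAV endpoint can vary between ownCloud versions.

    # Hypothetical example: push a file to, and list, your own ownCloud storage
    # over WebDAV. The server URL, credentials, and paths are placeholders.
    import requests

    WEBDAV_URL = "https://cloud.example.org/remote.php/webdav"  # assumed endpoint
    AUTH = ("alice", "app-password")                             # placeholder credentials

    def upload(local_path, remote_name):
        """Upload a local file to your ownCloud storage with an HTTP PUT."""
        with open(local_path, "rb") as f:
            resp = requests.put(f"{WEBDAV_URL}/{remote_name}", data=f, auth=AUTH)
        resp.raise_for_status()

    def list_root():
        """List the top-level folder with a WebDAV PROPFIND request."""
        resp = requests.request("PROPFIND", WEBDAV_URL, auth=AUTH,
                                headers={"Depth": "1"})
        resp.raise_for_status()
        return resp.text  # raw XML multistatus listing

    if __name__ == "__main__":
        upload("notes.txt", "notes.txt")
        print(list_root())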

On the hardware side, projects like Firefox OS are very important: having a "mobile" Free Software OS developed in the open might be essential when the dominant open platforms are developed monolithically by corporations with no interest in protecting user control of data.

And for developing the next generation of devices, folks like Rhombus Tech are pushing for interchangeable CPU boards for embedded devices, and the FSF is expanding its focus to include open hardware.

There are two serious threats that would undermine any resistance: IPv4 exhaustion and draconian content policing. The former issue is technical and likely to solve itself: in the long run multi-level NAT would be too costly, switching hardware will be replaced as it is obsoleted, etc. The latter is political and represents the most serious threat of all. If we cannot communicate freely and the pipes are owned by the very organizations whose business interests will be harmed... we've already seen how brazen the current IP regime can be, and it will take vigilance on the part of many to prevent them from having their way.

Where will we be in ten years? If Google, Amazon, Apple, and Old Media get their way, in a new dark age of computing. Certainly, you'll have a fancy tablet and access to infinite entertainment. But you will own nothing. Sharing data will be controlled by a chosen few entities, the programs you can run or write will be limited in the name of security, and privacy will be dead.

History shows that personal computing survived despite Apple and Microsoft in the 80s and 90s. So, I'm hopeful that other forces will win: the forces of Free Culture and Free Software. If they succeed (or are at least not crushed), the future is much brighter: most content will be available DRM-free, users will control their computing environments, and the egalitarian promise of the Internet will be realized (in no small part thanks to IPv6).

  • by Kenja ( 541830 ) on Wednesday October 24, 2012 @12:04PM (#41753207)
    Remember when the Palm Pilot and Apple Newton heralded the "end of the PC era"?
    • by jellomizer ( 103300 ) on Wednesday October 24, 2012 @12:42PM (#41753657)

      The Palm Pilot and Apple Newton never achieved the success of the iPad or the iPhone.

      The problem with comparing those past systems is that, while they look similar, the new feature of multi-touch is the real game changer.
      Before, we needed a stylus or one finger just to push a button on the screen, and things such as zooming in were very clumsy. The simple feature of pinch-to-zoom is a massive game changer. During the Newton and Palm Pilot heyday, PCs were in a get-a-really-big-display phase: 17", 19", 21", as big a CRT as could fit on your desk. Why? Because you had so much information you wanted to view, but smaller screens didn't have the resolution or were simply too small to show it. For the most part we only focus on a couple of square inches of the screen at any moment, but using the mouse to scroll and zoom was choppy, which meant you needed a desktop if you wanted to get real work done. With multi-touch you can scroll and zoom much faster and more naturally than before.

      The next problem during the Palm era was that we didn't have CPUs good enough to do the job. During the Pentium 2 era, your Palm Pilot had the power of an 8088 (a 10-year gap). Today we are closer to a 5-year gap, and our need for personal processing power has diminished. We can play a movie in high definition on our phone and it will run smoothly. Programs are responsive and quick. While not as fast as the desktop, we are by no means suffering.

      The third problem was network infrastructure. The old devices needed to be synced with a PC. Today they are self-updating and work by themselves without the need for a PC. And they have wireless internet, which means they are actually handy if you want to look something up.

      The fourth problem was culture. Technology gadgets were not cool back in the late 90s; you would have been a major nerd or geek, in the negative sense, if you were caught using one. Cell phones getting smaller and cheaper meant more popular people were getting the technology, allowing high tech to become more common among the "normals".

      We had a bunch of horseless carriages designed before the Model-T too.

      It just needed the right situation to get them to kick off.

      • But the reason Palm wasn't the end of the PC era is the same as the reason the tablet (you seem to like Apple) won't be either. They are toys. While the PC can be used for entertainment, its primary purpose is as a tool. While I'm sure there are use cases you can come up with where a tablet can be used as a tool... there were for Palms as well, the fact of the matter is, real work is done on PCs. And will continue to be done that way for the foreseeable future. Will desktop PCs go away? Eventually... but t
        • But I see people in meetings at work all the time with tablets in pretty cases, furiously typing into them (they are all iPads here)... are you saying that those are just toys and those people aren't doing any real work?

          Wait a second.....where do you work?
        • by gtall ( 79522 )

          I think that the number of PCs necessary for "real work" is a lot smaller than the current market. This is a major reason why pads, pods, and phones are eating into PC sales. Some tasks definitely require PCs, but you don't need one to cruise the web, look up phone numbers, and do a host of other small things for which businesses and consumers bought PCs. And as the follow-up below me indicates, some real work can be done on these small devices.

          • by Mephistophocles ( 930357 ) on Wednesday October 24, 2012 @05:01PM (#41757135) Homepage

            This is a major reason why pads, pods, and phones are eating into PC sales.

            Maybe, or maybe it's because PC sales have been over-inflated since sometime in the 90's. Just my opinion, but I think we should step back and take another look at this - TBH I'm not entirely sure that what we're seeing is such a bad thing, and I suspect it's completely natural.

            Most people aren't tinkerers, inventors, hackers, or scientists. Most people aren't curious about their world, investigative of the way things work, interested in science, or even all that intelligent. Most people don't have a scientific mind or any desire whatsoever to use technology for any more than canned entertainment (which, by the way, is also what they use most of everything else in their lives for). Not because they're inherently a sub-species - but because they just don't care. All respect to the author of the original article above, but I think he's missing something important - PC's are losing ground to tablets, etc in the market because most people don't have any use for PC's - and they never really did. PC's were always FAR more complicated than they were able to appreciate or take advantage of - and they don't have any use for them because they can perform their stupid, meaningless, and irrelevant tasks quite adequately on a phone/tablet/whatever. They also don't give two farts who legally owns the movie they just paid $9.99 for, as long as they get to watch it right now, and have never heard of DRM (and wouldn't understand it if you explained it to them). Jobs was a genius - he understood that, and by simplifying the PC down to a glorified toy, he knew that the entire world would throw their money at him - and they did.

            And I say, more power to them. I don't care. Let them do their thing, and let Apple and Google and Amazon make bank off of them. Big deal. Me? I'll always have a PC of some kind, and a hundred other hacked and frankenstein'd gadgets - because my nature isn't to just consume, it's to create and arrange things to make them better. It doesn't really matter to me if Apple quits making Macbooks, or Microsoft quits writing operating systems that work on regular computers - at worst, it's a minor inconvenience, because I and many others like me will step in to fill the void - just as we created the beginnings of all this stuff to begin with, way back in the 80's and before. This move toward DRM and non-ownership of public entertainment is meaningless. Jobs and Gates and the rest took what was originally created and commercialized it; made it accessible to the masses - and the masses, because they don't know better or don't care, will eventually be controlled by draconian corporations or governments, or both. Those of us who care enough to invent, create, and make the world a better place, will not.

            • Most people aren't curious about their world, investigative of the way things work, interested in science, or even all that intelligent. Most people don't have a scientific mind or any desire whatsoever to use technology for any more than canned entertainment (which, by the way, is also what they use most of everything else in their lives for). Not because they're inherently a sub-species - but because they just don't care.

              In fairness, it doesn't require a computer to be "curious about the world," "investi

        • by Tsingi ( 870990 )

          But the reason Palm wasn't the end of the PC era is the same as the reason the tablet (you seem to like Apple) won't be either. They are toys.

          The Palm wasn't interfaced to a powerful network of apps and data. Today's tablet is an interface to a much more powerful set of tools.

          A tablet will never be enough for me, but it will suffice for most people. I like using tablets to interface to my own systems.

          I think you are wrong.

        • by Yaa 101 ( 664725 )

          But tablets and other toys will push the PC back to its early-days price tags, because most people only want the toys, especially now that they have to work all day on a shitty PC at their job.

          Not killed outright, but an inability to buy due to price leads to the same outcome.

      • Comment removed based on user account deletion
    • by tatman ( 1076111 )
      Those were great products, just too far ahead of their time. This go-around, I think they may be right. The market is different, costs and benefits are more in line, and the general population is much more savvy than when those devices came out (especially the Newton).
    • by Jawnn ( 445279 )

      Remember when the Palm Pilot and Apple Newton heralded the "end of the PC era"?

      I do. It was a stupid statement then, and it's just as stupid to suggest that tablets will be "the end of the PC" now. To be sure, tablets are going to replace PC's in a lot of places, but anyone whose computing tasks involve any serious amount of input knows that a tablet is a very poor substitute for a keyboard. The same can be said for those tasks that need multiple displays, etc. Those users will absolutely not be replacing their PC's with tablets.

  • Walled gardens... (Score:5, Informative)

    by crazyjj ( 2598719 ) * on Wednesday October 24, 2012 @12:04PM (#41753209)

    Great for Alzheimer's patients, criminals, and little kids. Not great for free adults.

    • It's great for power hungry CEOs as well...

      • It's great for power hungry CEOs as well...

        Jail cells are not walled gardens. The resemblance is superficial at best.

    • The whole idea of PCs started as a way to get away from walled gardens. It seems Microsoft, et al. have forgotten that the very reason for their existence is that people want more freedom than what you get from a rented terminal client. Cloud computing would be a different story if people needed more computing power than what they could reasonably afford to own in full; however, the exact opposite is the case.
      • by mikael ( 484 )

        The problem is that virus writers are coming out with 100,000 plus variants each day. The IT industry is coming to a point where a white-list of permitted applications vs. a black-list of malware is going to be the only way to download safe software. Then the malware battle will shift over to application plugins, just as it did with web browsers.

        • You can have trusted software sources without a walled garden. In a walled garden, the vendor simply gets to choose the trusted sources instead of you.
        • The problem is that virus writers are coming out with 100,000 plus variants each day.

          The solution is to create less exploitable OSes and to make resetting a system to its factory state less painful. Walled gardens are a cheap way to achieve the first, do nothing to help with the second, and carry the cost of making hacking and innovation (i.e. disruptive technologies -- like the PC itself) more difficult.

    • Re:Walled gardens... (Score:4, Interesting)

      by ErikTheRed ( 162431 ) on Wednesday October 24, 2012 @01:09PM (#41754095) Homepage

      Also good for people who value their time (not having to worry so much about fraud and malware, research, etc.) more than their ability to do things with a device that they would never bother doing anyway.

      It's perfectly fine for tinkerers on Slashdot to have the opposite preference and express it verbally and in the market with their purchases, but to presume that their preference - which is shared by an extremely small minority of people - is ideal for everyone else is a bit silly. I fully support people who want to tinker - I used to be that way myself. But as I've gotten older my interests have shifted and I simply don't want to spend my very limited time on vetting everything that goes into my mobile device, and the limitations imposed by the "walled garden" don't really affect my interests. It's a simple trade-off.

      • Re:Walled gardens... (Score:5, Interesting)

        by CastrTroy ( 595695 ) on Wednesday October 24, 2012 @01:24PM (#41754295)
        I agree with you here. But I think that Android has reached a better balance between walled garden, and letting the user run whatever they want. I like that I can easily go and buy apps for my phone from a reliable source. But I also like that I have the option to install third party software, or develop my own software. I realize that for most people, you can't have it both ways. Give them an inch, and they will take a mile. Give them the ability to install software from unknown sources, and they will install all kinds of crap software which will wreak havoc on their system. The only thing to really stop most users from doing this is to outright refuse to run software from unknown sources.
      • Also good for people who value their time (not having to worry so much about fraud and malware, research, etc.) more than their ability to do things with a device that they would never bother doing anyway.

        Only if those people are content with things staying the way they are i.e. if they do not want the next technological revolution to occur. Disruptive technologies do not happen in walled gardens; that is the point of walled gardens, to protect their curators from the fate that buggy-whip makers faced.

        The World Wide Web could not have happened in a walled garden; if everything was locked behind walled gardens in the late 80s, we would have never had a web, we would still be using online services and the

    • Great for Alzheimer's patients, criminals, and little kids. Not great for free adults.

      So long as the Alzheimer's patients, criminals, and little kids are in the majority, the free adults aren't going to get what they want.

  • RTFA (Score:5, Funny)

    by l810c ( 551591 ) * on Wednesday October 24, 2012 @12:05PM (#41753235)

    Do you really expect us to read all of that?

    • by Anonymous Coward

      LOL heaven forbid an article be more than a snippet and someone express a full thought rather than a catch phrase

    • Re: (Score:3, Insightful)

      by Baloroth ( 2370816 )
      TL;DR: more predictions of locked down devices, death of personal computing. Same predictions that have been made for decades now. Keeps not happening, because DRM doesn't work, and locked down devices don't do what people who actually use them (as opposed to just play with them) need them to.
      • by Kjella ( 173770 )

        locked down devices don't do what people who actually use them (as opposed to just play with them) need them to.

        Locked down devices do exactly as much as the people who sell them want them to do, everything from a G-rated kids' tablet to fully automated and unvetted signing with only a banhammer lurking. I think you meant to say "The current locked down devices don't do what I want," because what that means isn't fixed, and most people get a lot of "real work" done on business computers even more locked down than Apple's, with no jumping on the app store and installing random software there. In fact you're the equivalen

    • by mrbene ( 1380531 )
      Come on, this isn't the "TOS-DR [slashdot.org]" article!
  • by alen ( 225700 ) on Wednesday October 24, 2012 @12:09PM (#41753293)

    long ago, cars were sort of open, and then things went to a vertical system where the manufacturer designs and manufactures the car with self-made or custom-made parts.

    computers are going the same way.

    most of us have better things to do than some of the nonsense in this article. $7.99 for netflix is a nice deal for what you get.

    • by HWguy ( 147772 ) on Wednesday October 24, 2012 @12:19PM (#41753429)

      This. At least for the general public. The whole idea of a "computer" is simply a result of how primitive they are: the software that controls them requires the user to understand concepts such as operating system and application, networking and device drivers. People don't really ever want to know they are "running a word processor" or "launching a web browser". They want to accomplish specific things, like writing a note to (or video chatting with) a friend, looking something up, or watching a movie.

      The technical crowd loves to complain about Apple's walled garden, but this is exactly the genius of Apple. They get that. They get that they have to evolve the thing called a computer into a thing that people don't ever have to fiddle with. That simply exists to provide useful services for their life. The other computer manufacturers understand that to a smaller degree and then wonder why their tablets aren't as successful.

      The personal computer, as technical people know it, is going away. It's growing up into what the vast majority of people really want. And thank God. I'm glad I don't have to stand in front of my car turning a crank to get it running.

      But all is not lost for technical people. There will always be ways to have your own device. The free software and maker movements will ensure that. In some ways things are better today than ever. In the 1980s (some consider the heyday of the open personal computer) we had the 8-bit IBM PC. Today we have a gamut of programmable devices ranging from Arduinos to $35 Linux computers to set top boxes to multi-core, multi-CPU computers more powerful than supercomputers of the last century. All totally accessible.

      • The technical crowd loves to complain about Apple's walled garden, but this is exactly the genius of Apple.

        Apple is only able to create a walled garden thanks to layers that have been built before by the tinkerers and technical folk. So I think that while Apple's strategy may work well in the short term, it will likely be their downfall long term.

        When you create the walled garden you allow developers to focus on apps, but exclude them from the areas that may have a large impact. Apple needs to do it themselves for the newest innovations. That fancy new, revolutionary FS or networking will need to be ported. Or t

        • by gtall ( 79522 )

          Developing on a Mac is not developing in a walled garden as you seem to think. Apple knows full well what it takes to develop apps for their garden; they aren't so stupid as to cut developers out.

    • At least when it comes to software and computers, innovation will happen at a glacial pace. Look at cars -- the same basic design, the same set of features, and only moderate improvements in fuel efficiency (and only when the government demands it). You have the big corporations designing your car, making the parts for it, and deciding who is allowed to do maintenance; now just shut up and go about your life, because cars are always going to work this way.

      Is that really what you want to see happen with
  • by tomhath ( 637240 ) on Wednesday October 24, 2012 @12:09PM (#41753297)

    The touchscreen and interface conventions make direct manipulation shine in a way you just can't get from a screen two feet away on a desk.

    Maybe for some people...personally I prefer a couple of big monitors in front of me.

    • Comment removed based on user account deletion
      • by mikael ( 484 )

        Some laptops have touchpads built in. My old laptop has an area of space in front of the keyboard dedicated to the touch pad and press buttons which is the same size as a smartphone. It would be very easy to modify a laptop so that the docking/charging port for a smartphone would be in this area.

  • I have a 32 GB SD Card in my phone. The reason? Because in Oregon, cellular isn't ubiquitous. And because I can keep my entire 2GB music collection, plus several books, plus a bunch of other apps that don't need the net, on it.

  • by hack slash ( 1064002 ) on Wednesday October 24, 2012 @12:11PM (#41753319)
    There's no fuckin' way I'm ditching my mouse and keyboard for a touchscreen.
    • Yes there is, and here's how:
      1. Make sure no company makes new keyboards and mice.
      2. Render the old keyboards and mice unusable by making your new computing devices have no place to hook up your old keyboards and mice.

      Is this plan taking away consumer choice? Hell yes. But that doesn't mean it won't happen.

      • Yes, because Douglas Engelbart had to buy his mouse from Logitech.
    • Comment removed based on user account deletion
      • So you've pre-ordered your Wii U then, right?
      • I had a small trackball for my Amiga which I stuck to the keyboard; it was convenient to have it there, but I never found it as quick / precise / easy as a mouse, which I have yet to find a better alternative to for GUI control.

        Touchscreens certainly have their place (phones/tablets etc.) but they are no match for a keyboard+mouse in many situations - just ask the gamers.

        The laptop & USB touchpads with multi-touch are pretty nice to use; two-finger scrolling instead of using the edges of the pad, three
  • by peter303 ( 12292 ) on Wednesday October 24, 2012 @12:13PM (#41753349)
    I've been in [personal] computers for over 40 years, seeing them go from the kilobyte/kiloflop era to the threshold of the terabyte/teraflop era. There have been both surprises and disappointments at every turn. I don't see why this would not continue for another 40 or 100 years.

    My two biggest mispredictions were:
    (1) In the mid-70s I wondered why anyone would buy a store-made computer. They were so fun to solder together yourself.
    (2) The sudden rise of the World Wide Web in 1993. Everyone knew cyberspace would eventually happen, but probably in another decade or so. That was a huge victory for open source: thanks, Tim!
    • by jbolden ( 176878 )

      My worst prediction: I figured built-in indexing (Gopher) was too valuable to give up and HTTP would thus remain a niche protocol mainly for graphics heavy content.

      • > HTTP would thus remain a niche protocol mainly for graphics heavy content.

        You were right. However, graphics heavy content is the format strongly preferred by humans.

    • You haven't seen my beige tower. My tiger once went completely nuts and scratched it all over.

  • by Isaac Remuant ( 1891806 ) on Wednesday October 24, 2012 @12:15PM (#41753375)

    Excellent read. Thanks.

  • by NixieBunny ( 859050 ) on Wednesday October 24, 2012 @12:16PM (#41753387) Homepage
    The 'real' PC, used for design and engineering work, is not likely to go away any time soon, as all our technological advances would grind to a halt without it.

    The PC will get more expensive as the sales volume goes down from hundreds of millions to hundreds of thousands of 'real' computers per year. But then, those of us who use PCs for real work have been riding the coattails of the gamers for a decade now.
    • So do I, but it's a Fujitsu Stylistic Tablet PC w/ a Wacom digitizer and pressure-sensitive stylus. I draw, sketch, create plans for woodworking projects, design typefaces, do some light programming and typesetting using (La)TeX and keep several decades worth of notes on it. The slate form-factor is well-suited to design work, and it's nice to be able to do this pretty much anywhere (Fujitsu has a history of offering daylight-viewable transflective displays, which my ST-4121 has). It also makes a very nice

  • This is the most significant concern raised by the article, and I think it's legitimate. That's why I continue to buy backup drives and keep my data local (except for the backup at my friend's house.)

    At a minimum, we need warranted Service Level Agreements with cloud providers, that include guarantees with penalties when access to their services (cloud based apps or data) fails. "Sorry about that, we won't let it happen again" ain't good enough.

    • This is indeed the original reason for Unix. Your data is more important than the technology that stores and mangles it.
  • Up until recently, there was one style of computer, the classic desktop box

    It had many diverse uses

    Some used it as an embedded controller

    Some used it for CAD design, video editing, music production, science, etc

    Some used it to read email and surf the web

    Since there was only one style, lots were sold, so they became very cheap

    Now, we see the market segmenting

    Many people can have their needs met by a smartphone or a tablet, but not all

    Some, like CAD designers, video editors, music producers, s

    • by tlhIngan ( 30335 )

      The bad news for us is that since the masses will probably move to the alternate devices, volume will go down on traditional computers

      This means prices will rise

      Is it a terrible thing?

      I mean, the problem is the "general public" cared about cost and we ended up in a race to the bottom, where margins are thin and we're seeing the results in low-res screens, integrated graphics, and basically a lot of sameness as everyone builds to a price.

      Let prices rise a bit - clear out the low end crap. If you wanted a dec

  • by Jackie_Chan_Fan ( 730745 ) on Wednesday October 24, 2012 @12:26PM (#41753505)

    The PC will never die. These attention seeking whores are fucking technology morons. They use their computers for facebook, jerking off and youtube. Computers are more than a jerk off machine and a twitter device.

    Yes, for the average idiot who was destined to sweep up shit for a living, they probably don't need a real-deal PC workstation... because they'll never create or do anything.

    PCs are for people who USE PCs. PCs are for people who work, create, manage, code, program, animate, draw, paint, record, do research, study... PCs are for real users. The general public doesn't need a roof ladder, but everyone has a fucking ladder still.

  • Reality (Score:4, Interesting)

    by jbolden ( 176878 ) on Wednesday October 24, 2012 @12:45PM (#41753713) Homepage

    1) Not one of these locked down devices is hard for a "free thinker" to put a new OS on. No one is making nor planning on making devices that are actually secure against a knowledgeable owner that wants them to do something different. They are looking to add some security that is impossible without hardware support. No one is actually advocating the position your essay is opposing.

    2) When PCs started they used to come with the OS (and arguably sometimes more than one OS) on ROM. People still booted different OSes on them.

    3) There is a wealth of content creation tools for all these platforms that already exist, so concerns about consumption / creation are overblown.

    4) DRM is obviously popular with content creators to prevent sharing, and with larger entities to allow for distribution and control. It comes in and out of fashion and has for a long time; there is no long-term trend in either direction. For example, in the last 5 years virtually all music has been sold DRM-free, while previously music companies had required DRM.

    5) For consumer tablet / phone devices, both Android and iOS, there already exists a wealth of services for setting up alternative "clouds". They are cheap and easy to configure. Instead of whining that they don't exist for consumers, just set one up.

    • Not one of these locked down devices is hard for a "free thinker" to put a new OS on.

      They are all significantly harder than a current PC, and end up only partially functional when you do.

      No one is making nor planning on making devices that are actually secure against a knowledgeable owner that wants them to do something different.

      Nonsense. Apple, Microsoft, Sony, etc. all spend lots of time and money designing and implementing security schemes that make doing this more and more difficult. The EFF scored a c

      • by jbolden ( 176878 )

        They are all significantly harder than a current PC, and end up only partially functional when you do.

        Maybe. But current PCs are pretty darn easy. They weren't that easy when Linux was thriving as an alternative to Windows. I'd say it is likely easier to install iPhone Linux today than RedHat in '97. As for partial functionality that's not the fault of the device manufacturers. The Linux kernel was tuned mainly for Microsoft / Intel / Western Digital (i.e. x86 PCs). It has expanded to other platform

      • by jbolden ( 176878 )

        Sorry hit submit too soon on last post:

        Music was freed of DRM but instead the most popular platforms have DRM integrated from the bottom up. And virtually every other media (books, video, etc.) are all slathered in a layer of DRM with no forward progress on removing it.

        Online video is getting less DRM oriented as shockwave / flash have gotten more open. Books are new and downloadable video is new. We'll see in 10 years how this pans out.

        Nonsense. Apple, Microsoft, Sony, etc. all spend lots of time and

    • Not one of these locked down devices is hard for a "free thinker" to put a new OS on.

      Yet some countries prosecute, or allow copyright owners grounds to sue, "free thinkers" under anti-circumvention laws.

      No one is making nor planning on making devices that are actually secure against a knowledgeable owner that wants them to do something different.

      Sony v. Hotz anyone? What about the downfall of Lik Sang?

      There is a wealth of content creation tools for all these platforms that already exist

      What programming environment for iOS is comparable to AIDE for Android?

      It comes in and out of fashion and has for a long time; there is no long-term trend in either direction. For example, in the last 5 years virtually all music has been sold DRM-free, while previously music companies had required DRM.

      Good for music. But when have DRM-free feature films been in fashion at any time since Macrovision was introduced?

      • by jbolden ( 176878 )

        What programming environment for iOS is comparable to AIDE for Android?

        I think you mean programming on Android. The question was about content creation for these platforms, not on them. No question Apple considers iDevices secondary devices. They do have some programming languages where they think it appropriate, like Gambit Scheme, ND1 (3 interpreters), a variety of Lua interpreters...

        Good for music. But when have DRM-free feature films been in fashion at any time since Macrovision was introduced?

        Internet short video has

        • by tepples ( 727027 )

          [Lik Sang] were hit for violating import and export restrictions.

          Who lobbied for these restrictions, and who pressed charges?

          Sony lost the case

          I thought it was settled out of court, with Hotz agreeing to shift his hacking work away from Sony products. Was there in fact a judgment in favor of Hotz?

  • I'm not sure that this statement makes any sense. It might make sense for Blizzard (Diablo 3), or any new games from Ubisoft, all of which apparently require a persistent connection to play.

    Netflix is basically a movie rental service with no due dates, and you can watch the stuff you want at any time as many times as you want. I'm not under any illusion that I own any of the content they have available.

    I have Steam, and I usually only buy games that are on deep discount whenever they have
  • That's all PCs, laptops, tablets, and smartphones are.
    They all serve a different purpose, so one will never replace the other; it will just complement it.

  • Some very good points made there, and I completely agree that the main concern for the future is ownership of data, not what your PC looks like.

    I have been rather Luddite in my avoidance of cloud services. In fact the only exception is Steam, which is perfectly fine and convenient for now, but I can foresee potential issues in the future. In particular, when my 3-year-old son gets a bit older and wants to play games from my collection at the same time as I want to. I think the solution would be a bit torrent,

  • TLDR (Score:5, Insightful)

    by shadowrat ( 1069614 ) on Wednesday October 24, 2012 @01:04PM (#41754015)
    blah blah blah. walled gardens and cloud storage. I've been playing with my raspberry pi. it's just as fun as my first Apple II and way more hackable (maybe just because i'm a little bit smarter now) Personal computing is better than i ever imagined it would be. There is something to hack EVERYWHERE. get an android device, jailbreak your iphone. find cameras in the garbage. quit lamenting the fact that people who don't care about using computers settle for the walled garden.
  • The theory behind the cloud is that your data is available on multiple devices wherever you go. This is only a reality if you stay within your own connectivity area. Anyone who travels quickly understands that access to the cloud becomes either prohibitively expensive (data roaming) or limited. Streaming music on a beach in Mexico, for example, requires paying huge data roaming fees or buying a local SIM card and an unlocked device. In my opinion the cloud will not become useful un

  • The editorial hits the main points, but perhaps understates the importance of US ISPs being controlled by non-competitive private companies. This is a disaster. Aside from Verizon Fios (which - surprise! - has stalled), Americans haven't put new pipe in the ground in ten years. Google shouldn't be making headlines with a modest proposed fiber-to-house project in Kansas.

    In the 1990s, backbone providers had to sell bandwidth to all last-mile-ISPs at the same rate. There were literally tens of thousands of ISP

  • But they are only half true: an artifact of the PC is dying, but the essence of the PC revolution is closer to realization than ever before, while also being closer to loss than ever before.

    I... wat?

    • But they are only half true: an artifact of the PC is dying, but the essence of the PC revolution is closer to realization than ever before, while also being closer to loss than ever before.

      I... wat?

      A mountaineer climbing K2 for the first time is closer to reaching the summit than ever before, but he is also closer to falling off and failing.

      Just sayin...

  • Or take ebook readers (certainly personal devices): that book you just downloaded to your Kindle is DRMed and stuck there!

    TTBOMK, all of the DRM schemes in common use for e-books have been broken. Of course, you have to get the files onto a normal sort of PC in order to decrypt them.

  • It costs under $5 per month to avoid the "free" services. I have a low-end $9/month HostGator account for my minor web sites. This allows multiple domains. If I want to publish a picture, it goes in a directory there. Another domain has Wordpress loaded for a blog.

    Mail comes into my own domains, is filtered, and dumps to an IMAP server at sonic.net, which I can access from all my devices. Sonic DSL has no ads, no filtering, no caching, no "deep packet inspection", no data caps, and no nonsense.

    The only

  • thanks for the link... some investigating to do, but looks like it's the answer to my prayers for getting my stuff back out of the clutches of Google...
  • What I keep wondering is: when I put all my cherished memories into the cloud and over the years allow that collection to grow with no means of extracting it or migrating it in a useful way, what will stop the cloud from taking full advantage of the worth of those memories to me? I can't imagine why they wouldn't eventually tell me that I must pay them to be able to see the media from Christmas 2012, and that is that. I believe that is the real lurking danger here. Right now we are the kids in the school yard
  • by VortexCortex ( 1117377 ) <VortexCortex AT ... trograde DOT com> on Wednesday October 24, 2012 @02:26PM (#41754997)

    On yesterday's PCs, I could just write raw machine code in hex, save it to the 1st sector of a drive, boot the disk, and be in full control of my own hardware with my own code. Many new-ish PCs now use EFI. To boot from EFI I have to write my machine code within a FAT(32) container, which means implementing MS's proprietary and patent-encumbered File Allocation Table format... Tomorrow's PCs will use UEFI to boot, which requires a cryptographically signed boot process. That means signing my own bootloader and installing my own keys, or paying MS for a key for each bootable image (some UEFI systems allow booting w/o a signature via a special boot mode, some do not) -- on ARM platforms shipping Windows RT, MS has said the option to boot unsigned code or install user-specified keys must be removed.

    So, you can see how it has slowly gotten a bit harder to play with my own new hardware, thanks to the increasingly high hoops I've got to jump through. If Microsoft has their way you won't be able to boot any OS that doesn't fork over the cash to them. In fact, even the Linux Foundation is planning to pay MS for the right to sign a bootloader [linuxfoundation.org] so you can still boot your own software on UEFI hardware. I think that's horrible. I understand they want to make it easy for users to run free software, but IMO paying MS one red cent to give us back the freedom to use our own software with our own hardware is just vile and disgusting. Instead, I'll buy from vendors that respect my freedom. The subject line says MS + Secure Boot == PC Death, but really Apple and many other vendors who don't let us unlock our devices to run arbitrary code are just as evil in my book.

    Recently a longing for the good ol' days of unfettered computing led me to create Hexabootable [vortexcortex.com]. It's a 512-byte boot sector that contains a hex editor. With it you can edit raw memory and then execute the memory you just edited. Using only this minimal tool you can extend the program's features (e.g. disk I/O), write any other program, even create a whole new operating system -- indeed, that's exactly what I'm doing. [slashdot.org]
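
    For anyone who wants a taste of that bare-metal workflow without risking real hardware, here is a small illustrative sketch (not part of Hexabootable) that builds a classic 512-byte BIOS boot sector; it assumes a legacy-BIOS machine or an emulator such as QEMU -- i.e. exactly the environment that signed-boot hardware is phasing out.

        # Illustrative only: build a bootable 512-byte sector image by hand.
        # The 8 bytes of x86 real-mode machine code below print 'X' via the
        # BIOS teletype service (int 10h, AH=0Eh) and then hang:
        #   B4 0E   mov ah, 0x0E
        #   B0 58   mov al, 'X'
        #   CD 10   int 0x10
        #   EB FE   jmp $
        CODE = bytes([0xB4, 0x0E, 0xB0, 0x58, 0xCD, 0x10, 0xEB, 0xFE])

        def make_boot_sector(code):
            """Pad the code to 510 bytes and append the 0x55AA boot signature."""
            assert len(code) <= 510, "boot code must fit in one 512-byte sector"
            return code + b"\x00" * (510 - len(code)) + b"\x55\xAA"

        with open("boot.img", "wb") as f:
            f.write(make_boot_sector(CODE))

        # Try it in an emulator rather than on real hardware:
        #   qemu-system-x86_64 -drive format=raw,file=boot.img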

    None of my hardware or software hacking hobbies will be possible if the OEMs get their way and lock us out of our own hardware. It's all under the guise of security, but that's not really the reason. Think about it: OS code is huge and bug-ridden; if there's even one kernel-level arbitrary code execution vulnerability then the whole effort is useless. If the OS makers could write secure (read: bug-free) OSes they would be just as secure with or without secure boot! If they can't write secure OSes then secure boot is pointless! Truly, I can use known exploit vectors against every modern OS, secure boot or not, to run my own unsigned machine code, and so can malware writers... So it's not a boon for normal end-user security, it's just digital shackles. The real reason secure boot chains exist is to keep you from tampering with your own computer.

    Now, what I do find hopeful is the cool work in the embedded systems fields. There are several projects that strive to be as transparent to the user as possible, and get their code up and running controlling everything. Unfortunately you don't always get to run plain machine code on all of the hobbyist devices. Open hardware initiatives give me a warm fuzzy feeling -- That's what will save the "PC" (Personal Computer) in my opinion. Protip: If you can't personalize the machine code and/or hardware, then it's really an Impersonal Computer -- An impostor of the worst kind...

    Here's a fun aside: Since I write software in machine code, I could release it under the GPL and provide no other "source code" but the binaries :-P
    Conversely, if you know Machine Code, every (non encrypted) binary executable is Open Source!

  • Which in PC parlance means they have to be thrown away every 12-18 months. Why is that? Bloat. Plain and simple. When your Android tablet or iPad accesses a typically horrible bloatpage with 3 different animated popups, a banner or two, 5 layers of Javascript and the rest, it grinds to a halt. And when the hardware engineers make a tablet that's twice as fast, the marketing douchebags tell the software developers "We need 7 more popups, a dozen more animations, twice as many switches and buttons for that 'us
