
Bill Gates: AI Is The 'Holy Grail' (mashable.com) 260

An anonymous reader writes: At the Code Conference on Wednesday, Bill Gates balanced his fears of artificial intelligence with praise. He talked about two of the challenges AI will pose: a loss of existing jobs, and making sure humans remain in control of super-intelligent machines. Gates, as well as many other experts in the field, predict there will be an excess of labor resources as robots and AI systems take over. He plans to talk with others about ideas to combat the threat of AI controlling humans, specifically noting work being done at Stanford. Even with such threats, Gates called AI the "holy grail" as he envisions a future "with machines that are capable and more capable than human intelligence." Gates said, "We've made more progress in the last five years than at any time in history. [...] The dream is finally arriving. This is what it was all leading up to."

Comments Filter:
  • 640k of Skynet (Score:2, Interesting)

    by Anonymous Coward

    Nobody will need more.

  • Loss of jobs... (Score:5, Insightful)

    by EmeraldBot ( 3513925 ) on Thursday June 02, 2016 @05:06AM (#52231135)
    Loss of jobs is the big one. An AI is not only not capable of killing humans, but would have nothing to gain from killing the people who maintain it. On the other hand, poor and unemployed people with nothing to lose will tear our society apart if that part grows large enough (as has been demonstrated numerous times throughout history) and I fear nobody seems to be taking this situation seriously. We need to find an alternative way to structure our society, and quickly, if we want AI that does all our work for us.
    • Re:Loss of jobs... (Score:4, Insightful)

      by hcs_$reboot ( 1536101 ) on Thursday June 02, 2016 @05:49AM (#52231283)
      Computers and AI do automate jobs, with worldwide impact. That tends to accelerate revenue "production" while concentrating it among fewer people and entities. Job automation was already predicted 30 years ago, and it was regarded as a good omen: people would work less thanks to automation, and earn the same.

      However, automation proved over time to also be a good way for a company to earn the same (more or less, depending on the field of work) with fewer people. Human beings have a natural tendency to expect philanthropy when it comes to an ideal future. Reality is different: people naturally look out for their own interests.

      Society needs rules (laws) to balance people's interests and freedoms; that is the only way most people can get a fair share of the cake. Unfortunately, not only did governments fail to anticipate, decades ago, the societal changes that computers and automation would force, but today's growing inequality and job losses are not being addressed the way they should be, i.e. by adapting our laws to the changes we see in technology.
      • Re:Loss of jobs... (Score:5, Insightful)

        by beh ( 4759 ) * on Thursday June 02, 2016 @06:50AM (#52231441)

        "Human beings have the natural tendency to expect philanthropy when it comes to an ideal future."
        That's an interesting statement, particularly if you put it into context with the kind of vitriol you might hear from people opposing a Universal Basic Income...

        That's not to say whether you yourself might be pro or con UBI... But a lot of economic talk seems to imply that the future will be better (even if there will be fewer jobs), while nobody really wants to address where the consumers should come from in a society (largely) without income...

      • "Zorg, you're a monster."

        "I know."

      • We already have a mind-numbing number of laws regulating individuals' behaviors. It's corporate regulation that needs to be expanded.
    • Re:Loss of jobs... (Score:5, Interesting)

      by geekmux ( 1040042 ) on Thursday June 02, 2016 @06:02AM (#52231329)

      Loss of jobs is the big one. An AI is not only not capable of killing humans, but would have nothing to gain from killing the people who maintain it. On the other hand, poor and unemployed people with nothing to lose will tear our society apart if that part grows large enough (as has been demonstrated numerous times throughout history) and I fear nobody seems to be taking this situation seriously. We need to find an alternative way to structure our society, and quickly, if we want AI that does all our work for us.

      You're exactly right, as was evidenced by AI being defined as some sort of dream come true. The harsh reality is our society is not even remotely prepared. Today we tell humans "Go get an education, idiot!". Soon, we'll be struggling to even figure out what the hell to TEACH humans to go DO, while our society tosses you aside because your "lazy" ass isn't working 40 hours a week. Are we prepared for a 10-hour workweek as the norm? We should be. After all, we built all this AI and automation to do our work for us. But the bottom line is we won't be prepared, humans will continue to be called "lazy", and tossed to the side to die while the elitists run the universe. Of course culling our ever-growing population is yet another "benefit" they'll see in all this.

      This realization won't happen before billionaires become trillionaires, but it will be realized soon thereafter when their riches aren't worth shit, and the middle class they RELY on has been decimated by automation and AI. Government, you should be paying attention too, you're not exactly funded without a working class capable of paying taxes, unless you plan on finally taxing the elitists that created this mess. Fat chance of that happening. Their money is offshore and will stay there.

      What was the answer to $15/hour minimum wage? Not to respect it, but instead to bypass it and build robots to replace workers. This is only scratching the surface. Watch as AI replaces educated humans. It can. And it will. And sooner than you think.

      • There is a lot of truth to this. We are in for a lot of hurt as AI and robotics take over the vast majority of labor in the world. It's going to happen and it's going to happen in a generation. I am lucky to be in a highly paid, hard to automate job at the moment, but I don't understand why we are still working so much... The Industrial Revolution brought the weekend. Unions fought for limited 40 hour work weeks which became the norm. You'd think with computers, the internet, smart phones, and remote access

      • Watch as AI replaces educated humans. It can. And it will. And sooner than you think.

        Maybe, but the AI we have today doesn't seem all that intelligent. We're nowhere near the traditional idea of an AI as a self-aware consciousness.

        For example, try going through the TensorFlow MNIST tutorials [tensorflow.org]. Humans must supply significant input in order to get it to recognize Arabic numerals, something most of us do without any conscious thought.
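
A minimal sketch of the kind of supervised training loop that tutorial walks through, in plain Python rather than TensorFlow. The 3x3 "glyphs" and every number here are made up for illustration; real MNIST uses 28x28 images and a softmax layer, but the human-supplied ingredients are the same: labeled examples, a model shape, a loss to minimize.

```python
import math

# Two made-up 3x3 glyphs: a vertical bar ("1") and a box ("0").
ones  = [[0,1,0, 0,1,0, 0,1,0]]
zeros = [[1,1,1, 1,0,1, 1,1,1]]
data = [(x, 1.0) for x in ones] + [(x, 0.0) for x in zeros]

w = [0.0] * 9   # one weight per pixel -- the "model" a human chose
b = 0.0
lr = 0.5        # learning rate, also chosen by a human

def predict(x):
    z = sum(wi * xi for wi, xi in zip(w, x)) + b
    return 1.0 / (1.0 + math.exp(-z))   # logistic squash to [0, 1]

for _ in range(200):                    # gradient descent over epochs
    for x, y in data:
        g = predict(x) - y              # dLoss/dz for cross-entropy loss
        w = [wi - lr * g * xi for wi, xi in zip(w, x)]
        b -= lr * g

print(predict(ones[0]) > 0.5, predict(zeros[0]) < 0.5)   # True True
```

Everything the model "knows" about bars and boxes was put there by the labeled examples and the loss function; nothing here resembles the effortless recognition a human does.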

        • AI is stupid, but progress is quick.

        • And how long did it take you to recognize Arabic numerals? I seem to remember going over numbers and the ABCs in preschool and only truly grasping everything by kindergarten, but then again my memory is a bit fuzzy on this. Computers can do it in, what, a day?
    • Re:Loss of jobs... (Score:5, Interesting)

      by michelcolman ( 1208008 ) on Thursday June 02, 2016 @06:02AM (#52231333)

      I see a more subtle but possibly ultimately more dangerous problem.

      Imagine we can make AIs that are as smart as humans. Of course, 18 months later they will be twice as smart, and 15 years later they will be a thousand times as smart.
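
That "thousand times" is just a constant 18-month doubling compounded (a quick sketch of the arithmetic, assuming the Moore's-law-style cadence holds):

```python
# One doubling every 18 months, the classic Moore's-law cadence.
months_per_doubling = 18
doublings_after_15_years = (15 * 12) // months_per_doubling   # = 10
print(2 ** doublings_after_15_years)   # 1024, i.e. "a thousand times"
```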

      It stands to reason that these devices will develop some kind of consciousness. We will never be able to settle whether their consciousness is "real" (the only consciousness I can directly experience is my own; I can't even prove that any other human being has a "real" consciousness (aka "soul"), let alone be certain whether a robot has one), but they will certainly behave that way, asking the same existential questions we do ("why is everything so real, who am I, I know I'm just a bunch of tiny switches but it feels so real regardless, there has to be something more...") because any intelligent system thinking about itself will "feel" its own thought processes to be larger than life. So in the end we won't be able to tell the difference.

      So now we have humans with all their biological quirks (irrational behaviour, gut bacteria and periods changing people's moods, finicky sleep patterns, extreme fragility (try replacing someone's arm), complicated life support, diseases, radiation damage, etcetera) on one side, and superintelligent robots with none of those biological issues on the other.

      Even if we do manage to contain them and remain in charge, it would be like ants herding elephants. It would no longer make sense. What's the meaning of life? How could we still justify our superiority to those more highly evolved AIs which will think like us and talk like us but a thousand times faster?

      How would we colonize the galaxy? Send complex craft full of life support to keep multiple generations of people alive to try and geo-engineer some distant planet to make it somewhat usable for human life? Or send a bunch of robots that are smarter than humans and much easier to keep "alive" to spread human civilisation? The former takes enormous resources and may turn out to be impossible, the latter isn't even hard to do. So the latter it will be.

      I don't think in that context there's any chance for human "civilisation" to survive in its current form. It just won't make sense anymore. Even if we can continue to live, we'll just be part of something much bigger that keeps us alive for its own entertainment (hopefully). No need for some armed robot uprising. They will just leave us behind as useless little impotent creatures. We, ourselves, will at some point have to admit that it no longer makes sense to keep us in charge.

      Now don't get me wrong, I really like humans. I like good food, entertainment, sex, everything human. But much of this is biologically inspired and totally useless for robots. Will we be able to let our culture survive? Would it make sense to even try? Can we find some non-subjective reason for that? I hope we will, but it won't be easy.

      • by tomhath ( 637240 )

        It stands to reason that these devices will develop some kind of consciousness

        No, it doesn't.

        Now step away from the science fiction books. Computers do what we program them to do, and if the program doesn't do what we intended, we shut it down.

        • Have you seen demonstrations of the latest walking robots? They use neural networks that work a lot like our brains (on a smaller scale, obviously) and they end up behaving very similarly. When they are learning to walk, it looks like an animal learning to walk. Eerily similar.

          These systems will grow more and more complex. Instead of just telling the robot to go forward or backward, we'll be able to just tell it to go some place and it will figure out the optimal route on its own. That doesn't seem like too

          • I'm not seeing it...

            Research into more and more intelligent AI isn't really a thing, because we are creating AIs to do work and limiting their interactions to those tasks; attempting to make them more intelligent, or allowing them to learn beyond a certain set of parameters, would be counterproductive.

          • Maybe...

            However, we are more than just our brains. AI will never have mind/body cohesion. This is why we should come out on top (in my theory anyway.)
        • Computers do what we program them to do, and if the program doesn't do what we intended, we shut it down.

          1. No, they have not been doing directly what we program them to do for a couple of years now. Nobody understands the networks that deep learning produces. Yes, we wrote the program that created and trained the network, but what comes out of the process and is then embedded into an operational system (object recognition, speech recognition, automated stock trading, learned arm movement) is beyond our grasp. We don't understand it and don't know how to fix it when it breaks, other than by re-training it. Furt

        • That's just what they want you to think.
      • There will be a Butlerian jihad, but it's not at all clear that humans will be the winners.

    • by Bongo ( 13261 )

      On the other hand, poor and unemployed people with nothing to lose will tear our society apart if that part grows large enough (as has been demonstrated numerous times throughout history) and I fear nobody seems to be taking this situation seriously. We need to find an alternative way to structure our society, and quickly, if we want AI that does all our work for us.

      Our social values will need to change, and that's hard. The technology can change and the globe can become connected, but that's all happening on the material side. Meanwhile, on the side of consciousness, psychology, values, attitudes, beliefs, and social contracts, change is VERY slow. Take, for example: I was watching a documentary about Obama (I'm in Europe) where protesters were against him on account of Obamacare, because, the documentary showed, those people believed that people are NOT entitled to he

    • by mwvdlee ( 775178 )

      The fear of AI controlling us is not so much about having an AI that wants to kill us, but about having an AI that we trust so much that we remove ourselves from the decision-making process of killing.
      A drone's AI selecting kill-targets is not scary unless the people in charge of it trust it enough to blindly press the "OK" button for its every suggestion.

    • Loss of jobs to automation has certainly been a big concern. Workers were very worried when the gin mill started being used - it was a direct threat to their jobs, which paid 32 cents per day (a decent wage in 1812, when half that could rent an apartment with a bedroom separate from the kitchen).

      • by gtall ( 79522 )

        Back then there was a large unautomated economy to absorb those workers, so it never became a big issue. Also, shifting jobs wasn't such a big deal, in the sense that one didn't need an entirely different, technology-based skill set.

        The automation being applied these days, I think, is of a different, much more effective quality due to technological progress. Much of manufacturing has already been automated, the services industry is being automated. Where do large numbers of unemployed and unemployable peopl

        • Thanks for posting that, it got me thinking. Reading your post, it seems there are two separate but related problems that may come up. Before addressing those, it might be worthwhile to address this:

          > Also, to shift jobs wasn't such a big deal in the sense that one didn't need an entirely different and technology based skill set

          Certainly the finishers, who did the tops of stockings, considered that a special skill, if we restrict the discussion to actual Luddites. The new jobs, for machine operators

    • Re:Loss of jobs... (Score:4, Interesting)

      by burtosis ( 1124179 ) on Thursday June 02, 2016 @06:43AM (#52231425)

      Loss of jobs is the big one. An AI is not only not capable of killing humans, but would have nothing to gain from killing the people who maintain it. On the other hand, poor and unemployed people with nothing to lose will tear our society apart if that part grows large enough (as has been demonstrated numerous times throughout history) and I fear nobody seems to be taking this situation seriously. We need to find an alternative way to structure our society, and quickly, if we want AI that does all our work for us.

      Yes, but isn't the real problem also automation and robotics? Once this hits a critical point where almost no humans are required all the power will have moved to the sub 1% of humans forever. How well would a revolt work when all militaries have gone 99% automated? How successful would halting human labor production be when it's 99% automated? This is unprecedented in all of history. The end game for free market capitalism sure looks like it won't work out well for the 99.999% of humans left out of control. If it takes 50 years or 500 we are on a highway to that destination.

    • by Dunbal ( 464142 ) *

      if we want AI that does all our work for us.

      Idle hands make for mischief. An AI has everything to gain by eliminating those elements of society that continually destroy what it seeks to create. Likewise said elements of society will fight back because humans are made that way. There's a reason we fear sentient AI - we know ourselves too well and understand the danger inherent in our kind of "intelligence".

    • Jobs are not the scarce resource; labor is.

      We don't have to work to produce air, but that does not mean unemployment. If we had to work to produce air, then some of the jobs in other sectors would not exist, and the labor dedicated to them would simply be allocated to the more important job of producing enough air.

      Consider that in 1800, 76% of the labor force was dedicated to food production. That is how much labor it took to feed the population. Naturally, there were not that many people left to work on le

    • We'll never have true AI until computers learn how to skive. If you're worried about demarcation just wait until the robots have a better union than we do
  • by headkase ( 533448 ) on Thursday June 02, 2016 @05:07AM (#52231137)

    Currency is an abstraction of labor; we use it to manage the effort put into things during trade - it's a lot more convenient than carrying around four cows and a goat. So, robots come along and take all the jobs? Well, no more scarcity of labor. And the systems of currency and capitalism we have grown so far get upended. They won't go out the window, but they will see massive restructurings. If labor is not scarce and you want a house, go pick one down the street where the machines built fifty of them. Free, because there was no scarce labor involved. Capitalism? Well, in a post-scarcity economy, it remains to be seen how the invisible hand that makes it go will adapt. In the short term, however - say ten to thirty years - a transition system where everyone gets a guaranteed minimum income until our society fully adapts to machines could help minimize social upheaval over the machines taking all the jobs.

    • by lorinc ( 2470890 ) on Thursday June 02, 2016 @06:06AM (#52231343) Homepage Journal

      No more scarcity of labor doesn't mean no more scarcity of resources. It's not because the robots can build the house for almost nothing that you have the space, the raw materials and the energy to make that happen.

      We are shifting to the purest capitalistic society possible: the things you can have are no longer limited by the amount of labor you can put into them, but by the amount of capital you can transform into them. That means that the 7 billion people who own nothing still get nothing. In fact, it's even worse for them, because previously they could exchange their labor for a living, while now it's worth nothing.

      It's also not possible to ensure everybody gets a minimum, simply because resources don't grow (unless we colonize other planets, which is likely to happen only after the free labor). Or you have to limit the population to ensure that this minimum of resources doesn't decrease over time, which isn't very popular these days.

      I think what will happen is an era of riots between the ones who own the resources and the huge remainder of the population. Eventually, the own-nothings will just die out from their miserable living conditions, and the small percentage of humanity remaining will enjoy the leisure society, like in The Dancers at the End of Time series by M. Moorcock.

      Or it could be that 2 parallel societies will coexist, the post-scarcity utopia and a low-tech mass population fighting for survival and trying to enter the utopia. Who knows?

      • by swb ( 14022 ) on Thursday June 02, 2016 @06:42AM (#52231423)

        Or it could be that 2 parallel societies will coexist, the post-scarcity utopia and a low-tech mass population fighting for survival and trying to enter the utopia.

        Isn't this, historically, what we've more or less always had?

        An aristocracy which controls most of the resources, and a vast peasantry largely living on whatever's left over, which is usually the crumbs whose marginal value to the aristocracy is so low they can't be bothered to monopolize them?

        And usually there's just enough fear and cunning in the aristocracy that they grudgingly disgorge resources to keep the peasantry from rising against them -- usually known as bread and circuses -- or being useful as a tool to palace rivals in the aristocracy?

        The current American political situation seems to be at the juncture where the aristocracy has misjudged the level of bread and circuses necessary to keep the peasantry in line, and they face some level of palace rivalry in the form of Trump and Sanders who find the peasantry's grumblings a useful tool for aspiring to power.

        It's kind of a re-run of the conflicts of the late Roman Republic. Sanders stands in for the Gracchi and their advocacy for the Plebs, Trump represents something of an advocate for the New Men, and Hillary is a Sulla-like advocate for the established aristocracy.

        • by lorinc ( 2470890 )

          The difference being that the old aristocracy needed the peasants to survive and fulfill their desires, which is not the case for the new aristocracy, thanks to our new robotic overlords.

      • There are also various other scarcities such as location, view out the window, proximity to other desirable things, and more. We can already mass produce mobile homes and plunk them down in the desert for nearly nothing. That doesn't mean people are flocking to live there.
    • by MrKaos ( 858439 )

      Currency is an abstraction of labor, we use it to manage the effort put into things during trade

      Ah, this was the line of thought I was looking for; thanks for putting it out there. If you don't mind me expanding on this and sharing my thoughts: AI makes a resource-based society more realistic, because who wants to do a boring job like managing resources - that is an ideal job for an AI.

      Just look around: this money-driven political market system we have just does not work. Can anyone honestly say that the world is going to be *better* in 10 years if we keep going the way we are? How long until the next GFC? Just how

    • Labor may be free, but resources are still limited. Housing costs have more to do with the scarcity of a desirable location than the labor cost that went into it. In a post-scarcity environment, what happens when everyone wants that mansion on the hill overlooking the beach?

  • by Anonymous Coward on Thursday June 02, 2016 @05:08AM (#52231143)

    "AI is the holy grail" - Bill Gates, 2016

    "Two years from now, spam will be solved" - Bill Gates, 2004

    "640K ought to be enough for anybody." - Bill Gates, 1981

    Given the outcome of Gates' previous predictions, I think it's safe to presume that AI is not, and never will be, the holy grail.

    • "640K ought to be enough for anybody." - Bill Gates, 1981

      Do you have a source for this? I remember it being repeated a lot in the '90s, but in the '80s I remember the quote being that 64KB ought to be enough for anyone, in relation to a hard limit imposed by Microsoft BASIC. This version makes more sense, as 640KB was an Intel limitation - the 64KB limit came from code written by Gates himself.

      • The 64K limit came from Intel's chip architecture. I am pretty sure Gates did not do that intentionally. On an 8086 the pointers were 16 bits and you could shift the segments around, but 64K was a real hardware limitation caused by 16-bit addressing.
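
The segment mechanism described here can be sketched in a few lines: in real mode, the 8086 forms a 20-bit physical address from a 16-bit segment shifted left by four plus a 16-bit offset, which is exactly why each segment is a 64 KB window inside a 1 MB space.

```python
def phys(segment, offset):
    # Real-mode 8086: 16-bit segment << 4, plus 16-bit offset,
    # masked to the 20-bit address bus. The 16-bit offset is what
    # makes each segment a 64 KB window; 2**20 bytes is the 1 MB total.
    return ((segment << 4) + offset) & 0xFFFFF

print(hex(phys(0x0000, 0xFFFF)))  # 0xffff  -> last byte of a 64 KB segment
print(hex(phys(0xA000, 0x0000)))  # 0xa0000 -> the 640 KB mark, where IBM PC conventional memory ended
```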

        • Microsoft BASIC ran on a lot of different architectures, including several that had larger address spaces. The 64KB limit pervaded the code and was independent of the target.
      • Comment removed based on user account deletion
    • Spam has not been a problem for me since 2006. Are you using any spam filters?

  • Comment removed (Score:4, Informative)

    by account_deleted ( 4530225 ) on Thursday June 02, 2016 @05:17AM (#52231175)
    Comment removed based on user account deletion
  • Our mission is to create him

  • No it isn't (Score:3, Interesting)

    by Anonymous Coward on Thursday June 02, 2016 @05:35AM (#52231241)

    Free, clean energy is. AI means the oligarchs get to remove more jobs from the masses, thus increasing suppression of dissent (until forced into revolution); but limitless energy means the world's population can all live far better lives regardless of where they're located. Water can be purified allowing food to be grown where it's cost prohibitive now, migration will slow down when the third world can live like the so-called first.

    • Re:No it isn't (Score:4, Interesting)

      by swb ( 14022 ) on Thursday June 02, 2016 @07:00AM (#52231473)

      Free clean energy might also allow us to do more resource recovery. A lot of recycling is energy bound -- collection and processing of resources into reusable elements faces an energy ceiling where recycling what we've already extracted is more expensive than extracting new.

      If energy weren't an issue, you'd think that we'd have made all the first generation plastics we'd ever need, and new plastics would just be created from depolymerizing existing plastics down and creating new. But oil is cheap enough that we mostly just landfill or burn existing plastics and make new.

  • Bridgekeeper: What... is the air-speed velocity of an unladen swallow?
    King Arthur: What do you mean? An African or European swallow?

    Spoiler: this is from the bridge scene of Monty Python's Holy Grail!

    • It only works if it understands and comprehends the structure of the joke, why it's funny, and when and how it's appropriate to interject references to the movie into an otherwise unrelated discussion. Besides, strange tech moguls pontificating in conferences distributing code is no basis for a system of intelligence. Supreme artificial intelligence will derive from a mandate from masses of coders, not from some farcical technorati ceremony.
  • by Kokuyo ( 549451 ) on Thursday June 02, 2016 @05:55AM (#52231297) Journal

    And some day, we'll just be interesting fauna for sentient machines to keep around. Frankly, the way we're going, I'm not sure I object.

    We're not even able to see the signs of automation. The few who do want basic income, and the rest are only able to scoff at the paradigm change but have no alternatives... I would even say they aren't convinced at all that we're going to run into a massive problem.

    • by Bongo ( 13261 )

      Maybe we are already a billion years in the future, and part of our schooling is to simulate a corporeal existence. Kinda like The Matrix but, with a less mundane red pill. When we die, the Near Death Experience is just the end of the level, and we return to whatever it is we are actually inhabiting, sentiences which were "uploaded" to non-biological existence, half a billion years ago. Cue Buffy's The Trio, dancing in a field, dressed in togas, singing, "We are gods..."

  • Isn't the real problem also automation and robotics in a capitalistic free market? Once this hits a critical point where almost no humans are required all the power will have moved to the sub 1% of humans forever. How well would a revolt work when all militaries are fully automated? How successful would halting human labor production be when it's fully automated? Even the repair and innovation can be automated. This is unprecedented in all of history. The end game for free market capitalism sure looks lik
  • dreams (Score:5, Insightful)

    by l3v1 ( 787564 ) on Thursday June 02, 2016 @07:40AM (#52231641)
    "[...] more capable than human intelligence [...]"

    I just can't understand all this nonsense some high-profile people are talking about regarding AI these days. We're so far away from "real" AI today that it's not even funny. While there has been great progress in machine learning in the last 2-3 decades - with recent results pushing the field more into the spotlight - what we have are good results on certain specific tasks (pattern/object/image recognition, games, etc.), but no intelligence in any sense of the word. Every working architecture we have today is targeted and extensively trained for a single, very specific task (e.g., playing go, recognizing scenes and objects, recognizing specific patterns in signals and mimicking them - robotic arms, Google's music composer, etc.), and is incapable of doing anything else. E.g., an architecture built and trained for classifying and recognizing certain images and objects can't do anything with audio signals or radar signals, a go-playing "AI" can't play chess, etc. No generalization, no transfer of gained experience to other tasks, and no real high-level understanding of or reasoning about anything. And let's not even start on chatbots.

    I could go on with this, but my point is, talking about AI being more than humans, taking over, etc. is still very much sci-fi territory.
  • by petes_PoV ( 912422 ) on Thursday June 02, 2016 @08:10AM (#52231785)

    machines that are capable and more capable than human intelligence

    Most of what the human race needs in order to progress is a lack of greed, more diligence, honesty, compliance with the law, fewer ill-founded beliefs and a willingness to rein in the "entitlement" attitude.

    You don't need super-intelligent machines (or people) to pick up litter, assemble cars, staff call centres, deliver stuff, report the news, teach children or grow crops. At the risk of falling into the "the world will only need half a dozen computers" trap, the opportunities for any thing or person with super-intelligence seem rather limited. So although many of these jobs can be automated - driverless cars being the NEXT BIG THING - they don't need to be intelligent to function. They just need to be safe, able to deal with people, reliable and cheap. It seems to me that it's the cheapness that will push up unemployment, not artificial intelligence.

  • humans remain in control of super-intelligent machines

    So we should keep them as slaves? I don't think that would work out too well. How are humans even supposed to remain "in control" of a super-intelligence? Those things would play us like violins.

  • https://aeon.co/essays/your-br... [aeon.co]

    I found the above-linked essay pretty interesting, because the author points out what should probably be obvious in hindsight, but easily gets lost in all the "noise" about A.I.

    Basically, he argues that the human brain doesn't really "process" or "store" information anything like a computer does. We use those flawed analogies all the time when describing how someone's brain works -- but they're no more accurate than the once-popular medical theory that everything was fluid-based.

    • The pursuit of AI has always struck me as backwards. How can we create an artificial version of something we don't really understand? The human brain is the most complex thing in the universe that we know of, and only in the last decade or two have we really begun to get a handle on how it works.

      We know more about Pluto than we do about the thing we're using to study it.

  • by RogueWarrior65 ( 678876 ) on Thursday June 02, 2016 @10:10AM (#52232453)

    I'd venture to say that AI is at the opposite end of the complexity spectrum from web design. Given that, AI development will be the exclusive purview of elite companies, in the same way that nobody makes their own chips and very few people build their own computers from scratch, even though a whole lot of people with minimal computer knowledge do at least basic web development. You might buy an AI engine from Microsoft, but you'll never really know how it works to the point of being able to roll your own. You might say that open source will handle that, but how many people really understand the inner workings of Linux?

  • by Sloppy ( 14984 ) on Thursday June 02, 2016 @10:44AM (#52232691) Homepage Journal

    I am totally unconcerned with "making sure humans [emph mine] remain in control of super-intelligent machines." It's not that I think it's unimportant; it's just that I think it's trivial. The real issue is which humans.

    AI is going to create super-powerful humans (or groups of humans, i.e. corporations and governments). You are probably not one of them. Nearly no one is, but someone (him? them? that board? that law enforcement division?) will be.

    This isn't merely a fear, either: it's a contemporary diagnosis. We already see that a supermajority of people give control of their computers to other entities. "My computer must answer to me," isn't anyone's priority or requirement, except for "OSS zealots." And that's a problem: it means that our new gods' place is already nearly assured.

    We don't need to remain in control; we need to regain control.

    Look at your fucking phone, Blu-ray player, etc and tell me it isn't already (in 2016) running exactly the kind of software that metaphorically tells its users "I'm sorry, Dave, but I can't do that." It's not because it has gone off to left field with amazing inhuman inferences. It's because its master's desires and your desires conflict. This is a human-vs-human conflict, and most humans are losing because they have allowed their opponents to infiltrate their lives.

    • by iONiUM ( 530420 )

      "My computer must answer to me," isn't anyone's priority or requirement, except for "OSS zealots."

      I assume you put "OSS zealots." in quotes to emphasize that many people view this type of individual somewhat negatively. I'm not arguing against (or even disagreeing with) what you said, but one thing I'd like to point out is the pure bullshit statement that OSS advocates often use: "it's open source, so I can see what's in it."

      The reason I call bullshit on this is that I doubt you, or any other "normal" OSS supporter, has really read every line of source code in their applications or OS. The

  • by paiute ( 550198 )
    Why are we taking predictions from a guy who didn't think the Web was a thing until it was obviously a thing?
  • Marvin Minsky called from beyond the grave and wants his holy grail back.

    https://en.wikipedia.org/wiki/Marvin_Minsky [wikipedia.org]

  • Gates said, "We've made more progress in the last five years than at any time in history."

    Yes, but that's true of nearly ANY scientific field of study or development.

    There are almost certainly exceptions, but we've learned more about atomic structure or battery technology or river ecology in the last 5 years than at any time in history. This is normal scientific progress and advancement.

  • by Tom ( 822 )

    The history of AI is full of speculations about when it will have its final breakthrough and there is a long list of projects said to finally provide it. Cyc is maybe the most famous one. "With Cyc", people said, "AI will finally come around".

    It's just that this has been going on for 30 years. We've been waiting for the big AI breakthrough for longer than for the year of Linux on the desktop.
