
IBM Creates Custom-Made Brain-Like Chip

An anonymous reader writes: In a paper published Thursday in Science, IBM describes its creation of a brain-like chip called TrueNorth. It has "4,096 processor cores, and it mimics one million human neurons and 256 million synapses, two of the fundamental biological building blocks that make up the human brain." What's the difference between TrueNorth and traditional processing units? TrueNorth encodes data "as patterns of pulses." It has already demonstrated 80% accuracy in image recognition while beating traditional processing units on power efficiency. Don't look for brain-like chips on the open market any time soon, though: TrueNorth is part of a DARPA research effort that may or may not translate into significant changes in commercial chip architecture and function.
  • by Anonymous Coward
    I think this chip could govern the US more effectively than our current Congress.
    • by Anonymous Coward
      This chip would be overkill, an old 4004 would be sufficient.
      • This chip would be overkill, an old 4004 would be sufficient.

        A block of wood would be sufficient. A 4004 would be overkill.

    • A Commodore 64 running NATO COMMANDER from a Datasette could run the George W. Bush thought process.

    • by tomhath ( 637240 )
      But can it read from a teleprompter? If so, it should qualify for a Nobel Peace Prize.
    • So could an 8008.
  • IBM and chips (Score:5, Interesting)

    by jbolden ( 176878 ) on Friday August 08, 2014 @08:15AM (#47629153) Homepage

    It is getting hard to figure out where IBM is on chips. Arguably the four main chip lines receiving investment are x86, ARM, Z-Series, and POWER - two of which are IBM's. OTOH, there is no roadmap for POWER beyond the current generation. I'd love to know: is IBM getting more serious about CPUs, or pulling back?

    • by Henriok ( 6762 ) on Friday August 08, 2014 @09:23AM (#47629523)
      I agree with your initial statement, but that's pretty much how it has been for at least 15 years. POWER9 is on the roadmaps, and so is the next generation of zArch. And they are sitting there like proxy boxes with nothing much specced, as it has been for almost all generations of their predecessors. What I'm concerned about is the lack of a public roadmap for what they are planning in the HPC and supercomputer space. We had the very public Blue Gene project that began in 2001 with four projects: C, L, P and Q, but since Blue Gene/Q came to life a couple of years ago, I have no idea what they are planning. It'd be nice to have some clue here. Why not something from the OpenPOWER Foundation: a P8 host processor with an integrated GPU from NVIDIA, on-chip networking from Mellanox, and programmable accelerators from Altera? But I haven't seen anything in that direction.
  • I am just curious to know whether this chip can lead to the development of artificial brains to be used by Humans in the future. And I couldn't understand why this chip will not be available in the open market.
    • Thanks to your capitalization, I misread it as "development of artificial brains to be used by Hamas". (Yes, I need new glasses.)
    • by Anonymous Coward

      I couldn't understand why this chip will not be available in the open market.

      The military boys want to keep their toys secret to prevent the enemies from getting them.

    • . . . I wacky-parsed the title as: "IBM Creates Custom-Made Brain-Like Chimp."

      . . . so just imagine where that thought train derailed me . . .

    • by gewalker ( 57809 )

      Can't Purina use them in Zombie Chow? If so, I would rather feed that to the neighborhood zombies instead of my own gray matter.

    • by narcc ( 412956 )

      I am just curious to know whether this chip can lead to the development of artificial brains to be used by Humans in the future.

      That's easy: No.

      Kurzweil is the modern equivalent of a televangelist.

  • by Anonymous Coward on Friday August 08, 2014 @08:22AM (#47629199)

    The number of neurons in the brain varies dramatically from species to species. One estimate (published in 1988) puts the human brain at about 100 billion (10^11) neurons and 100 trillion (10^14) synapses.

    100 billion divided by 1 million = 100,000 of these chips to reach the human neuron count.
    100 trillion divided by 256 million = 390,625 of these chips to reach human synapse count.

    Assuming Moore's Law holds for these chips, with a doubling every 24 months to be conservative:
    2 of these on a chip in 2016
    4 of these on a chip in 2018
    8 of these on a chip in 2020
    16 of these on a chip in 2022
    32 of these on a chip in 2024
    64 of these on a chip in 2026
    128 of these on a chip in 2028
    256 of these on a chip in 2030
    512 of these on a chip in 2032
    1024 of these on a chip in 2034
    2048 of these on a chip in 2036
    4096 of these on a chip in 2038
    8192 of these on a chip in 2040
    16384 of these on a chip in 2042
    32768 of these on a chip in 2044
    65536 of these on a chip in 2046
    131072 of these on a chip in 2048
    262144 of these on a chip in 2050

    So we could be seeing human-brain capability on a chip by mid-century. Quite possibly we'd see similar capability built as a supercomputer 10-20 years before that. Don't flame me for the wild assumptions I'm making here - I know there are a lot; this is just intended as a back-of-the-envelope calculation.
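
    A rough sketch of the same arithmetic in Python, using only the figures quoted above (one chip = 10^6 neurons / 256 x 10^6 synapses, density doubling every 24 months) - a sketch of the assumptions, not a forecast:

        NEURONS_NEEDED = 100e9        # ~10^11 neurons in a human brain
        SYNAPSES_NEEDED = 100e12      # ~10^14 synapses
        CHIP_NEURONS = 1e6            # one TrueNorth-class chip
        CHIP_SYNAPSES = 256e6

        print(NEURONS_NEEDED / CHIP_NEURONS)    # 100000.0 chips for neuron parity
        print(SYNAPSES_NEEDED / CHIP_SYNAPSES)  # 390625.0 chips for synapse parity

        # Walk the doubling schedule until one die holds enough capacity.
        year, per_die = 2014, 1
        while per_die < SYNAPSES_NEEDED / CHIP_SYNAPSES:
            year, per_die = year + 2, per_die * 2
        print(year, per_die)                    # 2052, 524288 - mid-century, roughly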

    • Specific activities engage only part of the brain - so we probably only have to get to 10% or so. That cuts off less than a decade though, so 2040-something.

    • by Henriok ( 6762 )
      There's no chance that Moore's law can progress for 50 more years. Wouldn't each transistor be substantially smaller than an atomic nucleus by then? If you don't mean a chip the size of a table, that is.
      • by Anonymous Coward

        The math: The latest Intel processors use transistors that are 22nm across. The width of a hydrogen atom, ~1.1 angstroms, is about 0.11nm, or 110 picometers across. Assuming the transistor size halves every two years (which, from the looks of it, is impossible), we get this:

        2016: 11 nm transistors
        2018: 5.5 nm
        2020: 2.75 nm
        2022: 1.375 nm
        2024: 687.5 pm
        2026: 343.75 pm
        2028: 171.875 pm
        2030: 85.9375 pm

        And then we're smaller than the smallest atom. However, this is not smaller than the nucleus of the hydrogen atom.

        • by Anonymous Coward

          You are halving the area, not the length - if the area halves every two years, the linear dimension shrinks by 1/sqrt(2), about 0.71 per step, not 0.66.

          That comes out to:
          2016: 15.56nm
          2018: 11nm
          2020: 7.78nm
          2022: 5.5nm
          2024: 3.89nm
          2026: 2.75nm
          2028: 1.94nm
          2030: 1.375nm
          2032: 972pm
          2034: 687.5pm
          2036: 486pm
          2038: 343.75pm
          2040: 243pm
          2042: 171.9pm
          2044: 121.5pm
          2046: 85.9pm
          2048: 60.8pm
          2050: 43pm

          So we don't drop below the size of a hydrogen atom until around 2046. Yes this creates problems, but it is made worse...
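
          For the record, a short Python check of both readings of "halves every two years" (assuming the thread's 22nm / 2014 starting point; the dates are otherwise hypothetical):

            from math import sqrt

            H_ATOM_NM = 0.11        # ~diameter of a hydrogen atom
            linear = area = 22.0    # nm, assumed 2014 feature size
            for year in range(2016, 2052, 2):
                linear /= 2         # "length halves every 2 years" reading
                area /= sqrt(2)     # "area halves" reading: length / sqrt(2)
                print(year, round(linear, 4), round(area, 2))
            # linear halving goes below 0.11nm by 2030; area halving not until ~2046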

      • If neuron-like processing turns out to be advantageous, there will be much more efficient ways to implement neurons than using tens of thousands of logic gates to simulate each one.
      • by saider ( 177166 )

        Moore's law applies to transistor count, and the atomic limit only applies if you limit yourself to two-dimensional chips.

        • by fnj ( 64210 )

          Do you really believe that? Even if your transistor is 100 atoms high, it still can't be less than one atom wide or deep.

    • by Anonymous Coward

      Here's hoping they actually keep working at it. You know what IBM are like!
      All those plans for Cell, all wasted. Then Power went down the drain, and one of their largest buyers (Apple) ditched them because Power was lacking compared to x86, which is just holding everything back.

      This is a genuinely interesting thing, possibly the best thing they have made in the longest time, in fact.
      I can't see any reason DARPA wouldn't also be very interested in it, if it works as well as they say it does. Already it is...

    • It's really hard to say how many artificial neurons we would need to make a human-like intelligence, but it's certainly going to be fewer than the number of neurons in a human head. Computers already do a heck of a lot of tasks better than a human. Heck, using traditional computing methods with just a couple of these chips for image recognition and the like would already make a beast of a machine.
      • Yep, like image recognition and audio recognition.

        Oh wait.

        Computers can do logical operations better, yes. Computers can't do fuzzy math, real-time image recognition, or real-time audio recognition. Let me know when a computer can "see" with a pair of cameras, identify an object heading toward the CPU (not just the cameras), and adjust its motors to dodge the incoming object. Bugs can do that much, yet computers can't.

        • by mark-t ( 151149 )
          Babies can't do any of those things very well either.
        • Let me know when a computer can "see" with a pair of cameras, identify an object heading toward the CPU (not just the cameras), and adjust its motors to dodge the incoming object.

          That actually already exists.
          It's a car's collision avoidance system.
          It's already a standard option from some manufacturers (e.g., Volvo), and should become mandatory in the EU somewhere soonish.

          Some, like Mobileye, rely entirely on cameras, while others integrate other sensors into the mix, like radar, infrared lasers, etc.

          But yeah, I see your point: complex tasks require complex networks, way more than this chip.

      • by Anonymous Coward

        Your brain has over a dozen different types of neurons with different functions, and individual neurons themselves can have varying structures that perform more complex functions (like ANDing/ORing/negating within the same cell with different groupings of inputs).

        Some brain neurons have thousands of inputs from nearly that many nearby nerve cells, and brains have overall layer patterns, often with broad regional interconnects for different specific functions.

        A million standard IBM neurons, each with 256 synapse inputs...

    • by Anonymous Coward

      "IBM has already succeeded in building a 16-chip system with sixteen million programmable neurons and four billion programmable synapses. The next step is creating a system with one trillion synapses that requires only 4kW of energy. After that IBM has plans to build a synaptic chip system with ten billion neurons and one hundred trillion synapses that consumes only one kilowatt of power and occupies less than two liters of volume."

      I think the IBM roadmap is more aggressive than Moore's law, and of course g...

    • by Anonymous Coward

      Each synapse contains dozens or hundreds of individual receptors that interact with the chemicals (neurotransmitters) being released to transmit the message. Certain types of receptors, called metabotropic, set off a cascade of enzymatic reactions inside the cell that represents further, highly complex, information processing. So when calculating the number of processing units in the brain, you have to go well beyond counting synapses. It's also worth noting that some of the interactions that take place can...

    • The human brain needs all that parallelism because its switching rate is abysmal, something like 200Hz. We ought to be able to beat that by a factor of a million without setting anything on fire, so adjust your numbers accordingly.
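      (For scale: 200Hz x 10^6 = 200MHz, still well below a modern CPU clock, so a millionfold serial speedup over a biological neuron is at least plausible on paper.)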
    • You can simulate this guy [dailymail.co.uk] by 2030 then.

  • Is it coming soon?
  • by gelfling ( 6534 ) on Friday August 08, 2014 @08:27AM (#47629229) Homepage Journal

    Assuming, of course, this chip can hold 2-hour conference calls with 40 other chips and pound out 240-page PowerPoints.

  • HAL 9000? (Score:2, Funny)

    by Anonymous Coward

    Where are all the HAL 9000 jokes? HAL was built by IBM in "2001: A Space Odyssey" - perhaps this is an example of life imitating art?

  • "Human brain has around ten-to-the-tenth neurons. By third year Mike had better than one and a half times that number of neuristors. And woke up." -- The Moon is a Harsh Mistress

    A.

  • Is this pulse recognition closer to Morse code or bar code? Obviously pulse-code recognition could go beyond binary bit codes, but the hardware must be seriously different from what we have now. It could even be analog.
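
    Roughly speaking, neither: in spiking hardware the information is carried by the timing and rate of pulses rather than by static bit patterns. A minimal leaky integrate-and-fire neuron in Python gives the flavor (an illustrative sketch only - TrueNorth's actual neuron model is richer than this):

        def lif_neuron(inputs, leak=0.9, threshold=1.0):
            # Yield 1 (a spike) or 0 per time step for a stream of input currents.
            v = 0.0                     # membrane potential
            for current in inputs:
                v = v * leak + current  # integrate input, let charge leak away
                if v >= threshold:      # fire a pulse and reset
                    yield 1
                    v = 0.0
                else:
                    yield 0

        print(list(lif_neuron([0.3, 0.3, 0.3, 0.6, 0.2, 0.9])))
        # [0, 0, 0, 1, 0, 1] - the "message" is in when the spikes occur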
  • As good as any other theory, I guess.

  • by Scottingham ( 2036128 ) on Friday August 08, 2014 @10:39AM (#47630017)
    Every time one of these damn 'neural computers' comes out, people tend to equate the number of neurons and synapses and think 'hey, if we can get to the number of human neurons... Presto!!!!1'

    Brains are waay more complicated than just neurons and synapses. Just taking the neurotransmitters into account makes the whole charade come crashing down. Then there is the glial network that, surprise surprise, does an enormous amount of complex work. There's even recent research suggesting that the branching patterns of the neurons perform complex computations. There are chemical gradients in the brain that act as a sort of addressing system.

    tl;dr Brain on a chip? Yeah fucking right.
