Education Supercomputing Hardware Technology

MIT Focuses on Chip Optimization

eldavojohn writes "MIT's Microsystems Technology Laboratories is focusing on the manufacturing of chips as the variations that affect chip quality become more and more influential. From one of the researchers: 'The extremely high speeds of these circuits make them very sensitive to both device and interconnect parameters. The circuit may still work, but with the nanometer-scale deviations in geometry, capacitance or other material properties of the interconnect, these carefully tuned circuits don't operate together at the speed they're supposed to achieve.'"
This discussion has been archived. No new comments can be posted.

  • Not just lithography (Score:5, Informative)

    by cannonfodda ( 557893 ) on Friday August 17, 2007 @04:27AM (#20258487)
    This isn't really that new. There are folk who have been looking at characterising nano-scale variability for years, and there is a LOT more to it than just the fluctuations introduced by lithographic limits. See Glasgow uni's device modelling group [gla.ac.uk]. What's odd is that these guys are estimating the fluctuations based on mathematical models when there is pretty good data available for the 65nm technology node already.
    • by tool462 ( 677306 )
      Agreed. It would be interesting to see the actual paper, since, based on what's in this article, they don't seem to have uncovered anything remarkable.
    • Right. (Score:3, Informative)

      by lheal ( 86013 )
      We've been doing that kind of stuff at Illinois [uiuc.edu] for a while.
    • by moeinvt ( 851793 )
      "This isn't really that new. . . "

      The article is extremely short on details, but it sounds very similar to what IBM has done in the area of "statistical timing" over the last couple of years.

      http://www.physorg.com/news4385.html [physorg.com]
    • Indeed. Simulating and optimizing for process faults is often accomplished as a form of Monte Carlo testing [wikipedia.org], where a stochastic sweep is done over various possible process faults to determine the likelihood that transistor parameters like gain or threshold voltage come out as expected. This is often done at the analog design level as a necessary simulation step.

      I couldn't find much on the web about this besides a patent from 1994 [wikipatents.com].
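
      For anyone who hasn't seen it done, a minimal sketch of that kind of sweep looks something like this (the nominal Vth, the spread and the spec window below are made-up numbers for illustration, not from any real process):

        # Toy Monte Carlo sweep over an assumed Gaussian threshold-voltage spread.
        import random

        NOMINAL_VTH = 0.45                 # nominal Vth in volts (assumed)
        SIGMA_VTH = 0.02                   # 1-sigma process variation (assumed)
        SPEC_LOW, SPEC_HIGH = 0.40, 0.50   # acceptable Vth window (assumed)

        def sample_vth():
            """Draw one device's threshold voltage from the assumed spread."""
            return random.gauss(NOMINAL_VTH, SIGMA_VTH)

        def yield_estimate(trials=100_000):
            """Fraction of sampled devices whose Vth lands inside the spec window."""
            in_spec = sum(SPEC_LOW <= sample_vth() <= SPEC_HIGH for _ in range(trials))
            return in_spec / trials

        print("Estimated parametric yield: %.3f" % yield_estimate())

      Real flows do this over full transistor models inside a circuit simulator, but the principle is the same.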
  • Just to clarify (Score:4, Informative)

    by tygerstripes ( 832644 ) on Friday August 17, 2007 @05:05AM (#20258597)
    This work is for RFICs (communication chips), not your 10-Core Hyperon or whatever. More importantly, what they're doing is indirectly modelling the correlation between various electrical properties of the micro-components in order to optimise design stability prior to manufacture. This has no direct impact on the manufacturing process, but it does enable more fabrication-robust design.
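
    To give a flavour of what "modelling the correlation" means in practice, here's a rough sketch that draws correlated interconnect R and C deviations from a joint Gaussian. The means, sigmas and the 0.7 correlation coefficient are invented for illustration, not MIT's numbers:

      # Sketch: sampling correlated R/C variations; all numbers are assumed.
      import numpy as np

      mean = np.array([100.0, 2.0e-15])           # nominal R (ohm) and C (F), assumed
      sigma_r, sigma_c, rho = 5.0, 0.1e-15, 0.7   # spreads and their correlation, assumed

      cov = np.array([[sigma_r**2,              rho * sigma_r * sigma_c],
                      [rho * sigma_r * sigma_c, sigma_c**2]])

      rng = np.random.default_rng(0)
      samples = rng.multivariate_normal(mean, cov, size=10_000)  # shape (10000, 2)

      rc_delay = samples[:, 0] * samples[:, 1]    # per-sample RC product
      print("mean RC = %.3e s, std = %.3e s" % (rc_delay.mean(), rc_delay.std()))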

    Ultimately this will have a limited impact on your desktop's Giggerhurts, somewhere way down the line, but it's nothing you'll notice and, for most of us, nothing we'll really understand. Unless the mathematical basis of chip-fab optimisation is your field, this isn't going to mean much.

    • Re: (Score:2, Insightful)

      Ultimately this will have a limited impact on your desktop's Giggerhurts, somewhere way down the line, but it's nothing you'll notice and, for most of us, nothing we'll really understand. Unless the mathematical basis of chip-fab optimisation is your field, this isn't going to mean much.

      There's plenty on /. that won't affect me personally (nor the vast majority of slashdotters). That doesn't lessen our interest in the matter. Perhaps plenty of slashdotters don't understand this now, but having been exposed to it, some of us may take an interest in the subject. Don't underestimate the value of (or interest in) information, regardless of how useless it may seem.

      • You're probably right; I guess I'm just anticipating the crapflood of kids who want to know when it'll ramp up their frame-rate (which, of course, it won't). It doesn't help that the story is vague enough to give that very impression, and knowing how many people love to RTFPhysorgA...

        Maybe I'm just getting old, but there seem to be an awful lot more Ritalin-kids on /. these days. Maybe I'll emigrate to worsethanfailure.com.

    • Re:Just to clarify (Score:4, Informative)

      by imgod2u ( 812837 ) on Friday August 17, 2007 @11:06AM (#20261303) Homepage
      This affects digital chips more than you think. Process variations are a huge problem as we get to smaller and smaller feature sizes. While analog circuits are much more sensitive to variations in threshold-voltage, capacitance and resistance (and cross inductance), keep in mind that all digital circuits are still analog. They are simply interpreted as 1's and 0's.

      With this in mind, consider a digital circuit driving its output from a logical 0 (let's call it 0 V) to a logical 1 (let's say 5 V, for early TTL lovers). That voltage isn't going to rise instantaneously. The time it takes to go from 0 to 5 volts will depend on:

      1. The various capacitances of the circuit, both parasitic and device capacitance.
      2. Resistance in various circuit elements.
      3. Cross-inductance.
      4. Threshold voltages for all of the transistors.

      Having an accurate model to statistically predict these variations will allow chip designers to better estimate the speed of their digital circuits. So if the target speed of a chip is 10 GHz, they can know, before they commit to silicon, roughly how many chips in a batch will meet that target.
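
      As a back-of-envelope example of how those factors set the speed, here's the classic single-node RC charging estimate. Every value below is assumed, just to show the arithmetic:

        # Back-of-envelope rise-time estimate for one RC-loaded node (values assumed).
        import math

        VDD = 1.0             # supply voltage in volts (assumed low-voltage process)
        V_SWITCH = 0.5 * VDD  # level the next gate reads as a logical 1 (assumed)
        R = 5.0e3             # effective driver resistance in ohms (assumed)
        C = 20.0e-15          # gate + wire load capacitance in farads (assumed)

        # v(t) = VDD * (1 - exp(-t / (R*C))); solve for the time to cross V_SWITCH
        t_rise = R * C * math.log(VDD / (VDD - V_SWITCH))
        f_max = 1.0 / t_rise  # crude bound if this node were the critical path

        print("rise to threshold: %.1f ps (~%.1f GHz)" % (t_rise * 1e12, f_max / 1e9))

      Shift R, C or V_SWITCH by a few percent and that frequency moves with them, which is exactly the spread the statistical models are trying to predict.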

      Other factors also play in as we get to lower and lower powered chips. With a VDD of 1.0V or below (as in ultra-low-voltage chips), cross-inductance, capacitance on the power rails, etc. can actually affect the stability of a digital circuit. Noise is injected that can turn a voltage that was meant to be a logical 0 into a logical 1. With modern chips turning power to regions of the chip on and off, the di/dt problem comes in. Without accurate predictions of the impedances across the chip, reflections on the power rails can cause a voltage that's higher than VDD and, if the transistors weren't designed conservatively (to meet power and speed goals), they could burn out.
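
      The di/dt point is easy to sanity-check with a one-liner; with made-up package numbers the disturbance is already a big slice of a 1.0 V rail:

        # Crude L*di/dt supply-disturbance estimate; every value here is assumed.
        L_RAIL = 0.5e-9    # effective package + grid inductance in henries (assumed)
        DELTA_I = 1.0      # current step when a block powers up, in amperes (assumed)
        DELTA_T = 2.0e-9   # how quickly that step happens, in seconds (assumed)

        v_bounce = L_RAIL * DELTA_I / DELTA_T
        print("rail disturbance: %.2f V" % v_bounce)   # 0.25 V on a 1.0 V supply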
  • harder on designers (Score:5, Interesting)

    by drakyri ( 727902 ) on Friday August 17, 2007 @05:53AM (#20258755)
    This isn't really anything new - shrinking process nodes always make life harder for designers. Each process (.25 um, 90 nm, etc.) comes with a set of design rules - for example, how close interconnects can be to each other without causing interference.

    The ruleset for quarter-micron was maybe forty pages. The ruleset for 90 nm was the size of a small phonebook. I don't even want to think about what the rules for 65 or 45 nm must look like.
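
    For the curious, any single spacing rule is trivial to check; it's the sheer number of interacting rules at 65 or 45 nm that makes the decks explode. A toy check (not a real DRC deck; the 0.1 um minimum and the layout below are invented) might look like:

      # Toy minimum-spacing check for parallel horizontal wires (illustrative only).
      MIN_SPACING_UM = 0.1   # assumed example rule

      # Made-up layout: (net name, centerline y in um, width in um)
      wires = [("net_a", 0.00, 0.10),
               ("net_b", 0.18, 0.10),
               ("net_c", 0.45, 0.10)]

      def edge_gap(w1, w2):
          """Edge-to-edge distance between two parallel wires."""
          (_, y1, wid1), (_, y2, wid2) = w1, w2
          return abs(y1 - y2) - (wid1 + wid2) / 2.0

      for i in range(len(wires)):
          for j in range(i + 1, len(wires)):
              gap = edge_gap(wires[i], wires[j])
              if gap < MIN_SPACING_UM:
                  print("violation: %s to %s, gap %.3f um" % (wires[i][0], wires[j][0], gap))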
    • by Narkov ( 576249 )
      At what point does the cost of refinement and R&D this process demands outweigh the benefits of increased yield?
    • by John Betonschaar ( 178617 ) on Friday August 17, 2007 @06:02AM (#20258791)
      Exactly. As a matter of fact I work for a company (not mentioning which, my boss wouldn't appreciate it) that develops software to migrate chips to smaller technologies, detect/fix design-rule violations, detect/fix litho hotspots, that kind of stuff. It is used by many well-known names in the IC industry. We've been in business for more than 10 years already, so this hardly sounds like something new.
    • Re: (Score:1, Funny)

      by Anonymous Coward
      The 65nm one is not much bigger than the 90nm one but it is in a vault like out of Mission Impossible and it has dry ice smoke flowing over it.

      The 45nm one is like the two-dimensional prison in Superman II that Zod and his chums get banished into. Except it has green-screen text characters on it like in the Matrix. And the 2001 music plays whenever you see it.
  • Shouldn't this be a technical paper in an electrical engineering journal?
     
    • by gamepro ( 859021 )
      From the article: "The researchers published their results in two papers in February and June. They also presented a paper on the modeling of variation in integrated circuits at this year's International Symposium on Quality Electronic Design." Indeed they are!
  • monkeys (Score:4, Funny)

    by stranger_to_himself ( 1132241 ) on Friday August 17, 2007 @06:19AM (#20258843) Journal

    I read the title as 'MIT Focuses on Chimp Optimization.'

    Thought maybe they'd been having trouble recruiting.

  • I really *love* science reporting like this:

    1. The "Symposium" was "March 26-28, 2007" ( this is OLD news )

    2. The MIT team presented an invited paper that has *no* abstract: "Variation (Invited Paper)", Duane Boning, et al.

    3. The paper they presented from the article is for consumer electronics, at 65nm scale, which is basically yesterday's processor technology (they should ask AMD and Intel about *their* experience in 65nm fab, although they are working on digital computing silicon).
