Education Supercomputing Hardware Technology

MIT Focuses on Chip Optimization

eldavojohn writes "MIT's Microsystems Technology Laboratories is focusing on the manufacturing of chips as the variables that affect chip quality become more and more influential. From one of the researchers: 'The extremely high speeds of these circuits make them very sensitive to both device and interconnect parameters. The circuit may still work, but with the nanometer-scale deviations in geometry, capacitance or other material properties of the interconnect, these carefully tuned circuits don't operate together at the speed they're supposed to achieve.'"
  • Not just lithography (Score:5, Informative)

    by cannonfodda ( 557893 ) on Friday August 17, 2007 @04:27AM (#20258487)
    This isn't really that new. There are folk who have been looking at characterising nano-scale variability for years, and there is a LOT more to it than just the fluctuations introduced by lithographic limits; see Glasgow uni's device modelling group [gla.ac.uk]. What's odd is that these guys are estimating the fluctuations from mathematical models when there is pretty good measured data available for the 65nm technology node already.
  • Just to clarify (Score:4, Informative)

    by tygerstripes ( 832644 ) on Friday August 17, 2007 @05:05AM (#20258597)
    This work is for RFICs (communication chips), not your 10-Core Hyperon or whatever. More importantly, what they're doing is indirectly modelling the correlation between various electrical properties of the micro-components in order to optimise design stability prior to manufacture. This has no direct impact on the manufacturing process itself, but it does lead to designs that are more robust to fabrication variation.
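    Loosely sketched, "modelling the correlation between electrical properties" can look like the toy correlated Monte Carlo below. All of the numbers (means, spreads, the 0.6 correlation) are invented for illustration; this is not MIT's actual method.

        import numpy as np

        # Two correlated process parameters: threshold voltage and interconnect
        # capacitance. Means, sigmas and correlation are made up for the demo.
        mean = np.array([0.4, 100e-15])      # Vth (V), C (F)
        sigma = np.array([0.02, 5e-15])
        rho = 0.6                            # assumed Vth/C correlation

        cov = np.array([
            [sigma[0] ** 2,              rho * sigma[0] * sigma[1]],
            [rho * sigma[0] * sigma[1],  sigma[1] ** 2],
        ])

        rng = np.random.default_rng(0)
        vth, cap = rng.multivariate_normal(mean, cov, size=10_000).T
        print(f"sampled correlation: {np.corrcoef(vth, cap)[0, 1]:.2f}")
        # A designer would push these samples through timing models to see
        # how stable the design is across the sampled process corners.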

    Ultimately this will have a limited impact on your desktop's Giggerhurts, somewhere way down the line, but it's nothing you'll notice and, for most of us, nothing we'll really understand. Unless the mathematical basis of chip-fab optimisation is your field, this isn't going to mean much.

  • Right. (Score:3, Informative)

    by lheal ( 86013 ) <lheal1999NO@SPAMyahoo.com> on Friday August 17, 2007 @07:21AM (#20259001) Journal
    We've been doing that kind of stuff at Illinois [uiuc.edu] for a while.
  • Re:Just to clarify (Score:4, Informative)

    by imgod2u ( 812837 ) on Friday August 17, 2007 @11:06AM (#20261303) Homepage
    This affects digital chips more than you think. Process variations are a huge problem as we get to smaller and smaller feature sizes. While analog circuits are much more sensitive to variations in threshold-voltage, capacitance and resistance (and cross inductance), keep in mind that all digital circuits are still analog. They are simply interpreted as 1's and 0's.

    With this in mind, consider a digital circuit that's driving its output from the voltage of a logical 0 (let's call it 0V) to a logical 1 (let's say 5V, for early TTL lovers). That voltage isn't going to rise instantaneously. The time it takes to go from 0 to 5 volts will depend on the following (there's a rough worked example after the list):

    1. The various capacitances of the circuit, both parasitic and device capacitance.
    2. Resistance in various circuit elements.
    3. Cross-inductance.
    4. Threshold voltages for all of the transistors.
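    To put rough numbers on points 1 and 2, treat the driver and its load as a first-order RC network, V(t) = VDD * (1 - exp(-t/RC)). The R, C and threshold values below are invented for the example:

        import math

        VDD = 5.0       # logical 1, volts
        VTH = 2.5       # voltage at which the next gate reads a '1' (assumed)
        R = 1e3         # effective driver resistance, ohms (assumed)
        C = 100e-15     # load plus parasitic capacitance, farads (assumed)

        # Time for the output to cross the logic threshold:
        t_cross = -R * C * math.log(1 - VTH / VDD)
        print(f"crosses {VTH} V after {t_cross * 1e12:.1f} ps")

        # A +/-10% process spread in C shifts the delay proportionally:
        for c in (0.9 * C, C, 1.1 * C):
            t = -R * c * math.log(1 - VTH / VDD)
            print(f"C = {c * 1e15:.0f} fF -> {t * 1e12:.1f} ps")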

    Having an accurate model to statistically predict these variations will allow chip designers to better estimate the speed of their digital circuits. So if the target clock speed for a chip is 10 GHz, they can know, before committing to silicon, roughly how many chips in a batch will meet that speed.
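    As a rough sketch of that kind of yield estimate (the delay distribution and every number here are assumptions for the example, not anyone's real data):

        import random

        # Toy Monte Carlo: sample per-chip critical-path delay from an assumed
        # process spread and count how many chips meet a 10 GHz (100 ps) budget.
        TARGET_PERIOD = 100e-12   # 10 GHz target
        NOMINAL_DELAY = 90e-12    # nominal critical-path delay (assumed)
        SIGMA = 7e-12             # delay spread from process variation (assumed)

        random.seed(1)
        samples = 100_000
        good = sum(
            1 for _ in range(samples)
            if random.gauss(NOMINAL_DELAY, SIGMA) <= TARGET_PERIOD
        )
        print(f"estimated yield at 10 GHz: {100 * good / samples:.1f}%")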

    Other factors also come into play as we get to lower- and lower-powered chips. With a VDD of 1.0V or below (as in ultra-low-voltage chips), cross-inductance, capacitance on the power rails, etc. can actually affect the stability of a digital circuit: injected noise can turn a voltage that was meant to be a logical 0 into a logical 1. And with modern chips switching power to regions of the die on and off, the di/dt problem comes in. Without accurate predictions of the impedances across the chip, reflections on the power rails can push a voltage above VDD, and if the transistors weren't designed conservatively (in order to meet power and speed goals), they could burn out.
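    For a feel for the scale of the di/dt problem, the rail excursion is roughly V = L * di/dt. The inductance and current step below are invented to show why a 1.0V VDD leaves so little margin:

        # Numbers are illustrative assumptions, not measured values.
        L_RAIL = 1e-9   # effective power-rail inductance, henries (assumed)
        DI = 10.0       # current step when a block powers on, amps (assumed)
        DT = 1e-9       # ramp time of that current step, seconds (assumed)

        droop = L_RAIL * DI / DT
        print(f"transient excursion on the rail: {droop:.2f} V")
        # 10 V against a 1.0 V supply is why impedance prediction matters;
        # real designs add decoupling capacitance to keep this far smaller.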
