Contemporary Logic Design
author | Randy H. Katz |
pages | 699 |
publisher | Addison-Wesley |
rating | 9/10 |
reviewer | Deepak Saxena |
ISBN | 0-8053-2703-7 |
summary | A good, thorough introduction to the world of digital logic design. |
The Scenario
You can code Perl in your sleep, answer computer questions for all your non-geek friends, and create robust database-backed web sites for a living. The best description for you is that of software artist: You weave patterns of 1's and 0's that bring life to what would otherwise just be a hunk of silicon. However, you've always wondered how those 1's and 0's you create actually work. How does the flow of electrons through the silicon's microscopic pathways turn into the addition of two numbers? How does writing "*foo = 3;" actually put "3" into memory location foo? If you've ever asked any of these questions, you're ready to delve into the world of digital logic and computer architecture, and this book was written with you in mind.
What's Good?
Contemporary Logic Design provides a thorough introduction to the world of digital logic design. The author does an excellent job of not only presenting the concepts behind hardware design but also covering common pitfalls such as timing issues and dealing with the fact that hardware is not perfect. The book is logically divided into three groups of chapters that build on each other toward the final goal: the design of a simple 16-bit processor.
The first five chapters of the book cover the concept of combinational logic: simple circuits that map a given input to a given output, with no feedback from output to input (for example, 1 AND 0 = 0). First, the author covers the basic building blocks of digital logic: AND, OR, and NOT gates. From here, the subject matter expands into more advanced circuits built from combinations of these gates. The fifth chapter completes the first section with excellent information on the representation of numbers in hardware and the implementation of basic add, subtract, and multiply circuits.
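To make "combinational" concrete, here is a minimal Python sketch (mine, not the book's) of basic gates composed into a one-bit full adder, the same building block the book's arithmetic chapter wires up in hardware:

```python
# Basic gates as pure functions on single bits (0 or 1): output depends
# only on the current inputs, with no stored state or feedback.
def AND(a, b): return a & b
def OR(a, b): return a | b
def NOT(a): return 1 - a
def XOR(a, b): return OR(AND(a, NOT(b)), AND(NOT(a), b))

def full_adder(a, b, carry_in):
    """Add three one-bit inputs; return (sum, carry_out)."""
    s = XOR(XOR(a, b), carry_in)
    carry_out = OR(AND(a, b), AND(carry_in, XOR(a, b)))
    return s, carry_out

# 1 + 1 with no carry in: sum bit 0, carry out 1
print(full_adder(1, 1, 0))  # (0, 1)
```

Chaining such adders bit by bit, carry to carry, gives the ripple-carry adder that multi-bit add circuits are built from.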
Chapters six through ten teach the reader about the world of sequential logic design. Sequential logic is logic in which the previous state of the system affects the next output. Sequential operation is at the heart of computer systems, and this is where the book excels. The basic theory of sequential logic is covered, and several useful examples, such as binary counters, a simple DRAM chip, and a vending machine controller, are used to demonstrate the principles.
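As a rough illustration of the difference from combinational logic, here is a small Python sketch (again mine, not from the book) of a binary counter, one of the examples mentioned above: the next output depends on stored state, with each method call standing in for one clock edge.

```python
class BinaryCounter:
    """A 3-bit binary counter: the defining property of sequential
    logic is that the next state is a function of the current state."""

    def __init__(self, bits=3):
        self.bits = bits
        self.state = 0  # plays the role of the flip-flops holding the count

    def clock(self):
        # One "rising clock edge": compute next state from current state,
        # wrapping around when all bits are high.
        self.state = (self.state + 1) % (1 << self.bits)
        return self.state

counter = BinaryCounter()
print([counter.clock() for _ in range(10)])  # 1..7, then wraps: 0, 1, 2
```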
The final two chapters provide an introduction to computer architecture and implementation. An excellent overview of computer organization is provided, and a 16-bit CPU is used as a case study of implementation issues.
While the book covers hardware, the author does an excellent job of avoiding overly low-level issues such as resistance, capacitance, and transistors. In the few places where circuit design issues do come up, they are generally explained in enough detail that someone with no experience in electronics can follow. For those who are interested in the lowest-level details, an appendix provides information on how digital gates are built up from basic analog components.
What to Watch Out For?
This is not a book that you can casually read. While the information is well presented, it is difficult material, and you will often need to re-read a section several times before you clearly understand it, so plan to spend a few months with this book.
In addition, many of the concepts in the book cannot really be completely understood without seeing them in action. For this reason, if you are interested in this material and get this book, I suggest you do one of the following: a) go to your local Radio Shack or electronics store and pick up one of those "101 digital logic projects" kits, or b) pick up some digital logic simulation software (see this page on Freshmeat for a list of Linux offerings). Either option will allow you to actually build the circuits that are described and see how changing certain aspects changes their behavior.
In Summary
If you want to learn about computer hardware design, this is the book for you. It provides a thorough introduction to the subject without requiring much previous knowledge of electronics. The only caveats are that you should set aside plenty of time to digest the information contained within this tome, and that you should get some real digital hardware to experiment with as you learn the material.
Pick this book up at Amazon.
Table of Contents
- Introduction
- Two-Level Combinational Logic
- Multilevel Combinational Logic
- Programmable and Steering Logic
- Arithmetic Circuits
- Sequential Logic Design
- Sequential Logic Case Studies
- Finite State Machine Design
- Finite State Machine Optimization
- Finite State Machine Implementation
- Computer Organization
- Computer Implementation
- Appendix A: Number Systems
- Appendix B: Basic Electronic Components
Re:..., because they don't know better :) (Score:1)
People talk a lot about (digital) "hardware" and "software," but they're really the same thing (well, if you ignore analog effects). Understanding how the hardware works can provide great insight about how to design software. Hardware design was probably the first type of "object-oriented" or "modular" programming. It's a lot of high-level design, specifying interfaces and plugging components together.
And some of the best hacks were done in hardware. :)
A big issue in digital logic design is latency and synchronization, which is also an issue in software as systems become increasingly and pervasively parallel. People who are good at one of these fields will have developed a skill set that should transfer well to the other.
Bingo. A microprocessor is really just one big parallel program.
--
Re:Another good one... (Score:1)
Computer Architecture: A Quantitative Approach by Hennessy and Patterson
This one gets into more detail on how modern processors work, including branch prediction and dynamic scheduling. It also covers everything in the "Organization and Design" book.
These books assume the reader knows something about digital logic design. Johnson's Superscalar Microprocessor Design is also a good one.
--
Re:Textbook (Score:1)
Looks like Renrior for Dummies (Score:1)
This book seems to get really heavy into state logic, which is really not as important as implementation. You can solve any problem by adding states, but it just slows things down. Being able to implement good forwarding, hazard detection or parallel logic seems to be a more important skill these days.
Re:All "Contemporary" Logic Design is OUT OF DATE! (Score:1)
Here's the COUNTER-POINT:
Boolean logic was NEVER DESIGNED to work in computers!
Digital circuits require a "control" line to work. In the original mathematics, this was the "mathematician." Again, this is fine for boolean calculations by hand, but not autonomously in digital computers.
So when computers rolled around, someone thought, hey, let's just use a synchronous clock for boolean logic. Well, that was great for 2KHz, 1MHz and probably as high as 100MHz. But for 1GHz?! FORGET IT!
In essence, the longer we keep teaching boolean as a form of digital logic, the harder computers will be to design at higher speeds and lower densities!
Again, boolean algebra is NOT the IDEAL MATH to use for digital circuits. Our NCL process is.
With NCL, you can describe MORE GATES, MORE DESIGNS. And things like INVERTS require NO GATES.
Additionally, boolean clocked gates require two states, low and high. Even when a gate is "off", there is still some low voltage running through it. With NCL and its dual-rail implementation, OFF MEANS OFF and you use NO POWER when it is off.
While boolean logic DOES exist AND there should be books written on it and its theory, any "CONTEMPORARY" book on Digital Logic Design should AT LEAST mention asynchronous logic design and concepts like NCL and other techniques.
The days of using elementary boolean gates are over. It is NOT ideal for digital logic design. Luckily for the industry, companies like Theseus Logic are designing software that converts boolean circuits into non-boolean circuits for the computers of tomorrow.
-- Bryan "TheBS" Smith
ACK!!!! GOD NOOOOOOOOOOOOOO!!!!!!!!!!!! (Score:1)
All I can say is that this book made me switch majors from Comp Eng to Electrical Engineering
Great Books, both of 'em (Score:1)
(Which we had to do last year)
Re:Booleen algebra? (Score:1)
Re:This is my college textbook! (Score:1)
Re:uhh...this book is old. new edition? (Score:1)
Here's a free Windows circuit simulator (Score:1)
This will make a difference.... (Score:1)
How has this knowledge helped me? I have a grasp of computer logic and math, as well as storage and other hardware-related issues, that I often find lacking in the software-only people I know. I tend to be able to look at output and see where logic errors are tripping up the code, and I can still convert DEC=HEX=BIN faster in my head than on a calculator. So I can often find neat hacks and fix bugs faster, because I can see the patterns of 1s and 0s that are involved ("Gosh, if I mask out all but bits 7 and 9 we can use a simple 'if' statement to detect this pattern" or "gosh, this fails every time bit 3 goes high"). When it comes to drives and cross-platform coding, this knowledge is even more valuable.
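For the curious, here is a small Python sketch of the bit-masking trick described above. The bit positions come from the comment's examples, but the check() function and the test values are hypothetical, made up for illustration:

```python
# Mask selecting only bits 7 and 9 (bit 0 = least significant).
MASK = (1 << 7) | (1 << 9)  # 0b1010000000 = 0x280

def check(value):
    if value & MASK == MASK:    # both bit 7 and bit 9 are high
        print(f"{value:#06x}: pattern detected")
    elif value & (1 << 3):      # "fails every time bit 3 goes high"
        print(f"{value:#06x}: bit 3 set, expect trouble")

check(0x280)  # 0x0280: pattern detected
check(0x008)  # 0x0008: bit 3 set, expect trouble
```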
All in all knowing how the computer works does make for better programming.
In Service
Charles Puffer
Re:Not for the Slashdot crowd... (Score:1)
While some may be software "artists" (a well-deserved title for some), some of us also happen to be hardware folk (I'm in chip design at the moment, in a large blue building) who love to play with programming but really love the lower levels. I think that there is a pretty good mix here overall, and like I said, folks worrying about device drivers or certain areas of kernel development need to have this background.
Just my $.02
Re:All "Contemporary" Logic Design is OUT OF DATE! (Score:1)
uhh...this book is old. new edition? (Score:1)
I guess I was under the wrong impression that
So what's the next
-dr0ne
ECE 240 At Carnegie Mellon (Score:1)
Re:ECE 240 At Carnegie Mellon (Score:1)
Digital Logic Simulator (Score:1)
We used Capilano [capilano.com] to simulate the circuits we were building. As far as I know, they only make Windows and Mac versions, but some of my classmates said it ran quite nicely under Wine.
Digital Logic was pretty fun, but often gave me a headache
Dana
Really worth its price? (Score:1)
Re:ECE 240 At Carnegie Mellon (Score:1)
Re:uhh...this book is old. new edition? (Score:1)
EE4 at Caltech (Score:1)
There's a reason for starting with synchronous... (Score:1)
That being said, simply because current microprocessors aren't using a synchronous design ideology doesn't mean that synchronous design is dead. Take RAM. There's a reason it's called SDRAM, for Synchronous Dynamic RAM. The circuit that refreshes the RAM is a synchronous LFSR (Linear Feedback Shift Register). Synchronous design is not dead at all; it has its place.
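For readers who haven't met one, here is a minimal Python sketch of an LFSR. The 4-bit width and tap positions are illustrative choices of mine, not anything from the comment (a real refresh address counter would be wider):

```python
def lfsr_sequence(seed=0b1001, taps=(3, 2), bits=4, steps=15):
    """Fibonacci LFSR with taps at bits 3 and 2, a maximal-length
    choice for 4 bits. Each loop iteration is one clock tick."""
    state = seed
    out = []
    for _ in range(steps):
        # Feedback bit = XOR of the tapped bits, computed once per "clock".
        fb = 0
        for t in taps:
            fb ^= (state >> t) & 1
        # Shift left, drop the high bit, feed the XOR result back in.
        state = ((state << 1) | fb) & ((1 << bits) - 1)
        out.append(state)
    return out

print(lfsr_sequence())  # visits all 15 non-zero 4-bit states, then repeats
```

Because the next state is a fixed function of the current state and it advances once per clock, an LFSR is a textbook example of a small synchronous sequential circuit; it is cheaper in gates than a binary counter, which is why it shows up in roles like refresh addressing where the ordering of states doesn't matter.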
Can anyone recommend an oscilloscope (Score:1)
You really can't do much electronics without one of those babies.
I don't think I can afford a real one, but what about oscilloscope cards?
I'm sure I could turn one of these 386 boards & mono monitors I have into a nice one if I had a card.
Re:Not for the Slashdot crowd... (Score:1)
As a part-time embedded systems designer, I routinely have to consider both hardware and software issues, and in my estimation the mental models one builds from some knowledge of both sides of the "science" help a lot with solution design.
Regarding the book itself, I've skimmed through the version published on the web. Although a few parts are somewhat dated, it seems quite useful for teaching principles. In the real world things aren't as clear-cut, of course; I spend much more time ferreting out race conditions and worrying about shielding, decoupling and board layout, power supply ripple specs, and so forth, rather than designing finite state machines. I've yet to see a textbook that addresses these issues in a useful way... let alone the hardware/software interactions.
For instance, consider something like a Palm Pilot. It's in sleep mode, and when you press a key, it wakes up and an interrupt routine is executed, which then does whatever the key means. On waking up, you get a huge power spike - the CPU's power consumption increases a thousandfold inside a few microseconds - and if the board designer hasn't allowed for that, when the power spike bounces back down the interrupt routine may crash. The programmer can stare at a source listing or emulator trace for months with no progress at all unless he knows about such hardware details.
I haven't seen demographics about the "slashdot crowd", but my impression is that perhaps half are quite young - they haven't messed around with valve radios or Heathkits before taking up programming [insert half-senile chuckle here]. Well, the hardware side of the business has progressed a lot - perhaps even more than the software side - since I started out 30 years ago... my advice to everybody is to take some time to study up on what's happening on the other side of the fence, it's worth it.
Re:Not for the Slashdot crowd...- yeah right! (Score:1)
In case you were wondering, I have been a computer geek since I was 13. My background has expanded to include Japanese Language and Literature and Electrical Engineering (Controls and Robotics)... AND I don't plan to limit myself by stopping there.
As they say "knowledge is power"
[/RANT]
Not so tasty... (Score:1)
This is also the book for the junior level EE and CS digital design courses at The University of Washington.
I can't say I really enjoyed this book at all. I already knew the boolean algebra from other CS courses I have taken and found the book to be too general.
I think I would have really enjoyed a book that spends a chapter or two on the basics of digital design and then introduces the rest of the material by showing the actual design of a computer from the ground up. In fact, I found such a book in the library in the last week of class (figures). I don't remember the name though.
Summary: If you are looking to learn more about How Computers Work, skip this book and look for something else. If you are looking for a (very) general intro to digital design, this may be the book for you.
ah - to feed off the tit of the government. (Score:1)
Amazing! (Score:1)
Excellent book!
I am amazed that the full text is available.
I actually took the course at Berkeley for which the book was written (CS150). Good class. I was EE, so the material was useful as an introduction to logic design. A year or two of more advanced academic work might put you in a position to understand some of the issues people working at AMD and Intel deal with. However, this is a book about Logic Design, not Digital Design. Two very different worlds.
Re:This is my college textbook! (Score:1)
A Decent Book If You Ask Me (Score:1)
digital designers Unite!!! (Score:1)
under-represented in net forums and associations. Let's see... there's the SDF, the SVASE, the Center for SW development. Now that way too! A renegade group, one asks?
state machines without timing issues (Score:1)
Two things about introducing NCL to people who do state machines for a living (electrical engineers):
1. No experienced engineer would believe you. I tried this so many times at the last place I worked (National Semiconductor) that I got tired. It is not that they don't know/trust you. Timing is just a *big* problem right now.
2. If they listen long enough to understand what you mean, they get excited (good as a starter if you have really bad news to tell them).
College text book (Score:1)
Booleen algebra? (Score:2)
Re:Textbook (Score:2)
There is another book, Computer Architecture: A Quantitative Approach (Amazon link [amazon.com]) by the same authors. It covers the same material, but does go into more depth (and covers superscalar processors, which I don't think CO&D does much with). So if you're really hardcore into this stuff, go for CA: AQA (it is used in the CMU courses beyond the intro architecture course). If you're not going hardcore, CO&D should be very much sufficient for your needs.
The only other differences besides depth between the two books are that A) the authors' names are switched on the cover and B) each author wrote the opposite chapters from those he wrote in CO&D.
I figured I'd explain the difference so if any of you see both books, you'd know which one was appropriate.
And in case you're wondering, Hennessy and Patterson were among the leaders of the "RISC-revolution". They are also (IMO) excellent authors, and the book includes plenty of diagrams and things to help you understand what's going on. If you've ever wondered how your CPU, cache memories, virtual memory, etc work, it's an excellent book to read.
Re:All "Contemporary" Logic Design is OUT OF DATE! (Score:2)
If I had thought that his post was actually serious, I might have gone into why nobody in their right mind would try to design a complex circuit using asynchronous design. Sure, async is useful for some things, but nobody would use it to design something like a CPU...you'd never manage to get the thing designed and debugged.
But I think you're right, it's spam. Not only that, but nonsensical spam. I really don't know who the target audience is, since (presumably) anyone who knows enough about logic design to be interested in their tools would know better.
..., because they don't know better :) (Score:2)
This is my college textbook! (Score:2)
This is a good textbook; the one thing that it really lacks is examples of how to do the homework problems at the end of each chapter.
Sure, you can scan back through chapter after chapter and eventually see how to do it, but it doesn't come easily at all.
Another good one... (Score:2)
This book is a well-written foray into the internals of processor architecture, memory management structures, etc. ... one of the best books I've seen on the subject(s). It walks through examples based on a MIPS processor, including pipelining, and really does a good job of making it 'easy' to understand. Kind of expensive for the casual reader 8^) but well worth it for anyone who wants to really learn it.
Re:All "Contemporary" Logic Design is OUT OF DATE! (Score:2)
Just an observation.
Re:Not for the Slashdot crowd... (Score:2)
Just what is the "slashdot crowd"? Are they childish FUD-slingers who instantly attack anything that has the word "Microsoft" in it? Are they computer science experts who give advice to Jane's Intelligence Review? Are they uber-geeks who absolutely must have the latest and greatest kernel? Are they programmers who don't care about the underlying structure of the art they create?
I, for one, am quite interested in electronics... but I am primarily a programmer. I know enough to wire up a breadboard according to a circuit diagram, and to troubleshoot and modify the results. And I'm one of the few programmers at my college who can help out the electronics majors from time to time.
I consider myself part of the "slashdot crowd". I thought this was "news for nerds", not "news for perl programmers".
Textbook (Score:3)
Contemporary Logic Design is a good this-is-what-gates-do book, but _COD_ is great for learning datapaths. You basically learn MIPS assembly and then design a processor to run the assembly. You must, of course, know the basics of gates (what a mux does, what a NAND does, etc.) before you start _COD_. It covers everything from branch prediction to cache architectures. Yummy.
I guess I'm just more interested in the architecture end than the gate end. I'm certainly not interested in the chemistry/physics end, but alas, that is a required course, that I am taking this semester... and need to get back to work on.
The entire book is available online! (Score:4)
http://http.cs.berkeley.edu/~randy/CLD/CLD.html [berkeley.edu]