
Contemporary Logic Design

Contemporary Logic Design, written by Randy H. Katz, is reviewed by Deepak Saxena. The tome is a thorough introduction to the world of digital logic design. Read on to learn whether the book is for you.
Contemporary Logic Design
author: Randy H. Katz
pages: 699
publisher: Addison-Wesley
rating: 9/10
reviewer: Deepak Saxena
ISBN: 0-8053-2703-7
summary: A good, thorough introduction to the world of digital logic design.

The Scenario

You can code Perl in your sleep, answer computer questions for all your non-geek friends, and create robust database-backed web sites for a living. The best description for you is that of software artist: you weave patterns of 1s and 0s that bring life to what would otherwise be just a hunk of silicon. However, you've always wondered how those 1s and 0s you create actually work. How does the flow of electrons through the silicon's microscopic pathways turn into the addition of two numbers? How does writing "*foo = 3;" actually put "3" into memory location foo? If you've ever asked any of these questions, you're ready to delve into the world of digital logic and computer architecture, and this book was written with you in mind.

What's Good?

Contemporary Logic Design provides a thorough introduction to the world of digital logic design. The author does an excellent job of not only presenting the concepts behind hardware design, but also covering some of the common pitfalls, such as timing issues and dealing with the fact that hardware is not perfect. The book is logically divided into three groups of chapters that build on each other towards the final goal: the design of a simple 16-bit processor.

The first five chapters of the book cover the concept of combinational logic: the creation of simple circuits that take a given input and provide a given output, with no feedback between output and input (for example, 1 AND 0 = 0). First, the author covers the basic building blocks of digital logic: AND, OR, and NOT gates. From here, the subject matter expands into more advanced circuits that are built from combinations of these gates. The fifth chapter completes the first section with excellent information on the representation of numbers in hardware and the implementation of basic add, subtract, and multiply circuits.
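
To make the combinational idea concrete, here is a minimal Python sketch (mine, not the book's) in which each gate is a pure, stateless function of its inputs; composing them yields the kind of one-bit full adder the arithmetic chapter builds.

    # A minimal sketch, not from the book: combinational logic as pure functions.
    def AND(a, b): return a & b
    def OR(a, b):  return a | b
    def NOT(a):    return a ^ 1

    def XOR(a, b):
        # XOR composed only from AND, OR, and NOT, the way the early chapters compose gates
        return OR(AND(a, NOT(b)), AND(NOT(a), b))

    def full_adder(a, b, carry_in):
        # One-bit full adder: the building block of the adder circuits in chapter 5
        s = XOR(XOR(a, b), carry_in)
        carry_out = OR(AND(a, b), AND(carry_in, XOR(a, b)))
        return s, carry_out

    # Checking the truth table exhaustively is the standard way to verify combinational logic
    for a in (0, 1):
        for b in (0, 1):
            for c in (0, 1):
                print(a, b, c, full_adder(a, b, c))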

Chapters six through ten teach the reader about the world of sequential logic design. Sequential logic is logic in which the previous state of the system affects the next output. Sequential operation is at the heart of computer systems, and this is where the book excels. The basic theory of sequential logic is covered, and several useful examples, such as binary counters, a simple DRAM chip, and a vending machine controller, are used to demonstrate the principles.
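
For contrast, here is an equally rough Python sketch (again mine, not the book's) of sequential behavior: the output depends on stored state that changes once per clock tick, as in the binary counter example.

    # A minimal sketch, not from the book: a 2-bit binary counter as clocked state.
    class TwoBitCounter:
        def __init__(self):
            self.state = 0  # contents of the "flip-flops"

        def tick(self):
            # Next-state logic: increment modulo 4 on every clock edge
            self.state = (self.state + 1) & 0b11
            return self.state

    counter = TwoBitCounter()
    print([counter.tick() for _ in range(6)])  # [1, 2, 3, 0, 1, 2]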

The final two chapters provide an introduction to computer architecture and implementation. An excellent overview of computer organization is provided, and a 16-bit CPU is used as a case study of implementation issues.

While the book covers hardware, the author does an excellent job of keeping it from getting too low-level: he avoids delving into issues such as resistance, capacitance, and transistors. In a few places circuit design issues are brought up, but they are generally explained in enough detail that someone with no experience in electronics can understand them. For those who are interested in the lowest-level details, an appendix provides information on how digital gates are built up from basic analog components.

What to Watch Out For?

This is not a book that you can casually read. While the information covered is well presented, it is difficult material, and you will often need to re-read a section several times before you clearly understand it, so plan to spend a few months with this book.

In addition, many of the concepts in the book cannot really be completely understood without seeing them in action. For this reason, I suggest that if you are interested in this material and get this book, you do one of the following: a) go to Radio Shack or your local electronics store and pick up one of those "101 digital logic projects" kits, or b) pick up some digital logic simulation software (see this page on Freshmeat for a list of Linux offerings). Either option will allow you to actually build the circuits that are described and see how changing certain aspects will change their behavior.

In Summary

If you want to learn about computer hardware design, this is the book for you. It provides a thorough introduction to the subject without requiring much previous knowledge of electronics. The only warning is that you should have plenty of time in which to digest the information contained within this tome and that you should get some real digital hardware with which to experiment as you learn the material.

Pick this book up at Amazon.

Table of Contents

  1. Introduction
  2. Two-Level Combinational Logic
  3. Multilevel Combinational Logic
  4. Programmable and Steering Logic
  5. Arithmetic Circuits
  6. Sequential Logic Design
  7. Sequential Logic Case Studies
  8. Finite State Machine Design
  9. Finite State Machine Optimization
  10. Finite State Machine Implementation
  11. Computer Organization
  12. Computer Implementation
  Appendix A: Number Systems
  Appendix B: Basic Electronic Components

  • This is the best post I've read today.

    People talk a lot about (digital) "hardware" and "software," but they're really the same thing (well, if you ignore analog effects). Understanding how the hardware works can provide great insight about how to design software. Hardware design was probably the first type of "object-oriented" or "modular" programming. It's a lot of high-level design, specifying interfaces and plugging components together.

    And some of the best hacks were done in hardware. :)

    A big issue in digital logic design is latency and synchronization, which is also an issue in software on increasingly parallel, pervasive systems. People who are good at one of these fields will have developed a good skill set that should transfer well to the other.

    Bingo. A microprocessor is really just one big parallel program.

    --

  • I actually prefer the next level up:

    Computer Architecture: A Quantitative Approach by Hennessy and Patterson

    This one gets into more detail on how modern processors work, with branch prediction and dynamic scheduling. It also covers everything in the "Organization and Design" book.

    These books assume the reader knows something about digital logic design. Johnson's Superscalar Microprocessor Design is also a good one.

    --

  • At the University of Minnesota-Duluth (UMD), where I go, I am using Computer Organization and Design for my Computer Organization class. However, for the introductory ECE class that CS majors have to take, I used another book, Digital Logic Circuit Analysis & Design by Victor Nelson et al. Having only skimmed the web version of Contemporary Logic Design and compared it to Digital Logic Circuit Analysis & Design, the two books seem to cover the same main topics near the beginning. However, DLCA&D goes into more detail on boolean algebra, while CLD seems to discuss topics in its last chapter that DLCA&D does not. On the other hand, DLCA&D seems to cover many more topics that CLD does not even touch; in its last chapters, DLCA&D focuses on Programmable Logic Devices and the testing of circuits. Has anyone else had any experience with Digital Logic Circuit Analysis & Design and could perhaps correct anything I might have missed or misstated?
  • It looks like Contemporary Logic Design revolves more around state logic than actual implementation. Computer Organization and Design is more about the physical layout. It's kind of like comparing Spice to Renoir - different tools for different approaches.

    This book seems to get really heavy into state logic, which is really not as important as implementation. You can solve any problem by adding states, but it just slows things down. Being able to implement good forwarding, hazard detection or parallel logic seems to be a more important skill these days.
  • Here's the COUNTER-POINT:

    Boolean logic was NEVER DESIGNED to work in computers!

    Digital circuits require a "control" line to work. In the original mathematics, this was the "mathematician." Again, this is fine for boolean calculations by hand, but not autonomously in digital computers.

    So when computers rolled around, someone thought, hey, let's just use a synchronous clock for boolean logic. Well, that was great for 2KHz, 1MHz and probably as high as 100MHz. But for 1GHz?! FORGET IT!

    In essence, as long as we keep teaching boolean as a form of digital logic, the longer computers will continue to be hard to design at higher speeds and lower densities!

    Again, boolean algebra is NOT the IDEAL MATH to use for digital circuits. Our NCL process is.

    With NCL, you can describe MORE GATES, MORE DESIGNS. And things like INVERTS require NO GATES.

    Additionally, boolean clocked gates require two states, low and high. Even when a gate is "off", there is still some low voltage running through it. With NCL and its dual-rail implementation, OFF MEANS OFF and you use NO POWER when it is off.

    While boolean logic DOES exist AND there should be books written on it and its theory, any "CONTEMPORARY" book on Digital Logic Design should AT LEAST mention asynchronous logic design and the concept of things like NCL and other techniques.

    The days of using elementary boolean gates are over. It is NOT ideal for digital logic design. Luckily for the industry, companies like Theseus Logic are designing software that converts boolean circuits into non-boolean circuits for the computers of tomorrow.

    -- Bryan "TheBS" Smith

  • I used this book in my intro to logic design course.

    All I can say is that this book made me switch majors from Comp Eng to Electrical Engineering :)
  • The program here at Purdue (where I know the poster of this article, Deepak, spent some time) uses both books. CLD is a good book for getting yourself up to speed with the basics, but CO&D was in-depth enough to get a 16-bit pipelined RISC processor up and running in simulation.
    (Which we had to do last year)
  • The book does cover boolean algebra, and I think it does a fairly good job of it. Of course, I took a Discrete Mathematics course before I took the class I used Katz's book for, and thus knew boolean algebra beforehand.
  • Same here...EECS 140 at the Univ. of Kansas. Wasn't an overly helpful book IMHO, either. Then again, the course material wasn't overly complicated either, hence the 100-level class.
  • The edition we use at Northwestern Univ. is from 1994. Maybe there is a newer version?
  • I was a programmer in high school and built some small hardware projects. When I graduated I went on to Tech School and got an Associate degree in electronics and computer repair. Most of what we learned on the digital side (the other side was analog: radio, audio, and stuff like that) is what is covered in this book.

    How has this knowledge helped me? I seem to have a grasp of computer logic and math, as well as storage and other hardware-related issues, that often seems to be lacking in the software-only people I know. So I tend to be able to look at output and see where logic errors are tripping up the code. I can still convert DEC=HEX=BIN faster in my head than on a calculator. So I can often find neat hacks and fix bugs faster, because I can see the patterns of 1s and 0s that are involved ("Gosh, if I mask out all but bits 7 and 9 we can use a simple 'if' statement to detect this pattern", or "gosh, this fails every time bit 3 goes high"). A rough sketch of this kind of bit test follows at the end of this post. When it comes to drives and cross-platform coding, this knowledge is even more valuable.

    All in all knowing how the computer works does make for better programming.

    In Service
    Charles Puffer
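
    The kind of bit test described above might look like this in Python - a rough sketch, not the poster's code, using the bit positions mentioned as a hypothetical example:

        # Hypothetical example: mask out everything except bits 7 and 9, then test the pattern.
        BIT3, BIT7, BIT9 = 1 << 3, 1 << 7, 1 << 9
        MASK = BIT7 | BIT9

        def matches_pattern(value):
            # True only when bits 7 and 9 are both set, regardless of the other bits
            return (value & MASK) == MASK

        def bit3_high(value):
            # "fails every time bit 3 goes high"
            return (value & BIT3) != 0

        print(matches_pattern(0b1010000000))  # True: bits 9 and 7 set
        print(bit3_high(0b1000))              # True: bit 3 set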
  • True, many here are higher-level programmers, but those who get interested in kernel development should really understand the lower-level architectures (not necessarily down to the gate level) so they can program more efficiently. This is a major concern with device drivers, where you might have to worry about latching data (there, now you've seen it) before you do something else with it, and that can be down at the gate level. Think about firmware - you need an explicit knowledge of what's going on under the covers...

    While some may be software "artists" (a well-deserved title for some), some of us also happen to be hardware folk (I'm in chip design at the moment, in a large blue building) who love to play with programming, but really love the lower levels. I think that there is a pretty good mix here overall, and like I said, folks worrying about device drivers or certain areas of kernel development need to have this background.

    Just my $.02
  • Isn't this too much of a commercial advertisement? I agree with the views about asynchronous logic being the next step as we push clock rates higher and higher, but I just think that perhaps a smaller message with a "we have a solution" might be a little more appropriate.


  • I guess I was under the wrong impression that /. only reviewed recent releases? Is this a new edition? I used this book ages ago in my intro CE course...I still refer to it every once in a while...it has a very thorough chapter on sequential/FSM design.

    So what's the next /. review? K&R's the C prog. language? :P

    -dr0ne
  • This is the text used for ECE-240 at Carnegie Mellon. Hard to say whether it's a good book or not -- I'm taking the course now, but have found that I only needed to open the book once.
  • I'm CS too, but haven't really learned all that much in 240 yet... is Thomas really all that great a prof?
  • In my Digital Logic and Computer Organization courses at school, we used LogicWorks from Capilano [capilano.com] to simulate the circuits we were building. As far as I know, they only make Windows and Mac versions, but some of my classmates said it ran quite nicely under Wine.

    Digital Logic was pretty fun, but often gave me a headache :)

    Dana
  • I'm a cheap bastard, so looking at the price at Amazon really freaked me out. It costs $96.95! I don't even dare think what that would be over here in Sweden (with our nice, modern, education-friendly 25% tax on books). Ouch.
  • The reason you haven't needed it is just because Thomas is such a damn good prof. I skipped class almost daily, so the book was a lot more useful for me. I'm CS so I never really covered a lot of the material in the book. I had no idea what a K-map was before that class. The book also came in handy when it was time to start writing Verilog code.
  • Do you mean to imply there are any worthwhile programming books other than K&R? Absurd.
  • This is the book we used for our Introduction to Digital Systems class here at Caltech. It's not even a required EE course. Actually, the core of the digital design classes for EEs here doesn't have a textbook; we get everything from our lecturer (who is a Caltech alum). The man is a genius.

  • There is a very good reason that introductory EE classes and textbooks (such as this one) deal with synchronous design: it's easier than asynchronous design. This textbook was designed for undergraduate students, not for people who will be designing the next generation of microprocessors!

    That being said, even if the microprocessors of tomorrow don't use a synchronous design ideology, that doesn't mean that synchronous design is dead. Take RAM. There's a reason it's called SDRAM, for Synchronous Dynamic RAM. The circuit that refreshes the RAM is a synchronous LFSR (Linear Feedback Shift Register). Synchronous design is not dead at all; it has its place.
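
    For anyone who hasn't met one, an LFSR can be sketched in a few lines of Python - a rough illustration of the general technique, not a claim about any particular DRAM controller:

        # A minimal sketch of a 4-bit Fibonacci LFSR; taps at bits 3 and 0 give a maximal-length sequence.
        def lfsr_sequence(seed=0b1001, steps=15):
            state = seed
            for _ in range(steps):
                feedback = ((state >> 3) ^ state) & 1      # XOR of bit 3 and bit 0
                state = ((state << 1) | feedback) & 0b1111  # shift left, keep 4 bits
                yield state

        # Steps through every non-zero 4-bit value once, then repeats
        print([format(s, '04b') for s in lfsr_sequence()])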
  • or oscilloscope software/hardware?

    You really can't do much electronics without one of those babies.

    I don't think I can afford a real one, but what about oscilloscope cards?

    I'm sure I could turn one of these 386 boards & mono monitors I have into a nice one if I had a card :)
  • While this is a worthwhile subject that most CSci people tackle, I don't feel that the /. crowd cares to get this low level. It strikes me that the core competencies are more script oriented (Perl, etc...), and thus aren't oriented toward the "science" of computers.

    As a part-time embedded systems designer, I routinely have to consider both hardware and software issues. And in my estimation, the mental models one builds based on some knowledge about both sides of the "science" help a lot with solution design.

    Regarding the book itself, I've skimmed through the version published on the web. Although a few parts are somewhat dated, it seems to be quite useful for teaching principles. In the real world things aren't as clear-cut, of course; I spend much more time ferreting out race conditions, worrying about shielding, decoupling and board layout, power supply ripple specs and so forth, rather than designing finite state machines. I've yet to see a textbook that addresses these issues in a useful way... let alone the hardware/software interactions.

    For instance, consider something like a Palm Pilot. It's in sleep mode, and when you press a key, it wakes up and an interrupt routine is executed, which then does whatever the key means. On waking up, you get a huge power spike - the CPU's power consumption increases a thousandfold inside a few microseconds - and if the board designer hasn't allowed for that, when the power spike bounces back down the interrupt routine may crash. The programmer can stare at a source listing or emulator trace for months with no progress at all unless he knows about such hardware details.

    I haven't seen demographics about the "slashdot crowd", but my impression is that perhaps half are quite young - they haven't messed around with valve radios or Heathkits before taking up programming [insert half-senile chuckle here]. Well, the hardware side of the business has progressed a lot - perhaps even more than the software side - since I started out 30 years ago... my advice to everybody is to take some time to study up on what's happening on the other side of the fence; it's worth it.

  • [RANT] "Look buddy!" is how I feel like starting this post, but I will take a gentler approach and say "Think again"! What makes you think you know the background of 'most of Slashdot'?! I thought the whole point of Slashdot is to get a bunch of people with different backgrounds and experience together so that one person can iron out the next guys deficiencies in knowledge. From the various discussions I have seen, there are people from all branches of science, math, engineering, and even [gasp!] humanities reading and participating on Slashdot. The one thing that annoys me is the number of Perl scripters/linux gurus/coders etc who think that because they can play on computers a bit they know everything.

    In case you were wondering, I have been a computer geek since I was 13. My background has expanded to include Japanese Language and Literature and Electrical Engineering (Controls and Robotics)...AND I don't plan to limit myself by stopping there.

    As they say, "knowledge is power."

    [/RANT]

  • or at least I didn't think so.

    This is also the book for the junior level EE and CS digital design courses at The University of Washington.

    I can't say I really enjoyed this book at all. I already knew the boolean algebra from other CS courses I have taken and found the book to be too general.

    I think I would have really enjoyed a book that spends a chapter or two on the basics of digital design and then introduces the rest of the material by showing the actual design of a computer from the ground up. In fact, I found such a book in the library in the last week of class (figures). I don't remember the name though.

    Summary: If you are looking to learn more about How Computers Work, skip this book and look for something else. If you are looking for an intro to (very) general digital design, this may be the book for you.

  • The slides at theseus.com brought me back to the days before I worked for a living. Way back then, I was employed at a big company that was getting paid by the government to generate lots of paper and slides. Government contractors have no clue how the real world works.
  • Excellent book!

    I am amazed that the full text is available.

    I actually took the course at Berkeley which the book was written for (CS150). Good class. I was EE, so the material was useful as an introduction to logic design. A year or two of more advanced academic work might put you in a position to understand some of the issues people working at AMD and Intel deal with. However, this is a book about Logic Design, not Digital Design. Two very different worlds.

  • This is a textbook for one of my courses as well. It is a good textbook, but I agree that the textbook questions are really badly worded and written. (I've spent many a night so far scratching my head at what they are asking.) I think it's the only reason I am passing the course right now; I know that the person teaching the course is no help whatsoever... It would be a good thing if the book came with the LogicWorks software, and if the company that made LogicWorks ported it to Linux.
  • I am studying computer science/electrical engineering at New Mexico Tech [nmt.edu], and this book was required for Digital Electronics here. I doubt I would have paid $96 for it otherwise...but it has been one of the better books I have had to purchase yet. Aside from a few minor errors (double-check the 4-bit Gray code in problem 4.24), the text is fairly airtight. My professor, however, who recently worked at Nokia, finds a lot of the ideas presented (e.g. one's complement) to be rather dated. He was surprised to return and find the Electrical Engineering department still using the book...but academia is seldom as current as industry.

  • Hmm. Is it just me? Or are chip designers really under-represented in net forums and associations? Let's see...there's the SDF, the SVASE, the Center for SW development. Now /. seems to be leaning that way too! A renegade group, one asks?
  • The current way people build finite state machines is using CBL (clocked boolean logic). There is a better way to do this though, called NCL, which I came across while surfing [theseus.com]. I studied the PDFs to the degree of being able to synthesize state machines Without Clocks: NEAT!

    Two things about introducing NCL to people who do state machines for a living (electrical engineers):
    1. No experienced engineer would believe you. I have tried this so many times at the last place I worked (National Semiconductor) that I got tired. It is not that they don't know/trust you. Timing is just a *big* problem right now.
    2. If they listen long enough to understand what you mean, they get excited (good as a starter if you have really bad news to tell them).
  • Contemporary Logic Design is used at Rensselaer Polytechnic Institute for a sophomore level course. The book is a solid introduction to logic circuit design, but, in my opinion, fails to give enough examples. I don't know what the newest edition is, but the book used at RPI has quite a few errors in it too, which tend to confuse a reader.
  • It's interesting there's no mention of boolean algebra. Is everyone taking boolean algebra as a separate course from logic design?
  • Computer Organization and Design is excellent. We do use it at CMU, for the introductory Computer Architecture class.

    There is another book, Computer Architecture: A Quantitative Approach (Amazon link [amazon.com]) by the same authors. It covers the same material, but does go into more depth (and covers superscalar processors, which I don't think CO&D does much with). So if you're really hardcore into this stuff, go for CA: AQA (it is used in the CMU courses beyond the intro architecture course). If you're not going hardcore, CO&D should be very much sufficient for your needs.

    The only other differences besides depth between the two books are: A) the authors' names are switched on the cover, and B) each author wrote the opposite chapters from the ones he wrote in CO&D.

    I figured I'd explain the difference so if any of you see both books, you'd know which one was appropriate.

    And in case you're wondering, Hennessy and Patterson were among the leaders of the "RISC revolution". They are also (IMO) excellent authors, and the book includes plenty of diagrams and things to help you understand what's going on. If you've ever wondered how your CPU, cache memories, virtual memory, etc. work, it's an excellent book to read.
  • Right on...

    If I had thought that his post was actually serious, I might have gone into why nobody in their right mind would try to design a complex circuit using asynchronous design. Sure, async is useful for some things, but nobody would use it to design something like a CPU...you'd never manage to get the thing designed and debugged.

    But I think you're right, it's spam. Not only that, but nonsensical spam. I really don't know who the target audience is, since (presumably) anyone who knows enough about logic design to be interested in their tools would know better.
  • I'm a digital logic designer, so of course I'm interested, but I think many people would find it very interesting stuff. I know some software types who told me they're practically afraid of learning how the hardware works, and I guess that's OK if you stay in application space. But for those who kernel hack, write device drivers, hack gcc, or wish to write cache-efficient code, I think gaining an understanding of the digital logic level of things can be very beneficial. Also, the more parallel we design our computers, the more our software is going to have to deal with digital logic design issues. A big issue in digital logic design is latency and synchronization, which is also an issue in software on increasingly parallel, pervasive systems. People who are good at one of these fields will have developed a good skill set that should transfer well to the other.
  • (For ECES 281 at CWRU)

    This is a good textbook; the one thing that it really lacks is examples of how to do the homework problems at the end of each chapter.

    Sure, you scan back through chapter after chapter and eventually you see how to do it, but it doesn't come easily at all.
  • Computer Organization and Design : The Hardware/Software Interface by David A. Patterson, John L. Hennessy

    This book is a well-written foray into the internals of processor architecture, memory management structures, etc... one of the best books I've seen on the subject(s). It walks through examples based on a MIPS processor, including pipelining, and really does a good job of making it 'easy' to understand. Kind of expensive for the casual reader 8^) but well worth it for anyone who wants to really learn it.
  • I must say, with ALL of the AMAZING capitalization and BUZZWORDS in these posts, they seem AMAZINGLY like the spam mail that I immediately delete without reading.

    Just an observation. :)
  • I think it's funny how everyone seems to be able to predict what the "slashdot crowd" is going to say or do.

    Just what is the "slashdot crowd"? Are they childish FUD-slingers who instantly attack anything that has the word "Microsoft" in it? Are they computer science experts who give advice to Jane's Intelligence Review? Are they uber-geeks who absolutely must have the latest and greatest kernel? Are they programmers who don't care about the underlying structure of the art they create?

    I, for one, am quite interested in electronics... but I am primarily a programmer. I know enough to wire up a breadboard according to a circuit diagram, and to troubleshoot and modify the results. And I'm one of the few programmers at my college who can help out the electronics majors from time to time.

    I consider myself part of the "slashdot crowd". I thought this was "news for nerds", not "news for perl programmers". :)
  • by Erich ( 151 ) on Wednesday October 20, 1999 @05:31AM (#1599455) Homepage Journal
    As noted elsewhere, this is a popular textbook (I see CMU, and we use it here at GaTech). It is fairly nice, and I found it quite up-to-date. _Computer Organization and Design_ by Patterson and Hennessy is another good one that we use.

    Contemporary Logic Design is a good this-is-what-gates-do book, but _COD_ is great for learning datapaths. You basically learn MIPS assembly and then design a processor to run the assembly. You must, of course, know the basics of gates (what a mux does, what a NAND does, etc.) before you start _COD_. Everything from branch prediction to cache architectures. Yummy.

    I guess I'm just more interested in the architecture end than the gate end. I'm certainly not interested in the chemistry/physics end, but alas, that is a required course that I am taking this semester... and need to get back to work on.

  • by Why2K ( 29813 ) on Wednesday October 20, 1999 @05:54AM (#1599456)
    The author's site at Berkeley has the complete text of the book.

    http://http.cs.berkeley.edu/~randy/CLD/CLD.html [berkeley.edu]
