Ask Slashdot: Sources For Firmware and Hardware Books?

timothy posted more than 2 years ago | from the ok-but-what-do-you-call-the-layer-beneath-that? dept.

First time accepted submitter cos(0) writes "Between O'Reilly, Wrox, Addison-Wesley, The Pragmatic Bookshelf, and many others, software developers have a wide variety of literature about languages, patterns, practices, and tools. Many publishers even offer subscriptions to online reading of the whole collection, exposing you to things you didn't even know you don't know — and many of us learn more from these publishers than from a Comp Sci curriculum. But what about publishers and books specializing in tech underneath software — like VHDL, Verilog, design tools, and wire protocols? In particular, best practices, modeling techniques, and other skills that separate a novice from an expert?"

88 comments

Unfortunately... (4, Insightful)

AmiMoJo (196126) | more than 2 years ago | (#39810555)

I am an embedded software engineer who does some hardware/electrical engineering too. Unfortunately there is very little material on this subject specifically. Basically you need to learn how to code in C (not C++ or C#, raw C), learn some electronics and then maybe learn VHDL/Verilog as well. You then put it all together in your own mind.

It is really hard to recruit people with those skills, so they are worth having. You will need some hands-on experience, though. You can simulate wire protocols and some hardware, but none of the simulations are particularly good practice for real life, and employers will want examples of work anyway. A university-level course would be best.

Re:Unfortunately... (0)

Anonymous Coward | more than 2 years ago | (#39810723)

Unfortunately there is very little material on this subject specifically. Basically you need to learn how to code in C (not C++ or C#, raw C).

Odd, that. I've been coding embedded firmware in C++ for years.

Re:Unfortunately... (1)

stanlyb (1839382) | more than 2 years ago | (#39810767)

And is it working? And is it working fast? And is it right?

Re:Unfortunately... (2)

wed128 (722152) | more than 2 years ago | (#39811067)

With C++, there's no way to really tell...

Re:Unfortunately... (1)

plopez (54068) | more than 2 years ago | (#39811349)

You could ask the same of C code. It comes down to the developer. Remember, C++ compilers are supposed to be compatible with C as well.

Re:Unfortunately... (0)

Anonymous Coward | more than 2 years ago | (#39812675)

Obviously you don't know enough C++.

Re:Unfortunately... (1)

smitty_one_each (243267) | more than 2 years ago | (#39817861)

But I #include all of http://www.boost.org/ [boost.org] for good justice. . .

Re:Unfortunately... (1)

plopez (54068) | more than 2 years ago | (#39826877)

Just because you CAN do something does not mean you SHOULD do it.

Re:Unfortunately... (1)

plopez (54068) | more than 2 years ago | (#39811399)

Considering that C++ came out of Bell Labs, that is unsurprising. They were programming satellites and real-time switches for years.

you've got a c++ runtime in your firmware? (1)

Chirs (87576) | more than 2 years ago | (#39814045)

How much space does that take exactly? Or were you talking about using a _subset_ of C++?

Re:you've got a c++ runtime in your firmware? (2)

neonsignal (890658) | more than 2 years ago | (#39817241)

For projects with limited resources, it is quite normal to use a subset of the language. For example, you may eschew use of the heap library functions in C, or the use of floating point calculations.

I too use C++ (a subset), even for embedded systems as small as AVR. Having a restrictive environment doesn't mean you can't use object classes and templates.

The main risk with using C++ in a tight memory space is hidden use of the heap (for example in the STL). I don't find that code bloat is a problem if you know what you are doing.
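As a loose sketch of that kind of heap avoidance (my own illustration in plain C with hypothetical names, not code from the poster), a fixed-size static pool can stand in for malloc() on small targets:

    /* Minimal static pool in place of malloc()/free(); all storage is
     * reserved at link time, so worst-case memory use is known up front. */
    #include <stddef.h>
    #include <stdint.h>

    #define POOL_BLOCKS 8
    #define BLOCK_SIZE  32

    static uint8_t pool[POOL_BLOCKS][BLOCK_SIZE];
    static uint8_t pool_used[POOL_BLOCKS];

    /* Hand out one fixed-size block; NULL when the pool is exhausted. */
    void *pool_alloc(void)
    {
        for (size_t i = 0; i < POOL_BLOCKS; i++) {
            if (!pool_used[i]) {
                pool_used[i] = 1;
                return pool[i];
            }
        }
        return NULL;
    }

    /* Return a block to the pool. */
    void pool_free(void *p)
    {
        for (size_t i = 0; i < POOL_BLOCKS; i++) {
            if ((void *)pool[i] == p) {
                pool_used[i] = 0;
                return;
            }
        }
    }

The appeal of this design on a tight target: no fragmentation, no hidden allocation, and failure is an explicit NULL you can check at the call site.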

Re:Unfortunately... (0)

Anonymous Coward | more than 2 years ago | (#39810757)

Another set of books needed is books on asynchronous logic. There are plenty of books on synchronous logic.

Re:Unfortunately... (0)

Anonymous Coward | more than 2 years ago | (#39810993)

Modern design is done with synchronous logic, as it scales to much higher performance and is what the CPLD/FPGA vendors optimize their chips for.

Asynchronous logic is sort of like bad coding style, especially for newbies. While it can work, it is discouraged because it can cause all kinds of timing issues if you are not careful.

Re:Unfortunately... (1)

Anne Thwacks (531696) | more than 2 years ago | (#39813855)

Even Seymour Cray couldn't get asynchronous hardware to work reliably, so there is no hope for mere mortals.

Re:Unfortunately... (1)

Anonymous Coward | more than 2 years ago | (#39810785)

This sounds a lot like what I do. I'm also an embedded software engineer working in a hardware shop. We have a few books on various buses and stuff. I often end up looking at Linux driver source for insight into particular hardware. Simply using powerful hardware tools like JTAG debuggers and PCIe analyzers gives some understanding of how the hardware actually works.

Sadly though, I agree with you for the most part, that there isn't a lot of good literature out there. A lot of the more specific information comes from expensive training sessions from chip vendors like Xilinx, Altera and TI, and the literature is not public.

Re:Unfortunately... (2)

Quiet_Desperation (858215) | more than 2 years ago | (#39811069)

Xilinx actually has excellent and free data sheets and manuals available (as do other vendors) on their site. When I was transitioning from RF to digital years ago, I taught myself how to design with and program FPGAs entirely from Xilinx documentation.

Re:Unfortunately... (1)

Anonymous Coward | more than 2 years ago | (#39811137)

You'll just spend a summer trying to track it down. A lot of the information is available in documents, it's just spread between several documents.

Re:Unfortunately... (3, Informative)

Anonymous Coward | more than 2 years ago | (#39810895)

I have the same kind of background and would add that the manufacturers' documentation is a must-read. Once you have the foundation in programming from the types of books you mentioned, and an understanding of computer hardware, you need to get into the hardware specifics of your platform. Embedded/firmware work differs from what I would call general programming in that it is much more hardware-specific. Many chip manufacturers have software examples, documentation, and even training classes (sometimes free) that get you the kind of information you are looking for.

Re:Unfortunately... (1)

eminencja (1368047) | more than 2 years ago | (#39811329)

> It is really hard to recruit people with those skills.

Are you talking about the US? I'm curious whether there is any demand for such people in Europe. AFAIK Europe is no longer relevant. One example: not a single mass-market digital camera was produced in Europe.

Re:Unfortunately... (1)

Anne Thwacks (531696) | more than 2 years ago | (#39813887)

It is probably really hard to find people under 35 with 30 years experience, or get them to work for a pittance. (The strategies closest to the heart of most HR departments).

Re:Unfortunately... (0)

Anonymous Coward | more than 2 years ago | (#39815423)

No, it's really hard.

We have some recruiters who do some basic screening, but it's nothing like that. Unfortunately, most people who look great on paper are really just not good in person. When we interview someone to do embedded work, we generally have two criteria:

1) We must believe you know how to work pointers
To do embedded, you really, REALLY need to know pointers. It's fine if you accidentally try to increment (or dereference...) a void* on the whiteboard. But when I ask "what does that do?", you need to realize that it's not allowed by the standard (because it makes no sense in C).

2) We must believe you know how to work with bits
Nothing magic - say, find the position of the least-significant set bit in a uint32_t or something. Maybe round some pointers to power-of-2 alignments.

A disturbingly low number of "C experts" can satisfy both of those.
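For the curious, here is one way those two whiteboard exercises might look (a minimal sketch in C; the helper names are mine, not the poster's):

    #include <stdint.h>

    /* 0-based position of the least-significant set bit; -1 if x is 0. */
    int lsb_position(uint32_t x)
    {
        if (x == 0)
            return -1;
        int pos = 0;
        while ((x & 1u) == 0) {
            x >>= 1;
            pos++;
        }
        return pos;
    }

    /* Round a pointer down to a power-of-2 alignment (align must be 2^n). */
    void *align_down(void *p, uintptr_t align)
    {
        return (void *)((uintptr_t)p & ~(align - 1));
    }

    /* Round a pointer up to a power-of-2 alignment. */
    void *align_up(void *p, uintptr_t align)
    {
        return (void *)(((uintptr_t)p + align - 1) & ~(align - 1));
    }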

Re:Unfortunately... (1)

Cylix (55374) | more than 2 years ago | (#39816413)

For embedded systems, maybe one or two of my peers in school really understood everything. Even at that level we were just poking a few memory addresses and sending data down the bus to flip a few latches. Precision, timing, and the code all in one big pot tended to throw some people off. Made a killing in tutoring for those classes, as my rates were extremely high. (No one was displeased.)

Most of it has rotted away, but I can still look at a spec sheet to get a feel for a chip. The things we do today are always impressive, and if given the chance I could probably release the magic smoke. (Magic smoke, for the uninformed, is the core component of most electronics. Accidentally shorting a line can produce a puncture which releases the gas and renders the component useless.)

Re:Unfortunately... (1)

RightwingNutjob (1302813) | more than 2 years ago | (#39817195)

Nothing magic - say, find the position of the least-significant set bit in a uint32_t or something.

Oh hell. I've been doing this sort of thing for nearly 10 years and just today I was poking around on an FPGA and not only did I get the endianness wrong, I got the order of the bits in the bytes backwards! The fact that this was a 32 bit PPC machine talking over a 16 bit bus with Intel byte sex just confused everything.

Moral of the story: no one can find the LSB of any data type on any machine until they've gone and looked for it!
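In that spirit, a tiny C check is usually faster than reasoning it out from the manuals (a sketch, assuming you can get any output from the target at all):

    #include <stdint.h>
    #include <stdio.h>

    int main(void)
    {
        uint32_t word = 0x01020304u;
        const uint8_t *bytes = (const uint8_t *)&word;

        /* A little-endian machine stores the 0x04 byte first. */
        puts(bytes[0] == 0x04 ? "little-endian" : "big-endian");
        return 0;
    }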

Re:Unfortunately... (0)

Anonymous Coward | more than 2 years ago | (#39817559)

> 1) We must believe you know how to work pointers

> 2) We must believe you know how to work with bits

So, how can I apply for the job? :)

Re:Unfortunately... (1)

KevReedUK (1066760) | more than 2 years ago | (#39821537)

With pointers, I thought the general rule was just to be careful to avoid other people's eyes (especially with laser pointers; IIRC that can be seen as assault).

As far as bits are concerned, bits of what?

Re:Unfortunately... (0)

Anonymous Coward | more than 2 years ago | (#39817867)

Who needs a digital camera in a bread line?

Re:Unfortunately... (1)

PT_1 (2425848) | more than 2 years ago | (#39812087)

Admittedly this doesn't answer the original poster's question of what publishers are best in this area, but I second the 'learn VHDL/Verilog' advice.

If you buy an inexpensive development board from a company like Xilinx, Altera or Digilent, you can immediately begin to experiment in developing your own digital circuits (there are some hugely expensive dev boards, but you really just need a cheap Spartan 3 board or similar to start out). Check out Opencores.org, which is sort of like the Sourceforge of digital hardware, where engineers share open source hardware designs; they are largely implemented in Verilog or VHDL, but there are also dev board schematics available for free.

A lot of the designs on the site are of course fairly difficult for an absolute beginner to follow. However, as it is with software, having working examples to base your learning on can be tremendously helpful. The site has multiple full microprocessors, hardware video decoders, Ethernet adapters etc.

Re:Unfortunately... (0)

Anonymous Coward | more than 2 years ago | (#39817969)

So, the books tend to be written by academics; very few of them actually do this. VHDL and Verilog are both terrible, but better than nothing. Keep in mind that the people working on advancing these languages almost certainly don't actually use them. Even when you find books written by engineers, they get outdated quickly, mainly because everything is a hack, and two years later there's a less ugly hack to get something to work.

HDL interviews often look at very basic concepts -- can you avoid inferring latches, write basic state machines, understand simple pipelines, maybe time-slicing? The bar isn't set very high.

Actual HDL is often terrible and disappointing.

Springer (0)

Anonymous Coward | more than 2 years ago | (#39810655)

Some of the publishers above have hardware-focused titles as well. Also take a look at www.springer.com.

Gaisler VHDL style (4, Interesting)

PeterBrett (780946) | more than 2 years ago | (#39810751)

It's not a book, but this book chapter is more-or-less compulsory reading for someone planning to get into HDL programming:

A structured VHDL design method [gaisler.com]

Re:Gaisler VHDL style (0)

Anonymous Coward | more than 2 years ago | (#39817937)

Ha ha ha, "two process".

It's right out of first-semester HDL and then nowhere else. Really, no one likes that; it's rare to even see it. But the language writers seem to think people do that...

Not Me (4, Interesting)

eldavojohn (898314) | more than 2 years ago | (#39810753)

and many of us learn more from these publishers than from a Comp Sci curriculum

Not me, man. And, don't get me wrong, I love pragprog and I worship O'Reilly and NoStarch. Hell, I review books for them on Slashdot! But no book would have been able to teach me about automata theory or linear algebra and differential equations like my college courses did. I'm sorry but I must argue that there's a lot of application and implementation to be gleaned from these books -- not so much on the theory and foundational concepts. At least for me there's something really difficult about reading a book about really "far out there" concepts and truly understanding them without human intervention. Maybe I'm just stupid but I find the best tech books show me "little jumps" while my college courses were the only way I could make "the big jumps" in knowledge quickly enough.

Plus, going to a liberal arts college meant general requirements that furthered me along in ethics, philosophy, etc more than these books did. I wouldn't go selling a college education short even though it seems to be the popular thing to bash these days.

Re:Not Me (1)

spiffmastercow (1001386) | more than 2 years ago | (#39810959)

Maybe I'm weird, but I got more of a college education outside of college than in it. For instance, my school dropped their compiler design course due to lack of enrollment, so I bought a textbook and taught myself. I learned physics and linear algebra through MIT OCW (though I admit I didn't retain much of either after 5 years). I got a C in discrete math because the prof refused to give the homework until 5-10 minutes after the bell rang, and I didn't have time to stick around that long, but I practice my knowledge of algorithms by doing Project Euler problems.

I'm not calling college a total wash. I got my minor in philosophy, so I learned to bullshit pretty well. All the phil students thought I was weird, though, since the only area of philosophy I found interesting was epistemology. I also learned how to not be a dick, and how to talk to girls. But as far as my major was concerned, I would have been better off with a stack of books and some peace and quiet.

Re:Not Me (2)

bkcallahan (2515468) | more than 2 years ago | (#39811159)

I got my minor in philosophy, so I learned to bullshit pretty well.

And you'll always have a place in Sales, Customer Service, or tech Support :) You only get politics as an option if you majored in BS and failed Ethics & Morality.

Re:Not Me (1)

spiffmastercow (1001386) | more than 2 years ago | (#39811369)

I got my minor in philosophy, so I learned to bullshit pretty well.

And you'll always have a place in Sales, Customer Service, or tech Support :) You only get politics as an option if you majored in BS and failed Ethics & Morality.

Nah, I use my powers for good. One nice benefit of knowing bullshit, though, is also knowing how to spot it.

Re:Not Me (0)

Anonymous Coward | more than 2 years ago | (#39811199)

My introduction to epistemology was on the job; my boss was a philosophy major in his undergrad. The introduction came during a discussion about Isaac Asimov's three laws of robotics. I was saying that part of my interest was that I was so impressed he made half a fictional universe out of those three (later four) laws, it seemed Asimov commanded an infinite fountain of creativity. My chief suggested that the books were in fact a long informal study in epistemological philosophy.

Re:Not Me (1)

tlhIngan (30335) | more than 2 years ago | (#39812439)

Maybe I'm weird, but I got more of a college education outside of college than in it. For instance, my school dropped their compiler design course due to lack of enrollment, so I bought a textbook and taught myself. I learned physics and linear algebra through MIT OCW (though I admit I didn't retain much of either after 5 years). I got a C in discrete math because the prof refused to give the homework until 5-10 minutes after the bell rang, and I didn't have time to stick around that long, but I practice my knowledge of algorithms by doing Project Euler problems.

Not exactly a bad thing - after all the whole point of undergraduate studies is to learn how to learn. The fact that you ended up teaching yourself is a Good Thing(tm) and meant the primary purpose of college is fulfilled. The reason for the subjects is to provide background to expand your horizons. After all, once you have a bachelor's, your next step could be research (so you need background on what to research and current state-of-the-art to advance it), or a job in industry (where you'll learn the non-academic, practical skills).

Most people will say they learned far more after graduating.

Re:Not Me (1)

luis_a_espinal (1810296) | more than 2 years ago | (#39812505)

Maybe I'm weird, but I got more of a college education outside of college than in it. For instance, my school dropped their compiler design course due to lack of enrollment, so I bought a textbook and taught myself. I learned physics and linear algebra through MIT OCW (though I admit I didn't retain much of either after 5 years). I got a C in discrete math because the prof refused to give the homework until 5-10 minutes after the bell rang, and I didn't have time to stick around that long, but I practice my knowledge of algorithms by doing Project Euler problems. I'm not calling college a total wash. I got my minor in philosophy, so I learned to bullshit pretty well. All the phil students thought I was weird, though, since the only area of philosophy I found interesting was epistemology. I also learned how to not be a dick, and how to talk to girls. But as far as my major was concerned, I would have been better off with a stack of books and some peace and quiet.

Epistemology is perhaps the most practical and interesting aspect of philosophy (at least to me). Maybe because of my CS background and the practical uses of logic in design and argumentation (making a case for or against something).

Re:Not Me (0)

Anonymous Coward | more than 2 years ago | (#39811063)

I wouldn't go selling a college education short even though it seems to be the popular thing to bash these days.

Whether it's popular or not, I'll continue to bash some college educations, but a good Comp Sci program is worth a few years of your life, and it's not unique in that regard.

Re:Not Me (1)

Hatta (162192) | more than 2 years ago | (#39811065)

But no book would have been able to teach me about automata theory or linear algebra and differential equations like my college courses did.

Does that include the course texts?

Re:Not Me (1)

cpu6502 (1960974) | more than 2 years ago | (#39811115)

I learned more from reading my textbooks (and studying the problem solutions) than from anything the professor taught as he mumbled into the chalkboard and erased the equations before I had a chance to copy them down. I had 4, maybe 5, good teachers in college, and the other ~35 were poor.

Re:Not Me (1)

wisty (1335733) | more than 2 years ago | (#39811579)

Have a look at "Collective Intelligence". It's a great crash course in ML, comparable to Ng's course (though it skimps a little on the theory).

Re:Not Me (1)

giorgist (1208992) | more than 2 years ago | (#39814731)

and many of us learn more from these publishers than from a Comp Sci curriculum

All you have to do is ask one question ...

Do you know what a Laplace transform is? You end up with two groups; both can be crap or brilliant, but one has a wall they will never cross.

Datasheets (2)

troylanes (883822) | more than 2 years ago | (#39810831)

Quite often the best references are datasheets for microcontrollers. This is where I have gained the bulk of my knowledge.

Re:Datasheets (3, Informative)

Anonymous Coward | more than 2 years ago | (#39811075)

Don't forget the appnotes: those are great for picking up pieces of passed-down lore that you won't otherwise be exposed to unless you hang out with EE/hardware types. The problem for me is gaining awareness that a class of parts exists so that I can read the appnotes for them.

Addison Wesley (2)

fermion (181285) | more than 2 years ago | (#39810855)

While I have not done any work in this field, I find that Addison-Wesley is one of the better publishers for deeply technical books, as opposed to the general references that places like O'Reilly produce. When I need such a book I first look at what AW has.

I would also look at what the Association for Computing Machinery has. I had a membership a while back, and it seems they had some resources on this topic. In fact, looking back, they have a book called VHDL for Programmable Logic; I have no idea whether it is what you are looking for, but there you go.

Free Free Range VHDL (3, Informative)

rasmuswikman (2511948) | more than 2 years ago | (#39810889)

I read the free ebook titled Free Range VHDL out of pure interest. I found it well written, but as I'm no engineer I cannot tell if it's exactly what the poster is asking for. It might be worth a read, though. http://www.freerangefactory.org/ [freerangefactory.org]

VHDL (1)

cpu6502 (1960974) | more than 2 years ago | (#39810913)

I don't know why a software guy wants to learn VHDL, but here's where I started: VHDL For Designers http://www.amazon.com/VHDL-For-Designers-Stefan-Sjoholm/dp/0134734149 [amazon.com]

Re:VHDL (1)

wed128 (722152) | more than 2 years ago | (#39811117)

There's a weird breed (of which I happen to be a member) of people who aren't quite software guys and aren't quite hardware guys, but do a little bit of both. My title is "Embedded Software Engineer", but I look over (and redline) schematics regularly. Some EE courses certainly didn't hurt.

OS software guys deal with datasheets a lot (1)

Chirs (87576) | more than 2 years ago | (#39814091)

Anyone dealing with device drivers, early bringup code, or exception handlers needs to be able to read datasheets. Generally not so much VHDL though.

VHDL is Ada plus parallelism (0)

Anonymous Coward | more than 2 years ago | (#39811375)

If you're coming from a software background, you might want to start by learning Ada. VHDL is very similar to Ada, with the major addition of statements that truly execute simultaneously in parallel.

Why o Why? (1)

tanveer1979 (530624) | more than 2 years ago | (#39818697)

VHDL seems to be thrown about Slashdot a lot. Of course the 1990s saw heated debate about whether VHDL is better than Verilog, but if you look at the ground reality, hardly anybody is doing new VHDL design. Even Europe, the last bastion of VHDL, is moving to Verilog.
So if you want to upgrade your skills and are new to the field, try Verilog and SystemVerilog.
Though SV started as a simulation and testbench language, it's being increasingly used in design.

SystemVerilog for Design (google it) is a popular book.

PS: I have been working in the EDA and VLSI field for the past 11 years, and have seen multiple designs from many large semiconductor companies.

Re:Why o Why? (1)

coinreturn (617535) | more than 2 years ago | (#39819389)

VHDL seems to be thrown about Slashdot a lot. Of course the 1990s saw heated debate about whether VHDL is better than Verilog, but if you look at the ground reality, hardly anybody is doing new VHDL design. Even Europe, the last bastion of VHDL, is moving to Verilog. So if you want to upgrade your skills and are new to the field, try Verilog and SystemVerilog. Though SV started as a simulation and testbench language, it's being increasingly used in design.

SystemVerilog for Design (google it) is a popular book.

PS: I have been working in the EDA and VLSI field for the past 11 years, and have seen multiple designs from many large semiconductor companies.

Um, no. I work for a multi-billion-dollar aerospace company in the USA and we are strictly VHDL. It is simply better. Verilog is a low-level ASIC gate-wiring language. VHDL can do everything from high level to low level, and it reads better. If you're doing FPGAs, VHDL is the best. I consider this http://www.amazon.com/Designers-Guide-VHDL-Systems-Silicon/dp/1558602704 [amazon.com] to be the absolute best book.

The Art of Electronics (0)

Anonymous Coward | more than 2 years ago | (#39810965)

The Art of Electronics by Horowitz and Hill is a must read for anyone looking to get started in the EE/CE field.

CE curriculums and Embedded Systems Conferences (2)

Jack Greenbaum (7020) | more than 2 years ago | (#39810987)

A Computer Engineering curriculum is much better than a traditional CS degree for this type of work, so you might look at what texts are being used in high quality CE programs. The Embedded Systems Conferences [ubmdesign.com] from UBM are also a good source of training for low level firmware implementation.

Re:CE curriculums and Embedded Systems Conferences (1)

luis_a_espinal (1810296) | more than 2 years ago | (#39812581)

A Computer Engineering curriculum is much better than a traditional CS degree for this type of work, so you might look at what texts are being used in high quality CE programs. The Embedded Systems Conferences [ubmdesign.com] from UBM are also a good source of training for low level firmware implementation.

Indeed. Now that I'm in my 40s and need to switch to a more hardware'y line of work, I'm finding myself needing to go back to grad school and work towards a CE degree. My advice for people going to school is to pursue two degrees, CS and CE or CS and EE, or at the very least a double major or a minor in one of the two (or CS and MIS for the business/enterprise inclined). An extra year or year and a half will open so many doors it's not even funny.

Joseph Cavanagh or Michael Ciletti (1)

Anonymous Coward | more than 2 years ago | (#39811005)

Look for books by Joseph Cavanagh or Michael Ciletti

Couple of recommendations to get started... (4, Informative)

gmarsh (839707) | more than 2 years ago | (#39811019)

- Grab a couple of books on C/C++ and Verilog. I highly recommend "Fundamentals of Digital Logic with Verilog Design", great for both learning and reference. For C/C++, I've always been a fan of the Sams "Learn __ in 24 hours" books.

- Get yourself a FPGA development card, so you can get some "hardware play" in and familiarize yourself with some development tools. I have an Altera DE1 educational card that's a few years old, but it's got endless blocks on it (displays, LEDs, buttons, flash, SDRAM, VGA, sound... you name it) which makes it a great little card for embedded system learning. There's a whole set of Verilog and Nios (embedded processor) tutorials available for it, and lots of online hackers who have ported x86 processors (Zet project), hardware emulations of the NES, etc... to it. Xilinx and Actel also make some nice evaluation boards that seem to be targeted fairly often by hobbyists.

Other than that... you can study the heck out of wire protocols, but you'll probably forget everything you learn unless you end up implementing it. You're better off trying to learn as many general things as you can - how to create well organized C/C++ and Verilog code, making your designs meet timing and such - so that if you end up having to implement something, you've got the basics already in place and don't need too much incremental learning. Also if you have some fun ideas for FPGA projects, implement your heart out - that sort of stuff looks great on a resume.

Good luck!

"...tech underneath software..." (4, Funny)

John Hasler (414242) | more than 2 years ago | (#39811047)

There isn't any. It's VMs all the way down now. Hardware is so 20th century.

Re:"...tech underneath software..." (1)

nurb432 (527695) | more than 2 years ago | (#39811345)

While I disagree, even if you were right, someone has to write the hypervisors, which just happen to run on hardware.

Re:"...tech underneath software..." (0)

Anonymous Coward | more than 2 years ago | (#39814527)

He was joking, numbnuts.

Re:"...tech underneath software..." (0)

Anonymous Coward | more than 2 years ago | (#39816395)

I think OP was only half joking. Even assembly language has to be assembled into ISA machine language which is then executed by microcode. Hardware virtual machines can be nested, and application code can run in bytecode virtual machines, interpreted script or at yet a higher level still (nonprocedural languages or languages supporting parallel/distributed processing).

Re:"...tech underneath software..." (0)

Anonymous Coward | more than 2 years ago | (#39812159)

It's VMs all the way down

POTD

What I used (2)

pavon (30274) | more than 2 years ago | (#39811051)

I recently taught myself VHDL and used Pong P. Chu's [amazon.com] book. I liked it quite a bit. It did an especially good job of reinforcing the mindset of approaching VHDL programming as digital circuit design, not software design.

Re:What I used (0)

Anonymous Coward | more than 2 years ago | (#39812111)

I recently taught myself VHDL and used Pong P. Chu's [amazon.com] book. I liked it quite a bit. It did an especially good job of reinforcing the mindset of approaching VHDL programming as digital circuit design, not software design.

Looking on Amazon for books that are unaffordable, like the above ($50 used), is the best way to go. Any VHDL book that can be had for $2 is almost certainly worthless. I have found that digging through other people's VHDL is pretty educational. OpenCores and the Xilinx EDK have a lot of good code available to pick through.

Vendor tools (2, Informative)

Anonymous Coward | more than 2 years ago | (#39811201)

Just like a software project, you'll need to actually dive in to learn anything beyond quoting books. Experience is what separates the junior engineers from the fresh-out-of-school.

You can get "free" CPLD/FPGA vendor tools from the big three chip suppliers - Xilinx, Altera and Lattice. There are restrictions on the design tools - either size or chip selection - but other than that they are more than generous for some large, non-trivial, real-life designs. The environments also include simulation, again with some limitations on speed and/or size that should not be an issue. The same vendors and third parties also offer evaluation boards, should you decide to take the next step of downloading the design onto the target chip.

As for wire protocols, it is up to you to track down the official standards themselves. Most of them (except USB/Ethernet and some open standards) require payment. Everything else (i.e. books, pre-standards floating around the web) is fine for understanding, but should not be the basis for claiming standards compliance without reading the real standards and/or using a third-party testing lab.

The "skills that separate a novice from an expert" are hands-on experience... That is not something you can get from a book or YouTube.

The Reuse Methodology Manual (2)

randomlogin (448414) | more than 2 years ago | (#39811233)

Both VHDL and Verilog provide you with 1001 ways of designing crappy circuits which don't work. This book tells you precisely which language constructs to use to get good synthesis results which will work equally well with FPGA and ASIC. For most experienced designers this is considered 'obvious' stuff, but if you're new to HDL design, this is a must read on the way to becoming an expert.

Re:The Reuse Methodology Manual (0)

Anonymous Coward | more than 2 years ago | (#39817947)

And it assumes you have a 10-year-old synthesis tool. It's OK, but take it with a grain of salt -- things like "global optimizations" do exist in the modern world.

Re:The Reuse Methodology Manual (1)

randomlogin (448414) | more than 2 years ago | (#39818059)

Only 10 years old? My first edition copy of the book is from 1998! The toolchains may have improved in the intervening years, but the advice on HDL coding style is still relevant. There may be more recent books around that are a better introduction, but I've never had cause to read them and so can't comment.

Library.nu (1)

nurb432 (527695) | more than 2 years ago | (#39811287)

Oh wait, those f-ing bastards from the "book cartel" got them shut down.

Never mind, good luck finding anything of value.

The Elements of Computing Systems (0)

Anonymous Coward | more than 2 years ago | (#39811381)

Check out http://www1.idc.ac.il/tecs/ [idc.ac.il]

alt.binaries.e-book.technical (0)

Anonymous Coward | more than 2 years ago | (#39811405)

alt.binaries.e-book.technical

These I used myself (1)

bempelise (1921514) | more than 2 years ago | (#39812153)

Being a hardware engineer myself, some books I have been reading from time to time are:

VHDL:
Pong P.Chu - RTL Hardware design using VHDL (2007)
Volnei A. Pedroni - Circuit Design with VHDL (2004)
One of the following is really good for VHDL, but I don't remember which of the two; I keep confusing them...
VHDL Handbook by HARDI electronics (1997)
Peter J. Ashenden - VHDL Cookbook (1990)

U. Meyer-Baese - Digital Signal Processing With FPGA (2007)
Laung-Terng Wang, Cheng-Wen Wu, Xiaoqing Wen - VLSI Test principles and architectures (2006)
Jean Pierre Deschamps, Gery Jean Antoine Bioul, Gustavo D.Sutter - Synthesis of arithmetic Circuits (2006)
Michael Keating, Pierre Bricaud - Reuse Methodology manual for System-On-Chip designs (2002) - Must have for HW design
There are also books like Patterson's Computer Architecture which are really fundamental.

For the state of the art you can search for papers in IEEE Xplore and the ACM portal, as well as Springer books; there are some gems there!

Hope that helps.

A few books to try (2)

amigabill (146897) | more than 2 years ago | (#39812411)

I'm really enjoying the textbook for the VHDL/FPGA RTL design class I'm taking now: RTL Hardware Design Using VHDL: Coding for Efficiency, Portability, and Scalability by Pong Chu. It doesn't bog down talking about all the possibilities the language allows for legal syntax; the author really focuses on common practice for coding into a chip. There's very little if any testbench/simulation material in this book, so look elsewhere for that; this one is all about the circuit design. Rather than only explaining what an FSM is, or just that you need a combination of registers and combinational logic, it gives some good suggestions on how to organize your code, such as using two or three separate processes rather than a single one. It talks about coding styles to minimize logic stages for some types of circuits. And it's the first one to explain why getting latches in synthesis is "bad" in a way I could understand, where other books just seem to say "don't allow latches" and I didn't understand why.

I find this very refreshing now, as my first exposure to VHDL back in the mid-1990s had a terrible textbook that left me horribly confused about how to do anything useful with the language. That was very frustrating, as I was very interested in FPGAs and there was no Verilog course at my university. After working with Verilog for the past few years at work, Chu's VHDL textbook has made it a lot easier for me to see how to write Verilog as well, rather than just read it to understand other people's code. Chu has some other books that come in VHDL and Verilog pairs; they have project examples to play with on an inexpensive FPGA board. Check them out too.

For firmware and coding to hardware, look for books on embedded-style programming and on driver development. Look up the programmer's guides for some ARM chips; they will tell you the register definitions, fields and such to program to. I don't know how good these will be, but on my to-do list are "Essential Linux Device Drivers", "Linux Device Drivers, 3rd Edition", and "Hardware/Firmware Interface Design: Best Practices for Improving Embedded Systems Development". I looked at "Atmel AVR Microcontroller Primer: Programming and Interfacing (Synthesis Lectures on Digital Circuits and Systems)", which may be a good example of some of this as well, and check out some Arduino project books; they seem likely to cover some of this stuff too. A reviewer of one of the many AVR books says to look up the http://www.avrfreaks.net/ [avrfreaks.net] website for some free info of this kind. It and similar sites for PIC and ARM should have some relevant information for you. If these all seem too much of the "this is the language" or "this is what the hardware looks like" type, and not enough of the "common practice is to..." type, maybe get a Kickstarter project going to make what you need, and start interviewing people who do this every day.

VHDL and Verilog create gates, not code (0)

Anonymous Coward | more than 2 years ago | (#39812413)

VHDL and Verilog create gates and flip-flops, not code. Learning the languages isn't all that difficult. Learning the particulars of the FPGAs your "software" is going to go into is not trivial and requires hardware design knowledge.

Your FPGA isn't running on a virtual machine somewhere in the cloud. You need to know how to use a multimeter to make an FPGA go.

When's the last time you had a C++ file that defined your pin locations?

Re:VHDL and Verilog create gates, not code (1)

coinreturn (617535) | more than 2 years ago | (#39819421)

VHDL and Verilog create gates and flip-flops, not code. Learning the languages isn't all that difficult. Learning the particulars of the FPGAs your "software" is going to go into is not trivial and requires hardware design knowledge.

Your FPGA isn't running on a virtual machine somewhere in the cloud. You need to know how to use a multimeter to make an FPGA go.

When's the last time you had a C++ file that defined your pin locations?

I think a logic analyzer is a bit more important than a multimeter when it comes to digital design.

fpga4fun (3, Interesting)

TeknoHog (164938) | more than 2 years ago | (#39812575)

This is where I started about a year ago:

http://www.fpga4fun.com/ [fpga4fun.com]

Got a Xilinx dev kit, and it didn't take me too many weeks to get my project up to a stage where other people started using it.

Besides programming, a background in general electronics does help. Even if you're coding in a somewhat C-like language, it's nothing like a sequential program; it's a description of hardware with real, physical parallelism. To me, it often helps to look at a circuit diagram to understand some debugging issues.

Data sheets from chip manufacturers are essential for some of the trickier points. If you need to choose between the two largest players, I recommend Altera over Xilinx, as they are somewhat more open, but mostly there are no huge differences.

This is what I use / suggest. (1)

SrJsignal (753163) | more than 2 years ago | (#39812907)

This kind of hardware stuff is what I do most of the time. First, yes, do get a dev board and the free tools (Xilinx or Altera) to play with (just like you would play with software by compiling and trying).
Second, pick a language; I use mostly VHDL.
I like:
"VHDL for Logic Synthesis" Rushton
"VHDL Coding Styles and Methodologies" Cohen

Also google "VHDL Math tricks of the trade" (PDF); you'll need it to stay sane if you actually do algorithms.

Jack Ganssle (1)

Snotnose (212196) | more than 2 years ago | (#39814323)

Nobody has mentioned Jack Ganssle or Embedded Systems Magazine? Visit http://www.ganssle.com/ [ganssle.com], subscribe to The Embedded Muse, and if possible go to one of his seminars.

I'm lucky; I started as an electronics tech in the late '70s. I learned to write software to wiggle wires for troubleshooting, and engineering found out about it and drafted me. I went to school for my degree (math) and haven't had much trouble finding work since.

Here is a good book. (1)

EETech1 (1179269) | more than 2 years ago | (#39816735)

"Embedded Systems Firmware Demystified" by Ed Sutter (the one with the computerized toaster on the cover) is a pretty good book to start out reading, and of course doing the examples:) The book is from 2002, but there is still a lot of good stuff in there. IIRC it was copyright Lucent Technologies, and comes with the GNU compiler and many examples from Linux.

An oscilloscope or logic analyzer, and a few months working through the examples in the book with some real hardware will really help!

Cheers!

Choose another field (1)

Anonymous Coward | more than 2 years ago | (#39817185)

Check out 'Advanced FPGA Design' ( ISBN-10: 0470054379 ).
Pretty good fundamental overview about tradeoffs when designing for area/frequency/power etc.
Has some stuff about clock domain crossing too.

-Learn basic Verilog ( asic-world.com is good )
-Start getting a feel for how your logic will be synthesized into primitives like LUT, Block Ram, etc.
-Get into parametric defines, generate statements, more advanced Verilog language features (makes your code more valuable, adaptable, reusable, etc)
-Pick up a copy of ModelSim and learn SystemVerilog (SV), DPI (makes your simulation/verification more productive)
-Understand common test environments used in OVM/UVM (i.e. driver, monitor, scoreboard, etc.) (makes your simulation/verification more productive)

Debug efficiently: build assertions or triggers for logic analyzers to catch errors, rather than poring over your RTL.
Be very methodical. Be very diligent. Be very patient.

Take a class or work with some friends on some projects.
Even if the first module is flashing an LED, you can start building some momentum and move to larger designs.

Are you sure you want HDL? (2)

AdamHaun (43173) | more than 2 years ago | (#39817479)

First, I'd suggest deciding what you want to focus on -- firmware (embedded programming) or HDL. If you're coming from a CS background, you might want to start with digital systems and computer architecture before plunging into HDL. Designing a CPU at the gate level will teach you more about how the hardware works than writing a page or two of behavioral HDL. These basics are also good for embedded programming. Without knowing more about your background, it's hard to know what to suggest. If you're coming from PC application programming (or, god forbid, web scripting) with no electronics or low-level background, get ready for a shock -- you're not in Kansas anymore. Personally, I'd suggest starting with some basic 8-bit AVR or PIC projects, since there's a lot of material on the net to help you. I've mostly learned on the job, so rather than giving you books, I'll suggest some topics to study:

Software
1. C programming in general and pointers in particular. Use this as a bridge to assembly.
2. Enough assembly to understand what sorts of operations there are and how they're used. Don't bother writing huge programs in assembly; just make sure you can read your debugger's output. This is a good chance to figure out whether your CPU has a hardware multiplier (probably) and divider (probably not). That tells you which operations are fast and which are dog-slow.
3. Where all the pieces of your program go in memory (code and constants in flash, data in RAM). Learn about standard assembly sections like .text so you can parse the error messages when your program gets too big for the chip it's in. Read this article [muppetlabs.com] for some hints in a PC app context.
4. Bit twiddling. AND, OR, XOR, inversion, shifting. Basically, any C operator listed under the "bitwise" section.
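For item 4, a compact illustration of those operators on a register-style value (my own hypothetical masks, not tied to any real chip):

    #include <stdint.h>

    uint8_t twiddle(uint8_t reg)
    {
        reg |=  (uint8_t)(1u << 3);           /* OR: set bit 3 */
        reg &=  (uint8_t)~(1u << 5);          /* AND + inversion: clear bit 5 */
        reg ^=  (uint8_t)(1u << 0);           /* XOR: toggle bit 0 */
        return (uint8_t)((reg >> 4) & 0x0Fu); /* shift + mask: extract the high nibble */
    }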

Hardware (theory)
1. Registers. CPU registers. Memory-mapped IO registers. Read-only vs. writable registers. Reset states of registers. You're going to be dealing with a lot of registers (see the sketch after this list).
2. General-purpose IO pins and all their features. Look at a schematic in a microcontroller datasheet and understand input/high-impedance vs. output vs. input with an internal pull-up/down. Maybe driver current strength if you want to make *big* blinking lights.
3. Clocking. Where the clock comes from (crystal? internal oscillator?), how precise it is, how it's divided down, and how to turn off the division so the chip will run at full speed.
4. How to read a datasheet. Figure out what voltage(s) you need to power the chip and what (if anything) should be connected to each pin on power-up. Datasheets are very long. Learn to skim.
5. How to limit the current on LEDs with resistors so they don't blow up.
6. How to use a linear voltage regulator to get a clean 5V out of whatever DC power source you can Frankenstein together.
7. At least a cursory knowledge of voltage, current, and Ohm's law. Know how to determine power dissipation in a resistor (compute it) or an LED (read it out of the datasheet).
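To make item 1 concrete, memory-mapped IO register access in C usually looks something like this (a hedged sketch: the addresses and bit positions are invented for illustration; the real ones come out of the datasheet, per item 4):

    #include <stdint.h>

    /* Hypothetical GPIO registers; volatile so the compiler never caches
     * or elides accesses to the hardware. */
    #define GPIO_DIR (*(volatile uint32_t *)0x40004400u) /* direction register */
    #define GPIO_OUT (*(volatile uint32_t *)0x40004404u) /* output level register */

    void led_on(void)
    {
        GPIO_DIR |= (1u << 2); /* configure pin 2 as an output */
        GPIO_OUT |= (1u << 2); /* drive it high */
    }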

Hardware (bench-top)
1. How to use a multimeter. Spring for a nice auto-ranging one on eBay.
2. How to wire up a breadboard. Don't bother soldering yet; this is much easier. Hint: keep your wires short and neatly-arranged. Get a wire stripper and learn how to use it.
3. (Optional) If you have some money to throw around, pick up a bench-top power supply off of eBay. A triple-output supply is the most you'll ever need, but single-output is fine for simple MCU projects. This is more convenient and reliable than cutting up a wall wart.
4. (Even more optional) If you have a lot of money and are pretty serious, get a digital storage oscilloscope (NOT an analog-only scope, unless it's really cheap). This will do wonders for your debugging. Buy used unless you're rich or hard-core. Bench-top electronic tools are not cheap.
5. Find a local electronics store. Fry's is a decent choice (for tools, too!), and many cities have smaller but cheaper and better stocked stores.
6. Familiarize yourself with DigiKey and Newark, because the moment you want something weird you'll have to order online.

Hopefully my fellow commenters can point out anything I've missed. Good luck!

the irony (0)

Anonymous Coward | more than 2 years ago | (#39817991)

Ironically, the experts are the ones who take a high-level approach. The novices are either the ones taking a high-level approach that won't work, or the ones taking an incredibly low-level approach.

Testbenches (1)

PMW (203329) | more than 2 years ago | (#39822673)

There are lots of books on HDLs, but to get good results on non-trivial designs you also need to learn how to test them.

http://www.amazon.com/Writing-Testbenches-Functional-Verification-Edition/dp/1402074018 is a great guide to this.

Good books (1)

aprdm (2451390) | more than 2 years ago | (#39824239)

Pong Chu's books are REALLY good (hardware design, VHDL/Verilog and SoC).