
Happy Birthday! X86 Turns 30 Years Old

CmdrTaco posted more than 6 years ago | from the break-out-the-chips-and-dips dept.


javipas writes "On June 8th, 1978 Intel introduced its first 16-bit microprocessor, the 8086. Intel then used the slogan "the dawn of a new era", and they probably didn't know how certain they were. Thirty years later we've seen the evolution of PC architectures based on the x86 instruction set, which has been the core of Intel, AMD and VIA processors. Legendary chips such as the Intel 80386, 80486, Pentium and AMD Athlon owe a great debt to that original processor, and as was recently pointed out on Slashdot, x86 evolution still leads the revolution. Happy birthday and long live x86."


362 comments


Intel's anniversary too (4, Funny)

suso (153703) | more than 6 years ago | (#23667557)

Intel's own 40th anniversary is coming up on July 18th. I guess the microcomputer industry is officially over the hill.

Nice self-reference double entendre Taco!

Might want to check your FPU (5, Funny)

Anonymous Coward | more than 6 years ago | (#23667607)

The story is a few days early. I think you may have a rounding bug somewhere.

Re:Might want to check your FPU (5, Funny)

somersault (912633) | more than 6 years ago | (#23667717)

I did wonder how they weren't sure how certain they were. Perhaps they weren't certain how certain they were but were certain how right they were. These guys should be building quantum CPUs by now with such confuddling principles of certainty.

Re:Might want to check your FPU (5, Funny)

Anonymous Coward | more than 6 years ago | (#23667825)

Happy Birthday! X86 Turns 29.991803 Years Old.

Re:Might want to check your FPU (0, Redundant)

jpcloninger (523751) | more than 6 years ago | (#23668395)

It appears that you have a Pentium rounding error, sir!

Re:Might want to check your FPU (2, Funny)

everphilski (877346) | more than 6 years ago | (#23668203)

Slashdot's got to be early on a few news articles to make up for being behind on so many.

Oh wait, did I say a few? ;)

Re:Might want to check your FPU (5, Funny)

MiniMike (234881) | more than 6 years ago | (#23668487)

I thought it was early so they could dup it in time for the real anniversary.

How Long? (4, Interesting)

dintech (998802) | more than 6 years ago | (#23667635)

I'm pretty sure x86 processors will still be in use for another 15 years at least. But, how much further will this architecture evolve? When will we see the demise of x86?

Re:How Long? (5, Interesting)

flnca (1022891) | more than 6 years ago | (#23667681)

The demise of the x86 general architecture will not begin until Windows goes out of fashion. It's the only major platform strongly tied to that CPU architecture. x86 CPUs have been emulating the x86 instruction set in hardware for many years now. I guess, if they could, Intel / AMD / VIA and others would happily abandon the concept, because it leads to all sorts of complexities.

Intel made some horrible design decisions. (4, Insightful)

Futurepower(R) (558542) | more than 6 years ago | (#23667883)

Before the 8086 was released, I knew a V.P. of Technology who was extremely excited about it. Every time I saw him, he would tell me the date of release, and how much he was waiting for that date.

On that day, he was very sad. Intel made some horrible design decisions, and we've had to live with them ever since. Starting with the fact that assembly language programming for the x86 architecture is really annoying.

And so did IBM with the PC (4, Interesting)

IvyKing (732111) | more than 6 years ago | (#23668015)

The docs for the 8086 stated that the interrupts below 20H were reserved, so guess what IBM used for the BIOS. The 8086 documentation was emphatic about not using the non-maskable interrupt for the 8087, and guess what IBM used. OTOH, Tim Paterson did pay attention to the docs and started the interrupt usage at 20H, but he wasn't working for either IBM or Microsoft at the time.


TFA doesn't get into the real reason the x86 took off: the BIOS for the IBM PC was cloned at least two or three times, which allowed for much cheaper hardware (the original Compaq and IBM 486 machines were going for close to $10K, whereas 486 whiteboxes were available a few months later for $2K).

8086 computers? (1)

Futurepower(R) (558542) | more than 6 years ago | (#23668077)

You said, "... Compaq and IBM 486 machines ...".

I think you mean 8086 computers, or even 8008 computers.

Re:How Long? (4, Interesting)

peragrin (659227) | more than 6 years ago | (#23667889)

Actually Intel keeps trying (Itanium?) and AMD uses a compatibility mode.

The problem is, as usual, MSFT, whose Windows only runs on x86. Yes, I know a decade ago NT 4.0 did run on PowerPC, and even a couple of Alpha chips.

Apple, with a fraction of the software guys, can keep their OS on two very different styles of chips, PowerPC and Intel x86, along with 32-bit and 64-bit versions of both. Sun keeps how many versions of Solaris?

Nope, Vista only runs on x86. So x86 will remain around as long as Windows does.

Re:How Long? (3, Funny)

B3ryllium (571199) | more than 6 years ago | (#23667941)

Hrm, I wonder what this HAL thing is ... must be a virus! I'd better remove it.

Re:How Long? (5, Funny)

meadowsoft (831583) | more than 6 years ago | (#23668273)

I wouldn't do that, Dave...

Re:How Long? (1)

bsDaemon (87307) | more than 6 years ago | (#23668065)

I think if Intel were really interested, they could force MS to follow suit. The problem is, so long as AMD is willing to provide the compatibility mode, Microsoft doesn't have to change -- and that means that Intel would lose out, at least in the home market.

I have little doubt that Intel could force a change on servers and corporate desktops, and Linux, BSD and Solaris, as well as Apple, would be able to adjust within a very short period of time to run on it.

Re:How Long? (1)

haystor (102186) | more than 6 years ago | (#23668625)

Apple, with a fraction of the software guys, customizes BSD to multiple limited-hardware platforms.

Re:How Long? (5, Interesting)

Hal_Porter (817932) | more than 6 years ago | (#23667979)

The demise of the x86 general architecture will not begin until Windows goes out of fashion. It's the only major platform strongly tied to that CPU architecture. x86 CPUs have been emulating the x86 instruction set in hardware for many years now. I guess, if they could, Intel / AMD / VIA and others would happily abandon the concept, because it leads to all sorts of complexities.
Yeah, they could move to an architecture with a simple, compact instruction set encoding which makes efficient use of the instruction cache and can be translated to something easier to implement on the fly with extra pipeline stages.

But wait, that's exactly what x86 is. In terms of code density it does pretty well compared to RISC. Modern x86s don't implement it internally; they translate it to RISCy uops on the fly and execute those. And over the years compilers have learned to prefer the x86 instructions that are fast in this sort of implementation. And, thanks to AMD, it now supports 64 bit natively in its x64 variant. This is important. 64 bit may be overkill today, but most architectures die because of a lack of address space (see Computer Architecture by Hennessy and Patterson [amazon.com]). But 64 bit address spaces will keep x86/x64 going for at least a while.

http://cache-www.intel.com/cd/00/00/01/79/17969_codeclean_r02.pdf [intel.com]
If you know that the variable does not need to be pointer polymorphic (scale with the architecture), use the following guideline to see if it can be typed as 32-bit instead of 64-bit. (This guideline is based on a data expansion model of 1.5 bits per year over 10 years.)

IIRC 1.5 bits per year address space bloat is from Hennessy and Patterson.

At this point we have about 29 unused bits of address space, assuming current apps need 32GB tops (32GB is 2^35 bytes). That gives 64-bit x64 almost another 20 years of lifetime!
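The arithmetic above can be sketched quickly in Python; note the 32GB working set and the 1.5-bits-per-year growth rate are both assumptions quoted in the comment, not measured numbers:

```python
import math

# Rule of thumb quoted above: address usage grows ~1.5 bits per year
# (attributed to Hennessy and Patterson).
BITS_PER_YEAR = 1.5
bits_total = 64
bits_used = math.log2(32 * 2**30)        # 32 GB -> 35 bits
bits_spare = bits_total - bits_used      # 29 bits
years_left = bits_spare / BITS_PER_YEAR
print(f"{bits_spare:.0f} spare bits -> ~{years_left:.0f} years of headroom")
```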

Re:How Long? (4, Interesting)

TheRaven64 (641858) | more than 6 years ago | (#23668397)

Many of the shortest opcodes on modern Intel CPUs are for instructions that are never used. Compare this with ARM, where the 16-bit thumb code is used in a lot of small programs and libraries and there are well-defined calling conventions for interfacing 32-bit and 16-bit code in the same program.

Modern (Core 2 and later) Intel chips do not just split the ops into simpler ones, they also combine the simpler ones into more complex ones. This was what killed a lot of the original RISC archs - that CISC multi-cycle ops became CISC single-cycle ops while compilers for RISC instructions were still generating multiple instructions. On ARM, this isn't needed because the instruction set isn't quite so brain-dead. ARM also has much better handling of conditionals (try benchmarking the cost of a branch on x86 - you'll be surprised at how expensive it is): conditionals are handled by select-style operations (every instruction is conditional), which reduces branch penalties and scales much better to superscalar architectures without the cost of huge register files.

Re:How Long? (0)

Anonymous Coward | more than 6 years ago | (#23668519)

Wow, I can't imagine what we'll be doing with 18 billion billion bytes of *RAM*. That's what 64 bits of address space gives you.

Note: I've done supercomputing programming professionally, and I *still* think that's a whole lotta address space... not that I couldn't fill it up with data :-) but I just don't see even micros~1 making their apps so bloated as to require that much space.

Re:How Long? (4, Insightful)

compro01 (777531) | more than 6 years ago | (#23668731)

Wow, I can't imagine what we'll be doing with 18 billion billion bytes of *RAM*. That's what 64 bits of address space gives you.
[bashing joke]
maybe that will finally be enough to run vista at a decent speed.
[/bashing joke]

Re:How Long? (0)

xgr3gx (1068984) | more than 6 years ago | (#23668085)

Once again, Microsoft and Windows stifle technological advancement.
Like the analogy: if Microsoft made cars,
they would all have coal-fired steam engines that would require a 4000-gallon water tank and 12 tons of coal on hand.
(8000 gallons and 24 tons if you want the "Aero" experience)

Re:How Long? (1)

Hatta (162192) | more than 6 years ago | (#23668391)

That translation of x86 instructions must have some performance cost to it. What Intel should do is expose both sets of instructions: act like an x86 if the OS expects it, or act RISC-like if the OS expects that. Then everyone can have their Windows installed, and it creates an opening for other operating systems. An OS that uses the native instruction set should be a little faster, giving people a reason to use it over Windows. That will encourage MS to port Windows to the new instruction set, and voila, we are free of x86.

Please (1, Informative)

oldhack (1037484) | more than 6 years ago | (#23667639)

Kudos to Intel for their biz/marketing/eng savvy, but the x86 instruction set? Please.

Backward integers forever! (1, Insightful)

Anonymous Coward | more than 6 years ago | (#23667643)

Buggy processors, flawed floating point, and backward integers. Better solutions have come and gone, always squashed by the INTEL 800 pound gorilla.

Yep, lots to be happy about. Long live mediocrity.

Its hard to believe ... (1)

veektor (545483) | more than 6 years ago | (#23667645)

... that we've been using the same architecture for almost 60% of my life. More than 60% if you count the 8080. -K1LT

Re:Its hard to believe ... (1)

mrbluze (1034940) | more than 6 years ago | (#23667707)

... that we've been using the same architecture for almost 60% of my life. More than 60% if you count the 8080. -K1LT
I agree. What impresses me is how what I thought at the time were much better processor architectures died a death, whereas clunky ol' x86 kept on going, warts and all.

Re:Its hard to believe ... (2, Funny)

Rhapsody Scarlet (1139063) | more than 6 years ago | (#23667849)

We've been using it for 100% of mine. The 80386 was still shiny and new when I was born.

See, this is one of the reasons I come to Slashdot. Other discussion boards make me feel so old because I remember using my old 486DX2/66.

What other discussion boards? (2, Funny)

Futurepower(R) (558542) | more than 6 years ago | (#23667923)

What other discussion boards?

Re:What other discussion boards? (1)

conureman (748753) | more than 6 years ago | (#23668333)

Yeah, I keep hearing about those, but I haven't got time to google EVERYTHING.

Re:Its hard to believe ... (0)

Simon Brooke (45012) | more than 6 years ago | (#23668155)

We've been using it for 100% of mine. The 80386 was still shiny and new when I was born.

See, this is one of the reasons I come to Slashdot. Other discussion boards make me feel so old because I remember using my old 486DX2/66.

Hah! The first machines I used professionally didn't even have a microprocessor... Have used 6502, 8080, Z80, H800, M68000, ARM, PowerPC, MIPS. Oh, and the odd x86.

Re:Its hard to believe ... (1)

jgarra23 (1109651) | more than 6 years ago | (#23668773)

... that we've been using the same architecture for almost 60% of my life. More than 60% if you count the 8080. -K1LT

I'm 29, so all of mine.

Happy Birthday (1)

Daver297 (1208086) | more than 6 years ago | (#23667647)

Happy Birthday Big Guy

I wish it would just die. (2, Insightful)

Hankapobe (1290722) | more than 6 years ago | (#23667653)

Move on to something better. Backwards compatibility can go too far sometimes.

Re:I wish it would just die. (0)

Anonymous Coward | more than 6 years ago | (#23667779)

I wish for the same. x86 wasn't even an efficient ISA at its introduction. Today it's just sad in comparison to modern ISAs. Hell, it's even sad (and, actually, in several ways very inefficient) compared to other 25-year-old ISAs like the 680x0.

But, sure, happy birthday, you clumsy, inoptimal ISA you.

Re:I wish it would just die. (2, Insightful)

quanticle (843097) | more than 6 years ago | (#23667843)

But you see, the thing with standards is that the longer they live, the harder they are to kill. At 30 years old, the x86 ISA is damn near invincible now.

Re:I wish it would just die. (1)

Nullav (1053766) | more than 6 years ago | (#23668035)

So what better birthday present than a new friend (that will later kill and supplant it)?

Re:I wish it would just die. (0)

Anonymous Coward | more than 6 years ago | (#23668593)

When the processor first came out I seem to remember writing an article criticising it for an arcane architecture and instruction set that would make it difficult to extend in a straightforward manner as technology developed.

Having set out to make backwards compatibility such a nightmare, it's a tribute to *something* that x86 is still around 30 years later and that it's delivering high performance in a consumer-oriented component. Probably the fact that to "move on" is harder than it seems - vested interests, established market relationships, additional engineering costs, existing knowledge base...

x86 did not succeed for technical reasons (1, Interesting)

Anonymous Coward | more than 6 years ago | (#23667685)

What a mishmash of zany grafted-on non-orthogonal instructions and registers the x86 is. For years its technology lagged Motorola's 68x00. x86 succeeded due to IBM and Microsoft selecting it. Anything will fly given enough propulsion. We can only imagine how much further ahead CPUs would be if not for the x86 monopoly.

Doing it right -- mostly (5, Interesting)

HW_Hack (1031622) | more than 6 years ago | (#23667689)

I spent over 16 yrs with Intel as a HW engineer. I saw many good decisions and a lot of bad ones too. Same goes for opportunities taken and missed. But their focus on cpu development cannot be faulted - they stumbled a few times but always found their focus again.

The other big success is their constant work on making the entire system architecture better, and basically giving that work to the industry for free. PCI - USB - AGP - all directly driven by Intel.

It's a bizarro place to work, but my time there was not wasted.

Re:Doing it right -- mostly (5, Insightful)

oblivionboy (181090) | more than 6 years ago | (#23667861)

The other big success is their constant work on making the entire system architecture better, and basically giving that work to the industry for free.

While I'm sure that's how the script was repeated inside Intel, suggesting great generosity ("And we give it away for free!"), what choice did they really have? IBM's whole Micro Channel Architecture fiasco showed what licensing did to the adoption of new advances in system architecture and integration.

Re:Doing it right -- mostly (-1, Troll)

Anonymous Coward | more than 6 years ago | (#23667995)

Seriously? x86 with its pathetic register limit is a bloody joke.

Re:Doing it right -- mostly (4, Interesting)

Simonetta (207550) | more than 6 years ago | (#23668551)

Hello,
    Congrats on working at Intel for 16 years. Might I suggest that you document this period of activity in a small book? It would be great for the historical record.

    Typing is a real pain. I suggest using the speech-to-text feature buried in newer versions of MS Word, or the IBM or Dragon speech programs. Train the system by reading a few chapters off the screen. Then sit back and talk about the Intel years: the projects, the personalities, the cubicles, the picnics, the parking lot, the haircuts, the water cooler stories, anything and everything. Don't worry about punctuation and paragraphing, which can be awkward when using speech-to-text systems. It's important to get a text file of recollections from the people who were there. Intel was 'ground zero' for the digital revolution that transformed the world in the last quarter of the 20th century. Fifty to a hundred years from now, people will want to know what it was really like.

Thank you.

A few tweaks, and... (5, Interesting)

kabdib (81955) | more than 6 years ago | (#23667691)

This is a case where just a couple of tweaks to the original x86 architecture might have had a dramatic impact on the industry.

The paragraph size of the 8086 was 16 bytes; that is, the segment registers were essentially multiplied by 16, giving an address range of 1MB, which resulted in extreme memory pressure (that 640K limit) starting in the mid 80s.

If the paragraph size had been 256 bytes, that would have resulted in a 24-bit (16MB) address space. We probably wouldn't have hit the wall for another several years. Companies such as VisiCorp, which were bending heaven and earth to cram products like VisiOn into 640K, might have succeeded, and it would have been much easier to do graphics-oriented processing (death of Microsoft and Apple, anyone?). And so on.

Things might look profoundly different now, if only the 8086 had had four more address pins, and someone at Intel hadn't thought, "Well, 1MB is enough for anyone..."
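For reference, the arithmetic behind that paragraph size, sketched in Python (the 256-byte variant is the comment's hypothetical, not a real chip):

```python
def phys_addr(segment, offset, paragraph_size=16):
    """8086 real-mode address: segment * paragraph_size + offset.
    Both segment and offset are 16-bit values."""
    return segment * paragraph_size + offset

# Real 8086: 16-byte paragraphs -> 20-bit addresses, a 1 MB space.
assert phys_addr(0xB800, 0x0000) == 0xB8000   # e.g. the CGA text buffer
assert phys_addr(0xFFFF, 0xFFFF) == 0x10FFEF  # tops out just above 1 MB

# Hypothetical 256-byte paragraphs -> 24-bit addresses, a 16 MB space.
assert phys_addr(0xFFFF, 0xFFFF, paragraph_size=256) == 0x100FEFF
```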

Ah, fresh air! (1, Interesting)

Icarium (1109647) | more than 6 years ago | (#23667811)

Someone advocating better hardware over more efficient code? Heresy I say!

Re:A few tweaks, and... (5, Insightful)

fremen (33537) | more than 6 years ago | (#23668011)

What you're really saying is that "if only the chip had been a little more expensive to produce things might have been different." Adding a few little tweaks to devices was a heck of a lot more expensive in the 80s than it is today. The reality is that had Intel done what you asked, the x86 might not have succeeded this long at all.

Re:A few tweaks, and... (4, Insightful)

Gnavpot (708731) | more than 6 years ago | (#23668197)

If the paragraph size had been 256 bytes, that would have resulted in a 24-bit (16MB) address space. We probably wouldn't have hit the wall for another several years. Companies such as VisiCorp, which were bending heaven and earth to cram products like VisiOn into 640K, might have succeeded, and it would have been much easier to do graphics-oriented processing (death of Microsoft and Apple, anyone?). And so on.

But would the extra RAM have been affordable to typical users of these programs at that time?

I remember fighting for expensive upgrades from 4 to 8 MB RAM at my workplace back in the early 90's. At that time PCs had already been able to use more than 1 MB for some years. So the problem you are referring to must have been years earlier, when an upgrade from 1 to 2MB would probably have been equally expensive.

Re:A few tweaks, and... (1)

deniable (76198) | more than 6 years ago | (#23668299)

You've got that the wrong way around. The paragraph size was a result of register and address bus size. I doubt anyone went out and said "Let's make assembly programmers use segments, they'll appreciate it." It was a way to make 16-bit registers handle a 20-bit address bus.

Anyway, you're giving me bad flash-backs to EMS and XMS and himem and other things best forgotten.

Itanium sank (1)

gilesjuk (604902) | more than 6 years ago | (#23667701)

Let's not forget the wonderful Itanium processor, which was supposed to replace x86 and be the next-gen 64-bit king.

How could Intel have got it so wrong? As Linus said, "they threw out all of the good bits of x86".

It's good to see, however, that Intel have now managed to produce decent processors now that the GHz wars are over. In fact it's been as much about who can produce the lowest-power CPU. AMD seem to just have the edge.

Re:Itanium sank (3, Funny)

Anonymous Coward | more than 6 years ago | (#23667863)

Itaniums were great processors. I have a bank of surplus ones installed in my oven as a replacement heating element.

Re:Itanium sank (5, Insightful)

WMD_88 (843388) | more than 6 years ago | (#23667925)

My theory is that Itanium was secretly never created to replace x86; rather, it was designed to kill off all competitors to x86. Think about it: Intel managed to convince the vendors of several architectures (PA-RISC and Alpha come to mind) that IA-64 was the future. They proceeded to jump on Itanium and abandon the others. When Itanium failed, those companies (along with the hope of reviving the other arch's) went with it, or jumped to x86 to stay in business. Ta-da! x86 is alone and dominant in the very places IA-64 was designed for. Intel 1, CPU tech 0.

Re:Itanium sank (2, Interesting)

argent (18001) | more than 6 years ago | (#23668125)

How could Intel have got it so wrong?

That's what they do best. Getting it wrong.

x86 segments (we'll make it work like Pascal). Until they gave up on the 64K segments it was excruciating.
iAPX 432 ... the ultimate CISC (the terminal CISC)
i860 ... the compilers will make it work (they didn't)
IA-64 ... it's not really VLIW! We'll call it EPIC! The compilers will make it work! Honest!

Re:Itanium sank (2, Insightful)

Hal_Porter (817932) | more than 6 years ago | (#23668223)

Let's not forget the wonderful Itanium processor, which was supposed to replace x86 and be the next-gen 64-bit king.

How could Intel have got it so wrong? As Linus said, "they threw out all of the good bits of x86".

It's good to see, however, that Intel have now managed to produce decent processors now that the GHz wars are over. In fact it's been as much about who can produce the lowest-power CPU. AMD seem to just have the edge.
Not just Itanium. All the x86 alternatives have sunk over the years: MIPS, Alpha, PPC. x86 was originally a hack on the 8080, designed to last for a few years. All the others had visions of 25-year lifetimes. But the odd thing is that a hack will be cheaper and reach the market faster. An architecture designed to last for 25 years by definition must include features which are baggage when it is released. x86, USB, DOS and Windows show that it's better to optimize something for when it is released. Sure, doing this will leave a few holes. But if it succeeds you have the money to fix them. To us limited human engineers this seems inelegant. But that's how evolution works.

Maybe having vision is overrated. Evolution has no vision it just hacks stuff blindly. But it designed your brain. Conscious engineers planning for the long term can't do that.

Overcoming Limitations (3, Interesting)

steve_thatguy (690298) | more than 6 years ago | (#23667709)

Kinda makes you wonder how different things might be or how much farther things might've come had a better architecture become the de facto standard of commodity hardware. I've heard it said that most of the processing of x86 architectures goes to breaking down complex instructions to two or three smaller instructions. That's a lot of overhead over time. Even if programmers broke down the instructions themselves so that they were only using basically a RISC-subset of the x86 instructions, there's all that hardware that still has to be there for legacy and to preserve compatibility with the standard. But I'm not a chip engineer, so my understanding may be fundamentally flawed somehow.

Re:Overcoming Limitations (4, Insightful)

Urkki (668283) | more than 6 years ago | (#23667903)

What may have been a limitation some time ago might start to be an advantage. I'm under the impression that there's already more than enough transistors to go around per processor, and there's nothing *special* that can be done with them, so it's just cramming more cores and more cache into a single chip. So parsing, splitting and parallelizing complex instructions at the processor may not be very costly after all. OTOH I bet it does reduce the memory bandwidth needed, which definitely is an advantage.

Re:Overcoming Limitations (2, Interesting)

Hal_Porter (817932) | more than 6 years ago | (#23668293)

Kinda makes you wonder how different things might be or how much farther things might've come had a better architecture become the de facto standard of commodity hardware.

I've heard it said that most of the processing of x86 architectures goes to breaking down complex instructions to two or three smaller instructions. That's a lot of overhead over time. Even if programmers broke down the instructions themselves so that they were only using basically a RISC-subset of the x86 instructions, there's all that hardware that still has to be there for legacy and to preserve compatibility with the standard.

But I'm not a chip engineer, so my understanding may be fundamentally flawed somehow.
I think the important thing to remember is that total chip transistor counts - mostly used for caches - have inflated very rapidly due to Moore's Law, and legacy baggage has grown more slowly. So the x86 compatibility overhead in a modern x86-compatible chip is lower than it was in a 486, for example. Meanwhile the cost of not being x86-compatible has stayed the same. ARM cores are much smaller than x86, for example, but most PC-like devices still use x86 because most applications and OSs are distributed as x86 code.

1978?? (0, Flamebait)

Adeptus_Luminati (634274) | more than 6 years ago | (#23667733)

I don't recall 8086's hitting computer stores until like 1988 or so. What was Intel doing with these things for 10 years?

Re:1978?? (4, Interesting)

Ctrl-Z (28806) | more than 6 years ago | (#23667879)

Are you kidding? The 8088, a variant of the 8086, was the processor used in the IBM 5150, also known as the IBM PC, introduced in 1981.

Re:1978?? (3, Informative)

kabdib (81955) | more than 6 years ago | (#23667969)

8088 (an 8086 with an 8-bit bus) at 4.77MHz, and a graphics card architecture that was absolutely miserable, like stuffing pixels through a straw.

Oh, we suffered back then, believe me... :-)

Re:1978?? (2, Informative)

Rhapsody Scarlet (1139063) | more than 6 years ago | (#23667899)

1988? Where the hell do you live? In 1988, the 8086/8088, 80186, and 80286 had already been on the market for years. The 80386 was the CPU to have and the 80486 was only a year away.

Re:1978?? (1)

Cro Magnon (467622) | more than 6 years ago | (#23667971)

Interesting. I got my first PC, an 8088, in 1986. And I'm pretty sure the 286 was out by then.

Re:1978?? (4, Informative)

Vectronic (1221470) | more than 6 years ago | (#23667975)

My first assumption was that it was too expensive for average use... but after some investigating, I found that wrong (in my opinion).

June 1979, Intel introduces the 4.77-MHz 8086 microprocessor. It uses 16-bit registers, a 16-bit data bus, and 29,000 transistors, using 3-micron technology. Price is US$360. It can access 1 MB of memory. Speed is 0.33 MIPS.
Not from '78, but the first price I could find... so maybe you are thinking of the 80386, released in '87, not the 8086...

Re:1978?? (5, Informative)

squiggleslash (241428) | more than 6 years ago | (#23668013)

The 8086 predates the 8088, which was more popular and eclipsed it largely because IBM picked the latter for the original IBM PC. The 8088 is a modified 8086 that talks to the outside world using bytes rather than 16 bit words. It's otherwise completely identical.

PC manufacturers started switching over to the 8086 from around 1986 onwards (the Amstrad PC1512 was one example, dating to 1986) because of the slight performance improvement it offered without being as expensive as the 80286. For real-mode applications, with the 8086 running at the same speed as an 80286, there was barely any performance difference between the two chips.

The old joke at the time was that the 8088 was being phased out in 1986-1988 because the '88 in 8088 was the expiry date...

Happy 29.9999999876547542 (5, Funny)

Shadow Wrought (586631) | more than 6 years ago | (#23667761)

Signed,
your great-great-great grandson,
Pentium

Happy Birthday to ya (0, Offtopic)

mgblst (80109) | more than 6 years ago | (#23667777)

happy birthday, Happy birthday to ya.

(You can thank TimeWarnerAOL for the fact that we can't sing the usual Happy Birthday song without paying $30,000 in fees.)

Happy Birthday (5, Funny)

marto (110299) | more than 6 years ago | (#23667789)

.model small
.stack
.data
message db "Happy Birthday!", "$"   ; '$'-terminated string for DOS
.code
main proc
      mov ax, seg message
      mov ds, ax                    ; point DS at the data segment
      mov ah, 09h                   ; DOS function 09h: print string at DS:DX
      lea dx, message
      int 21h
      mov ax, 4c00h                 ; DOS function 4Ch: exit with code 0
      int 21h
main endp
end main

The scary thing is (3, Interesting)

wiredog (43288) | more than 6 years ago | (#23667977)

I was able to follow that, and it's been decades since I had to use x86 assembler.

Legendary? (0, Insightful)

Anonymous Coward | more than 6 years ago | (#23667797)

Everything the x86 series did, someone else did first on some other processor (68k, Sparc, MIPS, and PPC, to name a few), and usually better, because they didn't have the handicap of backward compatibility.

But I guess we have the Pentium 4 to thank for conditioning the masses to think that clock speed equals performance to the exclusion of all else, and that it's okay for a CPU to burn 100-150 watts all by itself.

Re:Legendary? (3, Insightful)

bhtooefr (649901) | more than 6 years ago | (#23668001)

What about CISC-to-RISC translation?

I do believe that was first done by an x86 CPU, the NexGen Nx586 (the predecessor to the AMD K6...)

Re:Legendary? (1)

Waffle Iron (339739) | more than 6 years ago | (#23668663)

Everything the x86 series did, someone else did first on some other processor (68k, Sparc, MIPS, and PPC, to name a few), and usually better, because they didn't have the handicap of backward compatibility.

Nevertheless, ever since the Alpha started faltering in the late 1990s, x86 processors have usually been the fastest or almost the fastest microprocessors available on the market at any given time, while being the most cost effective by a wide margin. At the end of the day, that's all that matters. (That, plus the fact that the backward compatibility "baggage" enables people to actually run the software they have.)

Intel has always been a P.O.S. (4, Insightful)

waldo2020 (592242) | more than 6 years ago | (#23667821)

Motorola always had a better product, just worse marketing. If IBM had chosen the 68K for their instruments machine, instead of the 8086/8085 from the Displaywriters, we would have saved ourselves from three decades of segmented address space, half a dozen memory models and a non-orthogonal CPU architecture.
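For readers who never suffered through it, the segmentation being lamented here works like this: a real-mode 8086 forms a 20-bit physical address from two 16-bit values as (segment << 4) + offset, so many different segment:offset pairs alias the same byte. A minimal sketch in Python (the helper name is made up for illustration):

```python
# Real-mode 8086 address translation: physical = (segment << 4) + offset,
# masked to the 20-bit address bus. Illustrative sketch; not a full model
# (no A20 gate, no protected mode).
def physical_address(segment: int, offset: int) -> int:
    return ((segment << 4) + offset) & 0xFFFFF

# Many segment:offset pairs alias the same physical byte:
assert physical_address(0x1234, 0x0005) == 0x12345
assert physical_address(0x1000, 0x2345) == 0x12345
assert physical_address(0x1233, 0x0015) == 0x12345

# And the top of the address space wraps around to zero:
assert physical_address(0xFFFF, 0x0010) == 0x00000
```

The aliasing is exactly why the "half a dozen memory models" (tiny, small, compact, large, huge...) existed: each was a different convention for which segment registers a program could assume were fixed.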

Re:Intel has always been a P.O.S. (3, Insightful)

divided421 (1174103) | more than 6 years ago | (#23668061)

You are absolutely correct. It amazes me how large of a market x86 commands with the undisputed worst instruction set design. Even x86 processors know their limitations and merely translate the instructions into more RISC-like 'micro-ops' (as intel calls them) for quick and efficient execution. Lucky for us, this translation 'baggage' only occupies, what, 10% of total chip design now? Had several industry giants competed on perfecting a PowerPC-based design with the same amount of funding as x86 has received, we would be years ahead of where we are now.

Re:Intel has always been a P.O.S. (1)

jalet (36114) | more than 6 years ago | (#23668479)

You expressed politely and in a few words what I've been thinking for twenty years and didn't dare write for fear of being REALLY FUCKING IMPOLITE ABOUT THE x86 !!!

Thanks a lot.

Re: Intel has always been a P.O.S. (1)

FurtiveGlancer (1274746) | more than 6 years ago | (#23668695)

You forgot the unintuitive (until made standard through pervasiveness) inverted byte-ordering scheme. Should the LSB or the MSB come first? IMHO, Motorola got that one right as well.
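The byte-order split being referred to: x86 is little-endian (least significant byte stored first), while the 68k is big-endian (most significant byte first). A quick sketch using Python's `struct` module:

```python
import struct

# The same 32-bit value serialized both ways.
value = 0x12345678

little = struct.pack("<I", value)  # x86 order: LSB first
big    = struct.pack(">I", value)  # 68k order: MSB first

assert little == b"\x78\x56\x34\x12"
assert big    == b"\x12\x34\x56\x78"
assert little == big[::-1]
```

Neither order is objectively "right", but big-endian has the advantage that a hex dump reads in the same order you'd write the number.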

Happy Happy Birthday (3, Funny)

triffid_98 (899609) | more than 6 years ago | (#23667837)

Happy birthday my Intel overlords, and a pox on whoever designed that ugly memory map.

makes you wonder .. (1)

rs232 (849320) | more than 6 years ago | (#23667921)

Makes you wonder why they didn't fix the MMU issues while they went about evoluting .. :)

Fond memories of working with the 8086 (1)

pw1972 (686596) | more than 6 years ago | (#23667931)

Back in college we learned Assembly on the VAX, which was a dream to program with its 16 registers. Then we moved on to a microprocessor class where we had to program the 8086 and, to our frustration, were limited to its 4 registers. I think we also came to the conclusion that those chips ran on smoke. Whenever you let the smoke out of them, they stopped working.

Die already ! (5, Insightful)

DarkDust (239124) | more than 6 years ago | (#23667987)

Happy birthday and long live, x86.

Oh my god, no ! Die already ! The design is bad, the instruction set is dumb, too much legacy stuff from 1978 still around and making CPUs costly, too complex and slow. Anyone who's written assembler code for x86 and other 32-bit CPUs will surely agree that the x86 is just ugly.

Even Intel didn't want it to live that long. The 8086 was a hack, a beefed-up 8085 (8-bit, a better 8080), and they wanted to replace it with a better design, but iAPX 432 [wikipedia.org] turned out to be a disaster.

The attempts to improve the design with 80286 and 80386 were not very successful... they merely did the same shit to the 8086 that the 8086 already did to the 8085: double the register size, this time adding a prefix "E" instead of the suffix "X". Oh, and they added the protected mode... which is nice, but looks like a hack compared to other processors, IMHO.

And here we are: we still have to live with some of the limitations and ugly things from the hastily hacked-together CPU that was the 8086, for example no real general-purpose registers: all the "normal" registers (E)AX, (E)BX, etc. are bound to certain jobs, at least for some opcodes. No neat stuff like register windows and shit. Oh, I hate the 8086 and that it became successful. The world could be much more beautiful (and faster) without it. But I've been ranting about this for over ten years now, and I guess I'll still be ranting about it on my deathbed.

Re:Die already ! (2, Insightful)

Anonymous Coward | more than 6 years ago | (#23668313)

The problem is that, like English, even though x86 sucks in so many ways it just happens to be very successful for those same reasons. Like for instance using xor on a register to zero itself is both disgusting and efficient... kind of like "yall" instead of "all of you".
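For the curious, here's why the xor idiom wins despite being "disgusting": the two ways of zeroing EAX encode to very different sizes. A small sketch spelling out the 32-bit machine-code bytes:

```python
# Machine-code encodings for two ways to zero EAX in 32-bit x86.
# "xor eax, eax" is the classic 2-byte idiom (opcode 31, ModRM C0);
# "mov eax, 0" needs 5 bytes because the immediate zero is encoded in full.
xor_eax_eax = bytes([0x31, 0xC0])
mov_eax_0   = bytes([0xB8, 0x00, 0x00, 0x00, 0x00])

assert len(xor_eax_eax) == 2
assert len(mov_eax_0) == 5
```

Smaller encodings mean better code density and instruction-cache usage, which is why compilers and hand-written assembly alike reach for the xor form.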

Re:Die already ! (3, Funny)

timster (32400) | more than 6 years ago | (#23668483)

Nonsense -- it's the LACK of a plural second-person pronoun in "proper" English that is disgusting (and inefficient). "Y'all" is the best hope we have for fixing this bug, and y'all should start using it as much as possible.

Re:Die already ! (2, Informative)

gardyloo (512791) | more than 6 years ago | (#23668761)

Ha. That's what "you" is for. We just need to bring back "thou" for the singular.

Re:Die already ! (2, Insightful)

eswierk (34642) | more than 6 years ago | (#23668697)

Does anyone besides compiler developers really care that the x86 instruction set is ugly and full of legacy stuff from 1978?

Most software developers care more about things like good development tools and the convenience of binary compatibility across a range of devices from supercomputers to laptops to cell phones.

Cross-compiling will always suck and emulators will always be slow. As lower-power, more highly integrated x86 chipsets become more widespread I expect to see the market for PowerPC, ARM and other embedded architectures shrink rather than grow.

Re:Die already ! (4, Insightful)

Wrath0fb0b (302444) | more than 6 years ago | (#23668755)

Even Intel didn't want it to live that long. The 8086 was a hack, a beefed-up 8085 (8-bit, a better 8080), and they wanted to replace it with a better design, but iAPX 432 turned out to be a disaster.

The attempts to improve the design with 80286 and 80386 were not very successful... they merely did the same shit to the 8086 that the 8086 already did to the 8085: double the register size, this time adding a prefix "E" instead of the suffix "X". Oh, and they added the protected mode... which is nice, but looks like a hack compared to other processors, IMHO.
Perhaps this can be taken as a lesson that it is more fruitful to evolve the same design for the sake of continuity than to start fresh with a new design. The only really successful example of a revolutionary design I can think of is OS X, and even that took two major revisions (10.2) to be fully usable. Meanwhile, Linux still operates based on APIs and other conventions from the 70s, and the internet has all this web 2.0 stuff running over HTTP 1.1, which itself runs on TCP -- old, old technology.

The first instinct of the engineer is always to tear it down and build it again, it is a useful function of the PHB (gasp!) that he prevents this from happening all the time.

June 8th (5, Funny)

coren2000 (788204) | more than 6 years ago | (#23667999)

Why couldn't the poster wait for June 8th to post this story... it's *MY* birthday today dang it... x86 is totally stealing my day....

Jerk.

alt.x86.die.die.die (1)

AceJohnny (253840) | more than 6 years ago | (#23668051)

and long live x86
Are you out of your mind!? OK OK, I'll admit that commercially, Intel was genius in making backward-compatibility king.

But on a technical side, the world would be such a better place if we could just switch to more modern architectures / instruction set. The chips would be smaller and more power efficient, not having to waste space on a front-end decoding those legacy instructions for the core, for example.

I know, Intel tried to break with the past with the Itanium. They were wrong in betting that the compilers would be good enough, sadly. Damn you, real world, damn you!

NEC V20? (1, Insightful)

Anonymous Coward | more than 6 years ago | (#23668055)

Alright - who remembers replacing the original 8086/8088 with an NEC V20 for the extra 2 MHz?

-- kickin it old school IBM PCjr style.

Certain? (1)

camperdave (969942) | more than 6 years ago | (#23668091)

Intel used then "the dawn of a new era" slogan, and they probably didn't know how certain they were.

How "certain" they were? "Certain"?? Surely you mean "correct", "right", or perhaps "prophetic".

I For One Welcome... (0)

Anonymous Coward | more than 6 years ago | (#23668093)

...just kidding...

Intel, 1978 (1, Informative)

conureman (748753) | more than 6 years ago | (#23668167)

Rejected my application for employment.

Happy birthday! (5, Funny)

Nullav (1053766) | more than 6 years ago | (#23668215)

Now die, you sputtering son of a whore. :D

Out of interest... (2, Interesting)

samael (12612) | more than 6 years ago | (#23668219)

I know that modern x86 chips convert into RISC-like instructions and then execute _them_ - if the chip only dealt with those instructions, how much more efficient would it be?

Anyone have any ideas?

Re:Out of interest... (2, Insightful)

imgod2u (812837) | more than 6 years ago | (#23668485)

Well, in order to be fast to execute, the internal u-ops can't have very high code density. I don't have exact numbers, but if the trace cache in Netburst is any indication, it's a 30% or more increase in code size for the same operations vs. x86. We're talking a 30% increase for simple instructions, too. I would imagine it's pretty bloated and not suitable for use external to the processor.

On top of that, it's probably subject to change with each micro-architecture.

Re:Out of interest... (2, Insightful)

slittle (4150) | more than 6 years ago | (#23668659)

Does it really matter? Once you expose the instruction set, it's set in stone. That'll lead us back to exactly where we are in another 30 years. As these instructions are internal, they're free to change to suit the technology of the execution units in each processor generation. And presumably because CISC instructions are bigger, they're more descriptive and the decoder can optimise them better. Intel already tried making the compiler do the optimisation - didn't work out so well.

ARM architecture is 25 years old (4, Interesting)

cyfer2000 (548592) | more than 6 years ago | (#23668283)

The ubiquitous ARM architecture is 25 years old this year and still rising.

Re:ARM architecture is 25 years old (1)

mustrum_ridcully (311862) | more than 6 years ago | (#23668737)

Yes but not quite...

Development of the ARM architecture began in 1983, but the first prototype ARM 1 processors weren't completed until 1985, with the ready for market ARM 2 being available in 1986 when the Acorn Archimedes computers were released.

Report to Carousel (5, Funny)

xPsi (851544) | more than 6 years ago | (#23668323)

X86 Turns 30 Years Old
Happy Birthday! But do not be alarmed. That flashing red light on your palm is a natural part of our social order. On Lastday, please report to Carousel for termination at your earliest convenience. The computer is your friend. Oh, wait, you ARE the computer...

And Have We Learned Our Lesson? (2, Interesting)

FurtiveGlancer (1274746) | more than 6 years ago | (#23668503)

About the tyranny of backward compatibility? Think how much further we might be in capability without that albatross [virginia.edu] slowing innovation.

No "it was necessary" arguments please. I'm not panning reverse compatibility, merely lamenting the unfortunate stagnating side effect it has had.

Re:And Have We Learned Our Lesson? (1)

conureman (748753) | more than 6 years ago | (#23668649)

Dammit, you made me cry. Thanks.
  Thirty years wasted.

And me! (0)

Anonymous Coward | more than 6 years ago | (#23668709)


That means I'm 2 years older than x86.

You should change the front page to " Happy Birthday! X86 and Anonymous Coward"

"Dawn of a New Era" (2, Insightful)

kylben (1008989) | more than 6 years ago | (#23668743)

This is probably the first time in the history of advertising that a slogan of such over-the-top hyperbole turned out to be understated.