AMD

AMD Layoffs Maul Marketing, PR Departments

MojoKid writes "AMD's initial layoff announcement yesterday implied that the dismissals would occur across the company's global sales force. While that may still be true, it has become clear that AMD has slashed its PR and Marketing departments in particular. The New Product Review Program (NPRP) has lost most of its staff, and a Graphics Product Manager who played an integral role in rescuing AMD's GPU division after the disaster of R600 also got the axe. Key members of the FirePro product team are also gone. None of the staff had any idea that the cuts were coming, or that they'd fall so heavily on certain areas. These two departments may not design products, but they create and maintain vital lines of communication between the company, its customers, and the press."
  • Bye markedroids (Score:1, Insightful)

    by Anonymous Coward

    Honestly they haven't been performing and it's understandable they got the axe. Maybe now AMD can focus on product rather than image.

    • Re:Bye markedroids (Score:4, Insightful)

      by ackthpt ( 218170 ) on Friday November 04, 2011 @07:44PM (#37953564) Homepage Journal

      Honestly they haven't been performing and it's understandable they got the axe. Maybe now AMD can focus on product rather than image.

      In my experience, image sells more often than brand. In particular, image establishes brand, for what it's worth.

      These look like the sort of cuts made by a company under particular stress. Not encouraging.

      • In my experience, image sells more often than brand. In particular, image establishes brand, for what it's worth.

        Yes, but the people doing that for AMD haven't exactly been doing a stellar job over the years... Their marketing messages have been constantly changing, and each version was a muddled mess.

        Not saying they deserved to be sacked or anything... Just, marketing is not one of AMD's strengths, and I don't think this will cost them as much as one might think.

        These look like the sort of cuts made by a company under particular stress. Not encouraging.

        I'm encouraged that they cut heavily on marketing and less on R&D, rather than the reverse. The reverse would imply they aren't planning on being competitive, eve

        • by rbeef ( 990946 )
          /me thinks they will contract marketing out. Cutting marketing and outsourcing is the lesser of the two evils.
      • Comment removed based on user account deletion
        • by tqk ( 413719 )

          ... I really doubt marketing is REALLY needed that much when you are already selling out of chips, do you?

          I'll be watching to see if this has any effect on their sales. If this keeps up, then performance and reputation are what sell (as I've always suspected), not pissing money into the wind on advertising and PR.

          • Personally, I hate marketing. I hated commercials when I watched television, I hate adverts in my newspaper, I hate them on the tubez. Today, with this wonderful internet we have, when I want something, I start searching.

            My youngest kid decided that a bike would be cool. He thought about a Harley. I told him that A) Harley is overpriced by an order of magnitude, and B) V-twins suck ass. He did some research, he half believed me, but still, the offer of a trade was just too good to pass up.

            Now, three mon

          • Comment removed based on user account deletion
      • The people that run the business at AMD are all geeks, and very good ones.

        That said, AMD is probably the first company that gets it (next to Apple, for that matter), in the sense that the product you're making should sell itself.

        I'm not a fan of Apple at all, but the point here is that I've never bought something because of an emotional response (marketing) but rather because I wanted something for a reason, like: I need something to do x, what is it and where can I get it?

        AMD has already done lots of th

    • Honestly they haven't been performing and it's understandable they got the axe.

      The products haven't been performing lately either - hell, the marketing people did their job too well, and got us thinking that Bulldozer might actually be worth waiting for. Oops.

  • by Etherized ( 1038092 ) on Friday November 04, 2011 @07:14PM (#37953380)
    I'm missing the context here; could somebody explain what this disaster was and how it threatened the existence of the GPU division? A quick Google search returns nothing.
    • by Sycraft-fu ( 314770 ) on Friday November 04, 2011 @07:32PM (#37953504)

      The reason it was a disaster was the nVidia GeForce 8800. ATi was pretty sure that nVidia was going to stick with the old style of cards, with separate shaders, for their first DirectX 10 part. That is allowed, though not ideal (the programming interface has to be unified, not the hardware). ATi already had experience with unified shaders from the Xbox 360.

      So from all accounts, their not-so-great up-and-coming GPU was going to be fine against nVidia. Then out of the blue nVidia drops the 8800; they did a really good job keeping a lid on it. Fully unified architecture that was fast as hell. We are talking often twice as fast as previous-generation stuff, and that was on DirectX 9 titles, never mind what it'd be able to do with the newer APIs.

      So ATi had to delay their release a bit and try to get something to compete better. When the R600 did launch as the Radeon 2000 series, it wasn't good competition.

      However, ATi recovered very well with the Radeon 4000 and 5000 series. The 4000 series were extremely competitive cards: good prices, good performance, low power usage, etc. Then the 5000 series were the first DX11 cards on the market by a number of months, and also great performers.

      • by Hadlock ( 143607 ) on Friday November 04, 2011 @07:57PM (#37953638) Homepage Journal

        Excellent point. It's also worth pointing out that the 8800 survived for five years as a very viable card. Released in 2006, it's still listed as a minimum requirement for many games today (including Battlefield 3). That's quite a feat considering how fast technology matures in this market. In 2009, 8800-class cards were still selling north of $120, and while not mind-blowing by today's standards, they were pretty much the gold standard until mid-2008. It's hard to compete against that kind of technology.

      • > However ATi recovered very well with the Radeon 4000 and 5000 series. The 4000 series were extremely competitive cards. Good prices, good performance, low power usage, etc. Then the 5000 series were the first DX11 cards on the market by a number of months, and also great performers.

        Yup. What's "ironic" is that the 6970 is significantly _slower_ than the 5970 *, since the 6000 series was even more about low power usage.

        It will be interesting to see if the 7000 series focuses more on power or efficiency

        • by Anonymous Coward

          Yup. What's "ironic" is that the 6970 is significantly _slower_ than the 5970

          Because one is a dual-GPU part and the other is not. The 6-series equivalent to the 5970 is the 6990, not the 6970, which is what any sane person would expect.

        • by rtb61 ( 674572 )

          Still, even counting all high-performance graphics cards, those numbers aren't really the big numbers. Keeping its focus on design rather than marketing, AMD, especially after buying ATI, might be focusing on a CPU with a high-powered embedded GPU, with separate memory caching but combined main memory usage. The demand is pent up and growing for high-powered portable graphics-based devices: tablets, netbooks, notebooks, even more efficient smaller desktops. That's where the big numbers are. The first to crack a real

        • For whatever reason, AMD decided that they wanted to mess with their naming scheme. In the 5000 and 4000 series, the x8xx part was the high-end single card, the x7xx was the lower range (like half as many shaders and so on), and the x9xx part was the dual-GPU part: two actual GPUs on one board.

          Well, for the 6000 series, they changed it. The x8xx range is the same as the x7xx range from the 5000 series, and the x9xx the same as the x8xx. So the 6970 is now the highest-end single-GPU card, and is equivalent in the line
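
          Laid out as data, the renumbering described in this subthread looks roughly like the sketch below. It's a minimal illustration in Python; the tier_map layout and model examples are my reading of these posts, not official AMD naming documentation:

            # Tier shuffle as described above -- my reading of the posts,
            # not an official AMD naming scheme.
            tier_map = {
                "lower-range single GPU": {"4000/5000": "x7xx", "6000": "x8xx"},
                "high-end single GPU":    {"4000/5000": "x8xx", "6000": "x9xx (e.g. 6970)"},
                "dual-GPU board":         {"4000/5000": "x9xx (e.g. 5970)", "6000": "6990"},
            }

            # So the sane comparison for the 5970 is the 6990, not the 6970:
            for tier, names in tier_map.items():
                print(f"{tier:23}  {names['4000/5000']:17} -> {names['6000']}")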

      • There are actually people still using 4870 cards to play recent titles at high settings. I noticed that if I had bought a 4870 instead of a 3870 back in 2007-ish, I wouldn't have needed to change my card last year. Even then my 3870 was holding up very well, but the dual-slot fansink was rather noisy (they always are). So I moved on to a Sapphire 5670 with an Arctic cooler, which is so silent that I haven't heard any fan noise for the last year despite overclocking it in taxing games.

        ATi seems to know how to produce GPUs.
        • by Anonymous Coward

          Ironically, the Datsun 1800 my dad purchased in 1978 (used, for ~2 grand) produces more megagiggles today.

      • And Linux Support came. And Eyefinity rocks. AMD is finally ok with me :)
    • Re: (Score:2, Interesting)

      by Anonymous Coward

      R600 was a huge, hot, and expensive design. It had to be delayed because it was impossible to release on the 65nm process available at the time, and it barely fit on the 55nm half-node either.

      All AMD (ATI) cards released after R600 have been built from the ground up to target the mainstream market, whereas in the past they would create big monolithic dies and then cut them down to fit the lower markets. The enthusiast slots from AMD are now filled by dual-GPU cards.

      A parallel would be Intel moving

    • by Anonymous Coward on Friday November 04, 2011 @08:08PM (#37953748)

      After spending lots of area and design time on the R600 to build this "ring bus" for good memory performance, someone at ATi basically f'd up and accidentally implemented the R600 ROP without a pipeline: get a batch of pixels, crunch on it, output it, instead of pipelined, i.e., get a batch of pixels, crunch on it while fetching the next batch, then output the first batch while crunching on the second and fetching the third, etc. Although perfectly functional, the perf sucked big time (compared to the nVidia 8800, which was available at about the same time and didn't make that kind of silly mistake).
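
      For anyone who wants the intuition in code, here is a minimal sketch of the pipelined-vs-unpipelined difference described above. The per-stage cycle costs are invented for illustration, not measured from any real ROP:

        # Toy model: a ROP that finishes each batch before touching the next,
        # versus one that overlaps its fetch/crunch/output stages.
        FETCH, CRUNCH, OUTPUT = 1, 2, 1  # invented per-batch cycle costs

        def unpipelined_cycles(n_batches):
            # fetch -> crunch -> output, fully serialized per batch
            return n_batches * (FETCH + CRUNCH + OUTPUT)

        def pipelined_cycles(n_batches):
            # once the pipe fills, one batch retires every max(stage) cycles
            fill = FETCH + CRUNCH + OUTPUT
            return fill + (n_batches - 1) * max(FETCH, CRUNCH, OUTPUT)

        for n in (1, 100, 10000):
            print(n, unpipelined_cycles(n), pipelined_cycles(n))
        # With these made-up costs the serialized ROP needs ~2x the cycles at
        # scale, which is the flavor of the gap described above.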

      Through lots of software hacks and their marketing group twisting developer arms (having developers do massively custom AA modes or huge shaders where the abysmal ROP performance didn't matter as much), they managed to salvage the situation from their crappy design mistake... This was highly fortunate, as OEMs that purchase the midrange chips often use game benchmarks to select cards for various price points, and if the game benchmarks had shown, say, 1/3 the perf of a comparable nVidia card, they wouldn't have sold many cards. That would probably have happened if all the benchmarks had been ROP-limited and they hadn't used lots of MRT hacks to get better perf out of their ROP.

      Since ATI was losing money at that time, it might have been the end of the rope for them. They had just made an aborted R500 design (which they eventually salvaged by selling it to MSFT for the Xbox 360), they were hoping to have a killer product on their hands, and they were suffering under the illusion that nVidia wouldn't show up with a unified-shader DX10 part. The resultant R600 wasn't good for ATI (the slow ROP made for bad benchmark scores, and the nVidia G80 design was unified DX10 despite what the pundits thought at the time), but it saved them long enough to be bought by AMD...

      -Anon

  • Good? (Score:4, Insightful)

    by Xanny ( 2500844 ) on Friday November 04, 2011 @07:15PM (#37953388)
    AMD's weakness is not in getting brand recognition; every major PC vendor knows who they are. They need a competitive product. That requires engineering investment and hard work to catch up to the soon-to-arrive Ivy Bridge. Servers don't want a power-inefficient processor, power users want top class for the price, and AMD is delivering neither right now on the CPU front. They also shouldn't try entering any other markets, though I imagine that is what they are thinking: get out of the x86 business since they are falling behind. Hopefully the Radeon 7000 series does really well; this next GPU generation is shaping up to be a huge force in the massively parallel server market, and AMD had better realize the opportunity they have right now to earn back some cred with a rock-solid GPU lineup. It doesn't help that Nvidia, AMD, and every ARM manufacturer are all basically waiting on TSMC for bulk 28nm transistors. They are all starting to feel the heat for depending on one company for all their silicon for this next generation of graphics hardware.
    • Re:Good? (Score:5, Interesting)

      by blair1q ( 305137 ) on Friday November 04, 2011 @07:28PM (#37953476) Journal

      TSMC isn't the only fabber.

      Rumor is that AMD and ARM may team up. But this means they might be thinking of an ARM/ATI combo chip. Which would be verrrrry interesting. But it would leave AMD's x86 department out in the cold for the future of computing.

      It's also a clue as to why AMD dumped the marcom hacks: these are the people who are supposed to tell the bigwigs what the Next Big Thing is going to be, and they have consistently been 1-2 years behind the curve.

      The only place AMD has been approaching the bleeding edge is in graphics, where the ATI engineers are merely advancing their skillz as fast as they can. No need to guess where their market is going, since there's always a call for more cores and more clock.

      • "No need to guess where their market is going, since there's always a call for more cores and more clock."

        What is different in the CPU market? More cores, more clock, less power consumption, price (you forgot the latter two). OK, there was the change to 64 bits, and the doubt about ARM vs. x86*, but if they had an entire team just to figure those out, they were really throwing money away.

        * ARM vs. x86 is easy. Which platform will be better in cores, clock, power consumption, and price? Each will win the markets th

    • by Sir_Sri ( 199544 )

      AMD and its graphics subdivision also rely on GlobalFoundries, which used to be the production arm of AMD.

      If anything, in the long run, the ARM business might help AMD and nVIDIA, even if indirectly; a big market for another high-end foundry company gives them more potential suppliers.

      But I agree, marketing handles itself if you have a good enough product right now. The 6000 series is good but not revolutionary, though I suppose one could say exactly the same thing about the 500 series from nVIDIA. I wonder if t

    • Competitive products aren't the issue. Well, right now they are, but they do have issues with name recognition. PC vendors are not going to integrate AMD products if there isn't demand for them. And I've been shocked that AMD still doesn't bother with advertising the way that Intel does.

      There have been periods where AMD chips were better than Intel chips, and yet that hasn't ever been reflected in market share. Right now, the move is the correct one: cut marketing and focus on developing better products, b

      • by 0123456 ( 636235 )

        There have been periods where AMD chips were better than Intel chips, and yet that hasn't ever been reflected in market share.

        It's kind of hard to massively ramp up market share in a short time when that requires spending billions of dollars on building new fabs to churn out those new chips. If AMD had built new fabs when the Athlon-64 proved to be significantly better than the space-heater P4s, they'd probably have finished them sometime after Intel released the Core line and blew them away.

        • Re:Good? (Score:5, Informative)

          by Chris Burke ( 6130 ) on Friday November 04, 2011 @08:57PM (#37954154) Homepage

          AMD did start building fabs when the Athlon64 and Opteron were kicking ass all over, and when their projections of market share showed that they would be fab limited -- which for a while, they were.

          The problem is that when they opened up the flood gates on their production capacity, the market share didn't follow. It bumped slightly, but not nearly enough to justify the massive investment in the fabs, wrecking their financials and ultimately forcing them to spin off the fabs as Global Foundries. This was due to the backroom deals Intel had with OEMs limiting the number of AMD parts they could sell.

          This is the essence of AMD's lawsuit against Intel and the anti-trust rulings by Japan, North Korea, and the EU.

          • ...anti-trust rulings by Japan, South Korea and the EU, I presume.

          • Please tell me more about the anti-trust case in North Korea.
          • by vakuona ( 788200 )

            AMD should have bought a computer manufacturer, or a brand. AMD can't compete against Intel by trying to get into Dells etc. They can compete with Intel by making their own desktop and server lines, targeting them well and making decent profits by cutting out the middleman. The profits would allow them to continue to invest in their engineering.

            AMD need a strategy that allows them to make money off a small market share without sacrificing their ability to invest in their engineering.

          • by splerdu ( 187709 )

            I would mod you informative for your post, and funny for North Korea.

            If I had the points.

    • Sadly for AMD, I feel their time has come and gone. For a number of years they were well ahead of Intel, but we all know why they could never press home their dominance. A billion-euro fine, despite being a record, was chump change for Intel, and in reality they profited greatly by shutting AMD out of the market.

      • by Anonymous Coward

        Some nights I miss my old K133 and its solid decoding of 128kbps mp3s.

      • Show me the Intel parts that can compete with AMD's C-50 and E-350. The cost and/or power consumption will be a lot higher to reach performance parity with these two parts.

        AMD still doesn't have the market share in the segments where Intel should really be getting its ass handed to it. Why would any manufacturer pick inferior Intel for a netbook or notebook? Makes you wonder.
  • Vital? (Score:5, Insightful)

    by vux984 ( 928602 ) on Friday November 04, 2011 @07:22PM (#37953428)

    These two departments may not design products, but they create and maintain vital lines of communication between the company, its customers, and the press.

    Better to cut marketing and the "vital" line of communication to the press, than to cut product development and not have a new product next quarter... because then having lines of communication to the press won't seem so vitally important anymore.

    Still it sucks for anyone to lose their jobs.

    • by v1 ( 525388 )

      It can't be easy to determine where the cuts are going to be made if you've decided to not just do an even across-the-board cut. Most divisions within any sizable company could have good arguments made for not selecting them for cuts.

      Unless your company has a "wing" ripe for picking anyway. Something that can be cut out all at once like a tumor without much effect to the rest of the company. You just have to hope you have quality bean-counters working closely with the company directors/visionaries to det

      • Re: (Score:3, Insightful)

        by del_diablo ( 1747634 )

        It is easy: you cut the entire company's management wages down to the lowest engineering wage, with no bonuses. That includes the stockholders, CEOs, and other "high positions".
        It wouldn't surprise me one bit if that earned them a really nice surplus of cash, which in turn could be used for massive amounts of R&D.
        Of course, no corporation these days wants to sit down and do what needs to be done.

        • If that's the way you want it to work, then move to North Korea.

        • by vux984 ( 928602 )

          You cut the entire company's management wages down to the lowest engineering wage, with no bonuses. That includes the stockholders, CEOs, and other "high positions".

          Or maybe they should cut the engineering wages to the lowest janitor's wage, with no bonuses. What do you think that would accomplish?

          A big fat surplus of cash? Of course not. All the engineers would find new jobs and quit.

          What makes you think management would be any different?

          I agree executive compensation is way out of whack, but you can't just c

      • It can't be easy to determine where the cuts are going to be made if you've decided to not just do an even across-the-board cut. Most divisions within any sizable company could have good arguments made for not selecting them for cuts.

        AMD is a fabless chip company now. That means they design chips. They are behind in performance on the x86 side, and are about to be behind in low power when Intel uses FinFETs (sorry, Tri-Gate). The last thing they need to cut is their core design business - it's what the company

    • by Anonymous Coward

      Hi, a member of the press here (proper press, not a blogger);

      So the issue is that the NPRP was responsible for providing support for us when reviewing products at launch. This meant tracking down bugs, letting us know about internally known issues, and getting drivers issued for important bug fixes. The fact of the matter is that pre-launch hardware (particularly for a new architecture) is practically beta testing, and we're the beta testers. The risk AMD takes by not having a well-staffed NPRP is that if w

      • by Grave ( 8234 )

        NPRP is the last line of defense? Cool. Glad they didn't cut the first line of defense, which is the engineering team responsible for creating a superior product in the first place.

      • that's a stigma that would stick to a product for its entire life

        I think that depends entirely on the consumer. Personally, I go for the best (current) performance for the lowest cost. I realize that getting your CPU into a larger share of mainstream computers from Dell etc. may be affected by general perception, but AMD is already losing on that front. Intel has a much larger desktop market share and, anecdotal as it is, most people I talk to favor Intel for seemingly emotional reasons. I believe that the perception of the "enthusiasts", however, is ultimately what sways

      • by vux984 ( 928602 )

        Or to put this another way, the NPRP was the last line of defense against a bad review.

        Not having a new product to review in the first place is far worse.

  • by blair1q ( 305137 ) on Friday November 04, 2011 @07:23PM (#37953432) Journal

    Seriously, with 14 bazillion bloggers fighting to get clicks to their webpages, all you need is one guy with a copy of the datasheet and a twitter account, and you'll have your part's nomenclature showing up on every RSS feed in the world within minutes if not days. And, if you're lucky (or just know where to put the typos), you can get /. to send your favorite blogger enough clicks to buy an iPhone.

    • Yes, but those aren't the people that AMD needs to be reaching. To people who know little about computers, Intel is a name brand that has been associated with quality. The problem is that it isn't always true: there are periods where Intel is doing really good work and periods where AMD chips are better, but you don't really ever see that in the market share, in large part because for the most part you have to build your own computer if you want AMD parts.

      Not quite so much now that AMD does GPUs as

      • by blair1q ( 305137 )

        Okay, so in addition to your twitter guy, hire a guy to call OEMs and say "use our chips and we'll give you free stickers."

        And a guy to negotiate for a NASCAR team. Because, fuck, man, this is Amurrca.

        Fact is, if AMD had ONE THOUSAND FOUR HUNDRED people doing that job, they were wasting about $140 million a year on dead wood.

        • It's not a fact. That's how marketing works: it doesn't matter how incredibly brilliant and affordable your product is if nobody knows it exists. And for most people, AMD products just aren't available when they go to the store. And they don't know about them because there's no marketing, and they often aren't carried by the stores.

          • Comment removed based on user account deletion
          • because there's no marketing, and they often aren't carried by the stores.

            So, just WTF were the 1,400 marketers doing then? Apparently not their job. Maybe that's why they were purged.

      • by Kjella ( 173770 )

        The problem is that it isn't always true: there are periods where Intel is doing really good work and periods where AMD chips are better, but you don't really ever see that in the market share, in large part because for the most part you have to build your own computer if you want AMD parts.

        Well that, and you have the problem that it's not that easy to ramp CPU production up or down. Building fab capacity starts years in advance, so by the time AMD actually brings a processor to market they've got a fairly narrow percentage of the market they can supply. Good chips mean high prices, poor chips mean low prices, but they can't take that much market share in one generation. And if you overextend yourself, you risk that Intel pulls a very good processor out of the hat and you're left with way

    • Re: (Score:2, Insightful)

      by westlake ( 615356 )

      Seriously, with 14 bazillion bloggers fighting to get clicks to their webpages, all you need is one guy with a copy of the datasheet and a twitter account.

      If this were true, why is Linux clinging by its fingertips to a bare 1% market share?

      You need people who can negotiate OEM system installs, retail placement and sales promotions. Your bazillion bloggers aren't as useful as the one man or woman who knows how to cut the right deal with Walmart.

      • by Anonymous Coward

        Oh shit son, did they just unthaw you?

        Linux doesn't have 1% marketshare of anything. It has closer to 10% desktop marketshare and dominates the mobile, server and embedded spaces.

    • If you have a superior product, then the approach you outline might work. Just let the bloggers tell everyone what they honestly think. But when it comes to Intel vs. AMD, practically everyone would say Intel is better, except for those who think price is a decisive factor. So, when you go to Best Buy and you see row after row of laptops with Intel chips, and the sales people are telling you the battery will last 6 hours and the processor speed is this or that and it's all because of the super efficient
      • by mikael ( 484 )

        A simple CPU performance comparison chart would work [cpubenchmark.net]

        But how can anyone tell whether one CPU is going to be fast or slow for their purposes? There's all that hyperthreading, cache size, and memory size to consider as well.
        You'd really need to know how many MHz and megabytes a particular application requires in order to run smoothly.

        My parents have a laptop (1.5 GHz) that is so slow to start up that they basically just keep it on all the time, and don't bother shutting down the applications (E-mail, spreadshe

  • Nothing of value was lost.

  • by catmistake ( 814204 ) on Friday November 04, 2011 @07:30PM (#37953488) Journal

    Office Space?

    These two departments may not design products, but they create and maintain vital lines of communication between the company, its customers, and the press.

    Bob Slydell: What would you say ya do here?

    Tom Smykowski: Well look, I already told you! I deal with the goddamn customers so the engineers don't have to! I have people skills! I am good at dealing with people! Can't you understand that? What the hell is wrong with you people?

  • Amazing (Score:5, Insightful)

    by liquidweaver ( 1988660 ) on Friday November 04, 2011 @07:31PM (#37953492)

    Here it is, 2011, when CEOs live and die by 10-Ks and stock prices, and we have a company that laid off marketing and PR and kept their engineers. How much AMD stock can I buy? Sign me up!

    • by CAIMLAS ( 41445 )

      Absolutely. This is a "bold" (i.e., different) move. I've always liked AMD as a company, as their business decisions have always at least been long-game driven: "It hurts now, but two, three quarters from now, we'll like it."

      I was starting to think AMD was going to fall way, way behind. Now, I think they're going to pull ahead of this one. (Hell, it took Nvidia one major revision in their cards to get from "poor performance and high power use" to "the head of the pack by a bit". From what I understand, Bulldo

      • by Anonymous Coward

        In most cases, losses to marketing are not a big deal. Hell, Berkshire Hathaway could probably cut their advertising budget in half and you'd see only slightly fewer Geico and Dairy Queen commercials, but you'd still know they sell insurance and ice cream.

        AMD sells CPUs and GPUs. I know they sell other stuff, but nobody ever talks about the other stuff. I think that tells you all you need to know.

        From an engineering POV, the only reason I buy the Radeons but not the AMD CPUs is because the CPUs tend t

      • Re:Amazing (Score:5, Informative)

        by gman003 ( 1693318 ) on Friday November 04, 2011 @09:49PM (#37954468)

        From what I understand, Bulldozer isn't designed poorly - the implementation is just lacking. Sounds to me like they pushed a beta product out for quarterly product presence, but the real product isn't far behind...

        Actually, a huge part of Bulldozer's problem is marketing lies. The architecture is very interesting - it's based on a "module" made of an instruction fetcher/decoder, two integer cores, a floating-point core, and two levels of cache. The effect is comparable to Intel's Hyper-Threading, even if the implementation is different. A four-module Bulldozer chip is comparable to a hyper-threaded quad-core Intel chip - it can ALWAYS run four threads at once, and can theoretically reach eight.

        The problem is, AMD didn't market it that way. They market their four-module chips as 8-core, and their two-module chips as quad-core. Which isn't, technically, lying - they do have that many integer cores - but that marketing caused problems when benchmarks came out. People saw "AMD 8-core chip beaten by Intel 4-core chip" and thought "man, those cores must suck BALLS. And since even I know that a lot of programs are still single-threaded, it really makes no sense for me to buy an AMD chip right now".
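
        To put rough numbers on that, here is a toy sketch using the counts from the description above; nothing in it is an official AMD spec, just the arithmetic behind the "8-core vs. 4-core" confusion:

          # Toy model of the four-module chip described in this post.
          MODULES = 4
          INT_CORES_PER_MODULE = 2   # independent integer cores
          FPUS_PER_MODULE = 1        # shared floating-point core
          FRONT_ENDS_PER_MODULE = 1  # shared instruction fetch/decode

          print("integer cores:", MODULES * INT_CORES_PER_MODULE)   # 8 -> the "8-core" claim
          print("FP cores:     ", MODULES * FPUS_PER_MODULE)        # 4 -> FP loads see a quad-core
          print("front ends:   ", MODULES * FRONT_ENDS_PER_MODULE)  # 4 -> shared fetch/decode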

        It's almost justice, seeing the marketers fired for this. They stretched the truth beyond what the public would believe, and it bit them in the ass.

        The other problem with Bulldozer is pricing - Bulldozer chips, at least right now, are ~$30 more expensive than the comparable Sandy Bridge processor. Sure, you'll quite likely save twice that if you're upgrading, since Bulldozer is mostly compatible with older motherboards while Intel is still thrashing sockets, but that's not going to be the case for everyone.

        • I can't think of very many situations where having a four-instruction decoder in front of two four-issue cores would be a good idea, unless you're doing something weird like issuing nothing but vector ops. Additionally, cache latency is through the roof compared to everything else in that space right now, including previous-gen parts.

          Bulldozer is a bad design, and AMD's current roadmap (10-15% improvement per year) doesn't look very reassuring. Still, I'd like to see them get their shit back together, b
        • by Nemyst ( 1383049 )

          AMD has another problem: its own Phenom line. For all intents and purposes, an older and dirt cheap Phenom II X4 will more than please 99% of the population. For the remaining 1%, performance is an important criterion, at which point they'll probably go for a top-of-the-line Sandy Bridge or Ivy Bridge processor since they're rebuilding every three years anyways, negating the upgrade discounts on AMD's platform.

          That AMD is refocusing its strategy towards APUs and possibly teaming up with ARM should be a tell

        • Bulldozer is mostly compatible with older motherboards

          I think you've got that backwards. The *older* chips (Phenom/Athlon II) are mostly compatible with *newer* motherboards.

          • Goes both ways:

            "Some manufacturers have announced that some of their AM3 motherboards will support AM3+ CPUs, after a simple BIOS upgrade. Mechanical compatibility has been confirmed and it's possible AM3+ CPUs will work in AM3 boards, provided they can supply enough peak current. Another issue might be the use of the sideband temperature sensor interface for reading the temperature from the CPU. Also, certain power-saving features may not work, due to lack of support for rapid VCore switching. Note that us

            • This has turned out to be rarely true in practice. Only about three models each of MSI and Asus 800-series motherboards can run a Bulldozer CPU with a BIOS update, and NO Gigabyte 800-series boards will, except for the very last hardware revisions of about eight of their models.

              The motherboard OEMs did a much better job of supporting new CPUs on their existing boards for the previous AM2 - AM2+ - AM3 transitions.

        • by Kjella ( 173770 )

          Except if they called it a quad-core, then it'd be the world's biggest and most power-hungry quad-core. It's got the transistor count and power consumption of an octa-core without actually delivering that performance. I'm not so sure marketing it the other way would have looked any better.

        • And they recently took the world's fastest processor record for Guinness, I believe. They used liquid helium cooling at -220 C... The actual chip and package are made to withstand high clock speeds and torturous conditions. They are gonna tune these and make them faster. They need their R&D team to make better sense of their instruction set implementation. That's where Intel is killing them in ops/second... I still have hopes for AMD, 'cuz competition is good for us, but I use Intel right now...
        • The Bulldozer FPU is shared between the two cores in a module. It sounds like it's really only a single FPU for the new 256-bit instructions. Remember when the K8 used a 64-bit FPU and still kicked Intel's butt? IMHO they waited about a year too long to upgrade it to 128-bit. Do we really know where BD's bottleneck is?
        • by CAIMLAS ( 41445 )

          So, for gaming, why would I want one of these? That's still the question on most people's minds.

          Most of the remaining geeks think: but Bulldozer uses a gobton more power than the performance-equivalent Sandy Bridge chip (and it costs more). My existing AMD system outperforms it for single- and double-core workloads. I don't want one.

          The common consumer usually buys what's almost the cheapest, thinking it's the value proposition. Sometimes that's a good deal, and sometimes it'll be AMD. Or, at least, that's t

      • From what I understand, Bulldozer isn't designed poorly - the implementation is just lacking. Sounds to me like they pushed a beta product out for quarterly product presence, but the real product isn't far behind...

        I don't know jack about this, so I'll just quote what an acquaintance of mine wrote on my gaming community forum:

        The main issue is that the Windows 7 scheduler doesn't understand the effective use of the modules, which drastically cuts down on the ability of the processor to turn off cores and run in turbo mode.

        The bottom line in the end, unfortunately, is that most desktop workloads still will not take advantage of the Bulldozer architecture. It'll fare much better in the server world, but it's going to be a while before desktop software truly shifts to the style of programming Bulldozer requires. A long while. Probably a lot longer than 5+ years...

        [I ask a stupid question]

        ...Intel also disables cores. The idle cores are turned off to reduce the thermal footprint while running the active cores at higher clock speeds. Bulldozer's method is a little more complicated than Intel's, though, and Windows 7 doesn't understand how to deal with it. For example, for two integer-heavy threads with shared data, it should schedule them to a single module and throw the turbo on. Two integer-heavy threads with non-shared data? Two modules. Two floating-point threads? Two modules. There are a lot of conditionals about the workload, based on the new longer pipeline and the dual integer units but only a single FP unit.
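
        As pseudocode, my reading of that placement rule looks something like the sketch below. It is a toy illustration of the heuristic as he describes it, not Windows or AMD code; the function name and thread classifications are invented:

          # Toy placement rule for two runnable threads, per the description above.
          def place_two_threads(kind_a, kind_b, share_data):
              """kind_* is 'int' or 'fp'; returns a module assignment."""
              if kind_a == "int" and kind_b == "int" and share_data:
                  # Packing shared-data integer work onto one module keeps it in
                  # that module's shared cache and lets the idle modules power
                  # down so the busy one can turbo.
                  return "same module"
              # FP pairs would fight over a module's single FP unit, and
              # non-sharing integer pairs gain nothing from co-residence.
              return "separate modules"

          print(place_two_threads("int", "int", True))   # same module
          print(place_two_threads("int", "int", False))  # separate modules
          print(place_two_threads("fp", "fp", False))    # separate modules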

        Piledriver will improve the FP a lot since it'll have the GPU on die, which is what AMD is really aiming for. They don't really want FP units in the module at all; they want to push that work to GPU-style modules. Which makes sense, as they're a hundred times better at it. But programs aren't written to take advantage of that yet.

        I have no idea if he's correct, but that's the extent of my understanding.

    • I'm not sure what's so funny about this post; it's clear that they understand that a company that depends upon creating new products can't cut back on engineering indefinitely.

      • Except for the repetitive redundant part where you keep saying "engineers, engineers, engineers", lots of companies seem to not understand that at all.

    • I agree. I have always felt that the ROI for marketing and advertising is somewhere under $1 per $1 spent. The only reason you have to have marketing is because your competitor has marketing, and even though the competition loses money on every dollar spent, you also lose some amount of money for every dollar they spend, because their marketing has SOME effect on your business.
      In other words, marketing is necessary because marketing exists. In the same manner, we all hate lying salesmen, but for the sa
  • None of the staff had any idea that the cuts were coming, or that they'd fall so heavily on certain areas.

    And this is precisely why they were fired. I mean, duh, it is not news that marketing is among the first areas to be axed in a dying company. There's quite a bit of precedent in the business world. If those employees didn't even know this, and had no situational awareness of how their brands were doing, I can just imagine how they were handling their day-to-day work.

    • Actually, a lot of companies are hitting R&D the heaviest, since they require lots of space, supplies, and equipment (overhead) and are often the highest-paid non-management/attorney positions. Usually a terrible, terrible plan for any company as a whole, since THAT'S HOW A COMPANY STAYS COMPETITIVE, but it keeps the stock healthy for a long-enough period that management can cash out and then get the fuck out of Dodge before the house of cards comes tumbling down. For a thrill, keep an eye on the big

  • by Anonymous Coward

    Take a look at Apple... marketing wizards. You may love their products or hate them, but their growth and sales speak for themselves. This may not end up being such a good move for AMD in the long run.

    • Take a look at Apple... marketing wizards. You may love their products or hate them, but their growth and sales speak for themselves. This may not end up being such a good move for AMD in the long run.

      The problem is vastly different, as many posters here are pointing out: AMD has (had?) a lousy marketing department.

      The illustration you're making is apples (heh) to oranges. Apple has an extremely strong and talented marketing wing--so much so that the actual real-world quality of their products almost doe

  • forthcoming chips from AMD will just have numbers instead of names

  • The future market for GPUs is not a bunch of gaming enthusiasts, and the design cycle for new devices sometimes takes years. The future is in tablets, phones, TVs, and other connected devices. Having a GPU that fits in a smartphone is more important than having insanely good rendering performance. Having a GPU that draws little power and does not need three fans and a big heat sink is crucial. AMD's marketing team made a colossal branding mistake by killing the ATI brand when millions and millions of gami
