
Intel Gives Up On TV

Soulskill posted more than 2 years ago | from the stick-to-your-strengths dept.


symbolset writes "Bloomberg is reporting that Intel, on the cusp of having low-power embedded chips that can do true HD in a flatscreen, has given up on getting its chips embedded in TVs. While many might say their efforts to date have been fruitless because of energy issues, Medfield might have had a chance in this field."


89 comments


first post (-1)

Anonymous Coward | more than 2 years ago | (#37687334)

this post made it to market. intel's chips didn't.

Translation, please? (3, Interesting)

afabbro (33948) | more than 2 years ago | (#37687362)

Intel has been unable to provide a chip that offered significantly different performance from rival offerings, and failed to convince TV makers such as Samsung Electronics Co. or Sony Corp. that they needed its chips, Acree said.

OK, geeky people, what does that mean?

I interpret it as "producing chips for TVs is a commodity business and there's little opportunity to introduce anything new." Was Intel just late to the TV chip party and other chipmakers had it sewn up?

I would think even as a commodity producer, Intel would be competitive just because they have huge scale.

Re:Translation, please? (2, Informative)

Anonymous Coward | more than 2 years ago | (#37687410)

I took it to mean Intel wants to cram x86 suckage into everything, to leverage their efforts at making a low-power x86 for UMPCs.

But nobody wants to run extant x86 apps on their TV, so everyone is happy with their ARMs (mostly), MIPS, etc.; if Intel were willing to depend on someone else's IP, they could get back into the ARM business and just clean house (as you said, huge scale, and usually a process shrink or so ahead of their competitors), but x86 just isn't cutting it.

Re:Translation, please? (1)

Amouth (879122) | more than 2 years ago | (#37689242)

i never did understand why they got out of the ARM group.. their xscale cpu's where better than the competition by a long shot. Sure they where more expensive but it was worth it if you actually had to use the device. and if someone did manage to challenge them they have the scale/volume to drop prices and clean house.

Re:Translation, please? (1)

DaVince21 (1342819) | more than 2 years ago | (#37705216)

Just a little unrelated grammar-related note... "Where" means asking for a location. "Were" is the past tense of "are". ;)

Re:Translation, please? (1)

Amouth (879122) | more than 2 years ago | (#37705528)

thanks - didn't realize i missed that - I've never been good with spelling and correct grammatical use, i proof read everything but i still miss a lot.

Re:Translation, please? (1)

DaVince21 (1342819) | more than 2 years ago | (#37705618)

Glad to help, and hey, it happens! I sometimes catch myself saying grammatically crazy things sometimes. :)

(To be honest, I expected a "are you just here to play grammar Nazi?" reply. Guess that says something about what kind of strange expectations the internet can give you!)

Re:Translation, please? (1)

Amouth (879122) | more than 2 years ago | (#37705896)

i do wish you could edit comments to fix errors like that.

I do try not to perpetuate the norms that have formed here, but rather to follow the norm of what was originally intended.

Not even games? (1)

tepples (727027) | more than 2 years ago | (#37689718)

But nobody wants to run extant x86 apps on their TV

Not even games? Imagine a TV with a built-in PC that can connect to Steam, Impulse, and GOG to download games. Put your wireless mouse and keyboard on a TV tray, and PC gaming is back. Apple already makes a 21" and 27" model.

Re:Not even games? (0)

Anonymous Coward | more than 2 years ago | (#37690582)

Do you really want to have a hard drive, video card, and Windows on your TV? I sure as fuck don't. You may as well just bring your desktop into the living room.

Re:Not even games? (1)

Austerity Empowers (669817) | more than 2 years ago | (#37691308)

My wife won't let me, but I'd do it if I could.

Re:Not even games? (1)

tepples (727027) | more than 2 years ago | (#37691414)

Perhaps my point missed you. Apple sells a TV with a built-in Macintosh computer, called the "iMac".

Re:Not even games? (1)

Luckyo (1726890) | more than 2 years ago | (#37691940)

Why not? TVs are getting "smart", and many of them are already connected to personal computers.

Might as well have the option of a proper Windows PC TV, without a mess of cables and without having to use a keyboard/mouse to navigate.

Re:Not even games? (1)

Medievalist (16032) | more than 2 years ago | (#37692264)

I can already do all that stuff without having the TV and PC share a case. My computers have HDMI out and my TV has VGA in.

Almost nobody else has what you have (1)

tepples (727027) | more than 2 years ago | (#37693366)

My computers have HDMI out and my TV has VGA in.

Some people still have a CRT SDTV with only a composite input or an early 1080i CRT HDTV with only composite and component inputs. And even if the video signals are compatible (VGA out to VGA in, or HDMI/DVI out to HDMI in), not everybody has a computer in the same room as the TV. They might own only one computer and not want to have to carry it back and forth between the computer desk and the TV. It appears that in practice, statistically nobody is interested in buying a computer to hook up to the TV. (See previous comments: 1 [slashdot.org] 2 [slashdot.org] 3 [slashdot.org] 4 [slashdot.org] 5 [slashdot.org] 6 [slashdot.org] 7 [slashdot.org].)

Re:Almost nobody else has what you have (1)

Medievalist (16032) | more than 2 years ago | (#37695284)

Good footnoting, there, tepples!

This is probably a function of time of purchase. I have not seen anyone buy a laptop that didn't have a TV-out of some sort (composite, DVI, HDMI or S-video) for years, but it seems like VGA inputs are just starting to become standard on HDTVs recently, and HDMI-out on video cards still isn't really widespread (although obviously it is already commonplace on gamer video cards). I just happen to have bought my first HDTV last solstice; early adopters are probably more limited.

Re:Almost nobody else has what you have (1)

tepples (727027) | more than 2 years ago | (#37695410)

HDMI-out on video cards still isn't really widespread

DVI-D output has long been standard on even low-end video cards, even if not on desktop integrated graphics. I've seen several video cards with no VGA connector, just a DVI-A to VGA adapter hanging off a DVI-I port. My TV has an analog audio input next to one of its HDMI inputs, which appears to have been designed specifically for use with a DVI to HDMI cable and an analog audio cable.

But just because the port is there doesn't mean that TV owners A. know it's there or B. feel like using it. Is there anything I can do to promote the use of PCs with TVs to the general public, other than writing a HOWTO [pineight.com] ?

Re:Almost nobody else has what you have (1)

Medievalist (16032) | more than 2 years ago | (#37695986)

That's a pretty good start on a howto, there (as somebody commented, you ought to find a Brit to add SCART).

My kids and octogenarian grandfather would have no problem following it. The pictures of connectors and corresponding tables work really well.

But my mother would never find it useful, because she doesn't want to know how to hook up electronics. She doesn't need to change her attitude, either - she's just fine the way things are! Similarly, I don't want to know how to deconstruct poetry or mine gypsum, no matter how useful those skills might be. That stuff is boring.

So I guess I'm saying "don't worry about it, you've already led the horses to the water. They'll drink when they're ready."

Re:Almost nobody else has what you have (1)

tepples (727027) | more than 2 years ago | (#37696184)

So I guess I'm saying "don't worry about it, you've already led the horses to the water. They'll drink when they're ready."

There are some video game genres that don't work well on the monitor connected to the average PC, but they work well on a larger, TV-size monitor. Take split- or otherwise shared-screen co-op games or party games in the vein of Bomberman or Smash Bros. It's kind of hard to fit two to four people holding gamepads around a 17" to 19" PC monitor. Such games have historically been released for consoles, but indie developers tend to be unable to afford the organizational overhead of console game development.

Re:Almost nobody else has what you have (0)

Anonymous Coward | more than 2 years ago | (#37695724)

That is one reason why my old Xbox still has a use. XBMC is an awesome media centre, and can run on a chipped Xbox connected to an SD TV. 100% of my movie collection is suitable for playback on SD equipment, and looks pretty good. Only some of my latest rips are starting to push what the old Xbox can play back, but I now have a custom HandBrake config that works well, keeping good quality while keeping size down to 700-900MB for a feature-length movie. This means all the DVDs can be stored away and not scratched up by the kids, and I can play back on anything.
I have no plans to go HD until the analog signal stops broadcasting late next year.
Also, I'd never spend money on a TV with a computer in it, as it would be obsolete in a couple of years. I want my TV to last longer than that if I am paying triple figures for it. I also want the best panel I can afford. The media player is best left as a component, just like the 5.1 amp, the DVD player/DVR etc. The closest I've come to integrated is looking at a BD player with a 500GB drive, two digital tuners and a DLNA client. But then I will need to move my storage to a NAS with a DLNA server, so I am still happy just playing back with XBMC over SMB fileshares.
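
If you're wondering how to pick an average bitrate for a size target like that, the arithmetic is straightforward. Here's a minimal sketch in Python; the movie length and audio bitrate below are assumed for illustration, not taken from any actual HandBrake config:

    # Average video bitrate needed to hit a given file-size target.
    target_mb = 800        # aiming for the middle of the 700-900MB range
    duration_min = 105     # assumed feature length
    audio_kbps = 160       # assumed audio track bitrate

    total_kbits = target_mb * 8 * 1024                       # MB -> kilobits
    video_kbps = total_kbits / (duration_min * 60) - audio_kbps

    print(f"average video bitrate: ~{video_kbps:.0f} kbps")  # ~880 kbps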

Re:Translation, please? (1)

Anonymous Coward | more than 2 years ago | (#37687428)

Samsung produces ARM chips, which have smaller instruction sets than x86 chips and can run more efficiently in less demanding applications such as TVs. While Intel has made great strides in producing smaller and more efficient x86 chips, they are still just too bloated and power hungry.

Car analogy: sticking a V8 in a golf cart. While the idea sounds appealing, it is not efficient for the designed purpose of the golf cart.

Re:Translation, please? (2)

TheInternetGuy (2006682) | more than 2 years ago | (#37687458)

I think the problem might be that they found out that TVs don't really eat chips, at least not at the same rate as the viewers do.

And your car analogy makes no sense at all. Why would my golf cart need a JavaScript interpreter?

Re:Translation, please? (2)

darthdavid (835069) | more than 2 years ago | (#37687972)

How could you use it to browse the links if you didn't?

Re:Translation, please? (2)

darthdavid (835069) | more than 2 years ago | (#37687976)

You need one to browse the links...

Re:Translation, please? (1)

TheInternetGuy (2006682) | more than 2 years ago | (#37697040)

I read this and thought... Yeah I guess your point is that everything has to be on the net these days.

Then I thought, wait there is something more going on here. ....
Oh links, golf , browse.... And suddenly there is coffee all over my screen.

-Well played sir.

Re:Translation, please? (1)

DaVince21 (1342819) | more than 2 years ago | (#37705254)

Whoosh and/or well played, sir.

Re:Translation, please? (0)

Anonymous Coward | more than 2 years ago | (#37688304)

This sounds wildly off the mark, but anyway...

Samsung. Period. Samsung, like Sony, has their sticky fingers in a lot of things, and it's a lot cheaper to use your own IP or licences than it is to use someone else's.

For Samsung, the LCD panel, ARM chip, NAND chips, and RAM can all be made in-house. This leaves nothing for Intel to contribute. Samsung is also the only LCD manufacturer in Korea that doesn't fail horribly. Samsung is smart, and arguably they have more money than they know what to do with... or they are just wasteful about it.

Intel on the other hand produces powerful CPUs with crippled GPUs (licensed from another company) and not much else.

Look at Apple's AppleTV box. It's basically an iPhone 4 without the LCD screen and battery, in a tiny package. Samsung could make that and integrate it into an LCD TV easily. In fact I'm honestly surprised Samsung didn't just integrate Android into a TV already, since they have more market in Korea than they do in North America (that might change with the Korea-America free trade agreement.)

Somehow "buy American" became impossible in the last 30 years. Yet Japan and Korea buy their domestic brands in preference to American ones. Yet there are only two things that are "American" that sell worldwide, the iPhone/iPod/iPad (Apple, made in China) and the Xbox360 (Made in Mexico), and here's the scary thing... the only reason that cheaper copies of these items don't sell is because you can't use their marketplaces with crap counterfeits. DingDingDing, looks like we found the winning business model.

(And no, restaurants, food and beverages aren't being considered since food items are almost always produced domestically for cost and spoilage reasons.)

Re:Translation, please? (0)

Anonymous Coward | more than 2 years ago | (#37688902)

Yet there are only two things that are "American" that sell worldwide, the iPhone/iPod/iPad (Apple, made in China) and the Xbox360 (Made in Mexico)

Are you even sure that the Mexican-made are sold worldwide?

I live in the UK, and "Made in Mexico" isn't that common here. (The old-style XBox 360 I just checked was made in China, for example.)

I suspect the deal with Mexican-made goods is that for markets within the NAFTA area (i.e. US, Canada, Mexico) they have a price advantage- one which they don't have in the rest of the world, and which might explain why they're not so common here.

Re:Translation, please? (1)

garyebickford (222422) | more than 2 years ago | (#37692688)

Last I read (in the last month or so) something like 70% of goods purchased in the US are made here. We are also still one of, if not the, largest exporter in the world. I don't remember the details though.

Of course 'made here' may only mean 'assembled here from parts made all over the world' - like a Boeing 787 - Japanese manufacturers have been making major parts of Boeing planes for at least two decades, and with the 787 major parts are made in (surprise, surprise) every one of Boeing's major market zones. Major parts are also still made here, of course, and final assembly is done here. There is a big fight between Boeing and the unions about opening a new assembly plant in South Carolina.

US auto import tariff rules used to require some percentage (50%? 70%? IDK) of a car to be made here. That percentage could include administrative costs, which seemed a bit of a stretch. I don't know the present situation.

Re:Translation, please? (1)

swalve (1980968) | more than 2 years ago | (#37723450)

Is it 70% of goods by number, or by $ value? I don't know the answer, but I would bet my hat that it was $ value. We still have a thriving manufacturing community, but we make the expensive stuff like bulldozers and airplanes, and not cheap shit like DVD players and plastic kazoos.

Re:Translation, please? (1)

garyebickford (222422) | more than 2 years ago | (#37749792)

I think you are correct, it was value.

Re:Translation, please? (3, Interesting)

Anonymous Coward | more than 2 years ago | (#37687442)

Keep in mind up until 2000-something Intel was not only one of the largest chip manufacturers, but also one of the largest manufacturers of embedded controllers in the world. Some MBA dickhead under Otellini or his predecessor (the dude who fucked them to rambus for the first year of the P4's life) decided that embedded wasn't a high enough 'profit' division to hold onto and either sold it off or spun it down. Point? Intel already dominated that market many years ago, but the current trend in management is too shortsighted to retain the sort of market diversity they need to roll emerging products like this out into the sectors of the market that would leverage them, keeping Intel steadily (if not ridiculously) profitable.

This is the same thing that happened to the i740 successors and Larrabee (both of which sucked engineering-wise, but whose basic premise should've been kept): Intel could've been in the videocard market where nVidia/ATI are now. But some shortsighted manager decided it made more sense to cut their losses than to persevere and get the product right so that the next generation and the one after that would succeed.

Re:Translation, please? (3, Informative)

dbc (135354) | more than 2 years ago | (#37687548)

Dickhead == Craig Barrett. Undid 25 years of Intel culture in less than a year. It took Otellini (for whom I have the greatest respect) almost two years to correct the Barrett fallout. But in the end, Intel pretty much makes decisions based on gross margin per wafer. They'll do strategic things for a while, but if the margin per wafer doesn't show up pretty soon, they kill the experiment. (Speaking of strategic, here's a fun game: The next time a salesman (or marketroid) tries to convince you to do some deal because "it's strategic", respond with "Oh, you mean it's no revenue." Enjoy deer-in-headlights face.)

Re:Translation, please? (1)

Kjella (173770) | more than 2 years ago | (#37688382)

They'll do strategic things for a while, but if the margin per wafer doesn't show up pretty soon, they kill the experiment. (Speaking of strategic, here's a fun game: The next time a salesman (or marketroid) tries to convince you to do some deal because "it's strategic", respond with "Oh, you mean it's no revenue." Enjoy deer-in-headlights face.)

Isn't that pretty much the definition of strategic? We're not great at making these kinds of products today; we lack the customer base, the experience and the reputation. So we do projects at break-even or even possibly a slight loss in order to break into the market, because it's our strategy to become an established player. If a) we aren't able to establish ourselves, or b) we do and there are still no profits, then we don't keep doing what doesn't work.

Re:Translation, please? (1)

Runaway1956 (1322357) | more than 2 years ago | (#37691056)

Actually, no. That is not "pretty much the definition of strategic".

I don't like Microsoft, but they play an awesome strategic game. They invest in stuff, they buy stuff, they develop stuff - oftentimes, stuff that really has no future. But do they ever get rid of any of that stuff? Not only "No", but "HELL NO!" Microsoft may put things on a shelf and halt development if something loses too much money - but they aren't about to get rid of anything. They can afford warehouses, terabyte on terabyte of hard drive space, offices, you name it. Microsoft's implementation of this thing or that may be worthless - in fact, it may even lose money. But keeping it around and maintaining ownership has a hell of a lot of value if/when some other innovation shows how they might use this older technology.

Not to mention, the patent portfolios!

What Intel engaged in, by comparison, was merely some tactical maneuvering. A good strategist has no problem beating hell out of an equally good tactician.

Re:Translation, please? (5, Interesting)

Anonymous Coward | more than 2 years ago | (#37687660)

Keep in mind up until 2000-something Intel was not only one of the largest chip manufacturers, but also one of the largest manufacturers of embedded controllers in the world. Some MBA dickhead under Otellini or his predecessor (the dude who fucked them to rambus for the first year of the P4's life) decided that embedded wasn't a high enough 'profit' division to hold onto and either sold it off or spun it down.

I worked for Intel during that period. Management was totally poisoned by the dot com disease. You could have a business plan that called for spending $50 million over five years to create a guaranteed $150 million a year product line with 25-40% margins and they didn't want to know. They were only interested in stuff that supposedly was going to produce a $500 million business a year in 18 months. They spent vast sums of money on second string chip companies, some of whom were already in trouble before the bottom fell out.

The other thing: they have a focus on margins that is deranged, in that it's a straight percentage target that isn't adjusted depending on the market. In some markets, yes, 60% is needed because you need to reinvest constantly in new designs. But there are other markets where 20% is more typical - markets where the product life cycle is 36 months, not 12. So they give up on stuff when the margins aren't there, forgetting that the proper metric is return on investment. Case in point: say you have a business that makes $150 million in revenue at a 20% margin, which is $30 million. If you have to invest $10 million a year in product design and whatnot to keep that business, then you make $20 million a year off a $10 million investment - a 200% return. But the way Intel sees it, it's only 20%, not enough to waste time on.
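
To put concrete numbers on that, here's a quick Python sketch; the figures are the hypothetical ones from the paragraph above, not real Intel numbers:

    # ROI vs. headline margin, using the illustrative figures above.
    revenue = 150_000_000            # annual revenue of the product line
    margin_rate = 0.20               # 20% gross margin
    annual_investment = 10_000_000   # yearly design spend to keep it alive

    gross_margin = revenue * margin_rate             # $30M
    net_profit = gross_margin - annual_investment    # $20M
    roi = net_profit / annual_investment             # 2.0

    print(f"gross margin: ${gross_margin:,.0f}")
    print(f"net profit:   ${net_profit:,.0f}")
    print(f"ROI: {roi:.0%} vs. headline margin: {margin_rate:.0%}")  # 200% vs. 20%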

Then the dot bomb happened and they tossed overboard everything that wasn't going to turn a profit in 12 months. They also stopped development on product lines, thus killing them over the medium term. Of the dozen or so companies they bought in 1998-2001, they closed all but one, and that only because being 12-and-0 would have been too embarrassing.

My impression is that Intel has a lot of capable people, and money to hire same. But the upper management has issues. When they enter a business and find the other players determined and competitive, with customers who are used to wheeling and dealing to get the best value out of their vendors, management gets pissy and shuts everything down instead of sticking it out long enough to carve out some market share.

Re:Translation, please? (1)

Anon-Admin (443764) | more than 2 years ago | (#37690418)

That was well said! However, I don't see this as an Intel issue. Most of the companies I have worked for over the last 10 years seem to have shifted to the same mindset: only short-term returns and cutting heads to show a profit, all so management can make their bonuses. At some point it has to fail. I just hope that when it does the board sees it for what it is and makes a better choice when replacing them.

Re:Translation, please? (0)

Anonymous Coward | more than 2 years ago | (#37691958)

> I just hope that when it does the board sees it for what it is and makes a better choice when replacing them.

This is typically a false hope, when boards are de facto selected by corporate management (and not the other way around) to help appoint a "proper" management compensation committee for the benefit of the management.

The multiple small acquisitions are plausibly beneficial to management too, since a lot of staff changes hands.

Re:Translation, please? (0)

Anonymous Coward | more than 2 years ago | (#37694520)

The TV business does have very low margins, and chip makers barely survive on around a 20% margin. And even so, you have to respin a new chip every year; only Taiwanese makers like MStar and MediaTek survive in such a business.
So Intel's announcement is very much in line with Broadcom exiting the TV world, even though they are in some TVs today.

And Intel was never really a significant player on the TV side, as they are missing the TV processing side.

Re:Translation, please? (1)

swalve (1980968) | more than 2 years ago | (#37723466)

That's an MBA disease. They got the important message: sometimes it's better to think in percentages. They missed the less important message: 1% of $1 million is still more than 50% of $1000.

Re:Translation, please? (1)

tlhIngan (30335) | more than 2 years ago | (#37691436)

This is the same thing that happened to the i740 successors and larrabee (both of which sucked engineering-wise, but whose basic premise should've been kept: Intel could've been in the videocard market where nVidia/ATI are now. But some shortsighted manager decided it made more sense to cut their losses than to persevere and get the product right so the next generation and the one after that would succeed.

True, but then again, considering Intel is one of the top video chip vendors out there (by sheer volume of integrated graphics), does it really matter? Considering the work involved in making a great graphics chip - is the R&D expense worth it to gain a percent or two of market share?

And with that, remember Intel has been synonymous with "crap graphics" for years now. Even if they released a card with awesome blows-everyone-out-of-the-water capability, it'd be dead in 6 months unless they could come up with something new.

Re:Translation, please? (1)

davester666 (731373) | more than 2 years ago | (#37687456)

Intel has been on the cusp of producing the ultimate low-power chip that works everywhere from cell phones to microwaves to toaster ovens to tablets to watches [just about a year from now] for the past 10 years.

They ain't there. They ain't about to get there. And somebody's already there and has been doing it for a long time.

Bullshit finally walks.

Re:Translation, please? (1)

symbolset (646467) | more than 2 years ago | (#37687788)

I was there. Intel doesn't "get" the TV biz. They're not gonna. This announcement is their admission that they don't get it.

This is sad because they were almost there. Their prior efforts were sad, but they had some prime shizzle in the pipe that had a legitimate chance.

Re:Translation, please? (1)

Penguinisto (415985) | more than 2 years ago | (#37690496)

I can agree to this (and was there too - as a former member of the Digital Home Group) - The Canmore [engadget.com] project for instance had some serious potential, but the thing was somewhat hobbled from the start (NDA prevents opinions as to exactly why, but let's just say that IMPO it could've done a lot more than it actually did).

The biggest problem was that they interrupted everyone on the Oregon side with a physical move (From CO to JF), and after that began the whole 'let's be a part of Viiv!' bullshit (Viiv? Yeah, that little bastard really should have been strangled in the crib, but more than a couple upper management types had a little too much political capital invested in it). Either way, that in turn caused more than a few key PMs to jump to other groups (one even jumped right into the pool - didn't give a damn) and DHG imploded shortly thereafter.

Kinda sad, too... the whole thing ran its dev on Linux, and the SDK/PDK delivery setup was hella cool. :(

Re:Translation, please? (1)

sjames (1099) | more than 2 years ago | (#37687864)

Not at all. There might be some market for a more capable chip at the same power consumption if the price is right, but Intel couldn't fulfill that demand. Their chips were either too expensive, not capable enough, or too power hungry for that application.

Other potential disqualifying factors might include not being willing to guarantee a long enough product lifetime. Unless there have been improvements, Intel's relatively weak debug and test interfaces could also play a role.

Last up, ARM and MIPS based chips have a long history of successful application in that space. Intel would have to present a significantly better looking solution to make the risk worth taking. Intel isn't used to selling as the new guy. It doesn't help that the first few revs of Atom proved that they didn't really understand the space they were trying to sell into.

Re:Translation, please? (1)

ThirdPrize (938147) | more than 2 years ago | (#37688258)

Was Intel just late to the TV chip party and other chipmakers had it sewn up?

I blame Doritos.

Re:Translation, please? (1)

DrXym (126579) | more than 2 years ago | (#37689670)

OK, geeky people, what does that mean?

It probably means these companies already have SoCs they use for this stuff (and may have a stakeholding in) and see no reason for ditching what they have for something produced by Intel.

Re:Translation, please? (1)

jon3k (691256) | more than 2 years ago | (#37697944)

If AppleTV and GoogleTV are any indication, saying that there isn't much opportunity to advance the television might be the understatement of this decade.

Intel chips. In TVs! (3, Interesting)

adolf (21054) | more than 2 years ago | (#37687398)

I remember a TV many years ago, perhaps late 90's or early 2k, which booted with a common Award BIOS screen and RAM check. I think we sold exactly one (and that one was the display model).

It was a useless device. Despite having a high-res CRT display with decent color, and a line doubler (which was potentially way cool in those pre-HDTV/DVI/HDMI times), it sucked: It irrevocably upscaled the output of a PSX, and the result was double-ugly instead of double-smooth since it got the field order precisely wrong.

It had an Intel CPU.

Is it dead now?

Good.

Thanks!

[/shallow]

Re:Intel chips. In TVs! (0)

johanatan (1159309) | more than 2 years ago | (#37687420)

Umm, that's not science. One data point means nothing.

Re:Intel chips. In TVs! (2)

afabbro (33948) | more than 2 years ago | (#37687450)

Umm, that's not science. One data point means nothing.

Were we doing science here? I thought this was a message board.

Re:Intel chips. In TVs! (0)

johanatan (1159309) | more than 2 years ago | (#37687532)

Well, there are better places to shoot the breeze I think. I'm not sure that anecdote has much in the way of informing us about the fit of Intel chips for TVs.

Re:Intel chips. In TVs! (1)

adolf (21054) | more than 2 years ago | (#37687544)

I think the anecdote has something to add about the concept of "smart" televisions in general.

Re:Intel chips. In TVs! (0)

johanatan (1159309) | more than 2 years ago | (#37687614)

But, even dumb TVs have chips! TVs are merely display devices. It doesn't matter where the smarts are really. In the tv. In the laptop. Ultimately people want to use the TV as a display for their 'smart' devices.

Re:Intel chips. In TVs! (2)

adolf (21054) | more than 2 years ago | (#37687794)

Dumb TVs don't have CISC CPUs trying to solve the world's problems.

"Smart" TVs do.

If you can't detect the difference, then there's nothing more for us to discuss on this matter.

Re:Intel chips. In TVs! (1)

johanatan (1159309) | more than 2 years ago | (#37694162)

Funny how you ignored 90% of the content of that post and latched onto the one thing that you thought you could potentially contradict me on. And who said anything about 'smart' TVs here anyway? I certainly didn't read the summary that way. It merely says that Intel no longer wants to seek embedding its low-power chips within TVs. All TVs have 'chips' whether they are 'smart' or not.

Chips merely crunch numbers. The software running on (or embedded within) the chips is where the smarts are. [And I worked on Windows Embedded Compact 7, so I've seen TVs running on a lot of different chips first hand. There's nothing inherently less 'smart' about Intel's offerings compared to other manufacturers'.] In this space, the differentiator is power consumption, not 'smarts' (whatever that is).

Re:Intel chips. In TVs! (1)

adolf (21054) | more than 2 years ago | (#37697540)

You actually want me to refute the rest of your post?

Funny how you ignored 90% of the content of that post and latched onto the one thing that you thought you could potentially contradict me on.

You really want me to refute every portion of your post? Seriously? What sort of weird masochistic pedant are you?

But as you wish:

But, even dumb TVs have chips!

Yes. So does my toaster, my thermostat, the remote for my car locks, and my flashlight. They do not have general-purpose CPUs, though, which is what Intel is in the business of selling.

TVs are merely display devices.

TVs are radio receivers that include a display device. Display devices which do not include a radio receiver are called "monitors." (The "tele" in "television" is not without specific, direct, and obvious meaning.)

It doesn't matter where the smarts are really.

Yes, it does matter.

In the tv.

...where it will prove useless as methods improve through the inevitable passage of time and the manufacturer no longer offers updates...

In the laptop

That might work at your house. At mine, I use my laptop for work, and my family would not be appreciative of having their viewing habits be dependent on my work schedule. (I will not accept the notion that I'm the only man on Earth who uses a laptop computer as a computer instead of as an extension of the entertainment system...)

Ultimately people want to use the TV as a display for their 'smart' devices.

If this is the case, then there's no point in having general-purpose CPUs in TVs, anyway. (Which I think was my general point, not yours. Glad you agree, though...)

[Boy, that was silly. Why do you ask for such things?]

Re:Intel chips. In TVs! (1)

johanatan (1159309) | more than 2 years ago | (#37698946)

Funny how you ignored 90% of the content of that post and latched onto the one thing that you thought you could potentially contradict me on.

You really want me to refute every portion of your post? Seriously? What sort of weird masochistic pedant are you?

The kind that hangs out at Slashdot, obviously (like 90% of the other weirdos here [including you]).

But, even dumb TVs have chips!

Yes. So does my toaster, my thermostat, the remote for my car locks, and my flashlight. They do not have general-purpose CPUs, though, which is what Intel is in the business of selling.

You know that Linux runs on all of those devices too, no? And, I think Intel sells a fair bit more than GPCPUs. You're the one who interpreted 'embedded' as 'general-purpose CPU' and 'smart'; not myself. See: http://www.intel.com/p/en_US/embedded/hwsw/hardware [intel.com]

This story has never been about 'general purpose CPUs' and 'smart' TVs. That is precisely the error I pointed out initially.

TVs are merely display devices.

TVs are radio receivers that include a display device. Display devices which do not include a radio receiver are called "monitors." (The "tele" in "television" is not without specific, direct, and obvious meaning.)

Of course. Do you really think I don't know that? TVs are *now* merely display devices (sorry about the shorthand, but I assumed my reader would understand that I'm talking about the 2010s, not the 1950s). They become more purely display devices with each passing year. Who picks up radio signals (that they don't transmit to themselves via bluetooth or wifi) these days anyway?

In the laptop

That might work at your house. At mine, I use my laptop for work, and my family would not be appreciative of having their viewing habits be dependent on my work schedule. (I will not accept the notion that I'm the only man on Earth who uses a laptop computer as a computer instead of as an extension of the entertainment system...)

You've got that backwards. The display device is an extension of the computer (technically a peripheral) not vice versa. If you have too few display devices for your household, maybe you can take turns or buy more (or, gasp, do without).

Ultimately people want to use the TV as a display for their 'smart' devices.

If this is the case, then there's no point in having general-purpose CPUs in TVs, anyway. (Which I think was my general point, not yours. Glad you agree, though...)

Once again, you're the one who equated 'embedded' and 'smart' and 'general-purpose CPU'. Maybe you need to read TFS again? It says nothing of the latter two.

You have heard of sub-notebooks and tablets which run full-blown operating systems on, wait for it, special-purpose PUs (e.g., Atom, A4, ARM, etc), no? Or what about your smart phone? Do smart phones have CPUs just because they are referred to as 'smart'? Answer: no. They still run embedded chips.

This discussion is beyond ridiculous.

Re:Intel chips. In TVs! (1)

adolf (21054) | more than 2 years ago | (#37702582)

This discussion is beyond ridiculous.

While I suspect that you'll accuse me of cherry-picking again because I dismissed the rest of your post, I think we can both agree on that. Although I'm still not entirely sure why you steered it in this direction...

HTPCs are for geeks (1)

tepples (727027) | more than 2 years ago | (#37690064)

Ultimately people want to use the TV as a display for their 'smart' devices.

Then why do only geeks [pineight.com] ever buy or build a PC to hook up to the TV, as CronoCloud has pointed out [slashdot.org] ?

Re:HTPCs are for geeks (1)

johanatan (1159309) | more than 2 years ago | (#37694196)

Because 'people' aka the general population do not know what they want yet. They will get there though.

Re:HTPCs are for geeks (1)

tepples (727027) | more than 2 years ago | (#37694298)

Have you any ideas for marketing the concept of connecting a PC to a TV to the general public, beyond what I already have on this page [pineight.com] ?

Re:HTPCs are for geeks (1)

johanatan (1159309) | more than 2 years ago | (#37696178)

Not really. It sounds like it would be a difficult market to enter, with gaming consoles, Apple TV, WD, Google, Logitech, etc. already there. I think we're seeing a trend away from the PC generally, and this is only one particular instance of it. MS seems intent on making the Xbox the 'room computer' (and it was really quite great foresight for them to head in this direction over a decade ago, only to now see the position really start to pay off).

If I were serious about entering this market, though, I'd avoid paying MS license fees and stick to a Linux solution (MythTV maybe?) with the usability points you mentioned (it would be way too difficult to compete if you're paying $100-$150 per unit just for the OS). It will be hard to convince people to go with a custom [non-standard] system with such low market share though. Smart consumers don't venture too far off the beaten path if they want things to 'just work' (now and in the future). :-)

Re:Intel chips. In TVs! (0)

Anonymous Coward | more than 2 years ago | (#37688052)

intel gives up and picks up tv TWICE A YEAR.

it's fucking ridiculous.


What does this mean for Google TV? (1)

amorsen (7485) | more than 2 years ago | (#37687564)

The Google TVs from Sony use Intel chips, according to their own marketing at least. Will Sony give up on Google TV or switch to ARM?

Re:What does this mean for Google TV? (0)

Anonymous Coward | more than 2 years ago | (#37695262)

Google knows dick about content. All they have going for them is YouTube. The content providers aren't going to play ball with Google TV and thus it is going to fail.

Google TV is dead (0)

Anonymous Coward | more than 2 years ago | (#37687574)

Guess it's official. Google TV is dead and Sony must be pissed.

Once again Apple will dominate with their hobby.

LCOS (1)

Spy Handler (822350) | more than 2 years ago | (#37687586)

liquid crystal on silicon -- didn't they try this TV chip a while back too, and fail?

come to think of it, Apple and Google also tried TV and failed

maybe computer companies should just stick to making computers and leave TV to Sony and Samsung

Good riddance (1)

cerberusss (660701) | more than 2 years ago | (#37687594)

Good riddance, I'd say. I'm sick and tired of the 800 lb gorilla sticking its nose in everything that has more than a dozen transistors.

more cpu & gpu power... (0)

Anonymous Coward | more than 2 years ago | (#37687684)

provides little extra utility over the embedded hulu and netflix players that are out now. I really like my htpc, and I don't see a demand for something between the existing embedded players and htpcs. But hey I could be wrong... I thought that the market for high-end cellphones would never take off either. Never underestimate marketing and fashion.

Intel Gives Up On TV (4, Funny)

Chrisq (894406) | more than 2 years ago | (#37687688)

Intel Gives Up On TV

I don't blame them ... there are very few good programs and too many reality tv shows

Intel not a commodity player (1)

unixisc (2429386) | more than 2 years ago | (#37687740)

Making chips for TVs? The closest thing to a commodity market that Intel was in was flash memory for cell phones, and that business was first spun off as Numonyx and is now part of Micron. So how did they even start to think they might be competitive in TVs?

As a former TV Product Planner (although (3, Interesting)

JTL21 (190706) | more than 2 years ago | (#37687860)

not for Google TV).

Intel chips are expensive, and these days you would very much expect a highly integrated chip with demuxes and decoders for digital broadcasts, plus video and audio processing elements to improve the quality. There would typically be dedicated functional units for most functions, all baked onto the silicon. The general-purpose processor would typically be fairly weak but have a lot of support. Main processors may get somewhat more powerful to support browser-type technology, but I wouldn't expect them to reach Intel Atom speeds in most cases for some time. Which would you rather have, a TV with a fast web browser or good picture processing?

The current Sony Google TVs (the integrated screens) still carry the same main chip as the rest of the Sony range, in addition to the Intel processor and graphics. I'm not certain to what extent this is absolutely technically required or whether it was needed to reuse the existing TV reception and processing software. This means that the cost to build a Google TV was like building a normal TV and adding a bare-bones Atom PC. The business case, as I understand it, rested on an expectation of purely additional sales, marketing funds from Intel, and an expectation of smaller margins for retailers - although I think there were also some unreasonable assumptions, particularly if you had ever tried the product.
http://techon.nikkeibp.co.jp/english/NEWS_EN/20101117/187451/ [nikkeibp.co.jp]
http://www2.renesas.com/digital_av/en/mpegdec_tv/emma3tl2.html [renesas.com]

If Intel does back away from the highly cost-sensitive TV chip business, I would expect Google to offer support for ARM. I think most of the TV manufacturers are on or moving to ARM, although MIPS is certainly used in current models. The newer high-performance ARM chips are probably significantly more expensive than the typical TV processors, but they probably make more sense than the Intel Atoms, with the ability to custom specify the chip features and still be cheaper.

Features on such chips will be specified by major manufacturers, but the feature set will probably be locked down at least 18-24 months before the TV ships, ruling out some things after that date.

The TV business is a hugely competitive market and there is no profit in it (possibly with the exception of companies that have their own panel manufacturing). The combination of falling prices, long parts lead times, and the importance of volume in getting good component prices makes it a very tricky business to make money in. But it is key to many companies' positions in the consumer electronics area and can bring leverage into other businesses (by enabling retail space, offering full product suites and, increasingly, giving scale to over-the-top online video offerings).

if they are having power issues with TVs... (1)

Gravis Zero (934156) | more than 2 years ago | (#37687932)

they should also give up on cell phones


Who needs a TV anyway (1)

rossdee (243626) | more than 2 years ago | (#37687982)

One of my computer monitor's inputs is connected to the cable set-top box via HDMI.
To watch TV I just push a switch on the monitor.

Re:Who needs a TV anyway (0)

Anonymous Coward | more than 2 years ago | (#37689848)

Why would you even want to watch TV anyway ?

It's boring, "talks" to you as if you're a 5 year old retard and is a complete waste of time.

TV is dead and the corpse stinks.

silly question (1)

Chirs (87576) | more than 2 years ago | (#37689852)

While not everyone needs a TV, TVs and monitors are optimised for different things. My local cable company's standard-def boxes don't have HDMI out, only component/composite. A lot of my gear is old enough that it doesn't do HDMI. My computer monitor doesn't have inputs like a TV does. My monitor has a much higher pixel density, but it would suck for watching a movie with a bunch of other people.

Re:Who needs a TV anyway (1)

tepples (727027) | more than 2 years ago | (#37690340)

Does your computer monitor or your cable set-top box also upscale composite video so that you can use legacy video sources, such as a VHS VCR (for cult classics that still haven't been rereleased on DVD due to licensing complexities) or a non-HD video game console?

They didn't want to do a concept that was good eno (1)

Snaller (147050) | more than 2 years ago | (#37688722)

Most companies never try to make something great (Apple being one exception; at least they try). Most companies want to make things as cheaply and as crappily as they can get away with - i.e., "getting away with" = what people will still pay for.

Whilst calling Intel's offering crap might be too harsh, it was never promising or exciting, just their own little closed ecosystem. If we could get TVs running Honeycomb where you could sideload APKs, you'd see such a TV system take off quickly, because all of the things the manufacturers can't be bothered to add, or don't have the vision to add, would be added by the army of dedicated maniacs out there programming freeware.

CISC! (1)

optymizer (1944916) | more than 2 years ago | (#37691728)

No wonder. Want low-power? Look at ARM. CISC devices cannot consume as little power as RISC - they have to pay for the extra features.

Re:CISC! (0)

Anonymous Coward | more than 2 years ago | (#37696686)

Mostly bullshit. The x86 chips aren't even really CISC internally these days; they decode into RISC-like micro-ops. The reason Intel can't get into the TV business is a lot simpler: they won't license their designs.

ARM will sell you the design and everything you need to build a one-chip solution that can be lower-powered, cheaper and more reliable than Intel's walled-off CPU, which keeps all the bits needed for a TV outside of it.

And...? (0)

Anonymous Coward | more than 2 years ago | (#37693808)

Intel abandoned a failing market increasingly made redundant by PCs. Win-win.

They need to diversify, but nothing compares... (0)

Anonymous Coward | more than 2 years ago | (#37701662)

...to the CPU business.

As a former Intel employee for 22+ years, I saw this repeated over and over. Intel has the primary business of CPUs, and they look for "adjacent" markets that help promote the primary business. WiFi was one example. USB is another. These "lower tech" products are now ubiquitous, and helped drive purchases of new PCs.

The CPU business, with its high margins, drives all decisions. If you have a factory that makes CPUs, where the average revenue per wafer is X, why would you want to run wafers with a much lower revenue per wafer? Their strategy for chipsets has basically allowed the older, depreciated, factories to stay alive longer without upgrading, and the chipsets get sold along with CPUs at a very high "attach rate". So while the profits aren't as great, it improves the revenue.
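
To illustrate that per-wafer logic with a toy model (every number below is invented for illustration; real die sizes, yields, and prices are confidential):

    import math

    # Crude gross-die count, ignoring edge-loss corrections and scribe lines.
    def dies_per_wafer(wafer_diameter_mm, die_area_mm2):
        wafer_area = math.pi * (wafer_diameter_mm / 2) ** 2
        return int(wafer_area / die_area_mm2)

    def revenue_per_wafer(die_area_mm2, yield_rate, asp_usd, wafer_diameter_mm=300):
        return dies_per_wafer(wafer_diameter_mm, die_area_mm2) * yield_rate * asp_usd

    cpu = revenue_per_wafer(die_area_mm2=150, yield_rate=0.80, asp_usd=200)
    tv_soc = revenue_per_wafer(die_area_mm2=80, yield_rate=0.85, asp_usd=15)

    print(f"CPU wafer:    ${cpu:,.0f}")     # ~$75,000
    print(f"TV SoC wafer: ${tv_soc:,.0f}")  # ~$11,000

With numbers anywhere near those, a factory full of CPU wafers simply dwarfs anything a TV chip could bring in, which is the decision rule described above.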

The problem is there are only so many adjacent businesses. Smart TVs would appear to be a good idea, but an Intel CPU becomes an expensive part of that TV. With other choices out there, the price is critical, and people are willing to accept less than PC performance on a TV.

Intel doesn't want to be a commodity manufacturer. They invented the SRAM, DRAM, ROM, EPROM, and FLASH memories, and one by one dropped out of the business when they became commodities. This makes it harder and harder to find profitable adjacent businesses that will support growth of the PC business.
