Slashdot: News for Nerds

Intel Launches Wi-Di

samzenpus posted more than 4 years ago | from the clear-streams dept.

Intel

Barence writes "Intel has launched a new display technology called Wi-Di at CES. Intel Wireless Display uses Wi-Fi to wirelessly transmit video from PCs running Intel's latest generation of Core processors to HD television sets. Televisions will require a special adapter made by companies such as Netgear — which will cost around $100 — to receive the wireless video signals. Intel also revealed its optical interconnect technology, Light Peak, will be in PCs 'in about a year.'"


172 comments


Why wouldn't... (1, Insightful)

Anonymous Coward | more than 4 years ago | (#30693198)

Why wouldn't it work with an older Core processor, or hell even an AMD processor?

Re:Why wouldn't... (0)

Anonymous Coward | more than 4 years ago | (#30693216)

because it doesn't have the optimized components

Re:Why wouldn't... (0)

Anonymous Coward | more than 4 years ago | (#30693416)

more likely, because it doesn't have the TPM features that can prevent you from intercepting the encrypted signal..

Re:Why wouldn't... (4, Insightful)

fuzzyfuzzyfungus (1223518) | more than 4 years ago | (#30693512)

It was specifically mentioned that this "Wi-Di" link does not support HDCP (and thus won't count as a "protected link" for the purposes of playing back Blu-ray disks; won't Joe Consumer be confused and angered by that one?), so I suspect that isn't the reason.

I'd chalk it up to a mixture of "don't want the hassle of having to test, tweak and validate on large numbers of old components not designed with it in mind" and the desire to drive sales of more laptops with new Intel silicon in them.

Re:Why wouldn't... (1)

omnichad (1198475) | more than 4 years ago | (#30694998)

They're going for a "Centrino" part 2. I'm still scratching my head over Centrino. Are people really that stupid? So, if all my parts are made by Intel, and it's a wireless PC, I suddenly have a magical thing called "Centrino." I don't know what it does for me, but my friends and neighbors will be jealous.

I can do that with a netbook (1)

linzeal (197905) | more than 4 years ago | (#30693230)

Why would I want to buy something else?

Great! (3, Funny)

Errol backfiring (1280012) | more than 4 years ago | (#30693234)

Why didn't I think of that? First, kill off all TV signals and force people to use cable companies, then invent a system to ...
transmit TV signals!
Brilliant!

Re:Great! (0)

Anonymous Coward | more than 4 years ago | (#30693298)

Who's killing off the what now?

Re:Great! (5, Insightful)

happy_place (632005) | more than 4 years ago | (#30693500)

Even better, now that it's wireless and just like my wireless internet, I'll get free TV, maybe even get to watch what they watch from the neighbor's houses!

Re:Great! (0)

Anonymous Coward | more than 4 years ago | (#30694452)

Anyone remember the "fictional" disease in the (crap-tastic) sci-fi thriller Johnny Mnemonic?

NAS, or the "black shakes", won't be so fictional if we keep doing stupid stuff like this...

http://en.wikipedia.org/wiki/Johnny_Mnemonic_%28film%29

Re:Great! (1)

timeOday (582209) | more than 4 years ago | (#30694504)

I actually do think this is great. For those of us with HTPCs, game consoles, cable boxes, and the usual rat's nest of wiring we don't really want sitting right next to the TV, it beats trying to run 40 feet of DVI cable through the ceiling.

Re:Great! (0)

Anonymous Coward | more than 4 years ago | (#30694638)

Not to mention projectors where the cabling can be really ugly and long (roof, other side of the room).

Re:Great! (1)

omnichad (1198475) | more than 4 years ago | (#30695020)

Don't put your PS3 in your attic?

will the cable/ satellite industry fight this? (4, Interesting)

circletimessquare (444983) | more than 4 years ago | (#30693240)

if you can broadcast a signal to every set in your house, or even your entire apartment floor, then there goes a bunch of lucrative descrambler box fees. then again, they can all only show one channel at a time. however, media companies seem to all be losing income nowadays, and have all taken a hostile attitude towards new technology. they seem to need very little reason, however slim and irrational, to pick a fight with new technology

of course, the future is all streaming media over the internet, mostly on demand and mostly free, so they're all fucked

Re:will the cable/ satellite industry fight this? (1)

Ilgaz (86384) | more than 4 years ago | (#30693382)

Perhaps cable companies will have the brains to go with standard technologies like H.264 (it is lossy, remember) and add some kind of _standard_ DRM layer on top.

In fact, the IPTV guys have been doing this for years without tens of thousands of Intel CPUs. All they need is to put the encoder/encryption chip in the set-top box and "air" it over standard TCP/IP with gigabit cable or wireless.

The issue for Intel in that case is that a $10 purpose-built chip will do a far better job than a Rolls-Royce Core i7, since it is designed for exactly this :)

I don't follow electronics/chips too much these days, but I bet someone already has a working, cheap, low-power solution, and Intel came up with this thing to draw attention away.

Re:will the cable/ satellite industry fight this? (1)

nine-times (778537) | more than 4 years ago | (#30694114)

That's progress. Some business models stop making money and become obsolete. New business opportunities become available.

All this wireless is Class 'C' ? (1, Interesting)

Anonymous Coward | more than 4 years ago | (#30693244)

So, if you live next to an Air Force base or an airport, all of your home entertainment gadgets won't work? And if you have a Kindle, you'd better have a shitload of books stored on the machine, because you won't be able to download any.

'Luddite' may not be a derogatory term in the near future.

Wi-Di (5, Funny)

hcpxvi (773888) | more than 4 years ago | (#30693258)

... when you could LIVE?
bada-bing-TISH! Thank you ladies and gentlemen, I'll be here all week.
Seriously, though, did their advertising people not spot what a silly name Wi-Di is?

Re:Wi-Di (1)

Spad (470073) | more than 4 years ago | (#30693278)

I prefer the pronunciation "widdy"

Re:Wi-Di (0)

Anonymous Coward | more than 4 years ago | (#30693686)

I'm a half-wid, you insensitive clod!

Re:Wi-Di (0)

Anonymous Coward | more than 4 years ago | (#30693928)

From TFA, it's short for "Wireless Display", meaning it should be pronounced "WHY - DEE". Sounds a little racist, but hey, that's just me ;)

Re:Wi-Di (1)

omnichad (1198475) | more than 4 years ago | (#30695256)

I don't pronounce it DEE-splay. I'm not from Texas or Alabama. I would pronounce it "dih." So that would make it sound more like "Wide-ih."

Re:Wi-Di (0)

Anonymous Coward | more than 4 years ago | (#30694332)

So, do you also prefer "wiffy" ???

One or the other pronunciation... Come on...

Re:Wi-Di (1, Insightful)

Anonymous Coward | more than 4 years ago | (#30693286)

Right, that stopped the Wii...

Re:Wi-Di (3, Funny)

stupid_is (716292) | more than 4 years ago | (#30693326)

In Newcastle, UK, Wii is pronounced "Why-aye"

Re:Wi-Di (-1, Redundant)

Anonymous Coward | more than 4 years ago | (#30694660)

That's cause you live up North...

Re:Wi-Di (2, Funny)

dmayle (200765) | more than 4 years ago | (#30693400)

What a great joke! You're so Wi-Di...

Like the Wii? And the Kindle? (1)

wiredog (43288) | more than 4 years ago | (#30693432)

Devices with those silly names will never sell.

Re:Like the Wii? And the Kindle? (1)

Tim C (15259) | more than 4 years ago | (#30693990)

"Kindle" means "to start a fire burning by lighting paper, wood, etc"

I assume that it was chosen to conjure images of sparking off or kindling an e-book revolution.

Re:Like the Wii? And the Kindle? (1)

Known Nutter (988758) | more than 4 years ago | (#30694324)

I assume that it was chosen to conjure images of sparking off or kindling an e-book revolution.

Or perhaps it was meant to conjure images of Nazis burning books in Germany [historyplace.com] ??? Muhahaha... :/

Re:Like the Wii? And the Kindle? (1)

omnichad (1198475) | more than 4 years ago | (#30695274)

Probably more like Fahrenheit 451. Since that would be, you know, a literary reference. Referring to how they delete copies of books from your own device.

Re:Wi-Di (5, Funny)

JeffSpudrinski (1310127) | more than 4 years ago | (#30693442)

Discussion of how to pronounce it reminds me of the little-known trivia about how the inventor of SCSI wanted it to be pronounced as the "Sexy Interface" rather than the "Scuzzy Interface".

-JJS

Re:Wi-Di (4, Informative)

morgan_greywolf (835522) | more than 4 years ago | (#30694420)

Discussion of how to pronounce it reminds me of the little-known trivia about how the inventor of SCSI wanted it to be pronounced as the "Sexy Interface" rather than the "Scuzzy Interface".

The inventor of SCSI was Larry Boucher at Shugart Associates (and later Adaptec). They've always pronounced it 'scuzzy'. Apple was the player that wanted it to be pronounced 'sexy', because they were (at the time) pushing SCSI as a technology that made their machines superior to IBM and the clone makers, who were generally not including SCSI interfaces. Apple used SCSI for HDDs, FDDs, and CD-ROMs, and the inclusion of SCSI on the Mac was the biggest reason why early scanners always used a SCSI interface. Other players in the early days of SCSI (around 1986 or so) included Commodore, who included it in the Amiga, and Sun Microsystems, who included it in their Unix workstations and servers.

Re:Wi-Di (-1, Troll)

Anonymous Coward | more than 4 years ago | (#30694636)

+1 pedantic humorless dipshit

Re:Wi-Di (1)

BurntNickel (841511) | more than 4 years ago | (#30695156)

While SCSI was used for hard drives and CD-ROM drives, it was not used for the floppy drive in stock Macintoshes.

Re:Wi-Di (2, Insightful)

XavidX (1117783) | more than 4 years ago | (#30693486)

Well, it worked, didn't it? You're gonna remember it for a while.

Re:Wi-Di (1)

AP31R0N (723649) | more than 4 years ago | (#30693800)

I've always despised the term Wi-Fi. Fidelity isn't the issue! Stop trying to steal recognition from a totally different type of product!

And get off my lawn.

Re:Wi-Di (0)

Anonymous Coward | more than 4 years ago | (#30694412)

there is no such thing as negative marketing

Re:Wi-Di (1)

artemis67 (93453) | more than 4 years ago | (#30694470)

I'm just surprised that no one has started in with the Princess of Wales jokes.

Buy a specialised chip for God's sake (0, Redundant)

Ilgaz (86384) | more than 4 years ago | (#30693348)

Intel is pulling its usual trick, now that people have figured out that a $30 GPU can indeed decode H.264 on its own, even with Adobe's Flash. There is another chip (the Broadcom Crystal HD) which even media player developers have started blogging about in amazement.

So, how do you make people upgrade their CPU to do something it was never designed for? Come up with crap like this.

They should spend way more money on making use of multiple cores, easy conversion tools for older code, better GNU Compiler Collection support, etc. That kind of "wireless HD" job is done far better by a $10 specialized chip with 1% of the power/heat.

Re:Buy a specialised chip for God's sake (1)

TheKidWho (705796) | more than 4 years ago | (#30693474)

What are you ranting about? You don't even know how the tech works; you're just assuming. I'm sure any P4-class processor can do this, since it *seems* to just be encoding a video and transmitting it via WiFi...

Your silly rant in a previous post about the Core i7 processors is even more inane; get back to me when your $10 chip with 1% of the power/heat can do real-time video editing of HD video.
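The parent's point can be sanity-checked with some back-of-the-envelope arithmetic (the bitrates below are my own assumed ballpark figures, not Intel's published specs): raw 1080p is far too fat for Wi-Fi, so the real CPU-side work is the encode, not the transmit.

```python
# Back-of-the-envelope bandwidth check for streaming a 1080p desktop.
# All figures are assumed ballpark values, not Intel's published specs.

def raw_bitrate_mbps(width, height, fps, bits_per_pixel=24):
    """Uncompressed video bitrate in megabits per second."""
    return width * height * fps * bits_per_pixel / 1e6

raw = raw_bitrate_mbps(1920, 1080, 30)  # ~1493 Mbit/s uncompressed
h264 = 10    # a typical 1080p H.264 stream, Mbit/s (assumed)
wifi_n = 100 # realistic 802.11n throughput, Mbit/s (assumed)

print(f"raw: {raw:.0f} Mbit/s, compressed: {h264} Mbit/s")
print(f"compressed fits over 802.11n: {h264 < wifi_n}, raw fits: {raw < wifi_n}")
```

So the radio link is the easy part; squeezing ~1.5 Gbit/s of pixels down to something Wi-Fi can carry, in real time, is where the silicon (whether a Core CPU or a $10 ASIC) earns its keep.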

Encryption on by default (1)

slasho81 (455509) | more than 4 years ago | (#30693354)

I really hope they've got encryption on by default in this technology, or we'll have the whole security fiasco that we had (and are still having) with open WiFi all over again.

Re:Encryption on by default (1)

delt0r (999393) | more than 4 years ago | (#30695062)

If you care about security, you probably won't be using this in the first place. Since I have over 12 overlapping WiFi networks at my apartment, I can't imagine that these things will handle interference well. And it had better be better than HDTV, which at least here (Vienna, Austria) looks like total crap.

Pronunciation? (1)

Sockatume (732728) | more than 4 years ago | (#30693366)

Homology with "wi-fi" and "hi-fi" demands that the two parts rhyme. The obvious choice is "why-die", but alternatives such as "wee-dee" (weedy) and "whih-dhih" don't exactly jump off the tongue either.

Re:Pronunciation? (1)

Plunky (929104) | more than 4 years ago | (#30693508)

In French, wi-fi is pronounced as a single word, the way English speakers might pronounce "whiffy", and as a native English speaker I generally say it that way too, so "whiddy" is fine for me.

In yon case, makin just as much sends as like fiddy, nuff respec, ma bro.

Re:Pronunciation? (0)

Anonymous Coward | more than 4 years ago | (#30694994)

That's because the baguette-eating, stinky-armpit, weird frechies feel they must brutally alterate every non-frech word that comes from their mouths, or else they won't be enough frenchie.

Re:Pronunciation? GESTAPO! (1)

Max(10) (1716458) | more than 4 years ago | (#30695190)

I pronounce HiFi as high fee (high fidelity) and WiFi as wifey (diminutive of wife). It may sound strange to some people, but to me it's logical, HiFi systems used to be very expensive (high fee) and wireless fidelity implies being faithful to one's wife when she's too far to yank one's {wire,USB {keychain,dongle}}.

I'd probably pronounce WiDi as widey, which sounds a lot like whitey, but I'm tired of these silly acronyms, people keep pronouncing everything their own way anyway and most sales people seem to think that their pronunciation is the only correct one.

GESTAPO (Gotta Eliminate Silly Technology, Acronyms, People and Organizations)!

Wi-Di? (1)

mcgrew (92797) | more than 4 years ago | (#30693392)

You gotta go sometime...

At least this one makes sense, unlike Wi-Fi. Kind of morbid name, though.

But what does that mean for... (0)

Anonymous Coward | more than 4 years ago | (#30693412)

Wireless HDMI?

http://en.wikipedia.org/wiki/Wireless_HDMI [wikipedia.org]

Now that I think about it, I'm not even sure I have seen anything about Wireless HDMI...

How much cat6 would $100.00 buy? (1)

starbugs (1670420) | more than 4 years ago | (#30693422)

So this is the solution for people who don't want to run cable to a device which is moved only when it breaks.
What other benefit is there?

All I see is an expensive (probably proprietary) re-implementation of Wi-Fi which cannot be used for anything but TV.
The only appeal I see is to those who have trouble watching IPTV in the traditional way: TCP/IP.

Re:How much cat6 would $100.00 buy? (5, Insightful)

justinlee37 (993373) | more than 4 years ago | (#30693520)

I agree that it doesn't make sense for a desktop PC. However you are neglecting to consider a laptop. It can be a pain to attach and detach a laptop to a television or digital projector using a VGA cable. Imagine being able to sit down in your living room with your laptop and, from the couch, use only the laptop controls to transmit your screen to your television or projector. Imagine if everyone in the house had such a laptop, and they could all take turns using the same television to display their movies, music, games, etc.

Imagine if you could be at a business conference with a large video projector and hundreds of businesspeople all with laptops that were capable of wirelessly connecting to the projector to display their slide presentations, graphs, or videos, and if anyone in the audience could do this without even leaving their seat.

In the old days of computing, we used to have dumb monitor terminals connected to mainframes. The cost of the computer was greater than the cost of the monitor, so we set up one computer to work with many monitors at once. Today, the cost of computers is much less, and the paradigm has shifted; a monitor is more expensive than (or as expensive as) a computer. So we rig our computers to use multiple video monitors. We are truly entering a golden age where it is possible for everyone to have a small computer, like a PDA device, that they can use to plug into dumb monitor/keyboard terminals or projected video screens. Imagine if they could do all of this without cables.

I'll get off your lawn now.

Re:How much cat6 would $100.00 buy? (1)

gnasher719 (869701) | more than 4 years ago | (#30693792)

I agree that it doesn't make sense for a desktop PC. However you are neglecting to consider a laptop. It can be a pain to attach and detach a laptop to a television or digital projector using a VGA cable.

Five meter VGA cable, and five meter headphone cable, running along the bottom of the wall, that works just fine. Certainly not worth spending $100 for.

Re:How much cat6 would $100.00 buy? (1)

justinlee37 (993373) | more than 4 years ago | (#30693844)

I think I made some other valid points though. Of course this technology might make it a lot easier for people to snoop on what you are doing by viewing your screen remotely ...

Re:How much cat6 would $100.00 buy? (2, Insightful)

slim (1652) | more than 4 years ago | (#30694122)

Five meter VGA cable, and five meter headphone cable, running along the bottom of the wall, that works just fine. Certainly not worth spending $100 for.

You could equally argue that a long ethernet cable means WiFi is useless. Cables are a nuisance. Fewer cables is good.

Re:How much cat6 would $100.00 buy? (1, Insightful)

Anonymous Coward | more than 4 years ago | (#30694244)

Everybody loves slow links with high latency, greater interference from ambient radio waves/microwaves, higher energy consumption, and a lower maximum distance.

Re:How much cat6 would $100.00 buy? (1)

mrbcs (737902) | more than 4 years ago | (#30694838)

I have an old P4 running media portal connected to my stereo and projector. I have an 80" projector system for under $1000. My dumb LG 32 flatscreen tv cost more than that last year. Guess which one I watch the most?

Re:How much cat6 would $100.00 buy? (1)

koiransuklaa (1502579) | more than 4 years ago | (#30694876)

Well, I have one of those 5 meter cables, and it's annoying. Always in the way, never where I need it (remember, it's a laptop, so movable).

Another thing: I'm currently thinking of buying a projector. The 12-15 meter VGA cable I need will be butt ugly and I'll have to do a lot of work to make it not stick out. Really hiding the cable would require a competent handyman and a hefty bill. I'd pay $200 in a heartbeat for working wireless display tech.

Re:How much cat6 would $100.00 buy? (1)

Verdatum (1257828) | more than 4 years ago | (#30695386)

Yes, that works just fine for my setup in my house, where I bothered to set up and tack down those cables. But going to a random friend's house, and having to dig around the back of their TV to plug in a (in my case, S-Video) cable to play vids off my laptop HDD, is a huge pain. It'd be wonderful to see a standardized wireless video receiver integrated into TVs someday. Costs could be driven down to the point where it's no more of a big deal than adding an 802.11 chip to a smartphone design. I've been wishing for exactly that for about 3 years now.

Re:How much cat6 would $100.00 buy? (1)

starbugs (1670420) | more than 4 years ago | (#30693848)

Imagine being able to sit down in your living room with your laptop and, from the couch, use only the laptop controls to transmit your screen to your television or projector.

I understand where you're coming from, but I did this back in 2004 with 802.11b, an old(er) Trinitron, and a P4 with ATI's All-in-Wonder. This is a hardware solution to a problem that can easily be solved with software. But of course, I don't think Intel is a software company.

Most new TVs have digital input. More or less all laptops have wireless. Your server can use the TV as a monitor while you run software on your laptop that either controls the server and/or 'forwards' the image from your laptop to the server to be displayed on your TV. That was the whole point of X11: network transparency.
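The network-transparency idea above can be sketched as a toy stand-in: the program producing the image and the machine showing it just talk over an ordinary socket. This is an illustrative loopback demo (all names here are made up), not the real X11 wire protocol.

```python
# Toy sketch of X11-style network transparency: a "display server" receives
# length-prefixed frames over TCP; the client could be on any machine.
import socket
import threading

def display_server(listener, frames_out):
    """Pretend display: accept one connection and collect received frames."""
    conn, _ = listener.accept()
    with conn:
        while True:
            header = conn.recv(4)
            if not header:
                break  # client closed the connection
            size = int.from_bytes(header, "big")
            data = b""
            while len(data) < size:
                chunk = conn.recv(size - len(data))
                if not chunk:
                    return
                data += chunk
            frames_out.append(data)  # a real server would draw this

def send_frame(host, port, frame):
    """Client side: ship one length-prefixed frame to the remote display."""
    with socket.create_connection((host, port)) as s:
        s.sendall(len(frame).to_bytes(4, "big") + frame)

# Demo over loopback; the "TV" end could just as well be another machine.
listener = socket.socket()
listener.bind(("127.0.0.1", 0))
listener.listen(1)
port = listener.getsockname()[1]
received = []
t = threading.Thread(target=display_server, args=(listener, received))
t.start()
send_frame("127.0.0.1", port, b"\x00" * 1024)  # one dummy 1 KiB "frame"
t.join()
listener.close()
print(len(received), len(received[0]))
```

The design point is the same one X11 made decades ago: once display output is just bytes on a socket, "wireless display" is a transport detail, not new hardware.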

Re:How much cat6 would $100.00 buy? (1)

IndustrialComplex (975015) | more than 4 years ago | (#30693530)

So this is the solution for people who don't want to run cable to a device which is moved only when it breaks.
What other benefit is there?

You build your over-the-top game-crushing machine and stick it in your server closet. It does all the hard crunching and sends the preprocessed video to your thin-client-like terminal. Your mouse/gamepad/keyboard controls are transmitted back via a simple low-bandwidth link that requires little processing overhead.

You effectively have a hand-held gaming device with the power of a desktop-sized computer. Imagine PS3 graphics and games on a hand-held device that only uses enough power to run the display, the transceiver, and the buttons.

That's just one of the options.

Re:How much cat6 would $100.00 buy? (1)

Spad (470073) | more than 4 years ago | (#30693824)

And who needs lag as an excuse for poor performance when you can have your Wi-Di connection drop totally every time someone fires up the microwave or uses the vacuum...

Re:How much cat6 would $100.00 buy? (2, Insightful)

drinkypoo (153816) | more than 4 years ago | (#30693856)

You can get crazy long HDMI cables to transmit video and [digital] audio. I bought a 25 footer to go across a room, and that's not the top end, either. This is really useless for non-mobile devices.

Re:How much cat6 would $100.00 buy? (1)

YrWrstNtmr (564987) | more than 4 years ago | (#30694150)

I bought a 25 footer to go across a room

That's the problem. I would pay $100 to do away with the room-spanning cable(s).

Re:How much cat6 would $100.00 buy? (1)

drinkypoo (153816) | more than 4 years ago | (#30694290)

Go around the room instead. The right place to run it is usually in the works, but in my case there's an exploitable seam 'twixt carpet and tile right about where I would have put it. Are you really trying to dump more RF into your house? I'd rather cut down. I can't wait for LED TVs to come down so I can get rid of the silly LCD stuff; it's sort of CRT-lite in a way, although I have found it to be a massive improvement.

Re:How much cat6 would $100.00 buy? (1)

Yamata no Orochi (1626135) | more than 4 years ago | (#30694404)

Or you could run them through the ceiling/under the floor. For free if you're not a total lout.

Re:How much cat6 would $100.00 buy? (1)

YrWrstNtmr (564987) | more than 4 years ago | (#30694538)

Or you could run them through the ceiling/under the floor.

If this weren't a rental house, I would. Yes, there are myriad ways of getting a signal from here to there. Beaming a TV signal around the house wirelessly is but another option, one that could come in quite handy in certain situations.
Just like 802.11 complements wired Ethernet.

Re:How much cat6 would $100.00 buy? (1)

LWATCDR (28044) | more than 4 years ago | (#30695094)

This is more a replacement for HDMI than for CAT6.
Uses? Well, I can think of a big one: hooking your notebook to your HDTV. Media PCs are not flying off the shelf right now, but imagine how handy it would be to send video from your notebook to your HDTV. Apple users, I think, will love it.
And let's face it, $100 isn't that much more than a good proper HDMI cable from a quality manufacturer like Monster! Sure, it is more expensive than those cheap HDMI cables that get the bits out of phase, but what idiot uses those?
All kidding aside, it does make using your notebook as a media PC a walk in the park.

Something else I'll probably never need (1, Interesting)

Lord Lode (1290856) | more than 4 years ago | (#30693472)

Yet another kind of connection from PC to TV?

Why not just watch on the monitor of the PC, or use a projector?

Re:Something else I'll probably never need (1)

JasterBobaMereel (1102861) | more than 4 years ago | (#30693576)

Exactly - a TV is often lower resolution and lower quality than the monitor on your PC anyway (monitors tend to be smaller simply because you tend to sit nearer to them...)

And the only difference is that your TV has a built-in analog receiver (which will soon be obsolete) or a built-in digital decoder (which you can replace with a box).

Re:Something else I'll probably never need (1)

mdwh2 (535323) | more than 4 years ago | (#30693718)

An "HD television set", as stated in TFS, is lower resolution than a monitor?

My 1680x1050 TV is my monitor. But sometimes it would be handy to send a display from my laptop without fiddling with cables, and my laptop resolution is lower than my TV.

Re:Something else I'll probably never need (0)

Anonymous Coward | more than 4 years ago | (#30694770)

Are you sure that's the resolution? That can't even play 1080i/p without downscaling or 720i/p without upscaling or letterboxing. That doesn't make any sense for a television.

My laptop has a higher screen resolution than your television, by the way, and can play 1080i/p without upscaling (and it's only a 15.4" screen).

Re:Something else I'll probably never need (1)

mdwh2 (535323) | more than 4 years ago | (#30693678)

Maybe someone wants to play something from their laptop?

Re:Something else I'll probably never need (1)

YrWrstNtmr (564987) | more than 4 years ago | (#30693768)

Why not just watch on the monitor of the PC, or use a projector?

I would use this. In my current set up, we have TV/Tivo/cable box on one side of the room, and projector/PC on the other side of the room. PC plays movies and Netflix on the projector, and Tivo/cable on either the TV or projector.
Because there are two viewing devices on opposite sides of the room, somewhere there will be a long-ass cable involved. Currently, a 50' S-video cable from the Tivo to the projector.

Re:Something else I'll probably never need (1)

WRX SKy (1118003) | more than 4 years ago | (#30693972)

Because my office is 2 stories above my media room with comfy chairs and surround sound... and running cables isn't a viable option.

Re:Something else I'll probably never need (2, Informative)

Steauengeglase (512315) | more than 4 years ago | (#30694498)

Because people have bought expensive HD sets with VGA/S-Video/HDMI and they want to use them as big, honkin' monitors in their living room without running cable.

Why Die? (1)

jmyers (208878) | more than 4 years ago | (#30693528)

Sounds like something you scream at the TV when the redneck down the street starts talking on his CB and turning the screen to snow right in the middle of your favorite show.

Re:Why Die? (1)

zach_the_lizard (1317619) | more than 4 years ago | (#30693664)

I pronounce it as "weedy." It reminds me of all the drugs that marketing must have done to come up with the name.

Does this work with real video cards / chips? (1)

Joe The Dragon (967727) | more than 4 years ago | (#30693538)

Does this work with real video cards/chips, and not Intel GMA, which is about the same speed as 1-2 year-old onboard ATI/Nvidia chips?

Re:Does this work with real video cards / chips? (0)

Anonymous Coward | more than 4 years ago | (#30694560)

I'd say that's closer to 3-5 year-old ATi and nVidia chips.

Let me guess... (1)

ocularsinister (774024) | more than 4 years ago | (#30693552)

Let me guess... this comes laden with DRM and associated technologies. So, thanks, but no thanks - I'm quite happy with a few feet of (well-shielded analogue) cable.

Re:Let me guess... (2, Funny)

asylumx (881307) | more than 4 years ago | (#30693590)

Shielded with tin foil?

Re:Let me guess... (1)

ocularsinister (774024) | more than 4 years ago | (#30693698)

Of course! :D

Re:Let me guess... (0)

Anonymous Coward | more than 4 years ago | (#30694442)

Eventually, all new tech will be equipped with DRM. Unless you want to be the 2030 equivalent of that guy today who has an unnetworked PDP-11 in his basement and no other computer, you will have to accept DRM or be left far, far behind.

What's the chance... (1)

hitmark (640295) | more than 4 years ago | (#30693570)

that Intel came up with Light Peak after getting called out on their attempt to keep the USB 3.0 host controller specs proprietary?

Light Peak started at Apple (2, Informative)

mbrod (19122) | more than 4 years ago | (#30693794)

Apple started the concept but ceded it to Intel to develop.

http://www.engadget.com/2009/09/26/exclusive-apple-dictated-light-peak-creation-to-intel-could-be/

Re:Light Peak started at Apple (0)

Anonymous Coward | more than 4 years ago | (#30695146)

Or, if you wanted to be even more informative: "Engadget claims Apple started this concept, but no one else has found supporting evidence." Specifically, I remember CNet claiming their insiders dismissed the idea.

Not that this is important, as Engadget's reporting is universally known to be scientific and unbiased.

I have my doubts (1)

MBGMorden (803437) | more than 4 years ago | (#30693874)

They already sell the equivalent for iPods to transmit to a radio in your car. It does work, and I use one, but the quality is hit or miss. It's not as good as a straight cable, and it's very prone to interference. I'm planning on upgrading to a new head unit sometime this year so that I can plug right into it rather than use the radio setup.

Wireless (anything) for me is only a temporary convenience that I use until I can properly set up a wired system. It ALWAYS has drawbacks, and I never want to use it for anything long-term.

The real killer app (1)

spectrokid (660550) | more than 4 years ago | (#30693876)

The real killer app is of course the millions of projectors hanging from office ceilings worldwide. From now on you can get Death by PowerPoint without those pesky 20-meter VGA cables. When will somebody make a projector you can simply stick a USB key into?

Re:The real killer app (1)

karnal (22275) | more than 4 years ago | (#30694136)

Those do exist. http://blogs.zdnet.com/Berlind/?p=813 [zdnet.com]

Saw a few of these advertised in the travel rag (can't remember the name of it for the life of me) that sits in the airplane seats. Not as common as they should be, but you can buy them.

Intel CPUs? (3, Insightful)

VincenzoRomano (881055) | more than 4 years ago | (#30693936)

PCs running Intel's latest generation of Core processors

I don't see the point here. How can the WiFi side tell whether you use Intel, AMD, ARM or whatever else?
Sounds more like advertisement than technology!

Televisions will require a special adapter.... (2, Interesting)

Anita Coney (648748) | more than 4 years ago | (#30694080)

Yeah, because we all know how completely difficult it is to connect a DVI-to-HDMI cable and a 1/8" audio cable from your computer to your TV.

Of course someone will say, "Most people don't keep their PCs near their TVs."

If people were willing to spend $600 on a PS3 that sits in their living room, I don't see why they can't spend a few hundred for a PC. Heck, if you subtract the $100 "special adapter" from the price of the PC, you can get one real cheap.

Of course someone else will say, "Who wants a noisy PC in their living room?" And to that I'll say, "Have you ever been in the same room with an Xbox 360?" Mine is noisier than my PC by a wide margin.

Compared to the 90s, I think retail desktop PCs are pretty quite nowadays. (Of course I built mine myself.)

Re:Televisions will require a special adapter.... (1)

Yamata no Orochi (1626135) | more than 4 years ago | (#30694454)

Compared to the 90s, I think retail desktop PCs are pretty quite nowadays. (Of course I built mine myself.)

You built your quite retail desktop PC yourself?

Re:Televisions will require a special adapter.... (1)

Anita Coney (648748) | more than 4 years ago | (#30694736)

Thanks, I needed that!

Re:Televisions will require a special adapter.... (1)

Fred IV (587429) | more than 4 years ago | (#30695118)

If people were willing to spend $600 on a PS3 that sits in their living room, I don't see why they can't spend a few hundred for a PC. Heck, if you subtract the $100 "special adapter" from the price of the PC, you can get one real cheap.

Meaning yet another power-hungry disposable consumer good to sit in common space now and in the landfill later. Thanks, but no thanks. Besides, if they already spent for a PS3, they can already stream video over wi-fi using PS3 Media Server [google.com] .

I wanna know... (0)

Anonymous Coward | more than 4 years ago | (#30694492)

"Wi" won't Sony "Di" ?

Should use ATSC (3, Insightful)

gr8_phk (621180) | more than 4 years ago | (#30694614)

They should just broadcast it using ATSC. Then we wouldn't need a receiver on the TV, just the antenna.

Why doesn't television use better compression? (1)

radarsat1 (786772) | more than 4 years ago | (#30694728)

If I understand correctly, digital television signals still use basic MPEG-2 compression, like DVDs. I'm not sure if this is still the case for HD streams (Blu-ray, etc.), but it seems to me like they can't fit all that much data on a disc compared to what you can download in a torrent.

Meanwhile, I regularly stream xvid and h264 videos from my laptop to my "media" computer (a desktop PC connected to my TV running Ubuntu) using regular old 802.11G over SSH. (The ssh isn't necessary, but sshfs is pretty convenient.) Also I can fit several HD movies on a DVD.

Why don't television standards use more advanced compression technology? It seems to me like this would be just as beneficial as developing higher-bandwidth methods of transmitting video data.

Re:Why doesn't television use better compression? (2, Insightful)

delt0r (999393) | more than 4 years ago | (#30695304)

If you want really good quality (as I do), then you are at the high-bandwidth end of the spectrum, and there MPEG-2 is no worse than H.264 (even the experts agree on this point). Basically you are at the end where you are encoding quite a bit of noise (film grain etc.). H.264 shines at lower bitrates, but with massive increases in complexity and patents. Hell, the spec reads like a bunch of engineers had a stack of patents that they wanted to include in the spec.

I know a lot of fanboys love H.264 and believe that HD can fit in 1 GB for a 2-hour movie, but that only works if you are blind. Really, the vast majority of content out there is so compressed that there's no point in 1080p because DVD looks better anyway. There is a reason Blu-ray can fit 25 GB on a disc. Currently, here in Vienna, HDTV looks far worse than normal TV due to the horrible artifacts... that may be a combination of using MPEG-2 at low bitrates, bad reception, or using H.264 at even lower bitrates. Either way, what's the point of 1080i/p or even 720p when most pixels are mosquito noise and other types of decoding artifacts?

Why not just reduce bandwidth via a smaller image, rescale, and be honest about what you are getting? HD does not fit in DVD bitrates. DVD does.

Oh, and HDTV does include H.264.
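The bitrate arithmetic behind this point is easy to check. A quick sketch (the 1 GB and 25 GB figures come from the comment above; the 2-hour runtime is an assumption):

```python
def avg_bitrate_mbps(size_bytes, seconds):
    """Average bitrate in megabits per second for a file of the given size and duration."""
    return size_bytes * 8 / seconds / 1e6

TWO_HOURS = 2 * 60 * 60  # runtime in seconds

# "HD in 1 GB for a 2-hour movie" works out to barely over 1 Mbit/s.
one_gig_hd = avg_bitrate_mbps(1 * 1000**3, TWO_HOURS)

# A full 25 GB Blu-ray layer spread over the same 2 hours is ~28 Mbit/s.
blu_ray = avg_bitrate_mbps(25 * 1000**3, TWO_HOURS)

print(round(one_gig_hd, 1))  # 1.1
print(round(blu_ray, 1))     # 27.8
```

At roughly 1.1 Mbit/s a 1080p stream is indeed starved for bits; DVD's MPEG-2 typically runs at 4 to 9 Mbit/s, and Blu-ray video tops out around 40 Mbit/s.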

So many questions... (2, Interesting)

dr_wheel (671305) | more than 4 years ago | (#30695326)

... and none of the articles I've read about 'Wi-Di' seem to answer them.

How about sound? Transmitting video directly to my TV sounds nice, but how does this tech handle sending audio to an HT receiver? Potential for audio/video de-sync? How will this be handled?

Potential for latency issues? This could be a big one, especially for gaming.
