
Intel's 14-nm Broadwell CPU Primed For Slim Tablets

samzenpus posted about 4 months ago | from the check-it-out dept.


crookedvulture writes: Intel's next-gen Broadwell processor has entered production, and we now know a lot more about what it entails. The chip is built on 14-nm process technology, enabling it to squeeze into half the power envelope and half the physical footprint of last year's Haswell processors. Even the thickness of the CPU package has been reduced to better fit inside slim tablets. There are new power-saving measures, too, including a duty-cycle control mechanism that shuts down sections of the chip during some clock cycles. The onboard GPU has also been upgraded with more functional units and hardware-assisted H.265 decoding for 4K video. Intel expects the initial Broadwell variant, otherwise known as the Core M, to slip into tablets as thin as the iPad Air. We can expect to see the first systems on shelves in time for the holidays.


Thank GOD (4, Funny)

ADRA (37398) | about 4 months ago | (#47650197)

Because what I was missing from a tablet was 4K movies!

Re:Thank GOD (4, Funny)

Lab Rat Jason (2495638) | about 4 months ago | (#47650247)

Bow down to my 27" tablet!!!

Re:Thank GOD (3, Insightful)

Anonymous Coward | about 4 months ago | (#47650261)

It's about future-proofing. Plus, H.265 applies to all resolutions, not just 4K. So you might be able to download a 720p video at 50-70% of the current file size.

I haven't touched Ivy Bridge or Haswell. I want to hold out for Broadwell or Skylake for a nice, even lower-power notebook. That, or a future AMD offering.

Re:Thank GOD (1)

armanox (826486) | about 4 months ago | (#47651619)

I went from AMD Bulldozer (FX-8120) to Intel Ivy Bridge (i5-3570K) and couldn't have been happier with the upgrade. Didn't see a need to buy Haswell, and in all honesty I'll probably skip Broadwell as well (maybe - BOINC could always use more compute power... the only game that maxes it out is War of the Vikings, on max settings).

Re:Thank GOD (2)

timeOday (582209) | about 4 months ago | (#47650303)

I do have a 4k display on my Mac Pro, but I don't have a tablet because I like having one device that can do it all. A Surface Pro with this new chip might end up being that device.

Re:Thank GOD (0)

Anonymous Coward | about 4 months ago | (#47655449)

Tablets are handy for things like reading PDFs (I don't like reading on a desktop or notebook, and e-readers just phail at PDFs if they're even a tiny bit complexly formatted), but I'll tell you what I've found:

Inputting anything beyond a sentence or two (even with nice things like Swype) is a MAJOR PITA, especially with crappy autocorrection and suggestions for imbeciles (kind of like dictionaries on e-readers that suggest words other than the one you just typed in, even though their onboard dictionary HAS the definition of the damned word that you typed in, FFS!). So far nothing beats the mouse and keyboard for me, but if they had decent HWR (handwritten character recognition) and a stylus I'd probably be pretty happy; I imagine still not happy enough, though, to abandon mouse/keyboard/larger display for the truly serious work.

Re:Thank GOD (1)

timeOday (582209) | about 4 months ago | (#47655647)

Well, that's why I'm holding out hope for the Surface Pro. It seems to me that Microsoft is really trying to make something that can replace both a tablet and a computer. Even if you end up using separate applications when the keyboard/mouse is connected, that would be fine. I just don't like having my stuff spread around, which is why I'm using the MacBook Pro for everything right now. But if I could pull off its screen and have all my stuff on a tablet too, it would be handy.

Re:Thank GOD (4, Informative)

CastrTroy (595695) | about 4 months ago | (#47650305)

You're missing the biggest point. It has hardware h.265 support (not to be confused with h.264), a newer compression algorithm that allows for even smaller files while maintaining the same video quality, or better quality at the same bitrate.
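
To make the size difference concrete, here's a minimal sketch (assuming an ffmpeg build with libx264/libx265 support; "input.mp4" is a placeholder clip) that encodes the same video with both codecs and compares output sizes. Note that CRF scales aren't directly comparable between the two encoders, so this is only a rough illustration:

    # Sketch: encode the same clip with H.264 and H.265 and compare sizes.
    # Assumes ffmpeg with libx264/libx265; "input.mp4" is a placeholder.
    import os
    import subprocess

    for codec, out in (("libx264", "out_h264.mp4"), ("libx265", "out_h265.mp4")):
        subprocess.run(
            ["ffmpeg", "-y", "-i", "input.mp4", "-c:v", codec, "-crf", "28",
             "-an", out],  # drop audio so only video size is compared
            check=True)

    h264 = os.path.getsize("out_h264.mp4")
    h265 = os.path.getsize("out_h265.mp4")
    print(f"H.265 output is {h265 / h264:.0%} the size of the H.264 output")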

Re:Thank GOD (1)

ADRA (37398) | about 4 months ago | (#47650463)

Don't get me wrong, I know the nuance of the change; I just had to laugh that 4K video was the selling feature of a tablet. I'd be hard pressed to see the difference between 1080p and 4K on my 52" TV, and I'm 20/20 - forget a screen significantly smaller, whatever its pixel density rating or even perceived pixel density rating.

Re:Thank GOD (2)

nine-times (778537) | about 4 months ago | (#47650521)

Yeah, but you can also play video from your tablet to your TV, through HDMI out if you have it, or else streaming to a set-top box. It may not be an extremely common use for tablets, but I've done it before. And a 13" tablet running a "retina" resolution (~300 dpi) would run over 1080p, for whatever that's worth.

I mean, I'm not sure I care about 4k right now, since 1080p seems to be doing just fine for my purposes. Still, it's not as though the idea is completely stupid.
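
For what it's worth, the "over 1080p" claim checks out with some quick, purely illustrative arithmetic (16:9 aspect assumed):

    # Back-of-the-envelope: a 13" 16:9 tablet at ~300 dpi vs. 1080p.
    import math

    diag_in, dpi = 13.0, 300
    ar_w, ar_h = 16, 9
    diag_units = math.hypot(ar_w, ar_h)                   # ~18.36
    width_px = round(diag_in * ar_w / diag_units * dpi)   # ~3399
    height_px = round(diag_in * ar_h / diag_units * dpi)  # ~1912
    print(width_px, height_px)  # well over 1920x1080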

Re:Thank GOD (1)

alen (225700) | about 4 months ago | (#47650559)

why would i want to connect my tablet to my TV via HDMI so it's a PITA to use it while watching TV? if anything, i like airplay to my apple tv from my ipads to stream cartoons from the Nickelodeon app

if i'm going to stream to my TV i'll just buy a better apple TV or roku because ARM processors with hardware h.265 are probably on the horizon as well

Re:Thank GOD (2)

afidel (530433) | about 4 months ago | (#47650631)

I use HDMI from my tablet to TVs in hotel rooms when traveling.

Re: Thank GOD (0)

Anonymous Coward | about 4 months ago | (#47652385)

Yeah, you want the best display possible for that porn, am I right?

Re:Thank GOD (1)

nine-times (778537) | about 4 months ago | (#47650685)

why would i want to connect my tablet to my TV via HDMI so it's a PITA to use it while watching TV?

Well, for example, I've used AirPlay to stream a movie from my iPad to my parents' Apple TV when I came to visit. It let us all watch the movie on the TV instead of on a little iPad, and I just didn't use the iPad while it was streaming.

if anything, i like airplay to my apple tv from my ipads to stream cartoons from the Nickelodeon app

Can you do other things now on an iPad while streaming video? Last I checked, if you were using AirPlay, you couldn't switch applications without it stopping the stream, which would negate your previous objection, "why would i want to connect my tablet to my TV... so it's a PITA to use it while watching TV?" Just don't use your tablet while playing videos on it.

Either way, you know, maybe it's not useful for you, but that doesn't mean that it's not useful for anyone.

Re:Thank GOD (1)

spire3661 (1038968) | about 4 months ago | (#47650689)

AirPlay is buggy and not nearly as reliable as a wire. I have had 3 Apple TV boxes for years now. They don't work consistently.

Re:Thank GOD (1)

BitZtream (692029) | about 4 months ago | (#47654181)

You'll still need the hardware h.265 decoding to do it via AirPlay, unless you want to watch your iPad suck through its battery before the movie finishes.

Re:Thank GOD (0)

Anonymous Coward | about 4 months ago | (#47652711)

HDMI 2.0 is required for 4K at 30fps and above, and I don't see any TVs with this support yet; you only find it in Macs, through DisplayPort. There are also no DisplayPort TVs, so you'd be watching it on a computer screen.

4K really has no place right now, and neither does h.265, because by the time the hardware and software start including it as a standard feature, we'll be two years away from the next one, and the next removable storage medium.

HEY GUYZ, WHERE BE YOUR BLURAYS NOW?

Seriously, I can see the applications for Netflix-like services, but even the idiots doing TV over IP on cable and DSL are still only doing MPEG-2 (h.262), and most of the world skipped h.263 (AKA DivX, when nobody else adopted it)

Re:Thank GOD (1)

macromorgan (2020426) | about 4 months ago | (#47654759)

U-Verse is an H.264 IPTV solution (which doesn't support third party hardware, sadly). Most cable operators however still only transmit MPEG-2, though a few are rolling out some channels in H.264.

Re:Thank GOD (1)

ProzacPatient (915544) | about 4 months ago | (#47652855)

I'm not sure how it works out on Apple, as I don't have an HDMI adapter for my iPad 2, but I've tried it with some of my Samsung Android tablets, and typically what I found is that on stock firmware some apps will not display over HDMI out, or even let you take screenshots of the app, in the name of copy protection.

Of course, if you're using a non-standard firmware like the incredible CyanogenMod, these copy-protection mechanisms seem to be completely ignored; but on the other side of the fence, you usually don't reliably have the option of using a custom firmware on Apple devices, unfortunately.

Re:Thank GOD (0)

Anonymous Coward | about 4 months ago | (#47650737)

Yes, but you hold the tablet closer to your face. Modern (2xdpi/"retina") tablets already exceed 1080p as it is, and phones are settling comfortably around 1080p (3xdpi/super-"retina") anyway.

Re:Thank GOD (0)

Anonymous Coward | about 4 months ago | (#47650883)

Recently bought a 15" Apple retina MacBook. While I had been *resisting* the temptation, now that I've seen 4K videos on it I can honestly tell you FullHD is crap compared to it. You don't even need to see the videos side by side; the 1080p version looks terrible.

YMMV though, nearly blind people still use 52" monitors for 80x40 text editing...

Re:Thank GOD (1)

FlyHelicopters (1540845) | about 4 months ago | (#47652627)

This is so true... the comment "oh, you can't see the difference between 4K/1080p" is just as much nonsense as the one 10 years ago saying, "oh, you can't see the difference between 720p/1080p"

It is people talking out of their rear ends mostly...

Having seen them both in person, 4k blows away 1080p on the proper equipment.

Re:Thank GOD (0)

Anonymous Coward | about 4 months ago | (#47652771)

Eh, it doesn't really matter; it's the viewing distance.

480p is fine... on a 2" screen. 720p is fine at 3", 1080p at about 5", and 4K somewhere around 10"... exactly the size of an iPad.

This all of course assumes you are viewing the content at less than 3'.

When you start viewing the content from farther than 3', the resolution matters less. A 50" screen needs to be viewed from within about 10' to get anything out of 4K. Any further back, and 1080p and 4K don't look any different.

Re:Thank GOD (1)

tlhIngan (30335) | about 4 months ago | (#47652773)

This is so true... the comment "oh, you can't see the difference between 4K/1080p" is just as much nonsense as the one 10 years ago saying, "oh, you can't see the difference between 720p/1080p"

It is people talking out of their rear ends mostly...

Having seen them both in person, 4k blows away 1080p on the proper equipment.

Which for 99% of the population is never going to happen.

Because most people sit WAY too far away from their TVs - at that distance even 720p is "retina" resolution - increasing resolution does absolutely zip because they can't resolve the added resolution.

A rough guide is about 1:1 screen size to viewing distance for 1080p - if you have a 100" screen, you need to be 100" away from it; a 60" set means you must sit 5' (60") away. 4K is even worse, which is why the sets come in humongous sizes: unless you want to sit with your nose to the screen, you need a bigger screen.

Yeah, I'm sure if you set everything up properly, you can notice the difference. But most people sit 6-8' away from a set that's too small for that distance.
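
The claim is easy to sanity-check: 20/20 vision resolves roughly one arcminute, so you can compute the angle one pixel subtends at a given distance. A quick sketch, with example sizes and distances and a 16:9 aspect assumed:

    # How many arcminutes does one pixel cover? Below ~1 arcmin per pixel,
    # extra resolution is wasted on a 20/20 eye.
    import math

    def pixel_arcmin(diag_in, horiz_px, distance_in, aspect=(16, 9)):
        width_in = diag_in * aspect[0] / math.hypot(*aspect)
        px_in = width_in / horiz_px
        return math.degrees(2 * math.atan(px_in / (2 * distance_in))) * 60

    # 60" 1080p set viewed from 8' vs. 5' (distances in inches)
    for feet in (8, 5):
        print(feet, round(pixel_arcmin(60, 1920, feet * 12), 2))
    # ~0.98 arcmin at 8' (right at the limit), ~1.56 at 5' (4K would help)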

Re:Thank GOD (1)

johnw (3725) | about 4 months ago | (#47652865)

Because most people sit WAY too far away from their TVs - at that distance even 720p is "retina" resolution - increasing resolution does absolutely zip because they can't resolve the added resolution.

A rough guide is about 1:1 screen size to viewing distance for 1080p

Way too far away from their TVs for what? If your criterion for deciding the correct sitting distance is whether or not you can tell 720p from 1080p then perhaps you have a point, but if the object of the exercise is to watch television in comfort then 1:1 is just silly.

Re:Thank GOD (0)

Anonymous Coward | about 4 months ago | (#47650899)

> I just had to laugh that 4K video was the selling feature of a tablet.

It's not. It's just that whatever eventual standard there is for 4K video will include h.265, and that will be the first standard to specify h.265. So reporters and PR people put them together in the same sentence, is all.

Re:Thank GOD (0)

Anonymous Coward | about 4 months ago | (#47651809)

Heh... but you have to wonder who decided to nickname it "Core M" -- the most common piece of malware to rear its head on OS X (and it's a fake codec at that)....

Re:Thank GOD (1)

Bill, Shooter of Bul (629286) | about 4 months ago | (#47652791)

Maybe because it's a callback to a chip-design pivot that saved the company, much like this one might do if it's successful.

Back in the day AMD was KILLING Intel with the amd64 design in price/performance over the Pentium 4 line. Intel scrapped that design and went back to the Pentium M, the mobile version of the Pentium 3, and called it the Core Duo/Solo. So "Core M" makes a lot of sense: a pivot to meet a competitor (this time ARM).

Re:Thank GOD (1)

Anonymous Coward | about 4 months ago | (#47652303)

... and what they never seem to mention is that it gets those smaller files at the cost of many times the CPU requirement to decode the stream compared to h.264 or VP8. There's really nothing that groundbreaking as far as the algorithm goes; it's just choosing a different compromise point. This is why hardware support for h.265 and VP9 is required: you really don't want to view those streams on older devices. Or should I say, on general-purpose devices which haven't signed the papers?

the way it was meant... (2)

Imazalil (553163) | about 4 months ago | (#47650307)

You just haven't seen a movie the way the director intended until you've seen it on a 10-inch tablet at 800ppi at an airport. Now, how do I get this 160 gig movie on there?

Re:Thank GOD (4, Insightful)

VTBlue (600055) | about 4 months ago | (#47650319)

Funny, but actually what it means is that you get a Sandy Bridge-class CPU in an iPad Air form factor, which dramatically alters the usage scenarios. A nice port replicator or docking station will make for a clean and minimalist work area. One more generation and graphics will be pretty capable of mainstream gaming. Even with Core M, many games will be playable at medium/low settings.

Currently I'm looking for an excuse to dump my still-capable Lenovo T400s.

Re:Thank GOD (0)

Anonymous Coward | about 4 months ago | (#47650693)

That is what I was thinking too. It is the only reason I do not have an iPad. I was excited about them at first. Then I saw they'd gimped it and put the iPhone OS on it. I was thinking OS X on a tablet? Yes please. Then I saw the real specs and gave it a skip.

Now, though, this... Windows on a decent tablet...

Wonder if they will gimp it with the drive sizes, though. Put something like a 1TB mSATA in there...

Re:Thank GOD (2)

mlts (1038732) | about 4 months ago | (#47650741)

I can see an x86 (well, more accurately x86_64, since it uses AMD's 64-bit extensions) tablet taking on the role of a main desktop, similar to what the Microsoft Surface Pro is starting to do.

I would like to see five things on it to make it a serious contender for a desktop replacement role:

1: Two Thunderbolt connectors on a port replicator or docking station. These would work for video out, as well as provide 8 (in TB 1) or 16 (in the TB 2 spec) PCI lanes. I wonder if this would be enough for an external video adapter.

2: USB 3.1 with a type C connector. This is small enough to be on the device itself and support high amperage charging (as well as voltage higher than 5 volts if negotiated through the plug.)

3: A decent docking station. Something that the device can easily be plugged or slid into (and can handle a lot of insertions without breaking), and offer connectors for video, keyboard, mouse, HDDs, multiple NICs, eSATA, USB ports (and lots of them), and so on. Bonus points if the tablet can be locked in place, as a theft-deterrent.

4: Decent RAM and disk space. For starters, it should have 16 GB of RAM and at least 1TB of SSD.

5: A read-only drive that has OS media on it. This way, reinstalling the machine from malware-free media (not the "reset" button or recovery partitions that wind up just as infected as the main partition) would be doable, and there would be no losing the install media that came with the box. Heck, Tandy did this in 1984 with MS-DOS; why can't it be done with a modern machine?

Re:Thank GOD (1)

edxwelch (600979) | about 4 months ago | (#47651255)

If you're running a game you will typically have the GPU and CPU maxed out, so all the clever power gating and duty-cycle stuff is effectively switched off. The battery isn't going to last much longer than with previous-gen CPUs.

Re:Thank GOD (1)

VTBlue (600055) | about 4 months ago | (#47651305)

If you're running a game you will typically have the GPU and CPU maxed out, so all the clever power gating and duty-cycle stuff is effectively switched off. The battery isn't going to last much longer than with previous-gen CPUs.

If they are iPad/iPhone-class games then I'm okay with that. But real gaming would be docked anyway, with a KB and mouse.

Re:Thank GOD (1)

mr_exit (216086) | about 4 months ago | (#47652063)

That is, until the thermal protection kicks in and the game starts to crawl.

The Ouya found this: they had room to add a small heatsink to an otherwise standard mobile SoC, and were able to get a lot more performance out of it because it wasn't hitting the thermal limits.

Re:Thank GOD (2)

Kjella (173770) | about 4 months ago | (#47651323)

One more generation and graphics will be pretty capable of mainstream gaming.

I'm not sure whether I should disagree with you because there's plenty of gaming on phones/tablets today, or because the bar for what's mainstream keeps going up, but I don't agree. Every time they make a better tablet they also release a new generation of graphics cards, and a new generation of games comes out to use them. We no longer run Quake, and Crysis is no longer all it's cracked up to be, so next generation I expect the situation to be exactly the same: many games will be playable at medium/low settings. And there's no catching up, because you can't do in a 15W tablet power budget what a 150W desktop can do. The desktop is always going to look much better, relatively speaking.

Re:Thank GOD (1)

VTBlue (600055) | about 4 months ago | (#47651401)

I meant desktop-class gaming.

As far as the 150W vs 15W argument, I disagree. Desktop components are typically less efficient than mobile components of the same generation, and when you start comparing across 2-3 generations, mobile components can easily perform as well as desktop components 2-3 generations behind. Desktop games usually target multiple hardware generations, so you have to factor that in, as game devs always do. Today more game devs are targeting specific hardware chips for better optimization.

The Xbox One is a 30W-TDP APU. In general, though, if we are talking similar architectural classes of components, then yes, more power is better.

Re:Thank GOD (0)

Anonymous Coward | about 4 months ago | (#47650335)

So maybe it's not needed for the resolution, but certainly for future codecs. Part of the evolution of video compression is the increased compute horsepower available for ever more complex search-space solutions.

Re: Thank GOD (1)

Redbehrend (3654433) | about 4 months ago | (#47650409)

All good news; now they just need to lower their prices so they get used more, lol

Marketing (1)

DarthVain (724186) | about 4 months ago | (#47650431)

That was my first thought. What does a tablet need 4K compatibility for!?

Though I guess, technically, rather than having a 50" tablet, it might allow someone to use the tablet as a media source for the TV.

I used my Samsung phone in a pinch, for example, when both Netflix and my media computer were on the fritz.

That said, they'd better start offering some much larger storage configurations if they plan on people carting around a bunch of movies that don't look like garbage at 4K.

Re:Marketing (0)

Anonymous Coward | about 4 months ago | (#47650533)

Yeah, and I was a bit miffed when I realized that my 7" Samsung tablet doesn't have video out. It's pretty handy while traveling, though it seems some hotels disable the HDMI inputs so you can't just watch Netflix from your phone.

Re:Marketing (1)

Xenx (2211586) | about 4 months ago | (#47650623)

Storage capacity is where streaming comes in handy. Not just online streaming, but NAS and the like.

Re:Marketing (1)

DarthVain (724186) | about 4 months ago | (#47654405)

Don't have 4K or anything so not sure, but I suspect you may run into bandwidth issues. I guess it is really like offloading the cost of internal storage onto your ISP dl cap.

Re:Marketing (1)

Xenx (2211586) | about 4 months ago | (#47655527)

... You're not likely to run into bandwidth issues OR issues with a download cap, in regards to local network storage.

Re:Marketing (1)

DarthVain (724186) | about 4 months ago | (#47669929)

No, but when referring to online streaming you will run into both.

Also, I'm not sure you can use your tablet to access your NAS, stream from the NAS to the tablet for rendering, and then stream it again to your TV. I think you'd find you run into at least network issues trying that.

Re:Marketing (0)

Anonymous Coward | about 4 months ago | (#47650637)

> What does a tablet need 4K compatibility for!?

"64K ought to be enough for anyone."

Why does a phone need 1080p?
A tablet with a screen that is twice as big (4x the surface area) has a clear case for having 4x the number of pixels.

Re:Marketing (0)

Anonymous Coward | about 4 months ago | (#47650891)

Why does a phone need 1080p?

New cell phones are 2160p :-)

Re:Thank GOD (1)

Jeff Flanagan (2981883) | about 4 months ago | (#47650527)

4K video and pictures on a tablet would look amazing, and a 4K display could render text much more clearly than a 1080p one.

The only thing that makes 4K on a tablet less desirable than 1080p to me is that a tablet would need a much faster, and presumably more power-hungry, graphics subsystem to drive all the pixels of a 4K display, especially for gaming.

Re:Thank GOD (1)

Ichijo (607641) | about 4 months ago | (#47650657)

The latest generation tablets already have resolutions above Full HD and would therefore benefit from 4K video.

Re:Thank GOD (1)

hairyfeet (841228) | about 4 months ago | (#47651217)

Frankly Intel is the only one "chasing the rabbit" of ever-smaller process nodes, so I'd like to see data showing how many of these are crap off the line and how bad the leakage is.

Of course the thing that scares the living hell out of Intel, as any PC shop guy will tell you, is that the x86 chip went from "good enough but just barely" to "so insanely overpowered you'll never use half of it" several years back. Both my desktop and laptop are on 45nm and frankly I have so many cycles to spare now it isn't even funny, so what good would 14nm do me? Add another 30 minutes to the battery? My netbook already gets over 6 hours on a new battery, which is honestly 3 and a half hours more than I need, and on the desktop the hexacore spends more time with half the cores parked than not, simply because I can't come up with enough work to feed it.

Finally, Intel is about a year and a half too late, as from the trenches I can tell ya the whole tablet craze? Yeah, it's dying out. Even Apple's numbers are going down, because the race to the bottom has made tablets so cheap that everybody and their cat and their cat's squeaky toy all have tablets now, and most of 'em? Gathering dust. I don't know how many customers I've had that just HAD to have a tablet who ended up coming back to me and saying "You were right, I was wrong, thanks for selling me a cheap tablet so I didn't spend a ton on something I hardly used," because they get 'em home and find out the laptop has more utility on the road while at home the desktop has more utility, so they end up being used for what I call the "WTF is his name?" wiki searches and... they find out the Android or iPhone in their pocket does that just as well, so... yeah. Not really much of a point in 'em.

Re:Thank GOD (1)

Xenx (2211586) | about 4 months ago | (#47655667)

Because your anecdotal evidence shows how everyone's experience is? Out of an office of 11 employees, 8 have tablets. 7 of us use them regularly. The 8th person is waiting for the next Nexus tablet, as his old one's USB port isn't working correctly and won't reliably charge. Personally I think we might be an above-average sample, but I somehow think you might be a below-average sample.

Re:Thank GOD (1)

hairyfeet (841228) | about 4 months ago | (#47673169)

I serve a city of over 15k, so I think my anecdotes are a little more weighted than your single office of 11 employees. I also have headlines on my side since, just as I said, even Apple sales are down [appleinsider.com], and I would argue the why is obvious: those that want one already have one and see NO point in getting another, because it either 1.- does what they need it to, since I have found tablet users' needs do NOT require much in the way of hardware, or, just as likely, 2.- it's gathering dust somewhere because they can't find a good use for it.

Not too long ago I thought I would die laughing as I saw a hipster chick struggling to drive a shopping cart while using an iPad as a grocery list, and I called out "trying to justify the several hundred you spent on that thing, aren't ya?" and the look of anger mixed with foolishness told me I nailed it. Tablets are good for a few niches... medical, where it's "check the box and sign your name" forms, inventory management, and of course being a glorified video player. Those jobs it does quite well; problem is, not a whole lot of the public requires those jobs very often, and even the video player doesn't get used much, since people are either at home where there is a big screen or out where they are busy doing other things. But at the end of the day numbers don't lie, and even Apple is seeing slumping sales because, like the netbook, it's a fad.

Re:Thank GOD (1)

Xenx (2211586) | about 4 months ago | (#47673531)

I used my office as an extreme, one that I knew the bounds of. Sure, I support thousands of people as well at work. Lots of them use iPads or Android tablets. I don't have numbers for them. But a lot of the ones I actually talk to... prefer it over their computer. It's people like you that actually make it worse: you intentionally recommend people buy inferior products, worsening their opinion of the form factor as a whole. Also, making fun of people tends to piss them off regardless of whether you're right or not. Tablets aren't for everyone; not everyone needs one. Tablets also don't need to be purchased every year. So yes, sales should decline with market saturation. They should, however, stabilize at a level that covers a normal upgrade cycle.

Re:Thank GOD (1)

backslashdot (95548) | about 4 months ago | (#47651463)

I guess you need your eyes checked, because I can easily tell the difference between a 1080p tablet and higher-resolution tablets; the pixels are visible even at standard viewing distance.

Anyway, looking at Intel's published die-area cost, it adds probably a few pennies to the cost of the CPU to add a 4K decoder. Also, the 4K decoder algorithm didn't have to get developed; it was designed years ago. Once the algorithm is designed, most of the process-shrink work is done automagically in software. Including it also has virtually no effect on power consumption. The inclusion of 4K costs very, very little per CPU, so I am not sure what you are complaining about.

All you luddites do is complain when new technology comes out.

Re:Thank GOD (1)

smallfries (601545) | about 4 months ago | (#47653227)

Are you sure that you are average? Perhaps you should not entirely discount the idea that you are in the 50% of the population with better-than-average vision.

I have no trouble seeing the difference between 720p and 1080p on a 55" screen at 5m (15'); what I find strange is noticing that many other people do. I always thought the figures for average vision must be underestimates, but other people seem to roll with them.

Re:Thank GOD (1)

Proudrooster (580120) | about 4 months ago | (#47652361)

Dude, the electronics industry needs 4K to sell us 4K panels for our living rooms. Right now, everyone is happy with an el cheapo 1080p. Time to step it up to 4K. Personally, I am happy with my 720p plasma TV. I am sad to see plasma go in favor of LCD, LED, OLED or whatever over-saturated color technology is being pushed out cheaply.


Mobile-only article; snort (1)

fnj (64210) | about 4 months ago | (#47650341)

I am MUCH more interested in Broadwell DESKTOP chips. I'm using a Haswell Xeon E3-1245v3 in a server now, and it speedsteps all the way from 3.4 GHz down to 100 MHz under light load. Ivy Bridge only stepped down to 800 MHz, and Sandy Bridge only stepped down to 1.6 GHz.
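
If you want to watch that scaling happen on a Linux box, the per-core frequencies are exposed through the standard cpufreq sysfs interface (a kernel with cpufreq support assumed); a minimal sketch:

    # Print the current clock of each core via cpufreq sysfs (Linux).
    import glob

    for path in sorted(glob.glob(
            "/sys/devices/system/cpu/cpu[0-9]*/cpufreq/scaling_cur_freq")):
        with open(path) as f:
            khz = int(f.read())
        cpu = path.split("/")[5]  # e.g. "cpu0"
        print(f"{cpu}: {khz / 1e6:.2f} GHz")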

Re:Mobile-only article; snort (1)

InvalidError (771317) | about 4 months ago | (#47650475)

Since Broadwell-K is not going to launch until half-way through 2015 and Skylake was still on the 2015 roadmap last time I remember seeing one, I would not be surprised if Intel canned Broadwell-K altogether - no point in flooding the market with parts that only have a few months of marketable life in front of them. If Broadwell-K does launch beyond OEMs, it may end up being one of Intel's shortest-lived retail CPUs ever.

In the first Broadwell roadmaps, there were no plans for socketed desktop parts; all mobile and embedded.

Re:Mobile-only article; snort (1)

viperidaenz (2515578) | about 4 months ago | (#47650723)

But Intel have been bringing out a new CPU every year for years now.
Cedar Mill came out 6 months before Conroe

Re:Mobile-only article; snort (1)

InvalidError (771317) | about 4 months ago | (#47651063)

The P4 was getting destroyed by AMD in benchmarks, the 65nm die shrink failed to translate into significant clock gains and interest in power-efficient desktop CPUs was starting to soar so Intel had little choice but to execute their backup plan to save face: bring their newer and better-performing next-gen Core2 mobile CPU design to the desktop.

Broadwell only brings minor performance improvements to desktops and shaves a few watts along the way. If Intel decided to scrap Broadwell-K, or perhaps produce them in limited quantities due to launch dates getting too close to Skylake for full-scale production, few tears will be shed.

Re:Mobile-only article; snort (1)

viperidaenz (2515578) | about 4 months ago | (#47653073)

"a few watts" is 30%, which means a few hours more battery life in an ultrabook.
With a little more CPU power and more GPU too.

They're also talking 18 cores for the Broadwell Xeons, and desktop chips not coming out till Q2 2015.

Skylake won't be here in 2015.

Re:Mobile-only article; snort (1)

InvalidError (771317) | about 4 months ago | (#47654733)

But the comment I was replying to was about Broadwell-K, which is the desktop variant. Shaving a few watts on a desktop CPU is not going to get you much battery life even if you have a UPS. Most people who buy Broadwell-K will be using it with a discrete GPU too.

Re:Mobile-only article; snort (1)

viperidaenz (2515578) | about 4 months ago | (#47657407)

Where did you get Broadwell-K from? Apparently the desktop versions are going to be Broadwell-H.
They're getting the GT3e GPU, which comes with a bunch of eDRAM and hardware support for VP8 and H.265, and the GPU can be used for encoding and decoding at the same time.

Re:Mobile-only article; snort (1)

InvalidError (771317) | about 4 months ago | (#47660229)

Broadwell-H might be Intel's shipping name but the roadmap name has been Broadwell-K for about a year. That's why you see Broadwell-K used everywhere.

The fact that K-series chips (the enthusiast unlocked chips) will be from the Broadwell-K lineup likely contributed to most computer enthusiast sites choosing to stick with the old roadmap name instead of adopting Intel's new production codenames.

Re:Mobile-only article; snort (1)

nateman1352 (971364) | about 4 months ago | (#47653089)

I think what will be interesting and compelling for Broadwell Desktop is the Iris Pro graphics on LGA parts (not just BGA mobile parts like Haswell.) Certainly it won't be capable of competing with high end cards but you can probably expect mid range discrete graphics performance built in to the CPU.

For your standard desktop tower gaming rig it doesn't matter much, since you will likely be using discrete graphics there anyway. What excites me more is mid-range discrete graphics performance without the added power consumption->heat->large GPU heat sink chain. That means a NUC form factor system with mid-range discrete graphics performance, which would be a pretty awesome Steam box and/or general living room entertainment system.

Also, if Haswell history is any lesson, the chips with Iris Pro graphics launch after the chips with the low-end integrated graphics. This probably gives Broadwell desktop a few extra months of life in the window between the Skylake desktop launch and Skylake desktop with Iris Pro.

Re:Mobile-only article; snort (1)

InvalidError (771317) | about 4 months ago | (#47654685)

While Iris Pro performs quite well when you turn down graphics low enough to fit most of the resources in the 128MB Crystalwell L4 cache, nobody interested in mid-range graphics would be willing to give up this much quality for decent frame rates. Once you exceed that 128MB, even low-end discrete GPUs with GDDR5 take the lead. Broadwell's four extra units are not going to change this by much.

If Intel released chips with an upgraded 512MB Crystalwell and twice the L4 bandwidth, then that would nuke low-end GPUs and possibly start hurting mid-range.

VP9 (0)

Anonymous Coward | about 4 months ago | (#47650353)

What about VP9? YouTube is using that now. Who uses H.265 now?

Re: VP9 (0)

Anonymous Coward | about 4 months ago | (#47650603)

I want webm for 4Chan

ipad Air (0)

Anonymous Coward | about 4 months ago | (#47650359)

Since when do ipads run on x86?

Re:ipad Air (0)

Anonymous Coward | about 4 months ago | (#47650449)

Actually it says "as thin as the iPad Air".

Re:ipad Air (1)

present_arms (848116) | about 4 months ago | (#47650505)

Not yet, but soon. When Intel makes this chip, Apple could do the same with the iPad as they did with Macs. This time it's from ARM to Intel instead of PPC to Intel.

Re:ipad Air (1)

mrchaotica (681592) | about 4 months ago | (#47651079)

Going from one third-party chip to another is fine and dandy, but why the hell would Apple -- especially Apple! -- dump their own design?

Re:ipad Air (1)

hairyfeet (841228) | about 4 months ago | (#47651625)

Because, as I have been saying for years, ARM just doesn't scale? The problem with ARM is that while you can run it at insanely low wattage, the instructions per clock are likewise very low, and unlike x86 it just doesn't scale up with power usage. This is why Samsung is up to 6 cores and Nvidia 5: when you try to speed up the clock on ARM beyond a certain point, it blows the power budget without giving enough of a performance boost to be worth the increased power use.

As I have said for years, it'll be easier for AMD and Intel to scale down than it will be for ARM to scale up, because both companies already have such high IPC. Between the new Jaguar cores at AMD, and Broadwell and the latest Atom chips at Intel, it is looking like by the second quarter of next year the only thing ARM will have going for it is price. I predict ARM will end up in devices where price is king - your ultra-cheap tablets, STBs, and industrial applications - while x86 ends up taking a large chunk of the mid and high range.

Re:ipad Air (1)

R3d M3rcury (871886) | about 4 months ago | (#47652299)

They've done it before--ADB [wikipedia.org].

One issue, though, is how long would Apple maintain both? Yeah, Apple dumped 68K for PowerPC, but they also stopped developing for 68K. When Apple switched to Intel, they stopped doing things for PowerPC.

They could probably switch the iPad Air to Intel. But what about the iPhone? Apple--and 3rd party developers--would probably end up supporting both ARM and Intel for several more years in iOS.

Re:ipad Air (0)

Anonymous Coward | about 4 months ago | (#47650751)

RTF Summary: tablets as thin as the iPad. So if you're into thinness, you could have a tablet as thin as the iPad but not have to worry about their OS and store. Basically, you could end up with modern hardware and software.

14 nanometers? (4, Insightful)

ChipMonk (711367) | about 4 months ago | (#47650633)

Given that the covalent radius of silicon is 111 picometers, that comes to a channel that's 63 silicon atoms across.

And I thought 65nm (~300 silicon atoms across) was impressive five years ago.
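
The arithmetic, for anyone who wants to reproduce it:

    # Atoms across a feature, taking the Si covalent diameter as 2 x 111 pm.
    si_diameter_nm = 2 * 0.111  # 0.222 nm

    for node_nm in (65, 14):
        print(f"{node_nm} nm ~ {node_nm / si_diameter_nm:.0f} Si atoms across")
    # 65 nm ~ 293 atoms; 14 nm ~ 63 atoms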

14 nanometers? (0)

Anonymous Coward | about 4 months ago | (#47650687)

Given that the covalent radius of silicon is 111 picometers, that comes to a channel that's 63 silicon atoms across.

I guess we only have 62 atoms left to get rid of.

Re:14 nanometers? (1)

rahvin112 (446269) | about 4 months ago | (#47651913)

I will be highly surprised if quantum effects ever let them get close to one atom thick; at that scale electrons start doing very weird things that are hard to compensate for. We've pushed process technology just about to its limit.

Re:14 nanometers? (1)

phantomfive (622387) | about 4 months ago | (#47652975)

Last time I checked Intel's roadmap, they think they can get down to 5nm. Hard to see much beyond that.

Re:14 nanometers? (0)

Anonymous Coward | about 4 months ago | (#47654037)

Hard to see much beyond that.

My eyesight's doing just fine, thank you.

Real-world Moore's Law is toast... (1)

MetricT (128876) | about 4 months ago | (#47650721)

The transistor budget may still be scaling according to Moore's law, but that's failing to translate into real-world speed increases. The 5% increase in single-core IPC is weak sauce. And an annoying number of apps don't scale to multiple processors, or scale badly (Amdahl's law is unforgiving...)

You can add more cores, add more compute units to your GPU, or add a DSP (Broadwell) or FPGA (Xeon), but that has an ever-decreasing marginal impact on real-world speed.

We're probably stuck in a "5% IPC increase per tick/tock" world until they eventually shift off silicon onto Something Else (III-V semiconductors, or something more exotic like graphene).
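
To see just how unforgiving the Amdahl's law mentioned above is, here it is in a few lines; the 80%-parallel figure is just an example:

    # Amdahl's law: speedup on n cores for parallel fraction p.
    def amdahl_speedup(p, n):
        return 1 / ((1 - p) + p / n)

    for n in (2, 4, 8, 16):
        print(n, round(amdahl_speedup(0.80, n), 2))
    # 1.67x, 2.5x, 3.33x, 4.0x -- capped at 5x no matter how many cores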

Re:Real-world Moore's Law is toast... (0)

Anonymous Coward | about 4 months ago | (#47652251)

Die Intel, die! [google.com]

One can dream

Re:Real-world Moore's Law is toast... (1)

4wdloop (1031398) | about 4 months ago | (#47652695)

So, no more .NET and Java...back to the bare metal!

Small form (0)

Anonymous Coward | about 4 months ago | (#47650869)

What I'm excited about is seeing the cost of putting something like this in an ultra-small form factor like a Raspberry Pi.
A great little Linux box for home networks, automation, a Plex server, etc...

Less power?? (1)

jd (1658) | about 4 months ago | (#47651187)

Power is governed by state changes per second. Dynamic power varies linearly with frequency, but by the square of the voltage. There's only so much saving from reducing voltage, too, as you run into thermal issues and electron tunnelling errors.

You are much, much better off saying "bugger that for a lark", exploiting tunnelling to the limit, switching to a lower-resistance interconnect, cooling the silicon below 0°C and ramping up clock speeds. And switching to 128-bit logic and implementing BLAS and FFT in silicon.

True, your tablet will now look like a cross between Chernobyl, a fridge-freezer, and the entire engineering section of the NCC-1701D Enterprise, but it will actually have the power to play those 4K movies without lag, freezes, or loss of resolution.
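
The first-order model behind that is P = C * V^2 * f; a tiny sketch with made-up numbers shows why voltage is the lever everyone chases:

    # Dynamic CMOS switching power: P = C * V^2 * f (illustrative numbers only).
    def dynamic_power(c_farads, volts, hertz):
        return c_farads * volts ** 2 * hertz

    base = dynamic_power(1e-9, 1.0, 3.0e9)     # 1 nF switched at 1.0 V, 3 GHz
    lower_v = dynamic_power(1e-9, 0.8, 3.0e9)  # same clock, 20% less voltage
    print(lower_v / base)                      # 0.64 -> a 36% power saving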

intel is using fabs for low power (0)

Anonymous Coward | about 4 months ago | (#47651555)

Well, Intel uses its mighty fabs to get an edge over the competition. With higher-IPC microprocessors, the CISC overhead falls. Moore's Law stopped delivering faster transistors a decade ago, and lower cost per transistor is now diminishing. Lower power consumption per transistor remains, for some reason. Given the rise of mobile devices, that might be good enough for Intel to keep buying new fabs.

Re:intel is using fabs for low power (1)

4wdloop (1031398) | about 4 months ago | (#47652703)

They have the tech to make the best low power processor(s) on the planet...if only they would make an ARM one!

Re:intel is using fabs for low power (1)

jd (1658) | about 4 months ago | (#47655829)

They did, at one point. They bought the rights to StrongARM and sold it for some time, then abandoned it completely.

Re:intel is using fabs for low power (1)

4wdloop (1031398) | about 4 months ago | (#47660197)

StrongARM was before iDevices...

I mean a 14nm ARM SoC for mobile devices. In fact, why wouldn't they now, when they're pushing into this market space with x86?
It can't be profit margins, and they don't care about Windows there anymore, do they?

Oh great. (0)

Anonymous Coward | about 4 months ago | (#47652139)

Another mediocre architecture saved by superior manufacturing.

Thinkpad 8 (1)

TheMiddleRoad (1153113) | about 4 months ago | (#47652719)

I have a Thinkpad 8 and a Miix 2 8. The Thinkpad 8 is a desktop replacement: I use Bluetooth for keyboard and mouse, run an HDMI monitor, and feed power through USB. It works well, but not perfectly. I'll upgrade to a good Broadwell or Cherry Trail. Anyway, the future looks awesome.

I will enjoy having a slimmer tablet ... (1)

dave falkner (3585309) | about 4 months ago | (#47655497)

... that is 12 nm slimmer than it might otherwise be without this new technology.

Makes me want to buy Intel stock (1)

Mister Null (3688597) | about 4 months ago | (#47657051)

These innovations make me want to buy Intel stock.