
Graphics-Enabled CPUs To Take Off In 2011

timothy posted more than 3 years ago | from the in-my-day-we-had-integrated-graphics dept.


angry tapir writes "Half the notebook computers and a growing number of desktops shipped in 2011 will run on graphics-enabled microprocessors as designers Intel and Advanced Micro Devices (AMD) increase competition for the units that raise multimedia speeds without add-ons. The processors with built-in graphics capabilities will be installed this year on 115 million notebooks, half of total shipments, and 63 million desktop PCs, or 45 percent of the total, according to analysts."


172 comments

While APU's definitely have their place... (-1, Offtopic)

atari2600a (1892574) | more than 3 years ago | (#35527494)

They also said 2011 would be the year of Linux on the desktop!

Re:While APU's definitely have their place... (1)

Anonymous Coward | more than 3 years ago | (#35527524)

Nope, that was 1911.

Re:While APU's definitely have their place... (-1)

Anonymous Coward | more than 3 years ago | (#35527602)

that won't happen until there's a GPU with built-in central processing unit. Probably, oh, 2012, 2013 in there.

Re:While APU's definitely have their place... (-1)

Anonymous Coward | more than 3 years ago | (#35527668)

They also said 2011 would be the year of Linux on the desktop!

They also said that Linux is still for faggots.

Re:While APU's definitely have their place... (2)

peragrin (659227) | more than 3 years ago | (#35527884)

It may not be the year of Linux on the Desktop but it WILL be the Year of the Linux on your Mobile.

Re:While APU's definitely have their place... (0)

cashman73 (855518) | more than 3 years ago | (#35528968)

They also said 2011 would be the year of Linux on the desktop!

Well, on the bright side, 2011 is going to be the year of Duke Nukem Forever, so there's hope for Linux on the Desktop! ;-)

But not for workstation laptops (1)

marcel (6435) | more than 3 years ago | (#35527500)

I tried looking for a Sandy Bridge laptop with a 15" screen showing 1920x1200 resolution using built-in graphics, but it seems vendors now treat a power-slurping external GPU as a luxury you must have if you want a decent screen. I don't game, nor do I have any need for CAD/CAM-like applications; I just need a decent resolution/dpi on my laptop, and integrated graphics would make the machine cheaper and less power-hungry, so ideal for developing. Alas, I will probably end up with some Quadro or other high-end GPU just because I want a normal screen.

Re:But not for workstation laptops (1)

bemymonkey (1244086) | more than 3 years ago | (#35527560)

Optimus, or some other form of switchable graphics.

Re:But not for workstation laptops (1)

marcel (6435) | more than 3 years ago | (#35527820)

The point being that I will pay extra for a GPU that I will not use. I want less hardware, not more, and to lose the complexity too. Drop the second graphics chip and the price along with it, instead of including a pricey extra graphics chip and increasing the price even more just so I can switch it off.

Re:But not for workstation laptops (1)

Joce640k (829181) | more than 3 years ago | (#35528054)

Yep. I often need a machine with powerful CPU but don't care about the graphics. It seems like NOBODY makes one. Laptops are a bummer because you can't build your own.

Re:But not for workstation laptops (2)

Auroch (1403671) | more than 3 years ago | (#35528180)

Yep. I often need a machine with powerful CPU but don't care about the graphics. It seems like NOBODY makes one. Laptops are a bummer because you can't build your own.

Really? Because you just described the entire Apple PC lineup.

Also, if you can't find it, it sounds like you're not looking. Or only looking at Best Buy and Future Shop. MSI has many models, Sony and HP do custom-to-order (CTO), and Asus has so many options I'm surprised they don't confuse customers out of a sale.

It really sounds like you are surprised to find that budget laptops don't come with premium features, like full HD 1080p screens. So hit up Dell and configure one.

Re:But not for workstation laptops (1)

Joce640k (829181) | more than 3 years ago | (#35528982)

The point being ... we want to pay less. Also, some of us want to run Windows.

"Custom to order" usually isn't as flexible as you might imagine. Go to those websites and try to get a machine with a good CPU and 'bad' graphics... or a machine with 8GB RAM and 'bad' graphics (which I tried to do a couple of months ago). I don't need graphics; I've got a pile of graphics cards here and don't need to pay for another one. None of the sites I tried could do that, despite offering "configure it any way you want!" in their adverts.

Macs using integrated and discrete GPUs ... (1)

perpenso (1613749) | more than 3 years ago | (#35527708)

Something like the MacBook Pro where there is basic graphics integrated into the CPU (favors power consumption) and optional high end graphics available from a discrete GPU (favors performance)?

"Energy-efficient graphics.
Thanks to the new microarchitecture, the graphics processor is on the same chip as the central processor and has direct access to L3 cache. That proximity translates into performance. The graphics processor also automatically increases clock speeds for higher workloads. An integrated video encoder enables HD video calls with FaceTime, while an efficient decoder gives you long battery life when you’re watching DVDs or iTunes movies.
Up to 3x quicker on the draw. And the render.
When you need more performance for things like playing 3D games, editing HD video, or even running CAD software, the 15- and 17-inch MacBook Pro models automatically switch to discrete AMD Radeon graphics that let you see more frames per second and experience better responsiveness. With up to 1GB of dedicated GDDR5 video memory, these processors provide up to 3x faster performance than the previous generation."
http://www.apple.com/macbookpro/performance.html [apple.com]

Re:Macs using integrated and discrete GPUs ... (-1)

Anonymous Coward | more than 3 years ago | (#35527896)

Something like the MacBook Pro where there is basic graphics integrated into the CPU (favors power consumption) and optional high end graphics available from a discrete GPU (favors performance)?

No, nobody thinks that you Apple fagboy. Get back to your cappuccino or whatever you fags drink.

Re:Macs using integrated and discrete GPUs ... (1)

Joce640k (829181) | more than 3 years ago | (#35529038)

Something like the MacBook Pro where there is basic graphics integrated into the CPU

The parent said "would make it cheaper"

Re:But not for workstation laptops (1)

somersault (912633) | more than 3 years ago | (#35527758)

The Dell XPS 15" has a 1920x1080 option and decent graphics capabilities, with nVidia Optimus which apparently switches between low power and full graphics mode depending on your usage.

Not quite 1200, but for a 15" widescreen, I think any res over 1680x1050 is going to be equivalent, since you'll have to increase font sizes anyway (unless you have some very good eyesight or hunch really close to the screen).

Re:But not for workstation laptops (1)

Nutria (679911) | more than 3 years ago | (#35528114)

nVidia Optimus

Doesn't work with Linux...

Re:But not for workstation laptops (0)

Auroch (1403671) | more than 3 years ago | (#35528190)

nVidia Optimus

Doesn't work with Linux...

What's a linux? Is that the OS that took the desktop world by storm several years ago?

... I guess no one noticed. And IIRC, most switchable graphics can be used, but require a reboot (can't be hot switched). Which is still the case for many windows switchable graphics (see: sony).

Re:But not for workstation laptops (1)

marcosdumay (620877) | more than 3 years ago | (#35529296)

"What's a linux? Is that the OS that took the desktop world by storm several years ago?"

It is that OS that people use when they want something more than a toy or text editing.

softwars over? is there money in eugenics too? (-1)

Anonymous Coward | more than 3 years ago | (#35527508)

does it never end? not so long as there's a subscription/per use, gig somewhere? 'we didn't get those penguinistias (yet), butt we'll take their spawn'. more steaming video.

the georgia stone remains uneditable? gad zooks. are there no chisels?

400 year 'shadow rulers' leaving us? (-1)

Anonymous Coward | more than 3 years ago | (#35527542)

one way or another. the time for them to go, is now. the 'complexity' disappears upon minimal observation & introspection (what fear causes/allows). see you at the play-dates etc...

don't forget apathy, for which we are medicated (-1)

Anonymous Coward | more than 3 years ago | (#35527692)

which tends to make us even less functional & wrecks our livers etc.., so that's good/working?

which leaves out the part about that there really is important stuff that matters, unhappy stuff that's not being attended to. & we're well equipped to address the real situations, as opposed to the media induced 'mood', or media induced 'sentiment'. unless...

Great but I think that it will heat like Hell!!!! (0)

Anonymous Coward | more than 3 years ago | (#35527556)

It was about time!!! I've spent ages waiting for that. I have one question, though: what about its power consumption and heat dissipation? No doubt engineers will have to rethink the cooling scheme for these processors.

Re:Great but I think that it will heat like Hell!! (1)

Joce640k (829181) | more than 3 years ago | (#35528060)

Having a low power graphics chip generates more heat??

Re:Great but I think that it will heat like Hell!! (1)

Auroch (1403671) | more than 3 years ago | (#35528198)

Having a low power graphics chip generates more heat??

It's only low powered when compared to other discrete graphics solutions, not when compared to a bare CPU lacking any integrated GPU function. Yes, they are low power. But only in relative terms, not in absolute ones.

Re:Great but I think that it will heat like Hell!! (1)

realityimpaired (1668397) | more than 3 years ago | (#35528838)

And I have an Atom N270-based netbook that, probably 90% of the time, is on passive cooling. It *has* a fan, but that fan is almost never on when all I'm doing is surfing the web, writing a document, or working in a spreadsheet.

Yes, they're going to increase power consumption a little, but you're forgetting how low the power and heat requirements are for these devices in the first place. If the fan has to run 20% of the time instead of 10%, maybe it shortens the battery life by 30 minutes total, over the course of the 6-hour life the battery currently has? And that's a 2-year-old netbook (Dell Mini 9)... batteries and CPUs have gotten better in the meantime. That's not even considering the power savings from not having to power a separate (Intel X3100) graphics chip... I could actually see a noticeable *increase* in battery life thanks to this.

And the advantage is...? (0)

Anonymous Coward | more than 3 years ago | (#35527576)

So there will be more computers with crappy integrated graphics. Hopefully, it will still be possible to upgrade them with a decent graphics card.

Oh, and btw, wasn't the plan until recently to basically replace the CPU with the GPU? I'm confused...

Re:And the advantage is...? (2)

somersault (912633) | more than 3 years ago | (#35527784)

So there will be more computers with crappy integrated graphics. Hopefully, it will still be possible to upgrade them with a decent graphics card.

Yes, and yes.

Oh, and btw, wasn't the plan until recently to basically replace the CPU with the GPU? I'm confused...

No. Graphics Processing Units make very poor Central Processing Units. GPUs work nicely to augment CPUs when doing specialised calculations (encryption, video encoding, physics, etc.) that would take the CPU a long time to do on its own, but there are no plans to replace CPUs with GPUs.
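To make the "augment" part concrete, here's a minimal CUDA-style sketch of the kind of specialised, data-parallel job that gets handed to the GPU while the CPU keeps running the rest of the program. The kernel, buffer size and names are illustrative assumptions only, not from TFA or any particular product:

    // Hypothetical sketch: the CPU launches a data-parallel kernel on the GPU,
    // then carries on with the serial parts of the program.
    #include <cuda_runtime.h>

    __global__ void scale(float *data, float factor, int n) {
        int i = blockIdx.x * blockDim.x + threadIdx.x;   // one GPU thread per element
        if (i < n) data[i] *= factor;
    }

    int main(void) {
        const int n = 1 << 20;                        // ~1M floats, arbitrary example size
        float *d;
        cudaMalloc(&d, n * sizeof(float));            // buffer in GPU memory
        cudaMemset(d, 0, n * sizeof(float));
        scale<<<(n + 255) / 256, 256>>>(d, 2.0f, n);  // thousands of threads run in parallel
        cudaDeviceSynchronize();                      // CPU waits here, or could do other work
        cudaFree(d);
        return 0;
    }

Encryption, video encoding and physics map well onto that pattern; the branchy, stateful rest of a program does not, which is why the GPU augments the CPU rather than replacing it.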

Re:And the advantage is...? (3, Informative)

petermgreen (876956) | more than 3 years ago | (#35528310)

And the advantage is...?

The advantage of shared memory graphics is reduced cost and power consumption.
The advantage of integrating the memory controller in the CPU is it allows the CPU faster access to memory.
The advantage of reducing the number of high speed chips is reduced cost and power consumption.

So with that in mind lets consider the options for a CPU with an integrated memory controller.

Putting the shared-memory graphics on a separate chip would require a link to the CPU that offered high-speed, high-priority RAM access by the GPU, and would still leave you with two high-speed chips. AMD does this with HyperTransport, though IIRC they usually have a small amount of dedicated graphics memory as well to keep the framebuffer traffic off the HyperTransport links.

Not offering shared-memory graphics at all rules a platform out of the low-end market and makes it less than ideal for the business market in general. Intel did this with the Nehalem quad- and hex-core processors, and I believe is planning to do the same with the LGA2011 high-end Sandy Bridge chips.

So the natural thing to do is to put the shared-memory graphics on the CPU with the memory controller. Intel did this with the dual-core Nehalem chips and with the LGA1155 mainstream Sandy Bridge chips.

So there will be more computers with crappy integrated graphics.

Probably a few more, because there were no Nehalem quad cores with integrated graphics support. So if you wanted a fast quad core you pretty much had to have discrete graphics as well, whether you wanted it or not.

Practically speaking, Sandy Bridge puts things pretty much back the way they were before, with the choice of processor core count decoupled from whether to use integrated graphics. It's just that those integrated graphics are in the CPU rather than the northbridge. Hopefully this will mean the likes of Dell will finally migrate off LGA775.

Oh, and btw, wasn't the plan until recently to basically replace the CPU with the GPU?

GPUs are great at some types of calculation but suck at branch-heavy code, so many algorithms have to be completely redesigned to run on them. IIRC, in the case of video encoding, GPUs can do it quicker but only using cut-down encoders that produce lower-quality results.

AMD was at one point planning to make units that combined the best of both (note: the Fusion name, which originally referred to this, is now being used to refer to CPUs and GPUs on the same die but logically separate). Dunno if they still are.

Supercomputing (4, Interesting)

louic (1841824) | more than 3 years ago | (#35527592)

Depending on what exactly these processors look like, they may be very interesting for speeding up scientific computations. The fastest computer in the world at this moment is already GPU-based, and such a CPU/GPU hybrid could be even more efficient by removing the slow communication between CPU and GPU.
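For what it's worth, the "slow communication" is easy to put a number on with a discrete card. A hypothetical timing sketch like the one below (the 256 MB payload and all names are illustrative only) measures the PCIe copy that a fused CPU/GPU sharing one memory would not need at all:

    // Hypothetical sketch: time a host-to-device copy over PCIe.
    // On a fused CPU/GPU design this copy, and its latency, simply goes away.
    #include <cuda_runtime.h>
    #include <stdio.h>
    #include <stdlib.h>

    int main(void) {
        const size_t bytes = 256u << 20;                 // 256 MB example payload
        float *host = (float *)calloc(bytes, 1);         // pageable host memory
        float *dev;
        cudaMalloc(&dev, bytes);

        cudaEvent_t t0, t1;
        cudaEventCreate(&t0);
        cudaEventCreate(&t1);
        cudaEventRecord(t0, 0);
        cudaMemcpy(dev, host, bytes, cudaMemcpyHostToDevice);  // the trip across the bus
        cudaEventRecord(t1, 0);
        cudaEventSynchronize(t1);

        float ms = 0.0f;
        cudaEventElapsedTime(&ms, t0, t1);
        printf("copied %zu MB in %.1f ms (%.1f GB/s)\n",
               bytes >> 20, ms, (bytes / 1e9) / (ms / 1e3));

        cudaFree(dev);
        free(host);
        return 0;
    }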

Re:Supercomputing (2)

nzac (1822298) | more than 3 years ago | (#35527676)

The point of a GPU supercomputer is to have a lot of cores working at a slow speed; most GPUs in the hybrids only have a small number of cores (mine has 80). The point of the hybrids is to be able to include low-power graphics without the need for extra hardware, thus reducing cost.

GPU clusters, or just standalone GPUs, want as many cores as possible relative to the rest of the machine. To achieve this effectively you want to buy a somewhat bare-bones system and stick some cost-effective high-end GPUs, with cooling, in it.

Buying hybrids would mean that as you pay more you would likely get an improved GPU and CPU, when the extra CPU is redundant. Plus it makes it much harder to keep both cool.

Re:Supercomputing (0)

Anonymous Coward | more than 3 years ago | (#35528372)

The GPU in Intel's i5 sucks. As in, even with the reduced CPU/GPU latency, it provides performance equivalent to a mid-range, 3 year old graphics card. Fine for desktop applications, web browsing, and email. Not useful for video processing and definitely none of the computational advantages of putting a high-speed, massively parallel GPU on a PCI-express bus.

Re:Supercomputing (1)

CastrTroy (595695) | more than 3 years ago | (#35528756)

It used to be the case on the 386 that the FPU was a separate chip you could get and plug into the motherboard, much like the GPUs of today. It seems obvious that it would simplify a lot of things to just put the GPU directly on the processor.

Re:Supercomputing (1)

blahplusplus (757119) | more than 3 years ago | (#35528834)

The biggest problem is memory bandwidth. GPUs are fast because of their high throughput; the problem is that CPUs won't ever have enough memory on die to keep up, despite the faster communication. It's a trade-off. I remember Mark Rein of Epic Games saying on-die GPUs would kill video cards, but they never did, because most people don't understand that performance is about trade-offs.
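To put rough numbers on that bandwidth gap, here is an illustrative comparison assuming dual-channel DDR3-1333 system RAM on the integrated side and a mid-range card with a 128-bit GDDR5 bus at 4.8 Gbps effective on the discrete side (both are assumed configurations, not figures from the post):

    /* Illustrative peak-bandwidth comparison: shared system RAM vs. dedicated GDDR5. */
    #include <stdio.h>

    int main(void) {
        /* Dual-channel DDR3-1333: 2 channels x 8 bytes x 1.333 GT/s */
        double system_gbs = 2 * 8 * 1.333;         /* ~21.3 GB/s, shared with the CPU */
        /* Assumed mid-range discrete card: 128-bit bus (16 bytes) x 4.8 Gbps effective */
        double discrete_gbs = (128 / 8) * 4.8;     /* ~76.8 GB/s, all for the GPU */
        printf("integrated (shared DDR3): ~%.1f GB/s\n", system_gbs);
        printf("discrete (GDDR5 card):    ~%.1f GB/s\n", discrete_gbs);
        return 0;
    }

Whatever exact parts you plug in, the shared-memory side is both smaller and contended by the CPU, which is the trade-off being described.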

A GPU by any other name would render as slowly (1)

dicobalt (1536225) | more than 3 years ago | (#35527610)

Integrated graphics is still integrated graphics. They are still slow, still useless for games (unless you are a self masochist), and still nothing impressive. Effort would be better placed in producing an open system and standards for coding to the graphics hardware directly, instead of using flabby, cycle-hogging APIs. That's why pathetic hardware like the Xbox 360 can do so much with so little. Think of the days of DOS and Commodore 4K graphics demos.

Re:A GPU by any other name would render as slowly (2)

EzInKy (115248) | more than 3 years ago | (#35527646)

Other than for a few hard-core gamers and graphic artists, discrete graphics cards are a total waste of money for most people.

Re:A GPU by any other name would render as slowly (2)

giuseppemag (1100721) | more than 3 years ago | (#35527706)

No GPGPU, no accelerated desktop, maybe even problems with higher resolutions. Seems like a problem dressed as a solution to me.

Re:A GPU by any other name would render as slowly (0)

EzInKy (115248) | more than 3 years ago | (#35527860)

Personally I despise GPU-accelerated desktops; all I want is a link to an app that, when clicked, launches an app that performs the functions I require. Anyone who needs more should pay a premium for the added eye candy.

Re:A GPU by any other name would render as slowly (2)

Junta (36770) | more than 3 years ago | (#35528132)

Not all GPU-accelerated desktop functionality is 'fluff', for example Exposé, Compiz Scale, and KDE Present Windows (particularly the latter with window title search). When I have many windows open, it's a vastly superior way to find what I need than anything else. It could have been done without graphics acceleration, but it's the easiest way to get the largest possible previews of the results of your search.

Re:A GPU by any other name would render as slowly (0)

Anonymous Coward | more than 3 years ago | (#35528208)

Agreed.

Back in my day, we didn't need any fancy 'colors', green text on a black background worked just fine. Plus there was no bullshit about having to 'click' on an 'app', we just typed in its name, and then continued to interact with it without having to alter hand positioning.

I despise all this progress. Anyone who thinks they need more than 64k of memory is just living in a fantasy world.
 

Re:A GPU by any other name would render as slowly (1)

Kjella (173770) | more than 3 years ago | (#35527922)

GPGPU no, but then most users don't do anything computing intense, CPU or GPU. Integrated chipsets handle simple desktop effects quite fine, your FUD is out of date. Problems with higher resolutions? What is this, the 90s?

Re:A GPU by any other name would render as slowly (3, Insightful)

giuseppemag (1100721) | more than 3 years ago | (#35528004)

First of all where would this be FUD? Try connecting a full HD monitor to an integrated Intel GPU and you'll see what I meant.

Also, this bullshit that users don't do computing intense stuff is, well, bullshit. Full HD video, 3D movies, photo processing are computationally intensive even if they are not particularly serious usage of computing power. Don't confuse "important work" with "computationally intensive work".

Re:A GPU by any other name would render as slowly (1)

PeterKraus (1244558) | more than 3 years ago | (#35528042)

Oh right. I'm using AMD HD 4200 (while not on-cpu graphics, it's still an IGP), on 1920x1080 via DVI, using radeon (ie FOSS) linux driver, without any problem on movie playback.

Okay, it can't handle playing that fancy scene with all the birds flying fluently unless I launch it via commandline mplayer (it doesn't even start using smplayer and it's so-so with VLC), but for a £50 motherboard with the GPU included, FML.

Re:A GPU by any other name would render as slowly (1)

CajunArson (465943) | more than 3 years ago | (#35528048)

Full HD video --> Works just fine on my 2008 era Intel laptop with integrated video... and I'm using it to drive a 1080p display to show ripped Blu-Rays at full definition on top of a composited KDE desktop.

Re:A GPU by any other name would render as slowly (1)

mozumder (178398) | more than 3 years ago | (#35528126)

You don't need a discrete GPU for any of that. They're not even computationally intensive.

Computationally intensive = 4 hours for a simulation.

Re:A GPU by any other name would render as slowly (1)

Corporate Troll (537873) | more than 3 years ago | (#35528192)

An Atom 330 with ION chipset (NVidia 9400M, aka integrated graphics) handles Full HD just fine.

Re:A GPU by any other name would render as slowly (1)

giuseppemag (1100721) | more than 3 years ago | (#35528238)

Clearly if they do it like ION then there is nothing to fear. If it is another shitty Intel integrated or something like the ancient Radeon IGP then God, please, humanity has been punished enough!

Re:A GPU by any other name would render as slowly (1)

Bengie (1121981) | more than 3 years ago | (#35528262)

An Intel Sandy Bridge on integrated graphics is quite snappy. I think you're a generation old on complaining. Next year's Ivy Bridge GPUs will be much stronger and should handle 1080p at ~30fps on most current games. Just make sure you set the quality to low. The current Intel integrated GPUs are getting 45fps in WoW at medium and can even beat nVidia 480 at converting videos.

Re:A GPU by any other name would render as slowly (1)

Kjella (173770) | more than 3 years ago | (#35528960)

No, full HD video is not particularly computing-intense with dedicated hardware. The Intel Core i5-2500K decodes 5 simultaneous 1080p streams [anandtech.com] according to AnandTech. Hell, it even has HDMI 1.4a and 3D support if you're into that; this is "integrated" performance in 2011. I don't know how intensive Photoshop with a thousand layers can get, but simple touch-ups of photos certainly do fine without a discrete GPU.

Re:A GPU by any other name would render as slowly (1)

zach_the_lizard (1317619) | more than 3 years ago | (#35529300)

I haven't used Photoshop since about 2005-2006, but I don't recall it ever using the GPU. Has this changed?

Re:A GPU by any other name would render as slowly (1)

realityimpaired (1668397) | more than 3 years ago | (#35529192)

The FUD would be the anecdotal evidence that many have already provided that you're completely and utterly wrong.

And to add to the anecdotal evidence: I have a 2-year old netbook which is able to handle 1920x1080p through its VGA out port, with a composited desktop and full motion video at that resolution. While I don't use it for intensive gaming, I have played Civ4 and WoW on the netbook at that resolution. While it's not exactly an ideal configuration, both games are playable. When I'm typing up a document in Word, or surfing the web, or working in a spreadsheet, it's plenty adequate. It's even fine playing full screen HD video from Youtube or editing a photo in GIMP.

The relevant specs of the netbook are as follows:
Atom N270 (dual core @ 1.6GHz)
2GB DDR2 memory
Intel X3100 graphics
64GB SSD
OS: Win7 Ultimate

I use a significantly higher end laptop (Core i7 quad, 4x as much RAM, and a Radeon HD 4870 1GB) for my regular gaming, but the netbook certainly would meet the needs of regular users. Hook it up to a larger external screen, and it's perfect for them. (though when you can buy a 15" laptop for $300 at Wal*Mart, I wouldn't recommend a netbook any more).

(and if you disbelieve me on the configuration, I will happily make a CPU-Z full report available to you. The only modification I will make is to remove the serial number of my netbook from the report)

Re:A GPU by any other name would render as slowly (1)

royallthefourth (1564389) | more than 3 years ago | (#35528596)

Apparently you have not used any recent integrated GPU's from AMD. My integrated Radeon 4250 can play Left4Dead 2 at a fine framerate with the settings adjusted appropriately. Accelerated desktops are light duty work. I don't think it'll do OpenCL right now, but AMD is serious about making their integrated graphics better than the barely usable stuff they've been pushing out before; I'm sure future iterations will do all of these things even better than current ones do.

Re:A GPU by any other name would render as slowly (1)

giuseppemag (1100721) | more than 3 years ago | (#35528952)

Actually I am writing this message on an integrated Radeon, and I am quite happy with it. My fear is that further reduction of power could mean a return to the past, and I've been there and care not to go back :)

Re:A GPU by any other name would render as slowly (2)

Corporate Troll (537873) | more than 3 years ago | (#35527722)

I realize that integrated graphics are sub-par, but to say they're useless for games unless you're a masochist (the "self" is redundant...) is overstating it a bit. Many of us non-gamers do like to play a game from time to time, but we don't want to spend ourselves into bankruptcy. Guess what? This means we buy older games (cheaper!), and in my experience today's integrated graphics (also cheaper!) handle older games perfectly fine.

Re:A GPU by any other name would render as slowly (4, Insightful)

Luckyo (1726890) | more than 3 years ago | (#35527940)

I can actually chime in on this on the "this is not true" side. My father isn't a gamer by any stretch; the only games he likes to play are various Arkanoid derivatives, which meant that his work laptop served him just fine.
Then came Shatter, and he all but killed me with his "why won't my laptop run this?" questions. Try explaining to someone running the crappy Intel 945GM, which always ran the old 2D Arkanoid clones, that Shatter just won't work on it.

So now I'm probably giving them my current gaming computer as I upgrade, and I'm pretty sure he'll be telling tech support at work that his next laptop had better include 3D acceleration or else (he's in a position to be able to tell them that). So the old saying applies here: you'll be satisfied with integrated graphics until along comes one killer application that it won't run, and then you aren't. Problem is, with so much software requiring decent 3D graphics on board (even Aero does!), you're still best served by a half-decent dedicated graphics card that powers itself down when 3D features aren't used, or are used sparingly.

Finally, there's an issue of quality, and that goes beyond 3D. Most integrated chipsets have clear problems displaying higher resolutions, which is why high-resolution laptops generally have a dedicated chipset rather than an integrated solution.

Re:A GPU by any other name would render as slowly (2)

Corporate Troll (537873) | more than 3 years ago | (#35528174)

Ehm...

I just checked. Shatter was released in 2010(!) for Windows. The integrated graphics you mention were released in January 2006. Go and read my comment again: I said older games on modern-day integrated graphics. I'm pretty sure Shatter will work perfectly fine on my wife's ATI Radeon HD 5750 (iMac bought in fall 2010)... which are the integrated graphics sold these days. Will Shatter work on my 2007 laptop? Can't say, because I can't find Shatter's system requirements. However, the ATI Radeon Xpress 1100 (which is crappy) runs Portal decently. Portal, however, is a 2007 video game (based on the Half-Life 2 engine, released in 2004).

You pretty much INVERTED my statement: use modern-day games on old integrated graphics.

I hope you see a problem with that...

Furthermore: nobody needs Aero... It's a useless gimmick, and the specs are insanely high for something Compiz can do on a GeForce2 MX.

As for the high-resolution comment: I've driven a 1280x1024 screen without a problem with my Asus Eee PC 701 4G. I fail to see how rendering a desktop is hard for those integrated graphics. Perhaps when running a GAME at those resolutions, but nobody in his right mind would do that. Back in the early days of graphics cards, what interested you most was how much memory a card had, because from that you could calculate how high a resolution it would support. Those times are over. Integrated graphics drive two monitors at 1440x900 each just fine. I see that every day at work.
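For anyone who never did that old calculation: it was just width x height x bytes per pixel against the card's video RAM. A quick worked sketch, with the card sizes and resolutions picked purely for illustration:

    /* Worked example of the old "will this resolution fit in video RAM?" arithmetic. */
    #include <stdio.h>

    int main(void) {
        const double mib = 1024.0 * 1024.0;
        /* A 1 MB card of the era: 1024x768 fits at 8-bit colour, but not at 16-bit. */
        printf("1024x768  @  8bpp = %.2f MB\n", 1024 * 768 * 1 / mib);   /* ~0.75 MB */
        printf("1024x768  @ 16bpp = %.2f MB\n", 1024 * 768 * 2 / mib);   /* ~1.50 MB */
        /* Today a single 1920x1080 32-bit framebuffer is only ~8 MB, which is why
           raw desktop resolution stopped being the limiting factor long ago. */
        printf("1920x1080 @ 32bpp = %.2f MB\n", 1920 * 1080 * 4 / mib);  /* ~7.91 MB */
        return 0;
    }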

Re:A GPU by any other name would render as slowly (1)

Luckyo (1726890) | more than 3 years ago | (#35528266)

Shatter's system requirements were on the level of a 2005 computer at best, which is my point. Specifically, my older computer, bought in 2005, ran it fine on max settings with a barely passable graphics card.

As for high resolution, 1280x1024 hasn't been "high" for a decade at least. The high resolutions nowadays start at around 1900x1200 and go up from there. My 1680x1050 is average at best nowadays, and in games tends to be the lowest benchmarked resolution.

Re:A GPU by any other name would render as slowly (1)

Corporate Troll (537873) | more than 3 years ago | (#35528432)

The high resolutions you talk about are absolutely non-typical for the range of laptops that do have integrated graphics. Go to your local geek provider and check. Laptops seem to hover around 1366x768. The typical stand-alone screen is 1920x1080, simply because of economies of scale on HDTVs. My mom's computer (single-core AMD Athlon 64 2800+, 2GB RAM, and... a GeForce 4400MX, which is for all intents and purposes no better than any IGP these days) drives a 23" 1920x1080 screen just fine.

On the consumer end, the standalone screens I see most are 1920x1080 and 1440x900. That's for NEW systems or people who replaced a screen that crapped out.

More to the point: my wife's iMac is powered by integrated graphics (pretty much everything from Apple is) and drives a 2560x1600 screen just fine... Incidentally, that resolution is the highest one I can find in my preferred online store, at a price most consumers wouldn't pay. So "start at around 1900x1200" sounds, let's just say, a bit exaggerated.

Might it just be that your father's laptop had other issues? It wouldn't be the first time I've seen people complaining about something not working right when in the end it's either Windows that's really in a bad state, not enough RAM (a 2005 laptop? 512MB RAM... might be an issue, considering Windows XP SP3 pretty much requires 512MB RAM; swapping takes an insane toll), or a failing hard disk. (Failing hard disk => I had that in my laptop... Ubuntu notified me about it... Replacing the HDD with another one, and suddenly the machine was very responsive again.) Finally, even BIOS settings might influence it. In my laptop, there was a "Save Battery" option, disabled by default. I thought it was perhaps a good idea to turn it on. Turns out it basically locked the CPU at 800MHz. Oops!

Re:A GPU by any other name would render as slowly (1)

Luckyo (1726890) | more than 3 years ago | (#35529238)

My father had a pre-2000 laptop with a native resolution of 1600x1200. From work, with an (IIRC) 14" or smaller screen, from Dell's business line of all places. You must be looking at very cheap, low-end consumer crap that goes for the lowest common denominator.

And on the topic of Shatter, no. I spent several hours trying to make the damn thing work, down to trying a couple of hacks. Nothing worked. The 945GM's implementation of shaders is simply so horrendously bad that it doesn't work. Google was filled with that complaint back when I searched, too. In a nutshell, Intel's 945GM is a crappy chipset for anything with shaders.

Re:A GPU by any other name would render as slowly (1)

VGPowerlord (621254) | more than 3 years ago | (#35529242)

More to the point: my wife's iMac is powered by integrated graphics (pretty much everything from Apple is)

You're wrong.

Speaking as someone who was checking the specs on Apple products last year, the only Apple products that used integrated graphics then were the Mac Mini, Macbook, and Macbook Pro.

The iMac and Mac Pro all use discrete AMD/nVidia cards/processors.

The current iMac models [apple.com] have a choice of ATI Radeon HD 4670 256MB, ATI Radeon HD 5670 512MB, or ATI Radeon HD 5750 1GB.

Re:A GPU by any other name would render as slowly (1)

mordred99 (895063) | more than 3 years ago | (#35529258)

Now, 1280x1024 might not be "high" as in bleeding edge; however, it is still the preferred choice for the majority of companies buying PCs for their employees. If someone only runs to Best Buy to look at monitors, then fine. Most corporate environments have their PCs in a 4:3 ratio, and 1280x1024 is a good size for a 19" monitor. Anyone doing spreadsheets and Excel/Word all day should be fine with that. They are cheap and they work. So what you are doing is making your argument for approximately 50% or less of the market right now.

Re:A GPU by any other name would render as slowly (2)

hairyfeet (841228) | more than 3 years ago | (#35527744)

Actually the AMD IGPs aren't half bad. Sure, they'll never beat a discrete card, but while I was waiting for my discrete card to show up I was playing BioShock, F.E.A.R., SWAT 4, L4D, pretty much anything I wanted, and that was with last year's 4250 onboard. The new APU chips have an HD 6xxx IIRC, anywhere from an HD 6250 to an HD 6550 depending on the chip.

And the reason nobody allows you to write directly to hardware is that we already tried that back in the days of DOS. What you ended up with was a single bit of buggy code being able to take the whole system down with VERY little trouble. Now add in the malware that would be written to try to hide in the GPU, and you'd have a mess.

The reason you can do that with an X360 is that its OS is "DRM... in a box", which means all code is approved by MSFT, and no approval? No run. Personally I'd rather NOT have my desktop programs need a seal of approval from MSFT, thank you VERY much. There is a cost to everything, and the cost of having the freedom to run what you want is having to use abstraction, since the OS has to have a way to control the program.

But don't worry I have a feeling you may get to try it that way if you switch to Apple, as I have a feeling OSX will end up being replaced by iOS and the app store. Then you'll only run what Apple approves of and I'm sure that having that level of control will make for faster access. Personally I'll take freedom over speed, thanks anyway.

Re:A GPU by any other name would render as slowly (0)

Auroch (1403671) | more than 3 years ago | (#35528218)

Personally I'll take freedom over speed, thanks anyway.

That's a false dichotomy. It isn't one or the other. You could have both freedom and speed, but lack reliability, for example.

Re:A GPU by any other name would render as slowly (1)

Inda (580031) | more than 3 years ago | (#35527756)

The integrated graphics on my HP notebook are plenty fast enough to play CIV5.

All these new HTML5 demos run at above 50fps. More than enough.

The HDMI port outputs full 1080p to my plasma TV.

A sibling poster is correct - no one really cares about how many FLOPS a GFX card can handle.

Re:A GPU by any other name would render as slowly (1)

somersault (912633) | more than 3 years ago | (#35527834)

What the hell are you talking about? The Xbox 360 uses DirectX just the same as Windows.

If you could change graphics settings on consoles the same as PCs, you'd probably notice the difference, but I'd assume that playing games on a console is often the equivalent of using "medium" settings on a PC. I say this as someone with both a PS3 and 360, not trying to say that the consoles are inferior in terms of gameplay, just that obviously a modern day PC is going to kick their ass. That's how things work. Consoles come out with competitive graphics, PCs quickly overtake them in capabilities.

Re:A GPU by any other name would render as slowly (1)

drinkypoo (153816) | more than 3 years ago | (#35528534)

What the hell are you talking about? The Xbox 360 uses DirectX just the same as Windows.

It doesn't use Windows the same as Windows, though; the Xbox 360 OS is based on the Xbox OS which is based on Windows 2000. But it has almost none of the OS present... Which is why you need a quad-core to play Grand Theft Auto when the Xbox 360 has only three. OS overhead.

Re:A GPU by any other name would render as slowly (1)

somersault (912633) | more than 3 years ago | (#35528802)

Still, it's a far cry from banging directly on the hardware.. they could be doing even more on the Xbox if they were allowed to do that.

Re:A GPU by any other name would render as slowly (1)

drinkypoo (153816) | more than 3 years ago | (#35529074)

Still, it's a far cry from banging directly on the hardware.. they could be doing even more on the Xbox if they were allowed to do that.

DirectX is frankly close enough. The version of DirectX was bumped for both systems to permit DirectX developers to take better advantage of the hardware.

Re:A GPU by any other name would render as slowly (0)

Anonymous Coward | more than 3 years ago | (#35527862)

AMD Fusion supports DX11 and OpenCL. The lowest-end 9W chip has 80 unified shaders running at 280MHz. The 18W chip clocks at 500MHz. That's definitely powerful enough to run modern shader-driven games. My back-of-the-envelope calculations tell me that the 9W version is at least as powerful as a low-end Nvidia 400-series or ATI 5000-series (rough math worked through below). Yes, you're going to be fill-rate limited, but guess what: laptops don't need 200 fps at 1080p. 30 fps at 480p upscaled to native will be very playable, thank you very much.

The only "problem" I can see is that there's no mention of which OpenGL versions are supported.
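Working through that envelope math: take the shader counts and clocks quoted in the post, and assume that each shader retires one multiply-add (2 FLOPs) per clock and that the 18W part keeps the same 80 shaders; both of those are assumptions on my part, not figures from the post:

    /* Back-of-the-envelope peak-GFLOPS check using the figures quoted above.         */
    /* Assumptions: 2 FLOPs (one MAD) per shader per clock; 80 shaders on both parts. */
    #include <stdio.h>

    int main(void) {
        const double shaders = 80.0;
        const double flops_per_clock = 2.0;   /* assumed multiply-add per shader */
        printf("9W  part: ~%.1f GFLOPS\n", shaders * 0.28 * flops_per_clock);  /* ~44.8 */
        printf("18W part: ~%.1f GFLOPS\n", shaders * 0.50 * flops_per_clock);  /* ~80.0 */
        return 0;
    }

The per-clock assumption is the soft part; change it and the comparison with low-end discrete cards moves accordingly.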

Re:A GPU by any other name would render as slowly (3, Insightful)

Auroch (1403671) | more than 3 years ago | (#35528230)

My back-of-the envelope calculations tell me that the 9W version is at least as powerful as a low-end Nvidia 400-series or ATI 5000-series

My back of the envelope memory tells me that all low end 400 series and low end 5000 series "graphics" are actually IGPs as well...

Re:A GPU by any other name would render as slowly (1)

semi-extrinsic (1997002) | more than 3 years ago | (#35527906)

A lot of it also has to do with an unnecessary demand for ultra-high resolution. Some of my all-time favorite games, like Need for Speed 5 or the space-battle-tank-fighter Battlezone, are playable on a netbook with integrated graphics while run through Wine. I still prefer them to, e.g., the latest Need for Speed, where content and playability have been sacrificed on the altar of cartoon realism and HD/HDR graphics. Hey, EA: I don't enjoy games more if they are ultra-high-def, I enjoy them more if they are fun and challenging to play!

soapbox->get_off(me);

Re:A GPU by any other name would render as slowly (1)

Auroch (1403671) | more than 3 years ago | (#35528246)

I still prefer them to e.g. the latest Need for Speed, where content and playability has been sacrifized on the altar of cartoon realism and HD/HDR graphics

Let me tell you, there are way more people addicted to WoW than to nethack and dwarf fortress. I'm not saying one is better than the other for gameplay... but I am saying that gameplay is not the only reason to play. You need to reach a minimum threshold of performance. This APU just upped that threshold.

Re:A GPU by any other name would render as slowly (1)

RyuuzakiTetsuya (195424) | more than 3 years ago | (#35528254)

The target for on-die GPUs isn't Crysis or Call of Duty, it's Aero and Quartz, which Sandy Bridge handles very well.

Re:A GPU by any other name would render as slowly (0)

Anonymous Coward | more than 3 years ago | (#35528616)

At least you could play Dwarf Fortress [bay12games.com] !

IGP's are sufficient for most games (1)

Troll-Under-D'Bridge (1782952) | more than 3 years ago | (#35529254)

IGP's are sufficient for most games. Yes, you read that right. IGP's with good drivers are sufficient for playing the games that most people play. These include Flash games (Farmville) and the "demo" games that come with a typical OS installation (Solitaire).

I hate how supposed "gamers" dominate any discussion that remotely has anything to do with computer graphics. Not everybody wants to play Crysis (and I don't even know what that is, without a quick peek at Wikipedia).

Reduces the load on the motherboard (2)

satuon (1822492) | more than 3 years ago | (#35527624)

One of the good things about having the GPU integrated in the processor chip itself is you don't have to go through the bus, so this reduces latency and leaves more bandwidth for everything else.

Re:Reduces the load on the motherboard (0)

Anonymous Coward | more than 3 years ago | (#35527764)

But only if the integrated GPU brings memory of its own with it.

Otherwise the bus from the CPU socket will be cluttered as never before, and will slow stuff down you'd never expect it could.

Re:Reduces the load on the motherboard (1)

EzInKy (115248) | more than 3 years ago | (#35527804)

And this affects grandma checking her email how? Now that computers are mainstream and smartphones nearly so, what difference does it make to the average user if his email loads in 2 seconds or 3?

Overheating already... (3, Interesting)

Anonymous Coward | more than 3 years ago | (#35527632)

...CPU handling the graphics in laptops is already causing overheating issues.

Two cases in point, a Toshiba laptop with AMD and a 13" MacBook Pro with Intel: the fans run annoyingly at high speed, and the bottoms are hot enough to fry eggs on. That's just sitting with one web page open. How long can one expect a machine like that to last? A year? Two, maybe?

Are web pages going to suddenly tone down their act, quit using video, animation, Flash? Text and pictures only? If they do that, then what? Hardware makers only start making laptops that can handle web text?

Dedicated graphics is the way to go: CPU and graphics on separate dies, away from each other, to separate the heat sources.

I can just imagine the scene where a bunch of power hungry types just made the decision to move towards integrated graphics, and a highly intelligent engineer just stomping out of the boardroom in protest.

Re:Overheating already... (1)

Corporate Troll (537873) | more than 3 years ago | (#35527776)

How long can one expect machine like that to last? A year? two maybe?

Heh.... Don't you think you just put your finger on the whole point? Computers are strong enough (most people really overestimate their needs and think they really do need that i7, when a Core 2 Duo would already be overkill), and have been for, I dare say, the last 6 years. Today I use a machine I bought in January 2007, and it was one on sale, to get rid of it before the Vista release. So it was already bottom-of-the-line back then. It's purring along, doing exactly what it needs to do even now.

That's 4 years in service and no sign of stopping. My brother uses a dumpster-sourced laptop from around 2003. It works perfectly fine and, unlike mine, it doesn't heat up at all.

So there are two main reasons to buy computers these days: because you want to... or to replace a defective one. Unless you're a gamer (or have other niche uses), the "want to" group doesn't seem to be very large. From a business perspective, making laptops fail earlier is a good business choice.

Re:Overheating already... (0)

Anonymous Coward | more than 3 years ago | (#35527782)

You can probably reduce temps on those laptops by 5-10C by applying better thermal compound (TX-4 or smth similar; wouldn't trust AS5 on a laptop mobo). I've never seen a laptop where the compound wasn't both sloppily applied and made of complete crap. Obligatory warranty warning ofc.

But yeah. Coolers are necessary now, especially for longevity in gaming notebooks.

Re:Overheating already... (1)

PeterKraus (1244558) | more than 3 years ago | (#35528084)

In my own business, I've been using Arctic Cooling's MX-2. On laptops and desktops alike. No customer has complained so far, and I use it on my own PC's too. It seems to be on par in terms of price with TX-4, much cheaper than AS5, and it also gives the welcome option to get it in 30g tube instead of the wee 3.5g one....

Re:Overheating already... (0)

Anonymous Coward | more than 3 years ago | (#35528204)

but it turns your nads to rice pudding.

Toshiba AC100 Nvidia Tegra smartbook (1)

migla (1099771) | more than 3 years ago | (#35527638)

This is on-topic-ish, isn't it? And it's already out there. (It probably hasn't "taken off" (depending on the definition of that), though...)

I'd love to have a Toshiba AC100 smartbook with an Nvidia Tegra ARM cpu. Capable of HD output, 9 h battery (on lighter usage IIRC). About 800 grams. Runs Android, but an Ubuntu port is progressing, from what I can tell.

Re:Toshiba AC100 Nvidia Tegra smartbook (2)

Tapewolf (1639955) | more than 3 years ago | (#35527752)

I have one. The main problems with it are that in Android, there is a certain lack of applications I need (can't seem to find a decent text editor / wordprocessor, for one).

Under Linux, you get all the software (Pidgin, proper text editors with undo and stuff, GIMP and so on) and it's handy for playing with ARM ports of software, but the battery life is only about 3.5 hours. If you want to keep Android there it has to run off the SD card, which is very slow, even with a class 10 card. I might try installing on the machine's internal flash, though - see if that is quicker or less power-hungry.

On mine, I actually disabled the graphics acceleration in Ubuntu - it tended to leave it a little unstable, but it also stopped the VT switching from working, which I didn't like.

Also, it is worth mentioning that the 2.2 update did bad things to the bootloader and prevented Linux from booting as well. It took a bit of fiddling before I was able to reinstall the 2.1 bootloader, and I still haven't repaired Android after that.

In summary, it's a nice little machine, but I found Android too limiting and linux isn't really mature on this platform yet. With some kind of extended battery it would kick ass, though.

Re:Toshiba AC100 Nvidia Tegra smartbook (1)

Auroch (1403671) | more than 3 years ago | (#35528250)

The main problems with it are that in Android, there is a certain lack of applications I need (can't seem to find a decent text editor / wordprocessor, for one).

Good thing you can compile one yourself!

Re:Toshiba AC100 Nvidia Tegra smartbook (1)

Tapewolf (1639955) | more than 3 years ago | (#35528944)

Good thing you can compile one yourself!

The NDK doesn't really allow access to the UI and writing one from scratch in Java was not a prospect I really fancied. It was easier to just stick Linux on it.

I hadn't considered the possibility of QT on Android, though - an android build of Kate would be ideal. Wouldn't help with GIMP or Pidgin, mind.

Toshiba AC100 review in theregister.co.uk: 1/10 (1)

IYagami (136831) | more than 3 years ago | (#35528220)

http://www.reghardware.com/2010/11/03/review_netbook_toshiba_ac100/ [reghardware.com]

Verdict
The beautifully designed and executed hardware is very close to my ideal netbook, and it's hardly an exaggeration to say that I'm heart-broken by Toshiba's cocked-up Android implementation. The best one can hope for is a firmware rescue from the open source community, although I wonder if the product will stay around long enough in these tablet-obsessed times for that to happen.

The more things change... (5, Insightful)

Anonymous Coward | more than 3 years ago | (#35527688)

Way back near the dawn of time, Intel created the 8086, and its slightly less capable little brother, the 8088. And they were reasonable processors ... but although they were good at arithmetic, it was within tight constraints. Fractions were just too hard. Trigonometry sent the poor little souls into a spin. And so on.

And thus, the 8087 was born. It was able to carry the burden of floating point mathematical functions, thereby making things nice and fast for those few who were willing to pony up the cash for the chip.

Then out came the 80286 (let's forget about the 80186, it's not really all that relevant here). It was better at arithmetic than the 8086, but still couldn't handle floating point - so it had a friend, the 80287, that filled the same purpose for the 80286 as the 8087 did for the 8086 and 8088. (We'll blithely ignore Weitek's offerings here. They existed. They're not really germane to the discussion.)

Then the 80386. Much, much better at arithmetic than the 80286, but floating point was still an Achilles heel - so the 80387 came along for the ride.

And finally, the i486. By this stage, transistors had become small enough that Intel could integrate the FPU on die - so there was no i487. At least, not until they came out with the i486SX, which I'll blithely ignore. And so, an accelerator chip that was once hideously expensive and used only by a few who really needed it was integrated onto chips that everybody would buy.

Funnily enough, it was around the time that the i486 appeared that graphics accelerators came onto the scene - first for 2D (who remembers the Tseng Labs W32p?), and then for 3D. Expensive, used only by a few who could justify the cost ... is this starting to sound familiar to you?

So another cycle is beginning to complete, and more functionality that used to be discrete is now to be folded onto the CPU. I can't help but wonder ... what will be next?

Re:The more things change... (0, Insightful)

Anonymous Coward | more than 3 years ago | (#35527866)

When I read your post, I used the voice of Zapp Brannigan inside my head. You are a tedious blowhard who likes the sound of his own voice.

GPUs being integrated into CPUs mirror the situation of FPUs being integrated into CPUs in the 90s.

There you go, 17 words, point made.

ABLAH BLAH BLAH what I assume is esoteric knowledge that makes me look clever ABLAH BLAH BLAH things that everyone who reads slashdot already knows ABLAH BLAH BLAH overuse of the phrase "blithely ignore" just to make sure everyone sees that I know some other irrelevant facts ABLAH BLAH BLAH trite observation that is the first thing to occur to anyone when hearing that GPUs are being integrated into CPUs BLAH BLAH FUCKING BLAH.

Re:The more things change... (1)

ledow (319597) | more than 3 years ago | (#35528104)

Whereas your post was the height of eloquence and supremely succinct. I actually prefer the OP. At least it's *worth* reading.

Re:The more things change... (1, Funny)

Auroch (1403671) | more than 3 years ago | (#35528260)

When I read your post, I used the voice of Zapp Brannigan inside my head. You are a tedious blowhard who likes the sound of his own voice.

GPUs being integrated into CPUs mirror the situation of FPUs being integrated into CPUs in the 90s.

There you go, 17 words, point made.

ABLAH BLAH BLAH what I assume is esoteric knowledge that makes me look clever ABLAH BLAH BLAH things that everyone who reads slashdot already knows ABLAH BLAH BLAH overuse of the phrase "blithely ignore" just to make sure everyone sees that I know some other irrelevant facts ABLAH BLAH BLAH trite observation that is the first thing to occur to anyone when hearing that GPUs are being integrated into CPUs BLAH BLAH FUCKING BLAH.

When I read your post, I used the voice of Zapp Brannigan inside my head.

Re:The more things change... (1)

jefe7777 (411081) | more than 3 years ago | (#35528478)

When I read the history post, it had Yoda's voice. Then I read your post, and it was a dismissive, whiney, Skywalker voice. Then you went into the Dagobah Cave Tree and cut your own head off.

Re:The more things change... (0)

ledow (319597) | more than 3 years ago | (#35528860)

More importantly - who cares?

When I buy a PC, it comes in a box. When I buy a laptop, it comes in a box. When I *BUILD* a PC, I have to cobble a handful of components together according to their compatibility tables (the more general rule of which is "if it fits in the hole, it'll do"), and thus it won't make any difference anyway.

If a PC already has integrated motherboard sound, integrated motherboard Ethernet, integrated motherboard USB, integrated motherboard RAID, etc. then for the most part you won't *care* how the layout is arranged. I probably could not point out a sound chip on a modern motherboard anymore - it's hidden away inside the whole chipset. It's not even vaguely interesting to know, certainly no more than those people who brag about their CPU using the latest "X number of microns" process (if you mean it does X, that's worth telling - but don't assume that just because you bothered to memorise a completely useless connection to the number of microns used in the process for that particular manufacturer that that somehow makes you superior).

People who want to upgrade cards will buy a computer with upgradeable cards (and they already disable the on-board graphics, so there's nothing to stop you doing the same when the "on-board" graphics are even more "on-board"). It's not like you're de-soldering the graphics chips and putting new ones in their place nowadays.

People who really don't care and just want a PC to a certain spec (the majority of PC buyers, even in big business) will just buy something that could have either CPU and GPU or the combined article. They won't know or care until it comes to upgrade time and most likely they *won't* be touching motherboard, CPU, GPU, or things like sound when upgrading - even RAM is a bit pointless because there's *ALWAYS* a restriction on where the upgrades can go and it's usually cheaper just to buy a new machine to that spec than to try to upgrade an old one.

With laptops, especially, I can't really say that I give a shit where the chip that runs my graphics happens to be. I do have a "gaming" laptop but all I care about is what model number (for downloading the driver) and what it can do (for choosing which games to buy and run on it). I don't even know if this one I'm typing on uses a CPU-GPU combination already, because I rarely open a laptop and if I have a problem serious enough to do so, it's time to get a new laptop just to be safe anyway. And the components that die first are the bits you *don't* worry about until they break - the USB ports, the rear connectors, and the screen. And when they break on a laptop - hey, price of new motherboard = price new laptop with better specs. The only upgrade on a laptop worth doing is RAM and they always expose the RAM ports.

With desktops, you might be able to upgrade CPU and GPU in one hit by changing an integrated package but I don't think in over 25 years of PC computing and managing school networks that I have *ever* upgraded a CPU on an existing motherboard. I've reapplied heat-transfer compound, I've added a *NEW* GPU into an otherwise empty slot and disabled internal graphics (so nothing changes there) but I've never upgraded what's there already.

Who cares? And if it means we can do things like speed up CPU->GPU memory transfers, put all the heat-making components in one place, supply smaller and "faster" thin clients and have assured compatibility, even better. Who knows, it might even lead to better documentation of the graphics chipsets inside the normal processor documentation!

Low-spec hardware is low-spec if it's integrated or not. But as to *where* the GPU lies? Who cares? My laptop could be an empty box for all I know, with one huge lump of plastic and silicon tucked into a corner somewhere and the case balanced with lead weights. I don't know and don't really care. So long as it gets the performance that I bought it for, it'll do.

Double Obsolescence (1)

Anonymous Coward | more than 3 years ago | (#35528206)

As long as they cost less than half of what I'll spend to replace them:

  • a) when the CPU is outdated, OR
  • b) when the graphics are.
