Intel Details Handling Anti-Aliasing On CPUs

timothy posted more than 3 years ago | from the upping-the-spec-of-normalcy dept.

Graphics 190

MojoKid writes "When AMD launched the Barts GPU that powers the Radeon 6850 and 6870, they added support for a new type of anti-aliasing called Morphological AA (MLAA). However, Intel originally developed MLAA in 2009, and it has now released a follow-up paper on the topic, including a discussion of how the technique could be handled by the CPU. Supersampling is much more computationally and bandwidth intensive than multisampling, but both techniques generally demand more horsepower than modern consoles or mobile devices can provide. Morphological Anti-aliasing, in contrast, is performed on an already-rendered image. The technique is embarrassingly parallel and, unlike traditional hardware anti-aliasing, can be handled effectively by the CPU in real time. MLAA is also equally compatible with ray-traced and rasterized graphics."
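
As a purely illustrative reading of "embarrassingly parallel" here: every output pixel of a post-process filter depends only on a small, read-only neighbourhood of the already-rendered image, so rows can be farmed out to CPU threads with no communication between them. The C++ sketch below uses a trivial stand-in filter; the names and the filter body are placeholders, not Intel's MLAA algorithm.

    #include <algorithm>
    #include <cstdint>
    #include <functional>
    #include <thread>
    #include <vector>

    using Image = std::vector<std::uint8_t>;   // grayscale, w*h bytes

    // Stand-in per-pixel work: reads only a tiny neighbourhood of the input,
    // so the row bands below never need to talk to each other.
    void filter_rows(const Image& in, Image& out, int w, int y0, int y1) {
        for (int y = y0; y < y1; ++y)
            for (int x = 0; x < w; ++x) {
                int l = (x > 0) ? x - 1 : x;
                int r = (x + 1 < w) ? x + 1 : x;
                out[y * w + x] = static_cast<std::uint8_t>(
                    (in[y * w + l] + in[y * w + x] + in[y * w + r]) / 3);
            }
    }

    // Split the image into disjoint bands of rows, one per hardware thread.
    // 'out' must already be sized w*h.
    void postprocess_parallel(const Image& in, Image& out, int w, int h) {
        unsigned n = std::max(1u, std::thread::hardware_concurrency());
        std::vector<std::thread> pool;
        for (unsigned t = 0; t < n; ++t) {
            int y0 = static_cast<int>(h * t / n);
            int y1 = static_cast<int>(h * (t + 1) / n);
            pool.emplace_back(filter_rows, std::cref(in), std::ref(out), w, y0, y1);
        }
        for (auto& th : pool) th.join();   // no locks, no shared writes between bands
    }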

Bah, humbug, tech writers need help (0)

djdanlib (732853) | more than 3 years ago | (#36866252)

Can amateur journalists PLEASE stop using the phrase "embarrassingly parallel" to describe software tasks? Who's embarrassed? Why are they embarrassed about designing something that can be efficiently processed?

Re:Bah, humbug, tech writers need help (5, Informative)

windwalkr (883202) | more than 3 years ago | (#36866282)

Re:Bah, humbug, tech writers need help (1, Interesting)

djdanlib (732853) | more than 3 years ago | (#36866388)

Well whaddya know, it's in Wikipedia. That makes it officially okay, right?

I still think it's a poorly worded phrase.

Re:Bah, humbug, tech writers need help (2, Informative)

Anonymous Coward | more than 3 years ago | (#36866594)

Uhh, it's not a new term at all. I distinctly remember it from my undergrad days, and those were in the early 1980s. In fact, I think we learned of it during one of our earliest introduction-to-computer-architecture courses. It was pretty basic knowledge that everyone in the program was assumed to know of and understand.

Re:Bah, humbug, tech writers need help (0)

Anonymous Coward | more than 3 years ago | (#36866938)

Glancing at the citations, it was well enough known to be in a textbook from 1995. It's an old term.

Re:Bah, humbug, tech writers need help (0)

Anonymous Coward | more than 3 years ago | (#36866664)

It makes it okay because the reason it is on Wikipedia is that every half-decent programmer has heard or used the term before.

Re:Bah, humbug, tech writers need help (4, Insightful)

nedlohs (1335013) | more than 3 years ago | (#36866790)

It's a term of art commonly used in the field for a very long time. That you don't like it really doesn't matter at all to anyone but you.

Re:Bah, humbug, tech writers need help (2)

rs79 (71822) | more than 3 years ago | (#36866918)

"I still think it's a poorly worded phrase."

Yeah. Embarrassingly poorly worded.

Re:Bah, humbug, tech writers need help (1)

realityimpaired (1668397) | more than 3 years ago | (#36866934)

I think you've done a good job of pointing out what's wrong with the phrase, actually... for those in a technical field, or those who took a moment to read the Wikipedia article (and yes, usual caveats about believing everything you read in Wikipedia, just ask Stephen Colbert about elephants...), the phrase Embarrassingly Parallel has a specific meaning. A journalist or non-technical reader, however, will probably assume that it is an exclamation that doesn't actually add anything to the meaning, as though they're describing something as ridiculously easy.

That's the main problem with tech reporting though... wording something in a way that will be understood by your reader without insulting the intelligence of the people actually in the field.

Re:Bah, humbug, tech writers need help (3, Funny)

MightyMartian (840721) | more than 3 years ago | (#36866944)

Translation: Damn, I was revealed as an ignoramus. How can I swing this back in my favor?

Re:Bah, humbug, tech writers need help (1)

hairyfeet (841228) | more than 3 years ago | (#36867038)

I agree and I think a much better phrase would be something like simplistically parallel or cakewalk parallel even.

And frankly I don't care if Intel calls it Shaka Zulu parallel, as the whole point of the new AMD APU arch (which is more than just a CPU+GPU, because at the same time they are switching from a VLIW GPU design to a vector-based one, which will allow tighter integration and shared caching) is that there are many jobs the GPU does better than a CPU. It isn't that a CPU can't do those jobs; hell, with a fast enough CPU you could probably render Crysis on nothing but a CPU, it's just that it would suck down power and crank the heat worse than a Pentium 4.

I'm just glad Intel got caught with their bribing of OEMs (which got so bad an official from Dell said during the price wars there were quarters where the ONLY profit Dell saw was Intel kickbacks) and rigging of their compilers, so that now WE the customers can actually have choices in the market. Walking into the local Walmart the other day I noticed more than half of the laptops and three quarters of the desktops were AMD based now.

Hopefully this will mean AMD will gain some of the share they should have gotten but were denied during the P4, aka "space heater o' suck", era, which frankly they deserve, as they have really great prices ATM and we are seeing for the first time since AMD64 something completely new in the x86 arena with the AMD APU. It has some really wicked features, like letting the integrated GPU do physics while the discrete card takes care of textures, and with such tight integration it looks like the AMD chips are gonna rock at FP.

Intel seems to be missing the point with TFA, in that we have long since reached "good enough" when it comes to CPUs, and more and more of the jobs we have, such as fast transcoding to our mobile devices like phones and pads, HDMI HD video, gaming, and video and picture editing, are done better and with lower heat and power on the GPU.

My only worry is that Intel will yet again be rewarded for their douchebaggery, in this case slowly strangling Nvidia before simply buying them out if the AMD design turns out to be the way to go. Frankly Intel should have been busted for antitrust when the bribery came out, instead of being allowed to kill the Nvidia chipset business. I figure their next move, when using the CPU alone doesn't cut it and they have to face the fact their GPU division stinks on ice, will be to just buy out Nvidia, which would be a real shame, as Nvidia would be doing quite well right now if Intel hadn't cut them off at the knees.

Re:Bah, humbug, tech writers need help (1)

djdanlib (732853) | more than 3 years ago | (#36867174)

I agree and I think a much better phrase would be something like simplistically parallel or cakewalk parallel even.

Thank you! Simplistically parallel works a lot better.

And frankly I don't care if Intel calls it Shaka Zulu parallel

I'd buy that. TAKE MY MONEY

all these things are done better and with lower heat and power on the GPU

Indeed, for most applications and users we have reached that "good enough" point, at least until someone develops more complex software. Someone else in the comments here mentioned bus overutilization as a potential future scenario. It all depends on how much you offload, I suppose.

Regardless of the antitrust issues which we both agree are terrible... you have to give Intel some credit for producing a killer combo with the i7-2*** series and *67 chipsets.

Embarrassingly authoritative (1)

macraig (621737) | more than 3 years ago | (#36867160)

In Wikipedia's defense, it's at the least more authoritative than The Bible, even though the latter is [i]embarrassingly trusted[/i] by more people.

Re:Embarrassingly authoritative (0)

Anonymous Coward | more than 3 years ago | (#36867202)

Trololo

Re:Embarrassingly authoritative (1)

macraig (621737) | more than 3 years ago | (#36867248)

What's he [youtube.com] gotta do with it? Is he embarrassingly parallel, too?

Re:Bah, humbug, tech writers need help (0)

Anonymous Coward | more than 3 years ago | (#36867270)

You're an idiot. Please stop posting here.

Re:Bah, humbug, tech writers need help (1)

sco08y (615665) | more than 3 years ago | (#36866468)

Glad to see that amateur journalists have created an article in wikipedia whose "sources remain unclear because it lacks inline citations."

Re:Bah, humbug, tech writers need help (1, Funny)

poena.dare (306891) | more than 3 years ago | (#36866302)

Sex. Sure, you and your SO may be so good at sex it only lasts a few seconds, but you'd never admit it in public.

Embarrassingly Parallel Processing is the same way.

Re:Bah, humbug, tech writers need help (0)

dakameleon (1126377) | more than 3 years ago | (#36866480)

This is Slashdot. Many here would be happy to admit to having an SO with whom they are having regular sex.

Re:Bah, humbug, tech writers need help (4, Funny)

Kjella (173770) | more than 3 years ago | (#36866638)

This is Slashdot. Many here would be happy to admit to having an SO with whom they are having regular sex.

So would a lot of the people with SOs...

Amateur commenter (-1)

Anonymous Coward | more than 3 years ago | (#36866316)

Hey, buddy, do better yourself. If it's SUCH A PAIN to read it and if it lowers the quality of your life SO GODDAMN MUCH, then be a better journalist than they are.

Re:Bah, humbug, tech writers need help (4, Informative)

MojoKid (1002251) | more than 3 years ago | (#36866332)

I think you need to do your research before being critical... embarrassingly critical it appears.

Re:Bah, humbug, tech writers need help (2)

djdanlib (732853) | more than 3 years ago | (#36866404)

Touche, sir.

Re:Bah, humbug, tech writers need help (-1)

Anonymous Coward | more than 3 years ago | (#36867048)

The word is "touché", not "touche".

Re:Bah, humbug, tech writers need help (1)

drinkypoo (153816) | more than 3 years ago | (#36867210)

The word is "touché", not "touche".

Too fucking shay, senor!

Re:Bah, humbug, tech writers need help (0)

Anonymous Coward | more than 3 years ago | (#36867614)

the word for you is douché

Any formal training in Comp. Sci. or Comp. Eng.? (1, Informative)

Anonymous Coward | more than 3 years ago | (#36866410)

Do you have any sort of formal (that is, university-level) training in Computer Science or Computer Engineering? Based on your comment, it really doesn't look like you have any at all.

Like others have already pointed out, "embarrassingly parallel" is a very legitimate and correct term to use in the field of parallel computing. It may sound funny to you, but it's a term used by the experts. In fact, it's such a core concept that even most undergraduates are well aware of it and what it means.

This is the sort of shit I see time and time again from Rails "developers" and JavaScript "programmers". Such people have no real training whatsoever, yet somehow believe themselves to be experts in the field. They go out and make blatantly ignorant and incorrect comments on various social media sites, and then wonder why actual professionals and academics think that these Ruby and JavaScript users are idiots.

Re:Any formal training in Comp. Sci. or Comp. Eng. (1)

djdanlib (732853) | more than 3 years ago | (#36866588)

Nope, I studied photography, evangelized multiple distributions of Linux, wrote my own games, work in IT at a place where all the workloads are single-threaded, have my own home recording studio, and other such things. That's my geek cred, not CS/CE.

I don't understand why people need to anonymously criticize an honest mistake by someone not "in the business" to the point where it's the biggest thread. I guess it's the Internet, and that's what's been happening since USENET, so I don't let it get to me. (I also used to participate in USENET, if that counts for anything anymore.)

Re:Any formal training in Comp. Sci. or Comp. Eng. (0)

Anonymous Coward | more than 3 years ago | (#36866742)

(FYI, I'm an auto mechanic. I don't have any comp sci background, aside from that I've learned while working with and on the systems embedded within modern vehicles.)

While I can't speak for the GP, I can understand why he's angry when an untrained amateur like yourself starts spewing ignorance all over the place.

We mechanics experience this phenomenon a lot, too. On a daily basis we have to deal with customers who think that they know about cars, and will speak very loudly as if they do, but while doing so they'll be wrong again and again and again. The worst are those who are so ignorant that they choose to work on their own vehicles, causing very serious damage. We call them 'tinkertards'.

Maybe you just need to think twice before posting about something that you don't know much about. Or at least don't act like you do know about the field when you apparently don't.

Re:Any formal training in Comp. Sci. or Comp. Eng. (1)

djdanlib (732853) | more than 3 years ago | (#36866842)

I only said one thing, for which I accepted correction, but apparently the quality of discussion on Slashdot has degraded to the point of the mud-slinging on Digg and Youtube... I am mildly intrigued by one thing, though. Why do you feel the need to rip me to shreds so thoroughly? Are you trying to accomplish something? And, why are you doing it anonymously?

Re:Any formal training in Comp. Sci. or Comp. Eng. (2)

X0563511 (793323) | more than 3 years ago | (#36867726)

Let's be fair, "embarrassingly parallel" is an embarrassingly stupid phrase. It takes a word out of its normal context.

You'd think they would choose something less silly-sounding and less prone to confusing those who encounter it for the first time. Say, "independently parallel" - seems to sum it up nicely while not confusing the hell out of those unfamiliar with the jargon.

Re:Any formal training in Comp. Sci. or Comp. Eng. (0)

Anonymous Coward | more than 3 years ago | (#36866894)

In this kind of situation, it's a lot better to simply admit your ignorance, thank the people who pointed it out to you, and move on.

Don't start some commentary on anonymity.

Re:Any formal training in Comp. Sci. or Comp. Eng. (0)

Anonymous Coward | more than 3 years ago | (#36866624)

Actually, the original poster is right, now that I think about it. It is the wrong term to use.

Embarrassingly UNparallel would be a better term..

Because if it can NOT be done in parallel, then it is embarrassing: by definition you can only do one at a time, instead of scaling with more CPU cores.

Re:Any formal training in Comp. Sci. or Comp. Eng. (0)

Anonymous Coward | more than 3 years ago | (#36866776)

It is the wrong term to use

No. "Embarrassingly parallel" is a term often used in parallel computing by academics and experts. An embarrassingly parallel problem is one that so easily lends itself to parallelism that to solve it any other way would be embarrassing.

Embarrassingly UNparallel would be a better term..

Not really. Just because a problem doesn't lend itself so easily to parallelism, doesn't mean that it is in any way embarrassing. There is not anything embarrassing about solving a problem serially unless the problem is embarrassingly parallel.

Re:Any formal training in Comp. Sci. or Comp. Eng. (1)

russotto (537200) | more than 3 years ago | (#36866912)

Do you have any sort of formal (that is, university-level) training in Computer Science or Computer Engineering? Based on your comment, it really doesn't look like you have any at all.

I have, but I didn't come upon the term "embarrassingly parallel" until much later. Possibly because I'm embarrassingly old; the first use I can find of the term is in 1989, and it doesn't seem to have become mainstream even within computer science until a few years later.

Re:Bah, humbug, tech writers need help (5, Informative)

PhrostyMcByte (589271) | more than 3 years ago | (#36866482)

"Embarrassingly parallel" refers to a problem made up of many isolated tasks -- such as running a fragment (pixel) shader on millions of different fragments, or a HTTP server handling thousands of clients -- that can all be run concurrently without any communication between them.

It's odd that they use that term here, because the other anti-aliasing techniques are embarrassingly parallel as well.

SSAA (super-sampling) always renders each pixel n times at various locations within the pixel, and blends them together.

MSAA (multi-sampling) is basically the same as SSAA, but only works on polygon edges and is very dependent on proper mipmapping to reduce aliasing introduced when scaling textures.
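
To make "blends them together" concrete, here is a minimal SSAA resolve sketch in C++: the scene is assumed to have been rendered at s times the resolution in each direction, and each output pixel is the box-filtered average of its s x s block of sub-samples. The s x s grid, the box filter and all names are assumptions for illustration; real resolves use various sample patterns and filters.

    #include <cstddef>
    #include <vector>

    struct Color { float r, g, b; };

    // Average each s x s block of sub-samples (rendered at (w*s) x (h*s)) down to
    // one output pixel. Every output pixel is independent, so the resolve itself
    // is embarrassingly parallel as well.
    std::vector<Color> resolve_ssaa(const std::vector<Color>& samples,
                                    std::size_t w, std::size_t h, std::size_t s) {
        std::vector<Color> out(w * h);
        for (std::size_t y = 0; y < h; ++y)
            for (std::size_t x = 0; x < w; ++x) {
                Color acc{0.0f, 0.0f, 0.0f};
                for (std::size_t sy = 0; sy < s; ++sy)
                    for (std::size_t sx = 0; sx < s; ++sx) {
                        const Color& c = samples[(y * s + sy) * (w * s) + (x * s + sx)];
                        acc.r += c.r; acc.g += c.g; acc.b += c.b;
                    }
                const float inv = 1.0f / float(s * s);   // simple box filter
                out[y * w + x] = Color{acc.r * inv, acc.g * inv, acc.b * inv};
            }
        return out;
    }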

Re:Bah, humbug, tech writers need help (1)

djdanlib (732853) | more than 3 years ago | (#36866528)

Thanks for writing an educational reply! Alas I cannot mod it, being the OP.

It's still a weird phrase, though.

Re:Bah, humbug, tech writers need help (1)

wagnerrp (1305589) | more than 3 years ago | (#36866666)

I presume the implication is that the fact that it is 'embarrassingly parallel' means it can be performed on the shader units of a graphics card. What I don't understand is: IF it is a problem that can be efficiently broken up across the hundreds of cores of a modern GPU, why would you want to do it on the CPU instead? All you're doing is adding a huge volume of data transfer across the PCIe bus to bog down a CPU that could be better spent running collision detection, netcode, AI routines, or any number of other tasks that cannot be handled easily on a GPU.

Re:Bah, humbug, tech writers need help (1)

X0563511 (793323) | more than 3 years ago | (#36867732)

So perhaps you can explain this:

Why would someone use such a stupid term for it, when something much more intuitive like "independently parallel" might suffice?

Re:Bah, humbug, tech writers need help (1, Funny)

sco08y (615665) | more than 3 years ago | (#36866562)

Can amateur journalists PLEASE stop using the phrase "embarrassingly parallel" to describe software tasks? Who's embarrassed? Why are they embarrassed about designing something that can be efficiently processed?

No can do. Journalists all read each other, and when one comes up with a catchy term, they all pick up on it. This is especially true if they have no idea what they're writing about, or some editor thinks it's punchier or dramatic.

My pet peeve is "gun-toting." No one "totes" a firearm! If it's a pistol, you holster it. If it's a rifle, you sling it or shoulder it. I guess "armed" is too simple.

Re:Bah, humbug, tech writers need help (0)

Anonymous Coward | more than 3 years ago | (#36866696)

My pet peeve is "gun-toting." No one "totes" a firearm! If it's a pistol, you holster it. If it's a rifle, you sling it or shoulder it. I guess "armed" is too simple.

Seriously? Lots of people tote firearms. Tote means to carry or to have on one's person.

Re:Bah, humbug, tech writers need help (1)

sco08y (615665) | more than 3 years ago | (#36866948)

My pet peeve is "gun-toting." No one "totes" a firearm! If it's a pistol, you holster it. If it's a rifle, you sling it or shoulder it. I guess "armed" is too simple.

Seriously? Lots of people tote firearms. Tote means to carry or to have on one's person.

Depends how lazy the guy writing the dictionary is. I've never heard someone say, "tote that rifle to the ready line." You can't get a "tote concealed weapons" permit. No one talks about the "right to keep and tote arms."

The only other expression I'm familiar with is "tote-bag." All I want to know: are we about to go shooting or shopping?

Re:Bah, humbug, tech writers need help (2)

Sulphur (1548251) | more than 3 years ago | (#36866570)

Can amateur journalists PLEASE stop using the phrase "embarrassingly parallel" to describe software tasks? Who's embarrassed? Why are they embarrassed about designing something that can be efficiently processed?

But amateur journalism is embarrassingly parallel.

Re:Bah, humbug, tech writers need help (0)

Anonymous Coward | more than 3 years ago | (#36866574)

I'm not quite sure how to describe windwalkr's reply. Is it informative, or is it a burn?

Re:Bah, humbug, tech writers need help (1)

Twinbee (767046) | more than 3 years ago | (#36867114)

It's embarrassing because it's almost *too* efficient. The term first came about when CPU manufacturers and the industry in general were embarrassed at how much faster the GPU was for certain algorithms (i.e. the embarrassingly parallel ones). Programmers also were generally embarrassed at not using the technology sooner, and they often spent YEARS writing efficient code for the CPU, only to have a 5 minute knock-up code job on the GPU beat it when they finally experimented with the GPU.

The definition was then further reinforced by programmers who were expected to write long convoluted code to show their managers. They were then embarrassed because the stuff was good yet very quick to write (parallel algorithms are by their nature shorter and more elegant), so it looked like they were being lazy.

Re:Bah, humbug, tech writers need help (0)

Anonymous Coward | more than 3 years ago | (#36867138)

It's embarrassing when the original code was not parallelized and it would be easy to do so.

Re:Bah, humbug, tech writers need help (1)

Bengie (1121981) | more than 3 years ago | (#36867364)

A barn (symbol b) is a unit of area. Originally used in nuclear physics for expressing the cross sectional area of nuclei and nuclear reactions, today it is used in all fields of high energy physics to express the cross sections of any scattering process. A barn is defined as 10⁻²⁸ m² (100 fm²) and is approximately the cross sectional area of a uranium nucleus.

Two related units are the outhouse (10⁻³⁴ m², or 1 μb) and the shed (10⁻⁵² m², or 1 yb),[3] although these are rarely used in practice.

You'd never make it in physics either. Think you could hit the broad side of a barn? It's not as big as you'd think. "Embarrassingly parallel" is a technical term, the same way a "barn", "outhouse" or "shed" is perfectly technical. Your humor breaker needs to get reset.

Why not leave it on the GPU? (4, Insightful)

Tr3vin (1220548) | more than 3 years ago | (#36866338)

If it is "embarrassingly parallel", why not leave it on the GPU? Makes more sense to have it running on dozens to potentially hundreds of stream processors than a couple "free" cores on the CPU.

Intel wants for there to be no GPU (2)

Sycraft-fu (314770) | more than 3 years ago | (#36866402)

They want everything to run on the CPU, and thus for you to need a big beefy Intel CPU. Remember Intel doesn't have a GPU division. They make small integrated chips but they are not very powerful and don't stack up well with the low power nVidia/ATi stuff. What they make is awesome CPUs. So they really want to transition back to an all-CPU world, no GPUs.

They've been pushing this idea slowly with various things, mostly based around ray-tracing (which GPUs aren't all that good at).

Right now it is nothing but wishful thinking. Nobody is going to dump their GPU for CPU-only rendering, since even a cheap GPU can outdo a powerful CPU in many respects. However, maybe some day.

While it is for selfish reasons, I don't disagree with Intel's idea overall. It would be nice to have computers where everything is done on the CPU, no special dedicated hardware. That's really the whole idea of a computer: rather than having dedicated devices to do things, you have a powerful device that can do everything in software.

Re:Intel wants for there to be no GPU (0)

Anonymous Coward | more than 3 years ago | (#36866678)

I'd say that's exactly what capitalism should be, no? Inventing new things and ways to be the best.

Re:Intel wants for there to be no GPU (2)

wagnerrp (1305589) | more than 3 years ago | (#36866748)

GPUs haven't been special dedicated hardware for several generations. Ever since OpenGL 1.4 and Direct3D 8, they have been transitioning over to more general purpose use. They still have a dedicated raster unit, but the vast bulk of the chip is just a giant array of largely generic vector units. Those units can be put towards whatever application you want, whether it be graphics, physics, statistics, etc. It's basically a larger version of the SSE and Altivec units.
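
The "larger version of the SSE and AltiVec units" analogy, in miniature (a hypothetical helper, for illustration only): one SSE instruction applies the same operation to four floats in lock step, and a GPU's vector units apply the same idea across hundreds or thousands of lanes at once.

    #include <xmmintrin.h>   // SSE intrinsics

    // Add four pairs of floats with a single vector instruction.
    void add4(const float* a, const float* b, float* out) {
        __m128 va = _mm_loadu_ps(a);                // load 4 floats
        __m128 vb = _mm_loadu_ps(b);
        _mm_storeu_ps(out, _mm_add_ps(va, vb));     // 4 additions, one instruction
    }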

Re:Intel wants for there to be no GPU (0)

Anonymous Coward | more than 3 years ago | (#36866756)

I'm all for a bigger, beefier CPU. I do research in Computational Intelligence, and my systems can utilise all available cores, and there is no way I'm going to create some dedicated snippets of code to offload things to the GPU. Moving to the GPU makes the code more cumbersome, less portable, more difficult to understand, and more difficult to fix, update, expand and improve. I think Polaris and Knights Corner CPUs are the way to go.

Re:Intel wants for there to be no GPU (0)

Anonymous Coward | more than 3 years ago | (#36866808)

There is nothing beefy about consumer CPUs. Mainstream games may be handling a lot more workload today than they were 10 years ago, but they still choke when handling more than a few hundred AI-processing NPCs (or just 40 for an engine like UE3).

Ray tracing can be very parallel, on GPUs (2)

Thagg (9904) | more than 3 years ago | (#36867000)

Many of the commercial ray tracing packages have written GPU-based versions that work remarkably well.

V-Ray and mental ray, in particular, have very exciting GPU implementations. A presentation by mental images showed some very high-quality global illumination calculations done on the GPU. Once you get good sampling algorithms, the challenge is dealing with memory latency. It's very slow to do random access into memory on a GPU. mental images solved that problem by running a lot of threads, as GPUs context-switch very quickly. When I said "a lot of threads", I wasn't kidding -- the demo I saw was running 100,000 threads over 10 graphics cards. The huge majority of those threads are stalled waiting for memory, but it doesn't cost anything to wait for those accesses to be satisfied if you have other threads to run.

Re:Intel wants for there to be no GPU (1)

shutdown -p now (807394) | more than 3 years ago | (#36867396)

Why is GPU not good at ray-tracing? So far as I know, they excel at tasks which exhibit massive dependency-free parallelism, and lots of number crunching with little branching. It would seem to me that this describes ray-tracing almost perfectly.

Re:Why not leave it on the GPU? (1)

Zaphod The 42nd (1205578) | more than 3 years ago | (#36866412)

Yeah, that is exactly what I was wondering. I suppose the idea is for systems with bad integrated graphics cards, or with mobile devices that have no dedicated graphics.

Re:Why not leave it on the GPU? (1)

Zaphod The 42nd (1205578) | more than 3 years ago | (#36866458)

Ah, guess I should have thoroughly RTFA before commenting. I guess on consoles where MSAA is hard to have time for, this could be useful.

Re:Why not leave it on the GPU? (0)

Anonymous Coward | more than 3 years ago | (#36866470)

because Intel doesn't make good GPUs, but does make good CPUs, so they have a hammer and made a nail to pound

Re:Why not leave it on the GPU? (1)

grumbel (592662) | more than 3 years ago | (#36867088)

why not leave it on the GPU?

A lot of modern PlayStation 3 games use that technique, as it allows them to do something useful with the SPUs while the GPU is already busy enough rendering the graphics. It also helps with ray tracing, as you might need less CPU power to do anti-aliasing this way than the proper way. And of course, when the GPU has some free cycles left, there is no reason not to do it there.

Re:Why not leave it on the GPU? (1)

cthulhu11 (842924) | more than 3 years ago | (#36867630)

or, admit that anti-aliasing is a euphemism for blur, and rarely improves the appearance of text.

Rain dances around Shannon (3, Insightful)

Anonymous Coward | more than 3 years ago | (#36866392)

If your signal is aliased during sampling, you are toast.
No voodoo will help you if your spectrum has folded on itself.
So super-sample it or shut up.
Everything else is snake oil for the unwashed masses.
And yes, MLAA still looks like crap in comparison to SS.
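
(A quick numeric illustration of that folding, with made-up numbers: a 7 kHz sine sampled at 10 kHz produces exactly the same sample values as a phase-flipped 3 kHz sine, since 7 kHz folds to |7 - 10| = 3 kHz. Once the samples are taken, no post-process filter can tell the two apart.)

    #include <cmath>
    #include <cstdio>

    int main() {
        const double fs = 10000.0;                      // sample rate: 10 kHz
        const double pi = 3.14159265358979323846;
        for (int n = 0; n < 8; ++n) {
            double t = n / fs;
            double above_nyquist = std::sin(2 * pi * 7000.0 * t);    // 7 kHz input
            double folded_alias  = -std::sin(2 * pi * 3000.0 * t);   // phase-flipped 3 kHz
            std::printf("n=%d  7 kHz sample: %+.6f   3 kHz alias: %+.6f\n",
                        n, above_nyquist, folded_alias);             // columns match
        }
        return 0;
    }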

Re:Rain dances around Shannon (2)

cgenman (325138) | more than 3 years ago | (#36867532)

Judging by the article, MLAA is actually just a technique that looks for jagged edges, and blurs them.

How this is better than just blurring the whole thing is beyond me. Those images look terrible.
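
For what it's worth, the difference from blurring the whole thing is the edge test. The toy C++ sketch below only blends pixels that sit on a strong luminance step and leaves flat regions and texture detail untouched; the threshold, the one-directional test and all names are invented for illustration, and this is not the published MLAA algorithm (which also estimates edge shape and per-pixel blend weights).

    #include <cmath>
    #include <vector>

    using Gray = std::vector<float>;   // luminance image, w*h values in [0, 1]

    // Blend only across strong horizontal luminance steps. A whole-image blur
    // would apply the averaging unconditionally to every pixel.
    Gray edge_blend(const Gray& in, int w, int h, float threshold = 0.25f) {
        Gray out = in;
        for (int y = 0; y < h; ++y)
            for (int x = 0; x + 1 < w; ++x) {
                const float a = in[y * w + x];
                const float b = in[y * w + x + 1];
                if (std::fabs(a - b) > threshold) {   // treat as a jagged edge
                    const float m = 0.5f * (a + b);
                    out[y * w + x]     = 0.5f * (a + m);
                    out[y * w + x + 1] = 0.5f * (b + m);
                }
            }
        return out;
    }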

You sure GPU's aren't better? (3, Interesting)

guruevi (827432) | more than 3 years ago | (#36866408)

If the system is 'embarrassingly parallel' and simple, then the GPU would be the better use case. GPUs typically have a lot of cores (200-400) that are optimized for embarrassingly simple calculations. Sure, you could render everything on a CPU these days; simpler games could even run with an old-school SVGA (simple frame buffer) card and let all the graphics be handled by the CPU, as used to be the case in the 90's and is evidenced by the 'game emulators in JavaScript' we've been seeing lately. But GPUs are usually fairly unused, except for the ultramodern 3D shooters, which also tax a CPU pretty hard.

Re:You sure GPU's aren't better? (1)

Baloroth (2370816) | more than 3 years ago | (#36866506)

Yes, GPU would be better. I mean, look at Intel's amazing GPU divis... oh wait. That's why they want AA on the CPU. Because, you know, they actually have CPUs that are pretty decent. AMD probably added support because of their whole Fusion platform.

Blur (5, Insightful)

Baloroth (2370816) | more than 3 years ago | (#36866434)

So, it basically blurs the image around areas of high contrast? Sounds like that's what's going on. Looks like it, too. I can understand why they are targeting this at mobile and lower-powered devices: it kinda looks crappy. I might even say that no antialiasing looks better, but I'd really have to see more samples, especially contrasting this with regular MSAA. I suspect, however, that normal antialiasing will always look considerably better. For instance, normal AA would not blur the edge between two high-contrast textures on a wall (I think, since it is actually aware that it is processing polygon edges), while I suspect MLAA will, since it only sees an area of high contrast. Look at the sample image they post in the article: the white snow on the black rock looks blurred in the MLAA-processed picture, while it has no aliasing artifacts at all in the unprocessed image. It's pretty slight, but it's definitely there. Like I say, I need to see more real-world renders to really tell if it's a problem at all or simply a minor thing no one will ever notice. I'll stick to my 4X MSAA, TYVM.

Re:Blur (1)

Jamu (852752) | more than 3 years ago | (#36866560)

The article does compare it to MSAA. But the MLAA just looks blurred to me. Detail shown with MSAA is lost with MLAA. It would be informative to see how MLAA compares to simple Gaussian blurring.

Re:Blur (5, Informative)

djdanlib (732853) | more than 3 years ago | (#36866608)

It's different from a Gaussian blur or median filter because it attempts to be selective about which edges it blurs, and how it blurs those edges.

This technique really wrecks text and GUI elements, though. When I first installed my 6950, I turned it on just to see what it was like, and it really ruined the readability of my games' GUIs. So, while it may be an effective AA technique, applications may need to be rewritten to take advantage of it.

Bigger text (1)

tepples (727027) | more than 3 years ago | (#36866670)

[Morphological AA postprocessing] really ruined the readability of my games' GUIs. So, while it may be an effective AA technique, applications may need to be rewritten to take advantage of it.

Just as games and other applications supporting a "10-foot user interface" need to be rewritten with larger text so that the text is not unreadable when a game is played on a standard-definition television. The developers of Dead Rising found this out the hard way.

Re:Blur (1)

gr8_phk (621180) | more than 3 years ago | (#36866818)

It's different from a Gaussian blur or median filter because it attempts to be selective about which edges it blurs, and how it blurs those edges.

Anti-Aliasing is not supposed to blur edges arbitrarily. I suppose that's why this is selective, but it just seems like a crappy thing to be doing. And while it can be done by a CPU, that's probably not practical - either the CPU is busting its ass to do rendering and doesn't really have time to make an AA pass, or the GPU did all the rendering and may as well do this too, rather than pass all the data back to the CPU.

Re:Blur (1)

djdanlib (732853) | more than 3 years ago | (#36866870)

What I saw in non-technical terms: It appeared to blur edges whose location or neighboring pixels changed from one frame to the next. Unfortunately, whenever something changed behind text and GUI elements, it went right ahead and blurred those edges as well.

Re:Blur (1)

Baloroth (2370816) | more than 3 years ago | (#36867148)

This reminds me of a glitch in the original S.T.A.L.K.E.R. Shadow of Chernobyl game. If you turned on AA in game, the crosshairs would disappear. I've always suspected something like that was going on, where it would see the straight lines of the crosshair and blend it into the rest of the picture. I believe it worked right if you enabled AA from drivers outside the game, which reinforced my theory considerably.

FXAA (0)

Anonymous Coward | more than 3 years ago | (#36867090)

Even better is nVidia's FXAA [blogspot.com] , which was inspired by MLAA.

Re:Blur (1)

home-electro.com (1284676) | more than 3 years ago | (#36866888)

That's what it looks like. They can come up with fancy names like 'morphological AA', but the fact remains that they are simply applying a blur filter, which is no replacement for proper AA. And when I say 'not a replacement' I mean NOT BY A LONG SHOT.

Yet I think some MBA in marketing earned a pat on the shoulder for coming up with the BS term.

It's like they are forgetting that AA is not just about blurred edges. Imagine looking at an infinite chessboard. At a certain distance more than one square will appear in EVERY pixel, and, if no proper AA is done, it will look horrible. It will look just as horrible with ML'AA'.

Re:Blur (1)

grumbel (592662) | more than 3 years ago | (#36867140)

Some comparison screen shots [ign.com]; essentially it performs extremely well on clean high-contrast edges, but can lead to ugly blurring when the source image contains heavily aliased areas (i.e. small sub-pixel-width lines in the background). There are also some temporal issues, as some of the artifacts it causes get worse when it's animated. Overall I'd say it's a clear improvement, not perfect, but when you are stuck with 1280x720 on a 44" TV you are happy about any anti-aliasing you can get.

Re:Blur (1)

Baloroth (2370816) | more than 3 years ago | (#36867338)

In those pictures it does look considerably better than no AA, thanks for pointing those out. Seems like MLAA is perfect for the PS3: it has a pretty slow and dated graphics card, but quite a lot of spare cycles on the CPU, as long as you can do it in parallel, which this can. Always amazed me how few PS3 games have AA. You're right, games without AA, especially on 720p (or *shudder* 480p) on a large TV, look absolutely shitty. One reason I love my PC: I've used AA in pretty much everything for almost 5 years now.

And the best part is... it looks like crap! (1)

SoopahMan (706062) | more than 3 years ago | (#36867786)

MLAA looks awful - and Intel's CEO famously knocked antialiasing as being a stupid blurring technique not long ago. So, he goes with the only form of AA that literally adds no value. Cutting off their nose to spite their face?

parallel (1)

Zaphod The 42nd (1205578) | more than 3 years ago | (#36866448)

Correct me if I'm wrong, but MSAA is already embarrassingly parallel, and provides better fidelity than this newfangled MLAA.
Yes, it's faster than MSAA, but modern GPUs are already pretty good at handling real-time MSAA.

Embarassingly parallel (1)

The Living Fractal (162153) | more than 3 years ago | (#36866450)

This is a phrase I would have reserved for myself after several too many drinks... not so much for this article ;p

So, it is not anti-aliasing at all... (5, Informative)

LanceUppercut (766964) | more than 3 years ago | (#36866478)

Anti-aliasing, by definition, must be performed in object space or, possibly, in picture space. But it cannot be possibly carried out on an already rendered image. They must be trying to market some glorified blur technique under the anti-aliasing moniker. Nothing new here...

Re:So, it is not anti-aliasing at all... (0)

Anonymous Coward | more than 3 years ago | (#36866592)

Anti-aliasing can be turned on and off on a per-texel basis, at least in OpenGL. This is particularly useful when trying to do motion blur. To me MLAA looks like crap - it's smudged the whole scene, even where it shouldn't.

hq3x (1)

tepples (727027) | more than 3 years ago | (#36866648)

Anti-aliasing, by definition, must be performed in object space or, possibly, in picture space. But it cannot be possibly carried out on an already rendered image.

Ever heard of hq3x [wikipedia.org] ? Or the pixel art vectorizer we talked about two months ago [slashdot.org] ?

Re:hq3x (0)

Anonymous Coward | more than 3 years ago | (#36866718)

Your definition of anti-aliasing is off by a long shot.

Even if it isn't technically AA (2)

tepples (727027) | more than 3 years ago | (#36866840)

Your definition of anti-aliasing is off by a long shot.

Both this edge blending technique and pixel art upscalers work by guessing underlying shapes based on the corners within high-contrast edges in the image. Pixel art upscalers aren't the same as hand-drawing the image at a higher pixel density, but they still produce a picture with some of the same desirable qualities. Likewise, even if this sort of edge blending isn't the same as proper anti-aliasing, it still produces a picture with some of the same desirable qualities.

Re:So, it is not anti-aliasing at all... (1)

debrain (29228) | more than 3 years ago | (#36866750)

Anti-aliasing, by definition, must be performed in object space or, possibly, in picture space. But it cannot be possibly carried out on an already rendered image.

Sir –

Anti-aliasing can be performed on a rendered image by performing image recognition i.e. vectorization. This is doable with edges of the geometric sort (i.e. straight lines, simple curves) and pre-existing patterns (e.g. glyphs of known fonts of given sizes). This result is probably an absurdity in terms of the performance, however "cannot be possibly carried out" is a bit too strong, in my humble opinion. It may be impractical, but certainly it's not impossible.

Re:So, it is not anti-aliasing at all... (1)

gr8_phk (621180) | more than 3 years ago | (#36866844)

Anti-aliasing, by definition, must be performed in object space or, possibly, in picture space. But it cannot be possibly carried out on an already rendered image. They must be trying to market some glorified blur technique under the anti-aliasing moniker. Nothing new here...

There are no pixels in object space. It's an operation on pixels. But I agree with your second half - it's some new blur method that probably isn't worth it. Nothing to see here - in fact things are harder to see.

AA always amuses me, for historical reasons... (1)

fuzzyfuzzyfungus (1223518) | more than 3 years ago | (#36866802)

While I understand AA, and why we do it, I always experience a moment's rush of absurdity when I consider it.

Up until quite recently, with high-speed digital interfaces nowhere near what video of any real resolution required, and high-bandwidth analog components very expensive, AA was just something that happened naturally, whether you liked it or not: your not-at-all-AAed digital frame went to the RAMDAC(which, unless you had really shelled out, could likely have been a bit lax about accuracy in exchange for speed), and was then shoved through a VGA cable of undistinguished parentage, a whole pile of analog widgetry that controlled the yokes on the CRT, and was finally smeared onto the nice, soft, phosphor blobs on your CRT. For things involving video gear, rather than computers, this went double: a trip through a composite->RF modulator pretty much eliminated the ability to even display jagged pixels, whether you wanted them or not.

It's always a little strange to think of how much computer power we now burn so that our all-digital video signal paths don't shove the jaggies in our faces.

Re:AA always amuses me, for historical reasons... (1)

Anonymous Coward | more than 3 years ago | (#36867036)

That's blurring. AA is not the same. AA adds information to the picture, blurring removes information.

Re:AA always amuses me, for historical reasons... (1)

Osgeld (1900440) | more than 3 years ago | (#36867626)

It's funny: years ago I went on a hunt for the sharpest flat-front CRT I could find, to see the "edges of pixels", and would not have it any other way for old bitmap games. But in modern high-res games on modern high-res monitors, if you don't have AA it just sprinkles the entire screen with jaggie noise.

FXAA is a better choice (5, Interesting)

dicobalt (1536225) | more than 3 years ago | (#36866816)

It can work on any DX9 GPU without dedicated support. http://hardocp.com/article/2011/07/18/nvidias_new_fxaa_antialiasing_technology/1 [hardocp.com]

Re:FXAA is a better choice (1)

Anonymous Coward | more than 3 years ago | (#36867620)

Eurogamer did a feature on various AA techniques: http://www.eurogamer.net/articles/digital-foundry-future-of-anti-aliasing

IMO, MLAA seems the most promising, as long as they can deal with the pixel-popping artifacts. It's already been used on a couple of PS3 games, and can be done by a GPU. FXAA blurs too much, a bit like Quincunx AA, which is what you usually get on PS3 games these days.

I don't know... (1)

jonahbron (2278074) | more than 3 years ago | (#36866898)

I don't know, it looks very close to just blurring to me. Non-"anti-aliased" sample images are better IMHO.

MLAA is also... (0)

Anonymous Coward | more than 3 years ago | (#36867112)

MLAA is also crap, compared to "proper" antialiasing (supersampling) or even "draft" antialiasing (multisampling). Any detail smaller than 1 pixel simply isn't rendered with MLAA (and that also means no sub-pixel motion). Essentially, MLAA is just a blur filter, which actually reduces the amount of detail in the image (unlike supersampling, which increases the detail).

Edge detect + supersampling (or edge detect + high multisampling) is by far the best solution.

Oh, and technically blurring is antialiasing. It's just a very primitive flavor of it.

No, MLAA is aliasing! (2)

sco08y (615665) | more than 3 years ago | (#36867178)

MLAA is also crap, compared to "proper" antialiasing (supersampling) or even "draft" antialiasing (multisampling). Any detail smaller than 1 pixel simply isn't rendered with MLAA (and that also means no sub-pixel motion). Essentially, MLAA is just a blur filter, which actually reduces the amount of detail in the image (unlike supersampling, which increases the detail).

Edge detect + supersampling (or edge detect + high multisampling) is by far the best solution.

Oh, and technically blurring is antialiasing. It's just a very primitive flavor of it.

Aliasing is when signals become indistinguishable. The common symptom of jaggies occurs when the signal of the ray that is chosen as the sample hides the signal of the nearby rays.

But blurring is aliasing! In physical blurring, signals from in-focus rays are overwhelmed by signals from out-of-focus rays. Similarly, with a blur filter you're definitely losing data by blurring it with neighboring signals.

And you're right: this technology is nothing but an elaborate blur filter. So this looks like anti-aliasing because it masks one well known symptom, but it's clearly obscuring signal and thus aliasing.

only one problem... (0)

Anonymous Coward | more than 3 years ago | (#36867586)

it looks like shit.

decent phones don't need AA (3, Interesting)

bored (40072) | more than 3 years ago | (#36867606)

AA is a crutch to get around a lack of DPI. Take the iPhone 4 at 326 DPI: it is 3 to 4x the DPI of the average craptastic "HD" computer monitor. I have a laptop with a 15" 1920x1200 screen. At that DPI, seeing the "jaggies" is pretty difficult compared with the same resolution on my 24". On the 15" I can turn AA on/off and it's pretty difficult to discern the difference. That monitor is only ~150 DPI. I challenge you to see the effects of anti-aliasing on a screen with a DPI equivalent to the iPhone 4.

The PlayStation/Xbox, on the other hand, are often used on TVs with DPIs approaching 30. If you get within a couple feet of those things, the current generation of game machines look like total crap. Of course the game machines have AC power, so there really isn't an excuse. I've often wondered why Sony/MS haven't added AA to one of the respun versions of their consoles.

Re:decent phones don't need AA (1, Interesting)

isaac (2852) | more than 3 years ago | (#36867736)

I challenge you to see the effects of anti-aliasing on a screen with a DPI equivalent to the iPhone 4.

The eye is pretty good at picking out jaggies, especially in tough cases (high contrast, thin lines, a shallow slope against the pixel grid) and where the screen is viewed from close range (my eye is closer to my phone's screen than to my desktop monitor).

Now, I don't think antialiasing makes a huge difference to game mechanics - but it is nice to have in high-contrast information situations (e.g. Google Maps) regardless of the pixel pitch of the underlying display.

This is NOT antialiasing (1)

spitzak (4019) | more than 3 years ago | (#36867660)

This is image reconstruction, where additional information (not necessarily correct) is derived from a limited image.

Close equivalents are the "font smoothing" done by the earliest versions of Macintosh for printing their bitmap graphics on a PostScript printer to draw 72 dpi 1-bit images at 300 dpi. Also I believe Microsoft's earliest subpixel font rendering, smoothtype, was done this way (not cleartype or any other modern font rendering).

Much more complicated examples are algorithms for scaling up images, including ones for converting video to HDTV or even IMAX resolution. These create images that replace the pixels, based on analysis of sometimes very large areas around the pixel. And by far the most complicated examples are the programs that recover shapes from very low-resolution bitmaps, such as the Microsoft one posted earlier on Slashdot.
