
Open-Source Intel Mesa Driver Now Supports OpenGL 3.2

Soulskill posted about 10 months ago | from the please-comply dept.

Graphics

An anonymous reader writes "Mesa and its open-source Intel graphics driver are now in compliance with the OpenGL 3.2 specification (PDF). It took four years for Mesa to reach GL 3.2 / GLSL 1.50 compliance, and the other Mesa drivers aren't too far behind, though the project remains years behind in supporting OpenGL 4. Supporting a new major OpenGL version is why the next release is being called Mesa 10.0. It brings many other improvements as well, including performance work and new Gallium3D features. OpenGL 3.3 support might also be completed before Mesa 10.0 ships in November."
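For readers who want to see what their own driver reports, here is a minimal sketch (assuming GLFW 3 and Mesa's development headers; the file name, window title and size are arbitrary) that asks for exactly the 3.2 core profile the story describes:

    /* gl32check.c - request a 3.2 core context and print what the driver provides.
       Build on a typical Mesa system: cc gl32check.c -lglfw -lGL -o gl32check */
    #include <stdio.h>
    #include <GLFW/glfw3.h>

    int main(void)
    {
        if (!glfwInit())
            return 1;

        /* Request the OpenGL 3.2 core profile described in the story. */
        glfwWindowHint(GLFW_CONTEXT_VERSION_MAJOR, 3);
        glfwWindowHint(GLFW_CONTEXT_VERSION_MINOR, 2);
        glfwWindowHint(GLFW_OPENGL_PROFILE, GLFW_OPENGL_CORE_PROFILE);

        GLFWwindow *win = glfwCreateWindow(640, 480, "gl32check", NULL, NULL);
        if (!win) {
            /* Context creation fails if the driver can't do 3.2 core. */
            fprintf(stderr, "OpenGL 3.2 core context unavailable\n");
            glfwTerminate();
            return 1;
        }

        glfwMakeContextCurrent(win);
        printf("GL_VERSION:  %s\n", (const char *)glGetString(GL_VERSION));
        printf("GL_RENDERER: %s\n", (const char *)glGetString(GL_RENDERER));

        glfwTerminate();
        return 0;
    }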


great! now what.. (0)

Anonymous Coward | about 10 months ago | (#45104383)

.. can I play on my Sandy Bridge i5-2410M-based HD 3000, that I wasn't able to play before, natively or under Wine?

Re:great! now what.. (1)

Rockoon (1252108) | about 10 months ago | (#45104459)

...any game with a low quality graphics setting at a low resolution

You would be quite shocked (1)

tuppe666 (904118) | about 10 months ago | (#45104517)

...any game with a low quality graphics setting at a low resolution

I have the same-ish processor and it's surprisingly nippy. I just ploughed through the Half-Life 2 series running at 1080p; it's getting on a little, but still looks very nice.

I have been thinking of treating myself to an AMD card as the lesser of two evils, Nvidia being a little too full of shit for my liking when Intel can throw 30 programmers behind their open source graphics on Linux. But Steam seems to have given Nvidia its blessing... so right now I don't have to rush that choice. I have on-board graphics that's fast enough, with real support behind it.

Re: You would be quite shocked (1)

Sigg3.net (886486) | about 10 months ago | (#45107205)

If you can live with a binary blob, nVidia is great. The company still supports chips that aren't on sale any longer.

But I'd prefer an open source driver with 3D support.

Re:great! now what.. (1)

Anonymous Coward | about 10 months ago | (#45104539)

The change is only for Ivy Bridge and Haswell.

Re:great! now what.. (1)

FreonTrip (694097) | about 10 months ago | (#45104623)

It probably won't hurt across-the-board performance, so even titles you've played 'til now may benefit. Give the new Amnesia a try - if my MacBook Air's Intel HD 5000 can push it along at 1440x900, you may have luck at 720p on the HD 3000.

Re:great! now what.. (1)

timeOday (582209) | about 10 months ago | (#45104795)

Maybe... in benchmarks the Intel HD 5000 is *much* faster than the 4000, let alone the 3000. Then again, the Air has a very low-wattage chip, so maybe it loses some of that advantage back to older desktop HD 3000 processors?

Re:great! now what.. (1)

FreonTrip (694097) | about 10 months ago | (#45147819)

Late in getting back to this, but yes: the HD 5000's definitely hobbled a bit by the 15W TDP restriction of the i5-4250U. Amnesia: A Machine for Pigs probably wouldn't be sexy on the HD 3000, but should be manageable at low to medium quality settings and conservative resolutions.

Truly a great day (0)

Chemisor (97276) | about 10 months ago | (#45104417)

This is truly a great day for Linux gaming.

Re:Truly a great day (0)

Anonymous Coward | about 10 months ago | (#45104497)

Still useless. Wake me up when they have support for OpenGL 4.3.

Re:Truly a great day (1)

cheesybagel (670288) | about 10 months ago | (#45104583)

IMO they need to unify the Linux graphics subsystem with the Android graphics subsystem. Then games would be much easier to port, not to mention you'd get more advanced functionality. Games don't need OpenGL; OpenGL ES is fine. This might be a problem for the Linux workstation market, though.

Re:Truly a great day (1)

FreonTrip (694097) | about 10 months ago | (#45104687)

OpenGL 4.1 already has full API compatibility with OpenGL ES 2.0. Let's not go throwing out decades of hard work for a little bit of convenience with regard to video games, especially when hardware going forward will all be capable of transparently handling the API you wanna switch to. As for throwing out X11 and tossing in the Android graphics stack for everybody, that's madness for a thousand reasons.

Re:Truly a great day (0)

Anonymous Coward | about 10 months ago | (#45104761)

More importantly, the Intel Mesa driver is already fully OpenGL ES 3.0 compliant, so his argument is completely bunk in the first place. http://linux.slashdot.org/story/13/02/13/1756208/intel-supports-opengl-es-30-on-linux-before-windows

Re:Truly a great day (0)

Anonymous Coward | about 10 months ago | (#45121559)

Depends on your definition of "handling". OpenGL "handles" multi-threaded calls by having the caller take a global lock on the context, completely serializing calls. Just because you can abstract that away transparently doesn't mean it will be within even an order of magnitude of the performance of doing it correctly.
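A hedged sketch of that pattern (plain pthreads; gl_draw_frame() and gl_upload_texture() are hypothetical stand-ins for real GL call sequences, not actual GL functions):

    #include <pthread.h>
    #include <stdio.h>

    static pthread_mutex_t gl_lock = PTHREAD_MUTEX_INITIALIZER;

    /* Hypothetical stand-ins for real GL work (glDrawArrays, glTexImage2D...). */
    static void gl_draw_frame(void)     { puts("draw frame"); }
    static void gl_upload_texture(void) { puts("upload texture"); }

    static void *render_thread(void *arg)
    {
        (void)arg;
        pthread_mutex_lock(&gl_lock);    /* only one thread may touch the context */
        gl_draw_frame();
        pthread_mutex_unlock(&gl_lock);
        return NULL;
    }

    static void *loader_thread(void *arg)
    {
        (void)arg;
        pthread_mutex_lock(&gl_lock);    /* uploads serialize on the very same lock */
        gl_upload_texture();
        pthread_mutex_unlock(&gl_lock);
        return NULL;
    }

    int main(void)
    {
        pthread_t r, l;
        pthread_create(&r, NULL, render_thread, NULL);
        pthread_create(&l, NULL, loader_thread, NULL);
        pthread_join(r, NULL);
        pthread_join(l, NULL);
        return 0;   /* the lock makes the two threads no faster than one */
    }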

Re:Truly a great day (0)

Anonymous Coward | about 10 months ago | (#45107925)

I'd like to disagree - at least with the current state of OpenGL ES. ES 2.0 is a huge pain, especially if you want to do anything modern-ish. You need extensions for functionality that's been around "forever" in desktop GL. Want multiple render targets? Extension. Float textures/framebuffers? Extension. Shadow maps? Extension (or nasty hacks). Non-constant loops in shaders? Your driver might be OK with that, but the spec isn't - I haven't even seen an extension for this; some drivers just don't care. Heck, the core spec doesn't even require full 32-bit floats to be available in (fragment) shaders.

ES 3.0 is a bit better - it fixes a lot of the above issues. Of course, there's no real hardware for ES3; most hardware that supports the feature set also supports a much more advanced vanilla-GL implementation. So... whatever. Might as well stick to vanilla GL whenever that's possible.

Still, wake *me* up when ES gets roughly on par with GL 4.0, and/or there exists a significant class of ES-only devices with an ES that doesn't totally suck.
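To make the ES 2.0 complaints above concrete, a hedged sketch of the extension-probing dance an ES 2.0 app ends up doing. has_extension() is a hypothetical helper, the three extension names are real ES 2.0 extensions, a GLES 2.0 context must already be current, and the naive strstr() match can false-positive on longer extension names:

    #include <stdio.h>
    #include <string.h>
    #include <GLES2/gl2.h>

    /* Naive check: strstr can match a prefix of a longer extension name;
       a robust version would tokenize the space-separated list. */
    static int has_extension(const char *name)
    {
        const char *exts = (const char *)glGetString(GL_EXTENSIONS);
        return exts && strstr(exts, name) != NULL;
    }

    void report_desktop_features(void)
    {
        /* Core in desktop GL for ages; only extensions in ES 2.0. */
        printf("multiple render targets:  %d\n", has_extension("GL_EXT_draw_buffers"));
        printf("float textures:           %d\n", has_extension("GL_OES_texture_float"));
        printf("depth textures (shadows): %d\n", has_extension("GL_OES_depth_texture"));
    }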

awesome (0)

Anonymous Coward | about 10 months ago | (#45104541)

I've been developing a game for the past three years. I started out targeting OpenGL 2.1, but a year ago I decided to target OpenGL 3.3. Back then I thought this was a writeoff for Intel, but I didn't really care because I had never owned an Intel GPU, and I figured "it's only 10% of the market...".

Then I upgraded my PC to an Ivy Bridge CPU + high-end add-on GPU card in January. I still didn't care about Intel, because I had the add-on card. But then I upgraded my 5-year-old laptop to an Ivy Bridge CPU last month, and suddenly I care, because the new laptop won't run my game! :(

However, at this rate of improvement in the Mesa driver, it looks like they'll support 3.3 within a few months, so I'll get to run my game on my laptop without any changes long before I'm ready to release. :) But shoot... I'd probably be willing to fall back to OpenGL 3.2 if that's as far as the Mesa driver gets, since I only use 3.3 in a few places (something like the sketch below).
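A hedged sketch of that fallback, assuming GLFW 3 (try_context() is a hypothetical helper): ask for 3.3 first, settle for 3.2, and record which one you got so the few 3.3-only code paths can be switched off:

    #include <stdio.h>
    #include <GLFW/glfw3.h>

    static GLFWwindow *try_context(int major, int minor)
    {
        glfwWindowHint(GLFW_CONTEXT_VERSION_MAJOR, major);
        glfwWindowHint(GLFW_CONTEXT_VERSION_MINOR, minor);
        glfwWindowHint(GLFW_OPENGL_PROFILE, GLFW_OPENGL_CORE_PROFILE);
        return glfwCreateWindow(640, 480, "game", NULL, NULL);
    }

    int main(void)
    {
        if (!glfwInit())
            return 1;

        GLFWwindow *win = try_context(3, 3);
        int have_gl33 = (win != NULL);
        if (!win)
            win = try_context(3, 2);   /* Mesa's current ceiling, per the article */
        if (!win) {
            fprintf(stderr, "no OpenGL 3.2+ core context available\n");
            glfwTerminate();
            return 1;
        }

        glfwMakeContextCurrent(win);
        /* The engine can branch on have_gl33 for the handful of 3.3-only features. */
        printf("running the GL 3.%d path\n", have_gl33 ? 3 : 2);

        glfwTerminate();
        return 0;
    }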

Market Share (1)

tuppe666 (904118) | about 10 months ago | (#45104591)

Intel's market share is about 60%, with Nvidia and AMD at roughly 20% each. Even Steam places Intel's share at 15% (and growing), against Nvidia at 50% and AMD at 30%.

Re:Market Share (0)

Anonymous Coward | about 10 months ago | (#45104869)

Yep. I was talking about a year ago, and I didn't bother to look up the stats back then because Intel had "always" (traditionally) been at about 10% amongst gamers. But Intel has started to dramatically increase its gaming market share thanks to Ivy Bridge and Haswell's on-chip GPUs.

p.s. Unfortunately the fastest integrated GPUs are still pretty damn slow, but it's cool that they support OpenGL 3.2 now. :-)

OpenGLaDOS? (0)

Anonymous Coward | about 10 months ago | (#45104571)

Maybe Black Mesa...
That was a joke. Haha. Fat chance.

Re:OpenGLaDOS? (0)

Anonymous Coward | about 10 months ago | (#45104599)

You win the internet for today.

Cool, but why? (1)

sl4shd0rk (755837) | about 10 months ago | (#45104661)

Isn't Mesa software rendering? I've never found its performance to be anything but abysmal. Why does anyone use it?

Re:Cool, but why? (2)

paskie (539112) | about 10 months ago | (#45104729)

Nope. Mesa is a generic OpenGL API implementation that can use multiple backends - either software rendering or Gallium / DRI.

Re:Cool, but why? (1)

adolf (21054) | about 10 months ago | (#45106627)

Wait. Mesa stopped supporting GLIDE?

Re:Cool, but why? (1)

Anonymous Coward | about 10 months ago | (#45104735)

Mesa is the graphics API implementation that can be used for both software and hardware-accelerated rendering.

Re:Cool, but why? (4, Informative)

timeOday (582209) | about 10 months ago | (#45104757)

I had thought the same, but from the first 3 sentences of "Mesa Introduction" on their homepage, mesa3d.org:

Mesa is an open-source implementation of the OpenGL specification - a system for rendering interactive 3D graphics.

A variety of device drivers allows Mesa to be used in many different environments ranging from software emulation to complete hardware acceleration for modern GPUs.

Mesa ties into several other open-source projects: the Direct Rendering Infrastructure and X.org to provide OpenGL support to users of X on Linux, FreeBSD and other operating systems.

and later on the page...

Mesa is the OpenGL implementation for several types of hardware made by Intel, AMD and NVIDIA, plus the VMware virtual GPU. There's also several software-based renderers

Re:Cool, but why? (1)

Kjella (173770) | about 10 months ago | (#45104865)

Isn't Mesa software rendering? I've never found it to be anything but abysmal performance. Why does anyone use it?

Mesa is many things: among them a huge graphics library implementing the OpenGL API, a reference software rendering driver, and a bunch of hardware-accelerated drivers. The only reason to use the software rendering is to test the accelerated drivers, or because you don't have a choice - and in the past you often didn't. Today AMD and Intel have official open drivers, and for Nvidia there's the community-built Nouveau, all with good hardware acceleration support, so it's pretty hard to find a graphics card that will go unaccelerated. But if you buy a brand new graphics card right after release, there might still be a period between modesetting working (read: you get a picture) and 2D/3D acceleration working, during which you're back to software rendering.
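If you're not sure which case you're in, Mesa's software rasterizers identify themselves in the renderer string (and setting LIBGL_ALWAYS_SOFTWARE=1 in the environment forces them on). A hedged sketch - using_software_rendering() is a hypothetical helper, and a GL context must already be current:

    #include <string.h>
    #include <GL/gl.h>

    /* Returns 1 for Mesa software rendering, 0 for a hardware driver,
       -1 if no context is current. */
    int using_software_rendering(void)
    {
        const char *renderer = (const char *)glGetString(GL_RENDERER);
        if (!renderer)
            return -1;
        /* Mesa's software paths report themselves as llvmpipe or softpipe. */
        return strstr(renderer, "llvmpipe") != NULL ||
               strstr(renderer, "softpipe") != NULL;
    }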

Re:Cool, but why? (1)

Gavagai80 (1275204) | about 10 months ago | (#45105915)

Today AMD and Intel have official open drivers, and for Nvidia there's the community-built Nouveau

What open drivers does AMD have? I'd love to get one for my AMD but as far as I've seen the only open options are pathetically inferior to the closed AMD drivers.

Re:Cool, but why? (1)

Xtifr (1323) | about 10 months ago | (#45106749)

There was reportedly a huge jump in performance [phoronix.com] with the 3.5 kernel series (as high as 38% in at least one case), and a lot more work was done with 3.6 through 3.10.

Re:Cool, but why? (5, Informative)

Anonymous Coward | about 10 months ago | (#45104891)

It is a pretty complicated setup for the uninitiated, but the basics are this:

Mesa was originally just a reference software implementation of OpenGL. People then wrote graphics drivers that accelerated parts of the rendering code. These drivers grew quite large and are considered "classic drivers" - mostly big monolithic blobs that accelerate whatever they please. The Intel driver is one of the only classic drivers remaining.

People wanted something a bit more flexible and easier to extend, so gallium3d was devised. Shaders are compiled into an intermediate language called TGSI, and then passed to the gallium3d drivers to be converted into command streams/assembly that will run on the video card. This system is a bit like a generic interface to the card's capabilities, and can in theory be run independently of OpenGL. Nouveau and radeon both use gallium3d drivers instead of the classic driver model.

All of the modern 3D-accelerated drivers also make use of the DRI interface, which lets them pass draw commands directly to the kernel's DRM driver. There are also state trackers, which help generate TGSI for the Gallium drivers; these exist for things like DRI, GLES, OpenCL and VDPAU, and make sharing functionality between the various Gallium3D drivers easier.

There are probably a few things wrong in here so don't take it as scripture, but it should give the overall picture of how things work. If anyone notices anything wrong please point it out.
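For what it's worth, the app-side trigger for that whole pipeline is just the ordinary shader-compile API; in a Gallium driver the GLSL below would be lowered to TGSI before becoming hardware commands. A hedged sketch - compile_vertex_shader() is a hypothetical helper, a 3.2 core context must already be current, and defining GL_GLEXT_PROTOTYPES is the usual Mesa/Linux way to get the GL 2.0+ prototypes:

    #define GL_GLEXT_PROTOTYPES 1
    #include <stdio.h>
    #include <GL/gl.h>
    #include <GL/glext.h>

    /* GLSL 1.50 is the shading language level that pairs with OpenGL 3.2. */
    static const char *vs_src =
        "#version 150 core\n"
        "in vec4 position;\n"
        "void main() { gl_Position = position; }\n";

    GLuint compile_vertex_shader(void)
    {
        GLuint vs = glCreateShader(GL_VERTEX_SHADER);
        glShaderSource(vs, 1, &vs_src, NULL);
        glCompileShader(vs);   /* the driver front-end parses the GLSL here */

        GLint ok = GL_FALSE;
        glGetShaderiv(vs, GL_COMPILE_STATUS, &ok);
        if (!ok) {
            char log[1024];
            glGetShaderInfoLog(vs, sizeof log, NULL, log);
            fprintf(stderr, "shader compile failed: %s\n", log);
        }
        return vs;
    }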

gma500 is still missing 3d (0)

Anonymous Coward | about 10 months ago | (#45104827)

So, will my Acer D270 with that Poulsbo cra..chipset get 3D acceleration any day now?

Re:gma500 is still missing 3d (0)

Anonymous Coward | about 10 months ago | (#45104955)

GMA500? Go bitch to PowerVR, who continue to refuse to support Linux at all. How that's a defensible position in this day and age continues to amaze me.

Re:gma500 is still missing 3d (0)

Anonymous Coward | about 10 months ago | (#45107565)

Intel did advertise the chipset as their own, and people bought them under the false assumption that everything from Intel has open source drivers. Blaming a subcontractor of a major company for something bad is just lame; Intel could have had open drivers for these if they wanted. Of course, people will remember to do their homework better next time, and the lazy ones will just switch to AMD.

Utterly irrelevent (0)

Anonymous Coward | about 10 months ago | (#45105795)

Every PC with ANY GPU supports every version of DirectX and OpenGL, but this fact is completely pointless and useless. I refer, of course, to software emulation, proving that merely supporting an API doesn't make that support at all useful in the real world.

It is how WELL and how COMPLETELY you support something like OpenGL that matters. Slow, buggy and unreliable support is worse than no support whatsoever. Fantastic, utterly reliable GPU parts are available from Nvidia and AMD at minimal cost for modest but very useful performance. Who the hell needs dodgy-as-heck performance on an Intel part? What possible use would that be to anyone?

You have a computer running some OS. You can run a first-quality closed-source GPU driver on it, or a completely useless open-source driver (with the added ingredient that the open-source driver is a perfect vector for trojan attacks against your machine). The GPU hardware, just like the other hardware on your machine, is 'closed' and you have no issue with that fact.

But some dysfunctional sociopath has told you that 'closed' drivers are 'bad' (in some unspecified way, because if you EVER need to rely on the quality of coding, it is in a driver).

The good and useful aspects of open source are absolutely fantastic and should be supported and praised at every opportunity. But the leeches who attach themselves to the concept of open source, and attempt to turn it into a form of 'church', are the same devious parasites behind all forms of organised religion. They use concepts of fear, devotion and loyalty to disarm critical thinking.

Intel (and others) only 'back' open-source drivers because the dangerous leaders of open-source churches declare any company that doesn't allow open-source drivers to be an evil sinner who needs to be shunned. Since each hardware company is in competition with the others, they cannot afford the risk of offending the members of this 'church'.

Now, the issue of OPEN documentation is very different. When AMD, for instance, allows access to documentation describing the low-level functioning of their GPU parts, it allows you and me to put pressure on AMD to build better drivers or coding environments that access the true power of these chips. We need the software tools of the future to be as good as possible. The hardware people should know how to write the best drivers for their products - and we users should expect quality in their software just as we expect quality in their hardware.
