
Glamor, X11's OpenGL-Based 2D Acceleration Driver, Is Becoming Useful

Unknown Lamer posted about 5 months ago | from the spraying-for-bitrot dept.


The Glamor driver for X11 has sought for years to replace all of the GPU-specific 2D rendering acceleration code in X.org with portable, high-performance OpenGL. Unfortunately, that goal was hampered by the project starting at an awkward time, when folks thought fixed-function hardware was still worth supporting. But, according to Keith Packard, the last few months have seen the code modernized and finally maturing into a credible replacement for many of the hardware-specific 2D acceleration backends. From his weblog: "Fast forward to the last six months. Eric has spent a bunch of time cleaning up Glamor internals, and in fact he's had it merged into the core X server for version 1.16 which will be coming up this July. Within the Glamor code base, he's been cleaning some internal structures up and making life more tolerable for Glamor developers. ... A big part of the cleanup was transitioning all of the extension function calls to use his other new project, libepoxy, which provides a sane, consistent and performant API to OpenGL extensions for Linux, Mac OS and Windows."

Keith Packard dove in and replaced the Glamor acceleration for core text and points (points in X11 are particularly difficult to accelerate quickly) in just a few days. Text performance is now significantly faster than the software version (not that anyone is using core text any more, but "they're often one of the hardest things to do efficiently with a heavy weight GPU interface, and OpenGL can be amazingly heavy weight if you let it"). For points, he moved vertex transformation to the GPU, bringing it up to the same speed as the software implementation.

Looking forward, he wrote: "Having managed to accelerate 17 of the 392 operations in x11perf, it's pretty clear that I could spend a bunch of time just stepping through each of the remaining ones and working on them. Before doing that, we want to try and work out some general principles about how to handle core X fill styles. Moving all of the stipple and tile computation to the GPU will help reduce the amount of code necessary to fill rectangles and spans, along with improving performance, assuming the above exercise generalizes to other primitives." Code is available in anholt's and keithp's xserver branches.


46 comments

Reminder (2, Funny)

Anonymous Coward | about 5 months ago | (#46428221)

X11 is the Iran-Contra of graphical user interfaces.

Re:Reminder (0)

Joce640k (829181) | about 5 months ago | (#46428281)

I dunno, I always get a big belly laugh whenever I log into something and see that horrible 1980s B&W X11 desktop, complete with ugly 'X' cursor.

Re:Reminder (1)

jones_supa (887896) | about 5 months ago | (#46428369)

Me too. :)

Re:Reminder (1)

Joce640k (829181) | about 5 months ago | (#46430447)

I think the last time I saw it was when I connected to a Raspberry Pi last year with some free X11 client or other.

I cracked up laughing when it appeared.

Re:Reminder (4, Funny)

red_dragon (1761) | about 5 months ago | (#46428491)

So do I. Before I start the X server I put my Quaker hat on and then say "HA! HA! I AM USING X11!" and everyone around looks at me saying "WTF is this weirdo talking about?" so I have to put the hat away and shut up. Nobody appreciates good software and hats these days.

Re:Reminder (1)

tlhIngan (30335) | about 5 months ago | (#46428577)

I dunno, I always get a big belly laugh whenever I log into something and see that horrible 1980s B&W X11 desktop, complete with ugly 'X' cursor.

Don't forget the stipple pattern background!

Re:Reminder (1)

wiredlogic (135348) | about 5 months ago | (#46430925)

Hey man don't knock the stipple. For a long time I used a custom .xconfig to make tools with Athena style scrollbars like classic xterm have a nice gray on white style that would blend together into a solid color on a high res CRT. Add in some scroll wheel support and colored highlighting and that was one bad little terminal emulator.

Even better... (2)

morgauxo (974071) | about 5 months ago | (#46429733)

I'll use X until it is pried from my cold dead hands. Or until Wayland has network transparency at least on par with X. Whichever happens first.

Recently I was re-installing my desktop (Gentoo) from scratch and decided to have a go at not installing any big heavy desktop environment. I already use Ratpoison when connected over VNC and have memorized most of the key combos, so I thought I would try Ratpoison on the local desktop. Completely banishing KDE, I switched from KDM to XDM.

I still have a stock XDM config. I think it's hilarious seeing that 80s vintage login on an almost modern machine and having it lead to perfectly up-to-date applications. Maybe some day I will take the time to pretty it up. I have seen screenshots that show XDM can be made to look nice. But... it's only there until I log in. Why bother?

Re:Even better... (1)

fisted (2295862) | about 5 months ago | (#46436871)

You must be a Linux user. "It's old.. it works well.. -- Let's change it!"

IOW, don't fkn touch it.

Re:Reminder (2)

Ford Prefect (8777) | about 5 months ago | (#46430001)

I dunno, I always get a big belly laugh whenever I log into something and see that horrible 1980s B&W X11 desktop, complete with ugly 'X' cursor.

Try flying on a Virgin America plane with the LCD screen inflight entertainment systems in the seat-backs. They'll often mass-reboot the things before or after a flight, briefly revealing that retro-fantastic, monochrome stippled background with 'X' cursor...

Re:Reminder (1)

fisted (2295862) | about 5 months ago | (#46436851)

All your laughter does is give away how little understanding there is on your side.
There's a strict separation between mechanism and policy.
X11 provides the mechanism; it creates a display to draw on, and provides basic operations to do the actual drawing of primitives.
Window managers and the like provide the policy: how the GUI behaves, how windows are decorated, what pixmap is used for the cursor, etc.

This is a very useful and proven separation of concepts; there's nothing "horrible" about it. In fact, it would be horrible if it were different.

Raises hand... (1)

Joce640k (829181) | about 5 months ago | (#46428235)

"Before doing that, we want to try and work out some general principles about how to handle core X fill styles."

Use textures! And stencil bitplanes!

Vs compositing? (3, Interesting)

Anonymous Coward | about 5 months ago | (#46428291)

I wonder, how does this relate to the compositing engine? Ain't surfaces already drawn using GPU-accelerated functions when using GL-based compositing?

Re:Vs compositing? (2)

fnj (64210) | about 5 months ago | (#46429623)

I wonder, how does it relate to compositing engine? Ain't surfaces already drawn using GPU accelerated function when using GL-based compositing ?

I would like to know this too. And not just with megabuck megawatt GPUs, but with something reasonable like Intel HD2000, 3000, and 4000.

Re:Vs compositing? (3, Interesting)

Gaygirlie (1657131) | about 5 months ago | (#46429797)

I wonder, how does it relate to compositing engine? Ain't surfaces already drawn using GPU accelerated function when using GL-based compositing ?

The windows themselves should be drawn via the GPU on a modern compositing engine, sure, but the window contents have nothing to do with compositing managers; an app, depending on what UI toolkit it uses, may be drawing its buttons, text entries, scrollbars and whatnot via software, via somewhat outdated hardware-accelerated 2D paths, or via the 3D engine. Many drivers these days don't even bother trying to support the whole range of 2D-accelerated methods, and some don't support them at all, so the toolkits that still use these methods basically fall back to software rendering.

FYI (-1)

ArcadeMan (2766669) | about 5 months ago | (#46428305)

It hasn't been called "Mac OS" for about a decade now. It's OS X.

Re:FYI (1)

Anonymous Coward | about 5 months ago | (#46428447)

Actually, it was officially "Mac OS X Snow Leopard" until 2011, when it was renamed "OS X Lion."

Considering the user share of 10.6 and the amount of time it was Mac OS (X) before then, I think it's fair to forgive the misnomer.

Re:FYI (0)

Anonymous Coward | about 5 months ago | (#46428465)

Can't edit because anon.

It was actually OS X Mountain Lion which prompted the change from Mac OS X to OS X, which was released in 2012.

Re:FYI (1)

ArcadeMan (2766669) | about 5 months ago | (#46428519)

It was referred to as "Mac OS" in the pre-OS X days.

"Mac OS" is from the Windows 95/98/XP era.

Re:FYI (1)

unixisc (2429386) | about 5 months ago | (#46430227)

Yeah, OS X is not the same OS as everything from System 6 to System 9. That was a homegrown cooperative multitasking system that had no UNIX in it, whereas OS X is a NeXTSTEP-derived OS which in its current form derives a lot from the FreeBSD project and others.

Re:FYI (1)

ArcadeMan (2766669) | about 5 months ago | (#46428535)

I need to specify "Mac OS != Mac OS X" because some mods hate Apple so much that they can't even be bothered to learn anything about it.

Calling OS X "Mac OS" (instead of at least "Mac OS X") is like calling the latest Windows version "MS-DOS 2014".

Windows 8 = "MS-DOS 2014"? (1)

Anonymous Coward | about 5 months ago | (#46428667)

Microsoft is going back to their roots of flat filesystems, cumbersome interface, and limited multitasking. This whole push to Metro finally makes sense!

Re:Windows 8 = "MS-DOS 2014"? (0)

Anonymous Coward | about 5 months ago | (#46428905)

Exactly the same way Android and iOS work. Imagine that. Letting other apps run in the background will kill your battery life. Suspend them. If you need true multitasking, use regular Windows apps. Many workloads nowadays are not about true multitasking, just pausing and resuming work.

If you need true multitasking, use regular Window (1)

morgauxo (974071) | about 5 months ago | (#46429757)

Actually there are times I want full multitasking on my Android phone. I didn't know it was an option to run regular Windows apps on it. Thanks for enlightening me! Now please, teach me how!

Re:FYI (1)

Anonymous Coward | about 5 months ago | (#46428849)

The X [wikipedia.org] just represents the major version number. So it's actually more like calling "Windows 8", "Windows".

Re:FYI (1)

jedidiah (1196) | about 5 months ago | (#46429047)

> It hasn't been called "Mac OS" for about a decade now. It's OS X.

So fucking what? I'm not participating in Apple's asshattery here. They can fool an idiot like you into helping to hijack other people's trademarks but I'm not participating.

Re:FYI (1)

ArcadeMan (2766669) | about 5 months ago | (#46431147)

And going from "Xbox 360" to "Xbox One" is not asshattery? At least "X" is further along the ASCII table than "9".

Yay! (3, Insightful)

buchner.johannes (1139593) | about 5 months ago | (#46428395)

Cheers to the heroes working on improving X. It's probably the most important piece of software on GNU/Linux. Real hackers working on the most complex issues.

Re:Yay! (0)

fnj (64210) | about 5 months ago | (#46429647)

Well, it's utterly irrelevant for server use, so it can't be THAT important, but it is definitely right up there for desktop Linux, which COULD potentially have a whole hell of a lot more penetration than it does now.

Re:Yay! (0)

Anonymous Coward | about 5 months ago | (#46430171)

Not really. I've used virtual X servers to run X software on servers before, for many purposes. Granted, it's always a disgusting hack, but it's nice to have that option when your only choice is a disgusting hack.

Re:Yay! (1)

fisted (2295862) | about 5 months ago | (#46436885)

You don't need an X server on the server in order to run X clients on the server; just have them render on your local display.

Re:Yay! (0)

Anonymous Coward | about 5 months ago | (#46433497)

Cheers to the heros working on improving X. It's probably the most important piece of software on GNU/Linux. Real hackers working there on the most complex issues.

Yes, making sure your Motif applications from 1992 are hardware accelerated by your cutting edge GPU from 2008 is a really important piece of work, sure.

Nobody uses the Xserver's drawing commands, not GTK, not Qt. Who the hell cares if diagonally striped rectangles are accelerated or not? Nobody uses that crap any more.

Why? (1)

LifesABeach (234436) | about 5 months ago | (#46428561)

From reading the blog, it appears that there is some benefit to creating a 2D acceleration wrapper around OpenGL? It's obvious I'm not getting it.

Why add a layer of complexity to OpenGL? Why not just explain OpenGL for 2D operations more clearly?

Re:Why? (4, Insightful)

squiggleslash (241428) | about 5 months ago | (#46428967)

The concept, as I understand it, is that at the moment, in order to write a device driver for X11 you have to separately manage code that implements 2D and 3D graphics primitives. Given that 2D operations are themselves a subset of 3D operations (even if the API doesn't reflect that), it makes sense to simply have device drivers implement the 3D parts. Then common wrapper code can implement the 2D, relieving driver developers of the burden of building and testing an entirely new block of code.

It should make X.org more reliable, as the same 2D code will be used for all drivers, and should end up being pretty solid. In the meantime, driver developers have more time to polish their 3D driver implementations. Win win. Maybe a slight performance hit, but probably not a significant one.

Re:Why? (1)

Anonymous Coward | about 5 months ago | (#46429561)

To understand, you first need to believe that GPUs are becoming ubiquitous through integration; even portable devices provide GPUs today. This is a new phenomenon; only a few years ago you could purchase a laptop with integrated graphics that did not provide a 3D GPU. It's rare to encounter a new desktop/laptop that doesn't have a built-in GPU today.

So, with the assumption that OpenGL-capable GPUs will eventually be a given, there are two benefits. First, every 2D operation is a subset of a 3D one; a 2D application in OpenGL is simply an orthographic projection with an identity transform of 2D quads. A single well-maintained 2D path implemented on top of OpenGL serves all cases, so the many existing distinct 2D implementations can be discarded. Second, the effort required of GPU vendors is reduced; it should be sufficient to implement only the OpenGL part of the stack and get legacy 2D for free.

What you really need to believe (1)

morgauxo (974071) | about 5 months ago | (#46429787)

What you really need to believe is that GPUs with decent X support are becoming ubiquitous. Devices where the GPU becomes a paperweight the moment you try to run X, because the maker doesn't want to supply any info, might as well not have GPUs.

Eagerly awaiting ickle benchmarks (4, Interesting)

pavon (30274) | about 5 months ago | (#46428601)

The cairo-ickle blog [wordpress.com] has maintained very interesting benchmarks of the different cairo [cairographics.org] rendering backends. The short story is that every hardware-accelerated backend except for the Sandy Bridge SNA backend has performed worse than the software implementation, and in some cases the hardware acceleration is significantly less stable. I'm curious to see if this finally pushes Glamor over the hump and makes it faster than the software path.

Re:Eagerly awaiting ickle benchmarks (4, Interesting)

Chemisor (97276) | about 5 months ago | (#46429203)

I wonder if the differences are due to extracting the result from the GPU. There is no doubt whatsoever that doing 2D with OpenGL on the GPU will be faster than a software rasterizer - what kills the performance in these tests is having to copy the result back to the CPU so it can be displayed in an X window. Once X windows are fully composited and output graphics never leave the GPU memory, the hardware acceleration will no doubt prove to be the fastest.

the solution is HSA (0)

Anonymous Coward | about 5 months ago | (#46433257)

then there's no copying back and forth

Re:Eagerly awaiting ickle benchmarks (1)

Anonymous Coward | about 5 months ago | (#46429259)

Making it faster than the software path isn't the hump. User responsiveness is. If Cairo through my GPU is 20% slower, but the CPU now gets 30% of its time back, then my application could run up to 30% faster. But that's only good if I need that 30% CPU time for the rest of my application and I'm not blocking on draw calls. (For games that's certainly true. For most applications that use Cairo, I couldn't say.)

Re:Eagerly awaiting ickle benchmarks (1)

OdinOdin_ (266277) | about 5 months ago | (#46436523)

And the great thing about this project is that if a graphics hardware vendor needs to choose what to spend 1,000 hours working on, it can now be the 3D driver instead of the 2D driver, because someone else created a layer that uses the 3D driver for all 2D operations.

Previously the money/time was spent on the 2D side to serve the largest target audience for the hardware, and unfortunately that 2D effort could not be utilized for the Linux 3D use case, so no wonder improvements to it were hard to come by.

So expect those crappy 3D drivers to get good in the areas needed by the 2D-over-3D code real soon, and while they're in there they might as well work on all the low-hanging fruit concerning 3D usage.

So don't apply your previous knowledge of Linux 3D driver support to what the future will now bring to both the 2D and 3D scene.

Can it acclerate the dictionary? (1)

fascismforthepeople (2805977) | about 5 months ago | (#46428675)

Accleration

Could have been prevented...

reviving the dead? (0)

Anonymous Coward | about 5 months ago | (#46429969)

Wasn't X11 supposed to be completely replaced by Wayland by the end of 2015 at the latest?

Re:reviving the dead? (0)

Anonymous Coward | about 5 months ago | (#46430565)

Wasn't X11 supposed to be completely replaced by Wayland by the end of 2015 at the latest?

Not going to happen unless wayland gets proper network transparency...

X network transparency is hardly useful anymore (1, Informative)

Anonymous Coward | about 5 months ago | (#46432219)

For everyone who disparages Wayland without really understanding anything about it, which seems to be most everyone, I highly recommend listening to this talk by a core X.org developer:

http://www.youtube.com/watch?v... [youtube.com]

TL;DR points:
- X11 is no longer "network transparent" and hasn't been so in a long time, due to reliance on DRI, Xrender, Xvideo, etc.
- X11 is already used in a manner that is similar to Wayland but with a very poor inter-process communication layer and synchronization issues, with most of X11's core bypassed (server-side fonts, drawing APIs, etc).
- X11 when used remotely is already like VNC, but very poor at it. Lots of round-trips, etc, all to show bitmaps.

In the end, there are a few things I need from Wayland, and I think they will be there in the end:
- app-based network transparency, not just remote desktop
- middle click paste. Maybe done with a virtual frame buffer and rdp to ship the final rendering across the wire.
- customizable focus policy (focus follows mouse, click to raise)
- user replaceable window/composite managers

I don't know what YOU... (0)

Anonymous Coward | about 5 months ago | (#46434089)

are talking about.

I just started up a pidgin instance from Server A to Desktop B and it runs plenty smooth over a LAN. Far smoother, in fact, than an equivalent VNC session.

Now mind you, I agree that OpenGL, XVideo scaling, etc. don't currently offer the same level of network transparency, but that is in large part due to those same schmucks producing Glamor and Wayland not taking the time to architect protocol extensions that are transmissible over the network. It's not in fact any inherent shortcoming of either X's protocol or architecture, but rather simply lazy SOBs taking a half-assed approach to development.

Sure it sped up GL support back in the Utah GLX/DRI1 days, but a side effect of it was destroying the utility of X as a network-transparent system. A similar thing happened with OSS versus ALSA, and NAS versus ESD and then PulseAudio.
While OSS had legacy limitations to be overcome, as did NAS (which doesn't work with ALSA's OSS emulation anymore, although it once did!), both were far more flexible in regard to their intended uses than their replacements, both of which were decidedly desktop-oriented and often have issues in the 'professional audio' or 'network transparency' realms, respectively.

While it's unlikely either alternative would get the development time to bring it up to par with the new incumbents, it is perhaps time for a serious and thoughtful look at the shortcomings of both the old and new systems, and perhaps a way to reimplement them now that the hardware has matured and 20/20 hindsight is available for how the drivers/software could best be architected. Then if any future tech shifts dramatically away from them, provide a new and separate solution to handle the changes, without simply tacking half-assed attempts at features onto something that otherwise could be considered mature, stable, and well engineered.
