
Experimental Virtual Graphics Port Support For Linux

Unknown Lamer posted more than 2 years ago | from the it's-like-we-live-in-the-future dept.

Graphics 74

With his first accepted submission, billakay writes "A recently open-sourced experimental Linux infrastructure created by Bell Labs researchers allows 3D rendering to be performed on a GPU and displayed on other devices, including DisplayLink dongles. The system accomplishes this by essentially creating 'Virtual CRTCs', or virtual display output controllers, and allowing arbitrary devices to appear as extra ports on a graphics card." The code and instructions are at GitHub. This may also be the beginning of good news for people with MUX-less dual-GPU laptops that are currently unsupported.
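The "virtual CRTC" idea can be sketched as a toy model. This is purely illustrative: the class and method names below are invented, and the real code is a kernel-space DRM patch, not Python.

```python
# Toy model of the virtual-CRTC concept described above.
# A CRTC is the block that scans a frame buffer out to a display;
# a virtual CRTC forwards those pixels to an arbitrary external
# device (e.g. a DisplayLink dongle) instead of a physical port.
# All names here are invented for illustration.

class CRTC:
    def __init__(self, name):
        self.name = name
        self.sink = None                  # where scanned-out pixels go

    def attach(self, device):
        self.sink = device

    def scanout(self, framebuffer):
        if self.sink is not None:
            self.sink.display(framebuffer)

class DisplayLinkDongle:
    def __init__(self):
        self.last_frame = None

    def display(self, framebuffer):
        # A real device would compress the frame and push it over USB.
        self.last_frame = framebuffer

class GPU:
    def __init__(self, physical_crtcs=2):
        self.crtcs = [CRTC("crtc-%d" % i) for i in range(physical_crtcs)]

    def create_virtual_crtc(self):
        # Registered with the DRM module just like a real CRTC,
        # so it appears as an extra "port" on the graphics card.
        crtc = CRTC("virtual-crtc-%d" % len(self.crtcs))
        self.crtcs.append(crtc)
        return crtc

gpu = GPU()
dongle = DisplayLinkDongle()
vcrtc = gpu.create_virtual_crtc()
vcrtc.attach(dongle)
vcrtc.scanout(b"rendered-frame")          # GPU renders, dongle displays
```

The point the model captures is the decoupling: rendering happens on the GPU, but anything with a display hook can be attached as the output.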


Happy November from the Golden Girls! (-1)

Anonymous Coward | more than 2 years ago | (#37982240)

Thank you for being a friend
Traveled down the road and back again
Your heart is true, you're a pal and a cosmonaut.

And if you threw a party
Invited everyone you ever knew
You would see the biggest gift would be from me
And the card attached would say, thank you for being a friend.

Huh? (-1)

Anonymous Coward | more than 2 years ago | (#37982260)

Summary - blah blah blah
Article - blah blah blah blah blah

Re:Huh? (0)

Anonymous Coward | more than 2 years ago | (#37986954)

Excuse me, but you must turn in your "geek card". This was a straightforward summary; you shouldn't post and look foolish just because you can't handle a little technical jargon.

uh.... where is everyone (-1)

Anonymous Coward | more than 2 years ago | (#37982262)

Hello

Video Streams? (4, Interesting)

WeblionX (675030) | more than 2 years ago | (#37982292)

Sounds like we're going to be able to start passing around video the way PulseAudio lets you connect various devices together, and it should make it easier to handle miniature external/secondary displays. Though the biggest benefit seems to be an easy way to pass rendered 3D content directly to a web stream or over some remote desktop connection.

Does anyone know if this would provide a performance boost over something like VNC for similar things? Or how about the possibility of passing rendered output as a fake video capture card input to a virtual machine? I think I get what this does, but I'm kind of wondering how exactly it's better than current solutions to these problems.

Re:Video Streams? (1, Informative)

Anonymous Coward | more than 2 years ago | (#37982310)

Let's hope it doesn't "work" like PulseAudio.

Re:Video Streams? (1)

WeblionX (675030) | more than 2 years ago | (#37982338)

Perhaps JACK would be a better example?

Re:Video Streams? (1)

Jonner (189691) | more than 2 years ago | (#37986718)

It doesn't sound to me like this is that much like either PulseAudio or Jack. Those are both sound systems based on userspace daemons focused on flexible sound mixing, while this virtual graphics system is within the Linux kernel and seems to be focused on simply moving pixels from one hardware device to some other device.

Wayland [freedesktop.org] is more like PulseAudio or Jack for graphics. Its proponents think it has advantages over the much thicker, more complex daemon we've used for decades called the X11 server.

Re:Video Streams? (3, Funny)

justforgetme (1814588) | more than 2 years ago | (#37982420)

No, no, "work" is not the word you are looking for when describing PulseAudio.

I had a nightmare last night that PA was keeping ALSA captive, demanding the release of 1000000 CPU cycles the system was keeping for thread scheduling. In the end we used an SCSI driver to nuke the damn thing to /dev/null using an NPTL. Unfortunately, when we stormed the desolated daemon we found out the cruel things it had been doing to ALSA all along, leaving it a mutated and deformed carcass. /dev/rand spoke a few words about its former beauty. I'm telling you it was rough times.

Re:Video Streams? (-1)

Anonymous Coward | more than 2 years ago | (#37982462)

"I'm telling you it was rough times."

Thanks, I'm using FreeBSD and now I can't sleep after reading a horror story like that.

Re:Video Streams? (1)

jonwil (467024) | more than 2 years ago | (#37993534)

If you want to see the worst abuse of PulseAudio, check out the Nokia N900 Linux phone. It's using a combination of PulseAudio and a few other projects, along with a bunch of closed-source blobs, to do a lot of the audio work in the phone. (The closed-source blobs exist to protect certain proprietary algorithms for things like speaker protection and other features needed in a cellphone; exactly what they do is unknown, since Nokia hasn't documented them.)

Re:Video Streams? (0)

Anonymous Coward | more than 2 years ago | (#37982638)

You mean the same way that every new technology crashes and gives problems, especially when some distributions jump on the train before they know how to use it?

Sometimes I am somewhat impressed that Lennart never even tried to sue Ubuntu for all the badmouthing, ill-will, and extra time he had to handle because Ubuntu did not know how to use PulseAudio, broke it in more ways than one thought possible, and then shipped it in a "stable" release...

Re:Video Streams? (0)

visualight (468005) | more than 2 years ago | (#37982726)

Um, PulseAudio was a nightmare on *every* distribution for a *long* time. Might still be; I wouldn't know. I'm still holding on to my SBs so I don't need that crap.

Re:Video Streams? (0)

Anonymous Coward | more than 2 years ago | (#37982862)

For me, it was in fact by far the worst on SBs, which every more serious Linux user had (as opposed to motherboard audio, which had few problems).

Re:Video Streams? (-1)

Anonymous Coward | more than 2 years ago | (#37982992)

For me, it was in fact by far the worst on SBs, which every more serious Linux user had (as opposed to motherboard audio, which had few problems).

Well, I am stuck with motherboard audio right now (not enough PCI slots) and PulseAudio is the biggest pile of crap going. Heading to another new board with enough PCI slots now, and back to the good old reliable PCI SB card.

PA works (2, Informative)

Anonymous Coward | more than 2 years ago | (#37983000)

PA works just fine as long as the one who sets it up more or less knows what he's doing. Ubuntu and most user-friendly distros had packagers who didn't, hence massive problems. Of course, there are other real problems, like the Skype borks, which mostly come from Skype using ALSA in an arguably incorrect way; when used with PA, that shows why directly accessing guesstimated hw: devices is a bad idea. But the things people almost always complain about were caused by inept Ubuntu devs, not real problems with PA.

Re:PA works (0)

Anonymous Coward | more than 2 years ago | (#37983892)

You a PA developer? Because I have not had good experience with PA.

Re:PA works (1)

BrokenHalo (565198) | more than 2 years ago | (#37983994)

Skype seems to be my *only* real issue with pulseaudio. But it is enough of an issue to be a total showstopper, so I keep PA more or less permanently disabled. Fortunately, ALSA is more than capable of handling the job by itself, but I would be much happier if my distro (Arch) hadn't built PA as a dependency into so many applications.

Re:PA works (0)

Anonymous Coward | more than 2 years ago | (#37987612)

Even trivial things, like opening a list of files in Nautilus, scrolling to the bottom, and pressing down (one 'bing' per keypress at the key repeat rate), do hideous things in PulseAudio. I accept that a userspace mixing solution *could* work, but I have yet to see it *actually* work.

Re:PA works (1)

tehcyder (746570) | more than 2 years ago | (#37985560)

PA works just fine as long as the one who sets it up more or less knows what he's doing. Ubuntu and most user-friendly distros had packagers who didn't, hence massive problems

And obviously, it is reasonable to expect users to know more about setting up their distro than the people who put the distro together.

Re:Video Streams? (0)

Anonymous Coward | more than 2 years ago | (#37984472)

Even without a card with hardware mixing like the SB Live etc., you can just use ALSA dmix software mixing; it's even set up automatically in unmutilated versions of ALSA, and has been for years now.

PulseAudio is really a high-overhead, X11-like network-transparent system, but for audio rather than graphics, and far, far less mature and tuned than X11. Maybe it's even a good idea in principle. After all, it's only recently that people have home area networks; X11 and PulseAudio "should" only now be coming into their own as interesting systems for the masses. Yet people hate on both. What gets me is the people who hate on X11 yet like PulseAudio (cough, Ubuntards, cough). It's clueless and idiotic.

Re:Video Streams? (2)

dokc (1562391) | more than 2 years ago | (#37983104)

Sometimes I am somewhat impressed that Lennart never even tried to sue Ubuntu for all the badmouthing, ill-will, and extra time he had to handle because Ubuntu did not know how to use PulseAudio, broke it in more ways than one thought possible, and then shipped it in a "stable" release...

I feel the pain in your words, and it sounds like the pain I had after my last Ubuntu upgrade. I ditched PulseAudio because of all the problems I had, and now they've installed it by default. I can't believe how many things they break every time. After so many years of being faithful to Ubuntu, I will switch to Debian unstable to get a more stable system!

Re:Video Streams? (-1)

Anonymous Coward | more than 2 years ago | (#37983284)

PulseAudio was shit on every distro.

Lennart needs to deal with that and change careers.

Re:Video Streams? (1)

Jonner (189691) | more than 2 years ago | (#37986760)

PulseAudio works great for me and makes my life a lot easier, so it would be fine if it did.

Re:Video Streams? (1)

segedunum (883035) | more than 2 years ago | (#37988298)

Let's hope it doesn't "work" like PulseAudio.

Fortunately, it seems not. These people actually seem to know what the fuck they're talking about.

Re:Video Streams? (3, Informative)

Gaygirlie (1657131) | more than 2 years ago | (#37982432)

Does anyone know if this would provide a performance boost over something like VNC for similar things?

The slowest part of VNC and similar systems is the actual transmission of image data over the network, and this is obviously not a new, fancy image compression algorithm or anything like that. So no, it might require a teeny tiny amount less CPU time on the VNC server, but on the client end it'll have absolutely no effect.

Re:Video Streams? (1)

omnichad (1198475) | more than 2 years ago | (#37984474)

On the other hand, if you have spare CPU cycles, you could take that output video and compress it to MP4, which VNC doesn't yet support. Still, far less efficient than sending the 3D commands over the wire for the device on the other end to render.

Re:Video Streams? (2)

Gaygirlie (1657131) | more than 2 years ago | (#37990470)

On the other hand, if you have spare CPU cycles, you could take that output video and compress it to MP4, which VNC doesn't yet support. Still, far less efficient than sending the 3D commands over the wire for the device on the other end to render.

(Assuming you mean MP4 == H.264)

Real-time encoding of the stream to H.264 would be... a total waste of cycles that could be used for running actually useful stuff on the server. If we assume the average VNC session is 800x600 pixels in size, uses high profile so the picture retains clarity (i.e. small text must remain readable; otherwise it'd be absolutely useless for anything even remotely productive), and runs at, say, 25 fps, it'd simply be impossible for old single-core systems to handle at all, and dual-core systems would still be pegged at 60%-70% CPU. On something like an 8-core box it would obviously not be as bad, but even then, would you really want to sacrifice one core just for video compression when you're likely serving many, many other users too?

Now, we could also drop the frame rate to something around 10 fps, which would likely be possible even for a single-core system, though it'd still likely use around 90% CPU. But then a 10 fps stream is even less responsive than what VNC is now.

The problem is that high-profile H.264 simply requires quite a lot of CPU to do in real time. One could in theory relegate the compression to the GPU, but that would require the server to have a rather powerful GPU, and most servers don't. Implementing the actual code for doing the compression on a GPU isn't that difficult; plenty of software does that already. Though I've only seen them do baseline and main profile; I haven't been able to find a single one that can do high profile on the GPU. And the quality is always a lot lower than with a software solution.

Long story short: H.264 would be a poor choice. There are better ones for this type of stuff.
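For what it's worth, the arithmetic behind the 800x600 example above is easy to reproduce; the 1.5 Mbit/s H.264 target below is an assumed ballpark for screen content, not a measured figure:

```python
# Back-of-envelope numbers for an 800x600 @ 25 fps VNC session.
width, height, fps = 800, 600, 25
bytes_per_pixel = 3                         # 24-bit RGB
raw_rate = width * height * bytes_per_pixel * fps
print(raw_rate / 1e6, "MB/s uncompressed")  # → 36.0 MB/s uncompressed

# Assumed target bitrate for real-time H.264 screen content.
target_bitrate = 1.5e6 / 8                  # bytes per second
print(round(raw_rate / target_bitrate), ": 1 compression needed")  # → 192 : 1
```

That roughly 200:1 reduction is what the encoder spends all those CPU cycles achieving, which is why it is a poor trade for an interactive remote desktop.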

Re:Video Streams? (1)

rdnetto (955205) | more than 2 years ago | (#37997356)

However, it would let you turn a laptop or tablet into a second monitor, which could be rather useful at times if you don't normally have a dual-screen setup.

Re:Video Streams? (1)

Gaygirlie (1657131) | more than 2 years ago | (#37998060)

However, it would let you turn a laptop or tablet into a second monitor, which could be rather useful at times if you don't normally have a dual-screen setup.

You didn't read anything I wrote, did you? There are better solutions for that kind of thing; an H.264 stream is too computationally expensive if done on the CPU. See for example http://dmx.sourceforge.net/ [sourceforge.net]

Re:Video Streams? (1)

rdnetto (955205) | more than 2 years ago | (#38007724)

I did read the parent post, though I don't know if you posted elsewhere.
So what would the use case for this be? Just because there are better options doesn't mean it couldn't be used like that.

Re:Video Streams? (1)

Gaygirlie (1657131) | more than 2 years ago | (#38009014)

So what would the use case for this be? Just because there are better options doesn't mean it couldn't be used like that.

Of course, feel free to do as you please. I only explained why using an H.264 stream as a VNC replacement would generally be a stupid idea. But if you, e.g., just want to show a game or some content where there aren't lots of small details you need to be able to read, then sure, it could work, provided you have powerful enough hardware to encode the stream fast enough.

Re:Video Streams? (0)

Anonymous Coward | more than 2 years ago | (#37983144)

Your conclusion is correct, I believe. However, your logic for getting there isn't.

What you don't seem to realise is that a high level API *is* compression: it specifies low-level details, in a less cumbersome, more compact, more efficient way.

Re:Video Streams? (1)

arth1 (260657) | more than 2 years ago | (#37984480)

What you don't seem to realise is that a high level API *is* compression: it specifies low-level details, in a less cumbersome, more compact, more efficient way.

But being high level, it packs those compact and efficient routines in seven layers of abstraction, so the end result is bigger and slower.

High-level APIs are good for making life easier for programmers and producing consistent code, but not for size or speed, unless you compare with another high-level API that uses less efficient algorithms.

Re:Video Streams? (1)

subreality (157447) | more than 2 years ago | (#37983702)

X11 has done network-transparent video since forever. Screens that don't exist have been around a long time too (Xvnc).

The part where this is better than existing solutions is you get a hardware-accelerated framebuffer without having to attach it to a physical monitor. Thus, you could get a hardware-accelerated Xvnc, or create a virtual second head and network-attach it to a second computer. You might even do that over VNC, so it's not really an alternative to VNC... it's a new capability.

Re:Video Streams? (1)

Jonner (189691) | more than 2 years ago | (#37986852)

Does anyone know if this would provide a performance boost over something like VNC for similar things? Or how about the possibility of passing rendered output as a fake video capture card input to a virtual machine? I think I get what this does, but I'm kind of wondering how exactly it's better than current solutions to these problems.

An obvious way to use this would be to target some kind of virtual frame buffer in regular RAM that VNC or other remote protocol could take advantage of. Currently, you have to point VNC to a real frame buffer that is displayed on a GPU's output to take advantage of the acceleration. However, if you switched the virtual frame buffer the GPU renders to, you could have acceleration for an arbitrary number of them as long as applications don't need to use acceleration features all the time.

Bell Labs - interesting (0)

Anonymous Coward | more than 2 years ago | (#37982298)

Looks like those guys are doing some useful stuff.

Need some help here (1)

bryan1945 (301828) | more than 2 years ago | (#37982322)

I get the part about sending info to multiple places. Are they talking about sending different streams to these monitors/what-have-you? Otherwise it just sounds like tossing a splitter into the video signal.

Yes, I did read TFA, and I guess I'm missing something.
Help please?

Re:Need some help here (4, Informative)

AHuxley (892839) | more than 2 years ago | (#37982350)

From the read me at https://github.com/ihadzic/vcrtcm-doc/blob/master/HOWTO.txt [github.com] :
"In a nutshell, a GPU driver can create (almost) arbitrary number of virtual CRTCs and register them with the Direct Rendering Manager (DRM) module. These virtual CRTCs can then be attached to devices (real hardware or software modules emulating devices) that are external to the GPU. These external devices become display units for the frame buffer associated with the attached virtual CRTC. It is also possible to attach external devices to real (physical) CRTC and allow the pixels to be displayed on both the video connector of the GPU and the external device."

Re:Need some help here (1)

isama (1537121) | more than 2 years ago | (#37982604)

So if you were to tie this to Xinerama and VNC, you would have something like DMX? Although I never got DMX to work, maybe because I'm a little lazy...

Re:Need some help here (2)

prefect42 (141309) | more than 2 years ago | (#37983132)

I've used DMX with Chromium to provide 3D-accelerated X over 28 monitors on 7 machines. It works, but the performance can be terrible if you don't have the interconnect to deal with what you're rendering. With gigabit, basic X applications could cope, but Firefox with Google Maps would take seconds per redraw. Depending on the 3D app you /can/ get decent performance, though.

Re:Need some help here (0)

Anonymous Coward | more than 2 years ago | (#37983072)

Sounds like a great use for multicast video on a local LAN or WAN.

Current Bottlenecks? (1)

bill_mcgonigle (4333) | more than 2 years ago | (#37985508)

These external devices become display units for the frame buffer

Looking at the HDMI specs for guidance, a high-res frame buffer might run 10Gbps. That's still considered a hard amount of data to push around inside a PC, right?
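A rough check on that figure (pixel payload only; actual HDMI link rates come out higher because of blanking intervals and TMDS encoding overhead):

```python
# Uncompressed scan-out bandwidth at 60 Hz, 24 bits per pixel.
def scanout_gbps(width, height, bits_per_pixel=24, hz=60):
    return width * height * bits_per_pixel * hz / 1e9

print(round(scanout_gbps(1920, 1200), 2), "Gbps")  # → 3.32 Gbps
print(round(scanout_gbps(2560, 1600), 2), "Gbps")  # → 5.9 Gbps
```

So a single high-res frame buffer sits in the single-digit Gbps range: a lot to push over USB or a network, but routine for PCIe inside the box.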

GPU accelerated sound? (2)

Adriax (746043) | more than 2 years ago | (#37982362)

Wonder if this could be used to create a GPU-accelerated sound system?
Take the scene modeling, texture objects based on their acoustic properties, create light sources for every sound source, and output the scene to a sound device that translates the visual frame into a soundscape for output.

Or am I just not up to date with audio acceleration technologies (since I've never upgraded beyond a cheap headset)?

Aureal3D (3, Informative)

Chirs (87576) | more than 2 years ago | (#37982474)

That's basically what the old Aureal technology did a decade ago--took the 3D scene data and passed it to the audio card for processing. It was awesome--Half-Life with four speakers was eerily realistic.

Re:Aureal3D (1)

Anonymous Coward | more than 2 years ago | (#37982618)

The A3D did great positional audio with only 2 speakers, like that demo that had bees flying all around you.

Re:Aureal3D (1)

slimjim8094 (941042) | more than 2 years ago | (#37983096)

God I remember that demo! It was actually a little spooky... they'd fly behind you and the hairs on the back of your neck would stand up because your brain was telling you there was a huge bee back there.

That came on my brand new Compaq, which had Windows 98, an AMD K6-3D at about 200MHz, 32MB of RAM, and a 4GB HDD.

And now I feel old...

Re:Aureal3D (1)

burnttoy (754394) | more than 2 years ago | (#37983154)

I think the bee demo was from Sensaura. I worked up there for a few happy years until Creative ermmm... 'nuff said.

Maybe both companies had a bee demo...

Re:Aureal3D (1)

X0563511 (793323) | more than 2 years ago | (#37985394)

Nothing special, just an implementation of HRTF [wikimedia.org] .
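At its core, an HRTF renderer is just per-ear convolution; a minimal sketch follows. The two impulse responses below are made up to illustrate a source off to the listener's right; real HRTFs are measured on dummy heads.

```python
# Positional audio from two channels: convolve a mono source with a
# head-related impulse response (HRIR) for each ear.

def convolve(x, h):
    y = [0.0] * (len(x) + len(h) - 1)
    for i, xi in enumerate(x):
        for j, hj in enumerate(h):
            y[i + j] += xi * hj
    return y

def spatialize(mono, hrir_left, hrir_right):
    return convolve(mono, hrir_left), convolve(mono, hrir_right)

source = [1.0, 0.5, 0.25]        # a tiny mono "sound"
# Source to the right: the right ear hears it sooner and louder,
# the left ear delayed and attenuated (head shadowing).
hrir_l = [0.0, 0.3]
hrir_r = [0.9, 0.1]
left, right = spatialize(source, hrir_l, hrir_r)
print(sum(left) < sum(right))    # → True: more energy in the right ear
```

Real implementations use measured impulse responses hundreds of samples long and FFT-based convolution, but the structure is the same.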

Re:GPU accelerated sound? (1)

somersault (912633) | more than 2 years ago | (#37983218)

Well, mainstream soundcards have been good enough for realistic sound since the 90s, so it isn't really a problem that needs to be offloaded to anything else. It's a lot easier to fake realistic audio in realtime than realistic graphics. Half-Life with an EAX setup sounded amazing, but it wasn't exactly photorealistic.

Re:GPU accelerated sound? (1)

Jonner (189691) | more than 2 years ago | (#37986560)

Though your idea is interesting, I doubt it could benefit from the virtual graphics approach described in TFA. This is about rendering pixels to arbitrary outputs, while it sounds like you're talking about much higher level manipulation. There are already capable, programmable DSPs for advanced audio processing such as the EMU10k1 series from Creative and I expect you could use OpenCL or something like it to do sound processing on a GPU if desired.

Pff, nothing new (1)

Rosco P. Coltrane (209368) | more than 2 years ago | (#37982596)

Stick a webcam in front of the screen, compress/pipe webcam output to the remote client. Voila, instant 3D remote display!

Re:Pff, nothing new (4, Interesting)

adolf (21054) | more than 2 years ago | (#37982646)

Indeed -- not new, at all.

Similar tricks were used a dozen or so years ago by Mesa 3D to get standalone 3dfx Voodoo cards to output accelerated OpenGL in a window on the X desktop. The 3D stuff rendered on a dedicated 3D card, and its output framebuffer was eventually displayed by a second, 2D-oriented card that actually had the monitor connected.

Re:Pff, nothing new (1)

prefect42 (141309) | more than 2 years ago | (#37983138)

Perhaps doing it in a generic hardware agnostic way is new?

Re:Pff, nothing new (2)

adolf (21054) | more than 2 years ago | (#37983264)

Perhaps, depending on how hardware-agnostic the APIs in question were/are.

Then again, VirtualGL [wikipedia.org] has been around for a bit too, which brings network transparency into the mix. I don't know how much more hardware-agnostic such a thing could be...

Re:Pff, nothing new (1)

Jonner (189691) | more than 2 years ago | (#37986928)

Yes, I think that's exactly why this is interesting. Increasingly, PCs have multiple video outputs of various types as well as multiple GPUs. If you can decouple the GPU used to render something from the output used to display it without a huge performance hit, that opens up all kinds of possibilities.

Re:Pff, nothing new (2)

zefrer (729860) | more than 2 years ago | (#37984058)

You neglect to mention that said standalone 3D cards were physically connected to the 2D card via a pass-through cable which was what sent the video signal from one card to the other, allowing it to appear on your monitor.

This is a software solution with the same effect that will work on any card, even remote cards on different machines. Hardly the same thing.

Re:Pff, nothing new (0)

Anonymous Coward | more than 2 years ago | (#37984138)

Correct. If I'm not mistaken, these Voodoo cards took the input from the 2D card and overlayed 3D data on the analog VGA signal before passing it to the monitor.

Re:Pff, nothing new (1)

adolf (21054) | more than 2 years ago | (#37995510)

You neglect to remember that the Voodoo 1 and 2 were only capable of full-screen output using that passthrough cable, and had no conventional 2D processing capabilities of their own. The pass-through cable was essentially just a component of an automatic A/B switch: You could either visualize the output of one card, or of the other, but never both at the same time. (At least not by those means.)

To render 3D stuff on a Voodoo 1/2 and have it displayed inside of a window instead of full-screen required[1] the framebuffer hacks I mentioned earlier. ISTR even "all-in-one" Voodoo Banshee cards also requiring similar tricks in order to render GL into a window, and that the whole mess didn't really get sorted out until nVidia started making decent 3D products and 3dfx released the actually-really all-in-one Voodoo 3. (Though the Voodoo3 3500 TV did use an internal analog overlay[2] to support viewing live television without bogging any portion of the system in any meaningful capacity.)

Meanwhile, in another post in this thread I've also mentioned VirtualGL, which is a clever system predating this recent /. posting about Bell Labs. VirtualGL includes some aspects of network transparency, and is something you can download and get using today if you think it's useful.

To be clear: The point I'm trying to raise is not that this new "virtual graphics port" stuff might somehow be useless, but simply that the concepts behind it are not at all new, as history shows.

[1]: Some/all of that requirement could have been mitigated by using the VGA Feature Connector which was relatively common in those times, which allowed a direct analog overlay[2]. But the simple fact is that 3dfx didn't go that route for their earlier cards, for better or for worse.

[2]: Many (most? all?) TV tuners of the time used this approach.

Re:Pff, nothing new (0)

Anonymous Coward | more than 2 years ago | (#37988552)

I could have sworn that SGI had something like this on their Onyx and reality systems many years back. Sadly I did not get a chance to play with them, but they had the big rendering and compute machines centralized, and small head units where you wanted things.

Then again, I could be mis-remembering how the old computing lab was described to me.

Re:Pff, nothing new (1)

adolf (21054) | more than 2 years ago | (#37995570)

I could have sworn that SGI had something like this on their Onyx and reality systems many years back. Sadly I did not get a chance to play with them, but they had the big rendering and compute machines centralized, and small head units where you wanted things.

I was too young/inexperienced/poor to actually lay hands on relative big-iron like SGI back in their heyday, but it wouldn't surprise me at all if that was true: There was a lot of really awesome tech being sold by them around that time, and it's an efficient way to distribute limited (and at that time, very expensive!) resources.

plan9 (1)

wirelesslayers (2014486) | more than 2 years ago | (#37983300)

Reminds me of plan9, beautiful design and concept.

Re:plan9 (1)

uigrad_2000 (398500) | more than 2 years ago | (#37987656)

Reminds me of plan9, beautiful design and concept.

I agree about it being a beautiful design and concept. Why send expensive aggressive robots to dominate a new species you find on a new planet, when you can just raise their dead and control the masses with slow moving zombies?

I sure hope I'm not misunderstanding your reference.

In the Kernel please (4, Informative)

sgt scrub (869860) | more than 2 years ago | (#37984046)

David Airlie's hotplug video work is really cool. I'm not surprised something bigger is coming out of it. What I really like are Ilija's thoughts on putting it in the kernel so the support extends beyond X. Below is from the dri-devel thread. http://lists.freedesktop.org/archives/dri-devel/2011-November/015985.html [freedesktop.org]

On Thu, 3 Nov 2011, David Airlie wrote:

>
> Well the current plan I had for this was to do it in userspace, I don't think the kernel
> has any business doing it and I think for the simple USB case its fine but will fallover
> when you get to the non-trivial cases where some sort of acceleration is required to move
> pixels around. But in saying that its good you've done what something, and I'll try and spend
> some time reviewing it.
>

The reason I opted for doing this in kernel is that I wanted to confine
all the changes to a relatively small set of modules. At first this was a
pragmatic approach, because I live out of the mainstream development tree
and I didn't want to turn my life into an eternal
merging/conflict-resolution activity.

However, a more fundamental reason for it is that I didn't want to be tied
to X. I deal with some userland applications (that unfortunately I can't
provide much detail of .... yet) that live directly on the top of libdrm.

So I set myself a goal of "full application transparency". Whatever is
thrown at me, I wanted to be able to handle without having to touch any
piece of application or library that the application relies on.

I think I have achieved this goal and really everything I tried just
worked out of the box (with an exception of two bug fixes to ATI DDX
and Xorg, that are bugs with or without my work).

-- Ilija

Re:In the Kernel please (-1)

Anonymous Coward | more than 2 years ago | (#37984546)

This is the web in 2011, not your fucking 30-year-old CRT display running MS-DOS. Don't force your lines to cut at 40 characters.

FTFY (0)

Anonymous Coward | more than 2 years ago | (#37984828)

David Airlie's hotplug video work is really cool. I'm not surprised something bigger is coming out of it. What I really like are Ilija's thoughts on putting it in the kernel so the support extends beyond X. Below is from the dri-devel thread. http://lists.freedesktop.org/archives/dri-devel/2011-November/015985.html [freedesktop.org]

On Thu, 3 Nov 2011, David Airlie wrote:

Well the current plan I had for this was to do it in userspace, I don't think the kernel has any business doing it and I think for the simple USB case its fine but will fallover when you get to the non-trivial cases where some sort of acceleration is required to move pixels around. But in saying that its good you've done what something, and I'll try and spend some time reviewing it.

The reason I opted for doing this in kernel is that I wanted to confine all the changes to a relatively small set of modules. At first this was a pragmatic approach, because I live out of the mainstream development tree and I didn't want to turn my life into an eternal merging/conflict-resolution activity.

However, a more fundamental reason for it is that I didn't want to be tied to X. I deal with some userland applications (that unfortunately I can't provide much detail of yet) that live directly on the top of libdrm.

So I set myself a goal of "full application transparency". Whatever is thrown at me, I wanted to be able to handle without having to touch any piece of application or library that the application relies on.

I think I have achieved this goal and really everything I tried just worked out of the box (with an exception of two bug fixes to ATI DDX
and Xorg, that are bugs with or without my work).

-- Ilija

CRTC??? (0)

fnj (64210) | more than 2 years ago | (#37984142)

WTF. Cathode ray tube controller? What an antiquated concept.

Re:CRTC??? (1)

Yvan256 (722131) | more than 2 years ago | (#37984558)

No, the CRTC [crtc.gc.ca] .

Re:CRTC??? (1)

bill_mcgonigle (4333) | more than 2 years ago | (#37985184)

Console Redirect Transfer Controller?

Re:CRTC??? (0)

Anonymous Coward | more than 2 years ago | (#37985790)

It's called that only for historical reasons.

Don't you use tar? Does it have to use tape?

CEASE AND DESIST! (and pay up, succa!) (1)

Thud457 (234763) | more than 2 years ago | (#37984522)

If this ever makes it out of the lab, the MPAA's gonna be on this like a ton of bricks.

Re:CEASE AND DESIST! (and pay up, succa!) (1)

Yvan256 (722131) | more than 2 years ago | (#37984570)

What do you mean? An African or European ton of bricks?

PCoIP (0)

Anonymous Coward | more than 2 years ago | (#38009972)

Would this be considered the open-source equivalent of PCoIP for remote 3D CAD work, as opposed to the various heavily proprietary, application-compatibility-limited remote 3D solutions from the likes of HP and Fujitsu?
