
A Proposal To Fix the Full-Screen X11 Window Mess

Unknown Lamer posted about a year and a half ago | from the just-run-windowed dept.

Graphics

jones_supa writes "SDL developers Ryan Gordon and Sam Lantinga have proposed a window manager change to sort out the full-screen X11 window mess, primarily for games. The proposal is a new _NET_WM_STATE_FULLSCREEN_EXCLUSIVE window manager hint that addresses the shortcomings of _NET_WM_STATE_FULLSCREEN, the full-screen hint currently used by most games. Ryan and Sam have already worked out an initial patch for SDL, but they haven't tried hooking it up to any window manager yet. For those interested in the details, information is available in this mailing list message. One of the key changes is that software would ask the window manager to change the resolution, rather than tapping RandR or XVidMode directly. Martin Gräßlin of KDE was rather wary about the patch and said that games changing the resolution just tend to mess up the desktop." Seems like a reasonable idea, given a bit of time to mature as a spec. In KDE's case, a separate daemon from the window manager handles resolution changes, so going through the WM would add complexity, and the Plasma shell still has no way to realize that it shouldn't reflow the desktop widgets. Setting window properties seems like a sensible IPC method for communicating intent, though (without making yet another aspect of the X desktop reliant upon the not-very-network-transparent D-Bus): "hey, I need to resize, but just for me, so don't reshuffle the desktop and docks."
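For context on how such hints are requested: under the existing EWMH convention, a client asks the window manager to add or remove a _NET_WM_STATE flag by sending a ClientMessage to the root window. Below is a minimal Xlib sketch of that request using the existing _NET_WM_STATE_FULLSCREEN atom; presumably the proposed _NET_WM_STATE_FULLSCREEN_EXCLUSIVE hint would be requested the same way once window managers advertise support for it.

#include <string.h>
#include <X11/Xlib.h>

#define _NET_WM_STATE_ADD 1   /* EWMH: 0 = remove, 1 = add, 2 = toggle */

/* Ask the window manager to add a _NET_WM_STATE_* atom to a mapped window. */
static void request_net_wm_state(Display *dpy, Window win, const char *state_name)
{
    XEvent ev;
    memset(&ev, 0, sizeof(ev));
    ev.xclient.type = ClientMessage;
    ev.xclient.window = win;
    ev.xclient.message_type = XInternAtom(dpy, "_NET_WM_STATE", False);
    ev.xclient.format = 32;
    ev.xclient.data.l[0] = _NET_WM_STATE_ADD;
    ev.xclient.data.l[1] = XInternAtom(dpy, state_name, False);
    ev.xclient.data.l[2] = 0;  /* second property atom, unused here */
    ev.xclient.data.l[3] = 1;  /* source indication: normal application */

    /* EWMH: state changes on mapped windows go through the root window,
     * so the window manager, not the client, applies the change. */
    XSendEvent(dpy, DefaultRootWindow(dpy), False,
               SubstructureRedirectMask | SubstructureNotifyMask, &ev);
    XFlush(dpy);
}

/* Usage: request_net_wm_state(dpy, game_window, "_NET_WM_STATE_FULLSCREEN"); */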


358 comments

Hilarious excuses (5, Insightful)

dnaumov (453672) | about a year and a half ago | (#41772577)

Martin Gräßlin of KDE was rather wary about the patch and said that games changing the resolution just tend to mess up the desktop.

So, ugh... fix your desktop?

Re:Hilarious excuses (2)

cpicon92 (1157705) | about a year and a half ago | (#41772617)

Agreed. Why can't the plasma widgets just save their positions and change back when the resolution changes back?

Re:Hilarious excuses (1)

Anonymous Coward | about a year and a half ago | (#41772735)

It reads like poor structural choices in the programs, probably influenced by legacy design. They just didn't anticipate that temporary resolution changes might be necessary, so the WM simply doesn't expect them. Now, having done a bit of game and media player work on Mac OS in the '00s, both before and after the Game Sprockets were released, I'd say resolution handling has to be made consistent through a common framework that works in a sane manner. The Sprockets did this for classic Mac OS; X11 needs to sit down and build the programming frameworks that make games easy to write. SDL works, but not in a truly sane way, because X11 doesn't give it a clear path for taking over the screen.

Re:Hilarious excuses (5, Interesting)

dgatwood (11270) | about a year and a half ago | (#41773283)

Why don't games just spawn a separate X11 window server instance with a different resolution on a separate VC? Adding proper resource sharing between X11 instances seems like it would be a lot easier to do than rearchitecting all the existing apps to do the right thing during a temporary resolution change.

And there's no benefit to a full-screen app running in the same X11 instance as any other app other than making it possible to transition a window from being a normal window to a full screen window and back, and with a resolution change, that won't work very well anyway, which makes even that argument mostly moot.

Re:Hilarious excuses (0)

adolf (21054) | about a year and a half ago | (#41772743)

And make KDE even slower? Meh.

Re:Hilarious excuses (1)

kiddygrinder (605598) | about a year and a half ago | (#41773053)

yeah, i'm sure repositioning icons every 30 minutes or so will bring systems to their knees

Re:Hilarious excuses (1)

adolf (21054) | about a year and a half ago | (#41773251)

yeah, i'm sure repositioning icons every 30 minutes or so will bring systems to their knees

Ever hear the story about the straw that broke the camel's back?

As I see it, with the way things stand and the direction in which they seem to be going, we generally need leaner (simpler, perhaps more clever) solutions to problems, not heavier ones.

Even if it's just a tiny bit of additional overhead. Operating systems are slow enough, these days, almost as if nobody even tries to optimize anything anymore with a grander view of things.

FFS: Why bother repositioning icons at all, if nobody is ever going to see the results of this in the first place? In the very best case, work was wasted. In the very worst case, it will be annoying to watch it happen and cause other issues.

All this work (millions of processor cycles, however instantaneous this might theoretically seem, silently occurring when all I'm trying to do is either load software or return to my desktop), just to let a game play full-screen at a different display resolution. Fuck that.

The fact that the position of desktop icons is even a factor in this discussion of full-screen gaming means that the entire philosophy is broken.

Re:Hilarious excuses (4, Insightful)

Kjella (173770) | about a year and a half ago | (#41773085)

The desktop doesn't know what caused the changes, so you could run into a lot of strange issues. Imagine you lay out your desktop on a 30" 2560x1440 monitor, then switch to a 1920x1080 monitor and add/remove/move an icon. What happens when you reattach the first monitor: should everything just "snap back" to the places it had, even though you've now arranged your icons completely differently? To me the solution outlined here seems much smarter: just let the game have its own "screen" with its own settings, with no need to even tell the other windows about it.

Re:Hilarious excuses (4, Informative)

Anonymous Coward | about a year and a half ago | (#41773215)

This is the exact purpose of this proposal: to create a new signal that would tell the window manager that the change is temporary and only takes effect while a specific window has focus. This way the window manager would know there's no need to move the icons in the first place.

Re:Hilarious excuses (2, Insightful)

Anonymous Coward | about a year and a half ago | (#41772763)

Indeed, he didn't even realize that with this flag the widgets wouldn't be told the resolution changed, so they would never be rearranged in the first place. I doubt he has even read the proposed spec.

Re:Hilarious excuses (2)

Lord Byron II (671689) | about a year and a half ago | (#41772781)

Except that we're no longer in the era of CRTs. Since LCDs have one native resolution, they should always be driven at that resolution. If a game wants to run at 640x480, then that should be accomplished by scaling it up and adding black bars if necessary, but the signal to the LCD should still be at the native resolution.
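A rough sketch of the "scale it up and add black bars" arithmetic described above: fit the game image into the panel while preserving aspect ratio and center it. The names are illustrative, not from any particular toolkit.

/* Fit a game_w x game_h image into a native_w x native_h panel, preserving
 * aspect ratio, and center it; the remaining area is the black bars. */
typedef struct { int x, y, w, h; } Viewport;

static Viewport letterbox(int game_w, int game_h, int native_w, int native_h)
{
    Viewport v;
    /* Compare aspect ratios without floating point:
     * game_w/game_h >= native_w/native_h  <=>  game_w*native_h >= native_w*game_h */
    if ((long)game_w * native_h >= (long)native_w * game_h) {
        /* Game image is relatively wider: fill the width, bars above and below. */
        v.w = native_w;
        v.h = (int)((long)native_w * game_h / game_w);
    } else {
        /* Game image is relatively taller: fill the height, bars left and right. */
        v.h = native_h;
        v.w = (int)((long)native_h * game_w / game_h);
    }
    v.x = (native_w - v.w) / 2;
    v.y = (native_h - v.h) / 2;
    return v;
}

/* Example: letterbox(640, 480, 1920, 1080) yields a 1440x1080 area at offset (240, 0). */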

OMFG (-1)

Anonymous Coward | about a year and a half ago | (#41772891)

what's a CRT? Is that why there's this "resolution" stuff?

Re:Hilarious excuses (2)

MBCook (132727) | about a year and a half ago | (#41773083)

If you don't trust your LCD to do it (I don't blame you, some LCDs are better at scaling than others), that sounds like something that should be done automatically and transparently by the video driver instead of something the WM should have to manage.

Re:Hilarious excuses (1)

epyT-R (613989) | about a year and a half ago | (#41773279)

Or even multiples/divisors of that resolution, ideally exposed via EDID. Since they divide evenly, the screen can do a simple, lossless point-sample scale, which is computationally cheap (compared to the usual bilinear), and these low resolutions can go full screen with no added latency (the scaler chips inside most panels are slow). This matters because desktops might be 2560x1600, but most GPUs won't run games well at that resolution.

Nvidia's Windows drivers support scaling in the GPU too, but unfortunately it's filtered; I wish there were a way to disable that.

Re:Hilarious excuses (0)

Anonymous Coward | about a year and a half ago | (#41773127)

Martin Gräßlin's comments on this are bizarre. That's one of the problems this proposal is meant to *solve*. He's completely missing the point.

Re:Hilarious excuses (0)

jonadab (583620) | about a year and a half ago | (#41773139)

My desktop is fine. It uses precisely the resolution I want it to use on each monitor. Under no circumstances would I ever want an application window (other than the control panel that I use to set it up) to *mess* with that.

Then again, I also can't imagine circumstances where I would want an application window to be "fullscreen". (Maximized, yes, but maximized windows don't overlap my panels. That's important.)

Why do game developers always assume that my computer doesn't have any other purpose except to play their game? I've got other stuff on this computer -- stuff that is more important than the games. My computer is my alarm clock, my calendar, and a communication tool, among other things. Games had darned well better stay in the window I put them in, or I won't be playing them.

Re:Hilarious excuses (2, Insightful)

Anonymous Coward | about a year and a half ago | (#41773263)

Why do game developers always assume that my computer doesn't have any other purpose except to play their game? I've got other stuff on this computer -- stuff that is more important than the games. My computer is my alarm clock, my calendar, and a communication tool, among other things. Games had darned well better stay in the window I put them in, or I won't be playing them.

Because not everybody wants to be annoyed by the rest of the UI when playing a game. Of course, when fullscreen is available it should be an option (and not the only way to play the game), but that isn't an excuse to completely get rid of it.

Re:Hilarious excuses (0)

Anonymous Coward | about a year and a half ago | (#41773285)

Or, make X use 2 (or more! with 8+ GB RAM on computers these days, should be able to find some RAM to store off an inactive display, no?) framebuffers. One for games, one for desktop. Then each one can run in separate resolutions, and be more independent of one another.

But who am I kidding.

Games are the problem? (2, Insightful)

Anonymous Coward | about a year and a half ago | (#41772629)

Just start the goddamn games on a totally different TTY. There, problem solved!

Re:Games are the problem? (1)

Anonymous Coward | about a year and a half ago | (#41772733)

"Let's not fix the underlying problem; let's come up with client-side workarounds."

Re:Games are the problem? (1, Redundant)

Waffle Iron (339739) | about a year and a half ago | (#41773229)

Just start the goddamn games on a totally different TTY. There, problem solved!

That's what I do to play games.

I usually just switch over to TTY1. Then I can load TREK73.BAS:

*
 
  * *
          *
        -E-
              *
 
quadrant 3/1
condition GREEN
torpedoes 9
energy 1434
shields 1000
klingons 14
 
command:

Music to my ears! (5, Interesting)

DaneM (810927) | about a year and a half ago | (#41772661)

With Linux finally becoming a more "proper" gaming platform (i.e. Steam and others), it's "about time" that this is dealt with. _NET_WM_STATE_FULLSCREEN_EXCLUSIVE, where have you been my whole adult life? Gotta hand it to Ryan Gordon ("Icculus," as I recall) for consistently making Linux gaming that much more viable.

Re:Music to my ears! (0)

Anonymous Coward | about a year and a half ago | (#41772833)

I agree: between full-screen support and wireless driver fixes, 2013 will truly be the year of Linux on the desktop. I can only hope that this sort of rapid innovation will cause Microsoft and Apple to wake up to what's possible if they put their minds (and money) to it.

CRT's (4, Insightful)

mcelrath (8027) | about a year and a half ago | (#41772669)

Who is still running a CRT? Who wants any program to change the resolution of their screen?

This strikes me as the wrong solution to the problem: A program should instead request the "current monitor's resolution" (because there can be more than one!) set its display area to that size, and then tell the window manager to "fullscreen" it by removing title bar and border decorations and moving it to (0,0) of that monitor. But NEVER EVER RESIZE MY MONITORS. Thank you. The window manager should always be superior to the app, and one should always be able to manage the window (task switch, move to another desktop, etc) using the window manager, regardless of what the app thinks it is doing.
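For what the parent suggests, querying the current per-monitor geometry (rather than changing it) is straightforward with XRandR. A minimal sketch, with error handling omitted; link with -lX11 -lXrandr:

#include <stdio.h>
#include <X11/Xlib.h>
#include <X11/extensions/Xrandr.h>

/* Print the geometry of every active CRTC (i.e. each monitor), which is the
 * information a game needs to size a borderless window to the monitor it is
 * on instead of changing the video mode. */
int main(void)
{
    Display *dpy = XOpenDisplay(NULL);
    Window root = DefaultRootWindow(dpy);
    XRRScreenResources *res = XRRGetScreenResources(dpy, root);

    for (int i = 0; i < res->ncrtc; i++) {
        XRRCrtcInfo *crtc = XRRGetCrtcInfo(dpy, res, res->crtcs[i]);
        if (crtc->width && crtc->height)    /* skip disabled CRTCs */
            printf("monitor %d: %ux%u at +%d+%d\n",
                   i, crtc->width, crtc->height, crtc->x, crtc->y);
        XRRFreeCrtcInfo(crtc);
    }
    XRRFreeScreenResources(res);
    XCloseDisplay(dpy);
    return 0;
}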

Re:CRT's (0)

Anonymous Coward | about a year and a half ago | (#41772723)

If you have a game that requires more horsepower than the system can provide to render at native resolution, this could be a problem.

However, it may be the case that display resolution can be 1920x1080 and the internal resolution still be 1280x720 (and in fact generally this approach leads to better visuals than LCD rescalers).

Re:CRT's (1)

Lord Byron II (671689) | about a year and a half ago | (#41772789)

in fact generally this approach leads to better visuals than LCD rescalers

Citation needed.

What difference does it make who (the graphics card or the monitor) is doing the scaling??

Re:CRT's (0)

Anonymous Coward | about a year and a half ago | (#41772865)

Why do you need a citation? Scalers in the average LCD monitor are shit.

Re:CRT's (1)

UnknownSoldier (67820) | about a year and a half ago | (#41773437)

Exactly.

Cheap LCD don't scale properly.

Re:CRT's (1)

amRadioHed (463061) | about a year and a half ago | (#41772885)

It doesn't matter who, but it does matter what algorithm they're using. Monitors aren't guaranteed to be using the algorithm that produces the best results.

Re:CRT's (1)

silas_moeckel (234313) | about a year and a half ago | (#41773469)

If you're talking about rendered 3D, the video card can always do a better job; it has more information to work with. That's not to say that it will.

Re:CRT's (2)

dgatwood (11270) | about a year and a half ago | (#41773387)

What difference does it make who (the graphics card or the monitor) is doing the scaling?

Three big differences come to mind:

  • The graphics card probably has more precise pixel values (floating-point values instead of scaled integers per color channel), so even if the hardware scaling algorithms in the monitor are better than "whatever we can do on ten cents of silicon" (which is a big assumption), they'll still be slightly lower quality than the GPU can produce.
  • The monitor doesn't have anything remotely as powerful as a GPU in it, so it is limited in the quality of scaling algorithm it can realistically implement.
  • The monitor can scale only the final, layer-blended image data. That means elements that need to be precise (e.g. text) are just as blurry as everything else. By contrast, a game doing the scaling on the GPU could scale those elements separately, rendering things like text at the panel's native resolution and using alpha blending to superimpose it over blurrier, scaled-up game elements.

Re:CRT's -- Simple fix. (2, Funny)

Anonymous Coward | about a year and a half ago | (#41772955)

Force ALL games to run at 640x480 -- problem solved.

Except for those i386 Linux systems that are trying to run Half-Life 2... perhaps we should lower that resolution to 320x240, just to guarantee we're not butting heads with the window manager. After all, the first goal of every Linux game designer should be to ensure the tail log window you're running is properly proportioned at all times.

Re:CRT's (2)

jedidiah (1196) | about a year and a half ago | (#41773123)

Then buy a better video card or run it windowed.

This full screen nonsense is something you flee from Windows to get away from. The idea that it is being dragged back into Linux is just annoying.

It's 2012. It's long past time that Game programmers realized that they don't get to run amok with the system.

It's 2012 and a modern OS, not an Amiga.

Re:CRT's (1)

cheater512 (783349) | about a year and a half ago | (#41773253)

You missed the point. LCDs don't have different resolutions, they have one resolution and only one resolution.

Re:CRT's (0)

Anonymous Coward | about a year and a half ago | (#41773353)

That's a terrible, terrible non-solution. People want to be able to run games full screen at a lower resolution.

I'm fine with requiring user confirmation, but blocking all resolution change is a poor idea.

Re:CRT's (2)

wisnoskij (1206448) | about a year and a half ago | (#41772731)

I agree; I have no idea why game windows are not handled better.
It is basically impossible to run many, quite possibly most, games in a window. And even the ones that do allow it often require editing config files or hacking the exe.
In theory the OS is being sent this visual data anyway, so no matter how the game was programmed you should be able to resize it or run it in a window.

Re:CRT's (5, Insightful)

EvanED (569694) | about a year and a half ago | (#41772737)

Who wants any program to change the resolution of their screen?

Someone whose graphics card isn't up to the task of running a game at full native resolution? That'd be my guess anyway; I haven't willingly used a lower resolution for a while. (Some games don't support high resolutions, or don't support widescreen resolutions, and there it's "reasonable" that they change it as well. But a program like that probably wouldn't use that in the first place, so whatever.)

The window manager should always be superior to the app, and one should always be able to manage the window (task switch, move to another desktop, etc) using the window manager, regardless of what the app thinks it is doing.

I don't know enough about this proposal to say how it interacts with this (indeed, I'm rather disappointed by both the summary and TFA not actually, you know, saying what the problems are in the first place), but there's absolutely no reason why those goals are in conflict. In fact, the proposal specifically addresses this: "If the window loses input focus while fullscreen, the Window Manager MUST revert the resolution change and iconify the window until it regains input focus. The Window Manager MUST protect desktop state (icon positions, geometry of other windows, etc) during resolution change, so that the state will be unchanged when the window ceases to be marked as fullscreen."

Re:CRT's (3, Informative)

mcelrath (8027) | about a year and a half ago | (#41773287)

Someone whose graphics card isn't up to the task of running a game at full native resolution?

For the myriad of responses that brought up this point: the answer is video card hardware scaling. E.g. add a flag _NET_WM_STATE_SCALING_ALLOWED which directs the WM to use hardware scaling from a fixed-size framebuffer, as is done by video players. Not only can you make it full screen, but you can resize it to any arbitrary shape and size (e.g. don't cover your widget bar, etc). Then the Window Manager decides what is "fullscreen". It could even make an app span more than one monitor when "fullscreen", or just one.

Re:CRT's (0)

Anonymous Coward | about a year and a half ago | (#41772745)

Who is still running a CRT? Who wants any program to change the resolution of their screen?

People who use a high resolution for their desktop but whose graphics cards can't run games at a decent frame rate at that resolution?

Re:CRT's (5, Insightful)

marcansoft (727665) | about a year and a half ago | (#41772753)

This. I came here to say the same thing, but you already had. Every single modern graphics card is very efficient at scaling textures, and in fact, LCD scaling these days most often ends up happening on the GPU anyway. Don't touch my screen resolution. Ever. If the goal is to get better performance by rendering at a lower resolution, then render at a lower-resolution offscreen buffer and scale that up to the screen resolution.

I wish Wine had a mode that did this for Windows games that expect to change the screen resolution and don't play well with Xinerama. These days I end up using the "virtual desktop" wine mode with per-game settings and KDE's window override support to put it on the right display head and remove the borders, but it's a suboptimal manual solution. The Linux game situation is slightly better (they tend to be configurable to respect the current resolution and usually get the display head right), but still don't have scaling support.

Need inspiration? Do what video players (particularly mplayer) do. That is how fullscreen games should work.
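The render-low-then-scale-up approach described above corresponds to a framebuffer blit on the GPU. A rough OpenGL sketch, assuming a GL 3.0+ context and that fbo already contains the low-resolution frame; the names game_w/game_h/native_w/native_h are illustrative:

#include <GL/gl.h>

/* glBindFramebuffer / glBlitFramebuffer are GL 3.0 entry points; on most
 * platforms they are obtained through an extension loader (GLEW, glad,
 * SDL_GL_GetProcAddress, ...). */

/* Present a frame that was rendered offscreen at game_w x game_h by blitting
 * it, stretched, into the window's default framebuffer at native resolution. */
void present_scaled_frame(GLuint fbo, int game_w, int game_h,
                          int native_w, int native_h)
{
    /* Read from the low-resolution offscreen framebuffer object... */
    glBindFramebuffer(GL_READ_FRAMEBUFFER, fbo);
    /* ...and write, scaled, into the default (window) framebuffer. */
    glBindFramebuffer(GL_DRAW_FRAMEBUFFER, 0);
    glBlitFramebuffer(0, 0, game_w, game_h,       /* source rectangle      */
                      0, 0, native_w, native_h,   /* destination rectangle */
                      GL_COLOR_BUFFER_BIT, GL_LINEAR);
}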

Re: rendering lower then scaling up to native (5, Interesting)

brion (1316) | about a year and a half ago | (#41772809)

This is exactly how some games work on Mac OS X, for instance Source-based games like Portal and Half-Life 2. They don't muck with the actual screen resolution, but just render into an offscreen buffer at whatever resolution and blit it stretched to the full screen. Switching from the game back to other apps doesn't disturb the desktop in any way. Would definitely love to see more Linux games using this technique.

Re: rendering lower then scaling up to native (3, Informative)

sunderland56 (621843) | about a year and a half ago | (#41772991)

This works, but it wastes both RAM and performance.

Re: rendering lower then scaling up to native (0)

Anonymous Coward | about a year and a half ago | (#41773365)

Do you know by any chance what the performance hit is? The RAM hit isn't too bad, given that GPUs these days ship with 1GB or more, but I could see performance possibly being an issue in some circumstances.

Re:CRT's (2)

Mike deVice (769602) | about a year and a half ago | (#41772755)

Who is still running a CRT? Who wants any program to change the resolution of their screen?

Gamers often do. An average application might run nicely at a high resolution, but for a smooth Skyrim experience, many people may find it necessary to allow it to run at a lower resolution.

Re:CRT's (1)

AnAirMagic (989649) | about a year and a half ago | (#41772817)

Imagine how this might work with Wayland: the game renders at one resolution (say, 640x480), generates output, and sends it to the compositor. The compositor takes the game window and scales it to match the actual screen resolution (say, 1920x1080). No hacks needed. /me drools a bit

Re:CRT's (0)

Anonymous Coward | about a year and a half ago | (#41773023)

As if that cute toy Wayland is the only compositing window manager out there.

-- Ethanol-fueled

Re:CRT's (1)

Bill, Shooter of Bul (629286) | about a year and a half ago | (#41773237)

Uh, Wayland isn't a compositing window manager, it's a display server protocol. Kind of like X11.

Re:CRT's (0)

Anonymous Coward | about a year and a half ago | (#41772823)

Why don't video cards support resizing? As in render to, say, 800x600 and resize/stretch to 1280x1024 (I'm showing my age with those resolutions, aren't I?)

That's real easy math for the video card, it shouldn't hurt performance in any serious manner. Yes, it would still look like crap, but if you need to run at a lower resolution to make the card perform well, it's going to look like crap either way, isn't it?

Re:CRT's (0)

Anonymous Coward | about a year and a half ago | (#41773507)

I know that in the Windows world, virtually all video driver control panels have the option to make the GPU do the scaling, rather than sending the lower-resolution image to the monitor. I'd imagine that's possible in Linux as well, though I don't know how it's typically requested.

Re:CRT's (0)

Anonymous Coward | about a year and a half ago | (#41772799)

Actually, it makes a ton of sense for games to change the resolution - you might not have the graphical power to run your game at the native resolution, or you might sit further away while gaming, so a lower resolution makes interface components more readable. You might have a super high-resolution display, which, at the desktop, would make things far too small, but in a game, makes for great graphics.

Windowed mode also sucks for gaming, as it's generally far easier to lose mouse/keyboard focus, have windows pop up in front, and have distracting stuff around your game window that draws your attention away, not to mention using only a small portion of the screen unless the game supports scaling (which is rare).

Re:CRT's (1)

zippthorne (748122) | about a year and a half ago | (#41773359)

Your second reason is stupid. Just because Windows and OSX still sort of do it that way doesn't mean it actually makes sense that you should have to futz with the resolution just to make widgets use more or less screen real estate for better viewing. Window managers should handle scaling of UI elements and text sanely.

If I want bigger text to be easier to read, I still want crisp text. If I want smaller text to have more stuff on the screen, I don't want the letters to all run together like a censor bar.

Re:CRT's (4, Interesting)

DRJlaw (946416) | about a year and a half ago | (#41772835)

Who is still running a CRT?

This is not a CRT-only problem.

Who wants any program to change the resolution of their screen?

Gamers. [hardocp.com]

This strikes me as the wrong solution to the problem:

Not surprising, since you're ignoring the underlying problem. Your 2560x1600 desktop on that 30" LCD is going to kill the ability of your videocard to display a modern game at an acceptable frame rate. Many gamers will not accept windowed half-screen (or whatever fraction is required) gaming on their $1K LCD.

A program should instead request the "current monitor's resolution" (because there can be more than one!) set its display area to that size, and then tell the window manager to "fullscreen" it by removing title bar and border decorations and moving it to (0,0) of that monitor. But NEVER EVER RESIZE MY MONITORS.

No. Windows and OSX have figured this out. Linux window managers (at least one popular one) need to as well.

The window manager should always be superior to the app, and one should always be able to manage the window (task switch, move to another desktop, etc) using the window manager, regardless of what the app thinks it is doing.

Irrelevant to your desired scheme, where keyboard hotkeys would still be required. In Windows and OSX you can still task switch, move to another desktop, etc. using such hotkeys. Yet the game controls the resolution of the monitor in fullscreen mode.

Re:CRT's (3, Insightful)

Carewolf (581105) | about a year and a half ago | (#41773039)

Not surprising, since you're ignoring the underlying problem. Your 2560x1600 desktop on that 30" LCD is going to kill the ability of your videocard to display a modern game at an acceptable frame rate. Many gamers will not accept windowed half-screen (or whatever fraction is required) gaming on their $1K LCD.

No, you are missing his point. There is no reason the game could not run at a lower resolution and be scaled by the WM, instead of relying on the screen to do the rescaling. Only CRTs are able to do rescaling physically; LCDs end up doing it in software anyway, and usually in a crappier manner than what the WM could do.

Re:CRT's (0)

Anonymous Coward | about a year and a half ago | (#41773069)

Any good GPU post-2008 should have scaling built in. Any high-end GPU post-2000 should have scaling built in. Nvidia has had it for a long time and added a software version to a lot of their drivers, and it works great. So it should be handled by the GPU.

Re:CRT's (1)

Chemisor (97276) | about a year and a half ago | (#41773159)

Unless your game uses OpenGL and you have a fully accelerated driver (read: the proprietary Catalyst or nVidia blob), it will not be able to scale fast enough. Most games use SDL and main memory surfaces that are then blitted to the screen. Any scaling is done in software by the CPU and is dreadfully slow. My Core i7 can handle 1680x1050@60, but just barely, with one core pegged to 100%. The cheapest GPU, of course, can handle that easily, but you must run proprietary drivers and use OpenGL. If you don't, resolution scaling is your only option.

Re:CRT's (1)

adolf (21054) | about a year and a half ago | (#41773289)

But LCDs scale for free, while CPUs and GPUs do not.

Why reinvent the wheel?

Re:CRT's (1)

antdude (79039) | about a year and a half ago | (#41773125)

Not me, but I'd like to; it's just impossible to find new quality ones. Anyway, I still use low resolutions for old games, MAME, demos, etc. I also still use an old KVM from Y2K that uses VGA, so keeping the native resolution with black bars doesn't work for me. :(

Re:CRT's (1)

Culture20 (968837) | about a year and a half ago | (#41773189)

I'm still running a hugemongous CRT. It probably won't go bad for another four years.

Martin Gräßlin doesn't get it (0)

Anonymous Coward | about a year and a half ago | (#41772683)

I was reading that e-mail from Martin Gräßlin. He's completely missing the point: he's complaining about all the issues this proposal *is meant to fix*. The whole idea is that a program going fullscreen doesn't interfere with anything else. Way to go.

No assistance required (3, Funny)

OhANameWhatName (2688401) | about a year and a half ago | (#41772707)

Martin Gräßlin of KDE was rather wary about the patch and said that games changing the resolution just tend to mess up the desktop

KDE doesn't need the help.

Re:No assistance required (0)

Anonymous Coward | about a year and a half ago | (#41772919)

Ever played a VGA game in Windows? Those screw up desktop layout, too. Do you wear a G-string and G-buttplug while typing your G-FUD in Galeon on GNOME?

Perfect fix that works every time: (-1)

Anonymous Coward | about a year and a half ago | (#41772713)

Switch to OS X or Windows and dump Linsux already.

Re:Perfect fix that works every time: (1)

Black Parrot (19622) | about a year and a half ago | (#41773211)

Switch to OS X or Windows and dump Linsux already.

Actually, I keep a Windows box to use for gaming. Linux works just fine for everything else.

(And would probably work just fine for gaming, if anyone would bother making games for it.)

Then will it be year of the Linux desktop? (1)

theshowmecanuck (703852) | about a year and a half ago | (#41772741)

Here's to hoping.

Seems like a reasonable idea, given a bit of time to mature as a spec.

So another ten years? Seriously, this is well past due. This is the second story about someone wanting to fix the desktop in the last month or so. Hopefully, if there are enough of them, one might actually gain traction. Here's hoping. The X system really is a heap. As much as the purists like to bitch about it, thank goodness for Nvidia when it comes to multiple monitor support. Too bad it doesn't help the gaming, though.

Re:Then will it be year of the Linux desktop? (0)

Anonymous Coward | about a year and a half ago | (#41772827)

I've found multi-monitor support to be great across the board these days with XRandR - for my triple monitor setup, I used an ATI card. We are past the days where TwinView was the only good solution.

Re:Then will it be year of the Linux desktop? (2, Insightful)

ryanw (131814) | about a year and a half ago | (#41772855)

I'm pretty sure somebody did go in and fix the X11 desktop..... It was Apple w/ OSX.

Trying to solve the wrong problem (0)

narcc (412956) | about a year and a half ago | (#41772747)

We can all agree that X is a gigantic mess. It needs to be replaced by something better -- badly.

Yeah, we'll lose ... a lot ... but won't it ultimately be worth it in the end?

Re:Trying to solve the wrong problem (0)

Anonymous Coward | about a year and a half ago | (#41772793)

That's kind of funny.

Developer from the TFA: pointing out a very specific, very locally scoped problem, and using X's extensibility features to offer a solution.

You: Well I agree X is bad. No specifics offered.

I would much rather have the guy with the first approach around than someone like you.

But don't worry. I'm sure that Wayland vaporware will fulfill your desires for ... I'm not sure what exactly. Not being called X?

Re:Trying to solve the wrong problem (0)

Anonymous Coward | about a year and a half ago | (#41773021)

How about something that's not rooted in antique design and protocols from 1987?

Re:Trying to solve the wrong problem (0)

Anonymous Coward | about a year and a half ago | (#41773331)

No need to be vague. You're not going to bore anyone here with details.

Re:Trying to solve the wrong problem (2)

Black Parrot (19622) | about a year and a half ago | (#41773225)

We can all agree that X is a gigantic mess. It needs to be replaced by something better -- badly.

Maybe instead of everyone jumping in and telling us how bad X is, someone could take a minute to explain what's wrong with it for us non-technical types.

Dump X (5, Insightful)

Anonymous Coward | about a year and a half ago | (#41772749)

I still think X needs to go. For truly forward thinking, it needs to be replaced. Just look at Android. Android would not be useful if it were forced to use X.

Frankly, X does too many things that too few people need. It was designed for a different era and it scales to modern workflows very clumsily. Multi-monitor desktops and gaming on Windows are effortless. On X they're frankly a chore.

Sorry, no, network transparency is not an important feature anymore. It's probably used by .001% of regular users. VNC/RDP-style remote access is the way it's done now. And no, nobody cares if it's technically inferior. It's hundreds of times easier to implement and use.

Modern toolkits pretty much ignore 95% of X's built-in features and just pass bitmaps.

Yeah, X has lots of cool things, but you have to realize most of them are impractical or unnecessary. Today we really have the memory, computational power, and bandwidth to bang whatever we want onto the screen without any trouble. The latency and overhead X presents are the enemies today.

Now stop. Yes, you, stop. I know you're about to type up a 10-paragraph screed about how you ported X app to obscure platform Y, or remotely managed 10,000 servers with facial twitches and GTK. Just stop. Your use case does not represent the vast majority of computer users. It doesn't even represent a full fraction of a percent.

Legacy baggage and clinging to old ideas spawned X.org. The same thing is what will spawn whatever is to replace X.

Re:Dump X (0)

Anonymous Coward | about a year and a half ago | (#41773047)

I'm sorry, but this is WAY too much honest and brutal truth to be a Slashdot comment.

Or rather, fucking eh! Let's toss X11 on an ice floe already and work on something better. I've more or less given up on Linux game development because X11 is such a cantankerous bitch-hog, for which all modern window managers have had to implement workarounds and add-ons just to do simple things like rendering TrueType fonts, for christ's sake. Please.

Re:Dump X (0)

Anonymous Coward | about a year and a half ago | (#41773049)

Yes, because in the brave new world of massive networking, what we really need is some crappy graphics API which is unable to display across a network.

X11 was way ahead of Windows technically, yet now we have innumerable idiots demanding that we throw it away and copy Windows single-device crappy API.

Re:Dump X (0)

Anonymous Coward | about a year and a half ago | (#41773259)

Cue the one person who may have actually used this mainframe-timeshare style of computing. I've been using Linux for over a decade and I've never seen an implementation of network-based X11 terminals in real life, ever. Some school in Germany may have fussed with it a decade ago.

Can X11 show HD video over its network protocol? Can you play a game or use a CAD/3D application at full speed over that protocol? Just because something is technically superior doesn't mean anything if it can't meet the needs of most users. Pragmatism is lost on the community far too often.

If you ever want to see the 'year of the Linux desktop', we need to ditch this technically-superior-but-useless-to-most mentality and just do something that *works*.

Re:Dump X (3, Insightful)

Anonymous Coward | about a year and a half ago | (#41773371)

This is a lot of FUD.

Android? Look at the N9 with its award-winning UI. It uses X and is really cool (on outdated hardware).

Network transparency is really useful. VNC/RDP sucks compared to X. And I don't see how it is easier to use than X. Maybe there are more GUIs for it that make it easier for beginners, but that is not for any technical reason.

I don't see what overhead X causes. It worked fine decades ago. Latency is an issue over the network, but only because the toolkits never cared about that. It's not a problem on a LAN and can also be solved with a (local) proxy.

My use case does not interest you? That was never the Linux philosophy. Please go back to Windows.

Legacy baggage? There is no legacy baggage. There are some APIs which are not used anymore by modern applications, but that does not hurt anybody.

Re:Dump X (2)

deek (22697) | about a year and a half ago | (#41773405)

I agree that we need to come up with a brand new system to handle today's graphics systems. That's what Wayland is for, and why it's such an interesting project. It is not legacy baggage, but a ground up designed system. You have heard of it, haven't you? Seems like every Linux user and their dog knows about it these days.

Also, I'm very glad that Wayland is implementing an X compatibility layer. I'm one of those fraction of a percent that use and enjoy network transparency. It would annoy the hell out of me if I had to run a full graphic system on the servers I manage, and then use VNC to connect to them. It's just so much nicer to ssh into the machine, run the program, and have it appear directly on my screen. Never mind that I like keeping a minimal amount of packages installed on the servers. I try to keep it simple.

By the way, if we have the memory, computational power, and bandwidth, why are you so worried about X overhead and latency? Surely they become marginal with more resources.

deeply technical (1)

manu0601 (2221348) | about a year and a half ago | (#41772771)

It is a bit unusual to craft a news entry with deeply technical stuff taken from project mailing lists. What is _NET_WM_STATE_FULLSCREEN_EXCLUSIVE? A flag in some protocol?

Re:deeply technical (2)

Rockoon (1252108) | about a year and a half ago | (#41773037)

The issue is that some programs change the screen resolution, and different programs take notice and rearrange their windows and icons when a screen resolution change notification takes place.

The problem is that there are no semantics in X that allow a program to change the screen resolution while NOT causing those other programs to do stuff.

This new flag is to signal these semantics: "Hey, we are changing the resolution, but we have this new idea called Exclusive Control over the display, so nobody needs to know that we did it, because they can't render themselves anyway."

...and thus, those other programs never see the resolution change and therefore don't destroy the user's chosen window and icon layout in their attempt to "fit" a temporary resolution that never should have mattered to them to begin with.
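For the reading side of the same mechanism: a window manager (or any client) can inspect which state atoms are currently set in a window's _NET_WM_STATE property. A minimal Xlib sketch follows; the proposed _NET_WM_STATE_FULLSCREEN_EXCLUSIVE atom, if adopted, would simply show up in this list like any other state atom.

#include <stdio.h>
#include <X11/Xlib.h>
#include <X11/Xatom.h>

/* List the atoms currently set in a window's _NET_WM_STATE property. */
void print_wm_state(Display *dpy, Window win)
{
    Atom state = XInternAtom(dpy, "_NET_WM_STATE", False);
    Atom actual_type;
    int actual_format;
    unsigned long nitems, bytes_after;
    unsigned char *data = NULL;

    if (XGetWindowProperty(dpy, win, state, 0, 1024, False, XA_ATOM,
                           &actual_type, &actual_format, &nitems,
                           &bytes_after, &data) == Success && data) {
        Atom *atoms = (Atom *)data;
        for (unsigned long i = 0; i < nitems; i++) {
            char *name = XGetAtomName(dpy, atoms[i]);
            printf("%s\n", name);   /* e.g. _NET_WM_STATE_FULLSCREEN */
            XFree(name);
        }
        XFree(data);
    }
}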

Re:deeply technical (1)

manu0601 (2221348) | about a year and a half ago | (#41773149)

And the flag is passed to an API? (libX11 level? higher?), or it lives within the X11 protocol? Or both? I understand the background, I was just saying it was weird to use a flag name as being #define'ed in source code without the context required for it to make any sense.

Re:deeply technical (1)

jedidiah (1196) | about a year and a half ago | (#41773155)

I dunno. If a game is running amok because gamers and game programmers suffer from an 80s mentality that a computer is a game console, then perhaps you don't want the rest of the GUI acknowledging this foolishness.

The fact that games on Linux don't scramble my desktop like they do under Windows IS ACTUALLY A GOOD THING.

Even with the status quo, cleaning up after a game run amok is less bothersome under Linux.

Make it a library! (1)

Kenja (541830) | about a year and a half ago | (#41772777)

And then make sure that different versions of it can't coexist on the same system and can't run each other's code. Perhaps change all the method calls every build.

Easy Fix (1)

ryanw (131814) | about a year and a half ago | (#41772845)

When a game starts, it wants the entire desktop; it doesn't want the other desktop elements at all: no dock, no icons, no interaction, etc.

Why isn't there a function to create a new virtual desktop at any resolution you want and leave the other desktop untouched? Make the resolution a property of the desktop, so that when you switch between desktops it knows which resolution to switch to as well.

Seems like an easy fix.

Re:Easy Fix (0)

PPH (736903) | about a year and a half ago | (#41773073)

When a game starts, it wants the entire desktop,

I don't want it to have the entire desktop. I'm using it for other stuff as well. If you need the entire desktop, get a PS3.

Horrible idea! (0)

Anonymous Coward | about a year and a half ago | (#41772875)

Linux is a serious OS, it shouldn't pander to those Windows-raised babies that use computers for trivial shit like gaming.

Quick solution? Strip 3D graphics HW support from all X11 drivers. Problem solved.

The only problem as far as I'm concerned (0)

Anonymous Coward | about a year and a half ago | (#41772951)

The only problem as far as I'm concerned is SDL's god-awful and inconsistent handling of input when in fullscreen mode.

They go out of their way to use "raw input" methods which disable conveniences like alt-tab.
And if you're using SDL_ShowCursor(0) in fullscreen, woe unto you if you're using something like a tablet in a game like Civilization, as the cursor will be constantly offset by several thousand pixels each update. (I believe they blame the X guys, but this is SDL's fault, pure and simple.)

And the first issue was an absolute pain in the ass a few years back too.
SDL_Mixer, thanks to race condition bugs, happened to fuck with memory allocated to the NVIDIA driver and lock up various fullscreen applications too.

As a result, I've been conditioned to run everything windowed...
And Ryan Gordon, what the hell were you thinking by making your ports (like Aquaria) exhibit this exact same inane behaviour in windowed mode!?

This is one case where I'm with Martin and the Wayland guys: "fullscreen games" should render into an appropriately sized offscreen buffer which is then blitted onto the screen. It's sweet, simple, has little overhead, and if you happen to still have a CRT, it even avoids the moiré patterns with low resolutions.

captcha: evident

Re:The only problem as far as I'm concerned (0)

Anonymous Coward | about a year and a half ago | (#41773319)

A large number (all?) of those input inconsistencies with SDL in fullscreen have to do specifically with X never providing a proper way to go fullscreen, which is what this proposal is trying to fix.

Sam Lantinga (from Loki Games) (4, Informative)

mattdm (1931) | about a year and a half ago | (#41772969)

I don't know if kids today remember, but Loki Games was one of the first commercial plays for big-name games on Linux. It ended in tragic business troubles and financial doom.

It warms my heart to see that Sam Lantinga is still working on SDL.

That is all.

Re:Sam Lantinga (from Loki Games) (3, Informative)

bootkiller (2760523) | about a year and a half ago | (#41773393)

He's also working on the Valve Linux team now.

Re:Sam Lantinga (from Loki Games) (1)

game kid (805301) | about a year and a half ago | (#41773457)

I have had the feed of the SDL Mercurial changelog [libsdl.org] on watch for a good while, from back when I felt I could make a game with SDL within a reasonably short time.

Times changed, assorted Shit Happened (both within and without my PC), and my SDL tinkery and SVG tinkery became Blender tinkery (short YouTube video of mine) [youtube.com] became "fuck this I'll just play some Torchlight and roughly build a witch there instead", but I still had the feed on watch and saw a relevant change [libsdl.org] . The summary had a different tone from the many other changes I've seen before, enough to make me think "Something big will come from this." Then I closed the feed tab and went back to whatever the hell I was doing, I forget.

I admire Sam Lantinga's work (and patience) with SDL, given the collective trouble the various OSes bring. Many of the recent changes were for iOS...that must've been fun.

Wayland. (0)

Anonymous Coward | about a year and a half ago | (#41772987)

This is exactly the sort of case where Wayland should be used instead of patching around problems in X11 and/or window managers.

Yes, there are huge numbers of servers where XWayland can still be used.
For many, many users and use cases, X11 just needs to die, finally.

Re:Wayland. (1)

laffer1 (701823) | about a year and a half ago | (#41773509)

When Wayland supports !linux, it can be considered.

Eveything sucks and they're all doin' it wrong (1)

Corwn of Amber (802933) | about a year and a half ago | (#41773005)

Get it fucking right.

Even Windows had it right by the time of XP SP1, when no game worth even pirating actually broke anything when changing resolution.

How come Linux can't do that?

Yeah, I know some answers. Fuck those answers. I can install HackOSX on the same machine and it works even better even when it's not even supposed to.

That speaks of QUALITY.

Linux's state of hardware 3D support and everything that needs for it to Just Work Right fucking SUCKS, and that this article exists is a symptom of that. A band-aid on a wooden leg, a hack on top of a hack, and nothing works right anyway.

That's actually the first reason I hate using Linux. I'm one more data point in the graphs, but I know I'm not exceptional enough to fall outside the statistically significant bulk. Are YOU that arrogant? Do you deserve to be?

Games shouldn't need to change resolutions in the first place. If the real, unemasculated cards were sold at reasonable prices, everyone could play every current game at native resolution in the year it comes out, and next year's at native res with less eye candy.

Lol, no, not gonna happen. It's not like I don't know it. But that's what we deserve for spending money on things. A graphics card that can't display current games at native resolution is a scam deserving of a class action for damages beyond bankruptcy, until one company (or some open design) gets it right.

Just so that we use tech the way it's designed to be used.

Re:Eveything sucks and they're all doin' it wrong (0)

Anonymous Coward | about a year and a half ago | (#41773375)

I hope to goodness that English is a second language for you. Most of that didn't even make sense.

Every time I read about X11 problems (0)

Anonymous Coward | about a year and a half ago | (#41773013)

I think....you guys have heard of the Windows 7 distro from Microsoft right?

Well.. (0)

Anonymous Coward | about a year and a half ago | (#41773025)

As a hardcore gamer and extreme noob to linux...

I gotta say who the fuck knows what this means... it don't mean shit to me...

Sounds like linux is finally figuring out something that everyone else has had working since the early days of multitasking...
took 20 years to copy it eh?

Windows has similar problem (1)

detain (687995) | about a year and a half ago | (#41773065)

I'm glad they have a fix for this issue, but it isn't one exclusive to X11. Try loading a fullscreen game at an alternate resolution on a multi-screen desktop in Windows and you will see the other screens get all messed up.

Re:Windows has similar problem (1)

detain (687995) | about a year and a half ago | (#41773081)

Windows does, however, let you run windowed but taking up the whole screen to get around this problem (as opposed to running true fullscreen). Doing that won't let you run at alternate resolutions, though.

Run a dedicated X-server (3, Interesting)

smugfunt (8972) | about a year and a half ago | (#41773243)

Not sure what 'mess' is referred to in the title but I sidestepped the issues I met with Baldur's Gate by running it in its own X-server on a separate VT.
As I recall it just took a simple script to start the server and the game, and one extra command to make the mouse cursor less fugly. My main desktop remained completely undisturbed and just an Alt-F7 away. A little polish and this approach could be a good general solution, no?
