
The User Experiences Of The Future

Zonk posted more than 6 years ago | from the just-give-me-a-cyber-implanted-hud-with-ai-assistance dept.

Patrick Griffin writes "The way that we interact with technology is almost as important as what that technology does. Productivity has been improved greatly over the years as we've adapted ourselves and our tools to technological tasks. Just the same, the UI experience of most hardware and software often leaves novice users out in the cold. The site 'Smashing Magazine' has put together a presentation of 'some of the outstanding recent developments in the field of user experience design. Most techniques seem very futuristic, and are extremely impressive. Keep in mind: they might become ubiquitous over the next years.'"

230 comments

Not sure 3D is always the best (4, Insightful)

PIPBoy3000 (619296) | more than 6 years ago | (#21492735)

They really seem to be pushing 3D interfaces in the article. While that's neat and all, I suspect there's a reason not every book is a pop-up book. Flat, 2D representations of data are typically the most efficient for our brain and eyeballs. For entertainment and representing 3D data, it can make sense. I just don't plan on coding in 3D any time soon.

Re:Not sure 3D is always the best (4, Funny)

TheSpoom (715771) | more than 6 years ago | (#21492873)

I suspect we'll have 3D interfaces when it becomes cheap to manufacture displays that can actually project a 3D interface. Screw 2D projections of a 3D world, I want my VR!

Speaking of which, the future needs the following three Star Trek items to solve everything all at once:
  • Teleporters (solves all transportation issues)
  • Replicators (solves hunger)
  • Holodeck (solves sexual ten... I mean, makes simulation much easier. Yes, that's it)
So seriously, science, it only took you like twenty years to catch up to the first Star Trek, what the hell?

*mumbles indistinctly about his flying car*

Re:Not sure 3D is always the best (5, Insightful)

Selfbain (624722) | more than 6 years ago | (#21492977)

The Star Trek version of a teleporter is essentially a suicide booth. It rips you apart and then makes a copy on the other end. Do not want.

Re:Not sure 3D is always the best (5, Funny)

TuringTest (533084) | more than 6 years ago | (#21493081)

You wouldn't notice when you've been terminated, and the other copy would still think that HE is YOU. So how could you tell the difference? And why should you care?

Re:Not sure 3D is always the best (4, Funny)

Selfbain (624722) | more than 6 years ago | (#21493139)

Arthur: I'd notice the difference.
Zaphod: No you wouldn't, you'd be programmed not to.

Re:Not sure 3D is always the best (5, Interesting)

Lijemo (740145) | more than 6 years ago | (#21493491)

How would we know it didn't happen this way:

You just experience being painfully killed: poof, bye-bye, and then afterlife, nonexistence, or reincarnation, depending on your beliefs.

Meanwhile, the copy of you with all your memories (well, all from before the "teleporter") doesn't realize that you experienced death-- or even that s/he isn't you but a copy. It would be the same to everyone you know-- they wouldn't be able to tell that you'd been replaced by a doppelganger. Your replacement, not knowing any better, would assure everyone that the process was completely safe and painless, and that "you" came out the other end just fine.

The only person that would know the difference is you, except you're not around anymore to know or tell. You're dead.

I'm not sure how one would test a teleportation system to see whether the person going in actually experiences coming out at the other end, or whether the person going in experiences death, and a copy at the other end doesn't know the difference. Or at least, how one could test it and relay the results to others.

Then we can further complicate the question: suppose that you die due to reasons unrelated to teleportation. And you last used a teleporter about a year back, but the teleporter saved your "pattern"-- so your grieving loved ones are able to "recreate" you, exactly as you were when you came out the teleporter-- the only difference is that you'd be confused as to how a year had passed since you'd gone in, and everyone else has memories of you during that time that you didn't experience. Is that you? Or not?

Re:Not sure 3D is always the best (2, Informative)

Z34107 (925136) | more than 6 years ago | (#21493657)

Then we can further complicate the question: suppose that you die due to reasons unrelated to teleportation. And you last used a teleporter about a year back, but the teleporter saved your "pattern"-- so your grieving loved ones are able to "recreate" you, exactly as you were when you came out the teleporter-- the only difference is that you'd be confused as to how a year had passed since you'd gone in, and everyone else has memories of you during that time that you didn't experience. Is that you? Or not?

Well, there was that one TNG episode where Scotty put himself into stasis by loading himself into the transporter buffer of a crashed starship. Also, it's apparently a good way to keep coffee fresh, which I suppose is incredibly important because it's not like you can just replicate yourself a cup whenever you want.

What I don't get is why they never replicated people by "transporting" them from the buffer more than once.

Re:Not sure 3D is always the best (1)

Flodis (998453) | more than 6 years ago | (#21493879)

What I don't get is why they never replicated people by "transporting" them from the buffer more than once.
Yeah, me too... I'm pretty sure the extra in the red shirt would appreciate it, at least.

Re:Not sure 3D is always the best (2, Informative)

Rolgar (556636) | more than 6 years ago | (#21494021)

Well, there was the episode where they found another Riker: he'd been successfully teleported out of a dangerous situation, but a copy was accidentally left on the planet (season 6, episode 24, "Second Chances").

Re:Not sure 3D is always the best (1)

Nullav (1053766) | more than 6 years ago | (#21493965)

I can't see that being much of a problem unless the subject subscribes to dualism. Well, except in the 'resurrection' example, in which case it would be confusing as hell for the copy.

Re:Not sure 3D is always the best (1)

Creedo (548980) | more than 6 years ago | (#21493497)

And why should you care?
Well, it might just be me, but I wouldn't want to be erased just to put a copy of me somewhere very quickly.

Re:Not sure 3D is always the best (1)

Orange Crush (934731) | more than 6 years ago | (#21493717)

I hate to break this to you, but this has already happened to you several times, albeit more gradually. Most (if not all) of your molecules have been replaced continuously throughout your life since you've been eating, drinking, breathing, sweating, shedding, growing, shrinking and using the bathroom (sorry, I didn't want to say "peeing and pooping" because it sounded too juven--well, shit).

*hand waving* (1)

Foerstner (931398) | more than 6 years ago | (#21493905)

No, it's not. There has been at least one episode that shows that, from the point of view of the transported person, consciousness is continuous.

The treknobabble explanation has something to do with quantum mechanics, which, as every sci-fi fan knows, is magic.

Re:Not sure 3D is always the best (2)

jellomizer (103300) | more than 6 years ago | (#21493057)

They all seem to work the same way... which makes me wonder: what if you cross-reference a holodeck with a replicator? Say you spend months or years in it, eating holodeck food, which works like real food while you are in the holodeck. After a while your body and mass would become completely a holodeck creation, as your old cells die and new ones are created from virtual matter. Then you could save yourself to disk, run backups of yourself in case you do something stupid, or just turn yourself off for a couple of centuries until they find a way to keep your projection in the real world. Or vice versa: take a holodeck character that you want to become real and have them eat replicated food made from real mass. After a while their body would become real, and they could leave the holodeck...

Re:Not sure 3D is always the best (1)

Rude Turnip (49495) | more than 6 years ago | (#21493187)

"...as your old cells die and new ones are created from virtual matter."

That wouldn't necessarily work, as it's the holodeck projector that controls the molecular composition of the food (albeit simulated), not your digestive system. The most that would happen is that you'd crap out a force field, which would be interesting in and of itself.

Going the other way you suggest, from hologram to organic, actually sounds feasible.

Re:Not sure 3D is always the best (1)

Reliant-1864 (530256) | more than 6 years ago | (#21493759)

Holodecks are a mix of both. They use force fields and holograms to create optical illusions that look and feel real, but they can also use a replicator to create real matter, like food. If you eat food in a holodeck, you are eating real food; if you were to try to eat an optical illusion, your body would be unable to digest it. Optical illusions created by force fields are fake and can't leave the holodeck, but a replicated item can. Replication is also very energy intensive, so it's only used when needed (and it closes the loophole of people leaving the holodeck still wet).

Re:Not sure 3D is always the best (1)

Bent Mind (853241) | more than 6 years ago | (#21493529)

I can certainly see the holodeck as a boon to business. It would be nice to be able to virtually attend a meeting rather than drive across town or crowd around a conference call. However, I suspect that the teleporter, and its simpler cousin the replicator, are going to meet with a lot of resistance.

Having a device that can create goods out of basic, locally available materials would kill the current economy. Granted, we would still have to pay for power and patented "Replicator" data designs. Still, it would represent a major change in the world.

Re:Not sure 3D is always the best (1)

eggstasy (458692) | more than 6 years ago | (#21493679)

Congratulations, you have just described what companies are doing with Second Life.
It's a holodeck. You can do anything with it, including meetings, classes, roleplay, whatever.

Re:Not sure 3D is always the best (3, Insightful)

mikelieman (35628) | more than 6 years ago | (#21492961)

I suspect we will find that the top percentile of expert users will instead eschew all the "innovations" and use a window manager like Ratpoison, which presents each window as it's own FULL SCREEN entity, without losing real estate to window borders, taskbars, and other widgets.

It's a Zen thing, you just wouldn't understand.

Re:Not sure 3D is always the best (2, Interesting)

TuringTest (533084) | more than 6 years ago | (#21493165)

presents each window as it's own FULL SCREEN entity, without losing real estate to window borders, taskbars, and other widgets. It's a Zen thing, you just wouldn't understand.
Actually, the real breakthrough in user experience would be an interface allowing that kind of 'zen' without needing to be an expert user. The Humane Interface [wikipedia.org] was a step in that direction, but its current proof-of-concept implementation is unfortunately not developed enough to live up to expectations.

Re:Not sure 3D is always the best (5, Insightful)

jythie (914043) | more than 6 years ago | (#21493353)

I think part of the problem in these various usability debates is that a good UI for learning and bringing in newbies is not the most effective solution once one has greater needs.

This 'one size fits all' mentality is the issue. We need interfaces that scale from basic to advanced, so basic users don't get slammed with all the advanced stuff and advanced users don't find themselves without the tools they need to actually do their work.

Re:Not sure 3D is always the best (1)

TuringTest (533084) | more than 6 years ago | (#21493507)

I think part of the problem in these various usability debates is that a good UI for learning and bringing in newbies is not the most effective solution once one has greater needs.
True, but why not? Only because we don't know how to make it work, not because it is a bad idea.

That's why we need a breakthrough. A system both learnable for newbies and efficient for experts is the holy grail of user experience. It can't be done with current mainstream commercial toolkits (too tied to the WIMP paradigm), but new technologies (like multitouch and gesture recognition) and new paradigms (like Programming By Example [wikipedia.org]) could be the way to build such complex systems.

Re:Not sure 3D is always the best (2, Interesting)

jythie (914043) | more than 6 years ago | (#21493785)

Unfortunately the two tend to be mutually exclusive.

When we look at all these slick 'intelligent' interfaces that are newbie oriented, they all hinge on the computer figuring out what the user 'intends' to do. They work because they wrap up and automate the common cases, but in doing so they inherently limit the possible functionality.

When one looks at these technologies, even things like Programming By Example, they are cases of automating the usage of the computer like an appliance. They tend to make life much more difficult for any task that requires digging into what the computer is actually doing, or for performing any task the UI developer did not consider important. A good example would be comparing a file browser to a command line interface... I have never seen a graphical browser that has even a small fraction of the capability of the command line, but they DO usually make the most common tasks much simpler.

The examples in the original article... these UI technologies are all very 'pretty' and add a nice 'ooh/ahh' element that will coax people into using computers and doing graphics-related things, but they really don't add much for, say, programmers, administrators, etc.

Re:Not sure 3D is always the best (0)

Anonymous Coward | more than 6 years ago | (#21493683)

And perhaps one day, we can use all that technology to learn the mind-bending concept that it's means IT IS?

Re:Not sure 3D is always the best (2)

jellomizer (103300) | more than 6 years ago | (#21492967)

I am not sure about this 3D technology. It is a good step in the right direction, but the semi-transparent images are still a real problem. Right now it looks really cool because it's like Star Wars... but for normal use it will get old fast. If it can make solid-looking objects too, then we may have the start of a good interface.

Re:Not sure 3D is always the best (1)

dave1791 (315728) | more than 6 years ago | (#21493079)

Never mind the fact that over-the-top physical desktop metaphors have never caught on in twenty years of being the "next thing".

Re:Not sure 3D is always the best (3, Insightful)

jibster (223164) | more than 6 years ago | (#21493101)

I humbly disagree with you. Our brains have clearly evolved for a 3D world. I believe the reason you think 2D is more efficient is that 3D has a very long history of not being done right. There's a good reason for that: 3D is far more computationally expensive than 2D, and we still lack a true 3D display and interaction device.

I offer as evidence the spring-and-plastic-ball models of molecules, and the skeletons in doctors' offices.

2D clearly has its place, but I expect 3D to start elbowing in on it as soon as the display/interaction and computational difficulties are overcome.

Re:Not sure 3D is always the best (2, Insightful)

grumbel (592662) | more than 6 years ago | (#21493631)

### Our brains have clearly evolved for a 3D world.

From where did you get that? Our movement is for the most part pretty much limited to 2D (forward, backward, left and right are fine, but up and down are heavily restricted), the earth is flat (at least from a human point of view), and there really isn't all that much true 3D in our daily lives. Sometimes we stack a bunch of 2D things into a hierarchical structure, but that's as 3D as it ever gets. Our eyes, of course, also only see a 2D view of the world -- sure, with a little depth mixed in, but nothing close to full 3D.

If we were built for 3D we wouldn't get dizzy playing Descent, but quite frankly, most of us do.

There is of course also that little problem with interfaces: a bunch of papers spread before me allows me to easily grab exactly what I want with a single click; picking the right piece of paper from a stack is much harder, since I simply can't see what is in the stack. I only see a 2D projection of the stack, and even a 3D display wouldn't change that.

That said, a little 3D does have its place: you do want the ability to zoom out, and maybe a little depth to see which window is on top and such. But having to search for a window that is hidden behind a stack of other windows just isn't fun, and that's exactly what you get with 3D.

Re:Not sure 3D is always the best (1)

jo42 (227475) | more than 6 years ago | (#21493303)

Someone forgot the pointlesseyecandy (tagging beta) tag...

Re:Not sure 3D is always the best (1)

hey! (33014) | more than 6 years ago | (#21493451)

Plus, we've already been down the road of user interface literalism -- it didn't go anywhere.

Last time, it was the adoption of systems with graphical capabilities that led us down the road; when people started getting VGA instead of text interfaces, they thought it would be oh-so-usable to use a graphical representation of a desktop for the desktop. We're in the same boat now with 3D.

The most useful UI ideas don't seem to have close real world analogs. The "window"? Where they have superficial real world counterparts, they don't behave at all like those counterparts ("folders").

It's not 3D that is questionable, it is literalism. I really like Project Looking Glass's "rotate window" technique, even though it makes no real world sense; the idea that a 2D construct like a window has a back side in a 3D environment is simply logical. Now find a way to make good conventional use of that space and you have a useful technique.

Ick. Glam substitutes for usability (2, Insightful)

Geoffrey.landis (926948) | more than 6 years ago | (#21493495)

The article isn't about user interfaces that make the interface actually more usable; it seems to be entirely about interfaces that are flashy and glamorous-- eye candy (and maybe, to a small extent, touch candy). The main problem with user interfaces today is that they are bafflingly opaque-- about the only way to learn most user interfaces is to just press all the buttons in sequence and see what they do. I hate glitz; I want function. Has anybody actually ever thought about figuring out what users actually need to do, and making the things that they do most often the ones that are easy?

...well, maybe I'm just crotchety because the DVD player just broke, only weeks after I finally learned most of the remote's cryptic functions. (The button with the diamond does this, and the button with the square plus a straight line does that, and the circle with a line through it does this... is anybody else disconcerted that, after two thousand years of refining the phonetic alphabet, in less than one generation we seem to have gone back to hieroglyphics?)

For most database schemas, 3D is better (1)

crovira (10242) | more than 6 years ago | (#21493667)

I feel for people who are still trying to make sense of their database schemas without explicit Relationships and in 2D.

I developed a schema and source code parsing technique for detecting Relationships in a well-normalized database, then took the output of that (750+ Tables & 1,200+ Relationships) and developed 3D (VRML) presentation techniques to let me SEE the Tables and Relationships.

I've used it to see the Tables and Relationships in other clients' databases as well. It's a very useful technique for analyzing the structure of financial databases.

I've even blogged about it. (http://oirc.blogspot.com/)
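
To give a flavor of the output side, here is a minimal sketch -- not the production code, just the idea. The table names and the circular layout are made up for illustration; in the real version the Relationship pairs came out of the schema/source parser described above rather than a hardcoded list. It writes a VRML97 world with one sphere per Table and one line segment per Relationship:

<ecode>
import math

# Hypothetical Relationship pairs (child_table, parent_table).
# The real list came from parsing the schema and source code.
relationships = [
    ("orders", "customers"),
    ("order_items", "orders"),
    ("order_items", "products"),
]

tables = sorted({t for pair in relationships for t in pair})
index = {t: i for i, t in enumerate(tables)}

# Lay the Tables out on a circle; a real layout would use a graph
# embedding, but a circle is enough to see the structure.
pos = []
for i in range(len(tables)):
    a = 2 * math.pi * i / len(tables)
    pos.append((10 * math.cos(a), 10 * math.sin(a), 0.0))

with open("schema.wrl", "w") as f:
    f.write("#VRML V2.0 utf8\n")
    # One sphere per Table...
    for (x, y, z) in pos:
        f.write("Transform { translation %.2f %.2f %.2f\n"
                "  children [ Shape { geometry Sphere { radius 1 } } ] }\n"
                % (x, y, z))
    # ...and one line segment per Relationship, indexing into the
    # shared point list.
    points = ", ".join("%.2f %.2f %.2f" % p for p in pos)
    edges = " ".join("%d %d -1" % (index[c], index[p])
                     for c, p in relationships)
    f.write("Shape { geometry IndexedLineSet {\n"
            "  coord Coordinate { point [ %s ] }\n"
            "  coordIndex [ %s ] } }\n" % (points, edges))
</ecode>

Load schema.wrl in any VRML viewer and you can orbit around the schema; with 750+ Tables you would obviously want a smarter layout than a circle.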

I'm now well rid of the whole mess (disease has sort of changed my perspective; that's why I podcast now), but you're welcome to the idea.

Most Notable Difference... (2, Insightful)

gillbates (106458) | more than 6 years ago | (#21492737)

You'll be able to squeeze in a trip to Starbucks between reboots. And this in the early morning, rush hour traffic.

Seriously, the most problematic part about today's user experience is that the majority of the computers run Windows, and more slowly than they did 20 years ago. Sure, you get nice, pretty graphics, but when you're actually trying to get work done, you'd rather have a responsive machine.

Sadly true (1)

boristdog (133725) | more than 6 years ago | (#21492835)

While I personally believe the new Windows Vista interface is clunky, confusing and generally annoying, many of my very non-tech-savvy friends think the new interface is great and easy to use.

Different strokes and all that.

Re:Sadly true (1)

DeeQ (1194763) | more than 6 years ago | (#21493029)

There is nothing wrong with the UI in Vista. Other than the annoying sidebar (easy to turn off), the UI hardly differs from XP. In my networking class at college we were all able to get a copy of Vista for 5 dollars, and almost everyone there enjoyed using it and toying around with it. The reason, I figure, is that we are all young and not dead set against learning a new thing. Granted, Vista may be slow on older machines, but it's lightning fast on my machine, and my machine is hardly top of the line. I would love to know exactly what you find confusing, because there's nothing confusing about the UI when compared to the UI of XP. It just looks shiny now. I'm under the impression most people who dislike Vista so much have not given it enough of a chance. By no means am I saying it's a great OS. It's just as good as XP; it is still Windows, after all.

Re:Sadly true (1)

ByOhTek (1181381) | more than 6 years ago | (#21493137)

The configuration of Explorer is quite different; it has more of an IE7 look and feel, much like XP's has more of an IE6 design.

I found it a bit annoying but usable, until I got to the list-view interface, which does not handle multiple selection with the mouse, or drag and drop, as smoothly as Explorer in XP or 2003.

Re:Sadly true (0, Redundant)

DeeQ (1194763) | more than 6 years ago | (#21493199)

I said Vista, not IE7; IE7 is on XP as well. I personally use Firefox.

Re:Sadly true (1)

ByOhTek (1181381) | more than 6 years ago | (#21493263)

I wasn't talking about IE7; I was talking about Explorer (the file manager in Windows).

I was using the IEs as reference points for what the designs looked like.

Re:Sadly true (1)

c_forq (924234) | more than 6 years ago | (#21493351)

And the poster was talking about Explorer resembling IE7, not about using IE7 as a web browser. Not to be an ass, but try actually reading the posts of people who respond to you - especially if you are going to respond back to them.

Re:Most Notable Difference... (3, Interesting)

capt.Hij (318203) | more than 6 years ago | (#21492847)

I would have strongly disagreed with your sentiment a short time ago but have changed my mind recently. The things in the article looked like fun but will have a hard time being accepted.

The thing that changed my mind is that I had to install a machine with Vista on it, which was my first experience with the new OS. The machine is a new dual-core Intel with 1GB of memory. It should be a screamer, but it performs essentially the same as the 5-year-old XP machine it replaced. The secretary who has to use the machine did not like it at all and wanted XP installed. To my shock, the reason was not any of the things that I thought were "important" but really just amounted to "it's different."

I used to have high hopes for a world of Linux desktops, but those have been dashed. Too many people prefer the old and comfortable to the new and cool. Apple and MS have the right idea: get 'em while they're young. Although the Wii does offer some glimmer of hope.

Re:Most Notable Difference... (2, Interesting)

cowscows (103644) | more than 6 years ago | (#21493835)

The problem with something like desktop linux is that (for the average user) the changes either don't show enough immediate benefit to make relearning worth while, or don't offer enough of a difference to make the change interesting.

Using the Wii as an example, as you did: the Wiimote is a pretty big change in how controllers work. Even if you don't see its potential right away, it's so different and a little bit wacky that it's interesting enough to make you want to give it a shot. But let's say that instead of using the Wiimote, Nintendo had decided to use their GameCube controller design, but mirrored the front of it so that the buttons were on the left and the stick/pad was on the right. And they backed up that decision with lots of testing that found it was more comfortable or something. It'd be a pretty significant change to controller design, but how well do you think it'd go over? I imagine it would be rather frustrating, and I think it'd be a tough sell convincing people that that change was worth retraining their thumbs.

Linux just isn't different enough from Windows, at least not in ways that matter to an everyday person. All that backend stuff, command line stuff -- none of that matters. At a desktop level, Linux doesn't really do anything significantly different from Windows, so why bother with it? Change for the sake of change isn't necessarily productive; old and comfortable often means efficient and cost-effective. Apple has historically had the same problem competing against Windows. I think there are a lot of ways you can pretty decisively say that the MacOS has been easier to use, or more consistent, or more pleasant, etc. than its Windows counterpart, but the differences have generally been a bunch of minor things. It's hard to get excited about a bunch of little things unless they pertain to a subject you already have a serious interest in. Most people don't have a serious interest in computer operating systems, so they don't care.

The big wrench in all of this is that things like malware have created a situation where there is a big difference, in that Windows is far more likely to have very apparent problems than MacOS or Linux. Apple seems to be making an attempt to capitalize on that with advertising and such, but Linux unfortunately doesn't have the same marketing budget.

Re:Most Notable Difference... (1)

xSauronx (608805) | more than 6 years ago | (#21492865)

When I began working for a WISP earlier this year I quickly moved from using Windows on my laptop to Linux. I'd always wanted to try it out (and I always loved multiple desktops!) so I jumped right in.

Something that is, to a lot of people, eye candy proved very useful to me: the cube desktop. In the field I didn't always have a place to put a mouse, and got very used to using the trackpoint on my ThinkPad. With the desktop cube able to freely rotate, I could very quickly move away from what I was working on, grab and rotate to what I wanted to see, and do it faster than with any other method.

As a bonus, it all runs VERY smoothly on my T40, which is old by any standard, with a 1.5GHz Pentium M and a Radeon 7500. I bumped the RAM to 1.5GB and am more than happy with it, and with the way the OS runs with any bells and whistles I care to turn on (this is with Ubuntu).

Re:Most Notable Difference... (2, Informative)

ByOhTek (1181381) | more than 6 years ago | (#21492993)

The latest and greatest Windows is typically slow and clunky, but all things considered, what alternative would you pick?

I ran Ubuntu on my notebook, next to FreeBSD and Windows XP. For responsiveness, it typically ranked like this: FreeBSD/KDE > Windows > Kubuntu or Ubuntu >> Windows while a virus scan was running.

Given that in Windows a virus scan usually is running, the last entry is required.

I wouldn't give the majority of users FreeBSD. There's MacOS, but my experience with it (on a dual-core Core 2 machine) doesn't lead me to believe it is any faster than a Windows machine (then again, I usually turn off the fade-in/out of menus in Windows, which is the biggest UI delay I've found).

Basically, while I agree with your point in the case of Vista being the worst, the problem seems to persist throughout most operating systems that a normal user could use.

Even going back to my old K6-III machine (circa 1999, fairly high-end), Ubuntu doesn't perform significantly faster than XP. MacOS X on a circa-'99 G3 Apple (also a fairly high-end model)? OK, I'll grant that as a bit faster than either of the above options.

I guess the point is, Windows isn't the only OS that is becoming bloated and inefficient. Everyone is doing it, it seems, and if we want a fix, we have to look at EVERYONE.

Re:Most Notable Difference... (1)

Scootin159 (557129) | more than 6 years ago | (#21493103)

Of course you could make it to Starbucks in time... by then there should be one within 50 feet of any point on the planet.

Changes will be evolutionary, not revolutionary (3, Insightful)

Entropius (188861) | more than 6 years ago | (#21492791)

The metaphors we're using now work pretty well, and UI changes in the future will probably consist more of refinements of these rather than totally new things, at least until and unless there is a major advance in display technology.

As an example of a well-engineered UI that can make otherwise extremely tedious tasks manageable: Google's Picasa photo manager. It manages to deal with huge amounts of data (3700x2600 jpg's or whatever 10MP comes out to, and 24MB RAW files), run quickly, and show you relevant stuff.

The 3D rotating super+tab screen for task switching in Compiz is another example of using extra computing power to show something useful.

Opera's introduction of mouse gestures is another good idea.

Re:Changes will be evolutionary, not revolutionary (1)

toppavak (943659) | more than 6 years ago | (#21493909)

The metaphors we're using now work pretty well, and UI changes in the future will probably consist more of refinements of these rather than totally new things, at least until and unless there is a major advance in display technology.

This is why I feel that, of the technologies displayed, multitouch has the potential to be the most pervasively applied. It's intuitive, it's useful -- it's just plain cool. The bit in the demo with the digital camera and the phone represents the core of what kind of potential a technology like that has -- provided said technology properly implements industry standards and doesn't seek to castrate features like that. Given Microsoft's track record with standards, it's quite possible that this is an issue that will have to be dealt with in the future, but we can hope it won't come to be.

Perhaps multitouch tables will spur the development and implementation of high throughput, short range and low power consumption wireless protocols? There's just something intrinsically cool about being able to share documents, images and videos just by setting my laptop down on said table and tossing those files around to various other devices.

The greatest UI was the fax machine (1)

Gizzmonic (412910) | more than 6 years ago | (#21492841)

A fax machine's UI is far more user friendly to novices and beginners alike. Is there some reason we don't design GUIs to mimic the fax machine? This, to me, is a substantial failing in modern UI design.

Re:The greatest UI was the fax machine (4, Funny)

spun (1352) | more than 6 years ago | (#21492913)

A fax machine's UI is far more user friendly to novices and beginners alike. Is there some reason we don't design GUIs to mimic the fax machine? This, to me, is a substantial failing in modern UI design.
Right, so you're saying we'll have to dial 9 before we open or print any documents?

Re:The greatest UI was the fax machine (5, Insightful)

khendron (225184) | more than 6 years ago | (#21493119)

You're joking, right? A fax machine's UI sucks. In my experience, very few people faced with sending a fax for the first time have managed to do so successfully. They always need help.

When you approach a fax machine, there is no obvious starting action to take. Do you dial first, or scan the pages first? Do you scan the pages one at a time, or can you put them down all at once? When you dial the number, there is no feedback that anything is happening. No sound of dialing, no sound of handshake. Just some cryptic messages like TX that mean absolutely nothing to a novice. Eventually the machine will spit out a page that, you hope, says somewhere on it STATUS: SUCCESS. If you do run into difficulty, you have to find the dead-tree manual to help you, because the messages on the little LCD display don't help much.

A fax machine's UI is about as user friendly as a linux shell without man pages.

I know why he likes fax machines (1)

Brett Buck (811747) | more than 6 years ago | (#21493653)

He also considers VI to be the greatest advance in productivity since the invention of assembly code and the acoustically-coupled modem.

          Brett

Re:The greatest UI was the fax machine (1)

advocate_one (662832) | more than 6 years ago | (#21493931)

Eventually the machine will spit out a page that, you hope, says somewhere on it STATUS: SUCCESS.

even after that you could have failed... I had the originals the wrong way up in the tray... the recipient got seven sheets of blank paper with fax headers and footers on them...

Re:The greatest UI was the fax machine (1)

MontyApollo (849862) | more than 6 years ago | (#21493211)

Not sure if you are joking or not, but a fax machine UI sucks for the beginner.

I think the more important point though is that people are novices/beginners for a very brief period of time. If the UI is geared towards the novice instead of the regular user, then productivity will suffer.

Re:The greatest UI was the fax machine (3, Funny)

Thanshin (1188877) | more than 6 years ago | (#21493465)

A fax machine's UI is far more user friendly to novices and beginners alike. Is there some reason we don't design GUIs to mimic the fax machine? This, to me, is a substantial failing in modern UI design.
I design my GUIs following the breakthrough design of VCR programming.

Re:The greatest UI was the fax machine (1)

cream wobbly (1102689) | more than 6 years ago | (#21493549)

Bullshit. I've only recently started using a fax machine, and I began by sending the backs of the documents, because I didn't realize the paper was supposed to go in face down. I interpreted the helpful explanatory icon the wrong way, because it's designed after GUIs. What's wrong with stamping "DOCUMENTS FACE DOWN" into the tray? Also, I'm new to the US, so I didn't realize you have to dial "1" before a ten-digit number. The fax machine saw me entering ten digits, and it was designed for the US market, so why didn't it prepend the "1"?

Let me spill over to microwave ovens: I put the mug in *here*, but by the time it's finished, the handle's all the way to the back. How much effort would it take to drive the turntable to a "home" position after finishing cooking?

And cellphones: I don't *want* emergency numbers to override the keypad lock. I also have fingers bigger than a capuchin's.

Pretty much all user interfaces are in need of some innovation. About the only ones I can't complain about are my car (which isn't a BMW with "iDrive") and my front door.

Productivity improved? (1, Interesting)

khendron (225184) | more than 6 years ago | (#21492885)

"Productivity has been improved greatly over the years"

It has? Where is this increased productivity of which you speak?

I see people doing things differently than they did years ago, but I would hesitate to call it increased productivity.

Re:Productivity improved? (1)

sseaman (931799) | more than 6 years ago | (#21493099)

It has? Where is this increased productivity of which you speak?
I'm pretty sure this can be more-or-less objectively quantified. I'm not an economist, but a quick Google search gives me quotes like:

One of the most impressive aspects of the current U.S. economy is the acceleration of productivity growth (that is, the increased output of goods and services per hour worked) that has prevailed since the mid-1990s.[1]

Money spent on computing technology delivers gains in worker productivity that are three to five times those of other investments, according to a study being published today.[2]

Of course, I'm at work right now, and I'm on /., so YMMV.

[1] [epinet.org] [2] [nytimes.com]

Not desktops! (1)

ThirdPrize (938147) | more than 6 years ago | (#21493257)

This is why Moore's Law was such an important thing. It may not have any noticeable effect on the desktop, but consider that a major company can do twice as much processing each year as it could the year before: Amazon can handle twice as many orders, the tax people can find twice as many tax dodgers, etc. While not every company upgrades every machine it owns every year, there is a knock-on effect. If Intel stopped increasing CPU power, the economy would grind to a standstill.

Re:Productivity improved? (3, Insightful)

kebes (861706) | more than 6 years ago | (#21493265)

Where is this increased productivity of which you speak?
I think it's easy to miss the increased productivity because our standards rise very quickly with enabling technologies.

For instance, I can sit down at my computer, grab dozens of scientific articles in a few minutes, write a summary of them, and have it typeset to publication quality with a few clicks. I can then launch a professional-quality graphic arts program to make a few figures. I then put it all together and send it to someone (who gets it within seconds).

The same operation would previously have taken much more time and money, not to mention specialist talent. (E.g. numerous trips to library, typing and re-typing a manuscript, hiring a graphic artist to make a figure, and mailing the finished product would have taken weeks of time, hundreds of dollars, etc.) And I haven't even mentioned things that are inherently compute-bound (e.g. how long would it take to run a complicated simulation today vs. ten years ago?).

In short, these technologies have enabled the individual to do things that previously only specialists could do, and have allowed everyone to complete their work faster than before. It's easy to dismiss this since the promised "additional free time" from increased productivity never materializes: instead we merely increase our standards of quantity and quality. Many people don't even see this as progress (e.g. many people would prefer handing off tasks like typing and typesetting to others, whereas nowadays the norm is for everyone to do this themselves).

Nevertheless, the net amount of "stuff" that a person produces (documents, designs, computations, admin tasks completed, etc.) has indeed increased in breadth, quantity and quality, due to the use of computers, networks, and our modern clever user-interfaces.

I, for one, am much more productive using a computer than I would be otherwise. And if anyone thinks that their computer isn't making them more productive, then I challenge them to try to complete their daily tasks without it, and see how long and arduous things actually are.

Re:Productivity improved? (1)

khendron (225184) | more than 6 years ago | (#21493567)

You say "these technologies have enabled the individual to do things that previously only specialists could do, and have allowed everyone to complete their work faster than before."

That is not an example of being more productive. Before these technologies existed, most people didn't *need* to do the things they now must do to complete their work.

Today I have a computer with a 3 GHz processor and 2 GB of RAM to help me do my job. When I first started my career, I used to time-share some processor time on a microVAX. Am I more productive today than I was when I graduated from University? I don't think so. In the end, my output is a computer program that does some sort of data processing. My program today might look fancier and be able to process more data in less time, but in the end it took me just as long to produce the program as it did years ago.

I know lots of very productive people who exist without ever directly using a computer.

Re:Productivity improved? (1)

pherthyl (445706) | more than 6 years ago | (#21493769)

Your program from today is much more complex, does a hell of a lot more things, and, as you say, looks nicer. If you say you are taking just as long to produce it as the old program, then by definition, you are more productive. Your argument makes no sense. Just because the output in both cases is "a program" doesn't mean they are equivalent.

Increased productivity (1)

dpbsmith (263124) | more than 6 years ago | (#21493417)

Oh, goodness. How can you even ask the question?

Would you believe it changed the whole basis of financial economics: everyone now "gets it" that the value of a stock is independent of whether the company is doing anything useful or making a profit?

Would you believe it created the dot-com revolution?

Would you believe it sparked the endless bull market and gave ordinary Americans access to the secret of wealth without work?

Would you believe it created 401(k)s growing at 20% per year and has made the average worker so resplendently rich that he couldn't care whether wages are flat?

Would you believe nobody really needs Social Security any more?

Would you believe that my used copy of "Dow 36000" is a steal at just $37.22 plus $7.95 shipping?

Re:Productivity improved? (1)

c_forq (924234) | more than 6 years ago | (#21493449)

I know in my field computers have caused productivity to leap vast amounts, but we use mostly AS/400 systems, and almost all of the office has thin clients and VNCs into the important stuff. Now, I will grant you that if we lose telephone or internet then the office grinds to a halt, and if the server goes down the warehouse can only work on already-printed tickets, but those outages are rare and keep getting scarcer.

Kissable UI (0)

Anonymous Coward | more than 6 years ago | (#21492919)

I think a kissable UI is the way of the future so that us Slashdotters can finally get some. Instead of a 'Submit' button, you have a "Make out with this picture of Jessica Alba to continue" screen.

Re:Kissable UI (1)

Thanshin (1188877) | more than 6 years ago | (#21493513)

I think a kissable UI is the way of the future so that us Slashdotters can finally get some. Instead of a 'Submit' button, you have a "Make out with this picture of Jessica Alba to continue" screen.
So, when you see Jessica Alba, your first thought is "I'd like to kiss her". Hmmm, interesting.

We'll have to try next with Scarlett Johansson and see what UI we can come up with.

No Mention of Touch Feedback? (1)

imstanny (722685) | more than 6 years ago | (#21492933)

What about Touch Feedback?

Ticker symbols IMMR and NVNT.OB (Novint Falcon sold @ CompUSA and supports Half-Life) come to mind.

Re:No Mention of Touch Feedback? (1)

Jeremy Erwin (2054) | more than 6 years ago | (#21493367)

Ticker symbols?
Is this some super-libertarian way of looking at the world? "If it hasn't gone public, it doesn't exist"?

Re:No Mention of Touch Feedback? (1)

imstanny (722685) | more than 6 years ago | (#21493551)

Actually, Jeremy, I was too lazy to type the full names of the companies.

I really don't see what libertarianism has to do with ticker symbols; it's like calling someone a Darwinist when they use C++ code in their response.

~sigh (0)

Anonymous Coward | more than 6 years ago | (#21492957)

I'm disappointed by the lack of info on augmented/mediated reality type displays. That's the real future of the UI. Something you can take with you, not a table.

Re:~sigh (1)

c_forq (924234) | more than 6 years ago | (#21493527)

I have to agree. I personally thought the coolest part of the Minority Report computer interface was the little portable screens. I wish that when I brought my PDA near my desktop it would augment the desktop screen, instead of the two just syncing. For example, if I could work it out so that, when linked via Bluetooth, my PDA became a display for unread e-mail messages, system information, or a music remote control or something like that, I would be ecstatic. That is the future I am most excited about.

The experience is in the details (3, Insightful)

TuringTest (533084) | more than 6 years ago | (#21492963)

Those futuristic FX have barely anything to do with what the final user gets as 'experience'. The real experience is about the feelings of the user.

Unfortunately, the most common feelings provoked by today's interfaces are anger and frustration. That's because the interface is littered with rough, unpolished edges, and because software is designed as a bag full of (unrelated) features instead of as a means to achieve an end: the process of actually using a feature is rarely taken into the design, let alone tested with users and debugged.

A really good development in user experience would be a way to force programmers to follow this kind of advice [joelonsoftware.com].

Neuromancer's Cyberspace (1)

lobiusmoop (305328) | more than 6 years ago | (#21493013)

What I'd really like to see in my lifetime is a fully immersive cyberspace-like interface, done via direct neural stimulation/reception through some kind of cranium socket. A TV programme a while back showed very simple versions of this already implemented, enabling a blind man to 'see' numbers via electrodes surgically implanted in his visual cortex. It would be amazing to scale it up to the full simstim thing as in Neuromancer, though.

Re:Neuromancer's Cyberspace (0)

Anonymous Coward | more than 6 years ago | (#21493305)

hey mate. I like your thinking.

I'm actually in the process of trying to get something like that off the ground.

I came up with a new video game system that has something like that. It's like the Wii, but with neural interface availability.

If you've seen .hack then it's something like that, only a bit different. It has Wii-esque controls that are compatible with any game system. You have sensor pads on your legs and your arms, so if your character has to walk, your legs move in reality to simulate walking; if your character has to kneel, jump, or whatever, then you do that. The sensors also respond to your character getting hit, so if your character gets shot at, you feel a sensor response -- dulled, though, so that you don't notice it that much. The neural interface part is for the head only: you have a VR helmet that you see everything through, and that is connected to the controls.

Again, this is only a design in my head, but if I had the funds and corporate backing, I could create a prototype.

The only thing I'd be worried about is the neural interface.

Love the icon! (0)

hellfire (86129) | more than 6 years ago | (#21493039)

This is completely off topic, but I love how the icon for a user interface story here is the original Apple mouse. You know... a mouse with one button? I hope all you one-button bashers get really bent over that, too! It is just too ironic :)

Re:Love the icon! (1)

edremy (36408) | more than 6 years ago | (#21493887)

And I hate it. Why?

Back around '85, I tried to teach my girlfriend (now wife) to use a mouse. She simply could not double-click reliably: she would click and, in the process, nudge the mouse, moving the icon and cancelling the double-click. I saw this from other folks as well back then; it was very hard for a lot of people to learn. Utterly horrible design. Then you get into the contortions needed to do simple actions with only one button. Will the file move or copy when you drag it? "Well, it depends on whether the target folder is on another volume." Bad design. How do you get info on a file? Single click + Apple-I? Yeah, that's a winner.

Now do it with an RMB click: RMB->Open, RMB->Copy, RMB->Move, RMB->Properties. Consistent, easy. Indeed, you can see what options are even available with one context-sensitive click, rather than selecting and scrubbing through menus until you find it.

"But you don't get confused with one button!" Bull -- folks back then were used to dealing with much more complicated interfaces. My wife is totally non-technical and doesn't get along well with computers, but she was perfectly capable of writing WP5.1 for DOS macros. It wasn't confusion; it was simple muscle learning that was the issue. Right-click was a lot easier to learn than "select and look through menus" or even double-click.

Re:Love the icon! (1)

Foerstner (931398) | more than 6 years ago | (#21494039)

The original SRI mouse (Engelbart, et al, mid-1960's) had one button. By 1968, it had three buttons. Only the left button was used in conjunction with pointing. The other two buttons were used to confirm and cancel actions.

In 1974, at Xerox PARC, I implemented a prototype of the Gypsy text editor that introduced drag-select and needed only one button. (The final version of Gypsy, 1975, which I developed with Tim Mott, used all three buttons because they were there.)

Apple's mouse first appeared on the Lisa. If we had been designing the Lisa for power users, we may have provided at least two buttons. But 99%+ of our target customers were new to mouse use at that time--new, in fact, to computer use. In our 1980 usability tests, we observed significant button confusion. "Which button?" "Oops, wrong button." Or worse, the wrong thing happening without knowing why. We also observed that users paused before clicking to think about which button to click. This not only slowed them down, it took their minds off the task and made them think about the tool.

--Larry Tesler
Manager, Lisa Applications and User Interface, 1980-82

(Poster has no relationship to the author.)

The Cheoptics360 (1)

Mononoke (88668) | more than 6 years ago | (#21493045)

Am I the only one who sees that this is nothing more than a giant four-sided heads-up display?

Sorry, not much here to improve productivity (0)

Anonymous Coward | more than 6 years ago | (#21493071)

FTA (with personal observations):

Future For Gamers: Cheoptics360(TM)... n/a for me
reactable... looks cool, but not sure how this will make me more productive
Multi-Touch... I don't really work by pushing items around on a table all day, so not much help for me.
Microsoft Surface... I don't really work by pushing items around on a wall all day either, so not much help for me.
Photosynth... not sure hand gestures will help me either, except for flipping off some person, place, thing or OS.
BumpTop... might be nice if it had video to it, but then the DRM issue would pop up.
Further References... n/a for me

RTFA yourself; it might work for you...

User experience (4, Insightful)

Alioth (221270) | more than 6 years ago | (#21493127)

AAArrrgh. User experience.

I don't want a user experience. If I'm having a "user experience", then the application or operating system is getting in my way. I want the OS or app to melt into the background so I hardly think that I'm using it.

Re:User experience (1)

jythie (914043) | more than 6 years ago | (#21494051)

Well put.

This is actually a good example of why I like OS X (running with most of the silly stuff turned off). It stays the expletive out of my way and makes an efficient task switcher. Outside of booting, launching the apps I want, and switching between them, I basically do not interact with it. Which is nice.

Keep in mind: they might not (2, Funny)

dpbsmith (263124) | more than 6 years ago | (#21493189)

"Keep in mind: they might become ubiquitous over the next years."

Why should I keep that in mind? Do I need to prepare myself mentally to compete in the brave new world? Do I need to worry that people who keep in mind that these interfaces might become ubiquitous will become so much better at operating computers than me that I'll become unemployable? Where can I find a community college course on how to play 3D video games?

But, but, but: the fear factor. They might become ubiquitous over the next years. Maybe. And then again, maybe not.

What if I back the wrong horse? What if I budget three hours a day to do exercises to hone my spatial perception skills to a scalpel-like edge, only to find that the real winners are those who anticipated the rise of olfaction-based user interfaces?

Well, gotta go... time to do my PL/I programming exercises. PL/I, it's the wave of the future, y'know.

My thoughts on User Interface Design (2, Funny)

hey! (33014) | more than 6 years ago | (#21493315)

It would be a good thing.

(user interface techniques don't count as design)

*sigh* more balls of crystal (1)

sm62704 (957197) | more than 6 years ago | (#21493319)

I'd be willing to bet this article will be proven wrong. No, I did not RTFA. Where's that Randi guy?

Pressure simulation. (1)

Thanshin (1188877) | more than 6 years ago | (#21493321)

What all human-machine interaction lacks is pressure simulation. Even the ugliest virtual reality, given some way of opposing the user's movement, would allow the creation of virtual objects.

The porn industry alone could finance the research and development, and then everybody would be able to use the technology.

However, apart from some experimentation with oil spheres, I don't think there are feasible options yet.

So, stop with the multitouch already. We've not used more than one finger to paint since we were 2. Start the pressure research and give us a better virtual reality.

Re:Pressure simulation. (1)

grumbel (592662) | more than 6 years ago | (#21493719)

### However, apart from some experimentation with oil spheres, I don't think there are feasible options yet.

Actually there is: haptic devices [google.com] are used quite a lot in the gaming industry; you see them in every other making-of video of a game. They aren't consumer items due to price, but the hardware is there.

Take Microsoft's Surface (3, Interesting)

SmallFurryCreature (593017) | more than 6 years ago | (#21493541)

They link to a review of it, so here is my own. We accept for the moment that it will ONLY work with MS software and MS-approved hardware.

I put my MS-approved camera on the surface; up pops an enormous window telling me I've got to agree to a EULA (exactly what happens when you access MS Media Player for the first time). It then finally allows me to download the photos. I then try to put them on my Zune 2.0. OOPS, cannot do that: the camera is digital and the Zune only accepts analog (Zune 2.0 doesn't allow the uploading of movies captured with a digital TV tuner, only analog tuners).

Starting to get the picture? All these things sound nice when you just see the pre-scripted demo, but when it comes to real life, it all just breaks down. Especially when it comes to Microsoft.

Same thing with multi-touch screens: very nice, but how much software will be written to make use of them when so few people have such a screen? I remember that System Shock, ages ago, had support for 3D helmets; it was a hot topic back then and one that never happened. SS was one of the few games to support such systems; the others wisely did not bother since nobody had such helmets, and because few games supported them, there was no point in getting one.

I could make a game around the Logitech G15 keyboard that makes the device indispensable to play, but I would be really hurting my chances of selling the game.

All these devices are interesting enough, but destined to remain obscure simply because people won't buy them unless there is a killer application, and nobody will build such an application until there is a larger installed base.

Logical conclusion... Web 3.0? (0)

Anonymous Coward | more than 6 years ago | (#21493613)

New technologies have wide-ranging effects beyond their initial application. When the internet first became public, who could have predicted sophisticated systems like MySpace and Wikipedia?

So what wider effects will these technologies have on society? I can see horrendous negatives like the Terminatorization of already nasty military robots [slashdot.org]. And amazing positives like the extension of the metagovernment [metagovernment.org] into a pseudo-virtual space.

The best UI is the one you can't see (4, Insightful)

petes_PoV (912422) | more than 6 years ago | (#21493671)

All these "futuristic" interfaces fall foul of the "flying car" effect. In the past people expected that by now (well, by about 1980) we'd all have given up out automobiles for flying cars. These UIs are the computing equivalent - they take our current limited experiences and extrapolate them.

In practice anything that involves waving your arms around, a la Minority report will be the fastest way to get tired arms ever invented. So that's the Reactible, Multi-touch and Microsoft surface out of the running. Imaging doing that for an 8 hour shift in your datacentre. Completely impractical, but like flying cars, looks great to the uninformed.

Let's face it, typing is quicker than mousing - you've got 110 keys at your disposal instead of just 2 (or up to 5 - wow wee!!!) and the limitation is the number you can press is limited by the numberof fingers you can manipulate at once - not the numebr of things you can press. Just try writing a letter by mouse clicks. Typing is even quicker than speaking - especially when you have to go back and change the phonetically (sorry fonetically) spelled words that come out.

Personally, all I want from a UI is one that doesn't steal focus from my window to pop-up a "Shall I delete all your files Y / n" just when I think I'm, going to hit in a text window. It should keep the clutter off my screen and just show me the stuff I want. Aeroglass is nowhere near this (and probably going in the wrong direction anyway - far too complicated). Let's just keep it as simple as possible, but no simpler.

oblig (0)

Anonymous Coward | more than 6 years ago | (#21493963)

PC Load Letter?!? What the fuck does that mean??

Tag: Vaporware (1)

writermike (57327) | more than 6 years ago | (#21493955)

Ahhh, "vaporware." It's the tag that dismisses everything in record time.

I don't understand what some folks would have these research centers do. Not work on new GUIs? Why not? Remember, many things never left PARC's labs. Some of it did and even more of it went into the current crop of GUIs that we have today. (One can debate on the ethics behind how the ideas made it out of PARC.) All of that research, even the stuff that didn't work, helped to achieve a better, more polished end-result.

And, of course, our current GUIs aren't necessarily the "best," but they're what works generally for now. So we should continue to look for new and better ways to interface with technology. Yep, some of it will be horrible, some won't work at all, and some will be announced but turn out to be little more than vapor. But everything (including the failures) will lead to the next better thing.