
The Future & History of the User Interface

ScuttleMonkey posted more than 8 years ago | from the pointing-at-point-and-click dept.


An anonymous reader writes "The Mac Observer is taking a look at UI development with lots of video links to some of the latest developments in user interfaces. It also has links to some of the most interesting historical footage of UI developments; here's one of the 1968 NLS demo. From the article: 'Sadly, a great many people in the computer field have a pathetic sense (or rather ignorance) of history. They are pompous and narcissistic enough to ignore the great contributions of past geniuses... It might be time to add a mandatory "History of Computers" class to the computer science curriculum so as to give new practitioners this much needed sense of history.'"


249 comments


I'm outraged! (5, Funny)

Cr0w T. Trollbot (848674) | more than 8 years ago | (#15922967)

Where are the glorious UI innovations like Clippy and Microsoft Bob?

Crow T. Trollbot

Re:I'm outraged! (1)

smitty_one_each (243267) | more than 8 years ago | (#15923078)

Devoured by the all-consuming CLI

Re:I'm outraged! (4, Interesting)

Tackhead (54550) | more than 8 years ago | (#15923159)

> Where are the glorious UI innovations like Clippy and Microsoft Bob?

On the shitcan of history, like the unreadable [google.com] choice of default font on Slashdot, the Star Wars Galaxies NGE [joystiq.com] , the changes to Yahoo's [thesanitycheck.com] stock message boards [gigaom.com] , and two recent changes to Google Maps: one that has broken printing [google.com] (users are now reduced to taking goddamn screen captures and printing those!), and another that auto zooms [google.com] and recenters on double-click, instead of merely re-centering the map, making navigation a time-consuming process of setting a desired zoom level, clicking to recenter, slowly loading a bunch of tiles you don't need, then unzooming back out, and loading yet another set of tiles.

In each of these cases, user feedback was nearly universally negative, and yet the "improvements" remain in place.

If this is UI innovation for Web 2.0, give me Web 1.0 back.

Re:I'm outraged! (0)

Anonymous Coward | more than 8 years ago | (#15923492)

Troll? Even if you haven't been on a Yahoo message board lately, look at the two Google threads. Google's own internal support boards show 74 complaints on printing and 46 complaints about auto zoom. Maybe you prefer the new /., but three very large web applications (Yahoo's message boards, and two of Google Maps' formerly greatest features - printing of large maps, and easy navigation) have been "upgraded" recently, all resulting in decreased usability. It's a trend, and it sucks.

Re:I'm outraged! (0)

Anonymous Coward | more than 8 years ago | (#15923230)

Wow, it is truly stunning how Microsoft Bob jokes NEVER get old.

Re:I'm outraged! (2, Interesting)

pilgrim23 (716938) | more than 8 years ago | (#15923291)

Clippy came out of Bob, and Melinda saw that it was good. But seriously, about that same time period was the dawn of the CD-ROM as a media type. Many magazines shipped with CDs, and each had a GUI. The gamer mags in particular had various custom GUIs for selecting their content. Some were based on shopping, some on an office or home (a la Bob), some on really weird stuff (anyone remember the elevator ride to hell on old PC Gamers?). It seems those were some real freewheeling days of UI development, but none of that seems to have gone anywhere.

Re:I'm outraged! (1)

Jurrasic (940901) | more than 8 years ago | (#15923557)

Heh, I still have a bunch of those PC Gamer CDs from 96-98 with the 'elevator ride to hell' UI to get to the demos. I loved those days personally. Standardized = Boring.

Re:I'm outraged! (1)

oudzeeman (684485) | more than 8 years ago | (#15923700)

I am the coconut monkey!

boy was I pissed when they changed his voice.

In other news... (0)

Riding Spinners (994836) | more than 8 years ago | (#15922975)

Graphical User Interfaces are intuitive because you can remember the location of things.

Re:In other news... (3, Funny)

celardore (844933) | more than 8 years ago | (#15923000)

Graphical User Interfaces are intuitive because you can remember the location of things.

That's easy. It's at c:/>Files\Home\Photos\1997\Family\Snaps\*.jpg

duh.

Re:In other news... (2, Funny)

AssCork (769414) | more than 8 years ago | (#15923052)


Could be taken on an 'oliday...Know what I mean? Eh? Wink Wink.

Looks great but (1)

BeoCluster (995566) | more than 8 years ago | (#15922993)

Can I make a Beowulf Cluster of these?

Re:Looks great but (0)

Anonymous Coward | more than 8 years ago | (#15923070)

oh yawn, can it run linux?

Multi-touch (5, Interesting)

identity0 (77976) | more than 8 years ago | (#15923032)

The multi-touch interface demo on Youtube was interesting, I saw it a while ago.

The thing that makes it different is how casual the interaction is compared to file & image programs today. You see the guy just touch the screen and rotate, zoom, and move images around and organize them, instead of opening up dialog boxes, secondary windows, or menus to access the functionality. It's very basic stuff, but you see how powerful it is, kind of like how Google Maps is compared to the old static kind of online maps.

It's like today's image programs are concerned with precisely doing something like zooming to exact levels (100%/50%/33%/etc.), but these programs let you do it to "whatever zoom feels right", without worrying you with the details.

Hey, speaking of which, I wish cameraphones had a much more fluid interface for picture organization, so I can add keywords, associate them with people in my contacts, etc... but what do they care, as long as they make money off the ringtones :(
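
A minimal sketch of the geometry behind that kind of pinch/rotate interaction, assuming a hypothetical input layer that reports two touch points per frame; the function names and coordinates are illustrative only, not any particular multi-touch API:

    import math

    def pinch_transform(p1_start, p2_start, p1_now, p2_now):
        # Derive a scale factor and rotation angle from how two touch points moved.
        # Each point is an (x, y) tuple; returns (scale, rotation_in_radians).
        def span(a, b):
            return math.hypot(b[0] - a[0], b[1] - a[1])

        def angle(a, b):
            return math.atan2(b[1] - a[1], b[0] - a[0])

        scale = span(p1_now, p2_now) / span(p1_start, p2_start)
        rotation = angle(p1_now, p2_now) - angle(p1_start, p2_start)
        return scale, rotation

    # Fingers start 100 px apart horizontally and end ~200 px apart at 45 degrees:
    # the touched image should roughly double in size and rotate by ~0.785 rad.
    print(pinch_transform((0, 0), (100, 0), (0, 0), (141.4, 141.4)))

The point is the same as in the video: the user never sees these numbers; the image just follows the fingers.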

Two mice. (2, Insightful)

Poromenos1 (830658) | more than 8 years ago | (#15923299)

Seriously. Most of that stuff can be done with two mice. Why hasn't anyone implemented that yet? Just grab the image from the ends and drag to resize, or drag one end to rotate, or whatever. Two mice would be much more natural. Sure, you'd probably use the one in your good hand more, but for some stuff it would be great (perhaps handling 3D models?).

Re:Two mice. (2, Interesting)

MadEE (784327) | more than 8 years ago | (#15923458)

Seriously. Most of that stuff can be done with two mice. Why hasn't anyone implemented that yet? Just grab the image from the ends and drag to resize, or drag one end to rotate, or whatever. Two mice would be much more natural. Sure, you'd probably use the one in your good hand more, but for some stuff it would be great (perhaps handling 3D models?).
It probably hasn't been implemented yet because it would be quite confusing to keep track of which pointer does what. There isn't that problem with these displays, as the pointer is under your finger and it's hard to lose. And there are a bunch of advantages to this over two mice, the most obvious being that it allows more than two pointers. Second, the display could allow physical tools to be used on it instead of simulating the tools. Third, it's touch-sensitive, allowing commands to be modified by pressure. I have to disagree with two mice being more natural: how much more natural can you get than actually touching the object on the screen? The only thing I can think of that would be more natural is touching the physical object itself.

Re:Two mice. (1)

lannocc (568669) | more than 8 years ago | (#15923547)

Exactly. But I'm holding out for two wii-motes :)

Re:Multi-touch (2, Interesting)

bunions (970377) | more than 8 years ago | (#15923364)

100% agreement.

A lot of the limitations on the UI stem from the hardware we use to talk to the computer. The multitouch stuff is awesome, and if/when we see some hardware support, you'll start to see some very, very interesting new stuff.

As much as I hate 'media' keyboards, if they were just standardized I'd be very happy. I'd love to have several software-configurable scrollwheels and sliders. Universal out-of-the-box support for secondary/tertiary/n-ary small LCD displays would also be nice.

Re:Multi-touch (2, Interesting)

Stalus (646102) | more than 8 years ago | (#15923637)

Honestly, I think the reason things like the multi-touch display haven't caught on is that we're too focused on having a single device for everything - or rather, the price point isn't low enough yet not to be. The vast majority of what people do on their computers is generate text... word processing, e-mailing, IMing, etc. That multi-touch display is no replacement for a physical keyboard. Yeah, you can pop one up on screen, but how many people have you heard complain about a keyboard just not feeling right? While such a display might be nice for art, photos, mapping, etc., it would be awful for what most people do most of the time, and businesses aren't going to sink money into that.

Another downside to such a display is that it would be physically stressful for prolonged use. There's a reason we train ourselves to touch type and use keyboard navigation. I'm not too thrilled by the idea of my primary interface being one that requires large arm movements.

Despite all that, the other reason UIs don't change much is that what people seem to see as UI advances just complicate the UI rather than simplify it. I love how the AI world spends all of its time trying to reduce dimensionality, while the UI world is always trying to increase it. Keep it simple! A cooler-looking, higher-dimensional UI is not necessarily a better UI.

Creaky and old fashioned? How about useful. (5, Interesting)

posterlogo (943853) | more than 8 years ago | (#15923040)

FTA: The current state-of-the-art User Interface (UI) we've been enjoying has remained largely stagnant since the 1980s. The greatest innovation that has been recently released is based on video card layering/buffering techniques like Apple's Expose. But, there is a large change coming. Rev 2 of the UI will be based on multiple gestures and more directly involve human interaction. Apple is clearly working in the area as some of the company's patent filings demonstrate. Nevertheless, these videos might make Mac (and Windows) users experience a huge case of UI envy, as a lot of UI development (in XGL in particular) makes the current Mac UI seem creaky and old fashioned.

The guy seems to think that the stagnation of the UI is an entirely bad thing. It seems to me that when something works well, people like to stick to it. I really don't think the majority of people need multiple desktops floating around let alone a brain interface. The only widely practical new UI technology I saw was multi-touch interactive displays (or touch screens in general, though they have been around for a long time and are still not very popular). As far as his comment that the new-fangled UIs make the Mac seem creaky and old, well, that's his opinion I guess. Some would just say the Mac UI is useful as it is. Even some of the new features in Leopard seem unnecessary to me. It's never bad to innovate, just don't automatically assume every new cool thing is practical or useful for most people.

Re:Creaky and old fashioned? How about useful. (2, Interesting)

megaditto (982598) | more than 8 years ago | (#15923095)

Reminds me of Alice in UNIX land [caltech.edu]. An oldie but a goodie.

What I am still waiting for is multi-pointer capable X11 (two mice) and pressure-sensing mouse buttons.

Re:Creaky and old fashioned? How about useful. (0)

Anonymous Coward | more than 8 years ago | (#15923182)

Everyone except the Dormouse was holding a paper cup, from which they were sampling what appeared to be custard. "Wrong flavor," they all declared as they passed the cup to the creature on their right and graciously took the one being offered on their left. Alice watched them repeat this ritual three or four times before she approached and sat down.

Immediately, a large toad leaped into her lap and looked at her as if it wanted to be loved. "Grep," it exclaimed.

"Don't mind him," explained the Mad Hacker. "He's just looking for some string."

"Nroff?" asked the Frog.

The Mad Hacker handed Alice a cup of custard-like substance and a spoon. "Here," he said, "what do you think of this?"

"It looks lovely," said Alice, "very sweet." She tried a spoonful. "Yuck!" she cried. "It's awful. What is it?"

"Oh just another graphic interface for UNIX," answered the Hacker.

Alice pointed to the sleeping Dormouse. "Who's he?" she asked.

"That's OS Too," explained the Hacker. "We've pretty much given up on waking him.

Re:Creaky and old fashioned? How about useful. (2, Informative)

kwark (512736) | more than 8 years ago | (#15923342)

I personally can't think of any use for it but a Multi-Pointer X Server already exists:
http://wearables.unisa.edu.au/mpx/ [unisa.edu.au]

Re:Creaky and old fashioned? How about useful. (1)

megaditto (982598) | more than 8 years ago | (#15923431)

Wow, they just released that in June. Don't know how I missed that, thanks.

Re:Creaky and old fashioned? How about useful. (1)

kfg (145172) | more than 8 years ago | (#15923446)

The guy seems to think that the stagnation of the UI is an entirely bad thing.

That's because he lacks a sense of the history of human/tool interfaces. Perhaps he should take a course.

KFG

Re:Creaky and old fashioned? How about useful. (1)

FudRucker (866063) | more than 8 years ago | (#15923550)

i actually like multiple virtual desktops, 4 to 6 is plenty, i have fvwm2 fixed up pretty nice with full paging & edge wrap. once you get the hang of it then going back to icons on a taskbar is klumsy and slow...

Re:Creaky and old fashioned? How about useful. (2, Funny)

gardyloo (512791) | more than 8 years ago | (#15923632)

i have fvwm2 fixed up pretty nice with full paging & edge wrap. once you get the hang of it then going back to icons on a taskbar is klumsy and slow...

    I'm sorry. "Klumsy" is a trademarked adjective of a different desktop environment. You have been warned.

Assuming that I won the lottery tomorrow... (3, Interesting)

MarcoAtWork (28889) | more than 8 years ago | (#15923046)

... and I could stop working and go back to university to get another degree full time and end up in research, where would the state of the art of the UI/human-computer-interaction field be? Which degree would one want to pursue? Where?

I've always been fascinated by HCI but have yet to be able to pursue this in a work-related setting (where I tend to write backend code, basically as far away from users as you could possibly get).

Re:Assuming that I won the lottery tomorrow... (2, Insightful)

pugh (631207) | more than 8 years ago | (#15923265)

Dude, if you win the lottery, have some fun. Buy a Ferrari, hang out on a tropical island for a while, do whatever you want. You don't have to go back into full-time education if you're rich.

Re:Assuming that I won the lottery tomorrow... (1)

MarcoAtWork (28889) | more than 8 years ago | (#15923292)

Who says I would have to? I would do it because it'd be a lot of fun: as many people around here can attest, coding and learning are great when you can pick the projects and the topics vs. doing your PHB's/shareholders' bidding.

Re:Assuming that I won the lottery tomorrow... (1)

pugh (631207) | more than 8 years ago | (#15923322)

I stand corrected. (Fingers crossed).

Assuming that I got educated tomorrow. (0)

Anonymous Coward | more than 8 years ago | (#15923527)

Questions, questions. The first degree is obviously psychology (either alone or as part of another degree, usually an M.S.).

http://www.si.umich.edu/msi/hci-reqs.htm [umich.edu]

http://informatics.iupui.edu/academics/hci/hci_ms_requirements.php [iupui.edu]

"I've always been fascinated by HCI but have yet to be able to pursue this in a work-related setting (where I tend to write backend code, basically as far away from users as you could possibly get)."

You may already have some of the requirements (see above). Fill in the rest, either self-study, or part-time schooling. But you have to be serious about this field. It's a LOT of work to become good.

Here's an example [nooface.net] of some of the things HCI produces.

Re:Assuming that I won the lottery tomorrow... (2, Interesting)

kbielefe (606566) | more than 8 years ago | (#15923658)

We have some human-machine interaction specialists where I work. I know their educational backgrounds are varied, but I'm not sure what the basic requirements are.

We make military aircraft, so they are concerned not only with the computer interaction in the cockpit, but also with the positions, labels, and feel of switches, knobs, controls, instruments, ejection buttons, etc. For some reason quick and reliable person-machine interaction is considered important when people are shooting at you. (Haven't we all been tempted to motivate certain Microsoft engineers the same way at one time or another?)

It's a lot of fun to go down to the simulator and watch these guys work, but I know there is a lot of tedious work in between the simulator "play" time. Just don't limit yourself to desktop computers when you think about possible careers in the field.

Overlapping windows (4, Insightful)

Peaker (72084) | more than 8 years ago | (#15923057)

Heh, the issue of User Interfaces always makes me laugh at the incompetence of seemingly the entire world when it comes to User Interfaces (or the whole computing world in general).

Some obvious trivial faults:
  1. The whole overlapping window model is bankrupt. You want to minimize the amount of information, especially redundant information, that the user has to input, and output as much information as possible in an accessible way. The overlapping window model does the opposite: it requires that you tile your windows manually (or through tedious, inaccessible menus) rather than specifying which windows you want to see. If you don't do that (and due to the required effort, most don't), then you don't see all of the information you want even though most of the screen is wasted space!
    For reference, just look at your screen now and see how much of it is covered by empty "gray areas". When you open a new window, does it hide gray areas, or real information?
    This is even more absurd when there are just a couple of windows, hiding each other, while the entire screen is free space! The computer expects YOU to work for IT and move these windows out of each other's way.
    This phenomenon is also felt in list boxes, where you are expected to adjust the column widths manually so they are not too short or too long, even when an optimal adjustment is readily available. You again have to work for the computer and press ctrl+plus to set it up. Most people don't even know about ctrl+plus in column list boxes.
    Some programs make it even worse and don't let you resize their windows even when the entire screen is free, so you have to scroll through their data in a little window.
  2. Internationalization and shortcut keys.
    What's so fascinating about this example is how common it is across platforms, programs, and operating systems.
    The feature is called "shortcut keys" and yet everyone implements it as "shortcut symbols".
    This is terrible - when you switch between languages, all shortcut keys break!
  3. A multitude of widgets with overlapping functionality. This is just silly and confusing to beginners. We need fewer widgets, not more.
  4. "Jumpyness". Today's GUIs all "jump". What I mean by that is that they don't smoothly switch from one state to the next, but rather do it in a single screen refresh. The human mind doesn't read that very well. For example, scrolling down "jumps" down a pageful instead of scrolling down a pageful in a smooth motion. (A rough sketch of smooth scrolling follows at the end of this comment.)
    The fact that fixing this would require modifications to all existing GUI programs is a certificate of the poor architecture of GUI software.


There are many more trivial issues to fix. Until they fix these, I find it very funny to talk about future directions for the User Interface. We haven't even gotten the basics right yet!
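
As promised in point 4 above, here is a rough sketch of the kind of smooth scrolling meant there, assuming a hypothetical per-frame repaint callback; the easing curve, duration, and frame time are made-up illustrative values, not any particular toolkit's API:

    def ease_out(t):
        # cubic ease-out: fast at first, settling gently at the end
        return 1 - (1 - t) ** 3

    def smooth_scroll(start, target, duration_ms=150, frame_ms=16):
        # Yield intermediate scroll offsets instead of jumping straight to the target.
        frames = max(1, duration_ms // frame_ms)
        for i in range(1, frames + 1):
            t = i / frames
            yield start + (target - start) * ease_out(t)

    # Example: a "page down" from offset 0 to 800 pixels spread over ~150 ms
    for offset in smooth_scroll(0, 800):
        print(round(offset))  # a real GUI would repaint the view at this offset each frame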

Re:Overlapping windows (1)

wootest (694923) | more than 8 years ago | (#15923109)

Smooth scrolling is one of many fixes needed for 4. In general I agree with your notions although I don't think that the overlapping window model is bankrupt.

Re:Overlapping windows (2, Interesting)

mcrbids (148650) | more than 8 years ago | (#15923119)

Your points are interesting. But they have already been largely mitigated!

1) Your points on overlapping windows are interesting. But KDE already addresses that. When I open a new window in KDE, it opens the new window over the area of greatest unused space. Overlapping continues, but as unobtrusively as possible. Contrast that with Windows' habit of opening windows about 1/4" below and to the right of the previously opened window, which almost assuredly wastes as much screen real estate as possible.

2) Can't comment on shortcut keys and internationalization, other than to state that most programs let you customize this.

3) Multitudes of widgets with overlapping functionality. Care to elaborate?

4) "Jumpyness" is more natural! When you flip pages in a book, you go from one page straight to the next. It doesn't "slide", you flip it and that's it. How is page up/down any different?

Re:Overlapping windows (1)

heinousjay (683506) | more than 8 years ago | (#15923323)

When is the last time you used Windows? 3.1? Applications open where they were last placed.

Re:Overlapping windows (1)

dan828 (753380) | more than 8 years ago | (#15923373)

"Jumpyness" is more natural! When you flip pages in a book, you go from one page straight to the next. It doesn't "slide", you flip it and that's it. How is page up/down any different?

It's not more natural, it's what you're used to. Realize that pages in books are a format that was dictated by technology, not by what was the best solution -- the older format for written works was the continuous scroll, which the reader could unroll at the top and roll up at the bottom for a continuous reading experience, much like how information is presented on the web. Sheets of paper were just a cheaper way of providing a writing surface, and later (with the advent of the printing press) a much cheaper way of producing information. Constraints of technology and price, and familiarity with the format, do not make something more natural; they just make it easier for you to use because it's what you expect.

Re:Overlapping windows (1)

bunions (970377) | more than 8 years ago | (#15923390)

"When you flip pages in a book, you go from one page straight to the next. It doesn't "slide", you flip it and that's it."

And that flip takes zero time? Last time I read a book, I could actually see the page being turned.

The problem with this sort of animation is that they're generally not fast enough to be unobtrusive. I don't need a 2-second animation of a page turn, I need a half-second one.

Re:Overlapping windows (1, Informative)

Anonymous Coward | more than 8 years ago | (#15923144)

When you open a new window, does it hide gray areas, or real information?


Unless you're using an extremely poor window manager, it hides the gray areas. Either that or you need to go into your KDE or Gnome preferences where you can specify this.


What are you using? IIRC this was considered a significant innovation in window managers when I was finishing college in the 90s, but it certainly hasn't been a problem for at least a decade.

Re:Overlapping windows (2, Insightful)

dr.badass (25287) | more than 8 years ago | (#15923149)

If you don't do that (and due to the required effort, most don't) then you don't see all of the information you want even though most of the screen is wasted space!.

Why make the assumption that you always want to see all of the information in all of the windows you have open? Just because a window is visible doesn't mean it is relevant to the current (and ever-changing) task.

Right now I'm typing a comment on Slashdot, but my mail client is open behind this window because I was reading email just a few minutes ago. When I'm done typing this comment, and close this window, I'll be back in my email client. Without the very simple ability to have windows overlap, I would be looking at two completely different and unrelated tasks, which doesn't seem like an improvement.

Today's GUI's all "jump".

All except Mac OS X.

Re:Overlapping windows (4, Insightful)

CaptainCarrot (84625) | more than 8 years ago | (#15923208)

I disagree.

  1. Overlapping windows are used to make more information available to the user than can be displayed on the available screen real estate. The RL metaphor is a collection of papers on a desk. You can't see every paper all at once, but you bring to the top of the pile those which you need. You do this for your own benefit, based on the needs of the moment, not for that of the desk -- or the computer. The whole point is that the space isn't tiled. I don't like working that way personally, and I suspect the reason we've moved away from that model is because most people don't. Remember the early Windows versions?

    You asked how much of the screen was empty space and therefore wasted? Very little of it, most likely. Very little of mine is as I type. Space with no content in it is not necessarily wasted. In fact, it most likely isn't. Space is crucial to how our brains organize what we see. If every square inch of the screen were being used, we'd see it as a jumbled mess. The best and most eye-pleasing presentation designs very carefully balance empty space against the space occupied by content. Take, for example, your original post against my reply. See how I create spaces between my paragraphs with properly structured P tags? See how much more readable that is?

    I agree that some programs are badly designed and make poor use of the model. That doesn't mean the model itself is broken.

    Yes, it would be nice for those very particular about their screen arrangements if they could save state between sessions and recover it immediately when they start back up again. This is an implementation issue -- remembering, of course, that most people prefer not to tile.

  2. Yes they're shortcut symbols really, but people have a hard time remembering arbitrary symbols. That's why we employ mnemonics, which naturally relate to the language of the interface. For example, it's easy to remember the shortcut to open a file in most word processors (ctrl-O) because "O" is the first letter in the word "open". It's not reasonable to expect such mnemonics, input through an alphanumeric keyboard, to work any other way -- unless you can think of a better one where alphanumeric input is both easy to remember and language-independent. Good luck.
  3. This is not an inherent fault in the model, but is a failure across an industry to standardize. In my own GUI design work in Motif, this is why I use the default behavior of the default widget set as much as possible. The users most often know exactly what to expect then.
  4. I remember when some word processors and the like included a "smooth scrolling" option. No one used it. It turned out that most people wanted the screen to scroll quickly instead.

Re:Overlapping windows (1)

CaptainCarrot (84625) | more than 8 years ago | (#15923233)

Ironic that I killed one of my own points in part through clicking on the wrong button. As a point of design, the "Preview" and "Submit" buttons on /. are rather too close together.

Re:Overlapping windows (1)

Gli7ch (954537) | more than 8 years ago | (#15923397)

Overlapping windows are used to make more information available to the user than can be displayed on the available screen real estate. The RL metaphor is a collection of papers on a desk. You can't see every paper all at once, but you bring to the top of the pile those which you need. You do this for your own benefit, based on the needs of the moment, not for that of the desk -- or the computer. The whole point is that the space isn't tiled. I don't like working that way personally, and I suspect the reason we've moved away from that model is because most people don't. Remember the early Windows versions?

I don't agree with your metaphor. The only reason you have overlapping papers on a desk is so that you have easy access to said papers. However, if you were writing on one piece of paper while referring to another, you would have the pieces of paper side by side. The need for easy access due to overlapping is removed thanks to wonderful inventions like taskbars. But if I were needing to read from, say, a CSS reference page while I wrote a stylesheet, I'd rather have them side by side while I work, and the current implementations for that are rather... crap.

Re:Overlapping windows (2, Interesting)

grumbel (592662) | more than 8 years ago | (#15923559)

I don't like working that way personally, and I suspect the reason we've moved away from that model is because most people don't. Remember the early Windows versions?

I think the trouble with tiling is that it simply doesn't work that well as a generic concept; there are simply too many applications around that are just too small to make sense in a tiled workspace, e.g. a small calculator [toastytech.com] should overlap, not tile, since otherwise it can't be seen in full and wastes a lot of screen space. However, in Blender or Emacs tiling works great, much better than MDI solutions which present windows inside a window; this is probably because Blender and Emacs deal with one kind of data only and don't have to work with hundreds of different applications which may have vastly varying requirements.

However, while tiling has a fair share of problems, our way of managing windows is also far from optimal: there is a lot of time wasted moving windows around and arranging the screen in such a way that it is actually usable. Apple's Expose helps a bit, but the real solution is probably to move to a fully zoomable desktop, so that one isn't restricted by screen borders but can simply zoom out when more space is needed. This also helps a lot with orientation, since you can simply place everything side by side and still reach it, and don't have to lower/raise yourself through a stack of windows.

It's not reasonable to expect such mnemonics, input through an alphanumeric keyboard, to work any other way -- unless you can think of a better one where alphanumeric input is both easy to remember and language-independent. Good luck.

How about an LCD Keyboard [artlebedev.com] that actually displays those shortcuts so that you don't have to type them blindly in the first place? It might of course still take a while until those are actually available and affordable, but the problem with shortcuts is certainly solvable in a better way.

I remember when some word processors and the like included a "smooth scrolling" option. No one used it. It turned out that most people wanted the screen to scroll quickly instead.

That isn't because smooth scrolling is a bad idea, but because it simply was badly implemented. Now I don't necessarily blame the developers for that, because some things simply can't be implemented well with today's hardware, e.g. when I press down I don't want the screen to scroll automatically for half a second; that's just not really a good way of doing it. However, I also don't want the screen to just jump around, since that is extremely disorienting. So what could the solution be? How about a pressure-sensitive scroll button: the harder I press the faster it scrolls, and scrolling both starts instantly when I press and stops when I release. Or how about a scroll wheel that actually scrolls smoothly instead of just sending up/down events on every click?

Re:Overlapping windows (1)

Stalus (646102) | more than 8 years ago | (#15923586)

Some comments:

#2: I think his complaint is that standard practice tends to be that mnemonics are chosen for English and are kept across translations. So, if you've ever used a platform in another language, you can often still use the computer without knowing that language because all of the shortcuts are still the same. As an English user, that's great... but they're no longer mnemonics for non-English speakers. The reason this is done is that it's typically desired to have mnemonics be unique within a given menu, so they would have to be chosen post-translation, and typical ship processes don't allow for that. (A small post-translation mnemonic-assignment sketch follows at the end of this comment.)

#3: While I agree that standards are good, the reason the industry doesn't standardize is that there is no perfect widget set. There are tradeoffs, and the lack of standardization is just the manifestation of those tradeoffs. However, I think they take it a bit too far... HTML doesn't exactly have a large widget set, but look at all the things you can do with it.

#4: You notice the reason for this a lot more when working with blind users. People can't stand any delay between the key being pressed and the information reaching them. For the blind, a seemingly unnoticeable delay between keypress and speech response will bother them. For me, it's the animation that brings up the menu before the menu is usable (I always turn that junk off). While animations are nice for inexperienced users, helping them train their attention to focus on a given screen area for a particular key press, once our attention focus has been trained, we get impatient with such things.
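
As mentioned in #2 above, here is a small sketch of what post-translation mnemonic assignment could look like: pick a unique accelerator letter from each already-translated label instead of reusing the English letters. This is a hypothetical illustration, not how any particular toolkit actually does it:

    def assign_mnemonics(labels):
        # Choose a unique accelerator letter for each menu label, working from the
        # translated text so the mnemonic stays meaningful in that language.
        used = set()
        result = {}
        for label in labels:
            for ch in label.lower():
                if ch.isalpha() and ch not in used:
                    used.add(ch)
                    result[label] = ch
                    break
            else:
                result[label] = None  # no free letter; fall back to a positional shortcut
        return result

    # The English and Spanish menus get different, locally meaningful letters.
    print(assign_mnemonics(["File", "Edit", "Format"]))        # {'File': 'f', 'Edit': 'e', 'Format': 'o'}
    print(assign_mnemonics(["Archivo", "Editar", "Formato"]))  # {'Archivo': 'a', 'Editar': 'e', 'Formato': 'f'}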

Re:Overlapping windows (1)

poot_rootbeer (188613) | more than 8 years ago | (#15923327)

Heh, the issue of User Interfaces always makes me laugh at the incompetence of seemingly the entire world when it comes to User Interfaces (or the whole computing world in general).

So it's you against The World on the subject of how a UI ought to work. Hmm, I wonder who is more likely to be right.

1. The computer has no way of knowing, other than via user input, which information is important, and needs to be made viewable. I actually prefer to manually set the size and placement of my windows exactly the way I want them, because I know better than the computer does what I'm trying to accomplish.

2. You're complaining about an edge case. The vast majority of users will only ever use a computing system in their own native language. It doesn't matter to me that the Spanish term for "Save" doesn't start with 's'; in English, Ctrl+S maps nicely to "Save".

3. I am interested to know which UI widgets you feel are superfluous. Running through the common ones (text boxes, menus, radio buttons, sliders, et al) in my head, I find each one to have a reasonable logical and/or graphical justification for existing. So what confuses you?

4. I want a UI to react to my commands as quickly as possible. If I hit "page down", I want to see the next page now; I don't want to wait 2 seconds as the content smoothly scrolls to the new location. And if UIs were designed so that they did this, I suspect you'd be searching for the option to turn it back off as well.

Re:Overlapping windows (1)

dosius (230542) | more than 8 years ago | (#15923454)

I keep all my programs maximized and switch between them sometimes from the taskbar and sometimes with a good old-fashioned Alt-Tab.

My keyboard layout includes a Compose key for typing weird caacs on the keyboard, and in my main program it's able to remote-control my mp3 player app. Play, back, next, pause, stop, in that order.

And my taskbar features a single-click run application button, current outdoor temperature, date and time.

btw, I don't find overlapping windows intuitive at all except for file management. Windows 1 might've actually had it right after all...

-uso.

Re:Overlapping windows (1)

dosius (230542) | more than 8 years ago | (#15923473)

Slashcode wrecked my use of foreign characters there (it says "caacs")

-uso.

Re:Overlapping windows (1)

Khashishi (775369) | more than 8 years ago | (#15923468)

I have to agree with the window model sucking. I tend to maximize all my windows anyway and alt-tab between them. I only window them when I need to copy and paste something where the copy and paste feature doesn't work for some reason.

Especially annoying are those apps that spew multiple toolbars and palettes in several child windows which constantly get in the way (as in most paint programs).

Re:Overlapping windows (1)

rpenguin (47905) | more than 8 years ago | (#15923565)

1. It's not bankrupt at all. Overlapping windows are very useful in filtering out which data you're currently paying attention to. I'm not the only one who is more productive without the distraction of superfluous information. On my Mac I run Backdrop, which is simply a solid black window which I bring to the front to cover windows I'm not working with. Additionally, I run SpiritedAway which hides windows that are idle. I can quickly add windows to an exception list to prevent this from happening when it's undesirable. In truth, I wish more applications had a fullscreen mode, without extraneous window boundaries and menu bars.

2. Obviously this is largely related to mnemonics over position. How do you determine what keyboard layout gets the most symbolic shortcut layout while the rest are merely positional? That said, shortcut keys are there to provide quick access to common actions, not inductive/intuitive paths.

4. Plenty of GUIs have smooth scrolling or transitional animations. Some are intrusive, some are appealing, but in general, once people are doing a task repetitively they are impatient and want it to happen quickly and crisply. With the exception of tracking moving objects, our eyes jump around and jerk quickly, so we're completely suited to it. Plenty of people feel very comfortable with the speed of a jerky page up/page down and find scrollbars tedious.

Re:Overlapping windows (0)

Anonymous Coward | more than 8 years ago | (#15923624)

Non-overlapping window managers exist.

wmii

Ion

ratpoison
Personally I use wmii. It solves a lot of the complaints you make in point 1. I switched to this from windowmaker where my working model consisted of one maximised window per workspace, I got annoyed with how much effort it took to look at multiple windows at once (resizing, moving etc.) and decided to find an alternative. This was the one I settled on, I'm very happy with it.

The Future is easy to predict here. (5, Insightful)

Anonymous Coward | more than 8 years ago | (#15923064)

In the long term, we'll be communicating with computers the same way we communicate with our pets, kids, and coworkers - with a combination of body language, voice, gestures, etc.

In the short term, we'll see Longhorn slowly and sloppily copy whatever Apple's doing; we'll see KDE and Gnome each copying the bad parts of what the other is doing; and we'll see all real computer users using emacs/vi/pine/xterm/screen like they always did.

Re:The Future is easy to predict here. (1)

identity0 (77976) | more than 8 years ago | (#15923286)

Speak for yourself - I speak to my coworkers and kids with beatings and profanity.

Come to think of it, I can do that today! [slashdot.org]

The Future Is Now!

Re:The Future is easy to predict here. (1)

grumbel (592662) | more than 8 years ago | (#15923317)

In the long term, we'll be communicating with computers the same way we communicate with our pets, kids, and coworkers - with a combination of body language, voice, gestures, etc.

I am not so sure about that. For some things, of course, voice and gestures are great, but the computer isn't just a dog or a coworker; it's also a tool, and I neither talk nor gesticulate to my screwdriver. Instead, I pick it up and get the job done with it myself, since that's simply a lot faster than trying to explain what should be done and how. And if I try to explain something to a coworker, a simple pen and a quick sketch can also do wonders compared to just talk and gestures. So I don't expect the keyboard or mouse to disappear anytime soon; they might get improved and enhanced, but I don't think they will ever completely disappear, unless of course we get a direct-to-brain interface that actually works.

One simple thing I expect to happen in the near future is mice that support rotation. Optical mice already have all the data they need to detect it, so it would be relatively easy to add and could provide some instant benefits: instead of just grabbing an item to move it, you could also rotate it without toggling a drag mode first; simply rotate the mouse and be done with it. This might also get important with zooming interfaces. Expose and the new XGL stuff already have plenty of zooming built in, so the next step of providing a fully zoomable desktop isn't that far away. Since zooming requires yet another axis, mouse rotation might be used for that, since it's an easy way to do it that doesn't require even more buttons on the mouse.

but (1)

geekoid (135745) | more than 8 years ago | (#15923355)

If, with the wave of your hand, the screwdriver would leap up and remove that pesky screw on its own, wouldn't you want to do that?

Re:but (1)

grumbel (592662) | more than 8 years ago | (#15923584)

If, with the wave of your hand, the screwdriver would leap up and remove that pesky screw on its own, wouldn't you want to do that?

How do I tell if the screw needs tightening at all? How does the computer figure out if I want it to go in or out? How does the computer tell me how tight it is? How do I pick a screw? If I just pick up that screwdriver and start working, all that information is easily available. Of course an automatic screwdriver might work better than a manual one, and even with mouse and keyboard at hand there might be situations where a voice command or gesture would help. Voice and gesture, however, have their limits, so they simply don't work for all situations. Just look at Star Trek: even with all that high tech around, they still have consoles that are operated by hand ;)

Re:The Future is easy to predict here. (1)

EvilIdler (21087) | more than 8 years ago | (#15923486)

>Since zooming requires yet another axis, mouse rotation might be used for that
I'd be perfectly happy to replace my current functionality on the mousewheel for zoom.
Virtual screens are available through other motions, anyway.

Zoom the desktop when not pointing at a program, and a key to hold down to make it all zoom
while in a program.

NLS demo = state-of-the-art in 1968 (0)

Anonymous Coward | more than 8 years ago | (#15923065)

WOW! Doug Engelbart (the speaker) was already using an electret microphone in 1968, right after it went into production! They used the latest high-tech bling in that demo!

Attention Apple Fags! (0, Flamebait)

heauxmeaux (869966) | more than 8 years ago | (#15923066)


Now all you ass-miners can get together for a group iFisting at your local Apple Store!

Hail (Rim) Jobs!

One of the coolest things... (2, Informative)

scorp1us (235526) | more than 8 years ago | (#15923089)

Was a memory storage system that consisted of liquid mercury. A speaker at one end would cause waves to travel the length of the vat of mercury. At the other end, the signal was picked up by a transducer (microphone), re-amplified, and then sent back to the speaker. If you wanted to change a bit, you had to wait for it to come around and short it to ground, or introduce a tone. Your amount of memory was limited by the length of your tube and the viscosity of the mercury.
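
A toy sketch of the recirculating idea described above, modeled as a fixed-length loop of bits that constantly circulates past a read/write point; the class and timing are purely illustrative, not a faithful model of any specific delay-line machine:

    from collections import deque

    class DelayLineMemory:
        # Toy model of acoustic delay-line memory: bits recirculate in a fixed-length loop.
        def __init__(self, length_bits):
            self.line = deque([0] * length_bits)

        def tick(self):
            # One transit step: the bit emerging at the far end is re-amplified
            # and fed back into the near end.
            self.line.append(self.line.popleft())

        def write(self, value):
            # To change a bit you wait for it to come around, then inject the new value.
            self.line.popleft()
            self.line.append(value & 1)

        def read(self):
            # The bit currently emerging at the read end.
            return self.line[0]

    mem = DelayLineMemory(8)
    mem.write(1)           # inject a 1 as the old bit comes around
    for _ in range(7):     # wait for a full circulation
        mem.tick()
    print(mem.read())      # -> 1 once the bit travels back to the read end

The capacity point falls out of the model: the longer the loop (tube), the more bits are in flight at once, but the longer you wait for any given bit to come around.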

Re:One of the coolest things... (1)

lakiolen (785856) | more than 8 years ago | (#15923288)

Are you perhaps thinking of the computer that Lawrence Pritchard Waterhouse built in the book "Cryptonomicon"?

Re:One of the coolest things... (0)

Anonymous Coward | more than 8 years ago | (#15923582)

No, delay line memory actually existed, for real, pretty much exactly as described.

Re:One of the coolest things... (1)

Duhavid (677874) | more than 8 years ago | (#15923303)

Temperature was important as well.

Re:One of the coolest things... (1)

gary chund (697151) | more than 8 years ago | (#15923369)

"... limited by the length of your tube and the viscosity of the mercury."

The link between GUI design and Pornography, by jove he's cracked it!

Nobody's paying attention (4, Insightful)

quokkapox (847798) | more than 8 years ago | (#15923093)

At least not to common consumer devices. I cannot even count the number of remote controls, microwaves, cellphones, dishwashers, ATMs, and other devices which seem to be designed completely without thought for the human who will need to use them.

Remote controls - ever heard of making the buttons distinguishable by FEEL, so I don't have to look down to tell whether I'm going to change the volume or accidentally change the channel or stop recording?

Microwaves - make the buttons we use all the time bigger and obvious. I can't use my microwave oven in near dark because the stupid thing's start button is indistinguishable from the power level button. That's just dumb. I don't need two different buttons that say "Fresh vegetable" and "Frozen vegetable" which I never use; and I have to babysit the popcorn anyway, so I don't need a "popcorn" button hardcoded for some random time limit. A microwave should have a keypad for entering time and bigger buttons labeled +1minute, +10seconds, ON, and OFF. That's all 99% of people use anyway.

The people who design interfaces should be made to use them for long enough so that they work out at least the most obvious design flaws.

I keep putting off buying a new cellphone because I know I will have to learn a new interface even to set the freaking alarm clock and it will probably take six menu choices to do it.

Re:Nobody's paying attention (3, Interesting)

Lugae (88858) | more than 8 years ago | (#15923173)

I have two more:

1. The gas pump where, as soon as you pick up the nozzle, the prices disappear and it asks you to "Select Product."

2. The ATM where the button you had just used to press "Withdrawal" would, on the next screen, withdraw $200. Shouldn't that position map to the smallest amount, or to a "Go Back" button?

Re:Nobody's paying attention (0)

Anonymous Coward | more than 8 years ago | (#15923229)

In fact, I think only two buttons are needed for microwaves: add-a-minute-and-turn-on-at-full-power, and open door. These are the only two buttons I ever use.

Re:Nobody's paying attention (1)

geekoid (135745) | more than 8 years ago | (#15923332)

"Remote controls - ever heard of making the buttons distinguishable by FEEL, s"
I haven't seen a remote control where the buttons weren't easily navigable by feel in years.

On my microwave the one-minute and start buttons are distinguishable from each other. Not in the dark, but who microwaves in the dark?

"The people who design interfaces should be made to use them for long enough so that they work out at least the most obvious design flaws."
The more you use it, the more intuitive it starts to seem.
They could use some UI testers, though.

Re:Nobody's paying attention (1)

grumbel (592662) | more than 8 years ago | (#15923409)

At least not to common consumer devices. I cannot even count the number of remote controls, microwaves, cellphones, dishwashers, ATMs, and other devices which are seem to be designed completely without thought for the human who will need to use them.

One doesn't even need to look at all those high-tech products to find bad user interface design; even something as simple as a door can be done extremely badly. Ever tried to push one that you needed to pull, thanks to the fact that both sides look the same? And faucets are not much better: things like separate knobs for warm and cold water are still very common. They are not hard to understand, but they make it nearly impossible to actually get water out of the faucet at the desired temperature, since it all turns into a mixing game. Or hobs whose controls make it very non-obvious which control maps to which burner, and so on. The Design of Everyday Things by Donald Norman has a ton more examples of simple things that are designed horribly.

Re:Nobody's paying attention (2, Insightful)

EvilIdler (21087) | more than 8 years ago | (#15923490)

Analog knobs rock. Heavily computerised interfaces outside actual
computers can be very annoying. Get a nice, cheap Korean microwave :)

Re:Nobody's paying attention (1)

Dr. Zowie (109983) | more than 8 years ago | (#15923499)

Sanyo makes a consumer level microwave/grill that is currently available at Target for about $100. The interesting UI feature is the stripped down user interface that has an analog knob to set the (digital) time.

Re:Nobody's paying attention (1)

RajivSLK (398494) | more than 8 years ago | (#15923514)

Actually, microwaves can be way better than that. My microwave has a really cool interface. Rather than a keypad it has a wheel that you rotate to indicate the time. Time increases in 10-second increments. The wheel is speed- and acceleration-sensitive. When the microwave is running, simply spin the wheel to add or subtract time. Popcorn not quite done? Spin the wheel and add 30 seconds. Want to use it in the dark? It's a giant wheel with two buttons next to it, "start" and "stop".

It's so simple that a completely blind person could use it. Simply rotate the wheel slowly and listen for the very soft audible "clicks" that indicate an increase in time.

Also, this particular model has a built-in convection oven. The perfect microwave.
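
A rough sketch of the speed-sensitive wheel behaviour described above, assuming a hypothetical per-detent callback from the hardware; the step sizes and timing thresholds are made-up illustrative numbers:

    import time

    class TimerWheel:
        # Map wheel detents to cook-time changes; spinning faster adds bigger steps.
        BASE_STEP = 10  # seconds per detent at slow speed (illustrative value)

        def __init__(self):
            self.seconds = 0
            self._last_detent = None

        def on_detent(self, direction):
            # direction is +1 (clockwise) or -1 (counter-clockwise)
            now = time.monotonic()
            step = self.BASE_STEP
            if self._last_detent is not None:
                interval = now - self._last_detent
                if interval < 0.05:      # spinning very fast
                    step = 60
                elif interval < 0.2:     # spinning moderately fast
                    step = 30
            self._last_detent = now
            self.seconds = max(0, self.seconds + direction * step)

    wheel = TimerWheel()
    wheel.on_detent(+1)    # one slow click adds the base 10 seconds
    print(wheel.seconds)   # -> 10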

Intuitiveness (3, Interesting)

FlyByPC (841016) | more than 8 years ago | (#15923120)

Amazing how naturally he uses the mouse -- back in 1968!

Re:Intuitiveness (1, Informative)

Anonymous Coward | more than 8 years ago | (#15923267)

You're aware he invented the thing, right?

self study as elective was denied (4, Interesting)

cadience (770683) | more than 8 years ago | (#15923162)

I graduated in 2003 with a BS in Computer Engineering and a BS in Software Engineering.

During my studies I proposed multiple times to do an independent study of the history of the computer field to count for 3 credits of my general electives. I was denied every time, even with support from the head of the Engineering department. The liberal arts department continually stated that the purpose of the electives is to gain breadth of knowledge. I finally took a (very interesting) class on Greek mythology.

I agree with the premise of increasing knowledge, but not the implementation. The college should encourage independent research when a student can blend his primary interests to meet a "credit based requirement".

What are your thoughts?

Understanding the history of your profession should be as important as understanding your culture and your history. Your profession will become a part of who you are as well! Without context, you're clueless.

Re:self study as elective was denied (1, Insightful)

Anonymous Coward | more than 8 years ago | (#15923296)

My cynical viewpoint is that certain parts of history are buried quite deliberately. Or at least I can understand you not being encouraged to research it; to some people it's controversial. Why? Partly because of the current madness regarding patents, and partly because modern culture encourages an arrogant disconnect from our greater history. No longer is it "the shoulders of giants" we stand on; kids today seem to think everything was invented in the last 20 years, as if we live on the roof of a great skyscraper built by magicians. To understand your technological history is to grasp the context and the limitations, and then to develop a little healthy contempt for the soothsaying futurist pundits who whip up a frenzy of hype around each dull and trivial new technological advance.

Many of the things we believe are new are just rehashed ideas that were ahead of their time. As we go forward, ideas that were too advanced get second chances at coming into the mainstream. Sometimes they succeed, other times they have to wait for the third or fourth chance. A generation kept in ignorance of its history can be fooled into thinking those old ideas are the creation of new developers.

Please, write your history of computing. Research it well and thoroughly, starting with the Babylonians, but remember to include multiple perspectives and generally unquoted and unrecognised foreign achievements. Look for prior art on every claim made in the last 30 years and you'll probably find that the person whose name is on the achievement "borrowed" it from work done 10-50 years earlier.
Nothing upsets "historians" with a vested interest more than revisionists, but it's a necessary part of the very formation of history and informed hindsight.

Re:self study as elective was denied (1)

geekoid (135745) | more than 8 years ago | (#15923379)

but it shouldn't be a general elective.

They were right.

But there should be a history of technical advances in the computer curriculum.
Not a study of dates but a study of what was done and why, as well as a chance for students to use the older UIs.

Useless Crap (1, Interesting)

Anonymous Coward | more than 8 years ago | (#15923170)

I looked at those videos, and most of the time the operator is spinning the damn cube, or waving a window.

Now you tell me, how much time are you going to spend doing that? I turned off the XP menu fade-in right from the start.

How about some useful stuff that actually helps organization, like virtual desktops did - a rather simple and cheap trick that didn't require everybody to upgrade to the latest octo-GPU CrossFire SLI-ed system.

MUD and MMRPG players know ... (3, Interesting)

cpu_fusion (705735) | more than 8 years ago | (#15923198)

That the biggest UI change yet to come has to do with moving from a single-user desktop metaphor to a collaborative virtual space that leverages a lifetime of perception of the real world. When computers evolve into a more transparent role in our lives, layering this digital world on our physical world will be next. It's coming sooner than we think - will we survive that long, though?

Sorry, I've got work to do... (1, Insightful)

Ralph Spoilsport (673134) | more than 8 years ago | (#15923215)

I don't see how the UI issues matter. I have work to do. If the UI does XYZ and I'm doing ABC, the UI is of no consequence. We regularly have idiotic flamewars here between people glued to CLI, and the zealots of GUI. Here it is, 2006, and I'm kicking builds in CLI using UNIX commands. I remember when the Mac came out all these CLI shitheads were barking "the Mac is a toy! REAL MEN use CLI and DOS!" Whatever. DOS bit the dust, and CLI is marginalised, but it hasn't disappeared because in specific ways, it's very useful. GUI was able to do 90% of what CLI did, and did it intuitively and easily. I don't see how these new UI innovations are going to improve on the work I do in a GUI in the same way the GUI improved on the CLI.

All that gestural stuff will make my work better exactly how? It's not gestural - it's just arm-waving of the "IN THE FUTURE..." variety.

HOWEVER: the I/O brush IS very k3wl. I can think of all kinds of fun stuff to do with that. It's an app, not a UI, but it's definitely fun.

RS

Re:Sorry, I've got work to do... (0)

geekoid (135745) | more than 8 years ago | (#15923315)

There is nothing the CLI does that can't be done with a GUI.

Nothing.
Unfortunately, most current GUIs just aren't that good.

Re:Sorry, I've got work to do... (0)

Anonymous Coward | more than 8 years ago | (#15923370)

ROTFFLMFAO

history of computing part 1 (2, Funny)

poot_rootbeer (188613) | more than 8 years ago | (#15923235)

"It might be time to add a mandatory "History of Computers" class to the computer science curriculum so as to give new practitioners this much needed sense of history.'"

Oh please no.

I had a mandatory Computers class in 6th grade (and again in 7th and 8th grade, with the exact same lesson plan). Half of this class was rudimentary BASIC programming on a room full of TRS-80s, the ones with the integrated green monochrome displays--and this was circa 1990.

The other half of the class was a purported history of computing, the key facts of which I can still recite today (learning the same thing thrice causes it to stick). These facts are:

- Charles Babbage made a mechanical computer.
- Then there were the UNIVAC and the ENIAC.
- The term "bug" is due to an actual bug Ada Lovelace found inside a computer.
- There are four kinds of computer: supercomputer, mainframe, minicomputer, and microcomputer.
- RAM stands for "random access memory"; ROM stands for "read only memory".
- Cray supercomputers are cool-looking.
- 10 PRINT "FART!!! "
- 20 GOTO 10
- RUN

but you get it wrong. (2, Informative)

geekoid (135745) | more than 8 years ago | (#15923297)

The first computer bug was not found by Ada Lovelace.
It was found by Rear Admiral Grace Murray Hopper, USNR (1906-1992):

http://www.maxmon.com/1945ad.htm [maxmon.com]
http://www.history.navy.mil/photos/pers-us/uspers-h/g-hoppr.htm [navy.mil]
http://www.history.navy.mil/photos/images/h96000/h96566kc.htm [navy.mil]

She was an excellent speaker who could make anybody understand anything, a real gift.

Even the most elementary exercise with your brain would have allowed you to figure out why it couldn't have been Ada Lovelace.

Re:but you get it wrong. (2, Informative)

nuzak (959558) | more than 8 years ago | (#15923592)

The fact that Hopper treated the find as a play on words at the time suggests rather strongly that the term "bug" was around long before then.

Edison used the term quite a bit. In fact, it goes all the way back to Shakespeare.

This differs from any other history how? (0, Flamebait)

Infonaut (96956) | more than 8 years ago | (#15923252)

Unless they're harboring some religious, ethnic, or national grudge, people generally don't know much about history. That's 3x as true in the US as it is in most places.

'Sadly, a great many people in the computer field have a pathetic sense (or rather ignorance) of history. They are pompous and narcissistic enough to ignore the great contributions of past geniuses...

I'm confused. Are computer folks ignorant about history, or are they knowledgeable about history and choosing to ignore it?

Whatever the next UI is, it won't be "intuitive" (5, Interesting)

Dr. Zowie (109983) | more than 8 years ago | (#15923264)

Several years ago I had the delightful privilege of talking about interface design with Jef Raskin (who designed many aspects of the Macintosh UI).

He pointed out that "the only intuitive user interface is a nipple."

Several days ago my wife and I had a new son, so of course I watched them learn (together) how to breastfeed. It was not obvious to either one of them how to make it work -- they had to explore and figure it out together.

It appears that Jef was wrong: even nipples are not an intuitive user interface.

Re:Whatever the next UI is, it won't be "intuitive (1)

geekoid (135745) | more than 8 years ago | (#15923277)

I have been saying that for years.
I never could figure out if he was wrong, or a genius.

Meaning, even the most seemingly intuitive interface has a learning curve.

I hope things worked out between your wife and son. It can be an extremely frustrating thing for a woman.

Good luck!

Re:Whatever the next UI is, it won't be "intuitive (1)

Dr. Zowie (109983) | more than 8 years ago | (#15923423)

Thanks -- they're doing fine!

Jef was that rare jewel -- a visionary who is willing to admit mistakes. The world got a little poorer when he passed away.

Intuitive User Interface (1)

jck2000 (157192) | more than 8 years ago | (#15923622)

Can you now tell me why the common head gesture for "yes" is to shake the head up and down, and the common head gesture for "no" is to shake the head left and right?

I kind of did something related during my degree.. (1, Insightful)

Anonymous Coward | more than 8 years ago | (#15923413)

During my degree, I studied a module which was essentially "The History of Programming Languages [bath.ac.uk] ".

Suffice it to say, it was the most soul-destroying, mind-numbing, useless waste of time that anyone on my course ever encountered. I'm sure it was down to the lecturer's "style", but it was really, really god-awful.

But I agree, perspective is important...just like everything though, it has to be taught well! :)

Hmmm... Project Looking Glass? (1)

spectecjr (31235) | more than 8 years ago | (#15923417)

Re:Hmmm... Project Looking Glass? (0)

Anonymous Coward | more than 8 years ago | (#15923464)

Check out the very end of the Alan Kay video in the article, around the 1:15:10 time marker. This is from around 1977: a HUGE full-wall screen showing a zooming interface called "Data Land", where you can zoom in on anything. This also seems to have been the first UI to use anti-aliasing.

Actually, here's the link:
http://video.google.com/videoplay?docid=4990841328277971528 [google.com]

This kind of proves the point that folks don't know anything about UI/Comp.Sci history.

"Sample Augmentation System" (1)

jthill (303417) | more than 8 years ago | (#15923526)

God. That video isn't just humbling, it's damn near humiliating. Compare it to the nextstep 3 [google.de] demo someone else posted, I think today. It isn't that nextstep isn't better - in many places it is, by far. But only in detail, and only in some ways. There's stuff in NLS I still want. Anybody else seen folding that good? Where? I want it. What the hell have we been doing for forty years?

Re:"Sample Augmentation System" (0)

Anonymous Coward | more than 8 years ago | (#15923626)

" What the hell have we been doing for forty years?"

Either cloning Windows [kde.org] , or cloning Macs [sun.com]

Gestures are overrated (1)

fragmer (900198) | more than 8 years ago | (#15923541)

I just don't see how remembering a dozen gestures would be more difficult than remembering a dozen keyboard shortcuts. In my experience, keyboard shortcuts are way faster too. Gestures are very effective for tasks like resizing/arranging windows and interacting with games. But that's about it. Keyboard/mouse (or keyboard/trackball) are much more efficient for most tasks.

Tired of Eyecandy... (1)

evilviper (135110) | more than 8 years ago | (#15923574)

I'm tired of how all GUI development is now centered around the GPU and more eye candy.

The features from OS X that people find useful, like a visual cue as to where a window is being iconified to, can be and have been done in much faster/simpler ways. For as long as I can remember, AfterStep has drawn an outline of windows being iconified, quickly showing the outline spiraling down to and shrinking into the icon.

Why is the rest of the GUI stagnating? Keyboard shortcuts are extremely primitive at best. Using TAB to navigate between fields is a rather nonsensical way to do things, particularly since you don't know if TAB is going to take you to the field to the left, right, up, down, etc. With something like Symbian, to get to the field below, you just hit the down arrow, and you're there.
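
A minimal sketch of that kind of arrow-key field navigation as a browser script (TypeScript; the selector, the key handling, and the assumption that "next in document order" means "the field below" are my own illustration, not anything Symbian or any existing browser actually does):

    const fields = Array.from(
      document.querySelectorAll<HTMLElement>("input, select, textarea")
    );
    document.addEventListener("keydown", (e: KeyboardEvent) => {
      const i = fields.indexOf(document.activeElement as HTMLElement);
      if (i === -1) return;                      // focus isn't on a known field
      if (e.key === "ArrowDown" && i < fields.length - 1) {
        e.preventDefault();
        fields[i + 1].focus();                   // next field in document order ("below")
      } else if (e.key === "ArrowUp" && i > 0) {
        e.preventDefault();
        fields[i - 1].focus();                   // previous field ("above")
      }
    });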

Browsers are even worse. They are beyond horrible when you try to use keyboard navigation. The notable exception is Links (similar to lynx), and yet nobody is adapting those highly intuitive and powerful keyboard navigation features to other browsers.

Having to scroll side-to-side while reading a webpage is absolutely the worst interface design ever conceived. Web pages aren't giant images or PDFs, after all. I was telling people, 10 years ago, that browsers needed to ignore any HTML code (and wrap/resize images) that forced the page to become wider than the browser window... And I was REALLY ranting on the subject about 4 years ago, when it was driving me crazy on my 240x320 PDA. Yet it was only about a year ago that Opera figured it out and included that feature, and still none of the other web browsers have even picked up on that important improvement.
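
A minimal sketch of that fit-to-width idea as a page script (TypeScript; my own illustration of the concept, not how Opera actually implemented it): clamp every image to the viewport and let long text runs wrap, so nothing can force horizontal scrolling.

    for (const img of Array.from(document.images)) {
      img.style.maxWidth = "100%";   // never wider than the containing block
      img.style.height = "auto";     // preserve the aspect ratio when shrunk
      img.removeAttribute("width");  // drop hard-coded widths from the markup
    }
    document.body.style.overflowWrap = "break-word"; // wrap unbreakable runs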

You can make my browser window as transparent and warped as you want, but it's not going to fix any of the REAL problems people have.

apply it to a calculator... (3, Insightful)

3seas (184403) | more than 8 years ago | (#15923615)

Take these fancy UIs, use them to control a calculator, and then decide if they're right for the job.

"Right for the Job" is the key phrase.

There are three primary UIs:

the command line (CLI)

the Graphical User Interface (GUI)

and the side-door port used to tie functionality together, known by many different names but in essence an Inter-Process Communication port (IPC)

Together they are like the primary colors of light or paint: take away one and you greatly limit what the user can do for themselves.

But if they were standardized with the recognition of abstraction physics (in essence what a computer implements), then the user would be able to create exactly what they need for the job they do by understanding and applying abstraction physics. The analogy would be mathematics and the Hindu-Arabic decimal system in comparison to the more limited Roman numeral system.

There are all sorts of user interfaces that can be created, but they are all made up of some combination of the primary three, perhaps lower down on the abstraction ladder but nonetheless there.

The reason this is unavoidable is simply the nature of programming.

Programming is the act of automating some complexity, typically out of earlier-created automations (machine language - 0s and 1s - is the first level of abstraction; everything above it is an automation). The purpose of automating some complexity is to create an easier-to-use-and-reuse interface for that complexity. And we all build upon what those before us have created. It's a uniquely human characteristic, which makes it our natural right and duty to apply.

What so-called computer science is guilty of is distraction by the money carrot, starting with IBM and wartime code cracking paid for by governments/taxpayers.

This distraction has kept us away from genuine computer science, or "abstraction physics" as it would far more accurately be described.

Abstraction physics is to the creation and manipulation of abstractions as mathematics is to the creation and manipulation of numbers, and as physics and chemistry are to the creation and manipulation of elements existing in physical reality.

With the three primary colors of paint you can paint anything you want, but you cannot call a painting "the painting" any more than you can call a mathematical result mathematics. Nor can you call some interface built upon the three primary UIs the silver bullet of UIs.

All this will become much clearer, more common, and even second nature once we all get past the foolish, fraudulent idea that software is patentable.

A Roman-numeral accountant, defending his vested interest in doing math with Roman numerals, argued that only a fool would think nothing could have value (re: the zero placeholder in the Hindu-Arabic decimal system).