
Futuristic UC Berkeley OS Tessellation Controls Discrete 'Manycore' Resources

timothy posted about 10 months ago | from the this-stuff-will-be-everywhere dept.


coondoggie writes "At the Design Automation Conference (DAC) this week, John Kubiatowicz, a professor in the UC Berkeley computer science division, offered a preview of Tessellation, describing it as an operating system for a future in which sensor-equipped surfaces, such as the walls and tables of a room, could be used via touch or audio command to summon multimedia and other applications. The UC Berkeley Tessellation website says Tessellation targets existing and future so-called 'manycore' systems that have large numbers of processors, or cores, on a single chip. Currently, the operating system runs on Intel multicore hardware as well as the Research Accelerator for Multiple Processors (RAMP) multicore emulation platform."


28 comments

Bad Marvel villain (2)

girlintraining (1395911) | about 10 months ago | (#43944217)

I know I'm going to hell for this, but I read "futuristic" and "tessellation" in the summary and immediately thought of Loki from The Avengers. A terrible villain, really; he just went bad because he had daddy issues. *cough* Crap... going off topic and triggering a flame war from Marvel lovers. Yeah, I'm taking the special bus to hell now...

Re:Bad Marvel villain (-1)

Anonymous Coward | about 10 months ago | (#43944265)

Clue: My rancid asshole is a tadpole sucker. What say you?

Re:Bad Marvel villain (1)

SuricouRaven (1897204) | about 10 months ago | (#43944703)

He was a nicely complicated villain in Thor - kept you guessing right to the end which side he was on, with his double-cross approach. He's a planner, not a fighter.

Yet another operating system (-1)

Anonymous Coward | about 10 months ago | (#43944239)

Why not use an existing one? Will Nvidia support it? How do we get 3D acceleration? 3D [youtube.com]

And this is different how? (0)

roman_mir (125474) | about 10 months ago | (#43944281)

So there's a weird new name for an OS, but how is it really different from any other OS? Every modern OS can do, and does, the same thing: allow many input devices and leave it to drivers to handle what each type of device is allowed to do.

I mean, it sounds like a system built around specific drivers rather than a generic system that lets any device be added by loading drivers as modules.

holy FAIL, batman! (1)

fascismforthepeople (2805977) | about 10 months ago | (#43945447)

So there's a weird new name for an OS, but how is it really different from any other OS? Every modern OS can do, and does, the same thing: allow many input devices and leave it to drivers to handle what each type of device is allowed to do.

Wow, Roman, just wow. We knew you failed math, physics, chemistry, biology, and (your favorite lecture subject) economics while taking courses at one of the largest publicly funded research universities in the western hemisphere. But apparently you failed operating systems as well? What you just stated describes very nearly every operating system ever made and would suggest that every OS is the same regardless of its lineage. I have never met a reasonable person who would suggest, for example, that DOS, Mac OS, Mac OS X, BSD, QNX, Solaris, and BeOS are all exactly the same, yet you just did exactly that.

Of course, you are not a reasonable person. Is a bizarre statement like the one you just made just another step toward your goal of establishing a fascist government in the United States (and of course eventually the world)?

Re:holy FAIL, batman! (1)

tragedy (27079) | about 10 months ago | (#43950063)

Based on the Slashdot summary, he does have a point. All the summary describes is a computer with multiple peripherals attached to it. Multiple monitors and sound devices and various input devices connected through who knows what buses. It sounds like they're just describing a home media server.

Sporadic scheduling (4, Interesting)

Animats (122034) | about 10 months ago | (#43944283)

Some of what they're doing with resource guarantees is like QNX's "sporadic scheduling" [qnx.com]. The idea is that you can guarantee a thread something like 1ms of CPU time every 10ms. This is useful for near-real-time tasks which need a bounded guarantee of responsiveness but don't need to preempt everything else immediately. Most UI activity is in this category. With lots of UI devices, including ones like vision systems that need serious compute resources, you need either something like this, or dedicated hardware for each device.

On top of sporadic scheduling there should be a resource allocator which doesn't let you overcommit resources. So if something is allowed to run, it will run at the required speed.

This is very useful in industrial process control and robotics. The use case for human interfaces is less convincing.
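QNX exposes this as the POSIX sporadic-server policy. Here is a minimal C sketch (not from TFA) of requesting roughly the 1ms-every-10ms guarantee described above via SCHED_SPORADIC; the field names are from POSIX, but the option is not implemented everywhere, and Linux notably lacks it:

    /* Sketch: ask for ~1 ms of CPU every 10 ms via POSIX SCHED_SPORADIC
     * (implemented by QNX; an optional feature elsewhere). */
    #include <pthread.h>
    #include <sched.h>

    static void configure_sporadic(pthread_attr_t *attr)
    {
        struct sched_param sp = {0};

        sp.sched_priority        = 20; /* priority while budget remains */
        sp.sched_ss_low_priority = 5;  /* priority once the budget is spent */
        sp.sched_ss_init_budget.tv_nsec = 1000000;   /* 1 ms execution budget */
        sp.sched_ss_repl_period.tv_nsec = 10000000;  /* replenished every 10 ms */
        sp.sched_ss_max_repl = 4;      /* max pending replenishments */

        pthread_attr_init(attr);
        pthread_attr_setinheritsched(attr, PTHREAD_EXPLICIT_SCHED);
        pthread_attr_setschedpolicy(attr, SCHED_SPORADIC);
        pthread_attr_setschedparam(attr, &sp);
    }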

Re:Sporadic scheduling (3, Interesting)

Anonymous Coward | about 10 months ago | (#43944499)

My thesis was about implementing a similar concept for Linux (http://saadi.sourceforge.net); it was working well in 2006, but I did not have the time to maintain it. The concept you describe has one flaw: for energy efficiency, most CPUs support dynamic voltage scaling, so a time-based CPU guarantee is useless. Instead, the guarantee should be expressed in CPU cycles per time period.
With that in place, it could also increase energy efficiency, because the frequency can be lowered to just match the guarantees.
One remaining problem is the required calibration for each CPU: a video player might require different numbers of CPU cycles depending on the CPU architecture.
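To make the parent's point concrete, a cycle-based guarantee would be translated into a time slice at whatever frequency the CPU is currently running; a hypothetical C sketch (the struct and helper are invented for illustration):

    /* Sketch: convert a cycles-per-period guarantee into a time budget
     * at the current CPU frequency. Lower frequency -> larger time slice,
     * so the scheduler and the frequency governor have to cooperate.
     * (Overflow handling omitted for brevity.) */
    #include <stdint.h>

    struct cycle_guarantee {
        uint64_t cycles;    /* guaranteed CPU cycles... */
        uint64_t period_ns; /* ...per this period */
    };

    static uint64_t budget_ns(const struct cycle_guarantee *g, uint64_t freq_hz)
    {
        return (g->cycles * 1000000000ULL) / freq_hz;
    }

For example, 1M cycles per 10ms period costs 500us of CPU time at 2GHz but a full 1ms at 1GHz, which is exactly why a purely time-based guarantee breaks under frequency scaling.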

Re:Sporadic scheduling (1)

anon mouse-cow-aard (443646) | about 10 months ago | (#43945519)

Allocating 1ms in 10 to a process sounds pointless, because in an environment with that many cores, getting a CPU will not be a problem. It's a bit like implementing QoS for VoIP on a gigabit Ethernet link: your VoIP channel is, what, a few kbps at most, a millionth of the available resource? What is the point of managing/scheduling it? The management overhead dwarfs the benefit in all but very rare cases.

What I suspect they are really after is assigning (i.e. dedicating) groups of processors that are geometrically localized (patches or polygons on the compute surface), so they can run multi-processor apps on the surface without getting bogged down by the noise of general-purpose processes. This is well known in supercomputing, where it is often termed reducing OS jitter, and it is very important. It amounts to reducing scheduling contention by undoing the last 50 years of OS work to share CPUs: CPUs are cheap, so dedicate them. This is typically the job of the batch scheduler, but I can see how pushing it down into the OS itself, in these days of many cores on a chip, is a useful improvement.
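Today this kind of CPU dedication is usually done with affinity masks set from userspace or by the batch scheduler; a rough Linux-specific C sketch (the choice of cores 4-7 is arbitrary):

    /* Sketch: dedicate cores 4-7 to this process so general-purpose
     * "noise" stays on the other cores (Linux-specific API). */
    #define _GNU_SOURCE
    #include <sched.h>
    #include <stdio.h>

    int main(void)
    {
        cpu_set_t set;
        CPU_ZERO(&set);
        for (int cpu = 4; cpu <= 7; cpu++)
            CPU_SET(cpu, &set);

        /* pid 0 == the calling process */
        if (sched_setaffinity(0, sizeof(set), &set) != 0) {
            perror("sched_setaffinity");
            return 1;
        }
        /* ... run the jitter-sensitive work here ... */
        return 0;
    }

The point of pushing this into the OS, as Tessellation apparently does, is that the kernel can then also keep its own housekeeping off the dedicated cores, which an affinity mask alone cannot guarantee.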

Re:Sporadic scheduling (1)

RicktheBrick (588466) | about 10 months ago | (#43945891)

I have a fast 4-core computer with a GPU with a gigabyte of memory. Even with all of that, Flash will bring it to a standstill. The worst is when the mouse pointer will not move. There are times when what I see lags behind what I am typing. There are times when, after closing a program, it takes several seconds before it disappears from the screen. I get messages about closing scripts before they make my computer look unresponsive. Most of those, if not all, have something to do with Flash.

stop abusing the word Tessellation (1)

decora (1710862) | about 10 months ago | (#43944309)

Tessellation means to cover a polygon with smaller polygons. That's it.

Now whenever you google it, it turns up all this garbage from clueless gamers who think it's some kind of Crysis 3 feature. Now on top of that we have to deal with this?

Re:stop abusing the word Tessellation (2)

Psychotria (953670) | about 10 months ago | (#43944335)

Tessellation means to cover a polygon with smaller polygons.

Actually, no. Tiles may be polygons, but they aren't always.

From wikipedia:

More formally, a tessellation or tiling is a partition of the Euclidean plane into a countable number of closed sets called tiles, such that the tiles intersect only on their boundaries.
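Spelled out in LaTeX (a transcription of the quoted definition, with the tiles written as T_i):

    % Tiling: countably many closed tiles cover the plane and
    % overlap only on their boundaries (pairwise disjoint interiors).
    \[
      \mathbb{R}^2 = \bigcup_{i \in I} T_i, \qquad I \text{ countable},\; T_i \text{ closed},
    \]
    \[
      \operatorname{int}(T_i) \cap \operatorname{int}(T_j) = \emptyset
      \quad \text{for } i \neq j.
    \]

Nothing in that definition requires the tiles to be polygons, which is the point.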

Re:stop abusing the word Tessellation (1)

Psychotria (953670) | about 10 months ago | (#43944341)

Look! http://www.wikipaintings.org/en/m-c-escher/horseman-1 [wikipaintings.org]

A tessellation that is not made up of polygons. Dang.

Re:stop abusing the word Tessellation (1)

drinkypoo (153816) | about 10 months ago | (#43945319)

So, those aren't many-sided figures? I suppose you think they're literally made up of infinitely many points, too.

Re:stop abusing the word Tessellation (0)

Anonymous Coward | about 10 months ago | (#43944827)

Not to mention what Google did to the word android. Every time I'm looking for information on androids, all I can find is crap about mobile phones and tablet computers.

Re:stop abusing the word Tessellation (0)

Anonymous Coward | about 10 months ago | (#43945149)

The movie Avatar did that to the word 'avatar' :(
That movie sucked so much sack.
The movie The Matrix did it to the word 'matrix'.
That movie was great (only the first one).

-HasH @ TrYPNET.net

Re:stop abusing the word Tessellation (1)

Goaway (82658) | about 10 months ago | (#43945321)

You do realize that what GPUs are doing in games like Crysis 3 is covering a polygon with smaller polygons?

Huh? (1)

stox (131684) | about 10 months ago | (#43944401)

"Is security an issue? Yes, Kubiatowicz acknowledges, suggesting cryptography, for one thing needs to be part of it."

When are people going to learn? Unless you design an operating system to be secure from the very start, it is never going to be really secure.

Some valuable lessons may be learned from this, but I don't see it having much of a future.

Re:Huh? (0)

Anonymous Coward | about 10 months ago | (#43944427)

Some valuable lessons may be learned from this, but I don't see it having much of a future.

It's a research OS, not a startup trying to replace Linux. The entire point is to demonstrate scheduling and resource partitioning. It would be a waste of student resources to spend time beyond the mission goals.

News for Nerds? (2)

TheRaven64 (641858) | about 10 months ago | (#43944523)

Well, it's definitely for nerds, but the Tessellation paper [acm.org] was published in 2009, so hardly news. For those who don't have ACM DL access, the paper [usenix.org] is interesting, but suffers from many of the same problems as LibOS / Exokernel approaches.

Re:News for Nerds? (0)

Anonymous Coward | about 10 months ago | (#43945097)

I don't want to be snarky, but which problems are you referring to, specifically?

Control transfer overhead?

Yawntastic (0)

Anonymous Coward | about 10 months ago | (#43944885)

Whatever. Whoopie-wow, man.

All interfaces between computers and people are just inconsequential incremental improvements on a single key input device.
Dit dit dit dah dah dit dit... The keyboard is just like a single key, with a hundred-odd times the vocabulary. A mouse is similarly simple.

I assert that no change is made by adding extra buttons to the keyboard or mouse, by integrating a keyboard into a wall or a desk, or by turning a door-knob into a scroll-wheel. Computers won't change substantially or move beyond the GUI in any meaningful way until they do away with the interface entirely, as it stands.

I am waiting for the day when they come up with a functional metaphor for a person sitting at a computer. I don't want there to be a keyboard or a mouse; I want the thing to take dictation. I don't mean dictation. one. word. at. a. time. like. this. I mean the entire OS is like Apple's Siri, except that it doesn't require an internet connection, it actually understands what I tell it in plain English (or whatever), and it actually works. Let's say I want to surf the internet. The computer with this ideal interface is hooked up to a screen. (Yeah, it still needs an output, obviously...) I turn it on by saying "computer on."

It fires up (by which I mean that, since it obviously had to be at least partially 'on' to hear and understand the command, it enters fully-on mode) and states, "your wish is my command, master," or whatever string I want it to say. It might display a human-like face, just for full feedback between me and the machine, so I can read the expressions on its artificially generated face. (That is, after all, a big part of human intercommunication.) I say "launch browser." It complies. "Which page would you like?" it asks.

"um, gimme NBC News, and load Pandora, 80's channel in the background." I tell it, and add, "also open up an e-mail to my mom."

It understands all that, opens Firefox 938.2 (which will be the version out by the time this idea is available...) and navigates to that page. It launches the Pandora internet radio program, goes to the 80's channel and starts playing, and then opens the e-mail client and begins an e-mail to my mother. The display indicates that Pandora and the e-mail are minimized/running in the background, and shows NBC, which of course is detailing yet another act of senseless violence committed against a dozen or so random innocent people by some dickhead with an assault weapon that our government is too cowardly to do anything about, of course, just as it will be then with the corrupt Demopublicans nominally running things.

"Tell my mom I won't be able to make it for dinner tonight, since downtown is completely jammed with emergency vehicles helping all the people injured by the latest shooter to attempt to murder large numbers of people, and every other way to get to her house is to go over roads and bridges that aren't safe for vehicular traffic because the country's infrastructure is falling apart and the government is too paralyzed to do anything about it anymore, so I'd have to go through downtown."

The computer understands all this, and simply e-mails my mom the word "shiiiiiiiiiiiiiiiiiit." And my mom of course, understands exactly what is meant, as her computer gets the e-mail, and tells her, "your son won't be able to come for dinner tonight, as he can't get here from there without going through downtown, which as I just got through telling you, is jammed with emergency vehicles since some dickhead went and shot a bunch of people with a weapon he shouldn't have had because no civilian needs to have one of those but our useless cowardly government can't even buy the balls to do anything about it, and all the other ways to get from there to here are crumbling, unsafe freeways and bridges."

All that would be accomplished without a keyboard or a mouse, and it would almost make up for the fact that every other day you turn on the TV or radio and hear about how some bag of flaming excrement shot a bunch of people with an assault weapon because our dickless government can't stand up to a bunch of retards who think the Second Amendment was intended to arm them against their government to keep it from relapsing to a tyranny like the British monarchy, because, well... they're retarded.

Until computer interfaces reach that level of competence and sophistication, it's still just such a heap of fetid, sticky mice and grimy, oily keyboards full of tiny fragments of every person who ever used them. To recap... I yawn in your general direction.

But a computer like what I've described would be brilliant! It would give you the effect of having a person to interface with the machine for you, isolating you from the machine, just as we are with cars.

(When was the last time you hand-cranked your engine to start it, or anything like that? We're mostly heavily isolated, and we LIKE it that way.)

Re:Yawntastic (1)

gl4ss (559668) | about 10 months ago | (#43945179)

Isn't that like using a computer through a secretary?
It's inefficient for most tasks.

Anyhow, buy the Xbox One when it comes out; at least then you can get NBC News by asking for it.

Re:Yawntastic (0)

Anonymous Coward | about 10 months ago | (#43947597)

All interfaces between computers and people are just inconsequential incremental improvements on a single key input device.

Well, yeah. And all computers are just incrementally larger collections of switches chained together.

I don't mean dictation. one. word. at. a. time. like. this.

NaturallySpeaking. Look at the time—1997 already.

The computer understands all this, and simply e-mails my mom the word "shiiiiiiiiiiiiiiiiiit." And my mom of course, understands exactly what is meant

Compression?

since some dickhead went and shot a bunch of people with a weapon

Sorry; the school was the only shooting range open at the time.

It would give you the effect of having a person to interface with the machine for you, isolating you from the machine, just as we are with cars.

Most people don't have chauffeurs.

When was the last time you hand-cranked your engine to start it

Does your computer still look like this [wikimedia.org]?

Re:Yawntastic (1)

PopeRatzo (965947) | about 10 months ago | (#43948701)

move beyond the GUI in any meaningful way until they do away with the interface entirely, as it stands.

Call me old-fashioned, but I like having an interface between me and the machine. Even if it becomes some haptic setup, I find it somewhat comforting to have a little something between me and it.

I hope my need to have congress with a machine never becomes so great that I have to do away with the interface.

Of course, we could skip the whole neural interface and go straight to the autonomous machines...
