
A Hardware-Software Symbiosis

ScuttleMonkey posted more than 7 years ago | from the your-job-will-be-replaced-by-robots-soon dept.


Roland Piquepaille writes "We all want smaller and faster computers. Of course, this increases the complexity of the work of computer designers. But now, computer scientists from the University of Virginia are coming up with a radical new idea that may revolutionize computer design. They've developed Tortola, a virtual interface that enables hardware and software to communicate and to solve problems together. This approach can be applied to get better performance from a specific system. It can also be used in the areas of security or power consumption. And it could soon be commercialized with the help of IBM and Intel."


Strawdick (2)

Edie O'Teditor (805662) | more than 7 years ago | (#19384943)

The thing I want to see smaller is the number of Roland's copy-pasted shill posts here.

Hot Chick (1)

spun (1352) | more than 7 years ago | (#19385067)

The thing I want to know is the number of the hot chick who invented this. Yowza!

Re:Hot Chick (0)

Anonymous Coward | more than 7 years ago | (#19385127)

She's married, sorry.

Re:Hot Chick (0)

Anonymous Coward | more than 7 years ago | (#19385263)

The thing I want to know is the number of the hot chick who invented this. Yowza!
434-982-2228

Unfortunately, she's married to someone named Matt Cettei, but don't let that stop you.

Re:Hot Chick (0)

Anonymous Coward | more than 7 years ago | (#19385435)

Sorry, I think you're on the wrong website. I believe you want stalkdot.org

Re:Hot Chick (0)

Anonymous Coward | more than 7 years ago | (#19387169)

If you can't find a number listed on the top of her homepage, she's out of your league.

I have her number... (1)

PRMan (959735) | more than 7 years ago | (#19389037)

(434) MAR-RIED

Re:I have her number... (0)

Anonymous Coward | more than 7 years ago | (#19389181)

Considering that most American women are skanks anyway and that if she's like the vast majority of American women she's already had probably 25 sex partners before she got married, I'd say that their chances are pretty good. Give her a ring.

Re:I have her number... (0)

Anonymous Coward | more than 7 years ago | (#19390367)

Imagine her naked and petrified! My hardware could manage hers any day.

Re:Strawdick (0)

Anonymous Coward | more than 7 years ago | (#19386463)

This is delicious copypasta. You must eat.

Roland Picpauilwqailuile submits: (1)

Anonymous Coward | more than 7 years ago | (#19384951)

"Reinventing the wheel for profit!"

NoROLAND NOROLAND NOROLAND (0, Offtopic)

zappepcs (820751) | more than 7 years ago | (#19384975)

I can't figure out who wrote the NoRoland Greasemonkey script... but I've been meaning to say thanks!

Re:NoROLAND NOROLAND NOROLAND (2, Insightful)

bigtangringo (800328) | more than 7 years ago | (#19385095)

Why the Roland hate? Many of the stories he submits are valid, interesting stories.

The stories he submits link directly to the article, it's only his submitter link that goes to his blog. I rarely, if ever, look at who submitted the article.

If he somehow profits from submitting articles I'm interested in reading, more power to him.

Re:NoROLAND NOROLAND NOROLAND (0)

Anonymous Coward | more than 7 years ago | (#19385157)

The reason for all the Roland hate is that he didn't always submit articles in the fashion you mention above. His articles would have 3 or 4 links, all of which went to his blog. Only once you were on his blog would you find a link to the original source. It took a whole lot of complaining and several months before he changed his practices. I agree, now, it's not bad. But there's a history there that can't be ignored...

Re:NoROLAND NOROLAND NOROLAND (2)

drinkypoo (153816) | more than 7 years ago | (#19385525)

I agree, now, it's not bad. But there's a history there that can't be ignored...

I read (and commonly vote up) Roland's stories now since he mended his ways. I mean, he did what we were asking, the least we can do is be appreciative.

Re:NoROLAND NOROLAND NOROLAND (2)

tyme (6621) | more than 7 years ago | (#19385541)

Yes, Roland used to spam Slashdot with articles linked to his own web page, which was reason enough to hate him. Thankfully, his posts now link to the actual articles, but that doesn't really help matters all that much. Unfortunately, Roland never seems to understand the point of the articles to which he links. He always writes sensationalist headlines and summaries that have almost nothing to do with the actual article he references. He picks out specific buzzwords from the article, buzzwords that excite the most crackpottish centers of the brain, then constructs a boilerplate submission based on those buzzwords.

Is this really a good reason to hate Roland? Probably not. It's not like anyone is forcing you to read his drivel; you can always ignore his articles. At least he isn't pushing his own crackpottery with his articles (à la Time Cube or Mentifex). Still, he gets under the skin.

hardware/software communicating? inconceivable! (3, Informative)

tuffy (10202) | more than 7 years ago | (#19385019)

A middle layer between hardware and software sounds a whole lot like an operating system - the sort of thing that "would allow software to adapt to the hardware it's running on". I can't figure out from the article what makes this thing so special.

Re:hardware/software communicating? inconceivable! (0)

Anonymous Coward | more than 7 years ago | (#19385045)

I can't either, but there's a girl in the photo, so let's keep quiet and just appreciate it.

Re:hardware/software communicating? inconceivable! (1)

DragonWriter (970822) | more than 7 years ago | (#19385083)

Well, something that allows software to adapt to the hardware it's running on sounds to me a lot like an optimizing compiler, but only if the software is distributed in source form.

Re:hardware/software communicating? inconceivable! (1)

jimstapleton (999106) | more than 7 years ago | (#19385227)

Actually, it specifically mentions communication, so I'd say the OP is closer to correct, though a more specific answer would be "driver"

Re:hardware/software communicating? inconceivable! (1)

N3WBI3 (595976) | more than 7 years ago | (#19385087)

I was thinking the same thing; I had to read over the article several times to make sure I was not missing something. In the end, I think she is after some uber-firmware. I mean, we have the OS, we have firmware; what else does she intend to do? Sure, you could make your hardware components more programmable, but the cost in terms of speed lost in operations and complexity of design far outweighs any benefit.

Re:hardware/software communicating? inconceivable! (1)

someone300 (891284) | more than 7 years ago | (#19385167)

I was thinking it seemed almost like what LLVM is trying to do by optimising at runtime.

Transmeta 2 (1)

mnmn (145599) | more than 7 years ago | (#19385377)

What makes it special is that she's using Transmeta's marketing literature.

I'd recommend she hire Linus and go head-to-head against Intel. (Or try to be bought out by them.)

It's scary to think what would have happened if the cold fusion professors were as pretty as she is.

Re:hardware/software communicating? inconceivable! (1)

nurb432 (527695) | more than 7 years ago | (#19385415)

Sounds more like a BIOS to me.

Re:hardware/software communicating? inconceivable! (1)

MoxFulder (159829) | more than 7 years ago | (#19385771)

Yeah... hardware/software working together. Not exactly new. In fact, I believe it is impossible to build a useful stored-program computer without it :-)

This is the most content-free article I've ever read. It's basically a press release with a female professor thrown in to boot. Yay.

Re:hardware/software communicating? inconceivable! (5, Informative)

kebes (861706) | more than 7 years ago | (#19385793)

The linked article doesn't really explain the work very well. The project homepage [tortolaproject.com] has quite a bit more information. What they are trying to accomplish is indeed a middle layer between applications and hardware (e.g. OS functions or drivers), but the point is to solve a particular optimization problem (speed, low power usage, security) by optimizing software and hardware together.

So, for example, if low power usage is the goal, then instead of fine-tuning the hardware for low power usage, and then also tuning the software for low power usage (e.g. getting rid of unnecessary loops, checks, etc.), the strategy would be to create specific hooks in the hardware system (accessible via the OS and/or a driver, of course) to allow this fine-tuning. Nowadays we have chips that can regulate themselves so that they don't use excessive power. But it could be even more efficient if the software were able to signal the hardware, for instance differentiating between "I'm just waking up to do a quick, boring check--no need to scale the CPU up to full power" versus "I'm about to start a complex task and would like the CPU ramped up to full power."

They claim to have some encouraging results. From one of the abstracts:

We have demonstrated the effectiveness of our approach on the well-known dI/dt problem, where we successfully stabilized the voltage fluctuations of the CPU's power supply by transforming the source code of the executing application using feedback from hardware.
Obviously the idea of having software and hardware interact is not new (that's what computers do, after all)... but the intent of this project is, apparently, to push much more aggressively into a realm where optimizations are realized by using explicit signaling between hardware and software systems. (Rather than leaving the hardware or OS to guess what hardware state is best suited to the task at hand.)
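To make that distinction concrete, here's a toy sketch (all names invented; this is not Tortola's actual interface) of software declaring its intent up front so a mediating layer can pick a power state, rather than the hardware guessing from recent load:

```python
# Hypothetical sketch: software signals its upcoming workload class,
# and a mediating layer maps that intent to a hardware power state.
# All names and states here are invented for illustration.

INTENT_TO_PSTATE = {
    "light-poll": "low",      # quick wake-up check: no need to ramp up
    "background": "medium",   # sustained but non-urgent work
    "burst-compute": "high",  # ramp the CPU up before the work starts
}

class PowerMediator:
    def __init__(self):
        self.pstate = "low"
        self.transitions = 0  # each transition has a real hardware cost

    def declare_intent(self, intent):
        """Software-side hook: signal the class of upcoming work."""
        target = INTENT_TO_PSTATE[intent]
        if target != self.pstate:
            self.pstate = target  # a real layer would poke a register here
            self.transitions += 1
        return self.pstate

m = PowerMediator()
assert m.declare_intent("light-poll") == "low"      # boring check stays low
assert m.declare_intent("burst-compute") == "high"  # ramp up before the work
```

The interesting part is only the signaling shape: the software says what it is about to do, instead of the hardware inferring it after the fact.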

Re:hardware/software communicating? inconceivable! (1)

Metaphorically (841874) | more than 7 years ago | (#19386311)

Thanks for the insight. I really wish the summaries would link to project pages when they're available. I sure don't see a link to the project page from the article, and that's par for the course with news publications anymore. I mean really, right under the spot with the reporter's email address would be a great place for it (besides the obvious spot somewhere within the body of TFA...)

Re:hardware/software communicating? inconceivable! (1)

imgod2u (812837) | more than 7 years ago | (#19387781)

Software-controlled hardware scaling already exists, IIRC. It's existed in a primitive form since Intel first introduced its "SpeedStep" feature on its mobile Pentiums. The OS would control the clock rate of the microprocessor based on CPU utilization.

Embedded systems (PDAs and cell phones) have had finer and more sophisticated grades of software-controlled frequency and voltage scaling, and even software-controlled sleep states.

I'm not sure why this research project is so special. I suppose it's that she's trying to form a standard interface, one that both hardware and software designers can use independently (whereas before, hardware designers would have to double as software designers, writing the low-level control software and handing embedded software designers a library). But with something as customized as microchips, I'm not really sure it's possible to come up with a standard "end-all" interface that will fit all of the features a hardware designer could use.

Then there's the problem of where this software would run. The problem with software-controlled scaling isn't that software doesn't know the workload better (as it does) but rather, how fast can it communicate that information to the hardware? There's an overhead in that communication.

That's not to say software and hardware aren't merging, but it's not going to be in the realm of new software layers, IMO. More likely than not, it'll be a move towards more firmware-based microprocessors with hard-wired core designs and configurable, tile-like co-processors that can be configured through non-volatile memory on startup (and stored in configuration SRAM), much like an FPGA.
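For what it's worth, the communication overhead can be amortized with hysteresis, so the software only re-signals the hardware when utilization crosses a threshold. A toy governor along those lines (invented thresholds and frequency steps; this is not SpeedStep's real algorithm):

```python
# Toy hysteresis governor: only request a hardware frequency change
# when utilization leaves a dead band, so most samples cost nothing.
# Thresholds and frequency steps are made up for illustration.

UP_THRESHOLD = 0.80    # raise frequency above this utilization
DOWN_THRESHOLD = 0.30  # lower it below this
FREQ_LEVELS = [600, 1200, 1800, 2400]  # MHz, invented steps

def next_level(level, utilization):
    """Return the new frequency index; unchanged inside the dead band."""
    if utilization > UP_THRESHOLD and level < len(FREQ_LEVELS) - 1:
        return level + 1  # this is the only case that touches hardware
    if utilization < DOWN_THRESHOLD and level > 0:
        return level - 1  # likewise: one transition, then quiet again
    return level  # within the hysteresis band: no communication at all

level = 0
for util in [0.9, 0.9, 0.5, 0.2]:
    level = next_level(level, util)
assert FREQ_LEVELS[level] == 1200  # up twice, hold steady, then down once
```

The dead band between 0.30 and 0.80 is exactly the overhead mitigation: samples landing there never generate a hardware transition.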

Wow, the article REALLY sucked (1)

SanityInAnarchy (655584) | more than 7 years ago | (#19389979)

I'm still not amazingly impressed with what you told me... just kind of a "meh". But what bothers me is just how amazingly bad the article is:

a middle layer between hardware and software that can translate and communicate between software and hardware, allowing for cooperative problem solving.

Wow, sounds just like firmware. That or drivers, depending on your definition.

This middle layer would allow software to adapt to the hardware it's running on, something engineers have not been able to do in the past,

Ok, now it sounds like a compiler (and architecture-specific compiler flags), and/or drivers.

Hazelwood cites a famous Intel mishap where microprocessors were distributed before a flaw in their fine mathematics function was detected, resulting in a massive recall. A system like Tortola could prevent such expensive glitches in the future.

What's this? A better way of debugging hardware? Maybe even reprogrammable hardware (like having a chip fab in your box)?

Nope, not even close:

We could use the software to hide flaws in the hardware, which would allow designers to release products sooner because problems could be fixed later,

Ok, this would be roughly like simply telling Linux to emulate a math coprocessor, or to emulate the math coprocessor only for the instructions that aren't accurate, and only on a Pentium. I'm not sure, but I think they do this already.
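A rough sketch of that kind of fallback, purely illustrative (the "hardware" and the bad-operand predicate are both simulated here, not anything from a real errata list):

```python
# Toy sketch of software hiding a hardware flaw: if an operand matches
# a known-bad class for the (simulated) fast divide unit, route around
# it with a slower correct path. The predicate is invented.

def buggy_fast_divide(a, b):
    # Stand-in for flawed hardware: wrong answer for one operand class.
    if b == 3.0:
        return a / b * 0.999  # deliberately slightly off
    return a / b

def is_at_risk(b):
    # Invented detector for the known-bad operand class.
    return b == 3.0

def safe_divide(a, b):
    """Dispatch: take the slow correct path only for flagged operands."""
    if is_at_risk(b):
        return a / b  # stand-in for a correct alternate instruction sequence
    return buggy_fast_divide(a, b)

assert safe_divide(9.0, 3.0) == 3.0   # flagged operand, correct path taken
assert safe_divide(10.0, 2.0) == 5.0  # unflagged operand, fast path
```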

Except, this strikes me as a truly irresponsible way of handling this, bordering on fraud. You paid for a chunk of hardware, and due to a defect, you don't have certain functionality. You can emulate that in software for a performance hit, but really, you were robbed, and you should be able to get a refund/replacement.

Let me put it another way: Suppose someone shipped a video card which didn't work. At all. And when someone pointed this out to them, they shrug and release "updated drivers" which simply run a software renderer. Would you call that progress, or would you call it fraud?

So, turns out that she actually did something cool, but with an article like that, I might never have known. I'd say that an article like that actually hurts more than it helps -- if they had thrown a few more technical terms in there, it might've confused me, but I'd at least be paying attention and looking for more.

Re:hardware/software communicating? inconceivable! (1)

A Numinous Cohort (872515) | more than 7 years ago | (#19390227)

From TFA: Hazelwood already has collaborative ties with researchers at Intel and IBM that place her in an ideal position to eventually commercialize the technology her lab develops.

IBM's iSeries computers use a lot of microcode to mediate between OS and hardware, promoting both software and hardware independence. Sounds like the current project is in the same vein.

Re:hardware/software communicating? inconceivable! (0)

Anonymous Coward | more than 7 years ago | (#19386491)

What makes it special is it's driving hits to Roland Piquepaille's blog.

HAL more than OS (1)

EmbeddedJanitor (597831) | more than 7 years ago | (#19390053)

These days the OS components are getting less and less tied to specific hardware. Even "drivers" are tending towards generic code that is adapted with hardware adaptation layers (HALs).

What is more interesting, however, is that hardware capability is getting richer. Gate arrays etc. allow you to build far more intelligent hardware requiring less software control from the main CPU. That makes for more efficient processing.

How about some details? (2, Interesting)

AKAImBatman (238306) | more than 7 years ago | (#19385037)

TFA is amazingly short on details. All it says is:

...a middle layer between hardware and software that can translate and communicate between software and hardware, allowing for cooperative problem solving. "This middle layer would allow software to adapt to the hardware it's running on, something engineers have not been able to do in the past," she says.


That doesn't really say much. In fact, without further details it sounds like dynamic tuning in virtual machines. Which can't be the case here, as that would be reinventing what has already been inventing. (Seriously, her professor wouldn't approve a project like that, would he?)

Anyone have any more details?

Re:How about some details? [Errata] (1)

AKAImBatman (238306) | more than 7 years ago | (#19385085)

"what has already been invented"

Also, her mentor (chair of the Department of Computer Science) is female, so I should have said "her professor wouldn't approve a project like that, would she".

HOT! (1, Interesting)

Anonymous Coward | more than 7 years ago | (#19385043)

Damn! That professor is Hot!!!! And she teaches Compilers!!!!

http://www.cs.virginia.edu/kim/ [virginia.edu]

Re:HOT! (3, Funny)

bigtangringo (800328) | more than 7 years ago | (#19385141)

Holy hot faculty, Batman!

Funny thing is she'll probably read this /. thread.

Re:HOT! (0)

Anonymous Coward | more than 7 years ago | (#19388067)

Yep. And I've blacklisted you from ever even attempting to take my class, buster.

Re:HOT! (1)

Drew McKinney (1075313) | more than 7 years ago | (#19386559)

Wow, this must be some kind of record for the hottest CS female ever. CS women often (unfortunately) fall on the "troll" side of the hotness scale.

... that being said, the most presentable male example we have from the CS community is Grandmaster Ratte' [wikipedia.org]

Odd... (1)

SanityInAnarchy (655584) | more than 7 years ago | (#19390051)

I find that CS women look just like women everywhere else on campus. The only significant difference is that there are so few of them, but both of the girls in my freshman CS course were hot enough.

Really, the stereotype of "geek" for either sex is entirely obsolete now.

Heh (4, Funny)

NeoTerra (986979) | more than 7 years ago | (#19385049)

"We could use the software to hide flaws in the hardware, which would allow designers to release products sooner because problems could be fixed later," explains Hazelwood.
Looks like we have a winner!

Wow.... (1)

Ynot_82 (1023749) | more than 7 years ago | (#19385059)

They invented firmware

sounds like efi / uefi (1)

Joe The Dragon (967727) | more than 7 years ago | (#19385075)

and it is just a smarter bios

Lukewarm (0)

Anonymous Coward | more than 7 years ago | (#19385115)

Doesn't that feel nice? A female approach to tackling computer design challenges; rather than the square, stodgy old patriarchal logical and hierarchical thinking we grew up with.

hmm (1, Insightful)

Anonymous Coward | more than 7 years ago | (#19385121)

I remember my teacher talking about something like this three years ago: how software could shut down parts of the CPU to save power, and even change the way the CPU worked on the fly.

"We could use the software to hide flaws in the hardware, which would allow designers to release products sooner because problems could be fixed later." Translation: hardware companies can produce shit, and if someone happens to notice a flaw, we can create a patch instead of testing our products first. Will this not also open the hardware up to viruses?

Re:hmm (1)

misleb (129952) | more than 7 years ago | (#19385819)

Wasn't this already dealt with in cases like the Pentium FDIV bug? I remember the Linux kernel detecting and "patching" known problems with hardware. The same happened with certain accelerated IDE controllers, IIRC.

But you're right, it sounds like a license for hardware manufacturers to be more careless and expect the software people to pick up the slack. As if software didn't have enough bugs... soon we won't even be able to trust that the hardware is reliable? WTF? In what world is this a good thing?

-matthew

Doesn't sound like such a good thing to me (3, Insightful)

LighterShadeOfBlack (1011407) | more than 7 years ago | (#19385159)

Hazelwood cites a famous Intel mishap where microprocessors were distributed before a flaw in their fine mathematics function was detected, resulting in a massive recall. A system like Tortola could prevent such expensive glitches in the future. "We could use the software to hide flaws in the hardware, which would allow designers to release products sooner because problems could be fixed later," explains Hazelwood.
Oh great. So not only do the public get to be unwitting beta testers for software but we'll soon be able to do it for hardware too.

I can't wait to pay £400 for a Beta CPU and then get to endure 6 months of crashing until it gets patched.

Re:Doesn't sound like such a good thing to me (0)

Anonymous Coward | more than 7 years ago | (#19385201)

Don't they do that already, patching around errata?

Re:Doesn't sound like such a good thing to me (1)

kebes (861706) | more than 7 years ago | (#19385939)

I can't wait to pay £400 for a Beta CPU and then get to endure 6 months of crashing until it gets patched.
That's a fair worry. But, on the other hand, chips already ship with plenty of bugs. There are thousands of documented bugs in every chip you've ever used. The expense of redesigning is too high, so they will never fix those bugs. Instead they usually just publish the list of known bugs, and tell the compiler writers: "don't ever use that particular instruction--it doesn't work" or "avoid this sequence of assembler commands--it will cause the chip to lockup."

So we are already in a situation where we buy buggy hardware, and we have to deal with it. In particular, the compiler writers deal with it, insulating the rest of us from the bugs. If it were possible to "code around" the hardware bugs at higher levels, this could be an advantage. So I guess the idea would be that instead of (or in addition to) having the compiler writers design around the bugs, you could have high-level libraries be hardware-specific, designed to work around the bugs or to clearly detect when the buggy states have arisen. The idea is to have explicit signaling between hardware and software, so in theory the library could actively check if it had triggered a bug-state, and do the computation elsehow (whereas as far as I know, compilers are written to completely avoid a possible bug-state).

Again, I share your worry that this will simply be an "excuse" for the hardware designers to get lazy and release inferior products. But when modern chips have 300 million transistors, you can be sure that bugs will creep in--and I would rather they have built-in strategies for mitigating those bugs.
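In code, that "detect the bug-state and compute elsehow" idea might look like this toy sketch (the flaky unit is simulated; a real detector would use whatever signal the chip actually exposes):

```python
# Sketch of check-then-recover: run the fast path, verify a cheap
# invariant, and only fall back to a safe route when the check fails.
# The flaky "hardware" is simulated for illustration.

from fractions import Fraction

def flaky_divide(a, b):
    # Simulated flaky unit: bad quotient for one operand pair.
    if (a, b) == (1, 7):
        return 0.15  # wrong on purpose
    return a / b

def checked_divide(a, b, tol=1e-9):
    q = flaky_divide(a, b)
    if abs(q * b - a) > tol * abs(a):        # cheap residual check
        q = float(Fraction(a) / Fraction(b)) # recompute via a safe route
    return q

assert checked_divide(1, 4) == 0.25               # fast path, check passes
assert abs(checked_divide(1, 7) - 1/7) < 1e-12    # bug caught, recomputed
```

The residual check is the key cost trade-off: it is much cheaper than always taking the safe path, and it only triggers the expensive recovery when the buggy state has actually been hit.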

Re:Doesn't sound like such a good thing to me (1)

ray-auch (454705) | more than 7 years ago | (#19390015)

I can't wait to pay £400 for a Beta CPU and then get to endure 6 months of crashing until it gets patched

I can't wait to do months of simulation work on it, and then find I have to redo it because my results were invalid due to a hardware bug.

Oh, wait, been there done that. Years ago. FDIV.

Nothing new here, move along...

FPGA (1)

jshriverWVU (810740) | more than 7 years ago | (#19385165)

This sounds a lot like a virtual machine on top of an FPGA board. It would be neat to store a VM or OS on a separate layer, and allow the OS to reflash the FPGA to optimize the hardware for a specific task.

Re:FPGA - or not... (0)

Anonymous Coward | more than 7 years ago | (#19388815)

FPGAs are too power-inefficient. If that's your goal you'd better keep the FPGA off for anything that isn't massively data-parallel or you've just lost any advantage you might have had... (People who claim FPGAs are power-efficient are comparing them to doing the same data-parallel computation at n-times the speed on a serial processor.)

Guys... (1)

rafael_es_son (669255) | more than 7 years ago | (#19385175)

Should I ask you to notice she's an intelligent, pretty woman who's into CS? Nah.

Re:Guys... (1)

tjstork (137384) | more than 7 years ago | (#19385465)

Pretty? Look at those teeth. I wonder if her car is up on blocks.

Re:Guys... (1)

rafael_es_son (669255) | more than 7 years ago | (#19385635)

Where are those teeth you speak of? Show me.

Re:Guys... (1)

clever_moniker (1108253) | more than 7 years ago | (#19386229)

Her teeth aren't showing in the UVA Today link. From what I see, she is surprisingly hot for a woman in CS.

I feel shallower than usual for saying that, but I maintain that she is hot.

I For One... (0, Offtopic)

ReidMaynard (161608) | more than 7 years ago | (#19385205)

... welcome our new Kim Hazelwood overlords.

URLs (3, Informative)

mattr (78516) | more than 7 years ago | (#19385237)

Her homepage [virginia.edu] ,
tortola [tortolaproject.com] and
possibly unrelated paper [acm.org]

She's HOT! (0)

Anonymous Coward | more than 7 years ago | (#19385539)

God damn she's hot! (for a geek chick)

BS again.... (4, Insightful)

gweihir (88907) | more than 7 years ago | (#19385241)

Sorry, but hardware and software do not solve problems together. That is straight from the "computers are magic" faction. Hardware solves problems under software control. Hardware alone can do nothing, and software alone cannot run.

Using interface layers to get more portable and easier-to-use interfaces is an old and well-established technique.

These people are looking for money, right? Why does /. provide free advertising to them?

Re:BS again.... (1)

scatterbrained (144748) | more than 7 years ago | (#19386213)

Obviously written by a software guy...

Hardware alone can do lots of things (albeit hard to change), but software alone can do nothing :-)

Seriously, hardware/software partitioning is key in product design, and it affects everything. I'm curious as to what they are proposing, and how it will affect product cost and development schedules. TFA is completely uninformative.

Re:BS again.... (1)

gweihir (88907) | more than 7 years ago | (#19389973)

Hardware alone can do lots of things (albeit hard to change), but software alone can do nothing :-)

Hehe, right. I have some electronics experience (in fact a lot) and you are right, of course. But computer hardware is entirely useless without software...

Re:BS again.... (0)

Anonymous Coward | more than 7 years ago | (#19387409)

These people are looking for money, right? Why does /. provide free advertising to them?
Well, since they are academics, their funding comes from granting agencies, which probably do not read Slashdot. So, basically, this isn't "advertising" for them in any financial sense.

Re:BS again.... (1)

2short (466733) | more than 7 years ago | (#19387773)

"These people are looking for money, right?"

No, it appears they're doing something cool but fairly technical. A description of it so simplified that it misses the point has been written, and this has then been summarized to remove any shred of meaningful information, leaving "hardware and software solve problems together".

This is pretty par for the course when you apply a couple dumbing-down passes to the description of something that is fundamentally only interesting for its non-dumbed-down technical details.

Re:BS again.... (1)

gweihir (88907) | more than 7 years ago | (#19389987)

No, it appears they're doing something cool but fairly technical. A description of it so simplified that it misses the point has been written, and this has then been summarized to remove any shred of meaningful information, leaving "hardware and software solve problems together".

This is pretty par for the course when you apply a couple dumbing-down passes to the description of something that is fundamentally only interesting for its non-dumbed-down technical details.


Hmm. Could be right. Makes the article a complete waste of time though.

WTF, again?! (4, Interesting)

vivaoporto (1064484) | more than 7 years ago | (#19385253)

Although this is not a dupe, it practically is. Check this other story, New Way to Patch Defective Hardware [slashdot.org] , less than two months old. Basically, both approaches suck in the same way, they allow hardware manufacturers to be sloppy in order to rush the product out as fast as possible while allowing them to try to correct the errors that will appear later in the process. In short, they reinvented the FPGA.

Two non-stories. But makes one think, cui bono? Who is benefiting from these articles? Roland for sure, being such a click whore. But other than him, who else? Weird, very weird indeed.

Re:WTF, again?! (1)

aldheorte (162967) | more than 7 years ago | (#19390031)

Sorry, semantic digression: will people stop it with the cui bono!? Just say it in English, which you have to do anyway, since some people don't know what cui bono means. For someone who reads both Latin and English, it reads like this: "But makes one think, who benefits? Who is benefiting from these articles?"

Transmeta Crusoe (2, Interesting)

dduardo (592868) | more than 7 years ago | (#19385299)

How is this any different than what Transmeta has already done?

Re:Transmeta Crusoe (1)

bprice20 (709357) | more than 7 years ago | (#19385343)

I thought the same thing.

Re:Transmeta Crusoe (0)

Anonymous Coward | more than 7 years ago | (#19385579)

I had a sense of deja vu in reading the article. I also immediately thought of Transmeta. From the little info that is available it seems ripe for a lawsuit.

Re:Transmeta Crusoe (1)

SanityInAnarchy (655584) | more than 7 years ago | (#19390103)

My understanding is, Transmeta actually ran an entirely different chip under the hood, and emulated x86 on the fly (with help from hardware).

This is entirely different -- it's about having the software be able to more tightly communicate with the hardware. To paraphrase someone else's post: It's so the hardware can know the difference between "I'm just waking up to poll something, keep everything low-power" and "OMG ramp it up to full lap-burning power NOW!!!"

Links and a comment (5, Informative)

martyb (196687) | more than 7 years ago | (#19385325)

Some Links:

And a comment:

I'm not entirely thrilled with this idea of dynamically communicating between hardware and software. From what I got from TFA, the hardware would change dynamically based on feedback from the software. It seems to me that we already have plenty of trouble writing programs that work correctly when the hardware does not change... imagine trying to debug a program when the computer hardware is adapting to the changes in your code. (IOW: heisenbugs [wikipedia.org] .)

Also, I've got some unease when I think about what malware authors could come up with using this technology. Sure, we'll come up with something else to counteract that... but I think it'll bring up another order of magnitude's worth of challenge in this cat-and-mouse game we already have.

Links and a loss of control. (0)

Anonymous Coward | more than 7 years ago | (#19385671)

"I'm not entirely thrilled with this idea of dynamically communicating between hardware and software. From what I got from TFA, the hardware would change dynamically based on feedback from the software. It seems to me that we already have plenty of trouble writing programs that work correctly when the hardware does not change... imagine trying to debug a program when the computer hardware is adapting to the changes in your code. (IOW: heisenbugs [wikipedia.org].)"

I suspect that you're going to have to get used to the loss of control if you ever expect computers to evolve.

Re:Links and a comment (1)

pavon (30274) | more than 7 years ago | (#19386133)

From what I got from TFA, the hardware would change dynamically based on feedback from the software.
Based on their abstract in the link you post, it is the other way around - the hardware provides more information about the state of the processor than it normally would, and then the software uses this information to perform run-time optimizations taking these factors into account. Considering that we are already employing run-time optimization in languages such as Java and C#, providing more information to assist in these optimizations, and to allow them to optimize for things that they couldn't in the past (like power efficiency) doesn't sound like a bad idea to me. While any optimization can lead to heisenbugs (especially in the case of race conditions), I don't think that these optimizations would be any worse than what we already have to deal with.

And yeah the press release is horrible, as they always are.

First Post! (-1, Offtopic)

Anonymous Coward | more than 7 years ago | (#19385329)

Oh wait, Slowpoke is SLOOOOOOOOOOW

desu desu desu desu desu desu desu~~!! ^_^

sup /b/?

CCS (4, Insightful)

diablovision (83618) | more than 7 years ago | (#19385333)

This article is an example of CCS: "Cute Chick Science". The article has about as much fluff as a popcorn kernel. I am not exactly sure to what they are referring here--FPGAs? There seem to be a number of statements that are overly categorical and seemingly not well informed such as "This middle layer would allow software to adapt to the hardware it's running on, something engineers have not been able to do in the past," and "to engineer software that can communicate between the two layers, [hardware and software]".

If there wasn't a pic of a cute professor involved, would anyone care?

Re:CCS (1)

dfenstrate (202098) | more than 7 years ago | (#19386641)

I'm an ME with very limited programming experience who nonetheless hangs around slashdot, and even I've read enough to think this is fluff crap.

If there wasn't a pic of a cute professor involved, would anyone care?


Pretty sure you nailed it right there. On the one hand, this kind of crap isn't going to help her be taken seriously at all. On the other hand, she is very, very cute.

It's entirely possible that Prof Hottie ^H^H^H Hazelwood discovered some new arcana in the field and the reporter can't even come close to understanding 3/4 of it, so she just went with the easy stuff. More likely, Hazelwood rediscovered the FPGA.

Think she'll show up to defend herself?

Re:CCS (1)

Wesley Felter (138342) | more than 7 years ago | (#19387871)

It's entirely possible that Prof. Hazelwood discovered some new arcana in the field and the reporter can't even come close to understanding 3/4 of it, so she just went with the easy stuff.

I think this is the correct explanation. I actually understand what Tortola is, and it's not bogus nor is it a reinvention of previous work. Unfortunately, the Web site isn't very detailed; the one example given (di/dt) is a pretty obscure problem to solve. As it stands now, there is no way to explain Tortola to a regular person so that they would care.

Re:CCS (1)

ebichete (223210) | more than 7 years ago | (#19389373)

Pretty sure you nailed it right there. On the one hand, this kind of crap isn't going to help her be taken seriously at all. On the other hand, she is very, very cute.

It's entirely possible that Prof Hottie ^H^H^H Hazelwood discovered some new arcana in the field and the reporter can't even come close to understanding 3/4 of it, so she just went with the easy stuff. More likely, Hazelwood rediscovered the FPGA.


Lord, preserve us from slashidiots...

Cute Chick factor can only get you so far. This is not CS Undergrad Hazelwood we are talking about, it is Professor Hazelwood. You don't get that far without some serious chops, so I would say that it is bloody unlikely she is rediscovering the FPGA.

The reporter and/or the editor had no idea what they were writing about and it shows. The article makes no distinction between microcode and a CPU's instruction set, lumping them together as software. Which is kind of dumb because large parts of Tortola are about taking advantage of the separation of the two.

Re:CCS (1)

Raenex (947668) | more than 7 years ago | (#19389487)

This is not CS Undergrad Hazelwood we are talking about, it is Professor Hazelwood.
Assistant Professor.

Lameness Alert: +4, Elevated (0)

Anonymous Coward | more than 7 years ago | (#19385353)


Also known as reconfigurable computing [wikipedia.org] .

Link to more Tortola information (1)

linguae (763922) | more than 7 years ago | (#19385373)

Tortola Project [tortolaproject.com]

Seems like an interesting research project. The research seems new (I see no published papers on Tortola, although I do see some slides and an extended abstract), so it will be interesting to see how it develops. I am very interested in seeing how an operating system would interact with Tortola.

Umm... (1)

Wicko (977078) | more than 7 years ago | (#19385463)

"We could use the software to hide flaws in the hardware, which would allow designers to release products sooner because problems could be fixed later," explains Hazelwood.

How about lets not encourage companies to rush out unfinished products any more than they already do?

This doesn't make any sense to me. (0)

Anonymous Coward | more than 7 years ago | (#19385489)

Isn't this a bit like trying to make water communicate with pipes?

"A Hardware-Software Symbiosis" ! (1)

suv4x4 (956391) | more than 7 years ago | (#19385869)

Finally, the two arch-nemeses, software and hardware, will live together in symbiosis. Never before have you seen software and hardware working together.

Now this article demonstrates that what was unthinkable before may tomorrow be a commodity, and we will finally be able to run software on our hardware.

yuo 7ail iT (-1, Troll)

Anonymous Coward | more than 7 years ago | (#19385945)

obsessiv3s and The worse and worse. As

What is it really? (0)

Anonymous Coward | more than 7 years ago | (#19385997)

Transmeta-style code morphing? Firmware? OS? VM? Microcode?

Roland the Plogger, overdramatizing again (4, Informative)

Animats (122034) | more than 7 years ago | (#19386095)

First off, it's a Roland the Plogger story, so you know it's clueless. Roland the Plogger is just regurgitating a press release.

Here's an actual paper [virginia.edu] about the thing. Even that's kind of vague. The general idea, though, seems to be to insert a layer of code-patching middleware between the application and the hardware. The middleware has access to CPU instrumentation info about cache misses, power management activity, and CPU temperature. When it detects that the program is doing things that are causing problems at the CPU level, it tries to tweak the code to make it not do so much bad stuff. See Power Virus [wikipedia.org] in Wikipedia for an explanation of "bad stuff". The paper reports results on a simulated CPU with a simulated test program, not real programs on real hardware.

Some CPUs now power down sections of the CPU, like the floating point unit, when they haven't been used for a while. A program which uses the FPU periodically, but with intervals longer than the power-off timer, is apparently troublesome, because the thing keeps cycling on and off, causing voltage regulation problems. This technique patches the code to make that stop happening. That's what they've actually done so far.

Intel's interest seems to be because this was a problem with some Centrino parts. [intel.com] So this is something of a specialized fix. It's a software workaround for some problems with power management.

It's probably too much software machinery for that problem. On-the-fly patching of code is an iffy proposition. Some code doesn't work well when patched - game code being checked for cheats, DRM code, code being used by multiple CPUs, code being debugged, and Microsoft Vista with its "tilt bits". Making everything compatible with an on-the-fly patcher would take some work. A profiling tool to detect program sections that have this problem might be more useful.

It's a reasonable piece of work on an annoying problem in chip design. The real technical paper is titled "Eliminating voltage emergencies via microarchitectural voltage control feedback and dynamic optimization." (International Symposium on Low-Power Electronics and Design, August 2004). If you're really into this, see this paper on detecting the problem during chip design [computer.org] , from the Indian Institute of Technology Madras. Intel also funded that work.

On the thermal front, back in 2000, at the Intel Developer Forum the keynote speaker after Intel's CEO spoke [intel.com] , discussing whether CPUs should be designed for the thermal worst case or for something between the worst case and the average case: "Now, when you design a system, what you typically want to do is make sure the thermal of the system are okay, so even at the most power-hungry application, you will contain -- so the heat of the system will be okay. So this is called thermal design power, the maximum, which is all the way to your right. A lot of people, most people design to that because something like a power virus will cause the system to operate at very, very maximum power. It doesn't do any work, but that's -- you know, occasionally, you could run into that. The other one is, probably a little more reasonable, is you don't have the power virus, but what the most -- the most power consuming application would run, and that's what you put the TDP typical."

From that talk, you can kind of see how Intel got into this hole. They knew it was a problem, though, so they put in temperature detection to slow down the CPU when it gets too hot. This prevents damage, but there's a performance penalty on some applications in hot weather. Some gamers complain about this.

Re:Roland the Plogger, overdramatizing again (4, Informative)

ukillaSS (1111389) | more than 7 years ago | (#19388611)

There has been a TON of work on creating uniform layers of OS, middleware, that are accessible to both SW and HW.

* EPFL - Miljan Vuletic's PhD Thesis
* University of Paderborn's ReconOS
* University of Kansas's HybridThreads
* etc. etc.

This work is becoming very influential in areas of HW/SW co-design, computer architecture, embedded & real-time systems due to its importance to both research-oriented and commercial computing.

Additionally, this is now becoming a major thrust for many chip-makers that have now realized that serial programs running on superscalar machines really aren't getting any faster. Multicore systems are now available, and are still showing no significant speedups due to a lack of proper parallel programming models. In the past, developing HW was considered "hard/difficult" and developing SW was "easy". Additionally, this usually was due to the fact that HW design involved parallel processes, synchronization, communication, etc. while SW involved a serial list of execution steps. Now that we have multiple cores, SW developers are realizing that not only are most programmers horrible at writing code that interacts with hardware (an object of concurrency in most systems), but they are even worse at writing code that interacts with concurrent pieces of SW. The HW/SW boundary is only a small glimpse of how badly parallelism is managed in today's systems - we need to focus on how to describe massive amounts of coarse-grained parallelism in such a way that one can reason about parallel systems.

Yes but, (0, Troll)

sirindex (1111327) | more than 7 years ago | (#19386245)

Does it run Linux?

Huh? (0)

Anonymous Coward | more than 7 years ago | (#19386439)

WTF is a "virtual interface"? Seems like an absolutely superfluous word adornment.

"Security." (1)

Zero_DgZ (1047348) | more than 7 years ago | (#19386619)

This smells like an attempt to open the back door to Treacherous Computing, wrapped up in buzzwords that'll make otherwise wary geeks accept it, thinking it'll make their computers somehow go faster.

Enough already (1)

AcidPenguin9873 (911493) | more than 7 years ago | (#19387077)

Of course she's hot, but she has a PhD from Harvard, and she's published a lot in major conferences. I'm pretty sure you can't get a PhD in CS, or papers accepted at major conferences (which are double-blind reviewed, btw) on looks alone.

As for this work, the article summary and the article itself are severely lacking in details. Go to the project page. And yes, people have been doing dynamic translation/optimization for years (Transmeta; Dynamo from HP, which she worked on, actually; rePLAY from UIUC), but it has always been with the intent to improve performance. This work is looking at a much wider scope of problems, and using the binary translation mechanism as a part of the solution. It looks to be in somewhat early stages - not sure specifically what problems they are looking to tackle other than the "dI/dt" problem that's posted on the project page. I'm interested to see what other papers come out of this group though.

Finally a reason to RTFA (0)

Anonymous Coward | more than 7 years ago | (#19387387)

...she's hot.

Holy crap! (1)

glwtta (532858) | more than 7 years ago | (#19387629)

"hardware and software to communicate and to solve problems together"

This is freaking slashdot - could we get something a little more technical in the summaries?

Missing tag: boycottroland (1)

loimprevisto (910035) | more than 7 years ago | (#19387987)

Aargh! I accidentally clicked on the link without noticing the submitter- most of Roland's trash gets tagged very quickly, why the delay with this one?

Device to let s/ware communicate with h/ware? cool (1)

Lalo Martins (2050) | more than 7 years ago | (#19388947)

A device to let software communicate with hardware? Cool! Why don't we call it a "computer"?

Vocabulary words for today (0)

Anonymous Coward | more than 7 years ago | (#19389693)

The story presents an example of synergy [wikipedia.org] rather than symbiosis [wikipedia.org] . Just sayin'.