
Nvidia Wins $20M In DARPA Money To Work On Hyper-Efficient Chips

timothy posted about 2 years ago | from the just-tickle-the-electrons dept.


coondoggie writes "Nvidia said this week it got a contract worth up to $20 million from the Defense Advanced Research Projects Agency to develop chips for sensor systems that could boost power efficiency from today's 1 GFLOPS/watt to 75 GFLOPS/watt."


Both AMD and Nvidia are already hyper-efficient (1)

Noughmad (1044096) | about 2 years ago | (#42300015)

If used as space heaters, that is.

Re:Both AMD and Nvidia are already hyper-efficient (0)

Anonymous Coward | about 2 years ago | (#42300405)

I turn my space heater off while playing Civilization V.

i wonder.... (0)

Anonymous Coward | about 2 years ago | (#42300029)

what the government would do with that kind of processing power...

Re:i wonder.... (0)

Anonymous Coward | about 2 years ago | (#42303137)

Mine bitcoins to pay back the deficit.

Re:i wonder.... (1)

viperidaenz (2515578) | about a year and a half ago | (#42310437)

There aren't (and never will be) enough bitcoins.

LIKE TELLING A DEMOCRAT TO CUT SPENDING !! (0)

Anonymous Coward | about 2 years ago | (#42300071)

Or a republican to declare all income !!

These things Just ain't gonna happen !!

Result will work great (3, Funny)

kthreadd (1558445) | about 2 years ago | (#42300083)

But you will need their proprietary driver.

Re:Result will work great (1)

lord_rob the only on (859100) | about 2 years ago | (#42300401)

If you give me $20 million at once I can give you the source code of my drivers, you know ;-). Under an NDA, of course.

Re:Result will work great (0)

flimflammer (956759) | about 2 years ago | (#42300411)

Fine with me.

Here we go again /. (0)

Anonymous Coward | about 2 years ago | (#42300109)

"Sensor" dropped from headline. Mistake or not?

Isn't this economic without DARPA funding? (5, Insightful)

iceco2 (703132) | about 2 years ago | (#42300219)

It seems to me a 75x increase in power efficiency should be worth much more than $20M to nVidia (or any competitor), so why does DARPA need to fund this? This seems exactly like the kind of work which doesn't need DARPA money. DARAPA should spend money where it is not clearly economic for others to do so.

Re:Isn't this economic without DARPA funding? (2, Interesting)

Anonymous Coward | about 2 years ago | (#42300241)

Maybe DARPA also wants the thing to withstand radiation. Maybe they want it to have so little computational power that it would not sell in the market. Maybe they want every component to have been made in the US which the market won't care about. Maybe they have other restrictions. You don't know.

Re:Isn't this economic without DARPA funding? (1)

rmstar (114746) | about 2 years ago | (#42300473)

Maybe they want it to have so little computational power that it would not sell in the market.

That's an interesting point. How much power would a 300 MHz Pentium from 1998 consume if built with modern technology? That thing could do quite a bit of computing, but as you say, it wouldn't survive in today's PC market. For embedded applications, though, it would be great, not least because you get the comfort of a full computer.

Re:Isn't this economic without DARPA funding? (1)

WhatAreYouDoingHere (2458602) | about 2 years ago | (#42301943)

Let's just hope it's not a 386 [slashdot.org] ... :)

Re:Isn't this economic without DARPA funding? (1)

DavidClarkeHR (2769805) | about 2 years ago | (#42300767)

Maybe they want every component to have been made in the US which the market won't care about.

... I don't think you understand the chip fab process (that's what we're talking about here). $20 million is a lot of money in any other business, though.

Re:Isn't this economic without DARPA funding? (0)

Anonymous Coward | about 2 years ago | (#42300253)

DARPA funds research, i.e. something that is not straightforward development: work that may not pan out and that requires new technology or insight to produce results.

A 75x increase in efficiency seems to qualify to me. Successful research is always profitable for the entity performing said research.

Re:Isn't this economic without DARPA funding? (0)

Anonymous Coward | about 2 years ago | (#42300549)

Maybe it is very risky, i.e. probably not achieving anything like a 75x increase.

Re:Isn't this economic without DARPA funding? (1)

RicktheBrick (588466) | about 2 years ago | (#42302075)

A 75x increase is absolutely needed if they are going to continue to make faster supercomputers. IBM is trying to build a 1,000-petaflops supercomputer before the end of this decade. At a gigaflop per watt, that supercomputer would require a gigawatt of power, so now you're talking about a nuclear power plant just to run the computer. At a dollar per watt, you're also talking about spending a billion dollars on the power plant, which is a lot more money than they plan on spending to build the supercomputer itself. The United States is not the only one trying; so are China, Russia, India, the European Union, and Japan. Someone will probably invent a way to reduce power needs, and it would be nice if it were invented in the United States.
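
The parent's arithmetic is easy to check with a minimal sketch (Python; the 1,000-petaflops target and the efficiency figures are just the round numbers from the comments above, not vendor specs):

    # Power draw of a hypothetical 1,000-petaflops (1e18 FLOP/s) machine
    # at different efficiencies. Numbers are illustrative.
    TARGET_FLOPS = 1_000e15  # 1,000 petaflops

    def power_watts(flops, gflops_per_watt):
        # 1 GFLOPS/W = 1e9 FLOP/J, so power = work rate / efficiency
        return flops / (gflops_per_watt * 1e9)

    for eff in (1, 75):
        print(f"{eff:>2} GFLOPS/W -> {power_watts(TARGET_FLOPS, eff) / 1e6:,.0f} MW")
    # 1 GFLOPS/W -> 1,000 MW (a full-size power plant)
    # 75 GFLOPS/W -> 13 MW (big, but ordinary data-center territory)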

Who needs the free market? (1)

DavidClarkeHR (2769805) | about 2 years ago | (#42300733)

It seems to me a 75x increase in power efficiency should be worth much more than $20M to nVidia (or any competitor), so why does DARPA need to fund this? This seems exactly like the kind of work which doesn't need DARPA money. DARAPA should spend money where it is not clearly economic for others to do so.

You were under the assumption that we live in a purely capitalistic society? My mistake. Even countries that embrace extreme capitalism shift back to subsidies and support when it comes to certain things.

Though, usually those things involve essential services like firefighting, road maintenance, and policing ...

Re:Isn't this economic without DARPA funding? (2)

drinkypoo (153816) | about 2 years ago | (#42300951)

DARAPA should spend money where it is not clearly economic for others to do so.

Well, that's good advice, and I'm sure they'll take it over at DARAPA. On the other hand, DARPA has certain goals, whatever they might be, and if this is the most economical way to achieve them, then it's money well-spent from their perspective. If you're going to have a thing like DARPA, then you need to permit it to do things like this if you want it to be efficient. On the other hand, if you're going to do things like this, you need substantial oversight in place to prevent abuse. And on the gripping hand, military might is an extremely inefficient way to allocate R&D resources, and a better way to cause technology to move forward would be to put away childish things, and give the people the say in what happens; in a capitalist system like ours, we need to find them jobs so they can afford to buy products so they can vote with their pocketbooks. Of course, then we get into more checks and balances; sane advertising law is to capitalism as oversight is to government funding.

Re:Isn't this economic without DARPA funding? (1)

RicktheBrick (588466) | about 2 years ago | (#42301493)

There is no way anyone can make an intelligent decision on what to purchase today. There are way too many variables, so one can only guess. I am sure that a gigaflops per watt is much better than the computer I am typing on now. So if they can get to 75 gigaflops per watt, why would anyone buy a computer again, since a dumb terminal could be better? But that decision would take a lot more knowledge than I have, and I know a lot more about computers than the average person. I contribute to a project sponsored by IBM called World Community Grid. They send work to volunteers who in return send back results. Now IBM is the leading contributor to this cause. I looked into their stats and found that they have over 100,000 computers. I would think that the cost in electricity to run so many computers would be more than the cost of running a supercomputer and giving everyone a dumb terminal. Since they contribute about 50,000 results a day, I would think that they would prohibit this program on personal computers and use the money that they would save in electricity to run a supercomputer. I do not know why they are doing this but I can only assume they have an intelligent CEO who would make the right decision. So until they centralize their computer needs, I guess purchasing a new personal computer is the correct decision. Maybe Google will allow Chrome users to upload all of their CPU-intensive programs to a centralized computer so no one will need a fast computer. So again how do I make an intelligent decision?

Re:Isn't this economic without DARPA funding? (1)

drinkypoo (153816) | about 2 years ago | (#42301539)

There is no way anyone can make an intelligent decision on what to purchase today.

Why not? If I'm shopping for an Android tablet, say, there's only a small handful of credible processors, only so many screen technologies and manufacturers, and I can gauge a manufacturer's past quality and hope that it will serve as a useful predictor of future performance. I can look at their financial statements and find out if they have been purchased by vulture capitalists. And you can do all of this from a free or nearly-free computer. (I've given away computers more than adequate to the task.)

Since they contribute about 50,000 results a day, I would think that they would prohibit this program on personal computers and use the money that they would save in electricity to run a supercomputer. I do not know why they are doing this but I can only assume they have an intelligent CEO who would make the right decision. So until they centralize their computer needs, I guess purchasing a new personal computer is the correct decision.

Well, that was a fun (if newline-deficient) walk through some relatively irrelevant material. You need a personal computer (of some kind; it could be a tablet with a keyboard) because the underlying technology isn't there yet, for the same reason, indeed, that IBM is still using PCs and workstations in many contexts. They also use thin clients where they make sense, but if a worker is expected to work when the network goes down, then they absolutely must have more computer than a thin client. If you expect your computer to work when the network goes down, you absolutely must have more computer than a thin client. But since you can buy e.g. a Chromebook and make it into both a computer and a thin client (as a boot option), there exist solutions that please everyone.

So again how do I make an intelligent decision?

If you don't want to do the research, you ask someone trustworthy who does. You could do worse than relying on a review site or magazine which purchases the products it reviews, like Consumer Reports.

Because free market didn't work? (0)

Anonymous Coward | about 2 years ago | (#42301159)

What a surprise.

Re:Isn't this economic without DARPA funding? (0)

Anonymous Coward | about 2 years ago | (#42303169)

Situations where this might make sense:

A 75x increase in efficiency would be great for nVidia, but the risk might be prohibitive; in other words, from a risk-reward perspective it isn't worth the $20M.

nVidia could lack the spare capital. Every business needs to focus on its core business before it does exploratory research, and the computer hardware market has been somewhat anemic recently.

I'm not asserting either is the case, but the point is that just because improved power efficiency would be good for nVidia doesn't mean it can't use help.

Re:Isn't this economic without DARPA funding? (1)

viperidaenz (2515578) | about a year and a half ago | (#42310453)

There are only really two players in the GPU market. The only incentive they have is "be slightly better than the other guy".

It's not darpa money (1)

Anonymous Coward | about 2 years ago | (#42300231)

DARPA money is taxpayer money.

Units (2)

ringman8567 (895757) | about 2 years ago | (#42300289)

Why GFLOPS/watt? That is (operations/second)/(joules/second). Why not just operations/joule?

Re:Units (2)

SuricouRaven (1897204) | about 2 years ago | (#42300305)

Practicality. When talking about energy consumption, it's usually given in watts because the practical implications are time-dependent. You've got to account for the time it takes to run the calculations (which may be time-critical: you don't want your amalgamated radar data on a five-minute delay), and you need to know the wattage to calculate cooling requirements. While operations/joule and FLOPS/watt are equivalent, it's easier to think in terms of the latter.
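
To spell out the equivalence, a quick sketch (Python; pure unit arithmetic, using DARPA's 75 GFLOPS/W target from the summary as the example value):

    # 1 GFLOPS/W = (1e9 FLOP/s) / (1 J/s) = 1e9 FLOP/J -- the seconds cancel.
    def flop_per_joule(gflops_per_watt):
        return gflops_per_watt * 1e9

    print(flop_per_joule(75))     # 7.5e10 FLOP per joule at the 75 GFLOPS/W target
    print(75e9 * 3600 / 1e12)     # ~270 TFLOP of work per watt-hour, same quantity rescaled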

Re:Units (1)

Rockoon (1252108) | about 2 years ago | (#42300339)

Maximizing ops/power is not the same as maximizing ops/energy. You would think that somebody that knows the difference between power and energy would also know the difference here.

Re:Units (1)

ringman8567 (895757) | about 2 years ago | (#42300449)

Yes, you maximize ops/power by running for longer! Prior to you, no one had mentioned this concept; we had only considered ops/energy, either as such or as (ops/sec)/power.

Re:Units (1)

Entropius (188861) | about 2 years ago | (#42300929)

I work with high-performance computing in physics -- all of my peers know the difference between energy and power. Sometimes people use "flops" as an abbreviation for "floating point operations" ("It takes XYZ flops per site to compute the Wilson Dirac operator" or "The flops/bytes ratio describes the balance between processing and communication in the algorithm") without the "per second".

Re:Units (0)

Anonymous Coward | about 2 years ago | (#42302463)

This is a really useless statistic for me. I really wish they could tell us about GFLOPS-month per kWh to help with bitcoin mining. Better luck next time NVIDIA.

did i misread something ? (1)

etash (1907284) | about 2 years ago | (#42300341)

The latest GPUs are already at 15-18 GFLOPS/watt. *confused*

Re:did i misread something ? (1)

Gaygirlie (1657131) | about 2 years ago | (#42300389)

DARPA wants to reach 75 GFLOPS/watt.

Re:did i misread something ? (4, Interesting)

Rockoon (1252108) | about 2 years ago | (#42300621)

We passed 1e+07 operations per kWh in 1965.
We passed 1e+08 operations per kWh in 1971.
We passed 1e+09 operations per kWh in 1976.
We passed 1e+10 operations per kWh in 1981.
We passed 1e+11 operations per kWh in 1987.
We passed 1e+12 operations per kWh in 1992.
We passed 1e+13 operations per kWh in 1997.
We passed 1e+14 operations per kWh in 2001.
We passed 1e+15 operations per kWh in 2008.

citation and graph [economist.com]

Energy efficiency consistently doubles approximately every 1.6 years, so if we are at ~16 GFLOPS/watt right now, then we will blow past DARPA's target early in 2016... just a little over 3 years from now.
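
That extrapolation is a one-liner to reproduce (Python; it takes the ~16 GFLOPS/watt starting point and the 1.6-year doubling time above as given, both of which are disputed in the replies):

    import math

    def years_to_target(current, target, doubling_years=1.6):
        # time to grow by a factor of target/current when efficiency doubles every doubling_years
        return doubling_years * math.log2(target / current)

    print(years_to_target(16, 75))   # ~3.6 years from ~16 GFLOPS/W to 75 GFLOPS/W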

Re:did i misread something ? (1)

Gaygirlie (1657131) | about 2 years ago | (#42300787)

Energy efficiency consistently doubles approximately every 1.6 years, so if we are at ~16 GFLOPS/watt right now, then we will blow past DARPA's target early in 2016... just a little over 3 years from now.

It's not guaranteed, and that's the whole point of this contract: to ensure that we reach that point. Also, the article talks about chips for sensor systems, not GPUs or similar.

Re:did i misread something ? (0)

Anonymous Coward | about 2 years ago | (#42301547)

And the reason it's not guaranteed is quite fundamental: somewhere around 22nm, scaling down silicon semiconductors will increase power per transistor instead of decreasing it. This is due to leakage currents.

There is no industrialized alternative to silicon semiconductors yet, as far as I'm aware.

Re:did i misread something ? (0)

Anonymous Coward | about 2 years ago | (#42303111)

Your math is wrong. In 2008, 1e+15 ops/kWh = 0.27 GFLOPS/W. Assuming you are right that it doubles every 1.6 years, in 2012 (now) it should be at about 1.6 GFLOPS/W, and it will take about 9 years to reach 75 GFLOPS/W at current exponential rates of increase.
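
A quick check of those numbers (Python; it treats ops and FLOPS as interchangeable, which the reply below questions):

    import math

    JOULES_PER_KWH = 3.6e6
    gflops_per_watt_2008 = 1e15 / JOULES_PER_KWH / 1e9            # ~0.28, from 1e+15 ops/kWh
    gflops_per_watt_2012 = gflops_per_watt_2008 * 2 ** (4 / 1.6)  # ~1.6 after four years of 1.6-year doublings
    years_to_75 = 1.6 * math.log2(75 / gflops_per_watt_2012)      # ~8.9 years
    print(gflops_per_watt_2008, gflops_per_watt_2012, years_to_75)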

Re:did i misread something ? (0)

Anonymous Coward | about 2 years ago | (#42303281)

FLOPS and OPS are not the same. I don't see anything to suggest that chart is FLOPS, not regular OPS, which I believe are cheaper from a power standpoint.

FLOPS: Floating point operations per second

Re:did i misread something ? (3, Insightful)

Entropius (188861) | about 2 years ago | (#42300937)

Out of curiosity (and I ask because I genuinely don't know), how many flops/watt do modern smartphones do? What about the GPU coprocessors in them?

Modern GPUs are great, but they're not even optimized that strongly for power consumption.

Re:did i misread something ? (1)

Kjella (173770) | about 2 years ago | (#42303527)

Out of curiosity (and I ask because I genuinely don't know), how many flops/watt do modern smartphones do? What about the GPU coprocessors in them? Modern GPUs are great, but they're not even optimized that strongly for power consumption.

I would think GPUs are actually worked on more for peak efficiency, because top cards have been consuming hundreds of watts, and workstation and compute cards in particular will often run at 100% for a big render/compute job. Smartphones are much more about dynamic power: adjusting clocks and voltages and tons of sleep modes; if you're doing 100% load on all cores, none of that will have an effect. Sure, they care about power usage at peak too, but I don't think more than GPUs do.

Re:did i misread something ? (1)

fnj (64210) | about 2 years ago | (#42304171)

All true and insightful, but it would still be nice to know the actual per-watt figures operating all out, and compare them with desktop figures.

I sure welcome this (2)

Gaygirlie (1657131) | about 2 years ago | (#42300343)

As things usually do, the results of this research will eventually trickle down to desktops, laptops and mobile devices, and will result in either lower power consumption or the same power consumption with higher performance -- either way it's a plus. I just wish the contract could've been given to someone other than NVIDIA, as it would be nice if the results of the research were released completely for free to the public instead of being patented up the wazoo, but alas, NVIDIA has so much experience in these things that it just makes sense to slap them with it if you expect results.

Re:I sure welcome this (0)

Anonymous Coward | about 2 years ago | (#42300641)

And then Apple will invent it for free.

NVIDIA? (0)

DavidClarkeHR (2769805) | about 2 years ago | (#42300737)

As things usually do, the results of this research will eventually trickle down to desktops, laptops and mobile devices, and will result in either lower power consumption or the same power consumption with higher performance -- either way it's a plus. I just wish the contract could've been given to someone other than NVIDIA, as it would be nice if the results of the research were released completely for free to the public instead of being patented up the wazoo, but alas, NVIDIA has so much experience in these things that it just makes sense to slap them with it if you expect results.

In the spirit of the flame war you may have begun, you do know that AMD generally has faster chips? They just run hotter (but stable) and have poor driver support.

Re:NVIDIA? (1)

Gaygirlie (1657131) | about 2 years ago | (#42300777)

In the spirit of the flame war you may have begun, you do know that AMD generally has faster chips?

I don't know that. I haven't seen any extensive research on such a topic, and I do not have the time or money to come to such a conclusion myself. Also, I am not taking any stance whatsoever on which of the two would've been better suited for the task at hand; I'll leave waging such silly flame wars to you.

Re:NVIDIA? (1)

Entropius (188861) | about 2 years ago | (#42300945)

They also don't have CUDA. Some people in my field have considered doing high performance computing on Radeons and generally stick with the thing that's easier to code for.

Re:NVIDIA? (1)

WhatAreYouDoingHere (2458602) | about 2 years ago | (#42302003)

AMD generally has faster chips

Perhaps, but at what cost [slashdot.org] ?

Re:I sure welcome this (1)

gbjbaanb (229885) | about 2 years ago | (#42300977)

Performance per watt? Yeah, sure, give it to Nvidia.

Or they could have chosen ARM to design faster processors that still use less power. Or given it to AMD, who sure need the money.

Maybe Bitcoin mining on GPUs isn't dead (0)

Anonymous Coward | about 2 years ago | (#42300349)

Hmm, perhaps future GPUs will be able to give ASICs a run for their money, with the mass-production advantages of commodity hardware.

NVIDIA is worth $7.87 billion (3, Interesting)

MSTCrow5429 (642744) | about 2 years ago | (#42300533)

So of course the Federal government needs to blow $20 million of taxpayer money, irregardless of its fiscal condition.

Money matters, but ... (4, Insightful)

DavidClarkeHR (2769805) | about 2 years ago | (#42300749)

So of course the Federal government needs to blow $20 million of taxpayer money, irregardless of its fiscal condition.

I'd prefer it be spent on computing rather than explosions.

Re:Money matters, but ... (0)

Anonymous Coward | about 2 years ago | (#42300801)

Well what do you suppose DARPA is going to do with the info besides increase our "defensive" position?

BTW, $300 buys you a 4.3 TeraFLOP/s GPU. If the article is correct and "today" is 1 GFLOPS/watt, then the Radeon 7970 uses 4.3 kW. I somehow doubt that, so I looked up the Radeon 7970's wattage and saw numbers like 300-400. If that's true, then "today" is actually about 10 GFLOPS/watt. That is still less than 75 GFLOPS/watt, but the article is off by a factor of 10.
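
Spelled out (Python; the 4.3 TFLOPS figure and a ~350 W midpoint of the quoted 300-400 W range are taken as given, not looked up):

    card_tflops = 4.3                                    # single-precision, as quoted above
    implied_watts = card_tflops * 1e12 / 1e9             # 4300 W if 1 GFLOPS/W really were state of the art
    implied_gflops_per_watt = card_tflops * 1e3 / 350    # ~12 GFLOPS/W at ~350 W board power
    print(implied_watts, implied_gflops_per_watt)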

It'll help with explosions ultimately anyway. (1)

girlinatrainingbra (2738457) | about 2 years ago | (#42301241)

re: I'd prefer it be spent on computing rather than explosions.
.
Don't forget that they can use the improved computational power and that improved computational power efficiency to simulate and design better explosions! But look at how much innovation comes about from war and war/defense funding. (It's not hard to search for it). Heck, even canned food had its research and development funded by Napoleon to help the French military.
And it cuts both ways: any innovation can be put to use in the aid of defense and in the aid of war. Many technologies and concepts created for the war and defense industry also have many civilian and non-military applications.

Re:NVIDIA is worth $7.87 billion (0)

Anonymous Coward | about 2 years ago | (#42300949)

Irregardless isn't a word. Just throwing that out there.

Re:NVIDIA is worth $7.87 billion (1)

WhatAreYouDoingHere (2458602) | about 2 years ago | (#42302027)

I thought "irregardless" meant without disregard to, or actually to simplify it more, "regarding" ...
it's a very nonflawless opinion of mine, though.

Re:NVIDIA is worth $7.87 billion (0)

Anonymous Coward | about 2 years ago | (#42302713)

Sure it is. It means the same as "regardless". It's just irredundant.

Re:NVIDIA is worth $7.87 billion (0)

Anonymous Coward | about 2 years ago | (#42301251)

No offense, but the tech industry is one of America's few industries that consistently pays back more than it costs.

Re:NVIDIA is worth $7.87 billion (0)

Anonymous Coward | about 2 years ago | (#42302303)

Entirely undisnonirregardfulllessness.

Re:NVIDIA is worth $7.87 billion (0)

Anonymous Coward | about 2 years ago | (#42302979)

The fiscal condition of the United States is fine.

We're making far more money...but we're just refusing to charge appropriate taxes on services received.

Re:NVIDIA is worth $7.87 billion (0)

Anonymous Coward | about 2 years ago | (#42303717)

There's an unfortunately vocal slice of people who complain about too much government spending but don't want any decrease in the military spending from which this $20M is sourced.

Re:NVIDIA is worth $7.87 billion (1)

tyrione (134248) | about 2 years ago | (#42305141)

So of course the Federal government needs to blow $20 million of taxpayer money, irregardless of its fiscal condition.

Joint research. Grow up.

sensor systems (1)

csumpi (2258986) | about 2 years ago | (#42300717)

"chips for sensor systems"

Wonder what they mean by "sensor systems".

Re:sensor systems (0)

Anonymous Coward | about 2 years ago | (#42301101)

Being able to calculate trajectories of many different objects at once with much less power = less useless material in the missile. i.e. smaller silicon, smaller batteries, more boom, etc.

Re:sensor systems (0)

Anonymous Coward | about 2 years ago | (#42301285)

They mean radar. Signal processing on modern AESA radar systems involves a lot of non-trivial computation and needs to fit in the size, weight, power, and cooling envelope of a (sometimes very small and unmanned) airplane. You're basically cramming a whole compute cluster into a drone. Getting more performance per watt means being able to stuff more processors in there and do better processing.

Math (1)

poofmeisterp (650750) | about 2 years ago | (#42300885)

...and $20 million is enough to develop that??? 75x the capability?

I'm no genius in development or marketing, but if that could have been done, it would have already.

I don't see in TFA where it says how long they have to complete this project. So that makes one wonder if they'll (based on Moore's Law) have it out one week earlier than all competitors with that small lump of change.

Like the FSF endorsable embedded processor? (0)

Anonymous Coward | about 2 years ago | (#42301087)

How is the goal of this funding not already present in this processor for half the price?
http://tech.slashdot.org/story/12/12/04/1748232/toward-an-fsf-endorsable-embedded-processor

Share the Knowledge (1)

mastermind7373 (1932626) | about 2 years ago | (#42301331)

As an avid nVidia fan, I do hope they will share their findings with AMD (and Intel, if applicable) to prevent antitrust issues and to encourage even more innovation.

Blowing $20M to Spy On Citizens (0)

Anonymous Coward | about 2 years ago | (#42303991)

And crack our crypto... this is just so they can find a way to break certain algos and spy on us more easily.

Article Summary Is Incorrect (3, Informative)

DanielRavenNest (107550) | about 2 years ago | (#42304607)

The current NVIDIA K20X compute card delivers 5.575 double-precision GFLOPS per watt:

http://www.anandtech.com/show/6446/nvidia-launches-tesla-k20-k20x-gk110-arrives-at-last [anandtech.com]

Note that these cards are slightly different from consumer graphics cards. They have more double-precision pipelines because scientific computing cares more about that kind of math. They are also much more expensive than consumer cards. The underlying chip design is similar to the 600-series graphics cards; you can think of it as a modified version optimized for math, since the 600 series came out first and is being produced in higher volume.
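
That 5.575 figure falls straight out of the card's published numbers; a quick check (Python; the 1.31 TFLOPS double-precision and 235 W TDP values are as I recall them from the linked review, so treat them as assumptions):

    dp_tflops = 1.31     # double-precision peak, per the linked review (from memory)
    tdp_watts = 235      # board TDP, per the linked review (from memory)
    print(dp_tflops * 1e3 / tdp_watts)   # ~5.57 GFLOPS/W, matching the figure above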

DARPA money is like mob money... (1)

thejazzcat (2667453) | about a year ago | (#42307363)

My theory is that by putting investment capital into the tech, they have a "mob-like" hand in the technology. Yes, it doesn't seem like a good investment of DARPA's money now, but the favor WILL be repaid by nVidia at some point, probably to the tune of much more than $20 million. GPUs have incredible potential for processing power even today, and DARPA is one of the government divisions that I would expect might need such power for various projects.

millions for defense, (0)

Anonymous Coward | about a year ago | (#42307663)

but not a dime for open source drivers so your graphics will work on Ubuntu?
