IBM Builds First Graphene Integrated Circuit

Soulskill posted more than 3 years ago | from the incremental-progress dept.


AffidavitDonda writes "IBM researchers have built the first integrated circuit (IC) based on a graphene transistor. The circuit, built on a wafer of silicon carbide, consists of field-effect transistors made of graphene. The IC also includes metallic structures, such as on-chip inductors and the transistors' sources and drains. The circuit the team built is a broadband radio-frequency mixer, a fundamental component of radios that processes signals by finding the difference between two high-frequency wavelengths."


77 comments


This is an extremely important accomplishment. (2, Funny)

Anonymous Coward | more than 3 years ago | (#36406612)

I don't think that the article goes into enough detail about just how important this accomplishment is. Frankly, this is our only hope going forward. With so much slow software written in languages like Ruby and JavaScript becoming popular, it will again fall back to the hardware guys to really make things fast again. This will probably be the way they'll do it!

Re:This is an extremely important accomplishment. (1)

Blink Tag (944716) | more than 3 years ago | (#36406692)

Don't lay it all on the hardware side. We'll continue to see speed gains in interpreters too. Java, for example, was dog slow when it started; its speed has improved significantly since then, and not solely due to processor improvements.

Or was this just an opportunity to troll?

Re:This is an extremely important accomplishment. (1)

Lunix Nutcase (1092239) | more than 3 years ago | (#36406702)

And imagine if all that effort speeding up these slow languages was actually put to use in writing code in an efficient language to begin with.

Re:This is an extremely important accomplishment. (2)

the eric conspiracy (20178) | more than 3 years ago | (#36406758)

Not that much effort compared to the savings of writing in a portable language like Java.

Re:This is an extremely important accomplishment. (3, Funny)

Anonymous Coward | more than 3 years ago | (#36406792)

Write Once, Debug Everywhere.

Re:This is an extremely important accomplishment. (4, Interesting)

Lunix Nutcase (1092239) | more than 3 years ago | (#36406796)

15 years of optimization from Sun just to bring it within an order of magnitude of C? That seems like quite a bit of effort wasted.

Re:This is an extremely important accomplishment. (1)

davester666 (731373) | more than 3 years ago | (#36409066)

It's OK. Big enterprise has billions of dollars to waste on using a portable language to process textual data.

Re:This is an extremely important accomplishment. (3, Informative)

Lunix Nutcase (1092239) | more than 3 years ago | (#36406886)

portable language like Java.

Right.... One can find a C compiler for pretty much every processor since the 80s. I can point out a number of still widely used architectures that have no JVM.

Re:This is an extremely important accomplishment. (2, Insightful)

the linux geek (799780) | more than 3 years ago | (#36407178)

Porting C code between architectures is a pain in the ass. Endianness issues alone can fuck up plenty of code, without even getting into differences in compilers and standard libraries.

Things like porting from one UNIX to another UNIX on a different arch - stuff that most armchair programmers view as "just recompile" - can take hundreds of man-hours or more on complex codebases. C is not portable.

Re:This is an extremely important accomplishment. (1)

AsmCoder8088 (745645) | more than 3 years ago | (#36407644)

What Lunix Nutcase meant was that you can write C code for practically any processor out there. The same can't be said for Java. And yes, C is portable; all you have to do is use #ifdef directives to select the right code wherever something architecture-specific is needed.
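For illustration, a minimal sketch of that #ifdef pattern (the function name and the rdtsc branch are hypothetical examples, not from the post; the fallback assumes POSIX):

    #include <stdint.h>
    #include <time.h>

    /* Pick an implementation per architecture at compile time. */
    #if defined(__x86_64__)   /* gcc/clang on x86-64 */
    static inline uint64_t cycle_count(void)
    {
        uint32_t lo, hi;
        __asm__ volatile ("rdtsc" : "=a"(lo), "=d"(hi));
        return ((uint64_t)hi << 32) | lo;
    }
    #else                     /* POSIX fallback for every other architecture */
    static inline uint64_t cycle_count(void)
    {
        struct timespec ts;
        clock_gettime(CLOCK_MONOTONIC, &ts);
        return (uint64_t)ts.tv_sec * 1000000000ull + (uint64_t)ts.tv_nsec;
    }
    #endif

Everything outside the #if/#else is ordinary portable C; only the guarded branch knows anything about the architecture.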

Re:This is an extremely important accomplishment. (0)

Anonymous Coward | more than 3 years ago | (#36409518)

Or not write architecture-specific (broken) code in the first place. C does NOT have any endianness issues in the language standard; any endian issues come from people making invalid casts that are undefined by the C standards.

Re:This is an extremely important accomplishment. (3, Insightful)

bgat (123664) | more than 3 years ago | (#36408140)

While it is certainly possible to write C code that is endian-dependent, I consider such code to be broken--- as would any sane, professional C programmer.

To wit, the eleventy-million lines of C code in the Linux kernel are fully endian-agnostic. And largely independent of integer representation size, too!
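The standard idiom, for the curious (a sketch, not actual kernel code): build values from bytes arithmetically instead of casting byte pointers to wider integer types, and the same C behaves identically on big- and little-endian machines.

    #include <stdint.h>

    /* Read/write a 32-bit big-endian value in a byte buffer.
       No pointer casts, so no dependence on host endianness. */
    uint32_t read_be32(const unsigned char *p)
    {
        return ((uint32_t)p[0] << 24) | ((uint32_t)p[1] << 16) |
               ((uint32_t)p[2] << 8)  |  (uint32_t)p[3];
    }

    void write_be32(unsigned char *p, uint32_t v)
    {
        p[0] = (unsigned char)(v >> 24);
        p[1] = (unsigned char)(v >> 16);
        p[2] = (unsigned char)(v >> 8);
        p[3] = (unsigned char)v;
    }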

Re:This is an extremely important accomplishment. (1)

phantomfive (622387) | more than 3 years ago | (#36409188)

Heh... I worked for a while at a company that had a product for Brew, written in C, and in J2ME, which is of course Java (between them, these covered the vast majority of programmable cell phones at the time). We had three guys working full time to port the J2ME version to different platforms, because all the manufacturers had different quirks. It only took one guy to take care of the Brew ports, and it was easier.

Now, obviously, J2ME is not J2SE, but it's all that was available on those platforms.

Re:This is an extremely important accomplishment. (0)

Anonymous Coward | more than 3 years ago | (#36409714)

C is very portable, as long as it doesn't make use of "undefined" or "implementation-defined" behaviour. Unfortunately, incorrect assumptions together with the attitude of "if it works, it's correct" result in C code that is not well-defined by the standard and therefore not portable.

Re:This is an extremely important accomplishment. (1)

maxwell demon (590494) | more than 3 years ago | (#36410138)

Endianness issues alone can fuck up plenty of code

Unless you are coding device drivers or the like (in which case the JVM wouldn't be very helpful anyway), the endianness shouldn't affect your code at all. If it does, you've done something wrong.

Re:This is an extremely important accomplishment. (2)

Squiddie (1942230) | more than 3 years ago | (#36409686)

As one of my colleagues said, saying Java is good because it's multi-platform is like saying anal sex is good because it works on both sexes.

Re:This is an extremely important accomplishment. (0)

Anonymous Coward | more than 3 years ago | (#36409954)

That saying originated on IRC, from the user alanna bash.org [bash.org] (also known as Slashdot user 35028).
Looks like the saying has come full circle more than once.

Re:This is an extremely important accomplishment. (1)

RespekMyAthorati (798091) | more than 3 years ago | (#36414844)

Which is perfectly true.

Re:This is an extremely important accomplishment. (0)

Anonymous Coward | more than 3 years ago | (#36409914)

Java doesn't work for that many architectures.
If you want really portable bytecode I would recommend 6502 assembly. You have to look for pretty obscure stuff if you want to find something that can run Java bytecode and doesn't have a NES or C64 emulator. It is pretty much the next thing that is done for every new platform after porting Doom.

Re:This is an extremely important accomplishment. (3, Interesting)

gman003 (1693318) | more than 3 years ago | (#36406762)

Programmer time is more expensive than hardware time. If a less efficient language is easier to use, it makes business sense to use it to save money.

This does not explain the slow languages that are difficult to use, but it does explain why assembly has fallen from favor, and why C is in decline.

Re:This is an extremely important accomplishment. (2)

Lunix Nutcase (1092239) | more than 3 years ago | (#36406826)

and why C is in decline.

Hahaha, lol wut? According to the Tiobe Index, much beloved by Java programmers, C is in 2nd place among the most popular languages.

Re:This is an extremely important accomplishment. (0)

gman003 (1693318) | more than 3 years ago | (#36406880)

Remember when it used to be first, by a huge margin? It's not dead by any means, and still a very active language, but it's not taught as much anymore. Within a generation, it'll be in the same class as FORTRAN - only used to support legacy apps.

Re:This is an extremely important accomplishment. (5, Insightful)

Lunix Nutcase (1092239) | more than 3 years ago | (#36406924)

Remember when it used to be first, by a huge margin?

C is still more widely used by a huge margin. Just because you "enterprise" developers don't use it (despite the infrastructure of your managed languages being written in C and C++) doesn't change that.

Within a generation, it'll be in the same class as FORTRAN - only used to support legacy apps.

Yeah right. What are you going to write your kernels in? What are you going to use for those millions if not billions of microcontrollers that will still be in use that can't run a JVM? What exactly are you going to write your VMs and interpreters in? Right, they will have to be written in C or C++ and assembly.

Re:This is an extremely important accomplishment. (1)

tyrione (134248) | more than 3 years ago | (#36408434)

Remember when it used to be first, by a huge margin?

C is still more widely used by a huge margin. Just because you "enterprise" developers don't use it (despite the infrastructure of your managed languages being written in C and C++) doesn't change that.

Within a generation, it'll be in the same class as FORTRAN - only used to support legacy apps.

Yeah right. What are you going to write your kernels in? What are you going to use for those millions if not billions of microcontrollers that will still be in use that can't run a JVM? What exactly are you going to write your VMs and interpreters in? Right, they will have to be written in C or C++ and assembly.

Add to all you mentioned C99 (the basis of OpenCL's kernel language) and the latest C++ standard, and OpenCL will only expand.

Re:This is an extremely important accomplishment. (0)

Anonymous Coward | more than 3 years ago | (#36408698)

I'd write them in FORTH.

Re:This is an extremely important accomplishment. (4, Informative)

Jahava (946858) | more than 3 years ago | (#36406954)

Remember when it used to be first, by a huge margin? It's not dead by any means, and still a very active language, but it's not taught as much anymore. Within a generation, it'll be in the same class as FORTRAN - only used to support legacy apps.

... and kernels, and drivers, and embedded applications, and core libraries, and runtimes, too, unless those go away.

C is a fantastic language that very effectively fills a much-needed role in software development: providing a lightweight, usable, and readable language while retaining (most of) the capabilities of machine code. C is intended to interface directly with the hardware, or closely with the operating system.

C is in decline because many modern programming challenges don't benefit from working at the level of machine code or the operating system, nor should they. If I want to write a game, I want to focus on the game design and mechanics, not bit blitting pixels onto a buffer. Libraries, interfaces, and abstraction levels are all things higher-level languages leverage to constrain the perspective and duty of the developer to the most productive (and, oftentimes, interesting) areas.

Also, let's not forget that in the common non-kernel case, most of the reason C is even usable is because C, itself, leverages a massive host of support libraries and a not-so-lightweight runtime.

Re:This is an extremely important accomplishment. (2)

bgat (123664) | more than 3 years ago | (#36408128)

C appears to be in decline only because of the explosive growth in the number of applications produced in higher-level languages. Total annual C output is increasing year-on-year--- mostly to implement systems that themselves support the aforementioned applications.

Put another way, none of the growth in Java, C#, Python, Ruby, etc. etc. etc. would be possible without growth in C output as well. C won't ever go away, because every higher-level language in existence depends on it.

So you can have your Java. I know you'll come crying back to me when you want a platform to run it on. :)

Re:This is an extremely important accomplishment. (1)

Aighearach (97333) | more than 3 years ago | (#36408330)

Absolutely! C is totally a part of Ruby. We'd be suffering without it.

If something is too slow, we write it as an extension... in C.

In fact, the ease of integrating with C is one of the biggest advantages of Ruby over Perl, where you need a whole additional language (XS) to glue them together. In Ruby all we need is a lightweight C API and we're there.
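For readers who haven't seen it, a Ruby C extension is roughly this small (a hedged sketch; the FastSum module, fast_sum function and sum method are made-up examples):

    #include <ruby.h>

    /* A hypothetical hot loop moved from Ruby to C: sum 0..n-1. */
    static VALUE fast_sum(VALUE self, VALUE n)
    {
        long total = 0;
        long limit = NUM2LONG(n);
        for (long i = 0; i < limit; i++)
            total += i;
        return LONG2NUM(total);
    }

    /* Ruby runs this when the compiled extension is require'd. */
    void Init_fast_sum(void)
    {
        VALUE mod = rb_define_module("FastSum");
        rb_define_module_function(mod, "sum", fast_sum, 1);
    }

From the Ruby side it's then just require "fast_sum" and FastSum.sum(1_000_000) - no separate glue language involved.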

Re:This is an extremely important accomplishment. (0)

Anonymous Coward | more than 3 years ago | (#36408760)

"C is in decline because many modern programming challenges don't benefit from working on the level of machine code or operating system, nor should they. If I want to write a game, I want to focus on the game design and mechanics, not bit blitting pixels onto a buffer."

Modern 3D game engines are still largely written in C++, which is a superset of C (and not considered much of a high-level language). This includes the Source engine (Valve), the very widely used Unreal Engine (Epic Games), id Tech 4 (id Software), and CryEngine (Crytek). Also, check out the list of open-source engines at https://secure.wikimedia.org/wikipedia/en/wiki/Comparison_of_game_engines [wikimedia.org]
They use C/C++ so as to be able to push every last fps out of the hardware, and because C++ has all the low-level access and a lot of the performance that C does. All of the cool stuff (the engine) is still done in C++.

"...., not bit blitting pixels onto a buffer."
Writing shader code is no glamorous, high-level, object-oriented affair either. Syntax for GLSL is very much like C, only much more primitive and much less powerful.

If you want to write a game using one of these engines, the engine does much of the heavy lifting, so you might not need to write much C/C++. But the "libraries, interfaces, and abstraction levels" are still written in C/C++. And as long as those things still need to be written, C/C++ is not going away anytime soon.

Re:This is an extremely important accomplishment. (1)

evilgraham (1020325) | more than 3 years ago | (#36413094)

Errr. You do realise that your argument reads a little as if you're saying that the automotive industry is in the ascendant whilst the oil exploration and petrochemical industries are in decline? I'm no expert in rhetoric, but I believe this sort of thing is called a false dichotomy (real rhetoric experts, feel free to jump in). One might make a similar argument that most modern fashion challenges don't benefit from working on the level of textile design, or for that matter, growing cotton. Well, perhaps on a very narrow level indeed, but it does seem rather silly to mistake a desirable outcome for the reality of how it is achieved.

I appreciate that I am somewhat misrepresenting your core point to leap into the discussion, but it is really quite impossible not to get a little peeved about these holy language wars which seem to break out at the drop of a hat (there is a post somewhere else in this article where someone states that they "hate assembler snobs", which makes about as much sense as hating farmers if you value being able to eat). Look, as many others have pointed out, the whole point of computers, viewed as universal machines, is that you, as the programmer, get the damn things to churn out perfect copies of what you have told them to do, once, twice, one hundred, one million, tens of trillions of times. There has never, in human history, been anything close in terms of amplifying effort, unless Archimedes himself has been elevated to some celestial plane where he does indeed have a long enough lever. That said, whilst people are thinking themselves smartarses for writing stuff in Java (the only language which is quicker to write than to run), then a) the point is being missed, a bit, and b) best that other smart people look for better, faster ways of making stuff work.

Oh, and in the day job, 'C' is very much a high-level language. It is fine and, most importantly, portable for the purposes required, but our stuff simply would not work without breaking out the dreaded assembler here and there (there is tons of Java too, but we prefer not to speak about that). So what? The appropriate tool for the job and all that.

The IT world truly amazes me. Do we really imagine that "news for plumbers, stuff that matters" would have long debates about the superiority of the spanner over the sink plunger? Some languages are perfectly fine for applications where the end user has all the processing power they need on their desk. Other stuff has to play nicely in a multi-user, multi-tasking, multiprocessing environment, and there it is best to use (and, if it is important to you, learn) the appropriate way to get the best out of that. Of course some commentators have a bit of an agenda here - I know and am familiar with language x, so it's the best thing since sliced bread and everyone should bow down to my incandescent genius - but that's bollocks. All it does is give the rest of the world the impression that this is a long way from being even close to a profession; something which is very much encouraged by the common perception of the industry here in the UK (but you already knew that by the way I spelled "arguement") - stories of large systems developments going expensively tits up are ten a penny. Whilst it appears that the basic values in our business remain "x" is better than "y", I don't expect that to change real soon now. Which is a pity; the world is at our fingertips, guys (and gals).
About time we took a leaf (only the one, mind) out of the book used by lawyers, doctors and other sundry self-serving trades and stopped pissing on our own fireworks for a change. Just my $0.02

Re:This is an extremely important accomplishment. (2)

Toonol (1057698) | more than 3 years ago | (#36407062)

Within a generation, it'll be in the same class as FORTRAN - only used to support legacy apps.

No. FORTRAN was replaced. You can do anything you could do in FORTRAN in more modern languages (like C, for instance). However, you cannot write operating systems in Java. I don't think there's any replacement on the near horizon that fills C's low-level niche.

I can see the use of C and C++ in most applications decreasing, although not when speed or performance is more critical than the expense of extra labor.

Re:This is an extremely important accomplishment. (1)

tyrione (134248) | more than 3 years ago | (#36408448)

Within a generation, it'll be in the same class as FORTRAN - only used to support legacy apps.

No. FORTRAN was replaced. You can do anything you could do in FORTRAN in more modern languages (like C, for instance). However, you cannot write operating systems in Java. I don't think there's any replacement on the near horizon that fills C's low-level niche. I can see the use of C and C++ in most applications decreasing, although not when speed or performance is more critical than the expense of extra labor.

FORTRAN is making a comeback, especially seeing how phenomenally useful it is in the applied sciences that need numerical analysis [yes, C is its bride in this area], but FORTRAN was designed from the ground up for such work.

Re:This is an extremely important accomplishment. (1)

mjwalshe (1680392) | more than 3 years ago | (#36411894)

Yep, I have been considering writing some Map Reduce programs in FORTRAN, and as tyrione points out, FORTRAN is still used in full-on technical programming where you want to solve a problem and not reinvent the wheels C doesn't have. FORTRAN also has decades of work behind its compiler development, including the specialized parallel and CUDA-aware variants.

Re:This is an extremely important accomplishment. (1)

GNious (953874) | more than 3 years ago | (#36409664)

So we start with a CPU that processes native Java bytecode, then build a kernel, then userspace, then ...

and we shall call it Javux ...

Re:This is an extremely important accomplishment. (0)

Anonymous Coward | more than 3 years ago | (#36409932)

How do you generate that bytecode in the first place?

Call it Janx (0)

Anonymous Coward | more than 3 years ago | (#36411034)

From The Hitchhiker's Guide to the Galaxy

      "Take the juice from one bottle of that Ol' Janx Spirit.
                Pour into it one measure of water from the seas of Santraginus V
                Allow three cubes of Arcturan Mega-gin to melt into the mixture (it must be properly iced or the benzene is lost).
                Allow four litres of Fallian marsh gas to bubble through it (in memory of all those happy Hikers who have died of pleasure in the Marshes of Fallia).
                Over the back of a silver spoon float a measure of Qualactin Hypermint extract, redolent of all the heady odours of the dark Qualactin Zones.
                Drop in the tooth of an Algolian Suntiger. Watch it dissolve, spreading the fires of the Algolian suns deep into the heart of the drink.
                Sprinkle Zamphour.
                Add an olive.
                Drink...but very carefully."

Re:This is an extremely important accomplishment. (1)

Mindcontrolled (1388007) | more than 3 years ago | (#36407098)

Dude, you can have my FORTRAN compiler when you pry it out of my cold, dead hands. Now where did I leave those punchcards for the latest project again?? AND GET OFF MY LAWN!

Re:This is an extremely important accomplishment. (1)

sneakyimp (1161443) | more than 3 years ago | (#36407466)

Legacy apps like the JVM?

Re:This is an extremely important accomplishment. (1)

CSMoran (1577071) | more than 3 years ago | (#36424066)

Within a generation, it'll be in the same class as FORTRAN - only used to support legacy apps.

You're making a fool of yourself. New codes are still written in Fortran, just not in your niche. The Fortran standard is still evolving (we've had Fortran 2003, Fortran 2008 recently). Massively parallel (think thousands of cores) high-performance computing ("number crunching") scientific programs are often written in Fortran. This is because the language is rather simple and hence compilers can be heavily optimized -- often *beyond* what is possible in C (e.g. the compiler can use the fact that arguments to a function cannot alias to squeeze extra performance). Also it's easy to manipulate large data structures in Fortran (think array slices -- in Fortran you don't need loops to do that). Finally, GPUs have been embraced by the compiler vendors and CUDA-capable compilers for Fortran 95 are already available. So... no, it's not just the legacy apps.
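The aliasing point in C terms, for comparison (a sketch; C99's restrict is the explicit opt-in that Fortran gets by default, since Fortran dummy arguments may be assumed not to overlap):

    /* Without restrict the compiler must assume a, b and c might
       overlap, which forces reloads and can block vectorization. */
    void axpy(long n, double k,
              const double *restrict b,
              const double *restrict c,
              double *restrict a)
    {
        for (long i = 0; i < n; i++)
            a[i] = k * b[i] + c[i];
    }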

Re:This is an extremely important accomplishment. (4, Interesting)

parlancex (1322105) | more than 3 years ago | (#36407676)

Programmer time is more expensive than hardware time. If a less efficient language is easier to use, it makes business sense to use it to save money. This does not explain the slow languages that are difficult to use, but it does explain why assembly has fallen from favor, and why C is in decline.

I honestly hate this idea. You only have to write a program once. Most programs run thousands of times; some programs will run millions or billions of times. If you actually calculated the global collective waste due to slow, heavily abstracted languages running across the globe, that cost is significantly more than it would have cost to write things properly to begin with.

Re:This is an extremely important accomplishment. (1)

Kjella (173770) | more than 3 years ago | (#36407874)

I honestly hate this idea. You only have to write a program once. Most programs run thousands of times; some programs will run millions or billions of times. If you actually calculated the global collective waste due to slow, heavily abstracted languages running across the globe, that cost is significantly more than it would have cost to write things properly to begin with.

If you're developing OpenOffice or MySQL, perhaps. I have many scripts and procedures at work that are run once a day or once a month on a centralized system for many users; the carbon footprint of that is probably smaller than that of the first user who drove to the office. And right now I'm doing a migration that's only going to be done once - things that are wasteful but make no sense to optimize.

If we assume you get less done with a more "to the metal" language, you also have to consider the cost of what we wouldn't get done. Oh, they're still sending paper reports because that new electronic system isn't done yet - how costly is that and what's the carbon footprint of that? It's an opportunity cost, and compared to how much other shit we blow electricity on getting more things done by computer is usually both cheaper and more environmentally friendly.

Re:This is an extremely important accomplishment. (2)

suomynonAyletamitlU (1618513) | more than 3 years ago | (#36408648)

Programmer time is more expensive than hardware time. If a less efficient language is easier to use, it makes business sense to use it to save money.

I honestly hate this idea. You only have to write a program once.

If you made a mistake, and the language you're using makes it difficult to track down that mistake, you may write that program many times over.

If you have a hotshot programmer on your team who thinks he's more clever than he is, no matter what language you're using, you may have to rewrite it again, or parts of it. The easier that is, the less of a loss it was to have made that hiring decision.

If your language of choice does not support certain features natively (concurrency, garbage collection, variable types, others come to mind), rewriting them from scratch or finding libraries, and training programmers to use them, becomes an additional cost. Anyone not already familiar with those libraries, or your own custom in-house libraries, is likely to make first-time mistakes on your dime. If this gets bad enough, you may have to write the program again.

Additionally, electricity is cheap, and you may not actually know when developing the program whether it will be used one time, ten times, ten thousand times, ten million times, or zero times, because that's up to management, the client, etc. If you get the functionality working acceptably in short order, and it becomes production, you may waste X money in additional power consumption, slow processing, etc. If you spend months getting the code to work, you might be asked to alter, rewrite, cancel, etc the project due to changed requirements... I'm sure there are people who are better at the horror stories than I am.

Clearly the best would be people who really, 100% know C or even, god willing, assembly, and can program as fast and well in that as lesser men could in java, perl, etc; people who have a treasury of libraries they're fully versed in that touches the heavens, and who know memory management and pointer arithmetic better than the CPU designers themselves. Good luck affording THAT. In the meantime, it's probably better to prototype in an easy language, and if time, money, and management agree, port the code to something faster. If time, money, and management don't agree, you still have a working program.

Now, you'd have a working program if you did it in C too. I hear you saying it. And if you, and everyone on your team, or who will join your team later are all good enough to deal with that, then yes, you are good enough, and you'll do fine. Were you to run a company, however, you may have to make this decision when it comes to hiring programmers: Do you want a quick mock-up in a sloppy language, or a sturdy, well-built machine that will take longer and requires better trained personnel? Because you'll choose each of those for different projects. And if you choose the former, even as a proponent of harder, better, faster, stronger code, you may be embarrassed to find you still have some of those quick and sloppy mock-up jobs in a production environment, because you didn't need anything more, and it would have cost you to redo them.

Re:This is an extremely important accomplishment. (1)

im_thatoneguy (819432) | more than 3 years ago | (#36409672)

On the other hand would you rather have an application in which the developer spent most of their time writing code or adding features?

Sometimes performance is a feature but on most modern systems for most applications performance is secondary to functionality.

I write most of my tools in an abstracted scripting language instead of C++ SDKs. Why? Because I can crank out a tool in about an hour. Doing the same through a C++ SDK would take several days.

I'm less interested in how fast it is than in what it can do. JIT scripting in most cases delivers a responsive user experience and lets me spend a week adding features and usability instead of hunting bugs and interfacing with large, unwieldy SDKs.

Re:This is an extremely important accomplishment. (1)

knutkracker (1089397) | more than 3 years ago | (#36410702)

Most programs run thousands of times; some programs will run millions or billions of times. If you actually calculated the global collective waste due to slow, heavily abstracted languages running across the globe, that cost is significantly more than it would have cost to write things properly to begin with.

Yeah, but that's a cost to the user who pays for the hardware, not to the company that writes the software.

Re:This is an extremely important accomplishment. (1)

roman_mir (125474) | more than 3 years ago | (#36412284)

If you actually calculated the global collective waste due to slow, heavily abstracted languages running across the globe, that cost is significantly more than it would have cost to write things properly to begin with.

- but there is another side to this coin. What if writing an application in a higher-level language is not only simpler than in a lower-level language, due to a greater variety of specialized libraries, but also produces fewer errors per some unit of code (per line or per transaction or per function, whatever)?

If this is true (and I do think it is, after working in this industry since 1995), then besides having a possibly slower program, you also end up having a program that is possibly more correct (and easier to maintain).

Now, if program speed is important because of how many times the application will be executed, as you say, then what about all the errors that will also be executed that many times?

What is better: having an application that is faster, or having an application that is more correct and easier to maintain? Of course it's better to have both, but that's why the underlying platform for languages like Java undergoes changes that make it faster as time passes, and the code does not have to be rewritten; it can stay the same.

Re:This is an extremely important accomplishment. (0)

Anonymous Coward | more than 3 years ago | (#36408872)

That's a fallacy.
Add up all the expense that slow code causes end users, and it would be several orders of magnitude cheaper to write in a more efficient language.
It is MUCH cheaper for the developer to write in crappy interpreted languages, but that just offloads the expense onto the users.

Re:This is an extremely important accomplishment. (0)

Anonymous Coward | more than 3 years ago | (#36410332)

simply put, use the right tool for the right job
the following story may or may not refer to a real life example. the names and such have been made generic for purposes of explanation

Phase 1:
you start that little bit of software out in the latest flavour of the month, X
really what you want to do is crank something out the door
you'll work out the bugs and other stuff later
the main thing is to be first to market with your features
second to that is to get the word out and getting everyone using it

Phase 2:
you've now got a base of users using your software
but then the tipping point comes along
the usage of the software just manages to cross that performance hurdle
you know. the one on that curve where it looks like it's going straight up
you scramble to add in as many hacks and workarounds as you can find
google isn't providing the answers to speed things up
"throw more / faster hardware at the problem" is the answer
more user complaints flood in when given this answer
management decides it's time for a re-write.
searches on google now find thousands of stories you didn't see before
they complain how X simply isn't fit for purpose after you go past Y
a handful of searches say they moved to using M instead and couldn't be happier
M has been around a little longer than X so it's more mature
so you begin the rewrite of doing everything in M instead
you build the base functions again as it was when you started with X
you test it and find you need to add work-arounds to make it compatible with your current release
eventually you get there and things run much faster now
the main thing is to keep the users happy and get it out the door
you'll work out the bugs and other stuff later

Phase 3:
things are running smooth. then suddenly the tipping point comes along again.
here you thought you had eliminated it for good, but all you did was push it further away.
learning the lessons from last time, you add in all the hacks and workarounds again
except much faster this time since you've already done it once before
it works for a while. but eventually you hit the curve again.
you're doing very deep google searches and yelling at the M community for answers
the only response is "throw more / faster hardware at the problem"
given last time you immediately search for where people moved onto
management is happy with the decision as it'll save PR work
google now reveals hundreds of stories you didn't see before
they complain that M works well up to point N then falls over and craps itself
most of them say they moved onto F instead as it's much more resilient
you've heard of F before. F has been around much longer than M so it's much more mature.
you also see reports of F being 'enterprise grade'. you read up and are convinced.
so again you begin rewriting in F. this time you build to current compatibility
you test, debug, redo, rewrite, retest again and again. it's painful.
eventually you get there and things are much better now.
the main thing is keep the customers happy.
you'll work out the various bugs and other stuff later

Phase 4:
damn you tipping point! damn you to hell!
it took much longer this time but it now looks like a wall
you've got a team of devs furiously poring over google implementing every trick in the book
your dev process is being published and people see you guys as F gods
meanwhile you're cursing how much time you need to spend getting the most out of F
you're at the point where you're adding / extending / fixing / improving F
you start digging google early this time: "where did people go after using F"
you're promoted to upper management now due to your foresight
google now reveals the dark side of F. sure most cases are quirky but apparently it's common.
they also say how F works well but past G it's no longer a valid option
instead they all said they eventually went over to C praising it as a god language
you definitely know C. it's shown up in all your other research on where to go next
you start immediately. you throw devs at the problem to get it working
testing, re-testing, rewriting, optimisation. your team finds every nook and cranny
eventually you get there and things run much faster now
as you're now low on options, the main thing is to make sure you don't get these problems again

Phase 5:
the tipping point strikes back! you're in management and cbf.
throw more hardware at the problem. nothing else will help anyhow
complain how hardware isn't design for modern software patterns

Phase 6:
your competitor has overtaken you
they've managed to get past the tipping point you thought was insurmountable
one of your devs mentions they're using V which is the current flavour of the month
you give the blessing to start a small project using V. it'll be a small release only.
he's got an idea that seems unique in its features compared to the competition.
the main thing is to be first to market with your features
second to that is to get the word out and getting everyone using it
you give the go ahead for it to start...

the take away is to plan ahead and use the right tool for the right job

Re:This is an extremely important accomplishment. (1)

Kjella (173770) | more than 3 years ago | (#36406970)

But efficient in man-hours? I prefer C++/Qt, and without Qt on top I'd probably drop C++ entirely. Even with Qt there are many things I miss, or that I think work illogically, which are the way they are because C++ was designed in 1979 as an extension to C, designed in 1973, and C++0x only adds toppings. The standard library is completely barebones compared to what you'd expect from Java, C#, or any other modern language in 2011.

Honestly, so many defaults should have been switched for sanity. For example, I'd make all numbers default to 0 instead of uninitialized, and if that one write kills your performance I'd give you an uninitialized keyword instead. There are plenty of such things where, yes, the current behavior might save you one cycle, but it creates more bugs and more hassle than that one cycle is worth.
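The kind of bug the default-to-0 proposal would prevent, as a contrived C sketch (the same hazard exists in C++):

    #include <stdio.h>

    int main(void)
    {
        int total;              /* uninitialized: reading it is undefined */
        for (int i = 0; i < 10; i++)
            total += i;         /* may print 45, may print garbage */
        printf("%d\n", total);
        return 0;
    }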

Re:This is an extremely important accomplishment. (1)

sneakyimp (1161443) | more than 3 years ago | (#36407448)

You assembler snobs make me sick.

Re:This is an extremely important accomplishment. (0)

Anonymous Coward | more than 3 years ago | (#36409966)

You will never see a digital computer of any current (von Neumann, non-von Neumann, Harvard, etc.) architecture implemented in graphene. I'll put money on it. The problem is that you can't make synchronous logic (or probably asynchronous logic) big enough to make a processor but small enough to avoid synchronization problems at these frequencies.

Re:This is an extremely important accomplishment. (1)

Twinbee (767046) | more than 3 years ago | (#36406944)

Trolling against whom? I thought it was obvious that these languages were slower than, say, C#, C++ or Java. They make up for it in terms of flexibility (dynamic code generation) and ease of use, but I thought that was also taken for granted.

Re:This is an extremely important accomplishment. (1)

Hartree (191324) | more than 3 years ago | (#36406736)

It's certainly useful for analog systems.

Now we just need something with that kind of electron mobility that still has a band gap so it can be shut off for digital.

Re:This is an extremely important accomplishment. (4, Interesting)

lurgyman (587233) | more than 3 years ago | (#36406774)

Well, let's not get ahead of ourselves. A mixer is an analog circuit, and silicon carbide is an expensive substrate to work with (very high processing temperatures), so it is typically only worthwhile for high-power analog devices. There is no discussion about anything digital in this article, so this is not related to programming languages or computers. Many analog devices have been made beyond 100 GHz on plain old silicon too; graphene on SiC may be important by enabling greater power density at high frequencies. As a microwave engineer, I'm excited about this, but this needs to happen on an inexpensive IC process for very small devices to be useful for digital circuits.

Re:This is an extremely important accomplishment. (0)

Anonymous Coward | more than 3 years ago | (#36408274)

Did they give any indication of how broadband it was? If it was DC to 10GHz (or even 50% fractional bandwidth), I'm impressed.

Re:This is an extremely important accomplishment. (2)

Jahava (946858) | more than 3 years ago | (#36406882)

I don't think that the article goes into enough detail about just how important this accomplishment is. Frankly, this is our only hope going forward. With so much slow software written in languages like Ruby and JavaScript becoming popular, it will again fall back to the hardware guys to really make things fast again. This will probably be the way they'll do it!

While I agree with your statement that this is likely incredibly important, your concept of the state of software is absurd.

Non-specialized (e.g., consumer-grade) software platform choices - language, compiler, interpreter, execution environment, and operating system - are made largely based on the current hardware status quo of the typical software user. If hardware (CPU, GPU, network, etc.) continues to get faster, software will be written to complement that hardware. The second hardware becomes a limitation, software will back off, trim down, and optimize. While the factors behind it are too numerous to fully detail in a post, the key idea here is that waste and bloat can often be the consequences of a tradeoff for functionality, stability, speed, and development time, to the net benefit of the consumer.

A good example of this is the Android operating system. In the midst of gigantic browsers and bloated (not necessarily in a bad way) operating systems running on the latest quad-core beasts, an operating system derived from a high-performance kernel (Linux), set of libraries (libc, etc.), and high-level runtime (Dalvik VM) was created to specifically scale graphical and operational code to an open embedded platform. Everything running on Android could easily have been run on computers 10 years ago, yet it's currently a bleeding-edge development platform.

Another example is a modern high-profile web page/application. Consumer-grade Javascript-intense pages (Google Maps, etc.) often provide lightweight alternatives for smartphones and netbooks. The user consumes what they can handle and no more, and this results in a positive experience.

Just remember: software scales to meet the demands of its consumers and the capabilities of its hardware platform.

Re:This is an extremely important accomplishment. (1)

osu-neko (2604) | more than 3 years ago | (#36408792)

The second hardware becomes a limitation, software will back off, trim down, and optimize. While the factors behind it are too numerous to fully detail in a post, the key idea here is that waste and bloat can often be the consequences of a tradeoff for functionality, stability, speed, and development time, to the net benefit of the consumer.

lol -- meanwhile back in the real world...

Re:This is an extremely important accomplishment. (1)

amusenet (2084500) | more than 3 years ago | (#36414540)

I like the way you wrote "consumer grade", as if there is another

Attention Rob Malda!!! (-1)

Anonymous Coward | more than 3 years ago | (#36406618)

Attention Rob Malda!!! I need to inform you that last night after we hooked up at the glory I realized I left a beer bottle and my iPhone 4 up your anus. Can you please contact me at 517-837-5309. Thanks! I hope I can get to suck on that tiny pee pee again sometime soon.

Gigahertz are useless... (2)

spyked (1878060) | more than 3 years ago | (#36406742)

...without efficient static memory, mostly because of the CPU-memory gap. A faster CPU would require the memory and the bus to keep up at a similar frequency. That's already a problem, and even if it were possible, it would lead to increased power consumption using dynamic RAM, and frankly, I think that's the last thing we need.

So faster CPUs will only be a viable alternative when we manage to get something like those memristors they keep talking about. Until then, it's larger caches and higher-frequency DRAM.

Re:Gigahertz are useless... (0)

Anonymous Coward | more than 3 years ago | (#36406822)

Oh noes. My giaghurts have been stoled.

Re:Gigahertz are useless... (1)

damnfuct (861910) | more than 3 years ago | (#36408036)

h4x0r5 on teh yu0r PC?

Re:Gigahertz are useless... (0)

Anonymous Coward | more than 3 years ago | (#36411334)

We must go to hacker scene where bad guy has your hurts.

Re:Gigahertz are useless... (1)

parlancex (1322105) | more than 3 years ago | (#36406858)

Kind of. I agree memory bandwidth is important, especially for parallel computing problems, which can be very demanding of memory bandwidth; but for modern processors consisting of 4 or even 8 cores you'll see very little gain in performance when increasing DRAM frequency. The reason for this is that modern processors have a lot of (ostensibly wasted) die space that allows them to make intelligent decisions in prefetching, register renaming, and yes, caching.

Re:Gigahertz are useless... (5, Interesting)

Anonymous Coward | more than 3 years ago | (#36407200)

Wow, you're full of shit. Graphene isn't useful for digital circuits at all (at least yet) because it has crappy on-off ratios, but GHz are most definitely useful for radio work.

Remember how everybody's using their mobiles for everything these days, and streaming video keeps getting more popular and higher bitrate? Well, when you run out of spectrum below 5GHz (where all mobile networks currently operate), getting up into the 10-100GHz range is extremely useful to provide that extra bandwidth.

Since nobody but actual electrical engineers seems to know anything about radio anymore (used to be a common hobby for geeks, but I guess it's not "cool" anymore), let me explain one application of a mixer like the one described in TFA. You can use it to make a transverter, which takes a signal from your UHF radio (maybe a mobile phone, wifi card, whatever), and kicks it up to ~10GHz for transmission. And flips received signals back down to the original ~2GHz band.

Not impressed? Sure, 10GHz isn't much, we can easily beat that already -- it's only a prototype. But it's quite likely we'll have 100GHz-1THz in the lab inside a year, and on the market in ~5. There's a whole lot of bandwidth north of 30GHz, and (as long as you stay out of absorption bands like O2 at 60GHz, which limit you to short-range stuff like wifi/bluetooth replacements), it's eminently usable for urban cellular networks -- if you have the ICs to handle it.

Re:Gigahertz are useless... (0)

Anonymous Coward | more than 3 years ago | (#36408090)

Mod the AC parent up. He is the only one here so far who knows what he is talking about.

Re:Gigahertz are useless... (0)

hairyfeet (841228) | more than 3 years ago | (#36410012)

But what EXACTLY will that do to us and/or the environment? I really REALLY don't like how this push for everything to be on the cell doesn't seem to involve any real consideration of long-term damage; it is strictly "LOL need more GHZ for FB LOL!".

It is already beginning to look like colony collapse disorder, which is wiping out beehives all over the place (you know, those things without which crops don't get pollinated and we all starve? yeah, those things), is being affected by cells, which may even be the root cause of the collapse (place a cell near the hive and the hive wanders off and doesn't come back; we still don't know for sure why, but it may have something to do with their internal ability to tell ranges), and here you are talking about 1THz ranges? Maybe, God forbid, people could actually wait until they get fucking home to watch YouTube?

Re:Gigahertz are useless... (0)

Anonymous Coward | more than 3 years ago | (#36410320)

PRAY they don't start using the new 500THz emitters. They can broadcast over 20 miles through the atmosphere and still diffract around corners. I even heard they affect the visual cortex, causing shapes to appear when exposed to them. There's a huge satellite just launched that emits huge amounts of it, over 1kW/m^2 power density by the time it reaches your cellphone. At that power it could affect plant life, people, anything. It's probably what's causing the bees to die.

Re:Gigahertz are useless... (0)

Anonymous Coward | more than 3 years ago | (#36410576)

Hey, but if you could watch a Youtube video on the go about us fucking up earth in Full HD... wouldn't that be just AWESOME?

It's a diode! (4, Informative)

bughunter (10093) | more than 3 years ago | (#36407016)

The circuit the team built is a broadband radio-frequency mixer, a fundamental component of radios that processes signals by finding the difference between two high-frequency wavelengths.

Did someone paid directly by IEEE write that? "Two high-frequency wavelengths?"

The device is a nonlinear summing element. In other words, it has a transfer function of the form y=Sum(ax^n) for integer values of n from zero to at least 2. A very common example is a diode. But it could also be a transistor in the saturation region, or something more esoteric.

Due to the nonzero second-order transfer-function coefficient, the device provides not only the superposed sum of the two signals at their original frequencies, but also components at the sum and difference of the two input frequencies. Add filters to throw away the parts you don't want, and you can make a modulator, a frequency upconverter, or a downconverter... all of these are used every day inside things you probably have in your pocket or purse, from cellphones to car stereos, television receivers to communications satellites.

But basically, it does the same thing a diode does... just faster.
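To spell out where the sum and difference terms come from (standard trigonometry, not specific to this device): feed two tones into the second-order term of that transfer function and expand.

    x(t) = \cos(\omega_1 t) + \cos(\omega_2 t)

    a_2 x^2(t) = a_2 \left[ \cos^2(\omega_1 t) + \cos^2(\omega_2 t)
                 + 2\cos(\omega_1 t)\cos(\omega_2 t) \right]

    2\cos(\omega_1 t)\cos(\omega_2 t) = \cos((\omega_1 - \omega_2)t) + \cos((\omega_1 + \omega_2)t)

So the quadratic term alone contributes components at the difference and sum frequencies; the filters then select whichever one the radio needs.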

Re:It's a diode! (2, Funny)

Anonymous Coward | more than 3 years ago | (#36407196)

all of these are used every day inside things you probably have in your pocket or purse, from cellphones to car stereos, television receivers to communications satellites.

Lesse:

1) cellphone: yup!

2) car stereo: well, I don't carry a purse, but if I did, I guess so.

3) television receiver: yup, though an interesting one would probably warrant a purse in place of a pocket.

4) communications satellite: erm, here's where I run into the problem of one fitting in my pocket, or a purse.

So, is this "suggest three plausible things and the forth implausible one will be blindly accepted as well" day?

Re:It's a diode! (0)

Anonymous Coward | more than 3 years ago | (#36409394)

D'oh! There's clearly not enough room to carry both the communications satellite and the cellphone. When you think about it, you'll see that if you are already carrying the communications satellite, there's not much point in carrying a cellphone.

Re:It's a diode! (1)

artor3 (1344997) | more than 3 years ago | (#36407750)

I think you're being unfair to the writer. "Finding the difference between two high-frequency wavelengths" is an accurate* description of a downconverter, which is a likely use for this device. And saying it's no different from a faster diode is like saying a GaAs transistor is just like a faster vacuum tube.

* Okay, technically, using the word "wavelength" to describe the signals is a bit ...off. But it's close enough.

Re:It's a diode! (2)

Interoperable (1651953) | more than 3 years ago | (#36407778)

They happen to be using it as a mixer, but the article clearly says that it's a FET (which certainly qualifies as a non-linear device). It might not be suitable for digital logic yet, but it is a transistor I believe. Also, 10 GHz for a proof-of-concept is damn fast.

"What hath God wrought?" (0)

Anonymous Coward | more than 3 years ago | (#36407964)

Wow! A superhet! Now I can mix 30 GHz down to 300 kHz and get some bounce out of what's left of the ionosphere, or is it the ozone layer? I always get those two mixed up. Can hardly wait to hook up the ol' telegraph key and send a message overseas.

Little ambitious to call it a graphene IC (0)

Anonymous Coward | more than 3 years ago | (#36408222)

The only part of the circuit that is "graphene" is the mixer, which is just a transistor. The rest of the circuit (resonators, interconnects) is metal. I do not see how this is an improvement over their previous work on the 100GHz transistor.

More than binary? Wave propagation and harmonics. (1)

iiiears (987462) | more than 3 years ago | (#36409124)

Will this allow a different way to signal other logic gates? Is it worthwhile to think about the frequencies that are usually discarded? It seems to me that two or more gates might reinforce each other enough to trigger a third that wasn't directly linked. Could you do something with the extra information?

PMMA (1)

Ramble (940291) | more than 3 years ago | (#36411398)

The article mentions PMMA and resist for use in electron beam lithography, yet PMMA is THE resist used in e-beam.