Slashdot: News for Nerds



C Alive and Well Thanks to Portable.NET

michael posted more than 10 years ago | from the alive-and-well-anyway-in-case-you-hadn't-noticed dept.

Programming 582

rhysweatherley writes "So C is dead in a world dominated by bytecode languages, is it? Well, not really. Portable.NET 0.6.4 now has a fairly good C compiler that can compile C to IL bytecode, to run on top of .NET runtimes. We need some assistance from the community to port glibc in the coming months, but it is coming along fast. The real question is this: would you rather program against the pitiful number of APIs that come with C#, or the huge Free Software diversity that you get with C? The death of C has been greatly exaggerated. It will adapt - it always has."



FOR THE LAST TIME... (4, Insightful)

gid13 (620803) | more than 10 years ago | (#8566317)

...stop telling me things are DYING, maybe let me know when they're DEAD.

Re:FOR THE LAST TIME... (4, Funny)

MukiMuki (692124) | more than 10 years ago | (#8566389)

Isn't this the part where a troll brings in the "NetBSD is dying article" with C as the replacement modifier value? C'mon, they HAVE to have an atomatic generator by NOW.

Re:FOR THE LAST TIME... (0, Redundant)

MukiMuki (692124) | more than 10 years ago | (#8566398)

Atomatic generators, of course, allow you to change a text document's protons and electrons; it's Nano-Programming(tm)!

Okay, I'm an idiot and this is the second time in as many posts that I've made a stupid spelling error that will haunt me for many replies to come. I figure this time I'll get the jump on 'em.

Re:FOR THE LAST TIME... (-1, Redundant)

Anonymous Coward | more than 10 years ago | (#8566425)

I just wanted to see this subject again..

Re:FOR THE LAST TIME... (-1, Offtopic)

Anonymous Coward | more than 10 years ago | (#8566438)

Natalie Portman and grits. That's dead.

What about C++? (4, Informative)

ace123 (758107) | more than 10 years ago | (#8566322)

Isn't C++ widely portable while giving most if not all of the features of C# (except for being interpreted)?

Re:What about C++? (2, Flamebait)

lpontiac (173839) | more than 10 years ago | (#8566375)

Notice that the KDE camp are humming along quite happily with qt and C++. Everyone clamouring for Mono seems to come from the "just C thanks" GNOME community.

Re:What about C++? (3, Informative)

H4x0r Jim Duggan (757476) | more than 10 years ago | (#8566499)

One of the design goals of GNOME was to support as many programming languages as possible.

I'm not very familiar with KDE's language binding availability right now, but I know that being written in C++ would make it more difficult to provide alternate bindings. C, being the lowest common denominator of programming languages, is simple to create alternate bindings for.

language bindings Re:KDE, language support (3, Informative)

aaron_pet (530223) | more than 10 years ago | (#8566516)

There are python bindings for kde, as well as many others.

Re:What about C++? (3, Insightful)

davebrot (464549) | more than 10 years ago | (#8566381)

True enough, but that's not really the point of the posting. Lots of people know how to program C and not C++. Regardless of how one feels about procedural vs. OO languages, a .NET runtime for C does demonstrate the hardiness (or maybe just the deep entrenchment) of C.

Re:What about C++? (5, Insightful)

condensate (739026) | more than 10 years ago | (#8566444)

True, and C++ is more than a better C. But how in the world do you do number crunching with a bytecode language? I tried to do so once in Java, and I'm NEVER going to do it again. Compilers do exploit the specific processor, and VMs can do so too, but why should I introduce a level of complexity when I just want my processor to calculate things for me? It doesn't get easier, just more portable; then again, C++ seems fairly portable, using templates all along and letting the compiler do the nasty stuff. For C this means macros. But hell, isn't it great when you actually know what happens? If you start out on some bytecode language, you have no idea about the basics of your system. How can you program that system then?

And for the anti-FORTRAN faction: it is still the fastest thing out there!!! Anyone who has tried solving a system of 1000 linear equations knows what I mean. My eyes still pop out when I see a FORTRAN subroutine at work that will do the job in seconds on a normal PIII desktop.

So please stop this thing about dying languages which are in fact not dying but a little hard to cope with. This doesn't make them old. It's just that some people don't want to go through the trouble of learning them, yet they are simply too good to be left out.

Re:What about C++? (0)

Anonymous Coward | more than 10 years ago | (#8566509)

No. C++ is such a complex and bloated language that not one compiler implements all of its features, except possibly gcc.

Really, C is much simpler and more elegant. C++ programmers tend to think of programming as a feature contest, and proficiency is measured by how many features you can memorize. It's much more geeky to use a simple orthogonal language like C, and to get creative in how to use it.

Please stop calling C++ portable (5, Insightful)

Fuzuli (135489) | more than 10 years ago | (#8566534)

Please don't. Yes, C++ as a language has compilers for many platforms which are pretty much compatible, but the degree of compatibility of these compilers doesn't mean much, since the portability of an application is a totally different story. An application written in C++ will be using some kind of library for DB access, for GUI, for network operations, etc. Most of the time, these libraries are not cross-platform, or they have to be extended with platform-specific code. It has been discussed on /. many times; check it out yourself: cross-platform GUIs, cross-platform libraries, and there is almost always a catch in all the solutions.
The story may change if you are writing C++ code that can stay within some kind of boundary, without using much library code, but unfortunately I did not have that chance.
IMHO, Java is really successful for cross-platform software development; without much work I can make Java software run on another platform.
If C# had the same future I'd be really glad, since I like it too, but as Microsoft works harder and harder on .NET I just don't believe the Mono guys can keep up with it. C# 2.0 and Longhorn will be a huge step forward for .NET technologies, and I don't think the Mono team can find the resources to keep up with MS.
Don't get me wrong, I love the work they've done, but the result will be a platform inferior to Java 1.5 and .NET.
So I'll be using C++ for platform-specific, high-performance apps, C# for Windows apps that require rapid development, and Java for cross-platform. That's my 2 cents...

C is dead (-1, Troll)

Anonymous Coward | more than 10 years ago | (#8566323)

Dude, give up. The language is old. Java is struggling against modern languages like C# that were created with Web Services and the proliferation of web applications in mind.

So... (1)

Raindance (680694) | more than 10 years ago | (#8566325)

I'm not a programmer.

Is it a good thing or a bad thing that C is alive?

Re:So... (0)

Anonymous Coward | more than 10 years ago | (#8566340)

I'm not sure what you're asking, but nothing happened yet, and no disasters will occur in the near future.

To be sure, check Google News. (4, Funny)

Futurepower(R) (558542) | more than 10 years ago | (#8566416)

However, you should check Google News frequently in case the world ended and no one told you.

Re:So... (4, Informative)

Zardus (464755) | more than 10 years ago | (#8566376)

Linux is written in C, SDL is written in C, X (I think) is written in C. The GIMP is written in C (along with GTK). Gaim is written in C. There are almost 13,000 projects on SourceForge that are registered as being written in C.

C is neither bad nor dead (not that it doesn't have its problems). Whoever wrote this article and the previous one about it on slashdot is a moron.

Re:So... (5, Informative)

Foole (739032) | more than 10 years ago | (#8566450)

Whoever wrote this article and the previous one about it on slashdot is a moron.
No one in this article or the other one actually said that C is dead. This is another case of a quote being taken out of context. The original quote was "To me C is dead." which has a very different meaning.

Re:So... (5, Insightful)

GnuVince (623231) | more than 10 years ago | (#8566546)

What's more, Miguel de Icaza (the guy who said that) was talking about user applications. Unless someone can explain to me why a garbage collector for an IRC client or a payroll application is a bad thing, or why I should fear buffer overflows, problems with pointer arithmetic, etc. in those kinds of applications, please tell me. Does the programmer need to spin my hard drive backwards or something? It seems to me that high-level languages will do just fine for those tasks.

You know the old adage, "Use the right tool for the right job?" Well, use C when you need it. C is probably the most misused language I've ever seen. But of course, this is Slashdot, the land where opinions are forged and are never to change.

Re:So... (0)

Anonymous Coward | more than 10 years ago | (#8566457)

Somehow I doubt that a Linux kernel would compile in C.Net, much less run, much less run well. But maybe bits and pieces could be used to provide a compatibility layer for windows .NET crap.

Re:So... (0, Troll)

js3 (319268) | more than 10 years ago | (#8566406)

This project only proves that C is dead. When a language has to piggyback on another or come up with these weird combinations, you know it is on its way out. I recently used C# for something that would have caused me many headaches just debugging with plain C.

Re:So... (4, Insightful)

j-pimp (177072) | more than 10 years ago | (#8566426)

When a language has to piggyback on another or come up with these weird combinations, you know it is on its way out.

With that logic, every language with a .NET version is dead. First of all, there are plenty of projects that have been and will continue to be written in C and compiled into good old unmanaged binary executables that execute without any of this newfangled bytecode. Also, the whole point of .NET and the JVM is compile once, run anywhere; Java and Foo# are just languages well suited for such tasks. Programmers always find a way to use whatever weird language, library, or methodology with whatever new technology.

Re:So... (3, Insightful)

mpmansell (118934) | more than 10 years ago | (#8566440)

So, by that argument, shouldn't Java and C# both have been stillborn? Both piggy back on VMs and are the same 'weird' combination?? :)

Communix (0, Funny)

Anonymous Coward | more than 10 years ago | (#8566327)

Coming soon... Put your blood, sweat, and tears into this Unix variant so that it can benefit your fellow comrades. No pay, long hours, and the glory of the state awaits you.

fuckiin a (-1, Offtopic)

Anonymous Coward | more than 10 years ago | (#8566330)

im fuckin drunk

Adaption, but.. (0)

Debug This (702664) | more than 10 years ago | (#8566334)

Sure, it will adapt -- all languages do. But wasn't this thought of FORTRAN and COBOL too? It is a little short-sighted to say that C is indestructible; its variants will (and have) come, but eventually it will be phased out.

Remember when punch cards went obsolete?

Re:Adaption, but.. (1)

xenocyst (618913) | more than 10 years ago | (#8566352)

sure punchcards went obsolete, and _someday_ C will too, but like BSD, it is dying, and will be for many many years to come... ;)

Punch Cards (5, Insightful)

Detritus (11846) | more than 10 years ago | (#8566355)

Punch cards did not disappear, they just became virtual.

FORTRAN and COBOL are still in wide use, even if they aren't as popular as they once were.

Virtual punchcards are like virtual brains (0, Redundant)

John Courtland (585609) | more than 10 years ago | (#8566543)

I hated MVS. I hated COBOL. I hated ASSIST. I most DEFINITELY hated JCL. All due to the restrictions from those damn virtual punch cards. Column 8 my ass.

Re:Adaption, but.. (1, Funny)

Anonymous Coward | more than 10 years ago | (#8566359)

Remember when punch cards went obsolete?

No, I'm not a dinosaur.

Re:Adaption, but.. (2, Funny)

jayjaylee (684876) | more than 10 years ago | (#8566370)

Remember when punch cards went obsolete?

Yes. :'(

I'm old enough where my toddler drinks Pediasure, my wife drinks Boost (to give her nutrients during her pregnancy), and I'm on Slimfast.

I'll cry myself to sleep tonight ...

Re:Adaption, but.. (2, Insightful)

irokitt (663593) | more than 10 years ago | (#8566372)

I still use punch cards, you insensitive clod!
As for Fortran and COBOL, Fortan is still an entry requirement to the California college system, and COBOL is still everywhere - deeply embedded into the payroll and employment operations of many businesses. And there are still vestiges of punch cards too - Scantrons and the like. So things don't die as easily as you might think. Much as we sometimes wish them away, they hang on.

Re:Adaption, but.. (2, Informative)

stephentyrone (664894) | more than 10 years ago | (#8566413)

"Fortran is still an entry requirement to the California college system"

where can I get some of whatever you're smoking? there are classes where it's used, for sure, but "entry requirement"? no.

Re: Adaption, but.. (1)

Black Parrot (19622) | more than 10 years ago | (#8566384)

> It is a little short sighted to say that C is indestructable; its variants will (and have) come, but eventually it will be phased out.

Old languages don't die; they just fade away. Surely there's still some Algol 68 running out there somewhere?

I suspect we should be discussing the half-lives of languages rather than their lifetimes.

Re:Adaption, but.. (2, Interesting)

blowdart (31458) | more than 10 years ago | (#8566464)

Well, if producing a CLR version is proof of life (and how exactly do they provide C pointers when every object is supposed to be by reference anyway?), then COBOL is alive with Fujitsu COBOL.NET, and Fortran has two zombies, with ftn95 and Lahey/Fujitsu Fortran.

Who would have thought that a mainframe manufacturer would keep propping up dead languages? <g>

Whilst Algol isn't there, Oberon is, as is Ada, a shareware version of Forth, Haskell, Eiffel, Pascal, Perl, Python (twice) and Smalltalk.

Re:Adaption, but.. (1)

ace123 (758107) | more than 10 years ago | (#8566503)

And also there's Whitespace.

Re:Adaption, but.. (1)

usrusr (654450) | more than 10 years ago | (#8566484)

i really don't think c will ever be the language to run on top of a bytecode layer. there are enough languages which at least look like variants of c that are much better suited for those projects that are suspected of being run inside a bytecode box.

instead, what i could imagine in a future in which nearly everything goes the bytecode way is c one day being the only remaining language for the down&dirty stuff that is compiled to native machine code, although certain variants of c are unlikely to ever move completely to bytecode land.

Re:Adaption, but.. (5, Insightful)

csirac (574795) | more than 10 years ago | (#8566495)

Saying C will die out is like saying that assembler will die out.

There will _ALWAYS_ be parts of an Operating System, hardware-oriented realtime or embedded app that _needs_ to be close to the metal. C/ASM is predictable, consistent, flexible and fine-grained in the things you can do with it. You certainly don't want a time critical interrupt handler routine that is supposed to be done in 5ms to suddenly decide that it needs to do some garbage collection or page in some hashing function to access an array of some sort.

Plus, C is great because it isn't assembly.

Even then, sometimes you just gotta write some ASM.

Sure, someone might make a "better" C that has similar goals (structured around ASM-style thinking rather than human-style thinking), but if they did it surely would be some incarnation of C. Compare traditional K&R C with the current features of GNU C (hooray for structure member tags!) or even the ANSI C99 specification.

Even though there has been no great change in the approach to programming itself (compare to LISP, Haskell or Perl), C has nonetheless seen continuous improvements along the way, from language and data structure standards to libraries, compilers, debugging tools, code profiling, and so on.

I find it hard to believe that we're going to have OS-level DMA transaction code written in Java or C#.

I once read in a visual basic for dummies manual (or was it Delphi?): "Trying to write an Operating System in Basic is like trying to fly to the moon in a hot air balloon".

At some point, you've _GOT_ to talk to the hardware.

- Paul

Re:Adaption, but.. (1)

blowdart (31458) | more than 10 years ago | (#8566501)

I find it hard to believe that we're going to have OS-level DMA transaction code written in Java or C#.

Who says that's what it's for, though? Consider a large set of legacy libraries doing data munching, file I/O, or other stuff which could be rewritten, but no one has the time or the documentation. Now whack it into a CLR object, then call it from whatever language you like. You've suddenly migrated your old codebase.

Just because you use C doesn't mean you HAVE to talk to the hardware.

Re:Adaption, but.. (1)

csirac (574795) | more than 10 years ago | (#8566545)

I was responding to the idea that C would be phased-out somehow, nothing to do with C for .NET. I guess I was being OT...

Just because you use C doesn't mean you HAVE to talk to the hardware.

I'm sure C for .NET will have its uses, it's just that I'm more of a hardware guy so that's why I'm trying to imagine a world without native C.

Other than re-using legacy C code as you've suggested, I would think the advantage of using a bytecode environment like .NET is in using the new languages and features it offers, instead of writing even MORE C with leaky functions and void pointer arithmetic :-)

Re-using C code for use from a different language is a good idea though.

Re:Adaption, but.. (1)

blowdart (31458) | more than 10 years ago | (#8566559)

Re-using C code for use from a different language is a good idea though.

That's one of the nicest things about .net (imo), the cross language object support. Need Perl? Whack up a Perl object, then use it in C#

Er.. (5, Insightful)

iswm (727826) | more than 10 years ago | (#8566336)

C is still alive and kickin' in the *NIX community, I'd say. It seems it's really just Windows where other languages (C++, C#) seem to be taking over. Just because C isn't being used much in the Windows world doesn't mean it's dying or is going to die anytime soon.

Re:Er.. (0)

Anonymous Coward | more than 10 years ago | (#8566535)

I nearly always use C when I'm programming on Windows. The Win32 API is straight-C friendly (structs, functions) without much need for C++ classes or OOP. Speed is important to me and I don't code in C++ unless it's absolutely necessary.

C?? (4, Funny)

Neo-Rio-101 (700494) | more than 10 years ago | (#8566339)


Yeah, just kill it off already... I wanna go back to using Commodore 64 BASIC.

Wow! (4, Funny)

stevens (84346) | more than 10 years ago | (#8566342)

All the advanced language features of C with all the speed of an interpreted VM!

Can I get them to compile asm to java bytecode next?

Re:Wow! (1)

metlin (258108) | more than 10 years ago | (#8566374)

That is one of the most insightful yet really funny comments I've ever read on Slashdot. Simply fabulous!

Re:Wow! (4, Interesting)

minus_273 (174041) | more than 10 years ago | (#8566385)

"Can I get them to compile asm to java bytecode next?"

Not as funny as you think. It would be a truly awesome program that could do that. Why? Take MS Office, disassemble it, make it Java bytecode, then run it on the platform and OS of your choice.
I don't see anything like that happening for a while, but it certainly is not funny.

Not that simple (4, Insightful)

xswl0931 (562013) | more than 10 years ago | (#8566536)

Of course, you'd also have to disassemble every library MS Office uses, and every library those libraries use, which includes the NT kernel. So by the time you're done, you'd be running Windows in a JVM just to run MS Office.

Yup that exists (4, Informative)

Julian Morrison (5575) | more than 10 years ago | (#8566407)

Re:Wow! (4, Informative)

tupshin (5777) | more than 10 years ago | (#8566435)

Takes compiled mips binaries and converts them to functional java classes.

Weren't you guys beaten to it by CNet? (3, Funny)

Ratface (21117) | more than 10 years ago | (#8566343)

Sorry - someone had to say it!

Ummmm.... (0)

Phidoux (705500) | more than 10 years ago | (#8566346)

With all the memory leaks in the .NET framework (I'm talking about the MS version of the .NET framework here) why would anyone, in their right mind, want to turn their C code into something that runs like crap?

Re:Ummmm.... (1)

BlueTrin (683373) | more than 10 years ago | (#8566356)

Nobody will notice that it is your app which is leaking memory under longhorn if it stays like the build 4053 =)

Because there was no more room in hell? (5, Funny)

Anonymous Coward | more than 10 years ago | (#8566348)

C lives on, driven by an insatiable, unreasoning, swarming hunger. Until the day when the seventh seal is broken, the sun dies, and all the languages are at last bound to its dark will. Then all of man, in the Doom of our time, will writhe in agony for a thousand years of darkness until the, strongly typed, Rapture casts the dark empire back into the pits of hell and scatters the damned to the winds.

Keep it real... (5, Insightful)

seanmcelroy (207852) | more than 10 years ago | (#8566353)

QUOTE: The real question is this: would you rather program against the pitiful number of APIs that come with C#, or the huge Free Software diversity that you get with C? The death of C has been greatly exaggerated.

Now what a spin. The .NET APIs are by no means 'pitiful in number', and they can be embraced, extended, and overridden as desired. C *can* adapt, but the point of a C#-based desktop system or development platform is not solely to exclude C, but to bring the benefits of managed code to other system consumers. C could adapt, but not without a lot of overhead and fundamental changes, which really is the point behind C#. I'm sure we'll be in a backwards-compatible, C-friendly world for a long time to come, but there's no reason to bash something new and different just because it is new and different. That's just FUD.

Re:Keep it real... (5, Insightful)

forgoil (104808) | more than 10 years ago | (#8566455)

Not to mention the fact that the developers of C# (i.e. the people developing the language, not with the language) made sure that one can easily make use of C and C++ code and binaries already in existence. You can already call all the C/C++ APIs.

I'm not sure why one would want to code in C instead of C# when developing for a .NET environment (be it Microsoft or Mono or something else). It can't be because you can't make use of already-written APIs (since you easily can call them), so we killed that point. It can't be because of the "speed" of C, since it will be running the same IL on the same CLR. We are basically down to the language of C itself.

So is it because someone feels that OO could be too big for very small devices (Java MIDP showed us this very clearly, since it is completely misplaced and awful on mobile phones)? I can buy that.

Or is it because of some form of hatred towards C#? That would be sad. The APIs for .NET are far better than any C API I've stumbled upon thus far (Win32, POSIX, Solaris, GNU, etc.), contain a vast amount of useful things, and easily call unsafe (.NET terminology, look it up if you don't actually know what it means) C/C++ APIs.

Re:Keep it real... (2, Funny)

Anonymous Coward | more than 10 years ago | (#8566460)

The real question is this: would you rather program against the pitiful number of security holes that come with C#, or the huge Security Hole diversity that you get with C?

C's not dead because nothing better.... (5, Insightful)

bangular (736791) | more than 10 years ago | (#8566357)

C has its problems. You could complain about C all day; the problem is, it's the best thing we have right now. One problem with modern languages is that you can't write an operating system in them. Another is that half the new languages are Perl/Python-like scripting languages, and the rest compile to bytecode. Maybe C would go away if there were a compiled language, not largely controlled by one company, that produced fast code and was portable. The closest thing to that besides C is C++.

Re:C's not dead because nothing better.... (2, Informative)

Ambassador Kosh (18352) | more than 10 years ago | (#8566508)

Python does compile to bytecode. It just does not require a separate step to compile to bytecode like java does. If the bytecode is out of date it will be recompiled and used automatically.
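This is easy to verify from the interpreter: CPython's compile() hands back a code object whose bytecode (the same thing cached on disk as .pyc files under __pycache__) can be inspected with the dis module. A minimal illustration:

```python
import dis
import types

# CPython compiles source to bytecode before executing it.
code = compile("x = 2 + 3", "<example>", "exec")
assert isinstance(code, types.CodeType)

# The compiler even constant-folds 2 + 3, so the code object
# stores the literal 5 rather than the addition:
assert 5 in code.co_consts

dis.dis(code)  # prints the bytecode instructions for the statement
```

The difference from Java is only workflow: the compile step happens transparently at import time instead of via an explicit javac-style invocation.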

Re:C's not dead because nothing better.... (0)

Anonymous Coward | more than 10 years ago | (#8566544)


(the point, flying right over your head)

Re:C's not dead because nothing better.... (1)

John Courtland (585609) | more than 10 years ago | (#8566558)

Uh, you can't run bytecode on a raw machine. C and assembler are what make the computer world run. You can't make Java in Java. C turns directly into executable binary (or object files then linked into executables); Java cannot. I suppose that if you were insane enough, you could make a bytecode-to-opcode converter, but then you lose 100% of the point of the language, probably a lot of the efficiency, and at that point you may as well use C.

Re: C's not dead because nothing better.... (1)

alice_in_cipherland (727512) | more than 10 years ago | (#8566567)

There are compiled languages that are arguably better than C, such as Eiffel, Modula-3, and D. These may be promoted by companies, but surely not controlled. C is popular because it's simple and powerful, is good enough, and has inertia. But that doesn't necessarily mean it's the best.

Everything old is new again... (1)

SuperKendall (25149) | more than 10 years ago | (#8566360)

An interesting trick, but it has been thought of before...

A more interesting question is why you wouldn't rather just use C on these various devices, which by their nature are constrained and lend themselves to code that squeezes all you can get out of them.

A C to .Net bridge won't help you if there's some native feature of the device with no Compact.Net library support.

And then of course there are the number of devices that support Compact.Net... wouldn't you be better off finishing up that C->Java compiler so you could write bytecoded C for things like the blackberry or sidekick or Treo?

Seems kinda astroturfy to me.

C is dead for me! (0, Flamebait)

phsdv (596873) | more than 10 years ago | (#8566363)

I promised myself that I will never program in C again. And so far, almost a year now, I am succeeding! I even think I've already forgotten what * and & mean. In the past I have programmed many different projects in C, including a very complex embedded system. But when I have the chance I will use a modern language like Python. Maybe it is slow(er), but the total time spent is so much less. But I don't have to tell that to /., right?

Huh? (3, Funny)

drgonzo59 (747139) | more than 10 years ago | (#8566365)

I wonder if Microsoft can then compile the .NET framework into IL and run .NET (on top of .NET)* ?

In the meantime I'll just risk being labeled "old-fashioned" and compile C straight to binary

Declaring "X is dead" is just a cheap shot. (5, Insightful)

Biotech9 (704202) | more than 10 years ago | (#8566369)

And it's done by someone with a new technology to get people talking about it. Look at all the debates and forum chatter that got sparked off by Intel's "Bluetooth is dead". "C is dead", "CISC is dead", "Apple is dead"...
When technologies really do die, it's when no one gives a damn about them, and so no one will be writing a story about it.

Insightful (5, Interesting)

SuperKendall (25149) | more than 10 years ago | (#8566387)

When you hear someone declare "X is dead" it usually means they have a vested interest in X actually dying, and wish to further that belief. Either that or it's more like a mafia situation where a statement like "X is dead" is more of a prediction with a strong likelihood - it all depends on the power of the speaker.

Re:Declaring "X is dead" is just a cheap shot. (0)

Anonymous Coward | more than 10 years ago | (#8566479)

Oh my god! Oh my God!

X is dead! Oh my, what are we all going to do, go back to the command line?

oh wait...

Right tool for the right situation. (2, Insightful)

Slayk (691976) | more than 10 years ago | (#8566383)

It seems to me (even with my limited knowledge of programming and software engineering) that when such statements are made about the death (or undeath... mmm... CZombie... "HEADERS... HEADERSSS") of a language, what gets totally ignored is that C# has its place fitting in with the .NET framework, C has its place in things like the Linux kernel (though that isn't near its only use), Java has its cross-platform niche, and so on.

Just because you can hammer in a screw if you try hard enough doesn't mean the screw driver is dead.

Broader Perspective (5, Funny)

VoidEngineer (633446) | more than 10 years ago | (#8566388)

Summary of argument to date (translated from geek-speak):

> Queens English is so dead.
> Yo, it's all about Ebonics.
> Dude, Southern Drawl is *soo* slow... Surfer speak is a way better language.
> Like, Valley Speak is, like, the best networking dialect to know!
> Well, if you want a job with a blue-chip company, go with Chicago Twang.
> I hear that they're porting the Queens English libraries to Chicago English, btw.
> See? Queens English is not dead...

Dialects, people... just dialects. Try to see things in the broader scheme of things. (punny, eh?).

C? Dead? (0)

Anonymous Coward | more than 10 years ago | (#8566390)

I think it's a bit silly to say that C is "dead." As a Christian, I won't write anything in C (obviously), but it's hard to call a language "dead" when there are billions of lines of C code out there and running everything from ATMs to nuclear power plants. What language do you think that your Linux kernel or your Windows XP internals are written in? How many processor-intensive image processing systems are written in C, and do you think that these systems are going to be ported to Java or BASH? C isn't going anywhere anytime soon. It has its problems as a language but it still remains a powerful tool if your particular belief system is compatible with its philosophy.

Re: C? Dead? (2, Funny)

Black Parrot (19622) | more than 10 years ago | (#8566423)

> As a Christian, I won't write anything in C (obviously) [...] and do you think that these systems are going to be ported to Java or BASH?

As a Christian, you should clearly support J4V4 in all things.

It's not dead. (4, Funny)

Teddy Beartuzzi (727169) | more than 10 years ago | (#8566392)

It's just pining for the fjords.

Re:It's not dead. (1)

Eric_Cartman_South_P (594330) | more than 10 years ago | (#8566488)

Pining for the fjords? What kind of talk is that?

*yawn* (2, Interesting)

sweepkick (531861) | more than 10 years ago | (#8566393)

I'll start worrying when I see entire OS's and their requisite device drivers written completely in a bytecode language.

*shrug*... bring it on.

Re: *yawn* (1, Interesting)

Black Parrot (19622) | more than 10 years ago | (#8566414)

> I'll start worrying when I see entire OS's and their requisite device drivers written completely in a bytecode language.

I don't suppose you'll like my idea for a metalanguage, which can be interpreted at run time into the bytecode language of your favorite bytecode interpreter?

Well.... (1, Funny)

Anonymous Coward | more than 10 years ago | (#8566402)

if we could only get a compiler that does what I think I'm doing instead of what I actually told it to do....then we'd have something

Let it die! (1, Flamebait)

Lux (49200) | more than 10 years ago | (#8566410)

We need C like we need more buffer overflow vulnerabilities!

Let the miserable wretch die, or overhaul it to be type safe.

There are some things that are better buried.

Re:Let it die! (2, Insightful)

zalas (682627) | more than 10 years ago | (#8566473)

Screw that! We need more programmers who are aware of vulnerabilities in systems and able to deal with them. Dumbing down a language at the expense of performance is only going to dumb down the programmers. Well, to a point. My ideal programming language would be something that allows me to do practically anything with it while leaving its internals exposed. You can program with a safe subset if you're beginning, but you can then expand to advanced programming without the language limiting what you can do.

Re:Let it die! (5, Interesting)

Temporal Outcast (581038) | more than 10 years ago | (#8566555)


Remember, a language does not cause overflows - careless and stupid programmers do.

C is built for low-level interfacing, and it's best suited for that purpose. It's lean and mean, and that's how it's meant to be.

If you want complex exception handling and all that, you are probably using the wrong language for the task.

Blame the people who used C for the wrong task, not the language.

It's Dead Jim (2, Interesting)

myownkidney (761203) | more than 10 years ago | (#8566431)

No it ain't!

I have never used, and will never use, these bytecode languages running on VMs. I want the minimum distance between my program and the machine instructions. Currently, C is the best language for this purpose.

Unfortunately, a lot of CS courses are teaching people the importance of "managed code" and "strong typing" etc. I say to hell with that. If I feel like messing with memory at AF345F12:BA231DCE then I shall do so. I don't want to hide behind "type safety". I know what I am doing.

I have no faith in this OO language crap either. The real world may be OO, but once your code is compiled, it is going to run as a sequence of statements: i.e., like an imperative language. Not that people haven't tried to build processors with OO machine code, but none of them caught on. I work mainly with the Intel architecture. It's not natively OO.

Long live C

Re:It's Dead Jim (1)

mpmansell (118934) | more than 10 years ago | (#8566522)

I agree with much of what you say, but it is unfortunate that these feelings need to be tempered in light of the other programmers who may be around :)

I am one of C's greatest fans and have even (in the dim distant past) gone as far as writing and using my own compilers. However, like most powerful tools it has its hazards. While more modern systems are usually managed so that a stray pointer is less likely to destroy your hardware, when I started this was not the case. If this were the only threat, then C would still be a good, but arcane, language for general use. However, the advent of the internet has opened up a whole new area of risk that many of the current crop of programmers are just not qualified to handle with a reasonable degree of safety. While changing industry attitudes to put more emphasis on professional qualifications may help, in the meantime the handholding that managed code provides can reduce the chances of many types of bugs occurring. (At the cost of speed and space - not very elegant to us old-timers.)

Not to put all the blame for needing "nanny languages" on inexperienced or poorly trained developers; management is often at fault. Constant spec changes and a reluctance to give proper provision for design, analysis and documentation lead to intemperate coding that breaks code security. Managed code allows their incompetence to slide past unaccounted for ;)

At a low level, I agree that OO is awkward, and that at many other levels it can lead to inefficient code, but even I am now beginning to see the value of OO for many applications. As with choosing C over another language, choosing OO over functional or procedural paradigms is a measured engineering decision. Hopefully made by engineers :)

Re:It's Dead Jim (1, Insightful)

Anonymous Coward | more than 10 years ago | (#8566556)

I want the minimum distance between my program and the machine instructions. Currently, C is the best language for this purpose.
Nope, assembly is the best language for that purpose (i.e. zero distance between your language and the machine instructions). Programming languages exist to abstract away from the details of the hardware. This allows you to write less buggy, more portable code that can be compiled to run efficiently on lots of different architectures. C code will never run efficiently on, say, a massively parallel computer, but Haskell code might, because its semantics aren't tied to a particular machine model.

Unfortunately, a lot of CS courses are teaching people the importance of "managed code" and "strong typing" etc. I say to hell with that. If I feel like messing with memory at AF345F12:BA231DCE then I shall do so. I don't want to hide behind "type safety". I know what I am doing.
Sure you do. It's a mistake to think that programming languages which have PDP-11-oriented semantics and weak typing are giving you more "power" or "control". Power comes from modular code and well-designed algorithms, and control comes from a language which supports these things by abstracting away from irrelevant hardware issues. Why should I have to think about pointers to iterate through an array? Why should I have to write my own buggy and inefficient memory management code when it could all be done by a finely tuned and heavily tested GC?

C is not even the only language you can use for writing low-level code. Operating systems have been written in Lisp, for example. Nor is it the only language which compiles to efficient machine code: ML, Lisp, etc. can all run at comparable speeds.

Embedded/Real-time systems still need C (4, Informative)

aarondsouza (96916) | more than 10 years ago | (#8566436)

There are a huge number of applications that have very stringent time constraints, especially in real-time control. Other than coding in assembly, there isn't any other language out there that is as efficient (both size *and* speed count) as well optimized C code.

As an example, our lab works with humanoid robots that run in a 5ms control loop, which means that the next command (computation of inverse dynamics, etc.) has to be ready in that timeslice. If you want to do fancier stuff like machine learning and AI, you'll have to squeeze in many more operations into that tiny window. Sure, additional processors are a plus, but you still need very fast and memory efficient code.

C is Dead (5, Funny)

WankersRevenge (452399) | more than 10 years ago | (#8566448)

I know. I killed him. I ran him down in my PHP-mobile while drag racing with those Ruby punks on their friggin' crotch rockets. At least C++ had the sense to step out of the way. I guess they were arguing about how their half-witted brother C# knocked up his half-witted twin sister, Java, and produced some hideous premature birth thingy they called Mono. I would have turned around and hit C++ had I not blown a module and had to stop. Those Ruby punks gave me the bird, but you wait and see. I got this new Zend nitrous which will knock the socks off those bad boys, but I don't know how to plug it in. Anyone got the number of a good mechanic?

Saying C will be killed by a runtime architecture (4, Insightful)

StevenMaurer (115071) | more than 10 years ago | (#8566461)

...is like saying too many buses will eliminate sports cars.

The C design paradigm (low level, varied environment, highly optimized, developer control) is intended to solve an entirely different class of problem than Business runtimes (higher level, standard interface, managed resource, developer handholding). The two aren't in competition much at all.

Nor do I think much of trying to put a racing wheel on a bus either. We already have C# and "Managed C++", both of which can look quite a bit like C, if you want them to. All you have to do is ignore that they're fundamentally different in the way they treat resources, due to the underlying runtime or lack thereof. (Which is like equating a bus to a sports car, ignoring the size and speed issues.)

Didn't RTFA but have some questions anyway :) (2, Interesting)

idiot900 (166952) | more than 10 years ago | (#8566462)

Does this include *all* of C? How do they compile the following C features into VM bytecode?

- Pointer arithmetic
- Hardcoded type sizes instead of using sizeof() (i.e. assuming sizeof(int) == 4)
- Lax rules for casting
- And so on

Re:Didn't RTFA but have some questions anyway :) (1)

ncaHammer (518236) | more than 10 years ago | (#8566541)

IL code has all of them, and you can enable them in C# in methods marked as unsafe.
In many cases unsafe-mode C# "behaves" like inline C. /library/en-us/csref/html/vcwlkunsafecodetutorial.asp

"Men Should Learn to Become Pregnant" (-1, Offtopic)

Anonymous Coward | more than 10 years ago | (#8566481)

Nalini Bannerjee, better known as the "Gender Blender" at FAIT, demanded from the men that they carry equal responsibility in childbirth. Addressing the crowds at the International Women's Day celebrations, she claimed that: "They just get us pregnant and expect us to carry big bellies around for 9 months while they go around happy with their lives. It is high time men started getting pregnant themselves"

Mrs. Bannerjee, 43, the much feared militant feminist who has fought for decades against gender discrimination, was outspoken in her criticism of the way society viewed motherhood as always being female duty.

"There's nothing stopping men from becoming pregnant or lactating. It is just a social issue. Women are groomed to be mothers, whilst men are taught to be fathers. But this is totally wrong", explained Mrs. Bannerjee.

"What is there that a man can do that woman can't do?" questioned Mrs. Bannerjee.

"Pee standing", someone in the crowd was heard to reply.

Have Your Say: Discuss Mrs. Bannerjee's Demand

What about f77? (1)

nate.sammons (22484) | more than 10 years ago | (#8566489)

When can I get an f77 to IDL compiler so I can watch an atmospheric simulator die on Windows? ;-)

Re:What about f77? (1)

blowdart (31458) | more than 10 years ago | (#8566494)

Will f95 [] do? Don't like that version? Try this [] instead.

Does it work with MONO? (1)

tgraupmann (679996) | more than 10 years ago | (#8566491)

So now you can write C code and it compiles for .NET. Does that mean it works for MONO as well?

Why C needs help (5, Insightful)

ttfkam (37064) | more than 10 years ago | (#8566504)

The real question is this: would you rather program against the pitiful number of APIs that come with C#, or the huge Free Software diversity that you get with C?
Or, read a different way: would you rather program to a uniform API for GUIs that is accessible to many languages including C#, or the huge Free Software diversity of GUI APIs that are all incompatible but still just make a button and a checkbox?

Or we can look at it like this: "Wouldn't it be better to have many different toolkits that allow string concatenation and tokenization than one standard library of string functions?"

Or maybe this: "Isn't it great that we have several different native APIs for threads, processes, and IPC depending on underlying platform, five different and incompatible implementations for cross-platform usage, and no way to easily switch between implementations after the project is underway?"

And next shall we talk about databases? Or maybe sound processing? Or regular expressions? Hmmm...

The thing that C zealots fail to recognize is the need for clean, standardized APIs (NOT implementations). If you write code that uses strncmp(...), aren't you glad that you don't have to worry if the C implementation is the BSD libc or glibc or MS Windows' C library? Don't you wish the same could be said for the user interface libraries -- for example being able to swap out the Qt or GTK+ implementations at compile or link time? Or the database libraries? (ODBC? Don't make me laugh.) But you can't because each implementation has its own interface even though a button is a button, a checkbox is a checkbox, a database connect is a database connect, a regexp is a regexp, etc.

This is what .NET gives: not a mandated implementation but, much better, a recommended interface. If the C folks get it together and standardize more than just things like printf(...) and linked lists, you will get no end of gratitude from me, and from the folks who are tired of reinventing the wheel and solving problems that were adequately solved twenty years ago. Unless that happens, you're gonna see more and more people moving to things like .NET and Java, warts and all.

POSIX was a good start, but it has stagnated and is showing its age.

Miguel de Icaza (1, Troll)

rixstep (611236) | more than 10 years ago | (#8566515)

Miguel de Icaza - this person is annoying. Many people write to me and tell me they suspect he is a Microsoft mole. Whatever: he's the guy who said Clippy is a good idea. Go figure.

What the world has right now is the following:

1. Native assembler. This is always a fall-back.
2. C. Great for writing operating systems. Capable of inline assembler as well, so efficiency is very high.
3. C++. I have my doubts. And I think its prevalence would not be as great were it not traditionally so difficult to use the next language on the list.
4. Objective-C. What Alan Kay always envisioned, but in compiled form. As long as we are using GUIs with widgets and gadgets, this will be the premier choice.
5. Java. Not native, but eminently portable.

In the context of the above, I am sorry, but .NET is totally uninteresting, Mono is even more uninteresting, C# is an abomination, and Miguel de Icaza is totally irrelevant.

Thompson, Ritchie, Cox, Gosling - these are great computer scientists. de Icaza is a fart.

Pointless (1)

NigelJohnstone (242811) | more than 10 years ago | (#8566517)

You could compile it to machine code and run it native, or you could compile it into any number of intermediate languages and run it with an interpreter.

Why would you do the latter? For what advantage?

C isn't dead, C++ isn't dying. .NET is the system that's really struggling.

Bug submission from banned contributor. (-1, Offtopic)

mdupont (219735) | more than 10 years ago | (#8566525)

Just wanted to point out that I have produced the most bug reports for pnet to date (currently 54 bug reports in the system [] ).

One good thing I have to say about Rhys is that he fixes bugs quickly!

My sometimes harsh and questioning nature that brought me to free software is sometimes too much for people to take.

I bitched and moaned a lot about dotgnu implementing patent-endangered code instead of following their original plan.

My complaining got me banned from the project.

Now, instead of accepting reports about this buggy software, reports that would warn people it is not usable on really challenging code (like libx11 or gcc) that I was compiling with it, the author Rhys tried to *gag* me. []

I complained to savannah about this here: []

In the end, he of course appreciated my valid bug reports and fixed them like all the other ones.

The argument that my bugs are not valid really doesn't hold water if all of them were fixed. The reality is that the software was lacking major testing, and my reporting of all these bugs reflected that unstable state.

The funny thing is the backpedaling you can find in the history here; the bug is turned from Invalid to Fixed!
"Mon 02/16/04 at 23:41 rweather resolution_id Invalid Fixed"

So I would really think twice before spending your time working on pnet/c, because the developers are trying to hide bugs from you! (The ones that I reported: real bugs.)


FUD again from MS zealots! (1, Interesting)

Anonymous Coward | more than 10 years ago | (#8566529)

C# is nothing more than the Java language, and .NET is nothing more than the Java platform. That's a fact.

Has Java killed C in the nearly a decade it has been here? No! So there is no doubt that a Windows-only platform (.NET) cannot kill a multiplatform language!

If there is a platform that has displaced C/C++ in lots of enterprise development, it is Java itself. Hey guys, look at the real world!

Java has replaced lots of C/C++ development just because it is much easier to set up and maintain with "average" skill (you get plenty of free solutions as well as bullet-proofed commercial solutions that fit every need).

Java is also the key to opening up the server-side OS, because by choosing Java an enterprise can shift to any supported OS it wants, depending on its TCO for instance. And Linux wins in most situations here!

So Java offers OS choice, and Linux the OS solution ;-)

Personally, I've worked with .NET on projects and I am really surprised by /. posts that claim "portability" to other OSes! MS has clearly put this point in front of the world: .NET will only be available on MS OSes. This means that the complete specifications will never be available. Hence you can never claim to be "compatible" with it, as you can never raise your compliance level to the one that enterprises require for support reasons.

I do not really understand the people working on Mono (which is nonsense, by the way). Why don't they go and help the FSF's Classpath project instead, if they want a really free implementation of an advanced language & VM? Hey, FSF! Know those guys ;-)

So if you want to help Linux get a truly independent, fully GPLed & fully compatible solution: go and help the FSF's CLASSPATH PROJECT [] ! They do need your skills!

run! run! run! (0, Flamebait)

wash23 (735420) | more than 10 years ago | (#8566565)

It's all just a horrible conspiracy to gradually shift hardware and software towards a centrally controlled, inaccessible quagmire of unbreakable digital rights management and spyware! run for your lives!