
The IOCCC Competition Is Back

samzenpus posted more than 2 years ago | from the welcome-back dept.

Programming 201

Rui Lopes writes "After a 5 year hiatus, the IOCCC (International Obfuscated C Code Contest) is back! This marks the 20th edition of the contest. Submissions are open between 12-Nov-2011 11:00 UTC and 12-Jan-2012 12:12 UTC. Don't forget to check this year's rules and guidelines."


201 comments


It'd be nice if ... (5, Insightful)

fsckmnky (2505008) | more than 2 years ago | (#38042160)

They created a competition for the most well structured, well documented, clean and correct code.

Most C coders seem to achieve obfuscation without any additional incentive.

Re:It'd be nice if ... (5, Interesting)

phantomfive (622387) | more than 2 years ago | (#38042170)

But then we never would have a piece of code that calculates its own area. Isn't that worth it? (LINK [wikipedia.org] ).

Re:It'd be nice if ... (5, Insightful)

masternerdguy (2468142) | more than 2 years ago | (#38042234)

This is a good competition because it helps exploit the guts of C in new and exciting ways. Go back to your clean and neat database client if you can't play with the cowboys.

Re:It'd be nice if ... (5, Insightful)

Hazel Bergeron (2015538) | more than 2 years ago | (#38042250)

Most C coders seem to achieve obfuscation without any additional incentive.

Nonsense. C is simple and, while some smart programmers think it's necessary to over-use the preprocessor (even the Linux kernel is sometimes guilty), it's a language you can learn once and apply productively for the rest of your life.

Contrast this with the ten dozen other fly-by-night half-baked languages which have flooded the marketplace over the past year, each with their uninteresting quirks of syntactic sugar, competing on the basis of some uniquely uninteresting difference which can almost always be trivially implemented in any of the alternatives. They are hard to read in the same way that German is hard to read to someone who has only been reading German for a year: skill and speed comes through practice with the language, not from the ego of its authors.

Re:It'd be nice if ... (0)

masternerdguy (2468142) | more than 2 years ago | (#38042258)

Contrast this with the ten dozen other fly-by-night half-baked languages which have flooded the marketplace over the past year, each with their uninteresting quirks of syntactic sugar

*cough* python *cough*

Re:It'd be nice if ... (5, Funny)

Anonymous Coward | more than 2 years ago | (#38042282)

Yeah, Python sure flooded the marketplace in the past year. Now, if you'll excuse me, I've got to check the breaking news about the Lewinsky scandal after buying some hot dot-com stocks while on the way to work at the World Trade Center because apparently it's the late nineties again somehow.

Re:It'd be nice if ... (3, Funny)

martin-boundary (547041) | more than 2 years ago | (#38043294)

Great Scott! You're in a TIME LOOP! Don't move! There's a car to your left, get in slowly and accelerate to 88mph down the street. Oh, and here's a banana peel for the FUSION REACTOR.

Re:It'd be nice if ... (5, Insightful)

Anonymous Coward | more than 2 years ago | (#38042308)

Nonsense. C is simple and, while some smart programmers think it's necessary to over-use the preprocessor (even the Linux kernel is sometimes guilty), it's a language you can learn once and apply productively for the rest of your life.

Contrast this with the ten dozen other fly-by-night half-baked languages which have flooded the marketplace over the past year, each with their uninteresting quirks of syntactic sugar, competing on the basis of some uniquely uninteresting difference which can almost always be trivially implemented in any of the alternatives. They are hard to read in the same way that German is hard to read to someone who has only been reading German for a year: skill and speed comes through practice with the language, not from the ego of its authors.

+1, it all started going downhill when:
- professional language designers abdicated their role, and the void was filled by amateurs
- people who use these languages have no fucking clue what they're doing and we're all paying the price
- corporations hyped languages for their own purposes, and languages stagnated or, worse, were crapified to an absurd level (witness Java).

Re:It'd be nice if ... (1, Troll)

Ethanol-fueled (1125189) | more than 2 years ago | (#38042672)

...it all started going downhill when...professional language designers abdicated their role, and the void was filled by amateurs...people who use these languages have no fucking clue what they're doing and we're all paying the price...

No, it started when we had to take over your obfuscated crap which is technically C but most resembles the bastard love-child of BASIC and assembler, you know, the stuff with variable and type names like "__MfxVge__" that you didn't bother to comment because you wanted to artificially inflate your worth? The kind of crap that would be easier to rewrite than refactor?

Your job security is my job security - Only your code will be forever locked in a vault and replaced with mine, which will live on because I know how to write comments and spell out complete words.

Platforms that can't run C (2)

tepples (727027) | more than 2 years ago | (#38042964)

The kind of crap that would be easier to rewrite than refactor?

How about stuff that needs to be rewritten from scratch because a target platform can't run C? This is true of the web (or at least it was until Emscripten), and it's still true of Xbox Live Indie Games and Windows Phone 7.

Re:It'd be nice if ... (1)

TheCouchPotatoFamine (628797) | more than 2 years ago | (#38042976)

RIGHT the fuck on. +5, bigup, the facts of life.... here gentlemen - you have your answer.

Re:It'd be nice if ... (2)

Anonymous Coward | more than 2 years ago | (#38043834)

I would hope that no one who actually knows what they're doing would ever create a type with the name __MfxVge__, because symbols starting with a leading double underscore are reserved for use by the implementation. You knew that right? Right? Obviously, because you know how to write comments and spell out complete words.

Re:It'd be nice if ... (-1, Troll)

fsckmnky (2505008) | more than 2 years ago | (#38042314)

My comment had nothing to do with the C language itself. Yes, C is a simple language. I standardized on C a long time ago, and 100% of the code I write is, in fact, in C. My comment had to do with messy, buggy, undocumented, unorganized C code that I come across when dealing with "open source", code that is glorified as being "the mostest awesomeness dude" simply because it is open source. Insert rants about "well you can fix it since it's open source" here.

Re:It'd be nice if ... (4, Insightful)

Anonymous Coward | more than 2 years ago | (#38042476)

Closed source code is the same, only you don't get to see it.

Re:It'd be nice if ... (-1, Troll)

fsckmnky (2505008) | more than 2 years ago | (#38042514)

*sigh*

Re:It'd be nice if ... (2, Insightful)

Anonymous Coward | more than 2 years ago | (#38042558)

I'm going to say that open source is bad and pre-emptively brand all disagreement as fanboyism so that my opinion is taken as authoritative.

FTFY

Re:It'd be nice if ... (2)

gomiam (587421) | more than 2 years ago | (#38042710)

You can sigh all you want. Lousy programmers will write lousy code whether it is open source or not, whether it is C or not. I know, I have had to maintain it (and certainly written it at some time). Just sprinkle ten or so unnamed constants in your code to stand for some database offsets and you are in for a few entertaining hours/days of repurposing.

Re:It'd be nice if ... (-1, Troll)

fsckmnky (2505008) | more than 2 years ago | (#38042736)

*sigh* x 2

Re:It'd be nice if ... (3, Funny)

gomiam (587421) | more than 2 years ago | (#38043784)

I would suggest you take up dubbing the Twilight movie series. You seem to have half their dialogues down pat.

Back on track, do you have anything useful to add besides pining for the fjords?

Re:It'd be nice if ... (-1, Troll)

fsckmnky (2505008) | more than 2 years ago | (#38043846)

Smells like teen spirit.

Re:It'd be nice if ... (0, Insightful)

Anonymous Coward | more than 2 years ago | (#38043020)

Wow, way not to look at all like a giant douchebag.

Re:It'd be nice if ... (3, Funny)

93 Escort Wagon (326346) | more than 2 years ago | (#38042390)

Nonsense. C is simple and, while some smart programmers think it's necessary to over-use the preprocessor (even the Linux kernel is sometimes guilty), it's a language you can learn once and apply productively for the rest of your life.

Contrast this with the ten dozen other fly-by-night half-baked languages which have flooded the marketplace over the past year, each with their uninteresting quirks of syntactic sugar, competing on the basis of some uniquely uninteresting difference which can almost always be trivially implemented in any of the alternatives. They are hard to read in the same way that German is hard to read to someone who has only been reading German for a year: skill and speed comes through practice with the language, not from the ego of its authors.

Wow! Dr. Ritchie, everyone thought you were dead!

C smells! (-1)

Anonymous Coward | more than 2 years ago | (#38042602)

Real programmers use Java!

Re:It'd be nice if ... (4, Insightful)

petes_PoV (912422) | more than 2 years ago | (#38042842)

Nonsense. C is simple and, while some smart programmers think it's necessary to over-use the preprocessor (even the Linux kernel is sometimes guilty), it's a language you can learn once and apply productively for the rest of your life.

Contrast this with the ten dozen other fly-by-night half-baked languages which have flooded the marketplace over the past year.

This clearly shows you simply don't understand the problem. A good programmer can (and does) write well structured, clean, DOCUMENTED and maintainable product in any language. The issue has nothing to do with the language used and everything to do with lack of discipline, inexperience and a slapdash and unprofessional attitude. Usually the worst programmers are the ones who think that once the code is written and compiles clean, the job is done. For most of these people there is little hope of educating them as they are incapable of seeing the bigger picture.

Re:It'd be nice if ... (1)

Hazel Bergeron (2015538) | more than 2 years ago | (#38042952)

Yes, a true communicator switches to any of Earth's languages at will and celebrates the variety, eagerly perfecting his ability in any new language which some committee or group of enthusiasts recently invented. This is a realisable and good use of the copious time every human has available: the sugary topping has always been more important than the meal below.

Re:It'd be nice if ... (4, Interesting)

Anonymous Coward | more than 2 years ago | (#38043352)

You're making basically the same argument people made back when machine code was what everyone wrote and C was new. If you have an open mind, you can easily see that C has serious shortcomings by modern language standards.

C offers no abstractions for complex data types. It offers no subtyping. There's no facility for generic programming other than macros, which everyone knows suck. No support for closures or comprehensions. None of these things are "trivially implemented", as you state. Even its syntax sucks, as anyone would agree who's tried to declare a non-trivial function pointer.

Many common programming tasks require extensive pointer manipulation in C. Even the best programmers (I'm one of them, and I concede this point) make occasional mistakes with pointers, and they are the worst kind of bug: silently incorrect or a crash at a random place in the code.

C is perfectly appropriate for some projects, especially with really low-level code (as most C constructs translate directly to assembly). C++ is usually better, as it has a richer typing system and ability to do generic programming, but you need to be an expert as the language is full of pitfalls (which are mostly C's fault). For projects that don't need to be close to the hardware, scripting languages can multiply programmer productivity.

Re:It'd be nice if ... (2)

Joce640k (829181) | more than 2 years ago | (#38043552)

Nonsense. C is simple and, while some smart programmers think it's necessary to over-use the preprocessor (even the Linux kernel is sometimes guilty), it's a language you can learn once and apply productively for the rest of your life.

An upgrade to C++ is a very good idea though.

Re:It'd be nice if ... (3, Insightful)

antifoidulus (807088) | more than 2 years ago | (#38043822)

while some smart programmers think it's necessary to over-use the preprocessor

And that is ultimately my main beef with C: it's impossible to write non-trivial code that DOESN'T make use of the pre-processor. Header guards in 2011? Really? C either needs to make an Objective-C-like import statement standard, or else make #pragma once standard and the default, so that in the rare case you do actually need to include a file more than once, THEN you have to use a pre-processor command. I think the pre-processor is a really useful feature of C, but it should never be essentially mandatory to use it.

Re:It'd be nice if ... (1)

wisnoskij (1206448) | more than 2 years ago | (#38042278)

It is called playing to your strengths.
While producing well structured, well documented, clean and correct code in C would be quite a challenge, it could never approach some of the newer languages on those terms.

Re:It'd be nice if ... (4, Interesting)

Rosco P. Coltrane (209368) | more than 2 years ago | (#38042356)

Most C coders seem to achieve obfuscation without any additional incentive.

You got it wrong: bad coders create bad code. Good coders know how to create good code. In any language.

When someone knows C well enough to create a truly obfuscated or compressed piece of portable C code that follows the rules of the language to a tee, i.e. one that can be compiled strict or linted, and wins the IOCCC, it's a very good sign that this someone can create excellent C code.

I should know, I won the IOCCC years ago, and mentioned it many times on my resume. When would-be employers told me "what's the IOCCC?", I knew they weren't going to be good employers. When they told me "oh, I see you won the IOCCC", they knew I could code good C, and I knew they grokked what I did. Winning the IOCCC helped me land a job a few times.

Re:It'd be nice if ... (-1, Offtopic)

Bill, Shooter of Bul (629286) | more than 2 years ago | (#38042638)

Ok, you seem to be pretty good at getting jobs, which leads to the obvious question: why have you had so many? Are you a contractor? Do you just get bored at jobs after a year or two?

Re:It'd be nice if ... (4, Informative)

Rosco P. Coltrane (209368) | more than 2 years ago | (#38042662)

Maybe I'm a little older than you think? :)

Re:It'd be nice if ... (0)

Anonymous Coward | more than 2 years ago | (#38043336)

I have no problem with his non-boastful statement; it's not self-serving in any way or big-headed.

I'm not trying to be demeaning, but good C code looks obfuscated to the less skilled. How many if-do constructs do you use in your C code, or if-for loops? The comma operator? Macros?

Throw stones after you even enter the IOCCC, let alone win.

Re:It'd be nice if ... (0)

Anonymous Coward | more than 2 years ago | (#38042702)

I should know, I won the IOCCC years ago, and used it many times in my resume. When would-be employers told me "what's the IOCCC?", I knew they weren't going to be good employers. When they told me "oh, I see you won the IOCCC", they knew I could code good C, and I knew they groked what I did. Winning the IOCCC helped me land a job a few times.

Alas, apparently it didn't help so much in keeping them.

One word, fuckhead... (1)

Anonymous Coward | more than 2 years ago | (#38042872)

Contractor.

Re:It'd be nice if ... (-1)

Anonymous Coward | more than 2 years ago | (#38043098)

>Good coders know how to create good code. In any language.

Except haskell.

Re:It'd be nice if ... (0)

Anonymous Coward | more than 2 years ago | (#38043128)

as a gunsmith?

seriously, congratulations!

Re:It'd be nice if ... (2)

wavedeform (561378) | more than 2 years ago | (#38043192)

You got it wrong: bad coders create bad code. Good coders know how to create good code. In any language.

My favorite programming adage: "You can create bad Fortran in any language."

Re:It'd be nice if ... (2, Insightful)

Anonymous Coward | more than 2 years ago | (#38044250)

While your code may be technically correct, compile, and do what it's intended to do, that does not make it good code. It just makes it code that works.

Look at the IOCCC examples posted on Wikipedia. If the average programmer (i.e. your coworker) has to spend more time untangling the extra whitespace and the syntactic monstrosities the competition prizes than understanding the logic, then your code design sucks and you've ended up causing more headaches with your "good" code.

Re:It'd be nice if ... (1)

sgt scrub (869860) | more than 2 years ago | (#38042362)

They did. It failed. They are back to what comes naturally, Erlang. :p

Re:It'd be nice if ... (3, Insightful)

tgv (254536) | more than 2 years ago | (#38042368)

Sure, that could be nice as well, but the IOCCC provides great challenges and puzzles, something that a clean code contest wouldn't. And what would you rather see in your newspaper: difficult puzzles or easy ones? Or, for the youngsters here: would you rather play Wordfeud, or type the answer to 1 + 1 over and over again?

Besides that, the IOCCC entries contain mostly well structured and correct code, and afterwards they get documented as well. It's just not readable.

Re:It'd be nice if ... (2)

Khyber (864651) | more than 2 years ago | (#38042456)

Go look up the Demoscene.

Re:It'd be nice if ... (2)

Goaway (82658) | more than 2 years ago | (#38042778)

Let me tell you, no demo code is ever anywhere near well structured, well documented, clean or correct.

Re:It'd be nice if ... (1)

Khyber (864651) | more than 2 years ago | (#38043208)

Are you serious?

It's so clean and correct that you can fit a game with near-Doom 3 graphics in 96K of executable code. Ever seen .kkrieger?

Re:It'd be nice if ... (2)

shagie (1803508) | more than 2 years ago | (#38043308)

Pardon me, but are you serious? Claiming that code is clean (or correct) because it compiles to a small executable isn't necessarily true. The demo scene prides itself on small executables and optimizes for this. Such optimizations are rarely the product of clean and correct code but rather hand crafted dark compiler (or assembler) magic.

Re:It'd be nice if ... (1)

Goaway (82658) | more than 2 years ago | (#38044142)

Heh.

Let me tell you, you don't get to 96k by writing clean code. You get there by writing utter unholy messes, and you get there by cheating like hell, and you get there by using every dirty trick in the book.

Also, you often do it in the week or so before the compo, and continue right up to the deadline, in the party hall, and you do it knowing you will never have to maintain or look at that code ever again after you hand it in.

If you think demoscene code is "clean", you have absolutely zero experience with it.

Re:It'd be nice if ... (2)

Khyber (864651) | more than 2 years ago | (#38044168)

Considering I have done some demos myself, you'd be wrong.

The software running my entire research facility is in 4K, that's network stack, video feed controls, nutrient/water monitoring, the works.

You don't get good small executables writing crappy code.

Period.

Re:It'd be nice if ... (1)

Goaway (82658) | more than 2 years ago | (#38044244)

Considering I have done some demos myself, you'd be wrong.

Links?

The software running my entire research facility is in 4K, that's network stack, video feed controls, nutrient/water monitoring, the works.

That is code that you are going to be maintaining. That is absolutely nothing like demo code.

Replace it with the American Management (0)

Anonymous Coward | more than 2 years ago | (#38043082)

Direction Obfuscation Test.

Am I the only one, or do our bosses just not understand Dilbert? I've got one who thinks it's about managers talking to dumb employees.

This ties nicely into the previous story, Is American Innovation Losing It.

Re:Replace it with the American Management (1)

fsckmnky (2505008) | more than 2 years ago | (#38043302)

You aren't the only one. I posted a comment about how it'd be nice to have a contest awarding excellent C programming practices, and I'm getting flamed for bashing the IOCCC contest, the C language, open source, closed source, people's grandmothers, sliced bread, etc. Go figure.

Re:Replace it with the American Management (0, Offtopic)

Anonymous Coward | more than 2 years ago | (#38043320)

My grandmother was killed by sliced bread, you insensitive clod!

Re:It'd be nice if ... (0)

Anonymous Coward | more than 2 years ago | (#38043198)

Are you sure you don't mean C++?

Re:It'd be nice if ... (1)

Surt (22457) | more than 2 years ago | (#38043390)

But no one wants that, and hence, Microsoft.

Re:It'd be nice if ... (0)

Anonymous Coward | more than 2 years ago | (#38043576)

Did you mean C++?

I've used C throughout my professional career and even the worst code is still quite legible after running it through pretty printing to standardize the formatting.

C++ code can be an absolute mess.

Re:It'd be nice if ... (0)

fsckmnky (2505008) | more than 2 years ago | (#38043710)

If I dare to clarify C or C++ ... it will only result in exponential flamage, thus, I leave it open to interpretation.

Re:It'd be nice if ... (0)

Anonymous Coward | more than 2 years ago | (#38044456)

agreed, they are promoting bad behavior

Re:It'd be nice if ... (0)

Anonymous Coward | more than 2 years ago | (#38044640)

They created a competition for APL or APL2, which can be so clean it's unreadable (to some).

I'm twelve and can beat all of you. (-1)

Anonymous Coward | more than 2 years ago | (#38042200)

I was only ten years old when I solved one of the Clay Millennium prizes. My mom says I'm smart. I am a boy.

Re:I'm twelve and can beat all of you. (-1)

Anonymous Coward | more than 2 years ago | (#38043668)

Do you like football?

Jerry Sandusky

Underhanded C contest should return (5, Interesting)

vadim_t (324782) | more than 2 years ago | (#38042240)

The IOCCC is cool, but the Underhanded C Contest [xcott.com] was a lot more valuable.

The entries for the IOCCC can show a lot of cleverness, but nobody in their right mind would accept such code. The beauty of the Underhanded C ones is that the code looks reasonable, but does extremely undesirable things.

Re:Underhanded C contest should return (3, Interesting)

Truekaiser (724672) | more than 2 years ago | (#38042380)

Call me paranoid, but this contest and the IOCCC are the reasons why I don't particularly let anything from s.e.l. touch my systems. I am not a good enough coder to be able to tell whether what it's doing is what it says it's doing or something the CIA wants it to do.

Re:Underhanded C contest should return (0)

Anonymous Coward | more than 2 years ago | (#38043620)

s.e.l.? What the fuck does that mean?

Schweitzer Engineering Laboratories?
Solar Energy Laboratory?

Even searching for it shows nothing in particular. Search Engine Land, maybe?

Re:Underhanded C contest should return (2)

martin-boundary (547041) | more than 2 years ago | (#38044634)

s.e.l. [wikipedia.org]

Re:Underhanded C contest should return (0)

Anonymous Coward | more than 2 years ago | (#38042494)

Except that they didn't hold a contest in two years and didn't even announce any winners of the last one held.

This makes me so happy... (2, Insightful)

Anonymous Coward | more than 2 years ago | (#38042244)

I hope there are many submissions... It's things like this that teach you the FULL amount of abuse a language can take while still making something that works. :-D

I would like to see this rule illustrated (2)

bogaboga (793279) | more than 2 years ago | (#38042246)

To illustrate some of the subtleties of the C language.

The C language is not my thing per se, but I'd like to see simple C program code that illustrates the subtleties of C. Anyone?

Re:I would like to see this rule illustrated (4, Insightful)

Anonymous Coward | more than 2 years ago | (#38042340)

Look up "Duff's Device". There's a good example.

Use Duff's Device (1)

Anonymous Coward | more than 2 years ago | (#38042396)

Use Duff's Device! (Responsibly)
Use Duff's Device! (Responsibly)

Re:Use Duff's Device (5, Interesting)

JonySuede (1908576) | more than 2 years ago | (#38042578)

don't use them anymore, go read that post: http://lkml.indiana.edu/hypermail/linux/kernel/0008.2/0171.html [indiana.edu]

Re:Use Duff's Device (5, Interesting)

Anonymous Coward | more than 2 years ago | (#38042798)

Jim Gettys has a wonderful explanation of this effect in the X server. It turns out that with branch predictions and the relative speed of CPU vs. memory changing over the past decade, loop unrolling is pretty much pointless. In fact, by eliminating all instances of Duff's Device from the XFree86 4.0 server, the server shrunk in size by _half_ _a_ _megabyte_ (!!!), and was faster to boot, because the elimination of all that excess code meant that the X server wasn't thrashing the cache lines as much.

Emphasis mine. That's REALLY freaking interesting. Posting this AC before modding you up.

Re:Use Duff's Device (4, Insightful)

TheRaven64 (641858) | more than 2 years ago | (#38043248)

It's the main reason why C++ does well in microbenchmarks and does so much worse in real-world usage. It encourages a lot of inlining, which reduces branching but increases instruction cache usage. It's difficult to benchmark well, because instruction cache pressure changes over time depending on what else is happening with the system.

Re:I would like to see this rule illustrated (2)

Bomazi (1875554) | more than 2 years ago | (#38042704)

This link: http://www.eecs.berkeley.edu/~necula/cil/cil016.html describes some corner cases of C. You need some prior knowledge of C to appreciate the non-obviousness of the examples, though.

Re:I would like to see this rule illustrated (-1)

Anonymous Coward | more than 2 years ago | (#38043280)

You can barely handle english. Start there.

entry (0)

Anonymous Coward | more than 2 years ago | (#38042382)


#include <stdio.h>

int main()
{
        char buffer[80];
        printf( "hello, world.\n" );
repeat:
        fgets( buffer, sizeof buffer, stdin );
        printf( "you know, I miss all you guys.\n" );
        fgets( buffer, sizeof buffer, stdin );
        printf( "I was a pretty good guy, don't you think? But he doesn't seem to think so.\n" );
        goto repeat;
        return 0;
}

Awesome! (4, Funny)

Anonymous Coward | more than 2 years ago | (#38042478)

It's about time I got some more reference code.

What happened to the IUPCC? (1)

Anonymous Coward | more than 2 years ago | (#38042790)

What happened to the International Unobfuscated Perl Code Contest (IUPCC)? ... Oh wait it's impossible to write unobfuscated perl :)

The Internet is based on C (5, Insightful)

Required Snark (1702878) | more than 2 years ago | (#38042830)

I find all the "C sucks" comments to be both amusing and stupid. Without C code there would literally be no Internet. Every bit you are sending and receiving uses C. The two operating systems that represent 99.99% or more of the running computers that are online run C. Both Windows and Linux use the BSD TCP/IP stack.

If C did not get the job done for this kind of computing then it would have been replaced. The fact that C thrives in the systems programming domain is a tribute to it's utility.

A proficient C coder can write clear, maintainable, efficient code that runs on many platforms. This requires both skill and practice. Not everyone is capable of doing this. It requires the ability to keep multiple competing abstractions in mind when coding. I think a lot of people try this and find it difficult and then blame the language. Those who persevere and learn this style of working can usually move on to other kinds of programming and also do excellent work.

Some problem domains require different languages and different skill sets. Personally, I like writing code where I know that if I were to look at the assembly code generated by the compiler, I could see how it relates to the C code I wrote. I rarely do this, but it's good to know that I can if I want to. I'm not doing any C coding now, because I always use the language appropriate to the task. But I also know that my C coding skills give me a distinct advantage in solving difficult problems, no matter what they are.

Re:The Internet is based on C (5, Interesting)

Sduic (805226) | more than 2 years ago | (#38043228)

Without C code there would literally be no Internet.

Because obviously only C is Turing-complete.

Before I stir up any vitriol, I'm just kidding. I think C is underappreciated precisely because it provides only a thin abstraction that (hopefully) maps well to the target architecture, but otherwise stays out of the way. That is to say, when all you have is a hammer, you can easily shoot yourself in the foot.

Re:The Internet is based on C (0)

Anonymous Coward | more than 2 years ago | (#38043326)

Well, there are a couple of things that suck about C, mostly syntactical and not a big deal. Disclaimer: I'm very ignorant of C because I've never written a large program in it (studied C for a while after 6502 and gave up):

- Choosing == for equality operator and = for assignment. I liked how Pascal uses := for assignment. Ugly, but stands out, even after 20 hour coding marathons (not that I've done 20 hour coding marathons... )
- The keyword "char" should be replaced with "byte."
- I hate "long long" - should be "wide", i.e. "wide int," or better yet, you should be able to explicitly state the width of data types with "int" being the natural "bitness" of the CPU (and sizeof(int) returning that).
- "bool" doesn't work right.
- I think people who put the "*" of pointer syntax near the variable name and not the type name when declaring pointers should be shot. It should always be int* pointer_to_int, not int *pointer_to_int.

I'm sure my complaints are unwarranted except for the first point.

Re:The Internet is based on C (3, Informative)

dlgeek (1065796) | more than 2 years ago | (#38043350)

#include <stdint.h>
uint16_t x = 0; /*unsigned 16 bit integer */
int32_t y = 0; /*signed 32 bit integer */

Almost every platform I've ever worked on has stdint...

Re:The Internet is based on C (3, Insightful)

mdf356 (774923) | more than 2 years ago | (#38043640)

stdint.h came in with C99. There were decades where people hand-rolled their own versions so network communications would work...

Re:The Internet is based on C (1)

cratermoon (765155) | more than 2 years ago | (#38044124)

htons() and ntohs() and related functions are still important, too.

Re:The Internet is based on C (0)

martin-boundary (547041) | more than 2 years ago | (#38044668)

You realize (I assume) that those are *lower* bounds on the storage size. There's no guarantee that uint16_t is in fact (only) 16 bits long...

Re:The Internet is based on C (5, Informative)

mdf356 (774923) | more than 2 years ago | (#38043664)

- I think people who put the "*" of pointer syntax near the variable name and not the type name when declaring pointers should be shot. It should always be int* pointer_to_int, not int *pointer_to_int.

I'm sure my complaints are unwarranted except for the first point.

But that's backwards of what the compiler really does. Consider this:

int* p, q;

What types do p and q have? p is a pointer-to-int; q is an int. By putting the * next to the type name it makes it look like all the things are int*, but they're not. By putting the * with the type (which I did for my first year of C coding) you're making reading the code harder rather than easier. It'd be like writing

a = b * c+d;

and trying to convey that the '+' binds tighter since it doesn't have spaces. That's not what the compiler will do and writing it so only serves to confuse the reader.

In addition, what you see at declaration is representative (modulo the weirdness of array subscripts and pointer dereference) of what you'd do to get the type. That is, int ***p means that you'd have to type ***p to get an int; *p would need another ** to get to an int, and so on.

Re:The Internet is based on C (1)

Anonymous Coward | more than 2 years ago | (#38043260)

Both Windows and Linux use the BSD TCP/IP stack.

Nope. Wrong on both counts.
Linux has always had its own stack (rewritten several times); Windows used some user-space utilities from BSD, not the stack.

Re:The Internet is based on C (1)

Anonymous Coward | more than 2 years ago | (#38043368)

The fact that C thrives in the systems programming domain is a tribute to it's[sic] utility.

It's more of a tribute to C being a fancy assembler with a function calling convention. Yes, it's suitable for writing operating system kernels and not much else.

A proficient C coder can write clear, maintainable, efficient code that runs on many platforms. ... Not everyone is capable of doing this. It requires the ability to keep multiple competing abstractions in mind when coding.

Keyword: cognitive load. Case in point: the hilariously excruciating code example in the Linux man page for snprintf. If you need to jump through all these burning hoops to do something this mundane, imagine how much more your proficient C coder could achieve in a more sensible language with the same amount of effort.
Of course nobody bothers with the burning hoops, so that's why we have buffer overflows.

I like writing code where I know that if I were to look at the assembly code generated by the compiler I can see how it relates to the C code I wrote.

You (or probably someone else) can do that just as well with C++, java bytecode and others.

Re:The Internet is based on C (2)

Carl Drougge (222479) | more than 2 years ago | (#38043758)

Keyword: cognitive load. Case in point: the hilariously excruciating code example in the Linux man page for snprintf. If you need to jump through all these burning hoops to do something this mundane, imagine how much more your proficient C coder could achieve in a more sensible language with the same amount of effort.

A sensible C coder might use vasprintf instead of the example in that manpage. The fact that all the standard library functions aren't great for all (or sometimes any) use cases is hardly unique to C.

Re:The Internet is based on C (1)

Old Wolf (56093) | more than 2 years ago | (#38044148)

A sensible C coder might use vasprintf instead of the example in that manpage. The fact that all the standard library functions aren't great for all (or sometimes any) use cases is hardly unique to C.

The *a*printf functions are not in the C standard, so the portable coder would not use them. I guess opinion may vary about what is 'sensible' :)

Re:The Internet is based on C (1, Interesting)

Billly Gates (198444) | more than 2 years ago | (#38043410)

No language is perfect and I like to think of using a particular language for a particular purpose. Some places C should not be used today. Gnome is a classic example if you want to bind OO languages with GObject.

One issue with C (I know I am going to get flamed here) is security. Most of Windows' security flaws come from C, due to buffer and stack overflows, and even vector attacks that languages like Pascal handle better when turned into assembly code. The Marines used old Macs for years for this reason. I am not a computer science major so you can flame all you want on me not knowing anything, but XP SP2 and OpenBSD had to introduce special libraries due to the fact that a buffer or stack overflow can result in code execution. Unix had a bad rap with security too, early on, before Windows 2000, and much of that could be due to C and its libraries. Don't tell me it is the programmer's fault, as he or she has no clue what the resulting assembly code does or how it behaves when something is full.

I am not a troll here, or anti-C, but it is something overlooked by programmers who only grew up with C, C++ or Java/C#, which are all derived from or influenced by C itself.

Re:The Internet is based on C (0)

Anonymous Coward | more than 2 years ago | (#38043770)

hm, all that hyper shit was totally based on systems built around Pascal. here is the point though, since you're some kind of evangelist on C... just because we NOW use rubber on our wheels does NOT mean we would not have wheels without rubber

now kindly take your fucked up short hand bullshit exploit ridden shitty ass language and go fuck yourself

thanks ... the world who do not think that ATT and microsoft are the only ways to do it.

Re:The Internet is based on C (0)

Anonymous Coward | more than 2 years ago | (#38044318)

A proficient C coder can write clear, maintainable, efficient code that runs on many platforms.

The problem isn't the great C coders who can write awesome code. It's the mediocre ones that turn it into a horrifying train wreck. C and C++ give you all the tools, including all the tools to get yourself into trouble.

Re:The Internet is based on C (0)

Anonymous Coward | more than 2 years ago | (#38044672)

A proficient C coder can write clear, maintainable, efficient code that runs on many platforms.

The problem isn't the great C coders who can write awesome code. It's the mediocre ones that turn it into a horrifying train wreck. C and C++ give you all the tools, including all the tools to get yourself into trouble.

And you somehow think mediocre programmers can write more maintainable code in Java, Perl, Javascript, Python, Groovy, Scala, C#, etc.? The higher abstraction levels of these languages in no way guarantee that you won't get unmaintainable code. If you think otherwise you need a reality check.

web programming has ruled this contest obsolete (1)

Anonymous Coward | more than 2 years ago | (#38042954)

have you seen a designer's php code??

Perl IOCCC (1)

Billly Gates (198444) | more than 2 years ago | (#38043418)

That way we can submit hello world programs that make the C IOCCC ones look like a picnic

Oh no, someone obfuscated IOCCC for 5 years (1)

youn (1516637) | more than 2 years ago | (#38043572)

but I am happy it is back

Because D is such a heavily used language... (5, Insightful)

Anonymous Coward | more than 2 years ago | (#38043968)

A number of quick points... Some people just don't know, so here are some practical speaking points...

-C has been around longer than most of the non-C programmers alive. That includes you people on this site, which has the smartest people, from one of the most divisive areas in the civil space: the "tech wars".

-D was such a better language... also, C++ because we never hear about C anymore.

-Java is on its way out, being deprecated by the largest company in the world, which also deprecated Flash (on mobile), which Adobe just acquiesced to, replaced by Google's new iteration. Maybe not in the next 5 years, but it can no longer grow... it will have to get smaller, with less support.

-Objective-C, used by Apple Inc., the largest company in the world, is a wholly-compatible superset of (ANSI) C. There are no signs of change here. Big surprise, it's all the same hardware components, just in larger capacities, at faster rates, and smaller form-factor. C can't help us with the flux capacitor... but that has not been added to the standard CPU, memory, memory storage, etc. model.

-Google announced that Android will run a C-like-language in the native space that uses the CPU and GPU. Even with Dart coming our way...

-CUDA... C is relevant in the GPU space too (all of it), which is, for the moment, the go-to way to eke out more performance from a machine.

-And here is where the feelings get hurt: In college, I straddled the EE/CS line while being firmly EE. EEs learn C because it teaches them valuable things about the hardware, being a very light obfuscation. CS departments tend to concentrate on, well, anything else. Flavors of the year, interesting projects, etc. That is their place. My older brother went the CS route, 8 years before I got my turn and went EE. I admire him and his success greatly but I know, push came to shove, I can talk about certain topics without talking about garbage collectors and universal typing.

So, please, if you've never used C in any significant way, just don't comment. Listen. People, young and old, have something to tell you about the most significant programming language ever invented.

And to bring this all together: When you are trying to eke out CPU cycles so your 3D rendering is above 60 fps on that mobile device, you will know why closeness to hardware and C, in particular, may be your best friend. Or a C-like language...

Another way to look at it: People who know C and have worked with it can't just unknow it. They know what you non-C people know, but also have other experience. If MOST of them say C is indispensable, then how about you do the one thing some Tech Assholes never do: take someone else's advice. And STFU.

Can we just talk about something else that is awesome and not caught up in this stupid argument?

Re:Because D is such a heavily used language... (3, Funny)

Anonymous Coward | more than 2 years ago | (#38044078)

People, young and old, have something to tell you about the most significant programming language ever invented.

You mean LISP? I haven't seen anyone mention it yet.

Re:Because D is such a heavily used language... (0)

Anonymous Coward | more than 2 years ago | (#38044438)

Asking people to not get caught up in stupid arguments, is a lot like asking 1st graders to pass high school level reading comprehension class. It almost never happens.
