
13 years? (2, Interesting)

undulato (2146486) | more than 3 years ago | (#36511022)

I could've sworn I was using it before then... perhaps it was all just a bad dream?

Re:13 years? (2)

elsurexiste (1758620) | more than 3 years ago | (#36511114)

A post under you already explained it: it refers to the first ISO standard for C++. And yeah, I thought the same thing: "Really? Only 13 years?"

Re:13 years? (4, Funny)

vidnet (580068) | more than 3 years ago | (#36511230)

What struck me was that C++ got lambda expressions before Java did!

Re:13 years? (3, Informative)

Lord Lode (1290856) | more than 3 years ago | (#36511356)

The previous C++ standard, C++98, is 13 years old, as the name implies.

13 years? (4, Informative)

Meneth (872868) | more than 3 years ago | (#36511038)

C++ has been around for at least 28 years. From Wikipedia: "It was renamed C++ in 1983."

The article is probably referring to the first finished C++ ISO standard, 14882:1998. Hardly the "first iteration" of the language.

Re:13 years? (3, Informative)

crow_t_robot (528562) | more than 3 years ago | (#36511198)

After years of development, the C++ programming language standard was ratified in 1998 as ISO/IEC 14882:1998

C++ didn't exist as a standardized language till 13 years ago. It was in development before then.

Re:13 years? (2)

utahjazz (177190) | more than 3 years ago | (#36511594)

It was developed by Bjarne Stroustrup starting in 1979

the C++ programming language standard was ratified in 1998

So you're saying the "first iteration" took 19 years. You must use the word "iteration" differently at your shop than we do at mine.

Re:13 years? (2)

StackedCrooked (1204878) | more than 3 years ago | (#36511666)

He started coding in 1979 and waited until 1998 to build it?

Re:13 years? (1, Funny)

Anonymous Coward | more than 3 years ago | (#36511830)


Re:13 years? (2)

sgt scrub (869860) | more than 3 years ago | (#36512426)

No. He used templates and built it statically. It finished building on Unix in '81. The compiler didn't finish on Windows until '98 because of all of the constant Win API changes.

Re:13 years? (3, Funny)

rjstanford (69735) | more than 3 years ago | (#36511852)

Now that's waterfall development!

Re:13 years? (1)

Anonymous Coward | more than 3 years ago | (#36511966)

No, that's glacial. :)

Re:13 years? (1)

Hognoxious (631665) | more than 3 years ago | (#36512056)

So you're saying the "first iteration" took 19 years.

Nope, the zeroth iteration.

Still playing catch-up to C#. (0, Troll)

Anonymous Coward | more than 3 years ago | (#36511050)

The saddest part about this whole C++0x ordeal is that they're still just playing catch-up to C#.

Re:Still playing catch-up to C#. (2)

lxs (131946) | more than 3 years ago | (#36511054)

Yeah but this one goes to eleven!

Re:Still playing catch-up to C#. (5, Interesting)

rennerik (1256370) | more than 3 years ago | (#36511620)

Your comment caught some flak, but I couldn't help but make a similar observation as I read the spec. It seems that they are adding a lot of stuff to C++ that exists in C# (lambda expressions, delegating constructors, automatic type deduction, initialization syntax, a dedicated null keyword, etc.).

Of course, they added a bunch of stuff that's also NOT in C# (since it's not necessary in a high-level language like C#), but I am glad that they are revamping C++ to incorporate some higher-level functions. Now we just have to wait for compilers to start adopting the new spec...

Re:Still playing catch-up to C#. (4, Insightful)

Waffle Iron (339739) | more than 3 years ago | (#36511962)

The saddest part about this whole C++0x ordeal is that they're still just playing catch-up to C#.

True. In particular, C++ is light years behind C# in patent FUD. And C++ hasn't even started work on requirements for a 100MB "managed environment" for users to install before running their apps. Nor have C++ developers chosen a monkey species after which to name its 2nd-class-citizen cross-platform implementation.

Re:Still playing catch-up to C#. (0)

siride (974284) | more than 3 years ago | (#36512198)

What's the patent FUD, specifically? I'm not talking about some obscure part of the Winforms API, I mean in the core language itself.

And you forget that C++ has a giant environment to install as well, but due to its age, that is generally already part of the OS. In time, modern-generation languages will end up in the same category. In fact, Windows Vista and 7 already come with .NET pre-installed, so there's no need to download anything to run a .NET app.

Nice but... (3, Interesting)

genjix (959457) | more than 3 years ago | (#36511080)

Would love to use these features in the new C++, but unfortunately none of the major compilers support the new for-syntax, in-class initialization, deleted member functions, and explicit specification of base class methods.

Also I totally don't understand why enum class no longer casts to ints... it totally makes using binary flags impossible unless I revert back to using the old style enums. But then I need to do the ugly namespace myenums { enum myenum { foo = 4, bar = 8 ... }; } hack which makes nesting inside classes impossible -_-

Re:Nice but... (1, Insightful)

daid303 (843777) | more than 3 years ago | (#36511204)

If you are putting binary flags in enums then you are using them wrong. And that's why you are no longer allowed to do so. It's a GOOD thing.

You want bitfields.

Re:Nice but... (3, Insightful)

Arlet (29997) | more than 3 years ago | (#36511340)

Bitfields are not as flexible when you want to change several different bits (belonging to different fields) in a word using a single write.

Re:Nice but... (1)

Anonymous Coward | more than 3 years ago | (#36511400)

Enums have always been one of the best ways to declare bit combinations, given the weak typing of C and C++ in general. Bitfields, OTOH, are mostly useless. Driver writers don't use them, nor do kernel writers: that shit is unpredictable. And userspace/applications has little use for them in the first place.

Re:Nice but... (1)

e70838 (976799) | more than 3 years ago | (#36511568)

It is the most used way, not the best one. It is an inappropriate way. Even if bitfields have drawbacks, simple constants defined in a namespace remain a better choice.

Re:Nice but... (3, Interesting)

JohnnyBGod (1088549) | more than 3 years ago | (#36511736)

What so wrong with "const int SOME_BINARY_FLAG = 0xff00ff"?

Re:Nice but... (1)

rjstanford (69735) | more than 3 years ago | (#36511888)

Agreed. Low level language features exist for a reason, but their use (outside of a few specialized fields where bit-level optimization is preferred to code-readability-and-use optimization) is limited. Making it slightly less convenient (ie: use an int like we always have) for a (relative) few, while encouraging the many to use the more comprehensible variant, is not a bad idea.

Re:Nice but... (1)

Dcnjoe60 (682885) | more than 3 years ago | (#36512140)

It is the most used way, not the best one. It is an inappropriate way. Even if bitfields have drawbacks, simple constants defined in a namespace remain a better choice.

It depends on what you mean by the word "best." There must be some reason that doing it this way is the most used way. Evidently, for many, it seems that it is the "best" way to do it. Since "best" is such an ambiguous word, maybe it would be better to elaborate on why it is inferior? Does it use more memory? Is it slower to execute? Is it more difficult to maintain? Depending on one's needs, the answer to any of those questions might influence what "best" means.

Re:Nice but... (1)

larry bagina (561269) | more than 3 years ago | (#36511500)

gcc and clang will optimize multiple bit field sets into one or two and/or operations.

Re:Nice but... (1)

Arlet (29997) | more than 3 years ago | (#36511584)

Not if the bitfields are volatile, which is common for hardware registers.

Re:Nice but... (0)

Anonymous Coward | more than 3 years ago | (#36511640)

Bit fields are inappropriate for hardware registers anyway, because the layout of the bits is implementation defined.

Re:Nice but... (0)

Anonymous Coward | more than 3 years ago | (#36512258)

You mean a union?

Also, you use the words "bits" and "fields" to describe how a bitfield is less flexible... than itself?

Bitfield layout portability or lack thereof (1)

tepples (727027) | more than 3 years ago | (#36511388)

I was under the impression that bitfield layout was implementation-defined, and that's undesirable if I want my project to compile on different architectures. The page you linked includes the phrase "Microsoft Specific" around anything mentioning layout, which illustrates my point.

I was also under the impression that code generated by compilers for reading and writing bitfields was still dog slow. Has this changed?

Re:Bitfield layout portability or lack thereof (1)

daid303 (843777) | more than 3 years ago | (#36511726)

You are basically saying. "I don't want C++, I want C!"

Re:Nice but... (0)

Anonymous Coward | more than 3 years ago | (#36511866)

Yes, let's switch to platform specific code. Just shoot yourself.

Re:Nice but... (0)

Anonymous Coward | more than 3 years ago | (#36512144)

Since they killed enums for me, then I'll just go back to #defines. I win.

Re:Nice but... (2, Interesting)

Anonymous Coward | more than 3 years ago | (#36511246)

Would love to use these features in the new C++,...

Why? What is it about these new features that will make your job easier, your code more reliable, and easier to be maintained? Or do you just want to use those features because they're "new" - for C++ that is?

As a long time C++ guy (Borland C++ days), I look at some of these features and think "so what?" (Lambda functions, please.) I'll probably never use them. IMHO the last truly useful feature that C++ added was templates, which led to the STL and made my life much easier - after I got the hang of the way the STL implemented things such as "iterators" and the gotchas associated with them.

Even then, it drove me nuts when I had to maintain code by a C++ coder who just wanted to use features for the sake of using them - like all the classes that only ever used one data type being Template classes - arrrrrghh! (All that code and overhead for nothing!) Coder: "It was a cool thing to do!"

I tell ya, with all these features being added, I just want to say "Fuck it! Gimme an assembler, a cave, a generator, and a computer!" and grow my hair and fingernails out and laugh maniacally at random times.

Re:Nice but... (0)

Anonymous Coward | more than 3 years ago | (#36511424)

I've asked a bunch of times and never received an answer, so can you please explain the Luddite tendencies of your type? Why do you wish progress froze? Is it because you feel yourself becoming less relevant?

Re:Nice but... (4, Insightful)

DrXym (126579) | more than 3 years ago | (#36511618)

Templates are fine for little classes, but they got so abused by the STL that it was not uncommon to see trivial syntax errors turn into enormous cryptic compiler errors spanning multiple lines of nested templates and typedefs, half of which you'd never heard of. After poring over this enormous error for minutes or longer you might eventually discover you missed a * off some declaration.

Re:Nice but... (1)

Anonymous Coward | more than 3 years ago | (#36511634)

I can't imagine that there are many people who know much about lambda functions and give them a 'meh'. If all you know about them is what is in that article, then fair enough, because it's a terrible example, but otherwise it just makes you sound like a Luddite. I'd be interested in hearing your opinion about why they are just a 'meh' feature though...

Re:Nice but... (2)

TheRaven64 (641858) | more than 3 years ago | (#36511684)

At least one feature in C++ 2011 (no, not C++11, they decided to make the name Y2K compliant this time) is useful: the auto type. If you've ever written a loop using foo::bar::iterator and had to split the loop over two lines because the iterator type is so long, you'll appreciate being able to do:

for (auto i=collection.begin(), e=collection.end() ; i!=e ; ++i)

Strong typing without type inference is just a stupid idea that should never have been allowed in a production language.

Re:Nice but... (1)

Nova77 (613150) | more than 3 years ago | (#36511936)

Even better

for (auto &x: collection)

Re:Nice but... (1)

fnj (64210) | more than 3 years ago | (#36511938)

You're right, that is very cool.

Re:Nice but... (1)

ConceptJunkie (24823) | more than 3 years ago | (#36512158)

they decided to make the name Y2K compliant this time

Yes, because a lot of people might be confused, thinking the standard came out in 1911. ;-)

Regarding the type inference, I always found that to be a glaring deficiency in the STL. I always thought Borland's style of implementation was much better to use, although I can understand it wasn't as flexible or fast as templates. I did the same thing in my own class library back before I used templates (and before they were standard), so I could do for-loops like this:

for (iterator i(collection); i.hasMore(); i++) {
        something = collection[i]...
}

The new use for "auto" solves this whole issue very neatly.

Re:Nice but... (5, Informative)

mmcuh (1088773) | more than 3 years ago | (#36511412)

GCC, which is probably the most used C++ compiler, supports the new for-syntax since 4.6, deleted member functions since 4.4, and explicit virtual overrides in the 4.7 development series.

Re:Nice but... (0)

Anonymous Coward | more than 3 years ago | (#36511908)

Just use: class myenum { public: enum blabla { ... } };

A class is also a namespace.

Cruft removed? (4, Insightful)

kikito (971480) | more than 3 years ago | (#36511118)

I really like that they added new stuff to the language but ...

Have they *removed* anything at all from it? That's the only way I could get interested in that language again.

Re:Cruft removed? (0)

Anonymous Coward | more than 3 years ago | (#36511184)

The old meaning of the auto keyword has been removed, no one ever used it. There are others but this is the first that comes to mind as the keyword is reused for a new feature.

Re:Cruft removed? (1)

pauljlucas (529435) | more than 3 years ago | (#36512048)

Have they *removed* anything at all from it?

They can't because it would break people's code and most people get upset when that happens.

In a Google Talk about the Go language, Rob Pike made a snide remark about how they made the recent iteration of Go smaller, unlike some other languages. The only reason they can get away with that is because there's very little Go code out there.

In contrast, there's tons of C++ and Java code out there. Both languages have cruft that I'd like to see removed too, but it simply can't be done. That's the price of success.

Re:Cruft removed? (1)

Anonymous Coward | more than 3 years ago | (#36512238)

If you don't want the whole language, don't use all of it. Draw up a list of crufty things that you think ought to be removed from C++ and don't use them. There's no reason to break other people's code that relied on the standard because you don't want to use those features. It's not as though it's mandatory to use every language feature in every project, even where it's not appropriate (although many coders, ahem, don't seem to grok this).

I do this for all my C++ projects. All of them forbid the use of multiple inheritance, metaprogramming, non-trivial template usage, and RTTI. Some forbid C++ exceptions. Just pick and choose what you want.

Captcha: perplex -- which is what the above comment did to me.

Cool! Meanwhile... (-1)

Anonymous Coward | more than 3 years ago | (#36511136)

Meanwhile, Obammy the Nobel Peace Prize winner is rolling out his global war of peace in Libya, Yemen, Pakistan, Syria, and Sudan. And all I hear from those who spent 8 years howling about George Bush being a war monger and a Nazi is crickets chirping. We're on the verge of WWIII. After nearly 100 years of planning, Progressives are on the verge of realizing their wet dream of massive global depopulation through a truly global war, and we're here wanking about C++ 11. Priorities?

Re:Cool! Meanwhile... (0)

ciderbrew (1860166) | more than 3 years ago | (#36511168)

Well, you'll be on the losing side so I guess you have a reason to worry. Did you know they are talking about the PS4 coming out in a few years?

Re:Cool! Meanwhile... (1)

Per Wigren (5315) | more than 3 years ago | (#36511394)

Yeah, if we just ignore C++11 the world's problems will go away.

Re:Cool! Meanwhile... (0)

Anonymous Coward | more than 3 years ago | (#36511484)

Good luck with C++ 11 when you're stuck out in a field tending the crops for your lord, serf!

Re:Cool! Meanwhile... (1)

Per Wigren (5315) | more than 3 years ago | (#36512106)

While you live your life in fear for a hypothetical future scenario, I'll try to live a happy life in the present instead. If your fears come true, we'll probably die at about the same time anyway.


Looks cool... (1)

jellomizer (103300) | more than 3 years ago | (#36511138)

I am glad that C++ is still evolving. The last major improvement I remember was the addition of the string class. Then shortly after that my professional focus moved away from C and C++ and towards higher-level languages (.NET, PHP, Python, Java, etc.). I just recently started my own personal project, so I decided to relearn C++, and I noticed there is a fair amount of new stuff that wasn't there before (or I was never taught).

Re:Looks cool... (0)

Anonymous Coward | more than 3 years ago | (#36511292)

I can't believe the article didn't mention variadic templates. They make it possible to do things like:

struct A { A(int, char*); };
std::vector<A> vec;
vec.emplace_back(1, "test");

instead of:

struct A { A(int, char*); };
std::vector<A> vec;
vec.push_back(A(1, "test"));

Before C++0x, adding something to this vector would have meant three constructor calls (default, custom, copy), a destructor call and allocating a temporary object. Using variadic templates you can forward the constructor call meaning only the custom constructor is called and that's it, no additional objects or calls. So now you have the efficiency of low-level C-like code with the convenience of not having to write any boilerplate code; the compiler takes care of all of that for you.

Re:Looks cool... (1)

e70838 (976799) | more than 3 years ago | (#36511660)

A new layer of syntactic hacks so that the language lets you do normal things with reasonable efficiency.
I am really impatient to try this new language, but previous experience suggests I will have to wait at least 3 years before seeing a standard-compliant compiler.

Re:Looks cool... (2)

compro01 (777531) | more than 3 years ago | (#36512204)

GCC has it mostly implemented (the new concurrency features remain MIA along with a few other bits) already, including the variadic templates the GP mentions.

Oh Snap. (1)

crow_t_robot (528562) | more than 3 years ago | (#36511140)

Lambda expressions!

So... (1)

fitten (521191) | more than 3 years ago | (#36511170)

Looks like C# from a few years ago. Honestly, it's really good that they're moving C++ forward, it's been lacking these features when other languages have embraced them for some time. I see they still use a plethora of ugly ass underbars, though.

Re:So... (1)

glwtta (532858) | more than 3 years ago | (#36511636)

I like the underbars, always makes me feel "closer to the hardware" when I see a lot of them.

There's no reason for all languages to look the same.

Re:So... (1)

fitten (521191) | more than 3 years ago | (#36511878)

It makes me feel like I'm still programming in the 1980s. It's old and ugly. It's not a required "look", it's just an ancient custom. The compiler works just fine without underbars in names.

Re:So... (1)

JasterBobaMereel (1102861) | more than 3 years ago | (#36511766)

Different languages are different for a reason; they do different jobs ...

C# is trying to be the universal very high level language that embraces all paradigms and can be used for everything ...
C is still a low-level language that lets you program close to the machine ...

C++ is stuck in the middle and so should cover the ground where you want to abstract away from the machine but don't want a VM getting in the way; if it moves too far towards C# it will start to be ignored ...

Both C# and C++ seem to be falling into the trap of Greenspun's Tenth Rule.

Re:So... (1)

fnj (64210) | more than 3 years ago | (#36512072)

I don't think there is any danger of that. You can make a C++ program just as low level as one in C, if you want. You can still compile any C code with a C++ compiler in C++ mode with generally only very minor tweaks to the expressions, and none of the tweaks changing the binary code generated. You can use as much of the added C++ features as you want, anywhere from none of them to all of them. In an ideal world there would be no reason to use a straight C compiler any more. The ways real life departs from an ideal world are mostly that C++ compilers are not universally available in all environments; none of them are yet fully C++2011 compliant; and they tend to be VERY slow.

But... (5, Insightful)

DeathToBill (601486) | more than 3 years ago | (#36511172)

This is news for nerds. Stuff that matters. I thought /. abandoned this stuff ages ago...

Biggest Change? (4, Funny)

hal2814 (725639) | more than 3 years ago | (#36511216)

C++ goes all the way to 11. It's one louder than other languages.

That's way too modern (0)

Anonymous Coward | more than 3 years ago | (#36511260)

Let's see how long until Stallman posts a long text criticizing it all...

C++0x (0)

Anonymous Coward | more than 3 years ago | (#36511314)

And C++0x?

Re:C++0x (3, Insightful)

LighterShadeOfBlack (1011407) | more than 3 years ago | (#36511352)

C++0x is C++11. C++0x was a placeholder name until they actually knew what year it would be finalised.

Re:C++0x (1)

queBurro (1499731) | more than 3 years ago | (#36511592)

C++xy apparently would have been better?

Re:C++0x (1)

DanTheStone (1212500) | more than 3 years ago | (#36511688)

No, in hex (0x) you really can go to 11.

Re:C++0x (1)

queBurro (1499731) | more than 3 years ago | (#36511850)

For $2000 I'll build you one that goes to twelve!

Re:C++0x (2)

geminidomino (614729) | more than 3 years ago | (#36511622)

You know, like how Clash at Demonhead and Mega Man 2 took place in "200X!"

And it looks like they missed their worst-case deadline by 2 years!

Re:C++0x (1)

TheRaven64 (641858) | more than 3 years ago | (#36511700)

Nope, C++0x is C++ 2011. C++11 does not exist.

Re:C++0x (1)

fnj (64210) | more than 3 years ago | (#36512094)

And it shows they should never again make assumptions on finish dates. They never did get it done during the decade they targeted or assumed.

Alternative syntax (2)

Errol backfiring (1280012) | more than 3 years ago | (#36511374)

If the "alternative syntax" from PHP were allowed, the language could even become legible. What I see is still a write-only language.

Re:Alternative syntax (1)

Ergodicity (1168195) | more than 3 years ago | (#36511572)

Agreed. C and C++ seem to love the idea of figuring out the programmer's intention by looking at punctuation marks. For example, why on earth is there no function keyword? Make it two keywords, one for declaration and the other for implementation. The result is that a misplaced comma or a missing semicolon (for example at the end of a function declaration) completely throws off the compiler. And, as Errol says, it makes the code completely unreadable. The new lambda syntax is a good example. Would it be that painful to add a lambda keyword?

Re:Alternative syntax (2)

Mystra_x64 (1108487) | more than 3 years ago | (#36511680)

I fear the day C++ guys will learn that Unicode has so many previously unused special symbols that can be reused to mean something in C++.

Re:Alternative syntax (3, Informative)

TheRaven64 (641858) | more than 3 years ago | (#36511714)

Would it be that painful to add a lambda keyword?

Yes, actually. Adding keywords to a language is problematic, because lots of existing code will use them as identifier names. If you add a lambda keyword then you break any existing code that contains a variable, function, or type called lambda. C99 had some ugly hacks to get around this for bool: the language adds a __bool type, and the stdbool.h header adds macros that define bool, true, and false in terms of __bool.

Re:Alternative syntax (1)

BetterThanCaesar (625636) | more than 3 years ago | (#36512384)

Even worse, it's _Bool with one underscore and uppercase B. If you #include<stdbool.h> you get a #define bool _Bool which you can undefine if you really need the bool name for your own code.

See: stdbool.h

Re:Alternative syntax (1)

nedlohs (1335013) | more than 3 years ago | (#36512002)

Congrats you just broke every C++ program that used "lambda" (or whatever keyword you chose) as a variable/class/whatever name.

And yes, while it may seem minor to you and something that can be fixed with search-n-replace, that's something to be avoided at almost any cost.

Re:Alternative syntax (2)

yarnosh (2055818) | more than 3 years ago | (#36511710)

LOL. PHP as a comparison for readability. $That's->rich();

Re:Alternative syntax (0)

Anonymous Coward | more than 3 years ago | (#36511734)

Ouch ! I would definitely not qualify VB as a legible language.

Re:Alternative syntax (0)

Anonymous Coward | more than 3 years ago | (#36512052)

You mean notation like this?

if ($a == 5):
        echo "a equals 5";
        echo "...";
elseif ($a == 6):
        echo "a equals 6";
        echo "!!!";
else:
        echo "a is neither 5 nor 6";
endif;

That's the first time I've ever heard anyone say something positive about that - are you a VB user by any chance?

C++11one!1 (0)

Anonymous Coward | more than 3 years ago | (#36511408)

Is APK in charge of their naming convention or something?

Re:C++11one!1 (0)

Anonymous Coward | more than 3 years ago | (#36511522)

no, Lulzsec :)

Re:C++11one!1 (0)

Anonymous Coward | more than 3 years ago | (#36511570)

I did wonder why they added /etc/hosts support...

The purpose for these changes. (0)

Anonymous Coward | more than 3 years ago | (#36511604)

They'll be using it to implement HTML 5.

Design by Committee (2)

Alistair Hutton (889794) | more than 3 years ago | (#36511760)

I love how back in the day C++ advocates sneered at Ada due to the fact that Ada was "designed by committee". Now C++ is the ultimate example of a design-by-committee language. And that committee is huge.

Re:Design by Committee (0)

Anonymous Coward | more than 3 years ago | (#36512118)

It may be design by committee, but that doesn't mean it is suffering from the common problems that 'design by committee' is known for.

Re:Design by Committee (1)

rmstar (114746) | more than 3 years ago | (#36512162)

Now C++ is the ultimate example of a design by committee language. And that committee is huge.

Huge indeed, but judging by the results, not particularly good.

A doc that needs to be read by more people is the C++ Frequently Questioned Answers.

Too late (0)

Anonymous Coward | more than 3 years ago | (#36511840)

After many many years of C and C++, I jumped ship to C#. I don't really enjoy going back to C++ when I have to.

D FTW (0)

Anonymous Coward | more than 3 years ago | (#36511902)

C++ is an ugly beast. OK, OK, it was (is?) important and widely used and it will always be around, but today we have far better choices, one of them being D, which receives contributions from a number of C++ evangelists, to name one of them: Andrei Alexandrescu.

Does TFA actually explain things? (3, Insightful)

SanityInAnarchy (655584) | more than 3 years ago | (#36511924)

It's been a while since I've had to do any C++, so maybe I'm just missing something, but it seems like either there's a lot of questionable functionality here, or there's a lot of TFA which introduces a feature, even motivates it, but doesn't actually explain what the new version looks like. For example, with "Rvalue References":

void naiveswap(string &a, string &b)
{
  string temp = a;
  a = b;
  b = temp;
}
This is expensive. Copying a string entails the allocation of raw memory and copying the characters from the source to the target...

Ok, first, what? I thought standard library string implementations were supposed to be efficient, and include some sort of copy-on-write semantics, which would (I would hope) make the above a shuffle-pointer-around instruction instead of a copy-data-around instruction.

Second, here's the newer, better syntax:

void moveswapstr(string& empty, string & filled)
{
  //pseudo code, but you get the idea
  size_t sz = empty.size();
  const char *p = empty.data();
  //move filled's resources to empty
  //filled becomes empty
}

Regarding the first comment, no, I really don't, unless the point is that this is what the code for "moving" would look like if implemented in older versions of C++. But also:

If you're implementing a class that supports moving, you can declare a move constructor and a move assignment operator like this:
class Movable
{
  Movable(Movable&&); //move constructor
  Movable&& operator=(Movable&&); //move assignment operator
};

Ok, cool... But where is this used in the "moveswapstr" example? Does this make the "naiveswap" example automagically faster? Or is there some other syntax? It doesn't really say:

The C++11 Standard Library uses move semantics extensively. Many algorithms and containers are now move-optimized.

...right... Still, unless I actually know what this means, it's useless.

It looks like there's a lot of good stuff here, and the article is decently organized, but the actual writing leaves me balanced between "Did I miss something?" like the above, and enough confusion that I'm actually confident the author screwed up. For example:

In C++03, you must specify the type of an object when you declare it. Yet in many cases, an object's declaration includes an initializer. C++11 takes advantage of this, letting you declare objects without specifying their types:
auto x=0; //x has type int because 0 is int
auto c='a'; //char
auto d=0.5; //double
auto national_debt=14400000000000LL;//long long

Great! Awesome! Of course, this arguably should've been there to begin with, and the 'auto' in front of these variables is still annoying, coming from dynamically-typed languages. But hey, maybe I can write this:

for (auto i = list.begin(); i != list.end(); ++i) ...

Instead of:

for (std::list<shared_ptr<whatever> >::iterator i = list.begin(); i != list.end(); ++i) ...

It's almost like C++ wanted to deliberately discourage abstraction by making it as obnoxious as possible to use constructs like the above. Anyway, that's what I expected the article to say, but instead, it says this:

Instead, you can declare the iterator like this:
void fucn(const vector<int> &vi)
vector<int>::const_iterator ci=vi.begin();

...what? Am I missing something, because this doesn't seem to be about type inference at all. Did we switch to another topic without me noticing? Nope, it continues:

C++11 offers a similar mechanism for capturing the type of an object or an expression. The new operator decltype takes an expression and "returns" its type:
const vector<int> vi;
typedef decltype (vi.begin()) CIT;
CIT another_const_iterator;

That's cool. Kind of funny how it needs yet another language construct to do it, or that I've been able to do stuff like this in other languages for ages, but hey, I can't complain... much.
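For what it's worth, here's a minimal sketch of how the two look side by side (the function name first_element is mine, not from TFA):

```cpp
#include <vector>

// Illustrative sketch: auto and decltype both recover the verbose
// const_iterator type without anyone having to spell it out.
int first_element(const std::vector<int>& vi) {
    auto it = vi.begin();             // deduced: std::vector<int>::const_iterator

    typedef decltype(vi.begin()) CIT; // same type, captured without an initializer
    CIT ci = it;
    return *ci;                       // assumes vi is non-empty
}
```

Same type either way; auto needs an initializer to deduce from, decltype doesn't.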

But the article seems full of stuff like this. Either I suck at C++ more than I thought, or this guy should've at least proofread a bit. I mean...

If the changes in C++11 seem overwhelming, don't be alarmed. Take the time to digest these changes gradually.

It's not the changes that seem overwhelming, as these are mostly taking the good ideas from other languages and backporting them to C++. If I'm alarmed, it's because C++ was already bloated from the first attempt at this -- backporting objects and exceptions to C. But what's actually overwhelming here is trying to read this article, and that's not the language's fault.

Re:Does TFA actually explain things? (2)

siride (974284) | more than 3 years ago | (#36512354)

The article messed up in a number of places as you surmise. You can use auto in for-loops. I don't know why the example didn't show that properly (I was scratching my head).

As for the swap example, what ends up happening is that with move semantics, you can go back to using the naive version, but it will actually be efficient because under the hood it uses rvalue references instead of copying. It will behave as if it were written using all that "pseudo-code" (don't know why he called actual code "pseudo-code"), but without all the horror.
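To make that concrete, here's a hedged sketch of a move-enabled type and the naive swap (the class and function names are mine, not the article's):

```cpp
#include <string>
#include <utility>

// Sketch of a move-enabled type; names are illustrative, not from TFA.
struct Movable {
    std::string data;
    explicit Movable(std::string s) : data(std::move(s)) {}
    Movable(Movable&& other) : data(std::move(other.data)) {}   // move constructor
    Movable& operator=(Movable&& other) {                       // move assignment
        data = std::move(other.data);
        return *this;
    }
};

// The "naive" swap, now efficient: three moves, zero deep copies,
// because std::move makes each source bind to the move operations above.
void naiveswap(Movable& a, Movable& b) {
    Movable tmp = std::move(a);
    a = std::move(b);
    b = std::move(tmp);
}
```

This is also why move-optimized containers speed up code you didn't change: the temporaries the compiler produces are rvalues, so they pick the cheap path automatically.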

STL (0)

Anonymous Coward | more than 3 years ago | (#36511994)

C++ is a brilliant language at its core. The only glaring flaws were the default copy constructor and the woes of casting with multiple inheritance. Looking back, the redundant header vs. implementation file declarations are also a bit tedious. However, I'm willing to overlook these as minor flaws in an otherwise good standard.

Sadly, STL and the standard C++ library are C++'s Achilles' heel. The container libraries were overly complex and suffered the woes of C++ template limitations. Remember those STL container compiler errors where a single type name requires the entire screen to render? iostream, while innovative, ultimately proved less practical than stdio, and is not as good as contemporary solutions when it comes to string localization, where parameter position is locale-dependent.

Most importantly, the C++ library is woefully inadequate. What makes Java, .NET, Perl, Python, etc. so great is the library support. In my opinion, way too much emphasis was placed on garbage collection in discussions around C++, and not enough on a robust library. To be fair, platform issues back then made this a big challenge. However, had there been an effort to create an industry-standard (free) robust library for C++ that was as feature-rich as Java and .NET, we'd see more C++ programming today. (Yeah, I know there were attempts at this, but none of them made it into the standard.)

As a C++ programmer who went on to projects in the managed-runtime languages, and who recently, due to platform requirements, has returned to active C/C++ development, I am amazed at how my experiences with those higher-level languages now influence my C++ writing. I credit those Java and .NET experiences with helping me avoid the C++ pitfalls that lead to problems. The resulting C++ is much more elegant than what I used to write. I enjoy the raw power of native code, although for most applications it is not practical.

Shouldn't they call this... (1)

Trailer Trash (60756) | more than 3 years ago | (#36512128)


Ignore this article (1)

MobyDisk (75490) | more than 3 years ago | (#36512388)

Do not read this article; it makes C++0x look bad by giving terrible examples of the new features. Even features I've been excited about look stupid after reading this. The article shows how *not* to use a lambda expression; a regular for loop would be better there. Using "auto" on int and long does work, but defeats the purpose. The second example of auto doesn't even make sense, since it doesn't actually include the word auto. It should be something more like: auto ci=vi.begin();

Just ignore this and go to Wikipedia or Google.
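For contrast, a place where a lambda is actually a win is a short throwaway predicate handed straight to an algorithm; a minimal sketch (the function name is mine):

```cpp
#include <algorithm>
#include <vector>

// Illustrative use of a C++11 lambda: an inline predicate for
// std::count_if, replacing a one-off named functor or free function.
int count_even(const std::vector<int>& v) {
    return static_cast<int>(
        std::count_if(v.begin(), v.end(),
                      [](int n) { return n % 2 == 0; }));
}
```

In C++03 this needed a separate functor struct or a function pointer; the lambda keeps the logic at the call site.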

Why is a garbage collector even needed? (3, Insightful)

ifrag (984323) | more than 3 years ago | (#36512400)

What is the big fuss about getting a garbage collector anyway? Why does it even matter? Good C++ code shouldn't need a garbage collector. If memory was allocated within an object, then the destructor should be taking care of it. And with shared_ptr (which people should start using), it's taken care of there anyway. Is this wanted so everyone can start writing sloppy C++ and forget about the delete calls? I suppose for those using some 3rd-party library that behaves poorly and is totally out of your control, it could be nice to stop it from leaking all over. Still, it should have been done right in the first place.

I suppose there might be some argument for preventing excessive memory fragmentation. Is there some other benefit to having one?
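A quick sketch of the RAII point, using only the standard library (the Resource type is made up for illustration):

```cpp
#include <memory>

// Deterministic cleanup via shared_ptr: no GC, no explicit delete.
struct Resource {
    static int live;          // how many instances currently exist
    Resource()  { ++live; }
    ~Resource() { --live; }
};
int Resource::live = 0;

void use_resource() {
    auto r = std::make_shared<Resource>();  // refcount 1
    auto alias = r;                         // refcount 2, same object
}   // both pointers die here; the Resource is destroyed immediately
```

Unlike a tracing collector, the destructor runs at a known point, so the same mechanism also works for files, locks, and sockets, not just memory.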
