
Barbara Liskov Wins Turing Award

kdawson posted more than 5 years ago | from the getting-a-clu dept.


jonniee writes "MIT Professor Barbara Liskov has been granted the ACM's Turing Award. Liskov, the first US woman to earn a PhD in computer science, was recognized for helping make software more reliable, consistent and resistant to errors and hacking. She is only the second woman to receive the honor, which carries a $250,000 purse and is often described as the 'Nobel Prize in computing.'"


Turing test (5, Funny)

ignishin (1334571) | more than 5 years ago | (#27139395)

Does this mean she passed the Turing test?

Re:Turing test (5, Funny)

MrEricSir (398214) | more than 5 years ago | (#27139487)

I hope not. MIT professors are not human.

Re:Turing test (1, Interesting)

Anonymous Coward | more than 5 years ago | (#27139727)

While there's no doubting her accomplishments, I will say that my enjoyment of 6.170 was in spite of her.

Re:Turing test (2, Funny)

dedazo (737510) | more than 5 years ago | (#27139543)

At MIT, they give the test to the professors and the award to the machines. Yeeeaaahhh [instantrimshot.com]

Re:Turing test (3, Funny)

Stormwatch (703920) | more than 5 years ago | (#27140635)

No, it means she is Turing-complete.

Good for her... (4, Funny)

Em Emalb (452530) | more than 5 years ago | (#27139401)

I bet she has some stories from "the old days" of being about the only female geek around.

Good for her.

Re:Good for her... (0)

Anonymous Coward | more than 5 years ago | (#27139945)

"She" sure as hell didn't win this award for her looks.

Re:Good for her... (-1, Redundant)

overlordofmu (1422163) | more than 5 years ago | (#27140085)

Well, then it would not be the Turing test, would it?

And I, personally, do find her to be a beautiful woman.

This is the 21st Century isn't it?

We don't judge PhD holders by their appearance these days. Do we, AC?

Only the women (1)

HornWumpus (783565) | more than 5 years ago | (#27142413)

We judge all of them on their appearance. Deny it all you want but it's true. Women do it as well.

There are only two kinds of women...

More women in the old days (5, Interesting)

Simian Road (1138739) | more than 5 years ago | (#27140389)

Apparently there were far more women in computing in "the old days". The dominance of the male geeks is a relatively recent phenomenon.

Re:Good for her... (3, Funny)

saiha (665337) | more than 5 years ago | (#27140407)

Yeah, there are like twice as many now.

Re:Good for her... (0)

Anonymous Coward | more than 5 years ago | (#27140519)

Yeah, there are like twice as many now.

What? You mean there's actually an entire woman in the field now?

Re:Good for her... (1)

geekoid (135745) | more than 5 years ago | (#27141011)

There weren't any geeks in those days, just nerds.

In technology, there were certainly circus geeks.

And that 'comic book' guy? He was a dork.
But there wasn't the market to have a geek in the way we think of it now.

Purses and wallets? (4, Funny)

interkin3tic (1469267) | more than 5 years ago | (#27139413)

She is only the second woman to receive the honor, which carries a $250,000 purse and is often described as the 'Nobel Prize in computing.'

Did they give $250,000 wallets to the men who won previously?

Re:Purses and wallets? (4, Funny)

CaptainPatent (1087643) | more than 5 years ago | (#27139621)

No, they're just bragging about the luxurious accessories the award lugs around all day.

Re:Purses and wallets? (1)

morgan_greywolf (835522) | more than 5 years ago | (#27139741)

Well, if it isn't Italian leather and made by Gucci, they certainly got ripped off.

Re:Purses and wallets? (0)

Anonymous Coward | more than 5 years ago | (#27139999)

I would've held out for Corinthian leather.

Re:Purses and wallets? Nahh... she got a $250k (1)

davidsyes (765062) | more than 5 years ago | (#27141567)

purse, so she should be Prada of herself. If it's another brand, she can curl up with it and grope it and say, "Gucci gucci gnu... Gucci gucci gnu...", but she can always ln with LV... I wonder if that purse will hold CUPS...

Re:Purses and wallets? Nahh... she got a $250k (2, Insightful)

vishbar (862440) | more than 5 years ago | (#27142075)

Fashion jokes aren't gonna go too far on Slashdot, buddy. Just a word of advice.

Re:Purses and wallets? (1)

cthulu_mt (1124113) | more than 5 years ago | (#27139843)

Horses also win purses. I hope this is not a commentary on her appearance.

Re:Purses and wallets? (0)

Anonymous Coward | more than 5 years ago | (#27139995)

Yea, it's more like an insult to the horses.

Re:Purses and wallets? (0)

Anonymous Coward | more than 5 years ago | (#27141607)

Did the men win $312,500 wallets?

first female post! (-1, Offtopic)

Anonymous Coward | more than 5 years ago | (#27139429)

breaking into the all boys club.

Her biggest achievement (0)

Anonymous Coward | more than 5 years ago | (#27139473)

She was the chief architect for Internet Explorer.

So does that mean... (2, Funny)

Chris Mattern (191822) | more than 5 years ago | (#27139499)

...we can't tell her apart from a computer over a teletype link?

No, wait...

Relations all the way down (3, Informative)

Baldrson (78598) | more than 5 years ago | (#27139539)

Liskov says: "Today the field is on a very sound foundation."

If only it were true.

I recall, in fact, the point in time when I first ran across Liskov's CLU in the context of working on one of the first commercial distributed computing environments for the mass market, VIEWTRON, and determining that the real problem with distributed programming was finding an appropriate relational formalism.

We're still struggling with the object-relational impedance mismatch today. The closest we have come to finding a "solid basis" for computer science is a general field of philosophy called "structural realism [wordpress.com]", which attempts to find the proper roles of relations vs. relata in creating our models of the world.

If anything, our descriptions should be "relations all the way down" unless we can find a good way, as some are attempting, to finally unify the two concepts as conjugates of one another.

Re:Relations all the way down (0)

Anonymous Coward | more than 5 years ago | (#27139771)

I didn't understand a word of that, and I'm currently studying for my bachelor's degree (though still in the first half of it).

Should I be worried? Y/N

Re:Relations all the way down (0)

Anonymous Coward | more than 5 years ago | (#27139815)

No, the gp was pretty much garbage.

Yes (2, Informative)

Baldrson (78598) | more than 5 years ago | (#27140395)

Aside from academic pissing contests, you have a much more immediate worry: the lack of bankruptcy protection afforded to student loans, coupled with the trend in lifetime income prospects for CS graduates.

Re:Relations all the way down (0)

Anonymous Coward | more than 5 years ago | (#27139893)

[fancy, $3 words omitted]

Is that just a fancy way of saying, "She's got boobies!"

Re:Relations all the way down (0)

Anonymous Coward | more than 5 years ago | (#27139973)

Is this the blowhard's way of saying that if computer science were bridge building we'd still be chucking large rocks for stepping stones into the middle of the river hoping they don't wash away?

Re:Relations all the way down (1)

Baldrson (78598) | more than 5 years ago | (#27140559)

If this "blowhard" didn't care about getting CS out that condition.

Coincidentally (5, Informative)

counterplex (765033) | more than 5 years ago | (#27139577)

I happen to have a printout of an article on "The Liskov Substitution Principle" and was wondering just yesterday how it is that as programmers we use these principles in everyday life yet don't know their names or the stories of how they came about. Since she was the first US woman to earn a PhD in CS, I'm sure she has some interesting stories to tell about it.

For those who might not have her original text handy, the Liskov Substitution Principle states (rather obviously):

If for each object o1 of type S there is an object o2 of type T such that for all programs P defined in terms of T, the behavior of P is unchanged when o1 is substituted for o2, then S is a subtype of T.

which, when stated in the words of Robert "Uncle Bob" Martin as something we probably all intuitively understand from our daily work, is:

Functions that use pointers or references to base classes must be able to use objects of derived classes without knowing it
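
A minimal Java sketch of that property (Shape, Rectangle, Circle, and totalArea are illustrative names, not from Liskov's paper): a function written purely against the base type keeps working, unmodified, whichever well-behaved subtype it receives.

    // Hypothetical illustration of the LSP: code written against the
    // base type must work with any subtype, without knowing it.
    abstract class Shape {
        abstract double area(); // contract: returns a non-negative area
    }

    class Rectangle extends Shape {
        private final double w, h;
        Rectangle(double w, double h) { this.w = w; this.h = h; }
        double area() { return w * h; }
    }

    class Circle extends Shape {
        private final double r;
        Circle(double r) { this.r = r; }
        double area() { return Math.PI * r * r; }
    }

    public class LspDemo {
        // Defined purely in terms of Shape; under the LSP its behavior
        // is unchanged no matter which subtype is substituted in.
        static double totalArea(Shape[] shapes) {
            double sum = 0;
            for (Shape s : shapes) sum += s.area();
            return sum;
        }

        public static void main(String[] args) {
            System.out.println(totalArea(new Shape[] { new Rectangle(2, 3), new Circle(1) }));
        }
    }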

LSP it's not a guideline, it's a rule. (4, Interesting)

refactored (260886) | more than 5 years ago | (#27139811)

Sadly, too many people still think it's a guideline, not a rule. Sorry, if your code violates the LSP, you've got a bug, it just hasn't bitten you yet.

She deserves recognition for the vast number of latent defects the LSP alone has effectively removed from the world's software. I'm glad she got the award.
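
The canonical illustration of such a latent defect, taken from the general OO literature rather than from this thread, is a mutable Square extending a mutable Rectangle: everything compiles and works until somebody hands the square to code written against the rectangle's contract.

    // Classic textbook LSP violation (illustrative names): Square is a
    // Rectangle geometrically, but not behaviorally once the sides are
    // independently mutable.
    class Rect {
        protected int w, h;
        void setWidth(int w)  { this.w = w; }
        void setHeight(int h) { this.h = h; }
        int area() { return w * h; }
    }

    class Square extends Rect {
        // Keeping the square square silently changes behavior that
        // Rect's callers rely on.
        @Override void setWidth(int w)  { this.w = w; this.h = w; }
        @Override void setHeight(int h) { this.w = h; this.h = h; }
    }

    class LatentBug {
        // Perfectly reasonable code against Rect's contract...
        static int stretch(Rect r) {
            r.setWidth(5);
            r.setHeight(4);
            return r.area(); // 20 for a Rect, 16 for a Square: the bug bites
        }
    }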

Re:LSP it's not a guideline, it's a rule. (0)

ClosedSource (238333) | more than 5 years ago | (#27140055)

"Sadly, too many people still think it's a guideline, not a rule. Sorry, if your code violates the LSP, you've got a bug, it just hasn't bitten you yet."

It's really a very common design convention, but not following the convention isn't necessarily a bug. A bug is a deviation from requirements and requirements should never be assumed.

Re:LSP it's not a guideline, it's a rule. (1)

davidsyes (765062) | more than 5 years ago | (#27141613)

"but not following the convention isn't necessarily a bug."

Not following the convention may not be a bug, but it may be annoy... This very suck, this condition...

Re:LSP it's not a guideline, it's a rule. (1)

ObsessiveMathsFreak (773371) | more than 5 years ago | (#27142043)

It's really a very common design convention, but not following the convention isn't necessarily a bug.

But without it you'll end up with situations where a function will not accept a pointer to a triangle, because it will only take pointers to data type "polygon" from which triangle is derived.

If the rule is correctly applied, the relationships between base and derived classes should also be a lot clearer. If it doesn't make sense for a derived type to be passed as a base type, you've probably made a wrong decision in your code design.

I have nothing against LSP (1)

ClosedSource (238333) | more than 5 years ago | (#27142409)

"But without it you'll end up with situations where a function will not accept a pointer to a triangle, because it will only take pointers to data type "polygon" from which triangle is derived."

Yes, I understand the implications.

"If the rule is correctly applied, the relationships between base and derived classes should also be a lot clearer. If it doesn't make sense for a derived type to be passed as a base type, you've probably made a wrong decision in your code design."

There are no "wrong decisions" unless some requirement is unfulfilled.

My point is that we sometimes confuse dogma with correctness.

Re:LSP it's not a guideline, it's a rule. (1)

geekoid (135745) | more than 5 years ago | (#27141065)

"if your code violates the LSP, you've got a bug, it just hasn't bitten you yet. ..."

False.

Re:LSP it's not a guideline, it's a rule. (2, Interesting)

Catiline (186878) | more than 5 years ago | (#27141601)

"if your code violates the LSP, you've got a bug, it just hasn't bitten you yet. ..."

False.

Proof, please; you are contesting an award-winning theory, and I for one side with prevailing theory until further evidence is provided.

Re:LSP it's not a guideline, it's a rule. (1)

vishbar (862440) | more than 5 years ago | (#27142191)

Okay, so, stupid question. Wouldn't the very existence of virtual methods violate this principle? For example, if I have a method:

void notifyUsers(Publisher pub) { pub.publish(); }

where class Publisher contains a void method publish(). You have two subclasses of Publisher: EmailPublisher and SmsPublisher (functionality should be obvious; let's just say each sends a message to users via email/SMS). Would the behavior not be different based on whether I passed either of those two subtypes (assuming publish() was declared as virtual in the superclass and is overridden in both subclasses)? Of course, I am probably misunderstanding the entire LSP...
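
For what it's worth, a sketch of how that example is usually read (the class names are from the comment above; the bodies are invented for illustration): the principle constrains the contract of publish(), roughly "all users get notified", not the mechanism, so overrides that differ in how they deliver are exactly what virtual dispatch is for. In Java, every instance method is effectively virtual, so no keyword is needed.

    // Publisher/EmailPublisher/SmsPublisher are the names from the
    // comment; the bodies are illustrative assumptions.
    abstract class Publisher {
        // Contract: notify all users of the message. *How* is left to
        // the subtype; the LSP constrains the what, not the how.
        abstract void publish();
    }

    class EmailPublisher extends Publisher {
        void publish() { System.out.println("notifying users by email"); }
    }

    class SmsPublisher extends Publisher {
        void publish() { System.out.println("notifying users by SMS"); }
    }

    public class NotifyDemo {
        // Relies only on Publisher's contract, so either subtype is fine.
        static void notifyUsers(Publisher pub) { pub.publish(); }

        public static void main(String[] args) {
            notifyUsers(new EmailPublisher());
            notifyUsers(new SmsPublisher());
        }
    }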

Re:Coincidentally (1)

ranjix (892606) | more than 5 years ago | (#27140427)

the "behavior" should not change, or the results? if we can't change the "behavior" in subclasses, then I guess we're not allowed to overwrite methods (behavior) either.

Re:Coincidentally (3, Insightful)

shutdown -p now (807394) | more than 5 years ago | (#27141641)

I happen to have a printout of an article on "The Liskov Substitution Principle" and was wondering just yesterday how it is that as programmers we use these principles in everyday life yet don't know their names or the stories of how they came about.

To be honest, I would consider anyone who does not know what LSP is to be OO-ignorant, even if (s)he does code in an OO language. It is a very fundamental rule, much more important than all the fancy design patterns. I guess it's possible to "invent" it on your own, just as it's possible to normalize databases without remembering, or even knowing about, the strict NF definitions, but in either case, chances are high you'll get it wrong eventually.


1968 (4, Informative)

MoellerPlesset2 (1419023) | more than 5 years ago | (#27139695)

Since it's not in the article, I looked it up. She got her PhD in 1968.

I initially thought that kind of sucked (Cambridge's 'Diploma in Computer Science' has been awarded since 1954), but apparently the first US PhD in CS named as such was in 1965 (University of Pennsylvania).

The field could still use more women though.

Re:1968 (2, Insightful)

UnknownSoldier (67820) | more than 5 years ago | (#27140065)

> The field could still use more women though.

Why?

Do you complain that we need more pregnant men also?

Re:1968 (5, Insightful)

MoellerPlesset2 (1419023) | more than 5 years ago | (#27140631)

Why? Do you complain that we need more pregnant men also?

Men aren't capable of becoming pregnant. I, however, happen to believe women are just as capable of being good computer scientists as men are.
The fact that only a small minority of computer scientists are women means that upwards of half our best CS talent is going to waste.

I think that's a pity.

Re:1968 (1)

davidsyes (765062) | more than 5 years ago | (#27141699)

http://www.malepregnancy.com/ [malepregnancy.com]

http://www.cbsnews.com/video/watch/?id=4234033n [cbsnews.com]

Barbara Walters Exclusive: Pregnant Man Expecting Second Child
http://abcnews.go.com/Health/Story?id=6244878&page=1 [go.com]

Was yours a pregnant assumption? (Disclaimer: I am willing to be open that the above links may be hoaxes...)

Re:1968 (1)

Briareos (21163) | more than 5 years ago | (#27142569)

(Disclaimer: i am willing to be open that the above links may be hoaxes...)

*cough cough* [wikipedia.org]

np: 808 State - Goa (808 Archives Part IV)

That's a mutilated female! (0, Offtopic)

HornWumpus (783565) | more than 5 years ago | (#27142853)

Not a man.

Men have an X and a Y chromosome (sometimes an extra X or Y). Women have no Y chromosome.

How about we agree that men have the right to bear children even though they can't, not having a womb, which is no one's fault, not even the Romans.

We can also agree that women have the right to study CS, even though most can't, being bad at math, which is no one's fault, not even the Republicans.

Re:1968 (1, Troll)

CrimsonAvenger (580665) | more than 5 years ago | (#27142263)

The fact that only a small minority of computer scientists are women, means that upwards of half our best CS talent is going to waste.

Either that, or most of our women are much too sensible to waste time in a field like CS.

In other words, just because you think a field is important, doesn't imply that everyone agrees with you.

Re:1968 (1)

pjt33 (739471) | more than 5 years ago | (#27140651)

Cambridge's 'Diploma in Computer Science' has been awarded since 1954

It would be more accurate to say "was first awarded in..." because they recently shut it down after years of trying without much success to keep up the numbers.

Re:1968 (1)

drewvr6 (1400341) | more than 5 years ago | (#27140871)

"I" could use more women. But I don't see any government agency offering awards to subsidize that area. But good for her for getting the recognition she deserves.

Re:1968 (2, Insightful)

geekoid (135745) | more than 5 years ago | (#27141133)

"The field could still use more women though."

Why?
Not that there shouldn't be, but you are blindly stating something without any argument.
What does a woman bring that a man doesn't? Or vice versa?

When we can determine that, then maybe we can find out why the field continues to attract so few women. Even in the presence of programs that push very hard to get doors open and to give women priority in education, there just aren't a lot of women.

I am genuinely interested in why.
The more I think about it, the more I wonder if it is the type of work.

Hey mods:
There is a difference between genuinely wondering why a disparity exists (as I am) and sexism or bigotry.

Goddamn media (0)

Anonymous Coward | more than 5 years ago | (#27139699)

Liskov, [...] was recognized for helping make software more reliable, consistent and resistant to errors and hacking.

Who would want to make their software resistant to hacking? Let me guess: she worked on OpenOffice on behalf of Sun Microsystems.

Patronising git will miss irony of the above and correct me on the media meaning of 'hacking' in three... two... one...

making software more reliable? (3, Interesting)

erroneus (253617) | more than 5 years ago | (#27139743)

Software is ALWAYS reliable. It is the code that people write that sucks.

I don't know how many people come from the "old school" of programming, but when I started, we didn't have all these libraries to link to. When we wanted a function to happen, we wrote it. And when we wrote it, we checked for overflows, underflows, error status and illegal input. We didn't rely on what few functions that already existed.

Most fatal program flaws are ridiculously easy to prevent, but bad programming habits prevail and short of creating some human language interpreter that writes code as it should be written, nothing will replace lazy programmers who trust library functions too much. And yes, I know about deadlines and not having time to waste and all that stuff. But there is something most people are also missing -- pride! I know that when I do something, I am putting my name on it whether it is directly or otherwise. And if my name gets associated with something, I make damned sure that it works and is of good quality. With the stuff that goes out these days (especially SQL injection?! PLEASE! What could be more fundamental than screening out acquired text data for illegal characters and lengths?!) it is clear that pride in one's own work is not something that commonly exists.

For those of you out there who agree with me, this probably doesn't apply to you. For those who disagree, tell me why. Why is a programming error FIXABLE but not PREVENTABLE?
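
A tiny sketch of the kind of screening described above (a hypothetical helper, not anyone's real API; for SQL injection specifically, prepared statements, which come up further down the thread, are the stronger fix):

    // Hypothetical defensive helper in the spirit of the comment:
    // reject illegal lengths and characters before the data goes anywhere.
    class InputGuard {
        static String requireSane(String s, int maxLen) {
            if (s == null || s.isEmpty() || s.length() > maxLen)
                throw new IllegalArgumentException("bad length");
            for (char c : s.toCharArray())
                if (Character.isISOControl(c))
                    throw new IllegalArgumentException("control character");
            return s;
        }
    }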

Re:making software more reliable? (5, Insightful)

Anonymous Coward | more than 5 years ago | (#27139813)

Software is ALWAYS reliable. It is the code that people write that sucks.

No, computers are reliable. They'll do exactly what you tell them to do. Software, however, sucks, since it is simply a representation of the code that people write, which also sucks.

Re:making software more reliable? (5, Funny)

ChienAndalu (1293930) | more than 5 years ago | (#27139931)

No, electrons are reliable. They'll do what you tell them to do. Hardware engineers however design crappy hardware.

Re:making software more reliable? (3, Interesting)

Colonel Korn (1258968) | more than 5 years ago | (#27140015)

No, quantum mechanics is reliable. It defines physical uncertainties in a robust way. Electrons suffer from crappy positional and momentum certainty.

Re:making software more reliable? (3, Funny)

iminplaya (723125) | more than 5 years ago | (#27141777)

No, quantum mechanics is reliable. It defines physical uncertainties in a robust way.

Only when you're watching. Behind your back it's complete chaos.

Re:making software more reliable? (4, Funny)

Ardeaem (625311) | more than 5 years ago | (#27140087)

No, electrons are reliable. They'll do what you tell them to do.

I, for one, am never sure quite what my electrons are doing. After that Heisenberg guy, they've been a bit flaky...

Re:making software more reliable? (1)

AuMatar (183847) | more than 5 years ago | (#27140431)

Eh, I know exactly where my electrons are. I just have no clue where they're going.

Re:making software more reliable? (1)

spacefiddle (620205) | more than 5 years ago | (#27141825)

I'm not too certain where they've really been, either...

Re:making software more reliable? (0)

Anonymous Coward | more than 5 years ago | (#27142529)

...which is why this "was recognized for helping make software more reliable, consistent and resistant to errors and hacking" is funny, 'cause it seems she failed at the very things she's recognized for.

Re:making software more reliable? (2, Insightful)

morgan_greywolf (835522) | more than 5 years ago | (#27139987)

but when I started, we didn't have all these libraries to link to. When we wanted a function to happen, we wrote it.

Functions? Back when I started, we didn't have functions. We had jump instructions.

You kids and your newfangled 'functions' and 'libraries'. Now get off my lawn!

You had a lawn! (1)

ClosedSource (238333) | more than 5 years ago | (#27140129)

We just had dirt. Young whipper-snapper!

Re:making software more reliable? (2, Interesting)

Ardeaem (625311) | more than 5 years ago | (#27140171)

Functions? Back when I started, we didn't have functions. We had jump instructions.

When I first learned to program as a kid, I taught myself how to write pseudo-functions using goto. I looked back at those programs a few years later, and they were completely unreadable. Now, my wife does a little programming on the side (we're both researchers) and she loves goto. I keep trying to tell her NOT TO USE GOTO, but she never listens. It's painful to read her code. I think we might need counseling for this...

Re:making software more reliable? (0)

Anonymous Coward | more than 5 years ago | (#27140725)

Functions? Back when I started, we didn't have functions. We had jump instructions.

When I first learned to program as a kid, I taught myself how to write pseudo-functions using goto. I looked back at those programs a few years later, and they were completely unreadable. Now, my wife does a little programming on the side (we're both researchers) and she loves goto. I keep trying to tell her NOT TO USE GOTO, but she never listens. It's painful to read her code. I think we might need counseling for this...

She's a woman. Being fully known, fully explainable, and able to be followed at every step from start to finish threatens her control, over you and over the programs.

Re:making software more reliable? (1)

geekoid (135745) | more than 5 years ago | (#27141219)

Make her debug her own code; keep your hands off.

Re:making software more reliable? (1)

davidsyes (765062) | more than 5 years ago | (#27141775)

What, no "if then, else" in the way of "pre-else-tuals"? hehehe

Re:making software more reliable? (2, Funny)

spacefiddle (620205) | more than 5 years ago | (#27141859)

Now get off my lawn!

10 PRINT LAWN
20 GOTO CURB

Re:making software more reliable? (1)

oldhack (1037484) | more than 5 years ago | (#27140069)

Having a nice day? Or are you always like this?

If you run your own shop, you'd already have some hints as to why things are the way they are. If you're salaried, ask your boss.

Re:making software more reliable? (1)

n3tcat (664243) | more than 5 years ago | (#27140213)

As with all things, pride comes at a cost. If it's worth it to you to "finish" fewer projects at the expense of fewer dollars, good for you. I'm with you on that one. But I totally respect anyone else's decision to go balls to the wall and ignore all proper coding technique so they can Get The Job Done and get paid. Especially with the current economic climate.

Re:making software more reliable? (0)

Anonymous Coward | more than 5 years ago | (#27140429)

Most fatal program flaws are ridiculously easy to prevent, but bad programming habits prevail and short of creating some human language interpreter that writes code as it should be written, nothing will replace lazy programmers who trust library functions too much.

Only trust libraries as far as they are guaranteed to work. If you can't confirm that a library function will work in the way you intend to use it then don't use it. Yes, I'm saying that we need to use contracts and specifications. If nothing else, it assigns blame where it is due. You break the contract, you get the blame. I'm not asking for machine-checkable certifications of library code (although that would be most welcome). Any sort of consistent standard for interface specification would be a huge step forward at this point. I'm not going to assume that a library does sensible input validation until I get it in writing.
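
A minimal Java sketch of what a contract "in writing" might look like (the routine and its limits are hypothetical): a documented precondition plus a runtime check, so that when the contract is broken, the blame demonstrably lands on the caller.

    class ContractSketch {
        /**
         * Hypothetical library routine with its contract in writing.
         * Precondition: input is non-null and at most 64 characters.
         * Violating it is the caller's fault, and the check says so.
         */
        static String normalize(String input) {
            java.util.Objects.requireNonNull(input, "input must be non-null");
            if (input.length() > 64)
                throw new IllegalArgumentException("input exceeds 64 chars");
            return input.trim().toLowerCase();
        }
    }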

Re:making software more reliable? (0)

Anonymous Coward | more than 5 years ago | (#27140499)

Yes, clearly the biggest problem is that people are too good at using existing libraries and should reinvent everything, because that would be a lot more stable. Sigh. There is a whole subject on that kind of thinking: http://en.wikipedia.org/wiki/Not_Invented_Here

It's the absolute opposite. The biggest problem is that people are NOT using existing libraries enough. Your half-assed reinventions are not going to be any safer or faster.

I make good use of existing libraries. I go through the extra steps and make sure that no function I write will ever break for any input. It's not hard, just a bit tedious at times, and judging from the code I see from other projects, barely anything can handle bad input, so it's not that common a practice, unfortunately. People suck.

Re:making software more reliable? (1)

erroneus (253617) | more than 5 years ago | (#27141183)

You must not be terribly familiar with some of the nasty exploits that affected numerous programs because they all used the same faulty library eh? This goes for Linux/Unix as well as Windows.

Re:making software more reliable? (1)

AdamTrace (255409) | more than 5 years ago | (#27140587)

"And when we wrote it, we checked for overflows, underflows, error status and illegal input."

And you also wrote bugs that you didn't immediately realize you were creating.

Not all bugs are due to laziness. Sometimes people make mistakes, or misunderstand requirements, or the operating parameters change, or, or, or...

Re:making software more reliable? (2, Insightful)

Chibi Merrow (226057) | more than 5 years ago | (#27140757)

I don't know how many people come from the "old school" of programming, but when I started, we didn't have all these libraries to link to. When we wanted a function to happen, we wrote it. And when we wrote it, we checked for overflows, underflows, error status and illegal input. We didn't rely on what few functions that already existed.

That's great. Now that you guys built up the roads, bridges, and traffic lights... The rest of us are interested in actually using them to GET SOMEWHERE.

Rewriting OpenGL, a scene graph, network interface code, XML Readers, etc, etc, etc. for each project would only lead to increasing the amount of buggy code in existence and no actual work getting done. We really should be past reinventing the square wheel...

Seriously, you've gone beyond the stereotypical "In my day..." old coot bullshit and straight into loony "uphill both ways in a snowstorm" territory. Using library functions instead of writing your own isn't any different than using power tools over hand tools, the quality of the result has to do with the person using them, not how easy the tools are to use. And while something built by hand may *seem* nicer, the difference in the end doesn't really matter--and you're able to get a hell of a lot more done with modern tools.

Re:making software more reliable? (1)

oGMo (379) | more than 5 years ago | (#27140765)

Software is ALWAYS reliable. It is the code that people write that sucks.

Yeah right. Even a simple ADD instruction will give the wrong result when the hardware fails. And hardware will fail.

Software isn't "reliable, but." It's only as reliable as it can be. "Those damn kids and their fancy functions" isn't the problem. The problem is fundamental complexity; no magic wand will make that go away.

For those that disagree, tell me why? Why is a programming error FIXABLE but not PREVENTABLE?

Sure, you can write provably error-free code... but you have to solve the halting problem [wikipedia.org] first.

Re:making software more reliable? (1, Informative)

jc42 (318812) | more than 5 years ago | (#27141807)

Even a simple ADD instruction will give the wrong result when the hardware fails.

True, but the reality is much worse than that. A simple ADD instruction will also give a wrong result, on all current "popular" CPUs, when the hardware is working exactly as designed.

To the people who design CPUs, adding two positive integers and getting a negative result is exactly what the hardware should do in some cases, depending on the values of the integers. This wreaks havoc with software designs that assume the mathematical definition of integer addition.

Yes, I know that the hardware also sets an overflow flag bit somewhere to indicate that the result isn't (mathematically) correct. But the implementations of most programming languages, including all the common ones, knowingly and intentionally hide the overflow flag from the software. If you pick up a few of the top-selling programming-language texts and look through the index for information on how the language handles things like integer overflows, you typically won't find any mention of it. The people working at "higher" levels can't be bothered with such mundane details.

They get away with all this because the people paying money for the computer systems and the software aren't generally willing to pay for hardware or software that always produces correct results. Programmers who insist on such correctness tend to find themselves shuffled off to the side or laid off, in favor of programmers who can write software to management's release schedules.

This is all hardly a secret. People have learned that you don't have to be secretive about how crappy most software (or hardware) is, because the people in positions to control purchasing don't read the technical literature and don't particularly care about such geeky stuff as overflow bits. So we can talk about it all we like amongst ourselves; it makes no difference to the people signing the purchase orders and paying our salaries.
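
A small Java illustration of the point; like most popular languages, Java wraps silently by default and makes you ask for checked arithmetic by name:

    public class OverflowDemo {
        public static void main(String[] args) {
            int a = Integer.MAX_VALUE;      // 2147483647, positive
            System.out.println(a + 1);      // -2147483648: two positives, negative sum
            // The hardware's overflow flag is invisible at this level; a
            // checked add has to be requested explicitly, and it throws:
            System.out.println(Math.addExact(a, 1)); // ArithmeticException
        }
    }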

Re:making software more reliable? (1)

spacefiddle (620205) | more than 5 years ago | (#27141899)

Even a simple ADD instruction

10 POKE RITALIN
20 GOTO WHEEE

... I'll get me coat.

Re:making software more reliable? (1)

geekoid (135745) | more than 5 years ago | (#27141205)

Because the spec wasn't written well.

I'm talking about millions of lines here. Software that effectively gets the programmers compartmentalized, because no human can track everything everyone is doing at the same time.
So a poorly defined document dictating the layers is a bug waiting to happen.

Now some bugs are inexcusable. Divide by 0 crashes spring to mind as the most glaringly obvious inexcusable bug.

Re:making software more reliable? (1)

rkit (538398) | more than 5 years ago | (#27141527)

... but bad programming habits prevail ...

as promptly demonstrated by the following rant:

PLEASE! What could be more fundamental than screening out acquired text data for illegal characters and lengths?!

Answer: usage of prepared statements.
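
A minimal JDBC sketch of that answer (the users table and its columns are made up): the parameter is bound as data rather than spliced into the SQL text, so "illegal characters" have nothing to break out of.

    import java.sql.*;

    class PreparedStatementSketch {
        // Hypothetical lookup; the "users" table and "name"/"id" columns
        // exist only for this example.
        static void printIds(Connection conn, String name) throws SQLException {
            String sql = "SELECT id FROM users WHERE name = ?";
            try (PreparedStatement ps = conn.prepareStatement(sql)) {
                ps.setString(1, name); // bound as a value, never concatenated
                try (ResultSet rs = ps.executeQuery()) {
                    while (rs.next()) System.out.println(rs.getInt("id"));
                }
            }
        }
    }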

seriously? (1)

Evan55 (1496509) | more than 5 years ago | (#27139793)

I had to read one of her books for a grad school class. It was terrible. Chock full of errors, and tangents that were largely unrelated to software development.

Re:seriously? (2, Insightful)

Kozar_The_Malignant (738483) | more than 5 years ago | (#27139959)

tangents that were largely unrelated to software development.

Tangents are related to geometry, not software development. Besides, professors write textbooks so they can make their students buy them and get some of the students' money, not because they're any good at it. I thought everyone knew that.

She's been handing out Linux&BSD disks in class (0)

Anonymous Coward | more than 5 years ago | (#27139961)

was recognized for helping make software more reliable, consistent and resistant to errors and hacking

A very welcome change [blogspot.com] indeed... ;-)

Ceremony for prize was like this... (1)

stimpleton (732392) | more than 5 years ago | (#27140103)

...which carries a $250,000 purse.

A woman's purse!! [youtube.com]

oblig. (1)

Eil (82413) | more than 5 years ago | (#27140199)

Liskov, the first US woman to earn a PhD in computer science, was recognized for helping make software more reliable, consistent and resistant to errors and hacking.

Clearly, she's never worked for Microsoft.

Zing!

Re:oblig. Obviously, someone envied her... (0, Offtopic)

davidsyes (765062) | more than 5 years ago | (#27141999)

LESKO!

http://en.wikipedia.org/wiki/Matthew_Lesko [wikipedia.org]

Must've gone on an all-out CRAYZE to sub-do her...

I, for one, welcome our new age (0, Troll)

Tiber (613512) | more than 5 years ago | (#27140237)

of moody computing technology!

Don't let the award fool you. (1)

Penguinoflight (517245) | more than 5 years ago | (#27140707)

Liskov is a horrible author, and given my experience with her thoughts from "Program Development in Java," I would guess she is a horrible coder as well. Don't be conned into buying books based on an award; her works are conflicting where they aren't simply wrong.

Re:Don't let the award fool you. (2, Funny)

geekoid (135745) | more than 5 years ago | (#27141261)

HAHAHahahhaha...
Someone with your sig has the gall to write that about a book?

Irony is rich today.

Oh, and how about an example of where she is wrong? I don't think I have ever read her stuff, but I would like to see an example of what you are talking about.

Translation for Americans (0, Offtopic)

thewils (463314) | more than 5 years ago | (#27141509)

That stuff she's talking about - "Toilet Paper", it's the same stuff you guys, for some weird reason, call "Bathroom Tissue".

Re:Translation for Americans (3, Informative)

Chris Burke (6130) | more than 5 years ago | (#27141727)

We don't call it that. We call it toilet paper like normal people. Makers of toilet paper call it bathroom tissue, I guess because they want a name that's a little more distant from "ass wipe" or less evocative of a porcelain bowl filled with crap or something, though they'll talk about their "bathroom tissue" in advertisements while showing cartoon bears (chosen because as everyone knows, bears shit in the woods) with little scraps of toilet paper all over their fat bear asses, which I can't help but wonder who the fuck has this problem and why, but I'm afraid of the answer, and apparently the right brand of ass wipe will solve it so lets just try to forget about that okay?

What were we talking about? Oh right. It's called "pop". "Soda" is okay too I guess.

This is obviously a lie... (0)

Anonymous Coward | more than 5 years ago | (#27141523)

everyone knows that there are no girls on the interweb

Re:This is obviously a lie... (0)

Anonymous Coward | more than 5 years ago | (#27141849)

None who don't sit in their bra and panties gyrating to a webcam anyway

Nobel Prize in Computing? (0)

Anonymous Coward | more than 5 years ago | (#27142707)

It's becoming more like the Oscars of computing.
