
RIP: Betty Holberton, Original Eniac Programmer

chrisd posted more than 12 years ago | from the more-than-just-a-mod_perl-hack dept.

News 154

DecoDragon writes "Betty Holberton, one of the original ENIAC programmers, died on December 8th. An obituary describing her many achievements as well as her work on the ENIAC can be found in the Washington Post. Her accomplishments included contributing to the development of COBOL and Fortran, and coming up with using mnemonic characters for commands (i.e. "a" for add). She was awarded the Lovelace Award for extraordinary accomplishments in computing from the Association for Women in Computing, and the Computer Pioneer Award from the IEEE Computer Society for "development of the first sort-merge generator for the Univac which inspired the first ideas about compilation.""
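The sort-merge generator cited in the award produced programs that sorted batches of records and then merged the sorted runs into one ordered file. Holberton's actual UNIVAC code is long gone; purely as an illustrative sketch (all names here are invented), the two phases look like this in Python:

```python
import heapq

def sort_phase(batches, key):
    """Sort each input batch independently, producing sorted runs."""
    return [sorted(batch, key=key) for batch in batches]

def merge_phase(runs, key):
    """Merge the already-sorted runs into a single ordered stream."""
    return list(heapq.merge(*runs, key=key))

# Two unsorted batches of (id, name) records on "tape"
batches = [[(3, "c"), (1, "a")], [(4, "d"), (2, "b")]]
runs = sort_phase(batches, key=lambda rec: rec[0])
merged = merge_phase(runs, key=lambda rec: rec[0])
# merged: [(1, 'a'), (2, 'b'), (3, 'c'), (4, 'd')]
```

The split into a sort pass over runs that fit in memory, followed by a merge pass over the sorted runs, is still the shape of external sorting today.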


and they say girls are only good... (-1, Troll)

Anonymous Coward | more than 12 years ago | (#2689876)

for cleaning and cooking... the two c's
awful sexist i know

Re:and they say girls are only good... (-1, Flamebait)

Anonymous Coward | more than 12 years ago | (#2689892)

They *ARE* good for polishing shoes and giving me money for crack!

Re:and they say girls are only good... (-1, Offtopic)

Anonymous Coward | more than 12 years ago | (#2689980)

are you buying GNU crack? or Medical/government regulated crack?

Re:and they say girls are only good... (-1, Flamebait)

Anonymous Coward | more than 12 years ago | (#2690101)

dont forget giving head

using 'a' for add? (-1)

Anonymous Coward | more than 12 years ago | (#2689884)

boy, that was a stretch. how'd she come up with that?

The TTBs (0)

Anonymous Coward | more than 12 years ago | (#2689886)

The TTBs [after-y2k.com] weep for her.

Now that's hard work.... (-1, Troll)

Bonker (243350) | more than 12 years ago | (#2689891)

Hmmm... what will we call the Addition command?

Re:Now that's hard work.... (-1, Troll)

Anonymous Coward | more than 12 years ago | (#2690135)

Hey, most women would have used "p"... because of how they add recipie ingredients together pouring. Thank goodness that Betty, like most female computer users, couldn't cook!

-- The_Messenger

Big ass memorial (1, Funny)

Anonymous Coward | more than 12 years ago | (#2689902)

They should errect the ENIAC like the Vietnam Wall somewhere and scrawl all the dude's names on the back of it... ;)

Re:Big ass memorial (0)

Anonymous Coward | more than 12 years ago | (#2690124)

Women of her generation only did a single dude, and then they married him. Omigod! According to my calculations based on her age at death, this women begat geek children, who begat geek children, who begat... slashdot!

Why didnt anyone think to stop her? Now its too late to save the world from being boring.

Re:Big ass memorial (1)

vax (251660) | more than 12 years ago | (#2690259)

it seems only fair; after all, the not-so-glam pioneers of computers still need to be remembered, and in my opinion more so than their glam counterparts. I'm glad Slashdot posted this; it's good to know that interesting things that are not Microsoft related (at all) are still being posted. It occurred to me that, inadvertently, Slashdot is giving Microsoft free advertising by putting them up on a (what seems like) weekly basis. I don't know about the rest of you, but I'm a bit tired of hearing Microsoft does this and Microsoft is doing that in court (does anyone care anymore? I don't). Heh, well, keep up the good articles, guys.
peace love and free thinking
VAX


Loss and Gain (3, Insightful)

The Great Wakka (319389) | more than 12 years ago | (#2689903)

Sometimes, the world loses a great person. While her accomplishments may seem minor compared to those of the modern-day programmers, she laid an important stone in the foundation of modern computer science. Can you imagine life without her? One whole section of a computer's logic would be eliminated. Perhaps she made some obscure discovery that tomorrow will change the way we think about computers.

Re:Loss and Gain (0, Troll)

Anonymous Coward | more than 12 years ago | (#2689917)

Can you imagine life without her? One whole section of a computer's logic would be eliminated.

Dude, it was a mnemonic. She didn't invent addition.

Re:Loss and Gain (1)

eddy the lip (20794) | more than 12 years ago | (#2690241)

nobody bloody reads anything anymore. if you can't be bothered to do more than skim the summaries, how can you bother posting (or moderating. yes, i mean you, who modded this up insightful)? if you had read it, you would have seen the bit that referred to the work she did that was pivotal in the creation of these things called compilers. you know, things that turn code into something computers understand.

granted, it was at the end of the article, and "generator" and "compilation" have four syllables apiece, so i should probably cut you some slack...

Re:Loss and Gain (1)

vax (251660) | more than 12 years ago | (#2690272)

leave it to the Anonymous Cowards to mock significant achievements. all i have to say to you is that your name suits you well. perhaps when you grow up you will learn to make your criticisms under a real name.
VAX


Re:Loss and Gain (2)

Safety Cap (253500) | more than 12 years ago | (#2689958)

Can you imagine life without her?
What's sad is that she's one of the pioneers who was underappreciated ("semi-professional"... WTF is that?) in her time, and virtually unknown today -- yet she's directly/indirectly responsible for many of the things we take for granted.

The average John Q. Public neither knows nor cares about people like that, but doesn't think twice when he sorts a column on his spreadsheet. Perhaps he should.

Re:Loss and Gain (4, Insightful)

selan (234261) | more than 12 years ago | (#2689963)

I realize that your comment was meant as praise, but it really belittles what she achieved.

How could her accomplishments possibly be minor compared with today's programmers? Today we may code operating systems or apps, but she helped to invent programming. She did "change the way we think about computers."

Read the obit first, it's very interesting and you might actually learn something.

Re:Loss and Gain (0)

Anonymous Coward | more than 12 years ago | (#2690118)

It's called karma whoring.

Re:Loss and Gain (3, Insightful)

Usquebaugh (230216) | more than 12 years ago | (#2689964)

"One whole section of a computer's logic would be eliminated."

I doubt it, probably just attributed to another person. There are very few ideas that are not part of the society they spring from. It just depends on who is recognised as being first.

Re:Loss and Gain (3, Interesting)

Omerna (241397) | more than 12 years ago | (#2690217)

One thing I've noticed is that almost always another person is noted as having invented the exact same thing at the same time, but wasn't officially recognized as "first" and so gets no credit. The first example that comes to mind is the periodic table: Mendeleev was credited, but another chemist (anyone remember his name? I can't) did the same thing, very slightly differently, at the same time (I mean invented a table with the elements arranged like this, not an improvement upon it) and gets no credit because Mendeleev was recognized as first!

Anyway, the point is yeah, it would definitely have been invented within a few years (months?) of when it was.

Re:Loss and Gain (4, Interesting)

andres32a (448314) | more than 12 years ago | (#2690069)

Perhaps she made some obscure discovery that tomorrow will change the way we think about computers.

Actually she did. We know that software has not progressed as far as hardware. Most of its relative progress was made by the original ENIAC team. And Betty, more than anybody else on that team, wanted something that most modern-day programmers are also hoping for: to make computers fun, user fiendly and a good part of our daily life.

Re:Loss and Gain (3, Funny)

NonSequor (230139) | more than 12 years ago | (#2690345)

Computers already are user fiendly.

Bug (1, Informative)

genkael (102983) | more than 12 years ago | (#2689921)

Isn't she also the person who coined the term "bug" after finding a moth in the system that was shorting it out?

Re:Bug (1)

ckuske (19234) | more than 12 years ago | (#2689940)

Nope, that was Grace Hopper.

Re:Bug (3, Informative)

blair1q (305137) | more than 12 years ago | (#2689947)

That's more attributable to Grace Hopper, but she didn't coin it; she just made a joke of it, pasting the moth into her lab notebook and annotating it "First actual case of bug being found."

--Blair

Picture of the bug (2, Interesting)

bstadil (7110) | more than 12 years ago | (#2689957)

It was Grace Hopper. Look at this site [vt.edu]; they have a picture of the bug.

Full info... (3, Interesting)

kikta (200092) | more than 12 years ago | (#2689973)

See the page on 1945 [computer.org] , where it says:

"Grace Murray Hopper, working in a temporary World War I building at Harvard University on the Mark II computer, found the first computer bug beaten to death in the jaws of a relay. She glued it into the logbook of the computer and thereafter when the machine stops (frequently) they tell Howard Aiken that they are "debugging" the computer. The very first bug still exists in the National Museum of American History of the Smithsonian Institution. The word bug and the concept of debugging had been used previously, perhaps by Edison, but this was probably the first verification that the concept applied to computers."

No, that would be Grace Hopper (2)

devphil (51341) | more than 12 years ago | (#2690131)


Admiral Grace Hopper to you. :-)

Re:No, that would be Grace Hopper (1)

Irvu (248207) | more than 12 years ago | (#2690224)

Rear Admiral Grace Murray Hopper.

When she retired in 1986 at the age of 79, she was the oldest commissioned Navy officer on active duty. The retirement ceremony was held on the USS Constitution [navy.mil], which is also still on active duty.

I guess any woman tough enough to do what they did when they did (not sit at home) doesn't like to quit.

[OT] (2)

devphil (51341) | more than 12 years ago | (#2690314)


Yeah, I remembered the "Rear" as I pressed "Submit," but I wasn't certain how much of a difference that makes to the official rank. I guess it's just like the subdivisions in a general's rank. (I haven't done as much work with the Navy as with the other branches.)

She also won a Turing Award, didn't she? In '74 or '78?

Nope.... (2, Interesting)

NickFusion (456530) | more than 12 years ago | (#2690162)

The folk etymology of "bug" is that, in the early days of electronic computing, an actual insect flew into the innards of the Harvard Mark II, and caused a malfunction (this did happen), and that is where we get the word bug (in the sense of a flaw in the process). It seems however that the word was already in use in that sense in industrial manufacturing circles at the end of the 19th century.

(New Hacker's Dictionary)

dance on her grave (-1, Troll)

Anonymous Coward | more than 12 years ago | (#2689922)

We know who to blame for both fortran AND cobol! Go defecate on her grave!

Re:dance on her grave (4, Insightful)

MisterBlister (539957) | more than 12 years ago | (#2689978)

COBOL and FORTRAN were both wildly successful computer languages. They may seem a bit dated now, but considering their age and the fact that they were designed to work well on computers that are ridiculously less powerful than the system you have now (even if you're using a 286!) and factoring in that they were some of the first high level computer languages, with little research or history to draw upon, they were rather amazing accomplishments.

What languages have YOU designed?

Re:dance on her grave (0)

Anonymous Coward | more than 12 years ago | (#2690026)

i'd tell you, but that would reveal my true identity

sure a hell sight better than cobol and fortran

Re:dance on her grave (0)

Anonymous Coward | more than 12 years ago | (#2690245)

LOL... somehow I think that Ritchie, Stroustrup, Wall, or Gosling would sound more intelligent than you do. So let me guess... you invented Visual Basic? Bravo, my friend, you're a hero to the thousands of computer users who aren't intelligent enough to use PowerBuilder! It's really good for their self-esteem, too, telling them that they're "programmers" and all. Hell, you're probably single-handedly responsible for the creation of IT majors at hundreds of otherwise respectable universities around the world! Bravo!

-- The_Messenger
(also known as Mr. Flamebait)

Re:dance on her grave (1)

DavidJA (323792) | more than 12 years ago | (#2690392)

you invented Visual Basic?

Actually, the guy that invented Visual Basic [mailto] happens to be one of the richest men in the world.

Visual Basic [microsoft.com] is used by thousands of organisations around the world, and thousands of programmers make a lot of money by providing real-world solutions, written in Visual Basic, to real-world problems.

There are many computing/business problems out there for which C/C++/Linux is simply overkill.

I'm glad that your ability to program in c/c++/whatever gives you high self-esteem, but why don't you take your illusions of grandeur and shove them up your ass.

You give the /. and Linux community a bad name.

BTW - No, I'm not a vb programmer.

Re:dance on her grave (0)

Anonymous Coward | more than 12 years ago | (#2690606)

YHBT. YHL. HAND!

If you enjoyed this troll, you many also enjoy my other work from this article:

...and, of course... Yeah, and I'm really sorry for "giving the /. and linux community a bad name." I wouldn't want to ruin Slashdot's reputation as the home for hypocritical Microsoft bashing and gay porn.

HTH!

-- The_Messenger [geocities.com]

Re:dance on her grave (0)

Anonymous Coward | more than 12 years ago | (#2690107)

COBOL _is_ dated, but Fortran's still around and continuously evolving, much like C/C++ etc. Fortran 95 is about as similar to FORTRAN 77 as C++ is to BCPL. There are no good GPL/free Fortran 95 compilers yet, though there is a project to make one for gcc.

Re:dance on her grave (1)

DGolden (17848) | more than 12 years ago | (#2690122)

Yes, Java is the new COBOL. No, seriously - it's being used everywhere COBOL used to be in businesses. MS wants a piece of that market, and, no doubt, will get some, with C#.

Re:dance on her grave (0)

Anonymous Coward | more than 12 years ago | (#2690225)

Dude... who cares? This is about Betty Holberton and her accomplishments, not about Java and MS. Please leave your /. obsessions at the door.

Re:dance on her grave (0)

Anonymous Coward | more than 12 years ago | (#2690293)

Even I, the one who hates LISP so much, would never say that about a person. She was obviously someone with a great amount of intelligence to work on a project like that. Normally I would say something about LISP being the cause of this, but I will show some respect because I know when to quit. Betty will in fact be missed.

I just saw a video with (-1, Flamebait)

Anonymous Coward | more than 12 years ago | (#2689941)

This one girl holding a DILDO up the old tatter
while another girl was holding her tits n stuff
and then it all went wham! And I was like looking
Whaaats happening?

discuss ->

Re:I just saw a video with (0)

Anonymous Coward | more than 12 years ago | (#2689994)

URL plz?

Something to think about. (5, Insightful)

Matt2000 (29624) | more than 12 years ago | (#2689942)


From the article "By the completion of the ENIAC project in 1946, work that once took 30 hours to compute instead took 15 seconds."

Since most of us were born after the advent of computers we take for granted that mundane computation tasks can be automated for fairly low cost and at great time savings. However, for all that technological progress has been hailed in the last 20 years, is there any task that we have received this kind of improvement in efficiency on?
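For scale, the improvement quoted from the article works out to a factor of 7,200:

```python
# 30 hours of hand computation vs. 15 seconds on the ENIAC
hand_seconds = 30 * 3600          # 108,000 seconds by hand
eniac_seconds = 15
speedup = hand_seconds / eniac_seconds
print(speedup)  # 7200.0
```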

Are we becoming too focused on the day to day improvements in computing, each one of ever decreasing relevance to people who actually use the computer?

How can we focus more in the future on finding the areas where our efforts can be best utilized to produce efficiency gains of this sort, rather than Microsofting everything by putting 74 new features into a product just so a new product can be sold?

These kinds of questions are the ones that can best be answered by open source, where we are not constrained by profit. This should be what we think about in the future, rather than what features we can copy from someone else's software just because they have it and we don't.

Re:Something to think about. (4, Interesting)

MisterBlister (539957) | more than 12 years ago | (#2690015)

This may come off sounding like a flame, but I don't intend it to -- I fully support the notion of Open Source software and have released various bits of OSS myself.

Having said that, for OSS to foster the giant leap forward that you suggest would require a large shift in the way people look at and create OSS. The simple truth is that 99.99% of all OSS is just reinvention of closed source software to scratch an itch or for political reasons. This is not the type of environment in which such a leap springs forth.

While Open Source has many benefits, it would take an awful lot for me to agree with your premise that it's better suited than closed source for the type of efficiency gain you're looking for. Such leaps are often made by one or very few people, with everyone else following later. Given that, such a leap is just as likely to occur with plain old closed source as with OSS.

Re:Something to think about. (-1, Offtopic)

Anonymous Coward | more than 12 years ago | (#2690016)

Sheesh. Are you on Katz' medication now?

Re:Something to think about. (-1)

Anonymous Pancake (458864) | more than 12 years ago | (#2690151)

Computer advances are nothing compared to the human advances Western society has made in the last 100 years.

100 years ago, most women couldn't vote or work regular jobs. 50 years ago, our society was still very racist. 25 years ago, homosexuality was considered a disease.

It's not just in the area of computers that we have made giant leaps forward, and many people take their own freedom of lifestyle for granted.

Re:Something to think about. (2)

LordNimon (85072) | more than 12 years ago | (#2690160)

Open Source is not some kind of new paradigm of computing, it is simply a development and distribution model, and not a new one either. There's nothing "special" about open source. In fact, considering how open source, by definition, results in lower revenue for the developers, I would expect more innovations to occur in closed source code. After all, the more money that a company makes, the more it can invest into R&D.

Of course, there are exceptions, but they are only exceptions.

Wheelier wheel (2, Insightful)

NickFusion (456530) | more than 12 years ago | (#2690202)

So what are you suggesting, that we invent a wheel that is an order of magnitude...wheelier?

We're talking about a basic shift in the way things are done, from humans adding columns of numbers to an industrial number-adding machine.

You don't get the next big thing from microsoft, or from open source, or from programming at all.

You get it from inventing the next widget that automates, streamlines, accelerates some human activity.

What is it? A better word processor? Nope. Who knows. An automated intuiter? An enlarged and sped-up memory core for the human brain? Something that turns dioxin into peanut butter?

Ginger?

Damned if I know, kemosabi. But when you're making those kind of calls, you're in the high country....

Re:Something to think about. (2)

devphil (51341) | more than 12 years ago | (#2690254)

"Computers make it easier to do a lot of things. Trouble is, most of those things don't need to be done."

(Somebody wanna help me with who said that?)

Consider the FFT. (4, Informative)

RobertFisher (21116) | more than 12 years ago | (#2690339)

I've heard Cooley & Tukey's original 1965 paper "An Algorithm for the Machine Calculation of Complex Fourier Series" on the FFT algorithm cited as such a vast improvement. (Indeed, it has been called [siam.org] "the most valuable numerical algorithm in our lifetime" by the applied mathematician Gilbert Strang.) When you consider it is an N log N algorithm, as opposed to previous N^2 methods (amounting to a factor of ~ 100 in computational efficiency for N ~ 1000, and even bigger gains for larger N), and just how often Fourier methods are used in all branches of computational science, you begin to appreciate how significant their achievement was.

One should realize that the most fundamental numerical algorithms do not change very rapidly. The most common numerical algorithms (sorting, linear algebra, differential equations, etc., both in serial and parallel) have been the subject of intense research by an army of applied mathematicians over the last half-century. All you have to do to take advantage of that work is to call your friendly local numerical library [netlib.org] .

Of course, sophisticated 3D graphics methods are still the subject of intense research.

So in sum, I would argue that as far as "serious" numerical methods go, excellent solutions usually exist. (These methods are "open source", indeed open source before the term existed! They are usually published in the scientific literature.) The main gains that remain are in "entertainment" applications.

Bob
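To make the N log N vs. N^2 gap concrete, here is a minimal recursive radix-2 Cooley-Tukey FFT (a sketch for illustration, not a production routine; input length must be a power of two), checked against the direct O(N^2) DFT:

```python
import cmath

def dft(x):
    """Direct O(N^2) discrete Fourier transform."""
    n = len(x)
    return [sum(x[k] * cmath.exp(-2j * cmath.pi * m * k / n) for k in range(n))
            for m in range(n)]

def fft(x):
    """Recursive radix-2 Cooley-Tukey FFT, O(N log N)."""
    n = len(x)
    if n == 1:
        return list(x)
    even = fft(x[0::2])               # FFT of even-indexed samples
    odd = fft(x[1::2])                # FFT of odd-indexed samples
    twiddle = [cmath.exp(-2j * cmath.pi * k / n) * odd[k] for k in range(n // 2)]
    return ([even[k] + twiddle[k] for k in range(n // 2)] +
            [even[k] - twiddle[k] for k in range(n // 2)])

x = [1.0, 2.0, 3.0, 4.0, 0.0, 0.0, 0.0, 0.0]
assert all(abs(a - b) < 1e-9 for a, b in zip(fft(x), dft(x)))
```

For N ~ 1000 the direct sum does about a million complex multiplies while the FFT does on the order of ten thousand, which is where the factor-of-100 figure above comes from.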

Re:Something to think about. (2)

nusuth (520833) | more than 12 years ago | (#2690374)

Forget open source for a revolution; the next big thing will be affordable computers surpassing human brains in computational power, which could happen in thirty years if Moore's law continues to hold and our estimates of the brain's processing capacity are not very misguided. Building one today would take half a million top-end processors and billions of dollars, and would do no good; AI researchers must be able to access such machines for long time periods.

The next BIG thing will be actually putting that processing power to use, building machines more intelligent than ourselves. I can't see how that can fail to happen; maybe it will take much longer than 30 years, but I'm pretty sure the number is closer to 30 than 300.

Anything else happening in the period are just details, minor details.
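For what it's worth, the 30-year horizon above is just compounding. Assuming one common statement of Moore's law, a doubling every 18 months (the period is an assumption, not from the comment), 30 years is 20 doublings, roughly a millionfold:

```python
years = 30
doubling_period = 1.5                 # years per doubling -- assumed Moore's-law rate
doublings = years / doubling_period   # 20 doublings in 30 years
growth = 2 ** doublings
print(doublings, growth)  # 20.0 1048576.0 -- about a millionfold
```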

Fortran: Bill Gates' favorite programming language (-1, Troll)

Anonymous Coward | more than 12 years ago | (#2689960)

*shudder*

I heard from someone that women were only good for three things and two of which are cleaning clothes and cooking food.

(o Y o) So why do they bother achieving such
\ / status when they are only good for
( # ) fucking?
/ / \ \

Re:Fortran: Bill Gates' favorite programming langu (0)

Anonymous Coward | more than 12 years ago | (#2690134)

ahahah, that's some straight up funny shit ('cause it's soo true), mod this one up!

Re:Fortran: Bill Gates' favorite programming langu (0)

Anonymous Coward | more than 12 years ago | (#2690144)

Haha! Yes, I have to completely agree! So funny, yet so true!

I guess she finally decided to ... (2)

WillSeattle (239206) | more than 12 years ago | (#2689968)

decompile her code.

Another one for the bit bucket ...

while my compiler gently weeps ...

-

The Origin of Pale Grey Boxes, etc. (3, Interesting)

Alien54 (180860) | more than 12 years ago | (#2689998)

While engineers focused on the technology of computing, Mrs. Holberton lay awake nights thinking about human thought processes, she later told interviewers. - - - She came up with language using mnemonic characters that appealed to logic, such as "a" for add and "b" for bring. She designed control panels that put the numeric keypad next to the keyboard and persuaded engineers to replace the UNIVAC's black exterior with the gray-beige tone that came to be the universal color of computers.

Now we've got folks who want their cases midnight black.

But given all of the design issues we have seen, it is interesting to note that the human interface problem was being considered from the very beginning.

[Insert your Microsoft insult joke here]

Re:The Origin of Pale Grey Boxes, etc. (2, Funny)

Hektor_Troy (262592) | more than 12 years ago | (#2690691)

"[...] and persuaded engineers to replace the UNIVAC's black exterior with the gray-beige tone that came to be the universal color of computers."

Just goes to show, that even great minds make cockups from time to time!

Betty Picture (4, Informative)

andres32a (448314) | more than 12 years ago | (#2690008)

There is a nice picture of her here. [awc-hq.org] Just if anyone is interested...

roof*roof* Rurn roff ra rights raggy *roof*roof (0)

Anonymous Coward | more than 12 years ago | (#2690036)


"I know what you mean, scoob, ol' buddy. she ain't much of teen hearthrobb. She is... uhhhhhhhhh-gleeeeeeeeeeeee whith a capital YUCK in my dictionary. Oh why does Fredie and Daphne always go off together and me, you, and thelma get chased by the ghouls and ghosts?"

That's not a REAL women BABY! Take off that mask (0)

Anonymous Coward | more than 12 years ago | (#2690232)

And you'll find Bill Gates.

Re:Betty Picture (1)

ardiri (245358) | more than 12 years ago | (#2690266)

eek.. just as scary as our goatse friends! any 20-30 year old pics of this woman :) that 70+ year old pic scared me :)

This just in (-1)

Reikk (534266) | more than 12 years ago | (#2690012)

Betty Holberton, original Eniac Programmer, was found dead in her Maine apartment today. You may or may not like her work, but you cannot deny her contributions to the art of science fiction writing. We will miss her.

Appearantly, she fell and broke her hip but couldn't reach her medic alert bracelet in time. Her muscles gradually eroded, and she starved to death. Flies began eating her carcass. The flies will miss her tasty flesh, and I, her beautiful naked, wrinkly, boobies.

The most irritating part of it... (4, Redundant)

Eryq (313869) | more than 12 years ago | (#2690028)

I got my master's in Comp Sci at UPenn in '89 (I used to walk past some of the remnants of ENIAC on display there, every day). And I can't help but be saddened by this:

She hoped to major in the field [mathematics] at the University of Pennsylvania but was discouraged by a professor who thought that women belonged at home.

I'm glad she finally got her chance to shine during the war, but who knows what else she might have accomplished, had someone's idiotic prejudices not dissuaded her into working for the Farm Journal?

Stupid git.

Then again, maybe he just meant /home...

Re:The most irritating part of it... (-1, Flamebait)

Anonymous Coward | more than 12 years ago | (#2690132)

or maybe he meant

sh /home/wife/do_the_fucken_dishes_script

Parallel processing from the start. (5, Interesting)

Alien54 (180860) | more than 12 years ago | (#2690032)

The Army chose six women, including Mrs. Holberton, to program the ENIAC, which weighed 30 tons and filled a room. The women had to route data and electronic pulses through 3,000 switches, 18,000 vacuum tubes and dozens of cables.

"There were no manuals," one of the women, Kay McNulty Mauchley Antonelli, later told Kathleen Melymuka for an interview in Computer World. "They gave us all the blueprints, and we could ask the engineers anything. We had to learn how the machine was built, what each tube did. We had to study how the machine worked and figure out how to do a job on it. So we went right ahead and taught ourselves how to program."

Mrs. Holberton took responsibility for the central unit that directed program sequences. Because the ENIAC was a parallel processor that could execute multiple program sections at once, programming the master unit was the toughest challenge of her 50-year career, she later told Kleiman.

Now that is a programming challenge.

Imagine that the first programs were parallel processing problems from the start, with no manuals or instructions in programming, because they had to invent it all first. And the pressure of being in wartime as well.

very impressive indeed. one of those things that get done because no one knows it is impossible yet.

Imagine... (-1)

AnonymousCowheard (239159) | more than 12 years ago | (#2690076)

...the first BEOWULF Clusters were women working together!

Yo when are we giving out the Miss Crimpy Cable award, the Miss ALU award, and the Miss DSCK Probe Award? Oh wow all my favorite women in one room with a computer... meow, rairrrrrr.

Female Programmers (3, Insightful)

Lunastorm (471804) | more than 12 years ago | (#2690035)

And I've always thought that the first programmers were all men. I do wonder: is there a higher percentage of female programmers today, or has it fallen over time?
As for those who are belittling her use of mnemonics, you shouldn't take it for granted. Imagine having to type out 'file system consistency checker' instead of fsck, among other commands.

Re:Female Programmers (2)

os2fan (254461) | more than 12 years ago | (#2690110)

Regarding mnemonics, imagine having to type something like JMP 377 with the right tape loaded instead of typing in fsck.

The first computer I used was something like an 8086 (I think). The way you booted it was to load up a paper tape, manually insert the boot sector at a specific address, and then manually press the go sequence (by loading in something like 377).

This loaded a driver for the teletype machine, into which you could enter assembler codes (as numbers). Mnemonics like "a" for add and "b" for bring would have been an achievement worth speaking of.
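To see what single-letter mnemonics buy you over raw numeric codes like the 377 above, here is a toy translator. Only the letters "a" (add) and "b" (bring) come from the obituary; the opcode values and the third mnemonic are invented for illustration:

```python
# Hypothetical opcode table: the letter mnemonics follow the obituary
# ("a" for add, "b" for bring); the octal opcode values are made up.
OPCODES = {"a": 0o01, "b": 0o02, "c": 0o03}  # add, bring, clear (clear is invented)

def assemble(source):
    """Translate lines like 'b 100' into (opcode, address) pairs."""
    program = []
    for line in source.strip().splitlines():
        mnemonic, address = line.split()
        program.append((OPCODES[mnemonic], int(address, 8)))  # addresses in octal
    return program

# 'bring the word at 100, add the word at 101' -- far easier to key in
# and to proofread than the raw numeric equivalent.
print(assemble("b 100\na 101"))  # [(2, 64), (1, 65)]
```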

Re:Female Programmers (5, Insightful)

devphil (51341) | more than 12 years ago | (#2690291)


No, it's a question of perceived status. At that time, being a computer -- recall that 'computer' was the title of the person doing the math, not the noisy room-sized thing you did the math on -- was considered something of a drudge job. The men discovered the algorithms, the women did the computing.

Later, as the idea of working with a (machine) computer as a career became more fashionable, more and more men moved into the field, as it was no longer considered "merely" women's work.

Remember Lady Ada Lovelace, the first programmer? Babbage couldn't be bothered to do the menial work of actually designing algorithms. Then the act of designing algorithms lost some of its stigma, and men took over. Finally the act of actually coding the algorithms has lost its stigma, and so I (a male) can sit here making a fabulous living as a coder, while my equally-talented coder girlfriend doesn't make as much money.

The glass ceiling is still there. It just shifts up and down to include/exclude different professions as culture changes. :-(

Re:Female Programmers (2, Interesting)

Suppafly (179830) | more than 12 years ago | (#2690292)

The original computer programmers were all women, because there was a thought at the time that they would be better at working with computers than men, since they would be transitioning over from typing pools and from working as telephone operators, all of which were seen as women's jobs. There is also the thought that women were better at math than a lot of men, which is why women who couldn't get accepted into mathematics programs went into the budding computer field, where they were more readily accepted.

More Revisionist History (-1, Flamebait)

UltraBot2K1 (320256) | more than 12 years ago | (#2690055)

I'm sorry, but I have to take issue with this revisionist history being pushed down our throats by feminist sympathizers.

Clearly, this woman was not the "original" Eniac programmer. And even if she was, which is more important--building a computer or programming it? Anyone can do software, but hardware takes brains and persistence to do right.

Why don't we give the inventor of the Eniac this kind of credit? Or what about the countless hundreds of male computer pioneers who remain anonymous while this individual steals the spotlight simply because of her sex?

I'm all for celebrating the accomplishments of pioneers in computing. But affirmative action has no place in history.

Re:More Revisionist History (0)

Anonymous Coward | more than 12 years ago | (#2690105)

Crack open a history book or go to the Museum of American History in D.C. before you start ranting about "revisionist history". The whole article is ON THE LEVEL.

And she did software AND hardware; they programmed by soldering and wiring back then.
It was the 1940s, BEFORE THERE WERE EVEN COMPILERS: whaddya think they used?

Wassamatta? Can't bear the though that someone without a penis did something important, while nobody gives a damn about YOUR accomplishments? Got a little vagina envy maybe, HMMM?

But don't worry. I'm sure if you get properly laid someday, you'll see things differently...

Re:More Revisionist History (-1, Flamebait)

Anonymous Coward | more than 12 years ago | (#2690195)

Read the fucking article, moron. The engineers who designed the system deserve the credit -- Betty was just one of the system's early users. You see, kids, there was once a time when computer use was computer programming... if you wanted to do anything, you had to "program" it, because there was so little abstraction from the hardware. Betty's work on the ENIAC is analogous to using MS Word on a PC. She was a user. And if you read her list of accomplishments, there aren't a lot!

Not to knock her, because I'm sure that she contributed something... like, oooohh, read how she decided that the chassis should be beige! Wow, only a woman could be taught how to use a computer and then complain about what color it is. Maybe we should blame Betty for the invention of the iMac.

Anyway, I'm sure she was great, but we shouldn't exaggerate her accomplishments just because she's dead. Likewise, we shouldn't exaggerate her place in computing history just because she was a woman. The original poster is right on the money.

-- The_Messenger [geocities.com]

Re:More Revisionist History (-1)

I.T.R.A.R.K. (533627) | more than 12 years ago | (#2690228)

CmdrTaco doesn't have a penis, and he managed to create something (somewhat) important.
And yes, I have proof. Polaroids start at $500 on the auction block! ;)

Re:More Revisionist History (0)

Anonymous Coward | more than 12 years ago | (#2690646)

i think i'm in love!

puussshhhtt (-1, Troll)

Anonymous Coward | more than 12 years ago | (#2690063)

who ever heard of women programmers?

women and computers (-1, Troll)

Anonymous Coward | more than 12 years ago | (#2690073)

Today, every woman who is truly proficient at using computers -- as opposed to IT majors -- is either a flat-chested tomboy, or, more usually, a fat goth. I wonder what the corresponding fat-female computer user was like it Betty's day... a fat hippy? A fat beatnik?

Remember, kids, there's nothing stopping girls from being great computer users! Like my wife, for instance, who has revolutionized her recipie collection by learning how to use Microsoft Access. And hey, get her a cute little iMac and she'll never have to leave the kitchen! I'll bet that Betty definitely had to leave the kitchen in order to work with her computer -- she may have even had to shame her sex by working outside the house! How awful!

You have to really monitor your woman's computer use, though, or even the most flat-chested of girls will start to pork up. You see, to the female, a PC is just like a television: an excuse to sit on her ass and overeat while passively staring at a glowing screen. The Internet has become the third-leading cause of female ass-expansion, behind Jenny Jones and Friends. If she starts to get a bit too meaty, just fuck her in the ass, and after the fortieth or fiftieth anal reaming, she'll catch on. Unless she's blonde, in which case you may have to resort to a Filthy Sanchez. Woah!

-- The_Messenger

Re:women and computers (0)

Anonymous Coward | more than 12 years ago | (#2690153)

HAHAHAHAHA, this is some good shit man! these fucking slashdot nerds are fucked in the head. bunch of scrawny 130lb computer nerds with four eyes and shit. you hit the nail on the head with the one, my friend. mod this up!

Re:women and computers (0)

Anonymous Coward | more than 12 years ago | (#2690352)

hey, shut up.. I'm up to 125lb this week

What is your version of the Filthy Sanchez? (0)

Anonymous Coward | more than 12 years ago | (#2690207)

I'm asking because I know of 4 different kinds. How do you perform yours. har har har matey

Re:What is your version of the Filthy Sanchez? (-1, Troll)

Anonymous Coward | more than 12 years ago | (#2690290)

Start out fuckin' her "doggy style," stick your finger in her asshole until it's good and shitty, then smear the finger across her upper lip. Now she has a mustache approximating the look and smell of a Mexican steretype ("Sanchez").

If you have Flash -- and who doesn't, I mean you'd have to be running Linux otherwise -- check out this Newgrounds animation [newgrounds.com] for a visual demonstration. EvilDave calls it a "Dirty Sanchez," but it's the same sexual maneuver.

-- The_Messenger [geocities.com]

i think is safe (0)

Anonymous Coward | more than 12 years ago | (#2690077)

to say this column is being handled mostly by guys... I'm not saying that's a bad thing... it's just a fact ;)

What other famous woman programmers?!@! (-1, Flamebait)

Anonymous Coward | more than 12 years ago | (#2690165)

what other famous woman programmers?

Hardly any.

Why?

Because the chance a caucasian female has an IQ of 124 is a staggering eight times less likely than a male has.

Almost all retarded people are male, so the average IQ of both sexes is ironically 100, as per definition.

But the bell curve for mammal IQ shows that the longer a person goes through life before puberty, and puberty's effects on the brain, the higher the chance the animal might be super smart.

Tufts university showed that sexy women with large chests have lower IQ than masculine looking less curvy women.

And "The Bell Curve," that famous book, is chock full of redundant data meticulously showing why so few skilled scientists, engineers, programmers, chemists, biologists, etc. are female.

That is why so many men dominate programming... especially the competitive world of mass-market shrink-wrapped programming.

I get so sick of this "Amelia Earhart" style back-slapping of Girl Power. There were two people who died in Amelia Earhart's plane that fateful day; one was her male navigator, who was also skilled in things like Morse code and many other disciplines Ms. Earhart lacked.

The only good female coders I have seen are the kind of so-called women that attend CalTech. And I do mean "so-called."

Re:What other famous woman programmers?!@! (3, Informative)

andres32a (448314) | more than 12 years ago | (#2690209)

Most ENIAC programmers were women. Read this. [witi.com] You just might learn something.

Re:What other famous woman programmers?!@! (0)

Anonymous Coward | more than 12 years ago | (#2690395)

relax, dude - this article just shows that the population of women programmers just got cut in half.

'a for add' ?!? (0)

Anonymous Coward | more than 12 years ago | (#2690170)

She should be burned in effigy

Seriously though, it's still a big problem that women are underrepresented in comp sci/programming. While I don't particularly respect Cobol and Fortran as languages, I really respect the technical hurdles that the pioneers faced and the personal/scientific achievements of people like the dearly departed.

The worst part is that it occurs to me that the majority of us haven't learned to leave this legacy behind (rant on archaic languages and programming techniques), and that there's been a lack of real progress in this respect. 50 years and we don't have a significantly better language than Smalltalk (ok, maybe Lisp).

Re:'a for add' ?!? (0)

Anonymous Coward | more than 12 years ago | (#2690194)

What about Modula-2?

Or Ada?

At least good compilers were written for the former.

No matter what language you think is good, a female will have more difficulty using it than her average male counterpart.

More political correctness for the dickless wonder (-1, Flamebait)

Anonymous Coward | more than 12 years ago | (#2690181)

More political correctness for the dickless wonders.

Face facts... there are almost **NO** high IQ females inhabiting most of the skilled engineering professions.

Its genetics.

To a great geek, from a proud one, I salute you (3, Insightful)

Anonymous Coward | more than 12 years ago | (#2690199)

Ms. Holberton, this Jolt's for you. You are one of the few early computer geek veterans of war, an honor that few can claim. Thank you for what you have done for my country, and my profession.

Re:To a great geek, from a proud one, I salute you (-1)

I.T.R.A.R.K. (533627) | more than 12 years ago | (#2690250)

Jolt? You degenerate. Real geeks drink coffee.

This woman rocked in many aspects... (2, Funny)

haggar (72771) | more than 12 years ago | (#2690275)

First of all, she was one of the first programmers in the history of computing.
Second, she was probably the programmer with the longest active career: she started before the war and retired in 1983.
Third, hey, she had a husband 33 years younger than her!

I think she had a few things worth envying, huh?

Old time computing (4, Informative)

os2fan (254461) | more than 12 years ago | (#2690332)

Before the computer revolution, computers were expensive and frail.

My computer at college in 1981 was something nearing the end of its life. It was an 8086 with 4K of RAM and a paper tape drive. To boot it, you loaded up the tape, loaded three values into RAM (via a series of eight switches and a "set" switch), and then sent the command 377 to the processor. This would jump it to a location in memory and run the commands that you had loaded there (effectively JMP address), which would then run the KEX program. KEX was a driver for a teletype. After that, you entered input through the keyboard as assembler codes.

Compared to that, mnemonics like "a" for add and "b" for bring would have been a godsend.
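The leap from keying in raw numbers to single-letter mnemonics can be sketched as nothing more than a lookup table. This is a hypothetical toy assembler, not any real ENIAC or UNIVAC instruction set; the opcode numbers and the "h for hold" mnemonic are invented for illustration:

```python
# Toy single-letter-mnemonic assembler, in the spirit of "a for add".
# The opcode numbers are invented; before mnemonics existed, the
# programmer would have keyed in the right-hand numbers directly.
MNEMONICS = {
    "a": 0o01,  # add
    "b": 0o02,  # bring (load from store)
    "h": 0o03,  # hold  (store)
}

def assemble(source):
    """Translate lines like 'a 11' into (opcode, operand) number pairs."""
    program = []
    for line in source.strip().splitlines():
        mnemonic, operand = line.split()
        program.append((MNEMONICS[mnemonic], int(operand)))
    return program

print(assemble("b 10\na 11\nh 12"))  # → [(2, 10), (1, 11), (3, 12)]
```

The point of the sketch is how little machinery the idea needs, which is exactly why it was such a usability win at the time.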

On Fortran, BASIC and Cobol: in the days of wire-wound core, each bit of the byte made the machine more expensive, and there was some compromise on the size of the byte. Fortran was designed to run on a six-bit machine. Even Knuth's MIX is underpowered compared to modern computers.

BASIC is intended to run in small memory. MS made their packet by bumming it into 4K of RAM, with a point-and-shoot interface.

In effect, you moved a cursor around the FAT and pressed Enter on the file you wanted to run or edit, at least on the Tandy 1000. Still, I built an RPN multibase hackable calculator in 6K of code.

Where BASIC comes off the rails is that people start using it as a general programming language. Even so, its inability to pass parameters to subroutines is easily overcome.

Thus var1 = fn3130(x, v, z) can be written as:

A1=x:A2=v:A3=z:GOSUB 3130:var1=A1

In fact, once the kernel is written and documented, you can turn a generic RPN calculator script into specific special-purpose code. I had mine set up so that all variables in the calculator started with O, P and Q. The idea was that you could write messy code outside these letters and use the calculator as an input device.
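The calling convention described here -- marshal the arguments into fixed global variables, GOSUB, read the result back out -- can be mimicked in Python to show the shape of the pattern. The variable names A1..A3 follow the comment's convention; the subroutine body itself is invented for illustration:

```python
# Emulating BASIC's GOSUB calling convention: GOSUB takes no parameters
# and returns nothing, so arguments and results travel through
# agreed-upon global variables.
A1 = A2 = A3 = 0.0

def gosub_3130():
    """Stand-in for 'GOSUB 3130'; reads A1..A3, leaves the result in A1."""
    global A1
    A1 = A1 + A2 * A3  # invented body, purely for illustration

# The BASIC line  A1=x:A2=v:A3=z:GOSUB 3130:var1=A1  becomes:
A1, A2, A3 = 1.0, 2.0, 3.0
gosub_3130()
var1 = A1
print(var1)  # → 7.0
```

The fragility is also visible: every caller must know the globals' names, and nothing stops one subroutine from clobbering another's scratch variables -- which is why the naming discipline (reserving O, P and Q) mattered.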

And they say girls can't program. Ha. We just do it differently.

Re:Old time computing (-1, Flamebait)

Anonymous Coward | more than 12 years ago | (#2690399)

OS/2 - because choice is a terrible thing to waste.
May I suggest revising your .sig? I think that "OS/2 - Hahahahaha! OMG, this shit is hilarious!" would more accurately represent how lame OS/2 is. Your current .sig almost sounds as if it's possible to like OS/2.
And they say girls can't program. Ha. We just do it differently.
You mean they do it on OS/2? I guess the stereotypes are correct.

-- The_Messenger [geocities.com]

Like OS/2 - yes!!! (0)

Anonymous Coward | more than 12 years ago | (#2690533)

OS/2 was the OS of choice before the Linux revolution. Most people who used OS/2 did so by choice: you actually had to buy and install it.

The first PC grass-roots movement was for OS/2, and a lot of useful things were learnt from it (and from the mistakes made).

IBM's "abandonment" of OS/2 and the rise of Linux have led many OS/2ers to go over to Linux. Many freely admit to it.

Some of us stayed, largely because Linux is an entirely different paradigm from the OS/2 - Windows - DOS world we live in. But at least I am doing something by supporting an alternative to Windows.

Re:Old time computing (1)

Steve Bergman (7667) | more than 12 years ago | (#2690526)

>It was an 8086 with 4K of ram,

Hmmm. Sounds more like an 8080...

Re:Old time computing (2)

os2fan (254461) | more than 12 years ago | (#2690611)

It was years ago. Haven't seen the docos for 15 years. You may be right...

Re:Old time computing (-1, Flamebait)

Anonymous Coward | more than 12 years ago | (#2690629)

Yeah. Girls aren't very good with numbers that aren't found in recipe books.

-- The_Messenger

I find it amusing .... (3, Insightful)

os2fan (254461) | more than 12 years ago | (#2690372)

that when the mainframes ruled, computing was associated with DATA (i.e. bits, bytes, fields and records), as in Automatic Data Processing, Datamation &c, but now that data is easy to come by, it's INFORMATION (e.g. Information Technology).

I wonder how many IT people suggest technologies that are not computer-related: paper cards as a solution, for example. I know I have.

You see, once you start fiddling around with the hardware like Betty H did, you start using it wisely. It is one of the reasons that Unix works so well.

Re:I find it amusing .... (0)

Anonymous Coward | more than 12 years ago | (#2690666)

Hey, how many times are you going to post to this article?
You [slashdot.org]
must [slashdot.org]
be [slashdot.org]
bored! [slashdot.org]

I guess that it's hard for a female geek to find much to do with her evenings. Why not reread your Anne Rice novels, or compose another awful poem about how men are insensitive brutes? Yeah, that's the ticket!
