
NYC Mayor Bloomberg Vows To Learn To Code In 2012

timothy posted more than 2 years ago | from the give-himself-a-fighting-chance dept.

Education

theodp writes "New York City Mayor Michael Bloomberg has announced his intention to take a coding class in 2012 via Twitter ('My New Year's resolution is to learn to code with Codecademy in 2012! Join me.'). So, is this just a PR coup for Codecademy, or could EE grad (Johns Hopkins, '64) Bloomberg — who parlayed the $10 million severance he received after being fired as head of systems development at Salomon Brothers into his $19.5 billion Bloomberg L.P. fortune — actually not know how to program? Seems unlikely, but if so, perhaps Bloomberg should just apply to be a Bloomberg Summer 2012 Software Development intern — smart money says he'd get the gig!"

120 comments

Cobol (5, Funny)

Anonymous Coward | more than 2 years ago | (#38623974)

Maybe he wants to know how to code in something besides COBOL and Fortran.

Re:Cobol (2, Funny)

Anonymous Coward | more than 2 years ago | (#38624070)

Maybe he wants to know how to code in something besides COBOL and Fortran.

Or morse...

Re:Cobol (4, Interesting)

Kristian T. (3958) | more than 2 years ago | (#38624314)

Fortran isn't that bad, considering it's from 1957. Anyone who can do Fortran could learn C++ very quickly. [begin rant] Cobol, on the other hand, was a step backwards the day it appeared in 1959, and its creators should be bludgeoned with a frozen fish for even writing the design document. And yes - I've written tons of Cobol - it doesn't grow on you. It's probably the first example of the fundamental misconception that it's desirable (if even possible) to make formal descriptions using informal language. The MBAs still think you can describe a piece of software in Word, and then it's a trivial process to make the software that customers want. Informal language is desirable to humans because it supports leaving out details - which is exactly what makes it useless for programming a computer. Using the word "plus" instead of the symbol "+" completely misses that fundamental point. [end rant]

Fortran & COBOL are ok... apk (0)

Anonymous Coward | more than 2 years ago | (#38624390)

Fortran comes VERY quickly to anyone who's done BASIC - in fact, iirc, it's often considered the "father of BASIC", but... don't quote me on THAT! I took the 77 standard of it; it's been MANY years since I used it last, though (1994).

COBOL's fine, IF you can stand the beginning "divisions" (environment and identification, mostly - those always seemed more like "documentation" to me, at least from what I recall; it's been many years, but I took it in 1984 for the COBOL-74 standard & again later for the COBOL-85 standard). Very wordy, but it does the job (a good report program generator for business purposes). It's been, oh... 17 years since I used it last, though.

APK

P.S.=> Of the 2, I prefer Fortran (easy to write, much like BASIC I felt) & it has LOADS of libraries that extend the basics in it to make it VERY powerful... apk

Re:Fortran & COBOL are ok... apk (0)

Anonymous Coward | more than 2 years ago | (#38624544)

I need a language that supports the /etc/hosts file. Should I use COBOL or ForTran (Formula Translator)?

Re:Fortran & COBOL are ok... apk (5, Informative)

Grishnakh (216268) | more than 2 years ago | (#38624714)

Should I use COBOL or ForTran (Formula Translator)?

No, it's "FORTRAN". While it does indeed stand for "formula translator", back in those days they didn't use CamelCase, and making portmanteaus and then writing them in all caps was normal. You can still see it in US military acronyms, such as "USCENTCOM" (US Central Command).

According to Wikipedia [wikipedia.org], they didn't start using CamelCase for programming-language names until the 1970s, and it only became fashionable for company names in the '80s.

Re:Fortran & COBOL are ok... apk (2, Informative)

Anonymous Coward | more than 2 years ago | (#38625530)

No, it's "FORTRAN". While it does indeed stand for "formula translator", back in those days they didn't use CamelCase, and making portmanteaus and then writing them in all caps was normal.

Bzzzt [wikipedia.org]. Nowadays it's "Fortran". The Wikipedia article is an interesting read; for instance, "Free-form source input, also with lowercase Fortran keywords" was first introduced in Fortran 90.

Re:Fortran & COBOL are ok... apk (0)

Anonymous Coward | more than 2 years ago | (#38624942)

Any language that can open a file, parse strings line by line, & has a decent set of string functions will do the job (which is PRETTY MUCH all of them).

APK

P.S.=> Now, I realize you're the anonymous coward stalker troll that nigh constantly tries to harass me here & especially on HOSTS files - so I must ask you this question:

Don't you have better things to do with yourself than that?

... apk
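The claim above - that any language able to read a file line by line and split strings can handle a hosts file - is easy to sketch in Python. The sample data and the helper name are invented for illustration:

```python
def parse_hosts(text):
    """Map each hostname to its address, skipping comments and blank lines."""
    mapping = {}
    for line in text.splitlines():
        line = line.split("#", 1)[0].strip()  # drop trailing comments
        if not line:
            continue
        fields = line.split()
        address, names = fields[0], fields[1:]
        for name in names:
            mapping[name] = address
    return mapping

sample = """
127.0.0.1  localhost
# ads blocked below
0.0.0.0    ads.example.com tracker.example.com
"""
print(parse_hosts(sample)["localhost"])  # 127.0.0.1
```

In practice you would read the real file with `open("/etc/hosts")`, but the parsing is the same either way.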

Re:Cobol (0)

Anonymous Coward | more than 2 years ago | (#38624734)

Depends on the level of C++ learning. Procedural programming, as a better C? Definitely. Object-oriented programming? Not so easy - not everyone thinks in objects, which means thinking in reverse. Forward thinking: it's easier and more logical to think openDoor() and closeDoor() to open a door and close a door. Reverse thinking: defining a door class first, then operations such as open and close, and then finally calling door.open and door.close - the first is thinking of the action on the object, the second is thinking of the object first and then the action (thus reverse thinking). However, both can be reused via copy and paste! Templates? I still can't learn them and I'm well over my mid-20's; I don't see the need for them. C++0x and whatever new elegant and confusing stuff? No thanks. Functional programming? F**k no.
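The "forward" and "reverse" styles described above can be sketched side by side in Python (the Door class and function names are invented for illustration):

```python
# "Forward" procedural style: the action comes first, the thing acted on second.
def open_door(door):
    door["open"] = True

def close_door(door):
    door["open"] = False

# "Reverse" object-oriented style: define the thing first, then its actions.
class Door:
    def __init__(self):
        self.open_state = False

    def open(self):
        self.open_state = True

    def close(self):
        self.open_state = False

# Both express the same behavior; the difference is which comes first
# in the programmer's head (and in the source).
d1 = {"open": False}
open_door(d1)

d2 = Door()
d2.open()
```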

Re:Cobol (0)

Anonymous Coward | more than 2 years ago | (#38625576)

Thinking in terms of objects is, indeed, quite a shift to the brain and is probably pretty hard for some to pick up. I programmed for 20 years in procedural languages before using C++ professionally. My one prior exposure to object oriented languages was in school back in the 70's where we had one assignment in Simula in a survey class. I fell in love with the concept then and was frustrated that it was not widely used. This, I think, made it much easier to pick up C++ over 20 years later (that, and the fact that I had the good fortune of initially working on an existing code base which had been written by some very good C++ programmers so I had a lot of good models to help me make the transition).

Templates make a lot of sense (ignoring of course the traditionally horrible error messages they produce when there's a compile error!) when used judiciously. When abused, they are just terrible. Without templates in C++, we wouldn't have the STL which I find very useful.
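C++ templates themselves can't be shown here, but the idea they serve - one container definition parameterized over an element type, as in the STL - has a rough Python analogue in typing generics. A sketch, not a claim the two mechanisms are equivalent:

```python
from typing import Generic, TypeVar

T = TypeVar("T")

class Stack(Generic[T]):
    """One definition, reusable for any element type,
    much like std::stack<T> in the STL."""
    def __init__(self):
        self._items: list[T] = []

    def push(self, item: T) -> None:
        self._items.append(item)

    def pop(self) -> T:
        return self._items.pop()

# The same definition serves Stack[int], Stack[str], etc.
ints: Stack[int] = Stack()
ints.push(1)
ints.push(2)
```

The big difference is that C++ instantiates a separate compiled version per type and checks it at compile time, whereas Python's annotations are checked only by external tools.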

Re:Cobol (5, Insightful)

zarlino (985890) | more than 2 years ago | (#38624798)

The MBA's still think you can describe a piece of software in Word, and then it's a trivial process to make the software that customers want. Informal language is desirable to humans because it supports leaving out details - which is exactly what makes it useless for programming a computer.

That's because software *is* the description of what the computer should do. Check out this great article: http://www.osnews.com/story/22135/The_Problem_with_Design_and_Implementation [osnews.com]

Re:Cobol (1)

buchner.johannes (1139593) | more than 2 years ago | (#38626152)

The MBA's still think you can describe a piece of software in Word, and then it's a trivial process to make the software that customers want. Informal language is desirable to humans because it supports leaving out details - which is exactly what makes it useless for programming a computer.

That's because software *is* the description of what the computer should do.

Only in some paradigms (procedural, functional programming). In logical programming, or SQL, you don't tell the computer what to do, you tell it what you want. Yet it is software.

Re:Cobol (0)

Anonymous Coward | more than 2 years ago | (#38628004)

No, SQL is declarative. You tell the computer what you want it to do, not how to do it (that is done in imperative languages like C).
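The declarative-vs-imperative distinction can be shown concretely with the stdlib sqlite3 module (the table and data are invented for illustration):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (customer TEXT, total REAL)")
conn.executemany("INSERT INTO orders VALUES (?, ?)",
                 [("alice", 30.0), ("bob", 5.0), ("alice", 20.0)])

# Declarative: describe the result you want; the engine decides how.
declarative = conn.execute(
    "SELECT customer FROM orders WHERE total > 10 ORDER BY customer"
).fetchall()

# Imperative: spell out the steps yourself.
rows = conn.execute("SELECT customer, total FROM orders").fetchall()
imperative = sorted(c for c, t in rows if t > 10)
```

Both produce the two qualifying "alice" rows; only the SQL version leaves the filtering and ordering strategy up to the database.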

Re:Cobol (0)

Anonymous Coward | more than 2 years ago | (#38627670)

From the article you link to:

I've yet to run into a situation where I've written code and the compiler has not followed my instructions and that is the reason something broke.

I guess the author never wrote anything more complex than hello world, then.

Re:Cobol and the 50s (0)

Anonymous Coward | more than 2 years ago | (#38625552)

I remember the early days. This was just a bit before my time (I started to actually program in 1968, but my instructor taught a bit of the history too...).

Believe it or not, the use of subroutines with arguments was originally controversial. One school of thought was that everything should be global and in one file. Objects? You must be joking. Nobody thought in those terms at that time. However, you could do some nice tricks with the "entry" statement in FORTRAN. One entry would be the "constructor" (it did initialization), other entries were for the various things you wanted to do to the data, and of course you had an entry for shutting things down.

But because there was not a stack in the conventional sense, the "local" variables were always the same no matter when it was called.

You could use block common for a lot of things, but in a lot of cases there was no stack in the hardware (I don't think the IBM 360 had a hardware stack).

Stone knives and bearskins.

Fortran (1)

Yobgod Ababua (68687) | more than 2 years ago | (#38625670)

Fortran makes it really really easy to do complex matrix arithmetic. It also makes text manipulation a serious PITA.
So, like so many other things, it's a trade-off between what a language makes easy to code and what you actually want to code.

Re:Fortran (2)

buchner.johannes (1139593) | more than 2 years ago | (#38626172)

Fortran makes it really really easy to do complex matrix arithmetic. It also makes text manipulation a serious PITA.
So, like so many other things, it's a trade-off between what a language makes easy to code and what you actually want to code.

You are right. However, with the arrival of numpy, I don't see the benefit of Fortran any more.
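For illustration, the kind of whole-array expression numpy gives you, with no explicit loops (a minimal sketch; assumes numpy is installed):

```python
import numpy as np

a = np.array([[1.0, 2.0],
              [3.0, 4.0]])
b = np.eye(2)          # 2x2 identity matrix

# Matrix product, scaling, and addition in one expression.
c = a @ b + 2 * a
```

Since `b` is the identity, `c` works out to `3 * a`, i.e. `[[3, 6], [9, 12]]`.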

Re:Fortran (0)

Anonymous Coward | more than 2 years ago | (#38626726)

Ummm... how about the fact that it compiles natively, allowing for things like whole-program optimization and instruction-level auto-vectorization?

I'll grant you numpy for exploratory coding, development/prototyping, small models, etc., but when you want to compute with massive datasets on big hardware, Fortran wins, hands down.

Re:Cobol (1)

MrSteveSD (801820) | more than 2 years ago | (#38626770)

It's probably the first example of the fundamental misconception, that it's desirable (if even possible) to make formal descriptions using informal language

It's always seemed to me like some confused attempt to make complex things simpler by writing them out in laborious English. E.g.: "Integrating this function is proving to be quite hard. Perhaps the whole process of integration would be easier if we wrote 'Integrate' rather than using that confusing stretched-out S symbol."

Re:Cobol (1)

some-old-geek (1329305) | more than 2 years ago | (#38627006)

One doesn't use the word "plus" instead of the symbol "+" in COBOL. So, no, I don't believe you've written "tons of Cobol [sic]."

Re:Cobol (3, Informative)

Grishnakh (216268) | more than 2 years ago | (#38624660)

Funny, but even this is ignoring reality. According to TFA's summary, he was an EE grad in 1964. While those languages do indeed date back that far, EE students were probably not taught them at the time, and in fact probably weren't taught any programming at all, as that was a different discipline (CS). Even when I went to undergrad EE school in the early 1990s, we were only taught a little QBASIC, FORTRAN, C++, and MATLAB (a half-semester each), plus x86 assembly language (a full semester). There was some more in the junior/senior classes, but only if you elected to take those, and it was all concentrated on microcontroller and embedded programming. Back in the mid-'60s, I imagine programming simply wasn't considered important for EEs, and that any EEs who ended up working on computers (which were room-sized and mega-expensive at the time) would learn any necessary programming on the job. The fundamentals of EE simply don't include programming; they include network theory (Ohm's and Kirchhoff's Laws), electromagnetics (Maxwell's equations), 3-phase power, etc. It's only in very recent years (early/mid-'90s and later) that they came up with "computer engineering" degrees, or put the two together ("electrical and computer engineering", or ECE, as at one university I went to).

Re:Cobol (0)

Anonymous Coward | more than 2 years ago | (#38624680)

Fortran was commonly taught to electrical engineering students in many US and Canadian colleges in the 1970s, but I don't know if any of them did that in the 1960s. It was not only the precursor of C++ but was also heavily used to do engineering mathematics, of the kind that might be done in a dedicated engineering or math program these days.

Warren

Re:Cobol (1)

Yobgod Ababua (68687) | more than 2 years ago | (#38625692)

When I got my EE in the early 90's, there wasn't any officially sanctioned CS degree on offer. You could either do an "Engineering and Applied Science" degree focusing on coding and algorithms, or a EE degree focusing on computer design and use.

If you went the EE route you still had to learn all the antenna theory and 3-phase stuff as well, but at the end you understood how computers worked -and- how to use them.

Re:Cobol (1)

Grishnakh (216268) | more than 2 years ago | (#38626560)

That must have been unique to your school, because my 2nd-rate state university had a CS program, and I'm pretty sure CS has been around in other universities since the 60s. Wasn't Dijkstra a famous CS professor?

Re:Cobol (1)

Djatha (848102) | more than 2 years ago | (#38627906)

No, technically he was a famous mathematics professor. Only years after Dijkstra left [that university] was CS (well, informatics) introduced as a separate bachelor's+master's. Still, ever since the 1960s there have been mathematics and electronic engineering departments that focused on computer science (or applied mathematics, as it was sometimes still called).

Re:Cobol (3, Funny)

c++0xFF (1758032) | more than 2 years ago | (#38624746)

He'll have to learn APL, or at least Perl. After all, he's going "to take a coding class in 2012 via Twitter." Anything else and the program won't fit in a single tweet!

[rimshot]

Re:Cobol (1)

Ihmhi (1206036) | more than 2 years ago | (#38627110)

Bloomberg's coding is probably going to end up running about as well as his Spanish sounds.

Seriously, I live in Newark, NJ and I can barely speak Spanish for the life of me, but every time there's a press conference and Bloomberg speaks Spanish it is downright hilarious.

He was never a programmer (5, Informative)

tomalpha (746163) | more than 2 years ago | (#38624096)

Mike Bloomberg was always the business/sales guy at the company. Tom Secunda [wikipedia.org] was (one of) the original programmers of the first terminals. That was all in Fortran back then, and a fair chunk of it probably still is. You can read this and oh so much more in his not-very-gripping autobiography, which was required reading for all team leads and managers at Bloomberg. [Ex-Bloomberger]

Re:He was never a programmer (1)

LostCluster (625375) | more than 2 years ago | (#38624206)

Bloomberg LP even at one time claimed to own 2-screen setups. These days, there's not much on the Bloomberg Terminal platform that isn't available over the web from them or other sources.

99pct (0)

Anonymous Coward | more than 2 years ago | (#38626430)

there's not much on the Bloomberg Terminal platform that isn't available over the web from them or other sources

oh, you are out of your fucking mind.

Re:He was never a programmer (1)

b4dc0d3r (1268512) | more than 2 years ago | (#38624296)

I figured that was the case, but there is a distinction between resolving to learn something and not being able to do it. False dichotomy and all that.

Unfortunately, since you posted facts, theodp won't be able to practice his critical-thinking skills. Or, alternately, practice being more subtle about driving page views for Codecademy with his rhetorical question. A few more obvious problems with this story are the following assumptions, apparently based solely on a tweet and a short bio:

Heading up development of something means you can do it (managers can manage work without being able to do the work, not saying whether it's good just that it happens all the time)

EE graduate in '64 was capable of programming (he may have had courses, but he may have been more capable of building a logic circuit than coding for one)

Someone who left Salomon in 1981 still remembers enough to be able to write programs (most people forget skills they don't use, especially over 30 years outside a rapidly evolving field)

Someone who resolves to learn something in a domain is learning the same thing (he could know something that's not relevant, and want to learn a more up to date language or environment)

Why was this mod'ed "Funny"? (5, Insightful)

Anonymous Coward | more than 2 years ago | (#38624394)

If you look at just about all tech companies, the person who got it going was the sales guy. In some cases the tech guy is also a great salesman - Larry Ellison of Oracle or Zuckerberg of Facebook - actually, FB is just a marketing data collection company.

In my years in software development, I've seen some really great ideas and implementations get buried just because the geek didn't know how to sell their value.

All the tech bigshots knew how or knew someone who knew how to sell the value of their stuff.

Wozniak had the luck of having God's gift to salesmanship, Steve Jobs, as his friend. All the gazillionaire techies had someone with them who had the contacts and sales ability to take their idea and make it into something.

"Build a better mousetrap and the world will beat a path to your door" is a lie. The countless examples of inferior technology ruling the marketplace are proof.

Re:He was never a programmer (1)

Grishnakh (216268) | more than 2 years ago | (#38624724)

Mike Bloomberg was always the business/sales guy at the company.

So he was basically like Steve Jobs, only without the gift for picking mega-popular designs.

Much rather (0, Flamebait)

WillyWanker (1502057) | more than 2 years ago | (#38624150)

I'd much rather he learn empathy, humility, and how to not be a giant fucking jackass. Baby steps I guess.

Re:Much rather (4, Funny)

PPH (736903) | more than 2 years ago | (#38625326)

I'd much rather he learn empathy, humility, and how to not be a giant fucking jackass.

Well then, learning to code is definitely NOT the way to go.

Re:Much rather (0)

Anonymous Coward | more than 2 years ago | (#38625956)

You beat me to it. Hahaha, so true.

Head of systems development? (4, Insightful)

nurb432 (527695) | more than 2 years ago | (#38624166)

So? Just because you manage a department doesn't mean you can do the work it does. He was there to manage people, not code... a vastly different skill set.

Sure, it's nice if you can do the job of your people, so you have a deeper understanding of what's going on, but it's not a requirement.

Re:Head of systems development? (0)

Anonymous Coward | more than 2 years ago | (#38624234)

You're an IT manager, aren't you?

Anyway, what I don't understand about "has announced his intention to take a coding class in 2012 via Twitter" is: how the hell do you learn something via Twitter? I realize manuals put people off and some IT books are horrendously oversized, but this is kind of ridiculous.

Re:Head of systems development? (0)

Anonymous Coward | more than 2 years ago | (#38624458)

I really hope this was just a piss-poor attempt at humor. The announcement was made via Twitter. He's going to use Codecademy.

Re:Head of systems development? (0)

Anonymous Coward | more than 2 years ago | (#38624508)

I'm pretty sure the announcement is via Twitter, not the coursework itself. Maybe a better way to say it would be, "announced via Twitter his intention to take a coding class in 2012."

Re:Head of systems development? (2)

nurb432 (527695) | more than 2 years ago | (#38624530)

You're an IT manager, aren't you?

I have been in the past, but I am not currently. In my case I do have experience in the IT/engineering field as a 'worker bee', for 20+ years, but I still don't feel that is a prerequisite for managing IT people. General knowledge of the field, sure, but I would not expect a manager to know low-level stuff like how to sit down and code an application, or recite the resistor color codes (for two examples), to be an effective manager of people.

Re:Head of systems development? (1)

mbkennel (97636) | more than 2 years ago | (#38624692)

The most important job of a manager is to know WHO is really performing well, who is the problem, and why in both cases.

Sometimes not having enough technical knowledge can hurt this significantly.

Re:Head of systems development? (1)

bickerdyke (670000) | more than 2 years ago | (#38625044)

...and to recognize those cases where adding a high performer reduces team productivity (or the other way around).

And a method of recognizing "performance" in the first place.

Re:Head of systems development? (0)

Anonymous Coward | more than 2 years ago | (#38624326)

This mentality is probably one of the biggest problems in the corporate world today. Management that has no fucking idea what they are managing.

Re:Head of systems development? (2, Interesting)

formfeed (703859) | more than 2 years ago | (#38624460)

Sure, its nice if you can do the job of your people, so you can have a deeper understanding of what is going on, ..

That's why, in principle, I like the announcement. Even if it turns out to be just a publicity stunt, it at least shows that Bloomberg thinks learning something different would be good - or at least thinks that his voters think that.

According to the BBC, the reaction of the London mayor was that he's too busy for things like that. Now, that shows a politician who needs to get rebooted. If politicians would do a couple of things below their pay scale, or volunteer for longer than a photo opportunity, they might actually get a clue. I'd rather have a mayor "waste" a few hours a week on his/her education (in a broader sense) and pay for another aide than be stuck with a clueless person who wastes millions.

Re:Head of systems development? (4, Informative)

Asic Eng (193332) | more than 2 years ago | (#38626252)

According to BBC, the reaction of the London mayor was that he's too busy for things like that.

That's completely wrong [bbc.co.uk]. The BBC actually reports [...] that the mayor is in awe of his good friend Michael Bloomberg, and if re-elected will explore whether he can join him on that course. I believe you've confused Boris Johnson (current mayor) with Ken Livingstone (former mayor and current candidate for the opposing party). Ken Livingstone stated: "If I'm elected, I'll be a bit too busy to take any education courses."

Anyway, it's certainly nice if politicians broaden their minds, but it's reasonable that they have to allocate their time and set priorities.

Bloomberg on the Internet in 2001 (1)

theodp (442580) | more than 2 years ago | (#38624168)

BW 2001 [businessweek.com] : Bloomberg still insists that the Net is too "unreliable" a way to deliver his product. Servers go down, security is dicey, and he has faith in a closed system. There's a Bloomberg Web site with data and news for free. But the CEO was an early skeptic of the Internet gold rush, and these days he figures that he has been proved more right than wrong.

Re:Bloomberg on the Internet in 2001 (3, Informative)

betterunixthanunix (980855) | more than 2 years ago | (#38624214)

Frankly, given the line of business he was in -- rapid news delivery to investors -- I am inclined to agree with him about the Internet. Delays on the network could translate into millions of dollars in losses for Bloomberg's customers, which could translate to millions in losses for Bloomberg. From a business perspective it made sense.

Re:Bloomberg on the Internet in 2001 (1)

Sir_Sri (199544) | more than 2 years ago | (#38624540)

Agreed, in 2001 that was about correct. Systems generally weren't that easy to make reliable, and for what he was doing it wouldn't have been worth it.

If someone asked me today why I'm not making a WP7 app, the answer is: They don't have enough of a market share for it to be worth our time yet. If, 3 years from now, WP7 owns the whole damn marketplace that doesn't mean my opinion about what I'm doing right this minute is wrong. The world changes, the question is whether or not you can evolve with it, and whether or not you have enough insight to recognize shifts and either follow along quick enough, or lead the innovation.

Even today (2)

mbkennel (97636) | more than 2 years ago | (#38624676)

Even today, critical communications don't travel over the public internet:

a) Mastercard & VISA card processing networks
b) ACH & Fedwire money transfers
c) US DoD communications.

Using the IP protocol isn't really the problem (why invent new hardware now?), but control & management of the network is a big deal. Besides, his servers & his clients can be concentrated in Manhattan. Bloomberg made the right choice then, and it's still the right choice.

Re:Even today (0)

Anonymous Coward | more than 2 years ago | (#38624886)

Bloomberg terminals now operate over the internet if I'm not mistaken. I did some consulting work this summer at a financial firm that included getting them ready for the Bloomberg terminal (none of that funky heavy duty orange cable anymore). It's pretty much just their keyboard, and software to access their network. Incredibly impressive amount of information is concentrated within their network, and I can see why financial companies are willing to sink $1700+/month into them.

Re:Even today (3, Informative)

superwiz (655733) | more than 2 years ago | (#38626462)

Bloomberg terminals now operate over the internet if I'm not mistaken.

They can encapsulate their feed over the Internet, but that limits functionality and requires extra login steps. The standard setup is over their own network, which has extra security (including protection against Van Eck phreaking of the terminal itself). What you get in the browser is a very, very, very limited subset of the functionality the terminal itself provides. Although the terminal itself, as an interface, has all the usability of a cash register.

"Hiring in NYC/SF only" (1)

betterunixthanunix (980855) | more than 2 years ago | (#38624180)

Bloomberg does not want to learn to code -- he is promoting a business with operations in NYC that will bring jobs into NYC. I do not think there is anything wrong with the mayor of NYC promoting such an organization, but why should /. glorify Bloomberg instead of just glorifying Codecademy?

They don't make programming tools like they used to (4, Insightful)

LostCluster (625375) | more than 2 years ago | (#38624190)

Common in the '60s: punch cards, text-only dumb terminals, mainframes...
Common now: online storage, visual designers, client/server setups...

If your knowledge of computers ends in the '60s, there's a lot of updating to be done. Mayor Bloomberg has the right idea... every 10 years or so it's time to retrain on the current tools.

Re:They don't make programming tools like they use (1)

Tablizer (95088) | more than 2 years ago | (#38625442)

Terminals were a luxury in the 60's. Teletype machines were generally cheaper (if you were lucky enough to get to use one), even though they consumed a lot of paper.

What does the submitter have against Bloomberg? (1)

Anonymous Coward | more than 2 years ago | (#38624240)

Did Bloomberg do something to the story submitter? Sounds like Bloomberg kicked his dog or something.

Re:What does the submitter have against Bloomberg? (2)

The End Of Days (1243248) | more than 2 years ago | (#38625402)

It's important to hate him because he's rich. Anyone who has money is evil because they are able to fulfill their desires and I am not.

Ahhhh.... (0, Offtopic)

Anonymous Coward | more than 2 years ago | (#38624330)

*THAT'S* what the Mayans meant....

Bloomberg L.P. is Hiring Developers (0)

Anonymous Coward | more than 2 years ago | (#38624338)

Summary contains link to internships but, if anyone's interested, Bloomberg is currently hiring software developers (principally C++), both junior & senior in New York & London:

http://www.bloomberg.com/careers/

Bloomberg is a Billionaire (0)

Anonymous Coward | more than 2 years ago | (#38624502)

If he really wanted to learn to code, he wouldn't be pissing about on the internet... he'd just hire a Turing Award winner to home-tutor him.

Bloomberg to Helpdesk: (1, Funny)

mbkennel (97636) | more than 2 years ago | (#38624554)

Bloomberg: I need you to perform a privilege escalation on my compiler.
Helpdesk: Before we proceed, can you describe the symptoms?
Bloomberg: Yeah, it sometimes spits out some incomprehensible message, or the program says "Segmentation Fault." I don't care about its needs, I have work to do, now. So I'm calling in a privilege escalation. Now!
Helpdesk: Sir, I'm not sure that's going to help, do you know what a privilege escalation means?
Bloomberg: Yes, I think I do, or haven't you noticed that I'm the richest bleeping man in New York AND the mayor?
Helpdesk: I'm very aware of that sir, but the compiler isn't familiar with your exalted position.
Bloomberg: That's just the freaking point, you moron!! Do it now!
Helpdesk: (sigh) (click click click) OK, sir, your compiler is now ultraprivileged. Have a nice day.
Bloomberg: pathetic peons... he's probably not even one of my constituents, so fuck him.

Won't get past HR zombie (0)

Anonymous Coward | more than 2 years ago | (#38624784)

Yes, but the HR zombie will circular-bin his resume since he doesn't have 5 years' experience in iOS 5.0.

Why (2)

lightknight (213164) | more than 2 years ago | (#38624792)

Just curious -> why? Personal interest, or business venture?

And someone make sure he starts with C++. If he survives that, he won't have any trouble picking up other languages.

C/C++ is a pretty bad place to start learning (2)

F69631 (2421974) | more than 2 years ago | (#38625178)

And someone make sure he starts with C++. If he survives that, he won't have any trouble picking up other languages.

I've always been baffled by people who think that C/C++ is a good starting point when you want to learn/teach programming. I think that the most important thing to understand - whether you end up working as a programmer or not - is the basic structure/flow of the program (conditionals, loops, modularity/functions). Then the basic programming concepts (recursion, abstract data types, etc.) and then the libraries/APIs for your platform so that you can actually create something interesting/useful. I don't think that C/C++ offers any advantages over more modern languages in any of these things.

Perhaps advocates of C/C++ for first language think that if you start with a higher level language, the inner workings will forever be a mystery and you just end up using modules you don't understand. I could argue that if you aren't a professional programmer, that doesn't really matter at all but instead I'll argue that you do learn all the important concepts anyways. You can code in Java, PHP or Python and very quickly learn that there is a difference in whether you return a value or a reference to the value. The concept matters, not remembering where to put asterisks and where to put ampersands. ;)

You might say "OK, perhaps C/C++ doesn't offer many advantages but they're still the languages... Why go with something else?" and the answer is pretty simple. If you study C for a week and then get bored / are too busy for a while, etc. you can't really do anything useful with it. There are pretty slim chances that you could, for example, create an application that saves you X amount of work by spending less than X in creating the application. If you spend a week learning PHP, JavaScript, AutoIt [autoitscript.com] or whatever other language is best suited to the domain of stuff that you're most interested in, you probably can actually use it for something. Also, if you choose a higher level language, the chances are that whether you spend a week or a month, you'll get to delve deeper into database access, networking, algorithms, etc. than you would by choosing C/C++. It's great to possess some basic understanding in those areas, even if you don't end up as a software engineer.

I guess that C/C++ is a good place to start for college kids who're just getting into CS: It's something that professionals probably should understand anyways (even if they don't end up coding in it) so they need to study it at one point or another, and it's an easy way to get rid of the "I just like playing XBOX" crowd. For anyone else, I'd probably ask "What kind of stuff do you like to do on a computer?" and then try to determine what language helps them most in doing that thing.

Re:C/C++ is pretty bad place to start learning (1)

Turnerj (2478588) | more than 2 years ago | (#38625644)

A fair point. I started programming in Javascript before I moved to PHP, VB.Net (I regret this one), C#, Java, C++ in that order. However, it could just as easily have been C++ first.

If a person is really committed to learning a programming language, they would be fine learning C++ first, which would not only teach all the fundamentals but also give some idea of how the system works.

It can also be unnecessary depending on what they want to do, though (as you suggested).

Re:C/C++ is pretty bad place to start learning (1)

lightknight (213164) | more than 2 years ago | (#38625876)

If we are talking about programming in general, I think I started with Logo, then Java / Q-Basic, then C, then JavaScript, then C++. Something like that, with HTML / VRML mixed in for good measure. Ah, good old VRML.

Currently enjoying C# as my primary language, and doing PHP work for a small project. I have a book on Ruby to finish reading, and the AMD APP OpenCL reference for when I have some free time.

Re:C/C++ is pretty bad place to start learning (3, Insightful)

lightknight (213164) | more than 2 years ago | (#38625712)

I think C++ is a good starting point simply because it teaches memory management and class design.

Understanding the concept of a class is one of the most difficult programming concepts a novice will encounter. And they are used everywhere.

Just try explaining the concept of a class to a non-programmer. I will bet money that they will nod their heads, and still have no idea what you're talking about.

And memory management -> something you need to understand, even if you use a garbage collector.

If he's just taking a programming class to get a taste of programming (a dilettante), then by all means teach him Visual Basic or JavaScript or whatever. However, if he's taking a programming class to learn programming (he wants the programmer skill set, a.k.a. to be a real programmer), then C++ is where he wants to be. Once you understand the concepts in C++ (which can be brutal to learn), the hardest part of learning how to program is past.

Why, do you ask? Because otherwise you end up in sad scenarios, like when the PhDs in your Computer Science department do not know how to install an operating system, when the undergrads in your class have difficulty understanding the difference between an AMD processor and an Intel processor, or why one should never write a program in JavaScript that consumes 8 GB of the client computer's memory.

TL;DR: C++ will expose him to the greatest number of programming concepts in the shortest period of time, and give him the minimal amount of understanding necessary to eventually grow into a respected programmer.

Re:C/C++ is pretty bad place to start learning (3, Informative)

Andrevan (621897) | more than 2 years ago | (#38626528)

Why do CS PhDs, who spend 98% of their time doing theory (math), need to know anything about installing an OS? Why do undergrads, who probably use preassembled OEM boxes, need to understand the differences between hardware brands? More to the point, how does learning memory management or class design through C++ help one learn these things? To address a less ridiculous point, if I'm spending all my time in Java, Ruby or Python, why do I need to understand anything about pointers and memory management in C? For the sake of argument, let's say we need to understand how the stack, heap, and reference variables work in a garbage collected language. Why do we need to learn C to do that? In undergrad I was required to take a class which involved writing one's own implementation of malloc. Like so many other classes required for a CS degree, I use nothing from it in my day-to-day work as a Ruby developer.

Re:C/C++ is pretty bad place to start learning (0)

Anonymous Coward | more than 2 years ago | (#38626608)

Because school should be something more than where you go to learn a trade? Understanding computer architecture & how malloc operates is important. The less something is a black box, the fewer bad assumptions you'll make. Knowledge is never a bad thing.

Re:C/C++ is pretty bad place to start learning (1)

Andrevan (621897) | more than 2 years ago | (#38626654)

This, to me, gets into the difference between theory and implementation. I agree that understanding the theory behind memory management can be useful. However, learning malloc goes beyond theory into a specific implementation of the principle. In a garbage collected language, knowing the theory is potentially useful, but knowing how C's implementation works is not.

Re:C/C++ is pretty bad place to start learning (0)

Anonymous Coward | more than 2 years ago | (#38626682)

Because knowing the theory is insufficient. When you do the implementation yourself, you gain insight that is otherwise unavailable & you have a much deeper appreciation of what it takes to implement the theory in reality & why certain implementations behave better than others. There are something like 100 variants of malloc & each has its own tradeoffs (hell, even GNU libc's malloc isn't the best).

That is why when you learn in school, they force you to do homework; it drills the concept into your head via practice.

Re:C/C++ is pretty bad place to start learning (1)

Aighearach (97333) | more than 2 years ago | (#38628026)

Understanding by doing is a completely different thing than what is gained by repetition.

Re:C/C++ is pretty bad place to start learning (1)

Johnny Mnemonic (176043) | more than 2 years ago | (#38627012)

"I use nothing from it in my day-to-day work as a Ruby developer".

You're a Ruby developer today. You may not be 5 or 10 years from now. Then your educational background will let you flex into a different position. Ruby developers who know only Ruby will be Ruby developers forever, because they are one-trick ponies.

Re:C/C++ is pretty bad place to start learning (1, Informative)

Aighearach (97333) | more than 2 years ago | (#38628028)

Like so many other classes required for a CS degree, I use nothing from it in my day-to-day work as a Ruby developer.

As a Ruby developer I just have to point out, without C you can't understand the Ruby source or write native extensions.

A Ruby developer without C is totally weak.

Re:C/C++ is pretty bad place to start learning (2)

cerberusti (239266) | more than 2 years ago | (#38626066)

When teaching people to program I start with an introduction to binary and hexadecimal, making them do a few things like noting the patterns various numbers contain, and adding up a couple of values (which illustrates why powers of two are important and convenient, etc.)

From there it is a brief introduction to logic gates. I demonstrate simple addition, and make them construct a circuit that will do add with carry.

Then I do a brief introduction to assembly (x86 these days, I used to choose Alpha.) I do not really expect them to become proficient, but I do think an intro and vague understanding is important. I also go over character sets and how to directly manipulate pixels on the screen to produce an image.

Then I teach C (not C++.) They need to demonstrate some knowledge of how to use basic constructs such as loops, structures, and functions, as well as an understanding of memory management and pointers (including function pointers.) C gives an understanding as to why most modern APIs behave the way they do, and allows one to infer the consequences of various choices in the higher-level languages. I will usually throw in a large data set to process at some point (a few hundred GB, with file access through a wrapper I provide to allow them to just specify a 64-bit offset directly), to demonstrate how various choices in program structure can have a large impact on speed and memory usage (and show the point of things like qsort and bsearch.) If they get it quickly I will go into the reasons why qsort tends to beat other sorting algorithms in run time even though it frequently performs more operations, and how to make a quick assessment as to whether you are going to blow the L2 cache.

I usually stick with the standard C library, not going into something like win32. I honestly think C is one of the finest languages ever created, it offers quite a bit of control, can be easy to read if written well, and contains most of what you need to create a program quickly. It does not contain all the fluff that has been added to many modern languages, which just tends to get in the way of understanding the basics.

Once they have a fair handle on C I teach JavaScript / HTML, which makes it easy to produce a program which creates a decent UI and can handle whatever logic they need with little work (I used to give a brief introduction to classes and load up a copy of Borland C++ for its decent visual UI design, but I think JavaScript and HTML are just a better way to do it these days.)

From there it is up to them, if they want to learn Java, C#, or C++ I will go over how classes work (and show how to quickly implement a basic class system in C), if they want to use windows as a platform I will cover win32 and directx (although honestly opengl is much better, and I will at least touch on that as an intro to 3D before hitting direct3D.)

This seems to work fairly well, although I am a programmer by profession and not a teacher (CTO really, and I mostly use C for what I do.) The important point is to show how a computer works, how to program it is a natural consequence of that. I have also found that programmers who just know something like Java or C# tend to be rather poor, and get confused when asked to do something that may not be trivial (which also leads to me generally refusing to hire anyone without C or C++ somewhere on their resume.)

Re:C/C++ is pretty bad place to start learning (2)

KingAlanI (1270538) | more than 2 years ago | (#38627316)

The advisor for my intro-to-programming project promptly nixed C++ and went with Python.
A teacher who had worked with Fortran in the '70s said this: "Automatic memory management? You lucky bastard."
Moreover, Python has a fairly straightforward syntax without being _just_ a teaching language.

Re:Why (1)

ohnocitizen (1951674) | more than 2 years ago | (#38625732)

To build a website. He is having trouble finding developers who *want* to help him oppose OWS's web presence, so he's going to learn html and make "an anti-anti-wall-street web-page" all by himself.

Re:Why (1)

asifyoucare (302582) | more than 2 years ago | (#38626982)

I'm not sure C++ would be a good choice, but sadly the chosen language is an even worse one IMHO: JavaScript. JavaScript might be an OK choice if you have the explicit aim of being a web 'developer', but otherwise Java would surely be a better choice. My 2c (AUD).

Doing one better (0)

Anonymous Coward | more than 2 years ago | (#38625084)

Bloomberg fired a guy because the mayor saw the guy had solitaire running on his computer, just as Bloomberg was making his rounds at the department, grinding all work to a halt.
Maybe someone whispered in his ear that he should get more in touch with how real people out there use computers, and he decided to top it?

Re:Doing one better (1)

swalve (1980968) | more than 2 years ago | (#38626996)

I suspect he fired the guy for not closing the program when the boss walked into the room. You play solitaire when you are on a conference call, not when the mayor is wandering around.

good example of lifelong learning (3, Insightful)

peter303 (12292) | more than 2 years ago | (#38625504)

Always learn new things in life, since technology evolves so fast. I feel sorry for my co-workers who refuse to learn on their own because it would cost them some time or money.

This is not good. (1, Funny)

Lord Kano (13027) | more than 2 years ago | (#38625560)

Bloomberg is a narcissist, he's going to write a Hello World program and think he's an expert in all things technology related.

LK

He should take Constitutional law classes instead (1)

ageoffri (723674) | more than 2 years ago | (#38626630)

No mayor has overstepped his legal boundaries like he has, running multiple illegal so-called sting operations not only in New York State but in other states as well. New York City also has some rather questionable intelligence units that partner way too closely with the FBI and CIA.

Class (0)

Anonymous Coward | more than 2 years ago | (#38628076)

At first I read it as saying he was going to take the class USING Twitter! Good luck with that!

Run for the hills ... (0)

Anonymous Coward | more than 2 years ago | (#38628168)

he wants to become a quant!
