
Forget Math to Become a Great Computer Scientist?

Zonk posted more than 7 years ago | from the this-is-why-i-wasn't-a-good-programmer dept.

Programming 942

Coryoth writes "A new book is trying to claim that computer science is better off without maths. The author claims that early computing pioneers such as Von Neumann and Alan Turing imposed their pure mathematics background on the field, and that this has hobbled computer science ever since. He rejects the idea of algorithms as a good way to think about software. Can you really do computer science well without mathematics? And would you want to?"


I am able (1)

Mipoti Gusundar (1028156) | more than 7 years ago | (#19787949)

I am able to be calculating the number of this post, and it is being one!

Re:I am able (3, Funny)

Anonymous Coward | more than 7 years ago | (#19787957)

Correction, make that minus one.

Re:I am able (4, Funny)

smitty_one_each (243267) | more than 7 years ago | (#19788431)

You were off by 19787949 - 1 = 19787948 in your calculation.
However, aren't they all integers, and therefore morally equivalent?

wahay! (-1)

Anonymous Coward | more than 7 years ago | (#19787955)

Screw all those professors who told me I'd never be a great computer scientist because I suck at math!

Computer Science without math... (0)

siDDis (961791) | more than 7 years ago | (#19788013)

is the same as writing literature with a programming language.

Re:wahay! (5, Interesting)

smilindog2000 (907665) | more than 7 years ago | (#19788189)

I sometimes run into great algorithm programmers who were poor at math, but they're rare, and usually can be explained away based on what kind of drugs they did in college. For a good algorithms guy, I love hiring good mathematicians and physicists. You can train them into great programmers a lot quicker than the other way around. However, algorithms are really a very small part of the programming space we work in. I choose to work in this space because it suits me, but most programmers never need calculus. To build a tree-based data structure and a GUI to drive it takes about an 8th grade level of knowledge. Doing a GUI really well takes creativity I've never had (apparently a lot of guys like me work at M$. I don't know where Apple finds its GUI guys).

The summary of the author's points in the article makes the book sound dead wrong on several counts, though it could just be the review. Procedural languages are the natural way to code most programs, and here's why: we've been recording recipes as a sequence of steps, with if statements and loops, since the invention of writing. It's become encoded in our genes. That's really all that early computer scientists put in our early languages like FORTRAN. It's all the stuff we've added since then that's up for debate, in my mind. The author makes money by pushing the boundaries of computing model research. I get big programs written by teams by restricting what language features are used, and how. It'd be interesting to debate the ideas, point by point.
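The recipe analogy above can be made concrete: a recipe really is a sequence of steps with conditionals and loops. A minimal, purely illustrative sketch (the quantities and step names are invented):

```python
def bake_bread(flour_g: int) -> str:
    """A recipe written as a procedure: steps, an if, and a loop."""
    if flour_g < 500:                 # "if you have less than 500g, stop"
        return "not enough flour"
    dough = "mixed"
    for _minute in range(10):         # "knead for 10 minutes" is a bounded loop
        dough = "kneaded"
    return "baked " + dough + " dough"

print(bake_bread(600))  # -> "baked kneaded dough"
print(bake_bread(100))  # -> "not enough flour"
```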

Re:wahay! (5, Interesting)

atrocious cowpat (850512) | more than 7 years ago | (#19788417)

"Doing a GUI really well takes creativity I've never had (apparently a lot of guys like me work at M$. I don't know where Apple finds its GUI guys)."

Maybe the question should rather be: Why doesn't Microsoft look for the kind of GUI-guys Apple hires. And the answer to that might well be found at the top of each company. A quote from Steve Jobs' Commencement address at Stanford (June 12, 2005):

"Because I had dropped out [of college] and didn't have to take the normal classes, I decided to take a calligraphy class [...]. It was beautiful, historical, artistically subtle in a way that science can't capture, and I found it fascinating. None of this had even a hope of any practical application in my life. But ten years later, when we were designing the first Macintosh computer, it all came back to me. And we designed it all into the Mac. It was the first computer with beautiful typography. If I had never dropped in on that single course in college, the Mac would have never had multiple typefaces or proportionally spaced fonts. And since Windows just copied the Mac, it's likely that no personal computer would have them. If I had never dropped out, I would have never dropped in on this calligraphy class, and personal computers might not have the wonderful typography that they do."

Read the whole thing [stanford.edu] , it's quite interesting (if not to say: inspiring).

Damn straight! (4, Insightful)

Anonymous Coward | more than 7 years ago | (#19787965)

Who needs math? Bogosort is as good a sort algorithm as any. Hey, without math, how would you be able to tell?
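For the curious: bogosort really is a complete sorting algorithm, and it takes exactly the math this comment is joking about (expected running time on the order of n x n!) to see how hopeless it is. A minimal sketch:

```python
import random

def bogosort(items: list) -> list:
    """Shuffle until sorted: provably correct, expected O(n * n!) time."""
    items = list(items)
    # Repeat until no adjacent pair is out of order.
    while any(a > b for a, b in zip(items, items[1:])):
        random.shuffle(items)
    return items

print(bogosort([3, 1, 2]))  # [1, 2, 3] -- eventually, which is the point
```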

Computer Science != Software Engineering (5, Insightful)

Anonymous Coward | more than 7 years ago | (#19787979)

Maths IS needed for computer science. Just be sure not to confuse Computer Science with Software Engineering. Software engineering is only a part of the computer science sphere.

As if computer science wasn't stunted enough (4, Insightful)

cyborg_zx (893396) | more than 7 years ago | (#19787981)

Do the lessons of VB6 teach us nothing?

COMPUTING IS HARD. You can't dumb it down just because it would be nice to do so. And I'm sorry but mathematics is just the way in which meaning is expressed for machines. There's no free lunch here. And he's wrong about algorithms too - since a non-terminating algorithm is always expressible by deconstruction into a series of terminating algorithms.
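The parent's claim about non-terminating algorithms can be illustrated: an "endless" interactive program is just a driver loop around a step function that always terminates. A hedged sketch (the event names and functions are made up for illustration):

```python
def handle_event(event: str) -> str:
    """A terminating algorithm: a bounded piece of work for one input."""
    return event.upper()

def run(events) -> list:
    """The 'non-terminating' program is this terminating step applied
    repeatedly by a driver loop. A real event loop never runs out of
    events; each individual iteration still terminates."""
    return [handle_event(e) for e in events]

print(run(["click", "keypress"]))  # ['CLICK', 'KEYPRESS']
```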

Re:As if computer science wasn't stunted enough (5, Insightful)

garcia (6573) | more than 7 years ago | (#19788261)

Do the lessons of VB6 teach us nothing?

People have been fucking saying this about various versions of BASIC since the beginning. Instead of trashing it, what did BASIC's various incarnations teach us?

It taught us that Microsoft could roll what amounts to a scripting language into its Office line and make the programs ever more powerful without having to relearn something completely new and difficult. An education in just about any language, a book or a list of commands, and some time and you will have a fully functional module or two that saves you a ton of time and energy.

I honestly think a lot of the hostility, here, towards VB has to do with the fact that now pretty much anyone can write code and that it's from Microsoft. If you're somehow saying that if they used C/C++ or even Perl that their code would somehow be wonderful or safe, you're insane.

COMPUTING IS HARD. You can't dumb it down just because it would be nice to do so. And I'm sorry but mathematics is just the way in which meaning is expressed for machines. There's no free lunch here. And he's wrong about algorithms too - since a non-terminating algorithm is always expressible by deconstruction into a series of terminating algorithms.

I agree and while most applications require this, if you look at VB as a way to either get people started coding or to do quick things because it's built into the system instead of concerning yourself with the necessity of math-based algorithms, it serves its need.

I'm no math whiz but I can write code (in languages other than VB) and so can plenty of others. Enough putting people down and being on your high-horse because you write in such and such. Math is important to CS and so is easy access to be able to write code.

Re:As if computer science wasn't stunted enough (4, Insightful)

cyborg_zx (893396) | more than 7 years ago | (#19788339)

The point is that VB is fine if you want to basically do completely trivial things with what you've already got - basic component connection. As soon as you want to do something non-trivial it all falls apart. The language design is simply worse. It's not that you cannot fuck up royally in C/C++; it's just that some stuff can be really hard to do elegantly in VB6. The newer versions have rectified this somewhat, but having a rigorous approach to language design is something worth investing in - even if it's 'too mathematical' for some people's tastes.

It's just not that easy to do some of the cool stuff we want to do. No amount of wishing it were different is ever going to change that.

Re:As if computer science wasn't stunted enough (4, Insightful)

Anonymous Coward | more than 7 years ago | (#19788371)

fact that now pretty much anyone can write code

No, the hostility is because now pretty much anyone THINKS he can write code, which lowers the valuation of people who actually can do it. That lowers software quality on two fronts: people who can program are forced to write lower quality code because they need to write more to compete with too many amateurs (in the derogatory sense of the word), and people who can't really program write code that doesn't handle errors properly and fails, often silently and undetected, when the input deviates from the expected.

I agree... (1, Insightful)

Anonymous Coward | more than 7 years ago | (#19788401)

Before groupthink nukes this person's comment into oblivion, could you please reflect on the last time you had to deal with someone else's shitty code? (I'm sure you don't have to think back very far.)

If you've never had to deal with someone else's poor work then you [are the luckiest bastard on the planet, but more likely you] may want to consider a career change...

Some people shouldn't code production systems (4, Insightful)

QuoteMstr (55051) | more than 7 years ago | (#19788389)

Let me make this clear: your ability to write code in no way makes you a computer scientist. It's like saying that the ability to operate a forklift makes you a structural engineer. Stop it already.

That said, I'm sure you're good at what you do. I bet you can write good code in VB, as well as many other languages. This isn't a personal insult. VB, PHP, and other brutish languages are equally bad in my eyes.

These languages are brutish because they oversimplify key concepts. That oversimplification also makes them attractive to new programmers, and new programmers universally write terrible code. The languages themselves aren't bad, the coders are. That said, more experienced coders will generally choose more capable languages, so most of the time, a program written in a brutish language will be a bad one.

We need fewer programmers, not more. Maybe professional certification would help somewhat.

(Incidentally, we were lucky that Javascript became the de facto client-side web language. We could have done far, far worse, and although we can change server languages, we can't change a user's web browser!)

Re:As if computer science wasn't stunted enough (1)

Lumpy (12016) | more than 7 years ago | (#19788419)

VB is the de facto standard RAD tool for almost all corporations, and the funny part is that VB6 is the de facto standard. Not .NET, not by a long shot.

Why? There are craploads of people who understand VB. Microsoft went and C-ified VB so hard that it does not even look anything like the original BASIC, and now all those VB programmers they had need complete retraining.

So what happened? Corporate America shuns .NET. Almost all apps are still in VB, and new ones are written in VB, not .NET; ASP is still the norm, not ASPX.

It's far easier to keep using the old style that works than to halt all productivity while you retrain all your VB jockeys for .NET, and the sad fact is that some things are incredibly faster to create in old VB6 than in VB.NET, so VB.NET is actually a step backwards for most places.

Which is why there is still a HUGE install base of VB6. Many, MANY vertical apps that sell for far more than any coder here could dream of ($34,000.00+) are oversized collections of VB6 apps, or one huge app that frightens anyone who knows anything about software.

But it will not change. Those apps were coded by non-programmers, because programmers can't understand the nuances of the business process and would take weeks or even months to get enough education in the business process before they could even start to write useful software for that particular business.

That is why VB is strong out there. No other commercial product even comes close in RAD speed or usability, and Pointy-Haired Bosses really like it. (Free alternatives exist: Python can outperform VB in a heartbeat as a RAD platform; it just has a somewhat steep learning curve and requires the programmer to have some programming understanding. It is also missing a really polished IDE with package/project management for the Windows platform. Adding one might speed up Python uptake.)

Re:As if computer science wasn't stunted enough (1)

shaitand (626655) | more than 7 years ago | (#19788423)

'I honestly think a lot of the hostility, here, towards VB has to do with the fact that now pretty much anyone can write code and that it's from Microsoft.'

You are right, but the reason people are upset about pretty much anyone writing code isn't that they want to keep the club elite. Or at least that isn't the right reason to be upset.

VB introduces the same problem many Microsoft products do. VB is simple enough for just about anyone to write code, but MOST people are not bright enough to be able to write code in a responsible fashion. They write poor programs that companies come to depend on. Those programs might be resource intensive, may not scale, may cause security issues, or a number of other problems. Avoiding these problems requires a solid knowledge base. A competent programmer could learn BASIC and write relatively decent code (BASIC is not exactly a champion of efficiency), but there are certainly better language choices for an already competent programmer.

For a beginning programmer BASIC is also a bad choice. First because it gives those entering the field a false expectation of their ability to write real code. Second, and more importantly, because the language design supports bad programming habits and the syntax is dissimilar to the popular professional languages that have been in use during the past 10 years. You could justify it when there were many using Pascal, but Pascal hasn't been in vogue for a while now.

'I'm no math whiz but I can write code (in languages other than VB) and so can plenty of others.'

Then you are a math whiz. You are just more comfortable thinking about the math in non-classical terms. All programming is a form of algebra, not just the complex number-crunching algorithms. All logic is algebra and, in turn, numerical mathematics is just a branch of algebra. Philosophy aside, all programming is just an abstraction for the mathematical operations being performed by a powerful calculator with lots of memory (AKA a computer). Every line of code you write (regardless of how you think about it logically) is converted into a number of mathematical operations.
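The point that ordinary code is algebra in disguise can be made concrete: a comparison evaluates to 0 or 1, so even a control-flow branch can be rewritten as arithmetic. A hypothetical illustration (not how any particular compiler does it):

```python
def max_branch(a: int, b: int) -> int:
    # Control flow: the "non-mathematical" way to write it.
    if a > b:
        return a
    return b

def max_algebra(a: int, b: int) -> int:
    # The same function as pure arithmetic: (a > b) is 0 or 1,
    # so the branch becomes a weighted sum.
    gt = int(a > b)
    return gt * a + (1 - gt) * b

# The two formulations agree on every input.
for a, b in [(7, 3), (2, 9), (5, 5)]:
    assert max_branch(a, b) == max_algebra(a, b)
print(max_algebra(7, 3))  # 7
```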

I think the author is making a more subtle point (4, Insightful)

Stradivarius (7490) | more than 7 years ago | (#19788319)

I believe the author's point isn't that you don't need to know any mathematics, or that it doesn't have an important role to play in CS. He's simply arguing that some of the main issues in computer science are not fundamentally mathematical problems (even if they require some mathematics).

If you buy that argument, then treating CS as if it were merely another branch of mathematics will not help solve those problems.

Of course, this also takes us into the perennial debate over where to draw the line between "computer science" and "software engineering". One could certainly define away the author's problem by saying that his examples are software engineering issues rather than computer science issues. And it's true that software engineering has been driving a lot of the theory with respect to expressiveness (design patterns and the like). But that view also seems to really impoverish computer science - if all you leave the field of computer science is the stereotypical mathematics, why not just become an applied mathematics major?

Applied mathematics (4, Insightful)

Rakshasa Taisab (244699) | more than 7 years ago | (#19787987)

Taking CS without math is like taking engineering without any physics.

WTF is the author smoking? There are of course parts of CS that are less involved in math, but it is still overall a fundamental part.

Re:Applied mathematics (2, Informative)

ScottyH (791307) | more than 7 years ago | (#19788307)

The author is saying that without the pioneers of the science CS wouldn't be intertwined with mathematics. So yeah, it is fundamental now, but in the absence of the original contributors it may have looked quite different. I personally find the argument a bit difficult to swallow, but then again, I'm inside the box.

Re:Applied mathematics (2, Insightful)

QuoteMstr (55051) | more than 7 years ago | (#19788399)

The thing about math, though, is that it's universal. If we ever discover an alien civilization that's a peer to our own, I'm sure it will have an identical formulation of Pythagoras' theorem. Given different starting conditions, we might have used different notation or words for computer science, but the concepts are inherent in the problems we solve, and therefore would eventually have been discovered and described regardless.

Re:Applied mathematics (2, Informative)

jawtheshark (198669) | more than 7 years ago | (#19788309)

There are of course parts of CS that are less involved in math, but it is still overall a fundamental part.

Not even that.... Computer Science is a subsection of Maths. That's it.... Theoretically, you can complete CS without ever touching a computer.

I was never the best at maths (even though I beat the best of our class in the final maths exam, but that must have been pure luck. He is a math PhD at Harvard now, so....). Luckily the parts of maths that are useful to CS were within my reach ;-)

Sure thing Einstein (5, Insightful)

Anonymous Coward | more than 7 years ago | (#19787993)

Good luck on doing a kernel, file system, network stack, crypto, image processing, window manager, animation or 3D without math or algorithms. I look forward to reviewing some of this guy's code.

What if you grew them instead? (2, Insightful)

Colin Smith (2679) | more than 7 years ago | (#19788099)

Hmmm?

Re:What if you grew them instead? (1)

jawtheshark (198669) | more than 7 years ago | (#19788323)

Actually, I think the author of the article grew some stuff himself, and it wasn't a filesystem or a kernel. Okay, he planted kernels, so that he could see pretty colours after smoking the dried remains of the female flowers....

Re:Sure thing Einstein (1)

mogul (103400) | more than 7 years ago | (#19788113)

Well, I would prefer NOT to review his code.

It's way easier to review well structured code.

Re:Sure thing Einstein (1)

Jeff DeMaagd (2015) | more than 7 years ago | (#19788217)

I don't know what to call that kind of statement. Structure exists outside the field of mathematics; mathematics is merely one means of expressing it. Better stated, I don't think high level math courses are the only way to teach algorithms.

Re:Sure thing Einstein (1)

TapeCutter (624760) | more than 7 years ago | (#19788281)

"Structure exists outside the field of mathematics."

You need to broaden your definition of mathematics.

"Better stated, I don't think high level math courses are the only way to teach algorithms."

The traditional way to introduce the subject of algorithms is to start with the analogy of a "recipe", but if you want to program a computer to do something useful it's gonna take more than milk & honey.

Re:Sure thing Einstein (2, Insightful)

msormune (808119) | more than 7 years ago | (#19788345)

There's math and then there's advanced math. I once built a simple 3D modelling program (using the 3D Studio mesh file format), and got by just using basic trigonometry and algebra. This stuff does not have to be taught in a university.
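The "basic trigonometry and algebra" mentioned above boils down to things like this: rotating a mesh vertex around an axis costs one sine and one cosine per point. A minimal sketch (the function name is my own, for illustration):

```python
import math

def rotate_z(x: float, y: float, z: float, angle: float) -> tuple:
    """Rotate a mesh vertex around the Z axis by `angle` radians --
    the kind of high-school trig a simple 3D modeller runs on."""
    c, s = math.cos(angle), math.sin(angle)
    return (x * c - y * s, x * s + y * c, z)

# Rotating (1, 0, 0) by 90 degrees lands on (0, 1, 0), up to float error.
print(rotate_z(1.0, 0.0, 0.0, math.pi / 2))
```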

Computer science ? (3, Interesting)

ivan_w (1115485) | more than 7 years ago | (#19787995)

This is all fine.. But it doesn't explain something I have long strived to understand:

What is computer science ?

Computer engineering.. yeah.. I can understand that.. But man.. Computer SCIENCE ?

That's like saying 'car science', 'cooking science' or 'go at the bar and have a drink science' !

--Ivan

Re:Computer science ? (1)

Kallahan (599898) | more than 7 years ago | (#19788057)

Technically, the study of logical systems. Nowadays it's just computer programming :(.

Re:Computer science ? (1)

Yetihehe (971185) | more than 7 years ago | (#19788075)

Car science - making better engines, car safety, ergonomics... Cooking science - why dough rises, what's happening with milk and sugar... ( http://www.exploratorium.edu/cooking/ [exploratorium.edu] )

Re:Computer science ? (2, Insightful)

ivan_w (1115485) | more than 7 years ago | (#19788101)

That's engineering !

Making better engines uses the science of Physics and chemistry..

Cooking uses the science of chemistry..

To me it's like saying: 'Lego Science'.. It's not 'science'.. You don't need to know the physical aspects of a Lego block to assemble something.. Although you need some insight into how the thing works - but it's not science per se!

Then again, it depends on how 'science' is defined !

--Ivan

Re:Computer science ? (1)

Yetihehe (971185) | more than 7 years ago | (#19788153)

Then again, it depends on how 'science' is defined !
Exactly that is what I'm saying. Computer science is an amalgam of mathematics, algorithms, engineering. How else do you describe what researchers do with computers overall?

Re:Computer science ? (0)

Anonymous Coward | more than 7 years ago | (#19788331)

We call it computer science because many of the early computer 'scientists' transferred from physics and were too snobby and pretentious to admit that they were now engineers. Computer science is computer engineering, plain and simple.

Someone explained the difference between science and engineering to me as this (paraphrasing):
Science aims to take things apart so we can understand how they work; engineering uses knowledge of how things work to put things together. Obviously in practice taking things apart and putting things together are completely intertwined: you may need to put something complex together to take something else apart, or you may build something so complex that it has unexpected or unintended behaviour and hence you have to deconstruct it scientifically to determine how it is behaving. The latter often occurs in computer engineering, but occasional uses of the scientific method do not suddenly transform the whole discipline from engineering to science.

Re:Computer science ? (3, Insightful)

Anonymous Coward | more than 7 years ago | (#19788277)

It's an old joke that any subject that has "Science" in its name is not a science, e.g. Political Science, Social Science, Computer Science.

The Science in Computer Science consists largely of niches carved out of other disciplines, e.g. algorithm analysis and crypto are mathematics, user interface design is psychology, computer graphics is really about approximating physics, audio compression is mathematics, psychology and physiology, AI steals ideas from biology... every now and then we find out that the physics department, or the electrical engineers, or the chemists, are actually doing almost identical research to us.

Re:Computer science ? (0)

Anonymous Coward | more than 7 years ago | (#19788091)

"I think the way in which people use --or misuse-- words always most revealing. (At a seminar I recently attended, one of the speakers consistently referred to people as "human beings"; he turned out to have been trained as a psychologist.) Bearing that in mind and, furthermore, remembering that the majority of the people in the field regards computers primarily as tools, we should notice that the English speaking world coined the term "Computer Science". We should do so, because it is very exceptional that a tool gives its name to a discipline: we don't call painting "brush art", nor surgery "knife science". From these observations we can only conclude that when the term "Computer Science" was coined, computers were regarded --either in fact, or mainly potentially-- as exceptional gadgets. A question to be answered before proceeding is, whether this view of computers as exceptional gadgets is justified or not."

-Dijkstra (EWD 682 [utexas.edu] )

Re:Computer science ? (5, Insightful)

joel.neely (165789) | more than 7 years ago | (#19788315)

The term itself is a product of the academic environment, similar to the equally dubious "Library Science" and "Management Science". For what it's worth, the European term "informatics" would have been better, but never caught on.

That said, I believe there's a useful set of relationships well understood in other fields:

Science = The search for fundamental knowledge and predictive models;
Engineering = The creative application of the results of science;
Technology = The routine application of the results of engineering.

giving us, for example:

Science: Physics
Engineering: Electrical engineering
Technology: TV Repair, Cable TV Installation

The punch line is that application of this model to computing works as follows:

Science: Mathematics
Engineering: Programming, Informatics, "Computer Science"
Technology: Coding, Computer Installation, Home Computer Repair, etc.

Mathematics IS the science in "Computer Science".

Anyone who has studied advanced Mathematics knows that Math is not about numbers; think of mathematical logic, Boolean algebra, abstract algebra, set theory, topology, category theory, etc. ad infinitum. Dijkstra defined Mathematics as "the art of precise reasoning". In the same sense, "computation" doesn't mean "number crunching", but more generally the automated manipulation of information.

It is true that there are legitimate concerns in today's computational landscape (networking, concurrency, etc.) which didn't figure in the mathematical/engineering world view of the 1940s, but that's simply a sign that the field has grown up (i.e. grown beyond the limited perspectives of its founders). That's also true in many other applications of Mathematics. For example, early research in differential equations paid much more attention to linear differential equations (because they were more tractable). However, we now know that most "interesting" systems in the real world involve non-linearity.

Science, Engineering, and Technology share with living systems an important rule: "Grow or die!" Fortunately, the field of computing has grown.

"Informatics" - no, please NO! (1)

muecksteiner (102093) | more than 7 years ago | (#19788361)

"Informatics" is a horrible word - it's certainly not English, and was probably derived/invented/whatever by some kind of academic lameass who did not have English as a first language.

"Computer science" is not a perfect name, but at least it does not make native speakers cringe every time someone mentions it.

Just my 0.2E-32 EUR

A.

Re:Computer science ? (0)

Anonymous Coward | more than 7 years ago | (#19788375)

Yeah good one.

Except mathematics is not science.

Re:Computer science ? (1)

Colin Smith (2679) | more than 7 years ago | (#19788377)

Computer engineering.. yeah.. I can understand that.. But man.. Computer SCIENCE ?
At the moment, quantum computing. The rest is engineering.

 

Re:Computer science ? (-1, Flamebait)

Anonymous Coward | more than 7 years ago | (#19788425)

Nope, another one bites the dust.

Quantum mechanics is science, quantum information theory is science. Quantum computing is engineering.

Given that everyone here is completely incompetent at something as basic as distinguishing maths, science and engineering, how can we have any faith that they are competent as any kind of mathematician, scientist or engineer?

Depends on the industry (0)

UDFlyers (1068364) | more than 7 years ago | (#19788001)

I don't think the field is "better off" without math; it lays a good foundation for the field. But I have to say that, after college, I didn't use math very much to solve problems. I did use algorithms quite a bit. To be fair, I was designing software in a field that didn't require math. As my career has progressed I have moved higher in the food chain, but ironically I have started working at places that use their computer systems for "computational fluid dynamics." My background in math has been helpful for understanding some of the requirements I get in from the engineers. The point is that math isn't critical outside of the school environment; it's more important to have a foundation/background in whatever industry you are working in. Computer science is a support service just about everywhere I have ever gone (even if you sell products, you sell them to help someone), so being able to understand your target audience is probably the most important thing of all.

Sound like he's just trying to redefine the terms (1)

Derekloffin (741455) | more than 7 years ago | (#19788003)

Obviously, I haven't read the book, but the article sure makes it sound like he's just trying to play word games, relabeling the concepts. And then, once he's done that he goes, 'see, it isn't mathematical anymore', when in fact all he has done is disguise the terminology. I sometimes think myself that math hardly covers the intricacies of computer work, but I think this guy is attacking it in the completely wrong way.

So I am not alone (1)

thomas.prebble (1125281) | more than 7 years ago | (#19788019)

So I am not alone in questioning the maths side of things? Good. My major is computer science and, let me say, my maths is not strong, but I have had no issues with any of my computer science papers. However, when it comes to maths I am stuck; it's dry and does not relate anywhere to my degree that I can see. I've taken courses in algorithms, language theory, databases etc., and the majority of the work is not maths, and when it is, it's so obvious anyone can see it. To help gain experience I'm doing some part-time work in the industry before I graduate, and it comes up nowhere in that line of work either.

Re:So I am not alone (2, Insightful)

nospam007 (722110) | more than 7 years ago | (#19788353)

... I've taken courses in algorithms, language theory, databases etc. and the majority of the work is not maths and if it is it's so obvious anyone can see it.
--
The majority of _your_ work might be.

Depends (2, Insightful)

capt.Hij (318203) | more than 7 years ago | (#19788021)

This is just another stupid generalization. There are some areas where you can do good computer science without math. There are other areas where you absolutely need mathematics. For example, you cannot do scientific computing without mathematics. Broad generalizations like this for a wide spread field just shows the ignorance/narrow mind of the author.

Sadly mistaken (2, Interesting)

Rumagent (86695) | more than 7 years ago | (#19788023)

Isn't that pretty much the same as arguing that a surgeon doesn't have to know about anatomy? What we do is inherently mathematical - there exists no other way of defining and understanding complexity, computability and so on.

I agree that you do not need a good understanding of mathematics to create a homepage, but for anything remotely interesting you do.

Math not essential - Logic is! (5, Insightful)

DeadlyEmbrace (740321) | more than 7 years ago | (#19788025)

I attained a Computer Science BS in 1986. At the time everyone was getting math minors; I opted for a communication minor instead. I've worked in high-tech engineering environments doing real-time programming for many years. What I found is that I've never needed the intense mathematics attained by those with math minors: I needed to be able to implement equations that staff mathematicians would develop. Though math is a fundamental of computer science, I believe the abilities to logically assess a situation from multiple perspectives, communicate your approach with the customer, and then implement a maintainable solution are the key components required for computer scientists.

Re:Math not essential - Logic is! (0)

Anonymous Coward | more than 7 years ago | (#19788163)

You're talking about computer engineering, not science...

Re:Math not essential - Logic is! (0)

Anonymous Coward | more than 7 years ago | (#19788327)

Hate to break it to you, but you're not a computer scientist. You're a programmer. The "staff mathmaticians" you speak of could probably claim to be computer scientists; most staff scientists at software companies can. To be a computer scientist, you must as a minimum develop algorithms as part of your job.

He has no idea what math is (5, Insightful)

aleph taw (1113487) | more than 7 years ago | (#19788029)

This guy just doesn't seem to understand what math is. Substituting his "theory of expressions" for the theory of computation just shifts the focus to another field of math.

Just drink more caffeine (-1, Offtopic)

Anonymous Coward | more than 7 years ago | (#19788031)

Darn! Can't make sense of these number thingies on the soda labels, though.

a growing trend (0)

Anonymous Coward | more than 7 years ago | (#19788039)

I think the mathematical background was necessary because computers were created to solve math-related problems, but computers have been used for more and more different things for a while now.
Of course a minimal mathematical background will always be necessary, but outside specialized fields (maths, computer graphics, ...), and with the advances in programming languages and frameworks, I think it is already happening.
Of course, I did not RTFA ;-)

An example (0)

Anonymous Coward | more than 7 years ago | (#19788047)

The difference between using math to solve a problem and using numerical techniques on a computer can be seen in the space race: the Americans, with their computing power, put men on the moon before the Soviets, who had better rockets and better math ability.

Test devices... (1)

g0dsp33d (849253) | more than 7 years ago | (#19788049)

I actually remember hearing about a device built on this principle of not using math... I believe they called it an Etch-a-Sketch.

As much as I disliked math I don't really see how computing is possible without it. Perhaps using math as the proverbial hammer for computing has created some nails out of screws, but I can't imagine one could have any sort of computing without some level of math built in. The brain is probably the closest example I can think of, and I'm sure there is a lot of math involved somewhere behind the scenes (or certainly would need to be to replicate a brain electronically).

Wrong, on many levels (4, Insightful)

adamwright (536224) | more than 7 years ago | (#19788053)

Mainly, he claims to want to create a "comprehensive theory of process expression". Fair enough, but as soon as you want to extract usable, reliable results from your "comprehensive theory", you've really just created a branch of mathematics. Maths is not just numbers and calculus, but any systematic treatment of relations in a symbolic fashion - unless he plans a lot of fairly useless hand waving, "Oh, my process is expressed as *insert long winded ambiguous English description", he will be working within the remit of mathematics. Heck, one of my areas of study is the development of processes (studied through the use of process calculi) - a highly mathematical tool.

He also ignores the vast array of work on non-deterministic algorithms, stating that "Any program utilising random input to carry out its process, such...is not an algorithm". Sure, it's not a deterministic algorithm, but even if you artificially restrict your definition of algorithm to just be deterministic, it's a useful tool in analysing such problems.

Finally, statements such as "Computer science does not need a theory of computation" are just so bizarre as to be funny. I suggest he forgets all he knows about formal computational theory, and I'll contract "Theseus Research" to write me a program to determine the halting problem for an arbitrary program. I wonder what his bid will be, given that he doesn't need a theory of computation (that would tell him you can't do it, at least with our models of computation - and probably with any).

Now, all of this is not to say you can't make progress in computer science without the mathematics that's currently been developed - however, you will either spend a lot of time achieving unreliable results, be reinventing the wheel, or just be creating a new branch of mathematics.

Without computers, maths... (2, Funny)

mastermemorex (1119537) | more than 7 years ago | (#19788059)

Ok. Let me see if I am able to solve the Navier-Stokes equations for unsteady flows without the help of a computer. And the Schrödinger equation on a three-dimensional grid?

Buahhh! ha, ha!

Re:Without computers, maths... (1)

mastermemorex (1119537) | more than 7 years ago | (#19788441)

Wait! I have a better idea!
Let's express the Schrödinger equation with a Ricci tensor in an m-dimensional Riemann space in Visual Basic 6.0!

No, sorry. Buahhh! ha, ha!

Math is a subset of the bigger picture of ..... (4, Insightful)

3seas (184403) | more than 7 years ago | (#19788063)

....Abstraction.

And computer science, the software side, is really the science of abstraction physics.

http://threeseas.net/abstraction_physics.html [threeseas.net]

At some point in the higher levels of abstraction creation and use, you use the lower mathematical level more as a carrier wave for the higher-level abstraction than for the purpose of performing a mathematical calculation. The analogy is that of using radio waves to carry the music you hear over the radio, where the carrier wave is discarded after it has done its job. Likewise, the mathematics of computers boils down to the binary flipping of transistor switches, upon which the higher levels of mathematics are carried.

With a correct approach to the abstraction-manipulation machines computers really are, we can accomplish a lot more; think of the difference between doing math within the limitations of Roman numerals vs. the decimal system with its zero placeholder.

Logic (1)

headkase (533448) | more than 7 years ago | (#19788255)

And abstraction is a facet of logic. The challenge facing most programmers is how to represent their logic and environments/APIs in a manner that successfully performs the desired task. Representations can range from simple to complex, and the challenge is finding one that efficiently and effectively solves the problem at hand.
As an example, take first-order logic with its classical statement: [wikipedia.org]
All men are mortal.
Socrates is a man.
Therefore Socrates is mortal.
For this situation I tend to look at it as a form of simplistic Venn diagram [wikipedia.org] . It could be interpreted in a set-theory way as:
A circle (mortal)
inside that circle: (man)
inside that circle: (Socrates)
So writing a program that deduces that Socrates is mortal is simply a matter of following the links from Socrates through all the containers/super-sets until you reach mortal or the top level.
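To make that concrete, the link-following deduction described above fits in a few lines (a Python sketch; the names are my own, not from any real system):

```python
# Map each entity to its immediate container/super-set (None = top level).
superset = {
    "Socrates": "man",
    "man": "mortal",
    "mortal": None,
}

def is_a(entity, category):
    """Follow superset links upward until we hit the category or the top."""
    current = entity
    while current is not None:
        if current == category:
            return True
        current = superset.get(current)
    return False

print(is_a("Socrates", "mortal"))  # True
```

Note the deduction is nothing but pointer chasing through the containment chain, which is exactly the Venn-diagram reading.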

Abstraction does not always equal Logic (1)

3seas (184403) | more than 7 years ago | (#19788435)

Write some bad sci-fi (what amounts to an abstraction sequence) and show me the logic.

Abstraction is not a facet of logic, and you've just proved it!

Re:Math is a subset of the bigger picture of ..... (2, Informative)

3seas (184403) | more than 7 years ago | (#19788367)

What the short review seems to be saying is that the author recognizes it's not just math.

How in depth the book goes I do not know, but I do know I've been on about the abstraction perspective for near two decades and communicating it to everyone I can including to those in positions at universities.

I have noticed these last few years there are others beginning to grasp the bigger picture, such as J. Wing of CMU and her "Computational Thinking" perspective http://www.cs.cmu.edu/computational_thinking.html [cmu.edu] perspective and another P. Denning of GMU and his "Great Principles of Computing" http://cs.gmu.edu/cne/pjd/GP/GP-site/welcome.html [gmu.edu] and I'm sure there are others.

Now I see this short book review, "Computer Science Reconsidered: The Invocation Model of Process Expression"... yet I have not seen from any of them software, or even an outline of such, that anyone can use to explore and apply the presented perspective. And we all know that really understanding something as it applies to computers requires actual use of a computer in the learning process, for verification of understanding.

So, here is mine http://threeseas.net/vicprint/Virtual_Interaction_Configuration.html [threeseas.net] which the link I gave in the parent post points to.

It's all about Abstraction Physics, no matter how you present it or what you call it. The evidence is in the inability to avoid using the mentioned set of action constants, with or without computers. Know what you do, in everything you do!

Reading the article (1)

HangingChad (677530) | more than 7 years ago | (#19788065)

It sounds like the author is suggesting it's time for computer science to evolve from an algorithm-based view to a process-based one. That doesn't seem all that controversial. Wouldn't it be the next logical step in the evolution of the computing machine?

I don't think he's saying do away with math; it sounds more like a suggestion not to be limited by a mathematical computing model.

You don't need mathematics at all (0)

Anonymous Coward | more than 7 years ago | (#19788071)

I'm very poor at mathematics and I never studied CS at all.
I've never even been to a university in my life.
Still, I'm a very good programmer. I'm making good money, and I'm getting a lot of job offers because people know that I'm better than many others and can code faster and better than many others.

To be a good programmer you just need good logic skills, and that's all.
Well, mathematics is also pure logic, but still, you don't need to know mathematics at all to be a programmer.

In real programming you just need to know AND, OR, XOR, NOT, 0, 1 and you have everything you need to build anything.
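For what it's worth, that list is even longer than it needs to be: XOR is already redundant given AND, OR and NOT. A tiny illustration (Python):

```python
def xor(a, b):
    # XOR built from AND, OR and NOT alone:
    # true exactly when one input is true but not both.
    return (a or b) and not (a and b)

for a in (False, True):
    for b in (False, True):
        print(a, b, xor(a, b))
```

The same trick scales up: every combinational circuit reduces to such compositions, which is the sense in which those few primitives "build anything".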

Yes and no (2, Interesting)

QX-Mat (460729) | more than 7 years ago | (#19788083)

I feel that there are a lot of software engineering areas where you don't need much in the way of maths experience, just logical thinking. Most real-world math-related implementations I've done haven't relied on a high level of maths... linear interpolation and solving quadratics are probably as "tricky" as math gets outside of academia... but...
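For reference, both of those "tricky" cases fit in a handful of lines (a rough Python sketch; function names are mine):

```python
import math

def lerp(a, b, t):
    """Linear interpolation: t=0 gives a, t=1 gives b."""
    return a + (b - a) * t

def solve_quadratic(a, b, c):
    """Real roots of a*x^2 + b*x + c = 0, or [] if there are none."""
    disc = b * b - 4 * a * c
    if disc < 0:
        return []
    r = math.sqrt(disc)
    return sorted({(-b - r) / (2 * a), (-b + r) / (2 * a)})

print(lerp(0, 10, 0.5))          # 5.0
print(solve_quadratic(1, -3, 2))  # [1.0, 2.0]
```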

That's not the end of it. I've also done a lot of image manipulation work, and you NEED a good math background when you step beyond simple 2D convolution filters. Knowing your physics also helps: being able to identify trends and patterns in waveforms, and then applying the necessary maths, is a great help. When digging into aliasing and reconstruction, not just filtering, a high math proficiency is a must.

I've taken to game programming recently. If you know your maths, the physics comes easily. If you know your maths, especially advanced vector and matrix theory (with integration and differentiation as prerequisites), things become a breeze. I didn't know enough, and I still struggle from time to time today. Experience is helping me, but sometimes I wish I had a math background to fall back on.

I guess my ramblings are leading to a poor conclusion. Without maths you're limited in what you can do, but only limited by lateral field... In most cases you can take a specific software engineering field and go to town without hitting maths. I'm a very good software engineer and reverser, and I got here without having a math background. When I wanted to expand into games programming and image processing, things became much harder without the math.

With all that said, I'm very, very guilty of obscuring simple procedures with valid but pointless math, and I know for a fact there's too much pointless formal theory in computer science now. That pointless formal theory is actually what pushed me away from doing a masters in computer science, toward finding something more vocational and rewarding!

Matt

Re:Yes and no (1)

joel.neely (165789) | more than 7 years ago | (#19788383)

You seem to be making the common mistake of confusing numerical analysis with Mathematics.

Regardless of the application area for your program, when you rewrite something like:

if (!(a && b && c)) {...}

into:

if (!a || !b || !c) {...}

you're using Mathematics (DeMorgan's Law from Boolean Algebra, to be precise). It's hard to imagine a competent programmer writing code for ANY purpose who wouldn't understand the relationships between the two fragments above.

Mathematics (although usually the non-numerical flavor) is fundamental to programming.
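The parent's equivalence is easy to confirm exhaustively, since Boolean expressions over three variables have only eight cases (Python):

```python
from itertools import product

def demorgan_holds():
    """Check !(a && b && c) == !a || !b || !c over all eight assignments."""
    return all(
        (not (a and b and c)) == ((not a) or (not b) or (not c))
        for a, b, c in product((False, True), repeat=3)
    )

print(demorgan_holds())  # True
```

Brute-force truth-table checking like this is itself a small piece of the non-numerical mathematics the parent is talking about.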

CS is not IT (1)

Geoffreyerffoeg (729040) | more than 7 years ago | (#19788093)

You can be great at IT without knowing math. You can probably even be a good programmer.

You cannot go anywhere in CS without knowing math, because (as the author himself admits) CS is merely a discipline of pure mathematics.

This is rather like saying "Forget math to be a great stockbroker." You can start a Fortune 500 company from the ground up without knowing a cosine from a cosecant, but D. E. Shaw will never hire you.

It's computer SCIENCE (1)

Mr. Underbridge (666784) | more than 7 years ago | (#19788103)

Can you really do computer science well without mathematics?

You can't do computer SCIENCE at all without the math. You might do some software engineering. Without understanding the phenomena that underlie the principles you're studying, there is no science. Namely, without any study of algorithms, what's left in the major that anyone would actually call science?

I honestly don't understand the whining. To get the ACM-approved CS major, you end up basically having to get a minor in math, which will generally require a few classes in Calc, a Linear Algebra course, a Discrete Math course, and maybe two others. It's really not that hard.

Wow, women can now become computer scientists (-1, Troll)

Anonymous Coward | more than 7 years ago | (#19788121)

They are terrible at math.

Idiotic. (1)

A beautiful mind (821714) | more than 7 years ago | (#19788125)

Computer science IS applied mathematics at its fundamental level. You can try to forget that as a user, but never as a programmer, much less as a computer scientist.

There is a reason why my MSc in Computer Science involved the third most mathematics education after applied mathematicians and physicists.

Re:Idiotic. (1)

A beautiful mind (821714) | more than 7 years ago | (#19788145)

There is a reason why my MSc in Computer Science involved the third most mathematics education after applied mathematicians and physicists.
Talk about not previewing properly. The sentence I intended to write was:

There is a reason why my MSc in Computer Science involved receiving the third most mathematics education available in my country after applied mathematicians and physicists.

Knuth - nuff said (1)

helfen (791121) | more than 7 years ago | (#19788133)

Just look at any Knuth book ("The Art of Computer Programming" would be suitable). You can't separate mathematics from computer science. Geez: cryptography and all numerical algorithms are math.

Re:Knuth - nuff said (1)

TheRaven64 (641858) | more than 7 years ago | (#19788165)

I'd love to lock the author of this book in a room with Knuth for ten minutes. My money's on Knuth, unless his brain explodes from exposure to such a high level of idiocy.

without maths... (1)

mastermemorex (1119537) | more than 7 years ago | (#19788141)

Physics without maths is called philosophy. Philosophy without physics is called theology. Where the only true is called G. Computers without maths are a TV screen. A player without a screen is a windows machine. Were the only true is called B. Wait a moment! ... Profit!

Teh Maths (4, Insightful)

ShakaUVM (157947) | more than 7 years ago | (#19788143)

This is something I've thought a lot about. There have been any number of times that math has helped me in my software development efforts. Things like trig to predict the path of a moving target in Robowars (back when I was in high school) to various vector and angle related maths in CustomTF for Quake 1 (www.customtf.com) to partial derivatives to calculate the slope on a surface. I've also needed math for various economics related things over the years, and probability and statistics have also been exceptionally useful to me. Currently I'm having to decipher a guy's code which is all eigenmath, so my linear algebra course is saving me from having to hire someone just to explain all the math to me.

But the kicker is that you can't just tell a student that they should "study vector math" because one day they'll write a Quake Mod, because, truth be told, they probably won't. It's the trouble with all examples you give when students ask how math will be useful -- I could pull any number of examples from my life, but the problem is, they probably won't happen in a student's life. Instead, they'll have their own trials. The best you can tell someone is to study all the math they can, because some day it *might* be useful, and they'll want to have that tool in their toolkit.

And that's just not a very satisfying answer to students who want to make sure that they'll be damn well using what you're teaching in the future.

But believe me, I thought I'd never have an application for eigenvectors, and now not only do I have to clean out my brain on the topic, but I have to parse someone else's code (PhD thesis code no less) and add functionality to it. Two other friends of mine got stuck on legacy Fortran apps which are essentially mathematical solvers (one for differential equations, the other for huge linear algebra problems), and both of them are extremely happy they paid attention in their respective math classes.

So, yeah. To CSE students out there: take math. Pay attention. It could very well save your neck some day at a job, and if it doesn't, at least try to make it interesting by thinking of applications where you might use it. You can find applications for all the math from the first two years of college quite easily.
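As one concrete instance of the "predict the path of a moving target" case mentioned above: with a constant-velocity target you lead the shot by solving a quadratic for the intercept time. A simplified 2-D sketch (names and structure are mine, not from any of the games mentioned):

```python
import math

def intercept_time(rel_pos, target_vel, projectile_speed):
    """Smallest positive t with |rel_pos + target_vel*t| == projectile_speed*t,
    or None if the target can't be hit."""
    px, py = rel_pos
    vx, vy = target_vel
    # Squaring both sides of the distance equation gives a quadratic in t.
    a = vx * vx + vy * vy - projectile_speed ** 2
    b = 2 * (px * vx + py * vy)
    c = px * px + py * py
    if abs(a) < 1e-12:                       # speeds match: degenerate, linear case
        return -c / b if b < 0 else None
    disc = b * b - 4 * a * c
    if disc < 0:
        return None                          # target is outrunning the projectile
    r = math.sqrt(disc)
    times = [t for t in ((-b - r) / (2 * a), (-b + r) / (2 * a)) if t > 0]
    return min(times) if times else None

# Stationary target 10 units away, projectile speed 5: hit in 2 time units.
print(intercept_time((10, 0), (0, 0), 5))  # 2.0
```

Aim at `rel_pos + target_vel * t` and the shot and target arrive together; exactly the kind of problem where the vector math pays off.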

Depends on the brand (1)

countach44 (790998) | more than 7 years ago | (#19788157)

I think he means without calculus-based mathematics. Discrete mathematics has obvious applications to computing; there are, of course, some cases where you may need calculus (experience with power series may help solve a recurrence relation, for example), but overall computer scientists really don't need differential calculus or above (in general). I mean, ask a CS person the last time they needed to solve a differential equation or take a line integral. If anything, their answer would be when they took physics (if they had to take anything above general mechanics), which again doesn't make a whole heap of sense.

Thanks for the quick review (1)

DynaSoar (714234) | more than 7 years ago | (#19788221)

He can try as hard as he wants to create CS without math, but when he comes up with something usable, even if it's written in words, it'll describe a process that could be stated more clearly and concisely with math. And if he doesn't state it with math, those who know better won't pay him any attention.

Boole's framework for describing logical processes is math. That's why it's called Boolean algebra. Try to do CS without that.

Re:Thanks for the quick review (1)

neillewis (137544) | more than 7 years ago | (#19788387)

His big idea seems to be that extending Boolean logic with a third value, NULL, makes it more expressive and lets processes be handled purely in relational logic rather than by rule-based algorithms. I can see there could be advantages for certain applications; I'd like to see a good example of the benefits.

http://www.theseusresearch.com/NCLPaper01.htm [theseusresearch.com]
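For the curious, Kleene's strong three-valued logic, which SQL's NULL handling resembles, is easy to sketch (Python, with None standing in for the third value; whether this matches Fant's NULL Convention Logic is another question):

```python
def and3(a, b):
    """Three-valued AND: False dominates, then unknown (None)."""
    if a is False or b is False:
        return False
    if a is None or b is None:
        return None
    return True

def or3(a, b):
    """Three-valued OR: True dominates, then unknown (None)."""
    if a is True or b is True:
        return True
    if a is None or b is None:
        return None
    return False

def not3(a):
    """Negation leaves the unknown value unknown."""
    return None if a is None else not a
```

So `and3(True, None)` is `None` (still unknown) while `and3(False, None)` is `False`, since a False input decides the conjunction regardless of the unknown.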

What a twit. (0)

Anonymous Coward | more than 7 years ago | (#19788229)

You can reinvent the wheel of course, and you will have to if you shun math, but what you will do will still be math even if you call it by a different name. Computer science without math is like architecture without statics.

Computing: Art or Science (1)

dragonrouge (1059352) | more than 7 years ago | (#19788233)

I think that today it could be argued that the development of computer software and hardware depends as much on the developer's imagination, and on an understanding of how people interact with computers and each other, as on mathematics. This would put computer development in the field of art rather than science. Computers are definitely out of the lab and they are not going back. Mathematicians are surely needed, but I would think there should be room for people with knowledge of linguistics, and why not psychologists (possibly to help us away from our computers)? A few designers would help too.

Actually, the author does not say that. (1)

master_p (608214) | more than 7 years ago | (#19788237)

The author says that certain things within a computer should not be modeled as algorithms, due to their purpose and complexity. I partially agree with him: in some cases, the point of interest is not the algorithm but how the process evolves to reach the desired goal.

Computer science uses math because of requirements (1)

Morty (32057) | more than 7 years ago | (#19788267)

Computing is driven by requirements: we want a program or system that does X, and it needs to do it in total time Y, cost less than Z, and respond to operator input in less than W milliseconds. This in turn drives us to design programs, OSs, drivers, and systems that run "quickly" and consume resources such as RAM and disk space within certain limits. To meet these requirements, we need to quantitatively measure the performance of hardware and code to determine time and resource consumption. The tools with which we do this require mathematics. So computing inherently needs math to analyze potential solutions against quantitative requirements. This is pervasive at all levels of CS, whether it's a logic path that needs to complete a calculation before a clock-cycle deadline, a sort that needs to complete in a reasonable amount of time, a network transfer that needs to deal with latency and bandwidth constraints, an OS that needs to grab data off a disk as quickly and fairly as possible to meet the various demands of running programs, or a user interface that needs to respond to user input in a timely manner while creating a DVD image.

There are also plenty of areas of CS that are even more fundamentally mathematical in nature. Network dynamic routing protocols don't work without various graph-related algorithms. Cryptography leans heavily on number theory. Sets/relations revolutionized databases. It's hard to imagine these advances in CS occurring without the strong connection between CS and mathematics.

That said, articles are notoriously bad at summarizing books. After all, if the book could be meaningfully summarized in a few paragraphs, the book wouldn't have been published; instead, the author would have just written an article to begin with. Certainly, there are aspects of computing that are heavily abstracted away from CS's mathematical underpinnings -- for example, non-modal application user interfaces, usability, software engineering techniques, and access controls. Perhaps this guy has a method for describing computer systems that makes it easier to think about these problems. Or perhaps not -- even if he has a "new model," if it doesn't make it easier to solve problems in CS, it's not particularly useful. Hard to tell without more detail.

Binary (0)

Anonymous Coward | more than 7 years ago | (#19788275)

Computers work with ones and zeros, true and false, an electrical expression... There would need to be a significant change in the way computers work in order to program them in a non-mathematical way. What would we use instead of binary? Alphabet soup?

Meaningless (1)

sjhs (453964) | more than 7 years ago | (#19788279)

I agree with what many have said above. It appears that the author doesn't really know what computer science is: it is the study of the theory of computation. As such, it is entirely reasonable, if not necessary, to describe characteristics of computation (complexity of algorithms etc.) mathematically. As for 'process expression' versus 'algorithm', he seems to be splitting hairs just to make an (inconsequential) point. It's not as if a theory of 'process expression' isn't already a large part of CS; it's just not the only large part. (And for the record, I *do* hope that my operating system is able to terminate, and cleanly too ;-P). I think either this is a marketing ploy to get his company's name in the news, or he is just sore about getting bad grades in his math classes.

I wonder how Fant will manage to describe his new theory of 'process expression' without using any mathematics.

Modern Programming is closer to Linguistics (1)

ryber (209932) | more than 7 years ago | (#19788283)

I've been programming business applications for over 7 years and I can tell you that I use little to no mathematics above high-school algebra. The fact is, programming languages themselves require very little math. The math comes from what you're programming "about".

Programming is much closer to the study of linguistics. They are programming LANGUAGES, after all. They have subjects and verbs and modifying expressions. Putting a logical line of code together is no different from forming a sentence in another language. Just look at modern languages like Ruby and Python: where's the math?

Missing The Point...Sigh (0)

Anonymous Coward | more than 7 years ago | (#19788289)

An algorithm is not an expression of the process. As an example, try using an algorithm to obtain a software patent. An algorithm is an abstract representation of the process, but only the actual process can obtain a patent.

In the case of a piece of software, the process is the effect it has on the processor's switches and logic gates in a physical sense. This is something an algorithm does not show you, hence, it does not present the basic requirements needed to obtain a patent (i.e. the actual process).

Thus, you require a new system that provides an expression of the process. 'Process expression' maps what is going on in a physical sense (i.e. the actual process). The actual process has no mathematics or logic; it just follows circuit paths, switch states, etc.

Using an algorithm to patent a piece of software, is about as useful as using a burger to inflate your tyres.

Such patents are worthless.

Re:Missing The Point...Sigh (1)

mmc_DeepT (1125283) | more than 7 years ago | (#19788405)

Explains a lot.

Thats correct (1)

halplus00 (1111725) | more than 7 years ago | (#19788295)

Math is based on computer science, not computer science on math. Math is a subset of computer science.

Anti-Intellectualism (3, Insightful)

QuoteMstr (55051) | more than 7 years ago | (#19788313)

Algorithms exist whether you think about them or not, but if you don't think about them, you'll accidentally create terrible ones.

Just as few telescope makers are astrophysicists, most programmers aren't computer scientists. The author himself is evidently not one. Instead, he is one of the more vocal members of an angry, ignorant mob trying to burn down the edifice of computer science. Its members do not understand it, so they fear it and try to destroy it --- look what's happened to computer science at universities!

It was bad enough when courses about a programming language replaced ones about algorithms and data structures (I'm looking at you, Java and D-flat). It was bad enough when pure profit became the raison d'etre of computer science departments. It was bad enough when I noticed my peers start to joke about how they didn't care about this "maths bullshit" and just wanted to earn more money. It was bad enough when the object, not the bit, became the fundamental unit of information.

But what this author advocates is still worse. He's proposing that we replace the study of computer science with vocational programming, and call that emaciated husk "computer science." We already have a "theory of process expression": the rigorous study of algorithms and data structures. We've constructed it over the past 50-odd years, and it's served us quite well.

That field has given us not only staples, like A* pathfinding, but a whole vocabulary with which we can talk about algorithms -- how do you say that a scheduler is O(log N) in the number of processes except to, well, say it's O(log N)? You can't talk about computer science without talking about algorithms.
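To make the O(log N) example concrete, here's a toy priority scheduler whose insert and pick-next operations are both O(log N) in the number of runnable processes, via a binary heap (a sketch, not any real kernel's scheduler):

```python
import heapq

class Scheduler:
    def __init__(self):
        self._heap = []          # (priority, pid) pairs; lower priority runs first

    def add(self, pid, priority):
        heapq.heappush(self._heap, (priority, pid))   # O(log N)

    def pick_next(self):
        return heapq.heappop(self._heap)[1]           # O(log N)

s = Scheduler()
s.add("init", 0)
s.add("editor", 5)
s.add("backup", 9)
print(s.pick_next())  # init
```

Saying "it uses a heap, so scheduling is O(log N)" conveys in one breath what would take paragraphs of algorithm-free "process expression".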

The author's fearful denunciation of algorithms is only one manifestation of the anti-intellectualism that's sweeping computer science. "We don't need to understand the underpinnings of how things work", the angry mob chants, "but only implement the right interfaces and everything will happen automatically."

The members of this angry mob sometimes manage to cobble something of a program together, but it's more like a huge rock pile than a colonnade. It often barely works, uses too much memory, doesn't handle corner cases, and is likely to crash. (See worsethanfailure.com.) Members of this mob even think that if the same algorithm is expressed in two different languages, it's two different processes. People like this ask painful questions like, "i know quicksort in C# but can someone teach it to me in PHP?"

Argh.

Even "new" developments in programming are just new claptraps for old ideas, with fashions that come and go over the years. The only really new things are algorithms, and increasingly, we're calling people who couldn't independently create bubble sort "computer scientists." It's ridiculous. Call computer science what it is, and create a separate label and department for people who can program, but not discover new things.

It's this idea that one doesn't need to understand or think to be successful that's at the center of the article, and it's not just affecting computer science. Look around you. I wonder whether we'll fall into an old science fiction cliché and regress so far that we are unable to understand or recreate the technology of our ancestors.

Stupid Generalization (1)

pionzypher (886253) | more than 7 years ago | (#19788317)

Really..... What are we talking about here? A web developer? He/she will rarely use anything above algebra. A network admin? Hmmmm... The ones I speak with don't generally do much higher math on a day-to-day basis.

For systems development, I'd absolutely agree. If you're working on low level stuff, you're going to be dealing with math regularly and you damn well better know wtf you're doing. On the other hand though, when I did software QA... the heaviest math I did was figuring some median, max,min, stddevs on load test failure rates. Generalizing that everyone needs math (or needs to forget math) to become a great computer scientist is just begging for attention. You don't need to have an intimate understanding of the pci bus to install a soundcard. You don't need to have a diff eq understanding to test software, or to be a web dev. But you DO need it for other areas. And pretending you don't need it in those circumstances is just stupid.

end inebriated rant (waits for the flammage)

Darn it (0)

Anonymous Coward | more than 7 years ago | (#19788335)

He's right.

Everyone really knows 1+1 =3.

Now, if I could only get 5-4 = 28......

You can't separate logic and math (1)

shaitand (626655) | more than 7 years ago | (#19788337)

In fact, math is nothing more than a branch of logic. You can't separate computer programming from math, for two reasons. First, programming is nothing but logic, and every function is an algebra problem whether you use that form of logic to derive it or not. Second, the higher-level programming used today is nothing but an abstraction over the pure number crunching that is the basic operation of a computer. A computer is a calculator with lots of memory, nothing more.
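To illustrate the "every function is an algebra problem" point: even a branching function like max can be written as pure arithmetic. A toy Python sketch (the function names are just for this example):

```python
def max_branch(a, b):
    # The "logic" version: an explicit comparison
    return a if a > b else b

def max_algebra(a, b):
    # The same function as an algebra problem:
    # max(a, b) = (a + b + |a - b|) / 2
    return (a + b + abs(a - b)) / 2

# Both compute the same function
assert max_algebra(3, 7) == max_branch(3, 7) == 7
```

Whether you derive it with a comparison or with algebra, it's the same mathematical object.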

Knuth's point (0)

Anonymous Coward | more than 7 years ago | (#19788379)

"The primary questions of computer science are not of computational possibilities but of expressional possibilities."

That is why Dr. Knuth created something called "Literate Programming". It was a small step towards what the author calls "Expressional Possibility".

"The practitioner of literate programming can be regarded as an essayist, whose main concern is with exposition and excellence of style. Such an author, with thesaurus in hand, chooses the names of variables carefully and explains what each variable means. He or she strives for a program that is comprehensible because its concepts have been introduced in an order that is best for human understanding, using a mixture of formal and informal methods that reinforce each other."
                          --D. E. Knuth
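Knuth's actual literate-programming tools (WEB/CWEB) go much further, but the naming discipline alone can be shown in a toy Python contrast (a hypothetical example, not taken from Knuth):

```python
# Cryptic: correct, but the reader must reverse-engineer the intent
def f(p, r, n):
    return p * (1 + r) ** n

# Literate: the names and docstring carry the exposition
def compound_balance(principal, annual_rate, years):
    """Balance after compounding `principal` at `annual_rate` for `years`."""
    return principal * (1 + annual_rate) ** years
```

Both functions are identical to the machine; only the second one is written for a human reader.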

The author never heard of a CIS or MIS degree? (0)

Anonymous Coward | more than 7 years ago | (#19788397)

The author is unknowledgeable about the IT industry.
There are lots of degrees in IT that do NOT involve math requirements.
Most community colleges offer a one- or two-year degree in programming.
If you aspire to a higher-level education (a Bachelor's, Master's, or PhD) without the math...
C.I.S. (Computer Information Systems) replaces the math requirements with business classes.
M.I.S. (Management Information Systems) has even more business classes.
M.I.S. (Management Information Systems) has even more Business classes.

Most colleges' schools of business already provide a degree program for teaching software development to non-mathematical people.

However, sometimes you do need math in programming... which is why Computer Science requires math.
Don't water down what Computer Science means.

Misleading title and you should RTFA. (1)

Jartan (219704) | more than 7 years ago | (#19788413)

As per the norm on /. the article title is incredibly misleading. The guy never once talks about ditching math. He doesn't even vaguely imply that math should be ditched. What he seems to be talking about is how the methods and tools of orthodox mathematicians (i.e. pure math guys) are still being used by Computer Scientists when all of that stuff should by now be far more "specialized" towards the things Computer Science really studies.

I think we can see what he's talking about in the many arguments about functional programming. I don't want to open that debate but I can at least attest I've met many people who are biased towards functional programming simply because it's so mathematical.

It's not hard for me to imagine that some of these same people (who are in academia) are pretty closed-minded about updating their skill set to something more suited to their actual jobs.
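The mathematical pull of functional style is easy to show. Both Python snippets below compute the same sum of squares; the second reads almost like sigma notation (a neutral illustration, not an endorsement either way):

```python
from functools import reduce

nums = [1, 2, 3, 4]

# Imperative: mutate an accumulator step by step
total = 0
for n in nums:
    total += n * n

# Functional: fold over the list, close to the math notation
total_fp = reduce(lambda acc, n: acc + n * n, nums, 0)

assert total == total_fp == 30
```

Whether that mathematical flavor is a feature or a bias is exactly the debate the parent is describing.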

Math != Logic (0)

Prototerm (762512) | more than 7 years ago | (#19788437)

There's something to this.

You don't need math skills to be a good computer programmer. I, for one, have awful math skills and don't like dealing with numbers (and yes, this does get in the way from time to time, but that is rare).

Computers are all about logic, and due to a minor learning problem I've had since childhood, I've always leaned heavily on logic to get better-than-average grades throughout school. As long as a subject lends itself to being *understood* rather than *memorized*, I'm in like Flynn.

When you program a computer, you don't think to yourself "what's the square root of seventeen multiplied by e to the twenty-seventh"; you think in terms of "if-then-else" (even if the syntax is different, it often comes down to that simple bit of logic).

So is math useful to the computer programmer? Absolutely. Is it necessary? Absolutely not. As long as you can *think* like the computer -- in other words, using logic -- you're better off than someone who just knows their sums. I've been doing this for over 25 years, and politically correct or not, this idea has worked just fine for me.
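The "think in if-then-else" point above, as a toy Python sketch: everyday program logic that involves no arithmetic at all (the scenario and names are hypothetical):

```python
def door_message(is_locked, has_key):
    # Pure if-then-else reasoning; no numbers anywhere
    if not is_locked:
        return "walk in"
    elif has_key:
        return "unlock and enter"
    else:
        return "knock"
```

Most business logic looks like this: branching on conditions, not crunching numbers.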

Well... Yeah (0, Redundant)

CrashandDie (1114135) | more than 7 years ago | (#19788445)

I mean, who gives a damn about math? You don't need to be brilliant at math to code a forum in PHP, nor do you need to be especially bright during math class in order to understand how [insert cool language or feature here] works, and how to use it...

I mean, I don't think I'm a bad programmer (I'm still studying, but I guess I'm not that bad), but hell, I completely failed math for my bachelor's degree (2/20), and I'm still weak in that domain...

The point is, there's absolutely no need for math in the most common tasks.

OK, someone said you needed math to get a filesystem going. I'm sorry, but you really don't need to know how to use a Fourier series, or to know the Achilles numbers by heart, to open a file, save some stuff in it, and then close it, eh... (fopen/fputs/fclose -- OK, so maybe that's a wee bit too simplified, but still, you get the point). So really, until you get to the point where you have to start designing chips that draw things on the screen, you're not going to need a lot of what you've learned...
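The fopen/fputs/fclose point, shown in Python for consistency with the other snippets here (the file name is made up): writing and reading back a file takes no math whatsoever.

```python
# Write, then read back, a plain text file -- no Fourier series required
with open("notes.txt", "w") as f:
    f.write("no Fourier series needed\n")

with open("notes.txt") as f:
    assert f.read() == "no Fourier series needed\n"
```

The `with` blocks handle the open/close bookkeeping; the whole exercise is bookkeeping, not mathematics.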

Of course, this is completely different from job to job... Last internship, I was soaked in marine biology, and a lot of stuff I hadn't seen in years (almost a decade?)... I'm sure, if I had to take an internship somewhere else where math is the big guy in town, then yeah, I'd need to start filling in my gaps in math. But just because one given position needed a background in biology doesn't mean everyone taking CS should have to study biology!

Math can be a big bonus whatever domain you're in, be it computer science or toasting burgers at the local snack bar. But don't forget that it can just as easily be no bonus at all.