Slashdot: News for Nerds


Nicholas Carr Foresees Brains Optimized For Browsing

timothy posted more than 2 years ago | from the think-different dept.

The Internet 110

An anonymous reader writes "In the next decade, our brains are going to become optimized for information browsing, says best-selling author Nicholas Carr. According to Carr, while the genetic nature of our brains isn't being changed by the Internet at all, our brains are adapting 'at a cellular level' and are weakening modes of thinking we no longer exercise. Therefore, in 10 years, if human beings are using the Internet even more than they do today, says Carr, 'our brains will be even more optimized for information browsing, skimming and scanning, and multitasking — fast, scattered modes of thought — and even less capable of the kinds of more attentive, contemplative thinking that the net discourages.'" While Carr isn't making a case for Lamarckian evolution, the argument here seems weak to me; the same kind of brain change could be attributed to books, or television, or the automobile, couldn't it?



Crystal Balls (-1, Troll)

Anonymous Coward | more than 2 years ago | (#39974791)

I foresee your mouth optimized for cocksucking.

re (2, Interesting)

Anonymous Coward | more than 2 years ago | (#39974803)

Television and the automobile, certainly. However, it seems arguable that books encourage attentive, contemplative thinking. The automobile can be a bit fuzzier - but certainly highway driving requires extreme amounts of attention. City driving isn't usually done for long stretches - unless it's stop and go, in which case nothing is happening to make it require much brain exercise.

Also, how does this make the argument seem weak? I'm sure there's a large body of work arguing the same is indeed true of television.

Re:re (4, Informative)

chrb (1083577) | more than 2 years ago | (#39974887)

City driving isn't usually done for long stretches - unless it's stop and go, in which case nothing is happening to make it require much brain exercise.

Route planning and navigating through a complex urban environment can require more thought than driving along a relatively straight highway. MRI scans on taxi drivers have shown actual physical brain changes from learning complex urban maps. [bbc.co.uk]

Re:re (3, Insightful)

TheCarp (96830) | more than 2 years ago | (#39975423)

Isn't that to be expected? Any time you look for physical brain changes from years of practiced learning, you find it. That is just what the brain does.

Also, cabbies are a special case: most people drive the same repetitive routes over and over, and route planning is hardly needed after you have settled into one or two ways of getting to work and home.

Re:re (1)

phantomfive (622387) | more than 2 years ago | (#39977103)

Isn't that to be expected? Any time you look for physical brain changes from years of practiced learning, you find it.

No, actually, that was the interesting part of that study. It was the first study to show a part of the brain growing as a result of an activity. As recently as 2000 many neurologists argued (fiercely) that the brain is static after adulthood.

In general it is difficult to tell whether someone has an enlarged brain region because they are good at an activity, or whether they are good at it because they have an enlarged brain region. There is still a small possibility of the latter even with the cabbies, because they self-selected to do their job.

Re:re (1)

Anonymous Coward | more than 2 years ago | (#39975475)

While I agree with everything you say, I have to be a pedant and suggest that all mental activity causes "actual physical brain changes". The study is remarkable because these changes are measurable with our imprecise and noninvasive tools. A parallel would be perceiving evidence of human activity on Earth from orbit with the naked eye. Not being able to see evidence would in no way show human activity wasn't there.
In the case of the human mind one would assume that evidence of the thought also functions as evidence of a physical change in the brain, unless we are working under the assumption that the human brain works using magic.

Back on topic I think the evolution (or devolution) of the exterior digestive process in humans is relevant here. By pre-digesting its food through cooking it, mankind lowered the evolutionary requirements of their digestive systems for survival. Over time the capabilities of the digestive system deteriorated to the point we are at now where we have something of a symbiotic relationship with cooking equipment.
Similarly, it seems highly likely that as we augment our minds with computer equipment (merely by using a computer one is augmenting one's mind), what has previously evolved will be undone as the pressures favouring certain genes are gone. Once this point is reached we are dependent on the machines to function, just as they are dependent on us. A large question is which aspects of the human thought process are innate and which aspects aren't. However, the eventual destination is the same whatever the answer; only the time of arrival actually differs. As well, perhaps, as the ease with which we may extricate ourselves from our dependence (assuming social structure and conventions are any easier to alter than genes, of course).

I should mention that there are many pros to both the symbiotic relationships I outline above, and personally I find them both desirable states of affairs; this is, however, no reason I should be ignorant of their true natures and the risks attached.

Re:re (1, Redundant)

steelfood (895457) | more than 2 years ago | (#39977193)

Highway driving requires the most attention at high speeds, or while weaving through heavy (but moving) traffic. Driving unfamiliar or less familiar routes (like going to a shopping mall 50 miles away that you'd normally go to only twice a year) also requires the same navigation abilities as city driving, but at a larger physical scale.

Map reading and route memorization help maintain short-term memory. This exercise is rendered moot by using a GPS. It also doesn't come into play when driving a familiar route, e.g. from home to work or from home to the grocery store.

Now television...that's the biggest intellect-killer out there. At least there's interaction when browsing or surfing the internet (for example, posting on /.), albeit minimal. But the brain is less active when watching TV than even when sleeping. On the other hand, constantly being online causes shorter attention spans. What suffers is concentration and focus, generally speaking, which is necessary in order to solve difficult problems on the fly. It's the same skill that's necessary to do a complicated math problem completely in your head.

All in all, while TV causes an overall degradation of the brain, continuous use of the internet results in lateral movement, from being able to do a singular task very well to being able to do multiple things all at once. Whether this is a bad thing or not is an exercise left to the reader. I personally think that though a jack of all trades is master of none, it's important to have both vast but shallow knowledge and one specific field of expertise. In effect, being able to do one thing really well, and being able to do many things passably, are both important mental skills to hone and maintain.

Re:re (0)

Anonymous Coward | more than 2 years ago | (#39977579)

Clearly false. Taxi drivers don't have brains.

'Er, do wot, John? Guess who I had in the back of my cab last week? Some bleedin' effin poofters, that's who, strike a bleedin' light.

Re:re (4, Insightful)

mellon (7048) | more than 2 years ago | (#39975309)

Plus, Lamarckian evolution involves inheritance, whereas the author is talking about learned/conditioned behavior in individuals. The brain is plastic. It very definitely does adapt to do well whatever you do often.

In my experience, highway driving is great for contemplation. City driving not so much. YMMV... :)

this would apply to childhood as well (1)

Pathoth (2637433) | more than 2 years ago | (#39976831)

I believe the main reason we don't have more female programmers is not a question of talent, but early exposure and the belief they can be successful in that area. If you don't think about something, "that part" of the brain will end up being used for something else instead.

Re:re (1)

tinkerton (199273) | more than 2 years ago | (#39977317)

However, it seems arguable that books encourage attentive, contemplative thinking.

And a long attention span, and the ability to grasp complex subjects, and so on. If people aren't able to read books anymore, that really sucks.

Re:re (1)

drsmithy (35869) | more than 2 years ago | (#39977749)

The automobile can be a bit fuzzier - but certainly highway driving requires extreme amounts of attention. City driving isn't usually done for long stretches - unless it's stop and go, in which case nothing is happening to make it require much brain exercise.

This sounds backwards. City driving has a lot more hazards and variations that need to be tracked simultaneously. Trundling along on a highway at a pretty constant speed (probably using cruise control, at that) isn't especially taxing.

I would be interested to see some actual studies, but I would be surprised if the typical driver found highway driving more stressful than city driving.

pssssssh (1)

Hsien-Ko (1090623) | more than 2 years ago | (#39974811)

I want to see brains optimized for gopher and emacs.

Re:pssssssh (1)

LostCluster2.0 (2637341) | more than 2 years ago | (#39974921)

Nah, brains will be optimized for IPv6!

Re:pssssssh (0)

Anonymous Coward | more than 2 years ago | (#39976793)

Masami Eiri? Is that you?

Re:pssssssh (3, Funny)

spazdor (902907) | more than 2 years ago | (#39975049)

DUH. Brains can already run on emacs, didn't you know Lisp was originally developed for AI research?

Just type ctrl-shift-R meta-K semicolon shift-Q-X, hit F13, and hold your occipital lobe against the SysRq key for 5 seconds. Voila - instant, permanent transfer of consciousness from the boring old physical world, into your third-favourite text editor! It couldn't be simpler.

Re:pssssssh (2)

el3mentary (1349033) | more than 2 years ago | (#39975839)

Pfft, that's ridiculous, in Python you just type "import consciousness" and you're done

Re:pssssssh (2)

ultranova (717540) | more than 2 years ago | (#39976989)

And the best part is that your now mindless body can make a wonderful career in either politics or the financial industry!

Re:pssssssh (0)

Anonymous Coward | more than 2 years ago | (#39975125)

-1

vi > emacs

Already waaay beyond that ... (2)

Barbara, not Barbie (721478) | more than 2 years ago | (#39975177)

My brain is already optimized for ignoring Nicholas Carr. The first couple of sentences were enough to determine the rest is not worth reading.

Re:Already waaay beyond that ... (1)

Hognoxious (631665) | more than 2 years ago | (#39977023)

I was not aware of him, but my brain has miraculously reconfigured itself to do the same.

Re:pssssssh (1)

marcello_dl (667940) | more than 2 years ago | (#39977507)

Optimizing brains for emacs is pointless, until we have optimized hands.

HttPS For Brains? (2)

LostCluster2.0 (2637341) | more than 2 years ago | (#39974821)

If brains become web browsers, does that mean we'll need antivirus injections, javascript bandages, and be careful what cookies we eat?

Nicholas Carr Foresees Obvious: (5, Insightful)

RavenousBlack (1003258) | more than 2 years ago | (#39974823)

Do something more often and your brain will become optimized for it. I think they call it learning.

Re:Nicholas Carr Foresees Obvious: (2)

hughJ (1343331) | more than 2 years ago | (#39974975)

neuroplasticity

Re:Nicholas Carr Foresees Obvious: (1)

steelfood (895457) | more than 2 years ago | (#39977209)

This.

Learning is merely the acquisition of knowledge. This includes acquiring the knowledge of new methods of thought, or new ways to think. But actually thinking, and rewiring the brain to think in a certain way, is completely different.

You can teach knowledge, but you can't teach people how to think. They either do it or they don't. You can show them many methods to think, but you can't force them to think in a particular way.

Re:Nicholas Carr Foresees Obvious: (1)

Dahamma (304068) | more than 2 years ago | (#39975009)

And the inverse is usually true as well: don't do something at all for a long time and you tend to forget how to do it - like calculus :)

Re:Nicholas Carr Foresees Obvious: (1)

elucido (870205) | more than 2 years ago | (#39975263)

And the inverse is usually true as well: don't do something at all for a long time and you tend to forget how to do it - like calculus :)

So you'll forget how to read if you don't read the entire book cover to cover?

Re:Nicholas Carr Foresees Obvious: (2)

oldhack (1037484) | more than 2 years ago | (#39975315)

See what happened to you once you stopped thinking?

Re:Nicholas Carr Foresees Obvious: (1)

Hognoxious (631665) | more than 2 years ago | (#39977057)

And the inverse is usually true as well: don't do something at all for a long time and you tend to forget how to do it - like calculus :)

The proverb about riding a bike seems to suggest the opposite. Sure, you get rusty, but it doesn't take anywhere near as long to get back up to speed as it did to learn it in the first place. Seems the memory doesn't disappear but goes dormant.

But maybe calculus is different...

weak analogy (4, Insightful)

tverbeek (457094) | more than 2 years ago | (#39974827)

While Carr isn't making a case for Lamarckian evolution, the argument here seems weak to me; the same kind of brain change could be attributed to books, or television, or the automobile, couldn't it?

The counterargument here seems weak to me; books, television, and the automobile aren't the same as the web, so the learned change wouldn't be of the same kinds.

Re:weak analogy (5, Funny)

Dahamma (304068) | more than 2 years ago | (#39975025)

The counterargument here seems weak to me

Yeah, that's because the original article was written by a best-selling, Pulitzer prize nominated author, and the counterargument was written by timothy.

Re:weak analogy (2)

reve_etrange (2377702) | more than 2 years ago | (#39975031)

the automobile aren't the same as the web, so the learned change wouldn't be of the same kind

Indeed [wired.com]. Probably the impacts from web use will be greater than those from some other sources because an increasing number of young children, with their highly plastic brains, are spending time accessing the Internet. Mostly it is adults who, say, drive an automobile.

Re:weak analogy (0)

Anonymous Coward | more than 2 years ago | (#39975087)

Exactly, it seems the brain of the submitter has already changed. People generally scan books and multitask while reading them. They recreate worlds and imagery in their mind. People simply sit and watch TV (or do most people leave it on in the background like radios?). You are scanning the road while driving (well, you should be), but you're trying to extract obstacles and predict possible collisions. That's completely different from skimming through a web page.

I agree with the author. If you learn to ride a bike you don't forget, but you're not as good as you used to be if you haven't ridden it in a while. Same with concentration. If you rarely concentrate on a single thing for a while, you'll have a harder time trying to do it in the future compared to someone who does it every day.

Practice makes perfect (or perfects your mistakes)

Re:weak analogy (2)

mcrbids (148650) | more than 2 years ago | (#39975627)

I don't think that the counterargument is weak at all because Oh look! Another awesome cat video! You gotta check this out! [youtube.com] And that means that what were we talking about again?

The internet makes you stupid - more at 11. (2)

epyT-R (613989) | more than 2 years ago | (#39974829)

oh well. I guess somethingawful was right all along! Now I must research this by finding blogs that agree with my bias..
---
The internet is a tool. Like any tool it can have positive and negative effects on the user, remembering that positive and negative are relative terms.

Re:The internet makes you stupid - more at 11. (1)

LostCluster2.0 (2637341) | more than 2 years ago | (#39974839)

The 'net is a lot like television. Watching smart shows makes you smarter; watching dumb shows entertains you but makes you stupider.

Re:The internet makes you stupid - more at 11. (2)

epyT-R (613989) | more than 2 years ago | (#39974933)

hmmm.. well I think the determination between 'smart' content and 'stupid' content relates to the reasons why it's being looked up. TV is a bit different as it's completely passive, and all programming is from established outlets who depend on ubiquitous eyeballs. In order to do that, it must appeal to the largest demographic, demanding insipid melodrama and whitewashed, politically correct truth, even in things like documentaries and news broadcasts. In contrast, it is still possible to find interesting content on the net.

Habit != evolution (3, Interesting)

Spy Handler (822350) | more than 2 years ago | (#39974837)

Evolution requires death, selective pressure, serious things like that, and takes place over generations. This ain't evolution. It's just people getting into a habit.

Yes it probably does change our brains on a cellular level, just like the recent habit of no hard physical labor changed our muscles on a cellular level. It's easily reversible simply by doing the old things again.

Re:Habit != evolution (1)

tverbeek (457094) | more than 2 years ago | (#39974903)

Changes in cellular physiology are not fully reversible. If you revert to an active lifestyle after a decade of being sedentary, you can grow new muscle tissue, but it'll probably never be as healthy as what you had before. I can run and swim and lift as much as I have time for (and I am), but I'll never again be as fit as I was when I was 20. Likewise, there's no turning my brain back to the condition it was in back then, either (which is both a good thing and a bad thing).

Re:Habit != evolution (1)

AchilleTalon (540925) | more than 2 years ago | (#39974979)

Being non-reversible doesn't mean it is an evolutive treat. To be claimed as an evolutive treat it must be encoded into the genes and passed on to the next generations. So the original argument is really poor and weak. It seems that guy just doesn't know what evolution is and is throwing this out for the fame.

Re:Habit != evolution (2)

AchilleTalon (540925) | more than 2 years ago | (#39974987)

Oops, I meant trait instead of treat. BTW, no matter how short you cut your leg, even if it's irreversible, your newborn won't be born with one leg.

Re:Habit != evolution (1)

reve_etrange (2377702) | more than 2 years ago | (#39975105)

it must be encoded into the genes and passed to the next generations

That's true, but those changes can probably be acquired in other ways than mutation and crossing over alone. There is a whole field [wikipedia.org] about it.

If you include traits like infection with Wolbachia [wikipedia.org] in insects (which has far-reaching consequences including parthenogenesis in species which otherwise need sexual reproduction), there is actually a fair amount of Lamarckism in the natural world.

Evolution != genetics (3, Insightful)

tverbeek (457094) | more than 2 years ago | (#39975515)

Describing evolution strictly in terms of DNA isn't exactly "wrong"... but it's comparable to describing astronomy strictly in terms of Newtonian physics: perfectly good most of the time, but there are "edge" cases (such as objects approaching the speed of light, or certain species of intelligent primate with advanced communication skills) where it doesn't quite explain what's happening. To fully understand and explain hominid evolution, you also need to look at the linguistic/educational channel through which certain non-genetic traits are passed from generation to generation.

Re:Habit != evolution (2)

tverbeek (457094) | more than 2 years ago | (#39975123)

Genes are not the only way we pass things on to later generations. We also do it through language. Genetics is just the "hardware" side of the system; humans also have developed a way of passing on behaviors and skills through "software", which we load into our offspring after they come off the assembly line. A great deal of what makes us the kinds of animals we are is implemented in software, not hardware. That ability to evolve in ways beyond mere genetic mutation is how we've become one of the most successful species on the planet.

Re:Habit != evolution (1)

elucido (870205) | more than 2 years ago | (#39975247)

Changes in cellular physiology are not fully reversible. If you revert to an active lifestyle after a decade of being sedentary, you can grow new muscle tissue, but it'll probably never be as healthy as what you had before. I can run and swim and lift as much as I have time for (and I am), but I'll never again be as fit as I was when I was 20. Likewise, there's no turning my brain back to the condition it was in back then, either (which is both a good thing and a bad thing).

That is just aging. That has nothing to do with how much you work out. You could be the athlete of the century and by age 30 you won't be like you were at age 20.

Re:Habit != evolution (1)

tverbeek (457094) | more than 2 years ago | (#39975357)

But that was my point: time's arrow only points one way. It isn't even about what we consider "aging"; even if the change happens on a short biological scale, such that you're still about the same age before and after, it's still not fully reversible.

Evolution != genetics (2)

tverbeek (457094) | more than 2 years ago | (#39975023)

Also, it seems a bit narrow to insist that "evolution" be defined only in terms of genetic inheritance. The ability of a sufficiently intelligent species to not only learn new behaviors but also teach them to their offspring is – in effect – a persistent change in that species. We didn't become a species of arithmetic-performing apes through natural selection of genetic material, but by passing on that skill through teaching. Furthermore, a species which is capable of (more or less permanently) altering the environment in which future generations are born and develop is also producing a form of evolution. For example (and for better or worse), most of humanity now grows up looking at lighted screens a substantial part of their lives, and will continue to be different in their cognition from previous generations because of that. The genetics of H. sapiens have changed insubstantially in the past century, but H. sapiens as a population is a markedly different primate. That is evolution.

Re:Habit != evolution (1)

jo42 (227475) | more than 2 years ago | (#39975045)

Welcome to Dumbtards'R'Us - even dumber than before!

Idiocracy wasn't a comedy, it was a documentary sent back from the future...

Re:Habit != evolution (1)

Ichijo (607641) | more than 2 years ago | (#39975109)

Evolution requires death...

No, evolution doesn't require death, but death assists evolution by preventing some individuals from procreating.

...selective pressure, serious things like that..

Developing a skill that's in demand by society gives the individual a greater chance at passing on his/her genes, and that's evolution.

Re:Habit != evolution (1)

tverbeek (457094) | more than 2 years ago | (#39975291)

Developing a skill that's in demand by society gives the individual a greater chance at passing on his/her genes, and that's evolution.

Developing a skill that's in demand by society also gives the individual a greater chance at passing on that skill through education... and not just to his offspring, but to others' offspring! Isn't that a form of evolution as well? It's a well-established principle that we're the product of both nature and nurture... why look at evolution solely in terms of one (DNA), and not the other (K12)?

As seen in caveman science fiction (1)

steveha (103154) | more than 2 years ago | (#39974869)

Caveman0: I am draw story on cave wall!

Caveman1: No! Memorize oral tradition make brains strong! Picture story make brains weak!

[panel of cave-children staring vacantly at cave paintings, slack-jawed, drooling]

Caveman0: Me go too far! Me am play gods!

In case you don't know the meme, original source: http://dresdencodak.com/2009/09/22/caveman-science-fiction/ [dresdencodak.com]

steveha

Not a weak argument (4, Interesting)

artor3 (1344997) | more than 2 years ago | (#39974901)

While Carr isn't making a case for Lamarckian evolution, the argument here seems weak to me; the same kind of brain change could be attributed to books, or television, or the automobile, couldn't it?

Yes, the same kinds of changes could be attributed to the things you named. Which is likely why people who grew up with black and white television dreamed in black and white. Our brains are absolutely affected at a deep level by the things we spend our time on. It seems almost trivially obvious to say so. The real question is whether or not this is a bad thing. Yes, our modes of thinking may become dependent on "browsing" -- on having a ready cache of facts and trivia that don't need to be stored in gray matter. But if it is the case that browsing is indeed always available, might that not be a good thing? Couldn't that free up resources, currently devoted to memorizing state capitals, that could be better spent on higher level reasoning? Math classes can certainly teach more interesting topics now that calculators have obviated the need to memorize logarithm tables.

Re:Not a weak argument (-1, Flamebait)

AchilleTalon (540925) | more than 2 years ago | (#39975011)

Go back to school, you do not understand a word about evolution. It's a pity to read a comment like yours on a geek blog.

Re:Not a weak argument (1)

artor3 (1344997) | more than 2 years ago | (#39975059)

Where did I use the word evolution? People, and for that matter all living things, adapt to their surroundings, completely independent of the process of evolution. I'm pretty sure you're just trying to troll (how cute!), but really, you're just making a fool of yourself.

Re:Not a weak argument (1)

AchilleTalon (540925) | more than 2 years ago | (#39975069)

Same kind of changes, your words. Same as what? The same as the news story, which is about an evolutive trait.

Re:Not a weak argument (2)

artor3 (1344997) | more than 2 years ago | (#39975211)

Wow, fail at reading comprehension much?

From timothy's editorializing of the summary:

While Carr isn't making a case for Lamarckian evolution, the argument here seems weak to me; the same kind of brain change could be attributed to books, or television, or the automobile, couldn't it?

What I say:

Yes, the same kinds of changes could be attributed to the things you named

So, clearly, I was referring to the same changes as timothy was referring to. And what changes were those? Let's look at the summary to find out!

According to Carr, while the genetic nature of our brains isn't being changed by the Internet at all, our brains are adapting 'at a cellular level' and are weakening modes of thinking we no longer exercise.

So the original submitter says "the genetic nature of our brains isn't being changed at all". Timothy says "Carr isn't making a case for Lamarckian evolution". I don't mention evolution at all. Literally no one mentioned evolution, except to say that this isn't evolution. And then you come charging in to tell everyone, "Wow, you guys are stupid! This isn't evolution!"

Well, gee. Thanks for the tip, Captain Obvious.

pot - kettle - black (0)

Anonymous Coward | more than 2 years ago | (#39975157)

There are always some members of each generation of humans who are resistant to learning, and who end up as evolutionary dead-ends. Sorry, dude, but you appear to be Homo erectus.*

*and not in the sense of a crude juvenile joke.

Re:Not a weak argument (1)

0111 1110 (518466) | more than 2 years ago | (#39975375)

Which is likely why people who grew up with black and white television dreamed in black and white.

When I grew up there were both black and white and color TVs. So did I dream in color or B&W? If I buy an HDTV will I start to dream in 1080p?

Math classes can certainly teach more interesting topics now that calculators have obviated the need to memorize logarithm tables.

For instance? I didn't realize that mathematics had changed all that much since the invention of calculators.

Re:Not a weak argument (1)

bondsbw (888959) | more than 2 years ago | (#39976779)

For instance? I didn't realize that mathematics had changed all that much since the invention of calculators.

No claim was made that the "more interesting topics" are new. They are perhaps just more advanced, like skipping some of the months of long division to focus earlier on algebra.

It bothers me when a fifth grader knows more about some subject than I do. But the problem isn't that I feel dumb... it's that I know that the fifth grader will one day be in my shoes, and ask, "Why was I put through years of this? I can't remember any of this because I never need it... and I want those years of my life back."

Computers are a tool; let's use them to modernize education and stop promoting the ideals of the 1940s.

Re:Not a weak argument (2)

martin-boundary (547041) | more than 2 years ago | (#39976313)

We teach math for a reason. If you go into a store and want to pay your bill, you have to know how to count so you can present the correct amount of money, and so you can check that your change is correct also.

It hasn't strictly been necessary to know how to do that for about two generations now; people *could* just carry an electronic pocket calculator and do the sums on demand at the store counter, but nobody does. It's stupid, and using the counting machinery in your head is much better and more practical.

The same is true with browsing facts on demand, it's generally stupid. It's like taking out the calculator in the store, then fishing out the manual from your pocket to learn how to operate the keys, and another piece of paper which has pictures of the bills in your wallet, and a conversion table so you can enter the proper values in the calculator, etc. Oh, and the manual is in Chinese, so you also have to take out the English-Chinese dictionary that you carry in your manbag...

We're better off with as many facts as possible in our heads, but realistically each of us only has limited capacity. We should still aim to fill the capacity completely.

Re:Not a weak argument (1)

phantomfive (622387) | more than 2 years ago | (#39977107)

Math classes can certainly teach more interesting topics now that calculators have obviated the need to memorize logarithm tables.

Did people really memorize logarithm tables? I always just looked them up in the back of the book......

Re:Not a weak argument (2)

steelfood (895457) | more than 2 years ago | (#39977243)

Math classes can certainly teach more interesting topics now that calculators have obviated the need to memorize logarithm tables.

Eh, no. Memorizing log tables is not relevant to math. It isn't even a part of math. You can leave your answers in log form without losing points. Math is more abstract than the final number at the end. If you've ever done algorithm analysis (big-O notation), that's closer to what math is than figuring out the log, sin, or cos of some non-trivial number.

And because of this, calculators don't help. They do the exact opposite. I mentioned in an earlier post how by using a GPS, you lose the need to exercise a certain thought pattern. Calculators are the same. By constantly using it, you're actually losing a certain ability. Instead of "freeing up" room, you actually cease to be able to put 1 and 1 together without some external device giving you the answer. If you consider your brain to be a muscle (which is the most apt analogy for the brain as I've ever encountered), the calculator would be an exoskeleton. You won't and can't gain muscle by using an exoskeleton. You'll in fact start to lose muscle if that's all you use. You can only gain more muscle by exercising. And that means not using a calculator.

Anyways, real mathematics doesn't require you to get to the numerical answer. Perhaps in engineering, or physics, or other fields that apply math heavily, but in actual mathematics you don't need to come up with a numerical answer. Leaving pi, or e, or i as a symbol is perfectly OK in math.

Old news (0)

Anonymous Coward | more than 2 years ago | (#39974915)

I didn't RTFA, but this point is roughly similar to the one made by Plato and others as the early Greeks transitioned from an oral tradition to a written one. It's an obvious point, but one we should probably pay attention to: mathematics and writing changed the way we think because they allowed us to offload certain kinds of computations (arithmetic/geometric, storage/recall) to physical systems. Where humans interact most with algorithms is the internet, and the modes of information retrieval and display there will certainly affect which linguistic tools get developed in the next generation.

But like I said, old news.

Re:Old news (1)

Barbara, not Barbie (721478) | more than 2 years ago | (#39975061)

Where humans interact most with algorithms is the internet,

Very doubtful. Standing, walking, interacting with other people, eating, looking outside and figuring out what to wear - these all involve processing information and decision-making. The internet is a much narrower set of interactions, with a much smaller list of possible outcomes.

Re:Old news (1)

elucido (870205) | more than 2 years ago | (#39975231)

Where humans interact most with algorithms is the internet,

Very doubtful. Standing, walking, interacting with other people, eating, looking outside and figuring out what to wear - these all involve processing information and decision-making. The internet is a much narrower set of interactions, with a much smaller list of possible outcomes.

Not true at all. You'll find yourself in way more situations on the internet than anywhere else because situations like that just aren't very probable anywhere else. How do you explain the psychology of a serial killer outside of an MMORPG?

Re:Old news (1)

Barbara, not Barbie (721478) | more than 2 years ago | (#39976179)

You need to step away from the keyboard - that "serial killer" in an MMORGP isn't real, and there are no consequences.

Not sure optimized is the right word (1)

Livius (318358) | more than 2 years ago | (#39974973)

Doing something because it had practical benefit (or is even a necessity) does not mean it's optimal. Certainly neural pathways that are unused may atrophy, and repetition will make us better at any activity mental or physical, but I'm not sure that's really something I would call optimization.

Not sure if I agree (2)

BadPirate (1572721) | more than 2 years ago | (#39975037)

I only skimmed the article though. Back to facebook.

I foresee if /. keep posting stupid news (1)

AchilleTalon (540925) | more than 2 years ago | (#39975057)

we will end up with no brain at all.

WTF!!! Don't you have something about an elixir of immortality or a time-travel machine, I mean, something real?

yep (1)

Un pobre guey (593801) | more than 2 years ago | (#39975107)

While Carr isn't making a case for Lamarckian evolution, the argument here seems weak to me; the same kind of brain change could be attributed to books, or television, or the automobile, couldn't it?

Do you doubt that has occurred? Not to mention video games, urbanization, industrialization, birth control, etc.

I don't buy the argument (3, Interesting)

codeAlDente (1643257) | more than 2 years ago | (#39975143)

Whether humanity's method of information gathering is books and TV, the Internet, or (Heaven forbid) interpersonal interaction, we'll all do it in some combination of long and short intervals. The Internet makes it possible to do both the high-frequency information gathering described here and low-frequency contemplative activities such as gaming sessions, /. articles and reading science papers. It's a lot easier now to learn about a single, narrow topic in depth than it has ever been in the past. Science has become more specialized. Less time spent searching for facts means more time to spend contemplating your favorite scientific issue. Consequently, given a period of time and a problem of a given complexity, scientists can now analyze an issue or solve a problem in greater detail and with better efficiency. Contemplate that, Carr.

Comeon guys (3, Insightful)

girlintraining (1395911) | more than 2 years ago | (#39975165)

A best-selling author is apparently equal in credentials to a PhD in neurology? Really, Slashdot? Where's the evidence? Brain scans? Double blind tests? Who was the control? What's the confidence rating? He's practicing pop psychology -- and he's even less credible than Dr. Phil. No evidence of any kind and he's making extraordinary claims about a field he has no formal training in. If this was a story about someone's evidence disproving evolution, Slashdot readers would be tearing the author limb from limb -- this guy's making claims that belong in the same bucket. Why are you wasting your time with this crackpot theory? You're supposed to be scientists, technology experts, and engineers -- act like it. Demand proof.

Re:Comeon guys (0)

Anonymous Coward | more than 2 years ago | (#39977797)

No evidence of any kind and he's making extraordinary claims about a field he has no formal training in

With all the support movies like Gas Lands and Sicko get here it seems people's brains have already been optimized (or washed) to believe what's placed in front of them.

is this why...? (0)

Anonymous Coward | more than 2 years ago | (#39975187)

So is this why, every time we come to this point technologically and some cataclysm happens and society crumbles, there is no record left for the survivors to remember and pass down?

The brain "optimizes" to anything (1)

metrometro (1092237) | more than 2 years ago | (#39975189)

The brain remakes itself constantly, in weeks, not decades. ANYTHING you do repetitively becomes a source of new pathways. So in our case, our brains are already optimized to Bear Party bootlegs and Pizza Bites.

Assumptions. (1)

elucido (870205) | more than 2 years ago | (#39975197)

Just because the brain develops a new mode of operation it doesn't mean the brain forgets the old mode.
Did you forget how to ride a bike? Forget how to swim? Forget how to play chess or play video games? Forget how to read?

Just because you learn to speed read, it doesn't mean you forgot how to read. And just because you pay close attention and don't use the internet, it doesn't mean you've developed the ability to reason. The ability to reason comes from discrimination of good and bad information. Also, what about the benefits to vocabulary? We will know way more words if we read a greater variety of text.

It's possible. (1)

outsider007 (115534) | more than 2 years ago | (#39975217)

Early exposure could hijack parts of the brain that were meant for other things. It's the same reason why all polyglots are airheads. Just limit your kids to an hour or two a day and they'll be normal.

Re:It's possible. (0)

Anonymous Coward | more than 2 years ago | (#39975267)

Probably not enough to prevent the brain from changing. I can't remember the details off the top of my head right now, but Carr argued in "The Shallows" that only a few hours of Internet exposure is enough to cause the brain to start to re-wire itself thanks to neuroplasticity.

Flow concentration (3, Interesting)

sandytaru (1158959) | more than 2 years ago | (#39975297)

Browsing concentration is sort of the opposite of "flow" concentration. Flow concentration is engaged when you're doing something engrossing - painting, writing, sudoku, coding. As long as we balance out our browsing habit with artistic and creative pursuits, we'll be fine.

Missed the mark (1)

PopeRatzo (965947) | more than 2 years ago | (#39975305)

I'm pretty sure the data shows that if our brains are "optimized" for anything on the Internet, it's the pornography first and foremost.

Not so Lamarkian. (2)

Shavano (2541114) | more than 2 years ago | (#39975383)

No, it would be Lamarckian if he claimed you could inherit this characteristic. It has long been known that the brain's wiring depends on how you use it.

The thing about this claim is that browsing the web a lot is sufficiently unlike how people have used their brains in the past that it will cause a noticeable difference in how we gather, understand and analyze information.

Myself, I don't think it's all that different from what humans have done for a million years. When I was a kid, there was no such thing as the web. We gathered information by consulting various sources: television, books in the library, magazines, talking to other people and observing things for ourselves. Occasionally, we set out to break new ground and find out things that nobody knew, or that we didn't know somebody knew. People still gather information in all those ways, but now they use the web too. The kinds of information a web search dredges up are the same kinds of things we used to draw information from 30 years ago: magazine articles, advertisements, videos (analogous to television programs), blogs (which are journals) and scholarly articles. Although you get the information to your eyes a lot faster, you can't absorb it any faster with the web. And the quality of the information is probably worse, because it's so damn cheap to put up a blog full of bullshit, unchecked facts and misunderstood information. It's left to the searcher to decide what information is relevant, which of conflicting information sources are more accurate or reliable, etc. This is the same problem people always had.

The activity of creating new information -- original research or analysis -- was never easy, and there were never good tools available to most people to help with it. Now, at least there are computers that can assist you in analyzing large volumes of information or carrying out calculations too daunting to do by hand.

So all said, I doubt it will have much effect. People will still need to analyze data, but that has always been an activity for a few who were especially good at it. The rest of us can browse away, just like our apelike ancestors did 4 million years ago. (I bet they said Google, too.)

So let me get this straight (1)

Anonymous Coward | more than 2 years ago | (#39975487)

The guy is an editor for Encyclopedia Britannica, a publication which is likely being endangered by the free flow of information on the internet (wikipedia, we're looking at you!) and he's making arguments against it. What. A. Shock.

Severe conflict of interest here.

Normans invade, language changes (0)

Anonymous Coward | more than 2 years ago | (#39975493)

The Normans invade England, and the language changes. It makes sense to me. The question is, to what extent do language, culture, the means by which we acquire information, etc. change our minds? I wager quite a bit. After Gutenberg, people began to lament the loss of memory. Illiterate elders were no longer remembering the town register. It was all written down. If anything, society is the better for it. However, that might represent an "optimal enhancement" for the human mind. The fear is that if we go too far, the brain will atrophy. IMHO, the question is moot. The experiment is being run. Primitive cultures still allow us to compare the value of tribal memory against the written word. I suspect that small communes, orders like the Amish, and others who pick and choose their technologies will serve a similar laboratory function, provided we don't go all fascist and try to force things on them. Diversity in level of technology, just as in diversity of crops and other things, is a good thing, I think.

Except... (3, Interesting)

kuzb (724081) | more than 2 years ago | (#39975525)

This is like Sony getting up and telling us all that the Xbox is making us stupid.

You have to question the source here. The guy is an editor for Encyclopedia Britannica - something that is likely being hurt by free information publications such as Wikipedia.

Well (1)

tsotha (720379) | more than 2 years ago | (#39975563)

I read the first part of the summary, but the paragraph was really too long. I find these days I'm prone to starting one thing and then

Tried to read ... (1)

PPH (736903) | more than 2 years ago | (#39975599)

... TFA, but got a warning that the page requires JavaScript and Masochism be enabled.

black and white dreams (1)

luther349 (645380) | more than 2 years ago | (#39975629)

When people had only black-and-white TVs, people dreamed in black and white (and some still do), while the newer generation raised on color TVs dreams in color. So yes, our brains did change from TV. So saying we are becoming better at storing a lot more, while less detailed, info is not so far-fetched.

Bad Practice (1)

Waltre (523056) | more than 2 years ago | (#39975685)

This seems redundant to me, since the way in which we find relevant answers from a vast source of information such as the internet needs to (and will) change considerably in the near future so that we no longer scan large volumes of information and search results.

If you consider that in terms of efficiency of getting 'an answer from a question' we currently:

Have Question -> get vast amounts of information from intertubes -> sort through information -> hopefully get answer.

But this is stupid. We can build things (as demonstrated by Siri and Wolfram with natural language processing) that do the processing for us, so we can:

Have Question -> Tell [algorithm] -> get answer.

This is the most efficient way to get an answer from a question; we don't need to be involved in sorting and processing vast amounts of information. I'm sure that in the near future we geeks will make this happen.

That can only mean one thing (2)

eco2geek (582896) | more than 2 years ago | (#39975799)

"Brains Optimized for Browsing" can only mean one thing: zombies optimized for browsing.

(Talk about a case of "soft inheritance" [wikipedia.org] ...)

This certainly happens, the brain adapts, evolves (2)

Thagg (9904) | more than 2 years ago | (#39975953)

When movies were first made, they were single shots. A train approaching a station, something like that. Audiences oohed and ahhed.

But the first time a cut was introduced, the audience was completely flummoxed. They had no idea what they were seeing. It's hard to believe now, but we've probably seen 100,000 cuts by the time we're 5, and our brains are rewired to accept it.

Re:This certainly happens, the brain adapts, evolv (4, Interesting)

Animats (122034) | more than 2 years ago | (#39976753)

But, the first time a cut was introduced, the audience was completely flummoxed.

More than that. The average shot length in movies [cinemetrics.lv] has been decreasing over the years. There are up and down trends; 1971 had longer shots than 1974. But shot lengths today average around 2 seconds. The Bourne Ultimatum has a mean shot length of 800ms. This is the current record. MTV got people used to that rate of cuts.

Another thing that people have learned to tolerate is the demise of editorial geography. The best way to explain editorial geography is this (which I'm quoting from memory): "Bogart gets a phone call. He hangs up the phone. He puts on his coat. He opens his door and walks out. He walks down the front steps. He hails a cab. He gets in the cab and the cab drives away. We see a shot of him inside the cab. The cab stops in front of a building. Bogart gets out. He looks up at the tall building. We're shown the building. He walks into the lobby. He pushes the elevator button. He looks up at the elevator indicator. We're shown the elevator indicator moving down. The elevator doors open. Bogart gets in. We're shown the elevator indicator moving up. On another floor, we see the elevator doors open. Bogart gets out and walks down the hall. He knocks on a door, and Lauren Bacall opens the door. Bogart walks through the door into the apartment." Today, we'd see the phone call, and in the next scene, he'd be in the apartment.

Re:This certainly happens, the brain adapts, evolv (1)

SgtChaireBourne (457691) | more than 2 years ago | (#39977461)

But shot lengths today average around 2 seconds.

I attribute that to a sheer lack of technical skill. There are few if any left in the industry with the skill on both sides of the lens to carry off the long scenes that the old movies had. Part of that was the difficulty in the old lenses and in manual editing. But today's actors, directors and cameramen just can't pull off the basics any more.

I can follow the 2 second shots, but actively dislike it. It's too much like following a bunch of stills and makes me feel like I'm watching a story board roughly migrated to the big screen instead of an actual 'movie'.

Re:This certainly happens, the brain adapts, evolv (1)

bogjobber (880402) | more than 2 years ago | (#39977623)

It's not lack of talent, it's just aesthetics and marketing. Look at guys like Robert Elswit and Roger Deakins. They're still doing work that rivals anything in old movies, but if you put out a film with long takes and a deliberate pace, everyone complains that it's "slow" and it automatically gets put in the arthouse category. These films just don't sell that well anymore.

I'm very intrigued to see what style Elswit brings to the new Bourne movie. I enjoyed the last one on TV, but I sat too close to the screen when I saw it in the theatre and it made me sick.

Re:This certainly happens, the brain adapts, evolv (0)

Anonymous Coward | more than 2 years ago | (#39977497)

I'd argue that the duration of movies is shorter because of longer work hours and more media competing for our attention, not shorter attention spans. When you had nothing to do all day, you went to a 5-hour opera. Today that's impossible, because you work 40-80 hours per week, with all the many things you can do after work. In such an environment, if something is short and sweet you can fit it into your busy schedule more easily.

About the Bogart example: so, basically, we are skipping all the unimportant detail and focusing on the important. That sounds like minimalism, not anything to do with Carr and his opinion.

Ontopic:" THE SKY IS FALLING!!!! IF WE USE BOOKS/MOVIE/INTERNET INSTEAD OF SITTING AROUND TRIBE CAMPFIRE TELLING EACH OTHER WHAT WE DID WE WILL BE STUPIDER!!11!!!"

Browsing is one thing, posting is another (2)

shoor (33382) | more than 2 years ago | (#39976221)

I didn't RTFA. I accept that brains adapt to the activities they do, and I understand that that's not the same thing as evolution, so this is not a claim of 'lamarckian' evolution.

My conjecture is that while the brain of a passive browser/lurker may develop one way, that of someone who also posts to the net might develop in another way. Conversations may be discussions of various issues, and interactions on the net could be likened to that, but with more time to think about what you're about to say. Feedback in that you see later exactly what you did say (and maybe wince when you do sometimes), plus feedback from numerous people, including the classic "tl;dr", may sharpen certain thinking skills. There've been submissions about that here on Slashdot in the past. The question would then be: how many mere lurkers are there out there as opposed to active posters? Plus it's a matter of degree: how much lurking per day, how much posting and thinking about posting per day.

McLuhan revisited (2)

paai (162289) | more than 2 years ago | (#39976647)

Excuse me, but isn't this just a rehash of what McLuhan already stated some fifty years ago?
Paai

So? (1)

Anonymous Coward | more than 2 years ago | (#39977183)

When I was a teenager, my brain became optimized for looking at pictures of girls' tits and for jerking off... anything you do, that represents a new behavior changes the structure of your brain, that's how the fuck you learn new behavior. New pathways are formed, or old ones are altered or destroyed... or the attributes of the pathways might change, i.e., activation thresholds could raise or lower, reuptake of neurotransmitters could be impacted, but in any case, any time a new memory of any kind forms, be it episodic, (watching parents fighting, fucking, etc.) procedural, (learning to type, drive a car, operate a web browser...) or sensory (acquiring a taste for calamari, getting so tired of fruit-punch "flavored" Gatorade that you want to vomit just thinking about it, without any specific event tied to the idea) etc., your brain is changed. Conversely, without change, no learning of any kind occurs. So this is not news. It's about as profound as saying "fire is hot". No shit, Sherlock.

The net is better than TV which it is replacing (0)

Anonymous Coward | more than 2 years ago | (#39977735)

Brain scans have shown that TV makes you stupider and trains your brain to be passive and unquestioning.

Time on the net mostly displaces TV viewing - a fact that advertisers hate.

Of course, we can't stop them from searching for pics of the Jersey Shore cast, how to win the lottery with astrology, or what prayers will get you into heaven, but at least it's better than TV, because on the net they might just accidentally end up on Wikipedia and learn something.
