


Ask Slashdot: What Old Technology Can't You Give Up?

AthanasiusKircher Re:My watch (473 comments)

Yes, that's true too. Most people still think of watches as uniquely about checking the time (despite smartwatches, etc.). Glancing at your watch is probably the clearest signal you can give to others that you're worried about the time (e.g. may need to go, have another meeting, etc.). Checking your phone may just make you look rude or bored, since phones have so many other functions you could be monitoring.

2 hours ago

Ask Slashdot: What Old Technology Can't You Give Up?

AthanasiusKircher Re:My watch (473 comments)

I still wear a wristwatch. I've worn one constantly since I was 10. I'll probably be buried with one.

To me, a wristwatch is an essential tool. I give talks, teach classes, run meetings -- and I find it really annoying to do these things without bringing my analog watch.

Many rooms do not have visible clocks when I'm doing these things. But if I'm trying to run a class or give a talk or run a meeting on a schedule, I need to know what time it is. On the other hand, I don't want to make it look like I'm continuously checking the time, because that tends to make audiences nervous or anxious or feel bored or think you're bored or whatever.

Say I'm teaching a class. If the room doesn't have a visible clock, what are my options?

(1) Consult a classroom computer, if there is one. Well, some classrooms might not have one, but even if they do, usually a screensaver or something will turn off the monitor. So I need to go over and hit the spacebar (or, worse, log in) every time I need to check the time. Yes, I could reconfigure the computer, but I may not have an account on it, it may be shared, etc.

(2) I could use my phone. But again we have the screen off problem. If I leave my phone on the desk, I'll still need to go turn it on to check the time, and it looks like I'm "checking my phone" (for messages, whatever). Not a good message to send to the students when I tell them I don't want to see *them* doing that. If it's in my pocket, I don't need to walk to it, but it's even more noticeable when I pull it from my pocket and turn the screen on briefly. I might be able to set my phone screen to stay on, but that wastes a lot of battery.

(3) I could bring along a tiny desk clock or something, but why do that when I already can just have one available on my wrist (which is probably even smaller and less obtrusive)?

(4) I can take my analog wristwatch off and set it down in a central spot where I'm presenting. With an analog clockface, I can easily tell the time from just about any angle (not true of computer screens or phone screens), even from maybe 10-15 feet away (where I wouldn't necessarily be able to read a digital watch). And it's already on my wrist, so I don't need to remember to bring extra equipment. Even if I keep it on my wrist, it's usually less obtrusive to check the time than walking to some computer or pulling out a phone.

Basically, if you want to know what time it is in a room where there's no visible clock, but you don't want to necessarily signal to everyone else that you're constantly checking the time, a watch is a pretty ideal solution.

10 hours ago

Coffee Naps Better For Alertness Than Coffee Or Naps Alone

AthanasiusKircher Re:Good way to make yourself ill (113 comments)

Pre-industrially, those two blocks would have an hour or two of waking time between them

Indeed -- it was basically forgotten for about a century, but recently historians have been finding references EVERYWHERE to "first sleep" (or "early slumber" or "beauty sleep") and "second sleep" in many cultures around the world.

The first descriptions of "insomnia" come up only in the 19th century, just about the same time that the two sleep blocks really started to disappear.

And we should not forget the role of coffee in this transition. (From the link above:)

[A researcher] attributes the initial shift to improvements in street lighting, domestic lighting and a surge in coffee houses - which were sometimes open all night.

Coffee may not just ruin your sleep sometimes if you drink too much -- it may have played a major role in divorcing our entire species from its most natural sleep patterns and convincing everyone that a solid 8-hour block is most "normal."


Coffee Naps Better For Alertness Than Coffee Or Naps Alone

AthanasiusKircher Re:Drop Caffeine Altogether (113 comments)

About ten years ago, I cut out caffeine altogether.

Yes, I did that too out of necessity about 5 years ago. Not that I was ever actually "addicted" like many people -- I would rarely have coffee more than a few times per week, though I used to brew a LOT of my own tea and iced tea.

But at some point my body seemed to become hypersensitive to it. Now, if I have a cup of coffee after 2pm, it will likely keep me awake until the middle of the night. So I just had to move to decaf tea and coffee.

Now, I'm more alert than I was when I was caffeinated

This is the thing about studies like this. Many of these studies are rather small (and I didn't read the full studies), but I really hope they'd measure the differences between those who are heavily addicted to caffeine vs. "a cup or two per day" vs. "rarely consume caffeine or never."

Especially when you have other studies like this one, which suggests that caffeine addicts normally function at a lower level than non-addicts, and the best they can hope for is a return to "baseline" by drinking more caffeine.

If there were differences in the napping between groups, it would be very relevant for recommendations. The danger of such studies without these kinds of nuances is you get people thinking, "I just need to drink even more coffee! And take naps!" when a more realistic recommendation would perhaps be to stop the addiction, live most of your life at a higher functioning level overall, and when you're really tired and need it, do the "caffeine nap" trick only occasionally.


Canada Tops List of Most Science-Literate Countries

AthanasiusKircher Re:Biased (205 comments)

(By the way, so no one can accuse me of being misleading, Bradley's observation of stellar aberration in 1729 was seen by many as the first proof of the earth's motion. However, my point was that soon after that the church eased its restrictions, and it had eliminated them before all the other problems with heliocentric theories were finally resolved empirically.)


Canada Tops List of Most Science-Literate Countries

AthanasiusKircher Re:Biased (205 comments)

"While 87 per cent knowing that the earth goes around the sun is pretty good, that still leaves 13 per cent of Canadians that haven't absorbed the scientific knowledge of several centuries ago," Ingram said.

It was also a pretty tough question for the Catholic church for quite a long time.

Not as long as you might think. The church removed the general prohibition against books advocating heliocentrism in 1758, and the last precedents for prohibiting specific passages (e.g., of Copernicus) were effectively overturned by 1820.

Meanwhile, the first actual empirical proofs of the earth's motion occurred mostly in the 19th century, with the first measurement of stellar parallax occurring in 1838. (Parallax had been predicted since the 1500s if the earth were in motion, but never observed.) Coriolis forces in projectiles, which had been predicted since the 1600s if the earth were in motion, were first measured in the 1800s. The problem of seemingly fixed apparent diameters of stars was finally resolved in the 1800s as scientists realized that stars must be farther away than had previously been thought (related to the parallax measurements). (Since the 1500s, scientists had predicted that if the earth were in motion, stellar diameters should vary as the earth moved closer and farther away from a given star, but this effect was not observed empirically.)
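As a side note on why parallax took until 1838 to observe: the angle shrinks inversely with distance, and even the nearest stars subtend well under an arcsecond. A quick back-of-the-envelope sketch (the constants and the roughly 3.5-parsec figure for 61 Cygni, the star Bessel used, are my own additions, not from the comment above):

```python
import math

# Annual stellar parallax: the apparent angular shift of a nearby star
# as Earth moves across its orbit. For small angles,
# parallax (radians) ~= baseline / distance, with a 1 AU baseline.

AU_M = 1.495978707e11          # astronomical unit in metres
PC_M = 3.0856775814913673e16   # parsec in metres (defined so 1 AU subtends 1 arcsec)

def parallax_arcsec(distance_pc: float) -> float:
    """Parallax angle in arcseconds for a star at `distance_pc` parsecs."""
    theta_rad = AU_M / (distance_pc * PC_M)
    return math.degrees(theta_rad) * 3600.0

# 61 Cygni sits at roughly 3.5 pc, giving a parallax of about 0.29 arcsec --
# a tiny angle, far below what pre-telescopic (and early telescopic)
# instruments could resolve, which is why the effect went unobserved so long.
print(round(parallax_arcsec(3.5), 3))
```

By construction of the parsec, a star at 1 pc shows exactly 1 arcsecond of parallax, so the function is just 1/d in parsecs; the point of spelling it out is how quickly the angle vanishes with distance.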

I could go on, but you get the point. Before the 1800s, there was simply no empirical evidence that could say the earth was definitely in motion around the sun -- other than that the math got really hard after Newton if you wanted to try to adhere to a geocentric model.

So, the Catholic Church actually dropped the prohibitions a bit before we had real "proof" of the earth's motion.

Don't get me wrong: I thoroughly agree that the church banning books or passages of books was a terrible blow against free speech and free expression. But in Galileo's time, there really just wasn't the kind of actual "proof" we assume there was about the earth's motion -- Galileo did NOT have a better model (his circular orbits still required a bunch of epicycles, and his proof of the earth's motion required there to be a single tide per day at noon). He should NOT have been censored or arrested for asserting his beliefs, but let's be clear that that's exactly what they were at the time: beliefs.

The church can rightly be accused of censoring free speech or being overly conservative in its scientific tenets, but it actually stopped being upset about this question before it was finally resolved empirically.


Canada Tops List of Most Science-Literate Countries

AthanasiusKircher Re:Biased (205 comments)

The science literacy part asked questions like:
Does the sun go around the earth or does the earth go around the sun?
Human beings as we know them today developed from earlier species of animals. True or false?
Electrons are smaller than atoms. True or false?

I will never understand why anyone thinks asking questions like this is some sort of credible test of "science literacy."

Basically, these surveys usually end up testing only two things: (1) how good are you at memorizing and recalling facts your middle-school science teacher told you? and (2) are you more likely to trust your middle-school science teacher over your priest/rabbi/shaman/psychic/New Age crystals dude/whoever else also tells you things about the world?

Why do we think that "science literacy" should correlate best with having good memorization skills and blindly accepting the authority of your science teacher?

If you sat me down in a room to interview people, and you told me to evaluate their "science literacy," I'd probably start by posing a real-world scenario, presenting them with some empirical evidence, and then having them choose among some conclusions from that evidence to see whether they can evaluate and follow rational arguments. For more advanced levels, I might throw some data or some basic stats or some graphs at them too -- nothing requiring any computation -- and see whether they get fooled by bad logic or common reasoning pitfalls.

Frankly, like Sherlock Holmes, I don't give a CRAP whether someone has memorized some answer to whether the earth goes around the sun or the reverse. It's completely irrelevant to most people in their everyday lives, and it's contrary to common empirical evidence (the earth actually does feel "stationary" and the sun does "appear" to move in the sky). What I *would* care about in terms of "science literacy" is whether I could present a person with a set of further empirical observations about the sun and the earth which should lead them to a rational conclusion that the earth goes around the sun -- and if they can correctly follow that argument, then they might be "science literate."

Simply asking them to regurgitate facts based on their adherence and trust in some authority figure is NOT "scientific" in the least. (Note, by the way, that I'm NOT at all arguing that trusting in your preacher or something over a scientist is a good thing -- but blind faith without understanding is still blind faith, whether the memorized "beliefs" you're regurgitating conform to science or to some wacko religion.)

These kinds of tests are about as useful as trying to test "reading literacy" by asking someone to identify letters in different fonts, or testing "mathematical literacy" by having people read numbers aloud. Such activities tell you nothing about whether that person has the least bit of understanding about how to read or do math. They can identify some basic facts, but they can't actually use them or do anything with them.

I'm also NOT saying that we should expect the general public to have enough expertise to be professional scientists. Of course not. But if we actually want to test "science literacy," we should see whether they have a minimum understanding of how to interpret evidence and empirical data, and what sorts of conclusions can be drawn from it. If they can't do that, they really don't understand much about science at all... and we might as well be asking them questions about trivia concerning Shakespeare's plays, or other random science questions.

I mean, is it really that much more useful to memorize a fact about the earth's motion that's pretty much irrelevant to everyday life than it is to, say, memorize the atomic weight of carbon, or the first 100 digits of pi, or the number of species of known insects? (Of course, some scientific facts ARE really relevant, like understanding that vaccines actually work -- and being ignorant of some scientific facts can be dangerous... but science is not unique there. There are lots of useful facts to know to avoid dangers in the world, and knowing most of them would not be considered tests of "science literacy.")

Of course, if we actually started measuring "science literacy" my way, we'd probably find that less than 5% of the population actually has it... and that would probably be disturbing. So, I guess we keep asking questions about memorized dogma, and when the numbers go up, we can congratulate ourselves for stamping out ignorance concerning matters mostly irrelevant to people's lives.


Canada Tops List of Most Science-Literate Countries

AthanasiusKircher Re:Biased (205 comments)

The clincher for me - which indisputably shows the authors' bias - is that Canada ranks #1 in people protesting GMOs and nuclear power, and the authors consider this a good sign that their population is scientifically literate!

The report says nothing of the kind. Did you read it? GMOs and nuclear power are mentioned as divisive issues, but there is no data on the ranking of people against them.

Well, for some reason the CBC's coverage of this seems to think that Canada is 3rd out of 33 countries in having high numbers protesting nuclear power. I haven't read the full report, but either (1) the CBC is wrong, (2) you're wrong, or (3) the CBC is reporting based on true information that isn't in the report you read.

Regardless, it sounds like SOMEBODY did a survey comparing attitudes about at least nuclear power and found Canadians were near the top in terms of objecting and protesting.


Underground Experiment Confirms Fusion Powers the Sun

AthanasiusKircher Re:That's not how science works (140 comments)

Science is intended more to adapt an actual "theory" over time to better suit the evidence that it is presented with until it increasingly encompasses all edge cases that relate to the topic in question. That "adaption" can be considered disproving with an immediate re-creation of an alternate theory moments later to encompass the changing circumstances.

Or, well... it "could be considered" exactly what it is: retaining an existing theory and all of its basic assumptions, while tacking on modifications or qualifiers to make it better fit the data. That's not "disproving" anything. That's improving an existing theory, and that's what the vast majority of everyday science is about. Most active scientists are working within existing paradigms and working out the details of theories by starting with all the assumed knowledge of their fields.

No actual scientist is walking around questioning every scientific "fact" on a daily basis. "Oh, you know what, I really don't believe that whole 'water is made up of hydrogen and oxygen' thing. That whole 'atomic elements build compounds' thing sounds potentially bogus and 'unproven,' so let's have my lab spend the next six months retesting that and finding out what water really is made of!"

No sane scientist thinks like that, and scientific progress would be practically impossible if we went around actually worrying about potentially falsifying everything.

Instead -- we accept much of scientific knowledge as "proven" (not in a formal logical/mathematical sense, but the normal everyday sense of "well-tested"), and we go on with our lives filling out the "edge cases" as you put it. Only if some major discrepancies arise repeatedly do we begin to wonder whether underlying assumptions may be at fault... and even then it would take a heck of a lot to overturn the idea that water is composed of oxygen and hydrogen, for example.

On the flip side, actually "proving" something is exceptionally hard work. It is saying that at no point, ever, under any circumstances in this or any conceivable universe, with any natural or unnatural influence could this situation *EVER* take place for *ANY* reason. These are the rules, these are how things behave, and this is how things will always, and forever behave; EXACTLY like this and there's not a damn thing that anyone including the hand of God himself could do to change that.

That is a completely and utterly BS definition of "proof" that no one EVER uses except in discussions like this. Seriously. It's not what the math or formal logic people mean by "proof," because actual math and formal logic people generally recognize that their claims are not directly relatable to the real world, let alone "any conceivable universe." Math and formal logic are abstract symbolic systems. While they may at times be very good models for talking about the real world, they do NOT have any exact correspondence with the real world.

Find me an exact "triangle," for example, in the real world. Not something that looks vaguely like a triangle -- something that fits the mathematical definition of one, with three exact points, precisely straight line segments, etc. Even if you came close, to create something that on a microscopic level still seemed to be a triangle, on the individual atomic level there would be irregularities -- and even if that were somehow "perfect," we could keep moving down until we got to the "quantum foam" level... a real EXACT triangle in the mathematical sense doesn't exist in the real world.

Does that mean all of Euclid and geometry and all the formal "proofs" are wrong or irrelevant? Of course not. But they are working with certain kinds of abstract assumptions that have no exact correspondence with the real world. They are a MODEL. So, when you try to apply that standard of "proof" to "this or any conceivable universe," you're doing something completely illogical. No one ever actually does that -- and if they claim they are doing that, they are being disingenuous. The logic of abstract models is simply that -- the logic of abstract models. Standards of proof there are simply not applicable to the real world.

Saying that something is "proved" means that there is nothing more that could ever be known about that topic, and that nothing could ever impact that field, be it further advances anywhere else, supernatural influence, extra dimensional characteristics, weird things that we haven't even considered possible

Sorry, but that's just NOT TRUE. In normal, everyday English, according to Merriam-Webster, "to prove" is "to test the truth, validity, or genuineness of" or "to test the worth or quality of." Note the emphasis on the word "to TEST." In normal language (i.e., >99% of the time "prove" is actually used in English), it means to TEST something to determine how good it is. If something passes a test, it has met some sort of standard -- that doesn't necessarily mean it can never be questioned again, or we should stop thinking about the subject forever, or that it's true in "any conceivable universe." Something that is "proved" has passed some standard test for quality, or, as another definition in Merriam-Webster puts it: "to establish the existence, truth, or validity of" or "to demonstrate as having a particular quality or worth." The normal way of doing these things in the real world (as opposed to some sort of abstract formal logic exercise) is through empirical testing.

Gravity is one such theory. We know that it exists, we know how it works, we know how to calculate it, we know how to utilize its traits for all kinds of things. But "proving" that water goes downhill ... It's something that we take for granted and require to base civilization as a whole on, through irrigation and plumbing. Something doesn't need to be "proved" to be immeasurably useful in the daily lives of incalculable people over countless generations.

Once again -- this is NOT what the word means in real life. Say Person A claims, "I claim water will flow uphill and gravity doesn't exist." Person B says: "Prove it." Person A: "Uh... I can't. But can you PROVE that it flows downhill?" Person B: "Sure, watch. [Pours water on a slope -- it flows downhill.] See!" Person A: "Yeah, I guess you're right."

That's what "proof" actually means in English when referring to the real world, including empirical science. You subject a claim to a test, and you take the results of the test as some sort of "proof." Depending on the quality of the test, you may draw various conclusions about the quality of "proof" obtained -- in cases where the truth is really important, you may insist on more tests or more rigorous ones, or, as we say in English, "a higher standard of proof." See -- proof has degrees depending on the kind of tests you use and the standards that are upheld by those tests. It's why in a courtroom we can have a standard that says "proof beyond a reasonable doubt." If "proof" only meant what you think it means, then such a definition would be redundant and misleading -- if "proof" means something is true in "any conceivable universe" for all time, then what would it possibly mean to be "proving beyond a reasonable doubt"? What would that qualifier add? Well, obviously it sets the standard for the test -- a "reasonable person" [legal term] would have no doubt of the guilt in that situation.

You may think that this is getting pedantic, and it is, but at the same point, it is the difference in Science between "Proving" a theory and not.

There's a difference between being "pedantic" and being WRONG. Let's subject your hypothesis about the use of the word "proof" to some empirical testing, shall we?

You claim that actual expert empirical scientists don't use the word "proof" in this loose way -- that they always mean this very abstract version of formal logic "proof" where something would be true in "any conceivable universe."

Well, let's look at the abstract of TFA, linked to in TFS. What do we see?

Although solar neutrinos from secondary processes have been observed, proving the nuclear origin of the Sun's energy and contributing to the discovery of neutrino oscillations, those from proton-proton fusion have hitherto eluded direct detection. Here we report spectral observations of pp neutrinos, demonstrating that about 99 per cent of the power of the Sun, 3.84 × 10^33 ergs per second, is generated by the proton-proton fusion process.

Not only does this scientific abstract include the word "proving" regarding previous research, but they characterize their own research in TFA as "demonstrating" that something "IS generated" -- that's the language of formal logic or math proofs (e.g., "quod erat demonstrandum"). They didn't say they were "theorizing" what something "might be" or that something merely "provided some evidence" of what "could be." They said that previous scientists had evidence "proving" something, and that now they had "demonstrated" what something truly "is."

So, let's see, I could either believe some random guy's definition of "proof" on Slashdot that never seems to apply to the real world. OR, I could look at the actual empirical usage of the term "proof" by actual leading scientists, which has been published in one of the top journals of the entire scientific field, with language that has been peer-reviewed and checked by editors of the highest scientific caliber.

Well, my empirical evidence suggests your definition of "proof" is wrong. QED. :)

2 days ago

Underground Experiment Confirms Fusion Powers the Sun

AthanasiusKircher Re:That's not how science works (140 comments)

Or we could just realize that "proof" in empirical science means something different than it does in pure mathematics.

THIS. By GP's standard, >99% of the uses of the word "proof" in the English language are invalid. Almost all uses of the word "proof" in thousands of legal statutes around the world are bogus and meaningless.

And, empirically, from looking at actual scientific methods as practiced, it's clear that scientists do NOT treat all scientific theories as equally "falsifiable." Some are treated as "proven," if not in a strict mathematical-philosophical sense. It would take a LOT more to overturn a basic established law of physics than some off-the-cuff guess ("hypothesis") in a new experiment. So what exactly are we doing when we verify and reverify and reverify a basic, well-established tenet of science over centuries, if not, in essence, providing "proof" of it (in any reasonable sense of the English word outside the strange world of pure math and logic puzzles)?

2 days ago

U.S. Senator: All Cops Should Wear Cameras

AthanasiusKircher Re:Can we stop using the word 'TAPE' (601 comments)

It's 2014 and nobody uses tape to record

Meh. This is one of those language battles that is probably lost. We still refer to "films" too even when most are no longer watched on actual film stock (and many are increasingly shot and produced without film either).

There are lots of words like this. When's the last time you "dialed" a phone with an actual dial? For me, I'd say almost 20 years ago. When's the last time you "hung up" a phone, i.e., literally hung it up on the wall (not "put it down" on a tabletop phone), with the force of gravity pulling the little cradle down to cut the line? I'd say maybe 25 years or more for me, and that was only on my grandmother's old dial wall phone that was probably at least 20 years old.

So, I think we may be stuck with "tape" as a synonym for "record," probably for decades to come.

2 days ago

Comcast Tells Government That Its Data Caps Aren't Actually "Data Caps"

AthanasiusKircher Re: Sigh (333 comments)

I know you're an AC, but it's really shocking to see something this ignorant. NOT A SINGLE REPUBLICAN voted for the ACA, not in the House, not in the Senate. I'm willing to blame Republicans for a lot of things, but this bill was the way it was solely to get support from Dems, and that's ALL it got. How a bill designed by Dems and voted for only by Dems (including the guy who signed it into law) somehow has problems that are solely the fault of Republicans is very confusing... even by AC logic.

2 days ago

The Evolution of Diet

AthanasiusKircher Re:"Paleolithic diets" now vs then (281 comments)

You might be right but you're missing the point.

It depends on what "the point" is. The "point" of this thread was the OP asking whether modern "paleo" diets are anything like actual ancient "paleo" diets. And the answer is "no." That's the "point" of this thread.

I was in no way passing judgment on whether some aspect of modern "paleo" approaches may be good or bad nutritionally -- only pointing out that they have very little in common with the foods eaten by people a hundred thousand years ago or whatever.

Yes a modern orange is excessively sweet (no seriously, they are far too bloody sweet, it tastes like sugar, not orange anymore) - apples too - however if you eat a diet of 95% unprocessed goods, green leafy vegetables (modern or not) - a sensible amount of fruit and some meat, it's still vastly vastly better for you than eating high processed garbage, including even "healthy" things like packaged breakfast cereals.

At no point did I imply that all aspects of the modern "paleo" approaches were necessarily bad nutritional advice. Obviously eating less processed food is a reasonable choice, but one doesn't need to go back 100,000 years to find it. If you ate a diet of only foods available a couple HUNDRED years ago, you'd also be eating mostly "unprocessed" foods by today's standards -- and those foods would be a lot more like today's supermarket "unprocessed" foods than anything resembling a diet of tens or hundreds of thousands of years ago.

It's the dogmatism about it that I see as the problem, not the general principles. There's no necessarily logical reason to exclude all foods that date from later than the dawn of agriculture if you're admitting a bunch of foods that have been selectively bred in agriculture for millennia. The issue should be about balancing the nutrients among different food sources, regardless of when those foods date from.

And frankly, if you watch the video I linked, at the end of her presentation, the speaker comes to similar conclusions -- conclusions that sound suspiciously similar to things that "paleo diet" proponents often say. The difference is whether you want to buy into a diet because of arbitrary distinctions created for marketing reasons, or because it actually is more balanced. Most of the paleo dogma about what always to eat and what never to eat is unfortunately about the former.

3 days ago

The Evolution of Diet

AthanasiusKircher Re:"Paleolithic diets" now vs then (281 comments)

Maybe I just don't understand what paleo is all about, but trying to achieve a balance of macronutrients closer to those original diets seems like the point (or it should IMO) and not actually trying to eat foods that are 100% like what our ancestors ate.

I think you DO understand what the modern "Paleo" approaches are about. However, there's a common misconception that if you eat the modern "paleo" diet that you're actually eating something like humans would long ago. That's partly from the branding and marketing of the diet, more than anything else. From what I understand, those who actually promote it and have researched its effects tend to phrase it more like what you described than as an actual simulation of an ancient diet.

I was merely responding to a thread where someone posed the question about this misconception.

On the other hand, I think that the modern food differences ARE so vast that it's really not reasonable to achieve an accurate "balance of macronutrients" (as you put it) like ancient diets while eating modern foods. There's also a lot of dogmatism among many of the diet's proponents that takes the form of "Did people eat X before agriculture? If not, then we shouldn't eat X." My argument is that if we start going down that road and looking for exact equivalence, we immediately have to throw out almost all foods (even "raw, natural, whole" ones) from the modern supermarket.

So, rather than worrying about the dogmatism of what ancient people may have eaten, the more reasonable approach is actually to achieve a better nutrient balance -- in whatever way is best and using whatever foods will work best to achieve that goal, regardless of whether they're truly an ancient "wild" food or are some vastly different descendant of an ancient food or are a more modern food that also can serve to create dietary balance. The whole "paleo" thing, therefore, can end up standing in the way, because it's based on a misconception.

3 days ago

The Evolution of Diet

AthanasiusKircher Re:"Paleolithic diets" now vs then (281 comments)

(Just to be clear, "paleo diets" may have some benefits for some people. I'm NOT saying the "paleo diet" ideas are necessarily bad. I'm just saying that in most cases they're NOT actually very much like true hunter-gatherer diets before the dawn of agriculture.)

4 days ago

The Evolution of Diet

AthanasiusKircher Re:"Paleolithic diets" now vs then (281 comments)

The article mentions "unrefined grains, nuts, fruits, and vegetables" so your "for example" has holes in it.

What does that have to do with anything? The context of that quote is:

The foods we choose to eat in the coming decades will have dramatic ramifications for the planet. Simply put, a diet that revolves around meat and dairy, a way of eating that’s on the rise throughout the developing world, will take a greater toll on the world’s resources than one that revolves around unrefined grains, nuts, fruits, and vegetables.

The article here does NOT imply that paleo diets revolved around MODERN "unrefined grains, nuts, fruits, and vegetables." It instead merely hints that the environmental consequences of trying to raise more meat for billions of people requires a lot more resources than those MODERN foods.

The fact is that agriculture has selectively bred many of these things over the millennia to make them tastier, more nutrient dense, higher in sugar, etc. The kind of "unrefined grains, nuts, fruits, and vegetables" that were actually around hundreds of thousands of years ago were vastly different (in most cases) from what we pick off plants in our gardens and fields today -- even the "unrefined" ones.

So, GP's absolutely correct on this point. Human selective breeding has significantly changed both plant and animal sources of nutrients. Thus, no matter how "unrefined" our food is, very few things at a modern supermarket would have been available to a hunter-gatherer hundreds of thousands of years ago... hence, the "paleo" diet is mostly wishful thinking.

4 days ago

The Evolution of Diet

AthanasiusKircher Re:"Paleolithic diets" now vs then (281 comments)

I doubt so-called "Paleolithic diets" are anything like people ate during that.

Yes. The classic debunking, from someone who is actually an expert on early human diets, is here.

Now, before all you Paleo fanatics get worked up, yes -- this speaker overemphasizes the carnivore aspect of many so-called "Paleo" diets. And there are some other details she gets wrong, but mostly in stereotyping modern "paleo diets," not in her knowledge of actual ancient diets.

For example, people ate fruit then, but it was seasonal, and very different from the fruit we eat today. Same with veggies. The stuff we eat is nothing like the stuff that grew in the wild.

Yes, and this is the critical thing from that video. Even if you dismiss all the stuff she says about overemphasizing meat, the reality is that our plant-based foods are completely different from the plants that would have been eaten before the dawn of agriculture. We've selectively bred fruits and vegetables for millennia to make them tastier to us, and more concentrated in sugars and other nutrients. (And we've likewise selectively bred our meat sources so that they are very different in composition from wild game.)

So, yeah, it's basically IMPOSSIBLE to eat "like a caveman did" with normal foods from the supermarket. The "paleo" diet might be a few steps closer to some sort of early hominid diet, but it's still significantly closer to the modern diet than it is to anything eaten hundreds of thousands of years ago.

You can buy all the "unrefined" and "natural" and "raw" crap you want, but unless you're seeking out the wild forms of ancient plants (and probably eating many times the amount of fiber even vegetarians eat today) and hunting wild game, chances are your "paleo diet" is as far from the "caveman" as the diet of a rich nobleman 200 years ago would be.

4 days ago

New Nail Polish Alerts Wearers To Date Rape Drugs

AthanasiusKircher Re:The world we live in. (583 comments)

"I'm not "blaming the victim""

I hate to break it to you, but that is exactly what you are doing.

I hate to break it to you, but I think you might need to work on your reading comprehension skills.

You are also claiming that only woman without much brains or ability to think for themselves and plan ahead like to have a good time in public.

Actually, GP didn't say that AT ALL. At no point did he make assertions about intelligence ("without much brains") or "ability to think for themselves." That's entirely something you manufactured -- you may want to look into the mirror if you're worried about people making assumptions about others and stereotyping them.

What GP was specifically talking about was your third category -- people who don't "plan ahead." I know plenty of very intelligent people who make incredibly poor choices in social situations. I know plenty of very independent folks who can "think for themselves" who also make poor choices in planning. In fact, while I'd say that people who are a little above average in intelligence are better than average at these things, those who are very intelligent often get worse again.

GP said absolutely nothing about intelligence or independence -- he simply stated that some people don't plan ahead or think about all the "bad situations" they could get into in social situations, and that's simply a fact. Those people exist. Those are the people who really need this stuff. But I don't think it's at all a controversial claim to say that some people don't think about possible consequences in social situations, and they are more likely to fall victim to some bad scenario than others. And it's certainly not "blaming the victim" to provide advice that would aid in preventing date rape, as GP did.

And also, GP never said or implied that "only women without [X] like to have a good time in public." He said that women who plan ahead when they are having a good time are less likely to end up in scenario where they can be taken advantage of than women who DON'T plan ahead when they are having a good time.

Let's review the actual advice GP gave:

The best defense is, as always, for women to watch out for their friends when at bars and parties. Don't go wandering off alone after heavy drinking with a guy you don't know or trust.

Do you actually disagree with this advice? Or do you believe it's impossible "to have a good time in public" if you bother to make sure you're with some friends and not go wandering off alone (i.e., not "in public") with a guy you don't know?

Look -- random hook-ups are a risky business. Aside from STDs, you could be locking yourself in a room with an ax murderer, or a rapist, or... who knows? If I were a young woman, I would definitely follow GP's advice and go drinking with friends and never agree to go somewhere alone with a guy I just met.

But that's me. I'm cautious by nature. I also don't bet on horses or play in traffic, which is effectively what you're doing when you put your trust in being intimate with a person who is stronger and bigger than you without knowing a lot about him/her first. Sure, in an ideal world no one should have to worry about such things, but given that everyone knows it's not an ideal world, random hook-ups are inherently more risky than many other activities.

I also occasionally like to have a "good time in public," but I don't know why that means that a "good time in public" needs to include having a private session afterward with some unknown person.

But there are other people who are less cautious. I don't think it's a stretch to say (as GP did) that those people are also less likely to seek out all sorts of preventative items to warn them of date rape, if they haven't already taken other measures to do so. That's not "blaming the victim" -- that's lamenting the fact that there are terrible people in the world and realistically noting that some people take fewer precautions than others when confronting those terrible people.

4 days ago

ACM Blames the PC For Driving Women Away From Computer Science

AthanasiusKircher Re:Men in education and healthcare? (329 comments)

Where is the push to get men to become primary school teachers?

Unfortunately, our mass media's ridiculous "pedophile" scares have taken care of that. Do a cursory internet search sometime for male teachers in elementary or daycare -- you'll inevitably find a bunch of articles about how parents are convinced that any man who might want to spend some time with small children MUST be a pedophile. Never mind that pedophilia is incredibly rare, and your son or daughter is probably a hundred times more likely to be sexually abused as a teenager by a high school teacher or coach than by a pedophile.

So, even if men wanted to get into this profession, we have huge hurdles -- and I agree it's really not right. (As a father, I've even occasionally seen the suspicious looks and odd concern when I would take my young child to the playground or even just for a walk around the neighborhood.)

All of that said, most primary school teachers I know would be happy to have more male colleagues. Most of them know the benefits of having male teachers around small kids -- unfortunately, for us to start a campaign for male teachers, we'd have to overcome the inaccurate media fear campaign about pedophiles... and "Think of the children!" always overrides logic or reason.

Same for healthcare. With the exception of doctors most healthcare is dominated by women yet men are a large number of patients.

I posted on this above with links, so I'll just briefly say that there are in fact organizations trying to get more men into nursing -- and given the growing nursing shortage, just about any place would be thrilled if the numbers of male nurses went up.

5 days ago

ACM Blames the PC For Driving Women Away From Computer Science

AthanasiusKircher Re:What about nursing?? (329 comments)

How come there aren't any people complaining that there are VASTLY more women in nursing than men.

There are. For example, have a look at organizations devoted to recruiting more men, like the American Assembly for Men in Nursing or the "Are you man enough to be a nurse?" campaign. Also see various studies and concerns about the issue on the Minority Nurse page. It's really a complicated issue, and organizations like this have really been trying to figure out recruitment efforts.

Maybe there should be more "people complaining" about this issue, but your assertion that "there aren't any" is just untrue. The fact is that we have a shortage of qualified nurses that is only projected to get worse, and many of these organizations, many hospitals, etc. would be extremely happy if they could get more male nurses, or get more men who are currently unemployed or in crappy jobs in this economy to go to nursing school. But it doesn't help the stereotype when just about every portrayal of a male nurse on television or film is usually made to be the butt of jokes and ridicule.

5 days ago



Thousands of Workers Strike to Reinstate Fired Grocery CEO

AthanasiusKircher AthanasiusKircher writes  |  about a month ago

AthanasiusKircher (1333179) writes "Have you heard of Market Basket, a regional grocery chain which brings in $4 billion per year? If you're not from New England, you may not know about this quirky century-old family business, which didn't even have a website until two days ago. But that's only the beginning of its strange saga. In a story that labor experts are calling 'unique' and 'unprecedented', shelves in grocery stores across New England have been left empty while thousands of Market Basket workers have rallied for days to reinstate former CEO Arthur T. Demoulas, who was fired last month (along with a number of his management allies) as part of a long-standing family squabble. At a protest this morning, 6,000 protesters gathered at the Tewksbury, Massachusetts location where the supermarket chain is based, similar to rallies that have been staged at various locations over the past week. Unlike most labor protests, the workers have no demands for better working conditions or better pay--they simply want their old boss back. Reaction from consumers has been swift and decisive as well: a petition was submitted to the board this morning with over 100,000 signatures from customers calling for the reinstatement of the CEO, and over 100 local lawmakers have expressed support for the workers' cause, including the governor of New Hampshire and candidates for U.S. Senate and gubernatorial races in the region.

In an age where workers are often pitted against management, what could explain this incredible support for a CEO and member of the 0.1%? Columnist Adrian Walker from the Boston Globe described his interview last year with 'Artie T.': 'We toured the Chelsea store together... the connection between the magnate and his employees was frankly shocking. Demoulas knew almost everyone’s name. He knew the name of the guy cutting meat whose wife had just completed chemotherapy and asked about her with obvious concern. Customers came up to him and hugged him, cheered him on. The interactions were too numerous and spontaneous to be staged.' Workers at Market Basket are loyal to their employer and often stay for 20, 30, or more than 40 years. Even lowly store clerks receive significant quarterly bonuses, and experienced loyal workers are rewarded and promoted. Despite running a $4 billion per year business, 'Artie T.' over the years has shown up at countless family events for employees, even visiting sick family members of employees when they are in the hospital. But his generosity hurt the bottom line, according to other board members, who have sought for years to increase profits by raising prices and reducing employee benefits to be in line with norms at other grocery chains. (Market Basket has commonly led grocery store lists for value in regional price surveys.) As one possible resolution to the crisis, the former CEO yesterday offered to buy the entire grocery chain from other board members; this morning, the board stated they were considering the offer."

Judge Throws Out Thoughtcrime Conviction and Frees "Cannibal Cop"

AthanasiusKircher AthanasiusKircher writes  |  about 2 months ago

AthanasiusKircher (1333179) writes "The story is classic: Boy meets Girl. Boy likes Girl. Boy goes on the internet and writes about his fantasies that involve killing and eating Girl. Boy goes to jail. In this case, the man in question, NYC police officer Gilberto Valle, didn't act on his fantasies — he just shared them in a like-minded internet forum. Yesterday, Valle was released from jail after a judge overturned his conviction on appeal. U.S. District Judge Paul Gardephe wrote that Valle was "guilty of nothing more than very unconventional thoughts... We don't put people in jail for their thoughts. We are not the thought police and the court system is not the deputy of the thought police." The judge concluded that there was insufficient evidence, since "this is a conspiracy that existed solely in cyberspace" and "no reasonable juror could have found that Valle actually intended to kidnap a woman... the point of the chats was mutual fantasizing about committing acts of sexual violence on certain women." (A New York magazine article covered the details of the case and the implications of the original conviction earlier this year.)"

An MIT Dean's Defense of the Humanities

AthanasiusKircher AthanasiusKircher writes  |  about 4 months ago

AthanasiusKircher (1333179) writes "Deborah Fitzgerald, a historian of science and dean of MIT's School of the Humanities, Arts, and Social Sciences, speaks out in a Boston Globe column about the importance of the humanities, even as STEM fields increasingly dominate public discussion surrounding higher education. '[T]he world’s problems are never tidily confined to the laboratory or spreadsheet. From climate change to poverty to disease, the challenges of our age are unwaveringly human in nature and scale, and engineering and science issues are always embedded in broader human realities, from deeply felt cultural traditions to building codes to political tensions. So our students also need an in-depth understanding of human complexities — the political, cultural, and economic realities that shape our existence — as well as fluency in the powerful forms of thinking and creativity cultivated by the humanities, arts, and social sciences.' Fitzgerald goes on to quote a variety of STEM MIT graduates who have described the essential role the humanities played in their education, and she concludes with a striking juxtaposition of important skills perhaps reminiscent of Robert Heinlein's famous description of an ideal human being: 'Whatever our calling, whether we are scientists, engineers, poets, public servants, or parents, we all live in a complex, and ever-changing world, and all of us deserve what’s in this toolbox: critical thinking skills; knowledge of the past and other cultures; an ability to work with and interpret numbers and statistics; access to the insights of great writers and artists; a willingness to experiment, to open up to change; and the ability to navigate ambiguity.' What other essential knowledge or skills should we add to this imaginary 'toolbox'?"

Wu-Tang Clan to Release Only One Copy of New Album

AthanasiusKircher AthanasiusKircher writes  |  about 5 months ago

AthanasiusKircher (1333179) writes "Wu-Tang Clan's double-album The Wu — Once Upon a Time in Shaolin was recorded in secret, and they recently announced that only one copy will be sold. Wu-Tang member Robert 'RZA' Diggs described the concept: 'We're about to sell an album like nobody else sold it before... We're about to put out a piece of art like nobody else has done in the history of [modern] music. We're making a single-sale collector's item. This is like somebody having the scepter of an Egyptian king.' Before the album is sold, probably for millions of dollars, it will tour the world as part of special listening exhibits. Patrons will be subjected to heavy security to ensure that no recording devices are allowed, as a single leak would spoil the artistic project. As RZA noted: 'The idea that music is art has been something we advocated for years. And yet it doesn’t receive the same treatment as art in the sense of the value of what it is, especially nowadays when it’s been devalued and diminished to almost the point that it has to be given away for free.'"

A Corporate War Against a Scientist, and How He Fought Back

AthanasiusKircher AthanasiusKircher writes  |  about 7 months ago

AthanasiusKircher (1333179) writes "Environmental and health concerns about atrazine — one of the most commonly used herbicides in the U.S. — have been voiced for years, leading to an EU ban and multiple investigations by the EPA. Tyrone Hayes, a Berkeley professor who has spearheaded research on the topic, began to display signs of apparent paranoia over a decade ago. He noticed strangers following him to conferences around the world, taking notes and asking questions aimed to make him look foolish. He worried that someone was reading his email, and attacks against his reputation seemed to be everywhere; search engines even displayed ad hits like "Tyrone Hayes Not Credible" when his name was searched for. But he wasn't paranoid: documents released after a lawsuit from Midwestern towns against Syngenta, the manufacturer of atrazine, showed a coordinated smear campaign. Syngenta's public relations team had a list of ways to defend its product, topped by "discredit Hayes." Its internal list of methods: "have his work audited by 3rd party," "ask journals to retract," "set trap to entice him to sue," "investigate funding," "investigate wife," etc. A recent New Yorker article chronicles this war against Hayes, but also his decision to go on the offensive and strike back. He took on the role of activist against atrazine, giving over 50 public talks on the subject each year, and even taunting Syngenta with profanity-laced emails, often delivered in a rapping "gangsta" style. The story brings up important questions for science and its public persona: How do scientists fight a PR war against corporations with unlimited pockets? How far should they go?"

Federal Government Surveillance of Santa Superior to Private Companies

AthanasiusKircher AthanasiusKircher writes  |  about 8 months ago

AthanasiusKircher (1333179) writes "This year, competing tracking services for Santa led to confusion for children worldwide. At one point on Christmas Eve, Santa was reported to be over Romania by NORAD's Santa tracker, while Google claimed he was in Madagascar at the same time. Moreover, the estimates for total toys delivered varied wildly, with one tracker claiming only 770 million delivered at the same time the other estimated 2.8 billion. Veteran Santa analyst Danny Sullivan explained the discrepancies: "the precision offered by NORAD’s satellites likely is superior, offering it the ability to lock onto the position of the sleigh within a matter of inches. 'They’ve been doing it for almost 60 years,' Sullivan said.... He said Google likely relies on alternative technology, such as tracking Santa’s in-sleigh WiFi signal, causing a possible lag in showing his exact location. Sullivan also guessed that Google was using an algorithm to estimate the number of gifts delivered, while NORAD might have the ability to identify individual gifts, and perhaps even smaller items such as stocking stuffers.""

"Brain Activity" Found in a Dead Salmon Demonstrat

AthanasiusKircher AthanasiusKircher writes  |  more than 4 years ago

AthanasiusKircher writes "Neuroscientist Craig Bennett used a dead salmon as a test object in his Dartmouth lab while evaluating new lab methods. The lab even followed proper experimental protocols, including showing the salmon photos of humans displaying various emotions. They were somewhat surprised by the results:

When they got around to analyzing the voxel (think: 3-D or 'volumetric' pixel) data, the voxels representing the area where the salmon's tiny brain sat showed evidence of activity. In the fMRI scan, it looked like the dead salmon was actually thinking about the pictures it had been shown.

Of course, the salmon wasn't actually responding to pictures illustrating human emotions. But the data manipulation commonly used in brain studies caused apparently significant patterns to appear by chance. More from the Wired article: 'The result is completely nuts — but that's actually exactly the point. Bennett, who is now a post-doc at the University of California, Santa Barbara, and his adviser, George Wolford, wrote up the work as a warning about the dangers of false positives in fMRI data. They wanted to call attention to ways the field could improve its statistical methods.'

The study demonstrates the potential for misinterpretation and misuse of data in brain studies, particularly as data manipulation becomes more and more complex. Bennett notes: 'We could set our threshold [of significance] so high that we have no false positives, but we have no legitimate results.... We could also set it so low that we end up getting voxels in the fish's brain. It's the fine line that we walk.'
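The trade-off Bennett describes is the classic multiple-comparisons problem: test enough noise-only voxels at an uncorrected threshold and some will look "active" by chance. Here is a minimal numerical sketch of that effect (all numbers are hypothetical, not the study's actual parameters or analysis):

```python
# Sketch of the multiple-comparisons problem: thousands of pure-noise
# "voxels" tested at an uncorrected p < 0.05 threshold will yield
# roughly 5% false positives even though no real signal exists.
import random
import statistics

random.seed(42)

N_VOXELS = 10_000   # hypothetical voxel count in a scan
N_SCANS = 20        # hypothetical measurements per voxel
T_CRIT = 2.093      # |t| threshold for two-tailed p < 0.05 at 19 df

false_positives = 0
for _ in range(N_VOXELS):
    # Pure Gaussian noise: there is no signal in any voxel.
    samples = [random.gauss(0, 1) for _ in range(N_SCANS)]
    mean = statistics.mean(samples)
    se = statistics.stdev(samples) / N_SCANS ** 0.5
    # One-sample t-test against zero, uncorrected for multiple tests.
    if abs(mean / se) > T_CRIT:
        false_positives += 1

print(f"{false_positives} of {N_VOXELS} noise voxels look 'active'")
```

Raising the threshold (e.g. Bonferroni correction, dividing alpha by the number of tests) eliminates these chance hits, but, as Bennett notes, at the cost of missing weaker legitimate effects.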

So far the paper has been rejected for publication a number of times, but a poster used in a conference presentation is available. Recently it has been making the rounds informally in the neuroscience community."


