
Comments


Nuclear Missile Command Drops Grades From Tests To Discourage Cheating

dbrueck Re:What partisan wrote this? (122 comments)

Maybe, but I read it slightly differently. 'Disproportionate' means 'too large or too small in comparison with something else'.

So I took it to mean that someone who got two percentage points higher on their test ended up being promoted at a much higher rate than would generally be expected for that small of a difference in scores.

As a made-up example: suppose you scored two percentage points higher on your final than I did and, all else being equal, that single test caused you to be promoted over the course of your career at double the rate I was. One could realistically call that promotion rate disproportionate, because in many cases a two percentage point difference would not be statistically significant, while a doubling would be. The two are so different that it doesn't seem unreasonable to call them disproportionate. To be clear, those aren't the numbers from the article; the article was just suggesting that there was that type of mismatch - a small test score difference leading to a large, long-term difference.

And that long term effect is key because it magnifies the issue: a very small difference in a score on one test you take early in your career has large ramifications for potentially decades? That creates a large incentive to cheat. And that's just it - I'm not arguing what's fair or unfair or what's right or wrong in this scenario, just saying that it sounds like there were some pretty big incentives to cheat.
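With made-up numbers echoing the example above (the scores and rates below are hypothetical, not from the article), the mismatch is easy to see:

```python
# Hypothetical numbers, chosen only to illustrate the "disproportionate" claim.
score_you, score_me = 92, 90            # final test scores (percentage points)
promos_you, promos_me = 30, 15          # careers ending in promotion, per 100 officers

score_gap = (score_you - score_me) / score_me    # roughly a 2% higher score
rate_gap = (promos_you - promos_me) / promos_me  # a 100% higher promotion rate

print(f"score difference: {score_gap:.1%}, promotion rate difference: {rate_gap:.0%}")
```

A ~2% input difference driving a 100% outcome difference is the kind of mismatch the comment describes.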

about three weeks ago

Verizon's Accidental Mea Culpa

dbrueck Blog post gone? (390 comments)

L3's blog still has a summary blurb, but the link to the actual post gets a 404 - did they take it down or did they just link it wrong? Anybody have a cached copy?

about a month ago

Slashdot Asks: Do You Want a Smart Watch?

dbrueck Device convergence (381 comments)

No thanks. The watch is just another device on the long list of separate things that got consolidated into my phone (mp3 player, camera, calendar, ebook reader, flashlight, GPS, alarm clock, etc.). As with all those other things, the version on my phone is so far into the "good enough" range that having a separate device for the same functionality just doesn't offer much appeal.

Too many of the smart watches seem to try to move functionality back off the phone, which seems pretty pointless (unless and until it could completely replace everything on my phone, in which case I might be interested. You know, some sort of holographic magic screen that replaces the need for a large physical screen, or maybe interfaces with some futuristic contact lenses that project a HUD that only I can see).

Anyway, that seems to be the core problem - these watches just don't do anything worthwhile compared to what I'll already be carrying with me. I don't want a watch as a status symbol, I don't need a watch to just tell time, and I don't need/want a watch to do a bunch of stuff my phone already does.

An exception would be for highly niche purposes. I have a kid with type 1 diabetes. If he could have a watch that could monitor his blood sugar levels and dispense insulin, I'd buy it.

about a month ago

Ask Slashdot: Best Rapid Development Language To Learn Today?

dbrueck Re:Python (466 comments)

> Native code, for example, refers to code in its binary (processor-specific) form. No processor that I'm aware of knows how to run C code natively - it has to first be taken from its portable format and translated into native (assembly and then machine) code. It's not a matter of when or how that translation happens; processors simply don't speak C.

> You still don't get it.

Actually I /do/ get it. I'm really quite familiar with how Python works, as well as how C, assembly, and machine language work. I'm not debating one is the other, nor that VM vs not is better than the other. What I *am* saying, however, is that the thing you're talking about is not the same thing as a processor supporting such and such a language natively. "Natively" means something specific, and what you're talking about isn't it.

I'm not saying that Python does or doesn't require a runtime, a virtual machine, etc. (even though there are some versions that don't). What I /am/ saying is that I don't know of any processor that runs C natively. Make up a different term for the concept you're talking about, because "natively" already has a meaning in this context, and how C works is definitely not native.
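The same point can be seen from the Python side: CPython compiles source into bytecode for its virtual machine, just as a C toolchain translates C into machine code for a CPU - in neither case does the processor consume the source language directly. A quick sketch using the standard dis module:

```python
import dis

def add(a, b):
    return a + b

# CPython has already compiled this function to bytecode for its VM;
# dis exposes that intermediate form (exact opcode names vary by version).
ops = [instr.opname for instr in dis.get_instructions(add)]
print(ops)
```

The output is VM opcodes, not x86 or ARM instructions - the translation to something the processor actually runs happens later (or, in CPython's case, is done by the interpreter loop).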

about 2 months ago

Ask Slashdot: Best Rapid Development Language To Learn Today?

dbrueck Re:Python (466 comments)

Claiming that hardware can run C natively is quite a reach. Your definition of what it means to run a language natively is so broad that it encompasses a large number of programming languages (if not all of them).

It's not about the number of passes of the compiler or if a language is supposed to be portable or not, but about the "virtual machine" abstraction a language is assuming, and how far is that from what typical hardware has to offer.

So, by your definition, most modern machines run Pascal natively? All I'm pointing out is that this is a pretty atypical use of the term "natively". Native code, for example, refers to code in its binary (processor-specific) form. No processor that I'm aware of knows how to run C code natively - it has to first be taken from its portable format and translated into native (assembly and then machine) code. It's not a matter of when or how that translation happens; processors simply don't speak C.

The abstraction of a virtual machine is definitely interesting, it just has nothing to do with whether or not processors can run C natively.

about 2 months ago

Ask Slashdot: Best Rapid Development Language To Learn Today?

dbrueck Re:Python (466 comments)

I haven't kept up on what the latest and greatest things are in this area, but last time I checked Shedskin seemed the most mature and had a relatively small set of restrictions (in most cases, converting Python to a statically-compiled language involves either giving up some of Python's dynamicness to make it "fit" into the more static language or adding some sort of layer on top of the static language to support more dynamic functionality - Python is strongly but dynamically typed).

Anyway, py2c and Nuitka both seem to be in the same space as Shedskin but I haven't used either.

Cython bills itself as an "optimizing static compiler" for Python, although I think it's geared more towards writing Python extensions in C as opposed to trying to convert your entire program away from Python (i.e. it's a good fit if you're writing in Python but want to statically compile some performance critical parts of your app in C, or if you are calling some C library and don't want to use ctypes or cffi).
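As a side note on the ctypes option mentioned above, here's a minimal sketch of calling a C library from Python without writing any C. It assumes a Unix-like system where the C math library can be located; the details vary by platform.

```python
import ctypes
import ctypes.util

# Locate and load the C math library (name/path differs per platform).
libm = ctypes.CDLL(ctypes.util.find_library("m"))

# Declare the C signature: double sqrt(double)
libm.sqrt.restype = ctypes.c_double
libm.sqrt.argtypes = [ctypes.c_double]

result = libm.sqrt(9.0)
print(result)  # 3.0
```

cffi covers the same ground with a different (many say friendlier) API; Cython instead generates and compiles C, which is why it's the better fit for performance-critical extension modules.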

I believe that for awhile rpython (from the pypy project) optionally targeted the LLVM; not sure if that ever went anywhere.

The above are just the ones I've heard about; don't know if there are others.

about 2 months ago

Ask Slashdot: Best Rapid Development Language To Learn Today?

dbrueck Re:Python (466 comments)

Agree to disagree on what he's implying - the context was talking about C and Python and mentioning that he's not familiar with hardware that runs Python natively. If he wasn't implying that C, on the other hand, does run natively, then the comment seems to have no meaning.

And yes, there /are/ versions of Python that compile to native code, but most people use the interpreter flavors as they give the most flexibility or have the fewest restrictions. Similarly, there are interpreted versions of C, but most people use the compiled flavor.

JIT-compiled languages sometimes in fact exceed statically-compiled languages as far as performance goes, but in general don't yet.

My point(s)? To recap:

- Hardware doesn't run C natively.
- Python is not a language-of-the-day or some passing fad.
- The fact that the main version of Python is written in C is mostly irrelevant (as in, I'm still not sure why it was mentioned, and it does not mean Python is not useful or not worth learning).
- Raw performance of a language is rarely as important as developer productivity, and languages like Python regularly yield higher developer productivity than lower level languages like C.
- More and more people moved to C from assembly as the benefits exceeded the costs more and more, but assembly of course didn't go away. More and more people move to higher level languages like Python from C as the benefits exceed the costs more and more, but C of course won't go away. The trend will likely continue, with lower level languages becoming more and more niche, because progressively more powerful, higher level languages will continue to provide benefits that outweigh their costs, while the lower level languages will not warrant as much use because of their relatively lower productivity.

about 2 months ago

Ask Slashdot: Best Rapid Development Language To Learn Today?

dbrueck Re:Python (466 comments)

I wasn't replying to the OP, nor (AFAICT) were any Python developers claiming C is inherently flawed. I was responding to the "Python is just Perl for the hipsters" nonsense as well as the weird statement that implied hardware ran C natively.

about 2 months ago

Ask Slashdot: Best Rapid Development Language To Learn Today?

dbrueck Re:Python (466 comments)

*Some* versions of Python are written in C, but not all of them. In fact, the version you're referring to is officially known as CPython to differentiate it from the others.

Claiming that hardware can run C natively is quite a reach. Your definition of what it means to run a language natively is so broad that it encompasses a large number of programming languages (if not all of them). Even before abstractions like LLVM were introduced, going from C to hardware involved translating to assembly and then machine language - multiple steps.

And "language-of-the-day"... really? Python is over 20 years old. I think it's beyond the passing fad stage.

Finally, it's not about getting rid of C, but people use higher level languages like Python over C for the same reason we generally stopped using assembly in favor of C back in the day - the costs vs benefits made it worth it in many cases. C's a good language in many ways and I still use it where it's the right fit, but these days it's realistic to get anywhere between 2x and 10x the productivity by using a higher level language than C. And if the cost is very little in practical terms, it's hard to justify not using a higher level language for many projects. Just like assembly, C probably won't go away anytime soon, if ever. But just like with assembly, these days there are many, many, many scenarios in which the cost vs the benefit isn't worth it.
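As an illustration of the productivity argument (a contrived sketch, not a benchmark): tallying word frequencies is a couple of lines of Python, where the equivalent C needs a hash table, manual string handling, and explicit memory management.

```python
from collections import Counter

text = "the quick brown fox jumps over the lazy dog the end"

# One line to tokenize and tally; in C this is easily dozens of lines.
counts = Counter(text.split())

print(counts.most_common(1))  # [('the', 3)]
```

None of this means C is the wrong tool when you need it - only that, for a large class of programs, the higher level language does the job in a fraction of the code.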

about 2 months ago

Ask Slashdot: Best Rapid Development Language To Learn Today?

dbrueck Re:Python (466 comments)

Sorry, I'm just struggling to understand the point you're trying to make, while also correcting the implication that there's generally available hardware that runs C natively.

The fact that Python was originally written in C is an argument for not using Python? I don't get it.

about 2 months ago

Ask Slashdot: Best Rapid Development Language To Learn Today?

dbrueck Re:Stick to PHP/HTML/CSS/JS (466 comments)

Oh c'mon, "No company uses python outside of scientific use" is wildly incorrect.

I started using Python about 15 years ago and, at the time, Python was obscure and it took some convincing to get others to use it for any "real" (i.e. customer-facing) projects. But it's so widely used now that I haven't had to make that argument for at least a decade, and I've used it at many different (and successful) companies.

I work for a Fortune 100 company currently and we primarily use Python for our whole system. Not by mandate - it's just a really great language in so many areas. Our whole web stack is written in Python and we handle hundreds of millions of transactions each month (yes, there are companies doing more, that's not my point, just trying to convey that it's not some toy web app or something).

And for large scale web apps, pypy+gevent packs a nice one-two punch. We've even started using Python on Android (via Kivy) lately, but at this stage it /is/ just for toy apps - for now.

No language is perfect for everything, but Python is really great in a lot of scenarios. Honestly, at this point PHP better qualifies for the title of "odd language" than Python does. The fact that so many companies still use PHP is a historical artifact more than anything.

about 2 months ago

Ask Slashdot: Best Rapid Development Language To Learn Today?

dbrueck Re:Python (466 comments)

Most hardware doesn't run C natively either. ;-)

about 2 months ago

Microsoft's New Smart Bra Could Stop You From Over Eating

dbrueck Re:Wrong fundamental assumption (299 comments)

> read the stuff again. overeating will lead to obesity in most cases, where there isn't some disease/condition preventing the fat accumulation, in which case the person isn't really overeating anyways.

> But, you could think it from the other way - without overeating it's __impossible__ to become obese.

Unfortunately, this is well-documented as not being the case. Our "common sense" screams that it's true, but it's not, and so we really have a hard time letting go of it. :) You can start reading about it by Googling "obesity malnutrition paradox" (some of the links will take you to malnutrition via overeating, I'm not talking about those, but there are oodles of resources about obesity in populations where there are nowhere near enough calories to reach what is considered the minimum recommended daily intake).

What's particularly compelling is that the same scenario is documented in many different peoples/societies in many different time periods - so it's incredibly well-established that obesity can and does occur without excessive caloric intake. Also, there are an abundance of studies that refute the notion that overeating *causes* obesity - there is a correlation, yes, but there isn't solid scientific evidence to support a causal relationship in the way you indicate (although there is a good bit of evidence that shows a causal relationship in the other direction - obesity leading to overeating).

> statistics back this up (counting out exotic diseases, elephantinism or whatever). besides, it's a proven method for losing weight: eat less, do more - or inversely a proven method for getting fat: eat more and do less.

Actually, no, there is a strong tie between eating less / doing more and feeling hungry, but not between eating less / doing more and losing weight and keeping it off for any interesting length of time. A really interesting link though is that the rise of the "eat less / do more" philosophy (as a way to control weight) more or less exactly correlates with the obesity epidemic - not saying it caused obesity, but that those ideas as a means of weight control replaced earlier, more correct ideas, and that transition in thinking also marks the rise in obesity. Similarly, there is strong evidence that the introduction of the food pyramid with a high carbohydrate base has been particularly disastrous. Prior to both of these philosophies, it was fairly well-accepted that carbohydrate intake and obesity had a causal relationship, but these fell out of fashion... and obesity rates took off.

> (and it doesn't really work like a river that it takes what it needs, it takes what it can and figures out where to put it later).

This too isn't quite right, unless you for some reason don't poop and pee. A really obvious example is most types of fiber - for the most part fiber passes right through your system (which, incidentally, is why e.g. a diabetic can for the most part ignore fiber when counting carbs to predict glycemic impact of a meal - the vast majority of the fiber isn't metabolized into sugars and just passes on through). Maybe that's what you meant by "takes what it can", but just because a calorie is introduced into the body, it doesn't somehow force the body to hang onto it - the body is quite adept at giving off what it doesn't think it needs (hint for what follows below: the problem isn't that the body doesn't know what to do with excess calories, but that it is being quite directly told to hang on to too many calories). Further, all calories aren't created equal - simple sugars, especially those already in liquid form, get sucked up by your body much more easily than the calories that are in the small, digestible portion of fiber.
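The carb-counting rule mentioned in passing can be written down as a trivial sketch. This is a simplification (real-world counting has more nuance), and the meal values below are made up:

```python
def net_carbs(total_carbs_g, fiber_g):
    """Estimate metabolizable carbs by ignoring fiber (simplified rule of thumb)."""
    return max(total_carbs_g - fiber_g, 0)

# Made-up example meal: 45g total carbs, 12g of which is fiber.
print(net_carbs(45, 12))  # 33
```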

Perhaps the misconception is because it's easy to assume that e.g. eating a fatty or high calorie food means more fat will end up on your body. That's not exactly how it works though - your body contains fat tissue that is used to store energy in the form of fat. It's like a rechargeable battery and is a necessary part of a normal, functioning body - it's what you need to survive the night without eating constantly, for example. Fat stored in fat tissue doesn't come from fat in food per se, or even directly from the calories in food (as in, it's not that direct of a relationship). The more correct (but still simplified) process is closer to: intestines metabolize food into sugars and deposit them into the bloodstream, insulin in the bloodstream acts as a trigger to the fat tissue to remove sugar from the bloodstream and to store it as fat. The reason many types of carbohydrates contribute to obesity is because they are often a double-whammy: many carbs contain high amounts of simple and easily digestible sugars *and* carbs trigger (often by a huge amount) rises in insulin levels. In a nutshell, a high carb diet tells your body to go nuts storing energy as fat and gives your body an excess of energy that is particularly easy to stockpile - very little "effort" is needed to extract it.

OTOH, many, many calories can be present in types of food that are more difficult or even impossible for the body to metabolize, which is why if you eat 2000 calories and burn 1500 calories, it doesn't *at all* mean that 500 calories must be stored by the body as fat - some or even all can be released from the body via waste (it's gross to think about, but both urine and feces can contain huge amounts of calories).

So back to the idea that without overeating it's impossible to become obese, the reason it's not right is at least in part this: obesity comes as a result of your body being told to store too much energy as fat, but an excess of calories does not necessarily translate into your body storing them as fat, nor does your body storing energy as fat depend on an excess of calories (the fat tissue can actually compete with e.g. the muscles for the sugar in the bloodstream, such that the person is nutritionally underfed but still accumulating fat).

Anyway, an active lifestyle and healthy eating are both extremely important and beneficial. They don't, however, help very much with obesity.

about 8 months ago

Microsoft's New Smart Bra Could Stop You From Over Eating

dbrueck Re:Wrong fundamental assumption (299 comments)

I used to think the same thing as you, but it turns out that "common sense" is wrong in this area, as are many dieticians (who too often also just rely on common sense). And history (at least in the form of rigorous scientific studies) does /not/ agree with you.

Overeating and obesity are correlated, but get this: some of the most current scientific studies show that, if anything, overeating is actually an *effect* of obesity and not a cause. What makes us fat? If you want to point to one thing and focus on it, it's carbohydrates. In a nutshell: carbs lead to insulin increases in the bloodstream, insulin tells fat tissue to remove energy from the bloodstream and store it as fat. Guess what happens when too much energy is removed (and stored as fat) such that there's not enough for normal body functions? Yep! You feel hungry, so you eat more.

In general, a reduction in carbs leads to a reduction in obesity, although the degree of the effect varies by body type and genetics and how much damage has been done from years of high carb eating. Also generally speaking, when people see weight loss from exercising, being on a "reality" TV show, or starting some fad diet, any lasting effects are likely to be highly correlated with a reduction (intentional or not) in carb intake.

There are numerous books on this subject and, sadly, the link between carbs and obesity has been recognized for over a hundred years. Unfortunately it fell out of fashion due to a combination of geopolitics and the health/exercise craze, among other things.

Don't get me wrong: I love to exercise. And I hate the victim mentality and am a huge believer in personal responsibility. But there is quite a large body of evidence that shows that the "eat less! be more active!" mentality is completely wrong. As in, those are generally great bits of advice (and have many helpful effects), but they neither prevent nor solve obesity.

For an accessible, entry level work on the subject, see "Why We Get Fat" by Gary Taubes, although he's the first to point out that the key principles he talks about have been understood for over a century.

about 8 months ago

Orson Scott Card Pleads 'Tolerance' For Ender's Game Movie

dbrueck Re:Who Cares? (1448 comments)

See, for example, http://www.mormonnewsroom.org/article/church-clarifies-proposition-8-filing-corrects-erroneous-news-reports which states that in-kind donations were under $200k (not millions) and that the church gave *no* cash donations.

The individual /members/ of the LDS church probably constituted a big chunk of the contributions to Prop 8, but that's very, very different than the LDS church as an institution directly contributing funds as you stated it - the assertion that Prop 8 passed because "the Catholic Church and the Mormon Church spent untold millions of dollars campaigning for it" is incorrect.

about a year ago

Orson Scott Card Pleads 'Tolerance' For Ender's Game Movie

dbrueck Re:ridiculous (1448 comments)

> As others have pointed out, however, Card's even gone beyond that. He actively uses his money to lobby for discriminatory marriage laws and anti-sodomy laws. If that's not crossing a line, I don't know what is.

Really? You can't think of anything more extreme than him lawfully donating money to causes he supports? ;-)

I don't agree with Card's extreme views, but you're basically saying it's ok for people to have an opinion but that it's not ok for them to engage in any sort of advocacy for that opinion. People lobby for and fund causes you and I disagree with all the time. We don't have to like those causes, but it's hardly crossing any lines - it's a part of our society, and you trying to shut that down would be an example of intolerance. Just let him do his thing and either ignore him, or engage in counter-advocacy of your own.

about a year ago

Orson Scott Card Pleads 'Tolerance' For Ender's Game Movie

dbrueck Re:Who Cares? (1448 comments)

> Prop 8 in California passed because the Catholic Church and the Mormon Church spent untold millions of dollars campaigning for it.

Minor correction: the Mormon church didn't spend money in support of Prop 8 (this is a matter of public record).

about a year ago

Physicists Create Quantum Link Between Photons That Don't Exist At the Same Time

dbrueck Re:Wait for the retraction (364 comments)

Wish I had some mod points right now - thanks for taking the time to explain this!

about a year ago

Python Family Gets a Triplet Of Updates

dbrueck Re:Not that surprising (196 comments)

Most people /use/ both in typical modern languages of course; but in terms of conveying the structure of the code to a person reading it, the whitespace is more significant (that's the point of the suggested experiment - to give you first-hand knowledge of the relative importance of each, independent of the other).

It's not "wrong" that both are used, but this whole sub-thread is in response to the implication that Python is somehow broken because it doesn't have the braces. But what Python has done is neither bad nor all that crazy of an idea: not only does it work extremely well in practice, it avoids the oddity of the tools looking at one structure indicator while the humans, who see both, (subconsciously or not) actually rely fairly heavily on the other.

Again, I don't think it's a big deal that other languages have been designed that way, but Python is certainly not poorly designed because it didn't opt to follow suit.

about a year ago

Python Family Gets a Triplet Of Updates

dbrueck Re:Not that surprising (196 comments)

> (hint: if you're completely honest, you'll almost certainly come up with different answers for #1 and #2 :) )

> You must free your mind even more, grasshopper.
> I don't see why #2 would be different from #1.

Try the experiment I suggested and then you'll see.

> What makes you (reader) think a byte is "stronger" than another for a parser ?

Nothing, because I don't think that. Try the experiment, and re-read my posts.

> I find the mindstate about spaces being "weak" chars annoying.

Er... okie dokie. I'm not suggesting that spaces are weak characters, so if that's the conclusion you've reached I encourage you to re-read my post, as well as those in the related sub-thread.

I still encourage you to try the experiment for yourself, but here's the answers to the earlier two questions:

1) indentation - many people would recognize this intuitively, but if you don't, the experiment I suggested makes it very clear. You can also confirm it by glancing at code from a distance, looking at how you write pseudocode, or try writing code without indentation for a few hours. If you're feeling especially masochistic, download a non-trivial module from an OSS project, remove all indentation and blank lines, and then read through it to become familiar with how the code works.

2) braces - obviously, because that's how the syntax of the language has been defined

If we were to quantify the "weight" of each block indicator to humans vs computers, it's probably something like:

- for humans: indentation=95+%, braces=5% or less (again, if the value you come up with is substantially different, then you really ought to try some of the above experiments to see for yourself. Seriously, take a few minutes and give it a whirl.)

- for computers: indentation=0%, braces=100% (by definition, per the language syntax)

Now, take a step back, put on your language designer hat, and think through the implications of this.
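The experiment referred to above can be sketched directly (the snippets below are hypothetical, purely for illustration): take the same C-style code and strip one block indicator at a time, then see which version you can still read.

```python
# Same logic, braces kept but indentation removed:
braces_only = """
if (n > 0) {
while (n > 0) {
total += n;
n--;
}
} else {
total = 0;
}
"""

# Same logic, indentation kept but braces removed:
indent_only = """
if (n > 0)
    while (n > 0)
        total += n;
        n--;
else
    total = 0;
"""

# Most readers find the second variant far easier to follow,
# even though only the first is valid C.
print(indent_only)
```

Which is the whole point: the compiler only reads the braces, but the humans mostly read the indentation.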

about a year ago

Submissions

dbrueck hasn't submitted any stories.

Journals

dbrueck has no journal entries.
