
Comments


State Colleges May Offer Best ROI On Comp Sci Degrees

raddan Re:Calculation was flawed (127 comments)

Not to mention: many UVA grads likely stay in Virginia, and Stanford grads likely stick around in Silicon Valley (e.g., 100% of the Stanford grads that I know). The cost of living in Silicon Valley is dramatically higher than in Virginia. E.g., the cheapest condo in Palo Alto listed on Zillow is priced at $548,000 (which is more than $300k above the already insane appraisal value), and for that you get 679 square feet. When I worked as an intern in Mountain View, my housing was (fortunately!) covered by my employer, but my boss ended up taking a job elsewhere because he and his wife simply could not afford anything more spacious than an RV. If you don't adjust salaries for cost of living, your study is fundamentally flawed.

about three weeks ago

Ask Slashdot: Should Developers Fix Bugs They Cause On Their Own Time?

raddan Re:Guarantee (716 comments)

We don't need to certify programmers, we need to certify programs. I'm not sure that certification for programmers would provide any extra benefit other than maybe being a prior on whether you think the programmer can get the job done or not (and I'm not a Bayesian, so...).

On the other hand, many properties of programs themselves can and should be verified. A great deal of current programming language research is devoted both to improving the capabilities of automatic program verification and to designing languages more amenable to verification. Functional languages, for instance, rule out entire classes of bugs present in imperative languages. People complain that they're hard to understand. Maybe. I argue that they're hard to understand if you're the kind of person who does not care whether your program is correct or not.
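Immutability is a concrete instance of a language feature ruling out a bug class. Here's a sketch of my own (in Python rather than a functional language; function names are illustrative) showing the kind of aliasing bug that pure functions make impossible:

```python
def add_bonus_imperative(scores, bonus):
    # Bug-prone: silently mutates the caller's list in place.
    for i in range(len(scores)):
        scores[i] += bonus
    return scores

def add_bonus_pure(scores, bonus):
    # Pure: returns a fresh list; the input is never modified.
    return [s + bonus for s in scores]

original = [70, 80, 90]
add_bonus_imperative(original, 5)
print(original)             # [75, 85, 95] -- the caller's data changed

original = [70, 80, 90]
bumped = add_bonus_pure(original, 5)
print(original, bumped)     # [70, 80, 90] [75, 85, 95]
```

A language that forbids mutation (or tracks it in the type system) makes the first function unwritable, so a whole class of spooky-action-at-a-distance bugs never compiles.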

about 2 months ago

Asm.js Gets Faster

raddan Re:"So who needs native code now?" (289 comments)

Unless and until some unforeseen, miraculous breakthrough happens in language design, GCd languages will always be slower when it comes to memory management. And because memory management is so critical for complex applications, GCd languages will effectively always be slower, period.

This isn't true. Have a look at Quantifying the Performance of Garbage Collection vs. Explicit Memory Management. The take-away is that GC'd languages are only slower if you are unwilling to pay an extra memory cost; typically 3-4x of your explicitly-managed program. Given that GC gives you safety from null-pointer dereferences for free, I think that's a fair tradeoff for most applications (BTW, you can run the Boehm collector on explicitly-managed code to identify pointer safety issues).

about 4 months ago

Asm.js Gets Faster

raddan Re:Maximum precision? (289 comments)

I was being glib. Just nitpicking on the phrase "maximum precision". Sorry, it's a bad habit developed from working around a bunch of pedantic nerds all day.

Thanks for the pointer about native ints, although I can't seem to find any kind of authoritative reference about this. This guy claims that asm.js converts these to native ints (see Section 2.3: Value Types), but his link seems to be talking about the JavaScript runtime, not the asm.js compiler. If you have a reference, I'd appreciate it if you'd send it along.

about 4 months ago

Asm.js Gets Faster

raddan Maximum precision? (289 comments)

Let's just open up my handy Javascript console in Chrome...

(0.1 + 0.2) == 0.3
false

It doesn't matter how many bits you use in floating point: values like 0.1 have no exact base-2 representation, so they are always approximations. And in base-2 floating point, the comparison above will never be true.

If they're saying that JavaScript is within 1.5x of native code, they're cherry-picking the results. There's a reason why people who care have a rich set of numeric datatypes.
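The same behavior is easy to reproduce outside a browser console: Python's float is the same IEEE-754 double, and its standard library is one example of the "rich set of numeric datatypes" I mean (the modules below are real stdlib modules; the example itself is mine):

```python
from decimal import Decimal
from fractions import Fraction

# Binary doubles: 0.1 and 0.2 are approximations, so the sum isn't 0.3.
print((0.1 + 0.2) == 0.3)                                     # False

# Exact decimal arithmetic.
print(Decimal("0.1") + Decimal("0.2") == Decimal("0.3"))      # True

# Exact rational arithmetic.
print(Fraction(1, 10) + Fraction(2, 10) == Fraction(3, 10))   # True
```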

about 4 months ago

'Approximate Computing' Saves Energy

raddan Re:Numerical computation is pervasive (154 comments)

My mind inevitably goes to this when someone says "Big O". Makes being a computer scientist somewhat difficult.

about 4 months ago

'Approximate Computing' Saves Energy

raddan Re:Numerical computation is pervasive (154 comments)

Not to mention floating-point computation, numerical analysis, anytime algorithms, and classic randomized algorithms like Monte Carlo algorithms. Approximate computing has been around for ages. The typical scenario is to save computation, nowadays expressed in terms of asymptotic complexity ("Big O"). Sometimes (as is the case with floating point), this tradeoff is necessary to make the problem tractable (e.g., numerical integration is much cheaper than symbolic integration).

The only new idea here is applying approximate computing specifically to trade precision for lower power. The research has less to do with new algorithms and more to do with new applications of classic algorithms.
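As a concrete example of the classic precision-for-computation tradeoff, here's a hedged sketch of Monte Carlo estimation of pi; more samples cost more computation and (statistically) buy more precision:

```python
import random

def approx_pi(samples, seed=0):
    # Classic Monte Carlo: the fraction of random points landing inside
    # the unit quarter-circle approximates pi/4.
    rng = random.Random(seed)
    hits = sum(1 for _ in range(samples)
               if rng.random() ** 2 + rng.random() ** 2 <= 1.0)
    return 4.0 * hits / samples

# More samples -> more computation -> (statistically) tighter estimates.
for n in (100, 10_000, 1_000_000):
    print(n, approx_pi(n))
```

The expected error shrinks like 1/sqrt(n), which is exactly the dial these power-saving schemes turn: accept a looser answer, do less work.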

about 4 months ago

The Challenge of Cross-Language Interoperability

raddan Re:Cross language - what .Net gets right (286 comments)

Believe it or not, CIL (or MSIL in Microsoft-speak), the bytecode for .NET, is an ECMA standard, and implementations of both .NET JIT'ers and standard libraries exist for practically all modern platforms, thanks to Mono. So I'd say: "competition for portable applications". Really! Just take a look at Gtk#. As a result, there are numerous applications for Linux written in .NET languages (e.g., Banshee). Having written huge amounts of code in both JVM languages (Java, Scala, JRuby, and Clojure) and .NET languages (F# and C#), I would take .NET over the JVM for a new project any day.

Also, to pre-emptively swat down this counter-argument: while the Mono people and Microsoft may have had some animosity in the past, it is most definitely not the case any more. Most of the Mono people I have spoken to (yes, in person) say that their relationship with Microsoft is pretty good.

Build systems and dependency management for the JVM are their own mini-nightmare. .NET's approach isn't perfect but compared to [shudder] Ant, Maven, Buildr, SBT, and on and on and on... it largely just works.

about 5 months ago

The Challenge of Cross-Language Interoperability

raddan Re:Cross language - what .Net gets right (286 comments)

P/Invoke, the other interop mechanism alluded to by the poster, is substantially faster than COM interop. I spent a summer at Microsoft Research investigating ways to make interop for .NET faster. There are maybe 20 or so cycles of overhead for P/Invoke, which is practically free from a performance standpoint. In addition to having its own [reference-counting] garbage collector, COM has extensive automatic-marshaling capabilities. These things make interop easy, but they add substantial overhead compared to P/Invoke. On the other hand, P/Invoke is somewhat painful to use, particularly if you want to avoid marshaling overheads and play nice with .NET's [tracing] garbage collector and managed type system. P/Invoke will often happily accept your ginned-up type signatures and then fail at runtime. Ouch.

Coming from the Java world, I was totally blown away by what .NET can do. I can't speak for Microsoft myself, but I would be very surprised if .NET were not going to stick around for a long time. With the possible exception of Haskell, the .NET runtime is probably the most advanced managed runtime available to ordinary programmers (i.e., non-researchers). And, with some small exceptions (BigInteger performance... GRRR!), Mono is a close second. What the Scala compiler is capable of squeezing out of the poor, little JVM is astonishing, but Scala's performance is often bad in surprising ways, largely due to workarounds for shortcomings in the JVM's type system.

about 5 months ago

On the subject of robots ...

raddan Re:overly broad then overly specific definition (318 comments)

I think the key distinction is that a robot is autonomous to some degree. It needs to make use of techniques from AI. I.e., it learns.

As someone who dabbles in AI techniques to solve problems in my own domain (programming language research), I've found that AI solutions tend to have the quality that the algorithms producing them are extremely general. For example, a robot that can manipulate objects may not even possess a subroutine telling it how to move its hands. Often, it learns these things by example instead. It "makes sense" of the importance of these actions through statistical calculations, or logical solvers, or both. Since information in this context is subject to many "interpretations", these algorithms often most closely resemble search algorithms! If a programmer provides anything, it's in the form of "hints" to the algorithm (i.e., heuristics). To an outsider, it's completely non-obvious how "search" and "object manipulation" are related, but when you frame the problem that way, you get weird and sometimes wonderful results. Most notably, autonomy. Sadly, you also sometimes get wrong answers ;)
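To make the "everything is search plus hints" framing concrete, here's a toy sketch of my own: a generic best-first search that knows nothing about its domain; only the successor function and the heuristic "hint" do:

```python
import heapq

def search(start, goal, successors, heuristic):
    # Generic best-first search: always expand the state the heuristic
    # "hint" likes best. Nothing here knows what the states mean.
    frontier = [(heuristic(start), start, [start])]
    seen = {start}
    while frontier:
        _, state, path = heapq.heappop(frontier)
        if state == goal:
            return path
        for nxt in successors(state):
            if nxt not in seen:
                seen.add(nxt)
                heapq.heappush(frontier, (heuristic(nxt), nxt, path + [nxt]))
    return None

# A toy "domain": moving on a 4x4 grid. Swap in arm configurations and
# you get a (very naive) motion planner from the same search code.
goal = (3, 3)
successors = lambda s: [(s[0] + dx, s[1] + dy)
                        for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1))
                        if 0 <= s[0] + dx <= 3 and 0 <= s[1] + dy <= 3]
manhattan = lambda s: abs(s[0] - goal[0]) + abs(s[1] - goal[1])  # the "hint"
print(search((0, 0), goal, successors, manhattan))
```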

If your washing machine could go collect your dirty laundry and wash it without your help, I'd call it a laundry robot. Particularly if you could tell it something like "please do a better job cleaning stains", and it could figure out what you meant by that. Note that a Roomba doesn't know anything about your house until you turn it on the first time.

about 5 months ago

Ask Slashdot: How Reproducible Is Arithmetic In the Cloud?

raddan Re:Fixed-point arithmetic (226 comments)

Experiments can vary wildly with even small differences in floating-point precision. I recently had a bug in a machine learning algorithm that produced completely different results because I was off by one trillionth! I was being foolish, of course, because I hadn't used an epsilon for FP comparisons, but you get the idea.
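For the curious, the epsilon fix is a one-liner in most languages; a sketch using Python's stdlib `math.isclose`:

```python
import math

a = 0.1 + 0.2
print(a == 0.3)                             # False: exact float equality bites
print(math.isclose(a, 0.3, rel_tol=1e-9))   # True: compare within a tolerance
```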

But it turns out that even if you're a good engineer and you are careful with your floating-point numbers, the fact remains: floating point is approximate computation. And for many kinds of mathematical problems, like dynamical systems, this approximation changes the result. One of the founders of chaos theory, Edward Lorenz, of Lorenz attractor fame, discovered the problem by truncating the precision of FP numbers from a printout when he was re-entering them into a simulation. The simulation behaved completely differently despite the differences being in the thousandths. That was a weather simulation. See where I'm going with this?
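You don't need a weather model to see this. The logistic map, a one-line chaotic system, shows the same sensitivity; this sketch of mine uses the 0.506127 → 0.506 truncation that is commonly reported as the one Lorenz made:

```python
def logistic(x, steps, r=4.0):
    # Iterate the chaotic logistic map x -> r * x * (1 - x).
    for _ in range(steps):
        x = r * x * (1.0 - x)
    return x

full = 0.506127    # "full-precision" initial condition
short = 0.506      # the same value truncated in the thousandths place

# The trajectories agree early on, then diverge completely.
for steps in (10, 30, 60):
    print(steps, logistic(full, steps), logistic(short, steps))
```

A difference of about one part in ten thousand in the input grows exponentially until the two runs bear no resemblance to each other, which is exactly why bit-for-bit reproducibility matters for this class of computation.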

about 5 months ago

How Early Should Kids Learn To Code?

raddan Re:logic (299 comments)

There's an interesting project out of Microsoft Research called TouchDevelop: a language/IDE designed to be easy to use via touch interfaces. I've spoken with one of the researchers at length, and his motivation for doing this boils down to the same argument you make. When he was a kid, programming was simple to get into: he used BASIC on his Apple II. It was a language that kids could teach other kids, because their parents largely didn't "get it". MSR has a pilot program in one of the local high schools where they teach an intro CS course using TouchDevelop. Anecdotally, kids seem to pick up the language very quickly, and the ease of writing games motivates a lot of kids who wouldn't ordinarily be motivated to do this.

That said, I think TouchDevelop's interface (like most of Metro) is a bit of a train wreck. I am a professional programmer, but I find myself floundering around. Part of the issue is Metro's complete annihilation of the distinction between text, links, and buttons. Unfortunately, iOS 7 has continued in this trend. But I digress...

TouchDevelop is also not a graphical language like LabVIEW, which I think is a bit of a mistake. While I prefer a text-based language for real work, I think a visual interface would be entirely appropriate for a pedagogical language. Heck, LabVIEW is used daily by lots of real engineers who simply want some basic programmability for their tools without having to invest the [significant] time into learning a text-based language.

about 7 months ago

I'd prefer my money be made of ...

raddan Re:Trust (532 comments)

But gold's value is an artifact of a modern civilization that has its basic needs met. Suppose there's a global food crisis. What's more valuable now?

about 7 months ago

China Bumps US Out of First Place For Fastest Supercomputer

raddan Re:Clueless (125 comments)

Counting operations is not enough. Memory access time is nonuniform because of cache effects, architecture (NUMA, distributed memory), code layout (e.g., is your loop body one instruction larger than L1 i-cache?), etc. Machine instructions have different timings. CISC instructions may be slower than their serial RISC counterparts. Or they may not be. SMT may make sequential code faster than parallel code by resolving dependencies faster. Branch predictors and speculation can precompute parts of your algorithm with idle function units. Better algorithms can do more work with fewer "flops". And on and on and on...

The best way to write fast code is to write it and run it (on representative inputs). Then write another version and run it. Treat it like an experiment, and do a hypothesis test to see which version has a statistically significant speedup. That's the only way to write fast code on modern machines. The idea that you can predict, by hand, which code will be fast on modern architectures is largely a myth.
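A minimal sketch of "run it like an experiment" in Python (the functions and workload are illustrative, and a real benchmark needs warmup, isolation, and more samples):

```python
import timeit
from statistics import mean, stdev

def sum_loop(xs):          # deliberately naive version
    total = 0
    for x in xs:
        total += x
    return total

def sum_builtin(xs):       # candidate "optimized" version
    return sum(xs)

data = list(range(1000))
a = timeit.repeat(lambda: sum_loop(data), number=100, repeat=10)
b = timeit.repeat(lambda: sum_builtin(data), number=100, repeat=10)

# Welch's t statistic; |t| of roughly 2 or more suggests a statistically
# significant difference for samples of this size.
t = (mean(a) - mean(b)) / ((stdev(a) ** 2 / len(a) + stdev(b) ** 2 / len(b)) ** 0.5)
print("loop:", mean(a), "builtin:", mean(b), "t:", t)
```

The point is the protocol, not the arithmetic: repeated measurements, a statistic, and a decision rule, instead of eyeballing a single run.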

about 10 months ago

China Bumps US Out of First Place For Fastest Supercomputer

raddan Re:Clueless (125 comments)

As a computer scientist:

We rarely refer to the cost of an algorithm in terms of flops, since it is bound to change with 1) software implementation details, 2) hardware implementation details, and 3) input data dependencies (for algorithms with dynamical properties). Instead, we describe algorithms in "Big O" notation, which is a convention for describing the theoretical worst-case performance of an algorithm in terms of n, the size of the input. Constant factors are ignored. This theoretical performance figure allows apples-to-apples comparisons between algorithms. Of course, in practice, constant factors need to be considered for many specific scenarios.
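A quick illustration of counting operations as a function of n rather than in flops (my own sketch): linear search is O(n), binary search is O(log n), and the counts make the asymptotic gap concrete.

```python
def linear_search(xs, target):
    # O(n): examine elements one at a time.
    comparisons = 0
    for i, x in enumerate(xs):
        comparisons += 1
        if x == target:
            return i, comparisons
    return -1, comparisons

def binary_search(xs, target):
    # O(log n): halve the sorted search range each step.
    comparisons, lo, hi = 0, 0, len(xs)
    while lo < hi:
        comparisons += 1
        mid = (lo + hi) // 2
        if xs[mid] < target:
            lo = mid + 1
        else:
            hi = mid
    found = lo if lo < len(xs) and xs[lo] == target else -1
    return found, comparisons

xs = list(range(1_000_000))
print(linear_search(xs, 999_999)[1])   # 1000000 comparisons
print(binary_search(xs, 999_999)[1])   # 20 comparisons (ceil of log2(1e6))
```

Neither count changes with the hardware, which is why Big O enables apples-to-apples comparisons; the per-comparison cost is the constant factor that Big O deliberately ignores.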

"Flops" are more commonly used when talking about machine performance, and that's why they're expressed as a rate. You care about the rate of the machine, since that often directly translates into performance. Computer architects also measure integer operations per second, which is in many ways more important for general-purpose computing. Flops are really only of interest to people doing scientific computing, now that graphics-related floating-point work has been offloaded to GPUs.

If you want to be pedantic, computers are, of course, hardware implementations of a Turing machine. But it's silly to talk about them using Big O notation, since the "algorithm" for (sequential) machines is mostly the same regardless of what machine you're talking about. The constant factors here are the most important thing, since these things correspond to gate delay, propagation delay, clock speed, DRAM speed, etc.

about 10 months ago

Submissions


Good electronic lab notebook software for CompSci?

raddan raddan writes  |  more than 3 years ago

raddan (519638) writes "I'm starting as an MS/PhD computer science student in the fall, and having gotten accustomed to electronic tools like RT, wikis, and version-control repositories in the private software industry, I'd like something similar for my lab notebook. However, I've discovered that most of these tools are either 1) geared toward the biological sciences or 2) just plain too expensive for me, probably because many fields have strict legal and ethical requirements for note-keeping. Anyone know of something affordable geared toward computer science? What are the legal and ethical requirements for CS notebooks anyway? And, of course, it'd be nice if I could run it on Linux. Is this just a fantasy?"

Windows 7 Whopper Cross-Promotion in Japan

raddan raddan writes  |  more than 4 years ago

raddan (519638) writes "NPR has a story about a new cross-promotion Microsoft is doing with Burger King in Japan:
To heighten awareness of its new Windows 7 operating software in Japan, Microsoft teamed up with Burger King for a cross-promotion. It's a Windows 7 Whopper — seven hamburger patties stacked in a bun. The offer is good for seven days, and it's only for the first 30 customers who come into the store each day."

Link to Original Source

Prolific hacker Jun-ichiro "itojun" Hagino dies

raddan raddan writes  |  more than 6 years ago

raddan (519638) writes "Jun-ichiro "itojun" Itoh Hagino passed away on October 29, 2007 at the age of 37. Details are light, but there's a brief thread going over at undeadly. itojun was probably best known for his work on the KAME IPv6 stack which will benefit us for years to come. itojun, you will be missed!"

Author Robert Jordan dies

raddan raddan writes  |  more than 6 years ago

raddan (519638) writes "Our company just sent out the following memo:

Tor novelist Robert Jordan (whose given name was James Oliver Rigney Jr.), the beloved author of the bestselling Wheel of Time® fantasy series, died Sunday after a courageous battle with the rare blood disease amyloidosis.

In an entry posted Sunday on Jordan's blog at www.dragonmount.com, Jordan's cousin Wilson Grooms wrote that he passed away Sunday, September 16th at 2:45 pm and noted that: "He fought a valiant fight against this most horrid disease. In the end, he left peacefully and in no pain," and that "his beloved wife, Harriet, was at this side through the entire fight and to the end."

Tor publisher Tom Doherty said of Jordan: "He was one of the great storytellers of the 20th and early 21st centuries; Jim's Wheel of Time is a towering epic of power and scope, he was a man of courage and heart and vision but for me, first of all, he was my friend of 30 years.""

Journals

raddan has no journal entries.
