
Comments

Major Scientific Journal Publisher Requires Public Access To Data

LourensV Re:good and bad (136 comments)

I work with data collected by others, and those others are typically rather protective of their data for commercial reasons. I can use the data for scientific purposes, but I'm not allowed to publish them in raw form, and for most of these data there are no alternatives. I'd much rather publish everything, of course, but that's impossible in this case, so I wonder whether this means I can't publish in PLOS any more.

Just to be clear, I applaud this move; we should be publishing the data, plus software and such, where possible. Anyone happen to have a spare couple of tens of millions of euros lying around? That would probably free the data I'm using...

about 8 months ago

Ask Slashdot: An Open Source PC Music Studio?

LourensV Audiality (299 comments)

If you're interested in programming your own synthesizer software, then you're probably also interested in David Olofson's Audiality.

about 9 months ago

Sun Not a Significant Driver of Climate Change

LourensV Re:Grasping at Straws (552 comments)

You're probably just trolling, but you're currently modded +3, so I'm going to reply.

> Such as the US just had one of the 10 coldest years on record.

Minima and maxima are by definition outliers. There is an entire body of statistical literature on outliers, but outliers are not used to determine trends or to draw conclusions, because they are essentially (bad) luck.
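
To make that concrete, here's a toy sketch with purely synthetic numbers (nothing to do with any real temperature record): adding a single record-setting cold outlier to a noisy upward series barely dents the fitted trend, which stays clearly positive.

```python
import numpy as np

rng = np.random.default_rng(42)
years = np.arange(1960, 2014)
# Synthetic series: a weak warming trend (0.02 deg/yr) plus noise.
temps = 0.02 * (years - years[0]) + rng.normal(0.0, 0.3, size=years.size)

slope_clean = np.polyfit(years, temps, 1)[0]

# Turn the final year into a record cold outlier...
temps[-1] = temps.min() - 1.0
slope_outlier = np.polyfit(years, temps, 1)[0]

# ...and the fitted trend remains clearly positive.
print(f"trend without outlier: {slope_clean:+.4f} deg/yr")
print(f"trend with outlier:    {slope_outlier:+.4f} deg/yr")
```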

> The UK getting record snowfall despite AGWers claiming the UK wouldn't see snow after 2008.

Sources, please, because no serious scientist would ever make such a definite statement. A mathematician might, but science, including climate science, is all about statistics and probabilities, in any field. Perhaps you mean this article in the Independent? The scientist quoted says that in 20 years' time, snowfall will become a rare and exciting event. So I think we can consider him proven wrong if it snows in southern England for, say, five out of ten years from 2020 onwards.
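
As a back-of-the-envelope version of that test (the numbers here are my own reading of "rare and exciting", not the scientist's): if a snowy winter has at most, say, a 10% chance per year, then seeing five or more snowy years out of ten is wildly improbable.

```python
from math import comb

p = 0.10   # assumed probability of a snowy winter under the "rare event" reading
n = 10     # winters observed from 2020 onwards

# P(X >= 5) for X ~ Binomial(n, p)
p_at_least_5 = sum(comb(n, k) * p**k * (1 - p)**(n - k) for k in range(5, n + 1))
print(f"P(>=5 snowy winters out of {n}) = {p_at_least_5:.5f}")   # ~0.0016
```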

> Antarctica getting within .5 degrees of the coldest recorded temperature on earth.

Antarctica is a huge and largely unexplored continent. Finding a new minimum in a situation where very little information was available is hardly surprising, and certainly shouldn't be used to draw any conclusions.

> Along with 2000 record low temperatures recorded over the last couple of months.

Among how many measurements? Record since when? And see above about outliers.

> Add that to the IPCC report showing no warming for 17 years.

Indeed. They also investigated why, but you're conveniently leaving that out since it doesn't fit your agenda. I'll give you a hand as to the causes according to the IPCC: an exceptionally quiet sun (there's another of those outliers), several smaller volcanic eruptions increasing the amount of dust in the upper atmosphere, and an increase in dust in the lower atmosphere, probably due to industrial pollution. According to the IPCC, the discrepancy is partially explained by these three causes (which weren't put into the models when the prediction was made), and the remaining difference is small enough to fit within the natural variation (stochasticity) of the models, or be attributed to errors in the models.

> It's become pretty obvious which side has been lying. Now they are grasping at straws to report ANYTHING that shows their side "might" be right.

Sorry, this is not the 18th century anymore. Science is a quantitative affair, and necessarily so, because our world isn't binary. The question is not whether there is human-induced climate change, the question is how strong an effect humans are having on the biosphere. Maybe it's small enough to be negligible (probably not, according to what we currently know), maybe it's huge and a danger, but it's a quantitative question.

> I'm going to ignore the alarmists and look at the evidence myself. If AGW was real, they wouldn't have to lie as often and at least ONE of their predictions would have happened.

Excellent idea. Try reading the IPCC report instead of The Drudge Report and you might find some.

about 10 months ago

How the Lessons of Columbine Saved Lives At Arapahoe High School

LourensV Re:Rule #1 (894 comments)

> I'm from Germany, where gun laws are much, MUCH stricter, and therefore we aren't seeing such tragedies on the yearly basis that has come to be expected in the US.

> In Switzerland, every single person who once belonged to the army is not just allowed but expected to keep personal arms at home.

> The guns aren't the problem. People are.

In particular, the US has a serious problem with the way its health care system is organised, with the way its education system is organised, and with the way mental health issues are dealt with in the criminal justice system. As a result, people who should be getting psychiatric treatment or other forms of professional help don't get it, because:

1) they can't afford it, since their mental health issue (or even just the stigma associated with it) keeps them from holding a job, and there's no working social health insurance system;

2) they're in prison instead of having been institutionalised, because of the revenge-over-fixing-the-problem mentality towards crime; or

3) they're kids whose teachers are far too busy to build up a personal relationship with their students, notice the early warning signs, and offer a listening ear.

The problem isn't the guns, and the problem isn't the people. Changing those won't help. It's the society that will have to change to really do something about this.

about 10 months ago

NVIDIA's G-Sync Is VSync Designed For LCDs (not CRTs)

LourensV Re:In English (139 comments)

Exactly, and that is especially a problem for the Oculus Rift and the other virtual reality headsets that are coming onto the market, because it becomes really noticeable when you move your head quickly. I think that's what they're mainly targeting here, although according to John Carmack, G-Sync won't work on the Rift. Anyway, for those interested in the technical details, graphics programming legend Michael Abrash (currently at Valve) wrote an excellent technical piece some time ago about the frame timing issues you get with VR headsets.
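
To get a feel for the magnitudes involved (example figures of my own choosing, not from Abrash's piece): at a typical refresh rate, a single dropped frame during a fast head turn puts the image several degrees away from where it should be.

```python
refresh_hz = 60.0                 # a typical display refresh rate
head_speed_deg_s = 200.0          # a fast but realistic head turn

frame_ms = 1000.0 / refresh_hz    # ~16.7 ms per refresh
# A missed vsync shows the stale frame for one extra refresh interval.
angular_error_deg = head_speed_deg_s * frame_ms / 1000.0
print(f"one dropped frame at {refresh_hz:.0f} Hz during a "
      f"{head_speed_deg_s:.0f} deg/s head turn -> "
      f"{angular_error_deg:.1f} degrees of error")   # ~3.3 degrees
```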

1 year, 2 days

Japan Controls Rocket Launch With Just 8 People and 2 Laptops

LourensV Re:It's a solid rocket booster stack (94 comments)

The Epsilon rocket is three stages of solid rocket booster, like an ICBM. So there's no fueling on the pad, no plumbing, no cryogenics, and no turbopumps. The launch team has a lot less to do than with liquid-fueled rockets.

They're also proudly proclaiming how quickly they can prepare the rocket for launch. I don't think that these features are coincidental, and I don't think that cost savings are the only driver behind developing this thing. North Korea's leadership is a bit unstable at times, it may have nuclear weapons, and Japan has had North Korean rockets fly over its territory before. It's a serious potential threat to them.

Since losing WWII, Japan has been very pacifist, but in recent years it has begun to expand its military activities a bit, taking part in a UN peacekeeping mission, for instance. Outright developing an ICBM would probably go a bit too far at this point, but making a civilian rocket that can be launched at short notice with a small crew, and that has the range to hit North Korea, could just be an acceptable compromise between mitigating the NK threat and not rocking the domestic political boat too much with overly aggressive military moves.

about a year ago

US Electrical Grid On the Edge of Failure

LourensV Re:Coincidentally... (293 comments)

> > > Based off of a sample size of 1. Nice generalization.

> > Hey! That's one better than some of the climate change theories!

> (I know this was meant as a troll/joke, but you're hitting the nail on the head.) No. They have a sample size of "1 earth". Exactly "1 earth". Of course that's due to the lack of spare earths that we could compare ours to. But it is exactly what makes this whole subject statistically "challenging".

If all you could measure was the global average temperature then yes, you'd have one sample of a simple probability distribution, which contains so little information that you can't possibly derive any interesting knowledge from it about a system as complex as our planet. Fortunately, we have measurements of many aspects of that planet, not just temperature but atmospheric composition, ocean temperatures and salinity, albedo, ocean currents and wind, and so on, and not just global averages but measurements localised in space and time. So your one sample is actually a sample of an extremely high-dimensional, highly internally correlated probability distribution, which gives us much more information to work with.

Now, it's true that we can't do controlled experiments covering the entire Earth, since we don't have a control to compare against. However, we can do such experiments at a smaller scale and use the results to guide the construction of whole-planet models, and we can exploit the natural variety across our planet to test hypotheses and draw conclusions. So science is still possible, we just need to use different tools. Models make predictions, and if a model predicts our actual sample to be unlikely, we can rightfully conclude that that model is unlikely to be a good description of reality.
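
A toy illustration of that point: a single realisation of an AR(1) time series is, formally, one sample, yet because it is long and internally correlated it pins down the process's parameter very precisely.

```python
import numpy as np

rng = np.random.default_rng(0)
phi_true = 0.8          # the "physics" we pretend not to know
n = 100_000             # length of the single realisation

# One sample: a single draw of an AR(1) process x[t] = phi*x[t-1] + noise.
x = np.empty(n)
x[0] = rng.normal()
for t in range(1, n):
    x[t] = phi_true * x[t - 1] + rng.normal()

# Estimate phi from that one (high-dimensional, correlated) sample.
phi_hat = np.dot(x[:-1], x[1:]) / np.dot(x[:-1], x[:-1])
print(f"true phi = {phi_true}, estimate from a single sample = {phi_hat:.4f}")
```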

The modeller's challenge is to create a description of a complete planet that accurately describes the characteristics that you're interested in and correctly mimics the emergent behaviour (insofar as is relevant to your research question) of the actual planet, while still being simple enough to fit in a computer, give meaningful and comprehensible results, and have reasonable uncertainty bounds on its predictions given the limited amount of information we have available to feed it. I don't think that having a second Earth would make that job much easier.

about a year ago

EFF Wins Release of Secret Court Opinion: NSA Surveillance Unconstitutional

LourensV Re:Tipping point (524 comments)

> We have 2 political parties in this country. They dictate the issues. They write the rules governing how you create a party and how you get on a ballot. Nearly everyone in the media belongs to one of the two parties. The parties control the message. You basically cannot vote for anyone who does not belong to one of the parties. You can write in a name, but the fact of the matter is that it's nearly impossible to coordinate a write-in voting effort.

I'm not from the US, but given all that's happened in the past 15 years it seems to me that at this point voting either Republican or Democrat in any federal election should be considered treason. A vote for either of these parties is a vote for a government of the people, by the elite, for the corporations, and as I understand it, that wasn't quite the idea of your country. Perhaps a write-in or third party vote is a wasted vote, but at least you're not actively voting for this abomination.

As for alternatives besides your current third parties: in the most recent elections in Italy (which had similar issues), the Five Star Movement got almost a third of the vote in what was previously a two-party (or two-coalition) system, with a strictly online and on-the-streets campaign (they're boycotting the Berlusconi-controlled mainstream media). They're promoting, among other things, more direct (e-)democracy, limited terms in both houses of parliament filled by ordinary people who take a few years out of their lives to serve the country, and a reduction in campaign spending.

It's certainly not perfect: they are having issues with disagreements within the party, it turns out online voting doesn't work too well technically, and some of their other policy ideas probably wouldn't work in the US. You'd need your own version of such a party for sure, fix some things, and then it still will be a struggle to make it work. But it shows that it's not impossible to break a two-party system even if it controls the mainstream media, and it's worth a try. Even inexperienced and/or somewhat incompetent representatives would be an improvement over what you currently have as long as they're at least honestly trying to represent the people.

about a year ago

US States Banned From Exporting Trash To China Are Drowning In Plastic

LourensV Re:They aren't drowning in plastic (427 comments)

Actually, we recently started collecting plastic separately. That means that in most places we now collect glass (white and coloured often separate), paper, clothing, compostable waste, batteries and other small chemical waste, and plastic separately. On top of that there is a deposit scheme that puts a small extra fee on PET soft drink bottles, glass beer bottles, and beer crates, which you get back when you hand them in at the shop. Supermarkets have machines that you put them into, and they're collected when the shop is resupplied. The bottles are stripped of their labels, cleaned, and reused up to 50 or so times IIRC, before being recycled.

It's not much of a burden: you just keep a few extra bins or bags of waste around and remember to take them with you when you go to get your groceries. Just about every supermarket has a bunch of recycling bins on the forecourt. That's not to say that there aren't lazy people who just toss everything in the garbage, but they're probably a minority.

about a year ago

Shuttleworth Answers FSF Call for Free Software Drivers on Edge

LourensV Re:You can't make promises... (112 comments)

> And as my grandpa used to say, "Girls want ponies, people in hell want ice water, I want a million dollars... that don't mean any of us are gonna get it".

> Unless they are gonna kickstarter the chips in the thing it'll be DAMN hard to make it FOSS, simply because the ones making the GPUs, wireless, etc, are about the most proprietary lot on the planet. Hell I don't even think you CAN make a FOSS GPU as everything from texture compression on up is patented up the ass, I know there was a project to make one using an FPGA but I never heard any more about it, probably ran into the legal minefield and ran aground.

Basically, it ran out of money; the main contributors didn't have as much time available any more, and making an ASIC is expensive. Some prototype boards were manufactured, and the employer of the main developer (who let him use their tools and work on it some during office hours) made a commercial product based on the design. It never got to producing a consumer video card, though. I see now that Kickstarter actually existed in 2010, but I don't think any of us had ever heard of it, and I don't think we could have raised the couple of million dollars needed to have the cards produced.

For those interested, there's still an active mailing list; the project isn't quite dead.

about a year ago

Ask Slashdot: Scientific Research Positions For Programmers?

LourensV Re:yes, there are a reasonable number of positions (237 comments)

Another option you may want to look into is working at a supercomputer centre. These are usually (semi-)independent organisations that maintain supercomputers and fast networks, and help scientists use them. Jobs there include technical sysadmin-type work maintaining compute clusters, storage arrays, and networking equipment; programming with an emphasis on parallelisation, optimisation, and visualisation; and more consulting-type work, where you advise researchers on how best to use the available facilities to achieve their goals and gather requirements for the programmers. As a random US example, there's one in Chicago.

As for technical skills, if you're in the geosciences then you'll definitely want to brush up on your knowledge of Geographical Information Systems. ESRI ArcGIS is the big commercial vendor there, but there's also a lot of FOSS GIS software available. Also, some knowledge of geostatistics will help you communicate; some tutorials can be found here.

about a year ago

AMD Overhauls Open-Source Linux Driver

LourensV Re:Still not Stallman-approved. (126 comments)

> I don't understand why simply putting the closed source firmware on the card suddenly makes it ok for free software. Same code, just different home.

Back in the days of the Open Graphics Project (since defunct, although Timothy N. Miller is still working in this area and the mailing list is still active for those interested in the subject), we had several discussions about the borders between Free software, open firmware, and open hardware.

As I understood the FSF's position at that time, the point is that if the firmware is stored on the host, it can be changed, and frequently is (i.e. firmware updates). Typically, the manufacturer has some sort of assembler/compiler tool to convert firmware written in a slightly higher level language to a binary that is loaded into the hardware, which then contains some simplistic CPU to run it (that's how OGD1 worked anyway). So, the firmware is really just specialised software, and for the whole thing to be Free, you should have access to the complete corresponding source code, plus the tools to compile it, or at least a description of the bitstream format so you can create those. This last part is then an instance of the general rule that for hardware to be Free software-friendly, all its programming interfaces should be completely documented.

If the code is put into ROM, it cannot be changed without physically changing the hardware (e.g. desoldering the chip and putting in another one). At that point, the FSF considers it immutable, and therefore not having the firmware source code doesn't restrict the user's freedom to change the firmware, since they don't have any anyway. The consequences are a bit funny in practice, as you noted, but it is (as always with the FSF) a very consistent position.

We (of the OGP-related Open Hardware Foundation, now also defunct; the whole thing was just a bit too ambitious and too far ahead of its time) argued that since hardware can be changed (i.e. you can desolder and replace that ROM), keeping the design secret restricts the user's freedom just as much. So we should have open hardware, which would be completely documented (not just the programming interfaces, but the whole design) and could therefore be changed, extended, repaired, and have its parts reused by the user. The FSF wasn't hostile to that idea, but considered it beyond their scope. Of course, any open hardware would automatically also be Free software-friendly.

I tend to agree that in practice, especially if there are no firmware updates forthcoming but it's just a cost-savings measure, loading the code from the host rather than from a ROM is a marginal issue. Strictly speaking though, I do think that the FSF have a point.

about a year ago

Next-Next Generation Video: Introducing Daala

LourensV Re:Another day, another codec. (86 comments)

Those existing codecs are all very similar technically, and riddled with patents. If Monty can make something new (and he can, see CELT) and work around those patents (and he can, see Vorbis, Theora), then it's definitely a welcome addition. And a codec doesn't have to dominate to be useful; Vorbis is widely used (Wikipedia, all sorts of software that plays sound and music including a lot of if not most video games) and supported on a lot of platforms (including hardware players and set-top boxes) even if it never did completely replace MP3 and AAC. If nothing else, having a free and unencumbered option will keep the licensors of the proprietary codecs at least somewhat honest.

Incidentally, isn't it about time for Monty to get an EFF Pioneer award? He's been very successfully working on freely usable audio and video codecs for well over a decade now, starting at a time when many people didn't believe that a non-encumbered audio or video codec was even possible. Someone with his skills could probably make a very good living in proprietary codec development, but he chose to start Xiph.org and fight the good fight (and now works for Red Hat). He belongs in that list IMHO.

about a year ago

Taking Action For Free JavaScript

LourensV Re:Gosh!!! (318 comments)

> So here we are, at a crossroads. If a project produces the source code needed to build a complete, binary-perfect copy of their executable(s), but it was run through the C pre-processor, or C++ pre-processor, is that enough? It compiles, it builds with the version of tools the provider used... if you discount the pre-processor, it is effectively the original source code provided to the compiler. Is that enough?

I believe Stallman answered that question already, and as you would expect from him, it's a smart answer too. In the GPL (v3, but it goes back all the way to v1) it says "The “source code” for a work means the preferred form of the work for making modifications to it." So, if the creator of the source code actually works on the preprocessed source all the time, then it's okay to redistribute only that. If, in fact, any work done on the program is typically done on the original, non-preprocessed source, then that is the source code and that has to be distributed. This neatly avoids having to define a minimum level of readability by simply requiring that all users/developers be equal.

So, for JavaScript, if the authors actually do their programming directly on the minified version, then distributing only that would be okay. If they don't, and use a non-minified version for development (which everyone does), then I'd want to have that original version as well before I'd call it Free software.

about a year ago

Physicist Proposes New Way To Think About Intelligence

LourensV Re:Am I missing something? (233 comments)

> Suggesting that the purpose of intelligence in this man's random musings might be to increase the background levels of entropy for your own benefit.

That's close, I think. I am not a physicist and I skimmed the equations, but here's my take on what they're proposing. Physical systems have states, which can be described by a state vector. The state of these systems evolves according to some set of rules that describes how the state vector changes over time. They've built a simulator in which the probability of a certain state transition is computed by looking at how many different paths (in state space, i.e. future histories of the system) are possible from the new state, in such a way that the system tries to maximise the number of possibilities for the future. In one example, they have a particle that moves towards the centre of a box, because from there it can move in more directions than when it's close to a wall.
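
For intuition, here's a toy sketch in the same spirit (my own, emphatically not the authors' actual algorithm): a walker on a bounded line that greedily steps towards whichever neighbouring cell leaves the most short paths open drifts away from the walls, just like their particle-in-a-box.

```python
def num_paths(pos, steps, lo=0, hi=20):
    """Count the distinct walks of the given length that stay inside [lo, hi]."""
    if steps == 0:
        return 1
    total = 0
    for nxt in (pos - 1, pos + 1):
        if lo <= nxt <= hi:
            total += num_paths(nxt, steps - 1, lo, hi)
    return total

pos, horizon = 2, 8          # start near a wall, look 8 steps ahead
for _ in range(15):
    # Greedily step to the neighbour from which the most futures remain open.
    pos = max((p for p in (pos - 1, pos + 1) if 0 <= p <= 20),
              key=lambda p: num_paths(p, horizon))
print("final position:", pos)   # settles away from the wall, near the middle
```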

They then set up two simple models mimicking two basic intelligence tests, and find that their simulator solves them correctly. One is a cart with a pendulum suspended from it, which the system moves into an upright position because from there it's easiest (cheapest energetically, I gather) to reach any other given state. The other is an animal intelligence test, in which an animal is given some food in a space too small for it to reach, and a tool with which the food can be extracted. In their simulation, the "food" is indeed successfully moved out of the enclosed space, because it's easier to do various things with an object when it's close compared to when it's in a box. However, in neither case does the algorithm "know" the goal of the exercise. So they've shown that they've invented a search algorithm that can solve two particular problems, problems which are often considered tests of intelligence, without knowing the goal.

Then, they use this to support the hypothesis that intelligence essentially means maximising future possibilities. Another way of saying this, I think, is that an intelligent creature will seek to maximise the amount of power it has over its environment, and they've translated that concept into the language of physics. That's an intriguing concept, relating to the concept of liberty, power struggles between people at all scale levels, scientific and technological progress, and so on. I can't imagine this idea being new though. So it all hinges on to what extent this simulation adds anything new to that discussion.

On the face of it, not much. You might as well say that they've found two tests for which the solution happens to coincide with the state that maximises the number of possible future histories. The only surprising thing then is that their stochastically-greedy search algorithm (actually, without having looked at the details, I wouldn't be surprised if it turned out to be yet another variation of Metropolis-Hastings with a particular objective function) finds the global solution without getting stuck in a local minimum, which could be entirely down to coincidence. It's easy to think of a problem that their algorithm won't solve, for example if the goal were to put the "food" into the box rather than taking it out. Their algorithm will never do that, because it would increase the future effort necessary to do something with the object. Of course, you might consider that pretty intelligent, and many young humans would certainly agree, although their parents might not. It would be interesting to see how many boxed objects you need before the algorithm considers it more efficient to leave them neatly packaged rather than randomly strewn about the floor, if that happens at all.

There's another issue in that the examples are laughably simple. While standing upright allows you to do more different things, no one spends their lives standing up, because it costs more energy to do that as a consequence of all sorts of random disturbances in the environment. The model ignores this completely. Similarly, you could argue that since in the simulation (unlike in the actual animal experiment) there is no reward for using the object, expending the energy to get it out of its box is not very intelligent at all.

Conclusion: an interesting idea, but in its present state, not much more than that.

about a year and a half ago

Ask Slashdot: Linux Friendly Video Streaming?

LourensV Re:Backwards (147 comments)

> The usual answer to questions like this is:
>
> (1) Decide what you want the computer to do.
>
> (2) Acquire the right platform.
>
> Saying "I've already got [whatever platform], how do I make it do what I want?" is often not a helpful approach.

If RMS and Linus had followed that advice, GNU, Linux, and probably Slashdot would never have existed. Why should one have to buy Windows and allow customer-hostile DRM software on one's computer to be able to watch a movie easily and legally? It's your computer, and the whole point of owning it is that you can make it do what you want. Trying to do just that seems perfectly reasonable to me, and I can't see how any system that doesn't allow it could be the "right platform" for anything.

about a year and a half ago

R 3.0.0 Released

LourensV Re:How modern! (75 comments)

I can somewhat relate to the documentation issue although I believe that it is more a question of organizing the documentation.

One of the things that bothers me about the documentation is that there's often no distinction between interface and implementation. Instead of a description of what a function does, you get implementation details mixed up with what it approximately hopes to achieve, leaving you unable to see the forest for the trees.

> When you mention "a fundamental problem" you mention function implementations, thus library rather than language issues. R itself is an extremely expressive, functional (or rather multi-paradigm) language that can be programmed to run efficient code. Yet it is syntactically minimalistic without unneeded syntax (as opposed to all of the scripting languages perl/python/ruby). This makes it a truly postmodern language IMO.

Well, there's only one implementation, so it's rather pointless that it could be implemented efficiently. The language specification isn't exactly good enough to create a competing, compatible implementation either. I agree that the syntax is minimalistic and that there's extremely little boilerplate, but I could really do with some way of defining data types (Python 2 is lacking there as well IMO), and namespaces...

> Efficiency can sometimes be a problem but the break-even point for implementing parts in say C/C++ is only slightly different than for other languages (say perl/python) and is enabled by an excellent interface (Rcpp package).

Ah, the universal solution to problems with R: here's how to do it in some other language or software instead. Sorry for being sarcastic, but it's amazing how often that advice effectively showed up whenever I searched the web for a solution to some problem I encountered with R.

As an example of my experience: I use JAGS to fit models to data, and JAGS wants the model as a text file description. My model has a node for every combination of some 13,000 sites and 11 years, and the text file runs to several tens of megabytes depending on model options. Creating it is basically a matter of running through all the combinations of sites and years, looking up some additional data, and spitting out a line of text describing each one. My first implementation was very naive: nested for loops that essentially did a nested-loop join on the data. It generated output at several tens of kilobytes per second, getting slower and slower as it went on. I managed to speed it up by preallocating memory (R seems not to double a vector's capacity when it runs out, as the C++ STL does, but to add a constant extra amount, so that growing a vector makes the loop run in quadratic time; oddly, when measured it actually seemed to be exponential, for who knows what reason), pre-sorting the data and changing to a merge join, and vectorising as much as possible. It now does about a megabyte per second, which is fast enough for my purposes. However, the code is now completely unreadable, and it's still nowhere near what the hardware can do (PostgreSQL does the equivalent nested loop in less than a second). R turned what should have been a trivial programming task into a frustrating adventure, and the result is still not very good.
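
For what it's worth, here's a sketch of the pattern that finally worked, transcribed from memory into Python (the file names, columns, and model lines are made up for illustration): sort once, stream through the rows, and write each line straight to the file instead of growing strings in memory.

```python
import csv

# Hypothetical inputs: observations.csv has one row per site-year,
# sites.csv has per-site covariates.
with open("observations.csv") as f:
    obs = sorted(csv.DictReader(f), key=lambda r: (r["site"], int(r["year"])))
with open("sites.csv") as f:
    habitat = {r["site"]: r["habitat"] for r in csv.DictReader(f)}

with open("model.jags", "w") as out:
    out.write("model {\n")
    for i, row in enumerate(obs, start=1):
        h = habitat[row["site"]]   # keyed lookup instead of a nested loop
        # One node per site-year, written straight to the file,
        # never accumulated in a growing in-memory string.
        out.write(f"  y[{i}] ~ dpois(lambda[{i}])\n")
        out.write(f"  log(lambda[{i}]) <- alpha_{h} + year_eff[{row['year']}]\n")
    out.write("}\n")
```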

> For myself the biggest change to make was to start thinking in functional concepts coming from a procedural background. Much of R criticism IMO stems from a failure to realize conceptual differences between functional and procedural programming. Another problem that might spoil the impression of R sometimes is the plethora of packages of highly varying quality.

True, but this is really another instance of the don't-do-it-in-R solution, because those functional programming functions effectively just run your loop in C, rather than in R (if they don't forward the whole operation to a C scientific maths library), which makes the performance bearable. If R were really a multi-paradigm language, then you would be able to solve a problem procedurally as well if it happened to be the best way to do it.

about a year and a half ago

R 3.0.0 Released

LourensV Re:How modern! (75 comments)

I recently switched my scientific programming from R to Python with NumPy and Matplotlib, as I couldn't bear programming in such a misdesigned and underdocumented language any more. R is fine as a statistical analysis system, i.e. as a command line interface to the many ready-made packages available in CRAN, but for programming it's a perfect example of how not to design and implement a programming language. It's also unusably slow unless you vectorise your code or have a tiny amount of data. Unfortunately, vectorisation is not always possible (the algorithm may be inherently serial), and even when it is, it tends to yield utterly unreadable code. Then there is the dysfunctional memory management system, which makes you run out of memory long before you should, and documentation, even of the core library, that leaves you no choice but to program by coincidence.

As an example of a fundamental problem, here's an R add-on package whose stated goal is to be "[..] a set of simple wrappers that make R's string functions more consistent, simpler and easier to use. It does this by ensuring that: function and argument names (and positions) are consistent, all functions deal with NA's and zero length character appropriately, and the output data structures from each function matches the input data structures of other functions." Needless to say, there is absolutely no excuse for having such problems in the first place; if you can't write consistent interfaces, you have no business designing the core API of any programming language, period.

Python has its issues as well, but it's overall much nicer to work with. It has sane containers including dictionaries (R's lists are interface-wise equivalent to Python's dictionaries, but the complexity of the various operations is...mysterious.) and with NumPy all the array computation features I need. Furthermore it has at least a rudimentary OOP system (speaking of Python 2 here, I understand they've overhauled it in 3, but I haven't looked into that) and much better performance than R. On the other hand, for statistics you'd probably be much better off with R than with Python. I haven't looked at available libraries much, but I don't think the Python world is anywhere near R in that respect.
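
To make the performance point concrete, a trivial (and deliberately contrived) sketch of the difference between an interpreted per-element loop and the equivalent vectorised call:

```python
import numpy as np

x = np.random.default_rng(1).normal(size=1_000_000)

# Interpreted loop: one Python-level iteration per element (slow).
acc = 0.0
for v in x:
    acc += v * v

# Vectorised: the same loop runs in compiled code inside NumPy (fast).
acc_vec = float(np.dot(x, x))

assert abs(acc - acc_vec) < 1e-6 * abs(acc_vec)
```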

Anyway, for doing statistics I don't really think there's anything more extensive out there than R, proprietary or not, although some proprietary packages have easier-to-learn GUIs. In that field, R is not going anywhere in the foreseeable future. For programming, almost anything is better than R, and I agree that the improvements you mention are not doing much for R's competitiveness in that area.

about a year and a half ago

Python Gets a Big Data Boost From DARPA

LourensV Re:I get the impression that (180 comments)

You're not picking on me, you're arguing your point. That's what this thing here is for, so no hard feelings at all.

I'll readily admit to not knowing Fortran (or much Python! ;-)); I'm a C++ guy myself, having got there through GW-Basic, Turbo Pascal, and C. I now teach an introductory programming course using Matlab (and know of its history as an easy-to-use Fortran-alike), and I use R because it's what's commonly used in my field of computational ecology. I greatly dislike R, and I'm not too hot on Matlab either, because the first thing you should do when programming is to decide what the program is about, and to express that you need type definitions, which neither Matlab nor R has. From a very quick look around, at least recent versions of Fortran do have them, so that's good in my view. As for the RAM limitations in R, it seems to me that they are actually a consequence of the vectorised style of programming and the lack of lazy evaluation: you tend to get either unreadable code with enormous expressions, or a lot of temporaries that eat up lots of RAM.
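
Incidentally, the temporaries problem isn't unique to R; NumPy's vectorised expressions allocate hidden intermediates too, though it at least offers in-place operations as an escape hatch. A minimal sketch:

```python
import numpy as np

a = np.ones(10_000_000)     # ~80 MB per array
b = np.ones_like(a)
c = np.ones_like(a)

# The expression allocates a hidden full-size temporary for (a * b)
# before the addition produces the result.
d = a * b + c

# In-place version: one preallocated buffer, no hidden temporaries.
d2 = np.empty_like(a)
np.multiply(a, b, out=d2)
np.add(d2, c, out=d2)

assert np.array_equal(d, d2)
```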

Replying to your other post, I was thinking of the many hundreds of millions that are spent on satellites and the dedicated compute clusters for weather forecasting. I've also heard of budget issues and lack of replacement satellites in that area, but it's still a lot of money compared to most grants. Over here it's big news if someone manages to get a million Euro grant, spread over a couple of years, while NOAA has a 4.7 billion USD yearly budget. Of course they do other things than weather forecasting, I'm comparing an entire government organisation to a single scientific investigation here, but it's a different level for sure.

In the end, I suspect that we're simply in different fields, and therefore seeing different things. Generally speaking, the more physical the field, the more tech-savvy the scientists, and the more computer use. In my institute, Microsoft Excel is by far the number one data processing tool...

about a year and a half ago

Submissions

LourensV hasn't submitted any stories.

Journals

LourensV has no journal entries.
