Slashdot: News for Nerds


Ask Slashdot: Statistical Analysis Packages For Libraries?

samzenpus posted more than 2 years ago | from the we-know-what-you-read-last-summer dept.

Education 146

HolyLime writes "I'm a librarian in a small academic library. Increasingly, the administration is asking our department to collect data on various aspects of our activities: classes taught, students helped, circulation, collection development, and so on. This is generating a large stream of data that is difficult and time-consuming to analyze quantitatively. For anything complicated, I currently use Excel or an analogous spreadsheet program. I am aware of statistical analysis programs like SPSS or SAS. Can anyone give me recommendations for statistical analysis programs? I also place emphasis on anything that is open source and easy to implement, since it will allow me to bypass the convoluted purchase approval process."


146 comments

R or WEKA ... Wait, What Exactly Are You Doing? (5, Informative)

eldavojohn (898314) | more than 2 years ago | (#38076772)

R [r-project.org] is my personal favorite, but you're going to have to get down and dirty with some high-level programming (scripting). Check out the data import package [r-project.org] (you would probably export your spreadsheets to flat txt files and import those, although the functionality is ever increasing). There's no user interface in this suggestion ... what there is, however, is a massive collection [r-project.org] of packages for statistical analysis. Very well maintained, constantly updated, and ever expanding.
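To give the submitter a feel for that export-then-script loop before committing to R, here is a minimal sketch in plain Python (standard library only, so nothing to install); the file contents and column name are invented for illustration:

```python
# Minimal sketch of the export-then-analyze loop: dump the spreadsheet
# to a flat CSV, then script the summary. The data and column name
# below are made up.
import csv
import io
import statistics

def summarize(csv_text, column):
    """Return count, mean, and sample standard deviation of a numeric column."""
    rows = csv.DictReader(io.StringIO(csv_text))
    values = [float(r[column]) for r in rows if r[column].strip()]
    return {
        "n": len(values),
        "mean": statistics.mean(values),
        "stdev": statistics.stdev(values) if len(values) > 1 else 0.0,
    }

# Example: circulation counts exported from a spreadsheet.
data = "month,checkouts\nJan,120\nFeb,95\nMar,143\n"
print(summarize(data, "checkouts"))
```

The same three lines of analysis in R would be `read.csv` plus `mean` and `sd`; the point is only that once the data is flat text, the monthly summary becomes a script you rerun, not a spreadsheet you rebuild.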

The other suggestion has a better GUI but is really heavyweight. WEKA [waikato.ac.nz] has helped me time and time again perform advanced statistical calculations [ibm.com] on data sets, and since it's in Java it runs on just about anything. Its interface occasionally improves too; they now have an explorer that I use to prep data and remove outliers/null data [waikato.ac.nz] (don't worry, this isn't climate data). It's well documented [waikato.ac.nz].

These (probably) require an intermediate data transformation step but are open source and extensively supported. Any examples of what you want to do? Simple stuff like standard deviation, or complex stuff like principal component analysis (PCA)? I guess if it were just simple stuff, that'd be built into Excel, right? Maybe your problems are simple enough that you just need a good macro writer to tackle them? Whatever happens, good luck!

Re:R or WEKA ... Wait, What Exactly Are You Doing? (3, Informative)

logical_failure (2405644) | more than 2 years ago | (#38076794)

Came here for the mention of R, and leave satisfied. R is an excellent choice.

Re:R or WEKA ... Wait, What Exactly Are You Doing? (0, Flamebait)

CmdrPony (2505686) | more than 2 years ago | (#38076826)

... you're going to have to get down and dirty with some high level programming (scripting) ... There's no user interface in this suggestion ... Their interface occasionally improves too ... These (probably) require an intermediate data transformation step ... Maybe your problems are simple enough to just need a good macro writer to tackle ...

He said he wants something that is easy to implement, and the only reason he is going with open source is that he doesn't have to ask for purchase approval. Which, IMO, is a really stupid reason and will hurt in the long run - it's insane to take worse software just because you don't want to ask your boss if it's okay to buy the better one.

I also place emphasis on anything that is open source and easy to implement since it will allow me to bypass the convoluted purchase approval process.

Sorry to burst your bubble, but if you want good support and easy implementation, you have to look at normal paid-for solutions. Besides, open source is not a synonym for free. This is especially true with specialized software or anything you want good support for. Open source just means you get the code as well, so you can implement your own additions (without the use of plugins) or change it.

But unless you get a product from a company that is spending money to develop it, you never get both good software and good support. No one can make both, because everything in this world costs money and developers have to live too. The open source and free software model works well for the likes of Google and Firefox because development is paid for with advertising money. Statistical analysis software, and other specialized software, is a different matter.

Re:R or WEKA ... Wait, What Exactly Are You Doing? (0)

Anonymous Coward | more than 2 years ago | (#38077034)

Nonsense. He isn't doing heavy stats, and he certainly doesn't understand what's involved. Large streams of data and MS Excel are mutually exclusive.

The poster should have provided his input data and output requirements; he'd have gotten a ton of solutions from one of us who writes commercial stats software for a living.

Re:R or WEKA ... Wait, What Exactly Are You Doing? (2, Insightful)

Anonymous Coward | more than 2 years ago | (#38077536)

I was at a "Large Data Sets" conference where there was an awkward pissing contest over who had the biggest data set. Then it became a question of whether you had to time-adjust the size of a data set, since a megabyte data set used to be huge. Then someone pointed out that large is relative; what is a large data set for a stats student (or librarian) is trivial for people working on the largest of the day, but it is still large for that person. I don't know what the OP is analyzing, but for them, this is large AND it fits in Excel. (And, since an Excel sheet expanded from 2^24 to 2^34 cells, it can now hold a fairly large amount.)

TL;DR: "Large" is a matter of perspective, so don't think Excel makes it a small data set.

Go Ahead and List Them Then (4, Interesting)

eldavojohn (898314) | more than 2 years ago | (#38077042)

I also place emphasis on anything that is open source and easy to implement since it will allow me to bypass the convoluted purchase approval process.

Sorry to burst your bubble, but if you want good support and easy implementation, you have to look at normal paid-for solutions. Besides, open source is not a synonym for free. This is especially true with specialized software or anything you want good support for. Open source just means you get the code as well, so you can implement your own additions (without the use of plugins) or change it.

Your point may be valid. But what would really help your validity is mentioning some proprietary products that beat R and WEKA at their own game. Sure, I've used Matlab; it can't be beat in some respects and is heavily supported. But to suggest it just because it effortlessly interfaces with Excel spreadsheets, when the person could get by with a simple Excel export and run an R script on the resulting files? Not worth the cash, in my opinion. I don't go out and buy every piece of software to evaluate it, though. I'm aware of Matlab and Mathematica and have used them quite a bit ... but I still prefer R and WEKA. So, CmdrPony, go ahead and list all the proprietary point-and-click-omg-it-just-works software for our friend here. We're all waiting.

But unless you get a product from a company that is spending money to develop it, you never get both good software and good support.

Say, friendo, have you ever heard of Linux? Eclipse? Audacity? PostgreSQL? VLC?

No one can make both, because everything in this world costs money and developers have to live too. The open source and free software model works well for the likes of Google and Firefox because development is paid for with advertising money. Statistical analysis software, and other specialized software, is a different matter.

Can you tell me what advertising model is employed to funnel money through Firefox into Google? I mean, Google makes a competing product called Chrome -- the rendering engines are even different! What in the world are you freebasing?

Re:Go Ahead and List Them Then (2)

DeadDecoy (877617) | more than 2 years ago | (#38077120)

Stata is another option, and it isn't too expensive. I find it more usable than R for the basic tests. And it somewhat supports copy-and-paste to and from Excel.

Re:Go Ahead and List Them Then (1)

Anonymous Coward | more than 2 years ago | (#38080130)

Stata is far and away the easiest program to use; I've used Stata, SAS, and SPSS, and spent an afternoon with R before simply giving up.

You just get your data into .csv form, import it appropriately, and then reg y x for the basics. If you want fancypants implementation, the GUI has a good deal of stuff, but most of it is a matter of googling.

IV, ARMA, ARIMA, panel data, time series, generating lags: all not terribly difficult. Lowest time investment.

Free software with a good GUI is "Gretl".

Re:Go Ahead and List Them Then (0)

Anonymous Coward | more than 2 years ago | (#38077568)

I don't know for sure, but I'm guessing you were just successfully trolled, in the sense of "practice of playing a seriously misinformed or deluded user":

http://en.wikipedia.org/wiki/Troll_%28Internet%29

Re:Go Ahead and List Them Then (4, Informative)

peter in mn (1549709) | more than 2 years ago | (#38077882)

One major advantage of R is that it's the standard teaching package for undergraduate statistics. That means the stats department (or the math department, if the school is too small to have a separate stats department) will have people who can show you how to do stuff. That is, support is available, locally, for free. Also, there are teaching texts that start simple and build up to as complicated as you want. A saved R script is a reasonable way to automate the report preparation process: you can collect data in Excel, dump it to tab-delimited text, read it into R, and generate a pile of pretty graphs over and over again every month. But writing the script requires a fair amount of study, and being able to talk to someone who uses it a lot will make you much happier.
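The Excel-to-tab-delimited-to-script pipeline described above, minus the graphs, reduces to a few lines. A sketch in plain Python (the column names and log entries are invented) just to show the shape of the automation:

```python
# Sketch of the "dump to tab-delimited text, then script the report"
# step: tally one row per logged event into counts per activity type.
# Column names and entries are invented.
import csv
import io
from collections import Counter

def monthly_report(tsv_text):
    """Count events per activity from a tab-delimited export."""
    rows = csv.DictReader(io.StringIO(tsv_text), delimiter="\t")
    return Counter(r["activity"] for r in rows)

log = (
    "date\tactivity\n"
    "2011-11-01\tclass\n"
    "2011-11-02\treference\n"
    "2011-11-02\tclass\n"
)
print(monthly_report(log))
```

An R script doing the same (read.delim plus table) would be even shorter; either way, the monthly report becomes "re-export, re-run" rather than hand-built.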

Re:Go Ahead and List Them Then (1)

Anonymous Coward | more than 2 years ago | (#38079616)

I would mention Statistica and SPSS. They have really nice interfaces and have extensive statistics/data mining capabilities.

Re:R or WEKA ... Wait, What Exactly Are You Doing? (2, Insightful)

Anonymous Coward | more than 2 years ago | (#38077100)

Ease of deployment does NOT rule out open source. Ease of deployment simply depends on how the package was programmed; many closed-source packages are just as hard to deploy as open source ones. Saying you must buy software is useless without actually giving information about the general field of statistical analysis, such as the various options and comparisons between closed source and open.

As for support: it's true open source generally has limited support, but often it's enough, since many closed-source vendors also provide limited or slow support unless you're one of their larger customers. Also, you can sometimes find companies that will sell paid support for open source software. Simply speaking, support varies GREATLY depending on the software in question, be it open or closed source. Note, however, that he did NOT mention anything about his requirements for software support, so whether support is even an issue is up in the air.

This is also a library, probably with limited funds. Organizations like these can take an insane amount of time to approve software, before even factoring in that the request can easily be rejected. While it is true that open source doesn't mean free, it might as well in the majority of cases: most open source projects release their binaries for free (companies like Cedega, which hide access to the source and charge for the binary, are in the extreme minority). If an open source product can meet his requirements, why shouldn't he go with it? Both open and closed source take time to deploy, and the time spent trying to get closed software approved can instead be spent deploying open source software.

*Note: I don't advocate open vs. closed source; I'm just speaking to this specific case. If closed software fits and is less hassle, go for it. If open source fits and is easier to deploy (or approve), go for that instead. It really depends on the requirements. Software packages are tools; use the one that best fits your needs.

Re:R or WEKA ... Wait, What Exactly Are You Doing? (4, Insightful)

Alan Shutko (5101) | more than 2 years ago | (#38077122)

In this case, you're not quite correct. Our head statistician wants to get R in here to supplement SAS (which we pay a lot of money for) because it is both good software and heavily used for research. As he put it: "If we started using R, we could start using new tools as soon as we read the paper, since most of the researchers are using R."

Re:R or WEKA ... Wait, What Exactly Are You Doing? (3, Informative)

Warlord88 (1065794) | more than 2 years ago | (#38077196)

Why do you think R is not easy to implement? My company has been using SAS for a long time and we are finally making the change to R. As far as OP's requirements are concerned, I think R is way superior to SAS or SPSS because of its free, modular nature. It is clean, simple and suitable for a wide range of users. The commercial packages are filled with way too much business lingo garbage for me.

I personally think commercial support is overrated. I can install software on my own. I know how to browse through manuals and other information to find what I need. For a package like R, I almost always get my questions answered within at most a few hours on online forums. So what exactly do I get from commercial support for my money? But if the OP needs commercial support, there is an enterprise edition of R by Revolution Analytics, located here: http://www.revolutionanalytics.com/products/revolution-enterprise.php [revolutionanalytics.com]. Might be worth looking into.

Bottom line: R all the way.

Re:R or WEKA ... Wait, What Exactly Are You Doing? (1)

ottothecow (600101) | more than 2 years ago | (#38077932)

While I do my work in SAS and recognize where it has strengths (especially running on a SAS server with lots of shared libraries and huge datasets ... sometimes it is nice that datasets are disk-space limited and not RAM limited) ... R is better than SAS for the submitter.

It's not a pretty and easy GUI like Excel, but it is going to be no more difficult to implement than SAS (and since the language is less archaic, it might be more intuitive), and it is free. There is a large community to provide support, so you can get quick answers from other people who are just "dabblers". Due to the cost of running SAS, almost all of the support online is written by and for people who use it daily, and as such may not feel as accessible to a newbie as the R community.

Maybe some bastardized GUI tool like SAS Enterprise Guide could solve the user's problems as well ... and I always hear about packages like Tableau, which might do what they want ... but R is quite functional and completely free.

Re:R or WEKA ... Wait, What Exactly Are You Doing? (2, Interesting)

Anonymous Coward | more than 2 years ago | (#38077244)

He said he wants something that is easy to implement, and the only reason he is going with open source is that he doesn't have to ask for purchase approval. Which, IMO, is a really stupid reason and will hurt in the long run - it's insane to take worse software just because you don't want to ask your boss if it's okay to buy the better one.

Horse shit. I've seen projects die because they couldn't get software through the approval process. Better to try 10 apps that are free and run in userspace (so no need to get IT involved for an Administrator install) than to wait for management approvals, budget cycles, and IT support, and never get the project done. If I'd done that on the job, I'd have been fired for taking too long to do my work.

I also resent the implication that "free" means "worse."

Sorry to burst your bubble, but if you want good support and easy implementation, you have to look at normal paid-for solutions. Besides, open source is not a synonym for free. This is especially true with specialized software or anything you want good support for. Open source just means you get the code as well, so you can implement your own additions (without the use of plugins) or change it.

I'm guessing you haven't used R. Not only is there a thorough user manual [r-project.org], but there are books [amazon.com] from most major statistical and instructional groups on how to use R, AND the R-help mailing list [stat.ethz.ch] answers every R question I've ever had, AND there are local R user groups [revolutionanalytics.com] where you can get support similar to how LUGs work.

But unless you get a product from a company that is spending money to develop it, you never get both good software and good support. No one can make both, because everything in this world costs money and developers have to live too. The open source and free software model works well for the likes of Google and Firefox because development is paid for with advertising money. Statistical analysis software, and other specialized software, is a different matter.

Please shut up. If your assumption were true, R would not exist. R exists, so you're just an asshat.

My advice to the original poster: Use R if you have any familiarity with programming. Any higher level math/stat course OR experience with basic programming will let you get started in R. If you've been doing this all in Excel already, you're probably ready to hop into R. If you're still uncomfortable, I'm sure one of the people who value your academic library could help out.

Re:R or WEKA ... Wait, What Exactly Are You Doing? (1)

SirGarlon (845873) | more than 2 years ago | (#38077368)

But even if you get a product from a company that is spending money to develop it, you never get good software and good support.

Fixed that for you.

Re:R or WEKA ... Wait, What Exactly Are You Doing? (1)

geekoid (135745) | more than 2 years ago | (#38077548)

why do you assume open = worse?

"you never get good software and good support. "
bullshit.

Re:R or WEKA ... Wait, What Exactly Are You Doing? (1)

Bucky24 (1943328) | more than 2 years ago | (#38079850)

As far as support, it depends on the size of the project and how involved the developers are. There are a ton of small scale projects that are good at what they do, but are abandoned. You'll get no support for those.

Re:R or WEKA ... Wait, What Exactly Are You Doing? (1)

SecurityGuy (217807) | more than 2 years ago | (#38078610)

He said he wants something that is easy to implement, and the only reason he is going with open source is that he doesn't have to ask for purchase approval. Which, IMO, is a really stupid reason and will hurt in the long run - it's insane to take worse software just because you don't want to ask your boss if it's okay to buy the better one.

Uh, no. Depending on where in the economy you live, you can find an open source product that does what you need and get the actual work done LONG before purchasing can actually get the software on your desk.

Re:R or WEKA ... Wait, What Exactly Are You Doing? (1)

magisterx (865326) | more than 2 years ago | (#38080140)

Sorry to burst your bubble, but if you want good support and easy implementation, you have to look at normal paid-for solutions. Besides, open source is not a synonym for free. This is especially true with specialized software or anything you want good support for. Open source just means you get the code as well, so you can implement your own additions (without the use of plugins) or change it.

I think it depends on how you define "good support". Many free (both libre and gratis) applications are very well supported by their communities; this includes both Python and R. If you do not like community support, most major free applications have companies that will happily sell support contracts. Red Hat is the obvious example with Linux; Logilab and ActiveState will sell support contracts for Python.

As for the open source part, you are technically right that there is a difference between "open source" and "libre" or "gratis". But unless they specifically say otherwise at some point, most people who say open source are looking for something that is both libre and gratis, not just that there is some way to acquire the source code.

Re:R or WEKA ... Wait, What Exactly Are You Doing? (1)

fropenn (1116699) | more than 2 years ago | (#38076866)

I too like R. You might pair it with Tinn-R (http://sciviews.org/Tinn-R/) to simplify some of the coding process. Last I heard, there was also some work on a GUI for R, but I don't know if that's progressed very far.

SPSS is fairly easy to use, and I would recommend it over SAS for basic analyses, but, as the parent suggested, it really depends on what you want to do. You might be pretty happy just downloading some Excel macros, which can be found through web searches (or, better yet, writing your own).

Re:R or WEKA ... Wait, What Exactly Are You Doing? (1)

garcia (6573) | more than 2 years ago | (#38076916)

I guess if it was just simple stuff, that'd be built into Excel, right? Maybe your problems are simple enough to just need a good macro writer to tackle? Whatever happens, good luck!

Sounds like you may be correct. More information is definitely required in order to recommend the proper product.

However, R would definitely be my go-to choice when someone is asking about SPSS/SAS. Speaking of that, being a SAS guy, I really need to take the time to get some R experience.

Anyone have decent recommendations, aside from R's own website, for a quick start when you're a SAS geek?

Blog and Book for SAS to R (2)

eldavojohn (898314) | more than 2 years ago | (#38077106)

Anyone have decent recommendations, aside from R's own website, for a quick start when you're a SAS geek?

This blog [statmethods.net] explains some of the stuff you do in R and, as he does it, compares it to SAS.

Example:

Unlike SAS, which has DATA and PROC steps, R has data structures (vectors, matrices, arrays, dataframes) that you can operate on through functions that perform statistical analyses and create graphs. In this way, R is similar to PROC IML.

And here's an entire book on the topic [google.com] (although it may be difficult to find)!

Re:R or WEKA ... Wait, What Exactly Are You Doing? (1)

bukharin (344329) | more than 2 years ago | (#38079928)

Anyone have decent recommendations, aside from R's own website, for a quick start when you're a SAS geek?

About Quick-R

R is an elegant and comprehensive statistical and graphical programming language. Unfortunately, it can also have a steep learning curve. I created this website for both current R users, and experienced users of other statistical packages (e.g., SAS, SPSS, Stata) who would like to transition to R. My goal is to help you quickly access this language in your work.

http://www.statmethods.net/index.html [statmethods.net]

R - There is nothing that beats it on any platform (1)

G3ckoG33k (647276) | more than 2 years ago | (#38077168)

R

There is nothing that beats it on any platform. Some links:

http://www.sr.bham.ac.uk/~ajrs/R/r-gallery.html [bham.ac.uk]
http://addictedtor.free.fr/graphiques/index.php [addictedtor.free.fr]
http://opencpu.org/ [opencpu.org]
https://r-forge.r-project.org/ [r-project.org]
http://hlplab.wordpress.com/ [wordpress.com]
http://rseek.org/ [rseek.org]
http://www.r-bloggers.com/ [r-bloggers.com]

Re:R or WEKA ... Wait, What Exactly Are You Doing? (0)

Anonymous Coward | more than 2 years ago | (#38077202)

R is a great choice. You may want to use it as a PostgreSQL stored-procedure language; see http://www.joeconway.com/plr/
With this combination, your data can be loaded and stored in an excellent open source database and the analysis done with R. It is then an easy task to write some web applications to report the results.

Re:R or WEKA ... Wait, What Exactly Are You Doing? (1)

FhnuZoag (875558) | more than 2 years ago | (#38077302)

In addition, Mondrian is a good complement to R for some interactive data visualisations: http://rosuda.org/mondrian/ [rosuda.org]. The OP really needs to make clear what he wants to do, though.

Re:R or WEKA ... Wait, What Exactly Are You Doing? (0)

Anonymous Coward | more than 2 years ago | (#38077322)

R can read in .csv files, with a few caveats around spaces in the column headings (take them out) and missing data (leave cells empty).
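One way to apply those caveats automatically before handing files to R is a small pre-processing pass over the header. A hypothetical sketch in plain Python (R's own read.csv would similarly convert spaces in names to dots via make.names; the column names here are invented):

```python
# Apply the parent's caveats before importing a CSV into R:
# replace spaces in column headings (here with dots, R-style) and
# leave missing cells empty so R can treat them as NA.
# Column names are invented for illustration.
def clean_header(csv_text):
    """Rewrite the first line so column headings contain no spaces."""
    lines = csv_text.splitlines()
    header = [h.strip().replace(" ", ".") for h in lines[0].split(",")]
    return "\n".join([",".join(header)] + lines[1:]) + "\n"

raw = "students helped,items circulated\n12,340\n8,\n"
print(clean_header(raw))
```

Note the second data row leaves the missing count as an empty cell rather than a zero, which is exactly what R's missing-data handling wants.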

Check out Quick-R (http://www.statmethods.net/) for a nicely laid-out introduction that gives some idea of the syntax. The R graph gallery shows R's versatility as plotting software: http://addictedtor.free.fr/graphiques/.

There are also quite a few books on R, including at least 2 by O'Reilly (R in a Nutshell and the R Cookbook).

Another advantage is that it will load and run on a wide variety of OSs - I've run it off Windows Vista and 7 (including off a USB stick), several generations of OSX, and Xandros and Ubuntu 10 on an eeePC.

Re:R or WEKA ... Wait, What Exactly Are You Doing? (1)

Anonymous Coward | more than 2 years ago | (#38078452)

Try SOFA (www.sofastatistics.com) alongside R. SOFA (Statistics Open For All) focuses on making some of the most important statistical tests easy to use and understand. It also has attractive charting and report tables. Disclosure - I am the lead developer of SOFA.

Re:R or WEKA ... Wait, What Exactly Are You Doing? (0)

Anonymous Coward | more than 2 years ago | (#38078942)

How about Netlib [netlib.org] ? Wait, they have a library for statistics and not statistics for a library.

Ultimately, you are a librarian and not a statistician. If you have a background in mathematics that includes statistics, then you might be able to use the "better" products. The problem, though, is that the person interpreting the results will also require an understanding of what the pretty charts mean. It seems unreasonable to send people to a statistics course just to understand the kind of data you are collecting.

I do not know how the statistics are being stored or calculated now. If there is a database involved, most of those can do basic statistics directly in the query [mysql.com] .
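The point above in miniature: if the counts already live in a database, basic descriptive statistics are one query away. A sketch using the SQLite engine bundled with Python (the table and column names are invented):

```python
# If the numbers already live in a database, basic statistics are a
# single aggregate query. SQLite ships with Python; the table and
# column names here are invented.
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE circulation (month TEXT, checkouts INTEGER)")
con.executemany(
    "INSERT INTO circulation VALUES (?, ?)",
    [("Jan", 120), ("Feb", 95), ("Mar", 143)],
)

# COUNT, AVG, and MAX computed directly by the database.
row = con.execute(
    "SELECT COUNT(*), AVG(checkouts), MAX(checkouts) FROM circulation"
).fetchone()
print(row)
```

MySQL and PostgreSQL support the same aggregates (plus standard deviation), so the reporting layer only has to format what the query returns.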

The more in-depth statistics programs are often used by people who want to spend large amounts of time analyzing data. Though they can produce standard charts quickly, it often takes a bit of exploring to find a way to convey the desired message.

"Figures don't lie, but liars figure." - Mark Twain

Re:R or WEKA ... Wait, What Exactly Are You Doing? (3, Informative)

kiwigrant (907903) | more than 2 years ago | (#38079486)

Try SOFA (http://www.sofastatistics.com/ [sofastatistics.com] ) alongside R. SOFA (Statistics Open For All) focuses on making some of the most important statistical tests easy to use and understand. It also has attractive charting and report tables. There are also videos, on-line documentation, and direct support from the developer. Disclosure #1 - I am the lead developer of SOFA. #2 I already posted accidentally as AC

Re:R or WEKA ... Wait, What Exactly Are You Doing? (2)

demonbug (309515) | more than 2 years ago | (#38079720)

I second R, and would also suggest adding R Commander [mcmaster.ca]. It adds a fairly usable GUI that simplifies lots of common tasks while maintaining the flexibility of R.

Re:R or WEKA ... Wait, What Exactly Are You Doing? (1)

magisterx (865326) | more than 2 years ago | (#38080034)

I second the suggestion of R. I have only dabbled with it, but it is quite powerful and has a great community. You might also want to consider something a little more general purpose though. Python with the NumPy and SciPy packages can handle just about any statistical problem you want to consider and it has the versatility to do a whole lot more, such as handle any intermediate steps. It is completely free and you can download an excellent complete package at http://code.google.com/p/pythonxy/wiki/Welcome [google.com]

Re: What Exactly Are You Doing? (0)

Anonymous Coward | more than 2 years ago | (#38080056)

I suspect that what the original poster is doing is trying to cope with bad management.

Anyone who demands complex statistical analysis from a small academic library is creating make-work that will most likely result in diversion of resources from more meaningful tasks.

Subservience to management hubris is probably being prioritized over service to patrons, or they would have assigned additional staff (some math students, perhaps?) to do this work, and not dumped it on the librarians.

The phrase paralysis by analysis [wikipedia.org] comes to mind....

This may be a bad idea (2)

bluefoxlucid (723572) | more than 2 years ago | (#38076828)

I find that libraries carry a lot of common information and not so much uncommon information. This sort of muckery seems to encourage concentrating the collection into a smaller and smaller realm, constantly sorting out first the never-used, then the minimally-used, to maximize volume of return but minimize the use of the library as a haven for obscure and long-forgotten knowledge. Effectively, like burning some books while not burning others, it removes knowledge.

As with all things, there must be balance. A library that doesn't increase its holdings of more useful texts is less immediately useful; although if you removed all the most-used texts, you would have an interesting outcome ... the obscure and oft-overlooked need retention, too.

Re:This may be a bad idea (2)

Galaga88 (148206) | more than 2 years ago | (#38077148)

Libraries don't necessarily enjoy removing materials from the collection, but the two main reasons to do so are to make sure we have current/accurate materials and make room in our always limited shelf space. (The first is of presumably higher importance in an academic library.)

Unless libraries can get an unlimited budget for expansion of their physical space or off-site archives, weeding materials will be a necessary evil.

Here's a huge list (1)

narcc (412956) | more than 2 years ago | (#38076852)

Try this giant list [wikipedia.org]

From personal experience, I can recommend WINKS. It's ridiculously easy to use.

SAGE (2)

MetalliQaZ (539913) | more than 2 years ago | (#38076880)

Sage (formerly SAGE?) is an open source mathematical package that includes statistical functions. I wanted to add that to the usual mentions of R, etc.

However, are you sure this is what you want? It sounds to me like your real problem is that you have too much data to store. If you're currently using Excel to process your data, and it has been working except that you are running out of space, perhaps what you really need is a database, like Access. If you want OSS, you can probably try LibreOffice, or engage a local student to design a web based system based on MySQL.

already have? (0)

Anonymous Coward | more than 2 years ago | (#38076896)

Have you checked what's already available on the institution's computers? Many have SAS and other packages site-licensed. SAS can certainly do small tasks, but it's the most versatile and powerful, if not exactly friendly and intuitive. If you have to acquire something and don't need heavy-duty tools, I'm thinking you might fare well with R, also. It's free, so if it doesn't work out, no problem, and you'll find clues scattered all over the 'net. Also, some academic departments may have favorite tools which might not be exactly what you want but are close enough, and you know where to find help.

SAS is expensive (1)

KernelMuncher (989766) | more than 2 years ago | (#38076906)

SAS is a great package but is probably prohibitively expensive. An open source version like R is probably more appropriate.

A good database? (2, Interesting)

Anonymous Coward | more than 2 years ago | (#38076908)

Hear me out. We deal with about 3 million data-producing elements and track in real-time to near-real-time. We ingest everything into MySQL (via macros, scripts, tools, etc.) and normalize the data on the way in. For analysis we simply query. Those queries may have their outcome displayed in a simple report generator, or (more often than not) via HTML5 Canvas graphs/charts, Cacti graphs, etc. What we're doing doesn't lend itself well to a SAS type solution. If you could use SAS for what you're doing, this probably wouldn't work for you.

PSPP (5, Informative)

Geste (527302) | more than 2 years ago | (#38076946)

Look at the free SPSS work-alike PSPP. http://www.gnu.org/software/pspp/ [gnu.org] Sounds like R might be a bit much for your needs.

MYSTAT (or SYSTAT) (1)

dereference (875531) | more than 2 years ago | (#38079772)

Sounds like R might be a bit much for your needs.

Agreed. Another good alternative is MYSTAT [systat.com] , the free "student" version of SYSTAT. Note also that many academic institutions negotiate site licenses for SYSTAT, so you might already have the full version available to you.

PowerPivot maybe (2)

AaronLS (1804210) | more than 2 years ago | (#38076996)

Depending on the type of "analysis" you might be better off with something like PowerPivot. There's a lot that you can probably glean from your data without doing sophisticated statistics, but instead using PowerPivot to slice/dice/summarize/chart your data in different ways. It is easiest to use if you structure your data in a data warehouse/star schema fashion.
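For those without PowerPivot, the same slice/dice/summarize idea can be sketched in plain Python with only the standard library; the sample records (department, month, students helped) are made up for illustration:

```python
from collections import defaultdict

# Made-up sample records: (department, month, students_helped)
records = [
    ("reference",   "Sep", 40),
    ("reference",   "Oct", 55),
    ("circulation", "Sep", 120),
    ("circulation", "Oct", 90),
]

# Pivot: rows = department, columns = month, values = sum of students helped.
pivot = defaultdict(dict)
for dept, month, n in records:
    pivot[dept][month] = pivot[dept].get(month, 0) + n

for dept in sorted(pivot):
    print(dept, dict(sorted(pivot[dept].items())))
# circulation {'Oct': 90, 'Sep': 120}
# reference {'Oct': 55, 'Sep': 40}
```

Swapping which field plays the row, column, or value role is the whole "slice and dice" trick; a star-schema layout just makes those fields easy to pick out.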

What output do they want and what answer? (3, Informative)

vlm (69642) | more than 2 years ago | (#38076998)

Blue-skying the toolset is not gonna work. Figure out what output they want, then figure out what tools can generate that output.
If the most important thing is inserting pretty graphs into newsletters, that's one thing.
If the most important thing is hard-core data warehousing analysis (for a library?), that's another thing.

The other thing is: what answer do they want? They may just be looking for data to back up an unpopular decision, or to glorify themselves by demonstrating their amazing management talents. So figure out what that is (by asking them?) and help them get the data they want. Don't give them a graph of declining circulation if they're trying to emphasize their brilliant leadership. Don't give them a graph of increasing student help if they're trying to justify downsizing.

Re:What output do they want and what answer? (1)

AaronLS (1804210) | more than 2 years ago | (#38077252)

Agreed. A lot depends on what you want to accomplish. "Analysis" can be completely different beasts from project to project. The term "analysis" is kinda thrown around loosely and encompasses a lot of things. So it's important to not dive into the analysis if you don't have a very specific goal in mind.

Stick with Excel (2, Insightful)

syousef (465911) | more than 2 years ago | (#38077008)

Seriously, stick with Excel. You and anyone who comes after you would need to learn whatever statistical package you introduce. That is either overkill for the kind of data you're collecting and analysing, or it's a full time job requiring specialist knowledge for which they should be hiring someone else.

Excel has a few bugs but for the most part it's very capable. Ensure you run the service packs and can install the addons that come with it (analysis pack). Get them to send you on advanced short courses for Excel and Statistics. If there isn't that kind of commitment there's no room for any statistical package.

Almost all Ask Slashdot stories that are work-related can be answered the same way - bad idea: you're already out of your depth, and if you can't be bothered to Google for the information, the project is doomed.

Re:Stick with Excel (1)

bogaboga (793279) | more than 2 years ago | (#38077128)

Excel has a few bugs but for the most part it's very capable.

Care to name some of those bugs? I have not come across a single one!

Do NOT stick with Excel (5, Informative)

Anonymous Coward | more than 2 years ago | (#38077170)

Excel and other spreadsheets suck at stats:

* Burns, P. (2005). Spreadsheet Addiction. [burns-stat.com]
* Cryer, J. (2001). Problems with using Microsoft Excel for Statistics (PDF) [uiowa.edu] .
* Pottel, H. (n.d.). Statistical flaws in Excel (PDF) [coventry.ac.uk]
* Practical Stats (n.d.), Is Microsoft Excel an Adequate Statistics Package? [practicalstats.com]
* Heiser, D. (2008). Errors, faults and fixes for Excel statistical functions and routines [daheiser.info]

For a more comprehensive and technical discussion, see the papers by Yu (2008); Yalta (2008); and McCullough & Heiser in Computational Statistics and Data Analysis 52(10).

Re:Do NOT stick with Excel (0)

Anonymous Coward | more than 2 years ago | (#38078018)

More complete references to the above papers:

Yu-Sung Su: "It’s easy to produce chartjunk using Microsoft Excel 2007 but hard to make good graphs"

A. Talha Yalta: "The accuracy of statistical distributions in Microsoft Excel 2007"

B.D. McCullough, David A. Heiser: "On the accuracy of statistical procedures in Microsoft Excel 2007"

All published in "Computational Statistics and Data Analysis" [elsevier.com] , 2008

Re:Do NOT stick with Excel (1)

iroll (717924) | more than 2 years ago | (#38078188)

Did you even read the articles you linked? From Pottel:

My overall assessment is that while Excel uses algorithms that are not robust and can lead to errors in extreme cases, the errors are very unlikely to arise in typical scientific data analysis. However, I would not advise data analysis in Excel if the final results could have a serious impact on business results, or on the health of patients. For students, it's my personal belief that the advantages of easy-to-use functions and tools counterbalance the need for extreme precision.

Emphasis mine. I highly doubt that the OP's data require more than a couple of significant figures of precision. While their stats could influence resource allocation, differences of a few percent are unlikely to be deal-breakers--think about it; the library is likely to be dealing with budget items that range in the thousands of dollars, probably in blocks. You're not going to accidentally budget for a whole class based on a wiggle of a percent in attendance.

Re:Stick with Excel (0)

Anonymous Coward | more than 2 years ago | (#38077452)

Agreed. While we tech people are experts at creating situations where an employer can't afford to do without us, that makes leaving a job down the road very hard. Remember KISS, and document everything you do so that someone else can come in later on and pick up where you left off.

One other option is a Microsoft Access Database. Not open source, but you can do just about anything with a relational database & MS Access makes building queries for the novice much easier than using SQL. You can even get very fancy with user-interfaces and reports using VBA code if you want. I work at a shop that does a lot of statistical analysis. We use SPSS & SAS but I've found that for managing data long-term, it's hard to beat building an MS Access database in terms of development speed and simplicity of use & maintenance.

Re:Stick with Excel (0)

Anonymous Coward | more than 2 years ago | (#38077874)

1) Most Librarians don't change jobs every two years like programmers.
2) Excel caps out at a certain level with complex statistical analysis. You can buy add-ons which do more, but that's what this librarian is trying to avoid.
3) These questions and discussions are very useful. Actually listening to what people have to say is sometimes more useful than a page of google results. It can take hours to evaluate each and every one of twenty possible hits from Google. If you hear someone say "this product is good but it doesn't do X" or "this product has a lousy user interface but great X capabilities" that's all valid information.
4) Stop being a curmudgeon. ;)

R and Python (Rpy2) (3, Interesting)

mpetch (692893) | more than 2 years ago | (#38077026)

I have grown accustomed to doing statistical analysis using Python and R using http://rpy.sourceforge.net/rpy2 [sourceforge.net]

Access (0)

Anonymous Coward | more than 2 years ago | (#38077038)

I hate to toe the Microsoft line here, but I'd go with Access in this case.

True, SPSS and SAS are statistical analysis packages, but I think they're far beyond what you're looking for. You haven't mentioned T-tests, F-tests, multi-variable regression, etc. That's what those packages are for, and I doubt your administrators even know what they are.

It sounds like you're after basic aggregation ("how many fiction books were checked out in July?", "how many overdue notices do we have to send out, on average, before they bring the book back?", etc.). Access does these just fine, and the built-in wizards will probably help you get more use out of it. You'll get better ability to select what it is you're counting than you would with Excel (want to know the percentage of books in your collection with more than 100 pages that get checked out more than twice per year? Access will do that easily. Excel... you can do it, but it's a lot of filtering, cutting/pasting, etc.)
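Questions like these become one-line aggregates in SQL, whether in Access or any other database. Here is a hedged sketch of the page-count example using Python's built-in sqlite3; the `books` schema and the numbers are invented:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE books (title TEXT, pages INTEGER, checkouts_per_year INTEGER)")
conn.executemany("INSERT INTO books VALUES (?, ?, ?)", [
    ("A", 350, 5), ("B", 80, 9), ("C", 210, 1), ("D", 150, 3),
])

# Percentage of >100-page books checked out more than twice per year.
# In SQLite a comparison yields 0 or 1, so SUM() counts the matches.
pct, = conn.execute("""
    SELECT 100.0 * SUM(checkouts_per_year > 2) / COUNT(*)
    FROM books WHERE pages > 100""").fetchone()
print(round(pct, 1))  # 66.7
```

The same query runs unchanged against a table with a hundred rows or a hundred thousand, which is exactly where spreadsheet filtering starts to hurt.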

PDL (1)

Raiford (599622) | more than 2 years ago | (#38077046)

Check out PDL (Perl Data Language). It may not be the most convenient solution but it's free and has a great, informed and responsive user group.

Did you want to keep it simple? (0)

Anonymous Coward | more than 2 years ago | (#38077066)

If you're already proficient in Excel, I suggest adding the Microsoft Office data analysis pack if you just need some statistical tools inside of Excel. This adds various ANOVA and regression tools via an easily accessible Microsoft add-in (you may need to have your Office installation CD handy). http://office.microsoft.com/en-us/excel-help/load-the-analysis-toolpak-HP010021569.aspx contains the instructions. This will be your best and simplest bet if you just need to look at correlation, covariance, descriptive statistics, histograms, and the like.

Alternatively, you can get pretty complicated with all this other fancy stuff or stay in Excel by using RExcel, a free package that allows R Statistical Package to play nice with Excel as the data input interface.

Cheers!

Liberals?! (0)

Anonymous Coward | more than 2 years ago | (#38077098)

I read this first as: Statistical Analysis Packages For Liberals. Thinking it was some kind of lie with statistics deal going on...

Maybe a slightly different tool (3, Interesting)

LWATCDR (28044) | more than 2 years ago | (#38077142)

It almost seems like you are not doing statistics as much as creating reports from data.
Maybe you should be using a database instead of a spreadsheet or a statistics program.
The uber-geek way would be to set up a LAMP server and create a web-based system.
The more convenient way would be something like Access.
You can then use Excel to manipulate the data as needed or the database program.

In the end, if you know Excel you may want to stick with it. I see people use Excel for databases all the time. Drives me a bit nuts, but sometimes whatever works is just fine.

Re:Maybe a slightly different tool (1)

jgrahn (181062) | more than 2 years ago | (#38077574)

It almost seems like you are not doing statistics as much as creating reports from data. Maybe you should be using a database instead of a spreadsheet or a statistics program.

I don't see why even a database would be needed. "Increasingly the administration is asking our department to collect data on various aspects of our activities, class taught, students helped, circulation, collection development, and so on."

Seems to me that information already exists in a library, and the report generation is the only thing missing. And possibly looking into the database on the 1st of every month and writing down the number of books on a piece of paper.

Or a reply to the administration "Stop asking for these statistics and let me do my job!"

Re:Maybe a slightly different tool (2, Informative)

Anonymous Coward | more than 2 years ago | (#38077786)

Agreed. Access is a sh*tty database but you seem to be saying that volume is your problem not functionality. However if you've got an Excel license you've probably got an Access license already and Access will allow you to re-use a lot of what you've put together in Excel while handling the volume of data better.

Unfortunately I also agree with the other posters, if you're after more relevant advice you really need to give a bit more background on:
  - your skill set (Excel user/VBA hacker/Stats major/Hardcore programmer)
  - what do you mean by 'statistical analysis'? This is too broad a description
  - the data you're using (volumes, sources, complexity)

Another option, if volume is your only problem, is to not use all the data. Take a random sample and work from that - this is common practice even for people/orgs with high-end stats packages.
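The sampling suggestion takes only a couple of lines with, for example, Python's standard library; the transaction records here are placeholders:

```python
import random

# Pretend these are 100,000 transaction records exported from the ILS.
records = [f"transaction-{i}" for i in range(100_000)]

random.seed(42)                         # fixed seed so the sample is reproducible
sample = random.sample(records, 1_000)  # simple random sample, no replacement

print(len(sample), len(set(sample)))  # 1000 1000 -- no duplicates
```

Any summary computed on the sample (means, proportions, trends) then estimates the full dataset's value at a fraction of the processing cost.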

Re:Maybe a slightly different tool (0)

Anonymous Coward | more than 2 years ago | (#38078894)

I second this! It seems to me a pretty good solution - export the current database to an online one, for instance MySQL. The main statistical packages, such as R, provide means to interface with those databases. Also, you'd have an easily maintainable DB which could be queried by users of the library through a web interface (there are some pretty interfaces for MySQL out there)

Re:Maybe a slightly different tool (0)

Anonymous Coward | more than 2 years ago | (#38080050)

Completely agree... What the OP is trying to get comes out of correlating trends from a DB, not so much from stats. Excel would be good to start; coding a web app relying on a DB (probably the one where the primary info comes from), once you know what you need, is the following step.

A suggestion... (2)

esme (17526) | more than 2 years ago | (#38077210)

I suggest you post your question to the code4lib mailing list [nd.edu] . It's going to get you much more informed and practical advice. You might even find some people who already have a good workflow who will share their tools.

-Esme

code4lib (1)

oneiros27 (46144) | more than 2 years ago | (#38079210)

Agreed ... odds are, they're not running homebrewed circulation software and someone in the library community has tried to extract metrics from whatever they're using.

Matlab or Octave (1)

gnu-sucks (561404) | more than 2 years ago | (#38077212)

Depending on how large your dataset is, you may have luck using Matlab (or the open-source GNU Octave). These programs will let you do *whatever* you want with the data (plotting, correlation, FFT, etc).

With at least Matlab, there are some MySQL plugins available that will let you get data out of your database and into arrays rather quickly. And of course, both matlab and gnu octave let you import csv and plaintext datafiles.

Here is the matlab plugin I have used very successfully (and it's open source. No idea if it would work with octave):
http://www.cims.nyu.edu/~almgren/mysql/ [nyu.edu]

You will need some background with math, statistics, and programming to effectively do this. If you don't have the skills, learn them or pay up for some overpriced commercial product...

free SPSS clone (1)

Anonymous Coward | more than 2 years ago | (#38077216)

I love R, but if you want something that looks more like SPSS, you could try the free SPSS clone PSPP:
http://www.gnu.org/software/pspp/

Scipy or Numpy with Python (0)

Anonymous Coward | more than 2 years ago | (#38077218)

Typically for simple statistics, I do better staying in Python and using the NumPy package for calculation. SciPy provides a mountain of extra packages, and probably has a nice setup for exactly what you want. Python with NumPy is free, and SciPy is a free resource as well.

Minitab (0)

Anonymous Coward | more than 2 years ago | (#38077306)

Minitab

python + scipy (1)

rla3rd (596810) | more than 2 years ago | (#38077320)

If a full stats package is a bit heavy, try Python + http://www.scipy.org/ [scipy.org]
Below is using the IPython shell

In [1]: import scipy

In [2]: x = [1,3,6,8,9,4,9,0,5,3,6,8,6,8]

In [3]: scipy.mean(x)
Out[3]: 5.4285714285714288

In [4]: scipy.std(x)
Out[4]: 2.7957693986829897

and if you need more than that you can really delve into its stats submodule http://www.scipy.org/doc/api_docs/SciPy.stats.html [scipy.org] .
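If installing SciPy is itself an obstacle, Python's built-in statistics module (standard since Python 3.4) reproduces these basics; note that scipy.std above uses the population formula, which corresponds to pstdev:

```python
import statistics

x = [1, 3, 6, 8, 9, 4, 9, 0, 5, 3, 6, 8, 6, 8]

print(statistics.mean(x))    # 5.428571..., same as scipy.mean above
print(statistics.pstdev(x))  # population std dev, matches scipy.std: 2.7957...
print(statistics.stdev(x))   # sample std dev (n-1 denominator), ~2.90
```

For anything beyond descriptive statistics (t-tests, regression), the scipy.stats submodule linked above is the next step up.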

What is your Integrated Library System? (2, Insightful)

Anonymous Coward | more than 2 years ago | (#38077324)

What is your ILS? Depending on what it is, you may already have access to just about everything you need there, along with Excel. Atriuum from Booksys has wonderful features like the ones you are asking about, plus record tracking, and it exports to Excel very well. Voyager from Ex Libris had wonderful integration with Access, and my boss could pull out some amazing statistics with it.

If you don't have an ILS then seriously look at Atriuum as they are great for the smaller libraries.

lordjim AT gmail DOT com

Do you have a mathematics department? (0)

Anonymous Coward | more than 2 years ago | (#38077344)

You may already have access to the tools you need through their licensing. Also, they may be able help get you going.

Two tools I made for this... (2)

njvack (646524) | more than 2 years ago | (#38077346)

OK, this is a horribly shameless self-plug, but hey, it's directly relevant. I started two projects aimed at tracking reference statistics: Libstats [google.com] , which is PHP-based and open-source. I'm also one of the founders of Gimlet [gimlet.us] , which is hosted and closed-source, but provides a similar workflow.

If you're looking to spend some time delving in code, Libstats is looking for maintainers -- I'm no longer working in libraries, so it's largely orphaned.

Less Is More +4, Seditious (0)

Anonymous Coward | more than 2 years ago | (#38077360)

( Before you make the leap into "statistical" "packages" ):

Would a spreadsheet program satisfy your needs?

Yours In Ulanbator,
Kilgore Trout

Tableau Public (0)

Anonymous Coward | more than 2 years ago | (#38077552)

Great package for visualizing data. Easy to use, with great online videos and training.
http://www.tableausoftware.com

How large is "large"? (1)

Nutria (679911) | more than 2 years ago | (#38077610)

What you think is large might be trivial even for OpenOffice/LibreOffice.

Also, the real solution might be to automate data collection and storage in a database. Manipulation would then sort itself out.

If you're at a University, then you should go to the Math Dept and talk to some Statistics grad student or maybe even an econometrics grad student in the College of Business. Heck, there's probably Comp Sci undergrads looking for a project to add to their resume.

Evergreen (0)

Anonymous Coward | more than 2 years ago | (#38077670)

Perhaps collecting the data in a standard open source software system would be a helpful first step? http://open-ils.org/ [open-ils.org]

I find that ... (2)

PPH (736903) | more than 2 years ago | (#38077694)

... rand() serves most of my statistical needs.

Try the JMP demo (2)

jollespm (641870) | more than 2 years ago | (#38077726)

I use and like JMP from SAS. They offer a free 30 day demo and I think it does a good job at data visualization and statistical modeling, or as they call it, discovery. It will interface with SAS, R, Excel along with various database packages for additional capability that may not exist in the core product. I found it pretty easy to pick up with a fairly active user base to help get started.

Re:Try the JMP demo (1)

jfb2252 (1172123) | more than 2 years ago | (#38079466)

I agree whole-heartedly. I've been using JMP since version 2.0. Great for exploratory data analysis. SAS differentiates it from SAS proper by limiting the data sets it can deal with to RAM, but with 4GB of RAM common these days that's not likely to be an impediment.

Almost twenty years ago I compared the sort routine in JMP to Excel's. 30K rows, 28 columns, sort on 3 columns. JMP took about 1% of the clock time Excel did.

Academic pricing is pretty good.

DeskTracker (0)

Anonymous Coward | more than 2 years ago | (#38077860)

Desk Tracker may be too simple a program for what you need, but I have seen it used and it does work. It even has a report section.

I would suggest Rapid-I's tool suite. (1)

sgtrock (191182) | more than 2 years ago | (#38078050)

Their product list is here [rapid-i.com] . In particular, I think you would be interested in RapidMiner and RapidAnalytics. Wikipedia has a good overview of RapidMiner [wikipedia.org] .

Video tutorials for both RapidMiner [rapid-i.com] and RapidAnalytics [rapid-i.com] are available on their website. Those videos are a great way to get a good sense of what the product line is capable of. Searching on YouTube will find plenty more that focus on specific use cases and more advanced functionality.

All of their software is dual-licensed, with a GPL version and a closed-source license available. GPLed versions of their software also have support contracts available for everything from basic troubleshooting to full implementation. That includes both Rapid-I itself as well as partnerships with contracting companies in the U.S. and elsewhere. In addition, Rapid-I hosts a community forum that is well run and has active developer input.

I've been using RapidMiner myself for 3 years for smaller projects. I have had occasion to use all of the free resources that I mention above. I have found them all to be very solid. The developers in particular have proven themselves to be knowledgeable and very polite. (IME, that's only to be expected of co-founders who happen to be German. :-) )

Rapidminer! (0)

Anonymous Coward | more than 2 years ago | (#38078078)

Rapidminer http://rapid-i.com/content/view/26/84/lang,en/ has a free, open-source community edition. It has an insanely easy-to-use, slick, very well-designed graphical interface, and there are a lot of nice video tutorials (accessible directly from the help drop-down as well as from the website) for the basic functionality, to get you started. It already contains most WEKA functionality built-in, as well as a lot of other freely available algorithms of the type that you could also implement via R packages / SPSS / SAS. It can also interface directly with R via its R extension, should you need any additional R functionality that Rapidminer doesn't have (my guess is you won't for your project). I've had zero problems installing and running it on Ubuntu (from Edgy to Natty) or on Windows (from XP to 7 Pro); have not tried Mac. It also updates itself automagically without snags via its own update mechanism on both of these platforms; it just asks you to check "ok" for the GPL for each module. It has a schweet graphical wizard for importing Excel, tab-delimited text, CSV, whatever, and telling it what the headers etc. are, and it also has some repository/database functionality, but honestly I haven't fooled with that so I can't comment.

R with RKWard (4, Informative)

binarstu (720435) | more than 2 years ago | (#38078226)

I will echo the support for the open-source statistics package R. R is incredibly powerful, and in the natural sciences it is fast becoming the standard statistics software.

I will also echo the sentiment that, by itself, R is fairly low-level and typically requires at least some simple programming to get what you want.

However, there is a very nice graphical front end for R called RKWard (http://rkward.sourceforge.net/). With RKWard, importing and exporting data, running basic analyses on it (descriptive statistics, linear regression, t-tests, etc.), and producing basic graphs is very straightforward and does not require detailed knowledge of the R language. Plus, RKWard is also a nice development environment for writing R code, so if you want to take your project further, you can easily do so. So, I'd recommend giving RKWard + R a look.

Find out the real need and focus on that (2)

fredrikv (521394) | more than 2 years ago | (#38078366)

It seems to me that all you need is descriptive statistics (change from last month, mean, min, max, etc and probably graphing). Using a general spreadsheet application like Excel or Calc will do the job just fine. Remember that Excel is designed to support business calculations and what you are asked to provide is exactly that! Using a dedicated statistics software for this task (in your environment) is a waste of resources. Full stop.

However, the solution may not be straight-forward to solve in Excel or any other program. In my experience there are two main reasons:

1. The request for data is unclear.
Why do they "increasingly want data on various aspects of our activities"? It could be that the data you have provided so far has not supported their decisions. Can the questions they really want answered actually be supported by the data you can provide? Meet up with the actual decision makers, or at least someone who knows what the statistics are actually used for, and ask them WHY they need it. Is it used to support resourcing? Is it used to describe changes? Not even a university administration creates statistics for no reason. Most likely, what they really want to know is a handful of numbers like "change from last month", "overall sum", "hours spent on teaching vs information searches".

Do this with an open mind. You will probably learn that many of the imperfections you see in the details are less important to them. When you know their true needs, suggest a package of data, graphs, a free-text report or whatever is suitable. If some parts are easy to provide, be clear about that. If something is more difficult to produce, tell them that it is possible but time-consuming and costly. Get their buy-in before you spend time on producing the output.

2. The raw data is not optimally formatted for the calculations
First of all, if raw data quality can be improved, do that first. Update forms used for feedback, ask for output in a specific format, etc. Then arrange the data and calculations in Excel to make them flexible and easy to read and troubleshoot. The trick is to structure your data and calculations in Excel in a way that is easy to follow visually and logically. In my experience it is very useful to use different tabs for data entry, data analysis and presentation.

It seems from your examples that your input will come from a variety of sources, both manually entered and output from other systems. To get it into Excel, create separate source data tabs where you can enter or paste your raw data. For each source data tab, create a "clean up and calculate" tab where you rearrange source data and make most of the calculations. If raw data is very far from optimal or calculations are complex you may want to use several tabs or even several workbooks for this. Then create presentation tabs where you present the results from calculations in a useful format.

I'm convinced you are suffering from both these problems. Attack them in numeric order and you are well on your way. And by all means, sign up for a course in advanced Excel that is suitable for your application. Best of luck!

Sofa (2)

zdammit (1143747) | more than 2 years ago | (#38078396)

http://www.sofastatistics.com/ [sofastatistics.com]

Re:Sofa (1)

Anonymous Coward | more than 2 years ago | (#38079054)

http://www.sofastatistics.com/ [sofastatistics.com]

Yes this might be what you are looking for to generate your reports. Free software and comes with video tutorials

GUI for R (0)

Anonymous Coward | more than 2 years ago | (#38078436)

I don't know if anyone else mentioned the Rattle GUI [wikipedia.org] for R. It can import data in a number of formats including .csv and will perform all sorts of common statistics including graph functions. The nicest thing is that as you try out new functions the interface automatically asks permission to download the appropriate CRAN packages. If there is any downside to learning R it is finding one's way through the extensive library of packages--Rattle eliminates this anxiety.

r with rattle (0)

Anonymous Coward | more than 2 years ago | (#38078600)

I think R with Rattle would be your best bet. Rattle gives a point-and-click interface. Also look at RStudio: it has a server version so you can run it in a browser, but its best feature is having the help panel beside the coding panel so you can look up syntax and options.

duphenix.com

R, Octave, Matlab (2)

Virtucon (127420) | more than 2 years ago | (#38078606)

I've used them all and in terms of engineering and academia, MATLAB seems to be where most theoretical prototyping is done. The license costs for academic/student use are reasonable but it's about $2K for a commercial single seat license. Octave is the MATLAB open source alternative and for most basic functions it does well however it doesn't have the extension packages available that MATLAB does.

My favorite and one I use all the time is "R" because it does have great open source community support and there's not a lot it can't do.

JMP!!! (0)

Anonymous Coward | more than 2 years ago | (#38078778)

JMP from SAS, Inc. would be perfect for what you are looking for.

JMP would be a sports car, while SAS is a huge semi truck. It is the most powerful point-and-click stats package you'll come across, i.e., no languages to learn. Everything is visual. However, it is powerful enough to have a scripting language that is comparable to a lite version of SAS. SAS, SPSS, R, and Matlab are overkill for what you are trying to do, like roasting a marshmallow with a rocket engine. You don't need the enterprise database integration, you don't need to access terabytes of data; what you need is a visual explorer of the small dataset you have (i.e., you are not analyzing consumer spending patterns across the U.S.; you are looking at a couple hundred thousand observations at most).

Coming from someone who sits in front of a computer 8 hours a day looking at data: when you don't know what trends you are looking for and you are just exploring, having a visual interface that will scream "look at me, there's an upward trend" is much more useful than having to worry about whether you have the proper syntax for PROC REG.

What does "anything complicated" mean? + gretl (1)

wfolta (603698) | more than 2 years ago | (#38079302)

As others have said, if you're mainly doing reports, stick with Excel or a database solution. Excel lets you look at your data from a variety of angles (pivot tables, etc), and has usable graphs. As usual, Microsoft has numerical issues, so you may get wrong answers under certain conditions, but hey, it's Excel.

What is it that "anything complicated" means? Fancy graphs? Fancy partitioning/aggregation of data? Modeling and forecasting? Summary statistics? Graphs that aren't fancy, but Excel doesn't provide?

An open source option that I haven't seen mentioned is gretl. It has a reasonable GUI and can make nice graphs (though not terribly customizable), give summary statistics, sample data in various ways, and do basic modeling. (It comes from an econometric world, so has quite a few time series capabilities.) If you need to do some things with time series, it would be helpful. (Though if you don't know what you're doing, it simply makes it easy to shoot yourself in the foot.)

Software! (1)

malaprohibita (924587) | more than 2 years ago | (#38079414)

I also work for a (relatively) small academic library, but our campus has free licenses for SAS and JMP. I had to go through hoops to get it (bureaucracy being what it is) but I use SAS all the time for inventory and usage data. It helps that I was a SAS programmer once upon a time, but I love it for its abilities to clean data as much as its statistical chops. Check around campus if you haven't done so already. You may find access to one or both of these to be easier than you think.

More friendly faces for R (0)

Anonymous Coward | more than 2 years ago | (#38079516)

I'm a working statistician with a fair amount of (data-oriented) programming experience. I can say that R is at least initially not that friendly to use and lacks some of the conveniences of the traditional statistical packages SAS, SPSS (PASW), and Stata. Initially, I preferred to just use SAS to transform the data and then import the final data into R for analysis. There are some things that R does that SAS does not, usually things that are on the bleeding edge of statistical methods. (The project I was working on involved a rare events logistic regression, available through the Zelig package.) There are some groups that have tried to put a nicer front end on R, though. RStudio and Revolution R both attempt to help with the interface a bit and are either free (RStudio) or free for academics (Revolution R).

I think a lot of whether or not R is right for you depends on what kind of analyses you're doing and how you get your data. If all you're doing is basic analyses and the data comes in a form that doesn't require extensive modification, then R might appear tricky at first but will likely fit your needs.
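The "how you get your data" point above usually comes down to light cleanup of a spreadsheet export before it reaches any stats package. A minimal sketch with the Python standard library's csv module; the column names and values here are hypothetical.

```python
# Read a CSV export from a spreadsheet and coerce the numeric column:
# the kind of light cleanup often needed before data reaches R or SAS.
import csv
import io

# Stand-in for an exported file; column names are hypothetical.
raw = io.StringIO("date,branch,checkouts\n"
                  "2011-10,Main,415\n"
                  "2011-11,Main,388\n")

records = []
for row in csv.DictReader(raw):
    row["checkouts"] = int(row["checkouts"])  # spreadsheet exports are all strings
    records.append(row)

print(sum(r["checkouts"] for r in records))
```

If the data already arrives this clean, most packages (R included) will import it without fuss; it is the messy exports with merged cells and footnote rows that make the transformation step painful.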

Rstudio (2)

rmcd (53236) | more than 2 years ago | (#38079664)

If you do go with R, be sure to check out Rstudio (rstudio.org), which is a very nice front-end for R.

In response to the posters who tell you that R is low quality because it's open source, I can tell you that's nonsense. I have Stata, Matlab, and R on my machine, and access to SAS on a research server. There are times to use each, but all else equal I use R. It's not trivial to learn, but it's a powerful high-quality piece of software, widely used in the statistics community. Whether it's appropriate for your use depends on you and the task. But it's great software.

stats and graphing package (0)

Anonymous Coward | more than 2 years ago | (#38079946)

Graphpad Prism

Yellowfin (1)

sproose (2509588) | more than 2 years ago | (#38080058)

Not open source, but it makes it very easy to generate reports off relational data sources. http://www.yellowfin.bi/ [yellowfin.bi]