
SKA Telescope Set To Generate More Data Than Current Net

timothy posted more than 2 years ago | from the recycle-the-bits-responsibly dept.

Data Storage 73

angry tapir writes "The forthcoming $2.1 billion Square Kilometre Array (SKA) radio telescope could generate more data per day than the entire internet when it comes online in 2020, according to the director of the International Centre for Radio Astronomy Research (ICRAR), Professor Peter Quinn. SKA — which Australia with New Zealand and South Africa are competing to host, and which will help the search for Earth-like planets, alien life forms, dark matter and black holes — will be 10,000 times more powerful than any telescope currently used. Slashdot has previously discussed the proposal to use 'Skynet' — a grid-computing-based solution for processing and storage."

73 comments

Uh oh. (1)

ErikZ (55491) | more than 2 years ago | (#36688958)

Cue the HAARP freaks in 3... 2... 1...

Re:Uh oh. (1)

davester666 (731373) | more than 2 years ago | (#36690976)

How can they afford this with Congress cutting the budget for everything except military spending? And they aren't yet so delusional as to be planning for an interstellar invasion.

This thing will blow through their ISP's data cap in the first couple of minutes, after which they'll be on the hook for more than $1/GB... Hell, it might even increment at a rate within 1 or 2 orders of magnitude of the national debt's rate of increase.

finally we can retire that... (1)

MoFoQ (584566) | more than 2 years ago | (#36688984)

finally we can retire that saying/meme about how the Internet is for porn... or that the mass storage market is driven by porn

unless...I guess we happen to be able to spot the alien equivalents via this SKA.....

Re:finally we can retire that... (0)

Anonymous Coward | more than 2 years ago | (#36689708)

We have to find the alien civilizations in order to become pioneers of interspecies porn.

Surprised it hasn't happened already. (1)

blair1q (305137) | more than 2 years ago | (#36689008)

Really, how much "data" is "generated" by the internet every day?

Sure, there's lots of traffic, but that's millions of copies of the same data.

The new data going on to the internet probably isn't too heinous in quantity.

And the summary blew the meme. It's not "generate more data per day than the internet", it's "generate more data per day than the earth does in a year, and conduct more internal networking traffic than the internet."

vastly, mindbogglingly big (0)

Anonymous Coward | more than 2 years ago | (#36689912)

Really, how much "data" is "generated" by the internet every day?

Sure, there's lots of traffic, but that's millions of copies of the same data.

Where do you draw the line between data that's entirely new and a copy? Would an MP3 file with its ID tags modified be classified as completely different? Does a web page count as entirely new traffic if the URL is different because of the session ID? Would you count each line of code in a program's source as a unique entity? Also, compressed data traffic ensures any larger chunk of a file is pretty much unique.

I find these kinds of imprecise estimations pointless, because I know they're made up to entertain, not to inform.

Re:Surprised it hasn't happened already. (1)

Gumbercules!! (1158841) | more than 2 years ago | (#36690206)

Depends if you're logging that traffic. I would imagine if you logged all internet traffic, which by strict definition would be considered "new data", it would dwarf whatever data this SKA can produce. Now admittedly, that logged data would be "mostly useless and never read" - but I imagine most of the data the SKA generates will be "mostly useless and never read", too.

Re:Surprised it hasn't happened already. (1)

blair1q (305137) | more than 2 years ago | (#36697254)

if you start logging all traffic, you'll quickly run out of space to store it.

if on the other hand you start logging only novel content, you'll take somewhat longer to run out of space to store it.

just ask google how their cache works.
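The "log only novel content" idea is essentially content-addressed deduplication: key every payload by a hash of its bytes and store it only the first time it appears. A toy sketch (the `DedupStore` name is made up for illustration; a real cache like Google's is vastly more sophisticated):

```python
import hashlib

class DedupStore:
    """Toy content-addressed store: identical payloads are kept only once."""
    def __init__(self):
        self._blobs = {}

    def put(self, data: bytes) -> str:
        key = hashlib.sha256(data).hexdigest()
        self._blobs.setdefault(key, data)  # store only if the content is novel
        return key

    def unique_bytes(self) -> int:
        return sum(len(b) for b in self._blobs.values())

store = DedupStore()
page = b"<html>the same cat picture, served a thousand times</html>"
for _ in range(1000):      # a thousand "copies" of the same page
    store.put(page)
store.put(b"something genuinely new")
```

A thousand identical fetches cost the store only one copy's worth of space, which is the crux of the traffic-versus-new-data distinction in this subthread.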

Re:Surprised it hasn't happened already. (1)

sFurbo (1361249) | more than 2 years ago | (#36692120)

Sure, not that much new information is made by the net, but it sure is represented by a lot of data. Or, that is how I would use the terms, YMMV.

SKA telescope you say? (0, Funny)

Anonymous Coward | more than 2 years ago | (#36689038)

I prefer the PUNK telescope myself.

SKA and other astronomy projects (2)

JoshuaZ (1134087) | more than 2 years ago | (#36689074)

Given that we just had a Slashdot article about how the space-based James Webb telescope is already on the ropes with Congress, http://science.slashdot.org/story/11/07/07/0038247/Congress-Dumps-James-Webb-Space-Telescope [slashdot.org], perhaps we should be worried that the same will happen to the SKA. Unlike Webb, the SKA is an international project, so it won't necessarily go down the tubes if the US backs out. Moreover, the US has backed out of European-led science projects before, often with very little warning. The SKA is going to allow some very interesting work. Among other things, it might be able to detect extraterrestrial life, either directly through radio signals (from intelligent life), which would be a really big deal, or more indirectly detect non-intelligent life through the analysis of extrasolar planets' atmospheres (such as the detection of large amounts of oxygen). The SKA will also be used for many other astronomy and astrophysics projects, such as the examination of supernovas. The SKA is very good science; let's hope that the penny-pinchers who repeatedly cut tiny science programs while leaving defense, Social Security and Medicare alone will not touch it. In the long run, science helps everyone.

Re:SKA and other astronomy projects (0)

Anonymous Coward | more than 2 years ago | (#36689488)

SKA has a lot of backing in Australia. The CSIRO invention of 802.11a WLAN was a direct offshoot of their radio astronomy program, and it has earned the government/CSIRO about $1 billion. I reckon that, for the near future, the government will probably fund radio astronomy in the hope of more spinoffs.

Re:SKA and other astronomy projects (1)

st0nes (1120305) | more than 2 years ago | (#36691224)

The SKA isn't all-or-nothing. It can start small with a few receivers (South Africa already has the MeerKAT array [wikipedia.org]) and scale when funding is available. It doesn't really matter if there are delays--good science can still be done with half an SKA.

Re:SKA and other astronomy projects (0)

Anonymous Coward | more than 2 years ago | (#36692828)

Similarly, Australia has the ASKAP array [wikipedia.org]. Either this or MeerKAT will be chosen as the site of the SKA, on February 29 next year.

Re:SKA and other astronomy projects (0)

Anonymous Coward | more than 2 years ago | (#36692698)

The US already backed out of the SKA. They've got observer status on the project, so they could theoretically re-enter it, but it's going ahead regardless.

Re:SKA and other astronomy projects (0)

Anonymous Coward | more than 2 years ago | (#36693730)

Herschel

10 years from now... (1)

voodoosteve (1045878) | more than 2 years ago | (#36689078)

Sure but think about how much more data the Internet generates today than it did 10 years ago. I'm sure we'll be able to cope in 10 years. Also, I know the LHC produces a ton of data. Does anyone know how it compares?

The FUTURE (0)

Relic of the Future (118669) | more than 2 years ago | (#36689080)

More data/traffic than the internet NOW, or than the internet will be doing in 2020? 'Cause let me tell you, the last 9 years have seen a pretty sharp increase...

Re:The FUTURE (0)

Anonymous Coward | more than 2 years ago | (#36689178)

It depends on how the caps go.
It's quite possible we'll be transferring the same amount as we do today.

Grid Computing??? (0)

Anonymous Coward | more than 2 years ago | (#36689102)

Is it just me, or are the statements

1) "could generate more data per day than the entire internet when it comes online in 2020" and
2)"Slashdot has previously discussed the proposal to use 'Skynet' — a grid-computing-based solution for processing and storage"

somewhat incompatible?

NBN (3, Interesting)

dakameleon (1126377) | more than 2 years ago | (#36689188)

(Warning: Australian content ahead!)

I hope this lays down a water-tight case for the NBN going ahead - or for the two projects acting as catalysts for each other. If there's one thing this is good at demonstrating, it's that future data requirements will outgrow the current infrastructure very quickly, and a project as far-sighted as installing FTTH throughout the country is justified by the unforeseen benefits it can enable.

(and bah humbug to anyone who thinks the SKA isn't justified to begin with!)

Re:NBN (0)

Anonymous Coward | more than 2 years ago | (#36689348)

It makes you wonder why we need to justify something as useful and grand as the NBN is anyway.

I feel that Australia is on an increasingly irreversible downward slope when it comes to the intelligence of the common man.

Back in the 1960s in Australia, people probably said "well, what use is television anyway?", but we all still bought television sets and made them in record numbers, not because we thought of the amazing technology but because our kids wanted to see Elvis.

Fact of the matter is, even in an area as wide and vast as Australia, we are running out of bandwidth on the radio spectrum, even on satellite broadcast technology, because most Australians live on the coastlines. So how in the future will we be able to get 4320p television broadcasts without needlessly increasing the compression ratio? The NBN.

In a country which thinks that the incredibly minuscule amount of bandwidth allocated to 1080p terrestrial broadcasting (DVB-T) is hot stuff, I'm not surprised.

Even when a 1080p broadcast on terrestrial television actually looks closer to 720p because of the massive amount of compression needed to fit into the channel's allocated frequency range, people still bought into it, because of Blu-ray. But I can tell you that even Blu-ray is severely limited: 50 GB per disc for 1080p? Give me a break. Any seasoned computer enthusiast can tell you where the compression block artifacts are, and the bigger the screen you buy, the worse this gets.

Everyone bought the idea that 1920x1080p is hot stuff, instead of questioning why they need to spend $3,500 on a television set for a technology that isn't even properly utilized in this country because of radio spectrum limitations, and for a resolution that I achieved in 1999 on a 19" Sony Trinitron CRT monitor. Why can't they do the same for the NBN or the SKA telescope?

Maybe we need Elvis to peer into the telescope and say some catch phrases.

Re:NBN (1)

BeanThere (28381) | more than 2 years ago | (#36691642)

That's absurd on the face of it: you don't need FTTH for the SKA, because the data isn't going to be distributed on a large scale to homes! For the SKA, you only need one fat pipe leading from the SKA out to the under-ocean cables.

Re:NBN (0)

Anonymous Coward | more than 2 years ago | (#36692538)

The NBN isn't just the last mile.

Re:NBN (0)

Anonymous Coward | more than 2 years ago | (#36702942)

Please, for the love of God, re-read what dakameleon actually wrote. He wrote that the mere possibility (not even a sure thing, but the mere possibility of) one single telescope project lays down a "watertight case" for nationalising the entire broadband network AND that it lays down a "watertight case" that it is "required" to install FTTH to every single home throughout the entire country - even though FTTH will help this project zero, and I mean zero, and zero, and I repeat, zero. Then tell me, please, that you can't just see that dakameleon clearly has some other vested interest in this nationalised FTTH network, and that he isn't just jumping on this as an excuse to use logical fallacies to give the vague and false impression that this somehow "proves" there is a "need" for the NBN. And the other contending country, just for interest, has a plethora of under-ocean fiber-optic cables either existing, or new ones currently being installed, and all laid down by the market, more than enough for the SKA telescope, not one is nationalised, proving in fact that dakameleon is speaking complete and utter hooey.

Re:NBN (0)

Anonymous Coward | more than 2 years ago | (#36692034)

By the time the NBN reaches its endgame you are going to wish you had never spoken those words. Australians want hospitals and schools that work, not a chunk of bling that will establish yet another Telstra-led monopoly in a decade or so. No matter which way you swing it, we have only so many people to pay for this massive lemon, and we have a system of government that hasn't delivered a successful project anywhere near this scale on time or under budget in my entire lifetime (40 years). The project is run by a senator with an abysmal understanding of IT. The cracks are already showing. I'd call the SKA far-sighted. The NBN, on the other hand, will be another slow-motion train wreck.

Re:NBN (1)

Canberra Bob (763479) | more than 2 years ago | (#36692884)

How many major infrastructure projects have been attempted at all over the past 40 years? That, as far as I see it, is the entire problem: we are falling behind because we are not building anything. Successive governments keep playing it safe, and if there is a surplus the money does not get invested into major infrastructure projects; it gets "invested" into winning the next election through middle-class welfare. The mining boom will eventually end and we will be left in the tough times with nothing to show from the good times.

Ska (1)

supernatendo (1523947) | more than 2 years ago | (#36689202)

I did not know that the music created by the likes of "Reel Big Fish", "The Clash" or other Ska bands could be used to find intelligent life somewhere...

Re:Ska (3, Funny)

Hsien-Ko (1090623) | more than 2 years ago | (#36689534)

Implying that there's depth in ska to use a telescope for that is. A reggae telescope would be better. Let's all get together and see deep space, mon.

Re:Ska (1)

Kyont (145761) | more than 2 years ago | (#36697124)

No, there is just too much data. They chose ska because they needed something exactly like reggae except much, much faster.

Re:Ska (0)

Anonymous Coward | more than 2 years ago | (#36692414)

Ignorance of musical genres is unacceptable.

is this a good thing? (2)

Charliemopps (1157495) | more than 2 years ago | (#36689222)

If the CIA invented a device that listened into every phone call in the entire world, real time and dumped it all as a WAV file on a storage device in the basement, would that really do them any good at all?

Re:is this a good thing? (1)

Anonymous Coward | more than 2 years ago | (#36689594)

If the CIA invented a device that listened into every phone call in the entire world, real time and dumped it all as a WAV file on a storage device in the basement, would that really do them any good at all?

Why would they invent that when they already have a device that listens into every phone call in the entire world, real time and dumps it all as a bunch of AMR files on a storage array in the basement?

Re:is this a good thing? (0)

Anonymous Coward | more than 2 years ago | (#36689980)

The SKA includes developing the equipment necessary to transmit, store and process the information. Let's face it, the WWW was a side effect of CERN's attempts to grapple with exactly the issues that the SKA will face. I'm looking forward to seeing the SKA solution!

Seems like... (1)

z3alot (1999894) | more than 2 years ago | (#36689236)

...calling your global computing initiative 'theskynet' might be detrimental to your acceptance by the technologically affluent who operate the computers you need help from.

Nonstandard unit fun! (1)

z3alot (1999894) | more than 2 years ago | (#36689310)

Finally, a new unit for large amounts of data, appropriate for this new golden age of progress and hyperbole. What I want to know is how many Libraries of Congress are in an 'internet-year'...

Sensationalistically inaccurate article... (3, Informative)

Dahamma (304068) | more than 2 years ago | (#36689424)

The project is expected to deliver up to an exabyte a day of raw data, compressed to some 10 petabytes of data in images for storage.

So, 10 petabytes of data - who cares about the raw source. I work for a video streaming company and we have several petabytes of H.264 video. If that were to be uncompressed into 30 FPS 1080p raw data, it would be 50-100x that, so already approaching a couple hundred petabytes. And think of all of the JPEGs out there - why don't we just uncompress all of those for the comparison as well?

A (likely conservative) back-of-the-envelope calculation by Google estimated at least 5 exabytes accessible on the Internet (so even the wrong estimate is wrong). I'd imagine a huge percentage of that is compressed video, audio, and images. So, basically 5 exabytes vs 10 petabytes - it's off by almost 3 orders of magnitude.
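A quick sanity check of that comparison, plugging in the 5 EB and 10 PB figures quoted in the comment:

```python
import math

EB = 10**18                      # exabyte
PB = 10**15                      # petabyte

internet_estimate = 5 * EB       # Google's rough lower bound, per the comment
ska_stored_per_day = 10 * PB     # SKA's stored (compressed) daily output

ratio = internet_estimate / ska_stored_per_day
orders = math.log10(ratio)
```

The ratio comes out to 500:1, about 2.7 orders of magnitude, which is the gap the comment is pointing at.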

Re:Sensationalistically inaccurate article... (2, Informative)

Anonymous Coward | more than 2 years ago | (#36689922)

Indeed ... while it's an impressive number, we already have experiments that generate more "raw data" per day than that: "CERN experiments generating one petabyte of data every second" http://www.v3.co.uk/v3-uk/news/2081263/cern-experiments-generating-petabyte That's about 86EB per day. But "all" of it is crap, and they eventually store only about 25PB per year.

Re:Sensationalistically inaccurate article... (1)

fusiongyro (55524) | more than 2 years ago | (#36691304)

It's a specious analogy. Video and audio can be compressed with loss, and the algorithms make heavy use of human perceptual limitations. Scientific data produced by large instruments need to have breadth and depth; the instrument is a scarce resource and there are unlimited ways of reducing radio astronomy datasets to produce different data and different insights. Especially with radio, you're going to be collecting a ton of white noise-looking data, but you can't use a lossy compression algorithm to trim it, so you're stuck doing lossless compression on essentially random data, which doesn't work so well.

At the same time, it's an empty boast. I'm a programmer on the EVLA project. WIDAR would happily produce hundreds of gigabytes per second if we let it, but the rest of our pipeline is completely unprepared for it, and the astronomers don't have a cluster large enough to reduce datasets that big anyway. Some of their analyses aren't even parallelizable. So in practice, you want an instrument that can gather ridiculous amounts of data for PR and future-proofing purposes, but your scientists aren't likely to use it that way. Certainly not in the year it goes live.
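The point about noise being nearly incompressible is easy to demonstrate: a lossless compressor finds essentially nothing to squeeze out of random bytes, while repetitive data of the same size shrinks dramatically. A small sketch using Python's zlib as a stand-in (real radio-astronomy pipelines use their own formats):

```python
import os
import zlib

size = 1_000_000
noise = os.urandom(size)                   # stand-in for noise-like radio data
structured = b"0123456789" * (size // 10)  # same size, highly repetitive

# ratio of compressed size to original size, at maximum compression level
noise_ratio = len(zlib.compress(noise, 9)) / size
structured_ratio = len(zlib.compress(structured, 9)) / size
```

The random input typically comes out slightly *larger* than it went in, while the repetitive input collapses to a tiny fraction of its size - which is why lossless compression buys you little on noise-dominated instrument data.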

Re:Sensationalistically inaccurate article... (1)

Dahamma (304068) | more than 2 years ago | (#36697878)

It's a specious analogy.

If you RTFA, no it's not.

The project is expected to deliver up to an exabyte a day of raw data, compressed to some 10 petabytes of data in images for storage.

They clearly say it's compressed before it's stored. The raw data number itself was pure boast. Besides, no one said it was lossy compression, anyway. Lossless image compression can be very efficient depending on the image.

Re:Sensationalistically inaccurate article... (1)

fusiongyro (55524) | more than 2 years ago | (#36698864)

It's a distortion on the part of the article. Radio astronomy raw data are not images. And the computational effort to reduce the raw data into data from which you could make images is large enough that you tend to store the result of the processing alongside the original data--it takes up more space, not less.

It still doesn't matter, because there's no way they'll be running the telescope at full throttle until several years after commissioning.

Re:Sensationalistically inaccurate article... (0)

Anonymous Coward | more than 2 years ago | (#36692050)

Exactly. Actually, this telescope generates about half as much data as a single human eye, so viewed from that angle it's not that impressive.

Perhaps more impressive is that one human eye actually generates what amounts to 125 megapixels ~ 30 fps, with a far wider light spectrum than the very best cameras, and far more intensity spectrum (a photoreceptor can actually detect a single photon, and it can look into the sun (1kW/m2)).

So that would be 125 000 000 * 65000 (number of different colors that can be observed) * 1 million (number of different intensities the eye can detect) * 30, in bps. This comes down to 208 petabits per second per eye.

What's perhaps even more impressive is that in the retina itself there is a basic processing layer that compresses this down to a few dozen gigabits/s to send to your brain.

But the point was, this telescope may be mighty impressive, but it hardly compares to the human eye.

Re:Sensationalistically inaccurate article... (2)

bertok (226922) | more than 2 years ago | (#36692890)

Your maths is quite a bit off.

We have only 3 primary color channels (4 if you count rods separately, 5 if also counting tetrachromats [wikipedia.org]), not "65000". We can't see 1 million different intensities simultaneously either -- while the human eye does have an enormous dynamic range, this adaptation takes a while (minutes). At any one time, we can see maybe 300-1000 distinct intensity levels per color channel. This only requires 10 bits to represent per channel. Even your 125 Mpixels is an exaggeration, because we have roughly 125 million sensor cells total, not 125 million per color channel!

This then works out to 125*10^6 cells * 10 bits * 30 Hz = 38 Gbps, which is a lot, but is almost 7 orders of magnitude less than your estimate!

In practice, even 38 Gbps is overestimating things substantially -- the cells in the retina have a trade-off between temporal resolution ('framerate') and dynamic range, so we can't simultaneously get 10 bits of intensity detail and 30 Hz of temporal resolution. Additionally, the image on the retina isn't focused very well, reducing the actual image detail quite a bit, especially near the edge of our field of view.

Whatever the real raw bandwidth is, the optic nerve transmits only about 8.75 Mbps [newscientist.com] per eye!
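For what it's worth, the arithmetic in the comment above checks out; a sketch of the same estimate, taking the figures from the comment at face value rather than independently sourcing them:

```python
cells = 125_000_000         # photoreceptor cells, total (per the comment)
bits_per_sample = 10        # ~300-1000 distinguishable levels -> ~10 bits
rate_hz = 30                # assumed temporal resolution

raw_bps = cells * bits_per_sample * rate_hz       # 3.75e10, i.e. ~38 Gbps
optic_nerve_bps = 8.75e6                          # quoted optic-nerve estimate

compression_factor = raw_bps / optic_nerve_bps    # retina's effective reduction
```

Even on these generous assumptions the retina is discarding or summarizing over 99.9% of the raw signal before it reaches the optic nerve, roughly a 4300:1 reduction.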

Re:Sensationalistically inaccurate article... (1)

discord5 (798235) | more than 2 years ago | (#36692388)

So, 10 petabytes of data - who cares about the raw source.

Depends on what you're doing. If you're taking pictures of the Eiffel Tower on your vacation, you're not really interested in whether the LSB from your camera's CCD was a 1 or a 0. If you're taking images from a hyperspectral sensor for scientific purposes, the better the accuracy, the better (hopefully) your results. It depends on the type of application and how important accuracy is to you (and how accurate your sensor is).

I work for a video streaming company and we have several petabytes of H.264 video.

We've got several hundreds of terabytes in lossless compressed high resolution hyper spectral images. If you start using lossy compression you often lose too much information and gain artifacts that can (sometimes drastically) change the outcome of whatever algorithms you wish to run on those images. However, in some cases you can find a good compromise in compression factor, it all depends on what you're planning to measure or calculate. If you're doing scientific research you'll always want to determine what the effects are of lossy compression on your dataset before you start throwing away the raw data.

In your case where you are streaming video though, lossy compression is very applicable. The human eye (and/or brain) is quite limited in what it can perceive visually.

Re:Sensationalistically inaccurate article... (1)

Dahamma (304068) | more than 2 years ago | (#36697908)

So, 10 petabytes of data - who cares about the raw source.

Depends on what you're doing. If you're taking pictures of the Eiffel Tower on your vacation, you're not really interested in whether the LSB from your camera's CCD was a 1 or a 0. If you're taking images from a hyperspectral sensor for scientific purposes, the better the accuracy, the better (hopefully) your results. It depends on the type of application and how important accuracy is to you (and how accurate your sensor is).

I already pointed this out above, but if you RTFA they clearly say it would be compressed before it's stored. If they don't store the raw data, then, yes, who cares about it, because it doesn't exist for analysis. And as I also pointed out, nowhere did it say the compression was lossy, just that the ~1EB of data compresses to 10PB.

Re:Sensationalistically inaccurate article... (0)

Anonymous Coward | more than 2 years ago | (#36693248)

Yes, also "10,000 times more powerful" is also interpreted as "40 dB gain" in EE vernacular ...

Re:Sensationalistically inaccurate article... (1)

dargaud (518470) | more than 2 years ago | (#36697674)

These kinds of back-of-the-envelope calculations, and the fact that hard drive capacity hasn't increased much in the last few years (from 2TB 3 years ago to only 3TB now), make me wonder if there aren't much larger prototype HDs in use in large datacenters à la Google.

My Home DVR (1)

jageryager (189071) | more than 2 years ago | (#36689966)

In nine years my home DVR system will generate more data than the entire internet!

Re:My Home DVR (0)

Anonymous Coward | more than 2 years ago | (#36691540)

God, you are watching that much HD porn?

Old and reliable (0)

Anonymous Coward | more than 2 years ago | (#36690156)

3.5" floppies sent through the mail. You get an extra backup in the form of the floppy disks themselves, and the extra billion or so pieces of mail being sent every month will help out the post office.

Early processing of raw data (1)

BCMcI (838317) | more than 2 years ago | (#36690256)

This begs for research on how the neural system in the eye compresses raw data into information that can be transmitted through the limited-bandwidth optic nerve. Collecting data is great, but we can drown in data. We need information processing near to the source.

Re:Early processing of raw data (2)

ceoyoyo (59147) | more than 2 years ago | (#36690548)

We know a decent amount about it. And we don't want to replicate it in our scientific instruments. The brain does all sorts of extrapolating, interpolating and other forms of making shit up. Which is great if you're a mammal who needs to see the sabre toothed tiger stalking you, but not so good if you're trying to get accurate, quantitative data out of a scientific instrument.

Re:Early processing of raw data (1)

Anonymous Coward | more than 2 years ago | (#36690746)

Also, computationally intensive to do algorithmically, free when it happens due to quantum interference or neurotransmitter binding or whatever in the optic nerve.

No Way! (0)

Anonymous Coward | more than 2 years ago | (#36690272)

Something Coming online a DECADE from now may have the same amount of data as the internet TODAY?!?

It's not like data on the net is growing exponentially or anything...

Info from one of the IT Engineers (0)

Anonymous Coward | more than 2 years ago | (#36690470)

I know this guy who is working on the SKA project here in Australia. Have a look at this site http://www.icrar.org/education/edxn_resources - I have seen this presentation before, and it gives a rough idea of how much data is involved and how much processing is required.

As an excerpt from the PDF of the presentation, each image of the sky will be multiple wavelengths of light (i.e. 2d image x multiple wavelengths gives a 'cube' shape). Each 'cube' of data will be between 4 and 5 TB, and there will be about 1000 of these every 5 days to make up an image of the entire sky visible by the array. This of course is the data that has been pre-processed and had the atmospheric noise removed, so the total bandwidth and processing requirements are many orders of magnitude higher.

It's an exciting project.
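Taking the quoted figures at face value (4-5 TB per cube, about 1000 cubes per 5-day full-sky image), the implied rate of pre-processed cube data works out to roughly 900 TB per day:

```python
TB = 10**12
cube_bytes = 4.5 * TB       # midpoint of the quoted 4-5 TB per cube
cubes_per_survey = 1000     # ~1000 cubes per full-sky image
days_per_survey = 5

bytes_per_day = cube_bytes * cubes_per_survey / days_per_survey
terabytes_per_day = bytes_per_day / TB
```

That 900 TB/day is already post-processed output; as the comment notes, the raw bandwidth before atmospheric-noise removal is orders of magnitude higher.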

Mostly noise (1)

pentadecagon (1926186) | more than 2 years ago | (#36692208)

This merely says they are unable to filter out all the noise. Producing huge numbers of bits isn't a particularly remarkable achievement. I guess there is far less than 1% actual information in that bit stream.

Re:Mostly noise (0)

Anonymous Coward | more than 2 years ago | (#36694122)

I agree, 10 exabytes of static is still just static.

pfff.. (0)

Anonymous Coward | more than 2 years ago | (#36692774)

I could generate just as much with a big computer and exclusive access to random.org :)

The first image recorded from the SKA (1)

jockeys (753885) | more than 2 years ago | (#36693116)

was a checkerboard pattern in black and white. When reached for comment, the leading scientist on the project had this to say,
"Pick it up, pick it up, GO!"

OHHH THERES NOOOOO SCIENCE! (0)

Anonymous Coward | more than 2 years ago | (#36693604)

THE SKY IS FALLING!

SKA generating astronomical amounts of data (0)

Anonymous Coward | more than 2 years ago | (#36693894)

With so much data processing being generated what will they rely on to pickitup pickitup pickitup?
