
£52 Million Govt Funding for New UK Supercomputer 135

Lancey writes "The BBC reports that the UK government has contributed £52 million towards the building of the High-End Computing Terascale Resource to replace two existing supercomputers currently in use by British scientists. The story claims a maximum speed of 100 teraflops, although it is unlikely that the machine will ever be pushed to this limit. Some of the government funding will also be used to train scientists and programmers to develop software capable of exploiting the machine's potential."
This discussion has been archived. No new comments can be posted.


Comments Filter:
  • Born Yesterday? (Score:5, Insightful)

    by ExE122 ( 954104 ) * on Monday April 03, 2006 @02:28PM (#15052174) Homepage Journal

    However, it is unlikely to ever be pushed to its limits

    Give it a little while. Ten years ago, people thought 16MB of RAM was excessive. Ten years before that, 512KB was considered a luxury.

    --
    "Man Bites Dog
    Then Bites Self"
    • And 10 years from now, we all say "I remember when 100 teraflops was fast..."
      • Re:Born Yesterday? (Score:2, Insightful)

        by plankrwf ( 929870 )
        Nope, we will say: and we thought that THAT was anywhere near good enough
        to actually stand any chance of beating a 12-year-old at the game of Go.

        Roel
        • 12-year-old Korean, you mean? I'm not joking. Your typical 12-year-old American doesn't even know how to play Go. 12-year-old Koreans are sometimes pros, and may have been since the age of 5.

          Then there are those Japanese kids possessed by the ghosts of ancient, suicidal Go masters. Hoo boy.
    • The Brits are desperately trying to beat the Yanks to be the first to compute the square root of a negative number, and they're throwing everything they've got at it.
    • Re:Born Yesterday? (Score:5, Interesting)

      by adz ( 630844 ) on Monday April 03, 2006 @02:38PM (#15052264)
      However, it is unlikely to ever be pushed to its limits

      It would be more accurate to say that it is impossible to achieve the theoretical maximum speed, and very hard to come even close. Without doubt the machine will be used extensively, and people will ensure they get as much performance as they can out of the system. Given how much it costs, they're hardly going to use it as a doorstop, are they?!
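The gap this comment describes is easy to make concrete: theoretical peak (Rpeak) is just cores x clock x flops-per-cycle, while benchmarks report a lower sustained figure (Rmax). A minimal sketch; the machine parameters and the 75% efficiency figure below are entirely hypothetical, for illustration only:

```python
# Theoretical peak (Rpeak) vs. sustained performance.
# All machine parameters here are made up, not HECToR's real spec.
def rpeak_tflops(cores, ghz, flops_per_cycle):
    """Theoretical peak in teraflops: cores x clock (GHz) x flops/cycle."""
    return cores * ghz * flops_per_cycle / 1000.0

peak = rpeak_tflops(cores=12_000, ghz=2.6, flops_per_cycle=4)
# Linpack typically sustains only a fraction of peak; 75% is an assumed,
# optimistic figure -- real application codes often do far worse.
sustained = 0.75 * peak
print(f"Rpeak ~ {peak:.1f} TFLOPS, sustained maybe ~ {sustained:.1f} TFLOPS")
```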
    • Give it a little while. Ten years ago, people thought 16MB of RAM was excessive. Ten years before that, 512KB was considered a luxury.

      Most laptops, if boxed today and shipped to "supercomputing" sites 10 years ago, would have had much better performance than those rooms full of machines. Not to mention the power, cooling, etc.

    • "However, it is unlikely to ever be pushed to its limits, achievable only for short bursts of time that are too small for scientists to run their programs properly."

      It's not going to be pushed to its top limits because it can't handle them for more than a few seconds; it has nothing to do with its load.
    • It would do a good job on BOINC-type endeavors where the results are relatively independent of each other. On hybrid jobs where communication between nodes isn't huge, and large computational chunks are possible, you could still get very close to the theoretical maximum. There will, however, continue to be some jobs where this kind of supercomputer sucks relative to its theoretical capability.

      Finding and creating the kinds of algorithms where this system works at its best continues to be half the

    • Ten years ago, people thought 16MB of RAM was excessive. Ten years before that, 512KB was considered a luxury.

      Those numbers only apply to a gamer's laptop.

      14 years ago I admined a deskside RS/6000 box with 380MB of RAM (although the first response I often got when telling people how much RAM it had was "Oooh! That's a lot of disk space, isn't it?").

      Almost 25 years ago, the Computer Science building at the University of Alberta had at least two machines with at least 16MB of RAM in them -- one was

    • Give it a little while. Ten years ago, people thought 16MB of RAM was excessive. Ten years before that, 512KB was considered a luxury.

      Yes, but how many people are using old machines, pushing them to the limits?

      I read the sentence as referring specifically to this machine. Sure, eventually there'll be greater demand for power, but it won't be done on that particular machine (and indeed, as others pointed out, the reason why is explained if you read the whole sentence).
  • by RatOfTheLab ( 535003 ) on Monday April 03, 2006 @02:36PM (#15052239) Journal
    Preparation for the release of Vista, no doubt.
    • If anything it will be "Vista Ready"... No guarantee it will run on it or how it will perform.
    • oh that's why it says it is unlikely that the machine will ever be pushed to this limit
    • Your joke is a handy jumping-off point to mention that in all likelihood this beast will NOT be able to run Vista, or any other version of Windows for that matter. The only systems that currently operate in the teraflop-ish range (aka the top 3 in the world and the #1 in Europe) contain IBM Power CPUs. Unless they specifically want to burn a bunch of cash investing in a new architecture, their best option is a nice big BlueGene.
  • Run Windows Vista on the new supercomputer. There has to be some CPU-cycle-sucking bug in there to bring down the mightiest of supercomputers. Assuming that a memory leak doesn't kill it first.
  • by Expert Determination ( 950523 ) on Monday April 03, 2006 @02:36PM (#15052248)
    Of course it's 100,000 times faster than an ordinary computer. It's a rack of 100,000 ordinary computers.

    Anyone remember the days when the word 'supercomputer' actually meant something?

    • imagine a beowulf clu....
      no really, i did have a point. it's still a big brain, it's just that this is more of a hive mind
    • Don't think I was born then. What did it mean?
    • It's a rack of 100,000 ordinary computers.

      A rack of 100,000 ordinary boxes does not equal 100,000 times faster. The problem is always keeping the CPUs busy. There's also the related problem of the granularity of the calculation being performed.
      • Yes - we've got a 1/4 teraflop system down the hall (hundreds of nodes) and we light it up full scale for weeks at a stretch on some classes of computational chemistry problems. A system like this one would bring the time span down to hours, not weeks, but it would still need to execute reliably flat out for that time.
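The point above, that a rack of 100,000 boxes is not 100,000 times faster, is classically captured by Amdahl's law: the serial fraction of a job caps the achievable speedup no matter how many processors you add. A quick sketch (the 95% parallel fraction is an arbitrary illustrative figure):

```python
# Amdahl's law: if a fraction p of a job can be parallelized, the best
# possible speedup on n processors is 1 / ((1 - p) + p / n).
def amdahl_speedup(p, n):
    """p: parallelizable fraction (0..1); n: number of processors."""
    return 1.0 / ((1.0 - p) + p / n)

# Even with 95% of the work parallelizable, 100,000 CPUs buy only ~20x,
# because the serial 5% dominates once n is large.
print(round(amdahl_speedup(0.95, 100_000), 1))  # 20.0
```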
    • Anyone remember the days when the word 'supercomputer' actually meant something?

      Don't DC and Marvel have a trademark on that, or something?

    • I'm not sure that I can figure out exactly what else it could actually mean (besides trivial differences).

      I'm very willing to believe that I'm just ignorant of something here, but is there some kind of special way that devices have to be connected that makes them more like a computer? If I connect two computers together so that they accomplish one (computational) task, are they not one unit computing the answer... one computer? If not, how is it more... unitary(?) for me to connect (basically by "wire") a bunch m

      • Similarly two people working on a task become one person - after all, the complete system of person 1 and person 2 accepts inputs, produces outputs, and there's a single subset of spacetime containing grey matter that could be considered to be their brain. Or maybe you've done more philosophy than is good for you :-)
        • Perhaps I can resolve this: the normal definition of "person" does not allow you to consider two persons to be equivalent to one in general. However, the definition of "computer" - a device that computes - allows this, because "device" is suitably vague with respect to spatial boundaries, physical form etc.
          • the definition of "computer" - a device that computes

            That may be a fine definition for philosophers, but not for ordinary speech, or even for speech by technical users of computers. For example, if the head of systems in our company said to a sysadmin "please install a computer for this new developer, the fastest thing we have" he'd be pretty unhappy if he swung by later and found a rack of a thousand machines stuffed in there. On the other hand, for philosophical discourse that would be a reasonable int

            • Actually, I would point out that the original article was from the BBC news ... a venue specifically designed to convey ideas in "ordinary speech". In "ordinary speech" this device was called "Britain's most powerful super computer." The complaint implied that the "ordinary speech" definition was a little too ordinary, and that he preferred a definition that "actually meant something" (calling for increased precision).
              • Actually, both the meaning in ordinary speech and the meaning in speech for technical people who work with computers more or less agree. It's the meaning in the speech of philosophers, would-be philosophers and marketing people that's the problem.
                  First, before I transferred to get my degree in philosophy, I had 7 semesters of training with a double major in physics and math (with a strong elective emphasis on computer science). I've worked in tech fields for almost a decade -- primarily as a programmer. I surely qualify as "a technical person who works with computers."

                  But let it be that "philosophy" is generally absurd or whatever else. Still no one has said what "a computer" "actually" means. All that's happened is that

                  1. someone (the re
          Come to think of it ... I have noticed that my philosophical considerations do increase my general difficulty in interacting with the world at large ... =)

          Nonetheless, the above post makes the correct distinction. Were I to disregard considerations of personhood, I would be fine regarding two people intertwined with respect to their functions as a single thing with respect to it as "mechanism of task completion" though not with respect to it as "person".

          Certainly we do do this when we refer, for exampl

    • by san ( 6716 ) on Monday April 03, 2006 @05:56PM (#15053699)

      Of course it's 100,000 times faster than an ordinary computer. It's a rack of 100,000 ordinary computers.

      Anyone remember the days when the word 'supercomputer' actually meant something?

      Yes! And good riddance.

      Do you remember having to re-code for every single machine? Because they were such specialized machines, they tended to be extremely fickle: one wrong operation and performance would go down the drain.

      In practice, most computational work in the end consists of running many jobs independently. There are rare occasions where a single super fast CPU might be better but it's even rarer for the performance gains to outweigh the incredible cost increases for buying specialized supercomputer hardware.

      Whether it's wise to spend so much money on a single enormous cluster is another issue. You could buy many, many individual clusters for individual groups and have them operational in a matter of weeks, rather than having to wait until 2008. Besides, the thing is going to be obsolete by 2010.

      • Can't say I disagree. I work in an industry where people used to buy multiprocessor machines for tasks that could easily be separated into tasks runnable on separate machines. These machines were probably as much a status symbol as anything else.

        Nonetheless, it irks me that people use 'supercomputer' to mean cluster. It irks me even more that one of our competitors uses a network of a few thousand CPUs and claims that as a supercomputer, getting it listed in the top 100 list (or was it top 400, can't reme


      • Lancey: Some of the government funding will also be used to train scientists and programmers to develop software capable of exploiting the machine's potential.

        san: Yes! And good riddance. Do you remember having to re-code for every single machine? Because they were such specialized machines, they tended to be extremely fickle: one wrong operation and performance would go down the drain.

        If this architecture of theirs is at all novel, and if this is a one-time build of a machine with that architecture [

  • However, it is unlikely to ever be pushed to its limits.

    Tony Montana could, if he had a montague.
  • wait til you see the average credit this thing gets on SETI@Home [berkeley.edu] - there'll be a TBlair@10DowningSt account at the top of the list [berkeley.edu] before you know it.
  • Donations Needed (Score:3, Interesting)

    by digitaldc ( 879047 ) * on Monday April 03, 2006 @02:40PM (#15052282)
    This article should be renamed,
    "£52 Million Govt Funding for New UK Supercomputer, Donations Needed to Help Find and Train People to Operate It"
  • Didn't see any reference to the builders of this machine in the article. Did I miss it, or is that just an unimportant detail? Or are the owners the builders as well? Just curious...
    • I'm not sure they've chosen a builder yet. They were still putting together a shortlist of vendors a few months ago, and you know how quickly the wheels of academia turn.
      I know one of the panel involved in the planning of HECToR, so I might have to ask next time I see him.....
  • Imagine a Beowulf cluster of <kick> OW!
  • by fernandoh26 ( 963204 ) on Monday April 03, 2006 @02:55PM (#15052413) Homepage
    although it is unlikely that the machine will ever be pushed to this limit

    Like I said, class: if you can't fit your program in 640k of memory, you don't know how to program... "640k should be enough for anybody"
  • by Don_dumb ( 927108 ) on Monday April 03, 2006 @02:58PM (#15052444)
    This will be made by EDS, in a poorly thought-out 'Public Private Partnership', and will cost three times as much, arrive in 2010, and be obsolete when it does.
    If you think I am being too cynical, just look at their track record: the CSA computer system, the air traffic control system, etc.

    What amazes me is that they still get more work. Surely even New Labour have a limit to how far a bribe can take them.
    • Add on the system for changing over farmers to the Single Payment Scheme... I was forced to work on that, and it sucked total balls. Fell over every 15 minutes tops, usually losing all the work you'd done to that point. EDS again. High quality development.
    • The govt. are highly motivated to make this work. This is, after all, the machine they need to process all of the data that their ID card scheme will generate (and run face recognition software on live video feeds from thousands of surveillance cams, and decrypt and analyze internet traffic and PSTN voice data, and run sophisticated prediction algorithms on the lot). With approx. 50 million adults who can now *all* be monitored 24/7 in terms of where they go, who they talk to and what they talk about, that's
    • This will be made by EDS, in a poorly thought out 'Public Private Partnership' and will cost three times as much, arrive in 2010 and be obsolete when it does.

      Fortunately the fact that it is public money doesn't mean the government run it. The UK research community have a proven track record on running big iron; it's really no different to US.gov giving money to LLNL to run Blue Gene/L.
    • Which was that it could never happen in the UK: Big Brother wouldn't be able to watch you because the CCTV cameras would be broken, and the Civil Service would be unable to organise a whole cage of rats.

      Don't blame the politicians: I believe Mr. Blair still has to get his wife to type his emails. He wouldn't know a supercomputer from a Gameboy. Blame the Civil Service, who make damned sure that no scientists or engineers ever reach the top level and show up the incompetence of the Oxford Greats graduates y

  • You forgot the link on the rss feed.
  • I have long since lost track, what are we otherwise up to in teraflops?

    Also, I am curious how some of the folks here on slashdot would answer the question: If 100 teraflops can be achieved and sustained, what are the three best single uses to apply that much processing power against?

    Personally, I have no idea. As for Vista, I believe that joke has already been made once or twice in the discussion.

  • Connection? (Score:4, Interesting)

    by sane? ( 179855 ) on Monday April 03, 2006 @03:33PM (#15052698)
    I wonder if that and this story [bbc.co.uk] on replacing the Trident nuclear deterrent have any connection?

    No nuclear testing means all proving of a new warhead design has to be done computationally. Now a new machine is being bought...

    • No, this new one is for civilian stuff. They already have the one for the nuclear weapons programme [awe.co.uk]

    • I was under the impression that we weren't allowed to build new nukes anyway, under the Nuclear Non-Proliferation Treaty, only maintain old ones. Or is the plan to have new warhead designs, computationally tested, ready to be built in time of war, when all treaties go out of the window?
        I think it's more to do with the fact that (a) nukes have a limited shelf-life, so they eventually have to be replaced; and (b) the various bits of support technology that are necessary for actually targeting and launching the things now look big, clunky, power-hungry, manpower-intensive, and very expensive to maintain. Nuclear proliferation treaties don't prevent countries that already have nukes from building new ones, as long as they decommission an equal amount of existing stuff.
  • by 02bunced ( 846144 ) on Monday April 03, 2006 @03:35PM (#15052718) Homepage Journal
    It would still take a good 10 seconds to start up OpenOffice.org
  • it is unlikely that the machine will ever be pushed to this limit

    Pfft.. let's see them try to run Web 2.0 on that..

  • a Blue Gene computer, manufactured by IBM.

    Is it a Blue Gene (just a stock picture so that the average reader will get the idea that this is something BIG), or is this actually Hector?

  • They should just wait and string a load of PS3s together. That's what Saddam used to program his nukes back in the day. Plus they could render some Toy Story stuff in realtime on the side.
  • by Adult film producer ( 866485 ) <van@i2pmail.org> on Monday April 03, 2006 @03:57PM (#15052835)
    But of a bigger and badder supercomputer that will require its own 170 MW generating station..

    Cray to build 24,000 quad-core Opteron Petacomp!!
    Friday March 31, @07:03AM Rejected

    check it out here.. [gcn.com].. now imagine a Beowulf of those!!
  • "The computational limits of the existing facilities are now being reached," he said. Tell them to just format and re-install Windows like the rest of us. I'd like to buy a new computer every time mine gets to the "limits of existing facilities", but that would get expensive.
  • Finally those brits will be able to generate the funniest joke in the world [wikipedia.org].
  • Or better yet, does the electrical system short out during any rainstorm? Oh, snap!! Hopefully they have a different engineering team than the typical UK automakers.
  • The applications always mentioned in the articles are simulations (like weather and nuclear modeling), and the speed of these computers is measured in floating-point operations per second. What about combinatorial problems and other problems that are discrete? Do these computers have applications for basically integer manipulations and counting?
    • That's partly because what we tend to think of as Supercomputer-class problems are floating point based, and partly because with modern machines the integer unit is already screamingly fast, so it's not much of an issue. More subtly, while searching algorithms can be parcelled out in a loosely-parallel (or embarrassingly parallel) manner (i.e. if you have 100 processors, break your database into 100 subsets, and search each subset on its own processor), most of the floating-point heavy algorithms are also
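The "break your database into 100 subsets and search each subset on its own processor" strategy described above is easy to sketch. The helper names below are illustrative, not from the article, and a real run would hand each chunk to its own node rather than looping over them:

```python
# Embarrassingly parallel search: split the data into one chunk per
# processor and search each chunk independently, with no communication
# between workers. Pure integer work -- the discrete workload asked about.
def partition(data, n_workers):
    """Split data into n_workers near-equal chunks, one per processor."""
    size = (len(data) + n_workers - 1) // n_workers
    return [data[i:i + size] for i in range(0, len(data), size)]

def search_chunk(chunk, target):
    """The per-processor kernel: scan one chunk for the target value."""
    return [x for x in chunk if x == target]

chunks = partition(list(range(1000)), 100)
print(len(chunks), len(chunks[0]))  # 100 10
```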


  • 100 Teraflops ought to be enough for anybody.
  • Why is the UK government spending so much money on a supercomputer? Why don't they just buy 100 Xbox 360s? Microsoft claims the overall system performance is 1 teraflop!
    • Oh come on, you're not trying hard enough. There are LOADS of jobs the government would need a supercomputer for. Storing everyone's inside leg measurements for ID Card database. Working out how much money Tony Blair and his grasping hag of a wife have gouged over their years in office. Calculating the maximum number of speed cameras that could be deployed before everyone snaps and marches on Downing Street with pikestaffs. That sort of thing.
  • But will it run Vista, and what's the framerate for Doom 3?
  • Okay, here's a thought for the enterprising uni students out there (I have a job and no free time, so it's up to the graduates to solve this one): why not write a SETI@home-esque bit of software that can be deployed on university desktops/servers (Win/Mac/*nix) that can run whole programs, or segments of custom-written ones.
    Imagine the power of every uni desktop all running together in parallel! Plus you get constant free upgrades. The solution would need its own C++/Java-esque language for the tasks to be written in, s
  • Looks like they couldn't come up with a really good acronym, so they settled on something that might build them an ultimate weapon...

  • hah, didn't they manage to pass the laws for detaining people until they crack their encrypted hard drives?

    Remember?: "If 256-bit triple-DES or similar techniques are used then decryption could require supercomputer-levels of cracking."

    http://yro.slashdot.org/article.pl?sid=05/11/04/1348200&tid=123 [slashdot.org]

  • You could have one of the best Duke Nukem sessions ever on that thing! /end satire
