AMD Unveils the Liquid-Cooled, Dual-GPU Radeon R9 295X2 At $1,500

wesbascas (2475022) writes "This morning, AMD unveiled its latest flagship graphics board: the $1,500, liquid-cooled, dual-GPU Radeon R9 295X2. With a pair of the Hawaii GPUs that power the company's top-end single-GPU Radeon R9 290X, the new board is sure to make waves at price points Nvidia currently dominates. In gaming benchmarks, the R9 295X2 performs pretty much in line with a pair of R9 290X cards in CrossFire. However, the R9 295X2 uses specially binned GPUs, which let the card run on less power than a duo of the single-GPU cards. Plus, thanks to the closed-loop liquid cooler, the R9 295X2 doesn't succumb to the nasty throttling issues present on the R9 290X, nor does it inherit that card's noisy cooling solution."
  • And they all sold out instantly and the Litecoin difficulty went up ...
    • by Anonymous Coward
      And then all who bought them realised they should have bought a Viper instead...
    • And they all sold out instantly and the Litecoin difficulty went up ...

      Are there any dual-chip cards that cryptominers actually buy? (honest question, I don't know). Back in my youthful gaming-nut days, dual GPU cards, because of some mixture of worse economies of scale and 'people who absolutely must have the bleeding edge will pay, so why not?', always commanded rather more than twice the price of two equivalent single-GPU cards (and sometimes clocked worse, as well, just to keep the heat down).

      Now that everything is PCIe, and the low bandwidth requirements of mining allow you to stuff even x1 slots with GPU cards, I'd have to imagine that your miner would be willing to pay only a very small premium to conserve slots.

      • Now that everything is PCIe, and the low bandwidth requirements of mining allow you to stuff even x1 slots with GPU cards, I'd have to imagine that your miner would be willing to pay only a very small premium to conserve slots.

        I'd imagine (I've done as much research as you have, apparently) that it's difficult to come by PCIe x1 cards with meaningful GPUs on them. Only some subset can be hacked down from x16 to x1.

        • You don't need the memory bandwidth for Scrypt mining, so you can use whatever card you want in an x1 slot, with the help of a riser/cable. (A sketch of the hashing involved follows this sub-thread.)

          • Or a Dremel: just cut off the pins for the extra lanes and it works fine. You can even test it out first by taping off those pins and mounting the card in an x16 slot.
          • You don't need the memory bandwidth for Scrypt mining, so you can use whatever card you want in an x1 slot, with the help of a riser/cable.

            Not all cards' BIOS will permit that.
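A minimal sketch of the Scrypt hashing being discussed, assuming Litecoin's published parameters (N=1024, r=1, p=1, 32-byte output, with the 80-byte block header serving as both password and salt). The working set is only 128 * r * N = 128 KB and stays on the card, which is why a narrow x1 link suffices. The header bytes below are placeholders, not real block data.

    import hashlib

    # Placeholder 80-byte block header (hypothetical; a real miner varies a nonce).
    header = b"\x00" * 80

    # Litecoin-style scrypt: N=1024, r=1, p=1; the header doubles as the salt.
    # maxmem is set comfortably above the ~128 KB these parameters require.
    digest = hashlib.scrypt(header, salt=header, n=1024, r=1, p=1,
                            maxmem=2 * 1024 * 1024, dklen=32)

    # A miner would compare this digest against the network's difficulty target.
    print(digest.hex())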

      • Nowadays? No.
        A few years ago? Yes.
        Before everyone jumped on the bandwagon, 5970s had similar price/performance to 5850s while being superior in pretty much every other way.
        4 cards == 8 GPUs off a $45 board with x1 to x16 riser cables.
        Fans that could actually run 24/7 for years without dying.
        Lower power usage.

      • by goldcd ( 587052 )
        I was running a pair of 6990s (previous gen dual-GPU AMD cards).
        Bought the first one when I realized the Bitcoin app I'd accidentally installed on my server a few months earlier had produced something I could flog for 10 bucks, and that I'd earned myself a free card. Then used that one to cover the cost of the second one.
        Quit mining when the difficulty meant I was pulling in less than a BTC a day. Looking back...
    • That can't be a good idea. Litecoin has steadily dropped from $25 to $11 during 2014, after briefly touching $44 late last year. I don't see any reason why the downward trend in bitcoin or litecoin price will reverse.

      I can't imagine a $1500 card plus electricity costs ever paying for themselves.
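Rough arithmetic behind that skepticism, as a sketch. Except for the $1,500 card price from the article and the $11 Litecoin price cited above, every figure is an assumed placeholder; plug in your own numbers.

    # Back-of-the-envelope mining break-even. All inputs are assumptions
    # for illustration, not measured data for the R9 295X2.
    CARD_COST_USD = 1500.0   # launch price from the article
    LTC_PRICE_USD = 11.0     # spot price cited in the parent comment
    POWER_WATTS   = 500.0    # assumed draw under mining load
    KWH_COST_USD  = 0.12     # assumed electricity rate
    LTC_PER_DAY   = 0.25     # assumed payout at then-current difficulty

    daily_revenue = LTC_PER_DAY * LTC_PRICE_USD               # $2.75
    daily_power   = POWER_WATTS / 1000.0 * 24 * KWH_COST_USD  # $1.44
    daily_profit  = daily_revenue - daily_power               # $1.31

    if daily_profit <= 0:
        print("Never breaks even at these numbers.")
    else:
        # ~1,145 days here, ignoring difficulty growth and price swings,
        # both of which were moving against miners at the time.
        print(f"Break-even after about {CARD_COST_USD / daily_profit:.0f} days")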

  • by b0r0din ( 304712 ) on Tuesday April 08, 2014 @09:32AM (#46693195)

    as if millions of Litecoins suddenly cried out in terror and were suddenly silenced.

  • A Raspberry Pi Beowulf cluster of those R9 295X2s.

  • Does anyone actually read those ridiculously long tech reviews, or just skip to the verdict/conclusion page?
  • Do not want (Score:3, Insightful)

    by Anonymous Coward on Tuesday April 08, 2014 @10:27AM (#46693839)

    Didn't find any need for the $1000 Titan card; doubtful I'll find a need for a $1500 flavor either.

    Patience works well. Wait a year or two and you can pick up this awesome horsepower at a fraction of the price. Pick up any games that require this much horsepower at the same time and you're golden. It's similar to how I buy games today. I'll be damned if I'm paying full price for what is effectively Beta III. I'll let them sit for a while, let the world test it and complain, watch all the patches get applied and ultimately pick it up when it goes on sale for $20 or so.

    I learned long ago to quit buying bleeding edge gear.

    • LITECOIN MINING! lol. Make dem dolla dolla bills y'all.
    • The Titan was a clever thing from Nvidia: a product marketed to gamers that the budget supercomputing crowd is buying. This means that they don't have to provide professional-level support for the things, but can sell them at semi-professional-level prices. They give top-end performance and are (as far as we can tell -- we have a few dozen of the things) as stable as anything else.

      An Nvidia K20X costs many thousands of dollars and is actually slower than a $1K Titan (by about 10%, according to lattice QCD benchmarks).

  • by Dr. Spork ( 142693 ) on Tuesday April 08, 2014 @10:50AM (#46694085)
    So finally, AMD came out with a power algorithm that reduces consumption when the GPU's resources aren't needed. I'm not sure where in their product line they introduced it, but it's about damn time. There are all kinds of good reasons to leave our computers on all the time, but I haven't been doing it because the idle power consumption has been needlessly high: in my case, over 50 watts. This adds up over time. Just how much power does a high-end computer need to idle, serve files, run non-demanding background processes, etc.? Millions of computers do just that, for many hours every day. A focus on reducing the power draw of these basically idling computers could make a huge difference to the world. (A quick back-of-the-envelope on what idle draw costs follows this thread.)
    • Just how much power does a high-end computer need to idle, serve files, run non-demanding background processes, etc.? Millions of computers do just that, for many hours every day.

      Serving files is best done by something dedicated. In my case, I am using an ultra-low-power solution: a Pogoplug connected to the disk via USB 3.0. Performance will not exactly set the world alight; via my crappy D-Link GigE switch I'm getting about 10-15 MB/s real-world, as reported by various file managers. Performance is better via Samba than NFS, but I haven't looked into why. The disk spins down when not in use, and the Pogoplug uses jack diddly for power.

      Most of the rest of the idle consumption could

    • by AmiMoJo ( 196126 ) *

      It's been there since the 5000 series. I had one of the high-idle-power 4000s, but they fixed it with the 5000s. Shame the promised driver update to reduce power on the 4000s never materialized.
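A quick worked version of the idle-power point above, using the 50 W figure from the comment; the electricity rate is an assumed placeholder, not a sourced number.

    # Annual cost of a machine idling at 50 W around the clock.
    IDLE_WATTS   = 50.0    # idle draw cited in the comment above
    KWH_COST_USD = 0.12    # assumed electricity rate

    kwh_per_year = IDLE_WATTS / 1000.0 * 24 * 365   # ~438 kWh
    print(f"{kwh_per_year:.0f} kWh/year, roughly "
          f"${kwh_per_year * KWH_COST_USD:.0f}/year per machine")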
