MojoKid writes Dell's Alienware division recently released a radical redesign of its Area-51 gaming desktop. The triangular-shaped machine grabs your attention right away: its 45-degree angled front and rear face plates direct the controls and IO up toward the user, draw cool air in, and channel warm exhaust up and away from the rear of the chassis. In testing and benchmarks, the Area-51's new design delivers top-end performance with thermal and acoustic profiles that are fairly impressive versus most high-end gaming PC systems. The chassis design is also clean, modular, and easily serviceable. Base system pricing isn't too bad, starting at $1,699, with the ability to dial things way up to an 8-core Haswell-E chip and triple-GPU graphics from NVIDIA or AMD. The test system reviewed at HotHardware was powered by a six-core Core i7-5930K chip and three GeForce GTX 980 cards in SLI. As expected, it ripped through the benchmarks, though the price as configured and tested is significantly higher.
132 comments | 2 days ago
MojoKid writes When Apple debuted its A8 SoC, it proved to be a modest tweak of the original A7. Despite packing double the transistors and an improved GPU, the heart of the A8 SoC is the same dual-core Apple "Cyclone" processor tweaked to run at higher clock speeds and with stronger total GPU performance. Given this, many expected that the Apple A8X would be cut from similar cloth — a higher clock speed, perhaps, and a larger GPU, but not much more than that. It appears those projections were wrong. The Apple A8X chip is a triple-core variant of the A8, with a higher clock speed (1.5GHz vs. 1.4GHz), a larger L2 cache (2MB, up from 1MB) and 2GB of external DDR3. It also uses an internal metal heatspreader, which the Apple A8 eschews. All of this points to slightly higher power consumption for the core, but also to dramatically increased performance. The new A8X is a significant powerhouse in multiple types of workloads; in fact, it's the top-performing mobile device on Geekbench by a wide margin. Gaming benchmarks are equally impressive. The iPad Air 2 nudges out Nvidia's Shield in GFXBench's Manhattan offscreen test, at 32.4 fps to 31 fps. Onscreen favors the NV solution thanks to its lower-resolution screen, but the Nvidia device does take 3DMark Ice Storm Unlimited by a wide margin, clocking in at 30,970 compared to 21,659.
129 comments | 4 days ago
jimwormold writes: I need to build a system for outdoor use, capable of withstanding a high-pressure water jet! "Embedded PC," I hear you cry. Well, ideally yes. However, the system does a fair bit of number crunching on a GPU (GTX970) and there don't appear to be any such embedded systems available. The perfect solution will be as small as possible (ideally about 1.5x the size of a motherboard, with the height limited to accommodate the graphics card). I'm U.K.-based, so the ambient temperature will range from -5C to 30C; I presume some sort of active temperature control would be useful.
I found this helpful discussion, but it's 14 years old. Thus, I thought I'd post my question here. Do any of you enlightened Slashdotters have insights into this problem, or know of any products that will help me achieve my goals?
199 comments | 4 days ago
Ubuntu 14.10, dubbed Utopic Unicorn, has been released today (here are screenshots). PC World says that at first glance it "isn't the most exciting update," with not so much as a new default wallpaper — but happily so: it's a stable update in a stable series, and most users will have no pressing need to upgrade to the newest version. In the Ubuntu Next unstable series, though, there are big changes afoot: Along with Mir comes the next version of Ubuntu's Unity desktop, Unity 8. Mir and the latest version of Unity are already used on Ubuntu Phone, so this is key for Ubuntu's goal of convergent computing — Ubuntu Phone and the Ubuntu desktop will use the same display server and desktop shell. Ubuntu Phone is now stable and Ubuntu phones are arriving this year, so a lot of work has gone into this stuff recently. The road ahead looks bumpy, however. Ubuntu needs to get graphics drivers supporting Mir properly. The task becomes more complicated when you consider that other Linux distributions — like Fedora — are switching to the Wayland display server instead of Mir. When Ubuntu Desktop Next becomes the standard desktop environment, the changes will be massive indeed. But for today, Utopic Unicorn is all about subtle improvements and slow, steady iteration.
109 comments | about a week ago
jones_supa writes Microsoft is expected to release a new build of the Windows 10 Technical Preview in the very near future, according to its own words. The only build released to the public so far is 9841, but the next iteration will likely be in the 9860 class of releases. With this new build, Microsoft has polished up the animations that give the OS a more cohesive feel. When you open a new window, it flies out onto the screen from its icon, and when you minimize it, it collapses back into the icon on the taskbar. It is a slick animation, and if you have used OS X, it is similar to the one used to collapse windows into the dock. Bah.
209 comments | about two weeks ago
An anonymous reader writes Twelve years after Microsoft debuted DirectX 9.0, open-source developers are getting ready to possibly land Direct3D 9.0 support within the open-source Linux Mesa/Gallium3D codebase. The "Gallium3D Nine" state tracker allows accelerating D3D9 natively by Gallium3D drivers, and there are patches for Wine so that Windows games can utilize this state tracker without having to go through Wine's costly D3D-to-OGL translator. The Gallium3D D3D9 code has been in development since last year and is now reaching a point where it's under review for mainline Mesa. The uses for this Direct3D 9 state tracker will likely be very limited outside of Wine gaming.
55 comments | about two weeks ago
MojoKid writes: For the past few years, Intel has promised that its various low-power Atom-based processors would usher in a wave of low-cost Android and Windows mobile products that could compete with ARM-based solutions. And for years, we've seen no more than a trickle of hardware, often with limited availability. Now, that's finally beginning to change. Intel's Bay Trail and Merrifield SoCs are starting to show up more in full-featured, sub-$200 devices from major brands. One of the most interesting questions for would-be x86 buyers in the Android tablet space is whether to go with a Merrifield or Bay Trail Atom-based device. Merrifield is a dual-core chip without Hyper-Threading. Bay Trail is a quad-core variant with a graphics engine derived from Intel's Ivy Bridge Core series CPUs. That GPU is the other significant difference between the two SoCs. With Bay Trail, Intel is still employing its own graphics solution, while Merrifield pairs a dual-core CPU with a PowerVR G6400 graphics core. So, what's the experience of using a tablet running Android on x86 like these days? Pretty much like using an ARM-based Android tablet, and surprisingly good for any tablet in the $199-or-less bracket. In fact, some of the low-cost Intel/Android solutions currently out there from the likes of Acer, Dell, Asus, and Lenovo compete well, performance-wise, with the current generation of mainstream ARM-based Android tablets.
97 comments | about two weeks ago
An anonymous reader writes: AMD recently presented plans to unify its open-source and Catalyst Linux drivers at the open-source XDC2014 conference in France. NVIDIA's rebuttal presentation focused on supporting Mir and Wayland on Linux. The next-generation display stacks are competing to succeed the X.Org Server. NVIDIA is partially refactoring its Linux graphics driver to support EGL outside of X11, to propose new EGL extensions for better driver interoperability with Wayland/Mir, and to support the KMS APIs in its driver. NVIDIA's binary driver will support the KMS APIs/ioctls but will use its own implementation of kernel mode-setting. The EGL improvements are said to land in the closed-source driver this autumn, while the other changes probably won't be seen until next year.
80 comments | about three weeks ago
MojoKid (1002251) writes A new interview with Assassin's Creed Unity senior producer Vincent Pontbriand has some gamers seeing red and others crying "told you so," after the developer revealed that the game's 900p resolution and 30 fps target on consoles is a result of weak CPU performance rather than GPU compute. "Technically we're CPU-bound," Pontbriand said. "The GPUs are really powerful, obviously the graphics look pretty good, but it's the CPU that has to process the AI, the number of NPCs we have on screen, all these systems running in parallel. We were quickly bottlenecked by that and it was a bit frustrating, because we thought that this was going to be a tenfold improvement over everything AI-wise..." This has been read by many as a rather damning referendum on the capabilities of the AMD APU under the hood of Sony's and Microsoft's new consoles. To some extent, that's justified; the Jaguar CPU inside both the Sony PS4 and Xbox One is a modest chip with a relatively low clock speed. Both consoles may offer eight CPU threads on paper, but games can't access all that headroom. One thread is reserved for the OS, and a few more cores will be used for processing the 3D pipeline. Between the two, Ubisoft may have had only 4-5 cores for AI and other calculations — scarcely more than last gen, and the Xbox 360 and PS3 CPUs were clocked much faster than the 1.6 / 1.73GHz frequencies of their replacements.
338 comments | about three weeks ago
An anonymous reader writes: AMD is moving forward with its plans to develop a new open-source Linux driver model for its Radeon and FirePro graphics processors. The unified Linux driver model is moving forward, albeit slightly differently than planned early this year. AMD is now developing a new "AMDGPU" kernel driver to power both the open- and closed-source graphics components. This new driver model will also only apply to future generations of AMD GPUs. Catalyst is not being open-sourced, but will become a self-contained user-space blob, while the DRM/libdrm/DDX components will be open-source and shared. This new model is more open-source friendly, places greater emphasis on the mainline kernel driver, and should help Catalyst support Mir and Wayland.
56 comments | about three weeks ago
MojoKid writes: When Nvidia launched its new GeForce GTX 980 and 970 last month, it was obvious that these cards would be coming to mobile sooner rather than later. The significant increase that Maxwell offers in performance-per-watt means these GPUs should shine in mobile contexts, maybe even more so than on the desktop. Today, Nvidia is unveiling two new mobile GPUs — the GeForce GTX 980M and 970M. Both notebook graphics engines are based on Maxwell's 28nm architecture, and both are trimmed slightly from the full desktop implementation. The GTX 980M is a 1536-core chip (just like the GTX 680 / 680M) while the GTX 970M packs 1280 cores. Clock speeds are 1038MHz base for the GTX 980M and 924MHz for the GTX 970M, which is significantly faster than the previous-gen GTX 680M's launch speeds. The 980M will carry up to 4GB of RAM, while the 970M will offer 3GB and a smaller memory bus.
From eyeballing relative performance expectations, the GTX 970M should be well-suited to 1080p or below at high detail levels, while the GTX 980M should be capable of ultra detail at 1080p or higher resolutions. Maxwell's better efficiency means it should offer a significant performance improvement over mobile Kepler, even with the same number of cores. Also with this launch, Nvidia is introducing "Battery Boost" as a solution for games with less demanding graphics: battery life can be extended by governing clock speeds to maintain a playable frame rate, rather than overdriving the GPU at higher-than-needed frame rates.
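The clock-governing idea behind Battery Boost can be sketched as a simple control loop: step the GPU clock down to the lowest level that still sustains the target frame rate. All names, clock steps, and the throughput model below are hypothetical illustrations; Nvidia's actual algorithm is not public.

```python
# Illustrative frame-rate governor in the spirit of "Battery Boost".
# Numbers and function names are made up for the sketch.

TARGET_FPS = 30
CLOCK_STEPS_MHZ = [540, 650, 760, 875, 924]  # hypothetical P-states, low to high

def frames_per_second(clock_mhz):
    # Stand-in for a real measurement: assume throughput scales with clock.
    return clock_mhz * 0.05

def pick_clock(target_fps=TARGET_FPS):
    """Choose the lowest clock that still sustains the target frame rate."""
    for clock in CLOCK_STEPS_MHZ:
        if frames_per_second(clock) >= target_fps:
            return clock
    return CLOCK_STEPS_MHZ[-1]  # target unreachable: run flat out

print(pick_clock())  # 650 — the lowest step whose modeled fps >= 30
```

Running below the maximum clock whenever the target is already met is where the battery savings come from.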
29 comments | about three weeks ago
An anonymous reader writes William Steptoe, a senior researcher in the Virtual Environments and Computer Graphics group at University College London, published a paper (PDF) detailing experiments on the seamless integration of virtual objects into a real scene. Participants were tested to see if they could correctly identify which objects in the scene were real and which were virtual. With standard rendering, participants guessed correctly 73% of the time. Once a stylized rendering outline was applied, accuracy dropped to 56% (around chance), and fell further to 38% as the stylization was increased. Lower accuracy means users were less able to tell the difference between real and virtual objects. Steptoe says that this blurring of real and virtual can increase 'presence', the feeling of being truly present in another space, in immersive augmented reality applications.
75 comments | about a month ago
sfcrazy writes Adobe is bringing the king of all photo-editing software, Photoshop, to Linux-based Chrome OS. Chrome OS-powered devices, such as Chromebooks and Chromeboxes, already have a decent line-up of 'applications' that can work offline and eliminate the need for a traditional desktop computer. So far it sounds like great news. The bad news is that the offering is in its beta stage and is available only to customers of the Creative Cloud Education program residing in the U.S. I have a full subscription to Creative Cloud for Photographers and Lightroom, but even I can't join the program at the moment.
197 comments | about a month ago
An anonymous reader writes: In a blow to those working on open-source drivers, soft-mods for enhancing graphics cards, and Chinese knock-offs of graphics cards, NVIDIA has begun signing and validating GPU firmware images. With the latest-generation Maxwell GPUs, not all engine functionality is exposed unless the hardware detects that the firmware image was signed by NVIDIA. This is a setback for the open-source Nouveau Linux graphics driver, but its developers are working toward a solution whereby NVIDIA can provide signed, closed-source firmware images to the driver project for redistribution. Initially the lack of a signed firmware image will only prevent some thermal-related bits from being programmed, but with future hardware the list of restricted functionality is expected to grow.
192 comments | about a month ago
MojoKid writes: Save for a smattering of relatively small 3K and 4K laptop displays, we haven't quite reached the same pixel density on the PC platform that is available on today's high-end ultra-mobile devices. That said, the desktop display space has really heated up as of late, and 4K panels have generated a large part of the buzz. Acer just launched the first 4K display with NVIDIA G-Sync technology on board. To put it simply, G-Sync keeps a display and the output from an NVIDIA GPU in sync, regardless of frame rates or whether V-Sync is enabled. Instead of the monitor controlling the timing and refreshing at, say, 60Hz, timing control is transferred to the GPU. The GPU scans a frame out to the monitor, and the monitor doesn't update until the frame is done drawing, in lock-step with the GPU. This method completely eliminates the tearing and frame stuttering associated with the synchronization anomalies of standard panels. There are still some quirks with Windows and many applications that don't always scale properly on high-DPI displays, but the situation is getting better every day. If you're a gamer in the market for a 4K display primed for gaming, the Acer XB280HK is a decent new option with this technology on board, though it does come at a bit of a premium at $799 versus standard 28-inch panels.
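The timing difference the summary describes can be illustrated with a toy model: on a fixed-refresh panel, any frame that takes longer than one refresh interval misses its scanout slot (stutter with V-Sync on, tearing with it off), whereas a G-Sync-style display simply waits for the frame. The timings, function names, and panel limits below are made up for illustration and are not the actual hardware protocol.

```python
# Toy model contrasting fixed-refresh and GPU-driven (G-Sync-style) scanout.

def fixed_refresh_misses(frame_ms, refresh_ms=16.7):
    """Count frames that overrun one fixed refresh interval (~60Hz)."""
    return sum(1 for t in frame_ms if t > refresh_ms)

def adaptive_refresh_misses(frame_ms, min_ms=6.9, max_ms=33.3):
    """With adaptive sync, the panel refreshes when the frame is ready,
    so only frames outside the panel's supported interval are late."""
    return sum(1 for t in frame_ms if not (min_ms <= t <= max_ms))

frames = [12.0, 18.5, 16.0, 21.0, 14.2]  # hypothetical render times in ms
print(fixed_refresh_misses(frames))      # 2 frames overrun the 16.7 ms slot
print(adaptive_refresh_misses(frames))   # 0 — all fall inside the panel range
```

The point of the model: frame times that straddle the fixed interval are exactly the ones that cause visible judder, and letting the GPU drive the refresh makes them a non-event.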
64 comments | about a month ago
MojoKid writes Few would argue that current console and PC graphics technologies have reached the level of "photo-realism." However, a company by the name of Euclideon claims to be preparing to deliver that holy grail based on laser scanning and voxel-engine-based technologies. The company has put together a six-minute video clip of its new engine, and it's genuinely impressive. There's a supposed-to-be-impressive reveal around the two-minute mark, where the announcer declares he's showing us computer-generated graphics rather than a digital photo — something you'll probably have figured out long before that point. Euclideon's proprietary design purportedly uses a laser scanner to create a point-cloud model of a real-world area. That area can then be translated into a voxel renderer and drawn by a standard GPU. Supposedly this can be done so efficiently and with such speed that there's no need for conventional load screens or enormous amounts of texture memory; data is simply streamed off conventional hard drives. Previously, critiques have pointed to animation as one area where the company's technique might struggle. Given the ongoing lack of a demonstrated solution for animation, it's fair to assume this would-be game-changer has some challenges still to solve. That said, some of the renderings are impressive.
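The point-cloud-to-voxel step the summary describes can be sketched at its simplest: quantize each scanned sample into a grid cell and keep one representative value per occupied cell. Real engines (Euclideon's included) use far more sophisticated hierarchical structures such as sparse voxel octrees; the function and data below are purely illustrative.

```python
# Minimal sketch of voxelizing a laser-scanned point cloud.
from collections import defaultdict

def voxelize(points, voxel_size):
    """Map each (x, y, z, color) sample to its voxel cell, averaging colors."""
    cells = defaultdict(list)
    for x, y, z, color in points:
        key = (int(x // voxel_size), int(y // voxel_size), int(z // voxel_size))
        cells[key].append(color)
    # Collapse each occupied cell to one representative color.
    return {key: sum(cs) / len(cs) for key, cs in cells.items()}

# Hypothetical scan: three samples, two of which share a voxel.
cloud = [(0.1, 0.2, 0.0, 10), (0.3, 0.1, 0.2, 20), (1.5, 0.0, 0.0, 30)]
grid = voxelize(cloud, voxel_size=1.0)
print(len(grid))  # 2 occupied voxels: (0,0,0) and (1,0,0)
```

Because only occupied cells are stored, a sparse structure like this can be paged in from disk region by region, which is the kind of streaming the summary alludes to.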
134 comments | about a month ago
An anonymous reader writes Counter-Strike: Global Offensive has finally been released for Linux, two years after its Windows debut. The game is reported to work even on the open-source Intel Linux graphics drivers, but your mileage may vary. As for the AMD and NVIDIA drivers, NVIDIA continues to dominate Linux gaming over AMD's Catalyst, which still suffers from performance deficits and other OpenGL issues.
93 comments | about a month ago
schwit1 writes Using its new top-shelf graphics processing unit, Nvidia tackles one of the most persistent conspiracy theories in American history: the veracity of the 1969 to 1972 Apollo moon landings. From the article: "'Global illumination is the hardest task to solve as a game company,' Scott Herkelman, Nvidia's GeForce general manager, said in an interview. 'Virtual point lights don't do a bad job when the environment stays the same, but a game developer has to fake shadows, fake reflections...it's a labor-intensive process.' So when an Nvidia research engineer used the company's new dynamic lighting techniques to show off a side-by-side comparison between an Apollo 11 photo and a GeForce-powered re-creation, the company knew it had a novel demo on its hands. 'We're going to debunk one of the biggest conspiracies in the world,' Herkelman said."
275 comments | about a month ago