
Comments


Dad Makes His Kid Play Through All Video Game History In Chronological Order

default luser Guess you forgot F-22 Interceptor for the Genesis (222 comments)

And its predecessor, LHX Attack Chopper. Two games that managed full filled-polygon 3D engines on just a 68000 processor (AFAIK there is no coprocessor in the cartridge).

http://en.wikipedia.org/wiki/F...

While most ground targets were just boxes, the plane models were surprisingly detailed. Star Fox the first 3D game on consoles? Hah, I was playing F-22 for years before Star Fox came along :D

about two weeks ago

Ask Slashdot: Making a 'Wife Friendly' Gaming PC?

default luser Get a Fractal Design Node 304 (720 comments)

This case has the best airflow I have ever seen in a MiniITX case that isn't ten feet tall, with three fan settings. The front has two 92mm fans, and the rear has one 140mm exhaust. It cools processor plus graphics card well even on low, and you can't hear anything from further than 3 inches away.

Just add an Intel Core processor and a video card in the GTX 750 Ti range, and you'll have a capable gaming system that runs in near silence.

about three weeks ago

Oldest Human Genome Reveals When Our Ancestors Mixed With Neanderthals

default luser clan of the cave bear (128 comments)

That's the first book in the series.

It was made into a film 6 years after the original book, after the author had written the second and third books.

about 2 months ago

Which Android Devices Sacrifice Battery-Life For Performance?

default luser Re:all (108 comments)

Turn on Power Save mode and turn off Wifi? Check to see if you have any CPU-heavy applications, and force-quit them while not in-use?

Also, the amount of power used depends entirely on how strong a signal you are getting. For example, my Galaxy S4 typically uses 30-35% battery per day on a normal home/work day, but this last weekend I went up to the middle of nowhere, PA. The house barely got 3G at one bar, and because of the shit signal my phone was down to 30% charge every night. If your place of work is inside a large building, it can play havoc with your signal.

about 2 months ago

One In Three Jobs Will Be Taken By Software Or Robots By 2025, Says Gartner

default luser Re:Yes yes yes (405 comments)

They are "undervalued" in your eyes. In reality, they get no value because there is no more growth. This is evidenced by the increase in base unemployment since the early 1990s.

http://bilbo.economicoutlook.n...

While other countries have employment ups and downs, Japan's population has been on a downward spiral because there is no more real growth. Since the 1980s, manufacturing has moved to Korea and China, and there's not enough of a service and tech industry to cover the loss.

So yeah, for the last 20 years those people who have jobs are treated well, and the young people get the finger. Same thing is beginning to happen here in the United States, although not as massive a movement. The Japanese companies are going to either adapt, or crater and then be reborn as fast-and-lean pension-less workhouses like the US.

about 2 months ago

David Cameron Says Brits Should Be Taught Imperial Measures

default luser Re:FP? (942 comments)

In Ada, you can separate numbers with underscores (they are ignored by the compiler). This allows you to create more easily-readable numbers like:

Var := 2#0011_0110#;
Var := 3_295_692_839_298_459;

Not sure how many other languages allow this, but it should be every one of them :D
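Quite a few have picked it up since: Python (3.6+), Java, and C# all allow underscore digit separators, and C++14 uses an apostrophe instead. A quick Python sketch mirroring the Ada example above:

```python
# Python 3.6+ ignores underscores in numeric literals, much like Ada,
# so these are identical to the plain forms.
binary_flags = 0b0011_0110          # same value as 0b00110110 (54)
big_number = 3_295_692_839_298_459  # same as 3295692839298459

print(binary_flags)  # 54
print(big_number)    # 3295692839298459
```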

about 3 months ago

Microsoft Paid NFL $400 Million To Use Surface, But Announcers Call Them iPads

default luser Re:How much! (405 comments)

This only holds for simple stuff like clothing, furniture, dishware and other branded crap. Anything with a visible tie to the team (a prominent logo), not too gaudy, and with some basic functionality understandable by the average idiot is a guaranteed sale.

Most people won't buy anything as complicated as a Surface just because they see NFL players using them. There's no SIMPLE need that the Surface satisfies, and then it's also rather expensive, and thanks to poor naming and a late arrival to market, it's lost in a sea of tablets/convertibles. There's also no actual NFL branding on the device, just product placement; so as soon as they see it, most people just automatically assume it's an iPad anyway.

Anything complex to use is best not sold directly to the NFL audience. I mean, nobody went out to buy Motorola radio headsets (or even their cell phones) just because they saw them every week for 13 years on the NFL sidelines, but they sure as hell buy the team-branded warmups that everyone is wearing. Electronics sponsorship for the NFL is more about letting everyone know your company still exists, not about pushing specific products - much like putting your name on a stadium.

about 3 months ago

$299 Android Gaming Tablet Reviewed

default luser There's more to it than just that (65 comments)

You have to want a better streaming experience than Valve's Steam already offers for free (and you can buy a Windows tablet for the same price, and Valve is expected to support Android and iOS soon). With Steam, you can use whatever system and whatever video card you want to stream to and from - you can even go wired ethernet to get around the inevitable problems of streaming games over wireless.

If you go Shield, the tablet price is just the beginning: you need a mid-range GeForce card purchased within the last 2 years ($120+ if you don't already have one), the controller + stand ($100), and of course a suitable dual-band router, which runs at least $70 (most people use the crap one that came with their internet install).

In all, you could be on the hook for anywhere from $400 all the way up to $600. That's getting DANGEROUSLY close to the same price as an entry-level gaming PC, so again the need just doesn't present itself there.
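For the curious, the tally behind those numbers (a quick sketch using the rough prices from this comment, not official pricing):

```python
# Rough Shield streaming cost tally; all prices are the estimates above.
tablet = 300        # Shield tablet
controller = 100    # controller + stand
router = 70         # dual-band router, if yours is the ISP freebie
gpu = 120           # mid-range GeForce, if you don't already own one

best_case = tablet + controller                  # already own GPU and router
worst_case = tablet + controller + router + gpu

print(best_case, worst_case)  # 400 590
```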

about 5 months ago

NVIDIA Launches Tegra K1-Based SHIELD Tablet, Wireless Controller

default luser Re:I can already do this on Android... (42 comments)

You'll notice in the second half of my post that I called the separate controller + tablet + stand gaming method CLUMSY, and praised the original Shield for being all-in-one. This is because I also agree with you that this sort of gaming is untenable, and I actually gave up and bought a 3DS about 6 months back to get my portable gaming fix.

Touchscreens suck for gaming!

Also, you'll notice I suggested the WIRED 360 controller because it's not nearly as complicated as pairing a wireless bluetooth controller. Just plug it in, and play your games...or you can sit here all day and complain about Android breaking your favorite bluetooth controller.

about 4 months ago

NVIDIA Launches Tegra K1-Based SHIELD Tablet, Wireless Controller

default luser Re:I can already do this on Android... (42 comments)

The K1 has *vastly more GPU power* than any other ARM SoC out.

And your point is?

Games are being produced for the mainstream-to-high-end hardware currently out there, so a SoC with twice the power brings nothing useful to the table. And streaming PC games requires almost none of that GPU power, so it's one of those questionable value cases.

Then there's also the concern about whether K1 will throttle under load, since this tablet doesn't have a fan like last year's Shield did. You'll notice that NONE of today's reviews go into that level of detail, and that's probably no accident.

about 4 months ago

NVIDIA Launches Tegra K1-Based SHIELD Tablet, Wireless Controller

default luser I can already do this on Android... (42 comments)

With something as simple as a USB OTG adapter and a wired Xbox controller.

Believe it or not, most modern Android games already support the Xbox controller, and if you're gaming on such a tiny screen you can believe me that the wireless controller is NOT a necessity (will be 1-2 feet away at most). You can buy a $200-ish entry-level Android tablet that can handle games just fine, and reuse the Xbox controller you undoubtedly already have. So, why would you spend $300 + $100 for the same thing with an Nvidia label on it?

I'm actually disappointed that they didn't stick with the concept of the original Shield, and deliver another handheld gaming system; that alone is the only thing you cannot find in the Android world. If they insist that you use a clumsy tablet setup involving a screen prop and a separate controller, then why do we have to buy the tablet, screen prop and controller FROM THEM?

about 4 months ago

Coming Soon(ish) From LG: Transparent, Rollup Display

default luser Promises... (64 comments)

Also, there's the unavoidable problem you have with display clarity. Right now screens are on a flat substrate, so each pixel is aligned with the next one, which reproduces an image accurately. But what happens when you have an unrolled display sitting on your desk, or held in your hand? It will inevitably have varying levels of curvature along its length and possibly more complex crumples, resulting in poor image accuracy. Fixing that will require some clever sensors embedded in the display along with some expensive signal processing, and that fix will STILL cost you resolution.

Then when you consider that LG's current flexible displays have poor color rendition and contrast, along with piss-poor resolution, you realize how much of a lost cause this is. I cannot see myself giving up the best qualities of modern displays so that they break a little less often, and can fit in a smaller pocket.

about 5 months ago

Researchers Unveil Experimental 36-Core Chip

default luser Re:Moore's Law (143 comments)

In your glass tower, yes.

In the real world, not so much.

Here is an example of one of the world's most optimized pieces of software: x264. It's also one of the few real-world loads that can take advantage of multiple processors and SSE. So how much speedup did this incredible piece of software see with AVX2, which DOUBLED the width of the integer pipelines?

FIVE PERCENT! Yup, that's it!

All that work for so very little improvement, because in the REAL WORLD data does not align on perfect AVX2 boundaries, and data fetch is as much of a hindrance as the actual processing of that data. Read more about WHY this is the best that could be done here, if you don't mind paying for SCRIBD.

Parading around test results from something like Passmark is just self-delusion. It only tests that the features do in fact work, and these tests tend to work directly from cache on small data sets that are usually not branch-heavy. It gives a score for raw MIPS, but does not take into account the fact that most real software can't actually make use of these features at speed.

And when they increase the vector size yet again to 512 bits wide in a year, it will once again be a limited real-world improvement, because optimization of real loads is hard, and auto-vectorization of arbitrary loads is an even harder problem to solve. So Intel keeps adding new features, and each adds about 5-7% (real world). So I don't see how you get above 3x from those puny performance increases without deluding yourself.

about 6 months ago

How Vacuum Tubes, New Technology Might Save Moore's Law

default luser Nice intentional troll (183 comments)

Sorry to say, it's too cohesive to be real. Better luck hooking the big kahuna next time, eh?

about 6 months ago

Researchers Unveil Experimental 36-Core Chip

default luser Re:Moore's Law (143 comments)

Which uses Passmark, which is a simple corner-case number-crunching bonanza. Pure AVX2 or FMA without any real-world qualifiers, restrictions or branching? Sure, we've got that!

And even with that, you're still off. The performance improvement with Haswell per-core is less than 5x. See here:

http://www.cpubenchmark.net/compare.php?cmp[]=2020&cmp[]=1127

So, in the unoptimized case the performance improvement is 2-3x, and in the embarrassingly-parallel case the speedup is 4-5x. But then, if you had such an embarrassingly-parallel case, you'd just port it to OpenCL and be done with it. Haswell is for all those hard-to-optimize compute cases.

about 6 months ago

Researchers Unveil Experimental 36-Core Chip

default luser Re:Moore's Law (143 comments)

Absolutely not true.

The Core 2 Duo is approximately 2x faster clock-for-clock than the Pentium 4, and the current Haswell core is barely 40% faster than that (assume a 7% per-clock speedup for every core revision since). That gets you somewhere in the 2x-3x performance improvement range for Haswell, barring corner cases that can leverage AVX/FMA embarrassingly easily (most real-world use cases show small improvements).
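The back-of-the-envelope arithmetic works out: compounding an assumed 7% per-clock gain over five core revisions (a rough count from Core 2 to Haswell) lands right in the claimed range:

```python
# Compound an assumed 7% per-clock gain over 5 core revisions,
# then stack it on the ~2x Core 2 vs. Pentium 4 jump.
per_rev = 1.07
revs = 5
haswell_vs_core2 = per_rev ** revs        # ~1.40, i.e. "barely 40% faster"
haswell_vs_p4 = 2.0 * haswell_vs_core2    # ~2.8x, inside the 2x-3x range

print(round(haswell_vs_core2, 2), round(haswell_vs_p4, 2))  # 1.4 2.81
```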

Intel proved that they could do a whole lot better than the Pentium 4, but your performance improvement factor is off by half!

about 6 months ago

Will 7nm and 5nm CPU Process Tech Really Happen?

default luser Re:This affects our entire industry (142 comments)

I figured the hardware effect was fairly obvious :D

I concentrated on the software side effects because more readers here work on that end.

about 6 months ago

Will 7nm and 5nm CPU Process Tech Really Happen?

default luser This affects our entire industry (142 comments)

Because whatever you do in the computing world, you are affected by processing power and cost. Growth in these regions drives both new hardware and new software to go with it, and any hit to growth will mean loss of jobs.

Software (what most of us here create) usually gets created for one of two reasons:

1. Software is created because nobody is filling a need. Companies may build their own version if they want to compete, or a company may contract a customized version if they can see increased efficiency or just have a process they want to stick to. There used to be a lot of unfulfilled need out there, but much of that demand has been sated in the 21st century.

2. Software is created because a company desires increased performance/new features (the basic need is filled, this is a WANT). Once a new processor/feature becomes available, you either wedge it into existing code, or, if it's a massive enough improvement, you create entirely new software enabled by the new level of performance-per-dollar.

Without continued growth, the industry is in danger of cratering, because there's only so much processor architecture optimization you can do in the same process node, and the same goes for optimized libraries on the software side. In addition, brand-new industries enabled by cost reductions (e.g. the digital FMV explosion in the 1990s, or the movement to track your every move in the 2000s) will no longer be so common, and that will again force people to look elsewhere for employment.

Software engineers won't disappear, but they will be culled. The industry has not had to deal with that yet in its entire history, so it will be painful. I'm hoping they can hold this off for as long as possible!

about 6 months ago

Kingston and PNY Caught Bait-and-Switching Cheaper Components After Good Reviews

default luser Re:Kingston selling shit USB3 flash keys (289 comments)

You compared a 128GB drive to an 8GB drive. That's likely your problem.

Flash is inherently parallel, which means that the more chips you have, the more bandwidth the controller can extract. USB 3 versus USB 2 is of no concern if you can't even squeeze enough bandwidth from the flash chips to saturate the interface.

There is also the quality of the controller that could affect things. USB 3 flash controllers come in all sorts of different specifications: you can have something that barely exceeds the speeds of USB 2, or a slightly more expensive controller that has fast block reads but poor small file performance and slow writes, or you can pay a premium price for all-around excellent performance. This is the same thing you saw in USB 2 land, and also quite clearly seen in the SATA SSD world, so why would you expect anything less in USB 3 land? You bought a bunch of low-end "USB 3" labeled parts, and you probably got exactly what you paid for.

This happens in every industry, because there's a different set of requirements for every purchase, and an OEM ticking all the right boxes at the right price gets the sale, so they make sure to have lots of different options. Don't blame Kingston because you were shopping for crap and received crap.

This is not the same as relabeling products with advertised speeds that are higher than what was delivered. THAT is bait-and-switch, which is reprehensible. That has nothing at all to do with your case, which was simply a case of you not doing your homework.

about 6 months ago

Could High Bay-Area Prices Make Sacramento the Next Big Startup Hub?

default luser You forgot a third, but very important negative re (190 comments)

3. The cities are islands in a sea of rural nothingness. Seriously, if you make your home in (e.g.) Austin, just try to commute somewhere else. San Antonio is a stretch (1.5-3 hours each way, depending on which sides of the city you are commuting to), and Houston and Dallas are out. Every other town is too small and too isolated to attract tech industry jobs.

This means that when a major tech industry in your chosen metro area craters, it takes YEARS for the economy to recover, and there's no other option available except for you to move. So if you move to the area seeking fame and fortune, remember to keep a deep nest egg, and don't expect to put down any deep roots.

Believe me, my family moved to Austin to follow the growing tech industry in 1983, and they ditched the place in the late 90s because they were tired of dealing with the boom-bust cycle. Since they moved, Austin crashed yet-again (Dell + Dot Com Bubble at the same time). The place has finally recovered and looks attractive again, but it will only be a short matter of time before another crash hits. So keep your nest egg close, and your roots shallow folks!

about 7 months ago

Submissions

default luser hasn't submitted any stories.

Journals


Touch Typing

default luser default luser writes  |  more than 10 years ago

What is it with the entire geek culture and touch typing? It seems these days that if you don't have at least 8 fingers on the keyboard, and a whopping speed of 60+ wpm as a minimum, you automatically do not qualify as a geek. I mean, people toss out bonus geek points if you know how to switch between QWERTY and Dvorak on the fly.

What are we, geeks, or one giant clique centered on convincing everyone else that they are worthless unless they type with one standard posture and position?

This is sickening, really. I have never learned how to type "properly." That is to say, I HAVE taken classes and even tried a couple typing tutors, but the end result has been unchanged. I still to this day type everything with two fingers.

Now, you might say, how do I get anything done with that horrible interface? You have to realize that people, even untrained, can make their bodies do exceptional things. For those of you who gaze in awe of a man who can pump out 120 wpm with his hands in perfect touch-typing position, I offer you an alternative tidbit of amazement:

I have typed this entire post to this point with my two little fingers in under 3.5 minutes.

END TEST. Three errors corrected (~99% accuracy).

Who says that there is only one "right" way to do a task?
