Comments


At Oxford, a Battery That's Lasted 175 Years -- So Far

im_thatoneguy Re:Hold your horses (210 comments)

Correct me if I'm wrong, but without knowing the voltage, isn't comparing amp-hour figures to one another useless?

5 V × 1 Ah = 5 Wh
12 V × 1 Ah = 12 Wh

An amp-hour isn't actually a unit of energy.

One AA battery has about 2.6 Ah × 1.5 V = 3.9 Wh
One D battery has about 18 Ah × 1.5 V = 27 Wh

175 years ≈ 1,533,000 hours × 7,200 nanoamp-hours per hour (a 7.2 µA average draw) ≈ 11 Ah. If the pile runs at 0.1 V, that would be about 1.1 Wh of capacity; if it runs at 10 V, it would be about 110 Wh. That makes a pretty big difference, and without knowing the voltage we can't compare.
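To make the comparison concrete, here's a sketch of that arithmetic in Python (the 0.1 V and 10 V figures are just the hypothetical bounds above, since the pile's actual voltage isn't known):

```python
# Energy capacity in watt-hours from voltage and amp-hour capacity.
def watt_hours(volts, amp_hours):
    return volts * amp_hours

hours = 175 * 8760                # ~1,533,000 hours in 175 years
avg_draw_a = 7.2e-6               # 7,200 nAh per hour = 7.2 microamps average
capacity_ah = hours * avg_draw_a  # ~11.04 Ah delivered so far

print(round(capacity_ah, 2))                   # ~11.04 Ah
print(round(watt_hours(0.1, capacity_ah), 2))  # ~1.1 Wh if the pile is 0.1 V
print(round(watt_hours(10, capacity_ah), 1))   # ~110.4 Wh if it's 10 V
```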

3 days ago

The Current State of Linux Video Editing

im_thatoneguy Nuke Studio, Flame and Davinci (223 comments)

If we're talking about NLEs for VFX, then the obvious choice would be Nuke Studio (http://www.thefoundry.co.uk/products/nuke/studio/). It's integrated with Nuke, which is used everywhere, and it's a multiplatform app that runs on Linux, OS X and Windows.

DaVinci is also available for Linux, and it's got pretty decent editing capabilities now. And like Nuke Studio, it has lots of VFX-friendly features like handles and solid EDL support.

Another obvious option is the Autodesk (Discreet) systems. Flame Premium 2013 supports Linux; for a while there, Flame/Inferno were exclusively Linux.

So there is plenty of VFX editing on Linux; it's just pricey for the most part and not at all open source.

5 days ago

The Current State of Linux Video Editing

im_thatoneguy Nuke Studio is designed for VFX facilities (223 comments)

Blender is not going to address the needs of a VFX facility. Having a Python checkbox isn't enough to handle the sorts of scenes a feature-film VFX shot demands in most situations. There is a reason CG supervisors still pick Max, Maya or Houdini over "free" software, and that's the cost of productivity. $3,000 is a small price to pay compared to being even 10% more productive. The average VFX artist is paid at least $65,000, so if you need 10% more artists to do the same thing in the same amount of time, you're paying $6,500 per artist per year in lost productivity. That's substantially more expensive than $1,000 per year for maintenance.

Which isn't to say there aren't good video editing applications for Linux. For a VFX studio's editing needs, Nuke Studio is enough, and it runs on Linux:

http://www.thefoundry.co.uk/pr...

In fact, from a VFX facility's perspective it integrates into a pipeline better than any of the other commercial editing applications, and it works well with Nuke, which is the de facto standard for compositing.
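The license-vs-productivity arithmetic above is simple enough to sketch (the salary, productivity-gap and maintenance figures are the ones quoted in the comment):

```python
# Break-even: commercial seat maintenance vs. lost productivity.
salary = 65_000          # average VFX artist salary, per year
productivity_gap = 0.10  # being 10% less productive on "free" tools
maintenance = 1_000      # commercial license maintenance, per year

lost_per_artist = salary * productivity_gap  # $6,500/year in extra labor
print(lost_per_artist)                # 6500.0
print(lost_per_artist > maintenance)  # True: the license pays for itself
```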

5 days ago

Microsoft Reveals Windows 10 Will Be a Free Upgrade

im_thatoneguy Re:Only for the first year (567 comments)

Or this is the exact same policy they've had for, what, nearly 20 years, and they want to get people onto an OS that supports their new Universal Application runtime in order to encourage adoption and, by extension, Microsoft App Store revenue.

5 days ago

Intuit Charges More For Previously Offered TurboTax Features, Users Livid

im_thatoneguy Re:Schedule D?! (450 comments)

If you're getting that much back on your refund then you're probably doing it wrong.

I did do it wrong, but I also spent a lot more than I anticipated after already paying. It *does* create a bit of a loophole, though: 10% is a pretty good return, especially since you can theoretically pay it all in the final quarter. So if you want to buy a big-screen TV on Amazon for $3,000: overpay by $2,800 in your final quarter, file your taxes in February, and take your refund entirely in Amazon gift certificates (the 10% bonus turns $2,800 into $3,080), and you just got roughly 10% off your TV.
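For what it's worth, the exact break-even under a 10% gift-card bonus is a little lower than $2,800; a quick sketch (the 10% bonus rate is the one described above):

```python
# How much to overpay so the refund, taken as Amazon gift cards
# with a 10% bonus, exactly covers a given purchase price.
BONUS = 0.10

def overpayment_needed(price):
    # Refund r becomes r * (1 + BONUS) in gift cards; solve r * 1.1 = price.
    return price / (1 + BONUS)

tv = 3_000
refund = overpayment_needed(tv)
print(round(refund, 2))                # 2727.27: overpay this much
print(round(refund * (1 + BONUS), 2))  # 3000.0 in gift cards
```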

about two weeks ago

Intuit Charges More For Previously Offered TurboTax Features, Users Livid

im_thatoneguy Re:Schedule D?! (450 comments)

I bought the professional TurboTax on Amazon and it cost $64, and I get 10% extra on everything from my refund that I put into Amazon gift certificates. So with TurboTax this year I could theoretically take my whole refund as Amazon gift certificates and pay off TurboTax a few times over. But I don't think I spend enough on Amazon, even with my prolific purchasing, to justify taking all of it back in gift card balance.

about two weeks ago

Why We're Not Going To See Sub-orbital Airliners

im_thatoneguy Re:That target already captured elsewhere (300 comments)

I'm a Global Entry card holder; there's almost no line at all for customs. If they're doing quarantine and smuggling checks, they can do it on the train en route. It would be like gate-checking: check in at the train station, then have a white-glove security service run your luggage through adequate screening and take it directly from the train to the spacecraft.

about three weeks ago

Why We're Not Going To See Sub-orbital Airliners

im_thatoneguy Re:That target already captured elsewhere (300 comments)

This is a fair point. I just flew from my home town to my current city.

Time to drive to airport: 15 minutes
Time in security/waiting area: 40 minutes
Aircraft taxi out: 5 minutes
Flight time: 45 minutes
Aircraft taxi in: 15 minutes
Time to get to the taxi stand: 10 minutes
Time for the cab to get downtown: 35 minutes

Total overhead to/from the runway: 2 hours.
Flight time: 45 minutes.
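Summing the non-flight segments above (reading the two ambiguous "taxi" lines as aircraft taxi time, which is my assumption):

```python
# Door-to-door overhead for a 45-minute flight, in minutes.
segments = {
    "drive to airport": 15,
    "security/waiting": 40,
    "aircraft taxi out": 5,
    "aircraft taxi in": 15,
    "walk to taxi stand": 10,
    "cab downtown": 35,
}
overhead = sum(segments.values())
print(overhead)       # 120 minutes of overhead...
print(overhead / 60)  # ...i.e. 2.0 hours, around a 45-minute flight
```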

Then again, LA to Sydney is about a 15-hour flight. If this is the future and you had a maglev/hyperloop-type transport, you could get to a remote spaceport, completely isolated from any urban area, in under an hour. That would be about equal to the traffic to/from the nearest small airfield for a private jet. It also wouldn't need much security, since the only real threat would be a bomb, and they could pre-screen your luggage en route to make things efficient. All in all, with a good high-speed rail solution you could easily beat a business jet. The fixed timetables are harder, but with a 15-hour flight to beat you could stay overnight in a hotel, be productive, leave the next morning and still beat the flight.

about three weeks ago

Why We're Not Going To See Sub-orbital Airliners

im_thatoneguy Re:I think the thing being missed here (300 comments)

A flight from New York to Singapore is usually around $1,300. A Suite Class ticket from New York to Singapore is $23,000.
https://medium.com/travel-adve...

People already pay 20x coach to fly comfortably for 18 hours. If you reduced the flight time to 2-3 hours, and people didn't need a bed, a shower and the other amenities associated with a full day in the sky, then you would be price competitive.

Here is another example. I rent a camera for $1,500+ for about 36 hours. If you had a cargo flight that could do a point-to-point delivery from Indonesia to my door, priced at 10x the cost of a ticket ($10,000) per 160 lbs, then a 16 lb package would cost 1/10th of that: $1,000 for shipping vs. $1,500 for an extra day of rental. That would save the renter $500, and you could even include a courier to the spaceport and back. I'm certain there are items today that could use a sub-orbital delivery and save money at $100 per lb.
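A sketch of that pricing, using the comment's own hypothetical figures (10x a ~$1,000 ticket per 160 lb slot, billed pro rata by weight):

```python
# Hypothetical suborbital cargo pricing, pro rata by weight.
SLOT_PRICE = 10_000  # dollars: 10x a ~$1,000 passenger ticket
SLOT_WEIGHT = 160    # lb per passenger-equivalent slot

def shipping_cost(weight_lb):
    return SLOT_PRICE * weight_lb / SLOT_WEIGHT

camera_lb = 16
extra_rental_day = 1_500  # dollars for one more day of rental
cost = shipping_cost(camera_lb)
print(cost)                     # 1000.0 to ship the camera
print(extra_rental_day - cost)  # 500.0 saved vs. renting another day
```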

about three weeks ago

Why We're Not Going To See Sub-orbital Airliners

im_thatoneguy Re:huh? (300 comments)

The Concorde could have continued flying. Virgin Atlantic offered to operate the aircraft, but British Airways didn't want anyone else operating them if they themselves didn't.

about three weeks ago

Tumblr Co-Founder: Apple's Software Is In a Nosedive

im_thatoneguy Re: Nosedive (598 comments)

Apparently, contrary to all those science fiction stories, people in general really don't want videophones after all, even after they became practical. To my knowledge, only uber-geeks are using it, and only because they can.

Phone calls, period, are barely used. People prefer asynchronous communication.

But video chat obviously has two big fans:
1) People showing someone something (real estate, Christmas presents, things in a store, etc.)
2) Long distance romantic partners.

The advantages are pretty obvious for both use cases. :D

about three weeks ago

Professor: Young People Are "Lost Generation" Who Can No Longer Fix Gadgets

im_thatoneguy Re:Dupe (840 comments)

Nonsense; my dad couldn't fix our car or an appliance. He could build houses, though.

If people were soooooo handy why were there so many appliance repair shops and car mechanics?

about three weeks ago

How We'll Program 1000 Cores - and Get Linus Ranting, Again

im_thatoneguy Re:Pullin' a Gates? (449 comments)

Game developers waste less processor power than just about any other developers I know of, short of supercomputer developers. When you have 16 ms to render a frame and you have to recreate the entire universe in those 16 ms, you have to be extremely judicious in your use of cycles.

about three weeks ago

How We'll Program 1000 Cores - and Get Linus Ranting, Again

im_thatoneguy Re:Pullin' a Gates? (449 comments)

Torvalds dismisses photo editing as a task for "professional photographers", but our amateur cameras are taking phenomenally detailed pictures, and even making fairly simple edits is a compute-intensive task. He may be right, but he may equally be wrong.

Torvalds is being completely ridiculous here. Avid used to be the domain of professional film editors, but iMovie is incredibly popular. We even see cell phones these days sporting 4K cameras. My Lumia has a 41-megapixel sensor! I have a RED camera, and it's "only" 18 megapixels. In fact, the less professional you are, the more processing power you need. Photoshop's paint brush can accomplish wonders in the hands of a professional touch-up artist, but Photoshop's Content-Aware Fill is processor murder, and it's designed specifically to intelligently replace a professional artist. Take something like 3D rendering: you could have someone hand-paint every frame, and it would without question require a professional artist. But if you want a pretty picture at the push of a button, you want raytracing.

This is actually something you see happening today in the high-end VFX market. It used to be that raytracing was too compute-intensive for films. For amateurs and non-artists, ironically enough, raytracing was fine: the architect only needed to render 3 frames, and waiting a day was perfectly acceptable because there weren't another 100,000 frames that also needed rendering. In film there wasn't time for something like global illumination, and the shortcuts caused unacceptable flickering. Now the film industry is starting to embrace advanced lighting like GI, and it's getting, automatically, all of the bounces and detail that used to take hundreds of lights to fake. It's making artists more productive, but it's coming at the cost of increased compute time. Again, a professional lighter can, as an artist, fake global illumination; an amateur can simply position the sun, turn on GI and wait 18 hours.

The future will be an automagical button that not only fixes your photo (*cough* Instagram *cough*) but also performs even more advanced editing like "remove the gray clouds and put in a photorealistic blue sky. Oh yeah, and also change the lighting of the photo to make it look sunny!" That's going to be far more CPU-intensive than any Photoshop filter currently in existence, and it'll be targeted as much at your average cell phone user as at a professional.

about three weeks ago

How We'll Program 1000 Cores - and Get Linus Ranting, Again

im_thatoneguy Re:Pullin' a Gates? (449 comments)

You assume that task-specific hardware is all that people will come up with. If you have to spin a new ASIC every time you want to improve your software, we aren't going to innovate. ASICs are specifically for something like 10Gb networking, which is a defined standard, but most tasks aren't defined standards; changing specs is the norm, not the exception, outside of core OS functionality like storage or networking. GPUs couldn't keep up, so they moved to a compiled per-pixel shading model so that developers could rapidly iterate and invent new uses, and in the process GPUs by necessity became pretty general purpose. But GPUs are still frustratingly limited in their general-purpose applications. There is a huge domain of problems that needs more than 4 cores but also needs more memory and larger caches than a GPU offers. You could legitimately call whatever processor manages to handle them a "CPU" or a "GPU".

about three weeks ago

How We'll Program 1000 Cores - and Get Linus Ranting, Again

im_thatoneguy Re:Pullin' a Gates? (449 comments)

The point isn't to pick any one approach or technology (say, neural nets); the point is that we *already* have applications that comfortably use more than Linus' mythically adequate 4 cores. A 4-core CPU is fantastic at running a word processor with an email client in the background, but that's not the future of computing. The future of computing is doing the work of the human brain, but better; the human brain is one example of the sort of application we are going to see more of. An improved Microsoft Word is not the future. An improved Chrome is not the future. We see the future in science fiction, and it's an interface that can communicate with us naturally. Natural human/computer communication means a whole new set of problems, and these are not problems relegated to "niche" marketplaces like research-lab supercomputers. The applications for machine vision are everywhere. The applications for voice recognition are everywhere. The applications for 'common sense' in your interactions are everywhere. These aren't problems I expect will be solved best with fast linear serial processes; to date, all of these classes of problems have been best approached with multi-threaded parallel computing.

You mention the GPU. It's true the GPU was a custom, semi-specialized piece of hardware; in fact the original 3D accelerators weren't even in the display card, they were pass-through cards. But you know what else used to be a semi-specialized chip? Math co-processors. Even today, GPUs are slowly blending back into the CPU. Once something like a math co-processor becomes sufficiently critical to the average user, it becomes part of the CPU's die. AMD has already integrated pretty substantial GPUs into their "APUs", and by definition SoCs integrate the GPU. If we do develop a chip that is critical to the average user, like a magic AI chip, then they'll just integrate it into the CPU.

It used to be that video playback was a niche market, and now just about every CPU, GPU and combination thereof has video decoding integrated into the chip. So what makes you think they won't integrate AI and still call it a "CPU"?

about three weeks ago

How We'll Program 1000 Cores - and Get Linus Ranting, Again

im_thatoneguy Re:Pullin' a Gates? (449 comments)

If that whole process takes 3 seconds (which would be amazing), then your computer only performed 1 "operation per second". But computers don't perform "operations"; they have to perform millions of sub-actions to accomplish your goal. It would be like saying "rendering a game's frame is only a single task, so it would be a very serial task without any potential for multithreading," when in reality "rendering a frame" is a massively parallel task of rasterizing millions of triangles (or intersecting rays), sampling textures, computing lighting values and performing table lookups.

Take interpreting voice. By applying multiple models simultaneously you can get better results. Seems pretty obvious.
http://devblogs.nvidia.com/par...

For the flyer, maybe it'll generate 1,000 flyers simultaneously and then compare them to award-winning graphic design projects to see which of its 1,000 ideas matches historically good ideas.
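That generate-and-score idea is embarrassingly parallel: each candidate is scored independently, so the work spreads across as many cores as you have. A toy sketch (the candidate generator and scoring metric here are made up purely for illustration):

```python
# Generate many candidate "layouts", score each independently in
# parallel, and keep the best one. Scoring is a stand-in metric.
from concurrent.futures import ProcessPoolExecutor
import random

def make_candidate(seed):
    rng = random.Random(seed)
    return [rng.random() for _ in range(8)]  # a fake flyer layout

def score(candidate):
    # Toy metric: prefer "balanced" layouts (low variance).
    mean = sum(candidate) / len(candidate)
    return -sum((x - mean) ** 2 for x in candidate)

def best_of(n):
    candidates = [make_candidate(i) for i in range(n)]
    with ProcessPoolExecutor() as pool:  # scores computed across cores
        scores = list(pool.map(score, candidates))
    return max(range(n), key=scores.__getitem__)

if __name__ == "__main__":
    print(best_of(1000))  # index of the best-scoring candidate
```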

about three weeks ago

How We'll Program 1000 Cores - and Get Linus Ranting, Again

im_thatoneguy Re:Pullin' a Gates? (449 comments)

I say you are the one moving the goalposts. Linus and *most* of the other people working on parallelism are working and speaking in the context of computers like the ones we know today; you're the one trying to apply what they say to *any* computer. Linus will probably be proved correct there: past n cores, the fundamental architecture in use today will not scale except in niche cases.

Within the context of traditional von Neumann computers, we already have voice recognition, we already have SLAM 3D positioning, we already have databases like Wolfram Alpha which can give us insights, and we already have applications which crunch massive 3D datasets. Some of these run OK on GPGPUs, and some need the larger cache sizes of a CPU to run efficiently.

My point isn't that we need some completely exotic system; my point is that even with the very limited set of AI-driven applications today, there are plenty that can and would use hundreds of cores. Computers were once a "niche" tool for rich people. The internet was once just a niche tool for academics. Only gamers needed a GPU, etc., etc. All the way back through history, when something becomes accessible, someone finds an application. Build it and they will come.

about three weeks ago

How We'll Program 1000 Cores - and Get Linus Ranting, Again

im_thatoneguy Re:Pullin' a Gates? (449 comments)

It is a niche which will need specific algorithms tuned for the hardware (GPU or other) the pipeline must be kept busy to observe a performance gain. It doesn't scale to general purpose computing.

I feel like this is moving the goal posts. "You will never do massively parallel computing on a CPU because if it's massively parallel it's a GPU not a CPU."

Linus is 100% wrong. What's the "general purpose" computing we all want? The NCC-1701D's main computer from Star Trek. If I say, "Cortana/Siri/Google Now, please rough me out a flyer for our yard sale on Saturday," you're looking at a massively parallel task: neural networks have to not only interpret the voice but then make sense of the words and finally produce a printable flyer suitable for hanging. Programming is still a really fancy version of "IF A THEN B", "for X in GROUP do Z", "X = Y". Yeah, if your application is incredibly serial, then a serial processor is all you'll need. When computing advances to the next phase of neural networks, AI and directed (not instructed) computing, then it'll need to be more like our brain: massively parallel.

Now there are two obnoxious tautological arguments against this:
A) "That's not a "CPU" that's like a NeuroProcessorUnit, an NPU if you will"
B) "Yes we'll need a giant mainframe, but it'll be a server in the cloud!"

A is moving the goalposts: just because the processor isn't an ARM- or x86-compatible chip doesn't mean it's not worthy of the label CPU. As mentioned above, you can't say there'll never be a CPU with massive parallelism if, as soon as it has massive parallelism, it's by definition no longer a CPU. B is just saying that nobody will have a need for computers because we'll have a giant mainframe. That might be true, but then you just need a basic DSP, not even a CPU, since a pure thin client only transmits video, audio and an input stream to the cloud for processing. In which case all of the CPUs that do exist... need to be massively parallel AI processors.

about three weeks ago

Microsoft Is Building a New Browser As Part of Its Windows 10 Push

im_thatoneguy Re:Nth verse, same as the first (248 comments)

Who cares about developers? Microsoft is rewriting their browser to make it faster and use less battery/resources. The Trident rendering engine is already good, and the JS engine is already one of the fastest. Developers should be happy to develop for Trident; rewriting the browser so that it's more cross-platform compatible and smoother on mobile seems like a "good thing" to me.

about a month ago

Submissions


Atari Claims Piracy to Suppress Bad Reviews

im_thatoneguy im_thatoneguy writes  |  more than 6 years ago

im_thatoneguy writes "According to Shacknews

Atari has filed suit against German gaming website 4Players for publishing a negative pre-release review of Alone in the Dark, alleging that it was written based on an illegally obtained copy of the game.
Atari has also demanded the removal of 3/10-scoring reviews from both Gamer.no and Gamereactor, citing accusations of piracy."


Journals

im_thatoneguy has no journal entries.
