
Comments


Programming Error Doomed Russian Mars Probe

Whillowhim Re:headline fail (276 comments)

As someone who is currently writing up a dissertation dealing with this topic, I can assure you that mil spec is not sufficient. Hardening chips for radiation is completely different from hardening them for other hostile environments, especially when you look at the heavy ion strikes you can get in space.

Radiation effects are generally split into two basic categories: Total Ionizing Dose (TID) effects and Single Event Effects (SEEs). TID results from lots of little ion strikes, which gradually build up charge and/or defects and screw with transistor characteristics. Often the result is that transistors leak a lot more current when off, reducing your margins. Since this damage takes time to accumulate, it is highly unlikely that it caused the issues with the probe. Mil spec chips often have a bit more tolerance for TID, so mil spec does help, but not enough for long exposures.

SEEs are the result of a single, high energy particle hitting the chip. The area of effect varies greatly depending on the energy of the particle, but the typical result of a strike is that a logic gate or cluster of nearby logic gates ends up forced to output the wrong value. Essentially, one or more of your "0"s just became "1"s, and vice versa. If those values happened to be important to the current state of the machine or the OS running on it, then congratulations, you just got screwed. The two most common ways to harden a chip against this are temporal redundancy and logic redundancy. Temporally redundant circuits assume that any ion will only upset the logic for a short period of time, and wait for the signal to become stable before storing values. This has been the staple of custom hardened chips for a while now, because it is relatively easy to convert all your flip-flops into hardened flip-flops, and thus harden the entire circuit.
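To make the temporal idea concrete, here is a toy Python sketch (my own illustration, not any real design): sample the same signal at three points spaced further apart than the longest expected transient, then majority-vote, so a glitch can corrupt at most one sample.

    # Toy model of a temporally hardened latch: three samples of one signal,
    # spaced wider than the expected transient, feed a majority vote.
    def signal(t, glitch_start=5, glitch_len=2):
        # Logical 1, except during a short particle-induced glitch to 0.
        return 0 if glitch_start <= t < glitch_start + glitch_len else 1

    def temporal_vote(t, spacing=3):
        samples = [signal(t), signal(t + spacing), signal(t + 2 * spacing)]
        return 1 if sum(samples) >= 2 else 0  # majority of the three samples

    print(temporal_vote(3))  # 1: the glitch hits only the middle sample

The catch, as described below, is that the required spacing is set by how long the deposited charge takes to dissipate, and that keeps getting worse on smaller processes.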

Logically redundant circuits essentially have three copies of the logic that vote to determine the correct value. This was often used in the early days of hardening, since you could just stick three chips in there and add some basic voting circuits outside them to correct the values. However, as processors got more complex, it became harder and harder to restore their state properly in a reasonable amount of time, so people tended to move to temporal hardening for custom chips and only used logic hardening for things like FPGAs.
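The voter itself is trivial; a minimal Python sketch of the idea (the logic function here is just a hypothetical stand-in):

    # Toy triple modular redundancy (TMR): three copies of the same logic
    # feed a majority voter, so a single upset flips at most one input.
    def logic_copy(a, b):
        return a ^ b  # stand-in for some arbitrary combinational logic

    def tmr(a, b, upset_copy=None):
        outputs = [logic_copy(a, b) for _ in range(3)]
        if upset_copy is not None:
            outputs[upset_copy] ^= 1  # simulate an ion strike on one copy
        return 1 if sum(outputs) >= 2 else 0  # majority vote

    assert tmr(1, 0, upset_copy=1) == tmr(1, 0)  # the single upset is masked

The hard part was never the voter; it is getting the three copies back into a consistent state after an upset, which is exactly the restore problem mentioned above.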

Currently, however, temporal hardening is breaking down, since it doesn't scale well with smaller processes. A heavy ion deposits a fixed amount of charge, but smaller processes have less current flow per transistor, so it takes longer to remove that charge and restore proper operation. Thus, the length of time temporal designs have to wait for the signal to stabilize keeps increasing. This is one of the main reasons hardened chips lag behind in transistor size and the processes they can use. My graduate research has produced a method for high-speed, logically redundant circuits that scale well: you can automatically create three copies of a circuit that vote on the same chip, using commercial synthesis and automatic place-and-route (APR) tools to automate the process. I firmly believe this will become the standard once people realize how much faster they can make chips run on new processes.

more than 2 years ago

B&N Yanks DC Titles After Exclusive Amazon Deal

Whillowhim Re:Local Neighborhood (125 comments)

That isn't the point. This isn't a local vs. big store issue, and it isn't really about comic books. It is about publishers and bookstores, and the balance of power between them. Exclusive deals between publishers and a specific store, with a specific DRM tied to a specific brand of ebook reader, are bad. Even if it happens to be the biggest store serving the most people, that just helps to support a monopoly. And even if it were a smaller store that happened to use your personal brand, so you personally were not inconvenienced, it still artificially segments the market and tries to lock people into separate personal playgrounds so they can be milked by one company. The market needs to be open and unrestricted to promote quality services, not locked into whichever bookstore happened to score a deal on what you most want to read, even if that bookstore is worse than the one you really want to use.

Insofar as they're standing up for a less segmented market, I applaud B&N for taking the long view on this issue, despite any short-term loss in profits. Although they may be in it for selfish reasons (would they have protested if they were the ones with the exclusive deal, and Amazon was shut out?), they are at least making some sort of stand. Now we just need someone big enough to take a stand against the entire idea of DRM on ebooks....

more than 2 years ago

Coming Soon to EA's Origin Store: Third-Party Titles

Whillowhim We don't need another friend list. (88 comments)

Even if the store isn't broken and works flawlessly (and from what I've heard, that is a big if), I'm still opposed to exclusive Origin titles on general principle. Withdrawing titles from other services is a huge pain for the consumer and does a real disservice to PC gaming as a whole. It fragments the PC community even more, creating yet another friends list you need to keep track of and yet another program that needs to be running. Ultimately, PC games can only benefit from a single unified service that tracks friends, achievements and such, just as there is only one Xbox system or one PS3 system, but there is no point in trying to force people to adopt your system as the primary one. While EA may think it sucks that Valve got there first, the solution is not to keep fracturing the marketplace and forcing people to use a special system just for your games. The only responsible policy is to provide your games on any existing systems, so that people have a choice of which one they want to use. Then you make yours good enough that people want to use it.

From a business standpoint, I can see why they pulled their titles from Steam: if they didn't, no one would ever use Origin. But if that is the case, you have to ask yourself why you're forcing people to use a service they would never normally choose. You want people to use your service because they like its features, not resent it because it's their only option. It seems like the only reason they want to push Origin is that they want some of the money Steam is making, not because they actually feel they can offer a new and/or superior product.

Like a lot of EA's decisions, this seems very short-term and focused tightly on pure monetary numbers. Cash grabs can work, but build up enough ill will among your customers and eventually they'll stop buying your stuff. It takes a long time to get to that point, but EA has been working at it for years, and more and more of my friends seem to be aware of the crap they're pulling. That is not a good sign for them. I'm already fairly careful about which games I buy, and tend to skip ones that aren't available on Steam. Refusing to put your games where I do most of my shopping can only hurt you more.

about 3 years ago

Nuclear Risk Expert: Fukushima Fuel May Be Leaking

Whillowhim Re:Mine it. (500 comments)

http://mitnse.com/2011/03/16/what-is-decay-heat/

There is no more uranium fission; that was stopped within seconds of the earthquake hitting. The problem is the decay products of the reaction, which are unstable and thus radioactive. The power given off by the reactor at this point is just a percent or so of its original output, and all of it comes from the radioactive decay of those unstable isotopes. There is no real point in separating the fuel; the byproducts will keep decaying and releasing heat regardless, no neutrons required. Removing them to make them easier to cool is pointless, since by the time anyone could set that up, they could have set up a real cooling system and solved the problem on site.
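For a rough sense of scale, the standard textbook rule of thumb (the Way-Wigner approximation) gives numbers like these; the prior operating time here is an assumed illustrative value:

    # Way-Wigner approximation for decay heat as a fraction of full power:
    # P/P0 ~ 0.066 * (t**-0.2 - (t + T)**-0.2), with t = seconds since
    # shutdown and T = seconds of prior operation. Illustrative numbers only.
    def decay_heat_fraction(t, T=1e7):  # T ~ 4 months of operation, assumed
        return 0.066 * (t**-0.2 - (t + T)**-0.2)

    for t in (10, 3600, 86400, 7 * 86400):  # 10 s, 1 hour, 1 day, 1 week
        print(f"{t:>7} s after shutdown: {100 * decay_heat_fraction(t):.2f}% of full power")

That works out to a few percent right after shutdown and under half a percent after a day, which is still megawatts of heat for a gigawatt-class reactor.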

more than 3 years ago

Toyota Accelerator Data Skewed Toward Elderly

Whillowhim Re:Starting from full stop ..... (776 comments)

My current car has cruise control with resume functionality, but it clears the resume whenever it drops below a certain speed (somewhere around 10-30 mph; I really only notice it when stopping). So you can tap the brake pedal to take it off cruise, coast for a bit, then hit resume to get back to the previous speed, but if you slow to a stop or close to it, it clears the setting and you have to get back up to the desired speed manually.

more than 4 years ago

Security Industry Faces Attacks It Can't Stop

Whillowhim Re:Industry slow to respond to challenges (305 comments)

This, also, shouldn't be news.

Niche applications have a much lower install base, and must make more money on each sale to pay for the same amount of development. Since niche markets often have orders of magnitude fewer users, you have to both jack up the cost of the item and cut back on development.

It's the difference between having 50,000 users and 100 developers, and 500 users and 10 developers. Assuming the projects are of comparable complexity, you're going to pay 10x as much and get something 10x less polished.
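The arithmetic, spelled out in a trivial Python sketch (the user and developer counts are just the illustrative ones above):

    # Cost-per-user arithmetic for a mainstream vs. a niche product.
    for name, users, devs in (("mainstream", 50_000, 100), ("niche", 500, 10)):
        print(f"{name}: {1000 * devs / users:.0f} dev(s) per 1000 users")
    # The niche buyer funds 10x the development effort per seat, yet the
    # product gets only a tenth of the total developer attention.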

more than 4 years ago

Five Top Publishers Plan Rival to Kindle Format

Whillowhim They just don't get it. (123 comments)

Games? Social networking? The fact that Murdoch is a part of this venture does not surprise me, because it shows an astounding lack of understanding of why people are buying ebook readers and what the market actually wants in a book reader appliance. Namely, they failed to do a prior art search and find the millions of PDAs people were using to do exactly what this new format is proposing. Or rather... not doing exactly what this format is proposing, because no one really needs it and it is an energy hog.

The Kindle and other ebook readers (e.g. the Sony one I've owned for the past 3 years) did not become popular because they were a new idea or a new device; they became popular because of a new technology: e-ink. There were book readers before e-ink displays came around, but very few people used them because they suffered from two major drawbacks. The first was that the power consumption of their displays meant you had to plug them in and charge them once or twice a day. People already charge their cell phones daily, but charging a device twice a day when you use it a lot is pretty annoying, and a huge fraction of the power goes to the display whenever the device is in use. The second drawback is simply screen real estate and the interface to get at it. PDAs could do exactly what is being proposed, but people didn't use them that way because it was awkward. Sure, handheld gaming devices exist and are used... but they have buttons and layouts specifically tailored to gaming. The same goes for cell phones, PDAs, and ebook readers. You can play games on cell phones, but not easily, and the power usage eats the battery. The new format proposal looks to do exactly the same thing to ebook readers. Congratulations, you just re-invented the N-Gage.

The major "killer app" in the ebook market that no one is mentioning is really quite simple. It isn't a killer display (black and white is fine for books), it isn't a fancy new display technology (though color would be nice, it would also be mostly useless and a major expense), and it isn't a whiz-bang new DRMed file format. What is missing from the ebook marketplace is simply a universal storefront. Amazon books only work with the Kindle. Sony's store only works with their ebook readers. The same goes for most other ebook stores (some support a wider list of readers... but a lower percentage of people actually own those readers). DRM has fractured the marketplace, but selling to the entire install base of ebook readers is really quite simple, because every ebook reader out there can read non-DRMed files. It is only the stores that are enforcing DRM. The first store to offer a wide selection of books in non-DRMed formats at reasonable prices will suddenly be able to sell to 100% of the people interested in ebooks and steal market share from everyone else.

I could rant on this subject for days, but the bottom line is: I can get almost any book out there for free from pirates, and I don't have to worry about losing those books when I migrate from my Sony Reader to whatever device I end up using next (the battery is finally dying). However, I've bought most of my books from the Baen store, because I can get them fast, easily, and with good proofreading. They are easier to read and find, and they aren't some OCRed crap with forced line breaks and errors. Publishers have to understand that on the web, they're not competing against the price and convenience of other publishers; they're competing against some random pirate scanning in a copy of their book and giving it away for free. If it isn't easy to find a copy of their book that will work on my system for a reasonable price ($15 for a paperback selling for $8 at the local bookstore?), there is no reason to give them money.

That said, there is one thing I can see some value in for the proposed format: daily deliverables. This is something that isn't done all that well on current generation ebook readers, but it isn't exactly a new idea. There has been freeware for the Sony Reader that could download and sync online newspapers for quite some time now; I first ran into it a couple of years back, though I didn't actually use the functionality. The only real drawback was having to connect the reader to your computer to update, so smooth wireless updating would be worth some money. So it is valuable, but not nearly as new and unique as they seem to think. For that matter, I saw info on the new "Sony Daily" that is supposed to come out soon, and its entire premise is that it can download content wirelessly. If they can actually deliver content easily and smoothly over a wireless link, I see no real reason to move to a special format for it and the inevitable device-specific DRM that tries to lock you in.

more than 4 years ago

CT Scan "Reset Error" Gives 206 Patients Radiation Overdose

Whillowhim Re:Not the engineers fault (383 comments)

The Therac-25 incidents happened partly because there were hardware interlocks on previous versions, but not on the updated version. However, a simple "don't kill the patient" interlock would not have worked. The basic problem is that it handled both e-beam and X-ray dosage on the same machine, and you get X-rays by hitting a target with an e-beam of much, much greater power; the target absorbs the e-beam and emits a much weaker X-ray beam. If I remember what I read about this correctly, all of the incidents were some form of "we wanted X-rays, but the target was rotated out as if we wanted an e-beam, so the full e-beam was applied to the person instead of the X-ray target." In standard X-ray operation (by far the majority of requested doses), the beam had to be active at that high level. Since that beam was more than strong enough to kill anyone if the target was improperly placed, almost every single treatment would have involved someone bypassing a "don't kill the patient" safeguard. That is just begging for it to be bypassed each and every time without thinking about it.

The fact that there were several bugs that led to similar results, with no backup, is the major issue. There are various ways to fix this, including hardware interlocks, actual software review, and an exhaustive test methodology (including designing the software so that it can be tested exhaustively). In the end, they cut corners and this killed patients. They reduced cost by removing "extraneous" hardware interlocks found on the Therac-20 model, not realizing those interlocks were activating and saving lives. They reduced cost by hiring programmers who clearly did not understand proper code design and by reusing old code that depended on the interlocks. They reduced cost by not requiring exhaustive testing, or code designed to support it. In particular, the hardware interlocks were not simple "low power or else" checks, but more complicated checks of which powers were valid against the other settings. More complicated than a simple "don't go to high power without authorization" check, and thus more expensive.
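As a sketch of the kind of consistency check involved (hypothetical Python, not the Therac's actual logic), the point is to re-derive "is this power valid for this mode and target position?" immediately before firing, rather than trusting a cached all-clear flag:

    # Hypothetical mode/power/target consistency interlock, re-evaluated
    # immediately before every firing. Names and limits are illustrative.
    EBEAM_POWER_LIMIT = 25  # arbitrary units, purely illustrative

    def safe_to_fire(mode, beam_power, target_in_place):
        if mode == "xray":
            # X-ray mode uses the high-power beam, so the target MUST be in.
            return target_in_place
        if mode == "ebeam":
            # A direct e-beam must stay far below X-ray drive power.
            return (not target_in_place) and beam_power <= EBEAM_POWER_LIMIT
        return False

    # The fatal configuration: e-beam mode, target out, X-ray-level power.
    assert not safe_to_fire("ebeam", beam_power=100, target_in_place=False)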

I can remember two examples of errors that caused problems. One incident involved an 8-bit integer that was incremented, in a continuous loop, every time a setup check ran and found the machine not ready; this integer was part of the check on whether the target was in place. So a procedure where you make a slight mistake, fix it, but forget to rotate the target back in would be stopped by this check... 255 times out of 256. The other 1 time in 256, the counter had just rolled over to zero and the check incorrectly reported everything fine. Someone lost that game of Russian roulette.
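In pseudocode terms, the failure mode looks something like this (a Python sketch of the behavior as described; the original was PDP-11 code):

    # Sketch of the 8-bit rollover: a "setup not ready" flag is set by
    # incrementing an 8-bit counter, so every 256th pass it wraps to zero
    # and the "counter == 0 means safe" test wrongly passes.
    class3 = 0
    def setup_check_passes(setup_ok):
        global class3
        class3 = 0 if setup_ok else (class3 + 1) & 0xFF  # 8-bit wraparound
        return class3 == 0  # zero is treated as "safe to proceed"

    false_greens = sum(setup_check_passes(setup_ok=False) for _ in range(256))
    print(false_greens)  # 1: one pass in 256 reports safe despite the bad setup

Setting the flag with a plain assignment instead of an increment would have removed the bug entirely, which is part of why this reads as a basic code-review failure.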

The other incident involved fast data entry. You enter the dosage as if you were going to give the patient an X-ray beam (much more common than e-beam treatments, and a habit for some operators), and hit enter at the bottom of the setup form, which starts the beam strength calibration. If you then realize you really wanted an e-beam of the same strength, go back to the top, change one entry from X-ray to e-beam, and fly through the rest of the form hitting enter within 8 seconds, you reach the bottom before the calibration finishes. The calibration completes 8 seconds after the first enter, exits its loop, and checks that the form is still properly filled out (which by now it is). Then it rotates the target out because you asked for e-beam, without double-checking the power setting, which was originally calibrated for X-rays. Since it never re-checks the power against the e-beam/X-ray selection and just looks at the single "form properly filled out" variable, it is inherently dangerous. The company's infamous fix was to remove the up-arrow key from the keyboard, forcing people to take more than 8 seconds to re-edit the form.
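Schematically, the race looks like this (a deliberately simplified Python sketch; the real system was concurrent tasks sharing state, and all values here are arbitrary):

    # Sketch of the edit race: calibration latches the power when data entry
    # first "completes", and firing later trusts a cached form-complete flag
    # instead of re-deriving the power from the now-edited mode.
    def start_calibration(form):
        return {"power": form["power"]}  # latched at first completion

    def fire(form, calibration):
        assert form["complete"]          # BUG: a staleness-blind flag check
        return form["mode"], calibration["power"]

    form = {"mode": "xray", "power": 25000, "complete": True}  # units arbitrary
    cal = start_calibration(form)   # the 8-second calibration begins
    form["mode"] = "ebeam"          # operator re-edits within the window
    print(fire(form, cal))          # ('ebeam', 25000): target out, X-ray power

The fix that matters is structural: recompute the machine state from the final form, or invalidate the calibration whenever any field changes.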

While I'm more of a hardware engineer than a software one, even I can see that neither of these errors should have been made by anyone who knew what the heck they were doing. The fact that they were not caught in exhaustive review before going into a product as potentially dangerous as a radiation treatment machine is... well, a case study in how to do things wrong.

more than 4 years ago

E-book readers ...

Whillowhim Re:Sharing books? (503 comments)

Yes, Amazon's store is not a good advertisement for the value of electronic publishing vs. paper. It is better than the Sony store, but that is not a ringing endorsement. The price differences are pretty random and nonsensical, but at least they're often in the same range as a paperback, instead of priced like a hardback even when the book has been out in paperback for months. Compare that to Baen. The book I was just reading is $8 on Amazon for the dead tree edition; at the Baen online store it is $6 with no DRM, in a variety of formats. And they've been selling all their books for 1-2 bucks cheaper than the paperback edition for the past several years, even while the book is still in hardback. It gets even cheaper with the "webscription" bundle of everything they're publishing or republishing in a month, typically 3-4 new hardback books and 3-4 older paperback books for $15. If you'll read half of those, that is $5 or less per book.

I've spent around $1000 at Baen's store in the last 3 years to put books on my Sony Reader. The only problem is that they can only publish new and interesting books so fast. I've bought pretty much nothing from any other online bookseller (unless you count the free $50 coupon that came with my Sony ebook reader, where I discovered exactly how bad the Sony store sucked). It's not that I don't want to buy other books, it's that all the other bookstores I've found either had restrictive DRM that didn't work with the Sony reader or had pretty horrible selection. Or both. If I could have found a store like Baen with a wider selection of books, I could have easily spent another $2000 on my book habit. Instead, I reread old books, or look for them from pirated sources in badly OCRed and formatted versions.

about 5 years ago

Bacteria Used To Make Radioactive Metals Inert

Whillowhim Re:Non-Toxic inert? (237 comments)

This distinction is part of what makes the Hanford area in Washington such a difficult cleanup effort. The separation of plutonium for WW2 isn't really the problem; it's all the poorly documented experimental methods they used in the Cold War. You end up with radioactive metals dissolved in all sorts of chemicals, and then nobody bothered to document which chemicals. I wouldn't even call it radioactive waste exactly; it's nasty chemical waste that happens to be radioactive from the dissolved metals. Separating the radioactive metals from the rest of the chemical soup would be a significant first step, because then you can treat each part appropriately: the radioactive stuff won't try to eat its way through the container, and the chemical stuff won't try to kill you just for standing next to it.

about 5 years ago

Initial Tests Fail To Find Gravitational Waves

Whillowhim Re:They exist. (553 comments)

Exactly. The initial design for LIGO was only expected to be sensitive to the largest plausible sources of gravitational waves. The theories on most of these weren't proven in any real sense, so it was decided to go look for them. Many theories predict a much lower level of background signal, which LIGO cannot yet detect without a _very_ long run time; it is faster to upgrade the device and keep looking with more sensitivity than to integrate longer hunting for coherency in a noisy signal. The evidence we've seen so far for gravitational waves comes from sources that would be much weaker than the current LIGO sensitivity, or else from sources expected to be very rare (two neutron stars colliding should give off a ton of gravitational energy, but how often does that happen within X light years?).

The original LIGO sensitivity also matched or slightly exceeded the other gravitational wave observatories being designed around the globe, so it was considered a good place to start. However, while observatories in more populated areas get a lot of their sensitivity from very complicated mirror suspension systems and active isolation systems that reduce outside noise, LIGO gets its sensitivity from being in the middle of nowhere with plenty of space to build massively long beam tubes, whose length multiplies directly into the sensitivity. Thus, it was easier to get LIGO running with simpler suspension systems, and upgrading the sensitivity does not require replacing the entire device. Simply replacing the mirror suspensions with the more advanced ones that others have been debugging should give a large boost to the sensitivity. And since others have been working with these new systems, they'll be better understood and hopefully take less time and fiddling once installed in the LIGO facilities.
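The arm length multiplies directly into sensitivity because a passing wave of strain h changes an arm of length L by dL = h * L; a quick sketch with an order-of-magnitude target strain (the numbers are illustrative):

    # A gravitational wave of strain h stretches an arm of length L by
    # dL = h * L, so longer arms mean a larger, easier-to-measure signal.
    h = 1e-21                 # order-of-magnitude target strain, illustrative
    for L_m in (600, 4000):   # a 600 m instrument vs. LIGO's 4 km arms
        print(f"{L_m:>5} m arm: dL = {h * L_m:.1e} m")

Even at 4 km that is a displacement of about 4e-18 m, a small fraction of a proton's diameter, which is why every bit of arm length and isolation matters.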

more than 5 years ago

Initial Tests Fail To Find Gravitational Waves

Whillowhim Re:I think I see the problem. (553 comments)

I worked at LIGO Hanford a few years back before going back to grad school. Since it is essentially a scaled-up prototype, new things were always being fiddled with and the device was very temperamental. If we could have blown the dust out of the cartridge, we would have. How easy or hard it was to gain and hold lock (when the laser is resonating properly) varied on a daily or sometimes hourly basis, with no obvious way to tell what was wrong this time.

As a joke, I put together an emergency kit for the control room. It consisted of:
1) one(1) cardboard box with "emergency locking kit" written on it. Also suitable for use as an altar.
2) one(1) rubber chicken for use as a sacrifice for any suitable god.
3) one(1) butter knife stolen from the lunch room.

To my knowledge, it was never officially used. But the rubber chicken did end up with some suspicious marks on its neck and the butter knife did end up with red marks along the edge. It was claimed to be accidental damage and a slip with one of the whiteboard markers, but I suspect something else was at play.

more than 5 years ago

Fatal Explosion At Russian Hydroelectric Dam

Whillowhim Re:It is not the volts (336 comments)

I'll expand on this since people keep claiming I'm wrong. It all depends on where you measure the voltage. If it's on the device itself, then technically I'm wrong; if you look at the body itself and the important parts of it, I'm correct. Everything has some capacitance and inductance associated with it, even the human body. It isn't a great capacitor or a great inductor, but it does act somewhat like both. This doesn't matter at DC or at low frequencies, but for AC or high-frequency transients (shocks from rubbing your feet, the initial hit of a spark plug/taser, etc.) these values start to have an effect.

Without going into gory details, the main effect is that these values smooth out the voltage that is actually applied. The capacitance of your body means it resists an instantaneous change in voltage, so the "12,000V" discharge is not applied across your body the exact picosecond you touch it. Instead, it starts charging your body's capacitance, and your body's voltage starts to rise. If the source can't actually sustain 12,000V across your body, its output voltage drops very, very quickly, and things settle at a lower equilibrium voltage across the body, hopefully one that is not fatal.
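A toy version of that settling effect in Python (the component values are standard human-body-model ballparks, not measurements):

    # Toy static-discharge model: a small capacitance charged to "12,000 V"
    # shares its charge with the body's capacitance and settles far lower.
    C_source, V_source = 100e-12, 12_000   # 100 pF at 12 kV, illustrative
    C_body = 100e-12                       # body capacitance, ~100 pF ballpark

    Q = C_source * V_source                # charge is conserved
    V_final = Q / (C_source + C_body)      # voltage after charge sharing
    E = 0.5 * C_source * V_source**2       # total energy available

    print(f"settles at {V_final:.0f} V with only {E * 1e3:.1f} mJ on tap")
    # ~6000 V but just a few millijoules: a nasty zap, nowhere near fatal

That is the whole point: a supply that can hold 12 kV continuously delivers essentially unlimited energy, while a charged-up capacitance at "12 kV" delivers a fixed few millijoules.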

So, to correct my statement: anything that can sustain 12,000V can not only kill you, it can jump air gaps to do so. Anything that can't sustain that voltage is likely just painful.

Of course, if it can't sustain the voltage, the number is usually just something the marketing department picked to sound large, or it's rated for very specific purposes (ESD testing, spark generation, etc.).

more than 5 years ago

Fatal Explosion At Russian Hydroelectric Dam

Whillowhim Re:It is not the volts (336 comments)

Somewhat off topic, but...

While true and oft-repeated, the volts/amps comment ignores the fact that there is a definite relation between the two. It is easier to determine the exact effect on the body if you know how many amps went through the person's heart and/or other muscles, but ballpark figures in volts still give some idea of the danger. The body is essentially just a resistor, so there is a linear relation between volts and amps once you know where the voltage is applied, and thus the resistance of the body between those two points. With 12 volts it takes some ingenuity to kill someone, but 120 volts from a wall socket is dangerous if mishandled, 1,200 volts will be fatal when applied directly to the skin almost anywhere, and 12,000 volts will not only kill you, it will arc through small air gaps to do so (with tasers, you don't get all of the claimed thousands of volts across the body; most is dropped across the air gap or is regulated by the circuitry to keep the current low).
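The ballpark arithmetic, as a quick sketch (the resistance values and the danger threshold are rough, commonly cited figures, not precise physiology):

    # Ohm's-law ballpark: I = V / R. Skin resistance varies enormously,
    # so show two rough cases; ~30-100 mA across the chest is the commonly
    # cited danger zone. All numbers are illustrative.
    for label, R in (("dry skin, ~100k ohm", 100_000), ("wet skin, ~1k ohm", 1_000)):
        for volts in (12, 120, 1_200, 12_000):
            print(f"{label}: {volts:>6} V -> {1000 * volts / R:>9.2f} mA")

At 12 V even wet skin only passes around 12 mA, while 120 V on wet skin lands squarely in the dangerous range, which matches the everyday experience that wall sockets get respect and car batteries don't.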

The way I look at it, amps give you a good idea of how dead you are. Volts give you a measure of how hard something is trying to kill you.

more than 5 years ago

Best Mouse For Programming?

Whillowhim Re:Why a mouse? (569 comments)

Obviously you've never been forced to use Labview by some odd piece of testing equipment. You should count your blessings.

more than 5 years ago

Sony Pondering Game/Phone Hybrid

Whillowhim Re:That could be pretty cool (80 comments)

My current annoyance with Sony is product support. I bought the Sony Reader 3 years ago, and just upgraded to 64-bit Vista after my computer had a minor meltdown. Their software does not support 64-bit OSes for the very first version of the device: apparently the PRS-500 had its own specialized USB driver, while the current models act more like a thumb drive, so they never ported the specialized driver to 64-bit Vista and just let the normal USB drivers handle the 505 and later models. 64-bit Vista has been out for how long now? At this point I assume it will never happen; they've dropped support for the device only a couple of years after putting it on the market. Inexcusable.

more than 5 years ago

AT&T's Bad Math Strikes MythBusters' Savage

Whillowhim Re:First response... (305 comments)

Turn in your Geek card. The corrected quote is:
"I reject your reality and substitute my own"

Also applicable to Ahmadinejad when the election results came in.

more than 5 years ago

Kindle, Zune DRM Restrictions Coming Into Focus

Whillowhim Re:When Will the Average Consumer Learn? (311 comments)

I completely disagree. Buying an item you don't intend to actually use is sending the wrong message. You're rewarding the book publishers for their insane DRM when you should be discouraging them.

Finding pirated books can be a pain in the ass. If they're going to force me to spend time hunting for a copy with bad proofreading and odd line breaks, I'm going to ask for a refund on the money I spent on the book, or better yet, just not spend it in the first place. It's not that I'm unwilling to buy ebooks; it's that I value my time, and the 10-60 minutes spent digging through websites and peer-to-peer applications is worth more to me than the price of the book.

And for the record, I've spent just under $1000 at Baen's online store over the last 3 years, because the books there are unencumbered by DRM and are easy to find and buy. I'm more than willing to buy books if I'm given a fair deal. It just seems that a lot of book publishers are so scared by the piracy boogieman that they piss off their real customers.

more than 5 years ago

America's Army 3 Has Rough Launch, Development Team Canned

Whillowhim Re:It's not quite that bad . . . (150 comments)

"I even spawned without a weapon, I had to find a dead body to steal his"

Naw, that's a feature. They're training you to recreate the human wave attacks used in WWII after they run out of funding for rifles.

more than 5 years ago

Google Set To Tackle eBook Market

Whillowhim Re:The real questions is: (170 comments)

This is the major question, and I can't seem to find any info about it. If the books are sold without DRM, Google is in a position to force other online publishers to follow suit, but at the same time fewer publishers will want to list books with Google (due to perceived losses from piracy). It seems like more publishers are wising up to the fact that DRM is only hurting them, but there is still a long way to go before all books are available in non-DRMed formats. I suspect Google will take the middle ground again and let publishers choose whether their books are DRMed, which means all the major publishers will keep trying to make DRM work.

The basic issue is that every major ebook reader can handle a large number of non-DRMed formats, but only one DRM format, and if you can't find the book in that specific DRM format, you're out of luck. Typically, the DRM format is specific to the company that puts out the reader (i.e. Amazon's Kindle format, Sony's Reader format, etc.). The Sony store is expensive and has a limited selection. Amazon has a much better selection, though not perfect, and is often expensive as well. Fictionwise has a mediocre selection (better than Sony in my area of interest, at least), but their DRM doesn't work with the two most common ebook readers (the Sony and Amazon ones).

Since they're not going to put out their own ebook reader, I'm hoping Google will go without DRM so that I can use their store with my Sony Reader. If not, I'll end up pirating books again: free, but often badly formatted, and found after spending 4-5x as much time looking as I would with a proper store. I've already tapped out Baen's back catalog of interesting books (spent close to $1k getting the ones that looked interesting) and I read books faster than they can publish them. I'm willing to buy books online; I just can't find someone to take my money and give me something that works. Yes, this frustrates the hell out of me. I refuse to buy books and then pirate them, because buying sends the signal that DRM is acceptable. If you're going to make me spend time looking for crappily formatted books out of fear that I'll steal something, I'm not going to pay you for it. I don't like that this means authors don't get paid, but I'm more than happy that publishers don't get paid because of this.

more than 5 years ago

