
Free Software, a Matter of Life and Death

kdawson posted more than 4 years ago | from the serious-as-it-gets dept.

Open Source 197

ChiefMonkeyGrinder writes "Software on medical implants is not open to scrutiny by regulatory bodies. Glyn Moody writes: 'Software with the ability to harm as well as help us in the physical world needs to be open to scrutiny to minimise safety issues. Medical devices may be the most extreme manifestation of this, but with the move of embedded software into planes, cars and other large and not-so-large devices with potentially lethal side-effects, the need to inspect software there too becomes increasingly urgent.' A new report 'Killed by Code: Software Transparency in Implantable Medical Devices' from the Software Freedom Law Center points out that, as patients grow more reliant on computerized devices, the dependability of software is a life-or-death issue. 'The need to address software vulnerability is especially pressing for Implantable Medical Devices, which are commonly used by millions of patients to treat chronic heart conditions, epilepsy, diabetes, obesity, and even depression.' Will making the source code free to scrutiny address the issue of faulty devices?"

I've got to say... (4, Funny)

fuzzyfuzzyfungus (1223518) | more than 4 years ago | (#33048442)

That the Pacemaker Genuine Advantage warning I got last week was a bit of a shock...

Re:I've got to say... (2, Funny)

natehoy (1608657) | more than 4 years ago | (#33048580)

... or potential lack thereof when you need it.

Re:I've got to say... (2)

jaak (1826046) | more than 4 years ago | (#33048634)

Haha, just think... somewhere out there is someone who is thinking it would be a great idea to run Windows Embedded in a pacemaker.

Re:I've got to say... (5, Funny)

Mongoose Disciple (722373) | more than 4 years ago | (#33048668)

Blue Screen of Death, now with real death?

Re:I've got to say... (1)

CeruleanDragon (101334) | more than 4 years ago | (#33048682)

Blue Screen of Death, now with real death?

I wish we could up-vote comments ourselves, I'd give this a ++.

Re:I've got to say... (5, Funny)

Mongoose Disciple (722373) | more than 4 years ago | (#33048776)

Thanks!

At least I didn't say it'd be the first killer app for the platform. Man, these jokes write themselves!

Re:I've got to say... (2, Funny)

angelwolf71885 (1181671) | more than 4 years ago | (#33049380)

I can just see it now: Bonzi Buddy tapping his bongos to the heartbeat.

Re:I've got to say... (3, Informative)

Monkeedude1212 (1560403) | more than 4 years ago | (#33049244)

I wish we could up-vote comments ourselves, I'd give this a ++.

We do. You just have to earn them, that's all. And once you earn them, you can waste them on as many +funny's as you want.

Re:I've got to say... (4, Funny)

Sponge Bath (413667) | more than 4 years ago | (#33048882)

Roy: [answers phone] Hello, IT. Have you tried turning it off and on again?

Re:I've got to say... (2, Informative)

jgagnon (1663075) | more than 4 years ago | (#33049000)

Make sure you leave it off for at least 15 seconds before turning it back on...

Re:I've got to say... (4, Funny)

fuzzyfuzzyfungus (1223518) | more than 4 years ago | (#33048672)

Well, do you want your pacemaker to have intuitive manageability through Group Policies, or not?

Re:I've got to say... (4, Funny)

camperdave (969942) | more than 4 years ago | (#33049424)

just think... somewhere out there is someone who is thinking it would be a great idea to run Windows Embedded in a pacemaker.

Just think... Somewhere out there is someone who writes pacemaker software who is thinking "There are alternatives to Windows Embedded?"

Re:I've got to say... (0)

Anonymous Coward | more than 4 years ago | (#33048966)

Party on Wayne

Some dead people are OK (-1, Troll)

Anonymous Coward | more than 4 years ago | (#33048454)

Who cares about some people dying, when it protects capitalist America from the communist Open Sores that robs programmers of their deserved profits?

Same article different day (4, Informative)

guruevi (827432) | more than 4 years ago | (#33048460)

Dupe! This was covered a couple of days ago.

Re:Same article different day (0)

Anonymous Coward | more than 4 years ago | (#33048542)

Care to provide a link? I read every day and did not see it, and a couple of searches turned up nothing.

Re:Same article different day (2, Informative)

betterunixthanunix (980855) | more than 4 years ago | (#33048552)

And as people pointed out the first time around, medical devices are tested extensively before being deployed. I am an ardent free software supporter, but the safety/reliability issue is simply the wrong argument. I would say the more important argument when it comes to medical software is control -- do you really want to have a corporation that you have absolutely no control over to be in control of a device that sustains your very life? What happens if that company goes bankrupt, and the source code dies with the company? What if they decide they want to start charging people a yearly fee for using their pacemakers (a situation that does not seem too far fetched, given what I have seen proprietary software companies do in the past)?

Re:Same article different day (4, Funny)

fuzzyfuzzyfungus (1223518) | more than 4 years ago | (#33048604)

Not to worry. Authentication dongles will be available in a variety of sizes, to make insertion endurable for all our users.

Re:Same article different day (2, Insightful)

Anonymous Coward | more than 4 years ago | (#33048730)

That's not specific to software-controlled devices though. If you're dependent on taking a pill every week to keep you alive and/or healthy, you're in trouble if the supply chain gets disrupted in any way.

Re:Same article different day (2, Interesting)

shentino (1139071) | more than 4 years ago | (#33048992)

Running out of pills is one thing.

Having your pace-maker shut down due to non-compliance with an EULA is quite another.

Sure, corporations can make a killing...but it will come with a murder conviction.

I seriously doubt it would EVER be legal to remotely disable a pace-maker.

Re:Same article different day (1)

jgagnon (1663075) | more than 4 years ago | (#33049124)

"We didn't sell you that pacemaker, we leased it to you. You read the EULA before it was inserted, right?"

Re:Same article different day (1)

betterunixthanunix (980855) | more than 4 years ago | (#33049150)

I seriously doubt it would EVER be legal to remotely disable a pace-maker.

Why do you doubt that? "Our licensing is reasonable and non-discriminatory."

Do you also doubt that a pacemaker manufacturer would refuse to provide a critical software update unless each pacemaker user pays them for it?

Re:Same article different day (4, Informative)

kipd (1593207) | more than 4 years ago | (#33048932)

Yes... No bugs, thoroughly tested: http://www.ccnr.org/fatal_dose.html [ccnr.org]

Re:Same article different day (1)

digitig (1056110) | more than 4 years ago | (#33048934)

And as people pointed out the first time around, medical devices are tested extensively before being deployed.

Not just tested. Safety-of-life systems are usually subject to extensive checks on the development process and in the most significant cases to extensive static analysis. Even if you opened the code up to inspection it would be pretty much useless without all of the design documentation and all the other documentation that explains why the code is sufficiently safe. I don't see how the free software model gives any significant safety advantage; all that would happen is the developers would get bogged down by people commenting that they haven't used the latest tricks for "efficiency", whereas in fact mission critical software deliberately sticks to boring, old-fashioned, tried-and-tested techniques.

Re:Same article different day (1)

Draek (916851) | more than 4 years ago | (#33049220)

Hell, what if some third party finds your pacemaker infringes on their patent and demands monthly royalties from you? Replacing, say, a pacemaker is a *bit* harder than uninstalling your copy of ffmpeg.

Formal methods, not open code (1, Insightful)

Anonymous Coward | more than 4 years ago | (#33048494)

Just require that all such software rigorously use formal methods to mathematically prove that it functions as intended. The manufacturer could then send their proofs to some regulatory/standards agency to verify.
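
As a rough illustration of what that could look like in practice, here is a minimal sketch of a contract-annotated C function in the ACSL style checked by tools such as Frama-C. The function, its bounds, and the property proved are invented for the example; a real device's proof obligations would also have to cover timing, concurrency, and the hardware it drives.

```c
#include <stdint.h>

/* Hypothetical pacing-rate clamp. The ACSL annotations are the machine-checkable
 * contract a prover (e.g. the Frama-C WP plugin) would be asked to discharge:
 * whatever the sensed input was, the result stays inside the configured range. */

/*@ requires 30 <= min_bpm <= max_bpm <= 180;
  @ ensures  min_bpm <= \result <= max_bpm;
  @ assigns  \nothing;
  @*/
static uint8_t clamp_rate(uint8_t sensed_bpm, uint8_t min_bpm, uint8_t max_bpm)
{
    if (sensed_bpm < min_bpm) return min_bpm;  /* too slow: pace at the floor  */
    if (sensed_bpm > max_bpm) return max_bpm;  /* too fast: cap at the ceiling */
    return sensed_bpm;                         /* in range: leave it alone     */
}
```

The regulator would then re-run proofs like these mechanically rather than taking the manufacturer's word for it.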

Re:Formal methods, not open code (1)

TheLink (130905) | more than 4 years ago | (#33048856)

It may work as intended/designed. But the intention/design could be wrong/mistaken.

So what if you can prove that the software does "X" as designed? Very many bugs are because the spec/design should be to do "Y" instead.

That's why math proofs of "software correctness" are not that useful in most real world scenarios.

Re:Formal methods, not open code (1)

shentino (1139071) | more than 4 years ago | (#33049068)

GIGO

Re:Formal methods, not open code (1)

jgagnon (1663075) | more than 4 years ago | (#33049144)

GIDI (Garbage In, Death Imminent)

Re:Formal methods, not open code (1)

Mikkeles (698461) | more than 4 years ago | (#33049212)

'It may work as intended/designed. But the intention/design could be wrong/mistaken.

So what if you can test that the software does "X" as designed? Very many bugs are because the spec/design should be to do "Y" instead.

That's why tests of "software correctness" are not that useful in most real world scenarios.'

Modifying correct (verified) code to be valid is many times easier than modifying incorrect code to be valid.

Re:Formal methods, not open code (1)

kav2k (1545689) | more than 4 years ago | (#33048886)

But it is my understanding that from a full proof of an algorithm's specification you can extract the algorithm itself, thanks to Curry-Howard. Then again, maybe not always.
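
For what it's worth, a toy illustration of the correspondence being invoked (nothing device-specific here): in a proof assistant such as Lean, a constructive proof of a specification is itself a program that computes the promised witness.

```lean
-- Curry-Howard in miniature: this term is simultaneously a proof that every
-- natural number has a strictly larger one, and a program that, given n,
-- returns the witness n + 1 together with the evidence n < n + 1.
def nextLarger (n : Nat) : { m : Nat // n < m } :=
  ⟨n + 1, Nat.lt_succ_self n⟩
```

As the comment hedges, though, extraction like this is the exception; most certified firmware is still written by hand and verified after the fact.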

Re:Formal methods, not open code (0)

Anonymous Coward | more than 4 years ago | (#33049074)

That's effectively what's required: proof of how it will perform in every possible state. The process is such a pain that medical device manufacturers keep software use to a minimum, and where software is used, its complexity to a minimum.

That's why computer technology in medicine is so far behind the rest of the world.

Re:Formal methods, not open code (2, Insightful)

digitig (1056110) | more than 4 years ago | (#33049462)

Formal methods on their own are not enough -- at least, not with the current state of formal methods. Formal methods and testing tend to expose different bugs. But the principle is right: maybe an independent safety assessor evaluates the process and products, and the manufacturer submits their argument as to why the system is acceptably safe to a regulator.

We need to be careful about what is "sufficiently safe" though. If somebody would die for sure without the implant then "the implant probably won't kill them" is a big improvement, whereas achieving "the implant almost certainly won't kill them" might price the implant out of reach of most people who need it so it goes back to the situation in which they die. As a rule of thumb, moving up one IEC61508 SIL increases costs by about an order of magnitude. Formal proofs mean that you're talking about SIL 4, so you're talking of the order of 10 000 times the cost of normal commercial standard software (treating that as SIL 0). Increase the development cost of a life-saving implant by a factor of 10 000 and unless you have massive economies of scale you're going to end up indirectly killing people by pricing it out of the market.
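
Writing that rule of thumb out (the exponent here is only the order-of-magnitude estimate above, not a figure taken from IEC 61508 itself):

```latex
\mathrm{cost}(\mathrm{SIL}\,n) \;\approx\; 10^{\,n}\cdot\mathrm{cost}(\mathrm{SIL}\,0)
\quad\Rightarrow\quad
\mathrm{cost}(\mathrm{SIL}\,4) \;\approx\; 10^{4}\cdot\mathrm{cost}(\mathrm{SIL}\,0).
```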

Makes sense (4, Insightful)

MBGMorden (803437) | more than 4 years ago | (#33048496)

To me, this is just common sense. This code doesn't necessarily have to be FL/OSS in my mind - let them keep the copyright, but it most definitely should have code available for public review. Would you be willing to take a new wonderdrug where the drug company won't tell anyone what's actually in it, but assures you that it'll work? If they must disclose the formula to their drugs, then they ought to be required to disclose the code to their software. Let existing laws like copyright ensure that no one else uses it.

Re:Makes sense (2, Insightful)

betterunixthanunix (980855) | more than 4 years ago | (#33048592)

Except that the mechanisms behind many of the drugs we use are not fully understood by the companies that make those drugs. They only disclose the chemical formula behind the drugs, not the logic of why that particular chemical works the way it does.

Re:Makes sense (1)

MBGMorden (803437) | more than 4 years ago | (#33048614)

While that presents some difficulty, it's still not that bad. IMHO, that's akin to releasing the code with the comments stripped out. Sure, it makes testing more difficult, but it does ensure that you can reproduce their pill/executable to run tests on yourself.

Re:Makes sense (1)

dj961 (660026) | more than 4 years ago | (#33049018)

No, it's more like a black box where we know most of the inputs but don't know all the outputs.

Re:Makes sense (1)

digitig (1056110) | more than 4 years ago | (#33049488)

The number of people in this discussion who think that safety is a matter of testing seems to me to be a pretty good indicator of why the idea won't work.

Re:Makes sense (0)

Anonymous Coward | more than 4 years ago | (#33049286)

This is true of all medical treatment. Until full-body molecular simulation is possible, this will continue to be the case.

Re:Makes sense (1)

ceoyoyo (59147) | more than 4 years ago | (#33048902)

In that case the engineering drawings for a 777 (or anything else) should also be open to public scrutiny. Is that reasonable?

Re:Makes sense (2, Insightful)

MBGMorden (803437) | more than 4 years ago | (#33048988)

Some level of documentation for such things will be available. How do you think A&P's all over the country work on them? Just pop the hood and figure it out as they go along?

And yes, I think that anything on which the safety of a life depends should be open to scrutiny. Alarm clocks and keyboards? Not so much.

Re:Makes sense (2, Interesting)

Blakey Rat (99501) | more than 4 years ago | (#33049226)

The cheap Chinese copies made from the "open to scrutiny" plans are probably not going to be as safe as the Boeing originals.

Re:Makes sense (2, Insightful)

JWSmythe (446288) | more than 4 years ago | (#33049314)

It's the same as arguing that an automobile manufacturer doesn't release the detailed specs of a vehicle because the owner's manual doesn't show a breakdown of the engine. Those specs are available (for a price, of course) to the people that need the information.

Here's the list of manuals for a Boeing 777 [boeing.com].

But neither aircraft nor auto manufacturers, as far as I know, release detailed specs of, say, the software that makes their vehicles work. I doubt A&P mechanics are fixing software flaws in the autopilot, just as auto mechanics can't fix the software in the cruise control, and just as a doctor wouldn't be able to change the software controlling a pacemaker.

I know plenty of automobile electronics have been reverse engineered, but that's due to the number available to work with, and the potential profit to be had from tuning the software. Most of us wouldn't know where to get our hands on a new or used pacemaker to begin reverse engineering it. I definitely wouldn't be able to get my hands on a new or used 777, nor have anywhere to store it. It's a bit bigger than most of our garages, and I can't imagine our significant others not minding that we have one in the garage.

Re:Makes sense (1)

ceoyoyo (59147) | more than 4 years ago | (#33049564)

Mechanics also don't fix the cruise control hardware. They throw out the unit and buy a new one. Ditto with pacemakers and 777s. If the part doesn't work, you get a new one. The provided information usually only describes the system down to the level of replaceable components, not below. The software in each of those replaceable components is part of the black box.

Did your computer come with the circuit diagram for the motherboard (the Apple II did)? Suppose something goes wrong and you need to fix it? Today you don't - you toss it and get a new one.

Re:Makes sense (1)

ceoyoyo (59147) | more than 4 years ago | (#33049502)

I very much doubt the full engineering documentation is available, and it is almost never available publicly. Many critical systems manufacturers will already make some parts of the software specifications, testing methodology, troubleshooting guides, etc. available to people authorized to make repairs or do troubleshooting.

Re:Makes sense (1)

LWATCDR (28044) | more than 4 years ago | (#33049058)

Maybe, but this problem has already been solved.
The aerospace industry has been flying code that can kill for decades. They have procedures to develop and test mission-critical code for everything from navigation to flight control systems.
Is it perfect? No, but the system does seem to work well. Just base the certification process for medical software on the certification process for aerospace software and you have a good working solution.

Re:Makes sense (1)

Snowmit (704081) | more than 4 years ago | (#33049102)

What you're describing is patents on software.

Re:Makes sense (1)

MBGMorden (803437) | more than 4 years ago | (#33049368)

No, I said copyrights. Patents are a completely different animal. I personally see no need to protect the manufacturer's algorithm for detecting a particular trait, even if it could be reimplemented in different code. And yes, I feel that that algorithm should be publicly reviewable if someone's life depends on it.

Re:Makes sense (2, Informative)

Hatta (162192) | more than 4 years ago | (#33049202)

This code doesn't necessarily have to be FL/OSS in my mind - let them keep the copyright

Authors of open source software retain their copyright.

Re:Makes sense (1)

MBGMorden (803437) | more than 4 years ago | (#33049410)

True. Poor choice of wording. Authors of open source code keep their copyright, but it really only applies to the opportunity to close the code again. Any version that has been legitimately released under the GPL and makes it into the wild can't be retracted. Others could still re-release the code again with modifications. That ability doesn't need to be there at all in this case. The point here isn't to promote or enable derivative works, but rather to ensure the safety and security of the code.

Re:Makes sense (1)

daviee (137644) | more than 4 years ago | (#33049210)

Do you think any civil engineer with a degree can just walk onto any construction site and get the site people to hand over everything they need to analyze whether the design, construction, etc. are safe?

Now open that up to let anyone walk in...

Re:Makes sense (0)

Anonymous Coward | more than 4 years ago | (#33049578)

Let us be very clear, the originator of a piece of software absolutely retains the copyright to that code when publishing it under any OSI or FSF approved F/OSS license. Please try your best not to proliferate FUD. The meaning of F/OSS isn't that copyright is relinquished, rather it is a guarantee that it will be enforced in a particular, and permissive, way. To be sure, without the retention of copyright, GPL-style licenses would NOT be enforceable at all. How can I, the originator of a piece of software prosecute an entity for redistributing a GPL-licensed work in violation of the terms of the GPL without retaining my copyright?

Not a question of badly written software (1)

Attila Dimedici (1036002) | more than 4 years ago | (#33048506)

The thing about this is that this is not really a question about badly written software. I think the current regulatory system provides a high enough level of protection against badly written software that making the software open source would not add a significant amount of increased security. However, a greater concern is the possibility that someone could insert code with specific triggers which could be used for malicious purposes. It is not that I believe that they would, it is that the implications for our society if someone did are so severe that some effort must be made to reduce the chances of that happening.

Re:Not a question of badly written software (1)

Fallon (33975) | more than 4 years ago | (#33048848)

The thing about this is that this is not really a question about badly written software. I think the current regulatory system provides a high enough level of protection against badly written software that making the software open source would not add a significant amount of increased security. However, a greater concern is the possibility that someone could insert code with specific triggers which could be used for malicious purposes. It is not that I believe that they would, it is that the implications for our society if someone did are so severe that some effort must be made to reduce the chances of that happening.

I have several diabetic friends with insulin pumps... It's been found that the wireless protocols used by these pumps to communicate with sensors and other devices are plain text and completely unauthenticated. That makes it a very real possibility that somebody could hack the pump and kill my friends, with very few defenses in the way to stop them. A lot of implanted devices are no different: wide open, just relying on people not bothering to try. Very poor security that I wouldn't want to stake my life on. What regulatory system has caused the devices to be secure so far? None.
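
To make concrete what "plain text and completely unauthenticated" means here, a minimal sketch follows. The packet layout, field names, and the hmac_sha256() helper are all hypothetical, invented for illustration; this describes no real pump's protocol.

```c
#include <stddef.h>
#include <stdint.h>
#include <string.h>

/* Hypothetical over-the-air command packet for an insulin pump. */
struct pump_cmd {
    uint8_t  opcode;            /* e.g. 0x01 = set basal rate  */
    uint16_t dose_milliunits;
    uint8_t  mac[32];           /* message authentication code */
};

/* Assumed helper: HMAC-SHA-256 of `len` bytes of `msg` under `key`.
 * Stands in for whatever primitive the firmware actually provides. */
void hmac_sha256(const uint8_t key[32], const uint8_t *msg, size_t len,
                 uint8_t out[32]);

/* What the comment describes: any well-formed packet is obeyed. */
static int handle_cmd_open(const struct pump_cmd *cmd)
{
    return cmd->opcode;                       /* acted on, no questions asked */
}

/* The minimal improvement: verify a MAC computed with a key shared only with
 * the patient's own controller, and drop anything that fails. (A real design
 * would also use a constant-time compare and a counter or nonce to stop
 * replay of previously captured packets.) */
static int handle_cmd_authenticated(const struct pump_cmd *cmd,
                                    const uint8_t key[32])
{
    uint8_t expect[32];
    hmac_sha256(key, (const uint8_t *)cmd, offsetof(struct pump_cmd, mac),
                expect);
    if (memcmp(expect, cmd->mac, sizeof expect) != 0)
        return -1;                            /* forged or corrupted: drop it */
    return cmd->opcode;
}
```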

Re:Not a question of badly written software (1)

bws111 (1216812) | more than 4 years ago | (#33049134)

This is exactly why the code should not be 'open'. You just wind up with hysterics like yours, and very little in the way of actually useful information. Did it occur to you that security may be way down on the list of concerns when making such a device? Actual real-world things are somewhat more important. Like battery life. Trying to maintain communications under all circumstances. Heat generated. Reliability.

Yes, theoretically someone could kill your friends by hacking their insulin pumps. Or, they could just shoot them, or run them down with a car, or poison their food, or steal their insulin, or use any of the other thousands of ways you could kill someone.

Re:Not a question of badly written software (1)

Yetihehe (971185) | more than 4 years ago | (#33049330)

Yeah, but typically not just by walking through a crowded place with a radio in your pocket. Plus, no one would notice it immediately; the user would just collapse some time afterward.

Re:Not a question of badly written software (1)

Attila Dimedici (1036002) | more than 4 years ago | (#33049354)

You are right that there are other things that are more important than security in the design and development of these devices and those things were rightly solved first. Now that those things have mostly been solved it is time to start thinking about security and to start fixing the security issues that were left open while the more important issues were resolved.
Most of the other ways that someone could kill another person leave significant evidence as to who committed the crime. Right now, there is very little chance of identifying who used wireless access to a medical device to kill someone.
This is not an "OMG the sky is falling, do something NOW" type of problem. This is a "You know, this is going to cause problems sooner or later, we should figure out how to prevent as many of those problems as possible as we get the chance" type of problem.

Re:Not a question of badly written software (1)

Darinbob (1142669) | more than 4 years ago | (#33048900)

I've worked many years on medical devices (not implantable). The FDA has strong regulatory oversight, as do many other regulatory bodies (international as well). This includes scouring through bug lists, reviewing QA procedures, etc. And these were relatively safe and benign devices.

Re:Not a question of badly written software (1)

Attila Dimedici (1036002) | more than 4 years ago | (#33049280)

And in what way would any of the various regulatory agencies catch an intentional backdoor left in one of these devices? Or a kill switch?

Open - not necessarily free (1)

kubitus (927806) | more than 4 years ago | (#33048528)

open to allow analysis

and maybe free as in reusable to improve it,

but not necessarily free as in beer!

People will die (1)

clang_jangle (975789) | more than 4 years ago | (#33048534)

People will die from shortcomings of technology, whether the tech is FOSS or not. Capitalism being the state religion in most of the developed world, I think it's safe to say that proprietary software will be with us forever. Fortunately, idealism is still a common enough human trait we can say the same of FOSS. One isn't going to "win" over the other, ever. We will simply continue to have both. And that's as it should be.

Re:People will die (1)

betterunixthanunix (980855) | more than 4 years ago | (#33048638)

Personally, I would not be very comfortable if a company I have absolutely no control over has control over the software that runs the pacemaker in my chest. What if they decide to start charging a yearly fee for their pacemaker software? What if they refuse to provide critical software updates unless I fork over more money to them?

Medical software should be libre, there is just no arguing that. When it comes to software that sustains a person's life, no single corporation or entity of any sort should have absolute control over that software.

Re:People will die (1)

bws111 (1216812) | more than 4 years ago | (#33048850)

Their software makes you 'uncomfortable'? Well, don't use it - that should ease your discomfort. OK, you may be dead, but hey, at least those greedy bastards aren't getting any of YOUR money.

Re:People will die (0)

Anonymous Coward | more than 4 years ago | (#33049164)

Dude, too much paranoia...
1) If things get THIS far, the government should be the one to take action.
2) If they do such a thing, no one else will use their stuff, and they will go out of business.

Re:People will die (1)

digitig (1056110) | more than 4 years ago | (#33049602)

Personally, I would not be very comfortable if a company I have absolutely no control over has control over the software that runs the pacemaker in my chest.

If you needed one, you'd be a damn sight less comfortable without the pacemaker. Briefly.

What if they decide to start charging a yearly fee for their pacemaker software?

How would they enforce it? The things are not connected to the internet, you know.

What if they refuse to provide critical software updates unless I fork over more money to them?

Er -- do you actually know what a pacemaker is? What would comprise a "critical software update"? The whole thing gets replaced every few years, anyway, because of battery life.

Re:People will die (1)

digitig (1056110) | more than 4 years ago | (#33049518)

People will die from shortcomings of technology, whether the tech is FOSS or not.

And since we're talking about medical implants here, even more people will die without the technology. That's no reason not to try to improve the safety of the implants, but we need to keep it in perspective.

You mean this stuff has to be robust? (0, Offtopic)

Just_Say_Duhhh (1318603) | more than 4 years ago | (#33048586)

commonly used by millions of patients to treat chronic heart conditions, epilepsy, diabetes, obesity, and even depression.

Of course, the #1 software used to treat depression (even though it doesn't work) is Facebook. Does that mean they need to go open source?

And Facebook is also the #1 software used to cause chronic heart conditions, diabetes, obesity (and very successfully, thank you).

Re:You mean this stuff has to be robust? (1)

betterunixthanunix (980855) | more than 4 years ago | (#33048678)

Here I was, thinking that Facebook was the software that created depression...

Or was that MySpace?

Lots of tedious details (0)

Anonymous Coward | more than 4 years ago | (#33048588)

I am sure I will be mod'd down for this but... Writing safety critical software requires a level of testing, documentation and traceability that I seriously doubt the open source community would tolerate. Many, if not most, open source projects don't even keep a reasonably current wiki let alone a complete set of traceable requirements, design and formal tests. I don't blame people for not wanting to do this sort of work for free - but it is very important when developing systems which need to run reliably for decades or someone will die.

Re:Lots of tedious details (3, Insightful)

fuzzyfuzzyfungus (1223518) | more than 4 years ago | (#33048700)

I don't think that the notion is that all medical code is going to be written by happy-go-lucky FOSS volunteers; the notion is that people ought to be able to inspect the code that is going to become a part of their life-critical systems...

Re:Lots of tedious details (1)

bws111 (1216812) | more than 4 years ago | (#33048972)

What good is that going to do? Who is qualified to look at it? It seems to me you would not only have to understand programming, but also the actual hardware it is controlling, the conditions it is supposed to be treating, biology, etc. If you can do all that chances are you work for a medical devices company. If someone does inspect the code, and says it is OK, does that absolve the manufacturer from all liability?

Re:Lots of tedious details (1)

Kjella (173770) | more than 4 years ago | (#33048778)

This is pretty much what I'm thinking too: you can't just throw up an open source repository and say that's good enough testing. You have to run through a proper process with testing and documentation of all that testing, and if that is proof of sufficient testing, then what does open source add to that? Sure, there's the possibility that some random open source review will find flaws that the official, documented process missed, but it wouldn't be a necessity. If it were a necessity, it would mean the formal process is insufficient for medical use. Or to put it more bluntly, this is "nice to have".

Where is this ideal world where the FOSS? (1, Insightful)

not already in use (972294) | more than 4 years ago | (#33048610)

The typical FOSS argument usually involves living in a perfectly ideal world. You know, the kind of world where highly qualified individuals scour the internet for code to audit. And where Russian (et al) hackers don't scour open source code looking for exploits to cash in on.

Re:Where is this ideal world where the FOSS? (1, Informative)

Anonymous Coward | more than 4 years ago | (#33049200)

The typical FOSS argument usually involves living in a perfectly ideal world. You know, the kind of world where highly qualified individuals scour the internet for code to audit. And where Russian (et al) hackers don't scour open source code looking for exploits to cash in on.

No, that's the strawman FOSS argument. Most of us FOSS guys are living in the real world, where neither of those things happen.

  FOSS doesn't rely on people "scouring the internet" - just the coders and users of a program tracking down bugs in a natural way, which will usually turn up problems in a timely manner.
  Some security group about 7 or 8 years ago ran a study of a few different webservers and their code flaws -- the result was that they all started out with a similar number of bugs, but the open source project slowly pulled ahead of the closed source project, as its bugs got fixed more often and faster.

  Also, Russian hackers don't "scour open source code looking for exploits" because finding a bad piece of code is an entirely separate issue from finding out how to exploit a flaw. Just because you've found an unchecked boundary or something doesn't necessarily mean you can even exploit it, and it generally doesn't do more than give you a hint of how it might be exploited.
  Which is a huge waste of time, compared to actually banging on the compiled program with automated tools looking for something that works.

Re:Where is this ideal world where the FOSS? (0)

Anonymous Coward | more than 4 years ago | (#33049552)

And where Russian (et al) hackers don't scour open source code looking for exploits to cash in on.

Haha, yes, nice dig against the Russians there. The cold war's over, didn't you get the memo?

For that matter, this isn't about exploits, either - how many pacemakers have you seen that are connected to the Internet? This is about being able to verify the correctness of the code, ideally in a strict mathematical sense (i.e., proving it formally correct), and that's kinda hard to do when you don't have the code in the first place.

Crowd sourcing your Quality Assurance department? (2, Interesting)

stagg (1606187) | more than 4 years ago | (#33048664)

I'm a big fan of FOSS, but I've got my QA methodology hat on right now. Opening up the source code isn't providing better Quality Assurance coverage; it's just getting more eyes on the code, and at best a bunch of User Acceptance Testing. (Though clearly not with pacemakers; I doubt people are going to line up for that Beta test. Unless maybe Blizzard claims it's part of their next big MMO.) This could be as much an argument for higher standards of quality assurance as for open source software. In fact, hell, I can see companies opening up the source to reduce their liability and cut the costs of their own QA.

Re:Crowd sourcing your Quality Assurance department (1)

ducomputergeek (595742) | more than 4 years ago | (#33049500)

From what I've seen, most FOSS projects seem to skip the QA step unless it's a dual licensed product with a company behind it.

Give new meaning to the word (1, Funny)

Anonymous Coward | more than 4 years ago | (#33048688)

heartworm.

Operator Error (2, Insightful)

Darkness404 (1287218) | more than 4 years ago | (#33048710)

Even the best software can go completely wrong with the wrong person operating it.

Favorite Quotes from TFA (1)

xemc (530300) | more than 4 years ago | (#33048714)

In one experimental attack conducted in the study, researchers were able to first disable the ICD to prevent it from delivering a life-saving shock and then direct the same device to deliver multiple shocks averaging 137.7 volts that would induce ventricular fibrillation in a patient. The study concluded that there were no “technological mechanisms in place to ensure that programmers can only be operated by authorized personnel.” Fu’s findings show that almost anyone could use store-bought tools to build a device that could “be easily miniaturized to the size of an iPhone and carried through a crowded mall or subway, sending its heart-attack command to random victims.” ..

Though the adversarial conditions demonstrated in Fu’s studies were hypothetical, two early incidents of malicious hacking underscore the need to address the threat software liabilities pose to the security of IMDs. In November 2007, a group of attackers infiltrated the Coping with Epilepsy website and planted flashing computer animations that triggered migraine headaches and seizures in photosensitive site visitors.[13] A year later, malicious hackers mounted a similar attack on the Epilepsy Foundation website.[14]

Re:Favorite Quotes from TFA (2, Informative)

xemc (530300) | more than 4 years ago | (#33048782)

The article also links to: http://cio-nii.defense.gov/sites/oss/Open_Source_Software_(OSS)_FAQ.htm#Q:_Doesn.27t_hiding_source_code_automatically_make_software_more_secure.3F [defense.gov]

Excerpt:

    Q: Doesn't hiding source code automatically make software more secure?

No. Indeed, vulnerability databases such as CVE make it clear that merely hiding source code does not counter attacks:

        * Dynamic attacks (e.g., generating input patterns to probe for vulnerabilities and then sending that data to the program to execute) don’t need source or binary. Observing the output from inputs is often sufficient for attack.
        * Static attacks (e.g., analyzing the code instead of its execution) can use pattern-matches against binaries - source code is not needed for them either.
        * Even if source code is necessary (e.g., for source code analyzers), adequate source code can often be regenerated by disassemblers and decompilers sufficiently to search for vulnerabilities. Such source code may not be adequate to cost-effectively maintain the software, but attackers need not maintain software.
        * Even when the original source is necessary for in-depth analysis, making source code available to the public significantly aids defenders and not just attackers. Continuous and broad peer-review, enabled by publicly available source code, improves software reliability and security through the identification and elimination of defects that might otherwise go unrecognized by the core development team. Conversely, where source code is hidden from the public, attackers can attack the software anyway as described above. In addition, an attacker can often acquire the original source code from suppliers anyway (either because the supplier voluntarily provides it, or via attacks against the supplier); in such cases, if only the attacker has the source code, the attacker ends up with another advantage.

Hiding source code does inhibit the ability of third parties to respond to vulnerabilities (because changing software is more difficult without the source code), but this is obviously not a security advantage. In general, “Security by Obscurity” is widely denigrated.

New meaning to iHeart Huckabees? (1)

CeruleanDragon (101334) | more than 4 years ago | (#33048728)

New from Apple: the iHeart, with iPod/iPhone docking station. It'll give new meaning to dancing to the beat.

Visible code helps... (1)

91degrees (207121) | more than 4 years ago | (#33048784)

But really what you need is established coding standards designed to substantially reduce the possibility of errors based on known faults, an absolute rule that they not be violated, code reviews, and an independent third party auditing both the standards and the code produced to ensure compliance with those standards.

Double-edged sword (2, Interesting)

gmueckl (950314) | more than 4 years ago | (#33048820)

Making the code available for scrutiny is a double-edged sword. Sure, it might help to catch critical bugs in the software faster. But with some devices, it really raises a host of new issues. Some pacemakers out there are configurable wirelessly once they are in the patient's body. This is actually a very critical feature. But do you want to risk everyone being able to reverse-engineer the protocol used for adjusting the settings for such a device? A wrongly configured pacemaker can be deadly. Right now these things are fairly secure because they are rather rare and not interesting enough as targets for hacking.

Besides, proving that some piece of code works as intended (or at least fails gracefully in all circumstances) in an essentially uncontrolled environment is quite a feat. Embedded equipment is hard to service, has to have a longer hardware lifespan than normal hardware, and is often custom designed for a single application and thus may have subtle hardware defects not reproducible on similar test systems... oh, and who says that the compiler or even the CPU is bug free? This is all common knowledge around here, but I just want to give the full list. What this means is that just opening the source code might not cut it. The validation would have to be performed on the hardware it was designed to run on. So, where's the call to open up the hardware design and documentation to public scrutiny as well?

Re:Double-edged sword (0)

Anonymous Coward | more than 4 years ago | (#33049086)

Making the code available for scrutiny is a double-edged sword. Sure, it might help to catch critical bugs in the software faster. But with some devices, it really raises a host of new issues. Some pacemakers out there are configurable wirelessly once they are in the patient's body. This is actually a very critical feature. But do you want to risk everyone being able to reverse-engineer the protocol used for adjusting the settings for such a device? A wrongly configured pacemaker can be deadly. Right now these things are fairly secure because they are rather rare and not interesting enough as targets for hacking.

So what you're saying is that you're worried about a heart-hack...har harr...

Re:Double-edged sword (5, Insightful)

Hatta (162192) | more than 4 years ago | (#33049272)

But do you want to risk everyone being able to reverse-engineer the protocol used for adjusting the settings for such a device?

Yes. Security through obscurity is essentially no security at all. The only thing that should be secret is the private encryption key that is uniquely associated with the remote control, which should be under strict physical security at all times.

What you say? There's no encryption implemented in these devices? That's a big problem whether the code is open or not.
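
A sketch of the design being described, using libsodium's detached signatures purely as an example primitive; a real implant would use whatever crypto its certified platform provides, and would also need replay protection, which is omitted here.

```c
#include <sodium.h>

/* The implant stores only the *public* half of the programmer's keypair; the
 * private key never leaves the (physically secured) programmer.  Publishing
 * this code gives an attacker nothing: without that private key, no command
 * will verify.  (sodium_init() is assumed to have been called at startup.) */
static int accept_command(const unsigned char *cmd, unsigned long long cmd_len,
                          const unsigned char sig[crypto_sign_BYTES],
                          const unsigned char programmer_pk[crypto_sign_PUBLICKEYBYTES])
{
    /* Returns 0 for a genuine command, -1 for anything forged or corrupted. */
    return crypto_sign_verify_detached(sig, cmd, cmd_len, programmer_pk);
}
```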

How about software escrow? (1, Interesting)

Anonymous Coward | more than 4 years ago | (#33048830)

With rigorous testing and regulation, I don't worry too much about a device while the company that made it is still thriving, but what if they go out of business?

I propose that the FDA require that all design data for "complex" devices, including the software, be placed into some form of escrow so it can be released if the company goes under.

The code is going to do you a whole lot of good .. (3, Interesting)

Ihlosi (895663) | more than 4 years ago | (#33048864)

... if you don't know the hardware it runs on and the external circuitry.

Really. Finding security holes in software that runs on a plain vanilla PC is one thing, finding the cause of glitches in the nanosecond range on an embedded system is another thing entirely.

Re:The code is going to do you a whole lot of good (1)

gknoy (899301) | more than 4 years ago | (#33049264)

No, but it makes it easier for an auditing body to do so, or for competitors to point out (and prove) that their device is safer.

Re:The code is going to do you a whole lot of good (1)

Ihlosi (895663) | more than 4 years ago | (#33049442)

No, but it makes it easier for an auditing body to do so,

Official auditing bodies could have the source code any time they ask for it. They don't.

, or for competitors to point out (and prove) that their device is safer.

... which still doesn't help you a lot if you don't know _their_ hardware. Your software might malfunction in one out of 2^21 cases due to some obscure bug, but their hardware could go up in flames the second you look at it the wrong way.

Re:The code is going to do you a whole lot of good (2, Insightful)

Dribbitz (239455) | more than 4 years ago | (#33049468)

^THIS

Implantable pacing devices, cardioverters, and pumps (life-sustaining devices) depend on complex custom hardware designs as their platform, and that hardware is *highly* interactive with the software. Many of these devices can only achieve their miraculous longevities on a primary cell by deferring functions to hardware. If you don't have access to the information re: the hardware, the code itself might as well be inscriptions in Atlantean glyphs. You'd have to bust trade-secret protection to get a public viewing of everything needed to review the code, because you'd have to see *everything*.

Is there at least some kind of vault storage (1)

kg261 (990379) | more than 4 years ago | (#33048868)

I would imagine this kind of software would be too complex and specialized to be effectively reviewed at large. And who would still be responsible if something was wrong? There would be discussions to no end on how to do things. However, some kind of vault (government or 3rd party) to store the source would be good just to prevent intentional or accidental loss of the information should long term statistics show something is not right. If the software is open source, then the whole design may have to be open.

If i get a pacemaker... (1)

drolli (522659) | more than 4 years ago | (#33048964)

I will overclock it. Live fast. Die young.

The GPLv3 makes this completely impossible (1, Insightful)

Anonymous Coward | more than 4 years ago | (#33049004)

First of all, most device manufacturers would prefer to build based on a closed-source infrastructure so that they do not have to re-publish their source code. So it's unlikely that we will see much GPL software in medical devices. Look at how effectively threats of lawsuits over busybox completely removed Linux from consumer routers post-2005.

Second of all, the GPLv3 prevents you from signing a binary to run on a specific piece of hardware. So no GPLv3 on medical devices.

It is entirely within the rights of free software publishers to impose these restrictions. However, it is disingenuous for them to express surprise that their software is then avoided for certain applications.

After Therac-25, there is no excuse (2, Interesting)

ChipMonk (711367) | more than 4 years ago | (#33049070)

The Therac-20 radiation therapy device worked reasonably well. Despite the software flaws, the hardware safeties in place prevented any deadly accidents. Problem is, because of the hardware safeties, nobody knew just how bad the software was. It had never been formally verified.

Then some numbskull decided, "Hey, let's let the software handle the safety interlocking, and we can cut down on hardware manufacturing costs!" The result was the Therac-25 [wikipedia.org], which maimed and killed people.

After the machine was recalled, someone finally sat down and did a real analysis of the code, and found a whole raft of problems and bad assumptions. Nancy Leveson wrote the definitive report [mit.edu] (PDF) on the failures in the R&D processes that made the Therac-25 so deadly.

Yet, armed with this warning (among many others), both manufacturers and purchasers keep human lives as transactions on a double-entry ledger. It simply comes down to, how many deaths per thousand uses are "acceptable"? Manufacturers and medical facilities already have so many costs. Is it worth it to add on the cost of formal code analysis?

But nobody will ask the Therac-25 victims and their families.

I decided early on in my I.T. career that I didn't want the stress of people's lives depending on my correct code. I hadn't had any training in formal verification. In hindsight, I see my worries would have come from incompetent management more than from myself.

RoboCop Ref (1)

MrTripps (1306469) | more than 4 years ago | (#33049310)

Apparently the Family Heart Center will now let you overclock the Jarvik and Yamaha models, but it voids the warranty.

Licensed Professional Software Engineers? (2, Insightful)

davidwr (791652) | more than 4 years ago | (#33049378)

Some mechanical devices and most bridges and buildings require licensed engineers or architects to put their stamp of approval on the designs. They do not require publication of the engineering or architectural drawings though.

I for one would welcome professional licensing for certain "it can kill you if it goes wrong" software, particularly in isolated devices whose software can't be tampered with undetectably.

If a licensed Professional Software Engineer puts his seal on a pacemaker or airplane, and the software kills someone, he's just as responsible as the civil engineer would be if a faulty bridge design kills someone. In both cases, the licensed professional's responsibility would come back to "was the engineer acting in accordance with professional standards at the time" and "was the device built and maintained in accordance with the design."

UP THE IRONS (0)

Anonymous Coward | more than 4 years ago | (#33049422)

Had to do it, saw them live the other day.

A story from the industry? (1)

Mint Sharpie (1631231) | more than 4 years ago | (#33049530)

A friend of mine who does programming for heart-lung machines once told me about something like this. One of the radiation machines used to treat cancer patients had a multi-core processor, but its programming didn't always take this into account. So every so often, the machine would have a race condition, and administer radiation several thousand times more powerful than it was supposed to. The poor patient would go in for a routine procedure and end up in worse shape than he/she was in already. I'm not sure if this is one of those apocryphal things that floats around or if it actually happened, but it seems like it could be possible if all the software isn't carefully checked first.
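
Whether or not that particular story is true, the failure mode it describes is real and easy to reproduce in miniature. A toy sketch (pthreads, invented names and numbers; nothing here is from any actual machine): two threads both do a check-then-add on a shared dose accumulator, and without the mutex the check and the update can interleave so the safety limit is overshot.

```c
#include <pthread.h>
#include <stdio.h>

#define DOSE_LIMIT 200                  /* arbitrary units, illustration only */

static int delivered = 0;                              /* shared accumulator  */
static pthread_mutex_t lock = PTHREAD_MUTEX_INITIALIZER;

/* Racy version: between the `if` and the `+=`, another thread can pass the
 * same check, so both deliver and the limit is exceeded. */
static int deliver_racy(int units)
{
    if (delivered + units <= DOSE_LIMIT) {
        delivered += units;
        return 0;
    }
    return -1;
}

/* Safe version: the check and the update happen atomically under a lock. */
static int deliver_safe(int units)
{
    int ok = -1;
    pthread_mutex_lock(&lock);
    if (delivered + units <= DOSE_LIMIT) {
        delivered += units;
        ok = 0;
    }
    pthread_mutex_unlock(&lock);
    return ok;
}

static void *worker(void *arg)
{
    (void)arg;
    deliver_safe(150);      /* with deliver_racy, two threads can both succeed
                               and deliver 300 > DOSE_LIMIT                    */
    return NULL;
}

int main(void)
{
    pthread_t a, b;
    (void)deliver_racy;     /* kept only for comparison with deliver_safe      */
    pthread_create(&a, NULL, worker, NULL);
    pthread_create(&b, NULL, worker, NULL);
    pthread_join(a, NULL);
    pthread_join(b, NULL);
    printf("delivered = %d (limit %d)\n", delivered, DOSE_LIMIT);
    return 0;
}
```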