
Graphs Show Costs of DNA Sequencing Falling Fast

timothy posted more than 3 years ago | from the we've-got-your-number dept.

Biotech 126

kkleiner writes "You may know that the cost to sequence a human genome is dropping, but you probably have no idea how fast that price is coming down. The National Human Genome Research Institute, part of the US National Institutes of Health, has compiled extensive data on the costs of sequencing DNA over the past decade and used that information to create two truly jaw-dropping graphs. NHGRI's research shows that not only are sequencing costs plummeting, they are outstripping the exponential curves of Moore's Law. By a big margin."




Great! (2)

the_humeister (922869) | more than 3 years ago | (#35397488)

How about the cost of analysis of said genomes?

Re:Great! (3, Insightful)

hedwards (940851) | more than 3 years ago | (#35397548)

Sequencing is where the cost-reduction focus has gone. It doesn't make much sense to try to reduce the cost of analysis while the sequencing itself still takes a very long time and a huge amount of money. The graph was hard to read, but with the cost still well over $10k, there's a lot more that has to come down before analysis is worth spending a lot of effort economizing.

But as sequencing gets cheaper, more and more of the focus will shift to the analysis side, and the cost of analysis will come down too. Given that insurance isn't going to cover the sequencing at this point, analysis is moot in most cases. As more research analyzes sequenced DNA, I'm sure tricks will be discovered to bring the cost down. Right now you're dealing with low volumes, and cost is higher than it will be at higher volumes.

Re:Great! (1)

TooMuchToDo (882796) | more than 3 years ago | (#35397670)

Define "cost of analysis". I paid 23andme.com $100 per person (myself, my wife, and my brother) to sequence 1 million SNPs per person using Illumina's V3 chip (plus $5/month/person for as long as we keep accounts with them) and to provide current and future research data regarding those SNPs. That is *super cheap* for the kind of data I'm getting out of it (I'd be happy to post an imgur link to an anonymized print version of the report, although I guess it doesn't matter since I've already uploaded the raw data to my PersonalGenomes.org profile).

Re:Great! (4, Informative)

varcher (156670) | more than 3 years ago | (#35397836)

to sequence 1 million SNPs per person

Actually, they're not sequencing.

They're checking.

The way 23andme and most personal genome companies work is that they have those genochips (Illumina) with one million DNA sequences on them, and they check whether or not your DNA has one of those sequences.

If you have a SNP not on the chip (and you have lots of SNPs not on the chip), it won't list anything. If, at a given chromosome locale, they have "all" of the "known" SNPs, but you happen to have a mutant variant not in their library, then your variant won't be detected.

"Sequencing" involves taking your DNA and reading out every sequence, no matter what. And that's still slow and very expensive. We're in the era of the "thousand genomes," meaning we expect in a couple of years to complete a thousand full sequences. Of course, 10 years later we'll sequence everyone, but so far it's still a ways out.
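The distinction the parent draws can be sketched in a toy example (the positions and probes here are made up and nothing like a real Illumina chip): a genotyping chip can only answer questions about sites it carries probes for, while sequencing reads out everything.

```python
# Toy illustration, not a real pipeline: genotyping probes known variant
# sites; sequencing reads every base, probed or not.

# Hypothetical probe set: position -> alleles the chip can detect
chip_probes = {1000: {"A", "G"}, 2500: {"C", "T"}}

def genotype(dna, probes):
    """Report only the sites the chip knows about; novel variants are invisible."""
    calls = {}
    for pos, known_alleles in probes.items():
        base = dna.get(pos)
        calls[pos] = base if base in known_alleles else None  # unknown -> no call
    return calls

def sequence(dna):
    """Read out every base."""
    return dict(dna)

sample = {1000: "G", 2500: "T", 3100: "A"}  # 3100 is a variant no probe covers
print(genotype(sample, chip_probes))  # only the probed sites appear
print(sequence(sample))               # everything appears, including 3100
```

The chip's blind spot is exactly the parent's point: position 3100 simply never shows up in the genotyping output.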

Re:Great! (2)

RDW (41497) | more than 3 years ago | (#35397840)

SNP chips, though they were a big breakthrough when introduced and remain very important in research, have made pretty static progress compared to the dramatic speed with which 'next generation' sequencing technologies have brought down costs and increased the amount of data we have to cope with. Whole genome sequencing is on an entirely different scale: 3 billion bases rather than a million. Even an 'exome' (the sequence of all the actual genes in your genome) runs to about 40 million bases.

Costs Falling Fast? (1, Interesting)

Jeremiah Cornelius (137) | more than 3 years ago | (#35399466)

That's monetary cost - not social and personal. ;-)

Soon the $ cost will be free - and mandatory. If you want to fly, or even drive a car.

Hey! And to think, they said it couldn't happen here!

GATTACA World is coming (0)

Anonymous Coward | more than 3 years ago | (#35398972)

When the costs become trivial and the speed of analysis becomes quick, we will have a GATTACA world.

Re:Great! (2)

RDW (41497) | more than 3 years ago | (#35397682)

"How about the cost of analysis of said genomes?"

It's computationally expensive and pretty much subject to Moore's Law (though improved algorithms like Burrows-Wheeler alignment have helped to speed things up in the last couple of years). So it's getting cheaper, but not fast enough to keep up with the expected deluge of data. If you're just interested in sequencing a fixed number of genomes you benefit from both cheaper/faster sequencing and cheaper/faster processing power. But if you're a major genome centre, be prepared for some serious IT investment, or the bottleneck will increasingly be the speed with which you can crunch the data.
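For the curious, here is a toy sketch of the Burrows-Wheeler machinery behind aligners like BWA and Bowtie: build the BWT of a reference, then count exact occurrences of a read with FM-index backward search. Real aligners add suffix-array sampling, mismatch/gap handling, and heavy engineering; this only shows the core idea.

```python
# Toy BWT + FM-index backward search (exact matching only).

def bwt(text):
    text += "$"  # sentinel, lexicographically smallest
    rotations = sorted(text[i:] + text[:i] for i in range(len(text)))
    return "".join(r[-1] for r in rotations)

def fm_index(bwt_str):
    # C[c]: number of characters in bwt_str smaller than c
    # occ[c][i]: count of c in bwt_str[:i]
    chars = sorted(set(bwt_str))
    C, total = {}, 0
    for c in chars:
        C[c] = total
        total += bwt_str.count(c)
    occ = {c: [0] * (len(bwt_str) + 1) for c in chars}
    for i, ch in enumerate(bwt_str):
        for c in chars:
            occ[c][i + 1] = occ[c][i] + (1 if ch == c else 0)
    return C, occ

def count_matches(read, bwt_str):
    """Number of exact occurrences of `read` in the text behind `bwt_str`."""
    C, occ = fm_index(bwt_str)
    lo, hi = 0, len(bwt_str)
    for c in reversed(read):  # backward search, one character at a time
        if c not in C:
            return 0
        lo = C[c] + occ[c][lo]
        hi = C[c] + occ[c][hi]
        if lo >= hi:
            return 0
    return hi - lo

reference = "GATTACAGATTACA"
print(count_matches("GATTACA", bwt(reference)))  # 2
```

The payoff is that each read character shrinks an interval over the sorted suffixes in constant time, so matching a read is linear in read length regardless of genome size.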

Re:Great! (1)

mapkinase (958129) | more than 3 years ago | (#35398820)

If it's not a secret, which center uses Burrows-Wheeler alignments for annotation? All of the annotation pipelines I know are based on granola BLAST.

Re:Great! (2)

RDW (41497) | more than 3 years ago | (#35399080)

Not for annotation, but for the initial alignment of NGS reads to the reference genome. BWA, Bowtie, and SOAP2 all use the Burrows-Wheeler transform, and are in common use. For variant calling and functional annotation we'd use other tools, e.g.:

http://www.broadinstitute.org/gsa/wiki/index.php/Best_Practice_Variant_Detection_with_the_GATK_v2 [broadinstitute.org]

http://www.openbioinformatics.org/annovar/ [openbioinformatics.org]

Figures (1)

rouge86 (608370) | more than 3 years ago | (#35397490)

The industry has been trying to get sequencing done cheaply for a while now. Good to see that there has been success. Now if only doctors would get on the bandwagon and start diagnosing people based on an individual's genome.

More unnecessary tests.... (0)

Anonymous Coward | more than 3 years ago | (#35397568)

Now if only the doctors get on the bandwagon and start diagnosing people based on an individuals genome.


If someone has a genetic disease, it's pretty apparent that they have it without sequencing anything.

Familial history gives an excellent idea of any propensity for genetic disorders.

The only thing the DNA sequencing will do is add another test, give MDs another revenue stream and a method for CYA, and at the end, our health care costs go up even more.

Re:More unnecessary tests.... (2)

durrr (1316311) | more than 3 years ago | (#35397832)

If someone has a genetic disease it's not always apparent. You can have a genetical susceptibility to tuberculosis; unless you get infected you'd never know it. Or you can have a more general problem such as Marfan syndrome, which traditionally is diagnosed based on aortic root dilatation or similar criteria. It turns out that around half of the people with Marfan syndrome have no traditional manifestations of the disease; that is, in the absence of genetic sequencing you'd never know they have Marfan syndrome.

Re:More unnecessary tests.... (0)

Anonymous Coward | more than 3 years ago | (#35398352)

If someone has a genetic disease it's not always apparent. You can have a genetical susceptibility to tuberculosis; unless you get infected you'd never know it.

Yeah, and?

So, let's say they have a genetic test and find out they have a "genetical" susceptibility to some disease. So what? Are they going to go around and screen everyone else to stay away from them? That knowledge doesn't help the patient. Let's say they have a susceptibility to high blood pressure and heart disease. So what? The doc advises them to exercise, eat right, and cut down on the salt: advice that everyone should follow anyway.

Again, how does this test and cost benefit that patient?

in the absence of genetic sequencing you'd never know they have Marfan syndrome.

If they're not showing symptoms, how will this help in the treatment of the patient?

It's one thing for two people with a familial history of genetic illnesses to want to be screened for child-bearing reasons; it's another to make it a standard test to "find out for sure".

Docs are test crazy these days and it's NOT entirely because of the lawyers - it's another revenue stream. (Gotta make the payments on the BMW, the 5,000 sq. ft. house, and the second home in the Hamptons!)

Re:More unnecessary tests.... (2)

khallow (566160) | more than 3 years ago | (#35399520)

That knowledge doesn't help the patient.

If there was no difference in treatment or health outcome for people who have a known inclination for disease or disorder, then you'd be right. You aren't.

Re:More unnecessary tests.... (0)

Anonymous Coward | more than 3 years ago | (#35398928)

If a program occasionally segfaults, it's pretty apparent that it has a bad bug without looking at any of the source code.

The backups of previous version binaries gives an excellent idea of any propensity for segfaults.

The only thing that a source code viewer does is to add another test, give programmers a revenue stream and another method for CYA, and at the end, our software costs go up even more.

basic questions (0)

Anonymous Coward | more than 3 years ago | (#35397510)

What is DNA sequencing, exactly?

Are different methods required depending on what part is being sequenced? IOW, why was it so hard to finish what was started?

What's it useful for (apart from the obvious "note some symptoms and see if they correlate to some particular sequence")?

Still north of $12,000 (0)

Anonymous Coward | more than 3 years ago | (#35397528)

The graphs seem to indicate that the cost is still north of $12,000 which isn't exactly cheap.

Re:Still north of $12,000 (2)

TooMuchToDo (882796) | more than 3 years ago | (#35397686)

Compared to the $100 million it cost 10 years ago? Yeah, $12K is cheap. Not to *you*, but for research it's dirt cheap.

Re:Still north of $12,000 (1)

Fuzion (261632) | more than 3 years ago | (#35397736)

While $12,000 is quite a bit of money, it's certainly much lower than the July 2001 cost of $100,000,000. Also, my understanding is that most uses don't require sequencing the entire genome, but rather just a small subset of it. The cost per megabase has dropped from nearly $10,000 to less than $0.20, which does seem quite cheap.
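A quick arithmetic check of the figures quoted in this subthread (rounded inputs, not NHGRI's exact numbers) shows how differently the per-genome and per-megabase costs have fallen:

```python
# Figures as quoted in the thread, rounded.
cost_per_genome_2001 = 100_000_000   # ~$100M per genome, July 2001
cost_per_genome_2011 = 12_000        # the figure discussed in this subthread
cost_per_mb_2001 = 10_000            # ~$10,000 per megabase
cost_per_mb_2011 = 0.20              # ~$0.20 per megabase

print(cost_per_genome_2001 / cost_per_genome_2011)  # ~8,333x cheaper per genome
print(cost_per_mb_2001 / cost_per_mb_2011)          # 50,000x cheaper per megabase
```

Either way the drop is several orders of magnitude in a decade, far steeper than Moore's-law doubling would give.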

Re:Still north of $12,000 (4, Informative)

RDW (41497) | more than 3 years ago | (#35398048)

'Also, my understanding is that most uses don't require sequencing the entire genome, but rather just a small subset of it.'

Very small subsets (e.g. individual genes) are still done the 'traditional' way (1990s technology!). Intermediate subsets (like the 'exome') are now done using a pre-selection 'capture' process ('target enrichment') followed by analysis on the same 'next generation' instruments that are used for whole genomes. Right now, this makes sense economically, since it requires less capacity (fewer consumables and less run time) on the expensive sequencers. But as sequencing prices continue to drop, we'll probably reach a point where it's cheaper to do the whole genome than any significant subset (since the 'capture' process is also fairly expensive). Cheaper to do the wet lab stuff, anyway - whole genomes also require much more processing power than useful subsets like exomes.

Re:Still north of $12,000 (1)

HeyBob! (111243) | more than 3 years ago | (#35398008)

Being a logarithmic graph, every tick in this range on the graph is $10,000, so the final number looks like it's over $20k right now...

Re:Still north of $12,000. (No, that's $30,000) (2)

Smurf (7981) | more than 3 years ago | (#35398134)

The graphs seem to indicate that the cost is still north of $12,000 which isn't exactly cheap.

Dude, you are reading the graph incorrectly. Look carefully at the (logarithmic) scale: the cost is actually around $30,000!! (No, those are not factorial signs; I am just expressing my shock at the $30,000 figure.)

Yes, I know, that actually supports your point even more strongly: while the cost was reduced dramatically from $100 million, it seems to be leveling off at a cost that is still way too high for many practical applications in the clinical field outside of research.

DIY? (1)

gclef (96311) | more than 3 years ago | (#35397532)

Question for the bio-folks: is there a way for someone (okay, me) to DIY this? I'm curious to know my own genome, but I'm *very* leery of having that data living in some company's database. What I'd like to be able to do is have the data, and be able to look up how any discoveries later on map to what I've got. Is that possible? What I don't want is what seems to be the prevalent pattern right now of companies telling customers: "You have indicators for x & y. Re-do this & pay us again in a few years when we discover more."

Re:DIY? (2)

MoobY (207480) | more than 3 years ago | (#35397678)

Even in research, most sequencing at the whole-genome level is outsourced to big companies (like, for example, Complete Genomics), since the investment in capabilities, machinery, and computing power needed for whole-genome sequencing is simply too big for one or a few individual genomes (you currently need to invest a few million to get started). You can DIY the sequencing of small fragments (for example, to determine whether a known genetic cause of a hereditary disease that is looming in your family also affects you), but it still requires quite a few skills in molecular biology and a few thousand euros/dollars of investment to get to that level.

Re:DIY? (1)

TooMuchToDo (882796) | more than 3 years ago | (#35397708)

If you could get your hands on a V3 chip from Illumina (the same used by 23andme.com) you could do it, although you could always pay 23andme.com the $200 for the sequencing, download the raw data, and delete your account.

I paid 23andme.com to do genetic profiles for myself, my wife, and my brother: $100/person, and then $5/month/person ongoing as they add more research each month (come on, $5 is cheap for the research data they add, although I guess some people don't see the value).

Re:DIY? (0)

Anonymous Coward | more than 3 years ago | (#35397816)

Unfortunately, SNP ("single nucleotide polymorphism") chips, such as the ones from Illumina that you mention being used by 23andme, are not sequencing technologies as discussed in the article; they are more like fingerprinting of (many) likely sites of single-base substitutions. (Of course these prices are dropping exponentially, too.) Note that chips that cost a few hundred euros also require a few hundred thousand euros of machinery (the cost is mainly in the expensive optics), so this too falls outside the range of DIY biotech.

23andme does not sequence your DNA (2)

drerwk (695572) | more than 3 years ago | (#35398238)

As some ACs have pointed out in response to a few of your posts on this thread, 23andme does not sequence your DNA.
https://www.23andme.com/you/faqwin/sequencing/ [23andme.com]
my emphasis:

What is the difference between genotyping and sequencing?

Though you may hear both terms in reference to obtaining information about DNA, genotyping and sequencing refer to slightly different things.

Genotyping is the process of determining which genetic variants an individual possesses. Genotyping can be performed through a variety of different methods, depending on the variants of interest and resources available. At 23andMe, we look at SNPs, and a good way of looking at many SNPs in a single individual is a recently developed technology called a “DNA chip.”

Sequencing is a method used to determine the exact sequence of a certain length of DNA. Depending on the location, a given stretch may include some DNA that varies between individuals, like SNPs, in addition to regions that are constant. So sequencing is one way to genotype someone, but not the only way.

You might wonder, then, why we don't just sequence everyone's entire genome, and find every single genetic variant they possess. Unfortunately, sequencing technology has not yet progressed to the point where it is feasible to sequence an entire genome quickly and cheaply. It took the Human Genome Project over 10 years' work by multiple labs to sequence the three billion base pair genomes of just a few individuals. For now, genotyping technologies such as those used by 23andMe provide an efficient and cost-effective way of obtaining more than enough genetic information for scientists—and you—to study. Copyright © 2007-2011 23andMe, Inc. All rights reserved.

To be sure, you have gained interesting information for your $200, but you have neither your sequence nor a complete list of differences from a reference human sequence (which, of course, would give you your sequence if you had it). 23andme only gives you a list of many SNPs.

Re:DIY? (1)

RDW (41497) | more than 3 years ago | (#35398792)

'What I'd like to be able to do is have the data, and be able to look up how any discoveries later on map to what I've got. Is that possible?'

You can't do genome sequencing, or even SNP chip genotyping, in a DIY lab, so you'll have to involve a large company or research centre at some point. But you can do this anonymously (e.g. through a physician) and get hold of the raw data afterwards to analyse as you please, assuming you have the technical knowledge to make sense of it. Illumina is one company that provides this sort of service for whole genomes via physicians at a cost of $10,000-20,000 USD:

http://www.everygenome.com/ [everygenome.com]

For much less money (say $200), you can have genotyping done and download the SNP calls for future analysis, e.g.:

http://www.snpedia.com/index.php/23andMe [snpedia.com]

The Question Needs To Be Clarified (1)

damn_registrars (1103043) | more than 3 years ago | (#35399086)

Of course it would be great if we could each get our full genomes - full coverage of every chromosome at high confidence - for an affordable price. However, if you did that, you would find that the vast majority of the information is quite uninteresting or even borderline meaningless. There are large regions of the chromosomes that do not code for anything, and some of those are particularly difficult to sequence accurately. While changes in those regions can be important, they are likely not anything you could easily search out and compare anyway.

If what you actually want is to know how the expressed parts of your genome compare to known genomic assemblies, that is fairly straightforward now. While not exactly cheap, such comparisons are not impossible on a reasonable budget. You could look at your favorite genes - particularly if you know of health conditions in your family that may have clear genetic components - and generate some DNA oligos to sequence them. Of course, that sequencing is not really approachable by most people, because the sequencers are still quite expensive (unless you want to run sequencing gels, which require radioactive and/or carcinogenic dyes and will make you go cross-eyed to boot), so you'll end up having to send your samples out somewhere eventually.

That is why we have various companies doing SNP profiling for a few hundred dollars per person. They can recover the cost of their sequencing and assembly fairly quickly at that rate, and provide some of the most useful information to customers at a manageable price and turnaround time.

So basically it boils down to this: if you want a full genome, every chromosome, you most likely couldn't afford the equipment and materials to do it yourself. If you want just SNPs or even ESTs, you could possibly do it, but it would be vastly impractical to pay for it yourself compared to sending it out to an existing company.

I say this as a grad student in a field closely related to genomics, with several genomics papers on my CV from over the years.

Re:DIY? (0)

Anonymous Coward | more than 3 years ago | (#35400234)

In most places you will get the sequences themselves as well if you buy capacity on a machine (at least where I work we will do that). Though in general some amount of analysis is done on the data, as that's not trivial and requires some amount of computing power.

If you want to do the sequencing yourself, you need a lot of money, as the machines capable of efficient sequencing aren't cheap or very simple to use. Never mind the reagents, which you will need for each run of the machine.

Moore's law is too slow (3, Interesting)

MoobY (207480) | more than 3 years ago | (#35397588)

We've been observing this decrease over the last few years at our sequencing lab too. Some people might find it fascinating, but I, as a bioinformatician, find it frightening.

We're still keeping up with maintaining and analysing our sequenced reads and genomes at work, but the amount of incoming sequencing data (currently a few terabytes per month) is increasing four-to-five-fold per year (compared to doubling every 18-24 months under Moore's law). Our lab got its first human genomes at the end of 2009, after waiting almost 9 years since the world's first human genome; now we're getting a few genomes per month. We're not far from the point where we can no longer install sufficient processing power (which grows only at Moore's-law rates) to process all of this data.

So yes, the more-than-exponential decrease in sequencing costs is cool and offers a lot of possibilities for getting to know your own genome, advances in personalized medicine, and population-wide genome sequencing research, but there's no way we'll be able to process all of this interesting data, because Moore's law is simply way too slow compared to advances in biochemical technologies.
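The mismatch the parent describes is easy to quantify with rough numbers (the 4-5x/year figure is the commenter's estimate, not a measurement):

```python
# Data volume growing ~4.5x/year vs compute doubling every ~18 months.
data_growth_per_year = 4.5
moore_growth_per_year = 2 ** (12 / 18)   # doubling per 18 months ~ 1.59x/year

gap = 1.0
for year in range(1, 6):
    gap *= data_growth_per_year / moore_growth_per_year
    print(f"year {year}: data has outgrown compute by {gap:.0f}x")
```

After only five years of these rates, the data-to-compute gap is over two orders of magnitude, which is why a lab that is "keeping up" today can still see a wall coming.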

Re:Moore's law is too slow (0)

Anonymous Coward | more than 3 years ago | (#35397764)

Your observations are very interesting to me, since even though I work as a research scientist I wasn't aware of the data explosion problem in sequencing. For your terabytes per month, is that the raw data, or data already condensed down to sequence strings?

Are you suffering from a true lack of storage and processing power, or is your field hampered by the failure to adopt modern technology in software and hardware? I personally see a lot of software written for single-core, single-threaded workflows, and we buy new computers rather than rewrite software built on bad decisions.

Re:Moore's law is too slow (2)

Kjella (173770) | more than 3 years ago | (#35397814)

I assume you're talking about incoming data, not the final DNA sequence. As I understand it, the final result is 2 bits/base pair and about 3 billion base pairs, so about a CD's worth of data per human. And if you were talking about a genetic database, I guess 99%+ is common, so you could just store a "reference human" and diffs against that. So at 750 MB for the first person and 7.5 MB for each additional person, I guess you could store 200,000-300,000 full genetic profiles on a 2 TB disk. Probably the whole human race in less than 100 TB.
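The parent's storage arithmetic checks out under its idealized assumptions (2 bits per base, ~1% difference from a shared reference):

```python
# Re-running the comment's back-of-the-envelope numbers.
BASES = 3_000_000_000
full_genome_bytes = BASES * 2 / 8        # 2 bits/base
diff_bytes = full_genome_bytes * 0.01    # ~1% diff against a "reference human"
disk = 2e12                              # a 2 TB disk

print(full_genome_bytes / 1e6)                       # 750.0 MB for one genome
print(diff_bytes / 1e6)                              # 7.5 MB per extra person
print(int((disk - full_genome_bytes) / diff_bytes))  # ~266,000 profiles per disk
```

That lands squarely in the 200,000-300,000-profiles-per-disk range the comment estimates; as the replies below note, the catch is that raw sequencer output is vastly larger than this final 2-bit representation.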

Re:Moore's law is too slow (3, Informative)

RDW (41497) | more than 3 years ago | (#35397942)

Yes, the incoming (and intermediate) data sets are huge. You don't just sequence each base once, but 30-50 times over on average (required to call variants accurately). And you don't want to throw this data away, since analysis algorithms are improving all the time. But it's true that the final 'diff' to the reference sequence is very small, and has been compressed to as little as 4 MB in one publication:

http://www.ncbi.nlm.nih.gov/pubmed/18996942 [nih.gov]

Re:Moore's law is too slow (1)

jda104 (1652769) | more than 3 years ago | (#35398010)

I assume you're talking about incoming data, not the final DNA sequence. As I understand it, the final result is 2 bits/base pair and about 3 billion base pairs, so about a CD's worth of data per human. And if you were talking about a genetic database, I guess 99%+ is common, so you could just store a "reference human" and diffs against that. So at 750 MB for the first person and 7.5 MB for each additional person, I guess you could store 200,000-300,000 full genetic profiles on a 2 TB disk. Probably the whole human race in less than 100 TB.

The incoming data is image-based, so yes, it will be huge. Regarding the sequence data: yes, in its most condensed format it could be stored in 750 MB. There are a couple of issues you're overlooking, however:
1. The reads aren't of uniform quality - and methods of analysis that don't consider the quality score of a read are quickly coming to be viewed as antiquated. So each two-bit "call" also carries a few more bits representing the confidence in that call.
2. This technology is based on redundant reads. To get to an acceptable level of quality, you want at least ~20 (+/- 10) reads at each exonic locus.
So the 750 MB you mention for a human genome grows by a factor of 20, then by another factor of 2 or 3, depending on how you store the quality scores.

Your suggestion of deduplicating the experiments could work, but definitely not as well as you think because of all the "noise" that's inherent in the above two steps.

If you really just wanted the unique portions of a sample, you could use a SNP array, which just reads the sample at specific locations known to differ between individuals. Even with the advances in the technology, the cost of sequencing a genome still isn't negligible. For most labs, it's still cheaper to store the original data for reanalysis later.
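The parent's two factors compound roughly like this (ballpark figures from the comment, not measured file sizes):

```python
# Base calls plus per-call quality, times redundant coverage.
genome_mb = 750        # 2-bit base calls for ~3 billion bases
coverage = 20          # ~20 redundant reads per locus
quality_factor = 3     # extra bits per call for confidence scores

raw_mb = genome_mb * coverage * quality_factor
print(raw_mb / 1000)   # ~45 GB of calls + qualities per genome
```

So the "CD's worth of data" estimate balloons by a factor of around 60 before you even get to the image data the sequencers originally produce.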

Re:Moore's law is too slow (2)

RDW (41497) | more than 3 years ago | (#35398184)

'The incoming data is image-based, so yes, it will be huge.'

The image data is routinely discarded by at least some major centres; the raw sequence and quality data alone is huge enough to be a major issue! See:

http://www.bio-itworld.com/news/09/16/10/Broad-approach-genome-sequencing-partI.html [bio-itworld.com]

'It's been a couple of years since we saved the primary [raw image] data. It is cheaper to redo the sequence and pull it out of the freezer. There are 5,000 tubes in a freezer. Storing a tube isn't very expensive. Storing 1 Terabyte of data that comes out of that tube costs half as much as the freezer! People [like Ewan Birney at EBI] are working on very elaborate algorithms for storing data, because you can't compress bases any more than nature already has. The new paradigm is, the bases are here, only indicate the places where the bases are different . . . In 2-3 years, you'll wonder about even storing the bases. And forget about quality scores.'

Re:Moore's law is too slow (1)

jda104 (1652769) | more than 3 years ago | (#35398050)

Interesting. I view this from a completely different perspective: if DNA sequencing really is outpacing Moore's Law, that just means that the results become disposable. You use them for your initial analysis and store whatever summarized results you want from this sequence, then delete the original data.

If you need the raw data again, you can just resequence the sample.

The only problem with this approach, of course, is that samples are consumable; eventually there wouldn't be any more material left to sequence. So this wouldn't be appropriate in every situation.

Re:Moore's law is too slow (1)

RDW (41497) | more than 3 years ago | (#35398260)

'If you need the raw data again, you can just resequence the sample.'

See my reply above to another post - this is exactly the approach that some centres are taking. But as you say, some samples can't be regarded as a consumable resource (e.g. archival clinical material is often only available in limiting quantities).

Re:Moore's law is too slow (1)

Anonymous Coward | more than 3 years ago | (#35398172)

Indeed, as a developer/sysadmin in a bioinformatics lab, I find this equally terrifying.

As of two years ago, when my supervisor went to a meeting at Sanger (one of the largest sequencing centres in the world, for those reading this - the granddaddies of large-scale sequencing), they said a few frightening things. First, they were spending more on power and other items related to data storage than on chemical supplies for sequencing. Second, the cost of resequencing something, compared to storing the sequenced data, had dropped so much that they were actually storing replicates in biological form and resequencing them as needed.

We have a large-scale metagenomics project coming up. Luckily it's only bacterial data, but oh my, is my brain starting to spin about what infrastructure upgrades will be needed to handle it. Even shifting it around between computational clusters, you start to question whether gigabit ethernet is enough. My 3-year-old, 14 TB file server is almost full, and it'd be considered small these days; even just migrating that data to a newer system takes days, literally days.

It's an exciting and scary time for bioinformatics.

Re:Moore's law is too slow (0)

Anonymous Coward | more than 3 years ago | (#35398626)

Does this mean that we will soon be able to buy recorded music encoded into DNA such as that of yeast and play it on a Sony sequithizer costing less than a car? The DRM implications are interesting.

Re:Moore's law is too slow (1)

mapkinase (958129) | more than 3 years ago | (#35398870)

Do not forget storage problems. The center I know is already dropping annotations at any level closer than a substrain. Given recent setbacks in the budgets of American national centers (a raw-sequence-data storage project was dropped at one of them), the problem will only get bigger.

Re:Moore's law is too slow (1)

timeOday (582209) | more than 3 years ago | (#35399126)

It sounds like the problem is storage capacity more so than processing capacity - is that so?

Re:Moore's law is too slow (0)

Anonymous Coward | more than 3 years ago | (#35400314)

It's probably more processing. The new sequencing techniques work by doing a huge number of parallel short reads and then aligning them (usually to a reference genome). You'll often sequence at >10-fold coverage so you can call individual bases with high confidence. Crunching through that much data has got to be a pain (which is why most of us just farm the whole process out to a company).

Of course, that's still relatively simple compared to whole-methylome sequencing...
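A toy illustration of why high coverage lets you call bases confidently, in the spirit of the ">10-fold coverage" the parent mentions (real variant callers model error rates and quality scores; this just takes a majority vote at one position, with made-up thresholds):

```python
from collections import Counter

def call_base(pileup, min_depth=10, min_frac=0.8):
    """Majority-vote call at one position; None if coverage or agreement is too low."""
    if len(pileup) < min_depth:
        return None  # not enough redundant reads to trust a call
    base, n = Counter(pileup).most_common(1)[0]
    return base if n / len(pileup) >= min_frac else None

print(call_base(["A"] * 11 + ["G"]))  # 'A' -- deep coverage, one likely read error
print(call_base(["A"] * 5))           # None -- coverage too low to call
```

With deep coverage, a single erroneous read gets outvoted; with shallow coverage you simply can't distinguish error from variant, which is why the redundant reads (and the data they generate) are unavoidable.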

Talking about jaw dropping (0)

Anonymous Coward | more than 3 years ago | (#35397606)

How 'bout that burger?

What happened in... (2)

diewlasing (1126425) | more than 3 years ago | (#35397618)

...Oct 2007?

Re:What happened in... (1)

Anonymous Coward | more than 3 years ago | (#35397696)

...Oct 2007?

FTFA: "From 2001 to 2007, first generation techniques (dideoxy chain termination or ‘Sanger’ sequencing) for sequencing DNA were already following an exponential curve. Starting around January of 2008, however, things go nuts. That’s the time when, according to the NHGRI, a significant part of production switched to second generation techniques [wikipedia.org]."

Re:What happened in... (0)

Anonymous Coward | more than 3 years ago | (#35399372)

Amazon EC2 went live.

Re:What happened in... (0)

Anonymous Coward | more than 3 years ago | (#35399474)


Starting around January of 2008, however, things go nuts. That’s the time when, according to the NHGRI,

which links off to:


Re:What happened in... (0)

Anonymous Coward | more than 3 years ago | (#35399488)

If you had bothered to RTFA you would know that this was when new-generation sequencing technology started to take off.

Re:What happened in... (1)

Anonymous Coward | more than 3 years ago | (#35399500)

In 2008, 454 Life Sciences released the Genome Sequencer FLX, the first affordable next-generation sequencer to become widely available. Since then a number of other high-throughput sequencers have been released (including Illumina and SOLiD). This marks the beginning of the 2nd-generation sequencing era; prior to this, the method used was Sanger-based sequencing, which, although completely automated nowadays, is still based on principles established in the 1970s and is comparatively slow. Note that the second-generation sequencing technologies, although much faster and much cheaper for large-scale analysis, are very error-prone, and for accurate de novo sequencing, Sanger-based methods are still preferred.

Re:What happened in... (0)

Anonymous Coward | more than 3 years ago | (#35399874)


Starting around January of 2008, however, things go nuts. That’s the time when, according to the NHGRI, a significant part of production switched to second generation techniques [wikipedia.org].

Deja vu all over again... (1)

swm (171547) | more than 3 years ago | (#35397622)

I saw the same thing back in the mid-1990s.
Sequencing technology was ramping up hyper-exponentially.
That means that it curves up on semi-log paper.
It was outstripping Moore's Law, and crushing our data systems.

Finished DNA sequence only needs 2 bits/base pair,
but the raw data behind those 2 bits can be much bigger;
in our case, the raw data was scanned images of radiograms.

In the early '90s, a typical sequencing project was a few hundred DNA fragments.
Each fragment is a few hundred base pairs.
You put each fragment in a file, and put all the files in a subdirectory on disk.
The file system becomes your de facto database, and it works OK,
because there are only ~10K base pairs in the whole project.

When I got there, a typical sequencing project had grown to a few thousand fragments.
They were still keeping everything in the file system,
and it took a dozen networked DECstations with big (1GB!) drives to manage all the projects.

Then the biologists went 40x in one generation.
(Remember, RAM only goes 4x per generation.)
That meant that a sequencing project now had 40K fragments.
They bought HUGE (9GB!) drives, and had a TB of storage online,
which was unprecedented at the time.

They were still keeping all the fragments in files in subdirectories,
because who has time to rewrite your data systems when you have all that DNA to sequence?
The system was slowly grinding to a halt;
the weekly (tape) backups were taking longer than a week to complete;
they fell behind on their OS updates.
Eventually there was file system corruption,
and then the whole thing came crashing down around our ears.

It was an exciting time.

FWIW, the graphs in the article don't show continuous hyper-exponential growth.
They show two discrete jumps (doubtless due to introduction of new sequencing technologies),
one in 2003 and one in 2008.
After each jump, the growth rate returns to exponential (straight line).
And the last three data points show the cost bottoming out at $0.30/MBase.
Of course, that may just be the plateau before it falls off the next cliff.
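swm's 2-bits-per-base figure is easy to see concretely. Here's a minimal, purely illustrative Python sketch that packs finished sequence at 4 bases per byte, which is exactly why the raw trace images, not the finished bases, dominated the storage:

```python
CODE = {"A": 0, "C": 1, "G": 2, "T": 3}

def pack(seq):
    """Pack a DNA string into bytes at 2 bits per base (4 bases/byte)."""
    out = bytearray()
    for i in range(0, len(seq), 4):
        chunk = seq[i:i + 4]
        b = 0
        for base in chunk:
            b = (b << 2) | CODE[base]
        out.append(b << 2 * (4 - len(chunk)))  # left-align a final partial byte
    return bytes(out)

seq = "ACGTACGTACGT"
print(len(seq), "bases ->", len(pack(seq)), "bytes")  # 12 bases -> 3 bytes
```

A few-hundred-base-pair fragment is under 100 bytes packed; it was the scanned radiogram images behind each fragment that ate the 9GB drives.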

Re:Deja vu all over again... (3, Informative)

RDW (41497) | more than 3 years ago | (#35399156)

'Of course, that may just be the plateau before it falls off the next cliff.'

The next cliff is already emerging through the mist, e.g.:

http://www.genomeweb.com/sequencing/life-tech-outlines-single-molecule-sequencing-long-pieces-dna [genomeweb.com]

http://www.wired.com/wiredscience/2011/01/guest-post-introduction-to-nanopore-sequencing/ [wired.com]

It's not clear which 'single-molecule' technology will eventually win out, but it will almost certainly have the word 'nano' in it somewhere.

Read the stats (1)

houghi (78078) | more than 3 years ago | (#35397688)

What happened is that in July '07 a new way to do it was introduced. As it was a new technique, it started off 'expensive' and became cheaper. For the last three months it has been back on the standard trend, just as it was before.

It's as if you compared the drop in household costs of one family after the family moved to a cheaper estate.

So yes, in numbers it has become cheaper, but it must also be clear that you are comparing two ways of doing things, and obviously people will select the cheaper one.

babys/LSI/w+dog; dna advances unquantifiable.. (-1)

Anonymous Coward | more than 3 years ago | (#35397698)

by any mechanism available to us. we're just responding to the need, they repeat. they really know stuff that we do not. they reek of good (as in pure) intentions. trouble believing? look at what we're 'believing in' now? better yet, come to one of the many scheduled million baby+ play-dates, consciousness arisings, photon sharing sessions, georgia stone editing(s),,, & a host of other life promoting events.

a few of them (pure intentions), already occurring, by the way;

1. DEWEAPONIZATION (not a real word, but they like it) almost nothing else good happens until some progress here.

2. ALL BABYS CREATED/TO BE TREATED, EQUALLY. (a rough interpretation (probably cost us. seems like a no-brainer but they expressed that we fail on that one too(:)->) 'we do not need any 300$ 'strollers', or even to ride in your smelly cars/planes etc..., until such time as ALL of the creators' innocents have at least food, shelter, & some loving folks nearby.' again, this is a deal breaker, so pay attention, that's cheap enough, & could lead to our survival?

3. THOU SHALT NOT VACCINATE IRRESPONSIBLY. this appears to be a stop-gap intention.

the genuine feelings expressed included; in addition to the lack of acknowledgment of the advances/evolution of our tiny bodies/dna (including consciousness & intellect), almost nobody knows anymore what's in those things (vaccines) (or they'd tell us), & there's rumor much of it is less than good (possibly fatal) for ANY of us. if it were good for us we'd be gravitating towards it, instead of it being shoved in our little veins, wrecking them, & adversely affecting our improving immune systems/dna/development? at rite-aid, they give the mommies 100$ if they let them stick their babys with whoknowswhat? i can see why they're (the little ones) extremely suspicious? they're also asking that absolutely nobody be allowed to insert those corepirate nazi 'identity' 'chips' in their tiny frames. they know who they, and we, are, much better than we ever will? many, oddly? have fading inclinations to want to be reporters of nefarious life threatening processes, ie. 'conspiracies', as they sincerely believe that's 'stuff that REALLY matters', but they KNOW that things are going to be out in the open soon, so they intend to put their ever increasing consciousness, intellect, acute/astute senses & information gathering abilities, to the care & feeding of their fellow humans. no secrets to cover up with that goal.


sortie like a no-(aerosol tankers)-fly zone being imposed over the whole planet. the thinking is, the planet will continue to repair itself, even if we stop pretending that it's ok/nothing's happening. after the weather manipulation is stopped (& it will be) it could get extremely warm/cold/blustery some days. many of us will be moving inland..., but we'll (most of us anyway) be ok, so long as we keep our heads up. conversely, the manufactured 'weather' puts us in a state of 'theater' that allows US to think that we needn't modify our megaslothian heritage of excessiveness/disregard for ourselves, others, what's left of our environment etc...? all research indicates that spraying chemicals in the sky is 100% detrimental to our/planet's well being (or they'd talk to US about it?). as for weather 'extremes', we certainly appear to be in a bleeding rash of same, as well as all that bogus seismic activity, which throws our advanced tiny baby magnets & chromosomes into crisis/escape mode, so that's working? we're a group whose senses are more available to us (like monkeys?) partly because we're not yet totally distracted by the foibles of man'kind'. the other 'part' is truly amazing. we saw nuclear war being touted on PBS as an environmental repair tool (?depopulation? (makes the babys' 'accountants' see dark red:-(-? yikes. so what gives? thanks for your patience & understanding while we learn to express our intentions. everybody has some. let us know. come to some of our million baby play-dates. no big hurry? catch your breath. we'll wait a bit more. thanks.

do the math. check out YOUR dna/intentional healing potential. thanks again.

Re:babys/LSI/w+dog; dna advances unquantifiable.. (1)

Skidborg (1585365) | more than 3 years ago | (#35397798)

I'd have an easier time believing what you had to say if your grammar didn't obscure your actual meaning to anyone but a cryptographer.

market at work (-1, Offtopic)

roman_mir (125474) | more than 3 years ago | (#35397766)

as long as government stays out with its regulations and subsidies that promote monopolies, the costs will continue dropping.

This just shows how invalid the ideology is behind the notion that inflation is the right thing, that government needs to be in health insurance and health care, and that the private sector cannot do an efficient job in health care and insurance.

Just a little while ago in this thread [slashdot.org] I compiled data from various sources (including gov't statistics and some research papers) showing that private health care and insurance were affordable, and in fact preferred by the population, before the gov't, insurance, and health-provider companies colluded and before the gov't provided the moral hazard in the form of Medicare and Medicaid (and CHIP). Before that, prices for health care and health insurance were low, and most people preferred private insurance plans even to things like Blue Shield/Cross.

Health insurance and medical treatment are just normal goods; there is nothing magical about them, and the normal rules of the market apply. Prices fall with increases in production and competition. That's before gov't gets involved; once collusion and price fixing happen and gov't money floods in, prices rise sky high, completely out of whack (same with education prices, and everything else government money is spent on).

Markets that are less influenced by government regulations and subsidies than others, such as computers and medical procedures not covered by government money (LASIK eye surgery), have more competition and innovation, and prices in those markets go down, not up. This is true even as new advances are made and new technologies are introduced. Those technologies are not cheap, but they are used at massive scale in a competitive market, and prices fall.

You are not paying millions of dollars for your computers, and you are able to buy many computers for a reason. That reason being that government is mostly uninvolved.

This is why I am and always will be against such things as government regulating anything, including 'net neutrality' laws, etc. AFAIC, any government initiative, law, subsidy, or tax can be explained by reversing its official intent.

So if government is supposedly involved in making medical insurance less expensive and more affordable, expect the insurance to become less affordable and more expensive.

If government is saying it will protect you from terrorists, expect completely nonsensical policies that will in the end create more terrorists (all this while your real rights are stripped from you and you're left sitting there, holding your dicks in your hands, with no right to anything at all).

If government is saying it will fix the economy by printing money and spending, then you know what's coming: complete destruction of the economy and collapse of the monetary system.

If government is saying it's going to bring you clean energy by various mandates, expect huge energy shortages, the eventual failure of those clean-energy policies to deliver enough energy, the loss of real industries and real innovation, and an eventual situation where most of the country is forced backwards, to the dirtiest but cheapest ways of getting energy.

If anything good comes out of the incoming economic disaster, hopefully it will be that Keynesian policies, the policy of having a federal bank, fiat currency, and regulation of the economy and money by government are discredited completely (of course nothing happens completely), but hopefully mostly, as the government shows itself not just totally impotent on these issues but actually the driving force behind the economic disasters of the 20th and 21st centuries.

Re:market at work (0)

Anonymous Coward | more than 3 years ago | (#35397844)

All of the technological improvements - the very technologies that were created - were funded by the government in the form of NHGRI and DOE grants. The entire next generation is also being funded directly by the government, while the commercialization of these has been commercially funded. And note that the technologies developed to date are for resequencing human genomes, which is not the same thing as de novo sequencing. The costs have come down, as has the utility of the sequence.

Re:market at work (1)

roman_mir (125474) | more than 3 years ago | (#35397968)

And how much of the money that government spent was wasted and never produced anything at all? You are assuming that government grants are a good thing, while forgetting that this is money that was taken out of the economy by taxing the private sector (direct taxes, or borrowing, which is deferred taxes plus interest, or just money printing, which taxes the entire net worth via inflation).

How much money was wasted by government? How many projects have failed?

All of the wasted money, and even the money that finally did give something back at the end, all of that money was taken away from the pool of credit that the private sector never saw.

You are assuming for some reason that the private sector couldn't have done the same or better research if the credit had been available to it, not in the form of government grants but in the form of normal business credit. That's a strange assumption as well. Most research is done privately, especially by manufacturing businesses, which require constant innovation to outcompete the next guy, bringing production costs down while coming up with the next new product or service.

No. I don't see government provided incentives as good in any cases, even when you point out that there were some successes.

Well, of course there were; you can't throw an infinite amount of money at a huge number of people without something sticking.

In private sector you have to make sure that what you do is not going to be all waste, that it will provide some return. This is not a bad thing, this is a good thing, it keeps people from chasing the wrong thing for too long. What prevents anybody in government from chasing the wrong thing for too long while spending seemingly infinite resources on that?


Are the resources really unlimited? I don't think so. I think the resources have run out; there is nothing left but printing and currency destruction.

But back to the point at hand: the prices for DNA sequencing are falling quickly. There is nothing that should hold prices up for ANY technology, in ANY field; this includes medical fields as well as computers, military, energy, etc.

The real difference between prices falling and going up is government force. Government always causes prices to go up, especially in the long run.

Re:market at work (1)

the gnat (153162) | more than 3 years ago | (#35399138)

In private sector you have to make sure that what you do is not going to be all waste, that it will provide some return.

Yes, and it needs to provide some return in a relatively short amount of time. Almost all research done by private companies is aimed at product development; only a handful of corporations have enough resources (i.e. spare cash) to fund undirected basic research. Non-commercial basic research, regardless of funding source, has no such constraints, and can afford to take a much longer-term view.

Some understanding of history is useful here:
1972: first viral genome sequenced
1975: Sanger method of DNA sequencing invented
1980: Sanger wins Nobel prize
mid-1980s: first public discussion of Human Genome Project
1990: Human Genome Project officially founded
1994: First bacterial genome sequenced
1998: First multicellular organism sequenced
2001: "drafts" of human genome published
2006: Final human chromosome sequence officially complete

Good luck getting VCs to fund your company based on that timeline - and if a publicly traded corporation decided to bet the farm on technology that wouldn't pay off for 30-plus years, they'd be at risk of a shareholder suit. I know you're going to mention Celera, but they only became involved after the sequencing technology (and computers) had improved immensely, and much of the basic groundwork had already been done. (Also, it was never clear to me what their business model was anyway.)

One of the important (but seldom-mentioned) essential ingredients for low-cost personal genomics is the availability of multiple high-quality, complete human genome sequences. Most of the next-generation technologies that have driven down the cost of genomics are not designed for de novo assembly; they depend on having a reference sequence to align the fragments. The fact that the reference sequences are deposited in a public database makes it much easier for everyone to pursue these technologies, whether for profit or for science.
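The reference-dependence described above can be illustrated with a toy "seed and verify" matcher in Python. Real aligners (BWA, Bowtie, etc.) are far more sophisticated, tolerating mismatches and gaps, but the basic idea of indexing the reference once and then looking reads up in it is the same:

```python
def index_reference(ref, k=4):
    """Build a k-mer -> list-of-positions index over the reference."""
    idx = {}
    for i in range(len(ref) - k + 1):
        idx.setdefault(ref[i:i + k], []).append(i)
    return idx

def place_read(read, ref, idx, k=4):
    """Seed with the read's first k-mer, then verify the full read
    against the reference at each candidate position (exact match only)."""
    return [pos for pos in idx.get(read[:k], [])
            if ref[pos:pos + len(read)] == read]

ref = "ACGTACGGTTACGTAA"   # stand-in for a reference genome
idx = index_reference(ref)  # built once, reused for every read
print(place_read("GGTTAC", ref, idx))  # [6]
```

Without a reference (de novo assembly), reads instead have to be stitched together by their mutual overlaps, a much harder problem - hence the point that publicly deposited reference sequences are part of what makes cheap resequencing possible.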

Re:market at work (1)

roman_mir (125474) | more than 3 years ago | (#35400366)

Some understanding of history is useful here


  • Thales of Miletus, at around 600 BCE, described a form of static electricity
  • At around 450 B.C. Democritus, a later Greek philosopher, developed an atomic theory
  • Georg Ohm in 1827 quantified the relationship between the electric current and potential difference in a conductor
  • Michael Faraday in 1831 discovered electromagnetic induction
  • James Clerk Maxwell in 1873 published a unified theory of electricity and magnetism
  • Werner von Siemens in 1866 invented the industrial generator, which didn't need external magnetic power
  • James Wimshurst in 1878 developed the Wimshurst machine
  • Edison in 1882 switched on the world's first large-scale electrical supply network that provided 110 volts direct current to fifty-nine customers in lower Manhattan
  • Heinrich Hertz in 1886 succeeded with his radio wave transmitter
  • Nikola Tesla in 1887 filed a number of patents related to a competing form of power distribution known as alternating current. Thus began the War of Currents. (Tesla: induction motors, polyphase systems. Edison: telegraph, stock ticker, GE)
  • Alexander Popov in 1896, transmitted wireless signal across 60 meters
  • Guglielmo Marconi in 1896, transmitted wireless signal across 2.4 km.
  • John Fleming in 1904 invented the first radio tube, the diode.
  • Robert von Lieben in 1906 invents the amplifier tube (triode)
  • Lee De Forest in 1906 also invents the amplifier tube (triode)
  • Edwin Howard Armstrong in 1933, FM radio.


Aviation history, airplanes and helicopters, even rocketry and nuclear physics (prior to the nuclear bomb).
Computer technology history.
Medical developments.
Food processing.
Steam engines.
Factories and assembly lines.

If you do pay attention to history you may finally recognize that most of the inventions, most of the innovations, most of the work was done privately, with private interests, private money, private people working out of interest and profit motives.

SOME things were done with public funds, this is especially true when it concerns weapons and military.

The huge experiments that government does normally end up costing something enormous, so they have to be subsidized forever, and if the subsidies stop, the entire construct falls apart. The Apollo missions to the Moon come to mind immediately. They were successful in themselves, and yet nobody has been to the Moon in my lifetime, and I am not that young anymore either. This is because the subsidies for that kind of stuff stopped.

All government subsidies stop eventually; that is why technologies and structures that are built with public funding and require constant subsidies to operate are and always will be inferior to those developed in the private sector, EVEN if it takes more time to develop them through the private sector, because the timing also must be right for the idea and implementation to make sense.

I completely disagree that government subsidies are either desirable or good in the long run for anything.

AFAIC the only spending that gov't should be given a mandate to do is for minimum military protection necessary to maintain liberty and a justice system that's fair and takes care of criminal and contract laws.

I don't actually TRUST governments to do even those things right, but most people are not ready to see where the most logical conclusion of real competition must take them, because they are not free in their thinking.

Re:market at work (1)

repapetilto (1219852) | more than 3 years ago | (#35399164)

Government funding provides the degree of stability necessary to complete huge, perhaps decades-long, projects. Also, aside from things that get classified as state secrets and such, the raw data and details of the processes become publicly available, decreasing the need to reinvent the wheel, and therefore waste. Finally, money gets spent on projects that may not have any immediately obvious benefit to Joe Sixpack but that increase humanity's understanding of the world around them. So research is actually one of the valid uses of government money.

It's not a matter of having access to the huge amounts of money. It's a matter of having it practically guaranteed, which allows researchers (as a group) to have priorities beyond capitalization.

Re:market at work (1)

roman_mir (125474) | more than 3 years ago | (#35399600)

I disagree fundamentally that there should be such ridiculous things as huge, decades-long projects. Who the hell do you want to trust with ridiculous projects like that: people with no accountability and no reason to bother cutting costs, because their bottom line won't be affected? Why? Why would I ever want to fund something like that with my money? I will not. I will find every possible way to avoid funding it. It's ridiculous.

Besides, you say something is 'guaranteed' because it's done by government. Hello. That is obviously, ridiculously, and patently untrue.

All government spending programs are pyramid schemes. Here is an example - an SS trustee explaining [youtube.com] that SS was never an actual fund; it was never insurance, it was never an investment strategy. The money was spent immediately and replaced with US bonds, which are all debt that has to be realized to make payments. SS is a Ponzi scheme; so is Medicare/Medicaid, so is the government-based education system, so is gov't-based energy policy, and so is the foundation of the nation's economy: the government-based monetary system.

I can't trust government with the basics, so why would I trust it with anything that requires many years, even decades, where the possibility of huge abuse and fund misallocation is enormous, the temptations are enormous, the possibilities for corruption are enormous, where the funding is all tax/borrowing/printing based, and where there is no profit motive and NO WAY to shut the fucking flood gates down?

I don't trust it. I won't trust it. I won't participate in it.

That's why I don't hold currencies for example.

Re:market at work (1)

Thing 1 (178996) | more than 3 years ago | (#35398188)

All of the technological improvements - the very technologies that were created - were funded by the government in the form of NHGRI and DOE grants. The entire next generation is also being funded directly by the government, while the commercialization of these has been commercially funded. And note that the technologies developed to date are for resequencing human genomes, which is not the same thing as de novo sequencing. The costs have come down, as has the utility of the sequence.

That's all well and good, but we don't have a "control world" where the government didn't interfere and people created technological improvements themselves, at a different pace. My belief is that it would be a more rapid pace, but since we don't have this control world we will never know if the government helped or hindered progress.

Re:market at work (0)

Anonymous Coward | more than 3 years ago | (#35399160)

Sorry, I should have said tulips. No one would have put billions into the technology if the government had not created the project. This entire field only exists because of government-funded academic scientists who thought it was important. This isn't some generic technology and science that someone would have invented because they could make money on it. It will be interesting to see if any of the current companies will even have a market. The cost and intended target of resequencing humans is a darn narrow target.

Re:market at work (1)

Thing 1 (178996) | more than 3 years ago | (#35399842)

"No one would have" is not arguable. I maintain that someone would have. But we don't have a control world. So we can only hold differing opinions, one of which is more likely.

Re:market at work (0)

Anonymous Coward | more than 3 years ago | (#35399992)

Yes, in this case I am correct; however, also feel free to argue that the market would work on ocean and space exploration, telescope development, physics, biochemistry, HPC computing, weather modeling (heck, any modeling), weapons development, math, ecology - just about any scientific advance we made in the 20th century. Do you see why the other guy said tulips? The market does not favor innovation; it favors optimization and reduction in direct cost to the manufacturer. It favors externalizing costs and then passing them back onto the taxpayers. The government isn't the best at spending money in the best ways; however, I can give you 50 failed startups for every unsuccessful government grant. And by your criteria all of the gov grants for sequencing research were wildly successful - some of the companies founded to exploit them were sadly not.

Re:market at work (2)

ObsessiveMathsFreak (773371) | more than 3 years ago | (#35398602)


Tulips.

Your arguments are invalid.

Re:market at work (1)

roman_mir (125474) | more than 3 years ago | (#35398628)

Tulips? That's your entire argument? Yeah, I see how my argument is invalid because some European governments in the 1600s were DEBASING their currencies by replacing the monetary metals in them with common non-monetary metals.

You are invalid.

Re:market at work (1)

sjames (1099) | more than 3 years ago | (#35398700)

I would say that the current state of healthcare in the U.S. proves conclusively that "the market" isn't universally effective. Our system of private healthcare has the highest cost per patient of any in the world and slightly poorer outcomes than countries with socialized medicine.

Another problem with a pure market approach is rather intrinsic. Markets always settle to a state where some segment of the population is priced out. We have no good way to make a market not have that tail. When the market is for private yachts or sports cars, that's acceptable, but when the consequence of being part of the tail is death, it's unconscionable. We distort the market because we MUST. It's not even purely a matter of ethics (though I consider that paramount). It's a simple practicality that otherwise we will have people ACTUALLY holding hospitals hostage to have their children's lives saved. We couldn't even honestly blame them fully for doing that. Is that what you hope to see?

You look at LASIK and see a market with less interference from government. I see a market where people are free to exit without dying. There are cheaper and less frightening alternatives available, even if they are much less convenient. If treatment for a stroke or heart attack were purely elective, it would be a lot cheaper as well, with or without government interference.

No free market can exist where the consumer's options boil down to buy or die. When you're unconscious in the ambulance, there is no rational market decision to be made. You will go to the hospital, you will get whatever treatment is deemed necessary and you will get a bill for the full amount. Imagine how much a new DVD player would cost if sales people could 'sell' them to you under threat of death!

Thus far, regulation has stayed out of gene sequencing (other than market distorting IP laws) exactly because nobody is forced into a buy or die situation, it hasn't been subject to rampant privacy abuses, and it can't kill you if someone does it wrong. As long as those conditions remain true, regulation will probably be minimal (other than the IP laws).

That's not to say that government is always good. We would be a lot better off for example if 100% of the money funding wars in Iraq and Afghanistan as well as the entirety of the TSA and drug interdiction were diverted to providing healthcare (including addiction treatment).

If we REALLY want to get rid of the problems in government, explicitly define the act of giving corporations sweetheart legislation against the interests of the constituency in exchange for bribes/donations to be an act of treason. Since it would be up to those same legislators to pass that law, a significant amount of public torch and pitchfork action may be required.

Re:market at work (1)

the gnat (153162) | more than 3 years ago | (#35399248)

Our system of private healthcare has the highest cost per patient of any in the world and slightly poorer outcome than countries with socialized medicine.

It isn't even a binary choice between private healthcare and socialized medicine, of course. The other first-world nations generally have superior outcomes regardless of the exact type - many have private hospitals and private insurance - but they all have more heavily regulated health insurance markets, and some degree of subsidies. This too is anathema to conservatives, but it isn't the Orwellian nightmare of government micromanagement that the rhetoric suggests. (Much as the right likes to bash the Canadian or European health-care systems, I have yet to meet anyone from those countries who has lived in the US for a significant amount of time and actually prefers our system.)

Getting back to the original subject, there is a risk that governments will regulate personal genomics as a "diagnostic test", and require authorization from a doctor. (You can imagine who will be lobbying for this.) However, perhaps a greater risk is that the huge number of gene patents - obtained in an era where DNA synthesis and sequencing were prohibitively expensive for individuals - will either stop the industry in its tracks, or drive the cost up by thousands of dollars. If we take some of these patents at face value, simply by sequencing your genome and running annotation programs on it, you are already violating them. Most of these are effectively unenforced, otherwise the academic researchers would have been screwed years ago, but once personal genome sequencing becomes widespread (and profitable), I'm sure we'll see the patent holders creeping out from under rocks to file lawsuits.

Re:market at work (1)

roman_mir (125474) | more than 3 years ago | (#35399516)

<quote>I would say that the current state of healthcare in the U.S. proves conclusively that "the market" isn't universally effective</quote> - you start with a broken premise. The broken premise is that there is a free market in US health care, which is not true.

If you had bothered to follow the link I provided to a different thread in the top comment, you would have seen the information there showing that the private sector was the preferred way to buy both health insurance and health services, because it did so inexpensively and effectively thanks to real competition. Then the Nixon administration screwed it up dramatically, turning what was supposed to be health insurance into health account management, and what was supposed to be a normal good/service (providing health care) into a subsidized monopoly.

It's not the market that is causing the problems for US health insurance and health care, it's the government.

<quote>We distort the market because we MUST</quote> - I disagree violently. We must do no such thing if we want inexpensive, affordable health care and a healthy economy in general.

There are always poor people in any economy; you will not get a 'perfect' distribution of wealth and access regardless of what type of economy you have. However, in a real free market economy you have more choices and more competition, which ends up serving the poorer people better than other systems do.

<quote>No free market can exist where the consumer's options boil down to buy or die.</quote> - I disagree violently. In a free market there are more choices than that. First, people have always provided health services for little or no payment, depending on the situation. There have always been volunteers and charities; the government takes that idea and perverts it on its head, turning it into an obligation, and one that is enforced by the threat of violence.

<quote>explicitly define the act of giving corporations sweetheart legislation against the interests of the constituency in exchange for bribes/donations to be an act of treason.</quote> - more legislation?

No, <a href="http://slashdot.org/comments.pl?sid=1799588&cid=33702246">I gave the correct proposal</a>, that is only based on the original idea in the Constitution that does not require more legislation. It only requires more strict following of the original agreement that was ratified by the States.

I will repeat it again: <b>Congress shall pass no law, that changes the status of any entity in a way that allows that entity to get any preferential treatment in economy.</b> However some more strengthening is required: <b>All national/federal banks are explicitly forbidden with an intent to prevent government from printing fiat currencies and setting short term interest rates.</b> and another one: <b>All income taxes are explicitly forbidden.</b>

Re:market at work (1)

Brannoncyll (894648) | more than 3 years ago | (#35399438)

Personally I find it abhorrent to trust the health of my country to corporate entities that treat lifesaving medicines as 'products' and dying people as 'consumers', and who have proved time and time again that they are willing to drop all dregs of morality that once existed in their black souls for a quick buck. I would much rather have the government in charge, because at least in principle they care for the well-being of every citizen and not just the selfish pricks who are happy to let thousands of their fellow countrymen die as long as it doesn't affect their income.

Re:market at work (1)

roman_mir (125474) | more than 3 years ago | (#35399540)

If you had actually followed the provided link, you would have seen the data - both government-collected data and university research - supporting the claim that private enterprise did a better job of providing health insurance and health care than any government entity.

I find it abhorrent to trust anything at all to the government, especially anything that deals with money, regulations of business, taxes, any sort of health care.

The only purpose for a federal government is common military defense and a common justice system (and I really do not trust them with either of those, but if you must have a federal government, those are the only things it should be given a role in).

Re:market at work (1)

Brannoncyll (894648) | more than 3 years ago | (#35399658)

I'm sure they're doing a great job for those who can afford it. But what about the estimated 45,000 people who die every year [reuters.com] for lack of health insurance? Somewhere around 45 million Americans are uninsured, or almost 15% of the population! Do you not see anything wrong with this picture?

Faster Than Moore's Law - I Should Hope So (1)

nick_davison (217681) | more than 3 years ago | (#35397782)

"they are outstripping the exponential curves of Moore's Law. By a big margin"

Moore's law simply states that the number of transistors that can be inexpensively placed on an integrated circuit doubles every two years.

This is a relatively new area of science. New techniques can be expected to evolve, as will refinements of existing ones. As it moves from the domain of a very few skilled individuals at universities to more of a commodity where $100 buys you your family tree, economies of scale kick in. And then there's the technical refinement of a new process, where successive revisions, even if transistor counts/quality/sizes remained the same, would ensure its own rate of change.

So, off the top of my head, there are at least four different factors in addition to Moore's law. It's hardly surprising it's outstripping one of the five. Any new area of technology that also leverages transistor advances should do so.

The dangers of extrapolation (1)

PolygamousRanchKid (1290638) | more than 3 years ago | (#35397864)

The Economist had an excellent article about this a while back, using the number of blades in a razor as the example. They made a graph with time on the bottom and the number of blades on the left, then drew a curve that fit. For a long time, there was only one blade. Then there were two, and that held for a while. Then came three, then four, and now we have five. Now, using sound mathematical methods to extrapolate this curve, The Economist projected that by 2020 a razor will have something like 40 or 50 blades.

The lesson? Just because this worked for Moore's Law, doesn't guarantee that this will hold true for other industries.
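The Economist's joke is easy to reproduce. As a minimal sketch (the first-appearance years below are rough approximations invented for illustration, not the article's actual chart data), fitting an exponential through the blade counts and extrapolating a century out predicts a comically multi-bladed razor:

```python
import math

# Rough first-appearance years for N-blade razors (illustrative
# approximations, not The Economist's actual data).
blades = {1971: 2, 1998: 3, 2004: 4, 2006: 5}

# Ordinary least squares on log(blades) vs. year, i.e. an exponential fit.
xs, ys = list(blades), [math.log(n) for n in blades.values()]
mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
intercept = my - slope * mx

# Extrapolate the fit a century out: dozens of blades per razor.
projected = math.exp(intercept + slope * 2100)
print(f"Projected blades in 2100: {projected:.0f}")
```

The fit looks perfectly "sound" over the data it was trained on; the absurdity only shows up when you extrapolate far outside it, which is exactly the lesson.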

Re:The dangers of extrapolation (0)

Anonymous Coward | more than 3 years ago | (#35398002)

You could have just linked http://xkcd.com/605/

Moore's Law? (2)

fahrbot-bot (874524) | more than 3 years ago | (#35398058)

NHGRI's research shows that not only are sequencing costs plummeting, they are outstripping the exponential curves of Moore's Law. By a big margin.

Moore's Law [wikipedia.org] is about the number of transistors on a wafer and other directly-related hardware density issues, not about cost - and certainly not the cost of gene sequencing.

Re:Moore's Law? (0)

Anonymous Coward | more than 3 years ago | (#35398428)

Yeah we know that.

The problem in bioinformatics is that computers aren't getting faster quickly enough as our datasets get larger. Right now you need a world-class supercomputer to properly analyse a genome.

Next: synthesis, please? (0)

Anonymous Coward | more than 3 years ago | (#35398108)

Reading is great, but there's like 30 bajillion (okay, maybe only 30) next-generation DNA sequencing companies out there grabbing money and kicking butts. What strikes me as odd is that there hasn't been comparable advancement in DNA synthesis [google.com] (it's still at $0.30/bp using phosphoramidite synthesis), save for ligation and oligo libraries. Meanwhile my cells are busy synthesizing my own genome for less than $0 per chromosome.
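Taking the commenter's figure at face value, a quick back-of-envelope calculation shows just how far chemical synthesis still is from genome scale:

```python
# Back-of-envelope: cost to chemically synthesize a whole human genome
# at the phosphoramidite price quoted above ($0.30/bp), against the
# usual ~3 billion bp estimate for the haploid human genome.
cost_per_bp = 0.30
genome_size_bp = 3_000_000_000

total_cost = cost_per_bp * genome_size_bp
print(f"${total_cost:,.0f}")  # roughly $900 million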
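Taking the commenter's $0.30/bp figure at face value, a quick back-of-envelope calculation shows just how far chemical synthesis still is from genome scale:

```python
# Back-of-envelope: cost to chemically synthesize a whole human genome
# at the phosphoramidite price quoted above ($0.30/bp), against the
# usual ~3 billion bp estimate for the haploid human genome.
cost_per_bp = 0.30
genome_size_bp = 3_000_000_000

total_cost = cost_per_bp * genome_size_bp
print(f"${total_cost:,.0f}")  # roughly $900 million
```

So synthesis would need its own many-orders-of-magnitude price drop before it could follow the sequencing curve.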

Re:Next: synthesis, please? (1)

TamCaP (900777) | more than 3 years ago | (#35398406)

Not sure how serious you are, but cells are hardly synthesizing your own genome for less than $0.
Just two things:
a) why less than 0? Do you get money out of it?
b) let's try not giving you any food for a month and see how well your DNA synthesis goes then...

in the language of "linked" (0)

Anonymous Coward | more than 3 years ago | (#35398422)

The drop is not necessarily indicative of a long-term trend. Predictions of long-term exponential growth are based on the normal rate of network growth, but an occasional Bose-Einstein condensation effect can produce very rapid growth spikes. These spikes are due to the adoption of a new approach; the normal growth rates generally come from incremental efficiency improvements.

Moore's "law" (0)

Anonymous Coward | more than 3 years ago | (#35398488)

I don't understand why Moore's predictions about integrated circuits would apply to the cost of gene sequencing. Unless there is a relationship that I am missing, let's keep Moore's "law" to the realm of electronics.

quality note (1)

mapkinase (958129) | more than 3 years ago | (#35398702)

As a person in the field, I have to say that one has to consider the quality of genomes in the field (at least bacterial genomes). The so-called "complete genome" submissions of the past, in the form of full continuous sequences of the organism's chromosomes and plasmids, are staying in the past; almost all new submissions are WGS (whole genome shotgun) submissions, which are basically a bunch of "contigs": pieces of sequence not connected together, 10s, 100s and sometimes 1000s of them. (This is a result of adopting 454-style sequencing.) Those contigs do not cover the genomes completely; they leave gaps between them.

But that's a minor problem compared to artificial frameshifts in poly-A areas: one such frameshift in a CDS (coding) area and the gene is gone (they can also occur naturally, which complicates the situation even more).
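A minimal sketch of why one frameshift kills a gene (the sequences below are toy examples invented for illustration, not real genes): a single base dropped from a poly-A run - a typical 454/pyrosequencing homopolymer error - shifts every downstream codon and erases the in-frame stop.

```python
# Toy illustration of a homopolymer (poly-A) sequencing error causing a
# frameshift; the sequences are made up for this example.
def codons(seq):
    """Split a sequence into in-frame codons, dropping any trailing bases."""
    return [seq[i:i + 3] for i in range(0, len(seq) - len(seq) % 3, 3)]

true_cds = "ATGAAAAAAGGTTGCTAA"  # start codon, poly-A run, ..., stop (TAA)
observed = "ATGAAAAAGGTTGCTAA"   # sequencer dropped one A from the run

print(codons(true_cds))  # ['ATG', 'AAA', 'AAA', 'GGT', 'TGC', 'TAA']
print(codons(observed))  # ['ATG', 'AAA', 'AAG', 'GTT', 'GCT'] - frame lost, no stop
```

Every codon after the shortened run is wrong, so annotation software sees garbage instead of a gene, which is exactly why these artifacts are so destructive.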

$1999 sequence exome ad in Nature. (1)

peter303 (12292) | more than 3 years ago | (#35398882)

A full-page advertisement in Nature, a leading science journal. The company was Axeq, based in Korea. Here is the ad [axeq.com]. The exome is the 2% of the genome that appears to code for proteins.

Of course it has, that's commodity pricing (0)

Anonymous Coward | more than 3 years ago | (#35399420)

It has to fall even more so that the US government can continue its eugenics programs in a cost-effective manner.

Economic of Scale vs Moore's Law? (1)

netdigger (847764) | more than 3 years ago | (#35399518)

Moore's Law shouldn't be what the graph is based on; a best-fit line would be much more accurate at predicting the price. Technology can have an effect on the price: increases in processor speed and more efficient algorithms can both decrease the time spent processing, and therefore the overall cost. But economies of scale will have a much more dramatic effect. When firms increase in size or increase their production, their costs generally go down until they hit a certain point. http://en.wikipedia.org/wiki/Economies_of_scale [wikipedia.org]

Re:Economic of Scale vs Moore's Law? (2)

mlush (620447) | more than 3 years ago | (#35399706)

Moore's Law is exactly what we should be measuring this against. CPU speed is proportional to the amount of data that can be processed, and it looks like we're headed for an era where there is more data than we can process!