
GPUs Helping To Lower CT Scan Radiation

kdawson posted more than 4 years ago | from the healthy-green-glow dept.

Gwmaw writes with news out of the University of California, San Diego, on the use of GPUs to process CT scan data. Faster processing of noisy data allows doctors to lower the total radiation dose needed for a scan. "A new approach to processing X-ray data could lower by a factor of ten or more the amount of radiation patients receive during cone beam CT scans... With only 20 to 40 X-ray projections and 0.1 mAs per projection, the team achieved images clear enough for image-guided radiation therapy. The reconstruction time ranged from 77 to 130 seconds on an NVIDIA Tesla C1060 GPU card, depending on the number of projections — an estimated 100 times faster than similar iterative reconstruction approaches... Compared to the currently widely used scanning protocol of about 360 projections with 0.4 mAs per projection, [the researcher] says the new processing method resulted in 36 to 72 times less radiation exposure for patients."
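A quick sanity check of the quoted figures (an illustrative back-of-the-envelope calculation, assuming patient dose scales linearly with projections × mAs per projection):

# Dose figures from the summary, assuming total tube output
# (and hence dose) scales with projections * mAs per projection.
conventional = 360 * 0.4            # ~144 mAs for the standard protocol
low, high = 20 * 0.1, 40 * 0.1      # 2 to 4 mAs for the new protocol

print(conventional / high)          # 36.0 -> "36 times less radiation"
print(conventional / low)           # 72.0 -> "72 times less radiation"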


Voodoo lied to us (2, Funny)

jgtg32a (1173373) | more than 4 years ago | (#33003192)

They said we'd use these processors for video games, not medical technology.

http://www.youtube.com/watch?v=o66twmBEMs0 [youtube.com]

Funny what drives the HPC market... (3, Insightful)

onionman (975962) | more than 4 years ago | (#33003290)

It's remarkable that high performance computing is driven by video games. So, legions of PC enthusiasts and uber-gamers, I salute you for your contributions to technology! P0wn on.

Re:Funny what drives the HPC market... (1)

spire3661 (1038968) | more than 4 years ago | (#33003346)

I translated your post to "Battle on, Heroes"

Re:Funny what drives the HPC market... (0)

Darkness404 (1287218) | more than 4 years ago | (#33003416)

How is that remarkable? Just about every major function of a typical computer can be done with a low-end Celeron and 2 GB of RAM. Games are about the only thing that low-spec system can't do. And gamers enjoy paying lots and lots of money for the best. If I can do everything that I do now on a $300 computer, why would I pay $800 and get a quad-core unless I'm a gamer? Yes, there are a few other areas such as CAD and the like that need high-powered systems, but in the cost-conscious world, gamers are the only ones who are both paying the money and really need the power. Whereas companies that need CAD have no problem spending $4K on a system, a gamer is only going to spend half to a quarter of that on a system.

Re:Funny what drives the HPC market... (2, Insightful)

toastar (573882) | more than 4 years ago | (#33003566)

Games are about the only thing that low-spec system can't do

I take it you've never met someone whose job it is to solve the wave equation on very large datasets.

Re:Funny what drives the HPC market... (4, Funny)

localman57 (1340533) | more than 4 years ago | (#33004030)

I met one once. I pulled his underwear up over his head, then took his lunch money.

Re:Funny what drives the HPC market... (2, Insightful)

ljw1004 (764174) | more than 4 years ago | (#33004076)

That sounds fun. Is it available on Steam?

Re:Funny what drives the HPC market... (1)

jgtg32a (1173373) | more than 4 years ago | (#33005862)

Yup,

store.steampowered.com/app/8500

Re:Funny what drives the HPC market... (1)

poetmatt (793785) | more than 4 years ago | (#33003570)

wow, obvious troll much?

Ever tried to do anything graphically intensive on a Celeron with 2 GB of RAM? Here's a hint: it won't work.

Re:Funny what drives the HPC market... (1)

Jeng (926980) | more than 4 years ago | (#33003688)

Graphically intensive? I think that was his point: if you do not need a graphically intensive computer, then a Celeron with 2 GB of RAM will do.

He wasn't trolling, but you I'm not so sure of.

Re:Funny what drives the HPC market... (1, Informative)

Andy Dodd (701) | more than 4 years ago | (#33003856)

MATLAB is rarely ever graphically intensive...

Re:Funny what drives the HPC market... (4, Insightful)

Jeng (926980) | more than 4 years ago | (#33003936)

Neither is email or internet usage.

I'm pretty sure the comment was about general usage, which is normally just email and internet with some office apps thrown in. That is what a Celeron with 2 GB of RAM would be sufficient for.

Yes, there are many, many programs used in many fields that would not fit into the Celeron-with-2-GB comment. I work in an office environment; we don't need massive processors, we don't need massive video cards, all we need is a low-end processor with a good amount of RAM.

That is what I got from reading his comment, but apparently I am in the minority.

Re:Funny what drives the HPC market... (2, Interesting)

Score Whore (32328) | more than 4 years ago | (#33004666)

You and a couple of others in this sub-thread are defining the problem backwards. As near as I can tell, your approach is to look at computer A and computer B and then say "B is five times faster than A, therefore I need B." The correct way is to lay out your requirements (technical, financial, and SLAs for delivery of your "product") and then identify the system you need.

While it's nice to be able to cache gigabytes of data, the reality is that 2 GB is a fuckload of memory. Say you have a 21 MP camera (a 5D Mark II, for example) and want to do some imaging work. Give up 1 GB of your RAM to your OS and apps. The remaining 1 GB can hold more than six complete copies of your images at 16 bits per channel + 16 bits of alpha. If you've got 8 bits per channel, then you can have twelve copies. For a 10 megapixel, 8-bits-per-channel image (sufficient for most commercial work), 1 GB is enough space for twenty-five images in RAM simultaneously. For the vast majority of users that's enough. Yes, it's possible for that not to be enough, but that says more about the user than the system.
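That arithmetic is easy to verify; here is an illustrative snippet (it uses decimal megapixels and binary mebibytes, which is where small rounding differences against the figures above come from):

def frame_mb(megapixels, bits_per_channel, channels=4):
    # Size of one uncompressed frame buffer (e.g. RGB + alpha), in MiB.
    return megapixels * 1e6 * channels * bits_per_channel / 8 / 2**20

print(frame_mb(21, 16))   # ~160 MiB: six-plus copies fit in 1 GB, as claimed
print(frame_mb(10, 8))    # ~38 MiB: roughly 25 copies per GB, as claimed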

Re:Funny what drives the HPC market... (0)

Anonymous Coward | more than 4 years ago | (#33010736)

Say you have a 21 MP camera (a 5D Mark II, for example) and want to do some imaging work. Give up 1 GB of your RAM to your OS and apps. The remaining 1 GB can hold more than six complete copies of your images at 16 bits per channel + 16 bits of alpha. If you've got 8 bits per channel, then you can have twelve copies.

You have never done anything more than red-eye reduction in the GIMP. You calculated the frame buffer size properly, yes, but there are many problems with using that as the basis for estimating required memory.

1) Layers - each new layer used to blend, channel-mask, or otherwise create special effects is (usually) the size of the original frame buffer. On top of that you have to have space for the results of your blending calculations. So Original + Effect + ResultFrameBuffer = 3x the frame buffer size to add a single layer with a multiply blend (a simple HDR effect).

2) Non-destructive undo - to be "non-destructive" it has to save every layer for every previous step in editing the image. This is a big step forward for the artist over "destructive undo", which remembers how it got to the current place and reverses the formulae to take a step back. The reason this is a huge feature is floating-point error accumulation: 1.1 * 0.999999999999 / 0.999999999999 != 1.1 (on binary FPU hardware).

Those two reasons combined mean that if you intend to do minimal graphic work, a better guideline is 4x the base frame buffer size of the image. For serious graphic work you can never have enough memory - a baseline is more like 10x the frame buffer size, but that's in the same way that "minimum system requirements" on video game boxes work; to work well you want at least twice that.
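The floating-point claim in 2) is easy to demonstrate; an illustrative round-trip (the exact drift depends on the values and the FPU, hence the hedged comments):

x = 1.1
for _ in range(1000):
    x = x * 0.999999999999    # each rounded multiply can add up to 0.5 ulp of error
for _ in range(1000):
    x = x / 0.999999999999    # reversing the formula does not reverse the rounding
print(x)          # typically drifts a few ulps away from 1.1
print(x == 1.1)   # very likely False after 2000 rounded operations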

Re:Funny what drives the HPC market... (1)

Score Whore (32328) | more than 4 years ago | (#33022410)

You have never done anything more than red-eye reduction in the GIMP. You calculated the frame buffer size properly, yes, but there are many problems with using that as the basis for estimating required memory.

I've done plenty of graphic design, I just don't use crappy tools. If your tool requires a full-size copy of what the image was before every single change, then your tool is hopelessly naive in its implementation.

If you're going to refute my claim, then refute my claim. All you did was say "In my opinion graphics design needs lots of memory."

My point is that 2 GB is a lot of memory. You say 10x is a "minimum system requirement". For commercial work, at 10 megapixels and 8 bits per channel, 1 GB of RAM provides enough space for 25 complete copies. And even at that, I gave away half the total system RAM as step 1. 1.5 GB available for apps in a 2 GB RAM system is pretty normal. And even if you bump your images up to 16 bits per channel (RGB+A), you've got enough space for 19 full-size layers/frame buffers/images.

Re:Funny what drives the HPC market... (1)

Apple Acolyte (517892) | more than 4 years ago | (#33005636)

2 GHz Celerons don't cut it for heavy web surfing, either. You'll probably have problems on YouTube. A low-end machine won't cut it for photo editing, or video editing, or audio editing.

Re:Funny what drives the HPC market... (1)

ooshna (1654125) | more than 4 years ago | (#33006338)

Most people don't do heavy video, audio, or photo editing. The most people usually do is crop and resize photos for Facebook or MySpace and light editing of video for YouTube. I used to do video editing on my old 466 MHz Celeron with 64 MB of RAM. Sure, it was slow, but most people don't need to edit and preview in real time.

Re:Funny what drives the HPC market... (1)

poetmatt (793785) | more than 4 years ago | (#33006774)

Maybe you want to look at what happens if you try to run dual or triple screens on an integrated graphics card. Hint: it doesn't go well, even at moderate resolutions.

My point was that what people think of as general work isn't always CPU-focused. Some of the time? Sure. Most of the time? I wouldn't say so.

Re:Funny what drives the HPC market... (1)

jandrese (485) | more than 4 years ago | (#33003458)

Well, the market for actual high performance computers is way too small to fund the R&D necessary to build those crazy GPUs. The high performance computing folks should thank their lucky stars that games went in a direction that required more and more processing power (well in excess of what CPUs can provide) and that the GPU companies didn't decide to just leave them out in the cold. There have already been stories of some jagoff putting a few GPUs in a box and outperforming million-dollar supercomputers at some narrow task. They're not going to go away as GPUs get more and more ridiculously overpowered, either.

Re:Funny what drives the HPC market... (1)

The Master Control P (655590) | more than 4 years ago | (#33012016)

I'm about to get a pair of nVidia C2050s, so I'm really getting a kick out of being one of those jagoffs.

Suck it, Poisson equation. Suck all 16 million cells in under 90 seconds (under 10 once the 2050s arrive).

Re:Voodoo lied to us (1)

spire3661 (1038968) | more than 4 years ago | (#33003332)

I miss 3dfx. Definitely my life's second technology love crush, after I got over Sony. My first video card setup was a Trident and 2 Voodoo IIs in TRUE SLI.

NOOOOOO! (1)

Chas (5144) | more than 4 years ago | (#33003224)

But I want that mutation in the "rage center" of my brain!

The idea is I turn into this huge green, angry thing (currently all I'm lacking is the green pigmentation).

Then it's HULK SMASH!

Re:NOOOOOO! (1)

maxume (22995) | more than 4 years ago | (#33003484)

If you promise to do a bunch of hilarious stuff and also wear it at trial, I'll buy you some paint.

Re:NOOOOOO! (1)

ByOhTek (1181381) | more than 4 years ago | (#33003548)

Well, would you accept an external supplement instead?

ATI RAGE (appropriately named) cards had a similar effect on me about 8-10 years ago.

Another benefit (-1, Offtopic)

Anonymous Coward | more than 4 years ago | (#33003258)

Lying on those chilly tables with the hospital gowns just got a little warmer. Plus there's enough cycles left over to play Crysis.

so what processors are used (0)

Anonymous Coward | more than 4 years ago | (#33003268)

So what processors are being used now? Must be some DSP type. How does their processing power compare to the GPU's?

ATI needs to get off its ass (1)

mdm-adph (1030332) | more than 4 years ago | (#33003276)

And start paying developers to make things in OpenCL instead of CUDA, or they're going to be quickly left behind.

Re:ATI needs to get off its ass (1)

etherway (1842902) | more than 4 years ago | (#33009904)

They've already been left behind, or else they would not have to pay developers to not use CUDA. Also, nVidia has better OpenCL support than ATI in terms of performance and stability despite the fact that it's not their first choice language for GPU development (obviously).

What ATI actually needs to do is stop treating software development like some minor aspect of their GPU production that can be haphazardly tossed together. They have much, much better hardware than nVidia on paper and yet they are merely competitive when it comes to graphics and much worse when it comes to general purpose computation, because their drivers seem to have been written by a high school programming class.

Timeline (1)

dimethylxanthine (946092) | more than 4 years ago | (#33003292)

As usual when applying advances in medical (IT?) research, it will be at least 5-10 years before this reaches consumers. Might see it on House before then, of course.

CPU speed determines req. radiation amount? (4, Insightful)

iPhr0stByt3 (1278060) | more than 4 years ago | (#33003360)

So, they pump in all that radiation because the processor is too slow? Doesn't seem right to me. I would think that if they could have simply put another $10,000 into the machine (adding CPU cycles) to lower the required radiation, they would have done that a long time ago. So is the use of a GPU just a side effect of some new technology that allows the machine to estimate or predict the image with a lower radiation dose? That GPUs are more efficient for some operations is nothing new - what's the real breakthrough here?

Re:CPU speed determines req. radiation amount? (3, Interesting)

Barny (103770) | more than 4 years ago | (#33003468)

Pretty much.

The reconstruction time ranged from 77 to 130 seconds on an NVIDIA Tesla C1060 GPU card, depending on the number of projections — an estimated 100 times faster than similar iterative reconstruction approaches, says Jia.

So in essence they have built a parallel-optimised calculation system rather than an iterative one, and we all know the one thing CUDA and OpenCL do VERY well is parallel processing.

It seems the real win here is the new code; it could run on a TI-82 calculator and still require only that level of radiation, it's just that it's very well suited to a GPU to crunch.

Re:CPU speed determines req. radiation amount? (4, Informative)

FTWinston (1332785) | more than 4 years ago | (#33003482)

The TFA says that this tech is usually used prior to treatment, while the patient is in the treatment position.
Because processing a limited number of scans into a useful model previously took several hours, they were forced to perform many more scans to get a more accurate picture with which to build their model - because they don't want to leave the patient lying in the scanner for 6 hours prior to treatment.
With this improvement in processing power, they can produce the model from limited data in a feasible time.

So the summary does actually describe the breakthrough quite well: it's not a new image-processing technique for working with limited data, it's just new hardware allowing that process to be run more quickly. Yes, they're using a slightly new algorithm, but I doubt that is a massive breakthrough in itself.

Re:CPU speed determines req. radiation amount? (1)

FTWinston (1332785) | more than 4 years ago | (#33003498)

The TFA

:(

Re:CPU speed determines req. radiation amount? (1)

john83 (923470) | more than 4 years ago | (#33003930)

I think it's being driven by recent work which suggests the risks associated with the scans are a bit higher than previously thought. There's a perceived medical need to reduce the radiation. I'm afraid I can't put my finger on a citation, though.

Re:CPU speed determines req. radiation amount? (1)

suomynonAyletamitlU (1618513) | more than 4 years ago | (#33008032)

I have to imagine that there are all kinds of people working on software and hardware upgrades all over medical science/engineering. Decreasing the risk to patients might be a nice reason to upgrade these scanners in particular, but you sorta sound like 'if it wasn't for the risk to the patients, this upgrade wouldn't be needed anytime soon.'

Engineers want to make better products, both to contribute and to make sales. Doctors want better products, both to decrease risk and to make their work easier and more successful. If there's a project that seems like it would contribute OR sell OR reduce risk OR make work easier, any of those might get it greenlighted. Things like this could easily do all of them, so it's hard to know what the motivation was, and honestly, it's kind of silly to go looking for the reason anyway, unless it turns out to be a scam.

Re:CPU speed determines req. radiation amount? (0)

Anonymous Coward | more than 4 years ago | (#33004036)

This is actually not an accurate description of the 'breakthrough' here. TFA only mentions compressive sensing in passing, but this is what's really causing the 100x speedup. If you haven't heard of compressive sensing (aka compressive sampling or compressed sensing), check out the Wikipedia article: http://en.wikipedia.org/wiki/Compressive_sensing.

Re:CPU speed determines req. radiation amount? (1)

Score Whore (32328) | more than 4 years ago | (#33004794)

Ah, interpolation, aka making up data. This doesn't seem like a brilliant idea for purposes where accuracy is important.

I do acknowledge, however, that if your bullet is 10 mm in diameter and your target is 5 mm in diameter, you probably don't need a precise surface map of the target as long as you know where it is to within three or four mm.

Re:CPU speed determines req. radiation amount? (1)

compro01 (777531) | more than 4 years ago | (#33008478)

Ah, interpolation, aka making up data. This doesn't seem like a brilliant idea for purposes where accuracy is important.

Doing the scan quickly and then filling in the missing data computationally is becoming better than doing the scan slowly, because of movement. People cannot remain perfectly still (breathing, etc.), so if you do the scan more quickly, you get less motion and less blurring.

Re:CPU speed determines req. radiation amount? (1)

Achra (846023) | more than 4 years ago | (#33014954)

Because processing a limited number of scans into a useful model previously took several hours, they were forced to perform many more scans to get a more accurate picture with which to build their model - because they don't want to leave the patient lying in the scanner for 6 hours prior to treatment. With this improvement in processing power, they can produce the model from limited data in a feasible time.

Good lord. Am I the only one who is terrified by the idea that they take several scans, try to come up with a vague model of how your organs tend to move, and then fire a rather large dose of ionizing radiation at their best guess? I was under the misunderstanding that image-guided radiation therapy was somewhat real-time up until now.

Re:CPU speed determines req. radiation amount? (2, Interesting)

Anonymous Coward | more than 4 years ago | (#33003492)

The real breakthrough is the development of Compressed Sensing/Compressive Sampling algorithms; this is just an application.

Re:CPU speed determines req. radiation amount? (1)

guruevi (827432) | more than 4 years ago | (#33003510)

At some point adding more generic processors stops making sense because of the overhead and costs it introduces. A Tesla C1060 costs ~$700 for these types of projects and has 240 processors designed to efficiently process this type of data; compare that to the cost and maintenance of the half-rack cluster of generic processors this would otherwise take.

Re:CPU speed determines req. radiation amount? (2, Interesting)

jandrese (485) | more than 4 years ago | (#33003514)

My guess is that each scan requires a considerable amount of processing to render into something we can read on the screen. Probably billions of FFTs or something. You can make a tradeoff between more radiation (cleaner signal) and more math, but previously you would have needed a million dollar supercomputer to do what you can do with $10k worth of GPUs these days, which is how they're saving on radiation.

Re:CPU speed determines req. radiation amount? (2, Interesting)

Zironic (1112127) | more than 4 years ago | (#33003520)

What's going on is that instead of taking a clear picture, they take a crappy picture and have the ludicrously fast GPU clean it up for them. While you could have done that by just putting 50 CPUs in parallel, the GPU makes it quite simple.

The speed is important because their imaging is iterative: with the GPU they're apparently waiting 1-2 minutes; without it, 2-3 hours, which is a rather long time to wait between scans.

Re:CPU speed determines req. radiation amount? (2, Informative)

Anonymous Coward | more than 4 years ago | (#33004110)

The technique is called iterative backprojection. The reconstruction process assumes an array of pixels which, at the beginning, are of some uniform value. It then looks at a ray of attenuation data from the CT projection (along this ray, the tissues in the target result in this degree of attenuation of the X-ray beam) and asks "how must the pixels along this ray be adjusted so that their attenuation along the ray matches the data from the CT beam?". It does this for every measured ray taken during the acquisition, over many different angles. The sparser the acquired data, the more iterations it takes (and thus the longer) to get a reliable (approximated) image.
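A minimal sketch of that per-ray update (a toy, dense-matrix Kaczmarz/ART loop in Python/NumPy; the names and the dense A matrix are illustrative only, since a real scanner uses a sparse projector and runs the updates as GPU kernels):

import numpy as np

def art_reconstruct(A, b, n_iter=50, relax=0.5):
    # A: (n_rays, n_pixels); A[i, j] = length of ray i inside pixel j.
    # b: (n_rays,) measured attenuation line integrals.
    x = np.zeros(A.shape[1])             # start from a uniform image
    row_norms = (A * A).sum(axis=1)      # ||A_i||^2 for each ray
    for _ in range(n_iter):
        for i in range(A.shape[0]):      # one relaxation step per measured ray
            if row_norms[i] == 0.0:
                continue
            residual = b[i] - A[i] @ x   # mismatch along this ray
            x += relax * (residual / row_norms[i]) * A[i]
    return x

# Toy check: two pixels, three rays (two straight through, one diagonal).
A = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
b = np.array([1.0, 2.0, 3.0])
print(art_reconstruct(A, b))             # converges toward [1.0, 2.0]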

Re:CPU speed determines req. radiation amount? (1)

alen (225700) | more than 4 years ago | (#33004072)

This is expensive medical equipment. The costs are in the approval process and sales commissions, not in the hardware.

cheap super-computer (1)

stanlyb (1839382) | more than 4 years ago | (#33003372)

They (NVIDIA) say that you could have a very cheap supercomputer for just $10k, made with NVIDIA GPUs only. A pretty impressive achievement, and btw they also say that their GPUs are in fact faster than normal Intel/AMD CPUs. I don't know about you, but once my piggy bank is full, I will get one of these supercomputer monsters.

Re:cheap super-computer (1)

toastar (573882) | more than 4 years ago | (#33003756)

They (NVIDIA) say that you could have a very cheap supercomputer for just $10k, made with NVIDIA GPUs only. A pretty impressive achievement, and btw they also say that their GPUs are in fact faster than normal Intel/AMD CPUs. I don't know about you, but once my piggy bank is full, I will get one of these supercomputer monsters.

More like $20-40k.

Each 4-GPU node costs us about $5k; the thing is, you can do with a 4-node GPU cluster what would normally take 50-100 CPUs, or about 10-15 nodes.

Re:cheap super-computer (1)

toastar (573882) | more than 4 years ago | (#33003818)

Keep in mind that some of the operations you have to do, like data sorts and other stuff I class as overhead, don't get a speedup from the GPUs.

Re:cheap super-computer (1)

ChefInnocent (667809) | more than 4 years ago | (#33004250)

It's true that GPUs are faster than normal CPUs for some operations. If you have programs that are nearly pure linear algebra and you're looking for single-precision FLOPS, then the GPU will leave a CPU in virtual dust. If you have a lot of branching, conditionals, double or integer operations, and care about MIPS, then not so much. Image processing is one place where linear algebra is king, so just think about what you want to do with a "super-computer" before you break open Hammy.

Electricity (1)

Reilaos (1544173) | more than 4 years ago | (#33003374)

Neat. Does this also reduce the running costs of the machines, or would that be a negligible benefit compared to not irradiating your patients?

Re:Electricity (1)

Zironic (1112127) | more than 4 years ago | (#33003622)

Well, fewer scans should translate into less power usage, less doctor time, and less machine time, which should mean a lower cost per patient. How significant that is, is hard to tell, though.

Re:Electricity (1)

russotto (537200) | more than 4 years ago | (#33003704)

Neat. Does this also reduce the running costs of the machines, or would that be a negligible benefit compared to not irradiating your patients?

From the point of view of the hospital? It's the other way around; increasing the lifetime of the expensive X-ray tube (which this will indeed do) is the important benefit, and not irradiating your patients as much is just a side effect.

Re:Electricity (1)

quantumghost (1052586) | more than 4 years ago | (#33004276)

Neat. Does this also reduce the running costs of the machines, or would that be a negligible benefit compared to not irradiating your patients?

From the point of view of the hospital? It's the other way around; increasing the lifetime of the expensive X-ray tube (which this will indeed do) is the important benefit, and not irradiating your patients as much is just a side effect.

Certainly not from the perspective of a physician. I continually bear in mind the cancer risk of the CT scans that I order... the problem is that what I'm scanning for is an immediate threat to life, so I have to accept a long-term potential risk to offset a more immediate, more probable, and higher risk.

As for saving time... it is negligible... most new scanners (64-slice and up) process the images as quickly as the machine can scan. And even if there is a delay (e.g. on 16-slice machines), most scans are put into a queue while the machine continues to process the signals, and are read after the 3-5 scans done earlier are read. The exception is when I show up: the radiologists are required to prioritize my scans (I'm a trauma surgeon), but I usually don't have to wait very long even with the older scanners (most are done within 15 min)... if my patient is too unstable to wait 15-20 min for a scan, I take them directly to the OR.

As for saving electricity, there are several factors: 1) the machines are always kept "on", 2) there is the power consumed to generate the X-rays, and 3) there is a mechanical gantry (that I estimate weighs 0.5 to 1 ton) that spins around the patient when the scan is being performed, which probably outweighs the power used by the other two put together.

Welcome to 2 years ago (0)

Anonymous Coward | more than 4 years ago | (#33003462)

Mercury Computer Systems has had this for 2 years now for CT and MRI. Re-inventing the wheel is awesome!


lower rad dose (4, Informative)

SemperUbi (673908) | more than 4 years ago | (#33003624)

CT scanning is associated with an increased risk of cancer in children [nih.gov]. This development will significantly lower that risk.

Re:lower rad dose (1)

jabuzz (182671) | more than 4 years ago | (#33004002)

Any X-ray imaging protocol is associated with an increased risk of cancer in everyone. From memory, I believe it is around 1 extra death per 1.3 million chest X-rays, for example.

Re:lower rad dose (1)

quantumghost (1052586) | more than 4 years ago | (#33004120)

This development will significantly lower that risk.

Eventually it might. The exact technique they are using is for planning a radiation _treatment_ (cone beam CT), not a _diagnostic_ (helical scan) CT. They are quoted at the bottom saying that it _might_ be applicable. There are probably 100 to 1000 diagnostic scans for every treatment protocol.

"CT dose has become a major concern of the medical community. For each year's use of today's scanning technology, the resulting cancers could cause about 14,500 deaths. Our work, when extended from cancer radiotherapy to general diagnostic imaging, may provide a unique solution to solve this problem by reducing the CT dose per scan by a factor of 10 or more," says Jiang.

There are currently protocols used to lower the radiation dose for pediatric patients... the problem is that not all hospitals use them. Except in a life-threatening emergency, parents should ask before a routine/elective scan is performed on their children.

Source: someone who has ordered enough scans to have statistically killed someone.

Re:lower rad dose (0)

Anonymous Coward | more than 4 years ago | (#33012744)

A CT scan should never be performed on a child unless it is a severe trauma or life-threatening emergency, as stated above.

All hospitals I have done a practicum in pretty much enforce this protocol, and I have seen only head traumas and a few others get clearance for a CT. Developing children are more radiation-sensitive. CT dose is huge, and while an increased number of slices may increase detail, it also increases dosage. Think of a CT as an X-ray, except it is continuous in a 360-degree circle. The dosage is at best 10 times greater than an X-ray and only increases dramatically from there (50x or greater).

Almost all modern CT systems use a form of iterative reconstruction, while a few other diagnostic procedures use filtered backprojection combined with iterative reconstruction. One issue is that if you use too many iterations you can introduce artifacts into the image, and I would worry that this new technique could do the same, as it relies on calculations rather than actual data points.

Re:lower rad dose (0)

Anonymous Coward | more than 4 years ago | (#33005572)

The cone-beam CT for image-guided radiation therapy is for people who already have cancer. It's to make sure they are in the correct position right before they are given a massive dose of radiation to kill their cancer. Decreased dose for this particular type of CT is completely ironic.

Re:lower rad dose (1)

jonathan1979 (1712270) | more than 4 years ago | (#33012188)

CT scanning is associated with an increased risk of cancer in children [nih.gov]. This development will significantly lower that risk.

As a physics engineer experienced in the field of radiotherapy, familiar with the techniques mentioned in the /. article, and certified in radiation safety, I am sorry to say that although the radiation dose is reduced, it is only reduced in very specific cases, where it is actually not a real benefit. This technique is not used for normal CT scanning, as used for diagnosis in your average hospital.
This technique is used for radiotherapy (mainly for position verification of the organ to be irradiated). Lowering CT dose in such cases is a benefit, but compared to the amount of radiation the person undergoing the treatment receives to treat his or her cancer, it is marginal. Apart from that, the dose for a cone beam CT is in general already lower than the dose received from a diagnostic CT scan.
The benefit of using the GPU to do the reconstruction of the cone beam CT is also that the reconstruction, and therefore the assessment of the scan, can be done more quickly, making it less likely that the patient has moved and thus more likely that the correct spot is treated. It also makes it possible to deliver dose more accurately and thus spare surrounding healthy organs and tissue.
To put the received cone beam CT dose in perspective: the biological dose received from one such CT scan [springerlink.com] is about as high as a few hrs of long-haul flight (considering the effective dose received per hour as stated by BA [britishairways.com]).
Regarding the cited article on increased risk of cancer in children: every person receiving radiation has a risk of getting cancer in the future, added to the normal risks of getting cancer (for instance from aging or cosmic radiation).
For children this is far more important, as induced cancer is a late effect that takes years to decades before it kicks in. Since the bulk of cancer patients are of higher age (due to the fact that cancer is mainly a disease of old age), they will most of the time not live long enough to experience the side effects. Since children have a longer lifespan ahead of them than adults, we have to be more careful, as a larger number of years left to live inherently means a higher risk of late effects induced by radiation.

Re:lower rad dose (1)

jonathan1979 (1712270) | more than 4 years ago | (#33012218)

To put the received cone beam CT dose in perspective: the biological dose received from one such CT scan [springerlink.com] is about as high as a few hrs of long-haul flight (considering the effective dose received per hour as stated by BA [britishairways.com]).

Oops, strike that. I was mixing up orders of magnitude. It should read a few hundred hrs of long-haul flight. :blush:

context (1, Insightful)

Anonymous Coward | more than 4 years ago | (#33003834)

These patients are about to get RADIATION THERAPY. This CT scan will be delivered immediately before they are to receive a lethal radiation dose at the same location to kill their tumor. Reduction of dose in diagnostic CT (not cone-beam) is a much more valuable accomplishment.

Re:context (0)

Anonymous Coward | more than 4 years ago | (#33004790)

As a diagnostic radiologist performing CT scans for over 25 years, I advise caution in how you interpret the findings from this study. A treatment planning CT is a very different animal from a diagnostic CT, and the level of contrast resolution you need is far lower than when you are a priori trying to figure out if a patient actually has a tumor. This makes great press, but currently is not directly applicable. Note that all major vendors of CT systems, and essentially all diagnostic radiologists, are aware of the dose/diagnostic quality conundrum and are working to a common goal. A more worthy goal would be to somehow reduce the total number of CT scans being done now, especially out of the ER. Unfortunately, if you are warm and have a pulse and present to an ER (in the US), you will get some kind of imaging study.

Re:context (1)

quantumghost (1052586) | more than 4 years ago | (#33004836)

These patients are about to get RADIATION THERAPY. This CT scan will be delivered immediately before they are to receive a lethal radiation dose at the same location to kill their tumor. Reduction of dose in diagnostic CT (not cone-beam) is a much more valuable accomplishment.

LOL...if it is a _lethal_ dose, why treat the patient?

They are going to get a _therapeutic_ dose of directed radiation to target a specific tumor bed. The reduction in the imaging scan portion will lower _total_body_ dosing.

Not all body tissues deal with radiation the same way. Thyroid and small bowel mucosa are the most radio-sensitive tissues, while areas like bone and muscle are much more tolerant... If you can avoid thyroid cancer or radiation enteritis, you'll have or be a much happier patient.

Re:context (2, Informative)

budgenator (254554) | more than 4 years ago | (#33005274)

"Our work, when extended from cancer radiotherapy to general diagnostic imaging, may provide a unique solution to solve this problem by reducing the CT dose per scan by a factor of 10 or more," says Jiang.
It's probably applicable to diagnostic cone beam scans, which are the hot item in implant dentistry. The reason it's first applied to therapy scans is that the tissue surrounding the tumor suffers radiation from scattering of the therapeutic beam, making dose reduction highly desirable.

Tesla? (1)

adavies42 (746183) | more than 4 years ago | (#33004290)

This must be the first time anything associated with Tesla reduced radiation exposure...

Re:Tesla? (1)

McTickles (1812316) | more than 4 years ago | (#33005756)

Meh

Crysis (1)

JoeQuaker (888617) | more than 4 years ago | (#33004852)

Yeah, yeah... that's all fine and dandy, but how many FPS does it get in Crysis on ultra settings? (heh-heh)


now if they could just improve the bedside manner (0)

Anonymous Coward | more than 4 years ago | (#33006230)

Great. I had a CT this weekend. When I asked the technician how many sieverts of radiation I was being exposed to, she said "the lowest, if that's what you're worried about". Yes, that's what I'm worried about, and your answer didn't help me at all.

SETI (1)

hhedeshian (1343143) | more than 4 years ago | (#33006252)

Does this now mean we can get more points on BOINC for finding ET?

Anonymous Coward (0)

Anonymous Coward | more than 4 years ago | (#33012688)

The article and the original article, unfortunately, completely miss the point:

The reduced exposure time and processing time are essentially based on a rather new data processing technique called 'compressive sampling', or 'compressed sensing'.
It basically brings together data acquisition and data compression, such that the compression is 'built in' to the acquisition process itself.
Information is available on the net.

Some earlier posters (iPhr0stByt3, in fact) noticed that.
The use of the GPU is only an obvious way to implement it, for various reasons.
@slashdot: completely missed the point (the medical doctors, anyway)

Anonymous Coward (0)

Anonymous Coward | more than 4 years ago | (#33012734)

The reduced processing and data acquisition time is based on 'compressive sampling' or 'compressed sensing'.

The article completely misses the point!

Some posters noticed that already.

The reduced time has nothing to do with GPUs and blabla.
It is only an obvious implementation of this technique (applied to CT).

  K.

The real hero here is compressed sensing (2, Informative)

HuguesT (84078) | more than 4 years ago | (#33013516)

This has been said elsewhere in this thread - the real breakthrough here is due to compressed sensing - but here is some extra information:

1- Compressed sensing basically uses the idea that it is not necessary to sample an image (or a projection, in this case) everywhere, because natural data is fairly redundant. This is why you can capture a 10 Mpixel image in a digital camera and have it compressed to a 2 Mbyte JPEG file without losing much visible information. Compressed sensing basically does the compression *before* the sampling and not after. Researchers at Rice University, for instance, built a working one-pixel camera [rice.edu] using this brilliant principle.

2- Compressed (or compressive) sensing was proposed by Emmanuel Candes [stanford.edu] and Terence Tao [ucla.edu], respectively at Stanford and UCLA. Tao is a recent Fields medalist. I recommend reading his blog if you like mathematics.

3- This field is really less than 10 years old, and it has completely turned classical ideas about sampling-limited signal processing (Nyquist, Shannon, etc.) on their head. It is a brilliant combination of signal and image processing and recent advances in combinatorial and convex optimization.

4- However, this is only the beginning. Because compression happens before sampling, you need to make so-called sparsity assumptions about the signal; in other words, you need to know a great deal about what you are going to try to image. In interventional therapy, precise imaging of the patient is made beforehand in a classical way (CT or MRI), and this kind of technique is only used to make fine adjustments as therapy is ongoing. This is extremely useful and safe, because of the lower radiation output and because the physicians know what to expect.

5- Here the GPU is useful because it makes the processing fast enough to actually be used. It is an essential brick in the application, but of course not in the theory.

Best.
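For the curious, here is a minimal sketch of the recovery idea in points 1 and 4 (an illustrative ISTA solver for the l1-regularized problem; this is one standard compressed-sensing reconstruction method, not necessarily the one used in the paper, and the lam/n_iter values are arbitrary):

import numpy as np

def ista(A, y, lam=0.01, n_iter=500):
    # Solve min_x 0.5*||A x - y||^2 + lam*||x||_1: find the sparsest
    # signal (approximately) consistent with a small set of measurements.
    x = np.zeros(A.shape[1])
    L = np.linalg.norm(A, 2) ** 2            # Lipschitz constant of the gradient
    for _ in range(n_iter):
        z = x - A.T @ (A @ x - y) / L        # gradient step on the data-fit term
        x = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)  # soft-threshold
    return x

# Toy demo: recover a 5-sparse signal of length 200 from only 60 measurements.
rng = np.random.default_rng(0)
A = rng.standard_normal((60, 200)) / np.sqrt(60)
x_true = np.zeros(200)
x_true[rng.choice(200, 5, replace=False)] = rng.standard_normal(5)
x_hat = ista(A, A @ x_true)
print(np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true))  # should be well under 1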

Most informative post ever! (1)

backwardMechanic (959818) | more than 4 years ago | (#33019704)

Congratulations! That's the most informative post I've ever seen on /. A very nice summary of a new field of research, and without jargon. What are you doing here?