Friday, July 27, 2007

Rounds in Different ICUs, and the Horrible Monster that Haunts My Dreams

Rounds in 3 ICUs
So far I have spent time in three different ICUs: neonatal, pediatric, and medical. The ways patients are handled in the NICU and PICU are different, though the underlying structure of each unit is very much the same. The kids in the PICU can potentially have a much wider range of problems than those in the NICU, such as developmental disabilities and cancer. One of the patients in isolation was thought to have tuberculosis, while a 1-year-old girl was suffering from severe scoliosis. The way the units are organized, however, is quite similar. Going on rounds with Dr. Kutko and her team was reminiscent of Dr. Frayer's team: the morning was spent going to see each patient and occasionally meeting with a surgeon to discuss surgery.

The organization sort of fell apart when I went to the medical ICU. If I had to describe it in one word, it would be "chaotic." Prior to rounds, the residents and fellows meet to discuss each patient. I often heard talk about terminally ill patients, many of whom have cancer. Once rounds began (almost never on schedule), we would meet Dr. Berlin. The conditions of the patients we saw were generally much worse than anything I'd seen in the other ICUs, and each discussion was more involved, with much more data to consider. At times, Dr. Berlin had other obligations, so he would leave us and let one of the fellows take charge. It was an interesting (and fast-paced) experience, though somewhat depressing to think that some of the patients I saw may not have much longer.

Error Messages
I am trying to write a Matlab program that can generate a two-dimensional airway model of the lungs, turn it into a mesh, and then export it in a form that is readable by Fluent. Right now, the hard part is making it readable. Trying to reverse engineer the Fluent .msh file format is proving to be the bane of my existence. Instead of showing results, I chose to show the error message that I keep getting and a picture of a monster. If the error message had a representative monster, this is what it would look like.
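For anyone curious, here is a minimal sketch of the section-based .msh layout as I currently understand it from the reverse engineering. The section indices and zone fields here are working assumptions on my part, not a verified spec, and the sketch is in Python rather than Matlab just for brevity:

```python
# Each Fluent .msh section is a parenthesized list tagged by an index:
# 0 = comment, 2 = dimension, 10 = nodes (12 = cells, 13 = faces).
# Counts and zone ranges in the headers appear to be hexadecimal.
# All of this is my current best guess at the format, not gospel.

def write_msh(filename, nodes):
    """Write a minimal 2D node section in the (assumed) Fluent format."""
    with open(filename, "w") as f:
        f.write('(0 "2D airway mesh, exported from Matlab prototype")\n')
        f.write("(2 2)\n")  # dimension section: 2D
        # zone 0 declaration announces the global node count (hex)
        f.write("(10 (0 1 %x 0))\n" % len(nodes))
        # zone 1 carries the actual coordinates
        f.write("(10 (1 1 %x 1 2)(\n" % len(nodes))
        for x, y in nodes:
            f.write("%.6e %.6e\n" % (x, y))
        f.write("))\n")

write_msh("airway.msh", [(0.0, 0.0), (1.0, 0.0), (1.0, 1.0), (0.0, 1.0)])
```

If the exported file still triggers the error, the mismatch is probably in one of these header fields rather than in the coordinate data itself.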

Thursday, July 26, 2007

Ladies and Gentlemen, the results are in...

Still no word from the movers. Well, there goes my entire apartment deposit. But, that affords me some more time to type out a blog entry on a cell phone dial pad! T9 makes every sentence seem so much more epic...

Unfortunately, I am unable to post the graphs from the preliminary data analysis, but it appears that the automatic segmentation algorithm lies somewhere between the phase contrast analysis and the manual segmentation. The manual segmentation tends to overestimate when compared to the other techniques, while the PC flow analysis tends to underestimate. Furthermore, the relation between all three is relatively linear, as indicated by a high correlation (0.95 or greater between any two of the three techniques) and a high level of confidence in the zero-intercept linear regression.
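As a concrete sketch of the statistics involved (with made-up numbers, not the study data), the correlation and the zero-intercept regression slope between two techniques can be computed like this:

```python
import numpy as np

# Hypothetical paired volume measurements (mL) from two techniques;
# the real study data is not reproduced here.
manual = np.array([150.0, 120.0, 95.0, 180.0, 140.0, 110.0])
auto_seg = np.array([142.0, 115.0, 92.0, 170.0, 133.0, 106.0])

# Pearson correlation between the two techniques
r = np.corrcoef(manual, auto_seg)[0, 1]

# Least-squares slope with the intercept forced to zero:
# minimize sum (y - b*x)^2  ->  b = sum(x*y) / sum(x*x)
b = np.sum(manual * auto_seg) / np.sum(manual**2)

print("r = %.3f, zero-intercept slope = %.3f" % (r, b))
```

A slope below 1 with a high r is exactly the "linear but systematically offset" relationship described above.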

And perhaps of the greatest importance, the technique demonstrates a "high degree of accuracy," as indicated by Dr. Weinsaft, when used to evaluate the cardiac metrics of subjects with left ventricular dysfunction (those who eject only a small portion of their blood in each cardiac cycle, on the order of 25%, compared to normal function of 50% or greater). This has traditionally been one of the most time-consuming scenarios for a clinician to evaluate, so the time-saving benefits from this automatic segmentation stand to be quite considerable.
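For reference, the ejection fraction behind those percentages is just the fraction of the end-diastolic volume expelled each beat; a minimal illustration with hypothetical volumes:

```python
def ejection_fraction(edv_ml, esv_ml):
    """Left-ventricular ejection fraction from the end-diastolic and
    end-systolic chamber volumes (the quantities the segmentation yields)."""
    return (edv_ml - esv_ml) / edv_ml

# Normal function: roughly half the chamber empties each beat
print(ejection_fraction(120.0, 60.0))   # 0.5
# Dysfunction on the order described above: only ~25% ejected
print(ejection_fraction(120.0, 90.0))   # 0.25
```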

Admittedly, the data set is still somewhat small (just 20 fully completed cases with an eventual target of 50 to 100). But the trends are already beginning to show. The next step, other than validation with more data sets, is to reconsider the design of the algorithm. As I mentioned before, which technique is more accurate: phase contrast analysis or manual segmentation? Further, should the algorithm even emulate these techniques, or should the design process be independent of them? Perhaps a "true" measure should somehow be derived experimentally?

The logistical hurdles of obtaining the truth are considerable. But for now, we are fairly convinced that the automatic segmenter operates consistently between the phase contrast measure and the magnitude measure, both of which are valid and widely used techniques in the clinical world.

Open Cholecystectomy

Over the past week I have observed many, many operations, some of which have been blogged before. One operation that hasn't is the cholecystectomy, the surgical removal of the gallbladder. This procedure is typically necessary when gallstones occlude the gallbladder duct/cystic duct or when there is severe inflammation of the gallbladder (cholecystitis). These gallstones could potentially cause the gallbladder to swell and ultimately rupture. Symptoms include sharp abdominal pain and nausea. In the case that I observed, the gallbladder was severely inflamed and was suspected to be cancerous, which necessitated an open cholecystectomy instead of a laparoscopic one.

http://www.doereport.com/displaymonograph.php?MID=93

Laparoscopic Cholecystectomy

Gallbladder
So it turns out that a gallbladder can be removed and one can lead a completely normal life without it, albeit on a lower-fat diet. This is because the primary function of the gallbladder is to store bile (for fat emulsification) secreted by the liver. An inflamed gallbladder increases the risk of inflammation of the liver and of adjacent organs such as the pancreas.

Dickson

Catheters in X-ray

Today I am going to talk about the research I have done in determining catheter tip locations from chest radiographs and a little of the future work we have planned.

As I mentioned previously, a lot of time is spent by radiologists reading intensive care unit images to make sure that every tube, line, and catheter is where it is supposed to be. If they aren't, complications and even death can occur. For example, it's very important that a feeding tube ends up in the stomach and not down one of the bronchi before you try to feed someone. But these reads are very time-consuming, both in time per scan and in number of scans to read. So my research is focused on developing a method of computer-aided detection of the tips, so that we can at the very least reduce the workload of the radiologists and thereby increase efficiency and throughput.

The first thing we noted was that catheters are synthetic objects. They are basically tubes with the ends cut off (kind of like a straw). This means that
1) The profile is consistent along the length
2) The profile of the object can be known a priori
3) Intensity variations can be explained by changes in the underlying baseline
But we needed to verify these assumptions, so we took sample profiles along the length and got the graph below.



The profiles had roughly the same added attenuation at each point; the baseline is what mainly changed. So we are good to go on that front.

Taking into account the synthetic nature of the catheters, we developed a way to generate profiles automatically. These profiles are then matched from points outside the body to the tip, until no more evidence is found. We used normalized cross-correlation because it allowed for matching to occur regardless of intensity variations. Doing this progressive matching gave us results like the image below.



Not bad for proof of concept, right?
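A toy sketch of the matching step (in Python, and much simpler than the actual research code) shows why normalized cross-correlation is a good fit here: it ignores the baseline and overall intensity scale and just looks for the profile shape:

```python
import numpy as np

def ncc(template, window):
    """Normalized cross-correlation of two equal-length 1-D profiles;
    invariant to baseline offset and overall intensity scale."""
    t = template - template.mean()
    w = window - window.mean()
    denom = np.sqrt((t**2).sum() * (w**2).sum())
    return float((t * w).sum() / denom) if denom > 0 else 0.0

def best_match(template, signal):
    """Slide the template along the signal; return (best index, best score)."""
    n = len(template)
    scores = [ncc(template, signal[i:i+n]) for i in range(len(signal) - n + 1)]
    i = int(np.argmax(scores))
    return i, scores[i]

# A catheter-like attenuation dip of known profile, buried in a scan line
# with a different baseline and scale -- NCC still finds it.
profile = np.array([0.0, -1.0, -2.0, -1.0, 0.0])  # added attenuation
line = np.full(20, 5.0)
line[8:13] += profile * 0.7                        # scaled, offset copy
idx, score = best_match(profile, line)
print(idx)  # finds the dip at index 8
```

The real progressive matching walks such a template inward from outside the body until the score drops, which is how the tip gets localized.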

Luckily, I got to see many of the steps in how the images I am working with were acquired. This is actually quite helpful in knowing all the potential problems that the algorithm might need to deal with. I also spent time gathering data so that I have a larger set to work with. We are also planning on generating a synthetic dataset next week to try out other, more direct detection methods.

CFD in Coronary Arteries: Part 1

For the past few weeks, I've been trying to develop software that will be able to quantify the vulnerability of a plaque to rupture. As I mentioned before, no software on the market today is capable of extracting such information from CTA - this is my motivation. Nevertheless, no matter how motivated I am, it is not possible for me to develop such software in a month; this is far beyond my capabilities. As a result, I broke the project up into four parts: Part 1, segment the lumen of the coronary arteries using CTA data; Part 2, discretize the boundaries into small cells to form a volume mesh or grid; Part 3, solve the Navier-Stokes equations of motion for the flow; and finally, Part 4, validate the simulation and develop a turbulence index to quantify the vulnerability of a particular lesion or plaque.

Part 1: Segmentation

Thus far, I've only been able to complete Part 1 (figure). Basically, I have created an interface that allows one to load CTA data, preprocess the data through numerous filters, and finally segment the arteries of interest using a region growing algorithm and/or simple thresholding. The software is far from perfect since it performs the segmentation in 2D. That is, it segments the arteries slice by slice, making the whole process slightly tedious. Nevertheless, one can use it to reconstruct arteries of interest.
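A bare-bones Python sketch of the region growing idea (hypothetical, far simpler than the actual interface): accept 4-connected neighbors whose intensity is within a tolerance of the seed, one slice at a time:

```python
from collections import deque
import numpy as np

def region_grow(image, seed, tol):
    """Grow a region from `seed`, accepting 4-connected neighbors whose
    intensity is within `tol` of the seed intensity (a simple threshold
    criterion applied to a single 2D slice)."""
    h, w = image.shape
    seed_val = image[seed]
    mask = np.zeros((h, w), dtype=bool)
    queue = deque([seed])
    mask[seed] = True
    while queue:
        r, c = queue.popleft()
        for nr, nc in ((r-1, c), (r+1, c), (r, c-1), (r, c+1)):
            if (0 <= nr < h and 0 <= nc < w and not mask[nr, nc]
                    and abs(image[nr, nc] - seed_val) <= tol):
                mask[nr, nc] = True
                queue.append((nr, nc))
    return mask

# Toy slice: a bright "lumen" blob in a dark background
img = np.zeros((8, 8))
img[2:5, 2:5] = 100.0
lumen = region_grow(img, (3, 3), tol=10.0)
print(lumen.sum())  # 9 pixels segmented
```

Stacking the per-slice masks then gives the reconstructed artery volume, which is exactly why the slice-by-slice workflow gets tedious.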


Part 2: Tessellation

Pending... I'm trying to tessellate the segmented volume with rectangles. The following depicts the grid for a straight pipe. Although it seems simple, it really is not for odd geometries.
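For the straight-pipe case, the grid really is trivial; a Python sketch with made-up dimensions (the hard, body-fitted meshing of real artery geometry is not attempted here):

```python
import numpy as np

# Structured rectangular grid for a straight 2-D pipe section:
# length L, height H, with nx-by-ny nodes.
L, H = 10.0, 1.0
nx, ny = 21, 6
x = np.linspace(0.0, L, nx)
y = np.linspace(0.0, H, ny)
X, Y = np.meshgrid(x, y)         # node coordinates, shape (ny, nx)
n_cells = (nx - 1) * (ny - 1)    # each rectangular cell spans 4 nodes
print(X.shape, n_cells)  # (6, 21) 100
```

The difficulty with odd geometries is that the nodes can no longer lie on such a regular lattice, so the cells must be deformed or the boundary approximated.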

Part 3: Computational Fluid Dynamics

If I get lazy, which I probably will, I plan to use a CFD software package like FLUENT. This sophisticated package will allow me to simulate blood flow in reconstructed arteries of interest. The following is a simulation by some company (I can't find the link, so if anyone knows where this is from, post it in the comments).

Part 4: Validation and Index Development

I will validate the simulation by comparing theoretical fractional flow reserve values to measured ones (the ratio of pressures across a lesion). If the simulation proves to be incorrect, I will change the parameters of the mesh; otherwise, I will work on an index that captures vulnerability.
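For concreteness, FFR is just the ratio of mean pressure distal to the lesion over mean proximal (aortic) pressure; a hypothetical sketch of the validation check, with all numbers invented:

```python
def ffr(p_distal_mmhg, p_aortic_mmhg):
    """Fractional flow reserve: mean pressure distal to the lesion
    divided by mean aortic (proximal) pressure."""
    return p_distal_mmhg / p_aortic_mmhg

# Hypothetical validation pair: simulated vs. catheter-measured FFR
simulated = ffr(72.0, 96.0)
measured = 0.76
print(round(simulated, 2), abs(simulated - measured) < 0.05)  # 0.75 True
```

If the simulated and measured values disagree beyond some tolerance, that is the signal to refine the mesh before trusting any vulnerability index built on top of the flow field.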

J-Pouch

After a total colectomy, the large intestine and rectum are removed. About a decade ago, it was necessary to divert a piece of small intestine to an artificial opening on the abdominal wall as the exit for waste. I’ve seen patients with this operation in China. Quality of life is significantly sacrificed after the operation. I always wondered if there is another way to make the exit. This week, I saw such an operation, one that makes a J-pouch to give patients a normal life.

The operation was done laparoscopically. The first part was as usual. Small holes were made in the abdominal wall, and a 7 cm incision was also made below the belly button. The large intestine and rectum were removed and taken out through the incision. However, the control muscle was kept, and this is the key point of the surgery.


After the removal of the colon and rectum, the ileum was folded to form a reservoir. Later on, the bottom of the reservoir was connected to the control muscle. Sometimes, this second step requires two separate operations. After the reservoir is made, a temporary exit on the abdomen is created so that the newly created reservoir isn’t in use immediately and may heal. Two or three months after the first surgery, another operation closes the exit on the abdomen.

The benefit of making a J-pouch is significant. The reservoir stores waste until there is a need to have a bowel movement, so the patient has nearly as much control as a healthy person, and no external bag is needed. Overall, quality of life is greatly enhanced.

Advanced Breast Reconstruction

So this past week I got to see one of the other, less common breast reconstruction procedures. As I mentioned in an earlier post, after a mastectomy, tissue expanders followed by implants are usually used to reconstruct the breast. However, a less common, way cooler method is to use some type of flap, such as the latissimus dorsi muscle, TRAM (transverse rectus abdominis myocutaneous), or SIEP/DIEP (superficial/deep inferior epigastric perforator) flap. The free TRAM and SIEP/DIEP flaps are the most advanced procedures because they involve microsurgery to anastomose the tissue’s blood supply. Some flaps do not require this, as the blood supply is left intact and the vessels are tunneled underneath the skin, as is the case for the latissimus dorsi muscle flap (on your back and tunneled under your armpit). Anyway, the cool one I got to see was the DIEP flap.

The DIEP flap procedure essentially combines two procedures in one. The tissue harvested to reconstruct the breast comes from the patient’s belly fat. So the patient basically gets an abdominoplasty (tummy tuck) at the same time. No muscle is taken with this flap, just fat. The novelty of this procedure is that the blood vessels perforating through the rectus muscle (the deep inferior epigastric vessels) are clipped and subsequently microsurgically anastomosed to the internal mammary vessels to feed the transplanted tissue in a quasi-Frankenstein kind of way.

Figure 1. The deep inferior epigastric perforator vessels.

Figure 2. The internal mammary vessels.

It was really amazing to watch the microsurgery. Blood vessels were sutured together with these tiny needles and very thin thread…I couldn’t even see where the needle was…just an occasional flare of light as the shiny metal reflected the light from the microscope. The surgeons performing this operation were very skilled; it was amazing to watch them work.

So the end result looked something like this. Pretty cool.

Figure 3. Final result showing the anastomosis of the DIEP and internal mammary blood vessels.