Epidemiology of Gastrointestinal Cancers, #4

Epidemiological Spectrum of Gastrointestinal Lymphoma

INTRODUCTION AND DEFINITION

Lymphomas of the gastrointestinal (GI) tract can be primary or secondary. Primary GI lymphomas (PGIL) originate from the GI tract and are relatively uncommon, accounting for 1 to 4% of malignant tumors of the GI tract. The more common secondary lymphomas involve the GI tract by spread from extraintestinal sites. The GI tract is the most common site of extranodal involvement in lymphomas.1 The diagnosis of PGIL can easily be missed on initial endoscopic or radiological examination, so a high index of suspicion is often needed for diagnosis.2 This may contribute to underreporting of its prevalence.

Lymphomas of the GI tract may be of the B-cell or T-cell type. One definition of PGIL, somewhat arbitrary, is summarized in Table 1.3 According to another definition,1 PGIL are those in which patients exhibit GI symptoms and/or the lymphoma is confined to, or clearly predominant within, a portion of the GI tract. The majority of GI lymphomas are non-Hodgkin’s lymphomas (NHL). Hodgkin’s lymphoma of the GI tract, although reported, is rare.4,5,6 The specific subtypes of NHL that involve the GI tract primarily versus secondarily have not been clearly studied.7

Epidemiology of GI lymphomas differs substantially in different geographical regions of the world. Most population based studies have been done in Asian and European patients, with very few studies from North American cohorts.7 Also, a huge difference exists in the prevalence and epidemiology of GI lymphomas in different anatomical regions of the GI tract.8 The aim of this review is to provide data on the complex epidemiology of GI lymphoma as part of the present series on cancers of the GI tract.

CLASSIFICATION AND STAGING OF GI LYMPHOMA

GI lymphomas can be broadly classified as B-cell lymphomas, the major type (about 90%), or T-cell lymphomas.9 Several staging systems are in common use. The original classification of GI lymphomas was proposed by Isaacson et al.10 (Table 2). The most recent WHO classification of GI lymphomas, widely followed worldwide,11 is summarized in Table 3. Another widely used staging system is the Ann Arbor staging system,12 modified by Musshoff et al. for the GI tract13 (Table 4).

A more specific staging system for GI lymphomas has been developed by the European Gastro-Intestinal Lymphoma Study Group (EGILS).14 This system improves on the Ann Arbor system because it is specifically designed for the GI tract and allows better staging of local tumor infiltration and nodal involvement based on endoscopic ultrasound, a relatively recent diagnostic modality (Table 5).

GEOGRAPHICAL DISTRIBUTION PATTERN OF GI LYMPHOMAS

The variable anatomical distribution of GI lymphomas across geographical regions is primarily attributable to differences in the prevalence of risk factors. Another reason could be that in some studies, simultaneous involvement of multiple GI sites is not reported separately.1,15 It is not clear why this separate reporting has not been done; either such involvement was not diagnosed in the first place, or it was reported as primary gastric or primary intestinal lymphoma.

Trends from several large population based studies are discussed below.

United States

The following distribution has been observed for primary GI NHL from the United States7,16 (see Figure 2).

The stomach is the most frequent single site, involved in 44% of cases. If lymphomas of the small and large intestine are combined, however, intestinal lymphomas account for more cases than gastric lymphoma. This contrasts with Western European data, where the stomach is the single most common site and is involved even more frequently than the small and large intestine combined.17 The difference is interesting, as the prevalence of the various risk factors for gastric lymphoma is almost identical in Western Europe and the United States. Overall, the most common lymphomas are gastric MALT lymphoma and diffuse large B-cell lymphoma (DLBCL). DLBCL is the most common type of lymphoma in the intestine (33%). Mantle cell lymphoma (22%) and follicular lymphoma (21%) are the next most frequent intestinal lymphomas. Burkitt’s lymphoma accounts for 9% of all intestinal lymphomas and is mostly confined to the intestines.

The incidence of gastric lymphoma is 3.8 per 1,000,000 person-years (PY), that of small intestinal lymphoma 0.4 per 1,000,000 PY, and that of colorectal lymphoma 0.6 per 1,000,000 PY.16 Among all anatomical locations, the only site where the incidence of lymphoma has declined over time is the stomach, most likely due to the widespread diagnosis and treatment of Helicobacter pylori (H. pylori),18 an etiological factor for gastric lymphoma.
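Incidence figures of this kind are crude rates: a case count divided by the population time at risk, scaled to a convenient denominator. The following sketch is purely illustrative; the case counts and person-year total below are hypothetical, chosen only so that the arithmetic reproduces the rates quoted above, not the underlying registry data.

```python
def incidence_per_million_py(cases: int, person_years: float) -> float:
    """Crude incidence rate per 1,000,000 person-years (PY)."""
    return cases / person_years * 1_000_000

# Hypothetical cohort: 100 million PY of follow-up, with case counts
# chosen so the computed rates match the quoted 3.8, 0.4 and 0.6 per
# 1,000,000 PY. These are illustrative numbers, not registry figures.
person_years = 100_000_000
cases_by_site = {"gastric": 380, "small intestinal": 40, "colorectal": 60}

for site, cases in cases_by_site.items():
    rate = incidence_per_million_py(cases, person_years)
    print(f"{site} lymphoma: {rate:.1f} per 1,000,000 PY")
```

The same ratio also shows why a declining rate can reflect either fewer cases or a growing population at risk; registry studies therefore report age-adjusted rates over fixed periods.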

Male predominance is noted in intestinal MALT lymphoma (male: female ratio 5:1), intestinal DLBCL (male: female ratio 1.7:1) and intestinal Burkitt’s lymphoma (male: female ratio 1:0). In the other lymphoma subtypes, there is no major sex difference. DLBCLs, MALT lymphomas and follicular lymphomas are seen mainly in the sixth and seventh decades (median age 68 years), whereas the median age for Burkitt’s lymphoma is 41 years.7

Western Europe

The following anatomical distribution has been observed in large population-based, retrospective epidemiological studies of primary GI NHL from Greece and Germany17,19,20 (see Table 6 and Figure 1). The stomach is the most frequent site, followed by the small intestine including the duodenum. In the stomach, about 40% of lymphomas are of the low-grade MALT type. The prevalence of non-MALT-type NHL (mantle cell lymphoma) and of Burkitt’s lymphoma/lymphoblastic lymphoma is 1.4% and 3.2%, respectively.

In the small intestine, almost all NHL are of germinal-center origin. Only T-cell-type NHL, and not Burkitt’s lymphoma, is noted in the ileocecal region. MALT lymphoma of the small intestine is rare (3.1%). Results from some of the major Western European population-based studies are summarized in Table 6.

As in the United States, a male preponderance is observed for gastric lymphoma (male: female ratio 1.1:1), small intestinal lymphoma (male: female ratio 1.9:1) and ileocecal lymphoma (male: female ratio 2.7:1). The median age is 61.2 years for gastric lymphoma and 62.3 years for small intestinal lymphoma.

Middle East and Mediterranean Basin

Some studies from Middle Eastern countries report that the small intestine is the most common site of primary GI lymphoma.21,22 In other studies, however, the pattern is similar to that in Western countries, with the stomach the most common site.23-26 DLBCL, the predominant histological variant in patients with NHL of the stomach, is the most common type of lymphoma in most recent studies. This is similar to the trend in Western countries, although the relative percentage of DLBCL in the Middle Eastern population is slightly higher than in the West. This could well be the beginning of a change in the epidemiological trend of GI lymphomas in the Middle East. The high prevalence of gastric NHL could be due to environmental factors, specifically the increasing prevalence of Helicobacter pylori infection in Middle Eastern countries. This is contrary to observations from Western countries, where the prevalence of H. pylori is decreasing.

In Saudi Arabia, the mean age for GI lymphomas is 55 years (range 40-60 years). The overall male: female ratio is 1.2:1, the ratio being 2:1 for the small intestine, 1:0 for the large intestine and 1:1 for the stomach.25 This is by and large similar to the epidemiological trend seen in the West.

Indian Subcontinent

The largest single-center study, from the Indian state of Tamil Nadu, included 336 patients with primary GI lymphoma. The anatomical pattern of distribution was similar to that in Western studies27 (Figure 3).

The most common site is the stomach, followed by the small intestine and large intestine. As in the West, DLBCL is the most common subtype, with an overall prevalence of 66.27%, followed by Burkitt’s lymphoma (10.48%) and MALT lymphoma (10.12%). A few cases of immunoproliferative disease of the small intestine (IPSID), which is rare in Western countries, were also noted, although the prevalence is lower than that seen in the Middle East.

Male predominance is observed (male: female ratio 3.93:1) and the mean age at diagnosis is 45 years (range 3-88 years). The mean age is slightly lower than in Western studies, likely due to the higher number of cases in the pediatric age group. The prevalence of enteropathy-associated T-cell lymphoma (EATL) is much lower than in Western studies, likely due to the low rate of diagnosis of celiac disease in India. Major population-based studies are not available from Northern India.

China

The findings from recent population based studies from Zhejiang and Tianjin provinces are depicted in Figure 4.28,29,30

As seen in Figure 4, the stomach is the most common site, followed by the ileocecal region. The prevalence of PGIL in the ileocecal region is higher in China than in the West and India; the reason for this is not clear from these studies. The majority of cases (182 of 216) are B-cell lymphomas.

As in the West and India, a male predominance is noted (male: female ratio 1.27:1) and the median age at diagnosis is 56.9 years (range 8-89 years).

EPIDEMIOLOGY BASED ON ANATOMICAL LOCATION OF GI LYMPHOMA

Esophageal Lymphomas

Esophageal involvement is rare, with esophageal lymphoma accounting for approximately 1% of all PGIL. The available literature is meager, largely in the form of case reports.31 Esophageal involvement by extension of a primary mediastinal or gastric lymphoma is more common. Primary esophageal lymphoma most often involves the distal esophagus.32

Gastric Lymphomas

Gastric lymphoma, the most common anatomical type of GI lymphoma, accounts for about 68 to 75% of all PGIL.17,19 Primary gastric lymphoma constitutes about 3% of all gastric cancers and accounts for 10% of all lymphomas.20 Within the GI tract, organized lymphoid tissue is normally present only in the tonsils and the Peyer’s patches of the terminal ileum; the normal gastric mucosa lacks structured lymphatic tissue (lymphatic follicles). In response to inflammatory processes, however, lymphoid tissue appears in the gastric mucosa, termed mucosa-associated lymphoid tissue (MALT), a term first coined by Isaacson et al.33

Incidence of gastric lymphoma is higher in males compared to females. The peak age of incidence is between 50 and 60 years.

A majority of gastric lymphomas are of the low-grade MALT type (40%), with non-MALT-type NHL (mantle cell lymphoma) and Burkitt’s lymphoma being very rare (1.4% and 3.2%, respectively).17 MALT lymphoma, mantle cell lymphoma and Burkitt’s lymphoma are discussed separately in detail.

Diffuse large B-cell lymphoma (DLBCL) comprises 40-70% of all gastric lymphomas and is the most common B-cell NHL overall.17,19 The stomach is the most common location for GI DLBCL, but it can also occur, rarely, in the rectum, colon or terminal ileum. The origin of DLBCL is not clear, but it may arise by transformation of MALT lymphoma. The median age at diagnosis is in the fifth decade, with a male predominance. No risk factors other than immunodeficiency (congenital immunodeficiency, organ transplantation, HIV infection) have been identified for DLBCL. The prevalence of H. pylori infection in patients with DLBCL is 35%, but the majority of DLBCLs are associated with MALT lymphoma, implying that they arise from MALT lymphoma transformation.34

Small Intestinal and Colorectal Lymphomas

Small intestinal lymphomas account for about 23-26% of all GI lymphomas in the West. After the stomach, the small bowel is the second most common location for GI lymphomas. The epidemiology of small intestinal lymphomas is interesting because it differs by geographical region.23,25

There are three main types of small intestinal lymphomas:

  • Immunoproliferative small intestinal disease (IPSID), also called Mediterranean lymphoma, Seligmann disease or alpha heavy chain disease.
  • Enteropathy-associated T-cell lymphoma (EATL), associated with celiac disease.
  • Non-IPSID lymphomas, for example mantle cell lymphoma, Burkitt’s lymphoma and follicular lymphoma.

According to some studies, small intestinal lymphomas are the most common GI lymphomas in the Middle East, accounting for nearly 75% of all GI lymphomas.25 Other studies, however, show a distribution similar to that seen in Western populations, with the stomach the most common site.23 The most common type of small intestinal lymphoma in the Middle Eastern population is IPSID. Risk factors are summarized in Table 7.21

EATL is rare overall but is associated with celiac disease and is the most common type of small intestinal lymphoma in the West. In a study from the Netherlands, the overall crude incidence was 0.1 per 100,000. The peak incidence is in the seventh decade, with the proximal small intestine the most common location.35 Although the incidence of uncomplicated celiac disease is about 2 times higher in women than in men, the incidence of EATL is higher in men.36

Colorectal lymphomas are very uncommon, comprising about 3% of all GI lymphomas and about 0.3% of all colorectal malignancies. Literature on their epidemiology is limited; however, the incidence is higher in males than in females.

Primary Pancreatic Lymphomas (PPL)

PPLs are extremely infrequent. The most widely accepted diagnostic criteria for PPL are as follows:37

  • Mass involving the pancreas with or without loco-regional lymph nodes
  • No superficial or mediastinal lymphadenopathy
  • No hepatic or splenic involvement
  • Normal peripheral leukocyte count

In a series of 12 cases of PPL,38 the median age at diagnosis was 65.5 years; 91.7% of the patients were Caucasian and 58.3% were male. Only one patient had a diagnosis of HIV prior to the diagnosis of PPL. The majority of cases involved the head of the pancreas (83.4%).

GI LYMPHOMA ASSOCIATED WITH SPECIFIC DISORDERS

Helicobacter pylori (H. pylori) Infection and Mucosa-Associated Lymphoid Tissue (MALT) Lymphoma

Gastric MALT lymphomas are closely associated with H. pylori infection, with 90-95% of MALT lymphoma patients having evidence of current or past H. pylori infection.39 Although the incidence of H. pylori infection varies across the world, the global prevalence of H. pylori is estimated to be as high as 50%.40 The overall incidence of gastric lymphoma, however, is very low, accounting for only 2-8% of all gastric cancers. The median age at presentation is 61 years and there is no sex preponderance.17

Even among developed countries there are regions where the prevalence of MALT lymphoma is higher than average.41 For example, in north-eastern Italy the frequency of primary gastric lymphoma is very high, with an incidence of up to 13.2 cases per 100,000 per year, significantly higher than in other European countries,42 presumably related to differences in the prevalence of the oncogenic strain of H. pylori.

Several studies have shown an association between H. pylori infection and MALT lymphoma.43,44 The development of MALT lymphoma may be related to H. pylori strains expressing the CagA protein. Serum IgG antibody to the CagA protein was much more common in H. pylori-infected patients who developed MALT lymphoma than in those who did not.45 The very low prevalence of MALT lymphoma despite the very high global prevalence of H. pylori could be due to the rarity of CagA-producing strains. A recent study has shown that H. pylori may translocate the CagA protein into B cells.46 In B cells, CagA induces extracellular signal-regulated kinase activation and up-regulation of Bcl-2 expression, which in turn inhibits apoptosis and leads to cancer or lymphoma.46

The predominant site for the development of MALT lymphoma is the stomach; however, other GI sites such as the cecum can rarely be involved.47 Another infection that has been linked, though not causally proven, to GI lymphoma is Campylobacter jejuni, which has been associated with small bowel lymphoma.

Burkitt’s Lymphoma

Denis Burkitt, working as a surgeon in Kampala, Uganda, noted a distinctive type of lymphoma, mostly in children, who had dysmorphic facies, at times with proptosis. Some children also had distended abdomens. The malignancy was initially thought to be a sarcoma. In Burkitt’s initial epidemiological studies, the lymphoma was found in the region 15 degrees north and south of the equator, a hot, wet region with high rainfall year round. Subsequent epidemiological studies, discussed later, have been done, but the original epidemiological investigation had a significant impact on the understanding of this disease entity.48,49

According to the WHO, there are 3 types of Burkitt’s lymphoma: endemic, sporadic and HIV-associated.

Endemic Burkitt’s lymphoma

Endemic Burkitt’s lymphoma (BL) is seen in many countries of Africa. The endemic zone lies both north and south of the equator, extending from Nigeria, Mali and Uganda to Tanzania and covering all central African countries near the equator; the endemic area is bisected by the equator.50 The incidence is some 100-fold higher in tropical Africa and Papua New Guinea than elsewhere.51

Endemic BL, seen predominantly in children between 4 and 7 years of age, is the most common childhood malignancy in Africa.52 Male predominance is noted, with a male: female ratio of 2:1. Burkitt’s lymphoma commonly involves the jaw bones; however, the kidneys, gastrointestinal tract, ovaries, breast and other extranodal sites can also be involved.53 The endemic form is almost always associated with Epstein-Barr virus infection. Its prevalence is about 60 times higher than the prevalence of Burkitt’s lymphoma in the United States.54

In a recent study from north-eastern Nigeria, the majority (63.3%) of affected children were in the 6-10-year age bracket, with a male predominance. Most affected children were of the Fulani ethnic group (30.6%), from Borno state (36.7%) and living in rural areas (40.8%).55

A recent major study from Uganda quantified the extent of abdominal involvement in Burkitt’s lymphoma. The mean age for abdominal tumors was higher (7 years, compared with 6 years for overall BL; p < 0.001). Interestingly, although overall BL was more common in males, abdominal involvement was seen more often in females. The age-adjusted incidence, 2.4 per 100,000, was lower in districts far from Lacor and higher in districts close to it. While districts close to Lacor were also more urbanized, the incidence was higher in nearby semi-rural areas as well.56 This contrasts with the Nigerian study, where the incidence was high in rural areas.

Sporadic Burkitt’s lymphoma

Sporadic Burkitt’s lymphoma is seen all across the world, with no specific geographic or climatic association. In the United States and Western Europe, it constitutes about 1-2% of all lymphomas in adults and about 40% of all lymphomas in children.54

Interestingly, the abdomen, and specifically the ileocecal area, is the most common site of involvement in sporadic BL, in contrast to endemic BL, which mainly involves the facial bones. In addition to the abdomen, other affected sites include the ovaries, kidneys, omentum and Waldeyer’s ring.

HIV-associated Burkitt’s lymphoma is discussed separately.

Immunoproliferative Disease of the Small Intestine (IPSID)

IPSID lymphoma has interesting epidemiological features. IPSID, a type of MALT lymphoma, occurs exclusively in the small intestine. It was previously called alpha heavy chain disease because it expresses a monotypic truncated immunoglobulin alpha chain. It is also called Mediterranean lymphoma because of its geographical epidemiology: it is seen in regions bordering the Mediterranean Sea and in the Cape region of South Africa.

The stomach, as in Western populations, is the most common site of GI lymphoma even in the Mediterranean region. The median age for IPSID is 25-30 years, with no gender preponderance.22 In comparison, Western-type MALT lymphoma occurs mainly in the elderly. A recent study from Tunisia57 presented epidemiological data from the past 28 years in a cohort of about 210 patients. Surprisingly, there was a significant decrease in the annual incidence of primary small intestinal lymphoma, as well as a significant transition away from IPSID (Mediterranean lymphoma), which is progressively being replaced by "Western" lymphomas.

Mantle Cell Lymphoma (MCL)

Mantle cell lymphoma is a rare GI lymphoma, constituting about 4-9% of all GI lymphomas and 5-8% of all NHL. In previous studies, MCL was noted to affect the GI tract in 15-30% of cases.58 A recent study has challenged this view: microscopic evidence of MCL was found in 84% of cases with a normal macroscopic appearance on colonoscopy and in 45% on upper GI endoscopy. Given this high level of involvement, it has been suggested that MCL may originate in the MALT of the GI tract.59

The median age at diagnosis is 60 years and the male: female ratio is close to 3:1.60 MCL arising primarily from the GI tract has a better prognosis than MCL arising from lymph nodes.61

GI Lymphoma and Celiac Disease

Celiac disease (CD), an autoimmune disorder, has a prevalence of about 1% in the Western world, with many more cases undiagnosed.62 As the incidence of CD is higher in Caucasians, the prevalence of EATL is also higher in this ethnic group.63 CD appears to increase the incidence of many GI and extraintestinal cancers, in particular lymphoma of the small intestine, probably related to villous atrophy. EATL is of two types: Type 1, which is associated with CD, and Type 2, which is possibly a separate disease entity, occurring sporadically and with different morphological features.64

The majority (65%) of EATL cases are of the T-cell immunophenotype and are called enteropathy-type T-cell lymphoma (ETTL). ETTL not associated with CD is a rare entity; however, most lymphomas that occur as a complication of CD are not of the ETTL type.65 A significant link has been found between female gender, CD, autoimmune or inflammatory disorders and B-cell NHL, although the degree to which a concurrent autoimmune or inflammatory disorder contributes to the risk is unknown.66 ETTL develops in approximately 5% of patients with celiac disease followed over a 30-year period.

In most patients, the diagnosis of CD is established just before or at the time of lymphoma diagnosis. EATL occurs in only a small proportion of patients with an established history of childhood CD, and in these cases the disease has often not been well controlled. Conversely, 80-90% of patients with EATL have CD.64 The median age of onset is in the seventh decade of life, with a male predominance (64%), despite the fact that CD is more common in females than in males.

Celiac disease has also been found to be associated with small intestinal adenocarcinoma and with pharyngeal and esophageal squamous cell carcinoma.36,67

The risk for GI lymphoma decreases over time subsequent to the diagnosis of CD. This could be due to the introduction of a gluten-free diet. The risk, although decreased, remains persistently elevated in comparison with the general population.68

GI Lymphoma and Inflammatory Bowel Disease(IBD)

Considerable advances in the understanding of lymphomas in the setting of IBD have been made since the association was first described at the Mayo Clinic by Bargen.69 There are three important aspects to the association of lymphomas with IBD:

  • Is there a risk for lymphoma independent of the treatment for IBD?
  • Is there a link between the use of immunosuppressants such as azathioprine or 6-mercaptopurine (6-MP) and the development of lymphomas?
  • Is anti- tumor necrosis factor (TNF) therapy a risk for development of lymphomas?

Multiple population-based studies have examined whether patients with IBD have an increased risk of lymphoma compared with the general population.6,70,71,72 Collectively, the data generated from these studies do not suggest an increased risk of lymphoma in patients with IBD.73

It is not clear whether the use of conventional immunosuppressants such as azathioprine or 6-MP is associated with an increased risk of developing lymphoma.74 Most studies to date have found no increased risk; however, they lacked sufficient power to detect one. A large meta-analysis of six cohort studies found a fourfold increase in the risk of lymphoma in patients with IBD treated with azathioprine or 6-mercaptopurine.75 It is not clear from this meta-analysis whether the increased risk stems from the medications per se or from the increased severity of the underlying IBD.

Similarly, the data on the risk of lymphoma with anti-TNF treatment are contradictory. The United States Food and Drug Administration (FDA), along with the manufacturer of infliximab, issued a boxed warning about the risk of lymphoma with its use in October 2004. By the end of 2010, a total of 20 cases of hepatosplenic T-cell lymphoma (HSTCL) associated with infliximab had been identified by the FDA. However, several other studies have failed to confirm an increased risk of lymphoma in patients receiving anti-TNF treatment.76,77

GI Lymphoma and HIV Infection

There has been a significant decline in the infectious and noninfectious complications of HIV, including GI lymphoma, with the increasing availability of antiretroviral therapy (ART). Among HIV-associated NHL, the GI tract is the most frequent extranodal site, involved in 30-50% of cases. GI lymphomas are seen mainly late in the course of HIV, with advanced immunosuppression, and are believed to be causally linked to the immunosuppression. They are usually of high-grade B-cell histology, with multifocal involvement affecting several regions of the GI tract simultaneously.78

HIV-associated BL mainly affects adults, unlike endemic BL, which occurs mainly in children.51 HIV-associated BL is also associated with Epstein-Barr virus infection.

Compared with other HIV-positive patients with NHL of the diffuse large B-cell type, those with BL are younger and have higher mean CD4 counts (usually >200 cells/µl). In studies conducted before ART became widely available, BL was 1,000 times more common in HIV patients than in the general population.79

GI Lymphoma and Autoimmune Diseases

A number of autoimmune diseases have been linked to an increased risk of lymphoma. These include:

  • Sjögren’s syndrome
  • Systemic lupus erythematosus
  • Granulomatosis with polyangiitis
  • Rheumatoid arthritis
  • Hashimoto’s thyroiditis

It is believed that the immunosuppressive therapy used for the treatment of these conditions, rather than the disease per se, is responsible for the increased risk.80

GI Lymphoma and Immunosuppression

Both congenital and acquired immunodeficiency are associated with increased risk of developing B-cell lymphoma.

Congenital immunodeficiency syndromes linked with GI lymphoma include:

  • Wiskott-Aldrich syndrome
  • Severe combined immunodeficiency syndrome
  • Ataxia telangiectasia
  • X-linked agammaglobulinemia

Acquired immunodeficiency syndromes linked with GI lymphoma include:81,82

  • Human immunodeficiency virus (HIV) infection
  • Immunosuppressive therapy for autoimmune disease or post organ transplant

Most such patients have secondary GI lymphoma; however, primary GI involvement of the stomach and small bowel has been reported.83

GI Lymphoma and Nodular Lymphoid Hyperplasia

Nodular lymphoid hyperplasia (NLH) is a polyclonal, follicular, reactive hyperplasia characterized by alteration of the lamina propria of the small intestine or colon.84 The behavior of NLH differs between children and adults. In children, NLH often has a benign course and typically regresses spontaneously. In adults, however, NLH is associated with immunodeficiency states including common variable immunodeficiency, selective IgA deficiency and giardiasis.85 The benign nature of NLH in adults is less certain, with several studies and case reports showing an association between NLH and malignant lymphomas, including high-grade NHL.86,87,88 The evidence of an association between NLH and GI lymphoma is stronger in the absence of immunodeficiency.

Recently, an entity called indolent T-cell lymphoproliferative disease of the gastrointestinal tract (indolent T-LPD) has been described.89 It is an indolent clonal proliferation of T cells. It is important to recognize this entity because it can easily be misdiagnosed as intestinal T-cell lymphoma and lead to unnecessarily aggressive therapy.

Epidemiology of Natural Killer (NK) Cell and T Cell Lymphomas Involving GI Tract

The majority of GI lymphomas are B-cell lymphomas; NK-cell/T-cell lymphomas of the GI tract are very rare, comprising about 3.1% of all GI lymphomas.90 In a retrospective population-based study from Asia, NK/T-cell GI lymphoma had a median age at diagnosis of 45 years with a male predominance (male: female ratio 7:3). NK/T-cell GI lymphomas tend to be more aggressive, occur at a younger age and have a poorer prognosis than B-cell GI lymphomas.91 The small intestine, rather than the stomach, is the most common anatomical site for primary NK/T-cell GI lymphomas.

SUMMARY

Primary GI lymphomas are far less common than secondary GI lymphomas. The GI tract is the most common extranodal site involved in NHL. Primary GI lymphomas can affect any site, or multiple sites, with the stomach the most common. The epidemiology and risk factors of GI lymphoma differ across populations. Disease entities associated with GI lymphomas include H. pylori infection, inflammatory bowel disease, immunosuppression and autoimmune diseases.

A Case Report

HSV Hepatitis as the Initial Presentation of Acquired Immune Deficiency Syndrome

A 28-year-old female without significant prior medical history presented with five days of right upper quadrant abdominal pain, vomiting, fevers up to 39.3°C and chills. Physical examination revealed right upper quadrant tenderness. Initial laboratory data revealed a normal complete blood count and comprehensive metabolic panel. Intravenous antibiotics were started for suspected biliary infection. Hyperdense gallbladder sludge, dilated biliary ducts and multiple hepatic hypodensities, suspicious for abscesses, were noted on computed tomography (CT) scan. Laparoscopic cholecystectomy was performed for suspected acalculous cholecystitis. Abdominal MRI on post-operative day 2 confirmed rim-enhancing lesions around the mildly dilated biliary system. Due to the extent of the abscesses, a human immunodeficiency virus (HIV) test was performed; it was positive, and the absolute CD4 count was 4 cells/mm3. Her liver enzymes began to rise on post-operative day 7 and endoscopic retrograde cholangiopancreatography (ERCP) was performed; it was normal, but a small sphincterotomy was made. She remained febrile and her liver enzymes peaked by post-operative day 20 [alkaline phosphatase 487 IU/L, total bilirubin 0.7 mg/dL, alanine aminotransferase (ALT) 1517 IU/L and aspartate aminotransferase (AST) 2137 IU/L]. She underwent diagnostic laparoscopy with wedge resection of hepatic segment five. Histology showed extensive hepatocellular necrosis, particularly in the subcapsular regions, with ground-glass nuclear morphology and intranuclear (Cowdry type A) inclusions suspicious for herpes simplex virus (HSV) cytopathic effect.

The whole blood HSV-2 DNA PCR viral load was above the upper limit of detection for the assay (1 x 10^8 DNA copies/mL). A thorough physical exam revealed herpes labialis. Intravenous acyclovir (10 mg/kg q8h) was immediately started and she defervesced rapidly, with normalization of liver enzymes on day 14. On day 19 of acyclovir treatment, she was diagnosed with cytomegalovirus (CMV) co-infection with neurological involvement. The acyclovir was changed to ganciclovir to cover both infections. She completed three months of antiviral treatment. Repeat abdominal CT scan two months after initiation of treatment showed interval decrease in the hepatic abscesses, and liver enzymes remained normal.

Our patient presented with acute, anicteric hepatitis clinically mimicking acute cholecystitis. Histology was consistent with HSV hepatitis, a treatable cause of liver failure. Prompt initiation of antiviral treatment strongly correlates with improved survival.1 Patients present with fever, high aminotransferases, normal bilirubin, leukopenia, thrombocytopenia, abdominal pain and acute renal failure. Herpetic lesions are present in 57% of cases.2 Liver biopsy is the gold standard for establishing the diagnosis. Immunosuppression, major surgery, trauma or pregnancy can reactivate HSV infection. Our patient's recently diagnosed HIV/AIDS could have predisposed her to herpes virus dissemination to the liver, causing hepatitis.

There are a limited number of HSV hepatitis cases in the literature, and thus there is no guideline on the duration of antiviral treatment. Existing case series demonstrate varying lengths of treatment, ranging from 3 weeks to 6 months depending on clinical response.3 Data support empiric antiviral treatment with acyclovir for all patients with acute hepatitis of unclear etiology until HSV infection is excluded.1


A Special Article

Etiology of Small Bowel Obstruction (SBO) in a Culturally Diverse Patient Population


Small bowel obstructions (SBO) are a major cause of morbidity and recurrent hospitalizations worldwide. The leading cause of SBO in the western world is adhesions. The goal of this study was to determine the etiologies of SBO in a large, university-affiliated hospital with a culturally, ethnically and socioeconomically diverse patient population.

Background & Aims: Small bowel obstructions (SBO) are a major cause of morbidity and recurrent hospitalizations worldwide. The leading cause of SBO in the western world is adhesions. The goal of this study was to determine the etiologies of SBO in a large, university-affiliated hospital with a culturally, ethnically and socioeconomically diverse patient population.

Methods: Systematic chart review of all patients hospitalized at Elmhurst Hospital Center with the discharge diagnosis of "bowel obstruction" between January 2005 and October 2012 was conducted. Patients with the diagnosis of SBO were selected from this group. Our cohort included 348 patients accounting for 405 admissions for SBO. Data collected included demographic profile, length of stay, hypothesized etiology of SBO and type of management. Because all data had skewed distributions, we calculated medians and compared several parameters.

Results: The etiologies of SBO were found to be adhesions (56%), hernia (10.3%), Crohn’s disease (5.7%), neoplasia (4.9%), tuberculosis (0.9%) and miscellaneous (22.2%). Surgical management was more frequent when a hernia (61.1%) or malignancy causing obstruction (59%) was the cause of SBO. Medical management was more common in Crohn’s disease (72%). Patients with hernia, malignancy or adhesions were older and had a longer median hospital stay after surgical management. There was no specific gender predilection for any cause of SBO except for Crohn’s (predominantly male). Ethnicity of the patient population was white (12.1%), African American (7.9%), Hispanic (48.4%) and Asian/others (31.6%).

Conclusion: Adhesions were the most common cause of SBO according to our study (56%), a finding consistent with other studies in the developed countries (70% as per current literature). Hernia was the second most common cause of SBO in our study, unlike other studies in western countries where malignant mass or Crohn’s disease have been found to be the second most common cause. This could be attributed to the cultural diversity in our population group. The prominence of hernias as an etiology of SBO in developing countries has been attributed to the infrequency of elective hernia repair in those areas.

Small bowel obstructions (SBO) account for more than 300,000 hospitalizations annually in the United States.1 The incidence of SBO from adhesions has increased during the last 30 years because of the increasing number of laparotomies.2 The morbidity and financial cost of SBO are compounded by the recurrent nature of the disease, which often depends upon the etiology of the obstruction. The outcome of the disease, the length of hospital stay and the treatment modality also vary according to the underlying reason for obstruction. Treatment success and health care costs also differ depending on the treatment modality. Numerous factors contribute to the underlying pathology resulting in SBO, including the socioeconomic background, ethnicity and cultural diversity of the patient population served, as well as developed versus developing country settings. We conducted a review to determine the etiology of SBO in our hospital, which serves one of the most ethnically, culturally and socioeconomically diverse patient populations in the United States. According to the 2010 Census, 39.7% of the population was white, 19.1% black or African American, 22.9% Asian, 12.9% of other races and 4.5% of two or more races; 27.5% of the Queens population was of Hispanic, Latino or Spanish origin (who may be of any race).3

METHODS
Patients

Systematic chart review of all patients hospitalized at Elmhurst Hospital Center with the discharge diagnosis of bowel obstruction between January 2005 and October 2012 was conducted. Patients with the particular diagnosis of small bowel obstruction were selected from this group. No distinction was made between complete versus partial obstruction. The exclusion criteria included age less than 18 years and diagnosis of large bowel obstruction. The cohort included 348 patients accounting for 402 admissions for SBO.

Data Collection and Analysis

Medical records were reviewed in their entirety; admission notes, progress notes, radiology reports, operative reports and pathology reports were included. Demographic profiles, length of stay, hypothesized etiology of SBO and management parameters were evaluated. Final determination of the etiology of the small bowel obstruction was based on clinical presentation, operative findings, radiological findings and consultant reports.

Institutional review board approval for a retrospective chart review was obtained. Informed consent was unnecessary as this was a retrospective chart review. All data were collected into a computerized database. Because all data had skewed distributions, we calculated medians and compared multiple parameters such as ethnicity of the patient population, management approach and length of hospitalization.

Results

Table 1 summarizes the etiologies of small bowel obstruction as determined by our review. Adhesions accounted for more than half of both the patients with SBO and the admissions for SBO. The second most common cause was hernia, followed by Crohn's disease. The ethnicity of the patient population was white (11.1%), African American (7.9%), Hispanic (48.4%) and Asian (29.2%).

Miscellaneous causes included: Strictures (13 patients, 15 admissions), volvulus (5, 6), foreign body (8, 9), endometriosis (2, 5), fecal impaction (5, 5), intussusception (3, 3), gallstone ileus (2, 3), malrotation (3, 3), abscess (4, 4), paralytic ileus (20, 20) and unspecified (12, 12). Of the eight patients with foreign body as the etiology, two were from a condom, two were due to a phytobezoar, one was from an ingested metallic pin, one patient had wireless capsule endoscopy retention and one patient was admitted twice with mushroom impaction.

Comparisons of the demographic profile and hospitalizations for patients with the leading etiologies of SBO are summarized in Table 2. Patients who presented with small bowel obstruction secondary to Crohn's disease were relatively younger, with a median age of 43. Additionally, only 30% of patients with Crohn's disease required surgery to relieve their obstruction, relative to the nearly 60% of patients with SBO from other causes who required surgery. As might be expected, conservatively treated patients had a shorter duration of stay than those treated surgically, regardless of etiology.

No specific gender predisposition for any cause of small bowel obstruction was found except for Crohn's disease (approximately 72% male). Ethnic distribution is shown in Table 2.

Comments

Adhesions were the most common cause of small bowel obstruction according to our study, a finding consistent with other studies in the developed countries.1, 2, 4-7

Adhesions accounted for 55.6% of cases in our study, whereas they account for approximately 70% of all cases of SBO in the current western literature.1,11-13,17 Hernias were the second most common cause of SBO in our study, unlike other studies in western countries where malignant mass19 or Crohn's disease1 have been found to be the second most common etiology (see Table 3). This variability could be attributed to the cultural diversity of our population. The prominence of hernias as an etiology in developing countries has been attributed to the infrequency of elective hernia repair in those areas.10 In western countries, because of increasing elective prophylactic herniorrhaphy, there is a relatively decreased frequency of SBO from hernia and a relatively increased frequency of SBO from adhesions.9,11

A significant shift in the underlying causes of small bowel obstruction has been documented in the literature over the course of the past century. In a British study involving 6,892 patients conducted during the 1920s, Vick15 reported that hernias caused 49% of intestinal obstructions, while adhesions caused only 7%. Evaluating the United States population from 1942 to 1945, McEntee et al4 reported a dramatic change in the pattern of causes of SBO, with adhesions accounting for 31% and strangulated hernias only 10%. In a similar study from the United States four decades later (1980-1981),13 adhesions accounted for 74% of cases and hernias only 8% (see Table 3). This drastic change is largely due to the elective treatment of inguinal hernias and the increasing number of laparotomies.

Socioeconomic background as well as the cultural and ethnic diversity of the patient population are also independent predictors of the etiology of small bowel obstruction. In a review of 316 African cases in 1980, Chiedozi et al5 reported that strangulated hernia caused 65% of cases of intestinal obstruction while adhesions caused only 11%. In a similar review from 2005-2008 including 367 Indian patients,16 hernias caused 36% of all cases of SBO while adhesions were responsible for only 16%. Interestingly, intestinal tuberculosis accounted for 14% of cases in that study, illustrating the different patterns of SBO in developing countries. In our study, 3 cases of SBO secondary to intestinal tuberculosis resulted in 9 hospitalizations.

No specific gender predilection for any cause of obstruction was found except for Crohn's disease, which was found predominantly in young males. Surgical intervention was used more frequently when hernia or a malignant mass was involved. The length of hospital stay was higher for patients treated surgically compared to those managed nonoperatively.12,20

While managing these patients, it is important to determine whether they can be treated conservatively or require emergency surgery. Most cases of partial SBO and acute obstruction from Crohn's disease resolve spontaneously with conservative management.8 Obstruction from impacted food, bezoars, foreign bodies or gallstones17 may be treated endoscopically. Complete obstruction, peritonitis or strangulation mandates emergency surgery. If SBO doesn't resolve after 24-48 hours of conservative management, it is more likely a complete obstruction than a partial SBO, and laparotomy is often indicated.7,14 Delaying surgery for more than 24 hours after symptom onset in cases of strangulation increases mortality threefold.1,18

SUMMARY

In conclusion, our study supports the previous literature in that adhesions remain the most common cause of small bowel obstruction (55.6% in our cohort, versus approximately 70% in other studies from developed countries). The differences we observed could be attributed to the ethnic, cultural and socioeconomic diversity of our patient population.


Frontiers in Endoscopy, Series #12

Endoscopy for Primary Treatment of Obesity


Obesity is a major health concern in the developed world. It complicates many illnesses and its reduction offers a major opportunity for health impact. Endoscopic therapies offer reversibility, minimal invasiveness and same-day treatment options, but are not commonly used in the United States. Here we discuss these therapies, some new and still under active investigation: space occupying intragastric balloons, duodenal barrier sleeves, caloric removal via gastric tube, and botulinum toxin to reduce gastric motility.

BACKGROUND

Obesity has increased significantly in the United States in the last 50 years and is a major contributor to rising health care costs. The majority of this increase has occurred since 1990, and prevalence seems to be stabilizing at 25-35% of the US adult population, depending on the survey1 (see Figure 1). The CDC estimates that obesity is directly responsible for $147 billion in healthcare costs,2 and obesity accounts for anywhere between 3-8% of healthcare costs in various countries.3 The diseases directly related to obesity include type 2 diabetes, hyperlipidemia, cardiovascular diseases, hypertension, kidney disease, gallbladder disease, liver disease including cirrhosis, GERD, colorectal, breast and prostate cancer, polycystic ovarian syndrome, neonatal complications, sleep apnea and depression. While research into obesity has demonstrated a complicated web of hormonal signals and neural pathways, the driving force behind the increase in obesity has been an increase in caloric intake and a decrease in physical activity, leading to a daily caloric surplus. Other factors such as the intrauterine environment,4 formula feeding,5 genetics,6 medications, and possibly viruses,7 are thought to also contribute to obesity. Obesity also varies by race, gender, and socioeconomic status.8,9

Combating this epidemic has largely focused on methods that decrease caloric intake by either increasing satiety or blocking absorption of nutrients. Medical therapies have largely been underwhelming, but there are four FDA-approved medications for weight loss in the obese. Orlistat inhibits the action of pancreatic lipases, which prevents uptake of fatty acids, and can lead to a 10% decrease in body weight (placebo lost 6%).10 Sibutramine, a serotonin/norepinephrine reuptake inhibitor, acts to increase satiety and has been shown to cause around an 8% body weight loss, but is limited by side effects on blood pressure and pulse.11 In 2012 two new drugs, Qsymia (phentermine + topiramate) and Belviq (lorcaserin), received FDA approval for weight loss. These medicines also act to increase satiety and offer modest weight loss over lifestyle therapy.

Surgical therapies for weight loss have been much more effective and remain the mainstay of treatment for morbid obesity at this time. Excess weight loss of ~50% can be expected two years after bariatric surgery, with proven reduction or resolution of hypertension, diabetes, and hyperlipidemia. However, bariatric surgery carries increased cost, morbidity and mortality relative to medical therapy. Several different surgical procedures exist; the most common bariatric surgery in the United States is the Roux-en-Y gastric bypass, which combines restrictive and malabsorptive/neurohormonal strategies. The laparoscopic adjustable gastric band and vertical banded gastroplasty both increase satiety by decreasing stomach volume. Biliopancreatic diversion with duodenal switch decreases the effective absorptive area and alters the neurohormonal balance to increase satiety.

Curiously absent from the battle against obesity has been the gastroenterologist. This has not been for lack of trying, as endoscopic treatments for obesity were first explored in the 1980s. Some predict that the gastroenterologist's role in obesity management will mirror that of the interventional cardiologist relative to the cardiothoracic surgeon, offering a less invasive management option than surgery. However, the therapies tried to date have largely been underwhelming and fraught with complications. Many have speculated that gastroenterology will play an increasing role in obesity management as therapies improve.12

Space Occupying Balloons

The first endoscopic attempts to manage obesity were in 1982,13 using an inflatable balloon that was placed in the stomach. The theory was that decreasing the accommodation of the stomach would cause early satiety and lead to decreased caloric intake. Early studies showed that ghrelin levels decreased in those with balloons, corresponding to the increased sensation of satiety. The first balloons produced only modest weight loss, and sham-controlled studies showed they were not superior to aggressive diet.14 Additionally, the early balloons suffered from a high rate of complications and spontaneous rupture. Further developments in intragastric balloons were made, and currently the most commonly used balloon worldwide is the BioEnterics Intragastric Balloon, or BIB. The deflated balloon is loaded onto a catheter and then blindly passed into the stomach. An EGD scope is then inserted alongside the balloon catheter to confirm position and placement. A needle is passed through the scope and 400-600 mL of saline (often mixed with methylene blue) is injected into the filling port of the balloon. The balloon is then detached from the catheter and the endoscope and catheter are removed (see Figure 2). Methylene blue, if used, serves as an indicator of balloon rupture, as it will dye the urine after being absorbed into the body.

While not FDA approved for US use, the BIB is used elsewhere and has been well studied. The first large testing was done in Italy between 2000 and 2004.15 In this retrospective study, over 2,500 patients had the BIB placed for 6 months with instructions on a 1,000 kcal diet, and then the balloon was removed. The pre-placement BMI was 44.4 and the post-removal BMI was 35.4, with an excess body weight loss of 34%. Placement was successful in 99.2% of attempts. Side effects included balloon rupture (0.36%), gastric obstruction (0.76%), esophagitis (1.3%) and gastric ulcer (0.2%). There were five gastric perforations, which resulted in two deaths. Four of the five patients with perforation had prior gastric surgery, suggesting that prior gastric surgery is a contraindication to balloon placement.
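The balloon trials discussed here report outcomes as percent excess body weight loss (EBL or EWL). As a rough illustration of how that metric is derived (the convention for defining ideal body weight varies between studies, and the patient figures below are hypothetical), the arithmetic is:

```python
def excess_body_weight_loss(baseline_kg, current_kg, ideal_kg):
    """Percent of excess body weight lost:
    (weight lost) / (baseline weight - ideal weight) * 100."""
    weight_lost = baseline_kg - current_kg
    excess_weight = baseline_kg - ideal_kg
    return 100.0 * weight_lost / excess_weight

# Hypothetical patient: 120 kg at baseline with an ideal body weight
# of 70 kg (50 kg of excess weight) who has lost 15 kg.
print(round(excess_body_weight_loss(120, 105, 70), 1))  # 30.0
```

Note that a patient can show a substantial %EBL while remaining well above ideal weight, which is why the studies report both BMI and %EBL.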

The same authors also led a much smaller prospective, double-blind, randomized, controlled crossover study.16 Group A had the BIB placed for three months followed by a sham procedure for three months. Group B had a sham procedure for three months followed by the BIB for three months. Group A had significant weight loss within the first three months (BMI 43.5 to 38.0) and was able to maintain it following BIB extraction. Group B had no change in weight during the first three months (BMI 43.6 to 43.1) but lost weight after BIB placement (BMI 43.1 to 38.8). There was no mortality.

While the Italian studies and similar studies in Brazil17 and Spain18 demonstrated good short-term data, another study by Mathus-Vliegen et al19 showed less encouraging results. Forty-three patients were enrolled, with an average BMI of 43.3, and were divided into two groups. Group A would have the balloon for three months and, if they lost weight, were given an additional 9 months of balloon treatment. Group B was given a sham treatment for 3 months and, if they lost weight, were given 9 months of BIB. Five patients were excluded for not meeting certain pre-trial weight loss goals and six more were excluded for either not tolerating balloon placement or for developing esophagitis and requesting the balloon be removed. In intention-to-treat analysis there was no difference in weight loss between group A and group B for the first three months. Once the sham group was given an additional three months of BIB, they lost more weight than the group that had the BIB from the beginning. After the balloon was removed following one year of therapy, the average weight loss was 25.6 kg for both groups. Follow-up another year later showed that the patients had regained half of the weight initially lost. The conclusion was that the BIB was safe and led to weight loss, although no significant benefit was seen compared to sham in the first three months; after three months there was mild but significant weight loss.

The goal of any weight loss procedure is to achieve durable weight maintenance, and a small study evaluated the long-term weight trends of one hundred patients treated with six months of BIB.20 The average pre-balloon BMI was 35 kg/m2, and after six months of therapy the BIB was removed. Patients were not given any weight loss guidance, in an attempt to measure just the balloon's effect. Average weight loss was 12.6 kg, and about 63% of the patients had >10% baseline weight loss. After one year the group had regained 4.2 kg, and at 2 years another 2.3 kg. At 30 months after BIB placement, only 24/100 patients had maintained weight loss >10%. Following intention-to-treat analysis, by the end of the study (4.8 +/- 1.6 years post balloon) 28/100 patients had maintained >10% weight loss. Thirty-five patients had gone on to bariatric surgery, three were lost to follow-up, and the remaining thirty-four patients who had failed to maintain >10% baseline weight loss had a weight loss of 1.5 kg (+/- 5.8 kg). The authors' conclusion was that the BIB is an effective option that provides durable weight loss in around 25% of patients.

Currently, the BIB is indicated for only six months of continuous use. The Italian group led by Genco et al designed a study that looked at the safety and efficacy of placing sequential balloons compared to a single balloon followed by diet.21 They took 100 patients (1:4 male:female) and randomized half of the group to receive a balloon for six months followed by seven months of diet therapy, while the other group received six months of balloon therapy, one month without a balloon, and another six months of balloon therapy. Baseline BMIs for the two groups were 42.6 and 42.9, respectively. After six months of balloon therapy the groups had achieved BMIs of 34.2 and 34.8, respectively. At the end of the study, group A (one balloon followed by diet) had a mean BMI of 35.9 and group B (two sequential balloon treatments) had achieved a mean BMI of 30.9 (p<0.05). There were no complications in either group.

Combination therapy with intragastric balloons and pharmacotherapy has been studied to boost the effect of the balloon. Farina et al.22 recently published their trial involving a small group of obese patients who were randomized to balloon therapy plus lifestyle modification with or without pharmacotherapy for one year. The group initially had the BIB for six months and then were randomized to either pharmacotherapy or lifestyle modification. A control group of patients who just received lifestyle modification with pharmacotherapy was also used. The subset that received the balloon had significantly more weight loss than the pharmacotherapy alone group (37.7% EBL vs 25.3% EBL). At one year, the group that had balloon therapy for six months followed by six months of pharmacotherapy had significantly more weight loss compared to pharmacotherapy only. The balloon+pharmacotherapy group lost 41.3% EBL, balloon+lifestyle lost 34.9% EBL, and the pharmacotherapy only group lost 22.1% EBL.

As durable weight loss remains a question with balloons, studies have assessed the balloon as a bridge therapy to bariatric surgery in patients unsure of surgery or thought too obese for surgery. A Greek study23 published in 2006 took 140 obese patients with a median BMI of 42 kg/m2 who had previously refused bariatric surgery out of fear of complications. They were all given the BIB for 6 months and then followed for another 24 months. Seventy-one percent of the patients given the balloon lost >25% of their calculated excess body weight. Forty percent of this group regained their weight during follow-up, while 56% maintained their weight loss. Thirty-two percent of the total 140 patients eventually underwent some form of bariatric surgery. The vast majority of those who accepted surgery were from the group that failed to lose weight with the BIB or that regained weight after balloon extraction.

Another small study looked at super-morbidly obese patients with high-risk factors for weight loss surgery.24 Twenty-six patients with an average BMI of 65 kg/m2 were given the BIB for six months in an attempt to decrease the risks of bariatric surgery. One patient died of cardiac arrest following an aspiration event the day after balloon placement. Twenty of the twenty-six patients lost enough weight to proceed with bariatric surgery the day after balloon extraction.

Other studies have shown that intragastric balloons decrease NASH scores on hepatic biopsies after six months, which may represent a novel treatment for NASH.25 While the BIB is the most common, other balloons in development focus on delivery systems that can be deployed without endoscopy26,27 or on filling the balloon with air.28 Advantages of balloon therapy include the speed of placement and its relative safety.

United States FDA approval is pending for intragastric balloons. It was denied in 2007 after a Cochrane review of nine randomized trials concluded that weight loss was not convincingly better than focused lifestyle modification, and the balloon could not be recommended above dietary and behavioral modification.29 One confounder was that the studies had difficulty separating the balloon effect from weight loss due to lifestyle modification in the motivated patients involved in the studies.

Intestinal Barrier Sleeves

Decreasing the absorptive surface of digestion decreases absorption of calories but has also been shown to alter gut hormones which helps decrease weight.30 Restrictive surgeries such as the vertical banded gastroplasty and gastric bypass are proof that this method of weight loss is effective.

The first endoscopic device tried in humans was the Endobarrier made by GI Dynamics (See Figure 3). It is about 60cm long and is endoscopically anchored in the duodenal bulb. The sleeve is passed over a guidewire so that it rests in the proximal jejunum thus inhibiting mixing of the stomach contents with pancreatic and biliary secretions. The first human study was published in 2008 and consisted of 12 patients with an average BMI of 42.31 Average time to implant the device was 26 minutes and average time of explantation was 43 minutes. Ten of the 12 patients were able to complete a 3 month trial of the Endobarrier sleeve and lost about 28% of their excess body weight.

One of the first randomized trials was published in 2010 from the Netherlands.32 In this study, 41 patients were recruited and randomized such that 30 patients received 12 weeks of the Endobarrier sleeve and 11 patients were treated with diet counseling. The BMI in the treatment group was 48.9 and in the diet group 47.4, a difference that was not statistically significant. Twenty-six of the thirty patients had the Endobarrier sleeve successfully placed, with an average time of 35 minutes (12-102 minutes) for implantation. Four of these patients had to have the device removed prior to 12 weeks of therapy due to sleeve obstruction, sleeve migration, dislocation of the anchor, or continuous epigastric pain. All 26 patients who had the sleeve had at least one adverse event, mainly abdominal pain or nausea. The twenty-two patients who completed the 12-week trial saw an average excess body weight reduction of 19%, compared to just 6.9% for the diet group. HbA1c improved in diabetics who completed the trial with the Endobarrier sleeve. Endoscopic time to remove the device was 17 minutes (5-99 minutes).

One of the only randomized, sham-controlled trials of the Endobarrier was published in 2010.33 In this study, 21 morbidly obese patients were randomized to 12 weeks of the duodenal bypass sleeve and 26 patients were randomized to a sham procedure. Eight (41%) of the sleeve patients had to have the sleeve removed prematurely for GI bleeding, abdominal pain, nausea/vomiting, or an unrelated pre-existing illness. Comparing the remaining 13 patients who had the sleeve placed to the 26 sham controls, excess weight loss was 11.9% versus 2.7% (p<0.05). Over 62% of the sleeve group achieved >10% EWL compared to only 17% of the sham group. This study showed significance over sham but was still limited by its small numbers and the fact that the study personnel were not blinded.

A similar device studied in humans is the ValenTx barrier sleeve. This sleeve is longer (120 cm) and anchors in the gastric cardia, effectively bypassing the stomach as well as the duodenum. This makes it function more like a Roux-en-Y gastric bypass. The only published study looked at 24 morbidly obese patients recruited between 2008 and 2010.34 The average BMI of the group was around 42 kg/m2. Twenty-two of the 24 patients had the gastroduodenal bypass sleeve placed and 17/22 completed the 12 weeks of therapy. Average weight loss among the 17 patients was 39.7% of excess body weight, and there were no major complications from the procedure. Four out of four patients with an elevated HbA1c at the beginning of the trial had improved A1c values by the end. However, this was not a randomized controlled study and further studies are needed.

Duodenal bypass sleeves offer some promise but are limited by side effects from placement of the sleeve. These include failure of around 20% of patients to tolerate the sleeve due to pain or nausea, as well as lesser degrees of bleeding from the anchoring site, sleeve migration, and sleeve obstruction. Additionally, no long-term studies have been completed to assess the durability of response or the feasibility of repeat sleeve placement.

Aspiration Therapy

In December 2013 a pilot study of an endoscopically placed aspiration tube was published35 (see Figure 4). This study randomized 18 obese patients in a 1:2 fashion to either lifestyle therapy (LT) training or an endoscopically placed aspiration tube (AspireAssist, Aspire Bariatrics, King of Prussia, PA) that assists in emptying stomach contents 20 minutes after a meal. The aspiration tube (A-tube) is placed in the usual fashion as a PEG tube but has a long fenestrated intragastric portion as well as a skin port. When it is time to aspirate, a connector is attached to the skin port and water is flushed into the stomach and then aspirated into a companion reservoir that is then emptied. While the study was small, there were no statistically significant differences between the LT and aspiration therapy (AT) groups in age, BMI or relevant lab values. After one year, 10 patients in the aspiration therapy arm and 4 patients in the lifestyle therapy arm presented for follow-up. Average total body weight loss was 18.6% in the AT group versus 5.9% in the LT group. Adverse events in the AT group included mild/moderate abdominal pain (improved with tube redesign), mild peristomal bleeding, mild peristomal infection, mild constipation or diarrhea, and one persistent fistula after A-tube removal. All problems were treated conservatively. No hospitalizations were required in any of the study subjects.

Botulinum Toxin Injection

Several studies have assessed the effect of injecting botulinum toxin into the gastric muscles. The idea is that reducing the stomach's ability to empty food will alter the gastric hormones and prolong the feeling of satiety. An initial pilot study in 2005, which injected botulinum toxin A 100 U into the gastric antrum only, did not significantly decrease weight or solid gastric emptying time at 4 and 12 weeks.36 However, Foschi et al found that injecting botulinum toxin 120 U into the antrum AND 80 U into the fundus led to small but significant decreases in weight and increases in gastric emptying time.37 The group injected with botulinum lost 11.8 kg after 2 months, compared to 5.5 kg in a control group injected with saline in both the antrum and fundus. The solid gastric emptying time increased from 83.4 minutes to 101.6 minutes, with no change in the saline group. This study helped pave the way for further studies in which botulinum is injected in both the antrum and the fundus. Further studies have demonstrated decreased levels of ghrelin and PYY,38 and no study patient has experienced any severe complications.

To date no large, randomized, prospective, sham controlled trial has been done on intragastric botulinum injections.

CONCLUSION

Primary endoscopic treatment for obesity is in its infancy. While some therapies have shown promising results, better therapies are needed to overcome lack of durability, moderate to severe side effects, the time and technical skill required, and complications. For the foreseeable future, bariatric surgery will remain the most effective management of obesity. As we gain further insight into the neurohormonal gut-brain axis, medical therapies may become predominant.


Nutrition Issues in Gastroenterology, Series #131

The Calorie Requirement Conundrum


Although guidelines promote more complex means of estimating calorie expenditure, critically ill patients often receive variable and incomplete amounts of nutrition. The optimal timing and amount of nutrition to feed critically ill patients has not been established. In this article, we discuss multiple prediction equations for estimating calorie expenditure, calorie requirements, and the importance of addressing barriers to meeting basic nutrition goals in order to prevent large cumulative nutrition deficits.

INTRODUCTION

Nutrition recommendations for adult critically ill patients in the 1970s and 1980s encouraged increased calories, in the range of 3000-5000 kcals/day, to reduce muscle breakdown or improve nutrition status.1 It was not uncommon for patients to receive 1.5 to more than 2 times their actual calorie expenditure into the mid-1980s.2,3 However, case reports of hyperglycemia, hepatic enzyme elevations, respiratory failure and protracted ventilator weaning associated with purposeful overfeeding (hyperalimentation) appeared by the early 1980s.3,4 Research also demonstrated that providing nutrition in excess of calorie expenditure in the early phase of illness or injury did not prevent catabolism and muscle breakdown.5,6 In view of the evidence that severe overfeeding of calories in the early stage of critical illness caused negative consequences without apparent benefits, clinicians searched for a means to guide the provision of nutrition support.

Measurement Versus Estimation of Calorie Expenditure

Indirect calorimetry (IC) estimates 24-hour calorie expenditure via measurement of oxygen consumption and carbon dioxide production. Studies with indirect calorimetry have revealed that the calorie expenditure of most critically ill adults was more modest than had previously been thought, and early prediction equations with activity and stress factors for estimation of calorie expenditure often led to overfeeding.7,8
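IC devices commonly derive energy expenditure from the measured gas exchange via the abbreviated Weir equation. A minimal sketch is shown below; the equation's coefficients are the standard published ones, but the sample gas-exchange values are hypothetical and for illustration only:

```python
def weir_ree(vo2_l_min: float, vco2_l_min: float) -> float:
    """Abbreviated Weir equation: resting energy expenditure in kcal/day.

    vo2_l_min  -- oxygen consumption (L/min)
    vco2_l_min -- carbon dioxide production (L/min)
    """
    # kcal/min scaled to a 24-hour (1440-minute) estimate
    return (3.941 * vo2_l_min + 1.106 * vco2_l_min) * 1440

# Hypothetical gas-exchange values for a ventilated adult:
ree = weir_ree(0.250, 0.200)
print(round(ree))  # ~1737 kcal/day
```

Note that, as discussed below, a single such measurement is a snapshot: the 24-hour extrapolation assumes the measured minutes are representative of the whole day.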

IC is frequently not available at many facilities due to the cost of the equipment and its maintenance, as well as the training and time of experienced personnel. In the absence of IC, clinicians routinely use one or several of the multitude of prediction equations to estimate a patient's calorie expenditure.9-16 These equations are based on physical attributes such as height, weight and age, and/or physiologic variables such as body temperature, respiratory rate, tidal volume, minute ventilation, and/or severity of injury (See Table 1). Over the years, a number of "improved" equations have been published, and studies have compared the accuracy of these prediction equations with estimation of 24-hour calorie expenditure via indirect calorimetry.9 Although there is a plethora of work regarding energy expenditure and the accuracy of prediction equations, there is very limited research about the amount of nutrition that will optimize the outcome of critically ill people.

Calorie Requirements Versus Calorie Expenditure

In order to understand the best way to estimate calorie needs, it is important to differentiate between the calorie expenditure of critically ill patients and the calorie provision that allows the best possible patient outcome. To date, there are no large randomized trials demonstrating that meeting a critically ill adult patient’s full calorie expenditure results in improved outcomes. Obviously, providing minimal nutrition for an extended period of time will eventually result in serious malnutrition. However, it is possible that providing nutrition to critically ill adults that meets full calorie expenditure may actually have negative consequences, especially in the early, most acute phase of critical illness. Patients may benefit from a period of reduced, or even no nutrition during the early stage of illness, with increased nutrition at a later point. The ideal amount of nutrition support may be different depending on the degree of malnutrition, age, severity of illness or injury, presence and severity of surgical wounds, trauma or burns, requirement for repeated surgical procedures and duration of recovery. The early phase of critical illness is characterized by unavoidable catabolism that is not reversed by meeting full calorie expenditure.5,6 Additionally, increased insulin resistance and decreased gastrointestinal motility in the early stage of critical illness or injury have the potential to increase complications related to providing nutrition support.17 Researchers have postulated that providing full nutrition needs in the early phase of critical illness may impair the normal activation of mechanisms that are needed to remove cellular damage.18 There is evidence that in some critically ill adult populations, hypocaloric, full protein feeding may actually improve patient outcomes.19,20

A number of observational studies have described associations between the amount of nutrition provided to critically ill patients and their outcomes.21-23 However, observational studies cannot attribute cause and effect related to the amount of nutrition received because those patients with worse outcomes are more likely to receive less nutrition. It is not possible to statistically control for all variables that affect outcome in observational studies, and it is inappropriate to make practice recommendations based on associations reported in observational studies.

Two randomized studies have purported to describe improvements in selected outcomes in patients who had calorie expenditure measured by IC followed by increased nutrition delivery.24,25 However, the amount of calories provided was not the only difference between the experimental groups in these studies. Both studies provided increased calories and significantly more protein to the experimental group, primarily by providing increased parenteral nutrition. In both studies there was only a trivial difference between the calculated nutrition needs and IC measurements, so neither study provides any meaningful data about indirect calorimetry. In one unblinded single center pilot study, patients received a bundle of increased calories and protein as well as individualized attention from the study dietitian to help ensure adequate nutrition delivery.24 Patients who received increased nutrition had significantly more infectious complications, delayed ventilator weaning and increased time in the ICU. There was no significant difference in hospital mortality on intention to treat, but in the smaller per protocol analysis (n=112), mortality was significantly decreased in the experimental group.24 The authors concluded that a much larger multi-center trial would need to be conducted to understand the effect of increased nutrition on mortality.24 The other study did not result in any significant differences in infectious complications over the entire study period, but did report a decreased adjusted probability of infections in the group receiving increased calories and protein over a post-hoc selected time period between days 9-28 (after the parenteral nutrition was discontinued).25 In contrast to these two studies, a modest sized study (n = 240) of reduced calories with supplemental protein (compared to full calories and protein) in medical-surgical ICU patients reported significantly less mortality in the group receiving reduced calories.26

A much larger multi-center, randomized study of 1000 patients with acute lung injury (ALI) or ARDS found that attempting to provide full feedings did not result in any outcome improvements compared to "trophic feeding" (approximately 25% of calculated needs).17 The group with planned full feedings received an average of 80% of calculated needs, but had significantly more minor gastrointestinal complaints such as elevated gastric residuals, regurgitation and episodes of emesis, and required more prokinetic and anti-diarrheal medications.17 In this higher quality study, an average difference of 900 kcals/day (56% of estimated needs) in well-nourished patients with ALI/ARDS did not result in any significant difference in patient outcomes.17

Calorie Prediction Equations

There are a large number of calorie prediction equations, and multiple studies have compared various prediction equations in assorted patient populations with indirect calorimetry.9-16 Although many prediction equations have reasonable accuracy for groups of patients, the potential error for most prediction equations in individual patients is +/- 500 calories.27 As stated above, the clinical implications of this magnitude of error are unclear.

Early prediction equations such as the Harris-Benedict equation used fixed variables such as weight, height and age. Some recent prediction equations have incorporated clinical variables such as body temperature, respiratory rate, tidal volume, minute ventilation, and/or severity of injury (See Table 1 for a summary of several commonly used fixed and complex predictive equations). The American College of Chest Physicians recommended a simple weight-based method of estimating initial calorie goals.28 More complex calorie estimation formulas generally require more time for collection of clinical variables and calculations that can change throughout the day.
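The contrast between a fixed-variable equation and a simple weight-based estimate can be sketched as below. The Harris-Benedict coefficients shown are the classic published ones, and the 25 kcal/kg figure is a commonly cited weight-based rule of thumb; both are illustrative conventions and should be checked against Table 1 and local protocols rather than taken as definitive:

```python
def harris_benedict(sex: str, weight_kg: float, height_cm: float, age_yr: float) -> float:
    """Classic Harris-Benedict resting energy expenditure (kcal/day)."""
    if sex == "male":
        return 66.47 + 13.75 * weight_kg + 5.003 * height_cm - 6.755 * age_yr
    return 655.1 + 9.563 * weight_kg + 1.850 * height_cm - 4.676 * age_yr

def weight_based(weight_kg: float, kcal_per_kg: float = 25.0) -> float:
    """Simple weight-based estimate (e.g., ~25 kcal/kg/day)."""
    return kcal_per_kg * weight_kg

# Hypothetical 80 kg, 175 cm, 60-year-old male:
hb = harris_benedict("male", 80.0, 175.0, 60.0)
wb = weight_based(80.0)
print(round(hb), round(wb))  # ~1637 vs. 2000 kcal/day
```

The gap between the two outputs for the same hypothetical patient illustrates why, absent outcome data, the choice between equations is less important than whether the resulting goal is actually delivered.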

Several studies have suggested superior accuracy for more complex prediction equations and some guidelines have favored one or another method for estimating calorie goals.9,27,28,29 Unfortunately, weaknesses of the study methods in most of the research with prediction equations severely limit any conclusions relevant to clinical practice. The vast majority of studies with prediction equations used only a single IC study per patient, measured at various points during the hospitalization for each patient. The largest validation study of predictive equations included 202 mechanically ventilated, critically ill patients and compared 17 different equations.29 Accuracy of the prediction equations was arbitrarily defined as prediction within 10% of the IC measurement. However, there was only 1 IC measurement per patient that was completed between day 2 and day 64 of the admission, between day 2 and 27 of their ICU stay, and with a sepsis-related organ failure score between 1 and 18.29

Studies of day-to-day variation in energy expenditure have established that in the early portion of a patient's admission, energy expenditure varies from day to day by as much as 46%, with a more recent study demonstrating that mean daily energy expenditure varied by an average of 31.7 (+/-22.6)%.30,31 Even in stable patients, daily energy expenditure measured by IC varies by an average of 12%.30 Translated, this means that a single IC measurement does not accurately represent the average calorie expenditure of a critically ill patient. A single IC measurement does not meet the criteria of accuracy (+/- 10%) used in most studies of prediction equations. A study of daily calorie expenditure in critically ill adults found that a single IC measurement extrapolated for 1 week has more cumulative error than several prediction equations.32 Although IC is frequently referred to as the "gold standard," it is clear that in critically ill adults, a single IC study is not a more accurate predictor of average calorie expenditure than most prediction equations. Due to the potential "error" of a single IC measurement compared to the average calorie expenditure, it is not appropriate to recommend one method of estimating calorie expenditure over any other based on studies that used a single IC study.

In one study using daily indirect calorimetry, one prediction equation that used maximum body temperature and expired minute ventilation as part of the calculation estimated daily calorie expenditure with acceptable accuracy (compared to the daily indirect calorimetry).32 However, the temperature and minute ventilation used for the calculation in the study were collected at the same time as the IC was completed, so it is not surprising that the calculation was similar to the indirect calorimetry. Temperature and minute ventilation vary over the course of the day, and there are no studies of prediction equations using clinical variables in which the calculations are done blinded to the timing and results of the indirect calorimetry. Additionally, the investigators did not report the actual amount of nutrition received by the patients, nor discuss how the equation chosen to estimate calorie goals may affect the actual amount of nutrition provided to the patient. A recent observational study demonstrated that patients who were in the ICU > 4 days and had their calorie goals determined with only weight-based equations had a significantly shorter time to discharge alive than patients who had calorie goals determined with more complex calculations.33 As stated above, no cause and effect conclusions can be made from observational studies, but this association between complex calculations and worse patient outcomes highlights the need for outcome data before any method of calculating calorie goals can be recommended above another.

A Practical Issue: Calorie Prediction Versus Calorie Delivery

Compounding the inaccuracy of predictive equations, and the daily variability of energy expenditure, is the issue of how much nutrition a critically ill patient actually receives each day. There are a number of studies documenting that many critically ill patients receive only a portion of the nutrition support that is ordered, and the amount of nutrition provided can vary greatly from day to day.31-35 Only a single study has investigated the difference between daily energy expenditure and the amount of nutrition actually received by the patient.31 The study established that even when energy expenditure was measured each day, and daily adjustments made to feeding rates, the actual cumulative energy balance ranged from −6702 to +4791 kcals.31 This large day-to-day variation in the amount of nutrition actually received suggests that the much smaller "statistical" difference in accuracy between different prediction equations is likely clinically irrelevant. There are no adequate data to support that the use of a particular prediction equation results in improved nutrition delivery, or improved patient outcomes.
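The bookkeeping behind a cumulative energy balance of this kind is a simple running sum of delivered minus expended calories across ICU days. A minimal sketch, with entirely hypothetical numbers chosen only to show how a large deficit accumulates during a typical feeding ramp-up:

```python
def cumulative_balance(expenditure_kcal, delivered_kcal):
    """Running sum of (delivered - expended) kcal across ICU days."""
    balance, running = [], 0.0
    for spent, given in zip(expenditure_kcal, delivered_kcal):
        running += given - spent
        balance.append(running)
    return balance

# Hypothetical week: measured daily expenditure vs. nutrition actually received.
spent = [1800, 2100, 1900, 2000, 2200, 2000, 1950]
given = [400, 900, 1200, 1500, 1800, 2000, 2000]
print(cumulative_balance(spent, given)[-1])  # cumulative balance: -4150.0 kcal
```

Tracking this running total day by day makes the gap between prescribed and delivered nutrition explicit, which is often far larger than the error of any prediction equation.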

In recent years, some studies have focused on enhancing the delivery of nutrition rather than the precision of the initial calorie goal. A study that provided intensive nutrition support with increased feeding rates (150% of goal initially) delivered significantly more calories than a cohort fed according to standard protocols.36 However, the intensive nutrition support group had a significantly increased duration of ICU stay compared to the standard group. Braunschweig et al, in a randomized study of intensive feeding delivery that increased feeding rates to compensate for "missed" feedings, reported significantly increased mortality in patients with acute lung injury who received increased nutrition.37

A cohort type study (cluster randomized) demonstrated that enhanced enteral feeding protocols (early protein supplements, prophylactic prokinetic medications, volume based feeding) combined with a nursing education program significantly increased protein and calorie delivery compared to control facilities.38 The enhanced feeding protocols combined with nursing education (the PEP uP protocol) did not significantly improve patient outcomes, which reinforces the notion that modest changes in calorie delivery do not appear to affect patient outcome.38 The results of these studies36-38 indicate the need for better quality randomized studies of patient outcomes before wide-scale adoption of intensive feeding protocols or enforced delivery of full nutrition needs in the early phases of critical illness.

Clinical Considerations

Although there are insufficient data to make strong recommendations for any particular method of estimating calorie expenditure, the reality is that there needs to be some method of establishing initial nutrition goals for patients who receive nutrition support. Time-efficient weight-based calculations, or more complex calculations, are sufficient to prevent gross overfeeding. The actual amount of nutrition provided to patients should be monitored, and if patients are not receiving initial calorie and protein goals, the obstacles to providing reasonable amounts of nutrition in a safe manner should be addressed. Feeding schedule, rate, caloric density of the formula, supplemental protein, alterations in position of the tip of the feeding tube, prokinetic medications, and/or alterations to the bowel regimen are some of the considerations that may need to be modified. The difference between the nutrition goals and the amount of nutrition provided, especially with enteral nutrition, often dwarfs any difference in the accuracy of prediction equations.31,34,35 After the most acute phase of critical illness has passed, calorie goals may need to be increased to allow positive nutrition balance and improvements in nutrition status, at a time when patients are capable of compensating for the period of catabolism and are less likely to have complications from overfeeding.

Patients with a very low BMI (<15 kg/m2) may have their calorie expenditure underestimated by many prediction equations, because they have a greater amount of metabolically active mass per kg, compared to normal weight individuals.39,40 Patients with a very low BMI may also be more subject to negative consequences if they receive hypocaloric nutrition support for any length of time. After concerns for possible refeeding syndrome have been addressed, and the most acute phase of critical illness has passed, patients with severe malnutrition and/or low BMI may require a calorie level that exceeds their needs to improve nutrition status.

Patients with obesity (BMI > 30 kg/m2) appear to benefit from hypocaloric feedings when full protein is provided.41 The details of hypocaloric feeding in obese patients are beyond the scope of this paper, but have been addressed elsewhere.42 One of the few prospective studies to address outcomes with hypocaloric feeding in obese patients reported improved patient outcomes using weight based calculations (20 kcals/kg of adjusted body weight) as the goal for hypocaloric feeding.41

CONCLUSIONS

Indirect calorimetry is the most accurate method of estimating resting energy expenditure, but it is misleading to suggest that IC is a “gold standard” for establishing nutrition goals of critically ill adult patients. Currently, there are inadequate outcome data from randomized studies to support the routine use of IC or a particular prediction equation. Available evidence suggests that in the early stages of critical illness, a single IC study is no more accurate than commonly used prediction equations for estimating average calorie expenditure. Based on the best evidence to date, modest calorie deficits within the early stage of critical illness do not appear to influence outcome, and hypocaloric feedings with full protein and micronutrients may be advantageous in some populations. There are some data to suggest that intensive efforts to make up for “missed” feeding may compromise outcome, and large randomized studies are needed before implementing intensive feeding protocols in routine clinical practice.

In patients receiving EN, the difference between nutrition goals and the actual nutrition provided far overshadows the small differences between various prediction equations. Simple weight-based prediction equations appear safe and adequate to prevent gross under- or overfeeding. Attention to the actual nutrition provided to patients, and addressing barriers to meeting basic nutrition goals, helps prevent large cumulative nutrition deficits.


A Case Report

Infected Microabscesses In Congenital Hepatic Fibrosis and Subsequent Treatment With Liver Transplantation


INTRODUCTION

Congenital hepatic fibrosis (CHF) is a developmental malformation of the hepatic ductal plate with associated renal disease, most commonly autosomal recessive polycystic kidney disease (ARPKD).1 A minority of patients with CHF/ARPKD come to medical attention in adulthood with liver-related complications.2 The clinical presentation of the hepatic disease is dependent on the presence or absence of portal hypertension and/or biliary disease.3

The most common manifestations are related to portal hypertension, including splenomegaly, cytopenias and esophageal varices.1,3,4 A subset of patients may also develop cystic dilatation of the bile ducts (Caroli syndrome), which increases the risk of recurrent cholangitis and microabscesses.5

We present a patient who underwent extensive treatment for liver microabscesses and ultimately required liver transplantation for total eradication.

PRESENTATION

A 49-year-old female with CHF/ARPKD, three years post renal transplant on immunosuppressant therapy, presented to the emergency department in late August 2006 with right-sided abdominal pain and diarrhea. In the emergency department she was febrile, had abnormal liver enzymes (elevated bilirubin and transaminases) and thrombocytopenia with disseminated intravascular coagulation. She was admitted to the intensive care unit with septic shock and was placed on broad-spectrum antibiotics. Blood cultures yielded gram-negative rods, further identified as a Salmonella serogroup B infection. Colonoscopy did not reveal significant abnormalities. She underwent repeated abdominal computed tomography (CT) scans, which did not discover any hepatic abscess formation (Figure 1). Throughout her hospitalization, the patient had persistent fever. Positron emission tomography (PET) scan noted increased uptake at the level of the right lobe of the liver. Liver biopsy revealed the presence of microabscesses, as well as features of cholangitis and bile stasis (Figures 2A and 2B). The patient received three weeks of intravenous (IV) levofloxacin and imipenem. Her fevers persisted, so magnetic resonance cholangiopancreatography and endoscopic retrograde cholangiopancreatography were performed. These tests revealed patency of the common bile duct, where a stent had previously been placed. The patient was discharged home on antibiotics after twenty-four days despite persistent low-grade fevers.

One day after discharge, she was readmitted for a fever of 38.6 degrees Celsius. She was reevaluated and underwent blood cultures, transesophageal echocardiogram and a bone marrow biopsy, all of which were negative. A repeat PET scan confirmed the initial PET scan findings. The patient was stabilized and discharged despite continued low-grade fevers.

Within 10 days, the patient experienced intermittent fever up to 39.4 degrees Celsius. Blood cultures grew Enterobacter and Enterococcus, and she was admitted. A Flavobacterium empyema was discovered in the left upper lobe of her liver. The infections were felt to be secondary to persistent cholangitis. Doppler ultrasound of the abdomen showed an enlarged liver with diffuse echotexture, an enlarged spleen with splenic varices, inferior vena cava thrombosis with probable partial portal vein thrombosis and retrograde flow, and a biliary stent in place. The workup suggested a biliary origin of her infections, so she underwent a cholecystectomy and liver biopsy, which revealed extensive portal fibrosis and marked bile duct proliferation, consistent with CHF. She ultimately defervesced on imipenem and was discharged on ertapenem.

Just six days after discharge, the patient returned with fevers up to 39.9 degrees Celsius. PET scan on this admission revealed a change in location of the increased metabolic activity of the liver compared to prior scans (Figure 3). There was no evidence of endocarditis on repeat transesophageal echocardiogram. Blood cultures again showed the presence of Enterococcus bacteremia. Ertapenem, and subsequently a combination of daptomycin, levofloxacin and metronidazole, failed to resolve her persistent fevers. Imipenem was initiated with good results, and she was discharged on IV imipenem for three weeks.

The patient did well for the next year, but in 2008 she presented with recurrent episodes of Klebsiella bacteremia. At this time, she pursued liver transplantation but was not deemed to be a candidate. She later developed new-onset ascites, cirrhosis, portal hypertension, hepatosplenomegaly with varices and chronic pneumobilia. She had multiple admissions with vancomycin-resistant Enterococcus (VRE) bacteremia and subsequent Klebsiella sepsis with pancytopenia, acute renal failure, shock and disseminated intravascular coagulation. The patient was then placed on chronic suppressive therapy with cefditoren.

In 2009, the patient received a liver transplant. The explanted liver showed the presence of persistent microabscesses despite extensive preoperative antibiotic treatment (Figures 2C and 2D). After the procedure, she did well. Her antibiotic therapy was discontinued and she no longer experienced recurrent fevers, nor has she been admitted for bacteremia or sepsis.

DISCUSSION

CHF/ARPKD is a rare, inherited hepatorenal fibrocystic disease with an estimated frequency of 1 in 20,000 live births.6,9 Most patients with CHF/ARPKD present in the perinatal period with oligohydramnios secondary to decreased fetal urine output and related hypoplastic lungs, but others can present as late as the fifth or sixth decade of life, when the clinical symptomatology is dominated by renal failure, hepatic dysfunction or both.6 Both kidney and liver disease are progressive, and almost all patients with ARPKD have some degree of CHF at birth; however, some individuals develop portal hypertension and its manifestations, such as hypersplenism with thrombocytopenia, splenomegaly, gastroesophageal variceal bleeding or cholangitis of variable severity, as they age.1-4,6 When biliary ectasia is present in the setting of CHF, clinical manifestations may result from biliary stone formation, cholangitis and occasionally liver abscess.7-9

The defective gene in CHF/ARPKD, PKHD1/fibrocystin on chromosome 6p21.1, is expressed in the primary cilia of bile duct and renal tubular epithelium.2,3,5,6 Dysfunction of fibrocystin causes abnormal ciliary signaling, leading to disruption in regulation of proliferation and differentiation of renal and biliary epithelial cells.6 This causes a hepatic ductal plate malformation, which results in increased numbers of dilated and irregular, tortuous bile ducts in a ring around the periphery of a portal tract.3,9

As a treatment, CHF/ARPKD patients are candidates for liver, renal or combined liver and renal transplantation.3,5,8 Definitive indications for combined liver and renal transplantation in CHF/ARPKD include the combination of renal failure and either recurrent cholangitis or refractory complications of portal hypertension.3 Liver transplantation alone may be considered if there is a single, well-documented episode of cholangitis or marked abnormalities in the biliary system.3

Although CHF primarily involves portal areas and preserves hepatic synthetic function,10 this patient eventually developed portal hypertension, which progressed to cirrhosis and its complications. Our patient represents an example of successful, long-term management of CHF and ARPKD. First, she needed a kidney transplant due to her ARPKD. At that time, her hepatic synthetic function was adequate, so a combined liver transplant was not warranted. Over time, she developed recurrent infections without an obvious source other than the liver. This presentation is very similar to the recurrent cholangitis episodes seen in primary sclerosing cholangitis. Given her CHF and immunosuppressed status, we believe that her infections resulted from infection of microscopic bile lakes in the liver. PET imaging revealed resolution of the high-intensity signal after antibiotic treatment of each repeated infection. She faced many severe, life-threatening septic episodes and prolonged hospitalizations. Long-term antibiotic therapy helped control her septic events. Eventually, however, she developed advanced fibrosis with cirrhosis and a MELD score of 26, which permitted liver transplantation.

Documented examples of infection regarding microscopic bile lakes leading to microabscesses were not found upon review of literature. This absence may result from under-diagnosis, complications resulting from immunosuppression, death resulting from complications of portal hypertension or early combined liver and kidney transplantation.

Our case represents a unique opportunity to learn that patients can present with infected microabscesses of the liver, which may be extremely difficult to eradicate. In this case, liver transplant presented as a viable option to eradicate the infection by removing the infected foci as a whole, and should be considered in similar cases.

Significant contributions were made by the following: James M. Small MD PhD of UniPath, PC, Denver, CO; Matthew Seto DO of Rocky Vista University College of Osteopathic Medicine, Parker, CO; Simeon Abramson MD of Radiology Imaging Associates, PC, Littleton, CO; Bethany E. Ho of Scripps College, Claremont, CA; Ryan Barmore of the University of CO School of Medicine, Aurora, CO.

Special thanks to S. Russell Nash MD PhD of Colorado Gastrointestinal Pathology for digital photography assistance and Andrew M Ho MD for computer imaging, technical medical assistance, and editing.


Inflammatory Bowel Disease: A Practical Approach, Series #89

Inflammatory Bowel Disease: The Bellevue Experience


The Bellevue IBD clinic serves a radically different patient population than is treated in most US healthcare settings, represented in most clinical trials, or reflected in current management guidelines. Here we discuss the complexity of providing care to these individuals, areas of disparity in IBD care, and resources to assist our patient population.

INTRODUCTION

Bellevue Hospital Center (Bellevue), the oldest public hospital in the United States, was founded in 1736 as a six-bed infirmary utilized to quarantine ill patients. In 1811, the hospital moved to its current location on Manhattan’s east side, when New York City (NYC) purchased the Kips Bay farm and began construction of an almshouse (the “Bellevue Establishment”), consisting of two pavilions for men and women, respectively.1 Bellevue has been known as an institution willing to serve the poor and underserved since its creation.2 Currently, Bellevue is a level 1 trauma tertiary care teaching hospital in New York City’s Manhattan borough, serving as a major referral center for complex medical cases citywide.1, 2 It serves as the flagship for the NYC Health and Hospitals Corporation (HHC), which is the largest public healthcare delivery system in the United States, overseeing 11 acute care hospitals.3 Bellevue has been affiliated with NYU School of Medicine since 1847.

Bellevue’s mission is “to provide the highest quality of care to New York’s neediest populations and to deliver health care to every patient with dignity, cultural sensitivity and compassion, regardless of ability to pay.”2 There are 828 beds in operation, with nearly 528,000 clinic visits, 125,000 emergency room visits, and approximately 32,000 inpatient discharges annually.1 According to New York State healthcare center cost records from 2010, HHC hospitals provided the highest proportion of care to uninsured, self-paying patients.2

Bellevue’s patient population resides mainly in the following communities: Southern Manhattan, Northern Brooklyn, and Western Queens. The Bellevue patient population differs from that of NYC as a whole. In NYC, 63.4% of individuals are younger than age 65, while 67.1% of Bellevue’s patients are younger than age 65. The racial background of Bellevue’s patients is as follows: 40% White, 38.1% Hispanic, 21.9% Black, 15.4% Asian, and 22.5% other.2, 4 In addition, poverty rates of families cared for at Bellevue, especially those with children, are significantly higher than those of NYC overall. At Bellevue, 22% of families and 31% of families with children live below the federal poverty guideline. In NYC overall, these numbers are 16.7% and 24.4%, respectively.2, 4, 5

Prior to implementation of the Affordable Care Act, 45% of Bellevue outpatient visits were covered through Medicaid and 31% of Bellevue outpatient visits were self-pay. This degree of poverty goes hand in hand with the difficult social circumstances facing this patient population and adds to the complexity of providing care to these individuals. In addition to facing high poverty rates, the population served by Bellevue consists of individuals with lower education levels than patients of surrounding city hospitals.

Bellevue has long offered a Gastroenterology Clinic for patients who are in need of screening, diagnostic, and/or treatment services in the area of digestive diseases. More recently, due to a growing number of patients presenting with suspected or diagnosed inflammatory bowel disease (IBD) along with the complex nature of IBD treatment, Bellevue implemented an Inflammatory Bowel Disease Clinic, specifically to serve patients with Crohn’s disease and ulcerative colitis.

IBD Epidemiology

IBD is a chronic, relapsing and remitting intestinal condition with increasing incidence worldwide. Currently, ulcerative colitis has an incidence of 2.2-19.2 cases per 100,000 person-years and Crohn’s disease has an incidence of 3.1-20.2 cases per 100,000 person-years.6 It is a disease of the developed world and typically affects those originating from Northern Europe and North America. The incidence and prevalence of IBD appear to be lower in Asia, the Middle East, and Central and South America, likely reflecting both genetic and environmental differences between these patient populations. While the incidence of IBD has plateaued in the Western world, areas previously unaffected by IBD are exhibiting an increase in this disease.7, 8 Due to the makeup of the Bellevue patient population, we are now seeing more individuals presenting with Crohn’s disease and ulcerative colitis at our hospital. Few studies have evaluated the IBD phenotype of the more recently affected patient populations; to date, no clear pattern has emerged in age at diagnosis, disease distribution, or severity.9, 10

Areas of Disparity in IBD Care

Patients with IBD typically require close surveillance by their care providers for management of symptoms, adjustment of medication regimens, and endoscopic interventions. Moreover, individuals with this disease face unique barriers to accessing medical care that greatly impact their clinical outcomes and prognosis. Studies have shown that race and socioeconomic status weigh heavily on the type of care patients with chronic diseases receive.11 Sewell and colleagues performed a systematic review studying the role of socioeconomic status and race in the quality of care delivered to patients with IBD. They specifically focused on differences in utilization of medical and surgical therapies, rates of adherence to therapy, clinical outcomes, access to healthcare, utilization of healthcare, patients’ individual perception and knowledge about IBD, employment rates, and medical insurance status. The authors hypothesized that patients who were non-white and of lower socioeconomic status would receive less effective and lower quality care for their IBD. The majority of studies in this review found that white patients were more likely to be treated with immunomodulators and infliximab when compared with non-white patients, despite the severity of illness being reported as similar.11 The majority of studies also demonstrated disparities in surgical care among different racial and socioeconomic groups, with minority and impoverished patients less likely to undergo colectomy when compared with white, affluent patients.11, 12 Medication compliance is a particularly important issue in the management of IBD. This review found that black patients were less likely to adhere to prescribed medications and more likely to discontinue medications if they perceived subjective improvement of symptoms. They also noted that black patients understood less about the nature of their disease.11-15

With the increased incidence of IBD being seen among minority patients, this information is highly relevant to the Bellevue patient population.3, 4, 10, 11 Although studies evaluating outpatient utilization of IBD healthcare resources among different races have yielded different results, follow-up rates have been universally and unequivocally low at Bellevue. Patients have been lost to follow-up due to barriers such as difficulty paying for medications, perceived difficulty paying for visits, employment and familial obligations, and substance abuse.

Thus, the need for an IBD Clinic at Bellevue was identified not only because of the growing numbers of Bellevue patients presenting with IBD, but also due to potential barriers to care and the complexity of the diagnostic, treatment, and surveillance care required for these individuals.

IBD at Bellevue: The IBD Clinic

In late 2011, Bellevue opened the first clinic within the NYC HHC system dedicated solely to the comprehensive evaluation of patients with IBD. The IBD Clinic includes two physicians who specialize in IBD and fosters a collaborative multidisciplinary approach to the care of this patient population. The IBD Clinic coordinates care with physicians and other healthcare professionals from other departments and divisions, including infectious diseases, general and colorectal surgery, rheumatology, endocrinology, radiology, pathology, nutrition, and psychiatry.

In terms of the IBD patient population being seen in the Bellevue system, during an 18-month span between 2012 and 2013, 218 patients were referred through the gastrointestinal clinic system with a diagnosis of Crohn’s disease or ulcerative colitis based on the ICD-9 codes 555 and 556. Since its inception, the IBD Clinic patient volume has grown steadily, and the Clinic currently treats 110 patients on an ongoing basis. Most patients are referred from within the medical clinic system at Bellevue, from other HHC hospitals, or by providers outside the institution after treatment in an acute care setting. Roughly half of the patients have ulcerative colitis and half have Crohn’s disease. Due to economic barriers, delayed diagnosis, and low health literacy, a disproportionately high percentage of our Bellevue patient population is treatment-naïve. The IBD Clinic population consists of approximately 40% Hispanic, 25% Asian, 20% Caucasian, and 15% African/African-American patients. Since the majority of these patients are uninsured or underinsured, it became necessary for the Clinic to pioneer access to medical care that would otherwise be financially prohibitive.

IBD at Bellevue: Addressing Barriers to Care

As part of the specialized care provided by the IBD Clinic at Bellevue, several barriers to care have been identified and continue to be addressed (Figure 1). Three major factors include significant language barriers, limited support staff for the IBD Clinic, and the patients’ inability to pay for medications and other treatment needs.

First, English is not the primary language for a large proportion of the Bellevue population. For non-English speaking patients, coordinating care and gaining a true understanding of IBD and its treatment can be difficult. To help resolve potential language barriers, Bellevue offers a full phone interpretation service as well as in-person interpreters. Despite these services, gaps in communication can still occur and the IBD Clinic is exploring additional options to further address this issue.

Second, the IBD Clinic currently has limited support staff to assist with patient scheduling, phone calls, and paperwork. Moreover, patients may be scheduled for appointments without confirming their availability, resulting in a high no-show rate to the Bellevue clinics, which in turn can generate long wait times to be seen, in some cases greater than 90 days. This issue has been identified and plans are underway to address this need.

Finally, patients with IBD have significant treatment and care needs, both acute and ongoing, but many lack the financial resources to obtain their medications and other treatment services. To assist patients in this area, the IBD Clinic works with several organizations that address the needs of underserved patients.

Financial Resources for Our Patient Population

To assist patients who lack the funds to obtain the needed medications and supplies for the treatment of their IBD, the IBD Clinic has found great benefit in working with external organizations that help underserved patients. These organizations provide assistance ranging from free medications to reduced-cost ostomy supplies to assistance with insurance premiums and co-pays (Table 1).

For example, Needy Meds (www.needymeds.org) is an organization that has been helpful in assisting our underserved patient population with access to select prescription medications. The mission of the organization is stated as follows:

“to be the best source of accurate, comprehensive and up-to-date information on programs that help people facing problems paying for medications and healthcare; to assist those in need in applying to programs; and to provide health-related education using innovative methods.”

Founded by a physician and social worker in 1997, Needy Meds assists patients who are unable to receive prescribed treatment through the hospital by providing prescriptions at a significantly reduced fee. The Needy Meds website notes a number of useful resources available to patients and providers, including a listing of reduced fee imaging centers and active governmental healthcare programs.

One such resource includes patient assistance programs (PAPs), which are typically sponsored by pharmaceutical companies and provide medications free of charge or at a significantly reduced fee for patients who earn a low to moderate income and are uninsured or underinsured. Eligibility and requirements for the programs differ by drug and all information can be viewed at www.needymeds.org.

Additionally, discount drug cards are available at needymeds.org and can be printed for patients. These drug programs typically save patients up to 80% off the listed price and can be used at over 63,000 pharmacies nationwide. The card can be used in the following scenarios: in lieu of a patient’s insurance when there is a costly co-pay and the reduced fee is more manageable; when the plan has a high deductible; when the drug is not covered by the patient’s insurance plan; or for patients in the Medicare Part D “donut hole.” The card cannot be used in conjunction with insurance. There are no income, insurance, or residency requirements for use of the card. Pricing, coupons, and rebates can be found at: http://www.drugdiscountcardinfo.com/disclaimer.htm.

Other resources for the uninsured or underinsured with IBD include the Health Well Foundation, which provides financial assistance to eligible individuals to assist with co-insurance, co-pays, healthcare premiums, and deductibles for certain medications and therapies; the Osto Group, which provides ostomy supplies for the uninsured (patients pay for shipping only); and the Oley Foundation, which provides an equipment/supply exchange to patients that require total parenteral nutrition.

CONCLUSION

The Bellevue IBD Clinic serves a radically different IBD patient population than is treated in most US healthcare settings, represented in most clinical trials, or reflected in current management guidelines. To ensure the standard of care in IBD diagnosis and treatment for this underserved patient population is met, barriers to care must continue to be identified and addressed. Great progress has been made since the Bellevue IBD Clinic’s inception in 2011, and plans are underway for further expansion of this much needed clinical program for underserved patients with this complex disease.


Nutrition Issues in Gastroenterology, Series #130

Nutrition Guidelines for Treatment of Children with Eosinophilic Esophagitis


Eosinophilic Esophagitis (EoE) is a chronic immune-/antigen-mediated disease characterized by clinical symptoms and histological changes induced by environmental and/or dietary triggers. Dietary intervention for EoE includes food eliminations, which can put patients at risk for poor nutrition intake. In this article, we discuss the integral role of a registered dietitian in assessing growth and micronutrient intake and in providing guidance for implementing any elimination diet. Understanding the typical presentations of patients with EoE helps to determine nutrition risk, and targeted evaluation, education, and regular follow-up are key.

BACKGROUND

Since the original description of eosinophilic esophagitis (EoE) in the early 1990s, a number of guidelines for the diagnosis and treatment of the disease have been developed.1-3 EoE is defined as a chronic immune-/antigen-mediated disease of the esophagus characterized by both clinical symptoms and histological changes. Estimated prevalence of EoE is at least 56.7 per 100,000 persons, and the disease is more common in males than females.4 Symptoms of EoE vary with age of presentation, from feeding refusal in infants and toddlers to dysphagia and abdominal pain in adolescents. The histological finding in patients with EoE is increased eosinophilic infiltration of the esophageal lining (>15 eosinophils per high-power field) that is not attributable to reflux or any other cause of inflammation. At this time it is understood that EoE is a chronic disease that is not usually outgrown.

EoE and Food Allergies

Explanations of the three major categories of food allergies and intolerances are summarized in Table 1. To date, the exact pathophysiological mechanism(s) by which food allergies cause EoE are not certain, but it is likely that both IgE-mediated and non-IgE-mediated processes are involved.3 Clinical evidence supporting a role for food allergies as an underlying cause of EoE has been provided using three dietary approaches: an elemental diet, an empiric approach, and a tailored diet. The elemental diet involves complete avoidance of food, with intake of only elemental (amino acid-based) formula for complete nutrition (see Table 2). The empiric approach provides a diet free of the top six (milk, soy, egg, wheat, peanut/tree-nut, fish/shellfish) or top four (milk, soy, egg and wheat) most likely allergens, with or without elemental formula supplementation. Lastly, the tailored diet is crafted for the individual using allergen test data and a detailed history; it may or may not exclude the top known allergens, also with or without elemental formula supplementation. The benefits and challenges of each dietary approach are outlined in Table 3.

Dietary therapy was first identified as effective treatment with a diet consisting of only an elemental formula, which established clinicopathological remission in children with EoE.5,6 Later, an empiric approach removing the six most common food allergens in the United States (milk, soy, wheat, egg, fish/shellfish, peanut/tree-nut) yielded similar results.7 Further, empiric elimination of only the top four most common food allergens (milk, soy, wheat and egg) has also proven effective.9,10 The third commonly used diet therapy, the tailored diet, is documented to be the least effective of the dietary treatments, producing similar clinicopathological remission but in a smaller percentage of patients.11 In comparison, the literature suggests that the elemental diet remains the most effective, followed by empiric elimination and finally the tailored diet approach.3 Taken together, these studies demonstrate the role of food antigens in the pathogenesis of EoE and a high response rate with each approach chosen. The clinician should use this information to help decide which treatment approach would be best, or most feasible, for each individual patient or family.

There are obvious practical limitations to each of the three dietary approaches. The elemental diet may pose the greatest challenge, especially for those who previously ate a regular diet. Though improvements have been made in the palatability and variety of available amino acid-based formulas (Table 2), they can also be expensive and difficult to maintain as the exclusive source of nutrition. The empiric elimination diets can be hard for those with limited access to allergen-free alternatives, while families with skill and affinity for home cooking tend to report an easier transition to this diet. Tailored elimination diets may pose challenges similar to those of the empiric elimination diet; in some cases, however, the foods removed in a tailored diet are more difficult to identify on food labels, since current regulations on food labeling only require clear indication of the presence of the top six allergens. With any of the dietary approaches, it is ideal to work with the patient and family to provide the guidance needed to make the changes work for them.

EoE Nutrition Risks

A number of challenges exist if any of the dietary treatments for EoE are pursued. Nutritional adequacy, medication-nutrient interactions, practical implementation challenges, and costs all play a role. Because of these potential barriers to success, the expertise of a pediatric dietitian specializing in food allergies is ideal.

Nutrition risk associated with presentation of disease differs by age of presentation. Younger patients are more likely to present with feeding difficulties, while adolescents and adults are more likely to present with dysphagia and food impactions. Table 4 delineates the most common presentations of EoE and how they may result in nutrition risk. The presenting symptoms associated with EoE may require special attention prior to initiating a dietary treatment. For example, in the case of a young child presenting with poor growth, it may be ideal to establish adequate weight gain and linear growth prior to initiating a dietary treatment of eliminating foods in the diet.

Prior to starting an elimination diet, a full diet assessment is warranted to identify inadequacies in the current diet. If any are identified, it would be important to address them within the confines of the prescribed diet. Following this, new deficiencies that may develop as a result of the prescribed diet would also need to be addressed. For example, elimination of dairy from a toddler’s diet may result in lower intake of calcium, vitamin D, fat, and possibly vitamins A and E. Providing education that targets replacement of key nutrients in the diet is imperative. Table 5 describes a few potential nutrients impacted by food elimination diets. Laboratory evaluation may be necessary to determine whether supplementation is needed and, if so, how much. When prescribing supplements, the clinician must take extra care in ensuring that the supplement is also allergen-free. Confirming the active and inactive ingredient content of over-the-counter supplements with the manufacturer or pharmacist should be done to be sure the supplement does not contain the allergens the patient is avoiding.

Patients prescribed an elemental diet will require recommendations for the goal amount of formula needed to meet calorie, protein and micronutrient needs. In addition, recommendations for fluid and electrolyte intake should also be provided, as some patients choose to concentrate the formula to combat early satiety while keeping caloric intake at goal. For example, a standard complete amino acid-based formula will contain 20-22 mEq of sodium per 1000 calories. If a 20 kg, 6-year-old child takes this formula at a goal of 65 calories/kg/day, he or she will receive only about 1.4 mEq/kg of sodium, whereas the recommended range is 2-6 mEq/kg/day. It is important to ask how the patient/family is mixing the formula to ensure the patient is receiving adequate macro- and micronutrients.
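The arithmetic behind this example can be sketched as follows. This is a minimal illustration only; the 21 mEq/1000 kcal figure is simply the assumed midpoint of the 20-22 mEq range quoted above, not a value from any specific product label.

```python
# Estimate daily sodium intake for a child on an exclusive
# amino acid-based formula, using the worked example from the text.

weight_kg = 20              # 6-year-old child in the example
kcal_per_kg_goal = 65       # caloric goal, calories/kg/day
sodium_per_1000_kcal = 21   # mEq; assumed midpoint of the 20-22 mEq range

daily_kcal = weight_kg * kcal_per_kg_goal                    # 1300 kcal/day
daily_sodium_meq = daily_kcal / 1000 * sodium_per_1000_kcal  # ~27.3 mEq/day
sodium_meq_per_kg = daily_sodium_meq / weight_kg             # ~1.4 mEq/kg/day

recommended_low, recommended_high = 2, 6  # mEq/kg/day, from the text
print(f"Estimated sodium intake: {sodium_meq_per_kg:.1f} mEq/kg/day "
      f"(recommended {recommended_low}-{recommended_high} mEq/kg/day)")
```

As the calculation shows, the formula alone falls well short of the recommended sodium range, which is why explicit fluid and electrolyte recommendations should accompany the formula prescription.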

Empiric or tailored elimination diets can pose nutrition risk if appropriate dietary substitutes are not recommended or not accepted. When removing wheat from a child’s diet, attention should be paid to sources of B-vitamins through alternative grains or wheat-free fortified grain products. If milk is removed, a dairy-free fortified alternative should be provided or a supplement identified to meet Calcium and Vitamin D needs. In some cases, an amino acid-based formula can accompany the empiric or tailored elimination diet to better meet nutrient needs.

Children with EoE may also be treated with other medical treatments such as proton-pump inhibitors for concomitant gastroesophageal reflux disease. An emerging body of data suggests that some adults may be prone to problems with calcium absorption and potential bone mineralization when on long-term proton-pump inhibitors.12 Topical swallowed steroids carry potential, but unmeasured, risks including decreased bone mineralization.3 Thus, monitoring of adequate calcium and Vitamin D intake is important for pediatric EoE patients on these medications.

A meta-analysis of studies evaluating the nutrition risks of patients on multiple food allergy elimination diets highlighted the benefits of dietitian counseling for these patients.13 This analysis suggests that children with multiple food allergies are at higher risk for poor growth and inadequate vitamin and mineral intakes. Though the number of studies in this analysis was few, it does indicate that monitoring and evaluating the growth and nutrient intakes of these children is of the utmost importance.

Role of the Clinician in EoE Treatment

The clinician (often a dietitian) plays several integral roles in the management of EoE related to instituting a therapeutic diet. This role includes providing guidance for an individualized dietary plan. Figure 1 identifies the different nutrition evaluation components for each stage in treatment. The initial evaluation should include a review of the full diet, including actual intakes for each food group, preferred textures, possible preferences for liquids, and the ability to swallow pills (if a multivitamin supplement is needed). Objective evaluation metrics should include previous growth patterns and any parent concerns for growth, development, or physical findings.

When dietary therapy is initiated, it is most often not the “final” diet prescription for the patient. With the direction of the allergist and gastroenterologist, foods are methodically reintroduced until the maintenance diet is reached. In some cases, patients will “fail” reintroductions, and when no additional foods or food groups will be introduced, the patient is deemed to be on their maintenance diet. Most patients are able to successfully reintroduce foods and will have a very limited number of foods removed on their maintenance diet.10 To achieve nutrition goals, the dietitian should provide dietary education for food elimination as needed, including skills for label-reading, substitutions, cooking, supplementation (vitamins, minerals and/or formula as needed) and texture modification. When providing diet education, one should consider that dietary adherence can be challenging and many of the barriers to adherence can be addressed by the dietitian and the care team.14

Because the patient with untreated EoE, or EoE not yet in remission, may need softer textures, the dietitian should aim to help the patient achieve adequate intake within the limits of the patient’s current feeding ability. While this should not be a long-term issue, it is important to meet nutrition goals in the interim to bridge the patient to his or her goal feeding skill with adequate nutrient intake. Table 6 shows examples of different ways to prepare some common foods based on the patient’s ability or comfort level with textures.

In the maintenance phase, the dietitian should focus on helping the patient achieve a reasonable variety in the diet. As much as possible, the diet should be designed to help meet all macro- and micronutrient needs for growth as well as fluid and electrolyte needs, with supplementation as necessary. Patients and families should be educated in recognizing signs of disease recurrence such as increased or excessive water to drink with meals to wash food down, preference for softer foods or purees, and avoidance of social eating. These signs are important for recognizing when the disease may not be controlled and they should see their doctor for evaluation. Lastly, while in the maintenance diet phase, periodic review of the patient’s growth is important.

CONCLUSION

The clinician plays an integral role in the assessment and treatment of patients with EoE. A thorough understanding of the three elimination diets used for EoE is essential. Expertise in the signs and symptoms of nutrient deficiency is important to guide assessment in the child with multiple food allergies. In addition, educating patients and their parents to read labels, cook with limited ingredients, and use food substitutions is an important component of the educational process. Long-term close follow-up is ideal as patients and families work through the implementation phase of elimination diets. Reevaluation of diet intake for variety and overall nutrient adequacy is necessary at each stage of treatment.


Frontiers in Endoscopy, Series #11

Endoscopic Management of Pancreatic Duct Stones Via ERCP


Endoscopic sphincterotomy (ES) has allowed the development of interventional endoscopic procedures to remove pancreatic duct stones. These endoscopic approaches were less invasive than previously developed surgical measures to treat pancreatic duct stones and rapidly gained widespread acceptance. Other endoscopic procedures such as electrohydraulic lithotripsy, extracorporeal shock wave lithotripsy, and laser lithotripsy can be used in the removal of pancreatic duct stones. This manuscript will review the current state of the art with regards to endoscopic management of pancreatic duct stones.

INTRODUCTION AND OVERVIEW OF PANCREATIC DUCT STONES

Calcium carbonate deposition combined with inadequate pancreatic ductal drainage and/or pancreatic duct strictures in the setting of chronic pancreatitis can lead to the formation of pancreatic duct stones. (Figure 1) The formation of such stones can result in inflammation and/or obstruction of the pancreatic duct and promulgate a vicious cycle whereby stones lead to inflammation and obstruction, and ongoing obstruction further promotes the formation of pancreatic duct stones. Pancreatic stones can lead to increased intraductal pressure, enhancing the symptoms of pancreatitis, most notably pancreatic-type pain.1 Pancreatic duct stones are sometimes asymptomatic, but if stones cause obstruction of the pancreatic duct, pain is most often the first symptom. Pain from chronic pancreatitis is typically epigastric but can manifest anywhere in the upper abdomen and can radiate to the back.

In addition to pain, pancreatic duct stones can be the cause of other symptoms such as pancreatic duct obstruction and severe acute pancreatitis.2 Surgical or endoscopic measures are often used in the management and treatment of pancreatic duct stones. The primary goal of these procedures is to remove the stone(s) in order to relieve pain and restore the patency of the pancreatic duct.3

Endoscopic sphincterotomy (ES) has allowed the development of interventional endoscopic procedures to remove pancreatic duct stones. These endoscopic approaches were less invasive than previously developed surgical measures to treat pancreatic duct stones and rapidly gained widespread acceptance. Other endoscopic procedures such as electrohydraulic lithotripsy, extracorporeal shock wave lithotripsy, and laser lithotripsy can be used in the removal of pancreatic duct stones.

This manuscript will review the current state of the art with regards to endoscopic management of pancreatic duct stones.

Sphincterotomy and Balloon Extraction

In order to facilitate the removal of pancreatic duct stones, a pancreatic sphincterotomy is typically the first therapeutic maneuver performed after pancreatic cannulation and deep guidewire access to the main pancreatic duct have been obtained. (Figure 2) This procedure can be done using a needle-knife sphincterotome (usually over a pancreatic duct stent) or with a standard sphincterotome passed over a guidewire. Both are equally effective and safe, with a standard sphincterotome being most commonly used. In a study of 1,000 patients by Choi and Lehman, reported complications of pancreatic sphincterotomy included acute pancreatitis (2-7%), bleeding (0-2%), perforation (< 1%), and delayed stenosis of the pancreatic sphincter orifice (up to 10%), supporting the overall safety of the procedure.2

Following pancreatic sphincterotomy, the most common first step when attempting to remove pancreatic duct stones is simple balloon extraction in a manner analogous to that used during ERCP to remove common bile duct stones. (Figure 3) An occlusion balloon is inserted into the duct until it reaches an optimal location proximal to the stone. The balloon is then inflated to an appropriate size and used to sweep the pancreatic duct, dragging stones and debris through the duct, out the sphincterotomy, and into the duodenal lumen.

Compared to other stone extraction techniques such as basket extraction and lithotripsy, balloon extraction is considered to be the safest and easiest to perform. Unlike basket extraction, an extraction balloon has little to no chance of becoming impacted in the pancreatic duct because it can be deflated. Removal of pancreatic duct stones with occlusion balloons usually works best in patients with small stones and/or pancreatic duct debris. Balloons may fail clinically if they are popped or torn on the sharp or jagged edges of pancreatic duct stones and may be of limited value in patients whose stones are above difficult pancreatic duct strictures.2

Concomitant Pancreatic Duct Stricture Therapy

The successful removal of pancreatic duct stones is often predicated on the absence of significant ductal strictures. Other key factors include the size and number of stones, the absence of impacted stones, and the location of the stones within the pancreatic duct.4 If ductal strictures interfere with stone extraction, endoscopic balloon dilation (EBD) of the stricture itself can be performed. Dilation of the pancreatic duct (either by balloon or passage dilators) followed by stone extraction can be performed in a single session in many patients.5

Endoscopic balloon dilation requires guidewire passage across the stricture, which can be difficult (or impossible) in very tight strictures.6 In addition, endoscopic catheters may not always be able to traverse pancreatic duct strictures, especially if they are highly angulated.7 In a study by Brand et al., the 7-Fr Soehendra stent retriever was evaluated as a dilator when conventional endoscopic balloon dilation was unsuccessful; successful dilation of the tight strictures was accomplished in all patients.8 If a pancreatic duct stricture is intractable, proximal stones may not be able to be removed endoscopically.

Basket Extraction

In addition to balloon extraction, basket extraction is a standard method for removal of pancreatic duct stones. A basket catheter or Dormia basket is commonly used, although a variety of baskets are available from multiple vendors. No single basket has been identified as being ideal. A basket catheter is typically made of nitinol wires. A wide range of sizes is available, with basket diameters of 1-3cm being most commonly employed. A wire connecting the tip of the basket to the handle of the device allows the basket to be opened and closed. Modern stone extraction catheters are typically double lumen, one lumen to extract the stone, and the second to accommodate a guidewire, although some single lumen devices are still available.9

Baskets may be designed to simply trap stones or may also serve as lithotripters, i.e. baskets that are able to both capture and crush stones in the duct itself prior to attempts at removal. Lithotripter baskets typically have braided wires for additional tensile strength and to reduce the risk of wire breakage during lithotripsy.

Non-lithotripter baskets can be used to trap and drag stones through the pancreatic duct and out to the duodenal lumen through the pancreatic sphincterotomy. (Figure 4) In practice, non-lithotripter baskets are uncommonly used for pancreatic duct stone removal given the risk of basket entrapment (either above a stricture, after stone capture, or both). With a lithotripter basket, after the stone has been trapped inside the basket, a metal coil (which may be advanced over a Teflon sheath or may simply serve as the sheath itself) is advanced to the base of the basket and the basket/stone complex is withdrawn into the sheath, crushing the stone and releasing the stone fragments from the basket into the duct as the lumen of the basket functionally disappears.10 The basket can then be reopened to capture and/or crush more stones and stone fragments as needed.

In a retrospective review of 69 patients by Thomas et al., endoscopic clearance of pancreatic ductal stones was successful in 90-97% of patients when basket extraction and mechanical lithotripsy were combined with pancreatic sphincterotomy. The complication rate was 11.6% (8/69 patients). Of the eight patients who experienced complications, 37.5% had single stones, while 62.5% had multiple stones, suggesting that more involved duct clearance procedures were associated with a higher rate of pancreatitis. Seven patients experienced complications involving a trapped or broken basket. Wire fracture of the basket occurred during mechanical lithotripsy in four patients. One patient experienced a pancreatic duct leak following the procedure. Complications were successfully treated with endoscopic measures including electrohydraulic lithotripsy, stenting, per-oral Soehendra lithotripsy, and extracorporeal lithotripsy.10

In a study by Hintze and Adler, 60 patients had pancreatic ductal stones large enough to require mechanical lithotripsy before removal. Three patients experienced traction wire fracture during the procedure. In 2 patients the use of a shorter metal sheath allowed for immediate resolution and continuation of the procedure. Extracorporeal shock wave lithotripsy was performed on 1 patient in order to successfully remove the stone.11 In a retrospective study of 53 patients by Smits et al., pancreatic ductal stones were fragmented using mechanical lithotripsy in 4 patients. Eight patients underwent extracorporeal shock wave lithotripsy. A pancreatic stent was placed in 28 patients. Small stones were removed with a balloon-tipped catheter. Dormia-type baskets were used to extract larger stones. If balloon and/or basket extraction techniques were unsuccessful, a nasopancreatic drain or pancreatic stent was inserted beyond the stone to allow for sufficient drainage. After a mean follow-up of 33 months, stone removal was successful in 79% of patients.12

In a study by Farnbacher et al., 125 patients with pancreatic ductal stones were retrospectively analyzed. Successful removal of ductal stones was achieved in 85% of patients. Eleven patients (8.8%) underwent mechanical lithotripsy procedures to fragment the stones before subsequent removal. There were no significant complications associated with the lithotripsy procedures. Patients with stones larger than 12 mm, stones in the tail of the pancreas, or multiple stones (2 or more) required more extensive therapeutic measures before removal could be accomplished.13

Pancreatoscopy, while not used directly during mechanical lithotripsy of pancreatic duct stones, can be used to identify the specific locations of pancreatic duct stones prior to inserting a basket into the pancreatic duct if the location of the stones is unclear on pancreatography.14 Pancreatoscopy can also be used to confirm duct clearance after stone extraction. (Figure 5)

Unlike extraction balloons, which can almost always be removed from the duct without difficulty (whether the balloon is working or has failed), stone retrieval baskets can become impacted within the pancreatic duct. Basket impaction can occur via several mechanisms, the most common of which occurs when the stone/basket complex cannot fit through the duct and be withdrawn to the duodenum. This situation is common if the pancreatic duct contains strictures, especially fibrotic strictures. Furthermore, if the ductal stones are too dense to be crushed, traction wires can fracture during attempted mechanical lithotripsy, creating a non-functioning basket with part or all of a stone still inside. Fractured traction wires can make the basket impossible to close and/or remove by pulling it back through the duct.15 In a multi-center study cited by Hlaing, a complication rate of 11.6% was associated with mechanical lithotripsy of pancreatic duct stones, often due to basket failure and/or impaction.16

Basket impaction in the pancreatic duct can be treated by the use of a Soehendra lithotripter cable (to destroy the stone/basket complex and allow basket removal) or via EHL or laser lithotripsy to try to break up the stone in the basket (theoretically allowing the empty basket to then be removed from the pancreatic duct), although the latter two devices may be very difficult to use in this situation. The actual incidence of basket impaction in the pancreatic duct is unknown as these events are likely underreported in the literature.

In a study by Sasahira et al., basket extraction was performed in 10 patients with main pancreatic duct stones of 5 mm or less. There was one case of temporary basket impaction; the basket was easily removed after the stone was released from the basket, a sphincterotomy extension was performed, and the stone was successfully extracted on a second attempt. In 5 patients a nitinol basket catheter was successfully used to remove ductal stones. A basket catheter was used to initially extract stones in the remaining 5 patients, and a balloon catheter sweep was performed to remove any residual stones. A pancreatic stent was temporarily placed in all 10 patients.9 In a study by Thomas et al., 69 patients with pancreatic stones underwent endoscopic intervention, with a reported complication rate of 0.8-5.9% associated with mechanical lithotripsy. Extraction baskets broke in 7 patients, traction wires fractured in 4 patients, and basket handles broke in 5 patients. The basket impactions were successfully resolved using EHL and extracorporeal shock wave lithotripsy (ESWL).10

Electrohydraulic Lithotripsy

Electrohydraulic lithotripsy (EHL) can be applied to pancreatic ductal stones in order to facilitate endoscopic removal. EHL applies high-energy shock waves to small areas.17 The technique uses a wire to generate a spark in a fluid filled duct. The spark generates a shock wave that propagates through the fluid and can fracture stones. EHL requires pancreatoscopy, both to access and to visualize the stones in question. EHL is directly applicable to pancreatic duct stone extraction, as this procedure can be used to fragment stones that are too dense to be successfully fragmented using mechanical lithotripsy or that have failed attempts at balloon extraction.

Thomas et al. reported retrospective results in 69 patients with pancreatic stones. Two patients with large stones that proved difficult to remove underwent EHL. Stone clearance was successful in both procedures, with no associated complications of pancreatitis.10 Attwell et al. performed a study of 46 patients undergoing various methods of endoscopic stone removal. EHL was performed in 85% of patients. Stone clearance and reduction of pain were successfully obtained in 74%.18

Craigie et al. reported on 10 patients with stones in the head of the pancreas who underwent EHL. Intraductal stones were fragmented and successfully removed in all patients. There were no reported complications associated with EHL or stone extraction procedures.19

Studies on EHL have generally been small, single center, and retrospective in nature. In a study by Howell et al., 6 patients underwent EHL to fragment pancreatic duct stones. EHL was used as initial therapy in 1 patient. In 5 patients, EHL was used after other endoscopic clearance procedures were unsuccessful. Of note, balloon dilation of pancreatic duct strictures was performed in the same session to facilitate the overall procedure. After EHL was completed, balloon catheters and baskets were used to remove the stone fragments. Ductal clearance was successfully achieved in 50% of patients. Improved drainage and symptom relief were obtained in 100% of patients. No complications associated with EHL were observed.20

In a small series by Tanaka et al., two patients with large pancreatic ductal stones lodged in the head of the pancreas underwent EHL. The stones were successfully fragmented and optimal ductal flow was restored.21

In a case study by Papachristou et al., a patient with a pancreatic stone measuring 15.5×11.1×6.4 mm underwent various endoscopic procedures. EHL was performed after failed attempts of balloon extraction, balloon dilation, and mechanical lithotripsy. Using EHL, the stone was successfully fragmented and removed using balloon and basket extraction devices.17

Laser Lithotripsy

Laser lithotripsy procedures can be used to facilitate removal of difficult pancreatic stones, although there is relatively little clinical data on the efficacy of this technique. Laser lithotripsy, like EHL, requires a pancreatoscope to both directly identify stones and to serve as a conduit for a laser fiber. In laser lithotripsy, a laser pulse is used to pulverize the stone making extraction measures more successful. GI lasers or lasers designed for urology can be used for laser lithotripsy. Stones fragmented by laser lithotripsy can then be removed from the pancreatic duct via baskets and/or balloons.

In a study by Alatawi et al., five patients with pancreatic stones were treated with laser lithotripsy using a holmium:YAG (yttrium aluminum garnet) laser. All patients had previously undergone unsuccessful ERCP treatments. After stone fragmentation, stone retrieval was performed using a Dormia basket or balloon catheter. Stone extraction was successful in all five patients. No complications associated with laser lithotripsy procedures were reported.22

In a small prospective study by Maydeo et al., 4 patients with pancreatic stones underwent laser lithotripsy when removal by basket or balloon catheters failed. Pancreatic stones were fragmented in 100% of patients followed by complete clearance via ERCP. No complications associated with laser lithotripsy or ERCP were reported.23

In a case report by Hlaing et al., laser lithotripsy was used to assist in removing a basket that had become impacted in the pancreatic duct after entrapment of a pancreatic duct stone. During a mechanical lithotripsy procedure, the basket wires fractured and the basket became entrapped in the pancreatic duct. The pancreatic stone was fractured via laser lithotripsy. The basket and stone fragments could be successfully removed after laser lithotripsy.24

Extracorporeal Shock Wave Lithotripsy

Extracorporeal shock wave lithotripsy (ESWL) is a non-invasive treatment for pancreatic duct stones. ESWL can be used to fragment stones using an externally applied shock wave pulse. A water filled cushion can be placed externally adjacent to the target area, or ESWL can be administered to a patient who is partially immersed in water. Sedation, epidural anesthesia, or general anesthesia can be used in patients undergoing ESWL depending on local institutional protocols. ESWL is generally used if the patient has many stones or if previous endoscopic measures have failed.

In a study performed by Ong et al., 250 patients underwent ESWL after chronic pancreatitis was confirmed via ERCP, conventional ultrasound, endoscopic ultrasound, or computed tomography scans. Multiple stones were reported in 87% of patients. Only 13% of patients had single stones. Stones in the head of the pancreas were seen in 98% of patients with multiple stones. After the stones were fragmented with ESWL, the pieces were removed with a Dormia basket and/or balloon extraction via ERCP. Complete clearance of associated stones was achieved in 60% of patients and partial clearance occurred in 24% of patients, illustrating how even the most aggressive treatments can sometimes fail to allow duct clearance. Complications associated with ESWL occurred in 6% of patients. Pain during ESWL was the most common complication reported. Mild bleeding during ERCP occurred in three patients. No complications of acute pancreatitis associated with ESWL or stone extraction procedures were reported.25

Farnbacher et al. retrospectively analyzed 125 patients with pancreatic duct stones. Eighty-two patients had multiple stones. Successful stone clearance was achieved in 85% of patients. Stone clearance was achieved in 11 patients by mechanical lithotripsy and 114 patients by ESWL followed by ERCP. ESWL was performed in patients in whom stone clearance was unsuccessful via ERCP.26

In a retrospective study by Hiromu et al., 80 patients with pancreatic stones underwent ESWL. Forty-five patients underwent pancreatic stenting prior to ESWL treatment. Stone fragmentation was achieved in 91% of patients who underwent stenting prior to ESWL therapy and in 80% of patients who did not undergo stenting prior to ESWL therapy. Stone fragmentation was successful in 89% of patients while symptom relief was observed in 88% of patients regardless of stent placement. A complication rate of 7% was seen in patients who had pancreatic stents and 17% in patients who had no pancreatic stent. Complications included rare pancreatitis and cholangitis.27

CONCLUSIONS

A variety of endoscopic treatments are available for pancreatic duct stones. For small pancreatic duct stones and/or debris, balloon catheters and basket extraction devices can be used to extract stones. For larger or difficult pancreatic duct stones, a combined approach of basket/balloon extraction with more extensive measures such as electrohydraulic lithotripsy, laser lithotripsy, and/or extracorporeal shock wave lithotripsy can be used. Some patients will fail all endoscopic techniques; in these patients, surgery is still an option. Endoscopic approaches to treating pancreatic duct stones have an acceptable rate of adverse events and are now considered first-line therapy.


Fellows' Corner

An Unusual Cause of Acute Pancreatitis


A 19-year-old man with one previous episode of pancreatitis presented with abdominal pain, nausea and vomiting for two days. He presented one month previously with similar symptoms to another institution and was diagnosed with acute pancreatitis based on an abdominal CT scan and an elevated lipase. He was managed conservatively with NPO, analgesia and IV fluids and ultimately discharged. Two days prior to the current admission he developed constant epigastric pain with radiation to the back. This was accompanied by nausea and non-bloody, non-bilious vomiting. He denied diarrhea, melena, or hematochezia. He endorsed drinking five drinks per weekend in the past, but stated that his last drink was 3 months prior to admission.

Laboratory studies were significant for a lipase of 209 U/L with normal liver-associated enzyme tests and a normal white blood cell count. Triglycerides, ANA, and IgG-4 were all within normal limits.

CT scan of the abdomen and pelvis with contrast showed mild stranding and fluid adjacent to the pancreas consistent with mild uncomplicated pancreatitis. The gallbladder was unremarkable without findings of cholecystitis or cholelithiasis.

An MRCP was then performed and showed a normal pancreas and bile ducts; however, there was new mild segmental dilation of the proximal jejunum with suggestion of segmental mild wall thickening distally.

An anterograde push enteroscopy was performed to evaluate the abnormal small bowel seen on MRCP and showed severe, circumferential erythema, edema, friability, and ulcerated mucosa with exudate extending circumferentially from the duodenal bulb to the proximal-mid jejunum. Multiple biopsies were taken of the small bowel and stomach.

QUESTIONS

  1. What is the differential diagnosis?
  2. Are there associations between this condition and pancreatitis?
  3. What are the causes of pancreatitis in patients with this condition?
  4. How would you treat this condition in this patient?

Discussion

The pathology from the small bowel biopsies showed small bowel mucosa with severe acute and chronic inflammation consistent with Crohn’s disease, and his episodes of acute pancreatitis were thought to be due to peri-pancreatic duct inflammation caused by Crohn’s involvement of the duodenum.

Although extraintestinal manifestations of IBD are well described and relatively common, acute pancreatitis is a rare extraintestinal manifestation. In a retrospective cohort study, the prevalence of acute pancreatitis as the initial presenting symptom of IBD was 2.17% (10 of 460 patients) in pediatric patients and 0.06% (2 of 3500 patients) in adult patients, respectively.(1)

The most common etiologies of pancreatitis in patients with Crohn's disease are gallstones, alcohol, and purine analogs.(2) However, Crohn's disease of the duodenum was found in a retrospective cohort study of 48 patients to be a risk factor for acute pancreatitis in 7 patients (15% of the cases of pancreatitis).(2) Although the pathophysiology of duodenal Crohn's causing acute pancreatitis has not been well-studied, it has been theorized that small bowel inflammation can cause both papillitis and the reflux of pancreatic contents into the pancreatic duct.(3) Thus, Crohn's disease of the duodenum should be considered in the differential diagnosis in individuals with acute pancreatitis, particularly in younger patients whose work-up has otherwise been unrevealing.

This patient is currently doing well on a steroid taper with plans to initiate infliximab therapy at his next clinic visit.

