Nutrition Issues In Gastroenterology, Series #179

Pragmatic Management of Nutrition in Severe Acute Pancreatitis

Severe acute pancreatitis (SAP) is a clinically debilitating condition with significant morbidity and mortality that requires attention to optimal supportive management. In this article we discuss nutrition management options, which are central to supportive care; the appropriate route and timing of nutritional support are paramount to optimal outcomes.

Severe acute pancreatitis (SAP) is a form of acute pancreatitis (AP) with significant morbidity and mortality. Nutrition is critical to supportive management, with appropriate route and timing of nutritional support paramount to optimal outcomes. Previous paradigms of maintaining patients nil per os (NPO) while utilizing parenteral nutrition (PN) have evolved. Enteral nutrition (EN) is now generally preferred, with PN being utilized only in select situations. Additionally, data support early initiation of EN, within 48 hours of admission, to reduce gut barrier dysfunction and infectious complications. While limited data suggest that gastric EN may be pragmatic and non-inferior to jejunal EN, caution is recommended if using gastric EN. Finally, while it is reasonable to trial oral nutrition in patients experiencing a less protracted course with improved pain and hunger, long-term EN is recommended in those patients expected to have a prolonged need for nutrition support.

Dushant Uppal, MD, MSc, Assistant Professor of Medicine, Division of Gastroenterology & Hepatology, University of Virginia, Charlottesville, VA

INTRODUCTION

Acute pancreatitis (AP) is an inflammatory condition of the pancreas ranging in severity from mild to severe and contributing significant burden and cost to the healthcare system. In 2012, AP accounted for approximately 280,000 hospital admissions in the USA, with a median length of stay (LOS) of 4 days and a total cost to the healthcare system of $2.6 billion.1 Management predominantly involves supportive care with IV fluids and pain control in the acute setting. The severity of AP, as defined by the revised Atlanta criteria, may be classified as mild, moderately severe, or severe. Severe acute pancreatitis (SAP) occurs in approximately 15-20% of patients with AP and is defined by the presence of persistent organ failure (>48 hours).2 A further 20% of patients with SAP may have necrotizing pancreatitis, defined as focal areas of non-viable pancreatic parenchyma >3 cm in size or >30% of the pancreas.2 Predicting the severity of AP is critical to optimizing management strategies, including the timing and type of nutrition. While scoring systems may be cumbersome, simple clinical and laboratory parameters may provide ample distinction between predicted mild AP and SAP at the time of presentation. In particular, persistent (>48 hours) systemic inflammatory response syndrome (SIRS), defined by two or more of the following four criteria: (1) temperature <36°C (96.8°F) or >38°C (100.4°F), (2) heart rate >90/min, (3) respiratory rate >20/min, and (4) white blood cell count <4 x 10^9/L (<4 K/mm3), >12 x 10^9/L (>12 K/mm3), or >10% bands, is predictive of SAP.3 (A minimal sketch of this rule follows the list below.) This distinction permits optimization of management and support in SAP, including the delivery of nutrition support. Nutrition is a critical element of supportive care as it is thought to diminish:

1. damage to the gut barrier with resultant increased intestinal permeability and initiation of SIRS, sepsis, and associated infected necrosis
2. translocation of bacteria/toxins, which is considered the main cause of superinfection/SIRS
3. pancreatic inflammation predisposing to gastric stasis/abdominal distension2
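
To make the SIRS rule above concrete, the following is a minimal sketch in Python of the two-or-more-criteria check. The function and parameter names are illustrative assumptions, not part of any cited scoring tool, and persistence beyond 48 hours must still be assessed clinically over repeated measurements:

```python
# Minimal sketch of the SIRS definition described above (>= 2 of 4 criteria).
# Names and units are illustrative; persistence (>48 h) requires repeated checks.

def sirs_positive(temp_c, heart_rate, resp_rate, wbc_k_per_mm3, band_pct):
    criteria = [
        temp_c < 36.0 or temp_c > 38.0,   # (1) temperature <36 C or >38 C
        heart_rate > 90,                  # (2) heart rate >90/min
        resp_rate > 20,                   # (3) respiratory rate >20/min
        wbc_k_per_mm3 < 4 or wbc_k_per_mm3 > 12 or band_pct > 10,  # (4) WBC criterion
    ]
    return sum(criteria) >= 2

print(sirs_positive(38.6, 104, 22, 9.5, 2))  # True: fever, tachycardia, tachypnea
```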

Certainly, keeping a patient with SAP ‘nil per os’ (NPO) may be appropriate at presentation if they are intolerant of or incapable of eating. However, fear of worsening inflammation and/or infection in SAP with eating, based on the physiologic understanding that gastric accommodation and the delivery of partially digested proteins and fats stimulate the pancreas, has long pervaded nutrition management in AP. This led to the long-held belief that keeping patients NPO for prolonged periods, while providing parenteral nutrition (PN), was optimal. The paradigm has evolved, with literature now demonstrating improved outcomes in patients with SAP who are trialed on oral nutrition or receive enteral nutrition (EN) compared with PN. Furthermore, recent data and critical assessments of the literature have shed more light on the optimal route and timing of EN in patients with SAP. The merits of each nutrition modality should be considered in specific clinical scenarios to optimize patient outcomes (Table 1).

Parenteral Nutrition (PN) vs Enteral Nutrition (EN)

The belief that keeping a patient NPO and utilizing PN would lead to optimal outcomes was central to the management of AP for years. However, data from a number of studies have refuted this notion, demonstrating that outcomes with PN tend to be worse.4-9 In particular, patients with AP who are placed on PN may experience more hyperglycemia and suffer increased line-related infections or other infectious complications.2

Additionally, it has been demonstrated that gut mass and barrier function may be improved in patients with SAP who are enterally fed compared with those kept NPO and/or placed on PN.2,10 The loss of gut barrier function has been postulated to be responsible for increased systemic infections, organ dysfunction, increased need for surgical intervention, longer hospital LOS, and mortality.4-9

A Cochrane review of 8 trials with a total of 348 patients with AP demonstrated a reduction in death, multi-organ failure (MOF), systemic infection, need for operative intervention, and hospital LOS for patients receiving EN compared with those receiving PN.9 Furthermore, in a subgroup analysis, patients with SAP receiving EN had a lower risk of death and MOF compared with those on PN. This improvement in major complications and death was corroborated in a subsequent meta-analysis of 8 RCTs including 381 patients comparing PN to EN in patients with SAP.8 Thus, the data to date support the notion that EN should be favored over PN, as it leads to improved outcomes in patients with SAP.

Although PN is not advised as the initial form of nutrition support in patients with AP, including SAP, there are certain conditions where it may be indicated. In the rare cases of mechanical bowel obstruction or bowel perforation, EN would be ill advised and PN preferred; PN is also indicated when adequate nutrition intake cannot be met by EN and/or oral intake. Additionally, in patients with lymphatic disruption and chylous ascites not responding to a fat-free or fat-restricted diet or elemental EN, transient PN may be indicated.11,12

EN vs Oral Nutrition

While the data demonstrating the benefit of EN over PN are robust, data regarding EN vs oral nutrition in SAP are somewhat limited and incongruent. Although some studies have demonstrated improved gut mass and barrier function in patients receiving EN over PN, a study by Powell et al.13 demonstrated no difference in inflammatory markers (interleukin 6, soluble tumor necrosis factor receptor 1, and C-reactive protein) between patients with SAP who were fed by mouth and those receiving EN. Several additional studies comparing oral nutrition vs EN in patients with AP found no statistically significant difference in complications between the two groups.14,15 The study by Stimac et al.15 demonstrated a relative risk (RR) reduction in death and MOF in patients on EN vs ‘nil by mouth’ (NBM), but these reductions did not meet statistical significance. However, the NBM group received more IV fluids early and during the remaining hospital course, and oral nutrition was introduced to both groups variably, as early as 3 days. It is thus unclear if a longer duration of EN could have resulted in statistically significant reductions in the risk of mortality, MOF, and infectious complications. Furthermore, all patients received prophylactic antibiotics for 10 days, a practice that is no longer recommended and could have contributed to similar outcomes between both groups.

The PYTHON trial14 compared early EN, within 24 hours of admission, with an oral diet started after 72 hours in patients with AP at high risk for complications. Two hundred eight patients at 19 centers were randomized, with a primary composite endpoint of major infection or death. The primary endpoint occurred in 30% of the early EN group vs 27% of the on-demand group, leading the authors to conclude that early nasoenteric feeding was not superior to oral feeding after 72 hours. However, caution is advised in extrapolating the results of this study to clinical practice for patients with proven SAP, due to several limitations. A substantial proportion (one third) of patients did not meet initial criteria for SAP at the time of enrollment, nor did the study identify differences in outcomes among those patients with true SAP following initial resuscitation. Additionally, the transition to oral nutrition occurred for both groups during their hospital course (full oral tolerance at 9 days for the EN group vs 6 days for the on-demand group), yet the primary outcome compared 6-month mortality and major infections without additional assessment of symptoms and oral feeding tolerance in the interim. Further, the results of the intention-to-treat (ITT) analysis, as they pertain to patients with true SAP, may be misleading, as a large proportion of the on-demand group received EN (31%) yet was analyzed in the on-demand arm rather than the early EN arm. Thus, it remains possible that longer-duration EN, through and beyond the primary hospitalization, may result in improved morbidity and mortality in patients with SAP.

The paucity and inconsistency of these data should lead practitioners to act cautiously when introducing an oral diet for patients with SAP. Nevertheless, in those patients with a more benign clinical course who improve rapidly, a trial of oral intake is appropriate. At our institution, EN is often used preferentially in patients with severe necrotizing pancreatitis and infection, or at the first signs of intolerance in patients with SAP cautiously initiated on oral nutrition.

Type of EN

With respect to types of enteral formulations, elemental or polymeric products may be used, as there are no data to support the superiority of one over the other.16,17 Although some studies have demonstrated pancreatic exocrine insufficiency, based on fecal elastase and quantitative fecal fat testing, to be a variably common occurrence (13%-87%) following SAP, particularly alcohol-related and/or necrotizing pancreatitis,18-21 other studies have demonstrated likely recovery of exocrine function, particularly in those patients not requiring pancreatic debridement or drainage.22-24 Our institutional practice is to first employ polymeric EN, as it has been well tolerated and is less costly. If steatorrhea develops or persists, and sufficient pancreatic damage is not suspected, we then undertake an evaluation for Clostridium difficile infection. In our experience, if infective colitis is ruled out, patients generally accommodate well to polymeric feeds as the acute decrement in exocrine function experienced with SAP resolves. However, should steatorrhea or weight loss persist, suggesting more subacute or chronic exocrine insufficiency, or if there is significant pancreatic necrosis on imaging, then transition to semi-elemental or full elemental EN, or the addition of pancreatic enzymes, may be considered. In these scenarios, fecal elastase may help guide changes in EN formulation in patients suspected of having pancreatic insufficiency, so long as the patient is not experiencing diarrheal stools, which could dilute fecal elastase and lead to a falsely positive result.

Timing of EN

The benefit of EN on morbidity and mortality in SAP is supported by the literature, with additional data demonstrating improved outcomes with early initiation of EN.

A retrospective review of 197 patients with predicted SAP compared patients receiving early EN (within 48 hours of admission) with those receiving delayed EN (after 48 hours) and demonstrated reduced mortality, development of infected necrosis, respiratory failure, and need for ICU admission in the early EN group.25 However, the results must be considered carefully given the study’s retrospective nature, as bias may have existed in the allotment of therapy for the included patients. For instance, the delayed group had a trend towards greater use of PN. Taken together, the increased need for ICU admission and PN use in the delayed group may reflect a more critical course that resulted in delayed initiation of EN and resultant increased morbidity/mortality.

Nevertheless, subsequent studies have demonstrated improved outcomes with early initiation of EN.7,13,25-31 A Cochrane review of 11 RCTs by Petrov et al.32 demonstrated improved outcomes with respect to MOF, pancreatic infectious complications, and mortality in patients with AP receiving EN within 48 hours of admission compared with those receiving PN. The improved outcomes are postulated to be due to improved immune function with early initiation of nutrition support. This improvement in immune function was evidenced by Sun et al.,29 who showed that, compared with delayed EN, patients receiving early EN experienced lower levels of CD4+ T lymphocytes and CRP in addition to reduced MOF, SIRS, pancreatic infection, and ICU LOS. Thus, the available data support the initiation of EN in patients with SAP within 48 hours, if clinically feasible.

Gastric Nutrition vs Jejunal Nutrition

While EN in SAP is clearly of benefit, the literature regarding route of enteral support is somewhat disparate. As discussed earlier, studies comparing PN with EN demonstrate reduced mortality and morbidity in patients with SAP receiving EN, and most of these early studies utilized jejunal feeding. However, given the sometimes cumbersome nature of jejunal feeding tube placement, more recent studies have attempted to elucidate differences in outcomes between patients with SAP receiving nasogastric tube (NGT) vs nasojejunal tube (NJT) feedings. In a clinical feasibility study conducted by Eatock et al.,33 22 of 26 patients with SAP receiving NGT feeds within 48 hours of admission tolerated the feedings well without evidence of clinical or biochemical deterioration. A subsequent RCT of early NGT vs NJT feeding in SAP, conducted by the same group, found no statistically significant difference in inflammatory markers or pain scores between the groups.34 However, they did not objectively evaluate long-term outcomes or differences among patients with necrosis. Subsequently, an RCT by Kumar et al.35 comparing NJT with NGT feedings in patients with SAP found no difference in LOS, need for surgical intervention, or death. However, time to initiation of EN was up to one month after presentation with SAP, both groups received only low EN infusion rates, and long-term outcomes were not compared. The authors concluded that EN was tolerated in both groups; however, had the EN rate been optimized to meet true nutritional requirements and volumes, a difference in tolerance may have been appreciated. These two studies and a third RCT comparing NGT vs NJT feeding in SAP, comprising a total of 157 patients, were included in a meta-analysis.36 The authors reported no significant differences between the NGT and NJT groups in the RR of mortality, diarrhea, exacerbation of pain, or meeting energy balance, leading them to conclude that NGT feeding was not inferior to NJT feeding in patients with predicted SAP. However, the authors also cautioned that larger RCTs would be required, as their evaluation was not adequately powered. Moreover, the meta-analysis was limited by the small and heterogeneous trials included, non-blinding, the delay in initiating EN in the two studies from India, and the lack of confirmation of NJT positioning. Although the available data suggest that NGT feeding may be non-inferior, and that NGT placement may be more pragmatic given the potential difficulties of NJT placement, the data are not robust. Furthermore, jejunal feeding in SAP may be physiologically sensible.

Data demonstrate higher secretion of trypsin and lipase in subjects who have formula delivered to the duodenum compared with those receiving jejunal feedings at least 40 cm beyond the ligament of Treitz, and this may potentiate further pancreatitis.37 Nevertheless, if NGT feedings are pursued in SAP, caution is advised, and reassessment of the patient’s clinical condition and tolerance of feeding is recommended.

NJT or Percutaneous Endoscopic Gastrostomy Tube with Jejunal Extension (PEG-J)

Once the decision is made to utilize EN, an empiric choice between placing an NJT and transitioning to a PEG-J must be made. In rare circumstances, NJT placement may not be possible due to altered nasopharyngeal anatomy or issues with nasopharyngeal bleeding or infection. If the tube can be placed easily, an NJT is generally appropriate initially for patients likely to transition to an oral diet during the hospitalization. Additionally, NJT placement may be appropriate in the setting of necrotizing pancreatitis without gastric outlet obstruction, or in patients with significant malnutrition in whom it remains unclear whether they will be able to meet oral caloric targets early but who are likely to transition to a full oral diet shortly following hospitalization. However, in the setting of necrotizing pancreatitis with outlet obstruction, which may necessitate gastric venting, PEG-J placement may be preferable. Long-term NJT use out of the hospital may also not be palatable, owing to cosmetic concerns, if the patient is planning on returning to work and cannot otherwise advance to an oral diet. Finally, a PEG-J may be of benefit in the setting of SAP with significant necrotizing pancreatitis necessitating repeat debridement, as the need for enteral support may exceed 6 weeks in these circumstances. Relative contraindications to PEG-J placement may include a poor window for endoscopic or radiographic placement, ascites, or bleeding diatheses (Table 2).

When to Transition to Oral Nutrition

As mentioned previously, the data regarding oral vs early EN in patients with SAP are limited, and caution is advised in the early re-introduction of an oral diet in these patients. However, in a large majority of patients with SAP, a trial of oral nutrition tolerance prior to discharge may be reasonable, particularly as hunger returns. In a study by Zhao et al.,38 no difference in adverse events or complications was identified between patients with moderate AP or SAP who received early oral re-feeding based on return of hunger and those who received conventional oral refeeding when clinical symptoms and laboratory parameters had resolved. Of course, if oral tolerance fails due to worsening pain, infection, or inability to meet caloric needs, then EN should be introduced/resumed until clinical re-assessment, with or without interval imaging, can be undertaken. If the patient is well at that point, re-introduction of an oral diet is reasonable. Additionally, though empiric, our institutional practice tends to favor continued EN at the time of discharge in the subgroup of SAP patients with necrotizing pancreatitis with infection or gastric outlet obstruction. In these patients, our pragmatic approach is to maintain EN until clinical reassessment is completed following hospital discharge, or following debridement/drainage if necessary (Figure 1). If the patient is clinically better and necrosis, if present, is stable/improved and no further intervention is anticipated, an oral diet is resumed. Of course, if signs of intolerance occur earlier, interval imaging is obtained more urgently and the oral diet is either transitioned back to EN or maintained, based on imaging and patient parameters. When durable tolerance of oral nutrition is demonstrated and imaging, if obtained, is encouraging, feeding tube removal is undertaken. Although this approach may be difficult, owing to the need for close patient follow-up and interval imaging, it is undertaken in an effort to limit worsening clinical status and prevent premature removal of enteral access.

CONCLUSION

SAP is a clinically debilitating condition with significant morbidity and mortality that requires attention to optimal supportive management, including nutrition, for improved outcomes. The classic paradigm of maintaining a patient NPO while providing parenteral support has evolved, with PN now recommended only in very select situations. The literature supports the early introduction of EN in SAP, within 48 hours, if an oral challenge fails; and while the literature appears to demonstrate clinical equipoise with respect to gastric vs jejunal feeding, it is not robust, and clinical management must be approached cautiously. As with other interventions, close reassessment of clinical, laboratory, and radiographic parameters once any nutrition support is initiated is paramount to improved patient outcomes. In an effort to limit cost and simplify management, ongoing assessment of oral diet tolerance is reasonable in patients with hunger and improved pain prior to discharge. However, pragmatically, the use of long-term EN is recommended in SAP patients with demonstrated intolerance of an oral diet, whether due to pain, early satiety, or inability to meet caloric requirements. This may be via NJT in patients expected to transition to an oral diet within 4-6 weeks. Alternatively, for patients with gastric outlet obstruction, severe necrotizing pancreatitis necessitating debridement, or those requiring enteral access but unable to tolerate NJT placement, a PEG-J may be more appropriate. Ultimately, optimal patient outcomes with respect to nutrition in SAP are realized when attention to the literature is married to diligent observation of the individual patient, reorienting therapy based on clinical, biochemical, and radiographic response to the implemented strategies.

A Special Article

Hepatocellular Carcinoma Secondary to Chronic Hepatitis C Virus Infection in Veterans at the VA Caribbean Healthcare System – Have Surveillance Measures Been Effective?

The incidence of hepatocellular carcinoma (HCC) due to hepatitis C virus (HCV) infection has been rising worldwide as well as in the United States. Current American Association for the Study of Liver Disease (AASLD) guidelines recommend performing an abdominal ultrasound in cirrhotic patients every six months for early detection of HCC. The main objective of this study was to retrospectively evaluate a population diagnosed with HCC secondary to HCV in the Veterans Administration (VA) Caribbean Healthcare System and to determine if screening strategies were applied appropriately. Secondary aims were to describe certain patient characteristics upon diagnosis of HCC and to determine the median survival time of this population. It was found that in 95.4% of cases, the diagnosis of HCC was incidental and not part of a surveillance strategy. Moreover, the median survival after diagnosis was only 10 months. These findings should help raise awareness of the importance of HCC surveillance in cirrhotic patients with or without HCV.

Sheryl Rosa, Walisbeth Class, Henry DeJesus, Doris H. Toro, VA Caribbean Healthcare System, San Juan, PR

INTRODUCTION

Hepatocellular carcinoma (HCC) is one of the most feared outcomes of chronic hepatitis C virus (HCV) infection.1 The incidence of HCC varies widely across different regions of the world and, concordantly, differs among racial and ethnic groups within the same country.2 This phenomenon is attributed to regional variations in exposure to the different hepatitis viruses, environmental pathogens, and inheritance patterns of genetically linked liver diseases.2 Although the mechanism of carcinogenesis of HCV has not been elucidated, studies using mouse models suggest that the development of HCC in HCV arises from rapid cellular turnover and chronic inflammation, and not from oncogene activation as is seen in hepatitis B-related HCC.3

Epidemiology

An estimated 185 million people worldwide currently live with chronic HCV infection.4 Moreover, approximately 399,000 people die each year from complications of HCV, among them HCC.5,6 Liver cancer is the fifth most frequently diagnosed cancer in men and the ninth most frequently diagnosed cancer in women worldwide, and it is the fourth leading cause of cancer-related death in the world.2,7 In North America, the incidence rates in 2008 for males and females were 6.8 and 2.2 per 100,000 persons, respectively. These gender differences are not clearly understood but are suspected to be secondary to hepatitis carrier states, exposure to environmental toxins, and the effects of androgens.8

Although North and South America are considered low-incidence areas for HCC, the incidence in the United States has increased during the past two decades, possibly due to a large pool of people with longstanding chronic hepatitis C.9 The rate began to accelerate in the mid-1980s, most likely because of the increased incidence of cirrhosis due to chronic HCV infection and nonalcoholic fatty liver disease, combined with a large influx of immigrants from East Asia and other geographic areas with high endemic rates of hepatitis B viral infection.9 The annual incidence of HCC in the US was at least 6 per 100,000 in 2010. Recently, the incidence of HCC among the United States veteran population has been notably rising as well, likely secondary to an increase in HCV infection among this specific population.10

Risk Factors

In addition to chronic HCV, many risk factors have been associated with the development of HCC, including the hepatitis B carrier state, hereditary hemochromatosis, comorbid hepatic disease, environmental toxins, and cirrhosis of any cause. The risk factors most commonly seen in the United States are HCV infection, alcohol use, and nonalcoholic fatty liver disease; these are also the risk factors commonly identified in the veteran population. Risk factors for HCC development among patients with HCV-related cirrhosis can be considered as host-related, virus-related, and of external origin.9 Independent risk factors associated with progression to HCC are older age (>55 years: 2- to 4-fold increased risk) and male sex (2- to 3-fold increased risk). Several comorbid conditions are thought to increase the risk of HCC among patients with HCV-related cirrhosis, including porphyria cutanea tarda (PCT), hepatic iron overload, liver steatosis, and diabetes mellitus.9

A review of the 2010 HCV Veterans registry for the Veterans Integrated Service Network (VISN 8) region, to which Puerto Rico belongs, identified 21,997 patients registered with HCV, of whom 19,649 had serologic evidence of HCV infection (HCV positive) and 15,587 had VA laboratory evidence of HCV viremia.11 Per the 2011 report, a total of 16,026 HCV-viremic veterans were registered with VHA care in VISN 8. During the same year at the San Juan VA, a total of 1,567 patients were registered as being in care for HCV, and 25 of them were first diagnosed with HCC during that year. The prevalence of HCV in veterans is about 3.7 times higher than in the general population, which could explain the many new cases of HCC arising in this population.

Because HCC is a deadly entity, early detection is of clear benefit, allowing management or, when possible, curative treatment. Untreated, HCC has a mean survival of 1 to 3 months and a 5-year survival rate as low as 3%. For this reason, surveillance guidelines for high-risk populations have been published by the American Association for the Study of Liver Disease (AASLD).1 Surveillance is deemed cost-effective if the expected HCC risk exceeds 1.5% per year in patients with cirrhosis. Current AASLD guidelines recommend performing an abdominal ultrasound at a screening interval of 6 months in patients with confirmed cirrhosis. The expectation is that strict routine sonographic surveillance of cirrhotic patients would identify liver lesions with malignant potential in a timely manner, permitting treatment at an earlier stage of disease, possibly extending survival and improving quality of life.1 It is imperative for primary care physicians and gastroenterologists to keep this recommendation in mind as part of routine screening in patients with cirrhosis secondary to HCV or any other etiology. However, it is suspected that in real-life practice these recommendations are not followed as rigorously as expected, and in turn hepatic lesions are found either at a later stage of disease or incidentally while investigating a different ailment.

Objective

This study consisted of a retrospective analysis using data gathered from the medical records of Latino veterans from the VA Caribbean Healthcare System who had documented diagnoses of both HCV and HCC. The principal objective of this study was to evaluate whether adequate imaging surveillance for hepatoma was performed in these patients, as recommended by AASLD guidelines. The secondary objective was to identify and describe the pertinent sociodemographic and clinical characteristics of these patients. Variables accounted for were gender, comorbid diabetes mellitus, platelet count, body mass index (BMI), coexisting risk factors for HCC development, whether surveillance was done appropriately, Child Pugh score upon HCC diagnosis, Model for End Stage Liver Disease (MELD) score upon HCC diagnosis, diagnostic study used, stage at diagnosis using the BCLC staging system, treatment modalities applied, and median survival time.

Materials and Methods

This study was conducted by performing a detailed review of the electronic medical charts (CPRS system) of Latino veterans from the VA Caribbean Healthcare System with confirmed chronic HCV infection and HCC diagnosed between January 1, 2001 and May 21, 2013. Cases were identified from patient encounters coded with ICD-9 diagnosis codes 155.0, 155.2, 230.8, V10.07, 070.44, and 070.54. Inclusion criteria were age between 22 and 88 years, confirmed HCV with HCC, and Latino origin. Exclusion criteria were absence of HCV, age less than 22 or over 88 years, and any ethnic origin other than Latino. Statistical analysis was performed with IBM SPSS Statistics for Windows, Version 21.0 (IBM Corp., Armonk, NY).
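
As a rough illustration only (a hypothetical sketch; the field names and toy records below are assumptions, not the actual CPRS export schema), the case-finding logic amounts to a code-and-criteria filter:

```python
# Hypothetical sketch of the cohort selection described above.
# Field names and records are illustrative, not the actual CPRS schema.

HCC_HCV_CODES = {"155.0", "155.2", "230.8", "V10.07", "070.44", "070.54"}

records = [  # toy chart-review export
    {"icd9": {"155.0", "070.54"}, "age": 62, "ethnicity": "Latino"},
    {"icd9": {"571.5"}, "age": 70, "ethnicity": "Latino"},
]

def in_cohort(rec):
    """Coded HCC/HCV encounter, age 22-88 years, Latino origin."""
    return (bool(HCC_HCV_CODES & rec["icd9"])
            and 22 <= rec["age"] <= 88
            and rec["ethnicity"] == "Latino")

cohort = [r for r in records if in_cohort(r)]
print(len(cohort))  # 1 here; the actual chart review identified 131 patients
```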

Results

After a systematic review of all the medical records of patients with HCV and concomitant HCC, we identified a cohort of 131 patients who met the inclusion criteria. Details are in Table 1.

All patients in our cohort were male. The mean age at diagnosis was 62.2 years. The largest group of patients had a normal body mass index (BMI) (43.5%), followed closely by patients with a BMI in the pre-obesity range (33.6%). When taking diabetes mellitus into account as a comorbid condition, there was not a substantial difference between diabetics and non-diabetics (45% and 55%, respectively). Similarly, no significant difference was found for platelet count: 56.5% of patients had a platelet count of 100,000/µL or above at diagnosis, whereas 43.5% had less than 100,000/µL.

Liver biopsy was performed in 42% of our cohort; 22.9% of patients had histopathology reporting cirrhosis, consistent with a Metavir score of F4. However, in 19.1% of the cases where biopsy was performed, the Metavir score was not described in the pathology report and the grade of fibrosis was not specified. Since the establishment of specific imaging-based diagnostic criteria in studies published in 2001, liver biopsy has not been required for the diagnosis of HCC, which avoids the risks associated with percutaneous biopsy, including bleeding and tumor spread along the needle track.12 Of the total cohort, 39.7% had a diagnostic imaging study (either quadruple-phase computed tomography (CT) or magnetic resonance imaging (MRI)) followed by a confirmatory biopsy, whereas 3.2% had a biopsy performed as the sole diagnostic study. Of those diagnosed by imaging, 37.4% were diagnosed by quadruple-phase CT and 24.4% by dynamic MRI; however, some patients had both studies done upon diagnosis and are therefore accounted for in each category.

Alpha-fetoprotein (AFP) is a glycoprotein considered to be the serum marker most commonly associated with the presence of HCC. Most studies agree that levels above 500 mcg/L are highly suspicious for the presence of HCC, although it has also been established that not all tumors secrete AFP, and up to 40% of patients with HCC may have normal AFP levels.13 These previously described data are validated in our study, where only 13.7% of the cohort had AFP levels above 500 mcg/L upon diagnosis, and most of the cohort (77.8%) had AFP levels less than 250 mcg/L when diagnosed with HCC.

Two predictive models of the prognosis of patients with cirrhosis are the Child Pugh classification and the Model for End Stage Liver Disease (MELD) score.14 In general terms, a Child Pugh class of A means that a patient has compensated cirrhosis, class B is compatible with significant functional compromise, and class C signifies decompensated cirrhosis. The MELD score is used to estimate 90-day mortality in patients with cirrhosis and is a crucial factor in prioritizing patients for liver transplantation. For this reason, a MELD exception is given to patients with HCC who are candidates for liver transplant, to prioritize them given the high mortality risk of HCC.15 The MELD score used in this study was calculated without the inflation of the MELD exception points, to better characterize these patients upon diagnosis. The data revealed that, in terms of both Child Pugh and MELD scores, the majority of patients were diagnosed at an early stage, with 59.5% of patients diagnosed while still classified as Child Pugh A and, notably, 59.5% of patients with a MELD score less than 10.
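
For context, the MELD score referenced above is derived from serum creatinine, total bilirubin, and INR. A commonly cited form of the original formula is shown below as a sketch; note that allocation versions additionally bound each laboratory value below at 1.0 and creatinine above at 4.0 mg/dL:

```latex
\mathrm{MELD} = 9.57\,\ln(\mathrm{creatinine}\ [\mathrm{mg/dL}])
              + 3.78\,\ln(\mathrm{bilirubin}\ [\mathrm{mg/dL}])
              + 11.2\,\ln(\mathrm{INR}) + 6.43
```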

The Barcelona Clinic Liver Cancer (BCLC) staging system is an HCC treatment algorithm incorporating multiple variables, including tumor stage, physical status, liver functional status, and cancer-related symptoms.16 Essentially, patients at stage 0 are candidates for resection, stage A patients are candidates for curative therapies, stage B patients should be referred for transarterial chemoembolization (TACE), stage C patients can be considered for experimental therapies, and stage D patients should undergo palliative treatment. Within our cohort, six cases (4.6%) were diagnosed at BCLC stage 0, 59 cases (45%) at BCLC stage A, 40 cases (30.5%) at BCLC stage B, 15 cases (11.5%) at BCLC stage C, and 11 cases (8.4%) at BCLC stage D. Correspondingly, the therapeutic modalities used in these patients were: surgery in three (2.3%), TACE in 72 (55%), radiofrequency ablation (RFA) in three (2.3%), TACE + RFA in 10 (7.6%), liver transplant in two (1.5%), sorafenib in 49 (37.4%), palliative care in 53 (40.5%), and experimental or no treatment in 12 (9.2%). It is important to point out that 52 patients received two or more treatment modalities.
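
The stage-to-treatment logic just described can be summarized in a small lookup table (a simplified sketch; actual BCLC assignment also integrates tumor burden, performance status, and liver function, which are omitted here):

```python
# Simplified sketch of the BCLC stage-to-treatment mapping described above.

BCLC_TREATMENT = {
    "0": "resection",
    "A": "curative therapies (resection, ablation, transplant)",
    "B": "transarterial chemoembolization (TACE)",
    "C": "experimental/systemic therapies (e.g. sorafenib)",
    "D": "palliative treatment",
}

# Cohort counts from the study, printed alongside the recommended modality.
for stage, count in [("0", 6), ("A", 59), ("B", 40), ("C", 15), ("D", 11)]:
    print(f"BCLC {stage} (n={count}): {BCLC_TREATMENT[stage]}")
```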

Overall, median survival was 10 months after diagnosis. Of the 131 patients diagnosed with HCC, only six (4.6%) were identified within a routine surveillance program, whereas 125 (95.4%) of the cases were diagnosed incidentally. AASLD guidelines recommended screening of cirrhotic patients every 6 to 12 months until 2010; the guidelines were then revised to recommend screening for HCC every 6 months. These changes were taken into account during data collection.

Discussion

The data obtained from this retrospective analysis of the medical records of Latino veterans who had HCC on a background of HCV suggest that there are no clear predictors of which patients will develop this disease with its very low 5-year survival rate. In contrast to other studies, our population did not show a strong association with risk factors such as diabetes mellitus, elevated BMI, or low platelet count. The majority of patients diagnosed with HCC were classified within the early stages of the most common prognostic models, the Child Pugh score and the MELD score. This study also validates previously described data noting that AFP is not a universal marker for the diagnosis of HCC, with only 13.7% of our cohort having levels above 500 mcg/L upon diagnosis. More importantly, although 80.1% of the cohort was diagnosed at BCLC stage B or better, the median survival was only 10 months after diagnosis. The most striking finding of all, however, is that only 4.6% of the patients with HCC were diagnosed within a routine surveillance program.

The main limiting factor of our study was that all patients were male veterans; these results therefore may not represent the general population of non-veteran men or any women. Another limiting factor was that RFA was not available at our institution until 2007; it is therefore unknown whether the lack of this treatment modality during that period affected the median survival time after diagnosis. It is also fair to mention that when this study was conducted, the new direct-acting antiviral treatments for HCV were not available, and new studies will therefore need to be performed in the population of patients treated for HCV.

In conclusion, surveillance was not conducted effectively in our study population. Patients diagnosed with cirrhosis need to be entered into a strict surveillance program, given the lack of clinical or laboratory indicators of HCC development and the poor prognosis once HCC is diagnosed. Within our study, 95.4% of the HCC cases were diagnosed incidentally; it is therefore unknown whether an earlier diagnosis would have led to better survival, and new studies are recommended within a cohort that has been surveilled as recommended by guidelines. These findings should help raise awareness among all physicians of the importance of HCC surveillance in all cirrhotic patients, to grant patients an early diagnosis and hopefully improve their chance of survival and quality of life. With the mandatory widespread use of electronic medical records and the aid of modern technology, automated medical reminders for the institution of imaging surveillance in cirrhotic patients may be of great help across all the healthcare facilities in our nation.

The contents of this publication do not represent the views of the VA Caribbean Healthcare System, the Department of Veterans Affairs or the United States Government.

Dispatches From The Guild Conference, Series #15

Management of the Complications of Cirrhosis

Cirrhosis is the end result of any chronic liver disease and is an entity that progresses across different prognostic stages, the most important being the compensated (asymptomatic) and the decompensated (symptomatic) stages. Here, we discuss these stages, defined by the absence or presence of overt complications of cirrhosis, specifically variceal hemorrhage, ascites, and encephalopathy.1 Each stage has an entirely different prognosis, predictors of death, and predominant pathogenic mechanisms, and should therefore be managed separately both in research and in practice.

Guadalupe Garcia-Tsao, M.D. Professor of Medicine/Digestive Diseases, Yale University School of Medicine, Chief, Digestive Diseases Section, VA-CT Healthcare System

Cirrhosis is the end result of any chronic liver disease and is an entity that progresses across different prognostic stages, the most important being the compensated (asymptomatic) and the decompensated (symptomatic) stages. These stages are defined by the absence or presence of overt complications of cirrhosis, specifically variceal hemorrhage, ascites, and encephalopathy.1 Each stage has an entirely different prognosis, predictors of death, and predominant pathogenic mechanisms, and should therefore be managed separately both in research and in practice.

Portal hypertension (PH) is the initial and main consequence of cirrhosis and is responsible for most of its complications. Portal pressure increases initially as a consequence of increased intrahepatic resistance to portal flow, due to a) structural vascular distortion (e.g. fibrous tissue, regenerative nodules, microthrombi), which accounts for about 70% of the increased intrahepatic resistance, and b) increased intrahepatic vascular tone, which is a consequence of endothelial dysfunction resulting mostly from reduced nitric oxide bioavailability.2 As portal pressure increases, there is splanchnic vasodilatation, which leads to increased portal venous inflow that further increases portal pressure. Vasodilatation is due to angiogenic factors and an increase in nitric oxide; it leads to activation of neuro-humoral systems and sodium and fluid retention, resulting in increased cardiac output and a hyperdynamic circulatory state.3

Stages of Cirrhosis

Complications of cirrhosis do not occur until there is an increase in both resistance and flow. When increased resistance is the sole pathogenic factor, portal hypertension is mild (<10 mmHg as determined by the hepatic venous pressure gradient, or HVPG), but when there is both increased resistance and increased flow and the HVPG rises to ≥10 mmHg, the patient is at a 4-fold higher risk of developing decompensating events.4 Therefore, patients with compensated cirrhosis are currently sub-staged into those with mild PH (HVPG >5 but <10 mmHg) and those with clinically significant portal hypertension (CSPH) (HVPG ≥10 mmHg).5 Among the latter, roughly half of the patients have gastroesophageal varices. Because the HVPG of patients with gastroesophageal varices is at least 11-12 mmHg, patients with varices have, by definition, CSPH.6
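
The HVPG thresholds above reduce to a simple sub-staging rule; the following is a minimal illustrative sketch (function and return labels are assumptions, not a validated tool):

```python
# Sketch of compensated-cirrhosis sub-staging by HVPG, per the thresholds above.

def ph_stage(hvpg_mmhg):
    if hvpg_mmhg <= 5:
        return "no portal hypertension"
    if hvpg_mmhg < 10:
        return "mild portal hypertension"
    return "clinically significant portal hypertension (CSPH)"

print(ph_stage(7))   # mild portal hypertension
print(ph_stage(12))  # CSPH; varices (HVPG >= 11-12 mmHg) imply CSPH by definition
```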

Of the decompensating events, ascites is the most common (20-year first event rate of 33%), followed by variceal hemorrhage (15%) and encephalopathy (7%). Jaundice is quite rare as the first decompensating event (3%) because its presence indicates more advanced disease (further decompensation) or acute-on-chronic liver failure.7 Prognosis differs depending on the type and number of decompensating events: patients with gastrointestinal bleeding as the sole decompensating event, those presenting with a non-bleeding complication (mainly ascites) as the sole decompensating event, and those with two or more concomitant complications have progressively worse prognoses (20%, 30% and 88%, respectively).7 Therefore, the prognosis and management of variceal hemorrhage should always be considered in the context of the presence or absence of other decompensating events.

In patients with decompensated cirrhosis, vasodilatation is the main pathogenic mechanism and is secondary to bacterial translocation (covert infection) or overt bacterial infections, with systemic inflammation as a main mediator.8

Reducing Portal Pressure

While HVPG measurements are useful for patient stratification in compensated cirrhosis, they are less important in the decompensated stage, where markers of liver and kidney dysfunction (the model for end-stage liver disease, or MELD, score) are of greater prognostic significance.9 However, in both compensated and decompensated cirrhosis, decreases in portal pressure (induced by non-selective beta-blockers) are associated with improved outcomes. Hemodynamic responders are traditionally defined as those in whom HVPG decreases below 12 mmHg or by >20% from baseline. In patients with compensated cirrhosis, a decrease of >10% from baseline has been shown to be predictive of a more favorable outcome.10
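
These response definitions translate directly into a threshold check; the following is a minimal sketch (illustrative names, thresholds as stated above):

```python
# Sketch of the hemodynamic-responder definitions in the text.

def hvpg_responder(baseline, on_treatment, compensated=False):
    """Traditional response: HVPG <12 mmHg or a >20% fall from baseline;
    in compensated cirrhosis a >10% fall also predicts a favorable outcome."""
    pct_fall = (baseline - on_treatment) / baseline * 100
    return on_treatment < 12 or pct_fall > (10 if compensated else 20)

print(hvpg_responder(16, 14))                    # False: 12.5% fall, HVPG >= 12
print(hvpg_responder(16, 14, compensated=True))  # True: >10% fall suffices
```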

Portal pressure can be decreased by decreasing intrahepatic resistance and/or by decreasing portal vein blood inflow. For over 30 years, treatment of portal hypertension has been based on non-selective beta-blockers (NSBB), drugs that decrease portal pressure by reducing splanchnic blood flow. NSBB have been shown to be effective in reducing first and recurrent variceal hemorrhage in patients with both compensated and decompensated cirrhosis.

More recently, attention has been placed on drugs that act by decreasing intrahepatic resistance. Carvedilol is a unique type of NSBB with additional alpha-adrenergic blocking activity, and it may therefore also act by vasodilating the intrahepatic circulation. Carvedilol has a larger effect in reducing portal pressure than traditional NSBB (nadolol, propranolol), but its vasodilating properties, especially in decompensated patients, may lead to further vasodilatation and worsening of the already unstable hemodynamic status of the decompensated patient. Statins act by ameliorating endothelial dysfunction and have been shown to decrease the HVPG.11 In retrospective studies, statins have been shown to decrease decompensation, and in one prospective study simvastatin improved survival in the setting of secondary prophylaxis of variceal hemorrhage12 (see below).

Variceal Hemorrhage

Acute variceal hemorrhage is the cause of approximately 70% of episodes of upper gastrointestinal bleeding in patients with cirrhosis. The current standard of care has resulted in a major decrease in mortality; however, even in the most recently published series it remains above 15%, making acute variceal hemorrhage (AVH) one of the most serious medical emergencies. The immediate goals of therapy in these patients are to control bleeding, to prevent early recurrence (within five days), and to prevent six-week mortality (the main treatment outcome).

After initial volume replacement, management should include a conservative blood transfusion strategy (initiating packed red blood cell (PRBC) transfusion when hemoglobin is <7 g/dL, with a goal of maintaining hemoglobin between 7 and 9 g/dL), a prophylactic antibiotic (ceftriaxone 1 g/day), and infusion of a safe vasoactive drug (e.g. octreotide). If the patient was on NSBB, these should be discontinued, as they will blunt the cardiovascular response to bleeding.

Endoscopy should ideally be performed within 12 hours of admission, following hemodynamic resuscitation. Current evidence supports endoscopic variceal ligation (EVL) as the endoscopic therapy of choice for initial control of bleeding, as it is associated with fewer adverse events and lower mortality than sclerotherapy.

Once endoscopy and EVL have been performed, high-risk patients (defined as Child C cirrhosis with a score of 10-13, or Child B cirrhosis with active bleeding on endoscopy) who have a transjugular intrahepatic portosystemic shunt (TIPS) placed within 72 hours (“early” or preemptive TIPS) have been shown to have lower failure and mortality rates at both 6 weeks and 1 year compared with patients who continue on standard therapy.13 Because Child B patients with active hemorrhage were subsequently shown to be at an intermediate risk of mortality,14 the recommendation to consider a pre-emptive TIPS in patients with variceal hemorrhage applies mostly to Child C patients (score 10-13).15

Patients not receiving TIPS should continue vasoactive drugs for at least two days and up to five days. Patients without evidence of rebleeding should then be tapered off octreotide, taken off antibiotics, and started on NSBB for secondary prophylaxis (see below). Patients with persistent bleeding or severe rebleeding should receive a “rescue” TIPS.

Patients who recover from a first episode of variceal hemorrhage have a high re-bleeding risk (60% in the first year), with a mortality of up to 33%. Therapy to prevent re-bleeding is therefore mandatory in these patients and should be instituted prior to hospital discharge. Patients who presented with variceal hemorrhage and other complications (ascites, encephalopathy, spontaneous bacterial peritonitis) with indications for liver transplant should be referred for evaluation. Patients in whom TIPS was placed during the acute episode require no further specific therapy for portal hypertension or varices, but surveillance for TIPS patency should be instituted (Doppler ultrasound every six months). For all other patients, the first-line therapy for the prevention of re-bleeding is the combination of NSBB (propranolol or nadolol) and EVL, with NSBB being the key component of combination therapy, as shown in a recent individual patient data meta-analysis.16

NSBB in Patients with Ascites

NSBB have been shown to prevent first and recurrent variceal hemorrhage in patients with cirrhosis and, in hemodynamic responders, NSBB have also been shown to prevent decompensation and death.17 The effect appears to be independent of the presence or absence of ascites.18

The main pathophysiologic mechanism in patients with cirrhosis and ascites is splanchnic and systemic vasodilatation, which leads to activation of neuro-humoral systems and sodium and fluid retention, resulting in increased cardiac output and a hyperdynamic circulatory state.3 In patients with refractory ascites, these abnormalities are maximal, and a relative decrease in cardiac output can lead to a decrease in renal perfusion and to hepatorenal syndrome.19 NSBB could precipitate this decrease in cardiac output and lead to renal dysfunction and death. This would be particularly so for carvedilol which, in addition to decreasing cardiac output, can worsen vasodilatation. In fact, retrospective studies have shown that NSBB can lead to renal dysfunction in decompensated patients20 and to higher mortality in patients with refractory ascites.21 However, subsequent retrospective studies including larger numbers of patients with ascites and/or refractory ascites (a collective of over 2,000 patients) have shown that beta-blocker (BB) use is not related to increased mortality. Indeed, a recent meta-analysis including these observational studies and randomized studies of BB for the prevention of AVH shows that BB use was not associated with increased all-cause mortality in patients with ascites, non-refractory ascites alone, or refractory ascites alone.22

In the studies showing a deleterious effect of NSBB, mean arterial pressure was significantly lower in the NSBB group, indicating that this may be the clinical indicator that should lead to NSBB dose reduction or discontinuation.23 Given the benefit of NSBB, particularly in the prevention of recurrent variceal hemorrhage, the Baveno VI consensus conference recommended that, until further evidence is available, NSBB should be used cautiously in patients with refractory ascites and the dose reduced or the drug discontinued in the presence of a systolic blood pressure <90 mmHg, serum sodium <130 mEq/L, or development of acute kidney injury.6 Further, recent guidance has suggested that NSBB dosing in patients with ascites be limited to a maximal daily dose of 160 mg of propranolol or 80 mg of nadolol.5
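
The Baveno VI safety triggers above amount to a checklist; the following is a minimal sketch (a hypothetical helper, thresholds taken from the text, not a substitute for clinical judgment):

```python
# Sketch of the Baveno VI-based NSBB safety triggers described above.

def nsbb_reduce_or_stop(systolic_bp_mmhg, serum_na_meq_l, aki_present):
    """Any one trigger warrants NSBB dose reduction/discontinuation in
    refractory ascites: SBP <90 mmHg, Na <130 mEq/L, or acute kidney injury."""
    return systolic_bp_mmhg < 90 or serum_na_meq_l < 130 or aki_present

print(nsbb_reduce_or_stop(86, 134, False))   # True: hypotension trigger
print(nsbb_reduce_or_stop(104, 136, False))  # False: no trigger met
```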

Ascites and Complications

The two main mechanisms of ascites formation in cirrhosis are universal: portal (sinusoidal) hypertension and renal retention of sodium. In cirrhosis, fluid extravasates from the hepatic sinusoids rather than from the splanchnic capillaries. Therefore, leakage of fluid into the peritoneal space occurs as a result of sinusoidal hypertension that in turn results from hepatic venous outflow block secondary to regenerative nodules and fibrosis. However, sinusoidal hypertension alone is not sufficient for the continuous formation of ascites. Plasma volume expansion, through sodium and water retention, allows for the replenishment of the intravascular volume and is the other essential factor in the pathogenesis of cirrhotic ascites.

As mentioned above, as portal pressure increases (and collaterals form), there is concomitant arterial splanchnic and systemic vasodilatation that results in a reduction in the effective arterial blood volume.19 This “underfilling” leads to baroreceptor stimulation and consequent activation of various vasoconstrictor and anti-natriuretic neurohumoral systems (the renin-angiotensin-aldosterone system and sympathetic nervous system) that lead to renal sodium and water retention and to an increase in intravascular volume that maintains ascites formation.3

The natural history of cirrhotic ascites progresses from diuretic-responsive (uncomplicated) ascites to the development of dilutional hyponatremia, refractory ascites, and finally, hepatorenal syndrome (HRS).

First-line therapies for new-onset ascites (diuretics) and refractory ascites (therapeutic paracenteses) act downstream in the pathogenic cascade and are mainly symptomatic; they have therefore not resulted in a significant improvement in survival. However, treating ascites is important, not only because it improves quality of life but also because spontaneous bacterial peritonitis (SBP), a lethal complication of cirrhosis, does not occur in the absence of ascites.

Most patients with cirrhosis who first develop ascites will respond to treatment with salt restriction and diuretics. Later on, as the pathophysiological mechanisms leading to ascites formation worsen, ascites no longer responds to diuretics and the patient is then said to have developed refractory ascites. First-line therapy for these patients is serial large volume paracentesis (LVP), the frequency of which is determined by patient discomfort. TIPS acts on the pathophysiological mechanisms, and its earlier placement in patients with refractory ascites should be considered. In a recent multicenter randomized study of 62 patients with cirrhosis and at least two large volume paracenteses in the previous three weeks, those randomized to covered TIPS stents (average MELD 12, CTP score 9) had a significantly better one-year survival without transplant than those randomized to LVP (93% vs. 52%, respectively), with no differences in encephalopathy, suggesting that TIPS could be first-line therapy for patients with hard-to-treat ascites and relatively preserved liver function.24

Periodic albumin infusions, by increasing intravascular volume (and perhaps through additional functions, including the binding of vasodilators and an anti-inflammatory activity), may play a role in the treatment of ascites. In an open RCT, chronic (weekly or biweekly) albumin infusions were associated with improved survival in patients with non-refractory ascites.25 This was a small proof-of-concept study, and therefore no firm recommendations can be made regarding this approach.

Management of hyponatremia has also been directed downstream in the pathophysiological cascade through the use of “vaptans,” which block renal tubular reabsorption of water. However, as expected, the effect is only transient. In a large multicenter randomized trial, tolvaptan used for 30 days in patients with dilutional hyponatremia (of whom 63 had cirrhosis) was associated with a rapid improvement in serum sodium and significant weight loss compared with placebo, without significant side effects.26 However, a sub-analysis of the patients with cirrhosis and hyponatremia showed that the effect on serum sodium was transient and, in those with severe hyponatremia, was not sustained.27

Regarding HRS, vasoconstrictors constitute the current mainstay of pharmacological therapy. The rationale for use of these agents is to reverse the intense splanchnic and systemic vasodilatation, the main hemodynamic alteration in HRS. Administration of vasoconstrictors (ornipressin, terlipressin, octreotide with midodrine, noradrenaline) for periods greater than 3 days is associated with significant increases in mean arterial pressure, decreases in serum creatinine and plasma renin activity, and an increase in serum sodium.28 Additional evidence is the significant correlation between increases in mean arterial pressure and decreases in serum creatinine induced by vasoconstrictors in HRS.29

In meta-analyses of randomized controlled trials, vasoconstrictor therapy (most studies used terlipressin) was associated with a significantly greater rate of HRS reversal (46-51% vs. 11-22% in control group) and a lower mortality compared to control therapy.30,31 Studies included in these meta-analyses all defined HRS with a creatinine >2.5 mg/dL. With changes in the definition of acute kidney injury,32 a diagnosis of HRS would be reached with lower creatinine levels and thereby a greater rate of response would be expected.33 This is important because in all studies survival is significantly better in terlipressin ‘responders’.

Alternative vasoconstrictive therapy has included the use of intravenous noradrenaline infusion, which has been shown to be as effective as terlipressin,31 and the combination of octreotide/midodrine which, despite having shown efficacy in uncontrolled trials, was recently shown to be significantly inferior to terlipressin in a randomized controlled trial34 and inferior to norepinephrine.31 Therefore, the vasoconstrictor of choice in HRS is terlipressin, but in countries like the United States, where terlipressin is not available, the combination of octreotide/midodrine can be initiated, and if there is no decline in serum creatinine within a maximum of 3 days, the patient should be transferred to the intensive care unit (ICU) for a trial of norepinephrine.35

Because patients with refractory ascites and HRS have a higher mortality than those with diuretic-responsive ascites, efforts to avoid drugs/procedures that will lead to worsening vasodilatation and/or to kidney injury in patients with cirrhosis and ascites are essential.


Nutrition Issues In Gastroenterology, Series #178

Full Force Enteral Nutrition – A New Hope, or the Dark Side? A Critical Look at Enhanced Enteral Feeding Protocols


In this article, we take a close look at enhanced enteral nutrition (EN) protocols, including volume-based feeding (VBF), which have been highly promoted. Evidence suggests that some enhanced EN protocols may be harmful to some critically ill adult patients and should be avoided. Although observational studies have reported an association between delayed provision of goal nutrition and compromised patient outcomes, interventional studies have reported more compromised outcomes than benefits from early goal nutrition and VBF in critically ill adult patients.

Enteral nutrition (EN) is the preferred method of providing nutrition support to critically ill patients, but EN is often interrupted and, as a result, many patients receive less than full nutrition. Multifaceted strategies for increasing the delivery of EN have been developed, including compensatory increased feeding rates after interruptions (volume-based feeding). Enhanced EN protocols, including volume-based feeding (VBF), have been highly promoted, but evidence suggests that some “enhanced” EN protocols may be harmful to some critically ill adult patients and should be avoided. Although observational studies have reported an association between delayed provision of goal nutrition and compromised patient outcomes, interventional studies have reported more compromised outcomes than benefits from early goal nutrition and VBF in critically ill adult patients. There is a need for additional research before enhanced enteral feeding protocols and VBF are routinely adopted in clinical practice.

Joe Krenitsky, MS, RD Nutrition Support Specialist, University of Virginia Health System, Digestive Health Center, Charlottesville, VA

Background

A number of observational studies of adult critically ill patients have reported an association between decreased EN provision and compromised patient outcomes.1-3 The association between decreased EN provision and compromised ICU outcomes is not a recent finding, since this relationship has been described since the early 1980s.4 Furthermore, observational studies have suggested that the initial days of critical illness are particularly important for providing adequate nutrition. A failure to provide a threshold level of nutrition within the early portion of an adult ICU stay was reported to be associated with increased infectious complications.1

Clinical nutrition and other medical professionals are trained that associations noted in observational studies should never be used to infer causality, nor should the results of observational studies alone be used to suggest practice changes.5,6 However, nutrition support professionals also have an intimate understanding of the hypermetabolism and accelerated catabolism that occur during critical illness or injury, as well as the negative consequences of malnutrition. Witnessing the negative clinical sequelae caused by severe malnutrition in hospitalized patients often leaves an indelible impression on clinicians. Undoubtedly, for many clinicians, the observational studies documenting associations between nutrition provision and patient outcomes appeared to confirm what their training and experience had suggested about nutritional adequacy in the ICU.

Enteral Nutrition Comes of Age

Studies have consistently documented that many critically ill patients receive only a portion of the amount of EN that is ordered.7-9 EN is frequently and often repeatedly interrupted for essential diagnostic and therapeutic interventions, real and perceived feeding intolerance, routine bedside care, enteral access device occlusion or displacement, and a myriad of other feeding disruptions in the ICU.7-9 Over the past three decades, increased experience and research with EN has gradually contributed to more effective nutrition provision at many facilities. We have learned that initial feedings do not need to be diluted or initiated at low rates, then gradually advanced over several days, and that physiologic volumes of feeding and secretions in the stomach (gastric residuals) may not be a reason to stop EN.10 However, studies continue to document that EN is often incompletely delivered.9,11

Although some professionals have recommended early supplemental parenteral nutrition (PN) to avoid nutrition deficits, concerns regarding the increased cost and infectious complications related to PN have spurred the development of methods to improve EN delivery.12-14

Full Force Enteral Nutrition

In order to avoid delayed delivery of full nutrition goals, a number of enhanced EN protocols have been developed to permit more timely and complete nutrition.11,15-24 One strategy that has been proposed to allow increased EN delivery is volume-based feeding (VBF). The strategy of VBF is centered on using a compensatory increase in feeding rate upon restarting a feeding after any EN interruption, so that the daily goal volume is more consistently delivered. VBF has often been studied as part of a multi-faceted enhanced EN protocol, and each study has included some, but not all, of the following components: education programs for physicians and/or nursing, daily monitoring of amounts of nutrition delivery, early initiation of EN, starting EN at goal flow rate, reducing time without nutrition prior to operative procedures, starting EN with an increased calorie goal, routine use of prokinetics, supplemental protein, use of calorie-dense formulas and/or the use of a semi-elemental feeding formula (See Table 1).11,15-24 Different studies have utilized various components of these enhanced feeding protocols in addition to different VBF procedures and maximum allowed feeding rates.11,15-24 The 2016 ASPEN/SCCM guidelines for adult critically ill patients endorse the use of volume-based, multi-strategy enhanced enteral feeding protocols.25
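To make the arithmetic behind VBF concrete, the following minimal sketch computes the compensatory rate after an interruption. It is illustrative only, not any published protocol: the 24-hour goal volume, the 150 mL/hr rate cap, and the function name are all assumptions, since the studies cited above used differing VBF procedures and maximum allowed rates.

# Illustrative volume-based feeding (VBF) rate calculation (Python).
# The goal volume and the 150 mL/hr cap are assumed values for this
# sketch; published protocols differ in their maximum allowed rates.

def vbf_rate(goal_volume_ml: float,
             volume_delivered_ml: float,
             hours_remaining: float,
             max_rate_ml_hr: float = 150.0) -> float:
    """Return the feeding rate (mL/hr) needed to deliver the remaining
    daily goal volume in the hours left, capped at the protocol maximum."""
    if hours_remaining <= 0:
        return 0.0
    remaining_ml = max(goal_volume_ml - volume_delivered_ml, 0.0)
    return min(remaining_ml / hours_remaining, max_rate_ml_hr)

# Example: 1800 mL daily goal, 600 mL delivered, 8 hours left in the day:
# (1800 - 600) / 8 = 150 mL/hr, which here happens to equal the cap.
print(vbf_rate(1800, 600, 8))  # -> 150.0

The cap matters clinically: without one, a long interruption late in the feeding day would call for very high infusion rates, which is one of the tolerance concerns raised later in this article.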

It’s a Trap

Research has documented that enhanced EN protocols can often increase nutrition delivery and reduce the delay in reaching nutrition goals in the ICU.11,15-24 However, the real concern is whether increasing the delivery of EN in the ICU actually confers beneficial effects on patient outcomes. In contrast to the associations identified in observational studies, the weight of evidence from randomized studies in the past 8 years is that modest calorie deficits within the first week of critical illness have no negative effects on clinical outcomes.26-29 It is perhaps not surprising, then, that most studies of enhanced feeding protocols reported no improvement in patient outcome, despite significantly improving the timeliness and completeness of EN delivery.11,15-22 Even more concerning is that several investigations of “enhanced” enteral feeding protocols have reported a dark side, in the form of negative outcomes, including increased mortality in some studies in the group receiving increased EN delivery.15-18 A closer look at the methods and limitations of the key studies should be helpful for clinicians deciding on EN feeding protocols for their facility.

Intensive EN and Patient Harm?

One study that has increased attention to the potential harm from enhanced EN protocols was a single-center, randomized study of 78 patients from medical or surgical ICUs with acute lung injury.15 In the intensive nutrition group, feeding tubes were placed and EN was started sooner, feeding was continuous (no bolus or cyclic feeds), EN infusions were monitored daily, rates were increased after feeding interruptions occurred (VBF), and the amount of nutrition received was recorded. Following extubation, the intensive nutrition group had oral intake initiated as soon as safe swallowing function returned.15

The intensive EN group received significantly greater kcal/kg/d compared to the standard EN group (mean 25.4 kcal/kg vs. 16.6 kcal/kg, respectively).15 However, the data safety monitoring board stopped the study early when it was revealed that significantly more deaths occurred in the intensive EN group compared with the standard group (40% intensive EN vs. 16% standard EN).15 There were no significant differences between the groups in other outcomes (hospital or ICU stay, duration of mechanical ventilation, number of infections).15

Because this study was terminated early (with less than the full number of participants enrolled), it is possible that this statistically significant difference in mortality occurred by chance alone.15 However, several other studies of enhanced EN have reported negative outcomes, most without any significant improvement in patient outcomes.16-24

Another study that has reported only a negative outcome with enhanced EN was a before-after cohort study of 49 medical ICU patients.16 The intensive EN group received a calorie-dense feeding, and feeding rates were increased to 150% of goal to compensate for EN interruptions.16 The intensive EN group received significantly increased mean calorie provision compared to the standard group (1198 vs. 474 kcals, respectively). Although baseline characteristics were similar (including APACHE II score) between the 2 groups, the intensive EN group had a significantly increased ICU length of stay, compared to the standard group (13.5 vs. 8.0 days).16

One before-after cohort study in 110 surgical-trauma ICU patients reported a trend toward negative outcomes without any improvement in patient outcomes.17 The intensive EN protocol utilized VBF, a 350mL threshold for gastric residuals, plus an educational program for ICU caregivers.17 This study was notable for delaying the start of the VBF portion of the protocol until patients had established tolerance to the initial goal rate of EN. The intensive EN group received a significantly increased percentage of goal calories compared to the standard group (89% vs. 63%, respectively). Not only did the increased calories fail to improve any outcome, there was actually a trend towards a longer ICU length of stay compared to the standard group (15.0 days vs. 12.2 days, respectively).17 When the patients who died were excluded, the strength of this trend was decreased, but still persisted (P = 0.09). The incidence of diarrhea was significantly increased in the intensive EN group, compared to the standard group.17

Intensive EN: No Benefit, No Harm

Two studies have reported neither harm nor outcome benefits from an intensive EN protocol.19-20 One was a before-after cohort study of 77 mixed ICU patients.19 Patients in the intensive EN group received a significantly greater percentage of prescribed calories than those in the standard group (74% vs. 57%, respectively). On the initial analysis, patients in the intensive EN group had a significantly longer ICU length of stay (14 vs. 9 days), as well as more days on the ventilator (9 vs. 7). However, patients in the intensive EN group had a greater APACHE II score, and after controlling for the admission APACHE II score, the differences in clinical outcomes were not statistically significant.19

The second study with neither harm nor benefits was a larger multi-center cohort study in which the different facilities were randomized to implement either the intensive EN protocol or continue with standard care (“cluster randomized”).20 This was one of the larger studies to date, with 1,059 patients initially enrolled, though only 252 received the enhanced EN protocol.20 This larger study utilized VBF (PEP uP protocol), initiated EN at goal flow rate, used a peptide-based EN formula for initial feedings, used a prokinetic agent, increased the gastric residual threshold (300mL) and encouraged trophic feeding for patients initially deemed unsuitable for full feeding. Perhaps reflecting the difficulties in implementing a new protocol in diverse hospitals, compliance was not 100%, and as a result only 1/3 of the intervention group had EN started at goal rate, with ultimately only a 12% increase in calorie delivery in the PEP uP protocol group (from 32% of goal at baseline to 44% of goal on the PEP uP protocol).20

Two additional studies of VBF did not report any patient outcomes, only that more EN was received.21-22 One cohort study of the PEP uP protocol in 57 surgical patients demonstrated no increase in nutrition delivery when compared to surgical patients at sites that did not implement the PEP uP protocol.21

A second study of VBF randomized patients to either VBF or a standard protocol after they had reached goal EN rate following a slow feeding rate progression over 24 hours.22 In the 57 patients who completed the study, the VBF group received 92.9% of calorie goals, while the standard group received 80.9% of calorie goals. Although the authors suggest this study provides evidence that VBF is “safe”, no information regarding patient outcomes was reported, and far too few patients were enrolled to provide reliable safety data.22

Intensive EN: Reported Benefits (but with fine print)

One study that reported potential benefits of an intensive EN protocol was a before-after cohort study of 239 adult trauma ICU patients.18 Data were collected only for those patients who required ≥ 7 days of mechanical ventilation and who did not receive parenteral nutrition. The enhanced EN protocol consisted of early start of EN, a physician education program, intraoperative small bowel tube placement, an EN ordering “bundle”, continuation of EN prior to procedures until patients were called to the OR, and a VBF protocol with “catch-up” feeding rates if feedings were held.18

The intervention group received significantly increased calories during the first 3 days compared to baseline and received 100% of calorie goals after day 3.18 The cumulative calorie deficit was -1907 kcals in the intervention group and -7240 kcals in the baseline group. The intervention group had a significantly decreased incidence of pneumonia (42%) compared to the baseline group (56%). Although pneumonia incidence was decreased, the intervention group had a greater requirement for mechanical ventilation, with significantly more days on the ventilator at day 28 compared to the baseline group.18 Although reading the abstract of this study may make it seem favorable, the exclusion of any data from patients who died before day 7, the high (56%) incidence of baseline pneumonia in the standard group, and the increased need for mechanical ventilation (in the setting of decreased pneumonia) raise concerns about interpreting these results as support for the intensive EN protocol.18
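For readers unfamiliar with how such cumulative deficits are tallied, the sketch below shows the usual bookkeeping: the daily difference between delivered and goal calories, summed across days. The daily values are invented for illustration and are not data from the study above.

# Cumulative calorie deficit = sum of (delivered - goal) over ICU days.
# All values below are invented for illustration only.

daily_goal_kcal = [2000, 2000, 2000, 2000]
daily_delivered_kcal = [800, 1200, 1600, 2000]

deficit = sum(d - g for d, g in zip(daily_delivered_kcal, daily_goal_kcal))
print(deficit)  # -> -2400, i.e. a net 2400 kcal shortfall over 4 days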

Another study reporting positive outcomes was a before-after cohort study of 213 adult surgical ICU patients.23 The intensive EN protocol focused primarily on increasing protein delivery by raising the standard initial protein goal from a baseline of 1.5 gm/kg to 2 gm/kg, to be achieved with protein supplements. Calorie goals were unchanged from baseline, and although VBF was encouraged, it was not strictly implemented in the intervention group.

The intervention group received significantly more calories/kg (18.6 kcal/kg/d vs. 16.5 kcal/kg/d) and protein/kg (1.2 g/kg/d vs. 0.8 g/kg/d) compared to the standard group.23 The ICU length of stay (LOS) and hospital LOS were both significantly shorter in the intervention group (10 vs. 15 days, and 20 vs. 29 days, respectively). In the intervention group, there was a trend toward fewer late infections (mean 0.7 vs. 0.9, respectively).23 A regression analysis that adjusted for age, sex, BMI, APACHE II score, and GI surgery demonstrated that the aggressive EN protocol was associated with a significantly lower risk of late infection.23 Of note, thirty-day mortality was significantly increased in the intervention group compared to the control group (13.6% intervention vs. 7.4% control, respectively), but hospital mortality was not significantly different between the groups.23 This was primarily a study of increased protein delivery, with a clinically trivial difference in calorie provision between groups, and even the intensive EN group received hypocaloric feeding due to incomplete compliance with the VBF protocol.23

One of the studies used as evidence for positive outcomes from enhanced enteral feeding protocols was an unblinded, randomized investigation of 82 patients with severe head injury who required mechanical ventilation.24 This was one of the 2 studies cited in the ASPEN/SCCM guideline endorsing volume-based, multi-strategy enhanced enteral feeding protocols.24 The control patients’ EN was started at a very slow rate of 15mL/hr, with a very conservative feeding advancement schedule.24 The rate of control feedings could be doubled every 8 hours, but the rate was only increased if gastric residual measurement was < 50mL X 2 consecutive measurements, and the feeding rate was reduced by 50% if a single gastric residual volume was ≥ 150mL. The intervention group received nasointestinal feeding when the tube could be successfully advanced (34% of group), or an NG tube when it could not.24 The intervention group had EN started at the goal flow rate; the feeding rate for the NG-fed portion of the intervention group was not decreased unless gastric residuals exceeded 200mL. Considering the very conservative feeding regimen used in the control group, it is not surprising that median calorie and protein delivery were less than 50% of goal even by day 7 of the study.24 The intervention group received significantly greater calories and protein throughout the first week of the study, compared to the control group. Mean energy delivery in the control group was 36.8% of goal, compared to 59.2% of goal in the intervention group.24 The intervention group had significantly fewer infections compared to the standard group (61% vs. 85%, respectively) and significantly fewer total other complications (37% vs. 61%, respectively).24 There was also a trend toward improved neurologic outcomes at 3 months in the intervention group, but no difference at 6 months. However, when the results were analyzed by disease severity, there were no statistically significant improvements in patient outcome, although the trend toward improved neurologic outcome at 3 months persisted. When the methods of this study are compared to other enhanced enteral protocols, it is important to note that the intervention group did not actually receive VBF, but rather hypocaloric nutrition that was gradually increased over several days to a maximum of just over 70% of goal calories by day 5-6.24

Discussion and Clinical Implications

There are inadequate randomized data to provide strong evidence for how much nutrition should be provided in the early stage of critical illness to optimize patient outcome. The randomized studies that are available suggest that the data from observational studies are misleading.26-29 The topic of early nutrition adequacy in the medical ICU has been highlighted as one of the notable occasions where observational data have led to clinical recommendations demonstrated to be incorrect by randomized data.30 It would seem reasonable that we should first know the proper timing and amount of nutrition to provide to critically ill patients for best outcome before expending time and energy implementing protocols aimed at maximizing early nutrition delivery.

One of the problems faced when attempting to evaluate the studies of enhanced enteral feeding in critically ill patients is that all of the investigations enrolled a relatively small number of patients.11,15-24 These studies all lack the statistical power to provide confidence that any significant outcome difference between groups did not occur by chance alone.11,15-24 The varied populations, as well as the different methods used to enhance enteral delivery in the various studies, do not lend these data to a scientifically valid meta-analysis; even if all of the studies were combined, it is likely that there are too few patients studied to allow adequate analysis of outcomes such as mortality in a mixed medical, surgical, and trauma ICU population.

However, what is striking is that the studies that were successful in increasing early calorie delivery not only reported no improvement in patient outcome, but in some cases may have actually caused net harm.15,16 Considering that the intensive EN groups also received other interventions, such as nurse or MD education programs and daily monitoring of nutrition delivery, it would be reasonable to expect improved outcomes from staff education and close monitoring alone. The single study that actually provided full calories (the intensive EN group received 25.4 kcals/kg) had to be stopped for patient safety due to increased mortality.15 The studies that reported no harm, or some improved outcomes, generally were slower to meet calorie goals, had less compliance with VBF (or no VBF), and generally provided hypocaloric feeding to most patients, even in the “intensive EN” group.23,24 While it is quite possible that meeting full calorie expenditure in the earliest stages of critical illness may itself have detrimental effects, it is also possible that the negative effects reported in VBF studies are related to the method used to increase calorie delivery.

One topic that has not received adequate attention is the potential effect of VBF on glucose variability. Increased variability of serum glucose is associated with compromised outcomes in the ICU, and has been reported to be potentially more important to good ICU outcomes than the absolute glucose values.31 We know that providing full calories increases insulin requirements in critically ill patients, compared to hypocaloric feeding.27 No study has provided data about glucose variability during VBF, but accelerating feeding rates immediately after feeding is held, especially while providing full calories, may be a risk for increased glucose variability.32 Glucose variability with VBF is potentially more likely now that many facilities have abandoned insulin infusion protocols with hourly serum glucose checks in favor of basal-bolus protocols with less frequent glucose monitoring.

It is possible that full calories, or even VBF, may be helpful for some patients, but detrimental to others. The average BMI for the studies of intensive EN (see Table 2) demonstrates that, on average, the patients in these studies were overweight, and a number were likely obese. Obese patients appear to benefit from hypocaloric, full protein feedings, even while critically ill.33 Obese patients in the intensive EN groups of some studies would have exceeded the recommended calorie goals for obese critically ill patients.25

It is also possible that very malnourished patients who receive early full calories may experience detrimental effects. One study has demonstrated that failure to decrease calorie provision in patients who are experiencing refeeding hypophosphatemia may increase mortality.34 However, after refeeding syndrome has resolved, it is possible that patients with more severe malnutrition may benefit from efforts to improve nutrition delivery. Those patients with decreased BMI and minimal fat stores, patients who require extended ICU stays, or those who require repeated surgical interventions may benefit from efforts to minimize accumulated nutritional deficits. The initial days of critical illness, with unavoidable catabolism and increased insulin resistance, may also be the wrong time to enforce full calorie provision. It is possible that enhanced EN protocols may have benefits once the initial phase of critical illness has subsided and patients are capable of recovering from catabolism and regaining lean muscle mass.

More recently, additional associations from observational studies have been used to suggest that nutritional adequacy is critical only for those more severely ill patients (increased NUTRIC score).35 It is important to remember that the NUTRIC score is based on data only from observational studies. Analysis of data from patients randomized to receive reduced-calorie feeding has demonstrated that the NUTRIC score is not a valid indicator of patients who benefit from full calorie nutrition support.36

CONCLUSIONS

There is insufficient evidence to support intensive EN protocols in the early portion of critical illness in adult patients. Furthermore, VBF that meets full calorie expenditure within the first several days of critical illness may have negative effects and may even increase mortality in some populations. The best available data from randomized studies of hypocaloric feeding have demonstrated that the associations described in observational studies are not cause and effect: a modest calorie deficit in the early portion of critical illness does not compromise patient outcome, even in those patients with an increased NUTRIC score. Early VBF that meets calorie goals may be detrimental for some critically ill adult patients, and in the absence of data showing clear benefit, intensive, early VBF EN protocols should be avoided in routine use, especially in patients above their ideal weight. Additional research is required to determine whether a more gradual increase in calories, a focus on nutrition adequacy after the most critical stress has passed, or a focus on protein adequacy may have outcome benefits. There is a need for studies that focus on increasing nutrition provision in patients who are underweight, malnourished, or facing extended ICU admissions.


Nutrition Issues In Gastroenterology, Series #177

Got Lactase? A Clinician’s Guide to Lactose Intolerance


Intolerance of lactose-containing foods is a very common condition that usually arises as a result of a genetically programmed decline in the enzyme lactase. Here we discuss the management of lactose intolerance and the challenge of maintaining proper nutrition status – particularly as it pertains to calcium and vitamin D. We provide a variety of additional tools available to help address symptoms. Further work is needed to evaluate the role of lactose and dairy foods in other GI conditions, as well as on methods to avoid unnecessary dietary restriction.

Meagan Bridges, RD, Nutrition Support Specialist, University of Virginia Health System Charlottesville, VA

INTRODUCTION: WHAT IS LACTOSE INTOLERANCE?

Lactose intolerance is a clinical syndrome in which lactose ingestion causes symptoms such as abdominal pain, bloating, flatulence, and diarrhea due to lactose malabsorption. Lactose (milk sugar) is a disaccharide found in milk and milk products. Digestion and absorption require the enzyme lactase, which is found in the brush border of the small intestine. Lactase hydrolyzes lactose into the monosaccharides glucose and galactose, which can then be absorbed and used for energy.

Lactose malabsorption is most commonly caused by reduced lactase levels. Primary lactase deficiency is a genetically programmed condition resulting from the physiological decline of lactase activity after infancy. Other terms used to describe this condition include lactase nonpersistence, lactase insufficiency, and adult-type hypolactasia. It is estimated that about 68% of the world’s population is lactase nonpersistent, with wide variation across regions and ethnicities.1 In the U.S., 36% of the population is lactase deficient, with the highest prevalence found in individuals of African, Asian, Latin American, Native American, and South American descent.1

Not everyone with lactose malabsorption is lactose intolerant. In lactose intolerant individuals, unabsorbed lactose transits to the colon, carrying with it an increased osmotic load and serving as a ready substrate for the microbiome to ferment and produce short chain fatty acids and gas. This is what results in the classic symptoms of bloating, flatulence, borborygmi, abdominal pain, and diarrhea. Less often, it can present with nausea or constipation. Symptoms of lactose intolerance can range from mild to severe but generally do not occur until there is at least a 50% reduction in the lactase enzyme.2

There is also secondary hypolactasia, or secondary lactase deficiency, which can occur as a consequence of any condition that damages the brush border of the small intestine. Mucosal damage due to celiac disease, Crohn’s disease, or ulcerative colitis may result in a transient lactase deficiency. Small intestinal bacterial overgrowth, certain medications, infectious enteritis (e.g. giardiasis), radiation enteritis, gastrointestinal surgery, and short bowel syndrome may also result in either a reduction in absorptive capacity or a downregulation of lactase expression in the small intestine. Whereas primary hypolactasia is irreversible, secondary hypolactasia can often be reversed once normal intestinal mucosa is restored.

DIAGNOSIS

Lactose malabsorption can be diagnosed using various methods (Table 1). The hydrogen breath test involves ingestion of a standard dose of lactose, usually 20-50g (roughly 400-1,000 mL of cow’s milk), and measurement of breath hydrogen at 30-minute intervals. A diagnosis of lactose malabsorption can be made with a hydrogen level > 20 ppm within 3 hours of ingestion. This test is 78% sensitive and 98% specific for lactose malabsorption, but it is susceptible to both false positives (e.g., in the presence of small intestinal bacterial overgrowth) and false negatives (e.g., in the presence of non-hydrogen-producing bacteria).
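Restated as a simple rule, the diagnostic criterion above can be sketched as follows. The 30-minute sampling interval follows the text; note that some laboratories apply the 20 ppm threshold as a rise over the fasting baseline rather than as an absolute value, so this sketch is illustrative rather than a lab protocol.

# Hydrogen breath test rule as described in the text: positive if any
# breath H2 reading exceeds 20 ppm within 3 hours of the lactose dose.

def lactose_malabsorption_positive(readings_ppm, interval_min=30,
                                   threshold_ppm=20, window_min=180):
    """readings_ppm[0] is the fasting baseline; later values are taken
    every interval_min minutes after lactose ingestion."""
    for i, value in enumerate(readings_ppm[1:], start=1):
        if i * interval_min <= window_min and value > threshold_ppm:
            return True
    return False

# Example with invented values: baseline 5 ppm, peak 38 ppm at 120 min.
print(lactose_malabsorption_positive([5, 8, 12, 25, 38, 30, 22]))  # True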

Other tests aimed at detecting lactase deficiency are available, but they are rarely performed. Biopsies of the jejunum are often regarded as the gold standard for determining lactase activity and have the advantage of determining whether a patient may have secondary lactose malabsorption. This procedure, however, is highly invasive and not particularly reflective of intestinal lactase activity as a whole. Lactase deficiency may also be assessed by way of genotyping, but the available genetic tests do not account for all the possible polymorphisms resulting in lactase nonpersistence, nor are they useful for patients with secondary lactase deficiency.

As opposed to lactose malabsorption, lactose intolerance is much more difficult to ascertain. A presumptive diagnosis can be made in patients with symptoms that occur within a few hours after significant lactose ingestion (>2 servings of dairy/day or >1 serving in a single dose that is not associated with a meal), which resolve after 5-7 days of lactose avoidance.3

It is important to note that whereas lactase deficiency and malabsorption can be objectively measured, demonstration of lactose intolerance relies on subjective self-reporting of symptoms, which are very common even in the absence of lactose ingestion and are also highly susceptible to the placebo effect. Indeed, the few double-blind trials that have been conducted reveal a poor association between self-reported lactose intolerance and the occurrence of symptoms after lactose ingestion, even in patients with known lactase deficiency.4

SOURCES OF LACTOSE

Lactose is in virtually all milk and milk products (Table 2). By far, the highest concentration of lactose per serving is present in milk, ice cream and some yogurts, while cheeses generally contain much lower quantities of lactose. Lactose may also be found in other foods and beverages containing milk or milk products, including boxed, canned, frozen, packaged, and prepared items. Table 3 lists common types of these foods, as well as terms on the ingredient list that indicate whether a product contains lactose.

There are also certain medications that contain lactose, including over-the-counter pain relievers, multivitamins, and anti-diarrheal agents. Usually, the amount is so small (less than 0.5g) that it will not raise hydrogen levels, much less cause GI symptoms.

MANAGEMENT: LACTOSE RESTRICTION

The most common therapeutic approach to lactose intolerance involves limiting milk and milk products in the diet. Complete lactose avoidance is rarely indicated. Most blinded studies suggest that people with lactose intolerance can consume around 12g of lactose – roughly the amount found in one cup of milk – in a single dose with no or mild symptoms. When consumed with other foods and/or spread out in small amounts over the course of the day, up to 18g of lactose can generally be tolerated.5
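The thresholds above lend themselves to simple dose bookkeeping, sketched below. The per-serving lactose contents are rough approximations (actual values vary by product; see Table 2), and the 12g and 18g cutoffs are the general tolerance figures cited above, not individualized limits.

# Rough lactose-dose bookkeeping using the tolerance thresholds in the
# text (~12 g in a single dose, ~18 g/day spread out with other foods).
# Per-serving lactose values are approximations; real products vary.

LACTOSE_G = {"milk (1 cup)": 12.0, "yogurt (1 cup)": 8.0, "hard cheese (1 oz)": 0.5}

def within_usual_tolerance(largest_single_dose_g, daily_total_g):
    return largest_single_dose_g <= 12 and daily_total_g <= 18

# Example: a cup of milk at breakfast plus an ounce of cheddar later.
day_total = LACTOSE_G["milk (1 cup)"] + LACTOSE_G["hard cheese (1 oz)"]
print(within_usual_tolerance(12.0, day_total))  # True (12.5 g for the day)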

It may be beneficial for some patients to completely exclude milk and milk products for 2-4 weeks (or long enough for the remission of symptoms), and then gradually reintroduce dairy products up to a threshold of individual tolerance.6 An individual’s tolerable level depends on several factors, including the amount of lactose consumed at one time, residual lactase activity, ingestion with other foods and beverages, gut transit time, and the gut microbiome.

Nutrition Considerations

While limiting dairy may reduce lactose intolerance symptoms, it comes with certain risks, particularly as it pertains to osteoporosis and bone fractures secondary to inadequate calcium and vitamin D intake. As dairy remains the primary dietary source of calcium and vitamin D for the general population, several studies have pointed to a relationship between lactose malabsorption, low intake of dairy products, and reduced bone mass.7,8

The recommended daily intake for calcium is based on age and sex (Table 4). Patients with lactose intolerance should be assessed for dietary calcium adequacy and instructed to increase calcium intake from other foods if necessary (Table 5). They may also need to take calcium supplements, which come in a wide range of preparations and doses (Table 6). Calcium carbonate is the most common and least expensive form of calcium supplementation. It is best absorbed with a low-iron meal, but it may not be as effective in people who take proton pump inhibitors or H2 blockers. Calcium citrate can be taken with or without a meal and may be better suited for people with achlorhydria, inflammatory bowel disease, or absorption disorders. Calcium absorption is highest in doses ≤ 500 mg; amounts greater than this should be taken in divided doses.
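Since absorption falls off above roughly 500 mg of elemental calcium per dose, splitting a supplement target is simple arithmetic, sketched below. The 1000 mg/day example is an assumption; an actual daily target should come from the age- and sex-based recommendations in Table 4 minus dietary intake.

# Split a daily elemental-calcium supplement into doses of <= 500 mg,
# per the absorption guidance in the text. The 1000 mg/day example is
# an assumed supplement target, not a recommendation.
import math

def split_calcium_doses(daily_elemental_mg, max_per_dose_mg=500):
    """Return (number_of_doses, mg_per_dose) with each dose <= 500 mg."""
    n = math.ceil(daily_elemental_mg / max_per_dose_mg)
    return n, daily_elemental_mg / n

print(split_calcium_doses(1000))  # -> (2, 500.0): two 500 mg doses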

Through fortification, milk is also a major source of vitamin D, which is needed for calcium absorption. Few other foods naturally contain significant amounts of vitamin D, with the exception of fatty fish, liver, cheese, and egg yolks. Monitoring of vitamin D status and supplementation may be necessary, especially in patients with Crohn’s or celiac disease, or in other patients at additional risk for deficiency.

There is a growing array of plant-based (non-dairy) alternatives to cow’s milk, most of which are fortified to offer nutrients in amounts comparable to those found in cow’s milk, including calcium (300 mg per cup) and vitamin D (120 international units per cup). These beverages can be a viable alternative for those with lactose intolerance.9

OTHER MANAGEMENT OPTIONS

Exogenous Enzyme Supplementation

Exogenous lactase (obtained from yeasts or fungi) can be taken before or during dairy consumption to help hydrolyze lactose. These supplements commonly come in tablet form, and a dose of 6000-9000 units/meal is typically taken (Table 7). Liquid drops, which are not widely available in the U.S. but can be ordered online, may also be added directly to milk. Results of these products vary, and research thus far has been inconclusive regarding the efficacy of supplementation, which may depend on enzyme origin, residual endogenous lactase activity, dosage, the amount of lactose consumed, stomach pH, and bile salt concentration.10

Low-lactose or lactose-free milk, such as Lactaid®, is milk with added lactase enzymes that have pre-hydrolyzed the lactose. In recent years, there has been an increase in reduced-lactose or lactose-free dairy foods, to include not only milk, but also yogurt and ice cream (see Table 8). These products are readily available in most large grocery stores, but they are typically more expensive than their lactose-containing counterparts. Some store brands are beginning to carry their own products at a lower price.

Yogurt and Probiotics

Plain yogurt has been shown to be as effective as pre-hydrolyzed milk in reducing hydrogen production and intolerance via lactase-containing microorganisms.11 Sweet acidophilus milk, in which such bacteria are added to cold milk, is not as effective.12 Likewise, yogurts that contain milk or milk products added back after fermentation may still produce symptoms.

A related strategy involves probiotic supplementation with the goal of altering intestinal flora so that more lactic acid bacteria may salvage malabsorbed lactose and ferment it without excessive gas production. Some studies have shown that probiotic supplementation can lead to decreased hydrogen production and improved symptoms in lactose intolerant individuals.13,14 The full body of evidence, however, is insufficient to recommend this approach.

Colonic Adaptation

Although lactase expression cannot be up-regulated by the presence of lactose, it is thought that “tolerance” may be induced despite malabsorption by way of adaptive processes involving the gut microbiota and colonic function. Consecutive incremental doses of lactose, compared to dextrose, have been shown to reduce flatulence, but not abdominal pain or diarrhea.16 Current research, however, is not convincing enough to support incremental increases in lactose ingestion to treat symptoms of intolerance, as results have been variable and often confounded by the placebo effect.17

Slowing Gastrointestinal Transit

Co-ingestion of other foods has been demonstrated to improve lactose tolerance, possibly by way of delaying gastric emptying, slowing down intestinal transit time, and prolonging contact time with available lactase.15 It is thought that consuming full-fat milk versus low-fat or skim may have the same effect, but current research is inconclusive. Pharmacological agents such as loperamide can also slow GI transit, but they often come with significant side effects and/or high cost.

CONSIDERATIONS FOR PATIENTS WITH OTHER GI CONDITIONS

Observational studies point to an overlap between lactose intolerance and irritable bowel syndrome (IBS), as well as other gastrointestinal conditions that may result in secondary lactase deficiency, such as inflammatory bowel diseases, celiac disease, and small intestinal bacterial overgrowth (SIBO).6 Relationships, however, are confounded by several factors, including:

  • the transient nature of secondary hypolactasia
  • the preponderance of individuals with a genetic predisposition for primary hypolactasia
  • the symptom profiles of these GI conditions, which share many of the same attributes as lactose intolerance
  • the overlap with fermentable oligo-, di-, and monosaccharides and polyols (FODMAPs)
  • the subjectivity of self-reporting symptoms

Patients with a concomitant GI condition, whose diets may already be limited to some degree, should have suspected lactase deficiency confirmed before starting a low-lactose or lactose-free diet so as to avoid unnecessary restrictions. In patients with secondary lactose malabsorption, successful treatment of the primary disorder can lead to restoration of lactase activity. However, lactose intolerance may persist for months after healing starts.

Irritable Bowel Syndrome

Retrospective studies have shown that up to 85% of IBS patients with lactose malabsorption have improved symptoms when they restrict lactose in their diet.18 At the same time, prospective studies show that symptom improvement is highly susceptible to the placebo effect, and that lactose restriction alone is not sufficient for effective symptom relief in functional GI diseases.19 In IBS, lactose intolerance tends to be subsumed under a wider intolerance of FODMAPs and may not be directly related to lactase deficiency. Several randomized controlled trials indicate that IBS patients can benefit from a low-FODMAP diet that includes lactose restriction.20 However, the specific impact of lactose and lactose restriction on symptoms is difficult to assess.

Crohn’s Disease and Ulcerative Colitis

There are no viable studies to date that specifically examine lactose as a symptom mediator in inflammatory bowel diseases (IBD) such as Crohn’s and ulcerative colitis. Patients are often instructed to restrict lactose and dairy intake, but the prevalence of true lactose malabsorption is unclear, as GI manifestations of these diseases are often similar to those of lactose intolerance. Studies indicate that 40-70% of patients with Crohn’s self-report that they are lactose intolerant, but in most cases, lactose malabsorption appears to be driven more by ethnicity and genetic makeup, rather than by a direct association with the disease itself.21,22 The exceptions to this are Crohn’s patients with small bowel involvement, who do appear to be at higher risk for lactose malabsorption but not necessarily lactose intolerance.21

Celiac Disease

If the progression of celiac disease (CD) results in damage to the intestinal brush border, patients are likely to experience secondary lactase deficiency. Most people with CD can eventually rehabilitate their brush border and regain lactase activity within 6-12 months after following a gluten-free diet, assuming they do not also have primary lactase deficiency with lactose intolerance. In some cases, the villi and microvilli damage can take up to 2 years to heal completely.24

Small Intestinal Bacterial Overgrowth

IBS patients with lactose intolerance are more likely to also have small intestinal bacterial overgrowth (SIBO).25 Excessive bacterial fermentation of lactose with production of short-chain fatty acids and gas in the small bowel may particularly trigger abdominal symptoms. If during a hydrogen breath test, a patient complains of abdominal pain and has an early hydrogen peak (within 15-30 minutes of ingesting lactose), SIBO should be ruled out before attributing these symptoms to lactose intolerance.

CONCLUSION

With such a high prevalence of lactase deficiency worldwide and in the U.S., clinicians are more than likely to encounter lactose intolerance in clinical practice. There are many strategies to help patients manage their symptoms (Table 9). Complete elimination of dairy is likely not required; up to 12 g of lactose (about 1 cup of milk) is often well-tolerated and typically does not induce GI symptoms. Patients with lactose intolerance may need to be monitored for adequate calcium and vitamin D status and counseled on non-dairy sources and/or supplementation of these nutrients.


Gastrointestinal Motility And Functional Bowel Disorders, Series #25

Revisiting Achalasia and Esophageal Squamous Cell Carcinoma


After immediate endoscopic or surgical management of achalasia, many patients may not follow through with long-term care. Though less common than adenocarcinoma, squamous cell carcinoma of the esophagus can be associated with achalasia. This article highlights current care in diagnosis, treatment and possible long-term esophageal surveillance strategies.

Fernando Moran, MD, Cong Phan, Richard W. McCallum, MD, FACP, FRACP (AUST), FACG, AGAF, Professor of Medicine and Founding Chair, Division of Gastroenterology, Director, Center for Neurogastroenterology and GI Motility, Texas Tech University Health Sciences Center, Paul L. Foster School of Medicine, El Paso, TX

INTRODUCTION

Achalasia is a rare, chronic esophageal motility disorder with an estimated annual incidence of approximately 1 per 100,000 in Western populations. The disease can occur at any age, but the incidence increases with age.1 Achalasia results from progressive degeneration of ganglion cells in the myenteric plexus of the smooth muscle of the lower esophageal sphincter (LES) and the lower two-thirds of the esophagus, resulting in failure of relaxation of the LES accompanied by a loss of peristalsis in the distal esophagus.2 Predominant symptoms are dysphagia and regurgitation. Treatment is purely symptomatic, as the etiology of achalasia is still unclear, and aims at lowering LES pressure to improve the passage of food. Even after treatment there is continued aperistalsis and delayed transit, so sufficient symptom control does not prevent patients from having persistent retention of food and fluids in the esophagus. This is associated with degrees of bacterial degradation of the retained contents and impaired clearance of regurgitated acid gastric contents. These factors can result in chronic inflammation of the esophageal mucosa, which potentially increases the risk of development of hyperplasia, dysplasia, and esophageal cancer. In addition, lowering of LES pressure does facilitate chronic gastroesophageal acid reflux, which in a small percentage of patients leads to Barrett’s metaplasia and adenocarcinoma.1 Currently there are no specific guidelines for cancer surveillance in the long term follow up of patients with achalasia.

Case Report

A 58 year-old Caucasian male presented with dysphagia. He had a history of heavy alcohol use (four drinks daily for 35 years). Previously diagnosed with achalasia, he underwent pneumatic balloon dilation in 2012, complicated by an esophageal perforation requiring an open repair and myotomy without any accompanying fundoplication. After surgery he experienced constant reflux but no achalasia symptoms, and was started on proton pump inhibitor therapy immediately after surgery. In July 2017 he noticed a 20-pound weight loss and progressive dysphagia to solid food, and was unable to tolerate anything but a pureed diet.

On physical examination the patient had facial thinning, firm hepatomegaly and scoliosis. No Virchow’s node was palpable in the neck. His laboratory evaluation was unremarkable, including albumin and hemoglobin. Liver enzymes were also within normal limits.

Barium swallow with a 13 mm barium tablet revealed a tight stricture with a suggestion of “shouldering” in the proximal esophagus, 15 cm proximal to the gastroesophageal (GE) junction, and delay of the barium tablet at the stricture (Figure 1). Distal to the stricture there were no radiographic findings of achalasia. Upper endoscopy revealed that the upper third of the esophagus was normal. A stricture was found 25 cm from the incisors and the endoscope would not pass (Figure 2). Savary dilation was performed at 7 mm, 9 mm, 11 mm and 14 mm. The endoscope could then traverse the stricture after dilation. The stricture extended from 25 to 35 cm from the incisors. Its mucosa was nodular, friable, irregular and polypoid, suspicious for esophageal cancer (Figure 3). Biopsies of stricture showed moderately differentiated squamous cell carcinoma. The biopsies of the esophagus distal to the stricture showed changes of reflux esophagitis but no Barrett’s esophagus.

Subsequent computed tomography (CT) imaging of the chest revealed a circumferential, mass-like thickening of the proximal esophagus, approximately 7.5 cm in length. There was a loss of the fat plane with the aortic arch, proximal ascending aorta, lower trachea and left mainstem bronchus, concerning for tumor infiltration. There were no pulmonary nodules but a couple of small mediastinal lymph nodes were noted (Figure 4).

The patient was referred for chemotherapy and radiation. The esophageal stricture was re-dilated to a 17mm diameter in preparation for the initiation of radiation treatment. In addition, a percutaneous gastrostomy tube was placed to ensure that adequate nutrition was maintained through the treatment course.

Discussion

Esophageal cancer is a very infrequent complication in the long term follow up of achalasia. Among large case series, its frequency ranges from 0.4% to 9.2%.3 One review found that the prevalence of esophageal cancer in achalasia was 3% over long term follow up (five to 20 years), corresponding to a 50-fold increased risk.4 Most cases of esophageal cancer in patients with achalasia are squamous cell carcinomas located in the middle third of the esophagus. It is proposed that, although patients improve symptomatically with medical or surgical therapies, continuing stasis of food in the esophagus promotes lactic acid production and fermentation, inducing slow and continuous damage to the esophageal mucosa.5,7 Conversely, adenocarcinoma may occur after treatment for achalasia, almost invariably arising from Barrett’s esophagus due to longstanding gastroesophageal reflux.7 Alcohol use also places patients at higher risk for squamous cell cancer. In our patient, a combination of the factors described above along with a long history of alcohol abuse may have been the main triggers for squamous cell carcinoma.

Currently there are no guidelines for monitoring for squamous cell carcinoma or other late complications such as esophageal and peptic stenosis or megaesophagus.6 Whether surveillance endoscopy should be generally recommended for all patients with esophageal achalasia remains controversial because of the long interval between the initial symptoms and diagnosis of achalasia and the development of carcinoma.8 Studies have indicated an interval of at least 15 years between the diagnosis and treatment of achalasia and the diagnosis of esophageal cancer.9,10 Opponents of surveillance contend that, even under surveillance, mortality from esophageal cancer in achalasia patients resembles that of the general population, with a survival rate of 40% at two years after diagnosis,1 while similar surveillance programs for Barrett’s esophagus improve survival to 73-85% within two years of diagnosis.11 Another consideration is the large cost of such surveillance programs. One 1995 study estimated that about 732 endoscopic procedures were needed to detect three cancers over a 15-year study period, costing $585,000, or an average of $195,000 per cancer detected.12 This contrasts with $31,000 in a similar adenocarcinoma surveillance program for patients with Barrett’s esophagus.12 On the other hand, proponents of surveillance argue that without strict endoscopic surveillance, esophageal malignancies will be detected very late and at an advanced stage, in part because residual dysphagia from recurrent achalasia can mimic, and thereby mask, the symptoms of esophageal cancer.13 In a 2016 study, Ota et al. performed annual follow-up endoscopies in 32 patients over a mean period of 14 years (range 5-40 years) after successful surgical treatment of achalasia. They were able to diagnose esophageal cancer at an early stage in 6 of the 32 patients. All six patients were alcohol drinkers and three were smokers.14 This suggests that follow-up endoscopy with biopsy is important for early cancer diagnosis, as the risk of malignant transformation persists even after successful achalasia treatment.

Overall, we suggest developing a risk assessment profile for individual achalasia patients based on the type of treatment they initially underwent, esophageal pH data, previous endoscopic biopsy results, barium swallow findings, and other known independent risk factors for squamous cell carcinoma such as age, alcohol use, tobacco use, and male gender. In high risk patients, endoscopy with biopsy should be considered beginning within five years after diagnosis and possibly every three to five years thereafter.

Take Home Messages

When following patients who have been treated for achalasia, the initial treatment is key. After pneumatic dilation, recurrence of dysphagia and repeated pneumatic dilations over the next five to 20 years are to be expected. With a successful myotomy and partial fundoplication there should be minimal or no recurrence of the achalasia.

However, when the myotomy surgery is incomplete, typically because the myotomy is not extended at least 2cm into the stomach, recurrence of “achalasia” dysphagia will occur within the first one to two years. When no fundoplication accompanies the Heller myotomy, the complications of long term reflux are also in the equation, specifically a peptic stricture as witnessed in our case. Finally, there remains a background incidence of squamous cell esophageal cancer in all patients, particularly with increasing age, heavy alcohol intake and/or cigarette smoking. The bottom line is to be aware of the different possibilities in the long-term follow-up of achalasia patients.


Frontiers In Endoscopy, Series #44

Radiofrequency Ablation for Biliary Disease


Biliary RFA has emerged as a viable method for the ablation and palliation of biliary strictures, and studies suggest that endobiliary RFA is a safe and feasible technique. In this article we discuss important issues for the clinician to be aware of: adverse events common to endobiliary RFA, the effect of chemotherapy on survival, plastic vs. metal stents, and the continued investigation to further elucidate the efficacy of endobiliary RFA in the management of biliary strictures.

Patrick Powers, MD, Douglas G. Adler, MD, FACG, AGAF, FASGE, University of Utah School of Medicine, Gastroenterology and Hepatology, Salt Lake City, UT

INTRODUCTION

Biliary obstruction remains a common clinical complaint, and strategies to treat it, such as ERCP, have been widely employed. The majority of patients with biliary obstruction harbor an underlying malignancy, most commonly pancreatic adenocarcinoma or cholangiocarcinoma. Hepatocellular carcinoma, metastatic masses or adenopathy, gallbladder carcinoma, and ampullary carcinoma are also seen frequently.1 Surgical resection is the treatment of choice and, when possible, leads to better long term outcomes.2,3 However, because of the insidious and progressive nature of these malignancies, presentation and diagnosis often occur late in the disease and most cases are surgically unresectable. Therefore, palliative relief of biliary obstruction has become the standard of care and has been shown to improve quality of life.3,4

The development of self-expanding metal stents (SEMS) has been shown to give symptomatic relief and offers improved quality of life. However, tumor ingrowth, overgrowth, and stent occlusion from sludge or stones can cause stent failure, often at a median time of 6-8 months.5,6,7 While most stent occlusions are treated by placement of a new stent within the old stent, interest in other approaches has resulted in the development of endoscopic strategies to relieve stent occlusion, including photodynamic therapy (PDT) and radiofrequency ablation (RFA).

In PDT, a photosensitizer is administered 48 hours prior to the procedure and is taken up by tumor tissue. An ERCP-directed laser fiber is passed across the malignant stricture, and apoptosis is induced via specific light wavelengths. PDT has demonstrably improved both quality of life and outcomes in malignant biliary obstruction. Because the photosensitizer may be preferentially absorbed by malignant cells, PDT may have the advantage of causing relatively little damage to nearby non-malignant cells.8,9,10,11

Radiofrequency Ablation

Radiofrequency ablation utilizes an electric current to induce thermal coagulative necrosis of localized tissue. The circuit is composed of either a monopolar probe coupled with a dispersive electrode placed on the skin of the patient, or a local bipolar probe. The current alternates between electrodes at the frequency of radio waves, typically 400-600 kilohertz (kHz). Tissue is a poor electrical conductor, so as the current flows it heats the target tissue to 50-100 degrees Celsius.12 Heat causes protein denaturation, followed by cell dehydration and coagulative necrosis. The heat generated rises steeply with the local current density, so tissue nearest the probe experiences the greatest rise in temperature. Necrosis causes eventual dehydration and charring with loss of the conducting ions within the cells. This leads to a rise in impedance, reducing the depth to which RFA can penetrate.12,13,14 One strategy to reduce this impedance is pulsed RFA, which allows the tissue to cool and rehydrate between pulses, allowing the current to travel deeper into the tissue. Another strategy is to use internally cooled RFA probes. In this case, the goal is to reduce the heat gradient between the probe and the tissue while maintaining current, thereby reducing charring and allowing deeper penetrance of adequate heat to induce coagulative necrosis.14
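
As a rough illustration of why heating is concentrated at the probe, consider the standard first-order model of Joule heating (a textbook simplification, not taken from this article; the idealized spherical-electrode geometry is an assumption):

```latex
% Joule heating per unit volume: Q (W/m^3), J = current density (A/m^2),
% rho = tissue resistivity (Ohm * m)
Q = J^{2}\rho
% For an idealized spherical electrode carrying current I, the current
% spreads over a sphere of surface area 4 pi r^2:
J(r) = \frac{I}{4\pi r^{2}}
\quad\Longrightarrow\quad
Q(r) = \frac{\rho I^{2}}{16\pi^{2} r^{4}}
% Heating falls off as 1/r^4, so only tissue close to the electrode
% reaches the 50-100 degrees C needed for coagulative necrosis.
```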

RFA has also been proposed to stimulate systemic antitumor immunity. Theoretically, by exposing tumor antigen with modalities such as RFA, antigen-presenting cells can direct the immune system against targets that would otherwise be hidden.15,16 Further research into the field of synergistic immunomodulation with RFA is ongoing.

Percutaneous or intraoperative RFA has historically been quite successful in the management of solid tumors including those of the liver, breast, lung, and kidney. More recently, RFA has been used in the management of various gastrointestinal disorders as well, including Barrett’s esophagus, gastric antral vascular ectasia, and metastatic hepatocellular carcinoma.13 In 2011, Habib et al. first described a bipolar catheter that could be directed into the biliary tree via ERCP. Since its development, several groups have investigated the safety and efficacy of RFA in biliary obstruction. Its use has been described mostly in the management of malignant biliary obstruction, both before SEMS placement and after SEMS occlusion. Newer uses are being investigated as experience with RFA grows.

ENDOBILIARY RADIOFREQUENCY ABLATION PROCEDURE

The Habib EndoHPB catheter (Habib EndoHPB, EMcision Ltd, London, UK) is an 8Fr catheter with two 8 mm stainless steel electrodes separated by 6 mm. The proximal end is connected to an electrosurgical generator, where the power and duration can be adjusted. The biliary tree is accessed via standard ERCP. Opacification of the biliary tree is used to demarcate the stricture location and size, and a guidewire is directed across the stricture. The RFA catheter is then threaded over the guidewire. Energy is supplied by the generator at the desired specifications (typically 5-10 watts for 90-120 seconds, followed by a 60-second cooling period). After RFA, the biliary tree is swept with a balloon to remove debris. Depending on the size and location of the stricture, multiple, somewhat overlapping RFA applications may be required. This is usually followed by placement of a SEMS. If RFA is applied to an already occluded stent, several applications may be required.13,14
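
As a back-of-the-envelope check on these settings (illustrative arithmetic only; the wattages and durations below are simply the ranges quoted above, not device recommendations), the total energy delivered per application is power multiplied by time:

```python
# Illustrative only: energy delivered per RFA application at the settings
# quoted above (5-10 W for 90-120 s). Not device guidance.
def energy_joules(power_watts: float, duration_seconds: float) -> float:
    """Energy (J) = power (W) x time (s)."""
    return power_watts * duration_seconds

for watts, seconds in [(5, 90), (10, 120)]:
    print(f"{watts} W x {seconds} s = {energy_joules(watts, seconds):.0f} J")
# Output:
# 5 W x 90 s = 450 J
# 10 W x 120 s = 1200 J
```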

RFA FOR MALIGNANT BILIARY STRICTURES

In the vast majority of patients, RFA is used in the setting of a malignant obstruction, either prior to SEMS placement or in an occluded SEMS that has developed tumor ingrowth or overgrowth. (Figure 1)

RFA Prior to Stent Placement

The first group to examine the use of RFA prior to SEMS placement in humans was Steel et al. in 2011. In this study, a total of 21 patients underwent RFA for malignant biliary stricture caused by either cholangiocarcinoma or pancreatic adenocarcinoma. At 30 days there were no mortalities, and at 90 days 16 of the initial 21 patients were alive with patent stents. This paper introduced RFA as a relatively safe modality to reduce obstruction prior to SEMS placement.17

Four additional small studies from 2013 to 2015 also investigated the safety and feasibility of RFA in malignant biliary obstruction. In a retrospective study, Alis et al. investigated obstruction secondary to cholangiocarcinoma in 17 patients. Of this group, 7 did not have successful ERCP or endobiliary RFA, although the group did not describe the technical difficulties. Of the 10 remaining patients, 30-day mortality was 0%. Two patients developed post-ERCP pancreatitis. The authors concluded that RFA was a safe modality for malignant obstruction, although there was a high rate of technical failure.18 A pilot study by Figueroa-Barojas et al. looked at 20 total patients (11 with pancreatic adenocarcinoma, 7 with cholangiocarcinoma, 1 with intraductal papillary mucinous neoplasm and 1 with metastatic gastric cancer). All 20 patients underwent successful RFA, and complication rates were similar to published rates for ERCP with stenting alone. With 0% immediate and 30-day mortality, the group concluded that RFA was a safe and technically feasible technique for malignant obstruction.19 In a 2014 study of 12 patients, Tal et al. investigated the safety of RFA in hilar tumors (mostly Klatskin tumors, Bismuth type IV), followed by placement of plastic stents. RFA was technically successful in all cases, but there was a higher frequency of complications. Hemobilia occurred in three patients between 4 and 6 weeks: one case was spontaneous, and two occurred during removal of the plastic stent. Two of these cases were fatal, and one was successfully treated with a SEMS. 30- and 90-day mortality were 8.3% and 50%, respectively. Given the higher mortality and complication rates than in prior studies, the group reported RFA as a technically feasible technique, but one that required further investigation with large controlled trials to avoid fatal complications.20 In a 12-patient case series, Laquiere et al. investigated the safety of RFA in a relatively homogeneous group of patients with extrahepatic cholangiocarcinoma. Endobiliary RFA was technically successful in all cases. Within 30 days, there was one patient with sepsis thought to be secondary to bacterial translocation, and one patient with acute cholangitis secondary to stent migration. 30-day mortality was 0%. Patients were followed to 9 months, and of the 12 patients, 6 received a second RFA session: 3 due to acute stent obstruction with or without acute cholangitis, and 3 planned to prevent re-obstruction. This study demonstrated again that RFA was a safe technique for treating patients with malignant obstruction, and suggested that scheduled repeat sessions may also be effective at preventing biliary re-obstruction.21

In 2014, Dolak et al. published a retrospective, multicenter study of 58 patients undergoing a total of 84 RFA procedures to assess safety and feasibility.22 78 procedures were performed via ERCP and 6 via a percutaneous approach. The patient population had varying malignancies (51 with biliary tract cancer, 4 with pancreatic cancer, and 3 with hepatocellular cancer), and over 50% had received other treatment modalities prior to RFA. The majority had SEMS placed (35 of 58 patients), and 19 received plastic stents. 30-day mortality was 1.7%, and 90-day mortality was 19%. Within 30 days, there were 5 cases of cholangitis, three cases of hemobilia, two cases of cholangiosepsis, and one case each of gallbladder empyema, hepatic coma, and newly diagnosed left bundle branch block. The case of hepatic coma had a fatal outcome. One patient had a severe complication of liver infarction, thought to be secondary to thermal damage to a segmental branch of the hepatic artery. Median survival was extrapolated to 10.6 months after the RFA procedure and 17.9 months from initial diagnosis. Median stent patency was 115 days for plastic stents and 218 days for metal stents. This was the largest retrospective analysis of RFA to date, and while it again demonstrated RFA as a relatively safe option, the size of the cohort allowed the authors to suggest RFA as a way to improve survival and extend stent patency in malignant biliary obstruction.

While the above studies provide data regarding the safety and feasibility of RFA for malignant obstruction, several groups have investigated whether this procedure could impact stent patency and patient survival. To this end, Sharaiha et al. compared patients receiving RFA prior to SEMS placement with patients who received SEMS alone.23 Of 66 total patients, 26 underwent RFA prior to SEMS placement and were matched with 40 patients treated with SEMS alone. RFA was technically successful in all attempts. There were five adverse events in both groups, including 3 cases of abdominal pain, one of pancreatitis, and one of cholecystitis. There was no difference in adverse events between the two groups, and no significant difference in survival rates. However, multivariate Cox proportional hazards analysis showed that RFA was an independent predictor of survival, with a hazard ratio of 0.29 (95% confidence interval 0.11-0.76). The following year, the same group retrospectively studied 69 patients: 45 with cholangiocarcinoma, 19 with pancreatic cancer, 2 with gallbladder cancer, 1 with gastric cancer, and 3 with liver metastases.24 These patients all underwent RFA in addition to stent placement for malignant biliary stricture and were compared with patients from the Surveillance, Epidemiology, and End Results (SEER) database. In the study group, 78% had received prior or concurrent chemotherapy, and there was a mixture of plastic stents and SEMS. The authors described survival of 14.6 and 17.7 months for pancreatic cancer and cholangiocarcinoma respectively, significantly improved over SEER database survival rates of 5.9 months for pancreatic cancer and 6.2 months for cholangiocarcinoma. The authors concluded that their study suggested that RFA prior to stenting can improve survival. However, the study was limited by its retrospective nature and use of historical controls, necessitating larger prospective trials.
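
For readers less familiar with survival statistics, the hazard ratio reported by Sharaiha et al. compares instantaneous event rates between groups (the definition below is the standard one from a Cox model, not specific to their paper):

```latex
% Hazard ratio from a Cox proportional hazards model:
% h_RFA(t) and h_SEMS(t) are the instantaneous death rates at time t
HR = \frac{h_{\mathrm{RFA}}(t)}{h_{\mathrm{SEMS}}(t)} = 0.29
% i.e., at any given time, RFA-treated patients died at roughly 29% of
% the rate of SEMS-only patients (about a 71% relative reduction); the
% 95% CI of 0.11-0.76 excludes 1, so the association is statistically
% significant.
```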

Kallis et al. (2015) performed a similar study to assess the efficacy of RFA in biliary obstruction secondary to pancreatic adenocarcinoma.25 In this study, 23 patients receiving RFA prior to SEMS were matched with 46 controls receiving SEMS alone. 30-day mortality was 0%, and adverse events were limited to one case each of hyperamylasemia and antibiotic-responsive cholangitis. Of note, the two groups were stringently matched based on demographic characteristics and chemotherapeutic regimens. The group found a statistically significant difference in survival: median survival was 226 days in the RFA group vs. 123.5 days in the SEMS-only group. Multivariate analysis revealed that RFA afforded an early survival benefit, but this was lost after the 180-day mark. They also found that stent patency was not significantly improved with RFA, and that a majority of patients died from carcinomatosis rather than from complications of biliary obstruction. The authors concluded that while there was a survival benefit associated with RFA, they could not necessarily connect it to improved stent patency. Notably, only 9 of 23 patients in the RFA group and 14 of 46 patients in the control group reached the endpoint of stent occlusion before death. The high mortality rate of these malignancies makes correlation between patency and survival inherently difficult.

Dutta et al. published a study that also compared RFA with stenting alone.26 This was a smaller study, with 15 patients in the RFA arm and 16 receiving a stent alone, and it included various malignancies causing biliary obstruction. Median survival was 220 days with RFA vs. 106.5 days with SEMS alone. Multivariate analysis demonstrated a survival advantage with RFA independent of age, malignancy type, or obstruction site. Although many of the above studies were able to demonstrate statistically significant efficacy, they were limited by their retrospective nature and small sample sizes.

In an attempt to combine the data from several of these smaller studies, Sofi et al. performed a meta-analysis to evaluate survival and stent patency benefits, as well as complication rates, of RFA for malignant biliary strictures.27 Their study population included 505 patients from nine studies, encompassing both percutaneous and endoscopic RFA. The authors found that stent patency was significantly improved, by a mean of 50.6 days, in the RFA group. Survival was significantly better as well, with a median of 285 days in the RFA group versus 248 days in the controls. When evaluating adverse events, the RFA group was found to have significantly more episodes of abdominal pain, but all other common adverse events (cholangitis, acute cholecystitis, acute pancreatitis, and hemobilia) were not significantly different between the two groups. Sofi et al. concluded that while their pooled data were promising, they acknowledged the need for larger prospective trials to fully elucidate the efficacy of RFA and the correlation between stent patency and survival.

In a randomized, controlled, single-center prospective study, Yang et al. investigated the use of RFA in 65 patients with unresectable extrahepatic cholangiocarcinoma.28 The patients were randomized to RFA combined with plastic stenting or to stenting alone. Of note, patients receiving chemotherapy and those with Bismuth type III and IV lesions were excluded from the study. All patients received endoscopic ultrasound prior to intervention to fully characterize the lesions, and repeat RFA was performed at a mean of 6 months after the initial RFA. Adverse event rates were not significantly different between the two groups, and included 3 cases of acute cholangitis and one each of acute pancreatitis and hemorrhage. The authors found that stent patency was significantly improved with RFA (6.8 months vs. 3.4 months). Survival was also significantly improved in the RFA group, with a mean of 13.2 months vs. 8.3 months in the stent-only group. The authors concluded that RFA prior to stenting can significantly improve both survival and stent patency in extrahepatic cholangiocarcinoma.

RFA vs PDT Prior to Stent Placement

In the only study to date to directly compare PDT with RFA prior to stent placement, Strand et al. performed a retrospective analysis of 48 patients with unresectable cholangiocarcinoma.29 Of these patients, 16 received RFA and 32 underwent PDT, followed by either plastic stent or SEMS placement. The PDT group received primarily plastic stents (90.6%) whereas the RFA group received primarily uncovered SEMS (68.7%). Tumor type and the percentage of patients receiving chemotherapy were not significantly different between the two groups. Median survival was 9.6 months in the RFA arm and 7.5 months in the PDT arm, a difference that was not significant. In general, adverse event rates were comparable in the two arms, although stent occlusion and cholangitis were significantly increased in the RFA arm. The authors concluded that while their study was limited by its retrospective nature and small sample size, it suggested that RFA and PDT are comparable methods for the palliative treatment of unresectable cholangiocarcinoma. They also discussed the relative advantages and disadvantages of the two methods. While PDT may preferentially target malignant cells, it requires pre-treatment with a photosensitizer and subsequent avoidance of sun exposure for at least 48 hours. Continued comparison of these two methods will require further large, prospective trials.

RFA for Occluded SEMS

Pozsar et al. were the first to report the use of RFA for occluded biliary SEMS in 2011.30 In this small trial, 5 patients with SEMS occluded by cholangiocarcinoma were treated with RFA. In all patients, biliary obstruction was successfully relieved without complication. The median stent patency after RFA treatment was 62 days, and the authors concluded that RFA was a relatively safe method for treating occluded SEMS. In a larger trial, Kadayifci et al. investigated the use of RFA in 50 patients with occluded SEMS.31 In their study, patients with occluded SEMS were split into two groups. The first group had a new plastic stent placed within the occluded stent. The second group had RFA applied to the occluding tissue, without subsequent stent placement. In the RFA group, only 56% had successful ablation, defined as removal of >80% of the obstruction. In the remaining 44%, RFA was considered to have failed, and plastic stents were placed. Analysis found that pancreatic cancer and distal obstruction were predictive of RFA success. Survival rates were not significantly different between the two groups. However, stent patency was significantly improved in the successfully ablated group compared with stenting alone. The authors concluded that while RFA success was dependent on tumor type and location, when >80% ablation was achieved there was a demonstrable improvement in stent patency compared to re-stenting alone.

Percutaneous Intraductal RFA

While not the focus of this review, it should be noted that percutaneous intraductal RFA has been described in the literature as a viable tool. Several studies have examined its use in malignant biliary strictures, and in general these groups have shown the technique to be safe and feasible.32,33,34,35 Larger trials, such as those by Wu et al. and Cui et al., have demonstrated a significant improvement in both stent patency and survival.36,37 Their outcomes were comparable to those found in the endoscopic RFA trials discussed above.

OTHER USES FOR ENDOBILIARY RFA

Ampullary Adenoma with Intraductal Extension

Ampullary adenomas are typically treated successfully with endoscopic or surgical resection. However, when the adenoma extends into the biliary tree itself, it often presents a therapeutic challenge for clinicians. (Figure 2) Suarez et al. investigated the use of RFA for the treatment of ampullary adenomas with intraductal extension in a 4-patient case series.38 Three of the patients had benign ampullary adenomas, and the fourth had an ampullary adenoma with a focus of adenocarcinoma. All cases were technically feasible, and aside from a delayed biliary stricture in one patient, there were no immediate adverse events. Those with benign adenomas had no recurrence of ampullary lesions, but the patient with adenocarcinoma developed overt invasive ampullary cancer and died of the disease, demonstrating the limitations of this technology. Rustagi et al. investigated the use of RFA in ampullary neoplasms in a slightly larger trial of 14 patients.39 The group had mixed pathology, including two adenocarcinomas and a mix of tubulovillous and tubular adenomas. Treatment strategies were also varied, with half receiving RFA only, and half receiving RFA in combination with argon plasma coagulation, PDT, or electrocoagulation. Despite the heterogeneous nature of the cohort, the authors reported successful eradication of disease in 92% of patients after a 16-month median follow-up period. Adverse event rates were high at 43%, the majority of which were benign common bile duct strictures. These studies demonstrate that RFA has potentially beneficial applications in the management of ampullary lesions, but more robust, standardized trials are needed to further elucidate its clinical utility.

Benign Biliary Strictures

As several groups reported the safety and efficacy of RFA in malignant biliary strictures, this technique was proposed by some as a means to manage benign strictures as well. Benign strictures are fairly heterogeneous in nature, resulting from multiple etiologies including liver transplant, pancreaticobiliary surgery, and any condition causing chronic inflammation. The majority result in stiff, fibrotic tissue, histologically far different from malignant strictures. Although typically managed by dilation and multiple stenting procedures, up to 40% of strictures are refractory to current endoscopic techniques. Hu et al. investigated the use of RFA in 9 patients with benign biliary strictures.40 All patients had successful RFA followed by balloon dilation. No serious adverse events occurred, and all patients achieved immediate improvement or resolution of their strictures. Follow-up times varied, but at a median of 12.6 months, 4 patients had achieved full stricture resolution without need for further stenting. The authors concluded that RFA was a feasible and safe method for benign biliary strictures, but refrained from drawing conclusions on efficacy given the relatively small and heterogeneous study design. Swine models are currently being developed, and further studies are needed to investigate the use of RFA in benign biliary strictures.

CONCLUSION

Biliary RFA has emerged as a viable method for the ablation and palliation of biliary strictures. Data thus far have been largely limited to retrospective analyses, but the results of these studies suggest that endobiliary RFA is a safe and feasible technique. In unresectable malignancy, RFA may confer both survival and stent patency benefits, but further investigation is required before definitive statements can be made. Clinicians should be aware of the adverse events common to endobiliary RFA, including abdominal pain, cholangitis, acute cholecystitis, stricture formation or worsening, and hemobilia. As experience with endobiliary RFA grows, further questions remain, including the effect of chemotherapy on survival and how plastic vs. metal stents may alter stent patency duration. Continued investigation with large, randomized, controlled, prospective trials is required to answer these questions and further elucidate the efficacy of endobiliary RFA in the management of biliary strictures.


A Case Report

Gastrointestinal Bleeding Due to a Post Transplant Lymphoproliferative Disorder: A Complication of Renal Transplant


Post-transplant lymphoproliferative disorder (PTLD) is a rare complication after solid organ or stem cell transplant, thought to be related to the immunosuppressive medications used in this patient population. Although the gastrointestinal tract is a common location for proliferation, gastrointestinal bleeding (GIB) is rare in PTLD. We present a patient with a history of renal transplant, maintained on prednisone, tacrolimus and azathioprine, who presented with GIB and was found to have multifocal jejunal B cell lymphoma. Outcomes in PTLD treated with immunosuppression reduction and chemotherapy are promising, with response rates of 82-90%. Therefore, post-transplant care must include careful surveillance for PTLD.

Marianna Mavilia, DO,1 Leon Averbukh, DO,1 Michael Einstein, MD2 1University of Connecticut, Department of Medicine, Farmington, CT; 2Hartford Hospital, Department of Gastroenterology and Hepatology, Hartford, CT

INTRODUCTION

Post-transplant lymphoproliferative disorder (PTLD) is any proliferation of lymphocytes occurring after solid organ or allogeneic stem cell transplant due to immunosuppression (IS) therapy.1 The most common extranodal location for PTLD is in the gastrointestinal tract.2

On average, PTLD occurs in 1-3% of renal transplant cases, though percentages vary based on the patient’s age, type of transplant, and IS regimen.2 There is a higher frequency of PTLD among patients transplanted for autoimmune etiologies,1 thought to be due to the more intensive immunosuppression regimens used in this population.1 The presentation of PTLD is variable. B symptoms are seen in about 49% of cases.2 The initial presentation may be complications such as mass effect of tumor, gastrointestinal obstruction, perforation, or gastrointestinal bleeding (GIB).2 GIB is a rare presentation, with only a few case reports to date in the literature. PTLD was initially thought to occur within 1 year of transplant; however, studies have shown that more than 70% of cases occur later.1,3 Here we present a case of multifocal small bowel B cell lymphoma occurring 16 years after renal transplant.

Case Description

A 48 year-old female with a history of end stage renal disease secondary to membranous glomerulonephritis, status post living donor renal transplant 16 years prior, presented to the emergency department with generalized abdominal pain and fatigue. She reported intermittent dark-colored stools, generalized weakness, and a 10-lb weight loss in the preceding three months. She denied use of nonsteroidal anti-inflammatory drugs, alcohol and tobacco. The patient was on prednisone 5 mg daily, tacrolimus 3 mg twice daily, and azathioprine 50 mg daily for IS. She had no significant family history. Physical exam was benign except for positive occult blood on rectal exam. Laboratory workup revealed a hemoglobin of 7.4 g/dL, decreased from her baseline of 10 g/dL three months prior. She also had mildly elevated BUN and creatinine at 30 mg/dL and 2.5 mg/dL respectively, from a baseline creatinine of 1.4-1.6 mg/dL. Renal ultrasound with Doppler was normal. The patient underwent esophagogastroduodenoscopy and colonoscopy, which demonstrated only mild gastritis. Further workup with capsule endoscopy was positive for an ill-defined lesion in the proximal jejunum. Based on these results, the patient underwent push enteroscopy and was found to have an area of patchy inflammation with congestion and shallow ulcerations in the proximal jejunum (shown in Figure 1). Biopsy of this lesion showed focal ulceration and necrosis of the mucosa with an underlying cellular infiltrate comprised mainly of large abnormal cells with scattered small lymphocytes and eosinophils (Figure 2A). The neoplastic cells were positive for CD20 (Figure 2B), CD79a (Figure 2C), and PAX-5 (Figure 2D), and expressed monotypic lambda light chains (Figure 2E). The Ki-67 proliferation index was elevated at greater than 90%. The specimen was weakly positive for CD30 and negative for CD15. Due to strong suspicion for Epstein-Barr virus (EBV)-positive mucosal ulcers, EBV testing was conducted with both in situ hybridization for EBER and immunohistochemistry for EBV-LMP; both were negative. Taken together, the above findings were consistent with monomorphic PTLD with features of diffuse large B-cell lymphoma.

The patient was subsequently referred to oncology for staging, where a bone marrow biopsy was found to be normocellular with non-neoplastic lymphocytes on flow cytometry. Positron emission tomography/computed tomography (PET/CT) demonstrated multifocal areas of intense hypermetabolism involving the GI tract, including the distal gastroesophageal junction and multiple foci within the jejunum (Figure 3). There was no FDG-avid lymphadenopathy. Although some bowel uptake can be physiologic, these findings were suggestive of multifocal small bowel disease given her biopsy-proven lymphoma in the jejunum.

Our patient’s IS was minimized, with reduction of tacrolimus to 1 mg twice daily and discontinuation of azathioprine. She was treated with six cycles of R-CHOP (rituximab, cyclophosphamide, doxorubicin, vincristine and prednisone). After initiation of treatment, the patient continued to have evidence of GIB with melena and a slowly dropping hematocrit. Her blood counts were monitored closely and she was transfused as needed. PET/CT performed after 3 cycles of chemotherapy revealed resolution of the jejunal lesions and no new areas of abnormal FDG uptake. Her blood counts stabilized as therapy progressed.

DISCUSSION

GIB in a post-transplant patient is a complicated issue. With renal transplant specifically, renal dysfunction alone increases risk of GIB 8.7-fold, as uremia and nitric oxide accumulation contribute to impaired platelet function.4 Additionally, immunosuppressive agents can lead to GIB. Steroids and tacrolimus can impair gastrointestinal epithelium, thus predisposing patients to ulceration or other mucosal injury.5 Other agents, such as azathioprine, can cause thrombocytopenia.6 As demonstrated in this case, IS also predisposes patients to the development of PTLD which can cause GIB.

The pathogenesis of PTLD is not completely understood. EBV, known to provoke malignant transformation of B cells, is a major risk factor for the development of PTLD, accounting for more than 70% of cases.3 In the absence of EBV, as in our patient, the pathogenesis is not well understood. This patient was on an intensive IS regimen with three agents, which subjected her to increased risk for PTLD. IS, used for its inhibition of T cell function to prevent graft rejection, also allows for uncontrolled lymphoproliferation with potential for malignant transformation.

Mortality rates for PTLD range from 50-70%,7 with a five-year survival rate of 53-59%.3 Those with EBV positivity carry an unfavorable prognosis.9,10 Our patient possessed features thought to be good prognosticators, including female gender and the inclusion of tacrolimus or azathioprine in the IS regimen;7 these specific agents are associated with a more favorable prognosis compared with other IS regimens.

Treatment of PTLD requires a balance between eliminating the malignancy and preserving the graft, with the key being dose reduction of IS.7 This allows re-engagement of the natural anti-tumor actions of the immune system. However, IS reduction alone is effective in only 10% of cases. Rituximab, alone or combined with CHOP, is an alternative in high-risk cases.7 R-CHOP has a response rate of 82-90%.7 Given this favorable response to treatment, early identification of PTLD is key. With the increasing number of solid organ transplants, it is important to consider PTLD as a potential complication in this population.

Acknowledgements

A special thank you to Dr. Hassan Dalal of the University of Connecticut Department of Pathology for providing pathologic figures for this case.


Nutrition Issues In Gastroenterology, Series #176

Blenderized Feeding Options - The Sky’s the Limit


The use of blenderized tube feeding (BTF) continues to increase in popularity among people of all ages, in the United States and across the globe, and will likely continue to grow as more studies are published. Clinicians have a responsibility to include assessment of BTF use in all home enteral nutrition (HEN) patients in order to provide guidance for appropriate recipe development and monitoring. This article is intended as a guide to empower clinicians to aid patients in the use of BTF.

The use of blenderized tube feeding (BTF) continues to increase in popularity among people of all ages, in the United States and across the globe. BTF is the process in which foods and liquids are blended together and given via an enteral feeding tube. It may be used in place of, or in combination with, commercially available enteral formulas. Commercial enteral formulas (CEF) contain precise amounts of micronutrients and macronutrients and are prepared in a sterile fashion, unlike BTF, which is usually prepared at home in the family kitchen.

Lisa Epp, RDN, CNSC, LD, Assistant Professor of Nutrition, Mayo Clinic College of Medicine and Science, Division of Endocrinology, Mayo Clinic, Rochester, MN

INTRODUCTION

Home enteral nutrition (HEN) refers to tube feeding given in the home setting. A publication by Mundi et al. reported that more than 400,000 people (189,036 pediatric and 248,846 adult patients) were receiving HEN in the United States as of 2013.1 This is a significant increase from previously reported numbers in 1992, when an estimated 152,000 people were on HEN.2 In recent decades, the standard process for HEN has been for a clinician to prescribe a CEF in the hospital, with the patient continuing its use at home.

In the last 10 years, consumer demand for “natural” and organic foods has increased, and the HEN population is no exception. One study suggests that as many as 55.5% of adult HEN users use BTF in varying amounts, and 90% expressed the desire to use BTF if given adequate information.3 Another survey, of 216 adult and pediatric Oley Foundation members, indicated the figure could be as high as 90% in some populations, especially among children.4 The Oley Foundation is a non-profit organization that supports people at home on parenteral nutrition and/or enteral nutrition (oley.org). Adults who responded to the Oley survey indicated 65.9% use BTF.4 In a survey of 433 parents of tube-fed children, 49.5% indicated they used BTF for their child.5 A concerning finding of this survey, however, was that only 50% of respondents used a nutrition professional to help create recipes.5 Given that only half of parents are getting assistance from a nutrition professional, clinicians have a responsibility to identify those who are utilizing BTF in order to provide support and ensure HEN patients are meeting their nutrition needs.

A recent study of 212 head and neck cancer patients demonstrated that many are using BTF even when it is not prescribed.6 In this cohort, 112 received CEF, 69 patients voluntarily switched to BTF with unknown ingredients, and 31 were prescribed BTF due to lack of health insurance coverage. The results showed that those using BTF did not receive adequate nutrition support and had a decrease in fat-free mass. One could argue that since these patients changed to BTF on their own, they did not have the clinical support or guidance needed to create nutritionally complete recipes with adequate calories. This is a prime example of why clinicians must be open to the use of BTF.

In the previously mentioned surveys, common reasons given for using BTF were:

1. “It’s more natural”
2. “I can tolerate it better”
3. “I like to eat what my family is eating”

Additionally, patients may prefer whole-food, organic, non-GMO, or allergen-free ingredients. Another reason for using BTF is CEF intolerance, such as reflux, constipation, diarrhea or fullness. Finally, some patients simply do not have insurance coverage or adequate funds for commercial enteral formulas. BTF has the potential to meet each patient’s nutrition needs in an individualized fashion.

Despite patients’ desire to use BTF, some health care professionals hesitate to support its use. A survey of registered dietitians showed that the use of BTF is largely patient/family driven, but 28% of registered dietitians surveyed felt they needed more information about using BTF in clinical practice.7 Clinical hesitations may include increased clinician time, the potential for increased microbial contamination,8-9 an increase in tube clogging, and variability in nutrition composition.

At this time there is a small body of evidence indicating that BTF may help with EN intolerance such as reflux, volume intolerance and bowel issues. With the reemergence of BTF, additional research is underway and this body of evidence is growing. In a study of 33 children, 52% of those given BTF had a reduction in gagging, 73% had a decrease in overall GI symptoms, and 57% had an increase in oral intake; no child had worsening symptoms.10 In another study, 10 children with short bowel syndrome were given formula with real food ingredients; 9 experienced an improvement in their stool habits and were able to wean off elemental formula.11 In a third study, 18 infants with diarrhea were randomized to BTF vs. semi-elemental formula, and those on BTF experienced improvement in diarrhea and weight gain compared to those on semi-elemental formula.12 Lastly, a recent publication showed that children who were given BTF had decreased vomiting along with an increase in the bacterial diversity of their stool.13 These studies were all done in the pediatric population; however, one study in adult patients included a group of 178 elderly individuals, 5 of whom were on enteral feeding via percutaneous endoscopic gastrostomy (PEG).14 Those with a diverse diet had the healthiest gut microbiome, while those on a single formula had increased frailty.

The remainder of this article is intended as a guide for clinicians wanting to help their patients utilize BTF if desired. Considerations before starting BTF:

  • Does your patient have a 14 French or greater size feeding tube?
    • Smaller tubes may work with thinner blends
  • Is the stoma mature in case the tube does become clogged?
    • A tube in a mature stoma can be changed more easily if needed
    • Some patients may benefit from BTF at initial tube placement; this should be considered on a case-by-case basis
  • Can your patient tolerate bolus feeding since food can only be held safely at room temperature for 2 hours?15
    • More information is needed on the use of BTF in post pyloric feeding tubes

Tools needed:

  • Commercial grade blender such as Vitamix® (vitamix.com), Blendtec® (blendtec.com) or Ninja® (ninjakitchen.com)
  • O-ring syringes (Figure 1)
  • Large bore gravity bags (Figure 2)
    • Feeding pumps may not work well due to decreased accuracy and motor failure
  • Straight bolus extension set for low profile tubes (Figure 3)
  • Bolee bag with bolink (Figure 4)
  • Nutrition professional involved
  • Plan for monitoring and evaluation

There are a variety of ways to develop BTF recipes, including using food exchanges, standard recipes, or the plate method with family meals. Table 1 shows a 500-calorie recipe that is easy to double, triple or quadruple as needed to meet estimated calorie needs with balanced macronutrients. Using food exchanges makes it easy to choose foods that are available in the home to create a variety of recipes. Tables 2 and 3 provide sample standard recipes of approximately 1000 and 1600 calories. Lastly, reviewing the MyPlate Daily Checklist (choosemyplate.gov)16 is a great way to determine the number of servings of each food group needed in the blend at a given calorie level.
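
As a simple illustration of scaling a base recipe to a calorie goal (a minimal sketch; the ingredients and amounts below are hypothetical placeholders, not the contents of Table 1):

```python
# Hypothetical 500-kcal base recipe (placeholder ingredients and amounts,
# not the article's Table 1).
base_recipe_500_kcal = {
    "whole milk (mL)": 240,
    "rolled oats (g)": 40,
    "peanut butter (g)": 16,
    "banana (g)": 60,
}

def scale_recipe(recipe: dict, target_kcal: float, base_kcal: float = 500.0) -> dict:
    """Multiply every ingredient amount by target_kcal / base_kcal."""
    factor = target_kcal / base_kcal
    return {item: round(amount * factor, 1) for item, amount in recipe.items()}

# Triple the base recipe for an estimated 1500 kcal/day need:
print(scale_recipe(base_recipe_500_kcal, 1500))
```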

Many children and adults use BTF due to enteral formula intolerance. Therefore, these patients may not be able to tolerate large volumes of feeding at one time. Table 4 gives examples of nutrient-dense foods in each food group that may help decrease overall volume intake while still providing adequate calories. Table 5 provides options to “exchange” foods to increase the variety of the blenderized meals.

Some patients may use commercial food-based enteral formulas for some or all of their nutrition intake. See Table 6 for a variety of available products. These products may make it easier to travel or be away from home, since they are shelf stable and require no preparation or refrigeration until opened.

It is important to note that BTF is approximately 70-75% fluid; therefore, extra fluid will likely be needed to meet hydration needs. Fluid can either be mixed into the recipes or given as boluses between feedings. It is also important to review the micronutrient profiles of the recipes, as homemade blends tend to be low in sodium, and salt may need to be added in some cases. A multivitamin/mineral, calcium, vitamin D or iron supplement may be needed, but routine supplementation is not usually indicated if the recipes contain a wide variety of foods. Monitoring patients as they transition onto BTF is essential; once they are tolerating goal feedings, these patients should be followed in the same fashion as any other HEN patient. Labs should be done only when relevant to the clinical situation and are typically not routinely monitored.
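
To make the fluid point concrete, here is a minimal sketch of the arithmetic (the patient values are hypothetical; the 70-75% fluid fraction is the figure cited above):

```python
# Estimate extra free water needed on BTF. Patient values are hypothetical;
# the 0.70-0.75 fluid fraction comes from the text above.
def extra_fluid_ml(daily_fluid_goal_ml: float,
                   btf_volume_ml: float,
                   btf_fluid_fraction: float = 0.70) -> float:
    """Free water still owed after counting the water content of the blend."""
    free_water_from_btf = btf_volume_ml * btf_fluid_fraction
    return max(0.0, daily_fluid_goal_ml - free_water_from_btf)

# Example: 2000 mL/day fluid goal on 1500 mL/day of BTF at ~70% fluid
print(extra_fluid_ml(2000, 1500))  # 950.0 -> ~950 mL as water boluses
```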

SUMMARY

Blenderized tube feeding use has increased in recent years and will more than likely continue to increase in popularity as more studies are published. While patients are interested in using BTF for a variety of reasons, some clinicians remain hesitant to support its use. We have a clinical responsibility to include assessment of BTF use in all HEN patients in order to provide guidance for appropriate recipe development and monitoring. This article is meant to increase awareness of the widespread use of BTF and to help empower clinicians to aid patients in its use.


The Microbiome And Disease, Series #5

The Microbiome, Viscerosensory Signaling and Autism


Throughout this series we have pointed to evidence of an increasingly complex understanding of the relationship between the gut, its commensal bacterial composition, and its link to various pathological states within different organ systems. Here we will briefly discuss the emerging research that demonstrates the connection between gut microbes and the nervous and immune systems as well as how disproportional bacterial concentrations may be implicated in Autism and other neuropsychiatric ailments.

Daniel Frochtzwajg, DO, Research Assistant, Ventura Clinical Trials, Ventura, CA Sabine Hazan, MD, Gastroenterology/Hepatology/Internal Medicine Physician, CEO, Ventura Clinical Trials, CEO, Malibu Specialty Center, Ventura, CA

Throughout this series we have pointed to evidence of an increasingly complex understanding of the relationship between the gut, its commensal bacterial composition, and its link to various pathological states within different organ systems. Paramount in our articles has been a theme of holistic interconnectedness. The relationships between the gut microbiome and the disease states already discussed are not isolated but are inextricably linked. Here we will briefly discuss the emerging research that demonstrates the connection between gut microbes and the nervous and immune systems as well as how disproportional bacterial concentrations may be implicated in Autism and other neuropsychiatric ailments.

Contemporary research has shown that gastrointestinal (GI) infections lead to behavioral changes in mice and that the immune system affects mood and learning.2,5,12 In turn, depression and stress can cause changes in immune functioning.5 The former relationship, that of infection influencing behavior, was observed in a study by Lyte et al. in 1998. Researchers noted increased anxiety in mice after per-oral administration of Campylobacter jejuni, as compared to saline-treated mice.11 Also of note was the lack of bacteria or pro-inflammatory cytokines in the systemic circulation of the C. jejuni-treated mice. This finding was a springboard for the investigation of intestinal viscerosensory nerves as a possible conduit between the gut and the brain and behavior. Within the gut, there are two types of viscerosensory nerves, intrinsic and extrinsic. Intrinsic nerves control motility and secretion, but do not directly convey signals to the central nervous system (CNS). Extrinsic nerves, including the vagus nerve and the spinal visceral nerves, communicate with the CNS, innervate the intrinsic nerves, and contact lymphoid tissue in the subepithelium.5 To firmly establish the connection between the immune system, infection, and changes in behavior, Goehler et al. and subsequently Lyte et al. demonstrated evidence of c-Fos, an early gene product and marker of cell “activation,” in the vagal neurons of mice after inoculation with Campylobacter jejuni or Citrobacter rodentium, respectively.5,6,10 Another intestinal-CNS link was elucidated by Castex et al. in 2005. In this study, researchers noted c-Fos expression in specific brain regions of rats in response to intestinal ischemia. Furthermore, they found that intraperitoneal administration of ondansetron or perivagal capsaicin attenuated c-Fos expression, implicating 5-HT3 in the immune “activation” of the viscerosensory nerves.1 A number of other proinflammatory mediators/receptors have also been identified as possible players in the intestinal activation of vagal/spinal nerves, including bradykinin, prostaglandins and leukotrienes, ATP and adenosine, vanilloid receptors, proteinase-activated receptors, and nerve growth factor (NGF).9 Vagal fibers also express toll-like receptors (TLRs), specifically TLR-4, on their surface, which is known to respond to bacterial lipopolysaccharide.7

The established paradigm demonstrating a behavioral response to intestinal bacteria, bacterial products, and mechanical stimuli logically suggests that some neuropsychiatric disorders may be, at least in part, due to derangements in the human intestinal microbiome. The relationship between irritable bowel syndrome (IBS) and depression and anxiety is widely known, and one can imagine that other intestinal insults may be responsible for various neurologic and/or psychiatric conditions, and vice versa. In fact, there is evidence that predictable gut microbiome derangements may contribute to Autism Spectrum Disorder (ASD), as GI symptoms are increasingly associated with autism.3,4 Per the most recent CDC statistics, roughly 1 in 59 children is affected by ASD, the worldwide prevalence is somewhere between 1-2%, and ASD is about 4 times more likely to be identified in boys than in girls.13 In a recent study by Finegold et al. published in Anaerobe, researchers studied stool specimens from 33 autistic children ages 2-9 with intestinal symptoms and 13 control children without autism or GI upset. Results showed that the intestinal microbiome of autistic children with GI disease was colonized by proportionally higher counts of Clostridium perfringens. In addition to the statistically significant higher raw number of C. perfringens colony forming units, children with ASD also harbored significantly more of the C. perfringens responsible for beta2-toxin production, as opposed to colonies producing other C. perfringens toxin genes.

Diets including specific probiotics have been shown to inhibit visceral pain caused by colonic distension.8 As discussed above, these effects share common viscerosensory afferent pathways. If alterations in diet can mediate visceral pain, it is likely that further research will lead to dietary and pharmacologic interventions aimed at the microbial disproportions associated with ASD and other neuropsychiatric illnesses.

