In recent years, increasing attention has focused on the treatment of chronic pain, with a considerable body of research and publications on the topic. At the same time, opioid prescription, use, abuse, and deaths related to the inappropriate use of opioids have increased significantly over the last 10 years. Some reports indicated that there were more than 100 ‘pain clinics’ within a one-mile radius in South Florida between 2009 and 2010, which led to new opioid prescription laws in Florida and many other states to restrict the use of opioids. In the face of clinical and social turmoil related to opioid use and abuse, a fundamental question facing each clinician is: are opioids effective and necessary for chronic non-malignant pain?
Chronic low back pain (LBP) is the most common pain condition seen in pain clinics and most family physician offices that ‘requires’ chronic use of opioids. Nampiaparampil et al conducted a literature review in 2012,1 and found only one high-quality study on oral opioid therapy for LBP, which showed significant efficacy in pain relief and patient function. Current consensus holds that there is weak evidence of favourable effectiveness of opioids compared to placebo in chronic LBP.2 Opioids may be considered in the treatment of chronic LBP if a patient fails other treatment modalities such as non-steroidal anti-inflammatory drugs (NSAIDs), antidepressants, physical therapy or steroid injections. Opioids should be avoided if possible, especially in adolescents, who are at high risk of opioid overdose, misuse, and addiction. It has been demonstrated that the majority of people with degenerative disc disease, including those with a disc herniation, have no back pain. A Magnetic Resonance Imaging (MRI) report or film showing a disc herniation should not be an automatic ‘passport’ to narcotics.
Failed back surgery syndrome (FBSS) is often refractory to most treatment modalities and is sometimes very debilitating. There are no well-controlled clinical studies to support or refute the use of opioids in FBSS. Clinical experience suggests oral opioids may be beneficial and necessary for many patients suffering from severe back pain due to FBSS. Intraspinal opioids delivered via implanted pumps may be indicated in individuals who cannot tolerate oral medications. For elderly patients with severe pain due to spinal stenosis, there is likewise no clinical study to support or refute the use of opioids. However, because NSAIDs may cause serious gastrointestinal, hepatic and renal side effects, opioid therapy may still be a choice in carefully selected patients.
Most studies of pharmacological treatment of neuropathic pain are conducted in patients with diabetic peripheral neuropathy (DPN). Several randomized controlled clinical studies have demonstrated that some opioids, such as morphine sulphate, tramadol,3 and controlled-release oxycodone,4 are probably effective in reducing pain and should be considered as a treatment of choice (Level B evidence), even though anti-epileptics such as pregabalin should still be used as first-line medication.5
Some studies indicate opioids may be superior to placebo in relieving pain due to acute migraine attacks, and Fiorinal with codeine may be effective for tension headache. However, there is a lack of clinical evidence supporting long-term use of opioids for chronic headaches such as migraine, chronic daily headache, medication overuse headache, or cervicogenic headache. Currently, large amounts of opioids are being prescribed for headaches because of patients' demands. Neuroscience data on the effects of opioids on the brain have raised serious concerns about long-term safety and have provided a basis for the mechanism by which chronic opioid use may induce progression of headache frequency and severity.6 A recent study found chronic opioid use for migraine to be associated with more severe headache-related disability, symptomology, comorbidities (depression, anxiety, and cardiovascular disease and events), and greater healthcare resource utilization.7
Many patients with fibromyalgia (FM) come into pain clinics to ask for, or even demand, prescriptions for opioids. There is insufficient evidence to support the routine use of opioids in fibromyalgia.8 Recent studies have suggested that central sensitization may play a role in the aetiology of FM. Three central nervous system (CNS) agents (pregabalin, duloxetine and milnacipran) have been approved by the United States Food and Drug Administration (US FDA) for treatment of FM. However, opioids are still commonly prescribed by many physicians for FM patients by ‘tradition’, sometimes even in combination with a benzodiazepine and the muscle relaxant carisoprodol (Soma). We have observed negative health and psychosocial status in patients using opioids who are labelled with FM. Opioids should be avoided whenever possible in FM patients in the face of widespread abuse and the lack of clinical evidence.9
Adolescents with mild non-malignant chronic pain rarely require long-term opioid therapy.10 Opioids should be avoided if possible in adolescents, who are at high risk of opioid overdose, misuse, and addiction. Patients with adolescents living at home should store their opioid medications safely.
In conclusion, opioids are effective and necessary in certain cases. However, currently no single drug stands out as the best therapy for managing chronic non-malignant pain, and current opioid treatment is not sufficiently evidence-based. More well-designed clinical studies are needed to confirm the clinical efficacy and necessity of opioids in the treatment of chronic non-malignant pain. Until more evidence becomes available, and in the face of widespread abuse of opioids in society and possible serious behavioural consequences for individual patients, a careful history and physical examination, assessment of aberrant behaviour, a controlled substance agreement, routine urine drug tests, checking of the state drug monitoring system (if available), trials of other treatment modalities, and continuous monitoring of opioid compliance should be prerequisites before any opioids are prescribed.
Opioid prescriptions should be given as indicated, not as ‘demanded’.
A 69-year-old male with hypertension, body mass index 24 kg/m2, neck circumference 16 inches, and moderate COPD on home oxygen presented to his pulmonary clinic appointment with worsening fatigue, leg cramps, and intermittent shortness of breath with chest discomfort. A remote, questionable history of syncope five to ten years earlier was elicited. His vital signs were: temperature 98.8°F, blood pressure 119/76 mmHg, pulse 92/min and regular, and respirations 20/min. Physical examination was significant for a crowded oropharynx with a Mallampati score of four and distant breath sounds with a prolonged expiratory phase on lung exam, with a normal cardiac exam. Laboratory investigation showed a normal complete blood count, haemoglobin 15 g/dL, and normal chemistries. Compared to his previous studies, a pulmonary function study showed stable parameters, with an FEV1 of 1.47 L (69%), an FEV1/FVC ratio of 0.44 (62%), and a DLCO/alveolar volume ratio of 2.12 (49%). A room air arterial blood gas revealed pH 7.41, PCO2 44 mmHg, and PO2 61 mmHg, with 92% oxygen saturation. A six-minute treadmill exercise test performed to assess the need for supplemental oxygen showed that he required supplemental oxygen at 1 L/min via nasal cannula to eliminate hypoxemia during exercise. His chest radiograph was significant for hyperinflation and prominence of interstitial markings. High-resolution computed tomography of the chest demonstrated severe centrilobular and panacinar emphysema only. A baseline electrocardiogram (EKG) showed normal sinus rhythm with an old anterior wall infarct (Figure 1). Echocardiography revealed a normal left ventricle with an ejection fraction of 65%. Right ventricular systolic function was normal, although an elevated mean pulmonary arterial pressure of 55 mmHg was noted. A diagnostic polysomnogram performed for evaluation of daytime fatigue and snoring at night revealed mild OSA, with an apnoea–hypopnea index (AHI) of 6/hr.
Sleep time spent with oxygen saturation below 90% (T-90%) was 19%. The EKG showed normal sinus rhythm. A full overnight polysomnogram for continuous positive airway pressure (CPAP) titration performed for treatment of sleep-disordered breathing was sub-optimal; however, it demonstrated an AHI of 28 during REM (rapid eye movement) sleep and a T-90% of 93%. The associated electrocardiogram showed Wenckebach second-degree AV heart block during REM sleep, usually near the nadir of oxygen desaturation. On a repeat positive airway pressure titration study, therapy with bilevel positive airway pressure (BPAP) at 18/14 cmH2O corrected the AHI and nocturnal hypoxemia to within normal limits during non-REM (NREM) and REM sleep. His electrocardiogram remained in normal sinus rhythm. A twenty-four-hour cardiac Holter monitor revealed baseline sinus rhythm and confirmed the presence of second-degree AV block of the Wenckebach type. A one-month cardiac event recording showed normal sinus rhythm with frequent episodes of second-degree AV block. These varied from Type I progressing to Type II with 2:1 and 3:1 AV block during sleep. Progression to complete heart block was noted, with the longest pause lasting 3.9 seconds during sleep. The patient underwent an electrophysiology study with placement of a dual-chamber pacemaker, and he was initiated on BPAP therapy. Subsequently, the patient was seen in clinic with improvement in his intermittent episodes of shortness of breath, fatigue, and daytime sleepiness.
Figure 1 – Patient’s baseline EKG, normal sinus rhythm. Figure 2 – Progression to Mobitz Type II block at 5:07 am. Figures 3 and 4 – Sinus pauses; the longest, at 11:07 pm, lasted 3.9 seconds (Figure 4).
In healthy individuals, especially athletes, bradycardia, Mobitz I AV block, and sinus pauses of up to 2 seconds are common during sleep and require no intervention.5 Cardiac rhythm is controlled primarily by autonomic tone. NREM sleep is accompanied by an increase in parasympathetic, and a decrease in sympathetic, tone. REM sleep is associated with decreased parasympathetic tone and variable sympathetic tone. Bradyarrhythmias in patients with OSA are related to the apnoeic episodes, and over 80% are found during REM sleep. During these periods of low oxygen supply, increased vagal activity to the heart resulting in bradyarrhythmias may actually be cardioprotective by decreasing myocardial oxygen demand. This may be important in patients with underlying coronary heart disease.
Some studies have found that Mobitz I AV block may not be benign. Shaw et al6 studied 147 patients with isolated chronic Mobitz I AV block. They inserted pacemakers in 90 patients: 74 were symptomatic and 16 received a pacemaker prophylactically. Outcome data included five-year survival, deterioration of conduction to higher-degree AV block, and new onset of various forms of symptomatic bradycardia. They concluded that survival was higher in the paced groups and that risk factors for poor outcomes in patients with Mobitz I included age greater than 45 years, symptomatic bradycardia, organic heart disease, and the presence of a bundle branch block on EKG.
The Sleep Heart Health Study7 found a higher prevalence of first- and second-degree heart block among subjects with sleep-disordered breathing (SDB) than in those without (1.8% vs. 0.3% and 2.2% vs. 0.9%, respectively). Gami et al8 observed that, upon review of 112 Minnesota residents who had undergone diagnostic polysomnography and subsequently died suddenly from a cardiac cause, sudden death occurred between the hours of midnight and 6:00 AM in 46% of those with OSA, as compared with 21% of those without OSA. In a study of twenty-three patients with moderate to severe OSA who were each implanted with an insertable loop recorder, about 50% were observed to have frequent episodes of bradycardia and long pauses (complete heart block or sinus arrest) during sleep.9 These events showed significant night-to-night intra-individual variability, and their incidence was underestimated (detected in only 13%) by conventional short-term EKG Holter recordings.
Physiologic factors predisposing patients with OSA to arrhythmias include alterations in sympathetic and parasympathetic nervous system activity, acidosis, apnoeas, and arousals.2,10,11 Some patients with OSA may have an accentuation of the ‘diving reflex’. This protective reflex consists of hypoxemia-induced sympathetic augmentation to muscles and vascular beds associated with increased cardiac vagal activity, which results in increased brain perfusion, bradycardia and decreased cardiac oxygen demand. In patients with cardiac ischemia, poor lung function (i.e. COPD), or both, it may be difficult to differentiate between these protective OSA-associated bradyarrhythmias and those which may lead to sudden death. It has been well established that patients with COPD are at higher risk for cardiovascular morbidity12 and arrhythmias.13 Fletcher and colleagues14 reported that the effects of oxygen supplementation on AHI, hypercapnia and supraventricular arrhythmias in patients with COPD and OSA were variable. Of the twenty obese men with COPD studied, oxygen eliminated the bradycardia observed during obstructive apnoeas in most patients and eliminated AV block in two. In some patients supplemental oxygen worsened end-apnoea respiratory acidosis; however, this did not increase ventricular arrhythmias.
CPAP therapy has been demonstrated to significantly reduce sleep-related bradyarrhythmias, sinus pauses, and the increased risk of cardiac death.9,15 Despite this, in certain situations placement of a pacemaker may be required. These include persistent life-threatening arrhythmias in patients with severe OSA on CPAP, arrhythmias in patients who are non-compliant with CPAP, and persistent sympathovagal imbalance and hemodynamic fluctuations resulting in daytime bradyarrhythmias.16
Our case is interesting because it highlights the importance of recognizing the association between OSA, COPD, and life-threatening cardiac arrhythmias. Primary care providers should note the possible association of OSA-associated bradyarrhythmias with life-threatening Type II bradyarrhythmias and pauses. Since bradyarrhythmias related to OSA are relieved by CPAP, one option would be to treat with CPAP and observe for the elimination of these arrhythmias using a 24-hour Holter or event recorder.17 Compliance with CPAP is variable, and if life-threatening bradycardia is present, placement of a permanent pacemaker may be preferred.18
Our patient is unusual because most studies showing a correlation between the severity of OSA and the magnitude of bradycardia have included overweight patients without COPD.19 This patient’s electrocardiogram revealed a Type II AV block at 5:07 am (Figure 2). This is within the overnight time frame during which patients with OSA have been observed to have an increased incidence of sudden death. Figures 3 and 4 show significant sinus pauses. In selected cases where patients have significant co-morbidities (i.e. severe COPD with OSA), in addition to treatment with positive airway pressure, electrophysiological investigation with placement of a permanent pacemaker may be warranted.
Even though it is commonly seen in Graves' disease, TPP is not related to the aetiology, severity, or duration of thyrotoxicosis.1
The pathogenesis of hypokalaemic periodic paralysis in certain populations with thyrotoxicosis is unclear. Transcellular distribution of potassium is maintained by Na+/K+–ATPase activity in the cell membrane, which is influenced mainly by the action of insulin and beta-adrenergic catecholamines.2 Hypokalaemia in TPP results from an intracellular shift of potassium, not total body depletion. It has been shown that Na+/K+–ATPase activity in platelets and muscles is significantly higher in patients with TPP.3 Hyperthyroidism may result in a hyperadrenergic state, which may lead to activation of the Na+/K+–ATPase pump and cellular uptake of potassium.2,4,5 Thyroid hormones may also directly stimulate Na+/K+–ATPase activity and increase the number and sensitivity of beta receptors.2,6 Patients with TPP have been found to have hyperinsulinaemia during episodes of paralysis, which may explain attacks after high-carbohydrate meals.7
A 19-year-old male presented to our emergency room with sudden-onset weakness of the lower limbs. He was not able to stand or walk. On examination, power was 0/5 in both lower limbs and 3/5 in the upper limbs. Routine investigations revealed severe hypokalaemia, with a serum potassium of 1.6 mEq/L (normal range 3.5–5.0 mEq/L), a serum phosphorus level of 3.4 mg/dL (normal range 3–4.5 mg/dL), and mild hypomagnesaemia with a serum magnesium level of 1.5 mg/dL (normal range 1.8–3.0 mg/dL). ECG showed hypokalaemic changes, with a prolonged PR interval, increased P-wave amplitude and widened QRS complexes. He was managed with intravenous as well as oral potassium, and history revealed weight loss, increased appetite and tremors over the past 4 months. He had a multinodular goitre, and a radioactive iodine uptake scan (iodine-131) showed a toxic nodule (a toxic nodule shows increased iodine uptake while the rest of the gland is suppressed), with no exophthalmos or sensory or cranial nerve deficits. Thyroid function tests revealed thyrotoxicosis, with a free T4 of 4.3 ng/dL (normal range 0.8–1.8 ng/dL), T3 of 279 ng/dL (normal range 60–181 ng/dL) and a TSH level of <0.15 milliunits/L (normal range 0.3–4 milliunits/L). He was managed with intravenous potassium and propranolol, and showed dramatic improvement of his symptoms. The patient was discharged home on carbimazole with a diagnosis of TPP secondary to toxic nodular goitre.
In this case there was a significant family history: one of his elder brothers had died suddenly (cause unknown), and his mother had primary hypothyroidism and was on levothyroxine replacement therapy.
TPP is seen most commonly in Asian populations, with an incidence of approximately 2% in patients with thyrotoxicosis of any cause.1,8,9,10 The attacks of paralysis have a well-marked seasonal incidence, usually occurring during the warmer months.1 Some authors attribute the pathogenesis of hypokalaemia to a catecholamine-mediated intracellular shift of body potassium.11,12 Shizume and his group studied total exchangeable potassium and found that patients with thyrotoxic periodic paralysis were not significantly different from controls when the value was related to lean body mass.11 The paralytic symptoms and signs improve as potassium returns from the intracellular space back into the extracellular space.13 The diurnal variation in potassium movement, with nocturnal potassium influx into skeletal muscle, would explain the tendency for thyrotoxic periodic paralysis to occur at night.14 Hypophosphataemia and hypomagnesaemia are also known to occur in association with thyrotoxic periodic paralysis.14,15,16,17,18 The correction of hypophosphataemia without phosphate administration supports the possibility of an intracellular shift of phosphate.16 Electrocardiographic findings supporting a diagnosis of TPP rather than sporadic or familial periodic paralysis are sinus tachycardia, elevated QRS voltage and first-degree AV block (sensitivity 97%, specificity 65%).20 In addition, ST-segment depression, T-wave flattening or inversion and the presence of U waves are typical of hypokalaemia.
Management involves dealing with the acute attack as well as treating the underlying condition to prevent future attacks. Rapid administration of oral or intravenous potassium chloride can abort an attack and prevent cardiovascular and respiratory complications.4 A small dose of potassium is the treatment of choice for facilitating recovery and reducing rebound hyperkalaemia due to release of potassium and phosphate from the cells on recovery.1,2,3 Rebound hyperkalaemia occurs in approximately 40% of patients with TPP, especially if they receive >90 mmol of potassium chloride within the first 24 hours.4 Another mode of treatment is propranolol, a nonselective beta-blocker, which prevents the intracellular shift of potassium and phosphate by blunting the hyperadrenergic stimulation of Na+/K+–ATPase.20 Hence, initial therapy for stable TPP should include propranolol.21,22,23 Definitive therapy for TPP is treatment of the hyperthyroidism with antithyroid medications, surgical thyroidectomy, or radioiodine therapy.
Normal sleep is divided into non-REM and REM sleep. REM occurs every 90–120 minutes throughout adult sleep, with each period of REM increasing in length such that the REM periods in the early morning hours are the longest and may last 30–60 minutes. Overall, REM accounts for 20–25% of sleep time but is weighted toward the second half of the night. On polysomnographic monitoring, REM sleep shows a low-voltage, mixed-frequency EEG and a low-voltage chin EMG associated with intermittent bursts of rapid eye movements. During periods of REM, breathing becomes irregular, blood pressure rises, and the heart rate increases due to excess adrenergic activity. The brain is highly active during REM, and the electrical activity recorded by EEG during REM sleep is similar to that of wakefulness.
Parasomnias are undesirable, unexpected, abnormal behavioural phenomena that occur during sleep. There are three broad categories of parasomnias:
Disorders of Arousal (from Non-REM sleep)
Parasomnias usually associated with REM sleep, and
Other parasomnias, which also include secondary parasomnias.
RBD is the only parasomnia which requires polysomnographic testing as part of the essential diagnostic criteria.
Definition of RBD
“RBD is characterized by the intermittent loss of REM sleep electromyographic (EMG) atonia and by the appearance of elaborate motor activity associated with dream mentation” (ICSD-2).1 These motor phenomena may be complex and highly integrated and often are associated with emotionally charged utterances and physically violent or vigorous activities. RBD was first recognized and described by Schenck et al in 1986.2 The diagnosis was first incorporated into the International Classification of Sleep Disorders (ICSD) by the American Academy of Sleep Medicine in 1990.
A defining feature of normal REM sleep is active paralysis of all somatic musculature (sparing the diaphragm to permit ventilation). This results in diffuse hypotonia of the skeletal muscles, inhibiting the enactment of dreams associated with REM sleep. In RBD there is an intermittent loss of muscle atonia during REM sleep, which can be objectively measured with EMG as intense phasic motor activity (figures 1 and 2).
This loss of inhibition often precedes the complex motor behaviours of REM sleep. RBD patients often report that their dream content is very violent or vigorous; dream-enacting behaviours include talking, yelling, punching, kicking, sitting up, jumping from bed, arm flailing and grabbing. Most often, the sufferer, upon waking from the dream, will immediately report a clear memory of the dream, which coincides very well with the high-amplitude, violent, defensive activity witnessed. This complex motor activity may result in a serious injury to the dreamer or bed partner, which then prompts the evaluation.
The prevalence of RBD is about 0.5% in the general population.1, 3 RBD preferentially affects elderly men (in the sixth and seventh decades), with a women-to-men ratio of 1 to 9.4 The mean age of disease onset is 60.9 years, and the mean age at diagnosis is 64.4 years.5 RBD has been reported in an 18-year-old female with juvenile Parkinson disease,6 so age and gender are not absolute criteria.
In Parkinson disease (PD) the reported prevalence ranges from 13–50%,7, 14-19 in Lewy body dementia (DLB) it is 95%,8 and in multiple system atrophy (MSA) 90%.9 The presence of RBD is a major diagnostic criterion for MSA. RBD has also been reported in juvenile Parkinson disease and pure autonomic failure;10-12 all of these neurodegenerative disorders are synucleinopathies.13
During wakefulness, the neurons of the locus coeruleus, raphe nuclei, tuberomammillary nucleus, pedunculopontine nucleus, laterodorsal tegmental area and the perifornical area fire at a high rate and cause arousal by activating the cerebral cortex. During REM sleep, these excitatory areas fall silent, with the exception of the pedunculopontine nucleus and laterodorsal tegmental area. These regions project to the thalamus and activate the cortex during REM sleep; this cortical activation is associated with dreaming in REM. Descending excitatory fibers from the pedunculopontine nucleus and laterodorsal tegmental area innervate the medial medulla, which then sends inhibitory projections to motor neurons, producing the skeletal muscle atonia of REM sleep.20-21
Two distinct neural systems collaborate in the “paralysis” of normal REM sleep: one is mediated through active inhibition by neurons in the nucleus reticularis magnocellularis in the medulla, via the ventrolateral reticulospinal tract, synapsing on the spinal motor neurons; the other suppresses locomotor activity and is located in the pontine region.22
REM sleep contains two types of variables: tonic (occurring throughout the REM period) and phasic (occurring intermittently during a REM period). Tonic elements include a desynchronized EEG and somatic muscle atonia (sparing the diaphragm). Phasic elements include rapid eye movements, middle ear muscle activity and extremity twitches. The tonic electromyographic suppression of REM sleep is the result of active inhibition of motor activity originating in the peri-locus coeruleus region and terminating in the anterior horn cells via the medullary nucleus reticularis magnocellularis.
In RBD, the observed motor activity may result from either impairment of tonic REM muscle atonia or increased phasic locomotor drive during REM sleep. One mechanism by which RBD results is disruption of neurotransmission in the brainstem, particularly at the level of the pedunculopontine nucleus.23 Pathogenetically, reduced striatal dopaminergic mediation has been found in those with RBD,24-25 and neuroimaging studies support dopaminergic abnormalities.
Types of RBD
RBD can be categorized based on severity:
Mild RBD occurring less than once per month,
Moderate RBD occurring more than once per month but less than once per week, associated with physical discomfort to the patient or bed partner, and
Severe RBD occurring more than once per week, associated with physical injury to patient or bed partner.
RBD can be categorized based on duration:
Acute presenting with one month or less,
Subacute with more than one month but less than 6 months,
Chronic with 6 months or more of symptoms prior to presentation.
Acute RBD: In 55–60% of patients with RBD the cause is unknown, but in 40–45% the RBD is secondary to another condition. Acute-onset RBD is almost always induced or exacerbated by medications (especially tricyclic antidepressants, selective serotonin reuptake inhibitors, monoamine oxidase inhibitors, serotonin–norepinephrine reuptake inhibitors,26 mirtazapine, selegiline, and biperiden) or by withdrawal from alcohol, barbiturates, benzodiazepines or meprobamate. Selegiline may trigger RBD in patients with Parkinson disease, and cholinergic treatment of Alzheimer’s disease may trigger RBD.
Chronic RBD: The chronic form of RBD was initially thought to be idiopathic; however, long-term follow-up has shown that many patients eventually exhibit signs and symptoms of a degenerative neurologic disorder. One recent retrospective study of 44 consecutive patients diagnosed with idiopathic RBD demonstrated that 45% (20 patients) subsequently developed a neurodegenerative disorder, most commonly Parkinson disease (PD) or Lewy body dementia, after a mean of 11.5 years from reported symptom onset and 5.1 years after RBD diagnosis.27
The relationship between RBD and PD is complex, and not all persons with RBD develop PD. In one study of 29 men presenting with RBD followed prospectively, the incidence of PD was 38% at 5 years and 65% after 12 years.7, 28, 29 Contrast this with multiple system atrophy, where RBD is one of the primary symptoms, occurring in 90% of cases.9 In cases of RBD, it is necessary not only to exclude any underlying neurodegenerative disease process but also to monitor for the development of one over time at follow-up visits.
Sufferers of RBD usually present to the doctor with complaints of sleep-related injury, or fear of injury, resulting from dramatic, violent, potentially dangerous motor activity during sleep; 96% of patients report harm to themselves or their bed partner. Behaviours described during dreaming include talking, yelling, swearing, grabbing, punching, kicking, and jumping or running out of bed. One clinical clue to the source of the sleep-related injury is the timing of the behaviours: because RBD occurs during REM sleep, it typically appears at least 90 minutes after falling asleep and is most often noted during the second half of the night, when REM sleep is more abundant.
One fourth of subjects who develop RBD have prodromal symptoms several years prior to diagnosis. These may consist of twitching during REM sleep but may also include other types of simple motor movements and sleep talking or yelling.30-31 Daytime somnolence and fatigue are rare because gross sleep architecture and the sleep–wake cycle remain largely normal.
RBD in other neurological disorders and Narcolepsy:
RBD has also been reported in other neurologic diseases such as multiple sclerosis, vascular encephalopathies, ischemic brain stem lesions, brain stem tumors, Guillain-Barré syndrome, mitochondrial encephalopathy, normal pressure hydrocephalus, subdural hemorrhage, and Tourette’s syndrome. In most of these, there is likely a lesion affecting the primary regulatory centers for REM atonia.
RBD is particularly frequent in narcolepsy: one study found that 36% of patients with narcolepsy had symptoms suggestive of RBD. Unlike idiopathic RBD, women with narcolepsy are as likely to have RBD as men, and the mean age was found to be 41 years.32 While the mechanism allowing for RBD is not understood in this population, narcolepsy is considered a disorder of REM-state dissociation. Cataplexy is paralysis of skeletal muscles in the setting of wakefulness, often triggered by strong emotions such as humor. Among narcoleptics who regularly experienced cataplexy, 68% reported RBD symptoms, compared to 14% of those who never or rarely experienced cataplexy.32-33 There is evidence of a profound loss of hypocretin in the hypothalamus of narcoleptics with cataplexy, and this may be a link that needs further investigation in understanding the mechanism of RBD in narcolepsy with cataplexy. It is prudent to follow patients with narcolepsy, question them about symptoms of RBD, and treat accordingly, especially those with cataplexy and other associated symptoms.
Diagnostic criteria for REM Behavior Disorder (ICSD-2; ICD-9 code 327.42)1
A. Presence of REM sleep without atonia: the EMG finding of excessive amounts of sustained or intermittent elevation of submental EMG tone, or excessive phasic submental or (upper or lower) limb EMG twitching (figures 1 and 2).
B. At least one of the following is present:
i. Sleep related injurious, potentially injurious, or disruptive behaviors by history
ii. Abnormal REM sleep behaviors documented during polysomnographic monitoring
C. Absence of EEG epileptiform activity during REM sleep unless RBD can be clearly distinguished from any concurrent REM sleep-related seizure disorder.
D. The sleep disturbance is not better explained by another sleep disorder, medical or neurologic disorder, mental disorder, medication use, or substance use disorder.
Several sleep disorders causing behaviours in sleep should be considered in the differential diagnosis, such as sleepwalking (somnambulism), sleep terrors, nocturnal seizures, nightmares, psychogenic dissociative states, post-traumatic stress disorder, nocturnal panic disorder, delirium and malingering. RBD may be triggered by sleep apnea and has been described as triggered by nocturnal gastroesophageal reflux disease.
Evaluation and Diagnosis
Detailed history of the sleep wake complaints
Information from a bed partner is most valuable
Thorough medical, neurological, and psychiatric history and examination
Screening for alcohol and substance use
Review of all medications
PSG (mandatory): The polysomnographic study should be more extensive, with an expanded EEG montage, monitors for movements of all four extremities, continuous technologist observation and continuous video recording with good sound and visual quality to allow capture of any sleep related behaviors
Multiple Sleep Latency Test (MSLT): Only recommended in the setting of suspected coexisting Narcolepsy
Brain imaging (CT or MRI) is mandatory if there is suspicion of underlying neurodegenerative disease.
RBD may have legal consequences or can be associated with substantial relationship strain; therefore accurate diagnosis and adequate treatment are important. Management includes non-pharmacological and pharmacological approaches.
Non-pharmacological management: The acute form appears to be self-limited following discontinuation of the offending medication or completion of withdrawal treatment. For chronic forms, protective measures during sleep are warranted to minimize the risk of injury to the patient and bed partner. These patients are at fall risk due to physical limitations and use of medications. Protective measures include removing bedside stands, bedposts and low dressers, and applying heavy curtains to windows. In extreme cases, placing the mattress on the floor to prevent falls from the bed has been successful.
Pharmacological management: Clonazepam is highly effective and is the drug of choice. A very low dose will resolve symptoms in 87 to 90% of patients.4,5,7-34 The recommended treatment is 0.5 mg clonazepam 30 minutes prior to bedtime, and for more than 90% of patients this dose remains effective without tachyphylaxis. In the setting of breakthrough symptoms the dose can be slowly titrated up to 2.0 mg. The mechanism of action is not well understood, but clonazepam appears to decrease REM sleep phasic activity while having no effect on REM sleep atonia.35
Melatonin is also effective and can be used as monotherapy or in conjunction with clonazepam. The suggested dose is 3 to 12 mg at bedtime. Pramipexole may also be effective36-38 and is suggested for use when clonazepam is contraindicated or ineffective. It is interesting to note that during drug holidays, the RBD can take several weeks to recur. Management of patients with concomitant disorders such as narcolepsy, depression, dementia, Parkinson disease and Parkinsonism can be very challenging, because medications such as SSRIs, selegiline and cholinergic agents used to treat these disorders can cause or exacerbate RBD. In RBD associated with narcolepsy, clonazepam is usually added to management and is fairly effective.
Because RBD may occur in association with a neurodegenerative disorder, it is important to refer every patient with RBD to a neurologist as early as possible. This allows early diagnosis and management of any neurodegenerative disorder, regular follow-up, optimization of treatment to provide a better quality of life, and attention to medico-legal issues.
In acute and idiopathic chronic RBD, the prognosis with treatment is excellent. In the secondary chronic form, prognosis parallels that of the underlying neurologic disorder. Treatment of RBD should be continued indefinitely, as violent behaviors and nightmares promptly reoccur with discontinuation of medication in almost all patients.
RBD and neurodegenerative diseases are closely interconnected. RBD often antedates the development of a neurodegenerative disorder; diagnosis of idiopathic RBD portends a risk of greater than 45% for future development of a clinically defined neurodegenerative disease. Once identified, close follow-up of patients with idiopathic RBD could enable early detection of neurodegenerative diseases. Treatment for RBD is available and effective for the vast majority of cases.
Early diagnosis of RBD is of paramount importance
Polysomnogram is an essential diagnostic element
Effective treatment is available
Early treatment is essential in preventing injuries to patient and bed partner
Apparent idiopathic form may precede development of Neurodegenerative disorder by decades
Saliva is the watery, usually frothy substance produced in and secreted from the three paired major salivary glands (parotid, submandibular and sublingual) and several hundred minor salivary glands. It is composed mostly of water but also contains electrolytes, mucus, antibacterial compounds and various enzymes. Healthy persons are estimated to produce 0.75 to 1.5 litres of saliva per day. At least 90% of daily salivary production comes from the major salivary glands, while the minor salivary glands produce about 10%. On stimulation (olfactory, tactile or gustatory), salivary flow increases fivefold, with the parotid glands providing the preponderance of saliva.1
Saliva is a major protector of the tissues and organs of the mouth. In its absence, both the hard and soft tissues of the oral cavity may be severely damaged, with an increase in ulceration, infections such as candidiasis, and dental decay. Saliva is composed of a serous component (containing alpha amylase) and a mucus component, which acts as a lubricant. It is saturated with calcium and phosphate and is necessary for maintaining healthy teeth. The bicarbonate content of saliva enables it to buffer the acids produced in dental plaque, which would otherwise be held in contact with the teeth. Moreover, saliva helps with bolus formation and lubricates the throat for the easy passage of food. The organic and inorganic components of salivary secretion have protective potential: they act as a barrier to irritants and a means of removing cellular and bacterial debris. Saliva contains various components involved in defence against bacterial and viral invasion, including mucins, lipids, secretory immunoglobulins, lysozymes, lactoferrin, salivary peroxidase and myeloperoxidase. Salivary pH is about 6-7, favouring the digestive action of the salivary enzyme alpha amylase, devoted to starch digestion.
Salivary glands are innervated by the parasympathetic and sympathetic nervous systems. Parasympathetic postganglionic cholinergic nerve fibres supply cells of both the secretory end-piece and ducts and stimulate the rate of salivary secretion, inducing the formation of large amounts of a low-protein, serous saliva. Sympathetic stimulation promotes saliva flow through contraction of myoepithelial cells around the salivary ducts; thus both parasympathetic and sympathetic stimuli result in an increase in salivary gland secretion. The sympathetic nervous system also affects salivary gland secretion indirectly by innervating the blood vessels that supply the glands.
Table 1: Functions of saliva
Digestion and swallowing: initial process of food digestion; lubrication of mouth, teeth, tongue and food boluses; tasting food; amylase digestion of starch
Disinfectant and protective role: effective cleaning agent
Oral homeostasis: protection against tooth decay, maintenance of dental health and oral odour; bacteriostatic and bactericidal properties; regulation of oral pH
Speaking: lubricates tongue and oral cavity
Drooling (also known as drivelling, ptyalism, sialorrhea, or slobbering) is when saliva flows outside the mouth, defined as “saliva beyond the margin of the lip”. This condition is normal in infants but usually stops by 15 to 18 months of age. Sialorrhea after four years of age is generally considered to be pathologic. The prevalence of drooling of saliva in chronic neurological patients is high, with impairment of social integration and difficulty performing oral motor activities during eating and speech, with repercussions on quality of life. Drooling occurs in about one in two patients affected by motor neuron disease, and one in five needs continuous saliva elimination7; its prevalence is about 70% in Parkinson disease8 and between 10 and 80% in patients with cerebral palsy9.
Pathophysiology of drooling is multifactorial. It is generally caused by conditions resulting in
Excess production of saliva- due to local or systemic causes (table 2)
Inability to retain saliva within the mouth- poor head control, constant open mouth, poor lip control, disorganized tongue mobility, decreased tactile sensation, macroglossia, dental malocclusion, nasal obstruction.
Problems with swallowing- resulting in excess pooling of saliva in the anterior portion of the oral cavity e.g. lack of awareness of the build-up of saliva in the mouth, infrequent swallowing, and inefficient swallowing.
Drooling is mainly due to neurological disturbance and less frequently to hypersalivation. Under normal circumstances, persons are able to compensate for increased salivation by swallowing. However, sensory dysfunction may decrease a person’s ability to recognize drooling, and anatomic or motor dysfunction of swallowing may impede the ability to manage increased secretion.
Depending on its duration, drooling can be classified as acute, e.g. during infections (epiglottitis, peritonsillar abscess), or chronic, e.g. due to neurological causes.
Drooling of saliva can affect the quality of life of patients and/or their carers, and it is important to assess the rate and severity of symptoms and their impact on daily life.
Table 3: Effects of untreated drooling of saliva
Physical: perioral chapping (skin cracking); maceration with secondary infection; dehydration; foul odour; aspiration/pneumonia; speech disturbance; interference with feeding
Psychosocial: isolation; barriers to education (damage to books or electronic devices); increased dependency and level/intensity of care; decreased self-esteem; difficult social interaction
Assessment of the severity of drooling and its impact on quality of life for the patient and their carers help to establish a prognosis and to decide the therapeutic regimen. A variety of subjective and objective methods for assessment of sialorrhoea have been described3.
History (from patient and carers)
Establish the possible cause, severity, complications and possibility of improvement; the age and mental status of the patient; chronicity of the problem; associated neurological conditions; timing and provoking factors; an estimate of the quantity of saliva (use of bibs, number of clothing changes required per day); and the impact on day-to-day life (patient/carer)
Evaluate level of alertness, emotional state, hydration status, hunger, head posture
Examination of oral cavity- sores on the lip or chin, dental problems, tongue control, swallowing ability, nasal airway obstruction, decreased intraoral sensitivity, assessment of health status of teeth, gum, oral mucosa, tonsils, anatomical closure of oral cavity, tongue size and movement, jaw stability. Assessment of swallowing
Assess severity and frequency of drooling (as per table 4)
Lateral neck x-ray (in peritonsillar abscess)
Ultrasound to diagnose local abscess
Barium swallow to diagnose swallowing difficulties
Audiogram- to rule out conductive deafness associated with oropharyngeal conditions
Salivary gland scan- to determine functional status
Table 4 : System for assessment of frequency and severity of drooling
Dry (never drools)
Mild (wet lips only)
Moderate (wet lips and chins)
Severe (clothing becomes damp)
Profuse (clothing, hands, tray, object become wet)
Other methods of assessing salivary production and drooling
1) 1- 10 visual analogue scale (where 1 is best possible and 10 is worst possible situation)
2) Counting number of standard sized paper handkerchiefs used during the day
3) Measure saliva collected in cups strapped to chin
4) Inserting pieces of gauze with a known weight into oral cavity for a specific period of time and then re-measuring weight and calculating the difference between the dry and wet weights.
5) Salivary duct cannulation12 and measurement of saliva production.
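The gauze method (item 4 above) reduces to simple arithmetic: saliva collected equals the wet weight minus the dry weight. A minimal sketch, assuming a saliva density of roughly 1 g/mL so that grams map directly to millilitres; the function name and figures are illustrative, not from the literature:

```python
# Sketch of the gauze weight method: saliva collected is the
# wet weight minus the dry weight of the gauze.

def saliva_collected_ml(dry_weight_g, wet_weight_g, minutes):
    """Return (volume in mL, flow rate in mL/min), assuming a saliva
    density of ~1 g/mL so that grams map directly to millilitres."""
    volume_ml = wet_weight_g - dry_weight_g
    return volume_ml, volume_ml / minutes

# Illustrative figures: 2.0 g dry gauze weighing 6.5 g after 5 minutes
vol, rate = saliva_collected_ml(dry_weight_g=2.0, wet_weight_g=6.5, minutes=5)
print(f"{vol:.1f} mL collected, {rate:.2f} mL/min")  # 4.5 mL collected, 0.90 mL/min
```

Repeating the measurement under the same conditions allows drooling severity to be tracked over time or across treatments.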
Drooling of saliva, a challenging condition, is better managed with a multidisciplinary team approach. The team includes a primary care physician, speech therapist, occupational therapist, dentist, orthodontist, otolaryngologist, paediatrician and neurologist. After initial assessment, a management plan can be made with the patient. The person/carer should understand that the goal of treating drooling is a reduction in excessive salivary flow while maintaining a moist and healthy oral cavity. Avoidance of xerostomia (dry mouth) is important.
There are two main approaches
Non invasive modalities e.g. oral motor therapy, pharmacological therapy
Invasive modalities e.g. surgery and radiotherapy
No single approach is totally effective, and treatment is usually a combination of these techniques. The first step in management of drooling is correction of reversible causes. Less invasive and reversible methods, namely oral motor therapy and medication, are usually implemented before surgery is undertaken.5
Non invasive modalities
Positioning - prior to implementation of any therapy, it is essential to look at the position of the patient. When seated, a person should be fully supported and comfortable. Good posture with proper trunk and head control provides the basis for improving oral control of drooling and swallowing.
Eating and drinking skills - drooling can be exacerbated by poor eating skills. Special attention to, and development of, better techniques in lip closure, tongue movement and swallowing may lead to some improvement. Acidic fruits and alcohol stimulate further saliva production, so avoiding them will help to control drooling.10
Oral facial facilitation - this technique helps to improve oral motor control, sensory awareness and frequency of swallowing. Scott and Staios et al18 noted improvement in drooling in patients with both hypertonic and hypotonic muscles using this technique. It includes different approaches, normally undertaken by a speech therapist, which improve muscle tone and saliva control. Most studies show short-term benefit with little benefit in the long run. The technique can be practised easily, has no side effects and can be ceased if no benefit is noted.
a) Icing - the effect usually lasts 5-30 minutes; improves tone and the swallow reflex.
b) Brushing - as the effect lasts 20-30 minutes, it is suggested that this be undertaken before meals.
c) Vibration - improves tone in high-tone muscles.
d) Manipulation - techniques such as tapping, stroking, patting and applying firm pressure directly to muscles with the fingertips are known to improve oral awareness.
e) Oral motor sensory exercise - includes lip and tongue exercises.
Speech therapy - this should be started early to obtain good results. The goals are to improve jaw stability and closure, to increase tongue mobility, strength and positioning, to improve lip closure (especially during swallowing) and to decrease nasal regurgitation during swallowing.
Behaviour therapy - this uses a combination of cueing, overcorrection, and positive and negative reinforcement to help drooling. Desired behaviours, such as swallowing and mouth wiping, are encouraged, whereas open mouth and thumb sucking are discouraged. Behaviour modification is useful to achieve (1) increased awareness of the mouth and its functions, (2) increased frequency of swallowing and (3) increased swallowing skills. It can be carried out by family members and friends. Although no randomized controlled trial has been done, over 17 articles published in the last 25 years show promising results and improved quality of life. The absence of reported side effects makes behavioural intervention an attractive initial option compared with surgery, botulinum toxin or pharmaceutical management. Behavioural interventions are also useful before and after medical management such as botulinum toxin or surgery.
Oral prosthetic devices - a variety of prosthetic devices can be beneficial, e.g. chin cups and dental appliances, to achieve mandibular stability, better lip closure, tongue position and swallowing. Cooperation and comfort of the patient are essential for good results.
A systematic review of anticholinergic drugs showed benztropine, glycopyrrolate and benzhexol hydrochloride to be effective in the treatment of drooling. However, these drugs have adverse side effects, and none has been identified as superior.
Hyoscine - the effect of oral anticholinergic drugs has been limited in the treatment of drooling. Transdermal scopolamine (1.5 mg/2.5 cm2) offers advantages: a single application is considered to provide a stable serum concentration for 3 days, releasing scopolamine through the skin into the bloodstream. Transdermal scopolamine has been shown to be very useful in the management of drooling, particularly in patients with neurological or neuropsychiatric disturbances or severe developmental disorders.
Glycopyrrolate - studies have shown 70-90% response rates, but with a high side-effect rate. Approximately 30-35% of patients choose to discontinue treatment due to unacceptable side effects such as excessive dry mouth, urinary retention, decreased sweating, skin flushing, irritability and behaviour changes. A study of 38 patients with drooling due to neurological deficits showed up to a 90% response rate. Mier et al21 reported glycopyrrolate to be effective in the control of excessive sialorrhea in children with developmental disabilities, although approximately 20% of children given glycopyrrolate may experience adverse effects substantial enough to require discontinuation of the medication.
Antimuscarinic drugs, such as benzhexol, have also been used, but their use is limited by troublesome side effects.
Antireflux medication - the role of antireflux medication (ranitidine and cisapride) in patients with gastroesophageal reflux due to esophageal dysmotility and reduced lower esophageal tone was examined in one study21, which did not show any benefit.
Modafinil - one case study noted decreased drooling in two patients who were using the drug for other reasons, but no further studies have been done.
Alternative medications (papaya and grape seed extract) - mentioned in the literature as being used to dry secretions, but no research into their efficacy has been conducted.
Botulinum toxin - it was in 1822 that a German poet and physician, Justinus Kerner, discovered that patients suffering from botulism complained of severe dryness of the mouth, suggesting that the toxin causing botulism could be used to treat hypersalivation. However, it is only in the past few years that botulinum toxin type A (BTx-A) has been used for this purpose. BTx-A binds selectively to cholinergic nerve terminals and rapidly attaches to acceptor molecules at the presynaptic nerve surface. This inhibits the release of acetylcholine from vesicles, resulting in reduced function of parasympathetically controlled exocrine glands. The blockade, though reversible, is temporary, as new nerve terminals sprout to create new neural connections. Studies have shown that injection of botulinum toxin into the parotid and submandibular glands successfully subdues the symptoms of drooling30,31. Although there is wide variation in recommended dosage, most studies suggest that about 30-40 units of BTx-A injected into the parotid and submandibular glands are enough for the symptoms to subside. The injection is usually given under ultrasound guidance to avoid damage to underlying vasculature and nerves. The main side effects of this treatment are dysphagia due to diffusion into nearby bulbar muscles, weak mastication, parotid gland infection, damage to the facial nerve/artery and dental caries.
Patients with neurological disorders who received BTx-A injections showed a statistically significant effect at 1 month post injection compared with controls, and this significance was maintained at 6 months. Intrasalivary gland BTx-A was shown to have a greater effect than scopolamine.
The effects of BTx-A are time limited and this varies between individuals.
Surgery can be performed to remove salivary glands (most procedures focus on the parotid and submandibular glands), to ligate or reroute salivary gland ducts, or to interrupt the parasympathetic nerve supply to the glands. Wilke, a Canadian plastic surgeon, was the first to propose and carry out parotid duct relocation to the tonsillar fossae to manage drooling in patients with cerebral palsy. One of the best-studied procedures, with a large number of patients and long-term follow-up data, is submandibular duct relocation32,33.
Intraductal laser photocoagulation of the bilateral parotid ducts has been developed as a less invasive means of surgical therapy. Early reports have shown some impressive results34.
Overall, surgery reduces salivary flow, and drooling can be significantly improved, often with immediate results: three studies noted that 80-89% of participants had improved control of their saliva. Two studies discussed changes in quality of life; one found that 80% of participants improved across a number of different measures, including receiving affection from others and opportunities for communication and interaction. Most evidence regarding surgical outcomes of sialorrhea management is of low quality and heterogeneous. Despite this, most patients experience a subjective improvement following surgical treatment36.
Radiotherapy - irradiation of the major salivary glands in doses of 6000 rad or more is effective. Side effects, which include xerostomia, mucositis, dental caries and osteoradionecrosis, may limit its use.
Chronic drooling can pose difficulty in management
Early involvement of Multidisciplinary team is the key.
Combination of approach works better
Always start with noninvasive, reversible, least destructive approach
Surgical and destructive methods should be reserved as the last resort.
Lumbar punctures are commonly performed by both medical and anaesthetic trainees, but in different contexts. Medically performed lumbar punctures are often used to confirm a diagnosis (meningitis, subarachnoid haemorrhage), whilst lumbar punctures performed by anaesthetists usually precede the injection of local anaesthetic into cerebrospinal fluid for spinal anaesthesia. The similarity lies in the fact that both involve the potential for iatrogenic introduction of infection into the subarachnoid space. The incidence of iatrogenic infection is very low in both fields; a recent survey by the Royal College of Anaesthetists1 reported an incidence of 8/707,000, whilst there are only approximately 75 cases in the literature after ‘medical’ lumbar puncture.2 However, the consequences of iatrogenic infection can be devastating. It is likely that appropriate infection control measures taken during lumbar puncture would reduce the risk of bacterial contamination. The purpose of the present study is to compare infection control measures taken by anaesthetic and medical staff when performing lumbar puncture.
A survey was constructed online (www.surveymonkey.com) and sent by email to 50 anaesthetic and 50 acute medical trainees in January 2011. All participants were on an anaesthetic or medical training programme and all responses were anonymous. The survey asked whether trainees routinely used the following components of an aseptic technique3 when performing lumbar puncture:
Clean patient skin
Decontaminate hands
Use a non-touch technique
Wear sterile gloves
Use a sterile trolley
Wear an apron/gown
Use a dressing pack
No ethical approval was sought as the study was voluntary and anonymous.
The overall response rate was 71% (40/50 anaesthetic trainees and 31/50 medical). All anaesthetic trainees routinely used the components of an aseptic technique when performing lumbar puncture. All medical trainees routinely cleaned the skin, decontaminated their hands and used a non-touch technique but only 80.6% used sterile gloves. 61.3% of medical trainees used a sterile trolley, 38.7% used an apron/gown and 77.4% used a dressing pack.
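The reported percentages for the medical trainees follow directly from the raw counts out of 31 respondents. As a quick arithmetic check (the counts below are inferred from the reported percentages, not taken from the study's data tables):

```python
# Reconstructing the medical-trainee compliance percentages from
# inferred raw counts (31 medical respondents). Counts are illustrative.

medical_n = 31
counts = {
    "sterile gloves": 25,   # 25/31 = 80.6%
    "sterile trolley": 19,  # 19/31 = 61.3%
    "apron/gown": 12,       # 12/31 = 38.7%
    "dressing pack": 24,    # 24/31 = 77.4%
}
for item, n in counts.items():
    print(f"{item}: {100 * n / medical_n:.1f}%")

# Overall response rate: 40 anaesthetic + 31 medical of 100 surveyed
print(f"response rate: {100 * (40 + 31) / 100:.0f}%")  # response rate: 71%
```

Each percentage in the text rounds consistently with a denominator of 31, supporting the reported response figures.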
This survey shows that adherence to infection control measures differs between anaesthetic and medical trainees performing lumbar puncture. Anaesthetic trainees had 100% compliance with all components of the aseptic technique, compared with 80% among medical trainees. Both groups routinely cleaned the patient’s skin, decontaminated their hands and used a non-touch technique. However, there were significant differences in the use of other equipment, with fewer medical trainees using sterile gloves, trolleys, aprons and dressing packs.
Although the incidence of iatrogenic infection after lumbar puncture is low, it is important to contribute to this low incidence by adopting an aseptic technique. There may be differences with regards to the risks of iatrogenic infection between anaesthetic and medical trainees. Anaesthetic lumbar punctures involve the injection of a foreign substance (local anaesthesia) into the cerebrospinal fluid and may therefore carry a higher risk. Crucially however, both anaesthetic and medical lumbar punctures involve accessing the subarachnoid space with medical equipment and so the risk is present.
There are many reasons for the differing compliance rates between the two specialties. Firstly, anaesthetic trainees perform lumbar punctures in a dedicated anaesthetic room whilst the presence of ‘procedure/treatment rooms’ is not universal on medical wards. Secondly, anaesthetic trainees will always have a trained assistant present (usually an operating department practitioner, ODP) who can assist with preparing equipment such as dressing trolleys.
The mechanism of iatrogenic infection during lumbar puncture is not completely clear.4 The source of microbial contamination could be external (incomplete aseptic technique, infected equipment) or internal (bacteraemia in the patient); the fact that common causes of iatrogenic meningitis are viridans streptococcal strains5 (mouth commensals) supports the notion that external factors are relevant and that an aseptic technique is important.
It is very likely that improved compliance amongst acute medical trainees would result from a dedicated treatment room on medical wards, but this is likely to involve financial and logistical barriers. The introduction of specific ‘lumbar puncture packs’, which include all necessary equipment (e.g. cleaning solution, aprons, sterile gloves) may reduce the risk of infection; the introduction of a specific pack containing equipment for central venous line insertion reduced colonisation rates from 31 to 12%.6 The presence of trained staff members to assist medical trainees when performing lumbar puncture may assist in improved compliance, similar to the role of an ODP for anaesthetic trainees.
The main limitation of this study is that the sample size is small. However, we feel that this study raises important questions as to why there is a difference in infection control measures taken by anaesthetic and medical trainees; it may be that the environment in which the procedure takes place is crucial and further work on the impact of ‘procedure rooms’ on medical wards is warranted.
Hepatitis B (HB) is a major disease and a serious global public health problem. According to the latest WHO figures, about 2 billion people worldwide have been infected with the hepatitis B virus (HBV). Interestingly, rates of new infection and acute disease are highest among adults, but chronic infection is more likely to occur in persons infected as infants or young children, leading to cirrhosis and hepatocellular carcinoma in later life. More than 350 million persons are reported to have chronic infection globally at present1,2. These chronically infected people are at high risk of death from cirrhosis and liver cancer; the virus kills about 1 million persons each year. For a newborn infant whose mother is positive for both HB surface antigen (HBsAg) and HB e antigen (HBeAg), the risk of chronic HBV infection is 70%-90% by the age of 6 months in the absence of post-exposure immunoprophylaxis3.
HB vaccination is the only effective measure to prevent HBV infection and its consequences. Since its introduction in 1982, recommendations for HB vaccination have evolved into a comprehensive strategy to eliminate HBV transmission globally4. In the United States during 1990–2004, the overall incidence of reported acute HB declined by 75%, from 8.5 to 2.1 per 100,000 population. The most dramatic decline occurred in children and adolescents. Incidence among children aged <12 years and adolescents aged 12-19 years declined by 94% from 1.1 to 0.36 and 6.1 to 2.8 per 100,000 population, respectively2,5.
Populations of countries with intermediate and high endemicity are at high risk of acquiring HB infection. Pakistan lies in an intermediate endemic region, with a prevalence of 3-4% in the general population6. WHO has included the HB vaccine in the Expanded Programme on Immunisation (EPI) globally since 1997; Pakistan included HB vaccination in the EPI in 2004. Primary vaccination consists of 3 intramuscular doses of the HB vaccine. Studies show seroprotection rates of 95% with the standard immunisation schedule at 0, 1 and 6 months using a single-antigen HB vaccine among infants and children7,8. Almost similar results have been reported with immunisation schedules giving HB injections (either single-antigen or combination vaccines) at 6, 10 and 14 weeks along with other vaccines in the EPI schedule. However, various factors such as age, gender, and genetic and socioenvironmental influences are likely to affect seroprotection rates9. There is therefore a need to know the actual seroprotection rates in our population, where different vaccines (EPI-procured and privately procured), incorporated into different schedules, are used. This study was conducted to determine the real status of seroprotection against HB in our children. The results will help in future policy-making, highlighting our shortcomings, allowing comparison of our programme with international standards, and augmenting future confidence in vaccination programmes.
Materials And Methods
This study was conducted at the vaccination centres and paediatric OPDs (outpatient departments) of CMH and MH, Rawalpindi, Pakistan. Children reporting for measles vaccination at the vaccination centres at 9 months of age were included. Their vaccination cards were examined to ensure that they had received 3 doses of HB vaccine according to the EPI schedule, duly endorsed on their cards. They were mainly children of soldiers, but also some civilians who were invited for EPI vaccination at the MH vaccination centre. Children of officers were similarly included from the CMH vaccination centre, and their vaccination record was verified by examining their vaccination cards. Some civilians who had received private HB vaccination were included from the paediatric OPDs. Some children older than 9 months and less than 2 years of age who reported with non-febrile minor illnesses to the paediatric OPDs at CMH and MH were also included, and their vaccination status was confirmed by examining their vaccination cards.
Inclusion criteria:
1) Male and female children >9 months and <2 years of age.
2) Children who had received 3 doses of HBV according to the EPI schedule at 6,10 and 14 weeks.
3) Children who had a complete record of vaccination- duly endorsed in vaccination cards.
4) Children who did not have a history of any chronic illness.
Exclusion criteria:
1) Children who did not have proper vaccination records endorsed in their vaccination cards.
2) Interval between last dose of HBV and sampling was <1 month.
3) Children suffering from acute illness at time of sampling.
4) Children suffering from chronic illness or on immunosuppressive drugs.
Informed consent for blood sample collection was obtained from the parents or guardians. The study and the informed consent form were approved by the institutional ethical review board. Participants were informed about the results of HBs antibody screening. After proper antiseptic measures, blood samples (3.5 ml) were obtained by venepuncture using auto-disabled syringes. Collected blood samples were taken in vacutainers and labelled with the identification number and name of the child. Samples were immediately transported to the Biochemistry Department of Army Medical College, kept upright for half an hour and then centrifuged for 10 minutes. Supernatant serum was separated and stored at -20 °C in 1.5 ml Eppendorf tubes until the test was performed. Samples were tested by ELISA (DiaSorin S.p.A., Italy, kit) for detection of anti-HBs antibodies according to the manufacturer’s instructions. The diagnostic specificity of this kit is 98.21% (95% confidence interval 97.07-99.00%) and its diagnostic sensitivity is 99.11% (95% confidence interval 98.18-99.64%), as claimed by the manufacturer. Anti-HBs antibody enumeration was done after all 3 doses of vaccination (at least 1 month after the last dose was received).
As per WHO standards, anti-HBs antibody titres >10 IU/L were taken as protective (seroprotected against HB infection), while samples showing antibody titres <10 IU/L were considered non-protected. All relevant information was entered in a predesigned data sheet and used accordingly at the time of analysis. Items entered included age, gender, place of vaccination, type of vaccine (privately or government procured), number of doses and entitlement status (dependent of military personnel or civilian). The study was conducted from 1st January 2010 to 31st December 2010.
Data were analysed using SPSS version 15. Descriptive statistics were used to describe the data: mean and standard deviation (SD) for quantitative variables, and frequency and percentages for qualitative variables. Quantitative variables were compared using the independent-samples t-test and qualitative variables using the chi-square test. A P-value <0.05 was considered significant.
The mean age of the children was 13.7 months.
One hundred and ninety-four children, who had received HB vaccination according to EPI schedule, were tested for anti-HBs titres. Out of them 61 (31.4%) had anti-HBs titres less than 10 IU/L (non-protective level) while 133 (68.6%) had anti-HBs titres above 10 IU/L (protective level) as shown in Figure 1. The GMT of anti-HBs among the individuals having protective levels (> 10 IU/L) was found to be 85.81 IU/L.
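The geometric mean titre (GMT) quoted above is the antilog of the mean of the log-transformed titres. A minimal sketch of the calculation, using hypothetical anti-HBs values for illustration rather than the study data:

```python
import math

def geometric_mean_titre(titres):
    """Geometric mean: exponential of the mean of the natural-log titres."""
    return math.exp(sum(math.log(t) for t in titres) / len(titres))

# Hypothetical anti-HBs titres (IU/L), for illustration only
sample_titres = [12, 45, 85, 150, 420, 980]
print(round(geometric_mean_titre(sample_titres), 2))  # → 118.96
```

The geometric mean is preferred over the arithmetic mean for antibody titres because titre distributions are strongly right-skewed.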
Figure 2 shows that anti-HBs titres between 10–100 IU/L were found in 75 (50.4%) children. Twenty-six (19.5%) individuals had titres between 100–200 IU/L, 20 (14%) children had titres between 200–500 IU/L, 10 (7%) children had titres between 500–1000 IU/L and only 2 (1.5%) children had anti-HBs titres >1000 IU/L.
One hundred and eighty-four children received vaccine supplied by government sources (Quinvaxem, Novartis), out of which 61 (33.1%) had anti-HBs titres <10 IU/L (non-protective) and 123 (66.9%) had anti-HBs titres >10 IU/L (protective). Only 10 children had received vaccine obtained from a private source (Infanrix Hexa, GSK), and all 10 (100%) had anti-HBs titres >10 IU/L (protective). Comparison between the two groups revealed the difference to be significant (P value = 0.028).
One hundred and thirty-two children received vaccination from army health facilities (CMH and MH), of whom 36 (27.3%) had anti-HBs titres <10 IU/L while 96 (72.7%) had anti-HBs titres >10 IU/L. Sixty-two children were vaccinated at civilian health facilities (health centres or vaccination teams visiting homes); of them 25 (40.3%) had anti-HBs titres <10 IU/L while 37 (59.7%) had anti-HBs titres >10 IU/L. The difference was not significant (P value = 0.068). Gender analysis revealed that 129 (68.5%) of the study group were male children; of them 34 (26.4%) had anti-HBs titres <10 IU/L and 95 (73.6%) had anti-HBs titres >10 IU/L. Sixty-five (31.5%) were female children; of them 27 (41.5%) had anti-HBs titres <10 IU/L while 38 (58.5%) had anti-HBs titres >10 IU/L. Statistical analysis revealed the difference between males and females to be significant (P value = 0.032).
One hundred and twenty-two (62.9%) children were less than 1 year of age; of them 37 (30.3%) had anti-HBs titres <10 IU/L and 85 (69.7%) had anti-HBs titres >10 IU/L. Seventy-two (37.1%) children were between 1 and 2 years of age; of them 24 (33.3%) had anti-HBs titres <10 IU/L while 48 (66.7%) had anti-HBs titres >10 IU/L. On comparison, the difference between the two groups was not significant (P value = 0.663), as shown in Table 1.
Group                  Anti-HBs titres <10 IU/L (n = 61)   Anti-HBs titres >10 IU/L (n = 133)   P-value
< 1 year (n = 122)     37 (30.3%)                          85 (69.7%)                           0.663 (NS)
> 1 year (n = 72)      24 (33.3%)                          48 (66.7%)
Male (n = 129)         34 (26.4%)                          95 (73.6%)                           0.032 (*)
Female (n = 65)        27 (41.5%)                          38 (58.5%)
Army (n = 132)         36 (27.3%)                          96 (72.7%)                           0.068 (NS)
Civilian (n = 62)      25 (40.3%)                          37 (59.7%)
Government (n = 184)   61 (33.1%)                          123 (66.9%)                          0.028 (*)
Private (n = 10)       0 (0%)                              10 (100%)
Table 1 (NS = not significant; * = significant)
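The group comparisons reported above can be reproduced from the 2×2 counts given in the text with Pearson’s chi-square test. A sketch in plain Python, assuming (as the reported P-values suggest) that the uncorrected chi-square test was used; for df = 1 the P-value is exactly erfc(√(χ²/2)):

```python
import math

def chi2_2x2(a, b, c, d):
    """Pearson chi-square without continuity correction for the
    2x2 table [[a, b], [c, d]]; returns (statistic, P-value), df = 1."""
    n = a + b + c + d
    chi2 = n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))
    p = math.erfc(math.sqrt(chi2 / 2))  # exact chi-square survival function for df = 1
    return chi2, p

# Government source (61 non-protected / 123 protected) vs private (0 / 10)
_, p_source = chi2_2x2(61, 123, 0, 10)
# Male (34 / 95) vs female (27 / 38)
_, p_gender = chi2_2x2(34, 95, 27, 38)
print(round(p_source, 3), round(p_gender, 3))  # → 0.028 0.032, matching the reported values
```

Note that with Yates’ continuity correction the source comparison would not reach significance (P ≈ 0.06), so the very small private-source group (n = 10) deserves cautious interpretation.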
HB is a global health problem with variable prevalence in different parts of the world1. Various studies carried out in different parts of Pakistan, in different population groups, have shown diverse figures for the prevalence of HB. However, a figure of 3-4% is broadly accepted by consensus, making Pakistan an area of intermediate endemicity for HB6. When we extrapolate these figures to our population, it is estimated that Pakistan hosts about seven million carriers of HB, out of the worldwide 350 million carriers of HB10,11.
Age at the time of infection plays the most important role in determining whether HBV disease becomes acute or chronic. HBV infection acquired in infancy carries a very high risk of chronic liver disease due to HBV in later life12. HB is a preventable disease and, fortunately, vaccination at birth and during infancy could eradicate the disease globally if the vaccination strategy were effectively implemented13. The HB vaccine can be regarded as the first anti-cancer vaccine, since it prevents hepatocellular carcinoma in later life.
In Pakistan, the HB vaccine was included in the EPI in 2004, given along with DPT (Diphtheria, Pertussis, Tetanus) at 6, 10 and 14 weeks of age. The vaccine is provided to health facilities through the government health infrastructure. Private HB vaccines, supplied as a single antigen or in combination vaccines, are also available in the market. The efficacy of these recombinant vaccines is claimed to be more than 95% among children and 90% among normal healthy adults14. The immunity conferred by HB vaccination is directly measured by the development of anti-HBs antibody titres above 10 IU/L, which is considered a protective level15. However, it is estimated that 5–15% of vaccine recipients may not develop this protective level and remain non-responders for the reasons discussed below16. Published studies regarding antibody development in relation to various factors, in terms of immunogenicity and seroprotection, show highly varied results. Multiple factors such as dose, dosing schedule, sex, storage, site and route of administration, obesity, genetic factors, diabetes mellitus and immunosuppression affect the antibody response to the HB vaccine17.
Although the HB vaccine was included in the EPI in Pakistan in 2004, to our knowledge no published data showing seroconversion and seroprotection among vaccine recipients of this programme is yet available at a national level. Our study revealed that of 194 children, only 133 (68.6%) had anti-HBs titres in the protective range (>10 IU/L), while 61 (31.4%) did not develop seroprotection. These results are low compared with other international studies: a study from Bangladesh among EPI-vaccinated children showed a seroprotection rate of 92.2%13, while studies from Brazil18 and South Africa19 reported seroprotection rates of 90.0% and 86.6%, respectively. Studies from Pakistan carried out in adults also show seroprotection rates (anti-HBs titres >10 IU/L) of more than 95% in Karachi University students14 and 86% in health care workers of Aga Khan University Hospital20. However, in these studies the dosing schedule was 0, 1 and 6 months and the participants were adults, and their results are consistent with international reports.
The gravity of low seroprotection after HB vaccination is further aggravated when we extrapolate these figures to our overall low vaccination coverage rates of 37.6% to 45%, as shown in studies at Peshawar and Karachi, respectively21,22. One can thus expect a significantly high percentage of individuals to remain vulnerable to HBV infection even after receiving the HB vaccine through an extensive national EPI programme. Therefore, a large population still remains exposed to the risk of HBV infection, and national and global eradication of HBV infection will remain a dream. Failure of seroprotection after HBV vaccination in the EPI will also project a false sense of protection among vaccine recipients.
Dosing schedule is an important factor in the development of an antibody response and titre levels. According to the Advisory Committee on Immunization Practices (ACIP) of America, there should be a minimum gap of 8 weeks between the second and third doses and at least 16 weeks between the first and third doses of the HB vaccination23. To minimise frequent visits and improve compliance, the dosing schedule has been compressed in the EPI to 6, 10 and 14 weeks24. Although some studies have shown this schedule to be effective, the GMT of anti-HBs antibodies achieved was lower than that achieved with the standard WHO schedule25. This may be one explanation for the lower rates of seroprotection in our study. The GMT achieved in our study among children with protective antibody levels was 85.81 IU/L, which is lower than in most other studies; this supports the observation that the GMT achieved with this schedule is lower than that produced by the standard WHO schedule. Such low titres may result in breakthrough HB infection in vaccinated individuals in later life due to waning immunity. However, the immune memory hypothesis supports protection of vaccinated individuals in later life in spite of low anti-HBs antibody titres26. Further studies are required to dispel this risk.
Another shortcoming of this schedule is that it misses the dose at birth (the ‘0 dose’). It has been reported that the 0 dose of the HB vaccine alone is 70%-95% effective as post-exposure prophylaxis in preventing perinatal HBV transmission, even without HB immunoglobulins27. This may also have contributed to the lower rates of seroprotection in our study, as we did not perform HBsAg and other relevant tests to rule out HBV infection in these children. Moreover, pregnant women are by and large not screened routinely for HBV infection in the public sector in Pakistan, except in a few big cities such as Islamabad, Lahore or Karachi. Therefore, we do not know the HB status of pregnant mothers, and the risk of transmission to babies remains high. Different studies have reported widely varying figures for HB status in pregnant women: a study from Karachi reports that 1.57% of pregnant women are positive for HBsAg, while a study from Rahim Yar Khan reports this figure to be up to 20%28,29. A study by Waheed et al regarding the transmission of HBV infection from mother to infant reports the risk to be up to 90%30. All of these studies support the importance of the birth dose of the HB vaccine and reinforce the fact that control and eradication of HB with the present EPI schedule is not possible. Jain, from India, has reported a study using an alternative schedule of 0, 6 weeks and 9 months, which was comparable to the standard WHO schedule of 0, 1 and 6 months in terms of seroprotection and GMT levels achieved31. This schedule can be synchronised with the EPI schedule, avoiding extra visits while incorporating the birth dose, and a similar schedule could be incorporated in our national EPI.
In our study, seroprotection rates were found to be lower in females, and the difference was significant. This finding differs from other studies, which report lower seroprotection rates in males32. Although the number of female children was smaller, there is no plausible explanation for this observation. The site of inoculation of the HB vaccine is also very important for an adequate immune response: vaccines given in the buttocks or intradermally produce lower antibody titres than intramuscular injections given in the outer aspect of the thigh in children, due to poor distribution and absorption of the vaccine within the host body. Giving vaccinations in the buttocks, which vaccinators find convenient for intramuscular injection in children, is a common observation. This may also be one reason for the low seroprotection rates in our study, as we selected at random children who had received vaccination at public health facilities, apart from a small number of private cases.
The effectiveness of the vaccine also depends on the source of procurement and proper maintenance of the cold chain. In this study, 100% seroprotection was observed in children who received the HB vaccine procured from a private source. Although the number of private cases was small, the factors of vaccine source and cold chain need attention. Proper training of EPI teams regarding maintenance of temperature and injection techniques, together with motivation and monitoring, could improve outcomes substantially.
The findings of this study differ from the published literature because this is a cross-sectional observational study: it reports the actual seroprotection rates after HB vaccination under the EPI schedule, whereas most other studies report results after controlling for influencing factors such as type of vaccine, dose, schedule, route of administration, training and monitoring of local EPI teams, and health status of vaccine recipients. This study is therefore an effort to look at a practical scenario and evaluate outcomes, which can help in framing future guidelines to achieve the goal of control and eradication of HB infection. Further large-scale studies are required to determine the effect of HB vaccination at a national level.
The HB vaccination programme has decreased the global burden of HBV infection, but the evidence of decreased burden is not uniform across the world population: figures show a marked decrease in the developed world, while statistics from the developing world show little change. Unfortunately, implementation of this programme is not uniformly effective in all countries, so reservoirs of infection and sources of continued HBV transmission persist. HBV infection is moderately endemic in Pakistan, and the HB vaccine has been included in the national EPI since 2004. The present study shows a seroprotection rate of only 68.6% in vaccine recipients, which is low compared with other studies; 31.4% of vaccine recipients remain unprotected even after vaccination. Moreover, the GMT achieved in seroprotected vaccine recipients is also low (85.81 IU/L). There can be multiple reasons for these results, such as the type of vaccine used, maintenance of the cold chain, route and site of administration, training and monitoring of EPI teams and the dosing schedule. In present practice, the very important birth dose is also missing. These observations warrant review of the situation and appropriate measures to rectify the above-mentioned factors, so that the desired seroprotection rates after HB vaccination in the EPI can be achieved among vaccine recipients.
The clinical features of early HAT are well defined, yet the features of delayed HAT are less clear. Delayed HAT is a rare complication of OLT that may present with biliary sepsis or remain asymptomatic. Sonography is extremely sensitive for the detection of HAT in symptomatic patients during the immediate postoperative period. However, the sensitivity of ultrasonography diminishes as the interval between transplantation and diagnosis of HAT increases due to collateral arterial flow. MRA is a useful adjunct in patients with indeterminate ultrasound exams and in those who have renal insufficiency or an allergy to iodinated contrast.
In the absence of hepatic failure, conservative treatment appears to be effective for patients with HAT but retransplantation may be necessary as a definitive treatment.
A 52-year-old male with a history of whole-graft OLT for primary sclerosing cholangitis presented with two days of fever, nausea, and mild abdominal discomfort.
One week prior to presentation, he had been seen in the liver clinic for regular follow-up. At that time, he was totally asymptomatic and his laboratory workup, including liver function tests, was within the normal range.
He had undergone OLT three years prior. At the time of transplant he required transfusion of 120 units of packed red blood cells, 60 units of fresh frozen plasma and 100 units of platelets due to extensive intraoperative bleeding secondary to chronic changes of pancreatitis and severe portal hypertension, but had an otherwise uneventful postoperative recovery.
On physical examination the temperature was 39°C, heart rate 125 beats per minute and respiratory rate 22 breaths per minute. Initial laboratory workup revealed a white blood cell count of 25,000/mm3, AST of 6230 U/L, ALT of 2450 U/L, total bilirubin of 11 mg/dL, BUN of 55 mg/dL and creatinine of 4.5 mg/dL. The lactate level was 5 mmol/L. Doppler ultrasonography revealed extensive intrahepatic gas (Image 1A). Computed tomography of the abdomen and pelvis revealed an extensive area of hepatic necrosis with abscess formation measuring 19x14 cm, with extension of gas into the peripheral portal vein branches (Image 1B,C). Upon admission to the hospital, the patient required endotracheal intubation, mechanical ventilator support and aggressive fluid resuscitation. He was started on broad-spectrum antibiotics and a percutaneous drain was placed that drained dark, foul-smelling fluid. Cultures from the blood and the drain grew Clostridium perfringens.
Magnetic resonance imaging (MRI) and magnetic resonance angiography (MRA) revealed occlusion of the hepatic artery 2 cm from its origin, as well as evidence of collaterals (Image 2A,B).
Image 1: (Panel A) Doppler ultrasonography reveals extensive intrahepatic gas. (Panels B&C) Computed tomography of the abdomen and pelvis reveals an extensive area of hepatic necrosis with abscess formation measuring 19x14 cm, with extension of gas into the peripheral portal vein branches.
Image 2: MRI & MRA reveal occlusion of the hepatic artery 2 cm from its origin and also evidence of collaterals.
Following drain placement, the patient’s clinical condition markedly improved, with a significant reduction in liver function test values. Retransplantation was considered but deferred in the setting of active infection and in view of the significant clinical and laboratory improvement.
The patient was transferred to the medical floor in stable condition, and the drain was then removed.
A week later the patient developed low-grade fevers and tachycardia. One day later he began to experience mild abdominal discomfort and high-grade fevers. Repeat CT of the abdomen revealed worsening hepatic necrosis and formation of new abscesses. His clinical condition decompensated quickly thereafter, requiring endotracheal intubation, mechanical ventilation and aggressive resuscitation. A percutaneous drain was placed and again drained purulent, foul-smelling material. His overall condition deteriorated, and he expired a few days later.
Delayed HAT (more than 4 weeks after transplantation) is a rare complication of OLT, with an estimated incidence of around 2.8%1.
Risk factors associated with the development of HAT include Roux-en-Y biliary reconstruction, cold ischaemia and operative time, the use of more than 6 units of blood, the use of more than 15 units of plasma, and the use of aortic conduits for arterial reconstruction during transplant surgery2.
Collateralization is more likely to develop after Live Donor Liver Transplantation (LDLT) than after whole-graft cadaveric OLT3. Therefore, the latter is also associated with increased risk of late HAT.
Although the clinical features of early HAT are well described, the features of delayed HAT are less clearly defined1: the patient may present with manifestations of biliary sepsis or may remain asymptomatic for years. Right upper quadrant pain has been reported to occur in both immediate and delayed HAT. The clinical presentations may include recurrent episodes of cholangitis, cholangitis with a stricture, cholangitis and intrahepatic abscesses, and bile leaks1. Doppler ultrasonography has been extremely sensitive for the detection of HAT in symptomatic patients during the immediate postoperative period but becomes less sensitive as the interval between transplantation and diagnosis of HAT increases because of collateral arterial flow4.
3D gadolinium-enhanced MRA provides excellent visualization of arterial and venous anatomy with a fairly high technical success rate. MRA is a useful adjunct in patients with indeterminate ultrasonography examinations and in those who have renal insufficiency or an allergy to iodinated contrast5.
Antiplatelet prophylaxis can effectively reduce the incidence of late HAT after liver transplantation, particularly in those patients at risk for this complication6. Vivarelli et al reported an overall incidence of late HAT of 1.67%, with a median time of presentation of 500 days; late HAT was reported in 0.4% of patients who were maintained on antiplatelet prophylaxis compared to 2.2% in those who did not receive prophylaxis6. The option of performing thrombolysis remains controversial. Whether thrombolysis is a definitive therapy or mainly a necessary step in the proper diagnosis of the exact etiology of HAT depends mostly on the particular liver center and needs further analysis7. Definitive endoluminal success cannot be achieved without resolving associated, and possibly instigating, underlying arterial anatomical defects. Reestablishing flow to the graft can unmask underlying lesions as well as allow assessment of the surrounding vasculature, thus providing anatomical information for a more elective, better planned and definitive surgical revision7. Whether surgical revascularization is a viable alternative to retransplantation or only a bridging measure to delay a second transplantation has been a longstanding controversy in the treatment of HAT.
Biliary or vascular reconstruction does not increase graft survival, and ongoing severe sepsis at the time of re-grafting results in poor survival7. However, although uncommon, delayed HAT is a major indication for retransplantation7. In the absence of hepatic failure, conservative treatment appears to be effective for patients with hepatic artery thrombosis.
C. perfringens is an anaerobic, gram-positive rod frequently isolated from the biliary tree and gastrointestinal tract. Inoculation of Clostridium spores into necrotic tissue is associated with formation of hepatic abscess8.
Necrotizing infections of the transplanted liver are rare: around 20 cases of gas gangrene or necrotizing infections of the liver have been reported in the literature. Around 60% of these infections were caused by clostridial species, with C. perfringens accounting for most of them. Around 80% of patients infected with Clostridium died, frequently within hours of becoming ill9,10. Those who survived underwent prompt retransplantation, and in them the infection had not resulted in shock or other systemic changes that would significantly decrease the likelihood of successful retransplantation8.
Because the liver has contact with the gastrointestinal tract via the portal venous system, intestinal tract bacteria may enter the liver via translocation across the intestinal mucosa into the portal venous system. Clostridial species can also be found in the bile of healthy individuals undergoing cholecystectomy9,10.
The donor liver can also be the source of bacteria. Donors may have conditions that favor the growth of bacteria in bile or the translocation of bacteria into the portal venous blood, including trauma to the gastrointestinal tract, prolonged intensive care unit admissions, periods of hypotension, use of inotropic agents, and other conditions that increase the risk of infection8,9,10. C. perfringens sepsis in OLT recipients has been uniformly fatal without emergent retransplantation; survival from C. perfringens sepsis managed without exploratory laparotomy or emergency treatment has been reported only extremely rarely8. In patients who survive, and in whom the infection has not resulted in shock or multiple organ failure, retransplantation may be successful8.
Although our patient survived his intensive care course, his recovery was tenuous as he quickly developed additional hepatic abscesses that led to his eventual demise. Post-mortem examination in our patient revealed intra-hepatic presence of Clostridium perfringens.
He was managed conservatively since he markedly improved both clinically and by liver function tests. Because of this, retransplantation was delayed. He was also already on antiplatelet prophylaxis.
We report an interesting case of Clostridium perfringens hepatic abscess due to late HAT following OLT. Although the patient initially improved with non-surgical treatment, he eventually died. In similar cases, besides aggressive work-up and medical management, retransplantation may be necessary for a better long-term outcome.
Many subjects with chronic Hepatitis C Virus (HCV) infection show persistently normal alanine aminotransferase (ALT) levels (PNALT)1-4 and were thus formerly defined as ‘healthy’ or ‘asymptomatic’ HCV carriers.1 However, it is now clear that only a minority of these people (15-20%) show a normal liver.5-7 Therefore, ‘normal ALT’ does not always mean ‘healthy liver.’4
It is known that during the course of HCV infection ALT levels can fluctuate widely, with long periods of biochemical remission.1-4 Thus, at least two different subsets of HCV-PNALT carriers exist: patients with temporal ALT fluctuations, which may remain within the normal range for several months, and true ‘biochemically silent’ carriers showing persistently normal ALT values.4 This means that the observation period should not be shorter than 12-18 months, and ALT determinations should be performed every 2-3 months.4, 6
Although liver damage is usually mild,1, 2 the presence of more severe chronic hepatitis (CH) or cirrhosis has been reported despite consistently normal liver biochemistry.8 Although some studies showed that HCV carriers with normal ALT have mild and rather stable disease, others reported significant progression of fibrosis in approximately 20-30% of patients with ALT normality.9 The development of hepatocellular carcinoma (HCC) has also been described.10 Sudden worsening of disease, with ALT increase and histological deterioration, has been reported after many years of follow-up.11
Finally, HCV carriers with PNALT may suffer from extra-hepatic manifestations, sometimes more severe than the underlying liver disease: lymphoproliferative disorders, mixed cryoglobulinaemia, thyroid disorders, sicca syndrome, porphyria cutanea tarda, lichen planus, diabetes, chronic polyarthritis, etc.1, 2, 12
Therefore, the possibility of progression to more severe liver damage despite persistently normal biochemistry, the risk of HCC, the possibility of extra-hepatic diseases, and economic considerations, suggest that HCV-infected persons with PNALT should not be excluded a priori from antiviral treatment.1, 2
The earliest guidelines discouraged interferon (IFN) treatment in patients with PNALT because of the cost and side effects of therapy,1, 2 and of the low response rates to IFN monotherapy (<10-15%) with a risk of ALT flares in up to 50% of patients during treatment.9
The introduction of the combination of weekly subcutaneous pegylated-IFN (PEG-IFN) plus daily oral ribavirin (RBV) has led to response rates >50%, with a favourable risk-benefit ratio even in patients with slowly progressing disease.1, 2, 9 The first trial of PEG-IFN plus RBV found a sustained virological response (SVR) in 40% of HCV-1 carriers with PNALT treated for 48 weeks, and in 72% of HCV-2 and HCV-3 carriers treated for 24 weeks.13 The efficacy of antiviral treatment with PEG-IFN plus RBV was subsequently confirmed in clinical practice.14, 15
However, in everyday practice, management of carriers with PNALT may paradoxically be more difficult than that of patients with abnormal ALT levels. Indeed, it is not always easy to ascertain in the single case whether a carrier should be considered a healthy subject or a true patient. Several topics remain unresolved to date: Should these ‘seemingly healthy’ people undergo routine liver biopsy? Is antiviral treatment justified in ‘asymptomatic’ subjects with persistently normal liver biochemistry? Is long-term follow-up needed in this setting, and how long should it last?2
Liver biopsy provides helpful information on liver damage, as it may reveal the presence of advanced fibrosis or cirrhosis. Without a biopsy, it is impossible to clinically distinguish true ‘healthy’ carriers from those with CH.4 On the other hand, it is difficult to recommend routine biopsy for all HCV carriers with PNALT.4 The decision to perform a biopsy should be based on whether treatment is being considered, taking into account the estimated duration of infection, probability of disease progression, willingness to undergo a biopsy, motivation to be treated, and availability of non-invasive tools to assess liver fibrosis.12 The recently developed transient elastography has improved our ability to non-invasively define the extent of fibrosis in HCV-infected persons.5
Careful evaluation of parameters associated with disease progression is mandatory to assess the actual need for antiviral treatment.4 Indeed, it is not feasible to offer antiviral therapy to all HCV carriers, as the costs would be exceedingly high given the large number of HCV patients with PNALT. Data from the literature indicate that the main factors of progression are male gender, advanced age, severe fibrosis, ALT flares, and steatosis.1-2
Cost/benefit might be particularly favourable in:
Young patients with a high likelihood of SVR (e.g. females, low viral load, non-1 HCV genotype).
Middle-aged patients with ‘significant’ liver disease and/or co-factors for progression of liver damage, thus at risk of developing more severe liver disease.12
The age issue plays a critical role in decision making. Younger patients have a higher chance of achieving SVR and tolerate therapy better; they have a longer life expectancy, are often well motivated, and usually have minimal disease and fewer contraindications. Thus, in this group the decision to treat should be based more on expected response and motivation than on the severity of liver disease.
By contrast, older patients respond less well to therapy, are more likely to have significant liver disease and/or co-factors, may experience more side effects and may be less motivated. Thus, in this group the decision to treat should be based on the severity of liver disease and on the likelihood of SVR.
A recent Italian Expert Opinion Meeting suggested the following recommendations:12
HCV carriers with PNALT may receive antiviral treatment with PEG-IFN plus RBV using the same algorithms recommended for HCV patients with abnormal ALT.
Decision making should rely on individual characteristics such as HCV genotype, histology, age, potential disease progression, probability of eradication, patient motivation, desire for pregnancy, co-morbidities, co-factors, etc.
Treatment might be offered without liver biopsy in patients with a high likelihood of SVR (e.g. age <50 years + non-1 HCV genotype + low viral load), in the absence of co-factors of poor responsiveness.
In patients aged 50–65 years, and in those with a reduced likelihood of achieving a response, biopsy may be used to evaluate the need for therapy, with treatment being recommended only for patients with more severe fibrosis and a higher possibility of SVR. Biopsy and therapy are not recommended in the elderly (>65-70 years).
In patients who are not candidates for antiviral treatment, follow-up may be continued, and ALT should be monitored every 4-6 months. Avoidance of alcohol and obesity should be strongly recommended.12 It is not clear whether these subjects should be routinely offered anti-HBV vaccine, given the risk of disease progression in the case of HBV infection.12 Antiviral treatment should be re-considered in the case of ALT flares, US abnormalities or platelet count decrease. Repeated measurement of serum HCV RNA to evaluate disease progression is not recommended.1,9,11,12
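As an illustration only, the age-based triage suggested in the recommendations above might be sketched as follows; the function name, its inputs and the handling of borderline ages (66-70 years) are our assumptions, not part of the expert recommendations, and such a rule would in practice sit alongside the individual factors (motivation, co-morbidities, histology) listed above.

```python
# Hypothetical sketch of the HCV-PNALT treatment triage described in the
# text. Not a validated clinical tool; thresholds are taken from the
# recommendations, everything else is illustrative.

def pnalt_triage(age, genotype, low_viral_load, poor_response_cofactors):
    """Return a suggested management pathway for an HCV-PNALT carrier."""
    if age < 50 and genotype != 1 and low_viral_load and not poor_response_cofactors:
        return "treat without biopsy"   # high likelihood of SVR
    if 50 <= age <= 65:
        return "biopsy to assess need"  # treat only if severe fibrosis and good SVR odds
    if age > 65:                        # text gives the elderly cut-off as >65-70 years
        return "no biopsy or therapy; monitor ALT every 4-6 months"
    return "individualised decision"    # weigh genotype, histology, motivation, etc.
```

For example, a 42-year-old genotype 2 carrier with low viral load and no co-factors of poor responsiveness would fall into the treat-without-biopsy pathway, while a 58-year-old would be directed to biopsy first.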
Hyperthyroidism is one of the most frequently encountered conditions in clinical endocrinology.1 The available treatment modalities are antithyroid drugs, surgery and radioiodine (RAI), and although each of these is highly successful in controlling or curing hyperthyroidism, none leads to permanent euthyroidism on a consistent basis.2 Although over the last three decades RAI therapy has replaced surgery as the leading form of definitive treatment,3-5 there is no universally accepted dose or regimen for its use. Previous attempts to individualise the dose of RAI to reduce the rate of post-RAI hyper- or hypothyroidism have been unsuccessful.6,7 Fixed-dose RAI administration has therefore become the most commonly used regimen, although the actual dose of RAI used varies considerably, ranging between 185MBq and 600MBq.8,9 For the last two decades we have used a fixed RAI dose of 550MBq for all patients. Others have used this regimen with a high success rate,10 and a prospective head-to-head comparison with the calculated-dose method found the fixed-dose regimen to be superior for curing Graves’ hyperthyroidism.11
Several studies have attempted to predict outcome following RAI therapy by correlating cure rate with various pre-treatment factors, including age, gender, aetiology of hyperthyroidism, goitre size, use of antithyroid drugs, free thyroxine levels at diagnosis and thyroid antibody status, with conflicting results. These studies used various forms of calculated or low fixed-dose RAI therapy, but none used a high fixed dose of 550MBq. In this study we evaluated the overall success rate of high fixed-dose RAI therapy and attempted to identify simple clinical predictors of failure to respond to the initial RAI dose.
Patients and Methods
The study is a retrospective analysis of 584 consecutive patients referred to the Shropshire endocrinology service (Princess Royal Hospital and Royal Shrewsbury Hospital) over a 14-year period for the treatment of hyperthyroidism. These patients received RAI therapy at Royal Shrewsbury Hospital, which is the only centre providing facilities for RAI administration in the county of Shropshire and also draws referrals from adjoining trusts in Powys, North Wales. Information for this study was obtained from the thyroid database, which has been maintained on all patients who have received RAI at the above hospitals since 1985.
RAI was administered both as a primary (53%) and as a secondary (47%) treatment. The majority of patients with moderate to severe hyperthyroidism were rendered euthyroid by antithyroid drugs (ATD). Ninety percent (518/584) of patients were pre-treated to euthyroidism with antithyroid drugs (carbimazole in 95% and propylthiouracil in 5%) before RAI therapy. Carbimazole was withdrawn one week and propylthiouracil 4 weeks prior to RAI therapy. A standard RAI dose of 550MBq was administered to all patients without a prior uptake study. Thyroid function was measured at 6 weeks and at 3, 6 and 12 months following RAI therapy. ATDs were not recommenced routinely following RAI therapy and were reserved for patients who were persistently and significantly hyperthyroid following RAI administration. Patients who developed clinical and biochemical hypothyroidism after the initial 6-8 weeks were commenced on thyroxine. Patients with a high free thyroxine (FT4) level and a suppressed thyroid stimulating hormone (TSH) level, and those on antithyroid medication, were defined as hyperthyroid; those with a low FT4 or on thyroxine as hypothyroid; and those with a normal FT4 and a normal or low TSH as euthyroid. If a patient remained hyperthyroid at the end of one year, another RAI dose of 550MBq was administered. The patient was considered to have been “cured” if euthyroidism or hypothyroidism was achieved during the first year following RAI therapy, and “not cured” if the patient remained persistently hyperthyroid at the end of this period.
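The status definitions above amount to a simple classification rule, which can be sketched as follows; the FT4 and TSH reference limits used here are assumed for illustration only, as the paper does not report its laboratory ranges.

```python
# Minimal sketch of the study's thyroid-status definitions.
# Reference limits are assumptions for illustration (not from the paper).
FT4_LOW, FT4_HIGH = 9.0, 25.0   # assumed FT4 reference range, pmol/L
TSH_LOW = 0.3                   # assumed lower limit of normal TSH, mU/L

def thyroid_status(ft4, tsh, on_antithyroid=False, on_thyroxine=False):
    # Hyperthyroid: high FT4 with suppressed TSH, or on antithyroid drugs
    if on_antithyroid or (ft4 > FT4_HIGH and tsh < TSH_LOW):
        return "hyperthyroid"
    # Hypothyroid: low FT4, or on thyroxine replacement
    if on_thyroxine or ft4 < FT4_LOW:
        return "hypothyroid"
    # Euthyroid: normal FT4 with normal or low TSH
    if FT4_LOW <= ft4 <= FT4_HIGH:
        return "euthyroid"
    return "indeterminate"

def cured(status_at_12_months):
    # "Cured" = euthyroid or hypothyroid within the first year after RAI
    return status_at_12_months in ("euthyroid", "hypothyroid")
```

Note that, under these definitions, a patient restarted on antithyroid drugs counts as hyperthyroid regardless of the biochemical values at that visit.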
Information recorded on the database included age, gender, aetiology, indication (primary or secondary), dose of RAI, number of RAI doses, name and duration of antithyroid drugs used, if any, and FT4 and TSH levels at diagnosis, at the time of RAI therapy and at 6 weeks and 3, 6 and 12 months after RAI therapy. Diagnosis of Graves’ disease was based on the presence of Graves’ ophthalmopathy, a combination of a diffuse goitre and a significant titre of thyroid peroxidase antibodies, or diffuse uptake on radionuclide scan. Toxic nodular disease was diagnosed on the grounds of a nodular goitre and a focal increase in radionuclide uptake. Patients who could not be classified into either group on clinical grounds, and in whom a radionuclide scan could not be performed for a variety of reasons, were categorised as “unclassified” on aetiological grounds.
Continuous variables were compared using t-tests and the association of categorical variables was assessed using chi-squared tests. The effect of all variables on outcome (cure of hyperthyroidism) was assessed by logistic regression analysis, and a step-wise routine was applied to choose the best set of predictors. All analyses were carried out using NCSS2000.
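As a rough illustration of the univariate comparison described above (the study itself used the NCSS2000 package), a Welch t-statistic for FT4 in “not cured” versus “cured” patients can be computed as below; the sample values are synthetic and purely illustrative, not study data.

```python
import math
import statistics

# Welch's t-statistic for two independent samples with unequal variances.
def welch_t(a, b):
    ma, mb = statistics.mean(a), statistics.mean(b)
    va, vb = statistics.variance(a), statistics.variance(b)  # sample variances
    se = math.sqrt(va / len(a) + vb / len(b))                # standard error of the difference
    return (ma - mb) / se

# Synthetic FT4 values (pmol/L), NOT the study data, chosen to mimic the
# direction of the reported difference (higher FT4 in treatment failures).
not_cured = [57.3, 62.1, 48.9, 71.0, 55.4, 60.2]
cured     = [44.6, 39.8, 50.1, 42.3, 47.7, 41.0]

t = welch_t(not_cured, cured)
print(f"t = {t:.2f}")  # a large positive t suggests higher FT4 in failures
```

The t-statistic would then be compared against the t distribution (with Welch-Satterthwaite degrees of freedom) to obtain a p-value, as any standard statistics package does internally.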
Data on 584 patients were included, with a mean age of 56 years (range 20-90) and a female preponderance (82%). The aetiology of hyperthyroidism was assessed by the above-mentioned criteria. In 110 (15%) patients a precise aetiological diagnosis could not be made. 344/474 (72%) patients had hyperthyroidism secondary to Graves’ disease and 134/474 (28%) had toxic nodular disease. 518 patients received pre-RAI antithyroid medications. The mean free thyroxine level at the time of diagnosis was 45.4pmol/L in the 259 patients in whom this information was available. Data on thyroid status at 3, 6 and 12 months post-radioiodine were available in 97, 94 and 100% of patients respectively (see Table 1).
Table 1: Thyroid status at 3, 6 and 12 months
FT4 values were entered onto the database more recently, and this result was available in 259 patients. The group of patients in whom FT4 data were available was comparable to the group in whom this information was not available in all respects apart from age (mean age (SD) 54 (±15) vs 58 (±14) years respectively, p<0.02). Similarly, the group of patients in whom the aetiology could not be ascertained did not differ from the group in whom the aetiology could be identified in any respect apart from age (mean age (SD) 60 (±13) vs 55 (±15) years respectively).
Table 2 – Forward stepwise (Wald) logistic regression analysis to identify factors independently associated with failure to respond to the first dose of RAI

Variable                            | Adjusted r² | OR (95% CI)
Free T4 at diagnosis                | 0.084       | 1.04 (1.01-1.07)
Free T4 > 45 pmol/l at diagnosis*   | 0.056       | 3.43 (1.17-10.04)
Pre-RAI use of anti-thyroid drugs   |             |

* Regression analysis carried out with free T4 as a continuous variable and separately as a categorical variable at a cut-off of 45pmol/l
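To put the Table 2 odds ratios in context (this arithmetic is ours, not from the paper): an OR of 1.04 per pmol/L of FT4 compounds multiplicatively, so the odds of first-dose failure for a patient with FT4 of 57 pmol/L relative to one with 45 pmol/L (roughly the reported means for the not-cured and cured groups) would be:

```python
# OR of 1.04 per pmol/L, compounded over a 12 pmol/L difference in FT4
# (57 vs 45 pmol/L, approximating the reported group means).
relative_odds = 1.04 ** (57 - 45)
print(f"{relative_odds:.2f}")  # prints 1.60
```

That is, roughly 60% higher odds of failure, consistent in direction with the larger OR of 3.43 seen when FT4 is dichotomised at 45 pmol/L.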
One year following RAI treatment, 543 (93%) patients were either euthyroid (162; 28%) or hypothyroid (383; 65%) and were considered “cured”; 39 (7%) patients remained hyperthyroid and required further doses of RAI, with 34 (6%) patients requiring two doses and 5 (1%) patients three doses. At 3 months, 484 out of 571 (85%) patients, and at 6 months, 490 out of 549 (89%) patients were “cured” (Table 1). On univariate analysis, no correlation could be established between failure to respond to the first dose of RAI and age, gender, aetiology or use of antithyroid medication (p = ns for all), although the rate of hypothyroidism at the end of one year was significantly higher in patients with Graves’ disease than in those with toxic nodular disease (77.1% vs. 50.3%, p<0.01). These results were not affected by limiting the analyses to any of the following groups: only those patients in whom the aetiological diagnosis could be made (n=478), only those in whom an FT4 value was available (n=259), or only those in whom both an FT4 value was available and the aetiology could be ascertained (n=209). On univariate analysis, FT4 at diagnosis was associated with the outcome both as a continuous variable (p<0.05) and as a categorical variable with the cut-off set at the mean FT4 value of 45pmol/L (p=0.01), and high values were associated with failure to respond to the first dose of RAI (mean ± SD, 57.28±20.1 vs 44.58±16.1 pmol/L, p<0.05). On multivariate analysis with all variables, FT4 was independently associated with the outcome, again both as a continuous variable (p=0.01) and as a categorical variable (p=0.02). On applying the step-wise selection routine, only FT4 was chosen as a predictor when the criterion for selection was set at p=0.05, and a value of over 45pmol/L predicted failure to respond to the first dose of RAI.
The use of standard fixed-dose RAI therapy is gaining popularity, and several studies have now shown that formal estimation of the required dose based on thyroid size and iodine kinetics does not lead to a higher cure rate 6,7,10,11 or a lower hypothyroidism rate 7. For several years we have used a 550MBq dose for all patients with hyperthyroidism. The overall success rate with this regimen was 93%, and only 7% of patients required a repeat RAI dose. These figures are comparable to those from most other centres that have used a similar dose of RAI 10. In addition to achieving a high cure rate, hyperthyroidism was controlled rapidly, with 85% of patients becoming either euthyroid or hypothyroid within 3 months of treatment. The early onset of hypothyroidism (>70% at 12 months) facilitated institution of thyroxine replacement therapy during the first year, during which patients were being closely followed.
The use of a relatively higher dose of RAI imposes more stringent restrictions on the normal life of patients, and these have to be followed for a longer period than with a lower dose. The majority of patients accept these restrictions given the prospect of a cure of hyperthyroidism. However, even at this dose, 7% of patients required repeat dosing, which in turn led to another restrictive period for these patients. In view of this, it is useful to be able to predict failure of the first dose in an individual patient. This would enable us to warn such patients about the higher possibility of requiring repeat dosing and a further period of post-RAI restrictions, and to target them for closer follow-up. To make this prediction, we correlated simple clinical pre-treatment variables with the need for repeat dosing. We found no statistically significant correlation between age, gender, aetiology or the use of antithyroid medication prior to RAI and the outcome following RAI therapy, although a high free thyroxine level at diagnosis predicted failure of the first dose to achieve a cure of hyperthyroidism. There are several conflicting reports in the literature on the correlation between these factors and the response to RAI therapy. Most studies have failed to show a significant association between patient age and outcome, irrespective of whether age was used as a continuous or a categorised variable 12-15, although in one study using a standard 150 gray dose, age >50 was associated with a higher failure rate 16. In one study, male gender was associated with a lower cure rate following a single dose of RAI in patients with Graves’ disease 12, although others have failed to confirm this association 13,14.
Use of antithyroid drugs prior to RAI has been shown to independently reduce the success rate of RAI 17,18, while other studies have shown such an association with the use of propylthiouracil but not with carbimazole 19,20. The literature on the association between the aetiology of hyperthyroidism and outcome is even more conflicting. Patients with toxic nodular disease have been considered more radio-resistant than patients with Graves’ disease 21, although opposite results have also been reported 22. In other studies, no correlation could be established on multivariate analysis between aetiology and outcome following RAI 14,18. Our study is the only one to analyse the influence of these factors on outcome following a standard 550MBq RAI dose; the above studies that attempted to identify clinical predictors of outcome used either various forms of calculated-dose regimen or a lower fixed-dose RAI regimen. We believe this explains the inconsistencies in the results: when a 550MBq RAI dose is used, only the FT4 value at diagnosis predicts failure of RAI therapy to achieve a cure. This dose of RAI appears to override the variations in response induced by the remaining pre-treatment variables studied.
Studies using smaller or calculated doses of RAI have shown outcome to be inversely associated with thyroid size 14,16, although this could not be ascertained in our study due to the lack of consistent documentation of goitre size in the clinical notes. In addition, there are several possible confounding factors. Firstly, the overall cure rate could have been influenced by the long period over which patients were included (15 years) and the resulting changes in the criteria and threshold for the use of RAI. However, if we divide the figures into three 5-year periods, the findings remain consistent within each period. Secondly, in over 50% of our patients RAI was administered as a primary measure, and it could be argued that a larger number of patients with milder hyperthyroidism may have been included in our cohort compared with centres where RAI is mainly reserved for patients who fail to respond to ATDs. However, there was no significant difference in cure rate between patients who received RAI as a primary measure and those in whom it was administered as a secondary treatment (94% vs 93%). Thirdly, in 15% of patients the aetiology could not be ascertained using our well-defined criteria, mainly because of the practical difficulty of performing radionuclide scans in some patients in whom the diagnosis could not be made clinically. We do not feel that our results on the association between aetiology and cure rate were affected, as the patients with undefined aetiology were comparable to the remaining patients in all respects apart from age and had similar outcomes. Lastly, information on the FT4 value at diagnosis was available in only 259 patients. To exclude selection bias, this group was compared with the group of patients in whom this information was not available. Again, the only difference between the two groups was the age distribution.
In both instances this difference was small (though statistically significant), and we do not feel it affected the outcome, especially as age does not appear to influence the outcome following RAI therapy. We could not assess the impact of post-RAI use of antithyroid drugs, as these were not routinely restarted following RAI therapy at our centre.
In conclusion, high fixed-dose RAI therapy is a very effective treatment for hyperthyroidism, with a high success rate. Failure to respond to this dose cannot be predicted by most pre-treatment variables, apart from the severity of hyperthyroidism as judged by the FT4 value at diagnosis. Patients who present with severe hyperthyroidism should be warned of the higher possibility of requiring further doses of radioiodine, even when treated with a dose of 550MBq.