Fluvoxamine for COVID-19 ICU patients?

Vladimir Trkulja
Department of Pharmacology, Zagreb University School of Medicine, Šalata 11, Zagreb, Croatia
vladimir.email@example.com

Number of words: 800
Number of figures/tables: 1

To the Editor,

I read with interest a recently BJCP-accepted manuscript on the use of fluvoxamine in COVID-19 patients who required admission to an intensive care unit (ICU) [1]. It was instructive to read about the pre-existing clinical experience and the possible mechanisms of the presumed benefits of fluvoxamine in COVID-19. However, attention needs to be drawn to the suggested effect of fluvoxamine, quantified as a 40% reduction in the instantaneous risk of death. The authors report [1] on a cohort (n=51) of patients who, upon ICU admission, were treated with oral fluvoxamine added to the standard of care (SoC) (3×100 mg/day for 15 days, then 2×50 mg/day for 7 days) and who were compared to a cohort (n=51) of SoC-only patients. The two cohorts were said to be matched [1]. Based on the reported data [1], it appears that the patients were matched exactly with respect to sex and COVID-19 vaccination status and, seemingly, on a rather narrow age caliper, but the matching method was not reported [1]; nor was a measure of matching adequacy reported – the standardized difference (d), a preferred method of balance assessment (adequate if d < 0.1) because it is independent of sample size [2]. Based on the reported data [1], for example, the fluvoxamine–SoC d for body mass index was −0.30 (−0.31 in women and −0.29 in men); furthermore, d = −0.122 for history of diabetes, d = −0.350 for history of treated hypertension and d = −0.11 for on-admission APACHE score – all suggesting a considerable imbalance between the two cohorts (lower values in the fluvoxamine cohort). The authors provide Kaplan-Meier curves of time to death (or ICU discharge), but without the numbers at risk [1].
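The standardized differences quoted above follow directly from published cohort summary statistics via the pooled-SD formulas. A minimal sketch (the summary values below are hypothetical placeholders for illustration, not the published data):

```python
import math

def std_diff_continuous(m1, s1, m0, s0):
    """Standardized difference for a continuous covariate:
    difference in means (treated - control) divided by the pooled SD."""
    pooled_sd = math.sqrt((s1**2 + s0**2) / 2)
    return (m1 - m0) / pooled_sd

def std_diff_binary(p1, p0):
    """Standardized difference for a binary covariate,
    given prevalences p1 (treated) and p0 (control)."""
    pooled_var = (p1 * (1 - p1) + p0 * (1 - p0)) / 2
    return (p1 - p0) / math.sqrt(pooled_var)

# Hypothetical BMI summaries (illustrative only); a negative d means
# lower values in the treated cohort, and |d| >= 0.1 suggests imbalance.
d_bmi = std_diff_continuous(m1=29.0, s1=4.0, m0=30.2, s0=4.0)
print(round(d_bmi, 2))  # prints -0.3
```

Because d is scaled by the pooled SD rather than a standard error, it does not shrink as the sample grows, which is why it is preferred over p-values for balance assessment.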
Still, data could be read from the graphs and the curves reconstructed (Figure 1A): (i) the first marked difference between the treated and controls occurs during the first 7 days of observation – 3 patients died and 3 were censored in the former cohort, and 11 died and 4 were censored in the latter (Figure 1A). This difference in deaths (3 vs. 11) did not change over the entire later period, since the overall difference in the number of deaths was 9 (30/51 in treated vs. 39/51 in controls). This would indicate a very rapid-onset (and subsequently "lost") effect of fluvoxamine, which does not seem pharmacologically plausible. The assumed fluvoxamine mechanisms [1] are not of the immediate-onset type; with 3×100 mg/day dosing, the elimination half-life is likely to extend well beyond 30 hours, hence steady state would be achieved only after 7-10 days [3]. Combined with the baseline imbalance between the groups, this indicates that the initial separation of the two curves – more or less preserved throughout the entire subsequent period – was likely not attributable to fluvoxamine; (ii) after day 21, and particularly after day 28, the numbers at risk were very low, and after day 35 there were no further events (Figure 1A), hence accounting for the entire curve is likely misleading [4]; (iii) although the curves do not cross (Figure 1A), they indicate a possibility that the hazard ratio varied over time. The hazard ratio generated by a Cox proportional hazards model (as done by the authors) is an average of values that may change over time [5]; it is also inherently prone to selection bias and, even in the absence of confounding, its interpretation is not straightforward [5]. This holds for randomized and, particularly, for non-randomized settings [5].
The reconstructed data depicted in Figure 1A were used to fit a complementary log-log model for a continuous-time process, taking into account the first 35 days (no events after that point): the method treats time as a continuous but more "coarsely" measured variable, in intervals of identical length (in this case 7-day intervals, i.e., weeks); based on the assumption of a constant hazard within each interval, the method provides period-specific (weeks 1-5) hazard ratios [6], which is likely a preferable option [5]. Figure 1B depicts the estimated probabilities of death and the HRs: it is only during week 1 that the hazard appeared lower in the treated – a period during which, as elaborated, fluvoxamine most likely had no effect. Finally, the authors fitted a multivariable Cox model [1] to substantiate the fluvoxamine effect. With a total of 15 independent variables in a study of 102 subjects, the model was likely overfitted and susceptible to bias arising from over- (unnecessary) adjustment [7]. More importantly, it included adjustment for renal replacement therapy (RRT), which was actually one of the outcomes. The inadequacy of adjusting for post-exposure outcomes as if they were baseline covariates has been extensively elaborated [8] and almost inevitably results in considerable bias, regardless of whether the respective variable is actually a mediator or a collider [8]. Such adjustments require the implementation of marginal structural models or one of the g-estimation methods [9]. Overall, the reported difference between the two cohorts of patients is more likely a bias arising from design and analysis than evidence supporting a causal effect of fluvoxamine.

References

1. Čalušić M, Marčec R, Lukša L, et al. Safety and efficacy of fluvoxamine in COVID-19 ICU patients: an open label, prospective cohort trial with matched controls. Br J Clin Pharmacol. 2021; doi: 10.1111/bcp.15126.
2. Stuart EA. Matching methods for causal inference: a review and a look forward. Stat Sci. 2010;25(1):1-21.
3. Hiemke C, Härtter S. Pharmacokinetics of selective serotonin reuptake inhibitors. Pharmacol Ther. 2000;85(1):11-28.
4. Machin D, Cheung YB, Parmar MKB, eds. Survival Analysis: A Practical Approach. 2nd ed. Chichester, West Sussex: John Wiley & Sons Ltd; 2006. p. 38.
5. Hernán MA. The hazards of hazard ratios. Epidemiology. 2010;21(1):13-15.
6. Prentice RL, Gloeckler LA. Regression analysis of grouped survival data with application to breast cancer data. Biometrics. 1978;34(1):57-67.
7. Schisterman EF, Cole SR, Platt RW. Overadjustment bias and unnecessary adjustment in epidemiologic studies. Epidemiology. 2009;20(4):488-495.
8. Greenland S. Quantifying biases in causal models: classical confounding vs collider-stratification bias. Epidemiology. 2003;14(4):300-306.
9. Hernán MA, Robins JM, eds. Causal Inference: What If. 1st ed. Boca Raton, FL: CRC Press; 2019.

Figure 1. Summary of the re-analysis of survival data published in ref. 1. A. Reconstructed curves of Kaplan-Meier product-limit estimates. Data [1] were read using digitizing software, and were re-analyzed and the curves drawn using JMP 13 (SAS Institute Inc., Cary, NC). Upward-oriented ticks indicate censorings; downward-oriented ticks indicate failures. ICU – intensive care unit. B. Estimated probabilities of death during weeks 1 to 5 by treatment (Fluvox – fluvoxamine) and period-specific hazard ratios (HR) with confidence intervals. A complementary log-log model was fitted to the reconstructed data using SAS 9.4 for Windows (SAS Institute Inc., Cary, NC).
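The grouped-time complementary log-log approach described in the letter rests on a simple identity: with a constant hazard within an interval, −log(1 − p) equals that interval's cumulative hazard, so the treated-control difference on the cloglog scale is the log hazard ratio for that period. A minimal sketch of this identity (the week-1 probabilities below are hypothetical, not the reconstructed values):

```python
import math

def cloglog(p):
    """Complementary log-log transform: log(-log(1 - p))."""
    return math.log(-math.log(1.0 - p))

def period_hr(p_treated, p_control):
    """Period-specific hazard ratio from grouped-time death probabilities.
    With a constant hazard within the interval, -log(1 - p) is the
    interval cumulative hazard, so the cloglog difference is log(HR)."""
    return math.exp(cloglog(p_treated) - cloglog(p_control))

# Hypothetical week-1 interval death probabilities (illustrative only):
hr_week1 = period_hr(p_treated=0.06, p_control=0.22)
print(round(hr_week1, 2))  # prints 0.25
```

In a full analysis these period-specific HRs would come from a binomial regression with a cloglog link on the interval-level data (with week-by-treatment terms), which is what the models cited in the letter implement.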
Introduction: Transfusion-related acute lung injury (TRALI) remains the leading cause of transfusion-related fatalities in the critical care setting and is associated with inflammation and oxidative stress. Recent research has suggested the potential efficacy of high-dose intravenous ascorbic acid in critically ill patients. Objective: The aim of this trial was to investigate the effect of high-dose intravenous ascorbic acid (VC) as a targeted therapy for TRALI in terms of serum pro-inflammatory (interleukin-8, interleukin-1β, C-reactive protein), anti-inflammatory (interleukin-10) and oxidative stress (superoxide dismutase, malondialdehyde) markers, and plasma VC levels. Secondary outcomes were oxygenation (PaO2/FiO2 ratio), vasopressor use, duration of mechanical ventilation, ICU length of stay, and 7-day and 28-day mortality. Methods: Eighty critically ill patients with TRALI were randomized to receive 2.5 g of intravenous vitamin C every 6 hours for 96 hours (ASTRALI group) or placebo. Patients were followed up to measure the outcomes at baseline (T0) and at the end of treatment (T96). Results: Compared with the control group at T96, the ASTRALI group showed significantly higher interleukin-10 levels (31.6 ± 25.8 vs. 17.7 ± 12.0 pg/mL, p<0.0001) and superoxide dismutase activities (12876 ± 4627 vs. 5895 ± 6632 U/L, p<0.0001), and lower C-reactive protein (76 ± 50 vs. 89 ± 56 mg/L, p=0.033), interleukin-8 (11.8 ± 7.3 vs. 35.5 ± 19.8 pg/mL, p<0.0001) and malondialdehyde (0.197 ± 0.034 vs. 0.234 ± 0.074 µmol/L, p=0.002) levels. Conclusion: High-dose ascorbic acid was associated with significantly reduced oxidative stress, reduced pro-inflammatory markers except IL-1β, an elevated anti-inflammatory marker, and elevated plasma VC levels.
Paracetamol overdose is common in developed countries, but fewer than 10% of cases involve large ingestions exceeding 30 g or 500 mg/kg. High-dose acetylcysteine (NAC) has been proposed for patients taking large paracetamol overdoses, based on reports of hepatotoxicity despite early initiation of NAC treatment with the commonly used 300 mg/kg intravenous acetylcysteine regimen. Evidence from cohorts of patients treated with the standard NAC regimen after large paracetamol overdoses shows that it is effective in most patients. Small studies in patients whose paracetamol concentrations are above the 300 mg/L nomogram line show that modifying the standard NAC regimen to provide a total of 400-500 mg/kg NAC over 21-22 h may reduce the risk of hepatotoxicity (peak ALT >1000 IU/L), but the impact of this approach on the development of hepatic failure, liver transplantation and mortality is presently unknown. Better risk stratification of patients taking paracetamol overdoses may allow higher-dose NAC and adjunctive treatments, such as CYP2E1 inhibition and extracorporeal removal of paracetamol, to be targeted to those patients at the highest risk of hepatotoxicity after a large paracetamol overdose.
The propellants used in metered-dose inhalers (MDIs) are powerful greenhouse gases, which account for approximately 13% of the NHS's carbon footprint related to the delivery of care. Most MDI use is in salbutamol relievers in patients with poorly controlled disease. There has been a broad switch towards MDIs in the UK over the last 20 years to reduce financial costs, such that two-thirds of asthma patients in the UK are on treatment dominated by salbutamol MDI. The UK therefore lags behind in this regard, with greater reliance on salbutamol MDIs and correspondingly greater greenhouse gas emissions – roughly treble those of our European neighbours. Strategies that replace overuse of reliever MDIs with regimes emphasising inhaled corticosteroids have the potential to improve asthma control alongside significant reductions in greenhouse gas emissions. Real-world evidence shows that once-daily long-acting combination dry-powder inhalers can improve compliance and asthma control and reduce the carbon footprint of care. Similarly, maintenance and reliever therapy (MART), which combines reliever and inhaled steroid in one device (usually a dry-powder inhaler), can simplify therapy, improve asthma control and reduce greenhouse gas emissions. Both treatment strategies are popular with patients, most of whom are willing to change treatment to reduce their carbon footprint. By focussing on patients who currently use high amounts of salbutamol MDI, and prioritising inhaled steroids via dry-powder inhalers, there are golden opportunities to make asthma care more effective, safer and greener.
It is unfortunately true that clinicians lack the evidence needed to use medications properly in large sections of the population, and we do not have optimal drugs for many neglected tropical diseases (NTDs). NTDs often disproportionately affect neglected populations such as children and pregnant women. As reliable access to safe, effective preventives and treatments can break the cycle of poverty, illness, and ensuing debility that further perpetuates poverty, it is of paramount importance to investigate and develop new medicines for neglected populations suffering from NTDs. Furthermore, there is a need not only to develop and evaluate novel therapies, but also to ensure that these are affordable, available, and adapted to the communities who need them. With this editorial, the British Pharmacological Society hereby launches a call for high-quality articles focusing on NTDs in special populations, to facilitate the reversal of this dual neglect.
Aim. Cancer patients with reduced dihydropyrimidine dehydrogenase (DPD) activity are at increased risk of severe fluoropyrimidine (FP)-related adverse events (AEs). Guidelines recommend FP dosing adjusted to genotype-predicted DPD activity based on four DPYD variants (rs3918290, rs55886062, rs67376798, rs56038477). We evaluated the relationship between three further DPYD polymorphisms [c.496A>G (rs2297595), *6 c.2194G>A (rs1801160) and *9A c.85T>C (rs1801265)] and the risk of severe AEs. Methods. Consecutive FP-treated adult patients were genotyped for the "standard" and tested DPYD variants, and for UGT1A1*28 if irinotecan was included, and were monitored for the occurrence of grade ≥3 (National Cancer Institute Common Terminology Criteria) vs. grade 0-2 AEs. For each of the tested polymorphisms, variant allele carriers were matched to respective wild-type controls (optimal full matching combined with exact matching, with respect to age, sex, type of cancer, type of FP, DPYD activity score, use of irinotecan/UGT1A1, adjuvant therapy, radiotherapy, biological therapy and genotype on the remaining three tested polymorphisms). Results. Of the 503 included patients (82.3% colorectal cancer), 283 (56.3%) developed grade ≥3 AEs, mostly diarrhea and neutropenia. The odds of grade ≥3 AEs were higher in c.496A>G variant carriers (n=127) than in controls (n=376) [OR=5.20 (95% CI 1.88-14.3), Bayesian OR=5.24 (95% CrI 3.06-9.12)]. The odds tended to be higher in *6 c.2194G>A variant carriers (n=58) than in controls (n=432) [OR=1.88 (0.95-3.73), Bayesian OR=1.90 (1.03-3.56)]. *9A c.85T>C did not appear to be associated with grade ≥3 AEs (206 variant carriers vs. 284 controls). Conclusion. The DPYD c.496A>G variant might need to be considered for inclusion in the DPYD genotyping panel.
Climate change continues to pose a dangerous threat to human health. However, not only is health impacted by this crisis; healthcare itself adds to the problem through significant contributions to greenhouse gas emissions. In the UK, the National Health Service (NHS) is responsible for an estimated 4% of the overall national carbon footprint. Medicines account for a quarter of this and, whilst they are vital to health now, through sustainable use they can also positively influence the environmental health of the future. In this review, we explore how clinical pharmacologists and other healthcare professionals can practise sustainable medicines use, or eco-pharmaco-stewardship. We will discuss current and near-future environmental practices within the NHS, which we suspect will resonate with other health systems. We will suggest approaches for championing eco-pharmaco-stewardship in drug manufacturing, clinical practice and patient use, to achieve a more sustainable healthcare system.
Aim: The risk-benefit profile of angiotensin-converting enzyme inhibitors (ACEIs) and angiotensin receptor blockers (ARBs) in coronavirus disease 2019 (Covid-19) is still a matter of debate. Given growing evidence of a protective effect of this group of commonly used antihypertensives in Covid-19, we aimed to thoroughly investigate the association between the use of the major classes of antihypertensive medications and Covid-19 outcomes in comparison with the use of ACEIs and ARBs. Methods: We conducted a population-based study in patients with pre-existing hypertension in the UK Biobank. Multivariable logistic regression analysis was performed, adjusting for a wide range of confounders. Results: The use of beta-blockers (BBs), calcium-channel blockers (CCBs) or diuretics was associated with a higher risk of Covid-19 hospitalization compared with ACEI use (adjusted OR, 1.63; 95% CI, 1.40 to 1.90) and ARB use (adjusted OR, 1.50; 95% CI, 1.27 to 1.77). The risk of 28-day mortality among Covid-19 patients was also increased among users of BBs, CCBs or diuretics when compared with ACEI users (adjusted OR, 1.64; 95% CI, 1.23 to 2.19), but not when compared with ARB users (adjusted OR, 1.18; 95% CI, 0.87 to 1.59). However, no associations were observed when the same analysis was conducted among hospitalized Covid-19 patients only. Conclusion: Our results suggest protective effects of blockade of the renin-angiotensin-aldosterone system on Covid-19 hospitalization and mortality among patients with pharmaceutically treated hypertension, which should be addressed by randomized controlled trials. If confirmed, this finding could have high clinical relevance for treating hypertension during the SARS-CoV-2 pandemic.
There is a paucity of evidence to support clinical decision-making and counseling related to medication use in pregnancy. Despite multiple efforts from legislative bodies and advocacy groups, the inclusion of pregnant women in clinical drug trials assessing efficacy and safety remains scarce. Pregnancy can be complicated by multiple comorbidities that require pharmacological intervention; these interventions primarily target the pregnant woman but sometimes also have secondary effects on the fetus. The U.S. Food and Drug Administration has issued multiple guidance documents on incorporating pregnant women in clinical trials to aid pharmaceutical companies in designing protocols that ensure safety and adherence to ethical standards. Advances in pediatric pharmacology studies provide lessons for researchers on best practices for designing clinical trials that include patients from special populations. In this review, we present the status of pregnant women in clinical trials, highlighting the ethical concerns and possible future directions.
AIM Ropeginterferon alfa-2b is a new site-specific, conjugated, 40 kDa branched polyethylene-glycol recombinant interferon (IFN). The aim of the study was to determine its safety, pharmacokinetics (PK) and pharmacodynamics (PD). METHODS Ropeginterferon alfa-2b was evaluated in a first-in-human study in 48 healthy male volunteers after a single subcutaneous injection of 24, 48, 90, 180, 225 or 270 µg of the product, or 180 µg of marketed pegylated interferon (peg-IFN) alfa-2a. Within each dosing group, 6 subjects received ropeginterferon alfa-2b and 2 subjects received peg-IFN alfa-2a. RESULTS Dose-related increases in ropeginterferon alfa-2b PK parameters (Cmax, AUC, and AUC0-t) were observed over the dose range 24 to 270 µg. The geometric mean values of these PK parameters for ropeginterferon alfa-2b were higher than those for peg-IFN alfa-2a at the 180 µg dose level, at 176%, 166% and 182%, respectively. Mean PD parameters (Emax, Tmax, and AUC0-t) for ropeginterferon alfa-2b increased with dose for both biomarkers, neopterin and 2′,5′-OAS. Ropeginterferon alfa-2b had PD profiles similar to those of peg-IFN alfa-2a. Treatment-related adverse events were similar between the two study drugs, but the overall incidence was numerically lower for ropeginterferon alfa-2b (83%) than for peg-IFN alfa-2a (100%) at the 180 µg dose level. CONCLUSIONS A single subcutaneous dose of ropeginterferon alfa-2b of up to 270 µg is safe and well tolerated. It displays dose-related increases in PK and PD parameters, potentially allows less frequent injection, and has a better safety profile. Ropeginterferon alfa-2b is being developed for diseases in which previous peg-IFN use has been limited by side effects.
Aspirin has known effects beyond inhibiting platelet cyclooxygenase-1 (COX-1) that have been incompletely characterized. Transcriptomics can comprehensively characterize the on- and off-target effects of medications. We used a systems pharmacogenomics approach – aspirin exposure in volunteers coupled with serial platelet function testing and purified-platelet mRNA sequencing – to test the hypothesis that aspirin's effects on the platelet transcriptome are associated with platelet function. We prospectively recruited 74 adult volunteers for a randomized crossover study of 81 vs. 325 mg/day, each for 4 weeks. Using mRNA sequencing of purified platelets collected before and after each 4-week exposure, we identified 208 aspirin-responsive genes with no evidence of dosage effects. In independent cohorts of healthy volunteers and patients with diabetes, we validated aspirin's effects on five genes: EIF2S3, CHRNB1, EPAS1, SLC9A3R2, and HLA-DRA. Functional characterization of the effects of aspirin on mRNA, as well as on platelet ribosomal RNA, demonstrated that aspirin may act as an inhibitor of protein synthesis. Database searches for small molecules that mimicked the effects of aspirin on platelet gene expression in vitro identified aspirin, but no other molecules that share aspirin's known mechanisms of action. The effects of aspirin on platelet mRNA were correlated with higher levels of platelet function both at baseline and after aspirin exposure. In summary, this work demonstrates a dose-independent effect of aspirin on the platelet transcriptome that counteracts the well-known antiplatelet effects of aspirin.
Paracetamol poisoning continues to be a worldwide problem and, despite the availability of an effective antidote, N-acetylcysteine (NAC), the optimal way to use this antidote, particularly following very large doses of paracetamol, has not been established. Recent case series have shown increased toxicity after high doses of paracetamol, even in those receiving prompt NAC therapy, particularly in patients above the 300 mg/L nomogram treatment line. Clinical trial evidence supporting shorter NAC dosing regimens now makes it possible to intensify treatment without the risk of very high rates of adverse drug reactions. New biomarkers also raise the possibility of early identification of patients at risk of liver injury who might also benefit from increased-intensity treatment. This article discusses these data and proposes a rational approach to increasing NAC dosing, which now requires clinical trial testing.
X-linked adrenoleukodystrophy (X-ALD) is an inherited, neurodegenerative rare disease that can result in devastating symptoms of blindness, gait disturbances, and spastic quadriparesis due to progressive demyelination. Typically, the disease progresses rapidly, causing death within the first decade of life. With limited treatments available, efforts to identify an effective therapy that can alter disease progression or mitigate symptoms have been undertaken for many years, particularly through drug repurposing. Repurposing has generally been guided by clinical experience and small trials. To date, none of the drug candidates has been approved for use, which may be due, in part, to the lack of pharmacokinetic/pharmacodynamic (PK/PD) information on the repurposed medications in the target patient population. Greater consideration of the disease pathophysiology, drug pharmacology, and potential drug-target interactions, specifically at the site of action, would improve drug repurposing and facilitate development. Although there is a good understanding of X-ALD pathophysiology, the absence of information on drug targets, pharmacokinetics, and pharmacodynamics hinders the repurposing of drugs for this condition. Incorporating advanced translational and clinical pharmacological approaches in preclinical studies and early-stage clinical trials will improve the success of repurposed drugs for X-ALD as well as other rare diseases.
Introduction: Free-of-charge (FoC) medicine schemes are increasingly available and allow access to investigational treatments outside clinical trials or in advance of licensing or NHS commissioning. Methods: We retrospectively reviewed FoC medicine schemes evaluated between 2013 and 2019 by a single NHS trust and a regional drug and therapeutics committee (DTC). The details of each locally reviewed FoC scheme, and of any nationally available MHRA Early Access to Medicines Scheme (MHRA EAMS) in the same period, were recorded and categorised. Results: Most FoC schemes (95%) allowed access to medicines intended to address an unmet clinical need. Over the 7 years, 90% were company-FoC schemes and 10% were locally reviewed MHRA EAMS. Phase 3 clinical trial data were available for 44% of FoC schemes; 37% had phase 2 data; and 19% were supported only by phase 1, retrospective observational, or pre-clinical data. Utilisation of company-FoC schemes increased on average by 50% per year, while MHRA EAMS showed little growth. Conclusion: Company-FoC medicine schemes are increasingly common. This may indicate a preference among pharmaceutical companies for co-ordinating schemes independently. The motivations for company-FoC schemes remain unclear, and many provide access to treatments that have yet to be evaluated in appropriately conducted clinical trials and whose efficacy and risk of harm remain uncertain. There is no standardisation of this practice and no regulatory oversight. Moreover, no standardised data collection framework is in place that could demonstrate the utility of such programmes in addressing unmet clinical need or allow generation of further evidence.
Fomepizole is a promising new treatment for preventing liver injury following paracetamol (acetaminophen) overdose. However, robust clinical trials are needed to demonstrate its effect on clinical outcomes that matter to our patients and to healthcare providers. Until such trials are performed, the toxicology community should learn the lessons of the COVID pandemic – potential novel therapeutic options may be theoretically appealing, but their effectiveness needs to be assessed in robust clinical trials before they are used in clinical practice.
Brigatinib was recently approved for the treatment of anaplastic lymphoma kinase-positive non-small cell lung cancer and is dosed according to a one-dose-fits-all paradigm. We aimed to identify a pharmacokinetically guided precision dosing strategy to improve treatment response to brigatinib through simulations using a previously published pharmacokinetic-pharmacodynamic model. The dosing strategies explored were the approved 180 mg QD; the highest tolerated dose tested in clinical trials, 240 mg QD; and two precision dosing strategies targeting the median trough concentrations following 180 mg QD and 240 mg QD, respectively. We investigated the impact of the alternative dosing regimens on progression-free survival (PFS), overall survival (OS), and the probability of developing a grade ≥2 rash or a grade ≥2 amylase increase. Median PFS and OS increased by 1.6 and 7.8 months, respectively, between the currently approved dosing strategy and precision dosing to the median trough concentration of the 240 mg strategy, with only a minor increase in the probability of developing toxicity.