A recently published piece in the journal Nature Reviews Drug Discovery outlines the challenges involved in developing new, non-opioid painkillers that target a human sodium ion channel known as Nav1.7. The great promise of Nav1.7 flows from the discovery that individuals who are congenitally insensitive to pain (‘CIP’) carry Nav1.7 deletions. This positioned Nav1.7 as a ‘genetically validated’ target for painkiller discovery efforts, one that many players large and small across the biotech and pharma sector have been pursuing for a number of years.
Cryo-EM reconstruction of an alpha-scorpion toxin in complex with a Nav channel pore. Source
However, over the years, R&D outcomes have been sobering, for reasons that can be summarized as follows:
Minimal differentiation of the Nav1.7 pore from those of other sodium ion channels.
The Nav1.7 channel – a transmembrane protein – is hard to express in the quantities required for classic drug discovery efforts. As one scientist told me, “expression of sufficient amounts (>0.1 mg on a regular basis) of well-folded NaV1.7 full-length channel for biochemical studies remains an unsolved problem”.
While safely inducing allosteric modulation of Nav1.7 might be more feasible with biologics than with small molecules, the former would be less convenient, thus restraining commercial opportunity.
Small molecule efforts also need to take classic PK/PD considerations into account. Ziconotide, a synthetic conotoxin derivative which targets N-type voltage gated calcium channels, needs to be administered intrathecally.
Datasets to date suggest that near-complete Nav1.7 blockade may be required for meaningful therapeutic effect, which poses additional challenges around PK/PD & therapeutic practicality.
Congenitally Nav1.7-null individuals feature additional abnormalities which cannot be mimicked with short-term, drug-induced Nav1.7 blockade.
With all the above caveats in mind, what could a realistic way forward with Nav1.7 look like? As the Nature Reviews article explains, not all hope is lost: synergy between Nav1.7 inhibitors and sub-therapeutic doses of opioid painkillers seems like a promising and pragmatic route to explore.
Naturally, such potential synergistic effects would still require any sponsor to develop a viable (effective and safe) Nav1.7 inhibitor in the first place. As both private sector R&D efforts and academic research continue to elucidate the structural features surrounding the Nav1.7 pore, the binding properties of naturally derived inhibitors and additional characteristics of CIP phenotypes, I am cautiously optimistic that a therapeutic breakthrough is a matter of when, not if. To get a glimpse of ongoing academic efforts, I suggest watching this video summary of recent findings by McDermott et al. published in Neuron:
While Nav1.7-targeted investment opportunities in listed equities appear sparse at the moment, I will keep an eye on this space both out of intellectual curiosity and with a view on identifying promising drug candidates down the line.
Statistical inference is the perilous but necessary exercise of drawing conclusions from (sometimes very) limited datasets, usually with a view on facilitating some sort of decision – whether to go forward with a particular line of R&D, or whether to approve a drug for a given indication.
In a fantasy world of unlimited resources and omniscient decision-makers, statistical inference would be a futile exercise. Seeing how this is not the case, many bright minds have dedicated significant efforts to developing methodologies that allow us to draw reasonable conclusions from limited datasets. Much of the scientific enterprise rests on our ability to test hypotheses on the basis of datasets via generally accepted statistical methods. The emergence of such methods via the interplay between some of classical statistics’ major contributors is summarized in this book review. Among other things, it is shown that Fisher’s own thinking around ‘statistical significance’ evolved over time, with the late Fisher emphasizing that “no scientific worker has a fixed level of significance at which from year to year, and in all circumstances, he rejects hypotheses; he rather gives his mind to each particular case in the light of his evidence and his ideas”.
Ronald Fisher would probably be dismayed to find out that the notion of ‘statistical significance’ he introduced has turned into precisely that which it was not meant to be, namely a means of over-simplification of statistical findings which keeps leading laymen and even scientists astray. In a March 2019 Nature piece, Valentin Amrhein, Sander Greenland and Blake McShane produce a cogent indictment of the uncritical use of ‘statistical significance’ as the be-all and end-all of hypothesis testing in academia and beyond. Their call for an end to the concept of ‘statistical significance’ is at once a call for the nuanced interpretation which statistical findings require by their very nature. For instance, sample size needs to be adequately large in relation to the overall population that is being studied in order to minimize the impact of underlying heterogeneity. This is often reflected in confidence intervals narrowing as sample size increases.
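The narrowing of confidence intervals with sample size follows directly from the standard error scaling with 1/√n. A minimal sketch in Python (the standard deviation of 10 is an arbitrary, purely illustrative assumption):

```python
import math

def ci_width(std_dev: float, n: int, z: float = 1.96) -> float:
    """Width of an approximate 95% confidence interval for a sample mean."""
    return 2 * z * std_dev / math.sqrt(n)

# Quadrupling the sample size halves the interval width.
for n in (25, 100, 400):
    print(f"n={n:>3}  CI width ~ {ci_width(10.0, n):.2f}")
```

Note the diminishing returns: each halving of the interval requires four times as many subjects, which is one reason adequately powered trials are so resource-intensive.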
In the life sciences, go / no go decisions are crucial to the efficient allocation of scarce resources, and at the end of the road, the regulator also needs to be presented with ‘substantial evidence’ of a drug’s safety and efficacy for the target population. However, we would be kidding ourselves if we postulated that a single statistical parameter would suffice to inform such go / no go decisions, or to qualify evidence as ‘substantial’ in the eyes of the regulator. Traditionally, the Food and Drug Administration (FDA) required at least two adequately powered, randomized studies for drug approval, but in many cases today the agency is willing to grant approval on the basis of a single such study, provided it yields statistically significant outcomes. While there are certainly instances in which it is appropriate to require only a single randomized study, or even to grant approval on the basis of ‘surrogate endpoints’, as far as non-orphan indications are concerned it would seem more clinically meaningful if a sponsor were to produce, for instance, three large studies showing meaningful effect sizes and reasonably narrow confidence intervals – even if such studies were to produce p-values of variable significance – rather than ‘lucking out’ on a single study which happens to yield p<0.05.
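The ‘lucking out’ concern can be made concrete with a quick simulation. The sketch below (the effect size, sample size and use of a simple z-test are all illustrative assumptions, not a model of any real trial) runs many identical two-arm trials of a modestly effective drug; only a minority of them reach p<0.05, so a single ‘significant’ study may owe as much to sampling luck as to the drug:

```python
import math
import random

random.seed(0)

def trial_p_value(n_per_arm: int = 50, true_effect: float = 0.3) -> float:
    """Simulate one two-arm trial; return a two-sided z-test p-value
    (outcome SD fixed at 1 and treated as known, for simplicity)."""
    treated = [random.gauss(true_effect, 1.0) for _ in range(n_per_arm)]
    control = [random.gauss(0.0, 1.0) for _ in range(n_per_arm)]
    diff = sum(treated) / n_per_arm - sum(control) / n_per_arm
    z = diff / math.sqrt(2.0 / n_per_arm)
    # Two-sided p-value from the standard normal CDF.
    return 2.0 * (1.0 - 0.5 * (1.0 + math.erf(abs(z) / math.sqrt(2.0))))

p_values = [trial_p_value() for _ in range(1000)]
hit_rate = sum(p < 0.05 for p in p_values) / len(p_values)
print(f"share of identical trials reaching p<0.05: {hit_rate:.0%}")
```

With these parameters only roughly a third of identical trials cross the threshold – which is exactly why several concordant studies with variable p-values are more informative than one that happened to clear p<0.05.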
However, given that ‘statistical significance’ has reigned supreme for quite some time both in the literature and with the regulator, should anyone be surprised that a superficial reading of clinical data still dominates the news cycle or that investors tend to over-rely on the magical statistical threshold known as p<0.05?
One key section of Amrhein et al.’s paper is titled Quit categorizing, which in my estimation really summarizes the whole conundrum. The trouble started with Fisher proposing, and his contemporaries adopting, an arbitrary threshold (0.05) below which results should be dubbed ‘statistically significant’. As we highlighted earlier, the ‘late’ Fisher himself emphasized the nature of the datasets under consideration and the goals of a particular line of scientific inquiry over a static ‘stat sig’ threshold, perhaps in recognition of the dangers inherent in ‘<0.05’ becoming dogma.
In my occupation as an investor in the life sciences, I am frequently confronted with what might be appropriately labeled ‘stat sig fundamentalism’, which not unlike religious fundamentalism, continues to draw an intellectually lazy & largely ignorant crowd with unrealistic promises of easy solutions to complex problems. It is also apparent that a number of drug development outlets somewhat maliciously focus their efforts around producing ‘stat sig’ results at all costs, knowing of course that such results are highly PR-worthy and will be propagated & soaked up by the trading masses.
In a December 2018 report, I indicted the misleading use of ‘stat sig’ against small statistical samples, in particular in the context of ‘PhII’ trials which cannot support regulatory approval. To help wean investors off their reliance on p-values as a one-size-fits-all measure of relevance, I suggested they ask themselves the following:
How representative is a given clinical trial cohort of the overall patient population?
What is the natural history of the condition and what is the efficacy of existing therapies?
How informative are measures of statistical significance against a given dataset?
I then went on to explain that a representative sample size may vary materially depending on the prevalence & etiology of a given condition, citing two contrasting examples: 1) a rare condition with invariably poor outcomes and low susceptibility to existing treatments 2) a widespread condition with variable outcomes and some susceptibility to existing treatments.
A small sample showing benefit from an investigational drug, yet lacking placebo control and formal hypothesis testing, may reasonably suggest significant therapeutic benefit in the context of rare, practically intractable diseases. In such instances, given the difficulties – both practical and ethical – involved in enrolling large, placebo-controlled trials with a view on generating ‘stat sig’ outcomes, it is appropriate to consider the typical clinical trajectory of patients with the rare condition and to subsequently evaluate ‘numerical’ outcomes on study drug against this expected trajectory in the absence of a new therapeutic intervention. This in turn necessitates a deep understanding of ‘natural history’ for the condition, which must almost invariably draw on feedback from the trifecta of healthcare stakeholders – patients, caregivers (typically family members) and physicians. In recognition of this important consideration for rare disease drug development, the FDA recently issued draft guidance on natural history studies. While emphasizing proper planning & methodology, this draft guidance recognizes the ‘sociological’ aspect of developing therapies for rare diseases by emphasizing that “consideration should be given to enlisting the help of disease-specific support groups or patient advocacy groups because they are invaluable resources for identifying and helping to recruit patients. They also can contribute to study design and execution because of their unique perspectives”. Furthermore, FDA encourages the conduct of natural history studies with a view on improving outcomes for patients independently of a given investigational therapy’s odds of approval: “The benefits of planning, organizing, and implementing a natural history study may go beyond drug development.
A natural history study may benefit patients with rare diseases by establishing communication pathways, identifying disease-specific centers of excellence, facilitating the understanding and evaluation of the current standard of care practices, and identifying ways to improve patient care. A natural history study may provide demographic data and epidemiologic estimates of the prevalence of the disease and disease characteristics and aid disease tracking.”
It should be immediately apparent that a collaborative approach – both between industry, the public sector and civil society, and between actors within the private sector – is in the best interest of patients and industry alike, as it facilitates the approval of innovative therapies for the ‘right’ patients on the basis of a comprehensive understanding of natural history, enabling evaluation criteria which go beyond ‘p<0.05’. A deeper exploration of the rationale and promise of public-private and private-private partnerships in the pharmaceutical space merits a standalone examination, which could be the subject of future blog posts. For now, let us revert to a more general consideration of possible ways forward beyond the dogma of ‘stat sig’.
The primary contentions advanced by Amrhein et al. are neither new, nor are the solutions they propose, which is at once indicative of the persistence of the problem and of a certain consistency in the thinking of those seeking to supersede ‘stat sig’ with more comprehensive evaluation. A much-cited 2014 piece by Regina Nuzzo titled Scientific method: Statistical errors, also published in Nature, already laid out much that is wrong with our over-reliance on ‘stat sig’: stat sig in a single, small sample does not necessarily translate into reproducibility, does not indicate meaningful effect size and incentivizes unscrupulous actors to engage in ‘p-hacking’. The author recognized that “any reform would need to sweep through an entrenched culture. It would have to change how statistics is taught, how data analysis is done and how results are reported and interpreted”, which is of course what Amrhein et al. are attempting to induce by collecting a significant number of endorsements for their proposals.
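One flavor of p-hacking – peeking at accumulating data and stopping at the first ‘significant’ result – is easy to demonstrate. In the hypothetical simulation below there is no true effect at all, yet repeated interim testing inflates the false positive rate well beyond the nominal 5% (sample sizes and number of looks are arbitrary assumptions):

```python
import math
import random

random.seed(1)

def peeking_finds_significance(total_n: int = 200, looks: int = 10) -> bool:
    """Null data (true effect = 0); test after each batch, stop at p < 0.05."""
    data: list[float] = []
    for _ in range(looks):
        data += [random.gauss(0.0, 1.0) for _ in range(total_n // looks)]
        n = len(data)
        z = (sum(data) / n) * math.sqrt(n)  # mean / (1/sqrt(n)), SD known = 1
        p = 2.0 * (1.0 - 0.5 * (1.0 + math.erf(abs(z) / math.sqrt(2.0))))
        if p < 0.05:
            return True  # researcher stops and reports 'significance'
    return False

false_positive_rate = sum(peeking_finds_significance() for _ in range(2000)) / 2000
print(f"false positive rate with 10 looks: {false_positive_rate:.0%}")
```

This is precisely why pre-registered analysis plans and alpha-spending corrections exist for legitimate interim analyses.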
By accepting that the ‘solution’ to a deeply rooted problem caused by worship of a single statistical measure cannot be an equally unidimensional alternative – requiring instead complementary statistical methods, confirmation via reproduction, and due consideration of background factors – we will inevitably re-discover a sense of agency and the real-world consequences of human judgment. In the context of healthcare R&D and investment, coming to terms with the fact that statistical analysis is but a tool, and not a substitute for human judgment, should strengthen our resolve to do what is right and necessary to drive the only meaningful outcome we can hope to produce – namely, a meaningful, positive impact on the lives of patients.
As a student at Paris’ SciencesPo, I would occasionally bemoan the seeming absence of, or my seclusion from, literary and artistic genius in the City of Light in the 21st century, relative to its illustrious past. It is all the more exciting to witness the literary coming-of-age of a young French writer, a personal acquaintance who at 23 was awarded France’s ‘young writer’s prize’ and who at 25 just published her first novel with Gallimard, the leading French publishing house once described as having the best backlog in the world.
Diane Chateau-Alaberdina’s first published novel is reflective of the young author’s uncommonly profound perceptiveness, paired with a wealth of insight into the plight of womanhood and the perennial pathological tendency for people to exert control over objects of desire.
If fogginess – as a feature of the environment but foremost as an apt description of a contemporary malaise and that particular melancholy afflicting the Russian diaspora – emerges as a recurrent theme throughout the book, it is pierced by the author’s aptitude for pacing the narrative’s progression with sharp yet elegant introspective and descriptive morsels. Throughout, this is a sophisticated yet unpretentious verbal painting.
The fog-of-war enveloping the trajectories of several Russian families – beginning with the hazy past of the protagonist’s father as a warzone photographer, through the mystery shrouding the ever-present antagonist ‘Agafonova’, to the unexplained wealth of “the photographer’s” subject and her husband – invites us to indulge in the author’s carefully crafted, yet seemingly effortless, exploration of the ‘inner worlds’ which eclipse comparably mundane surroundings and careers.
The theme of photography as explored here is reminiscent of an act of rebellion against the fleeting nature of existence; of an act of vanity whose short-lived fascination fails to fill a void whose presence is felt throughout the book. Where the photographer’s flash, her voyeur-like examination of the subject’s features and physical deterioration ostensibly lead to a rapprochement, a sisterhood of sorts, they fail to prevent the most tragic of outcomes.
La Photographe stands out as an unusually honest examination of the failings of inter-personal relationships and of some of modern society’s dirty secrets, such as the abuse of anxiolytic prescription drugs as a treacherous escape from our troubles. The failings and temptations of the book’s characters are internalized, contained within discreet family structures and tightly knit spheres of influence. Quiet lives devoid of ecstasy are contrasted by the protagonist’s mind games and a cruel ‘long con’ which finds unexpected closure. If a dreamy fogginess serves as a metaphor for our complex yet often uprooted and unfulfilling contemporary lives, it is not the only theme that emerges from Diane’s writings. Escape – escape from the motherland, escape from a mundane existence, escape from manipulative others – permeates the book. Eventually, all of these escapes are semi-failures: for the protagonist’s brother, escaping an unhealthy obsession is rendered futile by means of willful and skilled manipulation; for all main characters, emigration from Russia to the West seems contrasted by unquenchable nostalgia.
La Photographe is brutally candid in exploring the frailty of life; suggestive of a deep sadness rooted in separation, longing, denial and decline; subtle at providing anything resembling judgment or catharsis. The simplistic disapproval of a lone, anonymous exhibition-goer falls short of providing the speech required to counter the silent suffering present throughout this novel.
All in all, La Photographe is a remarkable debut; sensitive, observant, reflective, subtle yet forceful, potentially shocking to the uninitiated. A thought-provoking stepping stone towards, hopefully, an enduring string of intelligent writing capable of lifting the fog-of-war of our contemporary existence bit by bit.
Intercept Pharmaceuticals, which develops & commercializes obeticholic acid (‘OCA’) in several indications, has reported positive PhIII data from a study in NASH patients with fibrosis. As I wrote on SeekingAlpha last month, liver fibrosis is recognized both by FDA and independent researchers as the strongest predictor of adverse clinical outcomes in NASH patients. It follows that OCA’s ability to demonstrate a clear, apparently dose-dependent fibrosis benefit in a large PhIII trial poises the molecule to become the first FDA-approved treatment for NASH fibrosis. With some 20m patients estimated to suffer from NASH in the United States, OCA would only need to capture a diminutive portion of the overall market opportunity over time to 1) lead Intercept to profitability and 2) justify a significant multiple of the company’s current $3.15bn Mcap.
Above: OCA’s PhIII data showing dose-dependent fibrosis improvements at month 18. Source: Intercept
OCA is currently marketed as OCALIVA in a rare condition called primary biliary cholangitis (‘PBC’), where it retails for roughly $70k/year. Together with the frequency of dose-dependent pruritus in the NASH fibrosis study, this has raised questions around OCA’s commercial viability in NASH. A few things to consider:
– If OCA is approved in NASH fibrosis, lowering its unit price across PBC & NASH with a view on maximizing NASH revenues is trivial and makes sense economically (a multi-million patient market in NASH far outweighs a few thousand patients in PBC).
– Intercept would first launch OCA in NASH through the same physician network (e.g. hepatologists) it has already established via OCALIVA’s launch in PBC. The company would likely target sub-groups of patients who are at high risk of progression to liver cirrhosis.
– Titration will be used to minimize pruritus as an adverse effect leading to drug discontinuation. With only 5% of patients experiencing severe pruritus, there remains a very large patient population who might benefit from OCA.
Additional considerations such as OCA’s tendency to raise blood lipid levels & other metabolic implications will be taken into account by treating physicians but, all in all, do not reduce the commercial opportunity to the point of making a NASH launch unattractive for Intercept. Naturally, an established global player with a strong presence in metabolic & cardiovascular disease could speed up OCA’s sales ramp in NASH dramatically, which leads me to think that an acquisition of Intercept by Big Pharma or the likes of Gilead is near-inevitable – barring any major setbacks on the regulatory front. On social media, the discussion around OCA’s commercial potential is likely to simmer on, with a negative spin being almost inevitable as proponents of competing approaches, notably Madrigal’s and Viking’s thyroid hormone receptor approach aimed at lowering liver fat, tout the prospect of better drugs following in OCA’s footsteps.
My take is that these are complementary approaches, and the very early-stage nature of Viking’s compound in particular should not lead to investor exuberance at present.
A more serious question raised on social media concerns the likely adoption of OCA, regardless of competing efforts, purely on the basis of physician buy-in. Several analysts have conducted limited physician polling, and I would caution against drawing premature conclusions from extremely small samples such as the 2 physicians quoted in this tweet:
$ICPT Doc feedback from Jefferies NASH panel. "One would use OCA in no more than 5% of total patients, the other doc is concerned about unwanted metabolic side effects and would use OCA in a lower percentage of his patients and instead would prefer to wait for something better"
All in all, I expect institutional investors to increase their Intercept positions while traders & retail investors quibble about what look to me like marginally relevant considerations with regard to OCA’s peak sales potential in the light of a first-to-market advantage. NASH fibrosis is now Intercept’s for the taking, and the only real question is whether and how quickly OCA will become a blockbuster in this indication. When it does, investors may regret not buying into ICPT when it was still moderately priced.
An enzyme known as LRRK2 – named after the gene that encodes it – appears to play a ubiquitous role in the onset and progression of Parkinson’s disease, new research finds. The findings, published in Science Translational Medicine, indicate that the market for small molecule drugs designed to ‘tone down’ LRRK2 activity may be significantly greater than previously thought.
As discussed in a recent STAT article, one small but very capable biotech in particular is set to benefit from increasing interest in the LRRK2 pathway: Denali Therapeutics. The company is currently testing two LRRK2 inhibitors in parallel, with additional data set to be released in a matter of weeks. Following full PhI readouts, the company intends to select one molecule for further evaluation in PhII trials later this year.
Denali was co-founded by renowned neurologist & current President of Stanford University Marc Tessier-Lavigne, former head of neurology research at Genentech Ryan Watts, and other high-profile scientists, executives and investors.
In addition to its LRRK2 program, the company boasts a potentially revolutionary technology platform for the design of highly brain-penetrant biologics (antibodies, enzymes and hybrids). We believe that Denali’s portfolio and technology platform position it for significant appreciation over the coming years.
Equity research outlining the investment opportunity in greater detail is available on my service, Second-Level Investing.