Category Archives: Biotechnology

FDA drug approval reform: Moving to a Safety-Only System

Thanks heaps to Tyler Cowen for this heads-up on drug development. The ideas are blindingly obvious once explained: the outmoded assumption behind the laborious FDA system (that a drug’s average efficacy is relevant to an individual patient) is now known to be false. Read this, then contact your elected representatives about reforming the FDA. One of the lives saved could be yours.

It seems likely to me that part of the problem with the current scheme is regulatory capture. Big pharma benefits by keeping small (startup) innovators from competing at all. That’s a typical incumbent strategy – get the regulators to erect barriers against new entrants.

It now costs about a billion dollars to develop a new drug, which means that many potentially beneficial drugs are lost. Economist Michele Boldrin and physician S. Joshua Swamidass explain the problem and suggest a new approach:

Every drug approval requires a massive bet—so massive that only very large companies can afford it. Too many drugs become profitable only when the expected payoff is in the billions. … In this high-stakes environment it is difficult to justify developing drugs for rare diseases. They simply do not make enough money to pay for their development. … How many potentially good drugs are dropped in silence every year?

Finding treatments for rare disease should concern us all. And as we look closely at genetic signatures of important diseases, we find that each common disease is composed of several rare diseases that only appear the same on the outside.

Nowhere is this truer than with cancer. Every patient’s tumor is genetically unique. That means most cancer patients have in effect a rare disease that may benefit from a drug that works for only a small number of other patients.

…We can reduce the cost of the drug companies’ bet by returning the FDA to its earlier mission of ensuring safety and leaving proof of efficacy for post-approval studies and surveillance.

Harvard neurologist Peter Lansbury made a similar argument several years ago:

There are also scientific reasons to replace Phase 3. The reasoning behind the Phase 3 requirement — that the average efficacy of a drug is relevant to an individual patient — flies in the face of what we now know about drug responsiveness. Very few drugs are effective in all individuals. In fact, most are not effective in large portions of the population, for reasons that we are just beginning to understand.

It’s much easier to get approval for drugs that are marginally effective in, say, half the population than drugs that are very effective in a small fraction of patients. This statistical barrier discourages the pharmaceutical industry from even beginning to attack diseases, such as Parkinson’s, that are likely to have several subtypes, each of which may respond to a different drug. These drugs are the underappreciated casualties of the Phase 3 requirement; they will never be developed because the risk of failure at Phase 3 is simply too great.

Boldrin and Swamidass offer another suggestion:

In exchange for this simplification, companies would sell medications at a regulated price equal to total economic cost until proven effective, after which the FDA would allow the medications to be sold at market prices. In this way, companies would face strong incentives to conduct or fund appropriate efficacy studies. A “progressive” approval system like this would give cures for rare diseases a fighting chance and substantially reduce the risks and cost of developing safe new drugs.

Instead of price regulations, I have argued for more publicly funded efficacy studies, to be produced by the NIH and other similar institutions. Third-party efficacy studies would have the added benefit of being less subject to bias.

Importantly, we already have good information on what a safety-only system would look like: the off-label market. Drugs prescribed off-label have been through FDA-required safety trials but not through FDA-approved efficacy trials for the prescribed use. The off-label market has its problems, but it is vital to modern medicine because the cutting edge of treatment advances at a far faster rate than the FDA does (hence, a majority of cancer and AIDS prescriptions are off-label; see my original study and this summary with Dan Klein). In the off-label market, firms are not allowed to advertise the off-label use, which also gives them an incentive, above and beyond the sales and reputation incentives, to conduct further efficacy studies. A similar approach might be adopted in a safety-only system.

Addendum: Kevin Outterson at The Incidental Economist and Bill Gardner at Something Not Unlike Research offer useful comments.

[From FDA: Moving to a Safety-Only System]

Finding Good Drugs is Harder than it Sounds

At The Atlantic, Megan McArdle examines drug development. Demonstrating once again how capable she is, Megan quotes my favorite insider source, Derek Lowe:

As NIH director, Elias Zerhouni was a great champion of “translational research” that would get academics into the business of trying to make drugs rather than doing basic research like identifying drug targets. Now that he’s the head of R&D at Sanofi Aventis, he seems to be realizing that discovering drugs is harder than it sounds:

When he arrived at Sanofi, “I thought the solution would be simple,” Zerhouni said at a recent R&D press event attended by the Health Blog. He thought the answer to the company’s R&D woes was to make it more creative and more nimble, like a small biotech.

But he realized that small biotechs are no more successful than large drug makers at coming up with new drugs. “At the end of the day, there’s a gap in translation,” he said. . .

At Sanofi, the goal now is to strive for “open innovation,” which involves looking for new research and ideas both internally and externally — for example, at universities and hospitals. In addition, the company is focusing on first understanding a disease and then figuring out what tools might be effective in treating it, rather than identifying a potential tool first and then looking for a disease area in which it could be helpful.

Derek Lowe adds:

With a lot of these things, if you’re going to first really understand them, you could have a couple of decades’ wait on your hands, and that’s if things go well. More likely, you’ll end up doing what we’ve been doing: taking your best shot with what’s known at the moment and hoping that you got something right. Which leads us to the success rates we have now.

On the other hand, maybe Zerhouni should just call up Marcia Angell or Donald Light, so that they can set him straight on the real costs of drug R&D. Why should we listen to a former head of the NIH who’s now running a major industrial research department, when we can go to the folks who really know what they’re talking about, right? And I’d also like to know what he thinks of Francis Collins’ plan for a new NIH translational research institute, too, but we may not get to hear about that. . .

Read the whole thing »

Combining antibiotics with bioactive drug compounds

This is interesting. I wonder how practical it would be to run a broader screening program for combinations that are effective against resistant pathogens such as MRSA?

New research, published yesterday (April 25) ahead-of-print in Nature Chemical Biology, provides evidence that combining antibiotics with marketed drug compounds could be one answer, uncovering previously unknown antibacterial functions of drugs that boost the effectiveness of antibiotics.

(…) So Wright and his colleagues decided to broaden the search. They focused on minocycline, an antibiotic that inhibits protein synthesis, frequently used in the 1950s and 1960s until bacteria developed resistance. “It seemed like a good place to start as something that already had some intrinsic anti-microbial activity but had been largely abandoned by the clinical community because of the resistance problem,” said Wright.

They screened minocycline in combination with more than 1,000 previously approved bioactive drug compounds — most of which had no known antibiotic function — against three common and often resistant bacteria: Pseudomonas aeruginosa, Escherichia coli, and Staphylococcus aureus.

The screen revealed a total of 69 compounds never before used to treat bacterial infections that, when combined with minocycline, decreased bacterial growth by at least 45 percent — significantly more than when treated with only the antibiotic. “It was very gratifying for us that our hypothesis was right,” said Wright. “We found all these unexpected interactions.”

Cite: L. Ejim et al., “Combinations of antibiotics and nonantibiotic drugs enhance antimicrobial efficacy,” Nature Chemical Biology, doi:10.1038/nchembio.559, 2011.
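Out of curiosity, here is a minimal Python sketch of how hits from a combination screen like this might be flagged. The growth fractions, compound names, and the exact threshold logic are all my own invented illustrations, not the paper’s actual pipeline.

```python
# Hypothetical sketch of flagging hits from an antibiotic + compound screen.
# Growth values are fractions of an untreated control (1.0 = unimpeded growth);
# all numbers and names are invented.

minocycline_alone = 0.80  # assumed: sub-inhibitory antibiotic dose on its own

screen_results = {
    # compound name: bacterial growth with minocycline + compound
    "compound_A": 0.30,
    "compound_B": 0.90,
    "compound_C": 0.50,
}

HIT_THRESHOLD = 0.55  # growth cut by at least 45% relative to untreated control

hits = [
    name
    for name, growth in screen_results.items()
    if growth <= HIT_THRESHOLD and growth < minocycline_alone
]

print(f"Potential synergistic hits: {hits}")  # -> ['compound_A', 'compound_C']
```

The point is just the shape of the filter: keep combinations that beat both the untreated control and the antibiotic alone.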

So, You Thought Breast Cancer Was Complicated?

This is not good news. Organic chemist Derek Lowe discusses a new study that employed the latest DNA sequencing techniques. Personalized therapy is looking more and more distant.

You may have detected, here and there, a certain amount of skepticism on this blog about the direct application of genomic information to complex human diseases. And several times I’ve beaten the drum for the position that there is no such disease as “cancer” – just a lot of conditions that all result in the phenotype of uncontrolled cellular growth.

Well, here’s some pretty dramatic evidence in favor of both of those positions. A new study, one of those things that could only be done with modern sequencing techniques, has given us the hardest data yet on the genomic basis of cancerous cells. This massive effort completely sequenced the tumors from 50 different breast cancer patients, along with nearby healthy cells as controls for each case.

Over 1700 mutations were found – but only three of them showed up in as many as 10% of the patients. The great majority were unique to each patient, and they were all over the place: deletions, frame shifts, translocations, what have you. The lead author of the study told Nature News that the results were “complex and somewhat alarming”, and I second that, only pausing to drop the “somewhat”. I add that qualification because these patients were already more homogeneous than the normal run of breast cancer cases – they were all estrogen-receptor positive, picked for trials of an aromatase inhibitor.

Read the whole thing »
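To make the “mostly private mutations” point concrete, here is a toy Python tally of the kind of bookkeeping involved. The patient IDs, mutation labels, and counts are made up; the actual study sequenced 50 tumor/normal pairs with far more sophisticated methods.

```python
# Toy tally of mutation recurrence across patients; all IDs and mutation
# labels are invented for illustration only.
from collections import Counter

# Each tumor is represented by the set of somatic mutations found in it
# (relative to that patient's own healthy tissue).
tumors = {
    "patient_01": {"TP53", "PIK3CA", "del_chr8"},
    "patient_02": {"PIK3CA", "frameshift_X"},
    "patient_03": {"TP53", "translocation_Y"},
    "patient_04": {"GATA3"},
}

recurrence = Counter(m for muts in tumors.values() for m in muts)

recurrent = {m: c for m, c in recurrence.items() if c >= 2}
private = sorted(m for m, c in recurrence.items() if c == 1)

print(f"Recurrent (2+ patients): {recurrent}")   # the few shared drivers
print(f"Private (one patient only): {private}")  # everything else
```

In the real data, the “private” bucket dominates, which is exactly why a one-size-fits-all therapy looks so unlikely.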

Health Care Reform and the Drug Industry: How Goes It?

Derek Lowe examines what has actually come out of the Obama administration:

(…) Even without any backtracking on exclusivity, the article maintains that health care reform was a loser for the drug industry. The author goes on to detail the various other costs of the bill as it was passed, and then gets to the biggest structural problem:

While the healthy part of the pharmaceutical market will be pounded, the government-run segment of the market, Medicaid, will be expanded by 16 million patients. Medicaid has the worst pricing structure and the worst track record in paying for innovations of any sector in the United States market. Like government health-care systems around the world, Medicaid must be dragged to pay for medical advances. Unlike employers and seniors in Part D, Medicaid patients cannot vote with their feet if their health plan does not provide the new medicines they want. The incentives in Medicaid all run against paying for pharmaceutical innovations.

So, Obamacare significantly expands the worst sectors of the pharmaceutical market while degrading the best.

Read the whole thing »

TEDxCaltech: J. Craig Venter – Future Biology

On January 14, 2011, Caltech hosted TEDxCaltech, an exciting one-day event to honor Richard Feynman, Nobel Laureate, Caltech physics professor, iconoclast, visionary, and all-around “curious character.”

Don’t you wish Richard Feynman could be one of the presenters?

One of the talks that has already been uploaded is Venter’s Future Biology. These Caltech talks are not in the TED iTunes subscription list – so you won’t automatically see them unless you visit the Caltech site.

Intrexon: “the Google of the life sciences”?

So says Randal J. Kirk. I’ve not heard of Intrexon before. The Forbes writers asked Craig Venter, who said he hadn’t heard of it either. There is so much hype in synthetic biology; is this more of the same or the real deal?

(…) Kirk says everything he has done in the past pales next to the potential of his latest project: Intrexon, a secretive research-stage company that is working on the hot new field of synthetic biology—basically genetic engineering on steroids. Kirk and his investment fund, Third Security, have poured $200 million into the closely held 180-person company based in Blacksburg, Va., which has no drugs on the market.

“I’ve been a biotech investor for 27 years, and Intrexon is by far the best thing I’ve ever seen,” says Kirk, 56, who raises falcons and composes electronic music on a 7,200-acre cattle farm in rural Pulaski County, Va. He likens Intrexon to “the Google of the life sciences” and predicts that in a decade it could become “the largest, most significant company” in its burgeoning field.

(…) Lots of big scientific names are working in synthetic biology, which so far has produced lots of hype and headlines but few practical breakthroughs. Gene jockey J. Craig Venter, known for sequencing the first human genome in 2000, leads a company called Synthetic Genomics that has a $300 million deal with ExxonMobil to make designer biofuels.

Intrexon has released few details about which products it is pursuing. Its lead drug is only at the earliest stage of human trials. It is so obscure that three prominent synthetic biology researchers contacted by FORBES—including Venter—said they had never heard of it. Kirk shrugs. Among other colossal ambitions, he wants to revitalize the troubled field of gene therapy, make dozens of inexpensive protein drugs and produce better genetically engineered crops that will benefit consumers, not just farmers. The company is also working on biofuels, designer enzymes, bioplastics and unspecified consumer products. Keeping the work secret is part of the plan, Kirk says. “If we were in the business of publishing, we could get the cover of Science magazine any issue we wanted,” he boasts.

The scientist behind Kirk’s mystery company is the 45-year-old molecular geneticist Thomas Reed. He founded Intrexon in 1998 while still completing his Ph.D. and postdoctoral work in cardiovascular genetics at the University of Cincinnati. “I think of him as the Henry Ford of DNA,” says Kirk. “We are all living in his dream.”

(…)

Read the whole thing » See also Is Randal J. Kirk Biotech’s Best Investor? And this on the Intrexon/Ziopharm deal.

The drug pipeline: the numbers on innovations

Organic chemist Derek Lowe’s commentary on drug discovery at “In the Pipeline” is a valued source for insider perspectives. In Where Drugs Come From: The Numbers, Derek examines the November 2010 paper by Robert Kneller. Hopefully this work will provide the numbers to quiet the ongoing arguments over who does the heavy lifting (e.g., “pharma doesn’t discover important compounds”). The Kneller study covers only the 252 new drugs approved by the US Food and Drug Administration over the decade from 1998 to 2007. Excerpt:

(…) A new paper in Nature Reviews Drug Discovery takes on all 252 drugs approved by the FDA from then through 2007, and traces each of them back to their origins. What’s more, each drug is evaluated by how much unmet medical need it was addressed to and how scientifically innovative it was. Clearly, there’s going to be room for some argument in any study of this sort, but I’m very glad to have it, nonetheless. Credit where credit’s due: who’s been discovering the most drugs, and who’s been discovering the best ones?

First, the raw numbers. In the 1998-2007 period, the 252 drugs break down as follows. Note that some drugs have been split up, with partial credit being assigned to more than one category. Overall, we have:

58% from pharmaceutical companies.

18% from biotech companies.

16% from universities, transferred to biotech.

8% from universities, transferred to pharma.

That sounds about right to me. And finally, I have some hard numbers to point to when I next run into someone who tries to tell me that all drugs are found with NIH grants, and that drug companies hardly do any research. (I know that this sounds like the most ridiculous strawman, but believe me, there are people – who regard themselves as intelligent and informed – who believe this passionately, in nearly those exact words). But fear not, this isn’t going to be a relentless pharma-is-great post, because it’s certainly not a pharma-is-great paper. Read on. . .

Read the whole thing »
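For a rough sense of scale, here is my own back-of-the-envelope conversion of those quoted shares into approximate drug counts out of the 252 approvals. These are not the paper’s tables, and partial credit means they are not exact integer counts of whole drugs.

```python
# Back-of-the-envelope conversion of the quoted origin shares into
# approximate drug counts out of 252 approvals (1998-2007).

total_drugs = 252
shares = {
    "pharmaceutical companies": 0.58,
    "biotech companies": 0.18,
    "universities, transferred to biotech": 0.16,
    "universities, transferred to pharma": 0.08,
}

for origin, share in shares.items():
    print(f"{origin}: ~{share * total_drugs:.0f} drugs ({share:.0%})")

assert abs(sum(shares.values()) - 1.0) < 1e-9  # the four shares cover everything
```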

And do read the comments — e.g., commenter Virgil added some useful background on the impact of university “indirect costs”.

Long answer… When Universities get funding from NIH, it comes with “indirect costs” (to pay for administration, lighting, AC, building maintenance, etc.). The indirect cost rate runs anywhere from 50-90% depending on the University, and is pretty hard to change – it gets reviewed by independent panels. So for example, when I get a $1m grant, the University gets an extra $530k (in my case) to pay for all the fluff.

The problem is, many granting agencies do not pay “full indirects”. American Heart Association only pays 10%. Most industry sponsored clinical trials pay 20%. A lot of charities and foundations pay nothing at all. So, when faculty get these grants, they’re bringing in dollars to do research, but those dollars do not bring in enough indirects to support all the background stuff.

Typically, the indirect cost recovery rate for most Universities is in the 75% range, meaning that they bring in enough indirects to support about 75% of the total cost of doing research. If all research grants paid full indirects (i.e. everything was NIH funded) this would not be a problem. One of the biggest budget problems facing a lot of Universities today is how to make up that gap of 25%. The old-fashioned way was to use the endowment (now shot to bits by the recession), to skim money off the profit from the adjacent hospital, as is the case at most University Medical Centers (now shot to pieces by medicare/medicaid reimbursement rates, and coming healthcare reform), or to rely on other revenue streams (medical student fees, licensing and patents, charitable donations).

The cynical way to look at this is “for every research dollar we bring in, we have to find an extra 25c from somewhere to cover the real costs, so research actually costs us money”. As you may guess, such a message does not sit well with the faculty at many Universities. Nevertheless, the old business model, wherein research is a profitable enterprise at Universities, is simply no longer sustainable. Only those Universities with very big endowments are surviving the current financial crunch without big cost-cutting measures (typically, firing administrative staff and cutting back on support – goodbye core facilities, etc.)
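For concreteness, here is a minimal Python sketch of the arithmetic in Virgil’s comment. The 53%, 10%, and 75% figures are the commenter’s own examples, not universal rates, and the helper function is just illustrative.

```python
# Minimal sketch of the indirect-cost arithmetic described in the comment.
# The 53%, 10%, and 75% figures are the commenter's examples; actual
# negotiated rates vary by institution and sponsor.

def indirects(direct_costs: float, rate: float) -> float:
    """Indirect-cost dollars that accompany a grant of `direct_costs`."""
    return direct_costs * rate

grant = 1_000_000  # the commenter's $1M NIH grant example

print(f"NIH indirects at 53%: ${indirects(grant, 0.53):,.0f}")   # ~$530,000
print(f"AHA indirects at 10%: ${indirects(grant, 0.10):,.0f}")   # ~$100,000

# If recovered indirects cover only ~75% of the true overhead of research,
# the university must find the remaining ~25 cents per research dollar.
recovery_rate = 0.75
print(f"Shortfall per research dollar: ${1.0 - recovery_rate:.2f}")
```

The gap between the negotiated rate and what many sponsors actually pay is the whole story: the dollars arrive, but the overhead they are supposed to cover does not.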