Have you ever wondered what goes on behind the scenes—how new drugs are magically produced and brought forth? We’ll continue to take the mystery out of clinical research and drug development, and to provide background information so that both patients and physicians can make more informed decisions about whether they wish to participate in clinical trials.
Why care?
Developing a medicine, from the discovery of the chemical until it reaches your drug store, takes an average of 12 to 15 years and the participation of thousands of volunteers in clinical trials (Fig 1).
Very few people participate in clinical trials—fewer than 5% of patients with cancer do—owing to lack of awareness or knowledge about the process. We’ll go into detail about how drugs are developed in later posts.
An inadequate number of volunteers is one of the major bottlenecks in drug development, delaying the product’s release and usefulness to the public. Of course, many people may suffer or even die during this wait, if they have an illness that is not yet otherwise treatable. So if you want new medicines, learn about—and decide if you wish to participate in—the process. I have, as a volunteer subject, researcher, and advocate.
Why are clinical trials done?
People naturally want to find things that will help them feel better. So people concocted brews, sometimes known as “patent meds.” But then some asked whether the medicine actually worked, or what else the medicine did besides its intended use. Questions later became more sophisticated: Might the drug be dangerous, either to people with specific conditions or to people taking other meds? How does the drug work? If it works by a specific mechanism, does that suggest it might be useful for another condition? Are there other unintended consequences? These questions, and many more, are why clinical trials are undertaken.
How clinical trials came to be
In order to understand why trials are done in certain ways, it’s helpful—and interesting—to understand how they evolved.
Experiments were described as far back as biblical times, with Daniel following a diet of pulses and water in lieu of meat and wine. Others followed, comparing observations between groups receiving different treatments. But the first known prospective, controlled clinical trial occurred in 1747, when James Lind gave sailors different dietary supplements in an effort to treat scurvy, an illness due to vitamin C deficiency. While he demonstrated efficacy, Lind didn’t obtain consent from his participants, giving his study the dubious honor of also being the first to be criticized on ethical grounds.
Epidemics of smallpox, a devastating viral infection now eradicated, were common in the 1700s and 1800s. In 1796, Edward Jenner demonstrated that a vaccine made from cowpox, a related but milder illness, could be used to prevent smallpox. In the U.S., attempts were made to develop a vaccine from cowpox scabs imported from England. Because the cowpox virus could not live very long in dried scabs, the virus was propagated by arm-to-arm transmission in successive person-to-person inoculations: an infected vaccination lesion on one person was scraped and used as the source of material with which to inoculate the next person.
Congress mandated that an adequate supply of uncontaminated cowpox be maintained and that the vaccine be available to any citizen. Under this 1813 Vaccine Act, Dr. James Smith, a Baltimore physician, propagated cowpox for 20 years via arm-to-arm transmission every 8 days. Unfortunately, in 1821 Dr. Smith mistakenly sent smallpox crusts instead of cowpox vaccine to North Carolina, precipitating a smallpox epidemic as well as the subsequent repeal of the Vaccine Act of 1813.
Besides this egregious error of vaccinating people with live smallpox virus rather than attenuated (weakened) cowpox, arm-to-arm inoculation also often transmitted other infectious diseases along with the cowpox vaccine, dampening enthusiasm for vaccination efforts. (This nicely illustrates the Law of Unintended Consequences; fortunately, such person-to-person propagation is no longer done.)
Many laws were subsequently passed reactively, in response to tragedies, rather than proactively, preventing problems from occurring.
For example, after American troops fighting in Mexico received ineffective, counterfeit medications for malaria, the Import Drugs Act of 1848 was passed, establishing customs laboratories to verify drugs’ authenticity. Ironically, counterfeit anti-malarials are again a huge problem in Southeast Asia and Africa—but that is for a later story.
In 1880, the first major attempt to pass a national food and drug law was made—and it failed, as there was no immediate crisis in the public’s eye.
After the gory, nauseating descriptions in Upton Sinclair’s exposé of Chicago’s meat-processing plants, The Jungle, Congress passed legislation in 1906 forbidding commerce in impure and mislabeled food and drugs—though not requiring efficacy. But the legislation had few real teeth, as the burden of proof was on the FDA to show that a drug’s labeling was false and fraudulent before it could be taken off the market. Similarly, there was no requirement to disclose the ingredients of drugs, as they were considered trade secrets, lending the name “patent medicine.” Sound familiar?
There was minimal progress towards consumer protection when, in 1911, the Supreme Court, in U.S. v. Johnson, ruled that the 1906 FDA Act did prohibit false or misleading statements about the ingredients or identity of a drug—but still did not prohibit lying about efficacy.
It wasn’t until 1938, after 107 deaths from “Elixir Sulfanilamide” had occurred, that the FDA was able to require “a manufacturer to prove the safety of a drug before it could be marketed.” This established the need for clinical trials.
Unintended Consequences
Clinical trials seek to learn whether a drug (or device) works as expected—it’s unknown until tested in people. That’s why early phase trials use only a few people, and more are added as experience is gained. Sometimes unexpected discoveries are made along the way. For example, Rogaine was discovered by an astute clinician researcher during a clinical trial studying high blood pressure. The drug, minoxidil, originally under study as an anti-hypertensive medication, was serendipitously found to have the unexpected side effect of stimulating hair growth, prompting a whole new line of products for baldness.
Similarly, Viagra was discovered by accident. Sildenafil, its generic name, was being studied as a treatment for angina, as it dilates blood vessels by blocking an enzyme, phosphodiesterase (PDE). While not very effective for angina, it was found to prolong erections, stimulating the whole “lifestyle drug” industry. Fortunately, PDE inhibitors are now being found useful for a host of important medical conditions, ranging from pulmonary hypertension to asthma and muscular dystrophy.
Of course, not all inadvertent discoveries have such rosy outcomes.
For example, diethylstilbestrol (DES), a synthetic estrogen, was commonly prescribed in the US from 1938 to 1971 to help prevent miscarriages. It was only after many years that DES was found to cause a rare type of vaginal cancer in daughters of exposed women. Later, other types of cancers showed up as well, in small numbers.
The tragic effects of thalidomide on developing embryos are perhaps the most notorious and horrible unexpected outcome in the history of drug development. Thalidomide was first released in 1957, with over-the-counter availability in Germany, to treat morning sickness. It was several years before the link was clearly made between thalidomide’s use in early pregnancy and the rash of children born with small seal-like flippers instead of limbs (phocomelia). Thalidomide was then removed from the market. There was a controversial resurgence of interest in the drug, culminating in FDA approval of thalidomide in 1998 (for erythema nodosum leprosum, a complication of leprosy); its use is now being explored for multiple myeloma and other serious illnesses.
Good from Evil: Ethical Standards
In 1928, Alexander Fleming discovered penicillin by chance, in a “mold juice” that inhibited the growth of bacteria on Petri dishes. The potential value of penicillin was underappreciated until 1939, when its purification and development began in earnest as part of a war-time effort. War (and friendlier competition between nations) is, all too often, the impetus for research, and has led to many useful inventions. Such advances naturally occurred in vascular surgery, regional anesthesia, and orthopedics, as well as in less obviously related therapies such as immunizations and treatments for malaria and other infections.
For example, animal studies in 1940 showed that mice could be effectively treated for Streptococcus with penicillin. The first patient received penicillin in 1941, under what would now be called “compassionate use.” Unfortunately, although he initially responded to treatment, there was not enough drug available, and he later died of his Streptococcus infection.
This “proof of concept” was enough, however, to spur development and extensive cooperation between Britain and the U.S., driven by the desire to have the drug available to treat military injuries in World War II. Encouraged by the Office of Scientific Research and Development (OSRD), pharmaceutical companies joined this patriotic, war-time effort; Merck was the first to develop the antibiotic for clinical use, followed soon by Squibb, Pfizer, and Lilly. In 1943, the War Production Board (WPB) assumed responsibility for increasing production to meet the military’s needs. The National Research Council’s chairman, Dr. Chester Keefer, had the thankless position of rationing the limited stocks of penicillin available to those outside of the military. Although not via a formal clinical trial, Dr. Keefer, too, gathered data on the civilians’ responses. The same process is followed today when a patient receives an experimental drug outside of a trial protocol.
As of March 15, 1945, rationing of penicillin stopped, as there were adequate supplies for both military and public needs. Unfortunately, the “miracle drug” was squandered, and now it, along with many other antibiotics, is of limited use—bacteria have evolved resistance far more successfully than pharmaceutical development has kept pace. Many of us are concerned that we are entering the post-antibiotic era; the Infectious Diseases Society of America has been trying to call attention to this critical problem since its “Bad Bugs, No Drugs” campaign began in 2004. In a déjà vu moment, Australian news just reported that international shortages are once again necessitating rationing of penicillin there.
Regulated, standardized clinical trials formally began in response to the horrors of World War II abuses, and ethical requirements for human research were established by world consensus.
Most drug trials are closely regulated and safe to participate in. Those that aren’t make the headlines, as they sell copy. Rebecca Skloot’s fine, captivating tale, The Immortal Life of Henrietta Lacks, is a superb example both of medical research gone awry and of our fascination with the dark side of stories. But without clinical trials, none of us would have any prescription medicines available.
As we’ve seen, the history of drug development has been checkered at times. Clinical research is neither perfect nor without some degree of risk, but those risks can be minimized, and more safeguards are in place than ever before. There have been huge strides in the development of drugs, medical devices, vaccines, and novel therapies in recent decades. Each has gone through a similar process of extensive testing before approval for use by the general public. But because these early phases of testing involve, at most, a few thousand volunteers, unexpected outcomes after market approval are inevitable, as the new drug is taken by millions.
Judy Stone, MD is an infectious disease specialist, experienced in conducting clinical research. She is the author of Conducting Clinical Research, the essential guide to the topic, and regular posts can be found on her Scientific American Network Blog, Molecules to Medicine, where this post originally appeared.
If anyone is interested in a career or education in Clinical Research, take a look at the Clinical Research Masters Degree at The University of Southampton. This Masters Degree is a recognised component of the National Institute of Health Research Clinical Academic Careers Training Pathway. Have a look at http://www.southampton.ac.uk/healthsciences for more information.
In an effort to further educate your readers, I would like to offer other suggestions for complete and balanced information about clinical trials participation including process, protection, benefits and risks. Here are some helpful resources (websites and book):
1) http://www.CISCRP.org (Center for Information and Study on Clinical Research Participation)
This non-profit organization is focused on educating and informing the public about clinical research participation. CISCRP provides valuable information including how volunteers can protect themselves.
CISCRP tries to help you locate ongoing clinical trials by supporting SearchClinicalTrials.org. CISCRP also helps those patients who are having difficulty locating clinical trials by conducting a custom search for them.
2) The Gift of Participation: A Guide to Making Informed Decisions About Volunteering for a Clinical Trial (author: Kenneth Getz) – available on the CISCRP website.
3) And for specific trial listings and news about clinical trials, your readers should go to http://www.CenterWatch.com. There, they can search for clinical trials in their area or around the world, and they will find listings of clinical trials organized by medical condition, therapeutic area, and location.
My goal is to help people feel empowered and protected as they participate in the Clinical Trials process. I hope you find these suggestions helpful.