Doctor's Review: Medicine on the Move

December 18, 2017

The uneven history of clinical trials

Human testing of new cures from the brilliant to the bizarre

Curing the unwell is among the noblest acts of our species. On the other hand, the practice of harming other human beings in the name of science is one of the least noble.

Historically, involuntary medical testing has been performed on the most vulnerable humans: children, the mentally ill, the physically impaired, prisoners of war, the colonially dominated, the incarcerated, the list goes on. Gruesome operations without anesthetic, infecting infants with syphilis, feeding uranium to pregnant mothers and injecting prisoners with live cancer cells are among the acts that have been carried out in the name of medical progress.

What constitutes consent?

Not until the 1970s was it deemed unethical to use prisoners as subjects for medical tests, on the basis that inmates were not adequately equipped to provide informed consent. With the end of that practice, pharmaceutical researchers shifted their attention to university centres and soon established relationships that seemed mutually beneficial. Academic researchers could not only design clinical trials, they could also publish the results in credible journals, which could be used to help market the products that proved effective. Students were a freely available pool of potential test subjects, and by the early 1990s much of the clinical testing of new drugs took place at university centres.

Since then, priorities have shifted. These days, industry goals are focused on bringing new products to market quickly, pushed in considerable part by 20-year patent expiry dates.

Swift entry to markets has become the driving force to get medications on pharmacy and hospital shelves. Trials have been moved to the private sector where contract research organizations (CROs) shepherd products through each stage of the trials. About half of all global drug companies contract CROs outside North America, where costs are lower and regulations less stringent. That said, Canada continues to be a major contributor to the research industry — this country is fourth in the world in the number of clinical trials taking place at any given time, according to Health Canada.

Human participants in these trials are more richly compensated than they once were. Though the vision of volunteers giving their time and enduring discomfort “for the good of humanity” persists, in truth, most people do it for the money.

Professional “guinea pigs,” as they call themselves, can make as much as $40,000 annually if they manage to pack in eight to 12 phase one trials a year. These trials — when treatments are tested on humans for the first time — are the most lucrative owing to their less certain outcomes. And being a guinea pig can be risky. In France, not long ago, five trial participants were permanently disabled and one died. A British trial resulted in amputations for some of the subjects and in dangerous head-swelling. These are cautionary tales; the vast majority of trials cause no lasting damage.

CROs favour these “professional” test subjects: they know the drill and are unlikely to back out mid-trial. But just who are the participants in these trials? Who has long stretches of available time and is willing to endure the boredom, discomfort and potential health risks?

The usual suspects include a variety of students, mavericks and those who want to escape the daily grind, but it's worth noting that many CROs recruit at the gates of prisons, ready to offer a source of income to people who face considerable challenges finding work. Professional guinea pigs often gripe about the work they do, calling it “mild torture” and saying that having things “done to them” has replaced the pre-information-economy job market in which they once did things.

Back to Nuremberg

The guidelines for testing on human subjects were first established in the Nuremberg Code in 1947 during the Nuremberg trials when Nazi doctors were accused of murdering and torturing victims in valueless experiments. The code states that no experimenter should subject the participants to any procedure they would not be willing to undergo themselves. For these and other reasons, many researchers have done just that: become their own guinea pigs.

Drinking black vomit infected with yellow fever; enduring the subcutaneous injection of 50 hookworms; or surgically implanting a computer chip into the median nerve fibres of the arm are experiments that would doubtless be unacceptable to many human volunteers regardless of the compensation offered. The doctors who performed these experiments reasoned that the only way to test their hypotheses without risking legal consequences was to perform them on themselves. Indeed, some physicians argue that self-experimentation is the only form of research that fully meets the requirement of informed consent: they believe that experimental research is too complex for those without medical backgrounds to grasp the potential for harm of untested medications.

Dr Eugene G. Laforet, a Boston physician and ethicist, believed volunteers should be given the reassurance that researchers are taking part in the experiment themselves. Another prominent medical researcher, Rosalyn S. Yalow, co-winner of the 1977 Nobel Prize for development of the radioimmunoassay (RIA) technique, concurred: “In our laboratory we always used ourselves because we are the only ones who can truly give informed consent.”

Self-experimentation has a long history. Nineteenth-century doctor Charles-Édouard Brown-Séquard, whose self-experiments led him to the concept of hormones, stated: “I believe you will never fully know the action of certain remedies, if you have not ascertained, on your own person, what effects they produce on the brain, the eye, the ear, the nerves, the muscles, and the principal viscera.”

Though the practice is discouraged these days, there are contemporary visionaries willing to put their own bodies on the line. The person who introduced 50 hookworms to his body did so in 2004. Immunologist Dr David Pritchard spent years in Papua New Guinea researching the possible power of hookworms to boost the human immune system and prevent allergies, but he needed definitive proof, so he volunteered himself. He applied a dressing crawling with pin-size hookworm larvae to his arm and left it on for a few days to make sure the creatures had made their way into his system.

His theory was that the worms have evolved to switch off the human immune system in order to survive in their hosts, which would explain why infected people have fewer allergies. Though hookworm is rampant in the tropics, where it kills 65,000 people a year and afflicts many more with anemia, Dr Pritchard says that in controlled experiments the worms have not caused any problems and serve to “turn down the volume” on the immune system. He winnowed the initial 50 worms down to ten, and in 2006 was given the thumbs up by an ethics committee of Britain's National Health Service to conduct a study with 30 participants, many of whom were enthusiastic about the disappearance of their allergies. The trial was a success: some participants elected to keep the worms afterward, and others who had received the placebo asked if they could have worms too!

Bionic research may be a logical next step for humanity, but few are willing to take the risks of becoming cyborgs themselves. Kevin Warwick, Deputy Vice-Chancellor of Research at Coventry University in England, had no such qualms. In 1998, he began the first phase of Project Cyborg, remaining conscious while his GP surgically implanted a silicon chip transponder into his left arm. If the procedure went awry, he risked nerve damage or amputation. Instead, he was able to parade about campus being identified by computers that opened doors and turned on lights for him, in much the way that pets are identified by computer chips today. By 2002, he was able to control a wheelchair and a robotic hand across the Atlantic — with his mind. His next project is to communicate brain-to-brain with another person also equipped with implants.

And who was the nutter who willingly drank black vomit? Stubbins Ffirth was a Philadelphia doctor who witnessed the devastating yellow fever epidemic of 1793 and wanted to prove his theory that the disease was not contagious. To make his point, he drank the vomit by the glassful, poured it into open cuts and waded waist-deep in a veritable hot tub of the foul stuff. Remaining uninfected, he declared that yellow fever was noncontagious. But he was, of course, mistaken. As we now know, yellow fever is indeed infectious and is transmitted through the bite of an infected mosquito.

