Human Subject: An Investigational Memoir



4. Freeing of Information

“The sciences advance only through new ideas and through the creative or original power of thought. Care must therefore be taken, in education, that the knowledge meant to arm the intelligence does not crush it under its weight, and that the rules intended to support the weak sides of the mind do not atrophy or stifle its powerful and fertile sides.”

To apply for IRB approval, an investigator must submit a study protocol. This is a detailed description of the proposed research, including the perceived need, exact procedures that will be performed, and a discussion of any ethical considerations. The application must include a draft of the consent form that subjects will be required to sign before participating in the study. Normally the consent forms are the only documentation that the study subject ever sees.

After deciding to delve deeper into the whole murky business of conducting research on human subjects, I thought it would be helpful to see the actual protocol for at least one of the studies in which I participated. It made sense to start with the oxycodone study, because of my curiosity as to whether it had been approved with the understanding that an honest-to-God psychologist or other professional would be administering at least some of the tests. Twelve years earlier I had volunteered for another study at Big U (the details of which appear in chapter 20). In that study, my primary contact was with the principal investigator, a dentist. And years before that, I’d been in a study of an anti-anxiety drug, where I think I met at least once with a real-live psychiatrist.

Getting a copy of the protocol should be a routine matter, I thought. There certainly wasn’t anything secretive or proprietary about it, as there would be if it were a new drug that was being tested. But I soon felt as if I were requesting a top-secret national security document. Here's the sequence of events:

Day 1: I submitted an email request to InvestiGuard, the university’s research ethics overseers, asking if they could furnish a copy of the protocol.
Day 8: I emailed the CRC’s “research subject advocate” (I’ll call him Advocate Dude) to let him know that I had written to InvestiGuard and received no response. I noticed on the CRC Web site that he had the initials “CIP” after his name; looking that up, I learned that he had passed an exam to become a Certified IRB Professional. This certification is awarded to anyone with $400 and the know-how to pass a four-hour, multiple-choice test. The Council for Certification of IRB Professionals, a division of PRIM&R (Public Responsibility in Medicine & Research), administers the test and awards the certification (http://www.primr.org/certification/exams.html).
Day 12: Advocate Dude wrote back, suggesting that I contact the principal investigator.
Day 13: I wrote to Lisa, the research study assistant, who called me to ask why I wanted to see the protocol. (It’s funny how some people always pick up the phone as soon as they get an email message, instead of hitting the “Reply” button.) No one else had ever asked to see it, and she wanted to have a reason to give the PI before she asked her about it. I told her that I didn’t have any particular questions or concerns. I was just curious about it. Finally she said it would be best if I just asked the PI myself.

I asked Advocate Dude if I really had to have a reason to want to see the protocol, and he said no.
Day 14: I emailed the PI, asking if I could see a copy of the protocol. She never replied.
Day 21: I received an unsigned, one-sentence message from a generic InvestiGuard email address, asking: “Have you received a reply to your e-mail?” (the one I’d sent 10 days earlier, I presumed). I replied that no, I hadn’t received an answer, and I included in my reply a copy of my email to the PI, so that they would see what steps I’d already taken. I sent a copy of this whole correspondence to Advocate Dude.

The anonymous InvestiGuard staff person wrote back, “I don't believe protocols are made available to subjects.” Then she/he/it told me who was the Right Person to contact at InvestiGuard for further information. I wondered why he/she/it couldn't have just forwarded my request to the Right Person.

I wrote to the Right Person, who immediately wrote back and said that she wasn't in fact it, but that she would forward my request to the person who was.
Day 22: The Really Right Person wrote to me that InvestiGuard was “not authorized to openly share research protocols with third parties,” but that since Big U was a public institution, I could request the protocol under the state’s public records act. She also said she would be happy to answer any questions I had.

I wrote back to the Really Right Person, taking her up on her offer by asking several questions, including this one that seemed essential: Would the protocol really be considered public information? She never wrote back.

Curious as to the actual rules regarding public disclosure of protocols, I searched the archives of the IRB Forum and found one ethicist who seemed to favor a more open approach than most on the list to sharing information with subjects and the public. I’ll call him Ethicist Dude. I wrote to him about my situation and asked for his opinion.
Day 26: Ethicist Dude wrote back to say that he thought protocols should be available for viewing by potential and enrolled subjects. He attached an article by a doctor and medical ethicist who had fought for, and eventually won, the right to see the protocol for a study he participated in. “Every IRB should routinely require investigators to make protocols available to subjects who ask,” wrote Dr. Robert Veatch (an author whom I would continue to encounter as I read more about research ethics). “The really interesting question is whether they should also require that subjects be routinely told of their right of access to redacted protocols if they want to see them.” (Veatch, 2002)
Day 27: Having given the PI nearly two weeks to respond, I submitted requests by email to both the university’s public records office, which I’ll call InfoGuard, and the National Heart, Lung, and Blood Institute, which, I had learned, was handling Freedom of Information Act requests for the National Institute on Aging, the agency that had funded the oxycodone study.

I received an email from the director of InvestiGuard, offering to discuss with me which documents I would like to see. I wrote back, asking whether she knew about my just-submitted official request, and I said I’d be happy to talk to her. She replied that she wasn’t aware of my public-records request, and that maybe we could wait to see if I had any questions “after you receive the protocol.”
Day 30: I received a letter from the NIA’s Freedom of Information Specialist, letting me know that they’d received my request (What happened to the NHLBI? I wondered), that they would have to check with the PI before releasing anything, and that they would charge me 10 cents a page for copying, except that the first 100 pages would be free. They would also charge me for any required research, at an hourly rate of $20, $41, or $74, with no explanation as to which rate would apply under which circumstances. They didn’t seek approval or payment in advance, which led me to wonder how much copying and research we taxpayers end up subsidizing for deadbeat public-information junkies. (A rough sense of how those charges could add up appears in the sketch after this timeline.)
Day 35: I received a postcard from InfoGuard, telling me that it would take about 15 days to comply with my request, unless they needed extra time to put together the documents or to “notify third parties,” in which case they would let me know that it would take longer.
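
Just for perspective, here is a rough sketch, in Python, of how the NIA’s fee schedule could add up. The only terms taken from their letter are the 10 cents a page after the first 100 free pages and the three hourly rates; the page count, the hours, and the labels I’ve attached to the rate tiers are hypothetical, since the letter never explained which rate applies when.

    # A hypothetical FOIA fee estimate based on the terms in the NIA's letter:
    # 10 cents a page after the first 100 free pages, plus research time billed
    # at $20, $41, or $74 an hour. The tier labels below are my own invention;
    # the letter gave no explanation of which rate applies under which circumstances.

    FREE_PAGES = 100
    COST_PER_PAGE = 0.10
    HOURLY_RATES = {"clerical": 20.0, "professional": 41.0, "managerial": 74.0}

    def estimate_foia_fee(pages, research_hours, rate_tier="professional"):
        """Return an estimated charge for a given page count and research time."""
        copying = max(0, pages - FREE_PAGES) * COST_PER_PAGE
        research = research_hours * HOURLY_RATES[rate_tier]
        return copying + research

    # Example (made-up numbers): a 250-page protocol plus two hours of search time.
    print(f"${estimate_foia_fee(pages=250, research_hours=2):.2f}")  # prints $97.00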

As I got bounced from one person to the next and experienced various stalling tactics—such as people offering to answer questions I had about the study as an alternative to showing me the protocol—I began to feel just like an obstinately curious child: If something was so off-limits, I was darned if I was going to let anyone keep me from it.

While waiting for my requests to be acted upon (and for the humongous bills I would surely get for the research and photocopying), I went back and reread the consent form I had signed for the oxycodone study. The law requires that the form be signed, but no one can compel you to read it, and apparently I hadn’t read that one very thoroughly. There really were a lot of details on the procedures I supposedly underwent, and I discovered that I had been deprived of one procedure: pulse oximetry. That's where they clip a device to your finger to measure the level of oxygen in your blood.

A pulse oximeter works by emitting light at two wavelengths and then measuring how much of it is absorbed by the hemoglobin in your blood. Apparently oxygen-poor and oxygen-rich hemoglobin each absorb light more strongly at different wavelengths. There are just a few circumstances that interfere with the magic by which this fingertip reading is obtained. Here’s one warning I found: “Nail varnish may cause falsely low readings. However the units are not affected by jaundice, dark skin or anaemia.”
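
Out of curiosity, here is a minimal sketch, in Python, of the “ratio of ratios” calculation that pulse oximeters are commonly described as performing. Nothing here comes from the study; the function, the signal values, and the calibration constants (110 and 25) are rough, illustrative textbook-style assumptions, not numbers from any real device.

    # A minimal sketch (not from the study) of the common "ratio of ratios" approach.
    # The calibration constants 110 and 25 are a rough, illustrative approximation.

    def estimate_spo2(red_ac, red_dc, ir_ac, ir_dc):
        """Estimate oxygen saturation (%) from red and infrared light readings.

        red_ac / ir_ac -- pulsatile (AC) component of each absorbed-light signal
        red_dc / ir_dc -- steady (DC) component of each signal
        """
        # Normalize each wavelength's pulsatile signal by its steady component,
        # then take the red-to-infrared ratio ("ratio of ratios").
        r = (red_ac / red_dc) / (ir_ac / ir_dc)
        # Empirical linear calibration: relatively more red light absorbed means
        # less oxygen-rich hemoglobin, hence a lower saturation estimate.
        spo2 = 110.0 - 25.0 * r
        return max(0.0, min(100.0, spo2))  # clamp to a physically plausible range

    # Example: a ratio near 0.5 corresponds to roughly 97-98 percent saturation.
    print(round(estimate_spo2(red_ac=0.008, red_dc=1.0, ir_ac=0.016, ir_dc=1.0), 1))

In other words, the clip isn’t measuring oxygen directly; it compares how the pulsing blood attenuates two colors of light and maps that ratio onto an empirical curve, which is why something like nail varnish on the fingertip can throw the reading off.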

The consent form clearly stated that I would be given a “small plastic clip” to place on my index finger, and that I could remove this clip when I needed to go to the bathroom. They did stick a gadget on my finger each time they took my blood pressure, but I thought that was just to get my pulse. I had an IV tube in my arm, so that they could draw blood ten times, but no semi-permanent oximeter. This was one more thing I could check for in the protocol, if I ever got to see it.


In 2002 the IOM Committee on Assessing the System for Protecting Human Research Participants recommended that the informed consent process become “an ongoing, interactive dialogue between research staff and research participants . . . ” rather than a static event or a piece of paper. In keeping with this recommendation, the committee proposed that the form be called “Consent Form” rather than “Informed Consent Form”; thus, I suppose, subjects could always claim that they were never fully informed of the risks of research. This recommendation seems unnecessary to me, because I’ve never seen anything but plain old Consent forms. This could be because the research community has adopted the committee’s proposal, but I suspect that people just think the words “Informed Consent Form” look and sound dumb.

The committee’s report, which was issued as a 300-page book (Federman, Hanna, & Rodriguez, 2003), also recommended extending the OHRP regulations to cover all research, regardless of funding source or setting. Many IRBs have already taken it upon themselves to oversee research that isn’t federally funded, but as far as I can tell, this is self-initiated mission creep rather than a response to the IOM report. No institution seems to have adopted the committee’s suggested name change: from Institutional Review Board to Research Ethics Review Board, or ERB.

Another recommended terminology change would have turned “subjects” into “participants.” This recommendation may actually have been implemented by a few researchers: A search of refereed publications at the Thomson Gale Health & Wellness Resource Center reveals that “research subject” was used about five times as often as “research participant” before October 2002; after that date the ratio was closer to 3 to 1. During the years 1999 to 2002, the two terms appeared in roughly the same proportions as during the three previous years, so the shift does appear to have occurred around the time of the report’s release rather than as part of a longer-term trend.

The committee had been charged by the Office for Human Research Protections with evaluating the existing system for protecting human subjects. At first it seemed odd to me that the resulting recommendations had apparently not been implemented in the regulations. Then I realized that the committee had been formed during the Clinton administration, and it had issued its report in October 2002, during George W. Bush’s second year in office. At that time, HHS was busy establishing the Center for Faith-Based and Community Initiatives, implementing the HIPAA privacy rule (more on that in chapter 21), and preparing a plan to respond to bioterrorism.

Even with so much on his agenda, HHS Secretary Tommy Thompson found time to dissolve the National Human Research Protections Advisory Committee, a body established by his Clinton-appointed predecessor, Donna Shalala. Shortly after that, the head of the OHRP resigned (he was replaced by a veterinarian, Bernard Schwetz). It may have been pressure from Democrats in Congress that persuaded Thompson a few months later to resurrect the committee with a new name: the Secretary’s Advisory Committee on Human Research Protections (SACHRP). He even appointed a few members of the old committee to the new one. SACHRP has quite a few subcommittees, including the Subcommittee on Federal Policy for the Protection of Human Subjects (Subpart A) and the Subcommittee on Issues Impacting Those with Impaired Decision-Making Capacity. SACHRP and its subcommittees meet three or four times a year.

It seems that every decade or so some incident or scandal, or some very vocal advocacy group, causes a committee to be appointed to study the protection of human subjects. Reports are issued, regulations are adopted, and then another incident reveals some loophole or omission in the regulations. As Kevin Gleeson, IRB Executive Chair at Penn State University, has pointed out, whenever a research regulatory body gets a flea, “1000 IRB’s scratch and create more restrictions for investigators.” The regulators, he writes, don’t consider “the risks to human subjects of not conducting human research.” The resulting IRB system, he argues, continues “neither protecting human subjects much nor building any momentum for change.” (Gleeson, 2004)

This reactive approach on the part of federal regulators is not unique to OHRP. Consider the defensive tactics employed by the U.S. Department of Homeland Security in the War on Terror. Someone put bomb-making materials in his shoe? OK, everyone remove your shoes and have them X-rayed. Someone used a liquid explosive? OK, no one can bring liquids on the plane. Like DHS, OHRP can live with the risk that a particular tragedy may occur—until it does occur. Then there will be a new assortment of burdensome and arguably ineffectual measures put in place to make sure that the exact same tragedy never occurs again. Meanwhile, the enemy is devising increasingly innovative and deadly techniques, which the regulators either haven’t imagined or think are unlikely to be implemented.

What OHRP and IRBs seem to have forgotten is that researchers are not the enemy.

