Episteme (Edinb). 2006 February 1; 2(3): 135–147.

Openness versus Secrecy in Scientific Research

Abstract

Openness is one of the most important principles in scientific inquiry, but there are many good reasons for maintaining secrecy in research, ranging from the desire to protect priority, credit, and intellectual property, to the need to safeguard the privacy of research participants or minimize threats to national or international security. This article examines the clash between openness and secrecy in science in light of some recent developments in information technology, business, and politics, and makes some practical suggestions for resolving conflicts between openness and secrecy.

“By academic freedom I understand the right to search for the truth and to publish and teach what one holds to be true. This right also implies a duty; one must not conceal any part of what one has recognized to be true. It is evident that any restriction of academic freedom serves to restrain the dissemination of knowledge, thereby impeding rational judgment and action.”

Albert Einstein, quotation inscribed on his statue in front of the National Academy of Sciences, Washington, DC.

Introduction

Openness is one of the most important principles of scientific research. It is necessary for achieving the goals of science and for enabling society to benefit from the results of research. It plays a key role in confirmation and collaboration, and it promotes innovation and discovery. Additionally, openness is important for holding scientists publicly accountable and for developing well-informed public policy. Even though openness is an essential part of the scientific ethos, it is not an absolute rule, because there are many good reasons for maintaining secrecy in scientific research, ranging from the desire to protect priority, credit, and intellectual property, to the need to minimize threats to national or international security. Thus, openness often conflicts with the demand for secrecy in science. This conflict is as old as science itself and will probably never end (Shamoo and Resnik 2003). This article examines the clash between openness and secrecy in science in light of some recent developments in information technology, business, and politics, and makes some practical suggestions for resolving conflicts between openness and secrecy in science.

A Very Brief History of Openness and Secrecy in Scientific Research

To have a better understanding of the conflict between openness and secrecy in science, it will be useful to review some history. Although openness currently enjoys an esteemed place in academic science, it has often stood on shaky ground. Moreover, while some important social institutions have supported openness, others have worked against it. Openness has been a part of the scientific attitude since ancient times. The sharing of information and ideas was important in ancient Greek philosophy and science, which emphasized free, open, and rational debate. Openness was also a key pillar in the emergence of the first universities during the 12th century. Universities were formed to translate, share, and contemplate ancient texts discovered in Islamic libraries during the Crusades (Burke 1995).

Although free and open inquiry became part of university life, other sectors of the knowledge economy did not have as much of an appreciation for openness. Since antiquity, artisans and craftsmen have guarded secrets concerning their methods and materials. To share in those secrets, one needed to serve as an apprentice for many years to learn the tricks of the trade. During the 1400s, many people in England were concerned about the negative effects of trade secrecy on commerce and industry. In 1449, King Henry VI issued the first known patent to John of Utynam for a method for making stained glass (Resnik 2003). In 1641, the Massachusetts Bay Colony issued the first Colonial patent for a method for making salt. The idea behind a patent—the “patent bargain”—is that government allows inventors to control the commercialization and use of their inventions for a limited time in exchange for showing other people how to make those inventions.

By the end of the 1700s, Thomas Jefferson, Benjamin Franklin, and other founding fathers of the United States (U.S.) understood the importance of patents in promoting scientific and technical innovation. Accordingly, they included a clause in the United States Constitution that gives Congress the authority to grant patents: “Congress shall have the power…to promote the Progress of Science and useful Arts, by securing for limited Times to Authors and Inventors the exclusive Right to their respective Writings and Discoveries (United States Constitution 1787, Article 1, Section 8, Clause 8).” In 1790, the U.S. adopted the Patent Act, which established rules for patenting and a patent office. Most of the countries in the world now have some type of patent system (Resnik 2003). Patents are privileges granted by the government to provide incentives for inventors and investors and encourage disclosure of scientific and technical information. Patents, which last for 20 years in most countries, apply to inventions that are original, non-obvious, and useful. The patent application is disclosed to the public when the patent is awarded (Resnik 2003).[1]

Scientific journals are other social institutions that help to promote openness in science. Throughout history, scientists have been concerned that their ideas would be stolen. Leonardo da Vinci wrote his notes in mirror-writing to prevent intellectual theft. Isaac Newton, Tycho Brahe, Johannes Kepler, Galileo, Charles Darwin and other great scientists were also concerned with protecting their work. In 1665, the Royal Society of London established the first scientific journal, the Philosophical Transactions of the Royal Society of London. The journal was established to encourage the rapid and free exchange of ideas among scientists. Publication in the journal could also help to establish credit for scientific discoveries. Prior to the advent of journals, scientists could publish their ideas in books, but journals were a more convenient method of publication. Soon, journals also started using scientific peers to review submitted manuscripts, to ensure the quality and reliability of published research. Today, there are thousands of different scientific journals pertaining to many different disciplines and specialties. Credit for scientific discoveries is usually based on publication in a peer reviewed journal: the first person to publish an idea in a peer reviewed journal is usually the person who receives credit for the idea (Shamoo and Resnik 2003).

Although patents and scientific journals have promoted openness, other institutions have had the opposite effect. Prior to the 1800s, private corporations had very little involvement with scientific research. Although the collaboration between James Watt and Matthew Boulton in the refinement and production of the steam engine stands out as a salient example of what can be accomplished when scientists and entrepreneurs work together, industrial science did not emerge until the 1850s, when German companies hired chemists to develop synthetic dyes. By the early 1900s, many different industries employed scientists and engineers. In the U.S., private industry supported more research than the federal government. After World War II, government funding caught up with private funding, but that has changed since the 1980s, due to heavy private investment in pharmaceuticals, biotechnology, electronics and computers, and, recently, nanotechnology. Today, private industry funds approximately 60% of all research and development in the U.S. (Shamoo and Resnik 2003).[2]

Although openness has considerable influence over academic research, secrecy rules industrial research. Companies treat scientific research as trade secrets, and usually share data and results only to meet legal requirements or achieve financial goals. For example, a company will disclose research to obtain approval for its product from a regulatory agency, or to market its product to the public. Companies use knowledge for competitive advantage, and treat information as proprietary. If a scientist works for a company, he or she usually signs a non-disclosure agreement (NDA), which obligates him or her not to disclose any information obtained during employment without the company’s permission. Companies also assert control over intellectual property generated during research, such as inventions, manuscripts, graphics, and computer programs. Academic researchers who receive contracts or grants from private companies usually also must sign NDAs. Some academic researchers have signed agreements that allow companies to prevent the publication of results (Resnik In Press; Krimsky 2003).

The military is another social institution that practices secret scientific research.[3] Scientists have provided advice and assistance to military leaders for many years. For example, the Greek scientist Archimedes invented machines and developed strategies that helped the citizens of Syracuse defend against the Roman army, and Isaac Newton provided advice to English generals. Although military leaders have long recognized the value of scientific and technical knowledge, scientific discoveries did not have a major influence on warfare until the 20th century. During World War I, propeller airplanes, mustard gas, gas masks, and other inventions played a key role in tactics. During World War II, radar, large airplanes, missiles, secret codes, computers, and the atomic bomb helped to decide the outcome of the conflict. Today, technological superiority is probably the most important asset for achieving military victory. The U.S. government spends more on military and national security research than all other types of research combined (Kintisch and Mervis 2006).

Secrecy is important in military research to promote national security and achieve tactical and strategic advantages. Research that is funded by the military or conducted in military laboratories is usually classified. To gain access to classified information, one must undergo a comprehensive background check and accept restrictions on privacy and freedom of expression and association. Classified information is not made available to the public but is distributed on a need-to-know basis. Classified information can be declassified when an appropriate authority determines that secrecy is no longer needed, and the government can request that non-classified research that poses a threat to national security be classified (Resnik and Shamoo 2005). The government generally controls all intellectual property related to military research.

Ethical and legal protections for human research subjects are a more recent development with important implications for openness and secrecy in scientific research. Prior to World War II, there were no laws or widely accepted ethical guidelines concerning research on human subjects. As a result of the atrocities committed by Nazi scientists and physicians, the world community adopted an international standard for research with human subjects known as the Nuremberg Code (Shamoo and Resnik 2003). Although the Code did not mention the obligation to protect the confidentiality of research subjects, other codes, such as the Declaration of Helsinki, as well as research regulations, such as The Common Rule (45 C.F.R. 46), make protection of confidentiality a fundamental requirement for research with human subjects. Protection of confidentiality during medical research is also required by the Privacy Rule of the Health Insurance Portability and Accountability Act (HIPAA) (Shamoo and Resnik 2003). Taken together, these rules and guidelines imply that researchers have an obligation not to disclose information that can be used to identify research subjects, unless they have permission from the subjects.

Justifying Openness and Secrecy in Science

From this brief sketch of the history of openness and secrecy in research one can see that these two norms have coexisted since the dawn of science and have often clashed. To decide how to resolve conflicts between openness and secrecy in research, it will be useful to examine their philosophical foundations. What justifies openness in science? What justifies secrecy? To answer these questions, it will be worth reflecting on the nature of these norms. Openness is a norm (or more precisely, a rule) for sharing information. Information sharing can take place in many different contexts, such as government, business, and litigation. Open-government rules, such as the Freedom of Information Act (FOIA) and open meeting laws, mandate the sharing of information. Financial disclosure laws pertaining to the trade of stocks and bonds also require the sharing of information. In scientific research, openness entails not only the sharing of information but also the sharing of some of the means needed to understand, validate, and apply information, such as data, results, methods, and tools. In the U.S., federal granting agencies, such as the National Institutes of Health (NIH) and National Science Foundation (NSF), have adopted regulations that require grantees to share data, results, methods, and tools (Shamoo and Resnik 2003).

To have a better understanding of the justification of openness in science, it will be useful to consider information sharing as an aspect of social epistemology. Traditional (or individual) epistemology studies concepts and principles of knowledge, such as justification, belief, evidence, warrant, and so on, from the perspective of an individual inquirer. The problem of skepticism and the problem of defining ‘knowledge’ are two of the key issues associated with traditional epistemology. Social epistemology studies concepts and principles of knowledge from the perspective of a community of inquirers. Social epistemology examines concepts and principles governing the social interactions among people who share common epistemic goals, such as the pursuit of truth, the avoidance of error, explanation, and understanding. Some of the topics of social epistemology include the reliability of testimony, the division of cognitive labor, restrictions on speech, publication and peer review, rules of evidence in the law, and public education (Goldman 1999, Kitcher 2001). One of the important insights of social epistemology is that rules governing a community of inquirers are subject to practical, ethical, legal and political constraints and qualifications. That is, there are often tradeoffs among epistemic values and non-epistemic ones. For example, rules of legal evidence are a compromise among the epistemic goal of seeking the truth about a particular legal matter, other goals of the legal system, such as protecting the defendant’s rights to avoid self-incrimination and unreasonable searches and seizures, and the practical goal of efficiency (Goldman 1999). Debates about how society should allocate funding for scientific research usually involve compromises among many different values, including the quest for knowledge for its own sake, promoting public health, and other political or social goals (Kitcher 2001, Dresser 2001). Government agencies set their funding priorities based on the scientific and practical significance of research proposals.

Openness can be understood in terms of the social epistemology of scientific research (Resnik 1996). The principle instructs inquirers to share information, which helps to promote the epistemological goals of the group, such as truth, avoidance of error, knowledge, and explanation. There are several ways that openness promotes the goals of science. First, openness enhances the productivity and efficiency of research, since it allows people to rely on the work of others and saves people from needing to repeat scientific studies (Munthe and Welin 1996). A scientist who reads about a study in a journal can use the data, results, and methods from the study in his or her own work. As Newton once said, “If I have seen further, it is by standing on the shoulders of giants (Newton 1676).” Second, openness is essential to the confirmation (or validation or corroboration) of hypotheses, theories, data, and results. Scientific knowledge is public knowledge: before a theory, hypothesis, or concept is accepted, it must be examined, critiqued, tested, or reviewed by other scientists. If a result is not reproducible, then it is usually rejected. Without a strong commitment to openness in science, publication and peer review would be ineffective (Merton 1973). Third, openness helps to promote trust among scientists by fostering cooperation and collaboration. Since science is a social activity, trust is very important. Most scientific activities could not take place without a high degree of trust among scientists. Fourth, openness can promote creativity and innovation in science by exposing scientists to different ideas, theories, and concepts. New ideas are the fuel that stimulates intellectual growth and development (Shamoo and Resnik 2003).

Openness is also important for non-epistemic reasons. First, openness helps to secure the public’s trust and support. Most scientific disciplines would lack the resources needed to achieve their goals without strong public support. By sharing information with the public, scientists can demonstrate that their work deserves public support. Second, openness is important in holding scientists accountable for their use of public funds. Secrecy is often used to hide corruption, abuse of power, and misconduct. Openness can expose unethical and illegal activities to outside scrutiny. Third, by sharing information with the public, scientists can fulfill their moral responsibilities to society. Scientists have considerable knowledge, expertise, authority, and autonomy in society. With that power comes responsibility: scientists have a moral obligation to use their talents and abilities to prevent harmful consequences and promote good ones (Shamoo and Resnik 2003). One way that scientists can fulfill this obligation is to share their knowledge and expertise with the public, which can be valuable in informing public policy and public opinion.

Although openness helps to promote the goals of science, secrecy is also sometimes important in scientific research. First, secrecy is important for helping scientists to receive proper credit for their work. Scientists want—and deserve—proper credit for their work, so that they can receive the rewards of research, such as respect, prestige, money, and intellectual property. Secrecy is necessary in research to enable scientists to prepare scientific presentations, publications, or patent applications. Although science is a social activity that requires cooperation and collaboration, science is conducted by individuals with their own interests. Scientific norms must maintain the proper balance between cooperation and self-interest in research (Hull 1988). When a scientist submits his or her work for publication, editors and reviewers have an ethical obligation to maintain confidentiality and to protect the scientist’s interests. Second, secrecy is important when confidentiality is needed to protect scientists (or future scientists) from harm, stigma, harassment, or embarrassment related to their work. For example, educational records, personnel records, disciplinary hearings, grievance proceedings, misconduct investigations, and other matters concerning sensitive personal and professional relationships among scientists should be kept secret. Additionally, secrecy is important for non-epistemic reasons, which were mentioned earlier, such as protecting the rights and welfare of human research subjects, trade secrets and proprietary business information, and national and international security (Shamoo and Resnik 2003).

Resolving Conflicts between Openness and Secrecy

Given the preceding account of the justification for openness and secrecy, we can understand conflicts between these principles in two ways: a conflict may be internal to science or it may be external to science. If a conflict is internal to science, then the best way to resolve it is to determine which policy (openness or secrecy) best serves the aims of science in a given situation. For example, one of the current controversies in the ethics of publication is whether reviewers should be anonymous (Rennie 2003). Openness favors unmasking reviewers, while secrecy favors the opposite tactic. Those in favor of unmasking argue that this will make reviewers more careful, responsible, and ethical, because authors will know who they are. Openness can make reviewers more accountable. Those in favor of masking argue that reviewer anonymity is necessary to avoid bias, since it allows reviewers to give honest, critical feedback without the fear of repercussion. How should scientists resolve this question? Since this is largely a question concerning the most effective way to review scientific articles, empirical research into this issue would be very useful in settling the debate. It would be useful to know whether there is a difference between open and anonymous peer review, whether people would refuse to serve as reviewers unless they could be guaranteed anonymity, etc. Some journals have begun to conduct research that addresses these and other issues, and some have experimented with open peer review and compared it to anonymous review in controlled trials (Godlee 2002). Empirical research has limitations, of course. For example, studies may yield results that are inconclusive and different studies may support opposing hypotheses. Moreover, different policy options may be supported by different scientific goals, which may conflict. Nevertheless, empirical research should be sought and considered in deciding how to best promote the aims of science (Laudan 1984).

For another example of a dispute mostly internal to science, consider database access policies.[4] A great deal of contemporary scientific research involves the development of large electronic databases for analysis and interpretation. The internet, increases in computing speed and power, and the development of sophisticated statistical software have made it possible to search for patterns in large databases and to compare databases. Genomics, proteomics, epidemiology, astrophysics, geology, neuroscience, and other disciplines fit this new “data-driven” model. However, the development of these electronic databases has created a potential conflict between those who gather data and those who analyze and interpret it. To promote openness in science, many government agencies that support research have required scientists to deposit their data in publicly available archives, even before they have published any articles related to the data. Also, many journals now require that authors make their data available to the public as a condition of publication (Marshall 2002). These two requirements have, at times, interfered with scientists’ ability to receive proper credit for their work. Sometimes scientists have been scooped by researchers outside their group who have published articles that analyze and interpret data available on public databases. In some instances, outside researchers have published articles before the researchers who gathered the data have been able to publish any articles pertaining to the data. In other instances, scientists have published an article or two from a database, expecting to publish several more, only to watch other scientists publish articles from that database on topics that they were planning to cover in their work. Many scientists have protested these policies and have argued that outside researchers should have to obtain permission before publishing articles from databases that they have made available to the public. Empirical research into specific questions relating to this dispute could be useful in deciding the best policy concerning access to databases. It would be important to know how many scientists are concerned about getting scooped by someone using their data, how this might affect their research practices, how rules pertaining to access to data would affect the conduct of research, and so on.

If the conflict is a quarrel external to science, it may be much more difficult to resolve than simply determining which policy best serves the aims of science in a given situation, since the conflict may involve a trade-off among epistemic values and non-epistemic ones. Many different moral and political issues involve conflicts among competing values (goals, rights, or principles). For example, in the abortion debate, the woman’s right to control her own body conflicts with the value of protecting the life of the fetus; in controversies relating to land use, the goal of economic development conflicts with the goal of preserving the natural environment, and so on. While there are usually no easy solutions to these problems, philosophers and ethicists have developed methods of moral reasoning for making personal and social choices. Most of these methods involve a series of steps, such as defining the problem, gathering relevant information, identifying options, and weighing and balancing competing values (Fox and DeMarco 2000, Beauchamp and Childress 2001). Empirical research can help to address moral dilemmas, but it cannot solve them, since moral dilemmas involve conflicts among competing values.[5]

Secrecy vs. Openness: Three Case Studies

The remainder of this article will consider three contemporary problems involving conflicts between openness and secrecy in science.

Research Involving Human Subjects

The first case pertains to research involving human subjects. Ethical and legal rules require that researchers refrain from disclosing confidential information about human subjects, unless they obtain permission from the subject or the subject’s representative (Coleman et al 2006). Scientists usually do not ask research subjects for permission to disclose confidential information in publications or reports. Instead, they usually remove information that can be used to identify research subjects from the data that is published or otherwise disseminated, such as name, address, social security number, and so on. Most of the time, de-identification of data has virtually no effect on scientific utility. However, sometimes it may be necessary to remove useful demographic information, such as sex, age, income, and race or ethnicity, to avoid identification of subjects, since an outside party might be able to use this information to identify individuals by searching and analyzing different databases (Kaiser 2002). But removing demographic information can decrease the value of the data to researchers, since it may be useful to study relationships between demographic variables and mortality, disease incidence, and so on. So, the de-identification of data pertaining to research involving human subjects can generate a conflict between scientific goals, which favor openness, and non-scientific ones, which favor secrecy. Or, to put the matter in slightly different terms, it exemplifies the perennial conflict in research on human subjects between producing socially valuable results and protecting human rights (Levine 1988).

How should society resolve this conflict between openness and secrecy? Those who value the protection of human rights more than the advancement of research would argue that demographic information should not be released without permission if there is any chance that someone could identify research subjects, while those who value research more than the protection of rights would argue for releasing all demographic information. Can one forge a reasonable compromise between these opposing views? More information concerning this dilemma would be helpful. It would be useful to know whether it is practical to ask the subjects for their permission to release data that could identify them. If contacting the subjects is not practical, that option is ruled out. Even if subjects can be contacted, if most of them do not give their permission, this could bias the research results, which would also eliminate this option. Furthermore, it would be useful to know how easy it would be for someone to use some of the demographic information to identify research subjects. An expert in statistics and electronic database analysis could determine whether different types of demographic information pose different risks to human subjects if disclosed. It might be the case that some demographic information could be released without placing the confidentiality of research subjects at significant risk. Information could be released only if there is a very small chance (e.g. 5% or less) that it could be used to identify an individual (Department of Health and Human Services 2004). This option would help to advance scientific research without compromising the rights of research subjects.
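To see how such a threshold might be operationalized, consider a uniqueness-based risk estimate in the spirit of k-anonymity: a record is risky to the extent that few other records share its exact combination of demographic values. The Python sketch below is purely illustrative; the sample records, the choice of quasi-identifiers, and the use of the smallest group as the worst case are assumptions made for the example, not part of HIPAA or the cited guidance.

    from collections import Counter

    # Hypothetical de-identified records: demographic quasi-identifiers only.
    # Direct identifiers (name, address, social security number) are assumed
    # to have been removed already.
    records = [
        ("F", "40-49", "White"),
        ("F", "40-49", "White"),
        ("M", "40-49", "Black"),
        ("M", "70-79", "Asian"),   # unique combination: highest risk
        ("F", "20-29", "White"),
        ("F", "20-29", "White"),
    ]

    def worst_case_risk(rows):
        # A record's re-identification risk is taken to be 1/k, where k is
        # the number of records sharing its exact combination of values;
        # the worst case is set by the smallest such group.
        group_sizes = Counter(rows)
        return 1.0 / min(group_sizes.values())

    THRESHOLD = 0.05  # "very small chance" operationalized as 5% or less

    risk = worst_case_risk(records)
    print("worst-case re-identification risk: {:.0%}".format(risk))
    if risk <= THRESHOLD:
        print("demographic fields could be released under this criterion")
    else:
        print("coarsen or suppress fields (e.g. widen age bands) and re-check")

On this toy dataset the unique ("M", "70-79", "Asian") record yields a worst-case risk of 100%, so the fields would have to be coarsened or suppressed before release; widening the age bands until every combination is shared by at least 20 records would bring the worst case down to the 5% threshold.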

Trade Secrets in Biomedical Research

The second case concerns trade secrets in biomedical research. In the U.S., pharmaceutical and biotechnology companies that want to obtain approval for a new drug, biologic, or medical device must submit data from clinical trials on human subjects to the Food and Drug Administration (FDA). The FDA will approve the new product if it determines that the benefits of the product outweigh its risks. The FDA also requires companies to submit data from clinical studies conducted after their products have been approved (Angell 2004). Companies treat all aspects of their research, including the protocol, informed consent documents, investigator’s brochure, data, interim analyses, and results, as trade secrets (Krimsky 2003). Trade secrecy law protects business information that 1) is not public; 2) is protected by efforts made by the company; and 3) has business value. Trade secrets encourage investment in research and product development and promote competition and economic growth (Foster and Shook 1993). When companies submit their data to the FDA, the FDA treats the data as a trade secret and does not disclose it without permission. Although companies are not required to publish their research, they often do, to generate support for their products.

Because companies are not required to publish their research, it is possible for a company to keep important information about its product from the public. Since companies are in the business of selling their biomedical products, they often seek to avoid the publication of information that portrays their products in a negative light. In some cases, companies have prevented (or have tried to prevent) researchers from disclosing or publishing information about the risks of their products (Angell 2004). For example, in 1998, Apotex took legal action against University of Toronto researcher Nancy Olivieri and tried to have her employment at the university terminated for warning patients and doctors about risks associated with its drug deferiprone, which she had been studying under a contract with Apotex. In 1995, Microfibres tried to stop occupational health physician David Kern from publishing his research on the causes of respiratory problems among workers at its factory (Resnik In Press). The company had hired Kern to prepare a report on these problems.

Some companies have known about problems with their products for several years without disclosing them to the public. In May 1999, the FDA approved Vioxx as a treatment for pain. Vioxx’s manufacturer, Merck, sponsored a post-marketing study of the drug, known under the acronym VIGOR, which compared Vioxx to several other analgesics. In 2000, Merck reported the results of the study to the FDA. The data showed that Vioxx users had five times more heart attacks than users of the other medications. Merck also published an article on the VIGOR study that reported the gastrointestinal benefits of Vioxx but did not report its cardiovascular risks (Angell 2004). The FDA sent a warning letter to Merck in 2001 stating that the company misrepresented Vioxx’s safety profile. The following year, the FDA required Merck to change the warning label on Vioxx to include cardiovascular risks. Vioxx’s worldwide sales soared to $2.5 billion, despite the warnings. In 2004, Merck took Vioxx off the market after another study showed that Vioxx doubled the risk of a stroke or heart attack if taken for more than 18 months. In 2005, Merck began defending itself against thousands of lawsuits from Vioxx users and their families (Resnik In Press).

How should society resolve the conflict between openness and secrecy that occurs in privately funded biomedical research? Openness favors full disclosure of data and results, while secrecy opposes disclosure. Is it possible to reach some satisfactory compromise here? One way toward a satisfactory solution is to recognize the importance of trade secrecy but to forbid secrecy in some situations. Individual rights—and by extension corporate rights—do not include the right to harm others. The right to free speech does not include a right to yell “fire” in a crowded movie theater, since this could cause great harm to others (Feinberg 1987). Likewise, the right to secrecy does not include the right to harm others with one’s secrets. So, companies should not be allowed to keep secrets that harm others, and employees or agents of companies that keep secrets should not be penalized for divulging secrets to promote public health and safety. But how can one know whether a particular secret kept by a company is likely to harm other people? Determining the risk profile of a particular drug usually requires time and careful analysis, since one must ensure that the results are statistically significant. If 10 people have taken a new drug and 1 person has suffered a stroke, it may not be immediately apparent whether the drug caused the stroke, since 10 cases is a small sample. One may not know whether a drug significantly increases the risk of stroke until hundreds of people have taken the drug.
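The arithmetic behind this point can be made explicit with an exact binomial test, which asks how likely the observed number of adverse events would be if the drug added no risk at all. In the Python sketch below, the 2% background stroke rate is a hypothetical figure chosen purely for illustration:

    from math import comb

    def p_value(n, k, p0):
        # One-sided exact binomial test: the probability of seeing k or more
        # events among n patients if the true event rate is the baseline p0.
        return sum(comb(n, i) * p0**i * (1 - p0)**(n - i)
                   for i in range(k, n + 1))

    baseline = 0.02  # assumed background stroke rate (hypothetical)

    # 1 stroke among 10 patients: consistent with chance
    print("n=10:  p = {:.3f}".format(p_value(10, 1, baseline)))    # about 0.18

    # the same 10% observed rate among 500 patients: a clear signal
    print("n=500: p = {:.1e}".format(p_value(500, 50, baseline)))  # far below 0.05

With only 10 patients, at least one stroke would occur by chance alone about 18% of the time, so the data cannot distinguish the drug from the background rate; the same observed rate in a much larger trial would be overwhelming evidence of harm.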

One proposal that may make it more difficult for private companies to keep secrets that are harmful to human health is to require that all clinical trials be registered in a publicly accessible database (Angell 2004). The International Committee of Medical Journal Editors (ICMJE) now requires that clinical trials be entered into a publicly accessible registry as a condition of publication (DeAngelis et al 2004). The clinical trial registry should contain data concerning the clinical trial, including, “a statement of the intervention (or interventions) and comparison (or comparisons) studied, a statement of the study hypothesis, definitions of the primary and secondary outcome measures, eligibility criteria, key trial dates (registration date, anticipated or actual start date, anticipated or actual date of last follow-up, planned or actual date of closure to data entry, and date trial data considered complete), target number of subjects, funding source, and contact information for the principal investigator (DeAngelis et al 2004, p. 1250).” The U.S. National Library of Medicine has established a registration site, www.clinicaltrials.gov, which includes over 22,000 clinical trials (Zarin et al 2005).

Although clinical trial registration as a condition of publication can certainly provide some valuable information to health researchers and the public, it does not completely solve the problem of secret biomedical research. The most important shortcoming of this policy is that it does not apply to research that companies do not plan to publish. Companies do not always publish the results of their clinical trials. Although they eagerly publish favorable results, they often do not publish papers with unfavorable or inconclusive results (Giles 2006). The clinical trial registration system will help researchers track down unpublished data only when companies publish some of the data. If a company does not plan to publish any of its data or results, its research will remain under the radar. A second problem with clinical trial registration is that it only covers the journals that belong to the ICMJE. Although the ICMJE includes hundreds of the world’s most prestigious biomedical journals, there are many journals that do not belong to that organization. A company would still be able to use a paper published in an obscure journal for marketing purposes, since most patients and many physicians will not know whether a journal is prestigious or belongs to the ICMJE. A third shortcoming of this system is that it does not include outcome data from clinical trials. The registry provides data on what types of outcomes are being measured, but not information about the actual health outcomes. This is a very serious deficiency, since there is no way to determine whether a new treatment is safe or effective if one does not know how it affects the patient’s health. To obtain outcome data, a health researcher (or clinician) would need to ask the company to provide him or her with the data, but the company would have no obligation to do this.

So what solution would work? To put an end to secret clinical trials involving human subjects, the following policies should be adopted:

  1. All clinical trials should be registered in a clinical trial registry, regardless of the source of funding or any plans to publish the data.
  2. Clinical trial registries should include outcome data, so that health researchers and clinicians can analyze and interpret them. Outcome data should be submitted to the registry within three months after the completion of the trial, to give the sponsor of the trial a chance to conduct its own analysis and interpretation and correct any problems with the data. The sponsor may avoid submitting data (or request that data be withdrawn) if the trial is found to have flaws in its design, methodology, or implementation.
  3. Clinical trials in the registry should be linked to publications that report the data in scientific journals.

There may be some flaws with these proposals as well, but they are an improvement on the status quo.
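To make the proposals concrete, a minimal registry record might look like the Python sketch below. The field names track the ICMJE list quoted above, with added slots for outcome data and linked publications; this is an illustrative data structure, not the actual ClinicalTrials.gov schema.

    from dataclasses import dataclass, field
    from typing import Optional

    @dataclass
    class TrialRegistration:
        trial_id: str                        # registry-assigned identifier
        interventions: list[str]             # intervention(s) and comparison(s) studied
        hypothesis: str                      # statement of the study hypothesis
        primary_outcomes: list[str]          # primary outcome measure definitions
        secondary_outcomes: list[str]        # secondary outcome measure definitions
        eligibility_criteria: str
        key_dates: dict                      # registration, start, last follow-up, closure
        target_enrollment: int               # target number of subjects
        funding_source: str                  # proposal 1: required whatever the sponsor
        principal_investigator: str          # contact information
        outcome_data: Optional[dict] = None  # proposal 2: due within 3 months of completion
        publications: list[str] = field(default_factory=list)  # proposal 3: linked articles

Under proposal 2, a record whose closure date has passed by more than three months while outcome_data remains empty could be automatically flagged by the registry for follow-up.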

National and International Security

The third and final case involves questions about national and international security in the wake of the threat of terrorism. Terrorist groups, such as Al-Qaeda, have publicly declared their interest in acquiring biological, chemical, and nuclear weapons, and using them against military and civilian targets. Terrorists have also sought to disrupt or destroy bridges, nuclear reactors, power grids, oil pipelines, and other important parts of society’s infrastructure. In response to the threat of terrorism, the U.S. has adopted laws, such as the Patriot Act and the Public Health Security and Bioterrorism Preparedness and Response Act, which strengthen governmental power related to surveillance and espionage and restrict access to dangerous biological and chemical agents (Resnik and Shamoo 2005). From 2001 to 2002, several prominent journals published articles that could provide terrorists with information for making biological weapons. One of the articles described a method for overcoming the human immune system’s defense against smallpox (Rosengard et al 2002). In the U.S., politicians expressed grave concerns about the publication of these articles, and the National Academy of Sciences and the American Society for Microbiology (ASM) held a meeting in January 2003 to discuss issues related to control of biological information in the age of terrorism. The ASM adopted a policy of adding an additional layer of review for articles that pose security risks, and other journals followed suit. In 2003, the National Research Council (NRC) published a report on security issues related to biotechnology (NRC 2003). In 2005, the Proceedings of the National Academy of Sciences (NAS) published an article discussing the vulnerability of the milk supply to botulinum toxin. The authors estimated that a third of an ounce of botulinum toxin poured into a milk truck on its way to a processing plant could result in thousands of deaths and billions of dollars in economic losses (Wein and Liu 2005). The Department of Health and Human Services asked the NAS not to publish the paper, because of its security risk, but the NAS decided that the good consequences of publication outweighed its potential bad effects (Weiss 2005).

The controversies surrounding these publications boil down to a conflict between openness and security. The basic choice that society faces is between some restrictions on publication and no restrictions at all. The arguments in favor of some restrictions are compelling and emotionally charged: scientists should not disclose information that can be used by terrorists because scientists have an obligation not to cause harm to society. Additionally, the government is justified in restricting the disclosure of scientific information to protect the public from harmful uses of that information (Resnik and Shamoo 2005). However, the arguments in favor of no restrictions (i.e. complete openness) are also compelling. First, as noted elsewhere in this article, openness is an important part of the ethos of science. Second, most information in biomedicine and biotechnology has a dual use: knowledge of how to increase the virulence of a pathogen could be useful in making bioweapons or in developing vaccines or preparing a defense against bioweapons (NRC 2003). Third, scientists, like other citizens, have a right to freedom of speech, and restrictions on publication violate this right. In the U.S., freedom of speech is protected by the Constitution.

How should society deal with this issue? To bolster the case for some restrictions, one should note that society already accepts some restrictions on freedom of speech. It is illegal to incite a riot, give an order to kill someone, lie under oath, commit fraud, or yell “fire!” in a crowded movie theater. Freedom of speech is not an absolute, unqualified right. Speech can be restricted for important purposes, such as protecting people from harm. However, since freedom of speech is a very important right, prohibitions should be the least restrictive means necessary to protect people or society. Restrictions on speech should be specific and clear, not general and vague, because general and vague restrictions could deter speech that does not threaten people or society (Feinberg 1987; Austin v. Michigan Chamber of Commerce 1990). When the government restricts speech, it should take great care not to exert a chilling effect on free discussion, inquiry, and debate.

Although scientific journals are not branches of the government, the arguments concerning governmental restrictions on speech also apply to restrictions on the publication of scientific papers. One could argue that journals could impose some restrictions on scientific publication to protect people or society from harm, provided that those restrictions are the least restrictive means necessary to accomplish this task. If an article poses a significant threat to people or society, there is a range of options, from blocking publication of the article, to publishing the article in an obscure journal, to publishing parts of the article, to publishing the entire article. In deciding whether to publish a potentially dangerous article, reviewers and editors must evaluate each option in light of the relevant facts and the competing moral values at stake (e.g. freedom of speech and the progress of science vs. protecting people and society from harm). They should conduct a thorough assessment of: 1) the usefulness of the publication to terrorists or other parties with violent or criminal intentions; 2) the importance of the research for science or society; 3) the ability to control dissemination of the information; and 4) the chilling effect of restricting publication.[6] In general, journals should consider blocking publication (the most extreme option) only when 1) the information in the article would be very useful to terrorists or other parties with violent or criminal intentions; 2) the research is not very important to science or society (e.g. it does not involve basic science or clinically valuable research); 3) the ability to control dissemination is high (e.g. if the article is not published, the information will not be disseminated by other means); and 4) there would be only a minimal chilling effect from stopping publication. When some of the preceding conditions are not met, journals should consider options that are not as extreme as blocking publication.[7]
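At the cost of considerable simplification, the four-part test can be summarized as a conjunction: blocking is on the table only when all four conditions hold at once. The Python sketch below is a gloss on the text, treating each factor as a simple yes-or-no judgment, whereas real editorial deliberation is a matter of degree and of weighing competing values.

    def consider_blocking(useful_to_terrorists, important_to_science_or_society,
                          dissemination_controllable, chilling_effect_minimal):
        # Blocking publication, the most extreme option, is warranted only
        # when all four conditions in the text hold at the same time.
        return (useful_to_terrorists
                and not important_to_science_or_society
                and dissemination_controllable
                and chilling_effect_minimal)

    # Dual-use but scientifically important work fails the test, so a less
    # extreme option (e.g. publishing only parts of the article) applies.
    print(consider_blocking(True, True, True, True))   # False
    print(consider_blocking(True, False, True, True))  # True

If any condition fails, the less restrictive options discussed above remain in play.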

Conclusion

While openness is undoubtedly a key pillar in the ethics of research, there are a variety of reasons for not sharing scientific knowledge, ranging from the desire to safeguard priority and intellectual property, to the need to protect the privacy of human subjects, to the importance of maintaining the confidentiality of trade secrets and of information bearing on national or international security. Conflicts between openness and secrecy may be internal to science or external to science. When a conflict is internal to science, the optimal way to resolve it is to determine which policy best serves the aims of science in a particular situation. When a conflict is external to science, one must make trade-offs among epistemic values, such as seeking truth or understanding nature, and non-epistemic ones, such as promoting economic development or protecting human welfare or human rights. To deal with these conflicts, one should define the problem, gather relevant information, identify options, and weigh and balance competing values.

Acknowledgments

This paper is based, in part, on presentations I made at the Mini-Conference on Secrecy at the Pacific Division Meetings of the American Philosophical Association, Portland, Oregon, March 22, 2006 and the Frontis Workshop on Ethics in the Life Sciences, Wageningen University, Wageningen, The Netherlands, May 19, 2003. The paper was supported by the intramural program of the National Institute of Environmental Health Sciences, National Institutes of Health. It does not represent the views of the National Institute of Environmental Health Sciences or the National Institutes of Health.

Footnotes

[1] I am assuming that intellectual property (patents, copyrights, trademarks, and trade secrets) is morally justifiable. Many scholars and policy analysts have questioned the justification of intellectual property. Some have argued against any type of intellectual property, while others have argued for a radical reform of the current system, especially a revision of intellectual property pertaining to biotechnology. Elsewhere, I have argued that intellectual property can be justified on the grounds that it promotes the progress of science, technology, and the arts; and that it protects autonomy and privacy (Resnik 2003). For further discussion of these issues, see Resnik (2004) and Angell (2004).

[2] Some would dispute this percentage on the grounds that private industry inflates its estimates of the amount of money that it spends on R & D. See Angell 2004.

[3] I am also assuming that some type of secrecy is justified with respect to research pertaining to the military and security issues. Pacifists would dispute this assumption on the grounds that military force should never be used. For further discussion of the justification of military research, see Resnik (1998).

[4] I say “mostly internal to science” because some private corporations that conduct research have also objected to data access policies on the grounds that the policies undermine their proprietary interests.

[5] By “competing” I mean opposing and incommensurable. Such values may have very different implications for decision-making and cannot be compared by means of a common metric or formula.

[6] The same considerations would also apply to other forms of public dissemination, such as presentations at scientific meetings, posting information on websites, etc.

[7] There are some practical issues that will not be discussed here, such as establishing policies and procedures for reviewing potentially dangerous scientific research. One might argue that special committees or panels should be established to assist journals in making publication decisions (Resnik and Shamoo 2005).

References

  • Angell M. The Truth about Drug Companies. New York: Random House; 2004.
  • Austin v. Michigan Chamber of Commerce, 494 U.S. 652 (1990).
  • Beauchamp T, Childress J. Principles of Biomedical Ethics. 5. New York: Oxford University Press; 2001.
  • Burke J. The Day the Universe Changed. New York: Back Bay Books; 1995.
  • Coleman C, Menikoff J, Goldner J, Dubler N, editors. Ethics and the Regulation of Research with Human Subjects. Dayton, Ohio: LexisNexis; 2006.
  • DeAngelis C, et al. Clinical trial registration: a statement from the International Committee of Medical Journal Editors. New England Journal of Medicine. 2004;351:1250–51. [PubMed]
  • Department of Health and Human Services. HIPAA privacy rule: information for researchers, updated August 4, 2004. 2004. [Accessed: April 6, 2006]. Available at: http://privacyruleandresearch.nih.gov/pr_08.asp#8a.
  • Dresser R. When Science Offers Salvation. New York: Oxford University Press; 2001.
  • Feinberg J. Harm to Others. New York: Oxford University Press; 1987.
  • Foster F, Shook R. Patents, Copyrights, and Trademarks. New York: John Wiley; 1993.
  • Fox R, DeMarco J. Moral Reasoning. 2. Belmont, CA: Wadsworth; 2000.
  • Giles J. Stacking the deck. Nature. 2006;440:270–72. [PubMed]
  • Godlee F. Making reviewers visible: openness, accountability, and credit. Journal of the American Medical Association. 2002;287:2762–65. [PubMed]
  • Goldman A. Knowledge in a Social World. New York: Oxford University Press; 1999.
  • Hull D. Science as a Process. Chicago: University of Chicago Press; 1988.
  • Kaiser J. Privacy rule creates bottleneck for U.S. biomedical researchers. Science. 2002;295:1206–1207. [PubMed]
  • Kintisch E, Mervis J. A budget with big winners and losers. Science. 2006;311:762–64. [PubMed]
  • Kitcher P. Science, Truth, and Democracy. New York: Oxford University Press; 2001.
  • Krimsky S. Science in the Private Interest. Lanham, MD: Rowman and Littlefield; 2003.
  • Laudan L. Science and Values. Berkeley, CA: University of California Press; 1984.
  • Levine R. Ethics and the Regulation of Clinical Research. New Haven, CT: Yale University Press; 1988.
  • Marshall E. Data sharing: clear cut rules prove elusive. Science. 2002;295:1624. [PubMed]
  • Merton R. The Sociology of Science. Chicago: University of Chicago Press; 1973.
  • Munthe C, Welin S. The morality of scientific openness. Science and Engineering Ethics. 1996;2(4):411–28. [PubMed]
  • National Research Council (NRC). Biotechnology Research in an Age of Terrorism: Confronting the Dual Use Dilemma. Washington, DC: National Academy of Sciences; 2003.
  • Newton I. Letter to Robert Hooke. February 5, 1676.
  • Rennie D. Editorial peer review: its development and rationale. In: Godlee F, Jefferson T, editors. Peer Review in the Health Sciences. 2. London: BMJ Books; 2003. pp. 1–14.
  • Resnik D. Social epistemology and the ethics of research. Studies in the History and Philosophy of Science. 1996;27:566–586. [PubMed]
  • Resnik D. The Ethics of Science. New York: Routledge; 1998.
  • Resnik D. A pluralistic account of intellectual property. The Journal of Business Ethics. 2003;46:319–35.
  • Resnik D. Owning the Genome. Albany, NY: State University of New York Press; 2004.
  • Resnik D. The Price of Truth: How Money Affects the Norms of Science. New York: Oxford University Press; In Press.
  • Resnik D, Shamoo A. Bioterrorism and the responsible conduct of biomedical research. Drug Development Research. 2005;63:121–33.
  • Rosengard A, Liu Y, Nie Z, Jimenez R. Variola virus immune evasion design: expression of a highly efficient inhibitor of human complement. Proceedings of the National Academy of Sciences. 2002;99:8808–13. [PubMed]
  • Shamoo A, Resnik D. Responsible Conduct of Research. New York: Oxford University Press; 2003.
  • United States Constitution. 1787. [Accessed: March 28, 2006]. Available at: http://www.usconstitution.net/const.txt.
  • Wein L, Liu Y. Analyzing a bioterror attack on the food supply: the case of botulinum toxin in milk. Proceedings of the National Academy of Sciences. 2005;102:9984–89. [PubMed]
  • Weiss R. Report warns of threat to milk supply. The Washington Post. 2005:A8. (June 29, 2005)
  • Zarin D, Tse T, Ide N. Trial registration at clinicaltrials.gov between May and October 2005. New England Journal of Medicine. 2005;353:2779–87. [PMC free article] [PubMed]