First off, I'm not a bioethicist. I'm interested in using good business strategies to develop science and get it into the marketplace. Pinker doesn't dispute the need for ethics to guide day-to-day decisions, but he questions the need for overly bureaucratic ethics processes that impede work that is obviously ethical and relatively low risk. In the simplest terms, there are two types of ethical dilemmas: the hard ones and the easy ones.
Consider the hard ethical dilemma. For the moment, let's get back to what triggered Steven Pinker's position. Many scientists have been calling for a moratorium on the use of CRISPR-Cas9 genome editing in applications that could enter the human germline (for background, see here, here, here, and here). Pinker coined the terms 'germlinophobia' and 'moratorista' to describe those who advocate delaying the use of genome editing until its potential pitfalls are understood--especially the potential for abuse by those after 'designer babies'.
His argument is basically that the benefits of correcting disease genes (which are, as a whole, relatively well understood) are greater than those given by potentially enhancing complex traits like intelligence (which are, as a whole, not). Furthermore, the risks of CRISPR-Cas9 vs the potential benefits of enhancing children are such that, in Pinker's words, "it’s unlikely that today’s morbidly risk-averse helicopter parents will take a chance at enhancing a child—they won’t even feed their babies genetically modified applesauce!"
Well said.
And then there are the much easier ethical dilemmas that affect the majority of researchers and have nothing to do with sexy topics like intelligently designing the next generation of humans (sorry, creationists). These problems deal with the much more boring business of using human DNA or tissue samples in research, which in cancer research is very common. Here's how Pinker responded to Paul Knoepfler:
Today mainstream bioethics gets in the way on a massive scale. The most obvious example is Institutional Review Boards. They are blatant abridgments of free speech, convenient weapons for fanatics to wield against people whose opinions they don’t like, and high-volume red-tape dispensers which bog down research while being unnecessary or even harmful to the protection of patients and research subjects. Regulations on confidentiality and consent to use data and tissues have also gone way overboard. The future of medicine hinges on the use of massive, open-access datasets to find signals in the noise. If every byte has to be multiply certified for consent and privacy, or even destroyed after a few years, no matter how inconsequential to the person who contributed it, then huge numbers of future patients will suffer or will fail to be helped by our faulty knowledge of the real effects of treatments.

To be frank, I haven't dealt with any IRBs (Research Ethics Boards in Canada) being used as 'weapons for fanatics', but I can attest to the red tape involved in getting permission to use patient samples.
Take genomic sequencing of typical cancer patient samples: in many cases researchers would like to study as large a population as possible to understand how to classify patients into cancer subtypes, confirm rare events, and so on. If a researcher has permission to sequence 100 samples from one source and identifies another source (like a second tumor bank) with 20 more, they need to either amend their IRB approval or file a completely new application. Drafting this might take an afternoon or a few days, depending on complexity and on whether an expert or a PhD student is doing it, and the turnaround time may be anywhere from a few weeks to a few months.
Do you do the paperwork to get 20 more samples? 50 more? 5 more? It's a judgement call that depends on the situation and the project, but this shouldn't be what scientists are concerned with. The effort needed to comply with regulations is why, as a researcher, I always looked with awe at the amount of data available through companies like AncestryHealth and 23andMe.
The obsession with patient privacy and information security leads to expensive and/or cumbersome solutions that simply become costs of the research enterprise. Encrypting and decrypting data, or moving it from secure server to secure server, takes a lot of time (and therefore money), and falls under Pinker's position that "truly ethical bioethics must weigh the benefits of any restriction on research against the harm that will be caused to the vast number of people who would benefit if the research proceeded expeditiously."
Opting-In to the Research Ethics Review System
The solution that I envision is an opt-in system in which researchers working with other researchers or companies are cleared to engage in work similar, but not identical, to what they are already permitted to do, with the caveat that they could be audited to ensure that they comply with regulations.

For instance, Scientist A should be permitted to do a certain set of experiments on lung cancer tissue obtained from Scientist B if A is already allowed to do nearly identical work on breast cancer. There is little point in wasting an Ethics Board's time with reapplications that are substantially the same; the Board should have the time to consider the really serious ethical quagmires that have the potential to be reported on by the press, like life-or-death decisions, not the sequencing of some old DNA from a tissue bank. Researchers who are really nervous about audits could always opt in to the review system and have all their paperwork cleared up front. If anyone can provide some insight into what percentage of review board applications are routine, I'd like to know.
For a biomedical system that's trying to find cost savings in an era of budget cuts, an opt-in system may be one small win.