When historians of the future try to identify the moment that neuroscience began to transform the American legal system, they may point to a little-noticed case from the early 1990s. The case involved Herbert Weinstein, a 65-year-old ad executive who was charged with strangling his wife, Barbara, to death and then, in an effort to make the murder look like a suicide, throwing her body out the window of their 12th-floor apartment on East 72nd Street in Manhattan.
Before the trial began, Weinstein’s lawyer suggested that his client should not be held responsible for his actions because of a mental defect — namely, an abnormal cyst nestled in his arachnoid membrane, which surrounds the brain like a spider web.
The implications of the claim were considerable. American law holds people criminally responsible unless they act under duress (with a gun pointed at the head, for example) or suffer from a serious defect in rationality — like not being able to tell right from wrong. But if you suffer from such a serious defect, the law generally doesn’t care why — whether it’s an unhappy childhood or an arachnoid cyst or both. To suggest that criminals could be excused because their brains made them do it seems to imply that anyone whose brain isn’t functioning properly could be absolved of responsibility.
But should judges and juries really be in the business of defining the normal or properly working brain? And since all behavior is caused by our brains, wouldn’t this mean all behavior could potentially be excused?
The prosecution at first tried to argue that evidence of Weinstein’s arachnoid cyst shouldn’t be admitted in court. One of the government’s witnesses, a forensic psychologist named Daniel Martell, testified that brain-scanning technologies were new and untested, and their implications weren’t yet widely accepted by the scientific community. Ultimately, on Oct. 8, 1992, Judge Richard Carruthers issued a Solomonic ruling: Weinstein’s lawyers could tell the jury that brain scans had identified an arachnoid cyst, but they couldn’t tell jurors that arachnoid cysts were associated with violence.
Even so, the prosecution team seemed to fear that simply exhibiting images of Weinstein’s brain in court would sway the jury. Eleven days later, on the morning of jury selection, they agreed to let Weinstein plead guilty in exchange for a reduced charge of manslaughter.
After the Weinstein case, Daniel Martell found himself in so much demand to testify as an expert witness that he started a consulting business called Forensic Neuroscience. Hired by defense teams and prosecutors alike, he has testified over the past 15 years in several hundred criminal and civil cases. In those cases, neuroscientific evidence has been admitted to show everything from head trauma to the tendency of violent video games to make children behave aggressively.
But Martell told me that it’s in death-penalty litigation that neuroscience evidence is having its most revolutionary effect. “Some sort of organic brain defense has become de rigueur in any sort of capital defense,” he said. Lawyers routinely order scans of convicted defendants’ brains and argue that a neurological impairment prevented them from controlling themselves. The prosecution counters that the evidence shouldn’t be admitted, but under the relaxed standards for mitigating evidence during capital sentencing, it usually is.
Indeed, a Florida court has held that the failure to admit neuroscience evidence during capital sentencing is grounds for a reversal. Martell remains skeptical about the worth of the brain scans, but he observes that they’ve “revolutionized the law.”
The extent of that revolution is hotly debated, but the influence of what some call neurolaw is clearly growing. Neuroscientific evidence has persuaded jurors to sentence defendants to life imprisonment rather than to death; courts have also admitted brain-imaging evidence during criminal trials to support claims that defendants like John W. Hinckley Jr., who tried to assassinate President Reagan, are insane.