At the Radcliffe Hospital in Oxford, England, in March 2002, doctors wheeled Kevin Warwick, a professor of cybernetics at the University of Reading, into an operating theater for what has to be one of the world's few cases of elective neurosurgery on a healthy patient. Warwick belongs to a rare breed of scientists who experiment on themselves. He had volunteered to go under the knife so surgeons could hammer a silicon chip with 100 spiked electrodes directly into his nervous system via the median nerve fibers in his forearm. The goal was to fire electrical impulses into his brain to see whether a human could learn to sense, interpret and reply to computer-generated stimuli. The operation was dangerous. Success could lead to new avenues for prosthesis development, among other applications. Failure could mean nerve damage, infection, amputation or even brain injury.

The lead surgeon paused before making the first incision into Warwick's arm. "He asked if I was ready," remembers Warwick, now 56. "Of course I was. I had never been so excited. When they got in, the surgeons grabbed hold of my nerves, and it felt like my hand was being electrocuted. The pain was brilliant!"

The chip in Warwick's arm did what it was intended to do, picking up neural action potentials, the signals sent from the cortex when a person thinks of moving a limb but does not actually move it. That allowed Warwick to use thoughts to control an electric wheelchair and, through an Internet connection, an artificial hand back in his lab in Reading. Six weeks after Warwick was wired up, his brain learned to interpret signals sent back from the chip too; when an improvised sonar device was connected to the implant, Warwick could sense how far away an object was from his arm even while he was blindfolded.

Warwick's work may be cutting-edge, but his method is as old as science itself.
In popular culture, self-experimenters are portrayed as mad scientists attempting to turn themselves into superhuman villains; in real life, their contribution to scientific progress is immense. Self-experimenters have won Nobel Prizes and helped control diseases. For centuries, self-experimentation was an accepted form of science: Sir Isaac Newton nearly burned his cornea because he could think of no better way to understand visual hallucinations than staring at the sun. But in recent years, the academic institutions, grant agencies and journals that have codified the scientific method have come to view self-experimentation with suspicion, worrying that it leads to bias or misleading results. Nevertheless, the practice continues among a small number of professors and doctors who see it as the last chance to prove an underfunded theory, as an act of solidarity with other study subjects, or simply as an avenue to fame.

Self-experimentation has also found new life on the Internet. So-called self-tracking has already made lay scientists of many of us as we buy the latest exercise device or nutritional supplement and then log into forums to compare our findings with those of other investigators. What the practice lacks in rigor, it makes up for in zeal, not to mention the sheer number of subjects running their own mini-studies. Somewhere in there, real if ad hoc science might occur. "To me, [self-tracking] is the future of self-experimentation," says Seth Roberts, a professor of psychology at Tsinghua University in China, whose work led to the quirky best-selling diet book The Shangri-La Diet. The practice will continue among "normal people who are simply intent on discovering what works for them."

A Rich Tradition

Warwick is a good example of the people who choose to experiment on themselves. His first motivation was, he admits, selfish: "Pure scientific adrenalism," he says.
"The desire to follow my heroes." At the same time, he understood the risks involved and felt that "if we were going to fry someone's nervous system, I'd rather it be my own."