Happy Accidents by Morton A Meyers

Rating: 6/10


High-Level Thoughts

An interesting overview of the role of serendipity in scientific and medical breakthroughs. It got a little dry and somewhat repetitive at times, but it's still a worthwhile medical and scientific history.

Summary Notes

This is the essence of serendipity. Although the term has become popularized to serve as the synonym for almost any pleasant surprise, it actually refers to searching for something but stumbling upon an unexpected finding of even greater value—or, less commonly, finding what one is looking for in an unexpected way.

Discovery requires serendipity. But serendipity is not a chance event alone. It is a process in which a chance event is seized upon by a creative person who chooses to pay attention to the event, unravel its mystery, and find a proper application for it.

At the Johns Hopkins Hospital in 1947, two allergists gave a new antihistamine, Dramamine, to a patient suffering from hives. Some weeks later, she was pleased to report to her doctors that the car sickness she had suffered from all her life had disappeared.

Anderson discovered the “magic fertilizer” that allowed some of them to bloom into neurons, sprouting axons and dendrites: “It was a very boring compound that we used to coat the plastic bottom of the Petri dish in order to afford the cells a stickier platform to which to attach. Never would we have predicted that such a prosaic change could exert such a powerful effect. Yet it turned out to be the key that unlocked the hidden neuronal potential of these stem cells.”

Kuhn makes a distinction between “normal” and “revolutionary” science. In “normal” science, investigators work within current paradigms and apply accumulated knowledge to clearly defined problems. Guided by conventional wisdom, they tackle problems within the boundaries of the established framework of beliefs and approaches. They attempt to fit things into a pattern. This approach occupies virtually all working researchers. Such efforts, according to Nobel laureate Howard Florey, “add small points to what will eventually become a splendid picture much in the same way that the Pointillistes built up their extremely beautiful canvases.”

Kuhn portrays such scientists as intolerant of dissenters and preoccupied with what he dismissively refers to as puzzle-solving.

However, it is “revolutionary” science that brings creative leaps. Minds break with the conventional to see the world anew. How is this accomplished? The surprising answer may be “blindly”!

“We invent by intention; we discover by surprise.” In other words, accidents will happen, and it's a blessing for us that they do.

The Three Princes of Serendip (set in the land of Serendip, now known as Sri Lanka): “As their highnesses traveled, they were always making discoveries, by accidents and sagacity, of things they were not in quest of.”

Accidents and sagacity. Sagacity—defined as penetrating intelligence, keen perception, and sound judgment—is essential to serendipity.

Accidental discoveries would be nothing without keen, creative minds knowing what to do with them.

Serendipity means the attainment or discovery of something valuable that was not sought, the unexpected observation seized upon and turned to advantage by the prepared mind. The key factor of sagacity has been lost. Chance alone does not bring about discoveries. Chance with judgment can.

“Discovery consists of seeing what everybody has seen and thinking what nobody has thought,” according to Nobelist Albert Szent-Györgyi.

Another trap for scientists lurks in the common logical fallacy post hoc, ergo propter hoc— the faulty logic of attributing causation based solely on a chronological arrangement of events. We tend to attribute an occurrence to whatever event preceded it: “After it, therefore because of it.”

“the asking of the question is the important thing…. The idea is: can you ask the question in such a way as to facilitate the answer? And I think really great scientists do that.”

Analogical thinking has certainly been a cornerstone of science. The seventeenth-century English physiologist William Harvey compared the heart to a pump. The physicists Ernest Rutherford and Niels Bohr pictured the atom as a tiny solar system. “Every concept we have,” writes the cognitive scientist Douglas R. Hofstadter, “is essentially nothing but a tightly packaged bundle of analogies.”

For example, the era of cancer chemotherapy was initiated by the recognition of never-before-seen symptoms in sailors saturated for long periods with liquid mustard gas during a military disaster in World War II. From this came the development of alkylating chemical agents, followed by a series of different categories of anticancer drugs.

Creative thinkers tend to take analogies and anomalies to higher levels. They have a gift for seeing similar differences and different similarities— phrases coined by the British theoretical physicist David Bohm. True creation, Bohm argues, relies upon perceiving a new fundamental set of similar differences that constitutes a genuinely new order. Indeed, it is the recognition of anomalies, discrepancies, inconsistencies, and exceptions that often leads to the uncovering of a truth, perhaps one of greater magnitude than the one originally pursued.

As Winston Churchill whimsically observed, “Men occasionally stumble across the truth, but most of them pick themselves up and hurry off as if nothing happened.”

Virtually without exception, scientific literature imposes a post facto logic on the sequence of reasoning and discovery. The role of chance would never be suspected from the logically rigorous sequence in which research is reported.

it is typically not until the Nobel Prize acceptance lectures that the laureate will for the first time clearly acknowledge the role of chance, error, or accident

Yet, despite the examples given, and all that follow, medical research stubbornly continues to assume that new drugs and other advances will follow exclusively from a predetermined research path.

Leeuwenhoek came tantalizingly close to grasping the germ theory of disease, when he found animalcules swarming in the decaying roots of one of his teeth and in the plaque from his own and other people's mouths. He noted that people who cleaned their mouths regularly had much less plaque than those who did not. And coming within hailing distance of heat pasteurization, he saw that the animalcules in plaque “could not endure the heat of my coffee.”

“Where observation is concerned, chance favors only the prepared mind.”

About half of the 10 million soldiers killed in World War I died not directly from explosives, bullets, shrapnel, or poison gases but from infections in often relatively mild wounds.

Peering through his microscope, he marveled that bacteria in the presence of tears became swollen and transparent, then simply disappeared before his eyes. Of equal importance, he found that lysozymes were present in many animal and plant tissues, including blood, milk, and egg whites. Fleming's accidental illumination revealed that human fluids have some bacteria-fighting properties and that these are parts of the body's defense system essential to life. Unfortunately, lysozyme has relatively little medical use, being most effective against bacteria that do not cause illness. The discovery, however, put Fleming on the lookout for other nontoxic antibacterial substances.


Fleming was surprised to see that for some distance around the patch of mold, there was a zone cleared of bacteria, presumably due to some substance manufactured by the mold. At one edge was a blob of yellow-green mold, with a feathery raised surface, and on the other side of the dish were colonies of staphylococcus bacteria. But in a circular zone around the mold, the bacteria had been lysed, dissolved.

“Fleming began to tease me about my excessive tidiness in the laboratory. At the end of each day's work, I cleaned my bench, put it in order for the next day and discarded tubes and culture plates for which I had no further use. He for his part kept his cultures… for two or three weeks until his bench was overcrowded with 40 or 50 cultures. He would then discard them, first of all looking at them individually to see whether anything interesting or unusual had developed. I took his teasing in the spirit in which it was given. However, the sequel was to prove how right he was, for if he had been as tidy as he thought I was, he would never have made his two great discoveries.”

Other elements contributed to the combination of chance circumstances improbable almost beyond belief. The mold that contaminated the culture was a very rare organism, Penicillium notatum, ultimately traced to a mycology laboratory on the floor below, where molds from the homes of asthma sufferers were being grown and extracts of them made for desensitization.

In chronicling the story for his book, Hare consulted old meteorological records and found out that an intense heat wave, which had been smothering London and which would have prevented the growth of the penicillium spore on the Petri dish, broke on the day Fleming opened the dish, thereby allowing the penicillium spore to thrive.

The following cool spell in London created conditions in which the mold grew first, followed by the bacteria when the weather turned warm again. These were the only conditions, Hare later found, under which the discovery could have been made.

As the war dragged on, the need for penicillin to treat Allied soldiers became more crucial. Realizing that an adequate supply could not be produced in wartime England, Florey and Heatley left for the United States at the end of June 1941. They flew via neutral Lisbon in a blacked-out Pan Am Clipper to New York City, storing their precious freeze-dried penicillin mold in the plane's refrigerator. Arriving several days later, in 90-degree heat, they rushed across Manhattan with their precious sample by taxi to a midtown hotel where it could be refrigerated. The American government was easily persuaded to undertake large-scale production of the drug, and events progressed from there at lightning speed.

The serendipitous nature of this turn of events, which resulted in the large-scale production of penicillin, was emphasized by Coghill: “One of the least understood miracles connected with it [penicillin] is that Florey and Heatley were directed to our laboratory at Peoria—the only laboratory where the corn steep liquor magic would have been discovered.”

By the time of the D-Day invasion of Normandy in June 1944, the American drug companies Merck, Squibb, and particularly Pfizer were producing 130 billion units of penicillin per month, enough to treat all 40,000 wounded soldiers.

Perhaps even more important was the fact that it was easier for the admiring public to comprehend the deductive insight of a single individual than the technical feats of a team of scientists.


Physicians treating their ulcer patients with the Sippy diet must have been amazed at how many of their cases proved intractable. The fact that this falsely based approach nonetheless persisted for six decades illustrates the unfortunate truth that conventional wisdom, once adopted, remains stuck in place even when it flies in the face of reality.

As declared by Robert Koch, the German scientist who established the bases of bacteriology in modern medicine in the latter part of the nineteenth century, three conditions are required in order to prove causation: the organism must be shown to be constantly present in characteristic form and arrangement in the diseased tissue; pure cultures of the organism must be obtained; and the pure culture must be shown to induce the disease experimentally.

After gastroscopy found no bacteria or inflammation present in his stomach, Marshall swallowed a foul-tasting brew of H. pylori–laden broth after inhibition of gastric acidity. After a week, he suffered first vomiting and then, for about a week, headaches and putrid breath. Ten days after his symptoms began, follow-up gastroscopy with biopsy documented that he had acute gastritis and that the spiral bacteria had established themselves in his stomach. On the fourteenth day, Marshall began treating himself with an antibiotic and bismuth. His symptoms promptly cleared and another endoscopic biopsy documented resolution. This infection had induced an acute self-limited gastritis. Marshall had fulfilled Koch's third postulate, on himself!


Advances in the chemotherapy of cancers testify to the breakthrough the episode provided. The investigation of the Bari survivors’ mysterious burns and skin ailments led to the observation that alkylating agents, such as nitrogen mustard, cause marrow and lymphoid depletion, which led to their successful use in treating certain kinds of cancers. According to the American Cancer Society, “the age of cancer chemotherapy was initiated…. From this [Bari] disaster, a chemical agent with anticancer activity was serendipitously discovered.” Nitrogen mustard became a model for the discovery of other classes of anticancer drugs.


Wills, a British physician working in India, identified a type of anemia in textile workers that she attributed to their severe poverty and grossly deficient diet. She found that the anemia responded to the consumption of Marmite, a food product made from purified yeast that has been available since around 1900.

The mystery nutrient in both Marmite and liver is now known to be folic acid. The term “folic acid” was coined in 1941, following its isolation from spinach (L. folium = leaf), but it was not until 1946 that its chemical structure was identified. We know today that folic acid—one of the B vitamins—is a dietary essential, present especially in fresh green vegetables, liver, yeast, and some fruits. Essential for blood formation, which occurs in the bone marrow, its deficiency leads to anemia. Studies during World War II had shown that certain anemias, in which large immature cells filled the bone marrow, responded to treatment with folic acid. Farber believed a similar solution could be discovered for leukemia.

In 1949 methotrexate was developed as a safer folic acid antagonist. By interfering with folic acid, methotrexate causes white cells to die of vitamin deficiency. By the mid-1950s it produced the first cure—not just remission but actual cure—of a solid tumor, choriocarcinoma, including, astonishingly, its lung metastases.


Indeed, teas made from periwinkle were used by diabetics in places as geographically diverse as the Philippines, South Africa, India, and Australia, following the customs of traditional folk medicine. A proprietary herbal preparation, Vinculin, was even marketed in England as a “treatment” for diabetes.

Noble was trumpeting the value of having stumbled across a discovery. He claimed that if he and his colleagues had restricted their search to plants with suspected anticancer properties, they would have missed the periwinkle.

The periwinkle alkaloids function as poisons at a critical stage of cell division, preventing the cancer cell from reproducing. Vincristine used together with steroids is presently the treatment of choice to induce remissions in childhood leukemias. Vinblastine, in a regimen with other agents, has resulted in an astonishing cure rate of over 90 percent in patients with testicular carcinomas. Employed in adult Hodgkin's lymphoma along with other drugs, the Vinca alkaloids have raised the five-year survival prospect to 98 percent.

Sex-Hormone Cancers

As castration would obviously not have been a treatment option for humans, he injected estrogen into the dogs to produce, in effect, chemical castration. This, too, caused the tumors to regress. He had stumbled over a new threshold. This “Eureka!” moment led to the realization that some tumors are, in fact, hormone-dependent. Professor Huggins reported his results in the Journal of Experimental Medicine in 1940.

He then began experimenting with injecting estrogen into men with prostate cancer that had spread to their bones. To his delight, the tumors regressed and the bone pain was relieved.

Aspirin and Cancer

For over a century, people have reached for aspirin to relieve headaches or back pain. In the 1990s a host of studies affirmed that aspirin might also lower the risk of developing colon cancer by as much as 40 percent.

As sometimes happens to people with this condition, within a few years, noncancerous scarlike tumors called desmoids grew in her abdomen. Waddell placed her on an anti-inflammatory drug for an upset stomach, and, much to his surprise, the tumors disappeared. Wondering if the startling effect could have been caused by the anti-inflammatory, he prescribed the drug to three of the woman's relatives who also had FAP. Amazingly, the colonic polyps either completely disappeared or shrank dramatically.

And then a finding provided the stunning explanation for aspirin's ability to shrink polyps: colon polyps make huge amounts of COX-2, and the increased prostaglandins can help cancers flourish. Inhibiting COX-2 not only decreases pain, it may slow cell growth, promote cell death, and prevent tumors from generating blood vessels for nourishment. Furthermore, there is increasing evidence that drugs that block COX-2 might thwart not only cancers of the colon but also those of the breast, lung, bladder, skin, and esophagus.


Based on that finding alone, in 1954 the company gave out free samples of the drug across Germany. Then, without having done any tests on either its safety or its effectiveness, the company began marketing thalidomide in 1957 as the first over-the-counter sedative.

By 1961, after a massive marketing campaign, thalidomide was the best-selling sedative in Germany and was being sold in forty-six countries throughout Europe, Asia, Africa, and the Americas.

A well-known public health disaster ensued, as pregnant women who took it for morning sickness and as a sedative soon saw its horrifying effects. Thalidomide is a teratogen, an agent that causes malformation of the fetus.

Remarkably, within three months of McBride's alert, Gerard Rogerson, a doctor from Shropshire, England, raised the possibility, in another letter to the Lancet, that since thalidomide inhibits growing tissue in these circumstances, it might be investigated as an anticancer drug. However, it was not until 1994, thirty-two years later, that Robert D'Amato, a member of the Judah Folkman laboratory, discovered that thalidomide acts as a mild inhibitor of the growth of new blood vessels critical to tumor formation.

McCulloch said he had always thought that the idea of “the scientific process” was overblown. He knows that “typically a successful scientist may start with an experimental design but then makes an unexpected observation that leads a prepared mind to follow a chance event.”

Pap Smear

In 1928 the Greek-American pathologist George Papanicolaou, while investigating the reproductive cycle by using vaginal smears obtained from human and animal subjects, observed the presence of cancer cells in people who were not known to be sick. That a cancer could be found this early, through the examination of cells (cytology), was a totally revolutionary idea.

Papanicolaou went on to develop the famous Pap smear test—so called from the first three letters of his name—which was not routinely adopted until the early 1940s. Cervical cancer, once the leading cancer killer among American women, is now, thanks to Pap smears, a distant seventh.

Cancer conclusion

The truth remains that over the course of the twentieth century, the greatest gains in the battle against cancer came from independent research that was not under any sort of centralized direction and that did not have vast resources at its disposal. As we have seen, such research led to momentous chance discoveries in cancer chemotherapy and a greater understanding of the mechanisms of the disease that have resulted in exciting new therapeutic approaches.

Yet one overwhelming fact persists: cancer will kill more Americans in the next fourteen months than perished in all the wars the nation has ever fought—combined.

Cancer has for too long been treated with some combination of surgery, radiation, and chemotherapy in a program derided by some as “slash, burn, poison.”


A twenty-five-year-old surgical intern in Germany, Werner Forssmann, famously performed the first human cardiac catheterization in 1929—on himself!—in an astounding example of self-experimentation in medicine.

He gained the trust of the surgical nurse who provided access to the necessary instruments. So carried away was she by Forssmann's vision that she volunteered to undergo the experiment herself. Pretending to go along with her, Forssmann strapped her down to the table in a small operating room while his colleagues took their afternoon naps. When she wasn't looking, he anesthetized his own left elbow crease. Once the local anesthetic took effect, Forssmann quickly performed a surgical cutdown to expose his vein and boldly manipulated a flexible ureteral catheter 30 cm toward his heart. This thin sterile rubber tubing, used by urologists to drain urine from the kidney, was 65 cm long (about 26 inches). He then released the angry nurse.

Forssmann's surgical career was subsequently severely curtailed. Nevertheless, he continued his experimental pursuits with injection of X-ray dyes through a catheter into dogs and his own body. Since hospitals at that time did not have special quarters for experimental animals, Forssmann's mother cared for the dogs in her apartment. Forssmann would inject a dog with morphine, put the sleepy animal in a potato sack, and take it by motorcycle to the hospital. As in the experiments by the French physiologists, Forssmann would insert a catheter through a vein in the dog's neck and into its heart. He would then inject dyes and attempt X-ray documentation.

To demonstrate his faith in the safety and feasibility of outpatient coronary angioplasty, he asked one of his fellows to perform cardiac catheterization on him. Gruentzig jumped onto the table at 5:00 P.M., underwent the procedure, jumped off the table, picked up his wife, and arrived for the department Christmas party by 7:00 P.M.

Months later, he opened the lifeless dog's heart and was startled at what he saw: a cord stretched across the chamber, its surface glistening. Within, he could discern the three-inch-long stitch. What had caused the glistening filled Voorhees with excitement. The stitch was coated with endocardium—meaning, the lining of the heart chamber had grown over the suture!

Twenty-five years earlier, he had noted the same paradoxical response to this same drug, acetylcholine, a very potent vasodilator that elicited no relaxation but only contraction of the aortic strip. At the time, it was peripheral to his main research pursuits and he did not recognize its significance. Looking back, he ruefully noted: “Little did I suspect then what I was able to show many years later—namely, that relaxation of arteries by acetylcholine is strictly endothelium-dependent, and that my method of preparing the strips inadvertently resulted in the mechanical removal of all the endothelial cells.”

Remarkably, Furchgott had accomplished all his work throughout the years by using simple and easily reproducible techniques, without sophisticated instruments and technology. His experience is a reminder that an individual does not need a heavily funded laboratory equipped with state-of-the-art technology to uncover fundamental truths.

The annual scientific meeting of the American Urological Association is usually a pretty staid affair. But one meeting has entered the annals of folklore. In the early 1980s, during the course of his lecture on the effectiveness of injecting substances directly into the penis to increase blood flow, one urologist announced that he had performed such injections on himself only an hour earlier. Stepping from behind the lectern, he dropped his trousers and proudly demonstrated to the audience his own erect manhood. Urologists who attended this meeting still shake their heads at the memory.


Around 1986 laboratory scientists working at Pfizer's research facility in Sandwich, in the United Kingdom, developed a new chemical compound, sildenafil citrate, and considered it a candidate for the treatment of angina. In early clinical trials in 1990 and 1992, its effect on circulation and heart function was disappointing. The researchers decided to alter the dose to see if it would make any difference. It did indeed—but not in the way the scientists expected. Rather than boosting blood flow to the heart, it affected blood flow to the penis, with startling effect: many impotent patients experienced restored penile erectile function.

Meanwhile, a National Institutes of Health panel composed of leading urologists, psychiatrists, psychologists, gerontologists, and surgeons, recognizing that the term “impotence” was confusing and had pejorative implications, created a more accurate definition of what came to be called “erectile dysfunction” (ED).

Within the first month, 40,000 Viagra prescriptions were filled each day. For a time, it was the fastest-selling drug in history. Worldwide sales were $788 million in its first nine months on the market in 1998. In 2004, some 250,000 to 300,000 men were taking Viagra weekly.


The first experiments on its nature were undertaken in the early twentieth century. In 1909 a clinical investigator at the Russian Imperial Military Medical Academy in St. Petersburg found that rabbits fed a mixture of eggs and milk developed atherosclerotic plaques.

a team of researchers led by John Gofman had worked to develop atherosclerosis in rabbits by feeding them a high-cholesterol diet. (Gofman was not dissuaded by the simplistic criticism that the rabbit is herbivorous and thus ordinarily ingests essentially no cholesterol.)

Further convincing proof that diet does cause human coronary disease was then demonstrated in a report on young American soldiers killed in the Korean War. Their average age was only twenty-two years, yet autopsies showed that three-fourths of them already had fat streaks in their coronary arteries. In contrast, the Korean and Chinese soldiers, having spent their lives eating no milk products and little egg or meat fat, had no cholesterol in their coronary arteries. Following President Eisenhower's widely publicized heart attack in 1955, the dangers of high-fat diets were openly discussed in the general press as well.

In 1984, after completing a ten-year study conducted in twelve centers in North America involving more than 3,800 men and costing $150 million, the NIH established that lowering blood cholesterol clearly lowers the risk of heart attack. The nation's premier health institute officially recommended that Americans lower the amounts of fat and cholesterol in their diet.

It was now clear that atherosclerosis in the general population was caused by a dangerously high blood level of low-density lipoproteins, resulting from failure to produce enough LDL receptors, and that cholesterol not absorbed into the cells as it courses through the circulatory system sticks instead to the walls of blood vessels, disrupting the flow of blood to the heart and brain.

It was now understood why some people can eat more cholesterol-rich foods and not have high blood cholesterol concentrations, while others on low-fat diets continue to have high levels: the difference is in the number and efficiency of the LDL receptors.

The British geneticist J. B. S. Haldane once commented, “The world is not only stranger than we imagine, it is stranger than we can imagine.” This famous quote is often used to support the notion that the mysteries of the universe are beyond our understanding. Here is another way to interpret his insight: Because so much is out there that is beyond our imagination, it is likely that we will discover new truths only when we accidentally stumble upon them. Development can then proceed apace.

Psychiatric Health

Through the end of 2004, there were fourteen potential new drugs for depression, eighteen for anxiety disorders, and fifteen for schizophrenia. Overall, there were 109 medications being developed for all mental illnesses.

Within a few decades, some mental states of mood, thinking, or behavior came to be recognized as arising through an error in synthesis, storage, metabolism, or release of neurotransmitters in the brain. Chief among these chemicals identified were serotonin, dopamine, and noradrenaline.

Serendipity, not deductive reasoning, has been the force behind the discovery of virtually all the psychotropics: Chlorpromazine (Thorazine) was unearthed in a hunt for antihistamines. The first antianxiety drug meprobamate (Miltown) was chanced upon in a search for an antibiotic. Valium was found by a scientist tidying up a lab bench. Antidepressants were uncovered as a side effect in the treatment of tuberculosis or stumbled upon in the quest for a tranquilizer. LSD was a surprising result in experiments for an ergot-derived stimulant. Lithium was happened upon through a misguided theory regarding a metabolic defect in manic states.

Shock Treatments

Shock treatment came about through a chain of serendipitous events when Sakel was working as an assistant physician at the Lichterfelde Sanatorium in Berlin. Sakel accidentally administered an overdose of insulin to one of his morphine-addicted patients who also happened to be diabetic. The level of glucose—the main source of energy for the brain—in the patient's blood dropped so severely that a mild coma ensued. Upon recovery, “the patient's mind, until then completely befogged, was absolutely clear,” and his craving for morphine had subsided. Emboldened, Sakel gave insulin overdoses to other drug addicts and, in 1930, published a report on his successes.


It was at this point in carrying out tests to measure the toxicity of urea in the presence of varying concentrations of uric acid that Cade ran into difficulties in preparing solutions of the highly insoluble uric acid. The problem was overcome by using its most soluble salt, which was lithium urate. “And,” Cade relates in a disarming admission, “that is how lithium came into the story.”

Far from increasing the toxic effect of urea, the lithium urate exerted some kind of protection. Moreover, and what was equally unexpected, the normally jittery guinea pigs lost their natural timidity and became instead placid, tranquilized, and generally lacking in responsiveness to stimulation. Cade turned them on their backs, and instead of frantically trying to right themselves and scurry away, they lay still and gazed serenely back at him. Further experiments confirmed that this calming action was due not to the urate component but to the lithium ion itself.

In Lithia Springs, Georgia, Lithia Springs Mineral Water is still sold to devoted self-healers, one of whom is quoted as drinking a gallon a day because “it keeps my nerves steady.” Other bottled curative waters, many of which are still on the market—Vichy, Apollinaire, Perrier, Lithée—were all promoted at one time for their high lithium content.

Finding no harmful effects, he then gave the lithium salt to a fifty-one-year-old male patient who had been in a state of manic excitement for five years. After five days, there was a clear improvement in the patient's condition, and within three weeks he was considered well enough to be transferred to a convalescence ward for the first time. In his fourth month of continuous treatment, he was sent home with instructions to take lithium carbonate daily.

The most notable advance was the introduction of agents that have, in essence, the same mechanism of action as the original tricyclic antidepressants, but with fewer side effects. These are called selective serotonin reuptake inhibitors, or SSRIs, and they include the well-known brand-name drugs Prozac, Paxil, and Zoloft. (Prozac's generic name is fluoxetine; Paxil's is paroxetine; and Zoloft's is sertraline.) SSRIs prolong serotonin's neurotransmission by inhibiting its reuptake by the nerves that release it, which is the normal way that serotonin signaling is terminated.

By 1970, one woman in five and one man in thirteen were using "minor tranquilizers and sedatives," meaning mainly the benzodiazepines. Valium reached the height of its popularity in 1978, a year when Americans consumed 2.3 billion of the little yellow pills.


As it turned out, disulfiram by itself is innocuous, but in combination with alcohol it causes a highly toxic breakdown product to accumulate. At one point, Jacobsen's blood pressure fell to almost zero and he came close to death.


Lysergic acid itself is not hallucinogenic. Hofmann's original purpose was to synthesize the basic components of ergot in hopes of developing new compounds that might be useful in other fields of medicine. In 1938 Hofmann had synthesized lysergic acid diethylamide with the chief aim of obtaining a circulatory and respiratory stimulant.

The following Monday, the nineteenth, using other ergot alkaloids as a guide, Hofmann took 0.25 mg of LSD orally, believing that to be the lowest dose that might be expected to have any effect. In fact, this is five to ten times the average effective dose of LSD!


Dwight Eisenhower famously cautioned the nation about the influence of the “military-industrial complex,” coining a phrase that became part of the political vernacular. However, in the same speech, he presciently warned that scientific and academic research might become too dependent on, and thus shaped by, government grants. He foresaw a situation in which “a government contract becomes virtually a substitute for intellectual curiosity.”

Peer review institutionalizes dogmatism by promoting orthodoxy. Reviewers prefer applications that mesh with their own perspective on how an issue should be conceptualized, and they favor individuals whom they know or whose reputations have already been established, making it harder for new people to break into the system. Indeed, the basic process of peer review demands conformity of thinking and disdains a maverick's approach.

“We can hardly expect a committee,” said the biologist and historian of science, Garrett Hardin, “to acquiesce in the dethronement of tradition. Only an individual can do that.” Young investigators get the message loud and clear: Do not challenge existing beliefs and practices.

Since the FDA lifted limits on direct-to-consumer (DTC) advertising of drugs in 1997, this form of marketing has grown into a $4.2 billion business. (Only one other country, New Zealand—with a population of less than 4 million—allows such advertising.)

Indeed, this compliance on the part of doctors is part of the problem. The average number of prescriptions they wrote for each American rose from seven in 1993 to twelve in 2004. American doctors wrote some 3 billion prescriptions in 2005.

In a typical year, the fifty most heavily advertised drugs account for nearly half the increase in spending on prescription drugs.

Under such a system, premenstrual syndrome (PMS) became a psychiatric illness with the imposing name of premenstrual dysphoric disorder (PMDD), and for the very bad moods some women suffer before their periods, Eli Lilly repackaged its aging antidepressant Prozac in a lavender pill dubbed Sarafem. Hyperactive children all have attention deficit disorder (ADD), and there are now maladies called female sexual dysfunction (FSD) and social anxiety disorder (Glaxo advertised its antidepressant Paxil for this newly noted “disorder”). By 2006, nearly 4 million people—adults and children—were taking drugs to treat attention deficit disorder and hyperactivity.

The focus in medical schools and higher education institutions in the field of science is on facts, not ideas; on outcomes, not process.

Why are particular people able to seize on such opportunities and say, "I've stumbled upon a solution. What's the problem?" Typically, such people are not constrained by an overly focused or dogmatic mindset. In contrast, those with a firmly held set of preconceptions are less likely to be distracted by an unexpected or contradictory observation, and yet it is exactly such things that lead to the blessing of serendipitous discovery.

The center will emulate Bell Labs and the British Medical Research Council's Laboratory of Molecular Biology by encouraging scientists to do hands-on creative work on their own or in small, close-knit teams, and to focus on original projects. In other words, the Janelia Farm Research Campus is being set up as a serendipity incubator.

General education: Students, particularly in science, must be educated not only to know facts and the scientific method but also to be prepared to recognize and exploit departures from expected results. Essential are the tools related to thinking: pattern recognition and pattern formation, alertness to similar differences and different similarities, analogies, imagination, visual thinking, aesthetics of nature and of knowledge.

Medical and general science education: Current methods of educating medical students overemphasize memorization at the expense of full understanding, reinforce compulsive behavior, and stifle creativity.

Big Pharmaceuticals: Restrictions should be placed on the pharmaceutical industry to shift the emphasis from "me-too" drugs to innovative drugs and to break the cozy link between drug companies and the thousands of doctors who take their gifts, consulting jobs, and trips, then turn around and write prescriptions for their benefactors' drugs, which may just mimic the action of easily available and less expensive generic medications.

Research grants: Agencies and foundations that fund research grants should allow curiosity-driven investigators to pursue promising findings that may deviate from the scientific question originally proposed. Some flexibility should be provided in the funding mechanisms to allow investigators to follow any unexpected findings wherever they may lead. For investigators to be forced to secretly divert funds for this purpose from their declared and approved missions incurs a sense of dishonesty and, in the end, is not the most productive path.

Peer review process: Whereas peer review is intended to protect the autonomy and self-governance of the sciences, it has become an agent for the defense of orthodoxy and a constraint on creativity. The review process should be modified to reduce the inherent bias toward prevailing concepts and to encourage mavericks and outsiders.

Scientific journals: Editors of medical and scientific journals should encourage researchers to be more forthcoming in fessing up to serendipity's contributions to their experimental results. The widespread contribution of chance is typically obscured, so other researchers remain ignorant of the important role serendipity plays.

Did You Enjoy This?

Then consider signing up for my Monday Medley newsletter. It's a collection of fascinating finds from my week, usually about psychology, technology, health, philosophy, and whatever else catches my interest. I also include new articles, book notes, and podcast episodes.
