As LST has risen in prominence, other researchers have begun analyzing Botvin’s published articles, and many have discerned a common pattern. “Botvin gets positive effects but only in a very small subsample,” says Dennis Gorman, an expert in prevention and evaluation methodology at the Texas A&M University System Health Center.
As an example, Gorman showed in the 1998 article “The Irrelevance of Evidence in the Development of School-based Drug-Prevention Policy, 1986-1996,” in Evaluation Review, that Botvin emphasizes data from students who were exposed to at least sixty percent of the program’s curriculum. Gorman argues that the students Botvin ends up focusing on are likely to be those who were most motivated and least inclined to be involved with drugs in the first place. Botvin responds that breaking down the data from only the high-implementation group tells you the most about a program’s usefulness. (This is a practice that the National Academy of Sciences called “misleading” in a report that condemned the quality of current prevention research.)
Others have gone beyond Gorman in criticizing Botvin’s methods. In an article in the April issue of the Journal of Drug Education, Joel Brown found that when students received fifty-nine percent or less of Life Skills Training, their drug use was actually higher than that of students who didn’t go through LST at all. Botvin categorically denies any negative results: “The fact of the matter is, we present more data than any of these other researchers.”
Another person who takes issue with Botvin’s claims is Stephanie Tortu, an associate professor of public health at Tulane University in New Orleans. In 1984, she was project manager on one of LST’s first major studies – an investigation of the effectiveness of the program in fifty-six schools across New York state.
She says that when Botvin presented her with the draft of the study’s results, she was shocked to discover that crucial data on the students’ alcohol use had been left out. Tortu and the other researchers had found that students who went through LST were more likely to drink alcohol than students who weren’t exposed to the program, but this information was nowhere to be found in the report.
She and several colleagues on the project, including Barbara Bettes, a data analyst, sent Botvin a memo documenting their concern and asking that an investigation of the alcohol findings be made their highest priority.

(Jason Cohn is a freelance journalist based in San Francisco.)
“He was the principal investigator,” Tortu says. “When he saw that alcohol use was up in his prevention group, he should have been trying to figure out why. I felt he was required ethically to call attention to it and investigate it.”
Shortly afterward, Tortu says, Botvin denied her a standard raise, a message she interpreted as punishment for sticking her neck out. Soon after that, Botvin informed her that there was no longer enough money to keep her on the project.
“To be straightforward and candid,” says Botvin, “we’ve produced the strongest effects for tobacco, and also strong effects for marijuana, but the alcohol effects in some studies have been inconsistent.” And he maintains that the data at issue in the staff memo was just preliminary. Bettes counters that Botvin felt comfortable using the same set of data to announce positive effects on tobacco and marijuana.
In the end, the report delivered to New York state did not indicate that alcohol use had increased among students who went through the program. Joel Moskowitz, director of the Center for Family and Community Health at UC-Berkeley, notes that “unfortunately, Botvin is not the only researcher to engage in such practices of overstating positive program effects or neglecting to report negative program effects and limitations of the research.”
Tortu maintains that, ultimately, Botvin has a conflict of interest in both evaluating and profiting from LST, but Botvin says that he has fully disclosed the fact that he receives royalties from sales of the program. Furthermore, he says, the evidence confirming LST’s success is superior. “In my view, the quality of the science in the Life Skills Training research is higher than for any other prevention program that I’m aware of in America,” he says.
That may not be far from true, but it’s also not saying a whole lot. Botvin has been careful to distance LST from DARE and other programs that have been found to be ineffective. He calls LST a “comprehensive” approach to drug education.
“Life Skills Training deals with a broad array of skills that we think kids need to navigate their way through the dangerous minefield of adolescence,” Botvin says. “Skills that will help them be more successful as adolescents and help them to avoid high-risk behaviors, including pressures to drink, smoke or use drugs.”