IN FEBRUARY, THE HEAD OF DRUG ABUSE Resistance Education – used in seventy-five countries worldwide – made the extraordinary admission that the program has not been effective. Nonetheless, the Robert Wood Johnson Foundation gave DARE a $13.7 million grant to bring the curriculum up to date and to scientifically evaluate its usefulness. The foundation reasoned that it would be easier to change DARE than to bring another program to its level of penetration. And so, in September, DARE will launch its new and improved program with great fanfare in six cities, including New York and Los Angeles. In March 2002, administrators will implement it worldwide.
Some are interpreting the DARE-makeover announcement as a signal that science is coming to the rescue at last in the politically sensitive field of drug education. Zili Sloboda, the former director of the Division of Epidemiology and Prevention Research at the National Institute on Drug Abuse, was chosen to oversee the evaluation of the renovated program. She says that DARE “will do everything it can to update its programs and to make them evidence-based.”
But many social scientists are unimpressed. They argue that drug-prevention education must be the only category in their field where failure – such as DARE’s – is treated as an occasion to continue and even expand a program.
In fact, the problem goes way beyond DARE. In interviews with more than a dozen experts, a picture emerges of a dysfunctional and highly politicized drug-education environment in which even the “research-based programs” now favored by the federal government don’t stand up to scientific scrutiny. Indeed, many say, despite all the “scientific” claims to the contrary, drug-prevention education – at least the abstinence-based model that reigns in America’s schools – is just as likely to have no effect, or to make kids curious, as it is to persuade them not to use drugs.
Here’s why the current models are flawed: Drug-education researchers generally evaluate their own programs, and, with few exceptions, they tend to parse out their data so programs seem more successful than they actually are. Scientists call it “over-advocating.” Positive results in limited situations are exaggerated, and instances of increased drug use are obscured or suppressed. Such practices should never survive the process of peer review, critics say, but they do.
The federal government plays a major role. Key agencies set unrealistic guidelines that ensure failure, and they continue to nurture programs despite bountiful evidence that they don’t work. What’s worse, drug education is big business. Fueled by a perpetual sense of crisis, schools and communities pour scarce resources into prevention programs. Each year, the federal government spends upward of $2 billion on drug-prevention education, and states and localities contribute more, according to data extrapolated by Joel Brown, director of the Center for Educational Research and Development, in Berkeley, California. Estimates on total expenditures range as high as $5 billion annually. Researchers who evaluate their own programs stand to profit only when they can report success. And these same researchers are often asked to sit on exclusive government panels, deciding which programs will be recommended for sale to the nation’s schools.
DARE MAY BE “THE ONLY GAME in town,” as Sloboda puts it, but that hasn’t kept other researchers from developing programs to fight for a share of the market. These competitors have been buoyed by DARE’s public-relations woes and by a 1998 law limiting Department of Education drug-prevention funds to programs that at least minimally demonstrate “the promise of success” in reducing teen drug use.
Only one of the programs deemed exemplary by all three major agencies – the Department of Education, NIDA and the Center for Substance Abuse Prevention – is commercially available nationwide. That program is called Life Skills Training. While LST is not nearly as big as DARE, the program is currently in about 3,000 schools, and an estimated 800,000 students have gone through it to date, according to a spokesman.
LST has never been evaluated independently; two studies are going on now, with the results expected this summer. But the program’s creator, Gilbert Botvin, a professor of psychology and public health at Cornell University, claims that the program reduces tobacco, alcohol and marijuana use in young people by up to an incredible seventy-five percent. He has published more than a dozen articles in leading journals, including the Journal of the American Medical Association, saying as much. These would be remarkable outcomes indeed, enough to warrant implementing LST in every school in the country, if it weren’t for one thing: They are probably not true.