EAST ST. LOUIS - Thousands of plaintiffs who claim the weed killer paraquat caused their Parkinson’s disease rely mostly on a 1997 study that didn’t establish causation, according to the transcript of a four-day hearing in Chief U.S. District Judge Nancy Rosenstengel's court.
The plaintiffs' expert, Cornell University professor Martin Wells, assigned 74% of the weight of his causation opinion to the study.
Wells assigned light weights to the six other studies and calculated a higher odds ratio for the association between paraquat and Parkinson’s disease than any of the individual studies found.
He found no value in more recent studies, including one that led the U.S. Environmental Protection Agency (EPA) to soften the advice on its website.
Defendants Syngenta and Chevron await a ruling from Rosenstengel on a motion to exclude Wells' opinions.
Rosenstengel presides over almost 5,000 paraquat suits from district courts across many states by appointment of the U.S. Judicial Panel on Multidistrict Litigation in Washington, D.C.
She has selected four plaintiffs for trials meant to shape settlement talks but hasn’t set dates.
In August she held a four-day hearing that focused mostly on Wells.
According to the transcript, recently obtained by the Record, plaintiffs and defendants stated their positions on causation during a short session on Aug. 21 that opened the four-day event.
Plaintiff counsel Eric Kennedy of Cleveland began by saying, “Paraquat is an extraordinarily effective killer of cells. Plant cells, animal cells, human cells.”
He said applicators can create a mist, and droplets can get into the lungs.
“They're absorbed into the bloodstream. The blood flows, circulates to the brain,” he said.
Kennedy said applicators can get paraquat in their mouths when they breathe.
“It's absorbed through the stomach wall, gets into the bloodstream,” he said.
“The blood circulates to the brain and the paraquat is deposited outside of the blood vessels into the tissue of the brain,” he added.
“It could be absorbed by the skin into the bloodstream. The blood circulates to the brain. The paraquat is deposited in the brain,” he continued.
Kennedy said defendants knew in the late 1960s that paraquat got into the brains of humans.
“A series of autopsies came to the defendants for decades where someone would drink, intentionally or accidentally, paraquat,” he said.
“Within days, weeks, as much as months, it would cause their death. An autopsy would be done and they would find paraquat in the brain,” he added.
Kennedy told Rosenstengel the defendants would ask her to believe an extraordinary killer of cells gets into the brain, stays for an indefinite period, and causes no damage.
He said defendants know it stays in the mouse brain with a half-life of 26 days, and it stays in the human brain “a whole lot longer.”
“We are not required to establish that paraquat is the sole cause or that it’s the only cause, simply that it is a cause,” he said.
“We’re not saying that there is a consensus that paraquat causes Parkinson’s disease,” he added.
“What we’re saying is there is an issue that exists, that the voices that are in support of the existence of causation can be heard at the medical meetings, could be heard in regulatory agencies throughout the world, in textbook and review articles. The voice is there,” he continued.
Chevron counsel Leon DeJulius of New York City rejected Kennedy's arguments, saying, “Science is to be done in peer-reviewed scientific articles, not in a courtroom.”
“When an expert purports to apply principles and methods in accordance with professional standards and yet reaches a conclusion that other experts in the field would not reach, the court may fairly suspect that the principles and methods have not been fairly applied,” he said.
“When an expert's on an island, that opinion should not come in,” he added.
DeJulius said Wells hadn’t studied paraquat or Parkinson’s disease.
He said Wells had in the past disclaimed the ability to opine on causation.
“None of that stops him here,” he said.
Syngenta counsel Ragan Naresh of Washington also said no scientist had published an analysis in peer-reviewed literature concluding that paraquat causes Parkinson's disease.
He said Parkinson's disease was given a name in the 1800s, “but it was likely that people were living with Parkinson's for thousands of years before that.”
“There's only one known cause of Parkinson's disease, and it's genetics,” he said.
Wells testified on Aug. 22, the second day of the hearing.
Wells said he started working at Cornell in 1987.
Then in the late 1990s, the university asked him to be founding chair of a department of biological statistics and computational biology.
“With new technologies there are new statistical methods that need to be developed, so I sort of jumped into biology,” he said.
“My research is a bit of a balance between theoretical methods development and applied epidemiology and biostatistics,” he added.
Kennedy asked who funded his research over the years, and Wells said the National Science Foundation, National Institutes of Health, and the Army.
Kennedy asked how he started his research, and Wells said he focused on six major review articles and an EPA review of studies.
Kennedy asked if he arrived at an opinion to a reasonable degree of scientific certainty as to whether or not occupational exposure to paraquat could cause Parkinson’s disease.
“Looking at this type of study, occupational studies or use studies, use of paraquat is causative of Parkinson’s disease. Causes Parkinson’s disease,” Wells said.
Kennedy then asked about “association,” and Wells explained that the term is used when one thing relates to another.
He added that measures of risk such as odds ratios and hazard ratios quantify the association in the data.
“You’re trying to understand the population level by using sample values from data to estimate what these population level values are,” he said.
Kennedy asked, “Association does not automatically mean causation, correct?”
“No, not at all,” Wells said.
“But the existence of an association is step one?” Kennedy asked.
“It’s very important, yes,” Wells responded.
He explained that the odds ratio is the measure of association.
“When an odds ratio is two, that indicates it’s sort of doubling the risk or doubling the odds so you’d say like the odds were two to one,” he said.
“Over one, there’s an association and then the question is how far over one?” he added.
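For readers following the statistics, the odds ratio Wells describes comes from comparing the odds of exposure among cases with the odds among controls. A minimal sketch in Python, with invented counts that are not from any study in the case:

```python
# Hypothetical case-control counts (illustrative only, not from any study in the case):
#              exposed   unexposed
# cases            40          60
# controls         20          80
cases_exposed, cases_unexposed = 40, 60
controls_exposed, controls_unexposed = 20, 80

odds_cases = cases_exposed / cases_unexposed            # 40/60, about 0.67
odds_controls = controls_exposed / controls_unexposed   # 20/80, exactly 0.25

odds_ratio = odds_cases / odds_controls
print(f"odds ratio = {odds_ratio:.2f}")  # 2.67: exposed odds are about 2.7 times the unexposed odds
```

A ratio of exactly one would mean the odds are the same in both groups, which is why, as Wells put it, the question is how far over one the ratio lands.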
Kennedy asked what his odds ratio was, and Wells said, “Two point eight.”
Kennedy asked, “Why is your odds ratio higher than those that we see in these six studies?”
“The studies that I collected are a bit more focused,” he responded.
Wells said the individuals in the studies were exposed to paraquat through direct use rather than ambient exposure.
Kennedy asked if he considered a 2020 study by Shrestha.
Wells said the study had 66,000 participants, and the number of exposed Parkinson’s cases was 8,600.
“They had a lot of other pesticides in there and that’s where the big number comes from; but this is a case about paraquat, so we should look at the paraquat information,” he said.
Wells said he relied on a 2011 study by Tanner.
“They sent people out to the farms. They did surveys and kind of tried to chase down what the exposure was,” he said.
Kennedy asked if he considered a study by Pouchieu, and Wells said it was weak.
“In their model that’s unadjusted for multiple exposures, they find an elevated risk; and then when they put the exposures in, that odds ratio plummets,” he said.
Rosenstengel also had questions for Wells, asking if he had a general problem with cohorts.
“It’s great to have a cohort if you can actually get the information,” he said.
“When you’re defining your exposure and looking forward, like the Shrestha study, you might have to wait a long time,” he added. “You don’t know when they get the disease. You may have to wait long, long, long.”
“Retrospective cohort is easier to do particularly when you have databases because you can look back and then you can be sure you capture everything in the scope of the data,” he continued.
Rosenstengel asked him to explain his weighting process, and he said the weights were calculated from information in the studies.
“If you have an individual study that’s a small study and it’s quite variable, the meta-analysis procedure will put less weight on that just because it’s a study with not as much information,” he said.
“If you have a study that’s larger, the meta-analysis procedure will put more weight on that because it has sort of tighter information,” he added. “It’s not something that I do. It’s part of the statistical procedure.”
Rosenstengel asked, “The more weight you give to a study is determinative of the outcome?”
“Definitely, and that’s a function of the variability from the underlying studies,” he responded.
She asked about 74% weight for a study by Liou.
“It’s from the data. I mean I didn’t calculate that,” Wells responded.
“I put in the confidence interval from Liou into the software along with the other point estimates and confidence intervals and then you run the procedure and then it gives that number,” he explained.
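Wells appears to be describing a standard inverse-variance (fixed-effect) meta-analysis, in which each study's weight is recovered from the width of its confidence interval rather than assigned by the analyst. A minimal sketch of that kind of calculation in Python, with invented odds ratios and intervals rather than the actual figures from the seven studies:

```python
import math

# Hypothetical (odds ratio, 95% CI) values, illustrative only;
# these are not the actual figures from the studies Wells used.
studies = {
    "Study A": (3.2, (1.6, 6.4)),   # tight interval: gets a large weight
    "Study B": (1.8, (0.7, 4.6)),
    "Study C": (2.1, (0.5, 8.8)),   # wide interval: gets little weight
}

weights = {}
weighted_log_sum = 0.0
for name, (or_value, (lo, hi)) in studies.items():
    # Recover the standard error on the log scale from the 95% CI
    se = (math.log(hi) - math.log(lo)) / (2 * 1.96)
    weights[name] = 1 / se ** 2                    # inverse-variance weight
    weighted_log_sum += weights[name] * math.log(or_value)

total = sum(weights.values())
pooled_or = math.exp(weighted_log_sum / total)

for name, w in weights.items():
    print(f"{name}: {100 * w / total:.0f}% of the weight")
print(f"pooled odds ratio = {pooled_or:.2f}")
```

In a calculation of this kind, a single tight-interval study can dominate the pooled estimate, which is consistent with Wells' account of how Liou came to carry 74% of the weight.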
Next, Chevron counsel Sharyl Reisman of New York City examined Wells.
Reisman asked if he had ever conducted a study assessing occupational exposure, and he said no.
She asked if he had training in how to perform an assessment, and he said no formal training.
She asked if it was his first assessment of environmental exposure involving a chemical product, and he said yes.
Reisman asked, “The number 2.8 only reflects and only accounts for data and odds ratios and confidence intervals pulled from these seven studies, correct?”
Wells said yes.
He also admitted that the most recent study was 12 years old and that the Tanner study didn't rely on more recent data available today.
Reisman asked if the Shrestha study collected data for 13 additional years, and he said yes.
She asked if the Liou study involved Parkinson’s patients recruited from a clinic in Taiwan with no U.S. subjects, and he said yes.
Wells also admitted that the Liou study didn't control for smoking or overall pesticide exposure, but said “they had given alternative analysis.”
“We don’t know what that odds ratio would be if you did indeed control for smoking and exposure to other pesticides, correct?” Reisman asked.
“I don’t have that number, but I have a sense of what it would be given the information in the paper,” he responded.
She asked if only two studies were statistically significant, and he said, “Yes, but that’s the role of meta-analysis.”
“You can combine and leverage information from the other studies in the combined analysis,” he said.
She asked if his seven studies were a subset of a universe of studies looking at occupational exposure to paraquat and Parkinson’s disease.
“They’re a subset, but the subset is chosen because I think they’re very high quality studies,” he said.
Reisman asked if Pouchieu in 2019 and Shrestha in 2020 were the largest studies of occupational exposure to paraquat.
“They have lots and lots and lots of patients, but that doesn’t impress me,” he said.
“Large is one thing. High quality is another thing,” he added.
Wells agreed that those two studies were numerically bigger than all his studies together.
Reisman asked if a 2.8 odds ratio accounted for risk estimates from his studies, and he said, “No, it’s numerically calculated.”
“I put in the 95% confidence intervals from the studies, and the computer has a routine algorithm it uses,” he said.
“It calculates those weights based on those numbers so the numbers that I put in reflect what the calculations are,” he added.
He explained that the calculation could only reflect the numbers from the studies he put into it.
Reisman then asked if none of the three most recent analyses resulted in odds ratios that reached two, and he said correct.
She asked about a review of reviews from 2021 by Weed, and Wells said, “Yes I recall it because he’s a notorious guy.”
Reisman quoted Weed: “No author of any published review stated that it has been established that exposure to paraquat causes Parkinson’s disease, regardless of methods used and independent of funding source.”
She asked if not a single study that served as the foundation for his analysis found causation.
“Mine are looking at use, so it’s a different set of inclusions in the study,” he said.
“These are broader studies. They include drift cases. Mine are very focused,” he added.
Reisman asked if the EPA conducted a systematic review of 26 epidemiology studies, and Wells said, “That’s the magnitude.”
She quoted the agency: “Overall there is limited but insufficient epidemiologic evidence at this time to conclude that there is a clear associative or causal relationship between occupational paraquat exposure and Parkinson’s disease.”
Wells agreed that the agency updated a statement on its website after the Shrestha study, stating that it “reported no association between paraquat exposure and Parkinson’s disease.”
Reisman quoted the EPA: “This updated study did not replicate earlier 2011 findings from the Agricultural Health Study that were considered by EPA and suggested a potential association may exist.”
“EPA has not found a clear link between paraquat exposure from labeled uses and adverse health outcomes such as Parkinson’s disease and cancer,” she continued.
Reisman asked, “The updated study could not replicate the findings from the study that was published ten years earlier based on data that was 20 years earlier, correct?”
“Yes, they came up with a different result,” he responded.
She asked if Shrestha estimated the hazard ratio at 1.09, and Wells said, “That involves the adjustment procedure that I think has some problems.”
She asked if the study didn’t report an association between paraquat exposure and Parkinson’s disease, and he said it didn’t.
Reisman finished the examination, and Kennedy asked Wells whether older studies mean lower quality.
Wells said it depends.
“People were exposed to something in 1997. They could be exposed to something in 2018. It’s still exposure,” he said.
Then Rosenstengel asked what the size of a study had to do with the weighting process.
Wells said confidence intervals in a very large study are narrower, so such a study gets more weight.
“The problem is you have to look at the quality of the study,” he said.
“Pouchieu is an enormous study. I think it’s very poor quality,” he added, saying big studies are “hard to pull off.”
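Wells' point about size is standard statistics: for a given level of variability, the standard error shrinks with the square root of the sample size, so a larger study produces a narrower confidence interval and, under inverse-variance weighting, automatically receives more weight. A toy illustration in Python, again with invented numbers:

```python
import math

# Same assumed variability (standard deviation s, invented), different sample sizes.
s = 2.0
for n in (100, 1_000, 10_000):
    se = s / math.sqrt(n)    # standard error shrinks with sqrt(n)
    half_width = 1.96 * se   # half the width of a 95% confidence interval
    print(f"n={n:>6}: CI half-width = {half_width:.3f}, inverse-variance weight = {1 / se ** 2:,.0f}")
```

The weighting machinery rewards size on its own; Wells' objection to the large studies rests on quality, not on the arithmetic.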