In previous articles (see here, here, and here), I discussed some very interesting results coming from different labs. These results indicated that the half-lives of some radioactive isotopes vary with the seasons. They seemed to imply that the sun was somehow affecting the rate of radioactive decay here on Earth. This made the results controversial, because there is nothing in known physics that could cause such an effect.
One criticism of the studies was that weather could be the real issue. Even though labs are climate-controlled, no such control is perfect. Humidity, pressure, and (to a lesser extent) temperature can all vary in a nuclear physics lab, so perhaps the variations seen were the result of how the changing weather was affecting the detectors. However, the authors used several techniques to take changing weather into account, and all those techniques indicated that it couldn’t explain the variations they saw. The authors were (and probably still are) convinced that they were seeing something real. I was as well. In fact, one of my posts was entitled, “There Seems To Be No Question About It: The Sun Affects Some Radioactive Half-Lives.”
Well, it looks like there is some question about it. Two scientists from Germany decided to measure the rate of radioactive decay of the same isotope (Chlorine-36) that was used in some of the previously mentioned studies. However, they decided to use a different experimental technique. The studies that showed variation in the rate of radioactive decay used a Geiger-Müller detector (often called a “Geiger counter”) to measure the radioactive decay. The two scientists who authored this study used a superior system, based on liquid scintillation detectors. The authors contend (and I agree) that the response of such detectors is much easier to control than the response of Geiger-Müller detectors, so their results are more reliable. They also used a particular technique, called the triple-to-double coincidence ratio, that reduces “noise” caused by background radiation. It is one of the standard techniques employed in detailed measurements of radioactive decay.
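To get a feel for why coincidence counting suppresses background noise, here is a minimal Monte Carlo sketch in Python. All the numbers (event counts, tube efficiency, noise probability) are hypothetical, chosen only to illustrate the idea: a genuine decay can light up all three photomultiplier tubes at once, while background pulses fire single tubes independently and almost never satisfy a two- or three-tube coincidence.

```python
import random

random.seed(1)

# Hypothetical numbers for illustration -- not taken from the paper.
N_DECAYS = 100_000   # genuine decay events reaching the scintillator
EFFICIENCY = 0.8     # chance each of the 3 photomultiplier tubes sees an event

doubles = 0   # events registered by at least two tubes
triples = 0   # events registered by all three tubes
for _ in range(N_DECAYS):
    hits = sum(random.random() < EFFICIENCY for _ in range(3))
    if hits >= 2:
        doubles += 1
    if hits == 3:
        triples += 1

# Background pulses fire single tubes independently, so they only pass a
# coincidence gate if two tubes happen to fire in the same short time window.
P_NOISE = 1e-4        # chance a given tube fires spuriously in one window
N_WINDOWS = 1_000_000
noise_coincidences = sum(
    1 for _ in range(N_WINDOWS)
    if sum(random.random() < P_NOISE for _ in range(3)) >= 2
)

print(f"double coincidences from real decays: {doubles}")
print(f"triple coincidences from real decays: {triples}")
print(f"coincidences from background noise:   {noise_coincidences}")
print(f"triple-to-double ratio: {triples / doubles:.3f}")
```

In the real technique, the measured triple-to-double ratio tracks the detection efficiency, which lets the experimenters correct their count rates for it; the sketch above only shows the background-rejection half of the story.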
What did they see with their more reliable system? They saw some small variations (much smaller than those seen in the previous studies), but unlike the other studies, those variations were not correlated with the season. They appeared to be completely random. This makes it look like the previous measurements that saw fluctuations correlated with the seasons were wrong. As the authors state:1
The work presented here, clearly proves that variations in the instrument readings in 36Cl measurements are due to the experimental setup rather than due to a change in the 36Cl decay rates.
Now, I think the authors are overstating their results. Most scientists understand that science cannot prove anything. However, their paper does make a very strong case that the previous studies weren’t seeing real fluctuations. Instead, they were seeing some artificial effect that was a result of their measurement technique.
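The core statistical question here is whether a series of decay-rate measurements contains an annual periodicity or just random scatter. As a rough sketch (with entirely made-up numbers, not the paper's data), one can fit a one-year sinusoid to two synthetic data sets, one with a small seasonal modulation and one with pure noise, and compare the fitted amplitudes:

```python
import math
import random

random.seed(0)

DAYS = 3 * 365       # three years of daily measurements (hypothetical)
TRUE_AMP = 1e-3      # 0.1% annual modulation in the "seasonal" data set
NOISE = 2e-4         # random measurement scatter

def annual_amplitude(residuals):
    """Least-squares amplitude of a one-year sinusoid in evenly sampled data.

    Over a whole number of years, the Fourier projections onto cos and sin
    at the annual frequency give the least-squares fit directly.
    """
    n = len(residuals)
    c = sum(r * math.cos(2 * math.pi * t / 365.0) for t, r in enumerate(residuals))
    s = sum(r * math.sin(2 * math.pi * t / 365.0) for t, r in enumerate(residuals))
    return 2.0 / n * math.hypot(c, s)

# Fractional deviations of the measured rate from its mean.
seasonal = [TRUE_AMP * math.cos(2 * math.pi * t / 365.0) + random.gauss(0, NOISE)
            for t in range(DAYS)]
flat = [random.gauss(0, NOISE) for t in range(DAYS)]

print(f"fitted annual amplitude, seasonal data: {annual_amplitude(seasonal):.2e}")
print(f"fitted annual amplitude, flat data:     {annual_amplitude(flat):.2e}")
```

The seasonal data set recovers an amplitude near the injected 0.1%, while the pure-noise set yields an amplitude far below the scatter. In essence, the new study found that the Chlorine-36 data look like the second case, not the first.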
Is this the final word? I doubt it. I suppose the authors of the previous Chlorine-36 studies will try to replicate this experiment and see if it has some flaw that would cause it to miss periodic variations. That’s certainly possible. Nuclear physics (and nuclear chemistry) experiments are very tricky, and there are a lot of unforeseen factors that can affect them. However, there are two things we can say for sure:
1. Either the previous studies have come to the wrong conclusion, or this study has come to the wrong conclusion. Hopefully, more studies will be done to figure out which is which.
2. My previous post entitled, “There Seems To Be No Question About It: The Sun Affects Some Radioactive Half-Lives” has too strong a title. There is definitely a question now, and it is a big one!
1. Karsten Kossert and Ole J. Nähle, “Long-term measurements of 36Cl to investigate potential solar influence on the decay rate,” Astroparticle Physics 55:33-36, 2014.