Unpleasant Statistics: Santa Causes Kids to Turn Away From Christianity
By Max Candocia | December 23, 2017
This December, I surveyed 312 users from Reddit, Facebook, and, to a lesser extent, other social media sources on how they celebrated (or didn't celebrate) Christmas. You can find an active version of the survey here. A good portion of this article is automatically generated, so I may update it in the future.
Other Articles in Series
This is the fifth and final article, and I'm keeping it short and (bitter)sweet. I'm going to use statistics to show that Santa makes kids turn away from Christianity when they become adults...without abusing statistical methodology, of course.
Religious Transitions
Below is a plot of the religions that respondents identified with before age 18 and their current religions. The diagonal is fairly high, although, apart from atheism/agnosticism, there are many off-diagonal elements indicating that respondents switched religions, including between different sects of Christianity.
Taking all responses from people who were Christian before turning 18 and are now 18 or older (a sample size of 194), I construct a logistic model to estimate the effect of parents telling their kids that Santa is real, controlling for gender and region of the US. I do not count transitions from one form of Christianity to another as leaving Christianity.
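To show the shape of a model like this (this is not the author's actual code, which is linked at the end of the article), here is a minimal sketch of fitting a logistic regression by Newton-Raphson on synthetic data. The variable name `santa`, the baseline rate, and the true odds ratio of 0.4 are all assumptions chosen for the demo:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5000  # synthetic sample; the real survey had 194 usable responses

# hypothetical binary predictor: 1 if parents said Santa was real
santa = rng.integers(0, 2, n)

# simulate "left Christianity" with a true odds ratio of 0.4 for santa
log_odds = -0.5 + np.log(0.4) * santa
left = rng.binomial(1, 1 / (1 + np.exp(-log_odds)))

# fit logistic regression (intercept + santa) by Newton-Raphson
X = np.column_stack([np.ones(n), santa])
beta = np.zeros(2)
for _ in range(25):
    mu = 1 / (1 + np.exp(-X @ beta))             # fitted probabilities
    grad = X.T @ (left - mu)                     # score vector
    hess = X.T @ (X * (mu * (1 - mu))[:, None])  # observed information
    beta += np.linalg.solve(hess, grad)

# the exponentiated coefficient is the odds ratio reported in such plots
odds_ratio = np.exp(beta[1])
print(f"estimated odds ratio: {odds_ratio:.2f}")  # should land near 0.4
```

In practice you would use a library fitter (e.g. R's `glm` or Python's statsmodels) rather than hand-rolling Newton-Raphson, but the exponentiated-coefficient step is the same.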
Note: blue means more likely to turn away, red means less likely to turn away
Oh, no! Look at that figure on the bottom. Apparently, kids who were told their gifts came from Santa were up to 2.5 (1/0.4) times as likely to change religions as kids who got gifts but were not told that Ol' Saint Nick brought them. It even has a star next to the number, which means it is STATISTICALLY SIGNIFICANT at the 95% confidence level. You probably want to keep them away from the West Coast and other countries, too.
Click Here for a half-decent model and an explanation of why the above should not be taken at face value
Why is the above incorrect & misleading?
In reality, it is mostly religion that affects the probability. I abused my choice of control variables to produce the desired effect. This is one form of what is known as p-hacking.
Also, an odds ratio of 2.5 will seldom correspond to anything near a 2.5x change in probability unless the probability is already very low. Odds are defined as the ratio of the probability that an event happens to the probability that it does not, so 50/50 would be 1:1, or 1, and 40/60 would be 2:3, or 2/3.
An odds ratio is the ratio of the odds after some effect to the odds before it.
For example, if the "default" probability is 50/50, the odds are 1:1, or 1. An odds ratio of 0.4 changes the odds to 0.4 * 1 = 0.4, i.e., 2:5, which corresponds to probabilities of 2/7 and 5/7, or about 29% and 71%. That is only about a 43% relative decrease in one probability and a 43% relative increase in the other. So, while that can still be a noticeable effect, it is far from the 2.5x, or 150%, change that I weaseled in with "up to" above.
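The arithmetic above can be checked directly. A minimal sketch of the odds-to-probability round trip:

```python
def odds(p):
    """Odds of an event with probability p: p / (1 - p)."""
    return p / (1 - p)

def prob(o):
    """Convert odds back to a probability: o / (1 + o)."""
    return o / (1 + o)

baseline = 0.5                   # the "default" 50/50 probability
new_odds = odds(baseline) * 0.4  # apply an odds ratio of 0.4
new_p = prob(new_odds)           # 0.4 / 1.4 = 2/7

print(round(new_p, 3))                          # 0.286
print(round((new_p - baseline) / baseline, 3))  # -0.429, i.e. ~43% relative decrease
```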
The model below uses religion as a predictor, and it shows that Evangelical Christians are much less likely to change religions than members of other branches of Christianity. No other variables were significant, including how frequently the respondent attended religious services as a child; that variable did not even survive the stepwise variable-selection process.
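To illustrate what stepwise selection does, here is a small sketch of forward selection by AIC on synthetic data. The predictor names and effect sizes are made up for illustration (an `evangelical` dummy with a real effect, an `attendance` variable that is pure noise), and this is not the author's actual selection code:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 2000

# hypothetical predictors: the religion dummy matters, attendance is pure noise
evangelical = rng.integers(0, 2, n)
attendance = rng.integers(0, 5, n).astype(float)
p_leave = 1 / (1 + np.exp(-(-0.3 - 1.2 * evangelical)))
left = rng.binomial(1, p_leave)

def log_likelihood(X, y, iters=25):
    """Fit a logistic regression by Newton-Raphson; return its log-likelihood."""
    beta = np.zeros(X.shape[1])
    for _ in range(iters):
        mu = 1 / (1 + np.exp(-X @ beta))
        beta += np.linalg.solve(X.T @ (X * (mu * (1 - mu))[:, None]),
                                X.T @ (y - mu))
    mu = 1 / (1 + np.exp(-X @ beta))
    return np.sum(y * np.log(mu) + (1 - y) * np.log(1 - mu))

def aic(ll, k):
    return 2 * k - 2 * ll  # lower is better

# forward stepwise: start with the intercept, add a variable only if it lowers AIC
pool = {"evangelical": evangelical, "attendance": attendance}
X, chosen = np.ones((n, 1)), []
improved = True
while improved and pool:
    improved = False
    current = aic(log_likelihood(X, left), X.shape[1])
    trials = {name: aic(log_likelihood(np.column_stack([X, col]), left),
                        X.shape[1] + 1)
              for name, col in pool.items()}
    best = min(trials, key=trials.get)
    if trials[best] < current:
        X = np.column_stack([X, pool.pop(best)])
        chosen.append(best)
        improved = True

print(chosen)  # 'evangelical' is picked first; 'attendance' stays out unless it lowers AIC
```

A variable that "does not survive" selection is one whose improvement in log-likelihood never outweighs the AIC penalty for adding a parameter.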
The sampling is somewhat biased with respect to religion, though, so do not take this too seriously.
Source Code
I have the source code for my analysis on GitHub here. All the responses (after removing timestamp/order info) will be released once I finish my article series.