



There is a lot of confusion because physicists have changed the meaning of “locality” since the EPR paper to mean relativistic locality (the impossibility of sending information faster than light), which is not what Einstein was on about. Einstein’s locality is probably most succinctly summarized as follows:
Assume a bunch of particles are interacting. Let S be the state of the system of interacting particles prior to the interaction, and S’ the state of the system after the interaction. We then want to look at the variance (statistical spread) of the probability distribution of S’ conditioned on S, that is to say, a prediction of the state of the system after the interaction given complete knowledge of the state of the system prior to the interaction. We then compare that to the variance of another prediction, conditioned on both S and x, where x is the state of something outside the system of interacting particles.
If a theory is local, then the two should always be equal for any possible value of x. This is because the outcome of a local interaction should be determined only by the things participating in that interaction, that is to say, by S. Conditioning on complete knowledge of the initial states of everything participating in the interaction should therefore give you sufficient knowledge to predict the outcome of the interaction, S’, to the best that is physically possible.
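To make that comparison explicit, here is one way the criterion could be written down (my notation, not Einstein’s or the EPR paper’s; the second, distributional form is a natural strengthening of the variance comparison described above):

```latex
% One way to write the causal locality criterion (my notation, not Einstein's).
% S = complete pre-interaction state, S' = post-interaction state, x = the state
% of anything outside the interacting system, as defined in the text above.
\[
\operatorname{Var}\!\left[\,S' \mid S\,\right] \;=\; \operatorname{Var}\!\left[\,S' \mid S,\, x\,\right]
\qquad \text{for every admissible } x,
\]
% or, in the stronger distributional form (my reading of the intended condition):
% knowing x adds nothing at all to the prediction of S',
\[
P\!\left(S' \mid S,\, x\right) \;=\; P\!\left(S' \mid S\right).
\]
```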
If you can include something outside of the interaction, that is to say, x, and it improves your prediction further, then the theory must be nonlocal, because the outcome has an irreducible dependence upon something not involved in the interaction.
The point of the EPR paper is that if you don’t assume hidden variables, then this definition of locality is broken. Two entangled particles are said to be ontologically in a superposition of states, meaning that complete knowledge of their state prior to the measurement interaction only lets you predict either outcome with a 50%/50% distribution. But if you condition on knowledge of an observer’s measurement far away, you can improve your prediction of the measurement on your own local particle to 100% certainty, which violates this locality condition.
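A concrete sketch of this, using the standard spin-singlet pair measured along the same axis (my illustration; the EPR paper itself used positions and momenta rather than spins):

```latex
% Spin-singlet illustration (my example, not spelled out in the original text).
% A and B are the two distant measurement outcomes, both taken along the same axis.
\[
\lvert S \rangle \;=\; \tfrac{1}{\sqrt{2}}\bigl(\,\lvert\uparrow\rangle_A \lvert\downarrow\rangle_B
\;-\; \lvert\downarrow\rangle_A \lvert\uparrow\rangle_B\,\bigr)
\]
\[
P(A{=}\,\uparrow \mid S) \;=\; \tfrac{1}{2},
\qquad
P(A{=}\,\uparrow \mid S,\; B{=}\,\downarrow) \;=\; 1.
\]
% Conditioning on the complete state S gives only 50/50, while additionally
% conditioning on the distant outcome B gives certainty, violating the criterion above.
```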
This is still local in the classical case, where the only reason you could improve your prediction is that you were ignorant of the initial state of the particle to begin with, so you never conditioned on the complete initial state of the system in the first place. Hence, adding hidden variables would, supposedly, restore this notion of locality, which we can call causal locality as opposed to relativistic locality.
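For contrast, a minimal classical hidden-variable sketch of the same setup (again my illustration), where the distant outcome adds nothing once the hidden variable is included in the conditioning:

```latex
% Minimal classical hidden-variable sketch (my illustration). The hidden variable
% lambda fixes both outcomes in advance, so once it is included in the conditioning,
% the distant outcome B is redundant and causal locality holds.
\[
\lambda \in \bigl\{(\uparrow_A,\downarrow_B),\;(\downarrow_A,\uparrow_B)\bigr\},
\qquad P(\lambda) \;=\; \tfrac{1}{2} \text{ for each value},
\]
\[
P(A{=}\,\uparrow \mid \lambda) \in \{0,1\},
\qquad
P(A{=}\,\uparrow \mid \lambda,\; B) \;=\; P(A{=}\,\uparrow \mid \lambda).
\]
```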
What Bell’s theorem proves is that adding hidden variables does not restore causal locality. This is because, as he shows, in quantum mechanics the state of an individual particle in a collection of entangled particles can depend upon the configuration of the whole collection of measurement devices, even though that particle only ever interacts with a single measurement device. That means this violation of causal locality is intrinsic to the mathematics of the theory and is not something that arises merely from a lack of hidden variables.
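The usual quantitative statement of this is the CHSH form of Bell’s inequality (a later refinement of Bell’s 1964 argument, not the exact inequality he first wrote down):

```latex
% CHSH form of Bell's inequality. E(a,b) is the correlation between the two distant
% outcomes when the detectors use settings a and b. Any local hidden-variable model
% obeys the bound below; quantum mechanics exceeds it.
\[
\bigl\lvert\, E(a,b) + E(a,b') + E(a',b) - E(a',b') \,\bigr\rvert \;\le\; 2
\qquad \text{(any local hidden-variable model)},
\]
\[
\text{whereas quantum mechanics predicts values up to } 2\sqrt{2} \approx 2.83.
\]
```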
Even worse, as Bell says, adding hidden variables appears to make the theory “grossly nonlocal,” by which he meant that it violates relativistic locality as well, at least without introducing something like superdeterminism or retrocausality.


I think there is nothing more fitting than an anti-ML using the Wikipedia article on Marxism-Leninism as their source. Chef’s kiss.
It is the academic consensus, even among western scholars, that the Ukrainian famine was indeed a famine, not an intentional genocide. This is not my opinion but, again, the overwhelming consensus even among the most anti-communist historians, like Robert Conquest, who described himself as a “cold warrior.” The leading western scholar on this issue, Stephen Wheatcroft, discussed the history of this debate in western academia in a paper I will link below.
He discusses how there was strong debate in western academia over whether it was a genocide up until the Soviet Union collapsed and the Soviet archives were opened. When the archives were opened, many historians expected to find a “smoking gun” showing that the Soviets deliberately had a policy of starving the Ukrainians, but no such thing was ever found, and so even the most hardened anti-communist historians were forced to change their tune (and indeed you can find many documents showing the Soviets ordering food sent to Ukraine, such as this one and this one).
Wheatcroft considers Conquest’s change of opinion to mark the end of that “era” in academia, but he also mentions that very recently there has been a revival of the claims of “genocide.” These, however, are clearly motivated and pushed by the Ukrainian state for political reasons, not academic ones. It is literally a propaganda move. There are hostilities between the current Ukrainian state and the current Russian state, and so the current Ukrainian state has a vested interest in painting the Russian state poorly, and reviving this old myth is good for its propaganda. But it is just that: state propaganda.
Discussions in the popular narrative of famine have changed over the years. During Soviet times there was a contrast between ‘man-made’ famine and ‘denial of famine’. ‘Man-made’ at this time largely meant as a result of policy. Then there was a contrast between ‘man-made on purpose’, and ‘man-made by accident’ with charges of criminal neglect and cover up. This stage seemed to have ended in 2004 when Robert Conquest agreed that the famine was not man-made on purpose. But in the following ten years there has been a revival of the ‘man-made on purpose’ side. This reflects both a reduced interest in understanding the economic history, and increased attempts by the Ukrainian government to classify the ‘famine as a genocide’. It is time to return to paying more attention to economic explanations.