Bell's theorem


[′belz ‚thir·əm] (quantum mechanics) A theorem stating that no hidden-variable theory satisfying the condition of locality can reproduce all the statistical predictions of quantum mechanics; it places an upper limit, for any such theory, on the strength of correlations between measurements of spatially separated objects, whereas quantum mechanics predicts correlations between such measurements that exceed this limit.
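The limit on correlation strength is most often stated in the CHSH form of Bell's theorem: for any local hidden-variable theory, the combination S of four correlation functions obeys |S| ≤ 2, while quantum mechanics predicts |S| = 2√2 for a spin singlet pair at suitably chosen analyzer angles. A minimal sketch (the angle choices below are the standard ones that maximize the quantum violation, not taken from this entry):

```python
import math

def corr(a, b):
    # Quantum-mechanical correlation for spin measurements on a
    # singlet pair at analyzer angles a and b: E(a, b) = -cos(a - b).
    return -math.cos(a - b)

# CHSH combination S = E(a,b) - E(a,b') + E(a',b) + E(a',b').
# Any local hidden-variable theory must satisfy |S| <= 2.
a, a2 = 0.0, math.pi / 2
b, b2 = math.pi / 4, 3 * math.pi / 4
S = corr(a, b) - corr(a, b2) + corr(a2, b) + corr(a2, b2)
print(abs(S))  # 2*sqrt(2) ~ 2.828, exceeding the local bound of 2
```

The violation |S| = 2√2 > 2 is what the entry refers to: quantum mechanics predicts correlations stronger than any local hidden-variable theory can produce.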