Bohr Radius
[′bȯr ‚rād·ē·əs]
the radius of the first (closest to the nucleus) orbit of an electron in a hydrogen atom, according to the atomic theory of N. Bohr; it is represented by the symbol a₀ or a. The Bohr radius equals (5.2917715 ± 0.0000081) × 10⁻⁹ cm ≈ 0.529 angstrom. It is expressed in terms of universal constants as a₀ = ℏ²/me², where ℏ is Planck's constant divided by 2π and m and e are the mass and electric charge of the electron. In quantum mechanics, the Bohr radius is defined as the distance from the nucleus at which the electron is observed with the greatest probability in an unexcited hydrogen atom.
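As a quick numerical check of the formula above (which is written in Gaussian units), one can evaluate the equivalent SI form a₀ = 4πε₀ℏ²/(mₑe²). The following sketch uses hard-coded CODATA 2018 constants, which are an assumption here rather than part of the original entry:

```python
import math

# CODATA 2018 values in SI units (assumed; not from the original entry)
hbar = 1.054571817e-34   # reduced Planck constant, J*s
m_e = 9.1093837015e-31   # electron mass, kg
e = 1.602176634e-19      # elementary charge, C
eps0 = 8.8541878128e-12  # vacuum permittivity, F/m

# SI form of the Bohr radius: a0 = 4*pi*eps0*hbar^2 / (m_e * e^2)
a0 = 4 * math.pi * eps0 * hbar**2 / (m_e * e**2)

print(f"a0 = {a0:.6e} m")         # about 5.29e-11 m
print(f"a0 = {a0 * 1e8:.4f} cm" if False else f"a0 = {a0 * 1e10:.4f} angstrom")
```

The result, about 5.29 × 10⁻¹¹ m (0.529 angstrom), agrees with the value quoted in the entry.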