# Park, Buhm Soon: *Computational Imperatives in Quantum Chemistry*

A few years after the advent of quantum mechanics, Paul Dirac issued a famous dictum: “The underlying physical laws necessary for the mathematical theory of a large part of physics and the whole of chemistry are thus completely known, and the difficulty is only that the exact application of these laws leads to equations much too complicated to be soluble.” In this paper, I explore that “difficulty” of “exact application” in dealing with many-electron atoms and molecules, and examine the various approximation methods developed in the early years of quantum chemistry. I focus on those who endeavored to calculate physical and chemical properties without using empirical data—that is, the pioneers of the ab initio methods.

To be sure, the hydrogen molecule served as a testing ground for the accuracy of quantum-mechanical calculations. Shortly after the publication of Walter Heitler and Fritz London’s 1927 paper, which explained the nature of the forces that bind hydrogen atoms, several young scientists took up the task of making the Heitler-London approach quantitatively acceptable or of developing alternative methods. They were mostly graduate students and visiting fellows from various countries, eager to learn and use the new mechanics: for example, Yoshikatsu Sugiura, Shou Chin Wang, Nathan Rosen (later one of the co-authors of the EPR paradox), Sidney Weinbaum, and Hubert M. James. The incremental improvements they made toward better agreement between theory and experiment usually involved an enormous amount of computational labor. I show that this computational constraint became an important factor limiting the application of quantum mechanics to more complex molecules until the appearance of electronic digital computers.

On the other hand, the point of departure for the problem of many-electron atoms was helium. The Norwegian physicist Egil A. Hylleraas’s innovative method of calculating helium’s ionization potential—by introducing a new coordinate, the inter-electronic distance, into the trial function—was much hailed in the early 1930s, but it was not easily adaptable to heavier atoms, because the number of terms that had to be computed increased very rapidly with the number of electrons. Less accurate but more widely applicable than Hylleraas’s method was the one developed by Douglas R. Hartree. I trace the evolution of Hartree’s idea of the Self-Consistent-Field method and the modification of this method by Vladimir Fock and John Slater. Commonly called the Hartree-Fock method, this approximation method began to be used for molecular problems in the early 1950s.
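For readers unfamiliar with Hylleraas’s approach, his 1929 trial function for the helium ground state can be sketched in modern notation (the symbols below are the standard textbook ones, not taken from the sources discussed here). With $s = r_1 + r_2$, $t = r_1 - r_2$, and the new inter-electronic coordinate $u = r_{12}$, the wave function is expanded as

$$
\psi(s, t, u) \;=\; e^{-\alpha s} \sum_{l,m,n} c_{l,m,n}\, s^{l}\, t^{2m}\, u^{n},
$$

where $\alpha$ and the coefficients $c_{l,m,n}$ are determined variationally. The explicit dependence on $u$ is what captured the electron correlation that earlier product-form trial functions missed; but each added term in the sum meant another set of integrals to be evaluated by hand, which is the computational burden at issue in this paper.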

Mainstream quantum theorists tended to dismiss this labor-intensive effort as mere “bone-headed calculation,” or to question the theoretical justifications of the approximation methods. I argue that, despite this cynical attitude, computational imperatives made pivotal contributions to demonstrating the validity of quantum mechanics across disciplinary and national boundaries, and to producing practitioners little worried about its philosophical ramifications.