1.1 Foundation and mandate of the Commission
The Commission on Computational Physics was founded at the 22nd IUPAP General Assembly at Uppsala in 1996, following the two-year activity of a Working Group. The mandate assigned to the new Commission requires it to promote the exchange of information and views in the area of computational studies of problems originating in or relevant to Physics, including:
- numerical and symbolic models and algorithms for the simulation of physical systems
- computational control and data processing of experiments
- the physical basis of computer machinery.
The Commission should collaborate closely with the Computational Physics Group (CPG) of the European Physical Society and the Division of Computational Physics (DCP) of the American Physical Society, both of which have a long tradition of fostering the exchange of knowledge among members of the international community of physicists in this area. Moreover, the Commission was established as a successor to a joint committee of these bodies. This cooperation was implemented by:
- the nomination of the chairmen of the CPG and the DCP as Associate Members of the Commission,
- a joint meeting and
- the invitation of all European members of the Commission to the meetings of the CPG Board.
The membership of the Commission as elected at that General Assembly is composed as follows: 6 from Europe, 2 from North America, 2 from Asia, 1 from South America and 1 from Australia.
1.2 The form of Commission business
The Commission conducted its business mostly by e-mail. It has met twice:
- in Prague, Czech Republic, together with members of the CPG Board and representatives of the DCP, on 25 April 1997 (the Asian and Australian members, who could not attend this meeting, met with the Commission chairman in Singapore during a conference sponsored by the Commission in June 1997),
- in Granada, Spain, on 4 September 1998, during another conference sponsored by the Commission.
The last meeting of the current Commission is scheduled to take place in Atlanta in the week following the 1999 General Assembly.
1.3 Conference sponsorship
The Working Group and the Commission proposed for IUPAP sponsorship 5 conferences held, or to be held, in the years 1997-99:
International Conference on Computational Physics, Santa Cruz, CA, USA, 25-28 August 1997
4th International Conference on Computational Physics (ICCP4), Singapore, 2-4 June 1997
Conference on Computational Physics (CCP'98) - Modelling Collective Phenomena in Complex Systems, Granada, Spain, 1-4 September 1998
Conference on Computational Physics (CCP'99), Atlanta, GA, USA, 21-25 March 1999
5th International Conference on Computational Physics (ICCP5), Kanazawa, Japan, 11-13 October 1999 (this conference was not approved for IUPAP sponsorship by the Executive Council).
The conferences in Santa Cruz, Granada, and Atlanta are part of the CCP series, the successor to the joint EPS-APS "Physics Computing" (PC) conferences organized annually since 1989.
1.4 Associate Members
The Commission decided to invite the chairmen of the CPG and the DCP as Associate Members of the Commission. As the present chairman of the CPG, P. Borcherds, is an ordinary member of the Commission, only B.M. Klein, Davis, CA, USA, was nominated as chairman of the DCP. Since 1998 D. Landau, Athens, GA, USA, has been the DCP chairman and will replace B.M. Klein.
Other Associate Members are R. Gruber, EPFL Lausanne, Switzerland, and D.H. Feng, Philadelphia, PA, USA, as a representative of the ICCP, which has organized the ICCP conferences in the Far East (these are expected to merge with the CCP conference series).
1.5 Internet page
The Commission has established its own Internet page at http://info.rmc.ca/external/iupapc20/ which is referenced on the IUPAP page.
1.6 IUPAP Prize on Computational Physics
The Commission has discussed at length the idea of a special IUPAP Prize in Computational Physics but has not yet reached an implementable proposal. Therefore, only a prize for young authors of contributions at the CCP conferences was approved; it will be recommended to the organizers of future conferences.
2. New developments in Computational Physics
Computational Physics is rapidly maturing as the third leg of the scientific enterprise alongside Experiment and Theory. Traditionally, computational physicists have been self-trained, there being no formal curricula aimed at educating computationalists; indeed, there has been little formal discussion of what computational physics actually is. However, in the past few years, many schools have begun undergraduate and graduate-level programs in computational science and engineering. Computational physics plays a vital role in these programs. This development will help ensure a growing cadre of trained computational physicists.
At the same time the computational tools available continue to evolve at explosive rates. Today, $5000 personal workstations have computing speeds as fast as those provided by a $5,000,000 supercomputer in 1990 and the available memories are actually much larger. This democratization of supercomputing is opening up computational physics to scientists in every country. We may thus expect far greater use of simulation as a basic tool of physics enquiry as we go forward into the next century. Today's supercomputers are largely built using the same commodity technology as in personal computers. Thus we are seeing similar thousand-fold increases in absolute capability. In 1990, the world's fastest computers could carry out a few billion operations per second. In 1996, a massively parallel supercomputer at Sandia National Laboratories in America achieved over a trillion calculations per second. By the middle of the next decade, computers are expected to be nearly a hundred times faster and larger than they are today. While this is occurring, yet another revolution in computing is threatening to make traditional supercomputers irrelevant. Across the world - often in physics departments - scalable clusters of workstations and personal computers linked together by fast commodity networks are becoming virtual supercomputers - at as little as 1/10th the cost of proprietary supercomputers. These clusters are being integrated into the world-wide web at an accelerating pace. This development promises to democratize even the highest end of scientific simulation.
In the past few years, researchers have increasingly been able to investigate fundamental questions in physics computationally with unprecedented fidelity and level of detail. For example, fluid dynamics simulations aimed at understanding turbulence have recently been carried out with over 200 billion computational degrees of freedom in three dimensions. This allows examining fluctuations over a three-order-of-magnitude span in length scales. Using Monte Carlo methods, researchers can now carry out statistical mechanical simulations with stunning fidelity and detail. For example, with current algorithms on the largest machines we could now carry out simulations of the Ising model with 10^12 spins in three dimensions. Doing so would allow us to faithfully simulate the behavior of the Ising model over six orders of magnitude in reduced temperature near the critical point. Similarly, quantum chromodynamic simulations based on lattice gauge theory are already beginning to provide quantitative estimates of important baryon masses. Electronic structure studies of the ground states of metals, semiconductors and molecules are becoming a foundational tool in materials science. Today, ab initio molecular dynamics studies of systems with tens of thousands of electrons and thousands of ion cores can be carried out, and accurate materials and molecular structures predicted, using density functional approaches. As recently as a decade ago, simulations with more than 100 electrons were not practicable. Currently the most accurate method for the study of many-electron problems is the Quantum Monte Carlo method. With today's faster computers and algorithmic improvements for handling the fermion sign problem, QMC simulations are becoming feasible at a large enough scale to make QMC a viable alternative to Hartree-Fock or density-functional approximations when accuracy is paramount.
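The single-spin-flip Metropolis algorithm underlying such Ising studies can be sketched in a few lines of Python. This is a toy illustration on a tiny lattice; the lattice size L, the temperature T and the sweep count here are arbitrary illustrative parameters, and runs at the 10^12-spin scale quoted above rely on cluster algorithms and massively parallel hardware rather than anything this simple:

```python
import math
import random

def metropolis_ising(L=8, T=4.5, sweeps=200, seed=1):
    """Metropolis Monte Carlo for the 3D Ising model on an L x L x L
    lattice with periodic boundaries (J = k_B = 1). Returns the absolute
    magnetization per spin after the given number of sweeps."""
    rng = random.Random(seed)
    N = L * L * L
    spins = [1] * N  # start from the fully ordered state

    def idx(x, y, z):
        # periodic boundary conditions via modular arithmetic
        return (x % L) + L * ((y % L) + L * (z % L))

    def local_field(x, y, z):
        # sum of the six nearest-neighbor spins
        return (spins[idx(x + 1, y, z)] + spins[idx(x - 1, y, z)]
                + spins[idx(x, y + 1, z)] + spins[idx(x, y - 1, z)]
                + spins[idx(x, y, z + 1)] + spins[idx(x, y, z - 1)])

    for _ in range(sweeps):
        for _ in range(N):  # one sweep = N attempted single-spin flips
            x, y, z = rng.randrange(L), rng.randrange(L), rng.randrange(L)
            s = spins[idx(x, y, z)]
            dE = 2.0 * s * local_field(x, y, z)  # energy cost of flipping
            # Metropolis acceptance: always accept downhill moves,
            # accept uphill moves with probability exp(-dE/T)
            if dE <= 0 or rng.random() < math.exp(-dE / T):
                spins[idx(x, y, z)] = -s

    return abs(sum(spins)) / N
```

Deep below the 3D critical temperature (T_c is about 4.51 in these units) the magnetization returned stays near 1, while near and above T_c it collapses toward 0. Resolving the critical region over six orders of magnitude in reduced temperature is what demands lattices of the 10^12-spin scale, since the correlation length that must fit inside the box diverges at the critical point.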
Puzzling phenomena such as sonoluminescence are yielding to understanding through a combination of molecular-dynamics and continuum shock-physics simulations. Four-dimensional, highly detailed simulations combining resistive magnetohydrodynamics with radiation transport are becoming possible. These tools are aiding in understanding the life cycle of stars and are being used to design inertial-confinement fusion experiments aimed at achieving net energy gain from fusion in the laboratory.
In biophysics, we are making progress in simulating the folding of complex proteins. Simulations are helping to unravel the physical processes involved in the informational role of primary DNA structure (the genetic sequence), as well as delving into the role of secondary structures (e.g., detachments and loops) in DNA. Researchers are also modeling with increasing detail the physics of enzymatic catalysis. Recently, key progress has been made in the use of classical density functional theory to model ion channels in cells.
Finally, computational physics involves more than using simulation to provide insight and interpretation. It involves the acquisition, management and understanding of seas of experimental data. Two areas stand out. In high-energy physics, the ability to acquire, store and interpret terabyte data sets is becoming a key part of progress in accelerator experiments. Similarly, in geophysical modeling of global climates, satellite data provides critical information on the overall state of the global climate as well as key global parameters needed to improve models and to validate theoretical methods. The acquisition and management of these data sets poses grand challenges in real-time data acquisition, in large-scale data management, and in data visualization. Computer graphics has become an important tool for an ever larger group of physicists. This is due to dramatic price drops in video memory and graphics cards, and to improvements in graphics libraries. The need for fast, high-resolution graphics is growing with the difficulty of interpreting ever larger volumes of experimental data and of understanding simulations with ever larger numbers of degrees of freedom. Computer graphics is similarly important as a debugging tool for ever more complicated simulation software.
Looking forward, within the next few years we may expect lattice gauge simulations in QCD sufficiently accurate to confirm or eliminate current theoretical models and approximations. In biophysics, we should see great progress in ab initio and classical molecular mechanical simulation of many of the dynamic processes involved in the microscopic evolution of cellular building blocks. In materials physics, we should see a revolution in mesoscopic physics enabled by microscopic computer experiments on realistically defected solids and melts. Both of these latter two predictions are predicated on fundamental progress in theoretical methods, and in related algorithms, to deal with the highly disparate timescales of two key phenomena: optical-mode vibrations, O(10^-12 s), and defect diffusion, O(10^-3 - 1 s).
In the area of the physics of computing, progress continues in attempts to develop fundamentally new approaches to computation based on quantum computers. The problems to be overcome are extremely formidable. Nevertheless, great progress has been made (essentially all of it in the past five years), and we may expect a rich exploratory development phase to unfold in this area over the next five years. On a more mundane and practical level, the availability of inexpensive, commodity simulation engines with capabilities in the many-gigaflops/gigabyte range, along with new educational and national research initiatives, will continue to attract more physicists into the computational arena. In sum, the past five years have been the brightest in the history of computational physics, and progress in the next five will eclipse even the accomplishments of the past five.
William J. Camp,