History of Physics

The growth of physics has brought not only fundamental changes in ideas about the material world, mathematics and philosophy, but also, through technology, a transformation of society. Physics is considered both a body of knowledge and the practice that makes and transmits it. The year 1900 is a convenient boundary between classical and modern physics.

The discipline is usually considered to have begun in ancient Greece, where physics signified the study of nature in general: it was literary, qualitative, and all-encompassing rather than experimental, and it did not rely on mathematics. [Geometrical optics]?, mechanics, and hydrostatics? belonged instead to applied mathematics.

The Aristotelian conception of the scope of physics prevailed in the universities into the 18th century. Meanwhile, a different conception developed outside the schools, exemplified in [William Gilbert]?'s De magnete (1600), the work of a practicing physician. Gilbert's book on magnetism was the first report of connected, sustained, and reconfirmed experiments in the history of physics. Magnetism became a popular study and not only for practical application: it could also be used in "natural magic"--the production of perplexing effects by concealed mechanisms. Such magic figured prominently in the academies or museums formed in the 17th century by gentlemen virtuosos interested in the unusual. Their play often amounted to experiment. Many of their practices--for example, cooperative work and showy demonstrations--recurred in the first national academies of science, which were established in London and Paris in the 1660s.

Many virtuosos explained their magic or experiments on the principles of the new mechanical philosophy, of which Rene Descartes's Principia philosophiae (Principles of Philosophy, 1644) was the chief guide and apology. Readers did not need to accept Descartes's peculiar models or metaphysics to see the advantages of his system. A leading mechanical philosopher, [Robert Boyle]?, explained these advantages: compared to Aristotle's scheme, corpuscularism offered simple, comprehensive, useful, and intelligible accounts of physical phenomena and a basis for further advance. Descartes stressed another advantage as well: his physics, built on extension and motion, was implicitly mathematical.

Like Galileo Galilei before him, Descartes called for a physics patterned after mixed mathematics. Each was able to fashion bits of such a physics, but neither managed to quantify a large domain of phenomena. Sir Isaac Newton was the first to do so, in his Philosophiae naturalis principia mathematica (Mathematical Principles of Natural Philosophy, 1687). The reach of his principles, from Saturn's satellites to pendulums swung on Earth, and the precision of his results astounded his contemporaries. They did not see in Newton's masterpiece, however, a model for a comprehensive exact physics. As successful as the Principia was as a mathematical description, it failed as physics for those faithful to Descartes's goal. Either one took the principle of gravity--the mutual acceleration of all particles in the universe--as mathematical description, as Newton usually did, and had no physics, or one postulated that bodies could act on one another at a distance. In the latter case, according to the up-to-date 17th-century physicist, one returned to the unintelligible, magical explanations from which Descartes had rescued natural philosophy.

During the 18th century physicists lost the scruples that had caused Newton as well as his adversaries to worry about admitting action-at-a-distance forces into physics. Even if the physicist knew nothing of the true nature of these forces, they were nonetheless useful in calculation and reasoning; a little experience, as one physicist said, "domesticates them." This instrumentalism? began to spread after 1750. It set the precedent of choosing theories in physics on the basis not of conformity to general, qualitative, intelligible principles but of quantitative agreement with measurements of isolated phenomena.

Institutionalization of Physics

During the first half of the 18th century the demonstration experiment, long the practice of the academies, entered university physics courses. For practical reasons, most of these demonstrations concerned physics in the modern sense. By 1750 the physicist no longer had responsibilities for biology and chemistry; instead he had the upkeep and usually the expense of a collection of instruments. By the 1780s he was winning institutional support for his instructional hardware: a budget for apparatus, a mechanic, storage and maintenance facilities, and lecture rooms. By the end of the century advanced professors had apparatus and research assistance from their institutions.

The initiative in changing the physics curriculum came from both inside and outside the discipline: inside, modernizing professors strove to master Newton's ideas and to enrich the teaching of physics (and themselves) through paid lecture demonstrations; outside, enlightened ministers, believing that experimental physics together with modern languages might be helpful in the running of states, pressed the universities to discard scholastic? principles and "everything useless."

Meanwhile, physics became important in schools where mining, engineering, and artillery were taught. The most advanced of these schools, the [Ecole Polytechnique]? established in Paris in 1794, inspired upgrading of curricula in universities as well as in technical schools. Among its descendants are the German-language [Technische Hochschulen]?, of which 15 existed in 1900.

During the 19th century the old university cabinets de physique were transformed into institutes of physics. The transformation occurred in two steps. First, the university accepted the obligation to give beginning students laboratory instruction. Second, it provided research facilities for advanced students and faculty. The first step began about 1850; the second, toward the end of the century. By 1900 the principle was accepted in the chief physics-producing countries (Great Britain, France, Germany, and the United States) that the university physics institute should give theoretical and practical instruction to aspiring secondary school teachers, physicians, and engineers and also provide space, instruments, and supplies for the research of advanced students of physics.

By 1900 about 160 academic physics institutes, staffed by 1,100 physicists, existed worldwide. Expenditures reckoned as percentages of gross national product were about the same in the four leading countries. Academic physicists maintained pressures on government and private donors by pointing to the increasing demand for technically trained people from industry, government, and the military. This pressure was brought not only by individuals but also by a characteristic novelty of the 19th century, the professional society. The first of the national associations for the advancement of science, in which physicists always played prominent parts, was established in Germany in the 1820s. Beginning in 1845 physicists also formed national professional societies.

While physics prospered outwardly, it grew internally with the definitive capture of the physical branches of applied mathematics. The acquisition proved more than physics could absorb whole: beginning about 1860, the discipline recognized a specialty, theoretical physics, that treated the more mathematical branches with an emphasis on their interconnections and unity. By 1900 about 50 chairs in theoretical physics existed, most of them in Germany. During the same period important new fields grew up around the borders between physics and astronomy, biology, geology, and chemistry.

Content and Goal of Physics

The unity that the theoretician referred to was mechanical reduction. The goal of the late 18th century--to trace physical phenomena to forces carried by special substances, such as electrical fluids--gave way to a revised corpuscularism about 1850. The new doctrine of the conservation of energy and the interconvertibility of forces promised that all physical transactions could be reduced to the same basis. Physicists took the concepts of mechanics as basic, for much the same reasons that Boyle had given, and they strove to explain the phenomena of light, heat, electricity, and magnetism in terms of the stresses and strains of a hypothetical ether supposed to operate as a mechanism.

The program had as its remote goal a model such as the vortex atom, which pictures matter as built of permanent, tiny vortices in the same ether that mediates electromagnetic interactions and propagates light. The president of the [French Physical Society]? may have had the vortex atom in mind when he opened the International Congress of Physics in 1900 with the words, "The spirit of Descartes hovers over modern physics, or better, he is its beacon."

The Development of the Main Branches

The main branches of classical physics are mechanics, [electricity and magnetism]?, light, and [heat and thermodynamics]?.

Mechanics

The first branch of physics to yield to mathematical description was mechanics. Although the ancients had quantified a few problems concerning the balance and hydrostatics, and medieval philosophers had discussed possible mathematical descriptions of free-fall, not until the beginning of the 17th century was the desideratum of quantification brought into confrontation with received principles of physics. The chief challenger was Galileo Galilei, who began with a medieval explanation of motion, the so-called impetus theory, and ended by doing without an explicit dynamics. To him it was enough that, as a first step, the physicist should describe quantitatively how objects fall and projectiles fly.

Galileo's kinematical approach did not please Descartes, who insisted that the physicist derive principles from a knowledge of the nature of bodies. Descartes gave out this knowledge as laws of motion, almost all incorrect, but including a strong statement of the principle of rectilinear inertia, which was to become Newton's first axiom of motion. Another Cartesian principle important for Newton was the universalizing of mechanics. In Aristotelian physics the heavens consist of material not found on earth. The progress of astronomy had undermined Aristotle's distinction, and Newton, like Descartes, explicitly unified celestial and terrestrial mechanics.

In Descartes's system bodies interact only by pushing, and space devoid of body is a contradiction in terms. Hence the motion of any one object must set up a vortex involving others. The planets are swept around by such a whirlpool; another carries the Moon, creates the tides, and causes heavy bodies to fall; still others mediate the interactions of objects at or near the Earth's surface. Newton tried to build a quantitative celestial vortical mechanics, but could not; Book II of the Principia records his proof that vortices that obey the mechanical axioms posited for terrestrial matter cannot transport planets according to Kepler's laws. On the assumption of universal gravitation, however, Newton could derive Kepler's laws and tie together planetary motions, the tides, and the precession of the equinoxes. As one essential step in the derivation, Newton used Galileo's rule about distance traversed under constant acceleration. He also required the assumption of "absolute space"--a preferred system of reference against which accelerations could be defined.
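
In modern notation (a sketch; Newton's own presentation was geometric rather than algebraic), the step borrowed from Galileo is the rule for distance traversed under constant acceleration, and combining the centripetal acceleration of a circular orbit with Kepler's third law yields the inverse-square dependence:

  \[ s = \tfrac{1}{2} a t^{2}, \qquad a_{c} = \frac{4\pi^{2} r}{T^{2}}, \qquad T^{2} \propto r^{3} \;\Longrightarrow\; a_{c} \propto \frac{1}{r^{2}} \]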

After receiving their definitive analytic form from Leonhard Euler, Newton's axioms of motion were reworked by [Joseph Louis de Lagrange]?, [William Rowan Hamilton]?, and [Carl Gustav Jacobi]? into very powerful and general methods, which employed new analytic quantities, such as potential, related to force but remote from immediate experience. Despite these triumphs, some physicists retained scruples against the concept of force. Several schemes for doing without it were proposed, notably by [Joseph John Thomson]? and [Heinrich Hertz]?, but nothing very useful came from them.

Electricity and Magnetism

As an apparent action at a distance, magnetism challenged the ingenuity of corpuscular philosophers. Descartes explained that the terrestrial vortex, which carried the Moon, contained particles shaped so that they could find easy passage through the threaded pores that define the internal structure of magnets and the Earth. These special particles accumulated in vortices around magnets and close to the Earth, oriented compass needles, and mediated magnetic attraction and repulsion. This quaint picture dominated continental theorizing about magnetism until 1750. Meanwhile, Newton's disciples tried to find a law of magnetic force analogous to the law of gravity. They failed because they did not follow Newton's procedure of integrating hypothetical microscopic forces to obtain a macroscopic acceleration. In 1785, [Charles A. Coulomb]? demonstrated the laws of magnetic force between elements of the supposed magnetic fluids. He benefited from the domestication of forces, from a developed understanding of Newton's procedures, and from a technique of making artificial magnets with well-defined poles.

Gilbert established the study of electricity in the course of distinguishing electrical attraction from magnetism. The subject progressed desultorily until [Francis Hauksbee]? introduced a new and more powerful generator, the glass tube, in 1706. With this instrument [Stephen Gray]? and [C. F. Dufay]? discovered electrical conduction and the rules of vitreous and resinous electrification. In the 1740s electricity began to attract wider attention because of the inventions of the [electrostatic machine]? and Leyden jar and their application to parlor tricks. The demonstration in 1752 that lightning is an electrical discharge further enhanced the reputation of electricity.

Up to 1750 physicists accepted a theory of electricity little different from Gilbert's: the rubbing of electric bodies forces them to emit an electrical matter or ether that causes attractions and repulsions either directly or by mobilizing the air. The theory confused the roles of charges and their field. The invention of the Leyden jar (1745) made clear the confusion, if not its source; only Benjamin Franklin's theory of plus and minus electricity, probably developed without reference to the Leyden jar, proved able to account for it. Franklin asserted that the accumulation of electric matter within the Leyden jar (the plus charge) acted at a distance across the bottom to expel other electrical matter to ground, giving rise to the minus charge. Distance forces thus entered the theory of electricity. Their action was quantified by [F. U. T. Aepinus]? (1759), by [Henry Cavendish]? (1771), and by Coulomb, who in 1785 showed that the force between elements of the hypothetical electrical matter(s) or fluid(s) diminished as the square of the distance. (The uncertainty regarding the number of fluids arises because many physicists then preferred the theory introduced by [Robert Symmer]? in 1759, which replaced Franklin's absence of electrical matter, negative electricity, with the presence of a second electrical fluid.) Since the elementary electrical force followed the same law as the gravitational, the mathematics of the potential theory lay ready for exploitation by the electrician. The quantification of electrostatics was accomplished early in the 19th century, principally by [Simeon Denis Poisson]?.
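
In modern notation (a restatement; Coulomb spoke of elements of electrical fluid rather than of charges q1 and q2), the 1785 result is that the force between two charges separated by a distance r varies as

  \[ F \propto \frac{q_{1} q_{2}}{r^{2}} \]

the same mathematical form as Newtonian gravitation, which is why the existing machinery of potential theory could be carried over to electrostatics.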

In 1800, Alessandro Volta announced the invention of a continuous generator of electricity, a "pile" of disks of silver, zinc, and moist cardboard. This invention--the first battery--opened two extensive new fields: electrochemistry, of which the first dramatic results were [Humphry Davy]?'s isolation of the alkali metals, and electromagnetism, based on the healing of the breach opened by Gilbert in 1600.

The discovery in 1820 by Hans Christian Oersted that the wire connecting the poles of a Voltaic cell could exert a force on a magnetic needle was followed in 1831 by Michael Faraday's discovery that a magnet could cause a current to flow in a closed loop of wire. The facts that the electromagnetic force depends on motion and does not lie along the line between current elements made it difficult to bring the new discoveries within the scheme of distance forces. Certain continental physicists--at first Andre Marie Ampere, then [Wilhelm Eduard Weber]? and [Rudolf Clausius]?, and others--admitted forces dependent on relative velocities and accelerations.

The hope that electric and magnetic interactions might be elucidated without recourse to forces acting over macroscopic distances persisted after the work of Coulomb. In this tradition, Faraday placed the seat of electromagnetic forces in the medium between bodies interacting electrically. His usage remained obscure to all but himself until William Thomson ([Lord Kelvin]?) and James Clerk Maxwell expressed his insights in the language of [Cambridge mathematics]?. Maxwell's theory of electromagnetism unified electricity and magnetism; at its heart lies the notion of an electromagnetic field, and light waves emerged from it naturally as electromagnetic disturbances. Many British physicists and, after Heinrich Hertz's detection of electromagnetic waves (1887), several continental ones tried to devise an ether obedient to the usual mechanical laws whose stresses and strains could account for the phenomena covered by Maxwell's equations.
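
In the later vector notation of Heaviside and Hertz (Maxwell's own presentation used many more component equations), the field equations in empty space read

  \[ \nabla \cdot \mathbf{E} = 0, \quad \nabla \cdot \mathbf{B} = 0, \quad \nabla \times \mathbf{E} = -\frac{\partial \mathbf{B}}{\partial t}, \quad \nabla \times \mathbf{B} = \mu_{0} \varepsilon_{0} \frac{\partial \mathbf{E}}{\partial t} \]

from which both fields satisfy a wave equation with speed \( c = 1/\sqrt{\mu_{0} \varepsilon_{0}} \), a value matching the measured speed of light.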

In the early 1890s, [Hendrik Antoon Lorentz]? worked out a successful compromise. From the British he took the idea of a mediating ether, or field, through which electromagnetic disturbances propagate in time. From continental theory he took the concept of electrical charges, which he made the sources of the field. He dismissed the presupposition that the field should be treated as a mechanical system; instead, it could be assigned whatever properties were needed to account for the phenomena. For example, to explain the result of the [Michelson-Morley Experiment]?, Lorentz supposed that objects moving through the ether contract along their line of motion. Among the unanalyzed and perhaps unanalyzable properties of the ether is the ability to shorten bodies moving through it.
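
In modern notation, the contraction hypothesis asserts that a body of rest length L0 moving with speed v through the ether shortens along its line of motion to

  \[ L = L_{0} \sqrt{1 - v^{2}/c^{2}} \]

just the amount needed to cancel the fringe shift expected in the Michelson-Morley apparatus.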

In 1896-97, Lorentz's approach received support from the [Zeeman effect]?, which confirmed the presence of electrical charges in neutral atoms, and from the isolation of the electron, which could be identified as the source of the field. The electron pulled together many loose ends of 19th-century physics and suggested that the appearances of matter itself, including its inertia, might arise from moving drops of electric fluid. How unexpected the discovery was is illustrated by a remark of J. J. Thomson, the electron's discoverer: "I was told long afterwards by a distinguished physicist who had been present at my lecture that he thought I had been pulling their leg." But the electron did not save the ether. Continuing failure to find effects arising from motion against it and, above all, certain asymmetries in the electrodynamics of moving bodies caused Albert Einstein to reject the ether and, with it, the last vestige of Newton's absolute space.

Light

During the 17th century the study of optics was closely associated with problems of astronomy, such as correcting observations for atmospheric refraction and improving the design of telescopes. Kepler obtained a good approximation to refraction and explained the geometry of the eye, the operation of the lens, and the inversion of the image. Descartes computed the best form for telescope lenses and found, or transmitted, the law of refraction first formulated by [Willebrord Snell]? in 1621. While trying to correct telescopes for chromatic aberration, Newton discovered that rays of different colors are bent by different but characteristic amounts by a prism. The discovery upset the physics of light.
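
In its modern form (the index notation is a later restatement; Snell and Descartes worked directly with ratios of sines), the refraction law relates the angles of incidence and refraction across the boundary between media of refractive indices n1 and n2:

  \[ n_{1} \sin\theta_{1} = n_{2} \sin\theta_{2} \]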

Traditional theory took white light to be homogeneous and colors to be impurities or modifications. Newton inferred from his discovery that colors are primitive and homogeneous, and he portrayed their constituents as particles. This particle model conflicted with the leading alternative: [Christiaan Huygens]?, who did not bother about colors, gave a beautiful account of the propagation of light, including an explanation of birefringence, on the supposition that light consists of longitudinal waves in a pervasive medium.

Newton also required an optical ether to explain phenomena now referred to as interference between light waves. The emission of particles sets the ether vibrating, and the vibrations impose periodic properties on the particles. Although many 18th-century physicists preferred a wave theory in the style of Huygens, none succeeded in devising one competitive with Newton's. Progress in optics took place mainly in fields Newton had not investigated, such as photometry?, and in the correction of lenses for [chromatic aberration]?, which he had not thought possible.

In the first years of the 19th century [Thomas Young]?, a close student of Newton's work and an expert on the theory of vibrations, showed how to quantify Huygens's theory. Young succeeded in explaining certain cases of interference; [Augustin Jean Fresnel]? soon built an extensive analytical theory based on Young's principle of superposition. Newton's light particles, which fit well with the special fluids assumed in theories of heat and electricity, found vigorous defenders, who emphasized the problem of polarization?. In Newton's theory, polarization could be accommodated by ascribing different properties to the different "sides" of the particles, whereas Young's waves could be characterized only by amplitude (associated with intensity), period (color), phase (interference), and velocity (refraction). About 1820, Young and Fresnel independently found the missing degree of freedom in the assumption that the disturbances in light waves act at right angles to their direction of motion; polarization effects arise from the orientation of the disturbance to the optic axis of the polarizing body.
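
As an illustrative sketch in modern notation, a transverse plane wave traveling along x may be written

  \[ \mathbf{u}(x,t) = A \,\hat{\mathbf{e}} \cos\!\left( \frac{2\pi}{\lambda} (x - v t) + \phi \right), \qquad \hat{\mathbf{e}} \perp \hat{\mathbf{x}} \]

where the amplitude A fixes the intensity, the wavelength (or period) the color, \phi the phase, and v the velocity; the direction \hat{e} of the disturbance, at right angles to the propagation, is the extra degree of freedom that accounts for polarization.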

With the stipulation that light's vibrations are transverse, the wave theorists could describe simply and precisely a wide range of phenomena. They had trouble, however, in developing a model of the "luminiferous ether," the vibrations of which they supposed to constitute light. Many models were proposed likening the ether to an elastic solid. None fully succeeded. After Maxwell linked light and electromagnetism, the duties of the ether became more burdensome and ambiguous, until Lorentz and Einstein, in their different ways, removed it from subjection to mechanics.

Heat and Thermodynamics

In Aristotelian physics heat was associated with the presence of a nonmechanical quality, "hotness," conveyed by the element fire. The corpuscular philosophers rejected some or all of this representation; they agreed that heat arose from a rapid motion of the parts of bodies but divided over the existence of a special fire element. The first theory after Aristotle's to command wide assent was developed by [Herman Boerhaave]? during the second decade of the 18th century; it incorporated a peculiar, omnipresent, expansive "matter of fire," the agitation of which caused heat and flame.

Physicists examined the properties of this fire with the help of thermometers, which improved greatly during the 18th century. With Fahrenheit thermometers, [G. W. Richmann]? established (1747-48) the calorimetric mixing formula, which expresses how the fire in different bodies at different temperatures comes to equilibrium at an intermediate temperature when the bodies are brought into contact. By following up discrepancies between experimental values and results expected from Richmann's formula, [Joseph Black]? and [J. C. Wilcke]? independently discovered phenomena that led them to the concepts of latent and specific heat.
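
In modern terms (Richmann worked with mixtures of water, for which the specific heats drop out), the mixing formula gives the equilibrium temperature of two bodies of masses m1 and m2, specific heats c1 and c2, and initial temperatures T1 and T2 as

  \[ T_{f} = \frac{m_{1} c_{1} T_{1} + m_{2} c_{2} T_{2}}{m_{1} c_{1} + m_{2} c_{2}} \]

Latent heats appear as departures from this formula: melting ice mixed with warm water comes to equilibrium well below the predicted temperature, the discrepancy measuring the heat absorbed without change of temperature.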

About 1790 physicists began to consider the analytic consequences of the assumption that the material base of heat, which they called caloric, was conserved. The caloric theory gave a satisfactory quantitative account of adiabatic processes in gases, including the propagation of sound, which physicists had vainly sought to understand on mechanical principles alone. Another mathematical theory of caloric was Sadi Carnot's analysis (1824) of the efficiency of an ideal, reversible heat engine, which seemed to rest on the assumption of a conserved material of heat.
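
In its later thermodynamic form (stated by Kelvin and Clausius; Carnot himself reasoned on the conserved caloric), the efficiency of the ideal reversible engine operating between absolute temperatures T_hot and T_cold is

  \[ \eta = 1 - \frac{T_{\mathrm{cold}}}{T_{\mathrm{hot}}} \]

independent of the working substance.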

In 1822, [Joseph Fourier]? published his theory of heat conduction, developed using the trigonometrical series that bear his name ([Fourier Analysis]?), and without specifying the nature of heat. He thereby escaped the attack on the caloric theory by those who thought the arguments of [Benjamin Thompson]?, Count Rumford, persuasive. Rumford had inferred from the continuous high temperatures of cannon barrels undergoing boring that heat is created by friction and cannot be a conserved substance. His qualitative arguments could not carry the day against the caloric theory, but they gave grounds for doubt, to Carnot among others.

During the late 18th century physicists had speculated about the interrelations of the fluids they associated with light, heat, and electricity. When the undulatory theory indicated that light and radiant heat consisted of motion rather than substance, the caloric theory was undermined. Experiments by [James Prescott Joule]? in the 1840s showed that an electric current could produce either heat or, through an electric motor, mechanical work; he inferred that heat, like light, was a state of motion, and he succeeded in measuring the heat generated by mechanical work. Joule had trouble gaining a hearing because his experiments were delicate and his results seemed to menace Carnot's.

In the early 1850s the conflict was resolved independently by Kelvin and by Clausius, who recognized that two distinct principles had been confounded. Joule correctly asserted that heat could be created and destroyed, and always in the same proportion to the amount of mechanical, electrical, or chemical force--or, to use the new term, "energy"--consumed or developed. This assertion is the first law of thermodynamics--the conservation of energy. Carnot's results, however, also hold; they rest not on the conservation of heat but on that of entropy, the quotient of heat by the temperature at which it is exchanged. The second law of thermodynamics declares that in all natural processes entropy either remains constant or increases.
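
In modern symbols (a compact restatement, not the founders' notation), the two laws read

  \[ \Delta U = Q - W, \qquad dS = \frac{\delta Q_{\mathrm{rev}}}{T}, \qquad \Delta S_{\mathrm{total}} \ge 0 \]

where U is the internal energy, Q the heat absorbed, W the work done by the system, and S the entropy; a reversible Carnot engine takes in and gives up equal entropy at its two operating temperatures, which is how Carnot's results survive the abandonment of conserved heat.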

Encouraged by the reasoning of [Hermann Helmholtz]? and others, physicists took mechanical energy, the form of energy with which they were most familiar, as fundamental, and tried to represent other forms in terms of it. Maxwell and [Ludwig Boltzmann]? set the foundations of a new branch of physics, the mechanical theory of heat, which included statistical considerations as an integral part of physical analysis for the first time. After striking initial successes, the theory foundered over the mechanical representation of entropy. The apparent opposition of the equations of mechanics (which have no direction in time) and the demands of the second law (which prohibits entropy from decreasing in the future) caused some physicists to doubt that mechanical reduction could ever be accomplished. A small, radical group led in the 1890s by the physical chemist [Wilhelm Ostwald]? went so far as to demand the rejection of all mechanical pictures, including the concept of atoms.

Although Ostwald's program of energetics had few followers and soon collapsed, the attack on mechanical models proved prescient. Other work about 1900--the discoveries of [X ray]?s and radioactivity, the development of the quantum theory and the theory of relativity--did eventually force physicists to relinquish in principle, if not in practice, reliance on the clear representations in space and time on which classical physics had been built.

Modern Physics

Around 1900 the understanding of the physical universe as a congeries of mechanical parts shattered forever. The decades before the outbreak of World War I brought new experimental phenomena: [X ray]?s were discovered by [Wilhelm Conrad Roentgen]? in 1895 and Radioactivity by [Antoine Henri Becquerel]? in 1896. These new phenomena were studied extensively, but only with Niels Bohr's first atomic theory in 1913 did a general, theoretical picture for the generation of X rays emerge. Radioactive decay was gradually clarified with the emergence of quantum mechanics, with the discovery of new [Fundamental Particles]?, such as the neutron and the neutrino, and with countless experiments conducted using [particle accelerator]?s.

The central, theoretical syntheses of 20th-century physics--the theories of Relativity and Quantum Mechanics--were only indirectly associated with the experimental research of most physicists. Albert Einstein and [Wolfgang Pauli]?, for example, believed that experiment had to be the final arbiter of theory, but also that theories were imaginative constructions going far beyond any inductivist assemblage of experimental data. During the first third of the century it became clear that the new ideas of physics required that physicists reexamine the philosophical foundations of their work. For this reason physicists came to be seen by the public as intellectual Brahmins who probed the dark mysteries of the universe. Excitement over reorganizing physical knowledge persisted through the 1920s. This decade saw the formulation of quantum mechanics and a new, indeterminist epistemology by Pauli, Werner Karl Heisenberg, [Max Born]?, [Erwin Schrodinger]?, and Paul Dirac.

The early-20th-century vision of the universe issued principally from German-speaking Europe, in university environments where daily patterns of activity had been fixed since the 1880s. During the interwar period first-rate physics research operations flourished in such non-European environments as the United States, Japan, India, and Argentina. New patterns of activity, intimated in the pre-1914 world, finally crystallized. Physics research, such as that of [Clinton Davisson]?, came to be supported heavily by industries using optics and electricity. The [National Research Council]? and private foundations in the United States, notably the Rockefeller trusts, sponsored expensive and time-consuming experiments. European governments encouraged special research installations, including the [Kaiser Wilhelm Institute]?s and the [Einstein Tower]? in Potsdam. What has been called big physics emerged in the 1930s. Scores of physicists labored over complicated apparatus in special laboratories indirectly affiliated with universities. As one of the most significant consequences of the new institutional arrangements, it became increasingly difficult for a physicist to imitate scientists such as Enrico Fermi, who had mastered both the theoretical and the experimental sides of the discipline. Following the models provided in the careers of [J. Robert Oppenheimer]? and [Luis Walter Alvarez]?, the successful physicist became a manager who spent most of his or her time convincing scientifically untutored people to finance arcane research projects.

The awesome respect accorded physicists in the 1950s--when the United States and the Soviet Union carried out extensive research into [thermonuclear weapons]? and launched [artificial satellites]?--has eroded in recent years. In part this new development is the result of a continuing emergence of new specialties; [applied electronics]?, for example, until recently part of the physicists' domain, has become an independent field of study, just as physical chemistry, geophysics?, and astrophysics split off from the mother discipline around 1900. At the same time, a number of physicists, such as [Richard Phillips Feynman]?, have come to emphasize the aesthetic value of their research more than its practical application.

In recent years, physicists have been at the center of major interdisciplinary syntheses in biophysics?, [solid-state physics]?, and astrophysics. The identification of the double-helix structure of DNA, the synthesis of complex protein molecules, and developments in genetic engineering all rest on advances in spectroscopy, X-ray crystallography, and [electron microscopy]?. Semiconductor technology, at the base of the revolution in information processing, has been pioneered by solid-state physicists. Fundamental insights into the large-scale structure of the universe and its constituent parts have depended on harmonies previously revealed by theoretical physicists. This cross-fertilization has had an impact on physics itself; it has produced new understanding of basic physical laws ranging from those governing elementary particles to those in irreversible thermodynamic processes. Among all modern scientific disciplines, physics has been the most successful in maintaining a high public profile while adapting to new scientific and social circumstances.
