Tuesday, 25 December 2007

10,000 Earths' Worth of Fresh Dust Found Near Star Explosion

By:
Arip Nurahman, Department of Physics
Faculty of Sciences and Mathematics
Indonesia University of Education

The supernova remnant Cassiopeia A is shown here in an infrared composite from NASA's Spitzer Space Telescope. A supernova remnant is the blown-out remains of a stellar explosion. Image credit: NASA/JPL-Caltech

Astronomers have at last found definitive evidence that the universe's first dust - the celestial stuff that seeded future generations of stars and planets - was forged in the explosions of massive stars.

The findings, made with NASA's Spitzer Space Telescope, are the most significant clue yet in the longstanding mystery of where the dust in our very young universe came from. Scientists had suspected that exploding stars, or supernovae, were the primary source, but nobody had been able to demonstrate that they can create copious amounts of dust - until now. Spitzer's sensitive infrared detectors have found 10,000 Earth masses worth of dust in the blown-out remains of the well-known supernova remnant Cassiopeia A.

"Now we can say unambiguously that dust - and lots of it - was formed in the ejecta of the Cassiopeia A explosion. This finding was possible because Cassiopeia A is in our own galaxy, where it is close enough to study in detail," said Jeonghee Rho of NASA's Spitzer Science Center at the California Institute of Technology in Pasadena. Rho is the lead author of a new report about the discovery appearing in the Jan. 20 issue of the Astrophysical Journal.

Space dust is everywhere in the cosmos, in our own neck of the universe and all the way back billions of light-years away in our infant universe. Developing stars need dust to cool down enough to collapse and ignite, while planets and living creatures consist of the powdery substance. In our nearby universe, dust is pumped out by dying stars like our sun. But back when the universe was young, sun-like stars hadn't been around long enough to die and leave dust.

That's where supernovae come in. These violent explosions occur when the most massive stars in the universe die. Because massive stars don't live very long, theorists reasoned that the very first exploding massive stars could be the suppliers of the unaccounted-for dust. These first stars, called Population III, are the only stars that formed without any dust.

Other objects in addition to supernovae might also contribute to the universe's first dust. Spitzer recently found evidence that highly energetic black holes, called quasars, could, together with supernovae, manufacture some dust in their winds (http://www.spitzer.caltech.edu/Media/releases/ssc2007-16/index.shtml).

Rho and her colleagues analyzed the Cassiopeia A supernova remnant, located about 11,000 light-years away. Though this remnant is not from the early universe, its proximity to us makes it easier to address the question of whether supernovae have the ability to synthesize significant amounts of dust. The astronomers analyzed the infrared light coming from Cassiopeia A using Spitzer's infrared spectrograph, which spreads light apart to reveal the signatures of different elements and molecules. "Because Spitzer is extremely sensitive to dust, we were able to make high-resolution maps of dust in the entire structure," said Rho.

The map reveals the quantity, location and composition of the supernova remnant's dust, which includes proto-silicates, silicon dioxide, iron oxide, pyroxene, carbon, aluminium oxide and other compounds. One of the first things the astronomers noticed was that the dust matches up perfectly with the gas, or ejecta, known to have been expelled in the explosion. This is the smoking gun indicating the dust was freshly made in the ejecta from the stellar blast. "Dust forms a few to several hundred days after these energetic explosions, when the temperature of gas in the ejecta cools down," said Takashi Kozasa, a co-author at the Hokkaido University in Japan.

The team was surprised to find freshly-made dust deeper inside the remnant as well. This cooler dust, mixed in with gas referred to as the unshocked ejecta, had never been seen before.

All the dust around the remnant, both warm and cold, adds up to about three percent of the mass of the sun, or 10,000 Earths. This is just enough to explain where a large fraction, but not all, of the universe's early dust came from. "Perhaps at least some of the unexplained portion is much colder dust, which could be observed with upcoming telescopes, such as Herschel," said Haley Gomez, a co-author at University of Wales, Cardiff. The Herschel Space Observatory, scheduled to launch in 2008, is a European Space Agency mission with significant NASA participation.
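
As a quick check on that conversion (a back-of-the-envelope sketch of my own; the solar and Earth masses are standard values, not figures from the article):

```python
# Does 3% of a solar mass really come to ~10,000 Earth masses?
M_SUN = 1.989e30    # kg, mass of the Sun
M_EARTH = 5.972e24  # kg, mass of Earth

dust_mass_kg = 0.03 * M_SUN              # ~3% of the Sun's mass, as reported
dust_mass_earths = dust_mass_kg / M_EARTH

print(f"{dust_mass_earths:,.0f} Earth masses")  # ~9,992, consistent with "10,000 Earths"
```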

Rho also said that more studies of other supernovae from near to far are needed to put this issue to rest. She notes that the rate at which dust is destroyed - a factor in determining how much dust is needed to explain the dusty early universe - is still poorly understood.

The principal investigator of the research program, and a co-author of the paper, is Lawrence Rudnick of the University of Minnesota, Twin Cities. Other co-authors include W.T. Reach of the Spitzer Science Center; J.D. Smith of the Steward Observatory, Tucson, Ariz.; T. Delaney of the Massachusetts Institute of Technology, Cambridge; J.A. Ennis of the University of Minnesota; and A. Tappe of the Spitzer Science Center and the Harvard-Smithsonian Center for Astrophysics, Cambridge, Mass.

NASA's Jet Propulsion Laboratory, Pasadena, Calif., manages the Spitzer Space Telescope mission for NASA's Science Mission Directorate, Washington. Science operations are conducted at the Spitzer Science Center at the California Institute of Technology. Caltech manages JPL for NASA. Spitzer's infrared spectrograph was built by Cornell University, Ithaca, N.Y. Its development was led by Jim Houck of Cornell. For more information about Spitzer, visit http://www.nasa.gov/spitzer and http://www.spitzer.caltech.edu/spitzer .

Thursday, 20 December 2007

Lunar Exploration Technology




Edited and added by:
Arip Nurahman, Department of Physics, Faculty of Sciences and Mathematics
Indonesia University of Education
&
Follower of OpenCourseWare at MIT and Harvard University, U.S.A.


NASA's blueprints for an outpost on the moon are shaping up. The agency's Lunar Architecture Team has been hard at work, looking at concepts for habitation, rovers, and space suits.

Image left: Concept of one potential design for a future lunar rover. Spacesuits would be attached to the exterior of the rover. Credit: NASA

NASA will return astronauts to the moon by 2020, using the Ares and Orion spacecraft already under development. Astronauts will set up a lunar outpost – possibly near a south pole site called Shackleton Crater – where they’ll conduct scientific research, as well as test technologies and techniques for possible exploration of Mars and other destinations.

Even though Shackleton Crater entices NASA scientists and engineers, they don’t want to limit their options. To provide for maximum flexibility, NASA is designing hardware that would work at any number of sites on the moon. Data from the Lunar Reconnaissance Orbiter mission, a moon-mapping mission set to launch in October 2008, might suggest that another lunar site would be best suited for the outpost.

First, astronauts on the moon will need someplace to live. NASA officials had been looking at having future moonwalkers bring smaller elements to the moon and assemble them on site. But the Lunar Architecture Team found that sending larger modules ahead of time on a cargo lander would help the outpost get up and running more quickly. The team is also discussing the possibility of a mobile habitat module that would allow one module of the outpost to relocate to other lunar destinations as mission needs dictate.

NASA is also considering small, pressurized rovers that could be key to productive operations on the moon’s surface. Engineers envision rovers that would travel in pairs – two astronauts in each rover – and could be driven nearly 125 miles away from the outpost to conduct science or other activities. If one rover had mechanical problems, the astronauts could ride home in the other.

Astronauts inside the rovers wouldn't need special clothing because the pressurized rovers would have what's called a "shirt-sleeve environment." Spacesuits would be attached to the exterior of the rover (see image above). NASA's lunar architects are calling them "step in" spacesuits because astronauts could crawl directly from the rovers into the suits to begin a moonwalk.




NASA is also looking to industry for proposals for a next-generation spacesuit. The agency hopes to have a contractor on board by mid-2008.


NASA will spend the next several months communicating the work of the Lunar Architecture Team to potential partners -- the aerospace community, industry, and international space agencies -- to get valuable feedback that will help NASA further refine plans for the moon outpost. The agency's goal is to have finalized plans by 2012 to get "boots on the moon" by 2020.

Hope this is useful!

Source:

http://www.nasa.gov/exploration/lunar_architecture.html

Arip Nurahman

Wednesday, 19 December 2007

Higgs Particles


Added and edited by:
Arip Nurahman
Department of Physics
Faculty of Sciences and Mathematics
Indonesia University of Education





In particle physics, little Higgs is a refined version of the Higgs boson based on the idea that the Higgs boson is a pseudo-Goldstone boson arising from some global symmetry breaking at a TeV energy scale. The main goal of little Higgs theories was to have electroweak symmetry breaking be the result of strong dynamics in the spirit of pions in QCD. The idea was initially studied by Nima Arkani-Hamed, Andy Cohen, and Howard Georgi in the spring of 2001. The idea was explored further in a scientific paper by Nima Arkani-Hamed, Andy Cohen, Thomas Gregoire, and Jay Wacker in the spring of 2002. In the spring of 2002 several papers appeared that refined the ideas of little Higgs theories, most notably the Littlest Higgs by Nima Arkani-Hamed, Andy Cohen, Emmanuel Katz, and Ann Nelson.

Little Higgs theories were an outgrowth of dimensional deconstruction. In these theories, the gauge group has the form of a direct product of several copies of the same factor, for example SU(2) × SU(2). Each SU(2) factor may be visualised as the SU(2) group living at a particular point along an additional dimension of space. Consequently, many virtues of extra-dimensional theories may be reproduced even though the little Higgs theory is 3+1-dimensional. Little Higgs models are able to predict a naturally light Higgs particle.

The main idea behind the little Higgs models is that the one-loop contribution to the tachyonic Higgs boson mass coming from the top quark cancels. (The other one-loop contributions are small enough that they don't really matter: the top Yukawa coupling is huge, and all the other Yukawa couplings and gauge couplings are small.) This protects the Higgs boson mass for about one order of magnitude, which is good enough to evade many of the precision electroweak constraints.
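
For orientation (an illustrative formula of my own, not from the original post): in the Standard Model the dangerous one-loop top-quark correction to the Higgs mass parameter is, schematically,

$$ \delta m_H^2 \;\simeq\; -\frac{3\lambda_t^2}{8\pi^2}\,\Lambda^2 , $$

where $\lambda_t$ is the top Yukawa coupling and $\Lambda$ the cutoff. In little Higgs models a heavy top partner contributes an equal and opposite $\Lambda^2$ term, so the quadratic sensitivity cancels at one loop and only a milder logarithmic dependence survives, which is the one order of magnitude of protection described above.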

Tuesday, 18 December 2007

Strong Force


Added and edited by:
Arip Nurahman, Department of Physics, Faculty of Sciences and Mathematics
Indonesia University of Education
and

Follower of OpenCourseWare at the Massachusetts Institute of Technology
Cambridge, USA
Department of Physics
http://web.mit.edu/physics/
http://ocw.mit.edu/OcwWeb/Physics/index.htm
&
Aeronautics and Astronautics Engineering
http://web.mit.edu/aeroastro/www/
http://ocw.mit.edu/OcwWeb/Aeronautics-and-Astronautics/index.htm

Physics Posters (CERN):
Strong force - Weak force (Ref.: CERN-DI-9112020_1)
History of the Universe (Ref.: CERN-DI-9108002)
Big Bang (Ref.: CERN-DI-9112020_3)
Quark gluon plasma (Ref.: CERN-DI-9203073)
From atom to quark (Ref.: CERN-DI-9306017_1)
Force carrying particles (Ref.: CERN-DI-9306017_2)

In particle physics, the strong interaction, or strong force, or color force, holds quarks and gluons together to form protons and neutrons. The strong interaction is one of the four fundamental interactions, along with gravitation, the electromagnetic force and the weak interaction. The word strong is used since the strong interaction is the most powerful of the four fundamental forces; its typical field strength is 100 times the strength of the electromagnetic force, some 10^13 times as great as that of the weak force, and about 10^38 times that of gravitation.
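
To make those ratios concrete, here is a rough order-of-magnitude sketch (my own illustration, not from the original text). It compares a strong coupling of order one with the fine-structure constant and with the dimensionless gravitational coupling for a pair of protons; the ~10^13 factor for the weak force depends on process rates at nuclear energies rather than a simple ratio of constants, so it is omitted here:

```python
# Order-of-magnitude comparison of dimensionless force strengths.
G = 6.674e-11     # m^3 kg^-1 s^-2, gravitational constant
HBAR = 1.055e-34  # J s, reduced Planck constant
C = 2.998e8       # m/s, speed of light
M_P = 1.673e-27   # kg, proton mass

alpha_strong = 1.0                    # strong coupling at hadronic scales, O(1)
alpha_em = 1 / 137.036                # fine-structure constant
alpha_grav = G * M_P**2 / (HBAR * C)  # gravitational analogue for two protons

print(f"strong / electromagnetic ~ {alpha_strong / alpha_em:.0f}")    # ~137, i.e. ~100x
print(f"strong / gravitational   ~ {alpha_strong / alpha_grav:.1e}")  # ~1.7e+38, i.e. ~10^38
```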

The strong force is thought to be mediated by gluons, acting upon quarks, antiquarks, and the gluons themselves. This is detailed in the theory of quantum chromodynamics (QCD).


History

Before the 1970s, protons and neutrons were thought to be indivisible fundamental particles. It was known that protons carry a positive electric charge and that like charges repel, yet multiple protons sit bound together in the atomic nucleus. However, it was unknown what force held the like-charged protons (together with neutrons) in the nucleus.

Another, stronger, attractive force was postulated to explain how protons and neutrons were held together in the atomic nucleus, overcoming electromagnetic repulsion. For its high strength at short distances, it was dubbed the "strong" force. It was thought at the time that this strong force was a fundamental force acting directly on the nuclear particles. Experiments suggested that the force acted equally between any nucleons, whether protons or neutrons.

It was later discovered that this phenomenon was only a residual side-effect of another, truly fundamental, force acting directly on particles inside protons and neutrons, called quarks and gluons. This newly discovered force was initially called the "color force." The name was chosen for convenience, and has no relation to visible color.[1]

Today, the term "strong force" is used for that strong nuclear force that acts directly on quarks and gluons. The original strong force that acts between nucleons (and also between particles made of quarks, or hadrons) is today called the nuclear force, or the "residual strong nuclear force."

Details

The behavior of the strong force

The contemporary strong force is described by quantum chromodynamics (QCD), a part of the standard model of particle physics. Mathematically, QCD is a non-Abelian gauge theory based on a local (gauge) symmetry group called SU(3).

Quarks and gluons are the only fundamental particles which carry non-vanishing color charge, and hence participate in strong interactions. The strong force itself acts directly only upon elementary quark and gluon particles.

All quarks and gluons in QCD interact with each other through the strong force. The strength of interaction is parametrized by the strong coupling constant. This strength is modified by the gauge color charge of the particle, a group theoretical property which has nothing to do with ordinary visual color.

The strong force acting between quarks, unlike other forces, does not diminish in strength with increasing distance once a limit (about the size of a hadron) has been reached. It remains at a strength of about 10,000 newtons no matter how far apart the particles are beyond this limiting distance. In QCD, this phenomenon is called color confinement, implying that only hadrons can be observed; the work done against a force of 10,000 newtons is enough to create particle-antiparticle pairs within a very short distance of an interaction. Evidence for this effect is seen in many failed free quark searches.
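
A minimal back-of-the-envelope sketch of that last claim, assuming the ~10,000-newton figure above and a femtometre length scale (both rough):

```python
# Work done pulling two quarks apart against a roughly constant confining force.
F = 1.0e4             # N, approximate asymptotic strong force between quarks
FEMTOMETRE = 1.0e-15  # m, roughly the size of a hadron
J_PER_MEV = 1.602e-13

work_mev = F * FEMTOMETRE / J_PER_MEV
print(f"~{work_mev:.0f} MeV stored per femtometre of separation")  # ~62 MeV/fm

# Light quark-antiquark pairs cost only a few MeV apiece in current-quark mass,
# so a few femtometres of separation already supplies enough energy to create
# new pairs from the vacuum; the separated quarks end up bound in new hadrons.
```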

The elementary quark and gluon particles affected are not directly observable; instead they emerge as jets of newly created hadrons whenever energy is deposited into a quark-quark bond, as when a quark in a proton is struck by a very fast quark (in an impacting proton) during an accelerator experiment. However, quark-gluon plasmas have been observed.

The behavior of the residual strong force (nuclear force)

A residual effect of the strong force is called the nuclear force. The nuclear force acts between hadrons, such as the nucleons in atomic nuclei. This "residual strong force," acting indirectly, is transmitted by gluons that form part of virtual pi and rho mesons, which in turn transmit the force between nucleons.

The residual strong force is thus a minor residuum of the strong force that binds quarks together into protons and neutrons. This same force is much weaker between neutrons and protons, because it is mostly neutralized within them, in the same way that electromagnetic forces between neutral atoms (van der Waals forces) are much weaker than the electromagnetic forces that hold the atoms internally together.

Unlike the strong force itself, the nuclear force, or residual strong force, does diminish in strength with distance, and rapidly so. The decrease is approximately exponential, though there is no simple expression known for it; see Yukawa potential. This fact, together with the less-rapid decrease of the disruptive electromagnetic force between protons with distance, causes the instability of larger atomic nuclei, such as all those with atomic numbers larger than 82 (lead).
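
The Yukawa form mentioned above is, in natural units,

$$ V(r) \;=\; -\,g^{2}\,\frac{e^{-mr}}{r} , $$

where $g$ is a coupling constant and $m$ the mass of the exchanged meson, so the range of the force is roughly $1/m$. For pion exchange ($m_\pi c^2 \approx 140$ MeV) this gives $\hbar c / m_\pi c^2 \approx 1.4$ fm, which is why the nuclear force dies away over distances comparable to the size of a nucleus.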

References

  1. Feynman, Richard (1985). "Loose Ends". QED: The Strange Theory of Light and Matter. Princeton University Press, p. 136. ISBN 0-691-08388-6. "The idiot physicists, unable to come up with any wonderful Greek words anymore, call this type of polarization by the unfortunate name of “color,” which has nothing to do with color in the normal sense."


Monday, 17 December 2007

Theory of Everything


By:
Arip Nurahman
Department of Physics
Faculty of Sciences and Mathematics, Indonesia University of Education

and

Follower of OpenCourseWare at the Massachusetts Institute of Technology
Cambridge, USA
Department of Physics
http://web.mit.edu/physics/
http://ocw.mit.edu/OcwWeb/Physics/index.htm
&
Aeronautics and Astronautics Engineering
http://web.mit.edu/aeroastro/www/
http://ocw.mit.edu/OcwWeb/Aeronautics-and-Astronautics/index.htm

A theory of everything (TOE) is a putative theory of theoretical physics that fully explains and links together all known physical phenomena. Initially, the term was used with an ironic connotation to refer to various overgeneralized theories. For example, a great-grandfather of Ijon Tichy — a character from a cycle of Stanisław Lem's science fiction stories of the 1960s — was known to work on the "General Theory of Everything". Physicist John Ellis claims[1] to have introduced the term into the technical literature in an article in Nature in 1986.[2] Over time, the term stuck in popularizations of quantum physics to describe a theory that would unify or explain through a single model the theories of all fundamental interactions of nature.
There have been many theories of everything proposed by theoretical physicists over the last century, but none have been confirmed experimentally. The primary problem in producing a TOE is that the accepted theories of quantum mechanics and general relativity are hard to combine.
Based on theoretical holographic principle arguments from the 1990s, many physicists believe that 11-dimensional M-theory, which is described in many sectors by matrix string theory and in many other sectors by perturbative string theory, is the complete theory of everything, although there is no widespread consensus.


Historical antecedents

Laplace famously suggested that a sufficiently powerful intellect could, if it knew the position and velocity of every particle at a given time, along with the laws of nature, calculate the position of any particle at any other time:
An intellect which at a certain moment would know all forces that set nature in motion, and all positions of all items of which nature is composed, if this intellect were also vast enough to submit these data to analysis, it would embrace in a single formula the movements of the greatest bodies of the universe and those of the tiniest atom; for such an intellect nothing would be uncertain and the future just like the past would be present before its eyes.
Essai philosophique sur les probabilités, Introduction. 1814
Although modern quantum mechanics suggests that uncertainty is inescapable, a "single formula" may nevertheless exist.

Ancient Greece to Einstein

Since ancient Greek times, philosophers have speculated that the apparent diversity of appearances conceals an underlying unity, and thus that the list of forces might be short, indeed might contain only a single entry. For example, the mechanical philosophy of the 17th century posited that all forces could be ultimately reduced to contact forces between tiny solid particles.[3] This was abandoned after the acceptance of Isaac Newton's long-distance force of gravity; but at the same time, Newton's work in his Principia provided the first dramatic empirical evidence for the unification of apparently distinct forces: Galileo's work on terrestrial gravity, Kepler's laws of planetary motion, and the phenomenon of tides were all quantitatively explained by a single law of universal gravitation.
In 1820, Hans Christian Ørsted discovered a connection between electricity and magnetism, triggering decades of work that culminated in James Clerk Maxwell's theory of electromagnetism. Also during the 19th and early 20th centuries, it gradually became apparent that many common examples of forces—contact forces, elasticity, viscosity, friction, pressure—resulted from electrical interactions between the smallest particles of matter. In the late 1920s, the new quantum mechanics showed that the chemical bonds between atoms were examples of (quantum) electrical forces, justifying Dirac's boast that "the underlying physical laws necessary for the mathematical theory of a large part of physics and the whole of chemistry are thus completely known".[4]
Attempts to unify gravity with electromagnetism date back at least to Michael Faraday's experiments of 1849–50.[5] After Albert Einstein's theory of gravity (general relativity) was published in 1915, the search for a unified field theory combining gravity with electromagnetism began in earnest. At the time, it seemed plausible that no other fundamental forces exist. Prominent contributors were Gunnar Nordström, Hermann Weyl, Arthur Eddington, Theodor Kaluza, Oskar Klein, and most notably, many attempts by Einstein and his collaborators. In his last years, Albert Einstein was intensely occupied in finding such a unifying theory. None of these attempts were successful.[6]

New discoveries

The search for a unifying theory was interrupted by the discovery of the strong and weak nuclear forces, which could not be subsumed into either gravity or electromagnetism. A further hurdle was the acceptance that quantum mechanics had to be incorporated from the start, rather than emerging as a consequence of a deterministic unified theory, as Einstein had hoped. Gravity and electromagnetism could always peacefully coexist as entries in a list of Newtonian forces, but for many years it seemed that gravity could not even be incorporated into the quantum framework, let alone unified with the other fundamental forces. For this reason, work on unification for much of the twentieth century focused on understanding the three "quantum" forces: electromagnetism and the weak and strong forces. The first two were unified in 1967–68 by Sheldon Glashow, Steven Weinberg, and Abdus Salam as the "electroweak" force.[7] However, while the strong and electroweak forces peacefully coexist in the standard model of particle physics, they remain distinct. Several Grand Unified Theories (GUTs) have been proposed to unify them. Although the simplest GUTs have been experimentally ruled out, the general idea, especially when linked with supersymmetry, remains strongly favored by the theoretical physics community.
Modern physics
In current mainstream physics, a Theory of Everything would unify all the fundamental interactions of nature, which are usually considered to be four in number: gravity, the strong nuclear force, the weak nuclear force, and the electromagnetic force. Because the weak force can transform elementary particles from one kind into another, the TOE should yield a deep understanding of the various different kinds of particles as well as the different forces. The expected pattern of theories is:

Theory of everything
 ├─ Gravity
 └─ Electronuclear force (GUT)
     ├─ Strong force: SU(3)
     └─ Electroweak force: SU(2) × U(1)
         ├─ Weak force
         └─ Electromagnetism: U(1)
             ├─ Electricity
             └─ Magnetism

In addition to the forces listed here, modern cosmology might require an inflationary force, dark energy, and also dark matter composed of fundamental particles outside the scheme of the standard model. The existence of these has not been proven and there are alternative theories such as modified Newtonian dynamics.
Electroweak unification is a broken symmetry: the electromagnetic and weak forces appear distinct at low energies because the particles carrying the weak force, the W and Z bosons, have masses of about 100 GeV, whereas the photon, which carries the electromagnetic force, is massless. At higher energies, Ws and Zs can be created easily and the unified nature of the force becomes apparent. Grand unification is expected to work in a similar way, but at energies of the order of 10^16 GeV, far greater than could be reached by any possible Earth-based particle accelerator. By analogy, unification of the GUT force with gravity is expected at the Planck energy, roughly 10^19 GeV.
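
As a cross-check on that last figure (my own sketch, using standard values of the constants), the Planck energy follows directly from ħ, c, and G:

```python
import math

# Planck energy: E_P = sqrt(hbar * c^5 / G), converted to GeV.
HBAR = 1.055e-34  # J s
C = 2.998e8       # m/s
G = 6.674e-11     # m^3 kg^-1 s^-2
J_PER_GEV = 1.602e-10

e_planck_gev = math.sqrt(HBAR * C**5 / G) / J_PER_GEV
print(f"Planck energy ~ {e_planck_gev:.2e} GeV")  # ~1.22e+19 GeV, i.e. ~10^19 GeV
```
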
It may seem premature to be searching for a TOE when there is as yet no direct evidence for an electronuclear force, and when there are in any case many different proposed GUTs. In fact the name deliberately suggests the hubris involved. Nevertheless, most physicists believe this unification is possible, partly due to the past history of convergence towards a single theory. Supersymmetric GUTs seem plausible not only for their theoretical "beauty", but because they naturally produce large quantities of dark matter, and the inflationary force may be related to GUT physics (although it does not seem to form an inevitable part of the theory). Yet GUTs are clearly not the final answer. Both the current standard model and proposed GUTs are quantum field theories which require the problematic technique of renormalization to yield sensible answers. This is usually regarded as a sign that these are only effective field theories, omitting crucial phenomena relevant only at very high energies. Furthermore, the inconsistency between quantum mechanics and general relativity implies that one or both of these must be replaced by a theory incorporating quantum gravity.
Unsolved problems in physics: Is string theory, superstring theory, or M-theory, or some other variant on this theme, a step on the road to a "theory of everything", or just a blind alley?
The mainstream theory of everything at the moment is superstring theory / M-theory; current research on loop quantum gravity may eventually play a fundamental role in a TOE, but that is not its primary aim. These theories attempt to deal with the renormalization problem by setting up some lower bound on the length scales possible. String theories and supergravity (both believed to be limiting cases of the yet-to-be-defined M-theory) suppose that the universe actually has more dimensions than the easily observed three of space and one of time. The motivation behind this approach began with the Kaluza-Klein theory, in which it was noted that applying general relativity to a five-dimensional universe (with the usual four dimensions plus one small curled-up dimension) yields the equivalent of the usual general relativity in four dimensions together with Maxwell's equations (electromagnetism, also in four dimensions). This has led to efforts to work with theories with large numbers of dimensions in the hope that this would produce equations similar to known laws of physics. The notion of extra dimensions also helps to resolve the hierarchy problem, which is the question of why gravity is so much weaker than any other force. The common answer involves gravity leaking into the extra dimensions in ways that the other forces do not.
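
For the curious, the Kaluza-Klein reduction described above starts from a five-dimensional metric ansatz of (schematically) the form

$$ ds^{2} \;=\; g_{\mu\nu}\,dx^{\mu}dx^{\nu} \;+\; \phi^{2}\left(dy + A_{\mu}\,dx^{\mu}\right)^{2}, $$

where $y$ is the small, curled-up fifth coordinate. Substituting this into the five-dimensional Einstein-Hilbert action and keeping the $y$-independent modes yields four-dimensional general relativity for $g_{\mu\nu}$ plus Maxwell's equations for $A_{\mu}$, together with a scalar field $\phi$ (the dilaton), which is exactly the equivalence the paragraph describes.
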
In the late 1990s, it was noted that one problem with several of the candidates for theories of everything (but particularly string theory) was that they did not constrain the characteristics of the predicted universe. For example, many theories of quantum gravity can create universes with arbitrary numbers of dimensions or with arbitrary cosmological constants. Even the "standard" ten-dimensional string theory allows the "curled up" dimensions to be compactified in an enormous number of different ways (one estimate is 10^500), each of which corresponds to a different collection of fundamental particles and low-energy forces. This array of theories is known as the string theory landscape.
A speculative solution is that many or all of these possibilities are realised in one or another of a huge number of universes, but that only a small number of them are habitable, and hence the fundamental constants of the universe are ultimately the result of the anthropic principle rather than a consequence of the theory of everything. This anthropic approach is often criticised in that, because the theory is flexible enough to encompass almost any observation, it cannot make useful (as in original, falsifiable, and verifiable) predictions. In this view, string theory would be considered a pseudoscience, where an unfalsifiable theory is constantly adapted to fit the experimental results.

With reference to Gödel's incompleteness theorem

A small number of scientists claim that Gödel's incompleteness theorem proves that any attempt to construct a TOE is bound to fail. Gödel's theorem, informally stated, asserts that any sufficiently complex mathematical theory that has a finite description is either inconsistent or incomplete. In his 1966 book The Relevance of Physics, Stanley Jaki pointed out that, because any "theory of everything" will certainly be a consistent non-trivial mathematical theory, it must be incomplete. He claims that this dooms searches for a deterministic theory of everything.[8]
Freeman Dyson has stated that
Gödel’s theorem implies that pure mathematics is inexhaustible. No matter how many problems we solve, there will always be other problems that cannot be solved within the existing rules. [...] Because of Gödel's theorem, physics is inexhaustible too. The laws of physics are a finite set of rules, and include the rules for doing mathematics, so that Gödel's theorem applies to them.
Stephen Hawking was originally a believer in the Theory of Everything but, after considering Gödel's Theorem, concluded that one was not obtainable.
Some people will be very disappointed if there is not an ultimate theory, that can be formulated as a finite number of principles. I used to belong to that camp, but I have changed my mind.
This view has been argued against by Jürgen Schmidhuber (1997), who pointed out that Gödel's theorems are irrelevant even for computable physics.[9] In 2000, Schmidhuber explicitly constructed limit-computable, deterministic universes whose pseudo-randomness based on undecidable, Gödel-like halting problems is extremely hard to detect but does not at all prevent formal TOEs describable by very few bits of information.[10][11]
A related critique was offered by Solomon Feferman,[12] among others. Douglas S. Robertson offers Conway's Game of Life as an example:[13] the underlying rules are simple and complete, but there are formally undecidable questions about the game's behaviors. Analogously, it may (or may not) be possible to completely state the underlying rules of physics with a finite number of well-defined laws, but there is little doubt that there are questions about the behavior of physical systems which are formally undecidable on the basis of those underlying laws.
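
To make Robertson's example concrete, here is a minimal sketch of Conway's Game of Life (my own illustration; the glider seed and step count are arbitrary). The complete rule set fits in a few lines, yet general questions about the long-term fate of patterns are formally undecidable, because the game can emulate a universal computer:

```python
from collections import Counter

def step(live):
    """One generation of Conway's Game of Life; `live` is a set of (x, y) cells."""
    # Count live neighbours of every cell adjacent to a live cell.
    counts = Counter((x + dx, y + dy)
                     for (x, y) in live
                     for dx in (-1, 0, 1) for dy in (-1, 0, 1)
                     if (dx, dy) != (0, 0))
    # Birth on exactly 3 live neighbours; survival on 2 or 3. That is the entire rule set.
    return {cell for cell, n in counts.items()
            if n == 3 or (n == 2 and cell in live)}

# A glider: after 4 generations the same shape reappears, shifted by (1, 1).
cells = {(1, 0), (2, 1), (0, 2), (1, 2), (2, 2)}
for _ in range(4):
    cells = step(cells)
print(sorted(cells))  # the same glider, translated diagonally
```
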
Since most physicists would consider the statement of the underlying rules to suffice as the definition of a "theory of everything", these researchers argue that Gödel's Theorem does not mean that a TOE cannot exist. On the other hand, the physicists invoking Gödel's Theorem appear, at least in some cases, to be referring not to the underlying rules, but to the understandability of the behavior of all physical systems, as when Hawking mentions arranging blocks into rectangles, turning the computation of prime numbers into a physical question.[14] This definitional discrepancy may explain some of the disagreement among researchers.

Potential status of a theory of everything

No physical theory to date is believed to be precisely accurate. Instead, physics has proceeded by a series of "successive approximations" allowing more and more accurate predictions over a wider and wider range of phenomena. Some physicists believe that it is therefore a mistake to confuse theoretical models with the true nature of reality, and hold that the series of approximations will never terminate in the "truth". Einstein himself expressed this view on occasion.[15] On this view, we may reasonably hope for a theory of everything which self-consistently incorporates all currently known forces, but should not expect it to be the final answer. On the other hand, it is often claimed that, despite the apparently ever-increasing complexity of the mathematics of each new theory, in a deep sense associated with their underlying gauge symmetry and the number of fundamental physical constants, the theories are becoming simpler. If so, the process of simplification cannot continue indefinitely.
There is a philosophical debate within the physics community as to whether a theory of everything deserves to be called the fundamental law of the universe.[16] One view is the hard reductionist position that the TOE is the fundamental law and that all other theories that apply within the universe are a consequence of the TOE. Another view is that emergent laws (called "free-floating laws" by Steven Weinberg), which govern the behavior of complex systems, should be seen as equally fundamental. Examples are the second law of thermodynamics and the theory of natural selection. The point is that, although in our universe these laws describe systems whose behavior could ("in principle") be predicted from a TOE, they would also hold in universes with different low-level laws, subject only to some very general conditions. Therefore it is of no help, even in principle, to invoke low-level laws when discussing the behavior of complex systems. Some argue that this attitude would violate Occam's razor if a completely valid TOE were formulated. It is not clear that there is any point at issue in these debates (e.g., between Steven Weinberg and Philip Anderson) other than the right to apply the high-status word "fundamental" to their respective subjects of interest.
Although the name "theory of everything" suggests the determinism of Laplace's quote, this gives a very misleading impression. Determinism is frustrated by the probabilistic nature of quantum mechanical predictions, by the extreme sensitivity to initial conditions that leads to mathematical chaos, and by the extreme mathematical difficulty of applying the theory. Thus, although the current standard model of particle physics "in principle" predicts all known non-gravitational phenomena, in practice only a few quantitative results have been derived from the full theory (e.g., the masses of some of the simplest hadrons), and these results (especially the particle masses which are most relevant for low-energy physics) are less accurate than existing experimental measurements. The true TOE would almost certainly be even harder to apply. The main motive for seeking a TOE, apart from the pure intellectual satisfaction of completing a centuries-long quest, is that all prior successful unifications have predicted new phenomena, some of which (e.g., electrical generators) have proved of great practical importance. As in other cases of theory reduction, the TOE would also allow us to confidently define the domain of validity and residual error of low-energy approximations to the full theory which could be used for practical calculations.

Theory of everything and philosophy

The status of a physical TOE is open to philosophical debate. For example, if physicalism is true, a physical TOE would coincide with a philosophical theory of everything. Some philosophers (Aristotle, Plato, Hegel, Whitehead, et al.) have attempted to construct all-encompassing systems. Others are highly dubious about the very possibility of such an exercise.

References

  1. Ellis, John (2002). "Physics gets physical (correspondence)". Nature 415: 957.
  2. Ellis, John (1986). "The superstring: theory of everything, or of nothing?". Nature 323: 595–598. doi:10.1038/323595a0.
  3. E.g., Shapin, Steven (1996). The Scientific Revolution. University of Chicago Press. ISBN 0226750213.
  4. Dirac, P.A.M. (1929). "Quantum mechanics of many-electron systems". Proc. Royal Soc. London, Series A 123: 714. doi:10.1098/rspa.1929.0094.
  5. Faraday, M. (1850). "Experimental Researches in Electricity. Twenty-Fourth Series. On the Possible Relation of Gravity to Electricity". Abstracts of the Papers Communicated to the Royal Society of London 5: 994–995. doi:10.1098/rspl.1843.0267.
  6. Pais (1982), Ch. 17.
  7. E.g., Weinberg (1993), Ch. 5.
  8. Jaki, S.L. (1966). The Relevance of Physics. University of Chicago Press.
  9. Schmidhuber, Jürgen (1997). "A Computer Scientist's View of Life, the Universe, and Everything". Lecture Notes in Computer Science, pp. 201–208. Springer.
  10. Schmidhuber, Jürgen. "Algorithmic Theories of Everything", 30 Nov 2000.
  11. Schmidhuber, Jürgen (2002). "Hierarchies of generalized Kolmogorov complexities and nonenumerable universal measures computable in the limit". International Journal of Foundations of Computer Science 13(4): 587–612.
  12. Feferman, S. "The nature and significance of Gödel's incompleteness theorems". Institute for Advanced Study, Princeton, November 17, 2006.
  13. Robertson, Douglas S. (2000). "Goedel's Theorem, the Theory of Everything, and the Future of Science and Mathematics". Complexity 5: 22–27. doi:10.1002/1099-0526(200005/06)5:5<22::aid-cplx4>3.0.CO;2-0.
  14. Hawking, Stephen. "Gödel and the end of physics", July 20, 2002.
  15. Einstein, letter to Felix Klein, 1917 (on determinism and approximations). Quoted in Pais (1982), Ch. 17.
  16. E.g., see Weinberg (1993), Ch. 2.
