Historical Background Leading to the Special Theory:

In order to have a proper perspective of
how the special theory came about, we need to look at the times in which
these events took place: As the 20th century dawned, Newton's laws had been eminently
successful in describing almost every aspect of the physical world. In this
mechanical system, the interaction of particles was governed by his laws of
motion. Gravitational forces acted at a distance on tiny particles or on stars and
planets in keeping with the universal law of gravitation. Maxwell's equations,
on the other hand, provided a structure that had successfully unified
electricity, magnetism and optics. Out of this unification came the undeniable
result that light, as it traveled through space at 186,000 miles per second
(3x10^{8} meters/second) was itself an electromagnetic wave. Therein
resided the crux of a dilemma: Newtonian waves (water waves, sound waves and the
like) propagated by the interaction of the particles that made up the medium that
carried the waves. To preserve the connection, physicists of the time needed a medium to
carry light waves, but what medium existed between us and the stars so that
their light could reach us? The answer was the ether, an ancient idea resurrected
to save the particle view of Newtonian mechanics. That the ether had properties
difficult to understand was a problem to be addressed, but one that could be
temporarily set aside. The existence of the ether offered an interesting
opportunity: If such a medium does exist, it must be in a state of relative
motion with respect to the stars (or at rest with respect to them), and so it
should be possible to measure the motion of another system with respect to the
ether. To do this simply required a measurement of the velocity of light in one
system compared to the other. Such an experiment was undertaken by A. A.
Michelson and E. W. Morley in 1887. Although their apparatus was sensitive
enough to measure the velocity differences, no differences were found. This
result, retested many times, requires the acceptance of the idea that the
velocity of light must be independent of the motion of the observers. We thus
have a dilemma: There are three basic tenets of which only two can be true.
These are Galilean relativity, addition of velocities, and the constancy of the
velocity of light. If the first two are chosen we are firmly in the Newtonian
world, but then how do we explain the Michelson-Morley experiment? Newtonian
mechanics explains a wide range of phenomena, and that cannot be overlooked.
Hendrik Antoon Lorentz and, independently, George Francis FitzGerald set about
to modify the Galilean transformation in a manner that would keep its generality for
velocities small compared to that of light, but also allow the velocity of light
to remain constant as seen by different observers, in keeping with the results
of the Michelson-Morley experiment. The result was a mathematical
term called the "relativistic factor", often labeled with the Greek
letter g (gamma), with the form

g = 1 / [1 - (v/c)^{2}]^{1/2} .

Look at this factor carefully. As you
should do with all mathematical expressions, read it like a story rather than
simply try to memorize it: The Greek letter g is just an arbitrary name given to
the expression. In the expression *v* is the velocity of one observer with
respect to another (e.g. if I stand still and you walk past me at 20 m/s, then
*v* = 20 m/s). The constant velocity of light (3x10^{8} m/s) is labeled *c*.
Look at the term in the denominator, the square root of 1 - (*v*/*c*)^{2}.
Notice that if *v* were larger than *c* then *v*/*c* would be larger than 1,
and 1 - (*v*/*c*)^{2} would be negative. As you may recall from your algebra,
the square root of a negative number is imaginary. Imaginary numbers are fine
in mathematics, but not in physics - imaginary numbers do not describe physical
things. Conclusion: *v* cannot be larger than *c* - no velocity can exceed the
velocity of light. You probably knew this; now you know why. The value of g
depends on the fraction *v*/*c*. Remember that *c* is a very large number, and
so if (*v*/*c*)^{2} is to amount to anything substantial, *v* is going to have
to be large also.
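Read this way, the factor is simple to compute. Here is a minimal sketch in Python (the function name `gamma` and the rounded value of c are my own choices, not part of the original text):

```python
import math

C = 3.0e8  # speed of light in m/s (the rounded value used in the text)

def gamma(v):
    """Relativistic factor g = 1 / sqrt(1 - (v/c)^2) for a speed v in m/s."""
    if v >= C:
        # v >= c would make the square root zero or imaginary - not physical
        raise ValueError("v must be less than c")
    return 1.0 / math.sqrt(1.0 - (v / C) ** 2)

print(gamma(20.0))     # an everyday walking-past speed: indistinguishable from 1
print(gamma(0.6 * C))  # 60% of the speed of light: about 1.25
```

The guard clause mirrors the argument in the text: for v at or above c the expression stops describing anything physical, so the function refuses to evaluate it.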

To see that this is so, I have plotted g as a function of v/c. When v is
small, near 0, g is equal to 1. When v is close to c, the denominator of g
approaches 0 and so g becomes very large, approaching
infinity. Let's put in some numbers: The fastest airplanes travel at about
Mach 6. That's 6 times the speed of sound, or about 2000 m/s (roughly 4,500 mph).
At this speed v/c is 6.67x10^{-6}, so (v/c)^{2} is only 4.44x10^{-11} and g
is almost exactly equal to 1 [(1 - 4.44x10^{-11})^{1/2} is
equal to 0.99999999998, so
g
is equal to 1.00000000002]. The only way man has been able to go faster
is in spacecraft. Astronauts returning from the Moon traveled at about 25,000 mph,
or about 11,000 m/s. At this speed g is equal to about 1.0000000007, so even at
man's highest attained speed the correction to Galilean relativity is less than a
millionth of a percent. It's no wonder that
Newton's laws seemed so secure for so long, and that they still hold today for
most situations. On the other hand, physicists often deal with electrons and
other particles that routinely travel at speeds approaching that of light, and
some day we would like to visit the stars, a task that will need very high
velocities indeed. Suppose then that we design a spacecraft capable of
traveling at 60% the speed of light (about 400 million mph!). Then
g is equal to 1.25, and the relativistic factor finally
becomes important.
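The arithmetic in these examples can be reproduced directly. A short check, using the rounded speeds quoted in the text (the helper name `gamma` is my own label):

```python
import math

C = 3.0e8  # speed of light in m/s, rounded as in the text

def gamma(v):
    # relativistic factor for a speed v in m/s
    return 1.0 / math.sqrt(1.0 - (v / C) ** 2)

examples = [
    ("Mach 6 airplane", 2000.0),        # ~2000 m/s
    ("astronaut at 25,000 mph", 11000.0),  # ~11,000 m/s
    ("spacecraft at 0.6c", 0.6 * C),
]
for label, v in examples:
    print(f"{label}: v/c = {v / C:.3g}, g = {gamma(v):.12f}")
```

For the first two speeds g differs from 1 only in the tenth decimal place or beyond, while at 0.6c it jumps to 1.25 - which is exactly the point of the passage above.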