
Dr. Bryan G. Wallace

FARCE OF PHYSICS

http://users.navi.net/~rsc/physics/wallace/farce.txt
© Copyright 1993 Bryan G. Wallace
                          Chapter 3

                       Mathematical Magic

   There is a tradition of brown-bag lunch in the foyer of the
Science auditorium at Eckerd College.  Most of the Natural
Science Collegium faculty tend to observe this tradition, and it
is not unusual for faculty from the other Collegiums, or even
the President or Dean of the College, to attend the lunch as well.
The well upholstered easy chairs and sofas are dragged over the
carpet to form a circle, and the lunch becomes an informal
discussion group, with wide ranging topics from sports to
philosophy.  Many of the arguments presented in this book have
evolved from the discussions and debates at this lunch, and even
the book itself has become a topic of discussion, as I've passed
out copies of the material as it has developed to interested
faculty members, in an effort to obtain input from the group. 
One of the topics that was discussed was the question of the
nature of mathematics.  It was interesting to find that the Math
faculty had no simple, well-defined definition of mathematics!  My
Grolier Encyclopedia states that the word derives from mathema, the
Greek word for learning, and that mathematical scholars disagree
on a definition of mathematics.  The article goes on
to state under HISTORY:

     As a recognizable discipline, mathematics is found first
  among the ancient Egyptians and the Sumerians.  In fact, the
  Egyptians probably had considerable mathematical knowledge as
  early as 2900 B.C., when the Great Pyramid of Gizeh was built. 
  A handbook upon mathematics, known as the Ahmes Papyrus,
  written about 1550 B.C., shows that the early Egyptians could
  solve many difficult arithmetical problems.  Some modern
  scholars believe that the Sumerians, who were the predecessors
  of the Babylonians, may have had a system of arithmetic as
  early as 3500 B.C.  The Sumerians and Babylonians applied
  arithmetic and elementary geometry to the study of astronomical
  problems and to the construction of great irrigation and other
  engineering projects.
     The Greek philosopher-mathematician Thales is usually
  regarded as the first to realize the importance of organizing
  mathematics upon a logical basis.  Such a tradition was carried
  on and further developed in early times by Pythagoras, Plato,
  Aristotle, and especially by the mathematicians of the
  Alexandrian School.  The famous University of Alexandria,
  between 300 B.C. and 500 A.D., had upon its staff such
  distinguished mathematicians as Euclid, Archimedes, Apollonius,
  Eratosthenes, Ptolemy, Heron, Menelaus, Pappus, and Diophantus.
     For nearly a thousand years before the 15th century little
  original work was done in the field of mathematics except that
  produced by the Hindus and the Arabs.  In the 16th century
  Tartaglia, Cardan, and Ferrari in Italy and Vieta in France
  laid the foundations of modern algebra.  The 17th century
  produced many outstanding mathematicians including Descartes,
  Newton, Leibnitz, Fermat, Pascal, Desargues, Napier, and
  Kepler.  During the 17th century mathematics was extended in
  many directions, and modern analysis was born with the
  invention of the calculus.  The 18th, 19th, and the first half
  of the 20th centuries have seen a tremendous growth in the
  development of mathematical theory, and mathematical techniques
  have been introduced into virtually all branches of pure and
  applied science.

I presented the argument that mathematics was a language.  My
view on this matter was based on the following statement by Dr.
Robert B. Fischer, in his book "Science Man and Society":

  The language of mathematics, which consists of its symbols and
  their relationships, is very much at the heart of the practice
  of virtually all fields of science.[40]

My view was also shaped by various statements made by Prof.
Albert Einstein such as the following sentence:

  It demands the highest possible standard of rigorous precision
  in the description of relations, such as only the use of
  mathematical language can give.[39 p.225]

Prof. Richard Rhodes II, a member of the Physics faculty, and a
graduate of Yale University, told a story in support of my
argument.  The story concerned a statement made by Prof. Josiah
Willard Gibbs, Yale's first professor of mathematical physics. 
With regard to Gibbs, the following was taken from an article on
him entitled "A loner's legacy":

    Gibbs's work was so advanced that one of his great admirers,
  Albert Einstein, complained about one of his papers that "it is
  hard to read and the main points have to be read between the
  lines."  However, Einstein also termed it "a masterpiece." 
  Scientists have been reading between the lines since Gibbs
  first laid out the fundamental equations of thermodynamics and
  reshaped the study of relations between energy and the
  composition of matter into a modern field with implications
  still being found.[41]

The story came from a biography on Gibbs by Dr. Muriel Rukeyser,
and goes as follows:

  A story is told of him, the one story that anyone remembers of
  Willard Gibbs at a faculty meeting.  He would come to meetings
  - these faculty gatherings so full of campus politics, scarcely
  veiled manoeuvres, and academic obstacle races - and leave
  without a word, staying politely enough, but never speaking.
     Just this once, he spoke.  It was during a long and tiring
  debate on elective courses, on whether there should be more or
  less English, more or less classics, more or less mathematics. 
  And suddenly everything he had been doing stood up--and the
  past behind him, his father's life, and behind that, the long
  effort and voyage that had been made in many lifetimes--and he
  stood up, looking down at the upturned faces, astonished to see
  the silent man talk at last.  And he said, with emphasis, once
  and for all:

     "Mathematics is a language."[42]

Following Rhodes' story about Gibbs, everyone seemed to agree,
that yes, mathematics is a language.

   The major problem with mathematics is that, for the average
person, it is a foreign language.  To illustrate this point, I
will cite several paragraphs taken from a very interesting
article published in Physics Today, entitled "Math anxiety and
physics: Some thoughts on learning 'difficult' subjects":

     However, students bring more than Aristotelianisms to class. 
  They consider science in general and physics in particular
  "hard" subjects to learn.  As Robert Fuller of the University
  of Nebraska points out, professors intentionally and
  unintentionally contribute to this reputation.  In a proposal,
  since funded by Exxon, for AAPT workshops to help teachers
  develop student confidence in physics, Fuller notes that
  "Opening lectures often describe the high standards maintained
  by the department, the firm math prerequisites, the poor grade
  records of previous classes."  Even when they do not make such
  explicit statements, teachers convey the message that physics
  is a particularly difficult subject, says Fuller, and this
  damages student confidence.
     How significant, then, is apprehension in discouraging
  nonscience undergraduates from attempting physics?  Might the
  anxiety-reduction techniques that proved useful in treating
  fear of mathematics work for the physics student?  While it
  remains to be seen whether the sources of physics anxiety and
  math anxiety are the same, one thing is clear to someone who
  has dealt with fear of mathematics in college-age students: 
  The two have similar manifestations.  Hence, even though the
  discussion in the first half of this article focuses on
  obstacles to learning mathematics, I think readers will find
  that it rings true for physics as well. ...
     Instead, what appears to link students of very diverse
  mathematical "ability" is a collection of what might be called
  ideological beliefs or prejudices about the subject.  Students'
  early experiences with mathematics typically give them false
  impressions not only of the nature of the subject, but also,
  and more perniciously, of the kinds of skills required to
  master it.  They think, for example, that speed is more
  important than persistence.  Even more humbling, most come away
  from their exposure to mathematics believing they do not have
  the sine qua non of mathematics success, namely, a
  "mathematical mind."
     When the students that I interviewed--particularly the women
  students--decided to stop taking mathematics, they explained
  this in terms of their feelings:  They felt helpless and out of
  control in confronting mathematics; they were easily bewildered
  and found themselves humiliated in class; they were uneasy
  solving or analyzing problems under time pressure, and they had
  become distrustful of intuitive ideas that had not been
  formally introduced in the text.  Because of all this, the
  students felt compelled to memorize solutions to individual
  problems.[43]

   Mathematics forms the foundation of the technical jargon that
the average physicist uses to confuse the issues and enhance his
status by overpublishing his work.  The same basic equations, or
algebraic variations of them, are repeated over and over in the
literature.  If the unneeded equations were eliminated, the
articles would be easier to understand, and the inflated volume
of the physics journals would be reduced by at least 90%.  To
illustrate the problem, I will make several quotes from an
article by Prof. N. David Mermin entitled "WHAT'S WRONG WITH
THESE EQUATIONS?":

     A major impediment to writing physics gracefully comes from
  the need to embed in the prose many large pieces of raw
  mathematics.  Nothing in freshman composition courses prepares
  us for the literary problems raised by the use of displayed
  equations.  Our knowledge is acquired implicitly by reading
  textbooks and articles, most of whose authors have also given
  the problem no thought....
     Admittedly sometimes an equation is buried so deep in the
  guts of an argument, so contingent on context, so ungainly in
  form that no brief phrase can convey to a reader even a glimmer
  of what it is about, and anybody wanting to know why it was
  invoked a dozen pages further on cannot do better than wander
  back along the trail and gaze at the equation itself, all
  glowering and menacing in its lair.... Indeed, is the equation
  itself essential?  Or is it the kind of nasty and fundamentally
  uninteresting intermediate step that readers would either skip
  over or, if seriously interested, work out for themselves, in
  neither case needing to have it appear in your text?...
     We punctuate equations because they are a form of prose
  (they can, after all, be read aloud as a sequence of words) and
  are therefore subject to the same rules as any other prose....
     Most journals punctuate their equations, even if the author
  of the manuscript did not, but a sorry few don't, removing all
  vestiges of the punctuation carefully supplied by the author. 
  This unavoidably weakens the coupling between the math and the
  prose, and often introduces ambiguity and confusion.[44]

Dr. Oliver C. Wells is a research scientist at the IBM Thomas J.
Watson Research Center; concerning the difficulty of
understanding the mathematics and technical jargon in physics, he
writes:

     On the subject of writing style, I am frequently horrified
  to discover that I quite simply cannot understand even the
  first paragraph of a technical article on a subject quite close
  to my own major area of interest.[45]

   Dr. Jackson, the Executive Director of the scientific research
society Sigma Xi, has published a booklet on scientific
ethics.[50]  On page 11 of Chapter 3, which is titled "Trimming,
Cooking, and Forging," he starts with:

     Charles Babbage (1792-1871) is generally remembered as the
  prophet of the electronic computer, because of his "difference
  engine" and the uncompleted "analytical engine."  But he had a
  much more extensive influence on scientific development.  As
  professor of mathematics at Cambridge University, he published
  a book entitled Reflections on the Decline of Science in
  England.  Since the year was 1830, the same year that Charles
  Lyell began to publish his Principles of Geology and shortly
  before Charles Darwin set sail on the "Beagle," the title may
  seem as premature as his calculating devices.  Babbage's book,
  however, is generally given credit as a catalyst in the
  creation of the British Association for the Advancement of
  Science, and indirectly of similar associations in the U.S.A.,
  Australia and elsewhere.
     Babbage, the "irascible genius," was also concerned with how
  science should be done, and the same book describes the forms
  of scientific dishonesty that give this chapter its title.  The
  definitions used here are phrased in contemporary English;
  otherwise not much seems to have changed in 150 years.

          Trimming: the smoothing of irregularities to make the
                    data look extremely accurate and precise.

          Cooking:  retaining only those results that fit the
                    theory and discarding others.

          Forging:  inventing some or all of the research data
                    that are reported, and even reporting
                    experiments to obtain those data that were
                    never performed.

   Dishonest deceptions are not unusual in the history of
physics.  They began with Galileo Galilei, the man who laid the
foundations of modern physics.  My insight into this matter came
from a book titled "The Birth of a New Physics" by Dr. I. Bernard
Cohen.[51]  On page 66 we find:

  ...Galileo was born in Pisa, Italy, in 1564, almost on the day
  of Michelangelo's death and within a year of Shakespeare's
  birth.  His father sent him to the university at Pisa, where
  his sardonic combativeness quickly won him the nickname
  "wrangler."

And then on page 111:

     Galileo's originality was therefore different from what he
  boastfully declared.  No longer need we believe anything so
  absurd as that there had been no progress in understanding
  motion between the time of Aristotle and Galileo.  And we may
  ignore the many accounts that make it appear that Galileo
  invented modern dynamics with no debt to any medieval or
  ancient predecessor.
     This was a point of view encouraged by Galileo himself but
  it is one that could be more justifiably held fifty years ago
  than today.  One of the most fruitful areas of research in the
  history of science in the last half century--begun chiefly by
  the French scholar and scientist Pierre Duhem--has been the
  "exact sciences" of the Middle Ages.  These investigations have
  uncovered a tradition of criticism of Aristotle which paved the
  way for Galileo's own contributions.  By making precise exactly
  what Galileo owed to his predecessors, we may delineate more
  accurately his own heroic proportions.  In this way,
  furthermore, we may make the life story of Galileo more real,
  because we are aware that in the advance of the sciences each
  man builds on the work of his predecessors....

   More than any other man, Sir Isaac Newton set the tone for
scientific dishonesty in modern physics by his skilled use of
"Mathematical Magic."  My insight into this came from a very
interesting article titled "Newton and the Fudge Factor" by Dr.
Richard S. Westfall.[52]  To advance my argument I start with the
following paragraph from the article:

     And having proposed exact correlation as the criterion of
  truth, it took care to see that exact correlation was
  presented, whether or not it was properly achieved.  Not the
  least part of the Principia's persuasiveness was its deliberate
  pretense to a degree of precision quite beyond its legitimate
  claim.  If the Principia established the quantitative pattern
  of modern science, it equally suggested a less sublime
  truth--that no one can manipulate the fudge factor quite so
  effectively as the master mathematician himself.

In explaining Newton's motives in fudging his work, I present the
following paragraph from Westfall's article:

     The second edition of the Principia was at once an amended
  version of the first edition and a justification of Newtonian
  science.  The battle with the continental mechanical
  philosophers who refused to have truck with the occult notion
  of action at a distance still raged.  The second edition made
  its appearance framed, as it were, by its two most important
  additions, Cotes' "Preface" at the beginning and Newton's
  "General Scholium" at the end, both of them devoted to the
  defense of Newtonian philosophy, of exact quantitative science
  as opposed to speculative hypotheses of causal mechanisms.  By
  1713, moreover, Newton's perpetual neurosis had reached its
  passionate climax in the crusade to destroy the arch-villain
  Leibniz.  Only a year earlier the Royal Society had published
  its Commercium epistolicum, a condemnation of Leibniz for
  plagiary and a vindication of Newton, which Newton himself
  composed privately and thrust upon the society's committee of
  avowed impartial judges.  In Newton's mind, the two battles
  merged into one, undoubtedly gaining emotional intensity in the
  process.  Not only did Leibniz try to explain the planetary
  system by means of a vortex and inveigh against the concept of
  attraction, but he also encouraged others to attack Newton's
  philosophy.  His arrogance in claiming the calculus was only a
  special instance of his arrogant presumption to trim nature to
  the mold of his philosophical hypotheses.  In contrast, the
  true philosophy modestly and patiently followed nature instead
  of seeking to compel her.  The increased show of precision in
  the second edition was the reverse side of the coin stamped
  hypotheses non fingo.  It played a central role in the polemic
  supporting Newtonian science.

The term "fudge factor" is of course, just a polite way of
describing  Newton's dishonest ways of Trimming, Cooking, and
Forging the data.  The following is taken from one of the
examples of Newton's fudging in the article:

     In examining the alterations, let us start with the velocity
  of sound since the deception in this case was patent enough
  that no one beyond Newton's most devoted followers was taken
  in.  Any number of things were wrong with the demonstration. 
  It calculated a velocity of sound in exact agreement with
  Derham's figure, whereas Derham himself had presented the
  conclusion merely as the average of a large number of
  measurements.  Newton's assumptions that air contains vapor in
  the quantity of 10 parts to 1 and that vapor does not
  participate in the sound vibrations were wholly arbitrary,
  resting on no empirical foundation whatever.  And his use of
  the "crassitude" of the air particles to raise the calculated
  velocity by more than 10 percent was nothing short of
  deliberate fraud.
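
To put rough numbers on the gap Newton was papering over, here is
a minimal sketch (an illustration in modern terms, not part of
Westfall's article) comparing Newton's isothermal formula for the
speed of sound with Laplace's later adiabatic correction, using
present-day values for dry air at 0 degrees C.  Newton's formula
falls short of the measured value by roughly fifteen percent,
about the size of the discrepancy his "crassitude" and vapor
adjustments were brought in to cover:

   # Speed of sound: Newton's isothermal estimate vs. Laplace's
   # adiabatic correction (modern values, dry air at 0 degrees C).
   import math

   p = 101325.0    # atmospheric pressure, Pa
   rho = 1.293     # density of dry air at 0 C, kg/m^3
   gamma = 1.4     # adiabatic index of air, unknown in Newton's day

   v_newton = math.sqrt(p / rho)            # about 280 m/s
   v_laplace = math.sqrt(gamma * p / rho)   # about 331 m/s

   print(f"Newton (isothermal): {v_newton:.0f} m/s")
   print(f"Laplace (adiabatic): {v_laplace:.0f} m/s")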

   Interesting additional information with regard to Newton's
lack of scientific integrity can be found in an article published
by Dr. I. Bernard Cohen in the journal Scientific American.[53] 
The article is titled "Newton's Discovery of Gravity" and
contains the following paragraph:

     A decisive step on the path to universal gravity came in
  late 1679 and early 1680, when Robert Hooke introduced Newton
  to a new way of analyzing motion along a curved trajectory. 
  Hooke had cleverly seen that the motion of an orbiting body has
  two components, an inertial component and a centripetal, or
  center-seeking, one.  The inertial component tends to propel
  the body in a straight line tangent to the curved path, whereas
  the centripetal component continuously draws the body away from
  the inertial straight-line trajectory.  In a stable orbit such
  as that of the moon the two components are matched, so that the
  moon neither veers away on a tangential path nor spirals toward
  the earth.

Later in the article Cohen writes this paragraph:

     In his letter Hooke ventured the suggestion that the
  centripetal force drawing a planet toward the sun varies
  inversely as the square of the separation.  At this point Hooke
  was stuck.  He could not see the dynamical consequences of his
  own deep insight and therefore could not make the leap from
  intuitive hunch and guesswork to exact science.  He could go no
  further because he lacked both the mathematical genius of
  Newton and an appreciation of Kepler's law of areas, which
  figured prominently in Newton's subsequent approach to
  celestial dynamics.  The law of areas states that the radius
  vector from the sun to a planet sweeps out equal areas in equal
  times.
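
For reference, Hooke's inverse-square suggestion and the law of
areas can be written in modern notation (this formulation is not
Cohen's) as

   F = \frac{G M m}{r^{2}},
   \qquad
   \frac{dA}{dt} = \frac{1}{2}\left|\mathbf{r}\times\mathbf{v}\right|
                 = \frac{L}{2m} = \text{constant},

where L is the orbital angular momentum.  The second relation
holds for any force directed along the line toward the sun, which
is the role the law of areas plays in the dynamics Newton built on
Hooke's hunch.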

With regard to Newton's philosophy as to the cause of the
gravitational force, we find the following paragraph:

     Although Newton at times thought universal gravity might be
  caused by the impulses of a stream of ether particles
  bombarding an object or by variations in an all-pervading
  ether, he did not advance either of these notions in the
  Principia because, as he said, he would "not feign hypotheses"
  as physical explanations.  The Newtonian style had led him to a
  mathematical concept of universal force, and that style led him
  to apply his mathematical result to the physical world even
  though it was not the kind of force in which he could believe.

With regard to Newton's dishonest attempt to claim full credit we
find:

     In 1717 Newton wanted to ensure his own priority in
  discovering the inverse-square law of gravitation, and so he
  invented a scenario in which he made the famous moon test not
  while writing the Principia but two decades earlier in the
  1660's....

And in this same regard, Cohen states this paragraph:

     Newton never published his invented scenario of the early
  moon test.  He included it in the manuscript draft of a letter
  to the French writer Pierre Des Maizeaux but then crossed it
  out.  Newton also circulated the familiar story that a falling
  apple set him on a chain of reflections that led to the
  discovery of universal gravitation.  Presumably this invention
  was also part of his campaign to push back the discovery of
  gravity, or at least the roots of the discovery, to a time 20
  years before the Principia.

   With Newton as a role model, it's no wonder that modern
physics is riddled with an almost complete lack of scientific
objectivity and integrity!  Additional insight into this matter
comes from a very interesting book by Dr. Rudolf Thiel.[54]  The
insight starts on page 183 with the following paragraph:

     René Descartes dominated the first half of the seventeenth
  century in his dual capacity of mathematician and philosopher. 
  He had developed mathematical analysis, which wiped out the
  boundary between geometry and algebra, in which curves became
  functions.  By comparison, Euclidean thinking seemed pedantic
  and limited.  Then he attempted to explain the entire mechanism
  of the world by ether eddies.  These supposedly transmitted
  light, and at the same time set the celestial bodies in motion. 
  He succeeded in reducing all the phenomena of nature known at
  the time to this single cause, which transmitted its effect
  tangibly from one thing to another; thus everything was
  connected in a chain with everything else.  Descartes's
  contemporaries hailed this triumph of reasoning which seemed to
  explain every detail of the entire Creation.
     Then Newton came along with his mathematical proofs of
  gravitation, which could not be explained by ether eddies. 
  Gravitation was a mystery working over great distances in some
  inconceivable manner.  Such a thing was repugnant to Europeans,
  who wanted to see the interlocking cause and effect with their
  own eyes.  Newton's version of nature therefore seemed to be a
  descent from the heights attained by Descartes, retrogression
  to an outmoded stage of philosophy.
     Worse still, in Newton's mighty system there was no room
  left for the ether.  This also undermined the wave theory of
  light, which Huygens had recently presented to the world. 
  Newton himself regretted this, for the wave theory was
  essential to his theory of color.  There still remained the
  problem of explaining the spectrum: why were the rays of
  primary light arranged in the particular order of red, yellow,
  green, and violet?  Why did light consist of many colors; what
  were colors?  According to Huygens they were simply waves of
  differing lengths, differing frequencies, just like different
  pitches.  The spectrum represented a scale, a gamut of light.
     This explanation seemed to emerge again from another of
  Newton's experiments.  If light is passed through a lens
  pressed upon a plate of glass, a wreath of colored rings is
  produced.  When monochromatic light is sent through such an
  apparatus, the rings of each color appear at different
  distances from one another.  Newton measured the distances--and
  was in effect measuring the wave lengths of light.  But he
  would not accept this explanation; light waves could not exist
  because there was no medium, no ether, to transmit them.  So
  impossible, nonsensical a concept as that of the ether had no
  right to existence.  Anything that did not follow from
  observations was a hypothesis, he maintained, and hypotheses
  had no place in experimental science.
     Newton therefore concluded that light consisted of
  corpuscles passing through empty space.  The differing
  distances of the colored rings proved only that the corpuscles
  were affected by their passage through the lens and the glass,
  that their character was affected in some way, to what degree
  depending on their color.
     Only Newton, with his incredibly sane and all-embracing
  system, could have succeeded in putting across so absurd a
  conception.  He won the battle completely.  The wave theory
  vanished, and with it Descartes's ether eddies.  The whole
  triumphant world-view of the Baroque Age had been shattered. 
  In its place Newton offered the inexplicable, remote force of
  gravitation which was, admittedly, a mystery to himself.  When
  he was asked what accounted for it, he flatly refused to
  venture any opinion: "I do not invent hypotheses."
     This attitude of his became a model for future natural
  philosophers.  Henceforth scientists considered it more
  important to recognize where the limits of science lay than to
  satisfy the urge for knowledge by unproved speculations, no
  matter how pretty they might be.
     The incomprehensibility of gravitation Newton considered a
  divine dispensation.  The Almighty had denied man ultimate
  insight into the mystery of His Creation.  A Christian must be
  able to reconcile himself to this fact--and Newton was a devout
  Christian....
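
The colored rings in the passage above are what is now called
Newton's rings.  As a rough illustration of why measuring them
amounts to measuring wavelengths (a sketch in modern terms, not
taken from Thiel, with an assumed lens curvature), the m-th dark
ring seen in reflection from a lens of radius of curvature R
resting on flat glass has radius r_m = sqrt(m * lambda * R), so
rings of different colors do fall at different distances:

   # Newton's rings: dark-ring radii r_m = sqrt(m * wavelength * R).
   import math

   R = 1.0  # lens radius of curvature in meters (assumed)
   for lam, name in [(6.5e-7, "red"), (4.0e-7, "violet")]:
       radii_mm = [math.sqrt(m * lam * R) * 1e3 for m in (1, 2, 3)]
       print(name, "rings (mm):",
             ", ".join(f"{r:.2f}" for r in radii_mm))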

With regard to Newton as the role model for the corrupt politics
of modern physics, we find on page 185:

     In his mid-fifties there came a radical change in Newton's
  way of life.  He was appointed master of the Royal Mint, an
  office equivalent to what would now be governor of the Bank of
  England.  He exchanged his modest lodgings at Cambridge for a
  palace in London, entered society, kept horses, carriages, and
  servants.  His income shot abruptly from sixty to five hundred
  pounds a year, besides various perquisites; he was able to
  indulge his taste for philanthropy.  He was knighted, and
  became an influential personage at court.  Most important of
  all, he became president of the Royal Society.
     This celebrated association of scientists was about the same
  age as Newton himself.  At the time he was given his
  professorship, the society became "royal," and was provided
  with special privileges, robes of state, a mace, and a seal
  bearing the motto: "Let no one's word be law."  But the motto
  went by the board once Newton was elected with absolute
  regularity to the presidency.  His word was sacred.  An
  excellent model for a cannon was unanimously rejected because
  Newton declared: "This diabolic instrument will only multiply
  mass killing."  In London the Royal Society was generally known
  as Sir Isaac's Parliament.
     This parliament became the platform for Newton's world fame. 
  But it also embittered the closing days of his life by its
  frenetic partisanship, in connection with his fourth great
  contribution, the calculus of fluxions, which has become the
  core of modern mathematics.  This time, however, Newton was not
  the sole discoverer of the method.  It was simultaneously
  developed, under the name of the differential calculus, by the
  German philosopher Leibnitz....
     Most of the technical terminology of modern mathematics
  derives from Leibnitz.  All of Europe learned the differential
  calculus from his textbook.  He described the new art of
  reckoning in such lucid terms that a veritable race began among
  mathematicians, each trying to outdo the other in elegant
  solutions of hitherto unsolved problems.  Mathematicians posed
  each other riddles, and sent each other the results in code to
  be sure that no one copied.  The period immediately after
  Leibnitz was an exciting and glorious era in the history of
  mathematics.  And all the newest discoveries were made by means
  of Leibnitzian differential quotients.  No one had ever heard
  of Newton's counterpart, his fluxions.  Newton had created the
  method for his own private use, and hesitated to publish it
  because it was so difficult to grasp.  For his Principia he
  therefore invented a less difficult, more geometrical method of
  proof....
     The most remarkable aspect of the whole barren struggle was
  this: no participant doubted for a moment that Newton had
  already developed his method of fluxions when Leibnitz began
  work on the differential calculus.  Yet there was no proof,
  only Newton's word.  He had published nothing but a calculation
  of a tangent, and the note: "This is only a special case of a
  general method whereby I can calculate curves and determine
  maxima, minima, and centers of gravity."  How this was done he
  explained to a pupil a full twenty years later, when Leibnitz's
  textbooks were widely circulated.  His own manuscripts came to
  light only after his death, and then they could no longer be
  dated.
     Though Newton's priority was not provable, it was taken for
  granted, while Leibnitz was always asked to prove that he had
  not plagiarized--a charge as humiliating as it was absurd.
  This grotesque situation demonstrates most vividly the
  authority Newton enjoyed everywhere.  He was truly the monarch
  of all he surveyed, a unique phenomenon.  To Western science he
  occupied the same place that had been held in classical
  antiquity by Pythagoras--whose disciples were wont to crush all
  opponents with the words: "Pythagoras himself has said so."

   In our time, Einstein has replaced Newton as the monarch of
physics.  Einstein's disciples tend to crush all opponents of his
relativity theories by citing chapter and verse of articles he
has published.  The main problem with this is that Einstein tends
to be a moving target: his arguments are not consistent from
paper to paper, and often not even within the same paper.
Louis Essen has published a booklet titled "The Special Theory of
Relativity: A Critical Analysis," in which he examines this
question in great detail.[55]  Essen is a prominent English
physicist who built the first caesium atomic clock in 1955 and
determined the most accurate value for the velocity of light by
using a cavity resonator.  Skipping around the math, I present
the following excerpts from the booklet:

     Perhaps the strangest feature of all, and the most
  unfortunate to the development of science, is the use of the
  thought-experiment.  The expression itself is a contradiction
  in terms, since an experiment is a search for new knowledge
  that cannot be confirmed, although it might be predicted, by a
  process of logical thought.  A thought-experiment on the other
  hand cannot provide new knowledge; if it gives a result that is
  contrary to the theoretical knowledge and assumptions on which
  it is based then a mistake must have been made.  Some of the
  results of the theory were obtained in this way and differ from
  the original assumptions....
     A common reaction of experimental physicists to the theory
  is that although they do not understand it themselves it is so
  widely accepted that it must be correct.  I must confess that
  until recent years this was my own attitude.  I was, however,
  rather more than usually interested in the subject from a
  practical point of view, having repeated, with microwaves
  instead of optical waves (Essen 1955), the celebrated Michelson-
  Morley experiment, which was the starting point of the theory. 
  Then with the introduction of atomic clocks, and the enormous
  increase in the accuracy of time measurements that they made
  possible, the relativity effects became of practical
  significance....
     Many of the thought-experiments described by Einstein and
  others involve the comparison of distant clocks.  Such
  comparisons are now made every day at many laboratories
  throughout the world.  The techniques are well known.  It seems
  reasonable, therefore, to consider the thought-experiments in
  terms of these techniques.  When this is done, the errors in
  the thought-experiments become more obvious.  The fact that
  errors in the theory arise in the course of the thought-
  experiments may explain why they were not detected for so long. 
  Theoretical physicists might not have considered them
  critically from an experimental point of view.  But if one has
  been actually performing such experiments for many years, one
  is in a more favorable position to detect any departure from
  the correct procedure.  In the existing climate of opinion, one
  needed to be very confident to speak of definite errors in the
  theory.  Was there not perhaps some subtle interpretation that
  was being overlooked?  A study of the literature did not reveal
  any, but even so it was familiarity with the experiments that
  gave one the necessary confidence to maintain a critical
  attitude.
     The literature sometimes reveals a remarkable vagueness of
  expression, a lack of a clear statement of the assumptions of
  the theory, and even a failure to appreciate the basic ideas of
  physical measurement.  Ambiguities are not absent from
  Einstein's own papers, and various writers, even when advancing
  different interpretations of the theory, are correct in as much
  as these interpretations can all be attributed to Einstein....
     The contraction of length and the dilation of time can now
  be understood as representing the changes that have to be made
  to make the results of measurement consistent.  There is no
  question here of a physical theory but simply of a new system
  of units in which c is constant, and length and time do not
  have constant units but have units that vary with v²/c².  Thus
  they are no longer independent, and space and time are
  intermixed by definition and not as a result of some peculiar
  property of nature....  If the theory of relativity is regarded
  simply as a new system of units it can be made consistent but
  it serves no useful purpose....  The argument about the clock
  paradox has continued interminably, although the way the
  paradox arose and its explanation follow quite clearly from a
  careful reading of Einstein's paper....  The experiment is
  often expressed in the dramatized form of two twins, one of
  whom returns from a round trip younger than his brother; and in
  this form it has received wide publicity....  It is illogical
  to suggest that a result obtained on the basis of the special
  theory is correct but is a consequence of a completely
  different theory developed some years later.  It is also
  illogical to assume that accelerations have no effect -- as he
  does in A's picture of the events -- and then to assume that
  gravitation, which in the general theory is assumed to be
  equivalent to acceleration, does have an effect....  It may be
  surprising, therefore, to find that a more critical examination
  of the experiments and the experimental conditions suggests
  that there is no experimental support for the theory...  The
  experiments of the Michelson-Morley type cannot be taken as
  supporting the theory, because the theory was developed in
  order to explain the null result that was obtained....  The
  increase of mass with velocity was predicted for the case of
  charged particles directly from electromagnetic theory before
  the advent of relativity theory and was confirmed
  experimentally by Kaufmann....

  18.   Conclusions

  A critical examination of Einstein's papers reveals that in the
  course of thought-experiments he makes implicit assumptions
  that are additional and contrary to his two initial principles. 
  The initial postulates of relativity and the constancy of the
  velocity of light lead directly to length contraction and time
  dilation simply as new units of measurements, and in several
  places Einstein gives support to this view by making his
  observers adjust their clocks.  More usually, and this
  constitutes the second set of assumptions, he regards the
  changes as being observed effects, even when the units are not
  deliberately changed.  This implies that there is some physical
  effect even if it is not understood or described.  The results
  are symmetrical to observers in relative motion; and such can
  only be an effect in the process of the transmission of the
  signals.  The third assumption is that the clocks and lengths
  actually change.  In this case the relativity postulate can no
  longer hold.
     The first approach, in which the units of measurement are
  changed, is not a physical theory, and the question of
  experimental evidence does not arise.  There is no evidence for
  the second approach because no symmetrical experiment has ever
  been made.  There is no direct experimental evidence of the
  third statement of the theory because no experiments have been
  made in an inertial system.  There are experimental results
  that support the idea of an observed time dilation, but
  accelerations are always involved, and there is some indication
  that they are responsible for the observed effects.
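
For reference, the standard relations Essen is discussing can be
written (in textbook notation, not his own) as

   \gamma = \frac{1}{\sqrt{1 - v^{2}/c^{2}}},
   \qquad
   t = \gamma\,\tau,
   \qquad
   L = \frac{L_{0}}{\gamma},

where \tau is an interval of proper time, L_0 a proper length, and
v the relative speed of the two frames; the factor v²/c² inside
\gamma is the quantity he says the new units are made to vary
with.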

   My main insight into Einstein and his work came from a book by
Dr. Abraham Pais titled 'Subtle is the Lord...' The Science and
the Life of Albert Einstein.[37]  Pais is an award-winning
physicist who knew Einstein personally during the last nine years
of his life.  On page 13 we find that in Einstein's own words he
had been an "unscrupulous opportunist."  On page 44 we learn that
Einstein did not attend lectures or study, but instead used
Marcel Grossman's lecture notes to pass his college examinations.
With regard to the mathematics of relativity, page 152 states:

  Initially, Einstein was not impressed and regarded the
  transcriptions of his theory into tensor form as 'überflüssige
  Gelehrsamkeit' (superfluous learnedness).  However, in 1912 he
  adopted tensor methods and in 1916 acknowledged his
  indebtedness to Minkowski for having greatly facilitated the
  transition from special to general relativity.

Since most scientists neither use nor are conversant with tensor
mathematics, its use has tended to obscure the intimate meaning
behind the theoretical arguments of relativity.  On page 164 Pais
asks:

  Why, on the whole, was Einstein so reticent to acknowledge the
  influence of the Michelson-Morley experiment on his thinking?

On page 168 we find the answer to this question in the second
volume of Sir Edmund Whittaker's masterpiece entitled
"History of the Theories of Aether and Electricity", where:

  Whittaker's opinion on this point is best conveyed by the title
  of his chapter on this subject: 'The Relativity Theory of
  Poincar‚ and Lorentz.'

In effect Whittaker showed that Einstein's special relativity
theory was not original work, but just a clever restatement of
the theoretical work of Poincaré and Lorentz.  The translation of
Lorentz's 1904 relativity paper[57 p.12] states:

  ...Poincaré has objected to the existing theory of electric and
  optical phenomena in moving bodies that, in order to explain
  Michelson's negative result, the introduction of a new
  hypothesis has been required, and that the same necessity may
  occur each time new facts will be brought to light.  Surely
  this course of inventing special hypotheses for each new
  experimental result is somewhat artificial.  It would be more
  satisfactory if it were possible to show by means of certain
  fundamental assumptions and without neglecting terms of one
  order of magnitude or another, that many electromagnetic
  actions are entirely independent of the motion of the system.

The translation of Einstein's 1905 special relativity paper[57
p.37] presented the argument that one could explain many
electromagnetic actions by fundamental assumptions based on two
postulates and that the "introduction of a 'luminiferous ether'
will prove to be superfluous," and his paper made no direct
reference to the Michelson-Morley experiment or the work of
Poincaré and Lorentz.  On page 313 of Pais' book we learn that in
1920, after Einstein had become famous, he made an inaugural
address on aether and relativity theory for his special chair in
Leiden.  In the address he states:

  The aether of the general theory of relativity is a medium
  without mechanical and kinematic properties, but which
  codetermines mechanical and electromagnetic events.

So we finally find that relativity is an ether theory after all,
and that this ether has arbitrary, abstract, contradictory physical
characteristics!  This illustrates the arbitrary nature of
relativity.  Most physicists, and for that matter, most physics
text books, present the argument that relativity is not an ether
theory.  On page 467 we find that near the end of his life in
1954, Einstein wrote to his dear friend M. Besso:

  I consider it quite possible that physics cannot be based on
  the field concept, i.e., on continuous structures.  In that
  case, nothing remains of my entire castle in the air,
  gravitation theory included, [and of] the rest of modern
  physics.

With regard to the problem of the average physicist not
understanding relativity theory, Dr. S. Chandrasekhar, a Nobel
laureate physicist, writes in an article[46] titled "Einstein and
general relativity: Historical perspectives":

     The meeting of November 6, 1919 of the Royal Society also
  originated a myth that persists even today (though in a very
  much diluted version): "Only three persons in the world
  understand relativity."  Eddington explained the origin of this
  myth during the Christmas-recess conversation with which I
  began this account.
     Thomson, as President of the Royal Society at that time,
  concluded the meeting with the statement: "I have to confess
  that no one has yet succeeded in stating in clear language what
  the theory of Einstein really is."  And Eddington recalled that
  as the meeting was dispersing, Ludwig Silberstein (the author
  of one of the early books on relativity) came up to him and
  said: "Professor Eddington, you must be one of three persons in
  the world who understands general relativity."  On Eddington
  demurring to this statement, Silberstein responded, "Don't be
  modest Eddington."  And Eddington's reply was, "On the
  contrary, I am trying to think who the third person is!"

This lack of comprehension of relativity theory is not uncommon
among physicists and astronomers.  Over the years, in many
intimate conversations and correspondence with them, I've found
few scientists willing to admit to an in-depth understanding of
the theory, yet most of them will argue for their belief in it.  I
have also discovered that even the scientists who are willing to
admit to full comprehension of the theory have serious gaps in
their knowledge of it.  For example, Prof. William H. McCrea of
England wrote the counter argument to Prof. Herbert Dingle's
controversial attack on the inconsistent logic in the theory,
which was published in the prestigious journal NATURE.[47] 
Dingle was an interesting fellow, at one time he was a leading
proponent of the relativity theory, and even was a member of
several British solar eclipse expeditions.  He was a professor at
University College in London, and the author of many books and
papers on astrophysics, relativity, and the history of science. 
I was introduced to McCrea by Prof. Thornton Page at the 1968
Fourth Texas Symposium on Relativistic Astrophysics.  McCrea, who
is considered an authority on relativity theory, was
surprised to find that Einstein considered relativity to be an
ether theory.  With regard to the argument that I showed McCrea
that represented relativity as an ether theory, Einstein and
Infeld state:

  ...On the other hand, the problem of devising the mechanical
  model of ether seemed to become less and less interesting and
  the result, in view of the forced and artificial character of
  the assumptions, more and more discouraging.
     Our only way out seems to be to take for granted the fact
  that space has the physical property of transmitting
  electromagnetic waves, and not to bother too much about the
  meaning of this statement.  We may still use the word ether,
  but only to express some physical property of space.  This word
  ether has changed its meaning many times in the development of
  science.  At the moment it no longer stands for a medium built
  up of particles.  Its story, by no means finished, is continued
  by the relativity theory.[20 p.153]

There is a very interesting article on this question published in
the August 1982 issue of Physics Today by Prof. Yoshimasa A. Ono. 
The article begins:

  It is known that when Albert Einstein was awarded the Nobel
  Prize for Physics in 1922, he was unable to attend the
  ceremonies in Stockholm in December of that year because of an
  earlier commitment to visit Japan at the same time.  In Japan,
  Einstein gave a speech entitled "How I Created the Theory of
  Relativity" at Kyoto University on 14 December 1922.  This was
  an impromptu speech to students and faculty members, made in
  response to a request by K. Nishida, professor of philosophy at
  Kyoto University.  Einstein himself made no written notes.  The
  talk was delivered in German and a running translation was
  given to the audience on the spot by J. Ishiwara, who had
  studied under Arnold Sommerfeld and Einstein from 1912 to 1914
  and was a professor of physics at Tohoku University.  Ishiwara
  kept careful notes of the lecture, and published his detailed
  notes (in Japanese) in the monthly Japanese periodical Kaizo in
  1923; Ishiwara's notes are the only existing notes of
  Einstein's talk....

Ono ends his introduction to his translation with the statement:


  It is clear that this account of Einstein's throws some light
  on the current controversy as to whether or not he was aware of
  the Michelson-Morley experiment when he proposed the special
  theory of relativity in 1905; the account also offers insight
  into many other aspects of Einstein's work on relativity.

With regard to the ether, Einstein states:

  Light propagates through the sea of ether, in which the Earth
  is moving.  In other words, the ether is moving with respect to
  the Earth....

With regard to the experiment he argues:

  Soon I came to the conclusion that our idea about the motion of
  the Earth with respect to the ether is incorrect, if we admit
  Michelson's null result as a fact.  This was the first path
  which led me to the special theory of relativity.  Since then I
  have come to believe that the motion of the Earth cannot be
  detected by any optical experiment, though the Earth is
  revolving around the Sun.[48]
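
To give a sense of what "Michelson's null result" means in
numbers, here is a minimal sketch of the fringe shift a stationary
sea of ether would predict when the interferometer is turned
through 90 degrees.  This is not part of Ono's translation, and
the arm length and wavelength are assumed values typical of the
1887 apparatus:

   # Classical ether-drift prediction for the Michelson-Morley
   # fringe shift on a 90-degree rotation: N = 2*L*v**2/(lam*c**2).
   L = 11.0      # effective optical arm length, m (assumed)
   lam = 5.5e-7  # wavelength of the light used, m (assumed)
   v = 3.0e4     # orbital speed of the Earth, m/s
   c = 3.0e8     # speed of light, m/s

   expected = 2 * L * v**2 / (lam * c**2)
   print(f"predicted shift: {expected:.2f} fringe")  # about 0.4
   print("observed: nothing remotely that large (the null result)")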

The above information gives us insight into the nature of
Einstein's relativity theory.  He believes that the sea of ether
exists, but he also believes that it cannot be detected by
experiments; in other words, he believes it is invisible.  The
situation in modern physics is very much like the Hans Christian
Andersen tale of "The Emperor's New Clothes", with Einstein
playing the part of the Emperor.  The tale goes that the Emperor,
who was obsessed with fine clothing to the point that he cared
about nothing else, let two swindlers sell him a suit of cloth
that would be invisible to anyone who was "unfit for his office
or unforgivably stupid."  It turned out that no one could see the
suit--not the emperor, not his courtiers, not the citizens of the
town who lined the street to see him show off his new finery. 
Yet no one dared admit it until a little child cried out, "But he
doesn't have anything on!"
   In regard to Einstein's reluctance to acknowledge the
influence of the Michelson-Morley experiment on his thinking, and
Whittaker's argument that his special relativity theory was a
clever restatement of the work of Poincaré and Lorentz, I report
the following published[56] statements which Einstein made to
Prof. R. S. Shankland on this matter:

     The several statements which Einstein made to me in
  Princeton concerning the Michelson-Morley experiment are not
  entirely consistent, as mentioned above and in my earlier
  publication.  His statements and attitudes towards the
  Michelson-Morley experiment underwent a progressive change
  during the course of our several conversations.  I wrote down
  within a few minutes after each meeting exactly what I recalled
  that he had said.  On 4 February 1950 he said, "...that he had
  become aware of it through the writings of H. A. Lorentz, but
  only after 1905 had it come to his attention."  But at a later
  meeting on 24 October, 1952 he said, "I am not sure when I
  first heard of the Michelson experiment.  I was not conscious
  that it had influenced me directly during the seven years that
  relativity had been my life.  I guess I just took it for
  granted that it was true."  However, in the years 1905-1909 (he
  told me) he thought a great deal about Michelson's result in
  his discussions with Lorentz and others, and then he realized
  (so he told me) that he "had been conscious of Michelson's
  result before 1905 partly through his reading of the papers of
  Lorentz and more because he had simply assumed this result of
  Michelson to be true."...

   With regard to the politics that led to Einstein's fame, Dr. S.
Chandrasekhar's article[46] states:

    In 1917, after more than two years of war, England enacted
  conscription for all able-bodied men.  Eddington, who was 34,
  was eligible for draft.  But as a devout Quaker, he was a
  conscientious objector; and it was generally known and expected
  that he would claim deferment from military service on that
  ground.  Now the climate of opinion in England during the war
  was very adverse with respect to conscientious objectors: it
  was, in fact, a social disgrace to be even associated with one. 
  And the stalwarts of Cambridge of those days--Larmor (of the
  Larmor precession), Newall, and others--felt that Cambridge
  University would be disgraced by having one of its
  distinguished members a declared conscientious objector.  They
  therefore tried through the Home Office to have Eddington
  deferred on the grounds that he was a most distinguished
  scientist and that it was not in the long-range interests of
  Britain to have him serve in the army....  In any event, at
  Dyson's intervention--as the Astronomer Royal, he had close
  connections with the Admiralty--Eddington was deferred with the
  express stipulation that if the war should have ended by 1919,
  he should lead one of two expeditions that were being planned
  for the express purpose of verifying Einstein's prediction with
  regard to the gravitational deflection of light....  The Times
  of London for November 7, 1919, carried two headlines: "The
  Glorious Dead, Armistice Observance.  All Trains in the Country
  to Stop," and "Revolution in Science. Newtonian Ideas
  Overthrown."

Dr. F. Schmeidler of the Munich University Observatory has
published a paper[49] titled "The Einstein Shift -- An Unsettled
Problem," and a plot of shifts for 92 stars for the 1922 eclipse
shows shifts going in all directions, many of them going the
wrong way by as large a deflection as those shifted in the
predicted direction!  Further examination of the 1919 and 1922
data, originally interpreted as confirming relativity, tended to
favor a larger shift; the results depended very strongly on the
manner of reducing the measurements and on the effect of omitting
individual stars.
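
For scale, the deflection these plates were meant to test is
Einstein's prediction for a ray of starlight grazing the Sun,
twice the simple "Newtonian" value.  A short sketch with standard
solar constants (the numbers below are illustrative, not
Schmeidler's) gives the familiar figures:

   # Light deflection at the solar limb: general relativity
   # predicts 4GM/(c^2 R), twice the Newtonian 2GM/(c^2 R).
   G = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
   M = 1.989e30       # solar mass, kg
   R = 6.957e8        # solar radius, m
   c = 2.998e8        # speed of light, m/s
   ARCSEC = 206265.0  # radians to arcseconds

   gr = 4 * G * M / (c**2 * R)
   print(f"GR prediction:   {gr * ARCSEC:.2f} arcsec")      # about 1.75
   print(f"Newtonian value: {gr * ARCSEC / 2:.2f} arcsec")  # about 0.87
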
   So now we find that the legend of Albert Einstein as the
world's greatest scientist was based on the Mathematical Magic of
Trimming and Cooking the eclipse data to present the illusion
that Einstein's general relativity theory was correct, in order to
prevent Cambridge University from being disgraced because one of
its distinguished members was close to being declared a
"conscientious objector"!
