Predictive Cosmology: Creation's secret revealed in muon-electron mass ratio = 206.768 283


It is well known in science that three forces (electromagnetic, strong, and weak) govern the microscopic world of elementary particles. However, the reason why any of these forces exist in the first place is a question that is seldom asked and has never been satisfactorily answered. The theory called "Predictive Cosmology" claims to state the reason and provides the answer through mμ/me = 206.768 283. Could this be a breakthrough in theoretical physics?

Cosmology is the branch of physics that deals with the structure and origin of the Universe. For millennia people have been creating stories to explain why and how the universe came to be. However, despite much effort, physicists still have not been able to formulate a sensible theory they can all agree on.

Such a theory should be detailed enough to make predictions that can be verified. In addition, it should provide rational answers to questions such as:

•    'What makes the Universe expand?'

•    'How were the first particles created?'

•    'Why are there four forces (gravitation and the three already-mentioned forces)?'

•    'When did the forces first appear?' and

•    'Why didn't matter and antimatter completely annihilate each other in the early Universe?'

Physicists have been particularly puzzled by the existence of the muon (μ− with antiparticle μ+) and the tauon (τ− with antiparticle τ+), which are short-lived elementary particles that differ from the electron (e− with antiparticle e+, the positron) only via their much larger mass. The electron, muon, and tauon are said to represent three generations, or families, of spinning 'leptons.'

Another mystery is why spinless elementary particles have not been observed. Why isn't there, say, a set of charged 'spinless leptons' (e0±, μ0±, and τ0±)?

The Standard Model (SM) of particle physics does not forbid their existence!

This new theory attempts to resolve those mysteries, thereby answering many previously unanswered questions in cosmology. This article explains the work of a European theoretical physicist and his development of "Predictive Cosmology."


Behind the new theory lie no new axioms or postulates, not even a new idea. In fact, the basic idea is well known and can be traced back to the ancient Greek philosopher Anaximander (c. 611-547 BC), who suggested that matter originates from eternal motion in some kind of 'primeval matter.'

Later, physicists have talked about 'vortices in the ether,' suggested that space is a 'perfect fluid,' etc. What is new with the present theory is that the full consequences of Anaximander's idea are worked out. 

Momentum Equation

The natural starting point is the momentum equation. This equation is sometimes referred to as the 'fundamental hydrodynamical equation' because it provides the basis for our understanding of how fluids (liquids and gases) behave. Its role in weather prediction probably makes it the world's most heavily used equation.

The momentum equation has a solution in the form of a very simple equation. Assuming that it is applicable to space, it is found that this 'space equation' may be interpreted to picture a (negatively or positively charged) 'stationary electron,' showing a snapshot of the particle at the very instant of its birth.

It turns out that, in addition to explaining the electron's charge and spin, the 'space equation' explains expansion as a predetermined unperturbable feature of nature on the same footing with spin and charge. Gravity, in turn, is only a side effect of the expansion, with no role to play in the early phases of the Universe.

Another consequence of the space equation is that the Universe must have been born cold, that is, without temperature in the conventional meaning of the word, according to which pressure and temperature arise when molecules or particles bounce off each other.

(The early Universe is in an indeterminate quantum state with charged leptons and background photons only appearing in the form of entangled pairs. In other words, there is no kinetic energy, and all energy comes in the form of rest energy of composite particles [pairs of elementary particles].)

This fact means that the Universe originally was very simple, and that its early evolution may be tracked in detail in a computer simulation. By demanding that the results match the actually observed Universe, the simulation answers several previously open questions.

For instance, it becomes clear that the Universe initially contains equal amounts of matter and antimatter (or particles and antiparticles) and remains cold until the first proton-antiproton pair appears, after which the antiproton is immediately forced to decay into an electron, thereby heating matter (proton plus electron) to a temperature of about a thousand billion kelvin.

An immediate consequence of the Universe being initially cold is that the Higgs boson is expected to be no heavier than the electron (for which mec² = 0.511 MeV holds, that is, me = 0.511 MeV/c²). Supposedly, such a light, neutral, and spinless particle is difficult to observe experimentally at present. (Compare with the light, neutral, and spinning neutrinos, which are abundantly produced in nuclear reactors but rarely captured in detectors.)

Even so, a readily testable negative prediction can be made: No Higgs bosons will be detected in CERN's LHC (Large Hadron Collider) experiment, in which high-energy particles are looked for (a Higgs particle born in the intense heat of an originally 'infinitely' hot Universe is expected [by most particle physicists] to have a mass ranging between a few GeV/c² and a few TeV/c²).

So, Predictive Cosmology contends that when the Higgs boson is some day experimentally 'observed' its rest mass will be at a minimum several thousand times smaller than that presently predicted by leading particle physicists.


To be able to judge Predictive Cosmology on its merits, one needs to look at some of its other predictions.

Muon-Electron Mass Ratio

And, indeed, a theoretical calculation of the muon-electron mass ratio (mμ/me, the mass of the muon divided by the mass of the electron) seems to provide strong evidence for the new theory, since its prediction, mμ/me = 206.768 283, exactly matches the less precise measured value of 206.768 28 (whose last digit is somewhat uncertain).

Refer to CODATA (the Committee on Data for Science and Technology that publishes recommended values of the fundamental physical constants): 'muon-electron mass ratio.' [Reference: http://physics.nist.gov/cgi-bin/cuu/Value?mmusme]

Physicists wouldn't be very surprised by the statement that energy is part of the answer to the question of why the electromagnetic, strong, and weak forces exist.

However, more surprising is the assertion by the theory presently under consideration, i.e. Predictive Cosmology, that the law of conservation of energy provides the constraints that make cosmology a 'predictive' science.

Thus, Predictive Cosmology provides the answer to the questions: 'Why do the three forces exist?'; 'Why are there three generations of elementary particles (leptons and quarks)?'; and 'Why does matter dominate over antimatter?'.

The answers follow naturally when the computer simulation of the early Universe is made to fit reality (that is, picture a Universe similar to the one we observe: a Universe from which matter never entirely disappears even though it originally contains equal amounts of particles and antiparticles that rapidly annihilate each other).

In the computer program, no arbitrarily adjustable parameters need be introduced, a fact that makes the theory particularly appealing. (Making the simulation fit reality means, in effect, investigating if and how the lifetime of a particle and the speed of light might be interdependent, which leads to a very surprising result.)

The website http://www.physicsideas.com provides the arguments for the validity of Predictive Cosmology.


Let's take a look at these arguments to see how the reasoning goes.

The principle of conservation of energy is one of the solid pillars on which physics rests. Thus, a particle's rest energy (E), which is connected to its rest mass (m) via Albert Einstein's famous relation E = mc², is constant over time; consequently, the speed of light (c) is also constant over time.

The observed redshift of distant galaxies indicates that, when seen in a cosmic or 'global' perspective, the Universe is uniformly expanding. However, 'local' structures, such as everyday objects, stars, and galaxies (that is, structures held together by electromagnetic or gravitational forces), do not expand in the same way.

If radiation escaping from the galaxy or received from other galaxies is neglected, the energy of a galaxy is conserved. That is, the energy principle implies that energy is 'locally' conserved.

In contrast, energy is not conserved 'globally' throughout the Universe. That is to say, in a large cosmic volume (containing any given number of galaxies, that is, composed of many independent local structures) co-expanding with the Universe, total energy is not conserved.

This is so because radiation is redshifted during its voyage through intergalactic space. That is, the expansion of the Universe causes the photon's wavelength (λ) to stretch and the photon energy, hc/λ, to decrease.
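The relation quoted above can be sketched numerically. The following minimal example (the 500 nm wavelength and stretch factor are illustrative choices, not values from the article) shows how a photon's energy hc/λ falls as expansion stretches its wavelength:

```python
# Cosmological redshift in energy terms: a photon's energy E = h*c/lam
# falls in proportion as expansion stretches its wavelength lam.
h = 6.626e-34   # Planck constant (J*s)
c = 2.998e8     # speed of light (m/s)

def photon_energy(lam):
    """Energy in joules of a photon with wavelength lam (meters)."""
    return h * c / lam

lam = 500e-9                  # a visible-light photon (500 nm)
stretched = 2.0 * lam         # wavelength after space has doubled in scale
lost = photon_energy(lam) - photon_energy(stretched)

# Doubling the wavelength halves the photon's energy:
print(photon_energy(stretched) / photon_energy(lam))   # prints 0.5
print(lost > 0)                                        # prints True
```

This is the 'global' energy loss the article refers to: each background photon carries less and less energy as the Universe expands.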

(In the present theory, this is the whole story, while in theories where gravity and expansion are assumed to balance each other, things are more complicated. Things become particularly complicated when one attempts to use the theory for gravity'”general relativity'”to predict expansion. In Predictive Cosmology, where expansion is a fixed property of the Universe, such an attempt is not meaningful.)

Now, why should it be that energy is not conserved 'globally' throughout the Universe, but is conserved locally, and not the other way round?

What if we were to assume that the velocity of light (c) changes with time in such a way that energy, instead of being locally conserved, is conserved globally'”that is, total energy is conserved in a cosmic volume expanding with the Universe? In other words, what if

Mc² + Nhc/λ = constant

(implying that c increases very slowly to compensate for the comparatively much faster growth in radiation's wavelength λ) holds true for an expanding volume having a mass M and containing N background photons subject to cosmological redshift?
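A toy calculation can make the direction of this effect concrete. The sketch below is not the author's simulation; all magnitudes (M, N, λ, and the stretch factor) are invented placeholders chosen only to show that, under the stated constraint, c must grow slightly as wavelengths stretch:

```python
import math

# Toy illustration of the "global picture" constraint
#     M*c^2 + N*h*c/lam = constant
# All magnitudes below are made-up placeholders, not values from the theory.
h = 6.626e-34    # Planck constant (J*s)
M = 1.0e50       # total rest mass in the co-expanding volume (kg), assumed
N = 1.0e80       # number of background photons, assumed
lam0 = 1.0e-3    # initial photon wavelength (m), assumed
c0 = 2.998e8     # initial speed of light (m/s)

K = M * c0**2 + N * h * c0 / lam0   # the conserved total energy

def c_after_stretch(a):
    """Solve M*c^2 + (N*h/(a*lam0))*c = K for c after every wavelength
    has been stretched by the factor a (positive root of the quadratic)."""
    b = N * h / (a * lam0)
    return (-b + math.sqrt(b * b + 4.0 * M * K)) / (2.0 * M)

# Stretching wavelengths lowers the photon term, so c must grow
# (very slightly, since the matter term dominates) to keep K constant:
print(c_after_stretch(2.0) > c0)   # prints True
```

The design point is simply that the photon term shrinks as λ grows, so the matter term Mc² (and hence c) must compensate, which is the behavior the article ascribes to the 'global picture.'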

Clearly, building our theories on the principle of conservation of energy, we may imagine, as it seems, two mutually exclusive 'pictures' of the expanding Universe.

In the 'local picture' (with c constant), energy is not conserved globally (because of the redshifting of radiation due to the expansion) but is conserved locally.

In the 'global picture' (with c varying over time), energy is conserved globally but is not conserved locally (because the change in light's velocity, c, changes the rest energy of matter).

And, now comes the big surprise. It turns out that the two pictures are NOT mutually exclusive. One picture does not exclude the other!


At first sight, the conclusion that the two widely different pictures are both true appears counterintuitive. And, to be sure, in the classical world of physics, the two pictures would be irreconcilable.

However, in our quantum-physics world, they do not conflict with each other.

The Beginning of the Universe

In fact, a detailed computer simulation of the Universe (with its source code listed on the website) shows that, if both pictures are assumed to be valid, it becomes possible to calculate many quantities previously thought to be determined only by chance. Thus, the theoretical cosmology of the Big Bang turns into a model of Predictive Cosmology.

The pattern that emerges differs radically from present descriptions of the Big Bang. It shows that, at the beginning of time, the Universe popped up, not as an infinitely small and dense point (what physicists call a 'singularity'), but as a single-particle four-dimensional (4D) spacetime bubble with extended space and time dimensions (that is, with a nonzero initial radius and a nonzero initial age).

The initial age of the Universe defines the natural time scale via which age and time are measured. Thus, taking the bubble to have unit dimensions initially means that the Universe was born at the age of t = 1 time unit. (This may be compared to measuring a person's age, not in years, but in units of nine months, the 'age' of a baby when it is born.) The computer simulation of the early Universe shows that this 'natural time unit' corresponds to about 10⁻¹⁹ seconds.

At this point, one may pause to philosophize a little about the beginning of the Universe.

Since the Universe exists, and since astronomical observations indicate that it had a definite beginning, the probability for a symmetry-breaking transition from the perfectly symmetric state of literally nothing to a less symmetric single-particle (and consequently material) Universe cannot be exactly zero.

Consequently, a transition must occur. Since neither space nor time can exist in literally nothing, the spontaneous symmetry breaking must occur at the beginning of time (at t = 1 time unit if the newborn Universe is taken to have unit spacetime dimensions).

[In still simpler terms, the birth of the Universe is discussed in 'A Fairy Tale,' which may be reached by clicking on 'they solve the mystery' in the website's first paragraph or accessed directly via http://www.physicsideas.com/Fairytale.pdf.]



Evolution of the Universe

Four phases occur in this evolution of the early Universe.


In phase 1, the very first particle is originally alone in its Universe. Because forces are mediated by particles (so-called gauge bosons), which do not yet exist, there are no forces acting upon the primordial particle. This massive, neutral, and spinless particle (which was first described by Paul Dirac in 1971) has several remarkable properties.

For instance, it is its own antiparticle (both matter and antimatter, much like today's neutral pion that predominantly decays into two photons) and may annihilate itself in an irreversible process, either directly, or indirectly via an intermediate particle-antiparticle state. 

As the Universe expands, clones of the particle appear on the horizon. (The particles are indistinguishable from each other. Therefore, any particle may be regarded as the original one, in the same way as there is no preferred 'center of the Universe.')

The primordial particles are unstable and begin to decay into radiation in a process that rapidly leads to their complete extinction. (Since the process is irreversible, virtual particles cannot form, and all traces of the original particle will eventually be lost.)

Global conservation of energy implies that the rest energy of matter must increase as radiation constantly loses energy (no change in net energy) because of the redshift caused by the expansion (that is, photons propagating through the expanding space are stretched, creating the cosmological redshift).

In a purely radiative Universe (one consisting exclusively of massless radiation), there is no matter (i.e., particles possessing mass) to counterbalance the continuous decrease in radiation energy due to redshift caused by the expanding Universe.

A continuous loss of energy through photon redshifting, with nothing to counterbalance that energy loss, is a violation of the law of conservation of energy; therefore, when the last primordial particle decays the Universe is forced to rematerialize.

An electric force (which today is no longer active) appears in the early Universe with the sole purpose of recreating matter from radiation. This matter comes in the form of pairs of oppositely charged 'spinless muons' (that is, non-spinning heavy electrons).

The force is theoretically described by quantum electrodynamics (QED), or more precisely by scalar QED, which is the theory for 'spinless electrons,' while spinor QED describes today's spinning electron and its interaction with photons.

[Since spinless electrons do not exist today, scalar QED has no practical use, and neither 'scalar QED' nor 'spinor QED' are currently used terms. Instead of talking about 'spinor QED,' physics textbooks use the simpler 'QED' and often only in passing mention 'scalar electrodynamics' as opposed to 'spinor electrodynamics.']


Now, on to phase 2.


In phase 2, the rematerialized Universe consists of matter and antimatter that annihilate each other. Again, the end of the phase is a forbidden, purely radiative Universe. This time, global energy conservation forces the appearance of the familiar electromagnetic force (one of the four forces of nature, described by spinor QED), whose sole purpose is to transform photons (massless radiation) into electron-positron pairs (matter).


In phase 3, the electron-positron pairs (or electron pairs for short; the positron is a positively charged electron) annihilate each other until only two pairs remain in a sea of almost three billion background photons. (More precisely, the computer simulation predicts the number of photons to be about 2 786 275 000.)

To inhibit the last electron pairs from annihilating each other, the strong force (another one of the forces, described by quantum chromodynamics, or QCD) appears. Its purpose is to ensure the continuous presence of matter in the Universe by transforming the last annihilating electron pair into a new type of matter: a proton-antiproton pair (or proton pair for short).


In phase 4, the energy needed to transform the lightweight electron pair into a heavyweight proton pair has to be taken from the only existing energy source, the background photons. Therefore, the strong force is accompanied by a weak force (the third force of nature), whose purpose is to provide a link between the electromagnetic force and the strong force, and to transfer energy from the photon radiation to the proton pair.

Finally, to inhibit proton-antiproton (p-p−) annihilation, global energy conservation forces the antiproton (p−) to decay back into an electron (e−), thus releasing a large amount of energy. This is the 'Big Bang' that heats the now stable matter (the proton-electron pair) to an initial temperature a thousand times higher than in the blast of a hydrogen bomb.


The table shows the three matter-creating forces that, in turn, saved the early Universe from decaying into pure radiation, a state forbidden by the law of conservation of energy. Today, only the latter two forces exist.

And today's spinning tauon and muon remind us of the spinless tauon and muon that disappeared (at about 10 and 33 natural time units old, respectively) when the Universe was still very young.

The fine-structure constant α (with measured value 1/137.035 9991) characterizes the electromagnetic force.

The charged pion (π+ with antiparticle π−) is a short-lived, strongly interacting particle, a kind of 'spinless proton.' The reason for including the pion pair in the table is that the creation of the proton pair proceeded via an intermediate, very brief, 'pion phase.' Thus, the value of the pion-electron mass ratio (mπ/me) provides vital information about the process in which the proton was initially formed.

The tauon-muon mass ratio (mτ/mμ) and muon-electron mass ratio (mμ/me) tell us how much energy the last spinless tauon and muon had taken up from radiation at the end of phase 1 and phase 2, respectively.

The table gives the measured values of the constants, with the last digit of each value unreliable. The constants should in theory be calculable from first principles, a challenge for theoretical physicists! A failed attempt to calculate the fine-structure constant α is presented in the form of a very simple computer program, with its source code listed on pages 64-65 in the paper.

Also, the ratio mμ/me hasn't yet been calculated from first principles. Instead, its theoretical value discussed below is obtained from the measured value of α, to which it is directly related.


Let's talk in more detail about the muon-electron mass ratio, mμ/me.


The muon-electron mass ratio (mμ/me) belongs to the physical constants that were previously believed not to be calculable. The reason for this belief was that nothing in the QED theory for the electron hints at the existence of a heavy electron (the muon) or a superheavy one (the tauon). Their existence was an enigma that to this day has puzzled physicists. [Read more about the confusion caused by their discovery in 'they solve the mystery' (see below).]

However, in the Predictive Cosmology theory, not only is the muon-electron mass ratio calculable, but its value reflects the history of the early Universe and helps fill in the fine details about the process leading to our present stable Universe.

The ratio's theoretically computed value may be written as a sum of four terms:

mμ/me = 205.759 223 + 1.009 816 − 0.000 830 + 0.000 074 = 206.768 283.

The first term, the zeroth-order value 1/(Bα) = 205.759 223 (with B a numerical constant given by the solution to the momentum equation), is directly connected to the electromagnetic force, which is characterized by the fine-structure constant α = 1/137.036 = 0.007 297.
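As a quick arithmetic sanity check, the four quoted terms do sum to the stated total, and the quoted zeroth-order value fixes the numerical constant B (this check only reproduces the article's arithmetic; it does not derive the terms):

```python
# Arithmetic check of the four-term sum for the muon-electron mass ratio.
alpha = 1 / 137.035999   # fine-structure constant (measured value)

# The four terms quoted in the article:
terms = [205.759223, 1.009816, -0.000830, 0.000074]
ratio = sum(terms)
print(f"{ratio:.6f}")    # prints 206.768283

# The zeroth-order term is written as 1/(B*alpha); solving for B
# from the quoted value gives the implied numerical constant:
B = 1 / (alpha * terms[0])
print(f"B = {B:.4f}")    # roughly 0.666
```

The sum reproduces the theoretical value 206.768 283 exactly, which is the agreement with the CODATA measurement that the article emphasizes.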

The second term, +1.009 816, is a correction that appears when the spinless muon of phase 2 is succeeded by the spinning electron of phase 3. Its value is obtained as a sum of an infinite series of Feynman diagrams in scalar QED.

The third and fourth terms are obtained from electroweak Feynman diagrams, which accompany the appearance of the Higgs bosons and the neutrinos. The two terms are directly connected to the Higgs and neutrino masses, respectively.

The third term, −0.000 830, reveals that the creation of the proton was a two-step process involving four Higgs bosons. In the first step, a single Higgs boson mediates the transfer of energy required to convert the last two electron pairs into a couple of pion pairs of 273 times greater mass. In the second step, the last pion pair is transformed into a 6.7 times heavier proton pair with the aid of three additional Higgs bosons.

The fourth term, +0.000 074, with sign opposite to that of the third term, reflects the fact that the three Higgs bosons supply more energy to the strong force than is needed to create the proton pair. Its value is a measure of the surplus energy that is restored to the background photons by the weak force, which in so doing fulfills its last task and, in the process, transforms into the present, highly complex weak force, a force that has puzzled physicists ever since it was first discovered, as no obvious reason for its existence or complexity had previously been found.

Calculation of the fourth term in both the local picture and the global picture reveals that the life of the last pion was prolonged by the appearance of a parity-switching weak force (interchanging left and right), which by necessity brought with it a superweak CP-violating effect with no purpose. (The effect is observed in kaon decay and implies that a particle and its antiparticle, matter and antimatter, do not always, in every respect, behave as each other's exact opposites.)

The four terms result from well-established physical theories: the fundamental hydrodynamic equation (or momentum equation) and the pure QED and electroweak theories of the SM. Since all four terms are unambiguously defined and no alternative to them can reasonably be conceived, it is virtually impossible that the good agreement (to within 0.025 ppm) between theory and experiment could result from chance.


In summary, the new Predictive Cosmology gives a logically consistent description of the early phases of the Universe. It explains many previously inexplicable features of the elementary particles and the forces acting between them.


To understand the calculations in the paper detailing Predictive Cosmology, only basic knowledge of vector analysis is required. In other words, only the easy part of the theory has so far been developed. It is a vanishingly small part compared with what remains to be done. No doubt, it will take particle physicists many years to work out the full consequences of the incorporation of global energy conservation into the Standard Model (SM) of elementary particle physics.

For astrophysicists, the task may be a little simpler. It should not require too many years of work to develop a reliable model simulating the evolution of the Universe from its early stage (when the bulk of its matter and energy was swallowed by black holes) to its present state.

(Gravity was initially very strong. Today, the gravitational force between electrons is roughly 10⁴⁰ times weaker than the electric force. When the black holes began to form, gravity may have been, say, 10³⁰ times stronger than it is today. Consequently, most of the energy in the initially very dense Universe should have been rapidly trapped in black holes. The weakening of the gravitational force is, according to the space equation, caused by the Universe's expansion from the size of an atom to its present size, with G inversely proportional to the radius R of the Universe.)

Returning to the questions posed near the beginning of the article, they may now be briefly answered:

•    What makes the Universe expand?

Answer: Expansion is the result of an ongoing process in which particles create space at a rate that is proportional to their energy. Also, matter, spin, electric charge, creation of space, and gravity all have a common explanation: They are features inherent in the space equation, which is a particular solution to the fundamental hydrodynamic (or momentum) equation.

•    How was the very first particle created?

Answer: In a spontaneous breaking of the perfect symmetry of literally nothing (where neither space nor time exists).

•    How were today's massive elementary particles first created?

Answer: In transformations of radiation into matter when (global) energy conservation repeatedly forbade the Universe from decaying into massless radiation (a Universe composed solely of radiation (e.g., light) would continuously lose energy via redshifting caused by the expanding Universe).

•    Why are there electromagnetic, strong, and weak forces?

Answer: Because these forces were needed to transform radiation into matter.

•    When did the forces first appear?

Answer: When matter was recreated from radiation.

•    Why didn't matter and antimatter completely annihilate each other in the early Universe?

Answer: Because an expanding matter-free Universe violates the principle of (global) energy conservation.

For additional information on the theory of Predictive Cosmology, please go to 'physicsideas.com,' which has links to the paper ('Paper.pdf'), an easy-to-read introduction ('they solve the mystery'), a discussion of the primordial particle ('Dirac's new equation'), a simulation program written in FORTRAN ('Simulation.for'), and a calculation of neutrino and Higgs masses ('Higgs and neutrino masses').

Source: http://www.physicsideas.com

To read the first of three question-and-answer interviews with the author of this Predictive Cosmology theory, please go to the December 21, 2009 iTWire article "Predictive Cosmology and Standard Model revisited."

Please note: For people interested in discussing Stig's ideas in more detail, please email William Atkins at william.atkins 'at' itwire.com and he will relay the information to Mr. Sundman.


Original article:
August 10, 2009 iTWire.com article 'Predictive Cosmology: Creation's secret revealed in muon-electron mass ratio = 206.768 283' (or, http://www.itwire.com/content/view/26822/1066/)

First interview:
December 21, 2009 iTWire.com article 'Predictive Cosmology and Standard Model revisited' (or, http://www.itwire.com/content/view/30199/1066/)

Second interview:
January 9, 2010 iTWire article 'Q&A Interview, Part 2: Predictive Cosmology and Standard Model revisited' (or, http://www.itwire.com/content/view/30398/1066/)

Third interview:
March 3, 2010 iTWire article 'Q&A Interview, Part 3: Predictive Cosmology and Standard Model revisited' (http://www.itwire.com/science-news/energy/37280-xsm3)

