History’s Biggest Con

The world’s most successful con man is not in finance or politics. He is the scientist who runs the world’s biggest machine. He has defrauded the US taxpayer of tens of billions of dollars, and he’s not done yet.

This is the story of particle physics and its kingpin, Carlo Rubbia.

A Field Forged in Fear

Particle physics is the study of matter and space. Newton and Einstein are the most famous scientists in this field. For centuries, physicists went about their business largely unnoticed by the public. Then came nuclear weapons.

History’s most famous equation was given to us by Einstein: E = mc². To military planners, the equation is important because it says that matter can be converted to pure energy. Prior to World War II, chemical munitions released only about a billionth of that explosive potential. The atom bomb showed that chemical munitions could start a nuclear reaction that achieved a million-fold improvement. A decade later, atom bombs were used to trigger fusion in a hydrogen bomb, achieving another factor-of-forty improvement.

Naturally, after World War II, politicians recognized that particle physicists were the most dangerous people in the world. A single hydrogen bomb can wipe out a city like London. Particle physicists were organized under the Department of Energy and told to find out whether even greater horrors were possible. That mission was sustained by the Cold War competition with the Soviet Union.

This work was done at particle colliders. Over time, these became the world’s largest machines, costing hundreds of millions of dollars to build and operate.

Fortunately for the survival of the human race, by the mid-eighties we knew that the hydrogen bomb was the limit. Everything discovered by the particle colliders was unstable, lasting at most a millionth of a second. However, this was bad for particle physicists. They needed a new marketing message to convince politicians to give them billions so they could keep on building and running colliders.

Given that the researchers were inspired by the prospect of blowing up the world, perhaps we should have expected what came next.

The Final Theory of Everything

Every politician knows that politics is a contest of wills. In the halls of Congress and in the White House, palpable energy is generated by these contests. Politicians know that spirituality is real.

Could that energy be tapped? Well, not according to physics. In fact, Einstein’s theories seemed to prove that spiritual energy couldn’t exist. Remove all the matter from space and there is nothing left.

Physicists knew better. Richard Feynman, the quirky theorist from Caltech, told of going to Princeton to speak before the “Monster Minds.”

This, then, was the pitch: “We know that our theories of matter and space are incomplete. Give us money so that we can find the final theory of everything. Then we’ll know how to harness the power of will.” Now, this was absurd from the start. Will is generated by the human mind, which needs to avoid explosions at all costs. But it worked for a while. Congress is a creature of habit, and it wasn’t too much money, at first. Only a couple of hundred million dollars a year.

Then, in the mid-eighties, came the supercolliders. These were billion-dollar machines. Finally, the international particle physics community banded together into coalitions. In Europe, researchers at CERN promoted an upgrade to their collider. In the US, states competed to host the Superconducting Super Collider. Not surprisingly, George Bush Sr. picked Texas as the winner.

As the price tag went up and up, the particle physics community realized that only one candidate could be built. And this is where the con started – the con that left the US giving billions of taxpayer dollars to CERN.

Nobels Oblige

Alfred Nobel was a Swedish chemist and arms merchant (alas, explosions again) who bequeathed his fortune to fund the Nobel Prize. Winning the Nobel Prize in any science is one of the few ways that a scientist gains public fame. With that stature comes access to the politicians who funnel taxpayer dollars into research. Universities and laboratories, naturally, compete to hire Nobel Prize winners. When they can’t hire them, they try to create them.

Inevitably, the Nobel Prize is a highly political award. It’s not just the ideas that count.

The Nobel Prize for Physics is dominated by fundamental physics. Discovering a new particle or force is almost guaranteed to be followed by an invitation to Stockholm.

Motive: billions of taxpayer dollars for the next particle collider. Opportunity: given that politicians don’t understand a single thing about particle physics, winning a Nobel Prize establishes prestige that could determine the flow of those dollars. Means: the existing collider at CERN. Sounds like a recipe for crime.

Exposing the grift is difficult because particle physicists speak an arcane language. I will try to keep it to a minimum, but to confront the perpetrators of this crime against the American taxpayer, we need to understand some of that language.

As well as particles of matter called fermions, the universe contains fields. These fields come in packets called bosons. Bosons allow matter to interact. As a practical example, when you chew food, the atoms of your teeth are not mechanically breaking the food apart, but generating bosons called photons that break the food apart.

How do physicists prove that they have discovered a new fermion?

The concept is built upon Einstein’s equation, E = mc². To achieve perfect conversion of mass to energy, physicists discovered that they could make antimatter that, when combined with normal matter, annihilates completely.

How to make new kinds of matter? In this regard, the most interesting bosons are the W and Z. Through these so-called weak interactions, any kind of matter can be created. The only requirement is that enough energy exists to run annihilation in reverse. This is called “pair creation.” From the pure energy of the Z, matter and antimatter are created.

To find a new kind of fermion, a collider first manufactures antimatter. It then pushes the antimatter and matter through voltages that add energy of motion, creating beams. Finally, the beams are aimed at an intersection point at the center of a detector. Randomly, annihilation occurs. Both the energy of mass and the energy of motion are available to create new fermions.

The process is rote. Build a collider. Use the acceleration to control the energy of the collisions. Analyze the data coming out of your detectors. When you get to the power limit of your collider, go to Congress and ask for more money.

The challenge is that sometimes beams collide without producing anything interesting, filling your detectors up with noise. Fortunately, there is a specific signal that occurs most frequently when creating a new kind of fermion. The detectors will see two photons moving in opposite directions.

Remember that last fact. When a new kind of fermion is found, we see two photons moving in opposite directions.
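To make that concrete, here is a minimal sketch (illustrative only, with made-up numbers, not taken from any experiment or from CERN’s software) of how two photon measurements are turned into a mass. It uses the standard invariant-mass relation for massless particles, m² = 2·E₁·E₂·(1 − cos θ), in units where c = 1.

```python
from math import cos, pi, sqrt

def diphoton_mass(e1_gev, e2_gev, opening_angle_rad):
    """Invariant mass (GeV) of two photons from their energies and opening angle.

    Standard relation for massless particles (units with c = 1):
        m^2 = 2 * E1 * E2 * (1 - cos(theta))
    """
    m_squared = 2.0 * e1_gev * e2_gev * (1.0 - cos(opening_angle_rad))
    return sqrt(m_squared)

# Illustrative numbers only: two back-to-back photons of 62.5 GeV each
# reconstruct to a 125 GeV object -- the kind of "bump" a detector looks for.
print(diphoton_mass(62.5, 62.5, pi))  # -> 125.0
```

For photons that come out back to back with equal energies, the reconstructed mass is simply the sum of the two energies; the detector looks for a pile-up of events at one particular value of that mass.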

From the start of particle physics until 1987, eight fermions were discovered. The first six showed a definite generational pattern: a light lepton followed by two heavier quarks. The first triad is known as electron, down, and up. The second generation contains muon, strange, and charm. In the third generation, colliders had detected the tau and bottom. The field was racing to find the third member of that generation, the top.

Along the way, there was another important discovery. The weak interactions are weak because the W and Z themselves have large masses. On the way to finding the top, the W and Z were confirmed, at energies of 80 and 93 GeV. (The units are not important. Remember the numbers.) For purposes of understanding the fraud, I emphasize that the W and Z do not produce two-photon signals.

The W and Z results confirmed theoretical predictions, convincing politicians that the field was on a solid footing. For this, Carlo Rubbia was awarded the Nobel Prize in 1984.

I was in the last year of my graduate studies in 1987 when CERN announced the discovery of the top, publishing its claim in Physical Review. One of my thesis advisors, Mary Kay Gaillard, had come to UC Berkeley through CERN. That connection brought researchers from CERN who described the result. I was shocked to hear that the data did not demonstrate the required two-photon signal. Furthermore, the accelerator energy during the study was 346 GeV, exactly twice the sum of the W and Z masses.

I trusted Mary Kay. In my presence, she denounced the evils of nuclear weapons. I went to her and voiced my confusion. How could this be a new particle? It looked like a collection of four weak bosons, exactly at the energy that you would predict.

Her answer amounted to, “Go home little boy. The adults are playing politics.”

As the leader of CERN, winner of the Nobel prize, and lead author of the top paper, Carlo Rubbia was the kingpin of particle physics. And CERN won the competition for the next collider.

Higgsy Pigsy

Let’s return to the political context now. Remember: the Cold War was ending. Everyone knew that no new bomb technology was coming out of particle physics. The goal was now a theory of everything. How long would that political motivation last?

Given the abstractness of the motivation, the field needed a long runway in its next accelerator. This was part of the strategy with the top announcement. The heaviest known quark to that point was the bottom, at 4.2 GeV. That the fraudulent “top” was all the way up at 173 GeV suggested that there was much more to come, if only politicians would fund the work.

The proposed upgrade of CERN was not modest. It set a 20-year goal of attaining a sixty-fold increase in the collider’s power. Bedazzled by Nobel prizes and the pretty pictures produced by taxpayer-funded science propagandists, the politicians were persuaded to comply.

Then came the turn-on in 2012. The machine was ramped up through its energy range, scanning for new particles all the way up to its limit.

Nothing. Zero. Zilch. A ten-billion-dollar boondoggle, funded in no small part by the American taxpayer.

Except then, after a summer spent scanning higher energies, the machine was turned down to 125 GeV. Be clear: this was an energy accessible by the earlier collider. At that energy, the detectors showed a two-photon signal. Detecting this signal is a primary design criterion for every detector. As it occurred at lower energy than the signal announced as the “top,” it must have been known before that study.

Demonstrating their impenetrability to shame, the perpetrators published the 125 GeV signal in Physical Review and announced it as the long-sought “Higgs particle.”

“Really,” I thought, “you are going to double down on your fraud?”

Remember: two photons is the signal for a new particle. The “Higgs” is what the top should have looked like. In fact, by the standards of the field, I should be awarded the Nobel prize for recognizing that it is the top.

Nonetheless, the shameless perpetrators began their pressure campaign. They leaned on the Nobel committee to recognize Peter Higgs, the developer of the field’s minimally coherent theory of particle mass. In the background, Carlo Rubbia, CERN’s prior laureate, went to the funding panels, demanding, “You know, this Higgs is kind of weird. We need more money for another collider.” The Nobel committee, having acceded to the Higgs award, heard of this and protested, “We are about to award the Nobel Prize for this discovery. Is it the Higgs or is it not the Higgs?” Rubbia backtracked.

Only temporarily, however. Read the popular science press and every week you will see a propaganda piece promoting the next collider at CERN. After all, the full-time job of their taxpayer-funded propagandists is to secure funding for that collider.

Omerta

The question, in any massive conspiracy, is how the community maintains discipline. This is a matter of leverage.

You see, university posts in particle physics are not funded directly. They are funded as an add-on to collider construction and operation budgets.

For twenty years, I have been trying to get particle physics out of the rut of superstring theory – a theory that is certifiably insane for its violations of everything that we observe about the universe. In the one instance that I was able to get into dialog with a theorist, I was told “I know that you are right, but if I work with you, I will lose my funding.”

CERN is the only game in town. Anything that does not build to more construction is not funded. Pure and simple, Rubbia is the godfather of particle physics. If you don’t play, he won’t pay.

It is time to stop the grift. The next machine will cost the US taxpayer tens of billions of dollars. Enough is enough. Call your local congresspeople and demand that they investigate and shut this down. We have more pressing problems to worry about.

Einstein and Mental Illness

For more than a century, psychiatrists have been trying to solve mental illness by changing the brain. They have failed, and that failure has harmed the lives of many, many people.

Psychiatry was driven to emphasize the brain because Albert Einstein declared that if we removed matter, space would be empty. This was a death knell for the soul, leading to conceptions that people are just machines. Treating mental illness was therefore like changing a spark plug.

In this paper, I prove that Einstein was wrong. The physical world that we observe is actually more gracefully and accurately explained if space is filled with a lattice of infinitely slippery polygons. Within that sea, there are loops of spirit that become a soul. Loops that attach to the polygons are understood in Einstein’s physics as “charge.” It is through this attachment that the soul connects to matter. Our “minds” are therefore the brain plus our soul.

Mental illness is not just a problem in the brain. It is a problem in the soul. In this new vision of reality, damaging the brain to fix the mind is clearly understood as counterproductive.

The paper is not an easy read. Please, if you know a young or aspiring physicist, get them to look at this. Physical Review X refuses to publish this paper, so I am putting it out to the public through social media. I have explained to PRX that I am trying to clear up a critical public health problem, but the old guard is afraid that they are going to lose their research funding.

Anti-Matter Antidote

On my New Physics tab, I have a set of links that document some important facts that are unexplained by modern particle theory. These aren’t obscure points of experience. Rather, they include facts such as “the proton weighs 50 times as much as it should” and “quasars precede galaxy formation.” They are “first order” facts that should cause every particle theorist to blush in shame.

Experimenters at CERN have now magnified the problem.

The reigning theory of the universe holds that it formed from a super-hot gas – so hot that the very fabric of space contained more energy than the existing particles. As the universe cooled, that energy was converted to particles.

One problem with this theory is that energy is converted to matter through a process called “pair production.” You can’t make only one particle – you have to make two.

Specifically, the particle comes with an “anti-particle” with equal mass and opposite charge. The conundrum is that those particles attract, and when they meet, they annihilate each other. The matter and anti-matter convert back to pure energy.
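As a concrete number (standard textbook physics, not specific to the experiment discussed here): the lightest charged pair is the electron and positron, so making a pair out of pure energy requires at least twice the electron’s rest energy.

```latex
E_{\gamma} \;\ge\; 2\,m_e c^{2} \;\approx\; 1.022\ \text{MeV}
\qquad \text{(a photon converting, near a nucleus, into } e^{+}e^{-}\text{)} .
```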

This leads the physicists to wonder: how did we end up with a universe composed only of matter? In principle, there should be equal amounts of matter and anti-matter, and every solid object should be annihilated.

The answer proposed by the theorists was that matter and anti-matter are slightly different – and most importantly in their stability. Anti-matter must disappear through some unknown process that preserves matter.

The experiment reported today attempted to measure differences between the most important building-block of matter – the proton – and its antiparticle. None was detected.

In consequence, everything created by the Big Bang (or the Expansive Cool – take your pick) should have disappeared a long time ago. There should be no gas clouds, no galaxies, no planets, and no life.

If that’s not a reason to be looking for new theories of fundamental physics, then what would be?

Reconciling Scripture and Evolution

Posted in a discussion of our symbiotic relationship with mites, this summarizes my position succinctly:

The biologists who rely upon strictly biochemical processes of evolution will never be able to calculate rates, because the forcing conditions have been lost in prehistory. I found it interesting to ask “why does every civilization develop the concept of a soul”, and eventually concluded that Darwin was half right: life is the co-evolution of spirit with biological form. The addition of spirit influences the choices made by living creatures, and so changes the rates.

Given this, I went back to Genesis and interpreted it as an incarnation (“The SPIRIT of God hovered over the waters” – and then became God for the rest of the book), with the “days” of creation reflecting the evolution of senses and forms that enabled Spirit to populate and explore the material conditions of its survival (photosensitivity, accommodation of hypotonic “waters above”, accommodation of arid conditions on the “land”, accommodation of seasons with sight (resolving specific sources of light), intelligent species in the waters and air, and mammals on earth (along with man)).

Couple this with the trumpets in the Book of Revelation, which pretty clearly parallel the extinction episodes identified by paleontology – including injection of the era of giant insects – and it looks like science and scripture actually support each other.

The only point of significant disagreement is spirit itself. Given my knowledge of the weaknesses of modern theories of cosmology and particle physics, I found myself considering the possibility of structure inside of the recognized “fundamental” particles. It became apparent to me that it wouldn’t be too difficult to bring spiritual experience into particle physics. To my surprise and delight, I became convinced that this reality is constructed so that love inexorably becomes the most powerful spiritual force.

A Massive Mystery

Quantum Mechanics describes particles as vibrations in time and space. The frequency of the vibration in time (i.e. – when it is) reflects the particle’s energy; the frequency of the vibration in space (i.e. – where it is) reflects its momentum.

In large-scale reality, such as baseballs and buildings, those vibrations are way too small to influence the results of experiments. In studying these “classical” systems, physicists discovered certain mathematical laws that govern the relationship between momentum (p) and energy (E). Believing that these rules should still be manifested in the quantum realm, physicists used them as guidelines in building theories of vibration.
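For readers who want the standard relations behind that statement (textbook quantum mechanics, included here only for reference): the energy is set by the frequency of the vibration in time, and the momentum by the wavenumber of the vibration in space.

```latex
E = \hbar\,\omega , \qquad p = \hbar\,k ,
```

where ω is the angular frequency, k the wavenumber, and ħ is Planck’s constant divided by 2π.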

In Special Relativity, that relationship is (m is the mass of the particle, in units where c = 1):

m² = E² − p²

In the case of electromagnetic waves, we have m = 0. Using a fairly simple mathematical analogy, the equation above becomes a wave equation for the electromagnetic potential, A. An electric field (that drives electricity down a wire) arises from the gradient of the potential; a magnetic field (that causes the electricity to want to turn) arises from the twisting of the potential.
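The “fairly simple mathematical analogy” can be spelled out with the standard substitutions (a textbook step, stated here so the reader can see where the wave equation comes from): energy becomes a time derivative, momentum becomes a space derivative, and setting m = 0 in the relation above gives the wave equation for the potential.

```latex
E \to i\,\frac{\partial}{\partial t} , \qquad
\vec{p} \to -\,i\,\vec{\nabla}
\qquad\Longrightarrow\qquad
\left( \frac{\partial^{2}}{\partial t^{2}} - \nabla^{2} \right) A = 0
\qquad (\hbar = c = 1) .
```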

The contribution of P.A.M. Dirac was to find a mathematical analogy that would describe the massive particles that interact with the electromagnetic potential. When the meaning of the symbols is understood, that equation is not hard to write down, but explaining the symbols is the subject of advanced courses in physics. So here I’ll focus on describing the nature of the equation. Let’s pick an electron for this discussion. The electron is a wave, and so is represented by a distribution ψ.
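For completeness, the textbook form of that free-particle equation is shown below; the γ matrices are the symbols whose full explanation the advanced courses supply, and ψ is the four-component distribution described in the next paragraph.

```latex
\left( i\,\gamma^{\mu}\,\partial_{\mu} - m \right) \psi = 0 , \qquad \mu = 0,1,2,3 .
```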

Physically, the electron is like a little top: it behaves as though it is spinning. When it is moving, it is convenient to describe the spin with respect to the motion. If we point our right thumb in the direction of motion, a “right-handed” electron spins in the direction of our fingers; a “left-handed” electron spins in the opposite direction. To accommodate this, the distribution ψ has four components: one each for right- and left-handed motion propagating forward in time, and two more for propagation backwards in time.

Dirac’s equation describes the self-interaction of the particle as it moves freely through space (without interacting with anything else). Now from the last post, we know that nothing moves freely through space, because space is filled with Dark Energy. But when Dirac wrote his equation, Einstein’s axiom that space was empty still ruled the day, so it was thought of as “self-interaction”. That self-interaction causes the components of the electron to mix according to m, E and p. When the self-interaction is applied twice, we get Einstein’s equation, relating the squares of those terms.

So what does the mass term do? Well, it causes right-hand and left-hand components to mix. But here’s the funny thing: imagine watching the electron move in a mirror. If you hold up your hands in front of a mirror with the thumbs pointed to the right, you’ll notice that the reflection of the right hand looks like your left hand. This “mirror inversion” operation causes right and left to switch. In physics, this is known as “parity inversion”. The problem in the Dirac equation is that when this is applied mathematically to the interaction, the effect of the mass term changes sign. That means that physics is different in the mirror world than it is in the normal world. Since there is no fundamental reason to prefer left over right in a universe built on empty space, the theorists were upset by this conclusion, which they call “parity violation”.

Should they have been? For the universe indeed manifests handedness. This is seen in the orientation of the magnetic field created by a moving charged particle, and also in the interactions that cause fusion in the stars and radioactive decay of uranium and other heavy elements.

But in purely mathematical terms, parity violation is a little ugly. So how did the theorists make it go away? Well, by making the mass change sign in the mirror world. It wasn’t really that simple: they invented another field, called the Higgs field (named after its inventor), and arbitrarily decided that it would change sign under parity inversion. Why would it do this? Well, there’s really no explanation – it’s just an arbitrary decision that Higgs made in order to prevent the problem in the Dirac equation. The mass was taken away and replaced with the Higgs density and a random number (a below) that characterized its interaction with the electron: m ψ was replaced with a H ψ.

Now here’s a second problem: if space was empty, why would the Higgs be expected to have a non-zero strength so that it could create mass for the electron? To make this happen, the theory holds that empty space would like to create the Higgs field out of nothingness. This creation process was described by a “vacuum” potential which says that when the Higgs density is zero, some energy is available to generate a density, until a limit is reached, and then increasing the density consumes energy. So space has a preferred density for the Higgs field. Why should this happen? No reason, except to get rid of the problem in the Dirac equation.
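A minimal sketch of what the last two paragraphs describe, in the simplest textbook form with a single real field (the full Standard Model version has more structure, so treat this as schematic): the mass term is traded for a coupling to the Higgs density, the “vacuum” potential prefers a non-zero density v, and that preferred density supplies the mass.

```latex
m\,\psi \;\to\; a\,H\,\psi , \qquad
V(H) = -\tfrac{1}{2}\,\mu^{2} H^{2} + \tfrac{1}{4}\,\lambda\, H^{4} , \qquad
\frac{dV}{dH} = 0 \;\Rightarrow\; H = v = \frac{\mu}{\sqrt{\lambda}} , \qquad
m = a\,v .
```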

And what about the other spinning particles? Along with the electron, we have the muon, tau, up, down, strange, charm, bottom, top and three neutrinos, all with their own masses. Does each particle have its own Higgs field? Or do they each have their own random number? Well, having one field spewing out of nothingness is bad enough, so the theory holds that each particle has its own random number. But that raises the question: where do the random numbers come from?

So now you understand the concept of the Higgs, and its theoretical motivations.

Through its self-interaction, the Higgs also has a mass. In the initial theory, the Higgs field was pretty “squishy”. What does this mean? Well, Einstein’s equation says that mass and energy are interchangeable. Light is pure energy, and we see that light can be converted into particle and anti-particle pairs. Those pairs can be recombined to create pure energy again in the form of a photon. Conversely, to get high-energy photons, we can smash together particles and anti-particles with equal and opposite momentum, so that all of their momentum is also converted to pure energy (this is the essential goal of all particle colliders, such as those at CERN). If the energy is just right, the photons can then convert to massive particles that aren’t moving anywhere, which makes their decay easier to detect. So saying that the Higgs was “squishy” meant that the colliding pairs wouldn’t have to have a specific energy to create a Higgs particle at rest.

Of course, there’s a lot of other stuff going on when high-energy particles collide. So a squishy Higgs is hard to detect at high energies: it gets lost in the noise of other kinds of collisions. When I was in graduate school, a lot of theses were written on computer simulations that said that the “standard” Higgs would be almost impossible to detect if its mass was in the energy range probed by CERN.

So it was with great surprise that I read the reports that the Higgs discovered at CERN had a really sharp energy distribution. My first impression, in fact, was that what CERN had found was another particle like the electron. How can they tell the difference? Well, by looking at the branching ratios. All the higher-mass particles decay, and the Higgs should decay into the different particle types based upon their masses (which describe the strength of the interaction between the Higgs field and the particles). The signal detected at CERN was a decay into two photons (which is also allowed in the theory). I am assuming that the researchers at CERN will continue to study the Higgs signal until the branching ratios to other particles are known.
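To illustrate the “decays based upon their masses” statement, here is a toy sketch of the tree-level scaling for the fermion channels only: the width into a fermion pair goes as the fermion mass squared, times a color factor and a phase-space factor. It deliberately ignores the W, Z, gluon, and two-photon channels that matter in the real analysis, and it uses rough masses, so the percentages are illustrative rather than predictions.

```python
# Toy comparison of fermionic decay widths for a 125 GeV Higgs-like particle.
# Tree-level scaling (ignoring QCD corrections and running masses):
#   Gamma(H -> f fbar)  ~  N_c * m_f^2 * (1 - 4 * m_f^2 / m_H^2)^(3/2)
# Fermion masses in GeV (approximate); N_c = 3 for quarks, 1 for leptons.

M_H = 125.0

fermions = {
    "b quark": (4.18, 3),
    "tau lepton": (1.777, 1),
    "c quark": (1.27, 3),
    "muon": (0.106, 1),
}

def relative_width(mass, color_factor, m_parent=M_H):
    """Un-normalized tree-level width for decay into a fermion pair."""
    phase_space = (1.0 - 4.0 * mass**2 / m_parent**2) ** 1.5
    return color_factor * mass**2 * phase_space

widths = {name: relative_width(m, nc) for name, (m, nc) in fermions.items()}
total = sum(widths.values())
for name, width in sorted(widths.items(), key=lambda kv: -kv[1]):
    print(f"{name:12s} {width / total:6.1%}")
```

The point of the sketch is only that the heavier the particle, the larger its share of the decays, which is why measuring the branching ratios tests whether the interaction strength really tracks the masses.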

But I have my concerns. You see, after Peter Higgs was awarded the Nobel Prize, his predecessor on the podium, Carlo Rubbia (leader of the collaboration that reported the top particle discovery) was in front of a funding panel claiming that the Higgs seemed to be a bizarre object – it wasn’t a standard Higgs at all, and the funding nations should come up with money to build another even more powerful machine to study its properties. Imagine the concern of the Nobel committee: was it a Higgs or not? Well, there was first a retraction of Rubbia’s claim, but then a recent paper came out saying that the discovery was not a Higgs, but a “techni-Higgs”.

One of the characteristics of the scientific process is that the human tendency to lie our way to power is managed by the ability of other scientists to expose fraud by checking the facts. Nobody can check the facts at CERN: it is the only facility of its kind in the world. It is staffed by people whose primary interest is not in the physics, but in building and running huge machines. That’s a really dangerous combination, as the world discovered in cleaning up the mess left by Ivan Boesky and his world-wide community of financial supporters.

Generative Orders Research Proposal – Part V

Research Program

In this section, we suggest a research program, motivated by a strategy of incremental complexity. The initial steps of the program focus on the characteristics of the lattice. As these are resolved, the parameter space of the theory is reduced, helping to focus analysis of fermion dynamics in the later stages.

The reference model as described suggests that in theories of generative order, many of the intrinsic properties of the standard model may be extrinsic properties of one-dimensional structures. If this is so, ultimately theorists should be able to calculate many of the fundamental constants (G, h, c, α, etc.), and to establish correspondence theorems to existing models of gravitation, quantum electrodynamics (QED), quantum chromodynamics (QCD) and the weak interactions. It is the opinion of the author that the single-component reference model is unlikely to satisfy these requirements.

Conversely, the isotropy of space suggests that a single-component lattice is likely to be sufficient to explain gravitation. The initial work therefore focuses on creation of models that can be used to assess the stability of behavior against segmentation length and interaction models. The program will also help scope the computational resources necessary to analyze fermion kinematics.

The steps in the research program are successively more coarse. Particularly when the program progresses to consideration of fermion kinematics, years of effort could be invested in analysis of a single particle property, such as spin. Considering the history of quantum mechanics and relativity, the entire program can be expected to take roughly a century to complete.

Given the resources available to the proposer, funding of the research program for the first year focuses on two goals:

  1. Re-analysis of the spectra of side-view elliptical galaxies, with the goal of establishing a high-profile alternative to the Hubble expansion.
  2. Identifying research teams that would be capable of, and interested in, pursuing the modeling effort.

A successful effort would culminate in an invitation to a symposium considering new horizons in the theories of cosmology and particle physics.

Modeling Program

Exposure of generative orders to the community of particle theorists is going to result in a large body of objections to the reference model. To avoid these being raised as impediments to obtaining research funding for interested theorists, we list the challenges that must be overcome in elaborating a satisfactory model, and consider possible mechanisms that might lead to the observed behavior of known physical systems.

The program is presented in outline form only. If desirable, elaboration can be provided.

  1. Precession of perihelion – due to “drag” between the inconsistent lattice configurations of bound gravitational bodies. Explore parameterization of lattice structure – sub-unit modes, lattice shear and compressibility. Again a fairly approachable study that could stimulate further work by demonstrating feasibility of explanations of large-scale phenomena, with correspondence to parameterization at the lattice scale. Success would begin to break down the assumption that a preferred reference frame is disproven by observations that support Einstein’s theories of Special and General relativity.
  2. Dynamics of the formation of galactic cores, parthenogenesis, black hole structure – Relative energetics of lattice cohesion vs. encapsulation of lower-dimension structures. Initial density and uniformity of 1-dimensional structures (also considering observed smoothness of lattice compression – i.e. “dark energy” distribution). Success would be a clear demonstration of worthiness as an alternative to the Big Bang theory.
  3. Superfluid transport of particles through the lattice.
    1. Fiber characteristics
    2. Intrinsic angular momentum
    3. Virtual photon / gluon emission as an analog of lattice disruption
    4. Conservation of momentum
    5. Theory of kinetic energy (scaling of lattice distortion vs. particle velocity)
    6. Effect of sub-unit distortion on particle propagation
  4. Gravitation/QED/QCD
    1. Equivalence of gravitational and kinetic masses
    2. Electric charge signs. Note that 2 sub-units with 1 thread each are not equivalent to one subunit with 2 threads.
    3. Thread dynamics and interaction with lattice
  5. Lattice distortion and correspondence to quantum-mechanical wave function
    1. Pauli exclusion principle
    2. Wave-particle duality (theory of diffraction)
    3. Hydrogen energy levels
    4. Wave-function collapse
  6. Weak interactions
    1. Thread transfer processes
    2. Temporary creation of unstable higher-dimensional structures.
  7. Theory of light
    1. Electric field as a mode of coupling due to lattice disrupted by thread oscillations
    2. Magnetic fields as a special mode of particle coupling with a lattice distorted by motion of threads
    3. Speed of light
    4. Light decay during lattice propagation
    5. Theory of microwave background radiation (lattice relaxation or light decay)
  8. Theory of anti-particles
    1. Sub-unit chirality and lattice coherence
    2. Annihilation as synthesis of higher-order structures
    3. Pair production as decomposition of higher-order structures
    4. Meson theory

Generative Orders Research Proposal – Part II

Assessment of GI Theories

The principle of gauge invariance has underpinned development of theoretical physics for almost a century. Application of the principle is conceptually simple: experimental data is analyzed to propose “invariants” of physical systems. The (generally simple) equations that describe these properties (energy, momentum, mass, field strength) are then subjected to classes of transformations (rotations, velocity changes, interactions with other particles), and the equations are manipulated until the proposed invariants are maintained under all transformations.
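The simplest concrete instance of that procedure is the electromagnetic one (standard textbook material, shown only to make the abstract description tangible): the potential can be shifted by the gradient of any function χ, the electron wave picks up a compensating phase, and every equation built from the “covariant” derivative keeps its form.

```latex
A_{\mu} \;\to\; A_{\mu} + \partial_{\mu}\chi , \qquad
\psi \;\to\; e^{\,i q \chi}\,\psi , \qquad
D_{\mu}\psi = \left( \partial_{\mu} - i q A_{\mu} \right) \psi
\;\to\; e^{\,i q \chi}\, D_{\mu}\psi .
```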

Having recognized gauge invariance as an organizing principle, theorists beginning with Einstein have sought to construct “Grand Unified Theories” that unite all of the invariants in a single framework. There is no fundamental reason for this to be so – the proposition was motivated by reasonable success in explaining experimental studies performed at particle colliders.

Two kinds of difficulties have arisen for the proponents of these theories. First, particle colliders have become enormously expensive and time-consuming to construct. That has been ameliorated somewhat by the introduction of astrophysical data, and attempts to connect the history of the early universe to the properties of the gauge theory. In the interim, however, the enormously prolific imaginations of the particle theorists were insufficiently checked by experimental data. This led them to emphasize numerical tractability in constructing their theories.

Given this situation, we should perhaps not have been surprised to learn that as astrophysics observatories proliferated, the theorists faced intractable difficulties in reconciling predictions with data.

As if these problems were not serious enough, the focus on explaining observations of the behavior of particles under unusual conditions has led to a certain myopia regarding the intractability of what I would call “first order” phenomena: things that are obvious to us in our every-day lives, but have yet to be satisfactorily explained by theory.

We continue with an enumeration of defects.

The Supremacy of Formalism

The program for constructing a Grand Unified Theory of physics is a theoretical conceit from the start. There is no a priori reason to expect that summing the effects of independent forces is not a satisfactory and accurate means of describing the universe.

Once the program is undertaken, however, every term in every equation falls under the microscope of formal criteria. For example, the Higgs field was motivated as a means of restoring parity invariance to the Dirac equation. Similarly, sparticles were introduced to eliminate the distinction between particles with Bose and Fermi statistics. The “strings” of super-string theory were invented to cut off integrals that produce infinities in calculations of particle kinematics. Although these innovations are sufficient to achieve consistency with phenomenology, there is absolutely no experimental evidence that made them necessary. They were motivated solely by abstract formal criteria.

The tractability of formal analysis also has a suspicious influence over the formulation of particle theories. The dynamics of two-dimensional “strings” in super-string theory are susceptible to Fourier analysis. However, Fourier modes are normally far-field approximations to more complex behavior in the vicinity of three-dimensional bodies. In a three-dimensional manifold such as our reality, it would seem natural that particles would manifest structure as toroids, rather than as strings. Unfortunately, the dynamics of such structures can be described only using computational methods, making them an inconvenient representation for analysis.

Finally, while the Large Hadron Collider (LHC) is now marketed principally as a Higgs detector, the original motivation for its construction was a formal problem in particle kinematics: the Standard Model predicted that certain reaction probabilities would exceed unity in the vicinity of momentum transfers of 1 TeV. Something truly dramatic was expected from the experimental program, which, at least from the press reports, appears not to have manifested.

Violations of Occam’s Razor

A widely held principle in the development of scientific theory, Occam’s Razor recognizes that a simpler theory is more easily falsified than a complex theory, and so should be preferred as a target for experimental verification. By implication, theorists faced with unnecessary complexity (i.e. – complexity not demanded by phenomenology) in their models should be motivated to seek a simpler replacement.

The most egregious violation of the principle is the combinatorics of dimensional folding in “big bang” cosmologies derived from super string theories. There are tens of millions of possibilities, with each possibility yielding vastly different formulations of physical law. In recent developments, the Big Bang is considered to be the source of an untold number of universes, and we simply happen to be found in one that supports the existence of life.

The Higgs as a source of mass is also an apparent superfluity. In the original theory, each particle had a unique coupling constant to a single Higgs field. The number of parameters in the theory was therefore not reduced. More recently, theorists have suggested that there may be multiple Higgs fields, which is certainly no improvement under the criteria of Occam’s Razor.

The vastly enlarged particle and field menageries of GUTs are also suspicious. There are roughly ten times as many particles and fields as are observed experimentally; the addition of seven extra spatial dimensions is also of concern.

Unverifiable Phenomena

Particularly in the area of cosmology, the theories take fairly modest experimental results and amplify them through a long chain of deduction to obtain complex models of the early universe. Sadly, many of the intermediate steps in the deduction concern phenomena that are not susceptible to experimental verification, making the theories unfalsifiable.

The point of greatest concern here is the interpretation of the loss of energy by light as it traverses intergalactic space. In the reigning theory, this is assumed to be due to the special relativistic “red shift” of light emitted from sources that are moving away from the Earth at a significant fraction of the speed of light. Of course, no one has ever stood next to such an object and measured its velocity. In fact, the loss of energy is interpreted (circularly) as proof of relative motion.
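For reference, the measurement itself is a wavelength ratio; the velocity only enters when that ratio is interpreted through the relativistic Doppler formula (standard definitions, stated here so the circularity claim can be weighed on its own terms):

```latex
1 + z \;=\; \frac{\lambda_{\text{observed}}}{\lambda_{\text{emitted}}}
\;=\; \sqrt{\frac{1+\beta}{1-\beta}} , \qquad \beta = \frac{v}{c} , \qquad
v \approx c\,z \ \ \text{for small } z .
```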

The “red shift” interpretation is the principal justification of the “Big Bang” theory, which again is a phenomenon that cannot be directly verified. There are difficulties in the theory concerning the smoothness and energy density of the observable universe. These are purported to be side-effects of “inflationary” episodes driven by symmetry breaking of Higgs-like fields. No demonstrated vacuum potential manifests a sufficient number of e-foldings of space, and the relevant energy scales are many orders of magnitude beyond the reach of our experimental facilities.

Finally, the 2011 Nobel prize was awarded for studies that indicated that the expansion of the universe is accelerating. The inference was achieved by looking at the spectra of distant light sources, and determining that they no longer followed the predictions of Hubble’s law. However, extrapolating from those measurements into the distant future is troubling, as even in the context of the Big Bang model, this opens the door to additional effects that may mitigate or reverse the predicted acceleration. Obviously, since these effects would occur on time-scales exceeding the existence of the Earth (which will be vaporized when the sun’s photosphere expands), they will never be verified.

Axiomatic Contradictions

As discussed in the previous section, the lack of an explanation for mass verges on an axiomatic need in the theory. That is to say, it appears to require a fundamental reevaluation of the abstract principles used to construct physical theories.

There are at least two other phenomena that directly violate fundamental axioms in the theory. The first is the existence of non-uniformity in the structure of space-time that is not associated with matter (so-called “dark energy”). Special relativity and all of its dependent theories (i.e. – all of particle physics) rest upon the assumption that space is empty. In the era in which special relativity was formulated, evidence (the Michelson-Morley experiment) suggested that there was no “luminiferous ether” – no medium in which electromagnetic radiation propagated. Dark energy is in fact an ether, and its existence requires a dynamical explanation for the Michelson-Morley results. (It is my opinion that this is why Einstein called the vacuum energy the worst idea he ever had – the existence of such a term undermines all of special and general relativity).

Finally, the work of the Princeton Engineering Anomalies Research team demonstrated couplings between psychological states and the behavior of inanimate objects that are outside of the modes of causality allowed by existing physical theory. The rejection of these findings by the mainstream physics community indicates that accommodating these findings is going to require rethinking of the axioms of the theory. The most extreme examples concern the structure of time – the standard model allows non-linear causality only at quantum scales, and some studies of the “paranormal” appear to indicate non-causal behavior (information preceding effects) on macroscopic scales.

Sorry to Get All Technical on You…

To this point, I’ve been writing about spirituality with a certain confident imprecision. That confidence is backed by a model of physics that I am fairly confident can overcome many of the difficulties in modern particle theory. I wrote a Templeton Fund proposal a couple of years back, and sent it around to my erstwhile peers in the community. Response was tepid, at best.

Having published The Soul Comes First, I’m getting ready to put the research proposal back around in the community. I thought that it wouldn’t hurt to serialize it first here, as that may reach people with an interest in these matters that I can’t contact directly. I’ll start that tonight. It will run for the next two weeks. Then I’ll get back to moral philosophy, starting with the matter of death.

This will be fairly technical. Any of you readers that know some science buffs, you might have fun getting them to read through it.