Generative Orders Research Proposal – Part II

Assessment of GI Theories

The principle of gauge invariance has underpinned the development of theoretical physics for almost a century. Application of the principle is conceptually simple: experimental data is analyzed to propose “invariants” of physical systems. The (generally simple) equations that describe these properties (energy, momentum, mass, field strength) are then subjected to classes of transformations (rotations, velocity changes, interactions with other particles), and the equations are manipulated until the proposed invariants are preserved under all transformations.
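
As a standard textbook illustration (electromagnetism, quoted here only to fix ideas), the potential may be shifted by the gradient of an arbitrary smooth function without altering the measurable field strength:

$$ A_\mu \rightarrow A_\mu + \partial_\mu \chi, \qquad F_{\mu\nu} = \partial_\mu A_\nu - \partial_\nu A_\mu \ \text{(unchanged)}. $$

Here the field strength (and hence the electric and magnetic fields) is the invariant; demanding that it survive the transformation constrains the form of the equations.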

Having recognized gauge invariance as an organizing principle, theorists beginning with Einstein have sought to construct “Grand Unified Theories” that unite all of the invariants in a single framework. There is no fundamental reason for this to be so – the proposition was motivated by reasonable success in explaining experimental studies performed at particle colliders.

Two kinds of difficulties have arisen for the proponents of these theories. First, particle colliders have become enormously expensive and time-consuming to construct. That has been ameliorated somewhat by the introduction of astrophysical data, and by attempts to connect the history of the early universe to the properties of the gauge theory. In the interim, however, the enormously prolific imaginations of the particle theorists were insufficiently checked by experimental data. This led them to emphasize mathematical tractability in constructing their theories.

Given this situation, we should perhaps not have been surprised to learn that, as astrophysical observatories proliferated, the theorists faced intractable difficulties in reconciling predictions with data.

As if these problems were not serious enough, the focus on explaining the behavior of particles under unusual conditions has led to a certain myopia regarding the intractability of what I would call “first order” phenomena: things that are obvious to us in our everyday lives, but have yet to be satisfactorily explained by theory.

We continue with an enumeration of defects.

The Supremacy of Formalism

The program for constructing a Grand Unified Theory of physics is a theoretical conceit from the start. There is no a priori reason why summing the effects of independent forces should not be a satisfactory and accurate means of describing the universe.

Once the program is undertaken, however, every term in every equation falls under the microscope of formal criteria. For example, the Higgs field was motivated as a means of preserving the chiral gauge invariance of the electroweak theory while endowing particles with mass. Similarly, sparticles were introduced to eliminate the distinction between particles obeying Bose and Fermi statistics. The “strings” of super-string theory were invented to cut off integrals that produce infinities in calculations of particle kinematics. Although these innovations are sufficient to achieve consistency with phenomenology, no experimental evidence made them necessary. They were motivated solely by abstract formal criteria.
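
To make the formal motivation concrete (a standard textbook point, supplied here only for illustration): a bare Dirac mass term couples left- and right-handed components, which transform differently under the electroweak symmetry, so the term is forbidden; the Yukawa coupling to the Higgs doublet Φ is the formal device that restores invariance:

$$ m\,\bar{\psi}\psi = m\left(\bar{\psi}_L \psi_R + \bar{\psi}_R \psi_L\right) \ \text{(not invariant)}, \qquad y\,\bar{\psi}_L \Phi\, \psi_R + \text{h.c.} \ \text{(invariant)}. $$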

The tractability of formal analysis also has a suspicious influence over the formulation of particle theories. The dynamics of the one-dimensional “strings” of super-string theory (which sweep out two-dimensional world-sheets) are susceptible to Fourier analysis. However, Fourier modes are normally far-field approximations to more complex behavior in the vicinity of three-dimensional bodies. In a three-dimensional manifold such as our reality, it would seem natural for particles to manifest structure as toroids, rather than as strings. Unfortunately, the dynamics of such structures can be described only using computational methods, making them an inconvenient representation for analysis.
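
For reference, the tractability in question is visible in the standard mode expansion of an open-string coordinate, which is an exact Fourier series (quoted here in its textbook form, with α′ the string scale):

$$ X^\mu(\tau,\sigma) = x^\mu + 2\alpha' p^\mu \tau + i\sqrt{2\alpha'}\,\sum_{n\neq 0} \frac{\alpha_n^\mu}{n}\, e^{-in\tau}\cos(n\sigma). $$

No comparably closed-form expansion is available for the oscillations of a toroidal structure coupled to a field; those must be computed numerically.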

Finally, while the Large Hadron Collider (LHC) is now marketed principally as a Higgs detector, the original motivation for its construction was a formal problem in particle kinematics: without the Higgs, the Standard Model predicted that scattering probabilities would exceed unity (a violation of unitarity) in the vicinity of momentum transfers of 1 TeV. Something truly dramatic was expected from the experimental program, which, at least from the press reports, appears not to have materialized.
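
The formal problem can be stated compactly: in the absence of a Higgs-like exchange, the tree-level amplitude for longitudinal W-boson scattering grows with the square of the center-of-mass energy,

$$ \mathcal{M}\left(W_L W_L \rightarrow W_L W_L\right) \sim G_F E^2, $$

so the probabilities computed from it approach unity as E nears the TeV scale (numerical factors place the bound near 1–1.7 TeV). Something new – a Higgs boson or otherwise – was therefore expected to appear in that energy range.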

Violations of Occam’s Razor

A widely held principle in the development of scientific theory, Occam’s Razor recognizes that a simpler theory is more easily falsified than a complex theory, and so should be preferred as a target for experimental verification. By implication, theorists faced with unnecessary complexity (i.e. – complexity not demanded by phenomenology) in their models should be motivated to seek a simpler replacement.

The most egregious violation of the principle is the combinatorics of dimensional folding in “big bang” cosmologies derived from super-string theories. There are tens of millions of possibilities, each yielding a vastly different formulation of physical law. In recent developments, the Big Bang is considered to be the source of an untold number of universes, and we simply happen to inhabit one that supports the existence of life.

The Higgs as a source of mass is also an apparent superfluity. In the original theory, each particle had a unique coupling constant to a single Higgs field. The number of parameters in the theory was therefore not reduced. More recently, theorists have suggested that there may be multiple Higgs fields, which is certainly no improvement under the criteria of Occam’s Razor.
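
The bookkeeping makes the point plain: each fermion mass is simply traded for a Yukawa coupling of its own,

$$ m_f = \frac{y_f\, v}{\sqrt{2}}, $$

where v ≈ 246 GeV is the vacuum expectation value of the Higgs field. One free mass parameter per particle becomes one free coupling per particle, plus v; nothing has been explained, merely relabeled.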

The vastly enlarged particle and field menageries of GUTs are also suspicious. There are roughly ten times as many particles and fields as are observed experimentally; the addition of seven extra spatial dimensions is also of concern.

Unverifiable Phenomena

Particularly in the area of cosmology, the theories take fairly modest experimental results and amplify them through a long chain of deduction to obtain complex models of the early universe. Sadly, many of the intermediate steps in the deduction concern phenomena that are not susceptible to experimental verification, making the theories unfalsifiable.

The point of greatest concern here is the interpretation of the loss of energy by light as it traverses intergalactic space. In the reigning theory, this is attributed to the relativistic “red shift” of light emitted from sources that are receding from the Earth at a significant fraction of the speed of light. Of course, no one has ever stood next to such an object and measured its velocity. In fact, the loss of energy is itself interpreted (circularly) as proof of the relative motion.
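
Stated explicitly, the chain of inference runs from a measured wavelength shift to a velocity to a distance:

$$ z = \frac{\lambda_{\text{obs}} - \lambda_{\text{emit}}}{\lambda_{\text{emit}}}, \qquad v \approx cz \ \ (z \ll 1), \qquad v = H_0 d, $$

where H_0 is the Hubble constant. For cosmological sources, the recession velocity is never measured directly; it is inferred from the shift itself, which is the circularity noted above.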

The “red shift” interpretation is the principal justification of the “Big Bang” theory, which again describes a phenomenon that cannot be directly verified. There are difficulties in the theory concerning the smoothness and energy density of the observable universe. These are purported to be side-effects of “inflationary” episodes driven by symmetry breaking of Higgs-like fields. No experimentally demonstrated vacuum potential produces a sufficient number of e-foldings of space, and the relevant energy scales are many orders of magnitude beyond the reach of our experimental facilities.
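
For reference, the number of e-foldings N measures the growth of the cosmic scale factor a(t) during the inflationary episode,

$$ N = \ln\frac{a(t_{\text{end}})}{a(t_{\text{start}})} = \int_{t_{\text{start}}}^{t_{\text{end}}} H\,dt, $$

and resolving the smoothness (horizon) and flatness problems is generally held to require N ≳ 60.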

Finally, the 2011 Nobel Prize was awarded for studies indicating that the expansion of the universe is accelerating. The inference was achieved by looking at the spectra and brightnesses of distant light sources, and determining that they no longer follow the predictions of Hubble’s law. However, extrapolating from those measurements into the distant future is troubling: even in the context of the Big Bang model, the door is open to additional effects that may mitigate or reverse the predicted acceleration. Obviously, since these effects would occur on time-scales exceeding the existence of the Earth (which will be vaporized when the sun’s photosphere expands), they will never be verified.
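
The deviation in question is conventionally expressed through the deceleration parameter q_0 in the low-redshift expansion of the luminosity distance,

$$ d_L = \frac{cz}{H_0}\left[1 + \tfrac{1}{2}\left(1 - q_0\right)z + \cdots\right], \qquad q_0 \equiv -\left.\frac{\ddot{a}\,a}{\dot{a}^2}\right|_{t_0}, $$

with the supernova data favoring q_0 < 0 (acceleration). Extrapolating a parameter fitted to past light arbitrarily far into the future is the step questioned here.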

Axiomatic Contradictions

As discussed in the previous section, the lack of an explanation for mass rises to the level of an axiomatic problem in the theory. That is to say, it appears to require a fundamental reevaluation of the abstract principles used to construct physical theories.

There are at least two other phenomena that directly violate fundamental axioms of the theory. The first is the existence of non-uniformity in the structure of space-time that is not associated with matter (so-called “dark energy”). Special relativity and all of its dependent theories (i.e. – all of particle physics) rest upon the assumption that space is empty. In the era in which special relativity was formulated, evidence (the Michelson-Morley experiment) suggested that there was no “luminiferous ether” – no medium in which electromagnetic radiation propagated. Dark energy is in fact an ether, and its existence requires a dynamical explanation for the Michelson-Morley results. (It is my opinion that this is why Einstein called the cosmological constant his “biggest blunder” – the existence of such a term undermines all of special and general relativity.)
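
For context, the null result at issue can be quantified: for interferometer arms of length L moving at speed v through a hypothetical ether, the expected fringe shift upon rotating the apparatus through 90° is approximately

$$ \Delta N \approx \frac{2L}{\lambda}\,\frac{v^2}{c^2}, $$

about 0.4 fringes for the 1887 apparatus (L ≈ 11 m, λ ≈ 500 nm, v ≈ 30 km/s, the Earth’s orbital speed). Michelson and Morley observed no shift within their sensitivity, and any ether-like dark energy must explain why.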

Finally, the work of the Princeton Engineering Anomalies Research team demonstrated couplings between psychological states and the behavior of inanimate objects that lie outside the modes of causality allowed by existing physical theory. That the mainstream physics community has rejected these findings suggests that accommodating them will require rethinking the axioms of the theory. The most extreme examples concern the structure of time – the standard model allows non-linear causality only at quantum scales, while some studies of the “paranormal” appear to indicate non-causal behavior (information arriving before its cause) on macroscopic scales.

Generative Orders Research Proposal – Part I

Summary

The author proposes to form research partnerships to develop a conceptual model of fundamental physics that has the potential to place spirituality on a firm scientific basis. The motivation for the scientific program is well-grounded in phenomenology, and the author outlines how established theories arise as limiting cases of the proposed model.

The author recognizes the gnostic implications of the program for society. Certain rules of engagement must be observed in such work – historically, the first application of any new technology is to obtain competitive advantage. To guard against such outcomes, the author has written a book that explains, in layman’s terms, the disciplines required to engage these principles safely, and the long-term personal and global consequences of failing to observe them. The proposal includes support for updating the book, entitled “Love Works”, and for its professional preparation and publication.

The proposal does not seek to resolve all questions regarding the proposed class of theories. It is intended to stimulate thinking that should lead to independent research and funding by the research community. The proposal does include publication of a single paper that may demonstrate a significant point of departure from current models of particle physics and cosmology. Successful publication should stimulate “out-of-the-box” thinking by the research community, followed by independent research proposals.

Qualifications

The author’s principal qualifications for this work are selflessness and a commitment to Life in all of its forms. Many of the ideas presented were formulated through engagement with forms of sentience not recognized by many scientists.

With regard to the fundamental physics, the author received his Ph.D. in high-energy particle physics in 1987 and was active as a post-doctoral research fellow until 1992. Most of the conceptual underpinnings of modern particle theory and cosmology were developed during this period, and having observed that development at first hand, the author is well-suited to recognize their shortcomings.

However, the author recognizes his limitations with regard to the skill-set of the modern particle theorist, including large-scale numerical modeling. The author will therefore develop relationships at institutions with large-scale computational physics programs that can collaborate in this work.

Motivations

The principal motivation for this work is to heal the divide between science and religion, a divide that promotes fear, anxiety, anger and apathy in those confronted with the enormous global challenges of the 21st century. The author believes that science is a process of revelation that can embolden and empower those with a genuine desire to serve the healing of the world. Religion is concerned with the development of disciplines that enable us to work safely with the requisite spiritual energies.

While fostering spiritual maturity is critical to a successful execution of the overall program, the development of supporting resources is fairly well advanced. (The author has published his moral and ethical philosophy at www.everdeepening.org; Love Works is a popularization aimed at the culturally dominant community of Christian believers.) The author considers publication of Love Works to be a critical adjunct, and will not pursue the scientific program separately.

Plan of Exposition

Love Works is provided as an attachment for evaluation. The focus of the exposition will therefore be to motivate and describe the scientific program. The scope of the development described is far greater than is necessary to complete the work of the first year: because the proposed class of theories, characterized here as theories of “Generative Order” (GO), stands as an alternative to theories that have had thousands of man-years invested in their development, it is important to establish plausible paths of investigation for the obvious problems that must be overcome.

To motivate that alternative, the discussion begins with an enumeration of the failures of the prevailing class of theories, characterized here as theories based upon “Gauge Invariance” (GI).

Every physical theory has a set of fundamental constants. In current theories, these include the speed of light, the particle masses, Planck’s constant, and the strengths of the fundamental forces. The principal challenge in qualifying theories of Generative Order is determining the number and values of those constants. The exposition proposes a series of modeling problems that could be undertaken to evaluate a specific theory and determine its constants. Each modeling problem addresses a critical issue in establishing that a theory of Generative Order yields the current theory as a limiting case (just as Relativity and Quantum Mechanics have Newtonian physics as a limiting case).
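
The sense of “limiting case” intended here is the standard one; for example, the relativistic energy reduces to the Newtonian rest-plus-kinetic form when v ≪ c:

$$ E = \frac{mc^2}{\sqrt{1 - v^2/c^2}} = mc^2 + \frac{1}{2}mv^2 + \mathcal{O}\!\left(\frac{v^4}{c^2}\right). $$

A candidate theory of Generative Order would analogously be required to reproduce the gauge-theoretic constants and equations in an appropriate limit.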