Supermassive Black Holes

A new study indicates that supermassive black holes did not form through slow accretion onto normal black holes, but rather early in the evolution of the universe, in some unknown, cataclysmic process.

This contradicts the “Big Bang” theory, but is expected in a physics of Generative Orders (see points 7 and 8 of the “Reference Model”).

Getting in Line

More than a decade ago, I proposed the idea that the universe is composed of one-dimensional structures. My motivations for seeking an alternative to the reigning standard model of physics, along with a fifty-year research program, were published as the Generative Orders Research Proposal (follow the New Physics link at the top of this blog). The idea is now making its way into the physics journals. (Did the Universe Begin as a Simple 1-D Line?)

What’s curious is that the Live Science report on the work is headed with a graphic that summarizes the reigning inflationary model of the early universe (still commonly referred to as the “Big Bang” model).

It’s nice to see the basic concepts of Generative Orders gaining traction – it moves us one step closer to a reconciliation of science and spirituality.

Einstein is So 20th Century

Arguably the greatest physicist of the two centuries between Newton and Einstein was the Scotsman James Clerk Maxwell. Maxwell made fundamental contributions to thermodynamics – the study of how gases, liquids and solids change as ambient conditions (such as temperature and pressure) change, and of how to convert heat to work. One of the results was an understanding of the propagation of sound waves through the air. But Maxwell also applied the mathematics of differential equations to create a unified theory of electricity and magnetism. These are the famous “Maxwell’s Equations” that predict the existence of electromagnetic waves, which we see as “light”.
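For reference, a minimal modern statement of the vacuum form of the equations (in the vector notation later popularized by Heaviside), and the wave equation that follows from them:

```latex
\nabla \cdot \mathbf{E} = 0, \qquad
\nabla \cdot \mathbf{B} = 0, \qquad
\nabla \times \mathbf{E} = -\frac{\partial \mathbf{B}}{\partial t}, \qquad
\nabla \times \mathbf{B} = \mu_0\varepsilon_0 \frac{\partial \mathbf{E}}{\partial t}
```

Taking the curl of either curl equation yields

```latex
\nabla^2 \mathbf{E} = \mu_0\varepsilon_0 \frac{\partial^2 \mathbf{E}}{\partial t^2},
\qquad c = \frac{1}{\sqrt{\mu_0\varepsilon_0}} \approx 3 \times 10^8\ \mathrm{m/s}
```

a wave travelling at a speed that matched the measured speed of light.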

Maxwell saw the analogy between electromagnetic waves and water and sound waves. Being steeped in a mechanical analysis of the world, he was unsatisfied with his abstract mathematical theory, and invested time in building a mechanical model of the “luminiferous ether” – the medium in which light waves were thought to travel. Having spent years studying his equations and their predictions, I am fascinated by claims of his success. It’s a magical world in which the linear motion of charges creates rotary magnetic effects. My understanding is that the model was not simple, but contained complex systems of interlocking gears.

Now Maxwell’s work was not merely a curiosity – it was the basis for the design of communication networks that broke down distances with the enormous speed of light. More than anything else, this has brought us into each other’s lives and helped to create the sense that we are one human family. (The social and psychological reaction to that reality is complex, and we’re still growing into our responsibilities as neighbors. In The Empathic Civilization, Jeremy Rifkin offers a hopeful analysis of the transition.)

So the world of scientific inquiry hung on Maxwell’s words, and in America two physicists, Michelson and Morley, designed an experiment to detect the presence of the ether. If the ether filled all of space, the Earth must be moving through it, and so the speed of light should change depending upon the motion of the observer through the medium. The analogy was with water waves: an observer moving along with a water wave doesn’t experience its disturbance, while one moving against it feels the disturbance enhanced. This is an example of the Newtonian rule for changing reference frames: velocities simply add.

Since the Earth orbits the sun, light emitted from the Earth in a specific direction relative to the sun should have a different speed at different times of the year. To test this hypothesis, Michelson and Morley built an interferometer, a sensitive instrument that compared the speed of light travelling in two perpendicular directions. As the Earth varied its motion through the ether, the interference pattern of dark and light fringes on a screen was expected to shift slowly. Strangely, the result was negative: the pattern did not change.
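The expected size of the effect is easy to estimate. A minimal sketch, using the standard textbook numbers for the 1887 apparatus (effective arm length L ≈ 11 m, orbital speed v ≈ 30 km/s, so v/c ≈ 10⁻⁴, visible light of wavelength λ ≈ 550 nm): the round-trip time difference between the arms, and the fringe shift N expected on rotating the apparatus by 90°, are

```latex
\Delta t \approx \frac{L}{c}\,\frac{v^2}{c^2}, \qquad
N \approx \frac{2L}{\lambda}\,\frac{v^2}{c^2}
  = \frac{2\,(11\ \mathrm{m})}{550\ \mathrm{nm}}\,(10^{-4})^2 \approx 0.4
```

A shift of 0.4 fringe was well within the instrument’s sensitivity; the shift actually observed was far smaller.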

The conclusion was that there was no ether. This was a real crisis, because Maxwell’s Equations don’t behave very well when trying to predict the relationship between observations made by people moving at different speeds. To understand just how terrible this is, consider: in Maxwell’s theory, a charge moving through empty space creates a rotary magnetic field. But what if the observer is moving along with the charge? The charge no longer appears to move, so the magnetic field disappears. How can that be possible?
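The paradox can be put in a single formula. In the low-velocity approximation, the magnetic field of a point charge q moving with velocity **v** is

```latex
\mathbf{B} = \frac{\mu_0}{4\pi}\,\frac{q\,\mathbf{v} \times \hat{\mathbf{r}}}{r^2}
```

Because **B** is proportional to **v**, an observer riding along with the charge (for whom v = 0) measures no magnetic field at all, while an observer watching the charge pass by measures a field circulating around its line of motion.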

This was the challenge taken up by the Dutch physicist Hendrik Lorentz. He analyzed the mechanical properties of rulers and clocks, which are of course held together by electromagnetic forces, and discovered a magical world in which rulers shrink and clocks run slow as their speed through the ether increases.
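The famous results of that analysis: a ruler of rest length L₀ and a clock with rest tick Δt₀, moving at speed v, behave as

```latex
L = L_0\sqrt{1 - v^2/c^2}, \qquad
\Delta t = \frac{\Delta t_0}{\sqrt{1 - v^2/c^2}}
```

The contraction and slowing conspire exactly to hide any motion through the ether, which is why the Michelson-Morley fringes never shifted.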

This was the context in which Einstein introduced his theory of Special Relativity. He did not really add to the results of Lorentz, but he simplified their derivation by proposing two simple principles. First, since the vacuum is empty, we have no way of determining whether we are moving or not: all motion is relative to an observer (hence “Relativity”; “Special” because the theory is restricted to observers moving at constant velocity), and so no observer should have a preferred view of the universe. The second was that the speed of light is the same to every observer. Einstein’s mathematical elaboration of these principles unified our understanding of space and time, and matter and energy. Eventually, General Relativity extended his ideas to include accelerating observers, who can’t determine whether they are actually accelerating or rather standing on the surface of a planet.
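From the two postulates alone, the transformation between the coordinates of observers in relative motion along the x-axis follows:

```latex
x' = \gamma\,(x - vt), \qquad
t' = \gamma\left(t - \frac{vx}{c^2}\right), \qquad
\gamma = \frac{1}{\sqrt{1 - v^2/c^2}}
```

and, with a little more algebra, the unification of matter and energy, E = γmc², which for a body at rest reduces to the famous E = mc².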

Special and General Relativity were not the only great theories to evolve in the course of the 20th century. Quantum Mechanics (the world of the microscopic) and Particle Physics (describing the fundamental forces and how they affect the simplest forms of matter) were also developed, but ultimately Einstein’s principles permeated those theories as criteria for acceptance.

Then, in 1998, studies of light emitted from distant supernovae seemed to indicate that something is pushing galaxies apart from each other, working against the general tendency of gravity to pull them back together. The explanation offered is Dark Energy, a field that fills all of space. The field has gravitational effects, and its distortion of the images of distant galaxies has been observed. However, the field cannot be moving in all possible directions at all possible speeds; it therefore establishes a preferred reference frame, invalidating Einstein’s assumptions.

Working physicists resist this conclusion, because they have a means of accommodating these effects in their theories, which is to introduce additional mathematical terms. But science is not about fitting data – it is about explaining it. Einstein used his principles as an explanation to justify the mathematics of his theories. When those principles are disproven, the door opens to completely new methods for describing the universe. We can travel as far back as Maxwell in reconstructing our theories of physics. While for some that would seem to discard a lot of hard work done over the years between (and undermine funding for their research), for others it liberates the imagination (see Generative Orders as an illustration).

So, for example, why didn’t Michelson and Morley detect the ether? Maybe the ether is more like air than water. Air is carried along with the Earth, and so the speed of sound doesn’t vary as the Earth moves around the sun. Maybe dark energy – what Maxwell knew as the ether – is also carried along with the Earth. Maybe, in fact, gravitation is caused by distortion of the Dark Energy field where it is bound to massive objects.

Generative Orders Research Proposal – Part V

Research Program

In this section, we suggest a research program, motivated by a strategy of incremental complexity. The initial steps of the program focus on the characteristics of the lattice. As these are resolved, the parameter space of the theory is reduced, helping to focus analysis of fermion dynamics in the later stages.

The reference model as described suggests that in theories of generative order, many of the intrinsic properties of the standard model may be extrinsic properties of one-dimensional structures. If this is so, ultimately theorists should be able to calculate many of the fundamental constants (G, h, c, α, etc.), and to establish correspondence theorems to existing models of gravitation, quantum electrodynamics (QED), quantum chromodynamics (QCD) and the weak interactions. It is the opinion of the author that the single-component reference model is unlikely to satisfy these requirements.
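The fine-structure constant illustrates the kind of target involved. Today its value is measured rather than derived; a successful theory of generative order would aim to compute it from the properties of the one-dimensional structures:

```latex
\alpha = \frac{e^2}{4\pi\varepsilon_0\,\hbar c} \approx \frac{1}{137.036}
```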

Conversely, the isotropy of space suggests that a single-component lattice is likely to be sufficient to explain gravitation. The initial work therefore focuses on creation of models that can be used to assess the stability of behavior against segmentation length and interaction models. The program will also help scope the computational resources necessary to analyze fermion kinematics.
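As an indication of how modestly the first modeling step could begin, here is a toy sketch (every name, interaction and parameter below is a hypothetical placeholder, not a commitment of the proposal): a one-dimensional chain of segments with nearest-neighbor coupling, scanned over segmentation length and coupling strength to see how relaxation behavior varies.

```python
# Hypothetical toy model: a 1-D chain of displacements with harmonic
# nearest-neighbor coupling. We scan chain length ("segmentation") and
# coupling strength, relaxing a random perturbation and reporting the
# residual energy as a crude stability measure.
import numpy as np

def relax_chain(n_segments, coupling, steps=2000, dt=0.01):
    """Relax a randomly perturbed chain; return its residual energy."""
    rng = np.random.default_rng(0)
    x = rng.normal(scale=0.1, size=n_segments)   # segment displacements
    v = np.zeros(n_segments)                     # segment velocities
    for _ in range(steps):
        # harmonic nearest-neighbor force, with the ends held fixed
        f = coupling * (np.roll(x, 1) + np.roll(x, -1) - 2 * x)
        f[0] = f[-1] = 0.0
        v = 0.999 * (v + dt * f)                 # mild damping
        x = x + dt * v
    kinetic = 0.5 * np.sum(v ** 2)
    potential = 0.5 * coupling * np.sum(np.diff(x) ** 2)
    return kinetic + potential

# scan a (very small) slice of the parameter space
for n in (16, 64, 256):
    for k in (0.5, 1.0, 2.0):
        print(f"n={n:4d}  k={k:.1f}  residual energy={relax_chain(n, k):.3e}")
```

Nothing in this sketch is specific to generative orders; it only shows that the early questions – stability against segmentation length and interaction model – are accessible with commodity computing.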

The steps in the research program are defined at successively coarser levels of detail. Particularly when the program progresses to consideration of fermion kinematics, years of effort could be invested in the analysis of a single particle property, such as spin. Considering the history of quantum mechanics and relativity, the entire program can be expected to take roughly a century to complete.

Given the resources available to the proposer, funding of the research program for the first year focuses on two goals:

  1. Re-analysis of the spectra of edge-on elliptical galaxies, with the goal of establishing a high-profile alternative to the Hubble expansion.
  2. Identification of research teams that would be capable of, and interested in, pursuing the modeling effort.

A successful effort would culminate in an invitation to a symposium considering new horizons in the theories of cosmology and particle physics.

Modeling Program

Exposure of generative orders to the community of particle theorists is going to produce a large body of objections to the reference model. To avoid these being raised as impediments to obtaining research funding for interested theorists, we list the challenges that must be overcome in elaborating a satisfactory model, and consider possible mechanisms that might lead to the observed behavior of known physical systems.

The program is presented in outline form only. If desirable, elaboration can be provided.

  1. Precession of perihelion – due to “drag” between the inconsistent lattice configurations of gravitationally bound bodies. Explore parameterization of lattice structure – sub-unit modes, lattice shear and compressibility. This is a fairly approachable study that could stimulate further work by demonstrating the feasibility of explaining large-scale phenomena through parameterization at the lattice scale. Success would begin to break down the conviction that a preferred reference frame is ruled out by the observations supporting Einstein’s theories of Special and General Relativity. (A toy numerical illustration follows this list.)
  2. Dynamics of the formation of galactic cores, parthenogenesis, black hole structure – relative energetics of lattice cohesion vs. encapsulation of lower-dimensional structures. Initial density and uniformity of one-dimensional structures (also considering the observed smoothness of lattice compression, i.e. the “dark energy” distribution). Success would be a clear demonstration of worthiness as an alternative to the Big Bang theory.
  3. Superfluid transport of particles through the lattice.
    1. Fiber characteristics
    2. Intrinsic angular momentum
    3. Virtual photon / gluon emission as an analog of lattice disruption
    4. Conservation of momentum
    5. Theory of kinetic energy (scaling of lattice distortion with particle velocity)
    6. Effect of sub-unit distortion on particle propagation
  4. Gravitation/QED/QCD
    1. Equivalence of gravitational and inertial masses
    2. Electric charge signs. Note that two sub-units with one thread each are not equivalent to one sub-unit with two threads.
    3. Thread dynamics and interaction with lattice
  5. Lattice distortion and correspondence to quantum-mechanical wave function
    1. Pauli exclusion principle
    2. Wave-particle duality (theory of diffraction)
    3. Hydrogen energy levels
    4. Wave-function collapse
  6. Weak interactions
    1. Thread transfer processes
    2. Temporary creation of unstable higher-dimensional structures.
  7. Theory of light
    1. Electric fields as a mode of particle coupling through a lattice disrupted by thread oscillations
    2. Magnetic fields as a special mode of particle coupling with a lattice distorted by motion of threads
    3. Speed of light
    4. Light decay during lattice propagation
    5. Theory of microwave background radiation (lattice relaxation or light decay)
  8. Theory of anti-particles
    1. Sub-unit chirality and lattice coherence
    2. Annihilation as synthesis of higher-order structures
    3. Pair production as decomposition of higher-order structures
    4. Meson theory
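To give a feel for item 1 above, here is a minimal numerical sketch. The inverse-cube perturbation below is a stand-in of my own choosing for the unspecified lattice “drag” (all parameters are illustrative); the point is only that a small non-Newtonian term shows up cleanly as perihelion precession.

```python
# Toy illustration: a planar Kepler orbit with a small extra inverse-cube
# attraction standing in for a hypothetical lattice "drag". We integrate
# with leapfrog and report the mean advance of the perihelion per orbit.
import numpy as np

GM = 1.0     # gravitational parameter (arbitrary units)
EPS = 1e-3   # strength of the stand-in perturbation (assumed)

def accel(r):
    d = np.linalg.norm(r)
    return -GM * r / d**3 - EPS * r / d**4   # Newton + perturbation

def perihelion_angles(r0, v0, dt=1e-3, n_orbits=10):
    """Integrate the orbit; return the angle of each perihelion passage."""
    r, v = np.array(r0, float), np.array(v0, float)
    a = accel(r)
    angles, d_prev, d_curr = [], None, np.linalg.norm(r)
    while len(angles) < n_orbits:
        v += 0.5 * dt * a          # leapfrog: kick
        r += dt * v                #           drift
        a = accel(r)
        v += 0.5 * dt * a          #           kick
        d_next = np.linalg.norm(r)
        # a local minimum of the distance marks a perihelion passage
        if d_prev is not None and d_prev > d_curr < d_next:
            angles.append(np.arctan2(r[1], r[0]))
        d_prev, d_curr = d_curr, d_next
    return np.unwrap(np.array(angles))

ang = perihelion_angles(r0=(1.0, 0.0), v0=(0.0, 0.8))  # eccentric orbit
print("mean perihelion advance per orbit:", np.diff(ang).mean(), "radians")
```

For an attractive perturbation of this form, first-order perturbation theory predicts an advance of roughly πε/L² per orbit (L being the specific angular momentum), which the sketch reproduces. The real modeling question is whether a lattice interaction produces a term of the right form and size.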

Generative Orders Research Proposal – Part III

Principle of Generative Order

In this section we motivate the principle of generative order and define a reference model to serve as a framework for exploring the challenges in elaborating the principle into a model capable of explaining the known characteristics of particles and their interactions.

Signposts

The preceding survey of the deficiencies of GI theories, culminating with lists of unexplained first-order phenomenology and axiomatic contradictions, is a powerful motivation to search for new principles to guide the construction of alternatives. In proposing generative orders (to be developed below), the author was motivated by the following observations. The observations span the scale of phenomenology from the quantum to the cosmological, in recognition of the connection between these scales established by current physical theory.

The Preponderance of Threes

As observed, we inhabit a universe with three spatial dimensions. The three particle families (of which the first is summarized below) each consist of four fermions, with three charge states (the fourth state being uncharged). Finally, the non-gravitational forces (electromagnetic, weak and color) are described by the gauge groups U(1), SU(2) and SU(3).

| Particle | Mass (MeV) | Charge | Color | Spin |
|----------|------------|--------|-------|------|
| ν        | ≈0         | 0      | 0     | ½    |
| u        | ≈2         | +2/3   | 1     | ½    |
| d        | ≈5         | −1/3   | 1     | ½    |
| e        | 0.5        | −1     | 0     | ½    |

Following this correspondence, it seems natural to suggest that the principles that explain our reality of threes should also be able to explain realizable physical realities based upon one, two, four and higher dimensions. This is the fundamental principle of generative order.

In the table above, it is also interesting to note other correspondences. Only the fractionally charged particles (u and d) carry color, and their masses are substantially larger than those of the integrally charged particles (ν and e). I also note that the particles with odd fractional charge (d at −1/3 and e at −1, i.e. −3/3) repel each other, but are attracted to the remaining charged particle (u), whose fractional charge (+2/3) is even.

Gross Cosmological Structure

Almost all galaxies have supermassive black holes at their centers (Galactic Cores, or GCs). Mechanisms for ejection of GCs have been proposed to explain those that do not. In addition, the oldest objects in the universe appear to be quasars. This tends to indicate that quasars may represent the early stages of GC formation, and thus that galaxies form through a sudden and enormously violent mechanism, rather than through the gradual coalescence of intergalactic gas.

Secondly, galaxies appear to be clustered on the surfaces of extremely large voids that lack any visible matter but are still capable of lensing light. This indicates that the initial stages of the universe must include mechanisms that explain variations in the uniformity of space (in the sense of General Relativity, though not necessarily through the mechanisms it allows).

Core Principles

The statement of generative order provided above is weak. It admits of realities in which a three-dimensional reality is independently established, but does not co-exist with realities of higher or lower dimensionality. Lacking a dynamical result that establishes preference for a three-dimensional reality, it would seem prudent to extend the basic principle of generative order with two others. The three are then:

  1. Realizable physical laws must exist on all orders of dimensionality.
  2. Orders are compositional: elements of lower order combine to produce elements of higher order.
  3. Orders must co-exist, and transitions between orders must be related to recognizable physical phenomena.

Generative Orders Research Proposal – Part II

Assessment of GI Theories

The principle of gauge invariance has underpinned the development of theoretical physics for almost a century. Application of the principle is conceptually simple: experimental data are analyzed to propose “invariants” of physical systems. The (generally simple) equations that describe these properties (energy, momentum, mass, field strength) are then subjected to classes of transformations (rotations, velocity changes, interactions with other particles), and the equations are manipulated until the proposed invariants are maintained under all transformations.
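The simplest concrete example is the U(1) gauge invariance of electrodynamics. The Dirac Lagrangian for a charged fermion is unchanged when the fermion field and the electromagnetic potential are transformed together by an arbitrary function χ(x):

```latex
\mathcal{L} = \bar{\psi}\,(i\gamma^\mu D_\mu - m)\,\psi, \qquad
D_\mu = \partial_\mu + iqA_\mu, \qquad
\psi \to e^{iq\chi(x)}\,\psi, \qquad
A_\mu \to A_\mu - \partial_\mu \chi
```

Demanding this invariance is what forces the potential A_μ (and hence the photon) into the theory; the same pattern, with larger symmetry groups, generates the weak and color forces.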

Having recognized gauge invariance as an organizing principle, theorists beginning with Einstein have sought to construct “Grand Unified Theories” that unite all of the invariants in a single framework. There is no fundamental reason for this to be possible – the proposition was motivated by reasonable success in explaining experimental studies performed at particle colliders.

Two kinds of difficulties have arisen for the proponents of these theories. First, particle colliders have become enormously expensive and time-consuming to construct. That has been ameliorated somewhat by the introduction of astrophysical data, and by attempts to connect the history of the early universe to the properties of the gauge theory. In the interim, however, the enormously prolific imaginations of the particle theorists were insufficiently checked by experimental data. This led them to emphasize mathematical tractability in constructing their theories.

Given this situation, we should perhaps not have been surprised to learn that as astrophysics observatories proliferated, the theorists faced intractable difficulties in reconciling predictions with data.

As if these problems were not serious enough, the focus on explaining observations of the behavior of particles under unusual conditions has led to a certain myopia regarding the intractability of what I would call “first order” phenomena: things that are obvious to us in our every-day lives, but have yet to be satisfactorily explained by theory.

We continue with an enumeration of defects.

The Supremacy of Formalism

The program for constructing a Grand Unified Theory of physics is a theoretical conceit from the start. There is no a priori reason to expect that summing the effects of independent forces is not a satisfactory and accurate means of describing the universe.

Once the program is undertaken, however, every term in every equation falls under the microscope of formal criteria. For example, the Higgs field was motivated as a means of giving particles mass without destroying the gauge invariance of the equations. Similarly, sparticles were introduced to establish a symmetry between particles obeying Bose and Fermi statistics. The “strings” of super-string theory were invented to cut off integrals that produce infinities in calculations of particle kinematics. Although these innovations are sufficient to achieve consistency with phenomenology, there is absolutely no experimental evidence that made them necessary. They were motivated solely by abstract formal criteria.

The tractability of formal analysis also has a suspicious influence over the formulation of particle theories. The dynamics of the one-dimensional “strings” of super-string theory are susceptible to Fourier analysis. However, Fourier modes are normally far-field approximations to more complex behavior in the vicinity of three-dimensional bodies. In a three-dimensional manifold such as our reality, it would seem natural for particles to manifest structure as toroids rather than as strings. Unfortunately, the dynamics of such structures can be described only using computational methods, making them an inconvenient representation for analysis.

Finally, while the Large Hadron Collider (LHC) is now marketed principally as a Higgs detector, the original motivation for its construction was a formal problem in particle kinematics: the Standard Model predicted that interaction probabilities would exceed unity (violating unitarity) in the vicinity of momentum transfers of 1 TeV. Something truly dramatic was expected from the experimental program, which, at least from the press reports, appears not to have manifested.

Violations of Occam’s Razor

A widely held principle in the development of scientific theory, Occam’s Razor recognizes that a simpler theory is more easily falsified than a complex one, and so should be preferred as a target for experimental verification. By implication, theorists faced with unnecessary complexity (i.e., complexity not demanded by phenomenology) in their models should be motivated to seek a simpler replacement.

The most egregious violation of the principle is the combinatorics of dimensional folding in “big bang” cosmologies derived from super-string theories. There are vast numbers of possibilities (by some counts as many as 10^500), with each possibility yielding a vastly different formulation of physical law. In recent developments, the Big Bang is considered to be the source of an untold number of universes, and we simply happen to find ourselves in one that supports the existence of life.

The Higgs as a source of mass is also an apparent superfluity. In the original theory, each particle had a unique coupling constant to a single Higgs field; the number of parameters in the theory was therefore not reduced. More recently, theorists have suggested that there may be multiple Higgs fields, which is certainly no improvement under the criteria of Occam’s Razor.

The vastly enlarged particle and field menageries of GUTs are also suspicious. There are roughly ten times as many particles and fields as are observed experimentally; the addition of seven extra spatial dimensions is also of concern.

Unverifiable Phenomena

Particularly in the area of cosmology, the theories take fairly modest experimental results and amplify them through a long chain of deduction to obtain complex models of the early universe. Sadly, many of the intermediate steps in the deduction concern phenomena that are not susceptible to experimental verification, making the theories unfalsifiable.

The point of greatest concern here is the interpretation of the loss of energy by light as it traverses intergalactic space. In the reigning theory, this is assumed to be due to the relativistic “red shift” of light emitted from sources that are moving away from the Earth at a significant fraction of the speed of light. Of course, no one has ever stood next to such an object and measured its velocity; the loss of energy is instead interpreted (circularly) as proof of relative motion.
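For reference, the measured quantity is the fractional shift in wavelength, which the standard interpretation converts to a recession velocity through the relativistic Doppler formula:

```latex
z = \frac{\lambda_{\mathrm{obs}} - \lambda_{\mathrm{emit}}}{\lambda_{\mathrm{emit}}}, \qquad
1 + z = \sqrt{\frac{1 + v/c}{1 - v/c}}
```

The observation itself is only the shift z; the velocity v is inferred through the formula, which is the step contested here.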

The “red shift” interpretation is the principal justification of the “Big Bang” theory, which again describes an event that cannot be directly verified. There are difficulties in the theory concerning the smoothness and energy density of the observable universe. These are purported to be side-effects of “inflationary” episodes driven by symmetry breaking of Higgs-like fields. No demonstrated vacuum potential manifests a sufficient number of e-foldings of space, and the relevant energy scales are many orders of magnitude beyond the reach of our experimental facilities.
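The “e-foldings” referred to here measure the exponential growth of the cosmological scale factor a(t) during the proposed inflationary episode; solving the smoothness problem is commonly quoted as requiring sixty or more:

```latex
a(t) \propto e^{Ht}, \qquad
N = \ln\frac{a_{\mathrm{end}}}{a_{\mathrm{begin}}} \gtrsim 60
```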

Finally, the 2011 Nobel prize was awarded for studies indicating that the expansion of the universe is accelerating. The inference was achieved by looking at the spectra of distant light sources and determining that they no longer followed the predictions of Hubble’s law. However, extrapolating from those measurements into the distant future is troubling: even in the context of the Big Bang model, the door is open to additional effects that may mitigate or reverse the predicted acceleration. Obviously, since these effects would occur on time-scales exceeding the existence of the Earth (which will be vaporized when the sun’s photosphere expands), they will never be verified.

Axiomatic Contradictions

As discussed in the previous section, the lack of an explanation for mass verges on an axiomatic need in the theory. That is to say, it appears to require a fundamental reevaluation of the abstract principles used to construct physical theories.

There are at least two other phenomena that directly violate fundamental axioms in the theory. The first is the existence of non-uniformity in the structure of space-time that is not associated with matter (so-called “dark energy”). Special relativity and all of its dependent theories (i.e., all of particle physics) rest upon the assumption that space is empty. In the era in which special relativity was formulated, evidence (the Michelson-Morley experiment) suggested that there was no “luminiferous ether” – no medium in which electromagnetic radiation propagated. Dark energy is in fact an ether, and its existence requires a dynamical explanation of the Michelson-Morley results. (It is my opinion that this is why Einstein called the cosmological constant – a vacuum energy term – his biggest blunder: the existence of such a term undermines all of special and general relativity.)

Finally, the work of the Princeton Engineering Anomalies Research (PEAR) team demonstrated couplings between psychological states and the behavior of inanimate objects that lie outside the modes of causality allowed by existing physical theory. The rejection of these findings by the mainstream physics community indicates that accommodating them is going to require rethinking the axioms of the theory. The most extreme examples concern the structure of time: the standard model allows non-linear causality only at quantum scales, while some studies of the “paranormal” appear to indicate non-causal behavior (information preceding effects) on macroscopic scales.

Generative Orders Research Proposal – Part I

Summary

The author proposes to develop research partnerships to construct a conceptual model of fundamental physics that has the potential to place spirituality on a firm scientific basis. The motivation for the scientific program is well-grounded in phenomenology, and the author outlines correspondence with established theories as limiting cases of the proposed model.

The author recognizes the gnostic implications of the program for society. Certain rules of engagement must be observed in doing such work – generally, the first application of any new technology is to obtain competitive advantage. To mitigate such outcomes, the author has written a book that explains, in layman’s terms, the disciplines required to engage these principles safely, and the long-term personal and global consequences of failing to observe them. The proposal includes support for updating the book, entitled “Love Works”, and for its professional preparation and publication.

The proposal does not seek to resolve all questions regarding the proposed class of theories. It is intended to stimulate thinking that should lead to independent research and funding by the research community. The proposal does include publication of a single paper that may demonstrate a significant point of departure from current models of particle physics and cosmology. Successful publication should stimulate “out-of-the-box” thinking by the research community, followed by independent research proposals.

Qualifications

The author’s principal qualifications for this work are selflessness and a commitment to Life in all of its forms. Many of the ideas presented were formulated through engagement with forms of sentience not recognized by many scientists.

With regard to the fundamental physics, the author received his Ph.D. in high-energy particle physics in 1987 and was active as a post-doctoral research fellow until 1992. Most of the conceptual underpinnings of modern particle theory and cosmology were developed during this period, and his observation of their development makes the author well-suited to recognizing their shortcomings.

However, the author recognizes his limitations with regard to the skill-sets of the modern particle theorist, including large-scale numerical modeling. The author will develop relationships at institutions with large-scale computational physics programs to collaborate in the program.

Motivations

The principal motivation for this work is to heal the divide between science and religion that promotes fear, anxiety, anger and apathy in those confronted with the enormous global challenges of the 21st century. The author believes that science is a process of revelation that can embolden and empower those with a genuine desire to be of service to the end of healing the world. Religion is concerned with the development of disciplines that enable us to work safely with the requisite spiritual energies.

While fostering spiritual maturity is critical to a successful execution of the overall program, the development of supporting resources is fairly well advanced. (The author has published his moral and ethical philosophy at www.everdeepening.org, and Love Works is a popularization aimed at the culturally dominant community of Christian believers.) The author considers publication of Love Works to be a critical adjunct, and will not pursue the scientific program separately.

Plan of Exposition

Love Works is provided as an attachment for evaluation. The focus of the exposition will therefore be to motivate and describe the scientific program. The scope of the development is far greater than necessary to complete the work of the first year. As an alternative to theories that have had thousands of man-years invested in their development, it is important to establish plausible paths of investigation for the obvious problems that must be overcome in investigating the new class of theories, characterized here as theories of “Generative Order” (GO).

To be fair, the discussion starts with an enumeration of the failures of the prevailing class of theories, characterized as theories based upon “Gauge Invariance” (GI).

Every physical theory has a set of fundamental constants. In current theories, the fundamental constants include the speed of light, the particle masses, Planck’s constant, and the strengths of the fundamental forces. The principal challenge in qualifying theories of Generative Order is determining the number and values of those constants. The exposition proposes a series of modeling problems that could be undertaken to evaluate a specific theory and determine its constants. Each modeling problem addresses a critical issue in establishing that a theory of Generative Order yields the current theory as a limiting case (just as Relativity and Quantum Mechanics have Newtonian physics as a limiting case).
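The sense of “limiting case” intended here is the standard one. For example, the relativistic kinetic energy reduces to the Newtonian expression when velocities are small:

```latex
E_{\mathrm{kin}} = (\gamma - 1)\,mc^2
= mc^2\left(\frac{1}{\sqrt{1 - v^2/c^2}} - 1\right)
\approx \frac{1}{2}mv^2 \qquad (v \ll c)
```

A theory of Generative Order would have to pass analogous tests against General Relativity, QED, QCD and the weak interactions in the appropriate regimes.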