Way Beyond Teflon

In imagining a universe filled with an invisible substance, it is natural to use air as an analogy. We then run immediately into trouble with Newton’s first law of motion, which is also an assumption in Einstein’s theories:

Every object in a state of uniform motion tends to remain in that state unless acted upon by an external force.

We know that air actively slows the movement of objects passing through it. Why aren’t moving objects slowed as they pass through Dark Energy?

One way around the problem is to assert that Dark Energy is a wall-flower: it doesn’t interact with anything else. That’s a prevalent assumption, and it reminds me of the early history of thermodynamics. In building a theory of heat, early investigators, noticing that heat moved from place to place without changing the substance it occupied, conceived of caloric, an invisible fluid that permeated the spaces between atoms. That idea didn’t have much explanatory power, and was rapidly replaced by theories that explained heat as the disordered motion of atoms.

Astrophysicists tell us that the universe is a pretty cold place – only a few degrees above absolute zero, the coldest temperature possible. Study of systems at these temperatures has revealed some amazing behaviors. For purposes of our discussion, liquid helium is an interesting example because it exhibits superfluidity, which allows objects to move through it without resistance. But superconductivity – materials that pass electricity without resistance – is another consequence of the basic principles that govern the behavior of really cold systems. Both liquid helium and superconductivity, by the way, are extremely important technologies in building facilities such as CERN.

Liquid helium is particularly simple because its atoms attract each other only very weakly, which is why it remains liquid at temperatures that cause almost every other element to freeze. For illustration, I’m going to show a model system with atoms in a square two-dimensional lattice. The details may not apply to liquid helium, but I have reason to believe that they might apply to Dark Energy.

Imagine that we have a tank filled with liquid helium. At very cold temperatures, the atoms stack uniformly in the tank.
Super Fluid Lattice
Such arrangements are said to have high order. They are typical of crystalline materials, including many solids. One of the upshots is that it’s difficult to move a single atom without moving the entire collection. That’s because gravity presses the volume into a compact mass, which means that the atoms are slightly compressed, and therefore repel each other. So moving one helium atom causes the atom it’s moving towards to move away. The cold here is important: if the lattice were vibrating somewhat, there would be little gaps that could absorb some of the distortion, and so the parts of the lattice could change independently. It’s the lack of such vibrations that forces the lattice as a whole to respond to changes.
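
To make that “move one, move all” behavior concrete, here is a minimal numerical sketch. It is not a model of real helium – the square lattice, spring-like repulsion, boundary handling and parameter values are all illustrative assumptions – but it shows how holding one displaced atom in a cold, slightly compressed lattice forces atoms far away to shift as well.

```python
import numpy as np

# Toy illustration only: a cold 2D square lattice of "atoms" joined by
# spring-like bonds. The lattice is slightly compressed (rest length
# exceeds the spacing), so every atom pushes on its neighbors, and the
# container walls are fixed. All parameters are arbitrary choices.

N = 21              # atoms per side
rest = 1.05         # natural bond length; spacing is 1.0, so bonds repel
k = 1.0             # spring constant

base = np.stack(np.meshgrid(np.arange(N, dtype=float),
                            np.arange(N, dtype=float), indexing="ij"), axis=-1)
pos = base.copy()

held = np.zeros((N, N), dtype=bool)
held[0, :] = held[-1, :] = held[:, 0] = held[:, -1] = True   # the walls
c = N // 2
held[c, c] = True
pos[c, c, 0] += 0.3          # displace the central atom and hold it there

def forces(p):
    f = np.zeros_like(p)
    for axis in (0, 1):                       # bonds along x and along y
        d = np.diff(p, axis=axis)             # vector along each bond
        L = np.linalg.norm(d, axis=-1, keepdims=True)
        fb = k * (L - rest) * d / L           # Hooke's law for each bond
        lo = [slice(None)] * 2; lo[axis] = slice(0, -1)
        hi = [slice(None)] * 2; hi[axis] = slice(1, None)
        f[tuple(lo)] += fb                    # force on the near atom
        f[tuple(hi)] -= fb                    # equal and opposite
    f[held] = 0.0                             # walls and held atom stay put
    return f

for _ in range(20000):                        # crude over-damped relaxation
    pos += 0.1 * forces(pos)

shift = np.linalg.norm(pos - base, axis=-1)
print("displacement 1, 5, 9 sites from the held atom:",
      shift[c, c + 1], shift[c, c + 5], shift[c, c + 9])
```

Even nine sites away the atoms have shifted: in a cold, compressed lattice with pinned walls, a local displacement is accommodated by the arrangement as a whole.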

Now let’s imagine that we place an impurity into the lattice.
Impurity in Super Fluid
This time a slight distortion of the arrangement will occur. The atoms nearest the impurity will indeed shift their positions slightly. But the atoms at the walls of the container are held in place, so the distortion will be localized. What’s interesting to consider is what might happen if two defects are created. Will the disturbance to the lattice be minimized if the defects are brought together, or if the lattice acts to separate them? The astute student of physics will see that this thought leads to a model for gravity.

Now let’s propose that somehow our impurity begins to move.
Slow Impurity
How will the lattice react? Well, again, the atoms at the walls can’t move. The impurity will push against the atom in front of it, and leave a gap behind it. So long as the speed of the impurity is much less than the speed of sound in the lattice, only the nearest atoms will be disturbed. Obviously, the solution to restoring the order of the lattice is for the forward atoms to migrate to the sides as the impurity passes, displacing the atoms already on the side so that they fill the gap left by the passing impurity. When they reach the back, the atoms will come to rest by giving their energy back to the impurity. This is the essence of superfluidity: the impurity loses energy to the lattice only temporarily.
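
That “much less than the speed of sound” condition is the heart of Landau’s criterion for superfluidity – a standard textbook result, not something specific to my lattice picture. Flow is dissipationless so long as the impurity moves too slowly to create a permanent excitation of the medium. If ε(p) is the energy of an excitation carrying momentum p, the critical speed is

v_c = minimum over p of ε(p)/p

For sound-like excitations ε(p) = c_s p, and v_c is just the speed of sound c_s – which is why the slow impurity rides through undisturbed while the fast one (below) does not.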

What is interesting to note is that in quantum mechanics, when calculating collisions between two charged particles, we have to assume that the particles are constantly emitting and re-absorbing photons. This is analogous to the situation in the superfluid: the impurity is constantly losing energy and then regaining it.

Finally, let’s consider an impurity moving closer to the speed of sound in the lattice. In this case, the distortions affect more than the nearest atoms, and the circulation becomes more widespread.
Fast Impurity
It’s important to note that energy is stored in the circulatory motion of the helium atoms. They are moving, just as the impurity is moving – but in the opposite direction, of course. The closer to the speed of sound, the more energy is stored in the circulation. This means that it becomes harder and harder to make the impurity move faster as it approaches the speed of sound.

In Special Relativity, Einstein showed that particles become harder and harder to accelerate as they come closer and closer to the speed of light. The relationship is (m₀ is the mass of the particle at rest):

m = m₀ / √(1 − v²/c²)
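
A quick numerical check – a throwaway Python sketch, nothing more – shows how steeply the effective mass climbs as v approaches c, the same runaway cost that the circulating helium atoms impose on the impurity:

```python
# Effective mass ratio m/m0 = 1/sqrt(1 - v^2/c^2) at a few speeds.
# Illustrative only; v is expressed as a fraction of c.
import math

for v in (0.1, 0.5, 0.9, 0.99, 0.999):
    gamma = 1.0 / math.sqrt(1.0 - v * v)
    print(f"v = {v:5} c  ->  m/m0 = {gamma:7.2f}")
```

At half the speed of light the mass has grown by only about 15%; at 99.9% it has grown more than twenty-fold.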

Again, we see a strong correspondence between superfluidity and the behavior of particles in both special relativity and quantum mechanics. The big difference is this: where Richard Feynman famously described quantum mechanics as a mathematical procedure without any explanation, applying the superfluid analogy to Dark Energy makes at least some previously mysterious quantum and relativistic phenomena simple to understand.

For more on models of particle mass, see That’s the Spirit.

A Massive Mystery

Quantum Mechanics describes particles as vibrations in time and space. The rate of the vibration in time (i.e. – when it is) reflects the particle’s energy; the rate of the vibration in space (i.e. – where it is) reflects its momentum.
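
Stated as formulas – these are the standard Planck–de Broglie relations, where ħ is Planck’s constant divided by 2π, ω is the rate of vibration in time, and k the rate of vibration in space:

E = ħω          p = ħk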

In large-scale reality, such as baseballs and buildings, those vibrations are far too small to influence the results of experiments. In studying these “classical” systems, physicists discovered certain mathematical laws that govern the relationship between momentum (p) and energy (E). Believing that these rules should still be manifested in the quantum realm, physicists used them as guidelines in building theories of vibration.

In Special Relativity, that relationship is (m is the mass of the particle, in units where the speed of light c = 1):

m² = E² − p²

In the case of electromagnetic waves, we have m = 0. Using a fairly simple mathematical analogy, the equation above becomes a wave equation for the electromagnetic potential, A. An electric field (that drives electricity down a wire) arises from the gradient of the potential; a magnetic field (that causes the electricity to want to turn) arises from the twisting of the potential.
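
The “mathematical analogy” is the standard substitution of rates of change for energy and momentum (E → i∂/∂t, p → −i∇, in units where ħ = c = 1). With m = 0, the relation above then reads

∂²A/∂t² − ∇²A = 0

which is the classical wave equation, with solutions that travel at the speed of light.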

The contribution of P.A.M. Dirac was to find a mathematical analogy that would describe the massive particles that interact with the electromagnetic potential. When the meaning of the symbols is understood, that equation is not hard to write down, but explaining the symbols is the subject of advanced courses in physics. So here I’ll focus on describing the nature of the equation. Let’s pick an electron for this discussion. The electron is a wave, and so is represented by a distribution ψ.

Physically, the electron is like a little top: it behaves as though it is spinning. When it is moving, it is convenient to describe the spin with respect to the motion. If we point our right thumb in the direction of motion, a “right-handed” electron spins in the direction of our fingers; a “left-handed” electron spins in the opposite direction. To accommodate this, the distribution ψ has four components: one each for right- and left-handed motion propagating forward in time, and two more for propagation backwards in time.

Dirac’s equation describes the self-interaction of the particle as it moves freely through space (without interacting with anything else). Now from the last post, we know that nothing moves freely through space, because space is filled with Dark Energy. But when Dirac wrote his equation, Einstein’s axiom that space was empty still ruled the day, so it was thought of as “self-interaction”. That self-interaction causes the components of the electron to mix according to m, E and p. When the self-interaction is applied twice, we get Einstein’s equation, relating the squares of those terms.
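
Schematically – suppressing the spin matrices that those advanced courses would supply – the free Dirac equation couples the left- and right-handed pieces of the wave like this:

(E + p) ψ_L = m ψ_R
(E − p) ψ_R = m ψ_L

Substituting one equation into the other gives (E² − p²)ψ = m²ψ: the self-interaction applied twice, yielding Einstein’s relation.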

So what does the mass term do? Well, it causes right-hand and left-hand components to mix. But here’s the funny thing: imagine watching the electron move in a mirror. If you hold your hands up to a mirror with the thumbs pointed to the right, you’ll notice that the reflection of the right hand looks like your left hand. This “mirror inversion” operation causes right and left to switch. In physics, this is known as “parity inversion”. The problem in the Dirac equation is that when this is applied mathematically to the interaction, the effect of the mass term changes sign. That means that physics is different in the mirror world than it is in the normal world. Since there is no fundamental reason to prefer left over right in a universe built on empty space, the theorists were upset by this conclusion, which they call “parity violation”.

Should they have been? For the universe indeed manifests handedness. This is seen in the orientation of the magnetic field created by a moving charged particle, and also in the interactions that cause fusion in the stars and radioactive decay of uranium and other heavy elements.

But in purely mathematical terms, parity violation is a little ugly. So how did the theorists make it go away? Well, by making the mass change sign in the mirror world. It wasn’t really that simple: they invented another field, called the Higgs field (named after its inventor), and arbitrarily decided that it would change sign under parity inversion. Why would it do this? Well, there’s really no explanation – it’s just an arbitrary decision that Higgs made in order to prevent the problem in the Dirac equation. The mass was taken away and replaced with the Higgs density and a random number (a below) that characterized its interaction with the electron: mψ was replaced with aHψ.
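
When the Higgs field settles at its preferred density (call it v, described below), the replacement hands back the old mass term, with the mass now the product of two numbers rather than a fundamental input. Simplified from the textbook form, which carries some extra numerical factors:

mψ → aHψ,  and with H settled at v,  m = av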

Now here’s a second problem: if space was empty, why would the Higgs be expected to have a non-zero strength so that it could create mass for the electron? To make this happen, the theory holds that empty space would like to create the Higgs field out of nothingness. This creation process was described by a “vacuum” potential which says that when the Higgs density is zero, some energy is available to generate a density, until a limit is reached, and then increasing the density consumes energy. So space has a preferred density for the Higgs field. Why should this happen? No reason, except to get rid of the problem in the Dirac equation.
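
The standard vacuum potential that accomplishes this is simple to write down (μ and λ are two more arbitrary numbers; the minus sign is what makes empty space “want” to generate a density):

V(H) = −μ²H² + λH⁴

Energy is released as H grows from zero, until the preferred density H = μ/√(2λ) is reached; pushing the density any higher consumes energy, exactly as described above.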

And what about the other spinning particles? Along with the electron, we have the muon, tau, up, down, strange, charm, bottom, top and three neutrinos, all with their own masses. Does each particle have its own Higgs field? Or do they each have their own random number? Well, having one field spewing out of nothingness is bad enough, so the theory holds that each particle has its own random number. But that raises the question: where do the random numbers come from?

So now you understand the concept of the Higgs, and its theoretical motivations.

Through its self-interaction, the Higgs also has a mass. In the initial theory, the Higgs field was pretty “squishy”. What does this mean? Well, Einstein’s equation says that mass and energy are interchangeable. Light is pure energy, and we see that light can be converted into particle and anti-particle pairs. Those pairs can be recombined to create pure energy again in the form of a photon. Conversely, to get high-energy photons, we can smash together particles and anti-particles with equal and opposite momentum, so that all of their momentum is also converted to pure energy (this is the essential goal of all particle colliders, such as those at CERN). If the energy is just right, the photons can then convert to massive particles that aren’t moving anywhere, which makes their decay easier to detect. So saying that the Higgs was “squishy” meant that the colliding pairs wouldn’t have to have a specific energy to create a Higgs particle at rest.
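
As a ballpark illustration of that “just right” energy – idealized for a head-on particle/anti-particle collision, and ignoring the messier picture at a proton machine like the LHC, where it is constituents of the protons that collide: to create a Higgs of mass 125 GeV at rest, each beam particle must carry half that energy,

E_beam = m_H c²/2 ≈ 62.5 GeV

Any other split of the energy leaves the product moving, smearing the signal.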

Of course, there’s a lot of other stuff going on when high-energy particles collide. So a squishy Higgs is hard to detect at high energies: it gets lost in the noise of other kinds of collisions. When I was in graduate school, a lot of theses were written on computer simulations that said that the “standard” Higgs would be almost impossible to detect if its mass was in the energy range probed by CERN.

So it was with great surprise that I read the reports that the Higgs discovered at CERN had a really sharp energy distribution. My first impression, in fact, was that what CERN had found was another particle like the electron. How can they tell the difference? Well, by looking at the branching ratios. All the higher-mass particles decay, and the Higgs should decay into the different particle types at rates based upon their masses (which describe the strength of the interaction between the Higgs field and the particles). The signal detected at CERN was a decay into two photons (which is also allowed in the theory). I am assuming that the researchers at CERN will continue to study the Higgs signal until the branching ratios to other particles are known.

But I have my concerns. You see, after Peter Higgs was awarded the Nobel Prize, his predecessor on the podium, Carlo Rubbia (leader of the collaboration that reported the top particle discovery) was in front of a funding panel claiming that the Higgs seemed to be a bizarre object – it wasn’t a standard Higgs at all, and the funding nations should come up with money to build another even more powerful machine to study its properties. Imagine the concern of the Nobel committee: was it a Higgs or not? Well, Rubbia’s claim was first retracted, but then a recent paper came out saying that the discovery was not a Higgs, but a “techni-Higgs”.

One of the characteristics of the scientific process is that the human tendency to lie our way to power is managed by the ability of other scientists to expose fraud by checking the facts. Nobody can check the facts at CERN: it is the only facility of its kind in the world. It is staffed by people whose primary interest is not in the physics, but in building and running huge machines. That’s a really dangerous combination, as the world discovered in cleaning up the mess left by Ivan Boesky and his world-wide community of financial supporters.

The God Particle

When I did my undergraduate studies in physics at UC Berkeley, the textbooks (always a generation behind) celebrated the accomplishments of great particle physicists of the ‘50s and ‘60s. The author lists on the papers, typically eight people, offered a picture of personal and meaningful participation in revealing the mysteries of the universe.

When I stood one step down on the stage at Wheeler Hall, giving my thesis adviser a height assist when passing the Ph.D. sash over my head, the realities of research in the field of particle physics had completely changed. While I had worked on an eight-person experiment, the theorists had dismissed the results even before they were published. Many of my peers worked as members of geographically dispersed teams, either national or international in scope. The design and commissioning of apparatus had become major engineering projects requiring a decade or more to complete. Some of them never sat a shift to acquire data, but published a thesis based upon computer simulations of what their data would look like when (or in some cases, sadly, if) their experiment was run. They were forgotten cogs in collaborations involving hundreds of scientists.

The sociological side-effects of these changes could be disconcerting. The lead scientist on my post-doctoral research project acquired most of his wealth trading property in the vicinity of Fermilab, sited in bucolic countryside that sprouted suburbs to house the staff of engineers and technicians that kept the facility running. Where once a region could host a cutting-edge experimental facility, eventually the sponsors became states, then nations. The site selection process for the Superconducting Super Collider, the follow-on to Fermilab, was a political circus, eventually decided in favor of Texas during the first Bush administration. The project was cancelled in a budget-cutting exercise during the Clinton Administration. This left CERN, the European competitor to Fermilab, as the only facility in active development in the world, with thousands of researchers dependent upon its survival.

Obviously managing the experimental program at such a facility requires an acute political ear – not just to manage the out-sized egos of the researchers themselves, but also to package a pitch for politicians approving billion-dollar line-items in their budgets. I watched with trepidation as every year a low-statistics survey was done at the limits of the machine’s operating range, with the expected anomalies in the data held out as evidence that there was “something right around the corner” to be uncovered if the machine was allowed to continue to operate. This happened year after year, and that can have bad consequences: the frustration of the funding community creates pressure that causes things like the Challenger disaster to happen.

When I left the field in 1995 (yes, 1995! And it’s still relevant!), two specific problems were held out as motivations for continued funding. First, the equations used to calculate reaction probabilities developed a serious anomaly at the energies targeted by the next set of improvements: the values were greater than unity. Since a probability can never exceed one, this was held out as proof that something new would be discovered. The other problem was the existence of the Higgs boson, known popularly as the god particle.

There are many explanations for that soubriquet: “God Particle”. Some attribute it to Leon Lederman, an experimentalist whose frustration with the difficulty of proving or disproving its existence led him to call it “that god-damned particle.” I had a personal view, which was that every time theoretical physics ran into a difficulty, it seemed to be resolved by introducing another Higgs-like particle. But the cynic might also be forgiven if he claimed that the Higgs had become a magic mantra that induced compliance in mystified politicians, and spirited money out of public coffers – pretty much as atheists like to claim religions do.

So what is the Higgs particle?

Einstein is So 20th Century

In the two centuries between Newton and Einstein, arguably the greatest physicist was the Scotsman James Clerk Maxwell. Maxwell made fundamental contributions to thermodynamics, the study of how gases, liquids and solids change when ambient conditions (such as temperature and pressure) change, and how to convert heat to work. One of the results was an understanding of the propagation of sound waves through the air. But Maxwell also applied the mathematics of differential equations to create a unified theory of electricity and magnetism. These are the famous “Maxwell’s Equations” that predict the existence of electromagnetic waves, which we see as “light”.

Maxwell saw the relationship between electromagnetic waves and water and sound waves. Being steeped in a mechanical analysis of the world, he was unsatisfied with his abstract mathematical theory, and invested time in building a mechanical model of the “luminiferous ether” – the medium in which light waves traveled. Having spent years studying his equations and their predictions, I am fascinated by claims of his success. It’s a magical world in which the linear motion of charges creates rotary magnetic effects. My understanding is that the model was not simple, but contained complex systems of interlocking gears.

Now Maxwell’s work was not merely a curiosity – it was the basis for the design of communication networks that broke down distances with the enormous speed of light. More than anything else, this has brought us into each other’s lives and helped to create the sense that we are one human family. (The social and psychological reaction to that reality is complex, and we’re still growing into our responsibilities as neighbors. In The Empathic Civilization, Jeremy Rifkin offers a hopeful analysis of the transition.)

So the world of scientific inquiry hung on Maxwell’s words, and in America two physicists, Michelson and Morley, designed an experiment to detect the presence of the ether. If the ether filled all of space, the Earth must be moving through it. Therefore the speed of light should change depending upon the motion of the observer through the ether. The analogy was with water waves: an observer moving along with a water wave doesn’t experience its disturbance – while one moving against it feels its disturbance enhanced. This is an example of Newton’s laws concerning the change of reference frames.

Since the Earth revolves around the sun, light emitted from the Earth in a specific direction relative to the sun should have a different speed at different times of the year. To test this hypothesis, Michelson and Morley built a sensitive instrument that compared the speed of light travelling in two perpendicular directions. As the Earth varied its motion through the ether, the pattern of dark and light on a screen was expected to shift slowly. Strangely, the result was negative: the image did not change.
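
The expected effect was tiny but measurable. The textbook estimate puts the fractional difference in light travel time between the two arms at about v²/c², where v is the Earth’s orbital speed of roughly 30 km/s:

v²/c² ≈ (3 × 10⁴ / 3 × 10⁸)² = 10⁻⁸

That sounds hopeless until you recall that an arm a few meters long spans some ten million wavelengths of visible light, so a shift of a tenth of a fringe or so was within the instrument’s reach.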

The conclusion was that there was no ether. This was a real crisis, because Maxwell’s Equations don’t behave very well when trying to predict the relationship between observations made by people moving at different speeds. To understand how really terrible this is, consider: in Maxwell’s theory, a charge moving through empty space creates a rotary magnetic field. But what if the observer is moving along with the charge? The charge no longer appears to move, so the magnetic field disappears. How can that be possible?

This was the challenge taken up by the Dutch physicist Hendrik Lorentz. He analyzed the mechanical properties of rulers and clocks, which are of course held together by electromagnetic forces, and discovered a magical world in which rulers change length and clocks speed up and slow down when the speed of the observer changes.

This was the context in which Einstein introduced his theory of Special Relativity. He did not really add to the results of Lorentz, but he simplified their derivation by proposing two simple principles. First, since the vacuum is empty, we have no way of determining whether we are moving or not. All motion is relative to an observer (thus the title: Special Theory of Relativity), and so no observer should have a preferred view of the universe. The second was that the speed of light is the same to every observer. Einstein’s mathematical elaboration of these principles unified our understanding of space and time, and matter and energy. Eventually, General Relativity extended his ideas to include accelerating observers, who can’t determine whether they are actually accelerating or rather standing on the surface of a planet.

Special and General Relativity were not the only great theories to evolve in the course of the 20th century. Quantum Mechanics (the world of the microscopic) and Particle Physics (describing the fundamental forces and how they affect the simplest forms of matter) were also developed, but ultimately Einstein’s principles permeated those theories as criteria for acceptance.

Then, in 1998, studies of light emitted from distant supernovae seemed to indicate that something is pushing galaxies apart from each other, working against the general tendency of gravity to pull them back together. The explanation for this is Dark Energy, a field that fills all of space. This field has gravitational effects, and its effects in distorting the images of distant galaxies have been observed. However, this field cannot be moving in all possible directions at all possible speeds. Therefore, it establishes a preferred reference frame, invalidating Einstein’s assumptions.

Working physicists resist this conclusion, because they have a means of accommodating these effects in their theories, which is to introduce additional mathematical terms. But science is not about fitting data – it is about explaining it. Einstein used his principles as an explanation to justify the mathematics of his theories. When those principles are disproven, the door opens to completely new methods for describing the universe. We can travel as far back as Maxwell in reconstructing our theories of physics. While for some that would seem to discard a lot of hard work done over the years between (and undermine funding for their research), for others it liberates the imagination (see Generative Orders as an illustration).

So, for example, why didn’t Michelson and Morley detect the ether? Maybe ether is more like air than water. Air is carried along with the Earth, and so the speed of sound doesn’t vary as the Earth moves about the sun. Maybe dark energy, which Maxwell knew as the ether, is also carried along with the Earth. Maybe, in fact, gravitation is caused by distortion in the Dark Energy field when it is bound to massive objects.

The Writing of The Soul Comes First

In Catholic terminology, thaumaturgy is the working of miracles through love. Raised by a skeptical father and steeped in science that disproved the possibility of such experiences, for most of my life I disbelieved.

That changed with the millennium, when personal and political crises brought fear into my life. I began to read widely on spiritual and religious experience. Then one Sunday I entered the sanctuary at St. Kolbe’s in Oak Park, CA. A thirty-foot statue of Christ hangs from the ceiling, not nailed to the cross, but suspended before it. Confronted with this powerful image of human suffering, I instinctively put my hand over my heart, held it out to him, and thought, “Use this for healing.”

In the intervening years, I have learned a great deal about healing through divine love. I learned that many “evil” people are simply doing what was done to them, and desperately looking for someone with the strength to show them how to get over it. I learned that people used to being in control find the sensations that come with being loved to be frightening, almost a betrayal by the thirst of their hearts. I learned that many intellectual atheists are “spiritual”, and those that are not do not realize how frightening others find the strength of their minds. I realized that Biblical literalists use their dogmatism to hold those minds at bay.

As I sought for answers, the astrophysicists announced the discovery of Dark Energy. To those that remember the philosophical roots of modern physics, this discovery was shattering. Einstein’s theories of relativity are based upon the assumption that space is empty. Dark Energy demolishes that assumption. With that called into doubt, we might notice another oddity in the history of physics: where from Ancient Greece to 1950 the complexity of nature was always understood by positing structure inside the smallest objects we could observe, in the modern era physicists assumed that no additional structure was needed. Taking away relativity and adding additional structure reveals a whole new class of theories that have the potential to reconcile science and spirituality (see Generative Orders (GO) and GO Cosmology).

I began to share these insights in 2005 with the web site at http://www.everdeepening.org, in which, as a Greek philosopher might have, I try to prove that love works. Realizing that the material was really difficult, I wrote a “layman’s” treatment back in 2008, the unpublished “Love Works.” Unfortunately, attempts to teach others demonstrated that the ideas were still difficult to grasp.

Then, in 2013, I was moved to re-read the Bible cover-to-cover, and saw it in a completely new light. I realized that what Darwin and paleontology had revealed about natural history was written right into the Bible. No conflict existed, and in fact the consistency of science with the Bible served to substantiate everything else written within.

Reading through the book in such a short time, I also saw the greater work on human nature, and the majesty and brilliance of God’s efforts to prepare us for the manifestation of Christ.

So I sat down at my computer and wrote The Soul Comes First in three weeks. In it is contained all the hopes that I pray I share with Christ: the unification of reason and faith, the hidden strength that will give humanity victory over fear, and the healing of the world through the power of love.

The message may be frightening to some. The job that we forsook in Eden is a big job, and difficult. All I ask is that you remember that it is not in human hands that the work is held. We all do our part, and the farm hand that plants a sustainable crop is no less essential than the ecologist that plans the restoration of a forest. The housewife serving in the soup kitchen is no less essential than the CEO commissioning a new factory. The counsellor that saves a marriage is no less essential than the politician that negotiates a peace treaty. With love, the strength of Christ, and the unifying wisdom of the Holy Spirit, all things are possible.

Generative Orders Research Proposal – Part V

Research Program

In this section, we suggest a research program, motivated by a strategy of incremental complexity. The initial steps of the program focus on the characteristics of the lattice. As these are resolved, the parameter space of the theory is reduced, helping to focus analysis of fermion dynamics in the later stages.

The reference model as described suggests that in theories of generative order, many of the intrinsic properties of the standard model may be extrinsic properties of one-dimensional structures. If this is so, ultimately theorists should be able to calculate many of the fundamental constants (G, h, c, α, etc.), and to establish correspondence theorems to existing models of gravitation, quantum electrodynamics (QED), quantum chromodynamics (QCD) and the weak interactions. It is the opinion of the author that the single-component reference model is unlikely to satisfy these requirements.

Conversely, the isotropy of space suggests that a single-component lattice is likely to be sufficient to explain gravitation. The initial work therefore focuses on creation of models that can be used to assess the stability of behavior against segmentation length and interaction models. The program will also help scope the computational resources necessary to analyze fermion kinematics.

The steps in the research program are successively coarser in scope. Particularly when the program progresses to consideration of fermion kinematics, years of effort could be invested in analysis of a single particle property, such as spin. Considering the history of quantum mechanics and relativity, the entire program can be expected to take roughly a century to complete.

Given the resources available to the proposer, funding of the research program for the first year focuses on two goals:

  1. Re-analysis of the spectra of side-view elliptical galaxies with the goal of establishing a high-profile alternative to the Hubble expansion.
  2. Identifying research teams that would be capable of and interested in pursuing the modeling effort.

A successful effort would culminate in an invitation to a symposium considering new horizons in the theories of cosmology and particle physics.

Modeling Program

Exposure of generative orders to the community of particle theorists is going to result in a large body of objections to the reference model. To avoid these being raised as impediments to obtaining research funding for interested theorists, we list the challenges that must be overcome in elaborating a satisfactory model, and consider possible mechanisms that might lead to the observed behavior of known physical systems.

The program is presented in outline form only. If desirable, elaboration can be provided.

  1. Precession of perihelion – due to “drag” between the inconsistent lattice configurations of bound gravitational bodies. Explore parameterization of lattice structure – sub-unit modes, lattice shear and compressibility. Again a fairly approachable study that could stimulate further work by demonstrating feasibility of explanations of large-scale phenomena, with correspondence to parameterization at the lattice scale. Success would begin to break down resistance to the idea of a preferred reference frame, which is widely assumed to be disproven by the observations that support Einstein’s theories of Special and General Relativity.
  2. Dynamics of the formation of galactic cores, parthenogenesis, black hole structure – Relative energetics of lattice cohesion vs. encapsulation of lower-dimension structures. Initial density and uniformity of 1-dimensional structures (also considering the observed smoothness of lattice compression – i.e. the “dark energy” distribution). Success would be a clear demonstration of worthiness as an alternative to the Big Bang theory.
  3. Superfluid transport of particles through the lattice.
    1. Fiber characteristics
    2. Intrinsic angular momentum
    3. Virtual photon / gluon emission as an analog of lattice disruption
    4. Conservation of momentum
    5. Theory of kinetic energy (scaling of lattice distortion vs. particle velocity)
    6. Effect of sub-unit distortion on particle propagation
  4. Gravitation/QED/QCD
    1. Equivalence of gravitational and kinetic masses
    2. Electric charge signs. Note that 2 sub-units with 1 thread each are not equivalent to one subunit with 2 threads.
    3. Thread dynamics and interaction with lattice
  5. Lattice distortion and correspondence to quantum-mechanical wave function
    1. Pauli exclusion principle
    2. Wave-particle duality (theory of diffraction)
    3. Hydrogen energy levels
    4. Wave-function collapse
  6. Weak interactions
    1. Thread transfer processes
    2. Temporary creation of unstable higher-dimensional structures.
  7. Theory of light
    1. Electric field as a mode of coupling due to lattice disrupted by thread oscillations
    2. Magnetic fields as a special mode of particle coupling with a lattice distorted by motion of threads
    3. Speed of light
    4. Light decay during lattice propagation
    5. Theory of microwave background radiation (lattice relaxation or light decay)
  8. Theory of anti-particles
    1. Sub-unit chirality and lattice coherence
    2. Annihilation as synthesis of higher-order structures
    3. Pair production as decomposition of higher-order structures
    4. Meson theory

Generative Orders Research Proposal – Part IV

Reference Model

Having advanced the principles of generative orders, we find ourselves in a situation somewhat similar to that faced by quantum theorists after wave-particle duality was advanced. A number of experiments appeared to violate the principles of Classical Mechanics (i.e. – the double-slit experiment, electronic excitations of the hydrogen atom, and the photoelectric effect). Progress was achieved by generalizing the methods of classical mechanics (the Hamiltonian and Lagrangian formulations) into differential equations through Fourier analysis.
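
For the record, the generalization in its simplest form: Fourier analysis identifies energy and momentum with rates of change in time and space (E → iħ∂/∂t, p → −iħ∂/∂x), so the classical energy relation E = p²/2m + V becomes Schrödinger’s differential equation

iħ ∂ψ/∂t = −(ħ²/2m) ∂²ψ/∂x² + Vψ

This is the sense in which the old mechanics was generalized rather than discarded.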

The problem in the case of generative orders is more difficult. The principle does not generalize existing theory into new realms of application – it serves to supplant existing theories, stretching back to Special Relativity and quantum mechanics. Additionally, the enumerated principles are abstract. They do not drive us to a specific formulation of physics in one dimension. A number of alternatives may be mathematically feasible.

Lacking a definite starting point for analysis, nothing short of an intellectual Big Bang would produce a fully elaborated theory that explains everything that is known about particle physics and cosmology. That does not exclude thoughtful exploration of specific possibilities. In this section, we consider a simple model (narrative here), elaborated to the point that conceptual correspondence with known phenomenology is established. The model is sufficient to support development of model potentials (as outlined in the research program), and therefore to advance theoretical insight and analysis methods that can be applied to other models.

  1. The initial state of the universe is a disordered but “cold” (at least as compared to Big Bang theories) collection of one-dimensional structures.
  2. Physics of one dimension includes a mechanism of segmentation (or quantization). The W/Z mass may establish a scale for this segmentation (see item 8 in this list).
  3. Folding or bonding on segmentation boundaries produces higher-dimensional structures. Geometrically, we know that triangles are the most stable of these structures.
  4. Higher-dimensional structures are self-associative, building lattices of distinct dimensionality. Tiling a plane with triangles is trivial. The structure of higher-order lattices is an extrinsic property of the lattice potential.
  5. Lower-order lattices may exist in the empty spaces between cell layers. This is again an extrinsic property of the lattice potential.
  6. Lattice formation is spontaneous. Orientation of expanding lattices is random.
  7. Surface energy at the boundaries between merging lattices of different orientation (a la grain boundaries in metals) provides the energy to compress structures into lower order, producing quasars and super-massive black holes at the center of galaxy formation. In this model, a black hole in three dimensions is a volume bounded by a two-dimensional lattice.
  8. Parthenogenesis occurs through the expulsion of residual lower-order structures from the enclosed surface. In the reference model, these are one-dimensional structures (termed “threads” below). Threads may pass around the polygonal sub-units of the lattice or through them. Threads that penetrate the lattice sub-units are localized, creating loci that we identify with fermions. Fermions interact strongly with similarly localized threads, giving rise to the non-gravitational forces. The potential barrier of the W and Z mass corresponds to a thread-exchange process, which requires reconfiguration of the sub-units.
  9. Captured threads locally distort the lattice. Gravity is a side-effect of the lattice energetics that localizes the distortion.
  10. Dark energy corresponds to the potential energy of lattice compression.

This illustrates how the principles of generative orders can be used to build a simple one-component model of the early universe. Geometrical models are presented in Chapter 4 of Love Works.

Certain details of particle phenomenology appear superficially to be accessible in the context of this model.

  1. Charge corresponds to the number of threads that penetrate a lattice sub-unit (which naturally has three degrees of freedom). Sign is simply a way of characterizing the tendency of fermions to attract fermions with different degrees of thread penetration.
  2. Mass arises naturally when threads pull on each other, causing the loci of thread capture to be dragged through the lattice. From the properties of the first particle family, it would appear that asymmetrical thread configurations must be more disruptive than symmetrical configurations. The equivalence of gravitational and kinetic mass is natural, as both effects correspond to lattice distortions. The equations of special relativity suggest the velocity-dependence of kinetic distortions.
  3. Particle families correspond to distortions of a particle’s lattice sub-unit from its normal configuration.
  4. Conservation of momentum could result from lattice dynamics that tend to reject disturbances, forcing energy back onto the moving fermion. Analogies in material science include superfluidity and superconductivity.
  5. Light could be a self-propagating disturbance in the lattice, achievable only through fermion kinematics. Assuming that gravitational packing of particles causes re-orientation of the lattice at the surface of large bodies, the constancy of the speed of propagation is a local phenomenon (i.e. – a massive body “drags” space around with it).
  6. Light may interact with the lattice as it propagates, causing energy loss that manifests as a shift to lower frequencies. This may explain the microwave background radiation.
  7. A soul is a complex configuration of threads that are supported by but only tenuously bound to the lattice.

These configurations store energy as potential energy due to the associated distortion of the lattice.

Obviously, all of these are conceptual possibilities, whose validity can only be established through construction of a model of the energetics of the interactions between one-dimensional structures. As will become clear in the description of the research program, the list is by no means exhaustive. It is presented to provide a sense of the naturalness of fit between phenomenology and theories that might be elaborated using the principles of generative order.

Generative Order Research Proposal – Part III

Principle of Generative Order

In this section we motivate the principle of generative order and define a reference model to serve as a framework for exploring the challenges in elaborating the principle into a model capable of explaining the known characteristics of particles and their interactions.

Signposts

The preceding survey of the deficiencies of GI theories, culminating with lists of unexplained first-order phenomenology and axiomatic contradictions, is a powerful motivation to search for new principles to guide the construction of alternatives. In proposing generative orders (to be developed below), the author was motivated by the following observations. The observations span the scale of phenomenology from the quantum to the cosmological, in recognition of the connection between these scales established by current physical theory.

The Preponderance of Threes

As observed, we inhabit a universe with three spatial dimensions. The three particle families (of which the first is summarized below) consist of four fermions, with three charge states (the fourth state being uncharged). Finally, the non-gravitational forces (electromagnetic, weak and color) have group-theoretical ranks of 1, 2 and 3.

Particle   Mass (MeV)   Charge   Color   Spin
n          0            0        0       ½
u          138          −1/3     1       ½
d          140          2/3      1       ½
e          0.5          −1       0       ½

Following this correspondence, it seems natural to suggest that the principles that explain our reality of threes should also be able to explain realizable physical realities based upon one, two, four and higher dimensions. This is the fundamental principle of generative order.

In the table above, it is also interesting to note other correspondences. Only fractionally charged particles (u and d) have color, and their masses are far larger than the masses of integrally charged particles (n and e). I also note that particles with odd fractional charge repel each other, but are attracted to the remaining charged particle, of even fractional charge.

Gross Cosmological Structure

Almost all galaxies have super-massive black holes at their center (Galactic Cores, or GCs). Mechanisms for ejection of GCs have been proposed to explain those that do not. In addition, the oldest objects in the universe appear to be quasars. This tends to indicate that quasars may represent the early stages of GC formation, and so that galaxies form through a sudden and enormously violent mechanism, rather than through the gradual coalescence of intergalactic gas.

Secondly, galaxies appear to be clustered on the surface of extremely large voids, lacking any visible matter, but still capable of lensing light. This indicates that the initial stages of the universe must include mechanisms that explain variations in the uniformity of space (in the sense of General Relativity, though not necessarily through the mechanisms it allows).

Core Principles

The statement of generative order provided above is weak. It admits of realities in which a three-dimensional reality is independently established, but does not co-exist with realities of higher or lower dimensionality. Lacking a dynamical result that establishes preference for a three-dimensional reality, it would seem prudent to extend the basic principle of generative order with two others. The three are then:

  1. Realizable physical laws must exist on all orders of dimensionality.
  2. Orders are compositional: elements of lower order combine to produce elements of higher order.
  3. Orders must co-exist, and transitions between orders must be related to recognizable physical phenomena.

Generative Orders Research Proposal – Part II

Assessment of GI Theories

The principle of gauge invariance has underpinned the development of theoretical physics for almost a century. Application of the principle is conceptually simple: experimental data is analyzed to propose “invariants” of physical systems. The (generally simple) equations that describe these properties (energy, momentum, mass, field strength) are then subjected to classes of transformations (rotations, velocity changes, interactions with other particles), and the equations are manipulated until the proposed invariants are maintained under all transformations.
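
The canonical example – standard textbook material, stated in one-dimensional shorthand with charge and units suppressed: multiply the electron wave by a phase that varies from place to place,

ψ → e^(iθ(x)) ψ

All measurable quantities survive unchanged provided the electromagnetic potential simultaneously shifts as A → A + ∂θ/∂x. Demanding that this invariance hold is what forces the photon field into the equations alongside the electron.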

Having recognized gauge invariance as an organizing principle, theorists beginning with Einstein have sought to construct “Grand Unified Theories” that unite all of the invariants in a single framework. There is no fundamental reason for this to be so – the proposition was motivated by reasonable success in explaining experimental studies performed at particle colliders.

Two kinds of difficulties have arisen for the proponents of these theories. First, particle colliders have become enormously expensive and time-consuming to construct. That has been ameliorated somewhat by the introduction of astrophysical data, and the attempts to connect the history of the early universe to the properties of the gauge theory. In the interim, however, the enormously prolific imaginations of the particle theorists were insufficiently checked by experimental data. This led them to emphasize numerical tractability in constructing their theories.

Given this situation, we should perhaps not have been surprised to learn that as astrophysics observatories proliferated, the theorists faced intractable difficulties in reconciling predictions with data.

As if these problems were not serious enough, the focus on explaining observations of the behavior of particles under unusual conditions has led to a certain myopia regarding the intractability of what I would call “first order” phenomena: things that are obvious to us in our every-day lives, but have yet to be satisfactorily explained by theory.

We continue with an enumeration of defects.

The Supremacy of Formalism

The program for constructing a Grand Unified Theory of physics is a theoretical conceit from the start. There is no a priori reason to expect that summing the effects of independent forces is not a satisfactory and accurate means of describing the universe.

Once the program is undertaken, however, every term in every equation falls under the microscope of formal criteria. For example, the Higgs field was motivated as a means of restoring parity invariance to the Dirac equation. Similarly, sparticles were introduced to eliminate the distinction between particles with Bose and Fermi statistics. The “strings” of super-string theory were invented to cut off integrals that produce infinities in calculations of particle kinematics. Although these innovations are sufficient to achieve consistency with phenomenology, there is absolutely no experimental evidence that made them necessary. They were motivated solely by abstract formal criteria.

The tractability of formal analysis also has a suspicious influence over the formulation of particle theories. The dynamics of two-dimensional “strings” in super-string theory are susceptible to Fourier analysis. However, Fourier modes are normally far-field approximations to more complex behavior in the vicinity of three-dimensional bodies. In a three-dimensional manifold such as our reality, it would seem natural that particles would manifest structure as toroids, rather than as strings. Unfortunately, the dynamics of such structures can be described only using computational methods, making them an inconvenient representation for analysis.

Finally, while the Large Hadron Collider (LHC) is now marketed principally as a Higgs detector, the original motivation for its construction was a formal problem in particle kinematics: the Standard Model predicted that reaction rates would exceed unity in the vicinity of momentum transfers of 1 TeV. Something truly dramatic was expected from the experimental program, which, at least from the press reports, appears not to have manifested.

Violations of Occam’s Razor

A widely held principle in the development of scientific theory, Occam’s Razor recognizes that a simpler theory is more easily falsified than a complex theory, and so should be preferred as a target for experimental verification. By implication, theorists faced with unnecessary complexity (i.e. – complexity not demanded by phenomenology) in their models should be motivated to seek a simpler replacement.

The most egregious violation of the principle is the combinatorics of dimensional folding in “big bang” cosmologies derived from super string theories. There are tens of millions of possibilities, with each possibility yielding vastly different formulations of physical law. In recent developments, the Big Bang is considered to be the source of an untold number of universes, and we simply happen to be found in one that supports the existence of life.

The Higgs as a source of mass is also an apparent superfluity. In the original theory, each particle had a unique coupling constant to a single Higgs field. The number of parameters in the theory was therefore not reduced. More recently, theorists have suggested that there may be multiple Higgs fields, which is certainly no improvement under the criteria of Occam’s Razor.

The vastly enlarged particle and field menageries of GUTs are also suspicious. There are roughly ten times as many particles and fields as are observed experimentally; the addition of seven extra spatial dimensions is also of concern.

Unverifiable Phenomena

Particularly in the area of cosmology, the theories take fairly modest experimental results and amplify them through a long chain of deduction to obtain complex models of the early universe. Sadly, many of the intermediate steps in the deduction concern phenomena that are not susceptible to experimental verification, making the theories unfalsifiable.

The point of greatest concern here is the interpretation of the loss of energy by light as it traverses intergalactic space. In the reigning theory, this is assumed to be due to the special relativistic “red shift” of light emitted from sources that are moving away from the Earth at a significant fraction of the speed of light. Of course, no one has ever stood next to such an object and measured its velocity. In fact, the loss of energy is interpreted (circularly) as proof of relative motion.

The “red shift” interpretation is the principal justification of the “Big Bang” theory, which again is a phenomenon that cannot be directly verified. There are difficulties in the theory concerning the smoothness and energy density of the observable universe. These are purported to be side-effects of “inflationary” episodes driven by symmetry breaking of Higgs-like fields. No demonstrated vacuum potential manifests a sufficient number of e-foldings of space, and the relevant energy scales are many orders of magnitude beyond the reach of our experimental facilities.

Finally, the 2011 Nobel prize was awarded for studies indicating that the expansion of the universe is accelerating. The inference was achieved by looking at the spectra of distant light sources, and determining that they no longer followed the predictions of Hubble’s law. However, extrapolating from those measurements into the distant future is troubling, as even in the context of the Big Bang model, this opens the door to additional effects that may mitigate or reverse the predicted acceleration. Obviously, since these effects would occur on time-scales exceeding the existence of the Earth (which will be vaporized when the sun’s photosphere expands), they will never be verified.

Axiomatic Contradictions

As discussed in the previous section, the lack of an explanation for mass verges on an axiomatic need in the theory. That is to say, it appears to require a fundamental reevaluation of the abstract principles used to construct physical theories.

There are at least two other phenomena that directly violate fundamental axioms in the theory. The first is the existence of non-uniformity in the structure of space-time that is not associated with matter (so-called “dark energy”). Special relativity and all of its dependent theories (i.e. – all of particle physics) rest upon the assumption that space is empty. In the era in which special relativity was formulated, evidence (the Michelson-Morley experiment) suggested that there was no “luminiferous ether” – no medium in which electromagnetic radiation propagated. Dark energy is in fact an ether, and its existence requires a dynamical explanation for the Michelson-Morley results. (It is my opinion that this is why Einstein called the vacuum energy the worst idea he ever had – the existence of such a term undermines all of special and general relativity.)

Finally, the work of the Princeton Engineering Anomalies Research team demonstrated couplings between psychological states and the behavior of inanimate objects that are outside of the modes of causality allowed by existing physical theory. The rejection of these findings by the mainstream physics community indicates that accommodating them is going to require rethinking of the axioms of the theory. The most extreme examples concern the structure of time – the standard model allows non-linear causality only at quantum scales, and some studies of the “paranormal” appear to indicate non-causal behavior (information preceding effects) on macroscopic scales.

Generative Orders Research Proposal – Part I

Summary

The author proposes to establish research partnerships to develop a conceptual model of fundamental physics that has the potential to place spirituality on a firm scientific basis. The motivation for the scientific program is well-grounded in phenomenology, and the author outlines correspondence with established theory as limiting cases of the proposed model.

The author recognizes the gnostic implications of the program for society. Certain rules of engagement must be observed in doing such work – generally, the first application of any new technology is to obtain competitive advantage. To mitigate such outcomes, the author has written a book that explains, in layman’s terms, the disciplines required to safely engage these principles, and the long-term personal and global consequences of failing to observe them. The proposal includes support for updating the book, entitled “Love Works”, and for professional preparation and publication.

The proposal seeks not to resolve all questions regarding the proposed class of theories. It is intended to stimulate thinking that should lead to independent research and funding by the research community. The proposal does include publication of a single paper that may demonstrate a significant point of departure from current models of particle physics and cosmology. Successful publication should stimulate “out-of-the-box” thinking by the research community, followed by independent research proposals.

Qualifications

The author’s principal qualifications for this work are selflessness and a commitment to Life in all of its forms. Many of the ideas presented were formulated through engagement with forms of sentience not recognized by many scientists.

With regard to the fundamental physics, the author received his Ph.D. in high-energy particle physics in 1987 and was active as a Post-Doctoral research fellow until 1992. Most of the conceptual underpinnings of modern particle theory and cosmology were developed during this period, and his observation of their development makes the author well-suited to recognize their short-comings.

However, the author recognizes his limitations with regards to the skill-sets of the modern particle theorist, including large-scale numerical modeling. The author will develop relationships at institutions with large-scale computational physics programs to collaborate in the program.

Motivations

The principal motivation for this work is to heal the divide between science and religion that promotes fear, anxiety, anger and apathy in those confronted with the enormous global challenges of the 21st century. The author believes that science is a process of revelation that can embolden and empower those with a genuine desire to be of service to the end of healing the world. Religion is concerned with the development of disciplines that enable us to work safely with the requisite spiritual energies.

While fostering spiritual maturity is critical to a successful execution of the overall program, the development of supporting resources is fairly well advanced. (The author has published his moral and ethical philosophy at www.everdeepening.org, and Love Works is a popularization aimed at the culturally dominant community of Christian believers.) The author considers publication of Love Works to be a critical adjunct, and will not pursue the scientific program separately.

Plan of Exposition

Love Works is provided as an attachment for the evaluation. The focus of the exposition will therefore be to motivate and describe the scientific program. The scope of the development is far greater than necessary to complete the work of the first year. As an alternative to theories that have had thousands of man-years invested in their development, it is important to establish plausible paths of investigation for the obvious problems that must be overcome in developing the new class of theories, characterized as theories of “Generative Order” (GO).

To be fair, the discussion starts with an enumeration of the failures of the prevailing class of theories, which are characterized as theories based upon “Gauge Invariance” (GI).

Every physical theory has a set of fundamental constants. In current theories, the fundamental constants include the speed of light, the particle masses, Planck’s constant, and the strengths of the fundamental forces. The principal challenge in qualifying theories of Generative Order is determining the number and values of those constants. The exposition proposes a series of modeling problems that could be undertaken to evaluate a specific theory and determine its constants. Each modeling problem addresses a critical issue in establishing that a theory of Generative Order yields the current theory as a limiting case (just as Relativity and Quantum Mechanics have Newtonian physics as a limiting case).