Quantum Inversion

In modeling large-scale systems, wave equations are often useful approximations. So, while water at the molecular scale is made up of molecules that bounce around like billiard balls, in our swimming pool the waves look perfectly smooth, and we can predict their behavior using wave theory.

A researcher at Caltech has applied this approximation to the modeling of very large astronomical objects: supermassive black holes and their entourage of stars and planetoids. In pursuing the mathematics, he discovered that the system behaves according to a wave equation that looks just like the equation that governs slowly moving subatomic particles: Schrödinger’s equation.

But the equation alone does not generate “quantum” behavior in the objects described by the equation. That is generated by the “exclusion” rule obeyed by fermions (Pauli’s exclusion principle): all particles of any one type (such as the electron) are indistinguishable, and therefore the equation describing the behavior of the system must be unchanged if any two particles are exchanged, with one exception: the amplitude of the wave changes sign.

Going back to our swimming pool, this is like saying that if we exchanged any two water molecules, the wave would turn into its mirror image: where there were peaks in the wave, there would now be troughs (and vice versa).

I am absolutely certain that this makes as little sense in describing the behavior of supermassive black holes as it does in describing the behavior of pools of water.

That a working physicist could so casually misrepresent the nature of the system reflects the subtlety of quantum concepts, and the tempting ease with which those concepts are used to manipulate public fascination.

Intelligence and Creativity

Joseph at Rationalizing the Universe explores the modern formulation of living with uncertainty, rooted in Gödel’s theorem of logical incompleteness. The following discussion ensued:


You point out correctly that Gödel’s theorem is restricted to a fairly narrow problem: proving that a proof system is “correct,” i.e., that its axioms and operations are consistent. In other words, we can’t take a set of axioms and apply the operations to disprove any other axiom.

This seems to lead to the conclusion that we can’t trust our proofs of anything, which means that there are no guarantees that our expectations will be met. Unfortunately, expectations are undermined by many other problems, among them determination of initial conditions, noise, and adaptation. The last is the special bête noire of sociology, as people will often violate social norms in order to assuage primitive drives.

At this point in my life, I am actually not at all troubled by these problems. Satisfaction is not found in knowing the truth, it is found in realizing creative possibilities. If we could use mathematics to optimize the outcome of social and economic systems, we would have no choices left. Life would become terribly boring. So what is interesting to me is to apply understanding of the world to imagine new possibilities. Mathematics is a useful tool in that process, particularly when dealing with dumb matter.

This brings me back to the beginning of the post: you state that “mathematics is the unspoken language of nature.” If there is anything that Gödel’s theorem disproves, it is precisely that statement. Mathematics is a tool, just as poetry and music are tools. At times, both of the latter have transported my mind to unseen vistas; mathematics has never had that effect.


You raise a very interesting point; if we could optimise everything then would we take all of the joy out of being…. you may well be right. I know I get a lot of my satisfaction from the quest to know more. Although I disagree that Gödel’s theorems disprove my original statement in this sense; language is essentially about describing things. That is why you can have different languages but they are easily translatable…. bread/pan/brot etc…. we all know what they mean because they all describe the same thing. In exactly the same way, mathematics describes things that actually exist; that isn’t to say nature is mathematics at all – mathematics is the language of nature but it is just as human in its construction as the spoken word. But is matter not matter because a human invented the label? Matter is matter.

To me, these theorems don’t break down all of our proofs; but what they do show is a vital point about logic. One which I think is going to become an increasingly big issue as the quest to understand and build artificial intelligence intensifies – can we ever build a mind as intelligent as a human’s when a human can know the answer to a non-programmable result? We hope so! Or rather I do – I do appreciate it’s not for everyone.


I appreciate your enthusiasm, but I must caution that the mathematical analogies in classical physics cannot be extended in the same way to the quantum realm. Richard Feynman warned us that there is no coherent philosophy of quantum mechanics – it is just a mathematical formulation that produces accurate predictions. Ascribing physical analogies to the elements of the formulation has always caused confusion. An extreme example was found in the procedure of renormalization, in which observable physical properties such as mass and charge are produced as the finite ratio of divergent integrals.

Regarding human and digital intelligence: one of the desirable characteristics of digital electronics is its determinism. The behavior of transistor gates is rigidly predictable, as is the timing of the clock signals that control the propagation of signals through logic arrays. This makes the technology a powerful tool for us in implementing our intentions.

But true creativity does not arise from personal control, which only makes me loom larger on the horizon of others’ lives, threatening (as the internet troll or Facebook post-aholic does) to erase their sense of self. Rather, creativity in its deepest sense arises in relation, in the consensual intermingling of my uniqueness with the uniqueness of others.

Is that “intelligence?” Perhaps not – the concept itself is difficult to define, and I believe that it arises as a synthesis of more primitive mental capacities, just as consciousness does. But I doubt very much that Artificial Intelligence is capable of manifestations of creativity, because fundamentally it has no desires. It is a made thing, not a thing that has evolved out of a struggle, spanning billions of years, for realization. Our creativity arises out of factors over which we have no control: meeting a spouse-to-be, witnessing an accident, or suffering a debilitating disease. We have complex and subtle biochemical feedback systems which evolved to recognize and adjust to the opportunities and imperatives of living. We are a long way from being able to recreate that subtlety in digital form, and without those signals, meaningful relation cannot evolve, and thus creativity is still-born.

Nucleons in a Bunch

The world of the very small is impossible to observe in complete detail. In the everyday world, once the billiard ball is struck, we can predict the final configuration on the pool table. This is because the method we use to observe the initial positions and motions of the balls – vision – doesn’t appreciably change those positions and motions. In the microscopic world described by quantum mechanics, however, Heisenberg’s uncertainty principle tells us that we can’t measure both position and velocity with arbitrary accuracy.

A similar principle affects the theory of quantum mechanical rotations. In principle, a rotating body has a total angular momentum (its propensity to keep spinning) and an orientation of the angular momentum in space. Since we have three spatial directions in our reality, there are three components of angular momentum. However, quantum mechanical theory tells us that we can know the total angular momentum, but any attempt to measure one of its components will disrupt the values of the other two components.

This leads to some confusion in interpreting the theory, even among physicists. The leader of my Ph.D. thesis project, hearing that I was doing well in my advanced coursework on quantum mechanics, expressed his confusion regarding the underlying physics of the system we were studying (muons in a magnetic field). I explained to him that the other two components still existed and influenced the time-evolution of the muon, but that in the end only a single component could be measured.

This was a man who intimidated his collaborators with his brilliance and drive, and yet no one had ever clarified for him the basics of the quantum theory of angular momentum. This is not uncommon – often the words used to describe quantum processes do not reflect the underlying mathematics of the theory. This leaves a lot of room for physicists to overplay the significance of their measurements.

Today we have a report from an experimental study that confirms that some quantum objects are not symmetric. This is not surprising, in some sense. The system, the nucleus of the barium atom, is a swirling stew of 56 protons and 88 neutrons. What the study reveals is that some number of these particles can clump together in a particularly ordered fashion. Once they achieve that configuration, the remaining protons and neutrons can’t push their way into the structure, and end up hanging like a barnacle on the outside.

Here’s a way of visualizing this: let’s say that we have twelve of those little magnetic balls. We can organize eleven of them into a nice little tetrahedron. But the twelfth ball is going to be stuck on the outside of the tetrahedron like a barnacle. It is going to ruin the regularity (what physicists call symmetry) of the assembly.

Why is this loss of symmetry exciting? Well, it seems to be a pretty natural consequence of self-organizing aggregates. But it’s also related to some principles used to guide the development of quantum mechanical theories. Remember, we can’t see this world very clearly, and touching its inhabitants disrupts their behavior. So to guide the development of theory, physicists have come up with abstract mathematical principles. Three important ones are charge (C), parity (P) and time (T) inversions. These state, respectively, that the equations that describe the quantum world should not change if:

  • particles are replaced with anti-particles
  • the particles are observed in a mirror, and
  • the universe is run backwards.

In actuality, it’s hard to create theories that violate all of these principles simultaneously (what is called CPT violation). However, the weak force that controls radioactivity is known to violate parity (P), though invariance is largely restored under the combined inversion CP.

So what is the significance of the asymmetry of Barium-144? The authors claim that it demonstrates parity violation in the strong and electromagnetic forces. The claim is based upon the observation that, when looked at in a mirror, the barium nucleus will have its bump on the opposite side.

But that is not what parity violation means! The mirror-image barium nucleus is still allowed under the equations that describe its structure. In fact, it can also be obtained simply by walking around and observing the nucleus from the other side. That is certainly allowed in the theory.

We can contrast this with parity violation in neutrinos. Neutrinos, which participate only in the weak interactions, always have their angular momentum aligned against their direction of motion. They are “left-handed.” Observed in a mirror, however, that orientation changes: the direction of motion is reversed, but not the angular momentum. Thus the neutrino becomes “right-handed,” which is not known in nature, and so the equations of the weak interaction are violated by parity inversion. However, by adding charge inversion, the violation is removed: anti-neutrinos are indeed right-handed.

So in this case I’m afraid that those making so much of the Barium-144 asymmetry have gotten their “nucleons in a bunch” for no good reason.

In general, the obscurities of quantum phenomena are not well understood even by physicists themselves. When they trumpet a great discovery, then, you should always ask yourself whether the practical implications of their work merit continued support by the public.

Unless, of course, you think of science as a cultural investment, like art or politics.

Quantum Entanglement

John Markoff at the New York Times has been heralding an experiment at Delft as disproving Einstein’s view of the universe. While I have my own issues with Einstein, I am not as impressed with the Delft demonstration as Markoff and others appear to be.

The quantum world is incredibly mysterious to us – we cannot observe its inner workings directly, but only observe its side-effects. This means that we can’t make statements about the behavior of any one system of particles, but only about many systems in aggregate.

Let me give a classical example. When we toss a coin in the air, we know that there is a fifty percent chance that it will land “heads up.” If we could measure the coin’s position and rate of spinning and also knew precisely the properties of the floor that it would land on, as it was in flight we could calculate precisely which way it would land. But we can’t do that, so we believe that there is an element of “chance” in the outcome. In the terminology of quantum mechanics, we might say that the coin in flight is in a “quantum” state: 50% heads up and 50% heads down.

Now let’s say that we put two people in a room and asked them to toss a coin. Since we can’t observe the thoughts in their mind, we might consider them to be in an “entangled” state. We know that if we ask one the answer, we’ll receive the same answer from the other We then separate them by miles and ask the first one what the result of the toss was. If she says “heads,” we know instantaneously that the second person will also say “heads.” So we might say that the state of the pair has “collapsed” to “heads” instantaneously, and we know what answer will be given by the second person.

But the information didn’t travel instantaneously from one to the other. The two people from the room knew all along what the answer was.
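
This “correlation without communication” can be made concrete with a toy simulation (my own illustration, not the Delft protocol): give the two people a shared result before they separate, and their distant answers agree in every trial with no signal passing between them.

```python
import random

def shared_coin_trial():
    # The toss happens while the two people are together in the room.
    toss = random.choice(["heads", "tails"])
    # Each person carries away a private copy of the result.
    alice, bob = toss, toss
    # Miles apart, each is asked independently; no message is exchanged.
    return alice, bob

# The answers agree in every one of 10,000 trials.
results = [shared_coin_trial() for _ in range(10_000)]
print(all(a == b for a, b in results))  # True
```

The agreement is built in at the start, which is exactly the point of the analogy: nothing travels between the two when the question is asked.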

If this is actually the nature of quantum entanglement of very small particles such as electrons (the subject of the experiment in Delft), why do scientists become so confused about the process of information transfer?

That chance in coin tossing actually reflects the randomness of the tossing process: the position of the coin on our thumb, the effort of our muscles, the condition of the floor. Only with great practice could we ensure that all of these were identical on each toss. If that investment in discipline were made, we could actually control the outcome of the toss, achieving heads 100% of the time.

Now let’s say that, unbeknownst to us, the coin tossers are actually trained in this skill. How would we find out? We couldn’t find out from one experiment. Even after a second experiment, there’s still a one-in-four chance that a random toss would achieve “heads” in both cases. No, we’d have to run many experiments, and decide how improbable the outcome would have to be before we accepted that something was wrong with our theory of coin tossing.

In other words, the confusion comes in because the philosophy of quantum mechanics confuses the problem of proving the correctness of the theory with the actual behavior of the particles that produce any specific outcome. In our coin-tossing case, the quantum theory holds that we’ll get heads 50% of the time. But to prove that, we have to do many, many experiments.

Let’s extend this to the problem of Schroedinger’s cat: a cat is in a box with a vial of poison gas and a radioactive isotope. When the isotope decays (at some random time), a detector triggers a hammer to smash the vial. In the “accepted” philosophy of quantum mechanics, the state of the isotope evolves over time, being partially decayed. This means that the state of the cat is also partially dead. When we open the box, its “wave function” collapses to one state or the other.

We can clarify this confusion with a thought experiment: in our coin-tossing example, let’s say that we put coins in boxes and had children run around the room to shake them up, randomizing their state. In quantum mechanical terms, we would say that the state of any one coin was “50% heads.” When we look in a box, the state of that coin is determined: its wave function collapses to either heads or tails. It is only by observing all of the coins, however, that we can determine whether the children actually were successful in randomizing the state of the coins.
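
A quick simulation (my own sketch, with an arbitrary seed) shows the distinction: opening any one box yields a definite result, while only the aggregate over many boxes tests the randomization.

```python
import random

def shake_boxes(n_boxes, seed=None):
    """Children shake the boxes: each coin ends up heads or tails at random."""
    rng = random.Random(seed)
    return [rng.choice(["heads", "tails"]) for _ in range(n_boxes)]

boxes = shake_boxes(100_000, seed=1)

# Opening one box "collapses" that coin to a single definite value...
print(boxes[0])

# ...but only the whole collection tells us whether the shaking worked.
fraction_heads = sum(c == "heads" for c in boxes) / len(boxes)
print(round(fraction_heads, 2))  # ≈ 0.5 for a well-shaken batch
```

Each coin has a definite face the moment the shaking stops; the “50% heads” statement is a property of the ensemble, not of any single box.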

By analogy with this, we can only prove Schrödinger’s prediction about the “deadness” of cats by performing many experiments. At any instant, however, each cat in its box is either alive or dead. It is unfortunate that we’d have to kill very many of them to determine whether the theory of radioactive decay was correct.

So I side with Einstein: I don’t see any mysterious “action at a distance” in the experiment at Delft, and I certainly don’t see it as proof that information can travel faster than the speed of light.

My own proposition is very different: it is that the dark energy that permeates space and constrains the speed of light can have holes opened in it by the action of our spirit. Once it is removed, the barriers of time and distance fall. When such bonds are created through fear, the subject of the fear seeks to escape them, and the strength of the bond dissipates. When the bonds are created in love, the entanglement persists by mutual consent, and grows inexorably in strength and power, eventually sweeping all else before it.

What kind of confirmation could the physicists at Delft provide of this? I’m not certain, but it would be an experiment in which the electrons were separated, and a manipulation of one was reflected in the other. In our coin-toss analogy, it would be as if the two people were separated before the coin toss, and the second knew instantly the result of the toss performed by the other. From the video they made, I don’t think that’s what is happening at Delft.

This post in memoriam of Professor Eugene Commins who taught my upper-division course in Quantum Mechanics at UC Berkeley in 1981, and who benefited during his doctoral studies at Princeton from conversations with Einstein.

The Relative Incoherence of Special Quantum Spirituality

We in the West see the attempt to reconcile physics and spirituality as an Eastern concern. Indeed, it is the Vedantic philosopher Deepak Chopra who most vigorously engages Western science in that debate. The Western prejudice, however, is supportable only for those with a selective memory. In the wake of the 18th- and 19th-century discoveries in magnetism, “Mesmerists” were popular in Europe. The practitioners would demonstrate their mind-control abilities by touching the cranium of a susceptible assistant. When one was brought to trial for fraud, the scientists of the era actually testified in his defense.

Keeping in mind that history, I tend to be sympathetic to Chopra and his partisans. Unfortunately, they are chasing after rainbows, and creating a lot of confusion as a result.

Richard Feynman, the brilliant quantum theorist, observed that quantum mechanics was a mathematical procedure without philosophical foundation. That situation is unique to 20th-century physics. Prior to that time, the scientist could always build mental pictures of the interactions between the elements of the model. This was a practice that physicists attempted to carry over to Quantum Mechanics and Special Relativity as they evolved, with unfortunate results.

This desire to provide explanations was carried forward during an era in which the basic tenets of the theory were still being worked out. Sometimes the preliminary theory would be applied in a way that later scientists would consider incomplete, but a sensible answer would be obtained. The answers were published, often with popular interpretations of what was going on in the underlying reality. What is perhaps not surprising is that the popular interpretations are more widely known today than the actual theory itself. Because the interpretations were based upon bad science, they create confusion in the public mind.

To illustrate: in Special Relativity, Einstein held that clocks appear to tick more slowly when they move rapidly with respect to the observer. Based upon this, a thought experiment was constructed involving two twins, one of whom travels to a distant star and returns much younger than his sibling who stayed home on Earth. The calculation assumes, however, that the traveling twin instantaneously reverses his speed and direction upon arrival at the distant star. Obviously, if the space ship were designed that way, the traveling twin would be just so much pâté upon returning to Earth. No, the ship must decelerate and accelerate. When that part of the mission plan is included in the calculations, it turns out that the special relativistic effects disappear completely. The twin paradox is a hoax.

In quantum mechanics, we have the famous “wave-particle duality” and “wave function collapse”. Wave-particle duality was “proven” by electron self-interference: an electron impinging upon a screen with two closely-spaced slits will not be seen in two spots on the far side of the screen, as though it had passed through one slit or the other, but instead be distributed over numerous islands of intensity, as though it was a wave that had passed through both slits. The problem in this calculation is that in quantum mechanics, the behavior of any one electron can only be understood by considering the behavior of all the electrons in the system. The failure to include the electrons in the screen in the calculation leads to at least one paradox, and precludes alternative explanations of the observations.

“Wave function collapse” was an extension of “wave-particle duality” to scattering problems. In classical mechanics, when two billiard balls collide, we can predict the final state of the balls from the initial state. Not so in quantum mechanics: scattering objects spray about more broadly. However, the rules of energy and momentum conservation still apply. Therefore, measuring the final state of one of the scattered particles determines the state of the second. The first measurement causes the possible final states of the second to “collapse” to a single allowed result. This led to the idea that the conscious act of observation affects the behavior of physical systems. The “Schrödinger’s cat” thought experiment is the popular expression of this idea. But there are many types of uncertainty in quantum mechanics, and just because the observer doesn’t know the final state of the particles doesn’t mean that the particles don’t have a definite state. They may “know” perfectly well what their direction and speed of motion is.

The weak practice and explanations offered by early quantum and relativity theorists open the door to mystics seeking to explain their experience of reality. The acausal connectedness of mystical events (what Jung called “synchronicity”) seems to correspond to the complex structure of time in special relativity. The interaction between consciousness and physical events in Schrodinger’s world corresponds to the mental powers of the guru.

But the fact is that the theories, while describing unfamiliar behavior in fundamental particles, are completely inapplicable to the behavior of macroscopic composites such as people. The probability of seeing quantum behavior in a macroscopic object is so minute that the Eastern mystic must hold his experience as a refutation of quantum mechanics. That leads in the direction of new physics.

At this point, I would argue that the most powerful laboratories of the modern era will be our minds, rather than the billion-dollar observatories that the scientific-industrial establishment insists the public must fund. The ultimate proof of the power of a theory will be not in how it empowers us to manipulate objects without personality, but rather in the degree to which it makes us transparent to the flow of Divine Love.

Way Beyond Teflon

In imagining a universe filled with an invisible substance, it is natural to use air as an analogy. We then run immediately into trouble with Newton’s first law of motion, which is also an assumption in Einstein’s theories:

Every object in a state of uniform motion tends to remain in that state unless acted upon by an external force.

We know that air actively slows the movement of objects passing through it. Why aren’t moving objects slowed as they pass through Dark Energy?

One way around the problem is to assert that Dark Energy is a wall-flower: it doesn’t interact with anything else. That’s a prevalent assumption, and it causes me to remember the early history of thermodynamics. In building a theory of heat, early investigators, noticing that heat moved from place to place without changing the substance it occupied, conceived of caloric, an invisible field that permeated the spaces between atoms. That didn’t have much explanatory power, and was rapidly replaced by theories that explained heat as disordered motion of atoms.

Astrophysicists tell us that the universe is a pretty cold place – only a few degrees above the coldest temperature possible. Study of systems at these temperatures has revealed some amazing behaviors. For purposes of our discussion, liquid helium is an interesting example because it exhibits superfluidity, which allows objects to move through it without resistance. But superconductivity – materials that pass electricity without resistance – is another consequence of the basic principles that determine the behavior of really cold systems. Both liquid helium and superconductors, by the way, are extremely important technologies in building facilities such as CERN.

Liquid helium is particularly simple because it bonds only very weakly, which is why it is liquid at temperatures that cause almost every other element to freeze. For illustration, I’m going to show a model system that shows atoms in a square two-dimensional lattice. The details may not apply to liquid helium, but I have reason to believe that they might apply to dark energy.

Imagine that we have a tank filled with liquid helium. At very cold temperatures, the atoms stack uniformly in the tank.
Super Fluid Lattice
Such arrangements are said to have high order. They are typical of crystalline materials, including many solids. One of the upshots is that it’s difficult to move a single atom without moving the entire collection. That’s because gravity presses the volume into a compact mass, which means that the atoms are compacted slightly and therefore repel each other. So moving one helium atom causes the atom it’s moving towards to move away. The cold here is important: if the lattice were vibrating somewhat, there would be little gaps that could absorb some of the distortion, and so the parts of the lattice could change independently. It’s the lack of such vibrations that forces the lattice as a whole to respond to changes.

Now let’s imagine that we place an impurity into the lattice.
Impurity in Super Fluid
This time a slight distortion of the arrangement will occur. The atoms nearest the impurity will indeed shift their positions slightly, but since the atoms at the walls of the container can’t move, the distortion will be localized. What’s interesting to consider is what might happen if two defects are created. Will the disturbance to the lattice be minimized if the defects are brought together, or if the lattice acts to separate them? The astute student of physics will see that this thought leads to a model for gravity.

Now let’s propose that somehow our impurity begins to move.
Slow Impurity
How will the lattice react? Well, again, the atoms at the walls can’t move. The impurity will push against the atom in front of it, and leave a gap behind it. So long as the speed of the impurity is much less than the speed of sound in the lattice, only the nearest atoms will be disturbed. Obviously, the solution for restoring the order of the lattice is for the forward atoms to migrate to the sides as the impurity passes, displacing the atoms already on the side so that they fill the gap left by the passing impurity. When they reach the back, the atoms will come to rest by giving their energy back to the impurity. This is the essence of superfluidity: the impurity loses energy to the lattice only temporarily.

What is interesting to note is that in quantum mechanics, when calculating collisions between two charged particles, we have to assume that the particles are constantly emitting and re-absorbing photons. This is analogous to the situation in the superfluid: the impurity is constantly losing energy and then regaining it.

Finally, let’s consider an impurity moving closer to the speed of sound in the lattice. In this case, the distortions affect more than the nearest atoms, and the circulation becomes more widespread.
Fast Impurity
It’s important to note that energy is stored in the circulatory motion of the helium atoms. They are moving, just as the impurity is moving – but in the opposite direction, of course. The closer to the speed of sound, the more energy is stored in the circulation. This means that it becomes harder and harder to make the impurity move faster as it moves more and more nearly at the speed of sound.

In Special Relativity, Einstein showed that particles become harder and harder to accelerate as they come closer and closer to the speed of light. The relationship is (m₀ is the mass of the particle at rest):

m = m₀ / (1 − v²/c²)^(1/2)
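
Plugging numbers into this relation shows how steeply the effective mass grows as v approaches c (a quick sketch of my own, with v/c as the input):

```python
import math

def mass_ratio(v_over_c):
    """m / m0 = 1 / sqrt(1 - v^2/c^2), the relativistic mass increase."""
    return 1.0 / math.sqrt(1.0 - v_over_c ** 2)

for beta in (0.1, 0.5, 0.9, 0.99, 0.999):
    print(f"v/c = {beta}: m/m0 = {mass_ratio(beta):.2f}")
```

The ratio diverges as v/c approaches 1, mirroring the superfluid picture above: ever more energy goes into the circulation around the impurity as it nears the speed of sound.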

Again, we see some strong correspondence between superfluidity and the behavior of particles in both special relativity and quantum mechanics. The big difference is that, while Richard Feynman famously stated that quantum mechanics was merely a mathematical procedure without any explanation, when applying the superfluid analogy to dark energy, it seems that at least some previously mysterious quantum and relativistic phenomena are simple to understand.

For more on models of particle mass, see That’s the Spirit.

Generative Orders Research Proposal – Part IV

Reference Model

Having advanced the principles of generative orders, we find ourselves in a situation somewhat similar to that faced by quantum theorists after wave-particle duality was advanced. A number of experiments appeared to violate the principles of Classical Mechanics (e.g., the double-slit experiment, electronic excitations of the hydrogen atom, and the photoelectric effect). Progress was achieved by generalizing the methods of classical mechanics (the Hamiltonian and Lagrangian equations) into differential equations through Fourier analysis.

The problem in the case of generative orders is more difficult. The principle does not generalize existing theory into new realms of application – it serves to supplant existing theories, stretching back to Special Relativity and quantum mechanics. Additionally, the enumerated principles are abstract. They do not drive us to a specific formulation of physics in one dimension. A number of alternatives may be mathematically feasible.

Lacking a definite starting point for analysis, nothing short of an intellectual Big Bang would produce a fully elaborated theory that explains everything that is known about particle physics and cosmology. That does not exclude thoughtful exploration of specific possibilities. In this section, we consider a simple model (narrative here), elaborated to the point that conceptual correspondence with known phenomenology is established. The model is sufficient to support development of model potentials (as outlined in the research program), and therefore to advance theoretical insight and analysis methods that can be applied to other models.

  1. The initial state of the universe is a disordered but “cold” (at least as compared to Big Bang theories) collection of one-dimensional structures.
  2. Physics of one dimension includes a mechanism of segmentation (or quantization). The W/Z mass may establish a scale for this segmentation (see item 8 in this list).
  3. Folding or bonding on segmentation boundaries produces higher-dimensional structures. Geometrically, we know that triangles are the most stable of these structures.
  4. Higher-dimensional structures are self-associative, building lattices of distinct dimensionality. Tiling a plane with triangles is trivial. The structure of higher-order lattices is an extrinsic property of the lattice potential.
  5. Lower-order lattices may exist in the empty spaces between cell layers. This is again an extrinsic property of the lattice potential.
  6. Lattice formation is spontaneous. Orientation of expanding lattices is random.
  7. Surface energy at the boundaries between merging lattices of different orientation (à la grain boundaries in metals) provides the energy to compress structures into lower order, producing quasars and super-massive black holes at the centers of galaxy formation. In this model, a black hole in three dimensions is a volume bounded by a two-dimensional lattice.
  8. Parthenogenesis occurs through the expulsion of residual lower-order structures from the enclosed surface. In the reference model, these are one-dimensional structures (termed “threads” below). Threads may pass around the polygonal subunits of the lattice or through them. Threads that penetrate the lattice sub-units are localized, creating loci that we identify with fermions. Fermions interact strongly with similarly localized threads, giving rise to the non-gravitational forces. The potential barrier of the W and Z mass corresponds to a thread-exchange process, which requires reconfiguration of the sub-units.
  9. Captured threads locally distort the lattice. Gravity is a side-effect of the lattice energetics that localizes the distortion.
  10. Dark energy corresponds to the potential energy of lattice compression.
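
The grain-boundary picture in items 6 and 7 can be made concrete with a toy calculation. The sketch below is entirely my own illustration, not part of the reference model: it assumes a triangular lattice (60-degree rotational symmetry, per item 3) and borrows a Read–Shockley-style energy form from materials science; the function names and the energy scale `e0` are hypothetical.

```python
import math
import random

def mismatch_angle(theta1, theta2, symmetry_deg=60.0):
    """Smallest rotation (degrees) relating two triangular-lattice grains.

    A triangular lattice is invariant under 60-degree rotations, so grain
    orientations are compared modulo that symmetry.
    """
    d = abs(theta1 - theta2) % symmetry_deg
    return min(d, symmetry_deg - d)

def boundary_energy(theta_deg, e0=1.0, theta_max=30.0):
    """Toy Read-Shockley-style grain-boundary energy per unit length.

    Purely illustrative: low-angle boundaries cost little, and the energy
    rises toward the maximum possible mismatch (30 degrees here).
    """
    if theta_deg <= 0.0:
        return 0.0
    t = min(theta_deg / theta_max, 1.0)
    return e0 * t * (1.0 - math.log(t)) if t < 1.0 else e0

# Randomly oriented grains, as in item 6 ("orientation ... is random").
random.seed(1)
grains = [random.uniform(0.0, 60.0) for _ in range(5)]
for a, b in zip(grains, grains[1:]):
    theta = mismatch_angle(a, b)
    print(f"mismatch {theta:5.1f} deg -> boundary energy {boundary_energy(theta):.3f}")
```

Under these assumptions, merging lattices with large relative misorientation carry the most boundary energy – the energy reservoir that item 7 invokes for compression into lower-order structures.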

This illustrates how the principles of generative orders can be used to build a simple one-component model of the early universe. Geometrical models are presented in Chapter 4 of Love Works.

Certain details of particle phenomenology appear, at least superficially, to be accessible in the context of this model.

  1. Charge corresponds to the number of threads that penetrate a lattice sub-unit (which naturally has three degrees of freedom). Sign is simply a way of characterizing the tendency of fermions to attract fermions with different degrees of thread penetration.
  2. Mass arises naturally when threads pull on each other, causing the loci of thread capture to be dragged through the lattice. From the properties of the first particle family, it would appear that asymmetrical thread configurations must be more disruptive than symmetrical configurations. The equivalence of gravitational and kinetic mass is natural, as both effects correspond to lattice distortions. The equations of special relativity suggest the velocity-dependence of kinetic distortions.
  3. Particle families correspond to distortions of a particle’s lattice sub-unit from its normal configuration.
  4. Conservation of momentum could result from lattice dynamics that tend to reject disturbances, forcing energy back onto the moving fermion. Analogies in materials science include superfluidity and superconductivity.
  5. Light could be a self-propagating disturbance in the lattice, achievable only through fermion kinematics. Assuming that gravitational packing of particles causes re-orientation of the lattice at the surface of large bodies, the constancy of the speed of propagation is a local phenomenon (i.e. – a massive body “drags” space around with it).
  6. Light may interact with the lattice as it propagates, causing energy loss that manifests as a shift to lower frequencies. This may explain the microwave background radiation.
  7. A soul is a complex configuration of threads that is supported by, but only tenuously bound to, the lattice.

These configurations store energy as potential energy due to the associated distortion of the lattice.
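
One simple way to parameterize the energy-loss idea in item 6 is an exponential attenuation with distance, which maps directly onto a redshift. This is my illustration only – the model does not specify a functional form, and the attenuation length \(\lambda\) is a hypothetical constant:

```latex
E(d) = E_0\, e^{-d/\lambda}
\quad\Longrightarrow\quad
\nu(d) = \nu_0\, e^{-d/\lambda}
\qquad (E = h\nu)

1 + z \;=\; \frac{\nu_0}{\nu(d)} \;=\; e^{d/\lambda}
```

Any such “tired light” form would have to be reconciled with the observed time-dilation of distant supernova light curves, which is one reason a quantitative model is needed.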

Obviously, all of these are conceptual possibilities, whose validity can only be established through construction of a model of the energetics of the interactions between one-dimensional structures. As will become clear in the description of the research program, the list is by no means exhaustive. It is presented to provide a sense of the naturalness of fit between phenomenology and theories that might be elaborated using the principles of generative order.

Generative Orders Research Proposal – Part I


The author proposes to form research partnerships to develop a conceptual model of fundamental physics that has the potential to place spirituality on a firm scientific basis. The motivation for the scientific program is well-grounded in phenomenology, and the author outlines correspondence with established theory as limiting cases of the proposed model.

The author recognizes the gnostic implications of the program for society. Certain rules of engagement must be observed in doing such work – generally, the first application of any new technology is to obtain competitive advantage. To guard against such outcomes, the author has written a book that explains, in layman’s terms, the disciplines required to engage these principles safely, and the long-term personal and global consequences of failing to observe them. The proposal includes support for updating the book, entitled “Love Works”, and for its professional preparation and publication.

The proposal does not seek to resolve all questions regarding the proposed class of theories. It is intended to stimulate thinking that should lead to independent research and funding by the research community. The proposal does, however, include publication of a single paper that may demonstrate a significant point of departure from current models of particle physics and cosmology. Successful publication should stimulate “out-of-the-box” thinking in the research community, followed by independent research proposals.


The author’s principal qualifications for this work are selflessness and a commitment to Life in all of its forms. Many of the ideas presented were formulated through engagement with forms of sentience not recognized by many scientists.

With regard to the fundamental physics, the author received his Ph.D. in high-energy particle physics in 1987 and was active as a post-doctoral research fellow until 1992. Most of the conceptual underpinnings of modern particle theory and cosmology were developed during this period, and his observation of their development makes the author well-suited to recognize their shortcomings.

However, the author recognizes his limitations with regard to the skill set of the modern particle theorist, including large-scale numerical modeling. The author will develop relationships at institutions with large-scale computational physics programs in order to collaborate on the program.


The principal motivation for this work is to heal the divide between science and religion that promotes fear, anxiety, anger and apathy in those confronted with the enormous global challenges of the 21st century. The author believes that science is a process of revelation that can embolden and empower those with a genuine desire to be of service to the end of healing the world. Religion is concerned with the development of disciplines that enable us to work safely with the requisite spiritual energies.

While fostering spiritual maturity is critical to a successful execution of the overall program, the development of supporting resources is fairly well advanced. (The author has published his moral and ethical philosophy at www.everdeepening.org, and Love Works is a popularization aimed at the culturally dominant community of Christian believers.) The author considers publication of Love Works to be a critical adjunct, and will not pursue the scientific program separately.

Plan of Exposition

Love Works is provided as an attachment for evaluation. The focus of the exposition will therefore be to motivate and describe the scientific program. The scope of the development is far greater than necessary to complete the work of the first year: as an alternative to theories that have had thousands of man-years invested in their development, it is important to establish plausible paths of investigation for the obvious problems that must be overcome in exploring the new class of theories, characterized here as theories of “Generative Order” (GO).

To be fair, the discussion starts with an enumeration of the failures of the prevailing class of theories, which are characterized as theories based upon “Gauge Invariance” (GI).

Every physical theory has a set of fundamental constants. In current theories, these include the speed of light, the particle masses, Planck’s constant, and the strengths of the fundamental forces. The principal challenge in qualifying theories of Generative Order is determining the number and values of those constants. The exposition proposes a series of modeling problems that could be undertaken to evaluate a specific theory and determine its constants. Each modeling problem addresses a critical issue in establishing that a theory of Generative Order yields the current theory as a limiting case (just as relativity and quantum mechanics have Newtonian physics as a limiting case).
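
The constant-determination problem described above can be framed, schematically, as fitting a candidate model’s predictions to measured observables. The sketch below is purely illustrative: `model_prediction`, its two unknown constants, and the fabricated “measurements” are hypothetical placeholders, not part of any Generative Order formulation; a real program would fit many more constants against real phenomenology with gradient-based or Monte Carlo methods rather than a brute-force grid.

```python
def model_prediction(constants, observable_id):
    """Hypothetical stand-in for a model's predicted observables.

    Here each observable is a simple linear combination of two
    unknown constants; any real theory would be far richer.
    """
    c1, c2 = constants
    design = {"mass_ratio": (2.0, 1.0), "coupling": (1.0, -1.0)}
    a, b = design[observable_id]
    return a * c1 + b * c2

def chi_squared(constants, data):
    """Sum of squared residuals between predictions and measurements."""
    return sum((model_prediction(constants, k) - v) ** 2
               for k, v in data.items())

def grid_search(data, lo=-5.0, hi=5.0, steps=101):
    """Brute-force scan over the two constants."""
    best, best_chi2 = None, float("inf")
    grid = [lo + i * (hi - lo) / (steps - 1) for i in range(steps)]
    for c1 in grid:
        for c2 in grid:
            chi2 = chi_squared((c1, c2), data)
            if chi2 < best_chi2:
                best, best_chi2 = (c1, c2), chi2
    return best, best_chi2

# Fabricated "measurements", chosen so the true constants are (1.0, 2.0).
measured = {"mass_ratio": 4.0, "coupling": -1.0}
constants, chi2 = grid_search(measured)
print("best-fit constants:", constants, "chi^2:", chi2)
```

The qualifying criterion sketched here is the same one the exposition names: a candidate theory passes only if some choice of constants reproduces the established observables within experimental error.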