The Big Bang Collapses

Yet again.

One of the challenges confronting astrophysicists is figuring out how galaxies form. The problem arises in kind of a round-about way.

The space that fills our universe is remarkably uniform. That’s surprising, because it formed from an extremely violent context. We would expect it to be warped, in the mode of Einstein’s general relativity, causing light to “bend” as it traveled the great distances between galaxies. In addition, until a couple of years ago it was believed that the universe was coasting to a stop. In other words, the mass of the universe appeared to be just enough to keep the galaxies from flying apart forever, but not so much that they would turn around and collide together in a “big crunch.”

These two observations were reconciled with Alan Guth’s “inflationary universe” hypothesis. This holds that the universe was created with an invisible, uniform background energy that dissipated very early, creating most of the matter that we see around us.

One consequence of this model is that matter should be distributed uniformly in the universe. This is a problem for galaxy formation, because if matter is distributed uniformly, there’s no reason for it to start clumping together. There have to be little pockets of higher density for galaxies to form. When only normal matter is included in the simulations of the early universe, galaxies form way too slowly, and don’t exhibit the large-scale structures that we observe in the deep sky surveys.

Worse, when we look around the universe, we can’t actually see enough visible matter to account for the gravitational braking that slows down the rushing apart of the galaxies.

One way of solving these conundrums is “dark matter.” The proposed properties of dark matter are that it does not emit light (it’s dark) and that it has a different kind of mass that causes it to clump together to seed the formation of galaxies.

Today we have a negative result from an experiment designed to detect dark matter. This won’t deter the theorists for long – they’ll just come up with new forms of dark matter that are invisible to the detector (this is an old trick, which caught out my thesis adviser back in the ’80s).  But it does seem to make Occam’s razor cut more in the direction of the generative orders proposal for the formation of the early universe. That model doesn’t need inflation or dark matter or a multiverse to work. It anticipates just the universe that we see around us.

*sigh* Just saying.

Galactic Asymmetry and the Big Bang

The reigning model of cosmology (the history of the universe) holds that it formed as a cooling bubble in a super-heated stew. It proposes that a lot of energy was stored in the fabric of space (whatever that means), and what we recognize as matter was created as that energy was released. That matter slowly coalesced to form concentrated seeds that eventually grew into galaxies. It’s a model not too different from the model we have for the formation of the solar system.

The model is notoriously called the “Big Bang” theory, but it’s not really a bang, nor is the universe really big in absolute terms. In fact, in that super-heated stew our universe is just a little tiny bubble that only looks big to us because as energy is released from the fabric of space signals travel more slowly through it, much as a violin string vibrates more slowly when it is loosened. In my book Love Works I coin another term for the process: the “Expansive Cool.”
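The violin-string analogy can be made quantitative: the speed of a transverse wave on a string is v = √(T/μ), so loosening the string (lowering the tension T) slows the wave and drops the pitch. A minimal sketch with illustrative numbers (assumed here, not measured values):

```python
import math

def string_wave_speed(tension_newtons, mass_per_length_kg_m):
    """Speed of a transverse wave on a string: v = sqrt(T / mu)."""
    return math.sqrt(tension_newtons / mass_per_length_kg_m)

# Illustrative values for a violin string (assumptions, not data)
mu = 0.6e-3  # kg/m, linear mass density
v_tight = string_wave_speed(70.0, mu)  # ~70 N of tension
v_loose = string_wave_speed(35.0, mu)  # tension halved
print(v_tight, v_loose)  # loosening the string slows the wave
```

Halving the tension slows the wave by a factor of √2, which is the sense in which releasing energy from the “fabric of space” could slow signals without any explosion.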

The problem is that this model of gradual accretion is very difficult to reconcile with the structure and sub-structure of the universe. This was first apparent in the distribution of galaxies, which is non-uniform. A more recent study of the age of stars in the Milky Way also shows some surprising structure.

It will be interesting to see if the cosmologists can come up with an explanation. I have to hand it to the astronomers, though: they sure know how to use pretty pictures to make a point!


From the Earth to the Sun and Back Again

One of the hazards of engaging in epistemological debates is that they almost always become religious. We look back through the haze of history, trying to understand the practices by which knowledge is revealed to us, hoping to glean insights that help us heal divisive intellectual conflicts in the present.

Currently, these discussions become religious because our era suffers from an extreme bifurcation in our pursuit of knowledge. In no other era of human history have the two great pursuits of understanding – religion and science – been perceived as diametrically opposed. The linear causality of Einstein stands in contradiction to the gift of prophecy, and the power and predictability of dumb matter seduces us into believing that we can achieve all of our desires right here on Earth. Conversely, science denies us the comfort of meaning, to the extent that some denounce the search for meaning, or go even further to propose that this reality is evidence of a malefic creator.

Given this modern myopia, in looking back at the great episodes of resistance to truth, we tend to focus on the conflict between science and religion. Consider, for example, the succession from geocentric models of the solar system to the heliocentric models. The oppression of Brahe and Galileo is characterized as resistance by a religious elite threatened by the destruction of a Platonic universe whose geometrical perfection (circles moving within circles) was advanced as proof of the existence of the Christian God.

In fact, the history was rather more subtle, and its consideration brings a great deal of insight into the intellectual resistance to the program of this blog, declared on the title bar: “Unifying Science and Spirituality.”

The Greeks advanced both the geocentric and heliocentric models. If the ancients had been capable of building the instruments used by Galileo, they would certainly have settled on the latter. They resolved on the former for entirely practical reasons: they were concerned with using the positions of the stars to calculate the calendar date and the position of objects on the Earth’s surface. Culturally, their needs were absolutely geocentric. To solve this problem, they correlated geographical position with stellar observations and the progression of the seasons. Next, they sought methods for compacting this large body of data in a form that could be used by voyagers. The technology most adaptable to that purpose was the mathematics of circular revolution. Not only was the mathematics of circular revolution relatively simple, it was easy to translate to mechanical form as instruments containing rotating dials.

The “geocentric” model of the heavens was not in essence a philosophical proposition, but a proposition of practical technology. The principal motivation for upending the model was that over the centuries, the circular approximations began to fail. Designs specified in the first century produced the wrong answers in the eleventh century. A more reliable model was necessary, and the application of the new mathematics of elliptical analysis revealed that the heliocentric model fit the data more reliably than did the geocentric model of circular revolution.

As for the resistance of the Church, Galileo insisted on publishing an insulting parody of the Pope with his observations. He made his science a political issue. This was not an idle matter: the Church used the feudal compact to constrain the rapaciousness of those with a monopoly on the instruments of war. Scientists who chose to engage with the Church, with the aim of minimizing the social disruption that always comes with new knowledge, were well accepted.

In my own intellectual adventures here on this blog, I find myself confronted by those that tout modern cosmology as proof that the universe is a machine unfolding without purpose from its initial conditions. The foremost intellectual challenge to that conclusion has been “fine tuning” – the delicate balance of the fundamental constants of nature (specifically the relative strengths of the four forces) that must be preserved if life is to survive. The solution to this conundrum has been the “multiverse” variant of the Big Bang theory (the name itself is a mischaracterization). The multiverse proposition holds that universes exist with and without life – we just happen to occupy one in which life is possible.

The random generation of universes in the Big Bang, however, results from the proposition that we can explain all of nature by using two branches of mathematics: group theory and Fourier analysis. Both of these methods are relatively susceptible to hand calculation. What is little understood by the public is that the theorists trumpet their successes and ignore their failures. The application of current theory to the study of the hydrogen nucleus is summarized here, and the results are incredibly ugly.

Why is the theory not abandoned? For the same reason that the geocentric theory was not abandoned: physicists and astronomers have used the current theory to justify the construction of multi-billion dollar observatories. As the Church did, they oppose any idea that might destabilize the social order that pays their salaries.

What is scandalous is that the interstellar navel-gazing saps money from problems here on Earth that desperately call for the full commitment of our best and brightest minds. The scientists need to get their heads out of the stars and back onto the Earth.

My Background in Particle Physics

I earned my B.A. in Physics from UC Berkeley in 1982. That spring, I was asked by the undergraduate adviser where I had been accepted for graduate studies. I told him that Princeton had rejected me, and that Harvard expected me to find $10,000 a year. Face paling, he excused himself to go talk to the department head. When he came back, he said, “Here’s an application for graduate school at Berkeley. Fill this out. I’ll walk it down to the admissions office. If you don’t get accepted, don’t worry: you won’t have to pay the application fee.”

So I did my graduate work at UC Berkeley as well, receiving a Ph.D. in particle physics in 1987. There were two significant things about this era. First, it was when the fundamental ideas of particle physics and cosmology (the study of the early universe) were assembled.

Particle physics had been pursuing the use of group theory as a framework for unifying our understanding of the four forces (electromagnetic, weak, strong and gravitational). The theory had some really ugly problems. It did not account for particle masses, it produced infinities in its calculations that had to be “renormalized” away, and it had no satisfying explanation for the mathematical structure of the four forces. With the exception of the first, these problems were resolved by bringing gravity into the framework (through a Grand Unified Theory that was finally refined as superstring theory).

With regard to cosmology, the Big Bang had become dogma back in the 1930s when Hubble discovered the red shift. The only available explanation for the result was the relativistic Doppler shift. The problem was that the universe was far too smooth to have been created in an explosion involving normal matter. The contribution of Alan Guth was a model of the early universe with ten spatial dimensions heated to the Planck scale, followed by an “inflation” driven by a Higgs-like particle with extremely large mass. Normal three-space and matter would only appear after the universe had cooled enormously, and light would slow down tremendously in the process. However, it turned out that there were tens of millions of possible configurations of the laws of physics in that cooling. Again, there was no way of explaining the mathematical structure of the four forces. This was addressed by assuming that our universe was only one of an infinite number of universes spawned from the original super-heated Planck plasma.

The second significant aspect of this era was the rise of Big Science in these fields. I was lucky to work on a team of eight, and turned my Ph.D. around in five years. Most of my peers worked on far larger projects, anywhere from one hundred to (at the end) a thousand researchers. The projects involved hundreds of millions or billions of dollars. Because the work had absolutely no practical utility, the arguments for funding became more and more abstract (often invoking science as a fundamental moral imperative), and then became simply political. To illustrate: the organizational success of the particle physics community, in alliance with the Department of Energy, was scandalous to the material science community, whose funding was drained to support the construction of larger and larger particle colliders. The rebuttal came in the form of a proposed design for a linear collider to study particle zoology at the Planck scale (about 10^28 electron volts, as opposed to the roughly 10^13 electron volts at CERN). The sarcastic concept drawing showed a linear collider superimposed on the galaxy.

I was offered a job at BellCore (the telephone systems research lab) after graduating, but decided to give Particle Physics one more chance by joining a neutrino mass project at Lawrence Livermore National Laboratory. The woman who taught me particle theory, Mary Gaillard, was despondent. I had the feeling that she felt that I was joining the evil empire. Indeed, the nuclear weapons facilities were a vortex that absorbed a lot of talented particle physicists (I guess that DoD was worried that we’d go off and invent something even more destructive than the hydrogen bomb). So the ten years that I spent there were amidst a vital community of theorists, and I was able to keep abreast of developments in particle physics and cosmology.

I chose my position at LLNL because I knew that if particle physics didn’t appeal to me, I would be able to change careers. I did so after three years, entering Environmental Science. Unfortunately, in 1994 I married a trauma victim of the Soviet secret police. That trauma made it impossible for my peers to sustain their relationships with me. I was encouraged to leave the Laboratory for industry.

When I made a decision to restructure my personal life in 2000, I went through a period of enormous volatility in my career. My peers at LLNL (some of whom had intervened in my personal life with disastrous effect) decided to throw me a lifeline, and I was back there in 2004 and 2005. The latter was the centenary of Einstein’s “annus mirabilis”, when he published his papers on the photoelectric effect, Brownian motion, and special relativity. The speaking schedule that year was dominated by cosmologists and particle theorists. I was able, in that venue, to come up to date on current developments in the field. What I came away with was confirmation that nothing had changed, and that theorists were simply adding parameters in order to match data that they couldn’t explain, often with unsatisfactory results. It was so dire that the NSF head of fundamental physics declared that the field needed “revolutionary” ideas.

I had begun to assemble the thoughts presented here in 2000 (see the “New Physics” tab), and offered them to some of my peers. It was then that I ran into political restrictions. I was told “wait ten years,” which was the foreseeable duration of the CERN research program. Well, that ten years is up.

I did receive some recognition while I was there. During a budget cutting exercise, funding of the National Ignition Facility was threatened. I ate lunch frequently at the NIF cafeteria, and one day found myself looking at the promotional poster on the wall, wondering how to make the program work. As I sat there, I had the sense of having a conversation with researchers from a number of disciplines. When I published that analysis (several months later), the budget discussions were resolved with an increase to support new research directions, and I was invited by the Associate Director’s office for a program participant’s tour of the facility. It was the only concrete evidence I received of the political contributions I had made to the laboratory in the eighteen months that I was able to remain there.

Shedding Light on Light Mysteries

In the last post, I identified correspondences between superfluid motion and the phenomena that are described by the equations of quantum mechanics and special relativity. The discussion leads to the assumption that light is a disturbance in a cold – and therefore highly ordered (“crystal-like”) – sea of dark energy.

The illustration in that post showed a perfect lattice, but given what we know about the universe, we’d expect the dark energy lattice to be a little less regular. For example, we know from the Michelson-Morley experiment that dark energy is entrained with massive objects, which tend to be round. There’s an old adage about pounding a square peg into a round hole that fits here: the distortion created by the spherical Earth requires accommodation from the rectangular lattice, which will introduce defects.

And then we have the early history of the universe: unless the universe was unfolded from a single location, dark energy will organize itself locally, just as we see in crystals formed in solution. Here’s a picture of insulin crystals:

Insulin crystals grown in solution
Now obviously as these crystals grow to fill in the volume, there are going to be some places where they don’t fit together nicely, which will leave defects in the final mass. So it would be with the dark energy lattice.

What would we expect to happen when light encounters such a defect? Well, a reasonable analogy is what happens when a water wave encounters a rock. While most of the wave will continue around the rock, ripples will be cast off all around.

Do we see evidence of this in our study of the universe? Well, yes we do. First of all is the cosmic microwave background. But there’s more than that. Recent studies reveal that there is too much light coming from the empty space between galaxies (see Galaxies Aren’t Bright Enough). Astronomers originally assumed that the light had to come from early sources (back around the “Big Bang”, which I think is hokum), but that early light should be “stretched”, and therefore redder than it is. So the light must be coming from modern sources. Without any other proof, astronomers suppose that there must be many stars between galaxies.

In the lattice model, the cosmic microwave background and extra light between galaxies actually go together: if light is scattered by dark energy, it will lose a little bit of its energy (perhaps into microwaves) and change its direction. Therefore, some of the light coming from a distant galaxy will appear to have originated from empty space, and space will seem to be filled with microwaves.

Finally, the loss of energy from scattering in the lattice explains why light emitted from distant galaxies appears redder than light from nearer galaxies. In current theory, this is explained as the relativistic Doppler effect (similar to the drop in pitch of a blaring car horn after the car passes us). But with the discovery of Dark Energy, other mechanisms may exist to explain this effect.
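For comparison, the two mechanisms can be put side by side numerically. The relativistic Doppler shift gives 1 + z = √((1+β)/(1−β)); a uniform fractional energy loss per unit distance (one hypothetical way to parameterize the scattering idea, not an established result) gives 1 + z = exp(d/D) for a characteristic loss length D:

```python
import math

def doppler_redshift(beta):
    """Relativistic Doppler redshift z for recession speed v = beta * c."""
    return math.sqrt((1 + beta) / (1 - beta)) - 1

def energy_loss_redshift(distance, loss_length):
    """Hypothetical scattering redshift: a constant fractional energy
    loss per unit distance gives exponential photon-energy decay,
    so 1 + z = exp(distance / loss_length)."""
    return math.exp(distance / loss_length) - 1

print(doppler_redshift(0.1))            # ~0.106
print(energy_loss_redshift(1.0, 10.0))  # ~0.105
```

For small redshifts the two are numerically indistinguishable, which is why redshift data alone cannot settle which mechanism is at work.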

I will admit that the last two paragraphs are a “have your cake and eat it too” situation. If light from distant galaxies loses energy to scattering, it would be diffused as it passes, which would make the galaxies indistinct. But remember that the volume around galaxies is expected to have many more defects in the lattice than the intergalactic medium, which would cause stronger scattering in their vicinity. And when defects exist, radiation may also be emitted when the lattice reorganizes itself to close the defect. The point is that there is a whole set of new phenomena to consider when explaining astrophysical observations.

All this without needing to suppose a Big Bang at all.

Einstein is So 20th Century

In the two centuries between Newton and Einstein, arguably the greatest physicist of the 19th century was the Scotsman James Clerk Maxwell. Maxwell made fundamental contributions to thermodynamics, the study of how gases, liquids and solids change when ambient conditions (such as temperature and pressure) change, and how to convert heat to work. One of the results was an understanding of the propagation of sound waves through the air. But Maxwell also applied the mathematics of differential equations to create a unified theory of electricity and magnetism. These are the famous “Maxwell’s Equations” that predict the existence of electromagnetic waves, which we see as “light”.
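Maxwell’s equations fix the speed of those electromagnetic waves in terms of two constants measured on the lab bench, the permeability and permittivity of the vacuum: c = 1/√(μ₀ε₀). A quick numerical check:

```python
import math

mu_0 = 4 * math.pi * 1e-7       # vacuum permeability, N/A^2
epsilon_0 = 8.8541878128e-12    # vacuum permittivity, F/m

# Maxwell's wave equation predicts waves traveling at 1/sqrt(mu_0 * epsilon_0)
c = 1.0 / math.sqrt(mu_0 * epsilon_0)
print(c)  # ~2.998e8 m/s -- the measured speed of light
```

That the electrical constants reproduce the measured speed of light was the decisive evidence that light is an electromagnetic wave.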

Maxwell saw the relationship between electromagnetic waves and water and sound waves. Being steeped in a mechanical analysis of the world, he was unsatisfied with his abstract mathematical theory, and invested time in building a mechanical model of the “luminiferous ether” – the medium in which light waves traveled. Having spent years studying his equations and their predictions, I am fascinated by claims of his success. It’s a magical world in which the linear motion of charges creates rotary magnetic effects. My understanding is that the model was not simple, but contained complex systems of interlocking gears.

Now Maxwell’s work was not merely a curiosity – it was the basis for the design of communication networks that broke down distances with the enormous speed of light. More than anything else, this has brought us into each other’s lives and helped to create the sense that we are one human family. (The social and psychological reaction to that reality is complex, and we’re still growing into our responsibilities as neighbors. In The Empathic Civilization, Jeremy Rifkin offers a hopeful analysis of the transition.)

So the world of scientific inquiry hung on Maxwell’s words, and in America two experimenters, Michelson and Morley, designed an experiment to detect the presence of the ether. If the ether filled all of space, the Earth must be moving through it. Therefore the speed of light should change depending upon the motion of the observer through it. The analogy was with water waves: an observer moving along with a water wave doesn’t experience its disturbance – while one moving against it feels its disturbance enhanced. This is an example of Newton’s laws concerning the change of reference frames.

Since the Earth revolves around the sun, light emitted from the Earth in a specific direction relative to the sun should have a different speed at different times of the year. To test this hypothesis, Michelson and Morley built a sensitive instrument that compared the speed of light travelling in two perpendicular directions. As the Earth varied its motion through the ether, the pattern of dark and light on a screen was expected to shift slowly. Strangely, the result was negative: the image did not change.
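The size of the expected effect can be estimated from the classical (pre-relativity) analysis: rotating the apparatus by 90° should shift the interference pattern by roughly ΔN ≈ (2L/λ)(v/c)². With round numbers close to the original apparatus (assumed here for illustration, not taken from the paper):

```python
# Classical expected fringe shift for a Michelson interferometer:
#   delta_N ~ (2 * L / wavelength) * (v / c)**2

L = 11.0             # m, effective arm length (multiple reflections)
wavelength = 500e-9  # m, visible light
v = 3.0e4            # m/s, Earth's orbital speed
c = 3.0e8            # m/s, speed of light

delta_N = (2 * L / wavelength) * (v / c) ** 2
print(delta_N)  # ~0.44 of a fringe -- easily detectable; none was seen
```

A shift of nearly half a fringe was well within the instrument’s sensitivity, which is why the null result was so shocking.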

The conclusion was that there was no ether. This was a real crisis, because Maxwell’s Equations don’t behave very well when trying to predict the relationship between observations made by people moving at different speeds. To understand how really terrible this is, consider: in Maxwell’s theory, a charge moving through empty space creates a rotary magnetic field. But what if the observer is moving along with the charge? The charge no longer appears to move, so the magnetic field disappears. How can that be possible?

This was the challenge taken up by the Dutch physicist Hendrik Lorentz. He analyzed the mechanical properties of rulers and clocks, which are of course held together by electromagnetic forces, and discovered a magical world in which rulers change length and clocks speed up and slow down when the speed of the observer changes.
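All of Lorentz’s results are governed by a single factor, γ = 1/√(1 − v²/c²): moving rulers contract by γ and moving clocks run slow by γ. A small illustration:

```python
import math

def gamma(beta):
    """Lorentz factor for speed v = beta * c."""
    return 1.0 / math.sqrt(1.0 - beta ** 2)

beta = 0.6        # 60% of the speed of light
g = gamma(beta)   # = 1.25
ruler = 1.0 / g   # a 1 m ruler is measured as 0.8 m in flight
tick = g * 1.0    # a 1 s tick is observed to take 1.25 s
print(g, ruler, tick)
```

At everyday speeds γ is indistinguishable from 1, which is why these effects went unnoticed for two centuries of Newtonian physics.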

This was the context in which Einstein introduced his theory of Special Relativity. He did not really add to the results of Lorentz, but he simplified their derivation by proposing two simple principles: First, since the vacuum is empty, we have no way of determining whether we are moving or not. All motion is relative to an observer (thus the title: Special Theory of Relativity), and so no observer should have a preferred view of the universe. The second was that the speed of light is the same to every observer. Einstein’s mathematical elaboration of these principles unified our understanding of space and time, and matter and energy. Eventually, General Relativity extended his ideas to include accelerating observers, who can’t determine whether they are actually accelerating or rather standing on the surface of a planet.

Special and General Relativity were not the only great theories to evolve in the course of the 20th century. Quantum Mechanics (the world of the microscopic) and Particle Physics (describing the fundamental forces and how they affect the simplest forms of matter) were also developed, but ultimately Einstein’s principles permeated those theories as criteria for acceptance.

Then, in 1998, studies of light emitted from distant supernovae seemed to indicate that something is pushing galaxies apart from each other, working against the general tendency of gravity to pull them back together. The explanation for this is Dark Energy, a field that fills all of space. This field has gravitational effects, and its effects in distorting the images of distant galaxies have been observed. However, this field cannot be moving in all possible directions at all possible speeds. Therefore, it establishes a preferred reference frame, invalidating Einstein’s assumptions.

Working physicists resist this conclusion, because they have a means of accommodating these effects in their theories, which is to introduce additional mathematical terms. But science is not about fitting data – it is about explaining it. Einstein used his principles as an explanation to justify the mathematics of his theories. When those principles are disproven, the door opens to completely new methods for describing the universe. We can travel as far back as Maxwell in reconstructing our theories of physics. While for some that would seem to discard a lot of hard work done over the years between (and undermine funding for their research), for others it liberates the imagination (see Generative Orders as an illustration).

So, for example, why didn’t Michelson and Morley detect the ether? Maybe ether is more like air than water. Air is carried along with the Earth, and so the speed of sound doesn’t vary as the Earth moves about the sun. Maybe dark energy, which Maxwell knew as the ether, is also carried along with the Earth. Maybe, in fact, gravitation is caused by distortion in the Dark Energy field when it is bound to massive objects.

Generative Orders Research Proposal – Part V

Research Program

In this section, we suggest a research program, motivated by a strategy of incremental complexity. The initial steps of the program focus on the characteristics of the lattice. As these are resolved, the parameter space of the theory is reduced, helping to focus analysis of fermion dynamics in the later stages.

The reference model as described suggests that in theories of generative order, many of the intrinsic properties of the standard model may be extrinsic properties of one-dimensional structures. If this is so, ultimately theorists should be able to calculate many of the fundamental constants (G, h, c, α, etc.), and to establish correspondence theorems to existing models of gravitation, quantum electrodynamics (QED), quantum chromodynamics (QCD) and the weak interactions. It is the opinion of the author that the single-component reference model is unlikely to satisfy these requirements.
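As a benchmark for what “calculating the fundamental constants” would mean, note that the fine-structure constant α is itself a combination of more basic constants, α = e²/(4πε₀ħc) ≈ 1/137; any candidate theory would have to reproduce numbers like this from its lattice parameters:

```python
import math

e = 1.602176634e-19           # elementary charge, C
epsilon_0 = 8.8541878128e-12  # vacuum permittivity, F/m
hbar = 1.054571817e-34        # reduced Planck constant, J*s
c = 2.99792458e8              # speed of light, m/s

# Fine-structure constant: dimensionless strength of electromagnetism
alpha = e ** 2 / (4 * math.pi * epsilon_0 * hbar * c)
print(alpha, 1 / alpha)  # ~0.00730, ~137.04
```

In the standard model α is an input measured by experiment; a theory of generative order would count it a success to derive it.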

Conversely, the isotropy of space suggests that a single-component lattice is likely to be sufficient to explain gravitation. The initial work therefore focuses on creation of models that can be used to assess the stability of behavior against segmentation length and interaction models. The program will also help scope the computational resources necessary to analyze fermion kinematics.

The steps in the research program are successively more coarse. Particularly when the program progresses to consideration of fermion kinematics, years of effort could be invested in analysis of a single particle property, such as spin. Considering the history of quantum mechanics and relativity, the entire program can be expected to take roughly a century to complete.

Given the resources available to the proposer, funding of the research program for the first year focuses on two goals.

  1. Re-analysis of the spectra of edge-on elliptical galaxies with the goal of establishing a high-profile alternative to the Hubble expansion.
  2. Identifying research teams capable of and interested in pursuing the modeling effort.

A successful effort would culminate in an invitation to a symposium considering new horizons in the theories of cosmology and particle physics.

Modeling Program

Exposure of generative orders to the community of particle theorists is going to result in a large body of objections to the reference model. To avoid these being raised as impediments to obtaining research funding for interested theorists, we list the challenges that must be overcome in elaborating a satisfactory model, and consider possible mechanisms that might lead to the observed behavior of known physical systems.

The program is presented in outline form only. If desirable, elaboration can be provided.

  1. Precession of perihelion – due to “drag” between the inconsistent lattice configurations of bound gravitational bodies. Explore parameterization of lattice structure – sub-unit modes, lattice shear and compressibility. Again, a fairly approachable study that could stimulate further work by demonstrating feasibility of explanations of large-scale phenomena, with correspondence to parameterization at the lattice scale. Success would begin to break down resistance to the idea that a preferred reference frame is disproven by observations that support Einstein’s theories of Special and General relativity.
  2. Dynamics of the formation of galactic cores, parthenogenesis, black hole structure – Relative energetics of lattice cohesion vs. encapsulation of lower-dimension structures. Initial density and uniformity of 1-dimensional structures (also considering observed smoothness of lattice compression – i.e. “dark energy” distribution). Success would be a clear demonstration of worthiness as an alternative to the Big Bang theory.
  3. Superfluid transport of particles through the lattice.
    1. Fiber characteristics
    2. Intrinsic angular momentum
    3. Virtual photon / gluon emission as an analog of lattice disruption
    4. Conservation of momentum
    5. Theory of kinetic energy (scaling of lattice distortion with particle velocity)
    6. Effect of sub-unit distortion on particle propagation
  4. Gravitation/QED/QCD
    1. Equivalence of gravitational and kinetic masses
    2. Electric charge signs. Note that 2 sub-units with 1 thread each are not equivalent to one subunit with 2 threads.
    3. Thread dynamics and interaction with lattice
  5. Lattice distortion and correspondence to quantum-mechanical wave function
    1. Pauli exclusion principle
    2. Wave-particle duality (theory of diffraction)
    3. Hydrogen energy levels
    4. Wave-function collapse
  6. Weak interactions
    1. Thread transfer processes
    2. Temporary creation of unstable higher-dimensional structures.
  7. Theory of light
    1. Electric field as a mode of coupling due to lattice disrupted by thread oscillations
    2. Magnetic fields as a special mode of particle coupling with a lattice distorted by motion of threads
    3. Speed of light
    4. Light decay during lattice propagation
    5. Theory of microwave background radiation (lattice relaxation or light decay)
  8. Theory of anti-particles
    1. Sub-unit chirality and lattice coherence
    2. Annihilation as synthesis of higher-order structures
    3. Pair production as decomposition of higher-order structures
    4. Meson theory