Truth to Tell

The good thing about science is that it’s true whether or not you believe in it.

-Neil deGrasse Tyson


As a physics student, my undergraduate curriculum was dominated by physics and math classes. Even then, though, I had a penchant for philosophy that culminated with Paul Feyerabend’s course on the philosophy of science. I didn’t do terribly well in those classes, having a fundamental misconception regarding the purpose of the term papers. Rather than summarizing the text, I always set out to propound novel thought. The teaching assistants were not amused.

Feyerabend may have read some of what I had written, however, because he called on me in his final lecture and asked me to offer my thoughts on the scientific process. Never one to deny credit where it was due, I began “Well, my father says…” which caused the rest of the class to erupt in laughter. Paul waved his hand and told me “Write a book some day.”

deGrasse Tyson’s observation is representative of the philosophy of those inspired by the engineering marvels of the industrial age. The associated advances in the public welfare seemed to demolish all the works of the past. Philosophers came to see the scientific mindset as a matter of concrete truth. But it is far more and less than that. “Less”, because the equations that we teach in introductory physics are wrong. A ball doesn’t actually fall in a parabola, because it is subject to forces other than gravity – air drag is one. What the solution without drag offers is a sufficiently good approximation for most engineering applications. “More”, because the engineers so empowered change the truth that we experience. They create microchips and vaccines, things that would never exist in the natural world.

What I had concluded, a few years after taking Feyerabend’s course, is that science is not important because it tells us what is true. It’s important because it guides our imaginings into what is possible. But if you talk to most scientists, that isn’t why science inspires them. Most of them study science because they want to do what others believe is impossible. That was certainly my case – when I went off to college, in the middle of “Whip Inflation Now” and the first OPEC oil crisis, it was with the stated aim of “figuring out how to break the law of conservation of energy.” I wager that many creative scientists feel the same – they actually don’t want to believe their science. They want to prove it wrong.

I know that was the conclusion of my own journey into understanding the nature of spiritual experience (follow the menu to “New Physics”), and so I see a certain myopia in Tyson’s statement. This came to the fore one Saturday afternoon during a workshop run by Tom Owen-Towle, the foremost modern theologian/philosopher in the Unitarian Universalist tradition. In responding to a point Tom made, I offered my observations of the nature of our engagement with the divine source. Before I could get to the main point, a loud, sneering snort came from the assembly behind me. I turned around to face the originator, a man older even than I, and then proceeded to make my point. For the next five minutes, I felt pressure building from my antagonist, and just let it flow into me, finally broadening the focus to embrace the community of atheists that he represented. When I had their full attention, I sent this thought: “And yet here I am.”

And so my response to deGrasse Tyson is this: “You receive love from an inexhaustible source. Whether or not you believe it, I am glad that it is true.”

Please Help Me Understand

After reading the summary of today’s mass shooting in Roseburg, Ore., I made the mistake of opening the comments. The posts were dominated by Second Amendment prattle – you know, “prying my gun out of my cold, dead hands.”

Not a word of sympathy.

Not an offer of support.

Does a heart beat in your chests?

Or is it that it beats too hard, that you face anxiety every day, and the only way to cauterize your fear is to go out to the gun range and shred a silhouette with automatic weapons fire?

I guess it boils down to this, for me: America’s Gun Lobby is a mechanism that we use to prove that we can’t use the threat of violence to protect ourselves from pain. It binds us to wander in fear through the valley of the shadow of death.

Wandering until we turn our faces upwards to the healing light of love.

How often does a gun clutched to a chest serve any purpose other than to prevent us from “baring our arms” to help each other? To clap shoulders in welcome? To offer and receive an embrace?

Understanding, Hope

The ferocity of the wildfires raging in Northern California was given a human face last Monday morning when one of the staff at AMC shared that two members of her family had lost their homes and everything they owned when their town was devoured by the flames. As I write today, the fires have destroyed 1400 homes.

To some, it is human crisis that makes global climate change palpable. For me, once a wanderer of the trails above the Conejo Valley, the cries of nature have weighed on my heart for far longer. The day that I first encountered the great Muslim love poem, Yusef and Zuleika, these words caused me to weep as I looked out over the hills:

To my wounded heart this soft balm to lay,
For not beyond this can I wish or pray.
The streams of thy love will new life bestow,
On the dry, thirsty field where its sweet waters flow.

After services at St. Kolbe’s today, I was moved to stand on the floor where the gaze of Christ fell. I was struck suddenly that the last thing that he beheld was the earth under the cross. The earth that held in place the instrument of his destruction, but also that had carried him on his wandering, that had brought forth food for him to eat, and provided all the tools of weather and life that had responded to his authority as he tried to teach his people to heal the world.

We could have avoided this destruction. Not just the destruction of families, cities and nations, but the loss of species and the poisoning of water and earth that will delay their recovery. Both to the reasoning mind and the intuitive heart, these consequences have long been apprehensible. Now, faced with the undeniable evidence of doom, we still hesitate to act, for we think first of what is close to us. Our families, our homes, and our land: they all suffer, and so we take from elsewhere to preserve them. We take from those with no voice: the poor, the uneducated, and the natural world.

But what else are we to do?

I write here because I understand things that others do not, and so I perceive solutions that are beyond their grasp. It may seem small-minded to decry the folly of Elon Musk and his peers, desperately trying to disperse the human species so that it can survive all the threats of the natural world: black holes, solar instability, and human greed. But I do so with sympathy for them, for they cannot see how much power is available to us if only we understand it.

On the New Physics page I offer a model of physics that holds these truths: space is not empty. It is filled with a medium in which light propagates, the medium that physicists once called the “luminiferous ether”, and now call “dark energy.” That medium is wrought through with threads that appear most obviously to us as electric charge when bound to the medium, but that may also float in the medium. The floating threads interact, merge and evolve to form what we know as “souls.” The souls merge with matter to “live” as plants, animals and people. In that form, they are capable of warping the fabric of space. In most cases, that warping occurs through the use of their physical manifestation – in humans, we commonly use our legs, hands, and mouths.

Through our actions, we join other things in the service of our will. That can be a temporary affair, such as when we throw a light switch or press the accelerator pedal. We are often seduced by the temporary thrill of such expressions, a thrill made accessible through the efforts of engineers to remove souls from the world around us, ensuring that it responds only to our will.

But any great lover knows the permanence of the bonds that arise when we ask permission before enjoying a gift, and attempt to reciprocate in kind. In those exchanges, we make persistent spiritual arrangements – persistent precisely because the participating souls do not seek to escape them.

So this is how we save the world: we surrender our self-concerns. We open our hearts in compassion to the suffering of the world. We marshal the displaced souls of the natural world and join them together to warp the fabric of space to create a lens that bends light away from the earth. And we reward them every day with the expression of our gratitude for their service.

Are we enough to do this, by ourselves? Perhaps, and perhaps not. But we should consider this: a billion times as much energy leaves the sun as reaches us here on Earth. The source of that energy is not unintelligent. It is, in fact, the “Ancient of Days” described in Daniel’s Dream of the Four Beasts. It would help us if it could, but we are so terribly small, and one mistake would destroy us all. It needs us to guide it.

I had a friend challenge me once that with faith we should be able to move mountains. My response was: “Yes, if every living thing on the mountain and the land around it agreed that the mountain should move, the mountain would move.” But if any voice claimed privilege over that power, the result would be chaos. It is for this reason that I decry the ugliness of the Republican debates. If we are going to save all of the world, the power of such voices will still be among us. The destructive effects of their expression cannot be risked. They must learn self-control.

I was late getting to church this morning. As I organized my thoughts to write this post, I sat down to the reading from Acts. I wept as these words were read [James 4:2-3]:

You lust and do not have, so you commit murder. You are envious and cannot obtain, so you fight and quarrel. You ask and do not receive, because you ask with wrong motives, so that you may spend it on your pleasures.

Oh, humanity! Why must the world suffer so?

Freedom from Government through the Governance of Love

In explaining the necessity of God in Tragic Sense of Life, the Spanish philosopher Miguel de Unamuno asserts that it arises when every man, naturally desiring to control the world, confronts the inevitability of death. As the latter treads on our heels, even the most powerful are pressed to the conclusion that the only way to live forever is to embrace a God that loves us enough to grant us life.

Atheists are inclined by this logic to conclude that faith is a delusion. Marx certainly saw it that way, declaring that “religion is the opium of the masses.” But the underlying pressure is evidenced in the pronouncements of some technologists, among them the man I described yesterday who saw our digital sensors, networks and software as empowering us to build God. Others are more humble. At the ACM fiftieth anniversary symposium in 1997, Nathan Myhrvold, then chief technology officer at Microsoft, envisioned (somewhat playfully) a future in which we could escape death by creating digital simulations of our brains. The video skit included Bill Gates rubbing his chin as he thoughtfully considered the reduction in Microsoft’s benefits budget.

But if delusion is pathetic, avoidance in the powerful is often grotesque. We have Vladimir Putin, assassin of Russian patriots, proclaiming that Jesus will find no fault with him on Judgment Day. Or the effrontery of Donald Trump who, protected by his army of lawyers, knows that so long as he asserts righteousness, no one has the means to contradict his claims of competency and benevolence. Thus he continues to assert – in contradiction of the actual birth certificate – that his lawyers have compelling evidence to reveal regarding President Obama’s citizenship. Both of these men suffer from the same affliction, the tendency of our bodies to respond to successful acts of aggression by manufacturing more and more testosterone, the chemical driver for aggression. This is a positive feedback loop that was broken only by death in the cases of Hitler, Franco, Mao, Stalin, Kim Jong-il and so many other tyrants. In the prelude, millions of people were sacrificed on the altars of their psychological invincibility.

This dynamic is writ small in the lives of many businesses, congregations and families. People addicted to the rush of adrenaline and the power of testosterone manufacture experiences that stimulate their production. This is why it is said “absolute power corrupts absolutely.” The desire for power arises from the biological thrill of success, and to continue to receive that thrill, the addict must continue to risk his power in ever greater contests. In the heat of passion, the suffering visited upon others is ignored.

There are three antidotes to this dynamic. The first is popular rebellion. Paradoxically, this is the very force that pushed Putin and Trump to prominence. At a stump speech yesterday, Trump opened the floor to questions, and the first person to the microphone began to rant hatefully about President Obama and an imagined domestic Muslim threat. Trump did not defuse the situation, instead responding “We need to hear this question!” But often rebellion is merely another manifestation of the drive to power. Unless tempered, it rages out of control, as happened in the Jacobin tyranny following the French Revolution.

The second antidote is reason. Reason forces us to reconcile our actions with their consequences, disciplining our aggression with objective evidence of failure. The tension between reason and will is not just moral, however: heightened levels of adrenaline actually degrade the higher thinking centers of the brain. This creates a terribly contradictory dynamic, perhaps manifesting itself in the fact that most academics do their greatest work in their youth. While testosterone serves the reasoning mind in creating the thirst to conquer and claim ideas, as the successful mind expands, so do levels of testosterone and adrenaline, which destroy the power of reason. In that context, the methods used to sustain power are not as brutal as those used by the social tyrant, but have their own unique form of cruelty, and leave lasting scars on the psyche. Isaac Newton, cheated of credit for a scientific insight by his predecessor as head of the Royal Society, had the satisfaction of burning the man’s portrait. Most victims of intellectual tyranny are consigned to obscurity.

It is natural for supporters to gather around the social or intellectual tyrant during his rise to power. Claiming benevolent intention is a great way of rallying support from the oppressed. Unfortunately, this dictum holds: A man will change his beliefs before he will change his behavior. When that behavior is organized around aggression, enemies must be created when there are none left at hand. All tyrants eventually turn on their lieutenants, often using hallucinatory rhetoric to justify their actions.

A peer once offered to me that all the greatest scientists were lovers of humanity. This brings us to the third antidote: love. This arrives upon us through many pathways. It can be through sex and maternity. It can be when an infant first grasps our forefinger. It can be through service to those in want. In those moments a bond is established, a linkage that makes palpable the suffering we visit upon others. That can be rationalized in material terms: tears on a beloved face or cries of shame are evidence of our failure. That breaks the vicious cycle of success and aggression.

But there is another aspect that goes beyond negative feedback. Aggression stimulates the loins and the mind, but barely touches the heart. Exchanging love with someone just feels good. It opens us up to a world of experience that can be touched in no other way. Ultimately, its rewards are far greater because no one that loves themselves objects to being loved. They do not turn on their friends for satisfaction, because their friends offer them satisfaction every day.

Democracy attempts to combat the urge to power by institutionalizing rebellion. In America, the two Presidents that were awarded the most authority were George Washington, who gracefully surrendered power after two terms of service following universal acclamation by the Electoral College, and FDR, who literally worked himself to death through four terms in office. Both those men were governed by a sense of duty and love for their country, a commitment affirmed by the popular voice that is expressed in elections. At the end of the 20th century, those that seek the freedom to act always as they please (the ultimate manifestation of power) responded to electoral constraint by attacking our faith in government. Driven by testosterone and thus unable to govern themselves, they have invested huge amounts of money in creating personalities such as Newt Gingrich, Rush Limbaugh and Bill O’Reilly. As visible in the Oklahoma City bombing and the events surrounding the Republican nominating process, the end result has been to stimulate the resort to violence by others.

Thus we have the wisdom of Jesus: “Render unto Caesar those things that are Caesar’s.” We have the promise of Jeremiah: “For I will write my law on their hearts, and no man will be told ‘Come learn about my God’, because all will know me.” And we have Christ’s summation of the Jewish experience with law (the rule of reason) and governmental control: Love God and your neighbor.

It is through self-regulation that we discover truth and peace [NIV Matt. 7:13-14]:

Enter through the narrow gate. For wide is the gate and broad is the road that leads to destruction, and many enter through it. But small is the gate and narrow the road that leads to life, and only a few find it.

But what other government would we choose, except the governance of our hearts? And to what other authority would we choose to submit, other than the authority of compassion in another? Why do we delude ourselves that there is any other way?

Artificers of Intelligence

The chess program on a cell phone can beat all but the best human players in the world. It does this by considering every possible move on the board, looking ahead perhaps seven to ten turns. Using the balance of pieces on the board, the algorithm works back to the move most likely to yield an advantage as the game develops.
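
A full chess engine won’t fit in a few lines, but the brute-force strategy itself is tiny. Here is a minimal sketch in C, with a toy game (Nim: a pile of sticks, each player takes one to three, taking the last stick wins) standing in for chess; the exhaustive look-ahead is the same idea the paragraph above describes.

    /* Exhaustive game-tree search, the brute-force strategy described
       above, demonstrated on Nim (take 1-3 sticks; taking the last
       stick wins).  Try every legal move, recurse on the resulting
       position, and keep the move with the best worst-case outcome. */
    #include <stdio.h>

    /* Returns +1 if the player to move can force a win, -1 otherwise. */
    int search(int sticks)
    {
        if (sticks == 0)
            return -1;     /* opponent took the last stick; we lost */
        int best = -1;
        for (int take = 1; take <= 3 && take <= sticks; take++) {
            /* The opponent's best result is our worst, so negate. */
            int score = -search(sticks - take);
            if (score > best)
                best = score;
        }
        return best;
    }

    int main(void)
    {
        for (int s = 1; s <= 8; s++)
            printf("%d sticks: %s\n", s, search(s) > 0 ? "win" : "loss");
        return 0;
    }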

These algorithms are hugely expensive in energetic terms. The human brain solves the same problem in a far more efficient fashion. A human chess player understands that there are certain combinations of pieces that provide leverage over the opposing forces. As opportunities arise to create those configurations, they focus their attention on those pieces, largely ignoring the rest of the board. That means that the human player considers only a small sub-set of the moves considered by the average chess program.

This advantage is the target of recent research using computerized neural networks. A neural net is inspired by the structure of the human brain itself. Each digital “node” is a type of artificial neuron. The nodes are arranged in ranks. Each node receives input values from the nodes in the prior rank, and generates a signal to be processed by the neurons in the next rank. This models the web of dendrites used by a human neuron to receive stimulus and the axon by which it transmits the signal to the dendrites of other neurons.

In the case of the human neuron, activation of the synapse (the gap separating axon and dendrite) causes it to become more sensitive, particularly when that action is reinforced by positive signals from the rest of the body (increased energy and nutrients). In the computerized neural network, a mathematical formula is used to calculate the strength of the signal produced by a neuron. The effect of the received signals and the strength of the generated signal are controlled by parameters – often simple scaling factors – that can be adjusted, node by node, to tune the behavior of the network.
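
As a rough sketch of such a node, assuming a logistic squashing function (one common choice among many), the arithmetic is just a weighted sum of the signals from the prior rank:

    /* A single artificial node: it scales each signal received from
       the prior rank, sums them, and squashes the total into the
       signal passed to the next rank.  The weights and bias are the
       adjustable scaling factors mentioned above. */
    #include <math.h>
    #include <stdio.h>

    double node_output(const double inputs[], const double weights[],
                       int n, double bias)
    {
        double sum = bias;
        for (int i = 0; i < n; i++)
            sum += weights[i] * inputs[i];   /* scale each input signal */
        return 1.0 / (1.0 + exp(-sum));      /* squash into (0, 1) */
    }

    int main(void)
    {
        double inputs[]  = {0.5, 0.2, 0.9};  /* signals from prior rank */
        double weights[] = {0.8, -1.5, 0.3}; /* tunable scaling factors */
        printf("signal out: %f\n", node_output(inputs, weights, 3, 0.1));
        return 0;
    }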

To train an artificial neural network, we proceed much as we would with a human child. We provide them experiences (a configuration of pieces on a chess board) and give feedback (a type of grade on the test) that evaluates their moves. For human players, that experience often comes from actual matches. To train a computerized neural network, many researchers draw upon the large databases of game play that have been established for study by human players. The encoding of the piece positions is provided to the network as “sensory input” (much as our eyes do when looking at a chess board), and the output is the new configuration. Using an evaluative function to determine the strength of each final position, the training program adjusts the scaling factors until the desired result (“winning the game”) is achieved as “often as possible.”
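
As an illustration of that adjust-and-retest loop, here is the classic perceptron update rule on a trivially small problem (learning logical OR) rather than anything chess-sized; the variable names and the learning rate are mine, not drawn from any particular training framework.

    /* Training as adjust-and-retest: a single thresholded node learns
       logical OR.  On each pass the error serves as the "grade", and
       the scaling factors are nudged until the desired results are
       achieved. */
    #include <stdio.h>

    int main(void)
    {
        double in[4][2] = {{0,0}, {0,1}, {1,0}, {1,1}};
        double want[4]  = {0, 1, 1, 1};       /* the desired outputs */
        double w[2] = {0, 0}, bias = 0, rate = 0.1;

        for (int pass = 0; pass < 100; pass++) {
            for (int k = 0; k < 4; k++) {
                double sum = bias + w[0]*in[k][0] + w[1]*in[k][1];
                double out = (sum > 0) ? 1 : 0;   /* threshold node */
                double err = want[k] - out;       /* feedback "grade" */
                w[0] += rate * err * in[k][0];    /* adjust parameters */
                w[1] += rate * err * in[k][1];
                bias += rate * err;
            }
        }
        for (int k = 0; k < 4; k++)
            printf("%g OR %g -> %g\n", in[k][0], in[k][1],
                   (bias + w[0]*in[k][0] + w[1]*in[k][1] > 0) ? 1.0 : 0.0);
        return 0;
    }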

In the final configuration, the computerized neural network is far more efficient than its brute-force predecessors. But consider what is going on here: the energetic expenditure has merely been front-loaded. It took an enormous amount of energy to create the database used for the training, and to conduct the training itself. Furthermore, the training is not done just once, because a neural network that is too large does not stabilize its output (too much flexibility) and a network that is too small cannot span the possibilities of the game. Finding a successful network design is a process of trial-and-error controlled by human researchers, and until they get the design right, the training must be performed again and again on each iteration of the network.

But note that human chess experts engage in similar strategies. Sitting down at a chess board, the starting position allows an enormous number of possibilities, too many to contemplate. What happens is that the first few moves determine an “opening” that may run to ten or twenty moves performed almost by rote. These openings are studied and committed to memory by master players. They represent the aggregate wisdom of centuries of chess players about how to avoid crashing and burning early in the game. At the end of the game, when the pieces are whittled down, players employ “closings”, techniques for achieving checkmate that can be committed to memory. It is only in the middle of the game, in the actual cut-and-thrust of conflict, that much creative thinking is done.

So which of the “brains” is more intelligent: the computer network or the human brain? When my son was building a chess program in high school, I was impressed by the board and piece designs that he put together. They made playing the game more engaging. I began thinking that a freemium play strategy would be to add animations to the pieces. But what if the players were able to change the rules themselves? For example, allow the queen to move as a knight for one turn. Or modify the game board itself: select a square and modify it to allow passage only on the diagonal or in one direction. I would assert that a human player would find this to be a real creative stimulus, while the neural network would just collapse in confusion. The training set didn’t include configurations with three knights on the board, or restrictions on moves.

This was the point I made when considering the mental faculties out at http://www.everdeepening.org. Intelligence is not determined by our ability to succeed under systems of fixed rules. Intelligence is the measure of our ability to adapt our behaviors when the rules change. In the case of the human mind, we recruit additional neurons to the problem. This is evident in the brains of blind people, in which the neurons of the visual cortex are repurposed for processing of other sensory input (touch, hearing and smell), allowing the blind to become far more “intelligent” decision makers when outcomes are determined by those qualities of our experience.

This discussion, involving a game without much concrete consequence, appears to be largely academic. But there have been situations in which this limitation of artificial intelligence has been enormously destructive. It turns out that the targeting systems of drones employ neural networks trained against radar and visual observations of friendly and enemy aircraft. Those drones have misidentified friendly aircraft in live-fire incidents, firing their air-to-air missiles and destroying the targets.

So proclamations by some that we are on the cusp of true artificial intelligence are, in my mind, a little overblown. What we are near is a shift in the power allocated to machines that operate with a fixed set of rules, away from biological mechanisms that adapt their thinking when they encounter unexpected conditions. That balance must be carefully managed, lest we find ourselves without the power to adapt.

Authority in Scriptural Interpretation: The Value of Science

I keep on getting caught up in debates on other sites (The River Walk and There’s a Thing Called Biology come to mind) that tend to end with charges against my intellectual integrity. The progression goes something like:

  1. I observe that the people that wrote the Bible were recording experiences that they lacked the scientific understanding to describe accurately.
  2. I propose alternative interpretations of the events in modern scientific terms.
  3. I am told that the events recorded in the Bible could not have happened because they violate scientific knowledge.
  4. I suggest that science is not as iron-clad as many believe, and direct the conversation to my “New Physics” page.
  5. The responder offers the unsophisticated interpretation of the Biblical record (i.e. – Creation occurred in seven days) as evidence that people that believe in God do not understand science, and accuses me of being a poor scientist.
  6. I offer that my personal experience of God contradicts their science, and reiterate that I have offered models that integrate science and spirituality for their consideration.
  7. I am accused of intellectual dishonesty and ignoring scientific truth.
  8. I break off the discussion.

This may seem like just whining, but there’s a really fundamental point that nobody seems to have grasped just yet: the reason that religious authorities offered an “unscientific” understanding of scripture was because they didn’t have enough science to interpret scripture. Receiving a document through a long chain of translation from dead languages, they interpreted the words as literal truth because they had nothing else to guide their understanding.

But we do have science as our guide. So why not make use of it?

Given what we know about paleontology, for example, we can clearly interpret the days of creation as the history of biological development, running from single-celled organisms that learned to use light as a source of energy, and ending with the mammals and man on “day” six. Along the way, the development of eyesight replaces “light” with the more specific sun and moon.

Similarly, the trumpets of Revelation are seen to correspond almost exactly with the ancient mass extinctions. The era of giant insects is noted, and the final extinction episode (involving a meteor strike, volcanic vents and egg-eating mammals) describes distinctly the mechanisms that terminated the age of the dinosaurs.

Scripture and Darwin don’t contradict each other, they support each other. In the other direction, I think that the most powerful tool we have to advance our understanding of fundamental science is not the billion-dollar satellites and particle accelerators, but rather the well-documented record of spiritual experience.

Really, I would think that we’d be getting together to shake hands and pat each other on the back, not trading barbs.

Design by Discipline

When I received my Ph.D. in Particle Physics in 1987, I was qualified as among the wonkiest elite in science. If I had been concerned with proving that I was smart, I might have stayed in physics, but the expectations for practical applications of fundamental physics had eroded greatly after my freshman year. I wanted the larger world to benefit from the work that I did, so I took a job at a national laboratory. After a brief post-doc in fundamental physics, I moved over to environmental science. Throughout, the growing importance of computerized control and simulation meant that I enjoyed a distinct competitive advantage over my peers, as I had learned to program from one of the foremost practitioners in his generation – my father. When I became a full-time software developer, my background in physics allowed me to collaborate with engineers, to the extent that I would be brought in on engineering meetings when my peers were unavailable.

Now this may seem like just bragging, but the point is that my career has been a dynamically evolving mash-up of science, engineering and programming. My experience was filtered through a practice of systems analysis that led me to examine and control the way that those disciplines interact. So when I think about science, I don’t think about it as “what scientists do.” I do consider myself a scientist, but I do engineering and programming as well, and I perceive the three disciplines as very different activities.

I took a course on philosophy of science as an undergraduate, and I won’t drag you, dear reader, through all the definitions that have been offered. Most of them hold that Francis Bacon’s articulation of the scientific process was a magic portal for the human intellect, as though practical efficacy and the rational ordering of knowledge had not been recognized virtues among the ancients. This leads many philosophers of science to be overly concerned with truth, when what is really of interest to us as people is what has yet to be true.

The power of science is in allowing us to pierce the shadowy veil of possibility. In biology, understanding of the variety of living things and their mutual dependencies gives us the power to sustain agriculture, breed robust animals, and improve our health. Chemistry empowers us to predict the stability and properties of new substances. And physics probes the fundamental mechanisms that determine both the stability of the world around us and our ability to manipulate it.

So science provides us with pure knowledge, unconstrained by our desires or intentions. It therefore tends to attract people that are driven by curiosity. That may sound like a trivial thing, but to find order in the chaotic milieu of nature is a source of great hope. Calendars that predict the seasons allowed agricultural societies to improve their harvests and so avoid famine. The germ theory of disease motivated doctors to wash their hands, transforming hospitals from centers of disease incubation to places of healing. Scientific curiosity – to ask neurotically “why?” – is the source of great power over the world.

That power is visible in the manufactured reality all around us: the houses, roads, dams and microchips. None of these things would have existed in the natural world. The artifacts exist only because people have a purpose for them. That purpose may be as simple as cooking dinner for our children, or as grand as ensuring that the world’s knowledge is available through the internet to any person, anywhere, any time. Which of our goals are realized is largely a matter of economics: are enough people invested in the outcome that they are willing to pay to see it accomplished? We don’t have to have a kitchen in every home, but few of us can afford to go out to dinner every night, so we pay for a kitchen. The cost and delay of moving information via mail drove the growth of the internet, at an expense that I would imagine (I can’t find numbers online) has run into trillions of dollars.

Now when people invest a substantial sum of money, they want some assurance that they’ll get what they’re paying for. Appreciating that gold does not tarnish, the sultan seeking to protect the beauty of his marble dome does not want to be told, “All natural gold originates in supernovae.” Or, worse, “If we smash heavy elements together in an accelerator, we can produce ten gold atoms a day.” Those kinds of answers are acceptable in scientific circles, but they are not acceptable in the engineering world. In the engineering world, when somebody comes to you with money and a problem, your job is to design an implementation that will realize their goal.

Since we’re a species of Jones-chasers, most of the time the engineer’s job is fairly simple. People come wanting something that they’ve seen, and the challenge is to understand how it was done before and adapt the design to local conditions. But every now and then somebody comes in to ask for something completely novel. They want to build an elevator to space, for example, or create a light source that doesn’t produce soot. The engineer has no way of knowing whether such things are possible, except by reference to science.

It is into the gap between the formless knowledge of science and the concrete specifications of engineering that programming falls. Consider the light bulb: scientists know that heated objects glow, but also burn. Applying an electric voltage to a poor conductor causes it to heat as current flows through it. The filament will burn when exposed to oxygen, so we need to isolate it from air. Using an opaque material as the barrier will also trap the generated light. However, some solids (such as glass) are transparent, and air permeates slowly through them.

The illustration is a cause-and-effect analysis. It looks at the desirable and undesirable outcomes of various scientific effects, attempting to eliminate the latter while preserving the former. The cause-and-effect analysis leads to an operational hypothesis: if we embed a wire in a glass bulb and apply a voltage, the wire will heat and emit light. This is not an engineering specification, because we don’t know how much the light bulb will cost, or how much light it will emit. But it also isn’t science, because the operational hypothesis is not known to be true. There may be no filament material that will glow brightly enough, or the required voltage may be so high that the source sparks down, or the glass may melt. But without the operational hypothesis, which I have called a “program,” engineering cannot begin.

We examined the challenge of software engineering in the first post in this series, focusing on the rapid development in the field and the difficulty in translating customer needs into terms that can be interpreted by digital processors. Today, we have arrived at a more subtle point: the algorithms written in our programming languages process information to produce information. The inputs for this process arise from nature and humans and, increasingly, other machines. Those inputs change constantly. Therefore very few programs (except maybe those for space probes) are deployed into predictable environments. That includes the hardware that runs the program – it may be ARM or Intel or AMD – and so the performance of software is not known a priori. For all of these reasons, every piece of software is simply an operational hypothesis. It is a program, not a product.

The Modern Tower of Babel

I alluded to the problem of language in my introductory post on programming. The allusion was hopeful, in that our machines are learning to understand us. Or rather, they are learning to understand those of us that speak supported languages.

The dominant language of international discourse today is English. That can be attributed to the success of the British Empire in the colonial age, and then to the industrial and diplomatic dominance of America in the aftermath of World War II. But the proliferation of English has affected the language itself.

The most significant changes impacted many of the colonial languages: they were simplified and regularized to make them easier to teach. Study of tribal languages reveals that they defy analysis. Few patterns are discerned in verb conjugations, and sentence structure obeys arbitrary rules. But the languages of major civilizations can also be daunting: the ideograms and subtle intonations of Chinese are a case in point. For both types of language, it is impossible for an adult to become fully proficient. But the education of adult leaders and manual laborers was critical to the stability of Empire. In the absorption of foreign populations, the complexity of the original language was eroded by the logistics of minority control.

And yet today the Brits like to say that England and America are divided by a common language. While the grammar and basic terms of the language are shared, cultural development and ambition still drive change. The physical sciences are characteristic. While my professors focused on physics as applied mathematics, it was clear to me that it was also a foreign language, with arcane terms such as “Newton’s Third Law”, “Lagrangian” and “Hamiltonian” used to distinguish alternative formulations of the mathematics used to describe the motion of classical particles. As cultural developments, the latter two came to prominence because their mathematical formulations were generalized more readily to non-classical systems. And as regards ambition, we need only note that all three formulations bear the name of their originators.

But language can also be used consciously as a political tool. Newt Gingrich created the modern Republican media machine around 1990 by distributing cassette tapes each month with terms to be applied in derogating Democratic and lauding Republican policies. Many oppressed minorities encode their conversations to prevent authorities from interfering with the conduct of their lives, and those can emerge as full-blown languages in their own right (The “Ebonics” movement reflected such a development in America).

But in other cases, new usage arises as a form of entertainment. I had to ask my son to clarify the meaning of “sick” as used by today’s youth, and was surprised to discover that, as in Chinese, nuances of intonation were essential to understanding.

Most of these variations can be expected to be ephemeral. “Cool” was “sick” when I was growing up, and all attempts to obscure meaning will eventually founder on the rock of economic realities. People that can’t describe accurately the world around them seem bizarre if not outright insane, and ultimately excuse themselves from collaboration with others. While the linguists are fascinated by variation, they predict that the number of living languages will continue to decline.

As a programmer, however, I have the opposite experience. Fred Brooks and Martin Fowler have decried the “language of the month” phenomenon in software engineering. I myself feel a certain insecurity in my job search because the applications that I develop can only be created using fifteen-year-old technologies that most programmers would consider to be “archaic.”

To understand the root of this proliferation, it is amusing to backtrack to 1900 or so. Mathematicians had developed categories for numbers: the integers (used for inventory tracking), rational numbers (ratios of integers) and real numbers whose digits may continue without any repeating pattern. Two very important branches of mathematics had been proven to depend upon real numbers: geometry and calculus. In geometry, the real number pi is the ratio of the distance around a circle to the distance across it. In calculus, Euler’s number e is the base whose exponential curve has a slope equal to its value at every point.

However, philosophers pointed out that while the exact values of these numbers can never be computed, it doesn’t matter: any calculation performed using them can only be carried out with finite precision – and that is good enough. If we can’t cut a board to better than one thousandth of an inch, it doesn’t matter whether we calculate the desired length to a billionth of an inch. Practically, the architect only needs to know pi well enough to be certain that the error in his calculation is reasonably smaller than one thousandth of an inch.
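
The point can be made concrete in a few lines of C. The diameter and tolerance here are illustrative choices of mine, not from the text:

    /* Finite precision is good enough: six digits of pi versus the
       library's full precision, for a 36-inch table top, against a
       one-thousandth-of-an-inch cutting tolerance. */
    #include <math.h>
    #include <stdio.h>

    int main(void)
    {
        double diameter = 36.0;             /* inches */
        double pi_full  = acos(-1.0);       /* ~16 correct digits */
        double pi_rough = 3.14159;          /* six digits */
        double error = (pi_full - pi_rough) * diameter;
        printf("circumference: %.6f in\n", pi_full * diameter);
        printf("error from rough pi: %.6f in (tolerance 0.001 in)\n", error);
        return 0;
    }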

Given that binary notation could represent numbers just as well as common numerals, it was clear that computers could be used for practical calculations. When Alan Turing defined a simple but comprehensive model for digital computation, the field progressed confidently to construct machines for general purpose applications, encompassing not only mathematics but also language processing.

Now in Turing’s model, the digital processor operates on two kinds of input: variable data and instructions. The variable data is usually read from an input at execution. The instructions could be built into the actual structure of the processor, or read in and interpreted at run-time. The machine that Turing built to crack the Nazi Enigma code was of the first type, but his general model was of the second.

Turing’s original specification had fairly simple instructions (“move tape left”, “move tape right”, “read value” and “write value”), but it wasn’t long before Turing and others considered more complex instruction sets. While after the Trinity test, Oppenheimer famously quoted the Bhagavad Gita, “Now I am become Death, the destroyer of worlds”, I can’t help but wonder whether the original computer designers saw the parallels with Genesis. Here they were, building machines that they could “teach” to do work for them. They started with sand and metal and “breathed life” into it. The synaptic relays of the brain that implemented human thought have operational similarities to transistor gates. Designs that allowed the processor’s output tape to be read back as its instruction tape also suggested that processors could modify their behavior, and thus “learn.”
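
A minimal interpreter for Turing’s model might look like the following C sketch. The three-rule “program” is an invented toy that inverts a string of binary digits; the point is that the instructions are ordinary data, which could just as well have been read in from a second tape:

    /* A minimal interpreter for Turing's model: a tape, a head that
       reads, writes and moves, and a table of instructions consulted
       on each step. */
    #include <stdio.h>

    enum { RIGHT = +1 };                /* this toy only moves right */

    struct rule {                       /* (state, read) -> action */
        int state; char read;
        char write; int move; int next;
    };

    struct rule program[] = {
        {0, '0', '1', RIGHT, 0},        /* invert 0, keep scanning */
        {0, '1', '0', RIGHT, 0},        /* invert 1, keep scanning */
        {0, ' ', ' ', RIGHT, 1},        /* blank: go to halt state */
    };

    int main(void)
    {
        char tape[] = "1011 ";
        int head = 0, state = 0;

        while (state != 1) {                       /* state 1 = halt */
            for (int i = 0; i < 3; i++) {
                if (program[i].state == state &&
                    program[i].read == tape[head]) {   /* read value  */
                    tape[head] = program[i].write;     /* write value */
                    head += program[i].move;           /* move tape   */
                    state = program[i].next;
                    break;
                }
            }
        }
        printf("tape after halt: %s\n", tape);
        return 0;
    }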

The Turing test for intelligence reflects clearly the ambition to create a new form of intelligent life. But creating the instruction tape as a series of operations on zeros and ones was hopelessly inefficient. So began the flourishing of computer languages. At first, these were simply mechanisms for invoking the operation of blocks of circuitry that might “add” two numbers, or “move” a collection of bits from one storage location to another. Unfortunately, while these operations provided great leverage to programmers, they addressed directly only a small part of the language of mathematics, and were hopelessly disconnected from the language used to describe everything else from banking to baking.

Still fired with ambition, the machine designers turned to the problem of translating human language to machine instructions. Here the most progress was made in the hard sciences and engineering, where languages such as FORTRAN attempted to simulate the notation of mathematical texts. The necessary imprecision of business terminology was refined as COBOL, allowing some processes to be automated. And as machine architectures grew more complex, with multi-stage memory models, communication with external peripherals including printers and disk drives, and multi-processing (where users can start independent applications that are scheduled to run sequentially), C and its variants were developed to ease the migration of operating systems code through architecture generations.

These examples illustrate the two streams of language development. The first was the goal of recognizing patterns in program structure and operation and facilitating the creation of new programs by abstracting those patterns as notation that could be “expanded” or “elaborated” by compilers (a special kind of software) into instructions to be executed by the machine. So for example, in C we type

c = a + b;

To anyone that has studied algebra, this seems straightforward, but to elaborate this text, the compiler relies upon the ‘;’ to find complete statements. It requires a declaration elsewhere in the code of the “types” of c, a and b, and expects that the values of a and b have been defined by earlier statements. Modern compilers will report an error if any of these conditions are not met. A competent programmer has skill in satisfying these conditions to speed the development of a program.
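
For completeness, here is the same statement in a minimal compilable context, showing the declarations and prior definitions the compiler demands:

    #include <stdio.h>

    int main(void)
    {
        int a = 2;      /* declares the type of a and defines its value */
        int b = 3;      /* likewise for b */
        int c;          /* declares c; its value is defined below */

        c = a + b;      /* the statement from the text */

        printf("c = %d\n", c);
        return 0;
    }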

The other stream is driven by the need to translate human language, which is inevitably imprecise, into instructions that can be executed meaningfully upon zeros and ones. Why is human language imprecise? Because more often than not we use our language to specify outcomes rather than procedures. The human environment is enormously complex and variable, and it is rare that we can simply repeat an activity mechanically and still achieve a desirable output. In part this is due to human psychology: even when the repetitions are identical, we are sensitized to the stimulus they provide. We desire variability. But more often, it is because the initial conditions change. We run out of salt, the summer rains come early, or the ore shipped to the mill contains impurities. Human programming is imprecise in part because we expect people to adapt their behavior to such variations.

Both abstraction and translation have stimulated the development of programming languages. Often, they go hand-in-hand. Systems developers expert in the use of C turn their skills to business systems development, and find that they can’t communicate with their customers. C++ arose, in part, as a method for attaching customer terminology to programming artifacts, facilitating negotiation of requirements. When the relational model was devised to organize business transaction data, SQL was developed to support analysis of that data. And when the internet protocols of HTTP and HTML became established as the means to acquire and publish SQL query results in a pretty format on the world-wide web, languages such as Ruby arose to facilitate the implementation of such transactions, which involve a large number of repetitious steps.

What is amusing about this situation is that, unlike human languages, computer languages seem to be almost impossible to kill. Consider the case of COBOL. This language approximates English sentence structure, and was widely used for business systems development in the sixties and seventies. At the time, the language designers assumed that COBOL would be replaced by better alternatives, and so adopted a date format that ran only to the end of the century. Unfortunately, the applications written in COBOL became services for other applications written in other languages. The business rationale for the logic was lost as the original customers and developers retired, and so it was effectively impossible to recreate the functionality of the COBOL applications. As the century came to a close, the popular press advertised the “Year 2000” crisis as a possible cause of world-wide financial collapse. Fortunately, developers successfully isolated the code that depended upon the original date format, and made adaptations that allowed continued operation.
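
The arithmetic failure at the heart of that crisis is easy to reproduce in miniature (the variable names here are mine): store years as two digits, as the COBOL systems did, and watch what happens at the rollover.

    /* Two-digit years in miniature: an account opened in 1985,
       examined in the year 2000, appears to be -85 years old. */
    #include <stdio.h>

    int main(void)
    {
        int opened = 85;    /* "1985", stored as two digits */
        int today  = 0;     /* "2000", stored the same way  */
        printf("account age: %d years (should be 15)\n", today - opened);
        return 0;
    }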

This trend will be magnified by the economics of software solutions delivery. Unlike other industries, almost the entire cost of a software product is in the development process. Manufacturing and distribution is almost free, and increasingly instantaneous. This means that the original developer has almost no control over the context of use, and so cannot anticipate what kinds of infrastructure will grow up around the application’s abstract capabilities.

The popular ambitions for software reflect this reality. The ability to distribute expert decision making as applications operating on increasingly precise representations of reality, all in the context of data storage that allows the results to be interpreted in light of local conditions: well, this implies that we can use software to solve any problem, anywhere. Some people talk about building networks of digital sensors that monitor everything from the weather to our respiration, and automatically deploy resources to ensure the well-being of everyone everywhere on earth.

In the original story of Babel, the people of the Earth gathered together to build a tower that would reach to heaven. Their intention was to challenge God. The mythical effort was undermined when God caused people to speak different languages, thus frustrating their ability to coordinate their efforts. In the modern era, we in effect seek to approximate the Biblical God using digital technology, but our ambitions lead us to create ever more abstract languages that we cannot rationalize, and so we find our efforts frustrated by the need to backtrack to repair our invalid assumptions.

In the terms of the programming discipline we will propose, however, the fundamental problem can be put this way: the digital realm is an abstract representation of reality. What basis do we have for believing that the applications created using those abstractions accurately translate the needs described by human users? If we can’t solve the problem of describing and analyzing that correspondence, then our software must inevitably become a form of black magic that rules over us.

Love Works Posted

Just a note that I’ve uploaded the rest of Love Works. Click on the page link on the banner. The post explains the delay.

The document was originally created in OpenOffice, and the images acquired a grey background in the port to Word. At some point I’ll fire up my old laptop and break it apart in OpenOffice. If there’s an immediate need, let me know and I’ll push it up on the priority list.

The Trust Mind

Hundreds of years before the life of Jesus of Nazareth, the mystics of Greek Hellenismos understood Humanity’s spiritual development as a growth into engagement with certain fundamental natural forces. Aphrodite, for example, was represented as a beautiful woman, but as a god she mediated between humanity and the force of attraction, which manifests as much in gravitation as it does in sensual desire. Following the era of the Titans and Olympians, the aim of the mystics was to usher in the age of Dionysus, allowing men to interact directly with the principles. In other words, for us to become gods.

When this truth was first revealed to me, the speaker admitted that in the modern era, we view Dionysus, the “party god”, as an unlikely avatar. We view alcohol as a vice, but the Greeks saw it as a tool. When we are drunk, we “lose our inhibitions.” That may manifest itself in a tendency to orgy, but at a deep spiritual level reflects the loosening of the protective barriers around our souls. We surrender ourselves to trust, and so relate more freely and deeply than we would otherwise. (See this post by Irwin Osbourne for more on this experience.)

The power of this relation can be abused. Megalomania is one pathology. In “Ray”, the film biography of Ray Charles, one scene reconstructs a set in which a horn player stands up to take an impromptu solo in the middle of a number. The man was dismissed, not because he violated the integrity of the rendition, but because Ray recognized intuitively that the man was on heroin. Accused of hypocrisy, Charles’s retort was that he had to be the only one. A second pathology is dependency. In graduate school, a friend shared his experience of a teacher who drank incessantly, and actually could do chemistry well only in that state. It took me a while to figure out how to suggest that maybe the teacher wasn’t doing the thinking at all – that the alcohol enabled him to inject himself into a community of minds that tolerated his needs.

There are other methods to achieve this integration. A young woman can be almost suicidal in her disposition to trust the men that she desires, and when that is manifested in sexual license, she may serve as the pool in which men join. Junger’s book “War” documents the characteristics of men that survive constant threat only by surrendering themselves to trust in each other.

There is enormous power in such melding, but the methods listed above cannot be sustained by our physiology. The licentious woman becomes corrupted by masculine demons, and loses her beauty. Substance abuse drives our metabolism into pathways that destroy our health. And war is a process that no one escapes without harm, even if it is hidden deep in the soul behind a stoic mask.

It is for this reason that everdeepening.org opens with this statement:

Love dissolves the barriers of time and space, allowing wisdom, energy and understanding to flow between us, and embracing us with the courage, clarity and calm that overcomes obstacles and creates opportunities. When we open our hearts to one another, there is no truth that is not revealed, and to those that love themselves, no impulse to harm that cannot be turned to the purposes of healing and creation.

As a Christian, I see the ultimate human manifestation of this truth in the march of Jesus of Nazareth to the cross. And behind that sacrifice, I must see the yearnings of a perfect and unconditional love that invests itself in the realization of that truth in our lives.

But when picking up the Bible, it doesn’t take long to reach contradictory evidence. Taking Eden as a metaphor for a relationship of trust between the source of love and humanity, that trust is corrupted by the serpent, which appeals fundamentally to human selfishness. In God, we were gods, but Eve is encouraged [NIV Gen. 3:5] to “be like God, knowing good and evil.” For this breach of trust, Adam and Eve are dismissed from the garden, and punishments are heaped upon them.

What was so heinous about their crime? Was it worse than the slaying of Abel, for which Cain was allowed a lifetime of repentance? And what is so important about us that God would give Jesus as a sacrifice to the goal of our redemption?

To understand this, we have to understand the nature of thought. We have succumbed in the modern age to scientific materialism, and so hold that thought occurs in the brain. I know this not to be true: I relate frequently to thinking beings that have no bodies and no brains, and so must recognize that my brain is merely an interface to my soul. To facilitate the expression of will through my body, the operation of the brain must correlate completely with the thinking done by my spirit.

Thus I interpret “In the image of God he created them” [NIV Gen. 1:27] in this way: our bodies are a tool through which we manifest the will of our souls and – given the quote above – they operate most effectively when used to express love.

The problem is that every interface is a two-way street. While through our commitment to creative expression, we can bring truth and beauty into the world, the opposite can occur. In the experience of pain and suffering, we project thoughts back into God. In the expression of greed and lust, we corrupt the purity of love. This is articulated many times in the Bible: consider Noah, Exodus and Ezekiel. Rather than being remote and impervious, God suffers from our wrong-doing. The flood is thus a desperate move to rid himself of the irritation, as is the destruction of the Holy City through the witness of Ezekiel. While horrifying to us as humans, we might imagine that the bacterium feels much the same when confronting the operation of the immune system.

The error of the Law is to interpret these actions as a judgment, as evidence of sin. They are not. Their effect is to destroy the material manifestations of the success of selfishness, revealing its sterility. They are actions taken to frustrate selfish personalities that attempt to prevent love from liberating and healing their abused captives.

This is “The Knowledge of Good and Evil” that brings death into the world. Lacking appreciation of the virtues of love, we chose not to trust in love. We demanded understanding. But understanding is gained only through experience, and experience requires expression of both good and evil. We are educating ourselves.

In the end, Christ gathers those that chose good into the fold of the perfect love that originates from the divine source. We join our shared memory and wisdom into a single holy mind, and heal the world of the disease of selfishness. Thus I do not interpret the Crucifixion as atonement for our sins. Rather, I believe it should be seen as a surrender to trust in love, a struggle waged most fiercely in the Garden of Gethsemane, and redeemed by the proof of the power of love in the Resurrection. Rather than an indictment of our frailty, it is meant to be an exhortation to manifest our own forms of greatness.

Trust in yourselves. Trust in love. Welcome yourselves into the Holy Spirit, the mind formed when that trust is perfected in us.