Prediction and Primacy of Consciousness

I finished Leonard Peikoff’s Objectivism: The Philosophy of Ayn Rand in 2015, and on the whole, didn’t get that much out of it. It took a long time to slog through, and didn’t answer some of my longstanding questions about Rand’s intellectual history. I’d recommend it as a reference text, but not as an introduction to Objectivism.

This isn’t a review of OPAR; I’ve discussed it elsewhere. Today we’re going to discuss one of the few good new ideas I learned reading it: primacy of consciousness.

Objectivism advocates a worldview based on primacy of existence. Rand holds that consciousness has no power over reality in and of itself—consciousness is the process of identifying existents, not creating them. Now a conscious mind can decide to alter existents through physical action, or extrapolate the possibility of not-yet-existing existents, but the mere act of thinking cannot produce physical phenomena.1

Primacy of consciousness puts the cart before the horse. Perception can neither create a percept, nor modify it, nor uncreate it.2 Sufficiently invasive methods of inquiry may do that, but the mental process of observation does not.

Let us consider a technical example. When solving engineering assignments, I'm often tempted to skip checking my work. The correct answer is independent of whether I've made an exhaustive search for mistakes. Yet, on a certain level, it can feel like not looking will make an error go away.

But it won’t. As my structures professor often says, in aerospace engineering we have a safety factor of 1.5. In school, that’s just a target to aim for—if I screw up, the grade will point it out and I’ll feel silly for missing easy points. On the job, that’s not the case. If your work has a serious mistake, you’re going to kill people.

Or wreck the global economy.

Since starting Nate Silver's book, perhaps the most interesting thing I've learned so far (besides an actually intuitive explanation of Bayes' Theorem, contra Yudkowsky) is just how stupid the root causes of the housing crisis were.

I’d recommend reading the book if you’d like a properly comprehensive explanation, but the executive summary would be that, starting in the late 1990s, the value of houses began to skyrocket in what we now know was a real estate bubble. This was basically unprecedented in US history, which should have been a wake-up call in itself, but the problem was compounded by the fact that many investors assumed that housing prices would keep going up. They wanted to bet on these increasingly risky properties, creating all sorts of creative “assets” to bundle specious loans together. Rating agencies were happy to rate all of these AAA, despite their being totally new and untested financial instruments. Their estimated risk of default proved to be multiple orders of magnitude too low. And yet everyone believed the ratings.

Silver describes this as a failure of prediction, of epistemology. Assessors made extremely questionable assumptions about the behavior of the economy and the likelihood of mortgage default, which are legitimate challenges in developing predictive models. Going back to my structural engineering example, it’s easy to drop the scientific notation on a material property when crunching the numbers. If you say that aluminum has a Young’s Modulus of 10.7, the model isn’t going to know that you meant 10.7 × 10⁶ psi, or 10.7 Msi. It’s going to run the calculations regardless of whether your other units match up, and may give an answer that’s a million times too big. Remember that your safety margin is 1.5.
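To make that concrete, here's a minimal sketch of how the slip propagates—cantilever tip deflection with hypothetical numbers, nothing from a real design:

```python
# Tip deflection of a cantilever beam: delta = P * L^3 / (3 * E * I)
# All values are hypothetical, for illustration only.
P = 500.0         # applied tip load, lbf
L = 60.0          # beam length, in
I = 12.0          # area moment of inertia, in^4
E_right = 10.7e6  # Young's Modulus of aluminum, psi
E_wrong = 10.7    # same property with the "x 10^6" silently dropped

delta_right = P * L**3 / (3 * E_right * I)  # a fraction of an inch
delta_wrong = P * L**3 / (3 * E_wrong * I)  # the same formula, garbage in

# The model runs happily either way; only the ratio betrays the mistake.
print(delta_right)
print(delta_wrong / delta_right)  # ~1e6: the code doesn't know your units
```

The point isn't the arithmetic—it's that nothing in the calculation flags the error. Only checking your work does.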

I don’t think economic forecasters have explicit margins of error, but the same general principle applies. Using the wrong Young’s Modulus is an honest mistake, an accident, which is easily rectified if found early. Lots of errors in the rating agencies’ models weren’t so honest. They made what looked like big allowances for unknowns, but didn’t question a lot of their key assumptions. This speaks to a real failure of epistemic humility. They didn’t ask themselves, deep down, if their model was wrong. Not the wrong inputs, but the wrong equations entirely.

For instance, say I model an airplane’s wing as a beam, experiencing bending and axial loads, but no axial torsion. That’s a very big assumption. Say there are engines mounted on the wing—now I’m just ignoring Physics 101. If I ran the numbers and decided that propulsive and aerodynamic twisting moments were insignificant for the specific question I’m considering, then it might be an acceptable assumption. But I would need to run the numbers.
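"Running the numbers" here can be as cheap as a back-of-the-envelope ratio. A sketch with invented values (these are not real aircraft figures):

```python
# Sanity check on the "no torsion" assumption: compare the twisting moment
# from a wing-mounted engine against the wing-root bending moment.
# All numbers are hypothetical, for illustration only.
thrust = 25_000.0   # engine thrust, lbf
offset = 8.0        # engine thrust line offset from the elastic axis, ft
lift = 150_000.0    # lift carried by the wing, lbf
arm = 30.0          # effective moment arm of the lift distribution, ft

torsion = thrust * offset  # twisting moment, ft-lbf
bending = lift * arm       # root bending moment, ft-lbf

ratio = torsion / bending
print(f"torsion is {ratio:.1%} of bending")
# If that percentage isn't small, the no-torsion beam model is out.
```

Whether a few percent counts as "insignificant" depends on the question being asked—which is exactly why the check has to be made explicitly rather than assumed away.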

Many people, at many organizations, didn’t run the numbers in the years leading up to the financial crisis. Now not all of them were given an explicit choice—many were facing managerial pressure to meet deadlines or get specific outputs. That’s an organizational issue, but really just bumps the responsibility up a level.3 Managers should want the correct answer, not the one that would put the biggest smile on their face.

In aerospace engineering, we have an example of what happens when you do that:

[Image: the space shuttle Challenger breaking apart shortly after launch]

Just because the numbers look good on paper doesn’t mean they correspond to the real world. That’s where empirical testing comes in. Engineers test all the time, but even then, it doesn’t prevent organizational incentives from bungling the truth. If the boss wants to hear a particular answer, she may keep looking until she finds it.

Economists have it worse, trying to predict a massively nonlinear system and, Silver reports, doing quite badly at it. Objectivism is very strong on the importance of saying I know, but rationality also depends on saying I don’t know when you legitimately don’t. Try to find out, but accept that some truths are harder to obtain than others.

Existence exists, and existents don’t care what you think.


1Outside of your body, that is. This is where the line between body and mind becomes pertinent, and about where I give up over reducibility problems. Suffice to say that if you can create matter ex nihilo, there are a lot of people who would be interested in speaking with you.

2Those of you with itchy fingers about quantum mechanics are politely invited to get a graduate degree in theoretical physics. We’re talking about the macroscale here.

3Not that responsibility is a thing that can truly be distributed:

Responsibility is a unique concept… You may share it with others, but your portion is not diminished. You may delegate it, but it is still with you… If responsibility is rightfully yours, no evasion, or ignorance or passing the blame can shift the burden to someone else. Unless you can point your finger at the man who is responsible when something goes wrong, then you have never had anyone really responsible.
 —Admiral Hyman G. Rickover, USN

The Worst Week of American Spaceflight

On January 27th, 1967, the crew of Apollo 1 was undergoing a simulated countdown when an electrical fire started within the spacecraft. The hatch was bolted tightly onto the capsule. Escape was impossible, and the blaze quickly grew in the pure oxygen atmosphere. Astronauts Gus Grissom, Ed White, and Roger Chaffee died on the pad.

On January 28th, 1986, the space shuttle Challenger was destroyed 73 seconds after liftoff on the STS-51L mission. Cold weather in the days before launch had weakened the rubber O-rings sealing sections of the solid rocket boosters. Flames escaped and penetrated the external fuel tank, igniting an explosion of liquid hydrogen and oxygen that disintegrated the orbiter. The crew was not killed in the explosion—forensic investigation revealed that pilot Michael Smith’s emergency oxygen supply had been activated and consumed for two and a half minutes: the time between the break-up and the moment the remains of Challenger struck the Atlantic Ocean.

On February 1, 2003, the space shuttle Columbia disintegrated during re-entry over the southern United States after sixteen days in orbit. During launch, a piece of cryogenic insulation foam fell from the external fuel tank and struck the left wing of the orbiter, damaging the thermal protection system. As Columbia streaked across the southern sky, atmospheric gases heated by its hypersonic flight entered the wing and melted critical structural members. Ground observers in Texas could see the shuttle breaking apart over their heads. Rapid cabin depressurization incapacitated the crew.

This is the worst week in the history of American spaceflight. These three disasters are not the only dark spots on that record, but they are by far the worst. We remember them, and vow not to repeat the mistakes that led to those deaths.

After Apollo 1, Flight Director Gene Kranz gave the following address to his mission controllers:

Spaceflight will never tolerate carelessness, incapacity, and neglect. Somewhere, somehow, we screwed up. It could have been in design, build, or test. Whatever it was, we should have caught it.

We were too gung ho about the schedule and we locked out all of the problems we saw each day in our work. Every element of the program was in trouble and so were we. The simulators were not working, Mission Control was behind in virtually every area, and the flight and test procedures changed daily. Nothing we did had any shelf life. Not one of us stood up and said, “Dammit, stop!”

I don’t know what Thompson’s committee will find as the cause, but I know what I find. We are the cause! We were not ready! We did not do our job. We were rolling the dice, hoping that things would come together by launch day, when in our hearts we knew it would take a miracle. We were pushing the schedule and betting that the Cape would slip before we did.

From this day forward, Flight Control will be known by two words: “Tough and Competent.” Tough means we are forever accountable for what we do or what we fail to do. We will never again compromise our responsibilities. Every time we walk into Mission Control we will know what we stand for.

Competent means we will never take anything for granted. We will never be found short in our knowledge and in our skills. Mission Control will be perfect.

When you leave this meeting today you will go to your office and the first thing you will do there is to write “Tough and Competent” on your blackboards. It will never be erased. Each day when you enter the room these words will remind you of the price paid by Grissom, White, and Chaffee.

Gene Kranz is right. Tough competence is what those of us in the space business must strive for, every day, for lives are on the line, and the future of manned exploration of the cosmos is at stake.

These seventeen are not the only space travelers to die in the line of their work, and undoubtedly more astronauts and cosmonauts will perish in our conquest of the universe. That is no excuse for sloppiness. The Apollo 1 fire could have been prevented. STS-51L should not have launched. STS-107 could have been saved on-orbit. It’s the job of engineers, technicians, flight controllers, and fellow astronauts to see accidents coming and prevent them from happening.