Deadlines Make You Dumb
How the Yerkes-Dodson law turns pressure into tunnel thinking

TL;DR: Stress narrows your thinking fast. Under deadline pressure, the first acceptable answer starts looking brilliant and every inconvenient doubt starts looking optional. That is how teams ship bad decisions with a straight face.
You have launched under pressure. You know the feeling. The deadline is fixed, the client is waiting, and someone raises a concern at the worst possible moment. The data is off. The edge case hasn’t been handled. That feature breaks on Android. You have two options: delay and face the consequences now, or ship and face them later. Under pressure, you ship. Not because you evaluated the tradeoffs. Because the deadline made the decision before your brain got a vote.
This is not laziness. It is physiology.
Stress narrows; it does not sharpen
When Robert Yerkes and John Dodson ran their 1908 experiments on learning and stress, they found something uncomfortable. Some pressure improved performance. But beyond a certain point, more pressure made performance worse. The relationship was an inverted U. Too little stimulation and nothing got done. Too much and the system broke down. Moderate stress helped. Extreme stress didn’t just fail to help, it actively degraded the quality of thinking.
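The inverted U can be pictured as a toy curve. The quadratic form and the numbers below are illustrative assumptions, not Yerkes and Dodson's data; the only point is the shape: performance climbs with arousal, peaks at a moderate level, then degrades.

```python
# Toy model of the Yerkes-Dodson inverted U. The quadratic shape and
# the optimum value are illustrative assumptions, not measured data.

def performance(arousal: float, optimum: float = 0.5) -> float:
    """Return a 0-1 performance score that peaks at the optimum arousal."""
    return max(0.0, 1.0 - ((arousal - optimum) / optimum) ** 2)

# Too little, moderate, and too much arousal: low, peak, low.
for a in (0.1, 0.5, 0.9):
    print(f"arousal={a:.1f} -> performance={performance(a):.2f}")
```

Running it shows the same score at both extremes and the peak in the middle, which is the whole claim: more pressure helps only up to a point.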
When you perceive threat, cortisol floods your system and your thinking narrows to the most immediate problem. Great system if the threat is a grizzly bear and not a Jira ticket. If the threat is a deadline, it means you stop weighing options and start executing the first answer that makes the feeling go away. You’re not deciding anymore. You’re just making the discomfort stop.
Design work requires wide thinking. You need to hold multiple possibilities at once, consider edge cases, question your own assumptions. Stress collapses that. It turns a complex tradeoff into a binary: ship or don’t ship. The nuance disappears, and with it the judgment that might have caught the problem.
Barry Staw documented what this looks like at the organizational level. His 1981 threat-rigidity research showed that when groups feel threatened, they centralize authority, restrict information flow, and lean harder on whatever worked before. They stop experimenting. They stop tolerating dissent. The engineers closest to the problem get quieter, because raising concerns under pressure feels like making the situation worse. The people with the most relevant knowledge go silent at exactly the moment you need them most.
“We had to prove we weren’t ready”
On the night of January 27, 1986, engineers at Morton Thiokol spent three hours trying to stop the Challenger from launching. The forecast for the next morning called for temperatures in the low twenties. The rubber O-rings that sealed the solid rocket booster joints had never flown below 53 degrees. Cold made rubber stiffer. A stiff O-ring could fail to seat. If hot gas bypassed the seal, it would burn through the external fuel tank. Roger Boisjoly, the O-ring specialist, told NASA that temperature was a discriminator and they should not fly.
NASA managers pushed back. Lawrence Mulloy asked where it was written that you couldn’t launch below 53 degrees. George Hardy said the recommendation was appalling. The pressure in that teleconference was total. Challenger had already been delayed five times, the launch was tied to the President’s State of the Union, and Morton Thiokol held an $800 million contract with a $10 million penalty for delays. The engineers presented their erosion data. NASA demanded quantitative proof of failure. Boisjoly had concern and analysis. He did not have certainty. There is no certification that says “this will fail at X degrees.” You prove safety in steps. You don’t prove catastrophe in advance.
Jerry Mason, Morton Thiokol’s senior vice president, called a management caucus and turned to Bob Lund, the VP of Engineering. He told Lund to take off his engineering hat and put on his management hat. Lund changed his recommendation. The engineers were overruled. The launch was approved.
Lund later testified to the Rogers Commission. He said:
“We had to prove to them that we weren’t ready, and so we got ourselves in the thought process that we were trying to find some way to prove to them it wouldn’t work.”
— Robert Lund, Rogers Commission testimony, 1986
Read that again. The burden of proof had inverted. The engineers were no longer trying to prove the launch was safe. They were trying to prove it was dangerous, to an audience that had already decided to fly. Challenger launched the next morning in 36-degree weather. Seventy-three seconds into flight, the vehicle broke apart, killing all seven crew members.
Staw’s threat-rigidity hypothesis was not an abstract idea. It was those three hours on the phone the night before Challenger launched.
The pattern you already know
You do a smaller version of this all the time. You cut the usability test because the build is happening tomorrow. You skip the accessibility check because there’s no time to fix it anyway. You hear a concern from the developer and redirect the conversation because addressing it would mean pushing the demo. The concern doesn’t go away. It just gets scheduled into someone else’s sprint.
The pressure convinces you these are rational tradeoffs. They’re not. They’re the same cognitive collapse, at smaller scale. Stress has narrowed your attention to the immediate goal and blocked out everything that would complicate reaching it. The concern felt inconvenient, not important. That’s the tell. Inconvenient concerns are almost always the ones worth taking seriously.
The concern that feels inconvenient
The fix is not better time management or more slack in your schedule. It is one question, asked at the worst possible moment: what would it take to check this?
Not “should we delay?” Not “is this really a problem?” Those questions invite rationalization. “What would it take to check this?” is concrete. It forces the concern out of abstraction and into an estimate. An hour of testing. A day of investigation. Sometimes the answer is small enough that you can do it before the ship date. Sometimes the answer tells you the risk is real and you need to have the harder conversation. Either way, you’ve given the concern a fair hearing instead of dismissing it because the timing was bad.
The engineers who worked on Challenger could have answered that question. They knew their data, and they knew its limits. Nobody at NASA asked what it would take to verify their concern. The deadline made that question feel impossible.
It wasn’t.

