“When life gives you lemons, make lemonade.”
A quote attributed to Elbert Hubbard in 1915 — still relevant 111 years later.
That line has been on my mind recently after I unexpectedly found myself speaking at a Proximity Collective networking event in Denver, led by Brian Midtbo. During an exercise, I was asked to give a ten-minute talk about a life event that fundamentally changed how I view decision-making, teamwork, and leadership under pressure.
I hadn’t planned to speak. I was unprepared. Which, in hindsight, made the exercise oddly fitting.
In my professional life, I wear two hats that seem unrelated at first glance. On one side, I work as an Information Technology and Cyber Security Advisor, focusing on compliance, resilience, and penetration testing — both cyber and physical. On the other, I am a Technical Scuba Diving Instructor Trainer and expedition leader, responsible for planning and executing complex dives in remote environments.
One profession operates in data centers and boardrooms. The other operates below the surface of the ocean. Yet the human factors involved are strikingly similar.
Two professions, one common weakness
As a security advisor, my job is to identify weaknesses in systems — often by deliberately exploiting them. Physical penetration testing involves attempting to gain unauthorized access to “secure” facilities, under controlled and legally sanctioned conditions. It requires preparation, awareness of regulations, and an understanding that real-world environments rarely behave as planned.
As a technical diving instructor, the stakes are more obvious. Planning a complex dive or expedition is not just about skills or procedures. It’s about teamwork, communication, and the ability to adapt when conditions change. The mantra “plan your dive and dive your plan” is sound — until reality intervenes.
In both fields, we tend to overestimate how well preparation translates to performance under stress.
The Sri Lanka incident
Off the east coast of Sri Lanka lies the wreck of HMS Hermes, the world’s first purpose-built aircraft carrier, sunk during World War II and now considered a war grave. It is a technically demanding dive, complicated by logistics, environmental conditions, and limited local resources.
The team was well trained. Skills were solid. Planning was extensive. Multiple preparation dives were conducted. On paper, everything checked out.
What the plan did not fully account for was how individual stress manifests under real pressure.
During the dive, one team member developed Immersion Pulmonary Oedema (IPO) — fluid in the lungs — resulting in acute breathing distress. With hours of decompression remaining, an immediate ascent would have been life-threatening. The diver became mentally unresponsive, breathing at an elevated rate, rapidly consuming gas reserves.
At this point, the dive plan changed drastically, and anything other than getting the diver safely to the surface ceased to be relevant.
The team had to adapt in real time: managing a compromised diver, coordinating gas sharing, handling extreme currents, and making decisions under escalating stress. Communication became critical — and fragile. Simple actions required precise coordination. Assumptions about how someone “should” behave no longer held.
The diver survived and physically recovered. Mentally, it took over a year before he returned to the water — and ultimately, he stopped technical diving altogether.
The penetration test incident
In a very different environment, I was conducting a physical penetration test for a client — attempting to access an office building after hours. The initial engagement was successful, but the scope was later extended. By then, the team had moved on to other projects, and I attempted a follow-up test alone.
Access was easy. A poorly secured door and a bypassable magnetic lock led me inside. For a brief moment, I was exposed on the street — but the area was typically deserted.
Except that night.
A security patrol drove by just as I closed the door behind me. Normally, when this happens, we stay calm and work through the standard steps: a conversation, verification of authorization, and a debrief. This time, that did not happen. The head of the security team was unaware of the extended scope and, being on vacation, was unreachable. The escalation path failed.
I spent the night in a holding cell.
Only when colleagues noticed I hadn’t shown up to work the next day did the situation come to light. After many phone calls and delays, the misunderstanding was resolved.
Why these stories matter
These incidents are not about diving or hacking. They are about human factors.
In both cases, plans existed. Roles were defined. Risks were considered. And yet, critical gaps remained — not in technical capability, but in communication, awareness, and the ability to adapt when assumptions broke down.
In the dive, early warning signs existed: stress during practice dives, minor equipment issues, discomfort that went unspoken. Individually, they seemed manageable. Collectively, they formed a slippery slope.
In the penetration test, the failure wasn’t technical either. It was procedural and human: assumptions about who knew what, incomplete escalation paths, and a lack of redundancy when key people were unavailable.
Most importantly, both situations exposed a common leadership fallacy:
A team leader believing the team understands their roles does not mean the team can execute those roles effectively when conditions deteriorate.
Stress changes behavior. Fear silences voices. Authority gradients inhibit speaking up. Ego and performance anxiety distort judgment. Teams that function perfectly in training or planning sessions may struggle when reality deviates from the script.
The uncomfortable takeaway
Human systems fail quietly before they fail catastrophically.
People often see problems forming but hesitate to speak up — unsure whether the issue is “serious enough,” worried about disrupting momentum, or assuming someone else has a better view. Leaders may believe they have created clarity, while team members privately question their readiness or understanding.
The real challenge is not eliminating uncertainty — that’s impossible. The challenge is designing teams, cultures, and plans that expect uncertainty, encourage dissent, and adapt when reality refuses to cooperate.
Because lemons are inevitable.
The real question is whether your team is actually ready to make lemonade when the recipe suddenly changes.
Paul Emous
Program Director | Security Advisor | Technical Extended Range Instructor Trainer
Paul focuses on mission-critical IT, digital transformation, and sovereign system design. He leads complex multi-prime environments across Europe and the Middle East, acting as a neutral authority to de-risk delivery and ensure predictable outcomes.
Separately, Paul is a Technical Extended Range Instructor Trainer and expedition leader in advanced and remote diving. He teaches Just Culture and decision-making under pressure, where risk is real and accountability is absolute.
Contact: paul@mousemedia.nl