The Sunk Cost Fallacy Is Running Your Technology Stack
A company spent nearly a million dollars on failing software and chose to continue. Not because the future looked promising — because the past felt too heavy to abandon.

I hold certifications in marketing psychology, consumer behavior, positive psychology, and neuro-linguistic programming. Not because I wanted to be a therapist, but because I wanted to understand why people do what they do. I've been studying behavioral science formally for years and informally for much longer. I'm not a psychologist; that's a licensed profession with a doctoral requirement, and I respect the distinction. What I am is a CEO who invests heavily in understanding how people make decisions, because that understanding changes everything about how I build technology and run a business.
Most technology is built by people who think about systems. Data flows, API contracts, infrastructure scaling, deployment pipelines. These are important. They're also the easy part.
The hard part is the human layer. Why does a user abandon a workflow three steps in? Why does a team resist adopting a tool that objectively saves them time? Why does a client say they want a dashboard but never look at it after launch?
Systems thinking answers "how does this work." Behavioral science answers "how will people actually interact with it." The gap between those two answers is where most product failures live.
Consumer behavior research — the formal discipline, not the marketing-blog version — provides frameworks for understanding cognitive load, decision fatigue, loss aversion, status quo bias, and the gap between stated preferences and revealed preferences. These aren't abstract concepts. They're engineering constraints, as real as memory limits or network latency.
At Kief Studio, behavioral principles influence product decisions at the architectural level, not just the UI layer.
Onboarding flows are designed around cognitive load research. The research consistently shows that people can hold 4±1 items in working memory at once (Cowan, 2001 — the updated version of Miller's famous 7±2). Every onboarding screen we build is evaluated against that constraint. If a step requires the user to hold more than four pieces of new information simultaneously, the step gets split. This isn't UX polish — it's an engineering specification.
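Treating the working-memory limit as a specification means it can be checked mechanically. A minimal sketch of that idea; the field names and the chunking helper are hypothetical illustrations, not Kief Studio's actual tooling:

```python
# Split an onboarding flow into screens that each introduce at most
# MAX_NEW_ITEMS pieces of new information (Cowan's ~4-item working
# memory estimate, treated here as a hard engineering constraint).
MAX_NEW_ITEMS = 4

def split_into_steps(new_items: list[str], limit: int = MAX_NEW_ITEMS) -> list[list[str]]:
    """Chunk a list of new concepts into screens of at most `limit` items."""
    return [new_items[i:i + limit] for i in range(0, len(new_items), limit)]

# Six new fields exceed the limit, so the flow becomes two screens.
fields = ["org name", "role", "team size", "data source", "API key", "alert channel"]
screens = split_into_steps(fields)
assert all(len(screen) <= MAX_NEW_ITEMS for screen in screens)
```

A check like the final assertion can live in a test suite, so a designer adding a fifth field to a screen fails the build instead of quietly overloading the user.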
Default configurations are chosen based on status quo bias. Behavioral economics demonstrates that people disproportionately stick with whatever option is pre-selected. In regulated industries, this means the secure default must also be the easy default. If the user has to actively choose encryption, most won't. If encryption is on by default and the user has to actively disable it, almost no one will. Same outcome, opposite architecture — and the secure version costs nothing extra to build.
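The "same outcome, opposite architecture" point is easy to see in code. A sketch with hypothetical configuration fields, assuming a Python service:

```python
from dataclasses import dataclass

@dataclass
class StorageConfig:
    # Secure-by-default: status quo bias predicts most users keep
    # whatever is pre-selected, so the compliant option must be the
    # zero-effort option. Disabling encryption requires an active choice.
    encryption_at_rest: bool = True
    audit_logging: bool = True

config = StorageConfig()          # the path of least resistance...
assert config.encryption_at_rest  # ...is also the secure configuration
```

Flipping those defaults to `False` costs nothing at build time but shifts the entire user population toward the insecure configuration, which is the behavioral argument in one line of code.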
Notification and alerting systems account for alarm fatigue. In healthcare IT, alarm fatigue is a documented patient safety risk — clinicians exposed to too many alerts start ignoring all of them, including the critical ones. The same dynamic exists in operations dashboards, security monitoring, and compliance alerting. We design alert thresholds and escalation paths to minimize false positives, because behavioral science predicts exactly what happens when you don't: everything gets ignored.
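One common way to reduce false positives is persistence filtering: an alert fires only when a condition holds across several consecutive samples, not on every transient spike. A minimal sketch; the metric, threshold, and window are illustrative:

```python
from collections import deque

class PersistenceAlert:
    """Fire only when a metric breaches its threshold for `window`
    consecutive samples, suppressing the one-off spikes that train
    operators to ignore the channel entirely."""

    def __init__(self, threshold: float, window: int = 3):
        self.threshold = threshold
        self.recent = deque(maxlen=window)

    def observe(self, value: float) -> bool:
        self.recent.append(value > self.threshold)
        return len(self.recent) == self.recent.maxlen and all(self.recent)

cpu_alert = PersistenceAlert(threshold=90.0, window=3)
readings = [95, 40, 95, 96, 97]   # a lone spike, then a sustained breach
fired = [cpu_alert.observe(r) for r in readings]
# fires only on the final reading, after three consecutive breaches
```

Escalation paths follow the same logic: route a persistent breach to a human, and let everything below that bar accumulate silently in a log.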
Client communication cadences follow spaced repetition principles. The research on memory consolidation (Ebbinghaus, updated by Cepeda et al., 2006) shows that information retention follows specific timing curves. Quarterly business reviews, project status updates, and training sessions are scheduled at intervals that align with how memory actually works, not just what fits the calendar.
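Expanding intervals are straightforward to generate. A sketch using illustrative gaps loosely inspired by the spacing-effect literature; the specific day counts are assumptions, not a prescription from the research:

```python
from datetime import date, timedelta

def review_schedule(start: date, intervals_days=(1, 7, 16, 35)) -> list[date]:
    """Generate follow-up dates with expanding gaps, in the spirit of
    the spacing effect (Cepeda et al., 2006): short gaps early, while
    the material is fragile, longer gaps as retention stabilizes."""
    dates, current = [], start
    for gap in intervals_days:
        current = current + timedelta(days=gap)
        dates.append(current)
    return dates

sessions = review_schedule(date(2024, 1, 1))
# reviews on Jan 2, Jan 9, Jan 25, Feb 29: the gaps grow over time
```

The same generator can drive training refreshers, QBR scheduling, or onboarding check-ins by swapping in different interval tuples.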
Regulated industries have a specific problem that behavioral science addresses directly: compliance depends on human behavior, and humans are predictably imperfect.
A compliance program that relies on people remembering to follow procedures is a compliance program that will fail. Not because the people are bad — because the procedures are designed for how managers wish people behaved, not how they actually behave.
Effective compliance systems account for cognitive shortcuts, attention limits, and the tendency to take the path of least resistance. They make the compliant action the easiest action. They surface the right information at the right moment — not in a 200-page policy document that nobody has read since onboarding.
This isn't manipulative. It's respectful. It acknowledges that the people operating within these systems are humans with finite attention and competing priorities, and designs the system to work with that reality instead of pretending it doesn't exist.
I'm occasionally asked why a tech CEO has certifications in psychology-adjacent fields. The answer is simple: because the work demanded it.
Early in my career, I noticed that the projects that succeeded and the ones that failed often had identical technical execution. The difference was almost always in the human layer — user adoption, team dynamics, client communication, stakeholder alignment. The technical skills got the system built. The behavioral skills determined whether anyone actually used it.
The formal study didn't give me magic powers. It gave me vocabulary for patterns I'd already been observing and frameworks for problems I'd been solving intuitively. "Status quo bias" is a more useful concept than "people don't like change" because it predicts specific interventions (change the default) rather than just describing a feeling.
Every credential changed how I think about at least one aspect of the business. Not all of them agreed with each other — and the ones I disagree with sharpened my thinking more than the ones I accepted at face value.
Does behavioral science apply to B2B technology, or just consumer products?
Yes. B2B purchasing decisions are made by humans, and those humans are subject to the same cognitive biases as consumers — anchoring, loss aversion, status quo bias, social proof. The contexts differ, but the underlying decision-making mechanisms are the same. Understanding these patterns improves product design, sales communication, and user adoption.
How is this different from UX research?
UX research typically focuses on usability — can the user complete a task efficiently? Behavioral science addresses the broader question of why a user does or doesn't engage with a system at all. It covers motivation, decision-making, habit formation, and cognitive constraints. Good UX research incorporates behavioral principles; behavioral science extends beyond the interface into system architecture, communication design, and organizational behavior.
Do you need a psychology degree to apply this?
No. What you need is formal study of the research — through certifications, coursework, or disciplined reading of primary sources — and the ability to apply those frameworks to practical problems. A certification in marketing psychology or consumer behavior provides the vocabulary and the evidence base. The application comes from experience building systems and observing how people interact with them.
If you had to pick one behavioral principle for technology leaders, which would it be?
Default bias (status quo bias). The option that's pre-selected is the option most users will accept. In product design, this means the default configuration carries enormous weight — it should be the safest, most effective option, not the one that generates the most upsells. In regulated industries, it should be the compliant option. One architectural decision about defaults often eliminates more support tickets and compliance issues than months of feature work.