
Samuelson and Zeckhauser showed in 1988 that people overwhelmingly stick with the default option — even when alternatives are objectively better. Every product you build inherits this.
In 1988, William Samuelson and Richard Zeckhauser published a study in the Journal of Risk and Uncertainty that changed how behavioral economists think about decision-making. They demonstrated that when people are presented with a default option, they overwhelmingly favor it — even when an alternative is objectively superior. They called it status quo bias, and more than three decades of subsequent research have confirmed it across every context studied: retirement plans, organ donation, privacy settings, software configurations, insurance selections, and product design.
The implication for anyone who builds products is direct: the default configuration you ship is the configuration most users will keep. Not because they evaluated it and chose it — because they didn't change it. That makes the default the most powerful design decision in your product, and most teams treat it as an afterthought.
Samuelson and Zeckhauser's original experiments showed that participants favored the status quo option at rates far exceeding what random selection would predict. Kahneman, Knetsch, and Thaler expanded the finding in 1991, connecting it to loss aversion — people experience the pain of giving up something (even a default they didn't choose) roughly twice as intensely as the pleasure of gaining something equivalent.
Research published in Judgment and Decision Making in 2013 confirmed that people favor defaults even when they have no evidence that the default is beneficial. The mechanism isn't rational evaluation — it's cognitive efficiency. Changing a default requires effort: evaluating alternatives, understanding trade-offs, making a decision, and accepting responsibility for the outcome. Keeping the default requires nothing.
Complexity amplifies the effect. A study by Dinner, Johnson, and colleagues published in the Journal of Experimental Psychology in 2011 found that as the number of options and the complexity of the decision increase, the fraction of people who accept the default rises. More choices, more cognitive load, more defaults accepted.
Every product ships with defaults: the initial privacy settings, the pre-selected notification preferences, the standard configuration for data sharing, the default authentication method, the preset dashboard layout. Each of these is a decision made by the product team that the majority of users will passively accept.
This creates a responsibility. If the default is insecure — encryption off, data sharing on, weak authentication enabled — most users will run the product in that insecure state. Not because they chose insecurity, but because they accepted the default. The product team made the real decision; the user just didn't override it.
At Kief Studio, one of our four core values is "Security by Default" — every solution ships locked down. Security is not a feature we sell. It is just how we build things. That value statement is a direct application of what Samuelson and Zeckhauser proved: the default is the decision. We don't offer encryption as an opt-in. We don't make two-factor authentication an advanced setting. We don't pre-select "share analytics data" and hope users will notice the checkbox. The LTFI methodology that runs our operations treats default configuration as a first-class engineering concern — not a settings page someone reviews after launch.
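To make that concrete, here is a minimal sketch of the pattern, with hypothetical names rather than anything from our actual stack: the protective values are the baseline, and loosening any of them requires an explicit override.

```typescript
// Hypothetical illustration of "Security by Default": every field's
// default is the most protective setting, and weakening any of them
// requires a deliberate, explicit override from the integrator.
interface SecurityConfig {
  encryptionAtRest: boolean;
  twoFactorRequired: boolean;
  shareAnalytics: boolean;
  sessionTimeoutMinutes: number;
}

const SECURE_DEFAULTS: SecurityConfig = {
  encryptionAtRest: true,    // on by default; opting out is the active choice
  twoFactorRequired: true,   // not an "advanced" setting
  shareAnalytics: false,     // nothing pre-checked on the user's behalf
  sessionTimeoutMinutes: 15,
};

// Overrides merge on top of the protective baseline, so a caller
// who configures nothing still gets the safe state.
function buildConfig(overrides: Partial<SecurityConfig> = {}): SecurityConfig {
  return { ...SECURE_DEFAULTS, ...overrides };
}
```

The structure is the point: the zero-effort path and the safe path are the same path.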
The logic is straightforward: whatever you set as the default is what you're choosing for your users. Nearly four decades of replication confirm this. Choose the option that protects them.
In regulated industries, default bias isn't just a UX concern — it's a compliance concern.
If a cannabis seed-to-sale tracking system defaults to "manual inventory reconciliation" instead of "automatic audit logging," most operators will run without automatic logging. When the state auditor arrives, the gap isn't a user error — it's a design decision that the product team made and the operator passively accepted.
If a fintech platform defaults to storing customer data in a region that doesn't comply with the client's data residency requirements, and the client never navigates to the settings page to change it — the compliance failure traces back to the default, not to the client's negligence.
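One way to encode that lesson in code, sketched here with hypothetical field names: settings with compliance weight either get the protective default (audit logging on), or, where no universally safe value exists (data residency varies per client), no default at all, so the system refuses to launch until someone makes the choice.

```typescript
// Sketch: compliance-weighted settings either default to the
// protective value, or have no default when no safe value exists.
type Region = "us-east" | "eu-west" | "ca-central";

interface PlatformConfig {
  automaticAuditLogging: boolean; // protective default: always on
  dataRegion: Region;             // no default: must be chosen explicitly
}

function initPlatform(settings: { dataRegion?: Region }): PlatformConfig {
  if (settings.dataRegion === undefined) {
    // Refuse to run rather than silently pick a region the client
    // never evaluated; the failure surfaces at setup, not at audit.
    throw new Error("dataRegion must be set explicitly before launch");
  }
  return {
    automaticAuditLogging: true,
    dataRegion: settings.dataRegion,
  };
}
```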
The EU's regulatory approach to dark patterns (under the Digital Services Act and the forthcoming Digital Fairness Act, expected 2026) increasingly scrutinizes default settings that don't serve the user's interest. The California Consumer Privacy Act regulations, effective January 1, 2026, added specific protections against defaults that manipulate consent.
The regulatory direction is consistent across jurisdictions: defaults are design decisions, and design decisions carry accountability. Building with protective defaults from the start means this isn't a compliance scramble later — it's already how the system works.
The research suggests a framework for choosing defaults:
Default to the safest option. If a default has security or privacy implications, the most protective setting should be the default. Users who need less protection can opt out; users who need the protection (which is most of them) are covered automatically.
Default to the simplest configuration. Complexity increases default acceptance. If the default is already the simplest reasonable configuration, users who accept it get a product that works well. Users who need customization will invest the effort to change settings — they're the minority who would have done so regardless.
Make defaults visible, not hidden. A default that users don't know exists can't be evaluated. Surfacing the default during onboarding — "We've set your data to encrypt at rest by default. You can change this in settings." — respects user autonomy while maintaining the protective default.
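A minimal sketch of that pattern, reusing the hypothetical SecurityConfig from the earlier sketch: derive the onboarding copy from the live configuration, so what users are told and what the product actually does can't drift apart.

```typescript
// Sketch: generate the onboarding notice from the runtime config
// rather than hard-coding it, so the message always matches behavior.
function describeDefaults(config: SecurityConfig): string[] {
  return [
    `Data is ${config.encryptionAtRest ? "" : "NOT "}encrypted at rest by default.`,
    `Two-factor authentication is ${config.twoFactorRequired ? "required" : "optional"}.`,
    `Analytics sharing is ${config.shareAnalytics ? "on" : "off"} unless you change it.`,
    "You can change any of these in Settings.",
  ];
}
```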
Audit defaults periodically. Regulatory requirements change. Best practices evolve. A default that was appropriate at launch may be insufficient two years later. Review defaults on the same cadence as security patches — they carry a similar level of responsibility.
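One way to put that review on a cadence, assuming the buildConfig sketch from earlier and a Jest-style test runner: pin the intended defaults as assertions, so any regression fails CI the way a failing security check would.

```typescript
// Sketch: pin the protective defaults in a test so a regression in
// any default breaks the build, reviewed like a security patch.
import { describe, it, expect } from "@jest/globals";

describe("shipped defaults", () => {
  it("remain the protective configuration", () => {
    const config = buildConfig(); // no overrides: what a passive user gets
    expect(config.encryptionAtRest).toBe(true);
    expect(config.twoFactorRequired).toBe(true);
    expect(config.shareAnalytics).toBe(false);
  });
});
```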
Test whether users understand what the default does. If user research shows that most users don't understand a default setting well enough to make an informed decision about changing it, the default carries even more weight. The less users understand a setting, the more important it is that the default protects them.
Default bias can be used to help people or to exploit them. The same research that supports protective defaults also explains why companies pre-check "sign me up for marketing emails" and make the unsubscribe path deliberately complex.
The distinction isn't subtle: defaults that serve the user's interest are ethical design. Defaults that serve the company's interest at the user's expense are manipulation. The research doesn't change based on intent — people accept defaults either way. What changes is whether the designer used that knowledge to protect or to extract.
This is where the behavioral science connects to something more fundamental. The reason we build with protective defaults at Kief Studio isn't because the research says it's effective — though it is. It's because the people using these systems deserve to be protected by the choices their technology makes for them, not exploited by those choices. The behavioral science gives us the mechanism. The intent comes from somewhere simpler: build things that help the people who use them.
For anyone building products where trust matters — and in regulated industries, trust is the business — the ethical approach is also the pragmatic one. A product whose defaults protect users earns the kind of trust that sustains long-term client relationships. We've had clients for thirteen years. Defaults that protect are a small part of why. But they're a real part.
Default bias (also called status quo bias) is the well-documented tendency for people to accept pre-selected options rather than actively choosing an alternative. First demonstrated by Samuelson and Zeckhauser in 1988 and extensively replicated since, the effect occurs because changing a default requires cognitive effort — evaluating alternatives, understanding trade-offs, and accepting responsibility for the decision — while accepting the default requires none.
If the default configuration is insecure (encryption off, weak authentication, data sharing enabled), most users will run the product in that insecure state — not by choice, but by inaction. Making the secure option the default ensures that users who don't actively engage with settings are still protected. This is especially critical in regulated industries where configuration choices have compliance implications.
Whether a default is ethical depends on whose interest it serves. Defaults that protect users (privacy-preserving, security-enabled, data-minimizing) are ethical applications of the research. Defaults that extract value from users (pre-checked marketing consent, auto-enrolled subscriptions, data-sharing enabled) exploit the same cognitive mechanism for the company's benefit at the user's expense. The EU and California are increasingly regulating defaults that don't serve user interests.
Default bias is one mechanism within choice architecture — the broader practice of designing how choices are presented. Richard Thaler and Cass Sunstein (Nudge, 2008) identified defaults as one of six key tools of choice architecture. Thaler received the Nobel Prize in Economics in 2017 partly for this work. Default settings are the most powerful nudge because they require zero effort from the user and affect the largest number of people.