
When Truth Depends on How It Travels


This article is in reference to:

A System of Record Needs a System of Communication

As seen on: cfcx.work

The real problem behind “data issues”

Most teams don’t argue about whether they need a “single source of truth.” They argue about why, once again, the truth was wrong when they needed it.

That gap has real costs: slipped go-live dates, duplicated work, late-night rework, and a quiet erosion of confidence in both tools and teammates. When the system of record keeps surprising people, the organization pays twice—once in operational friction, and again in trust that is hard to rebuild.

This is the “why” behind the original piece. On the surface, the story is about spreadsheets, ERPs, and item masters. Underneath, it is about something more basic: the gap between what a system of record claims to be (the state of the world) and how people actually experience it in the flow of work.

The article is trying to name that gap so operators can redesign around it instead of fighting the same “data issue” over and over. It argues that this gap is routinely misdiagnosed. It shows up as “data quality” issues, failed imports, and rework. But those are surface symptoms.

The deeper pattern is that organizations keep trying to solve a coordination problem with a storage solution. They are buying or building better databases when what keeps breaking is the communication protocol around those databases. That is why the piece matters: it reframes a familiar failure mode.

A system of record, it suggests, is not just a technical object—it is a social contract. And like any contract, its power comes less from where it lives and more from how changes are communicated and accepted.

From files and fields to promises and protocols

The article’s central move is to redefine a system of record from “the master file” into “a promise about reality.” That shift sounds abstract, but it has concrete consequences.

Seen as a file, the system of record is done once it is updated. Seen as a promise, it is not done until everyone who relies on it has had a fair chance to realign their work. The piece is pushing operators to adopt the second lens.

This is why it stresses that record integrity is a human protocol, not a property of the tool. Access controls, version history, and audit logs can prevent certain classes of error, but they cannot prevent the most common one: people acting with outdated but once-true information.

In that light, “single source of truth” becomes a coordination challenge. The real questions are:

  • Who is allowed to move the truth forward?
  • How do they signal that movement?
  • How do others know when they must reconcile their own work to the new state?

By foregrounding these questions, the post is not just giving operational advice. It is asking leaders to acknowledge that data governance is governance of expectations, not just governance of fields and tables.

The silent fork: how systems diverge without anyone noticing

A central image in the article is the “forked reality.” One file, many copies, multiple sets of assumptions diverging quietly until something breaks.

That fork emerges from a pattern that is almost invisible because it feels efficient: a shared sheet, open edit access, quick updates made by whoever sees a problem. In isolation, each change is rational. In aggregate, the effect is that two truths coexist: the updated record and the work already in flight based on an earlier snapshot.

The NetSuite item master example is not just a cautionary tale about one botched import. It illustrates a broader dynamic:

  • The analyst is correct to fix the codes in the master.
  • The consultant is correct to trust the spreadsheet they downloaded earlier.
  • The system is correct in storing the most recent values.

Despite everyone doing something reasonable, the project still loses time. The failure is not in any single action; it is in the absence of a shared pattern that links “I changed the record” to “everyone who depends on this knows they must resync.”

The article’s emphasis on making updates and notifications a single transaction is an attempt to close that gap. Treating change and communication as an atomic unit is a way of preventing the fork from forming in the first place. The file and the mental models move together, or the change is considered incomplete.

In this sense, the post is less about tooling choice and more about temporal alignment. It asks teams to pay attention to when people find out about changes as much as what changed.

Designing for alignment instead of control

The operational patterns proposed—change logs, named owners, dedicated channels, freshness checks, batch windows—might look like standard process design. The underlying intent is more specific: shift from a model of control to a model of alignment.

Control-centric governance tries to reduce risk by limiting who can change data and how often. Alignment-centric governance accepts that change is necessary and constant, and focuses instead on making those changes legible and predictable to everyone affected.

Seen through that lens, each suggested pattern plays a distinct role:

  • Change logs make it easier to see what moved without rereading the entire record.
  • Named owners give people a mental map of “who to watch” and “where changes originate.”
  • Dedicated communication channels separate signal from noise so important updates are less likely to be buried.
  • Freshness checks insert a small, repeatable ritual at the start of work to prevent large, expensive rework later.
  • Batch windows trade real-time change for predictable change when volatility becomes more harmful than delay.

Collectively, these practices recognize that the limiting factor in operations is rarely the capacity to store or update data. It is the capacity of humans to track and integrate those updates into ongoing work.

From critique to practice

Seen as a whole, the piece is asking for a practical shift: from thinking of truth as a static object to treating it as a shared, continually renegotiated state.

Once truth is framed as a coordination property, the success of an ERP rollout or process redesign is no longer measured only by whether the fields are configured correctly, but by whether the surrounding communication patterns make change predictable and legible.

A “clean” item master that keeps surprising people is not clean in any meaningful operational sense. Private spreadsheets, screenshot archives, and constant revalidation checks are not just bad habits; they are local attempts to buffer against an unreliable protocol. They signal that the system of communication has not kept pace with the system of record.

The forward-looking move is straightforward, if not easy. Every time a team names a system of record, it can pair it with an explicit protocol: who owns the changes, how and where updates are announced, what “receipt” looks like, and which rituals (like freshness checks) keep the whole thing from silently drifting.
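One way to keep that pairing from staying aspirational is to write the protocol down as a small, named artifact alongside the system it governs. The structure and every field value below are invented for illustration; the specific owner, channel, and rules are placeholders a team would fill in for each record it declares authoritative.

```python
from dataclasses import dataclass


@dataclass(frozen=True)
class RecordProtocol:
    """A system of record paired with its communication protocol."""
    system_of_record: str
    owner: str                 # who is allowed to move the truth forward
    announcement_channel: str  # where changes are announced
    receipt_rule: str          # what counts as the change being received
    freshness_ritual: str      # the check done before acting on a snapshot


# Hypothetical example, echoing the item master story:
item_master = RecordProtocol(
    system_of_record="NetSuite item master",
    owner="data operations analyst",
    announcement_channel="#item-master-changes",
    receipt_rule="acknowledgment from each downstream lead",
    freshness_ritual="re-export the sheet before any bulk import",
)
```

Whether this lives in code, a wiki page, or a team charter matters less than that it exists and is reviewed whenever the record or the team changes.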

Framed this way, the original article is less a critique of tooling and more an invitation to treat communication as first-class infrastructure. The system of record is only as reliable as the pathways that carry its changes into people’s plans.

The deeper implication is cultural. Organizations that learn to see truth as something they coordinate on, rather than something they merely store, will spend less time recovering from invisible forks in reality. They will still have bugs, delays, and surprises—but fewer of them will be self-inflicted by their own silence.

That is the wager behind pairing a system of record with a system of communication: better data emerges not from perfect databases, but from a repeatable practice of keeping promises about reality, out loud, together. For leaders, the next step is simple and concrete—whenever they declare “this is the source of truth,” they can also ask, “and what is our shared protocol for keeping it true in people’s hands?”