The standard line in this industry is to publish success stories. They are clean, they are reassuring, and they make good marketing. We publish ours too. But there is a question every prospective client is silently asking when they read one: what happens when it does not go smoothly?
This account answers that question. It is about a four-partner accounting practice in Zurich, an implementation that ran nearly three times longer than scoped, and the data quality problem we did not catch early enough. The engagement ended well. The path there was not the one we had planned.
The firm and the scope
The practice handled tax returns, annual accounts, and advisory work for around 140 client businesses: mostly small trade companies, medical practices, and family-owned retailers in the canton of Zurich. They had been operating for 22 years and had accumulated processes in the same way most long-running firms do: one workaround at a time.
The Clarity Scan identified three high-priority automation targets:

- Collecting client documents ahead of filing deadlines
- Tracking filing deadlines across all active cases
- Producing client status reports
The scope was agreed: two sprints across four weeks. Sprint 1 would handle document collection and deadline tracking. Sprint 2 would automate client status reporting. The quote was provided in writing. The start date was confirmed.
What went wrong in week two
The document collection automation required a clean client list: names, contact emails, the documents expected from each client, and the filing deadline associated with each case. We asked for this during the briefing. We were told it lived in their practice management software.
It did. In three different versions.
Over 22 years, the firm had migrated their software twice. Each migration had carried forward the previous data without cleaning it. The result was a client database with 140 active clients, approximately 60 archived clients who had not been removed, duplicate entries for 11 clients who had changed their legal structure at some point, and email addresses that in some cases reflected personal accounts the clients had abandoned years earlier.
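A basic audit script would have surfaced most of this before the first send. The sketch below is illustrative only: it assumes a hypothetical CSV export with `name`, `email`, and `status` columns, which is not the firm's actual data format.

```python
import csv
from collections import Counter

def audit_client_list(path):
    """Flag archived entries, duplicate names, and obviously invalid
    emails in a client export. All field names here are illustrative."""
    with open(path, newline="", encoding="utf-8") as f:
        rows = list(csv.DictReader(f))

    # Archived clients still present in the export
    archived = [r for r in rows if r.get("status", "").lower() != "active"]
    # Duplicate entries, e.g. clients re-entered after a change of legal structure
    name_counts = Counter(r["name"].strip().lower() for r in rows)
    duplicates = [n for n, c in name_counts.items() if c > 1]
    # Addresses that cannot possibly receive a document request
    missing_email = [r["name"] for r in rows if "@" not in r.get("email", "")]

    return {
        "total": len(rows),
        "archived": len(archived),
        "duplicate_names": duplicates,
        "missing_or_invalid_email": missing_email,
    }
```

A check like this takes an hour to write and run. It does not catch the hardest case, a syntactically valid address the client abandoned years ago, but it would have caught the archived records and the duplicates before any message went out.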
We discovered this at the end of week two, when the first test run of the collection workflow bounced 23 messages and delivered a document request to the wrong contact for one client.
We stopped the automation immediately.
The honest conversation
The call with the managing partner the following morning was not comfortable. We explained what we had found, what had caused it, and, critically, what we should have done differently during scoping.
The data audit should have been a separate phase. We had assumed the client list was clean enough to work with. That assumption was wrong, and we should have verified it before agreeing to the timeline. The four-week scope had been priced on clean data. We were now looking at a data remediation project before the automation could safely run.
The managing partner's response was measured. He said: "I suspected the database had issues. I did not know how to quantify them. I should have told you that upfront."
Both sides had missed something. We agreed to share the remediation cost: MEIKAI absorbed the additional analytical work; the practice contributed three days of a staff member's time to go through the client records and correct them. We revised the timeline to eleven weeks and documented the revised scope in writing before continuing.
"We should have built a data verification step into every engagement where the automation depends on an existing client database. We did not. We do now."
What the remediation found
Working through the client records was not only a precondition for the automation. It was independently useful. The firm discovered:
- 14 clients whose contact details were outdated by more than two years
- 7 clients whose service agreements had never been formally closed after they stopped engaging
- 3 duplicate billing entries that had been generating minor discrepancies in the monthly reconciliation
- A pattern of inconsistent document naming that, once corrected, reduced the time to locate historical files by an estimated 20 minutes per week across the team
The database cleanup was, in retrospect, overdue. The automation work forced it to happen.
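The naming fix in particular is mechanical once a convention exists. A minimal sketch, assuming a date-prefixed convention of our own invention rather than the scheme the firm actually adopted:

```python
import re
from datetime import date

def normalize_filename(client, doc_type, received, ext="pdf"):
    """Build a consistent file name: YYYY-MM-DD_client_doctype.ext.
    The convention itself is illustrative, not the firm's actual scheme."""
    def slug(text):
        # Lowercase and collapse anything non-alphanumeric into hyphens
        return re.sub(r"[^a-z0-9]+", "-", text.lower()).strip("-")
    return f"{received.isoformat()}_{slug(client)}_{slug(doc_type)}.{ext}"
```

With every historical file renamed to a pattern like this, locating a document becomes a sort-and-scan rather than a search, which is where the estimated 20 minutes per week came from.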
The outcome at week eleven
By the end of the engagement, all three original workflows were running.
The document collection automation ran its first full cycle, covering 140 clients across an eight-week collection window, without a single manual follow-up from any partner. The deadline tracker had been running for six weeks by the end of the engagement with no missed deadlines and, according to the managing partner, significantly less anxiety in the office during peak season.
The client status reports went from 25–40 minutes each to approximately 8 minutes: a structured template, populated from the practice management system, reviewed and sent by the partner. Not eliminated: reviewed. The partners wanted to keep a human check on outgoing client communication. That was the right call.
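The shape of that workflow, a draft populated from system data and then reviewed by a human, is simple to sketch. The template text and record fields below are invented for illustration; they are not the firm's actual report format.

```python
from string import Template

# Illustrative template; the real report format belongs to the firm
REPORT = Template(
    "Status report for $client\n"
    "Open items: $open_items\n"
    "Next filing deadline: $deadline\n"
    "Documents still outstanding: $outstanding\n"
)

def draft_status_report(record):
    """Produce a draft from a practice-management record.
    The draft is reviewed and sent by a partner, never sent automatically."""
    return REPORT.substitute(
        client=record["client"],
        open_items=record["open_items"],
        deadline=record["deadline"],
        outstanding=", ".join(record["outstanding"]) or "none",
    )
```

The design point is in the last step, not the template: the automation stops at a draft, and a person decides what actually reaches the client.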
What changed in how we scope
Every engagement that touches an existing client database now includes a data audit as a named, priced step before automation work begins. This is not optional. If the data is clean, the audit takes half a day and costs little. If it is not, and it often is not in businesses that have been running for more than a decade, the audit is the most valuable thing we do before writing a single line of automation logic.
We also adjusted the Clarity Scan questionnaire. There is now a specific question about the age and migration history of any client or contact database that the automation will depend on. The answer shapes the scope before we price anything.
The honest summary
We missed a scoping step that cost both sides several weeks. We caught the problem before it caused a real failure. We shared the cost of fixing it. The result exceeded the original projected value. The managing partner renewed to the Continuity plan.
A Clarity Scan that launches a clean project is useful. A project that surfaces a hidden data problem, forces the remediation, and still delivers on the original objective is arguably more useful, even if nobody wanted it to go that way.
We publish this because prospective clients deserve to know what happens when something goes wrong. It goes like this: we stop, we tell you what we found, we share responsibility where it is shared, and we finish the work.