Story #1 — Stabilizing a Compliance Assessment After a Six‑Month Leadership Gap

Re‑establishing engineering discipline in a safety‑critical environment

Walking Into a Leadership Vacuum

When I joined the program, I walked in on Day 1 of an external ASPICE Compliance assessment covering both firmware and software. Before the assessment even began, I learned that the organization had been operating without an ASPICE software assessment lead for six months. There had been no preparation, no coaching, no alignment, and no one responsible for ensuring readiness.

As the Day 1 assessment session ended, I had a clear understanding of the organization's actual state of preparation, and the gaps were significant. The teams were walking into interviews without alignment, coaching, or clarity. I immediately gathered the team members scheduled for the next day's interviews and began preparing them: what to present and how to present it, the types of questions the auditors would ask, and how to handle them.

Stabilizing the Assessment in Real Time

Day 1 → Preparing Day 2

At the end of Day 1, I shifted into damage‑control mode. I worked with the Day‑2 interviewees to collect their processes and evidence, organize their materials, and structure their presentations. With almost no time left, the focus was on basic, essential readiness: what to present, the order in which to present it, and how to pull up artifacts immediately when assessors requested them. Within the engineering time available, I made sure each team had the right evidence prepared and accessible for the next day.

Day 2 → Preparing Day 3 and Day 4

By Day 2, it was clear that staying only one day ahead wouldn’t be enough to stabilize the assessment. The preparation gaps were too large, and the teams scheduled later in the week were just as unprepared as those on Day 1.

I attended all software process audits to observe firsthand where the assessors were probing, where teams were struggling, and what information or presentation gaps were creating unnecessary findings. This gave me real‑time insight into exactly what the next wave of interviewees needed to be ready for.

Immediately after the auditors completed the Day‑2 sessions, I met with the Day 3 and Day 4 interviewees outside the audit room to begin closing their gaps early. With very little time left, we focused on rudimentary coverage: the minimum structure required to avoid unnecessary findings. I worked with each team to collect their processes and evidence, organize their materials, and coached them to respond quickly and coherently under pressure.

Day 3 → Reduced Pressure, Emerging Clarity

By Wednesday, the pressure within the organization had greatly reduced, though it was not fully resolved. The frantic uncertainty of the first two days was gone, replaced by a clearer understanding of what to present, how to present it, and in what order. Teams were still under stress, but now they had structure, direction, and a shared mental model. They entered the interviews with enough clarity and confidence to navigate the auditors' questions effectively. As the assessment sessions wrapped up, I held follow‑up sessions with the Day 4 presenters.

Day 4 → Test Processes and New Presenters

Day 4 followed the same pattern as Day 3, but with a new set of presenters and a shift into the test‑related process areas. The preparation gaps were similar, and the time pressure was the same. I repeated the stabilization rhythm — reviewing their processes, organizing their evidence, and giving them the same basic structure that had helped earlier teams succeed. With new presenters and new process areas, Day 4 was essentially a continuation of Day 3, applying the same triage‑driven preparation model to ensure consistent, coherent presentations.

End of Day 4 → Full Diagnostic of Systemic Gaps

By the end of Day 4, I had compiled a comprehensive set of notes capturing the assessment gaps I observed across all software process audits: gaps in presentation, in information transfer, and in how teams demonstrated alignment with their defined processes. My analysis showed clear patterns: missing linkage between the process and the evidence, inconsistent presentation flow, and artifacts whose strength or completeness did not match the intent of the process areas being assessed. I also documented gaps where no artifacts existed at all, as well as cases where artifacts were present but fell short of ASPICE expectations, including "design" documents that had been reverse‑engineered from the code rather than produced upstream.

These notes became the foundation for the corrective actions I would begin defining on Day 5.

Day 5 → Scoring, Evidence Requests, and Deep Analysis

Day 5 was a scoring and assessor‑collaboration day: the assessors were finalizing their scoring, reviewing evidence, and aligning on the findings for each process area. Throughout the day, they requested direct evidence from engineering to support the findings they were developing and followed up on evidence they had not yet received.

While the assessors worked through their scoring sessions, I shifted from stabilization to deep analysis. I reviewed the full set of gaps I had captured across Days 1–4 and synthesized them into a clear diagnostic of the organization’s systemic issues. This analysis became the backbone of the Week 2 corrective action plan and the longer-term uplift cadence.

Missing Training, Missing Structure

Week 2: Rebuilding Engineering-Focused Alignment

As Week 2 began, another major gap surfaced: no one could locate the training materials created by the previous ASPICE lead. The organization had lost the very resources needed to understand the software engineering processes being assessed.

Drawing on my background in software engineering and quality engineering, I rebuilt the training foundation from scratch. I created new training materials for the ASPICE software engineering process areas, reviewed the existing process documents, and began closing the gaps that had accumulated over the previous six months.

In parallel, I completed my analysis of the assessment. On Tuesday, I called a meeting with the presenters to walk through my findings, incorporate the team's comments, and share the training materials I was building to close the gaps.

Because the content was grounded in real engineering practice, it resonated instantly with frontline engineering leadership. They saw that I understood their world, their constraints, and their language. That credibility translated into strong collaboration almost immediately, opening the door for honest conversations, shared ownership, and unified execution.

Moving Forward

Building the Six‑Month ASPICE 3.1 Uplift Cadence

With the assessment behind us, I established a six-month cadence focused on training, mentoring, and strengthening processes and artifacts to achieve tighter alignment with ASPICE 3.1. Instead of treating ASPICE as a one‑time event, I built a recurring rhythm of capability‑building:

  • weekly coaching sessions

  • targeted process improvements

  • evidence reviews tied to engineering milestones

  • cross‑functional alignment checkpoints

  • 24/7 availability for presentation practice as the next assessment approached, so engineers could rehearse without interruption whenever they had time

This practice rhythm helped teams internalize structure, improve clarity, and build confidence.

Results in the Next External ASPICE Compliance Assessment

Six Months Later: A 20–30% Process Score Improvement

Six months after the initial Compliance assessment, the organization underwent its next external ASPICE evaluation. The results reflected the impact of the training, mentoring, and process alignment cadence I had established. Across the software teams I coached, assessment scores improved by 20–30%, driven by clearer process understanding, stronger evidence, improved artifact completeness, and practiced, coherent presentations.

What began as a week of triage had evolved into a sustainable operating model that materially improved the organization’s maturity and readiness.