Verification, Validation, and Digital Twins for Software-Defined Vehicles

Why Software-Defined Vehicles Demand Stronger Verification and Validation

Software-defined vehicles (SDVs) are reshaping automotive engineering. Vehicle behavior is no longer fixed at production; it evolves through over-the-air (OTA) updates, software feature toggles, and AI-enabled decision logic. This shift dramatically raises the stakes for SDV verification. Personally, I think treating a vehicle like IT equipment may not be such a good idea.

I have a few posts scheduled on SDVs, prompted by an invitation to speak at an SDV event. I have done considerable volunteering over the course of my career: I have spoken at events for free, written magazine articles for free, guest-lectured at universities for free, and more. My family tell me that I give too much away for free. They may be right. I wanted to attend the event, but I needed the trip covered; unfortunately, that did not work out. It is one thing to undertake work that does not keep a roof over one's head; it is another to actively spend money and lose the ability to keep a roof over one's head. I have already done some work in this area and thought it was time to use it. The result? The topics I was ruminating on will end up here on my blog.

Increased software content, continuous deployment models, and AI/ML components mean failures can propagate faster, affect entire fleets, and emerge long after launch. Traditional test-to-specification strategies struggle to keep pace with this level of dynamism and uncertainty.

Regulatory expectations are also rising. Functional safety (ISO 26262) and cybersecurity (UN R155) are no longer siloed—they must be woven directly into verification and validation throughout the lifecycle. Compliance alone is insufficient; organizations must demonstrate confidence in real-world behavior.

The Limits of Testing to Specifications

Testing to specifications has value, but it also has limits, and therefore shortcomings. Engineering, product development, and manufacturing are about margins.

Specifications describe intended behavior under known assumptions. SDVs, however, operate in environments characterized by uncertainty, interaction effects, and adaptation. Passing all specified tests does not guarantee acceptable behavior when:

  • Inputs vary beyond assumed ranges

  • Software components interact in unanticipated ways

  • OTA updates alter timing, dependencies, or priorities

  • AI models encounter novel scenarios

I emphasize: confidence does not come from test counts alone. It comes from multiple testing approaches, applied across contexts, revealing different classes of defects. Over-reliance on specifications can create false confidence, especially in complex SDV ecosystems. We seldom know the product's true application and environmental exposure in the field.

This is where SDV verification and validation must expand beyond specification compliance toward evidence-based confidence.

Verification – Proving the Design Before Hardware Exists

Verification answers the question: Did we build the system right?  We cover this in our Dictionary of Testing, Verification, and Validation.

Digital Twins and Model-Based Verification

Digital twins enable engineers to explore thousands—sometimes millions—of SDV scenarios virtually. By combining system models, software logic, and environmental variables, teams can:

  • Stress timing, load, and interaction boundaries

  • Identify emergent behaviors before hardware integration

  • Evaluate safety and cybersecurity controls early

Model-based verification allows defects to surface when they are cheapest to fix and hardest to hide.
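As a minimal sketch of what stressing timing and load boundaries can look like, the Python fragment below samples a hypothetical latency model beyond its nominal input ranges and estimates how often a timing budget is violated. The model, its coefficients, and the 40 ms budget are all illustrative assumptions, not real vehicle data.

```python
import random

def brake_response_ms(load_pct: float, bus_jitter_ms: float) -> float:
    """Hypothetical twin model: brake-command latency grows with ECU
    load and bus jitter (illustrative coefficients, not real data)."""
    return 12.0 + 0.25 * load_pct + 1.8 * bus_jitter_ms

def stress_sweep(n_runs: int = 10_000, budget_ms: float = 40.0,
                 seed: int = 1) -> float:
    """Sample inputs across (and beyond) nominal ranges and return
    the fraction of runs that violate the timing budget."""
    rng = random.Random(seed)
    violations = 0
    for _ in range(n_runs):
        load = rng.uniform(0.0, 100.0)        # percent ECU load
        jitter = rng.expovariate(1.0 / 3.0)   # ms, heavy-tailed jitter
        if brake_response_ms(load, jitter) > budget_ms:
            violations += 1
    return violations / n_runs

rate = stress_sweep()
```

A real digital twin would replace the one-line model with coupled system, software, and environment models, but the exploration pattern is the same: many virtual runs, cheaply, before any hardware exists.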

Enhanced V-Model Practices for SDVs

The traditional V-model must evolve. Modern SDV programs use:

  • Continuous verification pipelines

  • Automated checks embedded in CI/CD workflows

  • ECU-level verification triggered on every software change

This continuous approach strengthens SDV verification by ensuring that design assumptions are continually challenged as the software evolves.
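One way to picture a continuous verification pipeline is as a gate of independent checks run on every software change, where any failure blocks the merge. The two checks below (timing and interface) are hypothetical placeholders for real analyses.

```python
def timing_check() -> tuple[bool, str]:
    # Hypothetical worst-case timing result for the changed ECU code.
    worst_case_ms = 8.7
    return worst_case_ms <= 10.0, f"worst-case loop time {worst_case_ms} ms"

def interface_check() -> tuple[bool, str]:
    # Hypothetical check that the built signal map matches the spec.
    expected = {"brake_cmd", "wheel_speed", "steer_angle"}
    built = {"brake_cmd", "wheel_speed", "steer_angle"}
    return built == expected, "signal map matches spec"

def run_gate(checks) -> tuple[bool, list[str]]:
    """Run every verification check on each software change.
    Any failure blocks the merge; all failures are reported."""
    failures = []
    for check in checks:
        ok, msg = check()
        if not ok:
            failures.append(msg)
    return not failures, failures

passed, failures = run_gate([timing_check, interface_check])
```

Running all checks (rather than failing fast) matters here: a gate that reports every violated design assumption at once challenges more assumptions per change.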

Validation – Proving Behavior in the Real and “Near-Real” World

Validation answers the harder question: Did we build the right system?

DiL, HIL, and OTA Shadow Modes

Driver-in-the-loop (DiL) and hardware-in-the-loop (HIL) testing expose system behavior under realistic timing, human interaction, and fault conditions. OTA shadow modes allow new software to run silently alongside production code, revealing issues without customer impact.

AI-driven scenario generation further expands coverage by creating edge cases humans are unlikely to imagine.
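A shadow mode reduces to a simple pattern: both builds see the same live inputs, only the production output is actuated, and divergences are logged for later analysis. The sketch below assumes scalar controller outputs and a relative tolerance; both controllers are invented for illustration.

```python
def shadow_compare(prod_fn, shadow_fn, inputs, tol=0.05):
    """Run candidate software in shadow: it sees the same live inputs,
    but only the production output drives the vehicle. Shadow outputs
    are compared against production and divergences are recorded."""
    divergences = []
    for x in inputs:
        prod_out = prod_fn(x)      # actuated
        shadow_out = shadow_fn(x)  # observed only, never actuated
        if abs(shadow_out - prod_out) > tol * max(abs(prod_out), 1e-9):
            divergences.append((x, prod_out, shadow_out))
    return divergences

# Illustrative controllers: the candidate misbehaves above 90 km/h.
prod = lambda speed: 0.30 * speed
candidate = lambda speed: 0.30 * speed + (5.0 if speed > 90 else 0.0)
issues = shadow_compare(prod, candidate, inputs=range(0, 121, 10))
```

The payoff is that the high-speed defect surfaces from real fleet inputs without any customer-visible effect, since the candidate's output is never actuated.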

Bridging the Model–Reality Gap

No model perfectly reflects reality. Bridging this gap requires:

  • Designed experiments to isolate causal factors

  • Continuous validation pipelines

  • Fleet data feedback loops to refine assumptions

  • Continuous calibration of digital twins

Validation is not a phase—it is an ongoing learning system.
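Continuous calibration of a digital twin can be sketched, under strong simplifying assumptions, as an exponentially weighted bias correction driven by fleet residuals. Real programs would use richer techniques (e.g., Bayesian calibration), but the feedback-loop shape is the same; the twin model and observations below are invented.

```python
def calibrate_bias(twin_predict, fleet_obs, alpha=0.2):
    """Nudge a bias term so twin predictions track fleet measurements.
    Each residual (measured minus corrected prediction) pulls the bias
    a fraction alpha toward agreement with the field."""
    bias = 0.0
    for x, measured in fleet_obs:
        residual = measured - (twin_predict(x) + bias)
        bias += alpha * residual
    return bias

# Illustrative: the twin under-predicts by a constant 1.0 in the field.
twin = lambda x: 2.0 * x
obs = [(x, 2.0 * x + 1.0) for x in range(30)]
bias = calibrate_bias(twin, obs)
```

After thirty fleet observations the learned bias is close to the true model-reality gap of 1.0, which is the point: the loop keeps the twin's assumptions honest as field data accumulates.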

Variation and Complexity in SDV Systems

Variation is unavoidable. Complexity is cumulative.

SDVs face variation across hardware revisions, software versions, regional regulations, driver behavior, and environmental conditions. Complexity emerges from interactions, not components.

Testing strategies must therefore address combinatorial explosion, not just individual requirements. Digital twins, probabilistic testing, and field feedback are essential tools for managing this complexity and sustaining confidence in SDV validation over time.
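One concrete tool for taming combinatorial explosion is all-pairs (pairwise) test selection: cover every pair of values across any two parameters with far fewer cases than the full cross-product. The greedy sketch below is only practical for small parameter spaces, and the hardware revisions, regions, and modes are made-up examples.

```python
from itertools import combinations, product

def pairwise_suite(params: dict) -> list:
    """Greedy all-pairs selection: repeatedly pick the full-product case
    that covers the most still-uncovered value pairs."""
    names = list(params)
    uncovered = set()
    for (i, a), (j, b) in combinations(enumerate(names), 2):
        for va, vb in product(params[a], params[b]):
            uncovered.add((i, va, j, vb))
    suite = []
    while uncovered:
        best, best_gain = None, -1
        for case in product(*params.values()):
            gain = sum(1 for (i, va, j, vb) in uncovered
                       if case[i] == va and case[j] == vb)
            if gain > best_gain:
                best, best_gain = case, gain
        suite.append(best)
        uncovered -= {(i, va, j, vb) for (i, va, j, vb) in uncovered
                      if best[i] == va and best[j] == vb}
    return suite

suite = pairwise_suite({"hw_rev": ["A", "B"],
                        "region": ["EU", "NA", "CN"],
                        "mode": ["eco", "sport"]})
```

For this small example the full cross-product is 12 cases, while the pairwise suite covers every two-way interaction in roughly half as many; the ratio improves dramatically as parameters multiply, which is exactly where SDV variation lives.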

Practical Implementation Using Quigley’s Five Testing Approaches

I have written about this many times: five complementary testing approaches, each revealing different risks.

Mapping the Five Approaches to SDV Programs

  1. Lab Testing – Early exploration of concepts, algorithms, and assumptions

  2. Bench Testing – Subsystem behavior under controlled but realistic conditions

  3. SIL/HIL Testing – Integrated software and hardware interaction validation

  4. Vehicle Testing – Full-system behavior in representative environments

  5. Field Data Analysis – Real-world evidence to uncover unknown-unknowns

No single approach is sufficient. Together, they create layered confidence that no specification-based strategy can achieve on its own.

Metrics That Support Confident Release Decisions

Effective SDV verification relies on meaningful metrics, including:

  • Coverage across scenarios, not just requirements

  • Defect discovery profiles over time

  • Confidence measures tied to operational risk

These metrics support informed release decisions—balancing speed, safety, and reliability in a continuously evolving product.
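Two of these metrics can be made concrete very simply. The sketch below computes scenario coverage against a catalog and a crude "defect discovery is slowing" signal from per-period defect counts; the scenario names and counts are invented for illustration.

```python
def scenario_coverage(executed, catalog) -> float:
    """Fraction of the scenario catalog exercised at least once,
    as distinct from requirements-based coverage."""
    return len(set(executed) & set(catalog)) / len(catalog)

def discovery_slowing(defects_per_period) -> bool:
    """Crude release signal: defect discovery in the recent half of
    the record is lower than in the earlier half."""
    half = len(defects_per_period) // 2
    return sum(defects_per_period[half:]) < sum(defects_per_period[:half])

cov = scenario_coverage({"rain", "night", "tunnel"},
                        {"rain", "night", "tunnel", "fog", "ice", "glare"})
slowing = discovery_slowing([9, 7, 5, 3, 2, 1])
```

Neither number decides a release on its own; a real decision would weight both against operational risk, which these toy measures deliberately omit.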

Conclusion

Verification, validation, and digital twins are no longer optional for software-defined vehicles. They are foundational capabilities for managing uncertainty, variation, and complexity at scale.

By moving beyond testing to specifications and embracing a continuous and multi-approach testing philosophy, organizations can build real confidence—not just compliance—into their SDV programs.

For more information, contact us:

The Value Transformation LLC store.

Follow us on social media at:

Amazon Author Central https://www.amazon.com/-/e/B002A56N5E

Follow us on LinkedIn: https://www.linkedin.com/in/jonmquigley/

https://www.linkedin.com/company/value-transformation-llc

Follow us on Google Scholar: https://scholar.google.com/citations?user=dAApL1kAAAAJ 

Post by Jon Quigley