Although its manufacturing origins date back to 1950, the term Value Stream Management (VSM) is today mainly associated with digital product delivery. It comprises a set of practices that improve how teams deliver high-quality customer experiences through digital products and services. Forrester suggests that “VSM has the potential to completely transform the process of funding, building, managing, and maintaining software at scale.” In today's world, it's fair to say that all companies are technology companies, which makes VSM more relevant and critical than ever before.
Contemporary digital product delivery is a complex system with many intertwined and dynamic elements: customers, delivery teams (sometimes including vendors), technical architecture, requirements, code, tests, delivery pipelines, technical environments, and more. All of these components interact, making it hard to predict the behavior of the system as a whole.
VSM builds a foundation for understanding the digital product delivery flow, providing transparency and measurement, unveiling the current state and trends, and aligning teams on potential issues. However, the adoption of VSM doesn’t correlate with the substantial performance improvements one might expect, suggesting that more is needed to make a performance impact on complex systems.
Let’s explore how VSM might evolve to cope with this complexity and dynamism and deliver a substantial performance impact on value streams.
At Flowtopia 2023, Dean Caron, who leads the Engineering Excellence Center at Unum, detailed their journey in implementing VSM. This initiative arose from a need to demystify the often obscure and intangible aspects of the Software Development Life Cycle (SDLC), thereby enabling more informed decision-making about potential areas for improvement. By evaluating the practices of engineering teams, Dean's group identified key qualitative indicators pointing to gaps that could enhance flow performance. They also gathered flow and DORA metrics, which served as lagging indicators, shedding light on the current state of flow performance.
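To make the idea of lagging indicators concrete, here is a minimal sketch of how DORA-style metrics might be computed from delivery data. The deployment records, field names, and numbers are hypothetical and are not Unum's actual tooling; the sketch simply derives lead time for changes, deployment frequency, and change failure rate from commit and deployment timestamps.

```python
from datetime import datetime
from statistics import median

# Hypothetical deployment records: commit timestamp, deploy timestamp, and
# whether the deployment caused a failure in production.
deployments = [
    {"committed": datetime(2024, 3, 1, 9, 0), "deployed": datetime(2024, 3, 2, 15, 0), "failed": False},
    {"committed": datetime(2024, 3, 3, 11, 0), "deployed": datetime(2024, 3, 7, 10, 0), "failed": True},
    {"committed": datetime(2024, 3, 8, 14, 0), "deployed": datetime(2024, 3, 9, 9, 0), "failed": False},
]

# Lead time for changes: elapsed time from commit to production, per deployment.
lead_times_hours = [
    (d["deployed"] - d["committed"]).total_seconds() / 3600 for d in deployments
]

# Deployment frequency: deployments per week over the observed window.
window_days = (deployments[-1]["deployed"] - deployments[0]["deployed"]).days or 1
deploys_per_week = len(deployments) / window_days * 7

# Change failure rate: share of deployments that caused a production failure.
change_failure_rate = sum(d["failed"] for d in deployments) / len(deployments)

print(f"Median lead time for changes: {median(lead_times_hours):.1f} h")
print(f"Deployment frequency: {deploys_per_week:.1f} per week")
print(f"Change failure rate: {change_failure_rate:.0%}")
```

Metrics like these describe the current state of flow performance after the fact; as the next section notes, they do not by themselves tell you where to intervene.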
Given the SDLC's inherent complexity, a critical question remains unanswered: how can one effectively enhance the flow? Determining which gaps and initiatives will substantially impact a specific value stream's performance is challenging. While the combination of qualitative and quantitative metrics offers some guidance, it falls short of clearly assessing the potential impact of proposed changes.
Dean envisions an expansion of VSM that includes engineering and product management. This expansion aims to shift the focus from mere transparency to actionable insights, providing a foundation for deciding which specific changes should be implemented for optimal impact on the system as a whole, while avoiding local optimizations that don't improve system performance. VSM also needs to address bottlenecks that cut across teams and functions; these are often invisible, and the impact of tackling them is hard to estimate.
Phil Clark of Parchment (recently acquired by Instructure), another Flowtopia contributor, emphasized the significance of effectively conveying to business stakeholders the necessity of transitioning from individual productivity metrics to a focus on value stream flow performance. He argued that increasing individual efficiency is futile if it doesn't positively impact the overall value stream performance. Instead, the emphasis should be on investing in technical capabilities that address bottlenecks and enhance overall flow. To successfully communicate this mindset change, one needs a thorough understanding of the value stream flow's causal model, the ability to identify and analyze bottlenecks, and strategies for addressing them, since they might reside outside the team's boundaries. Additionally, it's essential to reinforce a learning and experimentation mindset to explore the various decisions that lead toward business objectives. These decisions, and the ROI from implementing them, will differ depending on which value stream performance metric you are optimizing for, such as speed or throughput.
In its work with clients, NTT DATA has been experimenting with change and transformation management in smaller time slices (one to three months each). Instead of asking organizations to spend large sums of money immediately to sustain a transformation, the team asks for time slices of funding, alignment on goals, and executive participation to enable more successful transformation outcomes. This allows the consulting team to get hyper-focused with their client on goals that can be achieved in the smaller time slices, demonstrating faster time to value.
“This faster time to value leads to a focus on foundational outcomes like transparency that ultimately help organizations achieve their own speed to value. It’s essentially teaching our client,” says Logan Daigle, Director of Business Transformation at NTT DATA.
“When we work with our clients, one of the first things we need to do is to understand the outcome and the opportunity in front of us with our clients. Many clients believe VSM is ‘Value Stream Mapping,’ which we all know is an important practice but just a piece of the pie. With most clients, we start with using the VSM Consortium’s Value Stream Management Implementation Roadmap and helping them through the different steps. If you’re looking at where mapping sits in the roadmap, it’s right in the middle of activities.
I like looking at engagements in three parts:
1) Let’s Vision and Identify
2) Let’s Organize and Map
3) Then, let's Connect, Measure, and Inspect what we are doing, and then Adapt
The North Star is ensuring continuous improvement and adapting to the learnings of Value Stream Management as you move toward success and your vision,” says Keith Buehlman, Senior Director of Business Agility Enablement, NTT DATA Services.
While VSM offers insights into flow performance metrics, aligns team perspectives on challenges, and informs transformation initiatives, it has limited ability to evaluate the impact of changes on flow performance. Due to local optimizations rather than end-to-end value stream evaluations, we often see quick wins that fail to deliver desired outcomes.
Consider an instance where a team prioritizes automating the build process while relying on manual testing. Without addressing the lengthy lead times inherent in the manual testing phase, enhancements made in other segments of the value stream are unlikely to yield a comprehensive system-wide impact. This scenario illustrates the limited effectiveness of isolated improvements when other critical phases remain inefficient.
Imagine a scenario where a team significantly improves the efficiency of their code deployment processes. However, the overall value stream remains inefficient if the initial requirements-gathering phase is still time-consuming and prone to errors. The bottleneck has merely shifted from deployment to the initial stages of product development.
Although these examples seem intuitive, the underlying system behavior is non-linear. For instance, investing in test automation shortens the feedback loop from the testing phase, which can yield unexpectedly high returns: engineers build less new code on top of a defect before it is caught, so less time is spent fixing it, a compounded impact.
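A back-of-the-envelope model makes the bottleneck argument concrete. The sketch below uses purely hypothetical stage capacities to show that in a serial flow, end-to-end throughput is capped by the slowest stage: doubling build capacity changes nothing, while improving the testing bottleneck raises throughput and shifts the constraint to the next-slowest stage.

```python
# A minimal sketch of why local optimization fails: in a serial value stream,
# throughput is capped by the slowest stage, so speeding up any other stage
# changes nothing end to end. All capacities are hypothetical (items per week).

def throughput(stages: dict[str, float]) -> float:
    """End-to-end throughput of a serial flow is limited by its bottleneck."""
    return min(stages.values())

baseline = {"requirements": 12, "build": 8, "manual testing": 3, "deploy": 10}

# Local optimization: automate the build so it can handle twice as many items.
faster_build = {**baseline, "build": 16}

# System-level improvement: invest in test automation at the bottleneck.
automated_tests = {**baseline, "manual testing": 9}

print(throughput(baseline))         # 3 -> limited by manual testing
print(throughput(faster_build))     # 3 -> no end-to-end gain
print(throughput(automated_tests))  # 8 -> bottleneck shifts to build
```

Real value streams are not strictly serial and have feedback loops, which is exactly why their behavior becomes non-linear, but even this toy model shows why isolated improvements so often disappoint.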
To effectively manage these complexities, VSM methods need to evolve, distilling these flow systems' intricacies, causal relationships, and dynamics. This evolution would enable organizations to proactively manage value delivery by anticipating issues, validating change options before implementation, and identifying the most effective solutions for enhancing the value stream as a comprehensive system.
Organizations and leaders inside product teams need to adopt a systems-thinking approach to connect the dots end to end, reminiscent of Eli Goldratt's insights in The Goal, which teaches us to see this narrative as a series of interconnected processes and relationships rather than isolated events or functions. It's about understanding the holistic picture: the complete story of how a client or customer creates, maximizes, and eventually consumes value.
Being part of a value stream or network means you're part of the story of value creation. Whether you develop cloud solutions or integrate complex systems, you contribute to a symphony of service. The challenge, as Goldratt illustrates, is optimizing these contributions to enhance the overall narrative of value, reduce bottlenecks, and streamline interactions.
The roles of business and technology leaders are changing across all businesses, and product managers and other team members working within a product or portfolio of products are evolving to encompass a broad view of these value networks. They must now look outward to comprehend market trends and customer feedback while simultaneously steering internal development teams. The goal is to ensure product evolution aligns with and anticipates customer needs and preferences.
As we revisit the agile principles that once revolutionized software development, we are reminded of their core intent—delivering value through meaningful products and services. Unfortunately, some agile practices have become checkboxes rather than paths to genuine value. It's time to reforge these principles in the context of today's value networks, focusing on meaningful and purposeful service delivery.
Emerging technologies and methodologies will transform VSM's capability to understand and optimize value streams. Initially, Value Stream Mapping sets the stage by providing a basic, static overview of a dynamic system, with metrics collection that is manual and prone to bias. With process digitalization, we can instead collect and visualize these metrics directly from the data layer, giving stakeholders an immediate overview of performance indicators and symptoms of inefficiency.
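As an illustration of what "metrics from the data layer" might look like, the sketch below computes flow time and flow efficiency from a work item's state-transition events, as they might be pulled from a ticketing system's event stream. The events, state names, and numbers are hypothetical.

```python
from datetime import datetime

# Hypothetical work-item state transitions pulled from the delivery toolchain
# (e.g. a ticketing system's event stream), rather than a manually drawn map.
events = [
    ("ITEM-1", "in_progress", datetime(2024, 4, 1, 9, 0)),
    ("ITEM-1", "waiting",     datetime(2024, 4, 2, 9, 0)),
    ("ITEM-1", "in_progress", datetime(2024, 4, 4, 9, 0)),
    ("ITEM-1", "done",        datetime(2024, 4, 5, 9, 0)),
]

active_time = 0.0
total_time = 0.0
for (_, state, start), (_, _, end) in zip(events, events[1:]):
    hours = (end - start).total_seconds() / 3600
    total_time += hours
    if state == "in_progress":   # time the item was actively worked on
        active_time += hours

flow_time_days = total_time / 24
flow_efficiency = active_time / total_time   # active time vs. total elapsed time

print(f"Flow time: {flow_time_days:.1f} days, flow efficiency: {flow_efficiency:.0%}")
```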
The introduction of Process and Task Mining marks a significant advancement. These technologies delve into the operational details, extracting data from existing systems to reveal the actual structure and performance of processes and tasks. They reveal inefficiencies, deviations, and bottlenecks, providing a more profound and accurate understanding of the current state than traditional process maps and dashboards. However, they overlook some concepts essential for proper reasoning, such as capacity and agent behavior.
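The core idea behind process mining can be sketched in a few lines: from an event log of cases and activities, it reconstructs how work actually flows, including rework loops that never appear on an idealized process map. The event log below is hypothetical, and real tools (and real logs with timestamps and resources) go much further, but even the directly-follows counts already expose deviations such as retried builds and repeated testing.

```python
from collections import Counter

# Hypothetical event log extracted from delivery tools: (case id, activity sequence).
# Process mining reconstructs the actual process structure from such logs.
event_log = [
    ("PR-101", ["commit", "build", "manual test", "deploy"]),
    ("PR-102", ["commit", "build", "build", "manual test", "deploy"]),        # build retried
    ("PR-103", ["commit", "build", "manual test", "manual test", "deploy"]),  # testing repeated
]

# Directly-follows graph: how often activity A is immediately followed by B.
dfg = Counter()
for _, activities in event_log:
    for a, b in zip(activities, activities[1:]):
        dfg[(a, b)] += 1

for (a, b), count in sorted(dfg.items(), key=lambda kv: -kv[1]):
    print(f"{a} -> {b}: {count}")
```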
Although process simulation has been around for a long time, it’s gaining traction now, providing a more dynamic understanding of the system. Simulation models can predict the impacts of potential changes, aiding in visualizing cause-and-effect relationships. However, they don't offer real-time analysis.
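For instance, a simple Monte Carlo simulation over hypothetical stage durations can estimate how a proposed change would shift end-to-end lead time before anyone commits to it. The stage names, exponential-duration assumption, and numbers below are illustrative only; real simulation models would also capture queues, capacity, and dependencies.

```python
import random

random.seed(7)

# Hypothetical mean stage durations in hours. Each run samples a duration per
# stage from an exponential distribution and sums them into an end-to-end lead time.
def simulate_lead_time(stage_means: dict[str, float], runs: int = 10_000) -> float:
    totals = [
        sum(random.expovariate(1 / mean) for mean in stage_means.values())
        for _ in range(runs)
    ]
    return sum(totals) / runs

current = {"requirements": 40, "build": 6, "manual testing": 72, "deploy": 4}
proposed = {**current, "manual testing": 24}   # candidate change: invest in test automation

print(f"Current mean lead time:  {simulate_lead_time(current):.0f} h")
print(f"Proposed mean lead time: {simulate_lead_time(proposed):.0f} h")
```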
Leaping forward, Digital Twin technology provides a real-time virtual model of the value stream - a simulation model linked to the data layer. This interactive model enables real-time scenario exploration, enhancing decision-making capabilities. However, its complexity can occasionally slow down decision processes in more intricate systems.
This is why Generative AI and Machine Learning technologies will transform the simple digital twin into the Intelligent Digital Twin. This integration enables real-time processing and analysis of complex scenarios, offering insights and optimization strategies faster than a human could work through the same puzzle. This technology marks a significant step in streamlining decision-making, even in highly complex environments.
Together, these technologies represent a substantial evolution in managing the complexities of Value Stream Management. They provide powerful tools for organizations to optimize their value streams effectively and responsively, addressing the challenges of today's dynamic digital delivery landscape. Crucially, integrating these advanced methodologies marks a paradigm shift from reactive to proactive VSM. This proactive approach allows organizations to anticipate challenges, adapt to changes swiftly, and continually improve processes, ensuring that value streams are efficiently managed and strategically evolved in real time.