The challenge is that the IT systems that power core business functions for even a single agency are complex, supporting thousands of employee workflows and resident touchpoints. While the underlying business framework for an agency may be well established and similar across states, the human workflows are unique to the given agency or state. Because of that, even a robust software package cannot effectively meet the needs of human users right out of the box. What makes or breaks a modernization is the willingness to develop a detailed, data-driven understanding of the unique needs of the people we aim to benefit, and to let that understanding lead the process of configuring the technology so it meets both human and business needs with precision and control. That is exactly what the most successful players in the commercial space (such as Apple) have been doing for decades: letting human-centered design determine how technology is configured, not the other way around. Often, the reason digital government experiences lag behind commercial ones is not a lack of funding, but a lack of human-centered design.
To reduce risk and keep transformations on track to meet resident, employee and business needs, those in charge of government IT transformations need to address the following four areas before technical implementation:
◉ What to improve: Understand exactly what’s “broken” and what enhancements and new capabilities are needed to fix it.
◉ How to prioritize: Use data to understand what enhancements and capabilities would bring the greatest benefit.
◉ What good experience looks like: Validate that the proposed experience is right and will be adopted by users before implementing it.
◉ How to quantify the impact: Quantify, articulate and measure the expected long-term benefit of a capability to justify the investment.
Using data, user research and human-centered design to address these considerations can help you develop a clear modernization strategy that objectively drives your priorities, backlogs and roadmaps. This strategy is not based on gut feel or the anecdotal knowledge of a few individuals, but on a well-documented, expansive set of findings, business analysis and calculated projections. The approach also provides a framework that is fully transparent and traceable to the source of every finding and decision. It builds confidence that you are on the right track and enables you to articulate the benefit to your stakeholders and the community in detailed, quantifiable terms.
Case study: State of Arizona
The Arizona Department of Child Safety (AZDCS) requested help from IBM to build human-centricity into a major IT transformation to advance the way their child welfare workforce and their service provider community provide services to children and their families. The motivation behind this request was the agency’s experience with an out-of-the-box platform that struggled to get traction and adoption due to its “technology-first” approach. The community of workers and providers often avoided using it because it was not designed with human users’ needs and priorities in mind. This severely reduced the business benefit expected from the platform and perpetuated negative sentiment around the digital experience provided by the agency.
As AZDCS embarked on a 3-year transformation, Steven Hintze (AZDCS Chief Data and Product Officer) prioritized a human-centered design approach to elevate the usefulness and efficiency of their Comprehensive Child Welfare Information System (CCWIS) technology platform. The agency focused on a clear, data-supported blueprint for earning high marks from the platform's human users, leading to widespread adoption, positive sentiment and measurable advancements in how well the platform supports their child welfare workforce and their service providers' critical work functions.
To help AZDCS, IBM applied its Service Design for Government methodology, developed to solve exactly these types of challenges. Focusing on business and human outcomes first to inform technology decisions, the IBM team of UX designers and child welfare subject matter experts conducted user research with a diverse group of stakeholders and end users of the platform. The team followed a phased approach to the human-centered design assessment, which led to a data-driven roadmap of recommended technology enhancements. Each recommendation was grounded in the user research and validated to deliver a significant return on investment (ROI) against the business mission of AZDCS. This meant each enhancement recommendation was not only infused with the voice of stakeholders and end users, but also backed by a business case that AZDCS could use to measure and report on the impact to their business.
The IBM Service Design for Government methodology follows a phased approach that improves the end user experience through data-driven prioritization; the downstream effect is increased user adoption and, in turn, substantially improved business and resident outcomes.
Phase 1: Understand needs
Acknowledging that we ourselves are not the users of the agency platform is one of the most frequently overlooked realities of implementing technology in the health and human services space. This acknowledgement is critical to designing and implementing technology that meets the needs of the intended end users. For AZDCS, end user groups included internal workforce roles, contracted community service providers, foster caregivers, kinship caregivers and adoptive parents. Making assumptions about what we believe we know about these users is not enough; we must do our due diligence and conduct research. In a Service Design engagement, user research is a collaborative and rigorous process that focuses on understanding user challenges and pain points, and on gathering the data needed to measure the negative value of those challenges, so the agency can address problem areas and prioritize the roadmap. Analysis of the collected data then uncovers potential opportunities for improvement.
A pain point tracker (a repository of business, human-centered design and technology issues that inhibit users' ability to execute critical tasks) captures the themes that arise during data collection. The tracker clusters the foundational data to which value metrics are then applied. For an agency with a child welfare mission like AZDCS, value metrics span multiple dimensions: volume (which roles are impacted, and how many people?), frequency (how many occurrences?), time (how much time is lost?) and quality (how does this affect service delivery, business process and data quality?). The positive value of an enhancement can be measured by how quickly a worker can navigate to the data needed for timely, critical decision making; how easily and rapidly workers can enter required data into the system, freeing up time for higher value tasks such as face-to-face time with the children and families on their caseload; or how much sooner a family is connected with the right services, enabling them to stabilize earlier and with better long-term outcomes.
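To make the value-metric dimensions concrete, the sketch below shows one way a tracker entry could be structured and its time cost estimated. It is a minimal illustration, not the actual AZDCS tracker: the field names, the 52-working-week assumption and every number in the example are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class PainPoint:
    """One tracker entry. Field names are illustrative, not the actual AZDCS schema."""
    pain_point_id: str
    description: str
    roles_impacted: list[str]           # volume: which roles feel this pain
    people_impacted: int                # volume: how many users
    occurrences_per_user_week: float    # frequency: how often it happens
    minutes_lost_per_occurrence: float  # time: minutes lost each time
    quality_impact: int                 # quality: 1 (low) to 5 (high) effect on service and data quality

    def annual_hours_lost(self) -> float:
        """Rough yearly time cost across impacted users (assumes 52 working weeks)."""
        return (self.people_impacted
                * self.occurrences_per_user_week
                * 52
                * self.minutes_lost_per_occurrence) / 60.0

# Hypothetical example: a navigation issue that slows placement decisions.
slow_lookup = PainPoint(
    pain_point_id="PP-042",
    description="Case specialists open several screens to find a child's placement history",
    roles_impacted=["Ongoing case specialist"],
    people_impacted=800,
    occurrences_per_user_week=5,
    minutes_lost_per_occurrence=4,
    quality_impact=4,
)
print(f"~{slow_lookup.annual_hours_lost():,.0f} hours lost per year")  # ~13,867
```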
Phase 2: Map current state
From the data collected in Phase 1, current state user journey maps are synthesized into golden threads—critical interconnected workflows for the top user segments.
For an agency like AZDCS, an example of a golden thread might be the journey of an ongoing case specialist, a referral specialist and a payment specialist as each contributes, within the workflow, to requesting an out-of-home placement for a child. Along the visual representation of the journey, pain points are plotted at the steps where they occur. Each pain point is referenced back to the pain point tracker for full traceability of how it was uncovered and how its value was calculated, providing the basis for the ROI if the pain point is resolved.
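That traceability can be pictured as journey steps that hold only references back to tracker IDs, so the ROI basis is always recomputable from the source data. The sketch below is illustrative only: the thread name, IDs, loaded labor rate and fix cost are hypothetical, not AZDCS figures.

```python
# A golden-thread step keeps only references (IDs) back to the pain point tracker,
# so every plotted pain point stays traceable to its source data.
journey_step = {
    "thread": "Out-of-home placement request",
    "step": "Referral specialist matches the child to an available placement",
    "pain_point_ids": ["PP-042", "PP-057"],   # hypothetical tracker IDs
}

LOADED_HOURLY_RATE = 55.0  # assumed fully loaded labor cost, USD per hour (hypothetical)

def roi_if_resolved(annual_hours_lost: float, one_time_fix_cost: float) -> float:
    """First-year ROI basis: annual labor cost avoided relative to the cost of the fix."""
    annual_cost_avoided = annual_hours_lost * LOADED_HOURLY_RATE
    return annual_cost_avoided / one_time_fix_cost

# Using roughly the 13,900 hours/year from the tracker sketch and a hypothetical $150,000 fix:
print(f"{roi_if_resolved(13_900, 150_000):.1f}x first-year return")  # 5.1x
```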
Phase 3: Envision the future
Phase 3 shifts focus to guided co-creation with key stakeholders, sponsor users, and business and technical subject matter experts. Through a set of design thinking sessions, this group converges on a future-state system of experiences. This part of the process turns existing pain points and challenges into opportunities to design more efficient and intelligent ways of conducting key business and user functions, ways that leverage the full potential of the supporting technology. Cross-disciplinary, real-time collaboration enables rapid decision making that considers human needs, business priorities and technological realities, shaping a path forward that delivers the maximum benefit to residents in the shortest time, with a focused investment of resources.
As stated earlier, for AZDCS the goal was a platform design that leads to high user adoption, positive sentiment and measurable advancements in how well the platform supports their child welfare workforce and their service providers' critical work functions. Key opportunities revealed during the pain point analyses, along with their measurable impacts and the current state mappings, serve as the basis for the design phase of the project. The design phase also draws on ideas from design thinking workshops in which platform users and stakeholders are encouraged to think big. Subsequent activities break each big idea into smaller, more manageable technical enablers that the group can prioritize based on measurable impacts to the user and to the business.
Phase 4: Prototype & validate
In this phase of the methodology, the group builds prototypes of key user journeys using the prioritized future state ideas generated by stakeholders and end users in Phase 3. Prototypes give end users an opportunity to review the proposed experiences and provide essential feedback, which can be factored in before enhancements make their way into production. Ideas are also vetted against the available technology, the agency's platform architecture and prior investments to further determine the value of each proposed enhancement. In this process, AZDCS and the IBM team apply additional value metrics to determine where standard configuration of the platform/product is sufficient and where customization may be necessary. Prototyping prioritizes the ideas that score highest on business value and are also the most feasible to implement. As prototypes are finalized, key stakeholders and end users are brought in again to validate and refine the designs. The output of Phase 4 is an experience blueprint and product roadmap that represents the best of collaboration between IT, design and front office business, and that ensures the agency's platform/product roadmap reflects the voices of end users and targets the enhancements that will deliver the greatest value to the business in upcoming builds.
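One simple way to rank enhancement ideas by business value and feasibility, with a nudge toward standard configuration over customization, is sketched below. The scoring model, the 0.8 customization penalty and the example ideas are assumptions for illustration, not part of the IBM methodology.

```python
# Hypothetical scoring of enhancement ideas from the co-creation workshops.
# Each idea carries a business-value score and a feasibility score (1-5);
# ideas that need customization rather than standard configuration are
# penalized slightly, reflecting the configuration-versus-customization check.
ideas = [
    {"name": "Single-screen placement history", "value": 5, "feasibility": 4, "needs_customization": False},
    {"name": "Automated provider availability feed", "value": 4, "feasibility": 2, "needs_customization": True},
    {"name": "Pre-filled referral forms", "value": 3, "feasibility": 5, "needs_customization": False},
]

def priority(idea: dict) -> float:
    score = idea["value"] * idea["feasibility"]
    return score * (0.8 if idea["needs_customization"] else 1.0)

for idea in sorted(ideas, key=priority, reverse=True):
    print(f"{priority(idea):>5.1f}  {idea['name']}")
```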
Realized impact
In the absence of a human-centered, data-driven approach, many agencies find themselves modernizing with a technology platform that fails to meet the needs of their workforce and stakeholders, and that requires costly iterations of customization that still do not produce the intended value. AZDCS' strong leadership, vision and prioritization of a human-centered, value-based and modernized child welfare technology platform/product are producing favorable outcomes. Rather than modernizing the system for the sole purpose of federal compliance, AZDCS' Chief Data and Product Officer Steven Hintze is infusing the department's workforce with optimism, excitement and a sense of value in participating in the design of the platform.
“With a large enterprise system change like this, it’s important to engage all customers and stakeholders. When those stakeholders speak up, listen to them, and do something with their information. That doesn’t mean saying yes to everything, but it does mean making information accessible to them, being transparent with problems, teaming with them on key decisions, and trusting them as much as you expect them to trust the implementation team.”
Steven continues to lead the future of their CCWIS by co-creating with stakeholders and building an internal culture that feels supported by the modernized technology. He is dedicated to a product roadmap based on value to the workforce, value to the business and, ultimately, value to those who interact with the agency. Through a human-centered service design engagement, AZDCS is now equipped with a data-driven roadmap, orchestrated by value metrics, that guides their technology enhancement decisions.
Guided by an attitude of trust and transparency, AZDCS continues to put the voices of end users at the forefront. With an open-door invitation, the agency continues to listen to feedback and maintains consistent communication with the department's workforce and provider community. Rather than waiting two years to fully design something new (a pain point common to the design, development and implementation life cycle of some technology projects), department leadership places high value on the service design outputs and is purposeful about adding new, human-centered design enhancements into its quarterly maintenance and operations builds.