Saturday, 30 September 2023

Real-time transaction data analysis with IBM Event Automation


As the pace and volume of digital business continue to increase, organizations face mounting pressure to accelerate the speed at which they do business. The ability to respond quickly to shifting customer and market dynamics has become key to competing in today’s growing digital economy.

In a survey run by IDC, a leading provider of global IT research and advice, 43% of technology leaders indicated that they were “planning to deliver innovative digital products and services at a faster pace.”

Real-time intelligence helps improve responsiveness

To accelerate the speed of business, organizations should acquire the ability to access, interpret and act on real-time information about unique situations arising across the entire organization.

In the same survey run by IDC, 36% of IT leaders rated “using technology to achieve real-time decision-making” as critical to business success.

The need to improve real-time decision-making is contributing to a growing demand for event-driven solutions and their ability to help businesses achieve continuous intelligence and situation awareness.

An event-driven architecture focuses on the publication, capture, processing and storage of events. When an application or service performs an action or undergoes a change, it publishes an event—a real-time record of that action or change. Those events can then be consumed and processed to help inform smarter decisions and trigger automations.

Putting business events to work

However, becoming an event-driven business involves more than just collecting events from applications and services. Streams of events need to be processed—by applying operations like filtering, aggregating, transforming or joining with other streams of events—to define a time-sensitive scenario that the business needs to detect.

To show this with an example, consider one stream of business events sourced from an order management system. This stream provides a persistent, real-time log of all orders received. Though this stream of events can be useful on its own, allowing us to detect any time a customer transacts and places an order, we can unlock more value by combining this with other streams of events to define a critical business scenario.

Now, consider processing this stream of events to filter it to new orders that have a value greater than $100 and are placed within 24 hours of a new customer account creation. Beyond just the notification of a transaction, we are now detecting a key scenario that warrants action from the business—namely, an opportunity to foster new customer growth and loyalty. This was just one example, and business events can be collected from a wide variety of systems, applications, devices and services across an organization. But no matter the source, getting the most value out of events requires that we process and correlate streams of events together to define a business scenario.
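
To make the scenario concrete, here is a minimal Python sketch of the same logic: it joins a stream of order events with account-creation events and flags new-customer orders over $100 placed within 24 hours. The field names and sample events are assumptions for illustration only; in IBM Event Automation the equivalent scenario is defined visually, without writing code.

    from datetime import datetime, timedelta

    # Illustrative, made-up event streams; field names are assumptions.
    account_created = [
        {"customer_id": "C001", "created_at": datetime(2023, 9, 29, 10, 0)},
        {"customer_id": "C002", "created_at": datetime(2023, 9, 20, 9, 30)},
    ]
    orders = [
        {"order_id": "O100", "customer_id": "C001", "value": 150.0,
         "placed_at": datetime(2023, 9, 29, 18, 45)},
        {"order_id": "O101", "customer_id": "C002", "value": 220.0,
         "placed_at": datetime(2023, 9, 29, 19, 10)},
    ]

    def new_customer_high_value_orders(orders, accounts, min_value=100.0,
                                       window=timedelta(hours=24)):
        """Join orders with account creations; keep orders over min_value
        placed within the window after the account was created."""
        created_at = {a["customer_id"]: a["created_at"] for a in accounts}
        for order in orders:
            created = created_at.get(order["customer_id"])
            if created is None:
                continue
            if order["value"] > min_value and timedelta(0) <= order["placed_at"] - created <= window:
                yield order

    for event in new_customer_high_value_orders(orders, account_created):
        print("New-customer opportunity:", event["order_id"])   # only O100 qualifies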

Get started with IBM Event Automation

By putting events to work, companies can help teams proactively respond to key customer opportunities, issues or potential threats as they arise. They can also help quickly adjust to shifting market dynamics and external factors affecting business operations, like changes in levels of demand, costs and more.

And these benefits shouldn’t be available only to highly technical teams. By lowering the skills barrier to processing events, line-of-business teams can be empowered to define and detect time-sensitive situations. This is especially important as companies continue to grapple with the impact of the Great Resignation.

In a survey run by IDC, 45% of IT leaders reported “a general shortage of people with the right skills” as “the main reason vacancies are hard to fill” for their real-time use cases.

IBM Event Automation’s event processing capability is designed to be intuitive and accessible to users across the business. With it, both business and IT teams can start working with events and defining business scenarios without writing code or being an expert in SQL.

Source: ibm.com

Thursday, 28 September 2023

Discovery to delivery: Transform the shopper’s journey


The relevance of a promise


The fundamental principle of commerce is the concept of a “promise.” A promise of accuracy in product detail, product capabilities, quality, price and delivery.

The promise of accuracy in inventory—or, more importantly, in the availability of inventory—is very important when there are competing demands for the same unit of inventory and unpredictable disruptions to sources of supply. It becomes very challenging to uphold the promise when so much is changing in the processes that trigger demand and supply.

The challenges of keeping a promise


The challenge is particularly acute in highly competitive areas like omnichannel retail or even B2B domains—wholesale distribution and dealer-based after-market parts. This is a space where IBM Sterling Order Management and IBM Sterling Intelligent Promising have been leaders since the early days of eCommerce.

The secret to success is making sure that changes to supply and demand are tracked and processed at ultra-scale in real time, and that an accurate Available-to-Promise (ATP) quantity is presented to the channels of promise. It is not enough to just present a quantity; demanding shoppers also want to see an expected date of delivery—not an estimated date of when a product will be shipped, but a precise date of delivery at their doorstep or a confirmed date for in-store pickup. This means keeping track of stock availability, capacity availability and, possibly, the availability of specific delivery slots for pickup. In the wider context of flash sales, in-store campaigns, marketplaces and omnichannel connected commerce initiatives, staying on top of this dynamic environment can be very challenging for companies.
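
As a rough illustration of that promising logic, the sketch below computes a simplified Available-to-Promise quantity and an estimated doorstep delivery date from on-hand stock, scheduled receipts and existing commitments. The data model and lead times are invented assumptions; IBM Sterling Intelligent Promising performs this at far greater scale, in real time and across many more constraints.

    from datetime import date, timedelta

    def available_to_promise(on_hand, scheduled_receipts, open_commitments):
        """Simplified ATP: stock on hand plus confirmed inbound supply,
        minus quantities already promised to other orders."""
        return on_hand + sum(scheduled_receipts) - sum(open_commitments)

    def promised_delivery_date(order_qty, atp, processing_days=1, transit_days=2,
                               restock_days=5):
        """Estimate a doorstep delivery date rather than just a ship date."""
        lead = processing_days + transit_days
        if order_qty > atp:          # not enough promisable stock: wait for replenishment
            lead += restock_days
        return date.today() + timedelta(days=lead)

    atp = available_to_promise(on_hand=40, scheduled_receipts=[25], open_commitments=[30, 10])
    print("ATP:", atp)                                            # 25 units still promisable
    print("Deliver by:", promised_delivery_date(order_qty=5, atp=atp))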

Welcome to a new era of intelligent promising: IBM’s Sterling Intelligent Promising approach


IBM Sterling Intelligent Promising addresses this challenge by providing powerful promising and inventory services that deliver accurate, real-time information to power an enhanced shopper experience. Implementation and deployment of the solution are straightforward and can quickly help companies simplify this complexity.

IBM Sterling Intelligent Promising delivers what next-generation shoppers expect from modern commerce experiences:

  • Certainty of product availability, delivery estimates, actual delivery and promptness.
  • Choice of how to order, where and when they can have it delivered or picked up, and their post-order service options.
  • Transparency of the entire customer order journey, from order tracking to returns.

Key features and benefits


  • Strengthen shopper loyalty: Confidently deliver choice and transparency to every shopper throughout the buying journey.
  • Manage accurate promises: Increase digital conversions by presenting precise promise dates on the product list page, product details page and during checkout.
  • Maximize inventory productivity: Reduce order cancellations and optimize inventory by dynamically providing shoppers with enterprise-wide inventory views.
  • Take advantage of fulfillment optimization: Balance predefined business rules with real cost drivers to make the best fulfillment decisions for business outcomes, optimizing across thousands of permutations in milliseconds.
  • Increase omnichannel profitability: Drive higher conversion and in-store sales with accurate promise dates, choice in delivery and pickup, and upsell of related products.

Try IBM’s Sterling Intelligent Promising solution today


Commerce, inventory and fulfillment managers can envision how to improve digital and in-store conversion, omnichannel revenue and profitability via contextual, transparent and accurate delivery estimates at every stage of the order journey. To help them experience this journey ahead of the actual purchase of the service, IBM has made available a “try before you buy” trial experience for IBM Sterling Intelligent Promising.

Users and architects can sign up for a 30-day trial experience that features the powerful capabilities of the promising and inventory services and the pre-configured data that powers the shopper experience. The trial will also demonstrate the architectural elegance and automation provided by the underlying Kubernetes container-based technology. This Kubernetes Operator pattern design brings simplicity and scalability to the management of interconnected ecosystem solutions.

The walkthrough of our solution will demonstrate the trust and confidence your customers can place in your word. Learn how you can make accurate promises and how you can keep those promises—profitably.

Source: ibm.com

Tuesday, 26 September 2023

Generative AI as a catalyst for change in the telecommunications industry


Generative artificial intelligence (AI) burst into the mainstream in 2023, lighting a fire under businesses to integrate enterprise-grade versions into their processes. By 2024, 60% of C-suite executives are planning to pilot or operate generative AI in some way, indicating that generative AI’s public-facing platforms have awakened the world to its groundbreaking capabilities.

For Communications Service Providers (CSPs) and Network Equipment Providers (NEPs) in particular, generative AI holds tremendous potential to help improve all manner of operations and customer engagement. Specifically, generative AI could transform customer care, IT and network optimization, and digital labor—all areas in which automation can notably help increase agility and efficiency. CSPs and NEPs typically run huge support centers, and IBM has the potential to help transform workflows across all ecosystem players. Here are some ways AI can contribute to transformation in the telco ecosystem:

Customer Lifecycle Management and service innovation


The job of managing customer relationships is traditionally a reactive one: fielding calls, responding to emails and working out solutions. Infusing generative AI into these interactions helps support the shift to more proactive care that has the potential to improve customer satisfaction and unlock new revenue streams. Removing routine Q&A so that customer care agents can focus on complex cases is a prime opportunity to improve both Net Promoter Score and employee satisfaction.

Chatbots have been around for some time, but they can often create frustrating experiences for customers. Generative AI can go beyond basic Q&A: it can also be trained to identify negative sentiment and triage tickets to the right agent, reducing further escalation and enabling agents to respond quickly and appropriately. Chatbot technology can also be applied to phone interactions, driving additional refinement to the customer care process.

AI can also help drive automated outreach that anticipates customers’ needs and issues, along with personalized marketing that can boost sales and optimize the customer experience. For example, AI can look at a range of inputs, such as current usage and tariff plans, the lifecycle of device ownership and past service experience, to build offers that encourage customers to upgrade, buy more or stay with the service. This has the potential to help reduce churn, improve revenue per user and lower the cost of subscriber acquisition.
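
As a toy illustration of that idea, the sketch below scores a subscriber on a few invented inputs (data usage, device age, recent support tickets) and picks a proactive offer. The thresholds and offer names are assumptions for the example; a production system would use models trained on real CSP data.

    def recommend_offer(subscriber):
        """Pick a proactive retention or upsell offer using simple, illustrative rules."""
        usage_ratio = subscriber["monthly_gb_used"] / subscriber["plan_gb"]

        if usage_ratio > 0.9:
            return "Upgrade to a higher-data tariff at a loyalty discount"
        if subscriber["device_age_months"] >= 24:
            return "Trade-in credit toward a new device with contract renewal"
        if subscriber["recent_support_tickets"] >= 2:
            return "One-month service credit plus priority support"
        return "No proactive offer this cycle"

    print(recommend_offer({"monthly_gb_used": 48, "plan_gb": 50,
                           "device_age_months": 10, "recent_support_tickets": 0}))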

Network optimization


AI can help to improve the performance, efficiency and reliability of telecommunications networks, which is essential to satisfy ever-increasing demands of different customer segments. Through live data analysis and predictive forecasting, AI tools can help employees working in network operations centers and network engineers to mitigate congestion and downtime. As 5G networks continue to expand, the need for intelligent load balancing and traffic shaping will likely grow.

AI-enhanced network optimization could benefit CSPs in a multitude of ways: not only can it add to a company’s competitive advantage by enhancing service for customers, but it can also help manage operating costs by addressing the strain on resources and helping CSPs and NEPs alike to avoid over- or under-provisioning resources.

CSPs can take advantage of watsonx.ai to train, validate, tune and deploy AI and machine learning capabilities to help optimize network performance. Watsonx’s open-source frameworks, SDKs and API libraries are designed to make it easier to implement AI into the existing software platforms that telcos already use to oversee their networks.
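
To give a sense of the forecasting involved, this generic Python sketch fits a simple trend line to recent cell-site utilization and flags sites projected to breach a congestion threshold. The data and the 90% threshold are made up, and this is not a watsonx.ai workflow; in practice, CSPs would train and deploy far richer models.

    def linear_forecast(history, steps_ahead=4):
        """Least-squares trend through recent utilization samples, projected forward."""
        n = len(history)
        x_mean = (n - 1) / 2
        y_mean = sum(history) / n
        slope = sum((x - x_mean) * (y - y_mean) for x, y in enumerate(history)) / \
                sum((x - x_mean) ** 2 for x in range(n))
        intercept = y_mean - slope * x_mean
        return intercept + slope * (n - 1 + steps_ahead)

    # Hourly utilization (%) per cell site over the last 8 hours (illustrative data).
    sites = {"cell-042": [55, 58, 61, 65, 70, 74, 79, 83],
             "cell-117": [40, 41, 39, 42, 40, 43, 41, 42]}

    for site, history in sites.items():
        projected = linear_forecast(history)
        if projected > 90:
            print(f"{site}: projected {projected:.0f}% utilization, rebalance traffic")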

Digitalizing operations with AI talent


One of AI’s chief benefits is its power as a productivity tool to automate mundane and time-consuming tasks, freeing up employees to focus on higher-order activities and work. Many of today’s employees rely on a staggering number of manual processes or fragmented tools in their day-to-day jobs, with constant screen switching. A good example is IBM Watson Orchestrate, which uses robotic process automation to streamline workflows and connects to apps to help employees tackle a variety of tasks more easily.

The path to implementation


Before embarking on implementing AI enhancements, it’s crucial that CSPs and NEPs take care to develop organizational strategies to make these powerful tools most effective.

AI relies on data, but many organizations still operate various siloed repositories. CSPs and NEPs should define and establish a hybrid information architecture that facilitates the easy flow of data across multicloud environments and provides insights into the quality of that data. Watsonx.data helps make this process easy, allowing CSPs and NEPs to scale AI across a data store built on an open lakehouse architecture that supports querying, governance and fluid access to data. Using watsonx.data, business functions within the CSP and NEP can access their data through a single point of entry and connect to storage and analytics environments to build trust in their data and work from auditable sources.

CSPs and NEPs that develop thorough organizational and data strategies will not only be positioned to maximize the capabilities and ethical use of their AI frameworks, but they can also apply these methodologies to guide their own enterprise customers along their journeys—opening up the potential for additional revenue streams in the process.

As AI’s capabilities evolve, companies will face a choice between two paths: treating AI as an additional tool for various aspects of the business, or becoming AI-first. CSPs and NEPs that take the latter route will be positioned to realize advantages over competitors in terms of cost savings, service quality and customer experience—and this advantage can only deepen with the maturation of AI over the coming decade.

Source: ibm.com

Thursday, 21 September 2023

Data science vs data analytics: Unpacking the differences


Though you may encounter the terms “data science” and “data analytics” being used interchangeably in conversations or online, they refer to two distinctly different concepts. Data science is an area of expertise that combines many disciplines such as mathematics, computer science, software engineering and statistics. It focuses on data collection and management of large-scale structured and unstructured data for various academic and business applications. Meanwhile, data analytics is the act of examining datasets to extract value and find answers to specific questions. Let’s explore data science vs data analytics in more detail.

Overview: Data science vs data analytics


Think of data science as the overarching umbrella that covers a wide range of tasks performed to find patterns in large datasets, structure data for use, train machine learning models and develop artificial intelligence (AI) applications. Data analytics is a task that resides under the data science umbrella and is done to query, interpret and visualize datasets. Data scientists will often perform data analysis tasks to understand a dataset or evaluate outcomes.

Business users will also perform data analytics within business intelligence (BI) platforms for insight into current market conditions or probable decision-making outcomes. Many functions of data analytics—such as making predictions—are built on machine learning algorithms and models that are developed by data scientists. In other words, while the two concepts are not the same, they are heavily intertwined.

Data science: An area of expertise


As an area of expertise, data science is much larger in scope than the task of conducting data analytics and is considered its own career path. Those who work in the field of data science are known as data scientists. These professionals build statistical models, develop algorithms, train machine learning models and create frameworks to:

  • Forecast short- and long-term outcomes
  • Solve business problems
  • Identify opportunities
  • Support business strategy
  • Automate tasks and processes
  • Power BI platforms

In the world of information technology, data science jobs are currently in demand for many organizations and industries. To pursue a data science career, you need a deep understanding and expansive knowledge of machine learning and AI. Your skill set should include the ability to write in the programming languages Python, SAS, R and Scala. And you should have experience working with big data platforms such as Hadoop or Apache Spark. Additionally, data science requires experience in SQL database coding and an ability to work with unstructured data of various types, such as video, audio, pictures and text.

Data scientists will typically perform data analytics when collecting, cleaning and evaluating data. By analyzing datasets, data scientists can better understand their potential use in an algorithm or machine learning model. Data scientists also work closely with data engineers, who are responsible for building the data pipelines that provide the scientists with the data their models need, as well as the pipelines that models rely on for use in large-scale production.

The data science lifecycle


Data science is iterative, meaning data scientists form hypotheses and experiment to see if a desired outcome can be achieved using available data. This iterative process is known as the data science lifecycle, which usually follows seven phases:

  1. Identifying an opportunity or problem
  2. Data mining (extracting relevant data from large datasets)
  3. Data cleaning (removing duplicates, correcting errors, etc.)
  4. Data exploration (analyzing and understanding the data)
  5. Feature engineering (using domain knowledge to extract details from the data)
  6. Predictive modeling (using the data to predict future outcomes and behaviors)
  7. Data visualization (representing data points with graphical tools such as charts or animations)
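
The short sketch below runs a toy version of several of these phases on an invented dataset: cleaning, a simple engineered feature and a predictive model, with visualization left as the final step. It assumes the pandas and scikit-learn libraries and is only meant to show the shape of the lifecycle, not a real project.

    import pandas as pd
    from sklearn.linear_model import LinearRegression

    # Phases 2-3: mine and clean a small, made-up dataset of weekly orders.
    df = pd.DataFrame({
        "week": [1, 2, 2, 3, 4, 5, 6],
        "promo_spend": [100, 120, 120, None, 150, 170, 200],
        "orders": [520, 560, 560, 575, 610, 650, 700],
    })
    df = df.drop_duplicates().dropna()          # remove the duplicate and the missing row

    # Phase 5: feature engineering (promotional dollars per order, purely illustrative).
    df["spend_per_order"] = df["promo_spend"] / df["orders"]

    # Phase 6: predictive modeling on the cleaned data.
    model = LinearRegression().fit(df[["promo_spend"]], df["orders"])
    forecast = model.predict(pd.DataFrame({"promo_spend": [220]}))[0]
    print(f"Projected orders at $220 of promo spend: {forecast:.0f}")

    # Phase 7: visualization of actuals vs. the fitted trend would typically follow.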

Data analytics: Tasks to contextualize data


The task of data analytics is done to contextualize a dataset as it currently exists so that more informed decisions can be made. How effectively and efficiently an organization can conduct data analytics is determined by its data strategy and data architecture, which allows an organization, its users and its applications to access different types of data regardless of where that data resides. Having the right data strategy and data architecture is especially important for an organization that plans to use automation and AI for its data analytics.

The types of data analytics


Predictive analytics: Predictive analytics helps to identify trends, correlations and causation within one or more datasets. For example, retailers can predict which stores are most likely to sell out of a particular kind of product. Healthcare systems can also forecast which regions will experience a rise in flu cases or other infections.

Prescriptive analytics: Prescriptive analytics predicts likely outcomes and makes decision recommendations. An electrical engineer can use prescriptive analytics to digitally design and test out various electrical systems to see expected energy output and predict the eventual lifespan of the system’s components.

Diagnostic analytics: Diagnostic analytics helps pinpoint the reason an event occurred. Manufacturers can analyze a failed component on an assembly line and determine the reason behind its failure.

Descriptive analytics: Descriptive analytics evaluates the quantities and qualities of a dataset. A content streaming provider will often use descriptive analytics to understand how many subscribers it has lost or gained over a given period and what content is being watched.
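
As a small illustration of descriptive analytics like the streaming example above, the pandas sketch below summarizes net subscribers gained or lost per month and the most popular content category among new signups. The dataset is invented for the example.

    import pandas as pd

    events = pd.DataFrame({
        "month":    ["Jul", "Jul", "Jul", "Aug", "Aug", "Aug", "Aug"],
        "event":    ["signup", "cancel", "signup", "signup", "cancel", "cancel", "signup"],
        "category": ["drama", "sports", "comedy", "drama", "drama", "news", "drama"],
    })

    # Descriptive: net subscriber change per month.
    net_change = (events.assign(delta=events["event"].map({"signup": 1, "cancel": -1}))
                        .groupby("month")["delta"].sum())
    print(net_change)

    # Descriptive: the content category most common among new signups.
    print(events.loc[events["event"] == "signup", "category"].value_counts().head(1))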

The benefits of data analytics


Business decision-makers can perform data analytics to gain actionable insights regarding sales, marketing, product development and other business factors. Data scientists also rely on data analytics to understand datasets and develop algorithms and machine learning models that benefit research or improve business performance.

The dedicated data analyst


Virtually any stakeholder of any discipline can analyze data. For example, business analysts can use BI dashboards to conduct in-depth business analytics and visualize key performance metrics compiled from relevant datasets. They may also use tools such as Excel to sort, calculate and visualize data. However, many organizations employ professional data analysts dedicated to data wrangling and interpreting findings to answer specific questions that demand a lot of time and attention. Some general use cases for a full-time data analyst include:

◉ Working to find out why a company-wide marketing campaign failed to meet its goals
◉ Investigating why a healthcare organization is experiencing a high rate of employee turnover
◉ Assisting forensic auditors in understanding a company’s financial behaviors

Data analysts rely on a range of analytical and programming skills, along with specialized solutions that include:

◉ Statistical analysis software
◉ Database management systems (DBMS)
◉ BI platforms
◉ Data visualization tools and data modeling aids such as QlikView, D3.js and Tableau

Data science, data analytics and IBM


Practicing data science isn’t without its challenges. There can be fragmented data, a short supply of data science skills and rigid IT standards for training and deployment. It can also be challenging to operationalize data analytics models.

IBM’s data science and AI lifecycle product portfolio is built upon our longstanding commitment to open source technologies. It includes a range of capabilities that enable enterprises to unlock the value of their data in new ways. One example is watsonx, a next generation data and AI platform built to help organizations multiply the power of AI for business.

Watsonx comprises three powerful components: the watsonx.ai studio for new foundation models, generative AI and machine learning; the watsonx.data fit-for-purpose store with the flexibility of a data lake and the performance of a data warehouse; and the watsonx.governance toolkit, which enables AI workflows built with responsibility, transparency and explainability.

Source: ibm.com

Tuesday, 19 September 2023

How generative AI is revolutionizing supply chain operations


Supply chains are under enormous stress. As supply chain leaders, we’ve seen unprecedented levels of disruption, challenges with our ecosystem trading partner networks, tariff wars, real wars and tremendous turnover in human capital. Senior supply chain leaders are retiring in record numbers. Welcome or not, large scale change is afoot in our profession. And just when we thought things were on their way back to a new “hybrid” normal, a curious new technology in generative AI seems poised to upend the world of operations yet again.

Across media headlines, we see dark warnings about the existential risk of generative AI technologies to our culture and society. Yet as supply chain innovators, we know there is a rich history of applying technologies to continuously optimize operations. Is generative AI likely to drive an “extinction event” for supply chains as we know them? We think not. The fundamental nature of supply chain is evolutionary, and it has been that way since our craft was born out of the Toyota Production System in the 1950s. In supply chain, we take intentional but measured risks; we don’t swing for the fence.

As our profession looks to apply generative AI, we will undoubtedly take the same approach. With that mindset, we see the potential for step change improvements in efficiency, human productivity and quality. Generative AI holds all the potential to innovate beyond today’s process, technology and people constraints to a future where supply chains are foundational to delivering operational outcomes and a richer customer experience. But we must choose to embrace this new technology and make it part of the fabric of everything that we do.

A new and exciting wave of disruption


Let’s get one thing clear right from the start: generative AI is unlike any technology we’ve ever seen before in supply chain. It’s a game changer. Do not be tempted to discount generative AI as another Radio Frequency Identification (RFID), Blockchain or other “hype cycle” technology. Over the next five years, generative AI will fundamentally change the way we work in supply chain. It will automate out the dull, menial and transactional work that consumes an enormous amount of low value add time for our professionals. Far from eliminating human value, it will elevate the role of our people and empower them to run the business on a predictive, proactive and forward-looking basis.

Despite the tremendous investment our supply chains have received over the last several decades, today they still demonstrate the need for significant reinvention. Many supply chain structures remain functionally siloed and struggle to execute predictably end-to-end. Applications and data are often trapped within departmental boundaries. Multiple ERP instances create challenges with fragmentation of orders and commitments across disparate systems. These combine to create unnecessary costs, increased execution latency and a suboptimal customer experience. There simply isn’t enough time or investment to uplift or replace these legacy investments. It is here where generative AI solutions (built in the cloud and connecting data end-to-end) will unlock tremendous new value while leveraging and extending the life of legacy technology investments. Generative AI creates a strategic inflection point for supply chain innovators and the first true opportunity to innovate beyond traditional supply chain constraints.

Pivoting from ideation to action


A leading supply chain needs to be orchestrated across the value chain. Internal and external stakeholders need fast and accurate information at their fingertips to plan, manage and direct supply chains. To drive personalized actions, insights and visibility, large volumes of data (ERP, WMS, RFID and visual analytics) need to be ingested, normalized and analyzed at high speeds. The need for agile, resilient and competitive supply chains has never been greater than today.

Addressing these challenges requires a platform that enterprises can own, shape and scale per the business needs. At IBM we have embraced a hybrid cloud, component-based architecture that is built on open technologies. Ingesting high volumes of data at speed and contextualizing them to each persona is a given. The “machine” learns, thinks and executes repetitive tasks while allowing supply chain professionals to focus on high impact business events.

The advent of generative AI will transform the ways we think and work. Traditional solutions constrain users in how to engage, utilize and investigate the large volumes of enterprise data. Frequently, the data is hard to even access. While chatbots have attempted to make it easier to get information, they rely on extensive model training and are frequently limited to “how to” questions. Generative AI brings the force of machine learning to everyday tasks by leveraging foundation models that allow users to interact with structured and unstructured data like never before. What took months of training takes just days now. Our supply chain digital assistant allows users to interact with data conversationally to interrogate vast transaction data, such as hundreds of thousands of documents and visual images. Users can go from getting an overall picture of the health of the supply chain to understanding a specific transaction by just asking. They get contextual information as well as factual data (PDFs, visual images, RFID tag information). A question from a supply chain manager (“Where do I have excess inventory?”) or a buyer (“How is my vendor performing?”) becomes a simple question rather than a complex exercise in bringing disparate reports together.
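
To illustrate the kind of question being answered, here is a deliberately simple pandas sketch of the logic behind “Where do I have excess inventory?”: comparing on-hand stock with a short-term demand forecast per location. The data, safety buffer and column names are assumptions; in the digital assistant described above, the same question is answered conversationally over live enterprise data.

    import pandas as pd

    inventory = pd.DataFrame({
        "site":         ["DC-Chicago", "DC-Dallas", "Store-012", "DC-Newark"],
        "sku":          ["A100", "A100", "A100", "A100"],
        "on_hand":      [1200, 300, 80, 950],
        "forecast_30d": [400, 280, 75, 900],
    })

    # Excess = stock above what 30 days of forecast demand will consume,
    # plus a 20% safety buffer (an illustrative business rule).
    inventory["excess"] = inventory["on_hand"] - inventory["forecast_30d"] * 1.2
    print(inventory.loc[inventory["excess"] > 0, ["site", "sku", "excess"]])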

By establishing a common platform for all stakeholders, orchestrating the supply chain becomes intrinsic to everyday tasks and processes. Building on the core foundation, enterprises can deploy generative AI-powered use cases, allowing enterprises to scale quickly and be agile in a fast-paced marketplace.

IBM’s supply chain AI journey


A few years ago, in IBM’s own infrastructure supply chain, we decided that conversational AI could help overcome boundaries between different supply chain domains and provide a collaboration platform with our sales teams, suppliers and partners. On this platform, any authorized user could access critical information in an intuitive way, using natural language. We were able to achieve an easy-to-access, real-time, single view of truth with immediate insights to manage the client experience, operate with resilience and react to market disruptions. Despite major disruptions and dislocations in the COVID and post-COVID worlds, IBM’s supply chain fulfilled 100% of its orders and delivered on its promises.

Integrated generative AI accelerates intuitive conversations between supply chain decision makers and virtual assistants, enabling fast and fact-based actions. These innovations empower supply chain professionals to focus on complex problem resolution, the continuous improvement of our workflow designs and augmenting AI models. Adding generative AI and the power of foundational models to the existing solution is a natural step in the evolution of our supply chain capabilities.

Source: ibm.com

Thursday, 14 September 2023

Human-centered design and data-driven insights elevate precision in government IT modernization


Government agencies are under pressure to close the gap between the needs and expectations of their residents and the level of services that government IT systems can realistically support. When substantial investments of taxpayers’ money are allocated toward IT modernization, agencies are mandated by local legislature and judiciary to deliver meaningful improvements. Government executives face several uncertainties as they embark on their journeys of modernization. The bulk of these uncertainties do not revolve around what software package to pick or whether to migrate to the cloud; they revolve around how exactly to apply these powerful technologies and data with precision and control to achieve meaningful improvements in the shortest time possible. Residents judge the success of modernization by how much easier it becomes to transact with the government on digital devices and how much faster the government can respond to their needs. And government employees can appreciate the effort of IT transformations that make it easier and faster to do their jobs servicing the residents.

The challenge is that the IT systems that power core business functions for even a single agency are complex, supporting thousands of employee workflows and resident touchpoints. While the underlying business framework for an agency may be well established and similar across different states, the human workflows are unique to the given agency or state. Because of that, even a robust software package could not effectively meet the needs of human users right out of the box. What makes or breaks the success of a modernization is our willingness to develop a detailed, data-driven understanding of the unique needs of those that we aim to benefit. We must allow this understanding to lead the process of configuring the technology, so it meets both human and business needs with precision and control. That is exactly what most successful players in the commercial space (such as Apple) have been doing for decades—letting human-centered design determine how technology is configured (not the other way around). Often, the reason digital government experiences lag behind commercial enterprises is not a lack of funding, but a lack of human-centered design.

To reduce the risk of shortcomings and ensure that transformations are on track to meet resident, employee and business needs, those in charge of government IT transformations need to consider the following four areas, prior to technical implementation:

◉ What to improve: Understand exactly what’s “broken” and what enhancements and new capabilities are needed to fix it.
◉ How to prioritize: Use data to understand what enhancements and capabilities would bring the greatest benefit.
◉ What good experience looks like: Validate that the experience you propose is right and will be adopted by the users, before implementing it.
◉ How to quantify the impact: Quantify, articulate and measure the expected long-term benefit of a capability to justify the investment.

Using data, user research and human-centered design to effectively address these considerations can help you develop a clear modernization strategy that objectively drives your priorities, backlogs and roadmaps. This strategy is not based on a “gut feel” or anecdotal knowledge of a few individuals, but on a well-documented, expansive set of findings, business analysis and calculated projections. This approach also provides a framework that is fully transparent and traceable to the source of every finding and decision. It builds confidence that you are on the right track and enables you to articulate benefit to your stakeholders and the community in detailed and quantifiable terms.

Case study: state of Arizona


The Arizona Department of Child Safety (AZDCS) requested help from IBM to build human-centricity into a major IT transformation to advance the way their child welfare workforce and their service provider community provide services to children and their families. The motivation behind this request was the agency’s experience with an out-of-the-box platform that struggled to get traction and adoption due to its “technology-first” approach. The community of workers and providers often avoided using it because it was not designed with human users’ needs and priorities in mind. This severely reduced the business benefit expected from the platform and perpetuated negative sentiment around the digital experience provided by the agency.

As AZDCS embarked on a 3-year transformation, Steven Hintze (AZDCS Chief Data and Product Officer) prioritized a human-centric design approach to elevate the usefulness and efficiency of their Comprehensive Child Welfare Information System (CCWIS) technology platform.  They focused on a clear data-supported blueprint to achieve high marks with the platform’s human users, leading to widespread adoption, positive sentiment and measurable advancements in how well the platform supports their child welfare workforce and their service providers’ critical work functions.

To help AZDCS, IBM leveraged our Service Design for Governments methodology that was developed to solve these exact types of challenges. Focused on business and human outcomes first to inform technology decisions, the IBM team of UX designers and child welfare subject matter experts conducted user research by engaging a diverse group of stakeholders and end users of the platform. The team implemented a phased approach to the human-centered design assessment, which led to a data-driven roadmap of recommended technological enhancements. Each recommendation was grounded in the user research conducted and validated to render significant return on investment (ROI) to the business mission of AZDCS. This meant that each enhancement recommendation was not only infused with the voice of the stakeholders and end users, but also included strategic business case developments in which AZDCS was able to measure and report upon the impacts to their business.

The IBM Service Design for Government methodology follows a phased approach that focuses on improving the end user experience through data-driven prioritization, with the downstream effect of increasing user adoption and substantially improving business and resident outcomes.

Phase 1: Understand needs

Acknowledging that we ourselves are not the users of the agency platform is one of the most often overlooked realities in implementing technology into the health and human services’ space. This acknowledgement is critical to designing and implementing technology that meets the needs of the intended end users. For AZDCS, end user groups included internal workforce roles, contracted community service providers, foster caregivers, kinship caregivers and adoptive parents. Creating assumptions around what we believe we know about these users is not enough; we must do our due diligence and conduct research. In a Service Design engagement, user research is a collaborative and robust process that focuses on understanding user challenges and pain points, as well as gathering the data needed to measure the negative value of these challenges, so the agency can address problem areas and prioritize the roadmap. Through the analysis of collected data, potential opportunities for improvement are uncovered.

A pain point tracker (a repository of business, human-centered design and technology issues that inhibit users’ ability to execute critical tasks) captures themes that arise during the data collection process. The pain point tracker clusters the foundational data to which value metrics are then applied. For an agency with a child welfare mission like AZDCS, value metrics include multiple dimensions such as volume (what roles are impacted and how many people?), frequency (how many occurrences?), time (how much time is lost?) and quality (how does this impact service delivery, business process and data quality?). The positive value of an enhancement can be measured by how quickly a worker can navigate to the data needed for timely and critical decision making; by how easily and rapidly workers can enter required data into the system, freeing up more time to engage in higher-value tasks such as face-to-face time with the children and families on their caseload; or by how much it shortens the time it takes for a family to be connected with the right services that will enable them to stabilize sooner (and with better, longer-term outcomes).
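
A simplified sketch of how such value metrics might be combined follows: each pain point gets an illustrative score from the frequency and time dimensions described above, which can then drive roadmap prioritization. The weights, figures and field names are assumptions, not the actual formula used in the IBM methodology.

    pain_points = [
        {"name": "Manual re-entry of placement data", "roles_impacted": 350,
         "occurrences_per_month": 1200, "minutes_lost_each": 12},
        {"name": "Hard-to-find case history", "roles_impacted": 900,
         "occurrences_per_month": 4000, "minutes_lost_each": 3},
    ]

    def monthly_hours_lost(pain_point):
        """Illustrative value metric: total workforce hours lost per month."""
        return pain_point["occurrences_per_month"] * pain_point["minutes_lost_each"] / 60

    # Rank pain points by the hours they cost, highest first.
    for p in sorted(pain_points, key=monthly_hours_lost, reverse=True):
        print(f'{p["name"]}: ~{monthly_hours_lost(p):.0f} hours lost per month '
              f'across {p["roles_impacted"]} impacted users')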

Phase 2: Map current state

From the data collected in Phase 1, current state user journey maps are synthesized into golden threads—critical interconnected workflows for the top user segments.

For an agency like AZDCS, an example of a golden thread might be the journey of an ongoing case specialist, a referral specialist and a payment specialist, as each contribute within the workflow to request an out-of-home placement for a child. Throughout the visual representation of the journey, pain points are plotted accordingly. Each pain point is referenced back to the pain point tracker for full traceability of how it was uncovered and how the value was calculated, providing the basis for the ROI if the pain point is resolved.

Phase 3: Envision the future

Phase 3 shifts focus to guided co-creation with key stakeholders, sponsor users, and business and technical subject matter experts. Engaging in a set of design thinking sessions, this group converges on a future state system of experiences. This part of the process effectively turns the existing pain points and challenges into opportunities to innovate efficient and intelligent ways to conduct key business and user functions that leverage the full potential of the supporting technology. This cross-disciplinary, real-time collaboration enables rapid decision making that considers human needs, business priorities and technological realities. This shapes a path forward that delivers the maximum benefit to residents in the shortest time, with a focused investment of resources.

As stated earlier, for AZDCS the goal was a platform design that leads to high user adoption, positive sentiment and measurable advancements in how well the platform supports their child welfare workforce and their service providers’ critical work functions. Key opportunities revealed during pain point analyses (along with the measurable impacts and current state mappings) serve as the basis for the design phase of the project. They also leverage ideas from design thinking workshops where platform users and stakeholders are encouraged to think big. Subsequent activities break each big idea into smaller, more manageable technical enablers that the group can prioritize based on measurable impacts to the user and to the business.

Phase 4: Prototype & validate

In this phase of the methodology, the group builds prototypes of key user journeys using the prioritized future state ideas generated by stakeholders and end users in Phase 3. This provides an opportunity to review proposed experiences and obtain essential feedback from end users. These reviews can be factored in before enhancements make their way into production. Ideas are also vetted against available technology, as well as the agency’s platform architecture and prior investments, to further determine the value of each proposed enhancement. In this process, AZDCS and the IBM methodology work together to apply greater value metrics to determine where standard configuration of the platform/product can be achieved and where customization may be necessary. The prototype building process prioritizes the ideas that receive the highest marks for business value that are also the most feasible to implement.  As prototypes are finalized, key stakeholders and end users are brought in again to validate and refine the designs. The output in Phase 4 is an experience blueprint and product roadmap that represents the best of collaboration between IT, design and front office business. It also ensures that the agency’s platform/product roadmap infuses the voices of the end users and also represents enhancements that will deliver the greatest value to the business if implemented in upcoming builds.

Realized impact


In the absence of a human-centered, data-driven project, many agencies find themselves modernizing with a technology platform that fails to meet the needs of their workforce and stakeholders. It often requires costly iterations of customizations that still do not produce the intended value. AZDCS’ strong leadership, vision and prioritization of creating a human-centered, value-based and modernized child welfare technology platform/product is producing favorable outcomes. Rather than modernizing their system for the sole purpose of being federally compliant, AZDCS’ Chief Data and Product Officer Steven Hintze is infusing the department’s workforce with optimism, excitement and a sense of value in participating in the design of the platform.

“With a large enterprise system change like this, it’s important to engage all customers and stakeholders. When those stakeholders speak up, listen to them, and do something with their information. That doesn’t mean saying yes to everything, but it does mean making information accessible to them, being transparent with problems, teaming with them on key decisions, and trusting them as much as you expect them to trust the implementation team.”

Steven continues to lead the future of their CCWIS by co-creating with stakeholders in building an internal culture that feels supported by the modernized technology. He is dedicated to a product roadmap based on value to the workforce, value to the business and, ultimately, value to those who interact with the agency. Through a human-centered service design engagement, AZDCS is now equipped with a data-driven, value-metric orchestrated road map that guides their technology enhancement decisions.

Guided by an attitude of trust and transparency, AZDCS continues to put voices of the end users at the forefront. With an open-door invitation, the agency continues to listen to feedback and maintains consistent communication with the department’s workforce and provider community. Rather than waiting two years to fully design something new (a pain point experienced often in the design, development and implementation life cycle common to some technology projects), department leadership places high value on the service design outputs and is purposeful about adding new, human-centered design enhancements into their maintenance and operation quarterly builds.

Source: ibm.com

Tuesday, 12 September 2023

Powering the future: The synergy of IBM and AWS partnership


We are in the midst of an AI revolution where organizations are seeking to leverage data for business transformation and harness generative AI and foundation models to boost productivity, innovate, enhance customer experiences, and gain a competitive edge. IBM and AWS have been working together since 2016 to provide secure, automated solutions for hybrid cloud environments. This partnership gives clients the flexibility to choose the right mix of technologies for their needs, and IBM Consulting can help them scale those solutions enterprise-wide. The result is easier, faster cloud transformation, technology deployment, and more secure operations, which can help drive better business results. This powerful synergy is revolutionizing how businesses harness the potential of data-driven AI to stay ahead in the digital age.

Data and AI as the Pillars of the Partnership


At the heart of this partnership lies a deep appreciation for the role of data as the catalyst for AI innovation. Data is the fuel that powers AI algorithms, enabling them to generate insights, predictions, and solutions that drive businesses forward. This understanding forms the cornerstone of the IBM and AWS collaboration, creating an environment where data and AI are seamlessly integrated to yield remarkable results. IBM and AWS have worked together to bring IBM’s entire data management portfolio, namely Db2, Netezza and IBM watsonx.data, to AWS in a cloud-native fashion, with the convenience of SaaS for our customers. This collaboration recognizes that data and AI are inseparable allies in driving digital transformation. Data provides the raw materials for AI algorithms to learn and evolve, while AI extracts actionable insights from data that shape business strategies and customer experiences. Together, they amplify each other’s potential, creating a virtuous cycle of innovation.

IBM’s expertise in data management, AI research, and innovative solutions aligns perfectly with AWS’s cloud infrastructure and global reach. This partnership isn’t just a union of convenience; it’s a strategic alliance that capitalizes on the synergies of data and AI.

All in on IBM Data Store portfolio


Over the last six months, the IBM and AWS partnership has launched a suite of groundbreaking data and AI solutions that empower businesses to thrive in a data-driven world. These offerings focus on data management, AI development, and seamless integration, enhancing the capabilities of organizations across various industries.

◉ Watsonx.data on AWS: Imagine having the power of data at your fingertips. watsonx.data is a software-as-a-service (SaaS) offering that revolutionizes data management. It slashes Data Warehouse costs by half, grants instant access to all your data in just 30 minutes on AWS, and incorporates a fit-for-purpose data store based on an open lakehouse architecture. Multiple query engines, built-in governance, and hybrid cloud deployment models further elevate its capabilities.

◉ Netezza SaaS: Transitioning to the cloud has never been easier. Netezza SaaS is designed to simplify data migration, ensuring frictionless upgrades and the lowest Total Cost of Ownership (TCO). Its support for open table formats on COS and integration with watsonx.data streamlines data operations, making it a powerful asset for businesses in need of seamless cloud integration.

◉ Db2 Warehouse SaaS: The need for rapid data processing is met by Db2 Warehouse SaaS, which promises up to 7 times faster performance and a remarkable 30 times reduction in costs. Whether dealing with tens or hundreds of terabytes, this solution scales effortlessly. Its support for native and open table formats on COS and watsonx.data integration underscores its versatility.

A glimpse of real-world impact: Data-driven success


The true power of this partnership and its offerings becomes evident when considering real-world scenarios. One notable example is a prominent Japanese insurance company that leveraged Db2 PureScale on AWS. This partnership allowed them to enhance customer service, gaining insights that reshape how they interact with their clients.

A call to embrace data-driven AI transformation


For our customers, business data is king and AI is the game-changer. The IBM and AWS partnership emerges as a beacon of innovation. Together, we are reinforcing for our customers that data and AI are pivotal to progress. As businesses seek to navigate the complexities of the digital landscape, this partnership offers them a roadmap to data-driven AI solutions that fuel growth, innovation, and sustainable success.

Join IBM and AWS experts at IBM TechXchange Conference 2023, the premier learning event for developers, engineers, and IT practitioners.

The future unveiled through data and AI unity


The partnership between IBM and AWS signifies more than a collaboration; it’s a movement towards harnessing the full potential of data and AI. The symphony of their expertise creates a harmonious blend of innovation that transcends industries and reshapes possibilities. With data as the bedrock and AI as the engine, this partnership sets a precedent for a future where businesses thrive on the power of insights, intelligence, and innovation.

Source: ibm.com

Saturday, 9 September 2023

5 types of chatbot and how to choose the right one for your business


Now, more than ever, different types of chatbot technology play an increasingly prevalent role in our lives, from how we receive customer support or decide to purchase a product to how we handle our routine tasks. Many of us have interacted with these chatbots or virtual assistants on our phones or through devices in our homes—such as Apple’s Siri, Amazon Alexa and Google Assistant. You may also have interacted with chatbots via SMS text messaging, social media or messenger applications in the workplace.


Chatbots have made our lives easier by providing timely answers to our questions without the hassle of waiting to speak with a human agent. In this blog, we’ll touch on different types of chatbots with various degrees of technological sophistication and discuss which makes the most sense for your business. Before addressing these questions, we’ll start with the basics. 

Chatbots explained


A chatbot is a conversational tool that seeks to understand customer queries and respond automatically, simulating written or spoken human conversations. As you’ll discover below, some chatbots are rudimentary, presenting simple menu options for users to click on. However, more advanced chatbots can leverage artificial intelligence (AI) and natural language processing (NLP) to understand a user’s input and navigate complex human conversations with ease. 

What are the different types of chatbot? 


1. Menu or button-based chatbots

Menu-based or button-based chatbots are the most basic kind of chatbot: users interact by clicking the button option from a scripted menu that best represents their needs. Depending on what the user clicks, the simple chatbot may present another set of options to choose from until the user reaches the most suitable, specific option. Essentially, these chatbots operate like a decision tree.

Although these chatbots offer simple functionality and can be helpful for answering users’ repetitive, straightforward questions, they may struggle when faced with more nuanced requests because they are limited to pre-defined answer options. First, this kind of chatbot may take longer to understand the customer’s needs, especially if the user must go through several iterations of menu buttons before narrowing down to the final option. Second, if a user’s need is not included as a menu option, the chatbot will be of no help, since it doesn’t offer a free-text input field.
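
The decision-tree behavior described above can be sketched in a few lines of Python: a nested dictionary of menu options that the user clicks through until a scripted answer is reached. The menu content is invented for illustration, and the fallback branch shows what happens when a need isn’t on the menu.

    MENU = {
        "prompt": "How can we help?",
        "options": {
            "1": {"prompt": "Billing: pick a topic",
                  "options": {"1": "Your invoice is emailed on the 1st of each month.",
                              "2": "You can update your payment card under Account > Billing."}},
            "2": "Our support line is open 8am-8pm, Monday to Friday.",
        },
    }

    def run_menu(node, clicks):
        """Walk the scripted menu using a list of button clicks from the user."""
        for choice in clicks:
            if isinstance(node, str):                  # already reached a scripted answer
                break
            node = node["options"].get(choice, "Sorry, that option isn't available.")
        return node if isinstance(node, str) else node["prompt"]

    print(run_menu(MENU, ["1", "2"]))   # scripted answer about updating a payment card
    print(run_menu(MENU, ["3"]))        # need not on the menu: the bot has no answer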

2. Rules-based chatbots

Building upon the menu-based chatbot’s simple decision tree functionality, the rules-based chatbot employs conditional if/then logic to develop conversation automation flows. The rule-based bots essentially act as interactive FAQs where a conversation designer programs predefined combinations of question-and-answer options so the chatbot can understand the user’s input and respond accurately.

Operating on basic keyword detection, these kinds of chatbots are relatively easy to train and work well when asked pre-defined questions. However, like the rigid, menu-based chatbots, these chatbots fall short when faced with complex queries. These chatbots struggle to answer questions that haven’t been predicted by the conversation designer, as their output is dependent on the pre-written content programmed by the chatbot’s developers. 

Because it’s impossible for the conversation designer to predict and pre-program the chatbot for all types of user queries, these limited, rules-based chatbots often get stuck because they can’t grasp the user’s request. When the chatbot can’t understand the user’s request, it misses important details and asks the user to repeat information that was already shared. This results in a frustrating user experience and often leads the chatbot to transfer the user to a live support agent. In some cases, transfer to a human agent isn’t enabled, causing the chatbot to act as a gatekeeper and further frustrating the user.
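
A rules-based bot of the kind described above can be approximated with simple keyword matching, as in the sketch below. The rules and phrasings are invented; real conversation designers define far richer condition and response pairs, and the final fallback line is exactly where such bots hand the user off (or get stuck).

    import re

    RULES = [
        ({"reset", "password"}, "To reset your password, use the 'Forgot password' link on the sign-in page."),
        ({"opening", "hours"},  "We're open 9am-6pm, Monday to Saturday."),
        ({"refund"},            "Refunds are processed within 5 business days of receiving the return."),
    ]

    def rules_based_reply(message):
        """Return the first canned answer whose keywords all appear in the message."""
        words = set(re.findall(r"[a-z]+", message.lower()))
        for keywords, answer in RULES:
            if keywords <= words:                      # every keyword is present
                return answer
        return "Sorry, I didn't understand that. Let me connect you to an agent."

    print(rules_based_reply("How do I reset my password?"))
    print(rules_based_reply("Can I change my delivery address?"))   # falls through to an agent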

3. AI-powered chatbots 

While the rules-based chatbot’s conversational flow only supports predefined questions and answer options, AI chatbots can understand users’ questions, no matter how they’re phrased. With AI and natural language understanding (NLU) capabilities, the AI bot can quickly detect all relevant contextual information shared by the user, allowing the conversation to progress more smoothly and conversationally. When the AI-powered chatbot is unsure of what a person is asking and finds more than one action that could fulfill a request, it can ask clarifying questions. Further, it can show a list of possible actions from which the user can select the option that aligns with their needs.

The machine learning algorithms underpinning AI chatbots allow them to self-learn and develop an increasingly intelligent knowledge base of questions and responses based on user interactions. With deep learning, the longer an AI chatbot has been in operation, the better it can understand what the user wants to accomplish and the more detailed and accurate its responses become, compared with a recently deployed chatbot whose knowledge base is newly integrated.
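
Training such a bot typically starts from labeled example utterances. The scikit-learn sketch below trains a tiny intent classifier and predicts the intent of a new phrasing; it is only a shape-of-the-idea illustration with invented data, and production assistants (including watsonx Assistant) use far larger models and training sets.

    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.linear_model import LogisticRegression
    from sklearn.pipeline import make_pipeline

    # Tiny, invented training set: example utterances labeled with intents.
    utterances = ["I want to order a large pepperoni pizza", "can I get my usual order",
                  "where is my delivery", "my order hasn't arrived yet",
                  "cancel my order please", "I need to cancel tonight's pizza"]
    intents =    ["place_order", "place_order",
                  "track_order", "track_order",
                  "cancel_order", "cancel_order"]

    model = make_pipeline(TfidfVectorizer(), LogisticRegression())
    model.fit(utterances, intents)

    # Prints the model's best guess at the intent of an unseen phrasing.
    print(model.predict(["where is my pizza delivery"])[0])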

Conversational AI chatbots can remember conversations with users and incorporate this context into their interactions. When combined with automation capabilities like robotic process automation (RPA), users can accomplish tasks through the chatbot experience. For example, when ordering pizza, the restaurant’s chatbot can recognize a loyal customer returning to place an order, greet them by their name, remember their “regular” order, and use their saved delivery address and credit card to complete the order. Being deeply integrated with the business systems, the AI chatbot can pull information from multiple sources that contain customer order history and create a streamlined ordering process. 

Additionally, if a user is unhappy and needs to speak to a human agent, the transfer can happen seamlessly. Upon transfer, the live support agent receives the chatbot conversation history and can start the call fully informed.

The time it takes to build an AI chatbot can vary based on factors such as the technology stack and development tools you are using, the complexity of the chatbot, the desired features, data availability and whether it needs to integrate with other systems, databases or platforms. With a user-friendly, no-code/low-code platform, you can build AI chatbots faster.

With watsonx Assistant, chatbots can be trained on little data to correctly understand the user, and they can be enhanced with search capabilities to sift through existing content and provide answers that address questions beyond what was initially programmed by the chatbot conversation designer. 

IBM watsonx Assistant accelerates the deployment of virtual agents by providing:

  • Improved intent recognition from using large language models (LLMs) and advanced NLP and NLU
  • Built-in search capabilities 
  • Starter kits or built-in integrations with channels, third-party apps, business systems or Contact Center as a Service platforms, such as Nice CXone

According to the 2023 Forrester study, The Total Economic Impact™ Of IBM Watson Assistant, IBM's low-code/no-code interface enables a new group of non-technical employees to create and improve conversational AI skills. The composite organization in the study realized productivity gains by creating skills 20% faster than building them from scratch.
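
For developers, a typical interaction with a deployed assistant looks something like the sketch below, which assumes the ibm-watson Python SDK's AssistantV2 client; the credentials, service URL, assistant ID and version date are placeholders, and the exact parameters should be verified against the current watsonx Assistant documentation.

# Sketch only: send one message to a deployed assistant over the v2 API.
from ibm_watson import AssistantV2
from ibm_cloud_sdk_core.authenticators import IAMAuthenticator

authenticator = IAMAuthenticator("YOUR_API_KEY")                 # placeholder
assistant = AssistantV2(version="2023-06-15", authenticator=authenticator)
assistant.set_service_url("YOUR_SERVICE_URL")                    # placeholder

# Open a session, then exchange a message within it.
session = assistant.create_session(assistant_id="YOUR_ASSISTANT_ID").get_result()
response = assistant.message(
    assistant_id="YOUR_ASSISTANT_ID",
    session_id=session["session_id"],
    input={"message_type": "text", "text": "What are your opening hours?"},
).get_result()

print(response["output"]["generic"])  # the assistant's reply payload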

4. Voice chatbots

A voice chatbot is another conversational tool that allows users to interact with the bot by speaking to it rather than typing. Some voice chatbots are still rudimentary: many users have been frustrated by Interactive Voice Response (IVR) technology, especially when the system can't retrieve the information they're looking for from its pre-programmed menu options and puts them on hold. However, this technology is evolving with artificial intelligence.

AI-powered voice chatbots can offer the same advanced functionality as AI chatbots, but they are deployed on voice channels and use text-to-speech and speech-to-text technology. With the help of NLP and integration with computer telephony systems, voice chatbots can now understand spoken questions, analyze users' business needs and provide relevant responses in a conversational tone. These capabilities can increase customer engagement and human-agent satisfaction, improve call resolution rates and reduce wait times.
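
The basic loop is speech to text, then language understanding, then text to speech. The Python sketch below is conceptual only: transcribe(), understand() and synthesize() are hypothetical stand-ins (returning canned values here) for whichever speech and NLU services a real deployment would call.

# Conceptual voice-bot turn: audio in, audio out.
def transcribe(audio: bytes) -> str:
    # Placeholder for a speech-to-text call; returns a canned transcript here.
    return "I'd like to check my account balance"

def understand(transcript: str) -> dict:
    # Placeholder for an NLU call; returns a canned intent here.
    return {"intent": "check_balance"}

def synthesize(reply_text: str) -> bytes:
    # Placeholder for a text-to-speech call; returns fake audio bytes here.
    return reply_text.encode("utf-8")

def handle_call_turn(audio: bytes) -> bytes:
    transcript = transcribe(audio)            # speech to text
    result = understand(transcript)           # intent and entities
    reply = f"Sure, I can help you {result['intent'].replace('_', ' ')}."
    return synthesize(reply)                  # text to speech for playback

print(handle_call_turn(b"...caller audio..."))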

While chat and voice bots both aim to identify users' needs and provide helpful responses, voice chatbots can offer a quicker, more convenient way to communicate, since it's easier to get a real-time answer by speaking than by typing or clicking through drop-down menu options.

5. Generative AI chatbots 

The next generation of chatbots with generative AI capabilities offer even richer functionality through their fluency in everyday language, their ability to adapt to a user's style of conversation and their use of empathy when answering users' questions. While conversational AI chatbots can digest a user's questions or comments and generate a human-like response, generative AI chatbots take this a step further by generating new content as the output. This new content can include high-quality text, images and sound, based on the large language models (LLMs) they are trained on. Chatbot interfaces with generative AI can recognize, summarize, translate, predict and create content in response to a user's query without the need for human intervention.
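
In code, the shift is mostly about what produces the reply: instead of selecting a pre-written answer, each turn sends the running conversation to an LLM and returns whatever it generates. The sketch below is generic; generate() is a hypothetical stand-in (canned here) for whichever hosted model the chatbot is built on, and no specific product API is implied.

# Generative chatbot turn: the reply is generated, not selected from a list.
def generate(prompt: str) -> str:
    # Placeholder for a call to a hosted large language model.
    return "Happy to help! Here's a quick summary tailored to your question..."

def chat(history: list, user_message: str) -> str:
    history.append(f"User: {user_message}")
    # Passing the whole conversation lets replies adapt to earlier context
    # and to the user's own style, rather than returning canned answers.
    prompt = "\n".join(history) + "\nAssistant:"
    reply = generate(prompt)
    history.append(f"Assistant: {reply}")
    return reply

history = []
print(chat(history, "Summarize my last three orders in a friendly tone."))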

What is the right type of chatbot for your business? 


When assessing the various types of chatbots and which could work best for your business, remember to place your end user at the center of this decision. What are your users’ goals and their expectations from your business, and what are their user experience preferences for a chatbot? Would they prefer to select from a simple menu of buttons, or would they need the option to correspond in open-ended dialogue for nuanced questions? 

Also, consider the state of your business and the use cases through which you'd deploy a chatbot, whether it be for lead generation, e-commerce, or customer or employee support. If you're working for a smaller company, such as a startup, with a limited number of active users and a small set of frequently asked questions for your conversation designers to pre-program, a rules-based or keyword-recognition chatbot may address your business needs and satisfy customers without much lift.

However, for medium-sized to larger companies that hold vast amounts of user data a chatbot could self-learn from, an AI chatbot can be an advantageous solution, providing detailed, accurate responses to users and an enhanced customer experience.

When thinking about generative AI’s impact on chatbots, think about how your business can take advantage of creative, conversational responses and when this technology makes the most sense for your business objectives and the needs of your customers.  

Source: ibm.com

Friday, 8 September 2023

Harnessing AI and data analytics can make clean energy more viable

The effects of climate change grow more tangible by the day, and the electric power industry is hastening its efforts to reduce environmental impact. To do so, utilities need a clearer picture of where they fall on the emissions-reduction roadmap and a better understanding of their opportunities for improvement. This is where harnessing artificial intelligence (AI) and data analytics can help.

To assist utility companies, IBM has created the Clean Electrification Maturity Model (CEMM) in conjunction with the American Productivity & Quality Center (APQC). Based on 14 years of development and input from electric utility experts, the CEMM provides a series of benchmarks to help utility companies measure the maturity of their clean electrification capabilities, set priorities for transformation and track their progress along the way.

The results from the first global CEMM benchmark of 90 transmission and distribution utilities paint a telling picture of the role that technology can play in this critical transformation. Companies that harness AI and data analytics can also make clean energy more viable overall by increasing their cost competitiveness over legacy energy sources.

The most mature utilities are, on average, 3.3 times more likely to prioritize research in energy balancing and trading. This research can enable more liquid markets and lower energy prices for customers. By introducing AI into renewable energy generation, transmission and distribution processes, utilities can better predict weather patterns, giving them earlier and more accurate insight into the output of solar and wind farms.
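
As a purely illustrative example of that idea, the Python sketch below fits a toy regression relating a few weather features to solar farm output; the features and numbers are made up, and real utility forecasting models are far more sophisticated.

import numpy as np
from sklearn.linear_model import LinearRegression

# Toy training data. Columns: [cloud_cover_fraction, temperature_c, hour_of_day]
weather = np.array([
    [0.1, 22, 12],
    [0.6, 18, 12],
    [0.9, 15, 16],
    [0.2, 25, 10],
])
solar_output_mw = np.array([95, 55, 20, 80])  # observed farm output for each row

model = LinearRegression().fit(weather, solar_output_mw)

# Predict tomorrow's noon output from a (made-up) weather forecast.
tomorrow_noon = np.array([[0.3, 21, 12]])
print(f"Forecast output: {model.predict(tomorrow_noon)[0]:.1f} MW")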

AI: The road ahead for energy and utilities


AI can also benefit customer care, freeing up resources for innovation in other parts of the business. The most mature utilities were more than four times more likely to use customer experience platforms, including AI-powered chatbots and data analytics for personalized service. The latter can also help drive efficiency by lowering end users' energy use: informed by automated reports, customers can better understand their energy consumption and take steps to reduce their power draw during peak periods of demand.

Despite these strides, the industry still has a lot of ground to make up to realize its carbon-reduction goals. According to the CEMM study, the most mature 25% of transmission and distribution utilities achieved only a modest overall score of 2.2 out of 5 across all domains, and the average maturity for all T&D utilities was 1.6. Technological maturity has the most room for improvement: here, the most mature segment fared only marginally better than the bottom 75%, falling into the lowest development category.

Global electricity demands continue to rise. Utility companies must take urgent action to realign their organizations’ strategies and implement intelligent, data-driven processes for change. By emulating the successes of industry leaders and adopting the latest proven AI-enabled technologies, utility companies can take a big step toward meeting their own emissions goals and reducing the impact of climate change on the planet.

Source: ibm.com

Wednesday, 6 September 2023

Transforming financial transparency and trust through integrated resource planning

Athabasca University (AU), Canada’s Open University, is dedicated to removing barriers that restrict access and success within university-level study and increasing equality of educational opportunities for adult learners worldwide.

To support this mission, AU defined its "Imagine" strategy, a five-year plan outlining strategic priority outcomes for the institution. Executing the plan and keeping it aligned with objectives required a collaborative, integrated planning process. The challenge was that annual planning and subsequent forecasting were a major obstacle: a lack of transparency and of access to relevant, timely information for decision-making had eroded trust and collaboration. To fix this, Long Huynh, Director of Decision Support, was brought on board to work with the team on two objectives:

  • Roll out a Finance Business Partner (FBP) Program focused on building a team of finance professionals who could collaborate with faculty members and department heads on financial and business plans, transforming finance into advisors to the business rather than compilers and producers of reports.
  • Streamline the integrated planning and forecasting process, making it easy for users to access information, track variances in spend and initiatives and enable faculty members to re-forecast and ultimately own their numbers.

To assist with this transformation, AU began working with us at ActionKPI—a Performance Management Consultancy specializing in integrated resource planning—to help define a roadmap and phased approach to re-architecting their planning processes and systems. The result was increased trust and collaboration from all faculties and departments while increasing the speed and agility of the financial reporting, budgeting and forecasting processes.  

Challenges faced


◉ Lack of transparency: The data for financial reporting was nested within different platforms with different levels of access. Budget owners struggled to understand their budgets and had difficulty accessing relevant information and understanding variances during the month and leading up to quarter-end.
◉ Manual and siloed budgeting: The budgeting process relied on an antiquated system that was not user-friendly and was disconnected from the main ERP system. Budgets had to be manually uploaded, making it challenging to integrate actuals and forecasts.
◉ Limited access to data: Some data, such as HR and payroll, was difficult to access, hindering financial analysis and decision-making. The team could not get insight into labour costs or build accurate budgets and forecasts for one of their major cost centres.

Before implementing the solution, the financial data was scattered across different platforms with varying levels of access. Budget owners couldn’t easily access their budget information, which caused them to rely on specific individuals to explain reports due to siloed data and complex nuances.

Additionally, Athabasca University’s budgeting process was hindered by an unintuitive system that operated independently from their ERP system, Banner. Manual uploads and a lack of integration between their old system and Banner added to the complexity and time-consuming nature of the budgeting process.

The journey begins: Improved budget visibility


After initial discussions with ActionKPI, Long Huynh and the finance team saw the opportunity to demonstrate the power of integrated resource planning through IBM Planning Analytics (PA).

The focus at the start was on developing a variance and forecasting model within PA that combined budget and actual data from source systems in a user-friendly solution accessible to stakeholders across the university. Unlike the old system, PA enabled daily automatic uploads from the ERP, provided timely access to different reporting hierarchies and variances, and allowed users to drill down into transactional data. Budget owners could log in and view their budgets at any time, giving them a clear picture of their financial position and empowering them to actively participate in the budgeting process and take corrective action to meet targets. This improvement significantly enhanced transparency, trust and accountability among budget owners.

“It was such a foundational change to how we looked at the budgets—from a manual upload of data that is a month old or even a quarter old to every night. It changed the speed at which we can make decisions and see what is happening.” — Long Huynh

The second phase: Workforce and HR integration


Once the initial model was built and data from Banner was integrated directly with IBM Planning Analytics (PA), the focus turned to HR (specifically workforce) to better understand one of the university’s major cost centres. 

Previously, HR tightly guarded payroll information due to privacy requirements, resulting in limited access and insufficient granularity.

To overcome this challenge, AU and ActionKPI developed a workforce variance model within PA built around FTEs and positions at the university rather than individual employees. Actual faculty payroll data by position was incorporated into the model, providing visibility into labour costs, vacancy cost savings and headcount while still maintaining data security and privacy.

Having this workforce detail allowed faculty members and departments to revise their FTE plans and the resulting labour costs, creating a rolling forecast. This not only improved forecast accuracy but also enabled AU to be more agile in its decision-making.
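
As a simplified illustration of what a position-level view can surface, the Python sketch below computes budget-versus-actual labour variances and vacancy savings by position number (names excluded); the figures and column names are invented, not AU's actual model.

import pandas as pd

workforce = pd.DataFrame({
    "position": ["P-1001", "P-1002", "P-1003"],
    "budget_fte": [1.0, 1.0, 0.5],
    "actual_fte": [1.0, 0.0, 0.5],      # P-1002 is vacant this month
    "budget_cost": [9500, 8800, 4200],
    "actual_cost": [9500, 0, 4300],
})

# Cost variance per position, and the portion attributable to vacancies.
workforce["cost_variance"] = workforce["budget_cost"] - workforce["actual_cost"]
workforce["vacancy_savings"] = workforce["cost_variance"].where(
    workforce["actual_fte"] < workforce["budget_fte"], 0
)

print(workforce[["position", "cost_variance", "vacancy_savings"]])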

“Payroll is the biggest piece of the expense pie, and a thorough and accurate understanding of it is required for an accurate forecast. Now, we can forecast by each position and see the impact in-month and into the future.” — Long Huynh

To address privacy and data security concerns, the finance team collaborated with HR to determine the appropriate salary detail each budget owner could access. Each manager can only see salary details for employees within their hierarchy and only position numbers are shown rather than names. This allows managers the transparency and detail required to make informed decisions while maintaining adequate security and privacy.

Integrated resource planning: A holistic approach


The journey continued with a focus on integrated resource planning. The university aimed to transition from closed-door decision-making to inclusive, integrated resource planning that involved stakeholders from across the organization.

Prior to this initiative, resource plans and proposed budget changes were submitted as separate Excel files, leading to collaboration challenges, version control issues and manual data entry.

ActionKPI integrated the budget and resource planning processes into IBM Planning Analytics, allowing for a streamlined approach. This also enabled a democratized process in which budget owners collaborate with their Finance Business Partners to update their own budget entries, supporting greater ownership and eliminating the old routine of budget owners sending numbers to finance for manual entry.

Previously, inconsistent finance knowledge across different faculties and departments led to poor or inaccurate forecasting and budgeting practices.

The rollout of the Finance Business Partner Program contributed to the success of the integrated resource planning process by providing personalized finance support and advisory to each department. It was the financial acumen, passion for customer service and data-driven decision-making skills of the FBP team—combined with the streamlined planning processes and systems—that made it possible.

“Before, a lot of decisions were made behind closed doors. Now, the university has moved to integrated planning, so it is very inclusive. Almost every budget holder is involved in the process versus just finance.” — Long Huynh

Finance Business Partners, assigned to each department, worked closely with budget owners to demonstrate how PA could be utilized to enter, track and visualize budget changes over time.

This integration fostered collaboration, transparency and informed financial decision-making throughout the university.

Through this model, Athabasca University achieved improved forecasting consistency and quality. The FBP model would not have been successful without the implementation of PA, as it provided the necessary infrastructure for effective collaboration and support between finance and budget owners.

Conclusion

The journey to enhance transparency and trust within the university’s financial processes resulted in significant improvements.

By implementing IBM Planning Analytics and integrating budgets, actuals and resource planning, the university achieved better visibility, streamlined processes and increased stakeholder involvement.

The initiatives undertaken by Long and his team not only transformed financial decision-making but also contributed to a cultural shift towards data-driven decision-making and collaboration.

Source: ibm.com

Tuesday, 5 September 2023

Accessing your on-premises network and IBM Cloud VPC using a single VPN connection

To ensure data privacy and reliable access, it's crucial to establish secure connections between networks and resources. However, as the number of connections grows, maintaining them all becomes a burden.

Luckily, you can now optimize your VPN connections with IBM’s VPN offerings: Client-to-Site VPN and Site-to-Site VPN. While you can learn more about these offerings here, feel free to follow the instructions provided in this blog post to connect to your IBM Cloud and on-premises environments using a single Client-to-Site VPN connection.

The use case is depicted in Figure 1 below. End users connect to the VSIs in their IBM Cloud VPC and to the instances and databases in their on-premises environment using a single Client-to-Site VPN connection:

Figure 1

This optimized architecture requires that a Client-to-Site VPN server and a Site-to-Site VPN gateway first be deployed in your IBM Cloud account.

Prerequisites

 
◉ An IBM Cloud account with a VPC and at least one VSI deployed in the VPC to validate the VPN connection.
◉ Necessary IAM permissions, Security Groups and ACLs in place to create VPN gateway(s) and other required resources.
◉ Peer device information from the on-premises location along with pertinent Subnet CIDR information.
◉ OpenVPN client installed on your local laptop, which will be used to validate the VPN connectivity.

Summary of the steps to set up the two VPNs in tandem


First, we’ll create a Site-to-Site VPN and then a Client-to-Site VPN. Once deployed, we’ll create routes and set up authentication and service-to-service authorization to connect the VPNs together. Finally, we’ll install OpenVPN on the laptop and validate connectivity to both IBM Cloud and the on-premises environment. We’ll go into each of these steps in more detail below.

Create the Site-to-Site VPN gateway


Before you begin this step, make sure you have the Peer Gateway and Preshared Key from your on-premises environment at hand along with any IKE and IPsec policies that you intend to use.

Log in to the IBM Cloud Catalog, search for “VPN” and select VPN for VPC. Choose Site-to-site gateways and select the location where you would like to deploy the gateway (along with all the required input parameters). You must choose the Route-based option for the VPN tunnel.

Click on the Create VPN gateway button on the right-hand side of the page. This creates the VPN connection to connect your IBM Cloud with your on-premises data center. Once the gateway is successfully created, it should show as active on the IBM Cloud portal. At this time, the connection is ready for the routes to be set up to route traffic from IBM Cloud to your on-premises environment.
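
If you prefer to automate this step rather than use the console, a rough sketch with the IBM VPC Python SDK (ibm-vpc) is shown below. The API key, region endpoint and subnet ID are placeholders, and the prototype field names are assumptions to verify against the current VPC API reference.

# Sketch: create the route-based Site-to-Site VPN gateway programmatically.
from ibm_vpc import VpcV1
from ibm_cloud_sdk_core.authenticators import IAMAuthenticator

service = VpcV1(authenticator=IAMAuthenticator("YOUR_API_KEY"))      # placeholder key
service.set_service_url("https://us-south.iaas.cloud.ibm.com/v1")    # pick your region

gateway = service.create_vpn_gateway(
    vpn_gateway_prototype={
        "name": "s2s-vpn-gateway",
        "mode": "route",                   # route-based, as required above
        "subnet": {"id": "SUBNET_ID"},     # placeholder subnet ID
    }
).get_result()

print(gateway["id"])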

Create the Site-to-Site VPN routes


Now that the VPN connection is in place, we’ll create VPN routes to define egress routes from IBM Cloud VPC to your on-premises router. Navigate to the VPC Routing Tables to create a new Routing Table or use an existing one to create your VPN route. Input all the required fields. For example:

◉ Destination subnet: CIDR from on-premises
◉ Action: Deliver
◉ Next hop type: VPN connection
◉ VPN gateway: The VPN gateway that was just created
◉ VPN connection: Connection name that was provided while creating the VPN gateway

Important: Once the routes are created, do not forget to attach the source subnet(s) in the VPC to the routing table.
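
The same route can also be created programmatically. The sketch below continues with the ibm-vpc Python SDK; the method and parameter names are assumptions to check against the current VPC API reference, and all IDs, the zone and the CIDR are placeholders.

# Sketch: add a route that sends on-premises-bound traffic to the VPN connection.
from ibm_vpc import VpcV1
from ibm_cloud_sdk_core.authenticators import IAMAuthenticator

service = VpcV1(authenticator=IAMAuthenticator("YOUR_API_KEY"))
service.set_service_url("https://us-south.iaas.cloud.ibm.com/v1")

route = service.create_vpc_routing_table_route(
    vpc_id="VPC_ID",
    routing_table_id="ROUTING_TABLE_ID",
    destination="10.100.0.0/24",               # on-premises CIDR (placeholder)
    zone={"name": "us-south-1"},
    action="deliver",
    next_hop={"id": "VPN_CONNECTION_ID"},       # the Site-to-Site VPN connection
    name="to-on-premises",
).get_result()

print(route["id"])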

You should now have a VPN connection with routing established between your IBM Cloud VPC and your on-premises environment. This flow is indicated in red in Figure 1 above.

Configure authorization and authentication


Before we create a Client-to-Site VPN connection, we must generate client and server certificates and store them in IBM Cloud Secrets Manager.

To enable the VPN to access the certificates from the Secrets Manager, a service-to-service authorization for the VPN Server and IBM Cloud Secrets Manager needs to be established as described here.

Create the Client-to-Site VPN server


Log in to the IBM Cloud Catalog, search for "VPN" and select VPN for VPC. Choose Client-to-site servers and select the location where you would like to deploy the server (along with all the required input parameters). For this article, we have chosen a standalone configuration. Choose a desired CIDR range for the Client IPv4 address pool so that IPs can be assigned to client connections from this range. Input all the mandatory fields in the Subnets section.

Next, configure the server and client authentication. Select the server and client certificates that were added to Secrets Manager in the previous steps. For added security, you can optionally enable User ID and passcode authentication. Finally, ensure that the Security Group rules are configured to allow VPN traffic into the subnet.

While the rest of the input parameters are optional in this form, choose the Full tunnel option to allow all traffic to flow through the VPN interface and into the VPN tunnel. Click on the Create VPN server button on the right-hand side of the page.

Create the Client-to-Site VPN routes


Once the connection shows active on the Portal, you must create two routes—one to allow end-user access to resources within the VPC and one to allow end-user access to the remote/on-premises network. This flow is indicated using solid green and red dashed lines in the VPC in the above diagram.

Configure the client profiles


Lastly, download the client profile from your VPN server. On the VPN server page in the IBM Cloud portal, navigate to the Clients tab and click the Download client profile button. Then append the client certificate and private key to the downloaded .ovpn client profile.
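
If you'd rather script that last step, the small Python sketch below appends the certificate and key to the profile using OpenVPN's inline <cert> and <key> blocks; the file names are placeholders for wherever you saved the downloaded profile, certificate and private key.

# Append the client certificate and private key to the downloaded .ovpn profile.
from pathlib import Path

profile = Path("client_profile.ovpn")            # downloaded from the Clients tab
cert = Path("client_cert.pem").read_text()       # client certificate (placeholder name)
key = Path("client_private.key").read_text()     # client private key (placeholder name)

with profile.open("a") as f:
    f.write(f"\n<cert>\n{cert}\n</cert>\n")
    f.write(f"<key>\n{key}\n</key>\n")

print(f"Appended certificate and key to {profile}")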

Detailed instructions to set up the client VPN environment to connect to a VPN server can be found here.

Configure the OpenVPN client and validate connectivity


You will need a VPN client to access your IBM Cloud and on-premises environments. Depending on your local operating system, you can download and install an appropriate VPN client from here. Once installed, launch the OpenVPN client and import the profile configured in the previous steps to connect to the VPC.

Figure 2

This VPN connection allows users to connect to their VPC in IBM Cloud as well as their on-premises environment using IBM Cloud VPN offerings. You can validate successful client connections by navigating to the Clients tab on the VPN server in your IBM Cloud portal.

Source: ibm.com