Thursday, 30 June 2022

IBM named a leader in The Forrester Wave™: Enterprise Data Fabric, Q2 2022


The opportunity to work with many clients on their data fabric journey continues to drive and inspire us to achieve even greater heights with our solutions. We believe the findings of the recent Forrester Data Fabric Wave for 2022 are a clear indication that the efforts we've taken to "[ramp up our] fabric offering aggressively" for our clients are delivering on their initial promise: a data fabric that dynamically and intelligently orchestrates governed data across a distributed landscape to provide a common data foundation for data consumers.

The ability to compose and reuse data services with IBM's data fabric on IBM Cloud Pak for Data allows you to tackle a variety of use cases, such as multi-cloud data integration, governance and privacy, customer 360, and MLOps and Trustworthy AI. In this way, the IBM data fabric delivers end-to-end automation, such as automated enforcement of policy and business controls, to maximize productivity and time-to-value, all while leaving data where it resides. Let's look at IBM's take on some of the specific strengths recognized in Forrester's Wave below.

“IBM is a good fit for organizations with large, complex, distributed data stores across hybrid-and-multi-cloud environments, including legacy systems.”

Key attributes of IBM’s approach to data fabric

Good for large, complex environments – As noted in the quote above, IBM's data fabric can handle the workload of large organizations even when the data in question is dispersed across hybrid or multi-cloud environments. Making complex environments operate more smoothly without excluding legacy systems is key to addressing the challenges faced by most organizations today, as noted in the report: "IBM is strong in several data fabric capabilities, including data modeling, data catalog, data governance, data pipeline, data discovery and classification, event and transaction processing, and deployment options."

Data governance – Some of the most exciting governance capabilities of the IBM data fabric include automatically applying metadata to new datasets using machine learning, auto-generated data quality assessments and scoring, and AI-based dataset recommendations. Providing the semantic meaning of technical data assets is the foundation of any governance program and key to enabling self-service; IBM's data fabric accelerates business understanding through classification, profiling and quality assessments powered by machine learning. Moreover, dynamic masking, alongside automatic policy enforcement and recognition of sensitive data, allows businesses to make sure data is only in the hands of those with a need to know, without losing value from key datasets. Finally, the automated, centralized enforcement of policy and business controls sets apart the data governance and privacy capability of IBM's data fabric architecture.
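To make the dynamic masking idea more concrete, here is a minimal sketch of the general pattern of redacting unauthorized sensitive fields at read time. The column sensitivity classes and role entitlements are invented for illustration; this is not IBM's implementation:

```python
# Minimal illustration of policy-driven dynamic masking (not IBM's implementation).
# Column classifications and role entitlements below are illustrative assumptions.

SENSITIVE_CLASSES = {"ssn": "identifier", "email": "contact", "salary": "financial"}

# Roles entitled to see each sensitivity class in the clear.
ENTITLEMENTS = {
    "identifier": {"privacy_officer"},
    "contact": {"privacy_officer", "marketing_analyst"},
    "financial": {"privacy_officer", "finance_analyst"},
}

def mask_value(value: str) -> str:
    """Redact all but the last two characters, preserving a length cue."""
    return "*" * max(len(value) - 2, 0) + value[-2:]

def apply_masking(row: dict, role: str) -> dict:
    """Return a copy of the row with unauthorized sensitive fields masked."""
    masked = {}
    for column, value in row.items():
        sensitivity = SENSITIVE_CLASSES.get(column)
        if sensitivity and role not in ENTITLEMENTS[sensitivity]:
            masked[column] = mask_value(str(value))
        else:
            masked[column] = value
    return masked

print(apply_masking({"name": "A. Smith", "ssn": "123-45-6789"}, role="marketing_analyst"))
# {'name': 'A. Smith', 'ssn': '*********89'}
```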

Data pipeline and Trustworthy AI – Getting enough data of sufficient quality to the right users or apps for analysis and AI processing will always be easier said than done. However, we're working to narrow that gap as much as possible. A range of data integration styles, such as ETL and ELT, data replication, change data capture and data virtualization, helps provide seamless access to all data. Across the IBM data fabric, components such as advanced data engineering work to automate access and sharing while accelerating data delivery with active metadata. From there, the focus shifts to trust in model building and deployment, with automated tools for data cleaning and preparation so users can dive right into the integrated tools for building, deploying, scaling and training models. Finally, automated monitoring and retraining of models comes into sharp focus, helping avoid model degradation, drift and bias.
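As a rough illustration of what automated drift monitoring can look like in any stack, the sketch below computes a population stability index (PSI) between a training-time sample and a production sample of one feature, and flags the model when the score crosses a commonly cited rule-of-thumb threshold. The data and the threshold are illustrative assumptions, not product internals:

```python
# Illustrative drift check using the population stability index (PSI).
# The data and the 0.2 threshold are assumptions, not product internals.
import numpy as np

def psi(baseline: np.ndarray, production: np.ndarray, bins: int = 10) -> float:
    """PSI between a training-time sample and a production sample."""
    edges = np.histogram_bin_edges(baseline, bins=bins)
    b_counts, _ = np.histogram(baseline, bins=edges)
    p_counts, _ = np.histogram(production, bins=edges)
    b_frac = np.clip(b_counts / b_counts.sum(), 1e-6, None)
    p_frac = np.clip(p_counts / p_counts.sum(), 1e-6, None)
    return float(np.sum((p_frac - b_frac) * np.log(p_frac / b_frac)))

rng = np.random.default_rng(seed=0)
baseline = rng.normal(0.0, 1.0, 10_000)    # feature distribution at training time
production = rng.normal(0.4, 1.2, 10_000)  # same feature observed in production

score = psi(baseline, production)
if score > 0.2:  # 0.2 is a widely cited rule of thumb for significant drift
    print(f"PSI = {score:.3f}: significant drift, consider retraining")
```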

Transaction processing – The storied reputation of IBM Db2 carries forward into the transaction processing capabilities of the IBM data fabric. All of the same great capabilities businesses have come to rely on remain, upgraded with a data fabric approach. The data integration styles mentioned above are, of course, part of these improvements, but even more is available, including automatic workload balancing and elastic scaling to help ensure you're ready for high volumes of data. Zero-downtime data migration and upgrades also help make sure your transaction processing never misses a beat.

Deployment options – IBM's approach to the data fabric is founded on giving clients the freedom to choose how they architect their solution. Nowhere is this more important than in our deployment options, which range from fully managed by IBM to fully managed by clients on premises. Additionally, clients can choose from several cloud vendors, including AWS and Azure. And while our experts would be happy to advise on which capabilities to choose as part of the IBM data fabric to suit your existing architecture, the ultimate decision lies with you on which of our highly composable data fabric components to select now and which to activate at a future time, allowing you to avoid paying extra for a "one-size-fits-all" approach.

We look forward to showing you more in the upcoming months. Until then, please check out The Forrester Wave™: Enterprise Data Fabric, Q2 2022 for the full details on what led us to be named a Leader. And take a moment to peruse The Data Differentiator, our newly released, continually evolving guide shaped by CDOs, data leaders and data experts across IBM and beyond.

Source: ibm.com

Tuesday, 28 June 2022

Turn sustainability ambition into action


For nearly 20 years, IBM has surveyed thousands of CEOs about their biggest challenges. In the latest survey, sustainability ranked at the top, a 5-spot jump from 2021. Nearly 60% of CEOs told us they see significant demand from investors for greater transparency on sustainability. They are also feeling the pressure from multiple stakeholders. Regulators and governments in most top economies have developed corporate disclosure requirements around environmental impact. Customers want to buy from sustainable businesses. People want to work for these companies and invest in them. From the boardroom to the operations centers, all stakeholders want to play a role in making a positive difference for our world.

However, while 86% of companies say they already have a sustainability strategy in place, only 23% say they are implementing sustainability strategies across their entire organization. Many organizations with good intentions are stalled at the planning stage because implementing sustainable practices is complex and they don’t know how or where to make an impact. Despite this delay, the same IBV CEO study found that 80% of CEOs believe investments in sustainability will improve their own business results within 5 years. How can you turn strategy into results?

Take the first steps to build and operationalize sustainability

Becoming more sustainable is an opportunity to innovate, make a difference and grow. Take action by following these three steps:

Define your sustainability goals – To succeed, your business needs to set and act on clear environmental, social and governance (ESG) goals, then execute with exceptional data discipline across the enterprise.

Establish your ESG data foundation – Create a clear baseline that underpins every goal, from which you can determine your current impact, track progress and implement adjustments. This requires a single system of record to integrate and manage ESG data that aligns to your goals. Collect, correlate, visualize and analyze relevant data so you can deliver transparent, verifiable, financial-grade information and more easily identify where improvements are most needed (a minimal data-consolidation sketch follows these steps).

Operationalize your sustainability goals – Full benefit can then be achieved by leveraging the links between this system of record for ESG data and the underlying operational systems that run across all the departments and business units of your organization. With these links in place, you can automate feedback loops that enable actions based on insights. These insights help drive sustainable transformation through intelligent facilities and assets, resilient IT Infrastructure, and circular supply chains.
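To ground the second step, here is a minimal sketch of pulling siloed ESG measurements into a single auditable baseline. The file names, column layouts and emission factors are hypothetical assumptions; a real program would use audited factors and far more data types:

```python
# Illustrative consolidation of siloed ESG data into one record set.
# File names, column names and emission factors are assumptions.
import pandas as pd

electricity = pd.read_csv("site_electricity_kwh.csv")   # columns: site, month, kwh
fuel = pd.read_csv("fleet_fuel_litres.csv")             # columns: site, month, litres

# Hypothetical emission factors (kg CO2e per unit); real factors come from
# published sources such as national grid averages.
KG_CO2E_PER_KWH = 0.4
KG_CO2E_PER_LITRE = 2.7

electricity["kg_co2e"] = electricity["kwh"] * KG_CO2E_PER_KWH
fuel["kg_co2e"] = fuel["litres"] * KG_CO2E_PER_LITRE

ledger = pd.concat(
    [electricity.assign(source="electricity"), fuel.assign(source="fleet fuel")],
    ignore_index=True,
)

# A single, auditable baseline: emissions per site per month.
baseline = ledger.groupby(["site", "month"])["kg_co2e"].sum().reset_index()
print(baseline.head())
```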

Focus on three key operational areas


Intelligent facilities and assets—Monitoring and recording operational data from your organization’s physical assets and real estate facilities is a good place to start. The data you collect can fuel insights to drive significant energy savings, optimize waste management and provide predictive maintenance data to help reduce unplanned downtime.

Resilient IT infrastructure—Data centers provide multiple opportunities for improving sustainability. Upgrading IT infrastructure with newer, more energy-efficient equipment can help you reduce energy consumption and eliminate wasteful, outdated hardware. What’s more, the same steps you take to improve business resiliency across your organization also help you improve customer experiences and productivity while you work towards meeting your sustainability goals.

Circular supply chains—More consumers are demanding reuse programs and transparent sourcing data for the products they buy. Deploying intelligent workflows and taking advantage of automation opportunities can not only reduce waste but also optimize fulfillment and delivery paths with lower carbon footprints. Solutions powered by AI and backed by blockchain can help you progress toward a net zero supply chain.

Advance your journey with IBM


As you embark on your sustainability journey and tackle each of our three recommended steps, IBM and its ecosystem of partners are on hand to support and help you along the way. We recognize that no one can do it alone, but together we can turn sustainability ambition into action.

Define your sustainability goals with IBM Consulting™ — IBM experts, strengthened by our strategic alliances and partnerships, are helping clients embed sustainability into the fabric of their business to increase operational efficiencies, expose innovation opportunities and drive competitive differentiation. Many businesses rely on IBM Consulting as their partner for the new rules of modern business. We believe open ecosystems, open technologies, open innovation, and open cultures are the key to creating opportunities and the way forward for modern business and our world. Our expert advice helps you leverage sustainability as a catalyst for profitable business transformation.

Establish your ESG data foundation and a clear baseline with IBM® Envizi — Envizi provides a single system of record to manage ESG data that aligns to your goals in a verifiable and auditable way. Automate the collection and consolidation of more than 500 data types, with support for internationally recognized ESG reporting frameworks, to manage environmental goals, identify efficiency opportunities, and assess sustainability risks. Proactively plan and manage the economic impact of weather and climate change events with IBM® Environmental Intelligence Suite®.

Operationalize sustainability goals with IBM technology — At IBM, we are building a comprehensive portfolio of capabilities in collaboration with our network of partners, including AWS, EY, Microsoft, Salesforce, SAP and WBCSD, among others. We help you realize sustainable transformation by leveraging the connections between your ESG data and your daily operations. Many of our clients choose IBM® Maximo® as their foundation for operational insights on physical assets to drive clean energy transition, efficient water and waste management, and decarbonization using AI, analytics and IoT. If your goal is to automate workflows and establish resilient IT infrastructure, IBM Turbonomic® helps you improve data center efficiency, optimize performance and track cloud workflows. IBM LinuxONE can also bring sustainability to mission-critical workloads, helping you manage server sprawl and contain your corporate carbon footprint. And if you seek to improve responsible sourcing and supply chain transparency, IBM Supply Chain Intelligence Suite provides insights and intelligent workflows to enable equitable, net zero supply chains through AI and blockchain. No matter where you decide to act, IBM and its partners have the capabilities to help you achieve your sustainability goals.


Industry leaders rely on IBM solutions to help meet sustainability goals


BestSeller—Reducing return rates, driving supply chain efficiencies and moving toward sustainable fashion with the help of IBM Garage by developing "Fashion.ai", which helps customers purchase items that fit their physical dimensions and preferred style.

Celestica—Consolidating, streamlining, and automating ESG data reporting with IBM Envizi Sustainability Performance Management to produce finance-grade, auditable data and transform sustainability reporting across the manufacturing portfolio.

Sund & Baelt—Digitizing maintenance and asset management processes using IBM Maximo to extend the life of critical infrastructure (like bridges and tunnels) by 100 years and avoid 750,000 tons of carbon dioxide emissions.

Carhartt—Deploying cloud-first strategy with IBM Turbonomic® Application Resource Management to ensure 24×7 performance while reducing application needs and energy consumption.

BBVA—Adopting the IBM z15 platform to reduce carbon dioxide emissions and energy consumption of BBVA's data center processors by 50%.

Partnership for Carbon Transparency—Partnering with the World Business Council for Sustainable Development (WBCSD) and others to enable the secure exchange of verified, standardized product emissions data across organizations.

Source: ibm.com

Sunday, 26 June 2022

Automated governance and trustworthy AI


Artificial intelligence is being infused slowly but surely into all aspects of our lives, and it will be ubiquitous in everyday life sooner than we might imagine. Governments and regulatory bodies around the world are working to establish safety standards.

In the U.S., the Consumer Financial Protection Bureau (CFPB) recently outlined options to prevent algorithmic bias in home valuations for mortgage lenders. The proposed rules aim to govern automated valuation models to protect borrowers. In October 2021, the White House Office of Science and Technology Policy announced, "Americans Need a Bill of Rights for an AI-Powered World." The announcement highlighted the crucial role of training data, and the terrible consequences of using data that "fails to represent American society."

As governments recognize and regulate the growing use of AI for crucial decisions, enterprises should prepare proactively. As part of their AI adoption, enterprises should define and adopt safety standards to manage their regulatory, financial, operational, technology and brand risks. Organizations must plan to govern and establish trustworthiness metrics for their use of AI. With this governance, AI can help enterprises fight discriminatory outcomes, even when AI was not involved in the original decision-making.

AI safety standards should include both governance automation and trustworthiness

Governance automation enables an enterprise to institutionalize the processes, policies and compliance of AI deployments and to continuously collect evidence, ensuring consistent, accurate, timely, efficient, cost-effective and scalable deployment of AI. This cannot be sustained by manual effort. Firms in regulated industries such as financial services, healthcare and telecom will see additional regulations enforced to ensure AI governance compliance, with requirements to document the evidence of that compliance.

As an example, a large financial services firm could easily deploy hundreds or thousands of AI models to assist decision makers in various tasks. The use of AI becomes necessary to take advantage of the massive amounts of transactional and client experience data. These models could include diverse use cases like client credit monitoring, fraud analytics, lending decisions, targeted micro marketing, chatbot interactions, call center analytics and others. For banks with multiple lines of business, including retail and corporate clients, managing diverse technology, operational, brand and regulatory risk exposure with appropriate process, policy and compliance automation becomes a daunting challenge.

Additionally, governance automation needs to be consistent enterprise-wide. It needs to be planned holistically to avoid new technical debt in the governance implementation and to future-proof the investments.

Trustworthiness in AI means the results from AI models are continuously monitored and frequently validated based on model risk characteristics, so that these results can be trusted by decision makers, clients, regulators and other stakeholders. Each stakeholder has their own perspective of trustworthiness. The stakeholders include:

◉ The decision maker: the organization using the model outcomes to make a decision

◉ The regulator: the internal auditor, validator, third-party organizations and government bodies performing oversight and reviewing the outcomes of the decision

◉ The subject: the client or end user involved in the decision process

To understand the different perspectives of these stakeholders, consider a simple credit card or loan approval process. The loan officer makes the decision with the aid of an AI model. This decision maker needs to trust the model for the accuracy of its prediction, the quality of the training data used, the fairness of the prediction, explicit confirmation that various biases are eliminated, and the ability to explain the decision (based on the model prediction) as they would if the decision had been made without assistance from an AI model.

The regulator monitoring the decisions will need all the above across all the decisions made by all decision makers. The regulator will look for evidence to confirm that errors, omissions and biases have not occurred.

The client (the subject of the decision) will want to understand and trust the approval or denial of the loan decision. Explaining the decision to the client easily and at scale has always been as important as the decision. Now it can become a competitive advantage for brands that invest in automating explanations with easy-to-understand visuals.

Clients can and will continue to demand explanations of the decision.
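As a toy illustration of automating explanations at scale, the sketch below ranks the features that most influenced a logistic-regression-style loan decision and renders a one-line rationale. The model weights, feature names and wording are invented for illustration and are not any bank's or IBM's method:

```python
# A minimal sketch of an automated, human-readable loan-decision explanation.
# The model, features and wording are illustrative, not a production approach.
import numpy as np

FEATURES = ["income_score", "debt_ratio", "years_of_history"]
COEF = np.array([1.8, -2.5, 0.9])   # assumed trained logistic-regression weights
INTERCEPT = -0.4

def explain(applicant: np.ndarray) -> str:
    contributions = COEF * applicant           # per-feature log-odds contribution
    logit = INTERCEPT + contributions.sum()
    approved = logit >= 0
    # Rank features by how strongly they pushed the decision either way.
    order = np.argsort(-np.abs(contributions))
    top = ", ".join(
        f"{FEATURES[i]} ({'+' if contributions[i] >= 0 else '-'})" for i in order[:2]
    )
    verdict = "approved" if approved else "declined"
    return f"Application {verdict}; main factors: {top}"

print(explain(np.array([0.7, 0.6, 0.3])))
# Application declined; main factors: debt_ratio (-), income_score (+)
```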

Automate governance and trustworthiness of AI at scale

Organizations with multiple business segments in regional or global domiciles governed by diverse regulatory regimes need an open architecture platform approach to successfully implement governance automation. An open platform should seamlessly automate the integration of AI compliance processes, operating policies and continuous monitoring of models for trustworthiness metrics like accuracy, fairness, quality and explainability. Automation of compliance processes will require configurable workflows, and operating policies will vary across the enterprise based on business segment needs. IBM Cloud Pak® for Data provides a platform approach with scalable and configurable services for both governance automation and trustworthiness. It can be deployed on premises or in the cloud.

Case study: IBM Cloud Pak for Data at work in a large financial services firm

Recently IBM deployed AI governance automation and trustworthiness with IBM Cloud Pak for Data for a large financial services firm. The implementation was done in partnership with various teams on the client side to support the governance process holistically for the enterprise. The solution is configurable for business segment needs, with seamless integration to the technology platforms the client uses to develop and deploy AI models. This platform approach ensured that existing AI investments in technology and skills were preserved while enabling compliance through governance automation.

IBM configured and customized IBM Cloud Pak for Data services — namely OpenPages®, IBM Watson® OpenScale, IBM Watson Machine Learning and IBM Watson Studio — to automate their AI governance enterprise-wide for the entire model development life cycle, from model idea inception to production.

IBM Cloud Pak for Data monitors and tracks the models in production and triggers alerts when models breach their predefined thresholds or when trends change across various metrics. Metrics include built-in types like quality, fairness, bias, accuracy, drift and explainability, as well as custom metrics defined by specific business needs.
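The following sketch shows the general shape of such threshold-based alerting; the metric names and threshold values are illustrative assumptions, not IBM Cloud Pak for Data configuration:

```python
# Simplified sketch of threshold-based model monitoring and alerting.
# Metric names and threshold values are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class Threshold:
    metric: str
    lower: float = float("-inf")
    upper: float = float("inf")

THRESHOLDS = [
    Threshold("accuracy", lower=0.85),
    Threshold("fairness_disparate_impact", lower=0.80),
    Threshold("drift_psi", upper=0.20),
]

def evaluate(latest_metrics: dict) -> list[str]:
    """Return one alert per metric that breached its predefined threshold."""
    alerts = []
    for t in THRESHOLDS:
        value = latest_metrics.get(t.metric)
        if value is None:
            continue  # metric not reported this cycle
        if not (t.lower <= value <= t.upper):
            alerts.append(f"ALERT: {t.metric}={value:.3f} outside [{t.lower}, {t.upper}]")
    return alerts

for alert in evaluate({"accuracy": 0.91, "fairness_disparate_impact": 0.74, "drift_psi": 0.26}):
    print(alert)
```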

The solution automates the generation of facts about the model in terms of documents and metrics collection for the full life of the model. It enables easy tracking and handover of tasks and issues in a seamless workflow across various actors and roles in the model lifecycle, such as the business owner, data scientist, model validator and data steward. By implementing this solution, the bank can confidently manage operational, regulatory and technology risks of AI model deployments by different business segments across the enterprise with a uniform, integrated and automated platform.

Not using AI is not an option for any business. But using AI safely and responsibly requires C-suite commitment. Now is the moment for organizations to commit strategic investments towards holistically governing their use of AI in decision-making with an enterprise governance and trustworthiness platform approach.

Source: ibm.com

Saturday, 25 June 2022

The Future of RegTech for AI Governance


The use of artificial intelligence (AI) is now commonplace throughout society. The adoption of AI is driven by its utility and the improvements in efficiency it creates. Every day, most of us rely on AI for tasks like autocompleting our text messages, navigating our route to a new location, and recommending what movie to watch next. Beyond these common uses of AI, there are also uses that regulators are beginning to identify as higher risk. According to the Digital Strategy website from the European Commission, these higher-risk areas can include uses of AI in employment, financial, law enforcement and healthcare settings, as well as other areas where outcomes can have a significant impact on individuals and society. As the adoption of AI grows, there is increasing recognition that enabling trustworthy AI is important, with 85% of consumers and 75% of executives now recognizing its importance. Establishing principles, such as IBM's principles for trust and transparency, is important for guiding the development and use of trustworthy AI. Central to putting these principles into practice is establishing the appropriate governance mechanisms for AI systems.

AI governance will require an agile approach


AI governance is an organization's act of governing, through its corporate instructions, staff, processes and systems, to direct, evaluate, monitor and take corrective action throughout the AI lifecycle, ensuring that the AI system operates as the organization intends, as its stakeholders expect, and as required by relevant regulation. We expect regulations focused on AI systems to evolve rapidly, and operators and developers of AI systems will need to adjust quickly as policy initiatives and new regulations are passed. Agile approaches, first championed in a software development context, are based on values that include collaboration and responding to rapid change. Agile governance approaches are now being used by governments around the world to respond quickly as technology advances and to help enable innovation in emerging technology areas such as blockchain, autonomous vehicles and AI. The use of an agile approach in AI governance can help AI adopters identify whether changes in governance and regulatory requirements are integrated appropriately and in a timely manner.

Integrating RegTech into broader AI governance process


An agile approach to AI governance can benefit from the use of RegTech to meet the expected regulatory requirements for AI systems. As defined in the "Regulatory Technology for the 21st Century" World Economic Forum white paper, RegTech is "the application of various new technological solutions that assist highly regulated industry stakeholders, including regulators, in setting, effectuating and meeting regulatory governance, reporting, compliance and risk management obligations". Examples of RegTech include chatbots that can advise on regulatory questions, cloud-based platforms for regulatory and compliance data management, and computer code that helps enable more automated processing of data relating to regulations. These RegTech solutions can operate as part of a wider AI governance process and be integrated as components of broader AI governance mechanisms that can include non-tech components such as an advisory board, use case reviews, and feedback mechanisms. Integrating into existing processes can be helped by strong stakeholder buy-in and by beginning with basics such as a clear definition of AI, internal policies, and clarity on current legal requirements.

Case studies on OpenPages: Using RegTech for AI governance


IBM OpenPages with Watson is a RegTech solution that can help adopters navigate an environment with rapidly changing regulatory and compliance demands. IBM has helped clients such as Citi, Aviva, General Motors, and SCOR SE leverage this RegTech to help address governance requirements, mitigate risks, and manage compliance. IBM is also using IBM OpenPages with Watson as a foundational RegTech component in its internal end-to-end AI governance process. IBM OpenPages with Watson can help enable the collection of compliance data on AI systems to evaluate compliance against corporate policy and regulatory requirements. Using RegTech for AI governance from the outset of regulatory requirements for AI systems can help enable the creation of a centralized regulatory library that facilitates the collection and tracking of data that would otherwise exist in silos across the business. By leveraging a centralized RegTech solution, the business can also potentially benefit from efficiencies in the processes and resources enabling these solutions.

Looking forward: RegTech expected to play a central role in AI governance practices


We predict RegTech will play a central role in AI governance practices in 2022 and beyond. We expect RegTech solutions will continue to adapt to meet the needs of companies who will be impacted by new regulations, standards, and AI governance requirements. AI is also likely to drive unique requirements for specific RegTech functionality relating to bias assessments (which could include specific metrics like disparate impact ratio), automated evidence to monitor for drift in AI models, and other functionality relating to the transparency and explainability of AI systems.
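To illustrate one of those bias assessments, the disparate impact ratio divides the selection rate of the unprivileged group by that of the privileged group; under the widely used "four-fifths rule", values below 0.8 are often flagged for review. Here is a worked example with made-up numbers:

```python
# Worked example of the disparate impact ratio with made-up numbers.
# Under the common "four-fifths rule", a ratio below 0.8 is often flagged.

def disparate_impact(selected_unpriv: int, total_unpriv: int,
                     selected_priv: int, total_priv: int) -> float:
    """Selection rate of the unprivileged group over that of the privileged group."""
    return (selected_unpriv / total_unpriv) / (selected_priv / total_priv)

ratio = disparate_impact(selected_unpriv=30, total_unpriv=100,   # 30% approved
                         selected_priv=50, total_priv=100)       # 50% approved
print(f"Disparate impact ratio: {ratio:.2f}")  # 0.60 -> potential bias flag
```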

Source: ibm.com

Thursday, 23 June 2022

Deliver on your data strategy with a data fabric


Today, organizations are experiencing relentless data growth spurred by the digital acceleration of the past two years. While this period presents a great opportunity for data management, it has also created phenomenal complexity as businesses take on hybrid and multicloud environments.

When it comes to selecting an architecture that complements and enhances your data strategy, a data fabric has become an increasingly hot topic among data leaders. This architectural approach unlocks business value by simplifying data access and facilitating self-service data consumption at scale.

At IBM’s recent Chief Data and Technology Officer Summit on data fabric and data strategy, I had an exciting conversation with data leaders from some of the world’s most prestigious organizations about how a data fabric architecture can help get the right data to the right people at the right time, driving better decision-making across the organization.

A data fabric orchestrates various data sources across a hybrid and multicloud landscape to provide business-ready data in support of analytics, AI and other applications. This powerful data management concept breaks down data silos, allowing for new opportunities to shape data governance and privacy, multicloud data integration, holistic 360-degree customer views and trustworthy AI, among other common industry use cases. 

How IBM built its own data fabric 

When I rejoined IBM in 2016, enterprise-level data and its use was having a pivotal moment. Because of advances in cloud computing and AI, it was clear that data could play a much bigger role beyond being a necessary output. Data was emerging as an asset that could benefit all aspects of an organization. 

As the newly appointed Chief Data Officer (CDO), I was charged with creating a business data strategy built around making IBM a hybrid cloud and AI-driven enterprise. My goal was to implement a data strategy and architecture that gave appropriate users access to data. The key aspects were that the data had to be trusted and secure, and it had to deliver insights that drive business value through analytics without sacrificing privacy. 

An important part of our evolution also included preparing for the EU’s General Data Protection Regulation (GDPR), which went into effect in May 2018. Our journey to build security, governance and compliance into our data strategy still serves as a digital solution for our clients and customers to achieve GDPR readiness.

Amid the ever-evolving complexity of our environment, we recognized a need for a data fabric architecture to deliver on our data strategy. The critical benefit of a data fabric is that it provides an augmented knowledge graph detailing where the data resides, what it's about and who has access to it. Once we established an augmented knowledge graph, which is the main component of a data fabric, we were in a strong position to intelligently automate across the enterprise, infusing AI into all our major processes, from supply chain to procurement to quote-to-cash. This eventually delivered major reductions in cycle time. Plus, we soon realized that our own business transformation doubled as a blueprint for our clients and customers. But what else can a data fabric do?

How data fabric lays the foundation for data mesh

A data fabric not only acts as a single pane of glass that creates visibility; it also provides a flexible foundation for a component such as a data mesh. A data mesh breaks large enterprise data architectures into subsystems that can be managed by various teams.

By laying a data fabric foundation, organizations no longer have to move all their data to a single location or data store, nor do they have to take a completely decentralized approach. Instead, a data fabric architecture allows for a balance between what needs to be logically or physically decentralized and what needs to be centralized. 

A data fabric sets the stage for data mesh in several ways, such as providing data owners with self-service and creation capabilities, including cataloging data assets, transforming assets into products and following federated governance policies. 

The future of data leadership 

Today’s data leaders are primarily focused on one of three strategic drivers: mitigating risk, growing the top line and enhancing the bottom line. What I find exciting about building a strategy around a data fabric architecture is that it allows data leaders to act as change agents by addressing these business needs all at the same time.

At the IBM summit, data leaders all concurred that we’re entering a new phase, where there will be much more decentralization in the world of data management. We expect to see concepts such as data fabric and data mesh play a critical role in strategy, as they can empower teams to access resources and tools they need on-demand to support them throughout the data product lifecycle. 

But there's more discussion to be had. In the newly released guide for data leaders, The Data Differentiator, you'll find our six-step approach for designing and implementing a data strategy. This information is continually tested and optimized by IBM experts during client engagements, and we wanted to share it with the community to help facilitate conversation about what it takes to succeed with data. You'll also find a discussion of the role your data management architecture plays.

Source: ibm.com

Sunday, 12 June 2022

The next era in banking starts with reframing trust


In a world where 76% of Americans are choosing mobile apps over teller windows and digital wallets are fast replacing cash in pockets, banks are changing rapidly. But it’s more than just expanding digital services. Financial institutions are faced with a culture shift that requires redefining and earning customer trust.

Trust has always mattered in banking. But it has been a particularly thorny issue since the 2008 financial crisis when it became clear that banking practices did not always serve the best interest of customers. “The industry did try to claw back a good 10 years after that to say, ‘We are rebuilding trust with customers,’” says Anthony Lipp, IBM Global Head of Strategy for Banking and Financial Markets.

An investment in trust

When COVID-19 hit, an opportunity inadvertently presented itself: the chance to build trust around customers’ unexpected needs. As branches abruptly shuttered around the world, banks had to find new ways to engage with customers. In the case of helping customers apply for loans or open new accounts, financial institutions had to deftly swap face-to-face and other human interactions for more digitally-powered solutions. For instance, when call center volume shot up as much as 400%, banks leaned more on automated chatbots to handle the enormous volume increases.

This accelerated digital transformation ushered in a heightened and different focus on customer centricity.

Regulated versus unregulated trust

From the establishment of the First Bank in 1791, to the 1929 stock market crash, to the sub-prime mortgage crisis, U.S. banking regulations are constantly changing to mitigate risks (such as financial instability) and to protect customers. Banks are entrusted with personal and confidential customer information that they are obligated to protect.

At the same time, banks need to work toward what Lipp calls “unregulated trust,” building trust beyond what is required by regulation. Is the financial institution operating in the best interest of the customer? Or is it creating friction that often results in hidden charges or a fee?

According to the Consumer Financial Protection Bureau, in 2019 alone, credit card companies charged $14 billion in “punitive” late fees, and banks charged $15 billion in overdraft and non-sufficient-funds fees.

“Unregulated trust requires creating a banking relationship that is more transparent,” says Lipp. “Customers want visibility of the entire product process.”

This is a challenge in an industry where business processes have historically been opaque. Take home mortgages, which typically involve a 12-step process. The customer is faced with the laborious task of filling out the application, which then goes into the dark, mysterious void of “processing.” Banks increasingly offer digital platforms where customers can log in and track the progress. That’s a big step in the right direction. When the customer has more visibility — when they can view outstanding requirements, the schedule of fees, and other parts of the process — they gain a sense of control and an experience they can trust. 

Technical debt of legacy banks

While the pandemic pushed banks to transform their operations to meet customer needs, the industry still lags behind other business sectors in embracing new digital operating models.

“Banks have traditionally been very monolithic,” says Lipp. “It’s hard because the industry is dragging 50 years of layered legacy versus building something new. Companies that were built more recently, like Amazon and Google, didn’t start with this legacy complexity.”

To catch up, banks are seeking to deliver a more transparent, easy and efficient experience enabled by exponential technologies such as blockchain and AI delivered on the hybrid cloud. They are modernizing their legacy systems and business processes in place, within a structurally lower operational cost envelope.

Startups, fintechs and other disruptors

In recent years, mortgage volumes went through the roof as new homeowners migrated out of dense city centers and took advantage of low interest rates. The fintech players in this space were more than ready to meet this surge in demand. They showed home buyers the entire process, from application to closing, through digital end-to-end platforms. Through that transparency, they started to earn trust from a customer base that had historically relied on incumbent banks.

“It was harder for many incumbent players,” says Lipp. “How do non-digital incumbents double the size of their workforce to deal with these volumes? Newer nontraditional players could just add another server.” This rapid, scalable growth has seen companies like Square reach market caps comparable to the 209-year-old Citibank.

Disruptors are also seeing stunning success in the small business space. When traditional banks onboarded a small business customer, they charged a sizeable onboarding fee to cover traditionally inefficient processes, then left customers waiting for weeks on end. They charged even more for additional services. Digital payment companies, on the other hand, allowed small business owners to onboard for free in minutes.

For the digital payment company that poaches that new customer, "they are not just getting the payments business, they're getting point-of-sale, finance and accounting, inventory management, and much more functionality within that small business ecosystem," says Lipp. "But more importantly, they're extracting value tied up in the friction between the individual value chains supporting small business."

The emergence of embedded finance

In recent years, emerging platform-enabled, customer-centric business models have made it simpler for customers to go about their lives and conduct business by tapping into value chains within and across industries. Financial services are an integral enabler for many, if not all, of these ecosystems. Take a moment to think about all the interconnected elements of commerce happening in the background of your day. To reduce the friction in these complex customer interactions, platform companies are increasingly embedding financial services into their value propositions, especially in payments.

“The real challenge banks have had in this new operating environment is determining how best to embed their products and services, without accelerating the commoditization of their business,” says Lipp. Financial institutions can rebuild that customer trust by becoming truly customer-centric and showing up where their customers expect and need them to be.

Innovative financial service leaders are embedding and integrating their capabilities into platforms throughout the expanding, cross-industry ecosystem. With these new integrations, they can engage customers differently, and in the process, gather new insights to improve their own platform by developing targeted products and services.

Ultimately, it’s a win for the customer, a win for the ecosystem platforms and a win for the financial institution, all with trust at the core.

“People don’t wake up in the morning thinking about doing banking,” said Lipp. “But they want to make sure that banking is there when they need it, that it’s embedded in the right part of the experience, and that it can be trusted.”

Source: ibm.com

Saturday, 11 June 2022

The CPO’s role in driving enterprise sustainability


The American quality pioneer Phil Crosby, who popularized the Zero Defects concept, said, "If anything is certain, it is that change is certain. The world we are planning for today will not exist in this form tomorrow." Few events in modern history have amplified this as much as the COVID-19 pandemic. In corporate supply chains, Chief Procurement Officers (CPOs) have had to exercise muscles of risk management, contingency planning and expedited relationship development to quickly pivot away from complex supply chains, keep supply afloat and meet new demand created by the pandemic. As we emerge from the throes of the pandemic, CPOs have a new pressing challenge to rise to: sustainability.

While climate change is not the only sustainability topic that must be addressed by today's corporations, it is the burning platform causing regulators, non-governmental organizations and the business world to understand the critical role of corporations in driving climate action, protecting us all from the consequential and increasingly frequent crises caused by inaction. New regulations sweeping the globe make it clear there is increasing understanding that climate risk is financial risk. Climate risk accelerates the demise of the most vulnerable among us, and it is inextricably tied to other environmental, social and governance (ESG) issues. In this new world focused on sustainability, CPOs must once again redefine their roles to shape the organization's sustainability focus areas, efforts, targets and impact.

Defining and driving sustainable sourcing materiality and focus areas

In a world with numerous sustainability challenges, it is easy to fall into the trap of pursuing every sustainable sourcing and procurement opportunity. This can leave the enterprise paralyzed, unsure where to start in an ocean of opportunities. One of the CPO's leadership imperatives will be to define materiality areas in sustainable sourcing and procurement to maximize return on investment and overall impact from sustainability efforts. Materiality is the measure of the importance of key ESG issues to an organization's stakeholders. For a consumer packaged goods (CPG) company, for example, stakeholders would likely care more about eradicating child labor and slave labor in commodity supply chains than they would about green technology infrastructure. The CPO must therefore understand which ESG issues provide the greatest shared value to stakeholders.

Setting supply chain sustainability targets

As companies have continued to set and announce Net Zero targets, many CPOs feel like they are left holding the bag on Scope 3 emissions reduction targets that they were not part of setting or vetting. CPOs must ensure they are working closely with chief sustainability officers, corporate environmental affairs teams, legal teams and all other functions responsible for setting and communicating sustainability goals. Being able to set sustainability goals should be a direct result of having a well-defined materiality assessment, from which key performance indicators (KPIs) and metrics can be set, and the corresponding goals established.

KPIs and goals must also be aligned with widely accepted standards such as the Global Reporting Initiative (GRI), which is a global standard for impact reporting, or Task Force on Climate-Related Financial Disclosures (TCFD), which is a global standard designed to help public companies and other organizations disclose climate-related risks and opportunities. The CPO must be well-versed in the implications of choosing to align with one standard over another, especially for reporting and compliance purposes in various geographies the company operates in. With the multitude of regulatory requirements in force now and coming soon, CPOs have a mandate to ensure their sustainable sourcing and procurement goals are actionable and achievable, and that they help meet sustainability regulatory requirements.

Procuring sustainability instruments

As companies try to meet the Paris Agreement goals, many are finding they cannot meet emissions goals through their own operations or through Scope 3 emissions collaboration with their suppliers. To meet these goals, companies are resorting to purchasing instruments such as renewable energy certificates (RECs), green power purchase agreements (PPAs) and energy efficiency certificates (EECs). Purchasing these instruments will require an understanding of their valuation, as well as of their effectiveness and environmental integrity in meeting sustainability goals.

There is ongoing debate about the effectiveness of these instruments in actually offsetting emissions. There is constant scrutiny of the companies that sell these instruments. Some are viewed as greenwashers, i.e., companies that create a false impression about how environmentally sound their products are. Category managers in this area are relatively new, so upskilling will be critical. As companies, especially in heavy materials industries, approach their target dates for emissions reduction, these instruments will become more important in helping meet any shortfalls in net-zero or decarbonization commitments. More importantly, companies will need to ensure that the instruments meet the decarbonization goals they purport to meet.

Managing the green premium in procurement

Many companies that have attained sustainability brand equity charge a "green premium" for their products and services. These premiums may reflect the underlying costs of new sustainable technologies and solutions, but sometimes they are only a result of market perception. As enterprises transition to purchasing sustainable products and services, it is the CPO's role to minimize these green premiums and drive competition in sustainable markets.

To drive down green premiums, CPOs must understand the cost factors that drive final price and develop sourcing strategies to drive down costs. These strategies include leveraging supplier collaboration for innovation, complexity reduction and strategic alliances. The CPO’s success will increasingly be measured by how well they develop alliances and collaboratively innovate to develop affordable sustainable products and services.

Supplier development to meet sustainability targets

In assessing company sustainability programs, I have often seen companies forget that their success is intrinsically tied to that of their suppliers. In the transition to Industry 4.0 for example, many companies have implemented leading tools to provide real-time visibility and intelligence — but their success is often hampered by bottleneck suppliers, many of whom are still operating on legacy systems with little visibility to their upstream or downstream supply chain.

To be successful in driving sustainability in their own enterprises, CPOs must also partner with suppliers to drive their downstream sustainability. In Scope 3 emissions for example, suppliers feel like they keep getting the short end of the stick by being instructed to reduce their emissions without any direction or help. Small suppliers especially struggle with how to effectively meet Scope 3 emissions reduction goals without the resources needed to understand their footprint.

It is incumbent on CPOs to ensure suppliers have the support they need to meet company emissions reduction targets and other sustainability targets by working collaboratively, for example in specifications management.

Source: ibm.com

Thursday, 9 June 2022

Overcome these six data consumption challenges for a more data-driven enterprise


Implementing the right data strategy spurs innovation and outstanding business outcomes by recognizing data as a critical asset that provides insights for better and more informed decision-making. By taking advantage of data, enterprises can shape business decisions, minimize risk for stakeholders, and gain competitive advantage. However, a foundational step in evolving into a data-driven organization requires trusted, readily available, and easily accessible data for users within the organization; thus, an effective data governance program is key.


Ensuring data quality and access within an organization, while establishing and maintaining proper governance processes, is a major struggle for many organizations. Here are a few common data management challenges:

1. Regulatory compliance on data use

Whether data protection regulations like GDPR, CCPA and HIPAA are put in place by governments or by a specific industry, these data privacy and consent controls go beyond sensitive data to outline how organizations should allow their employees to access enterprise data in general.

2. Proper levels of data protection and data security

Certain data elements are critical for competitive advantage and business differentiation; therefore, those data assets need to be protected against data breaches, ensuring that only authorized users have data access.

3. Data quality

For data to be trusted, it needs to be complete, accurate and well understood. This requires data stewardship and data engineering practices to curate data standards and track data lineage, increasing the value of data. AI and analytics are only as good as the quality of the data that feeds them.

4. Data silos

A typical organization's data landscape consists of a large number of data stores across workflows, business processes and business units, including but not limited to data warehouses, data marts, data lakes, operational data stores, cloud data stores and CRM databases. Integrating data across this hybrid ecosystem can be time consuming and expensive.

5. The volume of data assets

The number of data assets and data elements that a typical organization stores continues to grow. This extremely large amount of enterprise data – comprising thousands of databases and millions of tables and columns – makes it difficult or impossible for users to find, access and use the data they need.

6. Lack of a common data catalog across data assets

The lack of a common business vocabulary across your organization's data, and the inability to map those categories to existing data, lead to inconsistent business metrics and data analytics, in addition to making it difficult for users to easily find and understand the data.

Why you should automate data governance and how a data fabric architecture helps

The challenges outlined above demand a data strategy that includes a governance and privacy framework. Furthermore, to help the framework scale across the enterprise, it needs to be automated.

To help avoid vulnerability and inability to innovate caused by a lack of proper data governance, an architecture that enables the design, implementation and execution of automated governance across the enterprise is needed. This is especially important for organizations that operate in hybrid and multi-cloud environments.

A data fabric is an architectural approach to simplify data access in an organization. This architecture leverages automated governance and privacy to facilitate self-service data consumption. Self-service data consumption is crucial because it improves data users' ability to easily find and use the right governed data at the right time, regardless of where it resides. It builds on foundational data governance technologies such as data cataloging, automated metadata generation, automated governance of data access and lineage, data virtualization, and reporting and auditing.
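As a small illustration of one of those foundational technologies, automated metadata generation, the sketch below profiles sample column values and tags likely sensitive columns using simplified regex rules. The patterns and thresholds are illustrative assumptions, not a product's classifiers:

```python
# Illustrative sketch of automated metadata generation: profile column values
# and tag likely sensitive data with regex rules. Patterns are simplified assumptions.
import re

PATTERNS = {
    "email": re.compile(r"^[\w.+-]+@[\w-]+\.[\w.]+$"),
    "us_ssn": re.compile(r"^\d{3}-\d{2}-\d{4}$"),
    "phone": re.compile(r"^\+?[\d\s().-]{7,15}$"),
}

def classify_column(values: list, min_match: float = 0.8):
    """Tag a column with a sensitivity class if most sampled values match."""
    for tag, pattern in PATTERNS.items():
        matches = sum(bool(pattern.match(v)) for v in values)
        if values and matches / len(values) >= min_match:
            return tag
    return None

sample = {
    "contact": ["ada@example.com", "bob@example.org", "eve@example.net"],
    "notes": ["follow up Friday", "called twice", "n/a"],
}
catalog = {col: classify_column(vals) for col, vals in sample.items()}
print(catalog)  # {'contact': 'email', 'notes': None}
```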

Source: ibm.com

Tuesday, 7 June 2022

Six reasons you need an intelligent asset management strategy now


There is an explosion of data surrounding asset management processes. This data is invaluable, but it can only be used if it can be properly analyzed. Today more than two-thirds of data goes unused due to the complexity of integrating multiple platforms, devices and assets, and the slow, labor-intensive processes required to make it consumable. The result? Subpar operational performance and reliability issues, made most obvious in downtime and defects.

This is where intelligent asset management comes in.

Intelligent asset management (IAM) solutions put data and AI to work to optimize critical asset performance and automate enterprise operations.

Here are six reasons why your company needs intelligent asset management today:

1. Orchestrate and automate your processes

Intelligent asset management focuses on automating operational processes. This streamlines asset maintenance and management to reduce bottlenecks and manual work, improving uptime, productivity and costs.

In the world of building and space management, companies are using integrated workplace management solutions that use data, IoT and AI. These solutions help organizations design a safe, flexible workplace, increase employee engagement and drive operational efficiency.

2. Create value to grow your organization

Intelligent asset management helps grow revenue through increased asset availability and reliability.

Mining companies, for example, use autonomous vehicles for certain tasks. Equipment can be remotely monitored — sometimes from halfway across the globe — to check for proper oil pressure or temperature and to keep the asset running properly. Robots working in mines underground can operate with no downtime, mitigating the safety risks of hazardous conditions such as fire, flood, collapse, or toxic atmospheric contaminants.

3. Be more competitive

Intelligent asset management makes it easier to deploy industry best practices, such as more sustainable operations.

IAM solutions can incorporate AI, weather data, climate risk analytics and carbon accounting capabilities, allowing organizations to spend fewer resources curating this complex data and more on analyzing it for insights and taking action.

4. Connect to the enterprise

Intelligent asset management means building an enterprise operations system that governs business operations, financials, and production at all levels of an organization, from the C-suite to the frontline, and along the supply chain.

Changing economic and regulatory conditions challenge oil and gas companies to constantly find better ways to monitor, manage and maintain assets while keeping employees safe.

Kuwait Oil Company needed an asset management solution that could integrate this broad range of processes under a single umbrella. The company met its production targets through improved efficiency by deploying an IBM Maximo for Oil and Gas solution, not only in its oil extraction and drilling operations, but across operations including marine operations and the employee hospital.

5. Build AI capabilities without data scientists

Intelligent asset management integrates asset data into no-code and low-code applications for visual inspection, remote asset monitoring and predictive maintenance, eliminating content silos to provide visibility across the organization, all without the need for data scientists.

Last year IBM helped Toyota move from reactive, cycle-based maintenance to proactive, reliability-centered maintenance. Now the car company can detect anomalies and measure the health of equipment at all times, while predicting and fixing failures before they occur.

6. Uncover simplicity and scalability in one package

Intelligent asset management means a simple, secure data architecture with an open, extensible asset management platform that can act on any data, on any cloud, anywhere.

Let us help you create an intelligent asset management solution with IBM Maximo, an extendable suite with the capabilities you need.

Source: ibm.com

Thursday, 2 June 2022

Why ISO modernization is so important for financial institutions


With the migration to the ISO 20022 financial messaging standard, the world is moving toward a standardized system of payment messaging. ISO 20022, which SWIFT will enable for cross-border payments beginning August 2022, will significantly benefit financial institutions (FIs). But these institutions' journey to adoption may prove quite challenging. To successfully implement the new standard, FIs will need a comprehensive solution for translation between legacy message formats and ISO 20022.

Opportunities in ISO 20022 adoption

This transition offers a golden opportunity to enhance how payments operate. In a world of instant payments and escalating payments fraud in the digital payments space, we must take this opportunity to reduce fraud, enhance compliance and build a better, more robust payment infrastructure. Ideally, a solution will offer ongoing interoperability across the different ISO-compliant payment networks.

Since the standard and its adoption continue to evolve, the solution must be future-proofed for ISO 20022 changes as they occur. An ISO-native solution will make it easy to adopt new requirements as they are published. Businesses will have the option to integrate payments into their complete order-to-pay business processes, instead of having to use separate business processes to invoice, receive a payment and ultimately reconcile that payment. The integration simplifies the process from start to finish.

ISO-native messages can contain more data than was possible under previous standards. By leveraging this additional data, FIs can reach a higher standard of protection against financial crime and strengthen AML and sanctions compliance. Businesses can better manage cash and liquidity risk through faster cash application, and both corporations and FIs can better predict cash flows.

With easier access to standardized data sets, FIs can get data analytics faster and use payment stream data to provide business intelligence not previously available in other payment streams.

Developing an ISO API-enabled transformation layer helps FIs create a business strategy that clearly defines the differentiated experiences and products the firm offers. And with an ongoing focus on the ISO APIs, FIs can connect to their ecosystems to deliver on those differentiated experiences and products.

With an ISO 20022 message-based API transformation layer solution, an FI can transform its legacy payment system messages into an ISO 20022-specific message format for the downstream system. This is almost a prerequisite for many FIs to bridge the gap until they have native ISO 20022 capability in their systems.
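To make the transformation layer concrete, here is a deliberately simplified sketch that maps a few fields from an MT103-style message into a pacs.008-like XML fragment. Real SWIFT MT parsing and the full ISO 20022 schemas are far richer; the field mapping, message sample and XML shape here are illustrative assumptions only:

```python
# Highly simplified sketch of a legacy-to-ISO 20022 transformation layer.
# Real SWIFT MT parsing and pacs.008 schemas are far richer; the mapping
# below is an illustrative assumption, not a compliant implementation.
import xml.etree.ElementTree as ET

def parse_mt103_fields(mt: str) -> dict:
    """Extract a few tagged fields (:20:, :32A:, :50K:, :59:) from MT text."""
    fields = {}
    for line in mt.strip().splitlines():
        if line.startswith(":") and ":" in line[1:]:
            tag, _, value = line[1:].partition(":")
            fields[tag] = value.strip()
    return fields

def to_pacs008(fields: dict) -> str:
    """Emit a pacs.008-like XML fragment from the extracted MT fields."""
    root = ET.Element("Document")
    tx = ET.SubElement(ET.SubElement(root, "FIToFICstmrCdtTrf"), "CdtTrfTxInf")
    ET.SubElement(ET.SubElement(tx, "PmtId"), "EndToEndId").text = fields["20"]
    # :32A: packs date (YYMMDD), currency and amount, e.g. 220802USD1250,00
    value = fields["32A"]
    amt = ET.SubElement(tx, "IntrBkSttlmAmt", Ccy=value[6:9])
    amt.text = value[9:].replace(",", ".")
    ET.SubElement(ET.SubElement(tx, "Dbtr"), "Nm").text = fields["50K"]
    ET.SubElement(ET.SubElement(tx, "Cdtr"), "Nm").text = fields["59"]
    return ET.tostring(root, encoding="unicode")

mt103 = """
:20:REF-0001
:32A:220802USD1250,00
:50K:ACME CORP
:59:GLOBEX LLC
"""
print(to_pacs008(parse_mt103_fields(mt103)))
```

In a real transformation layer, the same idea extends in both directions (ISO 20022 to legacy and back) so original and converted messages can be shown side by side during the coexistence period described below.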

Accommodating legacy payment formats

Legacy proprietary formats such as SWIFT MT are already well-known and widely used in the industry, so it will take time to get used to ISO 20022. There is a strategic opportunity to allow for a few years of coexistence between legacy and ISO standards. But it is important to provide end-users with an interface that shows original and converted messages clearly and unambiguously, while providing functionalities that realize the value of ISO’s rich message fields.

A successful adoption of ISO 20022 should meet several objectives: reduce IT complexity, enable agility, enable partners, integrate seamlessly, and comply with regulatory requirements. These objectives may sound familiar: they are objectives for most FI IT systems generally. But they take on heightened importance when prioritizing ISO 20022-enabled systems.

This is especially true for the first objective, reducing IT complexity. Simply adding in an ISO 20022 layer to an existing complex system as a tactic to address the immediate gap may solve the immediate issue, but it does not address the longer term strategic objective of reducing complexity. In fact, as one moves to ISO 20022-native architectures and processing, a tactical solution may be throwaway or even a barrier to meeting that objective.

The future of financial messaging is still being written with respect to ISO 20022. But long-term strategic planning can’t wait. Institutions that adapt now will be better prepared for future changes.

Source: ibm.com