Saturday, 31 December 2022

A catalyst for security transformation: Modern security for hybrid cloud


Today it takes an average of 252 days for an organization to identify and contain a breach across hybrid cloud environments, while ransomware attacks occur every 11 seconds. Traditional security clearly can no longer keep up with our modern world. As most large businesses become multicloud, SaaS-heavy hybrid cloud users, enterprises must raise cyber awareness to protect a dramatically expanded attack surface.

Security is no longer an afterthought and must be embedded in everything we do. In an increasingly complex hybrid cloud environment, how do we secure end-to-end and achieve a holistic security posture that adequately supports business functions? It’s time to think about security at the enterprise level as industries shift to a new, Security First archetype: Transformative Security Programs.

Modernize security: Quality, velocity, affordability


More than 80% of executives struggle to engage information security and operations disciplines early enough to prevent rework or security incidents. To adopt a Security First mindset, companies should consider policy compliance, security regulations and asset protection before they design their cloud strategy. To prevent costly rework, companies should also address complexities early in the strategy and design phase, rather than waiting to deal with security later.

A modernized security operation and management system should avoid the antiquated approach of treating security as a stand-alone function. Instead, run it as a true, integral business entity and invest accordingly to drive cyber resiliency and the quality, velocity and affordability needed to protect digital assets. With a Security First approach, not only are vulnerabilities reduced through secure architecture design and early, modern security testing, but your enterprise can also leverage automation, artificial intelligence (AI) and machine learning (ML) to shorten mean time to respond (MTTR) and offset cyber talent shortages.

Hybrid cloud mastery demands a whole-team approach to security


With 82% of security breaches caused by human error, a modern security program should include situational awareness through a single pane of glass and advanced cyber training such as simulated attack-and-response exercises. These training designs combine the intensity of countering attacks with engaging, game-like elements to educate teams and relate security to their day-to-day activities. Modern security awareness and education encourages people to exercise critical thinking and promotes good cyber behavior during normal operations as well as disrupted, under-attack operations.

Though improving cybersecurity and reducing security risks are critical for the successful execution of digital initiatives in cloud portfolios, they’re not always directly linked in execution. Rather than merely running a security modernization program in parallel with a cloud adoption program, aim to explicitly integrate roadmaps and embed security into the hybrid cloud journey—with enterprise security and hybrid cloud security playing on the same team.

As an example, no matter who is leading a data fabric initiative, designing and implementing a secure data fabric requires the engagement of the whole team. Engaging the whole team means security becomes an explicitly shared responsibility, and this approach is easier and more effective when it’s grounded in a broader Security First and Security Always culture.

3 steps for overcoming the security challenge to hybrid cloud mastery


Step 1: Harmonize the security posture across the estate

Think holistically. Security posture is the sum of security policies, capabilities and procedures across the various components of a hybrid cloud estate. When we push the “start” button and ask specific clouds or components to interoperate productively, a lack of harmony among security postures can expose serious problems. Harmonizing the security posture across the entire hybrid cloud builds a fabric of protection that helps keep “bad guys” from entering through the weakest link. Top-down enterprise security management allows enterprises to achieve this consistency.

Step 2: Create visibility through a single pane of glass

If hackers really want to attack you, they will probe your network at different application ports and generate a surge of network activity. If your data is siloed, you might not notice this surge and could miss a leading indicator of a potential security attack.

Enclaves of data (apps, network, security) should be fused into a data lake to allow accurate security insights across the entire cloud estate. Your enterprise can apply AI and machine learning capabilities to that data lake, turning IT Ops and AIOps data into tools for making better business decisions. This aggregated visibility capability, known as a “single pane of glass,” helps enable detection, assessment and resolution of security anomalies with high velocity.
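
As a simplified illustration of that idea, the sketch below fuses hypothetical per-minute counts from app, network and security silos into one table and flags minutes that sit far above a normal baseline. The column names, values and threshold are illustrative assumptions, not output from any specific IBM tool.

```python
import pandas as pd

# Hypothetical per-minute events pulled from separate app, network and security silos.
app = pd.DataFrame({"minute": range(6), "failed_logins": [2, 1, 3, 2, 40, 55]})
net = pd.DataFrame({"minute": range(6), "new_connections": [110, 95, 120, 105, 900, 1200]})
sec = pd.DataFrame({"minute": range(6), "ids_alerts": [0, 0, 1, 0, 7, 11]})

# Fuse the silos into one view -- the "single pane of glass".
lake = app.merge(net, on="minute").merge(sec, on="minute")

# Flag minutes where any signal sits far above its recent baseline.
baseline = lake.iloc[:4].drop(columns="minute")  # assume the first minutes are normal
signals = lake.drop(columns="minute")
zscores = (signals - baseline.mean()) / baseline.std()
lake["suspicious"] = (zscores > 4).any(axis=1)

print(lake[lake["suspicious"]])  # surfaces the surge in the last two minutes
```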

Remember, in a hybrid cloud ecosystem, security is more than just the security function: it’s central to your business. You also need the rights to harvest this data, secured through good terms and conditions with your cloud provider.

Step 3: Leverage AI to predict vulnerabilities

The single pane of glass is more powerful if we can also make better, faster sense of what we’re seeing. AI, machine learning and automation can ingest high volumes of complex security data, enabling near-real-time threat detection and prediction. AI tools can be “trained” to detect cyberattack patterns that have preceded incidents in the past. When those patterns recur, AI can trigger alerts or even provide actions for self-healing well before a human operator could detect and act upon a potential incident.
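
To make the idea concrete, here is a minimal, hypothetical sketch of pattern-based alerting: it checks whether a known pre-incident sequence of events appears in a recent event stream and, if so, triggers a placeholder response. Real AI-driven detection is statistical and far richer; the event names and the respond() helper are invented for illustration.

```python
# A known pre-incident pattern (hypothetical): repeated failed logins, then a
# privilege escalation, then an unusual outbound transfer.
ATTACK_PATTERN = ["failed_login_burst", "privilege_escalation", "unusual_egress"]

def matches_pattern(events, pattern=ATTACK_PATTERN):
    """Return True if the pattern occurs in order (not necessarily contiguously)."""
    it = iter(events)
    return all(step in it for step in pattern)

def respond(host):
    """Placeholder self-healing action; a real platform would open a response playbook."""
    print(f"Isolating {host} and alerting the security operations team")

recent_events = ["normal_login", "failed_login_burst", "normal_login",
                 "privilege_escalation", "unusual_egress"]

if matches_pattern(recent_events):
    respond("host-017")
```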

With ongoing security talent challenges and an estimated 3.5 million open security jobs, leveraging advanced automation, AI and machine learning allows enterprises to find new ways to put security first with skill and velocity.

It’s time to embrace the transformational power of security to keep up with the demands of the modern world. To master hybrid cloud, you need to develop a unified security program that steers business initiatives, optimizes security resources and transforms your operating culture to be Security First.

Source: ibm.com

Thursday, 29 December 2022

IBM and AWS: Partnering to solve clients’ most complex business challenges


Since 2016, IBM and Amazon Web Services (AWS) have been working together to bring more secure, automated solutions to hybrid cloud environments, enabling a smoother digital transformation of our joint clients’ technology infrastructure. The market quickly recognized the value of this partnership, which is underscored by a deep relationship between AWS and Red Hat.

Clients now have the flexibility to choose the mix of technologies that best suits their business needs. By combining those technologies with IBM Consulting expertise, clients can scale solutions enterprise-wide. The result? Easier, faster cloud transformation and technology deployment, and more secure operations that help drive better business results.

The evolution of an industry-leading partnership


Clients are increasingly using software purchases through AWS Marketplace to make use of committed spending, accelerate time to value and simplify procurement. Through the recently enhanced IBM and AWS agreement, clients can benefit from improved, more flexible access to IBM automation, AI, security, sustainability and software-as-a-service (SaaS) offerings available in AWS Marketplace. With a few clicks, clients can get started using IBM API Connect, IBM Db2, IBM Observability by Instana APM, IBM Maximo Application Suite, IBM Security ReaQta, IBM Security Trusteer, IBM Security Verify and IBM Watson Orchestrate.

Another example of our AWS relationship is through IBM Consulting, one of the fastest-growing global systems integrators (GSIs) for AWS. IBM Consulting has more than 15,000 active AWS certifications, 200 global locations, 14 AWS competencies and 16 AWS service delivery program validations. IBM continues to invest in our AWS collaboration through our acquisitions of Taos and Nordcloud, which bring additional depth of expertise to help our clients manage and modernize their AWS environments. We also remain committed to helping our SAP clients running workloads on AWS, delivering on the complex hybrid cloud modernizations and regulated workload migrations our clients are navigating. We are proud to facilitate better business transformations for our clients heading toward a virtual enterprise.

We are making it easier for clients to get the most out of their mainframe investment. In May of this year, we launched five patterns to enable rapid innovation, offering a single integrated operating model, with common agile practices and interoperability of applications between the AWS Cloud and IBM zSystems running on-premises. This can significantly reduce talent gaps, allow for rapid innovation with an agile DevOps approach, make it easier to access applications and data without significant changes, and optimize the costs of running or extending applications. Together, this approach maximizes business agility and return on investment (ROI).

Meeting businesses where they are


Whether it is gaining faster, deeper insights with analytics, ensuring greater security, compliance and resiliency, or modernizing applications and development practices, IBM and AWS are helping create an environment of sustainable innovation and future-ready solutions with speed and scale for all our clients.

In 2020, the Rhode Island Department of Health needed help uncovering existing and emerging data patterns in response to the COVID-19 pandemic. Through the agency’s collaboration with IBM Consulting and AWS, its data team was able to provide deeper COVID-19 insights to the public within three weeks. These insights guided critical policy and operational decisions that allowed the department to keep the public informed throughout the crisis.

When the COVID-19 pandemic brought air travel to a halt, Finnair, a Finnish airline, needed a strategic approach to application modernization. The airline turned to IBM Consulting, Nordcloud and AWS to identify the right modernization paths for 70 applications, a program implemented in just seven months. The IBM solution provided an agile infrastructure, bolstering Finnair’s digital transformation and ensuring a better customer experience.

The power of partnership


No matter where you are on your journey, the IBM and AWS partnership can help you innovate and compete on a new level.

As one of the fastest-growing GSIs for AWS, we are also excited to share that AWS has recognized IBM Consulting with two important distinctions at this year’s AWS re:Invent: Global Innovation Partner of the Year and LatAm GSI Partner of the Year. We are proud to be essential partners to AWS and are eager to deliver for our clients’ businesses in 2023.

Source: ibm.com

Tuesday, 27 December 2022

IBM journey to more sustainable facilities: IBM as client zero


As IBM helps clients align sustainability goals with business objectives, we also leverage technology to achieve our own. IBM’s current environmental commitments include achieving net-zero greenhouse gas (GHG) emissions by 2030 and diverting 90% of nonhazardous waste (by weight) from landfill and incineration by 2025. IBM Global Real Estate (GRE) plays a critical role in meeting these targets.

Roughly 40% of annual global carbon dioxide emissions come from buildings. The UN Environment Programme states that if nothing is done, GHG emissions from buildings will more than double in the next 20 years.

To meet targets for GHG emissions, buildings and cities must shift from being part of the problem to being part of the solution. A survey from IBM and Morning Consult found that in the next two years, 62% of business leaders plan to invest in solutions that help manage assets, facilities and infrastructure to drive clean energy transition, efficient waste management and decarbonization.

Clients are looking for mature, real-world solutions that can shorten their path to sustainability, while helping improve the efficiency and performance of their operations. In this context, IBM GRE is leading a far-reaching transformation of nearly every facet of IBM’s real estate and asset management practices that demonstrates how to turn sustainability ambition into action.

Using IBM’s technology and expertise to capture data


Like other corporations, IBM is on a journey to not only reduce its GHG emissions, water consumption and waste, but also to track, analyze and report its progress toward carbon reduction goals.

For IBM, everything starts with data, so our first step was to implement solutions that capture the right data and embed the sustainability and environmental impacts of our day-to-day decisions into our operations. Two integral tools that help IBM manage its real estate operations more effectively are the IBM TRIRIGA® solution for facilities management and the IBM Maximo® solution for asset management.

Charged with sustainably managing over 50 million square feet of space across some 800 locations in 100 countries, IBM GRE had large potential to deliver change—and a large challenge. So IBM worked with process experts from IBM Consulting to create a plan for extracting and capturing sustainability and operational data automatically and in real time to enable better decision-making.

Our journey toward actionable insights


Once the plan was set, we focused on building a consistent base of data, a “single source of truth” for all underlying facilities and assets. That consideration led to two major developments to help put this foundation in place.

The first development was a close collaboration with the IBM Chief Data Office to build a new data governance program that aligns with IBM’s enterprise data standards and emphasizes data ownership. The second was GRE’s decision to use sustainability performance management software from Envizi (now an IBM company) to help consolidate sustainability data (including key elements from TRIRIGA and Maximo) into a single, auditable system of record.

The Envizi solution was selected based on its automation capabilities, its ease of integration with core systems like TRIRIGA and Maximo, and its ability to deliver dashboard-based insights that inform business strategy.

Embedding insights into everyday operating decisions


A consistent data baseline allowed us to embed insights into everyday operating decisions. For example, using Maximo for data center maintenance, GRE analyzes sensor data to detect and fix problems before scheduled maintenance would find them. This eliminates the extra energy consumption, waste and emissions that invisibly failing parts would cause. Maximo also enables technicians to skip unnecessary repairs and avoid incidental carbon impacts like travel and parts shipping.
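
A toy version of that kind of sensor-based check might look like the following, assuming a stream of temperature readings from a cooling unit; the readings, window size and z-score threshold are made up for illustration and are not how Maximo itself implements asset-health scoring.

```python
import statistics

def detect_anomaly(readings, window=20, threshold=3.0):
    """Flag the latest sensor reading if it deviates strongly from recent history.

    A simple z-score check over a sliding window; real asset-health models are
    far more sophisticated, this only sketches the idea.
    """
    history = readings[-(window + 1):-1]
    latest = readings[-1]
    mean = statistics.mean(history)
    stdev = statistics.pstdev(history) or 1e-9
    z = (latest - mean) / stdev
    return z > threshold, z

# Hypothetical bearing-temperature readings from a data center cooling unit.
temps = [41.8, 42.1, 41.9, 42.3, 42.0, 41.7, 42.2, 42.1, 41.9, 42.0,
         42.4, 42.1, 41.8, 42.0, 42.2, 41.9, 42.3, 42.0, 42.1, 41.9, 49.6]

flagged, score = detect_anomaly(temps)
if flagged:
    print(f"Raise a work order before scheduled maintenance (z-score {score:.1f})")
```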

This transformation is already paying off


As GRE’s transformation continues to unfold, it’s already achieving results. In 2021, IBM registered a 61.6% emission reduction against base year 2010, placing the company on track to meet its goal of a 65% reduction by 2025 (adjusted for acquisitions and divestitures). And with its recent investments, IBM is also improving its reporting capabilities. Replacing third-party tools with IBM’s Envizi ESG Suite has enabled a roughly 30% reduction in reporting costs.

Building a more sustainable organization requires strong partnerships and shared vision. At IBM, we lead by example and work with clients to turn that vision into a reality. IBM Global Real Estate is proving how it is done.

Source: ibm.com

Sunday, 25 December 2022

Save energy, decarbonize and transition to renewables while operationalizing sustainability


Recent political and climate-related events have disrupted energy sourcing, supply and costs. The resulting energy crisis affects every country, industry, sector and society across Europe. Combined with imminent reporting requirements from the European Commission, it makes saving and securing energy sustainably, and moving equitably to renewable energy sources, imperative.

The immediate energy crisis coincides with the equally crucial long-term journey to sustainability. A traditional management mindset could see this imperative as an onerous obligation that could cut into profitability. But evidence shows that sustainability efforts unlock value and innovation throughout the organization. Numerous reports cite a direct correlation between sustainable practices, share prices and business performance. According to a recent IBM Institute for Business Value (IBV) report, 83% of CEOs expect sustainability investments to produce improved business results in the next five years.

Meeting the needs of energy saving and security, regulatory compliance, cost savings and sustainability requires technological solutions paired with a reconsideration of business processes throughout the entire enterprise.

We see five key functional areas IT and operations executives can assess for transformation:

1. Asset and facility management


Integrated asset management helps organizations minimize the environmental impact of their operations across the asset lifecycle and extend asset life through predictive maintenance and condition monitoring. More accurate replacement planning can help companies minimize and consolidate technician visits, reducing resource use and saving energy.

Intelligent integrated workplace management software can help organizations infuse sustainability into their real estate and facilities management operations. Criteria for sustainability (such as ISO 14000) can easily be specified for space acquisition or leases, and for construction and renovation projects. Carbon footprints can be reduced through space optimization and planning, and by extending asset life through improved maintenance and assessment.

2. IT: infrastructure and code


The global power capacity of data centers has grown by 43% since 2019. Data centers can account for a large portion of an organization’s energy use, making environmentally sustainable computing more critical. This year’s generation of servers can reduce energy consumption by up to 75%, space by 50% and the carbon emissions footprint by more than 850 metric tons per year, compared to x86 servers under similar conditions.

Hybrid cloud is also a critical enabler of green IT, facilitating increased visibility, greater integration and enhanced capabilities across the cloud estate. Moreover, running workloads in a container platform instead of classically deployed virtual machine environments can reduce annual infrastructure costs by 75%, thanks in part to increased energy efficiency.

The choice of code matters too. Using the right language for the right workload can reduce computing power, and therefore energy usage. Switching from one programming language to another can reduce the energy consumption of an application by up to 50%.

3. Sustainable supply chain and circularity


Through intelligent workflows and automation, companies can reduce waste and improve circularity. Making shipping and routing more efficient and trackable reduces costs and warehouse space, and thus reduces the carbon footprint. It also helps organizations ensure provenance. With 80% of consumers saying sustainability is important to them, and 60% willing to change their purchasing habits based on environmental impact, such assurance can improve consumer loyalty.

Intelligent order management software also marries sustainability to customer satisfaction by allowing companies to consolidate shipments into fewer packages. Integration with carbon-accounting engines allows carbon savings to be shown to online shoppers as they make their shipping choices.

4. Transitioning to sustainable energy sources


Becoming a sustainable enterprise requires a rigorous strategy and roadmap. It encompasses all areas of the organization, right through to enabling customers, and includes ESG reporting and finance, climate risk assessment and adaptation, responsible computing and green IT, a circular supply chain, and decarbonization and clean energy transition.

Decarbonization and clean energy transition alone span many areas, including distributed energy, grid resilience, alternative fuels and emissions, flexible energy, new energy systems, mobile energy and low-carbon customers. Moving to low-carbon physical infrastructure and low-carbon energy markets, enabling EV platforms and building digital product businesses for consumers are among the tactics that can be deployed.

Few organizations have a complete portfolio of these capabilities. Working with partners who can co-create these strategies using proven, repeatable methods and extensive partner ecosystems can accelerate an organization’s journey to net zero.

5. ESG and risk management


ESG platforms can help organizations build a single system of record that delivers auditable, finance-grade ESG and sustainability data. They enable efficient monitoring of energy demand, consumption and emissions across organizations by capturing data from utility bills, interval meters and renewable assets. When combined with weather and facility information, they can provide a granular data foundation for use with enterprise asset management software, improving predictive insights. ESG platforms also enable organizations to meet increasing regulatory reporting requirements.
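
At its core, the data foundation described above turns metered consumption into emissions via emission factors (emissions = consumption × factor). The sketch below shows that calculation with hypothetical readings and factors; real factors depend on the grid, fuel and reporting year.

```python
# Illustrative emission factors (kg CO2e per unit); real factors vary by grid and year.
EMISSION_FACTORS = {
    "electricity_kwh": 0.4,   # kg CO2e per kWh (assumed grid average)
    "natural_gas_m3": 1.9,    # kg CO2e per cubic metre
}

# Meter readings captured from utility bills or interval meters (hypothetical).
readings = [
    {"site": "HQ",     "meter": "electricity_kwh", "value": 120_000},
    {"site": "HQ",     "meter": "natural_gas_m3",  "value": 8_500},
    {"site": "Plant1", "meter": "electricity_kwh", "value": 640_000},
]

def site_emissions(readings):
    """Aggregate consumption into kg CO2e per site using emission factors."""
    totals = {}
    for r in readings:
        kg = r["value"] * EMISSION_FACTORS[r["meter"]]
        totals[r["site"]] = totals.get(r["site"], 0.0) + kg
    return totals

print(site_emissions(readings))  # e.g. {'HQ': 64150.0, 'Plant1': 256000.0}
```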

Effective use of these platforms can help corporations take the right actions at the right time, reducing corporate risk and the financial and environmental impact of disasters.

Embodying sustainability


IBM leads the way for clients because we embody innovation in sustainability within our organization. We have set precedents for environmental and social commitments for over 50 years, starting with our first corporate policy on environmental affairs, enacted in 1971. Our 21 current environmental commitments include achieving net-zero GHG emissions by 2030 and diverting 90% of nonhazardous waste (by weight) from landfill and incineration by 2025, which we report on annually in our ESG Report, IBM Impact. We also promote environmental justice programs such as the IBM Sustainability Accelerator and Call For Code, enabling organizations and communities to tackle environmental issues.

Proven methods, comprehensive solutions


IBM helps customers save and secure energy sustainably while complying with increasing regulatory demands. We operationalize sustainability end-to-end with data-driven innovation through a comprehensive and growing portfolio of industry-leading consulting and technology capabilities. With an ecosystem of partners, we co-create solutions to uncover new opportunities while finding cost efficiencies, without trade-offs or compromising profitability.

Source: ibm.com

Saturday, 24 December 2022

How data, AI and automation can transform the enterprise


Today’s data leaders are expected to make organizations run more efficiently, improve business value, and foster innovation. Their role has expanded from providing business intelligence to management, to ensuring high-quality data is accessible and useful across the enterprise. In other words, they must ensure that data strategy aligns to business strategy. Only from this foundation can data leaders foster a data-driven culture, where the entire organization is empowered to take advantage of automation and AI technologies to improve ROI. These areas can transform the enterprise, from cost savings to revenue growth to opening new business opportunities.

Building the foundation: data architecture


Collecting, organizing, managing and storing data is a complex challenge. A fit-for-purpose data architecture underpins effective data-driven organizations. Driven by business requirements, it establishes how data flows through the ecosystem, from collection to processing to consumption. Modern cloud-based data architectures support high availability, scalability and portability; intelligent workflows, analytics and real-time integration; and connection to legacy applications via standard APIs. Your choice of data architecture can have a huge impact on your organization’s revenue and efficiency, and the costs of getting it wrong can be substantial.

The right data architecture can allow organizations to balance cost and simplicity and reduce data storage expenses, while making it easy for data scientists and line-of-business users to access trusted data. It can help eliminate silos and integrate complex combinations of enterprise systems and applications to take advantage of existing and planned investments. And to increase return on AI and automation investments, organizations should consider the automated processes, methodologies and tools that manage the use of AI through AI governance.

Taking advantage of automation for LOB and IT activities


You can use data to completely digitize your organization with automation and AI. The challenge is bringing it all together and implementing it across lines of business and IT.

For line-of-business functions, here are six key capabilities to consider:

1. Process mining to identify the best candidates for automation and scale your automation initiatives before investments are carried out

2. Robotic process automation (RPA) to automate manual, time-consuming tasks

3. A workflow engine to automate digital workflows

4. Operational decision management to analyze, automate and govern rules-based business decisions (see the sketch after this list)

5. Content management to manage the growing volume of enterprise content that’s required to run your business and support decisions

6. Document processing to read your documents, extract data, and refine and store the data for use
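
As a concrete and deliberately simplified example of the operational decision management capability in item 4, the sketch below evaluates externalized, rules-based order decisions in priority order; the rules, thresholds and field names are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class Order:
    amount: float
    customer_tier: str
    country: str

# Externalized business rules (hypothetical): each rule returns a decision or None.
RULES = [
    lambda o: "manual_review" if o.amount > 10_000 else None,
    lambda o: "auto_approve" if o.customer_tier == "gold" and o.amount <= 10_000 else None,
    lambda o: "auto_approve" if o.amount <= 500 else None,
]

def decide(order: Order, default: str = "standard_review") -> str:
    """Evaluate rules in priority order; the first match wins."""
    for rule in RULES:
        decision = rule(order)
        if decision:
            return decision
    return default

print(decide(Order(amount=250, customer_tier="silver", country="DE")))   # auto_approve
print(decide(Order(amount=14_000, customer_tier="gold", country="US")))  # manual_review
```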

Looking at the digitization of IT, here are three capability areas to evaluate:

1. Enterprise observability to improve application performance monitoring and accelerate CI/CD pipelines

2. Application resource management to proactively deliver the most efficient compute, storage, and network resources to your applications

3. AI to proactively identify potential risks or outage warning signs across IT environments

Help increase ROI on data, AI and automation investments by making data and AI ethics a part of your culture


Process and people can’t be ignored, either. If you don’t properly infuse AI into a major process in an organization, there may be no real impact. You should consider infusing AI into supply chain procurement, marketing, sales and finance processes, and adapt those processes accordingly. And since people run the processes, data literacy is pivotal to data-driven organizations, so that people can both take advantage of and challenge the insights an AI system provides. If data users don’t agree with or understand how to interpret their options, they might not follow the process. The risk is particularly high when you consider the implications for cultivating a culture of data and AI ethics and for complying with data privacy standards.

Building a data-driven organization is a multifaceted undertaking spanning IT, leadership and line-of-business functions. But the dividends are unmistakable. It sets the stage for enterprise-wide automation across lines of business and IT. It can give organizations a competitive edge in quickly identifying opportunities for cost savings and growth, and can even unlock new business models.

Source: ibm.com

Thursday, 22 December 2022

The importance of governance: What we’re learning from AI advances in 2022


Over the last week, millions of people around the world have interacted with OpenAI’s ChatGPT, which represents a significant advance for generative artificial intelligence (AI) and the foundation models that underpin many of these use cases. It’s a fitting way to end what has been another big year for the industry.

We’re at an exciting inflection point for AI. Adoption of AI among businesses is increasing, and research and development on foundation models is enabling use cases like generative AI to become even more sophisticated and powerful. The potential is vast: AI can help us leverage significant amounts of data to design and discover new solutions to business and societal problems in areas such as sustainability, life sciences, customer care and employee experience.

These advances are simultaneously raising separate but important discussions and questions within the industry: How can you trust the algorithms and outputs of these models? How can we ensure these models are being used responsibly? For example, generative AI models can produce highly believable, well-structured responses, so without the right subject matter expertise it can be hard to immediately pinpoint an incorrect answer.

This is a dialogue that IBM is engaging in with our clients and partners every day. Advances across AI technology are happening quickly. At the same time, governments around the world are continuously evaluating and implementing new AI guidelines and AI regulation frameworks. We think that businesses have an opportunity to act now to put guardrails in place internally to govern how AI is developed and deployed.

Scaling the use of responsible AI requires AI governance, the process of defining policies and establishing accountability throughout the AI lifecycle. It also helps models adhere to principles of fairness, explainability, robustness, transparency and privacy. A comprehensive AI governance strategy encompasses people, process and technology.

Organizational AI governance processes help to decide when, where and how to use AI across the business and establish policies based on corporate values, ethical principles, regulations and laws. At IBM, we have an AI Ethics Board that supports a centralized governance, review, and decision-making process for IBM ethics policies, practices, communications, research, products and services.

AI governance technology can help implement guardrails at each stage of the AI/ML lifecycle. This includes data collection, instrumenting processes and transparent reporting to make needed information available to stakeholders. We recently launched IBM AI Governance, a solution designed to help companies better understand what’s going on below the surface of these systems. IBM AI Governance helps businesses develop a consistent, transparent model management process, capturing model development time and metadata and supporting post-deployment model monitoring and customized workflows. IBM has also developed and open sourced a set of Trusted AI toolkits, including AI Fairness 360, Adversarial Robustness 360, AI Explainability 360, Uncertainty Quantification 360 and AI FactSheets 360.
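
As a small example of what those open-source toolkits enable, the sketch below uses AI Fairness 360 to compute two common group-fairness metrics on a toy dataset; the data, the choice of protected attribute and the 0.8 rule-of-thumb threshold are illustrative assumptions.

```python
# pip install aif360 pandas
import pandas as pd
from aif360.datasets import BinaryLabelDataset
from aif360.metrics import BinaryLabelDatasetMetric

# Toy loan-approval data; "sex" is the protected attribute (1 = privileged group).
df = pd.DataFrame({
    "sex":   [1, 1, 1, 0, 0, 0, 1, 0],
    "label": [1, 1, 0, 1, 0, 0, 1, 0],  # 1 = favorable outcome (approved)
})

dataset = BinaryLabelDataset(
    df=df,
    label_names=["label"],
    protected_attribute_names=["sex"],
    favorable_label=1,
    unfavorable_label=0,
)

metric = BinaryLabelDatasetMetric(
    dataset,
    privileged_groups=[{"sex": 1}],
    unprivileged_groups=[{"sex": 0}],
)

# Disparate impact below ~0.8 is a common rule-of-thumb flag for potential bias.
print("Disparate impact:", metric.disparate_impact())
print("Statistical parity difference:", metric.statistical_parity_difference())
```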

In addition to discussions on the importance of AI governance, many of the advances across the industry reinforce IBM’s focus on the unique needs of AI for business. Our clients want AI that is designed for and managed by their subject matter experts, and that can be easily customized based on their domain and business priorities; that is robust with high accuracy and reliability; that operates in and navigates through siloed data in complex formats; and that is guided by principles of trust and transparency. This focus on AI for business is what guides the development of our AI software like IBM Watson Assistant and IBM Watson Discovery.

2022 has been another big year for AI with increasing adoption across the industry as well as promising new advancements. We believe that businesses that embrace AI governance early will be better positioned to responsibly harness this technology now and in the future.

Source: ibm.com

Tuesday, 20 December 2022

2023 Look ahead: ESG policies are good not just for the planet, but for business


It’s that time of year when we all take a moment to reflect on the past twelve months and, importantly, begin to chart the new year ahead. Here at IBM, our focus continues to be how we turn ambition into action—in sustainability, ESG and beyond.

Over the next year, we expect to see businesses across the globe advance important sustainability policies and practices. In fact, according to a recent Morning Consult poll commissioned by IBM, 61% of global business leaders said that their business will be investing in sustainability programs and initiatives to meet their ESG goals in the next 12 months. Interestingly, the poll also found that among the top sustainability investments planned are energy-efficient computing and IT, with 30% of companies planning to invest in green IT solutions.

As companies increase investments and strengthen their sustainability practices in an effort to transform their operations, they are wise to harness the power of technology to meet their ESG goals efficiently and make a profound impact on our planet.

That’s exactly what we aim to do at IBM. Through groundbreaking technology, including hybrid cloud and AI, IBM is not only supporting global companies in building successful ESG programs but is also consistently reaching its own ESG goals by investing in critical sustainability initiatives. Here are some examples:

IBM Sustainability Accelerator


This year IBM launched its Sustainability Accelerator, a pro bono social impact program that applies IBM technologies, such as hybrid cloud and AI, and an ecosystem of experts to enhance and scale non-profit and government organization interventions, helping populations vulnerable to environmental threats including climate change, extreme weather, and pollution. With the launch, the company announced the first group of partners to work on sustainable agriculture, including Plan 21 Foundation, The Nature Conservancy India, Heifer International, Texas A&M AgriLife and Deltares.

The second cohort of the IBM Sustainability Accelerator was announced during COP27, with five organizations focused on clean energy projects selected to receive end-to-end support and a technical roadmap to scale long-term impacts and drive key societal outcomes. The local and regional organizations selected were the United Nations Development Programme (UNDP), Sustainable Energy for All, the Miyakojima City Government, the Environment Without Borders Foundation and Net Zero Atlantic.

IBM will work closely with the organizations to use sustainability technology to effectively scale positive climate outcomes.

Tech Solutions for Sustainability


This fall, IBM Global Real Estate won multiple awards for its cutting-edge sustainability initiatives, which included using sustainability software to embed insights into daily operating decisions for more sustainable facilities management. IBM reduced its carbon emissions by nearly 62% from 2010 to 2021 and cut sustainability reporting costs by 30% by replacing multiple tools with a single, automated platform.

As a large global company, IBM has over 50 million square feet of space under management in 100 countries, so improving the sustainability of those facilities is crucial to meeting our commitments, including achieving net-zero GHG emissions by 2030 and diverting 90% of nonhazardous waste (by weight) from landfill and incineration by 2025.

These are just some examples of IBM harnessing the power of data and good tech to make a lasting, positive impact in the communities where we work and live.

A Shifting Culture


Through powerful data and tech, businesses are developing robust sustainability initiatives to meet their ESG goals and strengthen supply chain operations, saving money and reducing carbon emissions in the process.

IBM believes that having strong ESG principles and practices, including working towards sustainability goals, is good for business as well as the planet. Business leaders are paying attention to this, and IBM aspires to lead the way.

Source: ibm.com

Sunday, 18 December 2022

Successful collaboration with DAI Source and IBM helped PortX pioneer cloud-native connectivity to the Federal Reserve


Thanks to digital transformation, banking looks nothing like it did a generation ago. In fewer than 30 years, customers have gone from queuing up in a physical bank during lunchtime and weekends, to transferring money and cashing checks via mobile phones at any hour, virtually anywhere on the planet.

All that technological infrastructure is not easy to maintain. The global fintech market was valued at USD 112.5 billion in 2021, and researchers predict it will reach USD 332.5 billion by 2028. The globe-spanning IT infrastructure that makes it all run is a complex web of digital connections and payment networks talking to each other—or trying to. About 2 billion people are excluded from the global financial system because they lack the technology that would allow their payment network to join the conversation. Banks need skilled IT practitioners to create platforms that facilitate these interactions between financial institutions. Technology companies like PortX do precisely that.

PortX is a Seattle-based infrastructure and integration technology company focused on open-source banking solutions that give community financial institutions (CFIs) access to global economic systems. Its PortX offering, an integration platform as a service (iPaaS), simplifies connectivity between banks and credit unions and the new wave of fintechs and real-time payment networks that shape global finance.

To offer better service to CFI clients, PortX needed to expand its capacity as a Federal Reserve FedLine Direct service provider. FedLine Direct provides access to critical payment services via a highly secure computer-to-computer link to Federal Reserve Financial Services. The expansion would facilitate digital transformation for its clients, who would then be able to compete with big banks by providing services that are cheaper, faster and easier to use.

PortX needed to upgrade its queuing middleware to accomplish this expansion and decided to use the IBM MQ messaging application. MQ provides asynchronous messaging for applications that need to communicate but don’t need to be online continuously. MQ allows applications to run at different speeds and handles transactions, communication and security so clients can focus less on maintaining technology and more on adding business value.
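
For a sense of what asynchronous messaging looks like in practice, here is a minimal sketch using the open-source pymqi client to put and get a message on an IBM MQ queue. The queue manager, channel, host and queue names are placeholders, and production code would add error handling, TLS and credentials.

```python
import pymqi

# Connection details are placeholders for illustration.
queue_manager = "QM1"
channel = "DEV.APP.SVRCONN"
conn_info = "mq.example.com(1414)"
queue_name = "PAYMENTS.REQUEST"

qmgr = pymqi.connect(queue_manager, channel, conn_info)

# Producer: put a payment message without waiting for the consumer to be online.
queue = pymqi.Queue(qmgr, queue_name)
queue.put(b'{"type": "wire", "amount": "1250.00", "currency": "USD"}')

# Consumer: read the message whenever the downstream service is ready.
msg = queue.get()
print(msg)

queue.close()
qmgr.disconnect()
```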

But the PortX team needed to ensure the adoption didn’t disrupt its IT processes, affect clients, drain existing technical skillsets or increase the company’s overall cost of delivering services.

Finding success with IBM Platinum Partner DAI Source


With a tight deadline looming and an extremely complex technology adoption underway, PortX leaders were getting nervous. Failure to complete the expanded Federal Reserve connection threatened to delay the project indefinitely.

PortX needed to host IBM MQ in a cloud-native, highly available state, an approach consistent with its cloud-native operational model. Fortunately, IBM recently announced Cloud Pak for Integration, which solved some of PortX’s hosting challenges.

But the PortX team was still unfamiliar with running IBM MQ as a Docker container in Kubernetes, and the details of configuring the queue manager correctly to provide the necessary levels of availability, security and runtime isolation eluded the team. They needed to get it right and fast.

The PortX team brought in IBM Platinum Business Partner DAI Source to help navigate the IBM ecosystem, ensuring that its developers found and used essential resources from IBM’s product team. The collaboration resulted in a solution that leveraged IBM Cloud Pak for Integration on Red Hat OpenShift Service on AWS (ROSA).

That solution: PortX deployed the MQ capability via the operator in Cloud Pak for Integration, with the MQ NativeHA architecture for production and MQ’s single resilient queue manager for non-production environments. They also implemented the Flux operator on Red Hat OpenShift for GitOps-based continuous deployment, demonstrating the flexibility of Cloud Pak for Integration, given that Argo CD was IBM’s default GitOps option.
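
Declaratively, a NativeHA queue manager on Cloud Pak for Integration is requested through the MQ operator's QueueManager custom resource. The sketch below creates such a resource with the Kubernetes Python client; the namespace, names, version and license values are placeholders, and the exact spec fields should be checked against the operator documentation for the release in use.

```python
from kubernetes import client, config

config.load_kube_config()  # or load_incluster_config() when running inside the cluster
api = client.CustomObjectsApi()

# Illustrative QueueManager resource requesting the NativeHA deployment pattern.
queue_manager = {
    "apiVersion": "mq.ibm.com/v1beta1",
    "kind": "QueueManager",
    "metadata": {"name": "payments-qm", "namespace": "mq"},
    "spec": {
        "license": {"accept": True, "license": "<license-id>", "use": "Production"},
        "queueManager": {
            "name": "QM1",
            "availability": {"type": "NativeHA"},  # three replicas with replicated data
        },
        "version": "<mq-version>",
    },
}

api.create_namespaced_custom_object(
    group="mq.ibm.com",
    version="v1beta1",
    namespace="mq",
    plural="queuemanagers",
    body=queue_manager,
)
```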

It worked. The solution integrated seamlessly, allowing FedLine Direct wire payments with the Federal Reserve. PortX is currently developing the next set of requirements to enable additional Federal Reserve integration services, such as ACH and FedNow. With Cloud Pak for Integration, PortX expanded its fintech offerings and broadened the range of services for its customers.

The result: new business growth and continuous product improvement


By delivering these new capabilities quickly and effectively, PortX was able to strategically acquire new customers and establish a long-term business trajectory. The capabilities give the company confidence that it can serve a new tier of high-value customers, accelerating its growth into the future. In addition, through the successful collaboration with DAI Source, PortX has renewed confidence that IBM is invested in its clients’ success.

Additionally, this project empowered IBM and critical Ecosystem Business Partners like DAI Source to establish ongoing mind share with their clients. For example, PortX is now part of IBM’s early beta program, which will allow the company to help define the next generation of MQ and guide its development roadmap priorities to ensure the future success of other MQ users. Partnering with IBM resulted in new business growth and continuous product development for PortX.

Source: ibm.com

Saturday, 17 December 2022

Five benefits of a data catalog


Imagine walking into the largest library you’ve ever seen. You have a specific book in mind, but you have no idea where to find it. Fortunately, the library has a computer at the front desk you can use to search its entire inventory by title, author, genre, and more. You enter the title of the book into the computer and the library’s digital inventory system tells you the exact section and aisle where the book is located. So, instead of wandering the aisles in hopes you’ll stumble across the book, you can walk straight to it and get the information you want much faster.

An enterprise data catalog does all that a library inventory system does – namely, streamlining data discovery and access across data sources – and a lot more. For example, data catalogs have evolved to deliver governance capabilities like managing data quality, data privacy and compliance. A data catalog uses metadata and data management tools to organize all the data assets within your organization. It synthesizes information across your data ecosystem—from data lakes, data warehouses and other data repositories—to empower authorized users to search for and access business-ready data for their projects and initiatives. It also serves as a governance tool to drive compliance with data privacy and industry regulations. In other words, a data catalog makes the use of data for insight generation far more efficient across the organization, while helping mitigate the risk of regulatory violations.
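
As a stripped-down illustration of the idea (not how any particular catalog product is implemented), the sketch below models catalog entries as metadata records and searches them by name, description or tag; all names and fields are hypothetical.

```python
from dataclasses import dataclass, field

@dataclass
class CatalogEntry:
    name: str
    description: str
    owner: str
    tags: list = field(default_factory=list)
    source: str = ""

catalog = [
    CatalogEntry("retail_sales_2022", "Daily sales by product line", "sales-analytics",
                 tags=["sales", "shoes", "jewelry"], source="warehouse.sales.daily"),
    CatalogEntry("customer_profiles", "Masked customer master data", "cdo-office",
                 tags=["customer", "pii-masked"], source="lake.crm.profiles"),
]

def search(query: str):
    """Return entries whose name, description or tags mention the query term."""
    q = query.lower()
    return [e for e in catalog
            if q in e.name.lower() or q in e.description.lower()
            or any(q in t for t in e.tags)]

for entry in search("jewelry"):
    print(entry.name, "->", entry.source)
```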

For example, imagine business analyst Alex is working on a data analytics project to help her retail company better quantify the success of shoe sales versus jewelry sales. She also wants to predict future sales of both shoes and jewelry. Since her company doesn’t have a data catalog, Alex must first communicate with the shoe line-of-business and the jewelry line-of-business departments to ask what data she needs to conduct her analysis. Next, she submits a request form for each dataset she thinks will be most helpful, then waits while the IT team completes her request. Weeks pass by until the IT team locates and masks the data. Once Alex finally has the information she requested, she still must make sense of it before she can use it. Before she knows it, four weeks have passed from the time she requested the data until the time she has the data in her possession and in a usable form. This is anything but efficient and practical. Thankfully, a data catalog can help.

Let’s look at five benefits of an enterprise data catalog and how they make Alex’s workflow more efficient and her data-driven analysis more informed and relevant.

1. Speed and self-service


A data catalog replaces tedious request and data-wrangling processes with a fast and seamless user experience to manage and access data products. If Alex’s company had an enterprise data catalog in place, she wouldn’t have to submit requests to multiple departments to get the data she needs. Instead, she could simply search the data catalog and access the required information in minutes. So, Alex and other business analysts could complete their projects faster. Meanwhile, the company’s IT teams could optimize their time by focusing on other important workloads.

2. Comprehensive search and access to relevant data


Because Alex can use a data catalog to search all data assets across the company, she has access to the most relevant and up-to-date information. She can search structured or unstructured data, visualizations and dashboards, machine learning models, and database connections. Conversely, without a data catalog, Alex has no guarantee that the data she’s using is complete, accurate, or even relevant. After all, Alex may not be aware of all the data available to her. With a data catalog, Alex can discover data assets she may have never found otherwise.

3. Meaningful business context


An enterprise data catalog automates the process of contextualizing data assets by using:

◉ Business metadata to describe an asset’s content and purpose
◉ Technical metadata to describe schemas, indexes and other database objects
◉ A business glossary to explain the business terms used within a data asset

With this detailed level of intelligence about the data, Alex can view details regarding data lineage and data structure alongside comments from other data users about what each dataset contains. This context helps Alex quickly gauge how useful a particular data asset will be for her analysis. As most enterprise data catalogs allow for curation of metadata, data assets become easier to find, trust and use.

4. Improved trust and confidence in data


As Alex searches the data catalog to gather necessary information, she can preview datasets and their profiles to see if important fields have null or incorrect values. Ensuring data quality is made easier as a result. And because data assets within the catalog have quality scores and social recommendations, Alex has greater trust and confidence in the data she’s using for her decision-making recommendations. This is especially helpful when handling massive amounts of big data.

5. Protected and compliant data


A data catalog, when tightly integrated with the company’s data governance platform, helps an organization comply with changing regulations and policies while ensuring fast data access and maintaining appropriate data privacy. Rules can be created that anonymize or restrict access to certain data assets throughout their lifecycle, so that personally identifiable information (PII) and other sensitive data don’t end up in the wrong hands.

For Alex, this means she won’t have to wait for weeks while the IT team masks columns that contain sensitive information. Instead, governance rules automate which data is viewable and accessible based on permissions and policies. Alex gets the information she needs while the organization protects data from being accessed by unauthorized users or moved to less secure, non-compliant environments.
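
Conceptually, such governance rules map a column's classification and the requester's role to a masking decision. The sketch below shows a toy version of that flow; the policy table, roles and tokenization approach are illustrative assumptions rather than the behavior of any specific governance product.

```python
import hashlib

# Governance policy (hypothetical): which columns are sensitive and who may see them raw.
POLICY = {
    "email": {"classification": "PII", "allowed_roles": {"data_steward"}},
    "revenue": {"classification": "business", "allowed_roles": {"data_steward", "analyst"}},
}

def mask(value: str) -> str:
    """Irreversibly tokenize a value so it stays joinable but unreadable."""
    return hashlib.sha256(value.encode()).hexdigest()[:12]

def apply_policy(row: dict, role: str) -> dict:
    """Return the row with sensitive columns masked for roles not on the allow list."""
    out = {}
    for column, value in row.items():
        rule = POLICY.get(column)
        if rule and role not in rule["allowed_roles"]:
            out[column] = mask(str(value))
        else:
            out[column] = value
    return out

record = {"email": "alex@example.com", "revenue": 1250.75}
print(apply_policy(record, role="analyst"))       # email masked, revenue visible
print(apply_policy(record, role="data_steward"))  # full record
```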


Why IBM Watson Knowledge Catalog?


IBM Watson Knowledge Catalog on IBM Cloud Pak for Data offers integrated data cataloging and data governance capabilities powered by active metadata. It facilitates advanced data discovery, automated data quality, data governance, data lineage and data protection across a hybrid, distributed data landscape, enabling discovery of and access to the right data for insights and compliance.

Gartner calls out IBM’s innovation in metadata and AI-/ML-driven automation in Watson Knowledge Catalog on Cloud Pak for Data, along with fully integrated quality and governance capabilities, as key differentiators that make IBM a leading vendor in competitive evaluations.

Watson Knowledge Catalog has numerous use cases. It helps data stewards enable intelligent curation and delivery of trusted, high-quality data to data consumers in a self-service manner, accelerating insight generation, compliance and data quality management. It simplifies policy management and enables organizations to comply with data privacy and industry regulations while ensuring that sensitive and confidential information is protected from unauthorized access. The solution also helps with data quality management by assigning data quality scores to assets and simplifies curation with AI-driven data quality rules. It seamlessly integrates with IBM’s data integration, data observability and data virtualization products, as well as with other IBM technologies that analysts and data scientists use to create business intelligence reports, conduct analyses and build AI models.

Data professionals such as data engineers, data scientists, data analysts and data stewards benefit from these self-service data catalog tools that allow for self-service analytics, data discovery, and metadata management. AI recommendations and robust search methods with the power of natural language processing and semantic search help locate the right data for projects. Data engineers can build trusted data pipelines without having to wait on IT teams to make data accessible.

When it comes to deploying IBM Watson Knowledge Catalog, organizations can do so wherever their data resides—be it on-premises or in cloud environments.

With IBM Watson Knowledge Catalog, Alex would’ve found out that jewelry is way more profitable than shoes in the same amount of time it took her to submit data requests to the departments. She then would have had another month to predict buying trends in other lines of business. Finally, her company’s IT department would have had more time to finish their data projects as it would have been less distracted by data requests. Everybody wins with a data catalog.

Source: ibm.com

Tuesday, 13 December 2022

Three keys to maximize the impact of supply chain business process operations


Achieving transformational business outcomes is about applying the deepest capabilities and talents. Enterprises almost always look to external partners for consulting and technology acumen to accelerate and de-risk supply chain transformation journeys. But too often transformations fail to deliver on their full vision and business case. Frequently that’s because the “last mile” of the transformation—business process operations—didn’t get fully activated. To address this, sophisticated transformation leaders are picking consulting partners who can deliver from advisory to operations, bringing deep operational expertise and capacity to truly forge a trusted partnership for the journey.

Consumers today expect a global supply chain to operate seamlessly, transparently and without fail. Supply chain leaders are yearning for real-time insights and a skilled workforce to design, build and run processes that automate and scale their operations and improve customer satisfaction.

However, supply chain talent is both disrupted and scarce. A generational shift is under way. Senior leaders are retiring at an accelerated pace. A new generation of early-career hires seeks to work with data, AI and best-in-class SaaS solutions—not Excel and transactional ERP systems. It has never been a more challenging time to attract and retain the best talent.

It’s time to stop looking at supply chain BPO providers as lower-cost full-time equivalent (FTE) capacity, and instead look to them as strategic sources of the diverse talent needed to navigate an increasingly complex operating environment. Supply chains can be more than a cost to minimize; they now offer an opportunity to fuel strategic reinvention. The next generation of supply chains will do more than efficiently move material from one place to another; they’ll model and underpin resilient, agile and sustainable business operations.

Here are three keys to maximize the impact of supply chain business process operations:

1. Don’t try to excel at operating both the “old” and the “new” supply chains


Excellence in supply chain execution is all about the discipline of staying focused on the operations and metrics that drive the right outcomes. However, one of the biggest challenges in today’s complex world is the immense number of business model pivots that directly impact supply chains.

As an example, most auto makers currently need to grow and manage their internal combustion vehicle supply chains, while simultaneously standing up a whole new supply chain and supplier network for battery electric vehicles.

Additionally, many companies who previously built and shipped products as a one-time sale are now shifting to a product-as-a-service model, which requires a supply chain built to fulfill, maintain, upgrade, repair and redeploy products over a long lifecycle.

It’s nearly impossible for most supply chain organizations to excel at the “new” and the “old” at the same time. Leveraging external partners to provide capacity to accelerate the new, or to automate and take cost out of legacy operations, is essential to success. External partners can provide a better understanding of real-time operations, helping you prevent problems and respond with agility while supporting new business models.

2. Rethink the skills, capabilities, and sources of talent for supply chain success


Supply chain leaders face talent and human capacity challenges in just about every area of the business today. A recent IBM survey of chief supply chain officers (CSCOs) highlighted sentiment that as much as 20% of workforce capacity has “vanished” during pandemic disruptions. Automation used to be viewed as a threat to supply chain workforces. Now most CSCOs view the creation of intelligent workflows and automation as essential to closing the labor gap.

Bringing these new solutions to life will require enormously different skillsets, including facility with data science, visualizations, predictive analytics, machine learning and AI. Forward-thinking supply chain leaders are looking to BPO partners not only for transactional excellence, but also these next-generation skills and acumen combined with new levels of diversity and inclusiveness to co-create the talent bases needed to win in a new world.

3. De-risk your success by committing an operating partner to your outcomes


If there’s one critical mindset change emerging from the recent years of disruption, it’s that the remit of the supply chain goes beyond cost: it is critical to delivering enterprise business outcomes and a differentiated customer experience that powers growth. When you couple those new imperatives with the continuing day-to-day challenges reverberating through supply chains, even the best supply chain leaders find themselves needing more partners who share a commitment to their priorities and objectives. The right supply chain BPO partners will become part of the fabric of your business and measure their success by the outcomes they help achieve along the way.

Unlock a powerful partnership for the journey with IBM. Our practitioners bring the latest technology in AI, automation, hybrid cloud and digitization, along with experience navigating the kind of digital change that crosses lines of business and transforms an enterprise.

Source: ibm.com

Saturday, 10 December 2022

Maximize your data dividends with active metadata


Metadata management plays a critical role within the modern data management stack. It helps break down data silos and empowers data and analytics teams to better understand the context and quality of data. This, in turn, builds trust in data and the decision-making to follow. However, as data volumes continue to grow, manual approaches to metadata management are sub-optimal and can result in missed opportunities. Suppose that a new data asset becomes available but remains hidden from your data consumers because of improper or inadequate tagging. How do you keep pace with growing data volumes and increased demand from data consumers and deliver real-time data governance for trusted outcomes?

It is imperative to evolve metadata management approaches to keep pace with the proliferation of enterprise data. This puts into perspective the role of active metadata management. According to Gartner, active metadata management includes a set of capabilities that enable continuous access and processing of metadata.

What is active metadata management?


Active metadata management uses machine learning to automate metadata processing and uses the outcomes of that metadata analysis to help drive decisions through recommendations, alerts and more. In short, active metadata management makes data more actionable in real time. It includes a set of capabilities that facilitate automated data discovery, improve confidence in data, and enable data protection and data governance at scale.
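
To make that loop concrete, here is a minimal sketch in Python of what "active" metadata can mean in practice: a re-profiling job updates a catalog entry, and the profiling outcome itself triggers an alert. The CatalogEntry class, the 10% completeness threshold and the function name are illustrative assumptions for the example, not any particular product's API.

```python
from dataclasses import dataclass, field

@dataclass
class CatalogEntry:
    """Technical metadata the catalog keeps 'active' for one column (illustrative)."""
    null_ratio: float = 0.0
    distinct_count: int = 0
    alerts: list = field(default_factory=list)

def refresh_column_metadata(entry: CatalogEntry, values: list) -> CatalogEntry:
    """Re-profile a column, update its stored metadata and emit an alert on drift."""
    non_null = [v for v in values if v is not None]
    new_null_ratio = 1 - len(non_null) / len(values) if values else 0.0
    # The 'active' part: a profiling outcome triggers a downstream action.
    if new_null_ratio - entry.null_ratio > 0.10:
        entry.alerts.append(f"Completeness dropped: {new_null_ratio:.0%} null values")
    entry.null_ratio = new_null_ratio
    entry.distinct_count = len(set(non_null))
    return entry

# A scheduled re-profiling run raises the alert automatically.
entry = refresh_column_metadata(CatalogEntry(null_ratio=0.02), ["a", None, None, "b", None])
print(entry.alerts)  # ['Completeness dropped: 60% null values']
```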

Common use cases for active metadata management


Improve data discovery

Research shows that up to 68% of data is not analyzed in most organizations. Knowing what data assets are available across the enterprise is key to improving data utilization. You can enable advanced data discovery with AI-driven recommendation engines that analyze active metadata and recommend new assets to data consumers based on their usage patterns.
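
As an illustration only, the sketch below recommends catalog assets from a toy usage log by counting which assets co-occur with the ones a user already works with. The usage_log structure and recommend_assets helper are hypothetical; a production recommendation engine would draw on far richer usage signals.

```python
from collections import Counter

# Hypothetical usage log: which catalog assets each user accessed recently.
usage_log = {
    "ana":   ["sales_orders", "customer_master", "returns"],
    "bruno": ["sales_orders", "customer_master"],
    "chen":  ["customer_master", "churn_scores"],
}

def recommend_assets(user: str, top_n: int = 2) -> list:
    """Recommend assets that co-occur with the ones a user already works with."""
    mine = set(usage_log[user])
    co_counts = Counter()
    for assets in usage_log.values():
        others = set(assets)
        if others & mine:                     # a user with overlapping interests
            co_counts.update(others - mine)   # count the assets I haven't used yet
    return [asset for asset, _ in co_counts.most_common(top_n)]

print(recommend_assets("bruno"))  # ['returns', 'churn_scores']
```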

Provide early indicators of data quality

Poor data quality is a barrier faced by organizations aspiring to be data-driven. Most data quality management approaches are reactive, triggered only when consumers complain to data teams about the integrity of datasets. Active metadata management can help with proactive data quality management. Data observability capabilities help deliver trustworthy data by detecting anomalies in data pipelines, allowing IT teams to quickly surface and resolve issues before they impact the business.
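
One simple way to picture this kind of observability check is a statistical test on pipeline volumes. The sketch below flags a daily load whose row count deviates sharply from recent history; the three-sigma threshold and the is_anomalous helper are assumptions for the example, not a description of any specific product's detection logic.

```python
import statistics

def is_anomalous(history: list[int], todays_rows: int, threshold: float = 3.0) -> bool:
    """Flag today's load if its row count deviates sharply from recent history."""
    mean = statistics.mean(history)
    stdev = statistics.stdev(history) or 1.0   # guard against a perfectly flat history
    return abs(todays_rows - mean) / stdev > threshold

daily_row_counts = [10_120, 9_980, 10_230, 10_050, 10_160]
print(is_anomalous(daily_row_counts, 10_100))  # False: within normal variation
print(is_anomalous(daily_row_counts, 2_300))   # True: likely a broken upstream feed
```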

Strengthen regulatory compliance

The risks of non-compliance – legal penalties, loss of reputation and customer trust – are too big to be ignored. According to the Gartner Hype Cycle for Data Privacy 2021, more than 80% of companies worldwide will face at least one privacy-focused data protection regulation by 2023. Rather than responding to each challenge individually, a proactive approach to data privacy, protection and risk management is an opportunity for organizations to build customer trust. With active metadata management, organizations can enforce data policies automatically and implement data protection rules at scale for better compliance with new data regulations.

3 benefits of an active metadata management solution


A data fabric solution connects the right data, at the right time, to the right people, from anywhere it’s needed. One of the key aspects of the IBM data fabric solution is the active metadata capabilities delivered by IBM Watson Knowledge Catalog for Cloud Pak for Data. This data catalog empowers data producers and consumers to understand, trust and protect data, and to use it confidently throughout its lifecycle.

Know your data

Ensuring that data is enriched with all the relevant context is critical for advanced data discovery and improved trust in data. Watson Knowledge Catalog helps data consumers find and understand data by offering a strong metadata foundation consisting of business terms, data classifications, and reference data backed by AI/ML-driven automation. With intelligent recommendations from IBM Watson and peers, users are empowered to find relevant assets from across the enterprise at scale. Furthermore, automated metadata enrichment built into Watson Knowledge Catalog uses machine learning to automatically assign business terms to data assets at scale. This helps users find data faster, decide whether data is appropriate and can be trusted, and understand how to work with it.
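
The snippet below is a deliberately simplified, rule-based stand-in for that enrichment step: it suggests glossary terms by matching column names against known patterns. The GLOSSARY contents and suggest_terms helper are hypothetical, and the product capability described above relies on machine learning rather than hand-written patterns.

```python
# Toy glossary: each business term with a few name patterns that suggest it.
GLOSSARY = {
    "Customer Identifier": ["cust_id", "customer_id", "client_id"],
    "Email Address":       ["email", "e_mail", "mail_address"],
    "Order Amount":        ["order_total", "order_amount"],
}

def suggest_terms(column_name: str) -> list[str]:
    """Suggest glossary terms whose known patterns appear in the column name."""
    name = column_name.lower()
    return [term for term, patterns in GLOSSARY.items()
            if any(p in name for p in patterns)]

for col in ["CUST_ID", "billing_email", "order_total_eur"]:
    print(col, "->", suggest_terms(col))
# CUST_ID -> ['Customer Identifier']
# billing_email -> ['Email Address']
# order_total_eur -> ['Order Amount']
```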

Trust your data

Complex data landscapes and resulting data silos place a time-consuming burden on data teams to govern data spread across distributed data environments and deliver trusted data. To improve trust in data, Watson Knowledge Catalog performs data quality analysis to assign quality scores to data assets based on dimensions like data class and type violations, duplicate values, missing values, and suspect values. Custom data quality rules can then be defined to improve curation activities. Furthermore, IBM’s partnership with MANTA brings automated data lineage capabilities to trace and analyze how data is moved and consumed across all your applications and data sources. This complements IBM’s acquisition of Databand.ai, whose data observability solutions use historical trends and statistics to detect anomalies in data pipelines so that IT teams can quickly surface issues before they impact the business.
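
To show how such dimensions can roll up into a single score, here is a minimal sketch that combines per-dimension violation rates with illustrative weights. The weighting scheme and the quality_score function are assumptions for the example, not the product's actual scoring formula.

```python
def quality_score(total_rows: int, missing: int, duplicates: int,
                  type_violations: int, suspect: int) -> float:
    """Combine per-dimension violation rates into a single 0-100 quality score."""
    if total_rows == 0:
        return 0.0
    # Each dimension contributes its share of 'clean' rows; weights are illustrative.
    weights = {"missing": 0.3, "duplicates": 0.2, "types": 0.3, "suspect": 0.2}
    dims = {
        "missing":    1 - missing / total_rows,
        "duplicates": 1 - duplicates / total_rows,
        "types":      1 - type_violations / total_rows,
        "suspect":    1 - suspect / total_rows,
    }
    return round(100 * sum(weights[d] * dims[d] for d in dims), 1)

# A 10,000-row asset with a handful of issues still scores highly.
print(quality_score(10_000, missing=120, duplicates=40, type_violations=15, suspect=60))  # 99.4
```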

Protect your data

IBM supports advanced data privacy management capabilities for dynamic enforcement of your data protection policies globally. Create data protection rules to help control access to data assets no matter where they reside, mask data at the column level and filter data rows based on row attributes. IBM can help protect sensitive and critical data through de-identification of personal and confidential information.
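
A rough sketch of how column masking and row filtering might be applied before data is served is shown below. The mask tokenization, the apply_protection helper and the country-based row filter are hypothetical examples of protection rules, not the product's enforcement engine.

```python
import hashlib

def mask(value: str) -> str:
    """Replace a sensitive value with a stable, irreversible token."""
    return "tok_" + hashlib.sha256(value.encode()).hexdigest()[:8]

def apply_protection(rows: list[dict], masked_columns: set[str], row_filter) -> list[dict]:
    """Apply column-level masking and row-level filtering before data is served."""
    protected = []
    for row in rows:
        if not row_filter(row):          # row-level rule: drop rows the user may not see
            continue
        protected.append({col: mask(str(val)) if col in masked_columns else val
                          for col, val in row.items()})
    return protected

rows = [
    {"name": "A. Rossi", "country": "IT", "salary": 51000},
    {"name": "J. Meyer", "country": "DE", "salary": 64000},
]
# Hypothetical policy: mask names, and only expose rows for the requester's own country.
print(apply_protection(rows, masked_columns={"name"}, row_filter=lambda r: r["country"] == "DE"))
```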

Want to try out the active metadata features that allow IBM to deliver integrated quality and governance capabilities? Check out the free trial.

Access the report to read why IBM is recognized as a Leader in the 2022 Gartner® Magic Quadrant™ for Data Quality Solutions.

Source: ibm.com

Thursday, 8 December 2022

Risk is not static: Exploring the implications of the German Supply Chain Due Diligence Act


Across the globe, there are increasing regulatory requirements in place to address environmental, social and governance (ESG) actions needed to create a more sustainable world. While modern environmental regulations have been around for over half a century, we see increasing government actions addressing forced labor, unfair working conditions and modern slavery. These efforts reflect concerns over human rights in modern supply chains, as documented in the International Labour Organization’s (ILO) 2021 report. The report stated that 50 million people globally are in conditions of modern slavery, including forced labor. That figure was up by 10 million people from the ILO’s report five years prior.

Environmental concerns, modern slavery and forced labor require a global response, from the adoption of the Sustainable Development Goals (SDGs), such as Goal 8 on Decent Work and Economic Growth and Goal 13 on Climate Action, to country-specific regulations, such as the UK Modern Slavery Act and the California Transparency in Supply Chains Act. In June 2021, Germany responded by passing its Supply Chain Due Diligence Act, the Lieferkettensorgfaltspflichtengesetz (LkSG).

LkSG requirements and considerations


Starting January 1, 2023, companies based in Germany, or German-registered branches of foreign companies, with over 3,000 employees must create or update business processes to identify, assess, prevent, remediate and report on human rights and environmental risks and related actions, not only within their own area of business and that of their direct suppliers, but also among their indirect suppliers. Failure to comply with LkSG can result in fines of up to 2% of annual turnover and/or exclusion from being awarded public contracts.

In response to the requirements of LkSG, we believe there are three important points a company should consider:

◉ Risk is variable, and should therefore not be addressed as a static, once-a-year checklist exercise. Finding a holistic way to address dynamic environmental, social and governance risks proactively must be a priority.

◉ Companies across a supply chain desire to be efficient in achieving regulatory compliance. Suppliers often respond to multiple questionnaires from their clients and the time and human resources required to respond to questionnaires can be significant. It is important to find solutions that minimize the burden on the supplier.

◉ Regulatory requirements are on the rise globally, including those applicable to ESG practices. Companies can improve their operational efficiency by implementing compliance solutions that are responsive to evolving regulatory requirements and that can scale to meet their business needs.

To address these issues, IBM and FRDM have partnered to provide a human rights and environmental risk sensing and management solution. The solution uses big data and AI to generate real-time risk signals through the third tier of the supply chain, and provides teams with the ability to respond to these signals and connect with suppliers to mitigate risks. It enables companies to detect issues in a timely and dynamic manner. To minimize the cost of compliance to suppliers, there is no cost to sign up for the platform. And the solution is expandable to address changing regulations.

Addressing risk proactively and dynamically


LkSG requires companies to establish a risk management system, perform regular risk analysis, lay down preventive measures for their own area of business and multiple tiers of suppliers, and take remedial action. Its scope is broad and extends beyond a company’s tier-1 suppliers. The main challenge with risk is that it isn’t static. Self-assessment, survey-based tools in the market can only provide a snapshot of a company’s business and supply chain risks, and administering surveys and processing results can be time-consuming and resource intensive. These tools are also usually unverifiable, and companies need to trust the accuracy of responses. To address risk proactively and dynamically, it would be desirable to implement a solution that constantly updates as the supply chain and its risk levels change, keeping the information current.

The IBM FRDM solution leverages big data to generate insights on supply chain environmental, social and governance (ESG) risk from tier-1 through tier-3 suppliers. The platform combines a company’s spend data and third-party data (including news sources, trade databases, and sanctions databases) to map supply chains and commercial relationships and generate a live risk assessment of a company’s supply chain. Most notably, the proprietary product genome database can build a predictive bill of materials (BOM) that breaks down your purchases to determine the material and services inputs, allowing the platform to map risk up through the third tier. The platform creates dashboards for companies and their suppliers with live risk ratings and issues machine learning-powered alerts for ongoing risk monitoring. It also provides a forum for supplier engagement on remediation and enables report generation on progress updates and impact tracking.
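
To illustrate the idea of multi-tier risk mapping (not the FRDM platform's actual model), the sketch below propagates risk scores through a small, hypothetical supplier graph so that a high-risk tier-3 supplier becomes visible in the buyer's view. The graph, scores and propagated_risk helper are invented for the example and assume an acyclic supplier network.

```python
# Hypothetical supplier graph: each buyer maps to its direct (next-tier) suppliers.
SUPPLIER_GRAPH = {
    "acme_corp":   ["assembler_a", "assembler_b"],   # tier 1 suppliers of the buyer
    "assembler_a": ["mill_x"],                       # tier 2
    "assembler_b": ["mill_x"],
    "mill_x":      ["mine_y"],                       # tier 3
    "mine_y":      [],
}
# Illustrative risk signals (0 = low, 1 = high) derived from news, trade and sanctions data.
OWN_RISK = {"assembler_a": 0.1, "assembler_b": 0.2, "mill_x": 0.3, "mine_y": 0.9}

def propagated_risk(supplier: str) -> float:
    """A supplier's risk is the worst of its own risk and its sub-suppliers' risk."""
    own = OWN_RISK.get(supplier, 0.0)
    downstream = [propagated_risk(s) for s in SUPPLIER_GRAPH.get(supplier, [])]
    return max([own] + downstream)

# The tier-3 mine's high risk surfaces in the buyer's view.
print(propagated_risk("acme_corp"))  # 0.9
```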

The IBM FRDM solution also provides risk management and response management services that assist companies with taking remedial action, documenting, and reporting on the applicable due diligence obligations. These services include coverage of and a first-level response to risk alerts, supplier questionnaires, and risk assessment changes. IBM can provide a third-party review of supply chain whistle-blower reports and help to ensure timely escalations to the appropriate parties and expeditious remedial action.

Reducing the burden on suppliers


Suppliers routinely receive audits, surveys and training requirements from their many customers, generating not only fatigue for their teams, but also a financial burden. It would be desirable for businesses to develop and implement processes that are effective and efficient for complying with LkSG, and to partner with suppliers to lessen the collective burden of compliance.

Most risk assessment platforms in the market today rely on supplier-filled questionnaires and are not verified through audit. They are backward-looking in the sense that they ask suppliers about measures they have put in place to mitigate risks. Some of these platforms also charge the supplier a fee to respond to the questionnaire and have their data available to their customers. The IBM FRDM joint solution uses big data to generate insights and risk ratings at no cost to suppliers and doesn’t require suppliers to sign up for any special platform. Moreover, if companies want to take a deeper dive into their suppliers, they are able to deploy a free digital supplier assessment within the platform.

Having no supplier cost allows companies to collect information from all their suppliers, not just their strategic partners with high order volumes that can pay platform fees. Smaller suppliers can also afford to participate, which is especially important as LkSG requires companies to look at all suppliers, the smallest of whom are more heavily burdened by platform/survey costs. This also means that the companies don’t have to pay out of pocket to cover the cost for these smaller suppliers.

The IBM FRDM solution also saves suppliers time—they don’t have to pay for the platform or module, and they don’t have to be trained on a new platform where they would be entering data. This allows the suppliers to focus on higher value activities, and helps reduce their survey fatigue. Overall, this allows companies to foster healthier relationships with their suppliers, and create more effective supply chain operations.

The evolving regulatory environment


LkSG is Germany’s approach to holding companies accountable for creating and nurturing more equitable and sustainable supply chains. It follows other regulations in Europe, such as the French Duty of Vigilance law and the UK Modern Slavery Act, and regulations around the world, such as the Australian Modern Slavery Act and the California Transparency in Supply Chains Act. A similar European Union-wide act is expected to be effective in January 2024.

Companies need to build capabilities that allow them to be nimble and react in a timely manner to growing regulatory demands. With a large number of requirements, it becomes challenging to build teams with expertise in every type of environmental, social and governance request. IBM’s managed services help companies understand the requirements, manage the data, and prioritize follow-up and remediation. With a global presence, IBM is also able to set up local teams that understand the requirements and can work in real time, on the ground with clients. These teams serve as a first-level response to risk alerts, supplier questionnaires, and risk assessment changes, and route necessary escalations to the responsible parties in the company.

The IBM FRDM solution provides the ability to adapt to changing or expanding regulations through AI and machine learning, and provides local teams with deep expertise and support.

Conclusion


LkSG is but one of the more recently enacted government regulations on supply chain responsibility. As regulatory bodies, consumers, and employees continue to demand more due diligence around protecting the people in the supply chain and the planet that we inhabit, companies must design and implement solutions that can address risk as a dynamic variable, and scale with the changing environment. Risk is not static; it is ever-changing, and companies need more than a snapshot of their supply chain risk to adequately address shortcomings. The IBM FRDM solution is forward-looking and can adapt to new risk factors and indicators, and expand with new legislation and requirements. IBM and FRDM are ready to support companies as they continue to improve their practices to safeguard the planet and people globally.

Source: ibm.com