Thursday 17 February 2022

How CDOs create lasting culture change through employee empowerment


Do you remember when you first began to think about data? Long before we learn about the concept, from the moment we are born, we are absorbing, consuming, sorting and organizing data.

Even now, not all enterprises use data in a strategic way. Only recently have business leaders had access to much of the important data affecting their enterprises, and to the tools to understand it at a granular level.

The growing importance of the CDO

The Chief Data Officer (CDO) role is relatively new to the C-suite. Its evolution into a board-level position is telling, as businesses worldwide come to understand just how much they can benefit from being a data-first organization. Put simply, the goal of the CDO is to help the organization make decisions based on data. But their reach doesn’t stop at the boardroom. Their strategies affect the entire company, making the CDO a powerful agent of change.

As a CDO weaves data decisions into daily workflows, their success is measured along three dimensions:

◉ Growth of the top line

◉ Expansion of the bottom line

◉ Reduction of risk

As CDO at IBM, an essential part of my role is serving as an evangelist for data literacy and data democratization in service of the dimensions above. Building a data-first culture involves talking about it, a lot. So, in a recent episode of the podcast Sunny Side Up, I spoke with Asher Mathew and explained why the most effective way to see success is to transform your company’s culture to a data-first philosophy.

You can listen to that conversation right here.

So, how does an organization do this in a holistic and lasting way? By building a data-first culture.

Building a data-first culture

Every organization has mountains of data that, when properly analyzed, provide a risk-averse growth strategy unique to the needs of their business or industry. An effective CDO must pay attention to that data, understand how to apply it to advance the company’s business goals, and be prepared to articulate their case at the board level.

Changing an organization can be daunting, which is why it’s crucial to develop a data strategy that aligns with your business strategy. For example, when I joined IBM, I knew their business strategy focused on selling Hybrid Cloud and AI. We thoroughly understood what that journey meant for a consumer, but not so much in terms of an enterprise. In time we recognized we needed to become an AI enterprise, to be in a position to help organizations do the same and infuse AI into their business processes. Therefore, our data strategy became about transforming IBM into an AI enterprise and then showcasing our story to our clients. Of course, you can’t do that successfully without a data-first culture.

To usher in this change, we created a variety of specialized business units to work toward infusing AI throughout the company.

◉ The data standardization and governance unit ensures that all the data we use is fit for purpose.

◉ The adoption unit is fully empowered to work directly with various departments and implement an AI transformation. We recognized that AI needed to be incorporated into our company-wide workflows, so this unit’s participation is critical to ensuring that the company’s data and AI platform is fully adopted across the company.

◉ The data officer council is made up of members from organizations throughout the company who help validate the direction set by the Chief Data Office and remain responsive to business needs.

Many of IBM’s clients look like us: they are complex organizations filled with potential that’s difficult to unlock. Since it’s a challenge to change such complex ecosystems, it is important that IBM be a definitive showcase for the kinds of transformation we can offer to our clients and partners.

The successes of a data-first culture

Let me share some examples of our success. In 2016, we laid down our data foundation to deliver trusted, enterprise-wide datasets and standards. Once we deployed our AI solution along those vectors, we saw a boost in operational efficiency: a more than 70% improvement in average business process cycle time, which is the time a process takes to complete from start to finish.

We had a massive effort around risk mitigation from 2016 to 2018 to respond to the introduction of GDPR in the EU. More than that, however, it was an opportunity to innovate and develop new services and governance frameworks to facilitate better compliance efforts at scale.

With the creation of our AI Accelerator Team in 2018, we started to focus on revenue growth. The AI Accelerator team takes our internal data and AI transformation and showcases it to our clients to help them drive similar changes within their organizations.

Data literacy is at the core of each unit’s success. This should be the primary goal of any CDO change agent. Data is crucial to everybody’s lives, not just to people in organizations like ours. The public at large needs to understand what can be done with data. The cultural change element of the CDO role is concerned with influencing how data is used from within and being the example that others can follow. If the CDO does not focus on this, how can they expect anyone else to care?

Above all, we want to empower our people to be data-driven: to move at the speed of their insights, to observe the data and act on it without asking for permission.

You must be prepared to inspire and support your teams to thrive and, more realistically, to fail fast but learn through that failure. The CDO is in a position to understand all elements of the change: how to navigate challenges, how to learn through failure and how these components transform the culture of the enterprise. Effective communication is critical at the C-suite level, and the data you’re using becomes an integral part of the business strategy.

If you found these ideas intriguing, I invite you to join me in a deeper discussion about exploring data and technology with leading CDOs, CAOs, and CTOs at the upcoming IBM Chief Data Technology Summit Series.

Source: ibm.com

Tuesday 15 February 2022

Redefine cyber resilience with IBM FlashSystem


Today, we’re announcing new data resilience capabilities for the IBM FlashSystem family of all-flash arrays to help you better detect and recover quickly from ransomware and other cyberattacks. We’re also announcing new members of the FlashSystem family with higher levels of performance to help accommodate these new cyber resilience capabilities alongside production workloads.

Cybercrime continues to be a major concern for business. Almost every day we see reports of new attacks. The average cost is $4.24 million, and recovery can take days or weeks. Cyberattacks have an immediate impact on business, but they can also have a lasting reputational impact if the business is unavailable for a long time.

How Cyber Vault Can Help Businesses Recover Rapidly

Even with the best cyberattack defense strategy, it’s possible that an attack could bypass those defenses. That’s why it’s essential for businesses to have both defense and recovery strategies in place. Storage plays a central role in recovering from an attack.

IBM Safeguarded Copy, announced last year, automatically creates point-in-time snapshots according to an administrator-defined schedule. These snapshots are designed to be immutable (snapshots cannot be changed) and protected (snapshots cannot be deleted except by specially defined users). These characteristics help protect the snapshots from malware or ransomware and from disgruntled employees. The snapshots can be used to quickly recover production data following an attack.

Recovery from an attack involves three major phases: detecting that an attack has occurred, preparing a response to the attack, and recovering from the attack. Each of these phases can take hours or longer, contributing to the overall business impact of an attack.

An offering implemented by IBM Lab Services, IBM FlashSystem Cyber Vault is designed to help speed all phases of this process. Cyber Vault runs continuously and monitors snapshots as they are created by Safeguarded Copy. Using standard database tools and other software, Cyber Vault checks Safeguarded Copy snapshots for corruption. If Cyber Vault finds signs of corruption, that is an immediate indication that an attack may be underway. IBM FlashSystem Cyber Vault is based on a proven solution already used by more than 100 customers worldwide with IBM DS8000 storage.
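The announcement does not detail Cyber Vault’s internal detection logic, but the general idea of comparing successive immutable snapshots for signs of corruption can be sketched in a few lines of Python. The mount paths, the change-ratio threshold and the entropy heuristic below are illustrative assumptions, not the product’s implementation.

```python
import hashlib
import math
from pathlib import Path

def entropy(data: bytes) -> float:
    """Shannon entropy in bits per byte; encrypted or ransomed files trend toward 8.0."""
    if not data:
        return 0.0
    counts = [0] * 256
    for b in data:
        counts[b] += 1
    return -sum((c / len(data)) * math.log2(c / len(data)) for c in counts if c)

def fingerprint(snapshot_dir: Path) -> dict:
    """Map each file in a mounted snapshot to (content hash, entropy of first 64 KiB)."""
    result = {}
    for f in sorted(snapshot_dir.rglob("*")):
        if f.is_file():
            head = f.read_bytes()[:65536]
            result[str(f.relative_to(snapshot_dir))] = (
                hashlib.sha256(head).hexdigest(),
                entropy(head),
            )
    return result

def looks_suspicious(prev: dict, curr: dict,
                     change_limit: float = 0.30, entropy_limit: float = 7.5) -> bool:
    """Flag a snapshot if too many files changed, or if changed files look encrypted."""
    changed = [p for p in curr if p in prev and prev[p][0] != curr[p][0]]
    changed_ratio = len(changed) / max(len(prev), 1)
    if not changed:
        return changed_ratio > change_limit
    high_entropy = sum(1 for p in changed if curr[p][1] > entropy_limit)
    return changed_ratio > change_limit or high_entropy / len(changed) > 0.5

# Usage (hypothetical mount points for two consecutive snapshots):
# prev = fingerprint(Path("/mnt/snap_0900"))
# curr = fingerprint(Path("/mnt/snap_1000"))
# if looks_suspicious(prev, curr):
#     print("Possible ransomware activity: investigate before older snapshots expire.")
```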

When preparing a response, knowing which snapshots show no evidence of an attack makes it faster to determine which snapshot to use. And since Safeguarded Copy snapshots are on the same FlashSystem storage as operational data, recovery is fast using the same snapshot technology. Cyber Vault automation helps speed the process of recovery further. With these advantages, FlashSystem Cyber Vault is designed to help reduce cyberattack recovery time from days to just hours.

IBM FlashSystem Cyber Vault is part of IBM’s comprehensive approach to data resilience: high availability and remote replication for disaster recovery in IBM FlashSystem; backup, recovery and copy management using IBM Spectrum Protect Suite; ultra-low-cost long-term storage with physical air-gap protection using IBM tape storage; early attack detection through IBM QRadar and IBM Guardium; and proactive attack protection using IBM Safeguarded Copy.

High Performance Hybrid Cloud Storage Systems

To ensure cyber security does not have to come at the expense of production workload efficiency, IBM is introducing new storage systems with greater performance than previous systems.

Built for growing enterprises needing the highest capability and resilience, IBM FlashSystem 9500 offers twice the performance, connectivity, and capacity of FlashSystem 9200 and up to 50% more cache (3TB). The system supports twice as many (48) high-performance NVMe drives. Likewise, FlashSystem 9500 supports up to forty-eight 32Gbps Fibre Channel ports with planned support for 64Gbps Fibre Channel ports. There’s also an extensive range of Ethernet options, including 100GbE RoCEv2.


Supported drives include new IBM FlashCore Modules (FCM 3) with improved hardware compression capability, Storage Class Memory drives for ultra-low-latency workloads, and industry-standard NVMe flash drives. FCMs allow 2.3PB effective capacity with DRAID6 per control enclosure, and 4.5PB effective capacity with forty-eight 38TB FCMs in a planned future update. These new FCM 3 drives help reduce operational cost with a maximum of 116TB per drive and an impressive 18PB of effective capacity in only 16U of rack space with FlashSystem 9500. FCM 3 drives are self-encrypting and are designed to support FIPS 140-3 Level 2 certification, demonstrating that they meet rigorous security standards as defined by the U.S. government.

FlashSystem 9500 also provides rock-solid data resilience with numerous safeguards, including multi-factor authentication designed to validate users and secure boot to help ensure only IBM-authorized software runs on the system. Additionally, the IBM FlashSystem family offers two- and three-site replication, plus configuration options that can include an optional 100% data availability guarantee to help ensure business continuity.

“In our beta testing, FlashSystem 9500 with FlashCore Module compression enabled showed the lowest latency we have seen together with the efficiency benefit of compression. FlashSystem 9500 delivers the most IOPS and throughput of any dual controller system we have tested and even beat some four-controller systems.”

— Technical Storage Leader at a major European Bank.

The new IBM FlashSystem 7300 offers about 25% better performance than FlashSystem 7200, supports FCM 3 with improved compression, and supports 100GbE RoCEv2. With 24 NVMe drives, it supports up to 2.2PB effective capacity per control enclosure.


For customers seeking a storage virtualization system, new IBM SAN Volume Controller engines are based on the same technology as IBM FlashSystem 9500 and so deliver about double the performance and connectivity of the previous SVC engine. SAN Volume Controller is designed for storage virtualization and so does not include storage capacity but is capable of virtualizing over 500 different storage systems from IBM and other vendors.


Like all members of the IBM FlashSystem family, these new systems are designed to be simple to use in environments with mixed deployments that may require multiple different systems at the core, in the cloud, or at the edge. They deliver a common set of comprehensive storage data services using a single software platform provided by IBM Spectrum Virtualize. Hybrid cloud capability consistent with on-prem systems is available for IBM Cloud, AWS, and Microsoft Azure with IBM Spectrum Virtualize for Public Cloud. These systems also form the foundation of IBM Storage as a Service.

Source: ibm.com

Sunday 13 February 2022

We must check for racial bias in our machine learning models

As a data scientist for IBM Consulting, I’ve been fortunate enough to work on several projects to fulfill the various needs of IBM clients. Over my time at IBM, I have seen technology applied to various use cases that I would have never originally considered possible, which is why I was thrilled to steward the implementation of artificial intelligence to address one of the most insidious societal issues we face today, racial injustice.

As the Black Lives Matter movement started to permeate throughout the country and world in 2020, I immediately wondered if my ability to solve problems for clients could be applied to major societal issues. It was with this idea that I decided to look for opportunities to join the fight for racial equality and found an internal IBM community working on projects that were to be released through the Call for Code for Racial Justice.

Adding my two cents to TakeTwo

There were numerous projects being incubated within IBM, but I found myself drawn to one in particular that was looking at both implicit and explicit bias. That project was TakeTwo, and it went on to become one of the seven projects released as external open source projects just over a year ago. The TakeTwo project uses natural language understanding to help detect and eliminate racial bias — both overt and subtle — in written content. Using TakeTwo to detect phrases and words that can be seen as racially biased can assist content creators in proactively mitigating potential bias as they write. It enables a content creator to check content that they have created before publishing it, currently through an online text editor. Think of it as a Grammarly for spotting potentially racist language. TakeTwo is designed to leverage directories of inclusive terms compiled by trusted sources like the Inclusive Naming Initiative.
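The public TakeTwo repository contains the project’s actual implementation; the snippet below is only a minimal sketch of the underlying idea of checking a draft against a curated directory of flagged terms. The example directory entries and the `check_text` helper are illustrative assumptions, not TakeTwo’s code.

```python
import re

# Illustrative directory entries; a real project would source these from curated,
# community-reviewed lists such as those from the Inclusive Naming Initiative.
FLAGGED_TERMS = {
    "blacklist": "denylist",
    "whitelist": "allowlist",
    "master branch": "main branch",
}

def check_text(text: str) -> list[dict]:
    """Return each flagged term found in the text along with a suggested alternative."""
    findings = []
    for term, suggestion in FLAGGED_TERMS.items():
        for match in re.finditer(rf"\b{re.escape(term)}\b", text, flags=re.IGNORECASE):
            findings.append({
                "term": match.group(0),
                "start": match.start(),
                "end": match.end(),
                "suggestion": suggestion,
            })
    return findings

if __name__ == "__main__":
    draft = "Add the host to the whitelist before merging to the master branch."
    for finding in check_text(draft):
        print(f"{finding['term']!r} at {finding['start']}: consider {finding['suggestion']!r}")
```

A real system layers context handling and a trained model on top of simple matching like this, since many terms are only problematic in certain contexts.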

Not only did TakeTwo allow me to apply my expertise to improve the project, but it also afforded me the opportunity to look inward at some of the implicit racial biases that I may have held but had previously been unaware of. Working on TakeTwo was a great way to work on a mission that matters for the world, while also providing a chance for self-reflection.

See the solution in action:

The Data Challenge

While working on TakeTwo, it became abundantly clear that although the solution aims at detecting bias by fielding and evaluating massive amounts of data, the data itself can hold implicit bias. By leveraging artificial intelligence and open source technologies like Python, FastAPI, JavaScript, and CouchDB, the TakeTwo solution can continue to evaluate the data it ingests and better detect when bias exists within it. For example, a word or phrase that may be acceptable to use in the United States may not be acceptable in Japan – so we need to be cognizant of this to the best of our ability and have our solution function accordingly. As someone who is passionate about data science, I know firsthand that our model is only as good as the data we feed it. On that note, one thing I’ve learned from working on this project is that we need better datasets to help us train the machine learning (ML) models that underpin these systems. Kaggle datasets have been a great starting point for us, but if we want to expand the project to take on racism wherever it exists, we’ll need more diverse data.
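To show how a Python/FastAPI stack like the one mentioned above might expose such a check as a service, here is a minimal sketch. The route name, the request model and the `check_text` helper (from the earlier sketch, assumed to live in a module named `taketwo_sketch`) are illustrative assumptions, not TakeTwo’s actual API.

```python
# pip install fastapi uvicorn
from fastapi import FastAPI
from pydantic import BaseModel

from taketwo_sketch import check_text  # the illustrative helper from the earlier snippet

app = FastAPI(title="Bias-check sketch")

class Draft(BaseModel):
    text: str

@app.post("/check")
def check(draft: Draft) -> dict:
    """Return flagged terms and suggested alternatives for a submitted draft."""
    findings = check_text(draft.text)
    return {"flagged": findings, "count": len(findings)}

# Run locally with:  uvicorn app:app --reload
# A persistence layer (CouchDB, for example) could store accepted and rejected
# suggestions so the team can refine the term directory and retrain models over time.
```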

On a related note, the skills needed on projects like these go way beyond data science. For this project in particular, it was important to engage linguistics experts who can help define some of the cultural nuances in language that a system like TakeTwo either needs to codify or ignore. Only by working across disciplines can we get to a workable solution.

Be part of the future

The value that ML and AI bring to enhancing solutions like TakeTwo is inspiring. From hiring employees to getting approved for a loan at the bank, ML and AI are permeating the way we interact with one another and can help ensure we remove as much racial bias as possible from business decision-making. As technologists, we have a distinct responsibility to produce models that are honest, unbiased, and perform at the highest level possible so that we can trust their output.

You can check out how we at IBM are building better AI here. TakeTwo continues to make strides in developing and strengthening its efficacy, and you can contribute to making this open source project better. Check out TakeTwo and all of the other Call for Code for Racial Justice Projects today!

Source: ibm.com

Saturday 12 February 2022

Kai Ming makes more data-driven decisions with IBM Cognos Analytics


In the 1960s, emerging research on the effects of poverty and its impact on education came to light. This research indicated an obligation to help disadvantaged groups, compensating for inequality in social or economic conditions. In January 1964, former teacher and then-President Lyndon B. Johnson declared a “war on poverty,” and Head Start, a program to promote the school readiness of infants, toddlers and preschool-aged children from low-income families, was established as part of the arsenal. Today, Head Start provides services to more than a million children every year in every U.S. state and territory.

Head Start programs are typically run by non-profit organizations, schools and community action agencies. Operations and management require extensive coordination between federal, state and local governments and detailed, often disparate, reporting to each organization participating in the program.

Potential has no limits at Kai Ming

It is the motto and mantra of the executives, educators and staff at Kai Ming Head Start. Founded in Chinatown in 1975, Kai Ming serves low-income families, providing child educational services and family resources to nearly 400 students and their families at 10 locations in San Francisco, California. Its governing bodies, executives, educators and staff are dedicated to empowering students and their families.

“Our teachers, they devote themselves to the students in the classroom, and we honor them. To effectively manage the program, we need to work with the parents, through hundreds of interactions, for hundreds of kids,” said Jerry Yang, Ph.D., Executive Director at Kai Ming. “We do a lot of health and age-appropriate developmental screening. But we also work with parents to help them progress toward self-sustainability, as many of our families face challenges with employment, housing and food insecurity. So, we needed a good data system to track all these things.” Communication and information exchange is essential for each child’s success.

Solving the data dilemma — fast and accurate capture, reporting and interpretation

Data capture and the subsequent reporting were not easy for Kai Ming. Separate local, state and federal agencies plus private donors and sponsors support Kai Ming. Monthly government-mandated reporting is necessary to maintain funding. Kai Ming’s Governing Board and Parent Council require different formats and content. Furthermore, Dr. Yang needed a comprehensive view of the program, from overall operations to classroom-level analytics and individual family and child records.

Each of Kai Ming’s locations has children with different abilities and challenges. Tracking the scope of these attributes is yet another factor in data capture, reporting and analytics. The organization’s manual reporting frequency and each entity’s detailed reporting requirements were not only time-consuming but also took precious time away from more strategic activities that would benefit the children. Automated monitoring of each child’s attendance, health records and classroom performance was essential for proactive support of the individual and family. Yang sought solutions that would speed Kai Ming’s data capture and analytics capabilities across the organization.

The right tools and support make the difference

Dr. Yang explored several tools for database and application development. He went so far as to teach himself the basics of a software product, but the solutions he found were not robust enough to support his business needs. Then he learned about the capabilities of IBM® Cognos® Analytics on Cloud software. With the help of IBM Business Partner PMsquare, Kai Ming applied the features and capabilities that best suited its business intelligence needs. Kevin Emanuel, Vice President of Sales at PMsquare, said, “Because Kai Ming, like many non-profit organizations, had budget constraints, we recommended the on-demand version of Cognos and provided the skills to integrate the advanced features of the product.”

The solution enables Kai Ming to import and analyze data from multiple on-premises or cloud sources and provides AI-assisted data cleansing and preparation when data comes from multiple sources. Its visualization capabilities were particularly valuable to Kai Ming as dynamic dashboards allowed Dr. Yang to drill down into the detail and share information when appropriate. With Cognos Analytics, Kai Ming uncovered hidden patterns in operations that assisted planning and decision-making.

Today, Cognos Analytics is an integral part of Kai Ming’s business intelligence and reporting processes.

“Previously, we had to chase after people for information before we could send a report. It could take weeks to compile,” says Dr. Yang. “Today, it’s done in a snap — in less than an hour.”

Dr. Yang uses dashboard capabilities to examine each child’s progress and classroom operations to optimize needed resources. So what does Dr. Yang do with his extra time? “While I can obtain, analyze and send information faster, that doesn’t mean we sit back with nothing to do. No way. We continue to keep busy working on new ways to empower our kids and their families,” concludes Dr. Yang. In fact, despite the COVID pandemic, 65% of Kai Ming’s children advanced at least one development level in the 2021 academic year.

About Head Start

Head Start programs, administered by the Office of Head Start (OHS) within the Administration for Children and Families, U.S. Department of Health and Human Services, promote the school readiness of infants, toddlers and preschool-aged children from low-income families. Services are provided in various settings, including centers, family childcare and children’s own homes. Head Start programs engage parents or other key family members in positive relationships, focusing on family wellbeing. Decades of research show that Head Start participation has both short- and long-term positive effects for Head Start children and their families. To donate to Kai Ming Head Start or learn more about its services, visit its site.

About PMsquare

PMsquare is a data and analytics consultancy and an IBM Business Partner specializing in Data Science and AI. As an end-to-end solution provider, PMsquare’s goal is to help clients simplify the complex challenges involved with architecting data and analytics strategies and implementing analytics solutions that deliver powerful insights to organizations around the globe.

As a supporter of Kai Ming’s organizational mission, PMsquare is also a proud sponsor of Kai Ming’s newest children’s center in San Francisco: the PMsquare Children’s Center.

Source: ibm.com

Thursday 10 February 2022

Boosting engineering efficiencies with digital threads


As the demand for agile development grows, so does the demand for digital threads. But what are digital threads, and what is their value?

Digital threads enable a product’s development through its lifecycle domains to be digitally traced. The tracing can be in either direction, upstream or downstream in the product development lifecycle.

“Digital” means that all the paths are electronic — no one is searching through filing cabinets looking for requirements or test documents. “Thread” means there is a traceable path that anyone can follow between processes and across data domains.

The value of digital threads is the ability to quickly trace an entire development process, digitally identifying causal relationships between processes and ensuring that their datasets are consistent. When there is a product quality issue, the digital thread allows the part or subsystem to be traced back through its lifecycle. The original design concept, requirements, test cases, quality checks and signoffs are all available to the engineering team diagnosing the quality issue. If a requirement needs to be changed, that change’s impact can be determined digitally by following its logical thread.
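As a rough illustration of what "following the thread" means in practice, the sketch below models lifecycle artifacts as nodes and trace links as edges, then walks them upstream or downstream. The artifact names and link types are invented for the example; real tools store these links in their own repositories.

```python
from collections import defaultdict

# Hypothetical trace links: (source artifact, link type, target artifact).
LINKS = [
    ("REQ-101", "satisfied_by", "DESIGN-7"),
    ("DESIGN-7", "implemented_by", "PART-SN-42"),
    ("REQ-101", "validated_by", "TEST-55"),
    ("TEST-55", "verifies", "PART-SN-42"),
]

downstream = defaultdict(list)   # artifact -> edges pointing away from it
upstream = defaultdict(list)     # artifact -> edges pointing into it
for src, link, dst in LINKS:
    edge = f"{src} --{link}--> {dst}"
    downstream[src].append((edge, dst))
    upstream[dst].append((edge, src))

def trace(start: str, direction: dict) -> list[str]:
    """Breadth-first walk of the thread from a starting artifact."""
    seen, queue, path = {start}, [start], []
    while queue:
        node = queue.pop(0)
        for edge, nxt in direction.get(node, []):
            path.append(edge)
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return path

# A quality issue on a delivered part: trace upstream to its design, requirement and tests.
print("\n".join(trace("PART-SN-42", upstream)))
# A requirement change: trace downstream to see everything it impacts.
print("\n".join(trace("REQ-101", downstream)))
```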

The need for digital thread enablement has been amplified by growing market pressures for development to be more agile, as well as the increasing software integration in “smart products.”

Digital threads in agile development

The need for agility is especially clear when dealing with supply chain issues. Companies need to know the dependencies in product design for every part and subsystem.

This is important for accelerating reaction time, for instance, when a delay from one vendor forces the substitution of another vendor’s part. Leveraging a digital thread helps companies automate changes and execute them faster. Manually managing changes can make an organization subconsciously averse to change, which runs completely counter to the concept of being agile.

Digital threads in software integration

The increasing integration of software introduces a new market model for many companies, and it increases the need to track and maintain their products in the market. Software offers an attractive potential revenue stream through in-market upgrades and enhancements. Maintaining a digital thread helps engineering teams automate necessary processes and leverage data analytics for better insights into proposed changes, such as optimizing the release of fix packs, retro-functions and incremental enhancements.

The digital thread is incredibly valuable in the development process. It enables decision-makers to analyze the impact of changes before they are made, establish cross-functional KPIs for measuring readiness and progress, and more readily respond to and report on cross-functional compliance requirements.

IBM designed the Engineering Lifecycle Management portfolio on a digital foundation that not only leverages a single view of digital development data, but also leverages an industry-standard, open exchange format of Open Services for Lifecycle Collaboration (OSLC). The IBM Engineering requirements, test, and workflow management tools, as well as the systems/software modeling tool, all leverage this digital foundation to establish traceability across the development environment. This ELM digital foundation can be leveraged via its OSLC architecture by third-party tools and other domain applications, further extending the capabilities of ELM’s digital thread services. This would allow a manufacturing management system to extend the digital thread traceability across development into the actual product’s manufacturing.
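OSLC expresses these trace links as linked data, so a link between a requirement and the test case that validates it is simply an RDF triple between two resource URIs. The sketch below uses the open source rdflib package to build one such link; the resource URLs and the requirement title are placeholders, and the property shown (`validatedBy` from the OSLC Requirements Management vocabulary) is one common example of a trace-link type.

```python
# pip install rdflib
from rdflib import Graph, Literal, Namespace, URIRef
from rdflib.namespace import DCTERMS

OSLC_RM = Namespace("http://open-services.net/ns/rm#")

# Placeholder resource URIs; in a real deployment these would be the tools'
# own OSLC resource URLs for a requirement and a test case.
requirement = URIRef("https://example.com/rm/requirements/REQ-101")
test_case = URIRef("https://example.com/qm/testcases/TEST-55")

g = Graph()
g.bind("oslc_rm", OSLC_RM)
g.bind("dcterms", DCTERMS)
g.add((requirement, DCTERMS.title, Literal("Braking distance under wet conditions")))
g.add((requirement, OSLC_RM.validatedBy, test_case))

# Any OSLC-aware tool (or a plain SPARQL query) can follow this link in either
# direction, which is what makes the thread traceable across data domains.
print(g.serialize(format="turtle"))
```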

Source: ibm.com

Tuesday 8 February 2022

If data is the new oil, ISO 20022 is the new gasoline


The phrase ‘data is the new oil’ has been widely used in recent years, but in an unrefined state, data has limited use. ISO 20022 is refined: it provides the structure needed to efficiently drive multiple engines in a bank.

Background on ISO 20022

ISO 20022 was first introduced in 2004 to provide more standardization and deliver richer information for Financial Services transactions. The benefits of providing an enhanced definition for payment transactions include the following:

◉ Improving automation to achieve higher straight-through-processing (STP) rates.

◉ Leveraging rich data to provide value-added services, such as real-time fraud detection, automated reconciliation, etc.

◉ Improving reporting and analytics capabilities through financial services transactions.

◉ Leveraging artificial intelligence (AI) for more efficient processing of transactions and introducing market-differentiating services.

During the financial crisis in 2008, the trust levels of bank clients dipped. One client we worked with had a ‘trust’ equation on how to increase the trust levels of clients (new and existing) to maintain or grow their client base:

Trust equation: Trust = Transparency + Innovation

ISO 20022 helps with the right-hand side of this equation by increasing transparency and innovation to achieve end-to-end payment transaction visibility.

Current drawbacks associated with implementing ISO 20022

Introducing ISO 20022 into a bank requires a sizeable investment, which can deter customers. Existing back-office systems often do not speak the language of ISO 20022, so they either need to be updated or a translation service needs to be put in place. This is not a trivial task. Where translation occurs, compliance officers are concerned about data loss, so there needs to be a mechanism in place to ensure there is none. By adopting ISO 20022 on the cloud, however, banks can reduce such losses and lower the overall cost of ownership.
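As a rough illustration of what such a translation layer does, the sketch below maps a handful of fields from a hypothetical legacy flat-file payment record into a simplified ISO 20022 pacs.008-style credit transfer message. The legacy field names are invented, and the XML shown is heavily abridged rather than schema-complete, so treat it as a conceptual sketch only.

```python
import xml.etree.ElementTree as ET
from datetime import datetime, timezone

# Hypothetical legacy flat-file record from a back-office system.
legacy_payment = {
    "ref": "PAY20220208-0001",
    "amount": "1250.00",
    "currency": "EUR",
    "debtor_name": "Acme GmbH",
    "creditor_name": "Globex Ltd",
    "remittance": "Invoice 4711",
}

def to_pacs008_sketch(p: dict) -> str:
    """Build a heavily simplified pacs.008-style XML message from a legacy record."""
    ns = "urn:iso:std:iso:20022:tech:xsd:pacs.008.001.08"
    ET.register_namespace("", ns)
    doc = ET.Element(f"{{{ns}}}Document")
    cdt = ET.SubElement(doc, f"{{{ns}}}FIToFICstmrCdtTrf")

    grp = ET.SubElement(cdt, f"{{{ns}}}GrpHdr")
    ET.SubElement(grp, f"{{{ns}}}MsgId").text = p["ref"]
    ET.SubElement(grp, f"{{{ns}}}CreDtTm").text = datetime.now(timezone.utc).isoformat()

    tx = ET.SubElement(cdt, f"{{{ns}}}CdtTrfTxInf")
    amt = ET.SubElement(tx, f"{{{ns}}}IntrBkSttlmAmt", Ccy=p["currency"])
    amt.text = p["amount"]
    ET.SubElement(ET.SubElement(tx, f"{{{ns}}}Dbtr"), f"{{{ns}}}Nm").text = p["debtor_name"]
    ET.SubElement(ET.SubElement(tx, f"{{{ns}}}Cdtr"), f"{{{ns}}}Nm").text = p["creditor_name"]
    ET.SubElement(ET.SubElement(tx, f"{{{ns}}}RmtInf"), f"{{{ns}}}Ustrd").text = p["remittance"]

    return ET.tostring(doc, encoding="unicode")

print(to_pacs008_sketch(legacy_payment))
```

The named, structured fields are what make downstream fraud screening, reconciliation and analytics straightforward compared with parsing free-text legacy formats.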

There is also a possibility that various implementations of ISO 20022 could diverge and lead to banks having to support multiple dialects of the ISO 20022 language. For high-value payment migrations, as an example, there are different usage guidelines: HVPS+ for market infrastructures and CBPR+ for SWIFT cross-border payments. Currently, SWIFT oversees the ISO 20022 standard, so we expect coordination and harmonization will occur based on its long-standing experience in managing standards.

Evolution of ISO 20022

In the payments area, ISO 20022 has been used for more than 15 years. In Europe, ISO 20022 has been used since 2008 for cross-border payments under the Single Euro Payments Area (SEPA). Adoption was slow to start until it was mandated that all domestic ACHs in the eurozone use SEPA. According to the European Payments Council, simplicity, convenience and efficiency are the three core benefits of SEPA.

In more recent years, there has been a move worldwide to adopt ISO 20022 for instant payments, high-value payments and bulk payments. European high-value payment systems — TARGET2 and EURO1 — will migrate to ISO 20022 in November 2022. SWIFT cross-border payments will likewise start the migration to ISO 20022, with a completion date of November 2025. In the U.S., the Fedwire and CHIPS payment systems will migrate to ISO 20022 in the coming years.

The future of payments

With the move to digitization accelerated by COVID-19, there will be a growing number of instant payment services using ISO 20022 in the coming years. Initiatives like the ‘Request to Pay’ will lead to new innovations in the payments space and likely increase the volume of electronic payments and the need for instant payments in particular.

The rich data that is provided by ISO 20022 will fuel the drive to efficient payment engines, fraud detection engines and analytics engines across banks worldwide. The structured nature of this ISO 20022 fuel also provides the opportunity for banks to innovate with emerging cloud-native technologies, such as artificial intelligence (AI) and machine learning (ML) for more efficient processing and reporting of financial transactions to change the payments world as we know it.

Source: ibm.com

Sunday 6 February 2022

Hybrid cloud for manufacturing equals resiliency, agility and flexibility


The manufacturing industry, spurred by COVID-19, is increasing its automation efforts with cloud technologies that offer resiliency, agility and flexibility. However, on the shop floor, slow progress in modernizing infrastructure and applications prevents companies from achieving gains in productivity and operational efficiency. One of the key reasons for this lagging adoption is the reluctance of plant managers to implement digital technologies. This reluctance stems from concerns around security, latency and resiliency, as well as a lack of understanding about new-generation cloud technologies and how they can coexist with legacy applications.

How manufacturing is evolving

Manufacturing 4.0, a term often used interchangeably with Industry 4.0 or Smart Manufacturing, refers to the paradigm shift comprised of major technological innovations in manufacturing. These include cloud and artificial intelligence (AI) technologies that address manufacturers’ challenges and build confidence through application and infrastructure modernization efforts. This tech maturity unlocks benefits through intelligent automation using, but not limited to, the following: sensors and digital transfer of data, advanced robotics, Internet of Things (IoT), mobile services, 3D printing and data analytics.

Cloud computing is paramount for manufacturing companies undertaking this shift — especially in the areas of data processing, data storage and enterprise resource planning systems. Hybrid cloud brings cloud directly to the manufacturing facility and provides benefits such as on-demand computing failover and auto-scaling, while ensuring that the operations keep running even if there is a connectivity failure. It also allows control decisions to be made in real time.  

Scalability is the key for success

The manufacturing industry is at the forefront of technology. Consider an average steel plant, oil refinery or aluminum smelter, all of which had considerable industrial automation built in at the equipment level long before we started talking about IoT. For these types of manufacturing, there is a great deal of legacy technology investment in the plant.

Much of the data generated at the equipment level is used for running the plant and improving operations. This is coupled with the fact that many large facilities are in remote areas, often in proximity to mines or oil wells. In remote facilities, telecommunication infrastructure and skill availability are more often than not an issue. This inhibits the manufacturer from taking full advantage of Manufacturing 4.0.

Open integration and support for microservices enable an architecture that is flexible and scalable. This scalability and flexibility need to be accompanied by resiliency from a failover and model-drift perspective. Performance and security aspects cannot be over-emphasized considering the real-time and mission-critical nature of manufacturing operations.

Hybrid cloud benefits

There are four primary benefits to hybrid cloud in manufacturing: faster cycle time, improved visibility, reduced cost and better management of plant applications.  

Faster cycle time and improved visibility: In typical analytics projects in plants, if data scientists spend three weeks doing the initial analysis, they spend three months trying to discover data, collect data and provision servers. Machine learning and AI workloads running in isolation may not be able to extract the intelligence out of the huge data volume. When visibility improves, data scientists can access data on demand and spend more time on the value-add activities of actual analysis and model building.

To achieve faster cycle time, we need to use AI and ML services offered by public cloud, while addressing the challenges of security across locations, data latency, operational visibility, compliance and regulations. 

IBM Cloud Satellite can bring any public cloud service to plant-level data without moving that data to the cloud. By deploying IBM Cloud Satellite in plant locations or within an existing data center, access to public cloud services is possible. This reduces the cycle time of projects, improves productivity and allows manufacturers to run more projects.

Cost reduction and improved plant management

In an audit of where data is generated, stored and used, frequent data movement is visible. This movement is wasteful and expensive. Currently, many plants aggregate data before storing it, in consideration of cost and network congestion. An on-premises database managed by the IBM Cloud Satellite solution saves on data egress charges and reduces the management effort of plant IT teams. Better IoT integration and smarter data placement reduce costs, enable faster projects and cut hot-data access time significantly.

An industrial edge (powered by data fabric and IBM Edge Application Manager) helps generate relevant and timely insights at scale. Creating continuous integration/continuous delivery (CI/CD) pipelines for plant applications provides more confidence to push changes in the operational technology (OT) landscape, which reduces the need for long shutdowns. The iterative cycle of training, deploying, learning and redeploying is especially important in data science and AI applications. Enabling agile iterations allows users to improve AI applications and get more value.

Hybrid cloud for the manufacturing industry is unique given the mission-critical, real-time applications that require low latency and high security. These factors lead to multiple smaller on-premises clouds instead of one large cloud. Hybrid cloud in manufacturing follows a hub-and-spoke model, where each satellite node has a relatively low footprint. A central on-premises cloud manages these multiple mini clouds, which may lead to some variance in standardization across clouds and requires multi-cloud management technologies.

Many plant applications are tied to hardware and are critical for plant control. Migrating these applications to a new architecture requires special care. There are multiple legacy applications, and their importance may warrant incremental modernization instead of modernizing all applications at once. Hybrid cloud needs to support bare metal and virtual machines during the journey to containerization and cloud-native computing.

The IBM and Red Hat architecture ensures a journey to cloud that balances the imperative to modernize with the need to minimize disruptions in operation.

Source: ibm.com

Thursday 3 February 2022

Navigating the new reality of HR with skills at the core


Business leaders continue to witness multi-fold organizational challenges due to ongoing disruption in the ways of working and the accelerated digitization, automation and shift to remote/hybrid work. These changes have highlighted the role of flexibility as a way to meet growing individual and organizational demands. No matter where you look, emerging technologies, mobilization of complementary competencies, innovation, governance, growth, net promoter score, value creation, delivery and more are all being introduced, dynamically exposed and evaluated.

Moving into the new paradigm

Unparalleled changes are the new normal, and the related disruptions are exposing blind spots. To remain relevant, leaders can help by maintaining an agile approach to the workforce and by shaping and empowering their businesses toward a new reality. When global leaders are in a volatile, uncertain, complex and ambiguous (VUCA) state of dilemma, it’s a perfect time to evaluate and determine effective areas of focus, including:

◉ Defining and improving employee lifecycle and considering employee personas via empathy mapping

◉ Strategic workforce planning in the context of a gig economy

◉ Adoption of analytics/automation like bots and workforce analytics

◉ HR reinvention for new-age skills looking at skills frameworks for reskilling and upskilling and mapping talent to value

◉ Digital change/transformation dashboard-enabling technologies and digital HR platforms

To enable leaders to prioritize effectively and create an agenda for reinvention, they must have a clear and transparent view of HR priorities and interventions vis-à-vis their organization’s capabilities.

Demand and supply, a contextual perspective

Post-pandemic, the demand for a skilled workforce has become directly proportional to supply. The new-age skills needed to drive a sustainable business model have increased those needs, and learning and development (L&D) has also moved up into the “must-have” category. This shift has forced leaders to acquire dynamic capabilities to innovate their business model, because doing so is perceived as a key driver of competitive advantage.

Qualitative findings and comparative analysis help HR leaders and chief experience officers (CXOs) build a future-ready workforce and workplace to support the successful execution of the organization’s strategies. The needs of tomorrow’s organization should also include identifying and defining priorities for HR leaders to address the ongoing business-as-usual changes.

A contemporary lens for HR transformation

Comprehensive analysis of leading consulting organizations that examine HR priorities helps identify the top priorities of HR leaders. It also aims to segment HR topics based on current capabilities and their future importance.

Key trends and actions in the sense-and-respond model indicate the following as upcoming opportunities to be tapped:

◉ Upskilling, reskilling and learning

◉ Employee experience and engagement

◉ Leadership behaviors and development

◉ Workforce planning and adjustment

◉ Organization restructuring and operating model

◉ Talent management

◉ Digital HR

◉ HR Reinvention

◉ Change management capabilities

◉ Workforce/HR analytics

Transparency drives accountability in planning and in introducing interventions to eliminate possible barriers, leading to rapid expansion in organizational capacity. As we step into new ways of working, retaining employees and building a leadership bench in a remote/hybrid, post-pandemic workplace will be one of the critical challenges to address.

Building critical skills and competencies

Skills are the new currency. 72% of outperformers recognize the importance of skills and continually invest in them. Yet only 41% of organizations report they have the talent required to execute their business strategy. Skills are, and will remain, the golden thread across the employee lifecycle.

As per a recent global survey conducted by the IBM Institute for Business Value, employee experience and skills are at the core of HR 3.0 as HR helps drive a company’s overall enterprise transformation. “Skills at the core” has a 69% level of importance to the future of HR and a 38% level of achievement today.

Bridging the gap between future and present

Organizations are eager to build a future-ready workforce equipped with the skills and capabilities to meet tomorrow’s challenges. Reskilling in the post-pandemic era has emerged as the top challenge for business leaders. Bridging the gap between future and current skills is an opportunity to create competitive advantage.

Enterprise skilling for the future is an approach to building the right skills, at scale, with existing talent to meet the needs of the business and stay ahead of the market. A compelling future-skilling journey serves as a tool to identify skills gaps and build cross-functional teams in hot skills, with a quick and efficient learning path that supports employees as owners of their careers, managers as talent builders, and business unit leaders in strategic planning.

Building a job taxonomy framework and a skills library that can be extended to specific industries would help in the creation of:

◉ Job descriptions and responsibilities

◉ Core competencies and behavioral-based proficiency statements

◉ Development goals

◉ Coaching tips

◉ Interview questions

The call to action

It’s imperative to highlight that the involvement of leaders and employees is equally important in building the new-age organization. Reskilling the workforce through personally and professionally enriching initiatives will remove obstacles to meeting unpredictable future challenges without undermining the availability of critical skills for the organization. Businesses need the right people with the right skill sets, and the HR agenda should focus on the key actions that reflect the new reality, now and beyond.

Source: ibm.com

Tuesday 1 February 2022

AI can help in the fight against racism

In my role as Open Source Community Manager for the Call for Code for Racial Justice, I oversee a community of developers, data scientists, designers and general problem-solvers all looking to use technology to fight for racial justice. Just like any role, there are challenges I must deal with on a daily basis, but the one thing that has pleasantly surprised me since I started almost a year ago has been the interest and enthusiasm from people all around the world and from different backgrounds who are invested in advancing racial equity using data and artificial intelligence (AI).

The Call for Code for Racial Justice is an initiative external to IBM, so the people I deal with come from big and small organizations around the globe — yet they all share this common belief, and it drives them to give up weekends and work nights building tech for social good.

What is this community building to fight racial injustice?

We currently have seven projects in the Call for Code for Racial Justice. These were originally incubated by the Black community inside of IBM as a response to the racial injustice highlighted through the #BlackLivesMatter campaign in 2020. When looking across these projects, you can see certain areas where technology has the greatest opportunity to fight racial bias in society:

◉ Accessing information: When information is dense and difficult to consume, it tends to be hard for people to come together and use it in an effective way. This often happens in the government and policy space, where information can have a significant impact on our lives — especially for underserved communities. Policy related to schools, roads, and the availability of local shops and resources can often be written in legalese that is hard for people to comprehend. AI can help rectify this. The Legit-Info project utilized Watson Natural Language Understanding to identify titles, summaries, locations and impacts (a minimal sketch of this kind of analysis follows this list). The results can then be further curated to improve readability and make these meaningful to all members in a community.

◉ Identifying racial bias: Racial bias can creep into all kinds of places — from a police write-up of a crime to technical documentation on a software tool. In some cases, this may be explicit and driven by the bias of the individual writing the document, but in just as many cases, this may be implicit and the result of societal norms carried over from the past. TakeTwo is an API-based tool that can take a document as its input and highlight potential racial bias based on a trained machine learning model. Looking for insights in data is another way to identify racial bias — the Open Sentencing project looks specifically at incarceration rates based on racial demographics to help defense lawyers make the case for Black defendants who often face tougher sentences for the same crimes as those committed by people of other races.
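For readers who want a feel for what such an analysis looks like in code, here is a minimal sketch that calls Watson Natural Language Understanding from the official Python SDK to pull keywords and entities from a piece of legislative text. The API key, service URL, sample text and feature choices are placeholders and assumptions for illustration; they are not taken from the Legit-Info codebase.

```python
# pip install ibm-watson
from ibm_cloud_sdk_core.authenticators import IAMAuthenticator
from ibm_watson import NaturalLanguageUnderstandingV1
from ibm_watson.natural_language_understanding_v1 import (
    EntitiesOptions,
    Features,
    KeywordsOptions,
)

# Placeholder credentials; substitute your own IBM Cloud service credentials.
nlu = NaturalLanguageUnderstandingV1(
    version="2021-08-01",
    authenticator=IAMAuthenticator("YOUR_API_KEY"),
)
nlu.set_service_url("https://api.us-south.natural-language-understanding.watson.cloud.ibm.com")

legislation_text = (
    "An ordinance to amend the zoning code for the Riverside district, "
    "limiting new retail permits and revising school transportation funding."
)

response = nlu.analyze(
    text=legislation_text,
    features=Features(
        keywords=KeywordsOptions(limit=5),   # candidate summary and impact phrases
        entities=EntitiesOptions(limit=5),   # locations and organizations mentioned
    ),
).get_result()

for kw in response.get("keywords", []):
    print("keyword:", kw["text"])
for ent in response.get("entities", []):
    print(f"entity ({ent['type']}):", ent["text"])
```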

Why get involved in building AI solutions to fight racism?

In the case of these open-source projects, community involvement is as important as the technology itself. In a recent survey of community members, many were motivated by a desire for social good. Others were interested in networking and connecting with those sharing similar interests. The development of skills is also a big component — working with industry-leading technology and building skills that they can take into other areas of their lives.

For myself, starting as a contributor and progressing to Community Manager, I’ve experienced all these benefits, but there is another factor that is important when it comes to technology helping with social justice. After earning a post-graduate degree in Mechanical Engineering, I started my career as a product manager for AI products. One thing that has become clear as I have progressed through my career is the need to have the right people in the room when making all kinds of decisions. We need to ensure the AI systems we build are trustworthy. Beyond that, whether it’s the policies that impact communities, the products we build and how we market them or, indeed, almost any facet of our lives, we need proper representation and diversity of thought if we are to realize the dream of creating a more just society. AI has a growing role to play in the fight for social justice, but we can’t rely on it alone.

Get involved with the Call for Code for Racial Justice Projects

We are always looking for new participants in the Call for Code for Racial Justice Projects — find out more about how you can get involved.

Source: ibm.com