Tuesday, 30 November 2021

DevOps for IBM Z Firmware – with Red Hat OpenShift


Red Hat OpenShift is perhaps best known for providing a platform for developing and deploying cloud-native, microservices-based applications.

But when the IBM Z Firmware development team in Germany were looking to modernize their DevOps process, they found the ideal environment in Red Hat OpenShift running on IBM Z.

Scalability and Security

IBM Z systems sit at the heart of many of the world’s biggest companies and most critical workloads. Optimized for performance, security and reliability, IBM Z is designed to handle billions of transactions without missing a heartbeat.

The IBM Z firmware layer sits between the physical hardware and the operating system, and executes many of the low-level operations of the IBM Z system. Creating and maintaining this layer is the responsibility of the IBM Z Firmware development organization, which includes hundreds of developers.

The challenges the team faced were similar to those faced by many large development organizations – flexibility, security, and availability, especially combined with the need to scale. How could existing Jenkins setups be easily extended to add new workers? How could the existing login server be updated to support new security requirements? And how could runtime environments be designed with high availability in mind?

“The DevOps process is based around a large code pipeline, moving from source code management to binary repositories to automated testing, all managed by Jenkins scripts and workers”, commented Ralf Schaufler, IBM Z Firmware Integration Architect. “Supporting this are tools for bug tracking, access control, and backup.”

Technical Solution

The team looked at various options and decided to go with Red Hat OpenShift as this provided a secure enterprise DevOps capability, as well as a CI/CD pipeline. Although most of the IBM Z Firmware artifacts run on the IBM Z architecture (s390x), some run on x86 – and so OpenShift’s support for heterogeneous environments could offer additional benefits in the future.

The next question the team faced was whether to run OpenShift in the cloud or on-premises on Z. They determined that the cloud would be a more expensive option, especially as they have a fairly static environment of hundreds of users. In addition, running OpenShift on-prem on IBM Z enabled them to co-locate the development environment next to the test environments. This dramatically reduced the time taken to transfer IBM Z firmware images between development, simulation, and new hardware – and increased security by locating all these environments in the same protected zone with local access only.

“The first use case we implemented was to migrate their multi-user development server to ‘interactive containers’ running on Red Hat OpenShift on IBM Z”, said Edmund Breit, Senior IT Specialist, IBM Z Firmware Delivery & Support. “This enabled us to use the access control features of OpenShift and meet the IBM security requirements for developers.”

The next use case they deployed was running the Jenkins-based continuous integration and delivery (CI/CD) pipeline within Red Hat OpenShift, supporting greater scalability and enabling updates to be packaged and included in the next driver update. This simplified the pipeline automation, and can also potentially enable multi-arch support for both s390x and x86 firmware production in the future.
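One common way to pin CI workloads to a given architecture on OpenShift is a Jenkins agent pod template with a node selector. The following is a hypothetical sketch, not the team's actual configuration: the image name and labels are invented, though `kubernetes.io/arch` is the standard Kubernetes node label.

```yaml
# Hypothetical pod template for a Jenkins agent scheduled onto an s390x
# worker node in OpenShift; names and the image are illustrative only.
apiVersion: v1
kind: Pod
metadata:
  labels:
    jenkins-agent: firmware-build
spec:
  nodeSelector:
    kubernetes.io/arch: s390x        # pin the build to IBM Z nodes
  containers:
  - name: jnlp
    image: registry.example.com/jenkins-agent:latest-s390x
    resources:
      requests:
        cpu: "2"
        memory: 4Gi
```

With a template like this registered in the Jenkins Kubernetes plugin, adding workers becomes a matter of letting OpenShift schedule more agent pods rather than provisioning machines by hand.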

“We’re now looking at further use cases, including supporting virtual machines as well as containers, wider options for persistent storage, and additional CI/CD services”, added Edmund.

“The migration of the DevOps process to OpenShift on Z has proved very successful and delivered a more secure and scalable approach,” continued Ralf. “This has also been helped by the Red Hat OpenShift for Z development team being close by in the IBM Boeblingen lab.”

Source: ibm.com

Thursday, 25 November 2021

Retain clients with Trustworthy AI in wealth management


2021 has highlighted the need to accelerate the transformation of wealth management IT systems. A 2019 study from Ernst & Young showed that a third of clients planned to switch wealth management providers over the following three years. Insufficient personal attention and advisory capabilities were cited as the main reasons for customers leaving.

The traditional wealth advisor relationship must be reconsidered, and growth in self-service tools will be required to support the newer investment channels. Smarter tools are vital for wealth management companies, enabling advisors to deliver better service to their clients. Such tools are likely to use AI and machine learning to draw on customer needs, behaviours and data in order to hyper-personalize, reinvent the market, and fit investments to individual customer circumstances.

Modernization Pitfalls

Historically, attempts to modernize applications have often failed for wealth management companies. Let’s take a closer look at three main pitfalls which have hindered success along with our recommendations on how to best mitigate these risks:

No user-centric approach for AI

New systems often fail to win acceptance from advisors and customers because they are not designed around end-user needs. Consequently, AI solutions often fail to create understandable and trustworthy personalization for the customer or the advisor. The result is a lack of confidence in the financial advisor: customers look elsewhere, and advisors fall back on their traditional approach of relying on their own experience.

Technical challenges with AI applications

Infusing AI into wealth management systems presents many technical implementation challenges. Bespoke solutions are often cumbersome, difficult to understand and not repeatable, resulting in higher IT costs and longer times to implement changes. This is a major barrier to digital modernization and to the inclusion of explainable, usable AI.

Lack of end-to-end AI strategy

Getting value out of AI is not easy. According to MIT Sloan, 40% of organizations making significant investments in AI report no business gain. Often, companies focus too heavily on data scientists and clever algorithms in the belief that this alone will bring success, rather than looking at the full end-to-end process that would deliver the true outcomes they require. While wealth management organizations have long understood that AI is essential, many have not followed a proper AI framework.

Tactics to Win with AI: Best Practices

To combat a lack of trust and confidence in AI, agile methods such as IBM Design Thinking aim to center your focus on user needs. Such methods involve multiple brainstorming sessions at the beginning of the project with client advisors and customers to align AI to the main pain points and system desires for end users.

Prototyping and iterating on these ideas should follow before formulating solutions. When it comes to customer attrition, clients need a smarter system to help prioritize which customers need attention, and immediate notification when a customer is at high risk of leaving. To truly embrace AI, advisors want a smarter system they can trust – a system which produces AI output they can explain and understand.

The IBM Data Science and AI Elite (DSE) team developed machine learning models that identify customers at risk of attrition and provide insight into why. Design Thinking helps validate which functionality is most relevant and addresses the issues of operational acceptance.
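To make the attrition idea concrete, here is a deliberately simplified, rule-based stand-in for a risk model. The features, weights and threshold are invented for illustration and bear no relation to the DSE team's actual models, which use trained machine learning rather than fixed rules.

```python
# Illustrative attrition-risk scoring: a simplified stand-in for a churn
# model. Feature names, weights and the threshold are hypothetical.

def attrition_risk(days_since_contact, complaints_90d, balance_change_pct):
    """Return a risk score in [0, 1]; higher means more likely to leave."""
    score = 0.0
    score += min(days_since_contact / 180.0, 1.0) * 0.4   # neglected clients
    score += min(complaints_90d / 3.0, 1.0) * 0.35        # recent complaints
    if balance_change_pct < 0:                            # assets flowing out
        score += min(-balance_change_pct / 50.0, 1.0) * 0.25
    return round(score, 3)

def flag_high_risk(clients, threshold=0.6):
    """Return client IDs whose risk score meets the threshold, highest first."""
    scored = [(attrition_risk(**c["features"]), c["id"]) for c in clients]
    return [cid for s, cid in sorted(scored, reverse=True) if s >= threshold]
```

A system like this would feed the advisor-facing prioritization described above: a ranked list of who needs attention, with the contributing factors visible so the output stays explainable.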

Rather than adopting bespoke, non-repeatable deployment approaches for AI-based applications, IBM Cloud Pak for Data addresses these challenges by offering a single platform on which to deploy all AI applications. It offers a wide range of services, including AutoAI to automate the model building approach and Watson Studio to allow for ethical and explainable AI.


The AI Ladder (Figure 1) provides organizations with an understanding of where they are in their AI journey, as well as a framework for determining where to focus, across four key areas: how they collect data, organize data, analyze data, and ultimately infuse AI into the organization. IBM Cloud Pak for Data standardizes these processes to make AI operational and deliver business outcomes.

Transform your retention strategy to achieve Trustworthy AI:

◉ Use an agile approach, such as IBM Design Thinking, to better understand customer and user needs

◉ Embrace the AI Ladder for adoption of an end-to-end process for delivering AI applications

◉ Reduce complexity and increase repeatable AI processes by deploying applications on IBM Cloud Pak for Data

Fast-track your journey to AI with IBM Industry Accelerators


Industry Accelerators on IBM Cloud Pak for Data provide tools to help you shorten time-to-value from demonstration to implementation. Learn how these accelerators can help you expedite your business strategy by exploring the new Accelerator Catalog.

For help getting started on your data science project, let our experts assist you. The IBM Data Science and AI Elite (DSE) team works side by side with your team to co-engineer AI solutions and help your business prove value at no cost. Get the skills, methods and tools you need to overcome AI adoption challenges and solve your business problems quickly.

Source: ibm.com

Tuesday, 23 November 2021

Open innovation with IBM LinuxONE III Express


IBM LinuxONE III Express is designed to be an off-the-shelf Linux server to help get clients up and running quickly. Starting at $135,000 USD*, the single-configuration LinuxONE III Express is designed as a new cost-effective offering for popular workloads, including data serving and hybrid cloud.

Today, we’re excited to let you know that Red Hat Enterprise Linux is available to order bundled with the LinuxONE III Express hardware.

Red Hat Enterprise Linux is the intelligent operating system that forms the foundation for the open hybrid cloud and provides the tools needed to deliver critical services and workloads faster and with less effort. Red Hat Enterprise Linux complements LinuxONE to provide a fully integrated software and hardware solution.

Organizations have come to rely on IBM Z and LinuxONE systems running Red Hat Enterprise Linux for next-level data privacy, hardware and software security, scalability and resiliency, and they capitalize on this strong foundation to host their most critical workloads in hybrid multicloud environments.

LinuxONE III Express is bundled with Red Hat Enterprise Linux for IBM Z and LinuxONE with Comprehensive Add-Ons which includes:

◉ Red Hat Enterprise Linux.

◉ Red Hat Enterprise Linux High Availability Add-On for increased uptime.

◉ Red Hat Enterprise Linux Extended Update Support (EUS) Add-On for flexibility in updates and migrations.

◉ Red Hat Smart Management, for supported use cases, to optimize and manage Red Hat Enterprise Linux environments at scale.

◉ Red Hat Insights to continuously analyze platforms and applications, helping enterprises better manage hybrid cloud environments.

◉ Unlimited virtual guests.

◉ Premium support with unlimited access to Red Hat technical support and 24×7 coverage.

Available since May 25th, LinuxONE III Express features:

◉ Three sizes to fit most workloads.

◉ Improved time-to-value: a single configuration with predictable pricing ensures more rapid delivery and install of the server.

◉ Confidential computing capabilities: Ability for our clients to leverage IBM Cloud Hyper Protect Services for the highest commercial levels of privacy.

For businesses that want to take advantage of open-source technology and hybrid cloud computing solutions, LinuxONE III Express and Red Hat are a logical choice. Learn more about Red Hat, IBM Technology Support Services, and IBM LinuxONE by contacting your IBM or IBM Business Partner sales representative.

Learn more about IBM LinuxONE III Express here.

Source: ibm.com

Saturday, 20 November 2021

What’s next in AI-assisted governance, risk and compliance


“You need a technology plan that’s aligned with your risk and compliance objectives,” says Heather Gentile, Head of RegTech Offerings, Data and AI at IBM. In an episode of the executive video series “Compliance Over Coffee,” Gentile and Brian Clark, co-founder of regulatory knowledge platform Ascent, discuss the partnership between Ascent and IBM OpenPages with Watson to handle governance, risk and compliance (GRC) for clients. The two discuss the responsibilities that come with digitalization, proactive measures for compliance, the importance of trust, and what’s next in compliance trends.

Gentile points out the two sides of the trend toward total digitalization. “With ‘going digital,’” she says, “we see organizations collecting more information about their clients than ever. On the good side, you have a lot of data available for AI analysis. The challenge is, you need a data governance framework that allows for the secure collection and organization of that data so that it can be securely leveraged.”

Using the predictive capabilities of AI, organizations can get more proactive with their compliance strategies. IBM OpenPages integrates with Ascent to bring obligation data into OpenPages and help organizations look ahead, rather than simply react. “You can’t always predict where the Administration is going to go with their new legislation,” Gentile says. But there’s an opportunity to start earlier, make plans and involve stakeholders in a more collaborative approach to compliance. “The lines between first line, second line, third line are really starting to blur now,” she says. “If you can anticipate the risks accurately, you’re less inclined to have an audit issue to clean up later.”

“By combining Ascent’s knowledge with IBM OpenPages, we’ve created an integration that helps make the process of compliance more seamless, repeatable, and scalable than ever before,” says Brian Clark, President and Founder at Ascent. “Ascent’s RegulationAI solves an actual business problem, and our partnership with IBM focuses on maximizing this impact.

“From ‘Compliance Over Coffee’ to the IBM RegTech Summit, we are working together to help the market distinguish between smoke and mirrors and true value-add technology. Ultimately, we’re on a mission to help firms ‘de-risk’ their business in a cost-effective and accurate way.”

Gentile emphasizes the advantage of the IBM approach to AI, which emphasizes trust and transparency, breaking open the “black box” of AI. Many organizations are eager to adopt AI models to support business strategies, she says. Those organizations need to have control of, and insight into, how the models operate. “You can set everything up with the best of intentions from a governance perspective,” she says, “but a big piece of people being able to accept AI is through effective controls.”

To scale GRC solutions, financial services firms are looking to the cloud and hybrid cloud. That, says Gentile, is where they can leverage containerization on the Cloud Pak for Data platform. Now that IBM OpenPages is part of Cloud Pak for Data, OpenPages has a direct integration with Watson Knowledge Catalog to address data governance.

Gentile points to the EU’s GDPR regulation, and the intense work that organizations went through to comply. This work isn’t over, thanks to similar data privacy laws in states such as California, as well as potential federal regulation under the new administration. But Ascent and IBM OpenPages can make it easier. With anticipatory requirements and control suggestions from Watson Natural Language Classifier inside the OpenPages UI, based on a repository of regulatory data, clients can save time on data mapping and administration, so they can focus better on analysis.

Gentile places regulatory compliance within the IBM OpenPages paradigm of infusing AI throughout an entire organization. Clients can use OpenPages to optimize the compliance process end-to-end. “We’re seeing more and more that IT is not just a stakeholder in the risk and compliance buying decisions, but more of a decision maker and a collaborator.” That makes it important to align an organization’s GRC objectives with its technology plan. There’s a lot of work ahead, and organizations that automate, integrate and optimize end-to-end will come out ahead.

Source: ibm.com

Thursday, 18 November 2021

From research to contracts, AI is changing the legal services industry


There are many reasons a person chooses to become a lawyer or work in the legal industry, but hours of paperwork every week isn’t one of them. Often legal professionals spend a lot of time trying to find and properly classify information in their complex and siloed filing systems.

To complete the many tasks they are responsible for, paralegals, attorneys, and compliance and contract specialists need the ability to quickly identify relevant information in an overabundance of big data.

That’s where AI comes in. Natural language processing (NLP) is an AI technique that can help legal professionals quickly surface insights across millions of unstructured data sources, like printed books, legal websites, commercial databases and historical case files. By augmenting manual processes with AI, paralegals and attorneys can focus on more rewarding, higher-level tasks like working with clients.

Let’s explore a few of the major responsibilities of legal professionals and the top AI use cases in the legal field.

Legal research and drafting

While making legal decisions, attorneys and their teams spend time sorting through documents and running database searches to locate and review relevant statutes, laws and precedents. Not only is this a time-consuming and detail-oriented process, but if the correct keywords aren’t used, important resources may not even surface. Legal teams can use AI search tools like NLP to pull up information quickly, identify emerging trends and reveal hidden connections that help perform billable work faster.
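To make the contrast with plain keyword matching concrete, here is a minimal sketch of TF-IDF ranking, one of the simpler techniques underlying NLP search tools. The document snippets are invented; a production system would add synonym handling, entity recognition and citation parsing.

```python
import math
from collections import Counter

# A minimal sketch of NLP-style retrieval over unstructured legal text:
# rank documents by TF-IDF relevance to a query. Terms that appear in
# fewer documents get more weight, so distinctive words dominate.

def tokenize(text):
    return [w.strip(".,;()").lower() for w in text.split()]

def rank_documents(query, docs):
    """Rank doc ids in `docs` (id -> text) by relevance, best match first."""
    n = len(docs)
    tokenized = {doc_id: tokenize(text) for doc_id, text in docs.items()}
    df = Counter()                        # document frequency per term
    for toks in tokenized.values():
        df.update(set(toks))
    scores = {}
    for doc_id, toks in tokenized.items():
        tf = Counter(toks)                # term frequency in this document
        scores[doc_id] = sum(
            tf[t] * math.log((n + 1) / (df[t] + 1)) for t in tokenize(query)
        )
    return sorted(scores, key=scores.get, reverse=True)
```

Even this toy version surfaces the most relevant precedent first without the searcher having to guess exact keywords in advance.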

Paralegals and attorneys can also use AI to help ensure all relevant facts, laws and statutes are included in legal documentation and that tedious standard formatting rules are followed. AI solution provider LegalMation created a domain-specific model focused on legal terminology and concepts. The LegalMation platform helps legal teams craft early-phase response documentation in under two minutes.

Contract lifecycle management

Legal organizations create, update and store a large volume of contracts throughout their entire lifecycle. From initial drafting and negotiations to compliance management, maintaining these hundreds and sometimes millions of contracts — often stored across multiple repositories — represents a huge investment.

To ease the workload of contract review, legal professionals can use AI to help quickly identify and surface contracts in need of renewal before they expire. Teams can also use AI to help minimize the negotiation time frame by suggesting standard updates and renewal opportunities during protracted negotiations.
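The renewal-surfacing idea can be sketched in a few lines. The record fields and the 90-day notice window below are hypothetical, standing in for whatever a real contract repository exposes.

```python
from datetime import date, timedelta

# Sketch of renewal surfacing: scan contract records and flag any that
# expire within a notice window. Field names are hypothetical.

def contracts_needing_renewal(contracts, today, notice_days=90):
    """Return contracts expiring within notice_days of today, soonest first."""
    window = today + timedelta(days=notice_days)
    due = [c for c in contracts if today <= c["expires"] <= window]
    return sorted(due, key=lambda c: c["expires"])
```

The hard part in practice is not this filter but extracting reliable expiry dates and renewal clauses from contracts scattered across repositories, which is where NLP-based contract analysis earns its keep.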

Legal technology firm ContractPodAi offers an end-to-end contract management solution that can analyze inventories of over 400,000 contracts. Designed to help counsel easily and cost-effectively manage any contract throughout its lifecycle, the platform has helped clients achieve over 50% reductions in contract renewal time.

Client service

Law firms are experimenting with digital subscription services, which provide fast, affordable online legal services that can help reduce operational costs and empower teams to serve additional smaller clients without sacrificing quality. Teams can use AI-assisted customer service to offer clients a faster way to get common questions answered automatically. For example, teams can deploy AI-powered chatbots equipped with search capabilities that can surface relevant data, present it to customers and perform other tedious tasks, enabling legal teams to focus on higher-level work.

Affordable legal services provider QNC GmbH built its “digital law firm” subscription service Prime Legal to provide fast, affordable, flat-rate online legal services to small businesses in Germany. Lawyers can now match client questions against the Prime Legal database of 180,000 previously answered questions and typically respond to client inquiries in less than an hour.

Source: ibm.com

Tuesday, 16 November 2021

Data Fabric and the search for the single source of truth


About once a year, usually around this time and for unknown reasons, I find myself watching the movie National Treasure. I guess there’s just something about watching Nicolas Cage connecting the dots to find hidden treasure that I can’t pass up when I see it on TV. I know I’m not alone in this; whether it’s Nic stealing the Declaration of Independence or Indiana Jones seeking out the Holy Grail, putting together clues to complete a quest is a time-honored tale. For decades, businesses and Chief Data Officers (once that position was created) have been seeking their own holy grail: a single source of truth for data, one that’s easily accessible, responsibly governed, works with current systems, integrates across a disparate data estate, and isn’t too costly. As they learn more about the data fabric, it appears to be the perfect protagonist. And unsurprisingly, connecting data plays a crucial role in obtaining the treasure being sought.


The story so far


During Think 2021, IBM announced the launch of a modular, composable data fabric that enables dynamic and intelligent data orchestration across a distributed landscape, creating a network of instantly available information for data consumers. Its self-service consumption capabilities give users a complete view of their data, connecting sources no matter where the data resides or how siloed it had previously been. This optimized data access means businesses can immediately reduce the data duplication and migration processes required. Moreover, the data can be queried where it resides, so businesses get faster results and fresher data. The addition of Watson Query (formerly AutoSQL) provides access to databases, data warehouses, data lakes, and streaming data, with queries executed without additional manual changes or data movement. And because all of the data is visible and accessible from a single point, automated data cataloging and the enforcement of data governance policies become much easier: businesses no longer need to apply these crucial components across myriad individual silos.

Are you ready for the data fabric sequel? 


Today, just over six months on from that initial announcement, IBM is unveiling new data fabric capabilities that further connect data and make it readily available for use even in the most stringent regulatory environments. Foremost is the inclusion of distributed data processing. Clients can now execute cloud runtimes remotely using IBM Cloud Satellite, which means workloads can be executed wherever the data resides. Because runtimes execute in place, data movement needs are further reduced, helping to save up to 47% in data egress costs, eliminating the need to use different tools on different workloads, and maintaining data sovereignty by allowing data to remain in the geographic area where it was created. Co-locating the workload with the data has also delivered performance improvements of up to 195%.

Advanced Data Privacy features are also being introduced into the data fabric. In addition to dynamic masking of structured data, masking of unstructured data can now be applied automatically and consistently, as opposed to the typical manual process. Statically masked copies of structured or unstructured data can be sent to clients’ desired target data sources. This capability is particularly important for creating anonymized training data and test data sets. In other words, it provides one more way in which the data fabric allows businesses to take full advantage of their data while respecting their customers’ privacy and local regulations.
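A toy illustration of the two masking styles described above: deterministic masking of structured fields (the same input always yields the same pseudonym, so joins and analytics still work) and pattern-based masking of unstructured text. The hashing scheme, salt and regex are stand-ins, not how the product works.

```python
import hashlib
import re

# Deterministic masking of structured fields plus pattern-based masking of
# free text. The salt and patterns are illustrative only; a real deployment
# manages keys securely and detects far more entity types.

SALT = b"demo-salt"  # hypothetical; never hard-code a production salt

def mask_value(value):
    """Deterministically pseudonymize a structured field value."""
    digest = hashlib.sha256(SALT + value.encode()).hexdigest()
    return "MASK_" + digest[:8]

EMAIL_RE = re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+")

def mask_text(text):
    """Redact email addresses from unstructured text."""
    return EMAIL_RE.sub("[EMAIL]", text)
```

Because `mask_value` is consistent, a masked copy of a dataset can still be joined on the masked keys, which is exactly what makes such copies usable as anonymized training or test data.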

Time for your close-up


While we believe in the value of a robust data infrastructure for every business, we also believe that each organization has unique challenges that differentiate their implementation from anyone else’s. While you’re considering how a data fabric can help you obtain the data-driven utopia you’ve been pursuing, let us help by sharing our expertise.

Source: ibm.com

Tuesday, 9 November 2021

With AI, capital markets firms will spend less time wrangling data and more time serving clients


Institutional firms invest heavily in technology that helps employees quickly process information and share insights with clients. Over the past decade, these firms and fintech companies have competed for market share by delivering the most client-centric financial services through innovative means. As a result, organizations that were previously less interested in taking full advantage of artificial intelligence (AI) and machine learning solutions are reconsidering their approach to data and analytics, accelerating AI adoption, and strengthening their technology partnerships.

Unfortunately, finding and processing data is a challenge for capital markets organizations.

◉ 46% of financial executives say they and their teams are unable to fully execute their responsibilities. 

◉ 49% say that this is due to “manual, time-consuming processes.”

◉ 21% attribute this to an “inability to readily access required data.”

Financial services providers can use AI implementations across many processes to conquer these challenges with automation.

Finding relevant information in the era of big data

Big data enables capital markets firms to help employees make better financial decisions on behalf of clients. But finding relevant data points in hundreds of documents is time-consuming, since much of it is unstructured data residing in formats that are difficult to analyze using traditional analytics tools.

Sound financial decision-making is informed by the latest reports on market segments and client data. With AI-powered tools like natural language processing (NLP), employees can quickly find insights in unstructured data by using AI to recognize patterns across millions of documents. This approach can also surface meaningful insights that might have previously gone unnoticed.

NLP enabled fintech company EquBot and ETF Managers Group to build the AIEQ, the world’s first AI-powered equity ETF (exchange-traded fund). AIEQ collects and parses data on over 6,000 US companies each day, including unstructured data stored in formats that are difficult for analysts to inspect quickly. This includes posts on blogs and social media, like the ones that drove up video game store GameStop’s stock price in early 2021.
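The kind of unstructured social-media signal described above can be illustrated with a toy cashtag counter. The posts, tickers and mention threshold are invented; real systems parse millions of documents with far richer NLP.

```python
import re
from collections import Counter

# Toy scan of social posts for unusually heavy mention of a ticker symbol,
# the kind of unstructured signal described in the article.

TICKER_RE = re.compile(r"\$([A-Z]{1,5})\b")

def mention_counts(posts):
    """Count cashtag mentions (e.g. $GME) across a list of posts."""
    counts = Counter()
    for post in posts:
        counts.update(TICKER_RE.findall(post))
    return counts

def trending(posts, min_mentions=3):
    """Tickers mentioned at least min_mentions times, most-mentioned first."""
    return [t for t, n in mention_counts(posts).most_common() if n >= min_mentions]
```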

Managing regulations

Working in such a heavily regulated industry, risk and compliance professionals need to adapt to changes that impact their investments in real time. But the pace of change makes it challenging to maintain compliance.

Financial institutions can use AI to help comply with ever-changing regulatory frameworks by automating document processing for regulatory notices, consultation papers, policy statements, and more. Employees can use AI solutions to scan these documents and produce a consolidated view of rules applicable to their firm, depending on specific characteristics.
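As an illustration of the consolidated-view idea (a sketch, not how any IBM product implements it), one could tag each notice with rule categories by keyword and then filter to the categories in a firm's profile. The categories, keywords and notices below are all hypothetical.

```python
# Simplified sketch of consolidating regulatory text into a per-firm view:
# tag each notice with rule categories by keyword, then keep only notices
# matching the categories the firm has declared applicable.

CATEGORY_KEYWORDS = {
    "aml": ["money laundering", "suspicious activity"],
    "privacy": ["personal data", "consent"],
    "reporting": ["quarterly filing", "disclosure"],
}

def categorize(notice_text):
    """Return the set of categories whose keywords appear in the notice."""
    text = notice_text.lower()
    return {
        cat for cat, words in CATEGORY_KEYWORDS.items()
        if any(w in text for w in words)
    }

def applicable_rules(notices, firm_categories):
    """Consolidated view: ids of notices relevant to this firm's profile."""
    return [
        nid for nid, text in notices
        if categorize(text) & firm_categories
    ]
```

A production system replaces the keyword lists with trained classifiers, but the output is the same shape: one consolidated, firm-specific view instead of a pile of undifferentiated notices.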


Buy-side opportunities

On the buy-side of capital markets, organizations can integrate financial services AI across research workflows, using the solution to help scan for investment opportunities, analyze sales-side research and reports, process confidential client information, and identify actionable client insights. From there, analysts can use AI recommendations when drafting research for portfolio managers, suggesting investment strategies, and streamlining meetings and quarterly reviews. Firms can also use AI to help anticipate the needs and behaviors of their customers and use those insights to provide customers with automated, client-centric customer service experiences.

Sell-side opportunities

To help monitor markets, firms can use AI to automate the tracking of companies of interest and financial news feeds. With machine learning algorithms, analysts can look at the unstructured data of a potential investment (founders’ backgrounds, total money raised and past deals) and synthesize that information to gauge the potential ROI of an opportunity. Employees can also use AI to assist deal idea generation by creating a list of potential private equity firms to work with or by analyzing past private equity firm deals. Teams can likewise use AI to gather data about companies in the research phase, and AI can even augment pitch development.

Source: ibm.com

Monday, 8 November 2021

Accelerating Modernization on IBM Power with Open Source Software and CI/CD Tooling


Much as Linux and open source transformed how operating systems were developed, open source software (OSS) is transforming the way applications are developed. Clients and ISVs (Independent Software Vendors) are increasingly embracing OSS. According to the 2021 Red Hat State of Enterprise Open Source report, 90% of IT leaders are using enterprise open source. At IBM, OSS forms the foundation for our Red Hat synergy, Cloud Paks, and the ISV enablement that Power customers can rely on as they accelerate the modernization of their mission critical workloads.

To stay competitive, customers are embarking on workload and infrastructure modernization which is designed to help their enterprise systems keep pace with today’s speed of business. We see that many of these customers are shifting to an open hybrid cloud and are adopting flexible tools for a modern DevOps environment. Architecture-independent developer tools can simplify developer workflows while increasing collaboration, productivity, and visibility by unifying enterprise-wide DevOps regardless of the underlying platform.

With the GitLab Ultimate 14.3 release, IBM Power customers gain a modern DevOps platform delivering an end-to-end solution for collaboration, visibility, and development. The GitLab Runner and its operator, available via the OpenShift Operator Hub (with support for Power expected at the end of October), extend GitLab’s multi-architecture CI/CD capabilities and bring native build and validation of high-value workloads to IBM Power.
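As a sketch of how multi-architecture CI can be routed in GitLab, jobs can be tagged so that architecture-specific Runners pick them up. The tag names and `make` targets below are a site convention for illustration, not something built into GitLab.

```yaml
# Illustrative .gitlab-ci.yml: route build jobs to architecture-specific
# GitLab Runners via tags. Tag names are a site convention.
stages:
  - build

build-s390x:
  stage: build
  tags: [s390x]          # picked up by a Runner registered on IBM Z
  script:
    - make build

build-ppc64le:
  stage: build
  tags: [ppc64le]        # picked up by a Runner registered on IBM Power
  script:
    - make build
```

The same pipeline definition then builds and validates the workload natively on each architecture, which is what the Runner support for Power enables.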

Nima Badiey, VP of Alliances at GitLab shared that “GitLab, the DevOps platform, is designed to accelerate deployment of critical workloads on IBM Power and Red Hat OpenShift. As businesses look to accelerate their digital transformation initiatives with hybrid and multi-cloud deployments, our customers can leverage the latest GitLab 14.3 release including the GitLab Runner for IBM Power and the new Power10 processor, to develop, secure, and operate software in a single application which can lead to faster pipeline performance and increased collaboration between developers and security teams.”

To further integrate Git repositories into Kubernetes deployment paradigms, OpenShift offers fully supported CI/CD with OpenShift Pipelines, based on the open source Tekton project. OpenShift Pipelines is fully integrated with GitLab to deliver a seamless experience for both ISVs and end customers. Other tooling ISVs focused on CI/CD include Travis CI and Jenkins. Jenkins is used by developers mostly as a continuous integration engine, but it also facilitates continuous build and deployment as well as infrastructure deployment. In keeping with the open source spirit, OpenShift Pipelines can be used as an alternative to or alongside Jenkins deployments. Jenkins and Tekton are both Continuous Delivery Foundation projects.
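To give a flavor of what an OpenShift Pipelines definition looks like, here is a minimal Tekton `Task` sketch; the task name, image, and step contents are illustrative assumptions, not taken from the article.

```yaml
# Hypothetical Tekton Task: a single containerized build step.
apiVersion: tekton.dev/v1beta1
kind: Task
metadata:
  name: echo-build
spec:
  steps:
    - name: build
      image: registry.access.redhat.com/ubi8/ubi-minimal
      script: |
        #!/bin/sh
        echo "building..."
```

Tasks like this are composed into a `Pipeline`, which OpenShift Pipelines runs natively as Kubernetes resources, without a long-lived CI server.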

Travis CI, which announced support for Power9 in 2019, is a continuous integration platform for building, testing, and deploying open source software, and it supports open source projects hosted on GitHub, GitLab, and Bitbucket. Travis CI is designed to make it simple to run a build against multiple CPU architectures by adding a single line to the configuration file.
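That one-line mechanism is the `arch` key in `.travis.yml`. A hypothetical fragment fanning the same build out across x86 and Power:

```yaml
# Hypothetical .travis.yml: the `arch` key runs the build matrix
# on each listed CPU architecture.
language: python
arch:
  - amd64
  - ppc64le
script:
  - python -m pytest
```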

Choosing the right tooling solution influences DevOps flexibility, maturity, automation, and transformation, helping customers develop and deploy applications to the cloud environment of their choosing. Along with our ISVs like GitLab, IBM Power and Red Hat OpenShift can help customers and ISVs accelerate their modernization journeys by adopting tooling designed to support important missions, like an accelerated time to market through the continuous delivery of an automated and efficient CI/CD pipeline.

Source: ibm.com

Sunday, 7 November 2021

IBM Z and the Open Neural Network Exchange

IBM Z, Open Neural Network Exchange, IBM Exam Prep, IBM Tutorial and Material, IBM Certification, IBM Career, IBM Skills

Nearly everyone recognizes the profound opportunity to bring new insights and better decisions to business workloads using AI and analytics. Enabling AI on IBM Z and LinuxONE is a key focus for IBM, allowing clients to have a reliable, secured, and high-performing environment for delivering critical business insights using machine learning and deep learning applications.

However, with this opportunity there are also challenges, especially around deploying AI in a production environment. The use of AI in critical business workloads is a growing space, and as with other new technologies, the path to production can be challenging. Key challenges include the need to deploy data science assets in a consistent, repeatable manner without sacrificing production qualities of service (e.g., meeting response-time goals).

That is where the Open Neural Network Exchange (ONNX) comes in. ONNX is an open-source format used to represent machine learning models and is one of the key ecosystem technologies enabling a “Build and Train Anywhere, Deploy on IBM Z” strategy. ONNX helps establish a streamlined path to take a project from inception to production. Models represented in the standard ONNX format can then be executed by an ONNX backend (i.e., a runtime or model compiler), such as on IBM Z.

This journey to production starts with the data scientist, who may use a preferred set of tools to understand a business problem and analyze data. When that data scientist creates and trains a model, they build assets that ultimately need to be deployed in production. Often, however, the deployment platform and production requirements aren’t considered heavily in these early stages. This is where utilizing ONNX in a deployment strategy really shines. Many of the most popular libraries and frameworks, including PyTorch and TensorFlow, support the ability to export or convert a trained model to an ONNX format.

Once a model has an ONNX representation, it can be deployed to run on any platform with an ONNX runtime. This provides several key benefits: the model is now portable, with no runtime dependencies on the libraries or framework it was trained on. For example, an ONNX model that was originally created and trained in TensorFlow can be served without the TensorFlow runtime. Additionally, ONNX allows vendors to create high performing model backends that can optimize and accelerate the model for a specific architecture.

For IBM Z and the mission critical workloads it typically hosts, this combination of portability and optimization makes IBM Z an optimal environment for deploying models. One key example of the use of ONNX is in Watson Machine Learning for z/OS (WMLz), which incorporates an ONNX model compiler technology based on the ONNX-MLIR project. The ONNX model compiler feature of WMLz is focused on deep learning models and produces an executable optimized to run on IBM Z. WMLz allows the user to easily deploy this compiled ONNX model for model serving.

As IBM Z continues to innovate in enterprise AI, ONNX is a key part of IBM’s AI strategy. It allows IBM to build a deployment strategy optimized for the IBM Z architecture, while staying closely aligned with the broader ecosystem.


In August, you may have read that IBM previewed Telum, the next generation IBM Z processor. IBM is now examining opportunities to exploit the on-chip AI accelerator with the ONNX model compiler.

ONNX is part of the Linux Foundation and has widespread support from numerous key vendors that recognize the value it delivers. IBM is an early adopter of ONNX and contributes upstream to the ONNX project.

Be on the lookout for additional updates on how you can leverage ONNX as part of your IBM Z AI story!

Source: ibm.com

Saturday, 6 November 2021

Securing the open source software supply chain

IBM Exam, IBM, IBM Exam Study, IBM Tutorial and Material, IBM Guides, IBM Career, IBM Jobs, IBM Skills

Cybersecurity incidents are among the greatest threats facing organizations today. In the wake of recent high-profile software supply chain attacks, the US Federal government has taken bold action to strengthen the country’s cyber resilience. On 12 May 2021, President Biden issued a widely anticipated Executive Order on Improving the Nation’s Cybersecurity, which calls for stringent new security guidelines for software sold to the federal government, and has wide-ranging implications that will ripple across the entire software market.

Despite the troubling frequency of malicious attacks, most organizations still have only a partial view of the make-up of their software applications. This partial knowledge leaves them exposed to unknown software component vulnerabilities and hampers any response efforts.

Anaconda asked about open source security in our 2021 State of Data Science survey, and the results were surprising:

◉ 87% of respondents said they use open source software in their organization.

◉ 25% are not securing their open source pipeline.

◉ 20% did not report any knowledge about open source package security.

We also found that among organizations not using open source software today, the most common barrier to entry is security concerns, including fear of common vulnerabilities and exposures (CVEs), potential exposures, or risks. It’s no secret that open source software is key to accelerating the development of new business ideas, not only by saving time but by allowing greater collaboration and assembling more minds to solve some of the world’s toughest challenges. With the increased visibility and involvement from third parties, however, these benefits come with exposure to potential risk. IT departments need solutions that support innovation but also provide governance to mitigate the damage from any attack or exposure.

Providing security and trust in open source

CVE matching and remediation information enables an organization to build a secure supply chain tailored to their unique needs and policies. For example, one foundational cybersecurity practice is to consult CVE databases and scores regularly to guard against the risk of using vulnerable packages and binaries in applications. Anaconda Repository for IBM Cloud Pak® for Data automates this process by allowing IT security administrators to filter access to packages and files against a curated database of known vulnerabilities. This effort-saving feature frees developers and data science teams to focus on building models.
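The filtering idea can be sketched in a few lines. This is a hypothetical illustration of CVE-based package gating, not Anaconda’s actual implementation; the package names and CVE entries are made up.

```python
# Hypothetical curated database mapping (name, version) to known CVEs.
KNOWN_CVES = {
    ("examplepkg", "1.0.2"): ["CVE-2021-0001"],  # illustrative entry only
}

def is_allowed(name, version, safelist=frozenset()):
    """Allow a package unless it has a known CVE and is not safelisted."""
    if name in safelist:
        return True  # admin override: explicitly trusted package
    return (name, version) not in KNOWN_CVES

print(is_allowed("examplepkg", "1.0.2"))  # vulnerable version -> False
print(is_allowed("examplepkg", "1.0.3"))  # no known CVE -> True
```

An administrator policy layer like this lets developers pull packages freely while vulnerable versions are blocked automatically.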

Collaborating to confront risks head-on

The Executive Order includes many additional steps to improve cybersecurity, such as providing a software bill of materials (SBOM) that enables potential software consumers to know exactly how something is developed. These additional steps are essential for mitigating the many malicious cyber campaigns aimed at gathering critical information and disrupting operations across the nation. As society continues to become more and more technologically driven, vulnerabilities are inevitable. However, a spirit of transparency and collaboration—when combined with the right tools—will help enterprises guard against potential breaches and hacks to their systems, so they can continue to innovate and safely collaborate in the open source ecosystem.
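For concreteness, an SBOM is commonly expressed in a standard format such as CycloneDX; a minimal, illustrative fragment listing a single component might look like the following (the package itself is hypothetical).

```json
{
  "bomFormat": "CycloneDX",
  "specVersion": "1.4",
  "version": 1,
  "components": [
    {
      "type": "library",
      "name": "examplepkg",
      "version": "1.0.2",
      "purl": "pkg:pypi/examplepkg@1.0.2"
    }
  ]
}
```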


Anaconda Repository for IBM Cloud Pak for Data helps organizations identify vulnerabilities and enables greater control over open source packages in use by allowing admins to block or safelist packages based on IT policies and CVE scores.

Source: ibm.com

Tuesday, 2 November 2021

BNZ protects customers (and their experience) with IBM Safer Payments

IBM Exam Prep, IBM Tutorial and Material, IBM Career, IBM Certification, IBM Learning, IBM Skills, IBM Jobs, IBM Safer Payments

Bank of New Zealand (BNZ), one of the leading banks in the ANZ region, announced in 2018 that it had selected IBM Safer Payments to deliver cross-channel fraud protection to its customers. The multi-million-dollar deal supports BNZ’s efforts to provide a frictionless and safer payments experience for its customers.

Growing fraud requires a new approach

Many conveniences that customers enjoy as a result of modern banking carry an increased risk of fraud. Global card fraud losses are on the rise—from 2016 to 2025, they are projected to nearly double, climbing from US $22.8 billion to nearly $50 billion. Mobile banking is particularly vulnerable as 65% of fraudulent transactions were perpetrated through a mobile browser or mobile app according to a recent study.

Legacy systems were designed to see and stop easily recognizable, repetitive fraud patterns. However, modern “anytime, anywhere” banking on mobile devices has made fraud detection much more challenging. Banks’ time to respond is also shrinking: with real-time payments, there are just milliseconds to detect and prevent theft before a transaction completes.

Protecting customers and the customer experience

BNZ understands the importance of providing solutions that meet its customers’ expectations for convenience while also ensuring that state-of-the-art security is in place, capable of withstanding increasingly sophisticated cyber-attacks and large-scale fraud breaches. To do so, BNZ selected IBM Safer Payments, a modern real-time fraud detection solution that allows banks to intercept fraudulent activity before it happens, while ensuring customers’ genuine transactions are not stopped in error. IBM Safer Payments uses machine learning and artificial intelligence to analyze behavior and fraud patterns, build and adapt models to deter emerging fraud threats, and recommend countermeasure responses.
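Behavior-based scoring can be illustrated with a vastly simplified sketch, which is emphatically not IBM Safer Payments’ actual algorithm: flag transactions whose amount deviates sharply from the customer’s own history.

```python
from statistics import mean, pstdev

def fraud_score(history, amount):
    """Return a z-score-like deviation of `amount` from past transactions.

    Hypothetical illustration only: real systems combine many behavioral
    and contextual signals, not just transaction amounts.
    """
    if len(history) < 2:
        return 0.0  # not enough history to judge
    mu, sigma = mean(history), pstdev(history)
    if sigma == 0:
        return 0.0 if amount == mu else float("inf")
    return abs(amount - mu) / sigma

past = [42.0, 38.5, 45.0, 40.0]
print(fraud_score(past, 41.0) < 3)    # typical amount -> not flagged
print(fraud_score(past, 5000.0) > 3)  # extreme outlier -> flagged
```

A real deployment would evaluate such a score in milliseconds per transaction and escalate only high-scoring events for intervention.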


“We are ruthlessly vigilant in protecting our customers’ trust in us, and we put security front and center, so they can be sure their money and personal information is well-protected. With IBM Safer Payments, we are stepping up this protection, analyzing every transaction in real time, but without sacrificing the customer experience. Everything we do to protect our customers from fraud and cybercrime also helps us contribute to upholding New Zealand’s excellent e-commerce and trading reputation globally.” – Owen Loeffellechner, Chief Safety and Security Officer, BNZ

BNZ looks beyond just transactions for greater context and accuracy

IBM Safer Payments uses both financial and non-financial data together with a customer’s transaction history, to perform rigorous authentication and profiling on each and every transaction. Potentially fraudulent transactions are quickly identified – allowing them to be stopped or put on hold pending further validation. The solution also complies with all credit card scheme rules. “Banks are facing the challenge of needing to adapt to meet their customers’ evolving expectations for a frictionless transaction, while also ensuring their security,” said Mike Smith, Managing Director of IBM New Zealand. “With financial crime becoming increasingly sophisticated, BNZ partnered with IBM to address the rising threat of crime and fraud while still enabling top quality experiences for customers and allowing for future growth.”

With the implementation of IBM Safer Payments, BNZ’s 1.2 million customers are enjoying heightened security.

Source: ibm.com