Saturday, 28 November 2020

Mainframe use is on the rise—driven by security and compliance requirements


Is mainframe usage on the decline? That’s what we wanted to find out in Deloitte’s 2020 Mainframe Market Pulse Survey. From our work with IBM in organizations across industries, we were confident that the survey would show mainframes are still prevalent in the corporate world. And, in fact, the results showed that mainframe usage is rising.

Even more eye-opening? The strength of support for mainframes and active plans to expand their usage. Ninety-one percent of respondents identified expanding their mainframe footprints as a moderate or critical priority in the next 12 months. Seventy-four percent say they “believe the mainframe has long-term viability as a strategic platform for their organizations.” And 72 percent are planning upgrades to their mainframes in the next three years, to address expected increases in usage across three key areas: security (69%), storage (62%) and software (61%).

The value starts with security

What’s driving the robust and growing usage of mainframe computing? There are lots of reasons, but security topped the list in our survey results. This should come as little surprise. Mainframes offer unparalleled protection, and that’s a big deal at a time when a data breach can not only devastate a company’s brand but can also put it in breach of rising regulatory requirements regarding data security.

Don’t get me wrong—security has always been a top-tier issue, but it’s even more critical today, in an environment where information is used and shared broadly across employees, partners, regulators, and more, through many different channels and clouds. A quarter of our survey respondents pointed to “maintaining compliance” as a top priority for their IT organizations.

For example, imagine an airplane manufacturer operating at the center of a sprawling network of maintenance teams all over the world. Information sharing between these groups is constant and, of course, highly sensitive. If an engine shows warning signs of failure, maintenance teams and the manufacturer need to share volumes of data with one another, securely and remotely. But how secure is that information–wherever it is? The answer has serious implications for passenger safety, as well as for the manufacturers of these expensive machines and the airlines that rely on them to serve the travel needs of millions of passengers.

What happens once sensitive data is in transit?

Even mainframes have their security limits in today’s environment, where data is in transit (leaving the mainframe or in the process of entering it) constantly. Meanwhile, regulators are steadily tightening their requirements regarding the security of in-transit data, stipulating that organizations are still responsible for what happens to data after it “leaves” their systems. Which means organizations need the ability to protect the data even after it departs the mainframe.

This is a big deal. And it’s why pervasive encryption has been deployed so broadly in recent years. With pervasive encryption, data at rest and in transit can be encrypted at scale, simplifying encryption and reducing costs associated with protecting data and achieving compliance mandates.

But even pervasive encryption only goes so far. Looking to the future of hybrid cloud security, IBM is extending data protection beyond data at rest and in transit to cover data in use with confidential computing capabilities.

Beyond pervasive encryption

Pervasive encryption is a powerful tool that reduces the cost and vulnerability of standard encryption, is more effective at preventing incursions, requires less effort to secure, and is more cost effective than alternative solutions. But it still may not be enough—not when users with keys can be unreliable. They’re human, after all. What happens when they switch jobs? Or leave the organization? Or prove to be untrustworthy?

IBM Data Privacy Passports on IBM z15 give companies maximum control over data wherever it travels after it leaves the system of record—throughout the enterprise and into the hybrid cloud—allowing them to manage how data is shared on a need-to-know basis, with centralized control over user access policies. Just as important, with Data Privacy Passports, companies can document their ability to control and track data in order to ease compliance. If regulators want to know who had access to encrypted data, when they had access, and when access was revoked, it’s all managed and tracked in Data Privacy Passports.
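The control model described here — centrally managed, need-to-know access policies with revocation and a full audit trail — can be illustrated with a minimal sketch in Python. This is a conceptual illustration only, not the Data Privacy Passports API; every class and method name below is hypothetical.

```python
import datetime

class PolicyStore:
    """Toy central policy store: who may read which data field,
    with an audit trail. Conceptual sketch only -- not the actual
    IBM Data Privacy Passports interface."""

    def __init__(self):
        self.grants = {}   # (user, field) -> currently granted?
        self.audit = []    # (timestamp, user, field, action)

    def _log(self, user, field, action):
        ts = datetime.datetime.now(datetime.timezone.utc).isoformat()
        self.audit.append((ts, user, field, action))

    def grant(self, user, field):
        self.grants[(user, field)] = True
        self._log(user, field, "grant")

    def revoke(self, user, field):
        self.grants[(user, field)] = False
        self._log(user, field, "revoke")

    def can_read(self, user, field):
        allowed = self.grants.get((user, field), False)
        self._log(user, field, "read-allowed" if allowed else "read-denied")
        return allowed

store = PolicyStore()
store.grant("maintenance-team", "engine-telemetry")
print(store.can_read("maintenance-team", "engine-telemetry"))  # True
store.revoke("maintenance-team", "engine-telemetry")
print(store.can_read("maintenance-team", "engine-telemetry"))  # False
```

The point of the sketch is the audit trail: every grant, revocation and access check is recorded centrally, which is what lets an organization answer a regulator’s "who had access, and when was it revoked?" question.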

That airplane manufacturer working with maintenance teams all over the world? It can put Data Privacy Passports to work in further securing its most sensitive product data, in conjunction with pervasive encryption and blockchain. No matter where the data goes, the manufacturer can track it. The data is never unencrypted, and future access can be revoked at any time.

Mainframes give you more control over your sensitive data

Today, your ability to confidently share and control data outside the mainframe adds up to a big strategic advantage. It can help your organization move faster, collaborate more effectively, and ease the challenges of compliance. So if you’re one of the majority of IT or business leaders who have plans to expand your mainframe usage, or just one of the many who are working to get more from your technology investments, this is the moment to take advantage of major advances in mainframe security enablers.

Regardless of the ebb and flow of their marketplace popularity over the years, mainframes have always been recognized for their security advantages. At a time when organizations are sharing more information than ever, in more types of ways, it’s possible to secure mainframe-based data even further—wherever it goes. That’s powerful. It can lead to stronger outcomes across your organization, not just in IT. That level of control and security is available today. Don’t sit this one out.

Source: ibm.com

Thursday, 26 November 2020

Hybrid cloud innovation with IBM Power Systems


As we navigate the global pandemic and economic disruption, nearly every organization is pursuing a digitally driven business model. The need to scale while freeing up cash flow is now driving hybrid cloud adoption.

However, scaling digital workflows across multiple clouds that run on multi-architecture environments requires a hybrid cloud platform that can provide consistent capabilities across different types of clouds and entire IT infrastructures.

IBM’s hybrid cloud platform with Red Hat solutions at its core is designed to enable organizations to accelerate innovation with automation and modern apps that bring out the best of what multi-architecture, multicloud infrastructure can provide. A pragmatic hybrid cloud strategy requires an innovative, low-risk application modernization approach that extends core business applications in VMs with co-resident Red Hat OpenShift-enabled containerized workloads.

To that end, I am excited to announce the availability of Red Hat OpenShift 4.6 across IBM Power Systems, IBM Z, and IBM LinuxONE infrastructure alongside x86. The support for Red Hat OpenShift on IBM Power Systems Virtual Server co-located with IBM Cloud is also coming in December 2020. With Red Hat OpenShift available across these environments, customers get the flexibility to modernize and deploy their applications where it makes sense.

Where do you deploy?


IBM Power Systems is purpose-built to deliver secured and reliable cloud infrastructure that efficiently scales mission-critical and data-rich applications. These are applications that cannot go down and need to scale on demand. Power Systems delivers industry-leading scalability for both VM-based and containerized applications.

Figure: IBM’s hybrid cloud platform

Case in point: among the platforms certified for Red Hat Enterprise Linux, Power has the highest memory and processor scalability (64 TB of memory and 1,536 logical CPUs). With Red Hat OpenShift on IBM Power, which is built on highly scalable RHEL, Power can pack up to 3.2x more containers per core than comparable x86 systems. Customers can now co-locate their containerized applications alongside their AIX and IBM i environments in an on-premises private cloud on Power Systems. We are also working on delivering support for Red Hat OpenShift on IBM Power Systems Virtual Server co-located with IBM Cloud.

IBM hybrid cloud platform


Red Hat’s solutions are the foundation of IBM’s hybrid cloud platform. The recent Forrester Wave report on multicloud container development platforms identified Red Hat + IBM as a Leader. Red Hat OpenShift, the industry’s leading enterprise Kubernetes platform, enables developers to modernize apps, simplifies distributed infrastructure operations across the hybrid cloud and expands enterprise value with a rich app and partner ecosystem.

Many customers on Power run their business-critical applications on AIX and IBM i environments. They can establish consistent, enterprise-wide automation of hybrid cloud operations across AIX, IBM i and Linux environments with Red Hat Ansible Automation Platform. This can help customers improve IT admin productivity, leveraging their existing Ansible skills to automate these environments. In the last five months, Ansible community content has been downloaded more than 2,500 times!

Innovative and low-risk application modernization approach 


As we go up the stack, the cloud-native or containerized software layer delivers critical components to modernize the applications for hybrid cloud. IBM Cloud Paks are designed to provide enterprise-ready containerized software solutions for modernizing existing applications and developing new cloud-native apps that run on Red Hat OpenShift.

When you modernize enterprise applications, you can ease the transition to a hybrid cloud environment by gaining the flexibility to run apps wherever you want, whenever you want. Modernizing on IBM Power Systems enables new cloud-native microservices to coexist and connect with existing enterprise applications running on AIX, IBM i and Red Hat Enterprise Linux on Power, while still leveraging the inherent scalability, reliability and security benefits of the Power platform.

You can thus address barriers to productivity and integration to create new user experiences, develop new applications and ultimately unlock new business opportunities. However, the key to a low-risk approach is to modernize incrementally. I encourage you to take a look at this ebook that lays out an innovative and low-risk application modernization approach on IBM Power Systems.

Statements regarding IBM’s future direction and intent are subject to change or withdrawal without notice and represent goals and objectives only.

Source: ibm.com

Wednesday, 25 November 2020

Gain insights into an ocean of documents with deep search


Do you have vast amounts of digital documents (such as PDF files, patents and corporate documents) piled up in your organization that are humanly impossible to read and digest? Wouldn’t it be nice if you could query against these document piles with questions like “List all the materials claimed by company X in the US patent office”? Deep search is an IBM Research® service that automatically analyzes enormous digital libraries and facilitates discovering unknown facts. It implements an AI-based approach to enable intelligent querying against document repositories. This capability has been demonstrated to aid innovation across various industries such as material sciences, insurance and drug discovery.

IBM deep search service


How does deep search work? Initially, as shown in figure 1, the digital documents are segmented into multiple components (heading, introduction, references and so on) using machine learning models and converted into structured data representations (such as HTML or JSON). These supervised learning models are customizable and highly accurate, making use of huge data sets and modern neural network topologies.


The second step of deep search involves using the existing data sources (corporate databases, publicly available data sets and the like) to identify the concepts (such as alloy, material) and relationships that are relevant to the context of knowledge discovery. Finally, a searchable and queryable knowledge base is built by linking the structured data formats of documents to the identified concepts and relationships.
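The two-step pipeline described above — segment documents into structured representations, then link the segments to known concepts — can be sketched with a toy example. Deep search uses trained machine learning models for segmentation; here simple heuristics stand in for them, and all function names and the heading rule are illustrative assumptions.

```python
import json
import re

def segment(document: str) -> list:
    """Toy stand-in for the ML segmentation step: split a plain-text
    document into (heading, body) components."""
    sections, current = [], {"heading": "preamble", "body": []}
    for line in document.splitlines():
        if re.match(r"^[A-Z][A-Za-z ]+:$", line):   # crude heading heuristic
            sections.append(current)
            current = {"heading": line.rstrip(":"), "body": []}
        elif line.strip():
            current["body"].append(line.strip())
    sections.append(current)
    return sections

def link_concepts(sections: list, vocabulary: set) -> list:
    """Toy stand-in for concept linking: tag each section with the known
    concepts (e.g. 'alloy', 'material') that its body mentions."""
    for s in sections:
        text = " ".join(s["body"]).lower()
        s["concepts"] = sorted(c for c in vocabulary if c in text)
    return sections

doc = ("Abstract:\nA new nickel alloy is described.\n"
       "Claims:\nThe material resists corrosion.")
kb = link_concepts(segment(doc), vocabulary={"alloy", "material", "corrosion"})
print(json.dumps(kb, indent=2))
```

The resulting JSON records, linked to a shared vocabulary, are what make the repository queryable: a question like "list all materials claimed" becomes a filter over sections tagged with the relevant concepts.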

Deep search in action


The document processing techniques coupled with the graph analytics provided by deep search can accelerate novel discoveries from document repositories across industries. The chemical company Nagase & Co has put deep search to extensive use in developing new compounds. ENI, an oil and gas company, is using the service for upstream exploration. Currently, deep search is also aiding drug discovery in COVID-19 research.

Knowledge discovery at scale


In addition to the knowledge engineering techniques described above, automatic analysis of a huge number of documents demands powerful storage, compute and network infrastructure. The deep search platform is currently available as a service through Red Hat® OpenShift® on IBM Cloud®. It can also be set up on your premises in an OpenShift environment on IBM Power Systems as well as Intel x86 servers. The software is designed as a group of cloud-based microservices that can scale along with the number of documents and hardware resources for large search applications. This hardware-software codesigned platform has demonstrated capability to ingest as many as 100,000 pages per day per core.

IBM Systems Lab Services can help your organization make better use of document repositories using the deep search platform. Our experienced consultants help you set up the OpenShift platform, work with your subject matter experts to build the knowledge bases and design queries to help you develop novel insights into your digital libraries.

Source: ibm.com

Monday, 23 November 2020

IBM Bayesian Optimization Accelerator: designed to build better products faster


Everything old is new again, and IBM is leading the way, bringing tried and true optimization methods into the current era. Bayesian optimization methods based on Bayes’ Theorem, first explained in 1763, are getting a supercharge thanks to IBM Research and IBM Power Systems.

Today, IBM is releasing a new appliance: IBM Bayesian Optimization Accelerator, designed to help product and design teams lower design time and bring new and better products and features to market faster.

The problem spaces that product innovators are working with are getting more complex. We find our clients are demanding faster answers, forcing teams to re-examine current best practices and consider alternative methods because “building bigger” won’t be an option forever. Clients tell us that internal stakeholders are demanding that solutions come faster, cost less and be more accurate than ever. Meanwhile, budget isn’t expanding to meet all the new needs.

In a perfect world, what would solve all of these challenges?

◉ Methods that do not require knowledge about a problem beforehand.

◉ The ability to make the most of infrastructure by parallelizing efforts and spending less CPU and time getting to the ultimate answer.

◉ Easy-to-use services which can cope with high dimensionality because the name of the game is optimizing real problems, not simple academic ones.

◉ Traceable and non-biased methodology, particularly in regulated industries.

Simply put, the ideal solution for these rising challenges would deliver fast innovation and superior results while using fewer resources.

Introducing a more efficient way: IBM Bayesian Optimization Accelerator


With IBM Bayesian Optimization Accelerator, a state-of-the-art general parameter optimization tool built on cutting-edge innovations from the IBM Research team, users only need to define design variables, objectives and constraints to leverage a powerful optimization engine. It is an appliance and can be acquired as a full solution from IBM, including hardware, software and installation services.

Simply put, IBM Bayesian Optimization Accelerator is designed to find the optimal solution for complex, real-world design problems in less time and using fewer resources.

◉ Fast innovation: Bayesian Optimization Accelerator is designed to find solutions quickly with easy, quick initial integration and methods that require fewer starting inputs and scale in parallel to help decrease the time to result.

◉ Superior results: Bayesian Optimization Accelerator locates the optimal solution 89% of the time, with traceable, explainable optimization decisions that require no prior data, avoiding bias.

◉ Fewer resources: These methods can be applied without specialized data science skills and make existing infrastructure more efficient, keeping costs down while still responding to business needs.

The solution works by helping the HPC cluster know where to look. It sits outside of the traditional HPC cluster and is dedicated to running Bayesian Optimization methods only. The HPC cluster will send the values for constraints and objective functions to the appliance, which will send back new locations in the search space to find optimal solutions.
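The split described above — the HPC cluster evaluates the expensive objective, the appliance proposes the next points in the search space — is the standard shape of a Bayesian optimization loop. The sketch below shows that loop with a deliberately crude acquisition rule (value of the nearest observation minus a distance-based exploration bonus) standing in for the appliance's engine. Nothing here reflects IBM's actual implementation; all names and the toy objective are assumptions for illustration.

```python
import math
import random

def propose_next(observations, bounds, n_candidates=200):
    """Appliance side (sketch): given (x, y) observations so far, propose
    the next x to evaluate. Scores random candidates by the y-value of the
    nearest observation minus an exploration bonus for unexplored regions."""
    if not observations:
        return random.uniform(*bounds)          # bootstrap from no data
    best = None
    for _ in range(n_candidates):
        x = random.uniform(*bounds)
        nearest = min(observations, key=lambda o: abs(o[0] - x))
        predicted = nearest[1]
        exploration = min(abs(o[0] - x) for o in observations)
        score = predicted - exploration         # lower is better (minimizing)
        if best is None or score < best[0]:
            best = (score, x)
    return best[1]

def objective(x):
    """Cluster side (sketch): an expensive simulation, here a toy function."""
    return (x - 2.0) ** 2 + math.sin(5 * x)

observations = []
for _ in range(30):                             # the optimization loop
    x = propose_next(observations, bounds=(-5.0, 5.0))
    observations.append((x, objective(x)))      # cluster reports y back

best_x, best_y = min(observations, key=lambda o: o[1])
print(f"best x={best_x:.3f}, objective={best_y:.3f}")
```

The key property the sketch preserves is that each proposal uses everything learned from previous evaluations, which is why such methods need far fewer samples than blind random or grid search on expensive objectives.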

What makes this solution different?


Bayesian methods are not new to the mathematical world. However, we have found that standard and freely available search methods, such as greedy or Monte Carlo search, suffer from several challenges that make them challenging and often impractical to apply to product design problems.

IBM Bayesian Optimization Accelerator can scale across orders of magnitude more dimensions, which allows it to tackle real-world problems instead of simplistic ones. And unlike search methods such as IBM implementations of greedy and Monte Carlo searches, it determines design points with far fewer samples, which allows it to reach results faster and more cheaply. In fact, in comparison tests run by IBM Research, Bayesian Optimization Accelerator reached the least-regret solution in the fastest time in over 82% of IBM’s tested experiments against IBM implementations of greedy and Monte Carlo algorithms.

In those same comparison tests, this solution delivered the answer with the least regret over 89% of the time.1 To do this, it does not require any prior knowledge of a design problem. With a proprietary “bootstrapping” method, it can start an optimization from no initial data, gather initialization data on its own, and then start the Bayesian optimization process.

Furthermore, Bayesian Optimization Accelerator provides a graphical optimizer “explainability” interface for users who are interested in traceability of model design history and optimizer choices to help build trust in the methodology. This means that scientists can interrogate the optimizer during an experiment about why it chose to evaluate suggested parameters.


IBM is already pursuing application areas for this technology across many industries, such as aerospace, automotive, electronic design and oil and gas. In electronic design, IBM’s own signal integrity research team used Bayesian Optimization Accelerator to reduce the time required to run their signal integrity simulations by 99.3%, from nearly eight days to just 80 minutes. Meanwhile, in oil and gas, a research team cut its time to results by 61% by using Bayesian Optimization Accelerator to identify the ideal mix and timing of materials injected into reservoirs to maximize output.

These and other early results with teams across the globe indicate that IBM Research’s work in the lab is already driving real results for businesses and helping them deliver superior results faster and using fewer resources.

Source: ibm.com

Thursday, 19 November 2020

Red Hat Ansible: A 101 Guide


You’ve probably heard a lot about Ansible recently. Are you curious to learn more about this rapidly growing DevOps tool? Here’s a guide to help you get started.

What is Red Hat Ansible?

Red Hat® Ansible® is an open source IT deployment, orchestration and configuration management tool. It allows IT teams to define how a client (or group of clients) should be configured, and then Ansible issues the commands to match that stated configuration. Using Ansible’s built-in modules, the clients under control need not be only compute resources; they can also be storage controllers, network switches, and on-premises and public clouds.

Why do we need Ansible?

If you think back to how we used to manage computing, it was often a very time-consuming and error-prone process. If we needed a new server to be built, we would ask the system administrators to manually build that instance. We might then require a certain software stack to be installed, along with a required configuration and user IDs and so forth. Once the server is active, we have the added complexity of keeping up with the developers requiring new software releases, configuration changes or issues around scaling. Ansible enables us to automate all these stages into efficient, repeatable tasks, allowing IT to align to flexible business requirements in this increasingly DevOps-driven world.

In the 2019 Gartner I&O Management Survey, 52% of respondents said they were investing and will continue to invest in infrastructure and operations automation. The same study highlighted that 42% of respondents said they planned to start investing within the next two years.

What can Ansible do?

Ansible gives us the ability to:

1. Provision servers or virtual machines, both on premises and in the cloud

2. Manage configuration of the clients, set up users, storage, permissions and the like

3. Prevent configuration drift of clients by comparing desired state to current state

4. Orchestrate new workloads, restarting application dependencies and so forth

5. Automate application deployment and lifecycle management

Advantages of Ansible

There are a number of advantages to using Ansible over other orchestration and configuration management tools:

Agentless: No agents are needed on the client systems you want to manage. Ansible communicates with the clients over the standard SSH port.

Simple: Ansible uses “playbooks” to define the required state of the client(s). Those playbooks are written in human-readable YAML format, so no programming skills are required.

Declarative: Ansible playbooks define a declarative state required for the client, and the key principle that enables this is idempotency. Idempotency is a concept borrowed from the world of mathematics meaning that an operation can be applied multiple times without changing the results beyond the initial application. For example, if you declare a client should have a software package installed at a certain version, Ansible will only install it if it isn’t already installed at that version.

Reusable: We have the ability to break tasks into repeatable items called roles. These can then be called from multiple playbooks. For example, we might have a role that installs a database, one that configures users and one that adds additional storage. As we write new playbooks, we just include the role we require, meaning we don’t have to rewrite that functionality multiple times.

Open: Ansible is a vibrant and reliable open source project and as such has a very active community writing roles and modules for everyone to use. Galaxy Hub is one such repository, where over 25,000 roles are available for anyone to download.
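The idempotency behind the "Declarative" point above is easy to see in code. Below is a minimal Python sketch of an "ensure package installed" task; the function name and the dict-based state are hypothetical illustrations, though real Ansible modules apply the same check-before-change logic internally.

```python
def ensure_installed(installed: dict, package: str, version: str) -> bool:
    """Idempotent task sketch: bring `installed` (package -> version) to
    the desired state. Returns True if a change was made, False if the
    system was already in the desired state -- mirroring Ansible's
    'changed' status."""
    if installed.get(package) == version:
        return False                  # already in desired state: do nothing
    installed[package] = version      # install or upgrade to desired version
    return True

state = {}
print(ensure_installed(state, "nginx", "1.18"))  # True  -- first run changes state
print(ensure_installed(state, "nginx", "1.18"))  # False -- rerun is a no-op
```

Because every task behaves this way, a whole playbook can be rerun safely: only clients that have drifted from the declared state are actually changed.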

Ansible Tower

Ansible Tower provides an intuitive UI and dashboard with role-based access control, so users can run and monitor their own templates.


Using REST APIs, Ansible Tower also allows us to integrate with existing clouds, tools and processes. These include hybrid cloud endpoints such as IBM PowerVC, Amazon EC2, Google Cloud, Microsoft Azure and VMware vCenter. In addition to endpoints, Ansible Tower provides the ability to interact with source code management repositories including GitHub.
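As a sketch of that REST integration, the snippet below constructs a request that would launch a job template through Tower's API. The `/api/v2/job_templates/<id>/launch/` path follows the Tower/AWX REST API; the host, template ID and token are placeholders, and the request is only built here, not actually sent.

```python
import json
import urllib.request

def build_launch_request(tower_host: str, template_id: int, token: str,
                         extra_vars: dict) -> urllib.request.Request:
    """Construct (but do not send) a POST that launches an Ansible Tower
    job template via its REST API."""
    url = f"https://{tower_host}/api/v2/job_templates/{template_id}/launch/"
    body = json.dumps({"extra_vars": extra_vars}).encode("utf-8")
    return urllib.request.Request(
        url,
        data=body,
        method="POST",
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
    )

req = build_launch_request("tower.example.com", 42, "REPLACE_ME",
                           {"target_env": "staging"})
print(req.full_url)      # https://tower.example.com/api/v2/job_templates/42/launch/
print(req.get_method())  # POST
```

Sending the request with `urllib.request.urlopen(req)` (or any HTTP client) would trigger the template run; the same pattern lets external tools and pipelines drive Tower programmatically.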

Ansible and IBM Systems


Ansible and Ansible Tower can be used to deploy and manage workloads across the IBM Systems portfolio. Using the core OpenStack modules, we can create playbooks to build virtual machines running AIX, IBM i or Linux across IBM Power Systems estates. Using the extensive IBM collections on Galaxy Hub, we can build instances in the IBM Cloud, manage AIX, IBM i, Linux and z/OS instances both on premises and in a public cloud (where appropriate). We can also manage IBM Spectrum Virtualize storage using the modules freely available in Galaxy Hub.

Source: ibm.com

Friday, 13 November 2020

Building a secure hybrid cloud


In a day and age when the words digital transformation and cloud have become household names, businesses of all shapes and sizes are jumping on the bandwagon to cloud-enable their business applications. That sounds great as part of a dinner conversation; however, when the CIO needs to request funding for such a project, there must be a valid business case to get it approved.

One major driver of cloud technology is cost reduction. Cloud applications require little to no upfront investment, versus legacy on-premises solutions that require dedicated servers, operating system licenses and possibly back-end database licenses. For legacy applications, you pay the entire cost upfront as a capital investment, plus a possible monthly support cost. As time goes on, the software as well as the platform becomes outdated, and you need to pay for costly upgrades to keep up with the latest security updates and product feature sets. With cloud applications, however, you pay a low monthly service charge that is reported as an operating expense and usually includes support and upgrades.

For on-premises solutions, your own IT staff is responsible for its operation. This means that you need to hire new staff and/or train existing staff to take on this new responsibility. This presents an added cost to your organization. With cloud, however, the provider manages everything. You just click a few buttons on a web page and the job gets done.


As so eloquently said by Bola Rotibi, Research Director for CCS Insight, on a recent IBM IT Infrastructure webinar titled Secrets From the C-Suite: Building a Secure Hybrid Cloud: “As we think of the digital technologies transforming the way we live, work, rest and play, cloud presents a multitude of opportunities to operate and innovate more cost effectively and efficiently.”

Another major driver of cloud-based technology is business continuity. If all your computing assets are stored in a single location which then experiences an extended power, phone or internet outage, a natural disaster, or a terrorist attack, your business essentially grinds to a halt. Many larger organizations invest in constructing and maintaining multiple data centers for just that reason. For most small businesses, this added cost is beyond their capabilities. Cloud technology removes this challenge by placing the business continuity requirement entirely on the provider.

Along the same lines as business continuity, cloud’s ubiquity gives businesses a competitive advantage over companies that still rely on legacy on-premises, hardware-based solutions. Case in point: I recently worked with a company that had the phone lines at one of its locations go down. It took 3 days for 2 different phone companies to figure out whose fault it was and then finally fix the problem. During those 3 days, a busy office was completely down, with no phone service whatsoever. This kind of service level might have been acceptable in 1992; in the 2020s, it’s beyond unacceptable. A cloud communications provider with a guaranteed service-level agreement would have ensured that such a serious outage never happened.


Now, one might argue that this sounds good in theory. However, a small business that manufactures designer bathroom tiles doesn’t have the same needs or security requirements as a large healthcare provider or global investment bank. You can’t just put your customers’ financial portfolios onto a public cloud provider! If your public cloud provider gets hacked, the damage to your business can be fatal! This is what hybrid cloud computing is all about. With hybrid cloud, your sensitive data and critical workloads remain under your control inside your on-premises private cloud, whereas your less sensitive or critical workloads can be redistributed to the public cloud provider of your choice.


One important consideration when migrating applications to the cloud is security. On the previously mentioned IBM IT Infrastructure webinar, Elisabeth Stahl, IBM Garage Distinguished Engineer, talks about encryption and how organizations think about it. She stated that, surprisingly, “a very small percentage of enterprise systems are actually encrypted.” She explains that while many organizations encrypt their data, they don’t do so from a holistic point of view. This means that important data gets encrypted and non-sensitive data doesn’t.

Now, if you were a hacker who just discovered a cache of encrypted data among a whole bunch of non-encrypted data, what would be your first thought? You guessed it! That’s the data you’re going to go after and try to break into. By selectively encrypting your data, you’re actually doing yourself a disservice. Elisabeth goes on to discuss a framework called pervasive encryption, “Where you’re really saying you need to easily be able to encrypt all, everything that you have. End-to-end holistically.”

IBM IT Infrastructure has a great resource on its website for businesses exploring ways to educate themselves on security: the eBook “Seven steps to make secure IT infrastructure a business priority.”

In conclusion, there are many different drivers for cloud-based technology, and many different options and configurations from which to choose. Regardless of your choice, cloud security should be a central part of your overall digital transformation strategy, not tacked on later as an afterthought.

Source: ibm.com

Thursday, 12 November 2020

Accessible bookcast streaming service becomes securely available on IBM Cloud


Accessible entertainment isn’t often easily accessible. The mainstream entertainment industry doesn’t cater to people who can’t see or hear well; people with PTSD, autism or epilepsy; or those who are learning English as a second language. These population segments can become isolated, and then marginalized when they can’t enjoy the latest feature film – even if it has subtitles – or best-selling book.

Wenebojo is a new streaming service designed to help solve that challenge with bookcasts. A bookcast is an immersive experience that combines audio narration, pictures and closed captioning. Packaged for easy consumption, each short story or series of episodes is available on demand on a smart TV, tablet, phone or computer.

Wenebojo, named for a mythical Native American storyteller, is designed to be inclusive, and transition people from television to reading. Bookcasts are accessible to people of all abilities, and as such the platform is a mashup of entertainment and a social initiative.

Piracy protection: Keeping the intellectual capital in bookcasts secure

The Wenebojo parent company, Solitaire Interglobal (SIL), is a predictive performance service provider that has been doing complex modeling since 1978. This year, SIL will run well over a quarter of a billion security and performance models, and we have literally trillions of data points that show us where security breaches occur, what platforms they cluster around and the conditions surrounding breaches. We know the impact of security breaches: what they cost, what gets lost, how long recovery takes and so on.

Wenebojo is a disruptive technology and there is no acceptable risk profile for data loss or variability in delivery of service. To support the Wenebojo bookcasts, SIL chose IBM Z on Cloud, paired with the IBM Cloud Hyper Protect Services solution and built on the IBM LinuxONE platform.

Being able to keep both client data and the intellectual capital of the writers secure is a big thing. There’s so much pirating of intellectual capital that we needed something that was going to be as hacker-proof as possible. We also needed to fit a fairly complex architecture, because we need the ability to scale based on what side of our system is being stressed, whether it’s the streaming back end or the customer-interfacing travelogue front end. We ran a huge number of models to determine what the impact was and what the risk factors were. We determined that we couldn’t find the support we needed anyplace other than IBM Z on Cloud.

Scaling the experience: Handling expected platform growth

For PTSD sufferers or those with epilepsy, Wenebojo is a godsend because there are no big explosions and no triggers. For those learning English, a platform that reads to them while displaying the full, correct text helps them learn to read. This is proving to be very important for English-as-a-second-language (ESL) students, because even the best ESL programs can’t build vocabulary like intriguing stories can.

Bookcasts are better than books on tape because we created them in such a way that a commuter can click on a bookcast and their 20-minute commute is exactly one chapter. Or if you’re waiting in the doctor’s office, you can play something very short. They’re also the perfect length for a bedtime story.

We have predicted that Wenebojo will grow and keep growing for at least seven years before it levels off, so we need the platform to be scalable. We can’t wait six months to put in a new machine. And with the IBM Cloud, we don’t have to. We call up IBM and say, “We need to expand. We just got this influx of people.” And they’re able to get everything up and running in less than a day.

We’re seeing interest from a variety of potential customers, including prison systems, school systems and hospitals. With IBM, we’re ready for rapid growth as more and more people begin to stream Wenebojo bookcasts.

Source: ibm.com

Monday, 9 November 2020

Top 5 Advantages of Software as a Service (SaaS)

IBM Exam Prep, IBM Certification, IBM Learning, IBM Study Materials, IBM Career

Software as a service (SaaS) is a cloud computing offering that provides users with access to a vendor’s cloud-based software. What are the main benefits of using SaaS?

SaaS provides an intriguing alternative to standard software installation in the business environment (the traditional model), where you have to build the server, install the application, and configure it. Instead, the applications reside on a remote cloud network accessed through the web or an API, and the arrangement works like a rental: you and your organization are authorized to use the software for a period of time and pay for what you use.

The following are five of the top advantages of using SaaS:

1. Reduced time to benefit

Software as a service (SaaS) differs from the traditional model because the software (application) is already installed and configured. You can simply provision an instance in the cloud, and in a couple of hours, you'll have the application ready for use. This reduces the time spent on installation and configuration and can reduce the issues that get in the way of software deployment.

2. Lower costs

SaaS can provide significant cost savings, since it usually resides in a shared or multi-tenant environment where the hardware and software license costs are low compared with the traditional model.

Another advantage is that SaaS opens the software to small and medium businesses that otherwise could not afford it due to the high cost of licensing, which also lets the provider rapidly scale its customer base.

Maintenance costs are reduced as well, since the SaaS provider owns the environment and the cost is split among all customers that use the solution.

3. Scalability and integration

Usually, SaaS solutions reside in cloud environments that are scalable and have integrations with other SaaS offerings. Compared with the traditional model, you don't have to buy another server or software. You only need to enable a new SaaS offering and, in terms of server capacity planning, the SaaS provider will own that. Additionally, you'll have the flexibility to be able to scale your SaaS use up and down based on specific needs.

4. New releases (upgrades)

With SaaS, the provider upgrades the solution, and it becomes available to its customers. The costs and effort associated with upgrades and new releases are lower than in the traditional model, which usually forces you to buy an upgrade package and install it (or pay for specialized services to get the environment upgraded).

5. Easy to use and perform proof-of-concepts

SaaS offerings are easy to use since they already come with baked-in best practices and samples. Users can do proof-of-concepts and test the software functionality or a new release feature in advance. Also, you can have more than one instance with different versions and do a smooth migration. Even for large environments, you can use SaaS offerings to test the software before buying.

Thursday, 5 November 2020

IBM Big Data Architect - The Art of Handling Big Data

data architect, ibm big data architect certification, data architect training, ibm data architect, ibm certified data architect - big data, data architect ibm, C2090-102

The Big Data Architect works closely with the customer and the solutions architect to translate the customer's business requirements into a Big Data solution. The Big Data Architect has in-depth knowledge of the relevant technologies, understands how those technologies relate to one another, and knows how they can be integrated and combined to effectively solve any given big data business problem.

This individual can design large-scale data processing systems for the enterprise and provide input on architectural decisions, including hardware and software. The Big Data Architect also understands the complexity of data and can design systems and models to handle data variety (structured, semi-structured, and unstructured), volume, velocity (including stream processing), and veracity. The Big Data Architect can also effectively address the information governance and security challenges associated with the system.

IBM Certified Data Architect - Big Data Certification Essentials

This IBM certification requires passing the IBM Big Data Architect exam (C2090-102). The test consists of five sections containing a total of 55 multiple-choice questions. The percentages after each section title indicate the relative distribution of the total question set across the sections.

Each Test:

  • Contains questions requiring single and multiple answers. For multiple-answer questions, you need to choose all the required options to get the answer correct. You will be told how many options make up the correct answer.
  • Is designed to provide diagnostic feedback on the Examination Score Report, correlating back to the test objectives and informing the test taker how they did in each test section. To preserve the integrity of each test, questions and answers are not distributed.
  • An IBM Certified Big Data Architect has demonstrated proficiency in the design, implementation, and integration of Big Data solutions within enterprise IT or cloud-based environments. Depending on the exam format chosen, earning the Big Data Architect certification can require passing a single exam or multiple exams.

How to Become An IBM Big Data Architect?

Follow these steps to seek a career as an IBM Big Data Architect:

  • Earn an education. A bachelor’s degree in a field such as computer science, computer engineering, or information technology is the essential entry-level requirement for a job as a data architect. A master’s degree can be useful if you are seeking positions in leadership and larger corporations.
  • Gain work experience. Getting an entry-level job as a data architect will require you to show some experience managing data and data systems. Internships, workshops, and boot camps where you created and maintained data systems can be useful to list on your resume. You may also need to gain experience in other IT positions before applying to jobs as a data architect.
  • Earn certifications. You can pursue a range of software and IBM Big Data Architect certifications from nonprofit agencies and from the companies that make the software. Completing one or more certifications can show your general understanding of the subject and dedication to the field.
  • Build a resume. List your education, skills, and job experience, with the most critical and recent experience first, along with the organizations' names and the length of time you worked there. It would be best to tailor your resume to the specific job for which you are applying.

What Does An IBM Big Data Architect Do?

An IBM Big Data Architect helps a company understand its strategic goals regarding data management, and works with software designers and data engineers to generate new database integration plans. Data architects must have strong business acumen to work with higher-level officials in a company and assess their particular needs. By keeping aware of industry trends, data architects create new platforms that serve many people across many areas of the business.

Data architects also manage and schedule updates and improvements to the databases they have created, with minimal disruption to the company. This can mean that a data architect must work long hours and on weekends to complete projects and updates on time.

Misconceptions About the Role of an IBM Big Data Architect

Data Architects are not a lot of things! Here is what separates Data Architects from Data Analysts, Data Engineers, and Data Scientists.

Essential Metrics for an IBM Big Data Architect

Data Architects are often included in creating and executing an organization’s overall data strategy and setting enterprise-wide quality metrics.

Here are several ways to measure and evaluate the success of a data solution:

  • KPIs for Data Architecture
  • The Data Quality Cycle
  • Critical Factors in Architecting a Master Data Solution

Think You Have What It Takes?

Do you have the data-driven mindset it takes to make it as a data architect? Do you love helping businesses find efficiencies through the intelligent capture and analysis of raw data? If so, you are in luck - companies are paying closer attention to the insights gleaned from data. And using that information to their benefit is impossible without people like you.

Wednesday, 4 November 2020

6 spooky stats to help you better manage your 2020 holiday retail season

IBM Tutorial and Material, IBM Learning, IBM Certification, IBM Exam Prep

Even though this year has been scarier than encountering a black cat, Halloween is still eagerly anticipated across the United States. According to NRF, an estimated 58% plan to celebrate and a total of $8.05B will be spent during the holiday. Candy, costumes and decorations – oh my!

Let’s review six spooky stats that can help transform your business now, and in the future, as we enter the peak shopping season:

1. 46% of shoppers plan to start their holiday shopping earlier this year – An extended shopping period starting before Black Friday and Cyber Monday can ultimately create a ripple effect of challenges for retailers. Whether fulfilling orders online or in-store, you need more inventory available for longer across a broader range of locations in order to meet individual customer expectations. But if you don’t have visibility into accurate inventory levels, this can lead to lost sales, customers aggravated by overpromising, and margin given away to markdowns and expedited shipping charges.

2. 29% say their personal finances have decreased – Yes, this stat is more than frightening. However, the same report also says that 56% of respondents plan to spend the same amount of money holiday shopping this year as last. (There’s some good news for retailers and brands!) To entice customers to purchase, offer additional promotions in order to meet them at a more cost-efficient price point.

3. Foot traffic has fallen 20% since the pandemic arrived – Decreased foot traffic can only mean one thing – increased online sales! Maximize fulfillment options such as buy-online-pickup-in-store (BOPIS), curbside pickup and ship-from-store in an effort to keep inventory moving. The holidays come with a hard delivery timeline to guarantee gift-giving promises, so alternative options are essential in light of shoppers who may not step inside your four walls.

4. 40% expect to spend less on Halloween than in 2019 – Coresight Research estimates a dip in Halloween spending this year. While this doesn’t come as a huge surprise, this should serve as motivation to capitalize on each and every dollar spent with your business. Aligning promotions with CDC guidance on low-risk Halloween activities, and providing options to your shoppers such as curbside pickup and same day delivery, can ensure a safe and convenient experience, while also securing sales.

5. 31% of shoppers are placing orders via a mobile app – According to the IBM COVID-19 Consumer Survey, 2020 has pushed shoppers to try new channels such as mobile apps and social media. Big retailers like Lowe’s have seen record-breaking app downloads throughout the pandemic. While this might have been a one to three-year vision for your digital transformation plan, data shows that you need to meet customers the way they want to engage. According to Forrester, 33% of shoppers don’t plan to resume normal shopping habits. The uptick in curbside pick-up and BOPIS that retailers are seeing presents new opportunities to leverage in-app promotions to drive order values once customers enter the proximity of the store pick-up location.

6. Pop-up locations to be scaled back by more than 90% – Party City announced it will open just 25 pop-up stores under its Halloween City banner this year. With fewer storefronts open, shoppers have to travel further to get what they are looking for – or look elsewhere. For that reason, it’s critical to optimize inventory at each holiday pop-up throughout the 2020 season to match customer demand. Looking back at previous years’ trends, you can shift inventory levels to neighboring stores to make sure you never risk out-of-stocks for key items.

Hopefully these stats don’t have you too spooked! Yes, 2020 has been a wild ride. But the unexpected can also present opportunities. Don’t try to bite off more than you can chew. Start small to build quick wins that will have immediate impact to the customer experience, and your bottom line.

Source: ibm.com

Tuesday, 3 November 2020

What is supply chain security?

IBM Exam Prep, IBM Certification, IBM Learning, IBM Tutorial and Material, IBM Prep

Many of us in supply chain remember the major data breaches Target and Home Depot suffered within months of each other resulting from third-party relationships. Six years later, supply chain security breaches still make headlines with the average cost of a data breach now at $3.86 million and mega breaches (50 million records or more stolen) reaching $392 million.

So, what’s it going to take to tackle supply chain security?

Part of the challenge is that there is no single, functional definition of supply chain security. It’s a massively broad area that includes everything from physical threats to cyber threats, from protecting transactions to protecting systems, and from mitigating risk with parties in the immediate business network to mitigating risk derived from third, fourth and “n” party relationships. However, there is growing agreement that supply chain security requires a multifaceted and functionally coordinated approach.

Supply chain leaders tell us they are concerned about cyber threats, so in this blog we are going to focus on the cybersecurity aspects of protecting the quality and delivery of products and services, along with the associated data, processes and systems involved.

“Supply chain security is a multi-disciplinary problem, and requires close collaboration and execution between the business, customer support and IT organizations, which has its own challenges. The companies that get this right start with IT and a secure multi-enterprise business network, then build upward with carefully governed and secured access to analytics and visibility capabilities and, from there, continuously monitor every layer for anomalous behavior.” Marshall Lamb, CTO, IBM Sterling
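The “carefully governed and secured access” Lamb describes can be illustrated with a small role-based sketch. The roles, fields and shipment record below are invented for the example; real multi-enterprise networks implement this with full identity and access management systems.

```python
# Hypothetical roles and the fields each is allowed to see.
ROLE_VIEWS = {
    "supplier": {"po_number", "qty", "ship_date"},
    "carrier":  {"po_number", "ship_date", "destination"},
    "bank":     {"po_number", "invoice_total"},
}

# One shared shipment record, visible to multiple enterprises.
SHIPMENT = {
    "po_number": "PO-7781", "qty": 500, "ship_date": "2020-11-30",
    "destination": "Rotterdam", "invoice_total": 125000.00,
}

def permissioned_view(record: dict, role: str) -> dict:
    """Each party sees only the fields its role is permissioned for."""
    allowed = ROLE_VIEWS.get(role, set())
    return {k: v for k, v in record.items() if k in allowed}

# The carrier never sees financial data; the bank never sees logistics.
assert "invoice_total" not in permissioned_view(SHIPMENT, "carrier")
assert "destination" not in permissioned_view(SHIPMENT, "bank")
```

The same record serves every participant, while governance decides what each permissioned party can see.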

Why is supply chain security important?


Supply chains are all about getting customers what they need at the right price, place and time. Any disruption or risk to the integrity of the products or services being delivered, the privacy of the data being exchanged, or the completeness of associated transactions can have damaging operational, financial and brand consequences. Data breaches, ransomware attacks and malicious activities from insiders or attackers can occur at any tier of the supply chain. Even a security incident localized to a single vendor or third-party supplier can still significantly disrupt the “plan, make and deliver” process.

Mitigating this risk is a moving target and mounting challenge. Supply chains are increasingly complex global networks comprised of large and growing volumes of third-party partners who need access to data and assurances they can control who sees that data. Today, new stress and constraints on staff and budget, and rapid unforeseen changes to strategy, partners and the supply and demand mix, add further challenges and urgency. At the same time, more knowledgeable and socially conscious customers and employees are demanding transparency and visibility into the products and services they buy or support. Every touchpoint adds an element of risk that needs to be assessed, managed and mitigated.

Top 5 supply chain security concerns


Supply chain leaders around the globe and across industries tell us these five supply chain security concerns keep them awake at night:

1. Data protection. Data is at the heart of business transactions and must be secured and controlled at rest and in motion to prevent breach and tampering. Secure data exchange also involves trusting the other source, be it a third party or an e-commerce website. Having assurances that the party you are interacting with is who they say they are is vital.

2. Data locality. Critical data exists at all tiers of the supply chain, and must be located, classified and protected no matter where it is. In highly regulated industries such as financial services and healthcare, data must be acquired, stored, managed, used and exchanged in compliance with industry standards and government mandates that vary based on the regions in which they operate.

3. Data visibility and governance. Multi-enterprise business networks not only facilitate the exchange of data between businesses, but also allow multiple enterprises access to data so they can view, share and collaborate. Participating enterprises demand control over the data and the ability to decide who to share it with and what each permissioned party can see.

4. Fraud prevention. In a single order-to-cash cycle, data changes hands numerous times, sometimes in paper format and sometimes electronic. Every point at which data is exchanged between parties or shifted within systems presents an opportunity for it to be tampered with – maliciously or inadvertently.

5. Third-party risk. Everyday products and services – from cell phones to automobiles – are increasing in sophistication. As a result, supply chains often rely on four or more tiers of suppliers to deliver finished goods. Each of these external parties can expose organizations to new risks based on their ability to properly manage their own vulnerabilities.
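Two of the concerns above, data protection (having assurance the sender is who they say they are) and fraud prevention (detecting tampering at each handoff), can be sketched with a keyed message authentication code. This is a minimal illustration using an assumed pre-shared key, not a full transaction-security design:

```python
import hashlib
import hmac

# Assumed: a key exchanged out of band between two trading partners.
SHARED_SECRET = b"example-key-exchanged-out-of-band"

def sign(message: bytes) -> str:
    """Attach an HMAC tag proving who sent the message and that it
    was not altered in transit."""
    return hmac.new(SHARED_SECRET, message, hashlib.sha256).hexdigest()

def verify(message: bytes, tag: str) -> bool:
    """Constant-time comparison guards against timing attacks."""
    return hmac.compare_digest(sign(message), tag)

order = b'{"po": "12345", "qty": 100}'
tag = sign(order)

assert verify(order, tag)                                # untampered
assert not verify(b'{"po": "12345", "qty": 900}', tag)   # tampered
```

Only a party holding the shared secret can produce a valid tag, so a forged or altered order fails verification at the next handoff.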

Supply chain security best practices


Supply chain security requires a multifaceted approach. There is no panacea, but organizations can protect their supply chains with a combination of layered defenses. As teams focused on supply chain security make it more difficult for threat actors to run the gauntlet of security controls, they gain more time to detect nefarious activity and take action. Here are just a few of the most important strategies organizations are pursuing to manage and mitigate supply chain security risk.

◉ Security strategy assessments. To assess risk and compliance, you need to evaluate existing security governance – including data privacy, third-party risk and IT regulatory compliance needs and gaps – against business challenges, requirements and objectives. Security risk quantification, security program development, regulatory and standards compliance, and security education and training are key.

◉ Vulnerability mitigation and penetration testing. Identify basic security concerns first by running vulnerability scans. Fixing bad database configurations and poor password policies, eliminating default passwords, and securing endpoints and networks can immediately reduce risk with minimal impact to productivity or downtime. Employ penetration test specialists to attempt to find vulnerabilities in all aspects of new and old applications, in the IT infrastructure underlying the supply chain, and even in people, through phishing simulation and red teaming.

◉ Digitization and modernization. It’s hard to secure data if you’re relying on paper, phone, fax and email for business transactions. Digitization of essential manual processes is key. Technology solutions that make it easy to switch from manual, paper-based processes and bring security, reliability and governance to transactions provide the foundation for secure data movement within the enterprise and with clients and trading partners. As you modernize business processes and software, you can take advantage of encryption, tokenization, data loss prevention, and file access monitoring and alerting, and bring teams and partners along with security awareness and training.

◉ Data identification and encryption. Data protection programs and policies should include the use of discovery and classification tools to pinpoint databases and files that contain protected customer information, financial data and proprietary records. Once data is located, using the latest standards and encryption policies protects data of all types, at rest and in motion – customer, financial, order, inventory, Internet of Things (IoT), health and more. Incoming connections are validated, and file content is scrutinized in real time. Digital signatures, multifactor authentication and session breaks offer additional controls when transacting over the internet.

◉ Permissioned controls for data exchange and visibility. Multi-enterprise business networks ensure secure and reliable information exchange between strategic partners with tools for user- and role-based access. Identity and access management security practices are critical to securely share proprietary and sensitive data across a broad ecosystem, while finding and mitigating vulnerabilities lowers risk of improper access and breaches. Database activity monitoring, privileged user monitoring and alerting provide visibility to catch issues quickly. Adding blockchain technology to a multi-enterprise business network provides multi-party visibility of a permissioned, immutable shared record that fuels trust across the value chain.

◉ Trust, transparency and provenance. With a blockchain platform, once data is added to the ledger it cannot be manipulated, changed or deleted, which helps prevent fraud, authenticate provenance and monitor product quality. Participants from multiple enterprises can track materials and products from source to end customer or consumer. All data is stored on blockchain ledgers, protected with the highest level of commercially available, tamper-resistant encryption.

◉ Third-party risk management. As connections and interdependencies between companies and third parties grow across the supply chain ecosystem, organizations need to expand their definition of vendor risk management to include end-to-end security. This expanded approach allows companies to assess, improve, monitor and manage risk throughout the life of the relationship. Start by bringing your own business and technical teams together with partners and vendors to identify critical assets and the potential damage to business operations in the event of a compliance violation, system shutdown, or data breach that goes public.

◉ Incident response planning and orchestration. Proactively preparing for a breach, shut down or disruption, and having a robust incident response plan in place is vital. Practiced, tested and easily executed response plans and remediation prevent loss of revenue, damage to reputation and partner and customer churn. Intelligence and plans provide metrics and learnings your organization and partners can use to make decisions to prevent attacks or incidents from occurring again.
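The tamper evidence that the “Trust, transparency and provenance” practice attributes to blockchain comes from hash chaining: each entry's hash covers both its contents and the previous entry's hash, so altering any historical record invalidates everything after it. A minimal sketch follows; the coffee-lot records are invented for the example.

```python
import hashlib
import json

def block_hash(record: dict, prev_hash: str) -> str:
    """Hash this record together with the previous entry's hash."""
    payload = json.dumps(record, sort_keys=True) + prev_hash
    return hashlib.sha256(payload.encode()).hexdigest()

def append(ledger: list, record: dict) -> None:
    prev = ledger[-1]["hash"] if ledger else "0" * 64
    ledger.append({"record": record, "hash": block_hash(record, prev)})

def verify(ledger: list) -> bool:
    """Recompute every hash; any altered entry breaks the chain."""
    prev = "0" * 64
    for entry in ledger:
        if entry["hash"] != block_hash(entry["record"], prev):
            return False
        prev = entry["hash"]
    return True

ledger = []
append(ledger, {"lot": "A1", "origin": "Farm 17", "grade": "premium"})
append(ledger, {"lot": "A1", "handoff": "roaster", "temp_ok": True})

assert verify(ledger)
ledger[0]["record"]["grade"] = "standard"  # tampering is detectable
assert not verify(ledger)
```

Production blockchains add consensus and distributed replication on top of this chaining, so no single participant can rewrite the shared record.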

Supply chain security will continue to get even smarter. As an example, solutions are beginning to incorporate AI to proactively detect suspicious behavior by identifying anomalies, patterns and trends that suggest unauthorized access. AI-powered solutions can send alerts for human response or automatically block attempts.

How are companies ensuring supply chain security today?


IBM Sterling Supply Chain Business Network set a new record in secure business transactions in September of this year, up 29% from September 2019, and is helping customers around the world rebuild their business with security and trust in spite of the global pandemic.

International money transfer service provider Western Union completed more than 800 million transactions in 2018 for consumer and business clients rapidly, reliably and securely with a trusted file transfer infrastructure.

Fashion retailer Eileen Fisher used an intelligent, omnichannel fulfillment platform to build a single pool of inventory across channels, improving trust in inventory data, executing more flexible fulfillment and reducing customer acquisition costs.

Blockchain ecosystem Farmer Connect transparently connects coffee growers to the consumers they serve, with a blockchain platform that incorporates network and data security to increase trust, safety and provenance.

Financial services provider Rosenthal & Rosenthal replaced its multiple approaches to electronic data interchange (EDI) services with a secure, cloud-based multi-enterprise business network reducing its operational costs while delivering a responsive, high-quality service to every client.

Integrated logistics provider VLI is meeting myriad government compliance and safety regulations, and allowing its 9,000 workers access to the right systems at the right time to do their jobs, with an integrated suite of security solutions that protect its business and assets and manage user access.

Solutions to keep your supply chain secure


Keep your most sensitive data safe, visible to only those you trust, immutable to prevent fraud and protected from third-party risk. Let IBM help you protect your supply chain, with battle-tested security that works regardless of your implementation approach – on-premise, cloud or hybrid.

Source: ibm.com

Monday, 2 November 2020

The ninth wave of tape storage innovation

IBM Certification, IBM Exam Prep, IBM Learning, IBM Guides, IBM Tutorial and Material

Storage cost and data protection are two of today's top storage challenges. Tape-based data storage solutions and technology help clients address both of these critical modern challenges, which makes tape a powerful solution in the storage market today.

Tape can help enterprises of all types and sizes address their need for low-cost, high-volume data storage. And tape offers ways to help mitigate, and even thwart cyber threats through the use of tape “air-gapping”. Just as importantly, innovation within the tape-based data storage platform has remained strong. IBM has never lost sight of the value and the promise of tape-based storage. Our ongoing commitment to this technology, and the storage solutions it provides, has recently been reaffirmed with the release of new Linear Tape Open (LTO) Ultrium Generation 9 technologies and offerings.

IBM intends to offer LTO 9 Ultrium Tape Drive technology and media that complies with the LTO Consortium Generation 9 specifications across the full suite of IBM Tape Automation Solutions in the first half of 2021. The new IBM LTO 9 drives and supporting systems are designed to help increase tape’s ability to provide cost-effective, secured storage solutions to meet the full range of rapidly evolving 21st-century requirements for storage of less active data.

IBM is a strong tape storage vendor. This success is the result of more than six decades of commitment to the technology, plus the adoption and execution of a product strategy based on innovation, continual improvement, communication with customers and business partners, and a focus on quality.

This advantage is certainly recognized by IBM Business Partners:

“IBM LTO tape technology provides a secured, cost-effective solution for our customers requiring on-premises archival storage as part of a multicloud strategy,” notes Russell Schneider, Storage Director of Jeskell Systems, a long-time IBM Storage Business Partner. “Tape offers significant benefits in cost and access time complementing public cloud storage and offering “air-gap” security with protection from cyber-criminals and ransomware threat actors. With the LTO Ultrium Generation 9 announcement, the data stored on each cartridge grows by 50 percent while performance increases 11 percent, which strengthens IBM’s award-winning tape technology and portfolio of systems, software and libraries.”

IBM LTO Ultrium tape technology is designed for heavy demands posed by the storage of less active data in modern use cases such as the Internet of Things (IoT), big data analytics, artificial intelligence (AI)-driven applications, media and entertainment, genomics, video streaming and digital archiving. This tape technology is enhanced in new IBM LTO Generation 9 tape solutions that are designed to help provide increased capacity, performance, cyber resilience and cost efficiency:

◉ IBM LTO Ultrium 9 data cartridges provide 50 percent more capacity (18 TB) than LTO 8 cartridges with an 11 percent faster data rate (400 MB/s).

◉ IBM LTO 9 solutions can reduce the total cost of ownership (TCO) of a tape library by 39 percent.

◉ IBM LTO 9 technologies now provide cost-effective, secured data retention for containers in Red Hat OpenShift environments through the integration with IBM Spectrum Protect solutions.
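The capacity and data-rate figures above are easy to sanity-check. The short Python sketch below reproduces the 50 percent capacity growth (taking LTO 8’s 12 TB native capacity, which is not stated above, as an assumed input from the LTO specification) and works out how long a full-cartridge write takes at the native 400 MB/s rate:

```python
# Back-of-the-envelope check of the LTO 9 figures above.
# Capacities are native (uncompressed); LTO8_CAPACITY_TB is an
# assumed value from the LTO spec, not stated in this article.
LTO8_CAPACITY_TB = 12
LTO9_CAPACITY_TB = 18
LTO9_RATE_MB_S = 400

# Generation-over-generation capacity growth
growth = (LTO9_CAPACITY_TB - LTO8_CAPACITY_TB) / LTO8_CAPACITY_TB
print(f"Capacity growth: {growth:.0%}")  # 50%

# Time to stream one full cartridge at the native data rate
seconds = LTO9_CAPACITY_TB * 1_000_000 / LTO9_RATE_MB_S
print(f"Full-cartridge write: {seconds / 3600:.1f} hours")  # ~12.5 hours
```

In practice, drive-level compression and host throughput change both numbers, but the native figures give the floor for planning archive windows.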

As stated by David Hill at Mesabi Group, “IBM built the next generation tape for the hybrid cloud with LTO 9 technology. By offering data immutability with WORM and the physical air gap offered by tape, plus AES-256-bit encryption of data at rest, IBM LTO tape maintains its strong position as a cost-efficient, secured storage solution.”
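The immutability point in that quote can be illustrated with a minimal sketch. The plain-Python class below is hypothetical (not an IBM API and not how WORM media is implemented in hardware); it simply models the write-once-read-many contract: each record is accepted exactly once, so the later rewrite that ransomware depends on is refused.

```python
# Hypothetical illustration of the WORM (write-once-read-many) contract.
# Real WORM enforcement happens in the tape drive and cartridge, not in
# application code; this just models the behavior.
class WormStore:
    def __init__(self):
        self._data = {}

    def write(self, key, blob):
        if key in self._data:
            raise PermissionError(f"{key!r} is write-once; rewrite refused")
        self._data[key] = bytes(blob)

    def read(self, key):
        return self._data[key]

store = WormStore()
store.write("backup-2020-11", b"archive bytes")

try:
    store.write("backup-2020-11", b"encrypted by ransomware")  # simulated attack
except PermissionError as err:
    print("blocked:", err)
```

Combined with the physical air gap (an offline cartridge simply cannot be reached over the network), this is why tape figures so prominently in cyber-resilience designs.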

In fact, tape:

◉ Is leveraged by 67 percent of current users for archiving, and 57 percent of current users for backup
◉ Has greater potential to meet predicted capacity growth over the next decade than HDD technologies
◉ Is 86 percent less expensive than disk storage, according to a 2018 study comparing 10-year TCO
◉ Is about two orders of magnitude more reliable than disk

The latest IBM LTO Ultrium Gen 9 tape technology is designed to offer significantly higher storage density than previous generations, helping to lower the cost of storing large data volumes. IBM tape drives and libraries provide multiple layers of data protection. And IBM Spectrum Archive, a member of the industry-awarded IBM Spectrum Storage family of software-defined solutions, can make access to data on tape as easy as on disk.


Tape remains a leading solution for modern data storage needs. IBM LTO Ultrium Gen 9 continues tape’s innovation with the aim of providing organizations with faster access, lower total cost of ownership, greater security, longer life, and more functionality. Complementary technologies such as IBM software-defined storage (SDS) solutions continue to deepen their compatibility, integration and capabilities with tape. And ever more industries, business use cases, and leading enterprises are finding that tape isn’t living in the past; it’s innovating toward the future—just like they are.

Source: ibm.com