
Wednesday, 1 September 2021

How IBM Research is creating the future of hybrid cloud

Constellation Research report praises IBM Research for commitment to fundamental research that's delivering competitive cloud solutions.


“IBM is one of the few vendors offering a competitive cloud solution in the 2020s that also has been a traditional IT leader and supplier for more than 50 years.”

Strong words from Constellation Research Vice President and Principal Analyst Holger Mueller in his March 2021 report, “IBM Research Bolsters IBM Hybrid Cloud: A Strong Innovation Pipeline Can Make a Difference for IBM Hybrid Cloud—and for IBM.”

And like so many past innovations and industry firsts—from powering the Apollo mission, to putting the first quantum computer online for the world to use—IBM is building the hard tech behind the future of computing.

A hybrid approach to cloud computing transformation


Thanks to innovations like agile development practices and cloud-native tools, platforms, and consumption models, technological progress is faster than ever before. At the same time, AI and data analytics techniques are getting increasingly capable of extracting value from the vast amounts of data we produce, fueling an ever-increasing demand for more compute capacity. This demand can be met by the nearly limitless computing now available at all our fingertips, thanks to the maturation of public clouds.

But efforts to adopt new technologies are set against years of prior infrastructure investment, with large IT footprints still running legacy systems in traditional data centers. And for most companies, realizing the value of cloud technologies requires specialized skills, and using, managing, and protecting data in the cloud is far from simple.

Navigating this landscape and the associated opportunities and challenges is a massive undertaking for any business. It's particularly straining for the business functions responsible for security and compliance, which must defend against relentless, increasingly sophisticated attacks amid an ever-changing compliance landscape.

We believe there is tremendous value to unlock if we solve the primary challenges in today’s world of hybrid cloud computing:

◉ Moving from legacy to cloud and AI models: Helping businesses transform their IT estates so they can reap the benefits of state-of-the-art hybrid cloud and AI technologies.

◉ Creating truly hybrid architectures: Helping customers simplify the productive use and management of complex and heterogeneous IT estates, spanning legacy systems, private cloud, several public cloud environments, and even edge.

◉ Streamlining security and compliance: Reducing the burden and overhead of navigating security and compliance concerns in the hybrid cloud, which currently slows down innovation and technology adoption.

◉ Preparing for the future: Identifying and anticipating the next frontiers of technology that will enable businesses to achieve things that remain unachievable today.

Our hybrid cloud research works to address these challenges in four ways:

1. Driving agility through AI and automation. We are developing AI capabilities to support our customers throughout their journeys to cloud. This includes helping them modernize applications, keep IT available more of the time, and navigate security and compliance concerns. AI assistance for these tasks will help customers overcome some of the roadblocks that prevent them from reaping the full benefits of hybrid cloud. This work is underpinned by a strong technological focus on the discipline of AI for Code: the idea that code is itself similar in many ways to human language, and we can therefore build AI systems that can “speak the language” of software.

2. Creating a seamless hybrid cloud platform. We are focused on simplifying the experience of using and managing a fragmented IT landscape of both infrastructure and data. To do this we are evolving the platform itself, particularly the management for both infrastructure and data across on-premise data centers, public clouds, and edge. Letting customers treat fragmented data and large fleets of distributed devices as if they are all part of a single computing environment would tremendously simplify the job of developers and IT administrators, and open the door to new business opportunities.

3. Establishing a holistic approach to security and compliance. Our approach involves both hardware- and software-defined technologies that enhance security by strengthening the separation between user workloads and providing evidence of the trustworthiness of the underlying IT platforms running the user workloads. Moreover, we are transforming compliance processes from manual exercises to modern, software-defined approaches to dramatically reduce the overhead on compliance officers and their teams. Our vision is to mitigate the security and compliance obstacles that slow teams down, and in doing so accelerate the pace of innovation across the enterprise.

4. Enabling flexible, composable computing. We are redefining performance-focused computing across all layers of the hybrid cloud stack, from core infrastructure innovations (like network and storage, as well as quantum computing and AI accelerators), to a performant hybrid cloud platform (optimized for large-scale and performance-sensitive workloads), to advances in serverless computing that simplify consumption of state-of-the-art capabilities underneath.

Our vision is to reimagine supercomputing, using the benefits of the hybrid cloud: simplicity, agility, scale on demand, automation, and access to best-in-class resources, wherever they are. Reimagining and democratizing access to supercomputing, making it available to anyone with a cloud account, has the potential to enable more people to achieve previously unachievable things.

The hybrid cloud research organization, along with our IBM Research counterparts in AI, quantum computing, and exploratory science, is delivering research innovations that help enterprises and other organizations meet the challenges and opportunities that lie ahead. We are extremely excited about the hybrid world we are helping to shape.


Source: ibm.com

Sunday, 29 August 2021

Open source workload identity management could help secure hybrid clouds

IBM is open sourcing project “Tornjak” to encourage the development and adoption of enterprise-level identity management between clouds.


Organizations have made great strides migrating workloads to the cloud and deploying cloud native applications. At the same time, the resulting hybrid multi-cloud architectures can create challenges for identity and access control, as resources and workloads must operate across multiple public clouds and services.

IBM Research’s new open source “Tornjak” project seeks to tackle those challenges head-on. Our goal is to help enterprises embrace this new way of working by providing a consistent level of control, visibility and auditability of workload identities for workloads across various clouds.

Different cloud providers have their own sets of identity and access control systems. These allow strong authentication of workloads and access control management within a cloud provider's own domain, but securing shared resources between clouds can be complex.



Today, when developers want to grant access between clouds, they use one of two common methods:

1. The first is to generate a long-term token or API key. Unfortunately, that approach comes with many downsides because it leaves administrators unable to audit and determine the total impact—or blast radius—of a potential security incident.

2. The second method relies on federation, which is more secure but, today, not very efficient. That's because federation support across different clouds varies greatly. More importantly, each cloud provider has its own notion of identity, schema and trust relationships. That makes creating a holistic federated identity within an organization a complex exercise, whose result is often misconfiguration and mismanagement of access control.
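To make the contrast concrete, here is a minimal, hypothetical Python sketch (none of this reflects an actual cloud provider API): a long-lived API key remains valid until someone remembers to revoke it, while a short-lived, scoped credential bounds the blast radius of a leak by construction.

```python
# Hypothetical illustration of the two access patterns described above.
import secrets
import time

def mint_static_api_key() -> dict:
    """A long-term key: valid until manually revoked, hard to audit."""
    return {"key": secrets.token_hex(16), "expires_at": None}

def mint_short_lived_token(scope: str, ttl_seconds: int = 300) -> dict:
    """A short-lived, scoped token: a leak is bounded by scope and expiry."""
    return {
        "key": secrets.token_hex(16),
        "scope": scope,
        "expires_at": time.time() + ttl_seconds,
    }

def is_valid(token: dict, now: float) -> bool:
    exp = token["expires_at"]
    return exp is None or now < exp

static_key = mint_static_api_key()
scoped = mint_short_lived_token("read:billing-bucket", ttl_seconds=300)

# The static key never expires on its own...
assert is_valid(static_key, now=time.time() + 10 * 365 * 24 * 3600)
# ...while the scoped token stops working after its TTL.
assert not is_valid(scoped, now=time.time() + 600)
```

Auditing the static key means reconstructing everywhere it might have been copied; auditing the scoped token means reviewing a five-minute window for one scope.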

Finding common ground


Tornjak is designed to create common ground for workload identity management. It does that by providing a management layer atop SPIFFE (the Secure Production Identity Framework for Everyone), a universal identity control plane for distributed systems under the aegis of the CNCF, the Cloud Native Computing Foundation.

Tornjak also uses SPIRE, an implementation of the SPIFFE runtime environment. Using SPIFFE and SPIRE as the foundation of a zero-trust security model, Tornjak can help manage the secure provisioning and authentication of identities.
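For readers unfamiliar with the format: a SPIFFE identity is a URI of the form spiffe://&lt;trust-domain&gt;/&lt;workload-path&gt;. The standard-library sketch below parses that shape (the trust domain and path used here are illustrative, not from any real deployment):

```python
# Minimal parser for the SPIFFE ID format, spiffe://<trust-domain>/<path>.
from urllib.parse import urlparse

def parse_spiffe_id(spiffe_id: str) -> tuple[str, str]:
    """Return (trust_domain, workload_path), or raise ValueError."""
    parsed = urlparse(spiffe_id)
    if parsed.scheme != "spiffe" or not parsed.netloc:
        raise ValueError(f"not a SPIFFE ID: {spiffe_id!r}")
    return parsed.netloc, parsed.path

# Illustrative identity: the trust domain scopes the ID, the path names
# the workload within it.
domain, path = parse_spiffe_id("spiffe://prod.example.com/payments/api")
# domain == "prod.example.com", path == "/payments/api"
```

Because every workload in every cloud carries an ID of this one shape, a management layer like Tornjak can reason about identities uniformly instead of juggling each provider's own scheme.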

One of our main goals is to provide CISOs, security operators and auditors the management interfaces and tools necessary to manage their organizations’ workload identities. The combination of SPIFFE, SPIRE and Tornjak should offer organizations stratified workload identity management, simplifying access control without sacrificing security.

“I am very excited about the innovation happening in the zero-trust security. SPIFFE and SPIRE now combined with Tornjak are providing a highly scalable and community-driven solution to address service mesh security,” said Luke Hinds, security engineering lead, Red Hat Office of the CTO.

“It's great to see IBM and others spearheading this work upstream in the community. This is technology that addresses a clear problem space of machine identity trust in cloud native networks.”

Figure 1: The IBM open source project “Tornjak” logo.

Barriers to secure workload identity management


The main challenge of our research is to create a shift in the way cloud users manage and secure their organization’s workload identities.

This challenge manifests itself in two ways:

1. As use of cloud native technology matures and users start to move more workloads to the cloud, they will discover the difficulty of managing workload identities across multiple public clouds. Because there's very little education on how this should be handled, most users end up creating workarounds or employing techniques that may jeopardize security.

2. And from a security and audit perspective, requirements around security are still being formalized. For the most part, users and auditors alike are uncertain how cloud native technology plays into security and compliance. That lack of common understanding and tooling in cloud native environments keeps the technology from reaching the mainstream, relegating cloud native to a much smaller group of early adopters.

Open source can accelerate development


In an effort to keep Tornjak open and available to all, IBM is donating the project to be part of the CNCF, under the SPIFFE community umbrella. The project will join a well-established community of developers, integrators and users—including Bloomberg, ByteDance (developer of TikTok) and GitHub—focused on solving workload identity challenges introduced by hybrid cloud environments. The community also includes Cisco, Google, HPE and others building new tools atop SPIFFE/SPIRE.

In open sourcing Tornjak, IBM’s goal is to accelerate the development of hybrid cloud workload identity solutions. We’re also hoping to highlight the workload identity problem for those unfamiliar with it, and to demonstrate IBM’s close partnership with Red Hat and the open-source community in addressing these challenges. The CNCF SPIFFE community offers us an excellent forum through which we can contribute our ideas and pursue the best identity management solutions.

Tornjak is still in its early development stages—the project has been implemented with the basic functionality for managing identities. Additional work needs to be done to get it ready for enterprise adoption. Our hope is that the open-source community's combined efforts will enable us to achieve a production-ready solution by the end of the year.

Source: ibm.com

Tuesday, 10 August 2021

IBM Research and Red Hat work together to take a load off predictive resource management

PayPal, the global payments giant, has already started putting load-aware scheduling into production.


In today’s ever-changing hybrid cloud field, based on many open-source projects, researchers face two fundamental challenges:

1. Being able to back up their ideas with deep research.
2. Convincing the open source community that their idea is important and enhances existing software frameworks.


Working as one team, scientists at IBM Research and Red Hat joined together to overcome these obstacles and produce tangible solutions in just seven months.

Red Hat OpenShift is the connective tissue between the infrastructure our clients use. It allows users to write applications once and run them anywhere. And it standardizes the approach to development, security, and operations on any cloud, from any vendor. But Kubernetes, the container orchestration engine at the core of OpenShift, has areas where our team saw room for additional features and enhancements.

There are two separate components to the work:

◉ The first is a set of load-aware scheduler plugins, called Trimaran, that factor in the actual usage on the worker nodes—something Kubernetes doesn’t take into account.

◉ The second is a controller that allows developers to automatically resize their containers, called Vertical Pod Autoscaler (VPA).

Today, most developers have to guess how many resources they'll need, or overestimate just to be sure. This controller can resize a container in real time during runtime. We introduced upstream enhancements in Kubernetes that let developers easily incorporate more predictive autoscaling algorithms.
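The load-aware idea behind the first component can be sketched in a few lines. The real Trimaran plugins are Go plugins for the Kubernetes scheduler framework; this hypothetical Python model (target value and scoring curve invented for illustration) only shows the principle of ranking nodes by observed utilization against a target, rather than by declared requests alone:

```python
# Illustrative load-aware node scoring in the spirit of Trimaran.
# Scores are on a 0-100 scale, as Kubernetes Score plugins use.

def score_node(observed_util: float, target_util: float = 0.6) -> float:
    """Higher score = better placement for the next pod.

    Below the target, pack toward it so the fleet stays efficient;
    above the target, the score falls off steeply to avoid hot spots.
    """
    if observed_util <= target_util:
        return 50 + 50 * (observed_util / target_util)
    return max(0.0, 50 * (1 - observed_util) / (1 - target_util))

# Observed (not merely requested) CPU utilization per node.
nodes = {"node-a": 0.30, "node-b": 0.58, "node-c": 0.85}
best = max(nodes, key=lambda n: score_node(nodes[n]))
# node-b sits just under the 60% target, so it wins;
# node-c is nearly saturated and scores lowest despite any free "requests".
```

The point of the sketch is the input, not the curve: the default Kubernetes scheduler scores on declared requests, so a node whose pods under-request but over-consume looks emptier than it is; scoring on measured load avoids exactly that trap.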

In both cases, the work started as open-source projects. Red Hat works with community-created open-source software and builds upon each project to harden security, fix bugs, patch vulnerabilities and add new features so it is ready for the enterprise.

The auto-scaler has just been made generally available and supported by Red Hat in OpenShift 4.8, and the load-aware scheduling is expected to be available in the next release of OpenShift.

“Our collaboration with IBM Research takes an upstream-first approach, helping to fuel innovation in the Kubernetes community.

“When innovation first happens in the Kubernetes community, it provides the opportunity for others to provide feedback. We then build on that feedback and apply it in OpenShift to help solve new customer use cases in the platform. Red Hat is one of the top contributors in the Kubernetes community,” Tushar Katarki, Director of Red Hat OpenShift Product Management, told us.

And the aim is to make these open-source projects that started out as research efforts blossom into use that can have a profound impact on Red Hat’s customers.

“The collaboration between IBM Research and Red Hat OpenShift has resulted in numerous enhancements that expand the intelligence of core OpenShift components,” Red Hat’s Director of OpenShift Engineering, Chris Alfonso, said. “The impact to our customers is significant in terms of managing their workloads in complex environments which demand flexibility in compute resource utilization.”

PayPal, the global payments giant, has already started putting load-aware scheduling into production

“In a large-scale environment like PayPal, the platform team has to assure the efficiency of the fleet while keeping safety in mind,” Shyam Patel, the director of Container Platform & Infrastructure at PayPal, told us.

“Standard scheduling uses declarative resource mapping, and at times workload has higher SKU than they need. In this case, we end up wasting resources. Similarly, we don’t want compute resource utilization to go beyond safe allocation. Trimaran offers resource usage-based scheduling capabilities that greatly helps achieving the optimal usage while maintaining a safety net.”

Source: ibm.com

Saturday, 26 June 2021

Extend privacy assurance in hybrid cloud with IBM Hyper Protect Data Controller


As IBM CEO Arvind Krishna has stated, data breaches and ransomware attacks such as the recent attack on Colonial Pipeline are increasing in frequency and scope, making data protection and privacy more critical than ever. According to a recent study conducted by Ponemon and commissioned by IBM, customers’ personally identifiable information (PII) was the most frequently compromised type of record, impacted in 80% of the data breaches studied. At the same time, many enterprises are adopting hybrid cloud architectures to help them increase agility and drive innovation. In today’s threat landscape, sharing data across a hybrid cloud environment introduces new challenges around maintaining compliance and governance—and new security vulnerabilities that bad actors can take advantage of.

Enterprises need to be able to share data to extract value from it, but how can they maintain privacy assurance in the era of hybrid cloud?

Maintain privacy by policy

Today we announce the latest addition to the IBM Hyper Protect Services family, designed to help you gain a higher level of privacy assurance and maintain data integrity: IBM Hyper Protect Data Controller. This data-centric audit and protection capability allows you to define and control who has access to eligible data as it leaves the system of record and moves throughout your enterprise. With the addition of IBM Hyper Protect Data Controller, the security capabilities and technical assurance associated with Hyper Protect Services help keep your data access policies consistent and protected. Additionally, robust audit logging can help you address your regulatory compliance directives.


The data-centric protection provided by Hyper Protect Data Controller opens a wide range of new possibilities for data sharing, so you can leave non-sensitive data in the clear while keeping sensitive data private. Consider the data used by the call center agent at your bank. The bank stores data in their system of record, and the agent needs access to certain information to assist you—such as the last four digits of your social security number to verify your identity. IBM Hyper Protect Data Controller protects your eligible sensitive data using encryption and masking before it leaves the system of record, and only reveals the data that the agent is authorized to see. This is made possible through a set of centralized policy controls that the data owner can dynamically update when the agent’s access needs change—including revocation of future access if the agent no longer has the call center responsibilities and moves into a different role within the organization.
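A toy sketch of the call-center scenario above (this is not the Hyper Protect Data Controller API; the policy schema and field names are invented for illustration): a central policy names the fields a role may see in the clear, and everything else is masked before it leaves the system of record.

```python
# Hypothetical policy-driven field masking, illustrating the scenario above.

POLICY = {  # role -> fields returned unmasked (illustrative schema)
    "call_center_agent": {"name", "ssn_last4"},
}

def reveal(record: dict, role: str) -> dict:
    """Return the record with every field the role may not see masked."""
    allowed = POLICY.get(role, set())
    return {k: (v if k in allowed else "****") for k, v in record.items()}

record = {
    "name": "J. Doe",
    "ssn_last4": "1234",
    "ssn_full": "987-65-1234",
    "balance": "12,400.00",
}
view = reveal(record, "call_center_agent")
# The agent sees the last four digits for identity verification,
# but the full SSN and balance stay masked: view["ssn_full"] == "****"
```

Because the policy lives in one place, the data owner can revoke the agent role's fields centrally, and every subsequent read reflects the change, which is the dynamic-update behavior the paragraph above describes.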

Prevent unauthorized policy changes

Once a data owner sets policy controls that govern data access, how can they be sure a bad actor won’t modify them? IBM Hyper Protect Data Controller is deployed within IBM Hyper Protect Virtual Servers, which establishes a protective boundary designed to prevent access by unauthorized users—providing the data owner with a tamper-resistant confidential computing environment to set and maintain policy controls for data access.

Whether you are running your workloads with sensitive data in the cloud, on premises or in a hybrid solution, Hyper Protect Services can offer you protection for your sensitive data, keys and now data access policies. We look forward to continuing our journey to protect your data access and use, wherever it resides.

Source: ibm.com

Wednesday, 3 March 2021

Improve IT infrastructure with IBM hybrid cloud storage for IBM Cloud Satellite


The majority of organizations today use a mix of on-premises, edge and public cloud infrastructure. This should be no surprise: according to one recent study, companies derive up to 2.5 times the value from hybrid cloud compared with a single-cloud, single-vendor approach. But as environments grow, pain points related to managing applications across various IT infrastructures grow as well.

Left unchecked, these challenges – including a lack of visibility, maintaining regulatory compliance and latency issues – can lead to decreased productivity, compromised data security and negative user experiences.

So how can you solve these problems and continue to capitalize on hybrid and multicloud adoption?

Enter IBM® Cloud Satellite™, which provides standardized, consistent cloud services across all on-premises, edge and public cloud infrastructure from any cloud vendor. It helps you to manage applications with full visibility across all environments, improving developer productivity, data security and customer experiences.

To understand how IBM Cloud Satellite brings hybrid cloud to the edge, watch our explainer video.

Choosing the right storage solution for IBM Cloud Satellite

IBM Cloud Satellite is supported by IBM hybrid cloud software-defined storage (SDS) solutions: IBM Spectrum® Virtualize and IBM Spectrum® Scale. Deploying IBM Cloud Satellite on premises allows you to choose which storage infrastructure to use:

◉ Block storage: IBM FlashSystem® family and IBM SAN Volume Controller (SVC), both built with IBM Spectrum Virtualize

◉ File storage: IBM Elastic Storage® System (ESS) or an IBM Spectrum Scale software-defined solution

In many cases, this means you can use the IBM storage you already have, which speeds initial implementation, likely reduces costs and means you don’t have to learn how to use a whole new type of storage.

For system data and for high-performance applications such as database workloads, block storage may be your best choice. IBM FlashSystem family and SVC offer enterprise-class storage function, NVMe high performance and great ease of use. All members of the family are built with the same IBM Spectrum Virtualize software. This means that for multiple Satellite deployments, all of your storage works the same way and there’s nothing to re-learn. IBM Spectrum Virtualize for Public Cloud provides the same storage functionality on IBM Cloud, AWS, and soon Microsoft Azure, which simplifies hybrid cloud storage deployments.

For analytics and AI workloads, file storage may be your best choice. ESS nodes deliver either high-performance NVMe storage technology or high-capacity low-cost storage. This gives users the ability to match the storage to their needs. In addition, ESS comes equipped with the industry-leading global data access and unified data services of IBM Spectrum Scale.

Adding ESS nodes multiplies throughput and capacity to over yottabyte scalability and can be integrated into a federated global storage system. By consolidating storage requirements from cloud to edge to the core data center, IBM Spectrum Scale and ESS can simplify storage management and data access and eliminate data silos.

Soaring with Satellite

IBM Storage solutions integrate with IBM Cloud Satellite using Container Storage Interface and come with Satellite templates to simplify deployment. In addition, these solutions are supported with extensive Red Hat® Ansible® automation playbooks.


These storage solutions enable you to share your storage infrastructure—and in the case of ESS and IBM Spectrum Scale, data too—between IBM Cloud Satellite and your existing platforms, including VMware, IBM Power, and bare metal servers. That means that as your workload deployments change, you can retain your storage infrastructure, helping to save costs and maintain consistency.

IBM FlashSystem and SVC go a step further: Their ability to “virtualize” over 500 existing storage systems from IBM and others means that you can use practically any storage with IBM Cloud Satellite with just a small investment in one of these systems.

These solutions are award-winning and proven for hybrid cloud Red Hat OpenShift® container environments. They are integrated and tested with IBM Cloud Satellite.

Key takeaway: IBM Storage solutions provide a single namespace across all workloads and multiple clouds with automated data movement and local high-performance access, optimizing both performance and cost for hybrid clouds. Whichever workloads you plan to deploy with Satellite, there’s an IBM Storage solution available.

Source: ibm.com

Friday, 4 September 2020

Automate AIX and IBM i operations with Ansible on IBM Power Systems


IT organizations are actively looking for ways to modernize software development and IT operations. However, most of them run on multiple operating environments. According to a 2020 HelpSystems survey of IBM i users, more than 80 percent of respondents run other environments such as Linux, AIX and Windows alongside IBM i. Each environment comes with its own set of administrative tools and processes, which makes it challenging to establish modern agile and DevOps processes consistently. Often, modern agile and DevOps processes are adopted only in new implementations. So, how can you successfully apply these processes across the entire IT stack?

Start by establishing a consistent approach to automating IT operations across the various operating system (OS) environments. A manual OS build takes significant admin hours, and its reliability depends on the admin's experience and skills. In fact, one of the common challenges we hear from AIX and IBM i users is that this skills gap is widening. Most organizations run hundreds of these environments, which means admins need to repeat their build processes across multiple environments and validate that the correct security baseline is applied. With manual processes, there is a high likelihood of errors and delays. Automating these processes will enable admins to quickly deliver reliable OS images on demand.
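The pattern such automation relies on can be sketched in a few lines: declare a desired state, compare it with what a system reports, and change only what drifts, so repeated runs converge and are safe to re-run. The hypothetical Python sketch below (the baseline settings are invented examples) illustrates only this idempotency idea behind tools like Ansible, not any Ansible API:

```python
# Illustrative desired-state convergence: the core idea behind idempotent
# configuration management of OS security baselines.

BASELINE = {  # hypothetical security baseline for an OS image
    "ssh_root_login": "no",
    "password_max_days": "90",
}

def converge(current: dict, baseline: dict) -> list[str]:
    """Apply the baseline in place; return the settings that changed."""
    changed = []
    for key, desired in baseline.items():
        if current.get(key) != desired:
            current[key] = desired
            changed.append(key)
    return changed

host = {"ssh_root_login": "yes", "password_max_days": "90"}
assert converge(host, BASELINE) == ["ssh_root_login"]  # first run fixes drift
assert converge(host, BASELINE) == []                  # second run is a no-op
```

Run against hundreds of hosts, the same declaration produces the same end state everywhere, and the returned change list doubles as an audit trail of what actually drifted.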

However, consistent automation across multiple OS environments requires a tool that works across all environments. Red Hat Ansible Automation Platform is that tool. It is built on the open source community project sponsored by Red Hat and can be used across IT teams — from systems and network administrators to developers and managers. Red Hat Ansible provides enterprise-ready solutions to automate your entire application lifecycle — from servers to clouds to containers and everything in between.

Ansible content for AIX and IBM i helps enable IBM Power Systems users to integrate these operating systems into their existing Ansible-based enterprise automation approach. This also helps address the AIX and IBM i skills gap, since admins can leverage their existing Ansible skills to automate these environments. Red Hat Ansible Certified Content for IBM Power Systems, delivered with Red Hat Ansible Automation Platform, is designed to provide easy-to-use modules that can accelerate the automation of operating system configuration management. Users can also take advantage of the open source Ansible community-provided content (i.e., no enterprise support available) to automate hybrid cloud operations on IBM Power Systems.

Source: ibm.com

Thursday, 3 September 2020

4 ways Red Hat OpenShift is helping IBM Power Systems clients


OpenShift® on IBM Power® Systems takes advantage of hybrid cloud flexibility, enterprise AI, and the security and robustness of the Power platform for private and public clouds. OpenShift is the Red Hat® cloud development platform as a service (PaaS) that enables developers to develop and deploy applications on public, private or hybrid cloud infrastructure. OpenShift 4, its latest version, is an Operator-driven platform that delivers full-stack automation from top to bottom. From Kubernetes to the core services that support the OpenShift cluster to the application services deployed by users, everything is managed throughout its lifecycle with Operators.

In this blog post, I’ll highlight 4 ways that OpenShift on IBM Power Systems is helping clients as they modernize applications and move to hybrid cloud.

4 benefits of Red Hat OpenShift 4 on Power


Red Hat OpenShift on Power Systems can be a building block in your journey to hybrid cloud. It’s well known that IBM Power Systems hosts mission-critical workloads and delivers excellent workload performance and reliability across industries. Now you can take advantage of this performance with container workloads using Red Hat OpenShift Container Platform. You can rapidly deploy OpenShift clusters using IBM PowerVC on Power Systems enterprise servers to help modernize your existing workloads.

Here are some of its advantages:

1. Flexibility


OpenShift on Power Systems can be deployed on IBM PowerVM and Red Hat KVM hypervisor, enabling you to use either scale-up or scale-out servers as required. You can also install Red Hat OpenShift bare metal on Power Systems.

2. Performance


Thanks to the simultaneous multithreading (SMT) technology on Power Systems, it's possible to run more threads per core, which reduces the number of cores needed in comparison to an x86 system and achieves up to 3.2X container density per POWER9 core. Depending on the types of workloads used — for example, a large-scale database, an AI or machine learning application or training modules — there's a significant performance boost when using IBM Power Systems.
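As a rough back-of-the-envelope illustration of what that ratio means for sizing (the 3.2X figure comes from the post; the x86 per-core container density below is an invented baseline, not a benchmark result):

```python
# Hypothetical sizing arithmetic for the 3.2X per-core density claim.
import math

X86_CONTAINERS_PER_CORE = 5   # assumed baseline, for illustration only
POWER9_DENSITY_RATIO = 3.2    # ratio cited in the post

containers = 800  # example fleet size
x86_cores = math.ceil(containers / X86_CONTAINERS_PER_CORE)
p9_cores = math.ceil(
    containers / (X86_CONTAINERS_PER_CORE * POWER9_DENSITY_RATIO)
)
# 160 x86 cores vs. 50 POWER9 cores for the same 800 containers
```

Actual density depends heavily on the workload mix, which is why the post qualifies the benefit by workload type.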

3. Better storage ROI


With the introduction of the IBM PowerVC Container Storage Interface (CSI) driver, you can take advantage of existing block storage subsystems as persistent volumes for the container world. The IBM PowerVC CSI driver bridges the need for persistent storage in a container environment by using your existing storage infrastructure.

Also, with the new addition of the IBM Block Storage CSI driver, clients that don’t have PowerVC can leverage this CSI driver to directly access their existing IBM storage. The IBM Block Storage CSI driver for IBM Storage systems can dynamically provision persistent volumes for block or file storage to be used with stateful containers running in Red Hat OpenShift Container Platform.

The savings manifest in many ways: new storage purchases as well as technology and training costs can be reduced or completely avoided.

4. Modernization and the hybrid cloud journey


Modernizing applications has become more critical than ever. Older apps can be difficult and costly to maintain, require antiquated or hard-to-find developer skills, and create a maze of disparate platforms that grow in complexity over time. OpenShift is built to help you make the shift to app modernization with greater ease, efficiency and precision. This Kubernetes-based platform enables you to achieve shorter application development cycles and deliver better quality software.

In addition to the many benefits of OpenShift on Power Systems, your journey to hybrid cloud can be further enhanced by IBM Cloud Pak® solutions built on Red Hat OpenShift, such as the Cloud Pak for Multicloud Management, Cloud Pak for Data and Cloud Pak for Applications. Together they provide a complete solution for AI, machine learning and cloud-native application workloads and management.

Red Hat OpenShift running on Power Systems is currently available on premises and in IBM Cloud data centers across the globe, as well as in partner cloud solutions (like Google Cloud and Skytap), creating strong synergy for your hybrid cloud environment. You can now deploy and manage your applications and services in your data center or in a public cloud. By bringing IBM Power Systems into the heterogeneous container and virtualization environment, OpenShift supports a hybrid, open cloud architecture and brings harmony to the divergent hardware platforms that must coexist to make the business run.

Source: ibm.com

Friday, 21 August 2020

IBM Power Systems Announces POWER10 Processor


The accelerating shift to hybrid cloud models worldwide requires new tools to provide greater flexibility, efficiency and security across the systems enterprises use every day. Today, IBM announced the POWER10 processor at the Hot Chips 2020 conference, bringing an innovative set of capabilities to address these needs.

The IBM POWER10 processor underscores IBM’s belief in the fourth platform of IT: hybrid cloud. With hardware co-optimized for Red Hat software, IBM POWER10-based servers will deliver the future of the hybrid cloud when they become available in the second half of 2021.

The IBM POWER10 processor is equipped with enhancements to meet enterprise demands around capacity, security, energy efficiency, elasticity and scalability. In addition, the POWER10 processor can integrate AI into enterprise business applications to drive the future of enterprise computing.

POWER10 processor innovations


Some of the new innovations of the POWER10 processor include:

◉ IBM’s first commercialized 7nm processor, expected to deliver up to a 3x improvement in capacity and processor energy efficiency within the same power envelope as IBM POWER9, allowing for greater performance.

◉ Support for multi-petabyte memory clusters with a breakthrough new technology called memory inception, designed to improve cloud capacity and economics for memory-intensive workloads from ISVs like SAP, SAS Institute and others, as well as for large-model AI inference.

◉ New hardware-enabled security capabilities, including transparent memory encryption designed to support end-to-end security. The IBM POWER10 processor is engineered to achieve significantly faster encryption performance, with quadruple the number of AES encryption engines compared to IBM POWER9. POWER10 is also updated for today’s most demanding standards and anticipated future cryptographic standards, such as post-quantum and fully homomorphic encryption, and brings new enhancements to container security.

◉ New processor core architectures in the IBM POWER10 processor with an embedded matrix math accelerator which is extrapolated to provide 10x, 15x, and 20x faster AI inference for FP32, BFloat16, and INT8 calculations, respectively, per socket than the IBM POWER9 processor to infuse AI into business applications and drive greater insights.

Designed over five years with hundreds of new and pending patents, the IBM POWER10 processor is an important evolution in IBM’s roadmap for POWER. Systems taking advantage of IBM POWER10 are expected to be available in the second half of 2021.

Samsung will manufacture the IBM POWER10 processor, combining Samsung’s industry-leading semiconductor manufacturing with IBM’s CPU designs.

Monday, 10 August 2020

IBM Z — the digital reinvention continues


In today’s digital world, you must strike a balance between technical and business needs — addressing service delivery, availability, flexibility, skills and time to market — while optimizing your digital transformation to hybrid cloud. The platform underpinning your hybrid cloud strategy must be reliable, flexible and agile to meet your current business needs while helping you to prepare with confidence for the future.

Today, we’re making a series of exciting announcements to help our clients become even more flexible, fast and secure. We are extending the capabilities of IBM z15™ and LinuxONE III across cloud native development, data protection, flexible configuration and resiliency.

Accelerate your journey to cloud 


We’ve made a variety of enhancements enabling our clients to have a common developer experience on IBM Z and LinuxONE, including:

◉ Red Hat OpenShift Container Platform is generally available for IBM Z and IBM LinuxONE and recently Red Hat released OpenShift 4.5 on the platform. This brings together the cloud-native world of containers and Kubernetes with the security, scalability and reliability features of IBM enterprise servers.

◉ Cloud Pak for Applications 4.2: The latest cloud-native application development tools and languages are available on IBM Z, designed to simplify life for developers, operations and architects. Now you can bring new applications to market quicker, leveraging IBM Z’s scale, security, and resiliency.

◉ Expanded access to IBM z/OS Container Extensions (zCX) enables clients to deploy a large ecosystem of open source and Linux on IBM Z applications on their native z/OS environment without requiring a separate Linux server (IFL). The latest open source tools, NoSQL databases, analytics frameworks, and application servers are all now easily accessible.

◉ IBM Secure Execution for Linux extends confidential computing by implementing a scalable trusted execution environment (TEE) on LinuxONE, allowing organizations to further protect and isolate virtual machines.

◉ Red Hat Ansible Certified Content for IBM Z is designed to automate z/OS applications and IT infrastructure as part of a consistent overall automation strategy across different environments, using developer-friendly Ansible tools familiar to your teams.

Protect and keep your data private


Customers need to protect not only the security of their data but also its confidentiality as it travels throughout the enterprise. Pervasive encryption was the first step toward enabling extensive encryption of data in flight and at rest, simplifying data protection while helping to reduce the costs associated with compliance. In addition, IBM offers protection of data privacy as data travels from your system of record to distributed and hybrid cloud environments.

◉ IBM Data Privacy Passports V1.0.1 now supports additional enforcement techniques, providing users with more options to access protected data.

◉ Cryptographic scale is a must-have in a cloud service environment to support a high volume of tenants. IBM z15 and LinuxONE III can now support up to 60 crypto hardware security modules (HSMs) with more domains per HSM, allowing for over 5,100 highly secure virtual HSMs. Similarly, IBM z15 T02 and LinuxONE III LT2 can support up to 40 HSMs, for 1,600 virtual HSMs.
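The virtual-HSM totals quoted above are simply the physical HSM count multiplied by the number of crypto domains per HSM. The per-HSM domain counts below are inferred from those published totals rather than stated in the text, so treat them as illustrative.

```python
# Each physical HSM can be partitioned into multiple crypto domains,
# and each domain acts as an isolated virtual HSM for one tenant.
def virtual_hsms(physical_hsms: int, domains_per_hsm: int) -> int:
    return physical_hsms * domains_per_hsm

# z15 / LinuxONE III: 60 HSMs, ~85 domains each -> 5,100 virtual HSMs
print(virtual_hsms(60, 85))
# z15 T02 / LinuxONE III LT2: 40 HSMs, ~40 domains each -> 1,600 virtual HSMs
print(virtual_hsms(40, 40))
```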

Cyber resiliency lets you run with confidence


Clients realize that to protect their business, they must often fend off increasingly sophisticated threats, recover quickly from downtime, and meet unforeseen spikes in demand — all while delivering competitive service levels. IT resiliency gives clients the ability to adapt to planned or unplanned events while keeping services running continuously. System Recovery Boost now delivers new recovery process boosts to address a range of sysplex recovery processes, including sysplex partitioning, coupling facility structure and coupling facility data sharing member recovery, and HyperSwap recovery. With these enhancements, clients can expedite the return to normal operations and catch up on workload backlog after a sysplex event.

Flexible computing to make your life easier


New enhancements accelerate critical workloads and provide additional data center efficiencies:

◉ The Integrated Accelerator for Z Sort, a new hardware accelerator for sort functions implemented as a CPU co-processor, is designed to reduce elapsed time for critical sort workloads during batch processing and to help improve Db2 database reorganization time.

◉ New IBM z15 T02 flexible physical configuration options set aside reserved space in the rack to integrate select storage devices such as IBM DS8910F and switches. Storage integration can help save space, which is perfect for clients who have smaller I/O configurations and can take advantage of running a smaller footprint.

The z15 and LinuxONE III continuous delivery process extends the innovations delivered earlier in the generation, enabling clients to exploit their IBM Z investments as they continue on their journey to the cloud. The security, resiliency and cloud-native capabilities of IBM z15 and LinuxONE III help clients to leverage their secure and reliable foundation for hybrid cloud, while also allowing them to respond to evolving business pressures — setting them up for success now and in the future.

Source: ibm.com

Tuesday, 28 July 2020

Flexibility and choice with new hybrid cloud capabilities on IBM Power Systems


The unprecedented disruption we have experienced is driving enterprises to prioritize between business continuity and innovation and to rethink how they accelerate digital transformation and modernization.

With the new hybrid cloud enhancements we are announcing today (which will be generally available during this quarter), IBM Power Systems clients can be assured they have the right compute platform to help provide continuity of key business operations and processes and enable digital transformation by easily extending these workloads to private, public or hybrid clouds. They can continue to do so with the same performance, adaptability, resiliency and security they have come to expect from IBM Power Systems.

IBM Power Systems servers, new consumption models and cloud capabilities are designed to help you optimize costs and improve continuity as you build a seamless hybrid cloud environment on the platform ranked the most reliable mainstream server for mission-critical applications. With IBM Power Systems combined with Red Hat OpenShift, IBM Cloud Paks and Red Hat Ansible Automation, you have the keys to a robust, modern IT infrastructure that helps enable you to adopt new technology on your terms, whether driven by rapidly changing business requirements or global situations. Executing your hybrid multicloud strategy is simplified with IBM Power Systems:

◉ Flexibility and choice of cloud consumption models on premises and in the public cloud

◉ Certification of SAP HANA and SAP Applications on IBM Power Virtual Servers

◉ Expanded global access, capacity and workloads for Power Virtual Server

◉ Simplified hybrid cloud management and AIX/IBM i application modernization for cloud agility

IBM Power Private Cloud, more servers, more choices


In 1Q20 we announced IBM Power Systems Private Cloud solutions for our scale-up servers; today we are expanding the offerings with dynamic capacity for the S922 and S924 scale-out systems. The idea of dynamic capacity is not new to us: IBM Power has offered capacity on demand since 2007, and now with the IBM Power Systems Private Cloud solutions you can:

◉ Optimize your resource utilization and associated costs by sharing pay-per-use capacity across systems of the same model. Available for the E980, E950, S922 or S924

◉ Reduce CAPEX with a new base minimum capacity, as low as 1 core and 256 GB, and pay by the minute for capacity used above the aggregated base across systems

◉ Make informed decisions on capacity requirements utilizing IBM Cloud Management Console with granular, real-time & historical views of consumption

◉ Realize greater flexibility to co-locate existing AIX/IBM i and new cloud native applications with 3.2X more containers per core, 2.6X better TCO with OpenShift and 2X throughput improvement with our enhanced scale-out servers.
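The pay-per-use model described in the list above can be sketched in a few lines: an aggregated base of capacity across the pool, with usage above that base metered by the minute. The sampled usage figures and the per-core-minute rate are made-up illustrative values, not IBM pricing.

```python
# Meter only the capacity used above the pool-wide aggregated base,
# sampled once per minute, as in the consumption model described above.
def metered_charge(usage_by_minute, base_cores, rate_per_core_minute):
    """usage_by_minute: cores in use across the pool, one sample per minute."""
    overage = sum(max(0, used - base_cores) for used in usage_by_minute)
    return overage * rate_per_core_minute

# Pool with an aggregated base of 8 cores; three sampled minutes.
usage = [6, 10, 12]  # cores in use each minute across the pool
# Only the 2 + 4 cores above base are billed: 6 core-minutes total.
print(metered_charge(usage, base_cores=8, rate_per_core_minute=0.05))
```

Note that the first minute (6 cores, under the base) incurs no charge at all, which is the point of sharing the base across systems in the pool.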

SAP HANA certification on Power Virtual Server


We are also announcing SAP-certified IaaS to deploy SAP HANA on IBM Power Virtual Server, extending the value we bring to SAP HANA clients. With this new solution, we bring the flexibility and choice of a hybrid environment to SAP customers, from classic SAP ECC on AIX to SAP S/4HANA on Linux. IBM Power Systems has extended its lead in performance with the most powerful SAP-certified server in our IBM Power Systems Virtual Server offering, measured using the SAPS benchmark published by SAP.

More locations and capacity for Power Virtual Server


With growing demand and positive feedback from our clients, we are expanding Power Virtual Server with more capacity in North America and Europe, and new availability in Asia Pacific coming in the second half of 2020. These offerings encompass AIX, IBM i and Linux on Power and are engineered so that customers running these operating systems can accelerate their hybrid cloud strategy across development, pre-production, production and disaster recovery. With a consistent compute platform, clients can avoid refactoring business-critical applications, move data and processes to the cloud, and seamlessly extend to new Linux workloads.

Simplifying hybrid cloud management and automation


IBM Power Systems together with Red Hat OpenShift, IBM Cloud Paks and Ansible Automation is the modern infrastructure foundation for your digital transformation that will streamline the cost, deployment, automation and management of your hybrid multicloud environment. With 98 percent of surveyed organizations saying that they will be using multiple hybrid cloud environments by 2021, the future is clearly hybrid multicloud, and IBM Power Systems is the hybrid cloud compute platform specifically architected for your business-critical processes and operations.

Source: ibm.com

Wednesday, 17 June 2020

Everything should go cloud now, right?


From time to time, we invite industry thought leaders to share their opinions and insights on current technology trends on the IBM Systems IT Infrastructure blog. The opinions in these posts are their own, and do not necessarily reflect the views of IBM.

The state of cloud today


We are now at the stage where the majority of easy workloads have been shifted, representing an estimated 20 percent of compute moved to the cloud. We now enter the next phase, in which the harder workloads will be migrated: those likely to bring the most advantage to the business and the customer in moving to the cloud. This shift represents more than a simple technology change and will lead to new ways of working, where we shall reap the optimal benefits of cloud compute.

However, this will be the harder phase, and one in which many applications may be found too hard or inappropriate to port for the potential benefits. New applications that do not fit the cloud will still be provisioned, and will remain on premises.

Digital transformation projects are often equated with moving everything to the cloud. And yet, according to an Everest Group survey, 73 percent of enterprises failed to derive any business value whatsoever from their digital transformation efforts.

A transformation takes something from one state to another to gain benefit and upside. A purely digital transformation is just a technical change: you move from one technology form factor to another, but that does not necessarily improve or transform the business process.

Too many assume a digital transformation is the process of moving from legacy on-premises to a cloud-based solution. It should mean a review of processes and technical approaches to gain maximum business advantage for both users and customers, and this may include both hosted and on-premises cloud as appropriate.

IDC stated that by 2020, 55 percent of organizations will be digitally determined; those that are not will be digitally distressed. We have seen continued disruption of traditional organizations, with many large brand names struggling, restructuring, being acquired or simply going out of business through digital distress. They simply have not adjusted to the persona needs of their customers in delivering an omni-channel digital experience. Those lacking effective digitization at speed risk their business being marginalized or totally disrupted. Technology and its affordability are no longer the barrier; the willingness, capability and effectiveness of the digital change journey will be the determining factors for future success. The challenge facing most businesses now is selecting the right form factor or cloud type for the right compute need, while reducing the risk that a selection made now becomes a restrictive lock-in commitment!

Applications are everything


This need drives application-level compute demands, with application selections driving the underlying platform, much like the good old days when organizations found themselves with mixed UNIX, Novell NetWare, LAN Manager, NT and VM environments, driven by the applications selected and the operating systems required to run them. Mixed hybrid and multicloud environments have become the norm, not by design but by osmosis. We must accept that we will exist in a multicloud world, and that the luxury of selecting one singular public cloud platform, now and for the future, is an unrealistic expectation.

One cloud doesn’t fit all


The right cloud for a specific application is determined by that app’s individual, discrete metrics, with different application vendors offering varying levels of integration for different platforms and capabilities. This makes it nearly impossible to use one cloud platform across all application needs without restricting your future flexibility and freedom of choice.

Multicloud is a must as we progress in a world where digital and first-to-market capabilities will increasingly separate the “haves” and “have nots”. The digital customer is demanding more of all providers, and consumers will expect agility from their provider, or simply exercise their freedom of choice toward those who deliver. Multiclouds can offer great flexibility; however, challenges of compliance, skill sets, development specifics, monitoring and security remain factors to overcome. The cloud, the network and IT services are converging. Hybrid and multicloud are becoming the norm, and a high percentage of workloads are still on premises. With this in mind, it is critical that any cloud transformation accommodates co-existence, flexibility and portability.

The right platform for the right need, the ability to co-exist more effectively and easily, and the ability to deliver competitive business advantage are key in today’s economy; however, based on a recent Forrester study, 39 percent of IT organizations have suffered a loss of competitive edge, not a gain! What is also clear from the study, and from my engagement with leading business clients, is that delaying decisions and investment in refreshes and upgrades is not a saving but a hindrance, with far greater cost and negative impact on business performance.

The Forrester study commissioned by IBM cited key drivers behind clients’ multicloud strategies, including higher performance, flexibility, improved customer experience and the ability to support change. In today’s pressurized world, with customers who demand more, employees who expect more, and a business likely built in the past, the ability to transform, adjust and become agile and open to more rapid change is critical.

Achieving this is not a one-size-fits-all fix, and it is certainly not an easy task solved by a single platform or vendor. Hybrid cloud strategies have developed quickly, become the norm, and been accepted as the appropriate model for the forward-thinking organization.

Businesses from the largest to the smallest firms are finding that, despite the promised land of single-vendor cloud offerings, in reality the mixing of cloud platforms across SaaS, PaaS and IaaS from multiple vendors is quickly becoming the norm. To deliver the maximum upside for the business, there is a growing understanding and receptiveness that the true cloud world will be a hybrid and multicloud world.

Wednesday, 29 April 2020

Making data smarter


What makes artificial intelligence (AI)-driven applications smarter? High-quality, high-volume data. Now, IBM Storage is integrating file and object data with IBM AI solutions to make data smarter.

Enterprises looking to leverage the power of AI are learning that the accuracy and consistency–among other qualities–of the data fed into AI applications directly affects the quality of the results produced by AI. Better data in, better results out.

But this isn’t the only data-related challenge faced by organizations these days. Many companies have been deluged by so much business-generated data that they can’t keep track of it all. That, or they don’t know as much about their data as they should in order to make the most profitable use of these very valuable assets. Where does any particular data asset reside? What, specifically, is contained in each of those thousands or millions of files? How old is it? Who has access to it–and should they? Is the data compliant with the latest governmental and international regulations?

IBM Spectrum Discover simplifies the organization of data for AI. It is a high-volume data catalog and data management solution that provides a 360-degree view of data for higher productivity and helps optimize data for faster AI analysis.


Traditional metadata solutions for unstructured data aren’t designed to provide the level of detail needed these days for storage consumption and data quality, nor do they integrate well with other AI solutions. IBM Spectrum Discover serves the important role of classifying and labeling data with custom metadata that not only makes it easier to find and recall data for analysis but can actually increase the value of data by imbuing it with additional semantics and meaning.

IBM Spectrum Discover is an on-premises interactive data management catalog that offers a detailed, real-time view of data. It’s non-disruptive to existing storage and applications and provides the ability to create custom indexes and reports for identifying and optimizing enterprise data.
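As a toy illustration of the cataloging idea described above (the concept only, not the IBM Spectrum Discover API), the sketch below attaches custom metadata tags at ingest time and then answers queries from those tags rather than by crawling the storage itself. All names and the classification rule are invented for illustration.

```python
from dataclasses import dataclass, field

@dataclass
class Record:
    """One catalog entry describing a file or object in storage."""
    path: str
    size_bytes: int
    tags: dict = field(default_factory=dict)

class Catalog:
    def __init__(self):
        self.records = []

    def ingest(self, record: Record):
        # A custom classification policy: label records by file type.
        if record.path.endswith(".dcm"):
            record.tags["type"] = "medical-image"
        self.records.append(record)

    def search(self, **criteria):
        # Queries hit only the metadata, never the underlying storage.
        return [r for r in self.records
                if all(r.tags.get(k) == v for k, v in criteria.items())]

cat = Catalog()
cat.ingest(Record("/scans/a.dcm", 4_200_000))
cat.ingest(Record("/logs/app.log", 10_000))
print([r.path for r in cat.search(type="medical-image")])
```

The value comes from the indirection: once data is labeled with meaningful tags at ingest, later consumers (including AI pipelines) can locate it without re-scanning petabytes of storage.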

Innovation within the IBM Spectrum Discover platform has been moving at a brisk pace. Today, IBM Storage is announcing multiple enhancements to the powerful data catalog and data management solution. This includes the capability to utilize IBM Watson® solutions and IBM Cloud Pak for Data to make data “smarter” and more valuable. Now, with just a single click, IBM Spectrum Discover can export data information from file and object storage systems to IBM Watson Knowledge Catalog, plus leverage other IBM Watson AI solutions integrated with the Catalog.

IBM Watson Knowledge Catalog is an open, AI-driven data catalog for managing enterprise data and AI model governance, quality, and collaboration. The Catalog provides a gateway to other tools such as Watson Studio and Watson Machine Learning that together provide wide-ranging security, research, governance, self-service, and data lake management.


IBM Spectrum Discover leverages the capabilities of IBM Cloud Pak for Data to enable secure, high-performance multicloud connectivity, management, and data movement to hybrid cloud-based solutions such as IBM Watson Knowledge Catalog. IBM Cloud Paks enrich basic container platforms such as Red Hat OpenShift and Kubernetes with additional functionality to enhance, automate and accelerate many tasks associated with developing, deploying and maintaining cloud-native applications.

Other recent enhancements to IBM Spectrum Discover include a new Windows SMB connector to ingest information from Windows data and more easily uncover security issues with files that are not protected in connected desktop environments. Also, IBM Spectrum Discover now offers the ability to optimize data in IBM Spectrum Scale with more granularity using a new policy engine.

Finally, to help organizations deal with the disruption caused by the COVID-19 pandemic, IBM is offering eligible customers access to IBM Spectrum Discover for 90 days at no cost.

AI adoption rates are soaring. But getting the most out of AI requires putting the best possible data in. IBM Spectrum Discover with one-click connectivity to IBM Watson Knowledge Catalog offers a solution to transform chaotic streams of unstructured data into well-understood, easily accessible, totally compliant business assets. This is how data gets smarter–and much more valuable to the enterprise.

Source: ibm.com

Thursday, 23 April 2020

Are we running to the cloud hills too quickly?


From time to time, we invite industry thought leaders to share their opinions and insights on current technology trends to the IBM Systems IT Infrastructure blog. The opinions in these posts are their own, and do not necessarily reflect the views of IBM.

With billions gained by cloud technology providers hailing multicloud and hybrid public cloud services as well as cloud-native initiatives, are we overlooking some crucial points when discussing technology migrations? Mid- to large-cap enterprises have substantial technology estates, either on premises or housed within a given data center. Infrastructure-wise, there’s a lot to change when talking cloud specifics for that infrastructure, let alone a cloud-first program!

I’m a big advocate of reusing and repurposing data from the outset as this will bring down the multitude of layered services you will be told to purchase to get the most out of your data. But do we need to layer these services around our estate? The answer that will probably be on the tip of your tongue is “It depends.”

A recent Forrester study, conducted in conjunction with IBM, highlights what is on the minds of CTOs and CIOs when making these cloud decisions. Enterprise technology environments have undergone many refreshes over the last few decades and are bursting out to hybrid cloud services for specific line-of-business applications and services. Our on-premises technology is seen as a management burden, a cost-of-ownership problem and, more often, as old technology! Although some of these statements can be viewed as accurate (depending on your standpoint), I would argue the following points:

Digitalization doesn’t necessarily need an external cloud approach; it requires planning and a focus on providing the right solution. If this method is taken, I bet some of you will agree with me when I say “on-premises technology is your first option” before outsourcing any of your critical IT infrastructure. Most heterogeneous, on-premises technology has evolved to meet the new digital demands, with burst-out to hybrid services. Static site-based technology went through the pain of multiple API services and controls years ago, and we have gotten over the cloud buzz.

The Forrester report findings also sum up current cloud buying trends, highlighting that enterprise IT estates are not a “one-size-fits-all” solution, and that hybrid cloud adoption is on the tip of everyone’s tongue but not yet being executed.

Platforms such as IBM Z have had more R&D dollars pumped into them than most tech platforms (billions), and the latest z15 generation reflects those advancements, which is just phenomenal!

IBM Z, LinuxONE, or even the now-overused term for this technology–mainframe–are all fantastic pieces of tech and are pretty much a cloud in a box (if you wish to use it that way). I have followed this technology for many years, and I’ll say unprompted: if my choice were multiple vendors delivering services (hybrid cloud models), together with having to wade through endless contracts to ensure all liabilities are covered, I wouldn’t take it! (Who would?) If I can have enterprise technology on premises, with some of the best security, speed and resiliency built in, through one provider, I would!

However, it is often the case today that we demand limitless APIs for anything we may need in the future, and so we opt for the public hybrid cloud highway. But there are so many points to weigh when considering this option–security, management, agility and scale, speed, access, the list goes on. Why not opt for localized technology that suits your needs, with overspill to hybrid cloud vendors?

The IBM Systems platforms not only provide an absolute powerhouse of technology but also offer encryption everywhere, at speeds that can’t be matched by most conventional mainstream tech. They have built-in hybrid cloud connectors; they encrypt in hardware, in software and in transit with the new Data Privacy Passports security options; and they are fast, very fast!

Supporting 2.4 million Docker containers on a single system and handling 19 billion encrypted web transactions per day without breaking a sweat are two examples that come to mind.

This technology may not apply to some of you, but my point is this. Don’t automatically jump to grab off-premises services using cloud initiatives. Take a look around and take what works for you. Cloud technology will only get more evolved as we progress with more interconnectivity for bespoke services, but they will also become more diverse and complicated to maintain/support.

I have always said that technical revolutions come around every ten years or so, and the same is true in the business world. We see an explosion of services and IT buzzwords that produces too many vendors for each new trend, and then a consolidation of them over five to six years. We are going through this right now, and if you don’t believe me, check out some of the new options the other prominent tech vendors are offering–on-premises replication from a hybrid cloud service!

Source: ibm.com

Friday, 3 April 2020

Don’t let cloud exuberance stop other IT infrastructure investments


Cloud computing is a growing component of modern IT infrastructure and it will continue to grow into the foreseeable future… but on-premises computing is not going away. Nor should it!

There are many reasons to continue to embrace and even expand your existing on-premises workloads. Compliance, latency requirements, security and data protection, cost issues, and productivity (among others) are all valid reasons to run computing workloads on premises instead of shifting to the cloud.

IBM commissioned Forrester Consulting to evaluate how organizations develop and implement their IT infrastructure strategies. The results were published recently in a study titled The Key to Enterprise Hybrid Multicloud Strategy. And this study clearly backs up the premise of this blog post, that the future is hybrid and not everything will migrate to the cloud.

Indeed, one of the key findings of the study is that the push to public cloud doesn’t mean organizations have stopped investing in on premises. Organizations are increasing their funding for on-premises infrastructure at about the same rate as they are increasing cloud funding. While 82 percent of organizations are planning to increase spending on public cloud, 85 percent are planning to increase spending on existing infrastructure, outside of public cloud.

To me, this is not surprising. Given that many of the applications running on your on-premises IT infrastructure are running your business (that is, they are mission critical), it stands to reason that organizations should want to keep them maintained, operational, and efficient. And that requires investment.

We must never forget that the benefit of the applications that run on premises is substantial. These on-premises applications were built over time, perhaps over many decades, and they embody your core business processes. A pragmatic approach means that your existing enterprise applications must not only be maintained, but perhaps bolstered and improved. This is especially so because many of these applications are being used more heavily than ever before as organizations embrace digital transformation, analytical processing and mobile computing.

If you cease to invest in the software and hardware that run your critical applications, it should be no surprise when they stop working efficiently, or stop working altogether. Investing in the updates and refreshes required to keep your IT infrastructure operational is important, but unfortunately those updates and refreshes tend to be the first things to fall off the agenda when time or budget is tight. It is the silent risk that nobody is talking about.

Isn’t everything going to the cloud?


But what about the industry predictions that everything will be moving to the cloud? This is another example of tech industry over-exuberance. Whenever I hear an absolute term like “everything” I immediately doubt its veracity. Let me put it bluntly: everything will not be moving to the cloud. The cost and effort to do so would not deliver a reasonable return on the investment. There are two primary reasons why this is so:

◉ The majority of existing applications were not built with an understanding of the public cloud and it would take a lot of investment to re-engineer them to properly take advantage of a public cloud architecture.

◉ Even if demand is high, cloud service providers (CSPs) can’t build out their infrastructure fast enough to support all the existing data center capacity “out there” to immediately support everything.

Finally, there are scenarios that do not fit well in public cloud, perhaps in terms of pricing structure or compliance requirements.

The bottom line


The future is hybrid… namely, hybrid multicloud. I’ve written about hybrid multicloud here on the IBM IT Infrastructure blog before, so if that term confuses you be sure to click over and read my definition.

On-premises computing has its place in today’s IT infrastructure and it undoubtedly will in the future. As will, of course, public and private cloud. The bottom line is, and should always be, that you utilize the appropriate platform and technology for the task at hand. And many times, that will mean running applications on your IT infrastructure on premises.

Don’t let cloud obsession curtail your other IT infrastructure investments!

Thursday, 2 April 2020

Address customer data privacy and protection concerns with encryption everywhere

Data privacy and protection remain top boardroom and C-suite issues. The data breach threat still looms large: 59 percent of businesses experienced a data breach caused by a vendor or third party in 2018. As organizations migrate workloads to hybrid multicloud environments, they must ensure that the data within these environments is effectively protected.

Consumers have grown more concerned with the privacy of their data — as have regulators. In 2019, many fines were levied related to GDPR and U.S. Federal Trade Commission regulations. High-profile corporate data breaches and misuses have increased consumer scrutiny of how corporations use and share their data. A new IBM and The Harris Poll study found that almost all consumer respondents (94 percent) agree that businesses should do more to protect their privacy.

These trends, along with recent regulations such as the EU GDPR, the upcoming California Consumer Privacy Act, and Thailand’s Personal Data Protection Act, indicate that the pendulum is swinging toward more privacy and protection of personal data.

In addition to protection, your customers now expect privacy and control of their data. How can you deliver this?

Expand data privacy and protection


Until now, in my experience, both organizations and solutions have typically focused on protecting data at the aggregate level — within entire databases or applications. Existing data-protection solutions tend to be siloed and focus on protecting only data within the IT infrastructure. But data does not stay in one place: it needs to move. The need to manage privacy across multiple disjointed solutions makes enforcing the appropriate use of data (data privacy) across an organization complex.

Pervasive encryption helps prevent the misuse of breached data across your enterprise and keeps data within your direct control. Even if attackers exfiltrate the data, they are unlikely to be able to read it, because it is encrypted and they do not hold the keys.
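The principle can be shown in a few lines of Python. This is an illustrative sketch only, not the pervasive encryption implemented in IBM Z hardware: it derives a keystream from HMAC-SHA256 in counter mode and XORs it with the data, so that a "stolen" copy is unreadable without the key. Production systems should rely on vetted, hardware-backed encryption or an authenticated cipher from a maintained cryptography library.

```python
import hashlib
import hmac
import secrets


def keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    """Derive a keystream by running HMAC-SHA256 in counter mode."""
    out = b""
    counter = 0
    while len(out) < length:
        block = hmac.new(key, nonce + counter.to_bytes(8, "big"),
                         hashlib.sha256).digest()
        out += block
        counter += 1
    return out[:length]


def encrypt(key: bytes, plaintext: bytes) -> bytes:
    """Return nonce || ciphertext; the plaintext is XORed with the keystream."""
    nonce = secrets.token_bytes(16)
    stream = keystream(key, nonce, len(plaintext))
    return nonce + bytes(p ^ s for p, s in zip(plaintext, stream))


def decrypt(key: bytes, blob: bytes) -> bytes:
    """Recover the plaintext; only possible with the original key."""
    nonce, ciphertext = blob[:16], blob[16:]
    stream = keystream(key, nonce, len(ciphertext))
    return bytes(c ^ s for c, s in zip(ciphertext, stream))


key = secrets.token_bytes(32)           # the key stays within your control
record = b"account=4521; balance=10000"
stolen = encrypt(key, record)           # what a breach would actually expose

assert record not in stolen             # the breached copy reveals nothing...
assert decrypt(key, stolen) == record   # ...without the key you still hold
```

The point of the exercise: what leaves your control is ciphertext, and the value of a breach collapses to the attacker's (non-existent) access to your keys.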

Maintain privacy with IBM Data Privacy Passports

Pervasive encryption lets you encrypt all enterprise data, keeping it secured within your on-premises environment. But what about when the data leaves this environment? What about data on other platforms?

Much of your customers’ data lives in the public cloud and is shared with your business partners. Your customers want this data private and easy to control, yet instantly accessible. Consumer mandates for data privacy and protection wherever their data lives require the extension of enterprise-level security beyond your data center’s on-premises architecture. This requires data-centric audit and protection (DCAP): protecting information at the data level rather than broadly at the IT infrastructure level. Think of this level of protection as having encryption everywhere.

Introducing IBM Data Privacy Passports, exclusively for IBM z15™


IBM Data Privacy Passports, a leading-edge DCAP solution available in beta on the new IBM z15, empowers you to build customer trust by keeping data private and secured wherever it goes.

With Data Privacy Passports, you control how data is shared and accessed. Now you can protect and provision data while revoking access to that data at any time, regardless of where the data is located. Data Privacy Passports extends encryption everywhere, enforcing data privacy by policy even when the data leaves your data center and extending IBM Z enterprise-class protection to data from other sources. It enables you to enforce the appropriate use of data across private, public and hybrid clouds at the data level. It does this all without impacting system performance.

A simple example demonstrates how Data Privacy Passports creates strong data privacy and protection for your customers. Consider an international bank that does business with a financial technology company. The bank sets rules governing the company’s use of the bank’s customer data through agreed-upon terms and conditions. Using Data Privacy Passports, it can enforce these rules and limit or revoke access to data as appropriate.

Keep data protected and private


Here’s a closer look at how Data Privacy Passports keeps data private and secured while simplifying compliance and data management.

◉ Protect data wherever it goes. Data does not stay in one place and typical solutions are often fragmented or siloed. Data Privacy Passports addresses this by introducing Trusted Data Objects (TDO), which provide data-centric protection that moves with the data, even with unauthorized copies.

◉ Ensure privacy with controlled data usage. Data Privacy Passports is designed to establish and enforce an enterprise-wide data privacy policy where different views of data are surfaced to different users based on their need to know. TDO technology can also be used to prevent collusion between data owners to use data inappropriately. It does this by breaking the referential integrity between data tables with different owners or limiting that connection based on policy.

◉ Track provenance and consumption of data. Track the data from point of origin to point of consumption, with a central point of auditing information for data access and aggregation for your compliance obligations. End-to-end tracking is achieved by encrypting the data as a Trusted Data Object, so it does not need to be tracked throughout its journey but only when opened with a passport controller. If a user whose access you have revoked tries to access the data through the passport controller, it fails and that fail is logged.

◉ Simplify data management with Embedded Key Management. Data Privacy Passports provides all required key management for TDOs created and distributed throughout your enterprise and beyond. This greatly reduces the complexity of implementing the solution and provides simple management of data as it moves between systems and across hybrid multiclouds.
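The pattern the bullets above describe — policy-gated views of data, revocation at any time, and a central audit trail — can be sketched conceptually in Python. The Data Privacy Passports API is not public in this post, so the class, method, and field names below are hypothetical illustrations of the idea of a passport controller, not the product's interface.

```python
from dataclasses import dataclass, field


@dataclass
class PassportController:
    """Hypothetical central gatekeeper: enforces policy, logs every access."""
    policy: dict                              # user -> fields they may see
    revoked: set = field(default_factory=set)
    audit_log: list = field(default_factory=list)

    def open(self, user: str, protected_record: dict) -> dict:
        """Surface only the fields the policy allows; log the attempt."""
        if user in self.revoked or user not in self.policy:
            self.audit_log.append((user, "DENIED"))
            raise PermissionError(f"access revoked or unknown user: {user}")
        view = {k: v for k, v in protected_record.items()
                if k in self.policy[user]}
        self.audit_log.append((user, "ALLOWED", sorted(view)))
        return view

    def revoke(self, user: str) -> None:
        """Pull a user's access at any time, wherever the data has traveled."""
        self.revoked.add(user)


# Illustrative data and roles, echoing the bank/fintech example above
record = {"name": "A. Customer", "iban": "DE89 0000 0000", "balance": 10000}

controller = PassportController(policy={
    "bank_analyst": {"name", "iban", "balance"},
    "fintech_partner": {"balance"},           # need-to-know view only
})

assert controller.open("fintech_partner", record) == {"balance": 10000}

controller.revoke("fintech_partner")          # terms violated: revoke access
try:
    controller.open("fintech_partner", record)
except PermissionError:
    pass                                      # denial is also audited

assert ("fintech_partner", "DENIED") in controller.audit_log
```

Because every read goes through one controller, auditing and revocation come for free; the real product adds the crucial property that the data itself is encrypted, so bypassing the controller yields only ciphertext.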

Source: ibm.com