A Constellation Research report praises IBM Research for its commitment to fundamental research that's delivering competitive cloud solutions.
Wednesday, 1 September 2021
How IBM Research is creating the future of hybrid cloud
Sunday, 29 August 2021
Open source workload identity management could help secure hybrid clouds
IBM is open sourcing project “Tornjak” to encourage the development and adoption of enterprise-level identity management between clouds.
Finding common ground
Barriers to secure workload identity management
Open source can accelerate development
Tuesday, 10 August 2021
IBM Research and Red Hat work together to take a load off predictive resource management
PayPal, the global payments giant, has already started putting load-aware scheduling into production.
The auto-scaler has just been made generally available and supported by Red Hat in OpenShift 4.8, and the load-aware scheduling is expected to be available in the next release of OpenShift. “Our collaboration with IBM Research takes an upstream-first approach, helping to fuel innovation in the Kubernetes community.”
Saturday, 26 June 2021
Extend privacy assurance in hybrid cloud with IBM Hyper Protect Data Controller
As IBM CEO Arvind Krishna has stated, data breaches and ransomware attacks such as the recent attack on Colonial Pipeline are increasing in frequency and scope, making data protection and privacy more critical than ever. According to a recent study conducted by Ponemon and commissioned by IBM, customers’ personally identifiable information (PII) was the most frequently compromised type of record, impacted in 80% of the data breaches studied. At the same time, many enterprises are adopting hybrid cloud architectures to help them increase agility and drive innovation. In today’s threat landscape, sharing data across a hybrid cloud environment introduces new challenges around maintaining compliance and governance—and new security vulnerabilities that bad actors can take advantage of.
Enterprises need to be able to share data to extract value from it, but how can they maintain privacy assurance in the era of hybrid cloud?
Maintain privacy by policy
Today we announce the latest addition to the IBM Hyper Protect Services family designed to help you gain a higher level of privacy assurance and maintain data integrity: IBM Hyper Protect Data Controller. This data-centric audit and protection capability allows you to define and control who has access to eligible data as it leaves the system of record and moves throughout your enterprise. With the addition of IBM Hyper Protect Data Controller, the security capabilities and technical assurance of Hyper Protect Services help protect your data access policies and keep them consistent. Additionally, robust audit logging can help you address your regulatory compliance directives.
The data-centric protection provided by Hyper Protect Data Controller opens a wide range of new possibilities for data sharing, so you can leave non-sensitive data in the clear while keeping sensitive data private. Consider the data used by a call center agent at your bank. The bank stores data in its system of record, and the agent needs access to certain information to assist you, such as the last four digits of your Social Security number to verify your identity. IBM Hyper Protect Data Controller protects your eligible sensitive data using encryption and masking before it leaves the system of record, and only reveals the data that the agent is authorized to see. This is made possible through a set of centralized policy controls that the data owner can dynamically update when the agent's access needs change, including revoking future access if the agent moves out of the call center into a different role within the organization.
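To make the idea concrete, here is a minimal sketch, in Python, of policy-driven field-level protection. The record fields, role names and masking rules are hypothetical illustrations only, not the actual Hyper Protect Data Controller policy language or API.

```python
# Illustrative sketch only: the real Hyper Protect Data Controller policy
# model and APIs differ; the fields, roles and rules here are hypothetical.

RECORD = {
    "name": "Jane Doe",
    "ssn": "123-45-6789",
    "balance": "10,250.00",
}

# Centralized policy: what each role may see, field by field.
POLICY = {
    "call_center_agent": {
        "name": "reveal",
        "ssn": "mask_last4",   # the agent only needs the last four digits
        "balance": "deny",
    },
}

def view_for(record: dict, role: str) -> dict:
    """Return the view of the record that the given role is allowed to see."""
    view = {}
    for field, value in record.items():
        action = POLICY.get(role, {}).get(field, "deny")
        if action == "reveal":
            view[field] = value
        elif action == "mask_last4":
            view[field] = "***-**-" + value[-4:]
        # "deny" (and any unknown action) omits the field entirely
    return view

print(view_for(RECORD, "call_center_agent"))
# {'name': 'Jane Doe', 'ssn': '***-**-6789'}
```

Revoking the agent's access later is then just an update to the centralized policy; because enforcement happens as the data leaves the system of record, downstream copies do not need to change.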
Prevent unauthorized policy changes
Once a data owner sets policy controls that govern data access, how can they be sure a bad actor won’t modify them? IBM Hyper Protect Data Controller is deployed within IBM Hyper Protect Virtual Servers, which establishes a protective boundary designed to prevent access by unauthorized users—providing the data owner with a tamper-resistant confidential computing environment to set and maintain policy controls for data access.
Whether you are running your workloads with sensitive data in the cloud, on premises or in a hybrid solution, Hyper Protect Services can offer you protection for your sensitive data, keys and now data access policies. We look forward to continuing our journey to protect your data access and use, wherever it resides.
Source: ibm.com
Wednesday, 3 March 2021
Improve IT infrastructure with IBM hybrid cloud storage for IBM Cloud Satellite
The majority of organizations today use a mix of on-premises, edge and public cloud infrastructure. This should be no surprise: according to one recent study, companies derive up to 2.5x the value from hybrid cloud compared with a single-cloud, single-vendor approach. But as environments grow, the pain points of managing applications across varied IT infrastructure grow as well.
Left unchecked, these challenges, including limited visibility, difficulty maintaining regulatory compliance and latency issues, can lead to decreased productivity, compromised data security and negative user experiences.
So how can you solve these problems and continue to capitalize on hybrid and multicloud adoption?
Enter IBM® Cloud Satellite™, which provides standardized, consistent cloud services across all on-premises, edge and public cloud infrastructure from any cloud vendor. It helps you to manage applications with full visibility across all environments, improving developer productivity, data security and customer experiences.
To understand how IBM Cloud Satellite brings hybrid cloud to the edge, watch our explainer video.
Choosing the right storage solution for IBM Cloud Satellite
IBM Cloud Satellite is supported by IBM hybrid cloud software-defined storage (SDS) solutions: IBM Spectrum® Virtualize and IBM Spectrum® Scale. Deploying IBM Cloud Satellite on premises allows you to choose which storage infrastructure to use:
◉ Block storage: IBM FlashSystem® family and IBM SAN Volume Controller (SVC), both built with IBM Spectrum Virtualize
◉ File storage: IBM Elastic Storage® System (ESS) or an IBM Spectrum Scale software-defined solution
In many cases, this means you can use the IBM storage you already have, which speeds initial implementation, likely reduces costs and means you don’t have to learn how to use a whole new type of storage.
For system data and for high-performance applications such as database workloads, block storage may be your best choice. The IBM FlashSystem family and SVC offer enterprise-class storage functionality, high NVMe performance and ease of use. All members of the family are built with the same IBM Spectrum Virtualize software, which means that across multiple Satellite deployments all of your storage works the same way and there's nothing to re-learn. IBM Spectrum Virtualize for Public Cloud provides the same storage functionality on IBM Cloud, AWS and, soon, Microsoft Azure, which simplifies hybrid cloud storage deployments.
For analytics and AI workloads, file storage may be your best choice. ESS nodes deliver either high-performance NVMe storage technology or high-capacity low-cost storage. This gives users the ability to match the storage to their needs. In addition, ESS comes equipped with the industry-leading global data access and unified data services of IBM Spectrum Scale.
Adding ESS nodes multiplies throughput and capacity, scaling beyond a yottabyte, and the nodes can be integrated into a federated global storage system. By consolidating storage requirements from cloud to edge to the core data center, IBM Spectrum Scale and ESS can simplify storage management and data access and eliminate data silos.
Soaring with Satellite
IBM Storage solutions integrate with IBM Cloud Satellite using Container Storage Interface and come with Satellite templates to simplify deployment. In addition, these solutions are supported with extensive Red Hat® Ansible® automation playbooks.
These storage solutions enable you to share your storage infrastructure—and in the case of ESS and IBM Spectrum Scale, data too—between IBM Cloud Satellite and your existing platforms, including VMware, IBM Power, and bare metal servers. That means that as your workload deployments change, you can retain your storage infrastructure, helping to save costs and maintain consistency.
IBM FlashSystem and SVC go a step further: Their ability to “virtualize” over 500 existing storage systems from IBM and others means that you can use practically any storage with IBM Cloud Satellite with just a small investment in one of these systems.
These solutions are award-winning and proven for hybrid cloud Red Hat OpenShift® container environments. They are integrated and tested with IBM Cloud Satellite.
Key takeaway: IBM Storage solutions provide a single namespace across all workloads and multiple clouds with automated data movement and local high-performance access, optimizing both performance and cost for hybrid clouds. Whichever workloads you plan to deploy with Satellite, there’s an IBM Storage solution available.
Source: ibm.com
Friday, 4 September 2020
Automate AIX and IBM i operations with Ansible on IBM Power Systems
Start by establishing a consistent approach to automating IT operations across your various operating system (OS) environments. Manual OS builds take significant admin hours, and their reliability depends on the admin's experience and skills. In fact, one of the common challenges we hear from AIX and IBM i users is that this skills gap is widening. Most organizations run hundreds of these environments, which means admins need to repeat their build processes across multiple environments and validate that the correct security baseline is applied in each. With manual processes, there is a high likelihood of errors and delays. Automating these processes will enable admins to quickly deliver reliable OS images on demand.
However, consistent automation across multiple OS environments requires a tool that works across all environments. Red Hat Ansible Automation Platform is that tool. It is built on the open source community project sponsored by Red Hat and can be used across IT teams — from systems and network administrators to developers and managers. Red Hat Ansible provides enterprise-ready solutions to automate your entire application lifecycle — from servers to clouds to containers and everything in between.
Ansible content for AIX and IBM i helps enable IBM Power Systems users to integrate these operating systems into their existing Ansible-based enterprise automation approach. This also helps address the AIX and IBM i skills gap, since admins can leverage their existing Ansible skills to automate these environments. Red Hat Ansible Certified Content for IBM Power Systems, delivered with Red Hat Ansible Automation Platform, is designed to provide easy-to-use modules that can accelerate the automation of operating system configuration management. Users can also take advantage of the open source Ansible community-provided content (i.e., no enterprise support available) to automate hybrid cloud operations on IBM Power Systems.
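As a rough, hypothetical sketch of what a consistent automation entry point can look like, the Python below simply shells out to ansible-playbook and applies the same baseline playbook to several inventory groups. The playbook, inventory and group names are placeholders; the actual AIX and IBM i modules come from the Red Hat Ansible Certified Content collections.

```python
# Hypothetical sketch: run the same baseline playbook against several OS
# environments. Playbook, inventory and group names are placeholders.
import subprocess
import sys

PLAYBOOK = "security_baseline.yml"   # placeholder: applies your OS security baseline
INVENTORY = "inventory.ini"          # placeholder: lists AIX, IBM i and Linux hosts
HOST_GROUPS = ["aix_prod", "ibmi_prod", "linux_on_power"]

def run_baseline(group: str) -> int:
    """Apply the baseline playbook to one inventory group."""
    cmd = ["ansible-playbook", "-i", INVENTORY, "--limit", group, PLAYBOOK]
    return subprocess.run(cmd).returncode

if __name__ == "__main__":
    failures = [g for g in HOST_GROUPS if run_baseline(g) != 0]
    sys.exit(1 if failures else 0)
```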
Thursday, 3 September 2020
4 ways Red Hat OpenShift is helping IBM Power Systems clients
OpenShift® on IBM Power® Systems takes advantage of hybrid cloud flexibility, enterprise AI, and the security and robustness of the Power platform for private and public clouds. OpenShift is the Red Hat® cloud development platform as a service (PaaS) that enables developers to develop and deploy applications on public, private or hybrid cloud infrastructure. OpenShift 4, its latest version, is an Operator-driven platform that delivers full-stack automation from top to bottom. From Kubernetes to the core services that support the OpenShift cluster to the application services deployed by users, everything is managed throughout its lifecycle with Operators.
In this blog post, I’ll highlight 4 ways that OpenShift on IBM Power Systems is helping clients as they modernize applications and move to hybrid cloud.
4 benefits of Red Hat OpenShift 4 on Power
Red Hat OpenShift on Power Systems can be a building block in your journey to hybrid cloud. It’s well known that IBM Power Systems hosts mission-critical workloads and delivers excellent workload performance and reliability across industries. Now you can take advantage of this performance with container workloads using Red Hat OpenShift Container Platform. You can rapidly deploy OpenShift clusters using IBM PowerVC on Power Systems enterprise servers to help modernize your existing workloads.
Here are some of its advantages:
1. Flexibility
OpenShift on Power Systems can be deployed on the IBM PowerVM or Red Hat KVM hypervisor, enabling you to use either scale-up or scale-out servers as required. You can also install Red Hat OpenShift on bare metal Power Systems.
2. Performance
Thanks to simultaneous multithreading (SMT) on Power Systems, it's possible to run more threads per core, which reduces the number of cores needed compared to an x86 system, achieving up to 3.2x the container density per POWER9 core. Depending on the workload (for example, a large-scale database, an AI or machine learning application, or model training), there can be a significant performance boost when using IBM Power Systems.
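As a back-of-the-envelope illustration of what the cited 3.2x per-core density figure implies for sizing (the x86 baseline below is a made-up assumption; real results depend entirely on the workload):

```python
# Rough sizing arithmetic under the cited 3.2x per-core density figure.
# The x86 baseline of 2 containers per core is a made-up assumption.
containers_needed = 480
x86_containers_per_core = 2
power9_containers_per_core = x86_containers_per_core * 3.2

x86_cores = containers_needed / x86_containers_per_core        # 240 cores
power9_cores = containers_needed / power9_containers_per_core  # 75 cores

print(f"x86 cores: {x86_cores:.0f}, POWER9 cores: {power9_cores:.0f}")
```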
3. Better storage ROI
With the introduction of the IBM PowerVC Container Storage Interface (CSI) driver, you can use your existing block storage subsystems as persistent volumes in the container world. The IBM PowerVC CSI driver bridges the need for persistent storage in a container environment by using your existing storage infrastructure.
Also, with the new IBM Block Storage CSI driver, clients that don't have PowerVC can directly access their existing IBM storage. The IBM Block Storage CSI driver for IBM Storage systems can dynamically provision persistent volumes for block or file storage to be used with stateful containers running in Red Hat OpenShift Container Platform.
The savings manifest in many ways: new storage purchases as well as technology and training costs can be reduced or avoided entirely.
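In practice, once either CSI driver is installed it surfaces your existing storage to OpenShift as a storage class, and stateful workloads request capacity through an ordinary PersistentVolumeClaim. The sketch below uses the Kubernetes Python client; the storage class name "powervc-csi" is a placeholder for whatever class your CSI driver registers in your cluster.

```python
# Sketch: request a persistent volume from a CSI-backed storage class.
# The storage class name "powervc-csi" is a hypothetical placeholder.
from kubernetes import client, config

def create_pvc(namespace: str = "demo", name: str = "db-data") -> None:
    config.load_kube_config()  # or config.load_incluster_config() inside a pod
    pvc = client.V1PersistentVolumeClaim(
        metadata=client.V1ObjectMeta(name=name),
        spec=client.V1PersistentVolumeClaimSpec(
            access_modes=["ReadWriteOnce"],
            storage_class_name="powervc-csi",
            resources=client.V1ResourceRequirements(requests={"storage": "20Gi"}),
        ),
    )
    client.CoreV1Api().create_namespaced_persistent_volume_claim(namespace, pvc)

if __name__ == "__main__":
    create_pvc()
```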
4. Modernization and the hybrid cloud journey
Modernizing applications has become more critical than ever. Older applications can be difficult and costly to maintain, require antiquated or hard-to-find developer skills, and create a maze of disparate platforms that grows in complexity over time. OpenShift is built to help you make the shift to app modernization with greater ease, efficiency and precision. This Kubernetes-based platform enables shorter application development cycles so you can deliver better quality software.
In addition to the many benefits of OpenShift on Power Systems, your journey to hybrid cloud can be further enhanced by IBM Cloud Pak® solutions built on Red Hat OpenShift, such as the Cloud Pak for Multicloud Management, Cloud Pak for Data and Cloud Pak for Applications. Together they provide a complete solution for AI, machine learning and cloud-native application workloads and management.
Red Hat OpenShift running on Power Systems is currently available on premises and in IBM Cloud data centers across the globe, as well as in partner cloud solutions (like Google Cloud and Skytap), creating strong synergy for your hybrid cloud environment. You can now deploy and manage your applications and services in your data center or in a public cloud. OpenShift's ability to bring IBM Power Systems into the heterogeneous container and virtualization environment is focused on a hybrid, open cloud architecture and brings harmony to divergent hardware platforms that must coexist to make the business run.
Friday, 21 August 2020
IBM Power Systems Announces POWER10 Processor
The accelerating shift to hybrid cloud models worldwide requires new tools to provide greater flexibility, efficiency and security across the systems enterprises use every day. Today, IBM announced the POWER10 processor at the Hot Chips 2020 conference, bringing an innovative set of capabilities to address these needs.
The IBM POWER10 processor underscores IBM’s belief in the fourth platform of IT: hybrid cloud. With hardware co-optimized for Red Hat software, IBM POWER10-based servers will deliver the future of the hybrid cloud when they become available in the second half of 2021.
The IBM POWER10 processor is equipped with enhancements to meet enterprise demands around capacity, security, energy efficiency, elasticity and scalability. In addition, the POWER10 processor can integrate AI into enterprise business applications to drive the future of enterprise computing.
POWER10 processor innovations
Some of the new innovations of the POWER10 processor include:
◉ IBM’s first commercialized 7nm processor, expected to deliver up to a 3x improvement in capacity and processor energy efficiency within the same power envelope as IBM POWER9, allowing for greater performance.
◉ Support for multi-petabyte memory clusters with a breakthrough new technology called memory inception, designed to improve cloud capacity and economics for memory-intensive workloads from ISVs like SAP, the SAS Institute and others as well as large-model AI inference.
◉ New hardware-enabled security capabilities including transparent memory encryption designed to support end-to-end security. The IBM POWER10 processor is engineered to achieve significantly faster encryption performance with quadruple the number of AES encryption engines. In comparison to IBM POWER9, POWER10 is updated for today’s most demanding standards and anticipated future cryptographic standards like post-quantum and fully homomorphic encryption, and brings new enhancements to container security.
◉ New processor core architectures in the IBM POWER10 processor with an embedded matrix math accelerator which is extrapolated to provide 10x, 15x, and 20x faster AI inference for FP32, BFloat16, and INT8 calculations, respectively, per socket than the IBM POWER9 processor to infuse AI into business applications and drive greater insights.
Designed over five years with hundreds of new and pending patents, the IBM POWER10 processor is an important evolution in IBM’s roadmap for POWER. Systems taking advantage of IBM POWER10 are expected to be available in the second half of 2021.
Samsung will manufacture the IBM POWER10 processor, combining Samsung’s industry-leading semiconductor manufacturing with IBM’s CPU designs.
Monday, 10 August 2020
IBM Z — the digital reinvention continues
In today’s digital world, you must strike a balance between technical and business needs — addressing service delivery, availability, flexibility, skills and time to market — while optimizing your digital transformation to hybrid cloud. The platform underpinning your hybrid cloud strategy must be reliable, flexible and agile to meet your current business needs while helping you to prepare with confidence for the future.
Today, we’re making a series of exciting announcements to help our clients become even more flexible, fast and secure. We are extending the capabilities of IBM z15™ and LinuxONE III across cloud native development, data protection, flexible configuration and resiliency.
Accelerate your journey to cloud
We’ve made a variety of enhancements enabling our clients to have a common developer experience on IBM Z and LinuxONE, including:
◉ Red Hat OpenShift Container Platform is generally available for IBM Z and IBM LinuxONE and recently Red Hat released OpenShift 4.5 on the platform. This brings together the cloud-native world of containers and Kubernetes with the security, scalability and reliability features of IBM enterprise servers.
◉ Cloud Pak for Applications 4.2: The latest cloud-native application development tools and languages are available on IBM Z, designed to simplify life for developers, operations and architects. Now you can bring new applications to market quicker, leveraging IBM Z’s scale, security, and resiliency.
◉ Expanded access to IBM z/OS Container Extensions (zCX) enables clients to deploy a large ecosystem of open source and Linux on IBM Z applications on their native z/OS environment without requiring a separate Linux server (IFL). The latest open source tools, NoSQL databases, analytics frameworks, and application servers are all now easily accessible.
◉ IBM Secure Execution for Linux extends confidential computing to new heights through the implementation of a scalable trusted execution environment (TEE) on LinuxONE, which allows organizations to further protect and isolate virtual machines.
◉ Red Hat Ansible Certified Content for IBM Z is designed to automate z/OS applications and IT infrastructure as part of a consistent overall automation strategy across different environments, using developer-friendly Ansible tools that are familiar to your teams.
Protect and keep your data private
Customers need to protect not only their data but also its confidentiality as it travels throughout the enterprise. Pervasive encryption was the first step toward enabling extensive encryption of data in flight and at rest, simplifying data protection while helping to reduce the costs associated with compliance. In addition, IBM offers protection of data privacy as data travels from your system of record to distributed and hybrid cloud environments.
◉ IBM Data Privacy Passports V1.0.1 now supports additional enforcement techniques, providing users with more options to access protected data.
◉ Cryptographic scale is a must-have in a cloud service environment to support a high volume of tenants. IBM z15 and LinuxONE III can now support up to 60 crypto hardware security modules (HSMs) and more domains per HSM, allowing for over 5,100 virtual highly secured HSMs (60 HSMs × 85 domains) for ultimate scalability. Similarly, IBM z15 T02 and LinuxONE III LT2 can support up to 40 HSMs, for 1,600 virtual HSMs (40 HSMs × 40 domains).
Cyber resiliency lets you run with confidence
Clients realize that to protect their business, they must often fend off increasingly sophisticated threats, recover quickly from downtime, and meet unforeseen spikes in demand — all while delivering competitive service levels. IT resiliency gives clients the ability to adapt to planned or unplanned events while keeping services running continuously. System Recovery Boost now delivers new recovery process boosts to address a range of sysplex recovery processes, including sysplex partitioning, coupling facility structure and coupling facility data sharing member recovery, and HyperSwap recovery. With these enhancements, clients can expedite the return to normal operations and catch up on workload backlog after a sysplex event.
Flexible computing to make your life easier
New enhancements accelerate critical workloads and provide additional data center efficiencies:
◉ The Integrated Accelerator for Z Sort, a new on-chip co-processor that accelerates sort functions, is designed to reduce elapsed time for critical sort workloads during batch processing and help improve Db2 database reorganization times.
◉ New IBM z15 T02 flexible physical configuration options set aside reserved space in the rack to integrate select storage devices such as IBM DS8910F and switches. Storage integration can help save space, which is perfect for clients who have smaller I/O configurations and can take advantage of running a smaller footprint.
The z15 and LinuxONE III continuous delivery process extends the innovations delivered earlier in the generation, enabling clients to exploit their IBM Z investments as they continue on their journey to the cloud. The security, resiliency and cloud-native capabilities of IBM z15 and LinuxONE III help clients to leverage their secure and reliable foundation for hybrid cloud, while also allowing them to respond to evolving business pressures — setting them up for success now and in the future.
Tuesday, 28 July 2020
Flexibility and choice with new hybrid cloud capabilities on IBM Power Systems
With the new hybrid cloud enhancements we are announcing today (which will be generally available during this quarter), IBM Power Systems clients can be assured they have the right compute platform to help provide continuity of key business operations and processes and enable digital transformation by easily extending these workloads to private, public or hybrid clouds. They can continue to do so with the same performance, adaptability, resiliency and security they have come to expect from IBM Power Systems.
IBM Power Systems servers, new consumption models, and cloud capabilities are designed to help you optimize costs and improve continuity as you build a seamless hybrid cloud environment on the platform ranked the most reliable mainstream server for mission-critical applications. With IBM Power Systems combined with Red Hat OpenShift, IBM Cloud Paks and Red Hat Ansible Automation, you have the keys to a robust, modern IT infrastructure that helps enable you to adopt new technology on your terms, whether driven by rapidly changing business requirements or global situations. Executing your hybrid multicloud strategy is simplified with IBM Power Systems:
◉ Flexibility and choice of cloud consumption models on premises and in the public cloud
◉ Certification of SAP HANA and SAP Applications on IBM Power Virtual Servers
◉ Expanded global access, capacity and workloads for Power Virtual Server
◉ Simplified hybrid cloud management and AIX/IBM i application modernization for cloud agility
IBM Power Private Cloud, more servers, more choices
In 1Q20 we announced IBM Power Systems Private Cloud solutions for our scale-up servers; today we are expanding the offerings with dynamic capacity for the S922 and S924 scale-out systems. The idea of dynamic capacity is not new to us. IBM Power has been offering capacity on demand since 2007, and now with the IBM Power Systems Private Cloud solutions you can:
◉ Optimize your resource utilization and associated costs by sharing pay-per-use capacity across systems of the same model. Available for the E980, E950, S922 or S924
◉ Reduce CAPEX with a new base minimum capacity, as low as 1 core and 256GB, and pay by the minute for capacity used above the aggregated base across systems (see the sketch after this list)
◉ Make informed decisions on capacity requirements utilizing IBM Cloud Management Console with granular, real-time & historical views of consumption
◉ Realize greater flexibility to co-locate existing AIX/IBM i and new cloud native applications with 3.2X more containers per core, 2.6X better TCO with OpenShift and 2X throughput improvement with our enhanced scale-out servers.
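The sketch referenced above is a simplified, hypothetical illustration of how pay-by-the-minute metering above an aggregated base might be computed. The rate, core counts and minutes are invented; actual metering and billing come from IBM Cloud Management Console and your contract.

```python
# Hypothetical illustration of metered capacity above an aggregated base.
# Rates, core counts and minutes are made up; real metering and pricing
# come from IBM Cloud Management Console and your contract.

BASE_CORES = 1                      # aggregated base capacity (paid up front)
RATE_PER_CORE_MINUTE = 0.02         # hypothetical metered rate, in dollars

# (cores in use, minutes at that level) samples across the shared pool
usage_samples = [(1, 600), (4, 120), (8, 45)]

metered_core_minutes = sum(
    max(cores - BASE_CORES, 0) * minutes for cores, minutes in usage_samples
)
charge = metered_core_minutes * RATE_PER_CORE_MINUTE

print(f"Metered core-minutes above base: {metered_core_minutes}")
print(f"Metered charge: ${charge:.2f}")
```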
SAP HANA certification on Power Virtual Server
We are also announcing SAP-certified IaaS to deploy SAP HANA on IBM Power Virtual Server, extending the value we bring to SAP HANA clients. With this new solution, we bring the flexibility and choice of a hybrid environment to SAP customers, from classic SAP ECC on AIX to SAP S/4HANA on Linux. IBM Power Systems has extended its lead in performance with the most powerful SAP-certified server in our IBM Power Systems Virtual Server offering, measured using the SAPS benchmark published by SAP.
More locations and capacity for Power Virtual Server
With growing demand and positive feedback from our clients, we are expanding Power Virtual Server with more capacity in North America and Europe and new availability in Asia Pacific coming in the second half of 2020. These offerings encompass AIX, IBM i and Linux on Power. They are engineered so that customers running these operating systems can accelerate their hybrid cloud strategy across development, pre-production, production and disaster recovery. With a consistent compute platform, clients can avoid refactoring business-critical applications, move data and processes to the cloud, and seamlessly extend to new Linux workloads.
Simplifying hybrid cloud management and automation
IBM Power Systems together with Red Hat OpenShift, IBM Cloud Paks and Ansible Automation is the modern infrastructure foundation for your digital transformation that will streamline the cost, deployment, automation and management of your hybrid multicloud environment. With 98 percent of surveyed organizations saying that they will be using multiple hybrid cloud environments by 2021, the future is clearly hybrid multicloud, and IBM Power Systems is the hybrid cloud compute platform specifically architected for your business-critical processes and operations.
Wednesday, 17 June 2020
Everything should go cloud now, right?
The state of cloud today
We are now at the stage where the majority of easy workloads have been shifted, representing an estimated 20 percent of total compute. We now enter the next phase, in which the harder workloads will be migrated, those that will likely bring the most advantage to the business and the customer in moving to the cloud. This shift represents more than a simple technology change and will lead to new ways of working, where the optimal benefits of cloud computing will finally be realized.
Applications are everything
This need is the driver of application-level compute demands, with application selections driving the underlying platform, much as we experienced in the good old days when organizations found themselves with mixed UNIX, Novell NetWare, LAN Manager, NT and VM environments, driven by the applications selected and the operating systems required to run them. Mixed hybrid and multicloud environments have become the norm, not by design but by osmosis. We must accept that we will be living in a multicloud world, and that selecting one single public cloud platform, now and for the future, is a luxury we can no longer assume.
One cloud doesn’t fit all
The right cloud for a specific application is determined by discrete metrics for that app, with different app vendors offering varying levels of integration for different platforms with different capabilities. This makes it nearly impossible to use one cloud platform across all application needs without restricting your future flexibility and freedom of choice.
Wednesday, 29 April 2020
Making data smarter
Thursday, 23 April 2020
Are we running to the cloud hills too quickly?
With billions gained by cloud technology providers hailing multicloud and hybrid public cloud services as well as cloud-native initiatives, are we overlooking some crucial points when discussing technology migrations? Mid- to large-cap enterprises have substantial technology that exists either in their on-premises tech estate or housed within a data center. Infrastructure-wise, there's a lot to change when talking cloud specifics for that infrastructure, let alone a cloud-first program!
I’m a big advocate of reusing and repurposing data from the outset as this will bring down the multitude of layered services you will be told to purchase to get the most out of your data. But do we need to layer these services around our estate? The answer that will probably be on the tip of your tongue is “It depends.”
A recent Forrester study, conducted in conjunction with IBM, highlights what is on the minds of CTOs and CIOs when making these cloud decisions. Enterprise technology environments have undergone many refreshes over the last few decades and are bursting out to hybrid cloud services for specific line-of-business applications and services. Our on-premises technology is seen as a management burden, a cost-of-ownership problem and, more often, as old technology! Although some of these statements can be viewed as accurate (depending on your standpoint), I would argue the following points:
Digitalization doesn't necessarily need an external cloud approach; it requires planning and a focus on providing the right solution. If this approach is taken, I bet some of you will agree with me when I say “on-premises technology is your first option” before outsourcing some of your critical IT infrastructure. Most heterogeneous, on-premises technology has evolved to meet new digital demands, with burst-out to hybrid services. Static, site-based technology went through the pain of multiple API services and controls years ago, and we have gotten over the cloud buzz.
The Forrester report findings also sum up current cloud buying trends, highlighting that enterprise IT estates are not a “one-size-fits-all” proposition and that hybrid cloud adoption is on the tip of everyone's tongue but not being executed… yet.
Platforms such as IBM Z have had more R&D dollars pumped into them than most tech platforms (billions), and the latest z15 generation reflects those advancements, which is just phenomenal!
The IBM Z family, LinuxONE, or even the now-overused term for this technology, the mainframe, are all fantastic pieces of tech and are pretty much a cloud in a box (if you wish to use them that way). I have followed this technology for many years, and I'll say unprompted that if I had the choice of multiple vendors delivering services (hybrid cloud models), together with having to wade through endless contracts to ensure all liabilities are covered, I wouldn't take it! (Who would?) If I can have enterprise technology on premises, with some of the best security, speed and resiliency built in, from one provider, I would!
However, it is often the case today that we demand limitless APIs for anything we may need in the future, and so we opt for the public hybrid cloud highway. But there are so many points to weigh up when considering this option: security, management, agility and scale, speed, access; the list goes on. Why not opt for localized technology that suits your needs, with overspill to hybrid cloud vendors?
The IBM Systems platforms not only provide an absolute powerhouse of technology but also offer encryption everywhere, with speeds that can't be matched by most conventional mainstream tech. They have built-in hybrid cloud connectors; they encrypt in hardware, in software and in transit, with the new passport data security options; and they are fast, very fast!
Supporting 2.4 million Docker containers on a single system and handling 19 billion encrypted web transactions per day without breaking a sweat are two examples that come to mind.
This technology may not apply to some of you, but my point is this: don't automatically jump to grab off-premises services in the name of cloud initiatives. Take a look around and take what works for you. Cloud technology will only become more evolved as we progress, with more interconnectivity for bespoke services, but it will also become more diverse and complicated to maintain and support.
I have always said that technical revolutions come around every ten years or so, which also holds in the business world. We see an explosion of services and IT buzzwords, with too many vendors chasing each new trend, and then we see a consolidation of them over five to six years. We are going through this right now, and if you don't believe me, check out some of the new options that the other prominent tech vendors are offering: on-premises replication from a hybrid cloud service!
Friday, 3 April 2020
Don’t let cloud exuberance stop other IT infrastructure investments
There are many reasons to continue to embrace and even expand your existing on-premises workloads. Compliance, latency requirements, security and data protection, cost issues, and productivity (among others) are all valid reasons to run computing workloads on premises instead of shifting to the cloud.
IBM commissioned Forrester Consulting to evaluate how organizations develop and implement their IT infrastructure strategies. The results were published recently in a study titled The Key to Enterprise Hybrid Multicloud Strategy. And this study clearly backs up the premise of this blog post, that the future is hybrid and not everything will migrate to the cloud.
Indeed, one of the key findings of the study is that the push to public cloud doesn’t mean organizations have stopped investing in on premises. Organizations are increasing their funding for on-premises infrastructure at about the same rate as they are increasing cloud funding. While 82 percent of organizations are planning to increase spending on public cloud, 85 percent are planning to increase spending on existing infrastructure, outside of public cloud.
To me, this is not surprising. Given that many of the applications running on your on-premises IT infrastructure are running your business (that is, they are mission critical), it stands to reason that organizations should want to keep them maintained, operational, and efficient. And that requires investment.
We must never forget that the benefit of the applications that run on premises is substantial. These on-premises applications were built over time, perhaps over many decades, and they embody your core business processes. A pragmatic approach means that your existing enterprise applications must not only be maintained, but perhaps bolstered and improved. This is especially so because many of these applications are being used more heavily than ever before as organizations embrace digital transformation, analytical processing and mobile computing.
If you cease to invest in the software and hardware that runs your critical applications, it should be no surprise when they stop working efficiently or stop working altogether. Investing in the updates and refreshes required to keep your IT infrastructure operational is important, but unfortunately updates and refreshes tend to be the first things to fall off the agenda when time or budget is constrained. It is the silent risk that nobody is talking about.
Isn’t everything going to the cloud?
But what about the industry predictions that everything will be moving to the cloud? This is another example of tech industry over-exuberance. Whenever I hear an absolute term like “everything” I immediately doubt its veracity. Let me put it bluntly: everything will not be moving to the cloud. The cost and effort to do so would not deliver a reasonable return on the investment. There are two primary reasons why this is so:
◉ The majority of existing applications were not built with an understanding of the public cloud and it would take a lot of investment to re-engineer them to properly take advantage of a public cloud architecture.
◉ Even if demand is high, cloud service providers (CSPs) can't build out their infrastructure fast enough to absorb all the existing data center capacity “out there” and support everything immediately.
Finally, there are scenarios that do not fit well in public cloud, perhaps in terms of pricing structure or compliance requirements.
The bottom line
The future is hybrid… namely, hybrid multicloud. I’ve written about hybrid multicloud here on the IBM IT Infrastructure blog before, so if that term confuses you be sure to click over and read my definition.
On-premises computing has its place in today’s IT infrastructure and it undoubtedly will in the future. As will, of course, public and private cloud. The bottom line is, and should always be, that you utilize the appropriate platform and technology for the task at hand. And many times, that will mean running applications on your IT infrastructure on premises.
Don’t let cloud obsession curtail your other IT infrastructure investments!
Thursday, 2 April 2020
Address customer data privacy and protection concerns with encryption everywhere
Consumers have grown more concerned with the privacy of their data — as have regulators. In 2019, many fines were levied related to GDPR and U.S. Federal Trade Commission regulations. High-profile corporate data breaches and misuses have increased consumer scrutiny of how corporations use and share their data. A new IBM and The Harris Poll study found that almost all consumer respondents (94 percent) agree that businesses should do more to protect their privacy.
These trends, along with recent regulations such as the EU GDPR, the upcoming California Consumer Privacy Act, and Thailand’s Personal Data Protection Act, indicate that the pendulum is swinging toward more privacy and protection of personal data.
In addition to protection, your customers now expect privacy and control of their data. How can you deliver this?
Expand data privacy and protection
Until now, in my experience, both organizations and solutions have typically focused on protecting data at the aggregate level — within entire databases or applications. Existing data-protection solutions tend to be siloed and focus on protecting only data within the IT infrastructure. But data does not stay in one place: it needs to move. The need to manage privacy across multiple disjointed solutions makes enforcing the appropriate use of data (data privacy) across an organization complex.
Pervasive encryption helps prevent the misuse of data from breaches across your enterprise and keeps data within your direct control. Even if attackers exfiltrate the data, they would likely not be able to read it because it is encrypted.
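The underlying principle is easy to demonstrate with any symmetric cipher: without the key, stolen ciphertext is just opaque bytes. The generic Python sketch below (using the cryptography library) illustrates the idea; it is not IBM's pervasive encryption, which operates transparently at the hardware and operating system level.

```python
# Generic demonstration of the principle, not IBM pervasive encryption itself:
# data encrypted at rest is unreadable without the key.
from cryptography.fernet import Fernet, InvalidToken

key = Fernet.generate_key()          # in practice, keys live in an HSM or key manager
record = b"account=12345678;balance=10250.00"

ciphertext = Fernet(key).encrypt(record)
print(ciphertext[:32], b"...")       # what an attacker who steals the data sees

# Only the key holder can recover the plaintext.
print(Fernet(key).decrypt(ciphertext))

# A different (attacker-guessed) key fails outright.
try:
    Fernet(Fernet.generate_key()).decrypt(ciphertext)
except InvalidToken:
    print("decryption without the correct key fails")
```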
Maintain privacy with IBM Data Privacy Passports
Pervasive encryption lets you encrypt all enterprise data, keeping it secured within your on-premises environment. But what about when the data leaves this environment? What about data on other platforms?
Much of your customers’ data lives in the public cloud and is shared with your business partners. Your customers want this data private and easy to control, yet instantly accessible. Consumer mandates for data privacy and protection wherever their data lives require the extension of enterprise-level security beyond your data center’s on-premises architecture. This requires data-centric audit and protection (DCAP): protecting information at the data level rather than broadly at the IT infrastructure level. Think of this level of protection as having encryption everywhere.
Introducing IBM Data Privacy Passports, exclusively for IBM z15™
IBM Data Privacy Passports, a leading-edge DCAP solution available in beta on the new IBM z15, empowers you to build customer trust by keeping data private and secured wherever it goes.
With Data Privacy Passports, you control how data is shared and accessed. Now you can protect and provision data while revoking access to that data at any time, regardless of where the data is located. Data Privacy Passports extends encryption everywhere, enforcing data privacy by policy even when the data leaves your data center and extending IBM Z enterprise-class protection to data from other sources. It enables you to enforce the appropriate use of data across private, public and hybrid clouds at the data level. It does this all without impacting system performance.
A simple example demonstrates how Data Privacy Passports creates strong data privacy and protection for your customers. Consider an international bank that does business with a financial technology company. The bank sets rules governing the company’s use of the bank’s customer data through agreed-upon terms and conditions. Using Data Privacy Passports, it can enforce these rules and limit or revoke access to data as appropriate.
Keep data protected and private
Here’s a closer look at how Data Privacy Passports keeps data private and secured while simplifying compliance and data management.
◉ Protect data wherever it goes. Data does not stay in one place, and typical solutions are often fragmented or siloed. Data Privacy Passports addresses this by introducing Trusted Data Objects (TDOs), which provide data-centric protection that moves with the data, even with unauthorized copies (a minimal illustrative sketch follows this list).
◉ Ensure privacy with controlled data usage. Data Privacy Passports is designed to establish and enforce an enterprise-wide data privacy policy where different views of data are surfaced to different users based on their need to know. TDO technology can also be used to prevent collusion between data owners to use data inappropriately. It does this by breaking the referential integrity between data tables with different owners or limiting that connection based on policy.
◉ Track provenance and consumption of data. Track data from point of origin to point of consumption, with a central point of auditing information for data access and aggregation to support your compliance obligations. End-to-end tracking is achieved by encrypting the data as a Trusted Data Object, so it does not need to be tracked throughout its journey, only when it is opened with a passport controller. If a user whose access you have revoked tries to access the data through the passport controller, the attempt fails and the failure is logged.
◉ Simplify data management with Embedded Key Management. Data Privacy Passports provides all required key management for TDOs created and distributed throughout your enterprise and beyond. This greatly reduces the complexity of implementing the solution and provides simple management of data as it moves between systems and across hybrid multiclouds.
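The actual Trusted Data Object format, passport controller and policy engine belong to the product; the Python sketch below, with invented names, policy and logging, only illustrates the general pattern described in the list above: protection and provenance travel with the object, and every open goes through an audited policy check.

```python
# Illustrative sketch of the data-centric pattern only; the real Trusted Data
# Object format, passport controller and policy engine are not this code.
import json
import logging
from cryptography.fernet import Fernet

logging.basicConfig(level=logging.INFO)
audit = logging.getLogger("passport-controller")

# Hypothetical central policy: which users may open which data classes.
POLICY = {"customer-pii": {"allowed_users": {"fraud_analyst"}}}
# Stand-in for embedded key management: keys never travel with copies of the data.
KEYS = {}


def create_tdo(payload: bytes, data_class: str, origin: str) -> dict:
    """Wrap a payload so protection and provenance travel with every copy."""
    key = KEYS.setdefault(data_class, Fernet.generate_key())
    return {
        "data_class": data_class,
        "origin": origin,  # provenance: point of origin
        "ciphertext": Fernet(key).encrypt(payload).decode(),
    }


def open_tdo(tdo: dict, user: str) -> bytes:
    """Policy check at open time; denials are refused and audited."""
    allowed = POLICY.get(tdo["data_class"], {}).get("allowed_users", set())
    if user not in allowed:
        audit.warning("access denied: user=%s class=%s", user, tdo["data_class"])
        raise PermissionError(f"{user} may not open {tdo['data_class']} data")
    audit.info("access granted: user=%s origin=%s", user, tdo["origin"])
    return Fernet(KEYS[tdo["data_class"]]).decrypt(tdo["ciphertext"].encode())


tdo = create_tdo(b'{"name": "Jane Doe", "ssn": "123-45-6789"}', "customer-pii", "core-banking")
print(json.loads(open_tdo(tdo, "fraud_analyst")))  # authorized user sees the data

try:
    open_tdo(tdo, "revoked_partner")  # revoked or unauthorized user: denied and logged
except PermissionError as err:
    print(err)
```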