Showing posts with label IT infrastructure. Show all posts

Thursday, 25 July 2024

Optimizing data flexibility and performance with hybrid cloud


As the global data storage market is set to more than triple by 2032, businesses face increasing challenges in managing their growing data. In response, the shift to hybrid cloud solutions is transforming data management, enhancing flexibility and boosting performance across organizations.

By focusing on five key aspects of cloud adoption for optimizing data management—from evolving data strategies to ensuring compliance—businesses can create adaptable, high-performing data ecosystems that are primed for AI innovation and future growth.

1. The evolution of data management strategies


Data management is undergoing a significant transformation, especially with the arrival of generative AI. Organizations are increasingly adopting hybrid cloud solutions that blend the strengths of private and public clouds, particularly beneficial in data-intensive sectors and companies embarking on AI strategy to fuel growth. 

A McKinsey & Company study reveals that companies aim to have 60% of their systems in the cloud by 2025, underscoring the importance of flexible cloud strategies. Hybrid cloud solutions address this trend by offering open architectures, combining high performance with scalability. For technical professionals, this shift means working with systems that can adapt to changing needs without compromising on performance or security.

2. Seamless deployment and workload portability


One of the key advantages of hybrid cloud solutions is the ability to deploy across any cloud or on-premises environment in minutes. This flexibility is further enhanced by workload portability through advanced technologies like Red Hat® OpenShift®.  

This capability allows organizations to align their infrastructure with both multicloud and hybrid cloud data strategies, ensuring that workloads can be moved or scaled as needed without being locked into a single environment. This adaptability is crucial for enterprises dealing with varying compliance requirements and evolving business needs. 

3. Enhancing AI and analytics with unified data access


Hybrid cloud architectures are proving instrumental in advancing AI and analytics capabilities. A 2023 Gartner survey reveals that “two out of three enterprises use hybrid cloud to power their AI initiatives”, underscoring its critical role in modern data strategies. By using open formats, these solutions provide unified data access, allowing seamless sharing of data across an organization without the need for extensive migration or restructuring.

Furthermore, advanced solutions like IBM watsonx.data™ integrate vector databases such as Milvus, an open-source solution that enables efficient storage and retrieval of high-dimensional vectors. This integration is crucial for AI and machine learning tasks, particularly in fields like natural language processing and computer vision. By providing access to a wider pool of trusted data, it enhances the relevance and precision of AI models, accelerating innovation in these areas.
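The specifics of the Milvus integration are beyond a short snippet, but the core operation a vector database performs—nearest-neighbor search over embeddings—can be sketched in plain Python. The document ids and vectors below are invented for illustration:

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

def nearest(query, index, top_k=2):
    """Return the top_k stored items most similar to the query vector."""
    scored = [(cosine_similarity(query, vec), key) for key, vec in index.items()]
    scored.sort(reverse=True)
    return [key for _, key in scored[:top_k]]

# Toy "vector store": document embeddings keyed by document id.
index = {
    "doc_cats": [0.9, 0.1, 0.0],
    "doc_dogs": [0.8, 0.2, 0.1],
    "doc_finance": [0.0, 0.1, 0.95],
}

print(nearest([0.85, 0.15, 0.05], index))  # the two animal docs rank first
```

A production system like Milvus does the same ranking over millions of vectors using approximate-nearest-neighbor indexes rather than this exhaustive scan.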

For data scientists and engineers, these features translate to more efficient data preparation for AI models and applications, leading to improved accuracy and relevance in AI-driven insights and predictions. 

4. Optimizing performance with fit-for-purpose query engines


In the realm of data management, the diverse nature of data workloads demands a flexible approach to query processing. watsonx.data offers multiple fit-for-purpose open query engines, such as Presto, Presto C++ and Spark, along with integration capabilities for data warehouse engines like Db2® and Netezza®. This flexibility allows data teams to choose the optimal tool for each task, enhancing both performance and cost-effectiveness.

For instance, Presto C++ can be used for high-performance, low-latency queries on large datasets, while Spark excels at complex, distributed data processing tasks. The integration with established data warehouse engines ensures compatibility with existing systems and workflows. 
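The "fit-for-purpose" idea can be illustrated with a small dispatcher that routes each workload to a suitable engine. The engine names mirror the text, but the selection rules here are invented for this sketch and are not watsonx.data behavior:

```python
# Hypothetical engine selection: route each workload to the engine suited
# to its shape. Thresholds and field names are illustrative only.
def pick_engine(workload):
    interactive = workload.get("interactive", False)
    size_gb = workload.get("size_gb", 0)
    if workload.get("distributed_transform"):
        return "Spark"            # complex, distributed processing jobs
    if interactive and size_gb > 100:
        return "Presto C++"       # low-latency queries on large datasets
    return "Presto"               # general-purpose interactive SQL

print(pick_engine({"interactive": True, "size_gb": 500}))   # Presto C++
print(pick_engine({"distributed_transform": True}))         # Spark
```

In practice the routing decision also weighs cost, concurrency and data locality, but the principle is the same: match the engine to the workload rather than forcing one engine to do everything.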

This flexibility is especially valuable when dealing with diverse data types and volumes in modern businesses. By allowing organizations to optimize their data workloads, watsonx.data addresses the challenges of rapidly proliferating data across various environments.

5. Compliance and data governance in a hybrid world


With increasingly strict data regulations, hybrid cloud architectures offer significant advantages in maintaining compliance and robust data governance. A report by FINRA (Financial Industry Regulatory Authority) demonstrates that hybrid cloud solutions can help firms manage cybersecurity, data governance and business continuity more effectively than by using multiple separate cloud services. 

Unlike pure multicloud setups, which can complicate compliance efforts across different providers, hybrid cloud allows organizations to keep sensitive data on premises or in private clouds while using public cloud resources for less sensitive workloads. IBM watsonx.data enhances this approach with built-in data governance features, such as a single point of entry and robust access control. This approach supports varied deployment needs and restrictions, making it easier to implement consistent governance policies and meet industry-specific regulatory requirements without compromising on security.
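Centralized access control of the kind described above can be reduced to a simple idea: every read passes through one policy check. A minimal sketch, with invented roles and dataset names:

```python
# Minimal sketch of a single-point-of-entry access check. The roles,
# dataset names and policy table are invented for illustration.
POLICIES = {
    "analyst": {"sales_agg", "marketing"},                       # less sensitive
    "compliance_officer": {"sales_agg", "marketing", "pii_customers"},
}

def can_read(role, dataset):
    """Centralized check: every data access goes through this one gate."""
    return dataset in POLICIES.get(role, set())

print(can_read("compliance_officer", "pii_customers"))  # True
print(can_read("analyst", "pii_customers"))             # False
```

Because every request funnels through one gate, governance policies stay consistent regardless of whether the underlying data lives on premises or in a public cloud.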

Embracing hybrid cloud for future-ready data management


The adoption of hybrid cloud solutions marks a significant shift in enterprise data management. By offering a balance of flexibility, performance and control, solutions like IBM watsonx.data are enabling businesses to build more resilient, efficient and innovative data ecosystems. 

As data management continues to evolve, using hybrid cloud strategies will be crucial in shaping the future of enterprise data and analytics. With watsonx.data, organizations can confidently navigate this change, using advanced features to unlock the full potential of their data across hybrid environments and be future ready to embrace AI. 

Source: ibm.com

Tuesday, 25 June 2024

Speed, scale and trustworthy AI on IBM Z with Machine Learning for IBM z/OS v3.2


Recent years have seen a remarkable surge in AI adoption, with businesses doubling down. According to the IBM® Global AI Adoption Index, about 42% of enterprise-scale companies surveyed (> 1,000 employees) report having actively deployed AI in their business. 59% of those companies surveyed that are already exploring or deploying AI say they have accelerated their rollout or investments in the technology. Yet amidst this surge, companies still face significant challenges: navigating the complexities of AI implementation, addressing scalability issues and validating the trustworthiness of AI.

A robust and scalable environment is crucial to accelerating client adoption of AI. It must be capable of converting ambitious AI use cases into reality while enabling real-time AI insights to be generated with trust and transparency.  

What is Machine Learning for IBM z/OS? 


Machine Learning for IBM® z/OS® is an AI platform tailor-made for IBM z/OS environments. It combines data and transaction gravity with AI infusion for accelerated insights at scale with trust and transparency. It helps clients manage their full AI model lifecycles, enabling quick deployment co-located with their mission-critical applications on IBM Z, without data movement and with minimal application changes. Features include explainability, drift detection, train-anywhere capabilities and developer-friendly APIs.

Machine Learning for IBM z/OS use cases


Machine Learning for IBM z/OS can serve various transactional use cases on IBM z/OS. Top use cases include:

1. Real-time fraud detection in credit cards and payments: Large financial institutions are experiencing increasing losses due to fraud. With off-platform solutions, they were only able to screen a small subset of their transactions. In support of this use case, the IBM z16™ system can process up to 228,000 z/OS CICS credit card transactions per second with 6 ms response time, each with an in-transaction fraud detection inference operation using a Deep Learning Model.

Performance result is extrapolated from IBM internal tests running a CICS credit card transaction workload with inference operations on IBM z16. A z/OS V2R4 logical partition (LPAR) configured with 6 CPs and 256 GB of memory was used. Inferencing was done with Machine Learning for IBM z/OS running on Websphere Application Server Liberty 21.0.0.12, using a synthetic credit card fraud detection model and the IBM Integrated Accelerator for AI. Server-side batching was enabled on Machine Learning for IBM z/OS with a size of 8 inference operations. The benchmark was run with 48 threads performing inference operations. Results represent a fully configured IBM z16 with 200 CPs and 40 TB storage. Results might vary. 
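Server-side batching, as used in the benchmark above (batch size 8), amortizes the cost of each model invocation across many transactions. A minimal sketch of the pattern, with a stand-in scoring function in place of the real deep-learning model:

```python
# Sketch of server-side batching: incoming transactions are grouped into
# fixed-size batches so each model invocation scores many at once.
# score_batch is a stand-in, not the benchmark's fraud model.
BATCH_SIZE = 8

def score_batch(batch):
    """Stand-in for one accelerated model invocation over a batch."""
    return [1 if txn["amount"] > 900 else 0 for txn in batch]

def score_stream(transactions):
    results = []
    for i in range(0, len(transactions), BATCH_SIZE):
        results.extend(score_batch(transactions[i:i + BATCH_SIZE]))
    return results

txns = [{"amount": a} for a in (100, 950, 40, 1200)]
print(score_stream(txns))  # [0, 1, 0, 1]
```

On hardware with an inference accelerator, larger batches raise throughput at a modest latency cost, which is why the benchmark tunes the batch size explicitly.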

2. Clearing and settlement: A card processor explored using AI to assist in determining which trades and transactions have a high-risk exposure before settlement to reduce liability, chargebacks and costly investigation. In support of this use case, IBM has validated that the IBM z16 with Machine Learning for IBM z/OS is designed to score business transactions at scale delivering the capacity to process up to 300 billion deep inferencing requests per day with 1 ms of latency.

Performance result is extrapolated from IBM internal tests running local inference operations in an IBM z16 LPAR with 48 IFLs and 128 GB memory on Ubuntu 20.04 (SMT mode) using a synthetic credit card fraud detection model exploiting the Integrated Accelerator for AI. The benchmark was running with 8 parallel threads, each pinned to the first core of a different chip. The lscpu command was used to identify the core-chip topology. A batch size of 128 inference operations was used. Results were also reproduced using a z/OS V2R4 LPAR with 24 CPs and 256 GB memory on IBM z16. The same credit card fraud detection model was used. The benchmark was run with a single thread performing inference operations. A batch size of 128 inference operations was used. Results might vary. 
 
3. Anti-money laundering: A bank was exploring how to introduce AML screening into their instant payments operational flow. Their current end-of-day AML screening was no longer sufficient due to stricter regulations. In support of this use case, IBM has demonstrated that the IBM z16 with z/OS delivers up to 20x lower response time and up to 19x higher throughput when colocating applications and inferencing requests versus sending the same inferencing requests to a compared x86 server in the same data center with 60 ms average network latency.

Performance results based on IBM internal tests using a CICS OLTP credit card workload with in-transaction fraud detection. A synthetic credit card fraud detection model was used. On IBM z16, inferencing was done with MLz on zCX. Tensorflow Serving was used on the compared x86 server. A Linux on IBM Z LPAR, located on the same IBM z16, was used to bridge the network connection between the measured z/OS LPAR and the x86 server. Additional network latency was introduced with the Linux “tc-netem” command to simulate a network environment with 5 ms average latency. Measured improvements are due to network latency. Results might vary. IBM z16 configuration: Measurements were run using a z/OS (v2R4) LPAR with MLz (OSCE) and zCX with APAR- oa61559 and APAR- OA62310 applied, 8 CPs, 16 zIIPs and 8 GB of memory. x86 configuration: Tensorflow Serving 2.4 ran on Ubuntu 20.04.3 LTS on 8 Skylake Intel® Xeon® Gold CPUs @ 2.30 GHz with Hyperthreading turned on, 1.5 TB memory, RAID5 local SSD Storage.  

Machine Learning for IBM z/OS with IBM Z can also be used as a security-focused on-prem AI platform for other use cases where clients want to promote data integrity, privacy and application availability. The IBM z16 systems, with GDPS®, IBM DS8000® series storage with HyperSwap® and running a Red Hat® OpenShift® Container Platform environment, are designed to deliver 99.99999% availability.

Necessary components include IBM z16; IBM z/VM V7.2 systems or above collected in a Single System Image, each running RHOCP 4.10 or above; IBM Operations Manager; GDPS 4.5 for management of data recovery and virtual machine recovery across metro distance systems and storage, including Metro Multisite workload and GDPS Global; and IBM DS8000 series storage with IBM HyperSwap. A MongoDB v4.2 workload was used. Necessary resiliency technology must be enabled, including z/VM Single System Image clustering, GDPS xDR Proxy for z/VM and Red Hat OpenShift Data Foundation (ODF) 4.10 for management of local storage devices. Application-induced outages are not included in the preceding measurements. Results might vary. Other configurations (hardware or software) might provide different availability characteristics. 

Source: ibm.com

Tuesday, 11 June 2024

Mastering budget control in the age of AI: Leveraging on-premises and cloud XaaS for success


As organizations strive to harness the power of AI while controlling costs, leveraging anything as a service (XaaS) models emerges as a strategic approach. In this blog, we’ll explore how businesses can use both on-premises and cloud XaaS to control budgets in the age of AI, driving financial sustainability without compromising on technological advancement.


Embracing the power of XaaS


XaaS encompasses a broad spectrum of cloud-based and on-premises service models that offer scalable and cost-effective solutions to businesses. From software as a service (SaaS) to infrastructure as a service (IaaS), platform as a service (PaaS) and beyond, XaaS enables organizations to access cutting-edge technologies and capabilities without the need for upfront investment in hardware or software.

Harnessing flexibility and scalability 


One of the key advantages of XaaS models is their inherent flexibility and scalability, whether deployed on premises or in the cloud. Cloud-based XaaS offerings provide organizations with the agility to scale resources up or down based on demand, enabling optimal resource utilization and cost efficiency. Similarly, on-premises XaaS solutions offer the flexibility to scale resources within the organization’s own infrastructure, providing greater control over data and security.

Maintaining cost predictability and transparency 


Controlling budgets in the age of AI requires a deep understanding of cost drivers and expenditure patterns. XaaS models offer organizations greater predictability and transparency in cost management by providing detailed billing metrics and usage analytics. With granular insights into resource consumption, businesses can identify opportunities for optimization and allocate budgets more effectively.
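The kind of analysis this metering data enables can be sketched with a small aggregation over usage records. The services, units and rates below are fabricated; real XaaS providers expose similar billing data through their APIs:

```python
# Sketch of usage-based cost attribution from metered billing records.
# Service names, units and rates are invented for illustration.
from collections import defaultdict

def cost_by_service(usage_records, rates):
    """Aggregate metered usage into total cost per service."""
    totals = defaultdict(float)
    for rec in usage_records:
        totals[rec["service"]] += rec["units"] * rates[rec["service"]]
    return dict(totals)

usage = [
    {"service": "storage_gb_month", "units": 500},
    {"service": "gpu_hours", "units": 12},
    {"service": "storage_gb_month", "units": 250},
]
rates = {"storage_gb_month": 0.02, "gpu_hours": 2.5}

print(cost_by_service(usage, rates))
# {'storage_gb_month': 15.0, 'gpu_hours': 30.0}
```

Rolling such aggregates up by team or project is what turns raw billing metrics into the budget visibility described above.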

Outsourcing infrastructure management 


Maintaining and managing on-premises infrastructure for AI workloads can be resource-intensive and costly. By leveraging both cloud-based and on-premises XaaS offerings, organizations can offload the burden of infrastructure management to service providers. Cloud-based XaaS solutions provide scalability, flexibility and access to a wide range of AI tools and services, while on-premises XaaS offerings enable greater control over data governance, compliance and security.

Accessing specialized expertise 


Implementing AI initiatives often requires specialized skills and expertise in areas such as data science, machine learning and AI development. XaaS models provide organizations with access to a vast ecosystem of skilled professionals and service providers who can assist in the design, development and deployment of AI solutions. This access to specialized expertise enables businesses to accelerate time-to-market and achieve better outcomes while controlling costs.

Facilitating rapid experimentation and innovation 


In the age of AI, rapid experimentation and innovation are essential for staying ahead of the competition. XaaS models facilitate experimentation by providing businesses with access to a wide range of AI tools, platforms and services on demand. This enables organizations to iterate quickly, test hypotheses and refine AI solutions without the need for significant upfront investment. Embracing a culture of experimentation helps businesses drive innovation while minimizing financial risk.

Managing budgets effectively 


As organizations navigate the complexities of AI adoption and strive to control budgets, leveraging both on-premises and cloud XaaS models emerges as a strategic imperative. By embracing the flexibility, scalability, cost predictability and access to expertise provided by XaaS offerings, businesses can optimize costs, drive innovation and achieve sustainable growth. Whether deployed on premises or in the cloud, XaaS serves as a catalyst for success, empowering organizations to unlock the full potential of AI while maintaining financial resilience in an ever-evolving business landscape. 

IBM solutions 


Master your AI budget with IBM Storage as a Service and Flexible Capacity on Demand for IBM® Power®. Whether on premises in your data center or in the IBM Cloud®, you can provision, budget and get the same customer experience from these IBM offerings.

Source: ibm.com

Thursday, 2 May 2024

How fintech innovation is driving digital transformation for communities across the globe


To meet the demands of today’s consumers, enterprises must be continuously innovating. But innovation doesn’t happen in silos. Fintechs, for example, have been transformational for the financial services industry, from democratizing finance to establishing digital currencies that revolutionized the way that we think of money.

As fintechs race to keep up with the needs of their customers and co-create with larger financial institutions, they can leverage AI and hybrid cloud solutions to drive true digital transformation and meet these evolving demands. 

How Dollarito is connecting larger financial institutions with financially underserved communities 


According to research from the US Government Accountability Office, roughly 45 million people lack credit scores because they don’t have certain data points that credit scores are based on, which limits their eligibility. Traditional credit report models use parameters such as the status of an active loan or credit card payment records to give an individual a credit score. If someone does not fit within these parameters, it can be difficult to procure a loan, take out a mortgage or even buy a car. However, with a more accurate model, such as one powered by AI, financial institutions can better identify applicants who are fit for credit. This can result in a higher approval rate for these populations that otherwise would typically be overlooked. 
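The idea of scoring applicants on alternative data can be illustrated with a toy logistic model over banking-transaction features. The feature names and weights below are invented for this sketch and are not Dollarito's methodology:

```python
import math

# Toy logistic scorer over alternative-data signals of the kind the text
# describes. Features, weights and bias are invented for illustration.
WEIGHTS = {
    "months_of_steady_income": 0.25,
    "avg_monthly_balance_k": 0.4,
    "overdrafts_last_year": -0.6,
}
BIAS = -2.0

def approval_probability(features):
    """Logistic (sigmoid) score in (0, 1) from weighted features."""
    z = BIAS + sum(WEIGHTS[k] * v for k, v in features.items())
    return 1 / (1 + math.exp(-z))

applicant = {
    "months_of_steady_income": 18,
    "avg_monthly_balance_k": 1.2,
    "overdrafts_last_year": 1,
}
print(round(approval_probability(applicant), 3))
```

The point of such models is that signals like steady income deposits can indicate creditworthiness even when the traditional inputs (active loans, card payment history) are absent.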

Dollarito, a digital lending platform, is focused on helping the Hispanic population with no credit history or low FICO scores access fair credit. The platform offers a unique solution that measures repayment capabilities by using new methodology based on AI, behavioral economics, cloud technology and real-time data. Leveraging AI, Dollarito’s models tap into a wide store of data from banking transactions, behavioral data and economic variables related to the credit applicant’s income source. 

With IBM Cloud for Financial Services, Dollarito, an IBM Business Partner, is able to scale their models continuously and quickly deploy the services that their clients need, while ensuring their services meet the standards and regulations of the industry.  

“Dollarito uses IBM Cloud for Financial Services technologies to optimize infrastructure and demonstrate compliance, allowing us to focus on our mission of providing financial services to underserved communities. We are dedicated to building a bridge of trust between these populations and traditional financial institutions and capital markets. With AI and hybrid cloud technologies from IBM, we are developing solutions to serve these groups in a cost-effective way while addressing risk.” – Carmen Roman, CEO and Founder of Dollarito 

Dollarito is also embracing generative AI, integrating IBM watsonx™ Assistant to help its users interact easily and get financial insights to improve the likelihood of access to credit. Like IBM®, Dollarito recognizes the great opportunity that AI brings for the financial services industry, allowing enterprises to tap into a wealth of new market opportunities.

How Ionburst is helping to protect critical data in a hybrid world 


Data security is central to nearly everything that we do, especially within financial services as banks and other institutions are trusted to protect the most sensitive consumer data. As data now lives everywhere, across multiple clouds, on-premises and at the edge, it is more important than ever before that banks manage their security centrally. And this is where Ionburst comes in. 

With their platform running on IBM Cloud, Ionburst provides data protection across hybrid cloud environments, prioritizing compliance, security and recovery of data. Ionburst’s platform provides a seamless and unified interface allowing for central management of data and is designed to help clients address their regulatory requirements, including data sovereignty, which can ultimately help them reduce compliance costs.  

Ionburst is actively bridging the security gap between data on-premises and the cloud by providing strong security guardrails and integrated data management. With Ionburst’s solution available on IBM Cloud for Financial Services, we are working together to reduce data security risks throughout the financial services industry. 

“It’s critical financial institutions consider how they can best mitigate risk. With Ionburst’s platform, we’re working to give organizations control and visibility over their data everywhere. IBM Cloud’s focus on compliance and security is helping us make this possible and enabling us to give customers confidence that their data is protected – which is critically important in the financial services sector,” – David Lanc and Anne Lanc, Co-Founders and Inventors of Ionburst 

Leveraging the value of ecosystems


Tapping into innovations from fintechs has immensely impacted the financial services industry. As shown by Ionburst and Dollarito, having an innovative ecosystem that supports your mission as a larger financial institution is critical for success and accelerating the adoption of AI and hybrid cloud technology can help drive innovation throughout the industry. 

With IBM Cloud for Financial Services, IBM is positioned to help fintechs ensure that their products and services are compliant and adhere to the same stringent regulations that banks must meet. With security and controls built into the cloud platform and designed by the industry, we aim to help fintechs and larger financial institutions mitigate risk, address evolving regulations and accelerate cloud and AI adoption. 

Source: ibm.com