Thursday, 30 April 2020

Learning during Lockdown: An IBM Digital Credential Case Study

On the frontlines of IBM’s training and education programs, we have seen and experienced many milestones influenced directly by our organization’s digital credentialing strategy. Learner engagement has improved by a significant margin, digital course completion rates have risen sharply, and skills coverage insights have improved dramatically.

These outcomes are achieved in large part through a comprehensive credentialing framework that extends digital credentials across the entire learning journey rather than limiting them to only the highest-stakes achievements. While high-stakes credentials like formal certification may be the ultimate goal, we also recognize the importance of providing the learner with credentials reflecting achievement of important skill-progression milestones and essential discrete levels of expertise acquired incrementally throughout the learning journey. Providing our learners with a more comprehensive and verifiable digital record of their investment in educational activities is a cornerstone of IBM’s credentialing strategy, and it also serves as an excellent way to gain improved visibility into talent development pipelines and trends.

So how does this credentialing strategy hold up during a global pandemic when our targeted audiences are restricted to working and learning in isolation and having to contend with a whole new set of distractions? Will skill development take a back seat to other priorities as individuals are trying to adjust to new ways of working, or will we see an increase in learning engagement and credentialing trends? Will our digital credential strategy serve as a catalyst for learner motivation and engagement during such unprecedented times? These were key questions we set out to find answers for, and we believe the results clearly reinforce the value and importance of investing in a thoughtful digital credential program framework to add value to our education and training initiatives.

What a difference a year and one global pandemic makes


We began our analysis by performing a simple side-by-side calculation of the total number of IBM digital credentials earned worldwide between January 1 and April 15 in both 2019 and 2020. The same analysis was also performed for the period between March 1 and April 15, and again between April 1 and April 15. The last two measurements provided a more direct contrast between the impact of the COVID-19 pandemic and learner engagement in credentialed IBM training activities. The results imply a correlation between regional and country lockdown mandates and earned digital credentials. (Note: COVID-19 was a well-established worldwide health concern by March 1, and a global pandemic was formally declared on March 11.)

◉ Jan 1 – Apr 15: 40% year over year increase in earned IBM digital credentials
◉ Mar 1 – Apr 15: 74% year over year increase in earned IBM digital credentials
◉ Apr 1 – Apr 15: 120% year over year increase in earned IBM digital credentials
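The year-over-year figures above are simple period-over-period ratios. As a minimal sketch, with hypothetical credential counts rather than IBM's actual numbers, the calculation looks like:

```python
def yoy_increase(current: int, prior: int) -> float:
    """Percent increase of the current period over the same prior-year period."""
    return (current - prior) / prior * 100

# Hypothetical credential counts for the same date window in consecutive years
prior_year, current_year = 50_000, 70_000
print(f"{yoy_increase(current_year, prior_year):.0f}% year over year increase")
# prints "40% year over year increase"
```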

These trends are further illustrated in the chart shown in Figure 1:

As the severity of the global pandemic escalated and shelter-at-home mandates grew, the consumption of IBM credentialed learning activities rose almost proportionately. Figure 2 further reinforces this trend by illustrating the almost immediate increase in earned digital credentials in India within 24 hours of the country being placed on lockdown.

We have seen similar trends occurring across many other countries and regions around the globe and expect those trends to continue for some time to come.

What have we learned?



Two important lessons can be taken from this experience:

1. The resiliency of human nature is nothing short of amazing, and IBM’s ecosystem of learners serves as a great reminder of this truth.

2. Never underestimate the impact of a thoughtful and well managed digital credential program as a means to motivate individuals to take ownership of their own skill development and to take advantage of every available opportunity to grow professionally – even during times of adversity.

Make no mistake – the work required to attract and engage learners is never complete. The lifecycle of measuring, analyzing, and adjusting training and credentialing programs is critical for realizing continuous success in capturing, keeping, and growing talent both within and outside the organization. Digital credentials are essential to this mission and should be an integral part of every organization’s training strategy.

Source: ibm.com

Wednesday, 29 April 2020

Making data smarter

What makes artificial intelligence (AI)-driven applications smarter? High-quality, high-volume data. Now, IBM Storage is integrating file and object data with IBM AI solutions to make data smarter.

Enterprises looking to leverage the power of AI are learning that the accuracy and consistency, among other qualities, of the data fed into AI applications directly affect the quality of the results produced by AI. Better data in, better results out.

But this isn’t the only data-related challenge organizations face these days. Many companies have been deluged by so much business-generated data that they can’t keep track of it all, or they don’t know enough about their data to make the most profitable use of these very valuable assets. Where does any particular data asset reside? What, specifically, is contained in each of those thousands or millions of files? How old is it? Who has access to it, and should they? Is the data compliant with the latest governmental and international regulations?

IBM Spectrum Discover simplifies data and AI organization. It is a high-volume data catalog and data management solution that provides a 360-degree view of data for higher productivity and helps to optimize data for faster AI analysis.

Traditional metadata solutions for unstructured data aren’t designed to provide the level of detail needed these days for storage consumption and data quality, or to integrate well with other AI solutions. IBM Spectrum Discover serves the important role of classifying and labeling data with custom metadata that not only makes it easier to find and recall data for analysis but can actually increase the value of data by imbuing it with additional semantics and meaning.

IBM Spectrum Discover is an on-premises interactive data management catalog that offers a detailed, real-time view of data. It’s non-disruptive to existing storage and applications and provides the ability to create custom indexes and reports for identifying and optimizing enterprise data.

Innovation within the IBM Spectrum Discover platform has been moving at a brisk pace. Today, IBM Storage is announcing multiple enhancements to the powerful data catalog and data management solution. This includes the capability to utilize IBM Watson® solutions and IBM Cloud Pak for Data to make data “smarter” and more valuable. Now, with just a single click, IBM Spectrum Discover can export data information from file and object storage systems to IBM Watson Knowledge Catalog, plus leverage other IBM Watson AI solutions integrated with the Catalog.

IBM Watson Knowledge Catalog is an open, AI-driven data catalog for managing enterprise data and AI model governance, quality, and collaboration. The Catalog provides a gateway to other tools such as Watson Studio and Watson Machine Learning that together provide wide-ranging security, research, governance, self-service, and data lake management.

IBM Spectrum Discover leverages the capabilities of IBM Cloud Pak for Data to enable secure, high-performance multicloud connectivity, management, and data movement to hybrid cloud-based solutions such as IBM Watson Knowledge Catalog. IBM Cloud Paks enrich basic container platforms such as Red Hat OpenShift and Kubernetes with additional functionality to enhance, automate and accelerate many tasks associated with developing, deploying and maintaining cloud-native applications.

Other recent enhancements to IBM Spectrum Discover include a new Windows SMB connector to ingest information from Windows data and more easily uncover security issues with files that are not protected in connected desktop environments. Also, IBM Spectrum Discover now offers the ability to optimize data in IBM Spectrum Scale with more granularity using a new policy engine.

Finally, to help organizations deal with the disruption caused by the COVID-19 pandemic, IBM is offering eligible customers access to IBM Spectrum Discover for 90 days at no cost.

AI adoption rates are soaring. But getting the most out of AI requires putting the best possible data in. IBM Spectrum Discover with one-click connectivity to IBM Watson Knowledge Catalog offers a solution to transform chaotic streams of unstructured data into well-understood, easily accessible, totally compliant business assets. This is how data gets smarter–and much more valuable to the enterprise.

Source: ibm.com

Tuesday, 28 April 2020

Your data’s been prepared: Now train the AI model

I recently wrote about how important data preparation is in the lifecycle of an AI project. Providing good data to an AI model is key to making it thrive. However, training it also plays an important role and can impose a few challenges.

An AI model learns by repetition. If you don’t train a model enough, it can’t properly adjust the weights of its neural network. The way you train it also impacts its usefulness and accuracy when it tries to provide an answer to an input with real data. Effectively training an AI model is comparable in some ways to teaching a child in school through repeated instruction, exercises and tests.

Let’s analyze how this works. When you’re training an AI model, first you need to feed it with data. It will try to learn and then go through a test. In each test, the model is scored against the expected answer to determine the accuracy of the model. Ideally, we want an accuracy as close as possible to 100 percent. When the AI model begins learning from training data, it yields low accuracy, meaning it doesn’t yet understand what we’re trying to make it achieve. Thus, to improve its accuracy, it is repetitively exposed to rounds of training. Each round is called an epoch (iteration), and the model is benchmarked again at the end. A child at school, similarly, might not do well in early tests but through repetitive learning can improve until a desired score is reached.
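The epoch-by-epoch loop described above can be sketched in a few lines. This is a toy illustration, not IBM's training stack: a tiny logistic-regression model on synthetic data, scored against the expected answers after each round of training, with accuracy rising as the epochs go by:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy binary-classification data: label is 1 when the feature sum is positive
X = rng.normal(size=(200, 2))
y = (X.sum(axis=1) > 0).astype(float)

w = np.zeros(2)  # model weights, adjusted a little more each epoch
for epoch in range(1, 101):
    p = 1 / (1 + np.exp(-X @ w))        # predicted probabilities
    w -= 0.1 * X.T @ (p - y) / len(y)   # gradient step on the logistic loss
    acc = ((p > 0.5) == y).mean()       # score against the expected answers
    if epoch % 25 == 0:
        print(f"epoch {epoch:3d}: accuracy {acc:.2f}")
```

Early epochs hover near chance; by the final epochs the model has "learned" the separating rule, just as the repeated tests in the school analogy eventually reach the desired score.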

Figure 1 shows the training of a model. Notice how accuracy improves with the number of epochs. The final accuracy of this model, however, is still at around just 60 percent.

Figure 1: Model accuracy over 20K epochs [Source: example from IBM Watson Machine Learning Accelerator]

AI model training challenges


As with learning at school, some problems may occur with an AI model. Suppose that a student goes over a round of exercises; then, at the end, you apply a test that has some of the exact same exercises he or she learned from. That would be an easy test since the child has seen the answers before. That can happen with a model as well, if you benchmark it using the same data used to train it. Therefore, a model needs to be benchmarked for accuracy using different data than what it was trained with. A data scientist must be careful to plan for that during the training.
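A minimal holdout split along those lines can be done with plain NumPy (hypothetical data; real projects typically reach for a library utility such as scikit-learn's train_test_split):

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.normal(size=(100, 3))  # hypothetical training examples

# Hold out 20% of the rows so the "test" questions differ from the "exercises"
n_test = len(X) // 5
idx = rng.permutation(len(X))           # shuffle before splitting
test_idx, train_idx = idx[:n_test], idx[n_test:]

X_train, X_test = X[train_idx], X[test_idx]
assert set(train_idx).isdisjoint(test_idx)  # no overlap: the test stays unseen
print(len(X_train), len(X_test))  # prints "80 20"
```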

Some other common challenges may happen during training. One is overfitting, which happens when the model picks up too much detail from the training data, to the point of treating noise as part of the attributes it’s supposed to learn. It benchmarks well with the training data but fails in the real world. The opposite of overfitting is underfitting, a situation in which the model can’t even comprehend the training data and shows low accuracy even in the training phase. When underfitting happens, it’s usually a sign to suggest a change in the model itself. There are many other problems that may occur in the training phase, such as gradient explosion, overflow, saturation and divergence. Although I won’t cover each one specifically, they are widely explained elsewhere.
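One common way to spot these two failure modes is to compare accuracy on the training data with accuracy on held-out validation data. The thresholds below are illustrative placeholders, not standard values:

```python
def diagnose(train_acc: float, val_acc: float,
             gap: float = 0.15, floor: float = 0.6) -> str:
    """Rough heuristic: low accuracy even on the training data suggests
    underfitting; a large train/validation gap suggests overfitting."""
    if train_acc < floor:
        return "underfitting: consider changing the model itself"
    if train_acc - val_acc > gap:
        return "overfitting: the model likely memorized noise in the training data"
    return "ok"

print(diagnose(0.99, 0.70))  # benchmarks well on training data, fails on new data
print(diagnose(0.45, 0.44))  # poor even on the training data
```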

Hyperparameters


Now that you’re aware of challenges with training, let’s suppose for a minute that you were able to avoid all of them and that you chose a specific algorithm to train your model with. Even in that scenario, you’d still be left with the choice of some parameters that this algorithm uses to control the learning of the neural network. These are called hyperparameters. They are chosen before the training begins and are kept at that same value throughout it. So how do you choose the best set of hyperparameters to use in order to get the best result? This is tough to answer, and usually it’s done by trial and error or by running a hyperparameter search where a round of smaller trainings is run with different values for the parameters, and the winner becomes the set of parameters with the best accuracy overall.
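The trial-and-error search described above can be sketched as a simple random search. The scoring function here is a hypothetical stand-in for a short training run, used only to illustrate the mechanics:

```python
import random

random.seed(0)

def short_trial(lr: float, batch: int) -> float:
    """Hypothetical stand-in for a small training run that returns
    validation accuracy for a given hyperparameter setting."""
    return 1.0 - abs(lr - 0.01) * 20 - abs(batch - 64) / 1000

# Random search: sample candidate settings, run a short trial for each,
# and keep the set of hyperparameters with the best accuracy overall
space = {"lr": [0.001, 0.01, 0.1], "batch": [32, 64, 128]}
trials = [{"lr": random.choice(space["lr"]),
           "batch": random.choice(space["batch"])} for _ in range(10)]
best = max(trials, key=lambda t: short_trial(**t))
print(best)
```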

Hardware makes a difference


Training can be a difficult task on its own, and often it’s a step that’s repeated until you’re satisfied with the quality of the resulting neural network. The faster you can run training, the faster you can fail and learn from your errors to try again. I don’t say this to undermine anyone’s enthusiasm, but to point out that using top-notch hardware to train these AI models makes a big difference. The IBM Power System AC922 supports up to six NVIDIA V100 GPUs, interconnected with NVLink technology and featuring PCIe Gen4 slots, all of which helps tremendously to speed up and refine model training.

Figure 2: IBM AC922, 6x V100 GPUs, NVLink, PCIe Gen4, water-cooled model [Source: https://drive.google.com/open?id=1YWkJESNjgjGT8GDLem5_85nRUhMAcMsf]

Detecting failure early can also save you time. Solutions such as IBM Watson Machine Learning Accelerator with Deep Learning Impact (DLI) can detect training problems such as underfitting, overfitting, gradient explosion, overflow, saturation and divergence while the training is taking place, so you don’t need to wait for hours until the training is complete. Should any of those situations be detected, the tool warns you about it, and you can stop the training, make some adjustments and start over. Finally, DLI can also perform hyperparameter searches and provide you with a set of better values to use. For the data scientists out there: you can control the search with four types of algorithms (random, Tree-based Parzen Estimator, Bayesian and Hyperband), select the universe for the search (the parameter range) and set the amount of time to look for a better answer.
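Independent of any particular tool, the general idea of catching a failing run mid-training can be sketched as a simple loss-history check. The blow-up threshold here is an arbitrary illustration:

```python
import math

def check_loss(history, blowup=10.0):
    """Warn mid-training instead of waiting hours for the run to finish.
    Returns a warning string, or None if the run looks healthy so far."""
    last = history[-1]
    if math.isnan(last) or math.isinf(last):
        return "overflow: loss is no longer a finite number"
    if len(history) > 1 and last > blowup * history[0]:
        return "gradient explosion: loss is growing instead of shrinking"
    return None

print(check_loss([0.9, 0.7, 0.5]))           # healthy run, prints "None"
print(check_loss([0.9, 1.5, float("nan")]))  # stop, adjust, start over
```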

Monday, 27 April 2020

Fueling the next generation of supply chains

In today’s global marketplace, businesses can win or lose on the strength of their supply chain. To deliver market-leading agility, supply chain leaders have spent years digitizing processes, optimizing workflows, and introducing analytics to better understand past events so they can take more informed actions in the future.

Despite enormous investments of time, budget and resources, some first-generation digital supply chains are still optimized in functional silos. For them, collaboration across functions and organizations is still plagued by traditional manual processes, and supply chains of that type can still struggle to meet the needs of the business and the expectations of customers.

The next chapter of digital transformation will require supply chains that are dynamic, responsive and interconnected across ecosystems and processes. Moving forward, you will need to take even bolder strides to drive efficiencies and be resilient to disruptions.

The ultimate goal? Build intelligent, self-correcting supply chains that deepen your competitive differentiation today and are designed for whatever the future holds.

What smart means for supply chain


That’s why I’m thrilled about the IBM Sterling Supply Chain Suite that we’re launching today. It’s an open, integrated platform with embedded Watson AI and IBM Blockchain that easily connects to your supplier and customer ecosystem. It helps you address persistent supply challenges by providing end-to-end visibility, real-time insights and recommended actions to turn disruptions into opportunities for customer engagement, growth and profit.

How smart works for supply chain


The IBM Sterling Supply Chain Suite helps you build an intelligent, self-correcting supply chain with:

◉ Real-time intelligence and actionable recommendations. Applications and control towers, embedded with AI and trained in supply chain, provide end-to-end visibility, real-time alerts and recommendations that can be automated for self-correcting actions to help drive better business outcomes. Clients using individual Sterling applications, such as IBM Sterling Fulfillment Optimizer with Watson, in their supply chains today have lowered shipping cost per order by an average of 7 percent. IBM also used these capabilities in its own global supply chain to reduce disruption mitigation time from days to hours, becoming 95 percent more efficient at tackling recurring supply chain challenges.

◉ Trusted connectivity – built to scale, backed by IBM Blockchain. Our blockchain-enabled, multi-enterprise business network provides frictionless, secured connectivity and collaboration with customers, partners and suppliers. You can quickly engage with more than 800,000 preconnected trading partners executing 3 billion transactions a year.

◉ Open to developers to create tailored solutions. The IBM Sterling Supply Chain Suite allows systems integrators and developers to build, extend and integrate tailored supply chain solutions that can interoperate with other business networks and applications. It also enables you to bring in third-party data so that all connected applications and networks can benefit from it. The Suite’s Developer Hub provides a global community of developers, open-source programs, and a library of knowledge resources to help solve your unique supply chain challenges faster.

◉ Hybrid-cloud integration to extend existing supply chain investments. Instead of requiring time-consuming and expensive migrations, the Suite’s enterprise-ready containerized software, along with IBM Cloud Paks, allows you to extend the value and reach of your legacy applications and data. This hybrid approach means you have the freedom to choose where to run your workloads and the ability to link them to value-added services in the IBM Sterling Supply Chain Suite. For example, IBM Sterling Order Management containers for Red Hat OpenShift will allow you to continue to run your software in your own data center, or in any cloud.

The bottom line: When you have greater transparency and a better understanding of what’s happening across your supply chain, coupled with intelligent applications for key supply chain activities, you can radically improve efficiencies. Achieve lower cost to serve with less chance of over- or under-correction. And make more informed decisions faster. Quite simply, you deliver better business outcomes.

Source: ibm.com

Sunday, 26 April 2020

Can AI transform the supply chain?

Chief Supply Chain Officers (CSCOs) and their organizations often struggle to optimize decision making and manage disruptions because they lack end-to-end visibility. More than 75 percent of CSCOs and supply chain leaders report that they have “very limited” supply chain visibility — and 65 percent say they have limited or no visibility beyond Tier 1 suppliers.

The overwhelming amount of data spread across siloed systems and external sources is a familiar challenge among CSCOs – but they don’t always know how to address it.

AI has emerged among best-in-class supply chain organizations as a textbook solution to enable end-to-end supply chain visibility. It correlates data across siloed systems and external sources, making it easier to quickly implement control tower solutions that provide a single view of data in a personalized dashboard.

However, just one in three supply chain leaders in the retail and manufacturing sectors recognizes AI as one of the “next major steps” in digitizing the supply chain. While adoption rates may lag among some, AI is poised to enable industry-leading supply chain organizations. Gartner recently identified AI as one of the top strategic supply chain technology trends with broad impact, noting that the technology has reached a critical point in terms of supply chain capability and maturity.

Several years ago, IBM began applying AI to its own supply chain when it launched a strategic initiative aimed at improving visibility. The primary objective was to connect supply chain professionals with data and intelligence from across silos and disparate sources, to better prevent disruptions, manage events and quickly respond to any issues. The IBM Supply Chain team believed it was critical to establish a single, end-to-end view of the company’s many Enterprise Resource Planning (ERP) systems, as well as other internal and external sources of data including third-party logistics providers (3PL).

The IBM supply chain team recognized that AI could help address visibility challenges by drawing data directly from the organization’s ERP systems and other relevant sources, analyzing it in real-time, and serving up actionable insights. This would help them to make better decisions faster and take more confident action to resolve disruptions. They also understood that AI could help aggregate key performance indicators (KPIs) and produce smart alerts for specific personnel.

Benefits of AI across the supply chain


Since the IBM supply chain organization began leveraging AI, the company has achieved exceptional results, successfully:

◉ Avoiding any major supply impacts.

◉ Shortening critical supply chain disruption management time from 18+ days to just hours.

◉ Maintaining greater than 95 percent of serviceability targets.

◉ Reducing expedite costs by 52 percent.

◉ Realizing an 18 percent reduction in inventory levels.

Over the past year, IBM also saw a 5 percent decrease in year-over-year operational costs and an 11 percent reduction in year-over-year structural costs.

The success of this AI initiative led IBM to commercialize its related AI capabilities as IBM Sterling Supply Chain Insights with Watson and as blueprints for purpose-built control towers. If you’re interested in learning more, read how global electronics manufacturer Lenovo is harnessing AI to reduce global supply chain costs, complexity and risk.

Source: ibm.com

Saturday, 25 April 2020

The COVID-19 crisis reinforces the need for these supply chain methods

Anyone who has worked in supply chain management for a while knows that disruptions are inevitable. Unlike previous disruptions, the COVID-19 pandemic is less localized and the long-term impact on individual companies and entire industries is still unknown. It does, though, serve as the latest reminder of the importance of making supply chains more resilient to stresses.

IBM has been involved with more than 7,000 successful supply chain deployments around the globe spanning every major industry. Based on what we have learned, here are some suggestions for making your supply chains resilient.

1. Manage the ABCs: Many of us use ABC analysis to classify inventory items so we can work out ways to manage “A” items more aggressively to drive down overall inventory. “A” items account for the highest value over a period of time to your organization, with “B” and “C” items stepping down in relative value. Typically, organizations maintain low inventory and low risk of stockout by closely managing A items, while carrying higher amounts of B and C items to keep workloads down and provide high customer service.

A properly implemented strategy for B and C items will make it easier to respond when events disrupt the supply chain. If you can count on having plenty of B and C items, which aren’t as costly and typically represent 80% to 95% of your overall part numbers, then staff that is stretched thin during times of crisis can focus on solving supply problems for A items – a much smaller number of items. This allows you to keep your overall inventory under good control and deliver on more customer promises, even when the supply chain is impacted by an event.
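As an illustration of the ABC split described above, here is a minimal sketch using hypothetical annual consumption values and the common cut-offs of roughly 80% of value for A and the next 15% for B:

```python
# ABC classification by annual consumption value: "A" items are the small
# fraction of part numbers that account for most of the value.
items = {"pump": 95_000, "valve": 30_000, "seal": 8_000,
         "bolt": 2_500, "washer": 900, "gasket": 600}  # hypothetical $/year

ranked = sorted(items, key=items.get, reverse=True)  # highest value first
total = sum(items.values())

classes, running = {}, 0.0
for name in ranked:
    running += items[name] / total  # cumulative share of total value
    classes[name] = "A" if running <= 0.80 else "B" if running <= 0.95 else "C"

print(classes)
```

In this toy data set a single A item carries most of the value, while the four C items are the bulk of the part numbers, which is exactly the pattern that lets a stretched-thin staff concentrate on the A list during a crisis.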

2. Leverage analytics for shortage analysis: Most of us have seen a supply-versus-demand stockout. But many organizations have never built this simple analysis into a full report for real-time shortage visibility. Supply chain analytics can quickly help you identify precisely which items require the most urgent attention, which items are expected to run out one or two weeks from now, and so on. When you move into crisis mode, it’s critical that you’re able to identify items at risk of stockout so you can take early action.

Adding two relatively simple analytics to the standard supply-demand analysis can create a report that provides the insights you need: the ability to sort the report by first shortage date, and the ability to either include or exclude inbound materials on order. This allows you to do two things:

◉ Identify the first instance of shortage. That way, the earliest problem items—where demand will exceed supply—rise to the top of the report.

◉ Identify what action is needed. By including or excluding items with purchase orders, you can see which items with planned deliveries may need to be expedited, or have deliveries confirmed so you can be confident that supplies will really arrive as scheduled. You’ll need to place new orders for other items as needed.
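Put together, the shortage report described above can be sketched with pandas. The items and the simplified constant weekly demand are hypothetical, used only to show the sort-by-first-shortage and include/exclude-inbound mechanics:

```python
import pandas as pd

# Hypothetical per-item weekly demand vs. on-hand and inbound supply
df = pd.DataFrame({
    "item":    ["A100", "B200", "C300"],
    "on_hand": [50, 200, 30],
    "inbound": [100, 0, 0],       # open purchase orders
    "weekly_demand": [40, 25, 5],
})

def first_shortage_week(row, include_inbound):
    """Week number in which demand first exceeds available supply."""
    supply = row["on_hand"] + (row["inbound"] if include_inbound else 0)
    return int(supply // row["weekly_demand"]) + 1

df["week_excl_po"] = df.apply(first_shortage_week, axis=1, include_inbound=False)
df["week_incl_po"] = df.apply(first_shortage_week, axis=1, include_inbound=True)

# Sort by earliest shortage so the most urgent items rise to the top
print(df.sort_values("week_excl_po")[["item", "week_excl_po", "week_incl_po"]])
```

In this toy data, item A100 runs short first when inbound orders are excluded, but its open purchase order pushes the shortage out by two weeks, which is exactly the signal that tells planners whether to expedite a delivery or place a new order.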

3. Use AI to manage supply chain disruptions: We all talk about being “less reactive and more proactive.” But, in fact, the best tools for managing supply chain disruptions are a combination of (i) tools that detect events faster and allow us to react faster for better outcomes; and (ii) tools that help us anticipate where and when supply chain disruptions are more likely so we can proactively get ahead of them. Both require the assistance of AI.

Let me provide an example. A big electronics manufacturer was hampered by limited visibility into supply planning and time-consuming manual processes. And so, the company was often missing opportunities to make timely decisions and mitigate disruptions. Deliveries would already be late or the options available would be limited and more expensive.

Using IBM Sterling Supply Chain through the Fast Start program, in just six weeks the manufacturer discovered how to automatically correlate supply and carrier data. This includes Advanced Ship Notifications and shipment status updates for greater and faster visibility into inbound logistics. The company also began to use AI to proactively search for and use information about external events that could impact their supply chain, tapping into weather updates, social media, news reports, and customs and border reports.

Empowering its people with real-time insights to make more informed decisions faster, that manufacturer can now deliver better outcomes when delays happen. They can even anticipate where and when disruptions may occur to turn them into opportunities.

Disruptions are inevitable, but good supply-chain practices help ensure that businesses are better prepared. That way, they’re able to navigate disruptions today and adapt to changing markets and business dynamics in the future.

Friday, 24 April 2020

Five key things to know about IBM Systems Lab Services

If your business is looking for infrastructure services, you’ve come to the right place to learn. IBM Systems Lab Services is helping organizations around the world deploy the building blocks of next-generation IT infrastructure — from servers and mainframes to storage systems and software. Through short consulting engagements, we help IBM clients and partners implement, optimize and gain skills on cloud and cognitive infrastructure solutions, among many other services. Our consultants bring a wealth of IT experience and strategic insight to help businesses get the most out of their infrastructure investments. And we can provide that support virtually or at your location.

If you’ve never worked with Lab Services, here are five key things to know about us:

1. Lab Services consultants offer deep technical expertise.


Lab Services is a global team of technical consultants with experience on IBM Power Systems, IBM Storage, IBM Z and LinuxONE, as well as AI, cloud and security. Our teams leverage deep technical know-how, along with proven tools and methodologies, in consulting engagements and workshops. Whether your organization is looking to deploy high-performance servers for multicloud, AI, blockchain and analytics initiatives; to secure your data with physical and software-defined storage solutions; or to maximize your infrastructure investments with software to help you accelerate workloads and simplify administration, Lab Services can help.

2. Lab Services co-creates solutions alongside clients and partners.


Lab Services approaches work with clients through a co-creation lens. Co-creation is a cooperative strategy for developing new business models, products and services. We collaborate with you, IBM Business Partners, product development teams, Client Experience Centers and other IBM service units to design and deliver the innovative solution your business needs to win and grow in the marketplace. Together, following an agile approach, we assess your current environment and needs, define a roadmap, and design and implement the most valuable solution for and with you, so at the end your team is fully enabled to manage the environment and the solution.

3. Lab Services helps clients gain new skills that empower them for the future.


Whether you’re migrating to new hardware, adopting storage solutions for AI or designing a multicloud infrastructure for enterprise transactions, Lab Services not only tackles the current challenge but also transfers skills to your team. Skills transfer is a key component of our engagement model that helps ensure clients have the needed proficiency to manage their solutions going forward.

4. Lab Services delivers technical training.

IBM Systems TechU (TechU) events take place around the world throughout the year to provide education to clients, partners and IBM employees. From April through July 2, 2020, physical events have been replaced by a series of no-charge weekly webcasts called TechU Talks. TechU offers focused, in-depth training sessions delivered by IBM Distinguished Engineers, developers or product experts to help clients and partners learn, grow and connect with developers, industry leaders and IBM executives.

5. Lab Services helps clients adopt cutting-edge hybrid cloud and AI solutions.


Lab Services has helped numerous clients design hybrid multicloud infrastructures and implement enterprise AI applications. We understand the challenges clients face around hybrid cloud and can serve as trusted advisors at any stage of the cloud journey — from design to management to optimization. That includes support for the use of IBM Cloud Paks, which can help businesses more easily and securely move applications to the cloud. We also know enterprise AI inside and out, and we can help you deploy infrastructure solutions to support your AI and big data workloads now and in the future.

The following are just a few of our recent engagements:

◉ A Lab Services team helped Tunku Abdul Rahman University College in Malaysia build a Big Data Analytics Lab with IBM Power Systems to help students develop in-demand AI and analytics skills for the workforce.

◉ Lab Services implemented and configured new IBM Power servers for SAP S/4HANA for petrochemical company India Glycols, helping speed up financial reports and give the company more integrated, automated, efficient business management.

◉ When the Brazilian bank Banrisul needed to improve storage performance, Lab Services supported a comprehensive storage overhaul with all-flash IBM storage and IBM Spectrum software. The result was a 60 percent reduction in batch processing times in Banrisul’s IBM Z environment.

Source: ibm.com

Thursday, 23 April 2020

Are we running to the cloud hills too quickly?


From time to time, we invite industry thought leaders to share their opinions and insights on current technology trends to the IBM Systems IT Infrastructure blog. The opinions in these posts are their own, and do not necessarily reflect the views of IBM.

With billions gained by cloud technology providers hailing multicloud and hybrid public cloud services as well as native cloud initiatives, are we overlooking some crucial points when discussing technology migrations? Mid to large-cap enterprises have substantial technology that exists either at their on-premises tech estate or housed within any given data center. Infrastructure-wise, there’s a lot to change when talking cloud specifics for said infrastructure, let alone a cloud-first program!

I’m a big advocate of reusing and repurposing data from the outset, as this reduces the multitude of layered services you will be told to purchase to get the most out of your data. But do we need to layer these services around our estate? The answer that will probably be on the tip of your tongue is “It depends.”

A recent Forrester study, in conjunction with IBM, highlights what is on the minds of CTOs and CIOs when making these cloud decisions. Enterprise technology environments have undergone many refreshes over the last few decades, bursting out to hybrid cloud services for specific line-of-business applications and services. Our on-premises technology is seen as a management burden, a cost-of-ownership problem and, more often, as old technology! Although some of these statements can be viewed as accurate (depending on your standpoint), I would argue the following points:

Digitalization doesn’t necessarily need an external cloud approach; it requires planning and a focus on providing the right solution. If this approach is taken, I bet some of you will agree with me when I say “on-premises technology is your first option” before outsourcing some of your critical IT infrastructure. Most heterogeneous, on-premises technology has evolved to meet the new digital demands with burst-out to hybrid services. Static site-based technology went through the pain of multiple API services and controls years ago, and we have gotten over the cloud buzz.

The Forrester report findings also sum up current cloud buying trends, highlighting that enterprise IT estates are not a “one-size-fits-all solution” and that hybrid cloud adoption is on the tip of everyone’s tongue, but not yet being executed.

Platforms such as IBM Z have had more R&D dollars pumped into them than most tech platforms (billions), and the latest z15 generation reflects those advancements, which is just phenomenal!

The IBM Z series, LinuxONE, or even the now-overused term for this technology, mainframe, are all fantastic pieces of tech and are pretty much a cloud in a box (if you wish to use them that way). I have followed this technology for many years and can say, unprompted, that if my choice were multiple vendors delivering services (hybrid cloud models), together with having to wade through endless contracts to ensure all liabilities are covered, I wouldn’t take it! (Who would?) If I can have enterprise technology on premises with some of the best security, speed and resiliency built in, through one provider, I would!

However, it is often the case today that we demand limitless APIs for anything we may need to utilize in the future, and so we opt for the public hybrid cloud highway. But there are so many points to weigh up when considering this option: security, management, agility and scale, speed, access, the list goes on. Why not opt for localized technology that suits your needs, with overspill to hybrid cloud vendors?

The IBM Systems platforms not only provide an absolute powerhouse of technology but also have encryption everywhere, with speeds that can’t be matched by most conventional mainstream tech. They have built-in hybrid cloud connectors, they encrypt at the hardware and software levels and in transit with the new Data Privacy Passports option, and they are fast, very fast!

Supporting 2.4 million Docker containers on a single system and handling 19 billion encrypted web transactions per day without breaking a sweat are two examples that come to mind.

This technology may not apply to some of you, but my point is this: don’t automatically jump to grab off-premises services using cloud initiatives. Take a look around and take what works for you. Cloud technology will only become more evolved as we progress, with more interconnectivity for bespoke services, but it will also become more diverse and complicated to maintain and support.

I have always said that technical revolutions come around every ten years or so, which also holds true in the business world. We see an explosion of services and IT buzzwords that produces too many vendors for each new trend, and then we see a consolidation of them over 5-6 years. We are going through this right now, and if you don’t believe me, check out some of the new options the other prominent tech vendors are offering: on-premises replication from a hybrid cloud service!

Source: ibm.com

Wednesday, 22 April 2020

Infrastructure for AI: Why storage matters


Perhaps your organization has recently decided to purchase compute nodes and get started with artificial intelligence (AI). There are many aspects of your IT infrastructure and technology landscape to scrutinize as you prepare for AI workloads — including, and perhaps especially, your storage systems. AI is driven by data, and how your data is stored can significantly affect the outcome of your AI project. Not only that, but the four different stages of AI (ingest, preparation, training and inference) each have different storage needs and requirements.

Unfortunately, some organizations focus on the compute side of AI and overlook the storage side. This singular focus can, and sometimes does, lead to the disruption or complete failure of AI projects. Massive amounts of data are needed to facilitate the AI training stage. This data needs to be ingested, stored and prepared so it can be “fed” to the training stage. Without the ability to ingest, store and consume the necessary data for training, the project will be at risk of failure.

AI projects demand a storage infrastructure with excellent performance, scalability and flexibility. The good news is that today’s storage systems can be purpose-built to meet the needs of AI projects. Two great examples of this are some of the world’s most powerful supercomputers, Sierra and Summit.

Now, let’s look at some requirements.

Workload specifics and the movement of data


The requirements for each stage of the AI pipeline need to be reviewed for the expected workload of your AI application. Workloads vary, but some companies using large data sets can train for long periods of time. Once the training is done, that data is often moved off of critical storage platforms to prepare for a new workload. Managing data manually can be a challenge, so it’s wise to think ahead when considering how data is placed on storage and where it will go once training is done. Finding a platform that can move data automatically for you puts you one step closer to efficient and capable storage management for AI.
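The automated data movement described above can be sketched as a simple age-based tiering policy. This is a hypothetical illustration in Python, not an IBM storage API; the tier directories and the file-age heuristic are assumptions for the example (real systems would key off job metadata rather than file timestamps).

```python
import os
import shutil
import time

def tier_completed_datasets(fast_tier, capacity_tier, max_age_days=7):
    """Move datasets assumed to be done with training (approximated here
    by file age) from the fast flash tier to a cheaper capacity tier.
    Returns the names of the datasets that were moved."""
    cutoff = time.time() - max_age_days * 86400
    moved = []
    for name in os.listdir(fast_tier):
        src = os.path.join(fast_tier, name)
        if os.path.getmtime(src) < cutoff:
            # Older than the threshold: demote to the capacity tier.
            shutil.move(src, os.path.join(capacity_tier, name))
            moved.append(name)
    return moved
```

Running such a policy on a schedule keeps the fast tier free for the next training workload without anyone moving files by hand.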

After reviewing the implications of your own workload needs, you can decide on the storage technologies that work best for your AI compute infrastructure and project.

Storage needs for different AI stages


Data ingestion. The raw data for AI workloads can come from a variety of structured and unstructured data sources, and you need a very reliable place to store the data. The storage medium could be a high capacity data lake or a fast tier, like flash storage, especially for real-time analytics.

Data preparation. Once stored, the data must be prepared since it is in a “raw” format. The data needs to be processed and formatted for consumption by the remaining phases. File I/O performance is a very important consideration at this stage since you now have a mix of random reads and writes. Take the time to figure out what the performance needs are for your AI pipeline. Once the data is formatted, it will be fed into the neural networks for training.
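As a rough illustration of this preparation stage, the sketch below normalizes heterogeneous raw records into a uniform shape that a training loader could consume. The schema (a "features" list and a "label" field) and the normalization rules are assumptions for the example, not part of any particular pipeline.

```python
def prepare_records(raw_records):
    """Normalize heterogeneous raw records into a uniform training format.
    Hypothetical schema: every output row has 'features' (list of floats)
    and 'label' (int); rows missing a label are dropped."""
    prepared = []
    for rec in raw_records:
        if "label" not in rec:
            continue  # unlabeled rows are unusable for supervised training
        features = [float(v) for v in rec.get("features", [])]
        prepared.append({"features": features, "label": int(rec["label"])})
    return prepared
```

Even a toy step like this shows the I/O pattern the text describes: scattered reads of raw inputs followed by sequential writes of clean, uniformly shaped output.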


Illustration 1: Ingest, Data Preparation and Training

Training and inferencing. These stages are very compute intensive and generally require streaming data into the training models. Training is an iterative process, requiring setting and resetting, which is used to create the models. Inferencing can be thought of as the sum of the data and training: the trained model put to work on new data. The GPUs in the servers and your storage infrastructure become very important here because of the need for low latency, high throughput and quick response times. Your storage networks need to be designed to handle these requirements as well as the data ingestion and preparation. At scale, this stresses many storage systems, especially ones not prepared for AI workloads, so it’s important to specifically consider whether your storage platform can handle the workload needs in line with your business objectives.
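To make the streaming requirement concrete, here is a minimal prefetching iterator, a common general technique (not specific to any IBM product) for keeping accelerators fed: a background thread reads ahead while the current batch is being consumed, overlapping storage I/O with compute. The batch source and queue depth are assumptions for the sketch.

```python
import queue
import threading

def prefetch(batches, depth=2):
    """Yield items from `batches` while a background thread stays up to
    `depth` batches ahead, so storage reads overlap with compute."""
    q = queue.Queue(maxsize=depth)
    done = object()  # sentinel marking the end of the stream

    def producer():
        for b in batches:
            q.put(b)  # blocks when the queue is full, bounding memory use
        q.put(done)

    threading.Thread(target=producer, daemon=True).start()
    while True:
        item = q.get()
        if item is done:
            break
        yield item
```

If the consumer (the training step) is slower than the producer, the bounded queue throttles reads; if storage is the bottleneck, the consumer stalls, which is exactly the latency problem the text warns about.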

Don’t forget capacity and flexibility


Also consider: Does your storage infrastructure scale easily? Can you expand the storage system as your data needs grow? These are very important questions that have a direct effect on your AI infrastructure requirements.

Make sure you can scale your storage infrastructure up and out with minimal to no disruption to your production operations, keeping pace with data growth in your business. Be flexible enough to consider different storage configurations for the different needs of the AI infrastructure.

Turn to the experts for advice


Careful planning, matching your AI server and modeling requirements to the storage infrastructure, will help you get the most from your investments and lead to success in your AI projects.

These recommendations are just a starting point. Always keep in mind that if you don’t have the expertise in your organization to design and implement the correct AI storage infrastructure, you should work with your vendor to help prepare your storage systems for AI.

Source: ibm.com

Monday, 20 April 2020

Small business inventory management


The world has changed. People have changed. Demands are evolving. COVID-19 unexpectedly shook the global marketplace and as a result, created a domino effect of challenges for business leaders and consumers alike. Among them? Inventory management. For small and mid-size businesses in particular, the management of dispersed inventory has delivered major disruptions and unexpected experiences. Collectively, this has created frustration and confusion – and yet, in the midst of it all, the opportunity to help people get what they need in a timely manner.

Looking ahead into 2020 and beyond, what inventory challenges do you face? What opportunities do you have to help your clients get what they need through better utilization of inventory? And what can help you bring better clarity to both?

Introduce untraditional strategies into inventory management


Traditionally speaking, inventory should never experience a birthday. After all, once inventory is delivered to a store, the goal is to sell it. For many businesses, seasonal inventory in particular may never have the chance to connect with consumers due to unexpected challenges posed by COVID-19. Keeping this in mind, consider how some unanticipated inventory management strategies can help you navigate these challenges:

Re-allocate inventory slotted for Spring 2020 into Spring 2021. By repositioning your investment in seasonal inventory, and securing a future time for it to be merchandised, marketed and ultimately sold, you can reduce the possibility of having to mark it down over the course of the next weeks, months and even the year ahead. To do this, however, you must be financially positioned to continue introducing inventory into the summer, fall and winter seasons despite not selling this previously purchased inventory. The catch is to shift the dollars already spent into your 2021 budget, opening up your 2020 with lighter financial expectations.

Introduce new channels into your existing fulfillment strategy. Many organizations depend on one or two selling channels, with physical stores and e-commerce being the most common places to process orders. There are a variety of other ways to connect with customers, however – including selling directly from social media, as well as incorporating delivery and curb-side pick-up options into your fulfillment strategy. Even offering a phone number attached to product images shared on social media can help convert browsers into buyers, if they’re forced to shop from home versus going into your store or ever landing on your website. The main takeaway? Don’t limit your selling avenues to just one or two. The more you can offer, the better.

Offer gift-card sales to be used at a future date. Managing inventory ultimately comes down to managing your collective business expenses. With fewer sales than expected likely to come into most small, mid-size and even large businesses during this time, capturing any revenue you can is critical. Encouraging customers to purchase gift cards that they can enjoy at a later date is a win-win scenario: it brings you revenue and brings consumers satisfaction. This revenue will then allow you to more comfortably manage existing on-hand inventory and not be forced to mark it down for a quicker sell-through, or worse, a sell-through with little profit gained.

Additionally, one thing business leaders should remember is that while consumers are also experiencing unknown and unexpected challenges, they still appreciate personalized attention. Now is the time for businesses to show a little loyalty to their customer base, new and existing, in order to build long-lasting relationships. To help achieve this, cognitive technology can review the multiple applications used by a business to better understand customer behavior. As a result, consumers benefit from personalized customer care no matter where they decide to shop with your company: social media, your website or even through a search engine. This data, collected through cognitive intelligence, allows retailers to better support consumers in their future purchase decisions and alerts businesses to what inventory they should more proactively sell or shift in their management strategies.

Inventory management software can help ease these challenges as well and shed light on opportunities during unexpected times. When investing in technology to support your unique business, aim to prioritize systems that work together cohesively so that you can gain deeper insights into your business, stronger management across your inventory network and, of course, additional opportunities to help at a time of need.

As many decision-makers know, efficiently managing overhead – dollars and inventory alike – can be essential. Leverage technology to help you achieve this and as a result, you can enjoy a more proactive approach to the unknown and unexpected challenges inventory management may deliver in 2020 and beyond.

Saturday, 18 April 2020

Ways IBM Cognos Controller Developer (C2020-605) Certification Can Make You More Successful

The IBM Cognos Controller Developer (C2020-605) Certification Program provides an industry-standard benchmark and validation of technical expertise for professionals working with IBM Cognos technology. The program gives professionals a way to demonstrate their expertise in an active marketplace, allowing businesses to leverage their IBM Cognos business intelligence and performance management solutions with confidence.

IBM Cognos Certification provides a reliable standard for benchmarking the knowledge and skills of professionals working with IBM Cognos technology, assessing their readiness to implement, operate, and maintain IBM Cognos solutions successfully.

IBM Cognos Certification is the only authorized accreditation in the industry for benchmarking and certifying Cognos expertise.

Benefits of IBM Cognos Controller Developer (C2020-605) Certification

1. Getting Hired

Having an IBM Cognos Controller Developer Certification will undoubtedly give you an edge when hiring managers look at your resume. Competition for IT jobs can be stiff, and having a certification is a significant benefit compared to those who do not have one. Certification can even be a qualifier for a position.
Keep in mind that when two otherwise similar candidates are competing for the same job, the one who has a certification will have the advantage over the candidate who does not. A certification may do nothing more than get your resume a second look, which may be more than your competition gets.

2. Job Retention

In a volatile economic environment, businesses are continually looking for ways to cut costs. That may mean jobs are on the line. This is when having an IBM Cognos Controller Developer Certification can mean the difference between keeping your job and having to hunt for a new one. Earning a C2020-605 certification demonstrates that you are determined to enhance your skill set and knowledge, which benefits both you and your employer. The bottom line is that you must invest in yourself.

3. Promotions

If you want to move up the corporate ladder or into a better, higher-paying job in your company, then you will want to learn new technologies or enhance the skills you currently have. There is no better way to do this than to earn a new certification or step further up the Cognos Controller Developer certification chain in your current area of expertise.

4. Networking Opportunities

Once you earn an IBM Cognos Controller Developer (C2020-605) certification with a specific vendor or manufacturer, you join a unique group of IBM certified and skilled professionals. This can prove to be an invaluable peer resource group when seeking answers to problems or sharing solutions to challenging scenarios. This peer group of IBM certified professionals can also pass along guidance on how to further enhance your career or where to seek specialized technical knowledge.

5. Professional Credibility with IBM Cognos Controller Developer (C2020-605) Certification

Earning an IBM Cognos Controller Developer (C2020-605) certification, especially a series of certifications from the same vendor, provides immediate professional credibility. Having earned one or more IBM certifications shows your dedication and motivation toward professional development. Many companies actively encourage their employees to pursue IBM certification, which may even lead to promotions and raises.

6. Partner Programs


In some cases, companies may be required to have a certain number of IBM certified individuals on staff to keep their current partner level, and more if they want to attain a higher partner level. Most major manufacturers and many other vendors have this condition. Earning a vendor's certification therefore benefits both you and your organization, since it helps the company meet the required number of certified associates on staff.

7. IBM Cognos Controller Developer (C2020-605) Certification Retake Policy

The IBM C2020-605 certification test may be taken only twice within any 30-day period. If the exam is not completed successfully on the first attempt, no waiting period is required before taking it a second time. However, candidates may not take the same test more than twice within any 30 days, and retakes are not permitted after an exam has been passed.

8. New and Current Technologies

Passing an IBM C2020-605 exam for a newly released certification can be challenging due to the lack of study material and help available from other certified individuals. However, earning an IBM Cognos Controller Developer Certification on a new product can make you the subject-matter expert in your organization, putting you in a positive light for your early efforts.

9. Personal Goal

You may have set a goal for yourself to earn a new certification, whether for professional recognition or personal achievement. Certifications earned this way may be the most satisfying, as you are rewarding yourself for your efforts. If it happens to lead to a raise, promotion or recognition, so much the better. In many instances, these are also the most difficult to earn because of the self-motivation and discipline required when there are no tangible rewards.

10. Professional or Corporate Requirement

One way to ensure that IT professionals have the necessary skills and experience with existing and new technologies is through certifications and training. IT professionals who have passed an IBM C2020-605 certification exam can be presumed to have the particular knowledge needed to be more productive members of the IT department and to respond better to incidents that fall outside the everyday environment.

Conclusion

While these benefits are all significant, they are merely foundational. They need to be elaborated upon, and the IBM Cognos Controller Developer (C2020-605) certification is an excellent way to do that. Speaking a common language and developing a structured set of methods adds real business value to these focus skills by creating an environment where innovation and collaboration can happen more efficiently and with far less risk.
The IBM Cognos Controller Developer credential is a huge value-add. This business reality trumps any arguments against requiring C2020-605 certification. Many major players in today's economy are asking for it from their vendors and business partners, so organizations and employees cannot afford to buck the trend if they wish to stay competitive.

Friday, 17 April 2020

Communication skills for business leadership and success


Communication skills are an important factor in business decisions and a driver of successful business outcomes. It’s not surprising, then, that good communication often tops the list of skills employers look for, no matter the job or industry. How well you communicate affects everything, from small interactions with coworkers to the closing of large deals. And it’s a critical skill area for leaders.

In the tech industry, when we think about skill development, better communication probably isn’t the first thing that comes to mind, but in fact it’s crucial to everything we do in IBM IT infrastructure to help clients conquer demanding workloads and achieve success in their domains. IBM is a company built around the core values of dedication to every client’s success, innovation that matters — for our company and for the world, and trust and personal responsibility in all relationships. Good communication is at the heart of those values, and it’s needed at every level of our business.

Leadership is about human interaction, and that’s why communication is key. We work with human beings and look to mobilize a group of people together to collectively accomplish a mission or task. In everyday professional interactions — from meetings to sales opportunities to client technical engagements — stakeholders come together with different priorities, needs and interests. Good communication can help us take all those varied interests into account and make decisions that best serve our collective goals. We succeed when we listen well, offer valuable insights and make space for all voices to be heard.

Top communication skills for leaders


Be authentic. When a leader communicates with a team, it’s crucial to build conditions of trust. Building trust can bring an organization together and help teams move forward together. The conditions of trust are grounded in communication — be respectful, selfless, empathetic and clear. Provide a sense of purpose, fight for your people and do all of this with high ethical standards.

Listen more than you talk. Listening is a valuable skill to practice with colleagues and business associates as well as in personal relationships. Reflective listening refers to the ability to listen to someone and then reflect back that person’s point of view. It helps confirm what has been said and correct any miscommunications. In any situation where you’re trying to achieve shared goals, this kind of purposeful listening can facilitate mutual understanding and better outcomes.

Speak with purpose and add value to conversations. When it’s your turn to speak, offer thoughtful, prepared, well-organized, concise verbal communications. We’ve all been in meetings where some individuals talked too much or spoke without really adding value. Everyone can contribute to more thoughtful exchanges by thinking ahead, coming to meetings prepared and speaking only when we have something of value to add.

Make room for diverse voices. It’s well established that diversity boosts innovation and financial results. Throughout its history, IBM has been committed to including diverse voices at the table and has been recognized for its leadership in accelerating progress for women in the workplace, for example. (The team I lead is likewise committed to advancing gender equality.) Effective communication in a business requires making space for all voices to be heard — not just those with the most power.

The business payoff


Better communication produces more thoughtful insights into our pressing challenges, which in turn leads to better product engineering, strong technical engagements and more satisfied clients. When we apply these communication skills in interactions with customers, we no longer have to rely on data alone and can instead enable shared value creation that benefits everyone.

IBM IT infrastructure provides the foundation for many of the world’s largest companies, from top global banks to supercomputing centers for research. IBM servers, storage and software are designed to help these organizations conquer challenging workloads and extend the value of their existing investments. We empower clients by understanding their needs and serving as trusted advisors — and that begins and ends with effective communication. This is how we lead.

Source: ibm.com

Thursday, 16 April 2020

IBM Z deepens data privacy capabilities with new air-cooled models and IBM Secure Execution for Linux


In September 2019, IBM announced IBM z15™, delivering industry-first data privacy capabilities with the ability to manage the privacy of customer data across hybrid multicloud environments and to scale from one to four frames. For our clients on their journeys to cloud, IBM z15™ and IBM LinuxONE III were a major step toward encrypting everywhere, cloud native development and IBM Z Instant Recovery, but we aren’t stopping there.

Announcing IBM z15 Model T02, IBM LinuxONE III Model LT2 and IBM Secure Execution for Linux


Every day, clients of all sizes are examining their hybrid IT environments, looking for flexibility, responsiveness and ways to cut costs to fuel their digital transformations. To help address these needs, today IBM is making two announcements. The first is two new single-frame, air-cooled platforms, IBM z15 Model T02 and IBM LinuxONE III Model LT2, designed to build on the capabilities of z15. The second is IBM Secure Execution for Linux, a new offering designed to help protect against internal and external threats across the hybrid cloud. The platforms and offering will become generally available on May 15, 2020.

Expanding privacy with IBM Secure Execution for Linux


According to the Ponemon Institute’s 2020 Cost of an Insider Breach Report[1], sponsored by IBM, insider threats are steadily increasing. From 2016 to 2019, the average number of incidents involving employee or contractor negligence increased from 10.5 to 14.5, and the average number of credential theft incidents per company has tripled over the past three years, from 1.0 to 3.2.[2] IBM Secure Execution for Linux helps to mitigate these concerns by enabling clients to isolate large numbers of workloads, with granularity and at scale, within a trusted execution environment available on all members of the z15 and LinuxONE III families.

For clients with highly sensitive workloads such as cryptocurrency and blockchain services, keeping data secure is even more critical. That’s why IBM Secure Execution for Linux works by establishing secured enclaves that can scale to host these sensitive workloads and provide both enterprise-grade confidentiality and protection for sensitive and regulated data. For our clients, this is the latest step toward delivering a highly secure platform for mission-critical workloads.

For years, Vicom has worked with LinuxONE and Linux® on Z to solve clients’ business challenges as a reseller and integrator. On learning how IBM Secure Execution for Linux can help clients, Tom Amodio, President, Vicom Infinity said, “IBM’s Secure Execution, and the evolution of confidential computing on LinuxONE, give our clients the confidence they need to build and deploy secure hybrid clouds at scale.”

Simplifying your regulatory requirements for highly sensitive workloads


In addition to the growing risk of insider threats, our clients are also facing complexity around new compliance regulations such as GDPR and the California Consumer Privacy Act. Workload isolation and separation of control are becoming even more important for companies of all sizes seeking to ensure the integrity of each application and its data across platforms. IBM Secure Execution for Linux provides an alternative to the air-gapped or separate dedicated hardware typically required for sensitive workloads.

Delivering cyber resiliency and flexible compute


Building on recent announcements around encrypting everywhere, cloud-native and IBM Z Instant Recovery capabilities, as well as support for Red Hat OpenShift Container Platform and Red Hat Ansible Certified Content for IBM Z, these two new members of the IBM Z and LinuxONE families bring new cyber resiliency and flexible compute capabilities to clients including:

◉ Enterprise Key Management Foundation–Web Edition: provides centralized, secured management of encryption keys for IBM z/OS®.

◉ Flexible compute: Increased core and memory density with a two-drawer central processor complex design provides increased physical capacity and an enhanced high-availability option. Clients can configure up to 3 I/O drawers and up to 40 crypto processors.

◉ Red Hat OpenShift Container Platform 4.3: The latest release, planned for general availability this month on IBM Z and LinuxONE.

Complementary IBM Storage enhancements


In addition, IBM announced new updates to its IBM Storage offerings for IBM Z. The IBM DS8900F all-flash array and IBM TS7700 virtual tape library both now offer smaller-footprint options. This week, the TS7700 family gained flexible configurations for businesses of all sizes and needs, including options that can be mounted in an industry-standard 19-inch rack.

Source: ibm.com

Wednesday, 15 April 2020

Innovation and synergy with IBM Storage for IBM Z and LinuxONE

In September 2019, IBM announced the IBM z15™, designed to deliver strong enterprise capabilities with the ability to manage the privacy of customer data across hybrid multicloud environments and to scale from one to four frames. This week, IBM is once again pushing the envelope with the announcement of the new z15 Model T02 and LinuxONE III Model LT2.

And the innovation doesn’t stop there. IBM Storage prides itself on delivering value to our customers by creating strong synergies between IBM Z® and our number one storage systems for enterprise servers–IBM DS8900F all-flash arrays and the IBM TS7700 virtual tape solution.

These enterprise storage solutions offer a crucial advantage over other products in the marketplace–deep integration with IBM Z. The design teams within these market-leading IBM product lines share intellectual capital and synchronize new technology releases.

This advantage is certainly recognized by IBM Business Partners:

“As an IBM Business Partner with a strong commitment to IBM Storage, we’ve watched the recent surge in IBM Enterprise Server sales with great interest,” notes Russell Schneider, Storage Director of Jeskell Systems, a long-time IBM Storage Business Partner. “Our customers understand that in mission-critical environments, IBM Storage solutions such as DS8900F and TS7700 offer many advantages. The integration between IBM Z and IBM Storage is deeper. Installations are easier. Performance and functionality are simply better. Innovation in integrating IBM Z with IBM Storage in multicloud environments offers our large data center users the flexibility, cost and scalability they want.”

The deep integration between IBM Enterprise Servers and IBM Storage means that the recent focus areas for IBM Z–cloud native, encryption, cyber resilience, and flexible compute–have been major directions of innovation for IBM DS8900F all-flash arrays and TS7700 virtual tape solutions as well. New storage capabilities in these domains demonstrate that IBM continues to invest in advancing IBM DS8900F and TS7700.

IBM Storage supports business transformation into a multicloud ecosystem through a series of innovations centered around Transparent Cloud Tiering (TCT), Red Hat® OpenShift® Container Platform, IBM Cloud Paks™ and IBM Storage Suite for Cloud Paks, among others. IBM DS8900F and TS7700 TCT functionality enables hybrid multicloud as an extension of the primary storage for data archiving and disaster recovery. It also provides physical separation of systems to help customers prevent widespread corruption of data due to system failures, intentional or accidental human error, malware or ransomware attacks.

IBM Storage Suite for Cloud Paks is the software-defined storage infrastructure supporting IBM Cloud Paks. These solutions are designed to enable organizations to easily deploy modern enterprise software on premises, in the cloud, or with pre-integrated systems. They also bring workloads to production by seamlessly leveraging Kubernetes as the management framework, supporting production-level qualities of service and end-to-end lifecycle management. This is designed to give clients an open, faster, more secure way to move core business applications to any cloud.
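As a hedged illustration of the Kubernetes-based model described above, a containerized workload typically requests persistent storage through a PersistentVolumeClaim; the claim and storage class names below are hypothetical, and the real class name would come from the storage driver actually installed:

```yaml
# Hypothetical example: a workload requesting block storage via a
# PersistentVolumeClaim. The storageClassName is illustrative only;
# the real name depends on the installed storage suite and CSI driver.
apiVersion: v1
kind: PersistentVolumeClaim
metadata:
  name: app-data          # hypothetical claim name
spec:
  accessModes:
    - ReadWriteOnce       # single-node read/write access
  resources:
    requests:
      storage: 50Gi       # capacity requested from the storage class
  storageClassName: ibm-block-storage   # hypothetical class name
```

The workload’s pod then mounts this claim as a volume, leaving provisioning and lifecycle management to the cluster rather than to the application.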

To protect multicloud environments, IBM TS7700 virtual tape systems can now be used either as traditional backup and archive solutions or as object store repositories.

One of the most successful collaborations between IBM Storage and IBM Z has been the implementation of encrypting data everywhere. Both IBM DS8900F and TS7700 solutions offer security integration that extends IBM Z pervasive encryption to the storage, providing encryption of data at rest, in flight and in the cloud. Additionally, when integrated with tape systems such as IBM TS4500, TS7770 solutions offer a physical air gap between data and online cyber threats.

Cyber resilience goes hand in hand with data protection. The modern enterprise demands that digital systems and assets be available 24/7/365. Part of the reason IBM Z remains as popular as ever for business-critical workloads is its reputation for reliability. IBM DS8900F systems complement IBM Z resilience with multiple data replication, backup and protection capabilities, such as IBM HyperSwap and Safeguarded Copy technologies, that deliver 2-to-4-second recovery point objectives and better than seven-nines overall system availability, which amounts to about three seconds of downtime per year.
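The availability arithmetic here can be verified in a few lines (a sketch; the seven-nines figure comes from the paragraph above, the calculation is ours):

```python
# Relate an availability fraction to expected downtime per year,
# reproducing the "seven nines = roughly three seconds of downtime
# per year" figure cited above.
SECONDS_PER_YEAR = 365 * 24 * 60 * 60  # 31,536,000

def downtime_seconds_per_year(availability: float) -> float:
    """Expected seconds of downtime in one year at a given availability."""
    return SECONDS_PER_YEAR * (1.0 - availability)

seven_nines = 0.9999999  # 99.99999% availability
print(f"{downtime_seconds_per_year(seven_nines):.1f} s")  # ≈ 3.2 s per year
```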

“Because information is now our most valuable business resource, data protection and cyber resilience have become our most important business tasks,” states Andreas Kirrstetter, Platform Manager Tape Infrastructure of IBM client Fiducia & GAD IT AG. “Part of the reason we use IBM Z is for their security and resilience. But this means that our storage and archiving systems must offer just as much security. This is why we’ve turned to IBM solutions such as TS7700 virtual tape libraries. They help us lower costs while allowing us to implement ‘air gaps’ for the ultimate data security. We see IBM TS7700 as the future of data archiving and protection.”

In addition to multicloud capabilities, encryption, and extraordinary cyber resilience, the IBM Z focus on flexible compute solutions means that the power of IBM Z is available for medium to large organizations, thanks to innovations such as the newly announced entry-level IBM z15 Model T02.

IBM Storage has innovated in step with this move toward greater flexibility and accessibility. The IBM DS8900F and TS7700 families both now offer smaller-footprint, lower-cost entry options. The entry-level DS8910F model has been available since last autumn and features a smaller, rackless configuration that can be mounted in an industry-standard 19-inch rack. This week, TS7700 is also announcing flexible configurations for organizations with different business needs: the same virtual tape capabilities delivered as prepackaged racked solutions or as rack-mounted configurations deployed across client-supplied industry-standard 19-inch racks. Enterprises can start with a modest investment and still get the same deeply integrated, highly functional multicloud storage used by the largest organizations.

And innovation continues at a brisk pace; IBM TS7700 is working on a number of new features that will enhance cyber security, increase object store functionality and accelerate disaster recovery from the cloud.

“One of the significant advantages of IBM Z has always been its tight integration across hardware, operating systems/application software and storage,” states analyst Ray Lucchesi, president and founder of Silverton Consulting. “The new IBM z15 model T02 now brings those capabilities to medium-sized and entry-level businesses. So, these businesses can now enjoy the comprehensive data security, cyber resilience and multicloud capabilities, as well as the sophisticated functionality of DS8900F AFA and TS7700 virtual tape library storage, previously available only to high-end enterprise customers.”

IBM Storage works closely with the IBM Z and LinuxONE teams to synchronize our development and release of new technologies, giving organizations the greatest possible benefits and advantages. This is the definition of synergy: a team achieving more than its individual members could alone. And it’s exactly what IBM Storage gives our clients every day around the world.

Source: ibm.com