Wednesday 31 March 2021

The hidden danger of outdated infrastructure: security risk


With all the talk about cloud solution adoption, it’d be easy to assume that on-premises IT infrastructure is fading in popularity. However, the recent IBM and Forrester Consulting study, “The Key to Enterprise Hybrid Cloud Strategy,” found that on-premises infrastructure still has a strong presence in many enterprises. The study found that “firms are planning to increase investments toward on-premises infrastructure, and 85% of IT decision-makers (ITDMs) in our survey agree that on-premises infrastructure is critical to their hybrid cloud strategies.” In fact, 75% of IT decision-makers plan to increase their infrastructure investment in the next two years.

Unfortunately, plans aren’t always followed through. On-premises infrastructure updates are often among the first things to get pushed back due to budget needs, project priorities or unexpected disruptive events (such as COVID-19). The Forrester study found that 70% of responding organizations have delayed infrastructure refreshes a few times or more in the last five years (up from 61% in 2019).

When looking at IT projects and priorities, refreshing on-premises infrastructure is an easy candidate for delay. It’s not a flashy new project, and it may be difficult to justify the cost to the C-suite. When juggling multiple projects or the need to slash the budget, IT teams may weigh the risk/reward equation for not refreshing existing on-premises infrastructure and conclude that everything is working well enough for now. What is often not taken into account is the security risk associated with this gear. In fact, the Forrester study found that half of IT decision-makers encountered infrastructure-based security issues and vulnerabilities following a delayed refresh.

Changing nature of cyber risk

Security isn’t getting any easier. While the overall number of reported data breaches decreased in 2020, RiskBased Security’s 2020 Year End Report found that more than 37 billion records were breached last year, up 141% over 2019 and reportedly the highest number of breached records since RiskBased Security began its annual report.

While security risk is increasing, organizational commitment to updated hardware is diminishing. The Uptime Institute found that the average timeframe for a hardware refresh is now every five years (compared to an average of every three years in 2015). Think about how much has changed in the cyber security landscape over the past five years. In many cases, five-year-old infrastructure was never designed to handle the high-risk workloads and security challenges we now task it with.

With the increasing adoption of artificial intelligence (AI) and machine learning (ML) in business and technology applications, the need to support data-sensitive workloads is far greater than it was five years ago and will only increase. Forrester Consulting found that 84% of ITDMs anticipate greater data-sensitive workloads going forward. Couple all that with rigorous compliance standards that are closely tied to infrastructure security and it’s easy to see how not regularly refreshing infrastructure can create a dire security risk and impact an organization’s overall security posture.

Adopting a holistic security posture

Security isn’t a single-headed monster, and the enterprise approach to strong, holistic security needs to remain equally multi-faceted. That includes not forgetting or dismissing the importance of regularly refreshing on-premises infrastructure, even as enterprises build out increasingly complex hybrid cloud solutions.

Source: ibm.com

Tuesday 30 March 2021

IBM Places Power Systems at the heart of its hybrid cloud strategy


Customers know that IBM’s top strategies are hybrid cloud, AI and application modernization. These strategies support customers’ accelerated moves to hybrid cloud, and to more versatile, flexible business systems suited to “new normal” business conditions. The new normal world demands security, greater efficiency, and more consistency— with no room for errors, security flaws or slowdowns.

In the wake of the COVID-19 pandemic, customers’ top priorities are to meet new and changing business conditions by:

◉ Developing and testing modernized applications rapidly

◉ Migrating well-established business applications to the cloud

◉ Scaling workloads to meet new patterns of end-user access and demand

What customers may not know is that IBM is placing IBM Power Systems at the center of all three of those top strategies in the hybrid cloud. IBM’s February 2021 announcements show that Power Systems are important platforms for application modernization, deployment and management as customers update their critical applications to run in the hybrid cloud— spanning data centers, private clouds and public clouds.

Businesses are putting hybrid clouds to work in the real world

Modernization is a broad term: it means bringing forward important transactional applications and data, speeding up performance, and making them more efficient and flexible to match changing business conditions. It is a process that can be performed by in-house DevOps teams, by outside resources— or by both, working together.

When the cloud revolution began in 2008-2009, it was driven by rapidly changing economic conditions that favored rapidly deployed services over infrastructure build-outs. Business executives could start application development and testing projects by charging their corporate credit card.

Today, the COVID-19 pandemic, which began in 2020, is accelerating a new wave of cloud migrations, as many business applications are being reinvented (modernized) for use in the hybrid cloud. Subscription-based, pay-as-you-go cloud services are the modern equivalent of the credit card for cloud computing.

The modernization process now underway allows businesses to leverage cloud economics to link back-office and customer-facing systems. In many cases, it’s the first time these “front” and “back” IT systems are being seamlessly connected and managed – often by a single operations team.

Power Systems fit the needs of application modernization projects, with their support of Red Hat OpenShift containers, open-source Kubernetes for orchestration and Red Hat Ansible for application deployment. Power Systems are well-known as engines for AI/ML data analytics, working with on-premises and off-premises data resources to improve business outcomes by finding patterns in the data.
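
To make that stack a little more concrete, here is a minimal sketch (not taken from IBM’s announcements) that uses the standard Kubernetes Python client to create a Deployment pinned to POWER (ppc64le) worker nodes in an OpenShift or Kubernetes cluster. The image name, labels and namespace are placeholders.

```python
# Minimal sketch: schedule a containerized app onto POWER (ppc64le) nodes
# in an OpenShift/Kubernetes cluster. Assumes the `kubernetes` Python client
# is installed and a valid kubeconfig is present; names are placeholders.
from kubernetes import client, config

def make_power_deployment(name="demo-app", image="registry.example.com/demo-app:latest"):
    container = client.V1Container(
        name=name,
        image=image,
        ports=[client.V1ContainerPort(container_port=8080)],
    )
    pod_spec = client.V1PodSpec(
        containers=[container],
        # Pin pods to POWER worker nodes via the standard architecture label.
        node_selector={"kubernetes.io/arch": "ppc64le"},
    )
    template = client.V1PodTemplateSpec(
        metadata=client.V1ObjectMeta(labels={"app": name}),
        spec=pod_spec,
    )
    spec = client.V1DeploymentSpec(
        replicas=2,
        selector=client.V1LabelSelector(match_labels={"app": name}),
        template=template,
    )
    return client.V1Deployment(
        api_version="apps/v1",
        kind="Deployment",
        metadata=client.V1ObjectMeta(name=name),
        spec=spec,
    )

if __name__ == "__main__":
    config.load_kube_config()                 # reads ~/.kube/config
    apps = client.AppsV1Api()
    apps.create_namespaced_deployment(namespace="default",
                                      body=make_power_deployment())
```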

It’s worth noting that, for both scale-up and scale-out deployments, Power Systems host customers’ custom applications, SAP ERP, SAP HANA, Epic healthcare applications, and SAS analytics across Linux, AIX and IBM i environments. All of these environments run side-by-side on Power Systems— and all of them scale up by tapping more POWER processor cores as user demand for resources grows.

Three customer examples

Here are examples of global brand-name enterprises actively engaged in updating older systems, building new ones— and harmonizing all of them by managing them together in a unified, flexible framework:

◉ Coca-Cola European Partners is running SAP enterprise applications on HANA platforms in the hybrid cloud. This multi-national company, with offices across the European continent, is scaling up SAP processing with clusters of Power Systems, meeting growing demand for end-to-end services.

◉ Delta Airlines. Delta announced that it is working with IBM Cloud to accelerate cloud migration for its industrial-strength enterprise applications—as its business units develop new cloud-native apps for passengers to use. In the hybrid cloud, some of the airline’s key workloads will run on Power Systems across the world, and some will run on IBM Z mainframes in the IBM Cloud.

◉ Shree Cement Ltd., one of the largest cement providers in India, is scaling up its IT resources by running OpenShift containers on clusters of Power Systems. Their applications are running on Red Hat Enterprise Linux (RHEL) and IBM AIX Unix resources, including scalable databases running on IBM Power Systems.

Getting ready for business tasks ahead

Re-writing mission-critical applications can be just as tough as updating an airline schedule while the company’s fleet of planes is in-flight around the world. Time is of the essence for DevOps groups. So are reliability and flexibility for a worldwide customer-facing airline that needs its systems to be available 24 x 7.

These factors are the main drivers for change in the new normal environment around the world. Things cannot remain as they were, because there were too many “islands of automation” that could not work together easily. For many businesses, finding platform-specific skillsets to manage all of their systems was— and is— difficult. By pulling compute, storage and network resources together in a consistently managed hybrid cloud, customers can harness substantially more processing power more quickly than before.

By deploying in the hybrid cloud, modernized applications will run, literally, all over the world, across all major time zones. With accelerated migrations, companies will move more quickly to hybrid cloud— tapping both cloud-native applications and enterprise applications.

The common elements for these end-to-end modernized solutions, across platforms, are Red Hat OpenShift containers to deploy applications across multiple host systems (Power, x86 and z), and Red Hat Ansible management software. Power Systems, running Linux, AIX and IBM i, play important roles in modernization projects, supporting DevOps teams for app/dev, and hosting production applications and data as demands grow.

IBM’s recent Power announcements provide scalable resources and flexible pricing across the hybrid cloud. The Power Private Cloud with Dynamic Capacity offer allows customers to gain cloud-like consumption-based pricing as more POWER9 processor cores are added to support fast-growing workloads. This offer will be extended to POWER10 processors when they ship in Power Systems later this year.
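
As a rough illustration of how consumption-based pricing works in principle, the small calculation below charges only for core usage above a committed base. The rates, metering interval and usage figures are invented for the example and are not IBM’s actual pricing terms.

```python
# Illustrative only: metered billing when core usage bursts above a
# committed base. Rates, intervals and usage figures are invented.
def metered_charge(core_minutes_used, base_cores, minutes_in_period,
                   rate_per_core_minute):
    """Charge only for usage above the always-on base capacity."""
    base_core_minutes = base_cores * minutes_in_period
    overage = max(0, core_minutes_used - base_core_minutes)
    return overage * rate_per_core_minute

# Example: a 30-day month, 16 base cores, a burst workload that consumed
# 800,000 core-minutes in total, at a hypothetical $0.002 per core-minute.
minutes = 30 * 24 * 60
print(metered_charge(800_000, base_cores=16, minutes_in_period=minutes,
                     rate_per_core_minute=0.002))
```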

Summing up

Hybrid cloud and modernization allow enterprises to align their business objectives with their IT goals for high availability, resiliency and flexibility. These businesses have a consistent operating model that links their data centers, private clouds, and public clouds deployed across the company— and around the world. A hybrid cloud strategy— combining front-end systems for consumers and back-end systems for enterprise applications and financial transactions— directly addresses both of these important aspects of end-to-end enterprise computing.

Source: ibm.com

Saturday 27 March 2021

3 ways to avoid EDI pitfalls during peak events


Over the last year, consumers and business buyers have dramatically changed how they procure products and services, highlighting just how important digital transformation now is to building resilience. During expected or unexpected peak events, IT leaders and B2B managers need confidence that their B2B infrastructure operating behind the scenes — connecting retailers, distributors, manufacturers and suppliers throughout the lifecycle of a customer order — can keep pace as demand spikes.

As you face peak events and the certainty of change, here are three ways you can shore up your B2B systems and infrastructure to be ready for surges in EDI transaction volumes.

1. Keep orders flowing in the cloud

As orders for products surge and are fulfilled, you need to replenish inventory — fast. Orders must continue to flow across your supply chain to ensure distributors ship additional products to stores or warehouses, and manufacturers have the supplies they need to make more products and keep the pipeline full. But if your EDI system slows under mounting transaction volumes or worse, completely fails, you lose the ability to communicate with your trading partners and receive orders from your customers. Critical transactions, like orders and ship notices, are delayed.
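
For readers who have never looked inside one of these transactions, the short sketch below parses a simplified, X12-style 850 purchase order into line items with plain Python string handling. The sample document and separators are invented and heavily trimmed; real EDI processing runs through full translators rather than ad hoc code like this.

```python
# Toy example: pull line items out of a simplified X12-style 850 purchase
# order. Segments end with "~", elements are separated by "*". The sample
# data is invented and far simpler than real EDI interchanges.
SAMPLE_850 = (
    "ST*850*0001~"
    "BEG*00*SA*PO12345**20210327~"
    "PO1*1*48*EA*9.75**UP*012345678905~"
    "PO1*2*12*CA*21.50**UP*098765432109~"
    "CTT*2~"
    "SE*6*0001~"
)

def parse_line_items(edi_text):
    items = []
    for segment in edi_text.strip("~").split("~"):
        elements = segment.split("*")
        if elements[0] == "PO1":
            items.append({
                "line": elements[1],
                "quantity": float(elements[2]),
                "unit": elements[3],
                "unit_price": float(elements[4]),
                "item_id": elements[7],
            })
    return items

for item in parse_line_items(SAMPLE_850):
    print(item)
```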

Eric Doty of Greenworks Tools keeps orders flowing without hiring more staff by using a cloud-enabled multi-enterprise business network that enables reliable, secure and scalable B2B exchanges. As this power equipment manufacturing company expands its global presence, the network provides a more efficient and cost-effective way to track the increasing number of orders. The solution digitizes and automates transactions and uses AI technology to deliver deeper insights into B2B processes.

With visual reports and natural language queries, business users can quickly track the status of an order without help from IT to make faster and more-informed decisions and deliver better customer service. Greenworks Tools is keeping up with global B2B transaction growth and realizing a 40 percent IT cost savings by putting EDI insights into the hands of business users.

2. Get flexibility to auto-scale

2020 taught us that peak events can happen at any time for a variety of reasons. There are black swan events like a pandemic, but weather, seasonal, regional and industry-specific events are far more common. Sometimes you can anticipate disruptions, and other times you can’t. Either way, you can eliminate worry by being prepared. A business network that’s available as a cloud or hybrid solution makes it fast and easy to scale up or down to support growing or slowing transaction volumes and manage costs.

Cinram links some of Europe’s biggest media producers and retailers with consumers of music, TV shows and movies, helping to keep shelves stocked with popular titles. Volumes spike during peak periods, but also when highly-anticipated media releases become available. With a cloud-based business network, Cinram has maintained close to 100 percent uptime and can easily scale up the system when business volumes spike. They can add EDI connections rapidly without having to worry about provisioning new hardware to deliver the consistent, fast response and reliable service their clients have come to expect.

3. Invest in proven B2B infrastructure

To keep pace with demand peaks, like those some industries saw due to COVID-19, you need the capability to onboard new customers and suppliers quickly. Faster onboarding means eliminating slow and error-prone manual processes with more efficient digitized processes. IT leaders are doing this now with B2B infrastructure built for multi-enterprise connectivity, automation of manual processes, and transaction visibility to exchange necessary information without disruption.

Saint-Gobain, a leading global manufacturer of abrasives, is putting their cloud-based B2B infrastructure to work every day, all day, and has cut costs per line order by 92 percent. They are transacting securely with customers through EDI rather than manually and onboarding new customers faster. They have moved several vendors to EDI, all using one generic map. And Chase Shelby, eBusiness Manager, aims to bring as many customers onto EDI as possible to drive further efficiency and gain competitive advantage. With automation and visibility, they are receiving orders 24×7 and providing customers real-time updates on order and shipment status.

They’re also simplifying the inherent complexity in their environment, even when one PO may include made-to-order products and stock products, or a complementary product from one of Saint-Gobain’s partners that will be drop shipped. In one simple query, customer service reps can retrieve all documents related to the PO for streamlined tracking. Automation has freed-up customer service reps for other value-added tasks, and Shelby points to additional savings from moving some of the in-house server load to the cloud.

Business buyers’ procurement expectations have changed forever, and there’s no turning back to phone and paper-based transactions. So, B2B managers must increasingly look to digital channels to meet demand and build resilience. With a proven B2B infrastructure that is available as a cloud or hybrid cloud option, there’s no need to worry about keeping orders flowing with your customers and suppliers or scaling up or down easily. IT leaders and B2B managers can take on peak events and the certainty of change with confidence.

Source: ibm.com

Friday 26 March 2021

Climate change: IBM boosts materials discovery to improve carbon capture, separation and storage


Capturing carbon dioxide (CO2) at the point of origin is thought to be one of the most effective ways to limit its release into the environment. Once captured, the gas could then be sequestered and stored for centuries.

But capturing and separating CO2 from exhaust gases in energy production and transportation is tricky. Moving it to a storage site so that it doesn’t enter the atmosphere again is also far from trivial. Researchers have been trying to improve these techniques for decades.

Artificial intelligence (AI) could help.

Our IBM Research team has turned to AI to accelerate the design and discovery of better polymer membranes to efficiently separate carbon dioxide from flue gases — results that we will present at the upcoming 2021 Meeting of the American Physical Society.

Using molecular generative AI modeling, we have identified several hundred molecular structures that could enable more efficient and cheaper alternatives to existing separation membranes for capturing CO2 emitted in industrial processes. We are now evaluating these candidate molecules with the help of automated molecular dynamics simulation on high-performance computing (HPC) clusters.
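
A coarse screening step in that kind of workflow can be as simple as computing bulk properties for each candidate structure and keeping only those inside a target window. The sketch below does this with the open-source RDKit toolkit for a few made-up SMILES strings and arbitrary thresholds; it is a generic property filter, not the generative model or simulation pipeline described above.

```python
# Generic property screen over candidate molecules, illustrating the kind of
# coarse filter applied before expensive simulation. Requires RDKit; the
# SMILES strings and thresholds are arbitrary examples.
from rdkit import Chem
from rdkit.Chem import Descriptors

candidates = [
    "CCO",                      # ethanol
    "c1ccccc1O",                # phenol
    "O=C(O)c1ccccc1",           # benzoic acid
    "CC(C)(C)c1ccc(O)cc1",      # 4-tert-butylphenol
]

def passes_screen(smiles, max_mol_wt=250.0, min_tpsa=20.0):
    mol = Chem.MolFromSmiles(smiles)
    if mol is None:                       # unparsable structure
        return False
    return (Descriptors.MolWt(mol) <= max_mol_wt
            and Descriptors.TPSA(mol) >= min_tpsa)

shortlist = [s for s in candidates if passes_screen(s)]
print(shortlist)
```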

We will also present the initial results of two other essential material discovery projects – dealing with carbon sequestration and storage.

Simulating carbon separation and conversion

Safely and effectively storing CO2 after it’s been captured is still a challenge. One promising approach is injecting the gas into geological formations. Indeed, experts confirm that “…pore space in sedimentary rocks around the globe is more than enough to sequester all the CO2 that humanity could ever want to remove from the air”. But the physics and chemistry of the process at a reservoir rock’s pore scale is not well understood. And the efficiency of CO2 conversion and storage also depends on the type of rock and the reservoir conditions.

To tackle the issue, we have created a cloud-based tool that simulates fluid flow of carbon dioxide in specific types of rock, allowing scientists to evaluate CO2 trapping and, eventually, conversion scenarios at pore scale. Ultimately, the technology could enable researchers and engineers to perform rapid analysis and optimization of the rock-specific requirements for mineralizing and storing CO2 efficiently, safely and long-term.
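
As a very rough picture of what pore-scale modeling involves, the toy example below diffuses a concentration field through the open cells of a small, randomly generated rock mask using an explicit finite-difference update. It is deliberately simplistic and is not the cloud-based tool described here, which also accounts for flow, trapping and reactive chemistry.

```python
# Toy pore-scale diffusion: a concentration field spreads through the open
# (pore) cells of a small 2D rock mask. Purely illustrative; real reservoir
# simulation couples flow, capillary trapping and mineralization chemistry.
import numpy as np

n = 64
rng = np.random.default_rng(0)
pore = rng.random((n, n)) > 0.35           # True = open pore space
conc = np.zeros((n, n))
conc[:, 0] = 1.0                           # CO2-rich fluid injected at the left face

D, dt, dx = 1.0, 0.2, 1.0                  # D*dt/dx^2 = 0.2 keeps the scheme stable
for _ in range(500):
    lap = (np.roll(conc, 1, 0) + np.roll(conc, -1, 0) +
           np.roll(conc, 1, 1) + np.roll(conc, -1, 1) - 4.0 * conc)
    conc += D * dt / dx**2 * lap
    conc[~pore] = 0.0                      # solid grains carry no fluid
    conc[:, 0] = 1.0                       # hold the injection boundary fixed

print("mean concentration in pore space:", conc[pore].mean())
```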

Also, we’ll have to accelerate the discovery of CO2-absorbing materials. It can take years, even decades, to discover a new material, or to determine which existing material is best suited to a particular carbon capture application. With our changing climate, there is no time to lose.

In a bid to speed up the process, we have created a cloud-based screening platform to rapidly sift through millions of potential CO2 adsorbents at the nanoparticle level. The tool should enable materials engineers to select the best materials for enhancing the absorption of carbon dioxide in a particular application.

The platform allows fast searches through large quantities of known structures, enabling faster discovery. For example, it could be used by a chemist to identify the most promising nanomaterials for an industrial process. Once the most viable candidates are identified, the computational framework could then inform chemical synthesis and material optimization for accelerating the discovery in the lab.

In all of our projects, we have combined AI, HPC and cloud technologies to greatly accelerate the discovery of new materials. Our efforts stem from the recently launched Future of Climate global initiative at IBM Research, which pools materials discovery technology and scientific know-how across IBM’s worldwide network of research labs. The broader portfolio also includes the research and development of strategies to reduce the carbon footprint of cloud computing and within the supply chain, as well as techniques to model the impact of climate change.

Of course, climate change is a global challenge, requiring the collaboration of academia and industry — a joint effort of the global research community. This is why IBM has recently become an inaugural member of the MIT Climate and Sustainability Consortium, along with other enterprises including Apple, Boeing, Cargill, Dow, PepsiCo and Verizon.

Only together can we advance and adopt our research outcomes at global scale, use our new solutions to formulate a long-term, sustainable climate strategy — and limit climate change.


Source: ibm.com

Wednesday 24 March 2021

Speech-to-text AI could help doctors prescribe placebo to ease chronic pain


Placebos work, but we still don’t really know why. Knowing who would respond well to a placebo — typically just a sugar pill — could help doctors prescribe certain patients a placebo instead of a real drug to alleviate chronic pain, potentially helping them to save money and avoid side effects.

AI could help. Machine learning and natural language processing (NLP) analysis of patients’ verbal accounts of their experience are now starting to shed some light on the placebo response.

In a newly published paper “Quantitative language features identify placebo responders in chronic back pain” in the peer-reviewed journal PAIN, we report the first proof-of-concept that uses AI to analyze patients’ clinical trial experiences. The AI quantifies a placebo response in patients with chronic pain and distinguishes those who respond to placebo from those who do not.

The results show key differences in language use between patients who responded favorably to placebo — meaning their pain improved — versus those who did not, with patients’ language picking up on underlying personality traits and psychological factors. We hope that in the future, clinicians could use speech-to-text AI to transcribe their conversations with patients and assess how likely a patient would be to respond to a placebo instead of a drug. Knowing ahead of time who the good placebo responders are, doctors may not have to prescribe them treatments at all, and still help them get better.

The results could also help improve the design of future clinical trials by either removing potential placebo responders from active treatment groups or by helping better balance placebo responders across different treatment arms to make statistics more robust.

Language as a window into the mind in pain

Determining whether a patient’s placebo response is reliably predictable is tricky. It’s also difficult to pinpoint what features of a person’s pain experience, personality or cognitive processes factor most into the response. The placebo response has been observed across a variety of conditions and treatment types – including pills, patches, injections and surgeries — with some of the most significant placebo responses producing meaningful relief for those experiencing chronic pain.

Working with researchers from Northwestern University and McGill University in Montreal, we conducted a double-blinded study, meaning both the study participants and the facilitators were unaware of who had received a placebo pill during language collection and initial data analysis. We distinguished — with 79 percent accuracy using language features, alone — chronic back pain sufferers who were unknowingly helped by a placebo from those whose pain was not reduced.

To better understand why this response happens and how we could one day hopefully predict it, we used chronic pain patients’ speech collected via open-ended interviews to quantify hidden aspects of their thoughts and emotions. Based solely on what they said, we used AI to identify them as placebo responders (those who had a positive response to the placebo pill, in the form of pain reduction) or non-responders (those who showed either no response to the placebo pill or a negative response — “a nocebo response”).

This is not the first time researchers have looked into the relationship between pain and voice. But most previous studies have concentrated on acute instead of chronic pain or have focused on the acoustics and quality of a subject’s voice (how they speak) instead of their language content (what they say).

Identity, emotions, achievements, and perceived pain relief

Given that language is “a window into the mind,” we wanted to do it differently. We suspected that using a quantitative language methodology would provide a tool to directly tap into certain personality traits and psychological factors to help us objectively measure patients’ pain experience and response to treatment.

We conducted a monitored, registered clinical trial approved by an Institutional Review Board designated to review and monitor biomedical research under FDA regulations. We randomized patients with chronic low back pain into one of three groups: a placebo treatment group (sugar pills), an active treatment group (over-the-counter naproxen pain relief pills) or a no-treatment group (no pills).

We then followed these patients for eight weeks using a combination of clinical questionnaires, neuroimaging, and pain ratings collected twice a day on a smartphone. All participants signed an informed consent form, and data were de-identified prior to analysis.

At the end of the trial, we conducted an exit interview asking patients about various topics including their hobbies, medical history, and pain experiences. From this interview, we extracted more than 300 language features from what patients said. These features included how many words a person used in their interview, their use of different parts of speech (nouns, verbs, adjectives), how positive or negative their speech was, and how semantically similar their interviews were to different topics or concepts of interest (such as suffering, joy, pain, or disappointment). We entered the replies into a machine learning model to classify and identify if someone responded well to a placebo.
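
To give a sense of how features like these feed a classifier, here is a minimal scikit-learn sketch in which a feature matrix (one row per patient, one column per language feature) is cross-validated with a regularized logistic regression. The data are random placeholders and the model is a generic stand-in, not the exact pipeline reported in the paper.

```python
# Minimal stand-in for classifying placebo responders from language features.
# The feature matrix and labels are random placeholders; the study's real
# pipeline and feature set are described in the paper, not reproduced here.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
n_patients, n_features = 60, 300
X = rng.normal(size=(n_patients, n_features))   # e.g. word counts, sentiment,
y = rng.integers(0, 2, size=n_patients)         # semantic-similarity scores...

model = make_pipeline(StandardScaler(),
                      LogisticRegression(penalty="l2", C=0.1, max_iter=1000))
scores = cross_val_score(model, X, y, cv=5, scoring="accuracy")
print("cross-validated accuracy:", scores.mean())
```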

We were in for a surprise.

The patients who responded to the placebo talked more about their emotional experiences, themselves, and their personal relationships. And those who didn’t respond to the placebo talked more about taxing movements and doing physical activities in their interviews. Interestingly, how much their pain changed between treatment and either baseline or no-treatment periods was significantly linked to how much patients talked about their identity and their achievements. These language features ended up explaining 46 percent of the variance in pain relief between patient groups.

Using AI to better understand neurodegenerative diseases

The translational potential of our methodology cannot be overstated.

Clinicians regularly make use of conversations with patients to understand how much pain they are in, where the pain is located and how their quality of life is impacted. In the future, for example, a standardized list of pain-related questions given to patients during a visit could potentially be automatically transcribed with AI speech-to-text tools and analyzed in real time to provide a physician the likelihood of that patient’s response to a prescribed treatment.

Better understanding and quantifying pain might also have an impact on the ongoing opiate crisis, to help determine whether patients need to be prescribed strong pain medicine or if they might also respond to a placebo or a different drug. It may also help improve clinical trial efficacy and accuracy by providing tools that can identify placebo responders before randomization, allowing for more balanced study designs and treatment groups.

By mid-2021, we aim to submit a new paper detailing the second set of results from this study, in which AI identifies future placebo responders from non-responders before they take any pills. The second paper validates the findings from the first study and expands upon the utility of language by showing that it can be used not only to identify and quantify the placebo response, but also to predict it.

IBM Research’s use of AI to better understand pain management through speech is part of the company’s larger effort using AI and speech to analyze a host of neurodegenerative disorders, including Alzheimer’s, Parkinson’s and Huntington’s diseases, as well as psychiatric disorders such as schizophrenia and addiction. Chronic pain is likewise a condition of the nervous system and is often associated with mental health issues, including depression, anxiety, and substance abuse.

IBM’s larger mission is to build a digital health platform that can analyze a range of biomarkers, including sleep, movement and pain, and use those metrics to help physicians better understand and treat diseases, using IT and AI to complement clinical assessments and reach patients with minimal burden as they go about their daily lives.

Source: ibm.com

Tuesday 23 March 2021

AI and crowdsourcing to help physicians diagnose epilepsy faster


Epilepsy, a chronic neurological disorder, causes unprovoked, recurrent seizures — but the experience can be very different from person to person.

A highly individualized condition, epilepsy is extremely difficult to diagnose uniformly or at scale, which is further complicated by the fact that disease expressions change over time. One development in helping us better understand epilepsy is that researchers have been collecting electroencephalography (EEG) data about patients for quite some time.

In a new paper in The Lancet’s EBioMedicine journal, “Evaluation of Artificial Intelligence Systems for Assisting Neurologists with Fast and Accurate Annotations of Scalp Electroencephalography Data,” we describe the design and implementation of a new open hybrid cloud platform to manage and analyze secured epilepsy patient data. We also present the findings from a related crowdsourced AI challenge we launched to encourage IBMers across the globe to use Temple University epilepsy research data and our new platform to develop an automatic labelling system that could potentially help reduce the time a clinician would need to read EEG records to diagnose patients with epilepsy.

The results indicate deep learning can play an extremely important role in patient-specific seizure detection using EEG data, gathered using small metal discs—called electrodes—attached to a patient’s scalp to detect electrical activity of the brain. We found that deep learning, in combination with a human reviewer, could serve as the basis for an assistive data labelling system that combines the speed of automated data analysis with the accuracy of data annotation performed by human experts.
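
For readers curious what deep learning on EEG windows looks like mechanically, below is a small PyTorch sketch of a 1D convolutional classifier over fixed-length, multi-channel EEG segments. It runs on random tensors and is a generic baseline, not one of the challenge submissions and not trained on the TUH data.

```python
# Generic 1D-CNN baseline for seizure/non-seizure classification of EEG
# windows (channels x samples). Random tensors stand in for real data; this
# is not a challenge submission and not trained on the TUH corpus.
import torch
import torch.nn as nn

class EEGWindowClassifier(nn.Module):
    def __init__(self, n_channels=22, n_classes=2):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(n_channels, 32, kernel_size=7, padding=3),
            nn.BatchNorm1d(32), nn.ReLU(), nn.MaxPool1d(4),
            nn.Conv1d(32, 64, kernel_size=5, padding=2),
            nn.BatchNorm1d(64), nn.ReLU(), nn.MaxPool1d(4),
        )
        self.head = nn.Sequential(
            nn.AdaptiveAvgPool1d(1), nn.Flatten(), nn.Linear(64, n_classes)
        )

    def forward(self, x):                  # x: (batch, channels, samples)
        return self.head(self.features(x))

model = EEGWindowClassifier()
windows = torch.randn(8, 22, 2500)        # e.g. 10-second windows at 250 Hz
logits = model(windows)
print(logits.shape)                       # (8, 2) seizure vs. background scores
```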

The challenges within the Challenge

The IBM Deep Learning Epilepsy Challenge, as described in the EBioMedicine paper, asked participants to develop AI algorithms that could automatically detect epileptic seizure episodes in a large volume of EEG brain data collected by the Neural Engineering Data Consortium at Temple University Hospital (TUH). For this application, operating at high sensitivity (~75%) while maintaining a very low false alarm rate is crucial. IBM researchers who participated as competitors were provided with an ecosystem that allowed them to efficiently develop and validate detection models. Importantly, competitors did not have direct access to the dataset nor were they able to download the data.

The IBM-TUH challenge organizing team processed participant responses through objective and predetermined evaluation metrics. One of our goals was to lower the barrier of entry in using AI model development platforms. We turned to crowdsourcing because it let us draw from a larger pool of talent across the company, essentially turbocharging the discovery process.

As organizers of the challenge, our big test was finding a way to exploit the “wisdom of the crowd” while keeping highly sensitive medical data secured and private. IBM’s hybrid approach to cloud infrastructure played a pivotal role in meeting that challenge, enabling the broader research community to participate in crowdsourced model development, all while keeping patient data secured and preventing it from being downloaded or directly accessed by participants. Our challenge platform infrastructure hosted the data behind a secure firewall, allowing participants to test and submit models and then receive feedback about the performance of their algorithms.

Close to one hundred IBM researchers participated in the challenge. The criteria for evaluation of submitted models were fairly straightforward—detect a seizure when there is one, without producing a lot of false positives that would undermine confidence in the model being judged. The best performing model in the challenge would have, if used in the real world, decreased the amount of data a doctor would have had to manually review by a factor of 142. That means that instead of having to manually review 24 hours of raw EEG data, using the models developed in the challenge, a doctor would only have to review 10 minutes of data. The key wasn’t just speeding up analysis, but also accurately labeling and reducing the amount of data a doctor would need to review.


After close to two years building the platform, the challenge served as a showcase for its capabilities. The platform facilitated the use of Temple University’s data to develop an effective detection system that we hope can one day assist neurologists to improve the efficiency of EEG annotation. Ultimately, this lays the foundation for clinicians to develop more accurate, personalized and precise treatment plans for epilepsy patients.

We’ve continued to develop our deep learning platform and have already made an updated version available for a public crowdsourced challenge project with MIT. The platform will be eventually open sourced, a move we anticipate will open the door to even more exciting deep learning projects.

Helping physicians to improve patient care

Our work is part of IBM’s larger mission to build a digital health platform that can analyze a range of biomarkers—including sleep, movement and pain—and use those metrics to help physicians better understand, monitor and treat diseases. Deep learning—and the AI models the method creates—can potentially complement doctors’ clinical assessments to help them provide faster and more accurate diagnoses and treatments.

Additionally, IBM Research and Boston Children’s Hospital will soon publish joint work in which AI is used to study epilepsy. The paper showcases AI models that can detect the largest range of epileptic seizure types yet in pediatric patients—including seizure types that have never before been able to be detected automatically using technology. The AI algorithms use temperature, electrodermal activity and accelerometer data from commercially available wearable devices (such as smartwatches) to detect and identify epileptic seizures. This work was also showcased recently at the American Epilepsy Society Annual Meeting (AES) and through PAME (Partners Against Mortality in Epilepsy) Recognition in the Clinical Research Category.

This work is part of IBM Research’s use of AI to better understand a range of diseases and conditions through the analysis of natural, minimally invasive biomarkers such as speech, language, movement, sleep, pain, stress levels and mood. This includes work to use these data points to help better monitor, measure and predict events for conditions such as chronic pain, Alzheimer’s, Parkinson’s and Huntington’s diseases, as well as psychiatric disorders such as schizophrenia and addiction.

Source: ibm.com

Saturday 20 March 2021

Simplifying data: IBM’s AutoAI automates time series forecasting


Creating AI models is not a walk in the park. So why not get AI to… build AI?

Sounds simple, but with the ever-growing variety of models, data scientists first have to have the tools to better automate the model building process. In time series forecasting – models that predict future values of a time series, based on past data or features – the problem is even harder. There are just too many domains that generate time series data, with different and complex modeling approaches.


We think we can help.

Figure 1: AutoAI-TS overall architecture

On March 1, Watson Studio, IBM’s AI automation modeling system, rolled out AutoAI Time Series under closed beta. IBM Watson users can already tap into our research to automate time series forecasting. And now our paper “AutoAI-TS: AutoAI for Time Series Forecasting” is out, too – we’ll present it at the 2021 ACM SIGMOD/PODS Conference in China in June.

The paper details how the AutoAI Time Series for Watson Studio incorporates the best-performing models from all possible classes — because often, there is simply no single technique that performs best across all datasets.

Suppose a building management company is forecasting energy and resource consumption in a smart building. It can use AutoAI Time Series to quickly train models with less effort than conventional approaches. With just a few mouse clicks, AutoAI Time Series can quickly identify the best features, transforms and models, as well as train the models, tune the hyperparameters and rank the best-performing pipelines — specifically for the company’s resource consumption data.

Combing through the data

While some approaches perform automated tuning and selection of models, these approaches often focus on one class of models, such as autoregressive approaches, which learn from past values of a time series in order to predict future values.

Instead, AutoAI Time Series performs automation across several different model classes, incorporating a variety of models from each class.

Figure 2: SMAPE based rank comparison of AutoAI Time Series and SOTA toolkits for univariate data sets (lower rank is better)

Our AutoAI Time Series system is different. It achieves leading benchmark performance and accuracy across a variety of univariate datasets — be it social networks such as Twitter, phone call data logs, the weather, travel times and production volumes. It even works with multivariate datasets, such as exchange rates, household energy use, retail sales, and traffic data. Figures 2 and 3 show the performance results of AutoAI Time Series and several state of the art (SOTA) toolkits on more than 62 univariate and more than 9 multivariate data sets from a variety of application domains and ranging in size from dozens to more than 1,400,000 samples.

The system, which provides the necessary preprocessing components to orchestrate the building of pipelines for several different modeling approaches, relies on three primary components.

The first is lookback window generation. Many time series forecasting techniques are based on extracting a segment of the historical data – the so-called lookback window – and using it, or its derived features, as inputs to a model. Our lookback window generation approach uses signal processing techniques to estimate an appropriate lookback window. The conventional approach is to repeat the modeling exercise for a variety of lookback window sizes and identify what is best – but this can be very time consuming. Using our lookback window approach, we can dramatically reduce the number of models that need to be built, resulting in a faster modelling process.

Take the earlier example of forecasting energy consumption of a building. Such data would vary seasonally across multiple time periods — daily, weekly, and yearly. Applying the automated lookback window tool could help identify the best candidate windows.
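
One simple way to estimate such a window automatically is to look for the dominant lag in the series’ autocorrelation. The sketch below does that for a synthetic hourly series with a daily cycle; it illustrates the general idea rather than the specific signal-processing method used inside AutoAI Time Series.

```python
# Estimate a candidate lookback window as the strongest autocorrelation lag.
# Synthetic hourly data with a daily (24-step) cycle; illustrates the idea,
# not AutoAI Time Series' internal algorithm.
import numpy as np

rng = np.random.default_rng(0)
t = np.arange(24 * 60)                                   # 60 days of hourly data
series = 10 + 3 * np.sin(2 * np.pi * t / 24) + rng.normal(0, 0.5, t.size)

def dominant_lag(x, max_lag=200):
    x = x - x.mean()
    acf = np.correlate(x, x, mode="full")[x.size - 1:]   # lags 0..n-1
    acf /= acf[0]
    return int(np.argmax(acf[1:max_lag + 1])) + 1        # skip lag 0

print("suggested lookback window:", dominant_lag(series))  # ~24 for this series
```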

The next component is pipeline generation – where the system produces candidate pipelines to solve a modeling problem. A pipeline is a series of steps that define the process of transforming data and building an AI model. The AutoAI Time Series system has several pre-built candidate pipelines adapted to data characteristics that use a variety of models. For our users, the pipelines include transformers for unary transformations, flattening, and normalizing the data. Models include Holt-Winters Seasonal Additive and Multiplicative, ARIMA, BATS, Random Forest Regression, Support Vector Regression, Linear Regression, Trend to Residual Regressor, Ensemble Methods, among others.
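
Expressed with everyday tools, a small bank of candidate pipelines might look like the scikit-learn sketch below, where lagged-window features feed a few different regressors. The particular transformers and models are illustrative stand-ins, not the pipelines that ship in the product.

```python
# Stand-in candidate pipelines: lagged-window features feeding different
# regressors. The choices here are illustrative, not the product's pipelines.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.linear_model import LinearRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVR

def make_supervised(series, lookback):
    """Turn a 1D series into (window -> next value) training pairs."""
    X = np.array([series[i:i + lookback] for i in range(len(series) - lookback)])
    y = series[lookback:]
    return X, y

candidate_pipelines = {
    "linear": make_pipeline(StandardScaler(), LinearRegression()),
    "svr": make_pipeline(StandardScaler(), SVR(C=1.0)),
    "random_forest": make_pipeline(RandomForestRegressor(n_estimators=100)),
}

series = np.sin(np.arange(300) / 10.0)        # synthetic example series
X, y = make_supervised(series, lookback=24)
for name, pipe in candidate_pipelines.items():
    pipe.fit(X[:200], y[:200])
    print(name, round(pipe.score(X[200:], y[200:]), 3))
```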

Finally, there is pipeline selection — making use of a reverse progressive data allocation technique to efficiently train and choose only the most promising pipelines. This is an incremental technique in AutoAI Time Series that continually ranks pipelines based on their expected performance, minimizing overall training time. In our example, we have several years’ worth of data. Training all pipelines on all this data would be very time consuming – but our technique could help to identify the best pipelines faster.
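
A crude way to picture this selection step: score every candidate on a small slice of history, drop the weakest, then repeat with a larger slice, so only the most promising pipelines are ever trained on the full dataset. The miniature pruning loop below follows that pattern; the actual allocation and ranking logic in AutoAI Time Series is more sophisticated.

```python
# Miniature progressive-allocation selection: train candidates on growing
# slices of history and prune the weakest at each round. Illustrative only;
# not the actual AutoAI Time Series ranking algorithm.
import numpy as np
from sklearn.linear_model import LinearRegression, Ridge
from sklearn.neighbors import KNeighborsRegressor

rng = np.random.default_rng(0)
t = np.arange(800)
series = np.sin(2 * np.pi * t / 24) + 0.1 * rng.normal(size=t.size)

lookback = 24
X = np.array([series[i:i + lookback] for i in range(len(series) - lookback)])
y = series[lookback:]
X_train, y_train, X_val, y_val = X[:600], y[:600], X[600:], y[600:]

candidates = {
    "linear": LinearRegression(),
    "ridge": Ridge(alpha=1.0),
    "knn": KNeighborsRegressor(n_neighbors=5),
}

for fraction in (0.1, 0.3, 1.0):                 # progressively larger allocations
    n = int(len(X_train) * fraction)
    scores = {name: model.fit(X_train[:n], y_train[:n]).score(X_val, y_val)
              for name, model in candidates.items()}
    worst = min(scores, key=scores.get)
    if len(candidates) > 1:
        candidates.pop(worst)                     # prune the weakest candidate
    print(f"allocation {fraction:.0%}: kept {list(candidates)}")
```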

Figure 3: SMAPE based rank comparison of AutoAI Time Series and SOTA toolkits for multivariate data sets (lower rank is better)

After modeling is completed, the system can also generate back-test results for a selected model. Users can flexibly configure multiple back-test periods to provide insights into the temporal behavior of the model performance.

Automating AI Time Series

Our current research focuses on refining AutoAI Time Series to enhance its capabilities. In the future, we aim to incorporate what’s known as imputation to handle missing values, enhance pipelines and models to use exogenous features, and produce prediction intervals to provide a measure of uncertainty in the predictions.

To cope with increasing pipeline complexity and add support for larger datasets, we are also looking into strategies to better scale our system. These strategies include ways to sample data, develop scalable pipeline components to handle large data, and parallelization of the modeling workload.

Our AutoAI Time Series is just the first step towards a more efficient and advanced system that will enable data scientists to easily build AI models for broader use case scenarios.

Source: ibm.com

Friday 19 March 2021

Benefits of IBM SPSS Modeler Professional Certification And Career In IBM

IBM SPSS Modeler Professional Certification

The Professional version of IBM SPSS Modeler is the standard edition of the Modeler family. It gives an extended range of data preparation, graphical, and data import/export functionality. IBM SPSS Modeler Professional also covers the full range of Modeler's classification, association and segmentation algorithms, and several automated modeling systems.

IBM SPSS Modeler Professional is available in both desktop client and server editions. It reveals complicated and nuanced patterns in structured data. SPSS Modeler Professional provides excellent algorithms, data manipulation, and automated modeling and preparation techniques to create predictive models that can help you deliver better business outcomes more quickly.

SPSS Modeler Professional offers all of the power, ease of use, and flexibility of its unique visual programming interface and allows for more comprehensive data connectivity to enterprise databases. IBM SPSS Modeler Professional is also the foundation stone of full enterprise-strength predictive analytics, enabling the addition of Modeler Server and opening the way to SQL pushback and in-database mining capabilities.

What Does an IBM SPSS Modeler Professional Certification Mean?

Landing a job in the tech industry is not tough if you maintain an excellent skillset. Demonstrating that skillset, though, is near impossible on its own. You cannot just show up at the interview and say, "I'm good with computers."

Without an industry-approved certification from a vendor such as IBM, employers cannot trust that you know what you claim to know. Better yet, these certifications are not just a piece of paper. Becoming certified is hard work, but it automatically tells employers that you have a real-world skillset, forged and tested. These programs teach excellent skills that will make your job easier and more rewarding. Whether it is a data science certification like Dell EMC Data Science Associate or a Dell EMC Advanced Analytics Specialist certification, these complex certifications take time and effort and offer an immense reward!

Why IBM?

Certification takes time and money, so you need to pick a certificate-awarding institution with real-world value. IBM is recognized for offering quality exams that lead to a variety of excellent certifications. You will get world-class material by taking any certification path they provide and will surely learn some incredible new skills along the way! After all, IBM is industry renowned and produces professionals who work worldwide in different tech niches. Whether it is one exam or many, you will find everything you need on AnalyticsExam for IBM certification, so you can begin your journey to a rewarding career.

Significant Benefits of IBM SPSS Modeler Professional Certification

1. Knowledge

When you become IBM certified, you gain more knowledge in IBM Data and AI - Platform Analytics and learn the fundamental concepts. This helps create many job opportunities in analytics careers. A company is more likely to hire a fresh graduate with an IBM certificate than an average IT professional without a certification.

2. Formation of a Successful IBM SPSS Modeler Professional

IBM SPSS Modeler Professional certification builds a successful foundation, which is an essential advantage of holding the certificate. The IBM certification programs offer over two dozen opportunities for people to enhance their skills. The certification demonstrates a considerable amount of planning, implementation, and troubleshooting skill. IBM certification offers people a unique path to build up their knowledge at the foundational level.

3. Employer Benefits

Most employers like to employ certified professionals because they can work efficiently in the competitive information technology field. IBM certification is an authentic acknowledgment that you have the skills and knowledge required for Data and AI - Platform Analytics.

4. Career Growth

IBM Professional Certification helps increase the chances of promotion if you are an experienced candidate. The Data and AI industry offers vast opportunities to develop your career, whether you wish to change jobs or are just beginning. Since technology is advancing immensely day by day, governments and businesses alike seek qualified professionals to handle their operations.

5. Salary Increment with IBM SPSS Modeler Professional

Most companies hold IBM SPSS Modeler Professional certified professionals in high regard. The chances of a salary increase are high if you have the IBM SPSS Modeler Professional certification on your resume. Also, an employee who completes the IBM Professional course and earns the certificate can negotiate a salary hike with their higher officials. The chances of getting a bump in salary are higher.

Summary

Small or big, changes matter, and technology is all about change. IBM Professional certification is recognized worldwide and supports significant career moves. When employers screen candidates, they look for validation of skills, and this is only possible with a certification.

Thursday 18 March 2021

How data transformation technology is helping drive customer centricity


In today’s highly connected global economy, data and digital connections are critically important to achieving business objectives. As the sources of data continue to grow in volume, this data can be leveraged to accelerate decision making and drive customer centricity. So, how can a company reduce the complexity arising from data growth while growing its customer base? The answer lies in integrating data across data types and formats. This integration includes transforming data formats, validating data, and creating value from data, while complying with rapidly changing industry and regulatory standards.


Today, most IT departments solve this problem by building customized integration platforms for each new customer to help ensure frictionless data flow. Unfortunately, this process is not scalable when working with hundreds of customers. It’s also a huge drain on IT time, resources and budget. But the right technology can automate complex data transformation processes and validate data across a range of different formats and standards, scaling these processes across large volumes of enterprise data. Leaders like DHL are already using these solutions to drive value.

The DHL story 


DHL Supply Chain (DHL), a division of DHL International GmbH, offers warehousing and distribution services to more than 2,000 customers spanning industries, countries and sizes. Two-way communication between those customers and DHL is core to the company’s warehouse management business. Facilitating that data flow is not a one-size-fits-all task. Each customer’s data comes in different formats and follows different standards, depending upon such factors as a customer’s internal IT infrastructure, industry and location. 

For many years, DHL built customized integration platforms each time it onboarded a new customer. It was a time-consuming, laborious and expensive process. DHL needed a uniform way to integrate the ERP systems and data sets of its warehouse management customers with DHL’s internal systems.  

DHL deployed IBM Sterling Transformation Extender, which it uses to map data from customers’ platforms to a custom internal integration management platform, aligned with the customers’ industry regulations and standards. This has allowed DHL to successfully transform over 2.2 billion messages from more than 2,000 customers into its standard format every year.
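
In spirit, that mapping step normalizes each customer’s native order format into one canonical internal structure and validates required fields along the way. The short sketch below illustrates the pattern with invented field names and a single JSON-like source format; Sterling Transformation Extender itself expresses such mappings declaratively as reusable maps across EDI, XML and flat-file formats rather than as hand-written code.

```python
# Illustrative canonicalization of one customer's order format into an
# internal standard, with basic validation. Field names are invented;
# Sterling Transformation Extender expresses such mappings as reusable maps
# across EDI, XML and flat-file formats rather than per-customer code.
from datetime import datetime

def to_canonical(customer_order):
    required = ("orderRef", "shipTo", "lines")
    missing = [f for f in required if f not in customer_order]
    if missing:
        raise ValueError(f"order rejected, missing fields: {missing}")
    return {
        "order_id": customer_order["orderRef"],
        "received_at": datetime.utcnow().isoformat(),
        "destination": customer_order["shipTo"].strip().upper(),
        "lines": [
            {"sku": line["item"], "qty": int(line["quantity"])}
            for line in customer_order["lines"]
        ],
    }

sample = {"orderRef": "PO-778", "shipTo": " warehouse-berlin ",
          "lines": [{"item": "SKU-1", "quantity": "40"}]}
print(to_canonical(sample))
```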

Source: ibm.com

Tuesday 16 March 2021

5 tips for modern B2B order orchestration


Have you been tasked to transform your B2B order orchestration operations because of the enormous supply and demand disruptions experienced in 2020? You’re not alone. In a recent online interactive session, supply chain leaders at industrial companies and OEMs told us that their top priorities include flawless order orchestration, real-time inventory availability and accurate and on-time fulfillment.

Joe Cicman, Senior Analyst at Forrester, and I offered advice and answered questions about how to address gaps in technology and processes to tackle these priorities. Here are five tips we shared for B2B organizations to get started:

1. Start with your digital experience strategy

Any investments in digital experience technology should be driven by your digital experience strategy and your customer journey. Work closely with your customer experience team to:

◉ Define business and brand objectives

◉ Identify customers’ digital interactions and devices

◉ Prioritize and fund interactions that benefit customers and are valuable to you

Your digital experience strategy will also help you determine how to prioritize initiatives like developing direct-to-consumer models, expanding into marketplaces, and leveraging the power of cloud. Your priorities need to be informed by your customers, not outside forces and trends.

2. Layer in modern technology

Industrial businesses and manufacturers often grow through mergers and acquisitions and have multiple legacy technologies already in place, including siloed ERP systems for different brands and geographies. Fortunately, with today’s modern order orchestration technology, companies can transform their order orchestration capabilities without doing a complete technology overhaul. By providing a layer of process on top of systems of record, order orchestration insulates the business from the complexity of having to migrate or merge existing ERP systems. You can quickly bring to market new capabilities that drive value for your business and customers.

For supply chain leaders who tackled the challenge years ago with a homegrown order management system, it’s time to take a fresh look at what’s available today. Unless you have a team of software engineers skilled in the latest technology, there are probably a lot of capabilities you aren’t taking advantage of — and you’re potentially losing opportunities to drive revenue, increase efficiencies and reduce costs. You’ll also be pleasantly surprised by the flexibility of today’s solutions and how they can be tailored to meet your unique business needs.

3. Focus your digital experience technology strategy

Innovative firms tell Forrester they use the following principles to help focus their digital experience technology strategy:

◉ Define a digital experience roadmap – not a one-time web or mobile project – to guide your plan and ensure you gain strategic value.

◉ Use customer journey maps to bring relevancy to your investment by allowing you to identify points of friction and prioritize.

◉ Focus on a digital experience platform and a digital operations platform for the core of your technology investment.

◉ Shape your vendor choices by considering solutions that are cloud-hosted, mobile first, insights driven and loosely coupled.

These technology principles help lay a foundation for the future that enables business responsiveness and cost reduction.

4. Build momentum with quick wins

The sooner you can demonstrate you’re driving revenue and optimizing costs, the more support you’ll gain to expand your digital experience program. To do this, start by taking inventory of what you have and where you are in your journey. Understand what is currently hindering you from moving forward, so you can identify what you can do to help improve processes or remove any blockers. Find ways to minimize risk and achieve continuity in your business.

When you look at technology, like order management platforms, there is a natural place to start and progress. Inventory visibility is a clear first step; it’s quick and efficient to implement. With real-time, end-to-end inventory views across divisions, geographies, channels and partners, you can see where shortages exist and inventory resides. A central dashboard makes it easier to manage inventory across your fulfillment network with views into key business metrics and events. Inventory visibility also unlocks other use cases to mature your order orchestration operations.
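
At its simplest, that visibility layer is an aggregation problem: roll inventory up by SKU across every location and flag where available stock falls short of demand. The sketch below shows that roll-up with invented data; an order management system performs the same calculation continuously against live feeds.

```python
# Toy inventory roll-up across locations with shortage flags. Data and
# thresholds are invented; an OMS computes this continuously from live feeds.
from collections import defaultdict

inventory = [
    {"sku": "A100", "location": "store-12", "on_hand": 4},
    {"sku": "A100", "location": "dc-east", "on_hand": 120},
    {"sku": "B220", "location": "store-12", "on_hand": 0},
    {"sku": "B220", "location": "dc-west", "on_hand": 8},
]
open_demand = {"A100": 60, "B220": 25}

available = defaultdict(int)
for record in inventory:
    available[record["sku"]] += record["on_hand"]

for sku, demand in open_demand.items():
    status = "OK" if available[sku] >= demand else "SHORTAGE"
    print(f"{sku}: available={available[sku]} demand={demand} -> {status}")
```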

5. Address the culture shift

Implementing technology is one thing, but getting your team on board is another. Culture is the glue that binds people, processes and technology. Identifying the behaviors impeding innovation and transformation efforts is the key to understanding which cultural attributes you need to address. Risk aversion, product-driven decisions, siloed work patterns, and hierarchical accountability will all slow digital transformation and cause systemic problems that will make it difficult to extract value from transformation efforts. Companies that embrace the digital experience foster a culture Forrester describes as customer obsessed, empathetic, agile, collaborative, and experimental.


While it’s true that retailers pioneered omnichannel order management, now is an ideal time for B2B companies to take advantage of the comprehensive suite of order orchestration capabilities that are available in the market today. Many B2B companies who got on board early have reaped the benefits of increased revenue, streamlined processes and optimized supply chains. Embracing modern technology to empower your people and processes and to mitigate supply chain disruptions is essential to success, now more than ever.

Source: ibm.com

Monday 15 March 2021

5 retail supply chain insights from NRF 2021


Did you miss “Chapter One” of the National Retail Federation’s (NRF’s) Big Show event in January? Wondering what valuable supply chain insights you might’ve missed from IBM retail clients and experts? We’ve got you covered with the top five supply chain insights from this year’s virtual event.

1. Supply chain resilience starts with end-to-end visibility

Moving into 2021, supply chain resilience is a top priority for most retailers. To optimize operations in good times and maintain business continuity during disruption, supply chain leaders need real-time, actionable insights. But supplier complexity, siloed systems, batch ERP data, and error-prone, manual processes can make it difficult to even see current and complete data – let alone analyze it in time to make better supply chain decisions.

Golden State Foods (GSF) is tackling visibility and transparency challenges using a combination of AI, IoT and blockchain to preserve the quality of products the company delivers to quick-serve restaurants and stores worldwide. In our first Big Ideas speaking session, Bob Wolpert, Chief Innovation & Strategy Officer for GSF, shared how they’re gaining visibility into data from IoT sensors in refrigeration trucks and applying AI to reveal trends and patterns. This allows them to monitor compliance with temperature ranges, address points of weakness, and ensure higher product quality with less waste. Blockchain drives additional value by enabling GSF to track product movement in real-time with trust and transparency, so they can adjust replenishment cycles, shift goods to different locations, and alter production based on demand patterns. “If you have the data and you can see it to act quickly, that’s a game changer for your supply chain,” said Bob.
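
The monitoring pattern GSF describes, streaming sensor readings and checking them against an allowed range, can be sketched in a few lines. The reading format, truck IDs and temperature thresholds below are invented for illustration and are not GSF’s or IBM’s actual data model.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

# Illustrative compliance range for refrigerated goods (degrees Celsius).
TEMP_MIN, TEMP_MAX = 0.0, 4.0

@dataclass
class Reading:
    truck_id: str
    timestamp: datetime
    temp_c: float

def excursions(readings, min_c=TEMP_MIN, max_c=TEMP_MAX):
    """Yield readings that fall outside the allowed temperature range."""
    for r in readings:
        if not (min_c <= r.temp_c <= max_c):
            yield r

if __name__ == "__main__":
    start = datetime(2021, 1, 15, 8, 0)
    readings = [
        Reading("truck-17", start + timedelta(minutes=10 * i), temp)
        for i, temp in enumerate([2.1, 2.4, 3.9, 5.2, 4.8, 3.0])
    ]
    for bad in excursions(readings):
        print(f"ALERT {bad.truck_id} at {bad.timestamp:%H:%M}: {bad.temp_c} C out of range")
```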

2. Omnichannel fulfillment keeps customer needs front and center

Retailers never could have imagined the turbulence of last year when their doors were forced to close, at least temporarily. Flexible fulfillment options like curbside pickup were the key to meeting customer needs for essential retailers, like grocers, and keeping inventory moving for others. Even in the face of added complexity, some retailers were able to pivot more quickly and easily because they had the right order management system (OMS) in place.

For example, Sally Beauty serves both consumers and businesses and experienced different demand shifts depending on the customer base. Consumers were moving to DIY hair color and nail treatments, while salons were closing. In another Big Ideas session, Sally Beauty explained how they regrouped and reopened by implementing new omnichannel capabilities in three short weeks. With IBM’s SaaS-based OMS, they were able to scale quickly and manage peak demand, which was essential considering they re-emerged into an environment that was the equivalent of Black Friday times 10. They expanded ship-from-store and same-day delivery capabilities, and soon added buy online, pickup in store (BOPIS). “All of this is with the intent to remove customer friction and thinking customer first. If it doesn’t impact the customer, we’re not going to spend time on it,” said Joe Condomina, CTO, Sally Beauty.

3. When you empower the customer, you empower the industry

Consumers want to make more informed decisions about the food they buy and the brands they support. Research shows 65% of consumers want to know more about how food is produced. Providing this information builds trust and competitive advantage. But what does this mean for supply chain leaders?

The National Chamber of Aquaculture of Ecuador, through its Sustainable Shrimp Partnership (SSP), has realized the answer is education enabled through blockchain. Many consumers don’t know the origin of the shrimp they eat or the practices behind the product. The SSP leads the way in sustainable practices that have personal health and environmental benefits, but it needed to be able to capture and share that information to empower customers and the shrimp industry. In our final Big Ideas session, Jose Antonio Camposano, Executive President, National Chamber of Aquaculture of Ecuador, SSP, explained how shifting the industry from manual processes and Excel spreadsheets to a blockchain-based platform lets all parties track and trace shrimp from the hatchery to the farm to processing facilities and export operations. “For the first time, the consumer will be the one with the power to go back to the company and ask them directly for the information they are interested in,” said Jose. Consumers – chefs, nutritionists and home cooks – can scan a QR code to get information like antibiotic usage and farming impact on ecosystems to help make better decisions. And the industry gains long-term sustainability, differentiation and growth opportunities.

4. Building trust and transparency is a team sport

Supply chains are no longer linear, but more dynamic in many industries. They are networks of networks with parties sometimes changing roles to play different positions on the team. During the pandemic, some channel distributors holding a lot of inventory became suppliers, and retail stores forced to close their doors became distribution centers. With shifting relationships and rising customer expectations, trust has come to the forefront – especially with trust in vaccines top of mind right now – but trust in food and fashion has been a top concern for years.


In an interactive session co-led by Jason Kelley, General Manager, IBM Blockchain Services, and Karim Giscombe, president and founder, Plant AG, participants discussed how trust is established with data. In non-linear supply chains, that data must be shared in a distributed model. Adding blockchain technology to a multi-enterprise business network allows for this, providing permissioned access to trusted data across the supply chain. Karim explained how Plant AG is addressing the massive disconnect between food producers and consumers. Working with IBM, Plant AG is using blockchain to create data connectivity and a single source of truth along the food supply chain, from point of origin to point of consumption and all parties in between.

5. Weather insights help mitigate the impact of climate change on the supply chain

Finally, in an interactive session with John Bosse, Offering Manager for AI Applications, IBM, participants discussed how to identify and manage the effects of climate challenges within supply chains and retail operations. Today, climate change is disrupting the operating model of every business in myriad ways, and brands must understand how to turn negatives into positives. Consumers are willing to align their shopping habits with their values of sustainability and healthy lifestyles, making it more important than ever for retailers to consider their environmental impact. In addition to doing right by the planet, companies that plan with climate change in mind can secure an 18% higher return on investment than companies that do not. With weather insights, companies are increasing food production on existing farmland to avoid deforestation and responsibly feed a growing population, enabling sustainable oil palm production, and increasing crop yields and quality while reducing water use. Additionally, retailers are reducing greenhouse gas emissions across their supply chain logistics and better predicting the impact weather has on consumer demand.

Source: ibm.com

Saturday 13 March 2021

IBM AI finds new peptides – paving the way to better drug design


Antibiotic resistance is no joke. It’s a huge threat to human health — even more so during the raging pandemic. We need new antibiotics, and we need them fast.

In the US alone, nearly three million people get infected with antibiotic-resistant bacteria or fungi every year. But very few new antibiotics are being developed to replace those that no longer work. That’s because drug design is an extremely difficult and lengthy process — there are more possible chemical combinations of a new molecule than there are atoms in the Universe.

We want to help.

Paving the way to the era of Accelerated Discovery, our IBM Research team has developed an AI system that can help speed up the design of molecules for novel antibiotics. And it works — in “Accelerating Antimicrobial Discovery with Controllable Deep Generative Models and Molecular Dynamics,” published in Nature Biomedical Engineering, we outline how we used it to create two new non-toxic antimicrobial peptides (AMPs) with strong broad-spectrum potency. Peptides are small molecules — they are short strings of amino acids, the building blocks of proteins. Our approach outperforms other leading de novo AMP design methods by nearly 10 percent.

Beyond antibiotics, this generative AI system could potentially accelerate the design process of the best possible molecules for new drugs and materials — helping scientists to use AI to discover and design better candidates for more effective drugs and therapies for diseases, materials to absorb and capture carbon to help fight climate change, materials for more intelligent energy production and storage, and much more. To fight these challenges, we need to accelerate the rate of discovery of new and functional molecules — at scale.

Where is that molecule?

That’s far from easy. Zeroing in on the correct molecular configuration that would lead to a new material with desired properties among the astronomical number of possible molecules is like looking for a needle in a haystack. For peptides, one would typically have to experimentally screen more than a hundred molecules to find one with the right properties.

So we’ve turned to AI for help.

First, we used an AI generative model dubbed a deep generative autoencoder to learn about the vast space of known peptide molecules. The model captured meaningful information about diverse peptide sequences, such as molecular similarity and function, enabling us to explore beyond known antimicrobial templates.
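
The generative model in the paper is far more sophisticated than anything shown here, but a minimal PyTorch sketch can illustrate the basic idea: compress a peptide sequence into a latent vector and decode it back. The architecture, vocabulary, dimensions and example sequence below are simplified assumptions for illustration only, not the model described in the publication.

```python
import torch
import torch.nn as nn

AMINO_ACIDS = "ACDEFGHIKLMNPQRSTVWY"                       # 20 standard residues
VOCAB = {aa: i + 1 for i, aa in enumerate(AMINO_ACIDS)}    # index 0 reserved for padding

class PeptideAutoencoder(nn.Module):
    """Toy sequence autoencoder: peptide -> latent vector z -> reconstructed peptide."""
    def __init__(self, vocab_size=21, embed_dim=32, hidden_dim=64, latent_dim=16):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim, padding_idx=0)
        self.encoder = nn.GRU(embed_dim, hidden_dim, batch_first=True)
        self.to_latent = nn.Linear(hidden_dim, latent_dim)
        self.from_latent = nn.Linear(latent_dim, hidden_dim)
        self.decoder = nn.GRU(embed_dim, hidden_dim, batch_first=True)
        self.out = nn.Linear(hidden_dim, vocab_size)

    def encode(self, tokens):
        _, h = self.encoder(self.embed(tokens))     # h: (1, batch, hidden)
        return self.to_latent(h.squeeze(0))         # z: (batch, latent)

    def decode(self, z, tokens):
        h0 = self.from_latent(z).unsqueeze(0)       # initialise decoder state from z
        out, _ = self.decoder(self.embed(tokens), h0)
        return self.out(out)                        # logits over residues per position

def encode_sequence(seq, max_len=25):
    ids = [VOCAB[aa] for aa in seq][:max_len]
    return torch.tensor(ids + [0] * (max_len - len(ids))).unsqueeze(0)

model = PeptideAutoencoder()
tokens = encode_sequence("GIGKFLHSAKKFGKAFVGEIMNS")  # magainin-like example peptide
z = model.encode(tokens)                             # latent representation of the peptide
logits = model.decode(z, tokens)                     # teacher-forced reconstruction logits
```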

We then applied Controlled Latent attribute Space Sampling (CLaSS), a recently developed computational method for generating novel peptide molecules with custom properties. It works by sampling from the informative latent space of peptides, using a rejection sampling scheme guided by molecular-property classifiers trained on that latent representation. Since CLaSS performs attribute-conditioned sampling in the compressed latent space, it is a computationally efficient and scalable approach that can be easily repurposed.
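
To give a feel for attribute-conditioned rejection sampling, here is a minimal sketch: draw latent vectors, score each with a property classifier that operates on the latent space, and keep only the samples the classifier accepts. The classifier, threshold and latent prior below are illustrative stand-ins (reusing the latent dimension of the toy autoencoder above), not the published CLaSS implementation; the classifier here is untrained, so this only demonstrates the control flow.

```python
import torch
import torch.nn as nn

LATENT_DIM = 16

# Stand-in classifier that predicts an attribute (e.g. "antimicrobial") from a latent
# vector z. In practice it would be trained on latent representations of labelled peptides.
antimicrobial_clf = nn.Sequential(
    nn.Linear(LATENT_DIM, 32), nn.ReLU(), nn.Linear(32, 1), nn.Sigmoid()
)

def sample_with_rejection(n_wanted, threshold=0.5, max_tries=10_000):
    """Draw latent vectors and keep only those the attribute classifier accepts."""
    accepted = []
    for _ in range(max_tries):
        z = torch.randn(1, LATENT_DIM)          # propose a sample from the latent prior
        p = antimicrobial_clf(z).item()         # predicted probability of the attribute
        if p >= threshold:                      # rejection step: keep predicted positives
            accepted.append(z)
        if len(accepted) == n_wanted:
            break
    return torch.cat(accepted) if accepted else torch.empty(0, LATENT_DIM)

candidates = sample_with_rejection(5)
# Each accepted z would then be decoded back into a peptide sequence with the
# autoencoder's decoder and passed on to downstream screening.
```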

We then used deep learning classifiers to screen the AI-generated candidate antimicrobial molecules for additional key attributes, such as toxicity and broad-spectrum activity. We performed additional screening with the help of high-throughput, coarse-grained molecular dynamics simulations. These simulations look for the presence of novel physicochemical features indicative of stable peptide-membrane binding, such as low contact variance between the peptide and the membrane.
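
Conceptually, this downstream screening is a chain of filters over candidate sequences, each backed by its own predictive model. The candidate records, scores and thresholds below are hypothetical placeholders for the toxicity and broad-spectrum classifiers and the molecular dynamics features described above.

```python
# Hypothetical per-candidate scores; in the real pipeline these would come from
# trained deep learning classifiers and coarse-grained MD simulations.
candidates = [
    {"sequence": "GIGKFLHSAKKFGKAFVGEIMNS", "toxicity": 0.05, "broad_spectrum": 0.92, "contact_variance": 0.11},
    {"sequence": "KKLLKKLLKKLL",            "toxicity": 0.40, "broad_spectrum": 0.88, "contact_variance": 0.09},
    {"sequence": "AAAAGGGGAAAA",            "toxicity": 0.02, "broad_spectrum": 0.31, "contact_variance": 0.45},
]

def passes_screen(c, max_tox=0.1, min_activity=0.8, max_contact_var=0.2):
    """Keep candidates predicted to be non-toxic, broadly active and stably membrane-binding."""
    return (c["toxicity"] <= max_tox
            and c["broad_spectrum"] >= min_activity
            and c["contact_variance"] <= max_contact_var)

shortlist = [c["sequence"] for c in candidates if passes_screen(c)]
print(shortlist)  # only the first candidate survives all three filters
```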

Within 48 days, our AI-boosted molecular design approach to Accelerated Discovery enabled us to identify, synthesize, and experimentally test 20 AI-generated novel candidate antimicrobial peptides. Two of them turned out to be highly potent against diverse Gram-positive and Gram-negative pathogens (including multidrug-resistant K. pneumoniae) and very unlikely to trigger drug resistance in E. coli.

We also didn’t find any cross-resistance for either of the AMPs when tested against a polymyxin-resistant strain. Live-cell confocal imaging showed that the formation of membrane pores underlies the bactericidal mode of action of these peptides. Both antimicrobials have low toxicity: we tested them in vitro and in mice, providing important information about the safety, toxicity and efficacy of these antimicrobial candidates in a complex animal model.

Our proposed approach could potentially lead to faster and more efficient discovery of potent and selective broad-spectrum antimicrobials to keep antibiotic-resistant bacteria at bay — for good. And we hope that our AI could also be used to help address the world’s other most difficult discovery challenges, such as designing new therapeutics, environmentally friendly and sustainable photoresists, new catalysts for more efficient carbon capture, and so much more.

Source: ibm.com

Thursday 11 March 2021

Expanding our DevOps portfolio with GitLab Ultimate for IBM Cloud Paks


In the era of hybrid cloud, our clients are facing constant pressure to adapt to and adopt new models for application development and IT consumption in order to unlock the speed and agility of hybrid cloud. With even tighter budgets and more pressure to transform given the impact of our new normal, there is no better time to reimagine the way mission-critical workloads run on IBM Z.

Many clients, such as State Farm, RBC and BNP Paribas, have benefited from adopting DevOps on IBM Z, which unlocks new potential for increased speed and agility and directly influences their digital transformation initiatives. Clients can seamlessly leverage the strength of, and their investments in, existing IT infrastructure, clouds and applications, using the people, platforms and experience they already have on hand.

Over the past year, IBM Z has taken significant strides to bring new DevOps tools to the platform, with seamless hybrid application development including IBM Wazi Developer for Red Hat CodeReady Workspaces and Red Hat Ansible Certified Content for IBM Z, as well as a series of new hybrid cloud container offerings for Red Hat OpenShift on IBM Z and LinuxONE.

Today IBM announced GitLab Ultimate for IBM Cloud Paks, expanding our DevOps offerings across the business and giving clients a comprehensive solution with a DevOps toolchain for modernization. This is another significant milestone for IBM Z and IBM LinuxONE clients, bringing even more choice for cloud-native development on IBM Z.

This is a significant step forward for IBM Z clients and is designed to unlock their software innovations by reducing the cycle time between having an idea, seeing it in production and monitoring it to ensure optimal performance. It provides an innovative, open and hybrid solution for DevOps.

With GitLab Ultimate for IBM Cloud Paks, developers will be able to compose their DevOps solution using GitLab, write in any language they want and deploy it in any hybrid cloud environment they choose. Developers will also be able to take advantage of GitOps and GitLab’s orchestration automation technology, which can be used in conjunction with GitLab pipelines.

Source: ibm.com

Wednesday 10 March 2021

IBM’s innovation: Topping the US patent list for 28 years running


One patent stands out for me. Granted to my IBM colleagues and me in 2005, it was for a waterproof topcoat material for a photoresist, a light-sensitive substance used to make circuit patterns for semiconductor chips. It was a proud moment, especially as I knew that this patent contained novel capabilities critical for a brand-new technology called immersion lithography. This technology soon became the basis for how all advanced chips are manufactured, even to this day.

I also knew it had contributed to IBM’s patent leadership that year. As in the 13 years before and the 15 years since, IBM was granted more US patents than any other company.

For me, this patent leadership symbolizes much more than simply being at the top. A patent is evidence of an invention: it protects the invention through legal documentation and, importantly, is published for all to read. The number of patents we produce each year (more than 9,130 US patents in 2020) demonstrates our continuous commitment to research and innovation. We are actively planting the research seeds of tomorrow’s bleeding-edge technological world. Our most recent patents span artificial intelligence (AI), hybrid cloud, cyber-security and quantum computing. It doesn’t get more future-looking than this.

The US patent system goes back to the very dawn of our nation. It is detailed in the Constitution, enabling the Congress to grant inventors the exclusive right to their discoveries for a specific period of time. It is an assurance designed to motivate inventors to keep innovating.

One might argue against having patents that don’t get immediately turned into commercial products. But I disagree. Inventing something new is similar to putting forward a well thought out theory that may, one day, be verified experimentally. Perhaps not straight away, but it’s still vital to have theories to enhance our overall understanding of a field and to keep progress going. Having future-looking patents is just as important as those aimed at products of today, and a broad portfolio of scientific advances always ends up contributing to waves of innovation.

Patents drive innovation and a nation’s economic performance. Over the years, they have given us breakthrough technologies such as the laser, self-driving cars, graphene and solar panels. We at IBM have developed and patented such widely used products as the automated teller machine (ATM), speech recognition technology, B2B e-commerce software with consumer-like shopping features for processing business orders, the hard disk drive, DRAM (the ubiquitous memory that powers our phones and computers), and even the famous floppy disk that’s now history, to name just a few.

Tackling the world’s problems

A patent’s assurance of the protection of inventions is a key reason why companies invest billions of dollars in research and development. This results in scientists and engineers in different companies trying to find the best, original solutions to the world’s problems, paving the way for new and better products. And we haven’t run out of global problems to solve, far from it. Innovation is what helps us deal with pandemics, tackle global warming, address energy and food shortages, and much more.

Last year, just in the field of AI, our researchers received more than 2,300 patents. To take two examples among many: a novel way to search multilingual documents using natural language processing, and an ultra-efficient system for transferring image data taken by an on-vehicle camera. These both speak to the innovation and original thinking from our inventors in AI.

In cloud, we received about 3,000 patents, many focusing on data processing categorizations that can help bring services to the edge. In cyber-security, I’d like to single out patents in fully homomorphic encryption — an area of cryptography where computations are made on data that stays encrypted at all times. With so many data leaks jeopardizing the privacy of our medical, genomic, financial and other sensitive records, secure encryption is more important than ever.
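
To illustrate the core idea of computing on data that stays encrypted, here is a small sketch using the open-source python-paillier (phe) package. Paillier encryption is only additively homomorphic, a far simpler relative of fully homomorphic encryption, and this example is unrelated to IBM’s patented FHE schemes; it only shows what operating on ciphertexts means in practice.

```python
# pip install phe   (the open-source python-paillier library)
from phe import paillier

public_key, private_key = paillier.generate_paillier_keypair()

# A hospital could encrypt sensitive readings before handing them to a cloud service.
readings = [98.6, 101.2, 99.4]
encrypted = [public_key.encrypt(x) for x in readings]

# The service computes on ciphertexts only: it never sees the plaintext values.
encrypted_sum = encrypted[0] + encrypted[1] + encrypted[2]
encrypted_mean = encrypted_sum * (1 / len(readings))

# Only the data owner, holding the private key, can read the result.
print(private_key.decrypt(encrypted_mean))  # approximately 99.73
```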

Finally, there is quantum computing. This next-generation technology is getting ever better. I am convinced that in the near future, products relying on quantum computation will be an integral part of our daily lives. By inventing and patenting those products today, we are ensuring our quantum future.

One of our quantum computing patents deals with running molecular simulations on a quantum computer. Performing such simulations faster and across a much wider molecular space than a classical computer can ever do could help us design new molecules for novel drugs or catalysts. Another patent addresses the use of quantum computing in finance, to run risk analysis more precisely and efficiently than ever before.

That’s far from all. Quantum computers of today are ‘noisy’ — meaning that the quantum bits, or qubits, they rely on get easily affected by any external disturbances. Many of our patents detail ways to make qubits much more stable and even suggest approaches to correct the remaining errors in future stable qubits, offering a path to realize quantum error-correction and unleash the power of quantum computers to solve the currently unsolvable.

I’ll end with a reflection. A vibrant culture of innovation combines patenting, publishing, contributing to open-source, and active in-market experimentation and discovery. All are needed, fueled by the joy that innovators experience with the spark of novel ideas, and the desire to bring them to life.

Source: ibm.com