Thursday, 30 March 2023

Challenges of today’s monolithic data architecture

The real-world challenges organizations face with big data today are multi-faceted. Every IT organization knows there is more raw data than ever before; data sources sit in multiple locations, in varying forms and of questionable quality. To add complexity, both the business users of data and the use cases for it have become more varied. The same data used for decision support and business intelligence might also be used to develop machine learning models. In addition, semi-structured data, such as JSON and IoT log files, might need to be mixed with transactional data to get a complete picture of a customer buying experience, while emails and social media content might need to be interpreted to understand customer sentiment (i.e., emotion) to enrich operational decisions, machine learning (ML) models or decision-support applications. Choosing the right technology for your organization can help solve many of these challenges.

Addressing these data integration issues might appear relatively easy: land all enterprise data in a centralized data lake and process it from start to finish. But that is too simplistic, because real-time data must still be processed for decision support, curated data inputs often already reside in a data warehouse, and keeping data copies synchronized across the physical platforms supporting Hadoop-based data lakes, data warehouses and data marts can be challenging.

Warehouses are known for high-performance processing of terabytes of structured data for business intelligence but can quickly become expensive for new, evolving workloads. And when it comes to price-performance, the reality is that organizations are running data engineering pipelines and data science model-building workflows in data warehouses that are not optimized for scalability or for these demanding workloads, which hurts pipeline performance and drives up costs. It is this web of data dependencies, requiring continuous movement of interdependent data sets across platforms, that makes these challenges so hard to solve.

Rethinking data analytics architecture


Software architects at vendors understand these challenges, and several companies have tried to address them in their own way. New workload requirements led to new functionality in software platforms that were not specifically optimized for those workloads, reducing their efficiency and deepening data silos within many organizations. Additionally, each platform ends up holding overlapping copies of data, which creates data management issues (governance, privacy and security) and drives up storage costs.

For these reasons, the challenges of the traditional data warehouse and data lake architecture have led businesses to operate complex architectures, with data siloed and copied across data warehouses, data marts, data lakes and other relational databases throughout the organization. Given the prohibitive costs of high-performance on-premises and cloud data warehouses, and the performance challenges within legacy data lakes, neither of these repositories satisfies the need for analytical flexibility and price-performance.

Instead of having each new technology solve the same problem, what is needed is a fresh, new architectural style.

Fortunately, the IT landscape is changing thanks to a mix of cloud computing platforms, open source and traditional software vendors. Cloud vendors, leading with object storage, have helped drive down the cost of disk storage. But data stored in object storage cannot readily be updated, and object storage does not offer the query performance business users have come to expect. Open-source table formats such as Apache Iceberg, combined with open-source engines such as Presto and Apache Spark, provide the low cost of object storage along with better SQL performance and the ability to update large structured and semi-structured data sets in place. But there is still a gap to be filled that allows all these technologies to work together as a coordinated, integrated platform.
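
As a hedged illustration of this pattern, the PySpark sketch below creates an Apache Iceberg table over object storage and then updates rows in place with plain SQL. The catalog settings, bucket path, table name and columns are hypothetical, and the exact configuration depends on your catalog and object store.

```python
from pyspark.sql import SparkSession

# Minimal sketch: Spark + Apache Iceberg over S3-compatible object storage.
# The bucket, catalog name and table schema are illustrative only, and the
# Iceberg Spark runtime JAR must be on the classpath.
spark = (
    SparkSession.builder.appName("iceberg-sketch")
    .config("spark.sql.extensions",
            "org.apache.iceberg.spark.extensions.IcebergSparkSessionExtensions")
    .config("spark.sql.catalog.demo", "org.apache.iceberg.spark.SparkCatalog")
    .config("spark.sql.catalog.demo.type", "hadoop")
    .config("spark.sql.catalog.demo.warehouse", "s3a://example-bucket/warehouse")
    .getOrCreate()
)

# Create a namespace and an Iceberg table directly on object storage.
spark.sql("CREATE NAMESPACE IF NOT EXISTS demo.sales")
spark.sql("""
    CREATE TABLE IF NOT EXISTS demo.sales.orders (
        order_id BIGINT,
        customer_id BIGINT,
        status STRING,
        amount DOUBLE
    ) USING iceberg
""")

# Unlike raw files in object storage, Iceberg tables can be updated in place.
spark.sql("UPDATE demo.sales.orders SET status = 'shipped' WHERE order_id = 1001")

# Any Iceberg-aware engine (Spark, Presto/Trino and others) can query the same table.
spark.sql("SELECT status, COUNT(*) AS n FROM demo.sales.orders GROUP BY status").show()
```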

To truly solve these challenges, query and reporting engines such as Presto need to work alongside the Spark framework to support advanced analytics and complex data transformations. And both Presto and Spark need to work readily with existing and modern data warehouse infrastructures.

The industry is waiting for a breakthrough approach that allows organizations to optimize their analytics ecosystem by selecting the right engine for the right workload at the right cost — without having to copy data to multiple platforms and while taking advantage of integrated metadata. Whichever vendor gets there first will allow organizations to reduce cost and complexity and drive the greatest return on investment from their analytics workloads while also helping to deliver better governance and data security.

Source: ibm.com

Saturday, 25 March 2023

IBM and Cleveland Clinic unveil the first quantum computer dedicated to healthcare research

In late 2021, Cleveland Clinic and IBM entered into a landmark 10-year partnership to use emerging technologies to help crack some of the biggest challenges in healthcare and life sciences. Using a combination of state-of-the-art high-performance hybrid cloud computing, next-generation AI, and quantum computing, the goal was to create a collaborative environment for Cleveland Clinic researchers and partners to advance biomedical science and treatment, as well as foster the next generation technology workforce for healthcare.

That partnership hit a major milestone last night, when Cleveland Clinic and IBM unveiled the first quantum computer delivered to the private sector and fully dedicated to healthcare and life sciences. The IBM Quantum System One machine sits in the Lerner Research Institute on Cleveland Clinic's main campus, and will help supercharge how researchers devise techniques to overcome major health issues. "Quantum and other advanced computing technologies will help researchers tackle historic scientific bottlenecks and potentially find new treatments for patients with diseases like cancer, Alzheimer’s and diabetes," said Dr. Tom Mihaljevic, CEO and president of Cleveland Clinic.

Many people across Cleveland Clinic and IBM Research have come together to make this launch possible.

Dr. Serpil Erzurum, Cleveland Clinic’s chief research and academic officer, oversees enterprise-wide research programs that aim to deliver the most innovative care to patients. Her office doubles as an entryway into her lab, where her team studies mechanisms of airway inflammation and pulmonary vascular diseases.

Dr. Jae Jung, director of the Global Center for Pathogen & Human Health Research, works closely with his team of dynamic scientists, who focus on understanding viral pathogens and human immune responses so that we can better prepare for and protect against future public health threats.

John Smith, IBM Fellow and chief IBM Scientist for the Discovery Accelerator, works in partnership with his Cleveland Clinic counterpart, Dr. Ahmet Erdemir, to orchestrate the many targeted workstreams that aim to accelerate biomedical research efforts. The complexity of the biomedical and healthcare data ecosystems requires multidisciplinary investigation of disease trajectories, intervention possibilities and healthy homeostasis. Leveraging Cleveland Clinic’s biomedical research and clinical expertise and IBM’s global leadership in quantum computing and commitment to research at enterprise scale, the teams aim to advance the pace of discovery in healthcare and life sciences and explore what wasn’t previously possible.

Dr. Feixiong Cheng, of Cleveland Clinic’s Genomic Medicine Institute, is working with the IBM team to develop computer-based systems pharmacology and multi-modal analytics tools to optimize human genome sequencing and leverage large-scale drug-target databases more efficiently. Together with IBM researchers Yishai Shimoni and Michael Danziger, his team aims to develop effective ways to improve outcomes in long-term brain care and quality of life for people with Alzheimer’s disease and dementia.

Dr. Lara Jehi, chief research information officer at Cleveland Clinic, is the executive program lead for the Discovery Accelerator partnership. She also runs her own research, where her team, in partnership with IBM researcher Liran Szlak, applies AI to large-scale data sets to understand the effect of anti-inflammatory drugs on seizure recurrence in epilepsy patients who have undergone cranial surgery.

Dr. Shaun Stauffer, director of the Center for Therapeutics Discovery, is working with IBM researcher Wendy Cornell to see how high-performance computing (HPC) and molecular modeling tools can be harnessed to vastly improve the small-molecule discovery process, particularly for COVID-19 antiviral drug development.

At an event yesterday, leaders from Cleveland Clinic and IBM, along with local, state, and federal officials came together to formally unveil the IBM Quantum System One on main campus. Cleveland Clinic's CEO and President Dr. Tom Mihaljevic, along with IBM Vice Chairman Gary Cohn and Darío Gil, SVP and director of IBM Research, and Cleveland Mayor Justin Bibb, Ohio Lieutenant Governor Jon Husted, ARPA-H Deputy Director Dr. Susan Monarez and Congresswoman Shontel Brown, were in attendance for the ribbon-cutting.

Source: ibm.com

Thursday, 23 March 2023

Five industries benefiting from drone inspections

The use of commercial drones to conduct inspections can significantly improve business operations across industries. These inspections increase precision, provide safer options for the workforce and drive efficiency. According to Quadintel, the global drone inspection and monitoring market was valued at $7.47 billion in 2021 and is projected to reach $35.15 billion by 2030. This blog examines five industries that benefit from the fast-growing technology of commercial drones.

1. Infrastructure and construction


Infrastructure is critical for a society and an economy, but some of the world’s most industrialized countries face crumbling infrastructure, especially aging bridges. In the US, 1 in 3 bridges needs repair or replacement. In Japan, the number of aging bridges that restrict traffic tripled between 2008 and 2019. In the UK, over 3,200 bridges need repairs.

Traditional bridge inspections are tedious and require extensive human effort, but the use of drones piloted with artificial intelligence has the potential to save our crumbling infrastructure. Drones simplify the challenging task of inspecting and maintaining critical structures like bridges. In the construction industry, drones can validate that construction is in line with blueprints, and they can measure, transmit and store data for civil or structural surveys. Infrastructure managers can use drones to create a digital twin of the building or infrastructure to allow for faster decision-making and communication between various departments in large construction projects.

2. Search and rescue operations


Time is of the essence when searching for people who are injured or lost in the wilderness. Drones equipped with thermal imaging, zoom cameras, and lights can fly quickly over large distances to find lost hikers and guide them home. The Mountain Rescue Association of North America estimates that 80% of its members use drones as a critical tool in the search-and-rescue process.

3. Energy, utilities, and resources


Oil and gas enterprises and utility firms traditionally use helicopters to inspect refineries, offshore rigs, and power lines. Replacing helicopters with drones makes these tasks more cost-effective and sustainable. Many renewable energy companies use drones and zoom cameras to inspect solar panels and wind turbines. The mining industry now sends drones to inspect open stopes, which reduces the risk to human life. In these industries, drone usage is a part of business operations that increases efficiency and safety.

4. Insurance claims


When severe weather devastates people’s homes and properties, drones can help insurance companies manage the deluge of claims. For example, a drone can inspect a home before the insurance company sends an adjuster to assess a roof damage claim. This streamlined process saves time and money for insurance companies. In a connected use, individuals who provide roof repair services can use drone imagery to show the owner the condition of their roof and to discuss plans for restoration.

5. Agriculture and agribusiness


Before drones, farmers manually inspected their fields, which could take days. Hundreds of acres of crops were at risk of damage during this lengthy process. With drones, farmers can now get an instant view of crop problems and focus on the remedy. Similar to utility and energy firms, large agricultural businesses can monitor their assets from above to ensure that shutdowns for repairs can be reduced to a minimum.

As commercial drones are deployed across the globe for inspections, drone relevance grows across multiple industries. Drone inspections increase accuracy, drive efficiency and protect the workforce. We can expect more industries to adopt these tools to maximize their business operations as drone technology progresses in the coming years.

Source: ibm.com

Tuesday, 21 March 2023

Embeddable AI saves time building powerful AI applications

Just a few weeks ago, IBM announced an expansion to its embeddable AI software portfolio with the release of three containerized Watson libraries. This expansion allows our partners to embed popular IBM Watson capabilities, including natural language processing, speech-to-text and text-to-speech, into their applications and solutions. But what is embeddable AI, and what are its uses?

Embeddable AI is a first-of-its-kind suite of IBM core AI technologies that can be easily embedded within enterprise applications to serve a variety of use cases. Think of embeddable AI as an engine. Planes and cars both use engines to make them go, but each engine accomplishes its purpose in different ways.

The analogy doesn’t end there either, because just like with a car or plane engine, it’s much easier to use something pre-made than to construct one yourself. With embeddable AI, you get a set of flexible, fit-for-purpose AI models that developers can use to provide enhanced end-user experiences, like automatically transcribing voice messages and video conferences to text.
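
As a hedged illustration, the sketch below uses the ibm-watson Python SDK for the IBM Watson Speech to Text cloud service to transcribe an audio file; the containerized library packages the same capability in embeddable form. The API key, service URL and file name are placeholders.

```python
from ibm_watson import SpeechToTextV1
from ibm_cloud_sdk_core.authenticators import IAMAuthenticator

# Placeholder credentials and URL -- substitute your own service instance.
authenticator = IAMAuthenticator("YOUR_API_KEY")
stt = SpeechToTextV1(authenticator=authenticator)
stt.set_service_url("https://api.us-south.speech-to-text.watson.cloud.ibm.com")

# Transcribe a recorded voice message (hypothetical file) to text.
with open("voice_message.mp3", "rb") as audio_file:
    response = stt.recognize(
        audio=audio_file,
        content_type="audio/mp3",
        model="en-US_BroadbandModel",
    ).get_result()

transcript = " ".join(
    result["alternatives"][0]["transcript"] for result in response["results"]
)
print(transcript)
```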

Unfortunately, many organizations still struggle to find talent with the skills to build and deploy AI solutions into their businesses. So, it’s crucial for companies that want to add specific AI capabilities to their applications or workflows to do so without expanding their technology stack, hiring more data science talent, or investing in expensive supercomputing resources.

To address this, businesses have found value in embedding powerful technology that uses specific models to harness AI’s potential in the way that best fits their needs, whether through domain-optimized applications or containerized software libraries.

Differentiated solutions drive business success


IBM Research has added three new software libraries to IBM’s portfolio of embeddable AI solutions—software libraries that are not bound to any platform and can be run across environments, including public clouds, on-premises, and at the edge.

As a result, organizations can now use this technology to enhance their current applications or build their own solutions.

The new libraries include:

◉ IBM Watson Natural Language Processing Library
◉ IBM Watson Speech to Text Library
◉ IBM Watson Text to Speech Library

In addition to the libraries, the embeddable AI portfolio includes IBM Watson APIs and applications like IBM Watson Assistant, IBM Watson Discovery, IBM Instana Observability, and IBM Maximo Visual Inspection.

Embeddable AI libraries are lightweight and provide stable APIs for use across models, making it easier for organizations to bring novel solutions to market.

Partner solutions using embeddable AI


IBM partners are making use of embeddable AI in various ways and across different industries.

LegalMation, an IBM partner that helps the legal industry make use of AI and advanced technology, uses natural language processing to automate the redaction of sensitive contract information. Contracts and agreements contain information that organizations need to handle carefully. Usually, redacting information within a contract is a manual process, with a person going line by line to mark passages for redaction. Instead, LegalMation uses embeddable AI to create an automated solution: a natural language processing tool that finds and marks sensitive information automatically.

See how LegalMation also uses AI to reduce the early-phase response documentation drafting process from 6–10 hours to 2 minutes.

Language-training school ASTEX, based in Madrid, Spain, has seen student careers skyrocket after they completed its courses. ASTEX uses AI to streamline students’ onboarding experience, offer personalized learning plans and improve the program’s scalability by reducing its dependence on humans. IBM partner Ivory Soluciones connected ASTEX with IBM because of the tech company’s expertise in AI solutions. Working closely together, ASTEX, Ivory, and IBM developed the ASTEX Language Innovation platform on IBM Cloud® with IBM Watson® technology.

Call recording service, Dubber, uses speech-to-text, tone analyzer, and natural language understanding to capture and transcribe a variety of verbal exchanges. The solution, powered by embeddable AI, automatically translates phone calls and video conferences into text and assigns each conversation a positive, negative, or neutral value, depending on the call. Users can then mine the data using simple keyword searches to find the information they need.
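
A hedged sketch of the sentiment-scoring step: the ibm-watson Python SDK’s Natural Language Understanding service can assign a positive, negative or neutral label to a transcribed conversation. The credentials, version date and sample text below are placeholders, and Dubber’s actual pipeline is not public.

```python
from ibm_watson import NaturalLanguageUnderstandingV1
from ibm_watson.natural_language_understanding_v1 import Features, SentimentOptions
from ibm_cloud_sdk_core.authenticators import IAMAuthenticator

# Placeholder credentials -- substitute your own NLU instance.
authenticator = IAMAuthenticator("YOUR_API_KEY")
nlu = NaturalLanguageUnderstandingV1(version="2022-04-07", authenticator=authenticator)
nlu.set_service_url(
    "https://api.us-south.natural-language-understanding.watson.cloud.ibm.com"
)

transcript = "Thanks for resolving my billing issue so quickly, that was great service."

# Ask NLU for overall document sentiment (positive / negative / neutral).
analysis = nlu.analyze(
    text=transcript,
    features=Features(sentiment=SentimentOptions()),
).get_result()

label = analysis["sentiment"]["document"]["label"]
score = analysis["sentiment"]["document"]["score"]
print(f"Conversation sentiment: {label} ({score:+.2f})")
```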

Now, with the addition of the new software libraries, new and existing IBM partners can embed the same Watson AI that powers IBM’s market-leading products, with the flexibility to build and deploy on any cloud in the containerized environment of their choice.

Source: ibm.com

Thursday, 16 March 2023

CFOs are the new ESG storytellers

Ready or not, here they come. Mandatory regulatory disclosures regarding environmental, social and governance (ESG) factors are on the horizon for businesses everywhere. Governments worldwide have already adopted reporting requirements aligned with recommendations by the Task Force on Climate-Related Financial Disclosures (TCFD), including those in Canada, Brazil, the EU, Hong Kong, Japan, New Zealand, Singapore and Switzerland. Additionally, the German Supply Chain Due Diligence Act went into effect in January 2023. In the US, the Securities and Exchange Commission (SEC) is considering new requirements for public companies that could take effect as early as January 2024.

ESG disclosures require companies to analyze and report on a wide and evolving range of factors. The SEC regulations would require public companies to disclose greenhouse gas emissions (in the near term for their own operations and utilities, and in the future for their entire supply chain), as well as their targets and transition plans for reducing emissions. And companies would need to report potential exposure to extreme weather events, including hurricanes, heatwaves, wildfires and drought, and also identify how they’re assessing those risks.

In the EU, the Corporate Sustainability Reporting Directive (CSRD), which will take effect in 2024, demands even more detailed disclosures about how a company’s business model and activities affect sustainability factors like environmental justice and human rights. “You can’t wait for the regulations to kick in,” says Adam Thompson, Global Sustainable Finance and ESG Reporting Offerings Lead, IBM Consulting. “You need to be on this journey now.”

CFOs at the helm of transformation


CFOs are now responsible for transparent communication about how a company’s sustainability performance and sustainability metrics are tied to financial disclosures. That speaks to a larger evolution in the role of the finance function in business. “Finance leaders have gone from being bean counters to storytellers,” says Monica Proothi, Global Finance Transformation Lead for IBM Consulting. “They are not merely informing the business, but partnering with the business to transform data into insights that drive strategic ambitions, including leadership agenda around sustainability.”

The pressure is on for CFOs to build the capability for high-quality ESG reporting and communications into their finance departments. Underperforming in this capacity comes with material risk. “Access to capital is going to change,” Thompson says. A 2022 study by IBM’s Institute for Business Value found that the majority of CEOs surveyed were under intense pressure from investors to improve transparency around sustainability factors such as emissions, resource use, fair labor and ethical sourcing. As ESG reporting becomes more prevalent, details like these will be used to determine the quality of an investment. “You’re going to have an ESG risk rating, just like you have a credit rating,” Thompson says.

It’s important that we continue to make sustainable finance and ESG reporting efforts relevant for the CFO. The increasing focus on data transformation must integrate sustainable finance efforts in the process. We know that the CFO will be at the helm of this transformation too, yet some currently view sustainability more narrowly, as an operational or compliance issue only.

How CFOs can keep pace with high-quality ESG reporting


Thompson warns that many companies are taking shortcuts, cherry-picking data for reports or relying on estimates and secondary sources. It amounts to greenwashing, which is a reputational risk, and it won’t prepare businesses for increasingly comprehensive disclosure requirements. High-quality ESG reporting requires real visibility, not only into a company’s own operations but into those of its suppliers.

Thompson calls this “getting under the hood,” and while it requires new technologies and capabilities, it will ultimately drive value and unlock opportunities. CFOs must rise to new challenges and expectations brought on by rapidly evolving regulations.

Take it step by step

Proothi advises CFOs to “think big, start small and act fast.” Quick wins will generate value that can be reinvested into larger initiatives, and breaking transformation into small steps helps employees adopt change. Proothi says it’s often helpful to set up a dedicated transformation office to orchestrate the process. That might involve managing upskilling, coordinating the cadence of new initiatives, evaluating employees’ experiences and ensuring progress isn’t derailed by “change fatigue.”

Untangle your processes

Process mining is one of the first steps Proothi and Thompson recommend in financial transformation. “It shows the optimal path, and then all of the variations that clients are seeing, which ends up being this big ball of spaghetti,” Proothi says. Optimizing processes gives your workforce the bandwidth it needs to tackle the high-value work of data orchestration and interpretation. One approach Proothi recommends is using solutions provided by Celonis, which can operationalize sustainability by giving real-time schematics of how a business actually works. This helps to identify supply chain bottlenecks, recalibrate workflows and spotlight hidden inefficiencies.

Build a data foundation

Enterprise Resource Planning (ERP) systems are a good tool for collecting quality data, “but they don’t have any de facto sustainability or data objects built in,” Thompson says. “Look at how you can extend, enhance or augment your ERP.” And legacy data storage will no longer suffice. “Data lakes are more like data swamps now,” Proothi says. Data mesh and data fabric solutions help ensure data is clean, current and accessible.

Build diverse teams

The evolving needs of the finance department require a range of skills that many finance professionals don’t have yet. “You’re never going to find a unicorn of a person who has all the skills required,” Proothi says. Combining individuals who have traditional accounting skills with people experienced in sustainability and communications will empower teams to handle new responsibilities as they help one another upskill.

Leverage technology to increase visibility

Integrated AI-powered platforms like IBM Envizi ESG Suite encompass asset management and supply chain management solutions to help organizations collect and compile a wide range of environmental data. Perhaps more importantly for CFOs, they transform that data into legible outputs that provide the visibility required to make effective decisions about sustainability risks and opportunities. IBM also partners with FRDM, a platform that uses data science, machine learning and AI to highlight ESG risks across global supply chains and provide real transparency, even when it comes to complex issues like fair labor.

Gone are the days when the finance department was just responsible for closing the books each quarter. “The role of the CFO has completely changed,” Proothi says. What was historically an accounting role now holds opportunities for strategic leadership and balancing sustainability and profitability. CFOs of the future can reshape financial functions to drive performance and tell compelling stories to align an enterprise’s operations with its high-level goals and values.

Source: ibm.com

Tuesday, 14 March 2023

Data is key to intelligent asset management

Planning for business disruptions is the new business as usual. To get ahead in a rapidly shifting environment, industrial businesses are leaning more on integrating operational technology (OT) with IT data, which allows them to move from time-based to predictive maintenance, monitoring and management. But collecting and deriving insights from data that resides across disparate applications, legacy databases, machines, sensors and other sources is a complex affair. In fact, two-thirds of business data remains unleveraged. If companies can’t turn their data into value, it’s useless.


This is where intelligent asset management (IAM) comes in.

Announced today at MaximoWorld, IBM’s intelligent asset management brings together powerful solution suites for asset management, facilities management and environmental intelligence in one integrated place. It empowers the entire organization, from the C-suite to the frontline and all along the supply chain, to make more informed, predictive decisions across key operational and functional arenas. With IAM, all players can:

◉ Monitor and measure operations for a 360-degree view of internal and external data, using asset and sustainability performance management capabilities to help balance net income with net-zero objectives.
◉ Manage assets, infrastructure and resources to optimize and prioritize activities that improve the bottom line, including new integrations between Maximo and TRIRIGA to merge best practices across property, plant and equipment.
◉ Improve product and service quality with AI and next-gen technologies that increase customer satisfaction and cost control with intuitive visual inspection and occupancy experience solutions.

IAM breaks down the walls between these traditionally siloed data sets through a holistic approach to asset management, allowing organizations to untangle their data and bring sustainability and resiliency into their business. Here are just a few examples of how clients are utilizing IAM today.

Creating an end-to-end digital utility


One example is the New York Power Authority (NYPA). The NYPA, already the largest state public power organization in the U.S., seeks to become the nation’s first fully digital public power utility. This ambitious goal is part of the organization’s VISION2030 strategic plan, which provides a roadmap for transforming the state’s energy infrastructure to a clean, reliable, resilient and affordable system over the next decade.

To help unify its asset management system and integrate its Fleet Department, the NYPA turned to IBM Maximo®. The NYPA already uses several Maximo solutions — including the Assets, Inventory, Planning, Preventive Maintenance and Work Order modules — to help manage its generation and transmission operations. But its Fleet Department still relied on separate, standalone software for fleet management, preventing cross-organizational visibility into vehicle information. With the Maximo for Transportation solution, the Fleet Department is helping to ensure optimal management of approximately 1,600 NYPA vehicles. Using this central source reduces operational downtime and cuts costs while boosting worker productivity. It also supports the NYPA’s clean energy goals to decarbonize New York State.

Harnessing weather predictions to deliver power across India


Leading companies are also turning to IAM solutions to become more responsive to changes and to ensure business continuity. They are leveraging tools like the IBM Environmental Intelligence Suite, which provides advanced analytics to plan for and respond to disruptive weather events and avoid outages.

In recent years, India has made massive strides in ensuring that every electricity-consuming entity has access to the power they need. But the country struggled when it came to the reliability and efficiency of these services. Government officials had to calculate energy predictions manually using spreadsheets that could only consider historical energy usage. This process left much room for inefficiencies, waste, and financial loss. Officials needed a new way to understand all the factors that impact demand.

Delhi-based Mercados EMI, a leading consultancy firm that specializes in solving energy sector challenges, worked with IBM to create an AI-based demand forecasting solution to help address this problem. The model combined historical demand data with weather pattern information from The Weather Company’s History on Demand data package, enabling officials to accurately predict when and where energy would be consumed based on environmental conditions. With this data, Mercados could provide utilities with demand forecasts of up to 98.2% accuracy, reducing the chances of outages and optimizing costs when buying capacity. This allowed officials to make better overall decisions to balance supply, demand and costs to consumers.
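
A hedged sketch of the general approach: join historical demand with weather observations and train a regression model to forecast load. The data file, feature names and model choice below are illustrative; Mercados’ actual model and the History on Demand schema are not public.

```python
import pandas as pd
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.metrics import mean_absolute_percentage_error
from sklearn.model_selection import train_test_split

# Hypothetical hourly data: past demand joined with weather observations.
df = pd.read_csv("demand_with_weather.csv", parse_dates=["timestamp"])
df["hour"] = df["timestamp"].dt.hour
df["day_of_week"] = df["timestamp"].dt.dayofweek

features = ["temperature_c", "humidity_pct", "wind_speed_kmh", "hour", "day_of_week"]
X_train, X_test, y_train, y_test = train_test_split(
    df[features], df["demand_mw"], shuffle=False, test_size=0.2
)

# Train a simple gradient-boosted model on weather + calendar features.
model = GradientBoostingRegressor(n_estimators=300, learning_rate=0.05)
model.fit(X_train, y_train)

# Evaluate forecast error on the held-out, most recent period.
mape = mean_absolute_percentage_error(y_test, model.predict(X_test))
print(f"MAPE on holdout: {mape:.1%}")
```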

Keeping cities safe and sustainable with AI and IoT


As the economics of leveraging AI and monitoring assets remotely become more favorable than large supervisory systems, it becomes critical that this lightweight infrastructure can also provide rapid insights from real-time situational data. This challenge is particularly pronounced when it comes to environmental issues, where connecting city systems and infrastructure resources with real-time awareness can make all the difference.

Take Melbourne, Australia as an example. Due to climate change, Melbourne is experiencing more extreme weather such as severe rainfall events. In 2018, over 50 mm (2 in.) of rain fell in 15 minutes, resulting in flash floods and widespread power outages.

To help provide protection against flooding, the city’s water management utility, Melbourne Water, operates a vast drainage network that includes approximately 4,000 pits and grates. To function properly, the stormwater drainage system requires regular inspection and maintenance, which in turn requires thousands of hours of manpower every year, often executed during the most dangerous conditions.

That is why Melbourne Water turned to AI-powered visual inspection technology in the IBM Maximo Application Suite. This allowed the utility to use cameras to capture real-time information about its stormwater system, then leverage AI to analyze the situation and detect blockages. And because Maximo allows easy integration between management, monitoring and maintenance data and applications, crews can focus on the areas that pose the most risk to Melbourne and its citizens.

Source: ibm.com

Saturday, 11 March 2023

Exploring generative AI to maximize experiences, decision-making and business value

The first part of this three-part series described generative AI and how it works.

IBM Consulting sees tangible business value in augmenting existing enterprise AI deployments with generative AI to improve performance and accelerate time to value. These models deliver dramatically enhanced capabilities in four categories:

◉ Summarization as seen in examples like call center interactions, documents such as financial reports, analyst articles, emails, news and media trends. 
◉ Semantic search as seen in examples like reviews, knowledge base and product descriptions. 
◉ Content creation as seen in examples like technical documentation, user stories, test cases, data, generating images, personalized UI, personas and marketing copy.
◉ Code creation as seen in examples like code co-pilots, pipelines, Dockerfiles, Terraform scripts, converting user stories to Gherkin format, diagrams as code, architectural artifacts, threat models and application code.

With these improvements, it’s easy to see how every industry can re-imagine their core processes with generative AI.
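
As a small, hedged illustration of the summarization category above, the sketch below uses the open-source Hugging Face transformers library to summarize a passage of call-center notes. It is a generic example, not IBM’s implementation, and the model name and sample text are placeholders.

```python
from transformers import pipeline

# Generic open-source summarization model; any seq2seq summarizer would do here.
summarizer = pipeline("summarization", model="sshleifer/distilbart-cnn-12-6")

call_notes = (
    "The customer called about a duplicate charge on their March invoice. "
    "The agent confirmed the duplicate, issued a refund to the original payment "
    "method, and offered a 10% credit on next month's bill as a goodwill gesture. "
    "The customer accepted and asked to be notified when the refund is processed."
)

# Produce a short abstractive summary of the interaction.
summary = summarizer(call_notes, max_length=40, min_length=10, do_sample=False)
print(summary[0]["summary_text"])
```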

Leading use cases do more than simply cut costs. They contribute to employee satisfaction, customer trust and business growth. And these aren’t just forward-looking possibilities: companies are using generative AI today to realize rapid business value, including more accurate, near real-time insight into customer complaints (reducing time to insight), shorter internal audits to maintain regulatory compliance, and efficiency gains in testing and classification.

While these early cases and the results they’ve delivered are exciting, the work involved in building generative AI solutions must be developed carefully and with critical attention paid to the potential risks involved including:

◉ Bias: As with any AI model, the training data has an impact on the results the model produces. Foundation Models are trained on large portions of data crawled from the internet. Consequently, the biases that inherently exist in internet data are picked up by the trained models and can show up in the results the models produce. While there are ways to mitigate this effect, enterprises need to have governance mechanisms in place to understand and address this risk. 

◉ Opacity: Foundation models are also not fully auditable or transparent because of the “self-supervised” nature of the algorithm’s training.

◉ Hallucination: LLMs can produce “hallucinations,” results that satisfy a prompt syntactically but are factually incorrect. Again, enterprises need to have strong governance mechanisms in place to mitigate this risk.

◉ Intellectual property: There are unanswered questions concerning the legal implications of, and who may own the rights to, content generated by models that are trained on potentially copyrighted material.

◉ Security: These models are susceptible to data and security risks, including prompt injection attacks.

When engaging in generative AI projects, business leaders must ensure that they put strong AI ethics and governance mechanisms in place to mitigate the risks involved. Leveraging the IBM Garage methodology, IBM can help business leaders evaluate each generative AI initiative based on how risky it is and how precise its output needs to be. In the first wave, clients can prioritize internal, employee-facing use cases where the output is reviewed by humans and does not require a high degree of precision.

Generative AI and LLMs introduce new hazards into the field of AI, and we do not claim to have all the answers to the questions that these new solutions introduce. IBM Consulting is committed to applying measured introspection during engagements with enterprises, governments and society at large and to ensuring a diverse representation of perspectives as we find answers to those questions. 

Source: ibm.com

Thursday, 9 March 2023

Innocens BV leverages IBM Technology to Develop an AI Solution to help detect potential sepsis events in high-risk newborns

From the moment of birth to discharge, healthcare professionals can collect a wealth of data about an infant’s vitals—for instance, heartbeat frequency or every rise and drop in blood oxygen level. Although medicine continues to advance, there’s still much to be done to help reduce the number of premature births and infant mortality. The worldwide statistics are staggering: the University of Oxford estimates that neonatal sepsis causes 2.5 million infant deaths annually.

Babies born prematurely are susceptible to health problems. Sepsis, or bloodstream infection, is life-threatening and a common complication for infants admitted to a neonatal intensive care unit (NICU).

At Innocens BV, the belief is that earlier identification of sepsis-related events in newborns is possible, especially given the vast number of data points collected from the moment a baby is born. Years’ worth of aggregated NICU data could help lead to a solution. The challenge was gleaning relevant insights from the vast amount of data collected to help identify the infants at risk. This mission is how Innocens BV began in the NICU at Antwerp University Hospital in Antwerp, Belgium, in cooperation with the University of Antwerp. The hospital’s NICU is closely associated with the university, and its focus is on improving care for premature and low-birthweight infants. We joined forces with a bioinformatics research group from the University of Antwerp and started taking the first steps in developing a solution.

Using IBM’s technology and the expertise of its data scientists, along with the knowledge and insights of the hospital’s NICU medical team, we kicked off a project to develop these ideas into a solution aimed at using clinical signals routinely collected in care to help doctors detect, in a timely way, patterns in the data that are associated with a sepsis episode. The approach required both AI and edge computing to create a predictive model that could process years of anonymized data, so that doctors could observe and monitor the thousands of data points available and make informed decisions.

How AI powers the Innocens Project


When the collaboration began, data scientists at IBM understood they were dealing with a sensitive topic and sensitive information. The Innocens team needed to build a model that could detect subtle changes in neonates’ vital signs while generating as few false alarms as possible. This required a model with a high level of precision that is also built upon key principles of trustworthy AI, including transparency, explainability, fairness, privacy and robustness.

Using IBM Watson Studio, a service available on IBM Cloud Pak for Data, to train and monitor the AI solution’s machine learning models, Innocens BV could help doctors by providing data-driven insights associated with a potential onset of sepsis. Early results on historical data show that many severe sepsis cases can be identified multiple hours in advance. The user interface presenting the output of the predictive AI model is designed to give doctors and other medical personnel insights on individual patients and to augment their clinical intuition.
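
A heavily hedged sketch of what such a model-training step might look like in Python with scikit-learn: a classifier trained on vital-sign features to flag elevated sepsis risk, tuned toward precision so false alarms stay low. The feature names, data file and threshold are hypothetical and do not describe the actual Innocens model.

```python
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import precision_score, recall_score
from sklearn.model_selection import train_test_split

# Hypothetical, anonymized vital-sign features aggregated per time window.
df = pd.read_csv("nicu_vitals_windows.csv")
features = ["heart_rate_mean", "heart_rate_variability", "spo2_mean", "spo2_drops_per_hour"]

X_train, X_test, y_train, y_test = train_test_split(
    df[features], df["sepsis_within_24h"], test_size=0.2, stratify=df["sepsis_within_24h"]
)

model = RandomForestClassifier(n_estimators=200, class_weight="balanced")
model.fit(X_train, y_train)

# Use a high probability threshold to favor precision (fewer false alarms).
probs = model.predict_proba(X_test)[:, 1]
alerts = probs >= 0.8
print("Precision:", precision_score(y_test, alerts))
print("Recall:", recall_score(y_test, alerts))
```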

Innocens worked closely with IBM and medical personnel at the Antwerp University Hospital to develop a purposeful platform with a user interface that is consistent and easy to navigate and that uses a comprehensible AI model with explainable AI capabilities. With the doctors and nurses in mind, the team aimed to create a model that would allow the intended users to reap its benefits. This work was imperative for building trust between the users and the instruments that would help inform a clinician’s diagnosis. Innocens also involved doctors in the process of building the user interface and respected the privacy and confidentiality of the anonymized historical patient data used to train the model within a robust data architecture.

The technology and outcomes of this research project could have the potential to not only help the patients at Antwerp University Hospital, but to scale for different NICU centers and help other hospitals as they work to combat neonatal sepsis. Innocens BV is working in collaboration with IBM to explore how Innocens can continue to leverage data to help train transparent and explainable AI models capable of finding patterns in patient data, providing doctors with additional data insights and tools that help inform clinical decision-making.

The impact of the Innocens technology is being investigated in clinical trials and is not yet commercially available.

Source: ibm.com

Tuesday, 7 March 2023

Real-time analytics on IoT data

Why real-time analytics matters for IoT systems


IoT systems access millions of devices that generate large amounts of streaming data. For some equipment, a single event may prove critical to understanding and responding to the health of the machine in real time, increasing the importance of accurate, reliable data. While real-time data remains important, storing and analyzing the historical data also creates opportunities to improve processes, decision-making and outcomes.

Smart grids, which include components like sensors and smart meters, produce a wealth of telemetry data that can be used for multiple purposes, including:

◉ Identifying anomalies such as manufacturing defects or process deviations
◉ Predictive maintenance on devices (such as meters and transformers)
◉ Real-time operational dashboards
◉ Inventory optimization (in retail)
◉ Supply chain optimization (in manufacturing)

Considering solutions for real-time analytics on IoT data


One way to achieve real-time analytics is to combine an operational database (a time-series database such as InfluxDB or TimescaleDB, or a NoSQL database such as MongoDB) with a data warehouse and a BI tool.

This architecture raises a question: Why would one use an operational database and still need a data warehouse? Architects accept this separation so they can choose a special-purpose database, such as a NoSQL database for document data or a key-value time-series database, for low cost and high performance.

However, this separation also creates a data bottleneck — data can’t be analyzed without moving it from an operational data store to the warehouse. Additionally, NoSQL databases are not great at analytics, especially when it comes to complex joins and real-time analytics.

Is there a better way? What if you could get all of the above with a general-purpose, high-performance SQL database? You’d need this type of database to support time-series data, streaming data ingestion, real-time analytics and perhaps even JSON documents.

Achieving a real-time architecture with SingleStoreDB + IBM Cognos


SingleStoreDB supports fast ingestion with Pipelines (a native, first-class feature) and concurrent analytics on IoT data to enable real-time analytics. On top of SingleStoreDB, you can use IBM® Cognos® Business Intelligence to help you make sense of all of this data. The previously described architecture then simplifies into:

Real-time analytics with SingleStoreDB & IBM Cognos

Pipelines in SingleStoreDB allow you to continuously load data at blazing fast speeds. Millions of events can be ingested each second, in parallel, from data sources such as Kafka, cloud object storage or HDFS. This means you can stream in structured as well as unstructured data for real-time analytics.
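
A hedged sketch of what this looks like in practice: because SingleStoreDB speaks the MySQL wire protocol, you can create and start a Kafka pipeline from Python. The broker address, topic, table and JSON field mapping below are illustrative, and the exact CREATE PIPELINE options should be checked against the SingleStoreDB documentation.

```python
import pymysql  # SingleStoreDB is MySQL wire-protocol compatible

# Placeholder connection details for a SingleStoreDB cluster.
conn = pymysql.connect(host="svc-singlestore.example.com", user="admin",
                       password="YOUR_PASSWORD", database="iot", autocommit=True)

with conn.cursor() as cur:
    # Target table for smart-meter readings.
    cur.execute("""
        CREATE TABLE IF NOT EXISTS meter_readings (
            meter_id BIGINT,
            reading_ts DATETIME(6),
            kwh DOUBLE
        )
    """)

    # Pipeline that continuously ingests JSON events from a Kafka topic.
    cur.execute("""
        CREATE PIPELINE meter_pipeline AS
        LOAD DATA KAFKA 'broker.example.com:9092/meter-events'
        INTO TABLE meter_readings
        FORMAT JSON
        (meter_id <- meter_id, reading_ts <- reading_ts, kwh <- kwh)
    """)
    cur.execute("START PIPELINE meter_pipeline")

    # Concurrent real-time analytics while data streams in.
    cur.execute("""
        SELECT meter_id, SUM(kwh)
        FROM meter_readings
        WHERE reading_ts > NOW() - INTERVAL 1 HOUR
        GROUP BY meter_id
    """)
    print(cur.fetchall())
```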

But wait, it gets better…

1. Once data is in SingleStoreDB, it can also be used for real-time machine learning, or to safely run application code imported into a sandbox with SingleStoreDB’s Code Engine Powered by Web Assembly (Wasm).
2. With SingleStoreDB, you can also leverage geospatial data — for instance to factor site locations, or to visualize material moving through your supply chains.

Armis and Infiswift are just a couple of examples of how customers use SingleStoreDB for IoT applications:

◉ Armis uses SingleStoreDB to help enterprises discover and secure IoT devices. Armis originally started with PostgreSQL, migrated to Elasticsearch for better search performance and considered Google BigQuery before finally picking SingleStoreDB for its overall capabilities across relational, analytics and text search. The Armis Platform, of which SingleStoreDB now plays a significant part, collects an array of raw data (traffic, asset, user data and more) from various sources, then processes, analyzes, enriches and aggregates it.

◉ Infiswift selected SingleStoreDB after evaluating several other databases. Their decision was driven in part because of SingleStore’s Universal Storage technology (a hybrid table type that works for both transactional and analytical workloads).

Want to learn more about achieving real-time analytics?


Join IBM and SingleStore on Sep 21, 2022 for our webinar “Accelerating Real-Time IoT Analytics with IBM Cognos and SingleStore”. You will learn how real-time data can be leveraged to identify anomalies and create alarms by reading meter data and classifying unusual spikes as warnings.

We will demonstrate:

◉ Streaming data ingestion using SingleStoreDB Pipelines
◉ Stored procedures in SingleStoreDB to classify data before it is persisted on disk or in memory
◉ Dashboarding with Cognos

These capabilities enable companies to:

◉ Provide better quality of service through quickly reacting to or predicting service interruptions due to equipment failures
◉ Identify opportunities to increase production throughput as needed
◉ Quickly and accurately invoice customers for their utilization

Source: ibm.com

Saturday, 4 March 2023

How AI and automation are helping America’s Internal Revenue Service better serve we the people

The delivery of public services by the government continues to evolve as citizens increasingly look for more personalized and seamless experiences. The last three years have acted as a tailwind, further pushing demand and forcing governments to rethink their service delivery models. The Internal Revenue Service’s (IRS) tax return processing function, the quintessential citizen service that touches every American household, is a powerful example of the innovation that can come from viewing public services through the lens of citizens.


In 2020 the IRS faced significant paper tax return processing backlogs. Tractor trailers full of paper were sitting at processing centers, waiting to be opened, scanned, sorted, properly inventoried and manually processed onsite. Tax return processing delays can push back refunds for taxpayers by as much as six months or longer. And although the increase in electronic filing has greatly reduced the number of paper returns the IRS receives in an ordinary filing season, paper tax return volume is still significant. In alignment with the agency’s strategic goals to make the IRS more accessible, efficient and effective by continually innovating operations, adopting industry-leading technology and increasing the efficiency and currency of technology investments, the IRS engaged IBM to digitize its tax year 2020 and 2021 paper tax return intake process to enable remote scanning, validation and processing.

The project went well beyond paper scanning and the automated extraction of data from paper tax returns. Enabled by artificial intelligence and a digital extraction tool, with IBM’s automated data validation process, the IRS’s Modernized e-File (MeF) system ultimately accepted 76% of paper tax returns processed without human intervention. In the end, working with the IRS, the team processed nearly 140,000 paper tax returns at a significantly higher rate of quality relative to human transcription — providing the IRS with the simplicity and efficiency needed to tackle its backlog challenge. Technologies used for the project also laid the foundation for the possibility of future anomaly and fraud detection during the tax return intake process — even for paper tax returns.

The IRS is at a self-defined inflection point. IBM’s IT modernization work with the IRS is a demonstration of the agency’s commitment to the delivery of more seamless citizen services. However, as with all digital transformations, there’s still work to be done. Backlogs of tax returns may unfortunately continue into the 2023 filing season, and IBM is prepared to continue assisting the IRS in this critical work for the American people.

In today’s rapidly changing, technology-enabled world, where citizens have become used to services being only a click away, government agencies are under increasing pressure to keep pace. As we learned during this work with the IRS, digital transformation is a marathon, not a sprint. It’s steered by continuous technology innovation that presents new, more effective ways to conduct business and deliver services. Complex challenges and growing citizen expectations make it imperative that agencies maintain the accelerated pace of digital innovation to deliver value today and tomorrow.

Source: ibm.com

Friday, 3 March 2023

Trends driving managed file transfer and B2B data exchange modernization

The most important business conversations your organization has are the daily digital exchanges of data with your customers and suppliers. Significant negative trends have emerged in the past few years, raising pressure on the critical software that most companies use to exchange business data within and outside their company, especially with their customers and suppliers. Your organization is likely feeling some challenges created by these trends.

Among the current key trends impacting file transfer and B2B systems, undoubtedly the main disruption comes from the overall increase in transaction volume. With the accelerated digitization of many business processes, organizations have seen unprecedented growth in their file transfer and B2B transactions in recent years.

Other trends include:

◉ A growing number of customers and suppliers. As more organizations look to increase the agility of their supply chain, they are adding suppliers.

◉ Expectations for faster response times. Many organizations are receiving response time objectives from their customers. Not replying to an inventory request or shipping date query fast enough means the customer may move on to the next supplier.

◉ IT and cloud efficiencies. By moving to the cloud, IT efficiencies are being explored in every IT department. Your company’s journey to the cloud can impact which solution best suits the digital conversation you have with your organization’s business partners.

◉ Increasing transaction complexity. As some customers and suppliers request semi-custom APIs and Hybrid EDI-API options, complexity soars.

◉ Increasing costs of data breaches. As found in the IBM Security report Cost of a data breach 2022, the cost of a breach has risen 12.5% as ransomware and destructive attacks surge.

Trends, challenges, disruptions


Ignoring these trends can be extremely costly. If not appropriately analyzed, trends can become challenges. And unaddressed IT challenges can become business disruptions. Businesses that ignore this landscape are more likely to experience order delays and interruptions, resulting in revenue loss, unexpected costs or customer dissatisfaction.

As these challenges remain unaddressed, IT teams are beset by tasks of increasing complexity driven by weak or inconsistent security and lack of visibility.

Siloed solutions and IT skills shortages are also common pain points. These challenges cause companies to struggle with their journey to the cloud and modernization efforts, leading to difficulty in scaling and slow onboarding of new customers or suppliers.

Is your business experiencing any of these IT trends in your file transfer or B2B transactions? Then you need modern solutions to meet and overcome IT challenges before they impact your business. And if your organization is not yet seeing these trends surface, then it’s important to start building the foundation for growth your business needs before the storm hits.

Businesses need a modern and reliable B2B data exchange solution built for demanding workloads. An effective solution must provide:

◉ Connectivity across a wide range of standards and protocols, data translation and validation
◉ End-to-end insight at the document, transaction and business level
◉ Rapid onboarding and efficient management of customers and partners
◉ Hybrid cloud, on-premises, and SaaS capabilities to support self-managed, vendor-managed or custom deployment
◉ Secure, scalable and reliable solutions for a diverse range of demanding workloads

IBM Sterling® Data Exchange is one such effective solution: a group of offerings built to improve the exchange of data with customers and suppliers, revolving around managed file transfer (MFT) and B2B integration. Take advantage of IBM’s long-standing experience in EDI, APIs and MFT business data exchange.

Source: ibm.com

Thursday, 2 March 2023

Will ISO 20022 overcome its delays to unlock huge opportunities?

For decades, financial institutions and corporations have sought an easy and standard method of exchanging electronic financial messages. MT standards, X.25 and all the EDI formats were supposed to solve deficits in the data and finance reporting space. In 2004, when the first publication of ISO 20022 was released, payments practitioners and industry professionals all had the same thought: What a great idea!


At the time, it made absolute sense to introduce data-rich payments to improve automation, enhance reporting and analytics, improve interoperability and decrease risk through accurate reconciliation and regulatory compliance activities.

By 2018, many wondered why the industry hadn’t widely adopted ISO 20022. Although some countries adopted ISO 20022 for payments systems by 2019, most did not realize the potential of the rich data within the message streams. When SWIFT then mandated an ISO 20022 rollout for 2021, with richer data than its old MT standard, the industry started investing more heavily in adoption. Despite this mandate, the roll-out plan was pushed out to 2022 and, more recently, to the first quarter of 2023.

These delays (and what seems like a reluctance to make ISO 20022 happen in some jurisdictions) led me to wonder, “What’s hindering the global adoption of ISO 20022 and the rich data structures it supports?” The answer may lie in the balance of benefits and costs of adoption.

What is the value of adopting ISO 20022?


Numerous publications on the topic have boasted the benefits of adopting ISO 20022. As a community, payments professionals and practitioners were promised numerous benefits from a processing and payments operations perspective, related to the rich and structured data of ISO 20022:

◉ Ability to transfer more complete and accurate information between banks and other FIs
◉ Higher straight-through processing rates using the rich information
◉ Fewer manual validations from a regulatory compliance perspective
◉ Ability for compliance teams to focus on investigating true hits rather than false positives
◉ No truncated information and no data loss
◉ Little to no limitation on the number of characters travelling with the payment messages
◉ Interoperability between interconnecting core banking systems, market infrastructures and end users
◉ Easier development due to the format’s use of XML, a common language (see the sketch after this list)
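
To make the structured-data point concrete, here is a hedged Python sketch that parses a simplified ISO 20022 pacs.008 (customer credit transfer) fragment. The XML below is illustrative only, is not a complete or schema-valid message, and element names should be checked against the official message definitions.

```python
import xml.etree.ElementTree as ET

# Simplified, illustrative pacs.008 fragment (not a complete or valid message).
PACS_008 = """
<Document xmlns="urn:iso:std:iso:20022:tech:xsd:pacs.008.001.08">
  <FIToFICstmrCdtTrf>
    <CdtTrfTxInf>
      <PmtId><EndToEndId>INV-2023-0042</EndToEndId></PmtId>
      <IntrBkSttlmAmt Ccy="EUR">1250.00</IntrBkSttlmAmt>
      <Dbtr><Nm>Acme Manufacturing BV</Nm></Dbtr>
      <Cdtr><Nm>Globex Logistics GmbH</Nm></Cdtr>
      <RmtInf><Ustrd>Payment for invoice 2023-0042, PO 7781</Ustrd></RmtInf>
    </CdtTrfTxInf>
  </FIToFICstmrCdtTrf>
</Document>
"""

ns = {"p": "urn:iso:std:iso:20022:tech:xsd:pacs.008.001.08"}
tx = ET.fromstring(PACS_008).find(".//p:CdtTrfTxInf", ns)

# Structured fields travel with the payment instead of being truncated free text.
amount = tx.find("p:IntrBkSttlmAmt", ns)
print("End-to-end ID:", tx.find("p:PmtId/p:EndToEndId", ns).text)
print("Amount:", amount.text, amount.attrib["Ccy"])
print("Debtor:", tx.find("p:Dbtr/p:Nm", ns).text)
print("Creditor:", tx.find("p:Cdtr/p:Nm", ns).text)
print("Remittance:", tx.find("p:RmtInf/p:Ustrd", ns).text)
```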

Theoretically, these ISO 20022 benefits are great. But to understand the big picture, we must address the challenges.

What are the challenges of adopting ISO 20022?


The industry has faced multiple challenges with the adoption of ISO 20022. Timelines are aggressive (though the pressure is largely self-inflicted, given years of inactivity on ISO 20022 adoption), and many institutions lack the resources to meet the roll-out deadlines. Additionally, the ISO 20022 message standard is updated regularly, with new versions of message types published on an ongoing basis. How do financial institutions and FinTechs keep up with this fast-paced change? It would benefit adopters to implement a system that can adapt to change easily. FinTechs could develop software that’s made for change to stay ahead of the evolving standard.

Many well-established financial institutions and market infrastructures rely on legacy systems running on an aging infrastructure, which is a challenge the industry needs to overcome. Moreover, these legacy systems speak a language that is not easily compatible with ISO 20022, and the message formats are typically far more stringent than even ISO 20022’s predecessor ISO 15022 (with formats for CHIPS, BACS, CPA005 and ACH, to name a few). These systems were originally developed to process low-value payments at very low cost. Meanwhile, adopting ISO 20022 can be very costly for participants of the payments ecosystem, so how can we argue that the value in adopting ISO 20022 for these systems outweighs the cost?

Key advantages come from business value


With little to no limitation in data and a highly structured format, there is more to ISO 20022 than payments advantages. To have a truly impactful conversation, we must consider the business value that ISO 20022 adoption can provide.

◉ Corporate treasurers receive tools required to enhance cash flow forecasting and improve reconciliation processes with richer, comprehensive data.

◉ With increased automation potential, treasurers benefit from faster payments with less friction, allowing quick decision making while meeting payment cut-off deadlines for sweeping and investing activities.

◉ The payments community could consistently send the same structure in payment messages to multiple, geographically distributed entities without having to develop a bank-specific format.

◉ Data required in one jurisdiction, but optional in another, can consistently travel with the payment message whether required or not.

◉ The payments community gives business users the full set of data that can be contained within purchase orders and invoices, rather than a series of truncated or contracted characters.

◉ Individuals could simply click on their payroll deposit in their bank account to see their full paystub, rather than having to get the details from another system.

Perhaps ISO 20022 shouldn’t be perceived as today’s biggest value driver, but rather as the catalyst for innovation that will accelerate value-driven growth tomorrow. In a way, ISO 20022 adoption is paving the way for the next generations of payments professionals to create opportunities for the development of the industry. It allows more players in the FinTech space to collaborate with financial institutions to innovate the payments landscapes and to co-create in a climate of competition.

Seventy countries have already successfully modernized their payments infrastructure and implemented faster payments schemes powered by ISO 20022. Forward-thinking institutions in these countries can start developing value-added services and features like automated receivables tracking and reconciliation, real-time cash balances and forecasting, real-time multibank dashboards and more.

Should institutions and companies wait until there is a more immediate return on investment to adopt ISO 20022?


Ask yourself, “Do I want to wait until I lose a client to someone who can offer services on a real-time basis?” I believe that cross-border payments globalization will remain one of the trends that drives payments modernization. Standardization, consistency and rich data are key elements of this globalization initiative. Similarly to the telecommunications industry implementing the 5G network despite no obvious immediate business value or widespread end-user use cases, achieving complete interoperability in payments may take longer than we initially anticipated. However, as a part of the payments community, I challenge us to invest in a vision that goes beyond an imminent ROI and see the potential of data-rich, structured payments to create an inclusive, collaborative future for the payments industry. Let’s give ourselves the opportunity to accelerate payments innovation and make ISO 20022 table stakes.

Source: ibm.com