
Tuesday, 12 September 2023

Powering the future: The synergy of IBM and AWS partnership


We are in the midst of an AI revolution where organizations are seeking to leverage data for business transformation and harness generative AI and foundation models to boost productivity, innovate, enhance customer experiences, and gain a competitive edge. IBM and AWS have been working together since 2016 to provide secure, automated solutions for hybrid cloud environments. This partnership gives clients the flexibility to choose the right mix of technologies for their needs, and IBM Consulting can help them scale those solutions enterprise-wide. The result is easier, faster cloud transformation, technology deployment, and more secure operations, which can help drive better business results. This powerful synergy is revolutionizing how businesses harness the potential of data-driven AI to stay ahead in the digital age.

Data and AI as the Pillars of the Partnership


At the heart of this partnership lies a deep appreciation for the role of data as the catalyst for AI innovation. Data is the fuel that powers AI algorithms, enabling them to generate insights, predictions, and solutions that drive businesses forward. This understanding forms the cornerstone of the IBM and AWS collaboration, creating an environment where data and AI are seamlessly integrated to yield remarkable results. IBM and AWS have worked together to bring the entire IBM data management portfolio, namely Db2, Netezza, and IBM watsonx.data, to AWS in a cloud-native fashion, with the convenience of SaaS for our customers. This collaboration recognizes that data and AI are inseparable allies in driving digital transformation. Data provides the raw material for AI algorithms to learn and evolve, while AI extracts actionable insights from data that shape business strategies and customer experiences. Together, they amplify each other’s potential, creating a virtuous cycle of innovation.

IBM’s expertise in data management, AI research, and innovative solutions aligns perfectly with AWS’s cloud infrastructure and global reach. This partnership isn’t just a union of convenience; it’s a strategic alliance that capitalizes on the synergies of data and AI.

All in on IBM Data Store portfolio


Over the last six months, the IBM and AWS partnership has launched a suite of groundbreaking data and AI solutions that empower businesses to thrive in a data-driven world. These offerings focus on data management, AI development, and seamless integration, enhancing the capabilities of organizations across various industries.

◉ watsonx.data on AWS: Imagine having the power of data at your fingertips. watsonx.data is a software-as-a-service (SaaS) offering that revolutionizes data management. It cuts data warehouse costs in half, grants access to all your data on AWS in as little as 30 minutes, and incorporates a fit-for-purpose data store based on an open lakehouse architecture. Multiple query engines, built-in governance, and hybrid cloud deployment models further elevate its capabilities.

◉ Netezza SaaS: Transitioning to the cloud has never been easier. Netezza SaaS is designed to simplify data migration, ensuring frictionless upgrades and the lowest total cost of ownership (TCO). Its support for open table formats on cloud object storage (COS) and its integration with watsonx.data streamline data operations, making it a powerful asset for businesses in need of seamless cloud integration.

◉ Db2 Warehouse SaaS: The need for rapid data processing is met by Db2 Warehouse SaaS, which promises up to 7 times faster performance and a remarkable 30 times reduction in costs. Whether dealing with tens or hundreds of terabytes, this solution scales effortlessly. Its support for native and open table formats on COS and watsonx.data integration underscores its versatility.

A glimpse of real-world impact: Data-driven success


The true power of this partnership and its offerings becomes evident when considering real-world scenarios. One notable example is a prominent Japanese insurance company that leveraged Db2 PureScale on AWS. This partnership allowed them to enhance customer service, gaining insights that reshape how they interact with their clients.

A call to embrace data-driven AI transformation


For our customers, business data is king and AI is the game-changer. The IBM and AWS partnership emerges as a beacon of innovation. Together, we are reinforcing to our customers that data and AI are pivotal to progress. As businesses seek to navigate the complexities of the digital landscape, this partnership offers them a roadmap to data-driven AI solutions that fuel growth, innovation, and sustainable success.

Join IBM and AWS experts at IBM TechXchange Conference 2023, the premier learning event for developers, engineers, and IT practitioners.

The future unveiled through data and AI unity


The partnership between IBM and AWS signifies more than a collaboration; it’s a movement towards harnessing the full potential of data and AI. The symphony of their expertise creates a harmonious blend of innovation that transcends industries and reshapes possibilities. With data as the bedrock and AI as the engine, this partnership sets a precedent for a future where businesses thrive on the power of insights, intelligence, and innovation.

Source: ibm.com

Saturday, 12 August 2023

Optimizing clinical trial site performance: A focus on three AI capabilities


This article, part of IBM and Pfizer’s series on applying AI techniques to improve clinical trial performance, focuses on enrollment and real-time forecasting. We also explore ways to increase patient volume and diversity in clinical trial recruitment, and the potential to apply generative AI and quantum computing. More than ever, companies are finding that managing these interdependent journeys in a holistic and integrated way is essential to achieving change.

Despite advancements in the pharmaceutical industry and biomedical research, delivering drugs to market is still a complex process with tremendous opportunity for improvement. Clinical trials are time-consuming, costly, and largely inefficient for reasons that are out of companies’ control. Efficient clinical trial site selection continues to be a prominent industry-wide challenge. Research conducted by the Tufts Center for the Study of Drug Development and presented in 2020 found that 23% of trials fail to achieve planned recruitment timelines; years later, many of IBM’s clients still share the same struggle. The inability to meet planned recruitment timelines and the failure of certain sites to enroll participants carry a substantial monetary impact for pharmaceutical companies, which may be passed on to providers and patients as higher costs for medicines and healthcare services. Site selection and recruitment challenges are key cost drivers for IBM’s biopharma clients, with estimates between $15 million and $25 million annually, depending on the size of the company and its pipeline. This is in line with existing sector benchmarks.

When clinical trials are prematurely discontinued due to trial site underperformance, the research questions remain unanswered and research findings go unpublished. Failure to share data and results from randomized clinical trials means a missed opportunity to contribute to systematic reviews and meta-analyses, as well as a lack of lesson-sharing with the biopharma community.

As artificial intelligence (AI) establishes its presence in biopharma, integrating it into the clinical trial site selection process and ongoing performance management can help empower companies with invaluable insights into site performance, which may result in accelerated recruitment times, reduced global site footprint, and significant cost savings (Exhibit 1). AI can also empower trial managers and executives with the data to make strategic decisions. In this article, we outline how biopharma companies can potentially harness an AI-driven approach to make informed decisions based on evidence and increase the likelihood of success of a clinical trial site.

Exhibit 1

Tackling complexities in clinical trial site selection: A playground for a new technology and AI operating model


Enrollment strategists and site performance analysts are responsible for constructing and prioritizing robust end-to-end enrollment strategies tailored to specific trials. To do so they require data, which is in no short supply. The challenge they encounter is understanding which data are indicative of site performance: specifically, how to derive insights on site performance that enable them to factor non-performing sites into enrollment planning and real-time execution strategies.

In an ideal scenario, they would be able to predict, with consistent accuracy, which clinical trial sites are at risk of not meeting their recruitment expectations. Real-time monitoring of site activities and enrollment progress could then prompt timely mitigation actions before issues escalate. This ability would assist with initial clinical trial planning, resource allocation, and feasibility assessments, preventing financial losses and enabling better decision-making for successful clinical trial enrollment.

Additionally, biopharma companies may find themselves building out AI capabilities in-house sporadically and without overarching governance. Assembling multidisciplinary teams across functions to support a clinical trial process is challenging, and many biopharma companies do this in an isolated fashion. The result is many groups using a wide range of AI-based tools that are not fully integrated into a cohesive system and platform. IBM therefore observes that more clients tend to consult AI leaders to help establish governance and enhance AI and data science capabilities, through an operating model of co-delivery partnerships.

Embracing AI for clinical trials: The elements of success


By embracing three AI-enabled capabilities, biopharma companies can significantly optimize the clinical trial site selection process while developing core AI competencies that can be scaled out, freeing financial resources to be reinvested or redirected. Seizing these advantages is one way pharmaceutical companies may gain a sizable competitive edge.

AI-driven enrollment rate prediction

Enrollment prediction is typically conducted before the trial begins and helps enrollment strategists and feasibility analysts in initial trial planning, resource allocation, and feasibility assessment. Accurate enrollment rate prediction prevents financial losses, aids in strategizing enrollment plans by factoring in non-performance, and enables effective budget planning to avoid shortfalls and delays.

  • It can identify nonperforming clinical trial sites based on historical performance before the trial starts, helping teams factor site non-performance into a comprehensive enrollment strategy.
  • It can assist in budget planning by estimating the early financial resources required and securing adequate funding, preventing budget shortfalls and the need for requesting additional funding later, which can potentially slow down the enrollment process.

AI algorithms have the potential to surpass traditional statistical approaches for analyzing comprehensive recruitment data and accurately forecasting enrollment rates.  

  • AI offers enhanced capabilities to analyze complex and large volumes of recruitment data to accurately forecast enrollment rates at study, indication, and country levels.
  • AI algorithms can help identify underlying patterns and trends in the vast amounts of data collected during feasibility, as well as in previous experience with clinical trial sites. Blending historical performance data with real-world data (RWD) may elucidate hidden patterns that bolster enrollment rate predictions with higher accuracy than traditional statistical approaches. Enhancing current approaches with AI algorithms is intended to improve power, adaptability, and scalability, making them valuable tools for predicting complex clinical trial outcomes like enrollment rates. Larger or established teams often shy away from integrating AI due to complexities in rollout and validation; however, we have observed that greater value comes from employing ensemble methods to achieve more accurate and robust predictions.
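As a toy illustration of the ensemble idea, the sketch below averages a historical baseline with a simple trend model. The enrollment figures, models, and equal weighting are invented for illustration; a production system would combine far richer learned models:

```python
# Illustrative sketch: ensemble enrollment-rate forecast combining a
# historical site average with a simple trend extrapolation.

def historical_baseline(rates):
    """Baseline model: mean of past monthly enrollment rates (patients/site/month)."""
    return sum(rates) / len(rates)

def trend_forecast(rates):
    """Trend model: least-squares slope extrapolated one period ahead."""
    n = len(rates)
    x_mean = (n - 1) / 2
    y_mean = sum(rates) / n
    cov = sum((x - x_mean) * (y - y_mean) for x, y in enumerate(rates))
    var = sum((x - x_mean) ** 2 for x in range(n))
    slope = cov / var
    return y_mean + slope * (n - x_mean)  # extrapolate to the next period

def ensemble_forecast(rates, weights=(0.5, 0.5)):
    """Weighted average of the two component models."""
    preds = (historical_baseline(rates), trend_forecast(rates))
    return sum(w * p for w, p in zip(weights, preds))

# Example: a site enrolling 2.0, 2.2, 2.4, 2.6 patients/month
print(round(ensemble_forecast([2.0, 2.2, 2.4, 2.6]), 2))  # 2.55
```

The ensemble tempers the trend model’s extrapolation with the baseline, which is the intuition behind combining models for more robust predictions.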

Real-time monitoring and forecasting of site performance

Real-time insight into site performance offers up-to-date insights on enrollment progress, facilitates early detection of performance issues, and enables proactive decision-making and course corrections to facilitate clinical trial success.

  • Provides up-to-date insights into the enrollment progress and completion timelines by continuously capturing and analyzing enrollment data from various sources throughout the trial. 
  • Simulating enrollment scenarios on the fly from real-time monitoring can empower teams to enhance enrollment forecasting, facilitating early detection of performance issues at sites, such as slow recruitment, patient eligibility challenges, lack of patient engagement, site performance discrepancies, insufficient resources, and regulatory compliance issues.
  • Provides timely information that enables proactive, evidence-based decision-making, allowing minor course corrections with outsized impact, such as adjusting strategies or reallocating resources to keep a clinical trial on track and maximize its chances of success.

AI empowers real-time site performance monitoring and forecasting by automating data analysis, providing timely alerts and insights, and enabling predictive analytics. 

  • AI models can be designed to detect anomalies in real-time site performance data. By learning from historical patterns and using advanced algorithms, models can identify deviations from expected site performance levels and trigger alerts. This allows for prompt investigation and intervention when site performance discrepancies occur, enabling timely resolution and minimizing any negative impact.
  • AI enables efficient and accurate tracking and reporting of key performance metrics related to site performance such as enrollment rate, dropout rate, enrollment target achievement, participant diversity, etc. It can be integrated into real-time dashboards, visualizations, and reports that provide stakeholders with a comprehensive and up-to-date insight into site performance.
  • AI algorithms may provide a significant advantage in real-time forecasting due to their ability to elucidate and infer complex patterns within data and allow for reinforcement to drive continuous learning and improvement, which can help lead to a more accurate and informed forecasting outcome.
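One simple instance of the anomaly-detection pattern described above is a rolling z-score on weekly enrollment counts. The data, window, and threshold below are invented for illustration; production models learn far richer patterns:

```python
# Illustrative sketch: flag weeks whose enrollment deviates sharply from the
# trailing-window average at a site, triggering an alert for investigation.

def rolling_zscore_alerts(counts, window=4, threshold=2.0):
    """Return indices of weeks deviating by more than `threshold` standard
    deviations from the mean of the preceding `window` weeks."""
    alerts = []
    for i in range(window, len(counts)):
        past = counts[i - window:i]
        mean = sum(past) / window
        std = (sum((c - mean) ** 2 for c in past) / window) ** 0.5
        if std == 0:
            continue  # flat history: no basis for a z-score
        z = (counts[i] - mean) / std
        if abs(z) > threshold:
            alerts.append(i)
    return alerts

# Weeks 0-7 of enrollment at one site; week 6 collapses to zero
weekly = [5, 6, 5, 7, 6, 5, 0, 6]
print(rolling_zscore_alerts(weekly))  # [6]
```

An alert like week 6 here is exactly the kind of deviation that would prompt timely investigation of slow recruitment or site issues.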

Leveraging Next Best Action (NBA) engine for mitigation plan execution

Having a well-defined and executed mitigation plan in place during trial conduct is essential to the success of the trial.

  • A mitigation plan facilitates trial continuity by providing contingency measures and alternative strategies. By having a plan in place to address unexpected events or challenges, sponsors can minimize disruptions and keep the trial on track. This can help prevent the financial burden of trial interruptions if the trial cannot proceed as planned.
  • Executing the mitigation plan during trial conduct can be challenging due to the complex trial environment, unforeseen circumstances, the need for timeliness and responsiveness, compliance and regulatory considerations, etc. Effectively addressing these challenges is crucial for the success of the trial and its mitigation efforts.

A Next Best Action (NBA) engine is an AI-powered system or algorithm that can recommend the most effective mitigation actions or interventions to optimize site performance in real-time.

  • The NBA engine utilizes AI algorithms to analyze real-time site performance data from various sources, identify patterns, predict future events or outcomes, and anticipate potential issues that require mitigation actions before they occur.
  • Given the specific circumstances of the trial, the engine employs optimization techniques to search for the best combination of actions that align with pre-defined key trial conduct metrics. It explores the impact of different scenarios, evaluates trade-offs, and determines the optimal actions to be taken.
  • The next best actions are then recommended to stakeholders such as sponsors, investigators, or site coordinators. Recommendations can be presented through an interactive dashboard to facilitate understanding and enable stakeholders to make informed decisions.
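The selection step can be illustrated with a deliberately simplified sketch: rank hypothetical mitigation actions by expected enrollment lift per unit cost under a budget. The actions, lift figures, and greedy heuristic are all invented for illustration; a real NBA engine would learn these values and use more sophisticated optimization:

```python
# Illustrative sketch of "next best action" ranking under a resource budget.

from typing import NamedTuple

class Action(NamedTuple):
    name: str
    expected_lift: float  # projected extra patients/month (assumed value)
    cost: float           # relative cost units (assumed value)

def next_best_actions(actions, budget):
    """Greedily pick actions with the best lift-per-cost ratio within budget."""
    chosen = []
    for a in sorted(actions, key=lambda a: a.expected_lift / a.cost, reverse=True):
        if a.cost <= budget:
            chosen.append(a.name)
            budget -= a.cost
    return chosen

candidates = [
    Action("add referral network", 3.0, 2.0),
    Action("extend site hours", 1.0, 1.0),
    Action("open satellite site", 4.0, 5.0),
]
print(next_best_actions(candidates, budget=3.0))
```

The recommended list is what would surface on a stakeholder dashboard, with the lift and cost estimates exposed so decision-makers can evaluate the trade-offs.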

Shattering the status quo


Clinical trials are the bread and butter of the pharmaceutical industry; however, trials often experience delays that can significantly extend the duration of a given study. Fortunately, there are straightforward answers to some trial management challenges: understand the process and people involved, adopt a long-term AI strategy while building AI capabilities within this use case, and invest in new machine learning models to enable enrollment forecasting, real-time site monitoring, and a data-driven recommendation engine. These steps can help not only to generate sizable savings but also to make biopharma companies more confident that their investments in artificial intelligence will deliver impact.

IBM Consulting and Pfizer are working together to revolutionize the pharmaceutical industry by reducing the time and cost associated with failed clinical trials so that medicines can reach patients in need faster and more efficiently.

Combining IBM’s technology, data strategy, and computing prowess with Pfizer’s extensive clinical experience, we have also established a collaboration to explore quantum computing in conjunction with classical machine learning to more accurately predict clinical trial sites at risk of recruitment failure. Quantum computing is a rapidly emerging and transformative technology that utilizes the principles of quantum mechanics to solve industry-critical problems too complex for classical computers.

Source: ibm.com

Thursday, 25 May 2023

IBM IT Automation: Reflections from IBM Think


We have had an amazing week with IBM clients, partners, and stakeholders at our annual Think Conference in Orlando, Florida. For IBM, Think is a perfect time for us to connect, collaborate and help our clients and partners continue to forge ahead with digital transformation and innovation.

As we wrap up Think 2023, we’re excited to recap a few of the exciting new capabilities we are launching in our IT Automation portfolio.

Accelerating IT operations with IBM’s AIOps platform


Managing IT environments in today’s modern business landscape is a complex and challenging task, requiring IT operations teams to move from being reactive to predictive and proactive. Traditional approaches are no longer sufficient to ensure continuous availability, functionality and performance.

To address this challenge, IBM offers an AIOps platform that utilizes AI to enhance decision-making across infrastructure and operations personas by contextualizing large volumes of operational data.

IBM Cloud Pak for AIOps is a comprehensive, self-hosted AIOps platform solution that provides central IT operations teams with a single pane of glass to view their managed IT environment. This platform utilizes intelligent automation and artificial intelligence (AI) to aggregate data from various sources, detect and correlate incidents, and quickly drive incidents to resolution. IBM’s AIOps platform is designed to help businesses by augmenting, accelerating and automating incident resolution, while helping to foster collaboration across teams.

Further contributing to the value of our existing AIOps platform—alongside the Cloud Pak for AIOps—IBM is excited to announce the addition of our cloud-native SaaS solution, IBM AIOps Insights, set for general availability at the end of June 2023. This AI-powered solution is designed to streamline incident management and increase uptime to help improve operational efficiency and reduce costs. With AIOps Insights, organizations can leverage AI and machine learning to automate end-to-end processes, transforming reactive IT operations into proactive and predictive operations that identify issues and anomalies before they occur.

AIOps Insights is a SaaS deployment option for incident management and remediation that provides a comprehensive, real-time view of an organization’s IT environment. Like IBM Cloud Pak for AIOps, it consolidates multiple tools and correlates resource status across silos, streamlining incident management, reducing incident costs and increasing availability. With AIOps Insights, organizations can connect and integrate cross-domain data—including infrastructure, network and APM—and visualize their entire IT environment with just a few clicks.

The platform dynamically generates a topology of an organization’s IT environment, adding new entities as they are detected. By correlating events and grouping them into incidents, AIOps Insights reduces noise and enables quick identification of incidents that may require further investigation. The platform also automatically suggests remediation actions and provides visibility into status updates as incidents are resolved.
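One correlation strategy of this kind, greatly simplified, groups events that touch the same resource within a short time window. The event data, resource names, and window below are invented; Cloud Pak for AIOps and AIOps Insights use richer topology- and ML-based correlation:

```python
# Illustrative sketch: correlate raw events into incidents by resource and
# arrival time, so operators see a handful of incidents instead of raw noise.

def group_into_incidents(events, window_seconds=300):
    """events: list of (timestamp, resource, message), sorted by timestamp.
    Groups events on the same resource arriving within the window."""
    incidents = []
    open_by_resource = {}  # resource -> index of its currently open incident
    for ts, resource, msg in events:
        idx = open_by_resource.get(resource)
        if idx is not None and ts - incidents[idx][-1][0] <= window_seconds:
            incidents[idx].append((ts, resource, msg))
        else:
            open_by_resource[resource] = len(incidents)
            incidents.append([(ts, resource, msg)])
    return incidents

events = [
    (0,   "db-01",  "high latency"),
    (60,  "db-01",  "connection errors"),
    (90,  "web-02", "5xx spike"),
    (900, "db-01",  "disk full"),  # outside the window: opens a new incident
]
print(len(group_into_incidents(events)))  # 3
```

Four raw events collapse to three incidents here; at production scale this kind of grouping is what turns thousands of alerts into a reviewable queue.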

IBM’s AIOps platform, including IBM Cloud Pak for AIOps and the new IBM AIOps Insights, empowers organizations to tap into the power of intelligent automation and streamline IT operations to help them achieve better business outcomes.

Better together with IBM Instana Observability and IBM Turbonomic


In today’s complex IT environments, lack of visibility is a major challenge. IT operations teams need full-stack health and performance monitoring, which traditional monitoring tools often lack.

IBM Instana Observability provides comprehensive visibility into modern applications, services and environments through a technology-agnostic approach that continuously provides high-fidelity data. It is engineered to identify issues before they impact business operations and automatically deploys monitoring in over 250 different applications, microservices and software infrastructure components across hybrid and multicloud environments. Instana is designed to deliver quick-time-to-value while keeping up with dynamic complexities.

In addition to comprehensive visibility, organizations need to detect performance issues and eliminate waste caused by over-provisioning. Enterprises need to shift from reactive to dynamic and continuous resource allocation for optimal, cost-effective utilization.

IBM Turbonomic is an AI- and automation-driven solution that dynamically allocates resources based on performance analysis and AI-driven recommendations. It supports application performance while helping clients address their compliance requirements, reduce cloud and infrastructure spending, and give more time back to engineering teams.
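The core rightsizing idea behind such resource-allocation analysis can be sketched with a simple utilization threshold. The thresholds and samples below are purely illustrative; Turbonomic’s actual analysis is market-based and far more sophisticated:

```python
# Illustrative sketch: recommend resizing when sustained CPU utilization is
# far from a healthy band, trading wasted spend against performance risk.

def rightsize(cpu_utilization, low=0.30, high=0.80):
    """cpu_utilization: recent samples in [0, 1]. Returns a recommendation."""
    avg = sum(cpu_utilization) / len(cpu_utilization)
    if avg < low:
        return "scale down"   # over-provisioned: wasted spend
    if avg > high:
        return "scale up"     # under-provisioned: performance risk
    return "no change"

print(rightsize([0.10, 0.15, 0.12, 0.08]))  # a chronically idle VM
```

Feeding such decisions with the high-fidelity monitoring data an observability tool collects is precisely why the two solutions are more valuable together.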

Ultimately, IBM Instana delivers real-time observability, while Turbonomic helps support application performance. While each of these solutions is incredibly powerful alone, they are even more valuable together.

That is why IBM is very excited to introduce the integration of IBM Instana Observability and IBM Turbonomic, available June 2023. Together, the integrated solutions offer comprehensive visibility and control of IT environments, enabling organizations to leverage real-time monitoring data for dynamic resource allocation decisions and AI-driven recommendations for optimal performance and cost reduction. The combination empowers organizations to optimize performance, address costs and streamline IT operations by tapping into the power of intelligent automation solutions to deliver business results.

Source: ibm.com

Saturday, 11 March 2023

Exploring generative AI to maximize experiences, decision-making and business value


In the first part of this three-part series, we described generative AI and how it works.

IBM Consulting sees tangible business value in augmenting existing enterprise AI deployments with generative AI to improve performance and accelerate time to value. There are four categories of dramatically enhanced capabilities these models deliver: 

◉ Summarization as seen in examples like call center interactions, documents such as financial reports, analyst articles, emails, news and media trends. 
◉ Semantic search as seen in examples like reviews, knowledge base and product descriptions. 
◉ Content creation as seen in examples like technical documentation, user stories, test cases, data, generating images, personalized UI, personas and marketing copy.
◉ Code creation as seen in examples like code co-pilots, pipelines, Dockerfiles, Terraform scripts, converting user stories to Gherkin format, diagrams as code, architectural artifacts, threat models and code for applications.

With these improvements, it’s easy to see how every industry can re-imagine their core processes with generative AI.

Leading use cases do more than simply cut costs; they contribute to employee satisfaction, customer trust and business growth. And these aren’t merely forward-looking possibilities: companies are using generative AI today to realize rapid business value, including improved accuracy and near real-time insight into customer complaints to reduce time-to-insight discovery, reduced time for internal audits to maintain regulatory compliance, and efficiency gains in testing and classification.

While these early cases and the results they’ve delivered are exciting, generative AI solutions must be developed carefully, with critical attention paid to the potential risks involved, including:

◉ Bias: As with any AI model, the training data has an impact on the results the model produces. Foundation Models are trained on large portions of data crawled from the internet. Consequently, the biases that inherently exist in internet data are picked up by the trained models and can show up in the results the models produce. While there are ways to mitigate this effect, enterprises need to have governance mechanisms in place to understand and address this risk. 

◉ Opacity: Foundation models are also not fully auditable or transparent because of the “self-supervised” nature of the algorithm’s training.

◉ Hallucination: LLMs can produce “hallucinations,” results that satisfy a prompt syntactically but are factually incorrect. Again, enterprises need to have strong governance mechanisms in place to mitigate this risk.

◉ Intellectual property: There are unanswered questions concerning the legal implications of, and who may own the rights to, content generated by models trained on potentially copyrighted material.

◉ Security: These models are susceptible to data and security risks, including prompt injection attacks.

When engaging in generative AI projects, business leaders must ensure that they put in place strong AI ethics and governance mechanisms to mitigate the risks involved. Leveraging the IBM Garage methodology, IBM can help business leaders evaluate each generative AI initiative by how risky it is and how precise its output needs to be. In the first wave, clients can prioritize internal, employee-facing use cases where the output is reviewed by humans and does not require a high degree of precision.
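A lightweight example of the kind of governance check such mechanisms might include is screening generated sentences for support in the source material. Everything below is an illustrative sketch with made-up data; production governance relies on much stronger methods such as retrieval grounding, entailment models, and human review:

```python
# Illustrative sketch: flag summary sentences with little word overlap against
# the source text they were supposed to summarize, as a crude hallucination screen.

def overlap_score(sentence, source_text):
    """Fraction of the sentence's content words that appear in the source."""
    stop = {"the", "a", "an", "of", "to", "in", "is", "are", "and"}
    words = [w for w in sentence.lower().split() if w not in stop]
    source = set(source_text.lower().split())
    if not words:
        return 1.0
    return sum(w in source for w in words) / len(words)

def flag_unsupported(sentences, source_text, threshold=0.5):
    """Return sentences whose support falls below the threshold."""
    return [s for s in sentences if overlap_score(s, source_text) < threshold]

source = "revenue grew 8 percent in q3 driven by cloud and consulting"
summary = [
    "revenue grew 8 percent in q3",
    "growth was driven by cloud and consulting",
    "the company announced a new smartphone",  # unsupported claim
]
print(flag_unsupported(summary, source))
```

Flagged sentences would be routed to a human reviewer, consistent with prioritizing use cases where output is reviewed by humans.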

Generative AI and LLMs introduce new hazards into the field of AI, and we do not claim to have all the answers to the questions that these new solutions introduce. IBM Consulting is committed to applying measured introspection during engagements with enterprises, governments and society at large and to ensuring a diverse representation of perspectives as we find answers to those questions. 

Source: ibm.com

Saturday, 4 March 2023

How AI and automation are helping America’s Internal Revenue Service better serve we the people


The delivery of public services by the government continues to evolve as citizens increasingly look for more personalized and seamless experiences. The last three years have acted as a tailwind, further pushing demand and forcing governments to rethink their service delivery models. The Internal Revenue Service’s (IRS) tax return processing function, the quintessential citizen service that touches every American household, is a powerful example of the innovation that can come from viewing public services through the lens of citizens.


In 2020, the IRS faced significant paper tax return processing backlogs. Tractor trailers full of paper sat at processing centers waiting to be opened, scanned, sorted, properly inventoried and manually processed onsite. Tax return processing delays can push back refunds for taxpayers by six months or longer. And although the increase in electronic filing has greatly reduced the number of paper returns the IRS receives in an ordinary filing season, paper tax return volume is still significant. In alignment with the agency’s strategic goals to make the IRS more accessible, efficient and effective by continually innovating operations, adopting industry-leading technology and increasing the efficiency and currency of technology investments, the IRS engaged IBM to digitize its tax year 2020 and 2021 paper tax return intake process to enable remote scanning, validation and processing.

The project went well beyond paper scanning and the automated extraction of data from paper tax returns. Enabled by artificial intelligence and a digital extraction tool, with IBM’s automated data validation process, the IRS’s Modernized e-File (MeF) system ultimately accepted 76% of paper tax returns processed without human intervention. In the end, working with the IRS, the team processed nearly 140,000 paper tax returns at a significantly higher rate of quality relative to human transcription — providing the IRS with the simplicity and efficiency needed to tackle its backlog challenge. Technologies used for the project also laid the foundation for the possibility of future anomaly and fraud detection during the tax return intake process — even for paper tax returns.
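The automated validation step can be illustrated with a simple, hypothetical rule of the kind such pipelines apply: checking that extracted line items are internally consistent before a return flows through without human review. The field names and tolerance below are invented for illustration and are not the actual IRS schema:

```python
# Illustrative sketch: cross-field validation of one extracted paper return.
# Returns that pass all rules could proceed without human transcription review.

def validate_return(extracted, tolerance=0.01):
    """Return a list of rule violations for one extracted return (dict of floats)."""
    errors = []
    expected_total = extracted["wages"] + extracted["interest"] + extracted["other_income"]
    if abs(expected_total - extracted["total_income"]) > tolerance:
        errors.append("total_income does not match sum of income lines")
    if extracted["withholding"] < 0:
        errors.append("withholding cannot be negative")
    return errors

good = {"wages": 50000.0, "interest": 120.0, "other_income": 0.0,
        "total_income": 50120.0, "withholding": 4200.0}
bad = dict(good, total_income=51000.0)
print(validate_return(good), validate_return(bad))
```

Only returns with an empty violation list would flow straight through; the rest would be queued for human intervention.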

The IRS is at a self-defined inflection point. IBM’s IT modernization work with the IRS is a demonstration of the agency’s commitment to the delivery of more seamless citizen services. However, as with all digital transformations, there’s still work to be done. Backlogs of tax returns may unfortunately continue into the 2023 filing season, and IBM is prepared to continue assisting the IRS in this critical work for the American people.

In today’s rapidly changing, technology-enabled world, where citizens have become used to services being only a click away, government agencies are under increasing pressure to keep pace. As we learned during this work with the IRS, digital transformation is a marathon, not a sprint. It is steered by continuous technology innovation that presents new, more effective ways to conduct business and deliver services. Complex challenges and growing citizen expectations make it imperative that agencies maintain the accelerated pace of digital innovation to deliver value today and tomorrow.

Source: ibm.com

Tuesday, 21 February 2023

Six innovation strategies for Life Science organizations in 2023


Last year, as life sciences organizations were consumed by the recovery from COVID-19, their focus had to shift rapidly to mitigating supply chain constraints, labor and skill shortages, and by the end of the year, inflationary pressures—all of which were exacerbated by the Russia-Ukraine war.

Alongside these challenges, the ongoing push to reduce costs, improve efficiency and productivity, support better decision-making and reduce risk will continue to drive pharma investment in cloud, AI/ML, analytics and automation in 2023, despite higher interest rates.

Organizations must rethink their business models to serve a variety of strategic goals, and AI will play an increasingly important role in all of them: supporting drug discovery, trial diversity, forecasting and supply chain functions, and supporting engagement and adherence to decentralized trials and on-market regimens. These shifts will set the stage for further industry transformation as gene therapy and precision health become more widely available.

The macro-environment favors steady long-term focus and growth


The Inflation Reduction Act of 2022 puts pressure on companies to reduce drug and device prices in the USA, causing pharma to leverage technology to drive cost efficiencies and maintain margins. Globally, the shifting policy debate around access and affordability of patented pharmaceuticals exerts additional pressures. An inflationary environment will slow down traditional R&D and leave pharma no choice but to investigate AI/ML and related techniques to accelerate drug discovery and repurposing while reducing costs. This will also stimulate new partnerships (for example, pharma companies working with research labs or providers working with payers) through federated learning and cloud-based digital ecosystems.

Manufacturing costs and costs of clinical trials will continue to rise. The cost of active pharmaceutical ingredients has increased by up to 70% since 2018. Recruiting on-site patients and maintaining on-site trials remains prohibitively expensive. Pharma will focus on digital patient recruitment through social advertising and digital engagement (including wearables) throughout trials to manage adherence and persistence. Decentralized trial structures will push pharma to focus more closely on cybersecurity and protected health information (PHI) while managing the associated costs.

The costs of operating manufacturing facilities will increase as energy prices continue to rise. We predict companies will make significant strides towards digitization to reduce cost, improve quality, reduce recalls and improve safety. Pre-digital facilities with manual processes supported by expensive labor will no longer be the norm.

Strategy 1: Prioritize around novel drug development, generics or consumer engagement


Strategic reprioritization should be top of mind for the C-suite. Last year, Pfizer and GSK left the consumer health sector to prioritize novel drug and vaccine development and core innovation. We saw targeted mergers and acquisitions to replenish pipelines, which will continue in 2023. Novartis spun off Sandoz, their generics business, and is streamlining their research efforts around innovative pharmaceuticals. Sanofi and the generics giant Sun Pharmaceuticals are repositioning to enter specialty pharma with the latter releasing a new highly competitive biologic for psoriasis.

Companies are recognizing that divergent business models are required: innovative pharma requires significant capital investments to support state-of-the-art research enabled by new technology, while generic and consumer health business models demand unparalleled scale, access to distribution and close partnerships with pharmacies. A rising risk-return ratio for innovators will lead companies to form research and technology partnerships that combine top talent with the most innovative computation techniques, rather than outright mergers or acquisitions.

Strategy 2: Use AI-driven forecasting and supply chains to improve operational efficiency and sustainability


The upheaval and disruption caused by COVID-19 have given pharma leaders a heightened awareness of resiliency in delivering innovative drugs and therapeutics to communities, underscoring the importance of investing in forecasting and supply chains.

Forecasting

Forecasting continues to be a pain point for many organizations. According to a recent IBM IBV study, 57% of CEOs view the CFO as playing the most crucial role in their organizations over the next two to three years. Legacy processes, demand volatility and increasing data scale and complexity demand a new approach. Traditional quarterly forecasting cycles (which are manual and burdensome) yield inaccurate predictions.

Leading companies will invest in AI, ML and intelligent workflows to deliver end-to-end forecasting capabilities that utilize real-time feeds from multiple data sources, leveraging hundreds of AI and ML models, to deliver more granular and accurate forecasts and customer insights. These capabilities will fundamentally change the role of finance organizations by emphasizing speed of insight, adoption of data-driven decision making and scaling of analytics within the enterprise.
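As a toy illustration of this pattern (not a reference to any specific vendor product), several simple forecasters can be combined and weighted by their recent accuracy, which is one reason model-driven forecasts tend to beat a single manual estimate. All model choices and demand figures below are hypothetical:

```python
# Minimal sketch: an ensemble of simple forecasters weighted by recent accuracy.
# All data and model choices are illustrative, not a product reference.

def moving_average_forecast(history, window=3):
    """Forecast the next value as the mean of the last `window` observations."""
    return sum(history[-window:]) / window

def naive_forecast(history):
    """Forecast the next value as the last observation."""
    return history[-1]

def trend_forecast(history):
    """Forecast by extrapolating the most recent step change."""
    return history[-1] + (history[-1] - history[-2])

def ensemble_forecast(history, models):
    """Weight each model by inverse error on the most recent known point."""
    weights = []
    for model in models:
        backtest = model(history[:-1])            # predict the last known value
        error = abs(backtest - history[-1]) + 1e-9
        weights.append(1.0 / error)               # more accurate -> more weight
    total = sum(weights)
    preds = [model(history) for model in models]
    return sum(w / total * p for w, p in zip(weights, preds))

demand = [100, 104, 108, 111, 115, 118]  # hypothetical monthly demand
models = [moving_average_forecast, naive_forecast, trend_forecast]
print(round(ensemble_forecast(demand, models), 1))
```

Real platforms scale this idea to hundreds of models and real-time data feeds, but the core mechanic of backtesting and reweighting is the same.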

Supply chain

Business leaders will focus on supply chain solutions that drive transparency across sourcing, manufacturing, delivery and logistics while minimizing cost, waste and time. CSCOs are modernizing supply chain operations by using AI to leverage unstructured data in real time and integrating automation, blockchain and edge computing to manage operations and collect and connect information across multiple sources.

Priorities driving supply chain innovation, according to CSCOs

In the wake of COVID-19, we observe leaders viewing the supply chain as a core organizational function rather than a supportive one. David Volk, executive director of clinical supply chain planning at Roche states, “We are a networked organization… collaborating much more broadly across all our partners and the industry. We view ourselves as a supply chain organization, and a significant part of the value we bring to patients lies in optimizing our global supply chain and inventory. That’s a very different mindset, and it’s changed how we run the organization.”

Supply chain sustainability also ranks among the highest priorities for CEOs. 48% of CEOs surveyed say increasing sustainability is a top priority, up 37% since 2021, while 44% cite a lack of data-driven insights as a barrier to achieving sustainability objectives. End-to-end visibility into sustainability impact, such as metrics on emissions and waste from raw material to delivery, will unlock a new level of information that positions CSCOs as key enablers of their companies' sustainability and ESG vision.

Strategy 3: Prepare for an influx of cell and gene therapies


Gene therapy is the new frontier of medicine. It focuses on targeting a person’s genes for modification to treat or cure disease, including cancer, genetic diseases and infectious diseases. The US Food & Drug Administration (FDA) approved the first gene therapy in the United States in 2017. Since then, more than 20 cell and gene therapy products have been approved.

According to the Alliance for Regenerative Medicine, we could see five more gene therapies for rare diseases introduced to the U.S. market in 2023, including new treatments for sickle cell disease, Duchenne muscular dystrophy and hemophilia.

These therapies will challenge life sciences organizations to rethink their business models. How will they efficiently determine which patients are eligible for these therapies? How will they obtain the patient’s blood as part of the therapy? How will they contract with payers for reimbursement, given these therapies can cost upwards of $3M per treatment? How will they track outcomes from treatment for outcome-based agreements? These questions and many more spanning payment models, consumer experience, supply chain and manufacturing will need to be addressed.

A key driver in the growth of gene therapies and adoption of precision health is the growth and accessibility of next-generation DNA sequencing (NGS). NGS will become more mainstream, moving the science out of the lab to deliver improved patient care and outcomes at scale. NGS delivers ultra-high throughput, scalability and speed and has revolutionized the industry by enabling a wide variety of analyses across multiple applications at a level never before possible. This includes delivering whole-genome sequencing at accessible and practical price points for researchers, scientists, doctors and patients. An example is the new Illumina NovaSeq X sequencer released in September 2022, which is twice as fast as prior models and capable of sequencing 20,000 genomes per year at a cost of $200 per genome. As the price of sequencing genomes declines, the ability to support personalized healthcare and gene therapy at scale will continue to grow.

Strategy 4: Accelerate development and delivery of lifesaving therapies through decentralized clinical trials


Limitations of traditional clinical trials were amplified during the COVID-19 pandemic and have accelerated the use of decentralized clinical trials (DCTs). There is a clear need to improve study formats so broader, more equitable populations are accessed and included. New technologies will help integrate patient data points and derive holistic insights like never before. Life sciences organizations will increase their use of DCTs to run global studies and bring new therapies to market. We expect a record number of decentralized trials in 2023.

Key benefits of DCTs include:

◉ Faster recruitment. Participants can be identified and engaged without the need to travel and be evaluated in person.
◉ Improved retention. Participants are less likely to drop out of a trial due to the typical in-person requirements.
◉ Greater control, convenience and comfort. Participants are more comfortable engaging at home and at local patient care sites.
◉ Increased diversity. Legacy trials lacked participant diversity, which contributed to gaps in the understanding of diseases.

As DCTs are more broadly adopted, designing trials around the patient experience will be critical to ensuring clear, transparent engagement and willing and active participation. Methodologies such as Enterprise Design Thinking can provide a useful framework. Likewise, integrating patient data from multiple sources such as electronic health and medical records, electronic data capture platforms, clinical data management systems, wearables and other digital technologies will require a more open approach to information sharing.

Quantum computing may enable more advanced DCT capabilities for recruitment, trial site selection and optimization, and patient surveillance. Quantum-based algorithms may eventually outperform classical algorithms, enabling better analysis of integrated patient data at scale.

In the coming years, decentralized trials will become the norm, improving the ability to recruit, select and deliver clinical trials at scale, ensuring full and diverse populations are represented and lifesaving treatments are more quickly approved and launched.

Strategy 5: Explore AI-driven drug discovery


AI-driven drug discovery continues to gain momentum and achieve critical milestones. The first AI-designed drug candidate to enter clinical trials was reported by Exscientia in early 2020. Since then, companies such as Insilico Medicine, Evotec and Schrödinger have announced phase I trials. Several candidates have had their clinical development accelerated through AI-enabled solutions. Within drug companies focused on AI-based discovery, there is publicly available information on about 160 discovery programs, of which 15 products are reportedly in clinical development.

Some executives may think AI can be delivered through a "tool in the public cloud" or by a single team. From our experience working with life sciences companies, this is not the case. Achieving full value from AI requires transformation of the discovery process, spanning new technology, new talent and new behaviors throughout the R&D organization.

The AI-driven discovery process delivers value across four dimensions: finding the right biological target, designing a small molecule as a preclinical candidate, improving success rates and delivering overall speed and efficiency.

Search for new biological targets

We see the research community and industry scientists pursuing integration of multiomics and clinical data with machine learning to achieve drug repositioning. Leveraging experimental data and literature analysis, it is possible to uncover new disease pathways and polypharmacological and protein interactions. Application of AI to imaging (and other diagnostic techniques that rigorously analyze phenotypic outputs) may offer opportunities to identify new biological targets. Some of our clients look to understand protein interactions, function and motion using traditional computation techniques as well as quantum computing.

Use new techniques to search for new molecules

Using a deep search technique, it is possible today to mine the research literature and published experimental data to predict new small molecule structures and how molecules will behave. This and other techniques can be used to predict pharmacokinetic and pharmacodynamic properties and help identify off-target effects.

Explore the promise of quantum computing

Since 2020, there have been numerous quantum-related activities and experiments in the field of life sciences, spanning genomics, clinical research and discovery, diagnostics, treatments and interventions. Quantum-driven machine learning, trained on diverse clinical and real-world data sets, has been applied to molecular entity generation, diagnostics, forecasting effectiveness and tailoring radiotherapy.

Strategy 6: Use digital engagement to increase sales efficiency, patient loyalty and adherence


For healthcare providers

Conventional face-to-face visits to healthcare providers (HCPs) have reached the limit of effectiveness. HCPs now expect personalized approaches and instant access to knowledge. Increased scrutiny by public authorities, along with COVID-19, disrupted a traditional approach where sales reps had HCP offices and hospitals as their second home. A virtual engagement model emerged that is less effective in its current form.

At the same time, industry sees the value of an omnichannel HCP engagement strategy: our analysis shows 5-10% higher satisfaction with a new HCP experience, 15-25% more effective marketing spend, 5-7% boost in active prescribers and up to 15% lift in recurring revenue depending on the indication.

Pharma companies have enough data on certain products to enable a personalized experience for HCPs. An analytics and AI-driven approach to engagement with clinicians provides the highest impact as it improves both their speed-to-decision and their awareness of the latest clinical evidence. Well-defined technology and data strategies, along with change management and talent identification programs, are key to success.

For patients

Adherence and persistence are major challenges in an industry that caters to chronic patients. Additionally, with new reimbursement models, payers incentivize "complete" cases that achieve prolonged remission or, for acute patients, functional recovery. For companies to keep selling medications and getting paid for them, patients need to take them continuously. For many indications, patients have many pharmaceutical options. Successful companies will differentiate themselves in the market by offering digital support for their pharmaceuticals, engaging patients in their care on their smartphones through gamification and incentive programs.

How technology can advance new biologics for treatment of plaque psoriasis

Bills and regulations will increase the adoption and application of AI


AI underpins the trends mentioned above. While AI technology has been around for decades, its adoption in life sciences has accelerated over the last several years, impacting drug development, clinical trials and supply chains. AI is infused into many of our daily interactions: calling an airline to rebook tickets, asking Alexa to play music and turn on the lights, receiving approval for a loan, or getting automated treatment recommendations based on clinical history and the latest treatment guidelines.

As AI continues to permeate our lives, oversight will be front and center. Both the United States and the EU consider regulation essential to the development of AI tools that consumers can trust. Life sciences companies must understand the impact AI regulations have on their business models and play a proactive role in influencing these policies in the interest of better patient outcomes.

As an example, IBM's Policy Lab takes a proactive approach to providing policymakers with a vision and actionable recommendations to harness the benefits of innovation while ensuring trust in a world being reshaped by data. IBM works with organizations and policymakers to share its perspective and support responsible innovation. One such policy was the Biden-Harris administration's Blueprint for an AI Bill of Rights, released in October 2022. As stated in the Blueprint, "AI systems have the potential to bring incredible societal benefits, but only if we do the hard work of ensuring AI products and services are safe and secure, accurate, transparent, free of harmful bias and otherwise trustworthy." It lays out five commonsense protections to which everyone in America should be entitled in the design, development and deployment of AI and other automated technologies:

◉ Right to safe and effective systems. You should be protected from unsafe or ineffective systems.
◉ Algorithmic discrimination protections. You should not face discrimination by algorithms, and systems should be used and designed in an equitable way.
◉ Data privacy. You should be protected from abusive data practices via built-in protections and have agency over how your data is used.
◉ Notice and explanation. You should know that an automated system is being used and understand how and why it contributes to outcomes that impact you.
◉ Human alternatives, consideration and fallback. You should be able to opt out where appropriate and have access to a person who can quickly consider and remedy problems you encounter.

The application of AI is not slowing down, nor is scrutiny of it. Life sciences organizations will differentiate themselves by having a seat at the table, seeking opportunities to influence AI health policy and delivering ethical and responsible AI-powered solutions that augment their existing product portfolios and improve patient and provider experiences and healthcare outcomes at reduced cost.

Embrace new technologies to offer major advances


Life sciences companies, particularly in pharma and biotech, can prove resilient despite inflationary pressures. They must focus on business model specialization across innovation and invention, generics business and consumer health. Strong demand can help companies overcome business challenges and position the industry for steady innovation-led growth. It is crucial to embrace new technologies, particularly state-of-the-art computing and AI, to offer major advances that may represent a paradigm shift in drug discovery, clinical trial site optimization, and, ultimately, engagement with a person receiving care. Acting boldly in 2023 with a clearly articulated strategy and prioritization will set both mature life sciences organizations and new players on the right path. Companies that focus on strategy and innovation will be the biggest winners.

Source: ibm.com

Thursday, 16 February 2023

A step-by-step guide to setting up a data governance program


In our last blog, we delved into the seven most prevalent data challenges that can be addressed with effective data governance. Today we will share our approach to developing a data governance program to drive data transformation and fuel a data-driven culture.

Data governance is a crucial aspect of managing an organization’s data assets. The primary goal of any data governance program is to deliver against prioritized business objectives and unlock the value of your data across your organization.

Realize that a data governance program cannot exist on its own – it must solve business problems and deliver outcomes. Start by identifying business objectives, desired outcomes, key stakeholders, and the data needed to deliver these objectives. Technology and data architecture play a crucial role in enabling data governance and achieving these objectives.

Don’t try to do everything at once! Focus and prioritize what you’re delivering to the business, determine what you need, deliver and measure results, refine, expand, and deliver against the next priority objectives. A well-executed data governance program ensures that data is accurate, complete, consistent, and accessible to those who need it, while protecting data from unauthorized access or misuse.


Consider the following four key building blocks of data governance:

◉ People refers to the organizational structure, roles, and responsibilities of those involved in data governance, including those who own, collect, store, manage, and use data.

◉ Policies provide the guidelines for using, protecting, and managing data, ensuring consistency and compliance.

◉ Process refers to the procedures for communication, collaboration and managing data, including data collection, storage, protection, and usage.

◉ Technology refers to the tools and systems used to support data governance, such as data management platforms and security solutions.


For example, if the goal is to improve customer retention, the data governance program should focus on where customer data is produced and consumed across the organization, ensuring that the organization’s customer data is accurate, complete, protected, and accessible to those who need it to make decisions that will improve customer retention.
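Checks like these are usually automated in a data quality tool; a minimal sketch of the idea in plain code, with hypothetical field names and rules, might look like this:

```python
# Minimal sketch of automated data-quality checks on customer records.
# Field names, the email rule and the pass-rate metric are all hypothetical.
import re

REQUIRED_FIELDS = ["customer_id", "email", "country"]
EMAIL_RE = re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$")  # deliberately simple check

def profile(records):
    """Return completeness and validity metrics for a batch of records."""
    issues = []
    for i, rec in enumerate(records):
        missing = [f for f in REQUIRED_FIELDS if not rec.get(f)]
        if missing:
            issues.append((i, "missing: " + ", ".join(missing)))
        elif not EMAIL_RE.match(rec["email"]):
            issues.append((i, "invalid email"))
    pass_rate = 1 - len(issues) / len(records)
    return {"records": len(records), "issues": issues, "pass_rate": pass_rate}

customers = [
    {"customer_id": "C1", "email": "ana@example.com", "country": "ES"},
    {"customer_id": "C2", "email": "not-an-email", "country": "US"},
    {"customer_id": "C3", "email": "li@example.com", "country": ""},
]
report = profile(customers)
print(report["pass_rate"])  # 1 of 3 records passes both checks
```

The point is that "accurate and complete" becomes measurable: a pass rate can be tracked over time and tied back to the retention objective.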

It’s important to coordinate and standardize policies, roles, and data management processes to align them with the business objectives. This will ensure that data is being used effectively and that all stakeholders are working towards the same goal.

Starting a data governance program may seem like a daunting task, but by starting small and focusing on delivering prioritized business outcomes, data governance can become a natural extension of your day-to-day business.

Building a data governance program is an iterative and incremental process


Step 1: Define your data strategy and data governance goals and objectives

What are the business objectives and desired results for your organization? You should consider both long-term strategic goals and short-term tactical goals and remember that goals may be influenced by external factors such as regulations and compliance.


A data strategy identifies, prioritizes, and aligns business objectives across your organization and its various lines of business. Across multiple business objectives, a data strategy will identify data needs, measures and KPIs, stakeholders, and required data management processes, technology priorities and capabilities.

It is important to regularly review and update your data strategy as your business and priorities change. If you don’t have a data strategy, you should build one – it doesn’t take a long time, but you do need the right stakeholders to contribute.

Once you have a clear understanding of business objectives and data needs, set data governance goals and priorities. For example, an effective data governance program may:

◉ Improve data quality, which can lead to more accurate and reliable decision making
◉ Increase data security to protect sensitive information
◉ Enable compliance and reporting against industry regulations
◉ Improve overall trust and reliability of your data assets
◉ Make data more accessible and usable, which can improve efficiency and productivity.

Clearly defining your goals and objectives will guide the prioritization and development of your data governance program, ultimately driving revenue, cost savings, and customer satisfaction.

Step 2: Secure executive support and essential stakeholders

Identify key stakeholders and roles for the data governance program and who will need to be involved in its execution. This should include employees, managers, IT staff, data architects, line-of-business owners, and data custodians within and outside your organization.

An executive sponsor is crucial – an individual who understands the significance and objectives of data governance, recognizes the business value that data governance enables, and who supports the investment required to achieve these outcomes.

With key sponsorship in place, assemble the team to understand the compelling narrative, define what needs to be accomplished, how to raise awareness, and how to build the funding model that will be used to support the implementation of the data governance program.

The following is an example of typical stakeholder levels that may participate in a data governance program:

Typical stakeholder levels in a data governance program

By effectively engaging key stakeholders, identifying and delivering clear business value, the implementation of a data governance program can become a strategic advantage for your organization.

Step 3: Assess, build & refine your data governance program

With your business objectives understood and your data governance sponsors and stakeholders in place, it's important to map those objectives against your existing people, process and technology capabilities and identify what is needed to achieve them.


Data management frameworks such as the EDM Council’s DCAM and CDMC offer a structured way to assess your data maturity against industry benchmarks with a common language and set of data best practices.

Look at how data is currently being governed and managed within your organization. What are the strengths and weaknesses of your current approach? What is needed to deliver key business objectives?

Remember, you don’t have to (nor should you) do everything at once. Identify areas for improvement, in context of business objectives, to prioritize your efforts and focus on the most important areas to deliver results to the business in a meaningful way. An effective and efficient data governance program will support your organization’s growth and competitive advantage.
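One lightweight way to turn such an assessment into priorities is to score current versus target maturity per capability area and rank the gaps. The dimensions, weights and scores below are purely illustrative, not DCAM's or CDMC's actual model:

```python
# Hypothetical maturity scoring across governance capability areas,
# loosely inspired by frameworks such as DCAM. Scores 1-5 are illustrative.

current = {                     # self-assessed maturity, 1 (ad hoc) to 5 (optimized)
    "data strategy": 3,
    "data quality": 2,
    "metadata management": 1,
    "data protection": 4,
}
target = {k: 4 for k in current}  # target maturity tied to business objectives

# Keep only areas below target, then rank by size of the gap.
gaps = {k: target[k] - v for k, v in current.items() if target[k] > v}
priorities = sorted(gaps, key=gaps.get, reverse=True)
print(priorities)  # largest gaps first
```

Ranking by gap keeps the "don't do everything at once" principle concrete: effort goes first to the areas furthest from what the business objectives require.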

Step 4: Document your organization’s data policies

Data policies are a set of documented guidelines for how an organization’s data assets are consistently governed, managed, protected and used. Data policies are driven by your organization’s data strategy, align against business objectives and desired outcomes, and may be influenced by internal and external regulatory factors. Data policies may include topics such as data collection, storage, and usage, data quality and security:

Example data policy topics

Data policies ensure that your data is being used in a way that supports the overall goals of your organization and complies with relevant laws and regulations. This can lead to improved data quality, better decision making, and increased trust in the organization’s data assets, ultimately leading to a more successful and sustainable organization. 
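Policies are easier to enforce consistently when captured in machine-readable form that tools can act on. The following schema is a hypothetical sketch, not any particular governance catalog's format:

```python
# A hypothetical machine-readable policy record; real programs typically
# manage these in a governance catalog rather than in ad hoc code.
from dataclasses import dataclass, field

@dataclass
class DataPolicy:
    name: str
    scope: str                          # which data assets the policy covers
    rules: list = field(default_factory=list)
    owner: str = "unassigned"           # accountability should never stay here

retention = DataPolicy(
    name="customer-data-retention",
    scope="customer_records",
    rules=[
        "delete 7 years after account closure",
        "mask PII in non-production environments",
    ],
    owner="data governance office",
)
print(len(retention.rules))
```

Even this small amount of structure makes policies queryable: you can list every rule that applies to a data asset, or flag any policy still lacking an owner.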

Step 5: Establish roles and responsibilities

Define clear roles and responsibilities of those involved in data governance, including those responsible for collecting, storing, and using data. This will help ensure that everyone understands their role and can effectively contribute to the data governance effort.


The structure of data governance can vary depending on the organization. In a large enterprise, data governance may have a dedicated team overseeing it (as in the table above), while in a small business, data governance may be part of existing roles and responsibilities. A hybrid approach may also be suitable for some organizations. It is crucial to consider company culture and to develop a data governance framework that promotes data-driven practices. The key to success is to start small, learn and adapt, while focusing on delivering and measuring business outcomes.

Having a clear understanding of the roles and responsibilities of data governance participants can ensure that they have the necessary skills and knowledge to perform their duties.

Step 6: Develop and refine data processes

Data governance processes ensure effective decision making and enable consistent data management practices by coordinating teams across (and outside of) your organization. Additionally, data governance processes can also ensure compliance with regulatory standards and protect sensitive data.

Data processes provide formal channels for direction, escalation, and resolution. Data governance processes should be lightweight to achieve your business goals without adding unnecessary burden or hindering innovation.

Processes may be automated through tools, workflow, and technology.

It is important to establish these processes early to prevent issues or confusion that may arise later in the data management implementation.
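As a minimal sketch of such a lightweight, automatable process, an escalation path can be expressed as a simple severity-to-role mapping. The roles and severity levels here are hypothetical; real teams would wire this into their workflow tooling:

```python
# Hypothetical escalation workflow for a data issue:
# steward -> owner -> governance council. Roles and levels are illustrative.

ESCALATION_PATH = ["data steward", "data owner", "governance council"]

def route_issue(severity):
    """Map issue severity (1 = low .. 3 = critical) to the responsible role.

    Out-of-range severities are clamped so every issue gets an owner.
    """
    level = min(max(severity, 1), len(ESCALATION_PATH)) - 1
    return ESCALATION_PATH[level]

print(route_issue(1))  # data steward
print(route_issue(3))  # governance council
```

Keeping the routing rule this explicit is what makes the process "lightweight": everyone can see who resolves what, and the rule can change without renegotiating the whole program.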

Step 7: Implement, evaluate, and adapt your strategy

Once you have defined the components of your data governance program, it’s time to put them in action. This could include implementing new technologies or processes or making changes to existing ones.


It is important to remember that data governance programs can only be successful if they demonstrate value to the business, so you need to measure and report on the delivery of the prioritized business outcomes. Regularly monitoring and reviewing your strategy will ensure that it is meeting your goals and business objectives.

Continuously evaluate your goals and objectives and adjust as needed. This will allow your data governance program to evolve and adapt to the changing needs of the organization and the industry. An approach of continuous improvement will enable your data governance program to stay relevant and deliver maximum value to the organization.

Get started on your data governance program


In conclusion, by following an incremental, structured approach and engaging key stakeholders, you can build a data governance program that aligns with the unique needs of your organization and supports the delivery of accelerated business outcomes.

Implementing a data governance program can present unique challenges such as limited resources, resistance to change and a lack of understanding of the value of data governance. These challenges can be overcome by effectively communicating the value and benefits of the program to all stakeholders, providing training and support to those responsible for implementation, and involving key decision-makers in the planning process.

By implementing a data governance program that delivers key business outcomes, you can ensure the success of your program and drive measurable business value from your organization's data assets while effectively managing your data, improving data quality, and maintaining data integrity throughout its lifecycle.

Source: ibm.com