Companies struggle to enable data across business units
According to NNIT’s Expectation Barometer 2023 survey, only about half of all participating life sciences organizations have a corporate data strategy, and their ability to enable data within individual business units is significantly greater than their ability to do so across the organization. That is problematic, as Ricco Larsen cautions:
– Our survey shows that lack of cross-unit collaboration is the biggest barrier to leveraging data in business processes. But the real business potential in terms of increased efficiency, shorter time to market and greater resilience is only unlocked if your data streams are consistent, harmonized and have a high level of integrity. So, a lot of organizations must find a way to tackle this issue or, at worst, risk losing their license to operate.
The pressure is on top management, because only they have the mandate to open the silos and create transparency across the big picture.
– In many cases, the lack of a clear business case and an imbalance between who bears the financial burden and where the benefits appear act as barriers to digital investment. If the CIO reports to the CFO and the focus is on driving down IT costs, it’s difficult to tap into the benefits. It’s important to challenge the view of IT as a cost center. That’s why we’re seeing more CIOs transition into a CDO role sitting at the top table.
The #1 priority: Build a data foundation
Ricco Larsen has no doubt about what should be at the top of the to-do list for life sciences executives:
– The most important priority is clearly to invest in a strong data foundation. Get your data harmonized and connected across your entire value chain. Because that will be the glue, the fuel and the foundation for everything you do, whether it be AI, new platforms or something that you have no chance of predicting. Data is going to be the golden key to everything, from early discovery to when the products hit the pharmacy shelves, says Ricco Larsen and continues:
– Don’t wait for the authorities to force the issue by introducing new requirements. Get to work right now! IDMP is a good example of how data standards have led to new opportunities for building unified platforms and digitizing the supply chain. If someone had had the foresight to introduce those standards on their own initiative, they would have been in much better shape to realign their business during Covid.
Finding the next blockbuster
Being able to tap into your existing data is vital for enabling new business opportunities, like finding the next headline-grabbing “blockbuster” product.
– There are several examples in the industry of the value of data enablement. For example, by utilizing existing clinical data, new applications of existing molecules can be identified – and thus the time to market can be significantly shorter than it would be for a new product, Ricco Larsen points out.
Another area where good data enablement is vital is rare diseases. Because these treatments are targeted at much smaller patient populations, identifying suitable candidates for clinical trials and producing smaller batches requires greater digital agility and insights.
– Rare diseases have traditionally not been the primary focus for large life sciences companies. But since effective treatments already exist for most major diseases, the lead time between big blockbuster products is growing. And because large pharma companies have already digitalized a big portion of their processes, they have the operational efficiency necessary to make rare disease treatments profitable.
Joint effort aims to streamline regulatory data
In parallel with the need to develop new products and the disruptions caused by an unstable supply chain, inflation and regional unrest, life sciences organizations face increasing regulatory demands: health authorities want more details, faster delivery of information and greater transparency.
These increased expectations have motivated competing pharma companies to join forces to reduce the regulatory burden and streamline the availability of regulatory data. One example is the nonprofit Accumulus Synergy, which is building a data-sharing platform that supports real-time exchange of information with regulators. This initiative is backed by several major pharma companies.
– There is an increasing willingness to accept that achieving a shorter time to market and meeting the demands of the health authorities requires flexibility and compromise on historically localized formats and standards. We can also see a movement away from the mindset that everything must be tested and validated to a more risk-based approach to enable speed, says Ricco Larsen.
AI can increase the speed of data enablement
One of the ways to increase the utilization of data and decrease the time needed to make decisions is using AI. Even before the meteoric rise of generative AI, life sciences organizations used machine learning and natural language processing for document management, drug safety, clinical research and more. But issues like avoiding bias and ensuring data integrity must be resolved before letting AI loose.
– You can’t compromise on data integrity when working with AI. You must be able to ensure data quality and avoid bias, especially if you’re working with rare diseases, where populations can be very small. But if you keep that under control through strict data governance and by employing data scientists who know how to balance the risk of bias, AI can be the solution to many of the issues life sciences organizations have been facing historically, Ricco Larsen says and continues:
– Access to the right talent is part of the equation. You need good data specialists to build your data foundation, but once that is in place, AI can help you democratize your data by making it easily accessible across your organization and value chain. At that point, you just need someone who can, and will, look outside their own silo to spot the hidden treasures and navigate the organization.