The scientific community has just scratched the surface of how artificial intelligence (AI) can improve our lives. All the while, our world continues to change and evolve at a rapid pace. Disruption is constant and new challenges emerge.
Less than a month ago, most U.S. states had issued stay-at-home orders or entered mandatory lockdown due to the outbreak of COVID-19. Starting in California, schools and businesses were ordered to close. Supermarkets were flooded with shoppers eager to stock their pantries, while hospitals ramped up testing and care for patients infected with the deadly coronavirus. In the hours and days that followed, other states took similar measures.
Clearly, this global pandemic is creating new and sometimes unrelenting challenges. Our ability to respond quickly (knowing how to stay safe, equipping hospitals with enough supplies, giving health protection agencies up-to-the-minute tracking of the situation), the quality of human life, and our ability to defeat the coronavirus will all be strengthened by scientific and technological breakthroughs that are only possible by applying AI.
Data within the life sciences and healthcare industries is no longer human-parsable, making it ideal for the application of machine learning (ML) and deep learning (DL) techniques to discover subtle and complex patterns that can enhance the accuracy of diagnostics, speed the development of new drugs, improve treatment efficiency, and make accurate predictions. Once implemented, these insights can allow institutions to offer intelligent healthcare that can both treat disease and prevent future issues to save and improve the quality of lives.
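To make the idea of "discovering subtle patterns" concrete, here is a minimal, purely illustrative sketch: a tiny logistic-regression classifier trained from scratch on entirely synthetic "biomarker" readings (the feature names, values, and labels are invented for illustration, not real clinical data). Production diagnostic models involve far larger datasets, validation, and regulatory review; this only shows the core mechanic of learning a decision rule from labeled examples.

```python
import math

def train_logistic(data, labels, lr=0.1, epochs=500):
    """Fit a tiny logistic-regression model with plain stochastic gradient descent.

    data: list of feature vectors; labels: 0/1 outcomes.
    Returns (weights, bias).
    """
    n_features = len(data[0])
    w = [0.0] * n_features
    b = 0.0
    for _ in range(epochs):
        for x, y in zip(data, labels):
            z = sum(wi * xi for wi, xi in zip(w, x)) + b
            p = 1.0 / (1.0 + math.exp(-z))   # predicted probability of label 1
            err = p - y                       # gradient of the log-loss w.r.t. z
            w = [wi - lr * err * xi for wi, xi in zip(w, x)]
            b -= lr * err
    return w, b

def predict(w, b, x):
    """Return the predicted class (0 or 1) for one feature vector."""
    z = sum(wi * xi for wi, xi in zip(w, x)) + b
    return 1 if 1.0 / (1.0 + math.exp(-z)) >= 0.5 else 0

# Toy, synthetic readings: each row is [marker_a, marker_b];
# label 1 = condition present. All values are made up.
X = [[0.2, 0.1], [0.4, 0.3], [0.3, 0.2], [0.8, 0.9], [0.9, 0.7], [0.7, 0.8]]
y = [0, 0, 0, 1, 1, 1]

w, b = train_logistic(X, y)
print(predict(w, b, [0.25, 0.15]))  # low markers: model predicts 0
print(predict(w, b, [0.85, 0.80]))  # high markers: model predicts 1
```

The same principle, scaled up to deep networks and millions of records, is what lets models surface correlations that no human reader could spot across that much data.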
So, where’s my flying car?
I’ve sat through countless presentations and webinars demonstrating that AI and its infrastructure are mature enough to predict medical events with a speed and accuracy that far surpass what humans can achieve alone. Unfortunately, these capabilities are still in the experimental phase of AI adoption, and the trained models have yet to be deployed for production use. I saw this firsthand over the past few weeks while dealing with some back issues. My X-rays and MRI were all read by a technician, the MRI took days to assess, and the treatments offered to me weren’t personalized, nor was there data to dictate which course of action I should take. Which pain medication is best for me? Will a cortisone shot realistically help me, or will it hurt me in the long run? When is the best time to do surgery, and what type? None of those questions were answered with data-driven insights to back them up.
If the technology exists, then why aren’t these doctors allowed to be assisted by AI?
Two main barriers stand in the way of deploying this innovation, and both boil down to data problems
First, it takes a considerable amount of time to develop, train, and implement models. From idea, to infrastructure, to data, to software, to trained model, to meeting healthcare accuracy standards, the process takes quarters to years. Second, putting models into production requires both regulatory approvals (which means sharing lots of data) and agile IT systems to support the models once they are live.
Reducing these barriers means that every day we get closer to identifying the genes and circuitry correlated with cancer metastases, mental illnesses, cardiovascular issues, neurodegenerative diseases, addiction, and immune system diseases. But organizations and government bodies must be willing to move beyond today's rigid approach to data management.
Rethinking data management for AI
When it comes to data management, executing on the vision of what can be possible requires a significant departure from business as usual.
First, the frameworks and technologies used for ML and DL differ greatly from the existing enterprise systems that currently process and store data. The complexity in standing up the right software components with the underlying supportive infrastructure, as well as the associated learning curves across developers and data scientists, can present a significant barrier to getting started with AI, ML, and DL. Second, whether hosted on-premises or in a cloud, characterizing and sizing hardware/software configurations to support a given ML/DL workload at the performance it requires to train models against large volumes of data is a nontrivial pursuit.
The exploratory and iterative nature of ML/DL, and of the problems being addressed by life sciences and healthcare organizations, means that data scientists can’t afford to wait days or weeks for access to the tools they need. Organizations must invest in the technology and staff that enable their data science and development teams to quickly create and deploy sandbox environments and test hypotheses across a hybrid environment, while allowing them to use their tools of choice.
Advisory Board uses HPE Container Platform to improve patient care and lower costs
Advisory Board (now part of Optum) is a best practices firm that provides research, technology, and consulting services to healthcare organizations. They're a great example of an organization that has evolved its data management systems to help hospitals translate large volumes of data into actionable insights, using advanced analytics and AI/ML/DL technologies.
So while doctors and healthcare professionals focus on taking care of patients, Advisory Board performs analytics on the data to help make sense of it.
To scale their data management systems for advanced analytics, machine learning, and deep learning, Advisory Board turned to BlueData (acquired by HPE). They selected BlueData’s container-based software platform, which is now the HPE Container Platform.
With HPE Container Platform, they are able to separate compute from storage, allowing them to rightsize systems to their specific workloads and cut infrastructure costs by roughly 6x. They can also quickly deploy data frameworks like Apache Spark to handle their enormous log data ingest, accelerating monthly data loads from 21 days to just 45 minutes. Lastly, the cloud-like app store gives their data scientists the agility to spin up and deploy ML and analytics applications on demand.
Ultimately, this new containerized solution allowed them to answer questions faster with the data-driven insights needed to improve operational efficiency, reduce infrastructure costs, and enhance patient care. Watch this video to see how HPE Container Platform enables faster time-to-insights, reduces their costs, and frees up their staff to innovate.
Ready to learn more?
Whether your organization is focused on drug discovery, genomic sequencing, precision medicine, explorative cancer treatments, or other areas of healthcare and life sciences, HPE can help you turn your AI aspirations into reality. We offer multi-tenant, high-performance compute and storage infrastructure, along with self-service deployment of containerized environments for AI, ML, and DL technologies, all backed by the services needed to accelerate and evolve with even the most aggressive and demanding research and development workloads.
Enabling a more empowered IT organization is key to accelerating the velocity of data science teams and enhancing the return on your investment by shortening time to value. There’s no better way to better serve patients or develop revolutionary treatments for the most complex medical challenges than with the adoption of AI. And with solutions like the HPE Container Platform, we can help you accelerate this journey, develop new AI-driven innovations, and deliver faster time to insights.
To learn more, visit hpe.com/info/container-platform.