This IDC Perspective provides deep insight into the crucial role of GPU-powered transformer models in accelerating innovation and designing new therapies. This is the era of data-driven drug discovery and development. New deep learning models are replacing the conventional strategy of knocking down gene expression with small interfering RNA (siRNA), which binds to the corresponding messenger RNA (mRNA) of specific genes and blocks their expression. A neural network can now be trained on thousands of mutations (equivalent to massive data sets) to encode phenotypes, allowing drugs to be evaluated for their efficacy and safety. The two-stage model of modern NLP, in which unsupervised pretraining on massive data sets is followed by supervised fine-tuning on smaller annotated data sets, is being widely adopted. The massively parallel architecture of GPUs provides the high-compute performance that is key to accelerating the drug discovery and development process. This document discusses the use cases of GPU-powered transformer models and examines their critical value in driving innovation in a world torn by the COVID-19 pandemic. It also highlights the importance of federated learning platforms in democratizing high-performance computing and accelerating drug discovery.
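The two-stage paradigm mentioned above can be illustrated with a minimal sketch. This is not a transformer and not any vendor's actual pipeline; it is a toy analog in NumPy, with made-up sequences, that shows the pattern itself: stage 1 learns token representations from unlabeled data (here, via a co-occurrence matrix and SVD standing in for self-supervised pretraining), and stage 2 fits a small supervised head on a handful of labeled examples.

```python
# Toy sketch of "pretrain on unlabeled data, fine-tune on labeled data".
# All sequences, labels, and helper names here are illustrative assumptions.
import numpy as np

# --- Stage 1: unsupervised "pretraining" on unlabeled sequences ---
vocab = ["A", "C", "G", "T"]
tok2id = {t: i for i, t in enumerate(vocab)}

unlabeled = [list("ATATATAT"), list("TATATATA"), list("GCGCGCGC"),
             list("CGCGCGCG"), list("ATGCATGC")]

# Token co-occurrence counts in a +/-2 sliding window (self-supervision:
# the "labels" come from the data itself, not from annotation).
cooc = np.zeros((4, 4))
for seq in unlabeled:
    for i, tok in enumerate(seq):
        for j in range(max(0, i - 2), min(len(seq), i + 3)):
            if j != i:
                cooc[tok2id[tok], tok2id[seq[j]]] += 1.0

# Factorize co-occurrences into 2-d token embeddings, a crude stand-in
# for the representations a transformer learns during pretraining.
u, s, _ = np.linalg.svd(np.log1p(cooc))
embed = u[:, :2] * s[:2]  # shape (4, 2)

def featurize(seq):
    """Mean-pool pretrained token embeddings into one sequence vector."""
    return embed[[tok2id[t] for t in seq]].mean(axis=0)

# --- Stage 2: supervised fine-tuning on a small annotated set ---
labeled = [("ATATAT", 1), ("TATATA", 1), ("GCGCGC", 0), ("CGCGCG", 0)]
X = np.array([featurize(list(seq)) for seq, _ in labeled])
y = np.array([lab for _, lab in labeled], dtype=float)

# Logistic-regression "head" trained by gradient descent on the frozen
# pretrained features.
w, b = np.zeros(2), 0.0
for _ in range(500):
    p = 1.0 / (1.0 + np.exp(-(X @ w + b)))
    grad = p - y
    w -= 0.5 * (X.T @ grad) / len(y)
    b -= 0.5 * grad.mean()

preds = (1.0 / (1.0 + np.exp(-(X @ w + b))) > 0.5).astype(float)
train_acc = (preds == y).mean()
```

The point of the pattern is data efficiency: the expensive, GPU-hungry stage consumes unlabeled data, which is plentiful, while the supervised stage needs only a small annotated set.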
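Federated learning, also mentioned above, can likewise be sketched in a few lines. The sketch below assumes the federated-averaging (FedAvg) pattern with synthetic data and invented site/helper names: each site runs a few gradient steps on private data that never leaves the site, and a central server averages only the resulting model weights.

```python
# Minimal FedAvg-style sketch: sites share weights, never raw data.
# Sites, data, and function names are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(42)
true_w = np.array([2.0, -1.0])  # ground-truth model for synthetic data

def make_site_data(n):
    """Private data held by one participating site (e.g., one lab)."""
    X = rng.normal(size=(n, 2))
    y = X @ true_w + rng.normal(scale=0.1, size=n)
    return X, y

sites = [make_site_data(50) for _ in range(3)]

def local_update(w, X, y, lr=0.1, epochs=5):
    """A few local gradient steps on one site's private data."""
    w = w.copy()
    for _ in range(epochs):
        grad = 2 * X.T @ (X @ w - y) / len(y)
        w -= lr * grad
    return w

def global_loss(w):
    """Mean squared error averaged across all sites (for monitoring)."""
    return float(np.mean([np.mean((X @ w - y) ** 2) for X, y in sites]))

w_global = np.zeros(2)
loss_before = global_loss(w_global)
for _ in range(10):  # communication rounds
    local_weights = [local_update(w_global, X, y) for X, y in sites]
    w_global = np.mean(local_weights, axis=0)  # server-side averaging
loss_after = global_loss(w_global)
```

Only `w_global` and the local weight vectors cross organizational boundaries, which is what lets competing or regulated organizations pool model quality without pooling patient or compound data.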
"The pandemic has brought to the forefront the urgent need to accelerate drug discovery and development like never before. Conventional models need to be revisited. The world is undergoing a digital biology revolution as we speak. GPU-enabled deep learning transformer models will fuel innovation and will help deliver lifesaving therapies to patients. Federated learning models and domain-optimized, open source frameworks can drive collaboration and accelerate innovation. Evolutionary AI can play a role in building trust in these models," said Dr. Nimita Limaye, research VP, Life Sciences R&D Strategy and Technology at IDC.