Drug development aims to bring to market active pharmaceutical ingredients (APIs) identified during the drug discovery process. During this phase, drug candidates undergo many tests to fully characterize their physical and chemical properties. At its core, this is risk assessment: it is less risky for pharmaceutical companies to assess potential development issues (e.g., polymorphism, solubility, stability) earlier rather than later in order to judge the manufacturability of the actives.

Researchers typically don't have a large quantity of material to work with early in development. For this reason, in-silico methods such as those deployed within the BIOVIA Materials Studio® modelling and simulation environment are very valuable. Materials Studio enables researchers to predict and understand the relationship between a material's molecular structure and its properties and/or behaviour.

Among all the stability stress tests performed, degradation is certainly one of the top issues pharmaceutical research scientists need to assess early, by focusing on a number of possible API degradation mechanisms, including autoxidation (or hydrogen abstraction). The reaction between pharmaceutical compounds and molecular oxygen can initiate an oxidative chain reaction that can lead to loss of the therapeutic agent, formation of toxic products, decreased bioavailability and other degradation processes.

A research paper published this summer describes an in-silico method for estimating bond dissociation energies based on Density Functional Theory (DFT), a well-known computational chemistry approach for performing electronic structure calculations. The results can be used to assess the propensity of a drug substance towards autoxidation. The method was built using the BIOVIA Pipeline Pilot® scientific workflow authoring application (in particular the Pipeline Pilot Materials Studio Collection) to automate all the required quantum mechanical calculations.
All steps of the workflow, including job submission, execution on a remote cluster and analysis, were fully automated using Pipeline Pilot. The calculations were conducted via a secure web interface allowing for efficient analysis, reporting and sharing of results.

Initially, the method was validated against a set of 45 molecules and subsequently applied to APIs with known degradation history. Eventually, the risk assessment shown in the following figure was obtained.

In conclusion, such bond dissociation energy (BDE) calculations may be taken as a complementary source of information alongside experimental stress testing for early compound stability profiling. In addition, the authors reported that "there is no need to be a highly skilled computational expert to use the Pipeline Pilot protocol."

Predicting Drug Substances Autoxidation. Lienard P, Gavartin J, Boccardi G, Meunier M. Pharm Res. 2014 Aug 13.
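To illustrate the arithmetic underlying such an approach (this is a generic sketch, not the published protocol), a homolytic bond dissociation energy can be assembled from three electronic-structure total energies: the parent molecule, the radical left after hydrogen abstraction, and the hydrogen atom. All energies below are hypothetical placeholders, not values from the paper; the hartree-to-kcal/mol conversion factor is the standard one.

```python
# Estimate a C-H bond dissociation energy (BDE) from DFT total energies:
#   BDE = E(R.) + E(H.) - E(R-H)
# Energies here are illustrative placeholders, not results from the paper.

HARTREE_TO_KCAL = 627.509  # kcal/mol per hartree

def bond_dissociation_energy(e_parent, e_radical, e_hydrogen):
    """Return the homolytic BDE in kcal/mol from total energies in hartree."""
    return (e_radical + e_hydrogen - e_parent) * HARTREE_TO_KCAL

# Hypothetical total energies (hartree) for a small molecule:
e_parent = -155.046    # R-H, the intact molecule
e_radical = -154.383   # R. after hydrogen abstraction
e_hydrogen = -0.500    # isolated H atom

bde = bond_dissociation_energy(e_parent, e_radical, e_hydrogen)
print(f"Estimated BDE: {bde:.1f} kcal/mol")
# Lower BDEs flag C-H sites more susceptible to autoxidation.
```

In a screening workflow of the kind described, this arithmetic would be repeated for every candidate abstraction site, with the DFT energies supplied by the automated quantum mechanical jobs.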
Offering insight from the perspective of a Pipeline Pilot and Materials Studio user, Accelrys is pleased to host a posting written by guest blogger Dr. Misbah Sarwar, Research Scientist at Johnson Matthey. Dr. Sarwar recently completed a collaboration project focused on fuel cell catalyst discovery and will share her results in an upcoming webinar. This post provides a sneak peek into her findings...

"In recent years there has been a lot of interest in fuel cells as a 'green' power source of the future, particularly for use in cars, which could revolutionize the way we travel. A Proton Exchange Membrane (PEM) fuel cell uses hydrogen as a fuel source and oxygen (from air), which react to produce water and electricity. However, we are still some time away from driving fuel cell cars, as many issues need to be overcome for this technology to become commercially viable. These include improving the stability and reactivity of the catalysts as well as lowering their cost, which can potentially be achieved by alloying; identifying the correct combinations and ratios of metals is key. This is a huge task, as there are potentially thousands of different combinations, and one where modeling can play a crucial role.

As part of the iCatDesign project, a three-year collaboration with Accelrys and CMR Fuel Cells funded by the UK Technology Strategy Board, we screened hundreds of metal combinations using plane-wave CASTEP calculations. In terms of stability, understanding the surface composition in the fuel cell environment is key. Predicting activity usually involves calculating barriers to each of the steps in the reaction, which is extremely time-consuming and not really suited to a screening approach. Could we avoid these calculations and predict the activity of the catalyst based on adsorption energies or some fundamental surface property?
Of course these predictions would have to be validated, and alongside the modeling work an experimental team at JM worked on synthesizing, characterizing and testing the catalysts for stability and activity.

The prospect of setting up hundreds of calculations, monitoring them and then analyzing the results seemed quite daunting, and it was clear that some automation was required both to set up the calculations and to process the results quickly. Using Pipeline Pilot technology (now part of the Materials Studio Collection), protocols were developed to process the calculations, and statistical analysis tools were developed to establish correlations between materials composition, stability and reactivity. The results are available to all partners through a customized web interface.

The protocols have been invaluable, as data can be processed at the click of a button and customized charts produced in seconds. The time saving is immense, saving days of endless copying, pasting and manipulating data in spreadsheets, not to mention minimizing human error, leaving us to do the more interesting task of thinking about the science behind the results. I look forward to sharing these results and describing the tools used to obtain them in more detail in the webinar, Fuel Cell Catalyst Discovery with the Materials Studio Collection, on 21st July."
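The descriptor-based screening idea can be made concrete with a minimal sketch (not the iCatDesign workflow itself): collect the slab, adsorbate and combined total energies for each candidate alloy, compute the adsorption energy, and rank the candidates. All alloy names and energies below are hypothetical placeholders, not CASTEP results.

```python
# Rank hypothetical alloy catalysts by oxygen adsorption energy, used as a
# simple activity descriptor:  E_ads = E(slab+O) - E(slab) - E(O).
# All values are illustrative placeholders, not real DFT results.

candidates = {
    # alloy: (E_slab, E_slab_plus_O, E_O) -- energies in eV, made up
    "Pt":   (-210.0, -216.1, -1.9),
    "PtNi": (-198.5, -204.9, -1.9),
    "PtCo": (-201.2, -207.3, -1.9),
}

def adsorption_energy(e_slab, e_combined, e_adsorbate):
    """Adsorption energy in eV; more negative means stronger binding."""
    return e_combined - e_slab - e_adsorbate

# Sort candidates from strongest to weakest binding.
ranked = sorted(
    ((alloy, adsorption_energy(*vals)) for alloy, vals in candidates.items()),
    key=lambda pair: pair[1],
)

for alloy, e_ads in ranked:
    print(f"{alloy:5s} E_ads = {e_ads:+.2f} eV")
```

In practice the "best" catalyst is not simply the strongest binder; a descriptor like this is typically compared against an optimal binding-energy window, which is exactly the kind of correlation the automated statistical analysis was built to explore.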
Informatics in High Content Screening (HCS) is reshaping the mix of scientists driving drug discovery efforts. In the early days of HCS, I worked closely with electrical, mechanical and software engineers to develop better systems for image acquisition and processing. My responsibilities as an HCS biologist involved painstaking hours of sample preparation and cell culture, and constant enhancements to my materials and methods for preparing biological specimens for imaging. I was motivated by the many new collaborative efforts that began with the software engineers, the systems engineers and the machine vision scientists developing HCS systems. I found myself teaching basic concepts of biology as I learned about illumination and optics, piezoelectric drives for autofocusing and, of course, the strings of zeros and ones that would eventually tell me what happened to my protein. It was exciting to be part of a cross-functional team developing new applications by piecing together advances in hardware, image processing and biological assay technologies.

High Content Screening systems and vendor software have come a long way since my introduction to the technology ten years ago. Vendors struggled to balance powerful, flexible systems against ease of use (1). The bottleneck has shifted from application development to data informatics. Software systems in HCS have evolved to integrate databases and other related sources for chemical structures, target characteristics and assay results. Today, I collaborate with colleagues in HCS in new areas that include data mining, principal component analysis, Bayesian modeling, decision trees and data management. The mix of HCS conference speakers and attendees has shifted from what had primarily been assay developers to a growing population of informaticians and IT experts. Talks have moved beyond assay design and system development to incorporate more downstream data processing.
We have worked on complex fingerprinting methods for predicting characteristics of a compound, such as its mechanism of action or how it might affect a particular biological pathway involved, for example, in neuronal stem cell differentiation. Vendors are moving to more open systems for image processing and are integrating more third-party applications into their HCS acquisition systems to keep up with the shifting bottlenecks and emerging solutions. Informaticians have been able to improve data analysis efforts and significantly reduce the number of man-hours required for downstream data analysis (2). I've been fortunate to have developed relationships with experts at most of the leading HCS instrument companies. My journey has been one of constant growth and continuous learning. I'm anxious to know what's coming next in High Content Screening and eager to learn from my ever-growing network of scientific experts.

1. High-Content Analysis: Balancing Power and Ease of Use, by Jim Kling
2. Data Analysis for a High Content Assay Using Pipeline Pilot: 8x Reduction in Man-hours, from a poster by L. Bleicher, Brain Cells Inc.
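As a toy illustration of fingerprint-based mechanism-of-action prediction (a hypothetical sketch under simple assumptions, not any vendor's or author's actual method), an unknown compound's HCS feature fingerprint can be compared against reference compounds of known mechanism using cosine similarity:

```python
# Toy mechanism-of-action (MoA) prediction: assign an unknown compound the
# MoA of its most similar reference fingerprint (cosine similarity).
# Fingerprints are made-up vectors of normalized HCS image features.
import math

def cosine(a, b):
    """Cosine similarity between two equal-length feature vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

# Reference compounds with known MoA (hypothetical data).
references = {
    "tubulin disruptor":  [0.9, 0.1, 0.8, 0.2],
    "kinase inhibitor":   [0.2, 0.9, 0.1, 0.7],
    "DNA damage inducer": [0.5, 0.5, 0.9, 0.9],
}

def predict_moa(fingerprint):
    """Return the MoA label of the most similar reference compound."""
    return max(references, key=lambda moa: cosine(fingerprint, references[moa]))

unknown = [0.85, 0.15, 0.75, 0.25]
print(predict_moa(unknown))  # closest to the tubulin-disruptor profile
```

Real HCS fingerprints span hundreds of morphological features per well, which is why the dimensionality-reduction and modeling methods mentioned above (PCA, Bayesian models, decision trees) matter so much downstream.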