Catalog Advanced Search
Mitigation and identification of aggregation and nonspecific reactivity interference in high-throughput screening
Contains 2 Component(s) | Recorded On: 06/07/2018
Compound-mediated assay interference and bioassay promiscuity are significant burdens in drug and chemical probe discovery. Pursuing artifacts or poorly tractable, nonselective chemical matter from real and virtual high-throughput screening (HTS) can waste significant scientific resources and can lead to tenuous scientific conclusions when used in subsequent studies. Two of the more prominent sources of generalized assay interference in biological assays by test compounds are aggregation and nonspecific reactivity. Aggregators and reactive compounds can interfere across multiple assay technologies and formats including biochemical and cellular systems, often leading to poorly tractable, promiscuous bioactivity. Apparent bioactivity from such interference compounds can be highly deceptive and quite convincing, even to experienced scientists. These subversive sources of apparent bioactivity represent significant project risks that can fortunately be mitigated with appropriate experimental design. This webinar will first discuss the fundamental chemical principles of these interferences, then introduce (1) technical recommendations to mitigate the incidence of aggregation and nonspecific reactivity in biological assays, and (2) basic and advanced counter-screens to identify as well as de-risk apparent bioactive compounds for aggregation and nonspecific reactivity. This information should be helpful for those performing HTS and triage, drug and chemical probe development, and chemical biology.
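A common basic counter-screen for colloidal aggregation, consistent with the mitigation strategies the webinar describes, is to re-test apparent hits in the presence of a nonionic detergent (e.g. 0.01% Triton X-100): activity that collapses when detergent disrupts the aggregate is suspect. The sketch below illustrates that triage logic; the thresholds and function name are illustrative assumptions, not values from the webinar.

```python
def flag_likely_aggregator(inhib_no_detergent, inhib_with_detergent,
                           hit_threshold=50.0, rescue_fraction=0.5):
    """Flag a hit as a likely colloidal aggregator.

    A compound scored as a hit without detergent (percent inhibition >=
    hit_threshold) whose inhibition drops by more than `rescue_fraction`
    of its original value when detergent (e.g. 0.01% Triton X-100) is
    added gets flagged for follow-up. Thresholds are illustrative.
    """
    if inhib_no_detergent < hit_threshold:
        return False  # not a hit in the first place
    if inhib_with_detergent <= 0:
        return True   # activity fully abolished by detergent
    drop = (inhib_no_detergent - inhib_with_detergent) / inhib_no_detergent
    return drop > rescue_fraction

# 85% inhibition collapsing to 10% with detergent is suspicious;
# activity retained with detergent (85% -> 80%) is not.
print(flag_likely_aggregator(85.0, 10.0))  # True
print(flag_likely_aggregator(85.0, 80.0))  # False
```

In practice such a flag is one input among several (steep Hill slopes, enzyme-concentration dependence, and dynamic light scattering are complementary checks).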
Jayme L. Dahlin, M.D., Ph.D.
Department of Pathology, Brigham and Women’s Hospital
Dr. Dahlin joined the Mayo Clinic (Rochester, MN) Medical Scientist Training Program in 2007 after earning his B.A. in chemistry from Carleton College (Northfield, MN). In 2016, he graduated with an M.D. from Mayo Medical School and a Ph.D. in Molecular Pharmacology and Experimental Therapeutics from Mayo Graduate School. He is currently Chief Resident in Clinical Pathology at Brigham and Women’s Hospital (Boston, MA) and a postdoctoral research fellow in the laboratory of Dr. Stuart L. Schreiber at the Broad Institute of Harvard/Massachusetts Institute of Technology. His graduate and postdoctoral work has focused on chemical mechanisms of biological assay interference and post-HTS triage. His research interests include HTS and triage tool development, bioassay promiscuity, and compound-mediated assay interference. Dr. Dahlin serves on the editorial board for the NIH Assay Guidance Manual and the Scientific Advisory Board of the Chemical Probes Portal.
Using AI and IoT to Accelerate Research
Contains 2 Component(s) | Recorded On: 05/08/2018
In this webinar, we will explore how new technologies can be used to accelerate scientific research. Join us for an in-depth look at how we can overcome challenges by practically incorporating new tech into research processes to uncover and decipher hidden confounding variables.
In recent years we’ve seen an explosion of technologies surrounding seemingly “new” fields of artificial intelligence / machine learning (AI/ML), the Internet of Things (IoT), and the like. Whilst many of the underlying technologies have existed for decades, only recently has compute power become affordable and powerful enough to apply them to new fields. In this webinar, we will explore how these new technologies can be used to accelerate scientific research. Many of the AI tools currently being used are focused on data mining; whilst this is a necessary part of the process, it is important to note that any AI system is only as good as the data that’s fed into it. This is where the integration of sensors and IoT can make a meaningful impact – by intelligently incorporating protocol design and real-time sensing of physical parameters with machine learning, we can dramatically improve research outcomes by increasing the reproducibility of experiments. How many times have you run an experiment and not been able to reproduce the results – only to find out (after months of searching) that the root cause was something trivial like improper storage of reagents, miscalibrated instruments, or environmental variations in the lab? Join us for an in-depth look at how we can overcome these challenges by practically incorporating new tech into research processes to uncover and decipher these hidden confounding variables.
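The "hidden confounding variable" idea above boils down to continuously logging physical parameters and flagging excursions from the recent baseline. A minimal sketch of that kind of check, using a rolling z-score over hypothetical hourly fridge-temperature readings (window size and threshold are made-up parameters, not from the webinar):

```python
from statistics import mean, stdev

def find_excursions(readings, window=24, z_thresh=3.0):
    """Flag sensor readings that deviate sharply from the recent baseline.

    `readings` is a time-ordered list of values (e.g. hourly fridge
    temperatures). Each point is compared against the mean/stdev of the
    preceding `window` points; indices of outliers are returned.
    Window size and z threshold are illustrative.
    """
    flagged = []
    for i in range(window, len(readings)):
        baseline = readings[i - window:i]
        mu, sigma = mean(baseline), stdev(baseline)
        if sigma > 0 and abs(readings[i] - mu) / sigma > z_thresh:
            flagged.append(i)
    return flagged

# A stable ~4 degC fridge trace with one door-left-open spike at the end
trace = [4.0, 4.1, 3.9, 4.0] * 6 + [12.0]
print(find_excursions(trace))  # [24]
```

Real IoT monitoring platforms layer calibration, alerting, and correlation with experiment metadata on top of this basic anomaly test.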
Sridhar Iyengar, PhD
CEO & Founder, Elemental Machines
A serial entrepreneur in sensors, IoT, medical devices, and wearables, Sridhar founded Elemental Machines with the mission to build products that help science-based companies decipher and understand physical processes from R&D to manufacturing. Previously, Sridhar was a founder of Misfit, makers of elegant wearable products, which was acquired by Fossil in 2015. Prior to Misfit, he founded AgaMatrix, a blood glucose monitoring company that made the world’s first medical device connecting directly to the iPhone. AgaMatrix shipped 15+ FDA-cleared medical products, 2B+ biosensors, and 6M+ glucose meters, with partnerships with Apple, Sanofi, and Walgreens. Sridhar holds over 50 US and international patents and received his Ph.D. from Cambridge University as a Marshall Scholar. Beyond Elemental Machines, Sridhar has been known to run 13.1 miles on occasion and has sometimes been spotted on stage behind a wall of drums.
Coming in October 2018: SLAS Technology Special Issue on the Internet of Things in the Laboratory
Sneak Peek: Preview Special Issue Articles Online Now (ahead-of-print)
More articles coming soon to SLAS Technology OnlineFirst
In-house software and processes to support High Content Screening of Primary Neurons
Contains 1 Component(s) | Recorded On: 02/07/2018
This presentation will focus on the development and implementation of novel in-house software utilities used at Scripps Florida to support the Synaptogenesis neuroscience drug discovery project.
The integration of High Content Screening (HCS) devices onto High Throughput Screening (HTS) platforms to support neuroscience research presents unique challenges for drug discovery teams. In particular, the informatics aspect of HCS applied to neuroscience is an area where advances in software automation can result in substantial throughput and efficiency gains for researchers. In addition to the data processing and storage requirements of HCS, which exceed those of traditional HTS readers, neuroscience assays present a number of unique challenges for the HTS environment. These challenges include ensuring data integrity from acquisition through analysis & QC, porting data between multiple distinct HCS platforms and providing end-user analytic tools for ongoing intermediate assay results.
This presentation will focus on the development and implementation of novel in-house software utilities used at Scripps Florida to support the Synaptogenesis neuroscience drug discovery project. This project utilizes multiple HCS platforms, has an ongoing non-traditional HTS timeline and requires on-demand access to the full HCS data stream, from raw source images to final endpoint results. The biology of the Synaptogenesis project is currently amenable to 384-well format while the Scripps Florida uHTS platform is optimized for 1536-well screening. Supporting an HTS campaign where the compound collection resides in a different plate density from the assay plate required the development of custom robotic and informatics procedures. The Synaptogenesis endpoint calculation requires measurements at DIV 12 and DIV 14, where each assay plate is represented by 3,072 images (384 wells at 4 images per well for each of two separate reads) which must be associated with relevant metadata (plate barcode, well row, well column and well quadrant) for downstream tracking. Previous neuroscience assays in this format were not amenable to robotics screening and were limited in throughput. To date, we have successfully screened a large portion (greater than 40,000 compounds) of the Scripps Drug Discovery library, in quadruplicate, iteratively over a 9-month period.
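The bookkeeping described above (3,072 image records per plate, each tagged with barcode, well position and quadrant) can be sketched as a simple enumeration. Field naming and the record layout below are illustrative assumptions, not the Scripps schema:

```python
from itertools import product
import string

def build_image_index(plate_barcode, rows=16, cols=24, fields=4,
                      reads=("DIV12", "DIV14")):
    """Enumerate metadata records for one 384-well assay plate.

    Mirrors the arithmetic in the abstract: 384 wells x 4 images per
    well x 2 reads = 3,072 image records, each carrying plate barcode,
    well position and field (quadrant) for downstream tracking.
    """
    row_labels = string.ascii_uppercase[:rows]  # A..P for a 384-well plate
    index = []
    for read, row, col, field in product(reads, row_labels,
                                         range(1, cols + 1),
                                         range(1, fields + 1)):
        index.append({
            "barcode": plate_barcode,
            "read": read,
            "well": f"{row}{col:02d}",
            "field": field,
        })
    return index

records = build_image_index("SYN-0001")  # hypothetical barcode
print(len(records))  # 3072
```

Keeping this index in a database keyed on (barcode, read, well, field) is what lets end users drill down from endpoint values back to the source images.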
To meet these challenges, custom software tools have been developed which enable scientists to manage what would otherwise be an overwhelming amount of data. These tools include a web-based portal that allows end users to easily review HCS neuroscience data as it comes off the HTS platform and to quickly drill down from endpoint data back to the images acquired by the reader. The advantages of developing in-house informatics over commercial products and the impact of these tools on the ongoing neuroscience research at Scripps Florida are presented.
Pierre established the Compound Management team within the Lead Identification/HTS group at Scripps Florida when the lab was established in 2005. The Compound Management team is responsible for supporting both industrial and academic drug discovery efforts with a proprietary >600,000 sample library and the NIH's >300,000 sample MLPCN collection. Aside from typical Compound Management duties, Pierre has developed, assembled and integrated novel automated hardware and software for the purpose of drug discovery.
Inline, Label-free Detection Using the Droplet Frequency Sensor
Contains 1 Component(s) | Recorded On: 02/07/2018
Inline detectors are extensively used in chemical separations and other life sciences workflows to quantify analytes based on fundamental properties.
Inline detectors are extensively used in chemical separations and other life sciences workflows to quantify analytes based on fundamental properties. For example, absorbance (UV-VIS) detectors measure the analyte’s light-absorbing chromophores, refractive index (RI) detectors measure molecular cross section, and electrochemical (EC) detectors and mass spectrometers (MS) measure the analyte’s charge or mass-to-charge ratio. Although hydrophobicity and solubility are important properties of an analyte, to date there are no inline detectors based on such properties. Here, we present the drop frequency sensor (DFS), a novel inline detector which quantifies an analyte based on its adsorption to a liquid interface.
The DFS is based on the surfactant retardation effect, a phenomenon first described by Levich in the 1960s. Levich observed that the velocity of a rising bubble is lower than expected if surfactants are present. Surfactants adsorb to the bubble’s interface, and surface flows convect them to the trailing end, where they aggregate into a stagnant cap. The cap has two effects, both of which increase drag: 1) the interface becomes immobile, and 2) the nonuniform surfactant concentration results in a surface tension gradient, which induces a Marangoni force opposing the motion of the bubble. The DFS exploits a similar effect in droplets flowing through a microchannel. Droplets of the analyte are generated by combining the sample stream with a stream of oil in a microfluidic tee junction. If the droplet contains hydrophobic molecules or other surface-active species, the molecules adsorb to the interface and are convected to the trailing end, similar to Levich’s experiments. Here, they form a stagnant cap which increases drag on the droplet train, and therefore increases the channel’s hydrodynamic resistance. In a pressure-driven system, the increased resistance reduces the flow rate and hence the frequency of drop generation. The droplet frequency is measured with a light scattering detector.
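The pressure-driven relationship described above can be captured in a toy model: drop frequency tracks volumetric flow rate, Q = ΔP / R, so any extra interfacial drag that adds resistance lowers the frequency. The units and numbers below are arbitrary; this is a sketch of the proportionality, not a calibrated model of the instrument:

```python
def drop_frequency(delta_p, r_channel, r_cap=0.0, volume_per_drop=1.0):
    """Toy pressure-driven model of the DFS signal.

    Drop generation frequency is proportional to volumetric flow rate,
    Q = delta_p / (r_channel + r_cap). Adsorbed surface-active analyte
    contributes the extra resistance r_cap (the stagnant-cap drag),
    lowering the frequency. All quantities are in arbitrary units.
    """
    q = delta_p / (r_channel + r_cap)
    return q / volume_per_drop

f_clean = drop_frequency(100.0, 10.0)              # clean carrier stream
f_analyte = drop_frequency(100.0, 10.0, r_cap=2.5)  # surface-active analyte
print(f_clean, f_analyte)  # 10.0 8.0
```

The measured quantity is the frequency dip relative to baseline, which is what produces the chromatographic peak described next.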
The DFS demonstrates excellent quantitation capability for Bovine Serum Albumin (BSA), a globular protein with known hydrophobic regions. Injection of BSA into the analyte stream temporarily reduces the drop frequency, and the frequency then returns to baseline, generating a chromatographic peak. The peak area increases linearly with the quantity of injected BSA with a correlation coefficient R² = 0.997. This process is highly repeatable, which is important for measurement precision. The limit of detection for BSA is 2 ng, and <200 pg for L-galectin, a hydrophobic protein with a smaller molecular weight. The high signal-to-noise ratio suggests that even lower detection limits are possible. The low detection limits are achieved because the high surface area to volume ratio favors adsorption phenomena.
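The calibration step described above (peak area vs. injected mass, with R² reported for the linear fit) is standard least-squares work. The data points below are invented for illustration, not the webinar's measurements:

```python
import numpy as np

def calibration(masses_ng, peak_areas):
    """Fit a linear calibration (peak area vs. injected mass) and report R^2.

    Mimics the quantitation described for BSA: peak area should grow
    linearly with injected quantity. Returns (slope, intercept, r2).
    """
    x = np.asarray(masses_ng, dtype=float)
    y = np.asarray(peak_areas, dtype=float)
    slope, intercept = np.polyfit(x, y, 1)   # degree-1 polynomial fit
    pred = slope * x + intercept
    r2 = 1.0 - np.sum((y - pred) ** 2) / np.sum((y - y.mean()) ** 2)
    return slope, intercept, r2

# Hypothetical near-linear response, roughly area = 2 * mass
slope, intercept, r2 = calibration([2, 5, 10, 20, 50],
                                   [4.1, 10.2, 19.8, 40.5, 99.7])
print(round(slope, 2), r2 > 0.99)
```

With the fit in hand, an unknown injection is quantified by inverting the line: mass = (area - intercept) / slope.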
Wayne State University
Amar Basu received a BSE and MSE in electrical engineering, an MS in biomedical engineering, and a Ph.D. in electrical engineering, all with honors from the University of Michigan Ann Arbor. His dissertation, under Prof. Yogesh Gianchandani at the NSF Center for Wireless Integrated Microsystems, investigated interfacial tension-driven microfluidics. He has been a visiting scholar at Purdue University under Prof. Graham Cooks and Intel's New Devices group, and has served as an adjunct faculty at the University of Michigan. Amar is currently associate professor of electrical engineering and biomedical engineering at Wayne State University. His research, supported primarily by the NSF, focuses on microfluidic and microelectronic instrumentation for high-throughput screening and point of care monitoring. He received the NSF BRIGE award, WSU CoE Outstanding Faculty Award, the IEEE-WSU Professor of the Year, and the Whitaker Foundation Fellowship. More information about his lab can be found at www.microfluidics.wayne.edu.
Open development from user to vendor and back again; how everybody wins
Contains 1 Component(s) | Recorded On: 02/07/2018
This presentation will discuss how this ‘overlap’ can be leveraged to produce better products, better interaction and better results for all parties. This presentation will explore opportunities to further these aims and bring the supplier and user of everyday laboratory equipment together.
The ‘vendor’ community and ‘user’ community are today becoming commonly intertwined, with the user community taking advantage of modern prototyping and manufacturing technologies such as 3D printing, micro-controllers and laser cutting. In addition, the vendor community often uses these same technologies to produce products. This means that there is a significant overlap such as we have never seen before.
The presenter has previously worked in the instrument user community at major pharmaceutical companies, large biotech, startup biotech and academia. During this period a close collaborative relationship between the user and supplier led to improved performance of the equipment purchased and increased reliability. He now runs an instrument company which sells to the end user, and so sees the other side of the coin – how to support equipment in the field as a manufacturer. Concepts such as printing your own spare parts and even flat-pack style delivery will be explored. The practical realities will also be discussed: releasing the designs of a product’s internal parts could leave a company’s designs open to reuse by a competitor, and equipment users vary in their willingness to carry out repairs themselves.
This presentation will discuss how this ‘overlap’ can be leveraged to produce better products, better interaction and better results for all parties. This presentation will explore opportunities to further these aims and bring the supplier and user of everyday laboratory equipment together.
Neil is co-founder and Managing Director of Ziath. Since 1994, Neil has worked with a range of companies, including GlaxoSmithKline, Cambridge Antibody Technology, Cenix Bioscience GmbH and the Max Planck Institute of Cell Biology and Genomics. Within these companies Neil has been responsible for the development, maintenance and implementation of laboratory automation and associated software with a focus on process control and information management.
Neil has served on the board of the European Laboratory Robotics Interest Group (ELRIG) in both Germany and the UK. He was the informatics chair for Lab Automation 2009, has edited for the Journal of the Association for Laboratory Automation and also serves on the board of the Journal of Laboratory Automation. Neil has a Bachelor’s degree in Biotechnology and a Master’s degree in Computer Science.
Large scale profiling in human primary-cell based phenotypic assays identifies novel outcome pathways for drug efficacy in cardiovascular disease
Contains 1 Component(s) | Recorded On: 02/07/2018
Findings support the value of a large chemical biology database of reference drugs profiled through primary human cell-based phenotypic assays. This database has been mined to reveal several novel associations with adverse events and identified potential mechanisms of toxicity, and here we show how this database can be used to generate new hypotheses for drug efficacy.
We have previously identified an in vitro signature, characterized by increased cell surface levels of serum amyloid A (SAA) in a human primary cell-based coronary artery smooth muscle cell model of vascular inflammation (BioMAP CASM3C system), shared by certain compound classes associated with cardiovascular toxicity. Data mining a large reference database containing more than 4,500 test agents (drugs, experimental chemicals, etc.) profiled in this assay identified certain mechanisms as being associated with this signature: MEK inhibitors, HDAC inhibitors, GR/MR agonists, IL-6 pathway agonists, as well as modulators of SIRT1. Since SAA is a clinical biomarker associated with risk of cardiovascular disease in humans, these results suggested that these mechanisms might contribute to cardiotoxicity by direct promotion of vascular dysfunction through SAA within vascular tissues. To further extend these studies, we mined the reference database to identify agents that decrease levels of SAA in the BioMAP CASM3C system without causing overt cytotoxicity. Notable agents found to decrease the cell surface level of SAA relative to vehicle control include GLP-1, an endogenous peptide developed as a drug for the treatment of diabetes; roflumilast, a PDE IV inhibitor used for the treatment of chronic obstructive pulmonary disorder; imatinib, a BCR-Abl inhibitor and oncology drug; and a mimetic of ApoA-1, the major lipoprotein of HDL. These agents have been shown to have cardiovascular protective effects in clinical or in vivo studies (some within their class). The results here suggest a potential mechanism for this cardiovascular benefit through regulation of SAA, possibly through interfering with the involvement of SAA in the recruitment and activation of monocytes in the vascular wall. These findings support the value of a large chemical biology database of reference drugs profiled through primary human cell-based phenotypic assays.
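The mining step described above amounts to filtering the profiling database for agents that lower cell-surface SAA relative to vehicle control without overt cytotoxicity. A minimal sketch of that filter; the field names, cutoffs and example values are illustrative assumptions, not the BioMAP schema or data:

```python
def find_saa_reducers(profiles, saa_cut=-0.2, viability_cut=0.8):
    """Filter a profiling database for agents that lower cell-surface SAA
    without overt cytotoxicity.

    `profiles` maps agent name -> dict with 'saa_log_ratio' (log10 ratio
    of SAA vs. vehicle control) and 'viability' (fraction of control).
    Both cutoffs are hypothetical.
    """
    return sorted(
        name for name, p in profiles.items()
        if p["saa_log_ratio"] <= saa_cut and p["viability"] >= viability_cut
    )

# Invented example records (agent names from the abstract, values made up)
db = {
    "GLP-1":       {"saa_log_ratio": -0.45, "viability": 0.97},
    "roflumilast": {"saa_log_ratio": -0.30, "viability": 0.92},
    "agent-X":     {"saa_log_ratio": -0.60, "viability": 0.40},  # cytotoxic
    "agent-Y":     {"saa_log_ratio":  0.05, "viability": 0.99},  # inactive
}
print(find_saa_reducers(db))  # ['GLP-1', 'roflumilast']
```

The cytotoxicity gate matters: without it, generally toxic compounds would dominate any "SAA decreased" hit list.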
This database has been mined to reveal several novel associations with adverse events and identified potential mechanisms of toxicity, and here we show how this database can be used to generate new hypotheses for drug efficacy. Collectively these data support a disease and adverse outcome pathway for cardiovascular disease involving the regulation of SAA.
Ellen L. Berg, PhD, is Chief Scientific Officer at DiscoverX, BioMAP Division. She held prior positions at BioSeek and Protein Design Labs, earned her PhD from Northwestern University and was a postdoc at Stanford University. She is an SLAS fellow (Society for Laboratory Automation and Screening), a board member of ASCCT (American Society of Cellular and Computational Toxicology), and a member of the Society of Toxicology (SOT) and the Inflammation Research Association (IRA). Her research interests include human-based in vitro models of tissue and disease, chemical biology for predicting drug and toxicity mechanisms of action and phenotypic drug discovery. Dr. Berg holds a number of patents in the field of inflammation and has authored >80 publications.
HIPStA, a High Throughput Alternative to CETSA
Contains 1 Component(s) | Recorded On: 02/07/2018
This presentation reviews data demonstrating proof of concept for the HIPStA method, using three different classes of drug discovery targets: receptor tyrosine kinases, nuclear hormone receptors and cytoplasmic protein kinases. HIPStA represents a more scalable alternative to CETSA for detecting drug–target interaction in cells.
The measurement of drug–target interaction in the cellular context is critical to many drug development programs. The Cellular Thermal Shift Assay (CETSA) is an established, broadly applicable method for measuring drug–target interaction. However, CETSA has some major limitations that make it difficult to scale to the throughput typically required for a drug development project: it requires heating samples to different temperatures, along with centrifugation and/or filtration steps, which limit throughput. The HSP90 Inhibitor Protein Stability Assay (HIPStA) is a novel method for measuring drug–target interaction. Like CETSA, HIPStA is based on the premise that the binding of a ligand to a target protein can influence that protein’s stability. Instead of using heat to destabilize a protein, HIPStA uses a Heat Shock Protein 90 inhibitor (HSP90i) to cause protein instability. Instead of scanning a range of temperatures to establish a thermal denaturation curve, HIPStA applies a range of concentrations of an HSP90i to determine an HSP90i-induced denaturation curve, and ultimately measures the ability of a compound to stabilize a protein. We present data demonstrating proof of concept for the HIPStA method, using three different classes of drug discovery targets: receptor tyrosine kinases, nuclear hormone receptors and cytoplasmic protein kinases. HIPStA represents a more scalable alternative to CETSA for detecting drug–target interaction in cells.
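The readout logic described above can be sketched with a toy dose-response model: ligand binding stabilizes the target, shifting the HSP90i-induced denaturation curve to higher inhibitor concentrations. The Hill-curve form, parameter values and fold-shift metric below are illustrative assumptions, not the assay's actual analysis:

```python
def fraction_remaining(hsp90i_conc, ec50, hill=1.5):
    """Fraction of target protein remaining at a given HSP90i dose,
    modelled as a simple Hill curve (top 1, bottom 0). Illustrative only."""
    return 1.0 / (1.0 + (hsp90i_conc / ec50) ** hill)

def stabilization_shift(ec50_dmso, ec50_compound):
    """HIPStA-style readout sketch: report the fold-shift in the
    half-denaturation HSP90i concentration caused by a test compound.
    Values > 1 indicate the compound stabilized the target."""
    return ec50_compound / ec50_dmso

# Hypothetical curves: ligand binding triples the HSP90i EC50
print(round(stabilization_shift(0.5, 1.5), 2))  # 3.0
print(round(fraction_remaining(0.5, 0.5), 2))   # 0.5 at the midpoint
```

This mirrors how CETSA data are often summarized as a thermal-shift (delta-Tm), except the independent variable is inhibitor concentration rather than temperature.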
Robert A. Blake (DPhil) is a scientist in drug and target discovery specializing in oncology drug development, cellular and biochemical assays for high throughput screening, automated fluorescence imaging, signal transduction and protein degradation. He is currently a scientist in the department of Biochemical and Cellular Pharmacology at Genentech and worked previously at Sugen, Exelixis, and iPierian. He has published on the discovery of novel selective kinase inhibitors including the Src inhibitor SU6656, HSP90 inhibitors, high content fluorescence imaging based methods and was a member of the team that developed SUTENT (SU11248). His current focus is the development of drugs whose mechanism of action includes the degradation of the target protein.
New Functional Genomics toolsets: Arrayed loss of function screening with LentiArray CRISPR libraries
Contains 1 Component(s) | Recorded On: 02/07/2018
Here we demonstrate a knock-out screening approach that utilizes the Invitrogen™ LentiArray™ CRISPR library to interrogate the impact of individual gene knock-outs on the NFκB pathway as measured by a functional cell-based assay. We describe the library design concepts, the assay development, initial screening results and validation of specific identified hits.
Identifying and validating targets that underlie disease mechanisms and can be addressed to provide efficacious therapies remains a significant challenge in the drug discovery and development process. RNA interference (RNAi) mechanisms have enabled the use of siRNA and shRNA to knock down RNA and suppress gene function. However, depending on the nature of the targets, cells, biology and end-point assays, these approaches may suffer variously from their transient nature, design complexity, incomplete knock-down or off-target effects. The use of CRISPR (clustered regularly interspaced short palindromic repeat)-associated Cas9 nuclease and guide RNA (gRNA) provides a strong alternative, offering transient or long-lasting impact, straightforward design, complete gene knock-out and increased specificity. A number of laboratories have already published reports demonstrating how pools of gRNA can be delivered to cells and “hits” can be established through enrichment or depletion of cells following a “survival” assay and identified by sequencing the introduced gRNAs in the remaining cell population. Here we demonstrate a knock-out screening approach that utilizes the Invitrogen™ LentiArray™ CRISPR library to interrogate the impact of individual gene knock-outs on the NFκB pathway as measured by a functional cell-based assay. We describe the library design concepts, the assay development, initial screening results and validation of specific identified hits. We elucidate the key factors in developing a robust assay including both transduction and assay optimization to achieve the highest levels of transduction efficiency and assay window and provide data from initial screens using the Invitrogen™ LentiArray™ CRISPR kinome library. We expect these approaches to be scalable to the entire human genome and portable to multiple cell types and end-point assays including both high-throughput plate-based assays and high-content imaging based assays.
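In an arrayed screen like the one described above (one gene knock-out per well, read out by a functional assay), hit calling is typically done by scoring each well against the plate's negative controls. A minimal z-score sketch; gene names, signal values and the cutoff are invented for illustration, and real pipelines add plate normalization and replicate handling:

```python
from statistics import mean, stdev

def call_hits(well_signals, neg_controls, z_cut=3.0):
    """Call hits in an arrayed (one gene per well) knock-out screen.

    Each well's reporter signal (e.g. an NF-kB pathway readout) is
    converted to a z-score against the plate's negative-control wells;
    wells beyond `z_cut` standard deviations are called hits.
    Returns {gene: z_score} for the hits.
    """
    mu, sigma = mean(neg_controls), stdev(neg_controls)
    return {gene: (s - mu) / sigma
            for gene, s in well_signals.items()
            if abs(s - mu) / sigma >= z_cut}

neg = [100, 98, 103, 101, 99, 102, 97, 100]          # control wells
wells = {"IKBKB": 55.0, "GENE2": 101.0, "RELA": 60.0}  # hypothetical wells
print(sorted(call_hits(wells, neg)))  # ['IKBKB', 'RELA']
```

Unlike pooled screens, no sequencing deconvolution is needed: the well position identifies the gene directly, which is what makes arrayed libraries compatible with rich phenotypic readouts such as high-content imaging.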
Thermo Fisher Scientific
Cell and Molecular Biology, Genome Editing.
Next generation target discovery: systematic application of the CRISPR toolkit
Contains 1 Component(s) | Recorded On: 02/07/2018
Forward genetic screening with CRISPR–Cas9 has provided a promising new way to interrogate the phenotypic consequences of gene manipulation in high-throughput, unbiased analyses in target ID, target validation, drug MOA analysis and patient stratification. Diseases previously refractory to systematic high-throughput interrogation are now coming into the cross-hairs of powerful new functional genomic solutions.
Forward genetic screening with CRISPR–Cas9 has provided a promising new way to interrogate the phenotypic consequences of gene manipulation in high-throughput, unbiased analyses in target ID, target validation, drug MOA analysis and patient stratification. Diseases previously refractory to systematic high-throughput interrogation are now coming into the cross-hairs of powerful new functional genomic solutions. To date, the majority of screens have been conducted using loss-of-function perturbation driven by CRISPR–Cas9-enacted gene knock-out. Although powerful, this approach does not allow for the examination of activating gene function, leaving a salient gap in the functional genomic analysis. In order to add depth to our discovery platforms, we have constructed new platforms using both CRISPRi and CRISPRa transcriptional regulation tools. Both of these platforms have been adapted to use next generation, highly optimised whole-genome targeting libraries in order to achieve maximal modulation of gene expression. Our validation analysis of these approaches revealed outstanding performance and sensitivity, with greater than ten-fold improvement in detection rates compared to existing tools.
Screening for drug resistance with this dual platform yields unambiguous target discovery and simultaneous evaluation of both activating and inhibiting perturbations reveals direct and opposing phenotypic effects within complex gene networks. Thus, in contrast to loss-of-function-only analysis, these tools can switch the response of affected cells to either sensitisation or resistance allowing the discovery of key genes which sit in the centre of the hit nexus. These findings demonstrate the unique power of bi-directional functional genomic screening approaches.
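The bi-directional resistance/sensitisation readout described above rests on standard pooled-screen scoring: count each gRNA before and after selection and compute a per-guide log2 fold change, where enriched guides indicate resistance and depleted guides sensitisation. A minimal sketch (guide names, counts and the pseudocount are made up; production pipelines such as MAGeCK add statistical modelling on top):

```python
import math

def grna_log2_fold_changes(counts_before, counts_after, pseudocount=0.5):
    """Score gRNA enrichment/depletion in a pooled CRISPR screen.

    Counts are normalized to library size, then a log2 fold change of
    post- vs. pre-selection abundance is computed per gRNA. Positive
    values indicate enrichment (resistance), negative values depletion
    (sensitisation). The pseudocount guards against zero counts.
    """
    n0 = sum(counts_before.values())
    n1 = sum(counts_after.values())
    lfc = {}
    for g in counts_before:
        f0 = (counts_before[g] + pseudocount) / n0
        f1 = (counts_after.get(g, 0) + pseudocount) / n1
        lfc[g] = math.log2(f1 / f0)
    return lfc

# Hypothetical counts from a drug-resistance selection
before = {"gRNA_A": 500, "gRNA_B": 500, "gRNA_C": 500}
after  = {"gRNA_A": 1200, "gRNA_B": 280, "gRNA_C": 20}
lfc = grna_log2_fold_changes(before, after)
print(max(lfc, key=lfc.get))  # 'gRNA_A' (most enriched -> resistance)
```

Running the same scoring on a CRISPRa library alongside a knock-out/CRISPRi library is what allows the opposing-direction comparison the abstract describes.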
The application of these tools to new therapeutic areas is expected to yield crucial new target ID. A major global research focus is in immuno-oncology and the discovery of new immuno-oncology drug targets, including those that alter the character and frequency of T-cell-mediated anti-tumour responses. Although we and others have been able to develop tools that allow highly efficient gene editing of primary T-cells by CRISPR–Cas9, the application of pooled functional genomic screening to primary T-cells has proved a technological hurdle. We have optimised and substantially adapted our pooled CRISPR screening platform to the particular challenge of primary T-cell biology and we will present an update on this promising new capability.
Horizon Discovery Ltd
After completing his PhD, Ben trained as a postdoc at the University of Cambridge, where his focus was reverse chemical genetic screening, uncovering a novel mechanism for inhibition of the unfolded protein response. Ben joined Horizon in 2013 to expand and develop Horizon’s functional genomics platforms and to lead a major research alliance in synthetic lethal target discovery. Ben now leads and manages Horizon’s functional genomic screening group.
Current landscape and future opportunities in implementing human microphysiological models in pre-clinical drug development
Contains 1 Component(s) | Recorded On: 02/07/2018
This presentation provides a perspective on the breadth of new opportunities available for the integration of 3D human in vitro models within drug discovery and the related challenges in adoption. It will introduce key technological background and the advantages/limitations of each novel 3D human in vitro model, with examples from recent studies or cases.
The pharmaceutical industry still faces great challenges owing to high R&D costs and low overall success rates of clinical compounds during drug development. In phase I clinical trials the majority of failures are due to safety-related issues, while more than 50% of failures in phase II and III clinical trials are due to a lack of efficacy and a quarter to safety issues, where safety includes failures due to an insufficient therapeutic index. Drug failures in clinical trials stem largely from the poor translational relevance and clinical predictive power of existing preclinical models, which include human cell-based in vitro and animal models. The drug discovery community has recognized the critical need for new testing approaches to generate more translatable and reliable predictions of drug efficacy and safety in humans. This has driven the recent advancements in cell biology, tissue engineering, biomaterials, and emerging platforms such as microfabrication, microfluidics and bioprinting in the development of innovative in vitro technologies that more closely recapitulate human tissues and organs. These three-dimensional (3D) human in vitro models, such as 3D spheroids/organoids, organs-on-chips, and bioprinted tissues, could provide the basis for preclinical assays with greater translatability and predictive power. They could be applied for greater insight into mechanisms of human disease, mechanisms of toxicity or for early confirmation of new therapy efficacy. I will provide a perspective on the breadth of new opportunities available for the integration of these 3D human in vitro models within drug discovery and the related challenges in adoption. I will introduce key technological background and the advantages/limitations of each novel 3D human in vitro model, with examples from recent studies or cases.
Furthermore, I will discuss the essential validation process for these 3D human in vitro technologies, the importance of integrating various models, and their translatability to the clinic. I will conclude by examining how 3D in vitro technology will begin to tackle major technical challenges at critical steps of the conventional and evolving drug discovery process.
R&D Platform Technology & Science, GlaxoSmithKline
Dr Ekert leads an integrated enterprise strategy for sustained, portfolio-driven growth in the R&D application of human-relevant and translatable complex in vitro models. Dr Ekert’s group drives the coordination and prioritization of the development and integrated use of complex in vitro technologies for efficacy, safety and biometabolism studies. Dr Ekert received his PhD in Medical Science from Adelaide University in Australia. He performed post-doctoral training at the University of California, Davis and the Coriell Institute for Medical Research. Before coming to GSK, Dr Ekert worked for 11 years at Janssen BioTherapeutics in early biotherapeutic drug discovery, carrying out target discovery, drug validation and mechanism of action studies applying 3D cell cultures, iPSCs and primary cells in complex cell-based assays across multiple therapeutic areas. His current focus at GSK is to improve the predictive validity of early preclinical models, leading to better-characterized molecules, decreased R&D cycle time and a reduction in attrition.