3rd Annual UBC Symposium
Keynote speaker: Prof. Keith Baggerly - "When is Reproducibility an Ethical Issue? Genomics, Personalized Medicine, and Human Error"
- 13:00 - 13:15 Registration and coffee
- 13:15 - 13:30 Welcome by Prof. Berend Snel, chair of the UBC: UBC progress report
- 13:30 - 14:00 Bernd Helms (Veterinary Medicine, Utrecht University), “Lipidomic analysis reveals dynamic fluxes in lipid storage organelles of liver cells”
- 14:00 - 14:30 Lodewyk Wessels (Netherlands Cancer Institute), “Molecular networks and therapy response”
- 14:40 - 15:00 Rob de Boer (Theoretical Biology and Bioinformatics, Utrecht University), “Simple models and complicated bioinformatics to study diverse T cell repertoires”
- 15:00 - 15:30 Coffee / Tea
- 15:30 - 16:15 Keynote speaker: Keith Baggerly (MD Anderson Cancer Center, The University of Texas), “When is Reproducibility an Ethical Issue? Genomics, Personalized Medicine, and Human Error”
- 16:15 - 18:00 Reception
About the keynote:
Prof. Baggerly's research interests involve the analysis of high-throughput biological data and center on themes of experimental design and reproducible research. He is best known as a practitioner of "forensic bioinformatics", in which raw data and reported results are used to reconstruct what the methods must have been. He was the leading investigator exposing the flaws and shortcomings of research performed by Anil Potti at Duke University. In the end, this led to the retraction of four high-profile publications from Duke and the shutdown of three clinical trials that used these results. Prof. Baggerly is a strong advocate of sharing all data and code and having appropriate data management plans to ensure that research results are reusable and reproducible.
Modern high-throughput biological assays let us ask detailed questions about how diseases operate, and promise to let us personalize therapy. Careful data processing is essential, because our intuition about what the answers “should” look like is very poor when we have to juggle thousands of things at once. When documentation of such processing is absent, we must apply “forensic bioinformatics” to work from the raw data and reported results to infer what the methods must have been. We will present several case studies where simple errors may have put patients at risk. This work has been covered in both the scientific and lay press, prompted several journals to revisit the types of information that must accompany publications, and led to an Institute of Medicine (IOM) Review of the type of data that must be supplied before “omics”-based tests are used to guide patient care. We discuss steps we take to avoid such errors, and lessons that can be applied to large data sets more broadly.