TACC, XSEDE, and ultimately the NSF care greatly about how effectively funded computing resources are used by domain scientists. Part of my role at TACC is to help those domain scientists use TACC's hardware more effectively. This talk will consist mostly of anecdotes and personal experiences interacting with TACC's user community in the context of scientific software optimisation. The goal of the talk is to educate the audience on a) some (opinionated) scientific software best practices; and b) resources available to help you make your software more efficient.
Damon McDougall holds a Ph.D. in Mathematics from the University of Warwick. He moved to the US in 2012 as a postdoctoral research fellow under Professor Moser, later became a Research Associate, and recently joined TACC.
Probability Theory starts with a comprehensive set of all possible events, known as the sample space. Probabilities assigned to events in this sample space before any information is used to assess them are called Non-Informative Prior Probabilities. Bayes' Theorem can then be used to update these Non-Informative Prior Probabilities with all available information, including objective (data) and subjective (judgement) information.
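As a toy illustration of the Bayesian updating described above (a sketch for this abstract, not an example from the talk; the hypotheses and numbers are hypothetical):

```python
# Toy Bayesian update: two hypotheses about a coin, with equal
# (non-informative) prior probabilities.
prior = {"fair": 0.5, "biased": 0.5}        # P(hypothesis) before data
likelihood = {"fair": 0.5, "biased": 0.8}   # P(heads | hypothesis)

# Observe one head; Bayes' Theorem: posterior ∝ likelihood × prior.
unnormalized = {h: likelihood[h] * prior[h] for h in prior}
evidence = sum(unnormalized.values())       # P(heads), the normalizing constant
posterior = {h: unnormalized[h] / evidence for h in unnormalized}

print(posterior)  # the observed head shifts probability toward "biased"
```

The equal prior here is exactly the kind of assignment whose justification the next paragraph questions: why should "fair" and "biased" start out equally probable?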
A persistent challenge is how to establish Non-Informative Prior Probabilities. How do we include all possibilities and assess their probabilities a priori, without any information? How do we account for an unknown that is outside our range of experience? How do we include the possibility of black swans when we have only ever seen white swans? For centuries, a variety of extremely distinguished theoreticians, including Bernoulli, Keynes, Jaynes, and Raiffa, have attempted, but been unable, to overcome this challenge.
The goal of our research is to develop a theoretical basis, Decision Entropy Theory, to rationally and defensibly establish Non-Informative Prior Probabilities. The premise of Decision Entropy Theory is that probabilities provide input to decision making; therefore, non-informative probabilities are probabilities that do not inform a decision. The greatest lack of information for a decision is defined by the following three principles:
1. A decision alternative compared to another alternative is equally probable to be preferred or not to be preferred.
2. The possible gains or losses for one decision alternative compared to another alternative are equally probable.
3. The possibilities of learning about the preference of one decision alternative compared to another alternative with new information are equally probable.
The development of Decision Entropy Theory involves formulating these principles into a mathematical framework that describes the entropy (uncertainty) of a decision. The non-informative prior probabilities are found by maximizing the entropy of the decision.
This talk will provide practical examples to illustrate the challenge of establishing Non-Informative Prior Probabilities and to illustrate how Decision Entropy Theory attempts to address this challenge.
Robert B. Gilbert P.E., Ph.D., D.GE, M.ASCE is Chair of the Department of Civil, Architectural and Environmental Engineering at The University of Texas at Austin. He joined the faculty in 1993. He also practiced with Golder Associates Inc. as a geotechnical engineer from 1988 to 1993. His technical focus is the assessment, evaluation and management of risk for civil engineering systems. Recent activities include analyzing the performance of offshore platforms and pipelines in Gulf of Mexico hurricanes; managing flooding risks for levees in Texas, California, Washington and Louisiana; and performing a review of design and construction for the new Bay Bridge in San Francisco. Dr. Gilbert has been awarded the Norman Medal from the American Society of Civil Engineers and an Outstanding Civilian Service Medal from the United States Army Corps of Engineers.