ENBIS-16 in Sheffield, 11 – 15 September 2016. Abstract submission: 20 March – 4 July 2016
Best Manager Award (Luc Bijnens) and Young Statistician Award (Nicolas Bousquet), 13 September 2016, 17:10 – 18:00
Abstract of Luc Bijnens' talk:
Expediting Statistical Innovation Through Pre-Competitive Industrial and Academic Networks
Today biostatisticians are confronted with a new kind of demand, driven by the massive collection of data in internal and external research laboratories. The Internet connects scientists and laboratories who exchange big data, and over the last decades computing power has increased enormously. As a consequence, classical solutions do not necessarily accommodate model-based drug development and time-constrained interim analyses in the process of transforming data into knowledge. Statisticians play an important role in designing optimal experiments and in finding the best statistical models to analyze the data coming from those experiments. These data are often integrated with other sources of information coming from observational data collection. By publishing their work in peer-reviewed journals, biostatisticians can assist regulators in creating optimal and practical guidelines.
For all these reasons, pre-competitive collaborations in international networks among pharmaceutical, contract and academic statisticians are essential. Networks are modern operational solutions to contemporary challenges and opportunities in pharmaceutical research and development. Statistical innovation flourishes in networks because they are virtual, diverse and flexible, and because they address today's statistical challenges. The aim of the talk is to show the tip of the iceberg of this multitude of scientific and operational challenges and opportunities. Case studies will be taken from discovery and early-development research.
Abstract of Nicolas Bousquet's talk:
How Geometric Properties of Black-Box Computer Models can Help to Provide Conservative Reliability Assessments in Time-Consuming Situations
Computing the probability of undesirable, unobserved events is a common task in structural reliability engineering. When dealing with major risks occurring with low probability, the lack of observed failure data often requires the use of so-called computer models that reproduce the phenomenon of interest. Simulating their uncertain inputs, modeled as random variables, makes it possible to compute statistical estimators of a probability or a quantile. Usually these complex objects can be described as time-consuming black boxes. Therefore, exploring the input configurations leading to failure in a non-intrusive way requires running the model over a design of numerical experiments. Classical (quasi) Monte Carlo designs cannot practically be used to explore configurations linked to low probabilities, since they require computational budgets that are too high. Numerous sampling techniques have therefore been proposed in the literature to reduce the computational cost while preserving the precision of the estimates. Most of them are based on choosing the elements of the design sequentially, by maximizing an expected gain in information at each step.
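To illustrate the budget problem described above, here is a minimal crude Monte Carlo sketch. The limit-state function `g` is a toy assumption for illustration only, not an example from the talk: the system is deemed to fail when `g(x) <= 0`, and the failure probability is estimated by the empirical failure frequency.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy limit-state function (an assumption for illustration):
# the system fails when g(x) <= 0.
def g(x):
    return 4.0 - x.sum(axis=-1)

n = 1_000_000                           # crude Monte Carlo needs large budgets
x = rng.standard_normal((n, 2))         # uncertain inputs modeled as random variables
failures = g(x) <= 0
p_hat = failures.mean()                 # Monte Carlo estimator of the failure probability
se = np.sqrt(p_hat * (1 - p_hat) / n)   # its standard error
print(f"p_hat = {p_hat:.2e} +/- {se:.1e}")
```

Here each evaluation of `g` is cheap, so a million runs are affordable; when `g` is a time-consuming black box, this budget is out of reach, which is what motivates the sequential design strategies mentioned in the abstract.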
This talk will give an overview of the most recent results available when geometric constraints, such as monotonicity or convexity, can be detected or assumed on the limit (failure) state surface. Deterministic (conservative) bounds on probabilities or quantiles can be produced and narrowed by successive runs; they become almost-sure bounds when the design is chosen stochastically, and particular attention is paid to the parallel production of consistent estimators. Furthermore, a consistent estimate of the failure state surface can be produced, at a known convergence rate, by a combination of constrained Support Vector Machines (SVM).
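The following sketch shows, under stated assumptions, how a monotonicity constraint can yield conservative bounds without extra model runs; the grid design and the function `g` are hypothetical, and this is only the basic idea, not the talk's actual algorithms. If `g` is nonincreasing in each input, any point componentwise above a known failure point must also fail, and any point componentwise below a known safe point must be safe, so fresh samples can be classified without calling `g`.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy limit-state function, nonincreasing in each input (assumed for illustration):
def g(x):
    return 4.0 - x.sum(axis=-1)

# Small, "expensive" design of experiments: an 8 x 8 grid over [-3, 3]^2.
axis = np.linspace(-3.0, 3.0, 8)
design = np.array([(a, b) for a in axis for b in axis])
vals = g(design)
fail_pts = design[vals <= 0]   # observed failure points
safe_pts = design[vals > 0]    # observed safe points

# Large cheap sample, classified WITHOUT any further call to g:
x = rng.standard_normal((100_000, 2))
# dominates a failure point componentwise => certain failure (by monotonicity)
certain_fail = (x[:, None, :] >= fail_pts[None, :, :]).all(-1).any(-1)
# dominated by a safe point componentwise => certainly safe
certain_safe = (x[:, None, :] <= safe_pts[None, :, :]).all(-1).any(-1)

p_low = certain_fail.mean()        # conservative lower bound on P(failure)
p_up = 1.0 - certain_safe.mean()   # conservative upper bound on P(failure)
print(f"{p_low:.4f} <= P(failure) <= {p_up:.4f}")
```

The gap between the two bounds narrows as the design grows; sequential strategies of the kind the abstract describes choose each new model run precisely to shrink this gap as fast as possible.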
Beyond presenting these recent theoretical results, the talk will exhibit the good results obtained on several toy examples and real case studies from structural reliability engineering, as well as other possible applications in the broader field of applied decision theory.