ENBIS-15 in Prague

6 – 10 September 2015; Prague, Czech Republic
Abstract submission: 1 February – 3 July 2015


The following abstracts have been accepted for this event:

  • Statistical Thinking and Big Data

    Authors: Vladimir Shper (Moscow Institute of Steel & Alloys), Yuri Adler (Moscow Institute of Steel & Alloys)
    Primary area of focus / application: Education & Thinking
    Keywords: Big Data, Statistical thinking, Contradiction
    Submitted at 27-May-2015 11:17 by Vladimir Shper
    Accepted
    9-Sep-2015 09:00 Statistical Thinking and Big Data
    Big Data is an important new area concerned with gathering and analysing huge amounts of data. It opens unprecedented opportunities in all fields of human activity. The supporters of Big Data are sure that these opportunities will radically change all sides of human life. Moreover, the euphoria over seemingly unlimited achievements has led to the assertion that all old theoretical models should be swept away and that all we now need is the analysis of correlations. The followers of statistical thinking, on the other hand, view this situation differently. They are sure that without some principles of statistical thinking the Big Data approach will not reach the summits it could otherwise reach. This work is aimed at a discussion of this contradiction. We think that the advocates of Big Data should take the statistical thinking viewpoint into consideration.
  • Some Thoughts about Phase I Analysis

    Authors: Vladimir Shper (Moscow Institute of Steel & Alloys), Yuri Adler (Moscow Institute of Steel & Alloys)
    Primary area of focus / application: Education & Thinking
    Keywords: Shewhart, Control, Chart, Phase I, Analysis
    Submitted at 27-May-2015 11:29 by Vladimir Shper
    Accepted
    9-Sep-2015 09:40 Some Thoughts about Phase I Analysis
    It is well known that the application of SPC in practice consists of two stages: Phase I, the retrospective phase, and Phase II, the prospective or monitoring phase. There are two different methods for constructing control limits in Phase I: Method I, discussed by Hillier in 1969, is based on the False Alarm Rate (FAR); Method II, discussed by Chakraborti et al. in 2009, is based on the False Alarm Probability (FAP). These two methods lead to very different values of the control limits and the concomitant probabilities. The goal of this work is to discuss these two approaches and to recommend which one should be preferred. Additionally, we discuss the problem of sample size for Phase I analysis and present our recommendations on this issue.
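
    The contrast between the two criteria can be sketched numerically. The following is an illustrative sketch only, not the authors' method: it assumes m independent, standard-normal charting statistics and compares the width of limits that fix the per-point FAR (Method I) against limits that control the overall FAP across all m points (Method II).

```python
# Illustrative FAR vs. FAP comparison (assumed setup: m independent
# standard-normal Phase I charting statistics).
from scipy.stats import norm

m = 25          # number of Phase I subgroups (assumed)
far = 0.0027    # per-point False Alarm Rate (Method I)
fap = 0.05      # overall False Alarm Probability (Method II)

# Method I: fix the false alarm rate for each individual point.
L_far = norm.ppf(1 - far / 2)

# Method II: control the probability of *any* false alarm among the m
# points; under independence, per-point rate = 1 - (1 - FAP)**(1/m).
alpha_point = 1 - (1 - fap) ** (1 / m)
L_fap = norm.ppf(1 - alpha_point / 2)

print(f"Method I (FAR={far}): limits at +/-{L_far:.3f} sigma")
print(f"Method II (FAP={fap}, m={m}): limits at +/-{L_fap:.3f} sigma")
```

    Even in this simplified setting the two criteria give noticeably different limits, which is the discrepancy the talk addresses.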
  • The Energy Ladder: A Model for Projecting Energy Demand

    Authors: Tashi Erdmann (Shell Global Solutions International), Martin Haigh (Shell International)
    Primary area of focus / application: Economics
    Secondary area of focus / application: Modelling
    Keywords: Econometrics, Regression analysis, Panel data analysis, Mixed models, Applied statistics, Curve fitting, Energy economics
    Submitted at 29-May-2015 10:18 by Tashi Erdmann
    Accepted
    7-Sep-2015 17:00 The Energy Ladder: A Model for Projecting Energy Demand
    Shell’s scenarios team develops long-term energy scenarios, intended to draw attention to important strategic issues. A core element of Shell’s World Energy Model for projecting energy demand is the so-called energy ladder, describing the relation between energy demand and economic development. This talk describes the statistical methods used to develop these energy ladders and the way they are used for quantifying Shell’s energy scenarios.
    Separate energy ladders are used for all end-use sectors in 100 different countries and regions, each represented by an S-shaped curve. The S-curve has a number of parameters that allow it to be different across countries and sectors. The parameters for each sector and country are estimated by a combination of econometric panel data analysis of historical data and energy analysts’ expert opinions based on scenarios of future development. The historical data includes energy demand, real income per capita and a number of other explanatory variables, such as the price of energy, energy efficiency, population density (e.g. affecting travel needs), and climate (e.g. affecting heating needs). The resulting energy ladders are then used to project each country’s aggregate energy demand into the future.
    The energy ladders indicate that global energy demand may double over the first half of this century. In China and India in particular, the world’s two most populous countries, energy demand per capita is expected to rise strongly, as their economies are on the steepest part of the energy ladder. Governments and the energy industry face the difficult challenge of ensuring sufficient affordable energy in an environmentally sustainable way.
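
    The core curve-fitting idea can be sketched in a few lines. This is a minimal, hedged illustration: the logistic functional form, parameter names, and all data below are assumptions for demonstration, not Shell's actual World Energy Model.

```python
# Minimal "energy ladder" sketch: fit an S-shaped (logistic) curve of
# per-capita energy demand against per-capita income on synthetic data.
import numpy as np
from scipy.optimize import curve_fit

def s_curve(income, saturation, steepness, midpoint):
    """Logistic energy ladder: demand saturates at high income."""
    return saturation / (1 + np.exp(-steepness * (income - midpoint)))

rng = np.random.default_rng(0)
income = np.linspace(1, 60, 40)                  # GDP per capita, kUSD (assumed)
true = s_curve(income, 200.0, 0.15, 25.0)        # GJ per capita (assumed)
demand = true + rng.normal(0, 5, income.size)    # noisy observations

params, _ = curve_fit(s_curve, income, demand, p0=[150, 0.1, 20])
saturation, steepness, midpoint = params
print(f"saturation ~ {saturation:.0f} GJ/capita, midpoint ~ {midpoint:.1f} kUSD")
```

    In the talk's setting, a curve like this would be estimated per sector and country from panel data, with expert judgement shaping the parameters where history is thin.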
  • Optimal Statistical Power for Evolutionary Operation Methods

    Authors: Koen Rutten (KU Leuven), Bart De Ketelaere (KU Leuven)
    Primary area of focus / application: Modelling
    Secondary area of focus / application: Design and analysis of experiments
    Keywords: EVOP, DOE, Power, Simulation
    Submitted at 29-May-2015 10:39 by Koen Rutten
    Accepted
    8-Sep-2015 10:50 Optimal Statistical Power for Evolutionary Operation Methods
    At previous ENBIS conferences, Evolutionary Operation (EVOP) was presented by the authors as a method for online process improvement. In contemporary processes, where a large number of potentially interacting factors is the norm, the classical EVOP scheme becomes infeasible because it uses a full factorial design, which requires an unacceptably high number of measurements in each phase. Alternative designs have been proposed, typically with a much lower sample size per phase. This lower sample size inevitably results in a lower power to detect the direction of improvement. As far as the authors are aware, no research has been performed on the optimal power required in EVOP. Based on these observations, the main goal of this contribution is to determine the optimal statistical power in an EVOP improvement.

    We base our work on extensive simulation studies in which the influence of the dimensionality k of the process and of the statistical power on the efficiency of the improvement is investigated. This study shows that the optimal power is lower than what is classically chosen when designing experiments. For low dimensionality (up to 7 dimensions), the optimal power can be as low as 10%, but it increases with increasing k. For such higher dimensionalities the influence of the power is limited within a broad range between 0.4 and 0.8. The choice of power is then determined entirely by the type of process under study. For processes with a low sampling rate, or for non-stationary processes that are prone to time drift, a low power is recommended, so that each phase is concluded in a short time span. For processes with high sampling rates, a higher power is advised.
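
    The kind of simulation described can be sketched for the simplest case of a single EVOP factor. This is a rough illustration under assumed effect size and noise, not the authors' study: it estimates how often a phase of n runs per level identifies the correct direction of improvement.

```python
# Toy EVOP power simulation (assumed: one factor, two levels, a true
# improvement of `effect` in units of the noise standard deviation).
import numpy as np

rng = np.random.default_rng(42)

def direction_power(n_per_level, effect=0.5, sigma=1.0, n_sim=20_000):
    """Fraction of simulated phases where the truly better level wins."""
    low = rng.normal(0.0, sigma, (n_sim, n_per_level)).mean(axis=1)
    high = rng.normal(effect, sigma, (n_sim, n_per_level)).mean(axis=1)
    return float(np.mean(high > low))

for n in (2, 5, 10, 20):
    print(f"n = {n:2d} runs/level -> P(correct direction) ~ {direction_power(n):.2f}")
```

    Even a small phase picks the right direction well above chance, which is why the optimal power in EVOP can sit far below the conventional 80%.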
  • Statistical Evaluation of Binary Tests when a Gold Standard is Unavailable

    Authors: Thomas Akkerhuis (IBIS UvA)
    Primary area of focus / application: Metrology & measurement systems analysis
    Keywords: Measurement system analysis, Pass/fail inspection, Systematic measurement error, Random measurement error, Gold standard unavailable
    Submitted at 29-May-2015 12:08 by Thomas Akkerhuis
    Accepted
    9-Sep-2015 09:00 Statistical Evaluation of Binary Tests when a Gold Standard is Unavailable
    The statistical evaluation of the reliability of binary tests, such as pass/fail inspections, is challenging. Such an evaluation typically quantifies both systematic and random measurement error. Our focus is on “gold standard unavailable” situations: the true condition of the inspected items is unobservable. We present a comparative study of some approaches that are designed to deal with this challenge.

    We have found that, due to the unavailability of a gold standard, strict assumptions need to be satisfied in order to reliably estimate the traditional False Acceptance Probability (FAP) and False Rejection Probability (FRP), and we found these assumptions to be quite unrealistic. Our striking conclusion is that only the random component of measurement error can be reliably estimated when a gold standard is not available.
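
    A toy simulation makes the conclusion concrete. All numbers below are assumptions for illustration: with repeated pass/fail inspections of the same items but no gold standard, the disagreement rate between repeats reveals the random error component, while any systematic bias remains invisible.

```python
# Toy gold-standard-unavailable setup: each inspection flips the (hidden)
# true verdict with probability p_flip; only repeat agreement is observed.
import numpy as np

rng = np.random.default_rng(1)
n_items, n_repeats = 500, 2
true_fail = rng.random(n_items) < 0.10        # unobservable truth (assumed)
p_flip = 0.08                                  # random error per inspection

flips = rng.random((n_items, n_repeats)) < p_flip
observed = true_fail[:, None] ^ flips          # XOR: flip verdict on error

disagree = float(np.mean(observed[:, 0] != observed[:, 1]))
# P(disagree) = 2 * p * (1 - p), so invert the quadratic to recover p.
p_hat = (1 - np.sqrt(1 - 2 * disagree)) / 2
print(f"disagreement rate {disagree:.3f} -> estimated random error {p_hat:.3f}")
```

    Note that a systematic error (e.g. a shifted acceptance threshold) would leave the disagreement rate unchanged, which is exactly why only the random component is estimable here.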
  • Application of Response Surface Methodology in Polymerization Conditions Optimization for Production of Polyester Fabric/Polypyrrole Composites

    Authors: Veronika Safarova (Technical University of Liberec), Gejza Dohnal (Czech Technical University in Prague), Maros Tunak (Technical University of Liberec)
    Primary area of focus / application: Design and analysis of experiments
    Secondary area of focus / application: Metrology & measurement systems analysis
    Keywords: Conductive fabrics, Design of Experiments, Electromagnetic shielding effectiveness, Chemical polymerization, Polyester, Polypyrrole, Response surface methodology
    Submitted at 29-May-2015 14:46 by Veronika Safarova
    Accepted
    7-Sep-2015 16:15 Application of Response Surface Methodology in Polymerization Conditions Optimization for Production of Polyester Fabric/Polypyrrole Composites
    Of all known conducting polymers, polypyrrole is the most frequently used in commercial applications, owing to the long-term stability of its conductivity and the possibility of forming composites with improved mechanical properties. One of the most widely used approaches for fabricating electrically conductive textile composites from conductive polymers is to apply a submicron-thick coating of the conducting polymer onto an existing textile substrate. Response surface methodology was used to optimize the polymerization conditions (concentrations of particular chemical substances, process parameters of the polymerization) for enhancing the electric conductivity of polyester fabric/polypyrrole composites. In the first step of the optimization, a screening experiment was used to identify the factors of the polymerization process which have significant effects on the resulting electric conductivity of the composites. In the second step, a Box-Behnken design and response surface methodology were applied to determine the optimal setting of each significant variable. The experimental data were used to derive an empirical model linking the outputs and inputs. The proposed model can be used to predict the electric conductivity of PET/PPy textile composites at specific settings of the input parameters. Using this model, optimized parameters for creating a polypyrrole/polyester textile composite with the desired electrical conductivity were determined and successfully validated experimentally. Moreover, the electromagnetic shielding effectiveness, which is connected to the electric conductivity of the coated samples, was evaluated. In addition, scanning electron microscopy images of the samples were used for further validation of the polymerization process.
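
    The second optimization step can be sketched generically. This is a hedged illustration, not the authors' experiment: a three-factor Box-Behnken design (factor identities assumed) with a synthetic conductivity response, and a full quadratic response-surface model fitted by least squares.

```python
# Generic 3-factor Box-Behnken design + quadratic RSM fit on synthetic data.
import itertools
import numpy as np

# Box-Behnken runs: all edge midpoints (+/-1 in two factors, 0 in the
# third) plus three center points -> 15 runs in total.
runs = []
for i, j in itertools.combinations(range(3), 2):
    for a, b in itertools.product((-1, 1), repeat=2):
        x = [0, 0, 0]
        x[i], x[j] = a, b
        runs.append(x)
runs += [[0, 0, 0]] * 3
X = np.array(runs, dtype=float)

def quad_terms(X):
    """Full quadratic model matrix: 1, x_i, x_i^2, x_i*x_j."""
    cols = [np.ones(len(X))]
    cols += [X[:, i] for i in range(3)]
    cols += [X[:, i] ** 2 for i in range(3)]
    cols += [X[:, i] * X[:, j] for i, j in itertools.combinations(range(3), 2)]
    return np.column_stack(cols)

# Synthetic "conductivity" response with a known optimum at x0 = 0.5.
rng = np.random.default_rng(7)
y = 10 - (X[:, 0] - 0.5) ** 2 - X[:, 1] ** 2 - X[:, 2] ** 2 \
    + rng.normal(0, 0.05, len(X))

beta, *_ = np.linalg.lstsq(quad_terms(X), y, rcond=None)
# Stationary point of the fitted surface along factor 0: -b1 / (2 * b11).
x0_opt = -beta[1] / (2 * beta[4])
print(f"fitted optimum for factor 0 ~ {x0_opt:.2f} (true 0.5)")
```

    The fitted quadratic surface plays the same role as the empirical model in the abstract: its stationary point gives the recommended setting of each significant factor.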