ENBIS-16 in Sheffield

11 – 15 September 2016, Sheffield
Abstract submission: 20 March – 4 July 2016

My abstracts

 

The following abstracts have been accepted for this event:

  • Optimal Sensor Placement in Smart Grids: An Approach Based on Semi-Definite Relaxation and Sensitivity Analysis

    Authors: Stephane Chretien (National Physical Laboratory), Paul Clarkson (National Physical Laboratory), Alistair Forbes (National Physical Laboratory), Jonathan Black (National Physical Laboratory)
    Primary area of focus / application: Design and analysis of experiments
    Secondary area of focus / application: Metrology & measurement systems analysis
    Keywords: Sensor placement, Constrained estimation, Semi-definite programming relaxations, Conic eigenvalues.
    Submitted at 26-May-2016 19:50 by Stephane Chretien
    Accepted
    14-Sep-2016 09:40 Optimal Sensor Placement in Smart Grids: An Approach Based on Semi-Definite Relaxation and Sensitivity Analysis
    Smart grids have recently been the subject of extensive research activity. The goal of the present work is to study the sensor placement problem from a design-of-experiments viewpoint. This problem is of paramount importance for monitoring purposes and has been addressed thoroughly in the case of PMUs (phasor measurement units), where the observations are linear functions of the voltages. We propose a new method to address the case where power is the observed quantity, i.e. the observations become quadratic in the voltages. Based on the asymptotic theory of constrained estimation, we devise a new criterion for the sensor placement problem that takes the nonlinearity of the sensing process into account. Our method is based entirely on convex optimization in order to avoid possible spurious solutions.
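    To make the flavour of such convex relaxations concrete, the sketch below shows a generic D-optimal sensor-selection relaxation for the linear (PMU-style) observation model, in the spirit of sensor selection via convex optimization. It is an illustration only, not the authors' quadratic-observation criterion; the measurement matrix A, the budget k and the use of cvxpy are assumptions.

    ```python
    # Sketch: D-optimal sensor selection via convex relaxation (linear observation model).
    # Illustrative only -- not the quadratic-in-voltage criterion described in the abstract.
    import numpy as np
    import cvxpy as cp

    rng = np.random.default_rng(0)
    m, n, k = 30, 6, 8                # m candidate sensors, n state variables, k sensors to place
    A = rng.standard_normal((m, n))   # row i = sensitivity of candidate sensor i to the state

    w = cp.Variable(m)                # relaxed 0/1 placement indicators
    fisher = sum(w[i] * np.outer(A[i], A[i]) for i in range(m))  # information matrix
    problem = cp.Problem(cp.Maximize(cp.log_det(fisher)),
                         [cp.sum(w) == k, w >= 0, w <= 1])
    problem.solve()

    chosen = np.argsort(w.value)[-k:]  # round: keep the k largest weights
    print("selected sensors:", sorted(chosen.tolist()))
    ```

    In practice the rounding step would be refined (e.g. by local swaps), and, as in the abstract, the linear information matrix would be replaced by one reflecting the quadratic power measurements.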
  • Quantification of Carbon Emissions and Savings in Smart Grids

    Authors: Eng Lau (Queen Mary University of London), Qingping Yang (Brunel University London), Gareth Taylor (Brunel University London), Alistair Forbes (National Physical Laboratory), Valerie Livina (National Physical Laboratory)
    Primary area of focus / application: Modelling
    Secondary area of focus / application: Modelling
    Keywords: Carbon emissions, Carbon savings, Ensemble optimisation, Ensemble Kalman filter, Mathematical modelling, Smart grids
    Submitted at 26-May-2016 23:04 by Eng Lau
    Accepted (view paper)
    12-Sep-2016 10:20 Quantification of Carbon Emissions and Savings in Smart Grids
    The UK national energy system experiences large and increasing loads on the infrastructure, as well as uncertainty in energy consumption due to variable demand in colder seasons and at peak hours of the day, the contribution of intermittent green generators, insufficient storage facilities, and the increasing complexity of the grid components. Such factors have led to economic and environmental stresses on the national energy system, among which is the need to reduce the carbon emissions produced during energy generation. To address this problem, we model and quantify carbon emissions and carbon savings in smart grids.

    We define carbon emissions as the product of the activity (energy) and the corresponding carbon factor. Carbon savings are estimated as the difference between the conventional and improved energy usage multiplied by the corresponding carbon factor. Given high-resolution energy generation data (Elexon portal), we estimate a dynamic grid carbon factor based on the available national fuel mix, with quantification of uncertainties, given the known ranges of carbon factors for each fuel.
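    As a rough numerical illustration of these definitions (the fuel-mix figures and carbon factors below are placeholder values, not the Elexon data used in the work):

    ```python
    # Sketch: grid carbon factor, emissions and savings from a fuel mix.
    # All numbers are illustrative placeholders, not the values used in the study.
    generation_mwh = {"coal": 5000.0, "gas": 12000.0, "nuclear": 7000.0, "wind": 6000.0}
    carbon_factor = {"coal": 0.90, "gas": 0.40, "nuclear": 0.0, "wind": 0.0}  # tCO2/MWh (approx.)

    total_gen = sum(generation_mwh.values())
    grid_factor = sum(generation_mwh[f] * carbon_factor[f] for f in generation_mwh) / total_gen

    conventional_mwh = 1000.0   # energy use under the conventional routine
    improved_mwh = 850.0        # energy use under the smart solution

    emissions = conventional_mwh * grid_factor                 # emissions = activity x carbon factor
    savings = (conventional_mwh - improved_mwh) * grid_factor  # savings = avoided energy x carbon factor

    print(f"grid carbon factor: {grid_factor:.3f} tCO2/MWh")
    print(f"emissions: {emissions:.1f} tCO2, savings: {savings:.1f} tCO2")
    ```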

    An adaptive seasonal model based on the hyperbolic tangent function (HTF) is developed to reproduce seasonal trends of electricity consumption and the resultant carbon emissions for groups of consumers. Energy consumption and generation data are forecast and assimilated using the ensemble Kalman filter (EnKF). Numerical optimisation of carbon savings is further performed following the ensemble-based Closed-loop Production Optimisation Scheme (EnOpt), where the EnKF is combined with the EnOpt procedure. The EnOpt involves the optimisation of fuel costs and carbon emissions in the smart grid subject to the operational control constraints. The proposed approach addresses the complexity and diversity of the power grid and may be implemented at the level of the transmission operator in collaboration with the operational wholesale electricity market and distribution network operators.
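    A minimal sketch of a tanh-based seasonal profile fit is given below. The functional form (a smooth winter-to-summer transition and back, built from two hyperbolic tangents per year) and the synthetic data are assumptions for illustration; they are not necessarily the HTF parameterisation, nor the EnKF/EnOpt machinery, used in the work.

    ```python
    # Sketch: fitting a hyperbolic-tangent (HTF) style seasonal profile to daily consumption.
    # Functional form and data are illustrative assumptions only.
    import numpy as np
    from scipy.optimize import curve_fit

    def htf_season(day, base, drop, t_spring, t_autumn, width):
        """Smooth high-winter / low-summer profile from two tanh transitions per year."""
        d = day % 365.0
        down = 0.5 * (1 + np.tanh((d - t_spring) / width))   # winter -> summer transition
        up = 0.5 * (1 + np.tanh((d - t_autumn) / width))     # summer -> winter transition
        return base - drop * down + drop * up

    rng = np.random.default_rng(1)
    days = np.arange(2 * 365)
    true = htf_season(days, base=35.0, drop=10.0, t_spring=110.0, t_autumn=290.0, width=25.0)
    obs = true + rng.normal(0.0, 1.5, size=days.size)        # noisy daily consumption

    popt, _ = curve_fit(htf_season, days, obs, p0=[30.0, 5.0, 100.0, 280.0, 20.0])
    print("fitted parameters (base, drop, t_spring, t_autumn, width):", np.round(popt, 1))
    ```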

    As an application, we quantify carbon emissions and savings in demand response (DR) programmes, such as Short Term Operating Reserve (STOR), Triad, Fast Reserve, Frequency Control by Demand Management (FCDM) and the smart meter roll-out (Irish pilot project). DR programmes are modelled with appropriate configurations and assumptions on the technological cycles of power plants used in the energy industry. This enables the comparison of emissions between the conventional routines and the smart solutions applied, thus deriving carbon savings. Several industrial case studies of DR participants have been successfully performed.

    Uncertainty estimation is performed for the carbon factors of the individual fuels used in electricity generation at specific power plants, based on the available data. Monte Carlo simulations with random sampling are performed in order to quantify the corresponding uncertainties in the resultant carbon emissions and savings. This enables the comparison of carbon emissions between the conventional and the improved solutions, with quantification of uncertainties.
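    A minimal sketch of this kind of Monte Carlo propagation is shown below; the carbon-factor ranges and fuel shares are placeholder values, not those derived from the plant data.

    ```python
    # Sketch: Monte Carlo propagation of carbon-factor uncertainty into emission savings.
    # Carbon-factor ranges and fuel shares are illustrative placeholders.
    import numpy as np

    rng = np.random.default_rng(2)
    n_draws = 10_000
    avoided_energy_mwh = 150.0                       # conventional minus improved usage

    # Assumed uniform ranges (tCO2/MWh) for the fuels contributing to the avoided energy
    factor_ranges = {"coal": (0.85, 0.95), "gas": (0.35, 0.45)}
    fuel_shares = {"coal": 0.3, "gas": 0.7}          # share of the avoided energy per fuel

    draws = sum(
        fuel_shares[f] * rng.uniform(lo, hi, n_draws)
        for f, (lo, hi) in factor_ranges.items()
    )
    savings = avoided_energy_mwh * draws             # tCO2 saved, one value per draw

    print(f"savings: {savings.mean():.1f} tCO2 "
          f"(95% interval {np.percentile(savings, 2.5):.1f}-{np.percentile(savings, 97.5):.1f})")
    ```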
  • Monitoring a Wind Turbine by Combining Sensor Data

    Authors: Stella Kapodistria (Eindhoven University of Technology), Alessandro Di Bucchianico (Eindhoven University of Technology), Thomas Kenbeek (Eindhoven University of Technology)
    Primary area of focus / application: Process
    Secondary area of focus / application: Reliability
    Keywords: Wind turbine, Statistical process control, Reliability, Sensor data, Monitoring, Maintenance
    Submitted at 27-May-2016 12:18 by Alessandro Di Bucchianico
    Accepted
    14-Sep-2016 09:00 Monitoring a Wind Turbine by Combining Sensor Data
    Undetected damage to parts of a wind turbine, such as blade cracks due to lightning or broken gear wheels, may have disastrous consequences, possibly leading to the loss of the entire wind turbine. It is therefore important to continuously monitor the condition of wind turbines, in particular when they are placed at remote locations (e.g., off-shore wind farms). Technological advances make it economically feasible to equip wind turbines with sensors for various physical variables (including vibration).
    We describe our experiences in applying Statistical Process Control to monitor the condition of wind turbines in the Netherlands that are equipped with various sensors. Our approach is based on jointly monitoring variables, using regression analysis to correct for external influences. This was an eye-opener for the wind turbine engineers, who were used to thinking in terms of threshold values for individual sensor variables. Analysis of historical data showed that malfunctioning of one of the generators of a specific wind turbine could have been detected several months before the actual breakdown of the complete gearbox. This research was performed within the DAISY4OFFSHORE (Dynamic Asset Information System for Offshore Wind Farm Optimisation) project funded by the Dutch government through its “Wind at Sea” Top Consortium for Knowledge and Innovation. The work of Kapodistria is also supported by the Dutch Science Foundation Gravitation Project “Networks” (www.thenetworkcenter.nl).
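    A minimal sketch of this regression-adjusted monitoring idea is given below, using synthetic data: a turbine variable (here a hypothetical gearbox temperature) is regressed on external influences (wind speed, ambient temperature) and the residuals are monitored with a simple Shewhart-type chart. It illustrates the general approach, not the project's actual model or data.

    ```python
    # Sketch: regression-adjusted control chart for a turbine sensor variable.
    # Synthetic data; illustrative of the approach, not the project's model.
    import numpy as np

    rng = np.random.default_rng(3)
    n = 500
    wind = rng.uniform(3, 20, n)                 # external influence 1: wind speed (m/s)
    ambient = rng.uniform(-5, 25, n)             # external influence 2: ambient temperature (C)
    gearbox_temp = 40 + 1.2 * wind + 0.5 * ambient + rng.normal(0, 1.5, n)
    gearbox_temp[400:] += 6.0                    # injected drift: incipient fault after t = 400

    # Fit the regression on an in-control reference window, then monitor residuals
    X = np.column_stack([np.ones(n), wind, ambient])
    beta, *_ = np.linalg.lstsq(X[:300], gearbox_temp[:300], rcond=None)
    resid = gearbox_temp - X @ beta
    sigma = resid[:300].std(ddof=1)

    alarms = np.where(np.abs(resid) > 3 * sigma)[0]
    print("alarms before / after injected fault:", (alarms < 400).sum(), (alarms >= 400).sum())
    ```

    Monitoring residuals rather than raw sensor values is what allows a shift to be seen even when each individual variable stays within its engineering threshold.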
  • Interactive Session on Communication in Statistics and its Pitfalls

    Authors: Kristina Lurz (prognostica GmbH), Jacqueline Asscher (Kinneret College), Kathrin Plankensteiner (Julius Blum GmbH), Kostas Triantafyllopoulos (University of Sheffield)
    Primary area of focus / application: Consulting
    Secondary area of focus / application: Education & Thinking
    Keywords: Statistical consulting, Communication, Roleplaying exercises, Interactive session
    Submitted at 27-May-2016 13:35 by Kristina Lurz
    Accepted
    13-Sep-2016 09:20 Interactive Session on Communication in Statistics and its Pitfalls
    We data analysts frequently come into contact with people from other disciplines. We often experience a gap between what our client in a statistical consultancy project expects and what statistics can deliver. Not only do we need excellent knowledge of the subject matter, but also good communication skills, to explain the potential as well as the limitations of statistics in such a way that we are understood and that we ourselves understand what is required, while still acting in a statistically correct way.
    The Young Statisticians Session is an interactive session that will reveal some of the good and bad approaches that can be adopted, not only when we statisticians talk with others of our profession, but also when we need to apply and explain our subject to someone in a different field. The session consists of two interconnected parts: a presentation by an experienced professional statistician, Olivia Bluder, and roleplaying exercises. In these roleplaying exercises we will demonstrate what can go wrong, how inappropriate statistical advice may result in confusion, what we can do to improve our communication skills, and more. The actors will not know the topics beforehand and will therefore act spontaneously in the scenarios. If you would like to actively participate in the exercises, please email us before the conference (kristina.lurz@prognostica.de)! With your contribution, we are looking forward to a lively and informative session!
  • The Role of DoE within the Process Development and Validation Lifecycle - Searching for a Balance between Experimental Effort and too High Risk

    Authors: Stefanie Feiler (AICOS Technologies AG)
    Primary area of focus / application: Design and analysis of experiments
    Keywords: DoE, Risk-based assessment, Process validation lifecycle, QbD
    Submitted at 27-May-2016 15:38 by Stefanie Feiler
    Accepted (view paper)
    12-Sep-2016 11:50 The Role of DoE within the Process Development and Validation Lifecycle - Searching for a Balance between Experimental Effort and too High Risk
    The process validation lifecycle approach of the FDA (2011) splits process validation into three stages: process design, process qualification and continued process verification. It is stressed that "knowledge and understanding is the basis for establishing an approach to control of the manufacturing process that results in products with the desired quality attributes". The lifecycle approach to process validation "employs risk based decision making throughout that lifecycle".

    The necessary knowledge and understanding "can be gained by application of, for example, formal experimental designs, process analytical technology (PAT), and/or prior knowledge." (ICH Q8(R2))

    Besides incorporating prior knowledge, systematic experimentation starting from the very first development stage of the process is therefore the main tool for generating initial process insight.

    The big question is how much experimentation is necessary at which step of process development. A detailed investigation in the lab still does not guarantee that the process behaves in exactly the same way after scale-up to the pilot plant or to production scale.

    In this talk, we discuss how risk-based considerations can be used in order to find a pragmatic compromise between the extremes of defining a detailed design space in the lab and "do at most four experiments and have a quality process up and running".
  • Revisiting Mixture Design Experiment from a Compositional Point of View

    Authors: Marina Vives-Mestres (Universitat de Girona), Josep Antoni Martín-Fernández (Universitat de Girona)
    Primary area of focus / application: Design and analysis of experiments
    Keywords: Mixture experiment, Compositional data, Log contrast model, Simplex
    Submitted at 27-May-2016 16:38 by Marina Vives-Mestres
    Accepted
    12-Sep-2016 12:10 Revisiting Mixture Design Experiment from a Compositional Point of View
    The aim of this talk is to share with researchers and applied scientists the questions that arise when revisiting the fundamentals of mixture design experiments from a compositional point of view. Mixture designs deal with experiments whose factors are ingredients in a mixture. Compositional data (CoDa) analysis has proved useful when dealing with data living in a restricted space, as the proportions of ingredients are. Methods such as principal component analysis, cluster analysis, linear discriminant analysis and linear regression models have been developed for CoDa based on the principle of working in log-ratio coordinates (log ratios of components). These techniques are fully consistent with the characteristics of CoDa.
    In 1984, J. Aitchison introduced log contrast models for experiments with mixtures, which are linear models in the log proportions. We revisit the proposed model and update its results with the latest advances in the CoDa field, mainly regarding regression models and differential calculus on the simplex. We follow the tracks of that publication with some results on optimal log contrast designs. Finally, we analyse the difficulties that arise when traditional methods are used with mixtures and point out the solutions proposed by the log-ratio approach.
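    For concreteness, a minimal sketch of fitting a first-order log contrast model is given below, assuming a synthetic three-component mixture. The model is y = b0 + sum_i b_i log(x_i) with the b_i summing to zero, fitted here by reparameterising in log-ratios with respect to the last component; the data and coefficient values are assumptions for illustration, not the talk's case study.

    ```python
    # Sketch: fitting a first-order log contrast model to a 3-component mixture.
    # y = b0 + b1*log(x1) + b2*log(x2) + b3*log(x3), with b1 + b2 + b3 = 0.
    # Synthetic data; an illustration of the model form only.
    import numpy as np

    rng = np.random.default_rng(4)
    n = 30
    # Random interior points of the simplex (all proportions strictly positive)
    x = rng.dirichlet(alpha=[2.0, 2.0, 2.0], size=n)
    true_b = np.array([1.5, -0.5, -1.0])                 # sums to zero
    y = 10.0 + np.log(x) @ true_b + rng.normal(0, 0.1, n)

    # Enforce the zero-sum constraint via log-ratios against the last component:
    # b1*log(x1) + b2*log(x2) + b3*log(x3) = b1*log(x1/x3) + b2*log(x2/x3) when b3 = -(b1 + b2)
    Z = np.column_stack([np.ones(n), np.log(x[:, 0] / x[:, 2]), np.log(x[:, 1] / x[:, 2])])
    coef, *_ = np.linalg.lstsq(Z, y, rcond=None)
    b0, b1, b2 = coef
    b3 = -(b1 + b2)
    print("estimated coefficients (b0, b1, b2, b3):", np.round([b0, b1, b2, b3], 2))
    ```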