ENBIS-15 in Prague

6–10 September 2015; Prague, Czech Republic
Abstract submission: 1 February – 3 July 2015

The following abstracts have been accepted for this event:

  • Robustness of Errors-in-Variables Regressions to Assess Equivalence in Method Comparison Studies

    Authors: Bernard Francq (Université Catholique de Louvain)
    Primary area of focus / application: Metrology & measurement systems analysis
    Keywords: Measurement comparison studies, Errors-in-variables regressions, Robustness, Least squares, Maximum likelihood
    Submitted at 30-Apr-2015 11:52 by Bernard Francq
    7-Sep-2015 12:30 Robustness of Errors-in-Variables Regressions to Assess Equivalence in Method Comparison Studies
    Analytical laboratories continuously assess the uncertainties and reliability of their measurement systems so that their clients can make well-founded decisions. This leads to the development of new measurement methods that should be more precise and less expensive, but above all equivalent to the existing ones.

    To assess equivalence in method comparison studies, the measurement uncertainties must obviously be taken into account. Errors-in-variables regressions can be applied not only to evaluate the bias between two devices but also to assess and predict their differences. Several procedures exist in the literature: the least squares method, the method of moments, a modification of the coordinate system (scale transformation), and the maximum likelihood technique. These procedures give similar results under ideal conditions. However, it will be shown and explained that differences arise in the presence of outliers or under heteroscedasticity, where the local variances should be modelled.
    It will be concluded that modelling the local variances under heteroscedasticity significantly improves the coverage probabilities. The joint confidence intervals for the parameters obtained by maximum likelihood, along with their coverage probabilities, collapse in the presence of outliers. The least squares method, on the other hand, is more suitable for tackling heteroscedasticity and is more robust.
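As an illustrative sketch of one such errors-in-variables procedure, the following code implements Deming regression (the moment-based estimator with a known error-variance ratio `delta`). The simulated two-device data and the reading of equivalence as "intercept near 0, slope near 1" are assumptions made for this example, not the authors' actual study design.

```python
import math
import random

def deming_regression(x, y, delta=1.0):
    """Errors-in-variables (Deming) regression of y on x.

    delta is the assumed ratio var(error_y) / var(error_x);
    delta = 1 corresponds to orthogonal regression."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x) / (n - 1)
    syy = sum((yi - my) ** 2 for yi in y) / (n - 1)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / (n - 1)
    slope = ((syy - delta * sxx
              + math.sqrt((syy - delta * sxx) ** 2 + 4 * delta * sxy ** 2))
             / (2 * sxy))
    intercept = my - slope * mx
    return intercept, slope

# Two hypothetical devices measuring the same true values with equal
# error variance; equivalence would show as intercept ~ 0 and slope ~ 1.
random.seed(1)
true_values = [random.uniform(10, 100) for _ in range(200)]
x = [t + random.gauss(0, 2) for t in true_values]
y = [t + random.gauss(0, 2) for t in true_values]
intercept, slope = deming_regression(x, y)
```

Unlike ordinary least squares, which attenuates the slope when the predictor is measured with error, this estimator accounts for errors in both variables.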
  • A Simulation Approach of Experimental Design for Concrete Compressive Strength

    Authors: Alexios E. Tamparopoulos (University of Natural Resources and Life Sciences, Vienna), Roman Wendner (University of Natural Resources and Life Sciences, Vienna)
    Primary area of focus / application: Design and analysis of experiments
    Secondary area of focus / application: Reliability
    Keywords: Computer experiments, Design of Experiments, Uncertainty, Material properties, Concrete
    Submitted at 30-Apr-2015 11:52 by Alexios Tamparopoulos
    Accepted
    8-Sep-2015 15:55 A Simulation Approach of Experimental Design for Concrete Compressive Strength
    The material properties of concrete play an important role in most mechanical and numerical models that describe the behaviour of a concrete structure. Since those properties do not remain constant, an experimental procedure is needed to describe their development in time. Apart from point estimates for the parameters of the ageing models obtained from experimental data, the procedure followed should also be able to provide the uncertainty estimates required for inference and reliability computations. These uncertainty considerations have been overlooked to date. In this context, decisions on the days of testing, the number of tested units and the regression methods need to be made within certain physical constraints. In the present paper, we provide an analysis aiming to yield practical recommendations for optimal testing design. The confidence intervals for the parameters of interest related to the compressive strength of concrete are obtained through Monte Carlo simulation. Furthermore, we investigate whether a resampling procedure can be used to reconstruct these confidence intervals from single experimental realisations with no prior information on the parameters. Based on the analysis results and the desired precision, a design of experiments for characterising the development of concrete compressive strength in time is possible.
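A minimal sketch of the Monte Carlo idea, under assumed values: a CEB-type ageing law f(t) = f28·exp(s·(1 − √(28/t))) is repeatedly fitted to simulated noisy strength tests on a hypothetical testing schedule, and the spread of the estimates yields an interval for the 28-day strength. The model form, schedule, true parameters and noise level are all illustrative assumptions, not the paper's actual settings.

```python
import math
import random

def fit_ageing_law(days, strengths):
    """Fit the CEB-type law f(t) = f28 * exp(s * (1 - sqrt(28 / t)))
    by linear regression of log f on u = 1 - sqrt(28 / t)."""
    u = [1 - math.sqrt(28 / t) for t in days]
    z = [math.log(f) for f in strengths]
    n = len(u)
    mu, mz = sum(u) / n, sum(z) / n
    s = (sum((ui - mu) * (zi - mz) for ui, zi in zip(u, z))
         / sum((ui - mu) ** 2 for ui in u))
    f28 = math.exp(mz - s * mu)
    return f28, s

random.seed(7)
days = [3, 7, 14, 28]                 # hypothetical testing schedule
f28_true, s_true, sigma = 40.0, 0.25, 1.5

estimates = []
for _ in range(2000):                 # Monte Carlo replications
    tests = [f28_true * math.exp(s_true * (1 - math.sqrt(28 / t)))
             + random.gauss(0, sigma) for t in days]
    estimates.append(fit_ageing_law(days, tests)[0])

estimates.sort()
lo, hi = estimates[50], estimates[1949]   # ~95% interval for the 28-day strength
```

Varying the testing days or the number of replicate specimens and re-running the simulation shows how the interval width, and hence the achievable precision, depends on the design.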
  • Railway Track Degradation Prediction

    Authors: Bjarne Bergquist (Luleå University of Technology), Peter Söderholm (Swedish Transport Administration)
    Primary area of focus / application: Reliability
    Secondary area of focus / application: Modelling
    Keywords: Spatio-temporal analysis, Condition-based maintenance, Prognostics, Statistical Process Control (SPC), Control charts
    Submitted at 30-Apr-2015 12:33 by Bjarne Bergquist
    7-Sep-2015 10:00 Railway Track Degradation Prediction
    The degradation processes affecting railway track condition depend both on the resistance of the track and on the stresses it is subjected to. The stress magnitudes and stress cycles are important for degradation behaviour and show some regularity in the time domain, while the degradation resistance of a track is spatially correlated. In addition, the condition measurements of the track are irregular and contain considerable measurement errors.
    One-step-ahead predictions have been used to establish prognostic models for the condition of railway tracks. The models are based on repeated measurements of railway track geometry, allowing estimation of track wear resistance, degradation rates and stochastic behaviour. The prognostic models have then been used for condition assessment and state prediction. Issues dealt with include irregular sampling and variation in train loads and traffic intensity during the studied periods. Unit-length variability of the analysed properties has been used as a means to reduce measurement-positioning uncertainty, and these data have then been transformed to reduce skewness. The transformed data are fed to the prediction model to generate condition predictions, which are then compared to statistically based and regulation-based control limits.
    To be useful, the predictions need to be communicable to practitioners, and we propose using color maps to display the three-dimensional array of measured and predicted condition data against position and time.
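The one-step-ahead prediction step can be sketched as follows, under purely illustrative assumptions: simulated degradation data observed at irregular inspection days, a simple linear trend model, and a made-up alert limit. The models in the actual study are richer than this.

```python
import random

random.seed(3)

# Hypothetical track-section condition (e.g. a geometry deviation in mm),
# degrading slowly over time and observed at irregular inspection days.
times = [0, 30, 55, 100, 130, 170, 220]
condition = [0.02 * t + random.gauss(0, 0.3) for t in times]

def one_step_ahead(times, obs):
    """Fit a least-squares trend and extrapolate one inspection interval ahead."""
    n = len(times)
    mt, mo = sum(times) / n, sum(obs) / n
    slope = (sum((t - mt) * (o - mo) for t, o in zip(times, obs))
             / sum((t - mt) ** 2 for t in times))
    intercept = mo - slope * mt
    next_t = times[-1] + (times[-1] - times[-2])  # next planned inspection
    return next_t, intercept + slope * next_t

ALERT_LIMIT = 10.0  # made-up regulation-based limit for the example
t_next, predicted = one_step_ahead(times, condition)
alarm = predicted > ALERT_LIMIT  # flag sections predicted to exceed the limit
```

Repeating this per track section gives a position-by-time grid of predictions, which is exactly the array a color map would display.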
  • Work and Retirement Diagnostic Analysis

    Authors: Erik Monness (Hedmark University College), Shirley Coleman (Newcastle University), Matt Flynn (Newcastle University)
    Primary area of focus / application: Modelling
    Secondary area of focus / application: Business
    Keywords: CROW, Survey, Business consequences, Policy consequences, PCA, Smart Partial Least Squares (PLS), Bayesian belief nets
    Submitted at 30-Apr-2015 12:53 by Erik Monness
    Accepted
    7-Sep-2015 10:20 Work and Retirement Diagnostic Analysis
    Older workers leaving the workplace pose a major problem to business and industry throughout Europe. A survey exploring the possible relationships between career, job content, employer policies and practices, human capital, retirement plans and expectations, and quality of life has been conducted by the Centre for Research into the Older Workforce (CROW) at Newcastle University. The survey was completed by 800 older employed people in each of the UK and Hong Kong, with the aim of comparing and contrasting the two communities. Our initial analysis concerns the UK, where diverse agencies are interested in the findings.
    A unified model including all aspects of working life will be presented, with the following structure:
    2. CAPABILITIES, which in turn influence
    3. FUNCTIONING, which in turn influence
    These concepts are shorthand for multidimensional issues within the survey. Sets of regressions, factor analysis including partial least squares, and Bayesian belief nets will be presented to highlight concept models that might be refuted, supported or modified. The methods in use will be compared to see how they differ and which aspects emerge as important. Our models will be valuable for policymakers concerned with keeping the older workforce active for more years. Over 20 groups are interested in this research, and it is an unusual example of how data can be pivotal in building synergistic understanding between policymakers and analysts from different areas. The ideas and investigations driven by the different partners are focused on this rich set of data.
  • Exploring Rail Breaks

    Authors: Peter Söderholm (Swedish Transport Administration), Bjarne Bergquist (Luleå University of Technology)
    Primary area of focus / application: Reliability
    Secondary area of focus / application: Modelling
    Keywords: Rail break, Railway, Time series analysis, Reliability, Multivariate analysis, Repairable systems
    Submitted at 30-Apr-2015 12:55 by Bjarne Bergquist
    7-Sep-2015 17:00 Exploring Rail Breaks
    Rail breaks may cause derailments and are therefore safety critical. Discovered breaks initiate maintenance actions that often result in delays and cancelled trains, so the railway custodian needs to manage both the causes of rail breaks and their consequences. Here, the relationships between maintenance practice, rail breaks and their consequences are studied to gain a better understanding of the rail-break phenomenon. An exploratory case study was performed at Trafikverket (the Swedish Transport Administration). The empirical data were collected from databases containing information about preventive and corrective maintenance, as well as traffic disturbances initiated by rail breaks. Time series analysis, reliability analysis of repairable systems, and multivariate data analysis form the analysis frame of reference. Non-destructive testing (NDT) is negatively correlated with the rail-break frequency, suggesting that NDT is beneficial for the early detection and avoidance of many rail-break causes. The data also show a strong seasonal component, with the seasonality increasing over time.
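The seasonal pattern described above can be explored with a classical monthly-means decomposition. The simulated counts below (a winter peak whose amplitude grows from year to year) are illustrative assumptions, not data from Trafikverket.

```python
import math
import random

random.seed(5)

# Five years of simulated monthly rail-break counts: a winter peak
# (cosine term) whose amplitude grows year by year, plus noise.
months = list(range(60))
counts = [max(0, round(10
                       + (1 + 0.1 * (m // 12)) * 4 * math.cos(2 * math.pi * m / 12)
                       + random.gauss(0, 1.5)))
          for m in months]

# Classical seasonal estimate: mean count per calendar month minus overall mean.
overall = sum(counts) / len(counts)
seasonal = [sum(counts[m] for m in months if m % 12 == k) / 5 - overall
            for k in range(12)]
```

A positive seasonal index for the winter months and a negative one for summer reproduces the kind of seasonality described in the abstract; comparing the indices year by year would reveal whether the seasonal amplitude grows over time.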
  • Shrinkage in the Time-Varying Parameter Framework

    Authors: Angela Bitto (WU Wien)
    Primary area of focus / application: Other: Invited ISBA Session
    Keywords: Time-varying parameter model, Hierarchical shrinkage priors, Normal-gamma prior, Predictive likelihood, State space model
    Submitted at 30-Apr-2015 13:04 by Angela Bitto
    8-Sep-2015 11:40 Shrinkage in the Time-Varying Parameter Framework
    Although time-varying parameter (TVP) models offer a lot of flexibility in modelling processes which gradually change over time, they suffer from inefficient parameter estimation and poor prediction results. We aim at solving this problem by using hierarchical priors and investigate the normal-gamma shrinkage prior in this context. This scale mixture of normals concentrates a lot of mass around zero, so that unimportant parameters are shrunk towards zero while non-zero parameters are allowed to escape the shrinkage. As working with a non-centered parameterization has proven advantageous for various reasons, we induce shrinkage in the non-centered context, on the square root of the prior variance of the error in the state equation. Our approach extends models using the Bayesian LASSO prior, which is a special case of the normal-gamma prior. Further, we show how the normal-gamma prior can easily be extended to TVP models and present a Gibbs sampler for this model. We discuss the crucial choice of hyperparameters, make use of interweaving the centered and non-centered parameterizations to improve convergence of the Markov chains, and analyse the forecasting behaviour. We present both a univariate and a multivariate application. First we consider EU-area inflation modelling based on the generalized Phillips curve; then we turn to a Cholesky decomposition of a multivariate time series with a time-varying covariance matrix and analyse DAX-30 data. Our findings suggest that the normal-gamma prior has advantages over the Bayesian LASSO prior in terms of statistical efficiency and performs significantly better in terms of predictive performance.
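For intuition, the scale-mixture representation behind this prior can be sketched in a few lines: the variance is drawn from a gamma distribution and the coefficient from a normal with that variance. The shape/rate parameterization (ψ ~ Gamma(θ, θκ²/2), β | ψ ~ N(0, ψ)) follows one common normal-gamma convention, and the hyperparameter values are chosen purely for illustration.

```python
import random

def normal_gamma_draw(theta=0.1, kappa2=2.0):
    """One draw from the normal-gamma prior as a scale mixture of normals:
    psi ~ Gamma(shape=theta, rate=theta * kappa2 / 2), beta | psi ~ N(0, psi).
    Small theta puts heavy mass near zero; theta = 1 recovers the
    Bayesian LASSO (Laplace) prior."""
    psi = random.gammavariate(theta, 2.0 / (theta * kappa2))  # scale = 1/rate
    return random.gauss(0.0, psi ** 0.5)

random.seed(11)
draws = [abs(normal_gamma_draw(theta=0.1)) for _ in range(20000)]
near_zero = sum(d < 0.05 for d in draws) / len(draws)  # mass of the spike at zero
```

With θ = 0.1 a large fraction of draws land very close to zero while the tails stay heavy, which is exactly the behaviour that shrinks unimportant TVP coefficients without flattening the important ones.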