ENBIS-18 in Nancy

2 – 6 September 2018; Ecoles des Mines, Nancy (France) Abstract submission: 20 December 2017 – 4 June 2018

My abstracts


The following abstracts have been accepted for this event:

  • Self-similarity Analysis of Electrodermal Activity for Driver’s Stress Level Characterization

    Authors: Jean-Michel Poggi (LMO, University of Paris Sud - Orsay and University Paris Descartes), Neska El Haouij (ENIT and University of Tunis El Manar, Tunisia), Raja Ghozi (ENIT and University of Tunis El Manar), Sylvie Sevestre Ghalila (CEA-LinkLab), Mériem Jaïdane (ENIT and University of Tunis El Manar)
    Primary area of focus / application: Mining
    Secondary area of focus / application: Modelling
    Keywords: Electrodermal Activity, Hurst Exponent Estimation, Self-similarity, Wavelet-based Method, Stress Characterization
    Submitted at 29-Jan-2018 19:07 by Jean-Michel Poggi
    Accepted
    This work characterizes "stress" levels via a self-similarity analysis of the electrodermal activity (EDA). To that end, fractional Brownian motion (FBM), parameterized by the Hurst exponent H, is invoked to model EDA changes in a real-world driving context.

    To characterize the EDA scale invariance, the FBM process and its corresponding exponent H, estimated using a wavelet-based approach, are used. Specifically, an automatic scale-range selection is proposed in order to detect linearity in the logscale diagram. The procedure is applied to EDA signals from the open database drivedb, originally captured on the foot and the hand of drivers during a real-world driving experiment designed to evoke different levels of arousal and stress.

    The estimated Hurst exponent H distinguishes stress levels when driving on the highway versus in the city, relative to a restful state of minimal stress. Specifically, the estimated H decreases as environmental complexity increases. In addition, almost all estimated values are greater than 0.5, suggesting that the EDA signal exhibits long-range dependence. Furthermore, the H estimated on the foot EDA signals characterizes the driving task better than that estimated on the hand EDA.

    This self-similarity analysis captures the complexity of the EDA signal. Such analyses have been applied to various physiological signals in the literature, but not to the EDA, the signal found to correlate most strongly with human affect. The proposed analysis could be useful for real-time monitoring of "stress" and arousal levels in urban driving spaces.
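    A minimal sketch of the wavelet logscale-diagram idea described above. It is not the authors' implementation: it uses a plain Haar cascade and a fixed octave range rather than the automatic scale-range selection the abstract proposes.

```python
import numpy as np

def hurst_wavelet(x, j_min=2, j_max=6):
    """Estimate the Hurst exponent H of a signal modeled as fractional
    Brownian motion, via a Haar-wavelet logscale diagram.  For fBm the
    energy of detail coefficients at octave j scales as 2**(j*(2H+1)),
    so the slope of log2(energy) versus j yields H = (slope - 1) / 2."""
    approx = np.asarray(x, dtype=float)
    octaves, energies = [], []
    for j in range(1, j_max + 1):
        n = len(approx) // 2 * 2
        pairs = approx[:n].reshape(-1, 2)
        detail = (pairs[:, 0] - pairs[:, 1]) / np.sqrt(2)  # Haar detail
        approx = (pairs[:, 0] + pairs[:, 1]) / np.sqrt(2)  # Haar approximation
        if j >= j_min and len(detail) > 1:
            octaves.append(j)
            energies.append(np.mean(detail ** 2))
    slope, _ = np.polyfit(octaves, np.log2(energies), 1)  # logscale diagram fit
    return (slope - 1) / 2
```

    Starting the fit at j_min=2 reduces the small-scale bias of the discrete Haar cascade; the abstract's automatic selection of the linear scaling range replaces this hand-picked octave window.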
  • An Ishikawa analysis of a rail network

    Authors: Chris McCollin (Nottingham Trent University), John Disney (Nottingham Trent University)
    Primary area of focus / application: Reliability
    Secondary area of focus / application: Quality
    Keywords: Ishikawa, Problem solving, Seven Quality tools
    Submitted at 5-Feb-2018 15:13 by Chris McCollin
    Accepted
    A statistical analysis using the Ishikawa approach to problem solving was carried out on thirteen weeks of data from a local rail network. The seven quality tools are applied systematically to identify whether the process is in control, what the major issues are, and whether underlying rules governing the system can be identified. The process is found to be well managed, with the main issue being Sunday maintenance over-runs. Customer satisfaction is monitored; apart from train reliability, the main problem is cleanliness, most likely related to train reliability itself. Two government targets are identified corresponding to gaps in the data.
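    As an illustration of the "is the process in control?" step, one of the seven quality tools is the control chart; a c-chart on weekly incident counts might look like the sketch below. The data are invented for illustration and are not from the rail network study.

```python
import numpy as np

# Hypothetical weekly counts of service incidents over thirteen weeks.
counts = np.array([4, 6, 3, 5, 7, 4, 2, 5, 6, 3, 4, 8, 5])

# c-chart limits: centre line c_bar, limits c_bar +/- 3*sqrt(c_bar),
# with the lower limit truncated at zero for count data.
c_bar = counts.mean()
ucl = c_bar + 3 * np.sqrt(c_bar)
lcl = max(c_bar - 3 * np.sqrt(c_bar), 0.0)

# Indices of weeks signalling an out-of-control condition.
out_of_control = np.where((counts > ucl) | (counts < lcl))[0]
```

    With these invented counts no week breaches the limits, which is consistent with the abstract's finding of a well-managed process.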
  • How to Overcome the Problems Associated With New Types of Data?

    Authors: Emil Bashkansky (ORT Braude College of Engineering), Yariv Marmor (ORT Braude College of Engineering), Amalia Vanacore (University of Naples Federico II)
    Primary area of focus / application: Business
    Secondary area of focus / application: Metrology & measurement systems analysis
    Keywords: tree structured data, preference chains, distance metric, analysis of variation
    Submitted at 13-Feb-2018 17:26 by Emil Bashkansky
    Accepted
    The best-known classification of measurement scales, according to the nature of the data, defines four types: nominal, ordinal, interval, and ratio. Only the last three have received official legitimization in metrology. Although a number of academic researchers criticized this classification and proposed alternative typologies, this did not significantly affect industrial and business statistics, which prefer to remain in harmony with current legal metrology. On the eve of the fourth industrial revolution, we are increasingly confronted with new types of data that do not fall under any of the above-mentioned categories; hence, we often find the new data difficult to interpret. In our lecture, we present just two such kinds: tree-structured data and preference chains. An example of the first type is the International Classification of Diseases (ICD), proposed by the World Health Organization (WHO), where the diagnosis of a disease is associated with a specific node in a hierarchical tree. Preference/priority chains are needed when studying consumer preferences and customer requirements, in risk management, in decision-making, etc. Recently, some progress has been achieved in defining distance metrics and analyses of variation that allow useful information to be extracted from such data. The lecture will report and demonstrate these achievements.
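    One natural distance metric for tree-structured data is the number of edges on the path between two nodes through their lowest common ancestor. The mini-hierarchy below is invented for illustration in the spirit of the ICD; it is not the ICD itself, nor necessarily the metric the lecture develops.

```python
# Hypothetical mini-hierarchy: each node maps to its parent.
parent = {
    "respiratory": "root",
    "infections": "root",
    "pneumonia": "respiratory",
    "asthma": "respiratory",
    "viral_pneumonia": "pneumonia",
    "bacterial_pneumonia": "pneumonia",
}

def path_to_root(node):
    """Return the list of nodes from `node` up to the root."""
    path = [node]
    while node in parent:
        node = parent[node]
        path.append(node)
    return path

def tree_distance(a, b):
    """Edge count between a and b via their lowest common ancestor."""
    pa, pb = path_to_root(a), path_to_root(b)
    depth_in_a = {n: i for i, n in enumerate(pa)}
    for j, n in enumerate(pb):
        if n in depth_in_a:           # first shared ancestor on b's path
            return depth_in_a[n] + j
    return len(pa) + len(pb)          # disjoint trees (not expected here)
```

    Sibling diagnoses (two pneumonia subtypes) are close, while diagnoses that only share a higher-level category are farther apart, which is the behaviour an analysis of variation over such data needs.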
  • Binary Test of Latent Ability: Evaluation & Design Problems

    Authors: Emil Bashkansky (ORT Braude College of Engineering), Vladimir Turetsky (ORT Braude College of Engineering)
    Primary area of focus / application: Metrology & measurement systems analysis
    Secondary area of focus / application: Design and analysis of experiments
    Keywords: binary test, latent ability, test item difficulty, test design
    Submitted at 13-Feb-2018 17:31 by Emil Bashkansky
    Accepted
    In recent years, substantial progress has been achieved in the analysis and interpretation of binary test results. Recall that we are concerned with the simplest case of a uni-dimensional ability a, when the test-item performance of the object under test (OUT) can be explained by a single latent ability. The test consists of a set of K test items, every item response is scored on a binary (pass/fail) scale, and we need to evaluate the intrinsic ability of the OUT. Usually, it is assumed that the results of different test items applied to the same OUT are conditionally independent (i.e., the response to one test item does not affect the response to another). Given a specific item response function (IRF) model, the tested ability is usually assessed by maximum likelihood estimation (MLE) or, if some preliminary information about the ability exists, by a Bayesian approach. When the difficulty levels of the test items are known beforehand, the problem is relatively easy to solve; but when the number of OUTs is bounded and the difficulty levels are unknown beforehand, analysis of the results faces significant computational difficulties. Nevertheless, in principle, the problem is solvable. However, the moment we consider how to allocate test resources optimally (how to choose the difficulty levels of the test items, how many repetitions to perform at every level, what the criterion of optimality is, etc.), we enter unexplored terra incognita. Our lecture is an attempt to describe possible approaches and criteria for the test-planning problem rather than its complete solution. We will also treat some real applications of the proposed approach in education and antagonistic games.
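    A minimal sketch of the MLE step for the known-difficulties case, under a logistic (Rasch-type) IRF. The specific IRF and the Newton-Raphson solver are illustrative assumptions, not the authors' method.

```python
import numpy as np

def mle_ability(responses, difficulties, iters=50):
    """Newton-Raphson MLE of a scalar latent ability theta under the
    logistic IRF  P(pass | theta, b_k) = 1 / (1 + exp(-(theta - b_k))),
    assuming conditionally independent items.  Requires at least one
    pass and one fail, otherwise the MLE diverges to +/- infinity."""
    u = np.asarray(responses, dtype=float)   # 1 = pass, 0 = fail
    b = np.asarray(difficulties, dtype=float)
    theta = 0.0
    for _ in range(iters):
        p = 1.0 / (1.0 + np.exp(-(theta - b)))
        grad = np.sum(u - p)             # score (d log-likelihood / d theta)
        info = np.sum(p * (1.0 - p))     # Fisher information, always > 0
        theta += grad / info             # Newton step
    return theta
```

    The estimate satisfies the intuitive property that failing a harder subset of items lowers the inferred ability, which is what any test-design criterion built on top of this likelihood would exploit.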
  • Identifying Success Factors in Projects: A QFD Based Methodology

    Authors: Shuki Dror (ORT Braude College), Oren Eliezer (Jerusalem College of Technology)
    Primary area of focus / application: Business
    Secondary area of focus / application: Quality
    Keywords: Project Success Factors, Project Performance, Quality Function Deployment (QFD), Mean Square Error (MSE), Decision Making
    Submitted at 15-Feb-2018 07:03 by Shuki Dror
    Accepted
    Defining Project Success (PS) outcomes and PS factors is not an easy task. A favorable outcome depends on the stakeholders’ perspective, the project type, the project life cycle stage, and organizational characteristics. In the present study, focusing on an individual business case, we develop a procedure for quantitative evaluation of the relations between various PS factors and outcomes based on the quality function deployment (QFD) method.
    A House of Project Success (HoPS) matrix is created using combined inputs from various managers and experts. This matrix summarizes the desired improvements in the PS outcomes and connects them to the relevant PS factors. Based on the HoPS matrix, the outcomes and factors that maximize the desired results of the PS policy are chosen using the mean square error (MSE) criterion.
    The paper describes the implementation of the above methodology in two organizations and two project types, namely weapons development and an ERP implementation, demonstrating different project-success causal structures.
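    One common reading of an MSE selection criterion in QFD-based work is a split of sorted importance scores into "vital few" and "trivial many" groups by minimizing the error of a two-level step approximation. The abstract does not spell out its exact formulation, so the sketch below is an assumption, with invented scores.

```python
import numpy as np

def vital_few(scores):
    """Split importance scores (sorted descending) into 'vital few' and
    'trivial many' by choosing the cut k that minimizes the total
    within-group squared error of a two-level step approximation."""
    s = np.sort(np.asarray(scores, dtype=float))[::-1]
    best_k, best_sse = 1, np.inf
    for k in range(1, len(s)):           # both groups must be non-empty
        sse = (np.sum((s[:k] - s[:k].mean()) ** 2)
               + np.sum((s[k:] - s[k:].mean()) ** 2))
        if sse < best_sse:
            best_k, best_sse = k, sse
    return s[:best_k]
```

    Applied to HoPS row or column weights, such a rule picks out the small set of outcomes and factors on which an improvement policy should concentrate.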
  • Building Designs for Irregularly Shaped DoE Spaces

    Authors: Pat Whitcomb (Stat-Ease, Inc.), Shari Kraber (Stat-Ease, Inc.), Martin Bezener (Stat-Ease, Inc.)
    Primary area of focus / application: Design and analysis of experiments
    Keywords: design of experiments, optimal design, linear constraints, non-linear constraints, convex space, concave space
    Submitted at 1-Mar-2018 17:40 by Pat Whitcomb
    Accepted
    A common problem when designing experiments is how to deal with factor ranges that create unusually shaped design spaces, where classic designs like central composite designs don't work. In this presentation, two methods for handling irregularly shaped experimental regions will be shown. The first method is specific to linear constraints that form a convex space. The second method is more general and works for non-linear constraints and/or concave spaces. Examples are shown that demonstrate building optimal designs for constrained experimental regions with:
    • multiple linear constraints that form a convex experimental region,
    • multiple linear constraints that form a concave experimental region, and
    • a non-linear constraint.
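    A sketch of the general candidate-set idea behind optimal design on constrained regions: grid candidates are filtered by the constraints, then runs are picked to maximize the design's information. The constraints and the crude greedy det-maximizing selection below are illustrative assumptions, not Stat-Ease's algorithm.

```python
import numpy as np

def candidate_set(n_grid=21):
    """Grid candidates on [0,1]^2, kept only if they satisfy one linear
    constraint (x1 + x2 <= 1.5) and one non-linear constraint
    (x1**2 + x2**2 >= 0.25).  Both constraints are made up for the demo."""
    g = np.linspace(0.0, 1.0, n_grid)
    X1, X2 = np.meshgrid(g, g)
    pts = np.column_stack([X1.ravel(), X2.ravel()])
    keep = (pts.sum(axis=1) <= 1.5) & ((pts ** 2).sum(axis=1) >= 0.25)
    return pts[keep]

def d_optimal_greedy(cands, n_runs=6):
    """Greedily add the candidate that most increases det(X'X) for a
    first-order model with intercept; a simple stand-in for the
    point-exchange algorithms used by real optimal-design software."""
    def model(p):                         # rows [1, x1, x2]
        return np.column_stack([np.ones(len(p)), p])
    chosen = []
    for _ in range(n_runs):
        best_i, best_det = None, -1.0
        for i in range(len(cands)):
            X = model(np.array(chosen + [cands[i]]))
            # small ridge keeps the determinant informative while X'X
            # is still rank-deficient (fewer runs than parameters)
            d = np.linalg.det(X.T @ X + 1e-6 * np.eye(3))
            if d > best_det:
                best_i, best_det = i, d
        chosen.append(cands[best_i])
    return np.array(chosen)
```

    Because every run is drawn from the filtered candidate set, the resulting design automatically respects both the convex (linear) and the non-linear constraint, which is exactly what makes the candidate-set approach attractive for irregular regions.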