ENBIS-12 in Ljubljana

9 – 13 September 2012 Abstract submission: 15 January – 10 May 2012

The following abstracts have been accepted for this event:

  • Estimation for Utility of Donor Arrivals

    Authors: Banu Yuksel Ozkaya (Hacettepe University), Murat Caner Testik (Hacettepe University)
    Primary area of focus / application: Modelling
    Keywords: logistic regression, simulation, blood bank, process modeling
    Submitted at 14-Apr-2012 14:07 by Banu Yuksel-Ozkaya
    Accepted
    11-Sep-2012 10:20 Estimation for Utility of Donor Arrivals
    A critical aspect of blood transfusion services is the timely provision of blood and blood components. On the other hand, since some blood components have very short lifetimes, a mismatch between demand and supply due to excess donor arrivals may leave many units of blood components outdated at the end of their useful lifetimes. Therefore, before accepting or scheduling a donor, the potential utility that can be obtained from the donor should be taken into consideration. A possible measure for the utility of a donor arrival is the probability that the blood component(s) obtained from that donor will be transfused rather than discarded. This measure can also be considered an indicator of how critical an arriving donor is. In this study, logistic regression models are used to estimate this measure from the inventory status of the blood component (including the inventory level and the age distribution of the units in inventory) at the time of the arrival. The measure is then used to propose donor recruitment strategies under which a donor is accepted if the estimated utility exceeds some predefined threshold level. A real case at the blood bank of a university hospital is analyzed to illustrate, through simulation, the performance of the estimator for the utility of a donor arrival and the proposed recruitment strategies.
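The recruitment rule described above can be sketched as follows. This is a minimal illustration, not the fitted model from the paper: the coefficient values, the two predictors (inventory level and mean unit age), and the 0.5 threshold are all hypothetical.

```python
import math

def donor_utility(inventory_level, mean_age_days, coef):
    """Logistic-model estimate of the probability that a donated unit
    will be transfused rather than outdated, given the inventory status
    at the time of the donor's arrival."""
    b0, b1, b2 = coef
    z = b0 + b1 * inventory_level + b2 * mean_age_days
    return 1.0 / (1.0 + math.exp(-z))

def accept_donor(inventory_level, mean_age_days, coef, threshold=0.5):
    """Recruitment rule: accept the donor only if the estimated
    utility exceeds a predefined threshold."""
    return donor_utility(inventory_level, mean_age_days, coef) >= threshold

# Hypothetical coefficients: utility falls as inventory grows and ages.
coef = (3.0, -0.15, -0.05)
print(accept_donor(5, 10, coef))   # low, fresh stock  -> True (accept)
print(accept_donor(40, 25, coef))  # high, aged stock  -> False (defer)
```

In practice the coefficients would be fitted from historical transfusion/outdating records, and the threshold tuned by simulation as the abstract describes.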
  • Analysis and Comparison of Flow Chart Trace Softwares.

    Authors: Susana Vegas (Universidad de Piura), Ana María Cumpa (Universidad de Piura), Laura Ilzarbe (Transabadell)
    Primary area of focus / application: Quality
    Keywords: Flow chart, quality control tools, quality improvement, quality management
    Submitted at 14-Apr-2012 16:08 by Susana Vegas
    12-Sep-2012 10:05 Analysis and Comparison of Flow Chart Trace Softwares.
    The flow chart is one of the classic tools in statistical quality control, and its usefulness in quality improvement is beyond doubt. Currently, there are several commercial and free software programs that allow users to draw such charts.
    In this paper, a comparison of 27 flow chart tracing software programs is performed. First, a survey of 48 Peruvian companies was conducted to determine the key factors they take into account when choosing flow chart software. Once the key factors were selected, all the programs were analyzed to determine their main characteristics and how well they meet these factors.
    The result is a complete description of each software program, together with recommendations based on its particular advantages and intended uses.
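The survey-then-evaluate methodology amounts to a weighted scoring of each program against the key factors. A minimal sketch follows; the factor names, weights, program names, and scores are all invented for illustration, since the abstract does not list the actual survey factors.

```python
# Hypothetical key factors and survey-derived weights (must sum to 1).
weights = {"ease_of_use": 0.4, "cost": 0.3, "export_formats": 0.2, "support": 0.1}

# Each program scored 1-5 on each factor (illustrative numbers only).
programs = {
    "ProgramA": {"ease_of_use": 5, "cost": 2, "export_formats": 4, "support": 3},
    "ProgramB": {"ease_of_use": 3, "cost": 4, "export_formats": 3, "support": 4},
}

def weighted_score(scores, weights):
    """Overall suitability: weighted sum of per-factor scores."""
    return sum(weights[f] * scores[f] for f in weights)

ranking = sorted(programs, key=lambda p: weighted_score(programs[p], weights),
                 reverse=True)
for name in ranking:
    print(name, round(weighted_score(programs[name], weights), 2))
```

With real survey weights, the same computation yields the per-program recommendations the paper describes.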
  • EWMA p Charts Under Sampling by Variables --- Ideas, Numerics and Properties

    Authors: Sven Knoth (Helmut Schmidt University Hamburg), Sebastian Steinmetz (Helmut Schmidt University Hamburg)
    Primary area of focus / application: Process
    Keywords: EWMA, yield monitoring, numerical methods, ARL, percent defective
    Submitted at 14-Apr-2012 21:14 by Sven Knoth
    Accepted
    10-Sep-2012 17:05 EWMA p Charts Under Sampling by Variables --- Ideas, Numerics and Properties
    Data are sampled in batches of size n in order to monitor stability in terms of yield. For given lower and upper specification limits, the probability of nonconforming quality is estimated via the sample mean (and variance). This proportion estimate is plugged into an EWMA chart. The control chart, equipped with only an upper limit, then signals for decreased quality; thus the monitoring scheme alarms only if the quality level has deteriorated considerably. It allows, for example, the pre-run (in-control) sample mean to differ from the target value, such as the center of the specification interval. This mismatch and other imperfect in-control situations are quite common in control charting practice.

    In order to calculate properties such as the ARL (zero-state, worst-case), one has to face issues similar to those for variance schemes, because the support of the chart statistic is bounded. These are solved with appropriate numerical methods. Eventually, the resulting, reasonably calibrated EWMA p variables charts are compared to classical EWMA and CUSUM charts for the mean, and also to classical EWMA p charts based on the sample proportions alone.
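The plug-in step described above can be sketched as follows, assuming a normal model for the quality characteristic. The smoothing constant, upper control limit, head-start value, and the shift scenario are illustrative choices, not the calibrated values of the paper.

```python
import math

def phi(x):
    """Standard normal CDF."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def p_hat(xbar, s, lsl, usl):
    """Estimated fraction nonconforming: plug the sample mean and
    standard deviation into the normal tail probabilities outside
    the specification limits [lsl, usl]."""
    return phi((lsl - xbar) / s) + 1.0 - phi((usl - xbar) / s)

def ewma_p_signals(samples, lsl, usl, lam=0.1, ucl=0.05, z0=0.01):
    """One-sided EWMA chart on the estimated proportion nonconforming;
    with only an upper limit, it alarms only when quality deteriorates."""
    z = z0
    alarms = []
    for xbar, s in samples:
        z = (1.0 - lam) * z + lam * p_hat(xbar, s, lsl, usl)
        alarms.append(z > ucl)
    return alarms

# Ten in-control batches centered on target, then a sustained mean shift:
samples = [(0.0, 1.0)] * 10 + [(2.0, 1.0)] * 10
print(ewma_p_signals(samples, lsl=-3.0, usl=3.0))
```

The bounded support of the chart statistic (p̂ ∈ [0, 1]) is what complicates the ARL computations the abstract mentions; the numerical methods themselves are beyond this sketch.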
  • Turning a Simple Case Study into a Single Session, Three Stage Active Learning Exercise for Classroom Use

    Authors: Jacqueline Asscher (Kinneret College on the Sea of Galilee)
    Primary area of focus / application: Education & Thinking
    Keywords: teaching, active learning, problem based, case study
    Submitted at 15-Apr-2012 00:36 by Jacqueline Asscher
    Accepted
    10-Sep-2012 15:35 Turning a Simple Case Study into a Single Session, Three Stage Active Learning Exercise for Classroom Use
    We supplement our lectures with projects in order to give students the experience of dealing with the problems that arise when theory meets practice. In both academic and industry teaching it is also possible to include problem-based learning in the lecture setting, for example in the form of classroom exercises based on case studies.
    This exercise has been used as a single session within a course of lectures and as a stand-alone workshop at a conference. The case study used involved checking the sensitivity of a critical dimension in a steel bending process to the variation in materials.
    The three stages of the project are: deciding what questions to ask and what existing data to obtain; analyzing existing data (results from routine inspection of incoming materials) and designing a small experiment; analyzing the results of the experiment.
    Participants work in pairs using a pencil and calculator. At each stage they receive one or two worksheets with information and questions, so that the exercise is like a cooking show where a pie is put together and then magically appears fully cooked. Here, for example, participants choose which background information to request and this information then appears in the form of further explanations, graphs and data.
    In this paper I discuss how to construct such an exercise, focusing on the following issues: How do we flesh out a real example with fictitious supplementary data? How much guidance to provide? How much time is needed? How should time be divided between work in pairs and whole group discussion?
  • Worst-Case Scenarios Identification as a Financial Stress Testing Tool for Financial-Economic Risk Models

    Authors: Mohamed El Ghourabi (University of Tunis), Amor Messaoud (University of Tunis), Mourad Landolsi (University of Tunis), Amira Dridi (University of Tunis)
    Primary area of focus / application: Finance
    Keywords: Worst-Case Scenarios, Financial stress testing, Risk management, Mahalanobis distance
    Submitted at 15-Apr-2012 20:02 by Mohamed El Ghourabi
    11-Sep-2012 11:30 Worst-Case Scenarios Identification as a Financial Stress Testing Tool for Financial-Economic Risk Models
    Financial stress testing (FST) is a key technique for quantifying financial vulnerabilities and an important risk management tool. FST should ask which scenarios lead to large losses at a given level of plausibility. However, traditional FSTs are criticized, first, for the plausibility concerns raised against stress testing and, second, for being conducted outside the context of an econometric risk model. Hence the probability of a severe scenario outcome is unknown, and many severe yet plausible scenarios are ignored. The aim of this paper is to propose a new FST framework for analyzing multi-period stress scenarios for financial-economic stability. The plausibility of a scenario is quantified by its Mahalanobis distance from an average scenario. Based on worst-case scenario optimization, our approach is able, first, to identify stressful periods with transparent plausibility and, second, to provide a methodology for conducting FST in the context of any financial-economic risk model. Applied to Tunisian economic system data, the proposed framework identifies more harmful but equally plausible scenarios, leading to stress periods not detected by classical methods.
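The plausibility measure and the worst-case selection can be sketched as follows. The two risk factors, the simulated history, and the linear loss function are all hypothetical stand-ins, not the Tunisian data or the risk model of the paper.

```python
import numpy as np

def mahalanobis(x, mu, cov):
    """Mahalanobis distance of scenario x from the average scenario mu:
    scenarios on the same ellipsoid are equally plausible."""
    d = x - mu
    return float(np.sqrt(d @ np.linalg.inv(cov) @ d))

# Illustrative two-factor history (e.g. an interest-rate and an FX shock).
rng = np.random.default_rng(0)
history = rng.multivariate_normal([0.0, 0.0], [[1.0, 0.6], [0.6, 1.0]], size=500)
mu = history.mean(axis=0)
cov = np.cov(history, rowvar=False)

def loss(x):
    """Hypothetical linear portfolio loss under scenario x."""
    return 2.0 * x[0] - 1.5 * x[1]

# Among candidate stress scenarios, the worst case is the one that
# maximizes the loss; their Mahalanobis distances gauge plausibility.
candidates = [np.array([2.0, -1.0]), np.array([-1.0, 2.0]), np.array([1.5, 1.5])]
worst = max(candidates, key=loss)
for c in candidates:
    print(c, round(mahalanobis(c, mu, cov), 2), round(loss(c), 2))
print("worst case:", worst)
```

The full framework would search over all scenarios at a fixed Mahalanobis distance (a constrained optimization) rather than a small candidate list, and do so over multiple periods.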
  • On Categorization of Perturb and Combine Ensemble Methods

    Authors: Riadh Khanchel (University of Tunis), Mohamed Limam (University of Tunis)
    Primary area of focus / application: Mining
    Keywords: Ensemble Methods, Classification, Boosting, Bagging
    Submitted at 15-Apr-2012 20:32 by Riadh Khanchel
    10-Sep-2012 17:05 On Categorization of Perturb and Combine Ensemble Methods
    Ensemble methods generate a set of models then combine them to produce accurate one. In this paper, ensemble methods are categorized using three hierarchical levels based on the choice of the perturbation methods, the weighting scheme and the aggregation technique. An important issue is how those levels affect classification accuracy. To explore this point, various modifications to ensemble methods are tested. Based on perturbation methods, two main families of ensemble methods are identified: adaptive family and random family. Results show that for the first family, perturbation has a greater effect on performance than weighting and aggregation. However, for the second family, all levels are important. Based on these results, a categorization of Perturb and Combine ensemble methods is suggested