ENBIS-11 in Coimbra

4 – 8 September 2011
Abstract submission: 1 January – 25 June 2011

The following abstracts have been accepted for this event:

  • Some Metrological Aspects of Ordinal Quality Data Treatment

    Authors: Dr. Emil Bashkansky, Tamar Gadrich
    Affiliation: ORT Braude College, Department of Industrial Engineering and Management, P.O.Box 78, Karmiel 21982, Israel, ebashkan@braude.ac.il, emilbas@gmail.com
    Primary area of focus / application: Metrology & measurement systems analysis
    Keywords: ordinal scales, metrology, R&R, calibration
    Submitted at 19-Jan-2011 19:26 by Emil Bashkansky
    Accepted
    5-Sep-2011 11:50 Some Metrological Aspects of Ordinal Quality Data Treatment
    Ordinal scales are widely employed in industrial quality engineering (quality sorting, customer satisfaction, severity of failure, and so on). Despite the widespread use of ordinal variables for different purposes, measurement results are still sometimes misunderstood and misinterpreted. Ordinal quantities may only be used in comparison relations and must not be assigned measurement units or quantity dimensions. Comparisons of greater/less than, in addition to equal/unequal, can be made between ordinal variables, but the concept of distance between two levels of the same ordinal scale is meaningless. Operations such as conventional addition, subtraction, multiplication or division are forbidden; consequently, all statistical measures of random ordinal variables must respect these limitations. This means, however, that many common metrological concepts and definitions, such as error, uncertainty, R&R and accuracy, as well as other statistical properties of repeated measurement results, have to be seriously revised. In practice we also often need to compare two ordinal measuring systems (MSs) in order to analyse the comparability and equivalence of measurement results (calibration, MS capability comparison, reproducibility evaluation). Our lecture will present a way to evaluate classical metrological characteristics, such as error, uncertainty and precision of single and repeated measurements, based on the operations that are legitimate for ordinal data. A method for handling tasks such as calibration, comparison of measuring systems' capabilities and reproducibility evaluation, as well as the comparison of two MSs against a known/unknown reference standard, is also proposed.
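
    As a minimal illustration of the constraint described above (and not the authors' actual method), the sketch below summarises ordinal ratings using only counting, order and equality comparisons: level frequencies, the cumulative distribution, the median level, and the exact-agreement rate between a measuring system and a reference standard. The scale, the tiny data set and all names are hypothetical.

        from collections import Counter

        levels = ["poor", "fair", "good", "excellent"]  # ordered ordinal scale (hypothetical)
        reference = ["good", "fair", "good", "excellent", "fair", "good"]  # reference standard ratings
        system    = ["good", "good", "good", "excellent", "fair", "fair"]  # measuring system ratings

        # Frequencies and cumulative distribution: only counting, no arithmetic on the levels themselves.
        freq = Counter(system)
        n = len(system)
        cum = 0.0
        cumulative = {}
        for lev in levels:
            cum += freq.get(lev, 0) / n
            cumulative[lev] = cum

        # Median level: the lowest level whose cumulative frequency reaches 0.5 (order comparison only).
        median_level = next(lev for lev in levels if cumulative[lev] >= 0.5)

        # Exact-agreement rate with the reference: equality comparison is legitimate on an ordinal scale.
        agreement = sum(s == r for s, r in zip(system, reference)) / n

        print(cumulative, median_level, agreement)
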
  • Using the Output of Sequential Simplex Optimisation to Create a Response Surface

    Authors: Lois Dodson, Matthew Dodson, René Klerx
    Affiliation: SKF AB
    Primary area of focus / application: Design and analysis of experiments
    Keywords: DOE, simplex, optimisation, response surface
    Submitted at 21-Mar-2011 07:47 by Rene Klerx
    Accepted (view paper)
    6-Sep-2011 10:45 Using the Output of Sequential Simplex Optimisation to Create a Response Surface
    Sequential simplex optimisation is often used as an alternative to a response surface, especially when more than one output is involved. A weakness of sequential simplex optimisation is that no model is created. This paper demonstrates how to use the results of sequential simplex optimisation to generate a response surface.
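
    A minimal sketch of the general idea (not necessarily the authors' procedure): log every point evaluated during a Nelder-Mead simplex search, then fit a quadratic response surface to those points by least squares. The objective function and settings below are hypothetical.

        import numpy as np
        from scipy.optimize import minimize

        evaluated = []                       # (x1, x2, y) for every simplex evaluation

        def response(x):                     # hypothetical process response to be minimised
            y = (x[0] - 2.0)**2 + 0.5 * (x[1] - 1.0)**2 + 0.1 * x[0] * x[1]
            evaluated.append((x[0], x[1], y))
            return y

        minimize(response, x0=[0.0, 0.0], method="Nelder-Mead")   # sequential simplex search

        # Fit a full quadratic surface y ~ b0 + b1*x1 + b2*x2 + b3*x1^2 + b4*x2^2 + b5*x1*x2
        pts = np.array(evaluated)
        x1, x2, y = pts[:, 0], pts[:, 1], pts[:, 2]
        X = np.column_stack([np.ones_like(x1), x1, x2, x1**2, x2**2, x1 * x2])
        coef, *_ = np.linalg.lstsq(X, y, rcond=None)
        print(coef)                          # coefficients of the fitted response surface
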
  • Comparison of Calculus Based Robustness Experiments and the Taguchi Approach

    Authors: Matthew Dodson, Lois Dodson, René Klerx
    Affiliation: SKF AB
    Primary area of focus / application: Design and analysis of experiments
    Keywords: Taguchi, derivative, response surface, calculus
    Submitted at 21-Mar-2011 08:18 by Rene Klerx
    Accepted (view paper)
    5-Sep-2011 16:35 Comparison of Calculus Based Robustness Experiments and the Taguchi Approach
    Derivatives may be used to estimate the output variance of a system when the variances of the system inputs are known. Applying this technique to a model determined from a response surface proves to be more efficient than using replications to estimate the variance in an experimental setting. This paper will use a case study to compare the results and the number of experimental trials required for both the Taguchi approach and the calculus-based approach.
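
    For context, the calculus-based idea is first-order error propagation: the output variance is approximated by the sum over inputs of (df/dx_i)^2 * Var(x_i). The sketch below applies this to a hypothetical fitted response-surface model using finite-difference derivatives; it illustrates the general technique, not the paper's case study.

        import numpy as np

        def fitted_model(x):
            # Hypothetical quadratic model obtained from a response surface study.
            x1, x2 = x
            return 3.0 + 1.2 * x1 - 0.8 * x2 + 0.4 * x1**2 + 0.15 * x1 * x2

        nominal = np.array([1.0, 2.0])       # nominal input settings (hypothetical)
        input_var = np.array([0.05, 0.10])   # known variances of the inputs (hypothetical)

        # Central-difference estimates of the partial derivatives at the nominal point.
        h = 1e-5
        grad = np.zeros_like(nominal)
        for i in range(len(nominal)):
            up, down = nominal.copy(), nominal.copy()
            up[i] += h
            down[i] -= h
            grad[i] = (fitted_model(up) - fitted_model(down)) / (2 * h)

        # First-order (delta-method) approximation of the output variance.
        output_var = np.sum(grad**2 * input_var)
        print(grad, output_var)
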
  • Integrated Models in Healthcare

    Authors: Yifat Lavi
    Affiliation: UNITO, Turin, Italy
    Primary area of focus / application: Six Sigma
    Keywords: health care, quality service, human resource
    Submitted at 31-Mar-2011 16:28 by Yifat Lavi
    Accepted (view paper)
    7-Sep-2011 12:00 Integrated Models in Healthcare
    Achieving excellence can be a hard task. Many companies and organizations have the ambition of becoming exceptional for the benefit of their customers, investors and employees. In order to achieve such goals, one needs a robust methodology, management support and hard work.
    In a healthcare system, being exceptional is an ethical obligation, not merely an ambition.
    Healthcare processes are a key determinant of the quality of care. Delays in test results, mistakes in administering medicine, missing information about a patient's health history and radiology retakes are only a few examples. A lack of consistent procedures and incorrect treatments are a major health hazard in a hospital.
    Healthcare workers in all departments are expected to continuously improve the quality, timeliness, and cost of their services to the community. Six Sigma is a management methodology which combines the reduction of waste and complexity of lean manufacturing with quality improvement and statistical data analysis. Six Sigma and the related methodology of Lean Six Sigma allow healthcare workers to get more for their patients and increase the effectiveness of the services they provide.
    Employees in hospitals play a large role in the total service outcome. All employees provide service, whether to internal or external customers. It is essential to make sure they give the best possible service, whether administrative or physiological. To this end, we need to learn what drives employees and what will ensure their satisfaction at work, which in turn results in good service.
    Our research combines mathematical (Six Sigma), economic (DEA) and practical (Lean) methods with human resources methodologies (human sigma) in order to achieve the level of excellence needed by healthcare providers all over the world.
  • Evaluation of the Effect of Debit Cards on Cash Demand by Propensity Score Regressions in a Principal Stratification Model

    Authors: Andrea Mercatanti, Luca Arciero
    Affiliation: Economic and Financial Statistics Department, Italian Central Bank
    Primary area of focus / application: Finance
    Keywords: debit card, stratification, cash holding behaviour
    Submitted at 6-Apr-2011 11:33 by Andrea Mercatanti
    Accepted
    6-Sep-2011 16:05 Evaluation of the Effect of Debit Cards on Cash Demand by Propensity Score Regressions in a Principal Stratification Model
    Innovation in transaction technology has been argued to modify the cash holding behaviour of agents, as debit cardholders may either withdraw cash from ATMs or purchase items using POS devices at retailers. In this paper, we quantify the effect of debit card use on the level of money inventories held by Italian households by means of a well-established causal framework: the potential outcomes approach. To account for the relevant share of Italian debit cardholders who do not use this payment instrument, a model based on the concept of principal stratification is proposed. We illustrate a set of assumptions under which the model can identify the effect for people who use debit cards. In particular, we require that, conditionally on a set of observable covariates, there are no unobservable variables associated both with the level of money inventories and with the holding of cards. An estimation method based on propensity score regressions is then proposed and its consistency is proved. Finally, the application yields significant negative effects of debit card use on cash levels for the different waves considered.
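
    A very rough sketch of the generic propensity-score-regression idea (not the paper's principal stratification estimator): estimate the probability of holding a debit card from observed covariates, then include that score as a covariate when regressing cash holdings on card use. The data here are simulated purely for illustration; variable names and model form are all hypothetical.

        import numpy as np
        from sklearn.linear_model import LogisticRegression, LinearRegression

        rng = np.random.default_rng(0)
        n = 2000

        # Hypothetical household covariates (e.g. standardised income and age).
        X = rng.normal(size=(n, 2))
        # Hypothetical card-use indicator, more likely for households with a larger first covariate.
        card = (rng.random(n) < 1 / (1 + np.exp(-(0.8 * X[:, 0] - 0.4 * X[:, 1])))).astype(int)
        # Hypothetical cash holdings: depend on the covariates, with a negative effect of card use.
        cash = 10.0 + 1.5 * X[:, 0] + 0.5 * X[:, 1] - 2.0 * card + rng.normal(scale=1.0, size=n)

        # Step 1: propensity score = estimated probability of card use given the covariates.
        ps = LogisticRegression().fit(X, card).predict_proba(X)[:, 1]

        # Step 2: outcome regression on card use and the propensity score.
        design = np.column_stack([card, ps])
        effect = LinearRegression().fit(design, cash).coef_[0]
        print(effect)   # crude estimate of the card-use effect after adjusting for the propensity score
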
  • Using DOE with Tolerance Intervals to Verify Specifications

    Authors: Pat Whitcomb Shari Kraber
    Affiliation: Stat-Ease, Inc., 2021 E. Hennepin Ave, Suite 480, Minneapolis, MN 55413
    Primary area of focus / application: Design and analysis of experiments
    Keywords: DOE, RSM, Optimization, Operating window, Specifications, Tolerance interval
    Submitted at 7-Apr-2011 18:31 by Pat Whitcomb
    Accepted (view paper)
    5-Sep-2011 11:50 Using DOE with Tolerance Intervals to Verify Specifications
    Design of experiments followed by numeric and graphical optimization can be used to find a ‘sweet spot’; i.e. an operating window. To ensure specifications are consistently met, uncertainty (variability) must be accounted for when defining boundaries of the operating window. We present a method of accounting for uncertainty using tolerance intervals. In general, larger sample sizes (DOEs) are required to control the width of tolerance intervals as compared to confidence intervals that are typically used to size designs. A two-factor tableting process is used to illustrate building and sizing a design to control the width of the tolerance intervals associated with specifications.
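
    As a small companion illustration (not Stat-Ease's implementation), the sketch below computes a two-sided normal tolerance interval using Howe's approximation for the k-factor; the sample statistics are hypothetical.

        import numpy as np
        from scipy.stats import norm, chi2

        def tolerance_k(n, coverage=0.99, confidence=0.95):
            # Howe's approximation to the two-sided normal tolerance factor.
            z = norm.ppf((1 + coverage) / 2)
            nu = n - 1
            chi2_low = chi2.ppf(1 - confidence, nu)   # lower-tail chi-square quantile
            return z * np.sqrt(nu * (1 + 1 / n) / chi2_low)

        # Hypothetical summary statistics from a designed experiment at a candidate setting.
        n, mean, sd = 20, 101.3, 2.4
        k = tolerance_k(n)
        lower, upper = mean - k * sd, mean + k * sd
        print(k, lower, upper)   # interval expected to contain 99% of units with 95% confidence
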