
Comparative analysis of cost and elapsed time of normalization and de-normalization in the very large database

  Seok Tai Chun, Jihyun Lee*, Cheol Jung Yoo

  *Corresponding author for this work

    Research output: Contribution to conference › Chapter › peer-review

    Abstract

    Today, the data processed by information systems is growing rapidly and becoming more complex, leading to problems with data integration, standardization, and quality. This explosive growth in data causes performance problems both for users seeking information and for the operators who serve them. Industry practice is to define a normalized or de-normalized data model and build the database from it to address the performance problems of such very large databases (VLDBs). However, it is not well known how these modeling choices affect actual performance. It is therefore necessary to confirm whether databases built from normalized data models, and databases built from data models that incorporate de-normalization, actually improve performance and simplify development and operation. In this paper, we analyze the cost and processing time of de-normalization in a VLDB, based on a case study of building a database for the business-to-business service of a large retailer. As a result, the de-normalized database achieved 15% faster processing time at a cost of 0.2% of that of the normalized database.
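
    The paper's measurements come from the retailer's production B2B database and are not reproducible here. As a minimal, self-contained sketch of the trade-off the abstract describes, the example below times the same aggregate query against a normalized two-table design (which requires a join) and a de-normalized flat table. The customers/orders/orders_flat schema, the row counts, and the use of SQLite are assumptions made for illustration, not the paper's actual setup.

```python
# Hypothetical illustration of normalized vs. de-normalized read paths.
# Table and column names are invented for this sketch, not the paper's schema.
import sqlite3
import time

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Normalized design: orders reference customers by key (3NF-style).
cur.execute("CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT)")
cur.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, "
            "customer_id INTEGER REFERENCES customers(id), amount REAL)")

# De-normalized design: customer attributes copied into each order row.
cur.execute("CREATE TABLE orders_flat (id INTEGER PRIMARY KEY, "
            "customer_name TEXT, amount REAL)")

cur.executemany("INSERT INTO customers VALUES (?, ?)",
                [(i, f"cust{i}") for i in range(10_000)])
cur.executemany("INSERT INTO orders VALUES (?, ?, ?)",
                [(i, i % 10_000, float(i)) for i in range(200_000)])
cur.executemany("INSERT INTO orders_flat VALUES (?, ?, ?)",
                [(i, f"cust{i % 10_000}", float(i)) for i in range(200_000)])
conn.commit()

def timed(sql):
    """Run a query, discard the rows, and return elapsed wall-clock time."""
    t0 = time.perf_counter()
    cur.execute(sql).fetchall()
    return time.perf_counter() - t0

# The normalized read path pays for the join at query time; the de-normalized
# path reads a single table but stores each customer name redundantly.
print("join:", timed("SELECT c.name, SUM(o.amount) FROM orders o "
                     "JOIN customers c ON c.id = o.customer_id "
                     "GROUP BY c.name"))
print("flat:", timed("SELECT customer_name, SUM(amount) FROM orders_flat "
                     "GROUP BY customer_name"))
```

    The duplicated customer_name column shows the storage-redundancy cost that de-normalization incurs in exchange for avoiding the join, which is the cost-versus-elapsed-time trade-off the paper quantifies at VLDB scale.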

    Original language: English
    Title of host publication: Studies in Computational Intelligence
    Publisher: Springer Verlag
    Pages: 173-187
    Number of pages: 15
    DOIs
    State: Published - 2019

    Publication series

    Name: Studies in Computational Intelligence
    Volume: 786
    ISSN (Print): 1860-949X

    Keywords

    • Database de-normalization
    • Database modeling
    • Database normalization
    • VLDB

    Quacquarelli Symonds (QS) Subject Topics

    • Computer Science & Information Systems
    • Data Science
