Data Model Normalization Task

From GM-RKB

A Data Model Normalization Task is a Task to restructure a Database Model so that it meets criteria that facilitate and improve the Performance of certain Database Activities, such as querying and updating.



References

2009

  • (Wikipedia, 2009) ⇒ http://en.wikipedia.org/wiki/Database_normalization
    • In the field of relational database design, normalization is a systematic way of ensuring that a database structure is suitable for general-purpose querying and free of certain undesirable characteristics — insertion, update, and deletion anomalies — that could lead to a loss of data integrity. [1] E.F. Codd, the inventor of the relational model, introduced the concept of normalization and what we now know as the first normal form in 1970. [2] Codd went on to define the second and third normal forms in 1971 [3]; and Codd and Raymond F. Boyce defined the Boyce-Codd normal form in 1974. [4] Higher normal forms were defined by other theorists in subsequent years, the most recent being the sixth normal form introduced by Chris Date, Hugh Darwen, and Nikos Lorentzos in 2002. [5]
    • Informally, a relational database table (the computerized representation of a relation) is often described as "normalized" if it is in the third normal form (3NF). [6] Most 3NF tables are free of insertion, update, and deletion anomalies, i.e. in most cases 3NF tables adhere to BCNF, 4NF, and 5NF (but typically not 6NF).
    • A standard piece of database design guidance is that the designer should begin by fully normalizing the design, and selectively denormalize only in places where doing so is absolutely necessary to address performance issues. [7] However, some modeling disciplines, such as the dimensional modeling approach to data warehouse design, explicitly recommend non-normalized designs, i.e. designs that in large part do not adhere to 3NF. [8]
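The update anomaly mentioned above can be made concrete with a small sketch. The following uses an in-memory SQLite database and a hypothetical order-tracking schema (the table and column names are illustrative, not from the source): a denormalized table repeats the customer's city on every order row, so correcting the city means editing many rows, while a 3NF decomposition stores it exactly once.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Denormalized: customer_city depends on customer_name rather than on the
# key (order_id), so a customer's city is repeated on every order row.
# Changing it requires updating all of those rows -- an update anomaly.
cur.execute("""CREATE TABLE orders_flat (
    order_id INTEGER PRIMARY KEY,
    customer_name TEXT,
    customer_city TEXT,
    item TEXT)""")
cur.executemany("INSERT INTO orders_flat VALUES (?, ?, ?, ?)",
                [(1, "Ada", "London", "widget"),
                 (2, "Ada", "London", "gadget")])

# 3NF decomposition: every non-key attribute depends on the key,
# the whole key, and nothing but the key.
cur.execute("""CREATE TABLE customers (
    customer_id INTEGER PRIMARY KEY,
    name TEXT,
    city TEXT)""")
cur.execute("""CREATE TABLE orders (
    order_id INTEGER PRIMARY KEY,
    customer_id INTEGER REFERENCES customers(customer_id),
    item TEXT)""")
cur.execute("INSERT INTO customers VALUES (1, 'Ada', 'London')")
cur.executemany("INSERT INTO orders VALUES (?, ?, ?)",
                [(1, 1, "widget"), (2, 1, "gadget")])

# The city now lives in exactly one row; a single UPDATE is consistent
# across every order that joins to this customer.
cur.execute("UPDATE customers SET city = 'Paris' WHERE customer_id = 1")
rows = cur.execute("""SELECT o.order_id, c.city
                      FROM orders o
                      JOIN customers c ON o.customer_id = c.customer_id
                      ORDER BY o.order_id""").fetchall()
print(rows)  # both orders now reflect the corrected city
```

The same decomposition also removes the insertion anomaly (a customer can be recorded before placing any order) and the deletion anomaly (deleting the last order no longer erases the customer's city).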