Five Common Database Design Mistakes, Part 1

Several factors can lead to a poor database design: lack of experience, a shortage of the necessary skills, tight timelines, and insufficient resources can all contribute. In turn, poor database design leads to many problems down the line, such as sub-par performance, the inability to make changes to accommodate new features, and low-quality data that can cost both time and money as the application evolves.

I constantly hear from data professionals that if you don’t get the data right, nothing else matters. However, many times it is the tail (i.e., the application) that’s wagging the dog (the database) rather than the other way around. The database just comes along for the ride as the application grows in scope and functionality. In this article we’ll focus on five common database design mistakes that can easily be avoided, and suggest ways to correct them in future projects:

#1: Failing to plan

Not so long ago, planning seemed an obvious precursor to development. Database teams would no more build databases without careful planning than structural architects would erect buildings without well-considered blueprints. The belief was that a meticulous planning phase was the only way to ensure you didn’t end up with a database hacked together from arbitrary components that failed to take into account data integrity and system performance. If you didn’t invest the necessary time and effort at the beginning to understand the data, the constraints, and the processes, you’d pay for it big time later with a database that would need to be refactored, replaced, or scrapped altogether.

#2: Not knowing how to normalize

Normalization is often seen by non-database types as a way to over-engineer, add complexity, and ensure job security in the coming years. This is as absurd as saying that network security has no purpose other than to make everyone’s life miserable. Normalization serves as the foundation on which SQL and relational database management systems (RDBMS) are built, and it provides the structure necessary to effectively access, modify, and manage the data while ensuring its integrity over the long term.

A properly normalized database can also help improve performance, contrary to what denormalization advocates might suggest. Denormalizing too often or too soon can result in overly complex queries and locking contention as well as put the data’s integrity at risk. A normalized data structure is integral to effective database design and deviating from this strategy should be done only after carefully analyzing the situation and weighing the available alternatives.
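To make the point concrete, here is a minimal sketch of what normalization looks like in practice, assuming a simple order-tracking scenario; the table and column names (customers, orders, customer_email, and so on) are hypothetical and not taken from any particular system.

-- Hypothetical unnormalized design: customer details are repeated on
-- every order row, so changing an email address means updating (and
-- possibly missing) many rows.
-- CREATE TABLE orders_unnormalized (
--     order_id       INT PRIMARY KEY,
--     customer_name  VARCHAR(100),
--     customer_email VARCHAR(100),
--     product_name   VARCHAR(100),
--     order_date     DATE
-- );

-- A normalized alternative: customer details live in one table and
-- each order references its customer by key.
CREATE TABLE customers (
    customer_id    INT PRIMARY KEY,
    customer_name  VARCHAR(100) NOT NULL,
    customer_email VARCHAR(100) NOT NULL UNIQUE
);

CREATE TABLE orders (
    order_id     INT PRIMARY KEY,
    customer_id  INT NOT NULL REFERENCES customers (customer_id),
    product_name VARCHAR(100) NOT NULL,
    order_date   DATE NOT NULL
);

With the customer data held in one place, updating an email address touches a single row, and the foreign key keeps every order pointing at a real customer, which is exactly the kind of integrity guarantee normalization is meant to provide.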

