Five Common Database Design Mistakes, Part 1
Several factors can lead to a poor database design: lack of experience, a shortage of the necessary skills, tight timelines, and insufficient resources can all contribute. In turn, poor database design leads to many problems down the line, such as subpar performance, the inability to make changes to accommodate new features, and low-quality data that can cost both time and money as the application evolves.
I constantly hear from data professionals that if you don’t get the data right, nothing else matters. Yet many times it is the tail (i.e., the application) that’s wagging the dog rather than the dog wagging the tail. The database just comes along for the ride as the application grows in scope and functionality. In this article, we’ll focus on five common database design mistakes that can be easily avoided and suggest ways to correct them in future projects:
#1: Failing to plan
Not so long ago, planning seemed an obvious precursor to development. Database teams would no more build databases without careful planning than structural architects would erect buildings without well-considered blueprints. The belief was that a meticulous planning phase was the only way to ensure you didn’t end up with a database hacked together from arbitrary components that failed to take into account data integrity and system performance. If you didn’t invest the necessary time and effort up front to understand the data, the constraints, and the processes, you’d pay for it big time with a database that would need to be refactored, replaced, or scrapped altogether.
#2: Not knowing how to normalize
Normalization is often seen by non-database types as a way to over-engineer, add complexity, and ensure job security for years to come. This is as absurd as saying that network security has no purpose other than to make everyone’s life miserable. Normalization is the foundation on which SQL and relational database management systems (RDBMS) are built. It provides the structure necessary to access, modify, and manage the data effectively, while ensuring its integrity over the long term.
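To make this concrete, here is a minimal sketch of the most common normalization step: moving repeating customer details out of an orders table so that each fact is stored exactly once. The table and column names are hypothetical, invented for illustration rather than taken from any particular system:

```sql
-- Unnormalized: customer details repeat on every order row,
-- so the same customer's city may exist in many places at once.
CREATE TABLE Orders_Unnormalized (
    OrderID       INT PRIMARY KEY,
    OrderDate     DATE NOT NULL,
    CustomerName  VARCHAR(100) NOT NULL,
    CustomerEmail VARCHAR(255) NOT NULL,
    CustomerCity  VARCHAR(100) NOT NULL
);

-- Normalized: each customer fact lives in one row of one table,
-- and orders reference it through a foreign key.
CREATE TABLE Customers (
    CustomerID    INT PRIMARY KEY,
    CustomerName  VARCHAR(100) NOT NULL,
    CustomerEmail VARCHAR(255) NOT NULL,
    CustomerCity  VARCHAR(100) NOT NULL
);

CREATE TABLE Orders (
    OrderID    INT PRIMARY KEY,
    OrderDate  DATE NOT NULL,
    CustomerID INT NOT NULL REFERENCES Customers(CustomerID)
);
```

With the foreign key in place, the RDBMS itself guarantees that every order points to a real customer, and a change of address becomes a single-row update rather than a hunt through the entire orders table.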
A properly normalized database can also help improve performance, contrary to what denormalization advocates might suggest. Denormalizing too often or too soon can result in overly complex queries and locking contention, and it puts the data’s integrity at risk. A normalized data structure is integral to effective database design; deviate from it only after carefully analyzing the situation and weighing the available alternatives.
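As a hypothetical illustration of that integrity risk, consider what updating a customer’s city looks like in each of the sketch designs above. In the denormalized table, the same fact has to be changed on every order row; miss one row, or race with a concurrent insert, and the data disagrees with itself (a classic update anomaly):

```sql
-- Denormalized: one fact scattered across many rows must all be
-- updated together, or the database ends up contradicting itself.
UPDATE Orders_Unnormalized
SET    CustomerCity = 'Portland'
WHERE  CustomerEmail = 'pat@example.com';

-- Normalized: one row, one fact, one update.
UPDATE Customers
SET    CustomerCity = 'Portland'
WHERE  CustomerEmail = 'pat@example.com';
```

The normalized version is not only safer but typically cheaper: it touches a single row under a narrow lock instead of rewriting, and locking, a swath of the orders table.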