Normalization is a Rule
Normalization is a rule, and like any other rule, it is made to be broken in certain cases. If you are responsible for changing a table's structure and applying the normalization approach, you should take it only as far as it makes sense.
There will be times when it simply does not help to break apart a table into separate related tables.
You will have to use your best judgment to determine when these cases come up. Normalization is a systematic way of ensuring that a database structure is suitable for general-purpose querying and free of certain undesirable
characteristics (insertion, update, and deletion anomalies) that could lead to a loss of data integrity.
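To make the anomalies concrete, here is a minimal sketch using SQLite and a hypothetical denormalized orders table (the table and data are invented for illustration). Because the customer's city is repeated on every order row, updating only one row leaves stale copies behind, which is the classic update anomaly:

```python
import sqlite3

# Hypothetical denormalized table: each order row repeats the
# customer's city, so the same fact is stored more than once.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (order_id INTEGER, customer TEXT, city TEXT)")
conn.executemany("INSERT INTO orders VALUES (?, ?, ?)", [
    (1, "Ada", "London"),
    (2, "Ada", "London"),
])

# Update anomaly: changing the city on only one row leaves the other
# row with stale data -- the database now contradicts itself.
conn.execute("UPDATE orders SET city = 'Paris' WHERE order_id = 1")
cities = {row[0] for row in conn.execute(
    "SELECT city FROM orders WHERE customer = 'Ada'")}
print(cities)  # two different cities recorded for the same customer
```

The same redundancy also invites insertion anomalies (you cannot record a customer's city without an order) and deletion anomalies (deleting the last order erases the city as well).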
Database normalization is the process of organizing the fields and tables of a relational database to minimize redundancy and dependency. Normalization usually involves dividing large tables into smaller (and less redundant) tables and defining relationships between them.
The objective is to isolate data so that additions, deletions, and modifications of a field can be made in just one table and then propagated through the rest of the database using the defined relationships.
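Continuing the hypothetical example, the sketch below shows the same data after splitting it into two related tables. The table names and data are illustrative, not prescriptive; the point is that the city is now stored in exactly one row, so one update propagates to every order through the defined relationship:

```python
import sqlite3

# A sketch of the same data after normalization: the customer's city
# lives in exactly one row, and orders reference the customer by key.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE customers (customer_id INTEGER PRIMARY KEY,
                            name TEXT, city TEXT);
    CREATE TABLE orders (order_id INTEGER PRIMARY KEY,
                         customer_id INTEGER REFERENCES customers);
    INSERT INTO customers VALUES (1, 'Ada', 'London');
    INSERT INTO orders VALUES (1, 1), (2, 1);
""")

# One update in one table; every order sees the new value via the join.
conn.execute("UPDATE customers SET city = 'Paris' WHERE customer_id = 1")
cities = {row[0] for row in conn.execute(
    "SELECT c.city FROM orders o JOIN customers c USING (customer_id)")}
print(cities)  # a single, consistent city for all of Ada's orders
```

The trade-off is the join at query time, which is one reason the rule is sometimes deliberately broken for read-heavy workloads.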
Edgar F. Codd, the inventor of the relational model, introduced the concept of normalization and what we now know as the First Normal Form (1NF) in 1970.
Codd went on to define the Second Normal Form (2NF) and Third Normal Form (3NF) in 1971,
and Codd and Raymond F. Boyce defined the Boyce-Codd Normal Form (BCNF) in 1974. Informally, a relational database table is often described as "normalized" if it is in the Third Normal Form. Most 3NF tables are free of insertion, update, and deletion anomalies.