Performance – Database Speed Optimization: a few tables with many rows, or many tables with few rows each?

I have a big question.

Let’s take a company’s order database as an example.

Suppose this company produces about 2,000 orders per month, so about 24K orders per year, and they don’t want to delete any orders, even ones that are 5 years old (hey, this is an example, the numbers don’t mean anything).

In terms of database query speed, is it better to have only one table, or is one table per year faster?

My idea is to create a new orders table every year, called orders_2008, orders_2009, etc.

Is this a good idea for speeding up database queries?

The data normally used is the current year’s data, so the fewer rows the better.
Obviously, searching all the order tables at the same time becomes a problem, because I would have to run a complicated UNION… but that situation is very rare in normal activity.
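To make the per-year idea concrete, here is a minimal sketch of that layout and of the cross-year query it forces. The table and column names are made up for illustration:

-- Hypothetical layout: one identically structured table per year
CREATE TABLE orders_2008 (
    order_id    bigint        PRIMARY KEY,
    ordered_at  timestamp     NOT NULL,
    customer_id bigint        NOT NULL,
    total       numeric(12,2)
);
-- orders_2009, orders_2010, ... would be created with the same definition

-- Common case: queries touch only the small current-year table
SELECT * FROM orders_2009 WHERE customer_id = 42;

-- Rare case: searching across years forces a UNION ALL over every yearly table
SELECT * FROM orders_2008 WHERE customer_id = 42
UNION ALL
SELECT * FROM orders_2009 WHERE customer_id = 42;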

I think it is better to have an application where 95% of the queries are fast and the rest are a bit slow, rather than an application that is always somewhat slow.

My current database has 130 tables; the new version of my application should have about 200-220 tables… about 40% of which would be duplicated every year.

Any suggestions?

Edit: the RDBMS will probably be PostgreSQL, maybe (I hope not) MySQL.
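Since PostgreSQL is mentioned as a likely target, it is worth noting that its built-in range partitioning can give a physical per-year split behind a single logical table, so queries never need a manual UNION. A minimal sketch, assuming PostgreSQL 10 or later and hypothetical column names:

-- One logical orders table, physically split into yearly partitions
CREATE TABLE orders (
    order_id    bigint        NOT NULL,
    ordered_at  timestamp     NOT NULL,
    customer_id bigint        NOT NULL,
    total       numeric(12,2)
) PARTITION BY RANGE (ordered_at);

CREATE TABLE orders_2008 PARTITION OF orders
    FOR VALUES FROM ('2008-01-01') TO ('2009-01-01');
CREATE TABLE orders_2009 PARTITION OF orders
    FOR VALUES FROM ('2009-01-01') TO ('2010-01-01');

-- A date-filtered query scans only the matching yearly partitions;
-- a query without the date filter scans all partitions, with no manual UNION needed.
SELECT * FROM orders
WHERE ordered_at >= DATE '2009-01-01' AND customer_id = 42;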

Smaller tables are faster.

If you have historical records that you rarely use, then moving those historical records into separate tables will be faster.

This is the point of a data warehouse: separating operational data from historical data.

You can run regular extracts from the operational tables and load them into the history tables. You keep all the data; it is just kept separate.
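A minimal sketch of such a periodic extract-and-load step, assuming an operational orders table with an ordered_at column and an identically structured orders_history table (all names hypothetical):

-- Move old rows from the operational table into the history table.
-- Run periodically (for example once a year); the transaction ensures rows
-- are never visible in both tables, or in neither, part-way through.
BEGIN;

INSERT INTO orders_history
SELECT * FROM orders
WHERE ordered_at < '2009-01-01';

DELETE FROM orders
WHERE ordered_at < '2009-01-01';

COMMIT;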
