Monday, March 12, 2012

Multiple tables or one large table

What are the performance considerations for deciding whether to have multiple
table sets or a single large table set with a key in each row?
Details:
SQL Server 2005
We have an application that uses 20 or so tables but will likely grow into
the small hundreds. We will potentially have hundreds of customers. Each
customer might add 200 to 10,000 rows per day to the database. We can:
A. Have a single set of tables where a customer key value identifies each
row.
B. Have a set of identical tables for each customer (both options are sketched below).
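For illustration, the two options might look something like this (table and
column names here are made up, not our real schema):

-- Option A: one shared table, with the customer key on every row
CREATE TABLE dbo.Orders (
    CustomerID  int      NOT NULL,
    OrderID     int      NOT NULL,
    OrderDate   datetime NOT NULL,
    Amount      money    NOT NULL,
    CONSTRAINT PK_Orders PRIMARY KEY CLUSTERED (CustomerID, OrderID)
);

-- Option B: an identical table per customer, no customer column needed
CREATE TABLE dbo.Orders_Customer0001 (
    OrderID     int      NOT NULL,
    OrderDate   datetime NOT NULL,
    Amount      money    NOT NULL,
    CONSTRAINT PK_Orders_Customer0001 PRIMARY KEY CLUSTERED (OrderID)
);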
We have considered the "management" aspect and would use DDL triggers to
ensure that the tables and indexes stay identical. We're concerned with the
performance difference.
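As a rough sketch of that DDL trigger idea, the trigger below only logs schema
changes to an assumed dbo.SchemaChangeLog table; actually re-applying each
change to every customer's copy of a table would need more logic than this:

CREATE TRIGGER trg_TrackSchemaChanges
ON DATABASE
FOR CREATE_TABLE, ALTER_TABLE, CREATE_INDEX, DROP_INDEX
AS
BEGIN
    DECLARE @evt xml;
    SET @evt = EVENTDATA();

    -- Record what changed so the same change can be reviewed and applied elsewhere
    INSERT INTO dbo.SchemaChangeLog (EventType, ObjectName, CommandText, LoggedAt)
    VALUES (
        @evt.value('(/EVENT_INSTANCE/EventType)[1]',               'nvarchar(128)'),
        @evt.value('(/EVENT_INSTANCE/ObjectName)[1]',              'nvarchar(256)'),
        @evt.value('(/EVENT_INSTANCE/TSQLCommand/CommandText)[1]', 'nvarchar(max)'),
        GETDATE()
    );
END;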
What effects will each option have on caching? Indexing? Overall speed of
data retrieval?
"Sierra" <Sierra@.discussions.microsoft.com> wrote in message
news:EBB395BF-475F-4125-8E12-B4FD7D873DFE@.microsoft.com...
Assuming that:
- Customer is the leading column in every primary key
- All queries have a Customer parameter
> What effects will each option have on caching? Indexing? Overall speed of
> data retrieval?
Minimal, probably. And the overall size of the database doesn't sound too
big, so I wouldn't make any drastic decisions on the basis of performance.
I would definitely design the logical schema to support multiple customers
per database. Once you have that, you can consolidate all customers into a
single database, break them into a few databases, or even go to one database
per customer.
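Roughly, those two assumptions amount to something like this (names are just
for illustration):

-- Customer leads the clustered primary key, so each customer's rows are
-- stored together in the index
CREATE TABLE dbo.Invoices (
    CustomerID  int      NOT NULL,
    InvoiceID   int      NOT NULL,
    InvoiceDate datetime NOT NULL,
    CONSTRAINT PK_Invoices PRIMARY KEY CLUSTERED (CustomerID, InvoiceID)
);

-- Every query passes the customer, so the optimizer can seek straight to
-- that customer's range of the index
CREATE PROCEDURE dbo.GetRecentInvoices
    @CustomerID int,
    @Since      datetime
AS
    SELECT InvoiceID, InvoiceDate
    FROM dbo.Invoices
    WHERE CustomerID = @CustomerID
      AND InvoiceDate >= @Since;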
David
|||
I would not have a separate table, or tables, for each customer. Consider
archiving old data at a certain point to limit the number of records.
Definitely normalize your tables, but don't go to extremes, because that can
backfire from a performance standpoint too.
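As a rough sketch of the kind of periodic archiving being suggested
(dbo.Orders and dbo.OrdersArchive are illustrative names, and in practice
you would probably move the rows in smaller batches):

-- Move rows older than two years into an archive table with the same structure
BEGIN TRANSACTION;

INSERT INTO dbo.OrdersArchive (CustomerID, OrderID, OrderDate, Amount)
SELECT CustomerID, OrderID, OrderDate, Amount
FROM dbo.Orders
WHERE OrderDate < DATEADD(year, -2, GETDATE());

DELETE FROM dbo.Orders
WHERE OrderDate < DATEADD(year, -2, GETDATE());

COMMIT TRANSACTION;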
"Sierra" <Sierra@.discussions.microsoft.com> wrote in message
news:EBB395BF-475F-4125-8E12-B4FD7D873DFE@.microsoft.com...
> What are the performance considerations for deciding whether to have
> multiple table sets or a single large table set with a key in each row?
