
sql-server-ce - How often should you compact a SQL CE database?

Reposted · Author: 行者123 · Updated: 2023-12-04 02:16:21

Do you even need to compact a SQL CE database periodically? Is autoshrink sufficient? Our average database is about 100 MB, with heavy users reaching 400-500 MB (though that is very rare). If we do have to compact manually, how do we know when we should? Is there a way to programmatically determine the fragmentation level or the percentage of wasted space? If not, what other threshold could we use?

A previous version of the product was built on an (gasp) MS Access database, so we had to compact regularly just to keep it working.

Best Answer

The way I see it, if you can set up the database to shrink and repair itself automatically as needed, that should be sufficient. That is why there is so little literature on best practices: the prevailing wisdom is that it "just works." Consequently, any guidance you find in this area will be vague.
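For reference, autoshrink behavior in SQL Server CE is configured through the connection string rather than through a maintenance job. A minimal sketch, assuming the standard System.Data.SqlServerCe provider; the file path is a placeholder, and 60 is the documented default threshold (autoshrink runs once free space exceeds that percentage of the file, while 100 disables it):

```csharp
// "Autoshrink Threshold" = percent of free space in the database
// file allowed before autoshrink kicks in (default 60; 100 = off).
var connStr = "Data Source=|DataDirectory|\\App.sdf;" +  // hypothetical path
              "Autoshrink Threshold=60;";
```

Lowering the threshold makes the engine reclaim space more aggressively at the cost of more frequent background work.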

Here is a (partial) quote from a webcast at http://www.microsoft.com/web/library/Details.aspx?id=sql-server-2008-compact-express-depoly-manage:

Maintaining your SQL Server Express Editions is fairly similar to managing any other multi-user database, meaning that we have the option to go in and deal with file groups, we can deal with backup options and recovery models and what not. [But] when we deal with compact editions or SQL Service CE, we don’t have nearly as many options. Really, the only options we have is how we want to deal with shrink and repair.



Here is another, from MSDN:
http://msdn.microsoft.com/en-us/library/ms838028.aspx#youcantakeitwithyou_sqlserverce_topic4

Note that while they provide details on the database architecture, they still do not give a maintenance plan. Their advice: compact when the database starts to slow down. Note also that this advice dates from around 2005, and things have improved since then; in particular, the maintenance procedures are now automated.

Keep Your House (or Database) in Order
Another big factor in the performance of large databases in SQL Server CE 2.0 is the organization of the database structure itself. As your application modifies the contents of the database, the records become more randomly distributed within the database file structure. This factor is especially true after a large number of inserts and deletes. To ensure optimal access to the database, compact the database after any substantial change to the contents.

In addition to recovering unused space, performing a compact on the database has two notable impacts on performance: first, it stores all table records in order by their primary key; second, it updates the statistics used by the query processor.

Ordering the records by primary key can notably improve primary key access. This is due to the page-oriented nature of SQL Server CE (and most other databases). Rather than loading individual records from the database into memory, SQL Server CE loads blocks of records called pages. When the database records are grouped in order by primary key, loading the page containing one record automatically loads those records with similar primary key values. For most applications, this results in what's referred to as a good "hit rate," which means that when your application goes to access successive database records, there is a strong likelihood that the page containing those records is already in memory and can be directly accessed. When records are more randomly distributed, as often happens after a large number of inserts and deletes, there is a poor hit rate requiring SQL Server CE to retrieve more pages from the database file to access the same number of records.

The query processor statistics influence how the query processor determines the best method for locating records. Decisions like whether to use a key or do a sequential scan to locate a particular record are all influenced by the query processor statistics. As the statistics become stale, there is an increased likelihood that the query processor may make a less than optimal decision. Performing a compact refreshes these statistics.
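The manual compaction the quote recommends can also be triggered from code. A hedged sketch using the System.Data.SqlServerCe API, whose SqlCeEngine class exposes Shrink, Compact, and Repair; the file names here are placeholders:

```csharp
using System.Data.SqlServerCe;

// Sketch: maintenance of a SQL CE database file.
// Shrink() reclaims unused space in place and is the cheaper option.
// Compact() rebuilds the database into a new file, which also reorders
// rows by primary key and refreshes the query-processor statistics
// described above.
var engine = new SqlCeEngine("Data Source=App.sdf");  // hypothetical path
engine.Shrink();                                      // in-place space reclaim
engine.Compact("Data Source=App_compacted.sdf");      // full rebuild to a new file
engine.Dispose();
```

Since SQL CE exposes no fragmentation counter, a practical proxy threshold is to compare the .sdf file size against the space actually in use and compact when the gap grows large, or simply compact on a schedule during idle time.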



I sympathize with your experience with the Access database. However, I think you will find that your experience with SQL Server CE bears little resemblance to it.

Regarding "sql-server-ce - How often should you compact a SQL CE database?", we found a similar question on Stack Overflow: https://stackoverflow.com/questions/1156197/
