The problem is that you haven't defined your conditions well enough. That is, what exactly are you optimizing for?

For example, is the system offline for nightly maintenance, with no users in the system? And are you deleting a large percentage of the database?
If you are offline and deleting a large percentage, it may make sense to simply build a new table containing the data you want to keep, drop the old table, and rename the new one into its place (a sketch follows below). If you are deleting a smaller percentage, you probably want to batch things up in batches as large as your log space allows. It depends entirely on your database, but dropping indexes for the duration of the rebuild can hurt or help, and it may only be feasible at all because the system is "offline".
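As a rough illustration of the offline copy-drop-rename approach, here is a minimal T-SQL sketch; the table name and the keep-condition (`dbo.BigTable`, `Status = 'Active'`) are hypothetical placeholders, not anything from the question:

```sql
-- Offline approach: copy only the rows you want to KEEP into a new table,
-- then swap it into place. Table and column names are hypothetical.
SELECT *
INTO dbo.BigTable_New
FROM dbo.BigTable
WHERE Status = 'Active';   -- condition for rows to keep

DROP TABLE dbo.BigTable;

EXEC sp_rename 'dbo.BigTable_New', 'BigTable';

-- Note: SELECT ... INTO does not carry over indexes, constraints, triggers,
-- or permissions; recreate those on the renamed table before going back online.
```

Under the SIMPLE or BULK_LOGGED recovery models the copy is minimally logged, which is far cheaper than logging every individual deleted row.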
If online, what is the likelihood that your deletes will conflict with user activity (and is that activity predominantly reads, updates, or what)? And are you trying to optimize for the user experience or for the speed of your query? If you are deleting from a table that is frequently updated by other users, you still want to batch, but with smaller batch sizes (see the sketch below). Even if you can do something like lock the table to ensure isolation, that does not do much good if your delete statement takes an hour to run.
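A minimal sketch of the small-batch pattern, again with a hypothetical table name and delete condition:

```sql
-- Batched delete: each small batch commits quickly, so locks are held only
-- briefly and concurrent users are not blocked for long stretches.
DECLARE @BatchSize int = 1000;   -- tune to your workload and log space

WHILE 1 = 1
BEGIN
    DELETE TOP (@BatchSize)
    FROM dbo.BigTable
    WHERE CreatedDate < '2009-01-01';   -- condition for rows to delete

    IF @@ROWCOUNT < @BatchSize
        BREAK;   -- last partial batch done; nothing left matching

    -- Optional pause between batches to let other sessions get in
    WAITFOR DELAY '00:00:01';
END
```

Because each batch is its own transaction, the log can be reused (or backed up, in FULL recovery) between batches instead of growing for the duration of one giant delete.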
Once you have defined your conditions better, you can choose one of the other answers here. I like the link in Rob Sanders' post for batching things.
Matt Jun 05 '09 at 14:11