I'm doing all I can to prevent this asinine change, mentioning the fact that everything we've built in the last 12 years will need to be checked / modified.
But, in case I lose, I wrote a script to modify the columns. It consists of a bunch of T-SQL commands like:
alter table course_table
alter column credit_hours decimal(4,1)
go
I ran one of these commands on a local subset and it took forever to finish. The full script will have to be done over the weekend.
So the question is: what happens to the log while this command is executing? Should I try dumping (backing up) the log after each ALTER COLUMN?
Any other advice to offer on this subject?
Thanks.
Depending on the type of change, ALTER TABLE results in either a metadata-only update or a rewrite of every row. In your case it has to rewrite every row, so the time it takes is roughly proportional to the number of rows, the row size, and the number of data pages. The ALTER TABLE runs as a single atomic operation, so there isn't much you can do to reduce the logging it generates: every row change in your case will be written to the transaction log. What you can do is take a log backup after each ALTER TABLE, or periodically, to keep the log from growing out of control. See the link below for more details on ALTER TABLE:
http://www.sqlmag.com/Article/ArticleID/40538/Inside_ALTER_TABLE.html
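In case it helps, here is a minimal sketch of that pattern: run one ALTER, then back up the log before moving on. The database name and backup path are placeholders (not from your script), and the BACKUP LOG step assumes the database is in the FULL or BULK_LOGGED recovery model; the DBCC SQLPERF check is just an optional way to watch log usage between statements.

-- Sketch only: placeholder database name and backup path; adjust to your environment.
alter table course_table
alter column credit_hours decimal(4,1)
go

-- Back up the transaction log so the space used by the ALTER can be reused
-- before the next one runs (requires FULL or BULK_LOGGED recovery model).
backup log CourseDB
to disk = 'D:\Backups\CourseDB_log_01.trn'
go

-- Optional: check how full the log is before starting the next ALTER.
dbcc sqlperf(logspace)
go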
That's what I was suspecting about the logs. Thanks for the quick reply.