Moving SQL Server Database Changes from Development to Production Environment

I need help with automating database deployment between two SQL Server instances, particularly with SQL Server 2005.

Currently I have a development environment and a production server. I want to integrate this process into my build scripts using standard Windows batch files, though I might move to PowerShell later. GUI tools like Enterprise Manager or Management Studio Express won’t work for this automation.

My main questions are:

  1. File Copy Method: Should I simply copy the .mdf file and attach it to the target server? I’m concerned about potential compatibility issues with binary files, even though both servers run identical versions.

  2. Script Generation: Since T-SQL doesn’t have an equivalent to MySQL’s “SHOW CREATE TABLE”, what’s the best way to export the database structure as SQL scripts? Are there command-line tools that can automatically generate the creation scripts from an existing database?

  3. Schema Synchronization: The production database already has live data, so I can’t just recreate tables. I need to compare structural differences and use ALTER TABLE statements instead. This might also require data validation and conversion when existing columns change.

I’ve heard good things about Red Gate products, but they’re too expensive for personal projects. What tools or methods do you use for automated SQL Server database deployment from test to production environments?

I’ve encountered similar issues in the past with SQL Server 2005. It’s advisable not to copy .mdf files directly between environments, due to potential complications with file paths and security. Instead, drive backup and restore operations from your batch scripts with sqlcmd: back up the development database, copy the .bak file over, and restore it under a temporary name on the production server so you have something to compare against.
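Something along these lines has worked for me; the server names, database name, logical file names, and paths below are placeholders you’d swap for your own:

```sql
-- backup_dev.sql  (run with:  sqlcmd -S DEVSERVER -E -i backup_dev.sql)
-- Database name and backup path are placeholders.
BACKUP DATABASE MyAppDb
TO DISK = N'C:\Backups\MyAppDb_dev.bak'
WITH INIT,      -- overwrite any existing backup set in this file
     CHECKSUM;  -- verify page checksums while writing the backup
GO

-- restore_devcopy.sql  (run with:  sqlcmd -S PRODSERVER -E -i restore_devcopy.sql)
-- Restores the dev backup under a temporary name, for comparison only.
-- Check the logical file names first with RESTORE FILELISTONLY.
RESTORE DATABASE MyAppDb_devcopy
FROM DISK = N'C:\Backups\MyAppDb_dev.bak'
WITH MOVE 'MyAppDb'     TO N'D:\Data\MyAppDb_devcopy.mdf',
     MOVE 'MyAppDb_log' TO N'D:\Data\MyAppDb_devcopy_log.ldf',
     REPLACE;
GO
```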

To compare schemas without incurring costs from commercial tools, query the INFORMATION_SCHEMA views using SQLCMD scripts. This allows you to extract table structures and automatically generate ALTER statements. Creating a master script that compares both databases can assist in identifying structural differences.
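As a rough sketch of that idea (it assumes the dev copy was restored on the same instance as production, as above, and the database names are made up), the query below emits ALTER TABLE … ADD statements for columns that exist in dev but are missing from production. Running it through sqlcmd with -h -1 -o alter_script.sql captures the output as a script you can review before applying:

```sql
-- compare_columns.sql: generate ALTER TABLE ... ADD statements for columns
-- present in the dev copy but missing from production.
-- Only covers tables that already exist in both databases; brand-new tables
-- need CREATE TABLE scripts instead. Database names are placeholders.
SET NOCOUNT ON;

SELECT 'ALTER TABLE [' + dev.TABLE_SCHEMA + '].[' + dev.TABLE_NAME + '] ADD ['
       + dev.COLUMN_NAME + '] ' + dev.DATA_TYPE
       + CASE WHEN dev.CHARACTER_MAXIMUM_LENGTH IS NOT NULL
              THEN '(' + CASE WHEN dev.CHARACTER_MAXIMUM_LENGTH = -1 THEN 'MAX'
                              ELSE CAST(dev.CHARACTER_MAXIMUM_LENGTH AS varchar(10)) END + ')'
              ELSE '' END
       + CASE WHEN dev.IS_NULLABLE = 'NO' THEN ' NOT NULL' ELSE ' NULL' END
       + ';'
FROM MyAppDb_devcopy.INFORMATION_SCHEMA.COLUMNS AS dev
LEFT JOIN MyAppDb.INFORMATION_SCHEMA.COLUMNS AS prod
       ON  prod.TABLE_SCHEMA = dev.TABLE_SCHEMA
       AND prod.TABLE_NAME   = dev.TABLE_NAME
       AND prod.COLUMN_NAME  = dev.COLUMN_NAME
WHERE prod.COLUMN_NAME IS NULL
  AND EXISTS (SELECT 1
              FROM MyAppDb.INFORMATION_SCHEMA.TABLES AS pt
              WHERE pt.TABLE_SCHEMA = dev.TABLE_SCHEMA
                AND pt.TABLE_NAME   = dev.TABLE_NAME);
```

The same pattern extends to changed data types, dropped columns, indexes, and constraints. Keep in mind that adding a NOT NULL column to a table that already holds rows also needs a DEFAULT clause or a follow-up UPDATE.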

When handling live data, always have a rollback strategy in place. Execute deployment scripts within transactions to ensure you have clear rollback points. It’s crucial to test the entire deployment process on a staging environment that accurately reflects your production data’s size and structure. This method has significantly reduced deployment issues for me.
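A minimal sketch of that pattern on SQL Server 2005 (the table and column names here are invented):

```sql
-- deploy_changes.sql: run the generated schema changes as one transaction so a
-- failure part-way through leaves production untouched.
SET XACT_ABORT ON;   -- make any runtime error doom the whole transaction
BEGIN TRY
    BEGIN TRANSACTION;

    -- generated or hand-written schema changes go here, for example:
    ALTER TABLE dbo.Customers ADD LastLoginDate datetime NULL;

    -- data validation/conversion for changed columns can run in the same transaction:
    -- UPDATE dbo.Customers SET LastLoginDate = CreatedDate WHERE LastLoginDate IS NULL;

    COMMIT TRANSACTION;
    PRINT 'Deployment committed.';
END TRY
BEGIN CATCH
    IF XACT_STATE() <> 0
        ROLLBACK TRANSACTION;
    PRINT 'Deployment failed: ' + ERROR_MESSAGE();
END CATCH;
```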

Interesting challenge! Have you tried DACPAC files? They’re part of SQL Server’s data-tier application tooling (SQL Server 2008 R2 and later, via SQL Server Data Tools) and handle schema comparisons really well. Quick question: are you dealing with stored procedures and views, or just table structures? That will definitely change your approach.

sqlcmd is the way to go for automating this. Copying MDF files is risky! I usually run queries against sys.tables/sys.columns to create the ALTER statements; it takes a bit of time to set up, but it runs smoothly after that. Also, if you have it, check out sqlpackage.exe for schema comparisons - it’s free.
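For what it’s worth, the catalog-view version of the comparison looks roughly like this; it assumes the dev copy and production sit on the same instance, the database names are placeholders, and it only reports type/length differences rather than generating the full ALTER script:

```sql
-- type_diffs.sql: columns whose data type or length differ between the dev copy
-- and production, via the sys.* catalog views.
-- Built-in types share IDs across databases; user-defined types need extra handling.
SELECT  s.name AS schema_name,
        t.name AS table_name,
        c.name AS column_name,
        TYPE_NAME(c.user_type_id)  AS dev_type,
        c.max_length               AS dev_max_length,   -- bytes, not characters
        TYPE_NAME(pc.user_type_id) AS prod_type,
        pc.max_length              AS prod_max_length
FROM MyAppDb_devcopy.sys.columns  AS c
JOIN MyAppDb_devcopy.sys.tables   AS t  ON t.object_id  = c.object_id
JOIN MyAppDb_devcopy.sys.schemas  AS s  ON s.schema_id  = t.schema_id
JOIN MyAppDb.sys.schemas          AS ps ON ps.name      = s.name
JOIN MyAppDb.sys.tables           AS pt ON pt.schema_id = ps.schema_id AND pt.name = t.name
JOIN MyAppDb.sys.columns          AS pc ON pc.object_id = pt.object_id AND pc.name = c.name
WHERE c.user_type_id <> pc.user_type_id
   OR c.max_length   <> pc.max_length;
```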