So, I recently ran into an interesting problem. I needed to export data from SQL Server, but I needed it in a form that I could use in a script - in other words, I wanted to export to a series of insert statements. Unfortunately, SQL Server (specifically Enterprise Manager - I use both terms off and on to refer to the same thing) only lets you export to a delimited file - not a file you can run as is. (Unlike mySQL-Front, a darn nice client for mySQL - it even lets you export to an HTML or XML file.)
So, I asked around for help and found a few solutions. The first was a script that would generate insert statements. This worked fine - except for the fact that Query Analyzer restricts the amount of data it can output. I can understand the reasoning behind this, but I think it is pretty darn stupid that it won't let me override this setting. (This isn't the only beef I have with Enterprise Manager, though.) Since my data was pretty fat, this wasn't a solution.
The next thing I tried was a suggestion by Stephen Milligan. He said I could use the CSV file that SQL Server exports and then use the Bulk Copy Utility (BCP) to import the data. This kinda worked - however, I then had issues with the CSV data itself. First, all my strings were wrapped with quotes - not a big deal, I thought - surely BCP has a way to remove them. Unfortunately, I couldn't find one. Then, all my bit fields were changed to True/False in the CSV file, so of course reimporting would throw an error. I'm sure this method would have worked for me if I had just tried a bit harder, but I simply didn't have the time.
So - I ended up using CFML. My code simply selects all the rows and then generates a file with insert statements. This file can then be run in Query Analyzer or via the command line.
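For the curious, here's a minimal sketch of the approach (the table, columns, and datasource names are made up, and I'm only handling strings and numbers here - bit, date, and null columns would need a little more care):

<cfquery name="data" datasource="myDSN">
	SELECT id, name, age FROM users
</cfquery>
<cfset output = "">
<cfloop query="data">
	<!--- Double up single quotes so string values don't break the generated SQL --->
	<cfset safeName = Replace(data.name, "'", "''", "ALL")>
	<cfset output = output & "INSERT INTO users (id, name, age) VALUES (#data.id#, '#safeName#', #data.age#);" & Chr(13) & Chr(10)>
</cfloop>
<!--- Write the statements to a file you can run in Query Analyzer or from the command line --->
<cffile action="write" file="c:\export\users.sql" output="#output#">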
Does anyone else have suggestions? I'm really surprised Enterprise Manager doesn't have a simple "Export the whole dang thing to a script" function. Sure, you can backup/restore, but a script is a lot simpler - at least it was for me when I was using mySQL (which I'm gaining a lot of respect for).
Archived Comments
Insofar as I know, SQL Server does not have a utility akin to mysqldump. However, you can create a stored procedure to accomplish the same thing.
For examples, see:
Information about SP: http://vyaskn.tripod.com/co...
SP Code: http://vyaskn.tripod.com/co...
Have a look at RedGate's SQL Data Compare - http://www.red-gate.com/SQL.... If you "compare" to an identical, yet empty table, all the insert statements will be there. It is costly, but if you use SQL Server a lot, it is really useful, especially SQL Compare (used for synchronising data schemas).
BCP (a command-line util) is the only export/import utility that SQL Server has.
Other than that, there is DTS (Data Transformation Services), which can transfer the data to another data source.
I ended up making my own script to generate insert statements. It uses ADO to find the type of data being passed back and whether or not the column is a primary key. I ended up doing this due to transferring data between Oracle and SQL Server.
The 2eNetWorX TableEditoR ASP db admin program has a feature for exporting in SQL insert format. We're working on the next version with improvements, let me know if you'd like to see or test what we have so far.
Whoops, forgot to post the URL: http://www.2enetworx.com/de...
I use the Red Gate software to make sure my code tables are up to date between my dev, QA, and live environments. SQL Data Compare is just one of three tools they ship for SQL Server. We use it for structural comparisons as well.
I used the script at "http://vyaskn.tripod.com/co..." as Andrew Scott said.
In that script you can replace the line
EXEC (@Actual_Values)
with
PRINT (@Actual_Values)
then grab the printed SELECT string and create a view from it.
You can then open the view in Enterprise Manager, and you'll have no restrictions.
You should try this nice little C# app:
http://www.eggheadcafe.com/...
It lacks some advanced features, but it solves the problem.
I know this post is super old, but I thought I'd post my workaround for this problem. What I do is set up an ODBC data source on my computer that points to the SQL Server database I want to back up data for. Then I create an empty MySQL database. Then I use SQLYog's ODBC import functionality to import the tables I want from SQL Server into my empty MySQL database. From there, you just do a good old MySQL dump to get the data as insert statements.
You can get SQLYog at: http://www.webyog.com
You may be able to accomplish this with the MySQL tools that are available now. I'm not sure. You can download the MySQL tools here:
http://www.mysql.com/produc...
Post your results to this blog entry if it's possible.
Try http://www.sqlscripter.com to export data to T-SQL (generate Insert, Update, Delete commands) or to CSV/Text.
It's free.
I use Embarcadero DBArtisan's Schema Extraction to script all the desired objects and then "Generate Insert Statements" for the data. Nice and easy. I miss my mysqldump anyway...
Best Regards
MSSQL 2008 has this feature
http://www.kodyaz.com/artic...
The "Publish to provider" option in the Visual Studio server explorer can generate a SQL script with table definitions and data (insert statements).
As mentioned before, MSSQL 2008 already has this capability (a so-called "Database Publishing Wizard" utility) built in.
For MSSQL 2005 it's available as a separate download:
http://www.microsoft.com/do...
It has only very basic functionality. Doesn't even let you filter ... you can export only a whole table with all of its records.
But it works. Most of the time. :-)
And it's damned slow. :-(
Most databases can generate DDL for any object, but not a lot of them allow generation of INSERT statements for the table data.
The workaround is to use ETL tools for transferring data across servers. However, there is still a need to generate INSERT statements from tables for porting data.
The simplest example is when a small or large amount of data needs to be taken out on removable storage media and copied to a remote location; INSERT..VALUES statements come in handy.
There are a number of scripts available to perform this data transformation task. The problem with those scripts is that all of them are database-specific and they do not work with text files.
Advanced ETL Processor can generate insert scripts from any data source, including text files.
http://www.dbsoftlab.com/ge...
Need help:
1. I need to write SQL to create an index on each of the following tables and columns:
nce_cbt_incfolder -- inc, folder
nce_cbt_incpfcomplete -- empid, inc, lastnm
nce_cbt_incpfingest -- empid, inc, lastnm
nce_cbt_incpfunknown -- empid, inc
The index name should be tablename_index.
This is what I have:
CREATE TABLE nce_cbt_incfolder
(inc char(50), folder char(50));
CREATE TABLE nce_cbt_incpfcomplete
(empid char(50), inc char(50), lastnm char(50));
CREATE TABLE nce_cbt_incpfingest
(empid char(50), inc char(50), lastnm char(50));
CREATE TABLE nce_cbt_incpfunknown
(empid char(50), inc char(50));
CREATE INDEX nce_cbt_incfolder_index
on nce_cbt_incfolder (inc, folder);
CREATE INDEX nce_cbt_incpfcomplete_index
on nce_cbt_incpfcomplete (empid, inc, lastnm);
CREATE INDEX nce_cbt_incpfingest_index
on nce_cbt_incpfingest (empid, inc, lastnm);
CREATE INDEX nce_cbt_incpfunknown_index
on nce_cbt_incpfunknown (empid, inc);
Is this correct?
Next,
2. I need to write the script to optimize these 4 tables.
This is what I have so far:
OPTIMIZE TABLE nce_cbt_incfolder;
OPTIMIZE TABLE nce_cbt_incpfcomplete;
OPTIMIZE TABLE nce_cbt_incpfingest;
OPTIMIZE TABLE nce_cbt_incpfunknown;
Is this correct?
3. I need to create a ColdFusion page to execute the scripts.
How do I do that?
Thanks
1) No idea on 1. My SQL sucks.
2) Ditto.
3) You can run SQL via cfquery statements.
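For what it's worth, here's a minimal sketch (the datasource name is a placeholder; also note that most database drivers only allow one statement per cfquery tag, so each CREATE INDEX would get its own query):

<cfquery datasource="myDSN">
	CREATE INDEX nce_cbt_incfolder_index
	on nce_cbt_incfolder (inc, folder)
</cfquery>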