SQL Server - Running large script files

Sql Server, Sql Scripts

Sql Server Problem Overview


I have a database table on a development server that is now fully populated after I set it running with an import routine for a CSV file containing 1.4 million rows.

I ran the Database Publishing Wizard on the table, and now I have a 286MB SQL script on my local machine. The problem is, I can't figure out how to run it. If I load it into SQL Server Management Studio Express I get an alert window that says "The operation could not be completed".

Any ideas on how I can get this SQL script to run?

Sql Server Solutions


Solution 1 - Sql Server

Use the sqlcmd utility to execute the file:

sqlcmd -S myServer\instanceName -i C:\myScript.sql

If you hit an unexplained "script error" on a large SQL file (> 100 MB) that contains many INSERT statements, insert a GO batch separator before each INSERT (i.e., replace "INSERT INTO" with "GO" on its own line followed by "INSERT INTO"). This splits the script into many small batches instead of one huge one.
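For illustration (dbo.Customers is a hypothetical table, not from the original question), the rewrite turns a script like this:

INSERT INTO dbo.Customers (Id, Name) VALUES (1, 'Alice');
INSERT INTO dbo.Customers (Id, Name) VALUES (2, 'Bob');

into this, where each GO ends the current batch before the next INSERT begins:

GO
INSERT INTO dbo.Customers (Id, Name) VALUES (1, 'Alice');
GO
INSERT INTO dbo.Customers (Id, Name) VALUES (2, 'Bob');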

Solution 2 - Sql Server

This tool on CodePlex, Big SQL Script File Runner, will run a script file of any size, with logging and a GUI.

Solution 3 - Sql Server

Adding to Gulzar Nazim's answer: if it still fails, try specifying the code page of your SQL file with the -f option:

sqlcmd -S myServer\instanceName -d databaseName -i C:\myScript.sql -f 65001

I was trying to import a .dump file from SQLite (UTF-8 by default), and sqlcmd kept throwing an error after encountering the first special character. -f 65001 (the UTF-8 code page) fixed it for me.

Solution 4 - Sql Server

Why not just use DTS to import the CSV file directly?

Solution 5 - Sql Server

Yes, you can do that. I used the BCP (Bulk Copy Program) approach to avoid out-of-memory issues.

Note: tested on SQL Server 2014.

With BCP, you first export the data from the source database to a .bcp file (in a local folder), and then import that .bcp file into the destination database.


The steps are straightforward:

Notes:

a) Make sure an empty table with the same schema exists in the destination database (see the sketch after these steps)

b) Make sure a Temp folder exists on the C: drive

1) Create a batch file named Export_Data.bat with the following command:

bcp.exe [Source_DataBase_Name].[dbo].[TableName] OUT "C:\Temp\TableName.bcp" -S "Computer Name" -U "SQL Server UserName" -P "SQL Server Password" -n -q 

pause

2) Run that batch file; a .bcp file will be generated in the Temp folder

3) Then create another batch file named Import_Data.bat with the following command:

bcp.exe [Destination_DataBase_Name].[dbo].[TableName] IN "C:\Temp\TableName.bcp" -S "Computer Name" -U "SQL Server UserName" -P "SQL Server Password" -n -q 

pause

And here we go!!
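
Regarding note (a): a minimal sketch for creating the empty destination table, assuming both databases live on the same server (otherwise, script the table out with SSMS instead). [Destination_DataBase_Name], [Source_DataBase_Name], and [TableName] are the same placeholders used in the steps above:

-- Create an empty copy of the table in the destination database.
-- Only columns and types are copied; keys, indexes, and constraints are not.
USE [Destination_DataBase_Name];
SELECT TOP (0) s.*
INTO dbo.[TableName]
FROM [Source_DataBase_Name].dbo.[TableName] AS s;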

Solution 6 - Sql Server

Running something that large inside a single transaction is not a good idea. Therefore, I'd recommend breaking up the file into smaller, more manageable chunks.

Another option is to look at some of the other ways to import the CSV data directly; a sketch follows.
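
For example, a minimal BULK INSERT sketch. Assumptions (not from the original question): the CSV sits at C:\Temp\data.csv on the SQL Server machine, has a header row, and its column order matches the target table:

-- Load the CSV straight into the table in one server-side bulk operation.
-- The file path is resolved on the SQL Server machine, not the client.
BULK INSERT dbo.[TableName]
FROM 'C:\Temp\data.csv'
WITH (
    FIELDTERMINATOR = ',',    -- column delimiter
    ROWTERMINATOR   = '\n',   -- row delimiter
    FIRSTROW        = 2,      -- skip the header row
    BATCHSIZE       = 10000   -- commit in chunks rather than one huge transaction
);

The BATCHSIZE option also addresses the single-transaction concern above.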

Attributions

All content for this solution is sourced from the original question on Stackoverflow.

The content on this page is licensed under the Attribution-ShareAlike 4.0 International (CC BY-SA 4.0) license.

Content Type | Original Author | Original Content on Stackoverflow
Question | tags2k | View Question on Stackoverflow
Solution 1 - Sql Server | Gulzar Nazim | View Answer on Stackoverflow
Solution 2 - Sql Server | Alobidat | View Answer on Stackoverflow
Solution 3 - Sql Server | PaloDravecky | View Answer on Stackoverflow
Solution 4 - Sql Server | Aheho | View Answer on Stackoverflow
Solution 5 - Sql Server | Kms | View Answer on Stackoverflow
Solution 6 - Sql Server | Michael Haren | View Answer on Stackoverflow