SQL Server Bulk Insert

Hello Folks,

Do you know how to import a CSV file into SQL Server, or how to load a comma-delimited file into SQL Server? You can do it with the help of SQL Server BULK INSERT.

Well, I have noted down some key points about it:

  • SQL Server BULK INSERT can be used within any T-SQL script or stored procedure to import data into SQL Server.
  • It imports data into a database table or view in a user-specified format.
  • The parameters of the command specify the table receiving the data, the location of the source file, and the options.
  • BULK INSERT is an efficient way to transfer data into SQL Server from external data files.
  • CSV stands for Comma Separated Values, sometimes also called Comma Delimited Values.
  • A plain bulk insert maps every column of the source file directly to the destination table using one-to-one, positional mapping.
  • It goes like this: the first field of the source file is loaded into the first column of the destination table, the second field into the second column, and so on. If the column counts do not line up, BULK INSERT may still run, with the extra data going to the bit bucket, i.e. simply discarded.
  • The syntax for BULK INSERT is as follows:
BULK INSERT
   [ database_name . [ schema_name ] . | schema_name . ] [ table_name | view_name ] 
      FROM 'data_file'
     [ WITH
    ( 
   [ [ , ] BATCHSIZE = batch_size ] 
   [ [ , ] CHECK_CONSTRAINTS ] 
   [ [ , ] CODEPAGE = { 'ACP' | 'OEM' | 'RAW' | 'code_page' } ] 
   [ [ , ] DATAFILETYPE = 
      { 'char' | 'native'| 'widechar' | 'widenative' } ] 
   [ [ , ] FIELDTERMINATOR = 'field_terminator' ] 
   [ [ , ] FIRSTROW = first_row ] 
   [ [ , ] FIRE_TRIGGERS ] 
   [ [ , ] FORMATFILE = 'format_file_path' ] 
   [ [ , ] KEEPIDENTITY ] 
   [ [ , ] KEEPNULLS ] 
   [ [ , ] KILOBYTES_PER_BATCH = kilobytes_per_batch ] 
   [ [ , ] LASTROW = last_row ] 
   [ [ , ] MAXERRORS = max_errors ] 
   [ [ , ] ORDER ( { column [ ASC | DESC ] } [ ,...n ] ) ] 
   [ [ , ] ROWS_PER_BATCH = rows_per_batch ] 
   [ [ , ] ROWTERMINATOR = 'row_terminator' ] 
   [ [ , ] TABLOCK ] 
   [ [ , ] ERRORFILE = 'file_name' ] 
    )]

Well, there are a lot of parameters here; we will discuss the significant ones later on.

  • The BULK INSERT command won't accept a string concatenation or a variable in the FROM clause. If you need that, it is possible only with the help of dynamic SQL.
  • Since BULK INSERT depends on the column positions of both the source file and the destination table, it is best practice to use a view as an abstraction layer between the BULK INSERT command and the table.
  • If the structure of the source file or the destination table is altered, modifying the view can keep the BULK INSERT running without having to change the other object's structure.
  • You can also BULK INSERT the data into a staging table, check the data, and then perform the rest of the transformations as you merge the data into the permanent tables.
  • It will become easier if we see an example:

Here we have created a table 'sample_addtype', and we are using BULK INSERT to populate it.

   
USE TEST;
GO

CREATE TABLE sample_addtype(
    ID INT,
    [Address] VARCHAR(MAX),
    City VARCHAR(MAX),
    Region VARCHAR(MAX),
    PostalCode VARCHAR(MAX),
    [GUID] VARCHAR(MAX),
    UPDATED DATETIME
);

-- FIRSTROW = 1 starts loading at the first line; use FIRSTROW = 2 if the file has a header row.
-- Note: the default field terminator is tab (\t); for a true comma-separated file,
-- add FIELDTERMINATOR = ',' as well.
BULK INSERT sample_addtype
FROM 'C:\Program Files\Microsoft SQL Server\100\Tools\Samples\AdventureWorks OLTP\Address.csv'
WITH (FIRSTROW = 1, ROWTERMINATOR = '\n');

The result for this can be seen with:

SELECT *
FROM sample_addtype;
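As noted above, BULK INSERT will not accept a variable in the FROM clause, so a parameterized path has to go through dynamic SQL. Here is a minimal sketch of that workaround; the @path variable and the file name in it are hypothetical examples, not files from this article:

```sql
-- Sketch only: BULK INSERT cannot take a variable in FROM,
-- so the statement is built as a string and run with sp_executesql.
DECLARE @path NVARCHAR(260) = N'C:\imports\Address_20240101.csv';  -- hypothetical path
DECLARE @sql  NVARCHAR(MAX);

SET @sql = N'BULK INSERT sample_addtype
             FROM ''' + @path + N'''
             WITH (FIRSTROW = 1, ROWTERMINATOR = ''\n'');';

EXEC sp_executesql @sql;
```

Because the path is concatenated into the statement, make sure it comes from a trusted source (or validate it) before executing.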


Now, it’s time to discuss some significant parameters in the BULK INSERT:

  • FIELDTERMINATOR: specifies the character used to delimit, or separate, columns in the source file, such as "," (comma) or "|" (pipe). The default is tab.
  • ROWTERMINATOR: specifies the character that ends a row in the source file. '\n' (newline) is the typical setting. Files coming from mainframes often don't use a clean end of line, so you have to view the actual end-of-line characters in a hex editor and specify the row terminator in hex; a hex value of 0A is written as '0x0A'.
  • FIRSTROW: specifies the first row to load; it is useful when the incoming file has a column-header row, in which case FIRSTROW = 2 skips the header.
  • TABLOCK: places an exclusive lock on the entire table, saving SQL Server the trouble of locking the individual data pages as they are created.
  • BATCHSIZE / ROWS_PER_BATCH: BATCHSIZE tells SQL Server to commit n rows per batch rather than loading the entire file in one transaction; ROWS_PER_BATCH is a hint about the approximate number of rows in the file that helps the optimizer.
  • MAXERRORS: specifies how many rows can fail before the bulk insert itself fails.
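Putting several of these options together, here is a sketch of a more defensive load; the pipe-delimited file, its path, and the error-file path are assumptions for illustration:

```sql
-- Assumes a pipe-delimited file with a header row; both paths are hypothetical.
BULK INSERT sample_addtype
FROM 'C:\imports\addresses.txt'
WITH (
    FIELDTERMINATOR = '|',     -- columns separated by pipes
    ROWTERMINATOR   = '\n',    -- one row per line
    FIRSTROW        = 2,       -- skip the header row
    BATCHSIZE       = 10000,   -- commit every 10,000 rows
    MAXERRORS       = 10,      -- abort the load after 10 bad rows
    ERRORFILE       = 'C:\imports\addresses_errors.txt',  -- rejected rows land here
    TABLOCK                    -- exclusive table lock for a faster load
);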

Well, this was all about BULK INSERT.

Please feel free to comment on this!

 

Regards

Piyush Bajaj


About Piyush Bajaj

I am very passionate about SQL Server. I also earned certifications on MCSA – SQL Server 2012, Querying and Administering; MCTS – SQL Server 2008, Database Development; and MCTS – SQL Server 2005, Implementation & Maintenance, which helped me gain more knowledge of and interest in this field. Please feel free to drop me any question online or offline; I will try to give you the best possible answer from my side. Right now I am working as a SQL Server developer in TCS. I have an experience of just 2.6 years. Well, I can only say that "If you have an interest and passion, experience might become a very small thing".

View all posts by Piyush Bajaj →

3 Comments on “SQL Server Bulk Insert”

  1. Hi,

    I am searching for how to find out whether an extra column has been added to the .csv file, when the number of columns is fixed.

    Example:

    I have a file with 15 columns. When the .csv file has 15 columns, it runs perfectly. But when the .csv file has 16 columns, how can I identify the 16th column?

    Thanks in advance

    Sachin Jain

  2. I need to import a '.csv' file, which is in Excel format, into SQL Server 2008. I have strings such as "hour, minute, second, millisecond" in a single field. What field terminator can I use other than ','? Because I get only the first string in that column, and the other part of the string goes to the next column.

  3. Can we do a bulk insert from one table to another? I have a table wherein a column holds comma-separated values like "1,ABC,2013,XYZ". Can we do a bulk insert of this value into another table having four columns?
