
SQL Server Bulk Insert Error 7301 “IID_IColumnsInfo”

I’m trying to bulk insert data from a CSV file (the load will eventually run every day from a stored procedure), but I always get the same error.

Msg 7301, Level 16, State 2, Line 16
Cannot obtain the required interface (“IID_IColumnsInfo”) from OLE DB provider “BULK” for linked server “(null)”.

In the table I’m importing into, I made every column nvarchar with a length of at least 500 characters, because I thought the column sizes might be the problem.

I am exporting the CSV file from PowerShell as follows:

Export-Csv -Path $DirPath -Delimiter ';' -NoTypeInformation -Encoding UTF8

The file has 40 columns and 685 rows. I have already tried saving the CSV with both ‘,’ and ‘;’ as the delimiter, but both give the same error.

I tried the BULK INSERT in several ways, as below, but without success.

BULK INSERT DEV_DS_SANTOGRAU..GB_TBIMP_FOTOS_CSV
FROM 'C:\Users\userbi\Desktop\Projetos-Santo-Grau\Projeto1-RelatoriodeEstoque\TBIMP_FOTOS_CSV.csv'
WITH (FORMAT = 'CSV', 
      --MAXERRORS = 0,
      --CODEPAGE = '65001',
      CODEPAGE = 'ACP',
      --FIELDQUOTE = '"', 
      FIELDTERMINATOR = '";"', 
      --ROWTERMINATOR = '"\n"',
      ROWTERMINATOR = '\r\n',
      --ROWTERMINATOR = '0x0a',
      FIRSTROW = 2,
      ERRORFILE = 'C:\Users\userbi\Desktop\Projetos-Santo-Grau\Projeto1-RelatoriodeEstoque\TBIMP_FOTOS_CSV_ERROS.csv');

Using the code above, the error CSV and TXT files it exported contained data like this (which is not how it looks in the original file):

[screenshot: rows from the error file showing mis-split column data]

What should I do?

I’d rather not, but if it’s possible to skip the problem records so that the rest of the insert completes, that would be an acceptable fallback.

Information:

  • SQL Server 2019 (v15.0.18330.0)
  • SQL Server Management Objects (SMO) v16.100.37971.0
  • Microsoft SQL Server Management Studio v18.5


Answer

It’s usually easier to BULK INSERT data with a format file. Use the bcp.exe utility to create a format file with a command such as the following:

bcp.exe DEV_DS_SANTOGRAU..GB_TBIMP_FOTOS_CSV format nul -c -t; -f C:\Temp\TBIMP_FOTOS_CSV.fmt -S(local) -T

Where:

  • DEV_DS_SANTOGRAU..GB_TBIMP_FOTOS_CSV is the Database.Schema.Table we’re interacting with.
  • format specifies format file creation mode.
  • nul specifies the input/output data file, which in this case means “don’t write any data”.
  • -c specifies character mode, as opposed to native (binary) mode.
  • -t; specifies to use ; as the field separator character.
  • -f C:\Temp\TBIMP_FOTOS_CSV.fmt specifies the path to write the format file to, relative to your local computer.
  • -S(local) is the SQL Server to connect to, (local) in my case.
  • -T means Trusted Authentication (Windows authentication); use -U Username and -P Password instead if you have a SQL Login.

This creates a format file something like the following (yours will have more and different columns):

14.0
2
1       SQLCHAR             0       510     ";"      1     Filename                 SQL_Latin1_General_Pref_CP1_CI_AS
2       SQLCHAR             0       510     "\r\n"   2     Resolution               SQL_Latin1_General_Pref_CP1_CI_AS

Now, in SSMS, you should be able to run something like the following to import your data file (adjust file paths relative to your SQL Server as appropriate):

BULK INSERT DEV_DS_SANTOGRAU..GB_TBIMP_FOTOS_CSV
FROM 'C:\Temp\TBIMP_FOTOS_CSV.csv'
WITH (
    CODEPAGE = '65001',
    DATAFILETYPE = 'char',
    FORMAT = 'CSV', 
    FORMATFILE = 'C:\Temp\TBIMP_FOTOS_CSV.fmt'
);
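As for the fallback you mention in the question (letting the load finish even if some records are bad): the MAXERRORS option, which is already present but commented out in your script, controls how many rejected rows BULK INSERT tolerates before aborting, and the rejected rows are written to the ERRORFILE. A minimal sketch, with an illustrative threshold and paths (adjust to your environment):

```sql
-- Sketch: tolerate up to 10 bad rows instead of failing the whole load.
-- The threshold of 10 and the C:\Temp paths are illustrative only.
BULK INSERT DEV_DS_SANTOGRAU..GB_TBIMP_FOTOS_CSV
FROM 'C:\Temp\TBIMP_FOTOS_CSV.csv'
WITH (
    DATAFILETYPE = 'char',
    FORMAT = 'CSV',
    FORMATFILE = 'C:\Temp\TBIMP_FOTOS_CSV.fmt',
    MAXERRORS = 10,   -- abort only if more than 10 rows are rejected
    ERRORFILE = 'C:\Temp\TBIMP_FOTOS_CSV_ERROS.csv'
);

-- Afterwards, compare the loaded row count against what you expect
-- (the question mentions 685 rows in the file, minus the header):
SELECT COUNT(*) FROM DEV_DS_SANTOGRAU..GB_TBIMP_FOTOS_CSV;
```

Any shortfall in the count then corresponds to rows sitting in the error file for inspection.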

— edit —

On SQL Server and international character support.

SQL Server’s UTF-8 support has a somewhat checkered history: it gained only partial support in SQL Server 2016, and UTF-8 code pages are really only supported from SQL Server 2019. Importing and exporting files with international characters is still best handled using UTF-16 encoded files. Adjust the workflow as follows…
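For context, the SQL Server 2019 UTF-8 support mentioned above takes the form of _UTF8 collations, which let char/varchar columns store UTF-8 data. A minimal sketch with a hypothetical table name (the UTF-16/widechar route described below remains the more robust choice for file import/export):

```sql
-- Sketch only: SQL Server 2019+ _UTF8 collations store UTF-8 in varchar.
-- Table and column names here are hypothetical.
CREATE TABLE dbo.Utf8Demo
(
    Filename varchar(510) COLLATE Latin1_General_100_CI_AS_SC_UTF8
);
```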

In PowerShell, use the Unicode encoding instead of UTF8:

Export-Csv -Path $DirPath -Delimiter ';' -NoTypeInformation -Encoding Unicode

When generating the BCP format file, use the -w switch (for widechar) instead of -c (for char):

bcp.exe DEV_DS_SANTOGRAU..GB_TBIMP_FOTOS_CSV format nul -w -t; -f C:\Temp\TBIMP_FOTOS_CSV-widechar.fmt -S(local) -T

This causes the columns to be written out as SQLNCHAR instead of SQLCHAR, i.e. national (Unicode) character format:

14.0
2
1       SQLNCHAR            0       510     ";"      1     Filename                 SQL_Latin1_General_Pref_CP1_CI_AS
2       SQLNCHAR            0       510     "\r\n"   2     Resolution               SQL_Latin1_General_Pref_CP1_CI_AS

When using BULK INSERT, specify DATAFILETYPE = 'widechar' in place of DATAFILETYPE = 'char' and the CODEPAGE option, e.g.:

BULK INSERT GB_TBIMP_FOTOS_CSV
FROM 'C:\Temp\TBIMP_FOTOS_CSV.csv'
WITH (
    DATAFILETYPE = 'widechar',
    FORMATFILE = 'C:\Temp\TBIMP_FOTOS_CSV-widechar.fmt'
);
User contributions licensed under: CC BY-SA