
sql-server - "Column is too long" bulk insert error

Reposted — Author: 行者123 — Updated: 2023-12-01 22:47:08

I am trying to run the following command to bulk insert data from a CSV file --

BULK INSERT TestDB.dbo.patent
FROM 'C:\1patents.csv'
WITH (FIRSTROW = 1, FIELDTERMINATOR = '^', ROWTERMINATOR = '\n');

The error I am getting is this --

Msg 4866, Level 16, State 1, Line 1
The bulk load failed. The column is too long in the data file for row 1, column 6.
Verify that the field terminator and row terminator are specified correctly.
Msg 7399, Level 16, State 1, Line 1
The OLE DB provider "BULK" for linked server "(null)" reported an error. The provider did not give any information about the error.
Msg 7330, Level 16, State 2, Line 1
Cannot fetch a row from OLE DB provider "BULK" for linked server "(null)".

Here is the data in the first row --

00000001^^18360713^295^4^0

The last field in the table (corresponding to column 6 of the data above, the value 0) is of type "int".

What am I doing wrong here? Why am I getting the error above?
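A quick way to diagnose this kind of mismatch is to look at the raw bytes of the file and see how the first row actually ends: if the file uses Windows-style `\r\n` line endings while the command declares `ROWTERMINATOR = '\n'`, the stray `\r` gets glued onto the last column and can trigger the "column is too long" error. A minimal sketch (the helper name and the sample terminator are illustrative, not from the question):

```python
def detect_row_terminator(data: bytes) -> str:
    """Return the terminator of the first line: '\\r\\n', '\\n', or '' if none."""
    idx = data.find(b"\n")
    if idx == -1:
        return ""          # no line feed found at all
    if idx > 0 and data[idx - 1:idx] == b"\r":
        return "\\r\\n"    # Windows-style: carriage return + line feed
    return "\\n"           # Unix-style: bare line feed (0x0a)

# The sample row is the first line quoted in the question, with a
# hypothetical Windows-style terminator appended for illustration.
sample = b"00000001^^18360713^295^4^0\r\n"
print(detect_row_terminator(sample))  # prints \r\n
```

In practice you would read the first few hundred bytes of `C:\1patents.csv` with `open(path, "rb")` and pass them to the helper.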

Best answer

My extract came from Oracle/Unix. I replaced \r\n with ROWTERMINATOR = '0x0a', and it worked for me.
Thanks a lot!
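Applied to the original statement, the fix from the answer would look like this ('0x0a' is the hexadecimal code for the line-feed character; a sketch, not tested against the asker's file):

```sql
BULK INSERT TestDB.dbo.patent
FROM 'C:\1patents.csv'
WITH (FIRSTROW = 1, FIELDTERMINATOR = '^', ROWTERMINATOR = '0x0a');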

Regarding sql-server - "Column is too long" bulk insert error, we found a similar question on Stack Overflow: https://stackoverflow.com/questions/14365381/
