
python - scrapy pipeline error when using mysql.connector


I'm completely lost. This is my pipeline. When I run it, I get this error message:

      File "c:\python27\lib\site-packages\twisted\internet\defer.py", line 588, in _runCallbacks
current.result = callback(current.result, *args, **kw)
File "C:\Python27\bff\bff\pipelines.py", line 42, in process_item
cursor.execute(add_Product)
File "c:\python27\lib\site-packages\mysql\connector\cursor.py", line 492, in execute
stmt = operation.encode(self._connection.python_charset)
AttributeError: 'tuple' object has no attribute 'encode'

As you can see from the commented-out code, I have tried a few different approaches. At first I did it the way I saw in the examples, but when I put item['StoreName'] directly in the VALUES line instead of defining it above as Name = item['StoreName'], I got an error saying item was not defined. I am using the mysql.connector installed from the mysql.org website. Thanks in advance.

# -*- coding: utf-8 -*-
# Define your item pipelines here
# Don't forget to add your pipeline to the ITEM_PIPELINES setting
# See: http://doc.scrapy.org/en/latest/topics/item-pipeline.html
from __future__ import print_function
from datetime import date, datetime, timedelta
import mysql.connector
#from scrapy.extensions import DropItem
#from bff.items import ItemInfo

class mySQLPipeline(object):
    def process_item(self, item, spider):

        Path = item['ProdPath']
        UPC = item['ProdUPC']
        Model = item['ProdModel']
        Desc = item['ProdDesc']
        Price = item['ProdPrice']
        Stock = item['InStock']
        #Ships = item['Ships']
        Name = item['StoreName']

        cnx = mysql.connector.connect(user='*****', password='*****',
                                      host='127.0.0.1',
                                      port='****',
                                      database='scrapyinfo')
        cursor = cnx.cursor()

        #add_Product = ("INSERT INTO walmart_products (ProdName)"
        #               "VALUES (%s), (Name);")

        add_Product = ("INSERT INTO walmart_products, (ProdName, ProdPath, ProdUPC, ProdModel, ProdDesc, ProdPrice, InStock, Ships, StoreName)"
                       "VALUES (%s, %s, %s, %s, %s, %s, %s, %s, %s)", (Name, Path, UPC, Model, Desc, Price, Stock, Name))
        #item['Ships'],

        # Add new product
        cursor.execute(add_Product)

        # Make sure data is committed to the database
        cnx.commit()

        cursor.close()
        cnx.close()
        return item

Edit: here is my new code.


from __future__ import print_function
from datetime import date, datetime, timedelta
import mysql.connector
#from scrapy.extensions import DropItem
#from bff.items import ItemInfo

class mySQLPipeline(object):
    def process_item(self, item, spider):

        Product = item['ProdName']
        Path = item['ProdPath']
        UPC = item['ProdUPC']
        Model = item['ProdModel']
        Desc = item['ProdDesc']
        Price = item['ProdPrice']
        Stock = item['InStock']
        #Ships = item['Ships']
        Name = item['StoreName']

        cnx = mysql.connector.connect(user='****', password='****',
                                      host='127.0.0.1',
                                      port='****',
                                      database='****')
        cursor = cnx.cursor()
        # add_Product = ("INSERT INTO walmart_products (ProdName, StoreName) VALUES (%s, %s,)", Product, Name,)
        # add_Product = ("INSERT INTO walmart_products, (ProdName)"
        #                "VALUES (%s)", (Name))
        #                "VALUES (%(Name)s)")
        add_Product = ("INSERT INTO walmart_products "
                       "(ProdName, ProdPath, ProdUPC, ProdModel, ProdDesc, ProdPrice, InStock, StoreName) "
                       "VALUES (%s, %s, %s, %s, %s, %s, %s, %s)")
        #item['Ships'],

        data_Product = (Product, Path, UPC, Model, Desc, Price, Stock, Name)

        # Add new product
        cursor.execute(add_Product, data_Product)

        # Make sure data is committed to the database
        cnx.commit()

        cursor.close()
        cnx.close()
        return item


Best Answer

Just in case anyone else gets stuck where I was: I could not figure it out. Out of frustration I stopped using mysql.connector and switched to MySQLdb. Once I made that switch, everything worked fine.
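For anyone who wants to see the switch concretely, here is a rough sketch of what that pipeline looks like with MySQLdb; it assumes the MySQLdb (MySQL-python) package is installed, reuses the table and column names from the question, and uses placeholder connection details.

import MySQLdb


class mySQLPipeline(object):
    def process_item(self, item, spider):
        # Placeholder credentials; MySQLdb takes the database name via the 'db' keyword
        cnx = MySQLdb.connect(host='127.0.0.1', user='****', passwd='****',
                              db='scrapyinfo', charset='utf8')
        cursor = cnx.cursor()

        add_product = ("INSERT INTO walmart_products "
                       "(ProdName, ProdPath, ProdUPC, ProdModel, ProdDesc, ProdPrice, InStock, StoreName) "
                       "VALUES (%s, %s, %s, %s, %s, %s, %s, %s)")
        data_product = (item['ProdName'], item['ProdPath'], item['ProdUPC'],
                        item['ProdModel'], item['ProdDesc'], item['ProdPrice'],
                        item['InStock'], item['StoreName'])

        # As with mysql.connector, the SQL string and the value tuple are
        # passed to execute() as two separate arguments
        cursor.execute(add_product, data_product)
        cnx.commit()

        cursor.close()
        cnx.close()
        return item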

Regarding python - scrapy pipeline error when using mysql.connector, we found a similar question on Stack Overflow: https://stackoverflow.com/questions/34552021/
