
python - Scraping data across <div>s


I'm trying to pull information from repeating rows that contain many embedded <div>s. I'm trying to write a scraper to grab various elements from this page. For some reason, I can't find a way to get at the tags using the class that holds each row's information. I'm also unable to isolate the sections I need in order to extract the information. For reference, here is a sample of one row:

<div id="dTeamEventResults" class="col-md-12 team-event-results"><div>
<div class="row team-event-result team-result">
<div class="col-md-12 main-info">
<div class="row">
<div class="col-md-7 event-name">
<dl>
<dt>Team Number:</dt>
<dd><a href="/team-event-search/team?program=JFLL&amp;year=2017&amp;number=11733" class="result-name">11733</a></dd>
<dt>Team:</dt>
<dd> Aqua Duckies</dd>
<dt>Program:</dt>
<dd>FIRST LEGO League Jr.</dd>
</dl>
</div>
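
As an aside on the isolation step: once a row's HTML is actually in hand, the <dt>/<dd> pairs can be read off directly with BeautifulSoup. A minimal sketch against the static snippet above (the row_html variable is just the quoted fragment, trimmed to the <dl>):

from bs4 import BeautifulSoup

# The static example row quoted above, trimmed to the <dl> block.
row_html = """
<div class="col-md-7 event-name">
  <dl>
    <dt>Team Number:</dt>
    <dd><a href="/team-event-search/team?program=JFLL&amp;year=2017&amp;number=11733" class="result-name">11733</a></dd>
    <dt>Team:</dt>
    <dd> Aqua Duckies</dd>
    <dt>Program:</dt>
    <dd>FIRST LEGO League Jr.</dd>
  </dl>
</div>
"""

soup = BeautifulSoup(row_html, "html.parser")
# Pair each <dt> label with the <dd> value that follows it.
info = {dt.get_text(strip=True): dd.get_text(strip=True)
        for dt, dd in zip(soup.select("dt"), soup.select("dd"))}
print(info)
# {'Team Number:': '11733', 'Team:': 'Aqua Duckies', 'Program:': 'FIRST LEGO League Jr.'}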

The script I've started building looks like this:

from urllib2 import urlopen as uReq
from bs4 import BeautifulSoup as soup

my_url = 'https://www.firstinspires.org/team-event-search#type=teams&sort=name&keyword=NJ&programs=FLLJR,FLL,FTC,FRC&year=2017'

uClient = uReq(my_url)
page_html = uClient.read()
uClient.close()

page_soup = soup(page_html, "html.parser")

rows = page_soup.findAll("div", {"class":"row team-event-result team-result"})
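
(A side note for anyone running this today: urllib2 only exists on Python 2. On Python 3 the equivalent fetch, sketched below with urllib.request, still returns an empty list, for the reason explained in the answer below: the rows are rendered client-side.)

from urllib.request import urlopen
from bs4 import BeautifulSoup

my_url = 'https://www.firstinspires.org/team-event-search#type=teams&sort=name&keyword=NJ&programs=FLLJR,FLL,FTC,FRC&year=2017'

with urlopen(my_url) as uClient:
    page_html = uClient.read()

page_soup = BeautifulSoup(page_html, "html.parser")
# Still empty: the result rows are injected by JavaScript after the page loads,
# so they are not present in the raw HTML returned by urlopen.
rows = page_soup.find_all("div", {"class": "row team-event-result team-result"})
print(len(rows))  # 0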

Whenever I run len(rows), the result is always 0. I seem to have hit a wall and am stuck. Any help is appreciated!

Best Answer

The content of this page is generated dynamically, so you need a browser automation tool such as selenium. Here is a script that will fetch the content you want. Give it a try:

from bs4 import BeautifulSoup
from selenium import webdriver

driver = webdriver.Chrome()
driver.get('https://www.firstinspires.org/team-event-search#type=teams&sort=name&keyword=NJ&programs=FLLJR,FLL,FTC,FRC&year=2017')

# Parse the fully rendered page source, not the raw HTTP response.
soup = BeautifulSoup(driver.page_source, "lxml")

for items in soup.select('.main-info'):
    # Pair each <dt> label with its <dd> value and collapse the whitespace.
    docs = ' '.join([' '.join([item.text, ' '.join(val.text.split())]) for item, val in zip(items.select(".event-name dt"), items.select(".event-name dd"))])
    location = ' '.join([' '.join(item.text.split()) for item in items.select(".event-location-type address")])
    print("Event_Info: {}\nEvent_Location: {}\n".format(docs, location))

driver.quit()

The output looks something like:

Event_Info: Team Number: 11733 Team: Aqua Duckies Program: FIRST LEGO League Jr.
Event_Location: Sparta, NJ 07871 USA

Event_Info: Team Number: 4281 Team: Bulldogs Program: FIRST Robotics Competition
Event_Location: Somerset, NJ 08873 USA
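
One caveat worth adding to this approach: reading driver.page_source immediately after driver.get() can race the JavaScript that builds the results, so on a slow connection the loop may find nothing. A small variant using an explicit wait (the .main-info selector comes from the answer above; the 10-second timeout is an arbitrary choice):

from bs4 import BeautifulSoup
from selenium import webdriver
from selenium.webdriver.common.by import By
from selenium.webdriver.support.ui import WebDriverWait
from selenium.webdriver.support import expected_conditions as EC

driver = webdriver.Chrome()
driver.get('https://www.firstinspires.org/team-event-search#type=teams&sort=name&keyword=NJ&programs=FLLJR,FLL,FTC,FRC&year=2017')

# Block until at least one result block has been rendered by the page's JavaScript.
WebDriverWait(driver, 10).until(
    EC.presence_of_element_located((By.CSS_SELECTOR, ".main-info"))
)

soup = BeautifulSoup(driver.page_source, "lxml")
print(len(soup.select(".main-info")))  # number of result rows actually rendered
driver.quit()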

Regarding "python - Scraping data across <div>s", we found a similar question on Stack Overflow: https://stackoverflow.com/questions/48178289/
