
html - Beautiful Soup not finding a table


I'm parsing data from this website. I've already parsed several things from the site, but for some reason one of the tables on this particular page can't be found. Below is a simplified snippet that demonstrates the problem:

#!/usr/bin/env python3 

import bs4 as bs
import requests

def get_soup(site):
    headers = {'User-Agent': 'Mozilla/5.0'}
    r = requests.get(site, headers=headers)
    # Always want a status code of 200, which means everything downloaded
    if r.status_code != 200:
        print(r.status_code)
        print("Invalid Status Code")
        exit(1)
    return bs.BeautifulSoup(r.content, 'html.parser')

soup = get_soup("https://www.hockey-reference.com/boxscores/202008140WSH.html#all_advanced")
table = soup.find('table', {'id': "NYI_skaters"}).find('tbody').find_all('tr')
table = soup.find('table', {'id': "NYI_goalies"}).find('tbody').find_all('tr')
table = soup.find('table', {'id': "NYI_adv"}).find('tbody').find_all('tr')

The code finds the skaters and goalies tables without a problem, but it does not find the _adv table: find returns nothing, so the chained call raises a NoneType error. I am able to locate the node that _adv sits under:

table = soup.find('div', {'id': "all_advanced"})

Under that div tag (all_advanced) there is some odd-looking code, so I'm not sure whether that has something to do with it. I haven't had any problems with anything else on this particular site, have never needed to use Selenium, and other people scrape it just fine. Any help would be appreciated.
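That "odd-looking code" is the key: on hockey-reference (and other sports-reference pages) some tables are shipped inside HTML comments and only un-commented by JavaScript in the browser, so the parser sees a Comment node where the table element should be. A minimal sketch to confirm this, reusing the question's URL (the inspection loop itself is illustrative):

import bs4 as bs
import requests

url = "https://www.hockey-reference.com/boxscores/202008140WSH.html#all_advanced"
soup = bs.BeautifulSoup(
    requests.get(url, headers={'User-Agent': 'Mozilla/5.0'}).content,
    'html.parser')

div = soup.find('div', {'id': "all_advanced"})
# The advanced-stats table arrives as a bs4 Comment node, not a Tag,
# which is why soup.find('table', ...) never reaches it
for child in div.descendants:
    if isinstance(child, bs.Comment):
        print(child[:80])  # start of the commented-out <table> markup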

Edit: pandas.read_html can't find it either. I was able to work around it by replacing all of the "table =" lines above with the following:

for comment in soup.find_all(text=lambda text: isinstance(text, bs.Comment)):
    # str.find returns -1 when absent, so compare against -1
    # (a "> 0" check would skip a table at position 0)
    if comment.find("<table ") > -1:
        comment_soup = bs.BeautifulSoup(comment, 'html.parser')
        table = comment_soup.find('table', {'id': "NYI_adv"})
        if table is None:  # this comment holds a different table
            continue
        for player in table.find_all('tr', {'class': "ALLSH hidden"}):
            print(player.find('a')['href'])
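For what it's worth, pandas.read_html fails on the raw page for the same reason, but it parses the table fine once it is handed the comment's HTML directly. A sketch of that route, assuming pandas is installed (the attrs filter just singles out NYI_adv):

from io import StringIO

import bs4 as bs
import pandas as pd
import requests

url = "https://www.hockey-reference.com/boxscores/202008140WSH.html#all_advanced"
soup = bs.BeautifulSoup(
    requests.get(url, headers={'User-Agent': 'Mozilla/5.0'}).content,
    'html.parser')

# Scan the comments for the one carrying the NYI_adv table and let
# pandas parse the embedded HTML
for comment in soup.find_all(text=lambda t: isinstance(t, bs.Comment)):
    if "NYI_adv" in comment:
        df = pd.read_html(StringIO(str(comment)), attrs={'id': 'NYI_adv'})[0]
        print(df.head())
        break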

Thanks.

Best Answer

To load the table data from the comment section, use this script:

import requests
from bs4 import BeautifulSoup, Comment

url = 'https://www.hockey-reference.com/boxscores/202008140WSH.html#all_advanced'
soup = BeautifulSoup(requests.get(url).content, 'html.parser')

# normal tables:
# table_skaters = soup.select_one('table#NYI_skaters')
# table_goalies = soup.select_one('table#NYI_goalies')

# table loaded from Comment:
table_advanced = soup.select_one('#all_advanced').find_next(text=lambda t: isinstance(t, Comment))
table_advanced = BeautifulSoup(table_advanced, 'html.parser')

# print(table_advanced)

for row in table_advanced.select('tr.ALL5v5'):
    tds = [td.get_text(strip=True) for td in row.select('td, th')]
    print(*tds, sep='\t')

Prints:

Josh Bailey 1 9 21 30.0 -32.1 3 1 75.0 0 1
Mathew Barzal 3 12 9 57.1 7.8 3 6 33.3 0 0
Anthony Beauvillier 4 10 21 32.3 -29.1 4 1 80.0 1 1
Derick Brassard 1 8 6 57.1 7.1 6 8 42.9 3 0
Casey Cizikas 2 14 7 66.7 20.4 4 5 44.4 1 0
Cal Clutterbuck 5 14 8 63.6 16.6 4 5 44.4 3 0
Jordan Eberle 3 14 9 60.9 13.2 3 6 33.3 1 0
Andy Greene 0 9 13 40.9 -13.6 6 8 42.9 2 1
Leo Komarov 2 10 7 58.8 9.5 7 8 46.7 5 1
Nick Leddy 2 13 20 39.4 -18.8 4 8 33.3 0 1
Anders Lee 2 13 6 68.4 22.0 2 7 22.2 0 0
Matt Martin 2 9 7 56.2 6.2 0 2 0.0 4 0
Scott Mayfield 1 16 11 59.3 11.8 4 4 50.0 3 1
Brock Nelson 2 9 19 32.1 -27.9 3 1 75.0 2 0
Jean-Gabriel Pageau 3 13 9 59.1 10.6 9 10 47.4 4 0
Adam Pelech 3 15 13 53.6 3.6 8 4 66.7 2 1
Ryan Pulock 5 18 14 56.2 8.0 6 9 40.0 0 1
Devon Toews 4 19 15 55.9 7.8 4 7 36.4 1 4
TOTAL 45 43 51.1 44.4 32 12
Travis Boyd 3 6 7 46.2 -3.1 3 5 37.5 0 1
John Carlson 5 19 14 57.6 14.0 5 12 29.4 2 1
Brenden Dillon 2 18 19 48.6 -0.4 7 2 77.8 0 3
Nic Dowd 1 3 6 33.3 -17.3 3 0 100.0 0 1
Lars Eller 1 17 21 44.7 -7.3 6 2 75.0 1 1
Carl Hagelin 1 7 8 46.7 -2.6 5 4 55.6 1 0
Garnet Hathaway 1 4 6 40.0 -10.0 3 0 100.0 2 0
Nick Jensen 0 3 12 20.0 -34.8 8 2 80.0 0 0
Michal Kempny 3 18 12 60.0 16.9 4 10 28.6 1 0
Ilya Kovalchuk 0 5 9 35.7 -15.7 3 7 30.0 3 0
Evgeny Kuznetsov 5 16 10 61.5 18.0 8 9 47.1 1 0
Dmitry Orlov 1 23 22 51.1 4.6 8 5 61.5 1 0
T.J. Oshie 4 21 19 52.5 6.7 5 2 71.4 2 1
Alex Ovechkin 8 16 13 55.2 9.4 8 9 47.1 4 0
Richard Panik 2 7 10 41.2 -9.5 5 1 83.3 0 0
Jonas Siegenthaler 1 5 11 31.2 -21.6 8 1 88.9 2 1
Jakub Vrana 3 13 16 44.8 -6.0 3 3 50.0 0 0
Tom Wilson 2 14 10 58.3 13.0 8 6 57.1 8 0
TOTAL 43 45 48.9 55.6 28 9
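One small note on the script above: newer BeautifulSoup releases deprecate the text= keyword on the find methods in favor of string=, so on current bs4 the comment lookup is better spelled as:

table_advanced = soup.select_one('#all_advanced').find_next(
    string=lambda t: isinstance(t, Comment))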

Regarding "html - Beautiful Soup not finding a table", we found a similar question on Stack Overflow: https://stackoverflow.com/questions/63433853/
