
python - BeautifulSoup - table scraping

Reposted. Author: 太空宇宙. Updated: 2023-11-03 13:12:26

I'm trying to use Beautiful Soup to scrape a table from a website and parse the data. How would I parse it by its header? So far I can't even print the whole table. Thanks in advance.

The code so far:

import urllib.request
from bs4 import BeautifulSoup

optionstable = "http://www.barchart.com/options/optdailyvol?type=stocks"
page = urllib.request.urlopen(optionstable)
soup = BeautifulSoup(page, 'lxml')

# the table sits inside the DataTables wrapper div
table = soup.find("div", {"class": "dataTables_wrapper", "id": "options_wrapper"})

table1 = table.find_all('table')

print(table1)

Best answer

The table is populated dynamically, so you need to mimic the ajax request that fetches the table data:

import requests
from time import time

optionstable = "http://www.barchart.com/options/optdailyvol?type=stocks"

params = {"type": "stocks",
          "dir": "desc",
          "_": str(time()),
          "f": "base_symbol,type,strike,expiration_date,bid,ask,last,volume,open_interest,volatility,timestamp",
          "sEcho": "1",
          "iDisplayStart": "0",
          "iDisplayLength": "100",
          "iSortCol_0": "7",
          "sSortDir_0": "desc",
          "iSortingCols": "1",
          "bSortable_0": "true",
          "bSortable_1": "true",
          "bSortable_2": "true",
          "bSortable_3": "true",
          "bSortable_4": "true",
          "bSortable_5": "true",
          "bSortable_6": "true",
          "bSortable_7": "true",
          "bSortable_8": "true",
          "bSortable_9": "true",
          "bSortable_10": "true",
          "sortby": "Volume"}

Then make the get request passing the params:

js = requests.get("http://www.barchart.com/option-center/getData.php", params=params).json()

Which gives you:

{u'aaData': [[u'<a href="/quotes/BAC">BAC</a>', u'Call', u'16.00', u'12/16/16', u'0.89', u'0.90', u'0.91', u'52,482', u'146,378', u'0.26', u'01:43'], [u'<a href="/quotes/ETE">ETE</a>', u'Call', u'20.00', u'01/20/17', u'0.38', u'0.41', u'0.40', u'40,785', u'72,011', u'0.42', u'01:27'], [u'<a href="/quotes/BAC">BAC</a>', u'Call', u'15.00', u'10/21/16', u'1.34', u'1.36', u'1.33', u'35,663', u'90,342', u'0.35', u'01:44'], [u'<a href="/quotes/COTY">COTY</a>', u'Put', u'38.00', u'10/21/16', u'15.00', u'15.30', u'15.10', u'32,321', u'242,382', u'1.24', u'01:44'], [u'<a href="/quotes/COTY">COTY</a>', u'Call', u'38.00', u'10/21/16', u'0.00', u'0.05', u'0.01', u'32,320', u'256,589', u'1.34', u'01:44'], [u'<a href="/quotes/WFC">WFC</a>', u'Put', u'40.00', u'10/21/16', u'0.01', u'0.03', u'0.02', u'32,121', u'37,758', u'0.39', u'01:43'], [u'<a href="/quotes/WFC">WFC</a>', u'Put', u'40.00', u'11/18/16', u'0.16', u'0.17', u'0.16', u'32,023', u'8,789', u'0.30', u'01:44']..................

You can pass more parameters; if you look at the request in Chrome's dev tools under the XHR tab you can see them all. The ones above are the minimum needed to get a result. There are a lot of them, so I won't post them all here; I'll let you figure out for yourself how they influence the results.
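To see exactly what ends up on the wire, note that requests encodes the params dict into the query string the same way urllib's urlencode does. A minimal sketch, using only a few illustrative keys from the dict above:

```python
from urllib.parse import urlencode

# A few of the keys from the full params dict, just for illustration.
params = {"type": "stocks", "iDisplayStart": "0", "iDisplayLength": "100"}

query = urlencode(params)
url = "http://www.barchart.com/option-center/getData.php?" + query
print(url)
```

Bumping iDisplayLength (or paging with iDisplayStart) is then just a matter of changing the dict before the request.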

If you iterate over js[u'aaData'], you can see each sublist, where each entry corresponds to a column as follows:

#base_symbol,type,strike,expiration_date,bid,ask,last,volume,open_interest,volatility,timestamp

[u'<a href="/quotes/AAPL">AAPL</a>', u'Call', u'116.00', u'10/14/16', u'1.36', u'1.38', u'1.37', u'21,812', u'7,258', u'0.23', u'10/10/16']
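For readability, each such row can be zipped with the column names from the f parameter into a dict. A small sketch using the sample row above:

```python
# Column names taken from the "f" request parameter.
cols = ("base_symbol,type,strike,expiration_date,bid,ask,last,"
        "volume,open_interest,volatility,timestamp").split(",")

# Sample row copied from the output above.
row = ['<a href="/quotes/AAPL">AAPL</a>', 'Call', '116.00', '10/14/16',
       '1.36', '1.38', '1.37', '21,812', '7,258', '0.23', '10/10/16']

record = dict(zip(cols, row))
print(record["strike"], record["volume"])
```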

So if you wanted to filter the rows based on some criteria, for example strike > 15:

for d in filter(lambda row: float(row[2]) > 15, js[u'aaData']):
    print(d)
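Note that volume and open_interest come back as comma-separated strings, so strip the commas before comparing those columns numerically. A sketch, with two sample rows standing in for js[u'aaData']:

```python
# Two rows copied from the sample output above.
rows = [
    ['<a href="/quotes/BAC">BAC</a>', 'Call', '16.00', '12/16/16',
     '0.89', '0.90', '0.91', '52,482', '146,378', '0.26', '01:43'],
    ['<a href="/quotes/WFC">WFC</a>', 'Put', '40.00', '11/18/16',
     '0.16', '0.17', '0.16', '32,023', '8,789', '0.30', '01:44'],
]

# Index 7 is volume; remove the thousands separator before converting.
high_volume = [r for r in rows if int(r[7].replace(",", "")) > 40000]
print(len(high_volume))
```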

You may also find pandas useful; with a little tidying we can create a nice df:

# extract base_symbol text from the anchor tag
for v in js[u'aaData']:
    v[0] = BeautifulSoup(v[0], "lxml").a.text


import pandas as pd

cols = "base_symbol,type,strike,expiration_date,bid,ask,last,volume,open_interest,volatility,timestamp"

df = pd.DataFrame(js[u'aaData'], columns=cols.split(","))

print(df.head(5))

Which gives you a nice df to work with:

  base_symbol  type strike expiration_date    bid    ask   last  volume  \
0         BAC  Call  16.00        12/16/16   0.89   0.90   0.91  52,482
1         ETE  Call  20.00        01/20/17   0.38   0.41   0.40  40,785
2         BAC  Call  15.00        10/21/16   1.34   1.36   1.33  35,663
3        COTY   Put  38.00        10/21/16  15.00  15.30  15.10  32,321
4        COTY  Call  38.00        10/21/16   0.00   0.05   0.01  32,320

  open_interest volatility timestamp
0       146,378       0.26  10/10/16
1        72,011       0.42  10/10/16
2        90,342       0.35  10/10/16
3       242,382       1.24  10/10/16
4       256,589       1.34  10/10/16

You will probably just want to change the dtypes, e.g. df["strike"] = df["strike"].astype(float) and so on.
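A sketch of that dtype cleanup on a toy frame: strike is a plain float string, while volume carries a thousands separator that must be stripped before converting.

```python
import pandas as pd

# Toy data mirroring the string columns of the df above.
df = pd.DataFrame({"strike": ["16.00", "20.00"],
                   "volume": ["52,482", "40,785"]})

df["strike"] = df["strike"].astype(float)
# Strip thousands separators before the integer conversion.
df["volume"] = df["volume"].str.replace(",", "").astype(int)

print(df.dtypes)
```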

Regarding python - BeautifulSoup - table scraping, we found a similar question on Stack Overflow: https://stackoverflow.com/questions/39965416/
