
python - How to get the first and last occurrence of an item in pandas

Reposted. Author: 太空宇宙. Updated: 2023-11-03 16:58:32

I am analyzing data from several sensors. A sensor switches to an event state (1) while it is in use. However, I only need the time (and date) of the first and last activation, not any of the readings in between. Once found, I need to build a new DataFrame containing the first- and last-occurrence times and dates, together with the 'User' and 'Activity'.

I tried iterating over each row and building a series of if-then statements, but without success. I am wondering whether there is a pandas function that would let me do this efficiently. Below is a subset of my data.

I am just getting the hang of pandas, so any help would be much appreciated.

Cheers!

import pandas as pd

cols = ['User', 'Activity', 'Coaster1', 'Coaster2', 'Coaster3',
        'Coaster4', 'Coaster5', 'Coffee', 'Door', 'Fridge', 'coldWater',
        'hotWater', 'SensorDate', 'SensorTime', 'RegisteredTime']

data = [['Chris', 'coffee + hot water', 0, 0.0, 0.0, 0, 0, 0.0, 1.0, 0.0,
         0.0, 0.0, '2015-09-21', '13:05:54', '13:09:00'],
        ['Chris', 'coffee + hot water', 0, 0.0, 0.0, 0, 0, 0.0, 1.0, 0.0,
         0.0, 0.0, '2015-09-21', '13:05:54', '13:09:00'],
        ['Chris', 'coffee + hot water', 0, 0.0, 0.0, 0, 0, 0.0, 1.0, 0.0,
         0.0, 0.0, '2015-09-21', '13:05:55', '13:09:00'],
        ['Chris', 'coffee + hot water', 0, 0.0, 0.0, 0, 0, 0.0, 1.0, 0.0,
         0.0, 0.0, '2015-09-21', '13:05:55', '13:09:00'],
        ['Chris', 'coffee + hot water', 0, 0.0, 0.0, 0, 0, 0.0, 1.0, 0.0,
         0.0, 0.0, '2015-09-21', '13:05:56', '13:09:00'],
        ['Chris', 'coffee + hot water', 0, 0.0, 0.0, 0, 0, 0.0, 1.0, 0.0,
         0.0, 0.0, '2015-09-21', '13:05:56', '13:09:00'],
        ['Chris', 'coffee + hot water', 0, 1.0, 0.0, 0, 0, 0.0, 0.0, 0.0,
         0.0, 0.0, '2015-09-21', '13:05:58', '13:09:00'],
        ['Chris', 'coffee + hot water', 0, 1.0, 0.0, 0, 0, 0.0, 0.0, 0.0,
         0.0, 0.0, '2015-09-21', '13:05:59', '13:09:00']]

df = pd.DataFrame(data, columns=cols)

The desired output looks like this:

data_out = [['Chris', 'coffee + hot water', '0', '0', '0', '0', '0', '0', '1', '0',
             '0', '0', '2015-09-21', '13:05:54', '13:05:56', '13:09:00'],
            ['Chris', 'coffee + hot water', '0', '1', '0', '0', '0', '0', '0', '0',
             '0', '0', '2015-09-21', '13:05:58', '13:05:59', '13:09:00']]

cols_out = ['User', 'Activity', 'Coaster1', 'Coaster2', 'Coaster3',
            'Coaster4', 'Coaster5', 'Coffee', 'Door', 'Fridge', 'coldWater',
            'hotWater', 'SensorDate', 'SensorTimeFirst', 'SensorTimeLast',
            'RegisteredTime']


df_out=pd.DataFrame(data_out, columns=cols_out)
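The core idea of "first and last occurrence" can be illustrated with a minimal sketch (a made-up two-column frame, not the full dataset above): filter to the rows where the sensor is active, then take the first and last rows of the filtered result.

```python
import pandas as pd

# Minimal illustration: one sensor column plus a timestamp column
s = pd.DataFrame({'Door':       [0, 1, 1, 1, 0],
                  'SensorTime': ['13:05:53', '13:05:54', '13:05:55',
                                 '13:05:56', '13:05:57']})

active = s[s['Door'] == 1]                   # keep only rows where the sensor fired
first_time = active['SensorTime'].iloc[0]    # '13:05:54'
last_time = active['SensorTime'].iloc[-1]    # '13:05:56'
```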

Best Answer

You can try groupby and then apply a custom function f to each group, for example:

def f(x):
    Doormin = x[x['Door'] == 1].min()
    Doormax = x[x['Door'] == 1].max()
    Coaster2min = x[x['Coaster2'] == 1].min()
    Coaster2max = x[x['Coaster2'] == 1].max()
    Coaster1min = x[x['Coaster1'] == 1].min()
    Coaster1max = x[x['Coaster1'] == 1].max()
    Door = pd.Series([Doormin['Door'], Doormin['SensorDate'], Doormin['SensorTime'],
                      Doormax['SensorTime'], Doormin['RegisteredTime']],
                     index=['Door', 'SensorDate', 'SensorTimeFirst',
                            'SensorTimeLast', 'RegisteredTime'])
    Coaster1 = pd.Series([Coaster1min['Coaster1'], Coaster1min['SensorDate'],
                          Coaster1min['SensorTime'], Coaster1max['SensorTime'],
                          Coaster1min['RegisteredTime']],
                         index=['Coaster1', 'SensorDate', 'SensorTimeFirst',
                                'SensorTimeLast', 'RegisteredTime'])
    Coaster2 = pd.Series([Coaster2min['Coaster2'], Coaster2min['SensorDate'],
                          Coaster2min['SensorTime'], Coaster2max['SensorTime'],
                          Coaster2min['RegisteredTime']],
                         index=['Coaster2', 'SensorDate', 'SensorTimeFirst',
                                'SensorTimeLast', 'RegisteredTime'])
    return pd.DataFrame([Door, Coaster2, Coaster1])

print(df.groupby(['User', 'Activity']).apply(f))

                              Coaster1 Coaster2 Door RegisteredTime  \
User  Activity
Chris coffee + hot water 0        NaN      NaN    1       13:09:00
                         1        NaN        1  NaN       13:09:00
                         2        NaN      NaN  NaN            NaN

                              SensorDate SensorTimeFirst SensorTimeLast
User  Activity
Chris coffee + hot water 0    2015-09-21        13:05:54       13:05:56
                         1    2015-09-21        13:05:58       13:05:59
                         2           NaN             NaN            NaN

You can also use fillna to fill in 0 instead of NaN:

df = df.groupby(['User', 'Activity']).apply(f)
df[['Coaster1', 'Coaster2', 'Door']] = df[['Coaster1', 'Coaster2', 'Door']].fillna(0)
print(df)

                              Coaster1 Coaster2 Door RegisteredTime  \
User  Activity
Chris coffee + hot water 0          0        0    1       13:09:00
                         1          0        1    0       13:09:00
                         2          0        0    0            NaN

                              SensorDate SensorTimeFirst SensorTimeLast
User  Activity
Chris coffee + hot water 0    2015-09-21        13:05:54       13:05:56
                         1    2015-09-21        13:05:58       13:05:59
                         2           NaN             NaN            NaN
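As a side note (an editor's sketch, not part of the original answer): in more recent pandas versions a similar first/last-time summary can be computed without a hand-written apply function, by grouping on the sensor columns themselves and aggregating SensorTime with min and max. The column set below is trimmed to two sensors for brevity:

```python
import pandas as pd

cols = ['User', 'Activity', 'Door', 'Coaster2', 'SensorDate', 'RegisteredTime', 'SensorTime']
data = [['Chris', 'coffee + hot water', 1.0, 0.0, '2015-09-21', '13:09:00', '13:05:54'],
        ['Chris', 'coffee + hot water', 1.0, 0.0, '2015-09-21', '13:09:00', '13:05:56'],
        ['Chris', 'coffee + hot water', 0.0, 1.0, '2015-09-21', '13:09:00', '13:05:58'],
        ['Chris', 'coffee + hot water', 0.0, 1.0, '2015-09-21', '13:09:00', '13:05:59']]
df = pd.DataFrame(data, columns=cols)

# Rows sharing the same sensor pattern form one group; named aggregation
# (pandas >= 0.25) gives the first/last SensorTime per group in one step.
out = (df.groupby(['User', 'Activity', 'Door', 'Coaster2',
                   'SensorDate', 'RegisteredTime'])['SensorTime']
         .agg(SensorTimeFirst='min', SensorTimeLast='max')
         .reset_index())
```

Since the times are zero-padded HH:MM:SS strings, min/max on them happens to match chronological order; with mixed formats you would convert via pd.to_datetime first.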

Regarding python - how to get the first and last occurrence of an item in pandas, we found a similar question on Stack Overflow: https://stackoverflow.com/questions/35225295/
