
python - What is a clean pythonic way to find sets of sequential constant values in a pd.DataFrame?

Reposted · Author: 太空宇宙 · Updated: 2023-11-03 15:34:27

I am looking for a way to flag runs of at least n sequential constant values in a pd.DataFrame (e.g. df).

I wrote some code that flags a value if its difference from each of the previous n/2 and next n/2 data points is zero.

import numpy as np
import pandas as pd

n = 5         # the minimum number of sequential constant values

# create an example dataframe
df = pd.DataFrame(np.random.randn(25),
                  index=pd.date_range(start='2010-1-1', end='2010-1-2', freq='H'),
                  columns=['value'])

# modify the dataframe to contain several runs of constant values
df[1:10] = 23
df[20:26] = 10
df[15:17] = 15

df_diff = pd.DataFrame(index=df.index)
for i in np.arange(1, int(n / 2) + 1):
    # difference between each value and the i-th previous value
    df_diff['delta_' + str(i)] = df['value'].diff(periods=i).abs()
    # difference between each value and the i-th next value
    df_diff['delta_' + str(-i)] = df['value'].diff(periods=-i).abs()

# filter the results (e.g. as a boolean mask)
result_1 = (df_diff <= 0).all(axis=1)
result_2 = (df_diff <= 0).any(axis=1)

result_1 and result_2 in this example do not give the correct answer.

What I expect is:

2010-01-01 00:00:00    False
2010-01-01 01:00:00 True
2010-01-01 02:00:00 True
2010-01-01 03:00:00 True
2010-01-01 04:00:00 True
2010-01-01 05:00:00 True
2010-01-01 06:00:00 True
2010-01-01 07:00:00 True
2010-01-01 08:00:00 True
2010-01-01 09:00:00 True
2010-01-01 10:00:00 False
2010-01-01 11:00:00 False
2010-01-01 12:00:00 False
2010-01-01 13:00:00 False
2010-01-01 14:00:00 False
2010-01-01 15:00:00 False
2010-01-01 16:00:00 False
2010-01-01 17:00:00 False
2010-01-01 18:00:00 False
2010-01-01 19:00:00 False
2010-01-01 20:00:00 True
2010-01-01 21:00:00 True
2010-01-01 22:00:00 True
2010-01-01 23:00:00 True
2010-01-02 00:00:00 True

Best Answer

IIUC, use DataFrame.groupby where the grouper is Series.diff, then .ne(0), then .cumsum():

df.groupby(df.value.diff().ne(0).cumsum())['value'].transform('size').ge(n)

[Output]

2010-01-01 00:00:00    False
2010-01-01 01:00:00 True
2010-01-01 02:00:00 True
2010-01-01 03:00:00 True
2010-01-01 04:00:00 True
2010-01-01 05:00:00 True
2010-01-01 06:00:00 True
2010-01-01 07:00:00 True
2010-01-01 08:00:00 True
2010-01-01 09:00:00 True
2010-01-01 10:00:00 False
2010-01-01 11:00:00 False
2010-01-01 12:00:00 False
2010-01-01 13:00:00 False
2010-01-01 14:00:00 False
2010-01-01 15:00:00 False
2010-01-01 16:00:00 False
2010-01-01 17:00:00 False
2010-01-01 18:00:00 False
2010-01-01 19:00:00 False
2010-01-01 20:00:00 True
2010-01-01 21:00:00 True
2010-01-01 22:00:00 True
2010-01-01 23:00:00 True
2010-01-02 00:00:00 True
Freq: H, Name: value, dtype: bool

Explanation

The series we group on assigns one group ID to each run of consecutive equal values:

s = df.value.diff().ne(0).cumsum()

2010-01-01 00:00:00 1
2010-01-01 01:00:00 2
2010-01-01 02:00:00 2
2010-01-01 03:00:00 2
2010-01-01 04:00:00 2
2010-01-01 05:00:00 2
2010-01-01 06:00:00 2
2010-01-01 07:00:00 2
2010-01-01 08:00:00 2
2010-01-01 09:00:00 2
2010-01-01 10:00:00 3
2010-01-01 11:00:00 4
2010-01-01 12:00:00 5
2010-01-01 13:00:00 6
2010-01-01 14:00:00 7
2010-01-01 15:00:00 8
2010-01-01 16:00:00 8
2010-01-01 17:00:00 9
2010-01-01 18:00:00 10
2010-01-01 19:00:00 11
2010-01-01 20:00:00 12
2010-01-01 21:00:00 12
2010-01-01 22:00:00 12
2010-01-01 23:00:00 12
2010-01-02 00:00:00 12
Freq: H, Name: value, dtype: int32
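To see what this group-ID construction does in isolation, here is a minimal check on a small hand-made series (the values are illustrative, not taken from the question):

```python
import pandas as pd

# three equal values, two singletons, then a pair
s = pd.Series([23, 23, 23, 1, 2, 10, 10])

# diff() is non-zero (or NaN, for the first row) exactly where the
# value changes, so the cumulative sum yields one ID per run
group_id = s.diff().ne(0).cumsum()
print(group_id.tolist())  # [1, 1, 1, 2, 3, 4, 4]
```

Note that the NaN produced by diff() on the first row compares unequal to 0, so the first run correctly starts a new group.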

When you group by these "group IDs" and use transform, which returns an object the same shape as the original DataFrame, with the 'size' aggregation, you get:

s.groupby(s).transform('size')

2010-01-01 00:00:00 1
2010-01-01 01:00:00 9
2010-01-01 02:00:00 9
2010-01-01 03:00:00 9
2010-01-01 04:00:00 9
2010-01-01 05:00:00 9
2010-01-01 06:00:00 9
2010-01-01 07:00:00 9
2010-01-01 08:00:00 9
2010-01-01 09:00:00 9
2010-01-01 10:00:00 1
2010-01-01 11:00:00 1
2010-01-01 12:00:00 1
2010-01-01 13:00:00 1
2010-01-01 14:00:00 1
2010-01-01 15:00:00 2
2010-01-01 16:00:00 2
2010-01-01 17:00:00 1
2010-01-01 18:00:00 1
2010-01-01 19:00:00 1
2010-01-01 20:00:00 5
2010-01-01 21:00:00 5
2010-01-01 22:00:00 5
2010-01-01 23:00:00 5
2010-01-02 00:00:00 5
Freq: H, Name: value, dtype: int64

From here, it is a simple Series.ge (>=) comparison against your value n.
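Putting the three steps together, here is a self-contained sketch of the approach on a small deterministic series (n = 3 and the series values are made up for illustration):

```python
import pandas as pd

n = 3  # minimum run length (illustrative choice)
s = pd.Series([1, 5, 5, 5, 2, 7, 7, 9, 9, 9, 9])

# 1) label each run of equal values with a group ID
group_id = s.diff().ne(0).cumsum()

# 2) broadcast each run's length back to the original shape
run_size = s.groupby(group_id).transform('size')

# 3) keep only runs of at least n values
mask = run_size.ge(n)
print(mask.tolist())
```

The answer's one-liner collapses these three steps into a single expression over df['value'].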

Regarding python - What is a clean pythonic way to find sets of sequential constant values in a pd.DataFrame?, we found a similar question on Stack Overflow: https://stackoverflow.com/questions/55759947/
