
python - Entropy of IP packet information

Reposted · Author: 行者123 · Updated: 2023-11-28 16:34:59

I have a .csv file full of packet-header information. The first lines:

28;03/07/2000;11:27:51;00:00:01;8609;4961;8609;097.139.024.164;131.084.001.031;0;-
29;03/07/2000;11:27:51;00:00:01;29396;4962;29396;058.106.180.191;131.084.001.031;0;-
30;03/07/2000;11:27:51;00:00:01;26290;4963;26290;060.075.194.137;131.084.001.031;0;-
31;03/07/2000;11:27:51;00:00:01;28324;4964;28324;038.087.169.169;131.084.001.031;0;-

There are about 33k lines in total (each line holds information from a different packet header). Now I need to compute the entropy using the source and destination addresses.

With the code I wrote:

def openFile(file_name):
    srcFile = open(file_name, 'r')
    dataset = []
    for line in srcFile:
        newLine = line.split(";")
        dataset.append(newLine)
    return dataset

the return value I get looks like:

dataset = [
    ['28', '03/07/2000', '11:27:51', '00:00:01', '8609', '4961', '8609', '097.139.024.164', '131.084.001.031', '0', '-\n'],
    ['29', '03/07/2000', '11:27:51', '00:00:01', '29396', '4962', '29396', '058.106.180.191', '131.084.001.031', '0', '-\n'],
    ['30', '03/07/2000', '11:27:51', '00:00:01', '26290', '4963', '26290', '060.075.194.137', '131.084.001.031', '0', '-\n'],
    ['31', '03/07/2000', '11:27:51', '00:00:01', '28324', '4964', '28324', '038.087.169.169', '131.084.001.031', '0', '-']
]
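For illustration, given rows shaped like the dataset above, the source address sits at index 7 and the destination address at index 8; a minimal sketch of pulling those two columns out (using the first two sample rows):

```python
dataset = [
    ['28', '03/07/2000', '11:27:51', '00:00:01', '8609', '4961', '8609',
     '097.139.024.164', '131.084.001.031', '0', '-\n'],
    ['29', '03/07/2000', '11:27:51', '00:00:01', '29396', '4962', '29396',
     '058.106.180.191', '131.084.001.031', '0', '-\n'],
]

# Source address is column 7, destination address is column 8.
src_ips = [row[7] for row in dataset]
dst_ips = [row[8] for row in dataset]
print(src_ips)  # ['097.139.024.164', '058.106.180.191']
print(dst_ips)  # ['131.084.001.031', '131.084.001.031']
```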

Then I pass it to my entropy function:

import math

#---- Entropy += - prob * math.log(prob, 2) ---------
def Entropy(data):
    entropy = 0
    counter = 0  # -- counter for occurrences of the same IP address
    #-- For loop to iterate through every item in outer list
    for item in range(len(data)):
        #-- For loop to iterate through inner list
        for x in data[item]:
            if x == data[item][8]:
                counter += 1
        prob = float(counter) / len(data)
        entropy += -prob * math.log(prob, 2)
    print("\n")
    print("Entropy: {}".format(entropy))

The code runs without any errors, but it gives the wrong entropy; I think this is due to a wrong probability calculation (the second for loop is the suspect) or a wrong entropy formula. Is there a way to find the probability of an IP occurring without another for loop? Any edits to the code are welcome.

Best Answer

With numpy and the built-in collections module you can simplify the code considerably:

import numpy as np
import collections

sample_ips = [
    "131.084.001.031",
    "131.084.001.031",
    "131.284.001.031",
    "131.284.001.031",
    "131.284.001.000",
]

C = collections.Counter(sample_ips)
counts = np.array(list(C.values()), dtype=float)  # list() is needed on Python 3
prob = counts / counts.sum()
shannon_entropy = (-prob * np.log2(prob)).sum()
print(shannon_entropy)
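The same computation works without numpy, using only the standard library; a hedged sketch of a small helper (the name `ip_entropy` is mine, not from the original code) applied to the five sample addresses above:

```python
import collections
import math

def ip_entropy(ips):
    """Shannon entropy (in bits) of the empirical IP-address distribution."""
    counts = collections.Counter(ips)
    total = len(ips)
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

sample_ips = [
    "131.084.001.031",
    "131.084.001.031",
    "131.284.001.031",
    "131.284.001.031",
    "131.284.001.000",
]
print(ip_entropy(sample_ips))  # ≈ 1.522 bits (probabilities 0.4, 0.4, 0.2)
```

Applied to the question's dataset, the same helper would take the list of destination addresses (column 8), e.g. `ip_entropy([row[8] for row in dataset])`.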

Regarding python - Entropy of IP packet information, we found a similar question on Stack Overflow: https://stackoverflow.com/questions/27432078/
