
python - How to write 500 million entries to neo4j in a reasonable time (less than 1 day)?


I am processing a large amount of e-mail data and want to load all of it into a Neo4j database.

The idea is to have one node per address and an edge for every e-mail sent between two or more addresses.

from py2neo import Graph, Node, Relationship

graph = Graph()
tx = graph.begin()

# Doing the following in batches of 100, then committing.
# num, dest_addr, t, w and tx_hash come from the e-mail currently being parsed.
a = Node("E-mail_subject", name=str(num))   # one node per e-mail subject
b = Node("Address", name=dest_addr)         # one node per address
tx.merge(a, "E-mail_subject", "name")       # match-or-create on the name property
tx.merge(b, "Address", "name")
ba = Relationship(b, "WAS_ON", a, time=t, name=num, weight=w, _id=tx_hash)
tx.create(ba)

# commit every 100 relations
tx.commit()
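
For reference, here is a minimal sketch of the batching loop implied by the comments above. The `records` iterable (yielding one (num, dest_addr, t, w, tx_hash) tuple per parsed e-mail) is a hypothetical stand-in, not part of the original question; the py2neo calls are the same as in the snippet.

from py2neo import Graph, Node, Relationship

graph = Graph()
BATCH_SIZE = 100

tx = graph.begin()
for i, (num, dest_addr, t, w, tx_hash) in enumerate(records, start=1):  # records: hypothetical parsed e-mails
    a = Node("E-mail_subject", name=str(num))
    b = Node("Address", name=dest_addr)
    tx.merge(a, "E-mail_subject", "name")
    tx.merge(b, "Address", "name")
    tx.create(Relationship(b, "WAS_ON", a, time=t, name=num, weight=w, _id=tx_hash))
    if i % BATCH_SIZE == 0:       # commit every 100 relations
        tx.commit()
        tx = graph.begin()        # open a fresh transaction for the next batch
tx.commit()                       # commit the final, possibly partial, batch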

The above takes far too long to load 500 million e-mails into neo4j. Any suggestions on how to do this faster?

Best Answer

Why not use LOAD CSV for the import? It will be much faster!

// the hyphen in E-mail_subject requires back-ticks around the label
USING PERIODIC COMMIT 1000
LOAD CSV FROM 'EMAIL_CSV_FILE' AS line
MERGE (:`E-mail_subject` {name: line[0]});

USING PERIODIC COMMIT 1000
LOAD CSV FROM 'ADDRESS_CSV_FILE' AS line
MERGE (:Address {name: line[0]});

// match the existing nodes first, then merge only the relationship;
// merging the whole pattern at once would create duplicate nodes
USING PERIODIC COMMIT 1000
LOAD CSV FROM 'WAS_CSV_FILE' AS line
MATCH (e:`E-mail_subject` {name: line[0]})
MATCH (a:Address {name: line[1]})
MERGE (a)-[:WAS_ON {time: line[2], name: line[3], weight: line[4], _id: line[5]}]->(e);
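
The LOAD CSV approach assumes the e-mail data has first been flattened into those three files. A minimal sketch of that export step in Python, reusing the hypothetical `records` iterable from above (the concrete file names are also assumptions):

import csv

with open("emails.csv", "w", newline="") as emails, \
     open("addresses.csv", "w", newline="") as addresses, \
     open("was_on.csv", "w", newline="") as was_on:
    email_writer = csv.writer(emails)        # one column: the subject name (line[0])
    address_writer = csv.writer(addresses)   # one column: the address (line[0])
    rel_writer = csv.writer(was_on)          # six columns: line[0]..line[5] above
    for num, dest_addr, t, w, tx_hash in records:  # records: hypothetical parsed e-mails
        email_writer.writerow([str(num)])
        address_writer.writerow([dest_addr])
        rel_writer.writerow([str(num), dest_addr, t, num, w, tx_hash])

One more point worth noting: each MERGE performs a lookup per CSV row, so creating an index or uniqueness constraint on the name property of both labels before running the load is what keeps 500 million rows tractable.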

Regarding python - How to write 500 million entries to neo4j in a reasonable time (less than 1 day)?, we found a similar question on Stack Overflow: https://stackoverflow.com/questions/54165915/
