
python - Splitting a text database into N equal blocks while keeping the header

Reposted · Author: 太空宇宙 · Updated: 2023-11-04 01:36:07

I have several large (300,000+ line) text databases that I am cleaning with the code below. I need to split each file into chunks of 1 million lines or fewer while keeping the header row. I have looked at chunking and itertools but could not arrive at a clear solution. The files are used in an ArcGIS model.

== Code updated based on icyrock.com's answer

import arcpy, os
#fc = arcpy.GetParameter(0)
#chunk_size = arcpy.GetParameter(1) # number of records in each dataset

fc='input.txt'
Name = fc[:fc.rfind('.')]
fl = Name+'_db.txt'

with open(fc) as f:
    lines = f.readlines()
lines[:] = lines[3:]
lines[0] = lines[0].replace('Rx(db)', 'Rx_'+Name)
lines[0] = lines[0].replace('Best Unit', 'Best_Unit')
records = len(lines)
with open(fl, 'w') as f:  # where N is the chunk number
    f.write(''.join(lines))  # lines already end with '\n'

with open(fl) as file:
    lines = file.readlines()

headers = lines[0:1]
rest = lines[1:]
chunk_size = 1000000

def chunks(lst, chunk_size):
    for i in xrange(0, len(lst), chunk_size):
        yield lst[i:i + chunk_size]

def write_rows(rows, file):
    for row in rows:
        file.write('%s' % row)

part = 1
for chunk in chunks(rest, chunk_size):
    with open(Name+'_%d' % part+'.txt', 'w') as file:
        write_rows(headers, file)
        write_rows(chunk, file)
    part += 1

See Remove specific lines from a large text file in python and split a large text (xyz) database into x equal parts for background. I no longer want a Cygwin-based solution, as it would overcomplicate the model; I need a Pythonic way. We could use "records" to iterate, but it is not clear how to select rows 1 through 999,999 for db #1, rows 1,000,000 through 1,999,999 for db #2, and so on. It is fine if the last dataset has fewer than 1 million records.

It errors out on a 500 MB file (I have 16 GB of RAM).

Traceback (most recent call last):
  File "P:\2012\Job_044_DM_Radio_Propogation\Working\test\clean_file.py", line 10, in <module>
    lines = f.readlines()
MemoryError

records 2249878

The record count above is not the total number of records; it is just where it ran out of memory (I think).
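Since f.readlines() pulls the whole file into one in-memory list, a likely fix for the MemoryError is to stream the cleanup instead, reading and writing one line at a time. A minimal sketch (the function name clean_stream is an assumption; the three skipped junk lines and the header substitutions mirror the code above):

```python
def clean_stream(src, dst, name, skip=3):
    """Stream-clean a text database: drop the first `skip` junk lines,
    fix the header fields, and copy the remaining records one line at
    a time so the whole file never sits in memory at once."""
    with open(src) as fin, open(dst, 'w') as fout:
        for _ in range(skip):          # discard leading junk lines
            next(fin, None)
        header = next(fin, '')         # the real header line
        header = header.replace('Rx(db)', 'Rx_' + name)
        header = header.replace('Best Unit', 'Best_Unit')
        fout.write(header)
        for line in fin:               # stream the records through
            fout.write(line)
```

Peak memory is then bounded by the longest single line rather than by the file size.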

=== New code using icyrock's answer.

The chunking seems to work fine, but it errors when used in ArcGIS.

Start Time: Fri Mar 09 17:20:04 2012
WARNING 000594: Input feature 1945882430: falls outside of output geometry domains.
WARNING 000595: d:\Temp\cb_vhn007_1.txt_Features1.fid contains the full list of features not able to be copied.

I know it is a problem with the chunking, because the Make Event Layer process handles the full, pre-chunked dataset fine.

Any ideas?

Best Answer

You can do something like this:

with open('file') as file:
    lines = file.readlines()

headers = lines[0:1]
rest = lines[1:]
chunk_size = 4

def chunks(lst, chunk_size):
    for i in xrange(0, len(lst), chunk_size):
        yield lst[i:i + chunk_size]

def write_rows(rows, file):
    for row in rows:
        file.write('%s' % row)

part = 1
for chunk in chunks(rest, chunk_size):
    with open('part%d' % part, 'w') as file:
        write_rows(headers, file)
        write_rows(chunk, file)
    part += 1

Here is a test run:

$ cat file && python mkt.py && for p in part*; do echo ---- $p; cat $p; done
header
1
2
3
4
5
6
7
8
9
10
11
12
13
14
---- part1
header
1
2
3
4
---- part2
header
5
6
7
8
---- part3
header
9
10
11
12
---- part4
header
13
14

Obviously, change the value of chunk_size, and adjust how the headers are gathered according to how many header lines there are.
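For example, with the chunks generator above (written here with range so it also runs on Python 3; the answer's xrange is Python 2), a 10-item list and chunk_size = 3 yields three full chunks and one short one:

```python
def chunks(lst, chunk_size):
    # yield successive chunk_size-sized slices; the last may be shorter
    for i in range(0, len(lst), chunk_size):
        yield lst[i:i + chunk_size]

parts = list(chunks(list(range(10)), 3))
# parts == [[0, 1, 2], [3, 4, 5], [6, 7, 8], [9]]
```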

Credits:

EDIT - to do this line by line and avoid the memory problems, you can do something like this:

from itertools import islice

headers_count = 5
chunk_size = 250000

with open('file') as fin:
    headers = list(islice(fin, headers_count))

    part = 1
    while True:
        line_iter = islice(fin, chunk_size)
        try:
            first_line = line_iter.next()
        except StopIteration:
            break
        with open('part%d' % part, 'w') as fout:
            for line in headers:
                fout.write(line)
            fout.write(first_line)
            for line in line_iter:
                fout.write(line)
        part += 1
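The key point making this work is that islice never rewinds the underlying iterator: each islice(fin, chunk_size) call consumes the next chunk_size lines and leaves fin positioned just after them, so successive calls produce consecutive chunks. A quick illustration with a plain iterator (Python 3's next() shown in place of the Python 2 .next() above):

```python
from itertools import islice

it = iter(range(10))
first = list(islice(it, 4))   # consumes 0..3
second = list(islice(it, 4))  # continues with 4..7
rest = list(it)               # whatever remains: 8, 9
```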

Credits:

Test case (put the above in a file named mkt2.py):

Create a file with 5 header lines and 1234567 data lines:

with open('file', 'w') as fout:
    for i in range(5):
        fout.write(10 * ('header %d ' % i) + '\n')
    for i in range(1234567):
        fout.write(10 * ('line %d ' % i) + '\n')

Shell script to test with (put in a file named rt.sh):

rm part*
echo ---- file
head -n7 file
tail -n2 file

python mkt2.py

for i in part*; do
echo ---- $i
head -n7 $i
tail -n2 $i
done

Sample output:

$ sh rt.sh 
---- file
header 0 header 0 header 0 header 0 header 0 header 0 header 0 header 0 header 0 header 0
header 1 header 1 header 1 header 1 header 1 header 1 header 1 header 1 header 1 header 1
header 2 header 2 header 2 header 2 header 2 header 2 header 2 header 2 header 2 header 2
header 3 header 3 header 3 header 3 header 3 header 3 header 3 header 3 header 3 header 3
header 4 header 4 header 4 header 4 header 4 header 4 header 4 header 4 header 4 header 4
line 0 line 0 line 0 line 0 line 0 line 0 line 0 line 0 line 0 line 0
line 1 line 1 line 1 line 1 line 1 line 1 line 1 line 1 line 1 line 1
line 1234565 line 1234565 line 1234565 line 1234565 line 1234565 line 1234565 line 1234565 line 1234565 line 1234565 line 1234565
line 1234566 line 1234566 line 1234566 line 1234566 line 1234566 line 1234566 line 1234566 line 1234566 line 1234566 line 1234566
---- part1
header 0 header 0 header 0 header 0 header 0 header 0 header 0 header 0 header 0 header 0
header 1 header 1 header 1 header 1 header 1 header 1 header 1 header 1 header 1 header 1
header 2 header 2 header 2 header 2 header 2 header 2 header 2 header 2 header 2 header 2
header 3 header 3 header 3 header 3 header 3 header 3 header 3 header 3 header 3 header 3
header 4 header 4 header 4 header 4 header 4 header 4 header 4 header 4 header 4 header 4
line 0 line 0 line 0 line 0 line 0 line 0 line 0 line 0 line 0 line 0
line 1 line 1 line 1 line 1 line 1 line 1 line 1 line 1 line 1 line 1
line 249998 line 249998 line 249998 line 249998 line 249998 line 249998 line 249998 line 249998 line 249998 line 249998
line 249999 line 249999 line 249999 line 249999 line 249999 line 249999 line 249999 line 249999 line 249999 line 249999
---- part2
header 0 header 0 header 0 header 0 header 0 header 0 header 0 header 0 header 0 header 0
header 1 header 1 header 1 header 1 header 1 header 1 header 1 header 1 header 1 header 1
header 2 header 2 header 2 header 2 header 2 header 2 header 2 header 2 header 2 header 2
header 3 header 3 header 3 header 3 header 3 header 3 header 3 header 3 header 3 header 3
header 4 header 4 header 4 header 4 header 4 header 4 header 4 header 4 header 4 header 4
line 250000 line 250000 line 250000 line 250000 line 250000 line 250000 line 250000 line 250000 line 250000 line 250000
line 250001 line 250001 line 250001 line 250001 line 250001 line 250001 line 250001 line 250001 line 250001 line 250001
line 499998 line 499998 line 499998 line 499998 line 499998 line 499998 line 499998 line 499998 line 499998 line 499998
line 499999 line 499999 line 499999 line 499999 line 499999 line 499999 line 499999 line 499999 line 499999 line 499999
---- part3
header 0 header 0 header 0 header 0 header 0 header 0 header 0 header 0 header 0 header 0
header 1 header 1 header 1 header 1 header 1 header 1 header 1 header 1 header 1 header 1
header 2 header 2 header 2 header 2 header 2 header 2 header 2 header 2 header 2 header 2
header 3 header 3 header 3 header 3 header 3 header 3 header 3 header 3 header 3 header 3
header 4 header 4 header 4 header 4 header 4 header 4 header 4 header 4 header 4 header 4
line 500000 line 500000 line 500000 line 500000 line 500000 line 500000 line 500000 line 500000 line 500000 line 500000
line 500001 line 500001 line 500001 line 500001 line 500001 line 500001 line 500001 line 500001 line 500001 line 500001
line 749998 line 749998 line 749998 line 749998 line 749998 line 749998 line 749998 line 749998 line 749998 line 749998
line 749999 line 749999 line 749999 line 749999 line 749999 line 749999 line 749999 line 749999 line 749999 line 749999
---- part4
header 0 header 0 header 0 header 0 header 0 header 0 header 0 header 0 header 0 header 0
header 1 header 1 header 1 header 1 header 1 header 1 header 1 header 1 header 1 header 1
header 2 header 2 header 2 header 2 header 2 header 2 header 2 header 2 header 2 header 2
header 3 header 3 header 3 header 3 header 3 header 3 header 3 header 3 header 3 header 3
header 4 header 4 header 4 header 4 header 4 header 4 header 4 header 4 header 4 header 4
line 750000 line 750000 line 750000 line 750000 line 750000 line 750000 line 750000 line 750000 line 750000 line 750000
line 750001 line 750001 line 750001 line 750001 line 750001 line 750001 line 750001 line 750001 line 750001 line 750001
line 999998 line 999998 line 999998 line 999998 line 999998 line 999998 line 999998 line 999998 line 999998 line 999998
line 999999 line 999999 line 999999 line 999999 line 999999 line 999999 line 999999 line 999999 line 999999 line 999999
---- part5
header 0 header 0 header 0 header 0 header 0 header 0 header 0 header 0 header 0 header 0
header 1 header 1 header 1 header 1 header 1 header 1 header 1 header 1 header 1 header 1
header 2 header 2 header 2 header 2 header 2 header 2 header 2 header 2 header 2 header 2
header 3 header 3 header 3 header 3 header 3 header 3 header 3 header 3 header 3 header 3
header 4 header 4 header 4 header 4 header 4 header 4 header 4 header 4 header 4 header 4
line 1000000 line 1000000 line 1000000 line 1000000 line 1000000 line 1000000 line 1000000 line 1000000 line 1000000 line 1000000
line 1000001 line 1000001 line 1000001 line 1000001 line 1000001 line 1000001 line 1000001 line 1000001 line 1000001 line 1000001
line 1234565 line 1234565 line 1234565 line 1234565 line 1234565 line 1234565 line 1234565 line 1234565 line 1234565 line 1234565
line 1234566 line 1234566 line 1234566 line 1234566 line 1234566 line 1234566 line 1234566 line 1234566 line 1234566 line 1234566

Timings for the above:

real    0m0.935s
user 0m0.708s
sys 0m0.200s

Hope this helps.

Regarding python - splitting a text database into N equal blocks while keeping the header, we found a similar question on Stack Overflow: https://stackoverflow.com/questions/9626842/
