
r - Extending the memory size limit in R


I have an R program that works with 10 files, each 296 MB in size, and I have increased the memory size to 8 GB (the size of my RAM):

--max-mem-size=8192M

When I run this program, I get the following error:

In type.convert(data[[i]], as.is = as.is[i], dec = dec, na.strings = character(0L)) :
Reached total allocation of 7646Mb: see help(memory.size)
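
The error above points at help(memory.size). On Windows builds of R the allocation limit can also be checked or raised from inside the session; a minimal sketch, assuming a Windows version of R where memory.size() and memory.limit() are still available (both were removed in R 4.2):

memory.size()              # Mb currently used by R
memory.limit()             # current allocation limit in Mb
memory.limit(size = 8192)  # raise the limit to 8 GB, like starting R with --max-mem-size=8192M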

Here is my R program:

S1 <- read.csv2("C:/Sim_Omega3_results/sim_omega3_1_400.txt");
S2 <- read.csv2("C:/Sim_Omega3_results/sim_omega3_401_800.txt");
S3 <- read.csv2("C:/Sim_Omega3_results/sim_omega3_801_1200.txt");
S4 <- read.csv2("C:/Sim_Omega3_results/sim_omega3_1201_1600.txt");
S5 <- read.csv2("C:/Sim_Omega3_results/sim_omega3_1601_2000.txt");
S6 <- read.csv2("C:/Sim_Omega3_results/sim_omega3_2001_2400.txt");
S7 <- read.csv2("C:/Sim_Omega3_results/sim_omega3_2401_2800.txt");
S8 <- read.csv2("C:/Sim_Omega3_results/sim_omega3_2801_3200.txt");
S9 <- read.csv2("C:/Sim_Omega3_results/sim_omega3_3201_3600.txt");
S10 <- read.csv2("C:/Sim_Omega3_results/sim_omega3_3601_4000.txt");
options(max.print=154.8E10);
combine_result <- rbind(S1,S2,S3,S4,S5,S6,S7,S8,S9,S10)
write.table(combine_result, file="C:/sim_omega3_1_4000.txt", sep=";",
            row.names=FALSE, col.names=TRUE, quote=FALSE);

Can anyone help me?

Thanks,

Shruti.

Best Answer

I suggest following the suggestions in ?read.csv2:

Memory usage:

These functions can use a surprising amount of memory when reading
large files. There is extensive discussion in the ‘R Data
Import/Export’ manual, supplementing the notes here.

Less memory will be used if ‘colClasses’ is specified as one of
the six atomic vector classes. This can be particularly so when
reading a column that takes many distinct numeric values, as
storing each distinct value as a character string can take up to
14 times as much memory as storing it as an integer.

Using ‘nrows’, even as a mild over-estimate, will help memory
usage.

Using ‘comment.char = ""’ will be appreciably faster than the
‘read.table’ default.

‘read.table’ is not the right tool for reading large matrices,
especially those with many columns: it is designed to read _data
frames_ which may have columns of very different classes. Use
‘scan’ instead for matrices.
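
As a rough sketch of how those suggestions could be applied to the code in the question (the column classes and the row-count estimate below are assumptions, since the structure of the sim_omega3 files is not shown, and would need to be adjusted):

# Build the ten file names from the question.
files <- sprintf("C:/Sim_Omega3_results/sim_omega3_%d_%d.txt",
                 seq(1, 3601, by = 400), seq(400, 4000, by = 400))

# Assumption: replace "numeric" with one class per column, e.g. c("integer", "numeric", ...).
col_classes <- "numeric"

read_one <- function(f) {
  read.csv2(f,
            colClasses = col_classes,  # avoids storing numeric values as character strings
            nrows = 1e6,               # assumption: a mild over-estimate of the rows per file
            comment.char = "")         # already the read.csv2 default, shown for completeness
}

# Read and combine in one step, so the ten pieces are not kept around as S1..S10.
combine_result <- do.call(rbind, lapply(files, read_one))

write.table(combine_result, file = "C:/sim_omega3_1_4000.txt", sep = ";",
            row.names = FALSE, col.names = TRUE, quote = FALSE)

rm(combine_result); gc()  # optionally free the memory once the combined file is written

Per the help text, colClasses and nrows are the settings most likely to cut memory use; reading via lapply() and do.call(rbind, ...) additionally means the ten intermediate data frames are not left sitting in the workspace as S1 to S10 after the combine step.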

Regarding r - Extending the memory size limit in R, we found a similar question on Stack Overflow: https://stackoverflow.com/questions/5749058/
