I'm trying to read a file line by line. The problem is that the file is too large (over 500,000 lines) and I keep exceeding the memory limit. I'd like to know how to read the file without hitting the memory limit.
I'm considering a multi-threaded solution (for example, splitting the file into smaller groups of 100,000 lines each and reading them in multiple threads), but I don't know how to do that in detail. Please help me (sorry for my bad English).
Here is my code:
$fn = fopen("myfile.txt", "r");
while (!feof($fn)) {
    $result = fgets($fn);
    echo $result;
}
fclose($fn);
Best Answer
You can use a generator to keep memory usage low. This is just an example a user posted on the documentation page:
function getLines($file)
{
    $f = fopen($file, 'r');
    try {
        while ($line = fgets($f)) {
            yield $line;
        }
    } finally {
        fclose($f);
    }
}
foreach (getLines("file.txt") as $n => $line) {
    // insert the line into db or do whatever you want with it.
}
A generator allows you to write code that uses foreach to iterate over a set of data without needing to build an array in memory, which may cause you to exceed a memory limit, or require a considerable amount of processing time to generate. Instead, you can write a generator function, which is the same as a normal function, except that instead of returning once, a generator can yield as many times as it needs to in order to provide the values to be iterated over.
Regarding "php - How to read a large file in PHP without exceeding the memory limit", we found a similar question on Stack Overflow: https://stackoverflow.com/questions/61262664/