I am trying to retrieve data from SQL table A, modify some columns, and then insert the modified columns into SQL table B.
My problem is that when I use:
$customer = new Customer;
$fakecustomer = new Fakecustomer;

$fake_customer_name_records = $fakecustomer->get()->toArray();

foreach ($fake_customer_name_records as $record) {
    // process columns for each record
    $fake_customer_name_records_array[] = [
        'last_name'  => $last_name,
        'first_name' => $first_name,
        'home_phone' => $phonenumber,
    ];
}

$customer->insert($fake_customer_name_records_array);
it only manages to insert about 1,000 records. Is there a way in Laravel to handle around 60,000 records?
Thanks
Best Answer
I suggest using the "chunk" option here and processing the records in chunks; in my opinion that is the more native approach. Here is what the documentation says:
Chunking Results
If you need to process a lot (thousands) of Eloquent records, using the chunk command will allow you to do without eating all of your RAM:
User::chunk(200, function($users) {
    foreach ($users as $user) {
        //
    }
});
The first argument passed to the method is the number of records you wish to receive per "chunk". The Closure passed as the second argument will be called for each chunk that is pulled from the database.
Link for further reading: click
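Applied to the question's case, a minimal sketch might combine the chunked read with one batch insert per chunk (assuming the `Fakecustomer` and `Customer` Eloquent models from the question, and that the source columns carry the same names as the target ones; the column-processing step is left as a placeholder):

```php
<?php

// Sketch: read Fakecustomer rows in chunks of 500 and batch-insert
// the transformed rows into Customer, so memory stays bounded.
Fakecustomer::chunk(500, function ($records) {
    $rows = [];

    foreach ($records as $record) {
        // Modify the columns for each record here as needed.
        $rows[] = [
            'last_name'  => $record->last_name,
            'first_name' => $record->first_name,
            'home_phone' => $record->home_phone,
        ];
    }

    // One multi-row INSERT per chunk instead of one giant insert
    // (or 60,000 single-row inserts).
    Customer::insert($rows);
});
```

The chunk size is a tuning knob: large enough that the number of queries stays small, small enough that a single multi-row INSERT does not hit the database driver's placeholder or packet limits.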
Regarding "php - Bulk insertion in Laravel for many records", a similar question was found on Stack Overflow: https://stackoverflow.com/questions/24565729/