I have a CSV file with roughly 1.8 million rows that I need to insert into a MySQL table from my PHP script. I am inserting in batches of 10,000 values. The script runs for a long time and crashes after inserting 80-95 batches. I also tried mysql_unbuffered_query(), but it did not help.
if ($fp) {
    $batch = 1;
    $row_count = 1;
    $bucket_counter = 1;
    $mobile_numbers = array();
    $row_count_for_DB_write = 0;
    foreach ($campaign_numbers as $value) {
        $number = array($value);
        fputcsv($fp, $number);
        $row_count_for_DB_write++;
        $value_row = new stdClass();
        $value_row->number = $value;
        $value_row->bucket_number = $bucket_counter;
        $mobile_numbers[] = $value_row;
        if ($row_count == $bucket_size && $bucket_counter < $bucket_count) {
            $bucket_counter++;
            $row_count = 0; // reset to 0: the end-of-loop increment counts the next row
            fclose($fp);
            $fp = fopen($directory . "/cn_$bucket_counter.csv", 'w');
            // double quotes so $directory and $bucket_counter are interpolated
            $logger->debug("Created csv file : $directory/cn_$bucket_counter.csv");
        }
        if ($row_count_for_DB_write == CONSTANTS::BATCH_SIZE) {
            $logger->debug($batch . " Batch insert starting at: " . date('d-m-Y_H-i-s', time()));
            $insert_count = $data_service->add_to_mobile_numbers_table($mobile_numbers_table, $mobile_numbers);
            $batch++;
            $logger->debug("Batch insert ending at: " . date('d-m-Y_H-i-s', time()));
            $row_count_for_DB_write = 0; // reset to 0 to match the initial value, otherwise later batches are one row short
            unset($mobile_numbers);
            $mobile_numbers = array();
        }
        $row_count++;
    }
}
fclose($fp);
$data_service->add_to_mobile_numbers_table($mobile_numbers_table, $mobile_numbers);
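For reference, a minimal sketch of what `add_to_mobile_numbers_table` might look like as a single multi-row prepared INSERT (the PDO connection, table name, and column names are assumptions based on the code above, not the asker's actual data service):

```php
<?php
// Hypothetical implementation: one multi-row prepared INSERT per batch,
// which is far faster than issuing one INSERT statement per row.
function add_to_mobile_numbers_table(PDO $pdo, $table, array $mobile_numbers) {
    if (empty($mobile_numbers)) {
        return 0;
    }
    // Build one "(?, ?)" placeholder pair per row.
    $placeholders = implode(', ', array_fill(0, count($mobile_numbers), '(?, ?)'));
    $sql = "INSERT INTO `$table` (number, bucket_number) VALUES $placeholders";

    // Flatten the row objects into a single bound-parameter list.
    $params = array();
    foreach ($mobile_numbers as $row) {
        $params[] = $row->number;
        $params[] = $row->bucket_number;
    }

    $stmt = $pdo->prepare($sql);
    $stmt->execute($params);
    return $stmt->rowCount();
}
```

With 10,000 rows per batch this produces 20,000 bound parameters in one statement, so `max_allowed_packet` on the MySQL server must be large enough to hold the full statement.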
$zip_file = "/$directory_name.zip";
$logger->debug('Creating zipped file');
Util::create_zip(Util::get_list_of_files($directory), $directory . $zip_file);
Best Answer
The following steps solved the problem:

1. Change the table engine from InnoDB to MyISAM
2. Disable keys
3. Insert the data
4. Re-enable keys
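The steps above can be sketched in SQL (the table and column names here are assumptions for illustration; note that DISABLE KEYS only suspends non-unique indexes, and is a MyISAM feature):

```sql
-- 1. Switch the table engine from InnoDB to MyISAM
ALTER TABLE mobile_numbers ENGINE = MyISAM;

-- 2. Suspend non-unique index maintenance during the bulk load
ALTER TABLE mobile_numbers DISABLE KEYS;

-- 3. Insert the data in multi-row batches
INSERT INTO mobile_numbers (number, bucket_number)
VALUES ('5550001', 1), ('5550002', 1), ('5550003', 1);

-- 4. Rebuild the indexes in one pass
ALTER TABLE mobile_numbers ENABLE KEYS;
```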
Regarding "php - How to insert a large amount of data into a MySQL table without using LOAD DATA INFILE?", we found a similar question on Stack Overflow: https://stackoverflow.com/questions/9272445/