php - Why does CodeIgniter exhaust the allowed memory size?

Tags: php sql-server codeigniter windows-8.1 iis-8

I'm getting a memory-exhausted error, and I shouldn't be holding onto any memory!

The application runs on Windows 8 Server/IIS 8/PHP 5.5/CodeIgniter/MS SQL Server.

The errors look like this:

[23-May-2014 10:56:57 America/New_York] PHP Fatal error: Allowed memory size of 134217728 bytes exhausted (tried to allocate 1992 bytes) in C:\inetpub\wwwroot\application\models\DW_import.php on line 112

[23-May-2014 11:07:34 America/New_York] PHP Fatal error: Allowed memory size of 134217728 bytes exhausted (tried to allocate 2438 bytes) in C:\inetpub\wwwroot\application\models\DW_import.php on line 113
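
(For context: 134217728 bytes is exactly 128 MB, PHP's common default memory_limit. A minimal, hedged check of the limit actually in effect; raising it only postpones the error if something is genuinely accumulating:)

// Report the memory_limit currently in effect for this script.
echo ini_get('memory_limit'), "\n";      // e.g. "128M"
// Raising it is possible, but it only delays the crash if memory leaks:
// ini_set('memory_limit', '256M');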

The script looks in a directory for a number of different CSV files to import into the database. Keep in mind that the import files are large, some up to 4 gigs of data. As far as I can tell, no variable keeps aggregating data in a way that could cause this. The script being run, a model (this controller has no view, just the model), is below:

DW_import.php

<?php

class dw_import extends CI_Model {

    public function import(){

        global $file,$errLogFile,$logFile,$tableName, $fieldList, $file, $count, $line, $query;

        $this->load->database(); // init db connection

        // map file types to database tables
        $fileToDBArr = array(
            'Customers' => 'customer',
            'Customers_Historical' => 'customer_historical',
            'Orders' => 'order',
            'Customer_AR_Aggs' => 'customer_ar_aging_agg'
        );

        // extend timeout of this script
        ini_set('max_execution_time', 3600);

        // error handler to log errors and continue processing
        function myErrorHandler($errno,$errstr,$errfile,$errline){

            global $file,$errLogFile,$logFile,$tableName, $fieldList, $file, $count, $line, $query;
            // error - store in DB
            //echo "<br>[$errno $errstr $errfile $errline $tableName $file $count] $errLogFile<br>";
            $err = "#$errno $errstr $errfile on line $errline :: Table $tableName File $file Row# $count Headers: $fieldList Data: $line";
            echo $err;
            file_put_contents($errLogFile,$err,FILE_APPEND);
        };

        set_error_handler("myErrorHandler");

        // set temp error log file
        $errLogFile = "C:/Data_Updates/logs/general." . date('YmdHis') . ".errLog";

        // loop thru file types
        foreach($fileToDBArr as $fileType=>$table){
            // get the files for this import type
            $fileArr = glob('C:/Data_Updates/'.$fileType.'.*');
            sort($fileArr,SORT_STRING); // sort so earlier files (by date in file name) will process first

            // loop thru files found
            foreach($fileArr as $file){
                // set log file paths specific to this import file
                $errLogFile = str_replace('Data_Updates/','Data_Updates/logs/',$file) . "." . date('YmdHis') . ".errLog";
                $logFile = str_replace('Data_Updates/','Data_Updates/logs/',$file) . "." . date('YmdHis') . ".log";

                file_put_contents($logFile,"---BEGIN---",FILE_APPEND); // log

                // lets get the file type and translate it into a table name
                preg_match('/C:\/Data_Updates\/([^\.]+)/',$file,$matches);
                $fileType = $matches[1];
                $tableName = $fileToDBArr[$fileType];


                // lets get the first row as a field list
                $fp = fopen($file,'r');
                //$fieldList = str_replace('"','',fgets($fp));

                // counters to track status
                $count = 0;
                $startPoint = 0;

                // see if continuation, set startPoint to last row imported from file
                $query = "SELECT max(import_line) as maxline FROM $tableName WHERE import_file = '" . addslashes($file) . "'";
                $result = $this->db->query($query);

                foreach($result->result() as $row) $startPoint = $row->maxline+1; // set the startPoint if this is continuation

                file_put_contents($logFile,"\nstartPoint $startPoint",FILE_APPEND); // log      

                // loop thru file lines
                while (!feof($fp)) {
                    $line = fgets($fp);
                    // reformat those pesky dates from m/d/y to y-m-d
                    $line = preg_replace('/, ?(\d{1,2})\/(\d{1,2})\/(\d{4})/',',${3}-${1}-${2}',$line);

                    if(!$count){
                        // header row - set aside to use for column headers on insert statements
                        $fieldList = str_replace('"','',$line);
                        file_put_contents($logFile,"\nHeaders: $fieldList",FILE_APPEND); // log
                    } elseif($count >= $startPoint && trim($line)) {

                        // data row - insert into DB
                        $lineArr = str_getcsv($line); // turn this CSV line into an array
                        // build the insert query
                        $query = "INSERT INTO $tableName ($fieldList,import_date,import_file,import_line)
                        VALUES (";
                        foreach($lineArr as $k=>$v) $query .= ($v !== '') ? "'".addslashes(utf8_encode($v))."'," : " NULL,";
                        $query .= "now(),'" . addslashes($file). "',$count)
                        ON DUPLICATE KEY UPDATE ";
                        foreach(explode(',',$fieldList) as $k=>$v) $query .= "\n$v=" . (($lineArr[$k] !== '') ? "\"" . addslashes(utf8_encode($lineArr[$k])) . "\"" : "NULL") . ", ";
                        $query .= "import_date = now(),import_file='" . addslashes($file) . "',import_line = $count ";


                        if(!$this->db->query($query)) {
                            trigger_error('db error ' . $this->db->_error_number() . ' ' . $this->db->_error_message());
                            $status = 'error ';
                        } else {
                            $status = 'success ';   
                        };

                        file_put_contents($logFile,"row: $count status: $status data: $line",FILE_APPEND); // log'

                    } else {
                        // skipped - this row was already imported from this file
                        // removed log to speed up
                        file_put_contents($logFile,"row: $count status: SKIPPED data: $line",FILE_APPEND); // log
                    }; // if $count
                    $count++;
                }; // while $fp
                fclose($fp);

                // file complete - move file to archive
                rename($file,str_replace('Data_Updates/','Data_Updates/archive/',$file));
                file_put_contents($logFile,"-- END --",FILE_APPEND); // log
            }; // each $fileArr

        }; // each $globArr

    } // end import function
} // end class 

?>

Any help would be greatly appreciated!

******** EDIT


Based on several people's suggestions, I've added some changes. They only affect the "data row - insert into DB" portion of the loop logic. You can see the logging added to track memory_get_peak_usage(), plus the added unset() and clearstatcache() calls. Below the code is some of the log data:

                        file_put_contents($logFile,memory_get_peak_usage() . " line 1 \n\r",FILE_APPEND); 
                        // data row - insert into DB
                        if(isset($lineArr)) unset($lineArr); 
                        file_put_contents($logFile,memory_get_peak_usage() . " line 1.1 \n\r",FILE_APPEND);
                        $lineArr = str_getcsv($line); // turn this CSV line into an array
                        // build the insert query
                        file_put_contents($logFile,memory_get_peak_usage() . " line 2 lineArr size: " . strlen(implode(',',$lineArr)) . "\n\r",FILE_APPEND);
                        if(isset($query)) unset($query);  
                        file_put_contents($logFile,memory_get_peak_usage() . " line 2.1 lineArr size: " . strlen(implode(',',$lineArr)) . "\n\r",FILE_APPEND);
                        $query = "INSERT INTO $tableName ($fieldList,import_date,import_file,import_line)
                        VALUES (";
                        file_put_contents($logFile,memory_get_peak_usage() . " line 2.2 lineArr size: " . strlen(implode(',',$lineArr)) . "\n\r",FILE_APPEND);
                        foreach($lineArr as $k=>$v) $query .= ($v !== '') ? "'".addslashes(utf8_encode($v))."'," : " NULL,";
                        $query .= "now(),'" . addslashes($file). "',$count)
                        ON DUPLICATE KEY UPDATE ";
                        file_put_contents($logFile,memory_get_peak_usage() . " line 2.3 lineArr size: " . strlen(implode(',',$lineArr)) . "\n\r",FILE_APPEND);

                        foreach(explode(',',$fieldList) as $k=>$v) $query .= "\n$v=" . (($lineArr[$k] !== '') ? "\"" . addslashes(utf8_encode($lineArr[$k])) . "\"" : "NULL") . ", ";
                        file_put_contents($logFile,memory_get_peak_usage() . " line 2.4 lineArr size: " . strlen(implode(',',$lineArr)) . "\n\r",FILE_APPEND);
                        $query .= "import_date = now(),import_file='" . addslashes($file) . "',import_line = $count ";
                        file_put_contents($logFile,memory_get_peak_usage() . " line 3 query size: " . strlen($query) . "\n\r",FILE_APPEND);

                        if(!$this->db->query($query)) {
                            trigger_error('db error ' . $this->db->_error_number() . ' ' . $this->db->_error_message());
                            $status = 'error ';
                        } else {
                            $status = 'success ';   
                        };

                        clearstatcache();

Log data (the leftmost number is the result of the memory_get_peak_usage() call):

2724960 line 1.1 
2724960 line 2 lineArr size: 194
2724960 line 2.1 lineArr size: 194
2724960 line 2.2 lineArr size: 194
2724960 line 2.3 lineArr size: 194
2727392 line 2.4 lineArr size: 194
2727392 line 3 query size: 2346

2727392 line 1 
2727392 line 1.1 
2727392 line 2 lineArr size: 194
2727392 line 2.1 lineArr size: 194
2727392 line 2.2 lineArr size: 194
2727392 line 2.3 lineArr size: 194
2729944 line 2.4 lineArr size: 194
2729944 line 3 query size: 2346

2729944 line 1 
2729944 line 1.1 
2729944 line 2 lineArr size: 194
2729944 line 2.1 lineArr size: 194
2729944 line 2.2 lineArr size: 194
2729944 line 2.3 lineArr size: 194
2732448 line 2.4 lineArr size: 194
2732448 line 3 query size: 2346

2732448 line 1.1 
2732448 line 2 lineArr size: 194
2732448 line 2.1 lineArr size: 194
2732448 line 2.2 lineArr size: 194
2732448 line 2.3 lineArr size: 194
2735088 line 2.4 lineArr size: 194
2735088 line 3 query size: 2346

Notice that memory is still growing between lines 2.3 and 2.4, which is this line of code:

foreach(explode(',',$fieldList) as $k=>$v) $query .= "\n$v=" . (($lineArr[$k] !== '') ? "\"" . addslashes(utf8_encode($lineArr[$k])) . "\"" : "NULL") . ", ";
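
(A side note, not the eventual fix per the accepted answer below: each .= reallocates the string, and addslashes(utf8_encode(...)) creates a short-lived copy per field, which is enough to nudge the allocator's high-water mark. A hedged rework of that clause, assuming the same $fieldList/$lineArr variables, that collects the pieces and joins once:)

// Sketch: build the ON DUPLICATE KEY UPDATE clause with one join
// instead of repeated .= reallocations; output is identical.
$set = array();
foreach (explode(',', $fieldList) as $k => $v) {
    $set[] = "\n$v=" . (($lineArr[$k] !== '') ? '"' . addslashes(utf8_encode($lineArr[$k])) . '"' : 'NULL');
}
$query .= implode(', ', $set) . ', ';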

Any ideas?

Best Answer

Found the answer:

$this->load->database(); // init db connection, already in code
$this->db->save_queries = false; // ADD THIS LINE TO SOLVE ISSUE

This is a lovely undocumented setting in CodeIgniter. CI apparently saves queries by default, keeping a fair amount of data even for insert/update queries. Since this import runs a very large number of inserts, that memory growth became significant. Telling CI not to save queries solved the problem.
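
(For anyone on a newer release: CodeIgniter 3 later documented this as a per-connection option, so the same fix can live in the database config instead of the model. A sketch assuming the default connection group:)

// application/config/database.php (CodeIgniter 3)
$db['default']['save_queries'] = FALSE; // don't keep every executed query string in memory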

What threw me off was that memory_get_peak_usage() reported the memory usage increasing before the insert query was run, rather than while it ran (a PHP bug?).
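
(That part is actually expected behavior, not a bug: memory_get_peak_usage() returns the high-water mark since the script started, so it is monotonically non-decreasing and moves at whatever allocation pushed the peak higher — here the string concatenation building $query, not the query call itself. A minimal illustration:)

// memory_get_usage() = current allocation; memory_get_peak_usage() = high-water mark.
$a = str_repeat('x', 5 * 1024 * 1024);   // peak jumps here
unset($a);                               // current drops, peak stays put
printf("current: %d  peak: %d\n", memory_get_usage(), memory_get_peak_usage());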

As a final sanity check, I removed all of the other suggested optimizations (unset(), clearstatcache(), etc.) and verified that they had no positive effect on the memory issue.

The original question, "php - Why does CodeIgniter exhaust the allowed memory size?", can be found on Stack Overflow: https://stackoverflow.com/questions/23834238/
