Laravel – seeding large SQL file

I get a memory-exhaustion error when I run my DB seed script in production.

Below is my seed script.

class MembershipTableSeeder extends Seeder 
{
    public function run()
    {
        DB::table('members')->delete();

        foreach (range(1, 99) as $days){
            Members::create(array('membership_code' => 'test'.$days));
        }

        DB::unprepared(file_get_contents(app_path()."/database/seeds/members.sql"));
    }
}
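One memory-friendly alternative (a sketch, not the asker's original code): Laravel 4, which this seeder appears to target, keeps an in-memory log of every executed query, and it can be turned off with `DB::disableQueryLog()`. The dump can also be streamed statement by statement instead of loaded whole with `file_get_contents()`. The naive split on a trailing `;` assumes one statement per terminating semicolon and no semicolons inside string literals, which holds for typical `mysqldump` output but is not guaranteed for every dump.

```php
class MembershipTableSeeder extends Seeder
{
    public function run()
    {
        // Laravel 4 retains every query (including the huge unprepared
        // SQL string) in an in-memory log; disable it for seeding.
        DB::disableQueryLog();

        DB::table('members')->delete();

        foreach (range(1, 99) as $days) {
            Members::create(array('membership_code' => 'test'.$days));
        }

        // Stream the dump one statement at a time instead of reading
        // the whole file into memory at once.
        $handle = fopen(app_path().'/database/seeds/members.sql', 'r');
        $statement = '';
        while (($line = fgets($handle)) !== false) {
            $statement .= $line;
            // Assumes each statement ends with ';' at the end of a line.
            if (substr(rtrim($line), -1) === ';') {
                DB::unprepared($statement);
                $statement = '';
            }
        }
        fclose($handle);
    }
}
```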

So I removed the memory limit in my seed script:

ini_set('memory_limit', '-1');

The problem now is that when I run the script, it logs the entire content of the SQL file (which is very, very big) to the terminal.

Is there a good way to run an SQL dump inside my DB seeds that doesn't consume much memory? For now I run it manually:

mysql -uuser -p db < script.sql
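If the dump is run through the `mysql` client like this, the password can be kept off the command line (and out of the process list) with a client option file. A sketch, where the path, user name, password, and database are placeholders:

```shell
# Write credentials to an option file readable only by the current user.
cat > /tmp/seed.cnf <<'EOF'
[client]
user=user
password=secret
database=db
EOF
chmod 600 /tmp/seed.cnf

# --defaults-extra-file must be the first option on the command line.
# Shown as an echo here; in practice run it directly:
echo "mysql --defaults-extra-file=/tmp/seed.cnf < script.sql"
```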


Answer

The problem happens because DB::unprepared also logs the query to the laravel.log file, so much more is going on in the background than you might think; that is where the memory exhaustion comes from. If you are not running PHP in safe mode, I would stick to executing the console command, like this:

$db = Config::get('database.connections.mysql');
exec('mysql -u'.escapeshellarg($db['username']).' -p'.escapeshellarg($db['password']).' '.escapeshellarg($db['database']).' < script.sql');
User contributions licensed under: CC BY-SA