I get a memory exhaustion error when I run my DB seed script in production.
Below is my seed script.
class MembershipTableSeeder extends Seeder
{
    public function run()
    {
        // Clear the table before seeding.
        DB::table('members')->delete();

        foreach (range(1, 99) as $days) {
            Members::create(array('membership_code' => 'test'.$days));
        }

        // Load the (very large) SQL dump in one go.
        DB::unprepared(file_get_contents(app_path()."/database/seeds/members.sql"));
    }
}
So what I did was remove the memory limit in my seed script:
ini_set('memory_limit', '-1');
The problem now is that when I run the script, it echoes the entire content of the SQL script (which is very, very big) to the terminal.
Is there a good way of running a SQL dump inside my DB seeds that doesn't consume much memory? For now I run it manually:
mysql -uuser -p db < script.sql
The problem happens because DB::unprepared also logs the query to the laravel.log file, so much more is happening in the background than you might think; that is where the memory exhaustion comes from. Unless you are running in safe mode, I would stick to executing the console command, like this:
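A minimal sketch of what that could look like from inside the seeder; the Config lookup and the "mysql" connection name are assumptions based on a default Laravel 4 setup:

class MembershipTableSeeder extends Seeder
{
    public function run()
    {
        DB::table('members')->delete();

        foreach (range(1, 99) as $days) {
            Members::create(array('membership_code' => 'test'.$days));
        }

        // Pull the credentials Laravel already knows about
        // (config path assumes a default Laravel 4 MySQL connection).
        $db = Config::get('database.connections.mysql');

        // Stream the dump through the mysql client so PHP never holds
        // the whole file in memory (assumes a non-empty password).
        exec(sprintf(
            'mysql -h%s -u%s -p%s %s < %s',
            escapeshellarg($db['host']),
            escapeshellarg($db['username']),
            escapeshellarg($db['password']),
            escapeshellarg($db['database']),
            escapeshellarg(app_path().'/database/seeds/members.sql')
        ));
    }
}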
For others who prefer a more Laravel-ish solution, this is how I handled it:
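A minimal sketch of that idea, assuming the memory pressure comes from the query log Laravel keeps for every executed statement; disableQueryLog() is available on the connection in Laravel 4:

class MembershipTableSeeder extends Seeder
{
    public function run()
    {
        // Stop Laravel from accumulating every executed query in memory;
        // on a very large dump this log is what exhausts memory.
        DB::connection()->disableQueryLog();

        DB::table('members')->delete();

        foreach (range(1, 99) as $days) {
            Members::create(array('membership_code' => 'test'.$days));
        }

        DB::unprepared(file_get_contents(app_path()."/database/seeds/members.sql"));
    }
}

This keeps everything inside the framework and avoids shelling out, though file_get_contents() still reads the whole dump into PHP, so it trades a little memory headroom for convenience.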