Optimizing Data Insertion from File to Database in Laravel

Adrian Generous
4 min read · Feb 8, 2024

Introduction

In a developers' group dedicated to the Laravel framework, a user shared a challenge they ran into while working with a large data file. Their PHP script, designed to import data from a JSON file into a database, performed well until it had to process 700,000 records. At that scale the solution began to buckle under the load: memory consumption grew to unacceptable levels, leading to slowdowns and even application crashes. That failure prompted a deeper analysis and a search for optimizations that would allow more efficient resource management when working with massive data sets.

User’s Code

Before we proceed to analyze the causes of the problem and propose optimization strategies, let’s look at the original code the user shared on the forum:

$path = Storage::path("data");
$file = fopen($path, "r");

while ($line = fgets($file)) {
    $line = json_decode(trim(trim($line), ","));
    Item::create([
        'a' => $line->b,
        'c' => $line->d,
        'e' => $line->f,
    ]);
}
fclose($file);

This simple script opens a file, reads it line by line, decodes the JSON, and creates new records in the database for each line. Although it seems…
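
One common way to cut that per-row overhead is to accumulate the decoded rows and flush them to the database in bulk, instead of issuing a separate Item::create() call, and therefore a separate INSERT, for every line. The sketch below only illustrates that batching idea and is not necessarily the optimization this article goes on to recommend; it reuses the column names from the user's snippet and assumes an Item Eloquent model under Laravel's default App\Models namespace.

// Illustrative sketch: batch the decoded rows and insert them in chunks.
// Assumes the default model namespace; adjust the imports to your project.
use Illuminate\Support\Facades\Storage;
use App\Models\Item;

$path = Storage::path("data");
$file = fopen($path, "r");

$batch = [];
$batchSize = 1000; // trade-off between memory held per flush and query count

while (($line = fgets($file)) !== false) {
    $line = json_decode(trim(trim($line), ","));

    $batch[] = [
        'a' => $line->b,
        'c' => $line->d,
        'e' => $line->f,
    ];

    // Flush the accumulated rows with a single multi-row INSERT.
    if (count($batch) >= $batchSize) {
        Item::insert($batch);
        $batch = [];
    }
}

// Insert whatever is left over after the last full batch.
if (!empty($batch)) {
    Item::insert($batch);
}

fclose($file);

Note that Item::insert() goes straight to the query builder, so it skips model events and automatic created_at/updated_at timestamps; if the table needs them, they have to be added to each row manually. The batch size is a tuning knob: larger batches mean fewer round trips to the database, smaller ones keep memory usage flatter.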

