Import CSV file to Laravel controller and insert data into two tables

Published 2019-02-03 10:50

Question:

So I am a complete noob to Laravel and am trying something here. I want to import a CSV file into two tables. I have a table called lists that will store the list name and a client_id.

Then I have a table called customers that will store the name, surname, and contact number, as well as a client_id and a list_id.

What I want to achieve is to import a CSV file, store its file name in the lists table, then build an array from the CSV file and insert that data into the customers table along with the list and client ids.

I have the first part done, and it inserts into the lists table correctly. How do I now create an array from the CSV that is located in storage/documents and then insert that into the customers table?

namespace App\Http\Controllers;

use Input;
use DB;
use Illuminate\Http\Request;
use App\Http\Requests\ListsRequest;
use App\Lists;
use App\Clients;
use App\Http\Requests;
use App\Http\Controllers\Controller;

class ListsController extends Controller {

    public function index()
    {
        // $list_items = Lists::all();
        $clients = Clients::all();

        return view('lists.show', compact('clients'));
    }

    public function store(Requests\ListsRequest $request)
    {
        $input = $request->input();
        Lists::create($input);

        if (Input::hasFile('name'))
        {

            $file = Input::file('name');
            $name = time() . '-' . $file->getClientOriginalName();

            $path = storage_path('documents');

            $file->move($path, $name);

            // All works up to here
            // All I need now is to create an array
            // from the CSV and insert into the customers database
        }
    }
}

I chose to go with the answer I accepted, but I also played with the other answer and got it to work like this:

// Requires use League\Csv\Reader; and use App\Customers; at the top of the controller
public function store(Requests\ListsRequest $request)
{
    $client_id = $request->input('client_id');

    if (Input::hasFile('name'))
    {
        $file = Input::file('name');
        $name = time() . '-' . $file->getClientOriginalName();
        $path = storage_path('documents');

        Lists::create(['client_id' => $client_id, 'name' => $name]);

        $reader = Reader::createFromPath($file->getRealPath());

        // Create a customer from each row in the CSV file,
        // using the first row as the column headers
        $headers = array();

        foreach ($reader as $index => $row)
        {
            if ($index === 0)
            {
                $headers = $row;
            } else
            {
                $data = array_combine($headers, $row);
                Customers::create($data);
            }
        }

        $file->move($path, $name);

        return view('clients');
    }
}
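Since the customers table also needs the client_id and list_id mentioned in the question, those can be merged into each row before the insert. A minimal sketch (assuming the Lists::create call returns the new list model and the customers table has matching columns):

$list = Lists::create(['client_id' => $client_id, 'name' => $name]);

foreach ($reader as $index => $row)
{
    if ($index === 0)
    {
        $headers = $row;
    } else
    {
        // Merge the foreign keys into the CSV data before creating the customer
        $data = array_combine($headers, $row);
        $data['client_id'] = $client_id;
        $data['list_id'] = $list->id;

        Customers::create($data);
    }
}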

Answer 1:

There are 3 steps to read a CSV file and import it into the database in Laravel:

  1. Read the CSV file
  2. Convert it to an array
  3. Finally, create the records in our database.

Before we start, I have created a sample test.csv file and put it in my public folder, under a file folder:

name,email,password
user1,email1@email.com,pasxxxxxxxxxword
user2,email2@email.com,pasxxxxxxxxxword
user3,email3@email.com,pasxxxxxxxxxword

Steps 1 and 2: I created a helper function called csvToArray; for now I just put it in my controller (this function is inspired by this link). It simply reads the CSV file and converts it to an array:

function csvToArray($filename = '', $delimiter = ',')
{
    if (!file_exists($filename) || !is_readable($filename))
        return false;

    $header = null;
    $data = array();
    if (($handle = fopen($filename, 'r')) !== false)
    {
        // 1000 = max line length passed to fgetcsv; use 0 for no limit
        while (($row = fgetcsv($handle, 1000, $delimiter)) !== false)
        {
            // Use the first row as the header, then key every following row by it
            if (!$header)
                $header = $row;
            else
                $data[] = array_combine($header, $row);
        }
        fclose($handle);
    }

    return $data;
}

Step 3: And here is my final step, reading the array and inserting it into our database:

public function importCsv()
{
    $file = public_path('file/test.csv');

    $customerArr = $this->csvToArray($file);

    for ($i = 0; $i < count($customerArr); $i++)
    {
        User::firstOrCreate($customerArr[$i]);
    }

    return 'Job done or whatever';
}
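To trigger this, you would point a route at the method; a hypothetical example for a Laravel 5.x-style routes file (the URI and controller name are assumptions):

// routes/web.php
Route::get('import-csv', 'ListsController@importCsv');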

Note: this solution assumes that you have a model in your Laravel project and the proper table in your database.

If you run dd($customerArr) on the sample file above, you will get an array of associative arrays, one per CSV row, keyed by the header names.
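For the sample test.csv above, $customerArr would be equivalent to this PHP array:

$customerArr = [
    ['name' => 'user1', 'email' => 'email1@email.com', 'password' => 'pasxxxxxxxxxword'],
    ['name' => 'user2', 'email' => 'email2@email.com', 'password' => 'pasxxxxxxxxxword'],
    ['name' => 'user3', 'email' => 'email3@email.com', 'password' => 'pasxxxxxxxxxword'],
];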



Answer 2:

In your store() method, create the record in your lists table, and then iterate over the contents of the CSV file and insert the data into the customers table. You should create a relation between customers and lists for this purpose. You would also be better off using something like the PHP League’s CSV package for reading such files:

// Requires use League\Csv\Reader; at the top of the controller (league/csv package)
public function store(AddCustomersRequest $request)
{
    // Get uploaded CSV file
    $file = $request->file('csv');

    // Create list name
    $name = time().'-'.$file->getClientOriginalName();

    // Create a list record in the database
    // (note: "List" is a reserved word in PHP, so the Lists model from the question is used here)
    $list = Lists::create(['name' => $name]);

    // Create a CSV reader instance
    $reader = Reader::createFromFileObject($file->openFile());

    // With league/csv 9.x, use the first row as the header so each
    // record is keyed by column name and can be mass-assigned
    $reader->setHeaderOffset(0);

    // Create a customer from each row in the CSV file
    foreach ($reader as $index => $row) {
        $list->customers()->create($row);
    }

    // Redirect back to where you need with a success message
}
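The $list->customers()->create($row) call relies on an Eloquent relationship between the two models. A minimal sketch of what the models could look like, using the Lists and Customers names from the question (the exact fillable columns are assumptions based on the question's description):

// app/Lists.php
namespace App;

use Illuminate\Database\Eloquent\Model;

class Lists extends Model
{
    protected $fillable = ['client_id', 'name'];

    public function customers()
    {
        // Explicit foreign key, since the customers table uses list_id
        return $this->hasMany(Customers::class, 'list_id');
    }
}

// app/Customers.php
namespace App;

use Illuminate\Database\Eloquent\Model;

class Customers extends Model
{
    protected $fillable = ['name', 'surname', 'contact_number', 'client_id', 'list_id'];

    public function list()
    {
        return $this->belongsTo(Lists::class, 'list_id');
    }
}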


Answer 3:

@maytham The solution by Maytham will work well. However, it runs into a big problem with large data sets. Even 1000 rows become an issue, because it executes 1000 separate insert statements. I will just edit his third step and add my own input:

public function importCsv()
{
    $file = public_path('file/test.csv');

    $customerArr = $this->csvToArray($file);

    $data = [];
    for ($i = 0; $i < count($customerArr); $i++)
    {
        // Build one big array of rows instead of inserting them one by one
        $data[] = [
            'column_name1' => 'value',
            'column_name2' => 'value2',
            // ...and so on for the remaining columns
        ];
        //User::firstOrCreate($customerArr[$i]);
    }

    // A single query inserts all rows at once
    DB::table('table_name')->insert($data);

    return 'Job done or whatever';
}

This will call the database once to insert as many rows as you wish, be it 1,000, 100,000 or whatever. However, if you have a huge CSV, this will be a problem too, since you will need to insert in chunks. In PostgreSQL, for instance, I have noticed that you can insert up to around 65,000 rows in one statement. Maybe I am wrong about the exact number, but every database has a limit and you need to look it up.
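If the data set is big enough to hit such a limit, one option is to split the rows and run one insert per chunk. A minimal sketch using PHP's array_chunk (the chunk size of 1000 is an arbitrary assumption):

// Insert in batches of 1000 rows to stay below any per-statement limit
foreach (array_chunk($data, 1000) as $chunk)
{
    DB::table('table_name')->insert($chunk);
}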