Thursday, February 6, 2020

Importing Large CSV file of multiple columns and rows into mysql table using php


I am currently working on a project in which I need to import data from a large CSV file into two MySQL tables. At first I inserted the data into the tables row by row, one table after the other. That worked locally, but when I uploaded the script to the server and tried to import a large CSV file, it ran for a long time and the website displayed a timeout error. So importing a large amount of data from a CSV file with a plain PHP insert loop did not succeed.

I then searched the internet for a way to import a large CSV file into multiple MySQL tables and found MySQL's LOAD DATA INFILE command, which reads data from a CSV file and inserts it into a MySQL table from a PHP script. However, a single LOAD DATA statement can only load into one table, and I could not find any tutorial on importing a large CSV file with 200,000 rows of data into multiple MySQL tables using a PHP script. So I wrote my own logic to solve the problem of importing data from a large CSV file into multiple tables with a PHP script.

In this tutorial we solve the problem of importing data from a large CSV file into multiple MySQL tables using a PHP script. We use the MySQL LOAD DATA command, which imports a large CSV file into a MySQL table very quickly. In business applications that work with large amounts of data, CSV files are commonly used to transfer data from one application to another, because a CSV file is lighter than most other file formats. MySQL therefore provides a command that reads data from a CSV file and imports it into the database with a single query. Below is the solution for importing data from a large CSV file into multiple MySQL tables with a PHP script.


MySQL Database


Run the following SQL script to create the customer table and the order table in your local MySQL database. The two tables share a common column, customer_id.


--
-- Database: `testing`
--

-- --------------------------------------------------------

--
-- Table structure for table `customer_table`
--

CREATE TABLE `customer_table` (
  `customer_id` int(11) NOT NULL,
  `customer_first_name` varchar(200) NOT NULL,
  `customer_last_name` varchar(200) NOT NULL,
  `customer_email` varchar(300) NOT NULL,
  `customer_gender` varchar(30) NOT NULL
) ENGINE=InnoDB DEFAULT CHARSET=latin1;

-- --------------------------------------------------------

--
-- Table structure for table `order_table`
--

CREATE TABLE `order_table` (
  `order_id` int(11) NOT NULL,
  `customer_id` int(11) NOT NULL,
  `product_name` varchar(200) NOT NULL,
  `product_price` double(10,2) NOT NULL,
  `order_date` varchar(30) NOT NULL
) ENGINE=InnoDB DEFAULT CHARSET=latin1;

--
-- Indexes for dumped tables
--

--
-- Indexes for table `customer_table`
--
ALTER TABLE `customer_table`
  ADD PRIMARY KEY (`customer_id`);

--
-- Indexes for table `order_table`
--
ALTER TABLE `order_table`
  ADD PRIMARY KEY (`order_id`);

--
-- AUTO_INCREMENT for dumped tables
--

--
-- AUTO_INCREMENT for table `customer_table`
--
ALTER TABLE `customer_table`
  MODIFY `customer_id` int(11) NOT NULL AUTO_INCREMENT, AUTO_INCREMENT=1;

--
-- AUTO_INCREMENT for table `order_table`
--
ALTER TABLE `order_table`
  MODIFY `order_id` int(11) NOT NULL AUTO_INCREMENT, AUTO_INCREMENT=1;

index.php


This file contains the front-end (client-side) code of this tutorial. Here we create an HTML form for selecting a CSV file from the local computer, and an Ajax script that sends the selected file to the import.php server script using a FormData object. To restrict selection to CSV files, the file input carries the attribute accept=".csv".


<!DOCTYPE html>
<html>
 <head>
  <title>Importing Large CSV file of multiple columns and rows into mysql table using php</title>  
  <script src="https://ajax.googleapis.com/ajax/libs/jquery/3.1.0/jquery.min.js"></script>
  <link rel="stylesheet" href="https://maxcdn.bootstrapcdn.com/bootstrap/3.3.6/css/bootstrap.min.css" />
  <script src="https://maxcdn.bootstrapcdn.com/bootstrap/3.3.7/js/bootstrap.min.js"></script>
 </head>
 <body>
  
  <br />
  <br />
  <div class="container">
   <h1 align="center">How to Import Large CSV File in Multiple Mysql table</h1>
   <br />
   <div class="panel panel-default">
    <div class="panel-heading">
     <h3 class="panel-title">Import Large CSV File Data into Multiple Table</h3>
    </div>
      <div class="panel-body">
       <span id="message"></span>
       <form id="sample_form" method="POST" enctype="multipart/form-data" class="form-horizontal">
        <div class="form-group">
         <label class="col-md-4 control-label">Select CSV File</label>
         <input type="file" name="file" id="file" accept=".csv" />
        </div>
        <div class="form-group" align="center">
         <input type="hidden" name="hidden_field" value="1" />
         <input type="submit" name="import" id="import" class="btn btn-info" value="Import" />
        </div>
       </form>
      </div>
     </div>
  </div>
 </body>
</html>

<script>
 
 $(document).ready(function(){

  $('#sample_form').on('submit', function(event){
   $('#message').html('');
   event.preventDefault();
   $.ajax({
    url:"import.php",
    method:"POST",
    data: new FormData(this),
    dataType:"json",
    contentType:false,
    cache:false,
    processData:false,
    success:function(data)
    {
     $('#message').html('<div class="alert alert-success">'+data.success+'</div>');
     $('#sample_form')[0].reset();
    },
    error:function()
    {
     $('#message').html('<div class="alert alert-danger">Import failed, please check the file and try again.</div>');
    }
   })
  });

 });
</script>

import.php


This is the PHP script that receives the Ajax request with the CSV file. It reads the data from the CSV file and imports it into multiple MySQL tables using the MySQL LOAD DATA LOCAL INFILE command.

Below you will find the PHP script. It first makes the database connection, and when creating that connection we must set the PDO::MYSQL_ATTR_LOCAL_INFILE attribute to true. Without this attribute, the MySQL command cannot read the CSV file from the client. Setting it is required whenever you establish a database connection with PHP PDO and want to import a large CSV file with LOAD DATA LOCAL INFILE.

After that comes the query using the LOAD DATA LOCAL INFILE command. It reads the CSV data from the temporary location of the uploaded file. The data is first imported into the customer table, and then the order columns from the same CSV file are imported into the order table.
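The trickiest part of the script is linking the order rows back to the freshly inserted customers: after the first LOAD DATA run it reads MAX(customer_id) and steps back by the line count of the uploaded file (data rows plus the header line) to seed a counter variable. A sketch of that arithmetic, under the assumption that nothing else inserts into customer_table concurrently:

```php
<?php
// Given the highest customer_id after the first LOAD DATA run and the
// total line count of the file (header included), compute the value
// @customer_id must start at for the order-table pass.
function order_counter_start(int $max_customer_id, int $total_lines): int
{
    // $total_lines counts the header too, so this lands exactly one
    // below the first imported customer_id; the order-table query then
    // assigns @customer_id := @customer_id + 1 per row, so the first
    // order row receives the first customer's id.
    return $max_customer_id - $total_lines + 1;
}
```

For example, a 101-line file (header plus 100 rows) imported into an empty table gives MAX(customer_id) = 100, so the counter starts at 0 and the first order row receives customer_id 1.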


<?php

if(!empty($_FILES['file']['name']))
{
 $connect = new PDO("mysql:host=localhost;dbname=testing;", "root", "", array(
        PDO::MYSQL_ATTR_LOCAL_INFILE => true,
        PDO::ATTR_ERRMODE            => PDO::ERRMODE_EXCEPTION,
    ));

 // count every line of the uploaded file, including the CSV header line
 $total_row = count(file($_FILES['file']['tmp_name']));

 $file_location = str_replace("\\", "/", $_FILES['file']['tmp_name']);

 $query_1 = '
 LOAD DATA LOCAL INFILE "'.$file_location.'" IGNORE 
 INTO TABLE customer_table 
 FIELDS TERMINATED BY "," 
 LINES TERMINATED BY "\r\n" 
 IGNORE 1 LINES 
 (@column1,@column2,@column3,@column4) 
 SET customer_first_name = @column1, customer_last_name = @column2,  customer_email = @column3, customer_gender = @column4
 ';

 $statement = $connect->prepare($query_1);

 $statement->execute();

 $query_2 = "
 SELECT MAX(customer_id) as customer_id FROM customer_table
 ";

 $statement = $connect->prepare($query_2);

 $statement->execute();

 $result = $statement->fetchAll();

 $customer_id = 0;

 foreach($result as $row)
 {
  $customer_id = $row['customer_id'];
 }

 // $total_row counts the CSV header line too, so this evaluates to one
 // less than the first imported customer_id; @customer_id is then
 // pre-incremented for every order row, producing the correct ids
 $first_customer_id = $customer_id - $total_row + 1;

 $query_3 = 'SET @customer_id:='.$first_customer_id;

 $statement = $connect->prepare($query_3);

 $statement->execute();

 $query_4 = '
 LOAD DATA LOCAL INFILE "'.$file_location.'" IGNORE 
 INTO TABLE order_table 
 FIELDS TERMINATED BY "," 
 LINES TERMINATED BY "\r\n" 
 IGNORE 1 LINES 
 (@column1,@column2,@column3,@column4,@column5,@column6,@column7) 
 SET customer_id = @customer_id:=@customer_id+1, product_name = @column5,  product_price = @column6, order_date = @column7
 ';

 $statement = $connect->prepare($query_4);

 $statement->execute();

 $output = array(
  // subtract the header line from the reported count
  'success' => 'Total <b>'.($total_row - 1).'</b> rows of data imported'
 );

 echo json_encode($output);
}

?>
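One gotcha worth checking if the import silently loads nothing: besides the client-side PDO::MYSQL_ATTR_LOCAL_INFILE attribute, the MySQL server's local_infile system variable must also be enabled (recent MySQL versions ship with it disabled). A hedged sketch of such a check, reusing the tutorial's connection settings; the SET GLOBAL statement requires administrator privileges, and you can instead set local_infile=1 in the server's configuration file.

```php
<?php
// Config-dependent check: assumes the same local "testing" database and
// credentials used elsewhere in this tutorial.
$connect = new PDO("mysql:host=localhost;dbname=testing;", "root", "", array(
    PDO::MYSQL_ATTR_LOCAL_INFILE => true,
));

$row = $connect->query("SHOW GLOBAL VARIABLES LIKE 'local_infile'")
               ->fetch(PDO::FETCH_ASSOC);

if (strcasecmp($row['Value'], 'ON') !== 0) {
    // Needs a privileged account; otherwise enable it in my.cnf/my.ini
    $connect->exec("SET GLOBAL local_infile = 1");
}
```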


In this tutorial you have seen how to import a large amount of data from a CSV file into multiple MySQL tables. The complete source code is above. I hope this tutorial helped you understand how to solve the problem of importing data from a large CSV file into multiple MySQL tables.
