Include headers when using SELECT INTO OUTFILE?

Posted 2019-01-05 10:16

Is it possible to include the headers somehow when using the MySQL INTO OUTFILE?
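
For reference, a plain export (the table, columns, and path below are just placeholders) writes the data rows but no header line:

SELECT id, time, unit
INTO OUTFILE '/tmp/data.csv'
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
LINES TERMINATED BY '\n'
FROM sensor;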

17 Answers
做个烂人
#2 · 2019-01-05 10:20

I was writing my code in PHP, and I had a bit of trouble using the CONCAT and UNION functions, and also did not use SQL variables. Anyway, I got it to work; here is my code:

//first I connected to the information_schema DB

$headercon=mysqli_connect("localhost", "USERNAME", "PASSWORD", "information_schema");

//pulled the headers out into a string (I could not get the CONCAT function to work, so I wrote a loop for it)

    $headers = '';
    $sql = "SELECT column_name AS columns FROM `COLUMNS` WHERE table_schema = 'YOUR_DB_NAME' AND table_name = 'YOUR_TABLE_NAME'";
    $result = $headercon->query($sql);
    while($row = $result->fetch_row())
    {
        $headers = $headers . "'" . $row[0] . "', ";
    }
    $headers = substr($headers, 0, -2);

// connect to the DB of interest

$con=mysqli_connect("localhost", "USERNAME", "PASSWORD", "YOUR_DB_NAME");

// export the results to csv
$sql4 = "SELECT $headers UNION SELECT * FROM YOUR_TABLE_NAME WHERE ... INTO OUTFILE '/output.csv' FIELDS TERMINATED BY ','";
$result4 = $con->query($sql4);
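
For illustration only (the column names here are made up), if the table had columns id, time and unit, the loop above would leave $headers as "'id', 'time', 'unit'", so ignoring the WHERE clause, $sql4 would expand to roughly:

SELECT 'id', 'time', 'unit'
UNION
SELECT * FROM YOUR_TABLE_NAME
INTO OUTFILE '/output.csv' FIELDS TERMINATED BY ','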
Lonely孤独者°
#3 · 2019-01-05 10:22

Here is a way to get the header titles from the column names dynamically.

/* Change table_name and database_name */
SET @table_name = 'table_name';
SET @table_schema = 'database_name';
SET @default_group_concat_max_len = (SELECT @@group_concat_max_len);

/* Sets Group Concat Max Limit larger for tables with a lot of columns */
SET SESSION group_concat_max_len = 1000000;

SET @col_names = (
  SELECT GROUP_CONCAT(QUOTE(`column_name`)) AS columns
  FROM information_schema.columns
  WHERE table_schema = @table_schema
  AND table_name = @table_name);

SET @cols = CONCAT('(SELECT ', @col_names, ')');

SET @query = CONCAT('(SELECT * FROM ', @table_schema, '.', @table_name,
  ' INTO OUTFILE \'/tmp/your_csv_file.csv\'
  FIELDS ENCLOSED BY \'\\\'\' TERMINATED BY \'\t\' ESCAPED BY \'\'
  LINES TERMINATED BY \'\n\')');

/* Concatenates column names to query */
SET @sql = CONCAT(@cols, ' UNION ALL ', @query);

/* Resets Group Concat Max Limit back to its original value */
SET SESSION group_concat_max_len = @default_group_concat_max_len;

PREPARE stmt FROM @sql;
EXECUTE stmt;
DEALLOCATE PREPARE stmt;
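
As a concrete illustration (the database, table, and column names are made up), for a table my_db.sensor with columns id, time and unit, the @sql built above ends up roughly as:

(SELECT 'id','time','unit')
UNION ALL
(SELECT * FROM my_db.sensor
  INTO OUTFILE '/tmp/your_csv_file.csv'
  FIELDS ENCLOSED BY '\'' TERMINATED BY '\t' ESCAPED BY ''
  LINES TERMINATED BY '\n');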
迷人小祖宗
#4 · 2019-01-05 10:23

An example from my database: table name sensor, with columns (id, time, unit):

select ('id') as id, ('time') as time, ('unit') as unit
UNION ALL
SELECT * INTO OUTFILE 'C:/Users/User/Downloads/data.csv'
  FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
  LINES TERMINATED BY '\n'
  FROM sensor
走好不送
#5 · 2019-01-05 10:27

You can use a prepared statement with lucek's answer to dynamically export a table with column names to CSV:

-- If your table has too many columns
SET SESSION group_concat_max_len = 100000000;

-- Prepared statement
SET @SQL = (SELECT CONCAT(
    'SELECT * INTO OUTFILE \'YOUR_PATH\' FIELDS TERMINATED BY \',\' OPTIONALLY ENCLOSED BY \'"\' ESCAPED BY \'\' LINES TERMINATED BY \'\\n\' FROM (SELECT ',
    GROUP_CONCAT(CONCAT("'", COLUMN_NAME, "'") ORDER BY ORDINAL_POSITION),
    ' UNION SELECT * FROM YOUR_TABLE) AS tmp')
  FROM INFORMATION_SCHEMA.COLUMNS
  WHERE TABLE_NAME = 'YOUR_TABLE' AND TABLE_SCHEMA = 'YOUR_SCHEMA');

-- Execute it
PREPARE stmt FROM @SQL;
EXECUTE stmt;
DEALLOCATE PREPARE stmt;

Thanks, lucek.

#6 · 2019-01-05 10:27

So, if all the columns in my_table are of a character data type, we can combine the top answers (by Joe, matt and evilguc) to get the header added automatically in one 'simple' SQL query, e.g.:

select * from (
  (select column_name
    from information_schema.columns
    where table_name = 'my_table'
    and table_schema = 'my_schema'
    order by ordinal_position)
  union all
  (select *  -- potentially complex SELECT statement with WHERE, ORDER BY, GROUP BY, etc.
  from my_table)) as tbl
into outfile '/path/outfile'
fields terminated by ',' optionally enclosed by '"' escaped by '\\'
lines terminated by '\n';

where the last few lines produce the CSV output.

Note that this may be slow if my_table is very large.

女痞
#7 · 2019-01-05 10:27

Since the 'include headers' functionality doesn't seem to be built in yet, and most "solutions" here require typing the column names manually and/or don't even take joins into account, I'd recommend getting around the problem.

  • The best alternative I found so far is using a decent tool (I use HeidiSQL).
    Run your query, select the grid, then just right-click and export to a file. It has all the necessary options for a clean export and should handle most needs.

  • Along the same lines, user3037511's approach works fine and can be automated easily (see the sketch after this list).
    Just launch your query from the command line to get your headers. You may get the data with a SELECT INTO OUTFILE... or by running your query without the limit; yours to choose.

    Note that output redirection to a file works like a charm on both Linux AND Windows.
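
A minimal sketch of that command-line approach (host, credentials, database, table, and output path are placeholders): when the mysql client's output is redirected to a file, it prints tab-separated rows with the column names as the first line by default.

mysql -h localhost -u USERNAME -p -e "SELECT id, time, unit FROM sensor" YOUR_DB_NAME > /tmp/sensor.tsv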


This makes me want to highlight that 80% of the time, when I want to use LOAD DATA INFILE or SELECT INTO OUTFILE, I end up using something else because of some limitation (here, the absence of a 'headers' option; on AWS RDS, missing rights; and so on).

Hence, I don't exactly answer the OP's question... but it should address their needs :)
EDIT: and to actually answer the question: no.
As of 2017-09-07, you just can't include headers if you stick with the SELECT INTO OUTFILE command.
:|
