I have a list of 20 file names, like ['file1.txt', 'file2.txt', ...]. I want to write a Python script to concatenate these files into a new file. I could open each file with f = open(...), read line by line with f.readline(), and write each line into the new file. That doesn't seem very "elegant" to me, especially the part where I have to read/write line by line.

Is there a more "elegant" way to do this in Python?
What's wrong with UNIX commands (given you're not working on Windows)?
    ls | xargs cat | tee output.txt
does the job (you can call it from Python with subprocess if you want).
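For instance, a minimal sketch of the subprocess call, assuming a POSIX system with cat available; passing the file list explicitly also avoids sweeping up output.txt itself the way a bare ls would:

    import subprocess

    filenames = ['file1.txt', 'file2.txt']  # ...the rest of the 20 files
    # Hand the explicit list to cat and redirect its stdout into the new file
    with open('output.txt', 'wb') as outfile:
        subprocess.call(['cat'] + filenames, stdout=outfile)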
Check out the .read() method of the File object:
http://docs.python.org/2/tutorial/inputoutput.html#methods-of-file-objects
You could do something like the following sketch, where filenames is the list from the question and output.txt is a placeholder name:
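    filenames = ['file1.txt', 'file2.txt']  # ...the rest of the 20 files
    with open('output.txt', 'w') as outfile:
        for fname in filenames:
            with open(fname) as infile:
                # .read() pulls in the whole file, so there is
                # no line-by-line loop
                outfile.write(infile.read())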
or a more 'elegant' python-way, joining all the contents in a single pass (same placeholder names):
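    filenames = ['file1.txt', 'file2.txt']  # ...the rest of the 20 files

    def read_file(fname):
        # Read one file completely, closing it as soon as we are done
        with open(fname) as infile:
            return infile.read()

    with open('output.txt', 'w') as outfile:
        # ''.join() concatenates all the pieces in one pass
        outfile.write(''.join(read_file(fname) for fname in filenames))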
which, according to this article (http://www.skymind.com/~ocrow/python_string/), would also be the fastest.
Use shutil.copyfileobj. It should be more efficient.
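A minimal sketch, assuming the same placeholder names; copyfileobj streams each file across in fixed-size chunks instead of reading it into memory whole:

    import shutil

    filenames = ['file1.txt', 'file2.txt']  # ...the rest of the files
    with open('output.txt', 'wb') as outfile:
        for fname in filenames:
            with open(fname, 'rb') as infile:
                # Copy in chunks from the input file object to the output
                shutil.copyfileobj(infile, outfile)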
An alternative to @inspectorG4dget's answer (the best answer to date, 29-03-2016). I tested it with 3 files of 436 MB.
@inspectorG4dget's solution: 162 seconds
The following solution: 125 seconds
The idea is to create a batch file and execute it, taking advantage of "good old technology". It's semi-Python, but it works faster. It works on Windows.
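A sketch of that idea; concat.bat, output.txt, and the file list are placeholder names, and copy /b is the Windows command that does the binary concatenation:

    import subprocess

    filenames = ['file1.txt', 'file2.txt', 'file3.txt']  # placeholder list
    # Write a one-line batch file that concatenates the inputs in binary mode
    with open('concat.bat', 'w') as bat:
        bat.write('copy /b ' + '+'.join(filenames) + ' output.txt\n')
    # Run the batch file through the Windows shell
    subprocess.call('concat.bat', shell=True)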
If you have a lot of files in the directory, then glob2 might be a better option to generate the list of filenames rather than writing them by hand.
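For instance, a sketch assuming the glob2 package is installed (pip install glob2); its glob() mirrors the standard library's glob module and returns a list of matching paths:

    import glob2

    # Collect every .txt file in the directory instead of listing names by hand
    filenames = glob2.glob('*.txt')

For a flat directory, the standard library's glob module would work the same way.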