I'm trying to read multiple CSV files that have the same structure (column names) and are located in several folders. My main goal is to concatenate these files into one pandas DataFrame. The folder layout is attached below; each folder contains 5 CSV files. Is there a predefined function or something similar that can help?
Using os.walk() and pd.concat():

You can use os.walk() to iterate over the files in a directory tree (example). pd.read_csv() will read a single file into a DataFrame, and pd.concat(df_list) will concatenate all the DataFrames in df_list together. I don't believe there is a single method that combines all of the above for your convenience.
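Putting those three pieces together might look like the sketch below. The throwaway directory tree is an assumption standing in for the real folders (which each hold 5 CSVs with identical columns):

```python
import os
import tempfile

import pandas as pd

# Build a small throwaway tree to demonstrate: 2 folders, 2 CSVs each
# (assumption: the real layout is several folders of 5 CSVs).
root = tempfile.mkdtemp()
for folder in ("folder1", "folder2"):
    os.makedirs(os.path.join(root, folder))
    for i in range(2):
        pd.DataFrame({"a": [i], "b": [i * 10]}).to_csv(
            os.path.join(root, folder, f"file{i}.csv"), index=False
        )

# Walk the whole tree, read every .csv, then concatenate once at the end
df_list = []
for dirpath, dirnames, filenames in os.walk(root):
    for name in filenames:
        if name.endswith(".csv"):
            df_list.append(pd.read_csv(os.path.join(dirpath, name)))

combined = pd.concat(df_list, ignore_index=True)
print(combined.shape)  # 4 files x 1 row each -> (4, 2)
```

Calling pd.concat once on the full list is cheaper than appending to a DataFrame inside the loop, which would copy the data on every iteration.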
You might use glob.glob('*.csv') to find all CSVs and then concat them all.

This is the best solution to this problem:
Calling the function:
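The answer's original code is not shown; a minimal sketch of what a glob-based helper could look like follows. The function name `read_all_csvs`, the pattern, and the demo tree are assumptions:

```python
import glob
import os
import tempfile

import pandas as pd


def read_all_csvs(pattern):
    """Read every CSV matching the glob pattern into one DataFrame.

    recursive=True lets '**' match across subfolders.
    """
    files = glob.glob(pattern, recursive=True)
    return pd.concat((pd.read_csv(f) for f in files), ignore_index=True)


# Demo on a throwaway tree (assumption: stands in for the real folders)
root = tempfile.mkdtemp()
for folder in ("folder1", "folder2"):
    os.makedirs(os.path.join(root, folder))
    pd.DataFrame({"a": [1, 2]}).to_csv(
        os.path.join(root, folder, "data.csv"), index=False
    )

combined = read_all_csvs(os.path.join(root, "**", "*.csv"))
print(len(combined))  # 2 files x 2 rows = 4
```

A plain '*.csv' pattern only matches the current directory; the '**' wildcard with recursive=True is what descends into the subfolders.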
Frenzy Kiwi gave you the right answer. An alternative could be using dask. Let's say your folder structure is as described above; then you can just read all of them via dask.