I have a text file with 4 columns, each column containing 65536 data points. The elements in each row are separated by commas. For example:
X,Y,Z,AU
4010.0,3210.0,-440.0,0.0
4010.0,3210.0,-420.0,0.0
etc.
So, I have 65536 rows, each row having 4 data values as shown above. I want to convert it into a matrix. I tried importing the data from the text file into an Excel file, because that way it's easy to create a matrix, but I lost more than half the data.
The easiest way to do it would be to use MATLAB's csvread function. There is also this tool which reads CSV files.
You could also do it yourself without too much difficulty: just loop over each line in the file, split it on commas, and put the values into your array.
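A minimal sketch of both approaches, assuming the file is named file.txt and starts with the X,Y,Z,AU header line shown in the question (the 1,0 arguments to csvread are zero-based row/column offsets that skip that header):

    % read everything below the header straight into a 65536x4 matrix
    M = csvread('file.txt', 1, 0);

    % or do it by hand: loop over the lines, split each on commas
    fid = fopen('file.txt', 'rt');
    fgetl(fid);                              % discard the X,Y,Z,AU header line
    M = zeros(65536, 4);
    row = 0;
    while true
        tline = fgetl(fid);
        if ~ischar(tline), break; end        % fgetl returns -1 at end of file
        row = row + 1;
        M(row, :) = sscanf(tline, '%f,%f,%f,%f').';
    end
    fclose(fid);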
If all the entries in your file are numeric, you can simply use a = load('file.txt'). It should create a 65536x4 matrix a. It is even easier than csvread.
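Note that load expects purely numeric ASCII data, so the X,Y,Z,AU header line would have to be removed first. A minimal sketch under that assumption:

    a = load('file.txt');   % assumes the header line has been stripped from the file
    size(a)                 % should report 65536 4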
Instead of messing with Excel, you should be able to read the text file directly into MATLAB (using the functions FOPEN, FGETL, FSCANF, and FCLOSE):
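A minimal sketch along those lines, assuming the file is named file.txt and begins with the single X,Y,Z,AU header line:

    fid = fopen('file.txt', 'rt');                % open the file for reading as text
    fgetl(fid);                                   % read and discard the header line
    data = fscanf(fid, '%f,%f,%f,%f', [4 Inf]);   % read the comma-separated numbers
    fclose(fid);
    data = data.';                                % fscanf fills column-wise; transpose to 65536x4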
Suggest you familiarize yourself with dlmread and textscan. dlmread is like csvread, but because it can handle any delimiter (tab, space, etc.), I tend to use it rather than csvread. textscan is the real workhorse: lots of options, plus it works on open files and is a little more robust at handling "bad" input (e.g. non-numeric data in the file). It can be used like fscanf in gnovice's suggestion, but I think it is faster (don't quote me on that though).
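A minimal sketch of both, assuming the file is named file.txt with its one-line X,Y,Z,AU header (the extra arguments only serve to skip that header):

    % dlmread: zero-based row/column offsets skip the header row
    M = dlmread('file.txt', ',', 1, 0);

    % textscan: returns one cell per column; concatenate them into a matrix
    fid = fopen('file.txt', 'rt');
    C = textscan(fid, '%f%f%f%f', 'Delimiter', ',', 'HeaderLines', 1);
    fclose(fid);
    M = [C{:}];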
Have you ever tried using importdata? The only parameters you need are the file name and the delimiter.
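For example, a minimal sketch assuming the file is named file.txt and that the third argument is used to tell importdata about the one header line:

    s = importdata('file.txt', ',', 1);   % delimiter ',' and 1 header line
    M = s.data;                           % 65536x4 numeric matrix
    names = s.colheaders;                 % column names from the X,Y,Z,AU header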