I've tried something like this:
file_in <- file("myfile.log","r")
x <- readLines(file_in, n=-100)
but I'm still waiting...
Any help would be greatly appreciated.
You could do this with read.table by specifying the skip parameter. If your lines are not to be parsed into variables, specify the separator as '\n' (as @Joris Meys pointed out below) and also set as.is=TRUE to get character vectors instead of factors.

Small example (skipping the first 2000 lines):
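A minimal sketch of that call, assuming the file name from the question (if your log content contains quotes or '#', you may also need quote="" and comment.char=""):

x <- read.table("myfile.log", skip = 2000, sep = "\n", as.is = TRUE)
head(x$V1)  # each remaining line of the file becomes one value in column V1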
As @JorisMeys already mentioned, the unix command tail would be the easiest way to solve this problem. However, I want to propose a seek-based R solution that starts reading the file from the end:
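A sketch of what such a seek-based reader could look like (my own illustration of the approach; the helper name read_last_lines and the 4 KB chunk size are arbitrary):

read_last_lines <- function(path, n, chunk_size = 4096L) {
  size <- file.info(path)$size
  con <- file(path, "rb")
  on.exit(close(con))
  buf <- raw(0)
  end <- size
  # read chunks backwards from the end until the buffer holds at least
  # n newlines (raw 0x0a), or until we reach the start of the file
  while (end > 0 && sum(buf == as.raw(10L)) <= n) {
    start <- max(0, end - chunk_size)
    seek(con, where = start, origin = "start")
    buf <- c(readBin(con, what = "raw", n = end - start), buf)
    end <- start
  }
  lines <- strsplit(rawToChar(buf), "\n", fixed = TRUE)[[1]]
  tail(lines, n)
}

read_last_lines("myfile.log", 100)  # never reads the bulk of a huge file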
I'd use scan for this, in case you know how many lines the log has:
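For instance, a sketch assuming the log is known to hold 200000 lines (an illustrative number) and you want the last 100:

x <- scan("foo.txt", what = "character", sep = "\n",
          skip = 200000 - 100, quiet = TRUE)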
scan("foo.txt",sep="\n",what=list(NULL))
to figure out how many records there are, orThe last option could look like :
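A sketch of such a chunked reader (the name ReadLastLines is illustrative; it scans through an open connection and keeps a sliding window of the last n lines, so the whole file is read but never held in memory at once):

ReadLastLines <- function(x, n, ...) {
  con <- file(x)
  open(con)
  # initial window of n lines; extra arguments such as skip are
  # passed on to scan()
  out <- scan(con, what = character(), nmax = n, sep = "\n",
              quiet = TRUE, ...)
  while (TRUE) {
    tmp <- scan(con, what = character(), nmax = 1, sep = "\n", quiet = TRUE)
    if (length(tmp) == 0) { close(con); break }
    # slide the window: drop the oldest line, append the newest
    out <- c(out[-1], tmp)
  }
  out
}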
allowing, for example:
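ReadLastLines("foo.txt", 100)  # the last 100 lines (the count is arbitrary)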
or
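ReadLastLines("foo.txt", 100, skip = 1e7)  # jump past the first 10 million lines before scanning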
in case you know you have more than 10 million lines. This can save reading time when you start having extremely big logs.
EDIT: In fact, I'd not even use R for this, given the size of your file. On Unix, you can use the tail command (e.g. tail -n 100 myfile.log). There is a Windows version of that as well, somewhere in a toolkit, but I haven't tried it out yet.