Working with huge files in VIM

Published 2019-01-08 03:38

I tried opening a huge (~2GB) file in VIM but it choked. I don't actually need to edit the file, just jump around efficiently.

How can I go about working with very large files in VIM?

10 Answers
再贱就再见
#2 · 2019-01-08 04:14

Since you don't need to actually edit the file:

  1. view (or vim -R) should work reasonably well on large files; a minimal invocation sketch follows this list.
  2. Alternatively, you can use more or less to page through it.
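
For what it's worth, a minimal invocation sketch (hugefile is a placeholder): the extra flags below are standard vim options that skip vimrc/plugin loading, swap-file creation, and viminfo, which removes most of the overhead when merely browsing a big file:

    view -u NONE -n -i NONE hugefile    # read-only, no config, no swap file, no viminfo
    vim -R -u NONE -n -i NONE hugefile  # equivalent spelling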
We Are One
#3 · 2019-01-08 04:16

For huge one-line files (the following prints characters 1 through 99):

cut -c 1-99 filename
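
If you need to page through the whole line rather than a fixed slice, a related trick (not part of the answer above, just standard tools) is to wrap the line before viewing it:

    fold -w 80 filename | less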
我欲成王,谁敢阻挡
#4 · 2019-01-08 04:21

This has been a recurring question for many years. (The numbers keep changing, but the concept is the same: how do I view or edit files that are larger than memory?)

Obviously, more or less are good approaches to merely reading the files; less even offers vi-like keybindings for scrolling and searching.
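
For example (standard less usage; the line number and pattern are arbitrary):

    less +100000g hugefile   # open positioned at line 100,000
    less +/ERROR hugefile    # open at the first match of "ERROR"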

A Freshmeat search on "large files" suggests that two editors would be particularly suited to your needs.

One would be lfhex ... a large-file hex editor (which depends on Qt). That one, obviously, entails using a GUI.

The other seems suited to console use: hed ... it claims to have a vim-like interface (including an ex mode?).

I'm sure I've seen other editors for Linux/UNIX that can page through files without loading them entirely into memory, but I don't recall their names. I'm making this response a "wiki" entry to encourage others to add links to such editors. (Yes, I am familiar with ways to work around the issue using split and cat, as sketched below; but I'm thinking of editors, especially console/curses editors, which can dispense with that and save the time, latency, and disk-space overhead such approaches entail.)
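
For reference, the split-and-cat workaround mentioned above looks roughly like this (a sketch; the chunk size and names are arbitrary):

    split -b 500M HUGEFILE part_   # break into 500 MB pieces (GNU split)
    vim part_aa                    # edit whichever piece holds your data
    cat part_* > HUGEFILE.new      # reassemble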

我想做一个坏孩纸
#5 · 2019-01-08 04:24

Emacs works very well with files into the hundreds of megabytes; I've used it on log files without too much trouble.

But generally, when I have some kind of analysis task, I find writing a Perl script a better choice.
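
For instance, a streaming one-liner of the kind meant here reads the file line by line, so memory use stays flat no matter how big the file is (the pattern and file name are hypothetical):

    perl -ne 'print if /ERROR/' huge.log > errors.log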

唯我独甜
#6 · 2019-01-08 04:27

This is old, but: use nano, vim, or gvim.

孤傲高冷的网名
#7 · 2019-01-08 04:28

I had a 12 GB file to edit today. The vim LargeFile plugin did not work for me: it still used up all my memory and then printed an error message :-(. I could not use hexedit either, because it can only overwrite bytes, not insert anything. Here is an alternative approach:

Split the file, edit the parts, and then recombine it. You still need twice the disk space, though. (A consolidated script sketch follows the steps below.)

  • Grep for something surrounding the line you would like to edit:

    grep -n 'something' HUGEFILE | head -n 1
    
  • Extract that range of the file. Say the lines you want to edit are lines 4 and 5. Then do:

    sed -n -e '4,5p' -e '5q' HUGEFILE > SMALLPART
    
    • The -n option is required to suppress sed's default behaviour of printing every line
    • 4,5p prints lines 4 and 5
    • 5q aborts sed after processing line 5
  • Edit SMALLPART using your favourite editor.

  • Recombine the file:

    (head -n 3 HUGEFILE; cat SMALLPART; sed -e '1,5d' HUGEFILE) > HUGEFILE.new 
    
    • i.e., take all the lines before the edited range from HUGEFILE (here, the top 3 lines), append the edited lines (here, lines 4 and 5), then append everything after the edited range, and write the combined result to a new file.

HUGEFILE.new will now be your edited file; you can delete the original HUGEFILE.
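
The steps above could be wrapped into a small helper script along these lines (a sketch, not part of the original answer; the name edit_slice.sh is hypothetical, and it assumes GNU coreutils and 1-based line numbers):

    #!/bin/sh
    # Usage: edit_slice.sh BIGFILE FIRST LAST
    # Extract lines FIRST..LAST, open them in $EDITOR, then rebuild the file.
    BIG=$1; FIRST=$2; LAST=$3
    sed -n "${FIRST},${LAST}p;${LAST}q" "$BIG" > SMALLPART
    ${EDITOR:-vi} SMALLPART
    { head -n "$((FIRST - 1))" "$BIG"; cat SMALLPART; sed "1,${LAST}d" "$BIG"; } > "$BIG.new"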
