Would more RAM make a difference?


sKunKbad

I've got kind of a basic computer that I built. Intel i5 processor with 8GB RAM and 1TB HDD. The operating system is Ubuntu 14.04. The problem I'm having is that I use Sublime Text 2 or gedit for working with text files, and large files are constantly freezing up either program.

 

As an example, I copied and pasted 17MB of text into a file, and just trying to save it made the window go dark. It came back to life a few seconds later, but on many occasions it just permanently freezes.

 

Back when I used Windows, I worked in Sublime Text 2 and Notepad++, and Sublime was always slow, but Notepad++ was always awesome. Unfortunately, Notepad++ is not available for Linux, and I don't want to run it in a virtual environment just for big files. It seems like Linux should have something that is just as fast.

 

So, do you think upgrading to 16GB of RAM would make a difference?

 

Is there a text editor that I could use on Linux that is fast for working with big files?

 

Thanks for your advice.


More RAM would be redundant, because any decent text editor (or search tool) will load the file in chunks rather than all at once. Glad you found one that does so.
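The chunk-at-a-time idea can be sketched with standard command-line tools (the file name big.sql here is just a stand-in for a large file):

```shell
# Create a small stand-in for a huge file:
printf 'line1\nline2\nline3\n' > big.sql

# Read only the first 12 bytes -- the rest of the file is never loaded:
head -c 12 big.sql

# Read the second 6-byte chunk, skipping the first:
dd if=big.sql bs=6 skip=1 count=1 2>/dev/null
```

An editor that works this way only keeps the visible chunk in memory, which is why file size matters far more than installed RAM.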

 

Out of pure curiosity, what are these files that are so large? Especially source code files. That seems unusual.


 

For instance, I was working with KML files downloaded from the US Census Bureau. These are basically just really big XML-type files, but regardless of what they are, Sublime Text 2 and gedit just don't handle them very well. Another example would be a MySQL database dump in which I want to search and replace the domain name. Some of these files hold 100 tables' worth of data.

 

In both cases I could probably just use sed from the command line, but it would be nice if these text editors just worked, and worked fast.
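For the dump-file case, the sed approach might look like this (the table, file, and domain names are made up for illustration):

```shell
# Hypothetical stand-in for one line of a large MySQL dump:
printf "INSERT INTO wp_options VALUES ('siteurl', 'http://old.example.com');\n" > dump.sql

# sed streams the file line by line, so it never needs the whole dump
# in RAM; -i.bak edits in place and keeps a .bak backup copy.
sed -i.bak 's/old\.example\.com/new\.example\.com/g' dump.sql
```

Because sed is a stream editor, this stays fast even on multi-gigabyte dumps where a GUI editor would choke.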

 

What do you suggest for a Linux text editor that loads files in chunks?

