sKunKbad Posted May 14, 2015

I've got a fairly basic computer that I built: an Intel i5 processor with 8GB RAM and a 1TB HDD, running Ubuntu 14.04. The problem I'm having is that I use Sublime Text 2 or gedit for working with text files, and large files constantly freeze up either program. As an example, I copied and pasted 17MB of text into a file, and just trying to save it made the window go dark. It came back to life a few seconds later, but on many occasions it just permanently freezes. Back when I used Windows, I worked with Sublime Text 2 and Notepad++; Sublime was always slow, but Notepad++ was always awesome. Unfortunately, Notepad++ is not available for Linux, and I don't want to run it in a virtual machine just for big files. It seems like Linux should have something that is just as fast. So, do you think upgrading to 16GB of RAM would make a difference? And is there a text editor for Linux that is fast for working with big files? Thanks for your advice.
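One way to answer the RAM question empirically is to watch memory while reproducing the freeze. A minimal sketch using stock Linux tools (nothing beyond a default Ubuntu install assumed): if plenty of memory remains free during the hang, more RAM won't help, and the editor itself is the bottleneck.

    # Refresh memory stats every second while the editor struggles.
    # If most RAM is still free during the hang, the editor,
    # not memory, is the bottleneck.
    watch -n 1 free -h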
sKunKbad Posted May 14, 2015

I found one that seems to work well with large files: http://www.scintilla.org/SciTE.html

I was able to do a search and replace on a 170MB file, and it only took about 5 seconds. Still curious about upgrading the RAM, though.
Ofarchades Posted May 14, 2015

More RAM would be redundant, because any decent text editor (or search tool) will load the file in chunks. Glad you found one that does so. Out of pure curiosity, what are these files that are so large? Especially if they're source code files; that seems unusual.
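To illustrate the chunked-access idea from the shell (the file name here is just a placeholder): you can pull an arbitrary window out of an enormous file without ever loading the whole thing into memory.

    # Read one 1 MiB chunk starting 500 MiB into the file. Memory use
    # stays around 1 MiB no matter how large big_dump.sql is.
    dd if=big_dump.sql bs=1M skip=500 count=1 2>/dev/null | less

An editor that works this way keeps only the visible region (plus a little slack) in memory, which is why file size stops mattering.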
sKunKbad Posted May 14, 2015

In reply to Ofarchades: "Out of pure curiosity, what are these files that are so large?"

For instance, I was working with KML files downloaded from the US Census Bureau. These are basically just really big XML-type files, but regardless of what they are, Sublime Text 2 and gedit just don't handle them well. Another example would be a MySQL database dump in which I want to search and replace the domain name; some of these files hold 100 tables' worth of data. In both cases I could probably just use sed on the command line, but it would be nice if these text editors just worked, and worked fast. What do you suggest for a Linux text editor that loads files in chunks?
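For the dump-file case, the sed approach mentioned above looks roughly like this (the domain names and file name are placeholders). sed streams the file line by line, so memory use stays flat regardless of file size:

    # Replace the old domain everywhere in the dump, writing to a new
    # file so the original stays intact. The dots are escaped so they
    # match literally instead of acting as regex wildcards.
    sed 's/old-domain\.com/new-domain\.com/g' dump.sql > dump_new.sql

One caveat worth knowing: if the dump contains serialized PHP strings (common with WordPress databases), a plain textual replace can break the stored string lengths, so check the result before importing.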
Zane Posted May 16, 2015

I hear nothing but good things about VIM. Maybe that's the way you should go.
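If you do try Vim on huge files, a commonly suggested trick (just a sketch; the file name is a placeholder) is to launch it with your config, plugins, and the swap file disabled, which removes most of the per-line overhead from syntax highlighting and plugins:

    # -u NONE   skip vimrc and plugins (also disables syntax highlighting)
    # -n        don't create a swap file for this session
    vim -u NONE -n big_dump.sql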