I will be importing very large data files. Is it better practice (faster, more effective) to import the entire file into memory and then divide it into sub-matrices, or to import every n columns into a new matrix?

My guess is that loading it all into memory and then processing it will be faster, but that is just an uneducated guess.

This is a general programming question, but MATLAB may have specific considerations.
In my experience, the best way is to parse the file once with csvread (it uses dlmread, which in turn uses textscan, so the time penalty is not significant). This assumes, of course, that the huge file is not bigger than the RAM you have available. If the file is bigger than RAM (I recently had to parse a 31 GB file), then I would use fopen to read it line by line (or in chunks of lines, as you prefer) and write those chunks into a writable MAT-file. That way you can, in theory, write huge files limited only by your file system.
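To make both approaches concrete, here is a minimal sketch. The answer names csvread, fopen, and writable MAT-files; everything else is an assumption for illustration: the file names, the chunk size, the variable name `data`, and the premise that the file is purely numeric and comma-delimited.

```matlab
% Case 1 (file fits in RAM): one parsing pass with csvread.
% 'huge.csv' is a hypothetical file name.
M = csvread('huge.csv');

% Case 2 (file bigger than RAM): stream chunks with fopen/textscan
% and append them to a writable MAT-file via matfile.
fid = fopen('huge.csv', 'r');

% Peek at the first line to count columns, then rewind.
firstLine = fgetl(fid);
nCols = sum(firstLine == ',') + 1;   % assumes comma-delimited numeric data
frewind(fid);

fmt = repmat('%f', 1, nCols);        % one numeric field per column
chunkLines = 10000;                  % hypothetical chunk size

out = matfile('huge.mat', 'Writable', true);  % on-disk storage
rowsDone = 0;
while ~feof(fid)
    % Read up to chunkLines rows; CollectOutput packs the numeric
    % columns into a single matrix in C{1}.
    C = textscan(fid, fmt, chunkLines, 'Delimiter', ',', ...
                 'CollectOutput', true);
    block = C{1};
    if isempty(block)
        break;
    end
    % Grow the on-disk variable by indexing past its current end;
    % only the current chunk is ever held in RAM.
    out.data(rowsDone+1 : rowsDone+size(block,1), 1:nCols) = block;
    rowsDone = rowsDone + size(block, 1);
end
fclose(fid);
```

Note that partial, indexed writes to a variable on disk require the version 7.3 MAT-file format; matfile creates a v7.3 file by default when the file does not already exist.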