On 25.08.2013 20:08, Allin Cottrell wrote:
> On Sun, 25 Aug 2013, Sven Schreiber wrote:
>>
>> That is, there are thousands of operations working on strings holding
>> (almost) the entire file content (in this case about 1MB as I said). I
>> have tried to consolidate this into a more clever sscanf line, but that
>> didn't really help. Glad to take more ideas.
>>
>> In contrast, in python the file is read line per line. [...]
> Yes, that will make the difference. I think that to do this sort of
> thing efficiently on big files we would need at least one more
> function: something like the internal function bufgets(), which
> returns the next line of a text buffer held in memory until the
> buffer is exhausted (probably with a switch to drop the trailing
> newline character).
This sounds very much like an extended readfile() function to me. (?)
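For concreteness, the bufgets()-style behaviour described above might look roughly like the following sketch: return successive lines from an in-memory buffer, NULL once it is exhausted, with a switch to drop the trailing newline. The function name and signature here are purely illustrative, not gretl's actual internal API.

```c
#include <string.h>

/* Hypothetical sketch of a bufgets()-style reader: on each call,
   copy the next line of the buffer at *pbuf into @line and advance
   *pbuf; return NULL when the buffer is exhausted. If @strip is
   non-zero the trailing newline is dropped. */
static char *buf_next_line(char **pbuf, char *line, size_t maxlen, int strip)
{
    char *s = *pbuf;
    size_t n;

    if (s == NULL || *s == '\0') {
        return NULL; /* buffer exhausted */
    }

    n = strcspn(s, "\n");   /* length of the current line */
    if (n >= maxlen) {
        n = maxlen - 1;     /* truncate an overlong line */
    }
    memcpy(line, s, n);
    line[n] = '\0';

    if (!strip && n + 1 < maxlen && s[n] == '\n') {
        line[n] = '\n';     /* keep the newline if not stripping */
        line[n + 1] = '\0';
    }

    /* advance past the line and its newline, if present */
    *pbuf = s + n + (s[n] == '\n' ? 1 : 0);

    return line;
}
```

A caller would load the whole file into memory once, then loop `while (buf_next_line(&p, line, sizeof line, 1) != NULL)`, which avoids the repeated whole-buffer string operations mentioned above.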
> Greater flexibility would be available if we implemented the suite
> of functions fopen, fclose, fgets and friends. This wouldn't be very
> difficult since our functions would just be wrappers for the C
> library functions, but it does raise the "Swiss army knife" issue as
> mentioned by Jack.
Yes. Would the former be possible without the latter? If so, I think that
would be enough.
-sven