"JJ" <***@vfemail.net> wrote
| > With VB it's similar. Reading or writing a file to disk
| > is incredibly fast. Working with strings is incredibly
| > fast. The only thing that's slow is concatenation. For
| > that I use arrays and Join.
| It's slow for accessing partial data of big files where the needed data is
| at offset 100MB+. Also, the Skip() method works more like Read() rather
| than simply changing the file pointer, because it took more than a second to
| Skip() 100MB+ characters.
I see you're right. Personally, I think there's a problem
with design and planning if anyone's making files that big,
but I got curious anyway. I have a text version of the Bible.
I put together 40 copies, for a total of about 170 MB. I
then read 40 bytes at offset 120,000,000. It took 8 seconds
in VBS and 5.5 seconds in VB. But looking closer, it turned
out that nearly all of that time was spent allocating memory
and reading in the file. The time required to do
s = Mid(s170K, 120000000, 40) didn't even register. It showed
as 0. Likewise, doing InStr(120000000, s170K, "Jesus") took
less than 1 ms.
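For anyone who wants to reproduce the test, here's a rough VBScript
sketch. The file name is just an example; the offset is the one from
my test. Timer only has limited resolution, so the sub-millisecond
calls will typically just report 0:

```vbscript
' Load one big file, then do in-memory lookups on the string.
Dim fso, ts, s170K, t0, s

Set fso = CreateObject("Scripting.FileSystemObject")

t0 = Timer
Set ts = fso.OpenTextFile("bible40.txt", 1)  ' 1 = ForReading
s170K = ts.ReadAll    ' nearly all the time goes here
ts.Close
WScript.Echo "Load:  " & (Timer - t0) & " s"

t0 = Timer
s = Mid(s170K, 120000000, 40)          ' effectively instant
WScript.Echo "Mid:   " & (Timer - t0) & " s"

t0 = Timer
WScript.Echo InStr(120000000, s170K, "Jesus")
WScript.Echo "InStr: " & (Timer - t0) & " s"
```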
So I guess what that means is that if you need to read one
section of text from 100 giant files, it will be slow. But if you
need to read 200 sections of text from one giant file, it won't
take much longer than it takes to load the file.
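If you only need a few bytes at a big offset and want to avoid paying
for the load at all, classic VB (not VBS) can do a true seek with
binary file access: Get takes a 1-based byte position and reads
straight from disk without touching the rest of the file. A sketch,
with the same example file and offset as above:

```vb
' VB: Open For Binary plus a positioned Get avoids loading the file.
Dim b(39) As Byte
Dim f As Integer

f = FreeFile
Open "bible40.txt" For Binary Access Read As #f
Get #f, 120000001, b   ' Get's position argument is 1-based
Close #f
' b now holds the 40 bytes at offset 120,000,000.
```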
I didn't realize the load would take so long. I typically deal
with files under 5 MB and read/write time is negligible. In fact,
with my script/HTML editor I now load the file and then run
it through a tokenizer, checking every byte/sequence to see
whether it's valid UTF-8. All of that still only takes a few ms.
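The core of that UTF-8 check is simple: look at each lead byte,
figure out how many continuation bytes it promises, and confirm each
of those bytes has the 10xxxxxx form. A simplified VBScript sketch
over a byte array (it doesn't reject overlong or surrogate encodings,
which a strict validator would):

```vbscript
' Simplified UTF-8 validity check over a byte array,
' e.g. one returned by ADODB.Stream's Read method.
Function IsValidUtf8(bytes)
    Dim i, j, b, extra
    IsValidUtf8 = False
    i = LBound(bytes)
    Do While i <= UBound(bytes)
        b = bytes(i)
        If b < &H80 Then                 ' ASCII
            extra = 0
        ElseIf (b And &HE0) = &HC0 Then  ' 110xxxxx: 2-byte sequence
            extra = 1
        ElseIf (b And &HF0) = &HE0 Then  ' 1110xxxx: 3-byte sequence
            extra = 2
        ElseIf (b And &HF8) = &HF0 Then  ' 11110xxx: 4-byte sequence
            extra = 3
        Else
            Exit Function                ' invalid lead byte
        End If
        For j = 1 To extra               ' check continuation bytes
            If i + j > UBound(bytes) Then Exit Function
            If (bytes(i + j) And &HC0) <> &H80 Then Exit Function
        Next
        i = i + extra + 1
    Loop
    IsValidUtf8 = True
End Function
```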