I have to process a file that has data for about 600 different
products. I have no idea which ones any user may need, so I have
to process all of them! Urgh.
The problem is that there is one file for each day. So each of
the daily files has to be read from top to bottom, and each line
of data sent to the appropriate one of the 600 destination files.
I began with a straightforward approach: read the source file line
by line, open the appropriate destination file, and write the data
line by line. This was fine when the destination file did not have
too much existing data in it. But once a few weeks of data is in
each of the destination files, I have to wade through it all to
insert or append the new data.
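In rough terms, the first version did something like this. This is
just a Python sketch, not my actual code, and the line format and
file naming here are made up for illustration:

    # Version 1: for each line of the day's source file, open the
    # matching destination file and add the line. (Sketch only:
    # assumes each source line starts with a product id; the real
    # code also has to insert into the middle of the file, not
    # just append at the end.)
    def process_day_v1(source_path):
        with open(source_path) as src:
            for line in src:
                product_id = line.split(",")[0]   # assumed format
                dest_path = product_id + ".dat"   # made-up naming
                with open(dest_path, "a") as dest:
                    dest.write(line)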
Next I tried reading the entire destination file into memory and
flying through it with pointers: transferring the file up to the
insert point into a TempStr, adding the new data, then adding the
rest of the file. Then I write the whole file to the HD again.
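Something like this, again as a Python sketch (the fixed-width key
at the start of each line is an assumption to keep the example
simple):

    # Version 2: read the whole destination file, splice the new
    # data in at the right spot, and write the whole thing back.
    def find_insert_point(contents, key):
        # Naive scan: offset of the first line whose leading key
        # sorts after the new day's key, or end-of-file if none.
        offset = 0
        for line in contents.splitlines(keepends=True):
            if line[:len(key)] > key:
                return offset
            offset += len(line)
        return offset

    def process_product_v2(dest_path, new_data, new_key):
        with open(dest_path) as f:
            contents = f.read()              # whole file into memory
        pos = find_insert_point(contents, new_key)
        temp = contents[:pos] + new_data + contents[pos:]  # the TempStr splice
        with open(dest_path, "w") as f:
            f.write(temp)                    # rewrite the whole file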
This is faster, but I am still moving each one of the 600 files
around too much. They can each be about 1.6MB!
That's about 960MB I have to read and write for each new day of
data, and that's not including some other string processing that
has to be done on each one. So it's sloooooow.
If I want to drag and drop a week's data at a time, that's about
6GB that has to be moved and processed.
So now I'm realizing I need to stop moving the data. I could detect
whether the data needs to be inserted into or appended to the
existing data. If I rewrite the app (for the 4th time) I could make
the majority of the work an append operation. Then I could just open
the file and APPEND the new day's data, roughly like the sketch
below.
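As a Python sketch (checking the last line's key assumes the lines
are sorted on a fixed-width date key, and insert_in_place is a
hypothetical stand-in for the v2-style read/splice/rewrite):

    import os

    # Version 3: detect the common case (the new day sorts after
    # everything already on disk) and handle it with a pure append.
    # No read, no rewrite -- only the new bytes get written.
    def process_product_v3(dest_path, new_data, new_key):
        if os.path.exists(dest_path) and os.path.getsize(dest_path) > 0:
            with open(dest_path, "rb") as f:
                # Peek at the tail to find the last line's key.
                f.seek(max(0, os.path.getsize(dest_path) - 4096))
                last_line = f.read().splitlines()[-1].decode()
            if last_line[:len(new_key)] >= new_key:
                # Rare slow path: data arrived out of order, fall
                # back to an insert (hypothetical helper, not shown).
                insert_in_place(dest_path, new_data, new_key)
                return
        with open(dest_path, "a") as f:
            f.write(new_data)   # fast path: append only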
I am worried this will produce 600 fragments for each day's data,
as the files are written one at a time for one day, then
one at a time for the next day, and so on.
Am I going in the right direction with this? I can't think of a
better solution, but you guys always seem to have great ideas,
so I thought I would throw it out there and see what you think.
------------------
Kind Regards
Mike
[This message has been edited by Mike Trader (edited August 24, 2001).]