Hi: I'm working on an application for the stock market that uses very large arrays.
They are all DIM'ed to the largest size needed.
The program will analyze any number of days of data, from 0 to 1,200.
At the end, there is a screen display of the results with a WAITKEY$, after which there is an EXIT FUNCTION.
When I run the program with 200, 500, or 900 days of data, all runs fine.
However, when I run 1,000 days, after the analysis, the screen display of the results, and a key input, I get "Program has encountered an error and needs to close" from M$.
I assume this is a GPF.
I had always assumed that DIM allocated all the memory needed, so if I asked for too much I would get a fault as soon as the program started. Instead, it seems that as I ask for more days of analysis and fill more of each array, something goes wrong once I pass a certain size.
Curiously, it does not go wrong while the program is actually running, but when I return via EXIT FUNCTION.
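To make the layout concrete, the skeleton is roughly this (names simplified and hypothetical; the real program has many more arrays and does the actual analysis inside the loop):

    FUNCTION AnalyzeDays(BYVAL nDays AS LONG) AS LONG
        ' Every array is DIMed to the largest size ever needed
        DIM Price(1 TO 1200) AS DOUBLE
        LOCAL i AS LONG
        LOCAL k AS STRING

        FOR i = 1 TO nDays        ' nDays is anywhere from 0 to 1200
            Price(i) = 0#         ' real code loads and analyzes the data here
        NEXT

        ' ... screen display of the results ...
        k = WAITKEY$              ' wait for a keypress

        EXIT FUNCTION             ' the GPF shows up at this return
    END FUNCTION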
Any ideas?
Hilton Schwartz