Large data file crashes program


  • Large data file crashes program

    I've developed a PowerBASIC DOS program that reads a data file,
    manipulates the data, prints the results to the screen, and then
    offers the option to print hardcopy.

    Everything works great until the data file gets too large.
    When that happens the program hangs and I have to press
    CTRL-ALT-DEL to get back to a DOS window.

    The data is read into a two-dimensional string array that's
    dimensioned adequately early in the program.

    What should I check first? Second? Third?

    I'd also like to trim the program size by getting rid of LPRINT
    code that is identical to the PRINT code. The former is only used
    to allow printing hardcopy after the information has been
    printed to the screen.

    Can I somehow offer the hardcopy option without cycling the
    program through similar LPRINT coding? The LPRINT coding adds
    greatly to the program's size, so I'd really like to try
    other options.

    Thanks for any ideas...

  • #2
    If you want to use the same code to print to either the screen or
    the printer (or both), you could use PRINT # in place of PRINT,
    after opening each device as a file:

    OPEN "SCRN:" FOR OUTPUT AS #1
    OPEN "LPT1:" FOR OUTPUT AS #2

    Then you could print to the appropriate device using the same code:

    PRINT #1, "This is on the screen"
    PRINT #2, "This is on the printer"

    Just use a variable in place of the device number to use the same
    line of code for multiple devices.
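    As a minimal sketch of that idea (the prompt, variable names, and
    report lines here are made up for illustration, not from the
    original program):

    ```basic
    ' Assumes #1 is the screen and #2 the printer, opened as devices.
    OPEN "SCRN:" FOR OUTPUT AS #1
    OPEN "LPT1:" FOR OUTPUT AS #2

    INPUT "Send hardcopy to the printer too (Y/N)"; Answer$

    FOR Dev% = 1 TO 2
      ' Always print to the screen; print to the printer only on request.
      IF Dev% = 1 OR UCASE$(Answer$) = "Y" THEN
        PRINT #Dev%, "Report heading"
        PRINT #Dev%, "Report detail line"
      END IF
    NEXT Dev%

    CLOSE #1, #2
    ```

    This way the report code exists once, and only the device number
    changes, which is exactly what removes the duplicated LPRINT block.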

    As for large files hanging your computer, have you tried using
    VIRTUAL arrays? You can't use variable length strings with VIRTUAL
    arrays, but you can use ASCIIZ strings. ASCIIZ strings work much
    like variable length strings, but with a set length limit. Just
    determine the length of the longest string you might need, and then
    create a virtual array of ASCIIZ strings of that length. Then you
    would use them just like you would use variable length strings.

    DIM VIRTUAL MyArray(100, 100) AS ASCIIZ * 1000 '10 meg array

    Just make sure the entire array doesn't exceed 16 meg (32 meg in pure
    DOS mode with some memory managers). You can have multiple arrays,
    each up to 16 meg, without a problem.
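    A hedged sketch of using such an array (the dimensions and the
    80-byte field length are illustrative assumptions; size them to
    your own data):

    ```basic
    ' Illustrative sizes: 5000 rows x 2 columns of 80-byte strings (~800K).
    DIM VIRTUAL Rows(1 TO 5000, 1 TO 2) AS ASCIIZ * 80

    Rows(1, 1) = "first field"         ' assign just like a normal string
    Rows(1, 2) = "second field"
    PRINT Rows(1, 1); " / "; Rows(1, 2)

    ERASE Rows                         ' release the virtual memory when done
    ```

    Strings longer than the declared length are truncated to fit, so
    pick the ASCIIZ length from the longest field in your data file.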

    If you decide to try it that way, I wrote some code a while back
    to load and save VIRTUAL arrays of any size and type to disk VERY
    fast, using ASM. If you want to try it, let me know and I'll post
    it here.



    • #3
      On the subject of "crashing the PC" when the data file gets large, I have to ask: have you enabled all types of error checking in your code? Simply add a $ERROR ALL ON meta-statement to the top of your code.

      By default, PB/DOS does not add error testing code to your application for such things as array boundary testing, overflow, etc.
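      For instance, a small sketch of the kind of bug that bounds checking catches (the array and index here are invented for illustration):

      ```basic
      $ERROR ALL ON        ' compile in bounds, overflow, and stack checks

      DIM A$(10)
      I% = 11
      A$(I%) = "oops"      ' with checking on, this raises a trappable
                           ' subscript-out-of-range error instead of
                           ' silently overwriting memory
      ```

      Without $ERROR ALL ON, that same assignment would scribble past the end of the array, which is one plausible way a program "hangs" only once the data file grows.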

      There are also other things that can crash your program that error checking does not catch - using pointers incorrectly, badly written inline assembly, etc. These are "soft" errors that can cause memory to be overwritten or altered, and their effects may not show up immediately - the error may show up hundreds or thousands of lines of code later in your application.

      In summary, retest your code with $ERROR ALL ON and see how you get on. If you are still having problems, start eliminating sections of your code until the error goes away. Don't delete the code out though - use REM on individual lines or $IF 0 | $ENDIF meta-statements to 'mark' a whole block of code as a comment.

      for example

      $IF 0
      X& = X& \ 0 ' this would be an error if executed!
      PRINT "See? I'm really a comment that looks like code!"
      PRINT "This prints too!"
      $ENDIF

      Also, the debugger can be handy to track down this type of problem. Let us know how these suggestions help you solve the problem.

      PowerBASIC Support
      mailto:[email protected]
      mailto:[email protected]