Running Out of Memory

Collapse
X
 
  • Filter
  • Time
  • Show
Clear All
new posts

  • #1

    I have a PB/DOS application that eventually runs out of memory
    after it has been used for a few hours. The program starts out
    with 74K of free data space (FRE(-1)), but the number keeps
    getting smaller as the program runs different portions of the
    code.

    What causes the number to keep getting smaller?
    Is there a way to get the memory back?
    How can I make the program smaller?


    ------------------

  • #2
    One thing I can think of right off the top of my head is that
    you can null out strings after they no longer serve a purpose.

    e.g., T$ = "A really big string of characters". Do whatever is
    necessary with it, then T$ = "" after it's processed.

    The same can be said of arrays, if necessary.

    Well, as another thought, you can try restructuring your program
    and CHAIN modules as they are needed. This will keep unused
    program segments out of memory until they are needed.
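    For instance (a throwaway sketch; the names and sizes here are
    just for illustration):

    EXAMPLE
    T$ = STRING$(20000, "X")   ' grab about 20K of string space
    ' ... do whatever is necessary with T$ ...
    T$ = ""                    ' give the string space back

    DIM Work$(500)             ' same idea for an array
    ' ... fill and use Work$() ...
    ERASE Work$                ' release its storage when done
    END OF EXAMPLE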

    ------------------


    [This message has been edited by Mel Bishop (edited February 07, 2005).]
    There are no atheists in a fox hole or the morning of a math test.
    If my flag offends you, I'll help you pack.



    • #3
      Is there any point in the program where you can safely start it over from scratch?

      A trick that works to reset everything is to issue a run statement in your code. I do this as a matter of general practice before returning to the main menu, for example.

      You can also use a temporary file as a semaphore for this purpose.

      EXAMPLE
      ' program initialization here, dim statements, whatever
      ' blah blah blah

      if len(dir$("flag")) > 0 then kill "flag": goto submenu

      ' start of the program (the main menu):
      print "This is the menu. We want to skip this on the second run."

      submenu:
      print "This is the submenu. We want to come back here."
      open "flag" for output as #1
      close #1
      run ' hit it again
      END OF EXAMPLE

      That last run statement re-initializes everything and resets all variables and whatnot. Then when we go back to the start of the program, we find that our "flag" semaphore is set, so we immediately go to the submenu again without hitting the main menu. We delete the flag file before we go there so we won't find it again by accident on the next run.

      Basically, you're giving the whole thing a kick and re-starting from "new" each time the program hits the run statement. You might want to do this at the end of a menu selection when you return to the main menu (nothing else to do at that time anyway, and nothing to keep track of other than static data, usually).

      Incidentally, that if len(dir$("flag")) > 0 structure looks more awkward than a simple if dir$("flag") <> "", but PB/DOS handles the first method much faster than the second.


      ------------------



      • #4
        To erase arrays, the cleanest way is the ERASE statement.

        In general, having many small procedures that perform small tasks using LOCAL variables also helps keep memory usage low, without the need to explicitly erase variables that are no longer used.

        You might also take a look at the various metastatements (e.g., $COM and the like); IIRC there are some that let you exclude libraries and set buffer sizes. For instance, if you don't need a communications buffer as big as the default, you can make it as small as possible, or perhaps have no buffer at all.
        There is also another metastatement ($OPTIMIZE SPEED | SIZE, IIRC) that lets you decide whether the compiler should optimize for speed (bigger code) or size (slower code).
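        A rough sketch of what that might look like (the buffer value
        is made up, and the exact metastatement syntax may differ by
        PB/DOS version, so check the manual):

        EXAMPLE
        $COM 0            ' no communications buffer if the serial port is unused
        $OPTIMIZE SIZE    ' favor smaller code over faster code

        SUB ShowSum(N%)
          LOCAL I%, Total&   ' LOCALs are released when the SUB exits
          FOR I% = 1 TO N%
            Total& = Total& + I%
          NEXT I%
          PRINT Total&
        END SUB
        END OF EXAMPLE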


        ------------------
        Davide Vecchi
        [email protected]



        • #5
          There is no one absolute answer to this question. If you have
          extended or expanded memory, then you can tell PB/DOS to use it,
          which greatly increases the amount of memory you have for your
          program and for your data structures.

          But poor data management is often a culprit when a program and
          the data used just grows and grows. There are several techniques
          for coping with this. For instance, I prefer to use the same
          variables over and over, so that I do not have to assign more
          memory for new variables. I typically use A, B, C, D, F, and G
          as my FOR loop counters. Then I typically use U, V, W, X, Y, and
          Z for my temporary variables. The letters in between tend to hold
          more permanent values, as well as longer named variables. That
          means that the letters at the beginning and end of the alphabet
          have limited scope, and I know I am free to use them again if I
          am out of that scope range.

          Others suggest you erase or nullify a variable when you are
          done with it. While a$ = "" does free up the string space
          once held by the characters assigned to a$, it does not get
          rid of the a$ variable itself. Doing an A = 0 does not free
          up any memory either - in fact, you had to use extra
          instructions to make this assignment, which itself takes up
          memory space.

          But using a$ again instead of adopting b$ frees up the string
          space that held the old characters in a$, and eliminates the
          need to allocate space for a new variable, b$.
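          You can watch this happen with FRE(-1) (a quick sketch; the
          exact numbers will vary from system to system):

          EXAMPLE
          PRINT FRE(-1)              ' free data space before
          A$ = STRING$(30000, "X")   ' allocate a big string
          PRINT FRE(-1)              ' roughly 30000 bytes less
          A$ = ""                    ' string space is released...
          PRINT FRE(-1)              ' ...but the a$ variable remains
          A$ = STRING$(30000, "Y")   ' reusing a$ needs no new variable
          END OF EXAMPLE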

          Note that I use single-letter names for most variables, but
          you can use much longer names if you like. Once the program
          is compiled, the matter of names is dropped altogether -
          only the assigned memory locations for each variable are
          retained. So if you like AAA$ instead, feel free to use
          it - and reuse it, and reuse it.

          The big memory waste is often in the matter of Arrays. If you
          keep acquiring more data, filling in large arrays, and do nothing
          to get rid of old data, you are going to eat up memory. If this
          is unavoidable, then you have to think in terms of either working
          with the data on a drive or out of conventional memory.
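          For example, one way to keep an array from eating memory is
          to dump the old data to a file and then erase the array (a
          sketch only; the file name and sizes are invented):

          EXAMPLE
          $DYNAMIC                   ' so ERASE actually frees the storage
          DIM Buf$(1000)
          ' ... Buf$() fills up as data comes in ...
          OPEN "OLDDATA.TXT" FOR APPEND AS #1
          FOR I% = 1 TO 1000
            PRINT #1, Buf$(I%)
          NEXT I%
          CLOSE #1
          ERASE Buf$                 ' give the memory back
          DIM Buf$(1000)             ' start again with an empty array
          END OF EXAMPLE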

          Two effective tools exist for working outside of conventional
          memory. One is a tool for creating arrays in extended or
          expanded memory. Now these can be really large arrays, up to
          the total size of available memory, which on some machines may
          be up to 2 Gigabytes. But the other is where you set up a
          RAMDrive, or a virtual hard drive in your extended or expanded
          memory. The software driver for a RAMDrive will let you open
          and close, read and write to files and directories that seem
          to be on your "new" drive, but at speeds that are thousands
          of times faster than a real hard drive. Or if you have USB
          support and a flash card driver or something like that, you can
          actually use an electronic drive for this purpose.

          So there are different causes, different approaches for limiting
          memory loss, and different ways to overcome the confines of
          conventional memory.



          ------------------
          Old Navy Chief, Systems Engineer, Systems Analyst, now semi-retired
