
EXE Size


  • EXE Size

    Unusually, I'm working on an EXE that will probably reach about 10MB in size.

    Are there any concerns about EXE size in general? I've seen some much larger than 10MB, but other than a slower load time, I've not seen any concerns raised on the web.

    What's the largest EXE you've ever created with PowerBASIC, and were there any issues worth mentioning?

  • #2
    Define EXE size for us (well, for me!), Gary.

    My programs would all be small or middling except sometimes I have huge arrays - how do you relate EXE size to empty huge arrays or full huge arrays?

    Or are you just talking about size on disk?

    I would normally consider breaking a program down into logical pieces, provided the processing goes through distinct stages. SHELLing out is very fast - especially, I presume, if you are not doing it too often.

    Good discussion

    [I]I made a coding error once - but fortunately I fixed it before anyone noticed[/I]
    Kerry Farmer
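
Kerry's staged approach can be sketched in any language; here is a minimal Python illustration of running separate stage programs in sequence and stopping on the first failure (the stage commands are placeholders, with the Python interpreter standing in for real stage EXEs):

```python
import subprocess
import sys

def run_pipeline(commands):
    """Run each stage command in sequence; stop at the first failure."""
    for cmd in commands:
        result = subprocess.run(cmd)
        if result.returncode != 0:
            return False
    return True

# Demo: the Python interpreter stands in for the stage EXEs.
ok = run_pipeline([
    [sys.executable, "-c", "print('stage 1: import data')"],
    [sys.executable, "-c", "print('stage 2: process data')"],
])
print(ok)
```

In PowerBASIC the same shape would presumably use SHELL, checking each stage's exit status before launching the next.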


    • #3
      Hi Kerry!

      The EXE file size.


      • #4
        Interesting - so it is not the size of the program in memory? Hmmmm!
        [I]I made a coding error once - but fortunately I fixed it before anyone noticed[/I]
        Kerry Farmer


        • #5
          Gary, I am not sure about an idea I had lately while playing with loading certain files fast. If that is a concern, what I was testing was opening files located on a server from a workstation and leaving the program running; when I no longer needed the files, I was going to kill the program that kept them open using taskkill. I got the idea from a company selling a program that keeps files open on a web server so they stay in cache. If your EXE has to be started often, it might sound crazy, but I wonder what would happen if you did something similar with an EXE file. The programs I was trying to speed up were written by a third party and are very inefficient in how they work with data, so it did not help enough for what I was doing. They take 2 minutes to run their code against files on the server but about 2 seconds on a local drive, and they have to be on the server. Still, a crazy idea might help in certain conditions where speed is important.
          If that program runs twice a day, that is 4 minutes. 4 minutes times 220 work days is 880 minutes a year, which is about 14.7 hours a year where somebody is just waiting.
          p purvis
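
Paul's back-of-the-envelope figures check out; using Python purely as a calculator:

```python
# Figures from the post: a 2-minute run, twice a day, 220 work days a year.
minutes_per_run = 2
runs_per_day = 2
work_days = 220

minutes_per_day = minutes_per_run * runs_per_day   # 4 minutes a day
minutes_per_year = minutes_per_day * work_days     # 880 minutes a year
hours_per_year = minutes_per_year / 60             # ~14.7 hours a year
print(minutes_per_year, round(hours_per_year, 1))  # 880 14.7
```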


          • #6
            One of the sobering facts about modern software is that properly written code is usually not all that big, but as soon as you start adding resources to the executable it blows out in size very quickly. This is specific to executable file size. Memory footprint is another matter, usually handled in the code design so that you use the allocated memory as fast, and for as short a time, as possible. If, as Paul mentioned, the data is on a server, you are at the mercy of the server's data transfer rate, but there are a few tricks here: it is much faster to decompress data than to transfer it across a network, so if the data can be highly compressed, the transfer time drops in proportion to the size reduction.

            The level of usage matters: if it is only used occasionally, you can write it in a visual garbage generator because it just does not matter. As the usage goes up, the cost in time dictates that you start to make it faster, as staff time is expensive. If something gets used millions of times a day, you start doing things like heavily optimising the code to get the speed up, and if it's getting data from a server, you start looking at hardware to improve the data transfer rate: optical fibre cabling, etc.

            Usage rate dictates cost, and if the usage rate is high enough, code optimisation and hardware reduce the cost compared with junk software and poor network speed.
            hutch at movsd dot com
            The MASM Forum
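
Hutch's trade-off (decompressing is cheaper than transferring) is easy to demonstrate. A minimal Python sketch, where the sample data, compression level, and 10MB/s link speed are all illustrative assumptions rather than anything from the thread:

```python
import zlib

# Repetitive sample data standing in for a highly compressible data file.
data = b"account,amount,date\n" * 50_000
packed = zlib.compress(data, level=9)
assert zlib.decompress(packed) == data            # round-trip check

link_bytes_per_s = 10 * 1024 * 1024               # assumed ~10MB/s network link
raw_s = len(data) / link_bytes_per_s
packed_s = len(packed) / link_bytes_per_s

print(f"{len(data)} -> {len(packed)} bytes")
print(f"transfer: {raw_s:.3f}s raw vs {packed_s:.4f}s compressed (plus decompress time)")
```

On highly repetitive data like this, zlib shrinks the payload by well over 90%, so the transfer-time saving dwarfs the decompression cost.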



            • #7
              The biggest EXE file I've produced is around 400k.

              A typical disk drive will run at 100MB/second so your 10MB will only take a fraction of a second to load.

              You have 2GB available under 32-bit Windows operating systems for your EXE plus data (3GB in some cases). So as long as your EXE plus data is less than 2GB, and your PC has 2GB of memory available, you should have no issues with EXE size.
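
The arithmetic behind those two claims, with Python used purely as a calculator (the 100MB/second disk rate and 2GB address space are the figures given above):

```python
exe_size_mb = 10
disk_read_mb_s = 100                        # typical sequential disk read
load_time_s = exe_size_mb / disk_read_mb_s
print(load_time_s)                          # 0.1 seconds to load

address_space_mb = 2 * 1024                 # default 32-bit user address space
headroom_mb = address_space_mb - exe_size_mb
print(headroom_mb)                          # 2038 MB left over for data
```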


              • #8
                On a busy network the loading and shelling can be noticeable. Installing on workstations and pointing only to the data will increase performance.
                I've seen times when a workstation will take much longer than normal to load a program, but that could be anti-virus or other things slowing the transfer.


                • #9
                  Biggest exe I've ever done is about 5 meg. It always runs local, so no issues with load time. Actually, it's loaded almost all the time so nobody would ever notice the load time.


                  • #10
                    I've noticed some strange results with EXEs around 1 Gig due to some BMP resources. Graphics were damaged, and crashes happened.


                    • #11
                      Hi Stefan, which compiler were you using?
                      p purvis


                      • #12
                        PB10, Paul


                        • #13
                          I've been using the UPX compression tool for a few years. My application has a large number of resources embedded. It works well, and I've not had any negative impacts from it. My EXE right out of PB is about 2.5 meg; after UPX is done it's about 470K. On my system the compression takes about 20-25 seconds (Intel i5 @ 3.0 GHz).
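
A quick check of the reported saving, with Python as a calculator (figures taken from the post):

```python
before_kb = 2.5 * 1024   # ~2.5 meg straight out of PB
after_kb = 470           # ~470K after UPX

saved_pct = 100 * (1 - after_kb / before_kb)
print(round(saved_pct, 1))   # about 81.6% smaller on disk
```

Note that UPX-packed EXEs decompress themselves into memory at launch, so the saving applies to disk space and transfer time, not to the running memory footprint.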