$FLOAT NPX - is this a good idea?


  • $FLOAT NPX - is this a good idea?

    I am, as always, fighting memory problems with my big engineering code. What about turning off emulation using:

    $FLOAT NPX ' use floating point hardware only, not emulation

    Is emulation something left over from the 8086 days (been there, done that) or is it still needed with a modern Pentium-ish system? Will all the ugly engineering calculations still work? Will they work faster or slower? Will this save much memory? Should I add root beer?

  • #2
    Daniel,
    always use $FLOAT NPX unless you have reason to think FPU hardware might not be available. It's faster and should give smaller code.
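
    If it helps, here is a minimal sketch of how it sits at the top of a PB/DOS source file (only the $FLOAT line comes from the thread; the rest is illustrative, and if I remember right $FLOAT EMULATE is the default):

        $COMPILE EXE          ' build a stand-alone .EXE
        $FLOAT NPX            ' require 80x87 hardware; no emulation library linked in

        DIM x AS EXT          ' 80-bit extended precision, handled by the FPU
        x = SIN(1.2345##) * EXP(0.5##)
        PRINT "Result:"; x
        END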

    Paul.

    • #3
      >I am, as always, fighting memory problems

      What kind of problems? Apparent shortage thereof? Corruption of same?

      Your choice of roads depends on the destination.
      Michael Mattias
      Tal Systems (retired)
      Port Washington WI USA
      [email protected]
      http://www.talsystems.com

      • #4
        thanks and responses

        Thanks for the replies - a few follow-up questions about the answers:

        Paul Dixon said: always use $FLOAT NPX unless you don't think FPU hardware is available.
        >>Is it reasonable to expect it to be available on most people's computers these days, or are there new machines being built (normal desktop and laptops) that don't have it?

        Michael Mattias said: What kind of problems?
        >>A few months back I posted a question which generated a three-pager - see "Program only works if compiled on other computer". I got lots of advice but nothing that got to the root of the problem, namely that my big fancy desktop computer only gives me 260K bytes of DOS memory available and a 210K largest program. This prevents compiling my large program on the desktop, although it compiles just fine on my laptop, which has 318K bytes available and a 250K largest program.

        Bob Zale said: "If you only have 300K+ available for DOS programs, something is radically wrong with your Windows installation. It should be double that amount. You really need to look into that issue."

        How do I do that? I tried all the old memory utilities that were suggested - some didn't run, others ran but didn't do anything I understood. I tried all the AUTOEXEC and CONFIG suggestions. I also spoke to the people who sold me that computer and they basically said "sounds like software, we do hardware."
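
        For the record, here is the kind of thing I tried (illustrative only; on an NT/2000/XP machine the DOS box reads CONFIG.NT and AUTOEXEC.NT from the system32 folder rather than CONFIG.SYS and AUTOEXEC.BAT):

            REM From inside the DOS box: list each resident module and how
            REM much conventional memory it is using
            MEM /C /P

            REM Lines suggested for CONFIG.NT (CONFIG.SYS on Win9x):
            dos=high, umb
            device=%SystemRoot%\system32\himem.sys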

        I suspect that it is either a hardware/driver problem, or something in the Windows setup that ties up that memory long before the AUTOEXEC and CONFIG calls enter into the picture.

        So I'm compiling with PBC.EXE, which makes debugging more difficult, since it doesn't take you to the point in the program where it found a problem.

        Any step-by-step instructions would be greatly appreciated!

        • #5
          Daniel,
          unless you are programming embedded PCs for specialist hardware (and you can't be, because you'd know it if you were), you can assume that all PCs built in the last 15 years, and many before that, have built-in FPU hardware.
          The 486DX was introduced in 1989 with a built-in FPU. Before that you could buy a separate co-processor to provide one.

          Of course, if you aren't programming for embedded PCs (which tend to have limited resources), then why are you still programming in DOS? Wouldn't it make sense to move to a Windows-type compiler, where available RAM is 2GB instead of 640KB?

          Paul.

          • #6
            That story rings a bell now...

            Re your 260K memory... that is way, way low, unless you are talking about what is left over AFTER you have loaded your program into the PB/DOS IDE and are trying to run from there.

            When you only have 640KB max to start with and you want to use 300K or so for the PB/DOS IDE plus your compiled program image, then yeah, that's a hard limit you have hit. I used to do a lot of PB/DOS, and for several projects I had precisely zero opportunity to run from the IDE.

            You know what happens then? You start planning and including code for "how am I going to test this since I can't run in the IDE?" before you ever run into any memory limits.
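
            A sketch of the sort of scaffolding I mean, assuming PB/DOS's $IF/$ENDIF conditional compilation (the %TESTMODE flag and TracePoint routine are just illustrations):

                %TESTMODE = 1               ' flip to 0 for the production build

                SUB TracePoint(Msg$)
                $IF %TESTMODE
                  ' can't single-step in the IDE, so log checkpoints to disk instead
                  OPEN "TRACE.LOG" FOR APPEND AS #99
                  PRINT #99, TIME$; " "; Msg$
                  CLOSE #99
                $ENDIF
                END SUB

            With %TESTMODE set to 0, TracePoint compiles down to an empty routine, so the production build carries essentially none of the test code.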

            Today's programmers (those born In The Time of Windows) often do not appreciate what it took to write, test and debug under the memory constraints of MS-DOS.

            You know how it is sometimes said that children of the wealthy "were born on third base and grew up believing they had hit a triple"?

            Well, some of these younger programmers were born with 2GB of RAM and grew up believing they knew how to program efficiently.

            MCM
            Michael Mattias
            Tal Systems (retired)
            Port Washington WI USA
            [email protected]
            http://www.talsystems.com

            • #7
              >Is emulation something left over from the 8086 days (been there, done that)
              Yes. It sounds like you are running code in a DOS box inside Windows, in which case (unless there is something I am missing) you can safely assume there is an FPU present (I can't imagine anyone using Windows on a system older than a 486 anymore, and if they are, I doubt they are running "engineering code"). The $FLOAT NPX option will run the floating-point code on hardware, so it will be faster, and you won't need the redundancy of software floating-point routines when the hardware will be handling the calculations anyway.

              Also, you might want to consider using PB/CC, which will get rid of some of those self-imposed limits DOS suffers from. You will have a much easier development time, as you can actually run the code from the IDE itself (I had almost forgotten the frustration of realizing that my program wasn't too big, just too big to run from the PB DOS IDE). PB/CC offers many, many other benefits too.
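
              For comparison, a minimal PB/CC console program looks something like this (a sketch; PB/CC uses # metastatements and a PBMAIN entry point):

                  #COMPILE EXE
                  FUNCTION PBMAIN() AS LONG
                    ' 32-bit flat memory model: no 640K ceiling, FPU assumed
                    LOCAL x AS DOUBLE
                    x = SIN(1.2345#) * EXP(0.5#)
                    PRINT "Result ="; x
                  END FUNCTION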

              >Today's programmers (those born In The Time of Windows) often do not appreciate what it took to write, test and debug under the memory constraints of MS-DOS.
              I remember writing, testing, and debugging on the Commodore 64, with its (at the time) impressive 64K. Moving to DOS was a world above, with "limitless" memory. That didn't last long, of course, as the 640K glass ceiling wasn't much better than the 64K ceiling. Of course, I am sure there are people older than myself that remember the days when 4K was a ridiculously large amount of memory.

              Regardless, don't limit your ideas (and your code) to a measly 640K in the year 2007. That's not an excuse to waste memory or write inefficient code. But having a bigger canvas, while a waste to some artists, can be just more room for bigger and better pictures to a seasoned artist.

              Thanks,
              Flick

              • #8
                It's kind of a 'generic' thing for me to preach that answering the question "how am I going to test this program?" is every bit as important a part of designing your program as creating your file/record/table layouts, deciding whether to use flat files or a DBMS, or deciding whether to put everything in a single executable or locate some functions in a dynamic link library.

                That is, it's one of the questions that should be answered before you write your first line of source code.

                In this case it sounds as though one of my other Great Truths may be in play: software has a finite life. When any given application has been enhanced and maintained a lot, it will reach a point where it makes more sense to start over - designing for all the features now desired - since further maintenance on the original design is nearly impossible.
                Michael Mattias
                Tal Systems (retired)
                Port Washington WI USA
                [email protected]
                http://www.talsystems.com

                • #9
                  >>Regardless, don't limit your ideas (and your code) to a measly 640K in the year 2007

                  Um, I think that if you are writing for MS-DOS, limiting yourself to a design not requiring more than 640KB of RAM for either testing or execution is still a pretty good idea!
                  Michael Mattias
                  Tal Systems (retired)
                  Port Washington WI USA
                  [email protected]
                  http://www.talsystems.com
