I'm working on a DLL using PBWin 8 that is called by a video game written in C++. We have a bizarre problem that I'm desperately trying to fix.
The DLL has a function:
Code:
FUNCTION ENG_SetEngineParameter ALIAS "ENG_SetEngineParameter" (BYVAL Constant AS LONG, BYVAL FloatValue AS SINGLE) EXPORT AS LONG
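For completeness, the C++ side binds to this export roughly like so. This is a sketch, not our exact code: the DLL file name is made up, and it assumes the PB default STDCALL convention, which is why the pointer type is declared __stdcall.

Code:
#include <windows.h>

// Matches the PB signature: LONG return, BYVAL LONG, BYVAL SINGLE.
// PB exports default to STDCALL, hence __stdcall here.
typedef long (__stdcall *ENG_SetEngineParameterProc)(long constant, float floatValue);

int main() {
    HMODULE dll = LoadLibraryA("Engine.dll");   // DLL name is illustrative
    if (dll == NULL) return 1;
    ENG_SetEngineParameterProc setParam =
        (ENG_SetEngineParameterProc)GetProcAddress(dll, "ENG_SetEngineParameter");
    if (setParam != NULL) {
        setParam(5, 13.8f);                     // Constant = 5, FloatValue = 13.8
    }
    FreeLibrary(dll);
    return 0;
}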
The function is called with Constant = 0, 1, 2, 3, etc., through 43, each with a different FloatValue (SINGLE). The first time we run through this, everything is fine. I use CALLSTK$, and I also have a separate debugging macro after it that outputs this data to a text file like this:
Code:
'//First text output
DebugOutput(CALLSTK$(1))

'//Second text output
TEXT$ = STR$(Constant) + STR$(FloatValue)
DebugOutput(TEXT$)
The first time we call this set of functions, the debugging output for the first few lines looks like this:
Code:
0 20000
ENG_SetEngineParameter([b]1[/b],45000)
[b]1[/b] 45000
ENG_SetEngineParameter(2,1000)
2 1000
ENG_SetEngineParameter(3,4)
3 4
ENG_SetEngineParameter(4,54)
4 54
ENG_SetEngineParameter(5,13.8000001907349)
5 13.8
ENG_SetEngineParameter(6,14)
6 14
ENG_SetEngineParameter(7,8.5)
7 8.5
ENG_SetEngineParameter(8,26.3999996185303)
8 26.4
.
.
.
ENG_SetEngineParameter(28,[b]1[/b])
28 [b]1[/b]
ENG_SetEngineParameter(29,[b]1[/b])
29 [b]1[/b]
Question 1) Why is CALLSTK$ showing a number that is not the same as what STR$() shows? When we look in the C++ debugger it is sending, for example, 13.8, not 13.8000001907349. This question is merely a technical curiosity on my part; the discrepancy does not hurt the program at all, and we can live with it.
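My guess on this one: 13.8 has no exact SINGLE representation. The nearest single-precision value is 13.80000019073486..., so anything that widens the SINGLE to a double and prints about 15 significant digits shows the extra digits, while anything that rounds to the seven or so digits meaningful for a SINGLE (as STR$ apparently does) prints 13.8. A tiny C++ snippet reproduces both printouts:

Code:
#include <cstdio>

int main() {
    float f = 13.8f;   // nearest SINGLE to 13.8 is 13.80000019073486...

    // Widened to double and printed with 15 significant digits,
    // the "extra" digits appear, just like CALLSTK$ shows:
    std::printf("%.15g\n", (double)f);   // 13.8000001907349

    // Rounded to 7 significant digits, it looks exact, like STR$:
    std::printf("%.7g\n", (double)f);    // 13.8
    return 0;
}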
The real problem that has me in dire straits at the moment is that the second time we call this same sequence of function calls with identical values, the output suddenly changes to this:
Code:
0 20000
ENG_SetEngineParameter([b]9.99999984306749[/b],44999.9975758168)
[b]9.99999984306749[/b] 45000
ENG_SetEngineParameter(1.9999999686135,999.999984306749)
1.9999999686135 1000
ENG_SetEngineParameter(2.99999995292025,3.999999937227)
2.99999995292025 4
ENG_SetEngineParameter(3.999999937227,54.0000019013435)
3.999999937227 54
ENG_SetEngineParameter(4.99999992153375,13.7999996459942)
4.99999992153375 13.8
ENG_SetEngineParameter(5.9999999058405,13.9999990930997)
5.9999999058405 14
ENG_SetEngineParameter(6.99999954654986,8.50000003840606)
6.99999954654986 8.5
ENG_SetEngineParameter(7.999999874454,26.4000008913682)
7.999999874454 26.4
.
.
.
ENG_SetEngineParameter(28.0000016221733,[b]9.99999984306749[/b])
28.0000016221733 [b]10[/b]
ENG_SetEngineParameter(29.0000005756879,[b]9.99999984306749[/b])
29.0000005756879 [b]10[/b]
Question 2) Why do the integers suddenly show up as floating-point values the second time we do this? CALLSTK$ and STR$ both show the same thing this time around.

Question 3) The biggest problem we have (the one that crashes the game) is the bold 1 and 10 values. Why does, for instance, the prefix (Constant) of 1 suddenly become almost 10, as follows?
Code:
ENG_SetEngineParameter(9.99999984306749,44999.9975758168)
9.99999984306749 45000
And why does the following:
Code:
ENG_SetEngineParameter(28,[b]1[/b])
28 [b]1[/b]
ENG_SetEngineParameter(29,[b]1[/b])
29 [b]1[/b]
suddenly become this:
Code:
ENG_SetEngineParameter(28.0000016221733,[b]9.99999984306749[/b])
28.0000016221733 [b]10[/b]
ENG_SetEngineParameter(29.0000005756879,[b]9.99999984306749[/b])
29.0000005756879 [b]10[/b]
How can the ones become tens, or 9.99999984306749, and so forth?

We have checked this thoroughly in the C++ debugger (I'm in the office now with the C++ team), and the VC++ debugger shows that it is sending the correct values. Yet somehow the DLL is receiving garbage on the second set of calls.
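The next thing we plan to try is logging the exact bit pattern of each outgoing FloatValue on the C++ side, so we can compare it byte for byte with what the DLL receives on call #2. A sketch (LogOutgoing is just an illustrative name; it would be called immediately before each ENG_SetEngineParameter call):

Code:
#include <cstdio>
#include <cstring>
#include <cstdint>

// Logs the sender's exact IEEE-754 single-precision bits so they can
// be compared with whatever the DLL sees on its side of the call.
void LogOutgoing(long constant, float value) {
    std::uint32_t bits;
    std::memcpy(&bits, &value, sizeof bits);   // raw single-precision bits
    std::printf("sending %ld  %.15g  bits=0x%08X\n",
                constant, (double)value, (unsigned)bits);
}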
Even more mysterious: when we call this same set a third, fourth, fifth, tenth, etc. time, it reverts to what we got on the very first call. I don't care that CALLSTK$ shows something different from STR$ there; the way it works on the first call (and the third, fourth, etc.) is correct enough for us. The problem is call #2.
Any ideas or suggestions on how to track this problem down? I'm completely lost...