CGI: Size limit in transferring application data?

  • Kev Peel
    replied
    Albert,

    You're welcome - great news it works now.

    Short story: as I mentioned, this happened on a server I hired, when a CGI that sent data after verification (the CGI was only at the test stage) experienced timeouts. During my testing, the processes that got stuck in memory eventually brought the server down, because the Dr Watson debugger also kept starting and crashing (how does a debugger crash?). A quick email back to the ISP to disable the debugger on the server resolved that problem, but I still couldn't use the CGI anyway due to the timeout.


  • Michael Mattias
    replied
    Just a thought here, but what if your CGI program farmed out the upload to an additional thread of execution whilst your primary thread provided periodic updates? (number of bytes transferred so far, elapsed time, or even just "Waiting to complete upload....")

    If the time limit is "total execution time" this would not help, but if it is actually an "inactivity" limit, it should (rough sketch below).

    ???
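
    Something like this, perhaps (a rough, untested PB/CC sketch - UploadWork and gUploadDone are made-up names, not from any real demo):
    Code:
    #compile exe
    #dim all

    global gUploadDone as long

    function UploadWork(byval lParam as long) as long
        ' ... the slow transfer would happen here ...
        gUploadDone = 1
    end function

    function pbmain() as long
        local hThread as long, lResult as long
        thread create UploadWork(0) to hThread
        do until gUploadDone
            stdout "Waiting to complete upload..."  ' periodic output = activity for the server
            sleep 2000                              ' two seconds between updates
        loop
        thread close hThread to lResult
    end function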


  • Albert Richheimer
    replied
    Originally posted by Michael Mattias View Post
    And there's a demo in the source code forum that lets you redirect STDERR, too. The lack of STDERR redirection is a documented limitation of PB/CC*
    Thanks, MCM, this comes in real handy when creating a CGI for accepting clients' uploads. There is indeed binary data involved, and that definitely won't be tackled by plain STDIN (see the sketch below).

    I have resolved the pdf sending issue thanks to Kev's suggestion to check the timeout values in the Apache configuration; see my other posting. The CGI actually works with STDOUT: when STDOUT is used on a non-console output device, it transfers raw data, i.e. no tab expansion etc.

    Thanks again, MCM!

    Albert
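
    PS: For the upload CGI I have something like this in mind (just an untested sketch - ReadRawStdIn is a made-up name, win32api.inc included as in my main program, and a real version would have to loop until all CONTENT_LENGTH bytes have arrived):
    Code:
    function ReadRawStdIn() as string
        local hIn     as dword
        local cbTotal as long
        local cbRead  as dword
        local lOk     as long
        local sBody   as string

        cbTotal = val(environ$("CONTENT_LENGTH"))  ' byte count announced by the server
        if cbTotal <= 0 then exit function
        sBody = space$(cbTotal)                    ' preallocate the buffer
        hIn   = GetStdHandle(%STD_INPUT_HANDLE)
        lOk   = ReadFile(hIn, byval strptr(sBody), cbTotal, cbRead, byval %NULL)
        function = left$(sBody, cbRead)            ' may be short - loop in real code
    end function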


  • Albert Richheimer
    replied
    Originally posted by Kev Peel View Post
    Check your server's CGI timeout or other restrictions, many servers have a 15 second (or less) limit in place. Sometimes if the process is terminated or crashes it will stay in memory, again this depends on the server and whether a debugger such as Dr Watson is enabled.
    Kev, you are spot on! I actually had a timeout value of 300 seconds in httpd.conf, which obviously was way too much. After changing it to a more sensible 60, the transfer works, even with the large pdf. I suspect Apache got confused by the large value and replaced it with some low one. I believe the timeout only applies when nothing happens, i.e. while idling, waiting for the CGI to do something.
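
    For reference, the line I changed is just the Apache core Timeout directive (the value is in seconds):
    Code:
    # httpd.conf
    Timeout 60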

    Thanks, Kev!

    Cheers
    Albert


  • Kev Peel
    replied
    A script will time out if the server is configured to make it time out. Once the limit is reached, the CGI process is terminated. Sometimes this process does not get freed.

    My ISP (Fasthosts), which uses Apache on Server 2003, has the limit for CGI executables set to 10 seconds. That is not long enough for a large upload if the client is on a speed-limited connection, or if the CGI needs to contact another website.

    It's worth checking the CGI restrictions in Apache.


  • Michael Mattias
    replied
    Originally posted by Albert Richheimer View Post
    The example you are referring to replaces STDIN, but here I am dealing with a STDOUT problem
    No, it replaces BOTH.

    And there's a demo in the source code forum that lets you redirect STDERR, too. The lack of STDERR redirection is a documented limitation of PB/CC*

    (*Well, it's documented now. It wasn't when I found the need to create that function.)

    MCM


  • Shawn Anderson
    replied
    I've never seen a timeout problem writing back to the client.
    Are you sure you aren't caught in some other type of loop? 371k isn't that big.


  • Kev Peel
    replied
    Check your server's CGI timeout or other restrictions, many servers have a 15 second (or less) limit in place. Sometimes if the process is terminated or crashes it will stay in memory, again this depends on the server and whether a debugger such as Dr Watson is enabled.


  • Albert Richheimer
    replied
    Originally posted by Michael Mattias View Post
    Will terminate on first NUL or &h1A (=Ctrl-Z = EOF), which simply must be in a PDF file.
    Thanks, MCM, for your kind suggestion. The example you are referring to replaces STDIN, but here I am dealing with a STDOUT problem. I just checked my (working) pdf example - it indeed contains both X'00' and X'1A'. So the culprit is likely not STDOUT.

    BTW, before posting I had also tried replacing STDOUT with WriteFile to %STD_OUTPUT_HANDLE - same problem as with STDOUT.
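
    Roughly like this (reconstructed sketch, not my original code - BinaryStdOut is just an illustrative name, win32api.inc included as in the main program):
    Code:
    function BinaryStdOut(sData as string) as long
        local hOut      as dword
        local cbWritten as dword
        local lOk       as long
        hOut = GetStdHandle(%STD_OUTPUT_HANDLE)
        ' WriteFile sends the bytes unmodified - no translation, no EOF handling
        lOk  = WriteFile(hOut, byval strptr(sData), len(sData), cbWritten, byval %NULL)
        if lOk then function = (cbWritten = len(sData))
    end function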

    Cheers
    Albert


  • Michael Mattias
    replied
    >stdout sHeader+sData

    Will terminate on first NUL or &h1A (=Ctrl-Z = EOF), which simply must be in a PDF file.

    See piping example here to handle non-text data (eg PDF) in STDIN/STDOUT: http://www.powerbasic.com/support/pb...8&postcount=28

    OR... Use whatever HTML is used to "save file" (I don't do HTML).

    (It vanished while I was trying to answer it the first time!)


  • CGI: Size limit in transferring application data?

    Strange, my last posting has vanished...

    Anyway, here it is again:

    When my CGI sends a large pdf (371 kB), Apache locks the CGI process up in zombie status and sends nothing. When I try a small pdf (22 kB) instead, everything works just fine. See my code below.

    BTW, the zombie CGI cannot be removed by the task manager; I actually have to reboot the server.
    Code:
    #compile exe
    #dim all
    #include "win32api.inc"

    function pbmain() as long
        local sError    as string
        local hPageFile as long
        local sData     as string
        local sHeader   as string

        ' Read the whole pdf into a dynamic string
        hPageFile = freefile
        open "A_22kB_File.pdf" for binary lock shared as #hPageFile
    REM open "A_371_kB_File.pdf" for binary lock shared as #hPageFile
        if err then
           sError = "Cannot open file: "+format$(err)+" "+error$(err)
           goto errorhandler
        end if
        get$ #hPageFile, lof(#hPageFile), sData
        close #hPageFile

        ' CGI header: the empty line after Content-Length ends the header block
        sHeader = "Content-Type: application/pdf"+$crlf+ _
                  "Content-Length: "+format$(len(sData))+$crlf+$crlf
        stdout sHeader+sData
        exit function

    errorhandler:
        ' STDOUT appends a CRLF itself, so no explicit $crlf is needed here;
        ' the empty STDOUT supplies the blank line that ends the header block
        stdout "Content-type: text/html"
        stdout ""
        stdout "<html>"
        stdout "<head>"
        stdout "</head>"
        stdout "<body>"
        stdout "<p>Error: "+sError+"</p>"
        stdout "</body>"
        stdout "</html>"
    end function
    Does anybody have an idea what is going on here? It's hard to believe that Apache imposes a size limit on application data.

    Thanks,
    Albert