holdech
Joined: 23 Nov 2005, Posts: 4
Posted: Thu Nov 24, 2005 8:46 pm    Post subject: Broken pipe error processing large file - help!

I have a PHP application that reads from a flat file and posts to a PostgreSQL database. Everything is running on a Win XP pro machine.
It works fine for files around 100,000 records or less.
Above that, I get this in the access.log:
127.0.0.1 - - [24/Nov/2005:14:05:22 -0500] "GET /scraper/scrapeBigFF.php HTTP/1.1" 200 224 "" "Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.1)"
127.0.0.1 - - [24/Nov/2005:14:22:32 -0500] "GET /scraper/scrapeBigFF.php HTTP/1.1" 500 246 "" "Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.1)"
And this in the cgi.log:
CGI: [C:\PHP\php-cgi.exe ] URI: /scraper/scrapeBigFF.php Broken pipe
I've tried closing the DB connection and reconnecting at 100,000-record increments, but I get the same result.
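For reference, the loop looks roughly like this (simplified; the file, table, and column names here are just placeholders):

Code:
<?php
// Simplified import loop - real file/table/column names differ.
$conn = pg_connect("host=localhost dbname=scraper user=postgres");
$fh = fopen("C:/data/bigfile.txt", "r");
$count = 0;
while (($line = fgets($fh)) !== false) {
    $fields = explode("\t", rtrim($line));
    $sql = "INSERT INTO records (col1, col2) VALUES ('"
         . pg_escape_string($fields[0]) . "', '"
         . pg_escape_string($fields[1]) . "')";
    pg_query($conn, $sql);
    // Reconnect every 100,000 records - made no difference.
    if (++$count % 100000 == 0) {
        pg_close($conn);
        $conn = pg_connect("host=localhost dbname=scraper user=postgres");
    }
}
fclose($fh);
pg_close($conn);
?>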
I assume the "broken pipe" is a PHP error? Is there a setting in php.ini that I need to tweak for large files?
Thanks
Chris
AbyssUnderground
Joined: 31 Dec 2004, Posts: 3855
Posted: Thu Nov 24, 2005 10:29 pm

Broken pipe is a script error. You need to find out where the error is occurring in the script. Do this by turning on PHP errors in php.ini.
_________________
Andy (AbyssUnderground) (previously The Inquisitor)
www.abyssunderground.co.uk
aprelium
Joined: 22 Mar 2002, Posts: 6800
[post body not preserved in the archive]
holdech
Joined: 23 Nov 2005, Posts: 4
Posted: Fri Nov 25, 2005 5:59 pm    Post subject: Broken pipe

Thanks for the info.
The file is read, not uploaded - so that's not the problem.
It seems to gag when processing more than 100,000 inserts into PostgreSQL.
As long as I load less than that, all is fine. It must be related to the server-to-DB communication.
In answer to the other post - it's not a script error. The script runs just fine. It's not a file data issue either, as I can process all 2 million records in 100K chunks.
I also tried reading the entire file into an array using file(). That part works fine - I can echo all 2 million records back to the browser. I think this localizes the problem to the database connection (pipe).
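For reference, the read-only test was basically this (simplified, with a placeholder path):

Code:
<?php
// Read the whole flat file into an array - no database involved.
$lines = file("C:/data/bigfile.txt");
foreach ($lines as $line) {
    echo $line . "<br>\n";   // echoes all 2 million records fine
}
echo count($lines) . " records read";
?>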
Chris
holdech
Joined: 23 Nov 2005, Posts: 4
Posted: Fri Nov 25, 2005 6:50 pm    Post subject: Problem found!!!!!

You may want to keep this one for reference.
When PHP is running as a CGI, it uses a socket to communicate with the web server.
This will time out in 30 seconds unless you send something back to the web server in under 30 seconds, i.e. just echo something to the browser.
I had this problem because the DB server was busy for more than 30 seconds processing the file I was importing, so nothing was sent to the web server socket and it timed out - hence the "broken pipe".
Note that this is NOT the script "max execution" time, which I also had to raise above its default of 30 seconds.
There is no setting in php.ini that can be tweaked to fix this: its socket timeout parameter covers the case of not getting a reply, not the case of nothing being sent at all.
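The fix is just to emit something periodically inside the long-running loop so the CGI socket stays alive. A rough sketch (the 10,000 interval and all names are only examples):

Code:
<?php
set_time_limit(0);  // also lift the script "max execution" limit

$conn = pg_connect("host=localhost dbname=scraper user=postgres");
$fh = fopen("C:/data/bigfile.txt", "r");
$count = 0;
while (($line = fgets($fh)) !== false) {
    // ... build and run the INSERT as before ...
    if (++$count % 10000 == 0) {
        // Send something down the socket so it doesn't time out.
        echo "Processed $count records...<br>\n";
        flush();  // if output buffering is on, call ob_flush() first
    }
}
fclose($fh);
pg_close($conn);
echo "Done: $count records.";
?>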
Question - is there a socket timeout in Abyss that can be tweaked?
Anyway - file this one away for the next guy who has a database-intensive script and is getting a 500 error with a broken pipe.
Chris
aprelium
Joined: 22 Mar 2002, Posts: 6800
Posted: Sat Nov 26, 2005 9:37 pm    Post subject: Re: Broken pipe

holdech wrote:
    Thanks for the info.
    The file is read, not uploaded - so that's not the problem.

Yes, we are aware of the difference. That's why we asked you to read only the second part of the post, because it addresses timeout problems, which are common to both large uploads and long computations.
_________________
Support Team
Aprelium - http://www.aprelium.com