bhilam
Joined: 04 Aug 2005   Posts: 2
Posted: Fri Sep 30, 2005 12:36 am Post subject: php script parsing huge data, times out |
|
|
I have a PHP script using the getID3 class (for those who know what that is) to read through the tags of about 30 GB of MP3s for display on my website, but because of the huge amount of data it times out. I have successfully tested the script with a rather small collection (up to 3 GB) and it works fine. I have also increased the max_execution_time variable in php.ini to 600, but I am still seeing the timeout occur. Does anyone have any idea how I can get around this problem? I know I can increase the max_execution_time value further or just use a static HTML page to display this info, but I am hoping there is a better solution.
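For the record, the same limit can also be raised from inside the script instead of php.ini; this is only a sketch, using PHP's standard set_time_limit() and ini_set() calls:
Code:
<?php
// Per-script alternative to editing php.ini: lift PHP's own execution limit.
// Note: the web server can impose a separate timeout on CGI scripts,
// and this setting does not affect that.
set_time_limit(0);                    // 0 = no PHP time limit
ini_set('max_execution_time', '0');   // equivalent per-request override
?>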
Using:
Abyss Web Server X1 (version 2.0.6)
PHP v4.4
Running:
XP Pro on a 1.4 GHz machine with 640 MB RAM
thanks,
- bhilam
MonkeyNation
Joined: 05 Feb 2005   Posts: 921   Location: Cardiff
Posted: Fri Sep 30, 2005 10:41 am Post subject:
|
|
I've played with getID3; I use it to write tags to files that don't have any.
If I were you, I'd run the script hourly from the command line with cron or pycron (depending on your OS) and have it write the results to a file or database. Then serve that file or database to your visitors instead of re-scanning everything.
Call ignore_user_abort() in the script and output a "NEXT!" or something after every file to make absolutely sure it won't time out. Something like the sketch below.
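This is only a rough sketch: it assumes getID3's analyze() call and its ID3v2 tag array, and the paths, cache file name and tab-separated format are placeholders to adapt.
Code:
<?php
// build_index.php - meant to be run from the command line (cron/pycron),
// not through the browser.

set_time_limit(0);          // no PHP execution limit for this run
ignore_user_abort(true);    // keep going even if the caller disconnects

require_once('getid3/getid3.php');    // adjust to wherever getID3 is installed

$getID3 = new getID3;
$mp3dir = 'C:/mp3s';                   // root of the collection (placeholder)
$cache  = 'C:/htdocs/mp3index.txt';    // index file the website reads (placeholder)

$fp = fopen($cache, 'w');
scan_dir($mp3dir, $getID3, $fp);
fclose($fp);

function scan_dir($dir, &$getID3, $fp)
{
    $dh = opendir($dir);
    while (($entry = readdir($dh)) !== false) {
        if ($entry == '.' || $entry == '..') {
            continue;
        }
        $path = $dir . '/' . $entry;
        if (is_dir($path)) {
            scan_dir($path, $getID3, $fp);           // recurse into subfolders
        } elseif (preg_match('/\.mp3$/i', $entry)) {
            $info   = $getID3->analyze($path);       // read the tags of one file
            $artist = isset($info['tags']['id3v2']['artist'][0])
                      ? $info['tags']['id3v2']['artist'][0] : '';
            $title  = isset($info['tags']['id3v2']['title'][0])
                      ? $info['tags']['id3v2']['title'][0] : '';
            fwrite($fp, $artist . "\t" . $title . "\t" . $path . "\n");
            echo 'NEXT! ' . $path . "\n";            // progress output per file
        }
    }
    closedir($dh);
}
?>
Writing plain tab-separated text keeps the visitor-facing page trivial; a database would work the same way.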
aprelium
Joined: 22 Mar 2002   Posts: 6800
Posted: Fri Sep 30, 2005 1:51 pm Post subject: Re: php script parsing huge data, times out
|
|
bhilam,
Have you also increased the CGI Execution Timeout in Abyss Web Server (Console > Host > Configure > Scripting Parameters > CGI Parameters)?
Support Team
Aprelium - http://www.aprelium.com
bhilam
Joined: 04 Aug 2005   Posts: 2
Posted: Fri Sep 30, 2005 2:49 pm Post subject:
|
|
Thanks for your suggestions. I will first try increasing the CGI execution timeout and see if that works; otherwise I will look into pycron, since I am running Win XP.
thanks,
- bhilam
cmxflash
Joined: 11 Dec 2004   Posts: 872
Posted: Fri Sep 30, 2005 7:58 pm Post subject:
|
|
Reading 30 GB of data for every single visitor is stupid... Your server is going to be damn slow.
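Which is why the cron approach above makes sense: the page visitors actually hit should only read the small pre-built index. A minimal sketch of that page, reusing the placeholder index file name from the earlier sketch:
Code:
<?php
// list.php - what visitors request; it only reads the pre-built index,
// never the MP3s themselves.
$lines = file('C:/htdocs/mp3index.txt');   // same placeholder path as the indexer
echo "<table>\n";
foreach ($lines as $line) {
    list($artist, $title, $path) = explode("\t", rtrim($line, "\r\n"));
    echo '<tr><td>' . htmlspecialchars($artist) . '</td><td>'
         . htmlspecialchars($title) . "</td></tr>\n";
}
echo "</table>\n";
?>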