Well, I'm trying to program a scraper to pull pretty large chunks of data down from the tubes.
When I run it, though, it seems to stop pulling data around one minute into execution, at about 20k of parsed data. The pages are pretty big in terms of file size, but I don't imagine that's something it couldn't handle... It's going through multiple pages of a directory of online listings and pulling specific pieces of information. Lots of RegExp lovin' goin' on, but that's scrapers for you.
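For context, here's roughly the shape of the loop; the URL, page count, and regex below are placeholders, not my real target:

    <?php
    // Rough shape of the scraper: fetch each directory page, regex out the
    // fields, move on to the next page. URL and pattern are placeholders.
    set_time_limit(0); // belt and braces on top of the php.ini change

    $results = array();
    for ($page = 1; $page <= 50; $page++) {
        $html = file_get_contents("http://example.com/listings?page=$page");
        if ($html === false) {
            echo "Fetch failed on page $page\n";
            break;
        }
        // One field per listing, e.g. the title text.
        if (preg_match_all('#<h2 class="title">(.*?)</h2>#s', $html, $m)) {
            $results = array_merge($results, $m[1]);
        }
    }
    echo count($results) . " items scraped\n";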
I have already tried the following:
- Increased max_execution_time in php.ini
- Increased max_input_time in php.ini
- Max uploaded/downloaded filesize in php.ini = 8M (sanity check just below the list)
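That sanity check: dumping the effective values at the top of the script to confirm the php.ini edits are actually being picked up (just a sketch, directive names from the PHP manual):

    <?php
    // Print which ini file is in play and the effective values, to confirm
    // the edits reached the SAPI that's actually running the scraper.
    echo 'loaded ini          = ' . php_ini_loaded_file() . "\n";
    echo 'max_execution_time  = ' . ini_get('max_execution_time') . "\n";
    echo 'max_input_time      = ' . ini_get('max_input_time') . "\n";
    echo 'upload_max_filesize = ' . ini_get('upload_max_filesize') . "\n";
    echo 'memory_limit        = ' . ini_get('memory_limit') . "\n";

    // max_execution_time and memory_limit are PHP_INI_ALL, so they can also
    // be raised at runtime; max_input_time and upload_max_filesize are
    // PHP_INI_PERDIR and have to come from php.ini / .htaccess.
    set_time_limit(0); // 0 = no execution time limit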
Anyone have any ideas on this? Aqeuitas? smaxor?
Suggestions?!