SERP Tracking Help

tomyates

Web Developer
Nov 25, 2010
UK, England
Hi Guys

Does anyone have any experience building SERP tracking software in PHP?

What are the roadblocks, and what issues usually arise?

I'm thinking of a dedicated server and rotating proxies. If you have any insight, please post.

Does anyone know the request limits per minute/hour, etc.?

Cheers
T
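For what it's worth, the usual pattern with rotating proxies is to round-robin each request through a list with cURL. A minimal sketch, assuming a hypothetical proxy list (the addresses below are placeholders, not real servers):

```php
<?php
// Round-robin proxy rotation: each request goes out through the next
// proxy in the list. Proxy addresses here are placeholders.
function pickProxy(array $proxies, int $requestNumber)
{
    return $proxies[$requestNumber % count($proxies)];
}

function fetchViaProxy(string $url, string $proxy)
{
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_PROXY, $proxy);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_setopt($ch, CURLOPT_TIMEOUT, 15);
    $html = curl_exec($ch);
    curl_close($ch);
    return $html; // false on failure
}

$proxies = ['203.0.113.10:3128', '203.0.113.11:3128', '203.0.113.12:3128'];
// The third request (zero-indexed request number 2) uses the third proxy:
echo pickProxy($proxies, 2), "\n"; // 203.0.113.12:3128
```

The rotation itself is trivial; the hard part is sourcing proxies that aren't already burned.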
 


Take Matt's advice: if you're asking these questions, you're going to have a bad time. It simply isn't worth building your own.
 
I'd rather create my own; it would be more cost-effective to build something I can use for my own and my clients' sites.

If anyone has had a go at building their own before, please PM me.

Basically, I need insight into the issues that are beyond my control (Google's end), such as:

- Request limits
- IP banning

so I can work around them.

Cheers
T
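On the request-limit side, the only real workaround is to go slowly and back off when blocked. A hedged sketch of randomised delays with exponential backoff; the figures are illustrative guesses, since Google does not publish its rate limits:

```php
<?php
// Delay between requests: a random base delay, doubled for each
// consecutive block, capped at a ceiling. The numbers are illustrative
// guesses -- Google does not publish its actual limits.
function nextDelaySeconds(int $consecutiveBlocks, int $base = 10, int $cap = 600): int
{
    $delay = $base * (2 ** $consecutiveBlocks); // exponential backoff
    $delay = min($delay, $cap);                 // never wait longer than the cap
    return $delay + random_int(0, $base);       // jitter so requests aren't clockwork
}

echo nextDelaySeconds(0), "\n"; // no blocks yet: somewhere in 10-20s
echo nextDelaySeconds(2), "\n"; // two blocks in a row: somewhere in 40-50s
```

Between the delays and the proxy pool, throughput per IP ends up very low, which is a big part of why the economics favour a paid service.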
 
There are limits per IP; if you hit them, they throw a captcha in the way. I don't know what the limit is now, but it was quite low last time I tried.

You also have the various geographic Google servers dotted around the world to consider, which will send back different results based on your (or your proxies') IP address.

Also, depending on your user-agent, you'll get different HTML back. IIRC, from when I last wrote a scraper for G, I needed to use IE4 as the user-agent to get the easiest HTML to parse.
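To make the geographic point concrete: the `gl` (country) and `hl` (language) query parameters steer which flavour of results you get back, and the user-agent is set per request. A sketch, with the caveat that Google can change or ignore these parameters at any time:

```php
<?php
// Build a Google results URL pinned to a country (gl) and language (hl).
// Google can change or ignore these parameters without notice.
function buildSerpUrl(string $query, string $country = 'uk', string $lang = 'en'): string
{
    $params = http_build_query([
        'q'   => $query,
        'gl'  => $country, // country bias
        'hl'  => $lang,    // interface language
        'num' => 100,      // results per page
    ]);
    return 'https://www.google.com/search?' . $params;
}

echo buildSerpUrl('php serp tracker'), "\n";
// https://www.google.com/search?q=php+serp+tracker&gl=uk&hl=en&num=100

// When fetching, the user-agent goes on the cURL handle, e.g. an old/simple
// one for simpler markup (per the IE4 suggestion above):
// curl_setopt($ch, CURLOPT_USERAGENT, 'Mozilla/4.0 (compatible; MSIE 4.0)');
```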
 

Unless you value your time below minimum wage, I find that highly unlikely.
 


Cheers! I'll take all of those into account.

T
 
If I can get something knocked up within a few days, then I'd say it would be worth my time over paying $500 per month.

T

Lol what rank tracker costs $500 a month? If you're spending that much, you're tracking too many keywords.

Don't build your own rank tracker, it's a solved problem. Use someone else's product.
 
If you're indeed doing volume, then I suggest you talk to illyas (Rank Checker Ace) or Bofu2u (http://www.micrositemasters.com/).

If the volumes are large enough, I am sure they will be willing to work with you on a customised solution that suits both your pockets.

I am sure you're a good developer (or have a team of good developers), but rank tracking is work that feeds directly into the conversations you have with your clients. It deals with your deliverables as an inbound marketing firm (SEO company?).

Google rankings are a propagated affair: core updates are synced out to other servers across the globe. There is a 90% chance that the rankings you derive at your end will be flawed. Multiple heuristics need to be integrated into the system.

Add to that the fact that it is resource-intensive, thanks to the extreme volatility of rankings. An always-on scraper will have to run continually to fetch data, store it, analyse it, and output analytical charts.

Also, no one who posted in this thread is a competitor or someone trying to "prevent" you from doing it. Everyone who posted here is an experienced developer who has had some kind of experience with rank tracking. I recommend you lend an ear.
 
Best comment ever.

Really great advice, and thanks. Hope he listens.
 
Take the advice that has already been given and use a service.

If you really want to build your own tool you could base it on something like this.

Be warned though, you'll end up spending too much time on coding and too much money on proxies.
 
^^ I do like the Code Canyon script that phrench just threw up, but I don't understand the purpose of reinventing the wheel...
 
You're going to have a bad time if you have questions like this. First they start throwing captchas up; then, even if you solve the captchas, if you hit them hard enough they'll start serving permission-denied pages with no captcha at all.
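In practice that means checking every response for both failure modes before you try to parse it. A hedged sketch; the `/sorry/` redirect path and the marker string reflect how Google's block pages have historically looked, and can change without notice:

```php
<?php
// Classify a response: normal results, captcha interstitial, or hard block.
// The markers below are assumptions based on how Google's block pages have
// historically appeared -- not a stable contract.
function classifyResponse(int $httpStatus, string $finalUrl, string $body): string
{
    if ($httpStatus === 403) {
        return 'blocked'; // permission denied, no captcha offered
    }
    if (strpos($finalUrl, '/sorry/') !== false || stripos($body, 'captcha') !== false) {
        return 'captcha'; // interstitial: back off hard before retrying
    }
    return 'ok';
}

echo classifyResponse(200, 'https://www.google.com/search?q=x', '<html>results</html>'), "\n"; // ok
echo classifyResponse(302, 'https://www.google.com/sorry/index', ''), "\n";                    // captcha
echo classifyResponse(403, 'https://www.google.com/search?q=x', 'denied'), "\n";               // blocked
```

Treating a captcha as a signal to slow down (rather than retrying immediately) is what keeps an IP from escalating to the hard block.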