So I've pasted together and rigged up a bunch of stuff in PHP, largely with the help of the folks here. It basically does your keyword research from start to finish with no input from the user and dumps out the following:
- keywords
- CPC
- local and global search volume
- number of search results
- links to the top 8 competitors via Google, with corresponding PageRank
- exact-match domains, if available
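To make the output concrete, each keyword ends up looking something like this. The field names here are illustrative only, not the script's actual structure:

$result = array(
    'keyword'       => 'example keyword',
    'cpc'           => 1.25,       // estimated cost per click
    'volume_local'  => 880,        // local monthly search volume
    'volume_global' => 2400,       // global monthly search volume
    'results_count' => 153000,     // number of Google search results
    'competitors'   => array(      // top 8 competitor URLs with their PageRank
        array('url' => 'http://example.com/page', 'pr' => 3),
        // ... seven more
    ),
    'exact_match'   => array('examplekeyword.com'),   // exact-match domains still available
);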
My main concern is it losing its value if eleventy billion people are all scraping from the same source, both from everyone getting identical results and from hammering the same web services, most likely with similar proxy lists.
My second concern is the competence required from the end user just to keep it alive: you'll need a DeCaptcher account, an API key for the whois checker (who may not like the hits-to-purchases ratio), a server capable of curling through proxies (no shared hosting), an ever-expanding proxy list, an understanding of how to set up cron jobs, etc.
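For anyone unclear on the proxy requirement, this is roughly all it amounts to; a minimal sketch assuming you maintain your own list of ip:port proxies (the function name and list format are mine, not the script's):

function fetch_via_proxy($url, array $proxies) {
    // Pick a random proxy from whatever list you maintain, e.g. "1.2.3.4:8080".
    $proxy = $proxies[array_rand($proxies)];
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_PROXY, $proxy);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_setopt($ch, CURLOPT_TIMEOUT, 15);
    curl_setopt($ch, CURLOPT_USERAGENT, 'Mozilla/5.0');
    $html = curl_exec($ch);
    curl_close($ch);
    return $html;   // false on failure; the caller should retry with another proxy
}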
Any thoughts on how best to do this? My initial idea was to keep it limited to fewer than 50 people whom I would screen. Perhaps split out some of the functions so that I run/control the seeding of keywords and prevent duplication across users (rough sketch below). Seeding is a relatively low-volume task, and handling it centrally would let me set this up as a subscription service rather than a one-time purchase.
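To be clear about what I mean by controlling the seeding, something along these lines; a sketch only, assuming a central MySQL table of seed keywords that each subscriber's install claims from (table and column names are made up):

function claim_seed(PDO $db, $userId) {
    // Claim one unclaimed seed keyword so no two subscribers ever work the same seed.
    $db->beginTransaction();
    $stmt = $db->query("SELECT id, keyword FROM seeds WHERE claimed_by IS NULL LIMIT 1 FOR UPDATE");
    $seed = $stmt->fetch(PDO::FETCH_ASSOC);
    if ($seed) {
        $upd = $db->prepare("UPDATE seeds SET claimed_by = ?, claimed_at = NOW() WHERE id = ?");
        $upd->execute(array($userId, $seed['id']));
    }
    $db->commit();
    return $seed === false ? null : $seed;
}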
How much would this be worth? I'm pulling 10k keywords a day running it part time. Competitor evaluation needs to be way faster, but you're probably looking at one low-volume exact-match domain per day, plus a bunch of low-competition, medium-volume keywords without an exact match. I don't have a good feel for how far I could really push the script, because I haven't optimized it with multi_curl or put it on a decent server running 24/7. Not sure when throttling will kick in either.
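On the multi_curl point, the speedup would come from batching lookups in parallel instead of one at a time; a rough sketch of what I have in mind, not what's in the script today:

function fetch_batch(array $urls) {
    $mh = curl_multi_init();
    $handles = array();
    foreach ($urls as $key => $url) {
        $ch = curl_init($url);
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
        curl_setopt($ch, CURLOPT_TIMEOUT, 15);
        curl_multi_add_handle($mh, $ch);
        $handles[$key] = $ch;
    }
    // Run all handles until every transfer has finished.
    do {
        curl_multi_exec($mh, $running);
        curl_multi_select($mh);   // wait for activity instead of busy-looping
    } while ($running > 0);
    $results = array();
    foreach ($handles as $key => $ch) {
        $results[$key] = curl_multi_getcontent($ch);
        curl_multi_remove_handle($mh, $ch);
        curl_close($ch);
    }
    curl_multi_close($mh);
    return $results;   // keyed the same way as the input array
}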
Any ideas? I would only do it if it allowed me to stop doing everything else and just build tools. I also don't want to be dependent on something that could die tomorrow because too much volume gets a service shut down. Would you just keep it to yourself and keep cranking out sites?