The Ultimate SEO Research Tool - serpIQ.com

Just wanted to leave another follow-up review of this tool; all I have to say is..... "WOOOAAAAHAHAHAHOOWLLYSHEEEIIIIT!!!!!!!!!!!!!!!!"

I absolutely love this tool! When I first subscribed, I messed around and used it more or less just to research different niches we were looking at getting into and to check up on what the competition was doing. After finally getting caught up with SEO client work, I figured I'd start better dialing in our on-page SEO for a few of the different sites and pages we have ranking top 10, since I noticed we were lacking on some. Three of the four pages/sites I've made changes to and dialed in so far have seen substantial (HUGE!) gains since recaching.

If it wasn't for this tool, we would have completely overlooked a few of the on-page factors we were lacking on, the kind that are easily missed but mission critical. I can comfortably say that using serpIQ just to improve our on-page SEO has added an extra $150-$200/day to our bottom line. IMO, using serpIQ just to make damn sure your on-page is dead on is worth the money. The amount you could save on link building could be ENORMOUS depending on how many sites you are running and how well (or poorly) optimized your pages are.

This has become our new go-to tool and I'd be pissed if we lost it, lol. I would HIGHLY recommend checking it out to anyone serious about SEO; you'll be glad you did.

Thank you Dchuk for creating this tool and making it available to us. I know we got our membership way back in beta for half price or whatever; if you haven't raised us to full price yet, please bump us up to the $59/month. It is WELL worth it to me.

Cheers!

P.S. Looks like SEOmoz is "trying" to copy your tool....
 


OK, makes sense, fair enough. How about a pre-filter, then, to filter out searches with, say, less than 1,000 volume? That way we get 100 of the higher-volume ones. Or is it already working like that?

I will look into that, I think that's a good suggestion. It could prove tricky on more long tail research runs, but I'm sure a happy medium can be found.


AWESOME! I'm stoked you guys are killin it over there with serpIQ. Your feedback has always been great and well appreciated. Thanks for being on board!
 
Quick little update:

Still grinding on tightening things up and speeding things up wherever possible. Today I launched a blog at blog.serpiq.com, where I'll be officially covering new features and any background or explanation of how I make serpIQ do the things it does. I'd love it if you guys commented on it (and shared or tweeted it) to let me know what you want to read and see in the app.

Domain Age is coming in the next day or two; I'm just trying to flesh out all possible breaking points on it (when you're scraping thousands of URLs a day, normal things break easily). As soon as it's ready, I'll blog about it and post in here as well.
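The "normal things break easily" point above is the classic case for a retry-with-backoff wrapper around each scrape. Here's a minimal sketch of that pattern (entirely hypothetical; the thread doesn't show serpIQ's actual code, and `fetch_with_retries` is an illustrative name):

```python
import random
import time

def fetch_with_retries(fetch, url, max_attempts=4, base_delay=1.0):
    """Call fetch(url), retrying transient failures with exponential backoff.

    fetch is any callable that raises on failure (network error, ban page,
    parse error). Random jitter spreads retries out so thousands of URLs a
    day don't hammer the same endpoint in lockstep.
    """
    for attempt in range(1, max_attempts + 1):
        try:
            return fetch(url)
        except Exception:
            if attempt == max_attempts:
                raise  # give up: let the caller log or re-queue this URL
            # exponential backoff with jitter: ~1s, ~2s, ~4s, ...
            time.sleep(base_delay * 2 ** (attempt - 1) + random.random())
```

A failure that survives all attempts propagates to the caller, which keeps the "flaky URL" decision (skip, re-queue, alert) out of the fetch layer.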
 
Any plans on international keyword discovery like you did with the competition analyzer?
I guess that would be trickier because normally you select language and country.
 
Some suggestions...

What I need is to be able to dump in a large keyword list, ideally have it expanded out with keyword discovery, pull volume, pull backlinks, and dump a list of keywords that meet a certain criteria with their corresponding stats. +1 for dumping keywords into a table format rather than floating images.

The limitation here is obviously proxy volume.

So, rather than pull 50 different factors for each keyword, you could set up a filter to only scrape what we need scraped. You could do this in user settings or you could do it within the keyword entry (or hopefully a keyword.txt upload box). Just check off what you need scraped.

I only want results for the first or maybe second position. This cuts your queries by 80%. Next I only want number of backlinks, is the .com available, and search volume. There's another huge chunk of queries that we just eliminated.

Then rather than limiting to 50 keywords or whatever, you could limit by total queries. Alternatively, provide a time delay until the user can run another batch upload based on how many queries they are scraping * number of keywords.

Dump this list out and then if someone wants all the data, they can pull it for a much smaller subset of keywords which already meet most of their criteria.
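The query math in the suggestion above (check off only the factors you need, cap by total queries rather than keyword count) can be sketched as a simple budget check. All the names and per-factor costs below are hypothetical, just to illustrate the accounting:

```python
# Assumed per-keyword query cost for each "checked off" factor.
FACTOR_COSTS = {
    "top_result": 1,        # first/second position only, not the full top 10
    "backlinks": 1,
    "dotcom_available": 1,
    "search_volume": 1,
    "domain_age": 1,
}

def batch_query_cost(keywords, factors):
    """Total queries = number of keywords * sum of selected factor costs."""
    per_keyword = sum(FACTOR_COSTS[f] for f in factors)
    return len(keywords) * per_keyword

def fits_budget(keywords, factors, max_queries=500):
    """Accept the batch only if it fits under the total-query cap."""
    return batch_query_cost(keywords, factors, ) <= max_queries if False else batch_query_cost(keywords, factors) <= max_queries
```

With 100 keywords and three factors checked, that's 300 queries; budgeting by total queries instead of a flat 50-keyword limit is what lets a lean scrape handle a much bigger list.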
 

It really depends on what is allowed through the Adwords API to be honest. I know there is limited support for region queries, I just don't know to what extent. I'll take a look soon and see what's in the realm of possibility.


Hey, thanks for taking the time for such a detailed reply. Let me see if I can answer ya:

I built serpIQ in a very specific way as it is exactly what I want out of a keyword research tool. I am a big fan of Market Samurai but it's slow, inconsistent, and can't be run in parallel, all things which drove me crazy. So when I set out to build serpIQ, it came together in its current fashion because it's my dream SEO tool.

So obviously, not everyone has the same workflow as me. For instance, I don't use the workflow you just described above. While I do think automation is an important part of what we do, I don't do my research at that level of separation where I just dump in keywords and target whichever ones scored the highest in the end. I'm still evaluating things in my mind when using automation tools because I think that gut instinct is an important skill for SEOs to have. All the numbers can seem right, but you could still get this "feeling" that you need to keep digging for more.

Luckily, it appears many other people share this same workflow mentality as the system is processing thousands of keywords a day. This means two things: 1) there's at least a few other people in the world who have my same SEO mentality and 2) I'm not going to be able to make everyone happy with serpIQ.

So I hear what you're saying, and I think you have come up with a very interesting workflow, but unfortunately it just won't be making it into serpIQ any time soon. My goal isn't to build a Swiss Army knife; it's to build a dagger: something hyper-efficient at a very specific workflow, a workflow I strongly believe in. That won't satisfy everyone, but trying to satisfy everyone would cause fragmentation and create a jumbled mess of a tool.

What you want to do is very feasible, and I think you should pursue it with a coder. The limitations are not solely proxy based, it costs money to query Google's API and it costs money to do things like run Domain Age checks. It's a challenging problem to solve, but I think if that's your workflow, you need to pursue it as it is very automatable. Unfortunately, it's a workflow very separated from the current serpIQ workflow, so I won't be adding it to the system any time soon in the interest of keeping things focused and streamlined.

Hopefully that explains my own decision making process. I think even with your own workflow, there is still room for serpIQ in terms of being able to thoroughly dig into your final list of keywords to find the diamonds in the rough. If you have any other questions or feedback, please let me know.
 
Would really love retroactive keyword grouping, like an option on the keyword itself to add it to a group.

I keep forgetting to add them at time of creation.

Workflow-wise, I agree this fits in pretty nicely with how I do research. I spent a whole day going through alternative keywords to attack a niche with multiple domains.

SerpIQ really helped me keep in the flow of analyzing internally along with the data it presented to help find holes in the competition.

With some tools, they get in the way of that 'flow', whereas with serpIQ it was definitely a more intuitive experience over the course of a few hours.

Considering how much time I spent on that little project, I'd hate to imagine doing it with something like MS.
 

I totally understand. I've actually built tools to do most of what I need in the past, but the maintenance is just too insane. No one has anything that goes all the way. You've definitely got all the pieces here.
 

Glad you're happy with the workflow man!

+1 to the table. Hit the max keywords for the first time yesterday but it took me hours to input the 50 keywords.

I'll think about the table. The current setup for the keyword discoveries still gives me wood every time I use it so I'm biased, but I'll keep it in mind :)


Thanks for the understanding man. Many fine lines to walk when building a publicly available research tool. Your feedback is always appreciated.
 
Paying for the queries eats up your profit to the point where it's pretty much not worth it.
 

K, I'll get it set up tonight. I might move it to 11 instead so I can save some time to hit the water in the afternoon.

What kind of webinar? Do we get to bombard you with feature requests if we come?

Absolutely. The last one was basically a walkthrough of the system of how I use it in my own workflow, and then answering questions about how things work and where it's all headed.
 
Hey guys, here's the link for the webinar tomorrow at 11 AM. I know it's late notice, and quite frankly, if no one shows up I won't be upset. But even if it's just one or two of you, I'll be there. I'd love it if any of you could do voice chat, as it makes for a great exchange of information, but there's no obligation at all.

https://www3.gotomeeting.com/register/696092038

Hope to see you there!
 