Indexing Ninja

Status
Not open for further replies.


Hi, are you sure links2rss feeds work? I sent some in the past and I got

"Invalid rss feed url please correct the problem:
Sorry, but auto feed generators are not allowed."

It isn't working directly from links2rss.com. For example, links2rss.com/21242132132feed.xml will not work; you are correct that links2rss does not support it. What you need to do is upload the .xml file that the program saves to your hard drive to your own private domain. That private domain's feed URL is what you submit to feedlisting.com.
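For context, the .xml file the tool saves and you then upload is presumably a standard RSS 2.0 feed with one item per link. A minimal sketch of building such a file with Python's standard library (the URLs, titles, and filename here are placeholders, not anything the tool actually uses):

```python
import xml.etree.ElementTree as ET

def build_feed(links, title="My Feed", site="http://example.info"):
    """Build a minimal RSS 2.0 feed string with one <item> per link."""
    rss = ET.Element("rss", version="2.0")
    channel = ET.SubElement(rss, "channel")
    ET.SubElement(channel, "title").text = title
    ET.SubElement(channel, "link").text = site
    ET.SubElement(channel, "description").text = title
    for url in links:
        item = ET.SubElement(channel, "item")
        ET.SubElement(item, "title").text = url
        ET.SubElement(item, "link").text = url
    return ET.tostring(rss, encoding="unicode")

xml_text = build_feed(["http://example.com/page1", "http://example.com/page2"])
```

You would save `xml_text` to a file like `feed.xml`, upload it to your own domain, and submit that URL to feedlisting.com.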

crap, I'm so getting windows for one of my macs today...

Looking forward to having you join the team Otinsdale!

I'd like to buy it, but I'd like to know if we get lifetime updates.

Thanks

I addressed this in a PM, but I thought I'd let all future users know: yes, I do provide lifetime upgrades. This is a personal tool that I use daily, and you guys are going to benefit from my heavy reliance on it in the form of constant updates.
 
Hey mate!
Great product you have there. I just have a question:
Should we use an aged domain with good incoming links to host those XML files, or doesn't it matter? If it's an authority domain, wouldn't that help the indexing?
 
Yikes... I was afraid something like this was gonna get released. Based on what I know from a month of testing, this method cannot scale.

If you guys overload this tool & FeedListing with hundreds of 20-links-per-feed spams, it will likely stop working for everyone. Why? Google only catches new feeds while they appear in the "recently submitted feeds" list. If your unlucky feed gets pushed down by an aggressive spammer, it just won't get spidered in time and you've wasted your captcha money.

For everyone's sake, I suggest you use 200 links / feed. They still get indexed just fine.
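The 200-links-per-feed suggestion above is easy to automate by batching your link list before generating feeds. A hedged sketch (the chunk size is this poster's recommendation, not a documented limit of FeedListing or the tool):

```python
def chunk_links(links, per_feed=200):
    """Split a flat list of URLs into feed-sized batches."""
    return [links[i:i + per_feed] for i in range(0, len(links), per_feed)]

# 450 placeholder links -> 3 feeds of 200, 200, and 50 links
links = [f"http://example.com/page{n}" for n in range(450)]
feeds = chunk_links(links)
```

Each batch would then become one feed instead of dozens of 20-link feeds, which is the whole point of the advice: fewer feeds spend less time competing on the "recently submitted" list.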

I was about to buy but read this. What do you say?
 
Hey mate!
Great product you have there. I just have a question:
Should we use an aged domain with good incoming links to host those XML files, or doesn't it matter? If it's an authority domain, wouldn't that help the indexing?

I recommend just buying a cheap .info domain from GoDaddy. The main thing is that you're hosting it on your own domain, which avoids the errors you get when placing it on some other site.

I was about to buy but read this. What do you say?

I'm not going to sit here like most people and feed you bullshit just to sell my product. I don't think this will be an issue at all. This method of indexing links has been around for almost 3-4 months now; people are spamming the hell out of it and their links are still getting indexed. Google is crawling this site at such a ridiculous rate that I don't think it'll be an issue.

If you experience poor indexing with this method I can recommend two things to increase indexing.

1) Run your links through Scrapebox after a blast and resubmit those that have not been indexed. The program makes it much less of a pain in the ass than doing it manually, and it takes all of about 5-10 minutes depending on how many links need indexing.

2) Have your VA, or hire someone on oDesk for a dollar, run the program at 2-3 in the morning when most of the people who use the site are asleep, so Google will have much more time to index your feed.
 
Since most of you have used this program, maybe you can help me.

I am running IN and it seems I am doing something wrong. I've experienced several issues:

- Module 2 isn't working while it's in "play" mode after running Module 1, so I can't create a "numbers" file.
- Sometimes Module 2 works (it creates a "numbers" file), but Module 3 seems unable to use the Decaptcher option; it only runs with manual captcha solving. Okuma said that Decaptcher hasn't been running properly over the last few days, but I can log into my account just fine.
- Sometimes, after Module 3 submits feed10, instead of submitting feed11 it resubmits feed9 (which is already submitted) in the next step, and the process obviously stops.

Please advise.
 
Decaptcher.com is now back online and operable, so be on the lookout in your inbox by the end of Tuesday night for a patch to address the Decaptcher issue some of you were having.

Best,

Tyler
 
What do I need to make this work? Besides purchasing software that is. Any online accounts or such. Thanks. Very Interested.
 
Sorry to say I purchased it on Friday and still am not able to use it :/

Just sent you a PM. Outside of the current Decaptcher issue, which I'm looking into, I believe everything else should be a quick fix away.

What do I need to make this work? Besides purchasing software that is. Any online accounts or such. Thanks. Very Interested.

If you'd like to fully automate it, the only other thing you'll need is a decaptcher.com account; everything else is automated within the software.
 
I didn't understand how submitting the feed to Feedlisting.com helps with fast indexing. Does FeedListing work with Google?

What I understand is that your software builds feeds, right?

I need more information on how your software helps me with fast indexing. Thanks!
 
Hello,

Just wanted to follow up with how the program has been working for us.

Our first run, with no other indexing efforts, got 26/40 links indexed within 6 hours. Not bad at all. ;)

For the money, this is a great tool to incorporate into your indexing campaigns, from what I have seen.

Cheers!

How did you check if it's indexed?
 
How did you check if it's indexed?

I personally recommend Scrapebox: you just throw your links in and hit go, and within 30-60 seconds, depending on how many links you have and the quality of your proxies, you'll have your answer.
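What an index checker like Scrapebox effectively does is run a Google `site:` query for each URL and look for a result. A hedged sketch that just constructs those query URLs (automated fetching of Google results typically requires proxies and may violate their terms, so this only builds the URLs you would check):

```python
from urllib.parse import quote

def site_queries(urls):
    """Build the Google 'site:' search URL for each link to be checked."""
    return [
        "https://www.google.com/search?q=" + quote("site:" + u)
        for u in urls
    ]

queries = site_queries(["http://example.com/page1"])
```

If the `site:` search returns the page, the link is indexed; resubmit the ones that come back empty.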
 
Hmff, it's getting about as bad as it could. Today the program decided not to even copy the .xml files to my desktop, so I suppose it is now completely useless to me.
 
This method of indexing links has been around for almost 3-4 months now; people are spamming the hell out of it and their links are still getting indexed. Google is crawling this site at such a ridiculous rate that I don't think it'll be an issue.

I wish that were true, but it's not always. Google only spiders Home and Newest like crazy, and even then you sometimes have 5-15 minute windows when it doesn't. My own feeds regularly get pushed down by people spamming 30 different un-spun feeds when they could accomplish the same job with 3.

I wish you had had the foresight to hardcode your UBot with a minimum of 100 links per feed like I did. Instead, you've messed it up for everyone and we need to adapt. :)

Don't get me wrong, I don't want to bash your product. It's still a good bot, and I use the same approach every day. Just don't count on it 100%. If you really depend on indexation, go and check your feeds with a site:url search after they disappear from the FeedListing homepage.
 