Automating submissions to social bookmarking sites

Status
Not open for further replies.

webninja

New member
Jul 29, 2007
cyberspace
I'm thinking about automating the promotion of a few dozen sites through social bookmarking. Each site has around 30-100 pages I'd like to submit for deep-linking purposes.

Has anybody tried automating submissions with iMacros (the Firefox plugin)? Aside from CAPTCHAs, what else is there to worry about? Do social bookmarking sites change their HTML frequently to prevent automation? That would break the macros. I know about SocialPoster, BookmarkingDemon, and SocialMarker, but they are either too limited for my needs, unreliable, or too slow.

Say I wanted to write macros for 100+ social bookmarking sites: how much maintenance work would there be in keeping the macros current? Has anybody tried that? Would it be better to code the whole thing in a regular scripting language with cURL?
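The scripting-plus-cURL route can be sketched in Python with only the standard library: build one POST request per inner page instead of driving a browser macro. The endpoint, URL, and form field names below are hypothetical placeholders, not any real bookmarking site's API:

```python
from urllib.parse import urlencode
from urllib.request import Request

def build_submission(endpoint, page_url, title, tags):
    """Build a POST request for one deep-link submission.

    The field names ('url', 'title', 'tags') are assumptions;
    each real site would need its own field mapping.
    """
    payload = urlencode({"url": page_url, "title": title, "tags": ",".join(tags)})
    return Request(endpoint, data=payload.encode("utf-8"), method="POST")

# One request per inner page of the site being promoted (hypothetical URLs).
pages = [
    ("http://example.com/page1", "Page 1"),
    ("http://example.com/page2", "Page 2"),
]
requests_to_send = [
    build_submission("http://bookmarksite.example/submit", u, t, ["demo"])
    for u, t in pages
]
```

Sending each request with `urllib.request.urlopen` (plus per-site login and CAPTCHA handling) would be the real work; the sketch only shows how the submissions could be generated in bulk.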
 



You can do it either way, but in my experience most sites don't change their HTML frequently. They use a system, and when that system becomes outdated (sometimes six months, sometimes a year, sometimes longer) they'll upgrade and change the source on you, but it's not frequent, so your macros should last a decent amount of time.
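One way to keep that maintenance manageable is to check that a site's form still contains the fields your macro expects before submitting, so a redesign fails loudly instead of silently. A minimal sketch using the standard-library HTML parser; the field names are assumptions:

```python
from html.parser import HTMLParser

class FormFieldScanner(HTMLParser):
    """Collect the 'name' attributes of <input> and <textarea> tags."""
    def __init__(self):
        super().__init__()
        self.fields = set()

    def handle_starttag(self, tag, attrs):
        if tag in ("input", "textarea"):
            name = dict(attrs).get("name")
            if name:
                self.fields.add(name)

def missing_fields(html, expected):
    """Return the expected field names that are absent from the page."""
    scanner = FormFieldScanner()
    scanner.feed(html)
    return set(expected) - scanner.fields

# Example: a snapshot of a (hypothetical) submission form.
snapshot = ('<form><input name="url"><input name="title">'
            '<textarea name="tags"></textarea></form>')
missing = missing_fields(snapshot, ["url", "title", "tags"])
```

If `missing` is non-empty, the site has likely changed its markup and the corresponding macro needs updating before it silently posts garbage.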
 

Hello Webninja,

The top social bookmarking sites use email verification, and a few also use image verification (CAPTCHAs), to block submissions from automated scripts.

IMO, for inner pages go for both deep-link directory/blog submissions and social bookmarking. If you can write the code yourself, great; if you'd rather go with manual submissions, drop me a PM and I'll send you a quote. Look at my signature if you're interested.

MDSandB
 
Try BookmarkingDemon. I've been using it that way for a long time now. It's regularly updated, etc.

I forget how much it was, but I know I've gotten my money out of it. I've generated thousands of social bookmarking accounts and submitted probably 500,000 pages by now.

Its only real issue is that the proxy support built into it just doesn't work. I haven't had a problem without it, though.
 
Well, I would suggest not going for automated submissions at all. It's not good for a site to get hundreds of links overnight from automated submissions. If your site has a blog, update it constantly with good posts; if you don't have a blog, I suggest starting one right away. Indexing is quicker than ever these days, and much of that comes down to how frequently the content on your site changes. Blogs are at an advantage here compared to static-page websites.

You can do a little blog submission as well, and if you also update the blog content frequently you can expect quicker crawling, which results in quicker indexing.
 
1. Spy on your competitors using SEO Elite.
2. Find blogs that have links from directories, bookmarks, and other blogs.
3. Copy them.
4. Repeat. Voilà! :rasta:
 