Hey Brar, when the fucking fuck is there going to be a uBot for Linux? Do I have to pay to get it done? Or should I just go buy a Win laptop to do nothing but run uBot all day?
Frank
@puravida: I meant how much of a pain in the ass is it to get gnome-rdp running, and/or any special steps needed for uBot in particular.
@emp: I know, I know. Run virtual Windows, code my own or STFU. Would be nice if more people in IM would embrace Linux though.
Frank
After the State of Maryland decided that it didn't like me allegedly driving 84mph in a 55mph zone, I was assigned Driver Improvement classes.
Just as a big fuck you to the state of Maryland, I found that my courses are offered online, and I now have a uBot program that may or may not take my Driver Improvement Program classes for me, and may or may not keep retaking the tests until it gets a good enough score to pass.
Here's what I automate --
1. Domain Name Research
2. Market Niche Research
3. Finding Sites / People in Niches
4. Keyword Research
5. Content Scraping (quick Python sketch a bit further down)
6. Sorting out Data from Various Sources
7. Blog Comment Posting
8. Some random SEO shit
9. Twitter!! It's Made for Bots.
10. Setting Up WordPress Sites (+ autoblogs)
11. Account Creation on Social Sites
12. Social Bookmarking, bitches!
That is the stuff I actively automate. I also automate a lot of personal shit (like sending messages on CouchSurfing and downloading pictures / videos from an unspecified site).
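For anyone curious, the content scraping piece (#5) doesn't actually need uBot; in Python it boils down to something like this. A rough sketch only -- the URL and the CSS selector are made up, so swap in whatever you're actually targeting:

```python
# Minimal scrape: pull the post titles off a page.
# URL and the CSS class are placeholders.
import requests
from bs4 import BeautifulSoup

resp = requests.get("http://example.com/blog", timeout=15)
resp.raise_for_status()

soup = BeautifulSoup(resp.text, "html.parser")
titles = [h.get_text(strip=True) for h in soup.select("h2.post-title")]

for title in titles:
    print(title)
```

From there it's just a question of where you dump the data (CSV, database, whatever).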
Let's see what all you guys find yourselves automating every single day.
1. Server deployment. Getting a new customized server online takes under 15 minutes.
2. Stats and Traffic. Automatically pulling in stats and correlating them to the traffic numbers to get a quick snapshot of what's working and what's not.
3. Backups and Source Code. Everything is backed up to the Amazon S3 cloud, and the source code goes to GitHub. I can work from any computer in any country anywhere in the world. (S3 upload sketch after this list.)
4. Market Intelligence. Scrapers never sleep. We know what you're doing even before you do.
5. Brushing my teeth.
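The S3 piece (#3) is basically a one-liner once credentials are set up. Rough Python sketch with boto3 below; the bucket name and file paths are placeholders, and the scheduling side (cron or whatever) is left out:

```python
# Push a local backup file to S3. Assumes AWS credentials are already
# configured (env vars or ~/.aws/credentials). Bucket and paths are placeholders.
import datetime
import boto3

s3 = boto3.client("s3")

backup_file = "/var/backups/site-dump.tar.gz"          # whatever your dump script writes
key = "backups/site-dump-%s.tar.gz" % datetime.date.today().isoformat()

s3.upload_file(backup_file, "my-backup-bucket", key)   # hypothetical bucket name
print("uploaded", key)
```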
1. kwd research (sketch below)
2. ranks
3. serps
4. content scraping
5. revealing networks
6. spamming
7. ... need to check my svn repository...
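For the kwd research line, one cheap trick is pulling autocomplete suggestions for a seed term. Sketch below -- I'm assuming Google's unofficial suggest endpoint still answers client=firefox requests with JSON, which it has for years, but it's not a supported API and can change:

```python
# Pull autocomplete suggestions for a seed keyword.
# The endpoint and the client=firefox JSON format are assumptions (unofficial).
import requests

def suggestions(seed):
    resp = requests.get(
        "https://suggestqueries.google.com/complete/search",
        params={"client": "firefox", "q": seed},
        timeout=10,
    )
    resp.raise_for_status()
    return resp.json()[1]  # response looks like [query, [suggestion, ...]]

for kw in suggestions("buy blue widgets"):
    print(kw)
```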
I'm with emp on this one; everything I've posted here has been pulled from a .txt file.
SERP Tracking (rank-check sketch after this list)
Keyword Research
On Site Analysis
Data Scraping & Collection
Server Setup
Link Building
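SERP tracking boiled down is just "fetch the results page, find your domain, note the position." A very rough sketch -- Google's markup changes constantly and it blocks plain scrapers fast, so treat the parsing here as a placeholder and the keyword/domain as made up:

```python
# Hypothetical rank check: where does a domain show up in Google's top 100
# for a keyword? The link-filtering below is a placeholder, not reliable parsing.
import requests
from bs4 import BeautifulSoup

def rank_for(keyword, domain, num=100):
    resp = requests.get(
        "https://www.google.com/search",
        params={"q": keyword, "num": num},
        headers={"User-Agent": "Mozilla/5.0"},  # bare requests get blocked even faster
        timeout=15,
    )
    soup = BeautifulSoup(resp.text, "html.parser")
    position = 0
    for a in soup.find_all("a", href=True):
        href = a["href"]
        if href.startswith("http") and "google." not in href:
            position += 1
            if domain in href:
                return position
    return None  # not in the top `num`, or the parsing missed it

print(rank_for("blue widgets", "example.com"))
```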
ad uploading
pic scraping (sketch below)
pic resizing
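pic scraping is maybe ten lines of Python if you don't need anything fancy. Sketch below; the URL is a placeholder and there's no throttling, dedup, or real error handling:

```python
# Grab every <img> off a page into a local folder. The page URL is a placeholder.
import os
from urllib.parse import urljoin

import requests
from bs4 import BeautifulSoup

page = "http://example.com/gallery"          # placeholder
out_dir = "scraped_pics"
os.makedirs(out_dir, exist_ok=True)

soup = BeautifulSoup(requests.get(page, timeout=15).text, "html.parser")
for img in soup.find_all("img", src=True):
    src = urljoin(page, img["src"])
    if not src.startswith("http"):
        continue                             # skip data: URIs and the like
    name = os.path.basename(src.split("?")[0]) or "image.jpg"
    data = requests.get(src, timeout=15).content
    with open(os.path.join(out_dir, name), "wb") as fh:
        fh.write(data)
    print("saved", name)
```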
I do the same thing with Python for images I use on datafeed sites. Maybe I'm just extremely paranoid, but I try to de-dupe everything on datafeed sites, and image file name, alt text and size all get that treatment.

I actually automate pic resizing. It's one of the few things I have only just started to automate, simply because on one site I maintain I have to upload 300-1000 images roughly once a week. I had actually stopped doing this because it was too time-consuming.
Now, using WinAutomation, it asks me for a folder, then it takes every image and resizes down to a max of 600px in either direction, then uploads all those pictures, builds a gallery and all necessary links. It takes a couple of minutes and apart from 2 prompts right at the start, it needs no input from me at all.
To think a few weeks ago I was scared of automation...
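Since Python already came up for images: the resize step itself (cap at 600px in either direction, keep the aspect ratio) is only a few lines with Pillow. Sketch below -- the folder name is made up and the upload/gallery part is left out:

```python
# Resize every image in a folder so neither side exceeds 600px, in place,
# keeping aspect ratio. Folder path is a placeholder; upload step not shown.
import os
from PIL import Image

folder = "to_upload"                      # placeholder folder
MAX_SIDE = 600

for name in os.listdir(folder):
    path = os.path.join(folder, name)
    try:
        img = Image.open(path)
    except (IOError, OSError):
        continue                          # skip anything that isn't an image
    img.thumbnail((MAX_SIDE, MAX_SIDE))   # shrinks in place, keeps aspect ratio
    img.save(path)
    print("resized", name, img.size)
```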