Please Share Your Automation & Business Process Documentation Best Practices...

riddarhusetgal
First,
I'd like to give a shout-out to whoever recommended getdropbox; it has saved me so much time and made me so much more productive with my team!

I was listening to a great audio yesterday (Saturday), and the guy, a business consultant (and a successful one), recommended that VIRTUALLY EVERYTHING you do in your business that is done more than TWICE is a candidate for

a) automation

and/or

b) business process documentation (i.e. you write it down in step-by-step formal documentation so that it can be outsourced, so that you can train someone else to do it in case you aren't able to, or so that you can focus on higher-ROI activities)

This makes so much sense to me. It can be a pain in the ass to put into practice, though...

Over the past few weeks I have bitten the bullet and done it, though, and I find myself being more productive and having more free time if I want it as well.


At any rate, I am a n00b at stuff like iMacros for Firefox and a few other tools, but I'm very open to learning from others.

Oh, one of the other things he mentioned was the following:

Spend a whole week writing down everything you do - I mean everything. Write it down so you can look at it from afar (so to speak), or have a friend, colleague, etc. look over it with you. Then sit back and see which of these activities have the HIGHEST ROI, which can be automated, which need to be formally documented as processes, etc. Of course, the perceptive among you have already concluded that by doing this you can see which activities are among the 20% that produce 80% of the results, spend more time on those, and thus profit more and enhance your productivity...


I'd be very curious to hear about the automation tools a lot of you guys use as well as your best practices for business process documentation.

Mine are pretty basic: I broke down each stage of the startup/biz launch process, from keyword research to site launch/promotion, then I made an Excel sheet of each tool I used/use in the process. My short-term/90-day goal is to have all this stuff formalized so I can truly create an efficiently running SYSTEM - i.e. something that runs almost completely independently of me. By then I should be so productive, I can even have my assistant read WF posts out loud and respond for me, haha...

Looking forward to your ideas and teaching points

tx
 


One of the main things I automate that helps me is a CMS for all of the niches. It automatically gets some links on each new page, etc. My goal was that 1 post a day should equal 1,500 links/month. It was a good way to build something up. :)
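To make the auto-linking idea concrete, here's a minimal sketch of one way it could work, assuming the CMS is WordPress: on publish, announce the new URL to a few ping services. The hook and the WP functions are real, but the function name and the endpoint list are made up for illustration, not the poster's actual setup.

PHP:
// Hedged sketch: ping a few services whenever a post goes live.
function auto_ping_new_post($post_id)
{
    $url  = get_permalink($post_id);
    $body = '<?xml version="1.0"?><methodCall>'
          . '<methodName>weblogUpdates.ping</methodName><params>'
          . '<param><value>' . htmlspecialchars(get_the_title($post_id)) . '</value></param>'
          . '<param><value>' . htmlspecialchars($url) . '</value></param>'
          . '</params></methodCall>';
    $endpoints = array(
        'http://rpc.pingomatic.com/', // aggregator that fans out to many services
    );
    foreach ($endpoints as $endpoint) {
        $xc = curl_init($endpoint);
        curl_setopt($xc, CURLOPT_POST, true);
        curl_setopt($xc, CURLOPT_POSTFIELDS, $body);
        curl_setopt($xc, CURLOPT_HTTPHEADER, array('Content-Type: text/xml'));
        curl_setopt($xc, CURLOPT_RETURNTRANSFER, true);
        curl_exec($xc);
        curl_close($xc);
    }
}
add_action('publish_post', 'auto_ping_new_post');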

Edit: completely forgot to say that everything you put down makes a LOT of sense as well. Touché, and looking at dropbox now :p
 
Anything and everything I do by hand that doesn't directly require my judgement gets automated. Initially it was a massive time drain, but if you make all the code into re-usable classes, it eventually becomes fast as hell.
No matter which project I'm doing, if I have to scrape, I have a scraper library that probably already handles it. Like:
PHP:
// scrape several sources for a keyword and collect the result links
$s = new Scraper($keyword, $limit);
$s->scrapeGoogle();
$s->scrapeYahoo();
$s->scrapeMSN();
$s->scrapeDigg();
$links = $s->getLinks();

// download every page in parallel and pull out the data
$sp = new SiteParser($links);
$sp->downloadAll($threads);
$data = $sp->getData();
$javascript = $sp->getJavaScriptLocations();

// grab any captcha images found on each downloaded page
for ($i = 0; $i < sizeof($data); $i++) {
    $captcha = $sp->findCaptcha("image.php", $data[$i]);
    $sp->saveFile($captcha, "/images/");
}
Once you've got a decent framework, it's all fast as a mother fucker.
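For anyone wondering what a downloadAll($threads) could look like inside, curl_multi is the standard PHP way to fetch a batch of URLs in parallel. This is a guess at the internals, not the poster's actual SiteParser code; the function and variable names are illustrative.

PHP:
// Hedged sketch of parallel downloading with curl_multi.
function download_all(array $links, $threads = 10)
{
    $results = array();
    foreach (array_chunk($links, $threads) as $batch) {
        $mh = curl_multi_init();
        $handles = array();
        foreach ($batch as $url) {
            $ch = curl_init($url);
            curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
            curl_setopt($ch, CURLOPT_TIMEOUT, 15);
            curl_multi_add_handle($mh, $ch);
            $handles[$url] = $ch;
        }
        // drive all the handles in this batch until every transfer is done
        do {
            curl_multi_exec($mh, $running);
            curl_multi_select($mh);
        } while ($running > 0);
        foreach ($handles as $url => $ch) {
            $results[$url] = curl_multi_getcontent($ch);
            curl_multi_remove_handle($mh, $ch);
            curl_close($ch);
        }
        curl_multi_close($mh);
    }
    return $results; // page HTML keyed by URL
}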
 
Anything and everything I do by hand that doesn't directly require my judgement gets automated. [...]

I think I just exploded in my pants thinking about the backend to that.
 
Anything and everything I do by hand that doesn't directly require my judgement gets automated. [...]

Wow! Cool. Keep 'em coming. I guess the issue is, if you are a non-coder, how do you go about getting it coded? Would you recommend writing it down (as mentioned above) and then just hiring out the job?
 
Anything and everything I do by hand that doesn't directly require my judgement gets automated. [...]

you should sell that shit man. I bet you'd get LOADS of takers.
 
Wow! Cool. Keep 'em coming. I guess the issue is, if you are a non-coder, how do you go about getting it coded? Would you recommend writing it down (as mentioned above) and then just hiring out the job?
It'd probably be tricky to hire out, but not impossible. The code itself isn't too hard; the issue is more finding a coder who can make it truly re-usable.
I'll try to put out the kind of class structure I'm using (this may be slightly different from up top because I have a few versions floating around); there's a rough skeleton sketched after the list.

  • Class Scraper
    • Class variables: $keyword, $xc (the cURL connection), $links
    • Functions:
      • scrapeYahoo($limit)
      • scrapeGoogle($limit)
      • scrapeMSN($limit)
      • scrapeYellowPages($state)
      • scrapeDigg($limit)
      • scrapeDiggSubmitters($limit)
      • close() // closes $xc after you're done
      • setKeyword($keyword) // changes the keyword for re-use
      • getLinks() // returns $this->links
    • Requirements: All links go into the array $this->links and are tested to make sure they're not duplicates. Must ignore local links (i.e. no google.com links when you search google.com), must not scrape pages past the number of available results, and must be able to set a delay in between getting new pages from the results (so you don't get banned).
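To turn that spec into something a hired coder could start from, here's a minimal skeleton under the same names. The Google URL and the link regex are illustrative assumptions, not tested scraping code; SERP markup changes constantly, so treat both as placeholders.

PHP:
// Minimal skeleton of the class spec above. A starting point, not the
// poster's actual library.
class Scraper
{
    private $keyword;
    private $xc;               // the cURL connection, re-used across requests
    private $links = array();

    public function __construct($keyword)
    {
        $this->keyword = $keyword;
        $this->xc = curl_init();
        curl_setopt($this->xc, CURLOPT_RETURNTRANSFER, true);
        curl_setopt($this->xc, CURLOPT_FOLLOWLOCATION, true);
    }

    public function scrapeGoogle($limit, $delay = 2)
    {
        for ($page = 0; $page * 10 < $limit; $page++) {
            $url = 'http://www.google.com/search?q=' . urlencode($this->keyword)
                 . '&start=' . ($page * 10);
            curl_setopt($this->xc, CURLOPT_URL, $url);
            $html = curl_exec($this->xc);
            if ($html === false) break;               // fetch failed, stop
            if (!preg_match_all('/<a[^>]+href=["\'](https?:\/\/[^"\']+)["\']/i',
                                $html, $m)) {
                break;                                 // no more results
            }
            foreach ($m[1] as $link) {
                // skip local links and duplicates, per the requirements
                if (strpos($link, 'google.com') !== false) continue;
                if (!in_array($link, $this->links)) $this->links[] = $link;
            }
            sleep($delay);                             // so you don't get banned
        }
    }

    public function setKeyword($keyword) { $this->keyword = $keyword; }
    public function getLinks()           { return $this->links; }
    public function close()              { curl_close($this->xc); }
}

// usage, matching the snippet earlier in the thread:
// $s = new Scraper('blue widgets'); $s->scrapeGoogle(50);
// $links = $s->getLinks(); $s->close();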
 
PHP:
scrapeDigg($limit)
^^ This just broke; you'll want to switch the regex to take the title instead of the href.

PHP:
'/<\s*a[^<>]*?title=[\'"]?([^\s<>\'"]*)[\'"]?[^<>]*>(.*?)<\/a>/si'
^^ Works for me, but it's kind of a monster.

Of course, you're the master and probably knew that 24 hours before the DiggBar was released.
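If anyone wants to sanity-check that pattern before dropping it in, a quick test like this shows what it captures (the sample anchor is made up):

PHP:
// quick test of the regex above against a made-up Digg-style anchor
$re   = '/<\s*a[^<>]*?title=[\'"]?([^\s<>\'"]*)[\'"]?[^<>]*>(.*?)<\/a>/si';
$html = '<a href="/story/123" title="http://example.com/page">Some story</a>';
if (preg_match_all($re, $html, $m)) {
    print_r($m[1]); // the title attributes (the destination URLs)
    print_r($m[2]); // the anchor text
}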

you should sell that shit man. I bet you'd get LOADS of takers.

uuh, please don't
 
This just broke; you'll want to switch the regex to take the title instead of the href. [...]
Yup, but I appreciate the thought :D. Actually, I like that regex better than the one I'm using.
uuh, please don't
I could say the same thing to you ;) To this day I debate whether to shy away from releasing tools, or just undercut everyone else who does and put them out of biz.
 
To this day I debate whether to shy away from releasing tools, or just undercut everyone else who does and put them out of biz.

Well, at least start working on an epic 3000-post boobs thread for us code-suffering bastards.
 
I was listening to a great audio yesterday (Saturday), and the guy, a business consultant (and a successful one), recommended that VIRTUALLY EVERYTHING you do in your business that is done more than TWICE is a candidate for [...]
can you link to the audio?
 
I could say the same thing to you ;) To this day I debate whether to shy away from releasing tools, or just undercut everyone else who does and put them out of biz.



Actually, I started giving thought to posting some of my scrapers right after I hit submit and decided I was being a little too stingy ... so much so that I decided to start a thread for it.

http://www.wickedfire.com/design-development-programming/56127-wf-php-functions-war-chest.html

How's that for dangling a carrot out there for you?
 
Great idea!

I am gonna see if I can digg out some of dem old scripts I hacked together.

::emp::
 
I'll try to put out the kind of class structure I'm using (this may be slightly different from up top because I have a few versions floating around) [...]
That is a seriously useful post, +rep. My prime objective has always been to automate and replicate, and this gives me a few ideas on how to organize all my scrapers, which are just piling up in a million different disorganized PHP files.
 