Web-Based Scraper/Control Panel/Filtration System

XRMseo

Hey guys,

So we have a pretty decent automated scraping solution set up. I was curious whether there's any demand for a web-based control panel type of deal you could run your scrapes from. We'd probably bill it on "server time" as a form of credits (which, to be honest, should be fairly cheap).

We've got all the standard filtration options, of course: unlimited footprints, ignore filters (to skip pages that don't contain given footprints/strings), and preset categories of keywords. (Largely we use dictionary lists, but we've got the typical categories/niches covered fairly well too.)
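To give you an idea, the page-level filtering boils down to something like this (a rough Python sketch; the footprint and ignore strings are made-up examples, not our actual lists):

```python
# Keep a page only if it matches at least one footprint and none of
# the ignore strings. All strings here are placeholder examples.
FOOTPRINTS = ['powered by wordpress', 'leave a reply']
IGNORE = ['comments are closed', 'registration is disabled']

def passes_filters(html: str) -> bool:
    text = html.lower()
    if not any(fp in text for fp in FOOTPRINTS):
        return False  # no footprint matched: skip the page
    if any(ig in text for ig in IGNORE):
        return False  # an ignore-filter matched: skip the page
    return True
```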

Obviously we've got SOME analytics in there, so if you want to filter by outbound link count, PR, whatever, that's a given.
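Same deal for the metrics. A sketch of the outbound-link-count check, for instance, could look like the following (the 50-link limit is an arbitrary example, not a real default):

```python
from html.parser import HTMLParser
from urllib.parse import urlparse

class OutboundLinkCounter(HTMLParser):
    """Count <a href=...> links that point off the page's own domain."""
    def __init__(self, own_domain: str):
        super().__init__()
        self.own_domain = own_domain
        self.outbound = 0

    def handle_starttag(self, tag, attrs):
        if tag != 'a':
            return
        href = dict(attrs).get('href') or ''
        host = urlparse(href).netloc
        if host and host != self.own_domain:
            self.outbound += 1

def under_obl_limit(html: str, own_domain: str, limit: int = 50) -> bool:
    counter = OutboundLinkCounter(own_domain)
    counter.feed(html)
    return counter.outbound <= limit
```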

I'm thinking we'd tie an API into the finished product so you could download the latest additions to whatever scrapes you were running and feed them straight into your various submission/posting software.

We'd just need to log your last download and your position in the list, and serve the "new links" separately from the old ones (so you're not wasting resources downloading a massive file every time when you've only got a few hundred thousand new links).
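To make that concrete, here's a rough sketch of what a client pull could look like on your end. The base URL, endpoint, and "offset" parameter are all hypothetical, just illustrating the "log your position, fetch only what's new" scheme:

```python
import requests

API = 'https://example-panel.example/api'  # placeholder base URL
KEY = 'your-api-key'                       # placeholder credential

def fetch_new_links(scrape_id: str, state_file: str = 'offset.txt') -> list[str]:
    """Pull only the links added since our last download.

    Assumes the server tracks each list by position and accepts
    an 'offset' parameter -- the hypothetical scheme above.
    """
    try:
        offset = int(open(state_file).read())
    except (OSError, ValueError):
        offset = 0  # first run: start from the top of the list

    resp = requests.get(f'{API}/scrapes/{scrape_id}/links',
                        params={'offset': offset, 'key': KEY},
                        timeout=30)
    resp.raise_for_status()
    links = resp.text.splitlines()

    # Persist our new position so the next run skips what we already have.
    with open(state_file, 'w') as f:
        f.write(str(offset + len(links)))
    return links
```

From there you'd just dump the returned links into whatever file or queue your posting software reads from.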

And yes, there would be a big fat "Download Link List" button for people who just want to grab shit through the browser.

Thoughts? I'm not sure how much interest something like this would generate, since any moron with $50 can grab Scrapebox (which, TBH, sucks at any kind of bulk scraping), and the rest of you have your own solutions already, of course.
 


tubes.io

Also, I see talk of this 'scrapebox' and had a look at the GUI. I wouldn't touch it. Two weeks spent learning to program in Python will achieve similar results.
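To illustrate, a toy Python scraper that fetches a page and pulls out links when a footprint matches is only a handful of lines (the URL and footprint below are placeholders; a real bulk scraper would add proxies, threading, and a search-engine front end on top):

```python
import re
import requests

def scrape_links(url: str, footprint: str) -> list[str]:
    """Fetch one page; return its hrefs if the footprint appears."""
    html = requests.get(url, timeout=10).text
    if footprint.lower() not in html.lower():
        return []
    return re.findall(r'href=[\'"]?([^\'" >]+)', html)

# Example: print(scrape_links('https://example.com', 'Example Domain'))
```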

As far as I can see, nobody has made a decent GUI for a parser yet, so when that day comes, glory be upon them.