Scrub large suppression file against database

JavierS

New member
Mar 11, 2009
I need to import a suppression file containing emails or MD5 hashes and scrub it against my database to find matches. That's no problem for small files, but I've come across suppression files that are a couple of gigs and contain tens of millions of records. Doing it all in memory is impossible with a 4 gig file. What other options do I have? Import the entire file into the database, cross-check, and spit out a new file? I could use a service like scrubbly, but that's too much manual work. I'd like to do all of this myself and import the suppression file directly into my ESP in one shot.

Write a more efficient script - don't read the entire file into memory at once. If I've misunderstood and that's not possible, why not spin up a cloud instance with plenty of RAM?
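
A minimal sketch of the streaming idea in Python, assuming a MySQL database with a subscribers table that has an indexed email column (the table, column, file, and connection details are all placeholders - swap in your own). It reads the suppression file a batch at a time and checks each batch against the database, so the whole file is never in memory:

import itertools
import MySQLdb  # any DB-API driver works the same way

BATCH = 10000  # rows checked per round trip; tune to taste

db = MySQLdb.connect(host="localhost", user="user", passwd="secret", db="esp")
cur = db.cursor()

with open("suppression.txt") as f:
    while True:
        # pull the next BATCH lines; the file itself is only ever read lazily
        batch = [line.strip().lower() for line in itertools.islice(f, BATCH)]
        if not batch:
            break
        placeholders = ",".join(["%s"] * len(batch))
        cur.execute(
            "SELECT email FROM subscribers WHERE email IN (%s)" % placeholders,
            batch)
        for (email,) in cur:
            print(email)  # an address in your list that the file says to suppress

With 10k-row batches, a 4 gig file is just a few thousand cheap indexed lookups instead of one giant in-memory set.
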
Import the file into a new table and add an index, then do the scrubbing. (Do the import from the command line.)
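
A rough sketch of that staging-table approach, again in Python for consistency (the suppress and subscribers table names and the file path are assumptions; the same statements can be run by hand from the mysql command-line client if you prefer):

import MySQLdb

db = MySQLdb.connect(host="localhost", user="user", passwd="secret",
                     db="esp", local_infile=1)
cur = db.cursor()

# stage the file in its own table, then index AFTER the bulk load --
# loading into an unindexed table is much faster
cur.execute("CREATE TABLE suppress (email VARCHAR(255))")
cur.execute("LOAD DATA LOCAL INFILE 'suppression.txt' INTO TABLE suppress (email)")
cur.execute("CREATE INDEX idx_suppress_email ON suppress (email)")

# the scrub is now a single indexed join
cur.execute("SELECT s.email FROM subscribers s "
            "JOIN suppress p ON p.email = s.email")
for (email,) in cur:
    print(email)

Once the matches are handled you can DROP the suppress table, or keep it around if you scrub against the same list regularly.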