The client wanted to collect information on as many lapsed and expiring domains as possible - the domain type, its ownership status, its reputation, its backlinks and its Google ranking - in order to evaluate the domains for acquisition and potential SEO value. Existing search services limited the client to approximately 1,000 domains per day.
Because of the wide variety of potential data, we moved from a traditional SQL solution to a NoSQL paradigm, and to handle the massive data volume in modules requiring instant responses we built a dedicated Redis-MongoDB layer that serves both fast-expiring and stabilized data without any reduction in search speed.
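For illustration, here is a minimal sketch of the hybrid-store idea, assuming the standard `redis-py` and `pymongo` client libraries. The names, TTL value, and connection details are placeholders, not the production configuration:

```python
import json
import redis
from pymongo import MongoClient

# Hot cache: fast-expiring data (e.g. live availability results) lives in Redis.
r = redis.Redis(host="localhost", port=6379, decode_responses=True)
# Durable store: stabilized domain records live in MongoDB.
mongo = MongoClient("mongodb://localhost:27017")
domains = mongo["grabber"]["domains"]

HOT_TTL = 3600  # seconds a volatile record stays in Redis before expiring

def save_domain(record: dict) -> None:
    """Write-through: persist to MongoDB, keep a TTL-bound copy in Redis."""
    domains.replace_one({"domain": record["domain"]}, record, upsert=True)
    r.setex(record["domain"], HOT_TTL, json.dumps(record))

def get_domain(name: str) -> dict | None:
    """Serve from Redis when possible; fall back to MongoDB and re-warm."""
    cached = r.get(name)
    if cached:
        return json.loads(cached)
    record = domains.find_one({"domain": name}, {"_id": 0})
    if record:
        r.setex(name, HOT_TTL, json.dumps(record))
    return record
```

The write-through pattern keeps lookups fast for hot records while MongoDB remains the durable source of truth once a record stabilizes.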
Our domain-information "grabber" now performs some 200,000 domain searches per day - with no hard ceiling on throughput - and assembles the relevant information for the client in a user-friendly database.
- Handles millions of data rows
- Finds and parses data from a full range of sources
- Accesses Google data through a proxy identity (a sketch follows this list)
- Replicates data across several servers for redundancy and stability
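As a rough sketch of the proxy-routing idea behind the Google lookups, assuming the `requests` library and a hypothetical proxy pool (the endpoint URLs and headers below are placeholders, not the production setup):

```python
import random
import requests

# Hypothetical proxy pool; the real service rotates through many such endpoints.
PROXIES = [
    "http://proxy1.example.com:8080",
    "http://proxy2.example.com:8080",
]

def fetch_via_proxy(url: str) -> requests.Response:
    """Route each request through a randomly chosen proxy identity."""
    proxy = random.choice(PROXIES)
    return requests.get(
        url,
        proxies={"http": proxy, "https": proxy},
        headers={"User-Agent": "Mozilla/5.0"},  # present a browser-like identity
        timeout=10,
    )
```

Rotating the outbound identity per request is what lets the grabber spread a high query volume across many addresses instead of hitting per-client rate limits.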