The problem consists of two parts:
1. Collect a list of domains and information about them
2. Analyze the pages
Let's dive in:
1. Collect a list of domains and information about them
Well, no such database exists in the wild; all you really have to start from is an understanding of how DNS works. You could, of course, try to enumerate every possible domain name in order, but even with an unlimited supply of proxies, walking through all the combinations would take forever, and the underlying information is constantly changing anyway. The other route is to become Google. You can imagine what either approach costs: trillions of dollars.
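To get a feel for the scale, here is a minimal Python sketch (my own illustration, not something from the original question). It just counts and generates candidate names from letters and digits, ignoring hyphens and anything longer than a few characters, and shows a crude "does it resolve" check. Even this toy version runs into billions of lookups.

```python
import itertools
import socket
import string

# Labels may be up to 63 characters; restricting to letters and digits only
# still gives 36^n names of length n, so the count explodes immediately.
ALPHABET = string.ascii_lowercase + string.digits

def candidate_domains(max_len, tld="com"):
    """Generate every letters-and-digits name up to max_len characters."""
    for length in range(1, max_len + 1):
        for chars in itertools.product(ALPHABET, repeat=length):
            yield "".join(chars) + "." + tld

def is_registered(domain):
    """Crude check: does the name resolve to anything at all?"""
    try:
        socket.gethostbyname(domain)
        return True
    except socket.gaierror:
        return False

if __name__ == "__main__":
    # Up to 4 characters is already ~1.7 million names; up to 6 characters
    # is ~2.2 billion, before rate limits, hyphens, longer names, other TLDs,
    # and the fact that the data changes faster than you can collect it.
    total = sum(len(ALPHABET) ** n for n in range(1, 7))
    print(f"~{total:,} candidate .com names of length <= 6")
```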
2. Analyze the pages
There are ready-made tools for this, and some of them even have APIs (without an API it is painful to write parsers and keep them up to date, plus you need a pile of proxies and the like). You can write your own tools, but people work on these for months on end and have to keep modifying them. In principle, once you have a database of domain names you can start collecting this information, but the processing is where the time goes: dig/whois is one thing, parsing or calling an API is quite another.
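For the dig/whois side specifically, the per-domain lookups are trivial to script; it is the page parsing that eats the time. A minimal sketch, assuming the standard dig and whois command-line tools are installed on the machine:

```python
import subprocess

def whois_lookup(domain):
    """Shell out to the system whois client and return its raw output."""
    result = subprocess.run(["whois", domain],
                            capture_output=True, text=True, timeout=30)
    return result.stdout

def dig_a_records(domain):
    """Shell out to dig for A records; +short strips the extra chatter."""
    result = subprocess.run(["dig", "+short", domain, "A"],
                            capture_output=True, text=True, timeout=30)
    return [line for line in result.stdout.splitlines() if line]

if __name__ == "__main__":
    print(dig_a_records("example.com"))
    print(whois_lookup("example.com")[:500])
```

Multiply even these cheap lookups by millions of domains and you need queues, retries, and rate limiting; the HTML parsing on top of that is a separate, ongoing maintenance job.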
In general, I would not recommend it.