The trouble here is that you are going to be querying the database's system catalog, and that differs between each database engine. I would also rank the importance of the tests to reduce the amount of work your system is going to have to do. Assuming SQL Server:
1. You can query the system catalog to get the physical size and/or the row count of each table using SMO
2. Query the system views to get the FK count (and possibly the links to the "one" side of each relationship)
3. As Eddy said, columns that are always 0 or NULL indicate a badly designed database. This is going to be costly to query against the large tables (and if it is the small tables, you REALLY have a problem)
4. Again, Eddy has it right: do you store/audit the change information?
5. Sounds like 3 all over again - extend the definition of "meaningful"
6. In your dreams - if you can design something for forward requirements, you are better than the rest of us.
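To give a feel for points 1 and 2, here is a minimal sketch of the catalog queries involved. The `sys.tables`, `sys.partitions` and `sys.foreign_keys` views and their columns are real SQL Server catalog objects; the function name and the cursor-based wiring are illustrative only (in practice you would hand it a pyodbc or SMO connection).

```python
# Sketch of tests 1 and 2 against SQL Server's catalog views.
# The sys.* view and column names are genuine; collect_stats and the
# cursor protocol are hypothetical glue for illustration.

ROW_COUNT_SQL = """
SELECT t.name, SUM(p.rows) AS row_count
FROM sys.tables AS t
JOIN sys.partitions AS p ON p.object_id = t.object_id
WHERE p.index_id IN (0, 1)          -- heap (0) or clustered index (1) only
GROUP BY t.name
"""

FK_COUNT_SQL = """
SELECT t.name, COUNT(fk.object_id) AS fk_count
FROM sys.tables AS t
LEFT JOIN sys.foreign_keys AS fk ON fk.parent_object_id = t.object_id
GROUP BY t.name
"""

def collect_stats(cursor):
    """Run both catalog queries on any DB-API style cursor and merge
    the results into {table_name: {"rows": n, "fk_count": n}}."""
    stats = {}
    cursor.execute(ROW_COUNT_SQL)
    for name, rows in cursor.fetchall():
        stats[name] = {"rows": rows, "fk_count": 0}
    cursor.execute(FK_COUNT_SQL)
    for name, fks in cursor.fetchall():
        stats.setdefault(name, {"rows": 0})["fk_count"] = fks
    return stats
```

Because the function only needs an object with `execute`/`fetchall`, the merging logic can be exercised without a live server, and the SQL itself is what would change per engine.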
What you are proposing is a rules engine, and they are excellent while the rule count is small; once it grows too large, the entire thing becomes unsupportable. There are MANY commercial rules engines out there.
I would approach this by demanding the business case for such a tool: what are the benefits, and who is going to support, extend and pay for it? When that is not forthcoming, I would shelve the entire thing.
I can't see how AI would help here (my AI knowledge is zero), but it would be a major exercise for each database you are going to support. I would also break it into 2 major projects: the database-querying project, which will need to be extended for each engine you support, and the analytical project, which should be generic, accepting data from all the database types.
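The two-project split above can be sketched as an interface: each engine gets its own collector, and the analytical side only ever sees a neutral dictionary of statistics, never engine-specific system views. The class and method names are hypothetical.

```python
# Sketch of the two-project split: engine-specific collectors feed a
# generic analyzer. All names are illustrative.
from abc import ABC, abstractmethod

class StatsCollector(ABC):
    """Database-querying project: one subclass per supported engine
    (SQL Server via SMO, Oracle, PostgreSQL, ...)."""
    @abstractmethod
    def collect(self) -> dict:
        """Return {table_name: {"rows": int, "fk_count": int, ...}}."""

class Analyzer:
    """Analytical project: generic, never touches catalog views."""
    def report(self, collector: StatsCollector) -> list:
        issues = []
        for table, stats in collector.collect().items():
            if stats.get("fk_count", 0) == 0:
                issues.append(f"{table}: no foreign keys")
        return issues
```

Adding a new database then means writing one more `StatsCollector` subclass; the analyzer never changes.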
Never underestimate the power of human stupidity -
I'm old. I know stuff - JSOP