How do you know what's going on inside your database? The traditional answer is to use transaction logs or run trace utilities. Logging database activity is fine as far as it goes, but it is purely reactive. You only ever learn what has already happened on your database, which is like discovering that the bank has been robbed, rather than knowing that the bank is being robbed and being able to do something about it. The other problem with logs is granularity: they may not capture enough detail, or may miss certain critical activities entirely, such as a read operation on sensitive data.
That leaves trace utilities. The trouble with traces is that they consume CPU cycles. It has been estimated that running the DB2 audit trace adds a CPU overhead of around 5% per transaction when all audit trace classes are started, and IBM estimates that DB2's global trace can add as much as 100% CPU overhead when all of its trace classes are active.
It seems that what we have is one technique that is inadequate and another that is impractical. So, perhaps the important question to ask is, why should we bother? The answer is compliance. Two key mandates apply: the Sarbanes-Oxley Act (SOX) and the Payment Card Industry Data Security Standard (PCI-DSS).
And while we're thinking of compliance with auditing regulations, who would usually be the person responsible for reviewing the logs or running and examining trace utilities? That would be the DBA. To comply with auditing requirements, you also need some way to check the DBA's activities to ensure that he isn't the person "robbing the bank", so to speak.
So far we have four criteria for a successful database-auditing tool. It must:
- Comply with the latest regulations;
- Audit DBA activity as well as that of all the other users of the database;
- Not impact the performance of the database;
- Identify in real time any problems, i.e., any violations of corporate policies.
An ideal solution would run off the mainframe, so that it does not affect the mainframe's performance, while monitoring and tracking all database activity in real time.
Fully compliant auditing solutions store, analyze, and report database information. They can identify anomalous behavior and policy violations immediately and respond with policy-based actions, such as security alerts. Database activity is captured at the DBMS level, including activity initiated by mainframe-based applications and networked applications. Such a solution can also monitor by role or by application, which helps to meet auditing requirements.
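To make the idea of policy-based actions concrete, here is a minimal sketch in Python of how captured events might be checked against corporate policies in real time. The event fields, policy rules, and names (`violates_policy`, `OFF_HOURS`) are illustrative assumptions, not the API of any real auditing product.

```python
# Hypothetical sketch: checking a captured database event against
# corporate policies as it happens. All names and rules are illustrative.

OFF_HOURS = range(0, 6)  # treat 00:00-05:59 as off-hours

def violates_policy(event):
    """Return a list of the policy names the event violates."""
    violations = []
    # Policy 1: no one, including DBAs, may touch the payroll table off-hours.
    if event["table"] == "PAYROLL" and event["hour"] in OFF_HOURS:
        violations.append("off-hours access to sensitive table")
    # Policy 2: DBA activity must come from an approved program.
    if event["role"] == "DBA" and event["program"] not in ("APPROVED_TOOL",):
        violations.append("DBA access from unapproved program")
    return violations

# A captured event that should trigger both policies and, in a real
# solution, an immediate security alert rather than a log entry.
event = {"table": "PAYROLL", "hour": 3, "role": "DBA", "program": "SPUFI"}
print(violates_policy(event))
```

The point of the sketch is the timing: because the check runs as the event is captured, the "bank robbery" can be interrupted rather than merely discovered afterwards.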
A robust database access auditing solution that addresses regulatory compliance should be able to provide answers to at least the following questions:
- Who accessed the data?
- At what date and time was the access?
- What program or client software was used to access the data?
- From what location was the request issued?
- What SQL was issued to access the data?
- Was the request successful, and if so, how many rows of data were retrieved?
- If the request was a modification, what data was changed? (A before and after image of the change should be accessible.)
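As a rough illustration, the questions above map naturally onto the fields of an audit record. The following Python sketch shows one possible shape for such a record; the field names and example values are assumptions for illustration, not the schema of any real product.

```python
# Illustrative audit-record layout answering the questions above.
# Field names are hypothetical, not a real product schema.
from dataclasses import dataclass
from datetime import datetime
from typing import Optional

@dataclass
class AuditRecord:
    user: str                            # who accessed the data
    timestamp: datetime                  # at what date and time
    program: str                         # what program or client software
    location: str                        # from what location the request came
    sql_text: str                        # what SQL was issued
    successful: bool                     # was the request successful
    rows_returned: int = 0               # how many rows were retrieved
    before_image: Optional[dict] = None  # data before a modification
    after_image: Optional[dict] = None   # data after a modification

# A record for a successful UPDATE, with before and after images kept.
rec = AuditRecord(
    user="JSMITH",
    timestamp=datetime(2008, 5, 1, 14, 30),
    program="DSNTEP2",
    location="10.1.2.3",
    sql_text="UPDATE PAYROLL SET SALARY = 50000 WHERE ID = 7",
    successful=True,
    rows_returned=1,
    before_image={"SALARY": 45000},
    after_image={"SALARY": 50000},
)
print(rec.user, rec.rows_returned)
```

Keeping the before and after images on the record itself is what makes the final question answerable without going back to the transaction log.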
About the Author
Trevor Eddolls is a senior consultant for NEON Enterprise Software, a Sugar Land, TX-based technology leader in enterprise data availability software and services. Trevor has over 25 years of experience in all aspects of IT, and for many years, he was editor of Xephon's Update journals (www.xephonusa.com). You can read his weekly blog at mainframeupdate.blogspot.com. He can be contacted by email at trevor@itech-ed.com. Visit www.neonesoft.com.