Database Auditing: Who Did What, to Which Data, When?

In a world replete with regulations and threats, organizations today have to go well beyond just securing their data. Protecting this most valuable asset means perpetually monitoring their systems so that they know exactly who did what to their data, when, and how.

Database logging only tells you what has happened on your database, not what is happening. Even then, the log details probably won't be enough to satisfy compliance requirements. What is needed is a new way to audit databases without impacting their performance.

How do you know what's going on inside your database? The traditional answer is to use transaction logs or run trace utilities. Logging database activity is fine in its way, but it is only reactive. You'll only ever know what has already happened on your database, which is like discovering that the bank has been robbed rather than knowing that the bank is being robbed and being able to do something about it. The other problem with logs is granularity: they may not capture enough detail, or they may miss certain critical activities entirely, such as a read of sensitive data.

The traditional alternative is to run trace utilities. The trouble with traces is that they consume CPU cycles. It has been estimated that running the DB2 audit trace adds a CPU overhead of around 5% per transaction when all audit trace classes are started, and IBM estimates that DB2's global trace can add 100% CPU overhead when all of its trace classes are started.
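
For illustration, this is roughly what the audit trace approach involves on DB2 for z/OS. The table name below is made up, and the exact trace classes, destinations, and command syntax vary by DB2 version, so treat this as a minimal sketch rather than a recipe:

  -- Flag a table so that access to it is picked up by the audit trace.
  -- AUDIT CHANGES records the first INSERT/UPDATE/DELETE per unit of work;
  -- AUDIT ALL also records the first read. PAYROLL.EMPLOYEE is hypothetical.
  ALTER TABLE PAYROLL.EMPLOYEE AUDIT ALL;

  -- The audit trace itself is then started with a DB2 command, for example:
  --   -START TRACE(AUDIT) CLASS(1,2,4,5) DEST(SMF)
  -- Every class started adds to the CPU overhead described above.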

It seems that what we have is one technique that is inadequate and another that is impractical. So perhaps the important question to ask is: why should we bother? The answer is compliance. Two key mandates apply: the Sarbanes-Oxley Act (SOX) and the Payment Card Industry Data Security Standard (PCI DSS).

And while we're thinking of compliance with auditing regulations, who would usually be the person responsible for reviewing the logs or running and examining trace utilities? That would be the DBA. To comply with auditing requirements, you also need some way to check the DBA's activities to ensure that he isn't the person "robbing the bank", so to speak.

So far we have four criteria for a successful database-auditing tool. It must:
  • Comply with the latest regulations,

  • Audit the DBA's activity as well as that of all other users of our database,

  • Not impact the performance of the database,

  • Have a way of identifying problems, i.e., violations of corporate policies, in real time (see the sketch after this list).
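
As a purely illustrative sketch of that last point, suppose captured audit events land in a relational store. Assuming a hypothetical audit_event table with columns such as user_id, event_time, object_name, client_program, and sql_text (names invented for this example, not taken from any product), a corporate policy like "no reads of payroll objects outside business hours" could be checked with DB2-flavored SQL along these lines:

  -- Flag recent access to payroll objects outside 08:00-18:00.
  -- audit_event and all column names are hypothetical.
  SELECT user_id, event_time, client_program, sql_text
    FROM audit_event
   WHERE object_name LIKE 'PAYROLL.%'                 -- sensitive objects
     AND HOUR(event_time) NOT BETWEEN 8 AND 18        -- outside business hours
     AND event_time > CURRENT TIMESTAMP - 15 MINUTES  -- recent events only
   ORDER BY event_time DESC;

A real product would evaluate such policies continuously as events arrive rather than by polling a table, but the query shows the kind of rule a policy amounts to.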

Many sites have implemented Security Information and Event Management (SIEM) tools, a hybrid of Security Information Management (SIM) and Security Event Management (SEM) tools, thinking that they will solve the problem. While SIEM tools do import log data from a range of systems and network devices, they have one flaw: they don't natively monitor DBMS activity, and they require the DBMS's own logging and trace utilities to be turned on.

An ideal solution would run off the mainframe, so that it does not affect the mainframe's performance, while monitoring and tracking all database activity in real time.

Fully compliant auditing solutions store, analyze, and report on database activity. They can identify anomalous behavior and policy violations immediately and respond with policy-based actions, such as security alerts. Activity is captured at the DBMS level, including activity initiated by mainframe-based and networked applications, and it can be monitored by role or by application, which helps to meet auditing requirements.

A robust database access auditing solution that addresses regulatory compliance should be able to provide answers to at least the following questions (an illustrative schema follows the list):
  1. Who accessed the data?

  2. At what date and time was the access?

  3. What program or client software was used to access the data?

  4. From what location was the request issued?

  5. What SQL was issued to access the data?

  6. Was the request successful, and if so, how many rows of data were retrieved?

  7. If the request was a modification, what data was changed? (A before and after image of the change should be accessible.)
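
To make the list concrete, here is a minimal sketch of the kind of record that answers questions 1 through 6, together with a query answering "who accessed the payroll data?". The table, columns, and object names are assumptions made up for illustration, not the schema of any particular product, and the SQL is DB2-flavored; before and after images (question 7) would in practice come from the database log or from change capture rather than from plain columns.

  -- Illustrative audit-event record: one row per captured database request
  CREATE TABLE audit_event (
      event_id        BIGINT       NOT NULL,  -- unique event identifier
      user_id         VARCHAR(128) NOT NULL,  -- 1. who accessed the data
      event_time      TIMESTAMP    NOT NULL,  -- 2. date and time of the access
      client_program  VARCHAR(128),           -- 3. program or client software
      client_location VARCHAR(254),           -- 4. where the request came from
      sql_text        CLOB,                   -- 5. the SQL that was issued
      success         CHAR(1),                -- 6. did the request succeed?
      rows_returned   INTEGER,                -- 6. rows retrieved, if any
      object_name     VARCHAR(128),           -- table or view that was touched
      before_image    CLOB,                   -- 7. row contents before a change
      after_image     CLOB,                   -- 7. row contents after a change
      PRIMARY KEY (event_id)
  );

  -- "Who accessed the payroll data in the last week, and what did they run?"
  SELECT user_id, event_time, client_program, client_location, sql_text
    FROM audit_event
   WHERE object_name LIKE 'PAYROLL.%'
     AND event_time > CURRENT TIMESTAMP - 7 DAYS
   ORDER BY event_time;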

Knowing who is doing what to your data and when will protect your data and your company.

About the Author

Trevor Eddolls is a senior consultant for NEON Enterprise Software, a Sugar Land, TX-based technology leader in enterprise data availability software and services. Trevor has over 25 years of experience in all aspects of IT, and for many years he was editor of Xephon's Update journals (www.xephonusa.com). You can read his weekly blog at mainframeupdate.blogspot.com. He can be contacted by email at trevor@itech-ed.com. Visit www.neonesoft.com.
On the net:
NEON Enterprise Software: www.neonesoft.com
Trevor Eddolls' blog: mainframeupdate.blogspot.com
