Oracle 11g: Using LogMiner to analyze redo log files
Posted by Zahid on June 20, 2010.
LogMiner is one of the most overlooked yet powerful utilities in Oracle. It analyzes the online redo log files and archived redo log files to achieve the following:
- Pinpointing when a logical corruption (a user committing incorrect changes) to a database may have begun.
- Determining how to perform fine-grained recovery at the transaction level.
- Performance tuning and capacity planning through trend analysis. You can determine which tables are hot and are updated most frequently.
- Tracking any DML and DDL statements executed on the database, the order in which they were executed, and who executed them.
Before Oracle 11g, LogMiner was used through a SQL interface via the DBMS_LOGMNR PL/SQL package. That is still possible, but in Oracle 11g LogMiner has also been incorporated into Oracle Enterprise Manager, making it even easier and simpler to analyze the redo log files.
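For reference, a minimal command-line session with the DBMS_LOGMNR package might look like the sketch below. The log file path is a placeholder, not a real file on your system; pick an actual online or archived redo log from V$LOG / V$LOGFILE or V$ARCHIVED_LOG.

```sql
-- Add a redo log file to the LogMiner session
-- (the path below is hypothetical; substitute a real log file)
BEGIN
  DBMS_LOGMNR.ADD_LOGFILE(
    LogFileName => '/u01/app/oracle/oradata/orcl/redo01.log',
    Options     => DBMS_LOGMNR.NEW);
END;
/

-- Start mining, resolving object names via the online catalog
BEGIN
  DBMS_LOGMNR.START_LOGMNR(
    Options => DBMS_LOGMNR.DICT_FROM_ONLINE_CATALOG);
END;
/

-- Query the mined contents
SELECT username, operation, sql_redo, sql_undo
FROM   v$logmnr_contents
WHERE  username = 'SCOTT';

-- Finish the session
EXEC DBMS_LOGMNR.END_LOGMNR;
```

The DICT_FROM_ONLINE_CATALOG option tells LogMiner to translate object IDs to names using the live data dictionary, which is convenient as long as the mined tables have not been dropped or altered since the redo was generated.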
Enable minimum supplemental logging for database:
LogMiner requires at least minimum supplemental logging on the database.
$ sqlplus / as sysdba

SQL> SELECT SUPPLEMENTAL_LOG_DATA_MIN FROM V$DATABASE;

SUPPLEME
--------
NO

SQL> ALTER DATABASE ADD SUPPLEMENTAL LOG DATA;

Database altered.

SQL> SELECT SUPPLEMENTAL_LOG_DATA_MIN FROM V$DATABASE;

SUPPLEME
--------
YES

SQL> exit
Let's run some transactions on the database, which we will later analyze with LogMiner in Oracle Enterprise Manager.
$ sqlplus scott/tiger

SELECT to_char(sysdate , 'DD-MON-YYY HH:MI:SS AM') time_stamp FROM dual;

TIME_STAMP
-------------------------------
20-JUN-010 10:47:08 AM

CREATE TABLE test_logmnr (
  id   NUMBER,
  name VARCHAR2(50),
  CONSTRAINT test_logmnr_pk PRIMARY KEY (id)
);

INSERT INTO test_logmnr VALUES (1,'AAA');
COMMIT;

UPDATE test_logmnr SET name = 'BBB';
COMMIT;

DELETE test_logmnr;
COMMIT;

SELECT to_char(sysdate , 'DD-MON-YYY HH:MI:SS AM') time_stamp FROM dual;

TIME_STAMP
-------------------------------
20-JUN-010 10:51:46 AM
We now have a few transactions that were executed between 10:47:08 AM and 10:51:46 AM on 20-JUN-2010.
Start Oracle Enterprise Manager to use LogMiner:
Start Oracle Enterprise Manager now and log in as SYS with the SYSDBA role.
Once you are successfully logged in you will see the Oracle Enterprise Manager home page. Click the "Availability" tab and go to the Manage section. You should see a link "View and Manage Transactions" there. Click this link to start analyzing redo log files.
When you click "View and Manage Transactions" you should see an interface like below. You can provide a time or SCN range here and LogMiner will try to search through redo log files for transactions that happened in the provided range.
I have set the time range from June 20, 2010 10:47 AM to June 20, 2010 10:52 AM. I also have applied a query filter to show all transactions executed by user SCOTT in the provided time range.
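For comparison, roughly the same time-range mining can be done from SQL*Plus. The sketch below mirrors the range above; the CONTINUOUS_MINE option (available in 11g) lets LogMiner automatically locate the needed online and archived logs rather than requiring ADD_LOGFILE calls.

```sql
-- Mine a time range directly, letting LogMiner find the logs itself
BEGIN
  DBMS_LOGMNR.START_LOGMNR(
    StartTime => TO_DATE('20-JUN-2010 10:47:00', 'DD-MON-YYYY HH24:MI:SS'),
    EndTime   => TO_DATE('20-JUN-2010 10:52:00', 'DD-MON-YYYY HH24:MI:SS'),
    Options   => DBMS_LOGMNR.DICT_FROM_ONLINE_CATALOG +
                 DBMS_LOGMNR.CONTINUOUS_MINE);
END;
/

-- Filter on the user, as in the Enterprise Manager query filter
SELECT xid, username, operation, sql_redo
FROM   v$logmnr_contents
WHERE  username = 'SCOTT';

EXEC DBMS_LOGMNR.END_LOGMNR;
```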
After you provide your input and click "Continue", you should see a work-in-progress screen like the one below. LogMiner searches the online and archived redo log files for transactions that meet the criteria you specified on the previous page.
You can cancel or stop the process of mining and show the current results. If you let it continue, you will be transferred to the LogMiner results page automatically once mining is complete.
The results page shows a list of transactions found during mining.
When you click the Transaction ID of a transaction listed in the LogMiner results, you can see the details of that transaction. Here I have clicked on the transaction with ID 02000000A7020000:
On the Transaction Details page you may have noticed the "Flashback Transaction" button, which is new in Oracle 11g. Pressing this button rolls the transaction back, which is useful if the transaction was an application error that caused data loss. This powerful new feature deserves an article of its own to show how it works with and without Oracle Enterprise Manager.
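For the curious, a rough SQL-level sketch of what transaction backout looks like is given below, using the DBMS_FLASHBACK.TRANSACTION_BACKOUT procedure. This is only an illustration: the XID is the transaction ID from the results page above, and backout generally requires supplemental logging (ideally with primary key logging) to have been enabled before the transaction ran.

```sql
-- Recommended beforehand, so backout can identify rows reliably:
-- ALTER DATABASE ADD SUPPLEMENTAL LOG DATA (PRIMARY KEY) COLUMNS;

DECLARE
  l_xids sys.xid_array;
BEGIN
  -- The XID below is the transaction ID shown on the results page
  l_xids := sys.xid_array('02000000A7020000');
  DBMS_FLASHBACK.TRANSACTION_BACKOUT(
    numtxns => 1,
    xids    => l_xids,
    options => DBMS_FLASHBACK.NOCASCADE);
END;
/

COMMIT;  -- the compensating changes take effect only when committed
```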
Advanced Query Filter with the LogMiner Interface:
When you open the LogMiner interface you see fields for a time or SCN range for transaction analysis, along with filters on the object name and the user who executed the transaction. You should also see a link at the bottom of the LogMiner interface named "Advanced Query". Here you can provide further filters for the query on v$logmnr_contents.
My advanced filter selects a transaction where the column SCOTT.TEST_LOGMNR.NAME contains the value 'AAA' in the REDO stream. Let's see which transaction I get when I press "Continue" with these filters.
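The advanced query is essentially an extra predicate appended to the query on V$LOGMNR_CONTENTS. A SQL-level approximation of the filter just described, assuming an active LogMiner session, might look like this:

```sql
-- Find redo records touching SCOTT.TEST_LOGMNR whose redo SQL mentions 'AAA'
SELECT xid, operation, sql_redo
FROM   v$logmnr_contents
WHERE  seg_owner  = 'SCOTT'
AND    table_name = 'TEST_LOGMNR'
AND    sql_redo LIKE '%AAA%';
```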
Sure enough, that is the transaction which contains our INSERT statement. Let's click on the Transaction ID to see its details.