Wednesday, September 28, 2022

How to study for 1z0-149 - Oracle Database PL/SQL Developer Certified Professional

Hi all,

I had 2 vouchers (now called "Oracle Exam Attempts") to use until the end of September, and I decided to take the PL/SQL 19c exam.

My first thought was "I'll give it a try; I don't think much has changed since 11g".

And I was completely wrong!

I didn't study, just scheduled the exam and got a really bad result!



After a week of studying the new features and reviewing many concepts, I got a good score :)



If you have a ULS (Unlimited Learning Subscription), this is a good starting point:



If not, the Database PL/SQL Language Reference will be fine.
You should also take a look at https://oracle-base.com/; the examples there are much better than the ones in the Oracle documentation.

Review the exam topics; everything listed there can show up on the exam:


About 50% of the exam asks you to analyze PL/SQL code and tell whether it will work; if not, you need to point out the possible failures.
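To give an idea of the style, here is a minimal sketch of my own (not an actual exam question): a block that compiles cleanly but fails at runtime, which is exactly the kind of distinction you need to spot.

SET SERVEROUTPUT ON
DECLARE
  l_num NUMBER;
BEGIN
  -- Compiles fine, but raises ORA-06502 at runtime:
  -- 'abc' cannot be converted to a NUMBER on assignment.
  l_num := 'abc';
  DBMS_OUTPUT.put_line(l_num);
END;
/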

And last but not least, stay away from Brain Dumps!

Best of luck
Alex Zaballa 

Monday, September 26, 2022

Upgrade From 11.2.0.4: Slow Performance Adding Nullable Columns With Default Values To AWR Tables

Hi all,

I was on a project where the customer upgraded a large production database from 11g to 19c.

Phase 0 of the upgrade process took almost 4 hours, and the DDLs responsible for all that time were related to new columns on AWR tables (the WRH$ tables).

I was talking to Rodrigo Jorge (PM for upgrades and migrations) and he pointed me to this patch: 30387640


For example, these 2 DDLs took about 2 hours to run:

alter table WRH$_SQLSTAT add (obsolete_count number default 0);

alter table WRH$_SEG_STAT add (im_membytes number default 0);

I remembered that since 11g, Oracle only needs to update the data dictionary when you add a new column with a default value; what I didn't remember is that this optimization only applies to NOT NULL columns.

I found this out after doing some research, and here is a great blog post about it:

https://chandlerdba.com/2014/10/30/adding-not-null-columns-with-default-values/

And another good thing: this restriction no longer exists in 12c+.

https://chandlerdba.com/2014/12/01/adding-a-default-column-in-12c/
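To make the distinction concrete, here is a minimal sketch (the table and column names are made up):

-- On 11g this is a metadata-only change: the default is stored in
-- the data dictionary and no existing rows are touched.
alter table t add (c1 number default 0 not null);

-- On 11g this one physically updates every existing row, because the
-- new column is nullable. From 12c onwards it is metadata-only as well.
alter table t add (c2 number default 0);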

If you are upgrading from 11g to 19c and you have a large AWR repository, consider applying patch 30387640 before the upgrade.

Thanks
Alex

Thursday, September 22, 2022

Downgrading Oracle Database to an Earlier Release

Hi all,

I was on a project where the customer had one database in a training environment to upgrade from 11g to 19c.

The customer requirement was to have a fallback option in case of any issue in the next few days.

The usual fallback option during the upgrade is to create a GUARANTEED RESTORE POINT. But if you flash back to that restore point after a few days, you lose all the data changed since the upgrade.
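For reference, a minimal sketch of that fallback (the restore point name is made up; a fast recovery area must be configured):

CREATE RESTORE POINT before_upgrade_19c GUARANTEE FLASHBACK DATABASE;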

To be honest, I had never seen a downgrade in all my Oracle life, but it was a customer requirement.

And yes, we did not touch the COMPATIBLE parameter after the upgrade, since raising it would have removed the option to downgrade :)

We did the upgrade to 19c using AutoUpgrade and everything worked great.

But when we tested catdwgrd.sql, we got a lot of ORA-600 errors at the end of the downgrade process.

Then I found something I was not aware of: a document called "Required Task to Preserve Downgrade Capability".

It lists some patches to apply on 11g:

Required Task to Preserve Downgrade Capability

Downgrading Oracle Database to an Earlier Release


Also, make sure you take care of the time zone, especially updating the timezone files in the 11g home.
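A quick way to check which time zone file version the database is using (so you can match it in the 11g home):

SELECT filename, version FROM v$timezone_file;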


Thanks

Alex

Thursday, September 15, 2022

Row Count + DBMS_COMPARISON - Logical Replication

Hi all,

I was on a project where logical replication was in place.

It was going from 11g non-Exadata to 19c on Exadata, and they had run into some issues during their last migration attempt a year earlier.

At that time, OGG was sending data from 11g to 19c, and another third-party tool was sending data from 19c back to 11g (in case a rollback was needed). The problem was that this tool relies on triggers to capture changes and on DML commands to replicate them, and you can imagine how badly that performs with a large number of transactions and 2 database nodes.

Now, they decided to use OGG for everything and it was a great decision.

The customer asked me to validate the data (source vs destination) to make sure everything was in sync.

My first thought was to use OGG Veridata, but the license cost was an impediment to this project.

Last time, they used some procedures to do row counts at source and destination and compare the results. I took this code and spent a few minutes improving it: what used to be 16 procedures and about 30 tables is now one procedure and a few tables. It's not perfect and there is room for improvement, but you can get an idea here.
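Just to illustrate the idea, a minimal sketch (not the actual procedure from the project; the schema and dblink names are placeholders):

-- Counts rows on both sides over a database link and prints mismatches.
-- Run with SET SERVEROUTPUT ON.
DECLARE
  l_src NUMBER;
  l_dst NUMBER;
BEGIN
  FOR t IN (SELECT table_name FROM dba_tables WHERE owner = 'YOUR_SCHEMA') LOOP
    EXECUTE IMMEDIATE 'select count(*) from YOUR_SCHEMA.' || t.table_name INTO l_src;
    EXECUTE IMMEDIATE 'select count(*) from YOUR_SCHEMA.' || t.table_name || '@db_compare' INTO l_dst;
    IF l_src <> l_dst THEN
      DBMS_OUTPUT.put_line(t.table_name || ': source=' || l_src || ' destination=' || l_dst);
    END IF;
  END LOOP;
END;
/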

As mentioned by Connor in this video, this is not a regular situation, and this kind of thing should be analyzed carefully before you start counting every row in every table.

Another point from the business area was to validate if the data was the same, not only the number of rows. For this case, I decided to use DBMS_COMPARISON.

Here you can see an example of how to generate the scripts to validate:

select '
BEGIN
DBMS_COMPARISON.drop_comparison (
    comparison_name    => ''cutover_comp_bm'');
END;
/

BEGIN
  DBMS_COMPARISON.create_comparison (
    comparison_name    => ''cutover_comp_bm'',
    schema_name        => ''YOUR_SCHEMA'',
    object_name        => '''||table_name||''',
    dblink_name        => ''db_compare'',
    remote_schema_name => ''YOUR_SCHEMA'',
    remote_object_name => '''||table_name||''');
END;
/


SET SERVEROUTPUT ON
DECLARE
  l_scan_info  DBMS_COMPARISON.comparison_type;
  l_result     BOOLEAN;
  v_comparison_name    varchar2(100):= ''cutover_comp_bm'';
BEGIN
  l_result := DBMS_COMPARISON.compare (
                comparison_name => v_comparison_name,
                scan_info       => l_scan_info,
                perform_row_dif => TRUE
              );

  IF NOT l_result THEN
    DBMS_OUTPUT.put_line(v_comparison_name||'' Differences found. scan_id='' || l_scan_info.scan_id);
  ELSE
    DBMS_OUTPUT.put_line(v_comparison_name||'' No differences found.'');
  END IF;
END;
/

' 
from dba_tables
where owner='YOUR_SCHEMA';

Some tables in this database were really big (billions of rows), and for this situation I found the scan_mode parameter.

You can use something like:

    scan_mode          => dbms_comparison.CMP_SCAN_MODE_RANDOM,
    scan_percent       => 0.001    
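Putting it together, a hedged sketch of create_comparison with random sampling (schema, table, and dblink names are placeholders):

BEGIN
  DBMS_COMPARISON.create_comparison (
    comparison_name    => 'cutover_comp_bm',
    schema_name        => 'YOUR_SCHEMA',
    object_name        => 'YOUR_TABLE',
    dblink_name        => 'db_compare',
    remote_schema_name => 'YOUR_SCHEMA',
    remote_object_name => 'YOUR_TABLE',
    scan_mode          => DBMS_COMPARISON.CMP_SCAN_MODE_RANDOM,
    scan_percent       => 0.001);
END;
/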

DBMS_COMPARISON cannot compare LOB columns, so you can use this parameter to limit the columns to be compared:

column_list        => 'YOUR COLUMNS SEPARATED BY COMMA'

You can use this SQL to generate a column list:

SELECT LISTAGG(column_name, ',') WITHIN GROUP (ORDER BY column_id)
  FROM dba_tab_columns
 WHERE owner = 'YOUR_OWNER'
   and table_name='YOUR_TABLE'
   and data_type not like '%LOB%';

For 21c, I found the new CHECKSUM aggregate function and I'm planning to take a look at it for future projects.
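A minimal sketch of the idea, assuming 21c on both sides (table, column, and dblink names are placeholders): if the two checksums match, the column data is very likely identical.

SELECT CHECKSUM(your_column) FROM YOUR_SCHEMA.your_table;
SELECT CHECKSUM(your_column) FROM YOUR_SCHEMA.your_table@db_compare;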

Thanks
Alex

Tuesday, September 13, 2022

How to study for 1Z0-997-22 - Oracle Cloud Infrastructure 2022 Architect Professional

Hi all,

I decided to take all the 2022 OCI exams to check what has changed and to help guide people taking these certifications.

Well, not all of them. I will only take the ones I did in 2019/2020/2021 that cover what I work with on a day-to-day basis 🙂


On this test, I was expecting to see some new things (comparing the course topics with the previous years). And that's true: I got a lot of new questions.
Of course, I don't remember the questions from the 2021 exam (I took it a year ago), but I didn't remember studying some of this content before.

A good starting point is here:


What do you need to focus on? 
  • EVERYTHING :)

Official Documentation:


User guide:


But I recommend getting a free account and testing everything you can! The biggest problem will be testing cross-region features, because a free account lets you subscribe to only one region.

You can also find some sample questions at this link; they are really helpful for understanding what to expect on the real test:


Take an extra look at the Azure integration with OCI.

And last but not least, stay away from Brain Dumps!

Best of luck
Alex Zaballa

Monday, September 12, 2022

Database Migration/Upgrade - Performance Advice

Hi all,

I was on a project where logical replication (OGG) was used to move a critical database from 11g non-Exadata to 19c on Exadata with the smallest possible downtime.

In terms of performance, for the top 100 SQLs we had 96 improvements, including SQLs running 40x faster due to smart scans. But we had 6 regressions.

One SQL went from 0.2 ms to 0.6 ms.

I know you are thinking: “not a big deal”.

But when this SQL is executed millions of times in a batch process, the batch time goes from 42 minutes to 126 minutes, and that can cause a lot of problems for the client.

I know, they should review and change this row-by-row approach, but there is no time for that during a go-live.

For this case, I took the SQL plan baseline from 11g and imported it into 19c.
Problem solved!

Now, the customer has time to improve this process and SQLs.

For example, one SQL in this process was taking 40 minutes to run; after adding 2 hints (FULL and PARALLEL) to enable smart scan, the same SQL now takes 6 minutes.

Some tips:

1 - Increase your AWR retention to 31 days (60*24*31 = 44,640 minutes) and decrease the snapshot interval to get more granularity:

execute dbms_workload_repository.modify_snapshot_settings(interval => 30, retention => 44640);

2 - Start saving your SQL plan baselines in case you have regressions after the migration/upgrade (on the 11g database):

ALTER SYSTEM SET optimizer_capture_sql_plan_baselines = true;
** Keep an eye on your SYSAUX tablespace, because you can start having space pressure.

3 - Export your AWR repository from the 11g database (in case you need any performance information in the future and the 11g database is no longer accessible):


$ORACLE_HOME/rdbms/admin/awrextr.sql

4 - Sometimes it's a good idea to copy DBA_HIST_SQLSTAT and DBA_HIST_SNAPSHOT to your new database (in case the 11g database is no longer accessible and you need SQL timing information).
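A minimal sketch of that copy, assuming a database link called old11g pointing to the 11g database (the link and table names are made up):

-- Keep a private copy of the 11g AWR data in the new database.
create table awr_sqlstat_11g  as select * from dba_hist_sqlstat@old11g;
create table awr_snapshot_11g as select * from dba_hist_snapshot@old11g;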


One example of how to compare the execution time between the old database and the new database:

col execs for 999,999,999
col avg_etime for 999,999.999999
col avg_lio for 999,999,999.9
col begin_interval_time for a30
col node for 99999

select ss.snap_id, ss.instance_number node, begin_interval_time, sql_id, plan_hash_value,
nvl(executions_delta,0) execs,
(elapsed_time_delta/decode(nvl(executions_delta,0),0,1,executions_delta))/1000000 avg_etime,
(buffer_gets_delta/decode(nvl(buffer_gets_delta,0),0,1,executions_delta)) avg_lio
from DBA_HIST_SQLSTAT S, DBA_HIST_SNAPSHOT SS
where sql_id = 'YOUR_SQL_ID'
and ss.snap_id = S.snap_id
and ss.instance_number = S.instance_number
and executions_delta > 0
order by 1, 2, 3
/


Carlos Sierra also has a great script to check regressions:


https://carlos-sierra.net/2014/11/02/finding-sql-with-performance-changing-over-time/


5 - Transfer all your SQL Plan Baselines to the new database to quickly fix SQL regressions:


BEGIN
  DBMS_SPM.create_stgtab_baseline(
    table_name  => 'spm_stage_table',
    table_owner => 'your_user');
END;
/

SET SERVEROUTPUT ON
DECLARE
  v_plans  NUMBER;
BEGIN
  v_plans := DBMS_SPM.pack_stgtab_baseline(
               table_name  => 'spm_stage_table',
               table_owner => 'your_user');
  DBMS_OUTPUT.put_line('SQL Plans Total: ' || v_plans);
END;
/

Use expdp/impdp to move the table spm_stage_table to the new database.
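Something along these lines (directory, dump file, and connect strings are placeholders):

expdp your_user@old_db tables=spm_stage_table directory=DATA_PUMP_DIR dumpfile=spm_stage.dmp
impdp your_user@new_db tables=spm_stage_table directory=DATA_PUMP_DIR dumpfile=spm_stage.dmp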

An example of how to find the SQL HANDLE used by a specific SQL_ID:



with subq_mysql as
    (select sql_id
     ,      (select dbms_sqltune.sqltext_to_signature(ht.sql_text)
             from dual) sig
     from   dba_hist_sqltext       ht
     where  sql_id = 'YOUR_SQL_ID')
    ,    subq_baselines as
    (select b.signature
     ,      b.plan_name
    ,      b.accepted
    ,      b.created
    ,      o.plan_id
    ,      b.sql_handle
    from   subq_mysql             ms
    ,      dba_sql_plan_baselines b
    ,      sys.sqlobj$            o
    where  b.signature   = ms.sig
    and    o.signature   = b.signature
    and    o.name        = b.plan_name)
   ,    subq_awr_plans as
   (select  sn.snap_id
    ,       to_char(sn.end_interval_time,'DD-MON-YYYY HH24:MI') dt
    ,       hs.sql_id
    ,       hs.plan_hash_value
    ,       t.phv2
    ,       ms.sig
    from    subq_mysql        ms
    ,       dba_hist_sqlstat  hs
    ,       dba_hist_snapshot sn
    ,       dba_hist_sql_plan hp
    ,       xmltable('for $i in /other_xml/info
                      where $i/@type eq "plan_hash_2"
                      return $i'
                     passing xmltype(hp.other_xml)
                     columns phv2 number path '/') t
    where   hs.sql_id          = ms.sql_id
    and     sn.snap_id         = hs.snap_id
    and     sn.instance_number = hs.instance_number
    and     hp.sql_id          = hs.sql_id
    and     hp.plan_hash_value = hs.plan_hash_value
    and     hp.other_xml      is not null)
   select awr.*
   ,       nvl((select max('Y')
               from   subq_baselines b
                where  b.signature = awr.sig
               and    b.accepted  = 'YES'),'N') does_baseline_exist
   ,      nvl2(b.plan_id,'Y','N') is_baselined_plan
   ,      to_char(b.created,'DD-MON-YYYY HH24:MI')  when_baseline_created
   ,b.sql_handle
   from   subq_awr_plans awr
   ,      subq_baselines b
   where  b.signature (+) = awr.sig
   and    b.plan_id   (+) = awr.phv2
  order by awr.snap_id;

Example of how to load the SQL plan baseline for one specific SQL:


SET SERVEROUTPUT ON
DECLARE
  v_plans  NUMBER;
BEGIN
  v_plans := DBMS_SPM.unpack_stgtab_baseline(
               table_name  => 'spm_stage_table',
               table_owner => 'your_user',
               sql_handle  => 'SQL_2644bb9a823bec0e');
  DBMS_OUTPUT.put_line('Plan Unpacked: ' || v_plans);
END;
/


6 - If a good execution plan exists in the AWR repository, you can create a SQL plan baseline from it:


variable x number;
begin
  :x := dbms_spm.load_plans_from_awr( begin_snap=>310417,end_snap=>310418,
                                     basic_filter=>q'# sql_id='cm4dv9adjj6u3' and plan_hash_value='1563030161' #' );
end;
/ 

Thanks
Alex

Thursday, September 8, 2022

How to study for 1z0-931-22 - Oracle Autonomous Database Cloud 2022 Professional

Hi all,

I decided to take all the 2022 OCI exams to check what has changed and to help guide people taking these certifications.

Well, not all of them. I will only take the ones I did in 2019/2020/2021 that cover what I work with on a day-to-day basis 🙂


On this test, I was expecting to see a few new things (comparing the course topics with the previous years).
I don't remember the questions from the 2021 exam (I took it a year ago), but no new question caught me completely off guard.

A good starting point is here:

You can also find some sample questions at this link; they are really helpful for understanding what to expect on the real test:


Make sure you understand all the differences between Autonomous Shared and Autonomous Dedicated.

Study:
  • Autonomous Dedicated
  • Autonomous JSON
  • Autonomous Data Guard
  • Graph
  • Oracle Text
  • Data Insights
  • Spatial



And last but not least, stay away from Brain Dumps!

Best of luck
Alex Zaballa 

Monday, September 5, 2022

How to study for 1z0-1093-22 - Oracle Cloud Database Services 2022 Professional

Hi all,

I decided to take all the 2022 OCI exams to check what has changed and to help guide people taking these certifications.

Well, not all of them. I will only take the ones I did in 2019/2020/2021 that cover what I work with on a day-to-day basis 🙂


On this test, I was expecting to see a few new things (comparing the course topics with the previous years).
I don't remember the questions from the 2021 exam (I took it a year ago), but no new question caught me completely off guard.

A good starting point is here:

I recommend getting a free account and testing everything you can, especially NoSQL and MySQL.

You can also find some sample questions at this link; they are really helpful for understanding what to expect on the real test.

Make sure you understand all the differences between ExaCS and ExaCC.
Also, have a good understanding of Monitoring and especially Performance Hub.

And last but not least, stay away from Brain Dumps!

Best of luck
Alex Zaballa