Oracle automatic upgrade, the Docker way

Mike Dietrich has published a post and slides about how to do an automatic upgrade of your Oracle RDBMS using the AutoUpgrade tool.

But what happens if your RDBMS is running on Docker or Docker Swarm? Well, it is quite easy. Let's see.

Assume we have a Swarm or Docker Compose stack like:

version: '3.6'
services:
  db:
    image: oracle/database:12.2.0.1-ee
    hostname: ols
    volumes:
      - /home/data/db/12cR2.test:/opt/oracle/oradata
      - /run/shm:/dev/shm
    ports:
      - 1521:1521
    environment:
      - ORACLE_SID=TEST
      - ORACLE_PDB=PDB1

This stack runs a working 12cR2 database that we want to migrate to 19.3. First, as the AutoUpgrade documentation says, the RDBMS must be running in archive log mode; change it by executing:

SQL> alter system set db_recovery_file_dest_size=30g scope=both;
System altered.
SQL> shutdown immediate;
Database closed.
Database dismounted.
ORACLE instance shut down.
SQL> startup mount;
ORACLE instance started.
Total System Global Area 1610612736 bytes
Fixed Size 8793304 bytes
Variable Size 520094504 bytes
Database Buffers 1073741824 bytes
Redo Buffers 7983104 bytes
Database mounted.
SQL> alter database archivelog;
Database altered.
SQL> alter database open;
Database altered.

To upgrade to 19c we will proceed with the two-step migration option: first running analyze plus fixups, and finally the upgrade itself, as shown in this picture:

auto-upgrade steps

To do that we will use a migrate.sh bash script:

migrate.sh bash script

The script is designed to run using the startup-script functionality provided by RDBMS images created with Oracle's official Docker build scripts.
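The actual script contents are shown above as an image; as a rough idea, a minimal sketch could look like the following. This is hypothetical (not the author's exact script) and assumes autoupgrade.jar and migrate.conf are mounted under /opt/oracle/scripts/startup, as in the stack definition used in this post.

```shell
#!/bin/bash
# Hypothetical sketch of migrate.sh; the real script is shown above as an image.
# Assumes autoupgrade.jar and migrate.conf are mounted into the container.
SCRIPT_DIR=/opt/oracle/scripts/startup

# With no argument, run the two analysis phases (analyze, then fixups);
# with the "upgrade" argument, run the real upgrade (deploy mode).
pick_modes() {
  if [ "$1" = "upgrade" ]; then
    echo "deploy"
  else
    echo "analyze fixups"
  fi
}

run_autoupgrade() {
  for mode in $(pick_modes "$1"); do
    java -jar "$SCRIPT_DIR/autoupgrade.jar" \
         -config "$SCRIPT_DIR/migrate.conf" -mode "$mode"
  done
}

# In the container the script would end with:
# run_autoupgrade "$1"
```

The two-phase split matches the AutoUpgrade modes: analyze and fixups for the check run, deploy for the actual upgrade.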

To execute the above script we will change our Docker stack definition, adding new file mounts:

.....
    volumes:
      - /home/data/db/12cR2.test:/opt/oracle/oradata
      - /home/oracle/autoupgrade.jar:/opt/oracle/scripts/startup/autoupgrade.jar
      - /home/oracle/migrate.conf:/opt/oracle/scripts/startup/migrate.conf
      - /home/oracle/migrate.sh:/opt/oracle/scripts/startup/migrate.sh
      - /run/shm:/dev/shm

autoupgrade.jar is downloaded from the My Oracle Support link. After adding the above mount options to the docker-compose-migrate.yml file, a shutdown/startup is required; let's try:

oracle@localhost:~/sample-stacks$ docker service scale migrate_db=0
oracle@localhost:~/sample-stacks$ docker stack deploy -c docker-compose-migrate.yml migrate

The migrate.conf file only holds options defining the source and target versions; in our example it is:

global.autoupg_log_dir=/tmp/jobmgr
upg1.dbname=TEST
upg1.source_home=/opt/oracle/product/12.2.0.1/dbhome_1
upg1.target_home=/opt/oracle/product/19c/dbhome_1
upg1.sid=TEST
upg1.start_time=now
upg1.pdbs=*
upg1.log_dir=/tmp
upg1.upgrade_node=localhost
upg1.run_utlrp=yes
upg1.timezone_upg=yes
upg1.target_version=19.3
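Since only the SID and the two Oracle homes really change between environments, the file could also be generated with a small helper script. This is a hypothetical convenience, not part of the AutoUpgrade tooling:

```shell
#!/bin/sh
# Hypothetical helper that writes a migrate.conf for a given SID,
# source home and target home; all other options stay fixed.
write_migrate_conf() {
  sid=$1; src_home=$2; dst_home=$3; out=$4
  cat > "$out" <<EOF
global.autoupg_log_dir=/tmp/jobmgr
upg1.dbname=$sid
upg1.source_home=$src_home
upg1.target_home=$dst_home
upg1.sid=$sid
upg1.start_time=now
upg1.pdbs=*
upg1.log_dir=/tmp
upg1.upgrade_node=localhost
upg1.run_utlrp=yes
upg1.timezone_upg=yes
upg1.target_version=19.3
EOF
}
```

Calling write_migrate_conf TEST /opt/oracle/product/12.2.0.1/dbhome_1 /opt/oracle/product/19c/dbhome_1 migrate.conf reproduces the file above.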

When the database starts up with the modified stack, it will log this output:

migrate_db.1.3pfwfxwkgo66@localhost    | /opt/oracle/runUserScripts.sh: running /opt/oracle/scripts/startup/migrate.sh
migrate_db.1.3pfwfxwkgo66@localhost | Executing /opt/oracle/scripts/startup
migrate_db.1.3pfwfxwkgo66@localhost | Autoupgrade tool launched with default options
migrate_db.1.3pfwfxwkgo66@localhost | +--------------------------------+
migrate_db.1.3pfwfxwkgo66@localhost | | Starting AutoUpgrade execution |
migrate_db.1.3pfwfxwkgo66@localhost | +--------------------------------+
migrate_db.1.3pfwfxwkgo66@localhost | 1 databases will be analyzed
migrate_db.1.3pfwfxwkgo66@localhost | Job 100 completed
migrate_db.1.3pfwfxwkgo66@localhost | ------------------- Final Summary --------------------
migrate_db.1.3pfwfxwkgo66@localhost | Number of databases [ 1 ]
migrate_db.1.3pfwfxwkgo66@localhost |
migrate_db.1.3pfwfxwkgo66@localhost | Jobs finished successfully [1]
migrate_db.1.3pfwfxwkgo66@localhost | Jobs failed [0]
migrate_db.1.3pfwfxwkgo66@localhost | Jobs pending [0]
migrate_db.1.3pfwfxwkgo66@localhost | ------------- JOBS FINISHED SUCCESSFULLY -------------
migrate_db.1.3pfwfxwkgo66@localhost | Job 100 FOR TEST
migrate_db.1.3pfwfxwkgo66@localhost |
migrate_db.1.3pfwfxwkgo66@localhost | Autoupgrade tool launched with default options
migrate_db.1.3pfwfxwkgo66@localhost | +--------------------------------+
migrate_db.1.3pfwfxwkgo66@localhost | | Starting AutoUpgrade execution |
migrate_db.1.3pfwfxwkgo66@localhost | +--------------------------------+
migrate_db.1.3pfwfxwkgo66@localhost | 1 databases will be processed
migrate_db.1.3pfwfxwkgo66@localhost |
migrate_db.1.3pfwfxwkgo66@localhost | -------------------------------------------------
migrate_db.1.3pfwfxwkgo66@localhost | Errors in database [TEST]
migrate_db.1.3pfwfxwkgo66@localhost | Stage [PRECHECKS]
migrate_db.1.3pfwfxwkgo66@localhost | Operation [STOPPED]
migrate_db.1.3pfwfxwkgo66@localhost | Status [ERROR]
migrate_db.1.3pfwfxwkgo66@localhost | Info [
migrate_db.1.3pfwfxwkgo66@localhost | Error:
migrate_db.1.3pfwfxwkgo66@localhost | [Unexpected Exception Error]
migrate_db.1.3pfwfxwkgo66@localhost | Cause:
migrate_db.1.3pfwfxwkgo66@localhost | For further details, see the log file located at /tmp/TEST/101/autoupgrade_20190504_user.log]
migrate_db.1.3pfwfxwkgo66@localhost |
migrate_db.1.3pfwfxwkgo66@localhost | -------------------------------------------------
migrate_db.1.3pfwfxwkgo66@localhost | Logs: [/tmp/TEST/101/autoupgrade_20190504_user.log]
migrate_db.1.3pfwfxwkgo66@localhost | Job 101 completed
migrate_db.1.3pfwfxwkgo66@localhost | ------------------- Final Summary --------------------
migrate_db.1.3pfwfxwkgo66@localhost | Number of databases [ 1 ]

The failed job (101) in my test reported that there was not enough flash recovery area, but with the 30 GB defined above the migration process ends OK.

Now the database is checked and prepared for migration to the 19.3 release. Let's change our docker-compose file to the new Docker image version and launch it with the upgrade option; the changes look like:

version: '3.6'
services:
  db:
    image: oracle/database:19.3.0-ee
    command: /opt/oracle/scripts/startup/migrate.sh upgrade
    hostname: ols
    volumes:
    ...

Note that we changed the Oracle Docker image to 19.3 and replaced the default start command with migrate.sh using the upgrade argument. A shutdown/startup is required again:

oracle@localhost:~/sample-stacks$ docker service scale migrate_db=0
oracle@localhost:~/sample-stacks$ docker stack deploy -c docker-compose-migrate.yml migrate

When the RDBMS starts it will show the upgrade process, as in:

migrate_db.1.tdkei09htk99@localhost    | SQL*Plus: Release 19.0.0.0.0 - Production on Sat May 4 13:56:16 2019
migrate_db.1.tdkei09htk99@localhost | Version 19.3.0.0.0
....
migrate_db.1.tdkei09htk99@localhost | Autoupgrade tool launched with default options
migrate_db.1.tdkei09htk99@localhost | +--------------------------------+
migrate_db.1.tdkei09htk99@localhost | | Starting AutoUpgrade execution |
migrate_db.1.tdkei09htk99@localhost | +--------------------------------+
migrate_db.1.tdkei09htk99@localhost | 1 databases will be processed
....
migrate_db.1.tdkei09htk99@localhost | Job 100 completed
migrate_db.1.tdkei09htk99@localhost | ------------------- Final Summary --------------------
migrate_db.1.tdkei09htk99@localhost | Number of databases [ 1 ]
migrate_db.1.tdkei09htk99@localhost |
migrate_db.1.tdkei09htk99@localhost | Jobs finished successfully [1]
migrate_db.1.tdkei09htk99@localhost | Jobs failed [0]
migrate_db.1.tdkei09htk99@localhost | Jobs pending [0]
migrate_db.1.tdkei09htk99@localhost | ------------- JOBS FINISHED SUCCESSFULLY -------------
migrate_db.1.tdkei09htk99@localhost | Job 100 FOR TEST
migrate_db.1.tdkei09htk99@localhost |
migrate_db.1.tdkei09htk99@localhost | Remember to update /etc/oratab file with new ORACLE_HOME!!!...

The last message tells us that the /etc/oratab file, which resides in persistent storage, needs to be updated with the new Oracle home directory, now pointing to the 19c binary release:

root@localhost:/home/data/db/12cR2.test# cat dbconfig/TEST/oratab 
#
TEST:/opt/oracle/product/19c/dbhome_1:N
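This edit can also be scripted. A hypothetical sed-based helper (not part of the Oracle images) that points the oratab entry for a given SID at a new ORACLE_HOME could be:

```shell
#!/bin/sh
# Hypothetical helper: rewrite the oratab entry for a SID to a new ORACLE_HOME.
# Usage: update_oratab SID ORATAB_FILE NEW_HOME
update_oratab() {
  sid=$1; file=$2; new_home=$3
  # Replace everything between the first and second colon of the SID's line.
  sed -i "s|^$sid:[^:]*:|$sid:$new_home:|" "$file"
}
```

For example, update_oratab TEST dbconfig/TEST/oratab /opt/oracle/product/19c/dbhome_1 produces the entry shown above.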

The above change ensures that the checkStatus.sh script will work OK at the next start. Finally, removing the docker-compose command option and the upgrade file mounts gives us a database just upgraded to 19c:

version: '3.6'
services:
  db:
    image: oracle/database:19.3.0-ee
    hostname: ols
    volumes:
      - /home/data/db/12cR2.test:/opt/oracle/oradata
      - /run/shm:/dev/shm
    ports:
      - 1521:1521
    environment:
      - ORACLE_SID=TEST
      - ORACLE_PDB=PDB1

With the above docker-compose.yml definition a final database shutdown/startup is required:

oracle@localhost:~/sample-stacks$ docker service scale migrate_db=0
oracle@localhost:~/sample-stacks$ docker stack deploy -c docker-compose-migrate.yml migrate

The log output will show our 19.3 database ready to use:

migrate_db.1.r9yhv9labpah@localhost    | Version 19.3.0.0.0
migrate_db.1.r9yhv9labpah@localhost | The Oracle base remains unchanged with value /opt/oracle
migrate_db.1.r9yhv9labpah@localhost | #########################
migrate_db.1.r9yhv9labpah@localhost | DATABASE IS READY TO USE!
migrate_db.1.r9yhv9labpah@localhost | #########################
....
migrate_db.1.r9yhv9labpah@localhost | ===========================================================
migrate_db.1.r9yhv9labpah@localhost | Dumping current patch information
migrate_db.1.r9yhv9labpah@localhost | ===========================================================
migrate_db.1.r9yhv9labpah@localhost | Patch Id: 29517242
migrate_db.1.r9yhv9labpah@localhost | Patch Description: Database Release Update : 19.3.0.0.190416 (29517242)
migrate_db.1.r9yhv9labpah@localhost | Patch Apply Time: 2019-04-18T07:21:17Z
....
migrate_db.1.r9yhv9labpah@localhost | Patch Id: 29585399
migrate_db.1.r9yhv9labpah@localhost | Patch Description: OCW RELEASE UPDATE 19.3.0.0.0 (29585399)
migrate_db.1.r9yhv9labpah@localhost | Patch Apply Time: 2019-04-18T07:21:33Z

And that's all. Happy upgrading your RDBMS!