You may be able to relate to this: upgrading Oracle databases can be a real drag.
Databases tend to be at the heart of critical business processes, and you may have spent a lot of time getting everything into perfect working order. Upgrading your Oracle database from 11g to 19c can be intimidating: things will change, and you risk losing the benefit of all the work already done. This is where testing comes into play. What are your options for testing performance changes when moving from Oracle 11g to 19c?
There are basically three options:
- Oracle Real Application Testing (RAT)
- Simulating application workload, either manually or with tools like LoadRunner, NeoLoad, or a number of web-specific tools
- Looking at individual SQL, either manually or through tools like SQL Tuning Sets in the Oracle Tuning Pack
First, let’s put a few rules in place when it comes to testing. For a test to be valid, you need an environment that is as close as possible to the target production system you will be moving to. That means if you are moving to new servers or changing hardware or storage during the upgrade, make sure you run your tests on that new hardware or storage. Oracle 19c has many new features that take advantage of newer hardware. Running 19c on an 8-year-old server that previously ran 11g is not the same as testing 19c on a brand-new server. Because the Oracle optimizer is very aware of the underlying infrastructure, it will not behave the same way, so you would not really be testing the upgrade.
Let’s give a quick description of each of these options that might take some of the stress out of upgrading your Oracle database.
ORACLE REAL APPLICATION TESTING OR RAT
Oracle makes a purpose-built database tool for application workload testing: Real Application Testing, or RAT. Acronym aside, the tool is pretty interesting. RAT captures SQL statements from your production database; you then clone the database to another environment and re-run all of that SQL. This is one of the most complete ways to test just the database component of your application. There are a couple of downsides to this tool. First, it only tests the database component. If you are upgrading the database software, changing the database hardware, or only adjusting database settings, then this tool is great. If you are making application changes or updating the database schema at the same time, then this tool will not work, or at a minimum will not be able to test those changes. Also, RAT is a separately licensed, extra-cost option.
RAT will run through all the application SQL in order, but not necessarily in the same amount of time. For example, if you capture 8 hours of production workload, the replay will probably take longer than 8 hours on a similarly sized test system, even if 19c is faster at individual SQL statements. Also, RAT cannot simulate everything, such as Data Guard and GoldenGate. These replication tools do add some overhead, and RAT will not simulate it, though most of that overhead is very small.
High-level steps for a RAT replay:
- Capture the workload on your production system, and be sure to include AWR data
- Clone the production database to a test area as of the start time of your capture
- Upgrade the cloned test system to 19c and apply database configuration changes
- Transfer the capture files from production to test, and process the capture files
- Run the database replay on the test system
- Load your AWR data from production and run comparison reports between production and test
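RAT's AWR compare reports do the production-versus-test comparison for you, but the core idea is simple: line up per-SQL elapsed times from the capture and the replay and flag what got slower. Here is a minimal sketch of that comparison in Python; the sql_ids and timings are hypothetical sample data, not RAT output.

```python
# Minimal sketch: flag SQL statements that regressed between a production
# capture and a test replay. RAT's compare reports do this for real;
# the sql_ids and elapsed times below are made-up sample data.

def find_regressions(prod_ms, test_ms, threshold=1.25):
    """Return sql_ids whose elapsed time grew by more than `threshold`x."""
    regressions = {}
    for sql_id, prod_time in prod_ms.items():
        test_time = test_ms.get(sql_id)
        if test_time is not None and test_time > prod_time * threshold:
            regressions[sql_id] = test_time / prod_time
    return regressions

prod = {"8mkz1a2c": 120.0, "4fq9x7bd": 45.0, "a1b2c3d4": 900.0}
test = {"8mkz1a2c": 110.0, "4fq9x7bd": 95.0, "a1b2c3d4": 910.0}

for sql_id, ratio in sorted(find_regressions(prod, test).items()):
    print(f"{sql_id}: {ratio:.1f}x slower after upgrade")
```

A threshold above 1.0 keeps normal run-to-run noise from being reported as a regression; tune it to your environment.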
The great thing about RAT is you get a very detailed report of database effort and work. You can see exactly what the 19c and possibly new hardware did differently at the database level.
The downside to RAT is that you cannot change the application schema other than adding indexes or updating statistics. Overall, this is a great way to test your database upgrade, and you can roll back your test database and re-run the RAT testing multiple times, which allows for iterative changes to database configuration.
SIMULATING APPLICATION WORKLOAD
Another option is full application testing. In this scenario, you have a complete test system that includes the database and application tiers and a full setup of the application software. The basic idea is to set up a test environment that users can log into, and then create transactions or usage on the application.
There can be a lot of complexity in setting up another full environment with all the application components. It can also be difficult to replicate enough hardware to achieve production-level performance or load. A second challenge is the testing itself. It might be easy to have a user log in to the test environment and run a few application transactions. That might give you a feel for individual performance, but it really can’t compare to having many concurrent users. A system with a single user behaves very differently from a system with hundreds or even thousands of active users. To simulate those levels of workload, you will need tools like LoadRunner that drive end-user transactions through the application GUI or API. Creating tests that simulate many active users can also be extremely complex, and they can still fall short of real workload patterns.
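To make the single-user versus many-user distinction concrete, here is a minimal sketch of the idea behind a multi-user load generator, using Python threads as the "virtual users." Everything here is a stand-in: real tools like LoadRunner drive the actual application GUI or API, and `run_transaction` is a hypothetical stub for whatever your test transaction really does.

```python
# Minimal load-generator sketch: N virtual users each run a transaction
# several times concurrently. run_transaction is a hypothetical stub.
import threading
import time
from collections import Counter

results = Counter()
results_lock = threading.Lock()

def run_transaction(user_id):
    """Stand-in for one application transaction (e.g., an order lookup)."""
    time.sleep(0.01)  # simulate request/response time
    return "ok"

def virtual_user(user_id, iterations):
    for _ in range(iterations):
        status = run_transaction(user_id)
        with results_lock:
            results[status] += 1

users = [threading.Thread(target=virtual_user, args=(u, 5)) for u in range(20)]
start = time.time()
for t in users:
    t.start()
for t in users:
    t.join()
elapsed = time.time() - start
print(f"{sum(results.values())} transactions by {len(users)} users in {elapsed:.2f}s")
```

The hard part in real testing is not the concurrency loop above but making each virtual user's behavior (data, pacing, think time) match real workload patterns.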
The high-level steps for application testing are very dependent on your testing process and environment. You may have a test system with full technology stack and redundant application servers, load balancers, web servers, and databases. Or you may have a very simple setup just to perform basic smoke testing.
The plus side is that you can test application changes and the database upgrade at the same time. The trade-off is that during testing you may have a hard time teasing out whether a difference is due to an application change or to the database upgrade. That can extend your tuning effort, though it can decrease the total amount of testing work.
MANUALLY RUNNING SQL
A third option is to manually run some SQL statements in your test database. You can capture the SQL from the production system using SQL history (ASH or AWR), or with SQL Tuning Sets (STS) if you are licensed for them. SQL can also be captured using SQL or session tracing. These statements can then be re-run in a test environment to see how they perform after the database has been modified or upgraded.
As you might already be guessing, this gets complicated. Finding the statements you want to re-run, possibly updating them, and then making sure the environment is set correctly for the SQL to work in a test environment all take effort. If you have no other avenue for testing, though, this approach lets you pick out specific long-running or high-impact SQL. It is really a very minimal test, but it may show how the software changes affected the SQL performance profile.
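Picking out the long-running or high-impact statements is mostly a top-N exercise over whatever elapsed-time data you captured from AWR or ASH. A minimal sketch, with hypothetical sql_ids and timings:

```python
# Minimal sketch: pick the N heaviest statements from captured workload
# data (e.g., total elapsed time per sql_id pulled from AWR/ASH).
# The sql_ids and numbers are hypothetical samples.
import heapq

workload = {
    "8mkz1a2c": 120.5,   # sql_id -> total elapsed seconds
    "4fq9x7bd": 845.0,
    "a1b2c3d4": 12.3,
    "9zzq0p1e": 301.7,
}

def top_n_sql(elapsed_by_sql_id, n=2):
    """Return the n sql_ids with the highest total elapsed time."""
    return heapq.nlargest(n, elapsed_by_sql_id, key=elapsed_by_sql_id.get)

print(top_n_sql(workload))  # the two heaviest statements
```

Total elapsed time is only one possible ranking; executions, buffer gets, or CPU time may matter more for your workload.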
The high-level steps for SQL testing vary a little based on whether you are using a tool like SQL Tuning Sets or manually capturing statements. Either way, it looks something like this:
- Identify and capture the specific SQL in production (be sure to capture explain plans and run-time statistics)
- Have a test environment that is similar to production in data volume and configuration (or clone from production)
- Upgrade the test system to 19c
- Transfer the SQL statements to test (text files, STS, etc.)
- Run the statements in the test environment and compare explain plans and run-time statistics
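The replay step above can be sketched as a simple loop that runs each captured statement and records its elapsed time. In practice you would connect to the 19c test database (for example, with the python-oracledb driver); sqlite3 stands in here purely so the sketch is self-contained, and the table and statements are hypothetical samples.

```python
# Minimal sketch: replay captured SQL statements and record elapsed times.
# sqlite3 is a stand-in for a real Oracle test-database connection.
import sqlite3
import time

captured_sql = [
    "SELECT COUNT(*) FROM orders",
    "SELECT SUM(amount) FROM orders WHERE status = 'OPEN'",
]

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER, amount REAL, status TEXT)")
conn.executemany("INSERT INTO orders VALUES (?, ?, ?)",
                 [(i, i * 10.0, "OPEN" if i % 2 else "CLOSED")
                  for i in range(100)])

timings = {}
for stmt in captured_sql:
    t0 = time.perf_counter()
    rows = conn.execute(stmt).fetchall()
    timings[stmt] = (time.perf_counter() - t0, rows[0][0])

for stmt, (elapsed, first_value) in timings.items():
    print(f"{elapsed * 1000:.3f} ms  result={first_value}  {stmt}")
```

Capturing the result of each query alongside its timing lets you confirm the upgraded database returns the same answers, not just faster or slower ones.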
The plus side of running individual SQL statements is the ability to look at them in depth. You will get very detailed performance information by running traces or SQLT explain plans. The negative side is that you won’t get a multi-user view of the system.
All of this comes down to simulating the workload of the application in an upgraded database environment, with the goal of reducing the anxiety and risk of your upgrade. To be clear, none of this will fully eliminate that risk, but all of it will make you more prepared for that go-live date. No matter which path you choose, you will gain more hands-on experience with the 19c upgrade process and see whether your application performs differently, or the same, on a 19c database.