Step 1: Content Migration for SAP HANA
The old SAP HANA 1.0 repository stores calculation views (native artifacts used to create and access native database objects) in a single schema, _SYS_BIC, which is not zero downtime compliant.
Going forward, these native SAP HANA artifacts will be shipped in HDI containers so that they are zero downtime ready.
Several areas in SAP S/4HANA already provide reports to migrate native SAP HANA content from the SAP HANA 1.0 repository to HDI containers.
In summary, native artifacts that were used to create or access native database objects are not supported by ZDO, and they are migrated to HDI containers to make them compatible.
All changes performed by SUM during ZDO must be invisible to users working on the bridge subsystem. The SAP HANA 1.0 repository (native artifacts) is not capable of encapsulating these changes during a ZDO upgrade.
Since SAP S/4HANA 1709 FPS02, calculation views (native artifacts) can be stored in the SAP HANA Deployment Infrastructure (HDI), which fully supports zero downtime maintenance. Note that HDI was introduced with SAP S/4HANA 1709, while ZDO only became generally available with SAP S/4HANA 2020.
One-line summary: to use ZDO, all database objects that were created or accessed via native artifacts must be moved to HDI. This migration is required before upgrading with the ZDO approach.
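This precondition can be pictured as a simple readiness check: before starting a ZDO upgrade, no calculation view may still live in the classic repository. The sketch below is purely illustrative (the artifact list and helper names are invented, not an SAP API):

```python
# Illustrative ZDO readiness check: every native artifact must already be
# deployed through HDI; anything still in the SAP HANA 1.0 repository
# (schema _SYS_BIC) blocks the zero downtime approach.
# All names here are hypothetical, not a real SAP interface.

ARTIFACTS = [
    {"name": "SALES_OVERVIEW_CV", "storage": "hdi"},
    {"name": "LEGACY_MARGIN_CV", "storage": "repository"},  # still in _SYS_BIC
    {"name": "STOCK_LEVELS_CV", "storage": "hdi"},
]

def zdo_blockers(artifacts):
    """Return the calculation views that must be migrated to HDI first."""
    return [a["name"] for a in artifacts if a["storage"] == "repository"]

blockers = zdo_blockers(ARTIFACTS)
if blockers:
    print("Migrate to HDI before ZDO:", blockers)
```

In real life this inventory comes from the migration reports mentioned above, not from a hand-maintained list.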
Step 2: ZDO Compliance and Enablement
To follow this approach, developers must adhere to strict ZDO development guidelines. The following kinds of changes, delivered in upgrades and updates, can be critical:
Structural changes to the persistence layer.
Delivering new transport changes linked to After Import Methods (AIM).
New application-defined procedures such as AIMs or XPRA programs. The tables accessed when running these functions, methods, and reports must be identified for phase XPRAS_AIMMRG.
Within SAP, all development systems are connected to the ABAP Test Cockpit (ATC), which runs checks with a focus on ZDO compliance.
Not only changes to database objects (which should be in HDI), but also new development of application-defined procedures, such as After Import Methods or new XPRA programs, must follow the ZDO guidelines.
Proper encapsulation of the source and target release is required; this is achieved by knowing every table that is either changed or accessed by the upgrade.
The ZDO enablement of an application-defined procedure contains the metadata on the tables accessed when its source code is executed.
If a ZDO enablement is missing, the vendor can ship the enhancement with an upcoming support package. ZDO enablements are also shipped with new patch levels of the Software Update Manager.
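The enablement idea above boils down to: every AIM or XPRA must declare which tables its code touches, so the upgrade tool can encapsulate them. A minimal conceptual sketch, with all procedure and table names invented for illustration:

```python
# Hypothetical sketch of ZDO enablement metadata: each application-defined
# procedure (AIM / XPRA) declares the tables its source code accesses.
# A procedure without this metadata cannot be encapsulated and would need
# an enablement shipped later. None of these names are a real SAP interface.

PROCEDURES = {
    "AIM_UPDATE_PRICING":  {"tables": ["KONV", "PRCD_ELEMENTS"]},
    "XPRA_CONVERT_ORDERS": {"tables": ["VBAK", "VBAP"]},
    "AIM_REBUILD_INDEX":   {},  # enablement metadata missing
}

def missing_enablement(procedures):
    """Procedures whose accessed-table metadata has not been maintained."""
    return sorted(name for name, meta in procedures.items()
                  if "tables" not in meta)

print(missing_enablement(PROCEDURES))
```

In the real product, such gaps are closed by the vendor via support packages or newer SUM patch levels, as described above.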
In exceptional cases, complex changes might be shipped for certain applications. These complex changes can have an impact in the application during the ZDO upgrade.
Step 3: ZDO Table Classification
Key Principle: Encapsulate the Source and Target Release
To achieve this, as already mentioned in Step 2, every table that is either changed or accessed by the upgrade tool must be known.
RUN_RSPTBFIL_ZDM_CLASSIFY: this automated classification process is triggered and completely handled by SUM.
Image source: the official SAP ZDO training book CA330
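The classification idea can be illustrated with a toy rule set: tables the upgrade changes are handled on a copy, tables it only reads stay stable for the bridge, and untouched tables can simply be shared. This is an invented approximation for explanation only; the real logic behind RUN_RSPTBFIL_ZDM_CLASSIFY inside SUM is far more nuanced:

```python
# Simplified illustration of ZDO-style table classification.
# The rules and category names below are an invented approximation
# for explanation; they do not reproduce SUM's actual algorithm.

def classify(table):
    if table["changed_by_upgrade"]:
        # Structure or content changes: the upgrade works on a clone/copy.
        return "clone"
    if table["accessed_by_upgrade"]:
        # Read by AIMs/XPRAs only: keep it stable during the upgrade.
        return "read-only on bridge"
    # Untouched by the upgrade: bridge and upgrade can share one table.
    return "share"

tables = [
    {"name": "MARA",  "changed_by_upgrade": True,  "accessed_by_upgrade": True},
    {"name": "T000",  "changed_by_upgrade": False, "accessed_by_upgrade": True},
    {"name": "ZCUST", "changed_by_upgrade": False, "accessed_by_upgrade": False},
]
for t in tables:
    print(t["name"], "->", classify(t))
```

The point of the sketch is the dependency: classification is only possible because Step 2 guarantees that every changed or accessed table is known.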
Step 4: Impact Analysis of SUM
ZDO can have an impact on ongoing business operations on the bridge subsystem, depending on which tables are accessed and at what frequency. To prevent such impacts from occurring unexpectedly during the maintenance event on the production system, a check is run in advance that answers the question:
“What would happen if the defined upgrade scope were applied in the production system?”
The first run takes place in a sandbox system with no business users, so the impact there is not a concern. In production, however, business users will be accessing some of these tables, and we should know in advance what effects to expect.
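Conceptually, the impact analysis cross-references the tables the upgrade will restrict with how heavily production actually uses them. The following sketch uses invented table names, usage numbers, and a made-up threshold; SUM's real analysis works differently:

```python
# Hedged sketch of the impact-analysis idea: flag restricted tables that
# production users touch frequently. All figures and the threshold are
# invented for illustration; this is not SUM's actual check.

restricted = {"MARA": "read-only on bridge", "KONV": "clone"}
prod_access_per_hour = {"MARA": 12000, "KONV": 3, "T000": 500}

def impact_report(restricted, usage, threshold=100):
    """Restricted tables whose production usage exceeds the threshold."""
    return {t: mode for t, mode in restricted.items()
            if usage.get(t, 0) >= threshold}

print(impact_report(restricted, prod_access_per_hour))
```

A sandbox run would feed this with empty usage statistics and report nothing, which is exactly why the check only becomes meaningful against production access patterns.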
Step 5: Silent Data Migration
With SAP S/4HANA 1909, SAP introduced a new framework to deliver data model simplifications in the context of the Zero Downtime Option. It allows developers to decouple the data model changes of a business application from its functional changes. After the upgrade, while the system runs on the target version, the old data model remains compatible with the new application (both the old and the new data model can be used).
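The decoupling described above can be sketched in two parts: a compatibility read that works with both data models, and a background job that converts rows in small chunks without the application noticing. Field names and the chunking scheme are invented for illustration, not SAP's actual implementation:

```python
# Conceptual sketch of silent data migration: the new application reads
# both the old and the new data model, while a background job migrates
# records chunk by chunk. All field names here are hypothetical.

rows = [
    {"id": 1, "amount_old": "10,50", "amount_new": None},  # old model
    {"id": 2, "amount_old": None,    "amount_new": 20.0},  # new model
]

def read_amount(row):
    """Compatibility read: prefer the new model, fall back to the old one."""
    if row["amount_new"] is not None:
        return row["amount_new"]
    return float(row["amount_old"].replace(",", "."))

def migrate_silently(rows, chunk_size=50):
    """Convert old-model rows to the new model in small background chunks."""
    pending = [r for r in rows if r["amount_new"] is None]
    for row in pending[:chunk_size]:
        row["amount_new"] = read_amount(row)
        row["amount_old"] = None

migrate_silently(rows)
print([read_amount(r) for r in rows])  # → [10.5, 20.0]
```

Because `read_amount` returns the same values before, during, and after the migration, the functional change can go live first and the data conversion can trail behind silently.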