
Unveiling the Power of SAP HANA Cloud Migration: Self Service Migration Tool

So, you're the admin, and you're ready to roll with an SAP HANA Cloud database instance. Great call! Let's map out the essentials to make sure you're all set.

Getting Subaccount Access Right

First things first – to deploy the database, you'll need access to a subaccount in your company's SAP Business Technology Platform (SAP BTP) global account. But not just any access – you'll want the Subaccount Administrator role collection. If your user account is missing this role collection, no worries. Just give a shout to your SAP BTP global account admin and ask them to hook you up. It's like getting the backstage pass for the deployment concert.

Unlocking the Toolbox: SAP HANA Cloud Administration Tools

Now, let's talk tools. To craft and manage your database instances, you're going to need the SAP HANA Cloud - tools plan in your subaccount. This magical plan opens the door to a trio of game-changers: SAP HANA Cloud Central, SAP HANA cockpit, and SAP HANA database explorer.

To activate the new multi-environment SAP HANA Cloud tools, follow the instructions in the blog post:

Benefits and features of the multi-environment SAP HANA Cloud tools

The Old vs. the New

HANA Cloud tools in Cloud Foundry are still hanging around, but here's the scoop – the Cloud Foundry edition of SAP HANA Cloud Central is waving goodbye. It's officially on the way out. SAP's advice? Make the shift to the new multi-environment SAP HANA Cloud tools. That's where the buzz is, and that's where the future is headed.

Implementation Starts Here

In the SAP BTP cockpit, navigate to SAP HANA Cloud Central to create and manage database instances: Instances and Subscriptions > Applications > SAP HANA Cloud > Instances.



To access the Self-Service Migration for SAP HANA Cloud tool in SAP HANA Cloud Central, no additional configuration is needed for SAP HANA Service scenarios (Neo and Cloud Foundry).

However, for SAP HANA on-premises scenarios, ensure that the connectivity_proxy service plan is added to the Connectivity service. If it isn't available, contact the SAP BTP global account administrator to add this required service plan, because the on-premises source has to be connected through the Cloud Connector for self-service migration.

To locate the Self-Service Migration Tool in SAP BTP Cockpit:

In the subaccount navigation, select Instances and Subscriptions.
From the Applications table, choose SAP HANA Cloud.
In the navigation area, click Migrations to create and manage migrations to SAP HANA Cloud.

Deploy an SAP HANA Cloud Database

The step-by-step method to provision a database is described here.

SAP HANA Database Size

Memory and storage sizes supported by SAP HANA Cloud
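To pick a sensible target size, it helps to look at the current footprint of the source database first. The following is a minimal sketch using the standard monitoring views M_SERVICE_MEMORY and M_DISKS on the source system; treat the results only as a starting point and follow SAP's official sizing guidance for the final decision.

-- Rough memory footprint of the source database, per service
SELECT HOST, SERVICE_NAME,
       ROUND(TOTAL_MEMORY_USED_SIZE / 1024 / 1024 / 1024, 2) AS USED_MEMORY_GB
  FROM SYS.M_SERVICE_MEMORY;

-- Rough disk usage of the data volumes on the source system
SELECT HOST, USAGE_TYPE,
       ROUND(USED_SIZE / 1024 / 1024 / 1024, 2) AS USED_GB,
       ROUND(TOTAL_SIZE / 1024 / 1024 / 1024, 2) AS TOTAL_GB
  FROM SYS.M_DISKS
 WHERE USAGE_TYPE = 'DATA';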

Creating a Migration User in the Source Database

Migrating from an SAP HANA database to SAP HANA Cloud requires a dedicated user in your SAP HANA source database. The Self-Service Migration tool will check whether the provided user has all needed privileges to perform the migration. Any missing privileges and the objects on which they must be granted will be displayed in the user interface. 

On the source database, open a SQL Console in the Database Explorer.

1. Create a migration user by executing the following statement in the Database Explorer:

CREATE USER <username> PASSWORD <password> NO FORCE_FIRST_PASSWORD_CHANGE;

2. Grant the necessary privileges to the migration user by running the following script in the Database Explorer:

GRANT SELECT ON _SYS_REPO.DELIVERY_UNITS TO <username>;

GRANT SELECT ON _SYS_REPO.ACTIVE_OBJECT TO <username>; 

GRANT CATALOG READ TO <username>;

GRANT INIFILE ADMIN TO <username>;

GRANT TRUST ADMIN TO <username>;

GRANT CERTIFICATE ADMIN TO <username>;

GRANT CREATE REMOTE SOURCE TO <username>;

3. For each schema you need to migrate, grant SELECT on that schema:

GRANT SELECT ON SCHEMA "<SCHEMA_NAME>" TO <username>;

The SQL statement must be executed by the owner of the schema or any user who has the permissions to grant the required privileges.
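For illustration, here is a consolidated sketch of the whole preparation, assuming a hypothetical migration user MIGUSER (the example user name used later in this post) and a hypothetical application schema SALES_DATA; replace both with your own values. The last query uses the standard GRANTED_PRIVILEGES system view as an optional sanity check before the tool runs its own privilege check.

-- Minimal sketch: create the migration user (user name and password are placeholders)
CREATE USER MIGUSER PASSWORD "Str0ngPassw0rd1" NO FORCE_FIRST_PASSWORD_CHANGE;

-- Privileges required by the Self-Service Migration tool
GRANT SELECT ON _SYS_REPO.DELIVERY_UNITS TO MIGUSER;
GRANT SELECT ON _SYS_REPO.ACTIVE_OBJECT TO MIGUSER;
GRANT CATALOG READ TO MIGUSER;
GRANT INIFILE ADMIN TO MIGUSER;
GRANT TRUST ADMIN TO MIGUSER;
GRANT CERTIFICATE ADMIN TO MIGUSER;
GRANT CREATE REMOTE SOURCE TO MIGUSER;

-- Repeat for every schema that should be migrated (SALES_DATA is a placeholder)
GRANT SELECT ON SCHEMA "SALES_DATA" TO MIGUSER;

-- Optional sanity check: list what MIGUSER has actually been granted
SELECT PRIVILEGE, OBJECT_TYPE, SCHEMA_NAME, OBJECT_NAME
  FROM SYS.GRANTED_PRIVILEGES
 WHERE GRANTEE = 'MIGUSER';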


Install and Configure the Cloud Connector

The Cloud Connector allows you to set up a secure connection between the source SAP HANA on-premise database and the target SAP HANA Cloud database instance on SAP BTP.

Installation of Cloud Connector on Linux 
Note: This requires the connectivity_proxy service plan to be added to the Connectivity service in your subaccount entitlements.

Fill in the following details in the Self-Service Migration tool:

Migration Name: Give a short, descriptive migration project name. Example: Migration OP to HC

Migration Description: Give a good description of the migration project. Example: Self-Service Migration SAP HANA on-premises to SAP HANA Cloud

Source Instance Host: Virtual host name exposed to SAP BTP through the Cloud Connector. Example: op-source-hana (the real host name wdflbmt7346 is not exposed to SAP BTP)

Source Instance Port: Virtual port number exposed to SAP BTP through the Cloud Connector. Example: 42015 (the real port number 30015 is not exposed to SAP BTP)

Cloud Connector Location ID: Location ID defined in the Cloud Connector. Example: SCC-OP-HANA

Migration Username: Dedicated migration user account created on the source database. Example: MIGUSER

Migration Username Password: Password specified during the creation of the migration user. Example: Welcome1 (this is not a strong password)

Target Database Username: Database administrator user name created during SAP HANA Cloud database deployment. Example: DBADMIN

Target Username Password: Password specified during the deployment of the SAP HANA Cloud database. Example: Welcome1 (this is not a strong password)




Migrating from SAP HANA Service (SAP BTP, Neo Environment)



Note: This scenario migrates only the catalog and data to SAP HANA Cloud. Migration of XS Classic and SAP HANA repository content, or other design-time artifacts, is not supported.
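If you are not sure whether the source database still contains repository (XS Classic) content that would be left behind, a quick inventory can help. The following is a minimal sketch, assuming a user with SELECT on _SYS_REPO.ACTIVE_OBJECT (the migration user created earlier already has this); the packages and object types it returns are the design-time artifacts this scenario will not carry over.

-- Minimal sketch: count active repository objects per package and object type
SELECT PACKAGE_ID, OBJECT_SUFFIX, COUNT(*) AS OBJECT_COUNT
  FROM "_SYS_REPO"."ACTIVE_OBJECT"
 GROUP BY PACKAGE_ID, OBJECT_SUFFIX
 ORDER BY OBJECT_COUNT DESC;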

Migration Name: Descriptive name for the migration project. Example: Neo to HC Self-Service Migration

SAP BTP Region: Region of the source database within SAP BTP. Example: Europe (Frankfurt)

Technical Subaccount: Name of the technical subaccount in the Neo environment. Example: pvwaf4nsum

Source Database ID: Database/schema ID listed in the Databases & Schemas list. Example: neosource

Account Username: User name for connecting to the SAP BTP Neo environment. Example: P1234567890

Account Password: Password for the specified user account (P1234567890). Example: Welcome1

Database Username: Dedicated migration user account on the source database. Example: MIGUSER

Database Password: Password specified during the creation of the migration user. Example: Welcome1

Target Instance: Name of the SAP HANA Cloud target database. Example: HC_Neo_Target


Migrating from SAP BTP Cloud Foundry




Note: For this scenario, the source and target databases need to be located in the same region.


Migration Name: Descriptive name for the migration project. Example: CF to HC Self-Service Migration

Credentials for SAP BTP Cloud Foundry:

Username: User name for connecting to the SAP BTP Cloud Foundry environment. Example: P1234567890

Password: Password for the specified user account (P1234567890). Example: Welcome1

Source Database Information:

SAP BTP Region: Region of the source database within SAP BTP. Example: EU (Frankfurt)

Cloud Foundry Space: Cloud Foundry space name where the SAP HANA Service is located. Example: dev

Source Instance Name: SAP HANA Service instance name shown in the Service Instances list. Example: CF_Source

Username (Source DB): Dedicated migration user account created on the source database during preparations. Example: MIGUSER

Password (Source DB): Password specified during the creation of the migration user. Example: Welcome1

Target Database Information:

Migrated Instance Prefix: Prefix for the new migrated instance (SAP HANA Schemas & HDI Containers service). Example: demo_

Target Instance: Name of the SAP HANA Cloud target database. Example: HC_CF_Target

Target DB Username: Database administrator user name created during deployment. Example: DBADMIN

Target Username Password: Password specified during the deployment of the target database. Example: Welcome1



