Doc Type
Guidelines, Concepts & Cookbooks
  • ICM 7.10
  • Intershop CaaS

Guide - CaaS Database Handling

1 Introduction

This document provides information on how to access and update the database in an Intershop CaaS context.


2 Database Access

This section describes how to access a non-production database (DB) in a CaaS context.

A precondition for accessing the database is that your public SSH key is added to the INT live or edit environment. You can then access the database either directly via the command line or, preferably and much more comfortably, via SQL Management Studio or Azure Data Studio.

To do so, connect to INT via SSH and establish an SSH tunnel (port forwarding) so that a local port on your machine forwards traffic to the remote database host.

Credentials and connection information can be found here: /var/intershop/share/system/config/cluster/


It is sufficient to establish the SSH tunnel/port forwarding to either INT (LV) or INT (ED), as both environments are served by the same physical database machine in the background.

Also note that an SSH connection can only be established if the public IP address you are connecting from is included on the whitelist.

2.1 Prerequisites

The easiest way to establish an SSH tunnel/port forwarding is via a Linux-like shell using OpenSSH client. This is a built-in functionality in Linux- and Mac-based operating systems and can also be used in Windows, provided WSL (Windows Subsystem for Linux) is configured and a Linux OS image is installed.
For more information on installing WSL and Linux on Windows (10), refer to the official Microsoft documentation.

If you go for this option, a single command line is sufficient to establish the SSH tunnel/port forwarding, see the end of Section 2.2.

Another common option, which works without Linux and/or WSL, is PuTTY.

The following tools are required (using PuTTY with Windows operating system):

  • PuTTY

  • Pageant

  • Plink


You can use the following link to download the "Package files", which contain all necessary tools:

2.2 Establish an SSH Tunnel

To create an SSH tunnel to a database using PuTTY (e.g., via any app server, port 1433), follow the instructions in the next subsections.
You can then use this tunnel to connect to the database via SQL Management Studio/Azure Data Studio from your local machine.

2.2.1 Adding Your SSH Private Key to Pageant

Before creating a new session, your SSH key needs to be added to Pageant. To do so, perform the following steps:

  1. In your task bar right-click on the Pageant symbol.

  2. Click Add Key and select your SSH private key file.


    If you do not have a key file yet, use the puttygen tool to create one and send the public key to the Intershop Services Operations team. Please contact the dedicated project manager.
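If you work with OpenSSH instead of PuTTY (see the Linux option above), a key pair can also be created with ssh-keygen. A minimal sketch; the key type, file name, and comment below are assumptions, not project requirements:

```shell
# Create an SSH key pair (OpenSSH alternative to puttygen; ed25519 is an
# assumption, use whichever key type your operations team accepts).
# -N "" sets an empty passphrase for this sketch only; set a real one in practice.
mkdir -p "$HOME/.ssh"
ssh-keygen -t ed25519 -f "$HOME/.ssh/intershop_caas" -N "" -C "your.name@example.com"
# Only the public key is sent to the Intershop Services Operations team:
cat "$HOME/.ssh/intershop_caas.pub"
```

The private key file can then be loaded into an SSH agent (ssh-add on Linux, or converted for Pageant with puttygen on Windows).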

2.2.2 Creating an SSH Tunnel Using PuTTY

Creating an SSH tunnel using PuTTY consists of the following substeps:

  • Creating a new session
  • Specifying the username
  • Adding the Jump Server as a proxy
  • Setting the authentication method
  • Adding a tunnel to the destination port

More details on how to perform these steps are provided in the instructions below. Firstly, a new session needs to be created in PuTTY. To do so, perform the following steps:

  1. Open PuTTY.

  2. Go to Session.

  3. As Host Name (or IP address), specify the IP address of the SSH server. It follows the pattern ish<XX>-<YYYY>-webapp<ZZZZZZ>. As Port, enter 22.

    In addition to creating the session, it is necessary to set the username intershop. To do so, proceed as follows:

  4. Go to Connection | Data.
  5. Enter intershop as Auto-login username.
    The terminal details will appear automatically.

    In the next step, add the jump server as a proxy by doing the following:

  6. Go to Connection | Proxy.
  7. Enter the following in the respective fields and select the appropriate radio buttons:

    Proxy type: Local
    Proxy hostname: ishXXservice.<AAAA>
    Telnet command, or local proxy command: plink.exe -2 %user@%proxyhost -nc %host:%port

    Furthermore, enable the checkbox Consider proxying local host connections.

    Afterwards, the authentication method needs to be set. Proceed as follows:

  8. Go to Connection | SSH | Auth.

  9. Enable the checkbox Allow agent forwarding.


    You do not need to select a key, as the key has already been added to Pageant, see Adding Your SSH Private Key to Pageant.

    Finally, a tunnel can be added to the destination port by following the instructions below:

  10. Go to Connection | SSH | Tunnels.
  11. Provide the following data:
    The Source port can be any available local port, e.g. 1433, 14333, 2000, 43423, 34567, ...
    The Destination and port should be: ish<XX>-pre-db01:1433


Using a Linux-based operating system and an OpenSSH-based command line client makes establishing the SSH tunnel/port forwarding even easier.

In line with the example(s) above, this would look like:

Command Line
ssh -A intershop@ish<XX>-<YYYY>-webapp<ZZZZZZ> -o GSSAPIAuthentication=no -o ProxyCommand='ssh -W %h:%p intershop@ishXXservice.<AAAA>' -L 14333:<DB hostname>:1433

2.2.3 Connect to Classic SQL Server Database

If the database host name in the intershop.jdbc.url value (see above) does not end with "*", it is a classic SQL Server database (not an Azure SQL Managed Instance DB).

You can now start SQL Management Studio/Azure Data Studio and connect to the locally forwarded port, e.g. 14333 (or the equivalent port of your choice, see above), using the credentials you gathered from /var/intershop/share/system/config/cluster/ on INT (LV) and/or INT (ED).


Do not use "localhost" as the server name in SQL Management Studio, as this might lead to an error. Also keep in mind that the port needs to be comma-separated in this configuration field.
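As a sketch of the resulting connection settings — the loopback address and local port below are assumptions based on the tunnel example above, not values from the cluster configuration:

```shell
# Hypothetical values; the actual credentials come from
# /var/intershop/share/system/config/cluster/ on INT (LV) / INT (ED).
DB_HOST="127.0.0.1"   # the SSH tunnel listens on your local machine
DB_PORT="14333"       # the local source port chosen when creating the tunnel
# SQL Server tools separate host and port with a comma, not a colon:
SERVER="${DB_HOST},${DB_PORT}"
echo "Enter as server name: ${SERVER}"
```

The same comma-separated string goes into the server name field of SQL Management Studio or Azure Data Studio.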

2.2.4 Connect to Azure SQL Managed Instance Database

If the database host name in the intershop.jdbc.url value (see above) ends with "*", it is an Azure SQL Managed Instance DB.

An additional step is necessary due to enforced database encryption policy:

  1. Add the appropriate host name to your local hosts file. Its location is as follows:
    • Linux:  /etc/hosts 
    • Windows:  C:\Windows\System32\drivers\etc\hosts


      This file has to be edited with administrative/elevated permissions.

  2. Add the following entry, where <SQL Managed Instance DB name> is the database host name taken from the intershop.jdbc.url value (see above).
  3. Connect to the database via SQL Management Studio or Azure Data Studio using the exact host name provided in the hosts file, comma-separated with the locally forwarded port, e.g. <SQL Managed Instance DB name>,14333.


    Do not use localhost or an IP address here; due to the enforced encryption policy, the connection must use the exact host name.
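A sketch of the resulting hosts file entry; the loopback address and the mapping itself are assumptions, and the actual host name must come from your intershop.jdbc.url value:

```
# /etc/hosts (Linux) or C:\Windows\System32\drivers\etc\hosts (Windows)
# Hypothetical entry: map the Managed Instance host name to the local tunnel endpoint
127.0.0.1    <SQL Managed Instance DB name>
```

This lets the client present the expected host name while the traffic still flows through the locally forwarded port.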

3 Database Export & Import

Several Jenkins processes and jobs are provided to simplify exporting the customers' database data from Intershop's Azure environment. This data can then be restored or imported into (local) development environments and developers' workstations.

No end user data (personal data) is provided. Only anonymized database data of UAT and INT environments/clusters is provided.

3.1 Database Data Export and Access

3.1.1 Classic SQL Server Database

  1. Find the Jenkins jobs available in the Jenkins view/section Database Backup Jobs and MSSQL Backup SFTP Upload.
  2. Select the appropriate Jenkins job of the ICM environment/cluster for database backup.

    Jenkins Job | Comment

    Click Build to trigger the database backup for the given ICM cluster.
  3. Select the appropriate Jenkins job of the ICM environment/cluster for database backup transfer and upload to sFTP server:

    Jenkins Job | Comment

    Select the desired database backup as parameter DUMP_FILE and click Build to trigger the database backup upload to the sFTP server.
  4. Connect to the sFTP server to download the database backup file. You may use an sFTP client for that purpose, for example WinSCP.
    The sFTP server connection information (user name, host) is provided in the console log of the Jenkins job triggered above, e.g.:

    Jenkins Job Console Log
    You can now download the dump from:    s
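The download step can be sketched as follows; the user, host, and file name are placeholders that must be substituted with the values printed in the Jenkins job console log:

```shell
# Placeholders only; substitute the values from the Jenkins job console log.
SFTP_USER="<username>"
SFTP_HOST="<sftp host>"
DUMP_FILE="<dump file name>"
# Build the download command for the OpenSSH sftp client (WinSCP works as well):
CMD="sftp ${SFTP_USER}@${SFTP_HOST}:${DUMP_FILE} ."
echo "$CMD"
```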

3.1.2 Azure SQL Managed Instance Database

  1. Find the Jenkins job available in the Jenkins view/section MSSQL Bacpac Export.
  2. Select the appropriate Jenkins Job of the ICM environment/cluster for the database export:

    Jenkins Job | Comment

    Click Build to trigger the database BACPAC export for the given ICM cluster and upload it to the sFTP server.

  3. Connect to the sFTP Server to download the database backup file. You may use an sFTP client for that purpose, for example WinSCP.
    The sFTP server connection information (user name, host) is provided in the console log of the Jenkins job triggered above, e.g.:

    You can now download the dump from:    s

3.2 Database Data Restore/Import on (Local) Development Environment/Workstation

To import a database backup or database BACPAC file as described above, Microsoft SQL Server Management Studio (SSMS) is strongly recommended.

3.2.1 Restore Database Backup (Classic SQL Server Database)

For restoring database backups, please refer to Microsoft documentation: Restore a backup.

3.2.2 Import BACPAC file (Azure SQL Managed Instance Database)

For importing Database BACPAC file exports, please refer to Microsoft documentation: Import BACPAC file.

SQL Server version:
Azure SQL Managed Instance always runs the latest SQL Server version, which may even be newer than the latest generally available SQL Server engine version. Currently, SQL Server does not support backward compatibility when importing with a version lower than 2019. To import the BACPAC file, the destination server's version has to be at least SQL Server 2019.
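The version requirement can be checked mechanically: SQL Server 2019 reports engine major version 15, so the destination server must report a ProductVersion of 15 or higher. A sketch, assuming the version string was obtained beforehand, e.g. via SELECT SERVERPROPERTY('ProductVersion'):

```shell
# Example version string as returned by the hypothetical query above:
VERSION="15.0.2000.5"
# Take the major version (everything before the first dot):
MAJOR="${VERSION%%.*}"
# SQL Server 2019 = major version 15; older servers cannot import the BACPAC file:
if [ "$MAJOR" -ge 15 ]; then
  echo "OK: destination server can import the BACPAC file"
else
  echo "Too old: SQL Server 2019 or later is required"
fi
```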

4 Database Schema Update/Migration

4.1 General

DBPrepare is the successor of DBInit and DBMigrate. It is strongly recommended to switch to the new solution. For details, see Concept - DBPrepare.

When adding a cartridge, often a DBInit was necessary to initialize the database tables that were used by the new cartridge. In later releases, a DBMigrate would become necessary to update the database resources.

In a system with monthly releases, this means that you need to know whether a DBInit and/or DBMigrate is required, and you have to initiate it accordingly. With DBPrepare, the system decides whether a cartridge needs a DBInit or a DBMigrate. Switching to DBPrepare can save time because cartridges that have not been changed are not processed. This makes the process leaner, and the deployment time can be shortened.

4.2 Process

To change the development process from DBMigrate to DBPrepare, a schedule must be coordinated between the partner and Intershop. The migration generally follows the partner's development schedule, and a decision has to be made on when to migrate. Most likely, the changeover will take effect as of a certain sprint.

Due to the three environments (INT, UAT and PRD), three milestones can be defined for the migration. In addition to defining the sprint on the partner side, a migration date for each environment must be set: first INT, then UAT, and finally PRD will be adapted accordingly.

After a schedule has been agreed, the following preparation is necessary:

4.2.1 Intershop

Before switching to DBPrepare, DBMigrate and DBPrepare (in this order) have to run once together on the same code base. Afterwards, DBPrepare can be used exclusively.
The Jenkins deployment pipeline must then be adapted. A cartridge list is no longer required, as it is determined automatically. A switch dbprepare = "yes" in the deployment ConfigPipelines defines this for each cluster. From then on, only DBPrepare can be checked in the Jenkins job; DBInit/DBMigrate no longer appears there.

4.2.2 Partner

The partner should adapt its development process to DBPrepare and has to provide a first release based on DBPrepare. 

The general development procedure is described in detail in the following cookbook: Cookbook - DBPrepare


The information provided in the Knowledge Base may not be applicable to all systems and situations. Intershop Communications will not be liable to any party for any direct or indirect damages resulting from the use of the Customer Support section of the Intershop Corporate Web site, including, without limitation, any lost profits, business interruption, loss of programs or other data on your information handling system.
