Guide - Intershop Commerce Platform Database Handling

Introduction

This document provides information on how to access and update the database in the Intershop Commerce Platform context.

Database Access

This section describes how to access a non-production database (DB) in the Intershop Commerce Platform context.

A precondition for accessing the database is that your public SSH key is added to the INT live or edit environment. This way, you are able to access the database either directly via a command line or, preferably and much more comfortably, via SQL Management Studio or Azure Data Studio.

To do so, you need to connect to INT via SSH and establish an SSH tunnel/port forwarding so that a local port on your machine forwards traffic to the remote host, enabling access to the DB.

Credentials and connection information can be found here: /var/intershop/share/system/config/cluster/orm.properties.
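For example, once you can reach an INT application server via SSH (the user name and host pattern are described later in this guide), the relevant JDBC entries can be listed directly on the host. The grep filter below is only an illustration; the actual property names may differ in your installation:

Command Line
ssh intershop@ish<XX>-<YYYY>-webapp<ZZZZZZ> 'grep -i jdbc /var/intershop/share/system/config/cluster/orm.properties'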

Note

It is sufficient to establish an SSH tunnel/port forwarding to either INT (LV) or INT (ED) in order to access the DB, since both environments use the same physical DB machine in the background where the databases are located.

Also note that a connection via SSH can only be established if the public IP address you are connecting from is included on the whitelist.

Prerequisites

The easiest way to establish an SSH tunnel/port forwarding is via a Linux-like shell using the OpenSSH client. This is built-in functionality on Linux- and Mac-based operating systems and can also be used on Windows, provided WSL (Windows Subsystem for Linux) is configured and a Linux OS image is installed.
For more information on installing WSL and Linux on Windows (10), refer to the official Microsoft documentation.
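On current Windows 10/11 builds, WSL together with a default Linux distribution can typically be installed with a single command from an elevated PowerShell or command prompt; check the Microsoft documentation linked above for your specific Windows build:

Command Line
wsl --install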

If you choose this option, a single command line is all that is needed to establish the SSH tunnel/port forwarding.

Another common option, which works without Linux and/or WSL, is using PuTTY.

The following tools are required (using PuTTY with Windows operating system):

  • PuTTY

  • Pageant

  • Plink

Tip

You can use the following link to download "Package files", containing all necessary tools: https://www.chiark.greenend.org.uk/~sgtatham/putty/latest.html.

Establish an SSH Tunnel

To create an SSH tunnel to a database using PuTTY (e.g., any appserver, port: 1433), follow the instructions in the next subsections.
It is possible to use this tunnel to connect to the database via SQL Management Studio/Azure Data Studio from your local machine.

Adding Your SSH Private Key to Pageant

Before creating a new session, your SSH key needs to be added to Pageant. To do so, perform the following steps:

  1. In your task bar right-click on the Pageant symbol.

  2. Click Add Key and select your SSH private key file.

    Note

    If you do not already have a key file, use the PuTTYgen tool to create one and send the public key to the Intershop Services Operations team. Please contact the dedicated project manager.

Creating an SSH Tunnel Using PuTTY

Creating an SSH tunnel using PuTTY consists of the following substeps:

  • Creating a new session
  • Specifying the username
  • Adding the Jump Server as a proxy
  • Setting the authentication method
  • Adding a tunnel to the destination port

More details on how to perform these steps are provided in the instructions below. Firstly, a new session needs to be created in PuTTY. To do so, perform the following steps:

  1. Open PuTTY.

  2. Go to Session.

  3. As Host Name (or IP address), specify the SSH server. It follows the pattern ish<XX>-<YYYY>-webapp<ZZZZZZ>. As Port, enter 22.



    In addition to creating the session, it is necessary to set the username intershop. To do so, proceed as follows:

  4. Go to Connection | Data.
  5. Enter intershop as Auto-login username.
    The terminal details will appear automatically.


    In the next step, add the jump server as a proxy by doing the following:

  6. Go to Connection | Proxy.
  7. Enter the following in the respective fields and select the appropriate radio buttons:

    Setting                                 | Data
    Proxy type                              | Local
    Proxy hostname                          | ishXXservice.<AAAA>.cloudapp.azure.com
    Port                                    | 22
    Username                                | intershop
    Telnet command, or local proxy command  | plink.exe -2 %user@%proxyhost -nc %host:%port

    Furthermore, enable the checkbox Consider proxying local host connections.

    Afterwards, the authentication method needs to be set.

  8. Go to Connection | SSH | Auth.

  9. Enable the checkbox Allow agent forwarding.

    Note

    You do not need to select a key, as it has already been added to Pageant, see Adding Your SSH Private Key to Pageant.


    Finally, a tunnel can be added to the destination port by following the instructions below:

  10. Go to Connection | SSH | Tunnels.
  11. Provide the following data:
    The Source port can be any available local port, e.g., 1433, 14333, 2000, 43423, 34567, etc.
    The Destination and port should be: ishXX-pre-db01.database.windows.net:1433


Tip

Using a Linux-based operating system and an OpenSSH-based command-line client makes establishing the SSH tunnel/port forwarding even easier.

Based on the example above, this would look like:

Command Line
ssh -A intershop@ish<XX>-<YYYY>-webapp<ZZZZZZ> -o GSSAPIAuthentication=no -o ProxyCommand='ssh -W %h:%p intershop@ishXXservice.<AAAA>.cloudapp.azure.com' -L 14333:<DB hostname>:1433
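
On OpenSSH 7.3 or newer, the same tunnel can also be kept in ~/.ssh/config so that it can be started with a short ssh ish-int-tunnel. The host alias ish-int-tunnel is just an illustrative name; all placeholders are the same as in the command above:

~/.ssh/config
Host ish-int-tunnel
    HostName ish<XX>-<YYYY>-webapp<ZZZZZZ>
    User intershop
    ForwardAgent yes
    GSSAPIAuthentication no
    ProxyJump intershop@ishXXservice.<AAAA>.cloudapp.azure.com
    LocalForward 14333 <DB hostname>:1433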

Connect to SQL Database

An additional step is necessary due to the enforced database encryption policy:

  1. The appropriate host name has to be added to the local hosts file; its location is as follows:
    • Linux:  /etc/hosts 
    • Windows:  C:\Windows\System32\drivers\etc\hosts

      Note

      This file has to be edited with administrative/elevated permissions.

  2. Add the following entry, where <SQL Managed Instance DB name> is taken from orm.properties: 

    127.0.0.1 <SQL Managed Instance DB name>.database.windows.net
  3. Connect to the database via SQL Management Studio or Azure Data Studio using the exact host name provided in the hosts file, followed by a comma and the locally forwarded port, e.g.:

    ishXX-pre-db01.database.windows.net,14333

    Note

    Do not use localhost or 127.0.0.1.
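
    If the sqlcmd command-line utility is installed locally, the tunnel and the hosts file entry can be verified before opening SQL Management Studio/Azure Data Studio. User, password, and database below are placeholders that have to be taken from orm.properties:

    sqlcmd -S ishXX-pre-db01.database.windows.net,14333 -U <user> -P <password> -d <database> -Q "SELECT @@VERSION"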


Database Export & Import

To export the customers' database data from Intershop's Azure environment, several Jenkins processes and jobs are provided for simplification. This database data can then be restored or imported into (local) development environments and developers' workstations.

No end user data (personal data) is provided. Only anonymized database data of UAT and INT environments/clusters is provided.

Database Data Export and Access

  1. Find the Jenkins job available in the Jenkins view/section MSSQL Bacpac Export.
  2. Select the appropriate Jenkins Job of the ICM environment/cluster for the database export:

    Click Build to trigger the database BACPAC export for the given ICM cluster and upload it to the sFTP server.

  3. Connect to the sFTP Server to download the database backup file. You may use an sFTP client for that purpose, for example WinSCP.
    The sFTP server connection information (username, host) is provided in the console log of the Jenkins job triggered above, e.g.:

    ...
    You can now download the dump from:    sftp://ish_dev_dumps_cstmr@ishXX-trans01.fse.intershop.de:/home/
    ...
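
    The backup file can also be downloaded with the command-line sftp client; the file name below is a placeholder, use the name shown in the Jenkins console log:

    sftp ish_dev_dumps_cstmr@ishXX-trans01.fse.intershop.de:/home/<export file>.bacpac .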

Database Data Restore/Import on (Local) Development Environment/Workstation

To import a database backup or database BACPAC file as described above, Microsoft SQL Server Management Studio (SSMS) is strongly recommended.

For importing database BACPAC file exports, please refer to the Microsoft documentation: Import BACPAC file.
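
Alternatively, the import can be scripted with Microsoft's SqlPackage command-line tool. The following call is only a sketch; file path, target server, database name, and credentials are placeholders that depend on your local setup:

Command Line
SqlPackage /Action:Import /SourceFile:"C:\dumps\<export file>.bacpac" /TargetServerName:"localhost" /TargetDatabaseName:"<target database>" /TargetUser:"<user>" /TargetPassword:"<password>"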

SQL Server version:
Azure SQL Managed Instance always runs the latest SQL Server engine version, which may be newer than the latest generally available SQL Server release. SQL Server currently does not support backward compatibility when importing into a version lower than 2019. To import the BACPAC file, the destination server's version has to be at least SQL Server 2019.
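
To check the version of the destination server before importing, you can, for example, query it via sqlcmd with Windows authentication; SQL Server 2019 reports product version 15.x:

Command Line
sqlcmd -S localhost -E -Q "SELECT SERVERPROPERTY('ProductVersion')"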

Database Schema Update/Migration

General

DBPrepare is the successor of DBInit and DBMigrate. It is strongly recommended to switch to the new solution. For details, see Concept - DBPrepare.

When adding a cartridge, a DBInit was often necessary to initialize the database tables used by the new cartridge. In later releases, a DBMigrate would then become necessary to update the database resources.

In a system with monthly releases, this means that you need to know whether a DBInit and/or a DBMigrate is required and initiate it accordingly. With DBPrepare, the system decides whether a cartridge needs a DBInit or a DBMigrate. Switching to DBPrepare can save time because cartridges that have not been changed are not processed. This makes the process leaner and can shorten the deployment time.

Process

To change the development process from DBMigrate to DBPrepare, a schedule must be coordinated between the partner and Intershop. The migration is generally based on the partner's development schedule, and a decision has to be made on when to migrate. Most likely, the changeover will take effect as of a certain sprint.

Due to the three environments (INT, UAT, and PRD), three milestones can be defined for the migration. So in addition to defining the sprint on the partner side, a migration date for each environment must be set: First INT, then UAT, and finally PRD will be adapted accordingly.

After a schedule has been agreed, the following preparation is necessary:

Intershop

Before switching to DBPrepare, DBMigrate and DBPrepare (in this order) have to run once together on the same code base. Afterwards, DBPrepare can be used exclusively.
The Jenkins deployment pipeline must then be adapted. A cartridge list is no longer required, as it is determined automatically. A switch dbprepare = "yes" in the deployment ConfigPipelines defines this for each cluster. Afterwards, only DBPrepare can be selected in the Jenkins job; DBInit/DBMigrate no longer appears there.

Partner

The partner should adapt its development process to DBPrepare and has to provide the first release based on DBPrepare. 

The general development procedure is described in detail in the following cookbook: Cookbook - DBPrepare.
