Testnet and Mainnet Klayr migration guide

This guide explains how to migrate a Lisk Core node to Klayr Core using the Klayr Migrator v2.

The Klayr migrator CLI will generate a new genesis block for Klayr Core. The new genesis block is created based on a snapshot of the old Lisk blockchain (running on Lisk Core v4.0.2 or later) at a given height.

All active validators on the Lisk Mainnet and the Lisk Testnet need to follow this guide to migrate their nodes correctly to the new network, so that they do not miss any blocks after the network hard fork.

Optionally, anyone running a Lisk Core v4.0.2 (or later) node who is not an active validator can also participate in the migration process. In this case, the validator-specific steps in section 2.4 of this guide can be skipped, as they are only relevant for the current validators.

1. Preparation

System requirements

The following system requirements are recommended for running the migration script:

Memory
  • minimum of 8GB RAM

Storage
  • minimum 40GB HDD

OS
  • Ubuntu 20.04 LTS

  • Ubuntu 22.04 LTS

Dependencies

The following dependencies are required for running the migration script:

Node.js
  • Version 18.20.2 (LTS) installed via NVM

Lisk Core
  • Version 4.0.2 or later

1.1. Ensure you are running version 4.0.2 (or later) of Lisk Core

Ensure you are running version 4.0.2 (or later) of Lisk Core to be able to seamlessly migrate to Klayr Core.

Print the Lisk Core node info in the terminal, and check the value of version.

lisk-core system node-info --pretty
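Optionally, the version can be extracted directly on the command line, e.g. with jq. This is a minimal sketch, assuming jq is installed and that the node-info response contains a top-level version property:

lisk-core system node-info | jq -r '.version'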

1.2. Set the snapshot height in the Lisk Core config

Ensure that the system.backup.height property is present and that its value matches the announced migration height: 24823618 (Mainnet) or 21976183 (Testnet).

Please set the system.backup.height property in the Lisk Core config file to the announced snapshot height. See the example below:

  • Mainnet

  • Testnet

{
  "system": {
    "dataPath": "~/.lisk",
    "logLevel": "info",
    "keepEventsForHeights": 300,
    "backup": {
      "height": 24823618  (1)
    }
  },
  ...
}
1 Add the backup.height property to the system object in the Lisk Core config file.
{
  "system": {
    "dataPath": "~/.lisk",
    "logLevel": "info",
    "keepEventsForHeights": 300,
    "backup": {
      "height": 21976183 (1)
    }
  },
  ...
}
1 Add the backup.height property to the system object in the Lisk Core config file.
If you are running version 4.0.x, please make sure to set system.backup.height to the announced snapshot height in your node config and restart the node with the --overwrite-config flag.
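For example, a node on Mainnet that was started directly from the CLI could be restarted as follows (a minimal sketch; adjust the network and any additional flags to your own setup):

lisk-core start --network mainnet --overwrite-config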

1.3. Setting up the Klayr Migrator

1.3.1. Download the migration script

Navigate into the folder where you want to install the Klayr Migrator, and then download the migration script by running the following command in the terminal:

  • Linux x64

  • Darwin x64

  • Darwin ARM64

curl -o klayr-migrator-v2.0.10-linux-x64.tar.gz https://downloads.klayr.xyz/klayr-migrator/klayr-migrator-v2.0.10-linux-x64.tar.gz
curl -o klayr-migrator-v2.0.10-darwin-x64.tar.gz https://downloads.klayr.xyz/klayr-migrator/klayr-migrator-v2.0.10-darwin-x64.tar.gz
curl -o klayr-migrator-v2.0.10-darwin-arm64.tar.gz https://downloads.klayr.xyz/klayr-migrator/klayr-migrator-v2.0.10-darwin-arm64.tar.gz

1.3.2. Download the checksum and verify

Download the checksum and verify the successful download of the klayr-migrator.

A) Download the checksum.

  • Linux x64

  • Darwin x64

  • Darwin ARM64

curl -o klayr-migrator-v2.0.10-linux-x64.tar.gz.SHA256 https://downloads.klayr.xyz/klayr-migrator/klayr-migrator-v2.0.10-linux-x64.tar.gz.SHA256
curl -o klayr-migrator-v2.0.10-darwin-x64.tar.gz.SHA256 https://downloads.klayr.xyz/klayr-migrator/klayr-migrator-v2.0.10-darwin-x64.tar.gz.SHA256
curl -o klayr-migrator-v2.0.10-darwin-arm64.tar.gz.SHA256 https://downloads.klayr.xyz/klayr-migrator/klayr-migrator-v2.0.10-darwin-arm64.tar.gz.SHA256

B) Run the following command in the terminal and ensure the output reports the file as OK, i.e. <file name>: OK

  • Linux

  • Darwin

sha256sum -c klayr-migrator-v2.0.10-linux-x64.tar.gz.SHA256
shasum -a 256 -c klayr-migrator-v2.0.10-darwin-x64.tar.gz.SHA256

1.3.3. Extract and add to PATH

Unpack it, and then add it to the system path, in order to use it in the terminal:

tar -xf ./klayr-migrator-v2.0.10-linux-x64.tar.gz

Make the klayr-migrator command available in the PATH, e.g. by executing the following command:

export PATH="$PATH:$HOME/klayr-migrator/bin"

In case the klayr-migrator was extracted somewhere other than your home directory, replace $HOME with the absolute path of where the klayr-migrator folder is located.
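To keep the klayr-migrator available in new terminal sessions, the PATH change can optionally be persisted in the shell profile. A minimal sketch, assuming bash is used as the login shell:

echo 'export PATH="$PATH:$HOME/klayr-migrator/bin"' >> ~/.bashrc
source ~/.bashrc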

1.4. Check the announced snapshot height

  • For Mainnet: 24823618

  • For Testnet: 21976183

The height is needed by the klayr-migrator in the next step. A snapshot of the blockchain will be created at this particular height, which will then be used to create the genesis block for the new blockchain.

1.5. Ensure Lisk Core v4.0.x is fully synced with the network

Check the current block height of your node directly in the terminal by running the following command:

lisk-core system node-info --pretty

Compare the current height of your node to the network height in Lisk Desktop, which is shown on the Network or Blocks pages.

To view the current height of the Lisk Testnet, use the network switcher of Lisk Desktop, which can be enabled in the settings.

Alternatively, users can also verify the current height by checking data.height in the response from the https://service.lisk.com/api/v3/network/status endpoint.

To directly check the current height via the command line, run:

curl --silent https://service.lisk.com/api/v3/network/status | jq '.data.height'

To run this command, both curl and jq need to be installed.

If both heights are equal, it is verified that your node is fully synced with the network.
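Both heights can also be compared in a single step on the command line. The following is a minimal sketch, assuming curl and jq are installed and that the node-info response contains a top-level height property:

NODE_HEIGHT=$(lisk-core system node-info | jq -r '.height')
NETWORK_HEIGHT=$(curl --silent https://service.lisk.com/api/v3/network/status | jq -r '.data.height')
echo "Node height: $NODE_HEIGHT / Network height: $NETWORK_HEIGHT"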

1.6. Open ports

Open the necessary ports for Klayr Core.

If you migrate the existing Lisk Core v4 config with the --auto-migrate-config flag in the next step, Run klayr migrator, please ensure that the necessary ports in the final Klayr Core config are open for communication.

The final config (when auto-migrating) is printed on the screen for the user’s confirmation, so you can check the exact port details there. The ports should be the same ones you were already using with Lisk Core v4, so ideally they are already open.

Node P2P communication
ufw allow 7667
Node API
ufw allow 7887
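To confirm that the rules are active, the firewall status can be checked afterwards (assuming ufw is the firewall in use):

ufw status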

1.7. Run klayr migrator

When to start the migrator script?

klayr-migrator can be started any time before the announced snapshot height.

If you have added the klayr-migrator to the PATH as described in the section Setting up the Klayr Migrator, you can start the migration script by running the following command [1] in the terminal:

  • Mainnet

  • Testnet

klayr-migrator --snapshot-height 24823618 --auto-migrate-config --auto-start-klayr-core-v4
klayr-migrator --snapshot-height 21976183 --auto-migrate-config --auto-start-klayr-core-v4
  • --snapshot-height: The height at which the blockchain snapshot will be taken, i.e. the announced snapshot height listed above.

If you choose to specify a custom output path with the --output flag, please do not specify the default data directory for Klayr Core (~/.klayr/klayr-core) or any sub-directory within it, as this might lead to errors during the migration.
Custom data path

If a custom dataPath, different from the default path ~/.lisk/lisk-core, is defined in the config, the data path can be passed with the --lisk-core-data-path flag like so:

klayr-migrator --snapshot-height 24823618 --lisk-core-data-path ~/lisk/custom/path/  --auto-migrate-config --auto-start-klayr-core-v4
Custom config

If a custom config is used for Lisk Core v4, the path to the custom config file can also be passed with the --config flag like so:

klayr-migrator --snapshot-height 24823618 --config=/path/to/config.json --auto-migrate-config --auto-start-klayr-core-v4
Running the migrator in the background

It is possible to use tools such as screen to run the Klayr migrator in the background.

With screen you can detach the current terminal window into the background:

Example (Mainnet) [1]
screen -dmSL migration klayr-migrator --snapshot-height 24823618 --auto-migrate-config --auto-start-klayr-core-v4

Shortly before the migration occurs, it is possible to reattach to the screen, in order to check if everything is working correctly.

First, check the name of the detached screen:

screen -ls

This returns a list of all detached screen sessions:

There is a screen on:
	1842.migration	(05/07/2021 12:35:59 PM)	(Detached)
1 Socket in /run/screen/S-klayr.

Use screen -r followed by the name of the detached screen you want to reattach to:

screen -r 1842.migration
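Because the example above starts screen with the -L flag, the migrator output is also written to a log file. As a minimal sketch, assuming screen's default log file name screenlog.0 in the directory where the session was started, the log can be followed with:

tail -f screenlog.0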

2. Migration steps

2.1. Wait until the network reaches the snapshot height

Observe whether the klayr-migrator finishes successfully.

Expected migration duration

This can take ~3 mins against the Testnet, and ~5 mins against the Mainnet.

The script will download and install Klayr Core v4 for you automatically.

The flag --auto-migrate-config will automatically migrate the config from the old to the new node.

After the snapshot height is reached, validators have approximately 1 week to enable block generation, to ensure that they do not miss any blocks after the hard fork.

If the node is not migrated right away, but is only started at a later point in time, it will simply sync up to the current network height. For validators, this might result in missing blocks. For everyone else, it will not have any impact.
How to verify a successful migration

To verify that the migrator created the correct new genesis block, compare the hash of your newly created genesis block with the hashes obtained by others:

grep \"id\": ./klayr-migrator/output/00000000/genesis_block.json

The hashes can be shared in the dedicated network channel on klayr.chat.

2.2. Stop Lisk Core v4.0.x

After the klayr-migrator script has finished and the announced snapshot height has passed, there is no reason to continue running Lisk Core v4.0.x, and therefore it is recommended to stop it.

It is important to stop Lisk Core before starting Klayr Core. If the migrator is started with the --auto-start-klayr-core-v4 flag, the user is prompted to verify that Lisk Core is stopped; only afterwards will the migration script continue.

Prompt
Please stop Lisk Core to continue. Type 'yes' and press Enter when ready. [yes/no]: yes
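If Lisk Core runs in the background under a process manager, stop it there. A hedged example, assuming Lisk Core is managed by PM2 under the process name lisk-core (adjust the name to your setup):

pm2 stop lisk-core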
Users who have configured Lisk Core to start on boot, whether through cron, systemd, or something else, should keep that in mind and adjust the setup accordingly, so that Klayr Core is started instead of Lisk Core.

Last but not least, remove the folder with Lisk Core.

2.3. Start Klayr Core v4

If you set the flag --auto-start-klayr-core-v4 when running klayr-migrator, it will start Klayr Core in the background (managed by PM2) right after successful migration.

Otherwise, start Klayr Core manually like so:

  • Mainnet

  • Testnet

klayr-core start --network mainnet
klayr-core start --network testnet

Observe the logs in the terminal, to verify the node is starting correctly.

To run Klayr Core in the background, install PM2, as described in the guide Process management with PM2.

You can verify that the node is running correctly by executing the following command:

klayr-core system:node-info --pretty

Check the value of version in the response, to verify you are running version 4.0.5 of Klayr Core.

2.4. Enable block generation after the migration (for validators)

After migration, 51 initial validators will be active to generate blocks during the initRounds. All other validators will be inactive during the bootstrap period.

initRounds is the number of rounds for the bootstrap period of the new network. The bootstrap period after migration to Klayr is 587 rounds.

The initial validators will be exactly the top 51 validators that were in active positions in the Lisk network, at the time of the migration.

For the initial validators, it will be important to enable block generation as soon as possible on the new node, to not miss any block rewards.

It is also important that a maximum number of validators participate in the network migration. If only a small number of validators migrate their nodes, this can prolong the bootstrap period. Additionally, for blocks to be finalized, enough of the active validators (minimum 35) need to participate in the migration process.

After 587 rounds, the normal Klayr PoS protocol will be followed to generate the list of active validators, but only for the validators who have registered their validator keys.

All migrated validators who do not register their keys before the bootstrap period ends will remain banned. To be un-banned, a validator needs to register their legacy validator keys.

2.4.1. Allow methods in the node config

To be able to run certain validator-related commands of the node, it is required to enable security-sensitive methods in the node config.

If you used the migrator with the --auto-start-klayr-core-v4 flag, it is not necessary to update the config, as it is already done by the migrator.
config.json
{
  "rpc": {
    "modes": ["ipc"],
    "allowedMethods": ["generator", "system", "random"]
  }
}

Restart the node with the --overwrite-config flag, to load the updated configuration.
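For example, a minimal sketch assuming Klayr Core runs on Mainnet and is started directly from the CLI:

klayr-core start --network mainnet --overwrite-config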

allowedMethods refers to the methods defined in the JSON-RPC specification.

Add a namespace to allow all endpoints of that namespace, or namespace_endpointName to allow a specific endpoint.

2.4.2. Create the validator keys

Most likely, you already have a keys.json file from the Lisk Core v4 node. If you already have a keys.json file, skip this step.

It is possible to generate all relevant validator keys to enable block generation from the account passphrase.

To do so, use the command keys:create.

The flag --add-legacy is only available in Klayr Core v4.0.0 or later.
  • Mainnet

  • Testnet

klayr-core keys:create --chainid 0 --output ./config/keys.json --add-legacy
klayr-core keys:create --chainid 1 --output ./config/keys.json --add-legacy

Next, you will be prompted for the validator passphrase, and also for a password, which is used to symmetrically encrypt the generated keys for the config.

? Please enter passphrase:  [hidden]
? Please re-enter passphrase:  [hidden]
? Please enter password:  [hidden]
? Please re-enter password:  [hidden]
The password is sensitive information. Store the password used here for the encryption somewhere safe. It will be required every time you enable block generation, in order to decrypt the generator keys on the node.

This will generate the following file, which includes all important keys for the validator account:

Details
config/keys.json
{
  "keys": [
    {
      "address": "klyqaxxmj78frvgpjgwvf4yqjjkcrr9yhn2sxxwm3",
      "keyPath": "legacy",
      "publicKey": "6290c8b58de8b71fedb7e3cb9a6ee9426aa3e7ac0141f278526375d46705b546",
      "privateKey": "759305903f7bbb449cf2fd22e6da476792b63e24558e266a4859f9ed3c91fd7e6290c8b58de8b71fedb7e3cb9a6ee9426aa3e7ac0141f278526375d46705b546",
      "plain": {
        "generatorKeyPath": "m/25519'/134'/0'/0'",
        "generatorKey": "aaecd278a3fadc40a4a824d6f4aa24547d8fb9d075ec4d6967a7084f9a3f2541",
        "generatorPrivateKey": "81316f0582fd2cc0a651318aa0041ce36e7b786033b98ec545ec04078fad67caaaecd278a3fadc40a4a824d6f4aa24547d8fb9d075ec4d6967a7084f9a3f2541",
        "blsKeyPath": "m/12381/134/0/0",
        "blsKey": "815a9e7643cf2bace98d1337f1dca8e39949592cd3fcb79bf3ab5784981468b9987b3340527bc9ca263a2fd061812024",
        "blsProofOfPosession": "add8669bb57f3dceec04dc0f875906cb52a677f1df911536c01f447c8830bf27cd43713af18d84de5a64ec504aeaf9a30521c09438bb5a4d5fd634946c65e0fc4ea3681fdb4f6949cb6c1bc1ac1ddec3df058a13466af5a13d50737938fd7d5f",
        "blsPrivateKey": "36506a53431665265ee03d7e19a5d44db3ff159d9aeee05727a8b24abc67651a"
      },
      "encrypted": {
        "ciphertext": "c3009d4a505ac32a652ffce6aa718073c7ca75b00578420ba20c2533a83f38e2b3e20cf1d6f0c9905efe28b5276142b93fdbdd33134d37bcd2db23654da92bb2becd00971c49ecc749100748c93344477ea52f6073c3fefec7234962d0eccdaa6862d9d0da46dbfe85cef98ad6cab0f2c1cb1b54326617132bb950d1c14a774a1e6403e8fa1bf3a2c7c0d6856266cf738f41ac01b2217d93070c4079e1b82044d3a692ea225290c2b6bcb902e0ffb8132f4c0f29325e6a3a",
        "mac": "2b3c65d0385a870ab499dfcddf411347506671015f412b35600153b132a455ea",
        "kdf": "argon2id",
        "kdfparams": {
          "parallelism": 4,
          "iterations": 1,
          "memorySize": 2024,
          "salt": "a5598628001346f608b3f57dd38b8611"
        },
        "cipher": "aes-256-gcm",
        "cipherparams": {
          "iv": "683600a199d154e51c0f97e6",
          "tag": "32807058f7f89921b4839fc39256cd24"
        },
        "version": "1"
      }
    }
  ]
}

The generated keys are very sensitive information.

In particular, the non-encrypted values need to be treated as being just as sensitive as the passphrase for an account. So after the keys are imported in the step Import the validator keys, make sure to store the file somewhere safe, or delete it completely.

2.4.3. Import the validator keys

After creating the validator keys as suggested in the Create the validator keys section, the next step is to import them into the node.

You most likely already have a keys.json file from the Lisk Core v4 node. If you already have the keys file, you only need to update the address in keys.json to the new validator address: by replacing the lsk prefix of the address with kly, you can reuse the same keys for the Klayr Core node.
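The prefix replacement can also be done on the command line. This is a minimal sketch, assuming a GNU/Linux sed and that keys.json stores the address in an "address" property (on macOS, use sed -i '' instead of sed -i):

sed -i 's/"address": "lsk/"address": "kly/g' config/keys.json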

klayr-core keys:import --file-path config/keys.json

2.4.4. Set the hash onion

Without the hash onion, a validator won’t be able to receive any rewards for generating new blocks, although the blocks would still be valid in that case. To avoid missing any rewards, it is therefore strongly recommended to set the hash onion before enabling block generation on the node.

Set the hash onion by invoking the setHashOnion endpoint via the endpoint:invoke CLI command.

klayr-core endpoint:invoke random_setHashOnion '{"address":"klyqaxxmj78frvgpjgwvf4yqjjkcrr9yhn2sxxwm3"}'
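Optionally, you can verify that the hash onion has been set, assuming the random_hasHashOnion endpoint is exposed by your node; replace the address with your validator address:

klayr-core endpoint:invoke random_hasHashOnion '{"address":"klyqaxxmj78frvgpjgwvf4yqjjkcrr9yhn2sxxwm3"}' --pretty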

2.4.5. Import validator info data

Look in the output directory of the klayr-migrator for a file called forgingStatus.json (the specific location is shown in the migrator output). It contains the validator info data required to enable block generation on the Klayr Core node.

Migrator output example
Finished exporting forging status to ~/klayr-migrator/output/00000000/forgingStatus.json
Alternative option in the event of data loss
In case the validator info data is lost, the validator can use the snapshot height for height, maxHeightPrevoted and maxHeightGenerated to enable block generation safely.

The Lisk forging info data corresponds one-to-one to the Klayr validator info data for the following fields:

  • height

  • maxHeightPrevoted

  • maxHeightGenerated
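If you prefer to read these values on the command line, the following sketch may help. It assumes that forgingStatus.json is a JSON array of entries containing address, height, maxHeightGenerated, and maxHeightPrevoted, and it uses the example output path and address from this guide:

jq '.[] | select(.address=="klybgyrx3v76jxowgkgthu9yaf3dr29wqxbtxz8yp")' ~/klayr-migrator/output/00000000/forgingStatus.json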

To import the validator info data, invoke the endpoint generator_setStatus like so:

klayr-core endpoint:invoke generator_setStatus '{ "address": "klybgyrx3v76jxowgkgthu9yaf3dr29wqxbtxz8yp", "height": 20432255, "maxHeightGenerated": 20432207, "maxHeightPrevoted": 20432159 }' --pretty

2.4.6. Enable block generation

Now, it is possible to enable block generation on the new node for your validator by using the generator:enable command of the Klayr Core CLI.

klayr-core generator:enable klyqaxxmj78frvgpjgwvf4yqjjkcrr9yhn2sxxwm3 --use-status-value

Replace the address klyqaxxmj78frvgpjgwvf4yqjjkcrr9yhn2sxxwm3 with your validator address. The --use-status-value flag will use the validator info data that was imported in the previous step, Import validator info data.

Don’t use zeros as validator info data!

If the validator already generated blocks with Lisk Core, they need to use their current validator info data. The validator info data is migrated during the migration process, and can directly be used to enable block generation on the v4 node.

To use the migrated values directly, use the --use-status-value flag, or set the values manually as described below.

For setting the values manually, first get the data:

klayr-core generator:status --pretty

And then set the values manually by adding the relevant flags:

klayr-core generator:enable klyqaxxmj78frvgpjgwvf4yqjjkcrr9yhn2sxxwm3 --height=123 --max-height-generated=101 --max-height-prevoted=101

The migration of Lisk Core to Klayr Core is now completed.

If you have specific questions regarding the process or need additional support, please reach out in the dedicated community channels, like https://klayr.chat/.


1. Snap versions of Klayr Core store everything in ~/snap/klayr-core/current/.klayr/klayr-core instead of ~/.klayr/klayr-core