
Release/v1.5.0 #1323

Merged 54 commits on Sep 11, 2024
Commits
f3380d5
Merge pull request #1307 from ainblockchain/release/v1.4.2
platfowner Aug 16, 2024
716806a
Log stack trace for file handling errors
platfowner Aug 16, 2024
48efb6f
Log stack trace for the errors in starting blockchain nodes
platfowner Aug 16, 2024
78b9c90
Merge pull request #1310 from ainblockchain/bugfix/platfowner/bugfix
platfowner Aug 16, 2024
3c4bc6d
Split code and data directories with season prefixes
platfowner Aug 23, 2024
f92238e
Merge pull request #1311 from ainblockchain/feature/platfowner/feature
platfowner Aug 23, 2024
9618af7
Add $SEASON to the node job name
platfowner Aug 26, 2024
b54f48f
Add a comment for adding $SEASON to the node job name
platfowner Aug 26, 2024
599a180
Merge pull request #1312 from ainblockchain/feature/platfowner/feature
platfowner Aug 26, 2024
a1b37a1
Add an initial version of copy_blockchain_data_onprem.sh
platfowner Aug 27, 2024
8552ed0
Update copy_blockchain_data_onprem.sh
platfowner Aug 27, 2024
eaccf3d
Remove some redundant parts
platfowner Aug 27, 2024
b47f094
Tweak blockchain job name
platfowner Aug 27, 2024
c8c6e08
Merge pull request #1313 from ainblockchain/feature/platfowner/feature
platfowner Aug 28, 2024
5840a90
Use $SEASON as directory instead of prefix
platfowner Aug 29, 2024
55ab56e
Use $GCP_USER
platfowner Aug 29, 2024
e19ece8
Fix to $ONPREM_USER
platfowner Aug 29, 2024
742508d
Update copy_blockchain_data_onprem.sh for $SEASON directories
platfowner Aug 29, 2024
b71b22a
Merge pull request #1314 from ainblockchain/bugfix/platfowner/bugfix
platfowner Aug 29, 2024
e08a519
Add initial versions of onprem incremental blockchain deploy scripts
platfowner Aug 30, 2024
f470927
Update filenames to _onprem.sh if necessary
platfowner Aug 30, 2024
14dd4fd
Comment out unnecessary tracker and sharding parts
platfowner Aug 30, 2024
c224bab
Install killall explicitly for higher ubuntu versions
platfowner Sep 2, 2024
adf55b6
Fix job killing bug using pkill instead of killall
platfowner Sep 2, 2024
9aae4f7
Remove -v verbose option
platfowner Sep 2, 2024
4743942
Update incremental onprem deploy scripts
platfowner Sep 2, 2024
b3f2d9b
Make incremental deploy scripts work
platfowner Sep 3, 2024
e867870
Rename: wait_until_node_sync.sh
platfowner Sep 3, 2024
525839b
Tweak log message spacing
platfowner Sep 3, 2024
6c22202
Merge pull request #1316 from ainblockchain/feature/platfowner/feature
platfowner Sep 4, 2024
02f200d
Remove -v verbose options
platfowner Sep 5, 2024
f903837
Merge pull request #1317 from ainblockchain/bugfix/platfowner/bugfix
platfowner Sep 5, 2024
8113b0a
Remove old chains and snapshots
platfowner Sep 5, 2024
908e0eb
Fix typos
platfowner Sep 5, 2024
6f60600
Deprecate --skip-kill option
platfowner Sep 5, 2024
a686f0a
Add --kill-only option to incremental deploy scripts
platfowner Sep 5, 2024
0300b15
Do not allow tracker job starting in onprem deploy scripts
platfowner Sep 5, 2024
ad4be9e
Merge pull request #1318 from ainblockchain/feature/platfowner/feature
platfowner Sep 8, 2024
68f9e03
Change MAX_NUM_EVENT_CHANNELS to 30
platfowner Sep 9, 2024
33ab51c
Fix unit tests
platfowner Sep 9, 2024
3177981
Merge pull request #1319 from ainblockchain/bugfix/platfowner/bugfix
platfowner Sep 9, 2024
cd12c50
Assign different ports for testnet and mainnet
platfowner Sep 9, 2024
52066e8
Re-assign port numbers for gcp and onprem blockchain jobs
platfowner Sep 10, 2024
d625fed
Merge pull request #1320 from ainblockchain/bugfix/platfowner/bugfix
platfowner Sep 10, 2024
85f1073
Remove unsupported seasons from onprem deploy scripts
platfowner Sep 9, 2024
dcce2c3
Set PEER_CANDIDATE_JSON_RPC_URL for onprem genesis testnet / mainnet …
platfowner Sep 9, 2024
94e0f2e
Update deploy script examples
platfowner Sep 9, 2024
6c821c0
Apply re-assigned port numbers to PEER_CANDIDATE_JSON_RPC_URL
platfowner Sep 10, 2024
facf3f6
Validate input season in deploy scripts
platfowner Sep 10, 2024
e3dc953
Merge pull request #1321 from ainblockchain/feature/platfowner/feature
platfowner Sep 10, 2024
9d67535
Make wait_until_node_sync.sh accept custom node url
platfowner Sep 10, 2024
ca1918b
Add input node index range check
platfowner Sep 10, 2024
9ea4fd6
Merge pull request #1322 from ainblockchain/bugfix/platfowner/bugfix
platfowner Sep 11, 2024
da7b9b7
Upgrade package version to 1.5.0
platfowner Sep 11, 2024
4 changes: 2 additions & 2 deletions README.md
@@ -47,7 +47,7 @@ You can override default port numbering system by setting `PORT` and `P2P_PORT`
```
gcloud init
# For genesis deploy
bash deploy_blockchain_genesis_gcp.sh [dev|staging|sandbox|exp|spring|summer|mainnet] <# of Shards> <Parent Node Index Begin> <Parent Node Index End> [--setup] [--keystore|--mnemonic|--private-key] [--keep-code|--no-keep-code] [--keep-data|--no-keep-data] [--full-sync|--fast-sync] [--chown-data|--no-chown-data] [--kill-only|--skip-kill]
bash deploy_blockchain_genesis_gcp.sh [dev|staging|sandbox|exp|spring|summer|mainnet] <# of Shards> <Parent Node Index Begin> <Parent Node Index End> [--setup] [--keystore|--mnemonic|--private-key] [--keep-code|--no-keep-code] [--keep-data|--no-keep-data] [--full-sync|--fast-sync] [--chown-data|--no-chown-data] [--kill-job|--kill-only]
# For incremental deploy
bash deploy_blockchain_incremental_gcp.sh [dev|staging|sandbox|exp|spring|summer|mainnet] <# of Shards> <Parent Node Index Begin> <Parent Node Index End> [--setup] [--keystore|--mnemonic|--private-key] [--keep-code|--no-keep-code] [--keep-data|--no-keep-data] [--full-sync|--fast-sync] [--chown-data|--no-chown-data]
```
@@ -130,7 +130,7 @@ BLOCKCHAIN_CONFIGS_DIR=blockchain-configs/afan-shard MIN_NUM_VALIDATORS=1 DEBUG=
```
gcloud init
# For genesis deploy
bash deploy_blockchain_genesis_gcp.sh [dev|staging|sandbox|exp|spring|summer|mainnet] <# of Shards> <Parent Node Index Begin> <Parent Node Index End> [--setup] [--keystore|--mnemonic|--private-key] [--keep-code|--no-keep-code] [--keep-data|--no-keep-data] [--full-sync|--fast-sync] [--chown-data|--no-chown-data] [--kill-only|--skip-kill]
bash deploy_blockchain_genesis_gcp.sh [dev|staging|sandbox|exp|spring|summer|mainnet] <# of Shards> <Parent Node Index Begin> <Parent Node Index End> [--setup] [--keystore|--mnemonic|--private-key] [--keep-code|--no-keep-code] [--keep-data|--no-keep-data] [--full-sync|--fast-sync] [--chown-data|--no-chown-data] [--kill-job|--kill-only]
# For incremental deploy
bash deploy_blockchain_incremental_gcp.sh [dev|staging|sandbox|exp|spring|summer|mainnet] <# of Shards> <Parent Node Index Begin> <Parent Node Index End> [--setup] [--keystore|--mnemonic|--private-key] [--keep-code|--no-keep-code] [--keep-data|--no-keep-data] [--full-sync|--fast-sync] [--chown-data|--no-chown-data]
```
2 changes: 1 addition & 1 deletion blockchain-configs/afan-shard/node_params.json
@@ -47,7 +47,7 @@
"MAX_FINALIZED_BLOCK_INFO_ON_MEM": 1000,
"MAX_JSON_RPC_API_READ_RATE_LIMIT": 10,
"MAX_JSON_RPC_API_WRITE_RATE_LIMIT": 1,
"MAX_NUM_EVENT_CHANNELS": 20,
"MAX_NUM_EVENT_CHANNELS": 30,
"MAX_NUM_EVENT_FILTERS": 40,
"MAX_NUM_EVENT_FILTERS_PER_CHANNEL": 5,
"MAX_NUM_INBOUND_CONNECTION": 3,
2 changes: 1 addition & 1 deletion blockchain-configs/base/node_params.json
@@ -47,7 +47,7 @@
"MAX_FINALIZED_BLOCK_INFO_ON_MEM": 1000,
"MAX_JSON_RPC_API_READ_RATE_LIMIT": 10,
"MAX_JSON_RPC_API_WRITE_RATE_LIMIT": 1,
"MAX_NUM_EVENT_CHANNELS": 20,
"MAX_NUM_EVENT_CHANNELS": 30,
"MAX_NUM_EVENT_FILTERS": 40,
"MAX_NUM_EVENT_FILTERS_PER_CHANNEL": 5,
"MAX_NUM_INBOUND_CONNECTION": 6,
2 changes: 1 addition & 1 deletion blockchain-configs/he-shard/node_params.json
@@ -47,7 +47,7 @@
"MAX_FINALIZED_BLOCK_INFO_ON_MEM": 1000,
"MAX_JSON_RPC_API_READ_RATE_LIMIT": 10,
"MAX_JSON_RPC_API_WRITE_RATE_LIMIT": 1,
"MAX_NUM_EVENT_CHANNELS": 20,
"MAX_NUM_EVENT_CHANNELS": 30,
"MAX_NUM_EVENT_FILTERS": 40,
"MAX_NUM_EVENT_FILTERS_PER_CHANNEL": 5,
"MAX_NUM_INBOUND_CONNECTION": 6,
4 changes: 2 additions & 2 deletions blockchain-configs/mainnet-prod/node_params.json
@@ -46,7 +46,7 @@
"MAX_FINALIZED_BLOCK_INFO_ON_MEM": 1000,
"MAX_JSON_RPC_API_READ_RATE_LIMIT": 10,
"MAX_JSON_RPC_API_WRITE_RATE_LIMIT": 1,
"MAX_NUM_EVENT_CHANNELS": 20,
"MAX_NUM_EVENT_CHANNELS": 30,
"MAX_NUM_EVENT_FILTERS": 40,
"MAX_NUM_EVENT_FILTERS_PER_CHANNEL": 5,
"MAX_NUM_INBOUND_CONNECTION": 6,
@@ -56,7 +56,7 @@
"ON_MEMORY_CHAIN_LENGTH": 10,
"P2P_HEARTBEAT_INTERVAL_MS": 15000,
"P2P_MESSAGE_TIMEOUT_MS": 600000,
"P2P_PORT": 5000,
"P2P_PORT": 4997,
"P2P_WAIT_FOR_ADDRESS_TIMEOUT_MS": 10000,
"PEER_CANDIDATE_JSON_RPC_URL": "https://mainnet-api.ainetwork.ai/json-rpc",
"PEER_CANDIDATES_CONNECTION_INTERVAL_MS": 20000,
2 changes: 1 addition & 1 deletion blockchain-configs/sim-shard/node_params.json
@@ -47,7 +47,7 @@
"MAX_FINALIZED_BLOCK_INFO_ON_MEM": 1000,
"MAX_JSON_RPC_API_READ_RATE_LIMIT": 10,
"MAX_JSON_RPC_API_WRITE_RATE_LIMIT": 1,
"MAX_NUM_EVENT_CHANNELS": 20,
"MAX_NUM_EVENT_CHANNELS": 30,
"MAX_NUM_EVENT_FILTERS": 40,
"MAX_NUM_EVENT_FILTERS_PER_CHANNEL": 5,
"MAX_NUM_INBOUND_CONNECTION": 6,
2 changes: 1 addition & 1 deletion blockchain-configs/testnet-dev/node_params.json
@@ -47,7 +47,7 @@
"MAX_FINALIZED_BLOCK_INFO_ON_MEM": 1000,
"MAX_JSON_RPC_API_READ_RATE_LIMIT": 10,
"MAX_JSON_RPC_API_WRITE_RATE_LIMIT": 1,
"MAX_NUM_EVENT_CHANNELS": 20,
"MAX_NUM_EVENT_CHANNELS": 30,
"MAX_NUM_EVENT_FILTERS": 40,
"MAX_NUM_EVENT_FILTERS_PER_CHANNEL": 5,
"MAX_NUM_INBOUND_CONNECTION": 6,
2 changes: 1 addition & 1 deletion blockchain-configs/testnet-exp/node_params.json
@@ -47,7 +47,7 @@
"MAX_FINALIZED_BLOCK_INFO_ON_MEM": 1000,
"MAX_JSON_RPC_API_READ_RATE_LIMIT": 10,
"MAX_JSON_RPC_API_WRITE_RATE_LIMIT": 1,
"MAX_NUM_EVENT_CHANNELS": 20,
"MAX_NUM_EVENT_CHANNELS": 30,
"MAX_NUM_EVENT_FILTERS": 40,
"MAX_NUM_EVENT_FILTERS_PER_CHANNEL": 5,
"MAX_NUM_INBOUND_CONNECTION": 6,
4 changes: 2 additions & 2 deletions blockchain-configs/testnet-prod/node_params.json
@@ -47,7 +47,7 @@
"MAX_FINALIZED_BLOCK_INFO_ON_MEM": 1000,
"MAX_JSON_RPC_API_READ_RATE_LIMIT": 10,
"MAX_JSON_RPC_API_WRITE_RATE_LIMIT": 1,
"MAX_NUM_EVENT_CHANNELS": 20,
"MAX_NUM_EVENT_CHANNELS": 30,
"MAX_NUM_EVENT_FILTERS": 40,
"MAX_NUM_EVENT_FILTERS_PER_CHANNEL": 5,
"MAX_NUM_INBOUND_CONNECTION": 6,
@@ -57,7 +57,7 @@
"ON_MEMORY_CHAIN_LENGTH": 10,
"P2P_HEARTBEAT_INTERVAL_MS": 15000,
"P2P_MESSAGE_TIMEOUT_MS": 600000,
"P2P_PORT": 5000,
"P2P_PORT": 4998,
"P2P_WAIT_FOR_ADDRESS_TIMEOUT_MS": 10000,
"PEER_CANDIDATE_JSON_RPC_URL": "https://testnet-api.ainetwork.ai/json-rpc",
"PEER_CANDIDATES_CONNECTION_INTERVAL_MS": 20000,
2 changes: 1 addition & 1 deletion blockchain-configs/testnet-sandbox/node_params.json
@@ -47,7 +47,7 @@
"MAX_FINALIZED_BLOCK_INFO_ON_MEM": 1000,
"MAX_JSON_RPC_API_READ_RATE_LIMIT": 10,
"MAX_JSON_RPC_API_WRITE_RATE_LIMIT": 1,
"MAX_NUM_EVENT_CHANNELS": 20,
"MAX_NUM_EVENT_CHANNELS": 30,
"MAX_NUM_EVENT_FILTERS": 40,
"MAX_NUM_EVENT_FILTERS_PER_CHANNEL": 5,
"MAX_NUM_INBOUND_CONNECTION": 6,
4 changes: 2 additions & 2 deletions blockchain-configs/testnet-staging/node_params.json
@@ -47,7 +47,7 @@
"MAX_FINALIZED_BLOCK_INFO_ON_MEM": 1000,
"MAX_JSON_RPC_API_READ_RATE_LIMIT": 10,
"MAX_JSON_RPC_API_WRITE_RATE_LIMIT": 1,
"MAX_NUM_EVENT_CHANNELS": 20,
"MAX_NUM_EVENT_CHANNELS": 30,
"MAX_NUM_EVENT_FILTERS": 40,
"MAX_NUM_EVENT_FILTERS_PER_CHANNEL": 5,
"MAX_NUM_INBOUND_CONNECTION": 6,
@@ -57,7 +57,7 @@
"ON_MEMORY_CHAIN_LENGTH": 10,
"P2P_HEARTBEAT_INTERVAL_MS": 15000,
"P2P_MESSAGE_TIMEOUT_MS": 600000,
"P2P_PORT": 5000,
"P2P_PORT": 4999,
"P2P_WAIT_FOR_ADDRESS_TIMEOUT_MS": 10000,
"PEER_CANDIDATE_JSON_RPC_URL": "https://staging-api.ainetwork.ai/json-rpc",
"PEER_CANDIDATES_CONNECTION_INTERVAL_MS": 20000,
2 changes: 1 addition & 1 deletion client/index.js
@@ -97,7 +97,7 @@ app.get('/metrics', async (req, res, next) => {
.end();
});

// Used in wait_until_node_sync_gcp.sh
// Used in wait_until_node_sync.sh
app.get('/last_block_number', (req, res, next) => {
const beginTime = Date.now();
const result = node.bc.lastBlockNumber();
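The `/last_block_number` endpoint above is what the renamed `wait_until_node_sync.sh` polls. A minimal sketch of such a polling loop — the `NODE_URL` default, the `{"result":<number>}` response shape, and the stop condition are assumptions for illustration, not the actual script logic:

```shell
#!/bin/bash
# Hedged sketch of a sync-wait loop against /last_block_number.
NODE_URL="${1:-http://localhost:8080}"      # custom node url, per the commit above
TARGET_BLOCK_NUMBER="${2:-100}"             # hypothetical sync target

# Pull the number out of a '{"result":<n>}' style body (portable sed, no jq).
function extract_block_number() {
  printf '%s' "$1" | sed -n 's/.*"result":\([0-9][0-9]*\).*/\1/p'
}

function wait_until_node_sync() {
  while true; do
    local body
    body=$(curl -s "$NODE_URL/last_block_number")
    local number
    number=$(extract_block_number "$body")
    if [[ -n "$number" ]] && [[ "$number" -ge "$TARGET_BLOCK_NUMBER" ]]; then
      printf 'Node is synced (block %s)\n' "$number"
      break
    fi
    sleep 5
  done
}
```

Invoked with a custom node URL (mirroring the "accept custom node url" commit in the list above), e.g. `bash wait_sketch.sh http://<node-ip>:8080 12345`.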
3 changes: 3 additions & 0 deletions client/protocol_versions.json
@@ -155,5 +155,8 @@
},
"1.4.2": {
"min": "1.0.0"
},
"1.5.0": {
"min": "1.0.0"
}
}
12 changes: 6 additions & 6 deletions common/file-util.js
@@ -257,7 +257,7 @@ class FileUtil {
});
});
} catch (err) {
logger.error(`[${LOG_HEADER}] Error while reading ${filePath}: ${err}`);
logger.error(`[${LOG_HEADER}] Error while reading ${filePath}: ${err.stack}`);
return false;
}
}
@@ -288,7 +288,7 @@
});
});
} catch (err) {
logger.error(`[${LOG_HEADER}] Error while reading ${filePath}: ${err}`);
logger.error(`[${LOG_HEADER}] Error while reading ${filePath}: ${err.stack}`);
return null;
}
}
@@ -299,7 +299,7 @@
const zippedFs = fs.readFileSync(filePath);
return FileUtil.buildObjectFromChunks(JSON.parse(zlib.gunzipSync(zippedFs).toString()).docs);
} catch (err) {
logger.error(`[${LOG_HEADER}] Error while reading ${filePath}: ${err}`);
logger.error(`[${LOG_HEADER}] Error while reading ${filePath}: ${err.stack}`);
return null;
}
}
@@ -314,7 +314,7 @@
const zippedFs = fs.readFileSync(filePath);
return JSON.parse(zlib.gunzipSync(zippedFs).toString());
} catch (err) {
logger.error(`[${LOG_HEADER}] Error while reading ${filePath}: ${err}`);
logger.error(`[${LOG_HEADER}] Error while reading ${filePath}: ${err.stack}`);
return null;
}
}
@@ -325,7 +325,7 @@
const fileStr = fs.readFileSync(filePath);
return JSON.parse(fileStr);
} catch (err) {
logger.error(`[${LOG_HEADER}] Error while reading ${filePath}: ${err}`);
logger.error(`[${LOG_HEADER}] Error while reading ${filePath}: ${err.stack}`);
return null;
}
}
@@ -399,7 +399,7 @@
try {
return Number(fs.readFileSync(h2nPath).toString());
} catch (err) {
logger.error(`[${LOG_HEADER}] Error while reading ${h2nPath}: ${err}`);
logger.error(`[${LOG_HEADER}] Error while reading ${h2nPath}: ${err.stack}`);
return -1;
}
}
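The repeated change in this file swaps `${err}` for `${err.stack}` in the error logs, matching the "Log stack trace for file handling errors" commit. A quick way to see the difference (assumes a local `node` binary; this snippet is illustrative, not project code):

```shell
#!/bin/bash
# Captures the two renderings of the same error: interpolating ${err} yields
# only the message, while ${err.stack} adds the trace with call sites.
WITH_STACK=$(node -e '
try {
  JSON.parse("not json"); // stand-in for a corrupt chunk/snapshot file
} catch (err) {
  console.error(`message only: ${err}`);  // old log line form, to stderr
  console.log(`${err.stack}`);            // new log line form, captured
}')
printf '%s\n' "$WITH_STACK"
```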
5 changes: 3 additions & 2 deletions copy_blockchain_data_gcp.sh
@@ -2,7 +2,8 @@

function usage() {
printf "Usage: bash copy_blockchain_data_gcp.sh [dev|staging|sandbox|exp|spring|summer|mainnet] <Node Index> [download|upload]\n"
printf "Example: bash copy_blockchain_data_gcp.sh spring 5 download\n"
printf "Example: bash copy_blockchain_data_gcp.sh spring 0 download\n"
printf "Example: bash copy_blockchain_data_gcp.sh spring 1 upload\n"
printf "\n"
exit
}
@@ -146,7 +147,7 @@ function upload_data() {

# 2. Extract tgz file for node
printf "\n\n<<< Extracting tgz file for node $node_index >>>\n\n"
TGZ_CMD="gcloud compute ssh $node_target_addr --command 'cd /home; sudo mkdir -p ain_blockchain_data; sudo chown runner:runner ain_blockchain_data; sudo chmod 777 ain_blockchain_data; cd ain_blockchain_data; gzip -dc ~/ain_blockchain_data.tar.gz | tar xvf -' --project $PROJECT_ID --zone $node_zone"
TGZ_CMD="gcloud compute ssh $node_target_addr --command 'cd /home; sudo mkdir -p ain_blockchain_data; sudo chown $GCP_USER:$GCP_USER ain_blockchain_data; sudo chmod 777 ain_blockchain_data; cd ain_blockchain_data; sudo rm -rf chains snapshots; gzip -dc ~/ain_blockchain_data.tar.gz | tar xvf -' --project $PROJECT_ID --zone $node_zone"
printf "TGZ_CMD=$TGZ_CMD\n\n"
eval $TGZ_CMD

134 changes: 134 additions & 0 deletions copy_blockchain_data_onprem.sh
@@ -0,0 +1,134 @@
#!/bin/bash

function usage() {
printf "Usage: bash copy_blockchain_data_onprem.sh [staging|spring|mainnet] <Node Index> [download|upload]\n"
printf "Example: bash copy_blockchain_data_onprem.sh staging 0 download\n"
printf "Example: bash copy_blockchain_data_onprem.sh staging 1 upload\n"
printf "\n"
exit
}

if [[ $# -lt 3 ]] || [[ $# -gt 3 ]]; then
usage
fi

if [[ "$1" = 'staging' ]] || [[ "$1" = 'spring' ]] || [[ "$1" = 'mainnet' ]]; then
SEASON="$1"
else
printf "Invalid <Project/Season> argument: $1\n"
exit
fi
printf "\n"
printf "SEASON=$SEASON\n"

ONPREM_USER="nvidia"
printf "ONPREM_USER=$ONPREM_USER\n"

number_re='^[0-9]+$'
if ! [[ $2 =~ $number_re ]] ; then
printf "\n"
printf "Invalid <Node Index> argument: $2\n"
exit
fi
NODE_INDEX=$2
if [[ $NODE_INDEX -lt 0 ]] || [[ $NODE_INDEX -gt 9 ]]; then
printf "\n"
printf "Out-of-range <Node Index> argument: $NODE_INDEX\n"
exit
fi
printf "NODE_INDEX=$NODE_INDEX\n"

if [[ "$3" = 'download' ]] || [[ "$3" = 'upload' ]]; then
COMMAND="$3"
else
printf "\n"
printf "Invalid <Command> argument: $3\n"
printf "\n"
usage
fi
printf "COMMAND=$COMMAND\n"

# Get confirmation.
if [[ "$SEASON" = "mainnet" ]]; then
printf "\n"
printf "Do you want to proceed for $SEASON? Enter [mainnet]: "
read CONFIRM
printf "\n\n"
if [[ ! $CONFIRM = "mainnet" ]]
then
[[ "$0" = "$BASH_SOURCE" ]] && exit 1 || return 1 # handle exits from shell or function but don't exit interactive shell
fi
else
printf "\n"
read -p "Do you want to proceed for $SEASON? [y/N]: " -n 1 -r
printf "\n\n"
if [[ ! $REPLY =~ ^[Yy]$ ]]; then
[[ "$0" = "$BASH_SOURCE" ]] && exit 1 || return 1 # handle exits from shell or function but don't exit interactive shell
fi
fi

# Read node ip addresses and passwords
IFS=$'\n' read -d '' -r -a NODE_IP_LIST < ./ip_addresses/${SEASON}_onprem_ip.txt
IFS=$'\n' read -d '' -r -a NODE_PW_LIST < ./ip_addresses/${SEASON}_onprem_pw.txt

function download_data() {
local node_index="$1"
local node_target_addr="${ONPREM_USER}@${NODE_IP_LIST[${node_index}]}"
local node_login_pw="${NODE_PW_LIST[${node_index}]}"

printf "\n* >> Downloading data from node $node_index ($node_target_addr) *********************************************************\n\n"

printf "node_target_addr='$node_target_addr'\n"

# 1. Create tgz file for node
printf "\n\n<<< Creating tgz file for node $node_index >>>\n\n"
TGZ_CMD="ssh $node_target_addr 'sudo -S ls -la; cd /home/${SEASON}/ain_blockchain_data; tar cvf - chains snapshots | gzip -c > ~/ain_blockchain_data.tar.gz'"
printf "TGZ_CMD=$TGZ_CMD\n\n"
eval "echo ${node_login_pw} | sshpass -f <(printf '%s\n' ${node_login_pw}) ${TGZ_CMD}"

# 2. Copy tgz file from node
printf "\n\n<<< Copying tgz file from node $node_index >>>\n\n"
SCP_CMD="scp -r $node_target_addr:~/ain_blockchain_data.tar.gz ."
printf "SCP_CMD=$SCP_CMD\n\n"
eval "sshpass -f <(printf '%s\n' ${node_login_pw}) ${SCP_CMD}"

# 3. Clean up tgz file for node
printf "\n\n<<< Cleaning up tgz file for node $node_index >>>\n\n"
CLEANUP_CMD="ssh $node_target_addr 'rm ~/ain_blockchain_data.tar.gz'"
printf "CLEANUP_CMD=$CLEANUP_CMD\n\n"
eval "sshpass -f <(printf '%s\n' ${node_login_pw}) ${CLEANUP_CMD}"
}

function upload_data() {
local node_index="$1"
local node_target_addr="${ONPREM_USER}@${NODE_IP_LIST[${node_index}]}"
local node_login_pw="${NODE_PW_LIST[${node_index}]}"

printf "\n* >> Uploading data from node $node_index ($node_target_addr) *********************************************************\n\n"

printf "node_target_addr='$node_target_addr'\n"

# 1. Copy tgz file to node
printf "\n\n<<< Copying tgz file to node $node_index >>>\n\n"
SCP_CMD="scp -r ./ain_blockchain_data.tar.gz $node_target_addr:~"
printf "SCP_CMD=$SCP_CMD\n\n"
eval "sshpass -f <(printf '%s\n' ${node_login_pw}) ${SCP_CMD}"

# 2. Extract tgz file for node
printf "\n\n<<< Extracting tgz file for node $node_index >>>\n\n"
TGZ_CMD="ssh $node_target_addr 'sudo -S ls -la; cd /home; sudo mkdir -p ${SEASON}/ain_blockchain_data; sudo chown $ONPREM_USER:$ONPREM_USER ${SEASON} ${SEASON}/ain_blockchain_data; sudo chmod 777 ${SEASON} ${SEASON}/ain_blockchain_data; cd ${SEASON}/ain_blockchain_data; sudo rm -rf chains snapshots; gzip -dc ~/ain_blockchain_data.tar.gz | tar xvf -'"
printf "TGZ_CMD=$TGZ_CMD\n\n"
eval "echo ${node_login_pw} | sshpass -f <(printf '%s\n' ${node_login_pw}) ${TGZ_CMD}"

# 3. Clean up tgz file for node
printf "\n\n<<< Cleaning up tgz file for node $node_index >>>\n\n"
CLEANUP_CMD="ssh $node_target_addr 'rm ~/ain_blockchain_data.tar.gz'"
printf "CLEANUP_CMD=$CLEANUP_CMD\n\n"
eval "sshpass -f <(printf '%s\n' ${node_login_pw}) ${CLEANUP_CMD}"
}

if [[ "$COMMAND" = 'upload' ]]; then
upload_data "$NODE_INDEX"
else
download_data "$NODE_INDEX"
fi
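A note on the `sshpass -f <(printf '%s\n' ${node_login_pw})` pattern used throughout this script: `-f` reads the password from a file, and process substitution turns the `printf` output into a transient `/dev/fd/N` path, so the password never appears in `sshpass`'s argument list (unlike `sshpass -p`, which any local user could read via `ps`). A small self-contained demonstration of the mechanism, with a placeholder value unrelated to the real node passwords:

```shell
#!/bin/bash
# Demonstrates the <(printf ...) mechanism: the consumer receives a file
# path and reads the secret from it instead of seeing it in its argv.
SECRET='placeholder-password'   # stand-in value, not a real credential

# <(...) expands to something like /dev/fd/63; `cat` here plays the role
# of sshpass reading the password file handed to -f.
READ_BACK=$(cat <(printf '%s\n' "$SECRET"))
printf 'read back: %s\n' "$READ_BACK"
```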