The Renesas RZ/V2H chip on the Kakip board powers this entire demo. It pairs a main Arm Cortex-A55 processor with an AI accelerator called DRP-AI, which this demo uses. The AINEX robot is controlled from the A55 core over UART through the board's GPIO pins. Camera input is processed frame by frame on the DRP-AI accelerator using a YOLOv3 detector that recognizes hand gestures. Based on the detected gesture, the A55 core sends the corresponding action to the motor controller.
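The flow above can be sketched as a simple loop; `detect_gesture` and `send_action` below are illustrative placeholders for the DRP-AI YOLOv3 inference and the UART command to the motor controller, not the demo's real APIs:

```python
# Sketch of the demo's control flow (placeholder functions, not the real demo code).

def run_pipeline(frames, detect_gesture, send_action):
    """For each camera frame, detect a gesture and dispatch its robot action."""
    executed = []
    for frame in frames:
        gesture = detect_gesture(frame)   # would run YOLOv3 on the DRP-AI accelerator
        if gesture is not None:
            send_action(gesture)          # A55 -> motor controller over UART
            executed.append(gesture)
    return executed
```

Frames with no detected gesture are simply skipped, so the robot only moves when the detector is confident.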
- Begin by setting up the Robot as required.
- Remove the Raspberry Pi.
- Solder a pull-up resistor on the Kakip board, as shown in the figure, to resolve the issue encountered while working with DRP-AI.
- Set the DIP switches as follows
- Connect the debugger cable.
- Attach the Kakip board to the AINEX robot.
- Detach the camera head from AINEX and attach the Logi C90 for better resolution.
AINEX Camera:
Logi Camera:
- Power up using the provided battery. The battery used is shown in the figure.
- Format the SD card using GParted.
sudo apt-get install gparted
Select the USB memory card device.
Delete all partitions and make a single one (unmount, then right click → Delete).
- Use FAT32 as the file system.
right click → New → File system: fat32 → Add
- Click the tick mark to apply all operations → Apply.
There are 2 methods:
-
- Download the provided image file, which has the demo built in; flash it and run it following the steps below:
i. Download the given SD card image file - Handgesture_out.img.xz
ii. Unzip the image
xz -d Handgesture_out.img.xz
iii. Write the extracted image to the device
sudo dd if=Handgesture_out.img of=/dev/sdX bs=4M status=progress
if=Handgesture_out.img: input file
of=/dev/sdX: output device (replace sdX with your actual device name)
bs=4M: block size
status=progress: show progress
iv. Now unmount the SD card from the PC and insert it into the Kakip board.
v. Follow the steps in Section 5.3, Running the application, to run the application.
- Else, to develop from scratch, follow these steps, starting with flashing the Ubuntu 24.04 image onto the SD card:
i. Download the ubuntu image from the following link,
https://amatama-my.sharepoint.com/:f:/g/personal/yuichi_horiuchi_amatama_onmicrosoft_com/ElrlDdJrIFBJsiOFYSBqh-4B9v1bY4-kuGneQeIGQxSdCw?e=7mWfvf
ii. Flash the Ubuntu image onto the SD card using Balena Etcher or the Ubuntu image writer, as shown below:
Steps for setting up, applying patches, building, and running the application:
Create a container image of the AI SDK for RZ/V2H, referring to the Renesas procedure.
Follow the steps below to create the container image:
- Install Docker
- Download the AI SDK from the link

Extract the RZ/V AI SDK package:
- On your Linux PC, make the working directory.
mkdir -p ai_sdk_work
- Register the working directory path to an environment variable.
export WORK=<path to the working directory>/ai_sdk_work
- Move to the working directory.
cd ${WORK}
- Extract RZ/V AI SDK zip file under the working directory.
unzip <Path to the file>/RTK0EF0*.zip -d ${WORK}
- Check the working directory to confirm the package contents.
ls ${WORK}/
- If the above command prints the following, the package was extracted correctly.
ai_sdk_setup board_setup documents references r11an0*.pdf
- On your Linux PC, move to the working directory.
cd ${WORK}/ai_sdk_setup
- Docker build
docker build -t rzv2h_ai_sdk_image --build-arg SDK="/opt/poky/3.1.31" --build-arg PRODUCT="V2H" .
- Create new directory to be mounted on Docker container.
mkdir ${WORK}/ai_sdk_setup/data
- Create docker container
sudo docker run --name kakip_env -it -v $PWD:/kakip_linux -w /kakip_linux rzv2h_ai_sdk_image
- In a new terminal, list the Docker containers.
sudo docker ps -a
- Start the Docker container.
sudo docker start -i kakip_env
If using it for the first time, nano may not be installed in the container; install it first to edit the files:
apt update
apt install nano
If you want to delete the folder:
sudo docker exec -ti kakip_env rm -rf /home/kakip_linux
Clone the kakip_linux repository
cd /home
git clone https://github.com/Kakip-ai/kakip_linux
cd kakip_linux
- Configuring the Kernel Config
cp ./arch/arm64/configs/kakip.config .config
- Setting Environment Variables and Installing Dependencies
source /opt/poky/3.1.31/environment-setup-aarch64-poky-linux
apt update && apt install -y flex bison bc
- Edit the dts file at the following path
cd arch/arm64/boot/dts/renesas
#for pin address
nano kakip-es1.dts #edit in this file
In lines 231 & 232, make the following edits:
#for address mapping copy the first .dtsi and paste it in the terminal with nano.
#sci4_pins: sci4 {
# pinmux = <RZG2L_PORT_PINMUX(7, 2, 1)>, /* SCI4_TXD_MOSI_SDA */
# <RZG2L_PORT_PINMUX(7, 3, 1)>; /* SCI4_RXD_MISO_SCL */
#};
- Build
Go back to the kakip_linux folder
cd /home/kakip_linux
make -j4 Image
make -j4 renesas/kakip-es1.dtb
The build produces the following two artifacts:
./arch/arm64/boot/Image
./arch/arm64/boot/dts/renesas/kakip-es1.dtb
After building, exit from the container with exit.
exit
Copy the files onto the SD card flashed with the Ubuntu image.
- First, delete the existing image files:
cd <folder_path>
sudo rm -rf kakip-es1.dtb Image-5.10.145-cip17-yocto-standard
- Copy the files
sudo docker cp kakip_env:/home/kakip_linux/arch/arm64/boot/dts/renesas/kakip-es1.dtb /media/<usr>/root/boot/
sudo docker cp kakip_env:/home/kakip_linux/arch/arm64/boot/dts/renesas/r9a09g057.dtsi /media/<usr>/root/boot
- Connect the Ethernet cable.
- Boot up the Kakip.
- Open GtkTerm using
sudo gtkterm
- In Configuration → Port, select the serial port and set the baud rate (debugger) to 115200.
- Log in using the user ID (ubuntu) and password.
- Check the IP address using the following command in the serial terminal:
ifconfig
- Open a new terminal and SSH into the board using the following command:
ssh ubuntu@<ip_address>
- Change the locale at the following path:
nano /etc/default/locale #in the opened file, set LANG=en_US.UTF-8 LANGUAGE=en_US:en
sudo dpkg-reconfigure locales #press enter twice and 97 followed by 3. #close it and reboot now
This application showcases the capability of deep neural networks to predict different hand gestures. It detects a total of 8 gestures: one, two, three, four, five, thumbs up, thumbs down, and rock, with high precision.
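For reference, the eight classes can be held in a single list; the index order below is an assumption for illustration and may not match the class IDs the trained detector actually uses:

```python
# Hypothetical class-ID -> gesture-name table (index order is assumed, not verified).
GESTURE_CLASSES = [
    "one", "two", "three", "four", "five",
    "thumbs up", "thumbs down", "rock",
]

def label_for(class_id: int) -> str:
    """Map a detector class ID to its gesture name."""
    return GESTURE_CLASSES[class_id]
```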
After setting up Docker, go to the working directory and clone the repository below inside the Docker container:
cd /home/kakip_linux
export WORK=/home/kakip_linux/
cd ${WORK}
git clone https://github.com/Ignitarium-Renesas/Kakip_Humanoid.git
Note:
To change the port or socket address in future, follow the steps below:
- To change the port and IP address in hand_gesture_recognition.cpp:
cd ${WORK}/Kakip_Humanoid/Hand_gesture/Gesture_Recognition/src
nano hand_gesture_recognition.cpp
Change line 55 to use the correct PORT number:
/* DRP-AI TVM[*1] Runtime object */
MeraDrpRuntimeWrapper runtime;
#define PORT 9091
int sock = 0;
Change the IP address in line 964 to the correct IP address:
if (inet_pton(AF_INET, "127.0.0.1", &serv_addr.sin_addr) <= 0) {
printf("\nInvalid address/ Address not supported \n");
return -1;
}
- Similarly, the same Port and IP address should be used in socket_gesture.py
cd ${WORK}/Kakip_Humanoid/Hand_gesture/A55_GPIO
nano socket_gesture.py
Change line 4:
def start_server(host='127.0.0.1', port=9091):
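The host/port pair in `socket_gesture.py` must match the `PORT` define and the `inet_pton()` address in `hand_gesture_recognition.cpp`. A minimal loopback sketch of that handshake is shown below; the echo handler is an assumption for illustration, since the real server triggers robot actions instead of echoing:

```python
import socket
import threading

HOST, PORT = "127.0.0.1", 9091   # must match hand_gesture_recognition.cpp

def serve_once(ready):
    """Accept one connection and echo the received gesture label back."""
    srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    srv.bind((HOST, PORT))
    srv.listen(1)
    ready.set()                    # server is listening; client may connect now
    conn, _ = srv.accept()
    conn.sendall(conn.recv(64))    # the real server would dispatch a robot action here
    conn.close()
    srv.close()

def send_gesture(label):
    """Connect the way the C++ app does and send a gesture label."""
    with socket.create_connection((HOST, PORT)) as c:
        c.sendall(label.encode())
        return c.recv(64)

ready = threading.Event()
t = threading.Thread(target=serve_once, args=(ready,))
t.start()
ready.wait()
reply = send_gesture("thumbs_up")
t.join()
```

If the label comes back unchanged, both sides agree on the address and port.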
Build the application by following the commands below.
mkdir -p build && cd build
cmake -DCMAKE_TOOLCHAIN_FILE=./toolchain/runtime.cmake -DV2H=ON ..
make -j$(nproc)
The built application file will be available in the following directory:
${WORK}/Kakip_Humanoid/Hand_gesture/Gesture_Recognition/src/build
The generated file will be named:
hand_gesture_recognition_v2_app
Follow the steps below to deploy the project on board.
Run the commands below to download deploy_tvm-v230.so from Release v5.00
cd ${WORK}/Kakip_Humanoid/Hand_gesture/Gesture_Recognition/exe_v2h/hand_yolov3_onnx
wget https://github.com/Ignitarium-Renesas/rzv_ai_apps/releases/download/v5.00/12_Hand_gesture_recognition_v2_deploy_tvm-v230.so
Rename 12_Hand_gesture_recognition_v2_deploy_tvm-v230.so to deploy.so.
mv 12_Hand_gesture_recognition_v2_deploy_tvm-v230.so deploy.so
Follow the steps in the section Running the application.
Copy the source code onto the Kakip board.
In your Linux PC:
sudo scp -r ${WORK}/Kakip_Humanoid ubuntu@<ip>:/home/ubuntu/
Run the following commands on the Kakip before running the application:
ssh ubuntu@<IP>
sudo apt update
sudo apt install libpcre3
sudo ldconfig
ldconfig -p | grep libpcre
sudo ln -s /usr/lib/aarch64-linux-gnu/libpcre.so.3 /usr/lib/aarch64-linux-gnu/libpcre.so.1
To execute the application using the CR8, follow the section: Execution using CR8.
Follow the commands below to run the application.
sudo cp /home/ubuntu/Kakip_Humanoid/libtvm_runtime.so /usr/lib64
cd /home/ubuntu/Kakip_Humanoid/Hand_gesture/A55_GPIO
python3 socket_gesture.py
In a second terminal:
cd /home/ubuntu/Kakip_Humanoid/Hand_gesture/Gesture_Recognition/exe_v2h
./hand_gesture USB
Keep the Kakip's DIP switches as below:
1 & 2 - ON
3 & 4 - OFF
Follow the steps below to enable the CR8 core through U-Boot:
- Install e2 studio as described in the section e2 studio installation
- Load the e2 studio project file and build all
- Copy the following .bin files from the Debug folder into root/boot of the SD card
sudo cp rzv2h_cr8_rpmsg_demo_sdram.bin /media/<usr>/root/boot
sudo cp rzv2h_cr8_rpmsg_demo_itcm.bin /media/<usr>/root/boot
sudo cp rzv2h_cr8_rpmsg_demo_sram.bin /media/<usr>/root/boot
- Reboot the Kakip and stop in U-Boot
Press any key when prompted to stop autoboot
- Starting the CR8 program
=> setenv cr8start 'dcache off; mw.l 0x10420D24 0x04000000; mw.l 0x10420600 0xE000E000; mw.l 0x10420604 0x00030003; mw.l 0x10420908 0x1FFF0000; mw.l 0x10420C44 0x003F0000; mw.l 0x10420C14 0x00000000; mw.l 0x10420908 0x10001000; mw.l 0x10420C48 0x00000020; mw.l 0x10420908 0x1FFF1FFF; mw.l 0x10420C48 0x00000000; ext4load mmc 0:2 0x12040000 boot/rzv2h_cr8_rpmsg_demo_itcm.bin; ext4load mmc 0:2 0x08180000 boot/rzv2h_cr8_rpmsg_demo_sram.bin; ext4load mmc 0:2 0x40800000 boot/rzv2h_cr8_rpmsg_demo_sdram.bin; mw.l 0x10420C14 0x00000003; dcache on;'
=> saveenv
=> run cr8start
After starting the CR8 program, boot into Linux.
=> run bootcmd
- Run the motor control code for the Humanoid Robot
Run the socket server with RPMsg
sudo ./rpmsg_sample_client
- Run Hand gesture demo code
cd Hand_gesture/Gesture_Recognition/exe_v2h
./hand_gesture_kakip USB
Download the software from the below link: Development environment setup
Note:
For windows: setup_e2_studio_2024-01_1.exe
For Linux: e2studio_installer-2024-01_1.Linux_host.run
Change the install location and specify where to install e2studio.
Select GNU ARM Embedded 12.2-Rel1.
Select FSP.
Step 4
Select GNU ARM Embedded 12.2-Rel1
Download the FSP package from the below link or from the assets RZ_FSP
After installation, copy the rz_fsp folder into the projectgen location in the e2 studio installation folder
click Help > CMSIS Packs Management > Renesas RZ/V. If the FSP pack is installed successfully, it will show here
Download the zip file and load rzv2h_cm33_rpmsg_demo and rzv2h_cr8_rpmsg_demo into e2 studio
click Project > Build All
Installation of J-link Software
Download and install the software
Installation of Python 3.10
sudo apt update
sudo add-apt-repository ppa:deadsnakes/ppa
sudo apt update
sudo apt install python3.10
python3.10 --version
Turn on the RZ/V2H
Flash the code with the button
Run the code with the Resume button
Kakip Hardware: https://www.kakip.ai/wp-content/uploads/2024/04/Kakip_HW_Ref.pdf