introduce grpc scenarios instead of fixed proto and variable payloads
Trisfald authored and aspurio committed Jan 10, 2022
1 parent a0deb7b commit 7d957e6
Showing 22 changed files with 268 additions and 61 deletions.
2 changes: 2 additions & 0 deletions .gitignore
@@ -1,2 +1,4 @@
results/
*.tmp
proto/
payload/
4 changes: 2 additions & 2 deletions README.md
@@ -52,7 +52,7 @@ The benchmark can be configured through the following environment variables:
|--------|---------------|:---------------:|
|GRPC_BENCHMARK_DURATION|Duration of the benchmark.|20s|
|GRPC_BENCHMARK_WARMUP|Duration of the warmup. Stats won't be collected.|5s|
|GRPC_REQUEST_PAYLOAD|File (from [payload/](payload/)) containing the data to be sent in the client request.|100B|
|GRPC_REQUEST_SCENARIO|Scenario (from [scenarios/](scenarios/)) containing the protobuf definition and the data to be sent in the client request. It is advised to set this variable for `build.sh` as well, and to rebuild whenever the scenario's `helloworld.proto` differs from that of the previously run scenario.|string_100B|
|GRPC_SERVER_CPUS|Maximum number of cpus used by the server.|1|
|GRPC_SERVER_RAM|Maximum memory used by the server.|512m|
|GRPC_CLIENT_CONNECTIONS|Number of connections to use.|50|
@@ -67,6 +67,6 @@ The benchmark can be configured through the following environment variables:
Other parameters will depend on your use-case. Choose wisely.

# Results
You can find our sample results in the [Wiki](https://github.com/LesnyRumcajs/grpc_bench/wiki). Be sure to run the benchmarks yourself if you have sufficient hardware, especially for multi-core scenarios.
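
For illustration, a minimal sketch of the scenario workflow described in the configuration table above, assuming the scripts are invoked from the repository root and that the default string_100B scenario is available under scenarios/ (the exact invocation may differ on your setup):

# Choose a scenario, build the images against its helloworld.proto, then benchmark it.
export GRPC_REQUEST_SCENARIO=string_100B
sh build.sh     # invokes setup_scenario.sh before building the *_bench Docker images
sh bench.sh     # invokes setup_scenario.sh again and writes reports under results/

To switch scenarios, export a different value and re-run build.sh before benchmarking, since the images are built against the scenario's helloworld.proto.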


2 changes: 1 addition & 1 deletion analyze.sh
@@ -21,5 +21,5 @@ $(git log -1 --pretty="%h %cD %cn %s")
- GRPC_CLIENT_CONCURRENCY=${GRPC_CLIENT_CONCURRENCY}
- GRPC_CLIENT_QPS=${GRPC_CLIENT_QPS}
- GRPC_CLIENT_CPUS=${GRPC_CLIENT_CPUS}
- GRPC_REQUEST_PAYLOAD=${GRPC_REQUEST_PAYLOAD}
- GRPC_REQUEST_SCENARIO=${GRPC_REQUEST_SCENARIO}
EOF
12 changes: 9 additions & 3 deletions bench.sh
@@ -15,7 +15,7 @@ export GRPC_CLIENT_CONCURRENCY=${GRPC_CLIENT_CONCURRENCY:-"1000"}
export GRPC_CLIENT_QPS=${GRPC_CLIENT_QPS:-"0"}
export GRPC_CLIENT_QPS=$(( GRPC_CLIENT_QPS / GRPC_CLIENT_CONCURRENCY ))
export GRPC_CLIENT_CPUS=${GRPC_CLIENT_CPUS:-"1"}
export GRPC_REQUEST_PAYLOAD=${GRPC_REQUEST_PAYLOAD:-"100B"}
export GRPC_REQUEST_SCENARIO=${GRPC_REQUEST_SCENARIO:-"string_100B"}

# Let containers know how many CPUs they will be running on
# Additionally export other vars for further analysis script.
@@ -34,6 +34,12 @@ for benchmark in ${BENCHMARKS_TO_RUN}; do

mkdir -p "${RESULTS_DIR}"

# Setup the chosen scenario
if ! sh setup_scenario.sh "${GRPC_REQUEST_SCENARIO}" true; then
echo "Scenario setup failed."
exit 1
fi

# Start the gRPC Server container
docker run --name "${NAME}" --rm \
--cpus "${GRPC_SERVER_CPUS}" \
@@ -60,7 +66,7 @@ for benchmark in ${BENCHMARKS_TO_RUN}; do
--connections="${GRPC_CLIENT_CONNECTIONS}" \
--rps="${GRPC_CLIENT_QPS}" \
--duration "${GRPC_BENCHMARK_WARMUP}" \
--data-file /payload/"${GRPC_REQUEST_PAYLOAD}" \
--data-file /payload/payload \
127.0.0.1:50051 > /dev/null

echo "done."
@@ -86,7 +92,7 @@ for benchmark in ${BENCHMARKS_TO_RUN}; do
--connections="${GRPC_CLIENT_CONNECTIONS}" \
--rps="${GRPC_CLIENT_QPS}" \
--duration "${GRPC_BENCHMARK_DURATION}" \
--data-file /payload/"${GRPC_REQUEST_PAYLOAD}" \
--data-file /payload/payload \
127.0.0.1:50051 >"${RESULTS_DIR}/${NAME}".report

# Show quick summary (reqs/sec)
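
Both bench.sh here and build.sh below call setup_scenario.sh, which is part of this commit but not rendered in this excerpt. Purely as a hypothetical sketch of what that step might do (the scenarios/<name>/helloworld.proto and scenarios/<name>/payload layout, and the meaning of the boolean argument, are assumptions rather than details taken from the diff): it would stage the scenario's proto into the git-ignored proto/ directory and, when requested, its payload into payload/payload, the path ghz reads via --data-file.

#!/bin/sh
# Hypothetical sketch only; the real setup_scenario.sh ships with this commit
# but is not shown above. Paths and argument meanings are assumed.
SCENARIO="$1"        # e.g. string_100B
COPY_PAYLOAD="$2"    # bench.sh passes true, build.sh passes false

if [ ! -d "scenarios/${SCENARIO}" ]; then
    echo "Unknown scenario: ${SCENARIO}"
    exit 1
fi

mkdir -p proto payload
cp "scenarios/${SCENARIO}/helloworld.proto" proto/helloworld.proto
if [ "${COPY_PAYLOAD}" = "true" ]; then
    cp "scenarios/${SCENARIO}/payload" payload/payload
fi
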
8 changes: 8 additions & 0 deletions build.sh
@@ -1,5 +1,7 @@
#!/bin/sh

export GRPC_REQUEST_SCENARIO=${GRPC_REQUEST_SCENARIO:-"string_100B"}

# Build ghz Docker image.
# See ghz-tool/Dockerfile for details/version
docker build -t ghz_bench:latest ./ghz-tool/
@@ -9,6 +11,12 @@ BENCHMARKS_TO_BUILD="${@}"
## ...or use all the *_bench dirs by default
BENCHMARKS_TO_BUILD="${BENCHMARKS_TO_BUILD:-$(find . -maxdepth 1 -name '*_bench' -type d | sort)}"

# Setup the chosen scenario
if ! sh setup_scenario.sh "${GRPC_REQUEST_SCENARIO}" false; then
echo "Scenario setup failed."
exit 1
fi

builds=""
for benchmark in ${BENCHMARKS_TO_BUILD}; do
echo "==> Building Docker image for ${benchmark}..."
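
Since build.sh falls back to every *_bench directory when no arguments are given, a single implementation can also be rebuilt against a new scenario before benchmarking it. A short example (the directory name is taken from the files changed in this commit):

# Rebuild only the Rust tonic image against the chosen scenario.
GRPC_REQUEST_SCENARIO=string_100B sh build.sh rust_tonic_st_bench

The same GRPC_REQUEST_SCENARIO value should then be used for the bench.sh run, so the generated image matches the scenario's helloworld.proto.
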
1 change: 0 additions & 1 deletion payload/100B

This file was deleted.

1 change: 0 additions & 1 deletion payload/10B

This file was deleted.

1 change: 0 additions & 1 deletion payload/10kB

This file was deleted.

1 change: 0 additions & 1 deletion payload/1kB

This file was deleted.

60 changes: 14 additions & 46 deletions rust_tonic_st_bench/Cargo.lock

Some generated files are not rendered by default.

2 changes: 1 addition & 1 deletion rust_tonic_st_bench/src/main.rs
@@ -20,7 +20,7 @@ impl Greeter for MyGreeter {
request: Request<HelloRequest>,
) -> Result<Response<HelloReply>, Status> {
let reply = hello_world::HelloReply {
message: request.into_inner().name,
response: request.into_inner().request,
};
Ok(Response::new(reply))
}
