From 718943bd7b5d7d4be9126a1dd7465197fa4f7cd9 Mon Sep 17 00:00:00 2001
From: "Jeffrey Jonathan Jennings (J3)"
Date: Tue, 8 Oct 2024 02:35:07 -0400
Subject: [PATCH] Resolved #68.

---
 README.md | 8 +++++---
 1 file changed, 5 insertions(+), 3 deletions(-)

diff --git a/README.md b/README.md
index 5bb0245..988b38c 100644
--- a/README.md
+++ b/README.md
@@ -75,11 +75,13 @@ To help you start quickly, the repo comes with **_Docker containers_** for Mac M
    git clone https://github.com/j3-signalroom/apache_flink-kickstarter.git
    ```
 
-3. Set up your Terraform Cloud environment locally or use GitHub workflows/actions to easily create the full environment. Here's what you'll get:
+3. Set up your Terraform Cloud environment locally or leverage GitHub Actions to create the complete setup effortlessly. Here's what you can expect:
 
-   - A Confluent Cloud environment with a Kafka Cluster, complete with pre-configured example Kafka topics, ready for use.
+   - A Confluent Cloud environment featuring a Kafka Cluster, fully equipped with pre-configured example Kafka topics—ready to power your data streaming needs.
 
-   - AWS Secrets Manager storing the API Key Secrets for the sample Kafka Cluster, and AWS Systems Manager Parameter Store set up with Kafka Consumer and Producer properties. Additionally, an AWS S3 Bucket is created as the landing spot for Apache Iceberg files generated by the Flink Apps.
+   - AWS Secrets Manager securely storing API Key Secrets for the Kafka Cluster, along with AWS Systems Manager Parameter Store containing Kafka Consumer and Producer properties for easy integration.
+
+   - An AWS S3 bucket with a dedicated `warehouse` folder, serving as the landing zone for Apache Iceberg tables populated by two Python-based Flink apps, bringing your data streaming architecture to life.
 
 4. Run Apache Flink locally on your Mac, or use the provided Docker containers from the repository to launch Apache Flink and Apache Iceberg seamlessly on your machine.
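
For context on the new S3 `warehouse` bullet added above, here is a minimal, hypothetical PyFlink sketch of how a Flink app could register an Apache Iceberg catalog over that S3 location and write a table into it. The bucket name, catalog name, database, table, and columns are illustrative placeholders rather than the repo's actual code, and the Iceberg Flink connector and S3 filesystem jars are assumed to be on the Flink classpath.

```python
# Hypothetical sketch, not the repo's implementation: register an Iceberg catalog
# whose warehouse points at the S3 bucket/folder provisioned by Terraform, then
# create and populate a sample table. All names below are illustrative placeholders.
from pyflink.table import EnvironmentSettings, TableEnvironment

# Streaming mode matches the repo's Flink apps; batch mode would also work here.
tbl_env = TableEnvironment.create(EnvironmentSettings.in_streaming_mode())

# Iceberg catalog backed by the dedicated `warehouse` folder in the S3 bucket.
tbl_env.execute_sql("""
    CREATE CATALOG iceberg_catalog WITH (
        'type' = 'iceberg',
        'catalog-type' = 'hadoop',
        'warehouse' = 's3a://<your-bucket>/warehouse'
    )
""")

tbl_env.execute_sql("CREATE DATABASE IF NOT EXISTS iceberg_catalog.example_db")

# A placeholder Iceberg table; real apps would define their own schemas.
tbl_env.execute_sql("""
    CREATE TABLE IF NOT EXISTS iceberg_catalog.example_db.sample_events (
        event_id STRING,
        source   STRING,
        ts       TIMESTAMP(3)
    )
""")

# Insert one sample row and wait for the job to finish.
tbl_env.execute_sql("""
    INSERT INTO iceberg_catalog.example_db.sample_events
    VALUES ('evt-1', 'sample', TIMESTAMP '2024-10-08 02:35:07')
""").wait()
```

In the actual apps, the rows would come from the pre-configured Kafka topics (using the Consumer/Producer properties stored in AWS Systems Manager Parameter Store) rather than from literal `VALUES`.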