Commit
fix grammar in docs (#100)
pjfanning authored Jan 30, 2024
1 parent ac234e9 commit f2487b7
Showing 2 changed files with 3 additions and 3 deletions.
4 changes: 2 additions & 2 deletions docs/src/main/paradox/kafka.md
@@ -1,4 +1,4 @@
-# Messages from and to Kafka
+# Messages from and to Apache Kafka

A typical source for Projections is messages from Kafka. Apache Pekko Projections supports integration with Kafka using [Pekko Connectors Kafka](https://pekko.apache.org/docs/pekko-connectors-kafka/current/).

@@ -132,7 +132,7 @@ Java

Alternatively, we can define the same projection using @apidoc[Producer.flowWithContext](Producer$) in combination with `atLeastOnceFlow`.

-The `WordSource` emits `WordEnvelope`s, therefore we will build a flow that takes every single emitted `WordEnvelope` and map it into an Pekko Connectors Kafka @apidoc[ProducerMessage$]. The `ProducerMessage` factory methods can be used to produce a single message, multiple messages, or pass through a message (skip a message from being produced). The @apidoc[ProducerMessage$] will pass through @apidoc[Producer.flowWithContext](Producer$) that will publish it to the Kafka Topic and finally we map the result to `Done`.
+The `WordSource` emits `WordEnvelope`s, therefore we will build a flow that takes every single emitted `WordEnvelope` and map it into an Apache Pekko Connectors Kafka @apidoc[ProducerMessage$]. The `ProducerMessage` factory methods can be used to produce a single message, multiple messages, or pass through a message (skip a message from being produced). The @apidoc[ProducerMessage$] will pass through @apidoc[Producer.flowWithContext](Producer$) that will publish it to the Kafka Topic and finally we map the result to `Done`.

Scala
: @@snip [KafkaDocExample.scala](/examples/src/test/scala/docs/kafka/KafkaDocExample.scala) { #imports-producer #producerFlow }
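The flow described in the changed paragraph (map each `WordEnvelope` to a `ProducerMessage`, run it through `Producer.flowWithContext`, then map the result to `Done`) can be sketched roughly as follows. This is an illustrative reconstruction, not the repository's actual snippet: the topic name, key/value choice, and `ProducerSettings` wiring are assumptions.

```scala
import org.apache.kafka.clients.producer.ProducerRecord
import org.apache.pekko.Done
import org.apache.pekko.kafka.{ ProducerMessage, ProducerSettings }
import org.apache.pekko.kafka.scaladsl.Producer
import org.apache.pekko.projection.ProjectionContext
import org.apache.pekko.stream.scaladsl.FlowWithContext

// Envelope type from the docs example (fields are assumed here)
final case class WordEnvelope(offset: Long, word: String)

// Topic name and settings are hypothetical; real values come from the
// surrounding KafkaDocExample configuration.
def producerFlow(topic: String, settings: ProducerSettings[String, String]) =
  FlowWithContext[WordEnvelope, ProjectionContext]
    // one envelope -> one ProducerMessage (single); multi/passThrough
    // factory methods exist for the other cases mentioned in the docs
    .map(env => ProducerMessage.single(new ProducerRecord(topic, env.word, env.word)))
    // publish to Kafka, keeping the ProjectionContext attached
    .via(Producer.flowWithContext(settings))
    // the projection only needs to know the write completed
    .map(_ => Done)
```

The resulting `FlowWithContext` can then be plugged into `atLeastOnceFlow`, which uses the carried `ProjectionContext` to commit offsets after the publish succeeds.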
2 changes: 1 addition & 1 deletion docs/src/main/paradox/running.md
@@ -20,7 +20,7 @@ For more information on using Apache Pekko Cluster consult Pekko's reference doc

The Sharded Daemon Process can be used to distribute `n` instances of a given Projection across the cluster. Therefore, it's important that each Projection instance consumes a subset of the stream of envelopes.

-How the subset is created depends on the kind of source we consume. If it's an Pekko Connectors Kafka source, this is done by Kafka consumer groups. When consuming from Apache Pekko Persistence Journal, the events must be sliced by tagging them as demonstrated in the example below.
+How the subset is created depends on the kind of source we consume. If it's an Apache Pekko Connectors Kafka source, this is done by Kafka consumer groups. When consuming from Apache Pekko Persistence Journal, the events must be sliced by tagging them as demonstrated in the example below.

### Tagging Events in EventSourcedBehavior

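The tagging approach the changed paragraph refers to can be sketched like this. A minimal, hypothetical `EventSourcedBehavior`: the entity/command/event/state names, the tag prefix, and the tag count are all illustrative assumptions, not the repository's example.

```scala
import org.apache.pekko.persistence.typed.PersistenceId
import org.apache.pekko.persistence.typed.scaladsl.{ Effect, EventSourcedBehavior }

final case class AddWord(word: String)
final case class WordAdded(word: String)
final case class State(count: Long)

// Assumed: one tag per projection instance distributed by Sharded Daemon Process
val numberOfTags = 5

def behavior(entityId: String): EventSourcedBehavior[AddWord, WordAdded, State] =
  EventSourcedBehavior[AddWord, WordAdded, State](
    persistenceId = PersistenceId("WordCount", entityId),
    emptyState = State(0),
    commandHandler = (_, cmd) => Effect.persist(WordAdded(cmd.word)),
    eventHandler = (state, _) => state.copy(count = state.count + 1))
    // Slice the event stream: every event of a given entity gets exactly one
    // of N tags, so each projection instance consumes one tag's subset.
    .withTagger(_ => Set("words-" + math.abs(entityId.hashCode % numberOfTags)))
```

Each projection instance is then started with one of the `words-0` … `words-4` tags, giving the disjoint subsets of the envelope stream that the paragraph requires.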
