Added TOC to each docs page
rjrudin committed Jan 22, 2024
1 parent a764308 commit 6a11f7e
Showing 5 changed files with 33 additions and 0 deletions.
1 change: 1 addition & 0 deletions docs/Gemfile.lock
@@ -253,6 +253,7 @@ GEM

PLATFORMS
arm64-darwin-21
arm64-darwin-23
x86_64-linux

DEPENDENCIES
8 changes: 8 additions & 0 deletions docs/configuring-the-connector.md
@@ -4,6 +4,14 @@ title: Configuring the Connector
nav_order: 4
---

## Table of contents
{: .no_toc .text-delta }

- TOC
{:toc}

## Overview

Using the MarkLogic Kafka connector requires configuring a set of properties to control how the connector interacts
with MarkLogic. The manner in which you use the MarkLogic connector will determine how you configure the connector:

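The page above describes configuring the connector through a set of properties. As a rough sketch only — property names other than `ml.connection.port` (which appears in writing-data.md below) are assumptions and should be checked against the full docs page — a sink-style configuration might look like this:

```properties
# Hypothetical sketch of MarkLogic Kafka connector properties.
# Names other than ml.connection.port are assumptions, not confirmed by this diff.
name=marklogic-sink-example
connector.class=com.marklogic.kafka.connect.sink.MarkLogicSinkConnector
topics=example-topic
ml.connection.host=localhost
ml.connection.port=8000
ml.connection.username=kafka-user
ml.connection.password=change-me
```
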
8 changes: 8 additions & 0 deletions docs/installation-guide.md
@@ -4,6 +4,14 @@ title: Installation Guide
nav_order: 3
---

## Table of contents
{: .no_toc .text-delta }

- TOC
{:toc}

## Overview

Apache Kafka can be run either as a standalone product or via a provider such as Confluent. This guide provides
instructions on using the MarkLogic connector with either Kafka as a standalone product or via Confluent. Other
Kafka providers exist and should be capable of utilizing the MarkLogic connector as well, but instructions on those
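The installation overview above covers running the connector with standalone Kafka. As a hedged illustration, a standalone Kafka Connect worker locates connectors through its standard `plugin.path` setting; the directory below is a placeholder, not a path from this commit:

```properties
# Sketch of a Kafka Connect standalone worker config (config/connect-standalone.properties).
# plugin.path is a standard Kafka Connect worker property; the directory shown is a placeholder.
bootstrap.servers=localhost:9092
key.converter=org.apache.kafka.connect.json.JsonConverter
value.converter=org.apache.kafka.connect.json.JsonConverter
offset.storage.file.filename=/tmp/connect.offsets
plugin.path=/opt/kafka/plugins/marklogic
```
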
8 changes: 8 additions & 0 deletions docs/reading-data.md
@@ -4,6 +4,14 @@ title: Reading Data
nav_order: 5
---

## Table of contents
{: .no_toc .text-delta }

- TOC
{:toc}

## Overview

The MarkLogic Kafka connector uses the [Optic API](https://docs.marklogic.com/guide/app-dev/OpticAPI) to read data from
MarkLogic as rows. Each row is converted into a Kafka `SourceRecord` and sent to a user-defined topic. To enable
this, the following properties must be configured:
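The list of required source properties is truncated in the hunk above. Purely as an illustrative sketch — the `ml.source.*` names are assumptions, not taken from this commit — a source configuration pairs an Optic DSL query with a destination topic:

```properties
# Hypothetical source connector sketch: read rows via an Optic DSL query, publish to a topic.
# The ml.source.* property names are assumptions; confirm them against the reading-data page.
name=marklogic-source-example
connector.class=com.marklogic.kafka.connect.source.MarkLogicSourceConnector
ml.connection.host=localhost
ml.connection.port=8000
ml.source.optic.dsl=op.fromView('demo', 'employees')
ml.source.topic=employee-rows
```
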
8 changes: 8 additions & 0 deletions docs/writing-data.md
@@ -4,6 +4,14 @@ title: Writing Data
nav_order: 6
---

## Table of contents
{: .no_toc .text-delta }

- TOC
{:toc}

## Overview

By default, the MarkLogic Kafka connector assumes that the app server associated with the port defined by the `ml.connection.port`
property is a [REST API app server](https://docs.marklogic.com/guide/rest-dev) - that is, the value of its `url rewriter`
property is `/MarkLogic/rest-api/rewriter.xml` or a variation of that rewriter. This allows the MarkLogic connector to use the
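The writing-data overview notes that `ml.connection.port` must point at a REST API app server. MarkLogic's out-of-the-box REST API server listens on port 8000, so a minimal sketch would reference that port (or any app server whose url rewriter is `/MarkLogic/rest-api/rewriter.xml`); the collection property is an assumed example, not confirmed by this diff:

```properties
# ml.connection.port must reference a REST API app server (rewriter /MarkLogic/rest-api/rewriter.xml).
# Port 8000 is MarkLogic's default REST API app server; ml.document.collections is an assumed example.
connector.class=com.marklogic.kafka.connect.sink.MarkLogicSinkConnector
topics=example-topic
ml.connection.port=8000
ml.document.collections=kafka-data
```
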
