
Commit 355baef (merge commit, 2 parents: b123f00 + 7cf08f2)

2 files changed (+14 −14 lines)


docs/data-sources/csv.md (+13 −13)
@@ -38,19 +38,19 @@ After navigating to the CSV connection settings, you will need to fill in its details.

![Adding connection settings](https://dqops.com/docs/images/working-with-dqo/adding-connections/connection-settings-csv.png){ loading=lazy; width="1200px" }
The 13 table lines in this hunk were re-aligned (column padding only; the cell content is unchanged). The resulting table:

| CSV connection settings | Property name in YAML configuration file | Description |
|-------------------------|------------------------------------------|-------------|
| Connection name | | The name of the connection that will be created in DQOps. This will also be the name of the folder where the connection configuration files are stored. The name of the connection must be unique and consist of alphanumeric characters. |
| Parallel jobs limit | | A limit on the number of jobs that can run simultaneously. Leave empty to disable the limit. |
| Files location | `storage-type` | You have the option to import files stored locally or on AWS S3. If you choose to work with files on AWS S3, it is recommended that you create a specialized user in IAM. This user should be used as a service account and given permission to list and read objects. |
| File format | `files-format-type` | Type of source files for DuckDB. |
| AWS authentication mode | `duckdb_aws_authentication_mode` | Available when using AWS S3. Authentication mode to AWS S3. Also supports a ${REDSHIFT_AUTHENTICATION_MODE} configuration with a custom environment variable. |
| Access Key ID | `user` | Available when using AWS S3. Access Key ID for AWS authentication. The value can be in the ${ENVIRONMENT_VARIABLE_NAME} format to use dynamic substitution. |
| Secret Access Key | `password` | Available when using AWS S3. Secret Access Key for AWS authentication. The value can be in the ${ENVIRONMENT_VARIABLE_NAME} format to use dynamic substitution. |
| Region | `region` | The region for the storage credentials for a remote storage type. The value can be in the ${ENVIRONMENT_VARIABLE_NAME} format to use dynamic substitution. When not set, the default value will be loaded from the .credentials/AWS_default_config file in your DQOps user home. |
| Virtual schema name | `directories` | An alias for the parent directory with data. The virtual schema name is a key of the directories mapping. |
| Path | `directories` | The path prefix to the parent directory with data. The path must be absolute. |
| JDBC connection property | | Optional setting. DQOps supports using the JDBC driver to access DuckDB. |
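To illustrate how these properties relate, here is a hedged sketch of a connection configuration in YAML. Only the property names (`storage-type`, `files-format-type`, `duckdb_aws_authentication_mode`, `user`, `password`, `region`, `directories`) come from the table above; the surrounding structure, the `default_credentials` mode name, and the `sales` schema alias are illustrative assumptions, not the definitive DQOps file layout.

```yaml
# Sketch only — structure and values are assumptions; property names are from the table above.
duckdb:
  storage-type: s3                                  # Files location: local or AWS S3
  files-format-type: csv                            # File format read by DuckDB
  duckdb_aws_authentication_mode: default_credentials  # assumed mode name
  user: ${AWS_ACCESS_KEY_ID}                        # ${ENVIRONMENT_VARIABLE_NAME} substitution
  password: ${AWS_SECRET_ACCESS_KEY}                # ${ENVIRONMENT_VARIABLE_NAME} substitution
  region: ${AWS_REGION}
  directories:
    sales: s3://example-bucket/sales/               # virtual schema name -> absolute path prefix
```

Note how `directories` carries both table rows at once: each key is a virtual schema name, and each value is the absolute path prefix for that schema.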
### Setting the path to data import

docs/data-sources/redshift.md (+1 −1)
@@ -40,7 +40,7 @@ After navigating to the Redshift connection settings, you will need to fill in its details.

  | Redshift connection settings | Property name in YAML configuration file | Description |
  |------------------------------|------------------------------------------|-------------|
  | Connection name | | The name of the connection that will be created in DQOps. This will also be the name of the folder where the connection configuration files are stored. The name of the connection must be unique and consist of alphanumeric characters. |
- | Host | `host` | Redshift host name. Supports also a ${REDSHIFT_HOST} configuration with a custom environment variable. |
+ | Host | `host` | Redshift host name. ClusterID and Region must be set in Host. Supports also a ${REDSHIFT_HOST} configuration with a custom environment variable. |
  | Port | `port` | Redshift port name. The default port is 5439. Supports also a ${REDSHIFT_PORT} configuration with a custom environment variable. |
  | Database | `database` | Redshift database name. The value can be in the ${ENVIRONMENT_VARIABLE_NAME} format to use dynamic substitution. |
  | Redshift authentication mode | `redshift_authentication_mode` | The authentication mode for Redshift. Supports also a ${REDSHIFT_AUTHENTICATION_MODE} configuration with a custom environment variable. |
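The changed line says the cluster ID and region must be part of the host value, which matches the shape of a standard Redshift cluster endpoint. A hedged sketch in YAML — the endpoint, database variable, and authentication mode value below are made-up examples; only the property names come from the table:

```yaml
# Sketch only — values are illustrative assumptions; property names are from the table above.
redshift:
  host: examplecluster.abc123xyz789.eu-central-1.redshift.amazonaws.com  # <cluster-id>.<suffix>.<region>.redshift.amazonaws.com
  port: 5439                                      # default Redshift port
  database: ${REDSHIFT_DATABASE}                  # ${ENVIRONMENT_VARIABLE_NAME} substitution
  redshift_authentication_mode: ${REDSHIFT_AUTHENTICATION_MODE}
```

Because the cluster ID and region are embedded in the endpoint, a single `host` property is enough to identify the cluster without separate cluster-id or region settings.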
