| CSV connection settings | Property name in YAML configuration file | Description |
|-----------------------------|--------------------------------------|-------------|
| Connection name | | The name of the connection that will be created in DQOps. This will also be the name of the folder where the connection configuration files are stored. The name of the connection must be unique and consist of alphanumeric characters. |
| Parallel jobs limit | | A limit on the number of jobs that can run simultaneously. Leave empty to disable the limit. |
| Files location | `storage-type` | You have the option to import files stored locally or on AWS S3. If you choose to work with files on AWS S3, it is recommended that you create a specialized user in IAM. This user should be used as a service account and given permission to list and read objects. |
| File format | `files-format-type` | The type of source files for DuckDB. |
| AWS authentication mode | `duckdb_aws_authentication_mode` | Available when using AWS S3. The authentication mode for AWS S3. The value can be in the ${ENVIRONMENT_VARIABLE_NAME} format to use dynamic substitution. |
| Access Key ID | `user` | Available when using AWS S3. The Access Key ID for AWS authentication. The value can be in the ${ENVIRONMENT_VARIABLE_NAME} format to use dynamic substitution. |
| Secret Access Key | `password` | Available when using AWS S3. The Secret Access Key for AWS authentication. The value can be in the ${ENVIRONMENT_VARIABLE_NAME} format to use dynamic substitution. |
| Region | `region` | The region for the storage credentials of a remote storage type. The value can be in the ${ENVIRONMENT_VARIABLE_NAME} format to use dynamic substitution. When not set, the default value is loaded from the .credentials/AWS_default_config file in the DQOps user home. |
| Virtual schema name | `directories` | An alias for the parent directory with data. The virtual schema name is a key of the directories mapping. |
| Path | `directories` | The path prefix to the parent directory with data. The path must be absolute. |
| JDBC connection property | | Optional setting. DQOps supports using the JDBC driver to access DuckDB. |
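The settings above map onto the connection's YAML configuration file. The following is a minimal sketch of what such a file could look like for CSV files on AWS S3; the exact file layout and top-level keys are an assumption based on the property names in the table, and the bucket path and the `files` schema alias are hypothetical:

```yaml
# Hypothetical connection configuration sketch; field names follow the
# "Property name in YAML configuration file" column of the table above.
apiVersion: dqo/v1
kind: source
spec:
  provider_type: duckdb
  duckdb:
    files_format_type: csv          # File format
    storage_type: s3                # Files location
    user: ${AWS_ACCESS_KEY_ID}      # Access Key ID, resolved from an environment variable
    password: ${AWS_SECRET_ACCESS_KEY}  # Secret Access Key
    region: ${AWS_REGION}           # Region
    directories:
      files: s3://example-bucket/data/   # Virtual schema name -> Path (hypothetical bucket)
```

The `${ENVIRONMENT_VARIABLE_NAME}` placeholders keep credentials out of the configuration file and are substituted dynamically at runtime, as described in the table.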
| Redshift connection settings | Property name in YAML configuration file | Description |
|-----------------------------|--------------------------------------|-------------|
| Connection name | | The name of the connection that will be created in DQOps. This will also be the name of the folder where the connection configuration files are stored. The name of the connection must be unique and consist of alphanumeric characters. |
| Host | `host` | Redshift host name. The cluster ID and region must be set in the host name. Supports also a ${REDSHIFT_HOST} configuration with a custom environment variable. |
| Port | `port` | Redshift port number. The default port is 5439. Supports also a ${REDSHIFT_PORT} configuration with a custom environment variable. |
| Database | `database` | Redshift database name. The value can be in the ${ENVIRONMENT_VARIABLE_NAME} format to use dynamic substitution. |
| Redshift authentication mode | `redshift_authentication_mode` | The authentication mode for Redshift. Supports also a ${REDSHIFT_AUTHENTICATION_MODE} configuration with a custom environment variable. |
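As a sketch, the Redshift properties above could appear in the connection's YAML file as follows; the file layout and top-level keys are an assumption based on the property names in the table, and the host and database values are hypothetical placeholders:

```yaml
# Hypothetical connection configuration sketch for Redshift; field names
# follow the "Property name in YAML configuration file" column above.
apiVersion: dqo/v1
kind: source
spec:
  provider_type: redshift
  redshift:
    host: ${REDSHIFT_HOST}    # e.g. a cluster endpoint that includes the cluster ID and region
    port: 5439                # default Redshift port
    database: ${REDSHIFT_DATABASE}
    redshift_authentication_mode: ${REDSHIFT_AUTHENTICATION_MODE}
```

Using the `${...}` environment-variable form for the host, database, and authentication mode keeps environment-specific values out of version-controlled configuration, matching the dynamic substitution behavior described in the table.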