Batch source to use Google Cloud Platform's Bigtable as a source.
User Expectations
- User specifies how errors during ingestion should be handled; records that fail processing are then handled according to the chosen option (see the sketch after this list).
- User should be able to specify account credentials in configuration.
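The effect of the "Skip error" option can be illustrated with a minimal sketch. The class, method, and enum names below are hypothetical and only show how a per-record error policy might be applied; they are not the plugin's actual implementation.

```java
import java.nio.charset.StandardCharsets;

// Only "Skip error" is listed in this design.
enum ErrorHandling { SKIP_ERROR }

final class RowConverter {
  private final ErrorHandling onError;

  RowConverter(ErrorHandling onError) {
    this.onError = onError;
  }

  /** Converts a raw Bigtable row; returns null when a failing record is skipped. */
  String convert(byte[] rawRow) {
    try {
      return parse(rawRow);
    } catch (RuntimeException e) {
      if (onError == ErrorHandling.SKIP_ERROR) {
        return null; // "Skip error": drop the record that failed to process
      }
      throw e; // any stricter policy would surface the failure instead
    }
  }

  private String parse(byte[] rawRow) {
    // Placeholder for the real Bigtable-row-to-record conversion.
    return new String(rawRow, StandardCharsets.UTF_8);
  }
}
```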
User Configurations
Section | User Configuration Label | Label Description | Mandatory | Macro-enabled | Options | Default | Variable | User Widget
---|---|---|---|---|---|---|---|---
Standard | Reference Name | This will be used to uniquely identify this source for lineage, annotating metadata, etc. | + | + | | | referenceName | Text Box
 | Table Name | Database table name | + | + | | | tableName | Text Box
 | Instance ID | Bigtable instance ID | + | + | | | instanceId | Text Box
 | Project ID | The ID of the project in Google Cloud. If not specified, it will be automatically read from the cluster environment. | | + | | | projectId | Text Box
 | Service Account File Path | Path on the local file system of the service account key used for authorization. If the plugin is run on a Google Cloud Dataproc cluster, the service account key does not need to be provided and can be set to 'auto-detect'. When running on other clusters, the file must be present on every node in the cluster. See Google's documentation on Service account credentials for details. | | + | | | serviceFilePath | Text Box
Error Handling | On Record Error | How to handle errors in record processing | + | | Skip error | | on-error | Radio Button (layout: block)
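The table above maps directly onto a plugin configuration class. A minimal sketch follows, assuming the source is built as a CDAP plugin; the class name and the field-to-widget mapping mirror the table but are illustrative, not the plugin's actual source code.

```java
import io.cdap.cdap.api.annotation.Description;
import io.cdap.cdap.api.annotation.Macro;
import io.cdap.cdap.api.annotation.Name;
import io.cdap.cdap.api.plugin.PluginConfig;

import javax.annotation.Nullable;

// Illustrative config class; property names match the "Variable" column above.
public class BigtableSourceConfig extends PluginConfig {

  @Name("referenceName")
  @Description("Used to uniquely identify this source for lineage, annotating metadata, etc.")
  private String referenceName;

  @Name("tableName")
  @Description("Database table name.")
  @Macro
  private String tableName;

  @Name("instanceId")
  @Description("Bigtable instance ID.")
  @Macro
  private String instanceId;

  @Name("projectId")
  @Description("Google Cloud project ID; read from the cluster environment when not set.")
  @Macro
  @Nullable
  private String projectId;

  @Name("serviceFilePath")
  @Description("Path to the service account key file, or 'auto-detect' on Dataproc clusters.")
  @Macro
  @Nullable
  private String serviceFilePath;

  @Name("on-error")
  @Description("How to handle errors in record processing.")
  private String onError;
}
```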