Batch source to use Google Cloud Platform's Bigtable as a source.
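
For context, the sketch below shows one possible way a batch read of a Bigtable table could look using the google-cloud-bigtable Java client. The projectId, instanceId, tableName, and serviceFilePath values mirror the configuration properties described later on this page; the sample values are hypothetical, and the plugin's actual implementation is not fixed by this sketch (it may, for example, use the HBase-compatible API instead).

// Illustrative only: scanning a Bigtable table with the google-cloud-bigtable client.
// Property names mirror this page's configuration; values are placeholders.
import com.google.api.gax.core.FixedCredentialsProvider;
import com.google.auth.oauth2.GoogleCredentials;
import com.google.cloud.bigtable.data.v2.BigtableDataClient;
import com.google.cloud.bigtable.data.v2.BigtableDataSettings;
import com.google.cloud.bigtable.data.v2.models.Query;
import com.google.cloud.bigtable.data.v2.models.Row;
import com.google.cloud.bigtable.data.v2.models.RowCell;
import java.io.FileInputStream;
import java.io.IOException;

public class BigtableScanSketch {
  public static void main(String[] args) throws IOException {
    String projectId = "my-project";              // hypothetical values
    String instanceId = "my-instance";
    String tableName = "my-table";
    String serviceFilePath = "/path/to/key.json"; // or 'auto-detect' on Dataproc

    BigtableDataSettings.Builder settings = BigtableDataSettings.newBuilder()
        .setProjectId(projectId)
        .setInstanceId(instanceId);
    if (!"auto-detect".equals(serviceFilePath)) {
      // Explicit service account key; on Dataproc this step can be skipped and
      // credentials are read from the cluster environment.
      settings.setCredentialsProvider(FixedCredentialsProvider.create(
          GoogleCredentials.fromStream(new FileInputStream(serviceFilePath))));
    }

    try (BigtableDataClient client = BigtableDataClient.create(settings.build())) {
      // Full-table scan; each row is returned with its cells.
      for (Row row : client.readRows(Query.create(tableName))) {
        for (RowCell cell : row.getCells()) {
          System.out.printf("%s %s:%s = %s%n",
              row.getKey().toStringUtf8(),
              cell.getFamily(),
              cell.getQualifier().toStringUtf8(),
              cell.getValue().toStringUtf8());
        }
      }
    }
  }
}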

User Expectations

  • User specifies how errors during ingestion should be handled; depending on the option chosen, a failing record is skipped, sent to the error port, or fails the pipeline (see the sketch after this list).
  • User should be able to specify service account credentials in the configuration.
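
A minimal sketch of how these options might be applied while processing individual records, assuming CDAP's standard Emitter/InvalidEntry error-port mechanism; the ErrorStrategy enum, the helper method, and the error code are hypothetical and not part of this specification.

// Hypothetical helper illustrating the three "On Record Error" options.
import io.cdap.cdap.api.data.format.StructuredRecord;
import io.cdap.cdap.etl.api.Emitter;
import io.cdap.cdap.etl.api.InvalidEntry;

public class OnErrorSketch {

  /** Possible values of the on-error property. */
  enum ErrorStrategy { SKIP, SEND_TO_ERROR_PORT, FAIL_PIPELINE }

  static void handleRecordError(ErrorStrategy strategy, StructuredRecord failedRecord,
                                Exception cause, Emitter<StructuredRecord> emitter) {
    switch (strategy) {
      case SKIP:
        // "Skip error": drop the record and keep processing.
        break;
      case SEND_TO_ERROR_PORT:
        // "Send to error port": route the record to the error port with a message.
        emitter.emitError(new InvalidEntry<>(1, cause.getMessage(), failedRecord));
        break;
      case FAIL_PIPELINE:
        // "Fail pipeline": abort the run by propagating the failure.
        throw new RuntimeException("Failed to process record", cause);
    }
  }
}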

User Configurations

Section: Standard

Reference Name
  Description: This will be used to uniquely identify this source for lineage, annotating metadata, etc.
  Mandatory: Yes
  Macro-enabled: Yes
  Variable: referenceName
  User Widget: Text Box

Table Name
  Description: Database table name.
  Mandatory: Yes
  Macro-enabled: Yes
  Variable: tableName
  User Widget: Text Box

Instance ID
  Description: Bigtable instance ID.
  Mandatory: Yes
  Macro-enabled: Yes
  Variable: instanceId
  User Widget: Text Box

Project ID
  Description: The ID of the project in Google Cloud. If not specified, it will be automatically read from the cluster environment.
  Mandatory: No
  Macro-enabled: Yes
  Variable: projectId
  User Widget: Text Box

Service Account File Path
  Description: Path on the local file system of the service account key used for authorization. If the plugin is run on a Google Cloud Dataproc cluster, the service account key does not need to be provided and can be set to 'auto-detect'; credentials will be automatically read from the cluster environment. When running on other clusters, the file must be present on every node in the cluster. See Google's documentation on service account credentials for details.
  Mandatory: No
  Macro-enabled: Yes
  Variable: serviceFilePath
  User Widget: Text Box

Section: Error Handling

On Record Error
  Description: How to handle errors in record processing.
  Mandatory: No
  Macro-enabled: Yes
  Options: Skip error, Send to error port, Fail pipeline
  Default: Skip error
  Variable: on-error
  User Widget: Radio Button (layout: block)
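
A minimal sketch of how the properties above could be declared, assuming CDAP's standard PluginConfig class with the @Name, @Description, and @Macro annotations; the class name and field layout are illustrative, not the committed implementation.

// Sketch of a config class mirroring the table above; names follow the Variable column.
import io.cdap.cdap.api.annotation.Description;
import io.cdap.cdap.api.annotation.Macro;
import io.cdap.cdap.api.annotation.Name;
import io.cdap.cdap.api.plugin.PluginConfig;
import javax.annotation.Nullable;

public class BigtableSourceConfig extends PluginConfig {

  @Name("referenceName")
  @Description("Used to uniquely identify this source for lineage, annotating metadata, etc.")
  @Macro
  private String referenceName;

  @Name("tableName")
  @Description("Database table name.")
  @Macro
  private String tableName;

  @Name("instanceId")
  @Description("Bigtable instance ID.")
  @Macro
  private String instanceId;

  @Name("projectId")
  @Description("Google Cloud project ID. If not specified, read from the cluster environment.")
  @Macro
  @Nullable
  private String projectId;

  @Name("serviceFilePath")
  @Description("Path to the service account key, or 'auto-detect' on Dataproc.")
  @Macro
  @Nullable
  private String serviceFilePath;

  @Name("on-error")
  @Description("How to handle errors in record processing.")
  @Macro
  @Nullable
  private String onError;
}

As described above, on Dataproc the serviceFilePath can be left as 'auto-detect' so credentials are picked up from the cluster environment; on other clusters the key file must be present on every node.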


Reference

https://cloud.google.com/bigtable/docs/
