public static class HadoopFormatIO.Write<KeyT,ValueT>
extends org.apache.beam.sdk.transforms.PTransform<org.apache.beam.sdk.values.PCollection<org.apache.beam.sdk.values.KV<KeyT,ValueT>>,org.apache.beam.sdk.values.PDone>

Type Parameters:
KeyT - the type of keys to be written.
ValueT - the type of values to be written.
A PTransform that writes to any data sink which implements the Hadoop OutputFormat interface, e.g. Cassandra, Elasticsearch, HBase, Redis, Postgres, etc. See the class-level Javadoc on HadoopFormatIO for more information.

See Also: HadoopFormatIO, Serialized Form

Nested Class Summary

| Modifier and Type | Interface and Description |
|---|---|
| static interface | HadoopFormatIO.Write.ExternalSynchronizationBuilder<KeyT,ValueT>: builder step for defining external synchronization. |
| static interface | HadoopFormatIO.Write.PartitionedWriterBuilder<KeyT,ValueT>: builder step for determining partitioning. |
| static interface | HadoopFormatIO.Write.WriteBuilder<KeyT,ValueT>: main builder of the Write transformation. |
Method Summary

| Modifier and Type | Method and Description |
|---|---|
| org.apache.beam.sdk.values.PDone | expand(org.apache.beam.sdk.values.PCollection<org.apache.beam.sdk.values.KV<KeyT,ValueT>> input) |
| void | populateDisplayData(org.apache.beam.sdk.transforms.display.DisplayData.Builder builder) |
| void | validate(org.apache.beam.sdk.options.PipelineOptions pipelineOptions) |
Method Detail

public void validate(org.apache.beam.sdk.options.PipelineOptions pipelineOptions)

Called before running the Pipeline to verify this transform is fully and correctly specified.

public void populateDisplayData(org.apache.beam.sdk.transforms.display.DisplayData.Builder builder)

Registers display data for the given transform or component.
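The three nested builder interfaces above form a staged builder chain: a WriteBuilder (configuration), then a PartitionedWriterBuilder (partitioning choice), then an ExternalSynchronizationBuilder (locking for job setup). A minimal usage sketch is shown below; it assumes the Beam Hadoop Format IO module (`org.apache.beam:beam-sdks-java-io-hadoop-format`) and the Hadoop MapReduce client are on the classpath, and the chosen output format, partitioner, and locks directory are illustrative placeholders, not prescribed values:

```java
// Sketch only: TextOutputFormat, HashPartitioner, and "/tmp/locks" are
// hypothetical choices for illustration.
import org.apache.beam.sdk.io.hadoop.format.HDFSSynchronization;
import org.apache.beam.sdk.io.hadoop.format.HadoopFormatIO;
import org.apache.beam.sdk.values.KV;
import org.apache.beam.sdk.values.PCollection;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.OutputFormat;
import org.apache.hadoop.mapreduce.lib.output.TextOutputFormat;
import org.apache.hadoop.mapreduce.lib.partition.HashPartitioner;

class HadoopWriteExample {
  static void write(PCollection<KV<Text, LongWritable>> data) {
    Configuration conf = new Configuration(false);
    // The sink's OutputFormat plus its key and value classes must be set
    // in the Hadoop configuration.
    conf.setClass("mapreduce.job.outputformat.class",
        TextOutputFormat.class, OutputFormat.class);
    conf.setClass("mapreduce.job.output.key.class", Text.class, Object.class);
    conf.setClass("mapreduce.job.output.value.class", LongWritable.class, Object.class);
    // withPartitioning() additionally expects a partitioner and a reducer count.
    conf.setClass("mapreduce.job.partitioner.class",
        HashPartitioner.class, Object.class);
    conf.setInt("mapreduce.job.reduces", 2);

    data.apply("WriteViaHadoopOutputFormat",
        HadoopFormatIO.<Text, LongWritable>write()
            .withConfiguration(conf)       // WriteBuilder step
            .withPartitioning()            // PartitionedWriterBuilder step
            .withExternalSynchronization(  // ExternalSynchronizationBuilder step
                new HDFSSynchronization("/tmp/locks")));
  }
}
```

Each builder step returns the next interface in the chain, so the compiler enforces that configuration, partitioning, and external synchronization are all specified before the transform can be applied.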