public abstract class CustomCoder&lt;T&gt; extends AtomicCoder&lt;T&gt; implements Serializable

Type parameter: T - the type of elements handled by this coder.

A Coder class that encodes itself via Java serialization.
To complete an implementation, subclasses must implement Coder.encode(T, java.io.OutputStream, org.apache.beam.sdk.coders.Coder.Context)
and Coder.decode(java.io.InputStream, org.apache.beam.sdk.coders.Coder.Context) methods. Anonymous subclasses must furthermore override
getEncodingId().
Not to be confused with SerializableCoder, which encodes objects that implement the
Serializable interface.
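The mechanism CustomCoder relies on can be sketched without the Beam SDK on the classpath. The following pure-JDK sketch (the class name SerializingCoderSketch is hypothetical, not part of the SDK) mirrors what a CustomCoder subclass's encode and decode do when they fall back to plain Java serialization:

```java
import java.io.*;

// Hypothetical stand-in mirroring what a CustomCoder subclass does:
// encode/decode a value using plain Java serialization.
class SerializingCoderSketch<T extends Serializable> {
    public void encode(T value, OutputStream out) throws IOException {
        ObjectOutputStream oos = new ObjectOutputStream(out);
        oos.writeObject(value);
        oos.flush();
    }

    @SuppressWarnings("unchecked")
    public T decode(InputStream in) throws IOException, ClassNotFoundException {
        return (T) new ObjectInputStream(in).readObject();
    }
}

public class Main {
    public static void main(String[] args) throws Exception {
        SerializingCoderSketch<String> coder = new SerializingCoderSketch<>();
        ByteArrayOutputStream bytes = new ByteArrayOutputStream();
        coder.encode("hello", bytes);
        // Round-trip the value through the byte stream.
        String decoded = coder.decode(new ByteArrayInputStream(bytes.toByteArray()));
        System.out.println(decoded); // prints "hello"
    }
}
```

A real subclass would instead implement the two-argument-plus-Context encode/decode signatures shown above; the byte-stream round trip is the same idea.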
See also: Coder.Context, Coder.NonDeterministicException

| Constructor and Description |
|---|
| CustomCoder() |

| Modifier and Type | Method and Description |
|---|---|
| org.apache.beam.sdk.util.CloudObject | asCloudObject(): Returns the CloudObject that represents this Coder. |
| String | getEncodingId(): An identifier for the binary format written by Coder.encode(T, java.io.OutputStream, org.apache.beam.sdk.coders.Coder.Context). |
| static CustomCoder&lt;?&gt; | of(String typeId, String encodingId, String type, String serializedCoder) |
| void | verifyDeterministic(): Throws Coder.NonDeterministicException if the coding is not deterministic. |
Methods inherited from superclasses and interfaces: getCoderArguments, getInstanceComponents, consistentWithEquals, equals, getAllowedEncodings, getComponents, getEncodedElementByteSize, hashCode, isRegisterByteSizeObserverCheap, registerByteSizeObserver, structuralValue, toString, verifyDeterministic

public static CustomCoder&lt;?&gt; of(String typeId, String encodingId, String type, String serializedCoder)
public org.apache.beam.sdk.util.CloudObject asCloudObject()

Returns the CloudObject that represents this Coder.

Specified by: asCloudObject in interface Coder&lt;T&gt;
Overrides: asCloudObject in class StandardCoder&lt;T&gt;
Returns: a CloudObject wrapping of the Java serialization of this.

public void verifyDeterministic() throws Coder.NonDeterministicException

Throws Coder.NonDeterministicException if the coding is not deterministic.
In order for a Coder to be considered deterministic, the following must be true:

- two values that compare as equal (via Object.equals() or Comparable.compareTo(), if supported) have the same encoding.
- the Coder always produces a canonical encoding, which is the same for an instance of an object even if produced on different computers at different times.
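The first condition is easy to violate. For example, a hypothetical coder that encoded BigDecimal values via toString() would be nondeterministic: 1.0 and 1.00 compare as equal under Comparable.compareTo(), yet would receive different encodings. A quick pure-JDK check (not Beam API):

```java
import java.math.BigDecimal;
import java.nio.charset.StandardCharsets;
import java.util.Arrays;

public class Main {
    public static void main(String[] args) {
        BigDecimal a = new BigDecimal("1.0");
        BigDecimal b = new BigDecimal("1.00");

        // Equal under compareTo() ...
        System.out.println(a.compareTo(b) == 0);       // prints "true"

        // ... but a toString()-based encoding differs, so such a coder
        // could not legitimately pass verifyDeterministic().
        byte[] encA = a.toString().getBytes(StandardCharsets.UTF_8);
        byte[] encB = b.toString().getBytes(StandardCharsets.UTF_8);
        System.out.println(Arrays.equals(encA, encB)); // prints "false"
    }
}
```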
Specified by: verifyDeterministic in interface Coder&lt;T&gt;
Overrides: verifyDeterministic in class DeterministicStandardCoder&lt;T&gt;
Throws: Coder.NonDeterministicException - a CustomCoder is presumed nondeterministic.

public String getEncodingId()

An identifier for the binary format written by Coder.encode(T, java.io.OutputStream, org.apache.beam.sdk.coders.Coder.Context).
This value, along with the fully qualified class name, forms an identifier for the binary format of this coder. Whenever this value changes, the new encoding is considered incompatible with the prior format: it is presumed that the prior version of the coder will be unable to correctly read the new format, and the new version of the coder will be unable to correctly read the old format.
If the format is changed in a backwards-compatible way (the Coder can still accept data from
the prior format), such as by adding optional fields to a Protocol Buffer or Avro definition,
and you want Dataflow to understand that the new coder is compatible with the prior coder,
this value must remain unchanged. It is then the responsibility of Coder.decode(java.io.InputStream, org.apache.beam.sdk.coders.Coder.Context) to correctly
read data from the prior format.
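This versioning contract can be sketched with a pure-JDK stand-in (the class names and version strings below are hypothetical, not Beam API): bump the identifier only when the wire format becomes incompatible.

```java
// Hypothetical stand-in for the getEncodingId() contract: the id,
// together with the class name, identifies the wire format.
abstract class CoderSketch {
    abstract String getEncodingId();
}

// Version "2" signals an incompatible change from an earlier "1" format.
// A backwards-compatible change (e.g. a new optional field) would keep
// the id unchanged and rely on decode() to handle both formats.
class FooCoderSketch extends CoderSketch {
    @Override
    String getEncodingId() {
        return "2";
    }
}

public class Main {
    public static void main(String[] args) {
        System.out.println(new FooCoderSketch().getEncodingId()); // prints "2"
    }
}
```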
Specified by: getEncodingId in interface Coder&lt;T&gt;
Overrides: getEncodingId in class StandardCoder&lt;T&gt;
Throws: UnsupportedOperationException - when an anonymous class is used, since they do not have a stable canonical class name.
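The anonymous-class restriction comes from the JVM itself: anonymous classes have no canonical name to hash into a stable identifier, as a quick pure-JDK check shows.

```java
public class Main {
    public static void main(String[] args) {
        Object named = new Object();
        Object anonymous = new Object() {};

        // A named class has a canonical name ...
        System.out.println(named.getClass().getCanonicalName());     // prints "java.lang.Object"

        // ... but an anonymous class's canonical name is null, which is
        // why an anonymous CustomCoder must override getEncodingId().
        System.out.println(anonymous.getClass().getCanonicalName()); // prints "null"
    }
}
```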