public class DruidSerDe
extends org.apache.hadoop.hive.serde2.AbstractSerDe
| Constructor and Description |
|---|
| DruidSerDe() |
| Modifier and Type | Method and Description |
|---|---|
| Object | deserialize(org.apache.hadoop.io.Writable writable) |
| void | deserializeAsPrimitive(org.apache.hadoop.io.Writable writable, Object[] rowBoat) — Function to convert Druid primitive values to Hive primitives. |
| org.apache.hadoop.hive.serde2.objectinspector.ObjectInspector | getObjectInspector() |
| Class<? extends org.apache.hadoop.io.Writable> | getSerializedClass() |
| void | initialize(org.apache.hadoop.conf.Configuration configuration, Properties tableProperties, Properties partitionProperties) |
| org.apache.hadoop.io.Writable | serialize(Object o, org.apache.hadoop.hive.serde2.objectinspector.ObjectInspector objectInspector) |
| boolean | shouldStoreFieldsInMetastore(Map<String,String> tableParams) |
| protected org.apache.druid.query.metadata.metadata.SegmentAnalysis | submitMetadataRequest(String address, org.apache.druid.query.metadata.metadata.SegmentMetadataQuery query) |
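The methods above follow the standard Hive SerDe contract: initialize() reads the column layout from table properties, serialize() turns a Hive row into a storage-format Writable, and deserialize() turns it back. The following dependency-free sketch illustrates that round-trip contract with a toy in-memory format; ToySerDe, ToyWritable, and the comma-separated encoding are illustrative inventions, not the real Hive or Druid API.

```java
// Hypothetical, dependency-free sketch of the SerDe contract that
// DruidSerDe implements. ToySerDe and ToyWritable stand in for the real
// AbstractSerDe and org.apache.hadoop.io.Writable types.
import java.util.Arrays;
import java.util.List;
import java.util.Properties;

public class ToySerDe {
    // Stand-in for a Writable: an opaque storage-format record.
    public static final class ToyWritable {
        final String payload;
        ToyWritable(String payload) { this.payload = payload; }
    }

    private String[] columnNames;

    // Mirrors initialize(Configuration, Properties, Properties):
    // read the column layout from the table properties.
    public void initialize(Properties tableProperties) {
        columnNames = tableProperties.getProperty("columns").split(",");
    }

    // Mirrors serialize(Object, ObjectInspector): row -> Writable.
    public ToyWritable serialize(List<Object> row) {
        StringBuilder sb = new StringBuilder();
        for (int i = 0; i < row.size(); i++) {
            if (i > 0) sb.append(',');
            sb.append(row.get(i));
        }
        return new ToyWritable(sb.toString());
    }

    // Mirrors deserialize(Writable): Writable -> row.
    public List<Object> deserialize(ToyWritable writable) {
        return Arrays.asList((Object[]) writable.payload.split(","));
    }

    public static void main(String[] args) {
        ToySerDe serDe = new ToySerDe();
        Properties props = new Properties();
        props.setProperty("columns", "page,visits");
        serDe.initialize(props);

        ToyWritable record = serDe.serialize(Arrays.asList("home", "42"));
        System.out.println(serDe.deserialize(record)); // round-trips the row
    }
}
```

In the real DruidSerDe, the table properties carry the Druid datasource and column types, and the Writable wraps a Druid result row rather than a delimited string.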
Methods inherited from class org.apache.hadoop.hive.serde2.AbstractSerDe
getColumnComments, getColumnNames, getColumnTypes, getConfiguration, getPartitionColumnComments, getPartitionColumnNames, getPartitionColumnTypes, getSerDeStats, parseColumnComments, parseColumnNames, parseColumnNames, parseColumnTypes, parseColumnTypes, toString

initialize
public void initialize(org.apache.hadoop.conf.Configuration configuration,
                       Properties tableProperties,
                       Properties partitionProperties)
                throws org.apache.hadoop.hive.serde2.SerDeException
Overrides:
initialize in class org.apache.hadoop.hive.serde2.AbstractSerDe
Throws:
org.apache.hadoop.hive.serde2.SerDeException

submitMetadataRequest
protected org.apache.druid.query.metadata.metadata.SegmentAnalysis submitMetadataRequest(String address,
                       org.apache.druid.query.metadata.metadata.SegmentMetadataQuery query)
                throws org.apache.hadoop.hive.serde2.SerDeException,
                       IOException
Throws:
org.apache.hadoop.hive.serde2.SerDeException
IOException

getSerializedClass
public Class<? extends org.apache.hadoop.io.Writable> getSerializedClass()
Specified by:
getSerializedClass in interface org.apache.hadoop.hive.serde2.Serializer
Overrides:
getSerializedClass in class org.apache.hadoop.hive.serde2.AbstractSerDe

serialize
public org.apache.hadoop.io.Writable serialize(Object o,
                       org.apache.hadoop.hive.serde2.objectinspector.ObjectInspector objectInspector)
                throws org.apache.hadoop.hive.serde2.SerDeException
Specified by:
serialize in interface org.apache.hadoop.hive.serde2.Serializer
Overrides:
serialize in class org.apache.hadoop.hive.serde2.AbstractSerDe
Throws:
org.apache.hadoop.hive.serde2.SerDeException

deserialize
public Object deserialize(org.apache.hadoop.io.Writable writable)
                throws org.apache.hadoop.hive.serde2.SerDeException
Specified by:
deserialize in interface org.apache.hadoop.hive.serde2.Deserializer
Overrides:
deserialize in class org.apache.hadoop.hive.serde2.AbstractSerDe
Parameters:
writable - Druid Writable to be deserialized.
Throws:
org.apache.hadoop.hive.serde2.SerDeException - if there are SerDe issues.

deserializeAsPrimitive
public void deserializeAsPrimitive(org.apache.hadoop.io.Writable writable,
                       Object[] rowBoat)
                throws org.apache.hadoop.hive.serde2.SerDeException
Function to convert Druid primitive values to Hive primitives. This function parallels deserialize(Writable); any modification here should be done there as well. The reason for having two functions is that the vectorized path does not expect Writables.
Parameters:
writable - Druid Writable.
rowBoat - row boat used to carry column values.
Throws:
org.apache.hadoop.hive.serde2.SerDeException - in case of deserialization errors.

getObjectInspector
public org.apache.hadoop.hive.serde2.objectinspector.ObjectInspector getObjectInspector()
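The distinction between deserialize and deserializeAsPrimitive can be sketched in plain Java: the row-at-a-time path allocates and returns a fresh row per record, while the vectorized-style path fills a caller-supplied "row boat" that is allocated once and reused. The class, method bodies, and comma-separated record format below are illustrative assumptions, not the actual DruidSerDe implementation.

```java
// Hypothetical sketch of why a SerDe may expose two deserialization paths,
// as DruidSerDe does. Records here are plain comma-separated strings of
// longs instead of real Druid Writables.
import java.util.ArrayList;
import java.util.List;

public class TwoPathDeserializer {
    // Row-at-a-time path: parse the record and return a newly allocated row.
    public static List<Object> deserialize(String record) {
        List<Object> row = new ArrayList<>();
        for (String field : record.split(",")) {
            row.add(Long.parseLong(field)); // boxed value, new row per record
        }
        return row;
    }

    // Vectorized-style path: write values into the reusable rowBoat, which
    // the caller allocates once and reuses across many records, avoiding
    // per-row allocation and Writable wrappers.
    public static void deserializeAsPrimitive(String record, long[] rowBoat) {
        String[] fields = record.split(",");
        for (int i = 0; i < fields.length; i++) {
            rowBoat[i] = Long.parseLong(fields[i]); // primitive, no boxing
        }
    }

    public static void main(String[] args) {
        System.out.println(deserialize("7,8,9"));

        long[] rowBoat = new long[3];            // reused for every record
        deserializeAsPrimitive("7,8,9", rowBoat);
        System.out.println(rowBoat[0] + " " + rowBoat[1] + " " + rowBoat[2]);
    }
}
```

As the documentation above warns for the real class, two parallel code paths like these must be modified in lockstep, since a fix applied to one deserializer is silently missing from the other.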
Copyright © 2022 The Apache Software Foundation. All rights reserved.