A B C D E F G H I K L M N O P R S T V 

A

abort(FlinkKafkaProducer011.KafkaTransactionState) - Method in class org.apache.flink.streaming.connectors.kafka.FlinkKafkaProducer011
 
abortTransaction() - Method in class org.apache.flink.streaming.connectors.kafka.internal.FlinkKafkaProducer
 

B

beginTransaction() - Method in class org.apache.flink.streaming.connectors.kafka.FlinkKafkaProducer011
 
beginTransaction() - Method in class org.apache.flink.streaming.connectors.kafka.internal.FlinkKafkaProducer
 
build() - Method in class org.apache.flink.streaming.connectors.kafka.Kafka011AvroTableSource.Builder
Builds and configures a Kafka011AvroTableSource.
build() - Method in class org.apache.flink.streaming.connectors.kafka.Kafka011JsonTableSource.Builder
Builds and configures a Kafka011JsonTableSource.
builder() - Static method in class org.apache.flink.streaming.connectors.kafka.Kafka011AvroTableSource
Returns a builder to configure and create a Kafka011AvroTableSource.
Builder() - Constructor for class org.apache.flink.streaming.connectors.kafka.Kafka011AvroTableSource.Builder
 
builder() - Method in class org.apache.flink.streaming.connectors.kafka.Kafka011AvroTableSource.Builder
 
builder() - Static method in class org.apache.flink.streaming.connectors.kafka.Kafka011JsonTableSource
Returns a builder to configure and create a Kafka011JsonTableSource.
Builder() - Constructor for class org.apache.flink.streaming.connectors.kafka.Kafka011JsonTableSource.Builder
 
builder() - Method in class org.apache.flink.streaming.connectors.kafka.Kafka011JsonTableSource.Builder
 

C

canEqual(Object) - Method in class org.apache.flink.streaming.connectors.kafka.FlinkKafkaProducer011.ContextStateSerializer
 
canEqual(Object) - Method in class org.apache.flink.streaming.connectors.kafka.FlinkKafkaProducer011.TransactionStateSerializer
 
close() - Method in class org.apache.flink.streaming.connectors.kafka.FlinkKafkaProducer011
 
close() - Method in class org.apache.flink.streaming.connectors.kafka.internal.FlinkKafkaProducer
 
close(long, TimeUnit) - Method in class org.apache.flink.streaming.connectors.kafka.internal.FlinkKafkaProducer
 
commit(FlinkKafkaProducer011.KafkaTransactionState) - Method in class org.apache.flink.streaming.connectors.kafka.FlinkKafkaProducer011
 
commitTransaction() - Method in class org.apache.flink.streaming.connectors.kafka.internal.FlinkKafkaProducer
 
ContextStateSerializer() - Constructor for class org.apache.flink.streaming.connectors.kafka.FlinkKafkaProducer011.ContextStateSerializer
 
copy(FlinkKafkaProducer011.KafkaTransactionContext) - Method in class org.apache.flink.streaming.connectors.kafka.FlinkKafkaProducer011.ContextStateSerializer
 
copy(FlinkKafkaProducer011.KafkaTransactionContext, FlinkKafkaProducer011.KafkaTransactionContext) - Method in class org.apache.flink.streaming.connectors.kafka.FlinkKafkaProducer011.ContextStateSerializer
 
copy(DataInputView, DataOutputView) - Method in class org.apache.flink.streaming.connectors.kafka.FlinkKafkaProducer011.ContextStateSerializer
 
copy(FlinkKafkaProducer011.KafkaTransactionState) - Method in class org.apache.flink.streaming.connectors.kafka.FlinkKafkaProducer011.TransactionStateSerializer
 
copy(FlinkKafkaProducer011.KafkaTransactionState, FlinkKafkaProducer011.KafkaTransactionState) - Method in class org.apache.flink.streaming.connectors.kafka.FlinkKafkaProducer011.TransactionStateSerializer
 
copy(DataInputView, DataOutputView) - Method in class org.apache.flink.streaming.connectors.kafka.FlinkKafkaProducer011.TransactionStateSerializer
 
createBuilder() - Method in class org.apache.flink.streaming.connectors.kafka.Kafka011JsonTableSourceFactory
 
createInstance() - Method in class org.apache.flink.streaming.connectors.kafka.FlinkKafkaProducer011.ContextStateSerializer
 
createInstance() - Method in class org.apache.flink.streaming.connectors.kafka.FlinkKafkaProducer011.TransactionStateSerializer
 
createKafkaConsumer(String, Properties, DeserializationSchema<Row>) - Method in class org.apache.flink.streaming.connectors.kafka.Kafka011AvroTableSource
 
createKafkaConsumer(String, Properties, DeserializationSchema<Row>) - Method in class org.apache.flink.streaming.connectors.kafka.Kafka011JsonTableSource
 
createKafkaConsumer(String, Properties, DeserializationSchema<Row>) - Method in class org.apache.flink.streaming.connectors.kafka.Kafka011TableSource
 

D

DEFAULT_KAFKA_PRODUCERS_POOL_SIZE - Static variable in class org.apache.flink.streaming.connectors.kafka.FlinkKafkaProducer011
Default number of KafkaProducers in the pool.
DEFAULT_KAFKA_TRANSACTION_TIMEOUT - Static variable in class org.apache.flink.streaming.connectors.kafka.FlinkKafkaProducer011
Default value for the Kafka transaction timeout.
deserialize(DataInputView) - Method in class org.apache.flink.streaming.connectors.kafka.FlinkKafkaProducer011.ContextStateSerializer
 
deserialize(FlinkKafkaProducer011.KafkaTransactionContext, DataInputView) - Method in class org.apache.flink.streaming.connectors.kafka.FlinkKafkaProducer011.ContextStateSerializer
 
deserialize(DataInputView) - Method in class org.apache.flink.streaming.connectors.kafka.FlinkKafkaProducer011.TransactionStateSerializer
 
deserialize(FlinkKafkaProducer011.KafkaTransactionState, DataInputView) - Method in class org.apache.flink.streaming.connectors.kafka.FlinkKafkaProducer011.TransactionStateSerializer
 

E

equals(Object) - Method in class org.apache.flink.streaming.connectors.kafka.FlinkKafkaProducer011.KafkaTransactionContext
 

F

finishRecoveringContext() - Method in class org.apache.flink.streaming.connectors.kafka.FlinkKafkaProducer011
 
FlinkKafka011ErrorCode - Enum in org.apache.flink.streaming.connectors.kafka
Error codes used in FlinkKafka011Exception.
FlinkKafka011Exception - Exception in org.apache.flink.streaming.connectors.kafka
FlinkKafka011Exception(FlinkKafka011ErrorCode, String) - Constructor for exception org.apache.flink.streaming.connectors.kafka.FlinkKafka011Exception
 
FlinkKafka011Exception(FlinkKafka011ErrorCode, String, Throwable) - Constructor for exception org.apache.flink.streaming.connectors.kafka.FlinkKafka011Exception
 
FlinkKafkaConsumer011<T> - Class in org.apache.flink.streaming.connectors.kafka
The Flink Kafka Consumer is a streaming data source that pulls a parallel data stream from Apache Kafka 0.11.x.
FlinkKafkaConsumer011(String, DeserializationSchema<T>, Properties) - Constructor for class org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer011
Creates a new Kafka streaming source consumer for Kafka 0.11.x.
FlinkKafkaConsumer011(String, KeyedDeserializationSchema<T>, Properties) - Constructor for class org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer011
Creates a new Kafka streaming source consumer for Kafka 0.11.x.
FlinkKafkaConsumer011(List<String>, DeserializationSchema<T>, Properties) - Constructor for class org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer011
Creates a new Kafka streaming source consumer for Kafka 0.11.x.
FlinkKafkaConsumer011(List<String>, KeyedDeserializationSchema<T>, Properties) - Constructor for class org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer011
Creates a new Kafka streaming source consumer for Kafka 0.11.x.
FlinkKafkaConsumer011(Pattern, DeserializationSchema<T>, Properties) - Constructor for class org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer011
Creates a new Kafka streaming source consumer for Kafka 0.11.x.
FlinkKafkaConsumer011(Pattern, KeyedDeserializationSchema<T>, Properties) - Constructor for class org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer011
Creates a new Kafka streaming source consumer for Kafka 0.11.x.
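The constructors above differ only in how the topics are specified (a single topic, a list of topics, or a Pattern) and in the deserialization schema used. A minimal usage sketch of the single-topic variant (broker address, group id, and topic name are placeholders):

```java
import java.util.Properties;

import org.apache.flink.api.common.serialization.SimpleStringSchema;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer011;

public class ConsumerExample {
    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env =
                StreamExecutionEnvironment.getExecutionEnvironment();

        Properties props = new Properties();
        props.setProperty("bootstrap.servers", "localhost:9092"); // placeholder
        props.setProperty("group.id", "example-group");           // placeholder

        // Single-topic variant; the List<String> and Pattern overloads
        // are used the same way.
        DataStream<String> stream = env.addSource(
                new FlinkKafkaConsumer011<>("example-topic",
                        new SimpleStringSchema(), props));

        stream.print();
        env.execute("Kafka 0.11 consumer example");
    }
}
```

Running this requires a Flink runtime and a reachable Kafka 0.11 broker; the sketch only illustrates how the constructor arguments fit together.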
FlinkKafkaProducer<K,V> - Class in org.apache.flink.streaming.connectors.kafka.internal
Wrapper around KafkaProducer that allows transactions to be resumed after a node failure, making it possible to implement the two-phase commit algorithm behind the exactly-once semantics of FlinkKafkaProducer011.
FlinkKafkaProducer(Properties) - Constructor for class org.apache.flink.streaming.connectors.kafka.internal.FlinkKafkaProducer
 
FlinkKafkaProducer011<IN> - Class in org.apache.flink.streaming.connectors.kafka
Flink Sink to produce data into a Kafka topic.
FlinkKafkaProducer011(String, String, SerializationSchema<IN>) - Constructor for class org.apache.flink.streaming.connectors.kafka.FlinkKafkaProducer011
Creates a FlinkKafkaProducer for a given topic.
FlinkKafkaProducer011(String, SerializationSchema<IN>, Properties) - Constructor for class org.apache.flink.streaming.connectors.kafka.FlinkKafkaProducer011
Creates a FlinkKafkaProducer for a given topic.
FlinkKafkaProducer011(String, SerializationSchema<IN>, Properties, Optional<FlinkKafkaPartitioner<IN>>) - Constructor for class org.apache.flink.streaming.connectors.kafka.FlinkKafkaProducer011
Creates a FlinkKafkaProducer for a given topic.
FlinkKafkaProducer011(String, String, KeyedSerializationSchema<IN>) - Constructor for class org.apache.flink.streaming.connectors.kafka.FlinkKafkaProducer011
Creates a FlinkKafkaProducer for a given topic.
FlinkKafkaProducer011(String, KeyedSerializationSchema<IN>, Properties) - Constructor for class org.apache.flink.streaming.connectors.kafka.FlinkKafkaProducer011
Creates a FlinkKafkaProducer for a given topic.
FlinkKafkaProducer011(String, KeyedSerializationSchema<IN>, Properties, FlinkKafkaProducer011.Semantic) - Constructor for class org.apache.flink.streaming.connectors.kafka.FlinkKafkaProducer011
Creates a FlinkKafkaProducer for a given topic.
FlinkKafkaProducer011(String, KeyedSerializationSchema<IN>, Properties, Optional<FlinkKafkaPartitioner<IN>>) - Constructor for class org.apache.flink.streaming.connectors.kafka.FlinkKafkaProducer011
Creates a FlinkKafkaProducer for a given topic.
FlinkKafkaProducer011(String, KeyedSerializationSchema<IN>, Properties, Optional<FlinkKafkaPartitioner<IN>>, FlinkKafkaProducer011.Semantic, int) - Constructor for class org.apache.flink.streaming.connectors.kafka.FlinkKafkaProducer011
Creates a FlinkKafkaProducer for a given topic.
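All constructor variants above configure the same sink; the FlinkKafkaProducer011.Semantic overload selects the delivery guarantee. A minimal sketch of the exactly-once variant (broker address, topic name, and the source stream are placeholders):

```java
import java.util.Properties;

import org.apache.flink.api.common.serialization.SimpleStringSchema;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.streaming.connectors.kafka.FlinkKafkaProducer011;
import org.apache.flink.streaming.util.serialization.KeyedSerializationSchemaWrapper;

public class ProducerExample {
    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env =
                StreamExecutionEnvironment.getExecutionEnvironment();
        DataStream<String> stream = env.fromElements("a", "b", "c"); // placeholder source

        Properties props = new Properties();
        props.setProperty("bootstrap.servers", "localhost:9092"); // placeholder
        // For EXACTLY_ONCE the transaction timeout must not exceed the
        // broker's transaction.max.timeout.ms setting.
        props.setProperty("transaction.timeout.ms", "900000");

        stream.addSink(new FlinkKafkaProducer011<>(
                "example-topic",                                  // placeholder topic
                new KeyedSerializationSchemaWrapper<>(new SimpleStringSchema()),
                props,
                FlinkKafkaProducer011.Semantic.EXACTLY_ONCE));

        env.execute("Kafka 0.11 exactly-once producer example");
    }
}
```

As with the consumer sketch, this needs a Flink runtime and a Kafka 0.11 broker to actually run.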
FlinkKafkaProducer011.ContextStateSerializer - Class in org.apache.flink.streaming.connectors.kafka
FlinkKafkaProducer011.KafkaTransactionContext - Class in org.apache.flink.streaming.connectors.kafka
Context associated with this instance of FlinkKafkaProducer011.
FlinkKafkaProducer011.NextTransactionalIdHint - Class in org.apache.flink.streaming.connectors.kafka
Keeps the information required to deduce the next safe-to-use transactional id.
FlinkKafkaProducer011.Semantic - Enum in org.apache.flink.streaming.connectors.kafka
Semantics that can be chosen.
FlinkKafkaProducer011.TransactionStateSerializer - Class in org.apache.flink.streaming.connectors.kafka
TypeSerializer for KafkaTransactionState.
flush() - Method in class org.apache.flink.streaming.connectors.kafka.internal.FlinkKafkaProducer
 

G

generateIdsToAbort() - Method in class org.apache.flink.streaming.connectors.kafka.internal.TransactionalIdsGenerator
If transactional ids from a previous attempt have to be aborted after a restart from a failure that occurred BEFORE the first checkpoint completed, the parallelism used in that previous attempt is unknown.
generateIdsToUse(long) - Method in class org.apache.flink.streaming.connectors.kafka.internal.TransactionalIdsGenerator
The range of available transactional ids is [nextFreeTransactionalId, nextFreeTransactionalId + parallelism * kafkaProducersPoolSize); this method deterministically picks a subrange of those ids based on the index of this subtask.
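The subrange arithmetic can be sketched in plain Java. This is an illustrative re-derivation of the partitioning described above, not Flink's internal code; the class and method names are hypothetical:

```java
import java.util.HashSet;
import java.util.Set;

// Hypothetical sketch of the deterministic subrange selection described
// for generateIdsToUse(long): subtask i owns the i-th slice of size
// poolSize out of the range [nextFreeTransactionalId,
// nextFreeTransactionalId + parallelism * poolSize).
public class TransactionalIdRanges {
    public static Set<Long> idsForSubtask(
            long nextFreeTransactionalId, int subtaskIndex, int poolSize) {
        Set<Long> ids = new HashSet<>();
        long start = nextFreeTransactionalId + (long) subtaskIndex * poolSize;
        for (long id = start; id < start + poolSize; id++) {
            ids.add(id);
        }
        return ids;
    }
}
```

Because each subtask computes its slice purely from its own index, no coordination is needed and restarts with the same parallelism reproduce the same assignment.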
getDeserializationSchema() - Method in class org.apache.flink.streaming.connectors.kafka.Kafka011TableSource
 
getEpoch() - Method in class org.apache.flink.streaming.connectors.kafka.internal.FlinkKafkaProducer
 
getErrorCode() - Method in exception org.apache.flink.streaming.connectors.kafka.FlinkKafka011Exception
 
getLength() - Method in class org.apache.flink.streaming.connectors.kafka.FlinkKafkaProducer011.ContextStateSerializer
 
getLength() - Method in class org.apache.flink.streaming.connectors.kafka.FlinkKafkaProducer011.TransactionStateSerializer
 
getProducerId() - Method in class org.apache.flink.streaming.connectors.kafka.internal.FlinkKafkaProducer
 
getTransactionalId() - Method in class org.apache.flink.streaming.connectors.kafka.internal.FlinkKafkaProducer
 
getTransactionCoordinatorId() - Method in class org.apache.flink.streaming.connectors.kafka.internal.FlinkKafkaProducer
 
getValue() - Method in class org.apache.flink.streaming.connectors.kafka.internal.metrics.KafkaMetricMuttableWrapper
 

H

hashCode() - Method in class org.apache.flink.streaming.connectors.kafka.FlinkKafkaProducer011.KafkaTransactionContext
 

I

ignoreFailuresAfterTransactionTimeout() - Method in class org.apache.flink.streaming.connectors.kafka.FlinkKafkaProducer011
Disables the propagation of exceptions thrown when committing presumably timed out Kafka transactions during recovery of the job.
initializeState(FunctionInitializationContext) - Method in class org.apache.flink.streaming.connectors.kafka.FlinkKafkaProducer011
 
initializeUserContext() - Method in class org.apache.flink.streaming.connectors.kafka.FlinkKafkaProducer011
 
initTransactions() - Method in class org.apache.flink.streaming.connectors.kafka.internal.FlinkKafkaProducer
 
invoke(FlinkKafkaProducer011.KafkaTransactionState, IN, SinkFunction.Context) - Method in class org.apache.flink.streaming.connectors.kafka.FlinkKafkaProducer011
 
isImmutableType() - Method in class org.apache.flink.streaming.connectors.kafka.FlinkKafkaProducer011.ContextStateSerializer
 
isImmutableType() - Method in class org.apache.flink.streaming.connectors.kafka.FlinkKafkaProducer011.TransactionStateSerializer
 

K

Kafka011AvroTableSource - Class in org.apache.flink.streaming.connectors.kafka
Kafka StreamTableSource for Kafka 0.11.
Kafka011AvroTableSource(String, Properties, TableSchema, Class<? extends SpecificRecordBase>) - Constructor for class org.apache.flink.streaming.connectors.kafka.Kafka011AvroTableSource
Creates a Kafka 0.11 Avro StreamTableSource using a given SpecificRecord.
Kafka011AvroTableSource.Builder - Class in org.apache.flink.streaming.connectors.kafka
A builder to configure and create a Kafka011AvroTableSource.
Kafka011JsonTableSource - Class in org.apache.flink.streaming.connectors.kafka
Kafka StreamTableSource for Kafka 0.11.
Kafka011JsonTableSource(String, Properties, TableSchema, TableSchema) - Constructor for class org.apache.flink.streaming.connectors.kafka.Kafka011JsonTableSource
Creates a Kafka 0.11 JSON StreamTableSource.
Kafka011JsonTableSource.Builder - Class in org.apache.flink.streaming.connectors.kafka
A builder to configure and create a Kafka011JsonTableSource.
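A hedged sketch of the builder flow follows. The method names (forTopic, withKafkaProperties, withSchema) are assumptions based on the KafkaTableSource.Builder pattern of this release; verify them against the Builder's own javadoc before use:

```java
import java.util.Properties;

import org.apache.flink.streaming.connectors.kafka.Kafka011JsonTableSource;
import org.apache.flink.table.api.TableSchema;

public class TableSourceSketch {
    // Sketch only: forTopic/withKafkaProperties/withSchema are assumed
    // builder methods; the topic name is a placeholder.
    public static Kafka011JsonTableSource create(
            Properties kafkaProps, TableSchema schema) {
        return Kafka011JsonTableSource.builder()
                .forTopic("sensor-readings")
                .withKafkaProperties(kafkaProps)
                .withSchema(schema)
                .build();
    }
}
```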
Kafka011JsonTableSourceFactory - Class in org.apache.flink.streaming.connectors.kafka
Factory for creating configured instances of Kafka011JsonTableSource.
Kafka011JsonTableSourceFactory() - Constructor for class org.apache.flink.streaming.connectors.kafka.Kafka011JsonTableSourceFactory
 
Kafka011TableSource - Class in org.apache.flink.streaming.connectors.kafka
Kafka StreamTableSource for Kafka 0.11.
Kafka011TableSource(String, Properties, DeserializationSchema<Row>, TableSchema, TypeInformation<Row>) - Constructor for class org.apache.flink.streaming.connectors.kafka.Kafka011TableSource
Creates a Kafka 0.11 StreamTableSource.
KafkaMetricMuttableWrapper - Class in org.apache.flink.streaming.connectors.kafka.internal.metrics
Gauge for getting the current value of a Kafka metric.
KafkaMetricMuttableWrapper(Metric) - Constructor for class org.apache.flink.streaming.connectors.kafka.internal.metrics.KafkaMetricMuttableWrapper
 
kafkaVersion() - Method in class org.apache.flink.streaming.connectors.kafka.Kafka011JsonTableSourceFactory
 
KEY_DISABLE_METRICS - Static variable in class org.apache.flink.streaming.connectors.kafka.FlinkKafkaProducer011
Configuration key for disabling the metrics reporting.

L

lastParallelism - Variable in class org.apache.flink.streaming.connectors.kafka.FlinkKafkaProducer011.NextTransactionalIdHint
 

M

metrics() - Method in class org.apache.flink.streaming.connectors.kafka.internal.FlinkKafkaProducer
 

N

nextFreeTransactionalId - Variable in class org.apache.flink.streaming.connectors.kafka.FlinkKafkaProducer011.NextTransactionalIdHint
 
NextTransactionalIdHint() - Constructor for class org.apache.flink.streaming.connectors.kafka.FlinkKafkaProducer011.NextTransactionalIdHint
 
NextTransactionalIdHint(int, long) - Constructor for class org.apache.flink.streaming.connectors.kafka.FlinkKafkaProducer011.NextTransactionalIdHint
 

O

open(Configuration) - Method in class org.apache.flink.streaming.connectors.kafka.FlinkKafkaProducer011
Initializes the connection to Kafka.
org.apache.flink.streaming.connectors.kafka - package org.apache.flink.streaming.connectors.kafka
 
org.apache.flink.streaming.connectors.kafka.internal - package org.apache.flink.streaming.connectors.kafka.internal
 
org.apache.flink.streaming.connectors.kafka.internal.metrics - package org.apache.flink.streaming.connectors.kafka.internal.metrics
 

P

partitionsFor(String) - Method in class org.apache.flink.streaming.connectors.kafka.internal.FlinkKafkaProducer
 
preCommit(FlinkKafkaProducer011.KafkaTransactionState) - Method in class org.apache.flink.streaming.connectors.kafka.FlinkKafkaProducer011
 

R

recoverAndAbort(FlinkKafkaProducer011.KafkaTransactionState) - Method in class org.apache.flink.streaming.connectors.kafka.FlinkKafkaProducer011
 
recoverAndCommit(FlinkKafkaProducer011.KafkaTransactionState) - Method in class org.apache.flink.streaming.connectors.kafka.FlinkKafkaProducer011
 
resumeTransaction(long, short) - Method in class org.apache.flink.streaming.connectors.kafka.internal.FlinkKafkaProducer
Instead of obtaining the producerId and epoch from the transaction coordinator, re-uses previously obtained values so that the transaction can be resumed after a restart.

S

SAFE_SCALE_DOWN_FACTOR - Static variable in class org.apache.flink.streaming.connectors.kafka.FlinkKafkaProducer011
This coefficient determines the safe scale-down factor.
send(ProducerRecord<K, V>) - Method in class org.apache.flink.streaming.connectors.kafka.internal.FlinkKafkaProducer
 
send(ProducerRecord<K, V>, Callback) - Method in class org.apache.flink.streaming.connectors.kafka.internal.FlinkKafkaProducer
 
sendOffsetsToTransaction(Map<TopicPartition, OffsetAndMetadata>, String) - Method in class org.apache.flink.streaming.connectors.kafka.internal.FlinkKafkaProducer
 
serialize(FlinkKafkaProducer011.KafkaTransactionContext, DataOutputView) - Method in class org.apache.flink.streaming.connectors.kafka.FlinkKafkaProducer011.ContextStateSerializer
 
serialize(FlinkKafkaProducer011.KafkaTransactionState, DataOutputView) - Method in class org.apache.flink.streaming.connectors.kafka.FlinkKafkaProducer011.TransactionStateSerializer
 
setFailOnMissingField(boolean) - Method in class org.apache.flink.streaming.connectors.kafka.Kafka011JsonTableSource
Sets the flag that specifies the behavior in case of missing fields.
setFieldMapping(Map<String, String>) - Method in class org.apache.flink.streaming.connectors.kafka.Kafka011AvroTableSource
Sets a mapping from schema fields to fields of the produced Avro record.
setFieldMapping(Map<String, String>) - Method in class org.apache.flink.streaming.connectors.kafka.Kafka011JsonTableSource
Sets the mapping from table schema fields to JSON schema fields.
setKafkaMetric(Metric) - Method in class org.apache.flink.streaming.connectors.kafka.internal.metrics.KafkaMetricMuttableWrapper
 
setLogFailuresOnly(boolean) - Method in class org.apache.flink.streaming.connectors.kafka.FlinkKafkaProducer011
Defines whether the producer should fail on errors, or only log them.
setProctimeAttribute(String) - Method in class org.apache.flink.streaming.connectors.kafka.Kafka011AvroTableSource
Declares a field of the schema to be a processing time attribute.
setProctimeAttribute(String) - Method in class org.apache.flink.streaming.connectors.kafka.Kafka011JsonTableSource
Declares a field of the schema to be a processing time attribute.
setRowtimeAttributeDescriptor(RowtimeAttributeDescriptor) - Method in class org.apache.flink.streaming.connectors.kafka.Kafka011AvroTableSource
Declares a field of the schema to be a rowtime attribute.
setRowtimeAttributeDescriptor(RowtimeAttributeDescriptor) - Method in class org.apache.flink.streaming.connectors.kafka.Kafka011JsonTableSource
Declares a field of the schema to be a rowtime attribute.
setWriteTimestampToKafka(boolean) - Method in class org.apache.flink.streaming.connectors.kafka.FlinkKafkaProducer011
If set to true, Flink will write the (event time) timestamp attached to each record into Kafka.
snapshotState(FunctionSnapshotContext) - Method in class org.apache.flink.streaming.connectors.kafka.FlinkKafkaProducer011
 
supportsKafkaTimestamps() - Method in class org.apache.flink.streaming.connectors.kafka.Kafka011AvroTableSource.Builder
 
supportsKafkaTimestamps() - Method in class org.apache.flink.streaming.connectors.kafka.Kafka011JsonTableSource.Builder
 

T

TransactionalIdsGenerator - Class in org.apache.flink.streaming.connectors.kafka.internal
Class responsible for generating transactional ids to use when communicating with Kafka.
TransactionalIdsGenerator(String, int, int, int, int) - Constructor for class org.apache.flink.streaming.connectors.kafka.internal.TransactionalIdsGenerator
 
TransactionStateSerializer() - Constructor for class org.apache.flink.streaming.connectors.kafka.FlinkKafkaProducer011.TransactionStateSerializer
 

V

valueOf(String) - Static method in enum org.apache.flink.streaming.connectors.kafka.FlinkKafka011ErrorCode
Returns the enum constant of this type with the specified name.
valueOf(String) - Static method in enum org.apache.flink.streaming.connectors.kafka.FlinkKafkaProducer011.Semantic
Returns the enum constant of this type with the specified name.
values() - Static method in enum org.apache.flink.streaming.connectors.kafka.FlinkKafka011ErrorCode
Returns an array containing the constants of this enum type, in the order they are declared.
values() - Static method in enum org.apache.flink.streaming.connectors.kafka.FlinkKafkaProducer011.Semantic
Returns an array containing the constants of this enum type, in the order they are declared.

Copyright © 2014–2018 The Apache Software Foundation. All rights reserved.