public class ConnectedDataStream<IN1,IN2> extends Object

Type parameters:
IN1 - Type of the first input data stream.
IN2 - Type of the second input data stream.

The ConnectedDataStream represents a stream for two different data types. It can be used to apply transformations such as a CoMapFunction on two DataStreams.

| Modifier and Type | Field and Description |
|---|---|
| protected DataStream<IN1> | dataStream1 |
| protected DataStream<IN2> | dataStream2 |
| protected StreamExecutionEnvironment | environment |
| protected boolean | isGrouped |
| protected StreamGraph | jobGraphBuilder |
| protected org.apache.flink.api.java.functions.KeySelector<IN1,?> | keySelector1 |
| protected org.apache.flink.api.java.functions.KeySelector<IN2,?> | keySelector2 |
| Modifier | Constructor and Description |
|---|---|
| protected | ConnectedDataStream(ConnectedDataStream<IN1,IN2> coDataStream) |
| protected | ConnectedDataStream(DataStream<IN1> input1, DataStream<IN2> input2) |
| Modifier and Type | Method and Description |
|---|---|
| <OUT> SingleOutputStreamOperator<OUT,?> | addGeneralWindowCombine(CoWindowFunction<IN1,IN2,OUT> coWindowFunction, org.apache.flink.api.common.typeinfo.TypeInformation<OUT> outTypeInfo, long windowSize, long slideInterval, TimestampWrapper<IN1> timestamp1, TimestampWrapper<IN2> timestamp2) |
| <F> F | clean(F f) |
| protected ConnectedDataStream<IN1,IN2> | copy() |
| <OUT> SingleOutputStreamOperator<OUT,?> | flatMap(CoFlatMapFunction<IN1,IN2,OUT> coFlatMapper) Applies a CoFlatMap transformation on a ConnectedDataStream and maps the output to a common type. |
| StreamExecutionEnvironment | getExecutionEnvironment() |
| DataStream<IN1> | getFirst() Returns the first DataStream. |
| protected <OUT> TwoInputStreamOperator<IN1,IN2,OUT> | getReduceOperator(CoReduceFunction<IN1,IN2,OUT> coReducer) |
| DataStream<IN2> | getSecond() Returns the second DataStream. |
| org.apache.flink.api.common.typeinfo.TypeInformation<IN1> | getType1() Gets the type of the first input. |
| org.apache.flink.api.common.typeinfo.TypeInformation<IN2> | getType2() Gets the type of the second input. |
| ConnectedDataStream<IN1,IN2> | groupBy(int[] keyPositions1, int[] keyPositions2) GroupBy operation for connected data stream. |
| ConnectedDataStream<IN1,IN2> | groupBy(int keyPosition1, int keyPosition2) GroupBy operation for connected data stream. |
| ConnectedDataStream<IN1,IN2> | groupBy(org.apache.flink.api.java.functions.KeySelector<IN1,?> keySelector1, org.apache.flink.api.java.functions.KeySelector<IN2,?> keySelector2) GroupBy operation for connected data stream. |
| ConnectedDataStream<IN1,IN2> | groupBy(String[] fields1, String[] fields2) GroupBy operation for connected data stream using key expressions. |
| ConnectedDataStream<IN1,IN2> | groupBy(String field1, String field2) GroupBy operation for connected data stream using key expressions. |
| <OUT> SingleOutputStreamOperator<OUT,?> | map(CoMapFunction<IN1,IN2,OUT> coMapper) Applies a CoMap transformation on a ConnectedDataStream and maps the output to a common type. |
| ConnectedDataStream<IN1,IN2> | partitionByHash(int[] keyPositions1, int[] keyPositions2) PartitionBy operation for connected data stream. |
| ConnectedDataStream<IN1,IN2> | partitionByHash(int keyPosition1, int keyPosition2) PartitionBy operation for connected data stream. |
| ConnectedDataStream<IN1,IN2> | partitionByHash(org.apache.flink.api.java.functions.KeySelector<IN1,?> keySelector1, org.apache.flink.api.java.functions.KeySelector<IN2,?> keySelector2) PartitionBy operation for connected data stream. |
| ConnectedDataStream<IN1,IN2> | partitionByHash(String[] fields1, String[] fields2) PartitionBy operation for connected data stream using key expressions. |
| ConnectedDataStream<IN1,IN2> | partitionByHash(String field1, String field2) PartitionBy operation for connected data stream using key expressions. |
| <OUT> SingleOutputStreamOperator<OUT,?> | reduce(CoReduceFunction<IN1,IN2,OUT> coReducer) Applies a reduce transformation on a ConnectedDataStream and maps the outputs to a common type. |
| <OUT> SingleOutputStreamOperator<OUT,?> | transform(String functionName, org.apache.flink.api.common.typeinfo.TypeInformation<OUT> outTypeInfo, TwoInputStreamOperator<IN1,IN2,OUT> operator) |
| <OUT> SingleOutputStreamOperator<OUT,?> | windowReduce(CoWindowFunction<IN1,IN2,OUT> coWindowFunction, long windowSize, long slideInterval) Applies a CoWindow transformation on the connected DataStreams. |
| <OUT> SingleOutputStreamOperator<OUT,?> | windowReduce(CoWindowFunction<IN1,IN2,OUT> coWindowFunction, long windowSize, long slideInterval, TimestampWrapper<IN1> timestamp1, TimestampWrapper<IN2> timestamp2) Applies a CoWindow transformation on the connected DataStreams. |
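The map entry above is the simplest co-transformation: map1 is called for each element of the first input, map2 for each element of the second, and every call produces exactly one element of the common output type. That contract can be sketched in plain Java without Flink on the classpath; the CoMap interface and run driver below are hypothetical stand-ins, not part of the API:

```java
import java.util.ArrayList;
import java.util.List;

public class CoMapSketch {

    // Hypothetical stand-in mirroring CoMapFunction's shape:
    // exactly one output element per input element.
    interface CoMap<IN1, IN2, OUT> {
        OUT map1(IN1 value); // invoked for each element of the first input stream
        OUT map2(IN2 value); // invoked for each element of the second input stream
    }

    // Drives both map methods the way a connected stream would:
    // every element of either input is mapped to the common output type OUT.
    static <IN1, IN2, OUT> List<OUT> run(List<IN1> in1, List<IN2> in2, CoMap<IN1, IN2, OUT> f) {
        List<OUT> out = new ArrayList<>();
        for (IN1 v : in1) {
            out.add(f.map1(v));
        }
        for (IN2 v : in2) {
            out.add(f.map2(v));
        }
        return out;
    }

    public static void main(String[] args) {
        CoMap<Integer, Double, String> fmt = new CoMap<Integer, Double, String>() {
            public String map1(Integer v) { return "int:" + v; }
            public String map2(Double v)  { return "double:" + v; }
        };
        // Elements of both inputs end up in one stream of the common type String.
        System.out.println(run(List.of(1, 2), List.of(3.5), fmt)); // [int:1, int:2, double:3.5]
    }
}
```

Flink's actual CoMapFunction follows the same shape, with the connected stream supplying the elements instead of the two lists.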
protected StreamExecutionEnvironment environment
protected StreamGraph jobGraphBuilder
protected DataStream<IN1> dataStream1
protected DataStream<IN2> dataStream2
protected boolean isGrouped
protected org.apache.flink.api.java.functions.KeySelector<IN1,?> keySelector1
protected org.apache.flink.api.java.functions.KeySelector<IN2,?> keySelector2
protected ConnectedDataStream(DataStream<IN1> input1, DataStream<IN2> input2)
protected ConnectedDataStream(ConnectedDataStream<IN1,IN2> coDataStream)
public <F> F clean(F f)
public StreamExecutionEnvironment getExecutionEnvironment()
public DataStream<IN1> getFirst()
Returns the first DataStream.

public DataStream<IN2> getSecond()
Returns the second DataStream.

public org.apache.flink.api.common.typeinfo.TypeInformation<IN1> getType1()
Gets the type of the first input.

public org.apache.flink.api.common.typeinfo.TypeInformation<IN2> getType2()
Gets the type of the second input.
public ConnectedDataStream<IN1,IN2> groupBy(int keyPosition1, int keyPosition2)
GroupBy operation for connected data stream.
Parameters:
keyPosition1 - The field used to compute the hashcode of the elements in the first input stream.
keyPosition2 - The field used to compute the hashcode of the elements in the second input stream.
Returns:
The grouped ConnectedDataStream
See Also:
reduce(org.apache.flink.streaming.api.functions.co.CoReduceFunction<IN1, IN2, OUT>)

public ConnectedDataStream<IN1,IN2> groupBy(int[] keyPositions1, int[] keyPositions2)
GroupBy operation for connected data stream.
Parameters:
keyPositions1 - The fields used to group the first input stream.
keyPositions2 - The fields used to group the second input stream.
Returns:
The grouped ConnectedDataStream
See Also:
reduce(org.apache.flink.streaming.api.functions.co.CoReduceFunction<IN1, IN2, OUT>)

public ConnectedDataStream<IN1,IN2> groupBy(String field1, String field2)
GroupBy operation for connected data stream using key expressions. A field expression is either the name of a public field or a getter method with parentheses of the DataStream's underlying type. A dot can be used to drill down into objects, as in "field1.getInnerField2()".
Parameters:
field1 - The grouping expression for the first input
field2 - The grouping expression for the second input
Returns:
The grouped ConnectedDataStream

public ConnectedDataStream<IN1,IN2> groupBy(String[] fields1, String[] fields2)
GroupBy operation for connected data stream using key expressions. A field expression is either the name of a public field or a getter method with parentheses of the DataStream's underlying type. A dot can be used to drill down into objects, as in "field1.getInnerField2()".
Parameters:
fields1 - The grouping expressions for the first input
fields2 - The grouping expressions for the second input
Returns:
The grouped ConnectedDataStream

public ConnectedDataStream<IN1,IN2> groupBy(org.apache.flink.api.java.functions.KeySelector<IN1,?> keySelector1, org.apache.flink.api.java.functions.KeySelector<IN2,?> keySelector2)
GroupBy operation for connected data stream.
Parameters:
keySelector1 - The KeySelector used for grouping the first input
keySelector2 - The KeySelector used for grouping the second input
Returns:
The grouped ConnectedDataStream
See Also:
reduce(org.apache.flink.streaming.api.functions.co.CoReduceFunction<IN1, IN2, OUT>)

public ConnectedDataStream<IN1,IN2> partitionByHash(int keyPosition1, int keyPosition2)
PartitionBy operation for connected data stream.
Parameters:
keyPosition1 - The field used to compute the hashcode of the elements in the first input stream.
keyPosition2 - The field used to compute the hashcode of the elements in the second input stream.
Returns:
The partitioned ConnectedDataStream

public ConnectedDataStream<IN1,IN2> partitionByHash(int[] keyPositions1, int[] keyPositions2)
PartitionBy operation for connected data stream.
Parameters:
keyPositions1 - The fields used to partition the first input stream.
keyPositions2 - The fields used to partition the second input stream.
Returns:
The partitioned ConnectedDataStream

public ConnectedDataStream<IN1,IN2> partitionByHash(String field1, String field2)
PartitionBy operation for connected data stream using key expressions. A field expression is either the name of a public field or a getter method with parentheses of the DataStream's underlying type. A dot can be used to drill down into objects, as in "field1.getInnerField2()".
Parameters:
field1 - The partitioning expression for the first input
field2 - The partitioning expression for the second input
Returns:
The partitioned ConnectedDataStream

public ConnectedDataStream<IN1,IN2> partitionByHash(String[] fields1, String[] fields2)
PartitionBy operation for connected data stream using key expressions. A field expression is either the name of a public field or a getter method with parentheses of the DataStream's underlying type. A dot can be used to drill down into objects, as in "field1.getInnerField2()".
Parameters:
fields1 - The partitioning expressions for the first input
fields2 - The partitioning expressions for the second input
Returns:
The partitioned ConnectedDataStream

public ConnectedDataStream<IN1,IN2> partitionByHash(org.apache.flink.api.java.functions.KeySelector<IN1,?> keySelector1, org.apache.flink.api.java.functions.KeySelector<IN2,?> keySelector2)
PartitionBy operation for connected data stream.
Parameters:
keySelector1 - The KeySelector used for partitioning the first input
keySelector2 - The KeySelector used for partitioning the second input
Returns:
The partitioned ConnectedDataStream

public <OUT> SingleOutputStreamOperator<OUT,?> map(CoMapFunction<IN1,IN2,OUT> coMapper)
Applies a CoMap transformation on a ConnectedDataStream and maps the output to a common type. The transformation calls CoMapFunction.map1(IN1) for each element of the first input and CoMapFunction.map2(IN2) for each element of the second input. Each CoMapFunction call returns exactly one element.
Parameters:
coMapper - The CoMapFunction used to jointly transform the two input DataStreams
Returns:
The transformed DataStream

public <OUT> SingleOutputStreamOperator<OUT,?> flatMap(CoFlatMapFunction<IN1,IN2,OUT> coFlatMapper)
Applies a CoFlatMap transformation on a ConnectedDataStream and maps the output to a common type. The transformation calls CoFlatMapFunction.flatMap1(IN1, org.apache.flink.util.Collector<OUT>) for each element of the first input and CoFlatMapFunction.flatMap2(IN2, org.apache.flink.util.Collector<OUT>) for each element of the second input. Each CoFlatMapFunction call can return any number of elements, including none.
Parameters:
coFlatMapper - The CoFlatMapFunction used to jointly transform the two input DataStreams
Returns:
The transformed DataStream

public <OUT> SingleOutputStreamOperator<OUT,?> reduce(CoReduceFunction<IN1,IN2,OUT> coReducer)
Applies a reduce transformation on a ConnectedDataStream and maps the outputs to a common type. If the ConnectedDataStream is batched or windowed, the reduce transformation is applied on every sliding batch/window of the data stream. If the connected data stream is grouped, the reducer is applied on every group of elements sharing the same key. This type of reduce is much faster than reduceGroup since the reduce function can be applied incrementally.
Parameters:
coReducer - The CoReduceFunction that will be called for every element of the inputs.
Returns:
The transformed DataStream.

public <OUT> SingleOutputStreamOperator<OUT,?> windowReduce(CoWindowFunction<IN1,IN2,OUT> coWindowFunction, long windowSize, long slideInterval)
Applies a CoWindow transformation on the connected DataStreams. The transformation calls the CoWindowFunction.coWindow(java.util.List<IN1>, java.util.List<IN2>, org.apache.flink.util.Collector<O>) method for time aligned windows of the two data streams. System time is used by default to compute windows.
Parameters:
coWindowFunction - The CoWindowFunction that will be applied for the time windows.
windowSize - Size of the windows that will be aligned for both streams, in milliseconds.
slideInterval - After every function call the windows will be slid by this interval.
Returns:
The transformed DataStream.

public <OUT> SingleOutputStreamOperator<OUT,?> windowReduce(CoWindowFunction<IN1,IN2,OUT> coWindowFunction, long windowSize, long slideInterval, TimestampWrapper<IN1> timestamp1, TimestampWrapper<IN2> timestamp2)
Applies a CoWindow transformation on the connected DataStreams. The transformation calls the CoWindowFunction.coWindow(java.util.List<IN1>, java.util.List<IN2>, org.apache.flink.util.Collector<O>) method for time aligned windows of the two data streams. The user can implement their own time stamps or use the system time by default.
Parameters:
coWindowFunction - The CoWindowFunction that will be applied for the time windows.
windowSize - Size of the windows that will be aligned for both streams. If system time is used it is in milliseconds. User defined time stamps are assumed to be monotonically increasing.
slideInterval - After every function call the windows will be slid by this interval.
timestamp1 - User defined time stamps for the first input.
timestamp2 - User defined time stamps for the second input.
Returns:
The transformed DataStream.

protected <OUT> TwoInputStreamOperator<IN1,IN2,OUT> getReduceOperator(CoReduceFunction<IN1,IN2,OUT> coReducer)
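The time-aligned windowing that windowReduce describes can be sketched in the same plain-Java style: for each window of windowSize, sliding by slideInterval, the function sees the elements of both inputs whose timestamps fall into that window. Everything below (the CoWindow interface, the run driver, the explicit start/end times) is a hypothetical stand-in, not Flink's implementation:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.ToLongFunction;

public class CoWindowSketch {

    // Hypothetical stand-in for the CoWindow contract: for each time-aligned
    // window the function sees that window's elements from both inputs at once.
    interface CoWindow<IN1, IN2, OUT> {
        void coWindow(List<IN1> first, List<IN2> second, List<OUT> out);
    }

    // For each window [start, start + windowSize), sliding by slideInterval,
    // collects the elements of both inputs whose timestamps fall inside it and
    // applies the window function. Start/end times are made explicit here;
    // Flink derives them from system time or the TimestampWrappers instead.
    static <IN1, IN2, OUT> List<OUT> run(
            List<IN1> in1, ToLongFunction<IN1> ts1,
            List<IN2> in2, ToLongFunction<IN2> ts2,
            long windowSize, long slideInterval,
            long startTime, long endTime,
            CoWindow<IN1, IN2, OUT> f) {
        List<OUT> out = new ArrayList<>();
        for (long start = startTime; start < endTime; start += slideInterval) {
            List<IN1> w1 = new ArrayList<>();
            List<IN2> w2 = new ArrayList<>();
            for (IN1 v : in1) {
                long t = ts1.applyAsLong(v);
                if (t >= start && t < start + windowSize) {
                    w1.add(v);
                }
            }
            for (IN2 v : in2) {
                long t = ts2.applyAsLong(v);
                if (t >= start && t < start + windowSize) {
                    w2.add(v);
                }
            }
            f.coWindow(w1, w2, out);
        }
        return out;
    }

    public static void main(String[] args) {
        // Elements double as their own (monotonically increasing) timestamps.
        List<Long> a = List.of(0L, 1L, 5L);
        List<Long> b = List.of(2L, 6L);
        // windowSize = slideInterval = 5 over [0, 10): two tumbling windows.
        List<String> counts = run(a, v -> v, b, v -> v, 5, 5, 0, 10,
                (w1, w2, out) -> out.add(w1.size() + "+" + w2.size()));
        System.out.println(counts); // [2+1, 1+1]
    }
}
```

With slideInterval smaller than windowSize the windows overlap and an element can appear in several coWindow calls, matching the sliding behavior described above.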
public <OUT> SingleOutputStreamOperator<OUT,?> addGeneralWindowCombine(CoWindowFunction<IN1,IN2,OUT> coWindowFunction, org.apache.flink.api.common.typeinfo.TypeInformation<OUT> outTypeInfo, long windowSize, long slideInterval, TimestampWrapper<IN1> timestamp1, TimestampWrapper<IN2> timestamp2)
public <OUT> SingleOutputStreamOperator<OUT,?> transform(String functionName, org.apache.flink.api.common.typeinfo.TypeInformation<OUT> outTypeInfo, TwoInputStreamOperator<IN1,IN2,OUT> operator)
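Returning to flatMap above: unlike a CoMapFunction, each CoFlatMapFunction call may emit any number of elements, including none, through a collector. A plain-Java stand-in for that contract (the CoFlatMap interface and run driver are hypothetical, with a List standing in for Flink's Collector):

```java
import java.util.ArrayList;
import java.util.List;

public class CoFlatMapSketch {

    // Hypothetical stand-in for the CoFlatMap contract; a List stands in
    // for Flink's Collector. Each call may emit zero or more elements.
    interface CoFlatMap<IN1, IN2, OUT> {
        void flatMap1(IN1 value, List<OUT> collector); // first input
        void flatMap2(IN2 value, List<OUT> collector); // second input
    }

    // Feeds every element of either input to the matching flatMap method,
    // collecting all emitted elements into one output of the common type.
    static <IN1, IN2, OUT> List<OUT> run(List<IN1> in1, List<IN2> in2, CoFlatMap<IN1, IN2, OUT> f) {
        List<OUT> out = new ArrayList<>();
        for (IN1 v : in1) {
            f.flatMap1(v, out);
        }
        for (IN2 v : in2) {
            f.flatMap2(v, out);
        }
        return out;
    }

    public static void main(String[] args) {
        CoFlatMap<String, Integer, String> f = new CoFlatMap<String, Integer, String>() {
            // Splits lines from the first input into words (several elements per call)...
            public void flatMap1(String v, List<String> out) {
                for (String w : v.split(" ")) {
                    out.add(w);
                }
            }
            // ...and filters the second input (possibly zero elements per call).
            public void flatMap2(Integer v, List<String> out) {
                if (v > 0) {
                    out.add(v.toString());
                }
            }
        };
        System.out.println(run(List.of("a b"), List.of(-1, 7), f)); // [a, b, 7]
    }
}
```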
protected ConnectedDataStream<IN1,IN2> copy()
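Finally, the incremental grouped reduce described for reduce can be illustrated on a single input: a running aggregate is kept per key and updated element by element, which is why the documentation calls it much faster than a reduceGroup-style re-evaluation. This single-input sketch is a simplification of the two-input CoReduceFunction; all names below are hypothetical:

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;
import java.util.function.BinaryOperator;
import java.util.function.Function;

public class GroupedReduceSketch {

    // Hypothetical single-input sketch of a grouped, incremental reduce:
    // each arriving element is combined with the running aggregate for its
    // key, and the mapped aggregate is emitted after every element.
    static <IN, KEY, OUT> List<OUT> run(List<IN> input,
                                        Function<IN, KEY> keySelector,
                                        BinaryOperator<IN> reducer,
                                        Function<IN, OUT> mapper) {
        Map<KEY, IN> aggregates = new HashMap<>();
        List<OUT> out = new ArrayList<>();
        for (IN v : input) {
            // Incremental update: no group is ever re-reduced from scratch.
            IN agg = aggregates.merge(keySelector.apply(v), v, reducer);
            out.add(mapper.apply(agg));
        }
        return out;
    }

    public static void main(String[] args) {
        // Running per-key sums over (key, value) pairs encoded as int[]{key, value}.
        List<int[]> input = List.of(new int[]{1, 2}, new int[]{2, 5}, new int[]{1, 3});
        List<String> result = run(input,
                v -> v[0],
                (a, b) -> new int[]{a[0], a[1] + b[1]},
                v -> v[0] + "->" + v[1]);
        System.out.println(result); // [1->2, 2->5, 1->5]
    }
}
```

On an ungrouped or windowed ConnectedDataStream the same idea applies per batch/window rather than per key, as the reduce description above states.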
Copyright © 2014–2015 The Apache Software Foundation. All rights reserved.