Following up on the previous post, Flink中流处理之Window, where we saw how to assign tumbling windows, sliding windows and so on to a stream: assigning a window is not the whole story. We still have to tell Flink what to compute inside the window, which is where window functions come in. Once a window closes, the window function produces the result for the elements collected in that window.
A window function defines the computation to be applied to the data collected in a window. Window functions fall into two broad categories (note: the keyed windows used below require a keyBy before the window assigner):
- Incremental aggregation functions: the computation runs as each element arrives, and only a small piece of state is kept. Typical examples are ReduceFunction and AggregateFunction.
- Full-window functions: all elements of the window are collected first and iterated over when the window is evaluated. ProcessWindowFunction is a full-window function.
ReduceFunction and AggregateFunction are the more efficient of the two, because Flink can aggregate the incoming elements incrementally. A ProcessWindowFunction cannot be executed as efficiently, because Flink has to buffer every element of the window internally before the function is invoked (a sketch of a ProcessWindowFunction is shown at the end of this post for comparison).
Below is a WordCount example built on the incremental aggregation function ReduceFunction:
import org.apache.flink.api.common.functions.AggregateFunction;
import org.apache.flink.api.common.functions.FlatMapFunction;
import org.apache.flink.api.common.functions.ReduceFunction;
import org.apache.flink.api.java.tuple.Tuple2;
import org.apache.flink.streaming.api.datastream.DataStreamSource;
import org.apache.flink.streaming.api.datastream.KeyedStream;
import org.apache.flink.streaming.api.datastream.SingleOutputStreamOperator;
import org.apache.flink.streaming.api.datastream.WindowedStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.streaming.api.windowing.assigners.TumblingProcessingTimeWindows;
import org.apache.flink.streaming.api.windowing.time.Time;
import org.apache.flink.streaming.api.windowing.windows.TimeWindow;
import org.apache.flink.util.Collector;
public class Flink_Window_TimeTumbling_ReduceFunction {

    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
        env.setParallelism(1);

        // Read lines of text from a socket source (host "mo", port 8888)
        DataStreamSource<String> inputFile = env.socketTextStream("mo", 8888);

        // Split each line into (word, 1) tuples
        SingleOutputStreamOperator<Tuple2<String, Integer>> wordToWordOne =
                inputFile.flatMap(new MyFlatMapFunction());

        // Partition the stream by word
        KeyedStream<Tuple2<String, Integer>, String> keyedStream = wordToWordOne.keyBy(x -> x.f0);

        // Assign 5-second tumbling processing-time windows
        WindowedStream<Tuple2<String, Integer>, String, TimeWindow> window =
                keyedStream.window(TumblingProcessingTimeWindows.of(Time.seconds(5)));

        // Incrementally sum the counts per key as each element arrives
        SingleOutputStreamOperator<Tuple2<String, Integer>> res = window.reduce(new ReduceFunction<Tuple2<String, Integer>>() {
            @Override
            public Tuple2<String, Integer> reduce(Tuple2<String, Integer> value1, Tuple2<String, Integer> value2) throws Exception {
                return new Tuple2<>(value1.f0, value1.f1 + value2.f1);
            }
        });

        res.print();
        env.execute("Window_TimeTumbling");
    }

    public static class MyFlatMapFunction implements FlatMapFunction<String, Tuple2<String, Integer>> {
        @Override
        public void flatMap(String value, Collector<Tuple2<String, Integer>> out) throws Exception {
            String[] words = value.split(" ");
            for (String word : words) {
                out.collect(new Tuple2<>(word, 1));
            }
        }
    }
}
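To try it out, assuming the host name mo passed to socketTextStream resolves to a reachable machine, start a simple text source there with nc -lk 8888 and type space-separated words; each time a window fires, the job prints one (word, count) tuple per key for the words received in that window.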
The window size here is 5 seconds. The reduce call on the WindowedStream (the aggregation step above) can also be replaced with an AggregateFunction:
SingleOutputStreamOperator<Tuple2<String, Integer>> res = window.aggregate(new AggregateFunction<Tuple2<String, Integer>, Tuple2<String, Integer>, Tuple2<String, Integer>>() {
    @Override
    public Tuple2<String, Integer> createAccumulator() {
        // Initial accumulator: empty key, count 0
        return new Tuple2<>("", 0);
    }

    @Override
    public Tuple2<String, Integer> add(Tuple2<String, Integer> value, Tuple2<String, Integer> accumulator) {
        // Called once per incoming element: keep the word and add its count
        return new Tuple2<>(value.f0, value.f1 + accumulator.f1);
    }

    @Override
    public Tuple2<String, Integer> getResult(Tuple2<String, Integer> accumulator) {
        // Emitted when the window fires
        return accumulator;
    }

    @Override
    public Tuple2<String, Integer> merge(Tuple2<String, Integer> a, Tuple2<String, Integer> b) {
        // Only invoked for mergeable windows (e.g. session windows); sum the partial counts
        return new Tuple2<>(a.f0, a.f1 + b.f1);
    }
});
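Unlike ReduceFunction, AggregateFunction separates the input, accumulator and output types; here all three happen to be Tuple2<String, Integer>, but the accumulator could just as well be a plain Integer. Note that merge is only called for mergeable windows such as session windows, which is why it is often left unimplemented in tumbling-window examples; the version above sums the two partial counts so it stays correct either way.

For comparison, here is a rough sketch of the same per-window word count written as a full-window ProcessWindowFunction, assuming the keyedStream and window assigner defined above plus the additional import org.apache.flink.streaming.api.functions.windowing.ProcessWindowFunction. Flink buffers every element of the window before process() runs, which is exactly why this style is less efficient than the incremental variants:

SingleOutputStreamOperator<Tuple2<String, Integer>> resFull = window.process(
        new ProcessWindowFunction<Tuple2<String, Integer>, Tuple2<String, Integer>, String, TimeWindow>() {
            @Override
            public void process(String key,
                                Context context,
                                Iterable<Tuple2<String, Integer>> elements,
                                Collector<Tuple2<String, Integer>> out) {
                // All buffered elements of the window are handed over at once
                int count = 0;
                for (Tuple2<String, Integer> element : elements) {
                    count += element.f1;
                }
                out.collect(new Tuple2<>(key, count));
            }
        });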