This post looks at how Transform operators are used in Flink: the basic transformation operators, splitting and merging streams, the user-defined function interfaces, and rich functions. It is shared as a reference; hopefully you will get something out of it.
String path = "E:\\GIT\\flink-learn\\flink-learn\\telemetering.txt"; StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment(); TupleTypeInfo<Tuple3<String, Double, Long>> typeInfo = new TupleTypeInfo<>(Types.STRING, Types.DOUBLE, Types.LONG); TupleCsvInputFormat<Tuple3<String, Double, Long>> tupleCsvInputFormat = new TupleCsvInputFormat<>(new Path(path), typeInfo); DataStreamSource<Tuple3<String, Double, Long>> dataStreamSource = env.createInput(tupleCsvInputFormat, typeInfo); //或 DataStreamSource<Tuple2<String, Double>> dataStreamSource = env.readFile(tupleCsvInputFormat, path); SingleOutputStreamOperator<Tuple3<String, Double, Long>> operator = dataStreamSource .filter(Objects::nonNull) // .map() // .flatMap() // .keyBy(0) .keyBy(tuple -> tuple.f0) .minBy(1); // .min() // .max(1); // .maxBy(1, false); // .sum(1); // .reduce(); // .process(); operator.print().setParallelism(1); env.execute();
String path = "E:\\GIT\\flink-learn\\flink-learn\\telemetering.txt"; StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment(); PojoTypeInfo<TelemeterDTO> typeInfo = (PojoTypeInfo<TelemeterDTO>) Types.POJO(TelemeterDTO.class); PojoCsvInputFormat<TelemeterDTO> inputFormat = new PojoCsvInputFormat<>(new Path(path), typeInfo, new String[]{"code", "value", "timestamp"}); DataStreamSource<TelemeterDTO> dataStreamSource = env.createInput(inputFormat, typeInfo); //分流 SplitStream<TelemeterDTO> splitStream = dataStreamSource .split(item -> { if (item.getValue() > 100) { return Collections.singletonList("high"); } return Collections.singletonList("low"); }); DataStream<TelemeterDTO> highStream = splitStream.select("high"); DataStream<TelemeterDTO> lowStream = splitStream.select("low"); //合流 ConnectedStreams<TelemeterDTO, TelemeterDTO> connectedStreams = lowStream.connect(highStream); // DataStream<TelemeterDTO> unionDataStream = lowStream.union(highStream); //需要類型一致 SingleOutputStreamOperator<Tuple3<String, Double, Long>> operator = connectedStreams .map(new CoMapFunction<TelemeterDTO, TelemeterDTO, Tuple3<String, Double, Long>>() { @Override public Tuple3<String, Double, Long> map1(TelemeterDTO value) { return Tuple3.of(value.getCode(), value.getValue(), value.getTimestamp()); } @Override public Tuple3<String, Double, Long> map2(TelemeterDTO value) { return Tuple3.of(value.getCode(), value.getValue(), value.getTimestamp()); } }); operator.print(); env.execute();
Flink's transformation operators take user-defined functions (UDFs). The main function interfaces are listed below; a sketch of implementing one as a standalone class follows the list.
MapFunction
FilterFunction
ReduceFunction
ProcessFunction
SourceFunction
SinkFunction
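Each of these is a single-method interface, so it can be passed as a lambda (as in the snippets above) or implemented as a named class. A small sketch of the class form, using a hypothetical filter on the Tuple3 stream from the first example:

// Sketch: a FilterFunction implemented as a named class rather than a lambda.
// The null checks are illustrative, not from the original article.
public static class ValidTelemeterFilter implements FilterFunction<Tuple3<String, Double, Long>> {
    @Override
    public boolean filter(Tuple3<String, Double, Long> value) {
        return value != null && value.f1 != null;
    }
}

// usage: dataStreamSource.filter(new ValidTelemeterFilter())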
Rich functions additionally expose lifecycle methods and runtime context information, for example (see the sketch after this list):
open() can be used to set up resources such as a database connection when the operator instance is created
close() releases resources before the operator's lifecycle ends
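A minimal sketch of a rich function built on the points above; the JDBC URL, credentials, and the enrichment logic are hypothetical placeholders, not something taken from the original article:

// Sketch of a RichMapFunction: open() runs once per parallel instance before any
// records are processed, close() runs once when the task shuts down.
public static class EnrichWithRuntimeInfo extends RichMapFunction<TelemeterDTO, String> {

    private transient Connection connection;

    @Override
    public void open(Configuration parameters) throws Exception {
        // Set up an expensive resource once, e.g. a database connection (placeholder URL).
        connection = DriverManager.getConnection("jdbc:mysql://localhost:3306/demo", "user", "password");
    }

    @Override
    public String map(TelemeterDTO value) {
        // getRuntimeContext() gives access to the subtask index, state, metrics, etc.
        int subtask = getRuntimeContext().getIndexOfThisSubtask();
        return subtask + " -> " + value.getCode();
    }

    @Override
    public void close() throws Exception {
        if (connection != null) {
            connection.close();
        }
    }
}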
That covers how Transform is used in Flink. Thanks for reading, and hopefully this gives you a working understanding of the transformation operators.