In this talk, I will share my experience with stream processing over the last six years.
About six years ago, I started my journey at 500.com, writing my first stream processing application: active-user counting with Apache Storm.
About three years ago, to tackle incremental learning, I started using Spark Streaming to build an incremental machine learning application at my second company, Lazada Group, the largest e-commerce company in Southeast Asia.
Today, for stateful stream processing and complex event processing, we use Flink at Grab, where we use data and technology to drive Southeast Asia forward and transform the way people travel and pay across the region.
In this session, I will explain how these frameworks behave in different use cases and why we ultimately chose Flink. I will also share why we chose Scala as our main programming language for data processing in both Flink and Spark.
My previous talk at the Strata Data Conference: https://conferences.oreilly.com/strata/strata-sg/public/schedule/detail/62832