An interesting piece on working with large data streams; the first-hand account of the solution is particularly useful.
By Alban Perillat-Merceroz on Medium
Software Engineer @Teads
Give meaning to 100 billion analytics events a day Analytics pipeline at Teads
In this article, we describe how we orchestrate Kafka, Dataflow, and BigQuery together to ingest and transform a large stream of events. Adding scale and latency constraints makes reconciling and reordering them a challenge; here is how we tackle it. ...
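The reordering challenge the excerpt mentions can be sketched in miniature. This is a hypothetical Python toy, not Teads' actual Dataflow code: it buffers out-of-order events and flushes them in timestamp order once a watermark has advanced past an assumed allowed-lateness horizon (`max_lateness`), roughly the idea behind windowing with allowed lateness in a streaming pipeline.

```python
def reorder_events(events, max_lateness):
    """Emit events in timestamp order, tolerating bounded lateness.

    A simplified sketch: the watermark tracks the largest timestamp
    seen so far, and any buffered event older than
    (watermark - max_lateness) is considered safe to emit.
    """
    buffer = []
    watermark = None
    out = []
    for ev in events:
        ts = ev["ts"]
        watermark = ts if watermark is None else max(watermark, ts)
        buffer.append(ev)
        horizon = watermark - max_lateness
        # Flush everything older than the allowed-lateness horizon.
        ready = [e for e in buffer if e["ts"] <= horizon]
        buffer = [e for e in buffer if e["ts"] > horizon]
        out.extend(sorted(ready, key=lambda e: e["ts"]))
    # End of stream: flush whatever is still buffered, in order.
    out.extend(sorted(buffer, key=lambda e: e["ts"]))
    return out
```

At scale this is exactly what gets hard: the buffer must be bounded, so the choice of lateness window trades completeness against latency, which is the tension the article explores.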
Friday, May 11, 2018