A Confluent Podcast
Synthesis, as a Confluent partner, is migrating an existing behavioral IoT framework into Kafka to streamline and normalize vendor information.
The legacy messaging technology they currently use has already reshaped the behavioral IoT data space, and Apache Kafka® will now allow them to take that to the next level.
New ways of normalizing the data will increase efficiency for vendors, users, and manufacturers. It will also enable IoT technology to scale going forward.
Nick Walker (Principal of Streaming) and Yoni Lew (DevOps Developer) of Synthesis discuss how they utilize Confluent Platform in a personal behavior data pipeline provided by Vitality Group. Vitality Group promotes a shared-value insurance model, which sources behavioral change information and transforms it into personal incentives and rewards for members associated with their global partners.
Yoni shares the motivations for moving their data from an existing product over to Kafka. The decision was driven by two needs: streamlining the varied forms and features of data arriving from vendors, and meeting users' expectations for how quickly processed data comes out of the system. Kafka is the best choice for Synthesis because it can stream messages through various topics and workflows while storing them appropriately. It is especially important for Synthesis to be able to replay data as needed without losing its integrity.
Yoni explains how Kafka gives them the opportunity, even if something goes wrong downstream and a consumer fails to process something correctly, to reprocess the data on their own timeline and at their own rate, because they still have the data.
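The replay capability Yoni describes comes from Kafka's retention and offset model: records stay on the broker for a configured retention period regardless of consumption, and a consumer group with no committed offsets can start from the earliest retained record. A minimal sketch of the relevant settings is below; all names and values here are illustrative, not Synthesis' actual configuration.

```properties
# Topic-level setting: keep records for 30 days so they remain
# available for replay even after downstream consumers have read them.
# (retention.ms=-1 would retain records indefinitely.)
retention.ms=2592000000

# Consumer settings for a replay run. A fresh group.id has no committed
# offsets, so auto.offset.reset=earliest makes the consumer start from
# the beginning of the retained log.
bootstrap.servers=localhost:9092
group.id=behavioral-iot-replay-1
auto.offset.reset=earliest
enable.auto.commit=false
```

With manual offset commits disabled, the replay job can be rerun from the start as many times as needed without affecting the offsets of the regular processing pipeline.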
The implementation of Kafka into Synthesis’ current workflow has allowed them to create new functionality for assisting various groups that use the data in different ways.
This has furthermore opened up new options for the company to build on its framework using Kafka features that enable reactive applications. With Kafka, Synthesis sees endless opportunities to integrate the data it collects into usable, historical pipelines for long-term models.
As Confluent’s partner of choice, we ensure you can take full advantage of your data.
We have a team of recognised local talent with a proven delivery track record.
Because of this, our Confluent specialists have been chosen as the only African partner to present at Confluent events and appear on their podcast to showcase our learning.
We have deep Confluent, Kafka, security and event streaming expertise with a speciality in cloud-native reactive applications.
Plus, we have done this before, many times over.
See how we partner with our customers to solve problems and create impact with leading technologies.