Apache Kafka is an open-source platform used for reading and writing massive amounts of real-time streaming data. Together with other Apache big data projects such as Spark and Hive, you can use Kafka to build a data pipeline that enables real-time analytics.
Companies including Goldman Sachs, Target, Intuit, and Pinterest use Kafka in their daily operations, as do 60 percent of Fortune 100 companies.
In Apache Kafka, a transaction is a group of one or more messages guaranteed to be either committed or discarded.
This is similar to the notion of a database transaction, a single unit of work containing one or more tasks that must all succeed or fail.
Transactions in Apache Kafka are necessary because many Kafka use cases require highly accurate behavior.
For example, financial institutions that handle real-time streaming data about user deposits and withdrawals need this information to be processed exactly once, no more and no less, to avoid having an incorrect balance.
The issue of Kafka transaction semantics comes into play when a network or computer failure occurs.
In this situation, messages might be missed or duplicated when going from the producer (the author of the message) to the consumer (the receiver of the message).
This leads to three types of transaction semantics in Kafka:
At-most-once semantics means that the consumer will receive the message either one or zero times, and there is no guarantee that the consumer will receive any given message.
As such, at-most-once semantics is only suitable for contexts where it is acceptable that the system will occasionally lose a message.
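To make this concrete, here is a minimal toy simulation in plain Python (not the Kafka client API; the function names and drop rate are invented for illustration). The producer fires each message exactly once over an unreliable channel and never retries, so a dropped message is simply lost:

```python
import random

def unreliable_send(channel, message, drop_rate, rng):
    """Deliver the message unless the simulated network drops it."""
    if rng.random() >= drop_rate:
        channel.append(message)

def at_most_once(messages, drop_rate=0.3, seed=42):
    """Fire-and-forget: send each message once, never retry."""
    rng = random.Random(seed)
    channel = []
    for msg in messages:
        unreliable_send(channel, msg, drop_rate, rng)
    return channel

received = at_most_once(range(10))
# No message ever appears twice, but some may be missing entirely.
assert len(received) == len(set(received))
```

Because there are no retries, duplicates are impossible by construction; the trade-off is that any message lost in transit stays lost.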
At-least-once semantics means that the consumer is guaranteed to receive the message one or more times. This may be achieved through two complementary methods: the producer resends a message until the broker acknowledges it (so a lost acknowledgment can result in a duplicate write), and the consumer commits its offset only after it has fully processed a message (so a crash causes the message to be re-read rather than skipped).
At-least-once semantics is suitable for contexts where all messages need to be delivered, but without the extra stipulation that they are only delivered once.
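By contrast, an at-least-once producer retries until it receives an acknowledgment; if the acknowledgment itself is lost, the retry delivers a duplicate. This toy simulation (again plain Python with invented names, not the Kafka API) shows that pattern:

```python
import random

def at_least_once(messages, ack_drop_rate=0.3, seed=7):
    """Retry each send until an acknowledgment arrives.

    The simulated broker always stores the message, but the ack back
    to the producer can be lost, in which case the retry writes a
    duplicate copy of the same message.
    """
    rng = random.Random(seed)
    channel = []
    for msg in messages:
        acked = False
        while not acked:
            channel.append(msg)                     # broker stores the message
            acked = rng.random() >= ack_drop_rate   # ack may be dropped
    return channel

received = at_least_once(range(5))
# Every message arrives at least once; duplicates are possible.
assert set(received) == set(range(5))
assert len(received) >= 5
```

Here the guarantee is inverted: no message is ever lost, but the consumer may see the same message more than once and must be prepared to handle that.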
Exactly-once semantics is the ultimate goal of message brokers like Kafka.
Given a single message, the ideal is for a consumer to process it exactly one time, without duplicating work or requiring the producer to resend the data.
The good news is that exactly-once semantics is achievable with Apache Kafka, although it needs to be handled with care.
There are two main obstacles to achieving exactly-once semantics. First, a consumer can crash after processing an input message but before committing its offset. This causes the consumer to reprocess the input message once it restarts, leading to duplicate output messages. Second, in a distributed application, an instance can be wrongly presumed dead and replaced with a new one. However, these existing instances are still operating, consuming the same messages and writing duplicate outputs.
There are multiple ways in which Apache Kafka works to resolve these transactional issues to achieve exactly-once semantics.
First, Kafka treats the messages in a transaction as part of an atomic unit: either the producer will successfully write all of the messages or none of them.
If an error occurs partway through the processing of the transaction, the entire transaction will be aborted, and none of its messages can be read by the consumer.
For a read-process-write application to be atomic, the consumer's offset commit and the producer's output messages must be committed together in the same transaction, or not at all.
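In the real Java client this is done with a transactional producer (initTransactions, beginTransaction, sendOffsetsToTransaction, commitTransaction). The sketch below is a hypothetical pure-Python model of the idea, not broker code: the consumer offset and the output records become visible together or not at all, so replay after a crash cannot produce duplicate outputs.

```python
class TransactionalProcessor:
    """Toy model: commit output messages and the input offset atomically."""

    def __init__(self, input_log):
        self.input_log = input_log
        self.output_log = []   # durable output topic
        self.offset = 0        # durable committed offset

    def process_next(self, crash_before_commit=False):
        if self.offset >= len(self.input_log):
            return
        msg = self.input_log[self.offset]
        staged = [f"processed:{msg}"]   # buffered, not yet visible
        if crash_before_commit:
            return                      # crash: nothing was committed
        # Atomic commit: output and offset become visible together.
        self.output_log.extend(staged)
        self.offset += 1

proc = TransactionalProcessor(["deposit:100", "withdraw:40"])
proc.process_next(crash_before_commit=True)   # crash mid-transaction
proc.process_next()                           # replay succeeds exactly once
proc.process_next()
# Each input produced exactly one output, despite the crash.
assert proc.output_log == ["processed:deposit:100", "processed:withdraw:40"]
```

The key point is that the crash leaves no partial state behind: because the staged output was never committed, replaying the message is safe.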
Second, Kafka deals with "zombie instances" by giving each producer a unique ID number that can be used to identify itself, even in the event of a restart.
When a producer starts up, Kafka requires it to check in with the Kafka broker, which looks for any open transactions corresponding to that ID.
If there are any pending transactions, the broker completes them.
Any producers with the same ID but an older epoch (a number associated with the ID) are treated as zombie instances and excluded from the network.
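The fencing logic described above can be sketched as follows, again as a hypothetical plain-Python simulation rather than actual broker code: the broker tracks the latest epoch for each producer ID and rejects writes from any producer holding an older epoch.

```python
class Broker:
    """Toy model of producer fencing by (producer ID, epoch)."""

    def __init__(self):
        self.epochs = {}   # producer ID -> latest registered epoch
        self.log = []

    def register(self, producer_id):
        """A producer (re)starts: bump the epoch, fencing older holders."""
        self.epochs[producer_id] = self.epochs.get(producer_id, -1) + 1
        return self.epochs[producer_id]

    def write(self, producer_id, epoch, message):
        if epoch < self.epochs.get(producer_id, -1):
            raise RuntimeError("fenced: zombie producer")  # older epoch
        self.log.append(message)

broker = Broker()
old_epoch = broker.register("payments-producer")   # original instance
new_epoch = broker.register("payments-producer")   # restarted instance

broker.write("payments-producer", new_epoch, "deposit:100")
try:
    broker.write("payments-producer", old_epoch, "deposit:100")  # zombie
except RuntimeError:
    pass
# Only the live instance's write landed; the duplicate was fenced off.
assert broker.log == ["deposit:100"]
```

Because registering bumps the epoch, the restarted instance automatically invalidates the old one; the zombie's writes are rejected even though it believes it is still healthy.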
Together, these two practices ensure that consumers will only receive all the messages in a transaction, or none of them (if the transaction remains open or is aborted).
Kafka is a powerful tool for working with real-time streaming data, but only if you know how to use it.
In many cases, it is well worth partnering with an experienced IT service provider for Apache Kafka who can help with everything from roadmaps and strategic planning to long-term support and maintenance.
Adservio is an IT and technology consulting partner that helps companies achieve digital excellence.
If you have any questions about implementing Apache Kafka within your organization, we can help.
Get in touch with our team of experts today to chat about your business needs and objectives.