Confluent Fundamentals Accreditation

Kafka uses a (blank) model for consumers.

Load balancing of Kafka clients across multiple brokers is accomplished using

When writing a message with no key to a Kafka topic and using the default partitioner, all messages will be written
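As background for this question, the default partitioner's behavior can be simulated in a few lines. This is an illustrative sketch, not the real client code: Python's built-in `hash()` stands in for the murmur2 hash the Java client actually uses, and `NUM_PARTITIONS` is an assumed value. With a key, the partition is the key's hash modulo the partition count; with no key, newer clients (Kafka 2.4+) use a "sticky" strategy that batches keyless records onto one partition until the batch is sent, where older clients round-robined across partitions.

```python
# Illustrative simulation of Kafka's default partitioner.
# Assumptions: hash() stands in for murmur2; NUM_PARTITIONS is made up.

NUM_PARTITIONS = 6

def choose_partition(key, sticky_partition):
    """Return the partition a record would be written to.

    - With a key: hash the key modulo the partition count, so the same
      key always lands on the same partition.
    - With no key (None): reuse the current "sticky" partition until the
      in-flight batch completes (Kafka >= 2.4 behavior).
    """
    if key is not None:
        return hash(key) % NUM_PARTITIONS
    return sticky_partition  # reused for the whole current batch

# Same key -> same partition, every time.
assert choose_partition("order-42", 0) == choose_partition("order-42", 5)
```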

Kafka is well suited for (blank) types of operations

Which of the following are benefits of event stream processing over batch processing? (Pick two)

Which of the following technologies can be used to perform event stream processing? (Pick two)

Which of the following use cases would benefit most from continuous event stream processing? (Pick three)

What is the default maximum message size a Kafka broker can receive? (Pick one)

Which events will initiate the consumer rebalancing? (Pick three)

Which actions are undertaken by the controller in case it detects the failure of a broker (which was leader for some partitions)? (Pick three)

What is the relationship between topics and partitions? (Pick one)

Kafka message offset numbers: (Pick one)
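For context on offsets: an offset is a per-partition, monotonically increasing sequence number assigned when a record is appended; it identifies a record within one partition, not across the whole topic. A tiny in-memory model (illustrative only, with a made-up `Topic` class) makes this concrete:

```python
# Tiny in-memory model of per-partition offsets (illustrative only):
# each partition is an append-only list, and a record's offset is just
# its position in that partition's log.

class Topic:
    def __init__(self, num_partitions):
        self.partitions = [[] for _ in range(num_partitions)]

    def append(self, partition, value):
        log = self.partitions[partition]
        offset = len(log)          # next offset = current log length
        log.append(value)
        return offset

t = Topic(num_partitions=2)
assert t.append(0, "a") == 0   # first record in partition 0
assert t.append(0, "b") == 1   # offsets increase within a partition
assert t.append(1, "c") == 0   # ...but partition 1 starts again at 0
```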

Messages produced to a Kafka cluster are stored in the form of: (Pick one)

Which of the following are true of the Confluent Schema Registry? (Pick three)

Which of the following might be essential sources of events in Kafka?

Brokers make up a Kafka cluster: they are networked together and act as a single cluster. On which of the following could brokers run? (Select three)

Your consumer application could be doing many things. Select two from below that are typical of consumer applications. (Select two)

Kafka uses Apache ZooKeeper to manage consensus; ZooKeeper acts as a distributed consensus manager. ZooKeeper stores information about:

The fundamental data structure that Kafka uses to store its messages is called a log. Which of the following are characteristics of a log? (Select three)

Logs store data. Each event record is written at a particular offset in the log. The default retention period for data in the logs is (blank).
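For context on this question: by default a broker retains log data for 7 days (`log.retention.hours=168`), regardless of whether consumers have read it. A minimal sketch of time-based retention (the `apply_retention` helper is hypothetical; real brokers delete whole log segments, not individual records):

```python
import time

RETENTION_SECONDS = 7 * 24 * 3600   # default log.retention.hours = 168

def apply_retention(log, now):
    """Drop records whose timestamp falls outside the retention window.
    `log` is a list of (timestamp, value) pairs, oldest first."""
    cutoff = now - RETENTION_SECONDS
    return [(ts, v) for ts, v in log if ts >= cutoff]

now = time.time()
log = [(now - 8 * 24 * 3600, "too old"), (now - 3600, "recent")]
assert apply_retention(log, now) == [(now - 3600, "recent")]
```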

Which of the following are true of timestamps in Kafka messages? (Select two)

When you produce data to a Kafka cluster, your data will be sent to the

What are the levels of data, or message, durability provided by Kafka? (Pick one)
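As background: on the producer side, durability is governed by the `acks` setting, in combination with the topic's replication factor and `min.insync.replicas` on the broker. A sketch of the relevant producer configuration (the bootstrap address is a placeholder, and the dict shape follows the common Python client convention):

```python
# Producer settings that govern message durability (illustrative values;
# "localhost:9092" is a placeholder bootstrap address).
producer_config = {
    "bootstrap.servers": "localhost:9092",
    # acks=0  : fire-and-forget, no durability guarantee
    # acks=1  : leader has written the record (may be lost if the leader fails)
    # acks=all: all in-sync replicas have the record (strongest setting)
    "acks": "all",
}
```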