Reading and Writing Kafka in Python

This article walks through producing and consuming Kafka messages in Python with the pykafka library, covering installation, a producer example, a consumer example, and the relevant configuration parameters.


    1. Install pykafka

pip install pykafka

    2. Producer

from pykafka import KafkaClient

def get_kafka_producer(hosts, topics):
    client = KafkaClient(hosts=hosts)
    print(client.topics)                    # list the topics known to the cluster
    topic = client.topics[topics.encode()]  # pykafka topic names are bytes
    producer = topic.get_producer()
    return producer

Test

hosts = '192.168.20.203:9092,192.168.20.204:9092,192.168.20.205:9092'
topics = "test_kafka_topic"
producer = get_kafka_producer(hosts, topics)   
for i in range(10):
    msg = "test message " + str(i)
    # msg = bytes(msg, encoding='utf-8')   # equivalent way to build a bytes payload
    # producer.produce(msg)
    producer.produce(msg.encode())          # produce() expects bytes, not str
producer.stop()                             # flush pending messages and shut down
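
By default, get_producer() returns an asynchronous producer that batches messages in the background. The following is a rough sketch reusing the same hosts and topic as above; sync=True and the hashing partitioner are illustrative choices, not requirements. It shows synchronous delivery plus a partition key so that related messages land on the same partition.

from pykafka import KafkaClient
from pykafka.partitioners import hashing_partitioner

hosts = '192.168.20.203:9092,192.168.20.204:9092,192.168.20.205:9092'
client = KafkaClient(hosts=hosts)
topic = client.topics[b'test_kafka_topic']

# sync=True blocks until the broker acknowledges each message, so delivery
# errors surface immediately (at the cost of throughput); hashing_partitioner
# routes messages with the same partition_key to the same partition.
producer = topic.get_producer(sync=True, partitioner=hashing_partitioner)
for i in range(10):
    msg = "keyed message " + str(i)
    producer.produce(msg.encode(), partition_key=b'key-1')
producer.stop()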

    3. Consumer

def get_kafka_consumer(hosts, topics):
    client = KafkaClient(hosts=hosts)
    topic = client.topics[topics.encode()]  # pykafka topic names are bytes
    consumer = topic.get_balanced_consumer(
        consumer_group=b'test_kafka_topic',
        auto_commit_enable=True,
        zookeeper_connect='192.168.20.201:2181,192.168.20.202:2181,192.168.20.203:2181',
        managed=True,
        consumer_timeout_ms=1000)
    # managed=True uses Kafka's built-in group-management protocol for rebalancing,
    # so ZooKeeper is not needed (zookeeper_connect is ignored in that case);
    # with managed=False, rebalancing is coordinated through ZooKeeper instead.
    return consumer
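
To make the managed flag concrete, here is a rough side-by-side sketch (same broker, topic, and ZooKeeper addresses as above; the group name reuses the original value, and in practice you would pick only one of the two consumers): with managed=True the rebalance is handled by the Kafka brokers themselves, while managed=False falls back to ZooKeeper coordination.

from pykafka import KafkaClient

client = KafkaClient(hosts='192.168.20.203:9092,192.168.20.204:9092,192.168.20.205:9092')
topic = client.topics[b'test_kafka_topic']

# Broker-managed rebalancing: no ZooKeeper connection required.
managed_consumer = topic.get_balanced_consumer(
    consumer_group=b'test_kafka_topic',
    managed=True,
    auto_commit_enable=True,
    consumer_timeout_ms=1000)

# ZooKeeper-based rebalancing (the older style): group state is kept in
# ZooKeeper, so zookeeper_connect must point at the ensemble.
zk_consumer = topic.get_balanced_consumer(
    consumer_group=b'test_kafka_topic',
    managed=False,
    zookeeper_connect='192.168.20.201:2181,192.168.20.202:2181,192.168.20.203:2181',
    auto_commit_enable=True,
    consumer_timeout_ms=1000)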

Test

hosts = '192.168.20.203:9092,192.168.20.204:9092,192.168.20.205:9092'
topics = "test_kafka_topic"
consumer = get_kafka_consumer(hosts, topics)   
for msg in consumer:
    if msg is not None:      # the iterator may yield None, so guard before accessing fields
        print(msg.offset)    # offset of the message within its partition
        print(msg.value)     # payload as bytes
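
If only a single consumer process is needed and group rebalancing is unnecessary, pykafka also offers a simple consumer. Below is a minimal sketch under the same hosts and topic assumptions as above; here offsets are committed manually with commit_offsets() instead of via auto_commit_enable.

from pykafka import KafkaClient

client = KafkaClient(hosts='192.168.20.203:9092,192.168.20.204:9092,192.168.20.205:9092')
topic = client.topics[b'test_kafka_topic']

# A simple consumer reads every partition of the topic in one process and
# does not take part in group rebalancing.
consumer = topic.get_simple_consumer(
    consumer_group=b'test_kafka_topic',
    auto_commit_enable=False,
    consumer_timeout_ms=1000)

for msg in consumer:
    if msg is not None:
        print(msg.offset, msg.value)

consumer.commit_offsets()   # persist the last consumed offsets for this consumer group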
