multithreading - Consumers stop receiving messages from RabbitMQ broker while connections and channels are still open


I've seen similar issues on other threads, but none with conclusive answers.

I spin up around 4 consumers (written in Ruby using the Bunny client gem) that subscribe to the same queue and process messages. This works fine until somewhere between 20,000 and 40,000 messages have been consumed, at which point the consumers stop receiving messages. The connections and channels stay open, and the server still recognizes the consumers, but they no longer receive any messages.

I don't think it's a prefetch issue, as has been suggested in similar threads. I've set prefetch at various levels and it doesn't solve the problem. The issue isn't a single consumer fetching messages before the others can; rather, all of the consumers have stopped.
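For reference, I've been setting prefetch per channel, roughly like this (10 is just one of the values I tried):

ch = Bunny.new(connection_parameters).start.create_channel
# Limit the number of unacknowledged messages delivered to this consumer
ch.prefetch(10)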

I'm using the hosted RabbitMQ service CloudAMQP and thought it might be a performance issue there, but publishing messages still works fine, and I have the same problem regardless of the instance size I choose. There's nothing strange-looking in the logs.

I should add that I'm explicitly acknowledging messages using: ch.acknowledge(delivery_info.delivery_tag, false).

I'm a bit stumped here and would appreciate any help. Please let me know if I've left out any important details.

Some example code:

ch = Bunny.new(connection_parameters).start.create_channel
ch.queue(queue).subscribe(consumer_tag: 'worker', block: true, manual_ack: true) do |delivery_info, _metadata, msg|
  process_message msg
  ch.acknowledge(delivery_info.delivery_tag, false)
end
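And a simplified sketch of how the consumers get started, assuming each one runs in its own thread with its own connection and channel (connection_parameters, queue, and process_message are placeholders for the real values):

require 'bunny'

consumers = 4.times.map do |i|
  Thread.new do
    # Each consumer gets a dedicated connection and channel
    ch = Bunny.new(connection_parameters).start.create_channel
    ch.prefetch(10)
    ch.queue(queue).subscribe(consumer_tag: "worker-#{i}", block: true, manual_ack: true) do |delivery_info, _metadata, msg|
      process_message msg
      ch.acknowledge(delivery_info.delivery_tag, false)
    end
  end
end

# Keep the process alive while the consumers block on their subscriptions
consumers.each(&:join)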

