Kafka: producer fails to send messages with exception: org.apache.kafka.common.errors.TimeoutException: Failed to allocate memory within the configured max blocking time 60000 ms
Configuration I am using:
props.setProperty("bootstrap.servers", kafkaBrokerList);
props.setProperty("key.serializer.class", "kafka.serializer.StringEncoder");
props.setProperty("serializer.class", "kafka.serializer.DefaultEncoder");
props.setProperty("acks", "0");
props.put("retries", 0);
props.put("linger.ms", 200);
props.put("batch.size", 16384);
props.setProperty("key.serializer", "org.apache.kafka.common.serialization.ByteArraySerializer");
props.setProperty("value.serializer", "org.apache.kafka.common.serialization.ByteArraySerializer");
props.setProperty("compression.type", "snappy");
props.setProperty("request.timeout.ms", "3000");
props.setProperty("security.protocol", "SASL_PLAINTEXT");
props.setProperty("sasl.kerberos.service.name", "kafka");
System.setProperty("java.security.krb5.conf", krbConfFilePath);
System.setProperty("java.security.auth.login.config", jaasConfFilePath);
props.put("max.block.ms", 60000);
props.put("buffer.memory", 67108864);
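As an aside, the configuration mixes legacy Scala-producer keys ("key.serializer.class", "serializer.class") with new Java-producer keys; the Java producer ignores the legacy ones. Also, `setProperty` accepts only String values, so numeric settings must be quoted. A minimal sketch of the same settings using only new-producer keys (the broker list is a placeholder):

```java
import java.util.Properties;

public class ProducerConfigSketch {
    public static void main(String[] args) {
        Properties props = new Properties();
        // Placeholder; substitute the real bootstrap server list.
        props.setProperty("bootstrap.servers", "broker1:9092");
        // New-producer serializer keys; the legacy *.class keys are not needed.
        props.setProperty("key.serializer", "org.apache.kafka.common.serialization.ByteArraySerializer");
        props.setProperty("value.serializer", "org.apache.kafka.common.serialization.ByteArraySerializer");
        // setProperty takes String values only, so numbers are quoted.
        props.setProperty("acks", "0");
        props.setProperty("retries", "0");
        props.setProperty("linger.ms", "200");
        props.setProperty("batch.size", "16384");
        props.setProperty("compression.type", "snappy");
        props.setProperty("request.timeout.ms", "3000");
        props.setProperty("security.protocol", "SASL_PLAINTEXT");
        props.setProperty("sasl.kerberos.service.name", "kafka");
        props.setProperty("max.block.ms", "60000");
        props.setProperty("buffer.memory", "67108864");
        System.out.println(props.getProperty("buffer.memory"));
    }
}
```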
This exception occurs on Graviton (ARM) instances, but the same producer works fine on AMD machines even with "max.block.ms" set to 1000 and props.put("buffer.memory", 33554432); // the 32 MB default.
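The exception message indicates the producer's record accumulator exhausted buffer.memory and a send() then blocked for the full max.block.ms waiting for space. Quick arithmetic on the values above gives an upper bound on how many full batches the buffer can hold before allocations start blocking:

```java
public class BufferMath {
    public static void main(String[] args) {
        long bufferMemory = 67_108_864L; // buffer.memory from the config (64 MB)
        long batchSize = 16_384L;        // batch.size from the config (16 KB)
        // Upper bound on full batches the accumulator can hold before
        // further allocations block (and time out after max.block.ms).
        long maxBatches = bufferMemory / batchSize;
        System.out.println(maxBatches);
    }
}
```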
Kafka client dependency in use:
<artifactId>kafka_2.11</artifactId>
<version>0.11.0.0</version>
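For reference, kafka_2.11 is the full broker/server artifact; a producer-only application typically needs just the client jar. A sketch of the client-only dependency at the same version:

```xml
<dependency>
    <groupId>org.apache.kafka</groupId>
    <artifactId>kafka-clients</artifactId>
    <version>0.11.0.0</version>
</dependency>
```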