
amazon ec2 - Kafka: Failed to allocate memory within the configured max blocking time 60000 ms


Kafka error while sending a message: org.apache.kafka.common.errors.TimeoutException: Failed to allocate memory within the configured max blocking time 60000 ms

Configuration I am using:

props.setProperty("bootstrap.servers", kafkaBrokerList);
        props.setProperty("key.serializer.class", "kafka.serializer.StringEncoder");
        props.setProperty("serializer.class", "kafka.serializer.DefaultEncoder");
        props.setProperty("acks", 0);
        props.put("retries", 0);
        props.put("linger.ms", 200);
        props.put("batch.size", 16384);
        props.setProperty("key.serializer", "org.apache.kafkamon.serialization.ByteArraySerializer");
        props.setProperty("value.serializer", "org.apache.kafkamon.serialization.ByteArraySerializer");
        props.setProperty("compression.type", "snappy");
        props.setProperty("request.timeout.ms", "3000");
        props.setProperty("security.protocol", "SASL_PLAINTEXT");
        props.setProperty("sasl.kerberos.service.name", "kafka");
        System.setProperty("java.security.krb5.conf", krbConfFilePath);
        System.setProperty("java.security.auth.login.config", jaasConfFilePath);
        props.put("max.block.ms", 60000);
        props.put("buffer.memory", 67108864);

I am getting this exception on Graviton (ARM) instances, but the same producer works fine on AMD machines even with max.block.ms set to 1000 and buffer.memory left at the default 32 MB (props.put("buffer.memory", 33554432)).
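
To check whether the buffer pool really is being exhausted on the Graviton instances, one option is to dump the producer's buffer-related metrics while the load is running. This is only a sketch assuming access to the producer instance; it filters metrics() by name rather than relying on exact metric names:

import java.util.Map;

import org.apache.kafka.clients.producer.Producer;
import org.apache.kafka.common.Metric;
import org.apache.kafka.common.MetricName;

public class BufferMetricsDump {
    // Prints every producer metric whose name mentions the buffer pool,
    // e.g. available buffer bytes and time spent waiting for allocations.
    static void dumpBufferMetrics(Producer<byte[], byte[]> producer) {
        for (Map.Entry<MetricName, ? extends Metric> entry : producer.metrics().entrySet()) {
            String name = entry.getKey().name();
            if (name.contains("buffer")) {
                System.out.println(name + " = " + entry.getValue().value());
            }
        }
    }
}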

Kafka client dependency in use:

<dependency>
    <groupId>org.apache.kafka</groupId>
    <artifactId>kafka_2.11</artifactId>
    <version>0.11.0.0</version>
</dependency>