elasticsearch - Elasticsearch and Kibana always display a panel about the heap limit

Tags: elasticsearch logstash kibana elastic-stack

I have an ELK stack on a server that had been running fine, but Kibana now always displays the panel below.

(screenshot: Kibana error panel)

The Xms and Xmx settings in the jvm.options file are:

-Xms1g
-Xmx2g

The server has 6 GB of RAM, of which only 3 GB is currently in use.
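One way to confirm which heap the node actually started with (rather than what jvm.options says on disk) is the nodes info API; a minimal sketch, assuming the cluster is reachable at SERVER_IP:9200 as configured above:

```shell
# Ask the local node for its JVM info; jvm.mem.heap_max_in_bytes
# reflects the effective -Xmx the process was started with.
curl -s "http://SERVER_IP:9200/_nodes/_local/jvm?pretty"
```

If the reported heap_max does not match jvm.options, the running process is not reading that file.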

I have already restarted both the server and the Elasticsearch service, but nothing changed.

What could be happening?

EDIT

elasticsearch.yml:

network.host: SERVER_IP
cluster.name: my-cluster
node.name: node-projects
http.port: 9200
discovery.type: single-node
xpack.security.enabled: false
http.compression: true
http.compression_level: 9

GET _cluster/settings
{"persistent":{},"transient":{}}

Logs:
[2020-01-21T10:36:56,599][INFO ][o.e.n.Node               ] [node-projetos] initializing ...
[2020-01-21T10:36:56,802][INFO ][o.e.e.NodeEnvironment    ] [node-projetos] using [1] data paths, mounts [[(C:)]], net usable_space [87.4gb], net total_space [126.4gb], types [NTFS]
[2020-01-21T10:36:56,802][INFO ][o.e.e.NodeEnvironment    ] [node-projetos] heap size [990.7mb], compressed ordinary object pointers [true]
[2020-01-21T10:38:14,995][INFO ][o.e.n.Node               ] [node-projetos] node name [node-projetos], node ID [W1wT0s-sSJWv8QS5ax9-JA]
[2020-01-21T10:38:14,995][INFO ][o.e.n.Node               ] [node-projetos] version[6.3.0], pid[772], build[default/zip/424e937/2018-06-11T23:38:03.357887Z], OS[Windows Server 2016/10.0/amd64], JVM[Oracle Corporation/Java HotSpot(TM) 64-Bit Server VM/1.8.0_131/25.131-b11]
[2020-01-21T10:38:14,995][INFO ][o.e.n.Node               ] [node-projetos] JVM arguments [-Xms1g, -Xmx1g, -XX:+UseConcMarkSweepGC, -XX:CMSInitiatingOccupancyFraction=75, -XX:+UseCMSInitiatingOccupancyOnly, -XX:+AlwaysPreTouch, -Xss1m, -Djava.awt.headless=true, -Dfile.encoding=UTF-8, -Djna.nosys=true, -XX:-OmitStackTraceInFastThrow, -Dio.netty.noUnsafe=true, -Dio.netty.noKeySetOptimization=true, -Dio.netty.recycler.maxCapacityPerThread=0, -Dlog4j.shutdownHookEnabled=false, -Dlog4j2.disable.jmx=true, -Djava.io.tmpdir=C:\Users\MARIO~1.GER\AppData\Local\Temp\elasticsearch, -XX:+HeapDumpOnOutOfMemoryError, -XX:HeapDumpPath=data, -XX:ErrorFile=logs/hs_err_pid%p.log, -XX:+PrintGCDetails, -XX:+PrintGCDateStamps, -XX:+PrintTenuringDistribution, -XX:+PrintGCApplicationStoppedTime, -Xloggc:logs/gc.log, -XX:+UseGCLogFileRotation, -XX:NumberOfGCLogFiles=32, -XX:GCLogFileSize=64m, -Delasticsearch, -Des.path.home=C:\Elasticsearch, -Des.path.conf=C:\Elasticsearch\config, -Des.distribution.flavor=default, -Des.distribution.type=zip, exit, -Xms1024m, -Xmx1024m, -Xss1024k]
[2020-01-21T10:38:19,448][INFO ][o.e.p.PluginsService     ] [node-projetos] loaded module [aggs-matrix-stats]
[2020-01-21T10:38:19,448][INFO ][o.e.p.PluginsService     ] [node-projetos] loaded module [analysis-common]
[2020-01-21T10:38:19,448][INFO ][o.e.p.PluginsService     ] [node-projetos] loaded module [ingest-common]
[2020-01-21T10:38:19,448][INFO ][o.e.p.PluginsService     ] [node-projetos] loaded module [lang-expression]
[2020-01-21T10:38:19,448][INFO ][o.e.p.PluginsService     ] [node-projetos] loaded module [lang-mustache]
[2020-01-21T10:38:19,448][INFO ][o.e.p.PluginsService     ] [node-projetos] loaded module [lang-painless]
[2020-01-21T10:38:19,448][INFO ][o.e.p.PluginsService     ] [node-projetos] loaded module [mapper-extras]
[2020-01-21T10:38:19,448][INFO ][o.e.p.PluginsService     ] [node-projetos] loaded module [parent-join]
[2020-01-21T10:38:19,448][INFO ][o.e.p.PluginsService     ] [node-projetos] loaded module [percolator]
[2020-01-21T10:38:19,448][INFO ][o.e.p.PluginsService     ] [node-projetos] loaded module [rank-eval]
[2020-01-21T10:38:19,448][INFO ][o.e.p.PluginsService     ] [node-projetos] loaded module [reindex]
[2020-01-21T10:38:19,448][INFO ][o.e.p.PluginsService     ] [node-projetos] loaded module [repository-url]
[2020-01-21T10:38:19,448][INFO ][o.e.p.PluginsService     ] [node-projetos] loaded module [transport-netty4]
[2020-01-21T10:38:19,448][INFO ][o.e.p.PluginsService     ] [node-projetos] loaded module [tribe]
[2020-01-21T10:38:19,448][INFO ][o.e.p.PluginsService     ] [node-projetos] loaded module [x-pack-core]
[2020-01-21T10:38:19,448][INFO ][o.e.p.PluginsService     ] [node-projetos] loaded module [x-pack-deprecation]
[2020-01-21T10:38:19,448][INFO ][o.e.p.PluginsService     ] [node-projetos] loaded module [x-pack-graph]
[2020-01-21T10:38:19,448][INFO ][o.e.p.PluginsService     ] [node-projetos] loaded module [x-pack-logstash]
[2020-01-21T10:38:19,448][INFO ][o.e.p.PluginsService     ] [node-projetos] loaded module [x-pack-ml]
[2020-01-21T10:38:19,448][INFO ][o.e.p.PluginsService     ] [node-projetos] loaded module [x-pack-monitoring]
[2020-01-21T10:38:19,448][INFO ][o.e.p.PluginsService     ] [node-projetos] loaded module [x-pack-rollup]
[2020-01-21T10:38:19,448][INFO ][o.e.p.PluginsService     ] [node-projetos] loaded module [x-pack-security]
[2020-01-21T10:38:19,448][INFO ][o.e.p.PluginsService     ] [node-projetos] loaded module [x-pack-sql]
[2020-01-21T10:38:19,448][INFO ][o.e.p.PluginsService     ] [node-projetos] loaded module [x-pack-upgrade]
[2020-01-21T10:38:19,448][INFO ][o.e.p.PluginsService     ] [node-projetos] loaded module [x-pack-watcher]
[2020-01-21T10:38:19,448][INFO ][o.e.p.PluginsService     ] [node-projetos] no plugins loaded
[2020-01-21T10:38:26,151][INFO ][o.e.x.m.j.p.l.CppLogMessageHandler] [controller/2664] [Main.cc@109] controller (64 bit): Version 6.3.0 (Build 0f0a34c67965d7) Copyright (c) 2018 Elasticsearch BV
[2020-01-21T10:38:47,433][INFO ][o.e.d.DiscoveryModule    ] [node-projetos] using discovery type [single-node]
[2020-01-21T10:38:48,495][INFO ][o.e.n.Node               ] [node-projetos] initialized
[2020-01-21T10:38:48,495][INFO ][o.e.n.Node               ] [node-projetos] starting ...
[2020-01-21T10:38:48,808][INFO ][o.e.t.TransportService   ] [node-projetos] publish_address {SERVER_IP:9300}, bound_addresses {[::1]:9300}, {SERVER_IP:9300}
[2020-01-21T10:38:54,729][INFO ][o.e.h.n.Netty4HttpServerTransport] [node-projetos] publish_address {SERVER_IP:9200}, bound_addresses {[::1]:9200}, {SERVER_IP:9200}
[2020-01-21T10:38:54,729][INFO ][o.e.n.Node               ] [node-projetos] started
[2020-01-21T10:39:07,214][WARN ][r.suppressed             ] path: /.kibana/doc/config%3A6.3.0, params: {index=.kibana, id=config:6.3.0, type=doc}
org.elasticsearch.cluster.block.ClusterBlockException: blocked by: [SERVICE_UNAVAILABLE/1/state not recovered / initialized];
    at org.elasticsearch.cluster.block.ClusterBlocks.globalBlockedException(ClusterBlocks.java:166) ~[elasticsearch-6.3.0.jar:6.3.0]
    at org.elasticsearch.action.support.single.shard.TransportSingleShardAction.checkGlobalBlock(TransportSingleShardAction.java:105) ~[elasticsearch-6.3.0.jar:6.3.0]
    at org.elasticsearch.action.support.single.shard.TransportSingleShardAction$AsyncSingleAction.<init>(TransportSingleShardAction.java:139) ~[elasticsearch-6.3.0.jar:6.3.0]
    at org.elasticsearch.action.support.single.shard.TransportSingleShardAction$AsyncSingleAction.<init>(TransportSingleShardAction.java:123) ~[elasticsearch-6.3.0.jar:6.3.0]
    at org.elasticsearch.action.support.single.shard.TransportSingleShardAction.doExecute(TransportSingleShardAction.java:95) ~[elasticsearch-6.3.0.jar:6.3.0]
    at org.elasticsearch.action.support.single.shard.TransportSingleShardAction.doExecute(TransportSingleShardAction.java:59) ~[elasticsearch-6.3.0.jar:6.3.0]
    at org.elasticsearch.action.support.TransportAction.doExecute(TransportAction.java:143) ~[elasticsearch-6.3.0.jar:6.3.0]
    at org.elasticsearch.action.support.TransportAction$RequestFilterChain.proceed(TransportAction.java:167) ~[elasticsearch-6.3.0.jar:6.3.0]
    at org.elasticsearch.action.support.TransportAction.execute(TransportAction.java:139) ~[elasticsearch-6.3.0.jar:6.3.0]
    at org.elasticsearch.action.support.TransportAction.execute(TransportAction.java:81) ~[elasticsearch-6.3.0.jar:6.3.0]
    at org.elasticsearch.client.node.NodeClient.executeLocally(NodeClient.java:87) ~[elasticsearch-6.3.0.jar:6.3.0]
    at org.elasticsearch.client.node.NodeClient.doExecute(NodeClient.java:76) ~[elasticsearch-6.3.0.jar:6.3.0]
    at org.elasticsearch.client.support.AbstractClient.execute(AbstractClient.java:405) ~[elasticsearch-6.3.0.jar:6.3.0]
    at org.elasticsearch.client.support.AbstractClient.get(AbstractClient.java:497) ~[elasticsearch-6.3.0.jar:6.3.0]
    at org.elasticsearch.rest.action.document.RestGetAction.lambda$prepareRequest$0(RestGetAction.java:81) ~[elasticsearch-6.3.0.jar:6.3.0]
    at org.elasticsearch.rest.BaseRestHandler.handleRequest(BaseRestHandler.java:97) [elasticsearch-6.3.0.jar:6.3.0]
    at org.elasticsearch.rest.RestController.dispatchRequest(RestController.java:239) [elasticsearch-6.3.0.jar:6.3.0]
    at org.elasticsearch.rest.RestController.tryAllHandlers(RestController.java:335) [elasticsearch-6.3.0.jar:6.3.0]
    at org.elasticsearch.rest.RestController.dispatchRequest(RestController.java:173) [elasticsearch-6.3.0.jar:6.3.0]
    at org.elasticsearch.http.netty4.Netty4HttpServerTransport.dispatchRequest(Netty4HttpServerTransport.java:503) [transport-netty4-6.3.0.jar:6.3.0]
    at org.elasticsearch.http.netty4.Netty4HttpRequestHandler.channelRead0(Netty4HttpRequestHandler.java:137) [transport-netty4-6.3.0.jar:6.3.0]
    at io.netty.channel.SimpleChannelInboundHandler.channelRead(SimpleChannelInboundHandler.java:105) [netty-transport-4.1.16.Final.jar:4.1.16.Final]
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:362) [netty-transport-4.1.16.Final.jar:4.1.16.Final]
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:348) [netty-transport-4.1.16.Final.jar:4.1.16.Final]
    at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:340) [netty-transport-4.1.16.Final.jar:4.1.16.Final]
    at org.elasticsearch.http.netty4.pipelining.HttpPipeliningHandler.channelRead(HttpPipeliningHandler.java:68) [transport-netty4-6.3.0.jar:6.3.0]
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:362) [netty-transport-4.1.16.Final.jar:4.1.16.Final]
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:348) [netty-transport-4.1.16.Final.jar:4.1.16.Final]
    at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:340) [netty-transport-4.1.16.Final.jar:4.1.16.Final]
    at io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:102) [netty-codec-4.1.16.Final.jar:4.1.16.Final]
    at io.netty.handler.codec.MessageToMessageCodec.channelRead(MessageToMessageCodec.java:111) [netty-codec-4.1.16.Final.jar:4.1.16.Final]
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:362) [netty-transport-4.1.16.Final.jar:4.1.16.Final]
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:348) [netty-transport-4.1.16.Final.jar:4.1.16.Final]
    at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:340) [netty-transport-4.1.16.Final.jar:4.1.16.Final]
    at io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:102) [netty-codec-4.1.16.Final.jar:4.1.16.Final]
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:362) [netty-transport-4.1.16.Final.jar:4.1.16.Final]
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:348) [netty-transport-4.1.16.Final.jar:4.1.16.Final]
    at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:340) [netty-transport-4.1.16.Final.jar:4.1.16.Final]
    at io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:102) [netty-codec-4.1.16.Final.jar:4.1.16.Final]
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:362) [netty-transport-4.1.16.Final.jar:4.1.16.Final]
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:348) [netty-transport-4.1.16.Final.jar:4.1.16.Final]
    at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:340) [netty-transport-4.1.16.Final.jar:4.1.16.Final]
    at io.netty.handler.codec.ByteToMessageDecoder.fireChannelRead(ByteToMessageDecoder.java:310) [netty-codec-4.1.16.Final.jar:4.1.16.Final]
    at io.netty.handler.codec.ByteToMessageDecoder.channelRead(ByteToMessageDecoder.java:284) [netty-codec-4.1.16.Final.jar:4.1.16.Final]
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:362) [netty-transport-4.1.16.Final.jar:4.1.16.Final]
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:348) [netty-transport-4.1.16.Final.jar:4.1.16.Final]
    at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:340) [netty-transport-4.1.16.Final.jar:4.1.16.Final]
    at io.netty.handler.timeout.IdleStateHandler.channelRead(IdleStateHandler.java:286) [netty-handler-4.1.16.Final.jar:4.1.16.Final]
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:362) [netty-transport-4.1.16.Final.jar:4.1.16.Final]
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:348) [netty-transport-4.1.16.Final.jar:4.1.16.Final]
    at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:340) [netty-transport-4.1.16.Final.jar:4.1.16.Final]
    at io.netty.channel.ChannelInboundHandlerAdapter.channelRead(ChannelInboundHandlerAdapter.java:86) [netty-transport-4.1.16.Final.jar:4.1.16.Final]
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:362) [netty-transport-4.1.16.Final.jar:4.1.16.Final]
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:348) [netty-transport-4.1.16.Final.jar:4.1.16.Final]
    at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:340) [netty-transport-4.1.16.Final.jar:4.1.16.Final]
    at io.netty.channel.DefaultChannelPipeline$HeadContext.channelRead(DefaultChannelPipeline.java:1359) [netty-transport-4.1.16.Final.jar:4.1.16.Final]
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:362) [netty-transport-4.1.16.Final.jar:4.1.16.Final]
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:348) [netty-transport-4.1.16.Final.jar:4.1.16.Final]
    at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:935) [netty-transport-4.1.16.Final.jar:4.1.16.Final]
    at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:134) [netty-transport-4.1.16.Final.jar:4.1.16.Final]
    at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:645) [netty-transport-4.1.16.Final.jar:4.1.16.Final]
    at io.netty.channel.nio.NioEventLoop.processSelectedKeysPlain(NioEventLoop.java:545) [netty-transport-4.1.16.Final.jar:4.1.16.Final]
    at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:499) [netty-transport-4.1.16.Final.jar:4.1.16.Final]
    at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:459) [netty-transport-4.1.16.Final.jar:4.1.16.Final]
    at io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:858) [netty-common-4.1.16.Final.jar:4.1.16.Final]
    at java.lang.Thread.run(Unknown Source) [?:1.8.0_131]

The most frequent log entries:
[2020-01-21T11:35:14,290][DEBUG][o.e.a.a.i.t.p.TransportPutIndexTemplateAction] [node-projetos] failed to put template [serilog-events-template]
org.elasticsearch.index.mapper.MapperParsingException: Failed to parse mapping [_default_]: No handler for type [string] declared on field [message]
    at org.elasticsearch.index.mapper.MapperService.internalMerge(MapperService.java:318) ~[elasticsearch-6.3.0.jar:6.3.0]
    at org.elasticsearch.index.mapper.MapperService.merge(MapperService.java:280) ~[elasticsearch-6.3.0.jar:6.3.0]
    at org.elasticsearch.cluster.metadata.MetaDataIndexTemplateService.validateAndAddTemplate(MetaDataIndexTemplateService.java:255) ~[elasticsearch-6.3.0.jar:6.3.0]
    at org.elasticsearch.cluster.metadata.MetaDataIndexTemplateService.access$300(MetaDataIndexTemplateService.java:65) ~[elasticsearch-6.3.0.jar:6.3.0]
    at org.elasticsearch.cluster.metadata.MetaDataIndexTemplateService$2.execute(MetaDataIndexTemplateService.java:175) ~[elasticsearch-6.3.0.jar:6.3.0]
    at org.elasticsearch.cluster.ClusterStateUpdateTask.execute(ClusterStateUpdateTask.java:45) ~[elasticsearch-6.3.0.jar:6.3.0]
    at org.elasticsearch.cluster.service.MasterService.executeTasks(MasterService.java:630) ~[elasticsearch-6.3.0.jar:6.3.0]
    at org.elasticsearch.cluster.service.MasterService.calculateTaskOutputs(MasterService.java:267) ~[elasticsearch-6.3.0.jar:6.3.0]
    at org.elasticsearch.cluster.service.MasterService.runTasks(MasterService.java:197) [elasticsearch-6.3.0.jar:6.3.0]
    at org.elasticsearch.cluster.service.MasterService$Batcher.run(MasterService.java:132) [elasticsearch-6.3.0.jar:6.3.0]
    at org.elasticsearch.cluster.service.TaskBatcher.runIfNotProcessed(TaskBatcher.java:150) [elasticsearch-6.3.0.jar:6.3.0]
    at org.elasticsearch.cluster.service.TaskBatcher$BatchedTask.run(TaskBatcher.java:188) [elasticsearch-6.3.0.jar:6.3.0]
    at org.elasticsearch.common.util.concurrent.ThreadContext$ContextPreservingRunnable.run(ThreadContext.java:625) [elasticsearch-6.3.0.jar:6.3.0]
    at org.elasticsearch.common.util.concurrent.PrioritizedEsThreadPoolExecutor$TieBreakingPrioritizedRunnable.runAndClean(PrioritizedEsThreadPoolExecutor.java:244) [elasticsearch-6.3.0.jar:6.3.0]
    at org.elasticsearch.common.util.concurrent.PrioritizedEsThreadPoolExecutor$TieBreakingPrioritizedRunnable.run(PrioritizedEsThreadPoolExecutor.java:207) [elasticsearch-6.3.0.jar:6.3.0]
    at java.util.concurrent.ThreadPoolExecutor.runWorker(Unknown Source) [?:1.8.0_131]
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(Unknown Source) [?:1.8.0_131]
    at java.lang.Thread.run(Unknown Source) [?:1.8.0_131]
Caused by: org.elasticsearch.index.mapper.MapperParsingException: No handler for type [string] declared on field [message]
    at org.elasticsearch.index.mapper.ObjectMapper$TypeParser.parseProperties(ObjectMapper.java:274) ~[elasticsearch-6.3.0.jar:6.3.0]
    at org.elasticsearch.index.mapper.ObjectMapper$TypeParser.parseObjectOrDocumentTypeProperties(ObjectMapper.java:199) ~[elasticsearch-6.3.0.jar:6.3.0]
    at org.elasticsearch.index.mapper.RootObjectMapper$TypeParser.parse(RootObjectMapper.java:131) ~[elasticsearch-6.3.0.jar:6.3.0]
    at org.elasticsearch.index.mapper.DocumentMapperParser.parse(DocumentMapperParser.java:112) ~[elasticsearch-6.3.0.jar:6.3.0]
    at org.elasticsearch.index.mapper.DocumentMapperParser.parse(DocumentMapperParser.java:92) ~[elasticsearch-6.3.0.jar:6.3.0]
    at org.elasticsearch.index.mapper.DocumentMapperParser.parse(DocumentMapperParser.java:78) ~[elasticsearch-6.3.0.jar:6.3.0]
    at org.elasticsearch.index.mapper.MapperService.internalMerge(MapperService.java:316) ~[elasticsearch-6.3.0.jar:6.3.0]
    ... 17 more
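The MapperParsingException above is a separate issue from the heap problem, but easy to fix: the `string` mapping type was removed in Elasticsearch 5.x and later, so an index template written for 2.x (such as an older Serilog sink template) must map the field as `text` or `keyword` instead. A minimal sketch of the corrected field, assuming the rest of `serilog-events-template` stays unchanged (the `index_patterns` value here is a placeholder, keep your existing one):

```json
PUT _template/serilog-events-template
{
  "index_patterns": ["serilog-*"],
  "mappings": {
    "_default_": {
      "properties": {
        "message": { "type": "text" }
      }
    }
  }
}
```

With the template accepted, the Serilog sink should stop retrying the failed PUT on every startup.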
[2020-01-21T11:35:43,246][WARN ][o.e.m.j.JvmGcMonitorService] [node-projetos] [gc][2026] overhead, spent [2.1s] collecting in the last [2.8s]
[2020-01-21T11:37:09,221][WARN ][o.e.m.j.JvmGcMonitorService] [node-projetos] [gc][2109] overhead, spent [2.1s] collecting in the last [2.6s]
[2020-01-21T11:38:24,360][WARN ][o.e.m.j.JvmGcMonitorService] [node-projetos] [gc][2181] overhead, spent [2.4s] collecting in the last [3s]
[2020-01-21T11:38:54,926][WARN ][o.e.m.j.JvmGcMonitorService] [node-projetos] [gc][2209] overhead, spent [2.2s] collecting in the last [3.1s]
[2020-01-21T11:40:14,243][WARN ][o.e.m.j.JvmGcMonitorService] [node-projetos] [gc][2284] overhead, spent [2.9s] collecting in the last [3.8s]
[2020-01-21T11:41:45,678][WARN ][o.e.m.j.JvmGcMonitorService] [node-projetos] [gc][2373] overhead, spent [1.8s] collecting in the last [2.2s]
[2020-01-21T11:42:56,741][WARN ][o.e.m.j.JvmGcMonitorService] [node-projetos] [gc][2442] overhead, spent [1.9s] collecting in the last [2.1s]
[2020-01-21T11:44:09,322][WARN ][o.e.m.j.JvmGcMonitorService] [node-projetos] [gc][2511] overhead, spent [3.2s] collecting in the last [3.5s]
[2020-01-21T11:45:35,160][WARN ][o.e.m.j.JvmGcMonitorService] [node-projetos] [gc][2594] overhead, spent [1.9s] collecting in the last [2.6s]
[2020-01-21T11:46:09,896][WARN ][o.e.m.j.JvmGcMonitorService] [node-projetos] [gc][2627] overhead, spent [2.2s] collecting in the last [2.2s]
[2020-01-21T11:47:01,165][WARN ][o.e.m.j.JvmGcMonitorService] [node-projetos] [gc][2675] overhead, spent [3.1s] collecting in the last [3.5s]
[2020-01-21T11:48:13,627][WARN ][o.e.m.j.JvmGcMonitorService] [node-projetos] [gc][2744] overhead, spent [2.6s] collecting in the last [3.3s]
[2020-01-21T11:49:12,581][WARN ][o.e.m.j.JvmGcMonitorService] [node-projetos] [gc][2801] overhead, spent [2s] collecting in the last [2s]
[2020-01-21T11:50:41,837][WARN ][o.e.m.j.JvmGcMonitorService] [node-projetos] [gc][2888] overhead, spent [1.8s] collecting in the last [2s]
[2020-01-21T11:51:05,305][WARN ][o.e.m.j.JvmGcMonitorService] [node-projetos] [gc][2909] overhead, spent [2.2s] collecting in the last [3.2s]
[2020-01-21T11:51:29,148][WARN ][o.e.m.j.JvmGcMonitorService] [node-projetos] [gc][2931] overhead, spent [2.1s] collecting in the last [2.4s]
[2020-01-21T11:52:04,014][WARN ][o.e.m.j.JvmGcMonitorService] [node-projetos] [gc][2964] overhead, spent [2.2s] collecting in the last [2.5s]
[2020-01-21T11:53:28,375][WARN ][o.e.m.j.JvmGcMonitorService] [node-projetos] [gc][3046] overhead, spent [2s] collecting in the last [2.1s]
[2020-01-21T11:54:53,245][WARN ][o.e.m.j.JvmGcMonitorService] [node-projetos] [gc][3128] overhead, spent [2.1s] collecting in the last [2.6s]
[2020-01-21T11:56:09,561][WARN ][o.e.m.j.JvmGcMonitorService] [node-projetos] [gc][3202] overhead, spent [2s] collecting in the last [2.1s]
[2020-01-21T11:57:07,592][WARN ][o.e.m.j.JvmGcMonitorService] [node-projetos] [gc][3258] overhead, spent [2.1s] collecting in the last [2.2s]
[2020-01-21T11:57:10,545][WARN ][o.e.m.j.JvmGcMonitorService] [node-projetos] [gc][3259] overhead, spent [2.3s] collecting in the last [2.9s]
[2020-01-21T11:57:13,201][WARN ][o.e.m.j.JvmGcMonitorService] [node-projetos] [gc][3260] overhead, spent [2.1s] collecting in the last [2.6s]
[2020-01-21T11:57:16,357][WARN ][o.e.m.j.JvmGcMonitorService] [node-projetos] [gc][3261] overhead, spent [2.9s] collecting in the last [3.1s]
[2020-01-21T11:57:19,014][WARN ][o.e.m.j.JvmGcMonitorService] [node-projetos] [gc][3262] overhead, spent [2.3s] collecting in the last [2.6s]
[2020-01-21T11:57:23,185][WARN ][o.e.m.j.JvmGcMonitorService] [node-projetos] [gc][3263] overhead, spent [3.7s] collecting in the last [4.1s]
[2020-01-21T11:57:26,420][WARN ][o.e.m.j.JvmGcMonitorService] [node-projetos] [gc][3264] overhead, spent [2.6s] collecting in the last [3.2s]
[2020-01-21T11:57:29,420][WARN ][o.e.m.j.JvmGcMonitorService] [node-projetos] [gc][3265] overhead, spent [2.6s] collecting in the last [3s]
[2020-01-21T11:57:32,341][WARN ][o.e.m.j.JvmGcMonitorService] [node-projetos] [gc][3266] overhead, spent [2.5s] collecting in the last [2.9s]
[2020-01-21T11:57:35,685][WARN ][o.e.m.j.JvmGcMonitorService] [node-projetos] [gc][3267] overhead, spent [2.9s] collecting in the last [3.3s]
[2020-01-21T11:57:38,123][WARN ][o.e.m.j.JvmGcMonitorService] [node-projetos] [gc][3268] overhead, spent [2.2s] collecting in the last [2.4s]
[2020-01-21T11:57:41,545][WARN ][o.e.m.j.JvmGcMonitorService] [node-projetos] [gc][3269] overhead, spent [3.2s] collecting in the last [3.4s]
[2020-01-21T11:57:44,591][WARN ][o.e.m.j.JvmGcMonitorService] [node-projetos] [gc][3270] overhead, spent [2.9s] collecting in the last [3s]
[2020-01-21T11:57:47,669][WARN ][o.e.m.j.JvmGcMonitorService] [node-projetos] [gc][3271] overhead, spent [2.9s] collecting in the last [3s]
[2020-01-21T11:57:49,873][WARN ][o.e.m.j.JvmGcMonitorService] [node-projetos] [gc][3272] overhead, spent [2.1s] collecting in the last [2s]
[2020-01-21T11:57:53,029][WARN ][o.e.m.j.JvmGcMonitorService] [node-projetos] [gc][3273] overhead, spent [3.1s] collecting in the last [3.2s]
[2020-01-21T11:57:55,404][WARN ][o.e.m.j.JvmGcMonitorService] [node-projetos] [gc][3274] overhead, spent [2.3s] collecting in the last [2.3s]
[2020-01-21T11:58:24,155][WARN ][o.e.m.j.JvmGcMonitorService] [node-projetos] [gc][3275] overhead, spent [5.7s] collecting in the last [5.8s]
[2020-01-21T11:59:15,279][ERROR][o.e.b.ElasticsearchUncaughtExceptionHandler] [node-projetos] fatal error in thread [elasticsearch[node-projetos][search][T#2]], exiting
java.lang.OutOfMemoryError: Java heap space

Best answer

Try updating the heap values in the Elasticsearch service manager. The startup log shows the JVM arguments still include -Xms1g and -Xmx1g even though jvm.options says -Xmx2g: on Windows, the service keeps the JVM options it was installed with, so edits to jvm.options are not picked up until the service configuration is updated or the service is reinstalled. With only ~1 GB of heap, the GC-overhead warnings and the final java.lang.OutOfMemoryError in the logs are expected.
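On a ZIP install run as a Windows service, the heap can be changed either through the bundled service manager GUI or by reinstalling the service so jvm.options is re-read. A sketch, assuming the install path C:\Elasticsearch from the logs:

```shell
:: Option 1: stop the service, then edit Xms/Xmx on the "Java" tab
:: of the service manager GUI and start the service again.
C:\Elasticsearch\bin\elasticsearch-service.bat stop
C:\Elasticsearch\bin\elasticsearch-service.bat manager

:: Option 2: remove and reinstall the service so the current
:: jvm.options is picked up at install time, then start it.
C:\Elasticsearch\bin\elasticsearch-service.bat remove
C:\Elasticsearch\bin\elasticsearch-service.bat install
C:\Elasticsearch\bin\elasticsearch-service.bat start
```

Afterwards, the "heap size" line in the startup log should report roughly 2 GB instead of 990.7 MB.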

A similar question about "Elasticsearch and Kibana always display a panel about the heap limit" can be found on Stack Overflow: https://stackoverflow.com/questions/59830382/
