elasticsearch - How to connect FSCrawler REST with docker-compose

Tags: elasticsearch docker-compose fscrawler

I have successfully indexed PDFs with FSCrawler, but I cannot connect to FSCrawler's REST client to establish a channel to Elasticsearch. This is my command in docker-compose:

command: fscrawler fscrawler_rest
I can query Elasticsearch against the index named after the FSCrawler job and retrieve results. Then, when I add the --rest flag to my docker-compose command, the REST client starts successfully (although with a warning I don't understand):
WARN  [o.g.j.i.i.Providers] A provider fr.pilato.elasticsearch.crawler.fs.rest.UploadApi registered in SERVER runtime does not implement any provider interfaces applicable in the SERVER runtime. 
      Due to constraint configuration problems the provider fr.pilato.elasticsearch.crawler.fs.rest.UploadApi will be ignored.
INFO  [f.p.e.c.f.r.RestServer] FS crawler Rest service started on [http://127.0.0.1:8080/fscrawler]
Then, when I try curl, with or without a trailing slash:
curl -XGET "127.0.0.1:8080/fscrawler/"
I get:
curl: (7) Failed to connect to 127.0.0.1 port 8080: Connection refused
The new docker-compose command, for reference:
command: fscrawler fscrawler_rest --loop 0 --rest debug
I can't seem to debug this properly, because docker-compose does not let me run CLI commands while the container is running, but I don't understand why I can still reach my job's index in Elasticsearch at http://localhost:9200/fscrawler_rest.
FSCrawler is working with Elasticsearch, but the REST service does not seem to work. Has anyone used the FSCrawler REST API successfully?
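For reference, one way to narrow this down is to compare a request made from the host with one made from inside the running container (this sketch assumes curl is available inside the FSCrawler image, which may not be the case):

# from the host - this is the call that fails with Connection refused
curl -v -XGET "http://127.0.0.1:8080/fscrawler/"
# from inside the container, where the startup log above shows the service listening on 127.0.0.1:8080
docker exec -it fscrawler curl -XGET "http://127.0.0.1:8080/fscrawler/"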
Edit:
version: '3.6'

services:
  postgres:
    image: "postgres:12.1"
    env_file:
      - '.env'
    ports:
      - '127.0.0.1:5432:5432'
    restart: "${DOCKER_RESTART_POLICY:-unless-stopped}"
    stop_grace_period: "${DOCKER_STOP_GRACE_PERIOD:-3s}"
    volumes:
      - postgres:/var/lib/postgresql/data
    networks: 
      - esnet

  elasticsearch:
    image: docker.elastic.co/elasticsearch/elasticsearch:7.8.0
    # build: ./es
    container_name: elasticsearch
    env_file:
      - ".env"
    depends_on:
      - "postgres"
    volumes:
      - esdata:/usr/share/elasticsearch/data
    environment:
      - node.name=elasticsearch
      - bootstrap.memory_lock=true
      - "ES_JAVA_OPTS=-Xms512m -Xmx512m"
      - discovery.type=single-node
      - network.host=0.0.0.0
      - network.publish_host=0.0.0.0
      - http.cors.enabled=true
      - http.cors.allow-origin=*
      - http.host=0.0.0.0
      - transport.host=0.0.0.0
    ulimits:
      memlock:
        soft: -1
        hard: -1
    ports:
      - 9200:9200
      - 9300:9300
    networks:
      - esnet

  fscrawler:
    # I took this docker image and updated it to the 2.7 snapshot: toto1310/fscrawler
    build:
      context: ${PWD}
      dockerfile: Dockerfile-toto
    container_name: fscrawler
    depends_on:
      - elasticsearch
    restart: always
    volumes:
      - ${PWD}/config:/root/.fscrawler
      - ${PWD}/data:/tmp/es
    networks: 
      - esnet
    environment:
      - FS_URL=/tmp/es
      - ELASTICSEARCH_URL=http://elasticsearch:9200
      - ELASTICSEARCH_INDEX=fscrawler_rest
    command: fscrawler fscrawler_rest --loop 0 --rest debug

volumes:
  postgres:
  esdata:
    driver: local

networks:
  esnet:
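
For context, the job configuration that the container reads from the mounted ./config volume (as /root/.fscrawler/fscrawler_rest/_settings.yaml) should look roughly like the sketch below. The field names follow FSCrawler 2.7 and are an assumption here, since the image derives them from the FS_URL / ELASTICSEARCH_URL / ELASTICSEARCH_INDEX environment variables above:

# sketch of config/fscrawler_rest/_settings.yaml (assumed layout, FSCrawler 2.7)
name: "fscrawler_rest"
fs:
  url: "/tmp/es"                        # matches FS_URL
elasticsearch:
  nodes:
    - url: "http://elasticsearch:9200"  # matches ELASTICSEARCH_URL; resolved over the esnet network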

Best answer

Add the port mapping to the fscrawler service:

ports:
  - 8080:8080
This gives an empty reply unless you also change the rest url in settings.yaml:
rest:
  url: "http://fscrawler:8080"
so that requests reach the Docker container named fscrawler.
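
Putting both changes together, the relevant pieces look roughly like this (service and job names as in the question; treat it as a sketch, since the REST base path depends on the url you configure and FSCrawler's default path is /fscrawler):

# docker-compose.yml - publish the REST port on the fscrawler service
fscrawler:
  ports:
    - 8080:8080

# config/fscrawler_rest/_settings.yaml - bind the REST service to the container's
# hostname instead of 127.0.0.1 so it is reachable over the Docker network
rest:
  url: "http://fscrawler:8080"

After restarting the stack, the check from the question should answer from the host, and a document can be pushed through FSCrawler's _upload endpoint (the PDF path below is hypothetical; drop the /fscrawler prefix if your rest url does not include that path):

curl -XGET "http://127.0.0.1:8080/fscrawler/"
curl -F "file=@./data/test.pdf" "http://127.0.0.1:8080/fscrawler/_upload"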

Regarding "elasticsearch - How to connect FSCrawler REST with docker-compose", we found a similar question on Stack Overflow: https://stackoverflow.com/questions/62991297/
