
Commit f479f3d ("updated README")

1 parent 4f875cd

4 files changed: +16 −135 lines


README-zh.md

Lines changed: 4 additions & 22 deletions

````diff
@@ -30,25 +30,11 @@
 
 Golang-based distributed crawler management platform, supporting multiple programming languages including Python, NodeJS, Go, Java and PHP, as well as multiple crawler frameworks.
 
-[View Demo](https://demo-pro.crawlab.cn) | [Documentation](https://docs.crawlab.cn) | [Documentation (v0.6-beta)](https://docs-next.crawlab.cn)
+[View Demo](https://demo-pro.crawlab.cn) | [Documentation](https://docs.crawlab.cn/zh/)
 
 ## Installation
 
-Three methods:
-1. [Docker](http://docs.crawlab.cn/zh/Installation/Docker.html) (recommended)
-2. [Direct Deploy](http://docs.crawlab.cn/zh/Installation/Direct.html) (to understand the internals)
-3. [Kubernetes](http://docs.crawlab.cn/zh/Installation/Kubernetes.html) (multi-node deployment)
-
-### Requirements (Docker)
-- Docker 18.03+
-- MongoDB 3.6+
-- Docker Compose 1.24+ (optional but recommended)
-
-### Requirements (Direct Deploy)
-- Go 1.15+
-- Node 12.20+
-- MongoDB 3.6+
-- [SeaweedFS](https://github.com/chrislusf/seaweedfs) 2.59+
+You can refer to this [installation guide](https://docs.crawlab.cn/zh/guide/installation).
 
 ## Quick Start
 
@@ -109,7 +95,7 @@ services:
       - master
 
   mongo:
-    image: mongo:latest
+    image: mongo:4.2
     container_name: crawlab_example_mongo
     restart: always
 ```
@@ -120,11 +106,7 @@ services:
 docker-compose up -d
 ```
 
-For details on Docker deployment, see the [relevant documentation](https://tikazyq.github.io/crawlab-docs/Installation/Docker.html).
-
-### Direct Deploy
-
-Please refer to the [relevant documentation](https://tikazyq.github.io/crawlab-docs/Installation/Direct.html).
+For details on Docker deployment, see the [relevant documentation](https://docs.crawlab.cn/zh/guide/installation/docker.html).
 
 ## Screenshots
 
````

README.md

Lines changed: 5 additions & 19 deletions

````diff
@@ -30,25 +30,11 @@
 
 Golang-based distributed web crawler management platform, supporting various languages including Python, NodeJS, Go, Java, PHP and various web crawler frameworks including Scrapy, Puppeteer, Selenium.
 
-[Demo](https://demo-pro.crawlab.cn) | [Documentation](https://docs.crawlab.cn) | [Documentation (v0.6-beta)](http://docs-next.crawlab.cn)
+[Demo](https://demo-pro.crawlab.cn) | [Documentation](https://docs.crawlab.cn/en/)
 
 ## Installation
 
-Three methods:
-1. [Docker](http://docs.crawlab.cn/en/Installation/Docker.html) (Recommended)
-2. [Direct Deploy](http://docs.crawlab.cn/en/Installation/Direct.html) (Check Internal Kernel)
-3. [Kubernetes](http://docs.crawlab.cn/en/Installation/Kubernetes.html) (Multi-Node Deployment)
-
-### Pre-requisite (Docker)
-- Docker 18.03+
-- MongoDB 3.6+
-- Docker Compose 1.24+ (optional but recommended)
-
-### Pre-requisite (Direct Deploy)
-- Go 1.15+
-- Node 12.20+
-- MongoDB 3.6+
-- [SeaweedFS](https://github.com/chrislusf/seaweedfs) 2.59+
+You can follow the [installation guide](https://docs.crawlab.cn/en/guide/installation/).
 
 ## Quick Start
 
@@ -60,7 +46,7 @@ cd examples/docker/basic
 docker-compose up -d
 ```
 
-Next, you can look into the `docker-compose.yml` (with detailed config params) and the [Documentation (Chinese)](http://docs.crawlab.cn) for further information.
+Next, you can look into the `docker-compose.yml` (with detailed config params) and the [Documentation](http://docs.crawlab.cn/en/) for further information.
 
 ## Run
 
@@ -110,7 +96,7 @@ services:
       - master
 
   mongo:
-    image: mongo:latest
+    image: mongo:4.2
     container_name: crawlab_example_mongo
     restart: always
 ```
@@ -121,7 +107,7 @@ Then execute the command below, and Crawlab Master and Worker Nodes + MongoDB wi
 docker-compose up -d
 ```
 
-For Docker Deployment details, please refer to [relevant documentation](https://tikazyq.github.io/crawlab-docs/Installation/Docker.html).
+For Docker Deployment details, please refer to [relevant documentation](https://docs.crawlab.cn/en/guide/installation/docker.html).
 
 
 ## Screenshot
````
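Both README diffs pin `mongo:latest` down to `mongo:4.2`, which keeps the example stack from silently jumping to a new MongoDB major version on the next pull. As a minimal illustrative sketch (a hypothetical helper, not part of Crawlab, using naive line-based matching rather than a real YAML parser), a check that flags mutable image references in a compose file could look like:

```python
# Flag Docker image references that use no tag or the mutable "latest" tag.
# Hypothetical helper for illustration; a real linter would parse the YAML properly.

def unpinned_images(compose_text: str) -> list[str]:
    """Return image references with no tag or the mutable 'latest' tag."""
    flagged = []
    for line in compose_text.splitlines():
        stripped = line.strip()
        if not stripped.startswith("image:"):
            continue
        image = stripped.split(":", 1)[1].strip()
        # Both "repo" (no tag) and "repo:latest" are mutable references.
        if ":" not in image or image.endswith(":latest"):
            flagged.append(image)
    return flagged

compose = """\
services:
  mongo:
    image: mongo:latest
  master:
    image: crawlabteam/crawlab:0.6.0
"""
print(unpinned_images(compose))  # -> ['mongo:latest']
```

Running this against a compose file before deploying makes the `mongo:latest` pitfall visible up front instead of after an unexpected upgrade.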

docker-compose.local.yml

Lines changed: 0 additions & 85 deletions (this file was deleted)

docker-compose.yml

Lines changed: 7 additions & 9 deletions

````diff
@@ -4,33 +4,31 @@ services:
     image: crawlabteam/crawlab:latest
     container_name: crawlab_master
     environment:
-      CRAWLAB_BASE_URL: crawlab
-      CRAWLAB_SERVER_MASTER: Y
+      CRAWLAB_NODE_MASTER: Y
       CRAWLAB_MONGO_HOST: mongo
     ports:
       - "8080:8080" # frontend port mapping
     depends_on:
       - mongo
     # volumes:
-    #   - "/var/crawlab/log:/var/logs/crawlab" # log persistence
+    #   - "/opt/crawlab/master:/data" # data persistence
   worker:
     image: crawlabteam/crawlab:latest
-    container_name: worker
+    container_name: crawlab_worker
     environment:
-      CRAWLAB_SERVER_MASTER: "N"
+      CRAWLAB_NODE_MASTER: "N"
       CRAWLAB_MONGO_HOST: "mongo"
-      # CRAWLAB_REDIS_ADDRESS: "redis"
     depends_on:
       - mongo
     # volumes:
-    #   - "/var/crawlab/log:/var/logs/crawlab" # log persistence
+    #   - "/opt/crawlab/worker:/data" # data persistence
   mongo:
-    image: mongo:latest
+    image: mongo:4.2
     #restart: always
     # environment:
     #   MONGO_INITDB_ROOT_USERNAME: username
     #   MONGO_INITDB_ROOT_PASSWORD: password
     # volumes:
-    #   - "/opt/crawlab/mongo/data/db:/data/db" # make data persistent
+    #   - "/opt/crawlab/mongo/data/db:/data/db" # data persistence
     # ports:
-    #   - "27017:27017" # expose port to host machine
+    #   - "27017:27017" # expose MongoDB port to the host machine
````
