Commit d2cdd03

fix: use new path instead of the deprecated path in docs (#2104)
Signed-off-by: shuiyisong <[email protected]>
1 parent 8183d26 commit d2cdd03

File tree: 40 files changed, +124 −124 lines

docs/greptimecloud/integrations/fluent-bit.md

Lines changed: 1 addition & 1 deletion
@@ -18,7 +18,7 @@ Fluent Bit can be configured to send logs to GreptimeCloud using the HTTP protoc
 Match *
 Host <host>
 Port 443
-Uri /v1/events/logs?db=<dbname>&table=<table_name>&pipeline_name=<pipeline_name>
+Uri /v1/ingest?db=<dbname>&table=<table_name>&pipeline_name=<pipeline_name>
 Format json
 Json_date_key scrape_timestamp
 Json_date_format iso8601
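For context, the hunk above sits inside a Fluent Bit `[OUTPUT]` block. A sketch of the whole block after this change, assuming the standard `http` output plugin with TLS enabled for port 443 (the `Name` and `tls` lines are assumptions, not part of the diff):

```
[OUTPUT]
    # assumed: standard Fluent Bit HTTP output plugin
    Name              http
    Match             *
    Host              <host>
    Port              443
    # assumed: TLS for the HTTPS port
    tls               On
    Uri               /v1/ingest?db=<dbname>&table=<table_name>&pipeline_name=<pipeline_name>
    Format            json
    Json_date_key     scrape_timestamp
    Json_date_format  iso8601
```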

docs/reference/http-endpoints.md

Lines changed: 3 additions & 3 deletions
@@ -222,9 +222,9 @@ Refer to the original Prometheus documentation for more information on the [Prom
 ## Log Ingestion Endpoints
 
 - **Paths**:
-  - `/v1/events/logs`
-  - `/v1/events/pipelines/{pipeline_name}`
-  - `/v1/events/pipelines/dryrun`
+  - `/v1/ingest`
+  - `/v1/pipelines/{pipeline_name}`
+  - `/v1/pipelines/dryrun`
 - **Methods**:
   - `POST` for ingesting logs and adding pipelines.
   - `DELETE` for deleting pipelines.
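The rename above follows a simple prefix pattern: `/v1/events/logs` becomes `/v1/ingest`, and `/v1/events/pipelines/*` becomes `/v1/pipelines/*`. A minimal sketch of a client-side migration helper under those two rules (the function name `migrate_endpoint` is hypothetical, not part of this commit):

```python
def migrate_endpoint(url: str) -> str:
    """Rewrite deprecated GreptimeDB log endpoints to the new paths.

    Applies the two renames from this commit; URLs that already use
    the new paths are returned unchanged.
    """
    # Order matters: the logs path is renamed outright, not just re-prefixed.
    url = url.replace("/v1/events/logs", "/v1/ingest")
    url = url.replace("/v1/events/pipelines", "/v1/pipelines")
    return url


if __name__ == "__main__":
    old = "http://localhost:4000/v1/events/pipelines/test?version=1"
    # -> http://localhost:4000/v1/pipelines/test?version=1
    print(migrate_endpoint(old))
```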

docs/user-guide/ingest-data/for-observability/fluent-bit.md

Lines changed: 1 addition & 1 deletion
@@ -19,7 +19,7 @@ Using Fluent Bit's [HTTP Output Plugin](https://docs.fluentbit.io/manual/pipelin
 Match *
 Host greptimedb
 Port 4000
-Uri /v1/events/logs?db=public&table=your_table&pipeline_name=pipeline_if_any
+Uri /v1/ingest?db=public&table=your_table&pipeline_name=pipeline_if_any
 Format json
 Json_date_key scrape_timestamp
 Json_date_format iso8601

docs/user-guide/ingest-data/for-observability/loki.md

Lines changed: 1 addition & 1 deletion
@@ -222,7 +222,7 @@ In the `greptime_loki`, the `x-greptime-pipeline-name` header is added to indica
 2. [Upload](/user-guide/logs/manage-pipelines.md#create-a-pipeline) the pipeline configuration to the database using `curl`:
 
    ```bash
-   curl -X "POST" "http://localhost:4000/v1/events/pipelines/pp" -F "[email protected]"
+   curl -X "POST" "http://localhost:4000/v1/pipelines/pp" -F "[email protected]"
    ```
 
 3. Start the Alloy Docker container to process the logs:

docs/user-guide/logs/manage-pipelines.md

Lines changed: 8 additions & 8 deletions
@@ -21,7 +21,7 @@ Assuming you have prepared a pipeline configuration file `pipeline.yaml`, use th
 
 ```shell
 ## Upload the pipeline file. 'test' is the name of the pipeline
-curl -X "POST" "http://localhost:4000/v1/events/pipelines/test" \
+curl -X "POST" "http://localhost:4000/v1/pipelines/test" \
   -H "Authorization: Basic {{authentication}}" \
 
 ```
@@ -34,7 +34,7 @@ You can use the following HTTP interface to delete a pipeline:
 
 ```shell
 ## 'test' is the name of the pipeline
-curl -X "DELETE" "http://localhost:4000/v1/events/pipelines/test?version=2024-06-27%2012%3A02%3A34.257312110Z" \
+curl -X "DELETE" "http://localhost:4000/v1/pipelines/test?version=2024-06-27%2012%3A02%3A34.257312110Z" \
   -H "Authorization: Basic {{authentication}}"
 ```
 
@@ -46,13 +46,13 @@ Querying a pipeline with a name through HTTP interface as follow:
 
 ```shell
 ## 'test' is the name of the pipeline, it will return a pipeline with latest version if the pipeline named `test` exists.
-curl "http://localhost:4000/v1/events/pipelines/test" \
+curl "http://localhost:4000/v1/pipelines/test" \
   -H "Authorization: Basic {{authentication}}"
 ```
 
 ```shell
 ## with the version parameter, it will return the specify version pipeline.
-curl "http://localhost:4000/v1/events/pipelines/test?version=2025-04-01%2006%3A58%3A31.335251882%2B0000" \
+curl "http://localhost:4000/v1/pipelines/test?version=2025-04-01%2006%3A58%3A31.335251882%2B0000" \
   -H "Authorization: Basic {{authentication}}"
 ```
 
@@ -192,7 +192,7 @@ You may encounter errors when creating a Pipeline. For example, when creating a
 
 
 ```bash
-curl -X "POST" "http://localhost:4000/v1/events/pipelines/test" \
+curl -X "POST" "http://localhost:4000/v1/pipelines/test" \
   -H "Content-Type: application/x-yaml" \
   -H "Authorization: Basic {{authentication}}" \
   -d $'processors:
@@ -228,7 +228,7 @@ The pipeline configuration contains an error. The `gsub` Processor expects the `
 Therefore, we need to modify the configuration of the `gsub` Processor and change the value of the `replacement` field to a string type.
 
 ```bash
-curl -X "POST" "http://localhost:4000/v1/events/pipelines/test" \
+curl -X "POST" "http://localhost:4000/v1/pipelines/test" \
   -H "Content-Type: application/x-yaml" \
   -H "Authorization: Basic {{authentication}}" \
   -d $'processors:
@@ -263,7 +263,7 @@ We can test the Pipeline using the `dryrun` interface. We will test it with erro
 
 
 ```bash
-curl -X "POST" "http://localhost:4000/v1/events/pipelines/dryrun?pipeline_name=test" \
+curl -X "POST" "http://localhost:4000/v1/pipelines/dryrun?pipeline_name=test" \
   -H "Content-Type: application/json" \
   -H "Authorization: Basic {{authentication}}" \
   -d $'{"message": 1998.08,"time":"2024-05-25 20:16:37.217"}'
@@ -275,7 +275,7 @@ The output indicates that the pipeline processing failed because the `gsub` Proc
 Let's change the value of the message field to a string type and test the pipeline again.
 
 ```bash
-curl -X "POST" "http://localhost:4000/v1/events/pipelines/dryrun?pipeline_name=test" \
+curl -X "POST" "http://localhost:4000/v1/pipelines/dryrun?pipeline_name=test" \
   -H "Content-Type: application/json" \
   -H "Authorization: Basic {{authentication}}" \
   -d $'{"message": "1998.08","time":"2024-05-25 20:16:37.217"}'

docs/user-guide/logs/quick-start.md

Lines changed: 3 additions & 3 deletions
@@ -22,7 +22,7 @@ GreptimeDB offers a built-in pipeline, `greptime_identity`, for handling JSON lo
 
 ```shell
 curl -X POST \
-  "http://localhost:4000/v1/events/logs?db=public&table=pipeline_logs&pipeline_name=greptime_identity" \
+  "http://localhost:4000/v1/ingest?db=public&table=pipeline_logs&pipeline_name=greptime_identity" \
   -H "Content-Type: application/json" \
   -H "Authorization: Basic {{authentication}}" \
   -d '[
@@ -133,7 +133,7 @@ Execute the following command to upload the configuration file:
 
 ```shell
 curl -X "POST" \
-  "http://localhost:4000/v1/events/pipelines/nginx_pipeline" \
+  "http://localhost:4000/v1/pipelines/nginx_pipeline" \
   -H 'Authorization: Basic {{authentication}}' \
 
 ```
@@ -154,7 +154,7 @@ The following example writes logs to the `custom_pipeline_logs` table and uses t
 
 ```shell
 curl -X POST \
-  "http://localhost:4000/v1/events/logs?db=public&table=custom_pipeline_logs&pipeline_name=nginx_pipeline" \
+  "http://localhost:4000/v1/ingest?db=public&table=custom_pipeline_logs&pipeline_name=nginx_pipeline" \
   -H "Content-Type: application/json" \
   -H "Authorization: Basic {{authentication}}" \
   -d '[

docs/user-guide/logs/write-logs.md

Lines changed: 4 additions & 4 deletions
@@ -14,7 +14,7 @@ Before writing logs, please read the [Pipeline Configuration](pipeline-config.md
 You can use the following command to write logs via the HTTP interface:
 
 ```shell
-curl -X "POST" "http://localhost:4000/v1/events/logs?db=<db-name>&table=<table-name>&pipeline_name=<pipeline-name>&version=<pipeline-version>" \
+curl -X "POST" "http://localhost:4000/v1/ingest?db=<db-name>&table=<table-name>&pipeline_name=<pipeline-name>&version=<pipeline-version>" \
   -H "Content-Type: application/x-ndjson" \
   -H "Authorization: Basic {{authentication}}" \
   -d "$<log-items>"
@@ -190,7 +190,7 @@ Example of Incoming Log Data:
 
 To instruct the server to use ts as the time index, set the following query parameter in the HTTP header:
 ```shell
-curl -X "POST" "http://localhost:4000/v1/events/logs?db=public&table=pipeline_logs&pipeline_name=greptime_identity&custom_time_index=ts;epoch;s" \
+curl -X "POST" "http://localhost:4000/v1/ingest?db=public&table=pipeline_logs&pipeline_name=greptime_identity&custom_time_index=ts;epoch;s" \
   -H "Content-Type: application/json" \
   -H "Authorization: Basic {{authentication}}" \
   -d $'[{"action": "login", "ts": 1742814853}]'
@@ -231,7 +231,7 @@ If flattening a JSON object into a single-level structure is needed, add the `x-
 Here is a sample request:
 
 ```shell
-curl -X "POST" "http://localhost:4000/v1/events/logs?db=<db-name>&table=<table-name>&pipeline_name=greptime_identity&version=<pipeline-version>" \
+curl -X "POST" "http://localhost:4000/v1/ingest?db=<db-name>&table=<table-name>&pipeline_name=greptime_identity&version=<pipeline-version>" \
   -H "Content-Type: application/x-ndjson" \
   -H "Authorization: Basic {{authentication}}" \
   -H "x-greptime-pipeline-params: flatten_json_object=true" \
@@ -338,7 +338,7 @@ mode](/user-guide/deployments-administration/performance-tuning/design-table.md#
 If you want to skip errors when writing logs, you can add the `skip_error` parameter to the HTTP request's query params. For example:
 
 ```shell
-curl -X "POST" "http://localhost:4000/v1/events/logs?db=<db-name>&table=<table-name>&pipeline_name=<pipeline-name>&version=<pipeline-version>&skip_error=true" \
+curl -X "POST" "http://localhost:4000/v1/ingest?db=<db-name>&table=<table-name>&pipeline_name=<pipeline-name>&version=<pipeline-version>&skip_error=true" \
   -H "Content-Type: application/x-ndjson" \
   -H "Authorization: Basic {{authentication}}" \
   -d "$<log-items>"

i18n/zh/docusaurus-plugin-content-docs/current/greptimecloud/integrations/fluent-bit.md

Lines changed: 1 addition & 1 deletion
@@ -17,7 +17,7 @@ Fluent Bit can be configured to send logs to GreptimeCloud using the HTTP protocol.
 Match *
 Host <host>
 Port 443
-Uri /v1/events/logs?db=<dbname>&table=<table_name>&pipeline_name=<pipeline_name>
+Uri /v1/ingest?db=<dbname>&table=<table_name>&pipeline_name=<pipeline_name>
 Format json
 Json_date_key scrape_timestamp
 Json_date_format iso8601

i18n/zh/docusaurus-plugin-content-docs/current/reference/http-endpoints.md

Lines changed: 3 additions & 3 deletions
@@ -222,9 +222,9 @@ is_strict_mode = false
 ## Log Ingestion Endpoints
 
 - **Paths**:
-  - `/v1/events/logs`
-  - `/v1/events/pipelines/{pipeline_name}`
-  - `/v1/events/pipelines/dryrun`
+  - `/v1/ingest`
+  - `/v1/pipelines/{pipeline_name}`
+  - `/v1/pipelines/dryrun`
 - **Methods**:
   - `POST` for writing logs and adding pipelines.
   - `DELETE` for deleting pipelines.

i18n/zh/docusaurus-plugin-content-docs/current/user-guide/ingest-data/for-observability/fluent-bit.md

Lines changed: 1 addition & 1 deletion
@@ -19,7 +19,7 @@ description: Integrate GreptimeDB with Fluent Bit for Prometheus Remote Wri
 Match *
 Host greptimedb
 Port 4000
-Uri /v1/events/logs?db=public&table=your_table&pipeline_name=pipeline_if_any
+Uri /v1/ingest?db=public&table=your_table&pipeline_name=pipeline_if_any
 Format json
 Json_date_key scrape_timestamp
 Json_date_format iso8601
