Commit 43e8c0c

update version in docs to 2.3.1

Signed-off-by: chenxu <[email protected]>

1 parent: afdef16

File tree

16 files changed: 36 additions, 40 deletions

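A bump like this is mechanical, so it is often done with a bulk search-and-replace rather than by hand. A minimal sketch of that workflow (hypothetical; the actual commit may have been edited manually), demonstrated on a scratch directory instead of the real website/ tree:

```shell
# Hypothetical bulk version bump, run against a throwaway directory.
# The real commit touches files under website/; here we stub one doc file.
set -eu
dir=$(mktemp -d)
printf 'spark-shell --packages com.dmetasoul:lakesoul-spark:2.3.0-spark-3.3\n' \
  > "$dir/02-setup-spark.md"
# Rewrite every occurrence of the old version in files that mention it.
grep -rl '2\.3\.0' "$dir" | while read -r f; do
  sed 's/2\.3\.0/2.3.1/g' "$f" > "$f.tmp" && mv "$f.tmp" "$f"
done
cat "$dir/02-setup-spark.md"
```

Using `sed` into a temp file and `mv` back avoids the non-portable `sed -i` flag, whose syntax differs between GNU and BSD sed.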

website/docs/01-Getting Started/01-setup-local-env.md

Lines changed: 1 addition & 1 deletion

@@ -41,7 +41,7 @@ After unpacking spark package, you could find LakeSoul distribution jar from htt
 wget https://dmetasoul-bucket.obs.cn-southwest-2.myhuaweicloud.com/releases/spark/spark-3.3.2-bin-hadoop-3.tgz
 tar xf spark-3.3.2-bin-hadoop-3.tgz
 export SPARK_HOME=${PWD}/spark-3.3.2-bin-hadoop3
-wget https://github.com/lakesoul-io/LakeSoul/releases/download/v2.3.0/lakesoul-spark-2.3.0-spark-3.3.jar -P $SPARK_HOME/jars
+wget https://github.com/lakesoul-io/LakeSoul/releases/download/v2.3.1/lakesoul-spark-2.3.1-spark-3.3.jar -P $SPARK_HOME/jars
 ```

 :::tip

website/docs/01-Getting Started/02-docker-compose.mdx

Lines changed: 1 addition & 1 deletion

@@ -34,7 +34,7 @@ docker run --net lakesoul-docker-compose-env_default --rm -ti \
 -v $(pwd)/lakesoul.properties:/opt/spark/work-dir/lakesoul.properties \
 --env lakesoul_home=/opt/spark/work-dir/lakesoul.properties bitnami/spark:3.3.1 \
 spark-shell \
---packages com.dmetasoul:lakesoul-spark:2.3.0-spark-3.3 \ [-]
+--packages com.dmetasoul:lakesoul-spark:2.3.1-spark-3.3 \ [+]
 --conf spark.sql.extensions=com.dmetasoul.lakesoul.sql.LakeSoulSparkSessionExtension \
 --conf spark.sql.catalog.lakesoul=org.apache.spark.sql.lakesoul.catalog.LakeSoulCatalog \
 --conf spark.sql.defaultCatalog=lakesoul \

website/docs/02-Tutorials/02-flink-cdc-sink/index.md

Lines changed: 1 addition & 1 deletion

@@ -84,7 +84,7 @@ Submit a LakeSoul Flink CDC Sink job to the Flink cluster started above:
 ```bash
 ./bin/flink run -ys 1 -yjm 1G -ytm 2G \
 -c org.apache.flink.lakesoul.entry.MysqlCdc \
-lakesoul-flink-2.3.0-flink-1.14.jar \
+lakesoul-flink-2.3.1-flink-1.14.jar \
 --source_db.host localhost \
 --source_db.port 3306 \
 --source_db.db_name test_cdc \

website/docs/02-Tutorials/07-kafka-topics-data-to-lakesoul.md

Lines changed: 2 additions & 2 deletions

@@ -74,7 +74,7 @@ export lakesoul_home=./pg.properties && ./bin/spark-submit \
 --driver-memory 4g \
 --executor-memory 4g \
 --master local[4] \
-./jars/lakesoul-spark-2.3.0-spark-3.3.jar \
+./jars/lakesoul-spark-2.3.1-spark-3.3.jar \
 localhost:9092 test.* /tmp/kafka/data /tmp/kafka/checkpoint/ kafka earliest false
 ```

@@ -151,6 +151,6 @@ export lakesoul_home=./pg.properties && ./bin/spark-submit \
 --driver-memory 4g \
 --executor-memory 4g \
 --master local[4] \
-./jars/lakesoul-spark-2.3.0-spark-3.3.jar \
+./jars/lakesoul-spark-2.3.1-spark-3.3.jar \
 localhost:9092 test.* /tmp/kafka/data /tmp/kafka/checkpoint/ kafka earliest false http://localhost:8081
 ```

website/docs/03-Usage Docs/02-setup-spark.md

Lines changed: 5 additions & 5 deletions

@@ -8,14 +8,14 @@ To use `spark-shell`, `pyspark` or `spark-sql` shells, you should include LakeSo

 ### Use Maven Coordinates via --packages
 ```bash
-spark-shell --packages com.dmetasoul:lakesoul-spark:2.3.0-spark-3.3
+spark-shell --packages com.dmetasoul:lakesoul-spark:2.3.1-spark-3.3
 ```

 ### Use Local Packages
 You can find the LakeSoul packages from our release page: [Releases](https://github.com/lakesoul-io/LakeSoul/releases).
 Download the jar file and pass it to `spark-submit`.
 ```bash
-spark-submit --jars "lakesoul-spark-2.3.0-spark-3.3.jar"
+spark-submit --jars "lakesoul-spark-2.3.1-spark-3.3.jar"
 ```

 Or you could directly put the jar into `$SPARK_HOME/jars`

@@ -26,7 +26,7 @@ Include maven dependencies in your project:
 <dependency>
     <groupId>com.dmetasoul</groupId>
     <artifactId>lakesoul</artifactId>
-    <version>2.3.0-spark-3.3</version>
+    <version>2.3.1-spark-3.3</version>
 </dependency>
 ```

@@ -92,7 +92,7 @@ taskmanager.memory.task.off-heap.size: 3000m
 :::

 ## Add LakeSoul Jar to Flink's directory
-Download LakeSoul Flink Jar from: https://github.com/lakesoul-io/LakeSoul/releases/download/v2.3.0/lakesoul-flink-2.3.0-flink-1.14.jar
+Download LakeSoul Flink Jar from: https://github.com/lakesoul-io/LakeSoul/releases/download/v2.3.1/lakesoul-flink-2.3.1-flink-1.14.jar

 And put the jar file under `$FLINK_HOME/lib`. After this, you could start flink session cluster or application as usual.

@@ -103,6 +103,6 @@ Add the following to your project's pom.xml
 <dependency>
     <groupId>com.dmetasoul</groupId>
     <artifactId>lakesoul</artifactId>
-    <version>2.3.0-flink-1.14</version>
+    <version>2.3.1-flink-1.14</version>
 </dependency>
 ```
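A commit like this touches both the Spark and the Flink artifacts, which should end up on the same LakeSoul version. A small shell sketch of that consistency check (coordinates copied from the hunks above; the check itself is hypothetical, not part of the repo):

```shell
# Extract the LakeSoul version from each Maven coordinate and compare.
spark_coord='com.dmetasoul:lakesoul-spark:2.3.1-spark-3.3'
flink_coord='com.dmetasoul:lakesoul:2.3.1-flink-1.14'
spark_ver=${spark_coord##*:}   # strip group:artifact -> 2.3.1-spark-3.3
spark_ver=${spark_ver%%-*}     # strip engine suffix  -> 2.3.1
flink_ver=${flink_coord##*:}
flink_ver=${flink_ver%%-*}
[ "$spark_ver" = "$flink_ver" ] && echo "versions match: $spark_ver"
```

The `##*:` / `%%-*` parameter expansions are plain POSIX shell, so this runs under any `sh` without external tools.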

website/docs/03-Usage Docs/05-flink-cdc-sync.md

Lines changed: 3 additions & 3 deletions

@@ -15,7 +15,7 @@ In the Stream API, the main functions of LakeSoul Sink are:

 ## How to use the command line
 ### 1. Download LakeSoul Flink Jar
-It can be downloaded from the LakeSoul Release page: https://github.com/lakesoul-io/LakeSoul/releases/download/v2.3.0/lakesoul-flink-2.3.0-flink-1.14.jar.
+It can be downloaded from the LakeSoul Release page: https://github.com/lakesoul-io/LakeSoul/releases/download/v2.3.1/lakesoul-flink-2.3.1-flink-1.14.jar.

 The currently supported Flink version is 1.14.

@@ -54,7 +54,7 @@ export LAKESOUL_PG_PASSWORD=root
 #### 2.2 Start sync job
 ```bash
 bin/flink run -c org.apache.flink.lakesoul.entry.MysqlCdc \
-lakesoul-flink-2.3.0-flink-1.14.jar \
+lakesoul-flink-2.3.1-flink-1.14.jar \
 --source_db.host localhost \
 --source_db.port 3306 \
 --source_db.db_name default \

@@ -73,7 +73,7 @@ Description of required parameters:
 | Parameter | Meaning | Value Description |
 |-----------|---------|-------------------|
 | -c | The task's main entry class | org.apache.flink.lakesoul.entry.MysqlCdc |
-| Main package | Jar of the task | lakesoul-flink-2.3.0-flink-1.14.jar |
+| Main package | Jar of the task | lakesoul-flink-2.3.1-flink-1.14.jar |
 | --source_db.host | The address of the MySQL database | |
 | --source_db.port | MySQL database port | |
 | --source_db.user | MySQL database username | |

website/docs/03-Usage Docs/06-flink-lakesoul-connector.md

Lines changed: 1 addition & 3 deletions

@@ -10,14 +10,12 @@ LakeSoul provides Flink Connector which implements the Dynamic Table interface,

 To setup Flink environment, please refer to [Setup Spark/Flink Job/Project](../03-Usage%20Docs/02-setup-spark.md)

-Introduce LakeSoul dependency: package and compile the lakesoul-flink folder to get lakesoul-flink-2.3.0-flink-1.14.jar.
-
 In order to use Flink to create LakeSoul tables, it is recommended to use Flink SQL Client, which supports direct use of Flink SQL commands to operate LakeSoul tables. In this document, the Flink SQL is to directly enter statements on the Flink SQL Client cli interface; whereas the Table API needs to be used in a Java project.

 Switch to the flink folder and execute the command to start the SQL client.
 ```bash
 # Start Flink SQL Client
-bin/sql-client.sh embedded -j lakesoul-flink-2.3.0-flink-1.14.jar
+bin/sql-client.sh embedded -j lakesoul-flink-2.3.1-flink-1.14.jar
 ```

 ## 2. DDL

website/docs/03-Usage Docs/08-auto-compaction-task.md

Lines changed: 1 addition & 1 deletion

@@ -41,7 +41,7 @@ Then use the following command to start the compaction service job:
 --conf "spark.executor.extraJavaOptions=-XX:MaxDirectMemorySize=4G" \
 --conf "spark.executor.memoryOverhead=3g" \
 --class com.dmetasoul.lakesoul.spark.compaction.CompactionTask \
-jars/lakesoul-spark-2.3.0-spark-3.3.jar
+jars/lakesoul-spark-2.3.1-spark-3.3.jar
 --threadpool.size=10
 --database=test
 ```
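One detail the hunk above makes easy to miss: in a `spark-submit` command line, the arguments after the application jar (here `--threadpool.size` and `--database`) are handed to the main class, not parsed by `spark-submit` itself. A toy illustration of that split, using a stand-in function rather than real spark-submit (the function is purely illustrative):

```shell
# Toy model of spark-submit's argument split: options before the jar belong
# to the launcher; everything after the jar goes to the application.
fake_submit() {
  app_class=; jar=
  while [ $# -gt 0 ]; do
    case "$1" in
      --class) app_class=$2; shift 2 ;;     # launcher option with a value
      *.jar)   jar=$1; shift; break ;;      # first jar ends launcher parsing
      *)       shift ;;                     # other launcher options ignored here
    esac
  done
  echo "class=$app_class jar=$jar app_args=$*"
}
fake_submit --class com.dmetasoul.lakesoul.spark.compaction.CompactionTask \
  jars/lakesoul-spark-2.3.1-spark-3.3.jar --threadpool.size=10 --database=test
```

This is why renaming the jar in the docs is safe: the application arguments that follow it are untouched by the version bump.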

website/i18n/zh-Hans/docusaurus-plugin-content-docs/current/01-Getting Started/01-setup-local-env.md

Lines changed: 2 additions & 2 deletions

@@ -31,10 +31,10 @@ https://dlcdn.apache.org/spark/spark-3.3.2/spark-3.3.2-bin-without-hadoop.tgz

 The LakeSoul release jar can be downloaded from the GitHub Releases page: https://github.com/lakesoul-io/LakeSoul/releases . After downloading, put the jar into the jars directory under the Spark installation:
 ```bash
-wget https://github.com/lakesoul-io/LakeSoul/releases/download/v2.3.0/lakesoul-spark-2.3.0-spark-3.3.jar -P $SPARK_HOME/jars
+wget https://github.com/lakesoul-io/LakeSoul/releases/download/v2.3.1/lakesoul-spark-2.3.1-spark-3.3.jar -P $SPARK_HOME/jars
 ```

-If GitHub is hard to reach, the jar can also be downloaded from: https://dmetasoul-bucket.obs.cn-southwest-2.myhuaweicloud.com/releases/lakesoul/lakesoul-spark-2.3.0-spark-3.3.jar
+If GitHub is hard to reach, the jar can also be downloaded from: https://dmetasoul-bucket.obs.cn-southwest-2.myhuaweicloud.com/releases/lakesoul/lakesoul-spark-2.3.1-spark-3.3.jar

 :::tip
 Since version 2.1.0, LakeSoul's own dependencies have been shaded into a single jar; earlier versions were released as multiple jars in a tar.gz archive.

website/i18n/zh-Hans/docusaurus-plugin-content-docs/current/01-Getting Started/02-docker-compose.mdx

Lines changed: 1 addition & 1 deletion

@@ -34,7 +34,7 @@ docker run --net lakesoul-docker-compose-env_default --rm -ti \
 -v $(pwd)/lakesoul.properties:/opt/spark/work-dir/lakesoul.properties \
 --env lakesoul_home=/opt/spark/work-dir/lakesoul.properties bitnami/spark:3.3.1 \
 spark-shell \
---packages com.dmetasoul:lakesoul-spark:2.3.0-spark-3.3 \ [-]
+--packages com.dmetasoul:lakesoul-spark:2.3.1-spark-3.3 \ [+]
 --conf spark.sql.extensions=com.dmetasoul.lakesoul.sql.LakeSoulSparkSessionExtension \
 --conf spark.sql.catalog.lakesoul=org.apache.spark.sql.lakesoul.catalog.LakeSoulCatalog \
 --conf spark.sql.defaultCatalog=lakesoul \
