2. Quickstart
This document demonstrates how to install and deploy the latest version of CozeLoop.
Before installing the CozeLoop Open-source Edition by following this document, ensure that your software and hardware environment meets the following requirements:
| Item | Description |
|---|---|
| Go | Go 1.23.4 or later is installed. Configure `GOPATH` and add `${GOPATH}/bin` to the `PATH` environment variable so that installed binary tools can be located and executed. |
| Docker | Docker and Docker Compose are installed and the Docker service is running. For details, refer to the Docker documentation. macOS: Docker Desktop is recommended; see the Docker Desktop for Mac installation guide. Linux: see the Docker installation guide and the Docker Compose installation guide. Windows: Docker Desktop is recommended; see the Docker Desktop for Windows installation guide. |
| Model | An online model service such as OpenAI or Volcano Engine Ark has been enabled. The list of currently supported model services can be found in Model configuration. |
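As a quick sanity check before proceeding, you can verify the toolchain from a terminal. This is a minimal sketch; your `GOPATH` may already be set, and the `$HOME/go` default shown here is only the usual convention.

```bash
# Check the Go toolchain version (should print go1.23.4 or later)
go version

# Ensure GOPATH is set and ${GOPATH}/bin is on PATH so installed binaries can be found
export GOPATH="${GOPATH:-$HOME/go}"
export PATH="$PATH:${GOPATH}/bin"

# Confirm Docker and Docker Compose are installed and the daemon is running
docker version
docker compose version
```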
Execute the following commands to retrieve the latest version of the CozeLoop source code.
```bash
# Clone the code
git clone https://github.com/coze-dev/cozeloop.git

# Enter the cozeloop directory
cd cozeloop
```
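If you prefer to build from a tagged release rather than the default branch, you can list the available tags and check one out. This is only a convenience sketch; which tags exist depends on the releases published in the repository, and `<tag>` below is a placeholder.

```bash
# List version tags, newest first (assumes the repository publishes release tags)
git tag --sort=-v:refname | head

# Check out a specific release; replace <tag> with one of the listed versions
git checkout <tag>
```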
Before installing the CozeLoop Open-source Edition, you need to prepare an available model; otherwise, you will not be able to select a model for prompt debugging or evaluation when you access the CozeLoop Open-source Edition. The steps below use the OpenAI and Volcano Engine Ark models as examples to demonstrate how to configure the model file, so that you can quickly set up models for installing and trying out the CozeLoop Open-source Edition. For other models such as Llama, refer to the model configuration documentation to complete the configuration file.
- Navigate to the directory `conf/default/app/runtime/`.
- Edit the file `model_config.yaml` and modify the `api_key` and `model` fields. The content below shows how to configure the Doubao model (via Volcano Engine Ark) and the OpenAI model for the CozeLoop Open-source Edition. Overwrite the original file with this content, then replace the `api_key` and `model` fields with your own parameters for the Ark and OpenAI models.

  ```yaml
  models:
    - id: 1
      name: "doubao"
      frame: "eino"
      protocol: "ark"
      protocol_config:
        api_key: "***" # Volcano Engine Ark API Key. The acquisition method can be found at https://www.volcengine.com/docs/82379/1541594
        model: "***" # Ark Model ID. For reference, visit https://www.volcengine.com/docs/82379/1330310
      param_config:
        param_schemas:
          - name: "temperature"
            label: "Generation randomness"
            desc: "Increasing the temperature makes the model output more diverse and creative; lowering it makes the output follow the instructions more closely but with less diversity. It is recommended not to adjust this together with Top p."
            type: "float"
            min: "0"
            max: "1.0"
            default_val: "0.7"
          - name: "max_tokens"
            label: "Maximum response length"
            desc: "Controls the upper limit of tokens in the model output. 100 tokens is roughly equivalent to 150 Chinese characters."
            type: "int"
            min: "1"
            max: "4096"
            default_val: "2048"
          - name: "top_p"
            label: "Nucleus sampling probability"
            desc: "During generation, the smallest set of tokens whose cumulative probability reaches top_p is selected; tokens outside this set are excluded, balancing diversity and plausibility."
            type: "float"
            # min: "0.001"
            max: "1.0"
            default_val: "0.7"
    - id: 2
      name: "openapi"
      frame: "eino"
      protocol: "openai"
      protocol_config:
        api_key: "***" # OpenAI API Key
        model: "***" # OpenAI Model ID
      param_config:
        param_schemas:
          - name: "temperature"
            label: "Generation randomness"
            desc: "Increasing the temperature makes the model output more diverse and creative; lowering it makes the output follow the instructions more closely but with less diversity. It is recommended not to adjust this together with Top p."
            type: "float"
            min: "0"
            max: "1.0"
            default_val: "0.7"
          - name: "max_tokens"
            label: "Maximum response length"
            desc: "Controls the upper limit of tokens in the model output. 100 tokens is roughly equivalent to 150 Chinese characters."
            type: "int"
            min: "1"
            max: "4096"
            default_val: "2048"
          - name: "top_p"
            label: "Nucleus sampling probability"
            desc: "During generation, the smallest set of tokens whose cumulative probability reaches top_p is selected; tokens outside this set are excluded, balancing diversity and plausibility."
            type: "float"
            # min: "0.001"
            max: "1.0"
            default_val: "0.7"
  ```

- Save the file.
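Before starting the services, it can help to confirm that no `***` placeholders remain in the model file. The check below is just a convenience sketch using standard shell tools, not part of the official setup steps.

```bash
# Print any api_key/model lines still left at their placeholder value
grep -n '"\*\*\*"' conf/default/app/runtime/model_config.yaml && \
  echo "Placeholders remain: fill in api_key/model before starting" || \
  echo "No placeholders found"
```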
Execute the following command to quickly deploy the CozeLoop Open-source Edition with Docker Compose.

```bash
# Start the services; development mode is the default
docker compose up --build
```
The first startup needs to pull and build the local images, which may take some time, so please be patient. During deployment you will see log output; if it includes a message indicating that the backend build is complete, CozeLoop has started successfully.
- When deploying the CozeLoop Open-source Edition, the default startup mode is development mode. For details about startup modes, refer to startup modes.
- If you encounter issues related to Docker or Docker Compose during startup, the usual causes are environment configuration, system permissions, or network problems. It is recommended to find relevant solutions based on Docker error messages.
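If you prefer to keep your terminal free, Docker Compose can also run the stack in the background. These are standard Compose options rather than anything specific to CozeLoop.

```bash
# Start in the background instead of attaching to the logs
docker compose up --build -d

# Follow the logs to watch for the message that the backend build is complete
docker compose logs -f

# Stop and remove the containers when you are done
docker compose down
```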
After the service starts, open the CozeLoop Open-source Edition in your browser at http://localhost:8082. Here, `8082` is the frontend listening port and `8888` is the backend listening port.
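To confirm the frontend is reachable without opening a browser, you can probe the port with curl; the exact HTTP status returned by the frontend server may vary.

```bash
# Expect an HTTP response status line (e.g. 200) from the frontend on port 8082
curl -sI http://localhost:8082 | head -n 1
```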
At this point, you have successfully deployed the CozeLoop Open-source Edition and can experience CozeLoop's various functions and services.
After successfully deploying the CozeLoop Open-source Edition, you can follow the steps below to verify whether CozeLoop can successfully call model services.
- Register an account according to the page prompts and log in to CozeLoop.
- Click on Playground in the left navigation bar.
- In the model configuration area, check the model list and confirm that the selectable models match the models configured in the model configuration step above.
- Select any model, then use the preview and debugging chat panel on the right to check whether the model responds normally.
- Click Trace in the left navigation bar to view the Trace information reported during the debugging process.
If all the above steps are executed correctly without page errors, it indicates that the CozeLoop Open-source Edition has been successfully deployed and the model service can be normally invoked.
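Alongside these in-page checks, you can confirm from the command line that all containers started by Docker Compose are up:

```bash
# List the Compose services and their current state
docker compose ps
```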
After successfully installing and accessing the CozeLoop Open-source Edition, you can try out its basic functions.
- Prompt development and debugging: CozeLoop provides a complete prompt development process. Refer to the documentation to develop and debug prompts on the open-source platform.
- Evaluation: CozeLoop's evaluation feature offers standard evaluation data management, an automated evaluation engine, and comprehensive experimental result statistics. Refer to the documentation to initiate an evaluation experiment.
- Trace reporting and querying: CozeLoop supports automatic reporting of prompts debugged on the platform, integration with mainstream AI frameworks, and one-click Trace reporting. Refer to the documentation to report Trace.
If image pulls fail (that is, `docker pull` errors), first check for local system or network issues; most problems can be resolved by consulting existing solutions. You can also pre-pull all the required images locally and, once that succeeds, start the services with Docker Compose.
```bash
docker pull golang:1.23.4
docker pull nginx:latest
docker pull clickhouse/clickhouse-server:latest
docker pull mysql:latest
docker pull minio/minio:latest
docker pull apache/rocketmq:latest
docker pull redis:latest
docker pull moby/buildkit:latest # Required for cross-platform builds on ARM architectures
```
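If individual pulls fail intermittently (for example, due to an unstable network), a simple retry loop over the same image list can help. This is only a convenience sketch, not part of the official deployment steps.

```bash
#!/usr/bin/env bash
# Retry pulling each image required by the Compose stack up to 3 times
images=(
  golang:1.23.4
  nginx:latest
  clickhouse/clickhouse-server:latest
  mysql:latest
  minio/minio:latest
  apache/rocketmq:latest
  redis:latest
  moby/buildkit:latest
)
for img in "${images[@]}"; do
  for attempt in 1 2 3; do
    if docker pull "$img"; then
      break
    fi
    echo "Pull of $img failed (attempt $attempt), retrying..."
  done
done
```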