
tri.oo.ai

tri.oo.ai is a local, open-source showcase demonstrating the capabilities of Tri-7B-Search-preview, an efficient LLM with built-in internet search. Developed by Trillion Labs and based on Tri-7B 🌐💚, it features a familiar chat interface inspired by oo.ai.

πŸ” What is tri.oo.ai?

tri.oo.ai replicates the functionality of oo.ai with a local-first approach and open tooling. It uses the Tri-7B-Search model as the engine behind a smart assistant that can:

  • Understand natural queries
  • Retrieve relevant data from your knowledge base
  • Augment answers with DuckDuckGo search results
  • Run locally for full control and privacy

This is a search-enabled LLM service: it combines the reasoning of LLMs with the freshness of web search.
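
For illustration, here is a minimal sketch of that retrieve-then-answer flow, assuming the duckduckgo_search Python package; the helper name and snippet format are hypothetical and are not the actual code in this repository.

# Sketch: fetch fresh DuckDuckGo results and format them as context for the model.
# Illustrative only; main.py wires this kind of step into the interactive assistant.
from duckduckgo_search import DDGS

def build_search_context(query: str, max_results: int = 5) -> str:
    """Return numbered search snippets that can be prepended to the user prompt."""
    hits = DDGS().text(query, max_results=max_results)
    return "\n".join(
        f"[{i + 1}] {h['title']}: {h['body']} ({h['href']})" for i, h in enumerate(hits)
    )

print(build_search_context("Faker contract renewal news"))

The formatted snippets are passed to Tri-7B-Search-preview together with the original question, so the model can ground its answer in fresh results and cite numbered sources, as in the example below.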

πŸ’‘ Example

πŸ” Query

페이컀 μž¬κ³„μ•½ μ†Œμ‹

πŸ€– AI Response

페이컀(μ΄μƒν˜)λŠ” 2029λ…„κΉŒμ§€ T1κ³Ό 4λ…„ μž¬κ³„μ•½μ„ μ²΄κ²°ν•œ 사싀이 ν™•μΈλ˜μ—ˆμŠ΅λ‹ˆλ‹€. μ΄λŠ” 2025λ…„ 7μ›” 27일(2025λ…„ κΈ°μ€€) 곡식 λ°œν‘œλœ λ‚΄μš©μœΌλ‘œ, 리그 였브 λ ˆμ „λ“œ(LoL) e슀포츠 역사상 졜초둜 18μ‹œμ¦Œ 연속 ν™œλ™μ„ μ΄μ–΄κ°€κ²Œ λ˜μ—ˆμŠ΅λ‹ˆλ‹€. μ£Όμš” 계약 쑰건은 λ‹€μŒκ³Ό κ°™μŠ΅λ‹ˆλ‹€:

  • 계약 κΈ°κ°„: 2025λ…„ 7μ›” 27일뢀터 2029λ…„ 6μ›” 30μΌκΉŒμ§€ (4λ…„κ°„)
  • μ„±κ³Ό: ν•΄λ‹Ή κΈ°κ°„ λ™μ•ˆ 2022 LCK μŠ€ν”„λ§ μ „μŠΉ 우승, 2023 μ•„μ‹œμ•ˆκ²Œμž„ κΈˆλ©”λ‹¬, 2023Β·2024 LoL μ›”λ“œ μ±”ν”Όμ–Έμ‹­ 2μ—°νŒ¨ λ“± μ—­λŒ€ 졜고 μˆ˜μ€€μ˜ 성적을 κΈ°λ‘ν–ˆμŠ΅λ‹ˆλ‹€.
  • 의미: νŽ˜μ΄μ»€λŠ” 2013λ…„ 데뷔 이래 νŒ€μ˜ μƒμ§•μœΌλ‘œ ν™œμ•½ν•˜λ©° e슀포츠 역사에 ν•œ νšμ„ 그은 μ„ μˆ˜λ‘œ 평가받고 μžˆμŠ΅λ‹ˆλ‹€.

좜처

  1. [μ‘°μ„ λΉ„μ¦ˆ] 페이컀 μ΄μƒν˜, T1κ³Ό 4λ…„ μž¬κ³„μ•½ λ°œν‘œ
  2. [μ—°ν•©λ‰΄μŠ€] 페이컀 μ΄μƒν˜, 2029λ…„κΉŒμ§€ 뛴닀…T1κ³Ό 4λ…„ μž¬κ³„μ•½
  3. [κ²Œμž„λ·°] 페이컀 μ΄μƒν˜, T1κ³Ό 4λ…„ μž¬κ³„μ•½ λ°œν‘œ '2029λ…„κΉŒμ§€ 동행'

Installation

  1. Clone the repository:
git clone https://github.com/trillion-labs/tri.oo.ai.git
cd tri.oo.ai
  2. Install uv (if not already installed):
curl -LsSf https://astral.sh/uv/install.sh | sh
  3. Create a virtual environment and install dependencies:
uv sync

# for direct (offline) vLLM usage, i.e. Option 1 below
uv sync --group vllm

Usage

Option 1: Direct vLLM Usage

Run the interactive assistant with direct vLLM integration:

uv run main.py

Command line options:

  • --model: Model name or path (default: "trillionlabs/Tri-7B-Search-preview")
  • --tensor-parallel-size: Number of GPUs for tensor parallelism (default: 1)
  • --gpu-memory-utilization: Fraction of GPU memory to use, between 0 and 1 (default: 0.9)

Example:

uv run main.py --tensor-parallel-size 2 --gpu-memory-utilization 0.8
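
For reference, here is a rough sketch of how these options map onto vLLM's offline inference API; this is an illustration under the flags above, not the actual contents of main.py.

# Sketch of direct (offline) vLLM usage with the command-line options above.
# Illustrative only; main.py builds the interactive search assistant on top of calls like these.
from vllm import LLM, SamplingParams

llm = LLM(
    model="trillionlabs/Tri-7B-Search-preview",
    tensor_parallel_size=2,        # --tensor-parallel-size
    gpu_memory_utilization=0.8,    # --gpu-memory-utilization
)
params = SamplingParams(temperature=0.7, max_tokens=512)
outputs = llm.generate(["Who is Faker?"], params)
print(outputs[0].outputs[0].text)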

Option 2: OpenAI-Compatible API

For better flexibility and integration, you can use the OpenAI-compatible API:

  1. Start the vLLM server:
./serve.sh

This will start a vLLM server at http://localhost:8000 with the Tri-7B-Search-preview model.

  2. In a new terminal, run the API client:
uv run main_api.py

Command line options for the API client:

  • --base-url: Base URL for the OpenAI-compatible API (default: "http://localhost:8000/v1")
  • --model: Model name (default: "trillionlabs/Tri-7B-Search-preview")

Example with custom server:

uv run main_api.py --base-url "http://your-server:8080/v1"
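
Because the server exposes an OpenAI-compatible endpoint, any OpenAI client can also query it directly. Below is a minimal sketch using the official openai Python package; the api_key value is a placeholder, since vLLM does not require one unless the server is started with an API key.

# Sketch: query the local vLLM server through its OpenAI-compatible endpoint.
# Illustrative only; main_api.py wraps this in the interactive search assistant.
from openai import OpenAI

client = OpenAI(base_url="http://localhost:8000/v1", api_key="EMPTY")
resp = client.chat.completions.create(
    model="trillionlabs/Tri-7B-Search-preview",
    messages=[{"role": "user", "content": "Faker contract renewal news"}],
)
print(resp.choices[0].message.content)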

Which Option to Choose?

  • Direct vLLM (main.py): Best for single-user local usage with minimal setup
  • API mode (main_api.py): Best for multi-user scenarios, integration with other tools, or when you want to separate the model server from the client

License

This project is licensed under the Apache 2.0 License; see the LICENSE file for details.

Acknowledgments
