release: 0.1.3 #35

Open · wants to merge 14 commits into base: main

2 changes: 1 addition & 1 deletion .github/CODEOWNERS
@@ -2,4 +2,4 @@

# These owners will be the default owners for everything in
# the repo. Unless a later match takes precedence,
* @yanxi0830
* @yanxi0830
62 changes: 53 additions & 9 deletions .github/workflows/ci.yml
@@ -1,18 +1,23 @@
name: CI
on:
push:
branches:
- main
branches-ignore:
- 'generated'
- 'codegen/**'
- 'integrated/**'
- 'stl-preview-head/**'
- 'stl-preview-base/**'
pull_request:
branches:
- main
- next
branches-ignore:
- 'stl-preview-head/**'
- 'stl-preview-base/**'

jobs:
lint:
timeout-minutes: 10
name: lint
runs-on: ubuntu-latest

runs-on: ${{ github.repository == 'stainless-sdks/llama-api-python' && 'depot-ubuntu-24.04' || 'ubuntu-latest' }}
if: github.event_name == 'push' || github.event.pull_request.head.repo.fork
steps:
- uses: actions/checkout@v4

@@ -30,10 +35,49 @@ jobs:
- name: Run lints
run: ./scripts/lint

build:
if: github.repository == 'stainless-sdks/llama-api-python' && (github.event_name == 'push' || github.event.pull_request.head.repo.fork)
timeout-minutes: 10
name: build
permissions:
contents: read
id-token: write
runs-on: depot-ubuntu-24.04
steps:
- uses: actions/checkout@v4

- name: Install Rye
run: |
curl -sSf https://rye.astral.sh/get | bash
echo "$HOME/.rye/shims" >> $GITHUB_PATH
env:
RYE_VERSION: '0.44.0'
RYE_INSTALL_OPTION: '--yes'

- name: Install dependencies
run: rye sync --all-features

- name: Run build
run: rye build

- name: Get GitHub OIDC Token
id: github-oidc
uses: actions/github-script@v6
with:
script: core.setOutput('github_token', await core.getIDToken());

- name: Upload tarball
env:
URL: https://pkg.stainless.com/s
AUTH: ${{ steps.github-oidc.outputs.github_token }}
SHA: ${{ github.sha }}
run: ./scripts/utils/upload-artifact.sh

test:
timeout-minutes: 10
name: test
runs-on: ubuntu-latest

runs-on: ${{ github.repository == 'stainless-sdks/llama-api-python' && 'depot-ubuntu-24.04' || 'ubuntu-latest' }}
if: github.event_name == 'push' || github.event.pull_request.head.repo.fork
steps:
- uses: actions/checkout@v4

31 changes: 31 additions & 0 deletions .github/workflows/publish-pypi.yml
@@ -0,0 +1,31 @@
# This workflow is triggered when a GitHub release is created.
# It can also be run manually to re-publish to PyPI in case it failed for some reason.
# You can run this workflow by navigating to https://www.github.com/meta-llama/llama-api-python/actions/workflows/publish-pypi.yml
name: Publish PyPI
on:
workflow_dispatch:

release:
types: [published]

jobs:
publish:
name: publish
runs-on: ubuntu-latest

steps:
- uses: actions/checkout@v4

- name: Install Rye
run: |
curl -sSf https://rye.astral.sh/get | bash
echo "$HOME/.rye/shims" >> $GITHUB_PATH
env:
RYE_VERSION: '0.44.0'
RYE_INSTALL_OPTION: '--yes'

- name: Publish to PyPI
run: |
bash ./bin/publish-pypi
env:
PYPI_TOKEN: ${{ secrets.LLAMA_API_CLIENT_PYPI_TOKEN || secrets.PYPI_TOKEN }}
21 changes: 21 additions & 0 deletions .github/workflows/release-doctor.yml
@@ -0,0 +1,21 @@
name: Release Doctor
on:
pull_request:
branches:
- main
workflow_dispatch:

jobs:
release_doctor:
name: release doctor
runs-on: ubuntu-latest
if: github.repository == 'meta-llama/llama-api-python' && (github.event_name == 'push' || github.event_name == 'workflow_dispatch' || startsWith(github.head_ref, 'release-please') || github.head_ref == 'next')

steps:
- uses: actions/checkout@v4

- name: Check release environment
run: |
bash ./bin/check-release-environment
env:
PYPI_TOKEN: ${{ secrets.LLAMA_API_CLIENT_PYPI_TOKEN || secrets.PYPI_TOKEN }}
3 changes: 3 additions & 0 deletions .release-please-manifest.json
@@ -0,0 +1,3 @@
{
".": "0.1.3"
}
2 changes: 1 addition & 1 deletion .stats.yml
@@ -1,4 +1,4 @@
configured_endpoints: 4
openapi_spec_url: https://storage.googleapis.com/stainless-sdk-openapi-specs/meta%2Fllama-api-bfa0267b010dcc4b39e62dfbd698ac6f9421f3212c44b3408b9b154bd6c67a8b.yml
openapi_spec_hash: 7f424537bc7ea7638e3934ef721b8d71
config_hash: 3ae62c8625d97ed8a867ab369f591724
config_hash: d121aca03b5b9ad503ffce2b0860b0d6
26 changes: 26 additions & 0 deletions CHANGELOG.md
@@ -0,0 +1,26 @@
# Changelog

## 0.1.3 (2025-07-12)

Full Changelog: [v0.1.2...v0.1.3](https://github.com/meta-llama/llama-api-python/compare/v0.1.2...v0.1.3)

### Bug Fixes

* **client:** don't send Content-Type header on GET requests ([efec88a](https://github.com/meta-llama/llama-api-python/commit/efec88aa519948ea58ee629507cd91e9af90c1c8))
* **parsing:** correctly handle nested discriminated unions ([b627686](https://github.com/meta-llama/llama-api-python/commit/b6276863bea64a7127cdb71b6fbb02534d2e762b))


### Chores

* add examples ([abfa065](https://github.com/meta-llama/llama-api-python/commit/abfa06572191caeaa33603c846d5953aa453521e))
* **internal:** bump pinned h11 dep ([d40e1b1](https://github.com/meta-llama/llama-api-python/commit/d40e1b1d736ec5e5fe7e3c65ace9c5d65d038081))
* **package:** mark python 3.13 as supported ([ef5bc36](https://github.com/meta-llama/llama-api-python/commit/ef5bc36693fa419e3d865e97cae97e7f5df19b1a))
* **readme:** fix version rendering on pypi ([786f9fb](https://github.com/meta-llama/llama-api-python/commit/786f9fbdb75e54ceac9eaf00d4c4d7002ed97a94))
* sync repo ([7e697f6](https://github.com/meta-llama/llama-api-python/commit/7e697f6550485728ee00d4fd18800a90fb3592ab))
* update SDK settings ([de22c0e](https://github.com/meta-llama/llama-api-python/commit/de22c0ece778c938f75e4717baf3e628c7a45087))


### Documentation

* code of conduct ([efe1af2](https://github.com/meta-llama/llama-api-python/commit/efe1af28fb893fa657394504dc8c513b20ac589a))
* readme and license ([d53eafd](https://github.com/meta-llama/llama-api-python/commit/d53eafd104749e9483015676fba150091e754928))
52 changes: 47 additions & 5 deletions README.md
@@ -1,15 +1,17 @@
# Llama API Client Python API library

[![PyPI version](https://img.shields.io/pypi/v/llama_api_client.svg)](https://pypi.org/project/llama_api_client/)
<!-- prettier-ignore -->
[![PyPI version](https://img.shields.io/pypi/v/llama_api_client.svg?label=pypi%20(stable))](https://pypi.org/project/llama_api_client/)

The Llama API Client Python library provides convenient access to the Llama API Client REST API from any Python 3.8+
application. The library includes type definitions for all request params and response fields,
and offers both synchronous and asynchronous clients powered by [httpx](https://github.com/encode/httpx).

It is generated with [Stainless](https://www.stainless.com/).

## Documentation

The REST API documentation can be found on [https://llama.developer.meta.com/docs](https://llama.developer.meta.com/docs). The full API of this library can be found in [api.md](api.md).
The REST API documentation can be found on [llama.developer.meta.com](https://llama.developer.meta.com/docs). The full API of this library can be found in [api.md](api.md).

## Installation

@@ -78,6 +80,46 @@ asyncio.run(main())

Functionality between the synchronous and asynchronous clients is otherwise identical.

### With aiohttp

By default, the async client uses `httpx` for HTTP requests. However, for improved concurrency performance you may also use `aiohttp` as the HTTP backend.

You can enable this by installing `aiohttp`:

```sh
# install from the production repo
pip install 'llama_api_client[aiohttp] @ git+ssh://[email protected]/meta-llama/llama-api-python.git'
```

Then you can enable it by instantiating the client with `http_client=DefaultAioHttpClient()`:

```python
import os
import asyncio
from llama_api_client import DefaultAioHttpClient
from llama_api_client import AsyncLlamaAPIClient


async def main() -> None:
async with AsyncLlamaAPIClient(
api_key=os.environ.get("LLAMA_API_KEY"), # This is the default and can be omitted
http_client=DefaultAioHttpClient(),
) as client:
create_chat_completion_response = await client.chat.completions.create(
messages=[
{
"content": "string",
"role": "user",
}
],
model="model",
)
print(create_chat_completion_response.completion_message)


asyncio.run(main())
```
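
Note that `httpx_aiohttp` implements an `aiohttp`-backed transport for `httpx`, so only the `http_client` argument changes; the rest of the client interface stays the same.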

## Streaming responses

We provide support for streaming responses using Server-Sent Events (SSE).
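
The streaming example itself is collapsed in this diff view. As a minimal sketch of the pattern (the exact event fields are an assumption here, not taken from this PR), passing `stream=True` yields an iterator of SSE events:

```python
from llama_api_client import LlamaAPIClient

client = LlamaAPIClient()

# With stream=True the client returns an iterator of server-sent events
# rather than a single completed response.
stream = client.chat.completions.create(
    messages=[{"content": "Hello", "role": "user"}],
    model="model",
    stream=True,
)
for event in stream:
    # Event payloads are printed raw here; the concrete field layout is
    # an assumption and should be checked against api.md.
    print(event)
```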
@@ -212,7 +254,7 @@ client.with_options(max_retries=5).chat.completions.create(
### Timeouts

By default requests time out after 1 minute. You can configure this with a `timeout` option,
which accepts a float or an [`httpx.Timeout`](https://www.python-httpx.org/advanced/#fine-tuning-the-configuration) object:
which accepts a float or an [`httpx.Timeout`](https://www.python-httpx.org/advanced/timeouts/#fine-tuning-the-configuration) object:

```python
from llama_api_client import LlamaAPIClient
@@ -288,7 +330,7 @@ response = client.chat.completions.with_raw_response.create(
print(response.headers.get('X-My-Header'))

completion = response.parse() # get the object that `chat.completions.create()` would have returned
print(completion.completion_message)
print(completion.id)
```

These methods return an [`APIResponse`](https://github.com/meta-llama/llama-api-python/tree/main/src/llama_api_client/_response.py) object.
@@ -427,4 +469,4 @@ Python 3.8 or higher.
See [the contributing documentation](./CONTRIBUTING.md).

## License
Llama API Python SDK is MIT licensed, as found in the LICENSE file.
Llama API Python SDK is MIT licensed, as found in the LICENSE file.
21 changes: 21 additions & 0 deletions bin/check-release-environment
@@ -0,0 +1,21 @@
#!/usr/bin/env bash

errors=()

if [ -z "${PYPI_TOKEN}" ]; then
errors+=("The PYPI_TOKEN secret has not been set. Please set it in either this repository's secrets or your organization secrets.")
fi

lenErrors=${#errors[@]}

if [[ lenErrors -gt 0 ]]; then
echo -e "Found the following errors in the release environment:\n"

for error in "${errors[@]}"; do
echo -e "- $error\n"
done

exit 1
fi

echo "The environment is ready to push releases!"
5 changes: 3 additions & 2 deletions pyproject.toml
@@ -1,6 +1,6 @@
[project]
name = "llama_api_client"
version = "0.1.2"
version = "0.1.3"
description = "The official Python library for the llama-api-client API"
dynamic = ["readme"]
license = "MIT"
@@ -24,6 +24,7 @@ classifiers = [
"Programming Language :: Python :: 3.10",
"Programming Language :: Python :: 3.11",
"Programming Language :: Python :: 3.12",
"Programming Language :: Python :: 3.13",
"Operating System :: OS Independent",
"Operating System :: POSIX",
"Operating System :: MacOS",
@@ -38,7 +39,7 @@ Homepage = "https://github.com/meta-llama/llama-api-python"
Repository = "https://github.com/meta-llama/llama-api-python"

[project.optional-dependencies]
aiohttp = ["aiohttp", "httpx_aiohttp>=0.1.6"]
aiohttp = ["aiohttp", "httpx_aiohttp>=0.1.8"]

[tool.rye]
managed = true
66 changes: 66 additions & 0 deletions release-please-config.json
@@ -0,0 +1,66 @@
{
"packages": {
".": {}
},
"$schema": "https://raw.githubusercontent.com/stainless-api/release-please/main/schemas/config.json",
"include-v-in-tag": true,
"include-component-in-tag": false,
"versioning": "prerelease",
"prerelease": true,
"bump-minor-pre-major": true,
"bump-patch-for-minor-pre-major": false,
"pull-request-header": "Automated Release PR",
"pull-request-title-pattern": "release: ${version}",
"changelog-sections": [
{
"type": "feat",
"section": "Features"
},
{
"type": "fix",
"section": "Bug Fixes"
},
{
"type": "perf",
"section": "Performance Improvements"
},
{
"type": "revert",
"section": "Reverts"
},
{
"type": "chore",
"section": "Chores"
},
{
"type": "docs",
"section": "Documentation"
},
{
"type": "style",
"section": "Styles"
},
{
"type": "refactor",
"section": "Refactors"
},
{
"type": "test",
"section": "Tests",
"hidden": true
},
{
"type": "build",
"section": "Build System"
},
{
"type": "ci",
"section": "Continuous Integration",
"hidden": true
}
],
"release-type": "python",
"extra-files": [
"src/llama_api_client/_version.py"
]
}
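
The `extra-files` entry points release-please at `src/llama_api_client/_version.py` so the version constant is bumped alongside `pyproject.toml`. That file is not part of this diff; as a hypothetical sketch, release-please's generic updater rewrites lines carrying an `x-release-please-version` annotation:

```python
# src/llama_api_client/_version.py -- hypothetical sketch, not shown in
# this PR. release-please rewrites the annotated line on each release.
__title__ = "llama_api_client"
__version__ = "0.1.3"  # x-release-please-version
```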