This repository was archived by the owner on Jun 5, 2025. It is now read-only.

Add FE code to Docker image #207

Merged
merged 6 commits into from
Dec 6, 2024

Conversation

aponcedeleonch
Contributor

The PR adds:

  • A webbuilder stage that downloads and builds the FE code.
    • It downloads the code as a .zip from GH using a GH token. The token is needed since the repository is private.
  • A modified runtime stage that installs nginx and also serves the FE code.
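
As a rough sketch, the two stages described above might look like this (base images, repo paths, and build commands here are assumptions for illustration, not the PR's actual Dockerfile):

```dockerfile
# webbuilder stage: fetch the private FE repo and build it.
FROM node:20-slim AS webbuilder
ARG LATEST_COMMIT_SHA
WORKDIR /usr/src
RUN apt-get update && apt-get install -y --no-install-recommends \
    curl ca-certificates unzip
# The GH token is passed as a BuildKit secret, never baked into a layer.
RUN --mount=type=secret,id=gh_token \
    curl -L -s -o main.zip \
      -H "Authorization: Bearer $(cat /run/secrets/gh_token)" \
      "https://api.github.com/repos/stacklok/codegate-ui/zipball/${LATEST_COMMIT_SHA}" \
    && unzip main.zip && rm main.zip
RUN cd stacklok-codegate-ui-* && npm ci && npm run build

# runtime stage: install nginx and copy in the built FE assets.
FROM python:3.12-slim AS runtime
RUN apt-get update && apt-get install -y --no-install-recommends nginx
COPY --from=webbuilder /usr/src/stacklok-codegate-ui-*/dist /var/www/html
```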

@aponcedeleonch
Contributor Author

aponcedeleonch commented Dec 5, 2024

This needs a GH token set up for the repo to be able to download the FE repo, since it's private. I tried secrets.GITHUB_TOKEN and it didn't have enough permissions. For the moment I inserted a personal token to test, and it is working.

Dockerfile Outdated
# -L to follow redirects
# -s to silence the progress bar
# -k to allow curl to make insecure connections
# -H to pass the GITHUB_TOKEN as a header
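
Put together, the flags documented above correspond to a curl invocation roughly like this (the download URL and the BuildKit secret mount are assumptions based on the discussion, not the exact line from the PR):

```dockerfile
# Hypothetical RUN line using the four flags; the token is mounted as a secret.
RUN --mount=type=secret,id=gh_token \
    curl -L -s -k \
      -H "Authorization: Bearer $(cat /run/secrets/gh_token)" \
      -o main.zip \
      "https://github.com/stacklok/codegate-ui/archive/refs/heads/main.zip"
```

As discussed below, the `-k` (insecure) flag was later dropped by installing ca-certificates in the build stage.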
Contributor

My spider sense is tingling. Why would we allow insecure connections and allow curl to follow redirects? Also, main might be OK for now, but we should use tags then and fetch a particular blob.

Contributor Author

I removed the insecure-connections bit by installing ca-certificates; I was missing that. We need redirects because otherwise we cannot download :( , it returns an empty file. Do you think that at some point we would need something other than main from the FE?

@@ -11,7 +11,7 @@ fi

# Step 2: Start the main application (serve)
echo "Starting the application..."
-exec python -m src.codegate.cli serve --port 8989 --host 0.0.0.0 --vllm-url https://inference.codegate.ai
+exec python -m src.codegate.cli serve --port 8989 --host 0.0.0.0 --vllm-url https://inference.codegate.ai &
Contributor

For some reason I can't build the Dockerfile locally (it complains about a missing main.zip?), but does this mean that the container stdout wouldn't have codegate log messages? Could we do it the other way around and run nginx in the background instead, or forward the messages to stdout?

(Ideally Docker containers are meant to serve one process... it would be ideal to serve the FE from the same codegate process as well... but let's discuss this next week.)

Contributor Author

@aponcedeleonch aponcedeleonch Dec 6, 2024


That happens when you don't pass a GH token with enough permissions; we need to read the private FE repo. If you set up a token, then you should be able to build with something like

docker build \
  --build-arg LATEST_COMMIT_SHA=$(curl -LSsk "https://api.github.com/repos/stacklok/codegate-ui/commits?per_page=1" -H "Authorization: Bearer $GITHUB_TOKEN" | jq -r '.[0].sha') \
  --secret id=gh_token,env=GITHUB_TOKEN \
  -t codegate .;

I checked, and it seems both were forwarding to stdout even though one was in the background. Anyway, just to be extra cautious, I followed your suggestion: nginx now runs in the background and FastAPI is in the foreground.
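
A minimal sketch of the resulting entrypoint, assuming nginx is installed in the image (the file layout and nginx flags here are assumptions, not the merged script):

```shell
#!/bin/sh
# Run nginx in the background; 'daemon off;' keeps its logs on stdout/stderr
# so they still reach the container logs.
nginx -g 'daemon off;' &

# Start the main application in the foreground; exec makes it PID 1 so it
# receives container signals (SIGTERM on docker stop) directly.
echo "Starting the application..."
exec python -m src.codegate.cli serve --port 8989 --host 0.0.0.0 --vllm-url https://inference.codegate.ai
```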

@aponcedeleonch aponcedeleonch marked this pull request as ready for review December 6, 2024 11:51
@aponcedeleonch aponcedeleonch merged commit 82232f5 into main Dec 6, 2024
3 checks passed
Contributor

@jhrozek jhrozek left a comment


After I stopped fumbling with the ports, this works!
