
Conversation

@Humphryshikunzi (Contributor)

Description

[Briefly describe the changes made in this pull request.]

Related Issues

[Reference any related issues or tasks addressed by this pull request.]

Changes Made

[List the specific changes made in this pull request.]

Checklist

  • Changes tested locally
  • Code reviewed
  • Documentation updated (if necessary)
  • Unit tests added (if applicable)

Additional Notes

[Add any additional notes or context for the reviewer(s).]

@Humphryshikunzi changed the title from "extended to use gemini, sswitched to use gemini-flash-latest" to "extended to use gemini, switched to use gemini-flash-latest" on Nov 3, 2025
@danielaskdd (Collaborator)

Conflicts must be resolved prior to further review.

@danielaskdd merged commit 5f49cee into HKUDS:main on Nov 6, 2025
@danielaskdd (Collaborator)

Thanks for sharing.

@danielaskdd (Collaborator)

PR #2326: Add Chain of Thought Support for Gemini LLM – Could you please pull the latest code and verify whether it functions as expected?
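As a rough illustration of what such a verification might look like, here is a minimal, hedged sketch of exercising Gemini chain-of-thought output with the google-genai Python SDK. This is not the repository's actual code: the model name `gemini-flash-latest` is taken from the PR title, and the `split_thoughts` helper is hypothetical. The live API call is guarded by an environment check so the part-splitting logic can be inspected without credentials.

```python
import os

def split_thoughts(parts):
    """Split response parts into (thought_texts, answer_texts).

    Each part is expected to expose `.text` and a boolean `.thought` flag,
    mirroring the shape of google-genai response parts when thinking is on.
    """
    thoughts = [p.text for p in parts if getattr(p, "thought", False)]
    answers = [p.text for p in parts if not getattr(p, "thought", False)]
    return thoughts, answers

if os.environ.get("GEMINI_API_KEY"):
    # Live verification sketch (assumes the google-genai SDK is installed).
    from google import genai
    from google.genai import types

    client = genai.Client()  # reads GEMINI_API_KEY from the environment
    resp = client.models.generate_content(
        model="gemini-flash-latest",
        contents="What is 17 * 23?",
        config=types.GenerateContentConfig(
            thinking_config=types.ThinkingConfig(include_thoughts=True)
        ),
    )
    thoughts, answers = split_thoughts(resp.candidates[0].content.parts)
    print("thought parts:", len(thoughts), "answer:", "".join(answers))
```

If the PR works as intended, a response generated with `include_thoughts=True` should yield at least one part flagged as a thought alongside the final answer text.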


