
allow llmClient to be optionally passed in #352

Merged

Conversation

@arihanv (Contributor) commented Dec 30, 2024

why

Allow users to easily define their own LLM clients rather than adding support for each major provider in the main codebase (#350).

what changed

Added an llmClient option to the Stagehand constructor so that custom clients can be passed in.
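The intended usage can be sketched roughly as follows. This is a minimal, self-contained sketch: the ChatMessage and LLMClient shapes, the EchoClient, and the Stagehand class below are simplified, hypothetical stand-ins for Stagehand's real exports, shown only to illustrate the optional-client pattern.

```typescript
// Simplified, hypothetical stand-in for a message and client contract.
interface ChatMessage {
  role: "system" | "user" | "assistant";
  content: string;
}

interface LLMClient {
  modelName: string;
  createChatCompletion(messages: ChatMessage[]): Promise<string>;
}

// A trivial custom client a user could supply instead of a built-in provider.
class EchoClient implements LLMClient {
  modelName = "echo-1";
  async createChatCompletion(messages: ChatMessage[]): Promise<string> {
    // Echo the last message back, so the behavior is easy to verify.
    return messages[messages.length - 1].content;
  }
}

// Sketch of the constructor change: llmClient is optional, so existing
// callers that omit it keep the default behavior.
class Stagehand {
  private llmClient: LLMClient;
  constructor(opts: { llmClient?: LLMClient } = {}) {
    this.llmClient = opts.llmClient ?? new EchoClient();
  }
  async ask(prompt: string): Promise<string> {
    return this.llmClient.createChatCompletion([
      { role: "user", content: prompt },
    ]);
  }
}
```

Because the option is optional, this is a backward-compatible (though still minor-version-worthy) interface change.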

@arihanv marked this pull request as ready for review December 30, 2024 21:33
@kamath (Contributor) commented Jan 2, 2025

This looks great! Can we add an example of how to use a custom client (like in #349, with Ollama/Llama 3.2) in examples/? Would love a custom_client.ts that implements an LLMClient for Llama 3.2 using Ollama and can be passed in on init.

We also need to work towards a fast-follow PR that allows a user to pass a custom LLMClient to act/extract/observe as well.

@arihanv (Contributor, Author) commented Jan 2, 2025

#349 now proposes an example Ollama client that extends LLMClient and uses Llama 3.2.
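Such a client can be sketched along these lines. Hedged heavily: the LLMClient base class here is a simplified stand-in for Stagehand's actual export, and the request/response shapes follow Ollama's /api/chat REST endpoint (non-streaming). The buildRequestBody helper is an illustrative addition so the payload can be inspected without a live server.

```typescript
type Msg = { role: "system" | "user" | "assistant"; content: string };

// Simplified stand-in for Stagehand's LLMClient base class.
abstract class LLMClient {
  constructor(public modelName: string) {}
  abstract createChatCompletion(messages: Msg[]): Promise<string>;
}

class OllamaClient extends LLMClient {
  constructor(
    modelName: string = "llama3.2",
    private baseUrl: string = "http://localhost:11434",
  ) {
    super(modelName);
  }

  // Split out so the request shape can be checked without a running Ollama.
  buildRequestBody(messages: Msg[]): string {
    return JSON.stringify({ model: this.modelName, messages, stream: false });
  }

  async createChatCompletion(messages: Msg[]): Promise<string> {
    const res = await fetch(`${this.baseUrl}/api/chat`, {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: this.buildRequestBody(messages),
    });
    const data = await res.json();
    // Ollama's non-streaming /api/chat response carries message.content.
    return data.message.content;
  }
}
```

An instance could then be handed to the constructor as `new Stagehand({ llmClient: new OllamaClient() })`, assuming the option added in this PR.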

@kamath (Contributor) commented Jan 3, 2025

Perfect, I think both look good. Can we fold #349 in here as follows:

  • Move the Ollama Client to ./examples/external_clients/ollama.ts
  • Create an example at ./examples/external_client.ts that references the LLMClient above

This way, we can more easily clone the branch and make sure nothing else is breaking. tysm 🙇 🙏

@kamath (Contributor) commented Jan 3, 2025

Also, please add a changeset (npx changeset) with a minor tag (since this changes the interface for the end user), and add any supplementary docs to the README.
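For reference, the changeset produced by npx changeset is a small markdown file under .changeset/ with a random auto-generated filename; for this PR it would look roughly like the following (the description line is illustrative):

```md
---
"@browserbasehq/stagehand": minor
---

Allow an llmClient to be optionally passed in to the Stagehand constructor
```

The "minor" tag is what tells the release tooling to bump the minor version on the next publish.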

changeset-bot commented Jan 3, 2025

🦋 Changeset detected

Latest commit: 0495898

The changes in this PR will be included in the next version bump.

This PR includes changesets to release 1 package:

  • @browserbasehq/stagehand — Minor

@kamath kamath changed the base branch from main to evals/add-ollama January 3, 2025 05:28
@kamath (Contributor) left a comment

hell yeah! merging into an evals branch to make sure we're not missing anything

Review comment on this excerpt of the example (truncated as shown in the diff):

    projectId: process.env.BROWSERBASE_PROJECT_ID,
    verbose: 1,
    llmClient: new OllamaClient(
      (message: LogLine) =>

Comment (Contributor): for a future PR, we should make LLMClient inherit the logger from Stagehand by default so we don't have to do this.

@kamath merged commit de44501 into browserbase:evals/add-ollama Jan 3, 2025
kamath added a commit that referenced this pull request Jan 3, 2025
* allow llmClient to be optionally passed in (#352)

* feat: allow llmClient to be optionally passed in

* update: add ollama client example from pr: #349

* update: README and changeset

* lint

---------

Co-authored-by: Arihan Varanasi <[email protected]>