
Module 5 Lessons 4 & 5: TrustCall create_extractor with gemini-1.5 (flash or pro) models errors #64

Open
emilyvanark opened this issue Nov 20, 2024 · 3 comments

Using gemini models:

from langchain_google_vertexai import ChatVertexAI
model = ChatVertexAI(
    model="gemini-1.5-pro-002", 
    temperature=0,
) 

and trustcall create_extractor:

from trustcall import create_extractor

# Create the extractor
trustcall_extractor = create_extractor(
    model,
    tools=[Memory],
    tool_choice="Memory",
    enable_inserts=True,
)
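The repeated "Key 'examples' is not supported in schema, ignoring" warnings below suggest the `Memory` schema carries `examples` metadata that Vertex AI's function-calling schema rejects. A minimal sketch of a workaround (the lesson's actual `Memory` model may differ): keep the field descriptions but drop any `examples=` from the Pydantic `Field` definitions, so the generated JSON schema no longer contains the unsupported key.

```python
from pydantic import BaseModel, Field

# Hypothetical stand-in for the lesson's Memory model: keep the field
# description but omit `examples=`, which the Gemini/Vertex AI
# function-calling schema does not accept.
class Memory(BaseModel):
    content: str = Field(
        description="The main content of the memory, e.g. a user preference."
    )
```

This silences the warnings only if they come from your own schema; trustcall also injects its own patch-tool schemas, which you don't control from here.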

In both lessons 4 & 5 of Module 5, when the trustcall_extractor is invoked:

result = trustcall_extractor.invoke({"messages": updated_conversation, 
                                     "existing": existing_memories})

the following error is given:

Key 'examples' is not supported in schema, ignoring
(the line above repeats 25 times)
---------------------------------------------------------------------------
GraphRecursionError                       Traceback (most recent call last)
Cell In[14], line 2
      1 # Invoke the extractor with our updated conversation and existing memories
----> 2 result = trustcall_extractor.invoke({"messages": updated_conversation, 
      3                                      "existing": existing_memories})

File [~/langchain-academy/lc-academy-env/lib/python3.11/site-packages/langchain_core/runnables/base.py:3024](http://localhost:8888/lab/tree/module-5/lc-academy-env/lib/python3.11/site-packages/langchain_core/runnables/base.py#line=3023), in RunnableSequence.invoke(self, input, config, **kwargs)
   3022             input = context.run(step.invoke, input, config, **kwargs)
   3023         else:
-> 3024             input = context.run(step.invoke, input, config)
   3025 # finish the root run
   3026 except BaseException as e:

File [~/langchain-academy/lc-academy-env/lib/python3.11/site-packages/langgraph/pregel/__init__.py:1749](http://localhost:8888/lab/tree/module-5/lc-academy-env/lib/python3.11/site-packages/langgraph/pregel/__init__.py#line=1748), in Pregel.invoke(self, input, config, stream_mode, output_keys, interrupt_before, interrupt_after, debug, **kwargs)
   1747 else:
   1748     chunks = []
-> 1749 for chunk in self.stream(
   1750     input,
   1751     config,
   1752     stream_mode=stream_mode,
   1753     output_keys=output_keys,
   1754     interrupt_before=interrupt_before,
   1755     interrupt_after=interrupt_after,
   1756     debug=debug,
   1757     **kwargs,
   1758 ):
   1759     if stream_mode == "values":
   1760         latest = chunk

File [~/langchain-academy/lc-academy-env/lib/python3.11/site-packages/langgraph/pregel/__init__.py:1497](http://localhost:8888/lab/tree/module-5/lc-academy-env/lib/python3.11/site-packages/langgraph/pregel/__init__.py#line=1496), in Pregel.stream(self, input, config, stream_mode, output_keys, interrupt_before, interrupt_after, debug, subgraphs)
   1488 if loop.status == "out_of_steps":
   1489     msg = create_error_message(
   1490         message=(
   1491             f"Recursion limit of {config['recursion_limit']} reached "
   (...)
   1495         error_code=ErrorCode.GRAPH_RECURSION_LIMIT,
   1496     )
-> 1497     raise GraphRecursionError(msg)
   1498 # set final channel values as run output
   1499 run_manager.on_chain_end(loop.output)

GraphRecursionError: Recursion limit of 25 reached without hitting a stop condition. You can increase the limit by setting the `recursion_limit` config key.
For troubleshooting, visit: https://python.langchain.com/docs/troubleshooting/errors/GRAPH_RECURSION_LIMIT
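As the error message says, the limit can be raised via the `recursion_limit` config key. A minimal sketch (assuming the `trustcall_extractor` and inputs from the failing cell above); note this only postpones the failure if the Gemini tool-calling loop never converges, and does not address the schema incompatibility itself:

```python
# Raise LangGraph's recursion limit from the default of 25.
config = {"recursion_limit": 50}

# Hypothetical invocation, reusing the inputs from the failing cell:
# result = trustcall_extractor.invoke(
#     {"messages": updated_conversation, "existing": existing_memories},
#     config=config,
# )
```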

I'm afraid I'm too new to LangChain / LangGraph / TrustCall to know how to debug this, but it seems likely that tool usage in the VertexAI library isn't playing well with TrustCall yet? (It seems to do fine with straight-up LangGraph / LangChain calls...)

emilyvanark commented Nov 20, 2024

The above example is from Lesson 4. In Lesson 5, with gemini-1.5-pro, the graph can update the user profile and the instructions, but not the task list. For the task list, I get:

Instruction in the Jupyter code cell:

# User input for a ToDo
input_messages = [HumanMessage(content="I need to fix the jammed electric Yale lock on the door.")]

# Run the graph
for chunk in graph.stream({"messages": input_messages}, config, stream_mode="values"):
    chunk["messages"][-1].pretty_print()

Start of a good response:

================================ Human Message =================================

I need to fix the jammed electric Yale lock on the door.
================================== Ai Message ==================================

I've added "fix the jammed electric Yale lock on the door" to your to-do list.  I'll try to find a local locksmith and add it to the task.
Tool Calls:
  UpdateMemory (7e6ac065-94a8-472a-a46c-986e00c3b185)
 Call ID: 7e6ac065-94a8-472a-a46c-986e00c3b185
  Args:
    update_type: todo

Then the same "Key 'examples' is not supported in schema, ignoring" warnings appear, followed by a different error:

This model can reply with multiple function calls in one response. Please don't rely on `additional_kwargs.function_call` as only the last one will be saved.Use `tool_calls` instead.
Key 'examples' is not supported in schema, ignoring
Key 'examples' is not supported in schema, ignoring
---------------------------------------------------------------------------
ValueError                                Traceback (most recent call last)
File [~/langchain-academy/lc-academy-env/lib/python3.11/site-packages/google/protobuf/json_format.py:925](http://localhost:8888/lab/tree/module-5/lc-academy-env/lib/python3.11/site-packages/google/protobuf/json_format.py#line=924), in _ConvertScalarFieldValue(value, field, path, require_str)
    924 try:
--> 925   number = int(value)
    926   enum_value = field.enum_type.values_by_number.get(number, None)

ValueError: invalid literal for int() with base 10: 'string'

The above exception was the direct cause of the following exception:

EnumStringValueParseError                 Traceback (most recent call last)
File [~/langchain-academy/lc-academy-env/lib/python3.11/site-packages/google/protobuf/json_format.py:930](http://localhost:8888/lab/tree/module-5/lc-academy-env/lib/python3.11/site-packages/google/protobuf/json_format.py#line=929), in _ConvertScalarFieldValue(value, field, path, require_str)
    927 except ValueError as e:
    928   # Since parsing to integer failed and lookup in values_by_name didn't
    929   # find this name, we have an enum string value which is unknown.
--> 930   raise EnumStringValueParseError(
    931       'Invalid enum value {0} for enum type {1}'.format(
    932           value, field.enum_type.full_name
    933       )
    934   ) from e
    935 if enum_value is None:

EnumStringValueParseError: Invalid enum value string for enum type google.cloud.aiplatform.v1beta1.Type

The above exception was the direct cause of the following exception:

EnumStringValueParseError                 Traceback (most recent call last)
File [~/langchain-academy/lc-academy-env/lib/python3.11/site-packages/google/protobuf/json_format.py:689](http://localhost:8888/lab/tree/module-5/lc-academy-env/lib/python3.11/site-packages/google/protobuf/json_format.py#line=688), in _Parser._ConvertFieldValuePair(self, js, message, path)
    688     else:
--> 689       self._ConvertAndSetScalar(message, field, value, '{0}.{1}'.format(path, name))
    690 except ParseError as e:

File [~/langchain-academy/lc-academy-env/lib/python3.11/site-packages/google/protobuf/json_format.py:857](http://localhost:8888/lab/tree/module-5/lc-academy-env/lib/python3.11/site-packages/google/protobuf/json_format.py#line=856), in _Parser._ConvertAndSetScalar(self, message, field, js_value, path)
    853 try:
    854   setattr(
    855       message,
    856       field.name,
--> 857       _ConvertScalarFieldValue(js_value, field, path))
    858 except EnumStringValueParseError:

File [~/langchain-academy/lc-academy-env/lib/python3.11/site-packages/google/protobuf/json_format.py:946](http://localhost:8888/lab/tree/module-5/lc-academy-env/lib/python3.11/site-packages/google/protobuf/json_format.py#line=945), in _ConvertScalarFieldValue(value, field, path, require_str)
    945 except EnumStringValueParseError as e:
--> 946   raise EnumStringValueParseError('{0} at {1}'.format(e, path)) from e
    947 except ParseError as e:

EnumStringValueParseError: Invalid enum value string for enum type google.cloud.aiplatform.v1beta1.Type at Schema.properties[patches].items.properties[value].anyOf[0].type

The above exception was the direct cause of the following exception:

ParseError                                Traceback (most recent call last)
File [~/langchain-academy/lc-academy-env/lib/python3.11/site-packages/google/protobuf/json_format.py:663](http://localhost:8888/lab/tree/module-5/lc-academy-env/lib/python3.11/site-packages/google/protobuf/json_format.py#line=662), in _Parser._ConvertFieldValuePair(self, js, message, path)
    657       raise ParseError(
    658           'null is not allowed to be used as an element'
    659           ' in a repeated field at {0}.{1}[{2}]'.format(
    660               path, name, index
    661           )
    662       )
--> 663     self.ConvertMessage(
    664         item, sub_message, '{0}.{1}[{2}]'.format(path, name, index)
    665     )
    666 else:
    667   # Repeated scalar field.

File [~/langchain-academy/lc-academy-env/lib/python3.11/site-packages/google/protobuf/json_format.py:540](http://localhost:8888/lab/tree/module-5/lc-academy-env/lib/python3.11/site-packages/google/protobuf/json_format.py#line=539), in _Parser.ConvertMessage(self, value, message, path)
    539 else:
--> 540   self._ConvertFieldValuePair(value, message, path)
    541 self.recursion_depth -= 1

File [~/langchain-academy/lc-academy-env/lib/python3.11/site-packages/google/protobuf/json_format.py:692](http://localhost:8888/lab/tree/module-5/lc-academy-env/lib/python3.11/site-packages/google/protobuf/json_format.py#line=691), in _Parser._ConvertFieldValuePair(self, js, message, path)
    691 if field and field.containing_oneof is None:
--> 692   raise ParseError(
    693       'Failed to parse {0} field: {1}.'.format(name, e)
    694   ) from e
    695 else:

ParseError: Failed to parse type field: Invalid enum value string for enum type google.cloud.aiplatform.v1beta1.Type at Schema.properties[patches].items.properties[value].anyOf[0].type.

The above exception was the direct cause of the following exception:

ParseError                                Traceback (most recent call last)
File [~/langchain-academy/lc-academy-env/lib/python3.11/site-packages/google/protobuf/json_format.py:636](http://localhost:8888/lab/tree/module-5/lc-academy-env/lib/python3.11/site-packages/google/protobuf/json_format.py#line=635), in _Parser._ConvertFieldValuePair(self, js, message, path)
    635   message.ClearField(field.name)
--> 636   self._ConvertMapFieldValue(
    637       value, message, field, '{0}.{1}'.format(path, name)
    638   )
    639 elif field.label == descriptor.FieldDescriptor.LABEL_REPEATED:

File [~/langchain-academy/lc-academy-env/lib/python3.11/site-packages/google/protobuf/json_format.py:829](http://localhost:8888/lab/tree/module-5/lc-academy-env/lib/python3.11/site-packages/google/protobuf/json_format.py#line=828), in _Parser._ConvertMapFieldValue(self, value, message, field, path)
    828 if value_field.cpp_type == descriptor.FieldDescriptor.CPPTYPE_MESSAGE:
--> 829   self.ConvertMessage(
    830       value[key],
    831       getattr(message, field.name)[key_value],
    832       '{0}[{1}]'.format(path, key_value),
    833   )
    834 else:

File [~/langchain-academy/lc-academy-env/lib/python3.11/site-packages/google/protobuf/json_format.py:540](http://localhost:8888/lab/tree/module-5/lc-academy-env/lib/python3.11/site-packages/google/protobuf/json_format.py#line=539), in _Parser.ConvertMessage(self, value, message, path)
    539 else:
--> 540   self._ConvertFieldValuePair(value, message, path)
    541 self.recursion_depth -= 1

File [~/langchain-academy/lc-academy-env/lib/python3.11/site-packages/google/protobuf/json_format.py:692](http://localhost:8888/lab/tree/module-5/lc-academy-env/lib/python3.11/site-packages/google/protobuf/json_format.py#line=691), in _Parser._ConvertFieldValuePair(self, js, message, path)
    691 if field and field.containing_oneof is None:
--> 692   raise ParseError(
    693       'Failed to parse {0} field: {1}.'.format(name, e)
    694   ) from e
    695 else:

ParseError: Failed to parse anyOf field: Failed to parse type field: Invalid enum value string for enum type google.cloud.aiplatform.v1beta1.Type at Schema.properties[patches].items.properties[value].anyOf[0].type..

The above exception was the direct cause of the following exception:

ParseError                                Traceback (most recent call last)
File [~/langchain-academy/lc-academy-env/lib/python3.11/site-packages/google/protobuf/json_format.py:684](http://localhost:8888/lab/tree/module-5/lc-academy-env/lib/python3.11/site-packages/google/protobuf/json_format.py#line=683), in _Parser._ConvertFieldValuePair(self, js, message, path)
    683   sub_message.SetInParent()
--> 684   self.ConvertMessage(value, sub_message, '{0}.{1}'.format(path, name))
    685 else:

File [~/langchain-academy/lc-academy-env/lib/python3.11/site-packages/google/protobuf/json_format.py:540](http://localhost:8888/lab/tree/module-5/lc-academy-env/lib/python3.11/site-packages/google/protobuf/json_format.py#line=539), in _Parser.ConvertMessage(self, value, message, path)
    539 else:
--> 540   self._ConvertFieldValuePair(value, message, path)
    541 self.recursion_depth -= 1

File [~/langchain-academy/lc-academy-env/lib/python3.11/site-packages/google/protobuf/json_format.py:692](http://localhost:8888/lab/tree/module-5/lc-academy-env/lib/python3.11/site-packages/google/protobuf/json_format.py#line=691), in _Parser._ConvertFieldValuePair(self, js, message, path)
    691 if field and field.containing_oneof is None:
--> 692   raise ParseError(
    693       'Failed to parse {0} field: {1}.'.format(name, e)
    694   ) from e
    695 else:

ParseError: Failed to parse properties field: Failed to parse anyOf field: Failed to parse type field: Invalid enum value string for enum type google.cloud.aiplatform.v1beta1.Type at Schema.properties[patches].items.properties[value].anyOf[0].type...

The above exception was the direct cause of the following exception:

ParseError                                Traceback (most recent call last)
File [~/langchain-academy/lc-academy-env/lib/python3.11/site-packages/google/protobuf/json_format.py:636](http://localhost:8888/lab/tree/module-5/lc-academy-env/lib/python3.11/site-packages/google/protobuf/json_format.py#line=635), in _Parser._ConvertFieldValuePair(self, js, message, path)
    635   message.ClearField(field.name)
--> 636   self._ConvertMapFieldValue(
    637       value, message, field, '{0}.{1}'.format(path, name)
    638   )
    639 elif field.label == descriptor.FieldDescriptor.LABEL_REPEATED:

File [~/langchain-academy/lc-academy-env/lib/python3.11/site-packages/google/protobuf/json_format.py:829](http://localhost:8888/lab/tree/module-5/lc-academy-env/lib/python3.11/site-packages/google/protobuf/json_format.py#line=828), in _Parser._ConvertMapFieldValue(self, value, message, field, path)
    828 if value_field.cpp_type == descriptor.FieldDescriptor.CPPTYPE_MESSAGE:
--> 829   self.ConvertMessage(
    830       value[key],
    831       getattr(message, field.name)[key_value],
    832       '{0}[{1}]'.format(path, key_value),
    833   )
    834 else:

File [~/langchain-academy/lc-academy-env/lib/python3.11/site-packages/google/protobuf/json_format.py:540](http://localhost:8888/lab/tree/module-5/lc-academy-env/lib/python3.11/site-packages/google/protobuf/json_format.py#line=539), in _Parser.ConvertMessage(self, value, message, path)
    539 else:
--> 540   self._ConvertFieldValuePair(value, message, path)
    541 self.recursion_depth -= 1

File [~/langchain-academy/lc-academy-env/lib/python3.11/site-packages/google/protobuf/json_format.py:692](http://localhost:8888/lab/tree/module-5/lc-academy-env/lib/python3.11/site-packages/google/protobuf/json_format.py#line=691), in _Parser._ConvertFieldValuePair(self, js, message, path)
    691 if field and field.containing_oneof is None:
--> 692   raise ParseError(
    693       'Failed to parse {0} field: {1}.'.format(name, e)
    694   ) from e
    695 else:

ParseError: Failed to parse items field: Failed to parse properties field: Failed to parse anyOf field: Failed to parse type field: Invalid enum value string for enum type google.cloud.aiplatform.v1beta1.Type at Schema.properties[patches].items.properties[value].anyOf[0].type....

The above exception was the direct cause of the following exception:

ParseError                                Traceback (most recent call last)
Cell In[58], line 5
      2 input_messages = [HumanMessage(content="I need to fix the jammed electric Yale lock on the door.")]
      4 # Run the graph
----> 5 for chunk in graph.stream({"messages": input_messages}, config, stream_mode="values"):
      6     chunk["messages"][-1].pretty_print()

File [~/langchain-academy/lc-academy-env/lib/python3.11/site-packages/langgraph/pregel/__init__.py:1477](http://localhost:8888/lab/tree/module-5/lc-academy-env/lib/python3.11/site-packages/langgraph/pregel/__init__.py#line=1476), in Pregel.stream(self, input, config, stream_mode, output_keys, interrupt_before, interrupt_after, debug, subgraphs)
   1466     # Similarly to Bulk Synchronous Parallel / Pregel model
   1467     # computation proceeds in steps, while there are channel updates
   1468     # channel updates from step N are only visible in step N+1
   1469     # channels are guaranteed to be immutable for the duration of the step,
   1470     # with channel updates applied only at the transition between steps
   1471     while loop.tick(
   1472         input_keys=self.input_channels,
   1473         interrupt_before=interrupt_before_,
   1474         interrupt_after=interrupt_after_,
   1475         manager=run_manager,
   1476     ):
-> 1477         for _ in runner.tick(
   1478             loop.tasks.values(),
   1479             timeout=self.step_timeout,
   1480             retry_policy=self.retry_policy,
   1481             get_waiter=get_waiter,
   1482         ):
   1483             # emit output
   1484             yield from output()
   1485 # emit output

File [~/langchain-academy/lc-academy-env/lib/python3.11/site-packages/langgraph/pregel/runner.py:58](http://localhost:8888/lab/tree/module-5/lc-academy-env/lib/python3.11/site-packages/langgraph/pregel/runner.py#line=57), in PregelRunner.tick(self, tasks, reraise, timeout, retry_policy, get_waiter)
     56 t = tasks[0]
     57 try:
---> 58     run_with_retry(t, retry_policy)
     59     self.commit(t, None)
     60 except Exception as exc:

File [~/langchain-academy/lc-academy-env/lib/python3.11/site-packages/langgraph/pregel/retry.py:29](http://localhost:8888/lab/tree/module-5/lc-academy-env/lib/python3.11/site-packages/langgraph/pregel/retry.py#line=28), in run_with_retry(task, retry_policy)
     27 task.writes.clear()
     28 # run the task
---> 29 task.proc.invoke(task.input, config)
     30 # if successful, end
     31 break

File [~/langchain-academy/lc-academy-env/lib/python3.11/site-packages/langgraph/utils/runnable.py:410](http://localhost:8888/lab/tree/module-5/lc-academy-env/lib/python3.11/site-packages/langgraph/utils/runnable.py#line=409), in RunnableSeq.invoke(self, input, config, **kwargs)
    408 context.run(_set_config_context, config)
    409 if i == 0:
--> 410     input = context.run(step.invoke, input, config, **kwargs)
    411 else:
    412     input = context.run(step.invoke, input, config)

File [~/langchain-academy/lc-academy-env/lib/python3.11/site-packages/langgraph/utils/runnable.py:184](http://localhost:8888/lab/tree/module-5/lc-academy-env/lib/python3.11/site-packages/langgraph/utils/runnable.py#line=183), in RunnableCallable.invoke(self, input, config, **kwargs)
    182 else:
    183     context.run(_set_config_context, config)
--> 184     ret = context.run(self.func, input, **kwargs)
    185 if isinstance(ret, Runnable) and self.recurse:
    186     return ret.invoke(input, config)

Cell In[53], line 244, in update_todos(state, config, store)
    236 todo_extractor = create_extractor(
    237 model,
    238 tools=[ToDo],
    239 tool_choice=tool_name,
    240 enable_inserts=True
    241 ).with_listeners(on_end=spy)
    243 # Invoke the extractor
--> 244 result = todo_extractor.invoke({"messages": updated_messages, 
    245                                 "existing": existing_memories})
    247 # Save the memories from Trustcall to the store
    248 for r, rmeta in zip(result["responses"], result["response_metadata"]):

File [~/langchain-academy/lc-academy-env/lib/python3.11/site-packages/langchain_core/runnables/base.py:5354](http://localhost:8888/lab/tree/module-5/lc-academy-env/lib/python3.11/site-packages/langchain_core/runnables/base.py#line=5353), in RunnableBindingBase.invoke(self, input, config, **kwargs)
   5348 def invoke(
   5349     self,
   5350     input: Input,
   5351     config: Optional[RunnableConfig] = None,
   5352     **kwargs: Optional[Any],
   5353 ) -> Output:
-> 5354     return self.bound.invoke(
   5355         input,
   5356         self._merge_configs(config),
   5357         **{**self.kwargs, **kwargs},
   5358     )

File [~/langchain-academy/lc-academy-env/lib/python3.11/site-packages/langchain_core/runnables/base.py:3024](http://localhost:8888/lab/tree/module-5/lc-academy-env/lib/python3.11/site-packages/langchain_core/runnables/base.py#line=3023), in RunnableSequence.invoke(self, input, config, **kwargs)
   3022             input = context.run(step.invoke, input, config, **kwargs)
   3023         else:
-> 3024             input = context.run(step.invoke, input, config)
   3025 # finish the root run
   3026 except BaseException as e:

File [~/langchain-academy/lc-academy-env/lib/python3.11/site-packages/langgraph/pregel/__init__.py:1749](http://localhost:8888/lab/tree/module-5/lc-academy-env/lib/python3.11/site-packages/langgraph/pregel/__init__.py#line=1748), in Pregel.invoke(self, input, config, stream_mode, output_keys, interrupt_before, interrupt_after, debug, **kwargs)
   1747 else:
   1748     chunks = []
-> 1749 for chunk in self.stream(
   1750     input,
   1751     config,
   1752     stream_mode=stream_mode,
   1753     output_keys=output_keys,
   1754     interrupt_before=interrupt_before,
   1755     interrupt_after=interrupt_after,
   1756     debug=debug,
   1757     **kwargs,
   1758 ):
   1759     if stream_mode == "values":
   1760         latest = chunk

File [~/langchain-academy/lc-academy-env/lib/python3.11/site-packages/langgraph/pregel/__init__.py:1477](http://localhost:8888/lab/tree/module-5/lc-academy-env/lib/python3.11/site-packages/langgraph/pregel/__init__.py#line=1476), in Pregel.stream(self, input, config, stream_mode, output_keys, interrupt_before, interrupt_after, debug, subgraphs)
   1466     # Similarly to Bulk Synchronous Parallel / Pregel model
   1467     # computation proceeds in steps, while there are channel updates
   1468     # channel updates from step N are only visible in step N+1
   1469     # channels are guaranteed to be immutable for the duration of the step,
   1470     # with channel updates applied only at the transition between steps
   1471     while loop.tick(
   1472         input_keys=self.input_channels,
   1473         interrupt_before=interrupt_before_,
   1474         interrupt_after=interrupt_after_,
   1475         manager=run_manager,
   1476     ):
-> 1477         for _ in runner.tick(
   1478             loop.tasks.values(),
   1479             timeout=self.step_timeout,
   1480             retry_policy=self.retry_policy,
   1481             get_waiter=get_waiter,
   1482         ):
   1483             # emit output
   1484             yield from output()
   1485 # emit output

File [~/langchain-academy/lc-academy-env/lib/python3.11/site-packages/langgraph/pregel/runner.py:113](http://localhost:8888/lab/tree/module-5/lc-academy-env/lib/python3.11/site-packages/langgraph/pregel/runner.py#line=112), in PregelRunner.tick(self, tasks, reraise, timeout, retry_policy, get_waiter)
    111     yield
    112 # panic on failure or timeout
--> 113 _panic_or_proceed(all_futures, panic=reraise)

File [~/langchain-academy/lc-academy-env/lib/python3.11/site-packages/langgraph/pregel/runner.py:279](http://localhost:8888/lab/tree/module-5/lc-academy-env/lib/python3.11/site-packages/langgraph/pregel/runner.py#line=278), in _panic_or_proceed(futs, timeout_exc_cls, panic)
    277 # raise the exception
    278 if panic:
--> 279     raise exc
    280 else:
    281     return

File [~/langchain-academy/lc-academy-env/lib/python3.11/site-packages/langgraph/pregel/executor.py:70](http://localhost:8888/lab/tree/module-5/lc-academy-env/lib/python3.11/site-packages/langgraph/pregel/executor.py#line=69), in BackgroundExecutor.done(self, task)
     68 def done(self, task: concurrent.futures.Future) -> None:
     69     try:
---> 70         task.result()
     71     except GraphInterrupt:
     72         # This exception is an interruption signal, not an error
     73         # so we don't want to re-raise it on exit
     74         self.tasks.pop(task)

File /usr/local/lib/python3.11/concurrent/futures/_base.py:449, in Future.result(self, timeout)
    447     raise CancelledError()
    448 elif self._state == FINISHED:
--> 449     return self.__get_result()
    451 self._condition.wait(timeout)
    453 if self._state in [CANCELLED, CANCELLED_AND_NOTIFIED]:

File /usr/local/lib/python3.11/concurrent/futures/_base.py:401, in Future.__get_result(self)
    399 if self._exception:
    400     try:
--> 401         raise self._exception
    402     finally:
    403         # Break a reference cycle with the exception in self._exception
    404         self = None

File /usr/local/lib/python3.11/concurrent/futures/thread.py:58, in _WorkItem.run(self)
     55     return
     57 try:
---> 58     result = self.fn(*self.args, **self.kwargs)
     59 except BaseException as exc:
     60     self.future.set_exception(exc)

File [~/langchain-academy/lc-academy-env/lib/python3.11/site-packages/langgraph/pregel/retry.py:29](http://localhost:8888/lab/tree/module-5/lc-academy-env/lib/python3.11/site-packages/langgraph/pregel/retry.py#line=28), in run_with_retry(task, retry_policy)
     27 task.writes.clear()
     28 # run the task
---> 29 task.proc.invoke(task.input, config)
     30 # if successful, end
     31 break

File [~/langchain-academy/lc-academy-env/lib/python3.11/site-packages/langgraph/utils/runnable.py:410](http://localhost:8888/lab/tree/module-5/lc-academy-env/lib/python3.11/site-packages/langgraph/utils/runnable.py#line=409), in RunnableSeq.invoke(self, input, config, **kwargs)
    408 context.run(_set_config_context, config)
    409 if i == 0:
--> 410     input = context.run(step.invoke, input, config, **kwargs)
    411 else:
    412     input = context.run(step.invoke, input, config)

File [~/langchain-academy/lc-academy-env/lib/python3.11/site-packages/langgraph/utils/runnable.py:184](http://localhost:8888/lab/tree/module-5/lc-academy-env/lib/python3.11/site-packages/langgraph/utils/runnable.py#line=183), in RunnableCallable.invoke(self, input, config, **kwargs)
    182 else:
    183     context.run(_set_config_context, config)
--> 184     ret = context.run(self.func, input, **kwargs)
    185 if isinstance(ret, Runnable) and self.recurse:
    186     return ret.invoke(input, config)

File [~/langchain-academy/lc-academy-env/lib/python3.11/site-packages/trustcall/_base.py:880](http://localhost:8888/lab/tree/module-5/lc-academy-env/lib/python3.11/site-packages/trustcall/_base.py#line=879), in _Patch.invoke(self, state, config)
    879 def invoke(self, state: ExtendedExtractState, config: RunnableConfig) -> dict:
--> 880     msg = self.bound.invoke(state.messages, config)
    881     return self._tear_down(cast(AIMessage, msg), state.messages, state.tool_call_id)

File [~/langchain-academy/lc-academy-env/lib/python3.11/site-packages/langchain_core/runnables/base.py:5354](http://localhost:8888/lab/tree/module-5/lc-academy-env/lib/python3.11/site-packages/langchain_core/runnables/base.py#line=5353), in RunnableBindingBase.invoke(self, input, config, **kwargs)
   5348 def invoke(
   5349     self,
   5350     input: Input,
   5351     config: Optional[RunnableConfig] = None,
   5352     **kwargs: Optional[Any],
   5353 ) -> Output:
-> 5354     return self.bound.invoke(
   5355         input,
   5356         self._merge_configs(config),
   5357         **{**self.kwargs, **kwargs},
   5358     )

File [~/langchain-academy/lc-academy-env/lib/python3.11/site-packages/langchain_core/language_models/chat_models.py:286](http://localhost:8888/lab/tree/module-5/lc-academy-env/lib/python3.11/site-packages/langchain_core/language_models/chat_models.py#line=285), in BaseChatModel.invoke(self, input, config, stop, **kwargs)
    275 def invoke(
    276     self,
    277     input: LanguageModelInput,
   (...)
    281     **kwargs: Any,
    282 ) -> BaseMessage:
    283     config = ensure_config(config)
    284     return cast(
    285         ChatGeneration,
--> 286         self.generate_prompt(
    287             [self._convert_input(input)],
    288             stop=stop,
    289             callbacks=config.get("callbacks"),
    290             tags=config.get("tags"),
    291             metadata=config.get("metadata"),
    292             run_name=config.get("run_name"),
    293             run_id=config.pop("run_id", None),
    294             **kwargs,
    295         ).generations[0][0],
    296     ).message

File [~/langchain-academy/lc-academy-env/lib/python3.11/site-packages/langchain_core/language_models/chat_models.py:786](http://localhost:8888/lab/tree/module-5/lc-academy-env/lib/python3.11/site-packages/langchain_core/language_models/chat_models.py#line=785), in BaseChatModel.generate_prompt(self, prompts, stop, callbacks, **kwargs)
    778 def generate_prompt(
    779     self,
    780     prompts: list[PromptValue],
   (...)
    783     **kwargs: Any,
    784 ) -> LLMResult:
    785     prompt_messages = [p.to_messages() for p in prompts]
--> 786     return self.generate(prompt_messages, stop=stop, callbacks=callbacks, **kwargs)

File ~/langchain-academy/lc-academy-env/lib/python3.11/site-packages/langchain_core/language_models/chat_models.py:643, in BaseChatModel.generate(self, messages, stop, callbacks, tags, metadata, run_name, run_id, **kwargs)
    641         if run_managers:
    642             run_managers[i].on_llm_error(e, response=LLMResult(generations=[]))
--> 643         raise e
    644 flattened_outputs = [
    645     LLMResult(generations=[res.generations], llm_output=res.llm_output)  # type: ignore[list-item]
    646     for res in results
    647 ]
    648 llm_output = self._combine_llm_outputs([res.llm_output for res in results])

File ~/langchain-academy/lc-academy-env/lib/python3.11/site-packages/langchain_core/language_models/chat_models.py:633, in BaseChatModel.generate(self, messages, stop, callbacks, tags, metadata, run_name, run_id, **kwargs)
    630 for i, m in enumerate(messages):
    631     try:
    632         results.append(
--> 633             self._generate_with_cache(
    634                 m,
    635                 stop=stop,
    636                 run_manager=run_managers[i] if run_managers else None,
    637                 **kwargs,
    638             )
    639         )
    640     except BaseException as e:
    641         if run_managers:

File ~/langchain-academy/lc-academy-env/lib/python3.11/site-packages/langchain_core/language_models/chat_models.py:851, in BaseChatModel._generate_with_cache(self, messages, stop, run_manager, **kwargs)
    849 else:
    850     if inspect.signature(self._generate).parameters.get("run_manager"):
--> 851         result = self._generate(
    852             messages, stop=stop, run_manager=run_manager, **kwargs
    853         )
    854     else:
    855         result = self._generate(messages, stop=stop, **kwargs)

File ~/langchain-academy/lc-academy-env/lib/python3.11/site-packages/langchain_google_vertexai/chat_models.py:1220, in ChatVertexAI._generate(self, messages, stop, run_manager, stream, **kwargs)
   1218 if not self._is_gemini_model:
   1219     return self._generate_non_gemini(messages, stop=stop, **kwargs)
-> 1220 return self._generate_gemini(
   1221     messages=messages,
   1222     stop=stop,
   1223     run_manager=run_manager,
   1224     is_gemini=True,
   1225     **kwargs,
   1226 )

File ~/langchain-academy/lc-academy-env/lib/python3.11/site-packages/langchain_google_vertexai/chat_models.py:1388, in ChatVertexAI._generate_gemini(self, messages, stop, run_manager, **kwargs)
   1381 def _generate_gemini(
   1382     self,
   1383     messages: List[BaseMessage],
   (...)
   1386     **kwargs: Any,
   1387 ) -> ChatResult:
-> 1388     request = self._prepare_request_gemini(messages=messages, stop=stop, **kwargs)
   1389     response = _completion_with_retry(
   1390         self.prediction_client.generate_content,
   1391         max_retries=self.max_retries,
   (...)
   1394         **kwargs,
   1395     )
   1396     return self._gemini_response_to_chat_result(response)

File ~/langchain-academy/lc-academy-env/lib/python3.11/site-packages/langchain_google_vertexai/chat_models.py:1302, in ChatVertexAI._prepare_request_gemini(self, messages, stop, stream, tools, functions, tool_config, safety_settings, cached_content, tool_choice, logprobs, **kwargs)
   1286 def _prepare_request_gemini(
   1287     self,
   1288     messages: List[BaseMessage],
   (...)
   1299     **kwargs,
   1300 ) -> GenerateContentRequest:
   1301     system_instruction, contents = _parse_chat_history_gemini(messages)
-> 1302     formatted_tools = self._tools_gemini(tools=tools, functions=functions)
   1303     if tool_config:
   1304         tool_config = self._tool_config_gemini(tool_config=tool_config)

File ~/langchain-academy/lc-academy-env/lib/python3.11/site-packages/langchain_google_vertexai/chat_models.py:1444, in ChatVertexAI._tools_gemini(self, tools, functions)
   1439     logger.warning(
   1440         "Binding tools and functions together is not supported.",
   1441         "Only tools will be used",
   1442     )
   1443 if tools:
-> 1444     return [_format_to_gapic_tool(tools)]
   1445 if functions:
   1446     return [_format_to_gapic_tool(functions)]

File ~/langchain-academy/lc-academy-env/lib/python3.11/site-packages/langchain_google_vertexai/functions_utils.py:296, in _format_to_gapic_tool(tools)
    286 elif isinstance(tool, dict):
    287     # not _ToolDictLike
    288     if not any(
    289         f in tool
    290         for f in [
   (...)
    294         ]
    295     ):
--> 296         fd = _format_to_gapic_function_declaration(tool)
    297         gapic_tool.function_declarations.append(fd)
    298         continue

File ~/langchain-academy/lc-academy-env/lib/python3.11/site-packages/langchain_google_vertexai/functions_utils.py:263, in _format_to_gapic_function_declaration(tool)
    259 elif isinstance(tool, dict):
    260     # this could come from
    261     # 'langchain_core.utils.function_calling.convert_to_openai_tool'
    262     function = convert_to_openai_tool(cast(dict, tool))["function"]
--> 263     return _format_dict_to_function_declaration(cast(FunctionDescription, function))
    264 else:
    265     raise ValueError(f"Unsupported tool call type {tool}")

File ~/langchain-academy/lc-academy-env/lib/python3.11/site-packages/langchain_google_vertexai/functions_utils.py:231, in _format_dict_to_function_declaration(tool)
    227     parameters = _dict_to_gapic_schema(
    228         tool.get("parameters", {}), pydantic_version=pydantic_version
    229     )
    230 else:
--> 231     parameters = _dict_to_gapic_schema(tool.get("parameters", {}))
    233 return gapic.FunctionDeclaration(
    234     name=tool.get("name"),
    235     description=tool.get("description"),
    236     parameters=parameters,
    237 )

File ~/langchain-academy/lc-academy-env/lib/python3.11/site-packages/langchain_google_vertexai/functions_utils.py:163, in _dict_to_gapic_schema(schema, pydantic_version)
    161     formatted_schema = _format_json_schema_to_gapic(dereferenced_schema)
    162 json_schema = json.dumps(formatted_schema)
--> 163 return gapic.Schema.from_json(json_schema)

File ~/langchain-academy/lc-academy-env/lib/python3.11/site-packages/proto/message.py:549, in MessageMeta.from_json(cls, payload, ignore_unknown_fields)
    536 """Given a json string representing an instance,
    537 parse it into a message.
    538 
   (...)
    546     method was called.
    547 """
    548 instance = cls()
--> 549 Parse(payload, instance._pb, ignore_unknown_fields=ignore_unknown_fields)
    550 return instance

File ~/langchain-academy/lc-academy-env/lib/python3.11/site-packages/google/protobuf/json_format.py:465, in Parse(text, message, ignore_unknown_fields, descriptor_pool, max_recursion_depth)
    461   return ParseDict(
    462       js, message, ignore_unknown_fields, descriptor_pool, max_recursion_depth
    463   )
    464 except ParseError as e:
--> 465   raise e
    466 except Exception as e:
    467   raise ParseError(
    468       'Failed to parse JSON: {0}: {1}.'.format(type(e).__name__, str(e))
    469   ) from e

File ~/langchain-academy/lc-academy-env/lib/python3.11/site-packages/google/protobuf/json_format.py:461, in Parse(text, message, ignore_unknown_fields, descriptor_pool, max_recursion_depth)
    458   raise ParseError('Failed to load JSON: {0}.'.format(str(e))) from e
    460 try:
--> 461   return ParseDict(
    462       js, message, ignore_unknown_fields, descriptor_pool, max_recursion_depth
    463   )
    464 except ParseError as e:
    465   raise e

File ~/langchain-academy/lc-academy-env/lib/python3.11/site-packages/google/protobuf/json_format.py:495, in ParseDict(js_dict, message, ignore_unknown_fields, descriptor_pool, max_recursion_depth)
    479 """Parses a JSON dictionary representation into a message.
    480 
    481 Args:
   (...)
    492   The same message passed as argument.
    493 """
    494 parser = _Parser(ignore_unknown_fields, descriptor_pool, max_recursion_depth)
--> 495 parser.ConvertMessage(js_dict, message, '')
    496 return message

File ~/langchain-academy/lc-academy-env/lib/python3.11/site-packages/google/protobuf/json_format.py:540, in _Parser.ConvertMessage(self, value, message, path)
    538   methodcaller(_WKTJSONMETHODS[full_name][1], value, message, path)(self)
    539 else:
--> 540   self._ConvertFieldValuePair(value, message, path)
    541 self.recursion_depth -= 1

File ~/langchain-academy/lc-academy-env/lib/python3.11/site-packages/google/protobuf/json_format.py:692, in _Parser._ConvertFieldValuePair(self, js, message, path)
    690 except ParseError as e:
    691   if field and field.containing_oneof is None:
--> 692     raise ParseError(
    693         'Failed to parse {0} field: {1}.'.format(name, e)
    694     ) from e
    695   else:
    696     raise ParseError(str(e)) from e

ParseError: Failed to parse properties field: Failed to parse items field: Failed to parse properties field: Failed to parse anyOf field: Failed to parse type field: Invalid enum value string for enum type google.cloud.aiplatform.v1beta1.Type at Schema.properties[patches].items.properties[value].anyOf[0].type.....
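The final `ParseError` points at an `anyOf` entry in the generated tool schema (`Schema.properties[patches].items.properties[value].anyOf[0].type`). Pydantic renders `Optional`/`Union` fields as `anyOf` unions in JSON schema, and the Vertex AI gapic converter appears not to normalize the type names inside those branches. A minimal sketch of how such a union arises, using a hypothetical `JsonPatch` model standing in for trustcall's internal patch tool (not trustcall's actual class):

```python
from typing import Union

from pydantic import BaseModel


class JsonPatch(BaseModel):
    """Hypothetical stand-in for trustcall's internal patch tool schema."""

    op: str
    path: str
    # A union-typed field like this is emitted as an "anyOf" in JSON schema,
    # matching the path named in the ParseError above.
    value: Union[str, int, None] = None


schema = JsonPatch.model_json_schema()
print(schema["properties"]["value"])
```

The printed field schema contains an `anyOf` list of branches rather than a single `type`, which is the shape the gapic `Schema.from_json` call fails on.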

@rlancemartin
Collaborator

Thanks @emilyvanark! I will have a look with @hinthornw, the author.

Trustcall has been tested with OpenAI tool calling most extensively.
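
Until the converter handles union schemas, one possible interim mitigation (a hypothetical, lossy sketch, not part of trustcall or langchain-google-vertexai) is to pre-process a tool's JSON schema before binding it to the Gemini model: strip the `examples` keys that trigger the "not supported in schema" warnings and collapse each `anyOf` union to its first non-null branch.

```python
def sanitize_for_gemini(schema):
    """Recursively drop JSON-schema constructs Gemini's converter rejects.

    Removes the ``examples`` key (the source of the warnings above) and
    collapses an ``anyOf`` union to its first non-null branch. Lossy and
    hypothetical: a workaround sketch, not an endorsed fix.
    """
    if isinstance(schema, dict):
        schema = dict(schema)
        schema.pop("examples", None)
        if "anyOf" in schema:
            # Keep the first concrete branch; the union information is lost.
            branches = [b for b in schema.pop("anyOf") if b.get("type") != "null"]
            if branches:
                schema.update(sanitize_for_gemini(branches[0]))
        return {k: sanitize_for_gemini(v) for k, v in schema.items()}
    if isinstance(schema, list):
        return [sanitize_for_gemini(v) for v in schema]
    return schema
```

Because the collapsed union drops type information, values extracted under the sanitized schema may need post-validation against the original model.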
