
Commit 937e111

feat(aiplatform): update the api
#### aiplatform:v1

The following keys were deleted:
- schemas.GoogleCloudAiplatformV1Part.properties.thought.readOnly (Total Keys: 1)

The following keys were added:
- schemas.GoogleCloudAiplatformV1Part.properties.thoughtSignature (Total Keys: 2)
- schemas.GoogleCloudAiplatformV1Tool.properties.computerUse.$ref (Total Keys: 1)
- schemas.GoogleCloudAiplatformV1ToolComputerUse (Total Keys: 3)

#### aiplatform:v1beta1

The following keys were deleted:
- schemas.GoogleCloudAiplatformV1beta1Part.properties.thought.readOnly (Total Keys: 1)

The following keys were added:
- resources.projects.resources.locations.resources.reasoningEngines.resources.examples.resources.operations.methods.cancel (Total Keys: 11)
- resources.projects.resources.locations.resources.reasoningEngines.resources.examples.resources.operations.methods.delete (Total Keys: 11)
- resources.projects.resources.locations.resources.reasoningEngines.resources.examples.resources.operations.methods.get (Total Keys: 11)
- resources.projects.resources.locations.resources.reasoningEngines.resources.examples.resources.operations.methods.wait (Total Keys: 14)
- schemas.GoogleCloudAiplatformV1beta1Part.properties.thoughtSignature (Total Keys: 2)
- schemas.GoogleCloudAiplatformV1beta1Tool.properties.computerUse.$ref (Total Keys: 1)
- schemas.GoogleCloudAiplatformV1beta1ToolComputerUse (Total Keys: 3)
- schemas.GoogleCloudAiplatformV1beta1TuningJob.properties.satisfiesPzi (Total Keys: 2)
- schemas.GoogleCloudAiplatformV1beta1TuningJob.properties.satisfiesPzs (Total Keys: 2)
1 parent e4f7671 commit 937e111

26 files changed, +1003 −150 lines changed
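For context on the new Tool fields, below is a minimal, hypothetical sketch of how the added `computerUse` and `urlContext` entries might be supplied in a `generateContent` request body through the discovery-based Python client in this library. Only the field names (`tools`, `computerUse`, `environment`, `urlContext`) come from the diff below; the model path, project, and the `ENVIRONMENT_BROWSER` value are placeholders, not confirmed by this change.

```python
# Hypothetical sketch using the discovery-based client from this library.
# Field names come from the diff below; the model path and environment value
# are placeholders.
from googleapiclient import discovery

service = discovery.build("aiplatform", "v1")

body = {
    "contents": [
        {"role": "user", "parts": [{"text": "Open example.com and summarize the page."}]}
    ],
    "tools": [
        # New in this change: computer-use tool with its required environment.
        {"computerUse": {"environment": "ENVIRONMENT_BROWSER"}},  # assumed enum value
        # New in this change: URL context retrieval tool (empty object enables it).
        {"urlContext": {}},
    ],
}

request = service.endpoints().generateContent(
    model="projects/PROJECT/locations/global/publishers/google/models/MODEL",  # placeholder
    body=body,
)
response = request.execute()
print(response)
```

Per the Tool description in the diff, each Tool object should contain exactly one tool type, so the two new tools are passed as separate entries in the `tools` list.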

docs/dyn/aiplatform_v1.endpoints.html

Lines changed: 33 additions & 9 deletions
@@ -154,7 +154,8 @@ <h3>Method Details</h3>
  "mimeType": "A String", # Required. The IANA standard MIME type of the source data.
  },
  "text": "A String", # Optional. Text part (can be code).
- "thought": True or False, # Output only. Indicates if the part is thought from the model.
+ "thought": True or False, # Optional. Indicates if the part is thought from the model.
+ "thoughtSignature": "A String", # Optional. An opaque signature for the thought so it can be reused in subsequent requests.
  "videoMetadata": { # Metadata describes the input video content. # Optional. Video metadata. The metadata should only be specified while the video data is presented in inline_data or file_data.
  "endOffset": "A String", # Optional. The end offset of the video.
  "startOffset": "A String", # Optional. The start offset of the video.
@@ -238,7 +239,8 @@ <h3>Method Details</h3>
  "mimeType": "A String", # Required. The IANA standard MIME type of the source data.
  },
  "text": "A String", # Optional. Text part (can be code).
- "thought": True or False, # Output only. Indicates if the part is thought from the model.
+ "thought": True or False, # Optional. Indicates if the part is thought from the model.
+ "thoughtSignature": "A String", # Optional. An opaque signature for the thought so it can be reused in subsequent requests.
  "videoMetadata": { # Metadata describes the input video content. # Optional. Video metadata. The metadata should only be specified while the video data is presented in inline_data or file_data.
  "endOffset": "A String", # Optional. The end offset of the video.
  "startOffset": "A String", # Optional. The start offset of the video.
@@ -368,7 +370,8 @@ <h3>Method Details</h3>
  "mimeType": "A String", # Required. The IANA standard MIME type of the source data.
  },
  "text": "A String", # Optional. Text part (can be code).
- "thought": True or False, # Output only. Indicates if the part is thought from the model.
+ "thought": True or False, # Optional. Indicates if the part is thought from the model.
+ "thoughtSignature": "A String", # Optional. An opaque signature for the thought so it can be reused in subsequent requests.
  "videoMetadata": { # Metadata describes the input video content. # Optional. Video metadata. The metadata should only be specified while the video data is presented in inline_data or file_data.
  "endOffset": "A String", # Optional. The end offset of the video.
  "startOffset": "A String", # Optional. The start offset of the video.
@@ -381,6 +384,9 @@ <h3>Method Details</h3>
  { # Tool details that the model may use to generate response. A `Tool` is a piece of code that enables the system to interact with external systems to perform an action, or set of actions, outside of knowledge and scope of the model. A Tool object should contain exactly one type of Tool (e.g FunctionDeclaration, Retrieval or GoogleSearchRetrieval).
  "codeExecution": { # Tool that executes code generated by the model, and automatically returns the result to the model. See also [ExecutableCode]and [CodeExecutionResult] which are input and output to this tool. # Optional. CodeExecution tool type. Enables the model to execute code as part of generation.
  },
+ "computerUse": { # Tool to support computer use. # Optional. Tool to support the model interacting directly with the computer. If enabled, it automatically populates computer-use specific Function Declarations.
+ "environment": "A String", # Required. The environment being operated.
+ },
  "enterpriseWebSearch": { # Tool to search public web data, powered by Vertex AI Search and Sec4 compliance. # Optional. Tool to support searching public web data, powered by Vertex AI Search and Sec4 compliance.
  },
  "functionDeclarations": [ # Optional. Function tool type. One or more function declarations to be passed to the model along with the current user query. Model may decide to call a subset of these functions by populating FunctionCall in the response. User should provide a FunctionResponse for each function call in the next turn. Based on the function responses, Model will generate the final response back to the user. Maximum 128 function declarations can be provided.
@@ -560,6 +566,8 @@ <h3>Method Details</h3>
  "vectorDistanceThreshold": 3.14, # Optional. Only return results with vector distance smaller than the threshold.
  },
  },
+ "urlContext": { # Tool to support URL context. # Optional. Tool to support URL context retrieval.
+ },
  },
  ],
  }
@@ -672,7 +680,8 @@ <h3>Method Details</h3>
  "mimeType": "A String", # Required. The IANA standard MIME type of the source data.
  },
  "text": "A String", # Optional. Text part (can be code).
- "thought": True or False, # Output only. Indicates if the part is thought from the model.
+ "thought": True or False, # Optional. Indicates if the part is thought from the model.
+ "thoughtSignature": "A String", # Optional. An opaque signature for the thought so it can be reused in subsequent requests.
  "videoMetadata": { # Metadata describes the input video content. # Optional. Video metadata. The metadata should only be specified while the video data is presented in inline_data or file_data.
  "endOffset": "A String", # Optional. The end offset of the video.
  "startOffset": "A String", # Optional. The start offset of the video.
@@ -808,7 +817,8 @@ <h3>Method Details</h3>
  "mimeType": "A String", # Required. The IANA standard MIME type of the source data.
  },
  "text": "A String", # Optional. Text part (can be code).
- "thought": True or False, # Output only. Indicates if the part is thought from the model.
+ "thought": True or False, # Optional. Indicates if the part is thought from the model.
+ "thoughtSignature": "A String", # Optional. An opaque signature for the thought so it can be reused in subsequent requests.
  "videoMetadata": { # Metadata describes the input video content. # Optional. Video metadata. The metadata should only be specified while the video data is presented in inline_data or file_data.
  "endOffset": "A String", # Optional. The end offset of the video.
  "startOffset": "A String", # Optional. The start offset of the video.
@@ -836,6 +846,9 @@ <h3>Method Details</h3>
  { # Tool details that the model may use to generate response. A `Tool` is a piece of code that enables the system to interact with external systems to perform an action, or set of actions, outside of knowledge and scope of the model. A Tool object should contain exactly one type of Tool (e.g FunctionDeclaration, Retrieval or GoogleSearchRetrieval).
  "codeExecution": { # Tool that executes code generated by the model, and automatically returns the result to the model. See also [ExecutableCode]and [CodeExecutionResult] which are input and output to this tool. # Optional. CodeExecution tool type. Enables the model to execute code as part of generation.
  },
+ "computerUse": { # Tool to support computer use. # Optional. Tool to support the model interacting directly with the computer. If enabled, it automatically populates computer-use specific Function Declarations.
+ "environment": "A String", # Required. The environment being operated.
+ },
  "enterpriseWebSearch": { # Tool to search public web data, powered by Vertex AI Search and Sec4 compliance. # Optional. Tool to support searching public web data, powered by Vertex AI Search and Sec4 compliance.
  },
  "functionDeclarations": [ # Optional. Function tool type. One or more function declarations to be passed to the model along with the current user query. Model may decide to call a subset of these functions by populating FunctionCall in the response. User should provide a FunctionResponse for each function call in the next turn. Based on the function responses, Model will generate the final response back to the user. Maximum 128 function declarations can be provided.
@@ -1015,6 +1028,8 @@ <h3>Method Details</h3>
  "vectorDistanceThreshold": 3.14, # Optional. Only return results with vector distance smaller than the threshold.
  },
  },
+ "urlContext": { # Tool to support URL context. # Optional. Tool to support URL context retrieval.
+ },
  },
  ],
  }
@@ -1081,7 +1096,8 @@ <h3>Method Details</h3>
  "mimeType": "A String", # Required. The IANA standard MIME type of the source data.
  },
  "text": "A String", # Optional. Text part (can be code).
- "thought": True or False, # Output only. Indicates if the part is thought from the model.
+ "thought": True or False, # Optional. Indicates if the part is thought from the model.
+ "thoughtSignature": "A String", # Optional. An opaque signature for the thought so it can be reused in subsequent requests.
  "videoMetadata": { # Metadata describes the input video content. # Optional. Video metadata. The metadata should only be specified while the video data is presented in inline_data or file_data.
  "endOffset": "A String", # Optional. The end offset of the video.
  "startOffset": "A String", # Optional. The start offset of the video.
@@ -1354,7 +1370,8 @@ <h3>Method Details</h3>
  "mimeType": "A String", # Required. The IANA standard MIME type of the source data.
  },
  "text": "A String", # Optional. Text part (can be code).
- "thought": True or False, # Output only. Indicates if the part is thought from the model.
+ "thought": True or False, # Optional. Indicates if the part is thought from the model.
+ "thoughtSignature": "A String", # Optional. An opaque signature for the thought so it can be reused in subsequent requests.
  "videoMetadata": { # Metadata describes the input video content. # Optional. Video metadata. The metadata should only be specified while the video data is presented in inline_data or file_data.
  "endOffset": "A String", # Optional. The end offset of the video.
  "startOffset": "A String", # Optional. The start offset of the video.
@@ -1490,7 +1507,8 @@ <h3>Method Details</h3>
  "mimeType": "A String", # Required. The IANA standard MIME type of the source data.
  },
  "text": "A String", # Optional. Text part (can be code).
- "thought": True or False, # Output only. Indicates if the part is thought from the model.
+ "thought": True or False, # Optional. Indicates if the part is thought from the model.
+ "thoughtSignature": "A String", # Optional. An opaque signature for the thought so it can be reused in subsequent requests.
  "videoMetadata": { # Metadata describes the input video content. # Optional. Video metadata. The metadata should only be specified while the video data is presented in inline_data or file_data.
  "endOffset": "A String", # Optional. The end offset of the video.
  "startOffset": "A String", # Optional. The start offset of the video.
@@ -1518,6 +1536,9 @@ <h3>Method Details</h3>
  { # Tool details that the model may use to generate response. A `Tool` is a piece of code that enables the system to interact with external systems to perform an action, or set of actions, outside of knowledge and scope of the model. A Tool object should contain exactly one type of Tool (e.g FunctionDeclaration, Retrieval or GoogleSearchRetrieval).
  "codeExecution": { # Tool that executes code generated by the model, and automatically returns the result to the model. See also [ExecutableCode]and [CodeExecutionResult] which are input and output to this tool. # Optional. CodeExecution tool type. Enables the model to execute code as part of generation.
  },
+ "computerUse": { # Tool to support computer use. # Optional. Tool to support the model interacting directly with the computer. If enabled, it automatically populates computer-use specific Function Declarations.
+ "environment": "A String", # Required. The environment being operated.
+ },
  "enterpriseWebSearch": { # Tool to search public web data, powered by Vertex AI Search and Sec4 compliance. # Optional. Tool to support searching public web data, powered by Vertex AI Search and Sec4 compliance.
  },
  "functionDeclarations": [ # Optional. Function tool type. One or more function declarations to be passed to the model along with the current user query. Model may decide to call a subset of these functions by populating FunctionCall in the response. User should provide a FunctionResponse for each function call in the next turn. Based on the function responses, Model will generate the final response back to the user. Maximum 128 function declarations can be provided.
@@ -1697,6 +1718,8 @@ <h3>Method Details</h3>
  "vectorDistanceThreshold": 3.14, # Optional. Only return results with vector distance smaller than the threshold.
  },
  },
+ "urlContext": { # Tool to support URL context. # Optional. Tool to support URL context retrieval.
+ },
  },
  ],
  }
@@ -1763,7 +1786,8 @@ <h3>Method Details</h3>
  "mimeType": "A String", # Required. The IANA standard MIME type of the source data.
  },
  "text": "A String", # Optional. Text part (can be code).
- "thought": True or False, # Output only. Indicates if the part is thought from the model.
+ "thought": True or False, # Optional. Indicates if the part is thought from the model.
+ "thoughtSignature": "A String", # Optional. An opaque signature for the thought so it can be reused in subsequent requests.
  "videoMetadata": { # Metadata describes the input video content. # Optional. Video metadata. The metadata should only be specified while the video data is presented in inline_data or file_data.
  "endOffset": "A String", # Optional. The end offset of the video.
  "startOffset": "A String", # Optional. The start offset of the video.
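Taken together, the `thought` and `thoughtSignature` changes above suggest that thought parts returned by the model can be carried into later requests. Below is a hedged sketch of collecting those fields from a `generateContent` response shaped like the schema documented in this file; whether the service accepts them echoed back this way is an assumption, not something this diff states.

```python
# Hypothetical helper: keep the model's thought-related fields so they can be
# sent back on the next turn. The response layout (candidates -> content ->
# parts) follows the schema in this file; round-tripping the parts is assumed.
def thought_parts_for_next_turn(response: dict) -> list[dict]:
    parts = []
    candidates = response.get("candidates", [])
    content = candidates[0].get("content", {}) if candidates else {}
    for part in content.get("parts", []):
        kept = {k: part[k] for k in ("text", "thought", "thoughtSignature") if k in part}
        if kept:
            parts.append(kept)
    return parts

# Usage sketch: append the preserved parts as a model turn before the user's
# next message in `contents`.
# contents.append({"role": "model", "parts": thought_parts_for_next_turn(response)})
```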
