PROVIDING COMMAND BUNDLE SUGGESTIONS FOR AN AUTOMATED ASSISTANT

DIVISIONAL PCT NATIONAL PHASE APPLICATION

Status: Published

Filed on 7 November 2024

Abstract

Generating and/or recommending command bundles for a user of an automated assistant. A command bundle comprises a plurality of discrete actions that can be performed by an automated assistant. One or more of the actions of a command bundle can cause transmission of a corresponding command and/or other data to one or more devices and/or agents that are distinct from devices and/or agents to which data is transmitted based on other action(s) of the bundle. Implementations determine command bundles that are likely relevant to a user, and present those command bundles as suggestions to the user. In some of those implementations, a machine learning model is utilized to generate a user action embedding for the user, and a command bundle embedding for each of a plurality of command bundles. Command bundle(s) can be selected for suggestion based on comparison of the user action embedding and the command bundle embeddings. Figure 5 is the representative figure.

Patent Information

Application ID: 202428085315
Invention Field: COMPUTER SCIENCE
Date of Application: 07/11/2024
Publication Number: 49/2024

Inventors

Name: NI, Yuzhao
Address: 1600 Amphitheatre Parkway, Mountain View, California 94043, United States of America
Country: China
Nationality: China

Applicants

Name: GOOGLE LLC
Address: 1600 Amphitheatre Parkway, Mountain View, California 94043, United States of America
Country: U.S.A.
Nationality: U.S.A.

Specification

EXTRACTED FROM WIPO

PROVIDING COMMAND BUNDLE SUGGESTIONS FOR AN AUTOMATED ASSISTANT

Background

[0001] An automated assistant (also known as a "personal assistant", "mobile assistant", etc.) may be interacted with by a user via a variety of client devices, such as smart phones, tablet computers, wearable devices, automobile systems, standalone personal assistant devices, and so forth. An automated assistant receives input from the user (e.g., typed and/or spoken natural language input) and responds with responsive content (e.g., visual and/or audible natural language output) and/or by controlling one or more peripheral devices (e.g., Internet of Things (IoT) device(s)). An automated assistant interacted with via a client device may be implemented via the client device itself and/or via one or more remote computing devices that are in network communication with the client device (e.g., computing device(s) in "the cloud").

[0002] Automated assistants are often configured to perform a variety of actions, with each action being performed in response to a predetermined canonical command (or a slight variation thereof) that is mapped to the action. For example, in response to receiving a spoken command of "Assistant, turn off my living room lights", an automated assistant can cause one or more commands to be transmitted that cause networked lights of the user, that are labeled as "living room" lights, to be transitioned to an "off" state. As another example, in response to receiving a separate spoken command of "Assistant, what is tomorrow's weather", an automated assistant can issue one or more queries and/or interact with a third-party agent to resolve a prediction for "tomorrow's weather" for a location of the user issuing the spoken command, and provide graphical and/or audible output that relays tomorrow's predicted weather.

[0003] However, a user that utilizes an automated assistant may not be aware of many of the actions that are performable by an automated assistant and/or may not be aware of the canonical commands that can be provided by the user to cause the actions to be performed by the automated assistant. As a result, many users may employ only a limited amount of the functionality of an automated assistant. Although a general recommendation for a canonical command and an associated action can be provided to a user that is interacting with an automated assistant (e.g., "Try saying X to get a weather report for tomorrow"), oftentimes such a general recommendation is blindly provided to the user. As a result, significant network and/or computational resources can be wasted in providing users with recommendations that are irrelevant. Moreover, oftentimes such a general recommendation is for only a single action. To perform multiple actions, multiple disparate canonical commands must be provided by a user through a plurality of dialog turns with an automated assistant, thereby consuming significant network and/or computational resources in the performance of multiple actions.

Summary

[0004] This specification is directed to methods, apparatus, and computer-readable media (transitory and non-transitory) for generating and/or recommending command bundles for a user of an automated assistant application. A command bundle comprises a plurality of discrete actions that can be performed by an automated assistant application. For example, a "good night" bundle can cause the automated assistant application to perform: a first action of transmitting a command to turn off one or more networked lights; a second action of transmitting a command to set an alarm, of a computing device of a user, to sound at 8:00 AM; a third action of transmitting a command that requests "tomorrow's" local weather, and audibly presenting responsive content; etc.
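By way of illustration only, such a bundle can be modeled as a plain data structure: a named collection of discrete actions, each with a target device or agent and zero or more slots, where a None slot value marks a slot that lacks a fixed value. The following Python sketch uses hypothetical names and is not taken from the disclosure:

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class Action:
    """One discrete action of a command bundle."""
    name: str                    # e.g. "lights.off"
    target: str                  # device or agent the command is transmitted to
    slots: dict[str, Optional[str]] = field(default_factory=dict)  # None = unfixed slot

@dataclass
class CommandBundle:
    """A named collection of discrete actions, invocable as a unit."""
    invocation_phrase: str
    actions: list[Action]

# The "good night" bundle described above, expressed in this hypothetical model.
good_night = CommandBundle(
    invocation_phrase="good night",
    actions=[
        Action("lights.off", target="networked_lights"),
        Action("alarm.set", target="phone", slots={"alarm_time": "8:00 AM"}),
        Action("weather.forecast", target="weather_agent", slots={"day": "tomorrow"}),
    ],
)
```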

[0005] As appreciated from the preceding example, one or more of the actions of a command bundle can cause transmission of a corresponding command and/or other data to one or more devices and/or agents that are distinct from devices and/or agents to which data is transmitted based on other action(s) of the bundle. For instance, in the preceding example, for the first action a command can be transmitted to networked light(s) (and/or a hardware bridge in communication with the networked light(s)), whereas in the third action a separate command can be transmitted to a remote computing device that hosts a "weather" agent. Command bundles can be activated in response to various cues, such as speaking of one of one or more invocation phrases for the command bundle (e.g., "good night" for the "good night" bundle), actuating a graphical user interface element for the command bundle (e.g., a "shortcut" icon for the command bundle), and/or the occurrence of one or more contextual conditions (e.g., for the "good night" bundle, the occurrence of it being 10:00 PM).

[0006] Command bundles can be generated from scratch and/or based on historical data collected during interactions between one or more users and an automated assistant(s). For example, a command bundle can be generated from scratch by a programmer specifying actions for the command bundle and, for each of one or more of the actions, optionally specifying one or more fixed slot values for one or more slots of the action. As another example, a command bundle can be generated automatically by collecting command phrases that are issued, by each of a plurality of corresponding users, within a short time frame of one another - and generating a corresponding command bundle that, when activated, causes actions associated with the collected command phrases to be performed. For instance, a command bundle with first, second, and third actions can be automatically generated based on at least a threshold quantity of users each causing the first, second, and third actions to be performed within one minute of one another through interaction with their automated assistants.
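A minimal sketch of that automatic-generation idea, assuming logs keyed by user with time-sorted (timestamp, action) pairs: group each user's commands into short windows and keep action sets performed together by at least a threshold number of users. The function name and data shapes are illustrative, not the disclosed implementation:

```python
from collections import Counter

def mine_bundles(logs, window_secs=60, min_users=100):
    """Return action sets that at least `min_users` users performed together
    within `window_secs` of one another."""
    set_counts = Counter()
    for events in logs.values():           # events: time-sorted (timestamp, action)
        contributed = set()                # count each action set once per user
        for i, (t0, _) in enumerate(events):
            group = frozenset(a for t, a in events[i:] if t - t0 <= window_secs)
            if len(group) >= 2:
                contributed.add(group)
        set_counts.update(contributed)
    return [actions for actions, n in set_counts.items() if n >= min_users]
```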

[0007] Implementations disclosed herein relate to determining command bundles that are likely relevant to a given user, and presenting those command bundles as suggestions/recommendations to the given user. In some of those implementations, a machine learning model can be trained that receives, as input, indications of one or more automated assistant "actions" (and optionally slot value(s) and/or other parameter(s) for the action(s)) and provides, as output, an embedding that provides, in a multi-dimensional space, a semantic representation of those "actions" (and optionally slot value(s) and/or other parameter(s)).

[0008] Actions performed by the given user via an automated assistant application (and optionally slot value(s) and/or other parameter(s) for the action(s)) can be processed using the machine learning model to generate a "user action embedding". Further, for each of the command bundles, actions of the command bundle (and optionally slot value(s) and/or other parameter(s) for the action(s)) can be processed using the machine learning model (or another machine learning model) to generate a "command bundle embedding". Command bundle(s) having "command bundle embeddings" that are most similar to the "user action embedding" can then be provided to the given user as recommendation(s). For example, if the given user only uses the automated assistant application for "music" and "lighting control" actions, the "user action embedding" can represent those actions. Command bundles having corresponding "music" and "lighting control" actions will have command bundle embeddings that are more similar to the "user action embedding" than, for example, command bundles that lack music or lighting control actions. In these and other manners, command bundles that are graphically and/or audibly recommended to the given user can first be determined to likely be relevant to the given user, through comparison of features of the command bundles to features of past interactions of the given user with the automated assistant application.
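A rough sketch of that comparison, building on the hypothetical CommandBundle structure above, with embed_actions standing in for the trained model: cosine similarity is used here for illustration, while paragraph [0022] below describes a Euclidean-distance variant.

```python
import numpy as np

def recommend(user_actions, bundles, embed_actions, top_k=3):
    """Rank bundles by similarity of their embedding to the user action embedding.

    `embed_actions` maps a list of action identifiers to a vector in the
    shared embedding space (a stand-in for the trained model).
    """
    u = embed_actions(user_actions)                          # user action embedding
    def score(bundle):
        v = embed_actions([a.name for a in bundle.actions])  # command bundle embedding
        return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))
    return sorted(bundles, key=score, reverse=True)[:top_k]
```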

[0009] Further, in some implementations, required peripheral devices and/or other parameter(s) of one or more action(s) of a command bundle may be considered in determining whether to recommend the command bundle to a user. For example, some command bundles can be filtered out (e.g., before the similarity determinations) based on determining the given user lacks required peripheral device(s) for the command bundle. For instance, a given command bundle that requires a networked light for performance of an action can be filtered out from consideration as a recommendation to a given user, based on determining that no networked lights have been associated with the given user for the automated assistant application. Also, for example, indications of peripheral devices of the user and/or peripheral devices of the command bundles can additionally be applied as input in generating the user embedding and/or the command bundle embeddings as described above and elsewhere herein.
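A sketch of that pre-filtering step, assuming the hypothetical Action.target field above and a set of device types the user has associated with the assistant; treating any target that ends in "_agent" as a non-peripheral agent is an invented simplification:

```python
def filter_by_peripherals(bundles, user_devices):
    """Drop bundles that require a peripheral the user has not associated."""
    def supported(bundle):
        return all(
            action.target in user_devices or action.target.endswith("_agent")
            for action in bundle.actions
        )
    return [bundle for bundle in bundles if supported(bundle)]

# e.g. filter_by_peripherals(bundles, user_devices={"phone"}) would exclude the
# "good night" bundle above, since one of its actions targets "networked_lights".
```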

[0010] In some implementations, at least one action of a command bundle recommended to a given user can include at least one slot that lacks a fixed value (i.e., a slot with an explicit "undefined" or "variable" value, or a slot that lacks definition of any value for the slot). In some of those implementations, when the command bundle is selected by the given user, or subsequently initially invoked by the given user, the automated assistant application can prompt the given user to enable resolution of a value for the slot that lacks a fixed value. In some versions of those implementations, the resolved value for the slot can be stored in association with the given user and thereafter automatically utilized in response to further invocations of the command bundle. For example, a command bundle can include an action of transmitting a command to set an alarm of a computing device of a user. However, an "alarm time" slot of the alarm action may not be fixed. In some of those implementations, the automated assistant application can provide, for presentation to the user, a prompt of "What time would you like the alarm set for", and receive responsive user interface input of "8:00 AM" in response to the prompt. The automated assistant application can store "8:00 AM" as the resolved value for the "alarm time" slot for the command bundle for the user, optionally after confirming that the user would like it set as a default. Thereafter, the automated assistant application can automatically utilize "8:00 AM" as the slot value for the "alarm time" slot when that command bundle is invoked by the given user.
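That resolve-once-then-reuse flow might look as follows, with prompt_user and perform standing in for the assistant's dialog turn and action dispatch, and slot_store for per-user persisted values (all illustrative names):

```python
def invoke_bundle(bundle, user_id, slot_store, prompt_user, perform):
    """Resolve slots that lack a fixed value on first invocation; reuse after."""
    for action in bundle.actions:
        resolved = dict(action.slots)
        for slot, value in action.slots.items():
            if value is None:                       # slot lacks a fixed value
                key = (user_id, bundle.invocation_phrase, slot)
                if key not in slot_store:
                    # e.g. "What time would you like the alarm set for?"
                    slot_store[key] = prompt_user(f"What value for '{slot}'?")
                resolved[slot] = slot_store[key]    # stored value reused later
        perform(action, resolved)                   # dispatch with resolved slots
```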

[0011] Particular implementations of the subject matter described in this specification can be implemented so as to realize one or more of the following advantages. Command bundle recommendation methods, and/or model(s) utilized in command bundle recommendations, can provide for improved data efficiency compared to other methods. For example, such recommendation methods and/or models can increase the likelihood that a command bundle recommendation provided to a given user is relevant to the given user and resultantly increase the likelihood that the given user will subsequently utilize the command bundle. This can mitigate the consumption of network resources and/or computational resources in provision of irrelevant command bundles.

[0012] Moreover, as described herein, in many implementations a recommended command bundle can, when subsequently invoked, cause a plurality of discrete actions to be performed. Such subsequent invocation can be achieved with subsequent user interface input that consumes less network and/or computational resources than if each of the plurality of the discrete actions was invoked individually. For example, the "good night" bundle described above can be invoked through a spoken command provided to the automated assistant, such as "Assistant, good night". Invocation of the "good night" bundle can cause the automated assistant application to perform: a first action of transmitting a command to turn off one or more networked lights; a second action of transmitting a command to set an alarm, of a computing device of a user, to sound at 8:00 AM; and a third action of transmitting a command that requests "tomorrow's" local weather, and audibly presenting responsive content. In the absence of the command bundle, a user would need to issue at least three separate commands, which would require a greater consumption of network and/or computational resources to process than the single invocation command of the command bundle. For example, absent the command bundle, the user may have to issue three separate commands of: "Assistant, turn off the lights"; "Assistant, set the alarm on my mobile phone to 8:00 AM"; and "Assistant, what is tomorrow's weather".

[0013] Additionally, and as also described herein, in many implementations a recommended command bundle can include at least one slot that lacks a fixed value, and a value for that slot can be resolved for a given user through interaction (e.g., automated assistant-to-user dialog) with the given user. Thereafter, when the given user causes the command bundle to be invoked, the resolved value can be utilized, optionally without prompting the user for any confirmation. In these and other manners, subsequent invocation of the command bundle by the user can be made more efficient through obviating of one or more prompts that would otherwise need to be provided to resolve a value for the slot.

[0014] The summary above is provided as an overview of some features of various implementations disclosed herein. Additional description is provided below of those implementations, and of various additional features and various additional implementations.

[0015] In some implementations, a method performed by one or more processors is provided and includes identifying assistant interaction data for a user, and processing at least part of the assistant interaction data using a trained machine learning model to generate a user action embedding. The assistant interaction data indicates a plurality of historical actions performed for the user by an automated assistant application. Each of the historical actions is performed in response to corresponding user interface input provided by the user via one or more automated assistant interfaces that interact with the automated assistant application. The method further includes identifying a plurality of command bundles that each include command bundle data that identifies a plurality of corresponding discrete actions that can be performed by the automated assistant application. The method further includes, for each of the command bundles: processing at least part of the command bundle data using the trained machine learning model, or an additional trained machine learning model, to generate a command bundle embedding, and generating a similarity score for the command bundle. Generating the similarity score for each of the command bundles includes comparing the user action embedding to the command bundle embedding for the command bundle. The method further includes selecting a given command bundle, of the plurality of command bundles, based on the similarity score for the given command bundle. The method further includes, in response to selecting the given command bundle, causing information related to the given command bundle to be presented to the user via a computing device of the user. Invocation of the given command bundle, for the user in response to user interface input, causes the automated assistant application to perform the corresponding discrete actions of the given command bundle.

[0016] These and other implementations of technology disclosed herein may optionally include one or more of the following features.

[0017] In some implementations, the information related to the given command bundle that is presented to the user includes an invocation phrase for the given command bundle. In some of those implementations, the method further includes: receiving, subsequent to causing the information related to the given command bundle to be presented, natural language user interface input provided by the user via one of the assistant interfaces; determining the natural language user interface input conforms to the invocation phrase; and in response to determining the natural language user interface input conforms to the invocation phrase: performing, by the automated assistant application, the corresponding discrete actions of the given command bundle.

[0018] In some implementations, the corresponding discrete actions of the given command bundle include a first discrete action and a second discrete action. In some of those implementations, the first discrete action causes the automated assistant application to transmit a first command to a first electronic device and the second discrete action causes the automated assistant application to transmit a second command to a second electronic device. In some of those implementations, the first discrete action causes the automated assistant application to transmit a first command to a first agent and the second discrete action causes the automated assistant application to transmit a second command to a second agent.

[0019] In some implementations, the method further includes ranking the command bundles based on the similarity scores. In some of those implementations, selecting the given command bundle is based on the ranking of the given command bundle relative to the other of the command bundles. In some versions of those implementations, causing the information related to the given command bundle to be presented to the user via a computing device of the user includes causing the information to be presented with a display prominence that is based on the ranking of the given command bundle.

[0020] In some implementations, identifying the plurality of command bundles includes: selecting, from a corpus of available command bundles, the plurality of command bundles based on conformance between the selected plurality of command bundles and user specific data of the user. In some of those implementations, selecting the plurality of command bundles based on conformance between the selected plurality of command bundles and user specific data of the user includes excluding a given available command bundle, of the available command bundles, from the selected plurality of command bundles based on: identifying a required peripheral device for the given available command bundle; and determining, based on the user specific data, that the automated assistant application lacks access, for the user, to the required peripheral device.

[0021] In some implementations, the processing of the at least part of the command bundle data is performed using the trained machine learning model.

[0022] In some implementations, generating the similarity score for each of the command bundles is based on a Euclidean distance between the user action embedding and the command bundle embedding for the command bundle.
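For instance, the distance can be folded into a similarity score in which smaller distances yield higher scores; the 1/(1 + d) mapping below is one illustrative choice, not specified by the text:

```python
import numpy as np

def euclidean_similarity(user_emb, bundle_emb):
    """Similarity score derived from Euclidean distance between embeddings."""
    d = float(np.linalg.norm(np.asarray(user_emb) - np.asarray(bundle_emb)))
    return 1.0 / (1.0 + d)   # distance 0 -> score 1; larger distance -> lower score
```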

[0023] In some implementations, the given command bundle includes at least one slot, with an undefined value, for at least one action of the corresponding actions. In some of those implementations, the method further includes: receiving a selection of the given command bundle in response to causing the information related to the given command bundle to be presented; in response to receiving the selection, engaging in a dialog with the user, via the automated assistant application, to resolve a value for the slot; and storing the value in association with the slot, for the given command bundle and for the user. In some versions of those implementations, the method further includes, subsequent to storing the value in association with the slot, for the given command bundle and for the user: receiving natural language user interface input provided by the user via one of the assistant interfaces; determining the natural language user interface input conforms to an invocation phrase for the command bundle; and in response to determining the natural language user interface input conforms to the invocation phrase, and based on the value being stored: performing, by the automated assistant application, the corresponding discrete actions of the given command bundle, including performing the at least one action using the value for the slot.

[0024] In some implementations, a method performed by one or more processors is provided and includes identifying a corpus of command bundles and identifying peripheral device data for a user. Each of the identified command bundles includes command bundle data that identifies a plurality of corresponding discrete actions that can be performed by an automated assistant application. The identified peripheral device data indicates peripheral devices of the user that are paired with an automated assistant application. The method further includes selecting, from the corpus of command bundles, a plurality of candidate command bundles for the user. Selecting the plurality of candidate command bundles is based on comparison of the peripheral device data to the command bundle data of the command bundles. The method further includes ranking the candidate command bundles based on the command bundle data and assistant interaction data, and causing information related to one or more of the candidate command bundles to be presented based on the ranking. The information is presented to the user via a computing device of the user.

[0025] These and other implementations of technology disclosed herein may optionally include one or more of the following features.

[0026] In some implementations, the information related to a given command bundle, of the one or more command bundles, includes an invocation phrase for the given command bundle. In some of those implementations, the method further includes: receiving, subsequent to causing the information related to the one or more command bundles to be presented, natural language user interface input provided by the user via an assistant interface associated with the automated assistant application; determining the natural language user interface input conforms to the invocation phrase; and in response to determining the natural language user interface input conforms to the invocation phrase: performing, by the automated assistant application, the corresponding discrete actions of the given command bundle. In some versions of those implementations, the corresponding discrete actions of the given command bundle include a first discrete action that causes the automated assistant application to transmit a first command to a first electronic device, and a second discrete action that causes the automated assistant application to transmit a second command to a second electronic device.

[0027] In some implementations, a method performed by one or more processors is provided and includes identifying a corpus of command bundles and identifying data for a user. Each of the command bundles of the corpus includes command bundle data that identifies a plurality of corresponding discrete actions that can be performed by an automated assistant application. The method further includes selecting, from the corpus of command bundles, a plurality of candidate command bundles for the user. Selecting the plurality of candidate command bundles is based on comparison of the data of the user to the command bundle data of the command bundles. The method further includes identifying assistant interaction data for the user that indicates a plurality of historical actions performed for the user by an automated assistant application. The method further includes: processing at least part of the assistant interaction data using a trained machine learning model to generate a user action embedding; and selecting, from the plurality of candidate command bundles, a given command bundle based on comparison of the user action embedding to a command bundle embedding for the given command bundle. The method further includes, in response to selecting the given command bundle, causing information related to the given command bundle to be presented to the user via a computing device of the user. Invocation of the given command bundle, for the user in response to user interface input, causes the automated assistant application to perform the corresponding discrete actions of the given command bundle.

[0028] These and other implementations of technology disclosed herein may optionally include one or more of the following features.

[0029] In some implementations, the information related to the given command bundle that is presented to the user includes an invocation phrase for the given command bundle. In some of those implementations, the method further includes: receiving, subsequent to causing the information related to the given command bundle to be presented, natural language user interface input provided by the user via an assistant interface; determining the natural language user interface input conforms to the invocation phrase; and in response to determining the natural language user interface input conforms to the invocation phrase: performing, by the automated assistant application, the corresponding discrete actions of the given command bundle.

[0030] In some implementations, the given command bundle includes at least one slot, with an undefined value, for at least one action of the corresponding actions. In some of those implementations, the method further includes: receiving an invocation of the given command bundle subsequent to causing the information related to the given command bundle to be presented; in response to receiving the invocation, engaging in a dialog with the user, via the automated assistant application, to resolve a value for the slot; and storing the value in association with the slot, for the given command bundle and for the user. In some versions of those implementations, the method further includes, subsequent to storing the value in association with the slot, for the given command bundle and for the user: receiving natural language user interface input provided by the user via an assistant interface; determining the natural language user interface input conforms to an invocation phrase for the command bundle; and in response to determining the natural language user interface input conforms to the invocation phrase, and based on the value being stored: performing, by the automated assistant application, the corresponding discrete actions of the given command bundle, including performing the at least one action using the value for the slot.

[0031] In addition, some implementations include one or more processors of one or more computing devices, where the one or more processors are operable to execute instructions stored in associated memory, and where the instructions are configured to cause performance of one or more methods described herein. The processors may include one or more graphics processing units (GPUs), central processing units (CPUs), and/or tensor processing units (TPUs). Some implementations include one or more non-transitory computer readable storage media storing computer instructions executable by one or more processors to perform one or more methods described herein.

[0032] It should be appreciated that all combinations of the foregoing concepts and additional concepts described in greater detail herein are contemplated as being part of the subject matter disclosed herein. For example, all combinations of claimed subject matter appearing at the end of this disclosure are contemplated as being part of the subject matter disclosed herein.

Brief Description of the Drawings

[0033] FIG. 1 is a block diagram of an example environment in which implementations disclosed herein may be implemented.

[0034] FIGS. 2A and 2B illustrate an example of how various components of FIG. 1 may interact in some implementations.

[0035] FIGS. 3A and 3B each illustrate an example of an actions model, and how assistant interaction data of a user can be processed using the actions model to generate a user action embedding.

[0036] FIG. 4A illustrates an example computing device with a display screen graphically displaying recommendations for multiple command bundles.

[0037] FIG. 4B illustrates the example computing device of FIG. 4A, an example of dialog that may occur upon initial invocation of one of the command bundles of FIG. 4A, and an example of dialog that may occur upon subsequent invocation of that command bundle.

[0038] FIG. 5 is a flowchart illustrating an example method according to implementations disclosed herein.

[0039] FIG. 6 illustrates an example architecture of a computing device.

Detailed Description

[0040] FIG. 1 illustrates an example environment in which techniques disclosed herein may be implemented. The example environment includes a client device 106, an automated assistant 110 (also referred to herein as an automated assistant application), and a plurality of agents 140A-N. The client device 106 may be, for example, a standalone voice-activated speaker device, a desktop computing device, a laptop computing device, a tablet computing device, a mobile phone computing device, a computing device of a vehicle of the user, and/or a wearable apparatus of the user that includes a computing device (e.g., a watch of the user having a computing device, glasses of the user having a computing device, a virtual or augmented reality computing device). Additional and/or alternative client devices may be provided.

[0041] Although automated assistant 110 is illustrated in FIG. 1 as separate from the client device 106, in some implementations all or aspects of the automated assistant 110 may be implemented by the client device 106. For example, in some implementations, input processing engine 112 may be implemented by the client device 106. In implementations where one or more (e.g., all) aspects of automated assistant 110 are implemented by one or more computing devices remote from the client device 106, the client device 106 and those aspects of the automated assistant 110 communicate via one or more networks, such as a wide area network (WAN) (e.g., the Internet).

[0042] Although only one client device 106 is illustrated in combination with the automated assistant 110, in many implementations the automated assistant 110 may be remote and may interface with each of a plurality of client devices of the same user and/or with each of a plurality of client devices of multiple users. For example, the automated assistant 110 may manage communications with each of the multiple devices via different sessions and may manage multiple sessions in parallel. For instance, the automated assistant 110 in some implementations may be implemented as a cloud-based service employing a cloud infrastructure, e.g., using a server farm or cluster of high performance computers running software suitable for handling high volumes of requests from multiple users. However, for the sake of simplicity, many examples herein are described with respect to a single client device 106.

[0043] The automated assistant 110 communicates with each of a plurality of agents 140A-N via an API and/or via one or more communications channels (e.g., an internal communications channel and/or a network, such as a WAN). In some implementations, one or more of the agents 140A-N are each managed by a respective party that is separate from a party that manages the automated assistant 110. As used herein, an "agent" references one or more computing devices and/or software that are utilized by the automated assistant 110. In some situations, an agent can be separate from the automated assistant 110 and/or may communicate with the automated assistant 110 over one or more communication channels. In some of those situations, the automated assistant 110 may transmit, from a first network node, data (e.g., an agent command) to a second network node that implements all or aspects of the functionality of the agent. In some situations, an agent may be a third-party (3P) agent, in that it is managed by a party that is separate from a party that manages the automated assistant 110. In some other situations, an agent may be a first-party (1P) agent, in that it is managed by the same party that manages the automated assistant 110.

[0044] An agent is configured to receive (e.g., over a network and/or via an API) an invocation request and/or other agent commands from the automated assistant 110. In response to receiving an agent command, the agent generates responsive content based on the agent command, and transmits the responsive content for the provision of user interface output that is based on the responsive content and/or to control one or more peripheral devices. For example, the agent can transmit the responsive content to control one or more peripheral devices such as one or more IoT devices (e.g., smart lights, thermostats, appliances, cameras). As another example, the agent may transmit the responsive content to the automated assistant 110 for provision of output, by the automated assistant 110, that is based on the responsive content. As another example, the agent can itself provide the output. For instance, the user can interact with the automated assistant 110 via an assistant interface of the client device 106 (e.g., the automated assistant can be implemented on the client device 106 and/or in network communication with the client device 106), and the agent can be an application installed on the client device 106 or an application executable remote from the client device 106, but "streamable" on the client device 106. When the application is invoked, it can be executed by the client device 106 and/or brought to the forefront by the client device 106 (e.g., its content can take over a display of the client device 106).
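An illustrative shape for that agent interface (hypothetical names; only the path where responsive content is relayed as assistant output is sketched):

```python
from typing import Callable, Protocol

class Agent(Protocol):
    """An 'agent' as used above: accepts an agent command, returns content."""
    def handle(self, command: dict) -> dict: ...

def dispatch(agent: Agent, command: dict, output: Callable[[str], None]) -> None:
    """Transmit an agent command and relay the responsive content as output."""
    response = agent.handle(command)       # e.g. {"speech": "Sunny and 20 degrees"}
    output(response.get("speech", ""))     # audible/visual output via the assistant
```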




We Claim:
1. A method implemented by one or more processors [614], the method comprising:
identifying a corpus of command bundles, each of the command bundles of the corpus comprising command bundle data that identifies a plurality of corresponding discrete actions that can be performed by an automated assistant application;
identifying data of a user;
selecting, from the corpus of command bundles, a plurality of candidate command bundles for the user, wherein selecting the plurality of candidate command bundles is based on comparison of the data of the user to the command bundle data of the command bundles;
identifying assistant interaction data [173] for the user, the assistant interaction data [173] indicating a plurality of historical actions performed for the user by an automated assistant application, each of the historical actions performed in response to corresponding user interface input provided by the user via one or more automated assistant interfaces that interact with the automated assistant application;
processing at least part of the assistant interaction data [173] using a trained machine learning model to generate a user action embedding [174];
selecting, from the plurality of candidate command bundles, a given command bundle based on comparison of the user action embedding [174] to a command bundle embedding [175] for the given command bundle;
in response to selecting the given command bundle:
causing information related to the given command bundle to be presented to the user via a computing device of the user, wherein the information related to the given command bundle that is presented to the user includes an invocation phrase for the given command bundle for use by the user;
wherein invocation of the given command bundle, for the user in response to user interface input, causes the automated assistant application to perform the corresponding discrete actions of the given command bundle;
receiving, subsequent to causing the information related to the given command bundle to be presented, natural language user interface input provided by the user via one of the assistant interfaces;
determining the natural language user interface input conforms to the invocation phrase; and
in response to determining the natural language user interface input conforms to the invocation phrase:
performing, by the automated assistant application, the corresponding discrete actions of the given command bundle.
2. The method as claimed in claim 1, wherein the given command bundle comprises at least one slot, with an undefined value, for at least one action of the corresponding actions, and further comprising:
receiving an invocation of the given command bundle subsequent to causing the information related to the given command bundle to be presented;
in response to receiving the invocation, engaging in a dialog with the user, via the automated assistant application, to resolve a value for the slot; and
storing the value in association with the slot, for the given command bundle and for the user.
3. A method implemented by one or more processors, the method comprising:
generating a score for each command bundle of a plurality of command bundles;
causing, based on the scores, a recommendation of one or more command bundles, of the plurality of command bundles, to be displayed at a computing device of a particular user,
wherein causing the recommendation of one or more of the command bundles to be displayed is independent of receiving any query that indicates search criteria for command bundle recommendations, and
wherein the one or more command bundles comprise command bundle data that identifies a plurality of discrete actions that can be performed by an automated assistant, the discrete actions including a given action that includes a slot that lacks any fixed value;
receiving a selection of a given command bundle, of the one or more command bundles, at the computing device and from the particular user;
in response to receiving the selection:
engaging in interaction with the particular user and via the computing device of the particular user, to resolve a particular value for the slot that lacks any fixed value; and
assigning the given command bundle to the particular user, including storing the particular value in association with the at least one slot, for the given command bundle and for the particular user;
subsequent to assigning the given command bundle to the particular user:
determining to execute the given command bundle for the particular user; and
in response to determining to execute the given command bundle for the particular user:
performing the corresponding discrete actions of the given command bundle, including performing the given action using the particular value, for the slot, that was resolved in the interaction.
4. The method as claimed in claim 3, wherein generating the score for the given command bundle is based on a popularity measure of the given command bundle.

5. The method as claimed in claim 3, wherein generating the score for the given command bundle is based on a user-assigned ratings measure of the given command bundle.

6. The method as claimed in claim 3, wherein the one or more command bundles include a plurality of user-created command bundles that are each created by a corresponding user.

7. The method as claimed in claim 3, wherein causing, based on the scores, the recommendation of the one or more command bundles to be displayed at the computing device of the particular user comprises:
determining that the scores of the one or more command bundles satisfy a threshold; and
including the one or more command bundles in the recommendation in response to determining that the scores satisfy the threshold.

8. The method as claimed in claim 3, wherein causing the recommendation of the one or more command bundles to be displayed at the computing device of the particular user comprises:
causing a first command bundle of the one or more command bundles to be displayed more prominently than a second command bundle of the one or more command bundles.

9. The method as claimed in claim 8, wherein causing the first command bundle to be displayed more prominently than the second command bundle comprises causing the first command bundle to be displayed above the second command bundle.

10. The method as claimed in claim 3, wherein generating the score for the given command bundle comprises:
generating a command bundle embedding for the given command bundle;
generating a user action embedding, wherein the user action embedding is based on historical interaction data between the user and the automated assistant; and
generating the score for the given command bundle based on a comparison of the command bundle embedding and the user action embedding.

Documents

Name / Date
Abstract1.jpg - 04/12/2024
202428085315-Proof of Right [03-12-2024(online)].pdf - 03/12/2024
202428085315-FORM-26 [19-11-2024(online)].pdf - 19/11/2024
202428085315-COMPLETE SPECIFICATION [07-11-2024(online)].pdf - 07/11/2024
202428085315-DECLARATION OF INVENTORSHIP (FORM 5) [07-11-2024(online)].pdf - 07/11/2024
202428085315-DRAWINGS [07-11-2024(online)].pdf - 07/11/2024
202428085315-FIGURE OF ABSTRACT [07-11-2024(online)].pdf - 07/11/2024
202428085315-FORM 1 [07-11-2024(online)].pdf - 07/11/2024
202428085315-FORM 18 [07-11-2024(online)].pdf - 07/11/2024
202428085315-REQUEST FOR EXAMINATION (FORM-18) [07-11-2024(online)].pdf - 07/11/2024
