ModelManager.Run(InferenceInput, IExecutor, IOutcomeReceiver) Method
Definition
Important
Some information relates to prerelease product that may be substantially modified before it’s released. Microsoft makes no warranties, express or implied, with respect to the information provided here.
Runs a single model inference.
[Android.Runtime.Register("run", "(Landroid/adservices/ondevicepersonalization/InferenceInput;Ljava/util/concurrent/Executor;Landroid/os/OutcomeReceiver;)V", "GetRun_Landroid_adservices_ondevicepersonalization_InferenceInput_Ljava_util_concurrent_Executor_Landroid_os_OutcomeReceiver_Handler", ApiSince=35)]
public virtual void Run (Android.AdServices.OnDevicePersonalization.InferenceInput input, Java.Util.Concurrent.IExecutor executor, Android.OS.IOutcomeReceiver receiver);
[<Android.Runtime.Register("run", "(Landroid/adservices/ondevicepersonalization/InferenceInput;Ljava/util/concurrent/Executor;Landroid/os/OutcomeReceiver;)V", "GetRun_Landroid_adservices_ondevicepersonalization_InferenceInput_Ljava_util_concurrent_Executor_Landroid_os_OutcomeReceiver_Handler", ApiSince=35)>]
abstract member Run : Android.AdServices.OnDevicePersonalization.InferenceInput * Java.Util.Concurrent.IExecutor * Android.OS.IOutcomeReceiver -> unit
override this.Run : Android.AdServices.OnDevicePersonalization.InferenceInput * Java.Util.Concurrent.IExecutor * Android.OS.IOutcomeReceiver -> unit
Parameters
- input
- InferenceInput
contains all the information needed for a run of model inference.
- executor
- IExecutor
the Executor on which to invoke the callback.
- receiver
- IOutcomeReceiver
this returns an InferenceOutput containing the model inference result, or an Exception on failure.
Attributes
RegisterAttribute
Remarks
Runs a single model inference. Currently only TFLite model inference is supported.
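A minimal C# sketch of invoking this method is shown below. It assumes a ModelManager instance and a prepared InferenceInput are obtained elsewhere (for example, inside an on-device personalization service callback); the receiver class name is hypothetical, and the IOutcomeReceiver member names follow the binding's usual naming conventions and may differ slightly in your version.

```csharp
using Android.AdServices.OnDevicePersonalization;
using Android.OS;
using Java.Util.Concurrent;

// Hypothetical receiver for the inference outcome. On success the
// result object is an InferenceOutput; on failure OnError is invoked
// with the underlying exception.
class InferenceReceiver : Java.Lang.Object, IOutcomeReceiver
{
    public void OnResult(Java.Lang.Object? result)
    {
        // Cast the generic result back to the expected output type.
        var output = result as InferenceOutput;
        // ... consume the model inference result here ...
    }

    public void OnError(Java.Lang.Throwable error)
    {
        // Inference failed; log or surface the exception as needed.
    }
}

// Usage (modelManager and input are assumed to exist):
// modelManager.Run(input,
//                  Executors.NewSingleThreadExecutor(),
//                  new InferenceReceiver());
```

The executor determines which thread the receiver's callbacks run on; a single-thread executor is used here purely for illustration.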
Portions of this page are modifications based on work created and shared by the Android Open Source Project and used according to terms described in the Creative Commons 2.5 Attribution License.