Backend Platforms - Console
Overview
OneCompute supports running the same workload both in the cloud under Azure Batch and locally. This is enabled by the concepts of WorkerHosts & Workers.
Please read the Azure Batch documentation for an overview of how to construct the various components needed to run a workload with OneCompute.
This documentation now describes how to modify your code to support running locally. This has a number of benefits; you can:
- Validate your worker code before incurring the cost of cloud compute resources.
- Debug worker code that normally runs remotely on an Azure Batch node.
- Run jobs locally when you don't require the scale of the cloud.
- Run jobs entirely disconnected from the internet (using SQLite).
For the most part, running under the Console WorkerHost is simpler than using Azure Batch, so we describe here only where the process differs.
Under the Console WorkerHost you do not necessarily need:
- An Azure Batch account (although you will if you want to support both cloud and local runs)
- An Azure Storage account - you can instead use SQLite
- To perform the task "Worker Project: Deploy Worker packages to Azure Batch"
- To perform the task "Client Application: Specify the Application Packages that the job will utilize"
- To perform the task "Client Application: Configure Azure Batch Service Credentials"
Storage Services
One major difference between running under Azure Batch and running locally under the Console WorkerHost is that an Azure Storage Account is not required for local runs.
The Azure Storage Service is still useful if you want to debug your workers locally with as close a representation of the Azure Batch scenario as possible.
But if your aim is simply to execute workloads locally as cheaply as possible, or to run in a disconnected scenario, you are better off using a local storage solution. OneCompute supports SQLite as the local storage provider; please see the storage services documentation for details.
Worker Project: Configure
Please follow the instructions in the Azure Batch section. The only difference is that you remove the reference to DNV.One.Compute.WorkerHost.AzureBatch and instead add one to DNV.One.Compute.WorkerHost.Console. This reference is not needed for compilation, because the worker project is host agnostic, but it delivers the DNV.One.Compute.WorkerHost.Console.exe executable that will execute your worker DLL.
We hope to provide tooling in a future release so that this step is no longer necessary.
Client Application: Create a OneCompute Scheduler for interaction with the Console Host
OneCompute has a scheduler class, ConsoleWorkItemScheduler, that schedules units of work on a OneCompute Console worker host service. You can instantiate this class directly if you only have a single worker type. If you have multiple workers, however, you should create your own subclass of ConsoleWorkItemScheduler so that you can override the GetWorkerHostPath method. This lets you decide at runtime, for each WorkUnit, which worker to use. It is also very useful when you have a parallel set of tasks followed by a concluding reduction task, as the sketch below illustrates.
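As an illustration, the override inside such a subclass might look like the following sketch. Note that the exact signature of GetWorkerHostPath is not reproduced in this section, so the parameter list shown here is an assumption based on the per-WorkUnit description above, and IsReductionWorkUnit is a hypothetical helper you would implement from your own job model, not part of the OneCompute API.

/// <summary>
/// A sketch only: routes the concluding reduction work unit to a different
/// worker host executable than the parallel work units. The method signature
/// is assumed; consult the ConsoleWorkItemScheduler reference for the real one.
/// </summary>
protected override string GetWorkerHostPath(WorkUnit workUnit)
{
    // Hypothetical routing rule for illustration only.
    return IsReductionWorkUnit(workUnit)
        ? @"Workers\Reducer\DNV.One.Compute.WorkerHost.Console.exe"
        : @"Workers\Parallel\DNV.One.Compute.WorkerHost.Console.exe";
}

// Placeholder predicate; replace with your own convention for identifying
// the reduction work unit in your job model.
private static bool IsReductionWorkUnit(WorkUnit workUnit) => false;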
The ConsoleWorkItemScheduler constructor takes a few parameters. The first three relate to storage services; for details, please see the storage services documentation. The next parameter is the path to the console worker host executable to run. As noted above, if you have multiple worker types you will need to subclass ConsoleWorkItemScheduler to make this a per-work-unit decision at execution time. The final parameter is launchDebugger; setting this to true launches and attaches a debugger to the worker host process when it is executed.
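For the simple single-worker case, instantiation might look like the following minimal sketch. It assumes the three storage service variables have already been created as described in the storage services documentation (for example, SQLite-backed for a fully local run), and the worker host path shown is illustrative.

// A minimal sketch, assuming workItemStorageService, workItemStatusService
// and resultStorageService were created per the storage services
// documentation. The path below is an illustrative assumption.
string workerHostPath = System.IO.Path.Combine(
    System.AppDomain.CurrentDomain.BaseDirectory,
    "DNV.One.Compute.WorkerHost.Console.exe");

var scheduler = new ConsoleWorkItemScheduler(
    workItemStorageService,
    workItemStatusService,
    resultStorageService,
    workerHostPath,
    launchDebugger: false); // set to true to attach a debugger to the worker host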
An example scheduler class implementation is shown below:
/// <summary>
/// Work scheduler for the application.
/// </summary>
public class OptimizationDemoConsoleWorkItemScheduler : ConsoleWorkItemScheduler
{
    /// <summary>
    /// Initializes a new instance of the <see cref="OptimizationDemoConsoleWorkItemScheduler"/> class.
    /// </summary>
    /// <param name="workItemStorageService">
    /// The work item storage service.
    /// </param>
    /// <param name="workItemStatusService">
    /// The work item status service.
    /// </param>
    /// <param name="resultStorageService">
    /// The result storage service.
    /// </param>
    /// <param name="workerHostPath">
    /// The path to the worker host executable.
    /// </param>
    /// <param name="launchDebugger">
    /// Indicates whether to launch the debugger.
    /// </param>
    public OptimizationDemoConsoleWorkItemScheduler(
        IFlowModelStorageService<WorkItem> workItemStorageService,
        IWorkItemStatusService workItemStatusService,
        IFlowModelStorageService<Result> resultStorageService,
        string workerHostPath,
        bool launchDebugger)
        : base(workItemStorageService, workItemStatusService, resultStorageService, workerHostPath, launchDebugger)
    {
    }

    /// <summary>
    /// Configures the worker settings, propagating the scheduler's
    /// LaunchDebugger flag to each worker.
    /// </summary>
    /// <param name="job">The job.</param>
    /// <param name="workUnit">The work unit.</param>
    /// <param name="workerSettings">The worker settings.</param>
    /// <returns>
    /// The <see cref="FlowModelWorkerSettings"/>.
    /// </returns>
    protected override FlowModelWorkerSettings ConfigureWorkerSettings(Job job, WorkUnit workUnit, FlowModelWorkerSettings workerSettings)
    {
        workerSettings.LaunchDebugger = this.LaunchDebugger;
        return workerSettings;
    }
}
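You construct this subclass with exactly the same parameters as ConsoleWorkItemScheduler itself, since its constructor simply forwards to the base class; pass launchDebugger as true when you want to step into your worker code as it runs.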