.. _local_compute-label:

Local Compute: Running Scaleout Edge Clients With Local Code
============================================================

Scaleout Edge supports two primary modes of client execution: **Managed Compute** (where the server pushes code to the client) and **Local Compute** (where the client runs its own code). Local Compute is essential for advanced users who need full control over the execution environment, such as debugging custom models, integrating with proprietary local data pipelines, or operating in highly regulated environments where downloading code is restricted.

How Local Compute Works
-----------------------

In a standard managed workflow, the Control Plane distributes a Compute Package to the edge node. In Local Compute mode, you bypass this distribution step: the client connects to the network but executes logic defined in your local filesystem.

Key Differences
~~~~~~~~~~~~~~~

.. list-table::
   :header-rows: 1
   :widths: 25 35 40

   * - Feature
     - Managed Compute (Standard)
     - Local Compute
   * - Code Source
     - Downloaded from Control Plane
     - Local filesystem
   * - Versioning
     - Managed by Registry
     - Managed by user (Git/local)
   * - Environment
     - Auto-provisioned (venv/Docker)
     - User-managed (local Python/conda)
   * - Best For
     - Production fleets, consistency
     - R&D, debugging, strict security

Setting Up Local Compute
------------------------

1. Define Your Client Logic
~~~~~~~~~~~~~~~~~~~~~~~~~~~

Create a local directory with your project structure (as defined in Project Structure).

.. code-block:: text

   my-local-project/
   ├── scaleout.yaml
   ├── train.py
   ├── validate.py
   └── model.py

2. Start the Client in Local Mode
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

When starting the client via the CLI, use the ``--local-package`` flag to instruct the client to use the current directory instead of downloading a package.
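Before starting, make sure ``scaleout.yaml`` defines the commands the client should run. As a rough sketch only — the keys below are illustrative assumptions, not the authoritative schema (see Project Structure for that) — it might look like:

.. code-block:: yaml

   # Illustrative sketch — field names are assumptions; consult the
   # documented scaleout.yaml schema for your Scaleout Edge version.
   entry_points:
     train:
       command: python train.py
     validate:
       command: python validate.py
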
.. code-block:: bash

   # Navigate to your project root
   cd my-local-project

   # Start the client
   scaleout client start \
       --api-url <API_URL> \
       --token <TOKEN> \
       --local-package

The client will now connect to the Combiner and wait for training requests. When a request arrives, it will execute the ``train`` command defined in your local ``scaleout.yaml``.

Development Workflow
--------------------

Local Compute is the fastest way to iterate on model code:

1. **Edit**: Modify ``train.py`` or ``model.py`` locally.
2. **Restart**: Restart the Scaleout client process (or use a hot-reload script).
3. **Test**: Trigger a new round from the Control Plane.
4. **Debug**: View stdout/stderr logs directly in your terminal without latency.

Security Implications
---------------------

Using Local Compute shifts the security responsibility to the client operator. Since the Control Plane cannot verify the code running on the client (other than by trust), this mode is typically used in:

* **Trusted Enclaves**: Where the infrastructure owner controls both the server and the edge.
* **Development**: Where the data scientist is testing on their own laptop.
* **Air-Gapped Nodes**: Where downloading external code packages is prohibited by firewall rules.
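For reference when iterating locally, the entrypoints named in ``scaleout.yaml`` are ordinary local scripts. A minimal ``train.py`` sketch — assuming, purely for illustration, that the client invokes it with an input model path and an output model path (check your deployment's actual entrypoint contract) — could look like:

.. code-block:: python

   # train.py — minimal local entrypoint sketch.
   # Assumption (illustrative only): the client calls
   #   python train.py <in_model_path> <out_model_path>
   # Adapt the I/O convention to your actual deployment.
   import json
   import sys


   def train(in_path, out_path):
       # Load the current global model (here: a plain JSON weight list).
       with open(in_path) as f:
           weights = json.load(f)
       # Placeholder "training": apply a trivial local update so the
       # round-trip through the client can be observed end to end.
       updated = [w + 1 for w in weights]
       with open(out_path, "w") as f:
           json.dump(updated, f)
       return updated


   if __name__ == "__main__" and len(sys.argv) == 3:
       train(sys.argv[1], sys.argv[2])

Because the script runs in your own terminal, print statements and stack traces appear immediately in the client's stdout/stderr, which is what makes the edit/restart/test loop above so fast.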