# LSIO Composer

## Overview
LSIO Composer is a visual, node-based builder for Docker Compose files. It allows for the creation, modification, and sharing of complex container configurations through a graphical interface.
The primary use case is for container experts to design sophisticated templates where end-users only need to modify a few high-level variables (e.g., PUID, PGID, base paths) to adapt the configuration to their specific host environment. The application generates a standard docker-compose.yml file that includes embedded metadata, allowing the file to be dragged back into the application to restore the visual workspace for further editing.
## How It Works

The application state is managed in `src/App.jsx`, which holds the list of workspace items (nodes) and their connections.
- Node-Based Graph: Users add nodes from the sidebar onto a workspace. These nodes represent Docker services (containers), top-level compose elements (networks, volumes), or configuration overrides (environment variables, user IDs, paths).
- Connections: Users draw connections between node outputs and inputs. This defines the data flow. For example, a "Parent Path" node's output can connect to a "Mount Path" node's input, which in turn connects to a specific volume input on a container node.
- State Propagation: When a node's data is changed or a connection is made, the application logic in `App.jsx` propagates these changes through the graph. The `propagateItemUpdate` function traverses downstream connections, updating the data of connected nodes accordingly.
- Compose Generation: The `generateComposeFile` function iterates through all items on the workspace. It identifies items representing services, top-level networks, volumes, etc., and constructs a JavaScript object that mirrors the structure of a `docker-compose.yml` file. This object is then serialized into YAML format using the `js-yaml` library.
- Metadata Embedding: Before outputting the final file, the current workspace state (items, connections, transform) is serialized to a JSON string, encoded in Base64, and appended to the YAML file as a specially formatted comment (`# LSIO_COMPOSER_DATA::...`).
- Re-importing: When a file is dropped onto the application, it reads the file content, looks for the metadata comment, decodes the Base64 string, and parses the JSON to restore the entire workspace state.
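The embedding and re-import steps can be illustrated with a short sketch. The helper names below (`embedWorkspaceState`, `restoreWorkspaceState`) are hypothetical and not the actual functions in `App.jsx`; only the `# LSIO_COMPOSER_DATA::` marker format comes from the description above.

```jsx
// Illustrative sketch of the metadata round trip (not the exact App.jsx code).
const MARKER = '# LSIO_COMPOSER_DATA::';

// Append the serialized workspace state to the generated YAML as a comment.
function embedWorkspaceState(yamlText, workspaceState) {
  const encoded = btoa(JSON.stringify(workspaceState)); // items, connections, transform
  return `${yamlText}\n${MARKER}${encoded}\n`;
}

// Recover the workspace state from a dropped docker-compose.yml, if present.
function restoreWorkspaceState(fileContent) {
  const line = fileContent.split('\n').find((l) => l.startsWith(MARKER));
  if (!line) return null; // no embedded metadata; file was not produced by the app
  return JSON.parse(atob(line.slice(MARKER.length)));
}
```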
## Developer Guide
This section details the dynamic architecture of the application, enabling developers to extend its functionality by creating custom nodes or integrating with custom container registries.
### Dynamic Item Loading
The application discovers and loads all available node types at runtime. This is achieved through two mechanisms:
- Local Item Modules: `import.meta.glob('./components/items/*.jsx', { eager: true })` in `App.jsx` loads every local item module synchronously on application start. These are used for generic overrides and top-level compose elements.
- Remote API Fetching: Container definitions are fetched from a remote API endpoint. The default endpoint for LinuxServer.io containers is `https://api.linuxserver.io/api/v1/images?include_config=true&include_deprecated=false`. The application parses the JSON response from this endpoint to dynamically generate container nodes.
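As a rough illustration of the remote path, the sketch below fetches the default endpoint and maps each repository entry to the fields the application cares about. The function name and the exact mapping are assumptions; only the URL and the `data.repositories` structure come from the API format described below.

```jsx
// Hypothetical sketch: fetch LinuxServer.io image metadata and turn each
// repository entry into a container node description.
const API_URL =
  'https://api.linuxserver.io/api/v1/images?include_config=true&include_deprecated=false';

async function fetchContainerDefinitions() {
  const response = await fetch(API_URL);
  if (!response.ok) throw new Error(`API request failed: ${response.status}`);
  const json = await response.json();

  // Core data lives under data.repositories.<repository_name> (see "API Data Format").
  const repositories = json.data?.repositories ?? {};
  return Object.values(repositories)
    .flat()
    .map((container) => ({
      name: container.name,           // becomes the node type / default container_name
      logo: container.project_logo,   // icon shown in the container node
      config: container.config ?? {}, // default env_vars, volumes, ports, etc.
    }));
}
```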
For a local module to be recognized as a valid item, it must export two specific named constants:
- `itemDefinition`: A static JavaScript object that defines the node's properties, including its name, default size, and its static `inputs` and `outputs` (connectors).
- `ItemComponent`: A React component that renders the UI within the node's body. This component receives `itemData` and `onItemDataChange` as props to read and update the node's internal state.
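As an illustration of this contract, the following sketch collects every local module into a registry keyed by node name. The registry shape and variable names are assumptions for illustration; `App.jsx` may organize this differently.

```jsx
// Hypothetical sketch of collecting local item modules at startup.
// Each module under src/components/items/ must export both named constants.
const modules = import.meta.glob('./components/items/*.jsx', { eager: true });

const itemRegistry = {};
for (const [path, mod] of Object.entries(modules)) {
  const { itemDefinition, ItemComponent } = mod;
  if (!itemDefinition || !ItemComponent) {
    console.warn(`Skipping ${path}: missing itemDefinition or ItemComponent`);
    continue;
  }
  // Key the registry by the node's display name, e.g. 'My Override'.
  itemRegistry[itemDefinition.name] = { itemDefinition, ItemComponent };
}
```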
### API Data Format
To integrate a custom container registry, the API endpoint must return a JSON object with a structure compatible with the application's parser. The core data is expected under `data.repositories.<repository_name>`. The full API spec can be seen here.
The following is a breakdown of the expected format for each container object in the repository array:
```json
{
  "name": "adguardhome-sync",
  "project_logo": "https://.../adguardhomesync-icon.png",
  "config": {
    "env_vars": [
      { "name": "PUID", "value": "1000" },
      { "name": "PGID", "value": "1000" }
    ],
    "volumes": [
      { "path": "/config", "host_path": "/path/to/config" }
    ],
    "ports": [
      { "external": "8080", "internal": "8080" }
    ],
    "devices": [
      { "path": "/dev/snd", "host_path": "/dev/snd" }
    ],
    "security_opt": [
      { "compose_var": "seccomp:unconfined" }
    ],
    "custom": [
      { "name_compose": "shm_size", "value": "1gb" }
    ],
    "networking": "host",
    "privileged": true
  }
}
```
Key Fields:
- `name`: (string) The unique identifier for the container. Used as the node type and default `container_name`.
- `project_logo`: (string) URL to the icon displayed in the container node.
- `config`: (object) Contains the default Docker Compose configuration.
- `env_vars`: (array of objects) Each object with a `name` and `value` key will be a default environment variable.
- `volumes`: (array of objects) Each object's `path` key defines a container-side path for a volume mount.
- `ports`: (array of objects) Each object with `external` and `internal` keys defines a default port mapping.
- `devices`: (array of objects) Each object with `host_path` and `path` defines a device mapping.
- `security_opt`: (array of objects) Each object's `compose_var` is added to the `security_opt` list.
- `custom`: (array of objects) For arbitrary top-level service keys. The `name_compose` field is the YAML key, and `value` is its value (e.g., `shm_size: 1gb`).
- `networking`: (string) Sets the default `network_mode`.
- `privileged`: (boolean) Sets the `privileged` flag.
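To make the field mapping concrete, here is roughly the JavaScript object that `generateComposeFile` would build from the example above before `js-yaml` serializes it. The exact shape (e.g., whether environment variables are emitted as a list or a map) is an assumption; the key names are standard Docker Compose keys implied by the field descriptions.

```jsx
// Illustrative only: the service entry implied by the example
// "adguardhome-sync" config above, before js-yaml serializes it.
const compose = {
  services: {
    'adguardhome-sync': {
      container_name: 'adguardhome-sync',      // "name"
      environment: ['PUID=1000', 'PGID=1000'], // "env_vars" (list form assumed)
      volumes: ['/path/to/config:/config'],    // "host_path:path" pairing assumed
      ports: ['8080:8080'],                    // "external:internal"
      devices: ['/dev/snd:/dev/snd'],          // "host_path:path"
      security_opt: ['seccomp:unconfined'],    // "compose_var" entries
      shm_size: '1gb',                         // "custom": name_compose -> value
      network_mode: 'host',                    // "networking"
      privileged: true,                        // "privileged"
    },
  },
};
```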
### Dynamic Connectors

Certain nodes, such as containers and the `EnvVarOverride` node, feature connectors that are generated dynamically based on the node's internal state. This logic resides in the `getDynamicDefinition` helper function in `src/components/WorkspaceItem.jsx`.
- EnvVarOverride Example: The `EnvVarOverride` node's output connector is defined by the "Key" field in its UI. If a user enters `PUID` as the key, the `getDynamicDefinition` function generates an output connector with `id: 'env_out:PUID'` and `name: 'PUID'`. This allows for a direct, context-aware connection to a container's environment variable input.
- Container Example: Container nodes generate dynamic inputs for every environment variable and volume mount defined in their configuration forms. For an environment variable with the key `TZ`, an input with `id: 'env:TZ'` is created. For a volume mapping like `changeme:/config`, an input with `id: 'volume:/config'` is created. This allows other nodes to target and override specific entries in these lists.
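The sketch below approximates how connectors of this shape could be derived from a node's state. It is not the actual `getDynamicDefinition` code in `WorkspaceItem.jsx`; the `isContainer` flag, the `path_value` type, and the data layout (`itemData.data.env_vars`, `itemData.data.volumes`) are assumptions.

```jsx
// Simplified illustration of deriving dynamic connectors from node state.
function getDynamicConnectors(itemDef, itemData) {
  if (itemDef.name === 'EnvVarOverride') {
    const key = itemData.data?.key;
    // One output per configured key, e.g. id: 'env_out:PUID', name: 'PUID'.
    return key
      ? { outputs: [{ id: `env_out:${key}`, name: key, type: 'env_value' }] }
      : { outputs: [] };
  }
  if (itemDef.isContainer) { // assumed flag marking container nodes
    const envInputs = (itemData.data?.env_vars ?? []).map((env) => ({
      id: `env:${env.name}`,           // e.g. 'env:TZ'
      name: env.name,
      compatibleTypes: ['env_value'],
    }));
    const volumeInputs = (itemData.data?.volumes ?? []).map((vol) => ({
      id: `volume:${vol.path}`,        // e.g. 'volume:/config'
      name: vol.path,
      compatibleTypes: ['path_value'], // assumed type name for path outputs
    }));
    return { inputs: [...envInputs, ...volumeInputs] };
  }
  return {};
}
```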
### Creating a Custom Override Module
To add a new draggable node (e.g., a new type of override), follow these steps:
- Create the File: Add a new `.jsx` file in `src/components/items/`, for example, `MyOverride.jsx`.

- Export `itemDefinition`: Define the static properties of your node. The `outputs` array is critical. The `type` property of an output determines which inputs it can connect to.

  ```jsx
  // src/components/items/MyOverride.jsx
  export const itemDefinition = {
    name: 'My Override',
    defaultSize: { width: 14, height: 8 },
    inputs: [], // This is a provider, so no inputs.
    outputs: [
      {
        id: 'my_override_out',
        name: 'Value',
        type: 'env_value', // Connects to container props like 'user', 'shm_size', or env vars.
        multiple: true,
        color: '#your_hex_color',
      },
    ],
  };
  ```

- Export `ItemComponent`: Create the React component for the node's UI. It must call the `onItemDataChange` prop to update its state in the main application.

  ```jsx
  // src/components/items/MyOverride.jsx
  import React from 'react';

  export const ItemComponent = ({ itemData, onItemDataChange }) => {
    const value = itemData.data?.value ?? 'default-value';

    const handleChange = (e) => {
      onItemDataChange(itemData.id, { value: e.target.value });
    };

    return (
      <div>
        <label>My Value</label>
        <input type="text" value={value} onChange={handleChange} />
      </div>
    );
  };
  ```

- Update Propagation Logic: The `getProviderOutputValue` function in `App.jsx` must be updated to extract the primary output value from your new node type. Add a case for your new node's `name`.

  ```jsx
  // src/App.jsx -> getProviderOutputValue
  switch (itemDef.name) {
    // ... existing cases
    case 'My Override':
      return item.data.value ?? '';
  }
  ```
### Adding Custom Connectors to Containers
To add a new static input to all container nodes:
- Modify Container Definition: Open `src/components/items/Container.jsx`. Locate the `createContainerDefinition` function.

- Add to `inputs` array: Add a new object to the `inputs` array.
  - `id`: Must be unique. Use the prefix `prop:` for simple property overrides (e.g., `prop:hostname`).
  - `name`: The label displayed in the UI.
  - `compatibleTypes`: An array of `type` strings from output connectors that can connect here.
  - `multiple`: `false` for single-value properties.

  ```jsx
  // src/components/items/Container.jsx -> createContainerDefinition -> inputs array
  inputs: [
    // ... existing inputs
    {
      id: 'prop:hostname',
      name: 'Hostname',
      compatibleTypes: ['env_value'],
      multiple: false,
      color: '#your_hex_color',
    },
  ],
  ```

- Update Propagation Logic: In `App.jsx`, the `propagateItemUpdate` function must handle this new input. Add a case that checks for the connector `id`.

  ```jsx
  // src/App.jsx -> propagateItemUpdate
  } else if (toConnectorId.startsWith('prop:')) {
    const propName = toConnectorId.substring('prop:'.length);
    if (newToData[propName] !== incomingValue) {
      newToData[propName] = incomingValue;
      dataChanged = true;
    }
  }
  ```

- Update Container UI: In the `ItemComponent` within `Container.jsx`, find the corresponding form field (e.g., the hostname input) and make it `readOnly` when it is linked.

  ```jsx
  // src/components/items/Container.jsx -> ItemComponent
  const isPropLinked = (propName) => !!linkedInputs[`prop:${propName}`];
  // ...
  <input
    type="text"
    value={serviceData.hostname || ''}
    readOnly={isPropLinked('hostname')}
  />
  ```
## Project Structure

```
.
└── src
    ├── App.jsx              # Main application component, state management, and core logic.
    ├── components
    │   ├── items/           # Directory for all dynamic node modules.
    │   │   ├── Container.jsx  # The component and definition factory for all containers.
    │   │   └── ...            # Each file is a unique node type.
    │   ├── Sidebar.jsx      # Renders the list of draggable items.
    │   ├── Workspace.jsx    # The interactive canvas for nodes and connections.
    │   └── WorkspaceItem.jsx  # Renders a single node and its dynamic connectors.
    └── styles               # CSS modules for components.
```
## Getting Started

- Clone the repository.
- Install dependencies: `npm install`
- Run the development server: `npm run dev`