We design and implement self-hosted platforms for data, AI, and devices — using open-source tools on your servers, in your rack, or at the edge. No lock-in, no mystery cloud bills.
Deploy private LLMs, vector search, and analytics tools on your own hardware using open-source stacks (Ollama/vLLM, Postgres/pgvector, Milvus, etc.).
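As an illustration, vector search with pgvector usually boils down to ranking stored embeddings by cosine distance. A minimal pure-Python sketch of that ranking (the table of documents and the 3-dimensional embeddings are hypothetical; in Postgres, pgvector's `<=>` operator does the same work inside a SQL `ORDER BY`):

```python
import math

def cosine_distance(a, b):
    """Cosine distance (1 - cosine similarity), the metric behind pgvector's <=> operator."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return 1.0 - dot / norm

# Hypothetical document embeddings (real ones come from an embedding model).
docs = {
    "runbook.md":  [0.9, 0.1, 0.0],
    "invoice.pdf": [0.0, 0.2, 0.9],
}
query = [1.0, 0.0, 0.0]

# Rank documents nearest-first, the Python equivalent of
# SELECT name FROM docs ORDER BY embedding <=> %s LIMIT 1;
ranked = sorted(docs, key=lambda name: cosine_distance(query, docs[name]))
print(ranked[0])  # -> runbook.md
```

The point of self-hosting this is that the embeddings and documents never leave your database.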
Design and deploy k3s/microk8s clusters for running AI/IoT workloads at the edge — factories, barns, clinics, branches.
Connect sensors, gateways, and machines with open protocols and self-hosted brokers (MQTT, Akri, LF Edge stack, etc.), with dashboards and alerts you control.
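To show what those open protocols buy you: MQTT routes messages by topic filters, where `+` matches one level and `#` matches all remaining levels. A simplified sketch of how a self-hosted broker such as Mosquitto matches filters to topics (ignoring `$`-prefixed system topics; the barn/stall topic names are hypothetical):

```python
def topic_matches(filter_str: str, topic: str) -> bool:
    """Simplified MQTT topic-filter matching:
    '+' matches exactly one topic level, '#' matches all remaining levels."""
    f_parts = filter_str.split("/")
    t_parts = topic.split("/")
    for i, f in enumerate(f_parts):
        if f == "#":
            return True  # multi-level wildcard swallows the rest of the topic
        if i >= len(t_parts):
            return False  # filter is longer than the topic
        if f != "+" and f != t_parts[i]:
            return False  # literal level does not match
    return len(f_parts) == len(t_parts)

# A gateway might subscribe to every temperature sensor in barn 3:
print(topic_matches("barn3/+/temperature", "barn3/stall7/temperature"))  # True
print(topic_matches("barn3/#", "barn3/stall7/humidity"))                 # True
print(topic_matches("barn3/+/temperature", "barn4/stall1/temperature"))  # False
```

Because the broker runs on your hardware, the topic hierarchy, retention, and access rules stay under your control.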
Get a roadmap for moving from scattered SaaS subscriptions and opaque cloud services toward a deliberate, open-source, self-hosted core.
Identify which apps, data flows, and devices benefit from self-hosting (compliance, latency, cost, control).
Map current SaaS/cloud sprawl → candidate workloads for on-prem/edge.
Choose open-source components (Kubernetes, AI stack, IoT stack, storage, monitoring).
Produce architecture diagrams, network plans, and hardware BOM.
Implement on your hardware (or recommended boxes), automate deployments, secure the stack, and document everything.
Train internal staff; leave them with admin guides and upgrade playbooks.
We prioritize open tools (LF Edge, CNCF projects, self-hosted AI) so you’re never locked into a single vendor.
Our designs assume you want to keep critical data and workloads on your hardware, with cloud only where it adds real value.
We design for unreliable connectivity, constrained hardware, and real-time requirements — not just pretty cloud diagrams.
Every engagement ships with runbooks, backup/restore procedures, and update strategies so you’re not dependent on us forever.
We specialize in designing and deploying open-source platforms for edge, AI, and IoT, with a focus on self-hosting, data ownership, and practical DevOps at the edge.
More on our about page