Local AI for Robotics • No Cloud Required

Give Your Robots a Brain That Works Offline

Shodh is the local knowledge system that lets robots understand instructions, remember procedures, and make decisions—all running on the robot itself.

Integrations: ROS2, Zenoh, REST API, gRPC • Works with your existing robot stack

Works With Your Robot Stack

Shodh integrates with industry-standard robotics frameworks and protocols

🐍

Python SDK

Native PyO3 bindings. Zero IPC overhead. Ideal for ROS2 nodes, drones, and embedded controllers.

→ 5-10ms faster than REST API
🤖

ROS2

Native ROS2 integration via topics and services. Drop into existing robot stacks.

→ Most widely used in industry

REST API

Standard HTTP/JSON APIs for custom robot controllers and legacy systems.

→ Easy integration anywhere
🌐

Zenoh

Next-gen pub/sub for multi-robot fleets and edge deployments.

→ Advanced fleet coordination
Native Python SDK (PyO3) - Zero IPC Overhead
# Install: pip install shodh-memory
import shodh_memory

# Initialize local memory system (fully offline)
memory = shodh_memory.MemorySystem(storage_path="./robot_memory")

# Record robot observations
memory.record(
    content="Detected obstacle at grid (10, 20) in zone A",
    experience_type="observation",
    entities=["obstacle_147", "zone_a"]
)

# Query for relevant memories (hybrid semantic + temporal search)
results = memory.retrieve(
    query="obstacles in zone A",
    max_results=5,
    mode="hybrid"  # semantic, temporal, or hybrid
)

for mem in results:
    print(f"{mem['content']} (importance: {mem['importance']:.2f})")

# Flush to disk before shutdown
memory.flush()

# ROS2 node integration example
import rclpy
from rclpy.node import Node
from std_msgs.msg import String

class RobotMemoryNode(Node):
    def __init__(self):
        super().__init__('memory_node')
        self.memory = shodh_memory.MemorySystem("./ros2_memory")
        # Wire the callback to an observations topic
        self.subscription = self.create_subscription(
            String, 'observations', self.observation_callback, 10)

    def observation_callback(self, msg):
        # Direct in-process calls (no HTTP/IPC overhead)
        self.memory.record(content=msg.data, experience_type="observation")
        results = self.memory.retrieve(query=msg.data, max_results=3)

⚡ Performance: Native Rust bindings via PyO3 - 5-10ms faster than the REST API, zero serialization overhead, single-process deployment. ONNX Runtime for embeddings (bring your own model) + llama.cpp for local LLM inference.
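The `mode="hybrid"` retrieval above blends semantic similarity with recency. Shodh's actual scoring formula isn't shown here; the sketch below illustrates one common way such a blend works, with `alpha` and `half_life` as purely illustrative parameters, not Shodh's real defaults.

```python
def hybrid_score(semantic_sim: float, age_seconds: float,
                 alpha: float = 0.7, half_life: float = 3600.0) -> float:
    """Blend semantic similarity with an exponential recency decay.

    alpha weights semantics vs. recency; half_life is the time (s)
    after which a memory's temporal score halves. Both values are
    illustrative, not Shodh's actual parameters.
    """
    temporal = 0.5 ** (age_seconds / half_life)
    return alpha * semantic_sim + (1 - alpha) * temporal

# A fresh, moderately similar memory can outrank an old, very similar one
fresh = hybrid_score(semantic_sim=0.6, age_seconds=60)       # ~1 min old
stale = hybrid_score(semantic_sim=0.9, age_seconds=86400)    # 1 day old
print(fresh > stale)
```

This is why a robot asking about "obstacles in zone A" surfaces this morning's detection ahead of a closer textual match from last week.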

Why Robots Need RAG (Not Just LLMs)

Robots can't wait for cloud APIs. They need instant access to knowledge, running entirely on-device.

Why Robots Can't Wait for Cloud APIs

Factory robots need instant decisions. Cloud latency = production stoppage.

Cloud LLMs
- 500ms+ latency: robot freezes waiting for network
- ₹4.2L/day API costs: 1000 robots × 1000 queries/day
- No offline mode: WiFi down = robots stop

Shodh RAG
- 50ms response: local index on robot controller
- Zero API costs: run 1M queries for free
- 100% offline: no internet dependency
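The cost figures above check out with simple arithmetic: at the stated fleet size, the ₹4.2L/day cloud bill works out to a per-query price, while a local index has no marginal cost per query.

```python
# Back-of-envelope check of the cost comparison above.
robots = 1000
queries_per_robot_per_day = 1000
daily_queries = robots * queries_per_robot_per_day   # 1,000,000 queries/day

cloud_cost_inr_per_day = 420_000                     # ₹4.2 lakh
per_query_inr = cloud_cost_inr_per_day / daily_queries
print(f"{daily_queries:,} queries/day at ₹{per_query_inr:.2f} per query")
# Local RAG amortizes to hardware only: marginal cost per query is ~₹0
```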

Response Time: Factory Robot Sees Unknown Object

Cloud LLM: 700ms
- 0ms: Start
- 200ms: Network delay
- 700ms: GPT-4 response
- 700ms+: Line stops

Shodh RAG: 100ms
- 0ms: Start
- 20ms: Query local index
- 50ms: Find SOPs
- 100ms: Decision made

Stack: Shodh (190MB) + llama.cpp (4GB) on robot controller

Built For Real Robots, Real Industries

Where local AI makes the difference between working and waiting

🏭

Manufacturing

Assembly robots query work instructions, safety procedures, and quality checklists—all without internet dependency.

→ 100ms response vs 700ms cloud delay
📦

Warehousing

Fulfillment robots understand inventory layouts, item locations, and picking procedures—even when WiFi drops.

→ Works offline, zero downtime
🌾

Agriculture

Field robots access crop disease databases, treatment protocols, and weather patterns—in areas with no internet.

→ 100% offline capability

Choose Your Communication Layer

Shodh works with ROS2 (industry standard), Zenoh (next-gen fleet coordination), or direct API integration

Zero-Copy Pub/Sub

Ultra-low latency message passing for real-time robot control

Mesh Networking

Decentralized peer discovery and automatic routing

Built for Edge

Designed for resource-constrained embedded systems

Multi-Protocol

TCP, UDP, QUIC, shared memory - choose your transport

ROS2 (Industry Standard)

✓ Mature ecosystem: Vast library of packages and tools

✓ Industry adoption: Used by most commercial robot companies

✓ DDS protocol: Proven real-time communication

✓ Shodh integration: Native ROS2 nodes for query/response

Zenoh (Next-Gen Fleet)

✓ 10-100x lower latency: Zero-copy shared memory

✓ Mesh networking: Automatic peer discovery, no broker

✓ Edge-optimized: Runs on microcontrollers (<64KB RAM)

✓ ROS2 compatible: Works alongside existing ROS2 stacks

RAG-Powered Robot Capabilities

What happens when you combine Shodh RAG (knowledge) with Zenoh (real-time messaging)?

Embodied Intelligence

Robots that think before they act

RAG provides context-aware decision making for autonomous robots

Example: Robot queries manuals, SOPs, and environment maps before acting

Natural Language Control

Talk to robots like you talk to people

Operators give instructions in plain language, RAG interprets and executes

Example: "Navigate to warehouse section B3 and retrieve item X" → RAG finds path, item location, and procedure
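How an instruction like the one above decomposes into retrieval queries isn't specified here; in practice the local LLM would do the interpretation. The sketch below is only a toy regex stand-in showing the decomposition pattern (destination → map lookup, item → location lookup, verb → procedure lookup); the function name and query phrasing are hypothetical.

```python
import re

def plan_queries(command: str) -> list[str]:
    """Toy decomposition of an operator command into RAG queries.

    A real system would use the local LLM; this regex sketch just
    illustrates the pattern of splitting one instruction into
    several knowledge lookups.
    """
    queries = []
    loc = re.search(r"(?:navigate to|go to)\s+([\w\s]+?)(?:\s+and|$)",
                    command, re.IGNORECASE)
    if loc:
        queries.append(f"map and route to {loc.group(1).strip()}")
    item = re.search(r"retrieve\s+([\w\s]+)$", command, re.IGNORECASE)
    if item:
        queries.append(f"storage location of {item.group(1).strip()}")
        queries.append("picking procedure")
    return queries

print(plan_queries("Navigate to warehouse section B3 and retrieve item X"))
```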

Offline-First Knowledge

Works without internet connectivity

All documentation indexed locally on robot controller - no cloud dependency

Example: Robots work in factories, warehouses, fields without internet

Multi-Robot Coordination

Robots learn from each other instantly

RAG + Zenoh enables fleet-wide knowledge sharing and task coordination

Example: One robot learns new procedure, entire fleet gets updated knowledge via Zenoh mesh
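The fleet-learning pattern above can be sketched without Zenoh itself: a tiny in-process pub/sub stand-in where one robot publishes a learned procedure and every subscribed peer indexes it. The classes and topic key below are illustrative, not Zenoh's or Shodh's actual APIs.

```python
from collections import defaultdict

class MiniMesh:
    """Toy in-process stand-in for a Zenoh pub/sub mesh (illustration only)."""
    def __init__(self):
        self.subscribers = defaultdict(list)

    def subscribe(self, key: str, callback):
        self.subscribers[key].append(callback)

    def publish(self, key: str, payload: str):
        # Deliver to every subscriber of this key
        for cb in self.subscribers[key]:
            cb(payload)

class Robot:
    def __init__(self, name: str, mesh: MiniMesh):
        self.name = name
        self.knowledge: list[str] = []   # stand-in for a local Shodh index
        mesh.subscribe("fleet/procedures", self.knowledge.append)

mesh = MiniMesh()
fleet = [Robot(f"robot_{i}", mesh) for i in range(3)]

# One robot learns a new procedure and shares it over the mesh
mesh.publish("fleet/procedures", "pallet jam recovery: reverse 0.5m, re-grip")

print([len(r.knowledge) for r in fleet])   # every robot now has the procedure
```

With real Zenoh, `subscribe`/`publish` would be session calls over the network mesh; the knowledge-propagation pattern is the same.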

RAG + Zenoh Architecture

How Shodh RAG and Zenoh work together in a robot system

┌─────────────────────────────────────────────────────────────┐
│                    ROBOT CONTROLLER                         │
│                                                             │
│  ┌──────────────────┐        ┌──────────────────┐          │
│  │   SHODH RAG      │        │   ZENOH NODE     │          │
│  │                  │        │                  │          │
│  │  • Local index   │◄──────►│  • Pub/Sub mesh  │          │
│  │  • 1000 docs     │        │  • Zero-copy     │          │
│  │  • LLM (local)   │        │  • Edge routing  │          │
│  └──────────────────┘        └──────────────────┘          │
│           ▲                           ▲                     │
│           │                           │                     │
│           ▼                           ▼                     │
│  ┌─────────────────────────────────────────────┐           │
│  │        ROBOT DECISION ENGINE                │           │
│  │  Query: "Navigate to loading dock B"        │           │
│  │  RAG: Finds map, procedures, safety rules   │           │
│  │  Zenoh: Broadcasts intent to fleet          │           │
│  └─────────────────────────────────────────────┘           │
│                                                             │
└─────────────────────┬───────────────────────────────────────┘
                      │
        ┌─────────────┴─────────────┐
        │                           │
        ▼                           ▼
┌───────────────┐           ┌───────────────┐
│  ROBOT 2      │◄─────────►│  ROBOT 3      │
│  (Zenoh peer) │  Mesh     │  (Zenoh peer) │
│  Shares: Path │  Network  │ Shares: Status│
└───────────────┘           └───────────────┘

1. Query

Robot receives command or senses environment

2. RAG Retrieval

Shodh finds relevant docs, maps, procedures locally

3. Fleet Coordination

Zenoh broadcasts decision to other robots
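The three steps above form one decision loop. The sketch below shows that loop with the retrieval and broadcast layers injected as stubs; the function names are hypothetical, with `retrieve` standing in for Shodh's local lookup and `broadcast` for a Zenoh publish.

```python
def decide_and_act(command: str, retrieve, broadcast) -> dict:
    """Sketch of the query -> RAG retrieval -> fleet coordination loop.

    `retrieve` and `broadcast` are injected stand-ins for Shodh and
    Zenoh, so the loop itself stays self-contained and testable.
    """
    # 1. Query: a command arrives (from an operator or a sensor event)
    context = retrieve(command)             # 2. RAG: local docs, maps, SOPs
    decision = {"command": command, "context": context}
    broadcast("fleet/intents", decision)    # 3. Coordination: tell the fleet
    return decision

# Wire it up with stubs
log = []
decision = decide_and_act(
    "navigate to loading dock B",
    retrieve=lambda q: [f"map entry for: {q}", "dock safety rules"],
    broadcast=lambda key, msg: log.append((key, msg)),
)
print(decision["context"], len(log))
```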

The RAG-Robotics Roadmap

Where we are and where we're going

2025-2026

RAG-Enabled Industrial Robots

Factory robots query maintenance manuals, safety protocols, and production schedules in real-time

Shodh RAG + Zenoh + ROS2
2026-2027

Autonomous Warehouse Fleets

Entire warehouse fleets share knowledge mesh - if one robot learns a new route, all robots benefit

Distributed RAG + Zenoh mesh networking
2027-2028

Agricultural Robot Swarms

Field robots coordinate crop management using shared knowledge of soil conditions, weather patterns, and harvest schedules

Edge RAG + Zenoh + Precision agriculture AI
2028+

Fully Embodied AI

Robots with RAG-based reasoning that understand context, follow complex instructions, and learn from experience

Multimodal RAG + Zenoh + Vision-Language models

Build RAG-Powered Robots Today

Shodh RAG is production-ready. Zenoh is production-ready. The future of embodied AI starts now.