Dev Log #1: Why These Parts, Why This Architecture
This is not a guide. This is a journal entry about the decisions that shaped the 3we platform — what we chose, what we rejected, and what broke along the way. If you’re evaluating whether this project is real engineering or vapor, this post is for you.
The Problem We’re Solving
Most “open-source robot” projects fall into two categories:
- Toy demos — Arduino car with ultrasonic sensor, no path planning, no SLAM, useless for research
- Academic one-offs — Custom hardware that costs $5,000+, runs unmaintained ROS1 code, and requires a PhD student to operate
We wanted a third option: a robot that costs less than a good monitor, runs modern ROS2, and has a Python API that ML researchers can use without ever learning colcon build. That’s what shaped every decision below.
Why ESP32-S3 + Raspberry Pi 5
The rejected alternatives
| Option | Why we rejected it |
|---|---|
| STM32 only | Great for motor control, but no WiFi/BLE built-in. Adding wireless means another chip, more PCB complexity, more failure modes. |
| Jetson Nano/Orin | 4× the cost of Pi 5. The GPU is overkill for 2D navigation — we use Hailo-8L for inference, which is cheaper and lower power. Jetson also means NVIDIA’s SDK lock-in. |
| Single SoC (Pi 5 does everything) | Pi 5 runs Linux. Linux is not real-time. Motor PID at 1000Hz on a Linux scheduler = jitter = bad odometry = SLAM drift. |
| ESP32 (original, not S3) | No USB-OTG. We need USB-CDC for micro-ROS transport. Also less RAM and no vector instructions. |
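To make the jitter argument concrete, here is a minimal sketch of the discrete PID step that runs per wheel in a 1 kHz loop. This is illustrative Python, not the actual ESP-IDF firmware (which is C), and the gains and class names are invented. The point is that both the integral and derivative terms assume a fixed `DT`; on a non-real-time scheduler the actual interval jitters, which corrupts those terms and, downstream, the odometry.

```python
DT = 0.001  # fixed 1 kHz control period the math assumes

class PID:
    """Textbook discrete PID; names and gains are illustrative."""

    def __init__(self, kp: float, ki: float, kd: float):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.i = 0.0        # integral accumulator
        self.prev_err = 0.0

    def step(self, setpoint: float, measured: float) -> float:
        err = setpoint - measured
        self.i += err * DT                 # wrong if the real dt jittered
        d = (err - self.prev_err) / DT     # amplifies timing jitter
        self.prev_err = err
        return self.kp * err + self.ki * self.i + self.kd * d
```

If the loop actually ran at, say, 1.3 ms one tick and 0.7 ms the next, both `self.i` and `d` are computed with the wrong timebase — which is exactly why the PID lives on the microcontroller, not on Linux.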
What we ended up with
ESP32-S3 handles:
- Motor PID at 1kHz (4 channels, quadrature encoders)
- IMU fusion at 100Hz (BNO055 over I2C)
- Safety watchdog (kills motors if no heartbeat from Pi in 200ms)
- micro-ROS agent over USB-CDC
Raspberry Pi 5 handles:
- ROS2 Jazzy (Nav2, SLAM Toolbox, perception)
- Hailo-8L inference (13 TOPS, M.2 HAT)
- Python SDK server
- WiFi AP for development
This split is not novel — it’s the same architecture as commercial AMRs (ABB, MiR, etc.). The contribution is making it reproducible at $300.
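The safety watchdog in the list above is simple enough to sketch. This is illustrative Python, not the actual ESP32-S3 firmware; the 200 ms constant comes from the spec above, everything else (class and method names) is invented:

```python
import time

HEARTBEAT_TIMEOUT_S = 0.200  # 200 ms, per the firmware spec above

class MotorWatchdog:
    """Sketch of the watchdog logic: the MCU disables the motor
    drivers whenever the Pi's heartbeat goes stale."""

    def __init__(self):
        self._last_beat = time.monotonic()
        self.motors_enabled = True

    def feed(self):
        # Called whenever a heartbeat message arrives from the Pi.
        self._last_beat = time.monotonic()
        self.motors_enabled = True

    def tick(self) -> bool:
        # Called from the high-rate control loop.
        if time.monotonic() - self._last_beat > HEARTBEAT_TIMEOUT_S:
            self.motors_enabled = False  # kill motors until the next beat
        return self.motors_enabled
```

Because this runs on the microcontroller, a crashed Python script, a hung ROS2 node, or a dropped WiFi link all fail the same safe way: silence, then stopped motors.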
The real reason for ESP32-S3
Honestly? Because it’s $4 on Taobao, has excellent documentation, and the ESP-IDF + micro-ROS toolchain actually works without fighting CMake for three days. We tried STM32 first. The HAL library and CubeMX code generation experience was painful enough that we switched after two weeks.
Why Mecanum Wheels
Differential drive is simpler and cheaper. We chose mecanum anyway because:
- Holonomic motion — the robot can strafe sideways. This makes docking trivial (approach perpendicular, slide in) and indoor navigation much cleaner (no 3-point turns in corridors).
- Nav2 compatibility — Nav2’s omnidirectional motion model works directly with mecanum. Differential drive needs a different controller configuration and produces worse paths in tight spaces.
- Research utility — omnidirectional platforms are more interesting for RL research because the action space is richer (3-DOF: vx, vy, omega vs 2-DOF: v, omega).
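For concreteness, the standard inverse kinematics that maps the 3-DOF body twist (vx, vy, omega) to four wheel speeds for an X-configuration mecanum base looks like this. The dimensions are placeholders, not the actual 3we geometry:

```python
# Standard mecanum inverse kinematics (X-configuration, 45-degree rollers):
# body twist (vx, vy, omega) -> four wheel angular velocities.
WHEEL_RADIUS = 0.03  # m (assumed)
HALF_LENGTH = 0.10   # m, half the wheelbase (assumed)
HALF_WIDTH = 0.12    # m, half the track width (assumed)

def mecanum_wheel_speeds(vx: float, vy: float, omega: float):
    """Return (front_left, front_right, rear_left, rear_right) in rad/s."""
    k = HALF_LENGTH + HALF_WIDTH
    fl = (vx - vy - k * omega) / WHEEL_RADIUS
    fr = (vx + vy + k * omega) / WHEEL_RADIUS
    rl = (vx + vy - k * omega) / WHEEL_RADIUS
    rr = (vx - vy + k * omega) / WHEEL_RADIUS
    return fl, fr, rl, rr

# Pure strafe (vy only): wheels spin in the alternating pattern that
# makes mecanum drive holonomic.
print(mecanum_wheel_speeds(0.0, 0.2, 0.0))
```

A differential-drive swap only changes this mapping (and drops vy), which is why the kinematic model can live in firmware while the SDK API stays the same.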
The downside we accepted
Mecanum wheels have terrible traction on carpet and uneven floors. The rollers slip. Our odometry on carpet drifts ~5% over 10 meters without LiDAR correction. On hard floors it’s under 1%.
We accepted this because the primary use case is indoor labs and offices (hard floors). If you need outdoor capability, swap to differential drive — the SDK API doesn’t change, only the firmware kinematic model.
PBC-34: The Payload Bus
The problem
Researchers want to mount custom hardware: gripper arms, additional cameras, soil sensors, air quality monitors. The obvious solution is “just connect to GPIO” — but that creates three problems:
- No isolation — a short in your payload kills the robot’s compute
- No discovery — how does the SDK know what payload is attached?
- No power management — you can’t hot-plug without inrush protection
What we built
A 34-pin connector (2×17, 2.54mm pitch, cheap and available everywhere) with:
- Separate power rails (5V/5A, 12V/3A, VBAT/10A) each with P-MOSFET high-side switches
- I2C, UART, SPI, CAN, USB, GPIO — pick what you need
- EEPROM-based discovery (24C02, I2C address 0x50) — plug in, robot reads descriptor, SDK exposes payload API
- Hardware overcurrent protection (per-rail shutdown in <1ms)
- DETECT pin with physical contact — no software-only detection
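The discovery flow can be sketched in a few lines of Python. The descriptor layout here (magic number, version, payload ID, name) is hypothetical — the real on-wire format is defined by the firmware — but the flow is the same: read the first bytes of the EEPROM, validate, then expose the payload through the SDK:

```python
import struct

# HYPOTHETICAL descriptor layout for the payload EEPROM (24C02 at 0x50):
#   uint16 magic, uint8 version, uint8 payload_id, uint8 name_len, name[]
MAGIC = 0x3E57  # made-up magic number, for illustration only

def parse_descriptor(raw: bytes) -> dict:
    """Parse the first bytes read from a payload's EEPROM."""
    magic, version, payload_id, name_len = struct.unpack_from("<HBBB", raw, 0)
    if magic != MAGIC:
        raise ValueError("no valid payload descriptor (blank EEPROM?)")
    name = raw[5:5 + name_len].decode("ascii")
    return {"version": version, "id": payload_id, "name": name}

# Example: a gripper payload advertising itself.
blob = struct.pack("<HBBB", MAGIC, 1, 0x21, 7) + b"gripper"
print(parse_descriptor(blob))
```

The magic-number check is what makes a blank or corrupted EEPROM fail loudly at plug-in time instead of producing a ghost payload in the SDK.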
What went wrong in prototyping
Rev A: We used N-channel MOSFETs as low-side switches. Problem: when the MOSFET is off, the payload’s ground floats. Any capacitance between the payload and robot chassis creates noise that crashes the I2C bus. Spent two weeks debugging “random I2C errors.”
Fix: Switched to P-channel high-side switches (Si2301CDS for 5V, AO3401A for 12V). Ground is always connected. No more floating ground problems.
Rev B: EEPROM detection worked, but we forgot to add pull-ups on the DETECT pin on the robot-side PCB. If a payload didn’t have the DETECT resistor (early prototypes), the pin floated and the firmware thought a payload was perpetually connecting/disconnecting.
Fix: Added 10k pull-up on DETECT, changed firmware to require stable LOW for 50ms before triggering discovery.
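The rev-B debounce fix reduces to a small state machine. This is illustrative Python, not the ESP32 firmware; only the 50 ms constant comes from the fix above:

```python
DEBOUNCE_MS = 50  # DETECT must be stably LOW this long before discovery

class DetectDebouncer:
    """Sketch of the rev-B fix: the 10k pull-up keeps DETECT high when
    nothing is attached, and we only declare 'payload present' after
    50 ms of continuous LOW."""

    def __init__(self):
        self._low_since = None
        self.payload_present = False

    def sample(self, pin_is_low: bool, now_ms: int) -> bool:
        if not pin_is_low:
            self._low_since = None       # any HIGH sample resets the timer
            self.payload_present = False
        elif self._low_since is None:
            self._low_since = now_ms     # falling edge: start the timer
        elif now_ms - self._low_since >= DEBOUNCE_MS:
            self.payload_present = True  # stable LOW long enough: discover
        return self.payload_present
```

Without the pull-up, a floating pin produces exactly the pathological input this state machine is guarding against: rapid pseudo-random HIGH/LOW samples that never stay LOW long enough to be real.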
These are the boring, real problems that don’t appear in README files. They’re also the reason we have confidence the design works — because we broke it, diagnosed it, and fixed it.
What’s Not Done Yet (Honestly)
| Component | Status | What’s missing |
|---|---|---|
| Mock backend | Working | ✅ 309 tests pass, navigation demo runs |
| Firmware | Working | Tested on bench with motors, needs field hours |
| ROS2 stack | Working | Nav2, SLAM, perception all run in Gazebo |
| Real hardware backend | Partially working | ROS2 bridge works, needs integration testing with full SDK |
| PyPI publishing | Not done | Package works from source, not yet on PyPI |
| Isaac Sim backend | Stubbed | Interface defined, not tested with real Isaac Sim |
| Gazebo backend | Partially working | Works in isolation, not fully tested through SDK |
| Hardware validation | In progress | Charging dock PCB untested, main board on rev C |
The honest summary: the software stack runs end-to-end in mock mode, and the firmware runs on real hardware, but the full Sim2Real pipeline (SDK → ROS2 → firmware → motors) has not been validated as a complete chain yet. That’s the next milestone.
SDK Demo (Right Now, On Your Machine)
You don’t need to take our word for any of this. Clone, install, run:
```bash
git clone https://github.com/telleroutlook/3we-robot-platform.git
cd 3we-robot-platform
pip install -e sdk/threewe/
python examples/navigate_office.py
```

The mock backend simulates a robot navigating through an office with collision detection and LiDAR raycasting — all in pure Python, zero dependencies beyond numpy.
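If you’re wondering what “LiDAR raycasting in pure Python” amounts to, a minimal version is a ray-march over an occupancy grid like this. This is an illustrative sketch, not the repo’s actual mock-backend code; the grid, cell size, and function names are all invented:

```python
import math

# Toy occupancy grid: '#' is a wall, '.' is free space.
GRID = [
    "#########",
    "#.......#",
    "#...#...#",
    "#.......#",
    "#########",
]
CELL = 0.1  # meters per grid cell (assumed)

def raycast(x: float, y: float, angle: float, max_range: float = 2.0) -> float:
    """Distance (m) from (x, y) along `angle` to the first wall cell."""
    step = CELL / 4  # march in quarter-cell increments
    d = 0.0
    while d < max_range:
        cx = int((x + d * math.cos(angle)) / CELL)
        cy = int((y + d * math.sin(angle)) / CELL)
        if cy < 0 or cy >= len(GRID) or cx < 0 or cx >= len(GRID[0]):
            break  # ray left the map
        if GRID[cy][cx] == "#":
            return d
        d += step
    return max_range

# A full scan is just many beams: [raycast(x, y, a) for a in angles]
```

Collision checking in the mock backend is the same idea with a range of zero: is the cell under the robot occupied or not.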
What’s Next
1. PyPI publish — `pip install threewe` should just work
2. Full Sim2Real chain test — SDK → Gazebo, record video proof
3. Real hardware video — motors spinning, LiDAR scanning, Nav2 navigating
4. Community feedback — what do ML researchers actually want from the API?
If you have opinions on #4, open an issue or join the discussion at github.com/telleroutlook/3we-robot-platform.