Final Year Project · BEng Robotics & AI · University of Hertfordshire

Cognitive
Humanoid Robot

A solo-designed, fully offline humanoid with hierarchical sensor trust, tendon-driven hands, and 5 original scientific contributions. Targeting IEEE IROS 2026 and a UK Provisional Patent.

33 Degrees of Freedom
9+ Sensor Modalities
3 Compute Units
5 Novel Contributions
~£1.4k Total Hardware Cost
ROS2 · Jetson Orin Nano Super · HSTA Architecture · Tendon-Driven Hands · IEEE IROS 2026 · UK Provisional Patent · PAHT-CF Structure
Kinematics
33 Degrees of Freedom

Full upper-body articulation — each axis selected for expressiveness, functional manipulation, and academic novelty.

🦾
Each Arm
6 DOF × 2
Shoulder pitch/roll/yaw · Elbow pitch · Wrist pitch/roll. Full anthropomorphic reach envelope.
Fingers (per hand)
5 DOF × 2
1 DOF per finger via tendon routing. Feetech STS3215 serial bus servos. Independent digit control.
👍
Thumb Opposition
1 DOF × 2
Dedicated servo for thumb abduction/adduction. Enables power grasp and precision pinch.
🔄
Hip Rotation
1 DOF
Torso-to-base rotation for natural upper body turning. Brushless motor driven.
🤖
Head
2 DOF
Pan (yaw) and tilt (pitch). Houses depth cam, LiDAR, Radar, and ToF sensors. Fast tracking.
👁️
Eyes
2 DOF
Independent eye pan for expressive social gaze, vergence, and visual attention cues.
💬
Jaw
1 DOF
Viseme-synced jaw motion for speech-driven lip movement. Enhances HRI engagement.
📺
Chest Screen
1 DOF
10" touch display with tilt axis. Displays status, emotions, sensor feeds, UI panels.
Total DOF
33
12 arm + 12 finger + 2 thumb + 1 hip + 2 head + 2 eye + 1 jaw + 1 chest = 33 DOF. Comparable to commercial platforms costing 50× more, achieved on a student budget using PAHT-CF 3D printing and serial bus servos.
Perception Stack
9+ Sensor Modalities

Redundant, cross-validating sensor suite feeding the HSTA trust arbitration system.

Depth Camera
PRIMARY
RGB-D stereo. Object detection, hand-eye coordination, 3D scene reconstruction. Head-mounted.
LiDAR
SLAM
360° point cloud for SLAM, obstacle avoidance, spatial mapping. High-precision distance.
Radar
ROBUST
Works in low-light and dust. Velocity measurement, through-clutter detection.
ToF Array
PROXIMITY
Multi-zone time-of-flight for close-range obstacle sensing and hand proximity detection.
IMU (6-axis)
POSE
Accel + gyro for torso pose estimation, vibration monitoring, fall detection.
Microphone Array
AUDIO
Multi-mic array for speech recognition, sound localisation, wake word detection.
Force / Torque
HAPTIC
Fingertip force sensing for grasp control and safe human contact detection.
Servo Feedback
PROPRIOCEPTION
STS3215 serial bus servos report position, load, voltage, and temperature in real time.
Camera (Chest)
VISION
Wide-angle chest camera for workspace monitoring, gesture recognition, activity logging.
Compute Architecture
Three-Layer Brain Stack

Distributed computing across dedicated units — AI inference, real-time control, and embedded I/O.

🧠
PRIMARY — AI BRAIN
Jetson Orin Nano Super
  • 1024-core Ampere GPU · 8-core ARM Cortex-A78AE
  • 40 TOPS INT8 — runs vision, NLP, SLAM, HSTA arbitration
  • NVMe boot via rootOnNVMe — full ROS2 Humble stack
  • Handles object detection, speech, grasp planning
SECONDARY — REAL-TIME CONTROL
STM32 / Raspberry Pi (Co-processor)
  • Hard real-time servo control loop, <1 ms cycle time
  • Joint trajectory execution, PID loops, safety watchdog
  • Communicates with Jetson via high-speed UART/SPI
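The division of labour above can be shown as a structural sketch. The actual hard real-time loop runs in firmware on the STM32 at a sub-millisecond cycle, not in Python; the gains, the toy plant, and the 50 ms heartbeat timeout below are all illustrative assumptions, not values from the FYP firmware.

```python
import time

class JointPID:
    """One joint's position loop (structural sketch only; the real
    hard real-time loop runs on the STM32 co-processor)."""

    def __init__(self, kp, ki, kd, dt=0.001):  # dt: 1 ms nominal cycle
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def step(self, target, measured):
        """Return a velocity command from position error (PID form)."""
        error = target - measured
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

def watchdog_ok(last_heartbeat_s, timeout_s=0.05):
    """Safety watchdog: halt motion if the Jetson stops sending commands."""
    return (time.monotonic() - last_heartbeat_s) < timeout_s

# Toy plant: integrate the velocity command toward a 1.0 rad target.
pid = JointPID(kp=5.0, ki=0.5, kd=0.01)
position = 0.0
for _ in range(2000):  # 2 s of simulated 1 kHz control
    position += pid.step(1.0, position) * pid.dt
print(round(position, 3))  # settles near the 1.0 rad target
```

On the robot this loop would also clamp commands and drop to a safe stop whenever `watchdog_ok` returns False.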
📡
TERTIARY — I/O & WIRELESS
ESP32 Microcontroller
  • Sensor multiplexing · I2C / SPI bus management
  • Wi-Fi + Bluetooth for remote telemetry and control
  • Custom PCB design — part of FYP hardware contribution
Novel Architecture
HSTA — Hierarchical Sensor Trust Architecture

The core original contribution of this FYP. A dynamic sensor arbitration system that weights, cross-validates, and gracefully degrades across 9+ modalities in real time.

01
Dynamic Trust Weighting
Each sensor is assigned a runtime trust score based on environmental conditions, historical accuracy, and inter-sensor agreement. No static priority — trust adapts.
02
Cross-Modal Validation
Depth camera, LiDAR, and Radar outputs are cross-validated. Outlier rejection triggers confidence degradation of the conflicting sensor, not a hard shutdown.
03
Graceful Degradation
If a sensor fails or becomes unreliable (occlusion, lighting, interference), the system redistributes trust to remaining modalities without task interruption.
04
Hierarchical Decision Layers
Safety-critical decisions (collision, human proximity) use highest-confidence fusion. Higher-level decisions (object ID, navigation) use broader multi-sensor consensus.
05
Proprioceptive Integration
Servo position/load feedback, IMU, and force sensors are included in the trust hierarchy — treating internal state as a first-class sensing modality.
06
Novel Academic Contribution
HSTA is the primary contribution targeting IEEE IROS 2026. Positioned as a reproducible, low-cost framework applicable to any multi-sensor robot platform.
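The six principles above can be condensed into a small sketch. This is an illustrative Python toy, not the FYP's ROS2 C++ implementation: the sensor names, the exponential agreement kernel, and the smoothing factor `alpha` are assumptions chosen for clarity.

```python
import math

class SensorTrust:
    """HSTA-style trust arbitration (illustrative sketch, not the FYP code).

    Each sensor carries a trust score in [0, 1] updated from inter-sensor
    agreement; fused estimates weight sensors by current trust, so a
    conflicting sensor degrades gracefully instead of being switched off.
    """

    def __init__(self, sensors, alpha=0.2):
        self.trust = {name: 1.0 for name in sensors}  # start fully trusted
        self.alpha = alpha  # smoothing factor for trust updates

    def update(self, readings):
        """readings: {sensor: scalar estimate of the same quantity}."""
        # Trust-weighted consensus of the current readings.
        total_w = sum(self.trust[s] for s in readings)
        consensus = sum(self.trust[s] * v for s, v in readings.items()) / total_w
        # Agreement decays with distance from consensus (1 m scale assumed);
        # trust is an exponential moving average of agreement.
        for s, v in readings.items():
            agreement = math.exp(-abs(v - consensus))
            self.trust[s] = (1 - self.alpha) * self.trust[s] + self.alpha * agreement
        return consensus

    def weights(self):
        """Normalised fusion weights derived from current trust."""
        total = sum(self.trust.values())
        return {s: t / total for s, t in self.trust.items()}

hsta = SensorTrust(["depth_cam", "lidar", "radar"])
for _ in range(20):  # radar consistently disagrees -> its trust decays
    hsta.update({"depth_cam": 2.0, "lidar": 2.05, "radar": 3.5})
print(hsta.weights())
```

Note the degradation is continuous: the disagreeing radar keeps a small weight rather than being hard-excluded, matching the no-hard-shutdown behaviour described above.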
End Effector Design
Tendon-Driven Finger System

Custom-designed tendon-driven hand using Feetech STS3215 servos — a cost-performance optimised alternative to academic reference designs.

✓ ARIA Hand (Custom)
Servos per hand: 7× STS3215
Actuation: Tendon-driven
Fingers: 5 + thumb opposition
Communication: Serial Bus (TTL)
Feedback: Position + Load + Temp
Structure: 3D printed PAHT-CF
Cost per hand: ~£220–280
Reference: ETH ORCA Hand
Actuation: Tendon + linkage
Fingers: 5 fingers
Communication: CAN bus
Feedback: High resolution
Structure: Machined + carbon
Cost per hand: ~£1,650+
Forearm Servo Packing Layout
All seven STS3215 servos for each hand are packed into the forearm rather than the palm, keeping the hand lightweight and maximising grip clearance. Tendon routing passes through the wrist channel in low-friction PTFE-lined conduits. The forearm housing is printed in PAHT-CF (carbon-fibre-reinforced nylon) for structural rigidity under servo torque loads.
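As a worked example of the tendon-routing geometry: the excursion pulled at a finger joint is the arc length r·θ, and the forearm servo must take up the same length on its pulley, so the two angles scale by the radius ratio. The radii below are illustrative placeholders, not measured values from the actual hand.

```python
import math

def servo_angle_for_flexion(flexion_deg, joint_radius_mm=6.0, pulley_radius_mm=11.0):
    """Map a desired finger flexion angle to a tendon servo angle.

    Excursion pulled at the joint = joint_radius * theta (arc length);
    the servo winds the same excursion onto its pulley, so the angles
    scale by the radius ratio. Both radii are assumed example values.
    """
    excursion_mm = joint_radius_mm * math.radians(flexion_deg)
    return math.degrees(excursion_mm / pulley_radius_mm)

# With these radii, a 90° flexion needs roughly a 49° servo rotation.
print(round(servo_angle_for_flexion(90.0), 1))
```

A larger pulley trades servo angle for torque, one reason packing full-size servos in the forearm (rather than micro servos in the palm) pays off.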
Project Timeline
Build Schedule

Solo development across academic year with submission, patent, and publication milestones.

SEP–OCT 2025
Architecture & Design
COMPLETE
  • 33-DOF kinematic specification finalised
  • HSTA architecture conceptualised and documented
  • Compute stack selected (Jetson Orin Nano Super + STM32 + ESP32)
  • Material selection: PAHT-CF for structural components
NOV–DEC 2025
CAD & Hardware Procurement
COMPLETE
  • Full arm and torso CAD in Fusion 360
  • Feetech STS3215 hand design — forearm packing layout
  • Jetson NVMe boot resolved via rootOnNVMe method
  • ROS2 Humble base stack installed on NVMe
JAN–FEB 2026
Printing & Assembly
COMPLETE
  • Structural parts printed on Bambu Lab A1 in PAHT-CF
  • Arm assembly and servo integration
  • Head module assembly with sensor housing
  • Power distribution and wiring harness
MAR–APR 2026
Software Integration & Testing
IN PROGRESS
  • HSTA implementation in ROS2 C++ nodes
  • Sensor fusion pipeline — depth cam + LiDAR + radar
  • Hand trajectory controller and grasp primitives
  • Speech and vision pipeline integration
MAY 2026
FYP Submission + Patent Filing
UPCOMING
  • Final dissertation submitted to University of Hertfordshire
  • UK Provisional Patent filed for HSTA architecture
  • Demonstration video and technical documentation
  • MindSpire Labs placement begins
OCT 2026
IEEE IROS 2026 Submission
TARGET
  • Full paper on HSTA architecture submitted to IEEE IROS 2026
  • Benchmarking data against baseline sensor fusion approaches
  • Open-source hardware and software release
Academic Impact
5 Novel Contributions

Original scientific and engineering contributions that distinguish this FYP from prior work.

#  | Contribution              | Description                                                                                                  | Target Venue
C1 | HSTA Framework            | Hierarchical Sensor Trust Architecture: dynamic weighting and graceful degradation across 9+ modalities      | IEEE IROS 2026
C2 | Cost-Optimised Hand       | Tendon-driven 6-DOF hand using STS3215 serial bus servos, 6× cheaper than the ORCA reference design          | FYP + Patent
C3 | PAHT-CF Structural Method | Validated use of carbon-fibre nylon for load-bearing humanoid links, with published print parameters         | FYP Dissertation
C4 | 33-DOF Budget Platform    | Full specification and BOM for a 33-DOF humanoid under £1,500, a reproducible academic benchmark             | Open Hardware
C5 | Proprioceptive Trust Layer | Integration of servo feedback as a first-class sensing modality within the HSTA hierarchy                   | IEEE IROS 2026
Cost Engineering
Bill of Materials

Approximate hardware costs. Most compute and sensors already owned — £1,000 incremental spend planned.

Jetson Orin Nano Super
Primary AI compute · NVMe booted
~£290
Feetech STS3215 × 14
7 per hand · serial bus servos
~£280
Arm Servos (SC/ST series)
12 DOF arm actuation × 2
~£200
Depth Camera
RGB-D stereo · head mounted
~£80
LiDAR
360° SLAM sensor
~£70
10" Touch Display
Chest screen with tilt DOF
~£55
PAHT-CF Filament
Structural printing material · ~3kg
~£90
PCBs + Electronics
Custom ESP32 board, power, wiring
~£120
Misc Sensors + Hardware
ToF, IMU, Radar, mic array, fasteners
~£120
Estimated Total Hardware Cost
Full BOM value shown; compute and sensors already owned bring the incremental spend to within the ~£1k budget.
~£1,305
vs £50k+ commercial equivalents
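The line items above can be tallied directly; a quick sketch confirming the stated total, with figures copied from the list above:

```python
# Approximate BOM line items (GBP), copied from the list above.
bom = {
    "Jetson Orin Nano Super": 290,
    "Feetech STS3215 x 14": 280,
    "Arm servos (SC/ST series)": 200,
    "Depth camera": 80,
    "LiDAR": 70,
    '10" touch display': 55,
    "PAHT-CF filament (~3 kg)": 90,
    "PCBs + electronics": 120,
    "Misc sensors + hardware": 120,
}
total = sum(bom.values())
print(f"Estimated total: ~£{total}")  # the ~£1,305 headline figure
```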