Background

Managing chronic conditions such as arthritis, eczema, or allergies requires consistent symptom tracking, yet current methods fall short in three ways:

  1. Communication barriers in medical settings
    Children often lack the vocabulary to describe symptoms and feel anxious in medical environments, causing them to withdraw or underreport symptoms.

  2. Subjective and unreliable data
    Doctors rely heavily on parents’ interpretations and recall, which can be subjective, inconsistent, and incomplete.

  3. Manual logging burden
    Parents act as “manual data loggers,” tracking symptoms through diaries. This process is time-consuming and often produces patchy, incomplete records.

As a result, children become passive in their own care, and medical decisions are often based on inaccurate data.

[Image: User journey before CareCub]

Design Goal

How can we empower children to communicate their health conditions while reducing the burden of logging on parents?

The project aims to:

  • give children an active role in their care
  • capture more accurate, real-time symptom data
  • create a system that fits naturally into everyday life

Concept

Inspired by the Teddy-Bear Hospital approach, where play is used to reduce anxiety in medical environments, CareCub introduces a teddy bear as a health-logging device for children aged 3-7.

How does it work?

When a symptom occurs, the child interacts with the bear using simple gestures to indicate what they feel and where it occurs. Embedded sensors inside the bear detect touch and pressure. With machine learning, the system can also capture subtle nuances in gesture intensity and touchpoints. The recorded inputs are saved digitally and can be discussed during doctors' appointments. Parents can also add notes to each input, providing context on possible causes.
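The project doesn't specify a storage format; as a rough sketch, each interaction could be saved as a small record like the one below (the struct and field names are hypothetical):

```cpp
#include <ctime>
#include <iostream>
#include <string>

// Hypothetical schema for one logged interaction with the bear.
struct SymptomEvent {
    std::time_t timestamp;    // when the gesture occurred
    std::string bodyZone;     // touchpoint on the bear, e.g. "left_arm"
    std::string gesture;      // e.g. "scratch", "squeeze", "poke"
    int intensity;            // relative gesture intensity, e.g. 1-5
    std::string parentNote;   // optional context added later by a parent
};

int main() {
    SymptomEvent e{std::time(nullptr), "left_arm", "scratch", 3,
                   "played outside in the grass"};
    std::cout << e.bodyZone << ": " << e.gesture
              << " (intensity " << e.intensity << ")\n";
    return 0;
}
```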

[Image: User journey with CareCub]

Personalised Gestures

Each child develops a personalised gesture language together with their parents, allowing them to express what they feel in whatever way is most intuitive to them. Any gesture can be assigned a meaning, whether scratching, squeezing, or poking.
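Conceptually, the setup step builds a small per-child lookup table from gesture to meaning. A minimal sketch, with made-up entries:

```cpp
#include <iostream>
#include <map>
#include <string>

int main() {
    // Hypothetical per-child gesture vocabulary, defined during app setup.
    std::map<std::string, std::string> gestureMeaning = {
        {"scratch", "itching"},
        {"squeeze", "pain"},
        {"poke",    "tingling"},
    };

    // Look up what a detected gesture means for this child.
    std::cout << "squeeze -> " << gestureMeaning.at("squeeze") << "\n";
    return 0;
}
```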

Gestures to Visual Data

[Image: Gestures translated into visual data]

Digital Interface

  • Hourly View
    to see how symptoms develop throughout the day and recognise patterns

  • Weekly View
    to see a day-by-day overview across the week

  • Monthly View
    to identify long-term trends

  • App Setup
    to record personalised gestures and assign icons

Symptoms can shift from day to day: heavy on the legs one day, on the arms the next. Each input is mapped onto a body heatmap and timeline, helping identify correlations such as recurring symptoms after specific activities; a sketch of this aggregation follows the example below.

For example:
Football every Tuesday --> leg pain
Long gaming sessions every Saturday --> more arm pain
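The write-up doesn't describe the app's internals; a minimal sketch of the underlying aggregation, assuming hypothetical event records, counts logged events per body zone and hour so the heatmap and timeline can be rendered from them:

```cpp
#include <iostream>
#include <map>
#include <string>
#include <utility>
#include <vector>

// Hypothetical event: body zone plus the hour of day it was logged.
struct Event { std::string bodyZone; int hour; };

int main() {
    std::vector<Event> events = {
        {"left_leg", 16}, {"left_leg", 17}, {"right_arm", 10},
    };

    // Count events per (zone, hour) cell; the app would colour the
    // body heatmap and timeline from counts like these.
    std::map<std::pair<std::string, int>, int> heatmap;
    for (const Event& e : events) {
        ++heatmap[{e.bodyZone, e.hour}];
    }

    for (const auto& [cell, count] : heatmap) {
        std::cout << cell.first << " @ " << cell.second
                  << "h -> " << count << " event(s)\n";
    }
    return 0;
}
```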

Prototype and Technical Details

We explored how to embed sensing capabilities into soft textiles using several sensors, materials, and configurations to recognise inputs such as scratching, hand clapping, squeezing, and poking. We visualised the output using ProtoPie.

We connected a capacitive sensor to a XIAO ESP32C3 and experimented with several conductive materials, including fabric, tape, wool, and thread. Initially, we tested the system by attaching it to the outside of a smaller teddy bear.
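The exact firmware isn't part of this write-up; a minimal Arduino-style sketch in this spirit, assuming the common CapacitiveSensor library and hypothetical pin assignments, distinguishes a short poke from a sustained squeeze by contact duration:

```cpp
#include <CapacitiveSensor.h>

// Hypothetical wiring: send pin 3, receive pin 4 connected to a
// patch of conductive fabric through a high-value resistor.
CapacitiveSensor pawSensor(3, 4);

const long TOUCH_THRESHOLD = 800;      // tune per material and placement
const unsigned long SQUEEZE_MS = 600;  // sustained contact => squeeze

unsigned long touchStart = 0;
bool touching = false;

void setup() {
  Serial.begin(115200);
}

void loop() {
  long reading = pawSensor.capacitiveSensor(30);  // 30 samples per read

  if (reading > TOUCH_THRESHOLD && !touching) {
    touching = true;             // contact started
    touchStart = millis();
  } else if (reading <= TOUCH_THRESHOLD && touching) {
    touching = false;            // contact ended: classify by duration
    unsigned long held = millis() - touchStart;
    Serial.println(held < SQUEEZE_MS ? "poke" : "squeeze");
  }
  delay(10);
}
```

Raw capacitive readings vary a great deal with the material and sensor placement, so thresholds like these need tuning for each build.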

CareCub

By translating gestures into structured data, CareCub reduces the burden on parents, empowers children to communicate through intuitive interaction, and provides doctors with clearer data for better treatment.

Outlook


Further development could expand the system’s capabilities:

  • iterating on the sensor configuration, adding touch and pressure points so gestures can be detected on every part of the body
  • integrating machine learning to differentiate gestures, detect nuances, and reduce errors (see the sketch after this list)
  • exploring additional use cases for tangible input in healthcare contexts
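As one possible direction for the machine-learning step, gestures could be classified from simple features such as peak sensor value and contact duration. The toy nearest-centroid sketch below (all numbers are made up) illustrates the idea; a real system would learn a per-child model from recorded examples:

```cpp
#include <cstdio>

// Toy illustration, not a trained model: classify a gesture from two
// hand-picked features by distance to the nearest labelled centroid.
struct Features { float peak; float durationMs; };

struct Prototype { const char* label; Features centroid; };

// Hypothetical per-child centroids, e.g. averaged from setup recordings.
Prototype prototypes[] = {
    {"poke",    {900.0f,  150.0f}},
    {"squeeze", {1200.0f, 900.0f}},
    {"scratch", {700.0f,  500.0f}},
};

const char* classify(Features f) {
    const char* best = "unknown";
    float bestDist = 1e30f;
    for (const Prototype& p : prototypes) {
        // Scale each feature to a comparable range before measuring distance.
        float dp = (f.peak - p.centroid.peak) / 1500.0f;
        float dd = (f.durationMs - p.centroid.durationMs) / 1000.0f;
        float dist = dp * dp + dd * dd;
        if (dist < bestDist) { bestDist = dist; best = p.label; }
    }
    return best;
}

int main() {
    Features sample{1150.0f, 850.0f};
    std::printf("classified as: %s\n", classify(sample));  // squeeze
    return 0;
}
```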