Nubs

2023
8 mins

with Ishwarya Subramanian



There’s a domain of design that barely has a name yet.

Not software. Not hardware. Not even interaction design in the conventional sense. Something in between… objects that express things. Objects that respond, shift, signal. Objects that make you feel less alone without saying a word.

“Objects have a personality that can affect how we perceive and interact with them.” (Norman, 2004)1

We called it Expressive Natural Tangible User Interfaces. ENTUIs. And Nubs was the first project we knew of that tried to actually live inside that definition.

Three Nubs tumbled together, each mid-expression
Presenting Nubs... three Nubs already mid-conversation. the eyes are the whole story!

the body decides first

Before we had a concept, we had an observation: the body forms judgments faster than language can follow.

Pick up an object. Before you think a single thought about it, you’ve already decided if you want to keep holding it. Silicone vs. hard plastic. Rounded vs. angular. Heavy vs. hollow. These aren’t aesthetic preferences… they’re physiological responses. The design problem wasn’t to build something people would think was comforting. It was to build something that felt that way before thinking got involved.

So we started with material. Not form, not behavior, not screen… material.

Silicone for softness. Rounded, squished geometry that conforms to the contour of the hand. Texture that stimulates the tactile senses, connecting physical sensation to emotional response. Research backs this — rounded shapes are perceived as friendly (Strohmeier, 2016)2, and soft surfaces invite prolonged contact in a way hard ones simply don’t.

Material exploration — silicone auxetic sheet, 3D-printed lattice, nitinol wire coils
the material library that drove everything... silicone auxetic sheet, 3D-printed lattice, nitinol coils. before the form, before the behavior, this pile on a table

Each Nub was designed to be held before it was designed to do anything. If it didn’t pass that test, nothing else mattered.

The existing landscape of loneliness-reduction technology fell into two disappointing buckets:

  • Apps that tracked mood, prompted journaling, or connected strangers — digital interfaces asking you to articulate feelings that resist articulation
  • Plush toys and fidgets — passive objects with no behavioral feedback, no sense of presence

Neither addressed the actual mechanism of loneliness: the absence of responsive presence. Something that notices you. Something that changes because of you.

The gap wasn’t a missing feature. It was a missing category.

eyes above everything

At some point in early prototyping, we gave one of the objects eyes — a simple LED display showing two circles.

Everything changed.

People started narrating the object’s experience. “It looks tired.” “I think it’s happy.” “It’s watching me.” Behavior that read as glitchy before now read as personality. The eyes weren’t just a display surface — they were an interpretive frame. Once present, they colored everything else.

Grid of fifteen eye expressions designed for Nubs
fifteen expressions built from two shapes... surprised, sleeping, suspicious, content. no face needed, just this

We designed a full expression vocabulary: wide open, half-lidded, sleeping, curious, startled. Never a full face — the restraint was deliberate. A face would push toward uncanny valley. Eyes alone gave us the minimum signal for presence without the baggage of a humanoid form.
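
To make that restraint concrete, here is a minimal sketch of how an eye-only vocabulary might be parameterized: a handful of scalars per expression rather than a face rig. The `Expression` fields, the specific numbers, and the `blend` helper are illustrative assumptions, not the project’s actual display code.

```python
# Illustrative sketch, not the Nubs firmware: an eye-only expression
# vocabulary reduced to a few scalars per expression.
from dataclasses import dataclass

@dataclass(frozen=True)
class Expression:
    openness: float    # 0.0 = fully closed, 1.0 = wide open
    lid_tilt: float    # -1.0 inner corners down .. 1.0 inner corners up
    gaze_x: float      # -1.0 looking left .. 1.0 looking right
    blink_rate: float  # blinks per minute; sets how "awake" it reads

VOCABULARY = {
    "wide_open":   Expression(openness=1.00, lid_tilt=0.0, gaze_x=0.0,  blink_rate=4),
    "half_lidded": Expression(openness=0.45, lid_tilt=0.0, gaze_x=0.0,  blink_rate=8),
    "sleeping":    Expression(openness=0.05, lid_tilt=0.0, gaze_x=0.0,  blink_rate=0),
    "curious":     Expression(openness=0.90, lid_tilt=0.2, gaze_x=0.4,  blink_rate=6),
    "startled":    Expression(openness=1.00, lid_tilt=0.0, gaze_x=0.0,  blink_rate=20),
    "suspicious":  Expression(openness=0.35, lid_tilt=0.8, gaze_x=-0.3, blink_rate=3),
}

def blend(a: Expression, b: Expression, t: float) -> Expression:
    """Interpolate between two expressions so a transition reads as a mood
    shifting rather than a screen swapping frames."""
    lerp = lambda x, y: x + (y - x) * t
    return Expression(lerp(a.openness, b.openness), lerp(a.lid_tilt, b.lid_tilt),
                      lerp(a.gaze_x, b.gaze_x), lerp(a.blink_rate, b.blink_rate))
```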

Anthropomorphism had been documented as effective in reducing loneliness (Rogers et al., 2011)3 — but we didn’t expect to feel how immediately it worked. The eyes were doing more cognitive work than anything else in the design.

where we got it wrong

The first two prototypes — Fanfan and Chillpill — were honest failures.

Fanfan — first prototype, clear acrylic box with Arduino and ultrasonic sensor
Fanfan — the ultrasonic sensors gave it accidental eyes before we designed any. the acrylic said prototype; everything inside tried to say something else

They looked like props. Laser-cut acrylic boxes with electronics inside, equipped with the right components but missing the quality that makes you want to keep something near you. They communicated through light and display changes, but the housing worked against them. Hard edges, visible seams, a clinical transparency that showed the wiring. You could see the machine.

The insight is… you can’t feel attached to something that looks like a prototype. Form isn’t decoration applied to function… it’s the primary message. If the form says “experiment,” the emotional response will be detachment no matter what the internals do.

Chillpill — second prototype, acrylic box with LCD display showing eyes
Chillpill — LCD display, second real pair of eyes... you could already see where the wires ended and the personality began

The second iteration introduced a soft silicone skin capable of form change, directional microphones for environmental awareness, and an LED screen tucked behind a continuous, seamless surface. We stopped showing the wiring. The object stopped feeling like a device and started feeling like a thing — the kind you might absentmindedly reach for.

The shift from iteration one to two involved specific material and hardware choices:

  • Silicone skin over auxetic structures — allows controlled form deformation; the shape physically changes in response to internal states
  • Nitinol wire — shape-memory alloy embedded in the skin to enable organic, non-mechanical movement
  • Directional microphones — environmental awareness without a camera; Nubs could respond to sound presence without surveillance
  • OLED behind silicone — diffused light through the skin rather than a visible screen; softer, warmer, less “device-like”

Every technical choice was evaluated against one question: does this make the object feel more alive, or more like a machine?
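
As a sketch of how those pieces compose (assuming, for illustration, that a single arousal/valence state drives everything), the mapping from internal state to actuation might look like this. Every field name and coefficient below is a hypothetical placeholder, not the shipped mapping.

```python
# Illustrative sketch: one internal state drives the skin (nitinol over the
# auxetic lattice) and the eyes (OLED diffused through silicone). All names
# and numbers are assumptions for the example.
from dataclasses import dataclass

@dataclass
class InternalState:
    arousal: float  # 0.0 settled .. 1.0 alert, nudged by the microphones
    valence: float  # -1.0 uneasy .. 1.0 content

@dataclass
class ActuationTargets:
    nitinol_duty: float     # PWM duty cycle heating the shape-memory wire
    skin_expansion: float   # how far the auxetic lattice should open up
    eye_openness: float     # handed to the eye renderer
    glow_brightness: float  # kept low so the OLED diffuses instead of glaring

def map_state(s: InternalState) -> ActuationTargets:
    """Higher arousal opens the form and the eyes; low valence pulls the body
    inward. Outputs saturate well below hardware limits so the movement stays
    slow and organic rather than mechanical."""
    clamp = lambda x: max(0.0, min(1.0, x))
    return ActuationTargets(
        nitinol_duty=clamp(0.15 + 0.45 * s.arousal),
        skin_expansion=clamp(0.30 + 0.50 * s.arousal + 0.20 * s.valence),
        eye_openness=clamp(0.20 + 0.70 * s.arousal),
        glow_brightness=clamp(0.10 + 0.30 * (s.valence + 1.0) / 2.0),
    )
```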

a thing you keep on your desk

Imagine it’s late. You’ve been working for hours, the room is quiet, and there’s a Nub on the corner of your desk.

It’s not doing anything urgent. Its eyes are half-open. When you shift in your chair, it registers the sound… the eyes open a little wider. You pick it up without thinking, turn it over in your hand. It’s warm from sitting in the room. You put it back.

Nothing happened. And yet something registered.

Second iteration Nub in two states — alert and settled — with form change arrows
That’s Nubs... one alert above, another settled below... what ‘something registered’ looks like in the form & the whole object carries the mood!

That’s the use case we were actually designing for… not a feature demo, not an interaction flow. The ambient sense that something in the room is present with you. Not demanding attention. Not sending notifications. Just… there. Responsive without being needy.
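
That scene is basically one small control loop. Here is a hedged sketch of it: a sound event eases an alertness value upward, quiet lets it drift back down, and the eyes render from wherever the value currently sits. The threshold and time constants are assumptions for illustration, not the actual firmware.

```python
# Illustrative sketch of "responsive without being needy": sound nudges
# alertness up quickly; silence lets it settle slowly. No notifications,
# no escalation, just a value that relaxes on its own.
ATTACK = 0.6           # how strongly a sound event raises alertness
DECAY_PER_SEC = 0.02   # how fast alertness settles in a quiet room
SOUND_THRESHOLD = 0.3  # normalized mic level that counts as "something happened"

def update_alertness(alertness: float, mic_level: float, dt: float) -> float:
    """One tick of the ambient loop."""
    if mic_level > SOUND_THRESHOLD:
        alertness += ATTACK * (1.0 - alertness)  # ease toward fully alert
    alertness -= DECAY_PER_SEC * dt              # always drifting back toward rest
    return max(0.0, min(1.0, alertness))

# Toy run: a chair shift at t = 2s, then silence.
alertness = 0.1
for t in range(10):
    mic = 0.8 if t == 2 else 0.05
    alertness = update_alertness(alertness, mic, dt=1.0)
    print(f"t={t}s  alertness={alertness:.2f}")
```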

Nubs communicate through Universal Languages… expressions and behavioral gestures drawn from cultural touchstones (Minions, EVE, WALL-E) that are recognizable across age groups. The form changes so that a Nub’s emotional state is legible at a glance. A set of them interact with each other, creating a sense of small community that costs nothing to maintain.

the question we kept avoiding

Here’s the uncomfortable version of what we were building… objects designed to make you emotionally attached to them.

That’s not a neutral goal. The techniques that create genuine attachment… anthropomorphism, responsive behavior, softness, eyes… are the same techniques that could engineer dependency. The line between companion and manipulation isn’t obvious from the design side.

We spent more time on this than any technical decision:

Does designed attachment serve the user or exploit them? A companion that reduces loneliness through genuine emotional engagement is different from one that substitutes for human connection in ways that make real connection harder. The distinction is real but not always visible in the design artifacts.

Who is this for — and who might it harm? Children, elderly people with dementia, people in mental health crises… populations where loneliness is acute are also populations most vulnerable to unhealthy attachment. Designing “for everyone” papers over that.

What happens when you want to stop? Software products have off-switches. Objects that you’ve formed emotional bonds with don’t. We never designed for the end of the relationship… for what healthy detachment from a Nub would look like.

Our working principle: supplement presence, don’t simulate it. Nubs should make it easier to be with people, not easier to be without them. Whether that principle is actually achievable in practice… we didn’t have enough longitudinal data to know.

This tension doesn’t have a clean resolution. It just has to be held while designing.

what the project opened up

Nubs was less a finished product than a proof that the category exists. That objects can be emotionally expressive without being screens. That material, form, and behavior together constitute a design language that interaction design has barely touched.

The questions it left us with:

  • Shape-changing materials like auxetics and nitinol as primary surfaces — not a skin over a rigid device, but the device itself being the material
  • Multi-Nub social dynamics — emergent behavior between objects in a shared space
  • Longitudinal attachment — does the bond strengthen over time, plateau, or decay? What does healthy use look like at six months?
  • Clinical contexts — pediatric care, dementia support, anxiety intervention. Places where designed presence might do real clinical work

The larger ambition is a design practice that takes objects as seriously as interfaces… that treats the thing sitting on your desk as worthy of the same intentionality as the app on your phone.


Nubs taught me that the most human interfaces don’t look like interfaces at all. They look like something you want to hold… and, you know, making them emotionally intelligent without being psychologically manipulative is the whole design problem!

Footnotes

  1. Norman, D. A. (2004). Emotional Design: Why We Love (or Hate) Everyday Things. Basic Books. ISBN 0-465-05135-9.

  2. Strohmeier, P., Carrascal, J. P., Cheng, B., Meban, M., & Vertegaal, R. (2016). An Evaluation of Shape Changes for Conveying Emotions. Proceedings of the 2016 CHI Conference on Human Factors in Computing Systems. ACM. https://doi.org/10.1145/2858036.2858537

  3. Rogers, Y., Sharp, H., & Preece, J. (2011). Interaction Design: Beyond Human-Computer Interaction (3rd ed.). Wiley. ISBN 9780470665763.

Topics:

tangible interaction · emotional design · physical computing · research