There’s a domain of design that barely has a name yet.
Not software. Not hardware. Not even interaction design in the conventional sense. Something in between… objects that express things. Objects that respond, shift, signal. Objects that make you feel less alone without saying a word.
“Objects have a personality that can affect how we perceive and interact with them.”¹ (Norman, 2004)
We called it Expressive Natural Tangible User Interfaces. ENTUIs. And Nubs was the first project we knew of that tried to actually live inside that definition.

the body decides first
Before we had a concept, we had an observation: the body forms judgments faster than language can follow.
Pick up an object. Before you think a single thought about it, you’ve already decided if you want to keep holding it. Silicone vs. hard plastic. Rounded vs. angular. Heavy vs. hollow. These aren’t aesthetic preferences… they’re physiological responses. The design problem wasn’t to build something people would think was comforting. It was to build something that felt that way before thinking got involved.
So we started with material. Not form, not behavior, not screen… material.
Silicone for softness. Rounded, squished geometry that conforms to the contour of the hand. Texture that stimulates the tactile senses, connecting physical sensation to emotional response. Research backs this — rounded shapes are perceived as friendly (Strohmeier et al., 2016)², and soft surfaces invite prolonged contact in a way hard ones simply don’t.

Each Nub was designed to be held before it was designed to do anything. If it didn’t pass that test, nothing else mattered.
eyes above everything
At some point in early prototyping, we gave one of the objects eyes — a simple LED display showing two circles.
Everything changed.
People started narrating the object’s experience. “It looks tired.” “I think it’s happy.” “It’s watching me.” Behavior that read as glitchy before now read as personality. The eyes weren’t just a display surface — they were an interpretive frame. Once present, they colored everything else.

We designed a full expression vocabulary: wide open, half-lidded, sleeping, curious, startled. Never a full face — the restraint was deliberate. A face would push toward uncanny valley. Eyes alone gave us the minimum signal for presence without the baggage of a humanoid form.
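To make the restraint concrete, a vocabulary like the one above can be encoded as a small state table: each expression is just an eye-openness value plus a pupil scale. Everything here (the member names, the numbers, the two-parameter model) is a hypothetical illustration, not the project’s actual firmware.

```python
from enum import Enum

class Expression(Enum):
    # (eye openness 0.0–1.0, pupil scale relative to normal)
    # Values are invented for illustration; a real device would tune these.
    WIDE_OPEN   = (1.0, 1.0)
    HALF_LIDDED = (0.5, 1.0)
    SLEEPING    = (0.0, 0.0)
    CURIOUS     = (0.8, 1.2)   # slightly dilated pupils read as interest
    STARTLED    = (1.0, 0.6)   # fully open, constricted pupils

    @property
    def openness(self):
        return self.value[0]

    @property
    def pupil_scale(self):
        return self.value[1]
```

The point of the table form is that the whole emotional range collapses into two continuous parameters, which is exactly why eyes alone are cheap to render on a tiny LED display.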
Anthropomorphism had been documented as effective in reducing loneliness (Rogers et al., 2011)³ — but we didn’t expect to feel how immediately it worked. The eyes were doing more cognitive work than anything else in the design.
where we got it wrong
The first two prototypes — Fanfan and Chillpill — were honest failures.

They looked like props. Laser-cut acrylic boxes with electronics inside, equipped with the right components but missing the quality that makes you want to keep something near you. They communicated through light and display changes, but the housing worked against them. Hard edges, visible seams, a clinical transparency that showed the wiring. You could see the machine.
The insight is… you can’t feel attached to something that looks like a prototype. Form isn’t decoration applied to function… it’s the primary message. If the form says “experiment,” the emotional response will be detachment no matter what the internals do.

The second iteration introduced a soft silicone skin capable of form change, directional microphones for environmental awareness, and an LED screen tucked behind a continuous, seamless surface. We stopped showing the wiring. The object stopped feeling like a device and started feeling like a thing — the kind you might absentmindedly reach for.
a thing you keep on your desk
Imagine it’s late. You’ve been working for hours, the room is quiet, and there’s a Nub on the corner of your desk.
It’s not doing anything urgent. Its eyes are half-open. When you shift in your chair, it registers the sound… the eyes open a little wider. You pick it up without thinking, turn it over in your hand. It’s warm from sitting in the room. You put it back.
Nothing happened. And yet something registered.

That’s the use case we were actually designing for… not a feature demo, not an interaction flow. The ambient sense that something in the room is present with you. Not demanding attention. Not sending notifications. Just… there. Responsive without being needy.
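The “responsive without being needy” quality described above can be approximated by a tiny control loop: sound nudges the eyes open, and silence lets them drift back to half-lidded. This is a speculative sketch under assumed constants; the microphone level would come from real hardware, which is deliberately left out.

```python
REST_OPENNESS = 0.5   # half-lidded at rest (assumed value)
DECAY = 0.98          # how quickly attention fades per tick (assumed value)

def step(openness, mic_level, threshold=0.2):
    """One tick of the ambient loop.

    mic_level is a normalized 0.0–1.0 loudness reading. Sound above the
    threshold widens the eyes in proportion to its level; otherwise the
    openness just decays back toward the resting half-lidded state.
    """
    if mic_level > threshold:
        openness = min(1.0, openness + 0.3 * mic_level)
    # Exponential drift toward rest: never snaps, never nags.
    return REST_OPENNESS + (openness - REST_OPENNESS) * DECAY
```

The design choice the sketch encodes is the asymmetry: attention rises quickly but fades slowly and quietly, so the object never demands a response to its own state change.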
Nubs communicate through Universal Languages… expressions and behavioral gestures drawn from cultural touchstones (the Minions, EVE, WALL-E) recognizable across age groups. The form changes so that a Nub’s emotional state is legible at a glance. A set of them interact with each other, creating a sense of small community that costs nothing to maintain.
the question we kept avoiding
Here’s the uncomfortable version of what we were building… objects designed to make you emotionally attached to them.
That’s not a neutral goal. The techniques that create genuine attachment… anthropomorphism, responsive behavior, softness, eyes… are the same techniques that could engineer dependency. The line between companion and manipulation isn’t obvious from the design side.
This tension doesn’t have a clean resolution. It just has to be held while designing.
what the project opened up
Nubs was less a finished product than a proof that the category exists. That objects can be emotionally expressive without being screens. That material, form, and behavior together constitute a design language that interaction design has barely touched.
The questions it left us with:
- Shape-changing materials like auxetics and nitinol as primary surfaces — not a skin over a rigid device, but the device itself being the material
- Multi-Nub social dynamics — emergent behavior between objects in a shared space
- Longitudinal attachment — does the bond strengthen over time, plateau, or decay? What does healthy use look like at six months?
- Clinical contexts — pediatric care, dementia support, anxiety intervention. Places where designed presence might do real clinical work
The larger ambition is a design practice that takes objects as seriously as interfaces… that treats the thing sitting on your desk as worthy of the same intentionality as the app on your phone.
Nubs taught me that the most human interfaces don’t look like interfaces at all. They look like something you want to hold… and making them emotionally intelligent without being psychologically manipulative is the whole design problem.

footnotes
1. Norman, D. A. (2004). Emotional Design: Why We Love (or Hate) Everyday Things. Basic Books. ISBN 0-465-05135-9.
2. Strohmeier, P., Carrascal, J. P., Cheng, B., Meban, M., & Vertegaal, R. (2016). An Evaluation of Shape Changes for Conveying Emotions. Proceedings of the 2016 CHI Conference on Human Factors in Computing Systems. ACM. https://doi.org/10.1145/2858036.2858537
3. Rogers, Y., Sharp, H., & Preece, J. (2011). Interaction Design: Beyond Human-Computer Interaction (3rd ed.). Wiley. ISBN 978-0-470-66576-3.