Are Microinteractions the Building Blocks of a New Language?

I had the pleasure of seeing Dan Saffer present his thinking on microinteractions at the invitation of IxDA Austin. His well-thought-out theories and conclusions certainly rang true as I considered the many interaction design projects I have been involved in, as well as the many software products I use. I appreciated the thought and structure he put into defining microinteractions: those small interactions that create signature moments and lead us through meaningful user experiences. He makes the distinction that these are not features, but that in combination they make up digital experiences. It all made a lot of sense to me, yet something kept nagging at me. It seemed like only the start of something more.

Screenshot from Duolingo App showing individual word cards that can reveal their meaning.

I’ve been working on learning French and have really enjoyed using the language-learning app Duolingo. The app is full of microinteractions, just as Dan describes. Some of them break sentences and phrases into their individual parts. As he talked, I started seeing all the microinteractions we design into our UIs as individual words, just like the word cards that make up sentences and phrases in Duolingo. I continued to extend the word-card analogy to other elements of language, and it seemed to hold.

Using the words and sentence structures of verbal communication as a model, I began to frame the structure of interaction as a language. Microinteractions are like words. Just as words have meaning (and sometimes multiple meanings), microinteractions carry basic communication signals. By pairing them with others or sequencing them in a particular order, you begin to communicate a concept or idea, just as a sentence or phrase does. I would equate sentences and phrases with UI patterns, where microinteractions are assembled predictably and consistently in a UX to give context and meaning to their use. This model continues to work well when comparing typical written language with UX design artifacts like wireframes: paragraphs = features, chapters = workflows, novels = applications. At this point, I started to get to the core of what was nagging at me.

A Common Language for Communication

Dan is touching on some very fundamental conventions, both good and bad, that we adopt as part of ever-evolving UX design practice. Consider the ubiquitous “remember me” microinteraction. When you think about it, it’s actually confusing and unclear. Because it has been adopted by so many designers as part of the standard login UI pattern, it is used again and again without enough questioning of its validity. We overlook the need to ensure users know the meaning of all the words because we believe they will understand a culturally accepted message: the meaning of the phrase. I believe Dan is saying that we need to make the entire message clear, down to the individual words, so users don’t get confused. It’s like the scene in the movie Bull Durham where Crash Davis has to correct Nuke: women don’t get “wooly”, but they can get “weary”. Likewise, jokes and puns make more sense when you see the root of their wit. By truly understanding the language and its shared semantics, we find new ways to communicate.

So if words hold so much meaning, and they are such fundamental building blocks of any language, shouldn’t we make sure we are all speaking the same language when we design interfaces? Shouldn’t we make sure everyone else agrees on the meaning of the microinteractions we are making up? As other people (users) learn the language, is there a source of information, like a dictionary, that everyone can rely on to be true? I don’t know if Dan has answers to these questions yet, but it seems like the start of a good conversation. There are certainly many sources for commonly used UI patterns, but what good are they if they are built from microinteractions that mean different things to different users? As an industry, I suggest we start to ratify at least the most common microinteractions and use them consistently across software. We owe it to the users of the software we design. If UX design is about creating the right communication between user and computer, shouldn’t we be sure they understand the words we are using?