Interactive controller exploring the perception of objects and human-to-computer interaction.

Work — Flyp


As you walk down the street checking your latest email or Twitter feed, take notice of your phone in relation to your surroundings. Unlike you, your phone will not react. The temperature, your speed, your altitude, whether or not it's a sunny day: none of it matters. For the most part, your phone operates as if nothing has changed.

In nature, all things are connected; every object participates in an infinite loop of feedback. If you move through space you affect whatever is in your path. Everything you have affected is altered and, in turn, affects everything around it. This process goes on and on: everything is in some way impacted by, and aware of, everything else. If this is true in nature, why do we not take cues from it when developing new technologies?

With the current state of integrated electronics and sensor technologies, it is possible to create things that are more aware, in ways similar to how we are aware. Orientation, height, depth, humidity, temperature, and color are just a few of the qualities something as powerful and compact as a smartphone can already sense.

What happens when an object can sense that you have flipped it upside down, and reacts? Can we begin to infuse personality and charm into devices? What does it mean for our relationship with our devices if they take on personalities? Can our devices better assist us by offering suggestions based on our current surroundings?
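The flip-sensing idea above can be sketched in a few lines. This is a minimal illustration, not the actual Flyp implementation: it assumes a stream of z-axis accelerometer readings (on a real device these would come from the platform's motion API) and classifies the device as face up or face down by the sign of the gravity component.

```python
# Hedged sketch: detecting an "upside down" flip from accelerometer data.
# The readings below are simulated; the threshold and function names are
# illustrative assumptions, not part of the Flyp project itself.

GRAVITY = 9.81  # m/s^2

def orientation(z_accel, threshold=0.7 * GRAVITY):
    """Classify orientation from the z-axis gravity component."""
    if z_accel > threshold:
        return "face_up"
    if z_accel < -threshold:
        return "face_down"
    return "indeterminate"  # mid-motion, no stable reading yet

def detect_flips(samples):
    """Return an event each time the device settles on a new face."""
    events = []
    last = None
    for z in samples:
        state = orientation(z)
        if state != "indeterminate" and state != last:
            if last is not None:
                events.append(state)
            last = state
    return events

# Simulated stream: device face up, turned face down, then back.
stream = [9.6, 9.7, 3.0, -2.0, -9.5, -9.7, -1.0, 4.0, 9.6]
print(detect_flips(stream))  # ['face_down', 'face_up']
```

A real device would debounce over time as well as amplitude, but the core idea is the same: a cheap sensor reading, interpreted as a gesture, becomes a hook for the object to "react."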

The serendipity of our lived experience is part of what makes each of us unique. Carrying that “random”, questioning quality forward into the objects we create may be one more way of finding better, more advanced solutions, and more happiness, in the lives we lead.

Below are some screenshots of the experience seen on screen while interacting with Flyp.



Below are some process highlights while developing Flyp.


