A project by Jeff Thompson:

https://vimeo.com/303457707?fl=pl&fe=vl

We interact with our computers constantly, touching them more than we touch any person in our lives and grooming them inside and out. For a month, I recorded all of my interactions with my phone and fed them into a machine learning system, which then output new, learned gestures. These “hallucinated” movements are awkward yet eerily accurate swipes, taps, and typing, based on what my computer has learned from my interactions with it. The work is presented as an interactive sculpture: a small robotic arm enacts these new gestures on the visitor’s palm as they sit at a low, altar-like table.
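
The statement does not name the machine learning system used. Purely as a hypothetical sketch of the idea, a small sequence model could be trained to predict the next touch event from the recorded log and then rolled forward to produce gestures it was never shown. The PyTorch LSTM below is an illustrative stand-in, not the project's actual code; every name, parameter, and data format in it is assumed.

```python
# Hypothetical sketch only; not the pipeline used in the project.
# Assumes each logged interaction is a sequence of touch events,
# represented as (normalized x, normalized y, time-delta) triples.
import torch
import torch.nn as nn


class GestureModel(nn.Module):
    """Small LSTM that predicts the next touch event from the ones before it."""

    def __init__(self, feats=3, hidden=64):
        super().__init__()
        self.lstm = nn.LSTM(feats, hidden, batch_first=True)
        self.head = nn.Linear(hidden, feats)

    def forward(self, x, state=None):
        out, state = self.lstm(x, state)
        return self.head(out), state


def train(model, sequences, epochs=50, lr=1e-3):
    """sequences: list of (T, 3) tensors of recorded touch events."""
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    loss_fn = nn.MSELoss()
    for _ in range(epochs):
        for seq in sequences:
            seq = seq.unsqueeze(0)            # (1, T, 3)
            pred, _ = model(seq[:, :-1])      # predict event t+1 from events up to t
            loss = loss_fn(pred, seq[:, 1:])
            opt.zero_grad()
            loss.backward()
            opt.step()


@torch.no_grad()
def hallucinate(model, seed, steps=40, noise=0.02):
    """Roll the model forward to generate a gesture it was never shown."""
    events, state = [seed], None
    x = seed.view(1, 1, -1)
    for _ in range(steps):
        pred, state = model(x, state)
        nxt = pred[0, -1] + noise * torch.randn(3)  # jitter keeps the output awkward
        events.append(nxt)
        x = nxt.view(1, 1, -1)
    return torch.stack(events)                      # (steps + 1, 3) synthetic gesture


# Example usage with made-up data standing in for the recorded log:
# model = GestureModel()
# train(model, [torch.rand(30, 3) for _ in range(100)])
# gesture = hallucinate(model, seed=torch.rand(3))
```

A generated (x, y, time-delta) sequence like this could then be converted to motor positions and replayed by the robotic arm on the visitor's palm.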