• Anna Oh

MixedReality_w3) Cans (cycling target) Multiple Image Targets

Updated: Apr 23




References)


Suwappu - Berg London


Garden Friends - Nicole He


Tape Drawing - Bill Buxton

Smarter Objects

Invoked computing



MARIO

Paper Cubes - Anna Fusté, Judith Amores


inFORM

When objects dream - ECAL


Resources)

Reading



Exercise) List potential input (sensory) and output (feedback) modes that you can leverage in the real world, and translate these into technical possibilities using current technology (you can use your mobile device, but feel free to bring any other technology or platform, such as Arduino, that you can implement). Choose one of each (input, output) and create a simple experience that shows off their properties, but also their affordances and constraints.



MY IDEA

Control the video with hand gestures


Last semester, I took the machine learning class and had a lot of fun with Google's Teachable Machine. Using this platform, I was able to train a model on my gestures with a webcam and map them to certain outputs. Back then, I was working on a piece of wearable tech: an apron that acts as a remote controller, letting you navigate video by tapping its surface. To expand my experimentation with different interfaces, I thought gesture could be another great interface for navigating video.


  • Input: User's hand gestures, captured with a webcam

ex) Rock -> Pause the video / Paper -> Play the video / Scissors -> Rewind the video


  • Output: Video content on the user's mobile device (which has a camera)

  • Tools: Teachable Machine (ML) / Xcode or Java for developing a native app to control the camera (a rough browser-based sketch follows below)
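
As a rough sketch of how the pieces could fit together (in the browser rather than a native Xcode/Java app), the snippet below assumes a Teachable Machine image model exported with the classes "Rock", "Paper", and "Scissors", the @teachablemachine/image library, and a hypothetical <video id="player"> element on the page; the model URL is a placeholder, not a real export.

```typescript
// Minimal sketch, assuming a Teachable Machine image model with the classes
// "Rock", "Paper", "Scissors" and an HTML <video id="player"> element on the page.
import * as tmImage from '@teachablemachine/image';

const MODEL_URL = 'https://teachablemachine.withgoogle.com/models/MY_MODEL/'; // placeholder URL

async function run(): Promise<void> {
  // Load the exported model and its metadata (class labels).
  const model = await tmImage.load(MODEL_URL + 'model.json', MODEL_URL + 'metadata.json');

  // Webcam feed that the model classifies every frame (width, height, mirrored).
  const webcam = new tmImage.Webcam(224, 224, true);
  await webcam.setup();
  await webcam.play();

  const video = document.getElementById('player') as HTMLVideoElement;

  async function loop(): Promise<void> {
    webcam.update(); // grab the latest webcam frame
    const predictions = await model.predict(webcam.canvas);

    // Take the most confident class and only act when the model is fairly sure.
    const top = predictions.reduce((a, b) => (a.probability > b.probability ? a : b));
    if (top.probability > 0.9) {
      if (top.className === 'Rock') {
        video.pause();                                            // Rock -> pause
      } else if (top.className === 'Paper') {
        await video.play();                                       // Paper -> play
      } else if (top.className === 'Scissors') {
        video.currentTime = Math.max(0, video.currentTime - 5);   // Scissors -> rewind 5s
      }
    }
    window.requestAnimationFrame(loop);
  }
  window.requestAnimationFrame(loop);
}

run();
```

This is only one possible shape for the idea; a native app could do the same thing by running the exported model on the device's camera frames instead of a browser webcam.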


Voice commands are the most common alternative to touch interfaces. However, through my research, I found that many people feel uncomfortable having to speak out loud. Also, while streaming video, the user's voice can be hard to detect over the video's sound. For these reasons, I imagine it would be convenient to simply make a 'rock' hand to pause the video when my device is a bit far from me (or when I can't touch my device with messy hands while cooking). This idea may sound a bit removed from an augmented interface, but I think it could be developed as a 'Radical Atoms' kind of interface. It could also be developed from the perspective of AR accessibility for visually impaired users.




