Interact With Your Plant Using Handtrack.js API
- Anna Korczak
- Aug 13, 2021
- 2 min read
Updated: Mar 16, 2023
Group project
My role: I participated in the whole design process, but the tasks I performed on my own were building the digital prototype with JavaScript, the Handtrack.js API, HTML and CSS, as well as 3D modelling, 3D printing and assembling the physical prototype.
Tools: Miro, Visual Studio Code, GitHub, Solidworks 3D CAD, Prusa Slicer, Prusa i3 3D-printer, Arduino, Arduino IDE, Johnny-Five, Canva
This was a group project, done as a side project to our main project, Shake My Leaves. We wanted to use a sensor to enable interaction with a plant, and we wondered what kind of API would contribute to our project and make the interaction more tangible and active. We decided to work with hand gestures and therefore picked the most relevant API: Handtrack.js.

The Handtrack.js library lets you track a user's hand (as a bounding box) in an image, in any orientation, with three lines of code. It lets developers quickly prototype hand and gesture interactions powered by a pre-trained hand-detection model.
The goal of the library is to abstract away the steps associated with loading the model files, to provide helpful functions, and to let users detect hands in an image without any machine-learning experience. You do not need to train a model (though you can if you want), and you do not need to export any frozen graphs or saved models. You can get started by including handtrack.js in your web application and calling the library methods.
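The flow above can be sketched roughly as follows. The `handTrack.load` and `model.detect` calls are the library's documented entry points; the `confidentHands` helper, the parameter values, and the label names are our own illustrative assumptions, not part of the API.

```javascript
// Minimal Handtrack.js flow (browser-side sketch). Assumes the library
// is included via a <script> tag or installed from npm as "handtrackjs".
const modelParams = {
  flipHorizontal: true, // mirror the webcam feed (assumed setting)
  maxNumBoxes: 2,       // detect at most two hands (assumed)
  scoreThreshold: 0.6,  // confidence cutoff (assumed)
};

// The core "three lines" (browser only, so shown here as comments):
// const model = await handTrack.load(modelParams);
// const video = document.getElementById("webcam");
// const predictions = await model.detect(video);

// A small helper of our own (not part of the library) that keeps only
// confident non-face predictions from the raw detection output.
function confidentHands(predictions, threshold) {
  return predictions.filter(
    (p) => p.label !== "face" && p.score >= threshold
  );
}
```

Each prediction carries a bounding box, a label, and a confidence score, which is enough to drive simple gesture interactions without any custom training.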
Digital prototype in a browser
Digital/physical prototype
Prototyping sprint and physical prototype
We built a 3D-printed mechanism that uses a servo motor with an arm to move a pot. We then connected it to an Arduino and bridged the Handtrack.js detections to the servo using Johnny-Five.
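A sketch of how such a bridge could look: a pure function maps a detected hand's bounding box to a servo angle, and Johnny-Five (running in Node, talking to the Arduino over serial) drives the servo. The pin number, frame width, and angle range are assumptions for illustration, not the project's actual values.

```javascript
// Map a Handtrack.js bounding box ([x, y, width, height]) to a servo
// angle by taking the horizontal centre of the box and scaling it from
// the frame width to the 0–180° servo range. Clamps out-of-frame boxes.
function bboxToAngle(bbox, frameWidth, minAngle = 0, maxAngle = 180) {
  const [x, , width] = bbox;
  const center = x + width / 2;
  const t = Math.min(Math.max(center / frameWidth, 0), 1);
  return Math.round(minAngle + t * (maxAngle - minAngle));
}

// Johnny-Five side (requires hardware, so shown as comments):
// const { Board, Servo } = require("johnny-five");
// const board = new Board();
// board.on("ready", () => {
//   const servo = new Servo(9); // pin 9 is an assumption
//   // on each detection:
//   // servo.to(bboxToAngle(prediction.bbox, video.width));
// });
```

Keeping the mapping in a pure function like this makes it easy to tune (or unit-test) the hand-to-pot motion separately from the hardware.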
Authors:
Anna Korczak
Carlo Sicanica
Sanna Lindholm