
Exploded view

The structure of the camera and the watch follows design-for-manufacturability principles: wherever possible they use standard parts (such as screws and bolts), cost-effective materials (such as ABS/PC and TPE), and existing functional modules (such as an infrared camera module and a PTZ rotation-control PCB).

Future development

The current presentation is only the first stage of product development.

Based on user testing and feedback, which highlighted a lack of inclusiveness at the current production stage, the designer developed a further simple concept. The recorder uses Opus audio coding to encode the music played by the user during ensemble practice in real time, forming a live spectrogram. Changes in the spectrogram's lines are then converted into tactile signals and transmitted to the watch.

In the second phase of product development, the designer could develop the recorder further, pairing it with the existing products to form a series of assistive products for ensemble practice; alternatively, after market research, this functionality could be added to existing products.

Prototype Demo 1

This demo detects the location of the user's face and judges whether it is in the middle of the screen.
The main steps are:
1. Import the target detection module and Serial module
2. Find the face and identify the centre of the face
3. Establish the coordinates and origin of the entire screen
4. Get the coordinates of the centre of the face relative to the centre of the screen
5. When the face-centre coordinates coincide with the screen-centre coordinates, a signal is sent to the Arduino and then to the vibration motor
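The centring check in steps 3 to 5 can be sketched as a small helper. This is an illustrative sketch, not the project's actual code: it assumes the face detector (for example OpenCV's Haar cascade) returns a bounding box in pixel coordinates, and the 20-pixel tolerance is an assumed value.

```python
def offset_from_center(face_box, frame_w, frame_h):
    """Return the (dx, dy) offset of the face centre from the screen centre.

    face_box is (x, y, w, h) in pixels, as a typical face detector such
    as OpenCV's CascadeClassifier would return (an assumption here).
    """
    x, y, w, h = face_box
    face_cx = x + w / 2
    face_cy = y + h / 2
    return face_cx - frame_w / 2, face_cy - frame_h / 2


def is_centered(face_box, frame_w, frame_h, tol=20):
    """True when the face centre lies within `tol` pixels of the screen centre.

    In the full demo this condition would trigger a byte sent over serial
    to the Arduino, which drives the vibration motor.
    """
    dx, dy = offset_from_center(face_box, frame_w, frame_h)
    return abs(dx) <= tol and abs(dy) <= tol


# A 100x100 face box whose centre sits at (320, 240) in a 640x480 frame:
print(is_centered((270, 190, 100, 100), 640, 480))  # → True
```

The detection loop would simply call `is_centered` on each frame and write to the serial port only on a False-to-True transition, so the motor does not buzz continuously while the face stays centred.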

Prototype Demo 2

This demo detects whether the user's fingers are on the instrument and whether they have started to play.
The main steps are:
1. Import the hand tracking module and Serial module
2. Find the coordinates of the hand
3. Find the tip of each finger
4. When three or more fingertips point up, measure how long the fingers remain still
5. If the duration exceeds 5 seconds, send a signal to the Arduino to make the vibration motor vibrate once
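Steps 3 to 5 can be sketched as follows. The sketch assumes fingertip and middle-joint (PIP) landmarks in image coordinates, where y grows downward as in MediaPipe-style hand trackers; the helper names and the pixel tolerance are illustrative assumptions, not the project's code.

```python
def count_fingers_up(tips, pips):
    """Count fingertips that point up.

    tips/pips are lists of (x, y) landmark positions for each fingertip
    and its middle (PIP) joint. In image coordinates y grows downward,
    so a finger points up when its tip is above (smaller y than) its PIP.
    """
    return sum(1 for (tx, ty), (px, py) in zip(tips, pips) if ty < py)


class StillnessTimer:
    """Track how long the fingertips stay within `tol` pixels of their last pose."""

    def __init__(self, tol=5.0):
        self.tol = tol
        self.last = None
        self.since = None

    def update(self, tips, now):
        """Feed the current fingertip positions; return seconds held still."""
        if self.last is not None and all(
            abs(tx - lx) <= self.tol and abs(ty - ly) <= self.tol
            for (tx, ty), (lx, ly) in zip(tips, self.last)
        ):
            return now - self.since
        # The hand moved (or this is the first frame): restart the clock.
        self.last = list(tips)
        self.since = now
        return 0.0


tips = [(10, 20), (30, 15), (50, 18), (70, 40)]
pips = [(10, 40), (30, 35), (50, 38), (70, 30)]
print(count_fingers_up(tips, pips))  # three fingertips sit above their PIP joints

timer = StillnessTimer()
timer.update(tips, now=0.0)
held = timer.update(tips, now=6.0)
print(held > 5)  # held still for over 5 s: time to signal the Arduino once
```

In the full demo the loop would run `count_fingers_up` on each tracked frame, feed the timer only while three or more fingers are up, and write one byte over serial when the held duration first crosses 5 seconds.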

Prototype Demo 3

This demo uses the PyTorch framework to build an image-classification project for recognising whether a person is standing or sitting. The project can determine the posture (standing or sitting) of the person in front of the live camera.
The main steps are:
1. Run the image-collection/training module and the Serial module
2. Open the interactive interface, which can train custom classification targets
3. Collect 30 pictures each of the person standing and sitting
4. Train on the initial data. After the training button is clicked there is a delay of about 30 seconds while the trainer loads the data; afterwards, a progress bar indicates the training status of each epoch
5. Test on real-time data
6. To improve accuracy, add image data with different backgrounds, distances, and angles, and train for more epochs
7. Save the model
8. Add data-transmission code: when the test result is sitting, a signal is sent to the Arduino, which triggers the vibration motor
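The decision logic of step 8 can be sketched without the camera or the trained network. The sketch assumes the PyTorch model emits one raw logit per class, in the order (standing, sitting); the class order, function names, and confidence threshold are illustrative assumptions, not the project's code.

```python
import math

CLASSES = ("standing", "sitting")  # assumed output order of the classifier


def softmax(logits):
    """Convert raw logits to probabilities (numerically stable form)."""
    m = max(logits)
    exps = [math.exp(v - m) for v in logits]
    total = sum(exps)
    return [e / total for e in exps]


def posture_signal(logits, threshold=0.5):
    """Return True when the model is confident the person is sitting.

    In the full demo a True result would be written over serial to the
    Arduino, which fires the vibration motor once.
    """
    probs = softmax(logits)
    label = CLASSES[probs.index(max(probs))]
    return label == "sitting" and max(probs) >= threshold


print(posture_signal([0.2, 2.3]))  # "sitting" wins with high probability → True
print(posture_signal([1.9, 0.1]))  # "standing" wins, so no signal → False
```

The threshold keeps borderline predictions from buzzing the motor; raising it trades responsiveness for fewer false vibrations, which matters when the wearer cannot see the screen to double-check.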

Ziqun Hua

Inclusive product design for instrumental ensemble practice for visually impaired people under 25.

Major project

Buzz