Buzz
Buzz uses artificial-intelligence image recognition to identify and analyse visual cues during ensemble practice, and converts the extracted information into corresponding vibration signals. This ultimately helps shorten the ensemble-practice process for visually impaired players, and may also give sighted people some inspiration for learning an instrument.
The smart camera is currently limited by the designer's own programming skills.
It is mainly used to translate three scenes (YouTube video):
1. Detecting when the collaborator stands up or sits down
2. Detecting the partner's signal that they are ready to start or stop playing, and giving a reminder
3. Judging the players' fatigue during the ensemble and giving timely encouragement
The watch mainly receives the information translated by the camera and transmits it to the user through vibration signals; different scenes correspond to different vibration frequencies (YouTube video).
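The scene-to-vibration mapping can be sketched as a simple lookup table. The scene names and frequency values below are illustrative assumptions for demonstration, not the product's actual settings:

```python
# Illustrative mapping from detected scene to watch vibration frequency (Hz).
# Scene names and frequency values are assumptions, not the shipped settings.
SCENE_VIBRATIONS = {
    "stand_or_sit": 2,    # collaborator stands up or sits down
    "start_or_stop": 5,   # partner signals ready to start or stop playing
    "fatigue": 8,         # player shows fatigue; send encouragement
}

def vibration_for(scene: str) -> int:
    """Return the vibration frequency for a detected scene, or 0 if unknown."""
    return SCENE_VIBRATIONS.get(scene, 0)
```

In the full system, the returned frequency would be sent to the watch's vibration motor over the wireless link.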
The structures of the camera and the watch follow design-for-manufacturability principles: they use standard parts (such as screws and bolts), cost-effective materials (such as ABS/PC and TPE) and existing functional modules (such as an infrared camera module and a PTZ rotation-control PCB) as much as possible.
The current presentation covers only the first stage of product development.
Based on user testing and feedback — such as the lack of inclusiveness at the current production stage — the designer sketched a simple follow-up concept. A recorder uses Opus audio-coding technology to encode the music the user plays during ensemble practice in real time, forming a live spectrogram. Changes in the spectrogram's lines are then converted into tactile signals and transmitted to the watch.
In the second phase of product development, the designer could develop the recorder further, forming a series of ensemble-practice assistive products together with the existing ones; or conduct aftermarket research and consider adding this functionality to existing products.
Prototype Demo 1
This demo detects the location of the user's face and judges whether it is in the middle of the screen.
The main steps are:
1. Import the object-detection module and the Serial module
2. Find the face and identify its centre
3. Establish the coordinate system and origin of the screen
4. Get the coordinates of the face centre relative to the screen centre
5. When the face centre coincides with the screen centre, transmit a signal to the Arduino, which drives the vibration motor
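The centring check in the last two steps can be sketched as pure coordinate logic. The face-detection and Arduino serial calls are only indicated in comments, and the pixel tolerance is an assumption (an exact coincidence would almost never occur frame-to-frame):

```python
def offset_from_centre(face_centre, screen_size):
    """Coordinates of the face centre relative to the screen centre.

    face_centre: (x, y) in pixels, as reported by the face detector.
    screen_size: (width, height) of the camera frame.
    """
    cx, cy = screen_size[0] // 2, screen_size[1] // 2
    return face_centre[0] - cx, face_centre[1] - cy

def is_centred(face_centre, screen_size, tolerance=20):
    """True when the face centre coincides with the screen centre,
    within `tolerance` pixels -- the moment to signal the Arduino."""
    dx, dy = offset_from_centre(face_centre, screen_size)
    return abs(dx) <= tolerance and abs(dy) <= tolerance

# In the demo itself, each frame's face centre would come from the
# detection module, and a True result would be written to the serial
# port that drives the vibration motor, e.g.:
#   if is_centred(centre, (640, 480)):
#       serial_port.write(b"1")
```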
Prototype Demo 2
This demo detects whether the fingers are on the instrument and whether they have started to play.
The main steps are:
1. Import the hand-tracking module and the Serial module
2. Find the coordinates of the hand
3. Find the tip of each finger
4. When three or more fingertips point up, measure how long the fingers remain still
5. If the duration exceeds 5 seconds, send a signal to the Arduino to make the vibration motor vibrate once
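Steps 4–5 can be sketched as follows. The hand-tracking and serial calls are indicated only in comments; the fingertip/knuckle representation and the ready-pose criterion are assumptions about how the detector reports landmarks:

```python
def fingers_up(fingertips, knuckles):
    """Count fingertips pointing up: a tip whose y-coordinate is above
    (i.e. smaller than, in image coordinates) its knuckle's y-coordinate.
    Both arguments are lists of (x, y) pixel coordinates."""
    return sum(1 for tip, base in zip(fingertips, knuckles) if tip[1] < base[1])

class StillnessTimer:
    """Tracks how long the hand has held the ready pose (>= 3 fingertips up)
    and fires once when the required duration is reached."""

    def __init__(self, required_seconds=5.0):
        self.required = required_seconds
        self.start = None
        self.fired = False

    def update(self, up_count, now):
        """Feed one frame (fingertip count, timestamp in seconds).
        Returns True on the single frame where the pose has been held
        long enough -- the moment to signal the Arduino, e.g.:
            serial_port.write(b"1")
        """
        if up_count >= 3:
            if self.start is None:
                self.start = now
            if not self.fired and now - self.start >= self.required:
                self.fired = True
                return True
        else:
            self.start = None
            self.fired = False
        return False
```

Resetting the timer whenever the pose is lost means the 5-second window must be continuous, which matches step 4's "remain still" requirement.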
Prototype Demo 3
This demo uses the PyTorch framework to build an image-classification project for recognising whether a person is standing or sitting down. The project determines the posture (standing or sitting down) of the person in front of the live camera.
The main steps are:
1. Run the image-collection training module and the Serial module
2. Open the interactive interface for training a personal custom target
3. Collect 30 pictures each of the person standing and sitting down
4. Train on the initial data. After clicking the training button, there is a delay of about 30 seconds while the trainer loads the data; afterwards, a progress bar indicates the training status of each epoch
5. Test on real-time data
6. Add image data taken with different backgrounds, distances and angles, and train for more epochs
7. Save the model
8. Add the data-transmission code: when the test result is "sitting down", a signal is sent to the Arduino, which drives the vibration motor
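The data-transmission logic in the last step can be sketched without the PyTorch parts. The class labels and the rising-edge trigger (vibrate once when the posture changes to sitting, not on every frame while seated) are assumptions about the demo's behaviour:

```python
class PostureSignaller:
    """Turns a stream of per-frame classification results into single
    vibration triggers: fire once when the prediction changes to
    'sitting', rather than on every frame while the person stays seated."""

    def __init__(self):
        self.was_sitting = False

    def update(self, label):
        """label: the classifier's output for the current frame,
        assumed to be 'standing' or 'sitting'. Returns True on the frame
        where the Arduino should be signalled to drive the vibration
        motor, e.g.:
            serial_port.write(b"1")
        """
        sitting = (label == "sitting")
        trigger = sitting and not self.was_sitting
        self.was_sitting = sitting
        return trigger
```

Edge triggering matters here because the classifier runs continuously: without it, the watch would vibrate on every frame for as long as the person remains seated.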
Inclusive product design for instrumental ensemble practice for visually impaired people under 25.