After publishing this GitHub repo, I noticed my Azure credit dropped significantly, which caused the Speech API to start denying requests. This was likely caused by people abusing my Azure keys and endpoints, which were hard-coded (a bad practice, done only because of time constraints). I'm working on a fix so that speech-to-text works again.
Development architecture. The server architecture is under development.
- Official Docs (coming soon)
Laboratory equipment is expensive, and learning on a 2D blackboard is unengaging. We aim to make laboratories accessible to students around the world and help teachers make lessons more engaging. Hands On is an AR simulation equipped with AI for both teachers and students to interact, experiment, and make mistakes freely.
Demo Notes Drawing: https://youtu.be/-qrMq-FZTkg?si=WSMF8oVQxwj4frcs
Note: Suggested UI (under development).
a. Speak to Hands On:
- “I need a weight[object] made of iron[material] to conduct a buoyancy force experiment”.
- “Change the simulation gravity[field] to the moon’s gravity[value]”.
- “Set the height[field] of liquid[target] to 57cm[value]”.
- “Spawn something to measure length[description]”.
b. Best Practices:
   i. Keep your request concise and state your intention clearly.
   ii. You can ask Hands On to assist you with details. Example: “Show me the force analysis of the objects.”
   iii. Besides speech, you can also interact with the simulation through real-world physical movement, for example grabbing an object and moving it around.
   iv. You can also take notes anywhere in the environment, so the world is your infinite, limitless canvas.
A sketch of how a spoken request could be wired into the simulation is shown below.
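The following is a minimal sketch, not the repository's actual code, of how a spoken request could flow from Azure Speech-to-Text into an Azure CLU intent prediction. The keys, region, endpoint, CLU project name (`HandsOnCLU`), deployment name (`production`), and intent/entity labels are placeholders, and it assumes the `Microsoft.CognitiveServices.Speech` and `Azure.AI.Language.Conversations` packages are available.

```csharp
using System;
using System.Text.Json;
using System.Threading.Tasks;
using Azure;
using Azure.Core;
using Azure.AI.Language.Conversations;
using Microsoft.CognitiveServices.Speech;

public static class SpokenRequestPipeline
{
    // Placeholder credentials -- load these from secure config, never hard-code them.
    private const string SpeechKey = "<speech-key>";
    private const string SpeechRegion = "<region>";
    private const string LanguageEndpoint = "https://<resource>.cognitiveservices.azure.com";
    private const string LanguageKey = "<language-key>";

    public static async Task HandleSpokenRequestAsync()
    {
        // 1. Speech-to-text with the Azure Speech SDK (default microphone).
        var speechConfig = SpeechConfig.FromSubscription(SpeechKey, SpeechRegion);
        using var recognizer = new SpeechRecognizer(speechConfig);
        SpeechRecognitionResult speech = await recognizer.RecognizeOnceAsync();
        if (speech.Reason != ResultReason.RecognizedSpeech) return;

        // 2. Intent + entity prediction with Azure CLU.
        //    "HandsOnCLU" / "production" are hypothetical project/deployment names.
        var client = new ConversationAnalysisClient(
            new Uri(LanguageEndpoint), new AzureKeyCredential(LanguageKey));

        var request = new
        {
            kind = "Conversation",
            analysisInput = new
            {
                conversationItem = new { id = "1", participantId = "user", text = speech.Text }
            },
            parameters = new { projectName = "HandsOnCLU", deploymentName = "production" }
        };

        Response response = await client.AnalyzeConversationAsync(RequestContent.Create(request));
        using JsonDocument doc = JsonDocument.Parse(response.ContentStream);
        JsonElement prediction = doc.RootElement.GetProperty("result").GetProperty("prediction");

        // 3. Dispatch on the predicted intent and entities, e.g. an utterance like
        //    "Set the height of liquid to 57cm" would yield field/target/value entities.
        string topIntent = prediction.GetProperty("topIntent").GetString();
        foreach (JsonElement entity in prediction.GetProperty("entities").EnumerateArray())
        {
            Console.WriteLine(
                $"{topIntent}: {entity.GetProperty("category")} = {entity.GetProperty("text")}");
        }
    }
}
```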
- Real-time AI feedback
- Multi-user collaboration in the same AR environment
- Tool recognition using computer vision
- iOS ARKit development
- Classroom / lab safety practice
I remember the first time I went to the physics lab in high school to experiment with electronic circuits. The equipment looked nothing like anything familiar, even though I had been studying it for a year. It turned out I had been working with 2D symbol drawings the whole time, without ever touching or experiencing the real thing myself.
Laboratory equipment is expensive, and learning on a 2D blackboard is unengaging. Making physics experiments accessible anywhere, Hands On is an AR simulation for both teachers and students to interact, experiment, and make mistakes freely. It makes the classroom experience more engaging for everyone. Better yet, Hands On is equipped with AI to instruct and explain to students in real time, like "You should turn off the power source before plugging in", or to perform actions like "Give me the set of equipment for the Archimedes experiment", which enhances creativity, safety, and familiarity.
- Input utterances, label data, train, test, improve, and deploy the AI model using Azure CLU
- Connect Azure Cognitive Search and a SQL database
- Code 3D physics simulations from scratch using Unity C#
- Develop for Android with ARCore
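As a rough feel for the "physics from scratch" part, here is an illustrative Unity C# sketch that applies Archimedes' buoyant force F = ρ_fluid · g · V_submerged to a Rigidbody, in the spirit of the buoyancy experiment mentioned above. The component, its field names, and the crude submerged-volume estimate are assumptions for this example, not code from this repo.

```csharp
using UnityEngine;

// Illustrative buoyancy: applies F = rho_fluid * g * V_submerged (Archimedes' principle)
// to a Rigidbody while part of the object sits below the liquid surface.
[RequireComponent(typeof(Rigidbody))]
public class SimpleBuoyancy : MonoBehaviour
{
    public float fluidDensity = 1000f;   // kg/m^3, water by default
    public float liquidSurfaceY = 0f;    // world-space height of the liquid surface
    public float objectVolume = 0.001f;  // m^3, e.g. a small iron weight

    private Rigidbody body;

    void Awake()
    {
        body = GetComponent<Rigidbody>();
    }

    void FixedUpdate()
    {
        // Rough submerged fraction from how far the object's center sits below the surface
        // (treats the object as a box of height localScale.y).
        float halfHeight = transform.localScale.y * 0.5f;
        float depth = liquidSurfaceY - transform.position.y;
        float submergedFraction = Mathf.Clamp01((depth + halfHeight) / (2f * halfHeight));

        if (submergedFraction > 0f)
        {
            float g = Mathf.Abs(Physics.gravity.y);
            float buoyantForce = fluidDensity * g * objectVolume * submergedFraction;
            body.AddForce(Vector3.up * buoyantForce, ForceMode.Force);
        }
    }
}
```

Because the force reads `Physics.gravity`, a request like "change the simulation gravity to the Moon's gravity" (about 1.62 m/s²) only needs to change that one global value for the whole simulation to respond.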
This is still a very early stage. The model is still improving and doesn't generalize well yet. Currently, the project is a prototype that handles only some simple requests.
- AI Real-time feedback (Anticipated example use cases):
  - “Why is my spectrometer not giving accurate readings?”
    - Real-Time Feedback: Hands On can suggest checking the alignment, the quality of the light source, or the calibration of the spectrometer.
  - “Is it safe to heat this substance with the Bunsen burner?”
    - Real-Time Feedback: Hands On can provide safety guidelines, including the substance's properties and suitable temperature ranges for heating.
- AR multi-user collaboration in the same environment / tool recognition using computer vision
- Enhance collaboration, allowing students to work together and share experiences.
- Allow teachers to demonstrate how the experiment should be conducted.
- Import tools into the simulation.