Steven Antya Orvala Waskito

LLM Engineer, National University of Singapore

Projects

MultiLingual Robot: A robot that understands human conversation in multiple languages

Using multilingual voice transcription and a large language model, I could command the robot to move to various locations implicitly. For example, "We have a lot of visitors here today, do you want to come and join us?" → the robot moves to the welcome lounge. The robot could also follow a full conversation in any language and infer the intended location from it, all with the same AI model! I tested the robot in three languages (English, Indonesian, and Chinese), but it can support 99 languages. A sketch of the pipeline is below.

[Video: robot understanding Chinese] [Video: robot understanding Indonesian]
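A minimal sketch of how such a pipeline could look, assuming Whisper for the 99-language transcription and an OpenAI chat model for interpreting the conversation; the models actually used, the location names, and the `move_to()` robot call are all placeholders, not the project's real stack.

```python
# Sketch: speech -> text (any language) -> LLM picks a destination.
# Assumptions: Whisper for transcription, OpenAI chat API for interpretation,
# move_to() as a hypothetical robot-navigation call.
import whisper
from openai import OpenAI

KNOWN_LOCATIONS = ["welcome lounge", "meeting room", "lab", "pantry"]  # assumed

def interpret_location(utterance: str) -> str:
    """Ask the LLM to map a (possibly non-English) utterance to a location."""
    client = OpenAI()
    prompt = (
        "A visitor said the following to a service robot. "
        f"Choose the single best destination from {KNOWN_LOCATIONS}. "
        "Reply with the location name only.\n\n"
        f"Utterance: {utterance}"
    )
    resp = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder model name
        messages=[{"role": "user", "content": prompt}],
    )
    return resp.choices[0].message.content.strip().lower()

# Whisper transcribes all supported languages with one model.
stt = whisper.load_model("base")
text = stt.transcribe("visitor_audio.wav")["text"]
destination = interpret_location(text)
# move_to(destination)  # hypothetical navigation call
```

The key point is that one multilingual model handles transcription and one LLM handles intent, so adding a new language requires no extra engineering.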

Robot Localisation: A robot that does not need a camera or GPS to understand its location + a Flutter app for the robot interface

We used only temperature, humidity, barometric pressure, and Wi-Fi sensors to localise the robot within a boundary. The goal was a cheap IoT device that can infer its location from environmental cues alone. We also used a large language model to analyse the readings and deduce the location; LLMs are surprisingly good at this, since they can pick up the sensor signature of each room. We applied it to a cybersecurity use case: detecting a location-spoofing attack carried out by intercepting an API. Essentially, our IoT device checks for discrepancies between the location an API reports and the one the sensors imply, as in the sketch below. On a side note, I built the app for this project from scratch in Flutter 🙂 to make it pretty 💅
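A minimal sketch of the idea, assuming made-up room signatures and an OpenAI chat model; the actual signatures, sensor hardware, and detection logic belong to the project and are not shown here.

```python
# Sketch: environmental readings -> LLM matches them to a room signature,
# then a mismatch with the API-reported location flags a possible attack.
# All room names, signature values, and the model name are assumptions.
from openai import OpenAI

# Assumed per-room signatures: temperature (°C), humidity (%),
# barometric pressure (hPa), strongest Wi-Fi SSID.
ROOM_SIGNATURES = {
    "server room": {"temp": 19.0, "humidity": 35, "pressure": 1012, "ssid": "LAB-5G"},
    "lobby":       {"temp": 24.5, "humidity": 60, "pressure": 1013, "ssid": "GUEST"},
}

def estimate_room(reading: dict) -> str:
    """Let the LLM match a live sensor reading against known room signatures."""
    client = OpenAI()
    prompt = (
        f"Known room signatures: {ROOM_SIGNATURES}\n"
        f"Live sensor reading: {reading}\n"
        "Which room is the device most likely in? Reply with the room name only."
    )
    resp = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder model name
        messages=[{"role": "user", "content": prompt}],
    )
    return resp.choices[0].message.content.strip().lower()

reading = {"temp": 19.2, "humidity": 34, "pressure": 1012, "ssid": "LAB-5G"}
sensed_room = estimate_room(reading)

# Discrepancy check: compare against the location an intercepted API reports.
api_reported_room = "lobby"  # e.g. from a spoofed location API response
if sensed_room != api_reported_room:
    print(f"Location attack suspected: API says {api_reported_room!r}, "
          f"sensors suggest {sensed_room!r}")
```

Because the check relies on physical cues the attacker cannot easily forge remotely, a spoofed API location stands out against what the room's environment says.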