CHAI YOUXIANG
My name is Chai Youxiang and I hold a Bachelor's degree in Computer Science from Nanyang Technological University, Singapore. I'm currently living my dream of being an AI Fullstack Developer!
My goal is to create software that is not only functional but also engaging and user-oriented. I strongly believe that Machine Learning and Artificial Intelligence have a place in creating experiences that resonate with users and applications that are cleverly intuitive and easy to use.
Skills & Technologies
Front-End
Back-End
DevOps
Languages & Libraries
My journey so far
FinTech Intern
Ernst & Young, Singapore
ASEAN FinTech Intern
August 2017 - February 2018
- Helped build and maintain a database of 1,000 global FinTech startups and 500 investors for EY's Investor Summit during the Singapore FinTech Festival 2017
- Initiated and automated startup-investor matching using SQL
Software Engineering Intern
Ministry of Home Affairs, Singapore
Software Engineering Intern
May 2022 - December 2022
- Designed and developed the front-end UI/UX for a mobile/web cross-platform application to track completion rates and progress of users within the department
- Leveraged tools such as Figma for wireframing and collaboration, and Node.js, React.js, TypeScript and the Ionic framework for front-end development
- Employed a CI/CD pipeline using Microsoft Azure DevOps and GitHub, and deployed the application onto Azure Kubernetes Service
Bachelor's Degree Graduation
Nanyang Technological University, Singapore
Bachelor of Engineering in Computer Science
August 2020 - January 2024
- Elective focus - Artificial Intelligence, Data Science & Analytics
My projects
Deep Learning Methods with Less Supervision
(Final Year Project)
Developed a framework for training segmentation models with weak forms of supervision, using foundation models such as Meta's Segment Anything Model for pseudo-label generation.
Fine-tuned a fully convolutional network (FCN) with a ResNet50 backbone on the generated pseudo-labels and incorporated Conditional Random Fields as a post-processing technique. A sketch of the fine-tuning step is shown below.
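Below is a minimal PyTorch sketch of the fine-tuning step only. It assumes the SAM-derived pseudo-label masks have already been generated offline and are served by a hypothetical `pseudo_label_loader`; the class count, optimizer settings and `ignore_index` convention are illustrative assumptions, and the CRF refinement is only noted in a comment.

```python
import torch
import torch.nn.functional as F
from torchvision.models.segmentation import fcn_resnet50

# Sketch of fine-tuning on pseudo-labels; hyperparameters are assumptions.
NUM_CLASSES = 21                      # assumed number of segmentation classes
device = "cuda" if torch.cuda.is_available() else "cpu"

model = fcn_resnet50(weights=None, num_classes=NUM_CLASSES).to(device)
optimizer = torch.optim.SGD(model.parameters(), lr=1e-3, momentum=0.9)

def train_one_epoch(pseudo_label_loader):
    """pseudo_label_loader yields (image, mask) batches, where the masks come
    from Segment Anything outputs converted to per-pixel class indices."""
    model.train()
    for images, pseudo_masks in pseudo_label_loader:
        images = images.to(device)                  # (B, 3, H, W), float
        pseudo_masks = pseudo_masks.to(device)      # (B, H, W), long class ids

        logits = model(images)["out"]               # (B, NUM_CLASSES, H, W)

        # Treat the pseudo-labels as ordinary targets; pixels the pseudo-
        # labelling step left unlabelled can be marked 255 and ignored.
        loss = F.cross_entropy(logits, pseudo_masks, ignore_index=255)

        optimizer.zero_grad()
        loss.backward()
        optimizer.step()

# At inference time, a dense CRF (e.g. pydensecrf) would refine the predicted
# masks as a post-processing step; that stage is omitted here for brevity.
```

Keeping pseudo-label generation offline means the training loop itself is identical to fully supervised fine-tuning; only the source of the labels changes.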
Knowledge Distillation Exploration
Explored Knowledge Distillation using PyTorch models for classification on the Fashion-MNIST dataset.
Experimented with distilling knowledge from ResNet as the teacher model to MobileNet as the student model, using KL-Divergence as the loss function; a sketch of the setup is shown below.
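Below is a minimal sketch of the distillation loop, assuming the torchvision variants resnet18 (teacher, already fine-tuned on Fashion-MNIST) and mobilenet_v3_small (student); the temperature, loss weighting and other hyperparameters are illustrative assumptions rather than the exact experimental setup.

```python
import torch
import torch.nn.functional as F
from torchvision import datasets, transforms, models

device = "cuda" if torch.cuda.is_available() else "cpu"

# Fashion-MNIST is 1x28x28; replicate to 3 channels and upscale slightly so the
# ImageNet-style backbones retain some spatial resolution.
tfm = transforms.Compose([
    transforms.Grayscale(num_output_channels=3),
    transforms.Resize((64, 64)),
    transforms.ToTensor(),
])
train_set = datasets.FashionMNIST("data", train=True, download=True, transform=tfm)
loader = torch.utils.data.DataLoader(train_set, batch_size=128, shuffle=True)

# Teacher (ResNet) and student (MobileNet), both with 10-class heads.
teacher = models.resnet18(num_classes=10).to(device).eval()   # assumed: fine-tuned weights loaded here
student = models.mobilenet_v3_small(num_classes=10).to(device)

optimizer = torch.optim.Adam(student.parameters(), lr=1e-3)
T, alpha = 4.0, 0.7   # temperature and distillation weight (assumed values)

for images, labels in loader:
    images, labels = images.to(device), labels.to(device)
    with torch.no_grad():
        teacher_logits = teacher(images)
    student_logits = student(images)

    # KL-divergence between the softened teacher and student distributions,
    # blended with ordinary cross-entropy on the ground-truth labels.
    kd_loss = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)
    ce_loss = F.cross_entropy(student_logits, labels)
    loss = alpha * kd_loss + (1 - alpha) * ce_loss

    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
```

Scaling the KL term by T² keeps its gradient magnitudes comparable to the cross-entropy term when the softmax is softened, which is the convention from the original distillation formulation.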