Self-introduction

Values:

  • Discipline
  • Creation
  • Freedom

Hi, I’m Tianle Xiao, an IB DP student passionate about the intersection of AI, Computer Vision, and Quantum Cryptography. Through independent projects like GestureCtrl (real-time hand gesture recognition) and Rugby (AI-powered 3D tactical analysis), I explore how elegant theory meets powerful engineering. With an IPC Online Silver Award and 1100+ GitHub commits, I’m driven by curiosity and the pursuit of independence.

Frequently Asked Questions

What motivated you to start your computer vision and HCI projects?

My motivation originated from observing inefficiencies in everyday human–computer interaction. I became particularly interested in how vision-based systems can create more intuitive interfaces. Projects such as GestureCtrl emerged from exploring whether real-time landmark tracking could replace traditional input devices while maintaining low latency and stability.

I see these projects not merely as applications, but as experiments in improving interaction design through computer vision.

What technical challenges did you encounter, and how did you address them?

One major challenge was reducing noise and instability in real-time gesture tracking. Raw landmark coordinates fluctuate from frame to frame, producing cursor jitter; naive heavy filtering removes the jitter but introduces noticeable response lag.

To address this, I implemented adaptive smoothing mechanisms and dynamic thresholding, balancing responsiveness and stability. I also explored system-level API integration to reduce latency in OS-level control.
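The smoothing idea described above can be sketched in a few lines. This is a minimal illustration, not the actual GestureCtrl code: the function name, parameters, and constants are assumptions chosen to show how a blending weight can adapt to movement speed, filtering heavily when the hand is nearly still and passing fast motions through with little delay.

```python
import math

def adaptive_smooth(prev, raw, min_alpha=0.15, max_alpha=0.9, speed_scale=0.05):
    """Exponential smoothing whose blend weight grows with movement speed.

    Slow movements are filtered heavily (stable cursor); fast movements
    pass through almost unfiltered (low latency). All parameter values
    here are illustrative, not tuned.
    """
    dx, dy = raw[0] - prev[0], raw[1] - prev[1]
    speed = math.hypot(dx, dy)
    # Map speed to a blend weight clamped to [min_alpha, max_alpha].
    alpha = min(max_alpha, min_alpha + speed * speed_scale)
    return (prev[0] + alpha * dx, prev[1] + alpha * dy)

# Feed successive raw landmark positions through the filter:
pos = (100.0, 100.0)
for raw in [(101.0, 99.5), (100.5, 100.2), (160.0, 140.0)]:
    pos = adaptive_smooth(pos, raw)
```

In this sketch a small jitter of one pixel moves the smoothed cursor only a fraction of a pixel, while a large deliberate gesture is tracked at close to full speed because the weight saturates at its upper bound.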

Through this process, I gained a deeper understanding of real-time data processing and human-centered system optimization.

How do your projects connect to your academic interests?

My academic focus lies in computer vision, human–computer interaction, and AI-based modeling systems.

For example, the Rugby AI project applies object detection and spatiotemporal analysis to sports performance evaluation, connecting computer vision techniques with predictive modeling and visualization.
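As one hedged illustration of what spatial-temporal analysis over detections can mean in practice (the Rugby project's actual pipeline is not shown here, and this helper is hypothetical), per-player speed can be derived from timestamped positions produced by an object detector and tracker:

```python
import math

def track_speeds(track):
    """Given one player's detections as (t_seconds, x_m, y_m) tuples,
    return the instantaneous speed in m/s between consecutive frames.

    Illustrative sketch only: assumes positions are already projected
    into pitch coordinates (metres) by an upstream detection step.
    """
    speeds = []
    for (t0, x0, y0), (t1, x1, y1) in zip(track, track[1:]):
        dt = t1 - t0
        if dt <= 0:
            continue  # skip duplicate or out-of-order frames
        speeds.append(math.hypot(x1 - x0, y1 - y0) / dt)
    return speeds
```

Quantities like these feed naturally into the predictive-modeling and visualization layers: speed profiles, distance covered, and acceleration events are all simple reductions over the same (time, position) stream.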

These projects serve as practical extensions of my interest in how computational systems interpret and model real-world human behavior.

What research direction would you like to pursue in university?

In university, I would like to further explore vision-based interaction systems and AI-driven modeling, particularly in environments involving dynamic human motion.

I am especially interested in improving robustness, latency reduction, and multi-modal interaction systems. Long term, I hope to contribute to research that bridges theoretical computer vision methods with practical human-centered applications.