Viet Nguyen


I am a Master’s student at the University of Science, Vietnam National University, and a Data Scientist at MTI Technology - AI Lab, where I work on applying Vision–Language Models (VLMs) to structured data extraction. My current research interests focus on improving VLMs through algorithmic dataset selection and representation learning. Previously, I earned my Bachelor’s degree in Data Science from the University of Economics Ho Chi Minh City, where I worked as a Research Assistant at the Intelligent Data Analytics Lab under the supervision of Dr. Dang N. H. Thanh, conducting research on language models.

News


  • Jul 08, 2025 I joined the AI Lab at MTI Technology as a Data Scientist.
  • Oct 18, 2024 I joined TMA Solutions as an AI Engineer Intern, working on VLMs for real-time video description generation.
  • Aug 14, 2024 I was awarded the Odon Vallet Scholarship.
  • Mar 07, 2024 Our paper on customer intent mining with Deep Embedded Clustering was accepted by the Journal of Uncertain Systems.
  • Feb 10, 2024 Our paper on attention-free language models was accepted at ICICCT 2024.
  • Feb 10, 2023 Our paper on stock price prediction with LSTM and BiLSTM was accepted at ICICCT 2023.
  • Sep 10, 2021 I was awarded the UEH Admission Scholarship.

Publications


  • Nguyen Q. Viet, Nguyen N. Quang, Nguyen King, and Dang N. H. Thanh. Performance Insights of Attention-Free Language Models in Sentiment Analysis: A Case Study for E-Commerce Platforms in Vietnam. In Inventive Communication and Computational Technologies, 2024.

  • Nguyen Q. K. Ha, Nguyen T. T. Huyen, Mai T. M. Uyen, Nguyen Q. Viet, Nguyen N. Quang, and Dang N. H. Thanh. Customer Intent Mining from Service Inquiries with Newly Improved Deep Embedded Clustering. In Journal of Uncertain Systems, 2024.

  • Nguyen Q. Viet, Nguyen N. Quang, Nguyen King, Dinh T. Huu, Nguyen D. Toan, and Dang N. H. Thanh. An Exploratory Comparison of LSTM and BiLSTM in Stock Price Prediction. In Inventive Communication and Computational Technologies, 2023.