Designed and prototyped the HMI and UX architecture for LG’s Future AI Cockpit, integrating eye tracking, gesture, proximity, and conversational AI into a cohesive hardware-driven experience showcased at CES.
Role
Product Designer on a lean, startup-sized team within a focused division
Team
1 PM, 2 Product Designers, 4 GUI Designers, 30 Developers, 1 Design Agency
Period
2016–2018, in annual cycles
Overview
At LG, I designed and prototyped the HMI and UX architecture for an advanced AI cockpit that unified eye and head tracking, gesture recognition, proximity sensing, and conversational AI into a cohesive, hardware-integrated user experience. I collaborated with engineering teams to define component-level interactions and deliver functional prototypes showcased at CES and global motor shows.
Challenge
How might we integrate LG’s diverse in-car technologies, including AI voice, eye and head tracking, gesture, proximity, and driver monitoring, into one intelligent and intuitive cockpit experience?
Objective
Develop a cohesive HMI and UX framework that synchronizes multimodal inputs and AI systems, transforming fragmented features into a single, fluid user experience.
Result
The LG Future AI Cockpit integrated LG’s full suite of vehicle technologies into one hardware-driven UX system, showcased at CES as a working prototype representing LG’s vision for intelligent mobility.
More detail
I’ll walk you through the details during our meeting.