🔬 research · Mostly Real
Monday, March 30, 2026
DEVELOP GAZE-CONDITIONED MLLMS FOR REAL-TIME VIDEO UNDERSTANDING.
Gaze data improves MLLMs' real-time video understanding.
◆ What Changed
MLLMs process raw video uniformly → gaze data tells the model where to look.
◇ Why It Matters
AI devs build more intuitive, responsive video agents.
🛠 Builder Opportunity
Create AR/VR MLLM interfaces responding to user gaze.
⚡ Next Step
→ Integrate gaze tracking into MLLM video pipelines.
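One minimal way to wire gaze into an MLLM video pipeline is to crop each frame around the user's gaze point before it reaches the vision encoder, so the model's input is already focused on what the user is looking at. The sketch below is a hypothetical illustration: the function name `gaze_crop`, the normalized gaze coordinates, and the 224-pixel crop size are assumptions, not any specific MLLM's API.

```python
import numpy as np

def gaze_crop(frame: np.ndarray, gaze_xy: tuple[float, float],
              crop_size: int = 224) -> np.ndarray:
    """Crop a square region centered on a normalized gaze point.

    frame: HxWxC video frame; gaze_xy: (x, y) with both in [0, 1].
    The crop is clamped so it always stays inside the frame bounds.
    """
    h, w = frame.shape[:2]
    cx, cy = int(gaze_xy[0] * w), int(gaze_xy[1] * h)
    half = crop_size // 2
    # Clamp the top-left corner so the crop never runs off the frame.
    x0 = min(max(cx - half, 0), max(w - crop_size, 0))
    y0 = min(max(cy - half, 0), max(h - crop_size, 0))
    return frame[y0:y0 + crop_size, x0:x0 + crop_size]

# Example: a 480x640 frame with the user gazing at the upper-right area.
frame = np.zeros((480, 640, 3), dtype=np.uint8)
crop = gaze_crop(frame, gaze_xy=(0.75, 0.4))
print(crop.shape)  # (224, 224, 3)
```

The gaze-centered crop (rather than the full frame) would then be passed to the MLLM's image encoder; richer designs could instead weight visual tokens by gaze proximity rather than hard-cropping.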
📎 Sources