HRI paradigms (physical, teleoperation, shared autonomy), Levels of autonomy (SAE J3016 Levels 0-5, adapted for robotics), Cognitive models (NASA-TLX workload, Situation Awareness), Trust in automation (Lee & See model), Uncanny valley effect, Robot appearance (anthropomorphic, zoomorphic, functional), Social robot design principles.
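The NASA-TLX workload score mentioned above combines six subscale ratings (0-100) weighted by 15 pairwise comparisons. A minimal sketch of the standard weighted computation (the dict keys and function name here are illustrative, not part of any library):

```python
# NASA-TLX weighted workload: six subscales rated 0-100, weights from
# 15 pairwise comparisons (each weight = times that subscale was chosen).
SCALES = ("mental", "physical", "temporal", "performance", "effort", "frustration")

def nasa_tlx_weighted(ratings: dict, weights: dict) -> float:
    assert sum(weights[s] for s in SCALES) == 15, "weights must sum to 15"
    # Weighted mean: sum(rating * weight) / 15 stays on the 0-100 scale.
    return sum(ratings[s] * weights[s] for s in SCALES) / 15.0
```

An unweighted variant ("Raw TLX") simply averages the six ratings; it is common in HRI studies because it skips the pairwise-comparison step.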
Multimodal interaction (speech, gesture, gaze, haptics), Speech recognition (ASR: HMM, end-to-end DNNs), Natural Language Understanding (NLU: intent classification, slot filling), Dialogue management (POMDP, rule-based), Gesture recognition (MediaPipe, OpenPose), Gaze estimation and attention modeling.
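Intent classification and slot filling can be illustrated with a toy rule-based NLU parser; real systems use trained classifiers and sequence taggers, and the keyword tables and slot patterns below are invented for the example:

```python
import re

# Hypothetical intent lexicon: intent name -> trigger keywords.
INTENT_KEYWORDS = {
    "fetch_object": ["bring", "fetch", "get"],
    "navigate": ["go", "move", "drive"],
}
# Hypothetical slot patterns for a small household domain.
OBJECT_SLOT = re.compile(r"\b(cup|book|bottle)\b")
LOCATION_SLOT = re.compile(r"\bto the (\w+)\b")

def parse_utterance(text: str) -> dict:
    """Classify the intent by keyword match, then fill slots by regex."""
    words = text.lower().split()
    intent = next((name for name, kws in INTENT_KEYWORDS.items()
                   if any(kw in words for kw in kws)), "unknown")
    slots = {}
    if (m := OBJECT_SLOT.search(text.lower())):
        slots["object"] = m.group(1)
    if (m := LOCATION_SLOT.search(text.lower())):
        slots["location"] = m.group(1)
    return {"intent": intent, "slots": slots}
```

A dialogue manager (rule-based or POMDP-based) would consume such intent/slot frames as observations when deciding the robot's next action.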
Emotion recognition (facial expression recognition (FER), physiological signals: EDA, HRV), Expressive behavior generation (gesture timing, facial animation), Rapport building strategies, Theory of Mind in robots, Personalization and user modeling, Cultural adaptation in HRI, Ethical considerations (deception, privacy, attachment).
HRI in manipulation (shared control, intent prediction), Haptic feedback and teleoperation (passivity-based control), Assistive technologies (exoskeletons, wheelchairs), Legibility and predictability in motion, Human-aware navigation (social costs, pedestrian avoidance), Safety standards (ISO 13482, UL 1740).
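The social costs used in human-aware navigation are often modeled as Gaussian "personal space" penalties added to the planner's costmap around each detected pedestrian. A minimal sketch under that common assumption (amplitude and sigma values are illustrative):

```python
import math

def social_cost(x: float, y: float, pedestrians: list,
                amplitude: float = 10.0, sigma: float = 0.8) -> float:
    """Sum of 2-D isotropic Gaussian costs centered on each pedestrian.

    A planner adds this to its base traversal cost so paths that cut
    through personal space become expensive but not forbidden.
    """
    cost = 0.0
    for px, py in pedestrians:
        d2 = (x - px) ** 2 + (y - py) ** 2
        cost += amplitude * math.exp(-d2 / (2.0 * sigma ** 2))
    return cost
```

Practical variants make the Gaussian anisotropic (elongated in the pedestrian's walking direction) so the robot avoids crossing in front of people rather than behind them.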
Experimental design (within/between subjects, counterbalancing), Metrics (success rate, task completion time, user satisfaction via SUS and NASA-TLX), Field vs. lab studies, HRI in domains (healthcare, education, manufacturing, service robots), Emerging trends (AR/VR HRI, brain-computer interfaces), Accessibility and inclusive design.
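The SUS questionnaire listed among the metrics has a fixed scoring rule: ten items rated 1-5, odd (positively worded) items contribute response minus 1, even (negatively worded) items contribute 5 minus response, and the total is scaled by 2.5 onto 0-100. A direct sketch:

```python
def sus_score(responses: list) -> float:
    """System Usability Scale score from ten 1-5 Likert responses.

    Odd-numbered items are positively worded (score = response - 1);
    even-numbered items are negatively worded (score = 5 - response).
    The summed contributions (0-40) are scaled by 2.5 to 0-100.
    """
    assert len(responses) == 10, "SUS has exactly ten items"
    total = 0
    for i, r in enumerate(responses, start=1):
        total += (r - 1) if i % 2 == 1 else (5 - r)
    return total * 2.5
```

Scores around 68 are typically cited as the benchmark average, so the 0-100 value is best read against that reference rather than as a percentage.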