Book title: Human-Computer Interaction. Multimodal and Natural Interaction (site metrics listed: impact factor, online visibility, citation counts, annual citations, and reader feedback, each with subject rankings)
Research on Gesture Interaction Design for Home Control Intelligent Terminals
… the family control intelligent terminal products are more complex in functional architecture and hierarchy, and existing interactive gestures cannot meet the functional requirements. The aim of this paper is to design a new interaction gesture for home control intelligent terminals.
A Comparative Study of Hand-Gesture Recognition Devices for Games
… designers might find it challenging to decide which gesture recognition device will work best. In the present research, we compare three vision-based hand-gesture devices: Leap Motion, Microsoft's Kinect, and Intel's RealSense. The comparison provides game designers with an understanding of the main …
The Social Acceptability of Peripheral Interaction with 3D Gestures in a Simulated Setting
… extent to which gesture control can be regarded as peripheral interaction [.]. We first identified six gestures for player control to use in the social acceptability study. Next, we used a Wizard-of-Oz design to investigate social acceptability by inviting user dyads (N = 24 participants) to a living room …
Research of Interactive Gesture Usability of Navigation Application Based on Intuitive Interaction
… applications, which people need to use more and more in daily travel, still need an improved operation mode. Applying gesture control technology to map applications will provide a better user experience than the original operation mode in specific scenes. Actually, there are many …
The Effects of Body Gestures and Gender on Viewer's Perception of Animated Pedagogical Agent's Emotions
… and body gestures. In particular, the two studies reported in the paper investigated the extent to which modifications to the range of movement of three beat gestures (e.g., both arms synchronous outward gesture, both arms synchronous forward gesture, and upper body lean) and the agent's gender have …
Integrating Language and Emotion Features for Multilingual Speech Emotion Recognition
… Two novel methods are proposed, which exploit language information and emotion information. In the first method, features specific to the three languages are concatenated with emotion-specific features and applied using a common extremely randomized trees (ERT) classifier to recognize five emotions …
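The feature construction described in the first method above, concatenating language-specific information with emotion-specific features before classification, can be sketched in a few lines. The three-language set and the feature values below are assumptions for illustration, and the ERT classifier itself is not reimplemented here.

```python
# Sketch of concatenating language information with emotion features.
# The language list is an assumption for illustration only.

LANGUAGES = ["english", "german", "italian"]  # assumed language set

def one_hot_language(language):
    # Encode the utterance's language as a one-hot vector.
    return [1.0 if language == lang else 0.0 for lang in LANGUAGES]

def build_feature_vector(language, emotion_features):
    # Concatenate language-specific information with emotion-specific
    # acoustic features (e.g. pitch, energy, or MFCC statistics).
    return one_hot_language(language) + list(emotion_features)

# A hypothetical utterance with three acoustic emotion features:
vec = build_feature_vector("italian", [0.42, 0.17, 0.88])
print(vec)  # [0.0, 0.0, 1.0, 0.42, 0.17, 0.88]
```

In practice the combined vectors would then be fed to a tree-ensemble classifier such as scikit-learn's ExtraTreesClassifier, which implements extremely randomized trees.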
A New Approach to Measure User Experience with Voice-Controlled Intelligent Assistants: A Pilot Study
… intelligence to have verbal interactions with end-users. In this research, we propose a multi-method approach to assess user experience with a smart voice assistant through triangulation of psychometric and psychophysiological measures. The approach aims to develop a richer understanding of what the …
The Effect of Personal Pronouns on Users' Emotional Experience in Voice Interaction
… emotional experience. Therefore, we study how personal pronouns should be used in the responses of voice assistants for Chinese users. We conducted a quantitative experiment. The independent variable is the use of personal pronouns in the intelligent voice assistants, with three levels: no personal …
A Human-Centered Approach to Designing Gestures for Natural User Interfaces
Gesture-based interfaces, a type of natural user interface (NUI), allow users to use their bodies to interact with computers or virtual/augmented reality (VR/AR) and offer a more natural and intuitive user experience; however, most gesture-based commands have been developed not by considering the user, but …
… a screen and one or more cameras that can be readily used for gesture-based interaction. Literature comparing mouse and free-hand gesture interaction, however, is still somewhat sparse with regard to user satisfaction, learnability and memorability, which can be particularly important attributes for applications …
Detecting Gestures Through a Gesture-Based Interface to Teach Introductory Programming Concepts
… introductory programming concepts. Our system combines components from Google's Blockly, a visual programming language with a drag-and-drop puzzle-piece interface, and Microsoft's Xbox Kinect, which is used to perform skeletal tracking. We focus on two supervised machine learning clustering algorithms …
Comparing the User Preferences Towards Emotional Voice Interaction Applied on Different Devices
… help users handle daily chores, but also provide emotional support and communication. Voice interaction has been utilized for both mobile and home automation, and many researchers have studied the emotional voice interaction of various devices. As different voice interaction products have diverse …
… research has shown that humanlike systems will not necessarily be perceived positively by users. The study reported here examined the effect of human likeness on users' ratings of enjoyment, attitudes and motivation to use VUIs in a Wizard-of-Oz experiment. Two attributes of human likeness, voice of the system …
Lecture Notes in Computer Science
ISBN 978-3-030-49061-4. © Springer Nature Switzerland AG 2020.
Human-Computer Interaction. Multimodal and Natural Interaction. ISBN 978-3-030-49062-1. Series ISSN 0302-9743; Series E-ISSN 1611-3349.
Conference proceedings 2020
… the 22nd International Conference on Human-Computer Interaction, HCII 2020, which took place in Copenhagen, Denmark, in July 2020. A total of 1439 papers and 238 posters have been accepted for publication in the HCII 2020 proceedings from a total of 6326 submissions. The 145 papers included in these HCI 2020 …
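From the figures quoted above, the acceptance rate for full papers (counting posters separately) can be checked with a few lines:

```python
# Acceptance statistics quoted for the HCII 2020 proceedings.
papers_accepted = 1439
posters_accepted = 238
submissions = 6326

# Full-paper acceptance rate, ignoring posters.
paper_rate = papers_accepted / submissions
print(f"{paper_rate:.1%}")  # 22.7%
```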
Integrating Language and Emotion Features for Multilingual Speech Emotion Recognition
… achieved. This result is very promising and superior to the UAR obtained by human evaluation (71.8% for Italian instances). When using the SGE-based method, a 69.2% UAR was achieved, which is closely comparable to the human evaluation results.
A New Approach to Measure User Experience with Voice-Controlled Intelligent Assistants: A Pilot Study
… this new approach in a pilot study, and we show that each method captures a part of the emotional variance during the interaction. Results suggest that emotional valence is better captured with psychometric measures, whereas arousal is better detected with psychophysiological measures.
… know what the user is doing. We confirm the feasibility of our approach and demonstrate the accuracy of mouth shape recognition. We present two applications: mouth shape can be used to zoom in or out, or to select an application from a menu.
Detecting Gestures Through a Gesture-Based Interface to Teach Introductory Programming Concepts
… puzzle-piece interface, and Microsoft's Xbox Kinect, which is used to perform skeletal tracking. We focus on two supervised machine learning clustering algorithms, centroid matching and medoid matching, to detect gestures.
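The two prototype-matching schemes named in the abstract, centroid matching and medoid matching, can be sketched in a few lines of plain Python. The 2-D gesture features and the class labels below are invented for illustration; the paper's actual skeletal features come from the Kinect.

```python
import math

def centroid(points):
    # Mean of each coordinate: the centroid need not be an actual sample.
    dims = len(points[0])
    return [sum(p[d] for p in points) / len(points) for d in range(dims)]

def medoid(points):
    # The sample whose total distance to all other samples is smallest;
    # unlike the centroid, it is always a real observation.
    def total_dist(p):
        return sum(math.dist(p, q) for q in points)
    return min(points, key=total_dist)

def classify(sample, prototypes):
    # Assign the gesture label whose prototype is nearest to the sample.
    return min(prototypes, key=lambda label: math.dist(sample, prototypes[label]))

# Hypothetical 2-D feature vectors for two gesture classes:
wave = [(0.0, 1.0), (0.2, 1.1), (0.1, 0.9)]
push = [(1.0, 0.0), (1.1, 0.2), (0.9, 0.1)]

protos_centroid = {"wave": centroid(wave), "push": centroid(push)}
protos_medoid = {"wave": medoid(wave), "push": medoid(push)}

print(classify((0.15, 1.0), protos_centroid))  # wave
print(classify((1.0, 0.1), protos_medoid))     # push
```

The design difference is that medoid matching keeps prototypes inside the observed data, which makes it more robust to outlier frames, while centroid matching averages them away.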
A Human-Centered Approach to Designing Gestures for Natural User Interfaces
… how natural they found the gestures produced in Study One compared to a set of arbitrary gestures. Overall, participants produced very similar gestures for the actions in Study One, and these gestures were rated as most natural in Study Two. Taken together, this research provides a user-centered method …
A Comparative Study of Hand-Gesture Recognition Devices for Games
… players' accounts of their experiences using these gesture devices. Based on these findings, we discuss how such devices can be used by game designers and provide them with a set of design cautions that offer insights into the design of gesture-based games.
The Social Acceptability of Peripheral Interaction with 3D Gestures in a Simulated Setting
… acceptability. Taken together, this suggests that gesture control is socially acceptable for all users. We analysed gaze direction when gesturing based on the video recordings. Using this as an indicator for user attention, we also found evidence that gesture control can indeed exist in the periphery of attention.
Research of Interactive Gesture Usability of Navigation Application Based on Intuitive Interaction
… finally form a set of intuitive gestures suitable for map applications. The research also analyzes the relationship between users' intuitive gestures and map tasks, as well as the relationship between users' intuitive gestures and their previous experience of using intelligent devices, to discuss the …
Gesture-Based Interaction: Visual Gesture Mapping
… as the user had learned the interaction. By combining gestures from the library, gesture-based interaction can be used to control advanced machines, robots and drones in an intuitive and non-strenuous way.
The Potential of Gesture-Based Interaction
… feedback and constraints, VR and AR stand out as the most promising media for applying gesture-based interaction. The potential is determined not only by the medium, but also by the domain in which it is applied. Domains like education, healthcare, robotics, heavy industry and space show clear benefits …
The Effects of Body Gestures and Gender on Viewer's Perception of Animated Pedagogical Agent's Emotions
… featured a male agent. In the first study, which used a within-subject design and metric conjoint analysis, 120 subjects were asked to watch 8 stimulus clips and rank them according to perceived valence and arousal (from highest to lowest). In the second study, which used a between-subject design, 30 …
Comparing the User Preferences Towards Emotional Voice Interaction Applied on Different Devices: An …
… between-subjects experiment was conducted. Then the influences of different devices and voice interaction modes on user emotional experience were discussed, providing reference and guidance for the voice interaction design of corresponding products.
The Effect of Personal Pronouns on Users' Emotional Experience in Voice Interaction
… assistants when it responded in the second person, and users were more satisfied with the responses of the voice assistants in the second person. These results can inform the design of voice interaction, and it is possible to design response strategies for machines based on the theory of …