Interaction Design for a Train Ticket Vending Machine System (火车票自动售票机系统交互设计)

Abstract: China's railways have entered a stage of rapid development, and window-based ticket sales can no longer keep up with the fast pace of modern life, so the automatic train ticket vending system came into being. As a relatively new product, the train ticket vending machine still requires further consideration of the friendliness and practicality of its human-machine interaction and the completeness of its functions. This paper analyzes each component of the vending machine's human-machine interaction system and, through field research, derives basic principles and methods for the interaction design of train ticket vending machines. Taking a user-oriented approach, it identifies the shortcomings of existing vending machines and users' needs for future ones, establishes a direction for improving the current design, and finally proposes an improved human-machine interaction scheme, aiming to provide a reference for the future development of vending machine interaction design.

Keywords: train ticket vending machine; human-machine interaction; interaction system; interface design

5、stem. DOG eye: Controlling your home with eye interactionDario Bonino,Emiliano Castellina,Fulvio Corno,Luigi De Russis,Politecnico di Torino, Dipartimento di Automatica ed Informatica, Corso Duca degli Abruzzi 24, 10129 Torino, ItalyReceived 29 October 2021. Revised 15 April 2021. Accepted 15 June 2

6、021. Available online 23 June 2021.http:/dx.doi.org/10.1016/j.intcom.2021.06.002,How to Cite or Link Using DOICited by in Scopus (0)Permissions & ReprintsAbstractNowadays home automation, with its increased availability, reliability and with its ever reducing costs is gaining momentum and is startin

7、g to become a viable solution for enabling people with disabilities to autonomously interact with their homes and to better communicate with other people. However, especially for people with severe mobility impairments, there is still a lack of tools and interfaces for effective control and interact

8、ion with home automation systems, and generalpurpose solutions are seldom applicable due to the complexity, asynchronicity, time dependent behavior, and safety concerns typical of the home environment. This paper focuses on userenvironment interfaces based on the eye tracking technology, which often

9、 is the only viable interaction modality for users as such. We propose an eye-based interface tackling the specific requirements of smart environments, already outlined in a public Recommendation issued by the COGAIN European Network of Excellence. The proposed interface has been implemented as a so

10、ftware prototype based on the ETU universal driver, thus being potentially able to run on a variety of eye trackers, and it is compatible with a wide set of smart home technologies, handled by the Domotic OSGi Gateway. A first interface evaluation, with user testing sessions, has been carried and re

11、sults show that the interface is quite effective and usable without discomfort by people with almost regular eye movement control.Highlights Lets people with impaired mobility interact autonomously with their smart homes. Reports principles for realizing an application for house control with eye tra

12、cking. Design and development of a multimodal eye-based interface for environmental control. Qualitative and quantitative evaluation, through user study, of the proposed interface.KeywordsHumanhome interaction;Smart homes;Domotics;Usability;User interface;User study1. IntroductionIn the last 5years,

13、 (smart) home automation gained a new momentum, thanks to an increased availability of commercial solutions and to steadly reducing costs. The evergreen appeal of automated, intelligent homes together with a raising technology maturity has fostered new research challenges and opportunities in the fi

14、eld of “intelligent” or “smart” environments. According to the Mark Weiser definition, a Smart Home system, that in this paper we decline as domotic or environmental control system, is “a physical world that is richly and invisibly interwoven with sensors, actuators, displays and computational eleme

15、nts, embedded seamlessly in the everyday object of our lives, and connected through a continuous network” (Weiser, 2021), providing ways for controlling, interacting and monitoring the house. The idea behind this vision is that homes of tomorrow would be smart enough to control themselves, understan

16、d contexts in which they operate and perform suitable actions under inhabitants supervision (Bierhoff et al., 2021 ). Although smart and autonomous homes might raise controversial opinions on how smart are they or should they be, currently available commercial solutions can start playing a relevant

17、role as enabling technology for improving the care of the elderly (Berlo, 2021 andZhang et al., 2021) and of people with disabilities (Chikhaoui and Pigot, 2021andChan et al., 2021), reducing their daily workload in the house, and enabling them to live more autonomously and with a better quality of

18、life. Even if such systems are far from cutting-edge research solutions, they are still really complex to master since they handle and coordinate several devices and appliances with different functionalities and with different control granularities.In particular, among other disabilities, people who

19、 have severely impaired motor abilities can take great advantages from eye tracking systems to control their homes, since they generally retain normal control of their eyes, that become therefore their preferential stream of interaction (Hornof and Cavender, 2021 ). Eye tracking can transforms such

20、a limited ability into both a communication channel and an interaction medium, opening possibilities for computer-based communication and control solutions (Donegan et al., 2021 ). Even if eye tracking is often used for registering eye movements in usability studies, it can be successfully exploited

21、 as alternative input modality to control user interfaces. Home automation can then bridge the gap between software and tangible objects, enabling people with motor disabilities to effectively and physically engage with their surroundings (Andrich et al., 2021 ). Several house control interfaces hav

22、e been proposed in the literature, i.e., applications to allows users to control different types of devices in their homes, to handle triggered alarms, etc. Such interfaces, either based on conventional unimodal (Koskela and Vnnen-Vainio-Mattila, 2021 ) or multimodal interactions (Weingarten et al.,

23、 2021) (e.g., mouse, remote controller, etc.), are too often uncomfortable and/or useless for people with severe impaired motor abilities, and only few of them have been specifically designed and developed to be controlled with eye movements.In 2021 , applications based on gaze interaction have been

24、 analyzed by a European Network of Excellence, named COGAIN (Communication by Gaze Interaction)2, to evaluate the state-of-the-art and to identify potential weaknesses and future developments. According to the report “D2.4 A survey of Existing de facto Standards and Systems of Environmental Control”

25、 (Bates et al., 2021 ), the COGAIN Network identified different problems in eye-based house control applications, such as the lack of advanced functionalities for controlling some appliances of the house, the absence of interoperability between different smart house systems or the difficulty to use

26、an eye tracker for realizing some actions. In a subsequent report (Corno et al., 2021 ), COGAIN members proposed solutions to overcome the discovered problems. In particular, they proposed 21 guidelines to promote safety and accessibility in eye tracking based environmental control applications.This
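Safety and accessibility guidelines of this kind typically translate, in gaze-controlled home applications, into interaction patterns such as dwell-time selection with an explicit confirmation step for safety-critical commands. The following is a minimal illustrative sketch of that pattern; the class names, thresholds, and command strings are assumptions for illustration, not taken from DOGeye or the COGAIN recommendations:

```python
DWELL_MS = 800  # how long the gaze must rest on a target before it activates


class DwellButton:
    """An on-screen target that activates after a sustained gaze dwell."""

    def __init__(self, name, safety_critical=False):
        self.name = name
        self.safety_critical = safety_critical
        self._dwell_start = None

    def update(self, gaze_on_target, now_ms):
        """Feed one gaze sample; return True when the dwell completes."""
        if not gaze_on_target:
            self._dwell_start = None       # gaze left the target: reset timer
            return False
        if self._dwell_start is None:
            self._dwell_start = now_ms     # gaze entered the target: start dwelling
        return now_ms - self._dwell_start >= DWELL_MS


class GazeController:
    """Routes activations; safety-critical commands need a second activation."""

    def __init__(self):
        self._pending = None

    def on_activated(self, button):
        if button.safety_critical and self._pending is not button:
            self._pending = button                 # ask the user to confirm first
            return "confirm:" + button.name
        self._pending = None
        return "execute:" + button.name
```

In a real deployment the `gaze_on_target` flag would come from mapping fixation coordinates onto widget bounds, and the dwell threshold would be tuned per user, since too short a dwell causes accidental activations (the "Midas touch" problem) while too long a dwell is fatiguing.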

This paper describes the design and development of DOGeye, one of the first home control applications designed for gaze-based interaction while explicitly accounting for the COGAIN guidelines. DOGeye is a multimodal eye-based application for home management and control, based on state-of-the-art technologies in both eye tracking and home control. It enables people to control their domotic homes through different input devices, possibly combined, so that it is not limited to eye tracking alone. The presence of various input modalities allows the application to be used by other people present in the house and offers alternatives to persons affected by possibly evolving impairments such as ALS (Amyotrophic Lateral Sclerosis).

The remainder of the paper is organized as follows: Section 2 presents the basic features of eye tracking technology and the characteristics of eye-based user interfaces, while Section 3 presents the work accomplished by the members of the COGAIN Network and describes the COGAIN guidelines for eye-tracking-based environmental control applications. Section 4 reports relevant related work and findings. DOGeye's design and architecture are described in Section 5, while Sections 6 and 7 report the setup and results of a user test involving people in a controlled environment, thus building the basis for further considerations and research. Section 8 concludes the paper and outlines future work.

2. Eye tracking basics

To better understand the principles and implementation of eye-controlled interfaces, this section defines some terms and features pertaining to eye movements and eye tracking. The eye does not generally move smoothly over the visual field; instead, it makes a series of quick jumps, called saccades, along with other specialized movements (Haber and Hershenson, 1973). A saccade lasts 30–120 ms and typically covers 15–20 degrees of visual angle (Jacob, 1995). Between saccades, the gaze point, i.e., the point in a scene where a person is looking, stays at the same location.
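The saccade/fixation distinction described above is what gaze-event detection algorithms exploit in practice: samples belonging to one fixation cluster tightly, while saccades produce large jumps. As an illustrative sketch (not an algorithm from this paper), a minimal dispersion-threshold (I-DT style) fixation detector can be written as follows; the function name, units, and thresholds are assumptions:

```python
def detect_fixations(samples, max_dispersion=1.0, min_duration_ms=100):
    """Group raw gaze samples into fixations.

    samples: list of (t_ms, x, y), with x/y in degrees of visual angle.
    Returns a list of (start_ms, end_ms, centroid_x, centroid_y).
    """
    fixations = []
    i, n = 0, len(samples)
    while i < n:
        xs, ys = [], []
        j = i
        # Grow the window while the point cloud stays compact
        # (x-spread + y-spread below the dispersion threshold).
        while j < n:
            xs.append(samples[j][1])
            ys.append(samples[j][2])
            if (max(xs) - min(xs)) + (max(ys) - min(ys)) > max_dispersion:
                xs.pop()
                ys.pop()
                break
            j += 1
        # A compact window long enough in time counts as one fixation.
        if j > i and samples[j - 1][0] - samples[i][0] >= min_duration_ms:
            fixations.append((samples[i][0], samples[j - 1][0],
                              sum(xs) / len(xs), sum(ys) / len(ys)))
            i = j
        else:
            i += 1  # too short or immediately dispersed: slide the window
    return fixations
```

Fed a stream in which the gaze hovers near one point for 100 ms and then jumps 10 degrees away, the detector reports two fixations separated by the saccade; an eye-controlled interface would then map each fixation centroid onto the widget under it.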
