Interaction Design of a Train Ticket Vending Machine System

Keywords: train ticket vending machine; human-computer interaction; interaction system; interface design

Our country's railways have entered a period of rapid development, but ticket sales at station windows cannot keep up with the fast pace of modern life, so the automatic ticket vending system came into being. The train ticket vending machine deserves further consideration of how friendly the interaction between the machine and its customers is, and of how practical and complete its functions are. This text analyzes every element of the human-machine interaction system of the ticket vending machine and works out the method and fundamental principles for its design. Taking a user-oriented approach, we identify the disadvantages of the present ticket vending machine and users' demands on it in the future. The direction of the previous design is then improved and a schema of the polished design is presented, providing a reference for the future development of ticket vending machine interaction system design.
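The thesis excerpt contains no code; purely as an illustration, the following Python sketch models such a purchase dialogue as a small finite state machine. All screen names and transitions are hypothetical examples of the kind of flow an interaction design like this would specify.

from enum import Enum, auto

class Screen(Enum):
    # Hypothetical screens of a ticket-purchase dialogue.
    WELCOME = auto()
    SELECT_DESTINATION = auto()
    SELECT_TRAIN = auto()
    CONFIRM_ORDER = auto()
    PAYMENT = auto()
    DISPENSE_TICKET = auto()
    FINISHED = auto()

# Transitions the interface allows; any screen may also return to WELCOME on cancel or timeout.
TRANSITIONS = {
    Screen.WELCOME: {Screen.SELECT_DESTINATION},
    Screen.SELECT_DESTINATION: {Screen.SELECT_TRAIN},
    Screen.SELECT_TRAIN: {Screen.CONFIRM_ORDER, Screen.SELECT_DESTINATION},
    Screen.CONFIRM_ORDER: {Screen.PAYMENT, Screen.SELECT_TRAIN},
    Screen.PAYMENT: {Screen.DISPENSE_TICKET, Screen.CONFIRM_ORDER},
    Screen.DISPENSE_TICKET: {Screen.FINISHED},
}

def next_screen(current: Screen, requested: Screen) -> Screen:
    """Move to the requested screen if the design allows it, otherwise stay put."""
    if requested is Screen.WELCOME or requested in TRANSITIONS.get(current, set()):
        return requested
    return current

if __name__ == "__main__":
    screen = Screen.WELCOME
    for step in (Screen.SELECT_DESTINATION, Screen.SELECT_TRAIN,
                 Screen.CONFIRM_ORDER, Screen.PAYMENT,
                 Screen.DISPENSE_TICKET, Screen.FINISHED):
        screen = next_screen(screen, step)
        print("now showing:", screen.name)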

DOGeye: Controlling your home with eye interaction
Dario Bonino, Emiliano Castellina, Fulvio Corno, Luigi De Russis
Politecnico di Torino, Dipartimento di Automatica ed Informatica, Corso Duca degli Abruzzi 24, 10129 Torino, Italy
Received 29 October 2010. Revised 15 April 2011. Accepted 15 June 2011. Available online 23 June 2011. http://dx.doi.org/10.1016/j.intcom.2011.06.002

Abstract
Nowadays home automation, with its increased availability and reliability and its ever reducing costs, is gaining momentum and is starting to become a viable solution for enabling people with disabilities to autonomously interact with their homes and to better communicate with other people. However, especially for people with severe mobility impairments, there is still a lack of tools and interfaces for effective control of, and interaction with, home automation systems, and general-purpose solutions are seldom applicable due to the complexity, asynchronicity, time-dependent behavior, and safety concerns typical of the home environment. This paper focuses on user-environment interfaces based on eye tracking technology, which is often the only viable interaction modality for such users. We propose an eye-based interface tackling the specific requirements of smart environments, already outlined in a public Recommendation issued by the COGAIN European Network of Excellence. The proposed interface has been implemented as a software prototype based on the ETU universal driver, thus being potentially able to run on a variety of eye trackers, and it is compatible with a wide set of smart home technologies, handled by the Domotic OSGi Gateway. A first interface evaluation, with user testing sessions, has been carried out, and the results show that the interface is quite effective and usable without discomfort by people with almost regular eye movement control.
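As a rough illustration of the layering described in the abstract, the sketch below keeps the user interface independent of both the eye tracker and the house. GazeSource, HomeGateway, EyeHomeController and every method name are assumptions made up for this example; they are not the actual ETU driver or Domotic OSGi Gateway APIs.

from typing import Protocol, Tuple

class GazeSource(Protocol):
    """Whatever provides gaze points; a real prototype would wrap an eye tracker driver here."""
    def read_gaze(self) -> Tuple[float, float]: ...

class HomeGateway(Protocol):
    """Whatever executes house commands; a real prototype would call a domotic gateway here."""
    def send_command(self, device_id: str, command: str) -> None: ...

class EyeHomeController:
    """Glue layer: the interface depends only on the two abstractions, so either side can be swapped."""
    def __init__(self, gaze: GazeSource, home: HomeGateway) -> None:
        self.gaze = gaze
        self.home = home

    def activate(self, device_id: str, command: str) -> None:
        # A full interface would derive the device and command from what the user is looking at;
        # here we only read the gaze once and forward the command, to show the call path.
        x, y = self.gaze.read_gaze()
        print(f"selection made while gaze was at ({x}, {y})")
        self.home.send_command(device_id, command)

# Minimal fakes so the sketch runs without any hardware.
class FakeTracker:
    def read_gaze(self) -> Tuple[float, float]:
        return (512.0, 384.0)

class FakeGateway:
    def send_command(self, device_id: str, command: str) -> None:
        print(f"{device_id} <- {command}")

controller = EyeHomeController(FakeTracker(), FakeGateway())
controller.activate("livingroom.lamp", "ON")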

Highlights
- Lets people with impaired mobility interact autonomously with their smart homes.
- Reports principles for realizing an application for house control with eye tracking.
- Design and development of a multimodal eye-based interface for environmental control.
- Qualitative and quantitative evaluation, through a user study, of the proposed interface.

Keywords: Human-home interaction; Smart homes; Domotics; Usability; User interface; User study

1. Introduction
In the last 5 years, (smart) home automation has gained new momentum, thanks to an increased availability of commercial solutions and to steadily reducing costs. The evergreen appeal of automated, intelligent homes, together with a rising technology maturity, has fostered new research challenges and opportunities in the field of “intelligent” or “smart” environments. According to Mark Weiser's definition, a Smart Home system, which in this paper we treat as a domotic or environmental control system, is “a physical world that is richly and invisibly interwoven with sensors, actuators, displays and computational elements, embedded seamlessly in the everyday objects of our lives, and connected through a continuous network” (Weiser, 2021), providing ways for controlling, interacting with and monitoring the house. The idea behind this vision is that the homes of tomorrow will be smart enough to control themselves, understand the contexts in which they operate, and perform suitable actions under the inhabitants' supervision (Bierhoff et al., 2021). Although smart and autonomous homes may raise controversial opinions on how smart they are or should be, currently available commercial solutions can start playing a relevant role as an enabling technology for improving the care of the elderly (Berlo, 2021 and Zhang et al., 2021) and of people with disabilities (Chikhaoui and Pigot, 2021; Chan et al., 2021), reducing their daily workload in the house and enabling them to live more autonomously and with a better quality of life. Even if such systems are far from cutting-edge research solutions, they are still really complex to master, since they handle and coordinate several devices and appliances with different functionalities and different control granularities.

In particular, among other disabilities, people who have severely impaired motor abilities can take great advantage of eye tracking systems to control their homes, since they generally retain normal control of their eyes, which therefore become their preferential stream of interaction (Hornof and Cavender, 2021). Eye tracking can transform such a limited ability into both a communication channel and an interaction medium, opening possibilities for computer-based communication and control solutions (Donegan et al., 2021). Even if eye tracking is often used for registering eye movements in usability studies, it can also be successfully exploited as an alternative input modality to control user interfaces. Home automation can then bridge the gap between software and tangible objects, enabling people with motor disabilities to effectively and physically engage with their surroundings (Andrich et al., 2021).
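As a generic illustration of gaze used as an input modality (this is not code from the paper; the 0.8 s dwell threshold and all names are assumptions), a target is selected once the gaze rests on it long enough:

import time
from typing import Optional

DWELL_SECONDS = 0.8  # assumed threshold; real systems tune this per user

class DwellSelector:
    """Fires a selection when the gaze rests on the same target for DWELL_SECONDS."""

    def __init__(self, dwell_seconds: float = DWELL_SECONDS) -> None:
        self.dwell_seconds = dwell_seconds
        self._target = None
        self._since = 0.0

    def update(self, target, now: Optional[float] = None):
        """Feed the target currently under the gaze (or None); return it once the dwell completes."""
        now = time.monotonic() if now is None else now
        if target != self._target:
            # Gaze moved to a different target (or off all targets): restart the timer.
            self._target, self._since = target, now
            return None
        if target is not None and now - self._since >= self.dwell_seconds:
            self._since = now  # restart so the selection is not re-fired on every following sample
            return target
        return None

# Example: a gaze resting on the "kitchen.light" button for 0.9 s triggers a selection.
selector = DwellSelector()
selector.update("kitchen.light", now=0.0)
print(selector.update("kitchen.light", now=0.9))  # -> kitchen.light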

Several house control interfaces have been proposed in the literature, i.e., applications that allow users to control different types of devices in their homes, to handle triggered alarms, etc. Such interfaces, based either on conventional unimodal (Koskela and Väänänen-Vainio-Mattila, 2021) or multimodal interactions (Weingarten et al., 2021) (e.g., mouse, remote controller, etc.), are too often uncomfortable and/or useless for people with severely impaired motor abilities, and only a few of them have been specifically designed and developed to be controlled with eye movements.

In 2021, applications based on gaze interaction were analyzed by a European Network of Excellence named COGAIN (Communication by Gaze Interaction) to evaluate the state of the art and to identify potential weaknesses and future developments. According to the report “D2.4 A survey of Existing de facto Standards and Systems of Environmental Control” (Bates et al., 2021), the COGAIN Network identified different problems in eye-based house control applications, such as the lack of advanced functionalities for controlling some appliances of the house, the absence of interoperability between different smart house systems, or the difficulty of using an eye tracker to perform some actions. In a subsequent report (Corno et al., 2021), COGAIN members proposed solutions to overcome the discovered problems. In particular, they proposed 21 guidelines to promote safety and accessibility in eye tracking based environmental control applications.

This paper describes the design and development of DOGeye, one of the first home control applications designed for gaze-based interaction and explicitly accounting for the COGAIN guidelines. DOGeye is a multimodal eye-based application for home management and control, based on state-of-the-art technologies in both eye tracking and home control. It enables people to control their domotic homes through different input devices, possibly combined, so that it is not limited to eye tracking only. The presence of various input modalities allows the application to be used by other people present in the house and offers alternatives to persons affected by possibly evolving impairments such as ALS (Amyotrophic Lateral Sclerosis).
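One way such multimodality can be organized, sketched here purely as an illustration and not as DOGeye's actual implementation (InputEvent, merge_inputs and the modality names are assumed): events from every available device are normalized into a single stream before they reach the interface logic.

from dataclasses import dataclass
from typing import Iterable, Iterator

@dataclass
class InputEvent:
    """Device-independent event that the interface logic consumes."""
    source: str  # e.g. "eye_tracker", "mouse", "touch"
    kind: str    # e.g. "point", "select"
    x: float
    y: float

def merge_inputs(*streams: Iterable[InputEvent]) -> Iterator[InputEvent]:
    """Interleave events from every available modality into one stream.
    A real application would do this asynchronously; round-robin is enough to show the idea."""
    iterators = [iter(s) for s in streams]
    while iterators:
        for it in list(iterators):
            try:
                yield next(it)
            except StopIteration:
                iterators.remove(it)

# Example: gaze and mouse events handled uniformly by the same interface code.
gaze_events = [InputEvent("eye_tracker", "select", 120, 80)]
mouse_events = [InputEvent("mouse", "select", 400, 300)]
for event in merge_inputs(gaze_events, mouse_events):
    print(f"{event.source}: {event.kind} at ({event.x}, {event.y})")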

The remainder of the paper is organized as follows: Section 2 presents the basic features of eye tracking technology and the characteristics of eye-based user interfaces, while Section 3 presents the work accomplished by the members of the COGAIN Network and describes the COGAIN guidelines for eye tracking based environmental control applications. Section 4 reports relevant related works and findings. The DOGeye design and architecture are described in Section 5, while Sections 6 and 7 report the setup and results of a user test involving people in a controlled environment, thus building the basis for further considerations and research. Section 8 concludes the paper and outlines future work.

2. Eye tracking basics
To better understand the principles and implementation of eye-controlled interfaces, this section defines some terms and features pertaining to eye movements and eye tracking. The eye does not generally move smoothly over the visual field; instead, it makes a series of quick jumps, called saccades, along with other specialized movements (Haber and Hershenson, 1973). A saccade lasts 30-120 ms and typically covers 15-20 degrees of visual angle (Jacob, 1995). Between saccades, the gaze point, i.e., the point in a scene where a person is looking, stays at the same location.
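To make these definitions concrete, here is a minimal dispersion-threshold fixation detector, not taken from the paper; the 1-degree dispersion and 100 ms duration thresholds are assumptions chosen only for the example. It groups consecutive gaze samples into a fixation while they stay within a small spatial window, and a saccade (a large jump) closes the window.

from typing import List, Tuple

def detect_fixations(
    samples: List[Tuple[float, float, float]],  # (timestamp_s, x_deg, y_deg)
    max_dispersion_deg: float = 1.0,            # assumed spatial threshold
    min_duration_s: float = 0.1,                # assumed temporal threshold
) -> List[Tuple[float, float, float]]:
    """Return (start_time, centroid_x, centroid_y) for each detected fixation."""
    fixations = []
    window: List[Tuple[float, float, float]] = []
    for sample in samples:
        window.append(sample)
        xs = [s[1] for s in window]
        ys = [s[2] for s in window]
        dispersion = (max(xs) - min(xs)) + (max(ys) - min(ys))
        if dispersion > max_dispersion_deg:
            # The new sample jumped away (a saccade): close the previous window if it lasted long enough.
            previous = window[:-1]
            if previous and previous[-1][0] - previous[0][0] >= min_duration_s:
                cx = sum(s[1] for s in previous) / len(previous)
                cy = sum(s[2] for s in previous) / len(previous)
                fixations.append((previous[0][0], cx, cy))
            window = [sample]
    # Flush the last window.
    if window and window[-1][0] - window[0][0] >= min_duration_s:
        cx = sum(s[1] for s in window) / len(window)
        cy = sum(s[2] for s in window) / len(window)
        fixations.append((window[0][0], cx, cy))
    return fixations

# Example: gaze rests near (10, 5) for 150 ms, jumps, then rests near (25, 5).
samples = [(0.00, 10.0, 5.0), (0.05, 10.2, 5.1), (0.10, 9.9, 5.0), (0.15, 10.1, 4.9),
           (0.20, 25.0, 5.0), (0.25, 25.1, 5.1), (0.30, 24.9, 5.0), (0.35, 25.0, 4.9)]
print(detect_fixations(samples))  # two fixations, one per resting position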
