Human Computer Interface (HCI)

Top PDF Human Computer Interface (HCI):

Customization of Human Computer Interface Guided by Ontological Approach in Web 2  0

The philosophy of Web 2.0 applications is based on democratizing access to information by giving users with modest technical knowledge the possibility to participate in and improve web functions. This article is situated in that context; it is part of a series of studies conducted by the team on Web 2.0 development and the participatory design of web interfaces. It proposes a new technique for designing web interfaces that involves the user at every stage of the process. Internet users contribute to interface design by answering questionnaires suggested by the system. These questionnaires are integrated into an ontology of the domain of human-computer interface ergonomics. The answers allow a semantic classification of profiles according to a vector model and are then used to build a user ontology. The system can thus assign each user to a definite profile based on their ergonomic interests and decide which interfaces to recommend for each type of profile. The article consists of three sections: the first presents HCI technologies related to Web 2.0, the second discusses participatory design and interface evaluation methods, and the third advances an HCI evaluation model guided by an ontological approach to help the system make decisions about interfaces. A conclusion closes the article.
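
As a rough illustration of the vector-model profile matching described above, the sketch below scores a user's questionnaire answers against reference profiles by cosine similarity; the criteria, profile names, and weights are invented for illustration and are not taken from the authors' ontology.

```python
# Minimal sketch of vector-model profile matching. The criterion names and
# profile vectors are illustrative assumptions, not the authors' ontology.
import math

def cosine(u, v):
    """Cosine similarity between two equally long numeric vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    norm = math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v))
    return dot / norm if norm else 0.0

# Reference profiles: weights over hypothetical ergonomic criteria
# (readability, navigation density, colour contrast, interaction speed).
PROFILES = {
    "minimalist": [0.9, 0.2, 0.6, 0.8],
    "content-rich": [0.4, 0.9, 0.5, 0.3],
    "accessibility": [0.8, 0.3, 0.9, 0.4],
}

def classify(answers):
    """Map questionnaire answers (one score per criterion) to the closest profile."""
    return max(PROFILES, key=lambda name: cosine(answers, PROFILES[name]))

print(classify([0.7, 0.3, 0.8, 0.5]))  # -> "accessibility" for this toy input
```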

Classification of EEG Signals for Human Computer Interface (HCI) Application

Human computer interfaces (HCIs) are common nowadays. Interfaces such as joysticks are usually used to steer electric wheelchairs. There is a huge demand, however, for HCIs that can be used in situations where such a typical interface is not an option. Brain-computer interfaces (BCIs) are one of the alternatives available to address this problem.
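
For readers unfamiliar with how such EEG-based interfaces are typically built, the following sketch classifies synthetic EEG epochs from band-power features with a linear discriminant. It is a generic illustration, not the pipeline used in the paper; the sampling rate and frequency bands are assumptions.

```python
# Illustrative sketch (not the paper's pipeline): classify single-channel EEG
# epochs into two commands using band-power features and an LDA classifier.
import numpy as np
from scipy.signal import welch
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

FS = 250  # assumed sampling rate in Hz

def band_power(epoch, lo, hi):
    """Average PSD of one epoch within [lo, hi] Hz."""
    freqs, psd = welch(epoch, fs=FS, nperseg=FS)
    mask = (freqs >= lo) & (freqs <= hi)
    return psd[mask].mean()

def features(epochs):
    """Alpha (8-13 Hz) and beta (13-30 Hz) power per epoch."""
    return np.array([[band_power(e, 8, 13), band_power(e, 13, 30)] for e in epochs])

# Synthetic stand-in data: 40 one-second epochs with binary command labels.
rng = np.random.default_rng(0)
epochs = rng.standard_normal((40, FS))
labels = np.repeat([0, 1], 20)

clf = LinearDiscriminantAnalysis().fit(features(epochs), labels)
print(clf.predict(features(epochs[:3])))  # predicted commands for 3 epochs
```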

Development of Human Computer Interface for Autopilot Control of Unmanned Aerial Vehicle

Abstract: As a result of the limitations of human cognitive skills, judgment, decision-making, and tactical understanding in the use of Unmanned Aerial Vehicles (UAVs), there is a need to redesign the current human-computer interface (HCI) for autopilot control to improve the interaction and communication links between operators and the UAVs. The system displays information to increase situational awareness: the operator sees everything, interprets it, makes appropriate decisions, and is able to implement them. Multiple interfaces are developed using C's Graphical User Interface (GUI) capabilities as a simulation environment. Both alternatives combine buttons and place them in sequential order according to the steps needed to initialize UAVs and flight paths. Usability tests with participants are conducted to measure their performance against previously determined metrics: the time it takes to train a participant on how to use the interface, the time to complete a task, the number of errors that occur during the task, and the satisfaction level. Based on the simulation results, the alternatives are redesigned and retested in order to achieve improved performance. An interface that reduces the cognitive workload on the operator, and thereby allows better situational awareness of the environment, is determined.
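
A minimal sketch of how the listed usability metrics might be aggregated across participants is given below; the session records and field names are placeholders rather than data from the study.

```python
# Sketch of aggregating the usability metrics named above (training time,
# task time, error count, satisfaction) across participants; the values and
# field names are placeholders, not results from the paper.
from statistics import mean

sessions = [
    {"train_s": 310, "task_s": 95, "errors": 2, "satisfaction": 4.0},
    {"train_s": 280, "task_s": 120, "errors": 4, "satisfaction": 3.5},
    {"train_s": 340, "task_s": 88, "errors": 1, "satisfaction": 4.5},
]

def summarize(sessions):
    """Per-metric mean across participants for one interface alternative."""
    return {key: round(mean(s[key] for s in sessions), 2) for key in sessions[0]}

print(summarize(sessions))
```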

Advanced Human Computer Interface and Voice Processing Applications in Space

Julie Payette, Canadian Space Agency.

Selection of suitable hand gestures for reliable myoelectric human computer interface

This work has shown the importance of selecting the set of hand gestures that can be accurately recognized for a reliable myoelectric prosthetic hand or other human computer interface systems. While the selection of gestures depends on the specific application and may differ between sEMG analysis and classification techniques, the outcome of this work shows that the selection should be performed based on a ranking of sensitivity and specificity. This identifies the gestures that would lead to high error and should therefore be discarded. In this study, from an initial set of ten hand gestures, comprising five finger flexions and five functional hand gestures, a set of six gestures was identified that gave sensitivity and specificity greater than 95%.
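
The selection rule described here can be sketched as follows: compute per-gesture sensitivity and specificity from a confusion matrix and keep only the gestures exceeding the 95% level. The confusion matrix and gesture names below are synthetic, not the study's data.

```python
# Sketch of gesture selection by sensitivity/specificity ranking.
import numpy as np

def sensitivity_specificity(cm):
    """Per-class sensitivity and specificity from a square confusion matrix
    (rows = true gesture, columns = predicted gesture)."""
    cm = np.asarray(cm, dtype=float)
    tp = np.diag(cm)
    fn = cm.sum(axis=1) - tp
    fp = cm.sum(axis=0) - tp
    tn = cm.sum() - (tp + fn + fp)
    return tp / (tp + fn), tn / (tn + fp)

def select_gestures(cm, names, threshold=0.95):
    """Keep gestures whose sensitivity and specificity both exceed the threshold."""
    sens, spec = sensitivity_specificity(cm)
    return [n for n, se, sp in zip(names, sens, spec) if se >= threshold and sp >= threshold]

cm = [[48, 1, 1],
      [2, 40, 8],
      [1, 9, 40]]
print(select_gestures(cm, ["fist", "point", "pinch"]))  # only the well-separated gesture remains
```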

The Vocal Joystick: A Voice Based Human Computer Interface for Individuals with Motor Impairments

We present a novel voice-based human-computer interface designed to enable individuals with motor impairments to use vocal parameters for continuous control tasks. Since discrete spoken commands are ill-suited to such tasks, our interface exploits a large set of continuous acoustic-phonetic parameters like pitch, loudness, vowel quality, etc. Their selection is optimized with respect to automatic recognizability, communication bandwidth, learnability, suitability, and ease of use. Parameters are extracted in real time, transformed via adaptation and acceleration, and converted into continuous control signals. This paper describes the basic engine, prototype applications (in particular, voice-based web browsing and a controlled trajectory-following task), and initial user studies confirming the feasibility of this technology.
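
A toy sketch of the underlying idea, mapping continuously extracted vocal parameters to a 2-D control signal, is shown below. It is a simplification under assumed calibration values, not the Vocal Joystick engine itself, which additionally relies on parameters such as vowel quality.

```python
# Toy mapping from vocal parameters to cursor velocity. The pitch and loudness
# values are assumed to come from a real-time acoustic front end (not shown);
# rest values and gain are illustrative calibration assumptions.
def control_signal(pitch_hz, loudness, rest_pitch=150.0, rest_loudness=0.05, gain=40.0):
    """Deviation from a calibrated 'rest' voice drives horizontal and vertical speed."""
    dx = gain * (pitch_hz - rest_pitch) / rest_pitch          # higher pitch -> move right
    dy = -gain * (loudness - rest_loudness) / rest_loudness   # louder -> move up (negative screen-y)
    return dx, dy

print(control_signal(180.0, 0.08))  # -> (8.0, -24.0): rightward and upward motion
```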

A Real Time Model Based Human Motion Tracking and Analysis for Human Computer Interface Systems

Human motion tracking and analysis has many applications, such as surveillance systems and human computer interface (HCI) systems. A vision-based HCI system needs to locate and understand the user's intention or action in real time from CCD camera input. Human motion is a highly complex articulated motion. The inherent non-rigidity of human motion, coupled with shape variation and self-occlusions, makes the detection and tracking of human motion a challenging research topic. This paper presents a framework for tracking and analyzing human motion with the following aspects: (a) real-time operation, (b) no markers on the human subject, (c) near-unconstrained human motion, and (d) data coordination from two views.
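
The sketch below shows a generic marker-less motion-detection loop using OpenCV background subtraction; it is only a stand-in illustration of real-time, marker-free tracking and not the model-based, two-view framework proposed in the paper.

```python
# Generic real-time motion detection with background subtraction (OpenCV).
# This is a stand-in illustration, not the paper's model-based framework.
import cv2

cap = cv2.VideoCapture(0)                      # CCD/webcam input
subtractor = cv2.createBackgroundSubtractorMOG2(history=200, detectShadows=False)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    mask = subtractor.apply(frame)             # foreground (moving person) mask
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    for c in contours:
        if cv2.contourArea(c) > 2000:          # ignore small noise blobs
            x, y, w, h = cv2.boundingRect(c)
            cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
    cv2.imshow("motion", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

cap.release()
cv2.destroyAllWindows()
```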

Design and prototyping of a human computer interface for a desktop tele classroom conference application

SUN workstations are hardly used anymore at the University of Twente, so it should no longer be a requirement to use SUNs to develop (part of) a DTC application. The most widely supported environment within EWI at this moment is an Intel x86-compatible PC running MS Windows XP, so every newly built prototype should be able to run at least on such a system. This is also the system used by the majority of computer users [59]. Even so, it is advisable to make future designs as platform-independent as possible. The operating system can still vary (Windows XP, Vista, Linux) and not all hardware is always compatible (Apple, mobile devices, etc.). This might not be possible for the DTC system as a whole, since it still depends on hardware for, e.g., video and audio acquisition, but the Graphical User Interface (GUI) part of the HCI for the DTC system can be made machine independent.

A Portable Wireless Head Movement Controlled Human-Computer Interface for People with Disabilities

The ability to operate a computer mouse has become increasingly important to people with disabilities, especially as the advancement of technology allows more and more functions to be controlled by computer. There are many reasons for people with disabilities to operate a computer. For instance, they need to acquire new knowledge and communicate with the outside world through the Internet. In addition, they need to work at home, enjoy leisure activities, and manage many other things, such as home shopping and internet banking. This research focuses on a tilt-sensor-controlled computer mouse. Tilt sensors, or inclinometers, detect the angle between a sensing axis and a reference vector such as gravity or the earth's magnetic field. In medical science, tilt sensors have been used mainly in occupational medicine research; for example, their application in gait analysis is currently being investigated. Andrews et al. [20] used tilt sensors attached to a floor-reaction-type ankle-foot orthosis as a biofeedback source, via an electrocutaneous display, to improve postural control during functional electrical stimulation (FES) standing. Bowker and Heath [21] recommended using a tilt sensor to synchronize peroneal nerve stimulation to the gait cycle of hemiplegics by monitoring angular velocity. More generally, tilt sensors have potential applications in improving the abilities of persons with other disabilities [18]. The system described here uses MEMS accelerometers to detect the user's head tilt in order to direct mouse movement on the computer screen. Clicking of the mouse is activated by the user's eyebrow movement through a sensor. The keyboard function is designed to allow the user to scroll letters with head tilt, with eyebrow movement as the selection mechanism. A voice recognition section is also present in the head unit to identify letters spoken by the paralyzed user. The tilt sensors can sense the operator's head motion up, down, left, and right, and the cursor direction is determined accordingly.
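
The core mapping from head tilt to cursor motion can be sketched as below; read-out of the MEMS accelerometer is abstracted away, and the dead zone and gain values are illustrative assumptions rather than parameters from the paper.

```python
# Sketch of the core mapping: accelerometer tilt angles -> cursor displacement.
# Raw (ax, ay, az) readings are assumed to come from the MEMS sensor driver;
# the dead zone and gain are illustrative values.
import math

DEAD_ZONE_DEG = 5.0   # ignore small involuntary head movements
GAIN = 3.0            # pixels per degree beyond the dead zone

def tilt_angles(ax, ay, az):
    """Pitch/roll in degrees from raw accelerometer readings (gravity vector)."""
    pitch = math.degrees(math.atan2(ax, math.sqrt(ay * ay + az * az)))
    roll = math.degrees(math.atan2(ay, math.sqrt(ax * ax + az * az)))
    return pitch, roll

def cursor_delta(pitch, roll):
    """Convert head tilt to (dx, dy); zero inside the dead zone."""
    def scale(a):
        if abs(a) <= DEAD_ZONE_DEG:
            return 0.0
        return GAIN * (abs(a) - DEAD_ZONE_DEG) * (1 if a > 0 else -1)
    return scale(roll), scale(pitch)   # roll -> left/right, pitch -> up/down

print(cursor_delta(*tilt_angles(0.25, -0.10, 0.96)))  # small left, larger downward move
```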

Character Recognition in a Human-Computer Interface Environment for Users with Motor disabilities

Gestures are of high importance in the fields of Human-Computer Interaction and Computer Vision. In particular, there are many applications that use hand-gesture recognition, such as sign-language recognition, or that use a finger as a pointer during video-conference presentations. From this comes the idea of our research, in which we examined the applicability of these mathematical models in more complex settings, such as severe motor disability combined with pathologies of the nervous system that can lead to serious communication difficulties for the patient. Over the years, different approaches have been developed to make the patient independent in some way. Each of these solutions has potential disadvantages depending on the spinal pathology in place. Our objective was to obtain software able to guarantee the patient a good level of autonomy in issuing commands to a centralized home automation server. The only limitation of this solution is that the user must have minimal motility of the head, lips, and eyelids. On the other hand, the system does not require the patient to be connected to any sensors or electrodes. The user himself activates and deactivates the streaming of commands. The algorithm takes as input the video signal of a webcam pointed at the user, identifies the face, extracts the main landmarks such as the eyes, the nose, and the mouth, and waits for the patient's command. In our case, a beat of the eyelashes is used by the subject to express the intention to write a command. The algorithm then switches state and starts tracking the tip of the nose, which the patient moves in order to describe the first letter
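
The interaction flow just described (blink to toggle command mode, then trace a letter with the nose tip) can be sketched as a small state machine; the landmark detector is left as a hypothetical placeholder and the eye-aspect-ratio threshold is an assumed value, not a parameter from the paper.

```python
# State-machine sketch of the described flow: a blink toggles "command mode",
# after which the nose-tip trajectory is recorded. detect_landmarks() is a
# hypothetical placeholder for any face-landmark detector.
import math

def detect_landmarks(frame):
    """Placeholder: return {'left_eye': [...], 'right_eye': [...], 'nose_tip': (x, y)}."""
    raise NotImplementedError

def eye_aspect_ratio(eye):
    """Classic EAR: vertical eye opening over horizontal width (6 eye points)."""
    d = math.dist
    return (d(eye[1], eye[5]) + d(eye[2], eye[4])) / (2.0 * d(eye[0], eye[3]))

def run(frames, blink_threshold=0.2):
    writing, eyes_closed = False, False
    trajectory = []                      # nose-tip path used to trace a letter
    for frame in frames:
        lm = detect_landmarks(frame)
        ear = min(eye_aspect_ratio(lm["left_eye"]), eye_aspect_ratio(lm["right_eye"]))
        closed = ear < blink_threshold
        if closed and not eyes_closed:   # rising edge of a blink toggles command mode
            writing = not writing
            trajectory = []
        eyes_closed = closed
        if writing and not closed:
            trajectory.append(lm["nose_tip"])
    return trajectory
```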

Securing Human Computer Interaction

In general, human-computer interaction is "a discipline concerned with the design, evaluation and implementation of interactive computing systems for human use and with the study of major phenomena surrounding them" [1]. HCI (human-computer interaction) devices mainly deal with how people interact with computers and with the extent to which computers are capable of interacting successfully with humans. As a field of research, human-computer interaction sits at the intersection of several disciplines such as computer science, the behavioral sciences, design, media studies, and several other fields. The human computer interface is defined as the point of communication in such cognitive transactions between the computer and human beings; the communication flow is known as the loop of communication. Several aspects contribute to the success of the cognitive ability of such interfaces, and one of the principal ones is security in human-computer interactive devices. In devices where no human-like behavior or intervention is needed, implementing security protocols is relatively easier; by contrast, the extent to which a human-computer interaction system can be user-friendly may be curtailed when security measures have to be taken [2]. As most HCI systems require internet access, a great number of security threats emerge. If users do not know how to use the interface, their systems

Title: Accelerometer Based Mouse

In this scheme, a simple inertial navigation sensor such as an accelerometer can be used to obtain a dynamic or static acceleration profile to move the mouse pointer or even rotate a 3-D object. In our paper, a human computer interface system is presented which is able to act as an improved form of one of the most common interfacing devices, the computer mouse. It enables computer control for persons who do not want to use a conventional HCI (human computer interface) or are not able to use one, and this is attained by using an accelerometer device mounted on the human wrist or elsewhere on the body.

Computer users' perceptions of Indonesian online business webpages based on human computer interface and RoR framework web programming

A web browser is a computer client that is used to retrieve and then display web documents. Generally, a web browser supports HTML documents, the main document format on the web, as well as several common image formats such as GIF and JPEG [14]. To retrieve documents, web browsers interact with a web server using the HTTP protocol (Hypertext Transfer Protocol) [11]. An individual website consists of many web pages containing HTML documents and supported image formats, as in Figure 1. In order to read a web page, a web browser is required, such as Netscape Navigator, Microsoft Internet Explorer, or Mozilla Firefox [8].
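
As an illustration of the browser/server exchange mentioned above, a minimal HTTP fetch of an HTML document using only the Python standard library might look like the sketch below; the URL is an example.

```python
# Minimal illustration of retrieving an HTML document over HTTP,
# the same request/response exchange a browser performs.
from urllib.request import urlopen

with urlopen("http://example.com/") as response:   # HTTP GET request
    print(response.status)                         # e.g. 200 on success
    html = response.read().decode("utf-8")         # the HTML document body

print(html[:80])                                   # first characters of the page
```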

Biomechanical Signals Human Computer Interface for Severe Motor Disabilities

The first part of the equipment is the adapted interface. This interface, as explained above, is responsible for receiving the user's voluntary winks. For the detection of these biomechanical signals, different interfaces have been implemented by the research group [1]. The chosen interface works on the basis of whether or not light is reflected, according to a user's voluntary gesture. This task is carried out with a CNY70 device, which has a light emitter and an infrared phototransistor. The emitted LED light reflects on a surface and, depending on the nature of the surface, is reflected back or not. In this case, a bicolour adhesive tape (black and white) on the skin over the user's orbicularis oculi muscle, at the edge of the eye, is used to achieve correct performance (Figure 2). The adhesive tape moves when a voluntary wink is performed, and the consequent movement causes a colour change and, hence, a change in the output of the device.
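
Turning the CNY70 output into discrete wink events amounts to thresholding the reflectance signal and reacting to its edges, roughly as in the sketch below; the threshold voltage and sample values are illustrative, not measurements from the prototype.

```python
# Sketch of converting the CNY70 reflectance reading into wink events: the
# black/white tape shift changes the phototransistor output, and a threshold
# with edge detection yields one event per voluntary wink.
THRESHOLD_V = 1.5          # illustrative switching level between white and black tape

def wink_events(samples):
    """Yield True once per low-to-high transition of the reflectance signal."""
    previous_high = False
    for v in samples:
        high = v > THRESHOLD_V
        if high and not previous_high:
            yield True     # one voluntary wink detected
        previous_high = high

print(sum(wink_events([0.4, 0.5, 2.1, 2.2, 0.6, 2.3])))  # -> 2 winks
```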

Head Pose Estimation Using Convolutional Neural Networks

Detection and estimation of head pose is a fundamental problem in many applications such as automatic face recognition, intelligent surveillance, and perceptual human-computer interfaces; in an application like driving, the pose of the driver is used to estimate gaze and alertness, and the faces in the images are non-frontal with various poses. In this work, the head pose of the person is used to detect alertness. Convolutional neural networks are used to train and classify various head poses: frontal, left, right, up, and down. The system locates distinctive features such as the nose and eyes with high accuracy and provides a robust estimate of the head pose.
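
A minimal convolutional classifier for the five pose classes mentioned above could look like the PyTorch sketch below; the input size and layer configuration are assumptions for illustration, not the network used in this work.

```python
# Minimal PyTorch sketch of a 5-way head-pose classifier (frontal, left, right,
# up, down); the 64x64 grayscale input and layer sizes are assumptions.
import torch
import torch.nn as nn

class HeadPoseCNN(nn.Module):
    def __init__(self, num_poses=5):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),   # 64 -> 32
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),  # 32 -> 16
        )
        self.classifier = nn.Linear(32 * 16 * 16, num_poses)

    def forward(self, x):
        x = self.features(x)
        return self.classifier(x.flatten(1))

model = HeadPoseCNN()
logits = model(torch.randn(8, 1, 64, 64))   # batch of 8 face crops
print(logits.argmax(dim=1))                 # predicted pose index per image
```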

Stakeholder’s Perspective of Clinical Decision Support System

Clinical Decision Support Systems (CDSS) offer potential opportunities to improve the overall safety, quality, and cost-effectiveness of healthcare. CDSS have existed for more than four decades, but their adoption rate by medical communities is not encouraging, even in the countries that pioneered them. At many sites, adoption was problematic, stalled in the planning stages, or never even attempted. To date, CDSS are considered only partially successful. Several current challenges have not been adequately addressed during CDSS development. According to recent research, the challenges are: improve the human-computer interface; disseminate best practices in CDSS design, development, and implementation; summarize patient-level information; prioritize and filter recommendations to the user; create an architecture for sharing executable CDSS modules and services; combine recommendations for patients with co-morbidities; prioritize CDSS content development and implementation; create Internet-accessible clinical decision support repositories; use free-text information to drive clinical decision support; and mine huge clinical databases to create new CDSS. These items are considered challenges because of the unmet expectations of CDSS stakeholders, such as the CDSS product development and maintenance team (product owner, project managers, system architect, system designers, system developers, system administrators, and system maintenance team), sales and marketing personnel, and end users. We found that most of the CDSS literature discusses these challenges in detail but does not shed enough light on the perspective of CDSS stakeholders while building and using CDSS. This paper describes CDSS from various stakeholders' perspectives, highlighting the challenges they face in owning, building, and using them.

A Study of Deep Learning Technique and Its Application in Medical Image Processing

Das [4] presents a brief survey noting that speech is the primary and most convenient means of communication between people, and that spoken communication between humans and computers forms a natural human computer interface. Speech has the potential to be an important mode of interaction with the computer. The paper gives an overview of the major technological perspectives and the fundamental progress of speech recognition, and surveys the techniques developed at each stage of speech recognition, which helps in choosing a technique along with its relative merits and demerits. A comparative study of different techniques is carried out. The paper concludes with a decision on future directions for developing speech recognition techniques for human computer interface systems in different mother tongues; it also covers the various techniques used in each step of the speech recognition process and attempts to analyze an approach for designing an efficient speech recognition system. The objective of this review is to summarize and compare different speech recognition systems and to identify research topics and applications at the forefront of this exciting and challenging field. Dhameliya

The Survey – Predicting an Education Performance of Students Based on Data Mining Techniques

In this review, it can be concluded that the surveyed research offers a new approach to predicting students' skills by integrating a human computer interface with data mining techniques, using cognitive models such as the GOMS model, the KLM model, and cognitive complexity theory. Human behavior, performance, skills, attention, memory level, and related factors are gathered through several parameters: self-observation, behavior, problem solving, educational tasks, reasoning tasks, and other resources.

Controlling Mouse Movements Using hand Gesture And X box 360

It has previously been noted that hand gesture recognition can support human computer interface tasks such as cursor control and sign language recognition. There are two approaches to hand gesture recognition for human computer interfaces: hardware based and vision based. One hardware-based approach, dating from around 1990, uses a data glove to achieve gesture recognition.

ROBOTIC ARM MANIPULATION USING LEAP MOTION CONTROLLER

The Leap Motion Controller is considered a breakthrough device in the field of gesture-controlled human computer interfaces. The consumer-grade controller introduces a novel gesture and position tracking system with sub-millimeter accuracy. The controller's operation is based on infrared optics and cameras instead of depth sensors. Its motion-sensing precision is, to the best of the author's knowledge, unmatched by any depth camera currently available. It can track all ten human fingers simultaneously. As stated by the manufacturer, the accuracy in detecting each fingertip position is approximately 0.01 mm, at a frame rate of up to 300 fps.
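
A hedged sketch of how palm-tracking data might be turned into robot-arm targets is shown below; the palm-position source is a placeholder rather than the Leap Motion SDK, and the workspace limits and scaling are illustrative assumptions.

```python
# Sketch of mapping hand-tracking data to robot-arm commands. get_palm_position()
# stands in for whatever SDK delivers palm coordinates; workspace limits and
# scaling are illustrative, not manufacturer specifications.
def get_palm_position():
    """Placeholder: return (x, y, z) palm position in millimetres from the tracker."""
    raise NotImplementedError

WORKSPACE_MM = {"x": (-200, 200), "y": (50, 400), "z": (-150, 150)}

def to_arm_target(palm_xyz):
    """Clamp the palm position to the tracker workspace and scale each axis to 0..1."""
    target = []
    for value, axis in zip(palm_xyz, "xyz"):
        lo, hi = WORKSPACE_MM[axis]
        clamped = max(lo, min(hi, value))
        target.append((clamped - lo) / (hi - lo))
    return tuple(target)

print(to_arm_target((350.0, 120.0, -10.0)))  # x clamped to the workspace edge
```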
