{{Short description|Means by which a user interacts with and controls a machine}}
{{About||the boundary between computer systems|Interface (computing)}}
[[Image:Reactable Multitouch.jpg|thumb|right|300px|The [[Reactable]], an example of a [[tangible user interface]]]]

In the [[industrial design]] field of [[human–computer interaction]], a '''user interface''' ('''UI''') is the space where interactions between humans and machines occur. The goal of this interaction is to allow effective operation and control of the machine from the human end, while the machine simultaneously feeds back information that aids the operators' [[decision-making]] process. Examples of this broad concept of user interfaces include the interactive aspects of computer [[operating system]]s, hand [[tools]], [[heavy machinery]] operator controls, and [[Unit operation|process]] controls. The design considerations applicable when creating user interfaces are related to, or involve such disciplines as, [[ergonomics]] and [[psychology]].

Generally, the goal of [[user interface design]] is to produce a user interface that makes it easy, efficient, and enjoyable (user-friendly) to operate a machine in the way that produces the desired result (i.e. maximum [[usability]]). This means that the operator needs to provide minimal input to achieve the desired output, and that the machine minimizes undesired outputs to the user.

User interfaces are composed of one or more layers, including a '''human–machine interface''' ('''HMI''') that interfaces machines with physical [[Input device|input hardware]] such as keyboards, mice, or game pads, and output hardware such as [[computer monitor]]s, speakers, and [[Printer (computing)|printer]]s. A device that implements an HMI is called a [[human interface device]] (HID). Other terms for human–machine interfaces are '''man–machine interface''' ('''MMI''') and, when the machine in question is a computer, '''human–computer interface'''. Additional UI layers may interact with one or more human senses, including: tactile UI ([[Somatosensory system|touch]]), visual UI ([[Visual perception|sight]]), auditory UI ([[Hearing|sound]]), olfactory UI ([[Olfaction|smell]]), equilibrial UI ([[Sense of balance|balance]]), and gustatory UI ([[taste]]).

'''Composite user interfaces''' ('''CUIs''') are UIs that interact with two or more senses. The most common CUI is a ''[[graphical user interface]]'' (GUI), which is composed of a tactile UI and a visual UI capable of displaying [[graphics]]. When sound is added to a GUI, it becomes a ''multimedia user interface'' (MUI). There are three broad categories of CUI: ''standard'', ''virtual'' and ''augmented''. Standard CUIs use standard human interface devices like keyboards, mice, and computer monitors. When the CUI blocks out the real world to create a [[virtual reality]], the CUI is virtual and uses a ''virtual reality interface''. When the CUI does not block out the real world and creates [[augmented reality]], the CUI is augmented and uses an ''augmented reality interface''. When a UI interacts with all human senses, it is called a qualia interface, named after the theory of [[qualia]].{{citation needed|date=January 2022}} CUIs may also be classified by how many senses they interact with, as either an X-sense virtual reality interface or an X-sense augmented reality interface, where X is the number of senses interfaced with. For example, [[Smell-O-Vision]] is a 3-sense (3S) standard CUI with visual display, sound and smells; when ''virtual reality interfaces'' interface with smells and touch they are said to be 4-sense (4S) virtual reality interfaces; and when ''augmented reality interfaces'' interface with smells and touch they are said to be 4-sense (4S) augmented reality interfaces.

==Overview==
[[File:XFCE-4.10-Desktop.png|thumb|right|A [[graphical user interface]] following the [[desktop metaphor]] ]]

* The term "user interface" is often used in the context of (personal) computer systems and [[electronics|electronic devices]].
** Where a network of equipment or computers is interlinked through an MES (Manufacturing Execution System) or host to display information.
** A human–machine interface (HMI) is typically local to one machine or piece of equipment, and is the interface method between the human and the equipment/machine. An operator interface is the interface method by which multiple pieces of equipment that are linked by a host control system are accessed or controlled.{{Clarify|date=January 2010}}
** The system may expose several user interfaces to serve different kinds of users. For example, a [[digital library|computerized library database]] might provide two user interfaces, one for library patrons (limited set of functions, optimized for ease of use) and the other for library personnel (wide set of functions, optimized for efficiency).<ref>{{Cite web |title=The User Experience of Libraries: Serving The Common Good User Experience Magazine |url=https://uxpamagazine.org/the-user-experience-of-libraries/ |access-date=2022-03-23 |website=uxpamagazine.org}}</ref>
* The user interface of a [[Machine|mechanical]] system, a vehicle or an [[Industry (manufacturing)|industrial]] installation is sometimes referred to as the human–machine interface (HMI).<ref>{{cite journal|author1=Griffin, Ben|author2=Baston, Laurel|title=Interfaces|page=5|url=http://peace.saumag.edu/faculty/kardas/Courses/CS/Interfaces2007_files/Interfaces2007.ppt|access-date=7 June 2014|format=Presentation|quote=The user interface of a mechanical system, a vehicle or an industrial installation is sometimes referred to as the human–machine interface (HMI).|url-status=live|archive-url=https://web.archive.org/web/20140714160915/http://peace.saumag.edu/faculty/kardas/Courses/CS/Interfaces2007_files/Interfaces2007.ppt|archive-date=14 July 2014}}</ref> HMI is a modification of the original term MMI (man–machine interface).<ref name="Nigeria">{{cite journal|title=User Interface Design and Ergonomics|journal=Course Cit 811|page=19|url=http://www.nou.edu.ng/NOUN_OCL/pdf/SST/CIT%20811.pdf|access-date=7 June 2014|publisher=SCHOOL OF SCIENCE AND TECHNOLOGY|location=NATIONAL OPEN UNIVERSITY OF NIGERIA|quote=In practice, the abbreviation MMI is still frequently used although some may claim that MMI stands for something different now.|url-status=live|archive-url=https://web.archive.org/web/20140714234100/http://www.nou.edu.ng/NOUN_OCL/pdf/SST/CIT%20811.pdf|archive-date=14 July 2014}}</ref> In practice, the abbreviation MMI is still frequently used<ref name="Nigeria"/> although some may claim that MMI stands for something different now.{{Citation needed|date=March 2019}} Another abbreviation is HCI, but it is more commonly used for [[human–computer interaction]].<ref name="Nigeria"/> Other terms used are operator interface console (OIC) and operator interface terminal (OIT).<ref>{{cite book|title=Recent advances in business administration|date=2010|publisher=Wseas|location=[S.l.]|isbn=978-960-474-161-8|page=190|chapter=Introduction Section|quote=Other terms used are operator interface console (OIC) and operator interface terminal (OIT)}}</ref> However it is abbreviated, the terms refer to the 'layer' that separates a human who is operating a machine from the machine itself.<ref name="Nigeria"/> Without a clean and usable interface, humans would not be able to interact with information systems.

In [[science fiction]], HMI is sometimes used to refer to what is better described as a [[direct neural interface]]. However, this latter usage is seeing increasing application in the real-life use of (medical) [[prostheses]]—the artificial extension that replaces a missing body part (e.g., [[cochlear implants]]).<ref>{{cite journal|last1=Cipriani|first1=Christian|last2=Segil|first2=Jacob|last3=Birdwell|first3=Jay|last4=Weir|first4=Richard|title=Dexterous control of a prosthetic hand using fine-wire intramuscular electrodes in targeted extrinsic muscles|journal=IEEE Transactions on Neural Systems and Rehabilitation Engineering|volume=22|issue=4|pages=828–36|doi=10.1109/TNSRE.2014.2301234|issn=1534-4320|quote=Neural co-activations are present that in turn generate significant EMG levels and hence unintended movements in the case of the present human machine interface (HMI).|year=2014|pmc=4501393|pmid=24760929}}</ref><ref>{{cite journal|last1=Citi|first1=Luca|title=Development of a neural interface for the control of a robotic hand|date=2009|page=5|url=https://7c4745ab-a-cdf32725-s-sites.googlegroups.com/a/neurostat.mit.edu/lciti/publications_files/LCitiPhD.pdf?attachauth=ANoY7cpwRib4-7KUnST5NrulgpbLiT3r10hOeyap9QXEgv64E1VioXR7n1pQYsNBNMZggwnI2V4KbZLgxVeKLcOgxz4XfJFAkqvddyQUnGqn4Mm5iLq9vDR02cHmYi6ULrK8IxWK150SirIt9acjMFcDon0dbnRwgYicc-2GeKZZCqtflZc4ZhEBORg8AzWE31XDAgoFFAfNtUxTcNR8IcJlsM7NYCGxY4M3Vn8WY6bsO1MEuyYIjmU%3D&attredirects=0<!--|chapter=Chapter 2-->|access-date=7 June 2014|publisher=IMT Institute for Advanced Studies Lucca|location=Scuola Superiore Sant'Anna, Pisa, Italy}}</ref>
Once the cards were punched, one would drop them in a job queue and wait. Eventually, operators would feed the deck to the computer, perhaps mounting [[Magnetic tape data storage|magnetic tapes]] to supply another dataset or helper software. The job would generate a printout, containing final results or an abort notice with an attached error log. Successful runs might also write a result on magnetic tape or generate some data cards to be used in a later computation.

The [[turnaround time]] for a single job often spanned entire days. If one were very lucky, it might be hours; there was no real-time response. But there were worse fates than the card queue; some computers required an even more tedious and error-prone process of toggling in programs in binary code using console switches. The very earliest machines had to be partly rewired to incorporate program logic into themselves, using devices known as [[plugboard]]s.

Early batch systems gave the currently running job the entire computer; program decks and tapes had to include what we would now think of as [[operating system]] code to talk to I/O devices and do whatever other housekeeping was needed. Midway through the batch period, after 1957, various groups began to experiment with so-called "[[Compile and go system|load-and-go]]" systems. These used a [[Resident monitor|monitor program]] which was always resident on the computer. Programs could call the monitor for services. Another function of the monitor was to do better error checking on submitted jobs, catching errors earlier and more intelligently and generating more useful feedback to the users. Thus, monitors represented the first step towards both operating systems and explicitly designed user interfaces.

===1969–present: Command-line user interface===
* 1988 – [[OS/2]] 1.10 Standard Edition (SE) has a GUI written by Microsoft that closely resembles Windows 2

==Interface design==
{{main|User interface design}}
Primary methods used in interface design include prototyping and simulation.
# [[Forgiveness]]: A good interface should not punish users for their mistakes but should instead provide the means to remedy them.
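
One common way software interfaces realize the forgiveness principle above is an undo history, which lets the user reverse a mistaken action rather than being stuck with it. The sketch below is a minimal, hypothetical illustration of that idea; the names <code>Command</code> and <code>UndoStack</code> and the example action are invented for illustration only and are not taken from any particular toolkit.

<syntaxhighlight lang="typescript">
// Minimal sketch of an undo history: each user action is a command that
// knows how to apply itself and how to reverse itself.
interface Command {
  apply(): void;
  revert(): void;
}

class UndoStack {
  private history: Command[] = [];

  // Run a command and remember it so it can be undone later.
  run(command: Command): void {
    command.apply();
    this.history.push(command);
  }

  // Undo the most recent command, if any: the user's mistake is remedied
  // rather than punished.
  undo(): void {
    const last = this.history.pop();
    if (last !== undefined) {
      last.revert();
    }
  }
}

// Hypothetical usage: rename a document, then undo the rename.
let title = "Untitled";
const stack = new UndoStack();
stack.run({
  apply: () => { title = "Report"; },
  revert: () => { title = "Untitled"; },
});
stack.undo(); // title is "Untitled" again
</syntaxhighlight>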

===Principle of least astonishment===
The [[principle of least astonishment]] (POLA) is a general principle in the design of all kinds of interfaces. It is based on the idea that human beings can only pay full attention to one thing at one time,<ref name=Raskin>{{cite book|last1=Raskin|first1=Jef|title=The human interface : new directions for designing interactive systems|date=2000|publisher=Addison Wesley|location=Reading, Mass. [u.a.]|isbn=0-201-37937-6|edition=1. printing.|url-access=registration|url=https://archive.org/details/humaneinterfacen00rask}}</ref> leading to the conclusion that novelty should be minimized.

===Principle of habit formation===
If an interface is used persistently, the user will unavoidably develop [[habit]]s for using the interface. The designer's role can thus be characterized as ensuring the user forms good habits. If the designer is experienced with other interfaces, they will similarly develop habits, and often make unconscious assumptions regarding how the user will interact with the interface.<ref name=Raskin/><ref>{{cite news|last1=Udell|first1=John|title=Interfaces are habit-forming|url=http://www.infoworld.com/article/2681144/application-development/interfaces-are-habit-forming.amp.html|access-date=3 April 2017|work=Infoworld|date=9 May 2003|language=en|url-status=live|archive-url=https://web.archive.org/web/20170404131503/http://www.infoworld.com/article/2681144/application-development/interfaces-are-habit-forming.amp.html|archive-date=4 April 2017}}</ref>

===A model of design criteria: User Experience Honeycomb===
[[File:UX Honeycomb.png|alt=User interface / user experience guide|thumb|User Experience Design Honeycomb<ref name=":0">{{Cite web|url=https://oryzo.com/user-interface-design/|title=User Interface & User Experience Design {{!}} Oryzo {{!}} Small Business UI/UX|website=Oryzo|language=en-US|access-date=2019-11-19}}</ref> designed by [[Peter Morville]]<ref name=":1">{{Cite web|url=https://medium.com/@danewesolko/peter-morvilles-user-experience-honeycomb-904c383b6886|title=Peter Morville's User Experience Honeycomb|last=Wesolko|first=Dane|date=2016-10-27|website=Medium|language=en|access-date=2019-11-19}}</ref>|230x230px]]
[[Peter Morville]] designed the User Experience Honeycomb framework in 2004 while leading user interface design work. The framework was created to guide user interface design and acted as a guideline for many web development students for a decade.<ref name=":1" />

# Usable: Is the design of the system easy and simple to use? The application should feel familiar, and it should be easy to use.<ref name=":1" /><ref name=":0" />
# Useful: Does the application fulfill a need? A business's product or service needs to be useful.<ref name=":0" />
# Desirable: Is the design of the application sleek and to the point? The aesthetics of the system should be attractive and easy to translate.<ref name=":0" />
# Findable: Are users able to quickly find the information they're looking for? Information needs to be findable and simple to navigate. A user should never have to hunt for your product or information.<ref name=":0" />

==Types==
[[File:Hp150 touchscreen 20081129.jpg|thumb|alt=Touchscreen of the HP Series 100 HP-150|HP Series 100 HP-150 touchscreen]]

# ''[[Attentive user interface]]s'' manage the user's [[attention]], deciding when to interrupt the user, the kind of warnings, and the level of detail of the messages presented to the user.
# ''[[Direct manipulation interface]]'' is the name of a general class of user interfaces that allow users to manipulate objects presented to them, using actions that correspond at least loosely to the physical world.
# ''[[Gesture recognition|Gesture interface]]s'' are graphical user interfaces which accept input in the form of hand [[gesture]]s, or [[mouse gesture]]s sketched with a computer mouse or a [[Stylus (computing)|stylus]].
# ''[[Graphical user interface]]s'' (GUI) accept input via devices such as a computer keyboard and mouse and provide articulated [[graphical]] output on the [[computer monitor]].<ref>{{Cite journal|last=Martinez|first=Wendy L.|date=2011-02-23|title=Graphical user interfaces: Graphical user interfaces|url=https://onlinelibrary.wiley.com/doi/10.1002/wics.150|journal=Wiley Interdisciplinary Reviews: Computational Statistics|language=en|volume=3|issue=2|pages=119–133|doi=10.1002/wics.150|s2cid=60467930 }}</ref> There are at least two different principles widely used in GUI design: [[Object-oriented user interface]]s (OOUIs) and [[Application software|application]]-oriented interfaces.<ref>{{cite web |first=Gordana |last=Lamb |url=http://msdn.microsoft.com/en-us/library/aa227601(v=vs.60).aspx |title=Improve Your UI Design Process with Object-Oriented Techniques |archive-url=https://web.archive.org/web/20130814153652/http://msdn.microsoft.com/en-us/library/aa227601(v=vs.60).aspx |archive-date=2013-08-14 |website=Visual Basic Developer magazine |date=2001 |quote=Table 1. Differences between the traditional application-oriented and object-oriented approaches to UI design.}}</ref>
# ''Hardware interfaces'' are the physical, spatial interfaces found on products in the real world, from toasters to car dashboards to airplane cockpits. They are generally a mixture of knobs, buttons, sliders, switches, and touchscreens.
# ''{{visible anchor|Holographic user interfaces}}'' provide input to electronic or electro-mechanical devices by passing a finger through reproduced holographic images of what would otherwise be the tactile controls of those devices. The images float freely in the air, and the input is detected by a wave source without any tactile interaction.
# ''[[Intelligent user interfaces]]'' are human–machine interfaces that aim to improve the efficiency, effectiveness, and naturalness of human–machine interaction by representing, reasoning, and acting on models of the user, domain, task, discourse, and media (e.g., graphics, natural language, gesture).
# ''[[Motion capture|Motion tracking]] interfaces'' monitor the user's body motions and translate them into commands, currently being developed by Apple.<ref>[http://www.appleinsider.com/articles/09/06/18/apple_exploring_motion_tracking_mac_os_x_user_interface.html appleinsider.com] {{webarchive|url=https://web.archive.org/web/20090619212919/http://www.appleinsider.com/articles/09/06/18/apple_exploring_motion_tracking_mac_os_x_user_interface.html |date=2009-06-19 }}</ref>
# ''Multi-screen interfaces'' employ multiple displays to provide a more flexible interaction. This is often employed in computer game interaction, in both commercial arcades and, more recently, the handheld market.
# ''[[Touch user interface]]s'' are graphical user interfaces using a [[touchpad]] or touchscreen display as a combined input and output device. They supplement or replace other forms of output with [[Haptic communication|haptic]] feedback methods, and are used in computerized [[simulator#Physical and interactive simulation|simulators]], among other applications.
# ''[[Voice user interface]]s'' accept input and provide output by generating voice prompts. The user provides input by pressing keys or buttons, or by responding verbally to the interface.
# {{anchor|Web interface}}''[[Web application|Web-based user interfaces]]'' or ''web user interfaces'' (WUI) accept input and provide output by generating [[web page]]s that the user views in a [[web browser]]. Newer implementations utilize [[PHP]], [[Java (programming language)|Java]], [[JavaScript]], [[Ajax (programming)|AJAX]], [[Apache Flex]], [[.NET Framework]], or similar technologies to provide real-time control in a separate program, eliminating the need to refresh a traditional HTML-based web browser (a minimal sketch of this approach appears after this list). Administrative web interfaces for web servers, servers and networked computers are often called [[web hosting control panel|control panels]].
# ''Zero-input interfaces'' get inputs from a set of sensors instead of querying the user with input dialogs.<ref>Sharon, Taly, Henry Lieberman, and Ted Selker. "[https://www.researchgate.net/profile/Ted_Selker/publication/221607708_A_zero-input_interface_for_leveraging_group_experience_in_Web_browsing/links/0912f50876bda91a5b000000/A-zero-input-interface-for-leveraging-group-experience-in-Web-browsing.pdf A zero-input interface for leveraging group experience in web browsing] {{webarchive|url=https://web.archive.org/web/20170908113001/https://www.researchgate.net/profile/Ted_Selker/publication/221607708_A_zero-input_interface_for_leveraging_group_experience_in_Web_browsing/links/0912f50876bda91a5b000000/A-zero-input-interface-for-leveraging-group-experience-in-Web-browsing.pdf |date=2017-09-08 }}." Proceedings of the 8th international conference on Intelligent user interfaces. ACM, 2003.</ref>
# ''[[Zooming user interface]]s'' are graphical user interfaces in which information objects are represented at different levels of scale and detail, and where the user can change the scale of the viewed area in order to show more detail.
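
As an illustration of the web-based user interfaces described above, the following is a minimal, hypothetical sketch of a browser-side script that requests fresh data from a server and rewrites part of an already-loaded page in place, so the HTML document never has to be reloaded. It is not drawn from any particular product; the endpoint <code>/api/status</code> and the element id <code>status-panel</code> are assumed names used only for this example.

<syntaxhighlight lang="typescript">
// Minimal sketch (hypothetical names): periodically fetch fresh data and
// rewrite a single element of the current page instead of reloading it.
async function refreshStatus(): Promise<void> {
  const response = await fetch("/api/status"); // assumed server endpoint
  if (!response.ok) {
    return; // on error, keep showing the previous value
  }
  const data: { message: string } = await response.json();
  const panel = document.getElementById("status-panel"); // assumed element in the page
  if (panel !== null) {
    panel.textContent = data.message; // update in place; no page refresh
  }
}

// Poll every five seconds; the rest of the page is left untouched.
setInterval(() => { void refreshStatus(); }, 5000);
</syntaxhighlight>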