Machine perception is the capability of a computer system to interpret data in a manner similar to the way humans use their senses to relate to the world around them.[1][2][3] The basic method by which computers take in and respond to their environment is through attached hardware. Until recently input was limited to a keyboard or a mouse, but advances in technology, both in hardware and software, have allowed computers to take in sensory input in a way similar to humans.[1][2]
Machine perception allows the computer to use this sensory input, as well as conventional computational means of gathering information, to gather information with greater accuracy and to present it in a way that is more comfortable for the user.[1] These modes of perception include computer vision, machine hearing, machine touch, and machine smelling, since artificial scents are indiscernible and identical at the chemical-compound, molecular, and atomic level.[4][5]
The end goal of machine perception is to give machines the ability to see, feel, and perceive the world as humans do, and therefore to be able to explain, in a human way, why they are making their decisions, to warn us when they are failing and, more importantly, why they are failing.[6] This purpose is very similar to the proposed purposes for artificial intelligence generally, except that machine perception would only grant machines limited sentience, rather than bestow upon machines full consciousness, self-awareness, and intentionality.
Computer vision is a field that includes methods for acquiring, processing, analyzing, and understanding images and high-dimensional data from the real world in order to produce numerical or symbolic information, e.g., in the form of decisions. Computer vision has many applications already in use today, such as facial recognition, geographical modeling, and even aesthetic judgment.[7]
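For illustration, the following is a minimal sketch of one widely deployed computer-vision task, face detection, using OpenCV's bundled Haar-cascade model; the image file name sample.jpg is a placeholder, and this is only one of many possible approaches.

```python
# Minimal face-detection sketch using OpenCV's bundled Haar cascade.
# "sample.jpg" is a placeholder path; any photograph containing faces will do.
import cv2

# Load the pre-trained frontal-face cascade shipped with OpenCV.
cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

image = cv2.imread("sample.jpg")
gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)

# Each detection is returned as an (x, y, width, height) rectangle.
faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)

for (x, y, w, h) in faces:
    cv2.rectangle(image, (x, y), (x + w, y + h), (0, 255, 0), 2)

cv2.imwrite("faces.jpg", image)
print(f"Detected {len(faces)} face(s)")
```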
However, machines still struggle to interpret visual input accurately if it is blurry or if the viewpoint from which stimuli are viewed varies often. Computers also struggle to determine the proper nature of a stimulus if it is overlapped by, or seamlessly touching, another stimulus; this relates to the Principle of Good Continuation. Machines also struggle to perceive and record stimuli that behave according to the Apparent Movement principle, a field of research in Gestalt psychology.
Machine hearing, also known as machine listening or computer audition, is the ability of a computer or machine to take in and process sound data such as speech or music.[8][9] This area has a wide range of applications, including music recording and compression, speech synthesis, and speech recognition.[10] Moreover, this technology allows the machine to replicate the human brain's ability to selectively focus on a specific sound against many other competing sounds and background noise, an ability called "auditory scene analysis". The technology enables the machine to segment several streams occurring at the same time.[8][11][12] Many commonly used devices, such as smartphones, voice translators, and cars, make use of some form of machine hearing. Present technology still has challenges in speech segmentation, meaning it is occasionally unable to correctly split words within sentences, especially when they are spoken with an atypical accent.
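As a minimal sketch of speech recognition in practice, the following assumes the third-party SpeechRecognition package for Python and a placeholder audio file speech.wav; the free Google Web Speech API is used only as an example back end.

```python
# Minimal speech-to-text sketch using the third-party SpeechRecognition package.
# "speech.wav" is a placeholder; the free Google Web Speech API does the recognition.
import speech_recognition as sr

recognizer = sr.Recognizer()

with sr.AudioFile("speech.wav") as source:
    audio = recognizer.record(source)  # read the entire file into memory

try:
    print(recognizer.recognize_google(audio))
except sr.UnknownValueError:
    # The recognizer could not segment or understand the speech,
    # e.g. because of noise or an atypical accent.
    print("Speech was unintelligible")
```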

Machine touch is an area of machine perception where tactile information is processed by a machine or computer. Applications include tactile perception of surface properties and dexterity, whereby tactile information can enable intelligent reflexes and interaction with the environment.[13] Though this could possibly be done by measuring when and where friction occurs, as well as the nature and intensity of the friction, machines still have no way of measuring some ordinary physical human experiences, including physical pain. For example, scientists have yet to invent a mechanical substitute for the nociceptors in the body and brain that are responsible for noticing and measuring physical human discomfort and suffering.
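A purely hypothetical sketch of such a tactile reflex is given below; the read_pressure() and retract_gripper() functions do not correspond to any real API and merely stand in for sensor and actuator drivers.

```python
# Hypothetical tactile-reflex sketch: none of these sensor or actuator
# functions correspond to a real API; they stand in for hardware drivers.
import random
import time

PRESSURE_LIMIT = 5.0  # arbitrary contact-force threshold, in newtons

def read_pressure() -> float:
    """Placeholder for a real tactile-sensor reading."""
    return random.uniform(0.0, 10.0)

def retract_gripper() -> None:
    """Placeholder for a real actuator command."""
    print("Reflex triggered: retracting gripper")

for _ in range(10):
    force = read_pressure()
    if force > PRESSURE_LIMIT:
        # An "intelligent reflex": react before higher-level planning is consulted.
        retract_gripper()
        break
    time.sleep(0.1)
```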
Scientists are also developing systems for machine olfaction, which can recognize and measure smells. Airborne chemicals are sensed and classified with a device sometimes known as an electronic nose.[14][15]
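A minimal sketch of how readings from such a sensor array might be classified is given below, using scikit-learn; the sensor values and odour labels are invented purely for illustration.

```python
# Sketch: classifying electronic-nose sensor-array readings with scikit-learn.
# The four-sensor readings and odour labels below are invented for illustration.
from sklearn.ensemble import RandomForestClassifier

# Each row is one "sniff": responses of four gas sensors (arbitrary units).
readings = [
    [0.9, 0.1, 0.3, 0.2],
    [0.8, 0.2, 0.4, 0.1],
    [0.1, 0.9, 0.2, 0.7],
    [0.2, 0.8, 0.1, 0.8],
]
labels = ["coffee", "coffee", "banana", "banana"]

model = RandomForestClassifier(n_estimators=50, random_state=0)
model.fit(readings, labels)

print(model.predict([[0.85, 0.15, 0.35, 0.15]]))  # expected: ['coffee']
```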
The electronic tongue is an instrument that measures and compares tastes. According to an IUPAC technical report, an "electronic tongue" is an analytical instrument comprising an array of non-selective chemical sensors with partial specificity to different solution components and an appropriate pattern-recognition instrument, capable of recognizing the quantitative and qualitative composition of simple and complex solutions.[16][17]
Chemical compounds responsible for taste are detected by human taste receptors. Similarly, the multi-electrode sensors of electronic instruments detect the same dissolved organic and inorganic compounds. Like human receptors, each sensor has a spectrum of reactions different from the others. The information given by each sensor is complementary, and the combination of all sensors' results generates a unique fingerprint. Most of the detection thresholds of sensors are similar to or better than those of human receptors.
In the biological mechanism, taste signals are transduced by nerves in the brain into electric signals. The e-tongue sensors' process is similar: they generate electric signals as voltammetric and potentiometric variations.
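A minimal, invented sketch of the statistical pattern-recognition step described next might look like the following, using principal component analysis followed by a nearest-neighbour classifier in scikit-learn; all sensor fingerprints and taste labels are made up for illustration.

```python
# Sketch: interpreting e-tongue sensor fingerprints as taste classes.
# PCA compresses the multi-electrode response, then a nearest-neighbour
# classifier assigns a taste label. All numbers are invented for illustration.
from sklearn.decomposition import PCA
from sklearn.neighbors import KNeighborsClassifier
from sklearn.pipeline import make_pipeline

# Each row: responses of a six-electrode array to one solution (arbitrary units).
fingerprints = [
    [1.2, 0.3, 0.8, 0.1, 0.5, 0.2],   # sweet samples
    [1.1, 0.4, 0.7, 0.2, 0.6, 0.1],
    [0.2, 1.5, 0.3, 0.9, 0.1, 0.8],   # bitter samples
    [0.3, 1.4, 0.2, 1.0, 0.2, 0.9],
]
tastes = ["sweet", "sweet", "bitter", "bitter"]

model = make_pipeline(PCA(n_components=2), KNeighborsClassifier(n_neighbors=1))
model.fit(fingerprints, tastes)

print(model.predict([[1.0, 0.5, 0.6, 0.3, 0.5, 0.2]]))  # expected: ['sweet']
```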
Taste quality perception and recognition are based on the building or recognition of activated sensory nerve patterns by the brain and on the taste fingerprint of the product. This step is achieved by the e-tongue's statistical software, which interprets the sensor data into taste patterns.

Other than those listed above, some of the future hurdles that the science of machine perception still has to overcome include, but are not limited to:
- Embodied cognition - the theory that cognition is a full-body experience, and therefore can only exist, and thus be measured and analyzed, in fullness if all required human abilities and processes are working together through a mutually aware and supportive systems network.
- Moravec's paradox
- The principle of similarity - the ability young children develop to determine what family a newly introduced stimulus falls under, even when that stimulus is different from the members with which the child usually associates the family. (An example could be a child figuring out that a chihuahua is a dog and a house pet rather than vermin.)
- Unconscious inference - the natural human behavior of determining whether a new stimulus is dangerous, what it is, and then how to relate to it, without ever requiring any new conscious effort.
- The innate human ability to follow the likelihood principle in order to learn from circumstances and others over time.
- The recognition-by-components theory - being able to mentally analyze and break even complicated mechanisms into manageable parts with which to interact. For example, a person sees both the cup and the handle that make up a mug full of hot cocoa, and uses the handle to hold the mug so as to avoid being burned.
- The free energy principle - determining, long beforehand, how much energy one can safely delegate to being aware of things outside oneself without losing the energy one requires to sustain one's life and function satisfactorily. This allows one to become optimally aware of the surrounding world without depleting one's energy so much that one experiences damaging stress, decision fatigue, and/or exhaustion.