CROSS REFERENCE TO RELATED APPLICATION
The present application is a continuation of U.S. Application No. 14/581,659, filed Dec. 23, 2014, entitled “USER PROFILE SELECTION USING CONTEXTUAL AUTHENTICATION”. That application is hereby incorporated by reference herein in its entirety for all purposes.
TECHNICAL FIELD
The present disclosure relates to the field of data processing, in particular, to presentation of a resource (e.g., content) based at least in part on a user profile selected by contextual authentication.
BACKGROUND
The background description provided herein is for the purpose of generally presenting the context of the disclosure. Unless otherwise indicated herein, the materials described in this section are not prior art to the claims in this application and are not admitted to be prior art by inclusion in this section.
Some computing devices, such as tablet computers, are dynamically shared between multiple users. User profiles allow each user to have a more personalized user experience by allowing each user to have their own set of applications, social logins, bookmarks, and data. They can also be used to create guest profiles, allowing others to borrow a device without worrying about application or social login conflicts and data privacy. Profiles can also be used to restrict content for use by children to allow parental control of browsing, application usage, and in-app purchasing. User profiles typically require additional active user input to switch from one profile to another, which interrupts the flow of the user experience: the user must pick a profile to load and/or provide active authentication information such as a password.
BRIEF DESCRIPTION OF THE DRAWINGS
Embodiments will be readily understood by the following detailed description in conjunction with the accompanying drawings. To facilitate this description, like reference numerals designate like structural elements. Embodiments are illustrated by way of example, and not by way of limitation, in the Figures of the accompanying drawings.
FIG. 1 is a block diagram of a computing device in an operating environment, in accordance with various embodiments.
FIG. 2 is a block diagram showing additional components of the computing device shown in FIG. 1, in accordance with various embodiments.
FIG. 3 is a block diagram of a computing device, in accordance with various embodiments.
FIG. 4 is a flow diagram of an example process that may be implemented on various computing devices described herein, in accordance with various embodiments.
FIG. 5 is a flow diagram of an example process that may be implemented on various computing devices described herein, in accordance with various embodiments.
FIG. 6 illustrates an example computing environment suitable for practicing various aspects of the disclosure, in accordance with various embodiments.
FIG. 7 illustrates an example storage medium with instructions configured to enable an apparatus to practice various aspects of the present disclosure, in accordance with various embodiments.
DETAILED DESCRIPTION
In the following detailed description, reference is made to the accompanying drawings which form a part hereof, wherein like numerals designate like parts throughout, and in which is shown by way of illustration embodiments that may be practiced. It is to be understood that other embodiments may be utilized and structural or logical changes may be made without departing from the scope of the present disclosure. Therefore, the following detailed description is not to be taken in a limiting sense, and the scope of embodiments is defined by the appended claims and their equivalents.
Various operations may be described as multiple discrete actions or operations in turn, in a manner that is most helpful in understanding the claimed subject matter. However, the order of description should not be construed as to imply that these operations are necessarily order dependent. In particular, these operations may not be performed in the order of presentation. Operations described may be performed in a different order than the described embodiment. Various additional operations may be performed and/or described operations may be omitted in additional embodiments.
For the purposes of the present disclosure, the phrase “A and/or B” means (A), (B), or (A and B). For the purposes of the present disclosure, the phrase “A, B, and/or C” means (A), (B), (C), (A and B), (A and C), (B and C), or (A, B and C).
The description may use the phrases “in an embodiment,” or “in embodiments,” which may each refer to one or more of the same or different embodiments. Furthermore, the terms “comprising,” “including,” “having,” and the like, as used with respect to embodiments of the present disclosure, are synonymous.
As used herein, the terms “logic” and “module” may refer to, be part of, or include an Application Specific Integrated Circuit (ASIC), an electronic circuit, a processor (shared, dedicated, or group) and/or memory (shared, dedicated, or group) that execute one or more software or firmware programs, a combinational logic circuit, and/or other suitable components that provide the described functionality. The term “module” may refer to software, firmware and/or circuitry that is/are configured to perform or cause the performance of one or more operations consistent with the present disclosure. Software may be embodied as a software package, code, instructions, instruction sets and/or data recorded on non-transitory computer readable storage mediums. Firmware may be embodied as code, instructions or instruction sets and/or data that are hard-coded (e.g., nonvolatile) in memory devices. “Circuitry”, as used in any embodiment herein, may comprise, for example, singly or in any combination, hardwired circuitry, programmable circuitry such as computer processors comprising one or more individual instruction processing cores, state machine circuitry, software and/or firmware that stores instructions executed by programmable circuitry. The modules may collectively or individually be embodied as circuitry that forms a part of a computing device. As used herein, the term “processor” may be a processor core.
Referring now to FIG. 1, a computing device 100, incorporated with the resource presentation teaching of the present disclosure, in accordance with various embodiments, is illustrated. As shown, computing device 100 may include a number of components 102-158, including shared application 144 and trusted execution environment (TEE) 114, configured to cooperate with each other, to select a user profile using contextual authentication and enable resources (such as contents) to be selectively consumed (viewed, modified, or deleted) by a logged-in user and/or a delegate user based at least in part on the selected user profile, alternatingly with ease. In embodiments, computing device 100 may include one or more processors or processor cores 102, system memory 104, a display 106, and a sensor layer including a sensor hub 108 that may be coupled together and configured to cooperate with each other. The display 106 may be a touch sensitive display that also serves as an input device in various embodiments. For purposes of this application, including the claims, the terms “processor” and “processor cores” may be considered synonymous, unless the context clearly requires otherwise. In embodiments, the sensor layer may include one or more sensor devices, an input/output (IO) subsystem having IO controllers, internet protocol (IP) blocks, and control logic. In embodiments, as shown, the computing device 100 may also include one or more chipsets 110, one or more execution environments 112, and one or more trusted execution environments (TEE) 114. The sensor layer may include trusted IO technology that hardens the IO path between the sensor hub 108 and/or sensor devices and the TEE 114.
Generally, a TEE is a secure environment that may run alongside an operating system and which can provide secure services to that operating system. More information regarding TEEs and the implementation thereof may be found in the TEE client application programming interface (API) specification v1.0, the TEE internal API specification v1.0, and the TEE system architecture v1.0 issued by GlobalPlatform. In some embodiments, the devices described herein may include a TEE provided using one or more of virtualization technology, enhanced memory page protection, CPU cache as memory page protection, security co-processor technology, and combinations thereof. Non-limiting examples of such technology include INTEL® VT-x virtualization technology, INTEL® VT-d virtualization technology, INTEL® trusted execution technology (TXT), Xeon® internet security and acceleration (ISA) “cache as RAM”, converged security engine (CSE) technology, converged security and manageability engine (CSME) technology, a security co-processor, manageability engine, trusted platform module, platform trust technology, ARM TRUSTZONE® technology, combinations thereof, and the like. The nature, advantages and limitations of each of these technologies are well understood and are therefore not described herein.
In various embodiments, the sensor hub 108 may be in signal communication with a motion sensor 116, a fingerprint sensor 118, a Bluetooth transceiver 120, a microphone 122, a wireless fidelity (WiFi) transceiver 124, a speaker 126, a camera 128, an ultrasonic sensor 130, and one or more other sensors 140 or input devices 142. In embodiments, the camera 128 may be a video camera. The motion sensor 116 may include an accelerometer or gyroscope in various embodiments. The WiFi transceiver 124 may operate according to at least one of the Institute of Electrical and Electronics Engineers (IEEE) 802.11 standards and the Bluetooth transceiver 120 may operate according to at least one of the standards defined by the Bluetooth Special Interest Group in various embodiments.
In embodiments, the execution environment 112 may include an operating system (OS) 142, the earlier described shared application 144, and one or more modules 146. The execution environment 112 may also include storage 147 for various variables and resources. The earlier described TEE 114 may include secure modules 150 and data 152 in various embodiments. The shared application 144 and operating system 142 may, in cooperation with secure modules 150, enforce respective access rights simultaneously for a plurality of users. In embodiments, a continuous passively authenticated context may be maintained for each of the plurality of users.
In embodiments, the computing device 100 may be in data communication with a local server 160, a remote content server such as a media server 162, or a social network server 164 over a network 166 by communicating with a wireless router 168 using the WiFi transceiver 124. The shared application 144 may have access to resources that may be consumed (viewed, modified, or deleted) by a logged-in user and/or a delegate user. The resources may include local resources 158 stored on the computing device 100 or resources served or streamed by the local server 160, the media server 162, or the social network server 164 in various embodiments. The resources may include other types of resources not shown in embodiments. The logged-in user may access the resources with a first set of resource access rights while a second delegate user may access the resources with a different set of access rights.
In various embodiments, the computing device 100 may be a shared device such as a tablet computing device that may be used, e.g., by a first user 170, a second user 174, or a third user 176. In embodiments, one or more of the users may have a Bluetooth enabled device 180, which may, e.g., be a wearable device on the first user 170.
Referring now to FIG. 2, components of the execution environment 112 and TEE 114 are illustrated in further detail, in accordance with various embodiments. The modules 146 of the execution environment 112 may include a login module 202, an access control module 204, and a presentation module 206. The secure modules 150 of the TEE 114 may include a contextual authentication module 210 that includes a sensor processing module 212, a profile selection module 214, and a classifier module 216. In various embodiments, the classifier module 216 may classify data based on sensor output, application usage patterns, or user interface interaction patterns in a manner such that the data may be associated with particular users. The classifications of particular sensor data patterns, application usage patterns, or user interface interaction patterns may be considered user characteristics, such that when a user characteristic changes it may be inferred that the user of the computing device 100 has changed. The classifier module 216 may be a machine learning classifier in various embodiments. Embodiments may also include a user proximity module 218 as a part of the secure modules 150 in the TEE 114. Together, these modules 202, 204, 206, 150, 210, 212, 214, and 216 may cooperate with each other to enable the shared application 144 and operating system 142 to enforce respective access rights simultaneously for the plurality of users, including continuous maintenance of a passively authenticated context for each of the plurality of users.
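By way of illustration only, the inference that a change in user characteristics implies a change of device user might be sketched as a nearest-template comparison over feature vectors. Everything below (the feature values, distance metric, threshold, and profile keys) is a hypothetical simplification for exposition, not the classifier module 216 itself, which may be a machine learning classifier:

```python
import math

def detect_user(sample, templates, threshold=0.5):
    """Return the profile key whose stored template is closest to the
    live feature sample, or None when no template is close enough
    (i.e., an unknown user appears to have the device)."""
    best_key, best_dist = None, float("inf")
    for key, template in templates.items():
        dist = math.dist(sample, template)  # Euclidean distance
        if dist < best_dist:
            best_key, best_dist = key, dist
    return best_key if best_dist <= threshold else None

# Hypothetical reference templates for two users.
templates = {"user1": [0.2, 0.8], "user2": [0.9, 0.1]}
print(detect_user([0.25, 0.75], templates))  # close to user1's template
print(detect_user([5.0, 5.0], templates))    # matches no one -> None
```

A change in the returned key between successive samples would correspond to the inference that the user of the computing device has changed.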
In embodiments, the data 152 stored in the TEE 114 may include a first user profile 230, a second user profile 232, a third user profile 234, a first delegate profile 236, a second delegate profile 238, or one or more additional profiles 240. User profiles are a way for users to have a more personalized and streamlined user experience. Profiles allow users to have their own set of applications, social logins, bookmarks, and data in one easily accessible and private place. They may be used to create guest profiles, allowing others to borrow a device without worrying about application or social login conflicts and data privacy. Profiles may also be used to restrict content for use by children to allow parental control of browsing, application usage, and in-app purchasing. In embodiments, automatic device state based profile settings may be applied on a per-application and a per-service basis. For example, the contextual authentication module 210 may distinguish user one 170 using social networking application A hosted by social network server 164, users two and three using video streaming service B hosted by media server 162, etc., and manage specific profiles for each. In embodiments, this automatic management of specific profiles may be combined with machine learning of user characteristics, interactions, or behaviors during shared and/or single user sessions to tune the profile-controlled behavior for the applications and services to the user or combination of users. Preferences may be inferred for combinations of users, applications, and services from other profiles that involve those users, applications, or services in various embodiments.
The data 152 may also include user characteristics templates 242 such as a first template 244, a second template 246, or a third template 248. In embodiments, the user characteristics templates 242 are based on biometric or behaviometric data generated by a machine learning classifier. In various embodiments, the user characteristics templates 242 may be included as a part of the user profiles. For example, the first user profile 230 may contain the first template 244, the second user profile 232 may contain the second template 246, and/or the third user profile 234 may contain the third template 248 in embodiments. The data 152 may also include a user focus identifier 260 in various embodiments.
In embodiments, a biometrics and/or behaviometrics machine learning (ML) classifier or set of classifiers may generate reference sample data suitable for establishing a user identity. In various embodiments, the reference sample data may be generated during a training process and stored as a biometric and/or behaviometric template such as templates 244, 246, or 248. The user characteristics templates 242 may be generated and stored by the classifier module 216 during a training process, for example. During the training process, the classifier module 216 may use machine learning to associate biometric user characteristics, such as hand movement, gait, or image patterns based at least in part on sensor data, with particular users. Biometric characteristics based on hand movement may be based at least in part on accelerometers or gyroscopes detecting relatively imperceptible movements, and characteristics based on gait may be based at least in part on accelerometers or gyroscopes detecting walking characteristics in various embodiments. The classifier module 216 may also use machine learning to associate behaviometric user characteristics, such as application usage patterns or user interface interaction patterns, with particular users. Alternatively, the user characteristics templates 242 may be generated and stored by other modules, or generated by a different computing device and stored on the computing device 100, in other embodiments.
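As a purely illustrative sketch of the training step, a stored template might be reduced to the per-feature mean of labelled training samples. The sample values below are hypothetical, and actual embodiments may use machine learning rather than simple averaging:

```python
def train_template(samples):
    """Build a reference template as the per-feature mean of the
    labelled training samples collected for one user."""
    n = len(samples)
    return [sum(col) / n for col in zip(*samples)]

# Hypothetical gait feature vectors captured for one user during training.
gait_samples = [[1.0, 2.0], [1.5, 1.5], [0.5, 2.5]]
print(train_template(gait_samples))  # [1.0, 2.0]
```

The resulting vector plays the role of a template such as template 244: later samples are compared against it to decide whether the same user is present.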
In various embodiments, the login module 202 may be operated by the one or more processors 102 to authenticate a user of the computing device 100. In embodiments, an active authentication factor such as a thumbprint reading using the fingerprint sensor 118 may be used. The access control module 204 may be operated to establish an access control state corresponding to a user profile associated with the authenticated user. For example, if the login module 202 authenticates the first user 170, a first access control state corresponding to the first user profile 230 associated with the first user 170 may be established by the access control module 204. The presentation module 206, operated by the one or more processors 102, may present a resource to the authenticated user based at least in part on the established access control state.
The contextual authentication module 210 may be operated by the one or more processors 102 to detect a user interaction change associated with the computing device 100 that indicates a different user, such as the second user 174, has device focus. The user interaction change may include a change in biometric or behaviometric characteristics determined by processing sensor data or application or user interface usage data in various embodiments, for example. The contextual authentication module 210 may continuously monitor passive authentication factors to detect user characteristics as the computing device 100 is being used. When a detected change in user characteristics indicates the computing device 100 is being used by a different user, a user profile may be assigned that corresponds to the current rather than the previous user, or alternatively to a delegate profile of the previous user that may also be based at least in part upon the current user. The sensor processing module 212 may process a sensor output from the motion sensor 116 and generate the user focus identifier 260 indicating the second user 174 has device focus, for example. The profile selection module 214 may be operated by the one or more processors 102 to select the second user profile 232 based on the user focus identifier 260 and the second template 246. In embodiments, the contextual authentication module 210 may include a biometrics and/or behaviometrics ML classifier or set of classifiers as a part of the classifier module 216 that generates sample data suitable for establishing a user identity based at least in part on user characteristics.
Reference sample data, such as may be stored in templates 244, 246, or 248, may be compared with the sampled data to determine a user match. This may include a match of a first user, a second user, or both users, for example. The contextual authentication module 210 may also include a first user focus context classifier and a second user focus context classifier that determine when a user has device focus. Device focus may be established when a user is observing content on a display or other content rendering device, or when the user is inputting data through an input device such as a computer keyboard, mouse, microphone, or camera. The user focus classifiers may establish to the OS which user is logged in and which user is a delegate of the first.
In various embodiments, the user proximity module 218 may be operated by the one or more processors 102 to determine a proximity status associated with the currently logged-in user, such as the first user 170, after the contextual authentication module 210 detects a user interaction change associated with the computing device 100 indicating that a different user has device focus. The access control module 204 may be operated by the one or more processors 102 to terminate the second access control state if the proximity status reaches a predetermined value. The predetermined value may correspond to an approximate distance, such as greater than thirty feet, for example. Other distances may also be used. An approximate distance of a user from the computing device 100 may be determined using power levels associated with a received signal strength indicator (RSSI) or by using a global positioning system (GPS) location, for example. In embodiments, the user proximity module 218 may detect a proximity status regardless of whether a user interaction change has been detected.
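By way of illustration, an RSSI reading might be converted to an approximate distance with a log-distance path-loss model and compared against a cutoff near thirty feet (about 9.1 m). The reference power and path-loss exponent below are assumed values for the sketch, not parameters taken from the disclosure:

```python
def rssi_to_distance(rssi_dbm, tx_power_dbm=-59, path_loss_exp=2.0):
    """Estimate distance in meters from a received signal strength
    reading; tx_power_dbm is the expected RSSI at 1 m."""
    return 10 ** ((tx_power_dbm - rssi_dbm) / (10 * path_loss_exp))

def proximity_exceeded(rssi_dbm, limit_m=9.1):
    """True when the estimated distance passes the ~30 ft cutoff,
    i.e., when an access control state could be terminated."""
    return rssi_to_distance(rssi_dbm) > limit_m

print(rssi_to_distance(-59))    # 1.0 m at the reference power
print(proximity_exceeded(-85))  # weak signal -> user likely far away
```

In practice the exponent and reference power vary with the environment and radio hardware, which is why the disclosure treats the distance as approximate.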
Referring now to FIG. 3, various embodiments of a computing device 300 may include a host system on a chip (SOC) 302 in data communication with a contextual authentication technology (CAT) system 304 operating in a TEE. The host SOC 302 may include a processor or processing cores, memory, graphics, virtualization, and other capabilities suitable for hosting an operating system (OS) and applications. In embodiments, some or all elements of SOC 302 may be implemented as separate components rather than being integrated on a SOC. The CAT system 304 may be in signal communication with a sensor layer 306.
In embodiments, the sensor layer 306 may include a sensor hub, one or more sensor devices, and an input/output (IO) subsystem that may include IO controllers, internet protocol (IP) blocks, and control logic. The sensor layer 306 may also include trusted IO technology that hardens the IO path between the sensor hub and/or sensor devices and a TEE subsystem. Sensor devices may employ a variety of sensing technology and may include a video camera 308, a microphone 309, an ultrasonic sensor 310, a multi-axis motion sensor 312, a wireless radio 313, and an RFID sensing device 314 in various embodiments. Additional or alternative sensors may also be included in embodiments.
In various embodiments, the CAT system 304 may be implemented in a TEE and be in data communication with one or more user profiles 316. Hosting the CAT system in a TEE may provide additional protection against malware attacks that may exist on the host system OS or applications. The CAT system 304 may include biometric and/or behaviometric machine learning (ML) classifiers 318 that may be used to generate sample data suitable for establishing a user identity based at least in part on user characteristics such as biometric or behaviometric information. Reference sample data based at least in part on user characteristics and stored in the user profiles 316 may be compared with the generated sample data to determine a match of a first user, a second user, or both users. The CAT system 304 may also include a first user focus context classifier 320 and a second user focus context classifier 322 that may determine when a user has device focus. Device focus may be established when a user is observing content on a display or other content rendering device, or when a user is inputting data through an input device such as a computer keyboard, mouse, microphone, or camera. The user focus context classifiers 320, 322 may establish to the OS which user is logged in and which user is a delegate of the other. In embodiments, the biometric and/or behaviometric ML classifiers 318, the first user focus context classifier 320, and the second user focus context classifier 322 may be included as a part of the contextual authentication module 210 as discussed with respect to FIG. 2.
In various embodiments, the computing device 300 may be a tablet computing device, and as a user picks up the computing device 300, the CAT system 304 may employ one or more sensors in combination with the ML classifiers 318 to determine who is attempting to use the computing device 300. When the user is authenticated, the CAT system 304 may access the relevant user profile. If an unknown user is detected, or the CAT system 304 cannot detect a particular user with a predetermined confidence level, the CAT system 304 may offer access to a guest profile.
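The fallback to a guest profile when no user is recognized with sufficient confidence might look like the following sketch; the score values, threshold, and profile names are hypothetical illustrations, not values from the disclosure:

```python
def select_profile(scores, profiles, min_confidence=0.8):
    """Return the profile of the highest-scoring candidate user, or
    the guest profile when no classifier score reaches the
    predetermined confidence level."""
    user, score = max(scores.items(), key=lambda kv: kv[1])
    return profiles[user] if score >= min_confidence else profiles["guest"]

profiles = {"user1": "first_user_profile", "guest": "guest_profile"}
print(select_profile({"user1": 0.93}, profiles))  # confident match
print(select_profile({"user1": 0.41}, profiles))  # low confidence -> guest
```

Offering a guest profile rather than denying access preserves the shared-device experience while keeping each enrolled user's applications and data private.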
The host SOC 302 may host an OS 324 that may allow operation based on user profiles, indicated in FIG. 3 as a logged-in user 326 and a delegate user 328. The host SOC 302 may host a shared application 330 that allows for varying access to one or more resources 332 based on access rules such as first user access rules 334 and second user access rules 336. The shared application 330 may have access to resources 332 that may be consumed (viewed, modified, or deleted) by a logged-in user or by a delegate user. A first user 340 and a second user 342 may share the computing device 300 and each have a user profile stored with the user profiles 316 as well as a biometric or behaviometric classifier stored with the classifiers 318. The shared application 330 and the operating system 324 may enforce the first user access rules 334 and the second user access rules 336 for the first user 340 and the second user 342. The CAT system 304 may maintain a continuous authenticated context for both the first and second users using passive authentication based at least in part on user characteristics such as biometric or behaviometric information, rather than requiring an active authentication factor for every user switch. The logged-in user 326 may access the resources 332 with a set of resource access rights while the delegate user 328 may access the resources 332 with a different set of access rights. In various embodiments, some or all of the components of the computing device 300 may be included as a part of the computing device 100 described with respect to FIGS. 1 and 2.
FIG. 4 depicts an example process 400 for simultaneously managing access rights of multiple users that may be implemented by the computing device 100 or the computing device 300 described with respect to FIGS. 1-3, in accordance with various embodiments. In various embodiments, the process 400 may be performed by the login module 202, access control module 204, presentation module 206, sensor processing module 212, profile selection module 214, classifier module 216, and/or user proximity module 218. In other embodiments, the process 400 may be performed with more or fewer modules and/or with some operations in a different order. As shown, for the embodiments, the process 400 may start at a block 402. At operation 404, a first user log-in may be facilitated, and user rights corresponding to a user profile associated with the first user may be assigned to a user context. This may be performed by accepting a presentation of the first user's thumbprint at the fingerprint sensor 118 such that the login module 202 may access the first user profile 230 and the access control module 204 may assign user access rights corresponding to the first user profile 230, for example. The first user may then be able to access resources such as local resources 158 or resources served or streamed from the local server 160, media server 162, or social network server 164 based at least in part on the user access rights assigned by the access control module 204.
At operation 406, the computing device 100 may monitor passive authentication factors in a continuous manner. This may be performed by the sensor processing module 212 of the contextual authentication module 210 monitoring the motion sensor 116 and the classifier module 216 generating biometric or behaviometric sample data based at least in part on output from the motion sensor 116, for example. Multiple sensors may be monitored in embodiments, or behavioral factors may be used instead of or in addition to biometric factors. Passive authentication factors may include any combination of biometric or behaviometric authentication factors in various embodiments. For example, in embodiments, biometric authentication factors may include, without limitation, hand movement or gait characteristics based at least in part on motion sensor data, image patterns based at least in part on camera data, or user characteristics based at least in part on ultrasonic or infrared sensor data. Behaviometric authentication factors may include, without limitation, patterns of application usage or patterns of user interface interaction in various embodiments.
At a decision block 408, it may be determined whether the first user is observed. This may be performed by the profile selection module 214 comparing biometric data generated by the classifier module 216 or the biometric ML classifier 318, at least partially based on information from the motion sensor 116 or 312, to reference data stored in the template 244, for example. The sample and reference biometric or behaviometric data compared by the profile selection module 214 may be based at least in part on any combination of sensor data or behavioral patterns in various embodiments and are not limited to information from the motion sensor 116 or 312.
If the first user is observed, it may be determined at a decision block 410 whether a second user is observed. This may be performed by the profile selection module 214 comparing biometric data generated by the classifier module 216 or the biometric ML classifier 318 to reference data stored in the second template 246 and the third template 248, for example. If a second user is observed, delegate user rights may be assigned to a second user context at a block 412 in various embodiments. This may be performed by the profile selection module 214 selecting the first delegate profile 236 based at least in part on the reference data in the second template 246 and biometric data generated by the biometric ML classifier 318, for example. The second user may then be able to consume one or more resources based at least in part on the assigned delegate user rights in various embodiments. Resource access rights may overlap between a logged-in user and a delegate of the logged-in user to support resource sharing (e.g., both may view a common display while consuming content). However, differential rights may also be enforced in some cases: a delegate user may not be able to modify a file while having input focus, whereas a logged-in user may be able to modify a file while having input focus, for example. If, at the decision block 410, a second user was not observed, the process 400 may loop back to the operation 406 such that the monitoring of passive authentication factors continues.
In embodiments, if at the decision block 408 the first user is not observed, it may be determined at a decision block 414 whether a second user is observed. If a second user is observed at the decision block 414, the first user access rights may be rescinded at a block 416. This may be performed by the access control module 204 based at least in part on data from the profile selection module 214, for example. The block 416 may also include logging in the second user and assigning user rights corresponding to a user profile associated with the second user. This may involve an active authentication factor such as a thumbprint presented at the fingerprint sensor 118, in embodiments. The second user may then, at a block 418, logically become the first user for purposes of the logic of the process 400. In embodiments, if, at the decision block 414, a second user is not observed, first and second user rights may be rescinded at a block 420. This may be performed by the access control module 204, in embodiments. The process 400 may end at a block 422 after the first and second user access rights are rescinded. Although not shown, in embodiments, the process 400 may proceed to a state following the end block 422 where the system monitors for active authentication factors such as a thumbprint or a password, as may occur if the process returned to the start block 402, for example.
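The branching among decision blocks 408, 410, and 414 and the actions at blocks 412, 416, and 420 can be summarized, for illustration only, by a small decision function; the action labels are hypothetical names chosen for the sketch, not claim language:

```python
def next_action(first_observed, second_observed):
    """Map the outcome of the passive observation checks to the
    access-control action taken by process 400."""
    if first_observed and second_observed:
        return "assign_delegate_rights"          # block 412
    if first_observed:
        return "continue_monitoring"             # loop back to operation 406
    if second_observed:
        return "rescind_first_and_login_second"  # blocks 416 and 418
    return "rescind_all_rights"                  # block 420, then end at 422

print(next_action(True, True))
print(next_action(False, False))
```

Note that only the lower-left branch (first user gone, second user present) calls for a fresh active authentication factor; the others rely on the continuously maintained passive context.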
FIG. 5 depicts an example process 500 for presenting resources that may be implemented by the computing device 100 or 300 in accordance with various embodiments. The process 500 may be performed by, e.g., the earlier described login module 202, access control module 204, presentation module 206, sensor processing module 212, profile selection module 214, classifier module 216, and/or user proximity module 218. In alternate embodiments, the process 500 may be performed by more or fewer modules, and/or in a different order. At operation 502, a first user of the computing device 100, such as the first user 170, may be authenticated and a first user profile corresponding to the first user may be selected. This may occur by presentation of a thumbprint to the fingerprint reader 118 or user input of an authentication password, in embodiments. At block 504, a first access control state corresponding to a first user profile associated with the first user may be established. For example, a first access control state corresponding to the first user profile 230 associated with the first user 170 may be established by the access control module 204.
At operation 506, a second user profile may be selected that indicates a different user, such as the second user 174, has device focus. In various embodiments, selecting the second user profile may include detecting a user characteristic change at a block 508 and selecting a second user profile at a block 510 based at least in part on the detected user characteristic change. Selecting the second user profile at the block 510 may also be based at least in part on the first user profile such that the second user profile may be a delegate profile relating to the first user profile.
In embodiments, detecting a user characteristic change at the block 508 may include monitoring one or more sensors such as the motion sensor 116, the microphone 122, the camera 128, or the ultrasonic sensor 310, for example. Detecting a user characteristic change may further include receiving a sensor output, such as from the motion sensor 116, at the block 508 and classifying the output, such as with the classifier module 216, to determine whether characteristics such as biometric or behaviometric characteristics of the current user have changed. In embodiments, a movement of the computing device 100 or 300 may be detected that indicates the computing device has been picked up or has been passed from one person to another as a part of detecting a user characteristic change at the block 508.
Selecting a second user profile at the block 510 may also be based at least in part on the sensor output and a previously stored template based at least in part on biometric information associated with the second user, the template generated by a machine learning classifier. The template may include biometric reference sample data such as that described with respect to the template 246 that may be generated by the classifier module 216 during a training process, for example. In embodiments, receiving the sensor output may be performed by the sensor processing module 212 and selecting the user profile may be performed by the profile selection module 214, for example. At a block 512, a second access control state based at least in part on the second user profile may be established. This may be performed by the access control module 204, for example. In embodiments, the second access control state may be based at least in part on a delegate user profile.
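Profile selection against previously stored templates can be sketched as a nearest-template match. This is only an illustration: the disclosure uses a machine learning classifier, whereas the Euclidean distance, the `threshold` value, and the function name below are stand-in assumptions.

```python
import math


def select_profile(sample, templates, threshold=1.0):
    """Return the profile whose stored template best matches the biometric
    sample, or None if no template is within the match threshold."""
    best_profile, best_distance = None, threshold
    for profile, template in templates.items():
        distance = math.dist(sample, template)  # stand-in similarity measure
        if distance < best_distance:
            best_profile, best_distance = profile, distance
    return best_profile
```

A sample near a stored template selects that template's profile; a sample far from every template selects no profile, which could prompt a fallback to active authentication.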
In various embodiments, a proximity status associated with the first user may be determined at a block 514 after establishing the second access control state. This may be performed by the user proximity module 218 based at least in part on information from the Bluetooth enabled device 180 received by the sensor processing module 212, for example. At a decision block 516, it may be determined whether the proximity status has reached a predetermined value. For example, it may be determined whether the proximity status has reached a level corresponding to greater than approximately 30 feet away from the computing device 100. If, at the decision block 516, it is determined that the proximity status has not reached the predetermined value, a resource may be presented at a block 518 based at least in part on the second access control state. If, at the decision block 516, it is determined that the proximity status has reached the predetermined value, the second access state may be terminated at a block 520.
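The proximity check at blocks 514-520 reduces to comparing a distance against a predetermined value. The sketch below assumes the approximately 30-foot figure from the example; the constant and function names are hypothetical.

```python
PROXIMITY_LIMIT_FEET = 30.0  # assumed predetermined value from the example


def next_action(first_user_distance_feet: float) -> str:
    """Decide between presenting the resource and terminating the state."""
    if first_user_distance_feet > PROXIMITY_LIMIT_FEET:  # decision block 516
        return "terminate_second_access_state"           # block 520
    return "present_resource"                            # block 518
```

While the first user remains within the limit, the delegate keeps consuming the resource; once the limit is exceeded, the second access control state is terminated.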
Referring now to FIG. 6, an example computer 600 suitable to practice the present disclosure as earlier described with reference to FIGS. 1-3 is illustrated in accordance with various embodiments. As shown, computer 600 may include one or more processors or processor cores 602, and system memory 604. For the purpose of this application, including the claims, the terms “processor” and “processor cores” may be considered synonymous, unless the context clearly requires otherwise. Additionally, computer 600 may include one or more graphics processors 605, mass storage devices 606 (such as diskette, hard drive, compact disc read only memory (CD-ROM), and so forth), input/output devices 608 (such as display, keyboard, cursor control, remote control, gaming controller, image capture device, and so forth), sensor hub 609 that may function in a similar manner as that described with respect to sensor hub 108 of FIG. 1, and communication interfaces 610 (such as network interface cards, modems, infrared receivers, radio receivers (e.g., Bluetooth), and so forth). The elements may be coupled to each other via system bus 612, which may represent one or more buses. In the case of multiple buses, they may be bridged by one or more bus bridges (not shown).
Each of these elements may perform its conventional functions known in the art. In particular, system memory 604 and mass storage devices 606 may be employed to store a working copy and a permanent copy of the programming instructions implementing the operations associated with the computing device 100 or the computing device 300, e.g., operations described for modules 146, 150, 202, 204, 206, 210, 212, 214, 216, 218, 318, 320, and 322 shown in FIGS. 1-3, or operations shown in process 400 of FIG. 4 or process 500 of FIG. 5, collectively denoted as computational logic 622. The system memory 604 and mass storage devices 606 may also be employed to store a working copy and a permanent copy of the programming instructions implementing the operations associated with the OS 142, the application 144, the OS 324, and the application 330. The system memory 604 and mass storage devices 606 may also be employed to store the data 152, the local resources 158, the user profiles 316, and the resources 332. The various elements may be implemented by assembler instructions supported by processor(s) 602 or high-level languages, such as, for example, C, that can be compiled into such instructions.
The permanent copy of the programming instructions may be placed into mass storage devices 606 in the factory, or in the field, through, for example, a distribution medium (not shown), such as a compact disc (CD), or through communication interface 610 (from a distribution server (not shown)). That is, one or more distribution media having an implementation of the agent program may be employed to distribute the agent and program various computing devices.
The number, capability, and/or capacity of these elements 608-612 may vary, depending on whether computer 600 is a stationary computing device, such as a set-top box or desktop computer, or a mobile computing device, such as a tablet computing device, laptop computer, or smartphone. Their constitutions are otherwise known, and accordingly will not be further described.
FIG. 7 illustrates an example of at least one non-transitory computer-readable storage medium 702 having instructions configured to practice all or selected ones of the operations associated with the computing device 100 or the computing device 300, earlier described, in accordance with various embodiments. As illustrated, the at least one computer-readable storage medium 702 may include a number of programming instructions 704. The storage medium 702 may represent a broad range of persistent storage media known in the art, including but not limited to flash memory, dynamic random access memory, static random access memory, an optical disk, a magnetic disk, etc. Programming instructions 704 may be configured to enable a device, e.g., computer 600, computing device 100, or computing device 300, in response to execution of the programming instructions, to perform, e.g., but not limited to, various operations described for modules 146, 150, 202, 204, 206, 210, 212, 214, 216, 218, 318, 320, and 322 shown in FIGS. 1-3, or operations of process 400 of FIG. 4 or process 500 of FIG. 5. In alternate embodiments, programming instructions 704 may be disposed on multiple computer-readable storage media 702.
Referring back to FIG. 6, for an embodiment, at least one of processors 602 may be packaged together with memory having computational logic 622 configured to practice aspects described for modules 146, 150, 202, 204, 206, 210, 212, 214, 216, 218, 318, 320, and 322 shown in FIGS. 1-3, or operations of process 400 of FIG. 4 or process 500 of FIG. 5. For an embodiment, at least one of processors 602 may be packaged together with memory having computational logic 622 configured to practice aspects described for modules 146, 150, 202, 204, 206, 210, 212, 214, 216, 218, 318, 320, and 322 shown in FIGS. 1-3, or operations of process 400 of FIG. 4 or process 500 of FIG. 5, to form a System in Package (SiP). For an embodiment, at least one of processors 602 may be integrated on the same die with memory having computational logic 622 configured to practice aspects described for modules 146, 150, 202, 204, 206, 210, 212, 214, 216, 218, 318, 320, and 322 shown in FIGS. 1-3, or operations of process 400 of FIG. 4 or process 500 of FIG. 5. For an embodiment, at least one of processors 602 may be packaged together with memory having computational logic 622 configured to practice aspects of process 400 of FIG. 4 or process 500 of FIG. 5 to form a System on Chip (SoC). For at least one embodiment, the SoC may be utilized in, e.g., but not limited to, a mobile computing device such as a computing tablet and/or a smartphone.
Machine-readable media (including non-transitory machine-readable media, such as machine-readable storage media), methods, systems and devices for performing the above-described techniques are illustrative examples of embodiments disclosed herein. Additionally, other devices in the above-described interactions may be configured to perform various disclosed techniques.
EXAMPLES
Some non-limiting examples are:
Example 1 may include a computing device comprising: one or more processors; a memory coupled with the one or more processors; a login module operated by the one or more processors to authenticate a first user of the device and establish a first access control state corresponding to a first user profile associated with the first user; a contextual authentication module operated by the one or more processors to select a second user profile based at least in part on a changed user characteristic; and a presentation module operated by the one or more processors to present a resource based at least in part on the second user profile.
Example 2 may include the subject matter of Example 1, wherein the computing device further comprises a sensor, and wherein the contextual authentication module comprises: a profile selection module operated by the one or more processors to select the second user profile based at least in part on an output of the sensor and a previously stored template generated by a machine learning classifier.
Example 3 may include the subject matter of Example 2, wherein the sensor is a motion sensor and wherein the profile selection module comprises a biometric machine learning classifier, wherein the profile selection module is to perform a biometric information classification of the output of the sensor and select the second user profile based at least in part on the biometric information classification and the previously stored template.
Example 4 may include the subject matter of any one of Examples 1-3, wherein the login module is to authenticate the first user of the device based at least in part on an active authentication factor.
Example 5 may include the subject matter of any one of Examples 1-4, wherein the computing device further comprises: an access control module operated by the one or more processors to establish a second access control state based at least in part on the second user profile; and a user proximity module operated by the one or more processors to determine a proximity status associated with the first user, wherein the access control module is operated by the one or more processors to terminate the second access control state if the proximity status reaches a predetermined value.
Example 6 may include the subject matter of any one of Examples 1-5, wherein the computing device further comprises a trusted execution environment operated by one of the processors to host operation of the contextual authentication module.
Example 7 may include the subject matter of any one of Examples 1-6, wherein the contextual authentication module is operated by the one or more processors to select a delegate profile as the second user profile.
Example 8 may include the subject matter of any one of Examples 1-5, wherein the computing device is a tablet computing device, wherein the contextual authentication module comprises a profile selection module operated by the one or more processors in a trusted execution environment to select a second user profile based at least in part on a previously stored template generated by a machine learning classifier.
Example 9 may include a computer implemented method comprising: authenticating, by a computing device, a first user of the device; establishing, by the computing device, a first access control state corresponding to a first user profile associated with the first user; selecting, by the computing device, a second user profile based at least in part on a changed user characteristic; and presenting, by the computing device, a resource based at least in part on the second user profile.
Example 10 may include the subject matter of Example 9, wherein selecting, by the computing device, the second user profile comprises: receiving, by the computing device, a sensor output; performing a classification of the sensor output to generate sample data; and selecting, by the computing device, the second user profile based at least in part on the sample data and a previously stored template generated by a machine learning classifier.
Example 11 may include the subject matter of Example 10, wherein receiving comprises receiving a sensor output from a motion sensor, wherein performing comprises performing a biometric classification of the motion sensor output to generate biometric sample data, and wherein selecting comprises selecting the second user profile based at least in part on the biometric sample data and a previously stored biometric template generated by a biometric machine learning classifier.
Example 12 may include the subject matter of any one of Examples 9-11, wherein authenticating, by the computing device, the first user of the device is based at least in part on an active authentication factor.
Example 13 may include the subject matter of any one of Examples 9-12, further comprising: establishing, by the computing device, a second access control state based at least in part on the second user profile; determining, by the computing device, a proximity status associated with the first user; and terminating, by the computing device, the second access control state if the proximity status reaches a predetermined value.
Example 14 may include the subject matter of any one of Examples 9-13, wherein selecting, by the computing device, the second user profile based at least in part on the changed user characteristic is performed in a trusted execution environment.
Example 15 may include the subject matter of any one of Examples 9-14, wherein the second user profile is a delegate profile.
Example 16 may include the subject matter of any one of Examples 9-13, wherein the computing device is a tablet computing device, wherein selecting, by the computing device, the second user profile based at least in part on the changed user characteristic comprises selecting in a trusted execution environment, by the computing device, a second user profile based at least in part on a previously stored template generated by a machine learning classifier.
Example 17 may include at least one non-transitory computer-readable medium comprising instructions stored thereon that, in response to execution of the instructions by a computing device, cause the computing device to: authenticate a first user of the device; establish a first access control state corresponding to a first user profile associated with the first user; select a second user profile based at least in part on a changed user characteristic; and present a resource based at least in part on the second user profile.
Example 18 may include the subject matter of Example 17, wherein to select the second user profile, the computing device is caused to: perform a classification of a sensor output to generate sample data; and select a second user profile based at least in part on the sample data and a previously stored template generated by a machine learning classifier.
Example 19 may include the subject matter of Example 18, wherein the computing device is caused to perform a biometric classification of a motion sensor output to generate biometric sample data and wherein the computing device is caused to select the second user profile based at least in part on the biometric sample data and a previously stored biometric template generated by a biometric machine learning classifier.
Example 20 may include the subject matter of any one of Examples 17-19, wherein the computing device is caused to authenticate the first user of the device based at least in part on an active authentication factor.
Example 21 may include the subject matter of any one of Examples 17-20, wherein the computing device is further caused to: establish a second access control state based at least in part on the second user profile; determine a proximity status associated with the first user; and terminate the second access control state if the proximity status reaches a predetermined value.
Example 22 may include the subject matter of any one of Examples 17-21, wherein the computing device is further caused to select the second user profile in a trusted execution environment.
Example 23 may include the subject matter of any one of Examples 17-22, wherein the computing device is further caused to select a delegate profile as the second user profile.
Example 24 may include the subject matter of any one of Examples 17-21, wherein the computing device is a tablet computing device, wherein the tablet computing device is caused to select the second user profile in a trusted execution environment of the tablet computing device based at least in part on a previously stored template generated by a machine learning classifier.
Example 25 may include a computing device comprising: means for authenticating a first user of the device; means for establishing a first access control state corresponding to a first user profile associated with the first user; means for selecting a second user profile based at least in part on a changed user characteristic; and means for presenting a resource based at least in part on the second user profile.
Example 26 may include the subject matter of Example 25, wherein the means for selecting the second user profile comprises: means for receiving a sensor output; means for performing a classification of the sensor output to generate sample data; and means for selecting the second user profile based at least in part on the sample data and a previously stored template generated by a machine learning classifier.
Example 27 may include the subject matter of Example 26, wherein the means for receiving comprises means for receiving a sensor output from a motion sensor, wherein the means for performing comprises means for performing a biometric classification of the motion sensor output to generate biometric sample data, and wherein the means for selecting comprises means for selecting the second user profile based at least in part on the biometric sample data and a previously stored biometric template generated by a biometric machine learning classifier.
Example 28 may include the subject matter of any one of Examples 25-27, wherein the means for authenticating the first user of the device is based at least in part on an active authentication factor.
Example 29 may include the subject matter of any one of Examples 25-28, further comprising: means for establishing a second access control state based at least in part on the second user profile; means for determining a proximity status associated with the first user; and means for terminating the second access control state if the proximity status reaches a predetermined value.
Example 30 may include the subject matter of any one of Examples 25-29, wherein the means for selecting the second user profile based at least in part on the changed user characteristic is in a trusted execution environment.
Example 31 may include the subject matter of any one of Examples 25-30, wherein the second user profile is a delegate profile.
Example 32 may include the subject matter of any one of Examples 25-29, wherein the computing device is a tablet computing device, wherein the means for selecting the second user profile based at least in part on the changed user characteristic comprises means for selecting in a trusted execution environment a second user profile based at least in part on a previously stored template generated by a machine learning classifier.
Although certain embodiments have been illustrated and described herein for purposes of description, a wide variety of alternate and/or equivalent embodiments or implementations calculated to achieve the same purposes may be substituted for the embodiments shown and described without departing from the scope of the present disclosure. This application is intended to cover any adaptations or variations of the embodiments discussed herein. Therefore, it is manifestly intended that embodiments described herein be limited only by the claims.
Where the disclosure recites “a” or “a first” element or the equivalent thereof, such disclosure includes one or more such elements, neither requiring nor excluding two or more such elements. Further, ordinal indicators (e.g., first, second or third) for identified elements are used to distinguish between the elements, and do not indicate or imply a required or limited number of such elements, nor do they indicate a particular position or order of such elements unless otherwise specifically stated.