Detailed Description
Example embodiments will now be described more fully with reference to the accompanying drawings. Example embodiments may, however, be embodied in many different forms and should not be construed as limited to the examples set forth herein; rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the concept of example embodiments to those skilled in the art. The described features, structures, or characteristics may be combined in any suitable manner in one or more embodiments. In the following description, numerous specific details are provided to give a thorough understanding of embodiments of the disclosure. One skilled in the relevant art will recognize, however, that the subject matter of the present disclosure can be practiced without one or more of the specific details, or with other methods, components, devices, steps, and the like. In other instances, well-known technical solutions have not been shown or described in detail to avoid obscuring aspects of the present disclosure.
The terms "a," "an," "the," and "said" are used in this specification to denote the presence of one or more elements/components/parts/etc.; the terms "comprising" and "having" are intended to be inclusive and mean that there may be additional elements/components/etc. other than the listed elements/components/etc.; the terms "first" and "second", etc. are used merely as labels, and are not limiting on the number of their objects.
Furthermore, the drawings are merely schematic illustrations of the present disclosure and are not necessarily drawn to scale. The same reference numerals in the drawings denote the same or similar parts, and thus their repetitive description will be omitted. Some of the block diagrams shown in the figures are functional entities and do not necessarily correspond to physically or logically separate entities.
Aiming at the problems in the related art, the disclosure provides a control display method. Fig. 1 shows a flowchart of a control display method, and as shown in fig. 1, the control display method at least includes the following steps:
S110, dividing a display interface of the target terminal into a first area and a second area, and determining a target service that can be selected by a user, so as to display a service control corresponding to the target service in the first area.
S120, receiving a user image of the user, determining user data corresponding to the user image, and displaying a user control corresponding to the user data in the second area.
S130, in response to a touch operation of the user on the user control, displaying a recommended service in the second area; the recommended service comprises at least one target service having an interest relationship with the user.
S140, in response to a touch operation of the user on the service control, displaying the target service corresponding to the service control in the second area.
In the method and the device provided by this exemplary embodiment of the disclosure, on one hand, the user control is displayed in the second area, and when the user control is touched, a target service having an interest relationship with the user is displayed in the second area, which not only increases the interest of the interaction process but also improves the convenience with which the user obtains the target service; on the other hand, the service control displayed in the first area corresponds to a target service selectable by the user, and when the user touches the service control, the target service is displayed in the second area, which perfects the logic of displaying the target service, avoids the situation in which the user cannot obtain a required target service when the recommended service does not match the user's interests, and improves the user experience.
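As a concrete illustration, steps S110 to S140 can be condensed into a small sketch. This is a minimal illustration only; the class and method names are hypothetical and not identifiers from the disclosure:

```python
# Hypothetical sketch of steps S110-S140; all names are illustrative,
# not identifiers from the disclosure.
class ControlDisplay:
    def __init__(self, target_services):
        self.first_area = []    # service controls (S110)
        self.second_area = []   # user controls, recommendations, services
        self.target_services = target_services

    def s110_show_service_controls(self):
        # S110: one service control per user-selectable target service.
        self.first_area = [f"service_control:{s}" for s in self.target_services]

    def s120_show_user_control(self, user_data):
        # S120: display a user control for the recognized user data.
        self.second_area.append(f"user_control:{user_data}")

    def s130_on_user_control_touch(self, recommended):
        # S130: display the recommended services in the second area.
        self.second_area.append(f"recommended:{recommended}")

    def s140_on_service_control_touch(self, service):
        # S140: display the touched target service in the second area.
        self.second_area.append(f"target_service:{service}")
```

The sketch only records what is displayed in which area; the actual rendering, touch handling, and recognition logic are described in the steps below.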
The following describes each step of the control display method in detail.
In step S110, a display interface of the target terminal is divided into a first area and a second area, and a target service that can be selected by a user is determined, so that a service control corresponding to the target service is displayed in the first area.
In an exemplary embodiment of the present disclosure, the target terminal may be a terminal provided with a display screen, and specifically, the target terminal may be a terminal provided with a display screen and placed in a bank lobby, or a terminal provided with a display screen and placed in a stock exchange, which is not particularly limited in this exemplary embodiment. The display interface is an interface for displaying information in a display screen.
The first area and the second area refer to two different areas obtained by dividing the display interface; the display interface may be divided into an upper area and a lower area, or into a left area and a right area.
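The division described above can be illustrated with a small helper; the function name, the (x, y, w, h) rectangle representation, and the split ratio are all illustrative assumptions:

```python
# Illustrative division of a display interface into a first and second area.
# The rectangle representation (x, y, width, height) and the split ratio
# are assumptions made for this sketch.
def divide_interface(width, height, orientation="top_bottom", ratio=0.3):
    """Return (first_area, second_area) as (x, y, w, h) rectangles."""
    if orientation == "top_bottom":
        h1 = int(height * ratio)
        return (0, 0, width, h1), (0, h1, width, height - h1)
    # "left_right": first area on the left, second area on the right
    w1 = int(width * ratio)
    return (0, 0, w1, height), (w1, 0, width - w1, height)
```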
The target service refers to a service that can be selected by a user, that is, a service that a user can participate in; the service control refers to a control that corresponds to the target service and is displayed in the first area. A service name representing the target service may be displayed on the service control, and a service type representing the target service may also be displayed, which is not particularly limited in this exemplary embodiment.
For example, fig. 2 shows a schematic diagram of a display interface of a target terminal, where the interface 200 is the display interface of the target terminal, the area 210 is a first area in the target terminal, and the area 220 is a second area in the target terminal, where the first area is used to display a service control.
In the exemplary embodiment, the display interface is divided, and the service control is displayed in the first area, so that the user is prompted that the controls displayed in the first area correspond to target services selectable by the user; therefore, not only is the display effect of the target service improved, but also the convenience for the user to select the target control is improved.
In step S120, if the user image of the user is collected, the user data corresponding to the user image is determined, so as to display the user control corresponding to the user data in the second area.
In the exemplary embodiment of the disclosure, an image of a user may be acquired through a camera arranged on the target terminal, and then the image of the user is recognized through a face recognition module arranged in the target terminal to obtain user data. Based on this, after the user data is obtained, a user control corresponding to the user data may be generated and displayed in the second area; a face picture corresponding to the user may be displayed on the user control, a user name corresponding to the user may also be displayed on the user control, and any information that can represent the user may also be displayed on the user control, so as to represent the relationship between the user control and the user.
It should be noted that, if the camera in the target terminal captures a user image of one user, one or more user controls corresponding to that user are displayed in the second area, and the user controls may move randomly in the second area; if the camera in the target terminal captures user images of multiple users, multiple user controls corresponding to the multiple users are displayed in the second area, and the multiple user controls may move randomly in the second area.
For example, fig. 3 shows a schematic diagram of a display interface of a target terminal, as shown in fig. 3, where the control 310, the control 320, and the control 330 are user controls displayed in the second area and respectively corresponding to user data 1, user data 2, and user data 3, where user data 1 corresponds to user A, user data 2 corresponds to user B, and user data 3 corresponds to user C. The control 340, the control 350, the control 360, and the control 370 are service controls displayed in the first area, where the control 340 corresponds to the target service D, the control 350 corresponds to the target service E, the control 360 corresponds to the target service F, and the control 370 corresponds to the target service G.
For example, fig. 4 shows another schematic diagram of the target terminal display interface, as shown in fig. 4, wherein the control 410 is a user control displayed in the second area and corresponding to user data 4, and user data 4 corresponds to the user H.
In an alternative embodiment, fig. 5 is a flowchart illustrating a method for determining user data in a control display method, as shown in fig. 5, the method at least includes the following steps: in step S510, a face feature corresponding to the user image is identified and compared with a face feature library, where the face feature library includes target face features of a plurality of registered users.
The face features refer to the result obtained by recognizing the user image through a face recognition algorithm model in the target terminal or a server, and the face feature library is a database storing the target face features of a plurality of registered users. For example, if the target terminal is a terminal placed in a bank business hall, the target face features of the registered users may be the face features of users holding the bank's card.
For example, the face features are obtained by recognizing the user image; since the target terminal is a terminal placed in the business hall of bank A, the target face features of the registered users are the face features of users holding a bank card of bank A, and the face features are compared with these target face features.
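The comparison against the face feature library can be sketched as a best match over feature vectors. The use of cosine similarity and the 0.9 threshold are assumptions made for illustration, not details from the disclosure; a real system would use its recognition model's own metric:

```python
import math

# Sketch of comparing recognized face features against the face feature
# library. Cosine similarity and the 0.9 threshold are illustrative
# assumptions, not details from the disclosure.
def cosine_similarity(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b)))

def match_registered_user(face_feature, feature_library, threshold=0.9):
    """Return the id of the best-matching registered user, or None."""
    best_id, best_sim = None, threshold
    for user_id, target_feature in feature_library.items():
        sim = cosine_similarity(face_feature, target_feature)
        if sim >= best_sim:
            best_id, best_sim = user_id, sim
    return best_id
```

A `None` result corresponds to the case in step S530 where no target face feature consistent with the face features exists in the library.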
In step S520, if a target face feature consistent with the face features exists in the face feature library, determining that the user data corresponding to the user image is a face picture; the face picture corresponds to the target user corresponding to the target face feature.
When a target face feature consistent with the face features exists in the face feature library, it is proved that the user image recognized by the target terminal is the user image of a registered user, and the user data is determined to be the face picture of the user. The face picture may be obtained by cropping the collected user image to the user's face using an image processing module in the target terminal, or may be a picture obtained from the user information of the registered user.
For example, if the face features are consistent with the target face features of the registered user a, the user data is a face picture, and the picture may be obtained from the user information of the user a.
In step S530, if there is no target face feature that is consistent with the face feature in the face feature library, it is determined that the user data corresponding to the user image is the user attribute data.
If no target face feature consistent with the face features exists in the face feature library, it is proved that the user corresponding to the user image is a new user, that is, an unregistered user, and at this time, the user data corresponding to the user image is attribute data of the user, for example, the gender of the user or the age of the user, which is not particularly limited in this exemplary embodiment.
In the exemplary embodiment, different user data can be determined by comparing the face features with the target face features, so that the follow-up determination of the recommended service recommended to the user according to different user data is facilitated, the logic for determining the recommended service is perfected, and the recommended service can be determined even if the target face features consistent with the face features do not exist in the face feature library.
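The branch described in steps S520 and S530 can be summarized in a small sketch; the dictionary layout and the pluggable `matcher` function are hypothetical:

```python
# Sketch of the S520/S530 branch: a registered user's data is a face
# picture, an unregistered user's data is attribute data. The dictionary
# layout and the pluggable `matcher` function are hypothetical.
def determine_user_data(face_feature, feature_library, user_attributes, matcher):
    user_id = matcher(face_feature, feature_library)
    if user_id is not None:
        # S520: registered user -> face picture of the matched target user.
        return {"type": "face_picture", "user_id": user_id}
    # S530: unregistered user -> attribute data such as gender or age.
    return {"type": "attributes", "data": user_attributes}
```

Either branch yields user data, so a recommended service can be determined later even when no matching target face feature exists.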
In step S130, in response to a touch operation of the user on the user control, displaying a recommended service in a second area; wherein the recommendation service comprises at least one target service having an interest relationship with the user.
In an exemplary embodiment of the present disclosure, when the user touches the user control, the recommended service may be displayed in the second area, the recommended service is at least one of the target services, and the recommended service is a target service that may be of interest to the user.
For example, fig. 6 shows a schematic diagram of a display interface of a target terminal, as shown in fig. 6, when a user touches the control 320 closer to the bottom of the second area in fig. 3, a recommended service is displayed in an area 610, and the area 610 is a partial area in the second area.
Based on this, the area 610 divides the second area into three areas: the area 610, the area 620, and the area 630. The user control 310 may continue to be displayed in the area 620 and the user control 330 in the area 630, or the user control 330 may continue to be displayed in the area 620 and the user control 310 in the area 630, depending on the positions of user A and user C: if the coordinate data of user A projected onto the display interface belongs to the area 620, the user control 310 is displayed in the area 620, and similarly, the display area of the control 330 is determined according to the same principle.
In an alternative embodiment, fig. 7 is a flowchart illustrating a control display method before displaying a recommended service in a second area, where as shown in fig. 7, the method at least includes the following steps: in step S710, if the area of the display interface is greater than the preset area threshold, a distance between the user and the display interface is obtained.
When the area of the display interface is larger than the preset area threshold, the display position data of the user control may need to be changed to facilitate the touch control of the user; when the area of the display interface is smaller than or equal to the preset area threshold, the display position data of the user control does not need to be changed.
The distance between the user and the display interface is the distance between the position of the user and the position of the display interface.
For example, if the preset area threshold is 4 and the area of the display interface is 5, the distance between the location of the user and the location of the display interface is obtained, and specifically, the distance is 0.5 m.
In step S720, if the distance is smaller than the distance threshold, coordinate data projected to the display interface by the user is obtained.
The distance between the user and the display interface may be detected by a distance detection device on the target terminal, or may be calculated by a computer vision module in the target terminal, which is not particularly limited in this exemplary embodiment.
Specifically, the distance detection device may be a radar device, a laser ranging device, an infrared sensor, or an ultrasonic ranging sensor, which is not particularly limited in this exemplary embodiment. For example, when the distance detection device is an infrared sensor, the transmitting module of the infrared sensor transmits infrared rays; when the infrared rays reach the user, they are reflected back and received by the receiving module, so the time taken by the infrared rays from transmission to reception can be obtained, and the distance between the display interface and the user is calculated from this time using a corresponding ranging formula.
In addition, when the distance between the display interface and the user is obtained through the computer vision module, a depth camera may specifically be used to obtain the distance between the user and the display screen.
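For the infrared time-of-flight case described above, the ranging formula reduces to distance = speed × time / 2, since the measured time covers the round trip. A minimal sketch (the constant and function name are illustrative):

```python
# Time-of-flight ranging as described above: the measured time covers the
# round trip, so the one-way distance is speed * time / 2. The constant
# and function name are illustrative, not from the disclosure.
SPEED_OF_LIGHT = 3.0e8  # m/s, approximate propagation speed of infrared light

def tof_distance(round_trip_seconds, speed=SPEED_OF_LIGHT):
    return speed * round_trip_seconds / 2.0
```

For example, a round trip of 10 nanoseconds gives a distance of 1.5 meters.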
The distance threshold is a critical distance value for judging whether the user is approaching the display interface of the target terminal. When the distance is smaller than the distance threshold, it is proved that the user is moving toward the display interface of the target terminal and may touch the user control displayed in the second area of the display interface.
The coordinate data refers to the coordinates projected onto the display interface by the user. Specifically, if the distance between the display interface and the user is acquired through the computer vision module, after the distance is acquired, the depth camera may take a picture, and the picture is processed with a target detection algorithm to obtain the coordinate data. For example, if the coordinate data projected by the user onto the display interface is (20, 20), this proves that the user is close to the point (20, 20) of the display interface at this time.
For example, the distance threshold is 1 meter, the distance between the position of the user and the position of the display interface is 0.5 meter, and obviously, the distance is smaller than the distance threshold, and the coordinate data projected to the display interface by the user is obtained as (20, 20).
In step S730, the display position of the user control is updated by using the coordinate data, so as to display the user control according to the updated display position.
The display position is the current display position of the user control in the display interface; it is replaced with the display position corresponding to the coordinate data, and the user control is displayed according to the updated display position, so that the user control is displayed at a position convenient for the user to touch.
For example, fig. 8 shows a schematic diagram of updating the display position of the user control, where the arrow start position is the display position of the user control 320 when user B is not close to the display interface, and the block 810 is the position where the user control 320 will be displayed after user B approaches the display interface and the display position data of the user control 320 is updated. It should be noted that when user B is not close to the display interface, there are two user controls 320 in the second area, and the user control 320 whose display position data is updated first is, by default, the user control 320 closer to the bottom of the second area.
In the exemplary embodiment, when the display area is larger than the preset area threshold and the distance is smaller than the distance threshold, the display position data of the user control is updated, so that the situation that the user is not close to the display interface and the display position data of the user control is also updated is avoided, that is, unnecessary performance loss is reduced, and the touch control of the user is facilitated.
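The gating logic of steps S710 to S730 can be sketched as follows; the default threshold values mirror the illustrative numbers used in the examples above (area threshold 4, distance threshold 1 meter) and the function name is hypothetical:

```python
# Sketch of the S710-S730 gating: update the display position only when the
# interface is large enough and the user is close enough. Threshold values
# mirror the illustrative numbers in the text and are not normative.
def maybe_update_position(interface_area, distance, projected_xy, current_xy,
                          area_threshold=4.0, distance_threshold=1.0):
    if interface_area > area_threshold and distance < distance_threshold:
        return projected_xy   # move the control toward the user's projection
    return current_xy         # leave the control where it is
```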
In an alternative embodiment, fig. 9 is a schematic flow chart illustrating a process of acquiring coordinate data of a user relative to a display interface if a distance in a control display method is smaller than a distance threshold, where as shown in fig. 9, the method at least includes the following steps: in step S910, if the distance is smaller than the distance threshold, performing gesture recognition on the user image to obtain a gesture recognition result.
Even if the distance between the position of the user and the position of the display interface is smaller than the distance threshold, the user may only be watching the display interface and may not touch the user control. To further determine the intention of the user, the user image may be recognized by a gesture recognition module in the target terminal to obtain a gesture recognition result.
For example, if the distance is smaller than the distance threshold, it is proved that the user is close to the display interface at this time, and the gesture recognition module in the target terminal recognizes the user image to obtain a gesture recognition result.
In step S920, if the gesture recognition result satisfies the preset gesture condition, coordinate data projected to the display interface by the user is obtained.
The preset gesture condition can be a hand-lifting gesture of the user, and when the gesture recognition result meets the preset gesture condition, the coordinate data projected to the display interface by the user is acquired.
For example, the preset gesture condition is a hand-lifting gesture of the user, the gesture recognition result is the hand-lifting gesture of the user, the gesture recognition result meets the preset gesture condition, and the coordinate data (20, 20) projected to the display interface by the user is acquired at the moment.
In the exemplary embodiment, gesture recognition is performed on the user image, so that whether the user wants to touch the user control can be further determined, the accuracy of determining whether the user intends to touch the user control is increased, the display position data of the user control is not updated when the user has no intention of touching it, and unnecessary performance loss is avoided.
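Steps S910 and S920 can be condensed into a single predicate; the gesture label `"hand_raised"` is an illustrative stand-in for the preset gesture condition:

```python
# Sketch of steps S910/S920: fetch the user's projected coordinates only
# when the user is both close enough and shows the preset gesture. The
# gesture label "hand_raised" is an illustrative stand-in.
def should_fetch_projection(distance, gesture_result,
                            distance_threshold=1.0, trigger="hand_raised"):
    return distance < distance_threshold and gesture_result == trigger
```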
In an optional embodiment, if the distance is smaller than the distance threshold, the method further includes: and differentially displaying the user control at the display position.
The differentiated display is directed at the user control corresponding to a user close to the display interface: the manner of displaying that user control differs from the manner of displaying other user controls, so as to prompt the user who is close to the display interface.
It is assumed that the user control corresponding to a user close to the display interface is a user control A, and there are two other user controls, a user control B and a user control C. Specifically, the differentiated display may add a display special effect in the process of displaying the user control A, for example, an enlarged display or a blinking display; alternatively, the differentiated display may show the user control A in a different color. For example, if the display colors of the user control B and the user control C are black, the user control A may be displayed in yellow, in red, or in any color other than black, which is not particularly limited in this exemplary embodiment.
For example, when the distance between the user and the display interface is less than a distance threshold, a flashing animation is added in the process of displaying the user control corresponding to the user.
In this exemplary embodiment, when the distance is smaller than the distance threshold, the user control is displayed in a differentiated manner, so that the user can be prompted that the user control displayed in a differentiated manner corresponds to the user, and the interest of the interaction of the display interface is increased.
In an alternative embodiment, fig. 10 is a flowchart illustrating a cross-region display of user controls in a control display method, where as shown in fig. 10, the method at least includes the following steps: in step S1010, if the display position is changed, a second target area having an area mapping relationship with the changed display position is acquired; wherein the change of the display position is caused by the user moving, and the second target area is a partial area in the second area.
The second target area is a partial area in the second area. After the display position is changed, the second target area may or may not be consistent with the first target area. The change of the display position is caused by the movement of the user: when the user starts to move, the coordinate data acquired by the target terminal changes, thereby causing the coordinate data corresponding to the moving user to change.
For example, fig. 11 shows a schematic diagram of a display interface with a changed display position, where the user control 320 is displayed at the display position 810 as user B gradually approaches the display interface, and the display position may be changed from the display position 810 to the display position 1110 when the user starts moving toward the left of the display interface.
Fig. 12 shows a schematic diagram of different partial areas in the second area, as shown in fig. 12, where an area 1210 having an area mapping relationship with the display position 1110 is the second target area, an area 1220 having an area mapping relationship with the display position 810 is the first target area, and an area 1230 is another partial area in the second area.
In step S1020, if the second target area is not consistent with the first target area having the area mapping relationship with the display position before the change and no recommended service is displayed in the second target area, the user control corresponding to the user is displayed in the second target area across areas.
That is, if the second target area is inconsistent with the first target area and the recommended service is not displayed in the second target area, the user control corresponding to the user is displayed in the second target area.
For example, as shown in fig. 12, the first target area and the second target area are clearly inconsistent, and the user control 320 is now displayed across areas in the second target area 1210.
In the exemplary embodiment, the display area of the user control can change as the user moves, so that flexibility in the interaction of the display interface is increased, and the convenience for the user to touch the user control is improved.
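The cross-area decision of steps S1010 and S1020 can be sketched as follows; the area identifiers and the function name are hypothetical:

```python
# Sketch of the S1010/S1020 cross-area decision: move the user control to
# the new target area only when the target area actually changed and no
# recommended service already occupies it. Area identifiers are illustrative.
def resolve_display_area(second_target, first_target, recommendation_shown):
    if second_target != first_target and not recommendation_shown:
        return second_target   # display the user control across areas
    return first_target        # keep the control in its previous area
```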
In an alternative embodiment, fig. 13 is a flowchart illustrating a method for displaying a recommended service in a second area in a control display method, where as shown in fig. 13, the method at least includes the following steps: in step S1310, if the user control corresponds to the face picture, a target user corresponding to the face picture is determined, so as to determine a historical user profile corresponding to the target user.
If the user control corresponds to the face picture, it is proved that the target user corresponding to the face picture is a registered user, and the historical user profile of the target user is obtained at this time. It is worth explaining that the historical user profile may record the target services historically selected by the user, the historical consumption records of the user, and any content related to the user's historical data.
For example, if the user control corresponds to a face picture, the target user corresponding to the face picture is determined to be the user A, and the historical user profile of the user A is then determined.
In step S1320, at least one target service having an interest relationship with the target user is determined as a recommended service among the target services according to the historical user profile.
After the historical user profile is obtained, at least one recommended service that the user may be interested in is determined according to the target services previously selected by the user as recorded in the historical user profile, or the recommended service may be selected according to other historical data in the historical user profile.
For example, if it is recorded in the historical user profile that the user has selected the target service B, and the target service B belongs to service type 1, another target service that also belongs to service type 1 may be used as the recommended service.
In step S1330, if the user control corresponds to the user attribute data, at least one target service having an interest relationship with the user attribute data is determined to be a recommended service in the target services.
And if the user control corresponds to the user attribute data, determining at least one target service as a recommended service in the target services according to the attribute data.
For example, if the user control corresponds to the user attribute data, specifically, the attribute data may be age 50, based on which at least one target service suitable for the 50-year-old crowd may be taken as the recommended service.
In step S1340, the recommended service is displayed in the second area.
And after the recommended service is determined, when the user touches the user control, the recommended service is displayed in the second area.
For example, as shown in fig. 3, the determined recommended services are a target service D and a target service E, and when the user touches the user control 310, the target service D and the target service E are displayed in a partial area in the second area.
In the exemplary embodiment, the logic for determining the recommended service is perfected, and the recommended service which is possibly interested by the user can be determined in the target service regardless of whether the user control corresponds to the face picture or the user attribute data.
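The recommendation branch of steps S1310 to S1330 can be sketched as follows; the service metadata layout (a `type` field and an age range per target service) is an illustrative assumption, not a structure specified in the disclosure:

```python
# Sketch of the S1310-S1330 recommendation branch. The metadata layout
# (a "type" field and an age range per target service) is an illustrative
# assumption, not a structure specified in the disclosure.
def recommend_services(user_data, target_services, history=None):
    if user_data["type"] == "face_picture":
        # Registered user: recommend services sharing a type with past choices.
        past_types = {target_services[s]["type"] for s in (history or [])}
        return [s for s, meta in target_services.items()
                if meta["type"] in past_types and s not in (history or [])]
    # Unregistered user: recommend services whose age range covers the user.
    age = user_data["age"]
    return [s for s, meta in target_services.items()
            if meta["min_age"] <= age <= meta["max_age"]]
```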
In an alternative embodiment, fig. 14 is a flowchart illustrating a method for displaying a recommended service in a second area in a control display method, where as shown in fig. 14, the method at least includes the following steps: in step S1410, if there are multiple users, the display position of the touched user control in the display interface is obtained.
If there are multiple users, multiple user controls are displayed in the second area. When one of the multiple users touches the corresponding user control, the display position of the touched user control in the display interface is obtained, that is, the display coordinates of the touched user control in the display interface.
For example, as shown in fig. 3, when the user touches the user control 320, the display position (20, 20) of the touched user control 320 in the display interface is obtained.
In step S1420, a first target area having an area mapping relationship with the display position is determined in the second area, and the recommended service is displayed in the first target area; wherein the first target area is a partial area in the second area.
The first target area is a partial area in the second area, and the area mapping relationship refers to the mapping relationship between the display position and the target area. For example, the second area of the display interface may be divided into at most three partial areas, namely an area 1, an area 2, and an area 3; if the display position belongs to the area 1, the area 1 is determined as the first target area.
It should be noted that the height of the first target area may be the same as the height of the second area while the width of the first target area is smaller than the width of the second area, or the height and the width of the first target area may each be smaller than the height and the width of the second area, which is not particularly limited in this exemplary embodiment.
For example, fig. 15 shows a schematic diagram of the display interface on the target terminal when user B touches the control 320 in fig. 3. As shown in fig. 3, since there are three users, namely user A, user B, and user C, the user control 310 corresponding to user A, the user control 320 corresponding to user B, and the user control 330 corresponding to user C are displayed in the second area. When user B touches the control 320 near the bottom of the second area in fig. 3, the display position of the user control 320 is obtained as (20, 20), the target area having an area mapping relationship with the display position (20, 20) is determined as the area 1510 in fig. 15, and the recommended service is displayed in the target area 1510.
In this exemplary embodiment, the first target area is determined according to the display position of the touched user control, and the recommended service is displayed in the first target area, which improves both the convenience with which the user obtains the recommended service and the interest of the interaction process.
In step S140, if the user touches the service control, a target service corresponding to the service control is displayed in the second area.
In an exemplary embodiment of the disclosure, if the user finds after touching the user control that the recommended service is not a service of interest, or the user does not wish to touch the user control at all, the user may directly touch a service control in the first area, so that the target service corresponding to that service control is displayed in the second area.
For example, as shown in fig. 6, if user B touches the service control E in the first area, the target service corresponding to the service control E is displayed in the area 610.
In an alternative embodiment, fig. 16 is a flowchart of a method for displaying, in the second area, the target service corresponding to a service control in the control display method. As shown in fig. 16, the method at least includes the following steps: in step S1610, in response to a touch operation of the user on an aiming control, aiming information acting on the aiming control is acquired, wherein the aiming control is used for hitting a service control.
The aiming control is a control that the user may touch in order to hit a service control, and the aiming information is obtained when the user touches the aiming control. Specifically, the aiming information may include aiming direction information generated when the user touches the aiming control, may include touch pressure information generated by the touch, or may include any other information that may be generated by the touch; this exemplary embodiment is not particularly limited in this respect.
For example, fig. 17 shows a schematic view of a display interface with an aiming control. As shown in fig. 17, the control 1710 is an aiming control, and when the user touches the aiming control 1710, the aiming information, that is, the aiming direction information and the touch pressure information, can be obtained.
In step S1620, the hit service control is determined among the service controls according to the aiming information, and the target service corresponding to the hit service control is displayed in the second area.
The hit service control can be determined according to the aiming information, which indicates that the user wants to acquire the target service corresponding to the hit service control; the target service is therefore displayed in the second area.
For example, according to the touch pressure and the aiming direction information in the aiming information, the hit service control corresponding to them can be determined among the service controls, and the target service corresponding to the hit service control is then displayed in the second area.
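One way to turn the aiming direction and touch pressure into a hit decision is the projectile-style sketch below. The pressure-to-distance mapping, the hit radius, and the control names are illustrative assumptions; the disclosure does not fix how the aiming information selects a control:

```python
import math

def hit_service_control(origin, aim_angle_deg, touch_pressure, controls,
                        pressure_to_distance=10.0, hit_radius=30.0):
    """Fire from `origin` along `aim_angle_deg`; the shot travels a
    distance proportional to the touch pressure.  Return the name of the
    first service control whose centre lies within `hit_radius` of the
    impact point, or None if no control is hit."""
    distance = touch_pressure * pressure_to_distance
    angle = math.radians(aim_angle_deg)
    impact_x = origin[0] + distance * math.cos(angle)
    impact_y = origin[1] + distance * math.sin(angle)
    for name, (cx, cy) in controls.items():
        if math.hypot(impact_x - cx, impact_y - cy) <= hit_radius:
            return name
    return None
```

A harder press travels further, so different controls become reachable from the same aiming direction at different pressures.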
In this exemplary embodiment, a more engaging way of acquiring the target service is provided; in particular, when the area of the display interface is large, the user can acquire a target service in the first area more conveniently by touching the aiming control.
In an optional embodiment, the method further comprises: in response to a touch operation of the user on the mobile control, displaying the service controls displayed in the first area in the second area, wherein the mobile control is used for changing the service display position of the service controls, the service control corresponding to the recommended service is displayed in the second area in a manner differentiated from the other service controls, and the service controls consist of the service control corresponding to the recommended service and the other service controls.
The mobile control is a control displayed in the display interface for changing the service display position of the service controls. When the user performs a touch operation on the mobile control, the service controls are displayed in the second area, and the service control corresponding to the recommended service is displayed in the second area in a manner differentiated from the other service controls.
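The relocation with differentiated display can be sketched as follows; modelling the amplification special effect as a simple scale factor is an illustrative choice, and the dictionary layout is not part of the disclosure:

```python
def relocate_service_controls(service_controls, recommended, scale=1.5):
    """Build the second-area display list after a mobile-control touch:
    every service control moves to the second area, and the control
    matching the recommended service is enlarged as a stand-in for the
    amplification special effect."""
    return [
        {"control": c, "area": "second",
         "scale": scale if c == recommended else 1.0}
        for c in service_controls
    ]
```

The enlarged entry marks the recommendation while still letting the user pick any of the relocated controls.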
For example, fig. 18 shows a schematic view of a display interface with a mobile control. As shown in fig. 18, the control 1810 is a mobile control. When the user touches the control 1810, the service control 340, the service control 350, and the service control 360 are displayed in the second area 220. Assuming that the user touching the mobile control is user B, and that the target service corresponding to the service control 350 was the recommended service when user B touched the control 320, an amplification special effect is added to the service control 350 while the service controls 340, 350, and 360 are displayed. In addition, the user may delete the service control 350 and then select a service control of interest from the service control 340 and the service control 360 to display the corresponding target service.
In this exemplary embodiment, when the user is not interested in the recommended service, the user can touch the mobile control to display the service controls in the second area; the differentiated display of the service control corresponding to the recommended service prompts the user to select another service control of possible interest, which increases the interest of the interaction with the display interface and the convenience with which the user obtains a target service of interest.
In the method and the device provided by this exemplary embodiment of the disclosure, on the one hand, the user control is displayed in the second area, and when the user control is touched, a target service having an interest relationship with the user is displayed in the second area, which not only increases the interest of the interaction process but also improves the convenience with which the user obtains the target service; on the other hand, the service controls displayed in the first area correspond to target services selectable by the user, and when the user touches a service control, the corresponding target service is displayed in the second area, which perfects the logic of displaying the target service, avoids the situation in which the user cannot acquire the required target service when the recommended service does not match the user's interest, and improves the user experience.
The following describes a control display method in the embodiment of the present disclosure in detail with reference to an application scenario.
As shown in fig. 3, the display interface 200 is a display interface disposed on a marketing display screen of a banking business hall, wherein a service control 340, a service control 350, a service control 360, and a service control 370 are displayed in the first area 210 and move randomly in the first area; the service control 340 may be a financing service control, the service control 350 may be a savings service control, the service control 360 may be a credit card service control, and the service control 370 may be another service control.
User controls 310, 320, and 330 are displayed in the second area 220 and move randomly in the second area, where the user control 310 corresponds to user A, the user control 320 corresponds to user B, and the user control 330 corresponds to user C.
As shown in fig. 8, when the distance between user B and the display interface is smaller than the distance threshold and the gesture recognition result obtained by performing gesture recognition on the user image of user B satisfies the preset gesture condition, the display position of the user control 320 near the bottom of the second area is updated to the position shown by the block 810, where the position 810 is determined from the coordinate data of the user projected into the display interface at that time.
Similarly, if the distance between user C and the display interface is smaller than the distance threshold, and the gesture recognition result obtained by performing gesture recognition on the user image of user C satisfies the preset gesture condition, the display position of the user control 330 near the bottom of the second area is updated, and the user control 330 is displayed at a position different from the block position 810 and near the bottom of the second area.
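The gated position update described above can be sketched as a single predicate; the threshold value and the function name are illustrative assumptions:

```python
def maybe_update_user_control(distance_m, gesture_ok, projected_pos,
                              distance_threshold_m=1.5):
    """Return the new display position for a user control, or None when
    the preconditions fail: the position is only updated when the user is
    closer than the distance threshold AND the gesture recognition result
    satisfies the preset gesture condition.  The new position is the
    user's coordinates projected into the display interface."""
    if distance_m < distance_threshold_m and gesture_ok:
        return projected_pos
    return None
```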
At this time, if user B touches the user control 320 displayed at the block 810, a first target area having an area mapping relationship with the current display position of the user control is determined in the second area, and a recommended service having an interest relationship with the user is determined according to the user data corresponding to the user control 320, so that the recommended service is displayed in the first target area. Specifically, if the user attribute data corresponding to the user control 320 indicates a 25-year-old female, then, since the user is young and female, the recommended service may be the credit card service corresponding to the service control 360. The first target area may be the area 1510 shown in FIG. 15 or the area 1210 shown in FIG. 12.
After the control 320 moves to the block position 810, the user may move, for example, to the left along the bottom of the second area, and the coordinate data of the user projected on the display interface changes accordingly. Based on this, as shown in fig. 11, if the display position of the control 320 moves from the block position 810 to the block position 1110, it must be determined whether the first target area corresponding to the block position 810 is consistent with the second target area corresponding to the block position 1110; if they are not consistent, the control 320 is displayed across regions. In the so-called cross-region display, the display position of the control 320 is not in the first target area but in another partial area of the second area, and the cross-region display presupposes that the recommended service corresponding to another user control is not displayed in the second area.
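The two conditions for cross-region display reduce to a small predicate; the boolean encoding of "another user's recommendation is shown" is an illustrative simplification:

```python
def is_cross_region_display(first_target_area, second_target_area,
                            other_recommendation_shown):
    """The control is displayed across regions only when (a) the target
    area mapped from the new position differs from the first target area
    and (b) no other user control's recommended service is currently
    displayed in the second area."""
    return (first_target_area != second_target_area
            and not other_recommendation_shown)
```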
If the user touches any one of the service controls in the first area, the target service corresponding to that service control is displayed in the second area. Besides, the user may directly touch the aiming control shown as the control 1410 in fig. 14 to hit a service control, so that the service corresponding to the hit service control is displayed in the second area. Alternatively, when the user is not interested in the recommended credit card service after touching the user control 320, the user may select a service of interest by touching the control 1410, for example the financing service corresponding to the service control 340.
In addition, if the user is not interested in the recommended service, the user may also touch the mobile control 1810 in fig. 18 to display the service controls in the second area, with the service control 350 corresponding to the recommended service displayed in a differentiated manner, so that the user can see the content of the displayed service controls more clearly and select another service control of possible interest in order to obtain the corresponding target service.
In this application scenario, on the one hand, the user control is displayed in the second area, and when the user control is touched, a target service having an interest relationship with the user is displayed in the second area, which increases the interest of the interaction process and improves the convenience with which the user obtains the target service; on the other hand, the service controls displayed in the first area correspond to target services selectable by the user, and when the user touches a service control, the corresponding target service is displayed in the second area, which perfects the logic of displaying the target service, avoids the situation in which the user cannot acquire the required target service when the recommended service does not match the user's interest, and improves the user experience.
In addition, in an exemplary embodiment of the present disclosure, a control display apparatus is also provided. Fig. 19 is a schematic structural diagram of a control display device. As shown in fig. 19, the control display device 1900 may include: a dividing module 1910, a first display module 1920, a second display module 1930, and a third display module 1940. Wherein:
the dividing module 1910 is configured to divide a display interface of a target terminal into a first area and a second area, determine target services available for the user to select, and display the service controls corresponding to the target services in the first area; the first display module 1920 is configured to receive a user image of the user, determine user data corresponding to the user image, and display a user control corresponding to the user data in the second area; the second display module 1930 is configured to display a recommended service in the second area in response to a touch operation of the user on the user control, wherein the recommended service comprises at least one target service having an interest relationship with the user; and the third display module 1940 is configured to display, in response to a touch operation of the user on a service control, the target service corresponding to that service control in the second area.
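The four-module structure can be sketched as a single class whose methods mirror the configured responsibilities; the method names and the string-based display lists are illustrative assumptions, since the disclosure fixes only each module's responsibility, not its interface:

```python
class ControlDisplayDevice:
    """Structural sketch of the control display device 1900."""

    def __init__(self):
        self.first_area = []   # service controls
        self.second_area = []  # user controls and displayed services

    def divide_and_display(self, target_services):
        """Dividing module 1910: one service control per target service."""
        self.first_area = [f"service_control:{s}" for s in target_services]

    def display_user_control(self, user_data):
        """First display module 1920: show a user control for the user."""
        self.second_area.append(f"user_control:{user_data['name']}")

    def on_user_control_touch(self, recommended_service):
        """Second display module 1930: show the recommended service."""
        self.second_area.append(f"recommended:{recommended_service}")

    def on_service_control_touch(self, target_service):
        """Third display module 1940: show the touched control's service."""
        self.second_area.append(f"service:{target_service}")
```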
The details of the control display device 1900 are described in detail in the corresponding control display method and are therefore not repeated here.
It should be noted that although several modules or units of the control display device 1900 are mentioned in the above detailed description, such division is not mandatory. Indeed, according to embodiments of the present disclosure, the features and functionality of two or more modules or units described above may be embodied in one module or unit. Conversely, the features and functions of one module or unit described above may be further divided among a plurality of modules or units.
In addition, in an exemplary embodiment of the present disclosure, an electronic device capable of implementing the above method is also provided.
An electronic device 2000 according to such an embodiment of the invention is described below with reference to fig. 20. The electronic device 2000 shown in fig. 20 is only an example and should not impose any limitation on the functions and the scope of use of the embodiments of the present invention.
As shown in fig. 20, the electronic device 2000 is embodied in the form of a general-purpose computing device. The components of the electronic device 2000 may include, but are not limited to: the at least one processing unit 2010, the at least one memory unit 2020, the bus 2030 connecting the various system components (including the memory unit 2020 and the processing unit 2010), and the display unit 2040.
The memory unit stores program code executable by the processing unit 2010 to cause the processing unit 2010 to perform steps according to various exemplary embodiments of the present invention described in the "exemplary methods" section above of the specification.
The storage unit 2020 may include readable media in the form of volatile storage units, such as a random access memory unit (RAM) 2021 and/or a cache memory unit 2022, and may further include a read-only memory unit (ROM) 2023.
The storage unit 2020 may also include a program/utility 2024 having a set (at least one) of program modules 2025, such program modules 2025 including, but not limited to: an operating system, one or more application programs, other program modules, and program data; each of these examples, or some combination thereof, may include an implementation of a network environment.
Bus 2030 may be one or more of any of several types of bus structures including a memory unit bus or memory unit controller, a peripheral bus, an accelerated graphics port, a processing unit, or a local bus using any of a variety of bus architectures.
The electronic device 2000 may also communicate with one or more external devices 2070 (e.g., a keyboard, a pointing device, a Bluetooth device, etc.), with one or more devices that enable a user to interact with the electronic device 2000, and/or with any devices (e.g., a router, a modem, etc.) that enable the electronic device 2000 to communicate with one or more other computing devices. Such communication may occur over an input/output (I/O) interface 2050. Also, the electronic device 2000 may communicate with one or more networks (e.g., a local area network (LAN), a wide area network (WAN), and/or a public network, such as the Internet) through the network adapter 2060. As shown, the network adapter 2060 communicates with the other modules of the electronic device 2000 via the bus 2030. It should be appreciated that although not shown, other hardware and/or software modules may be used in conjunction with the electronic device 2000, including but not limited to: microcode, device drivers, redundant processing units, external disk drive arrays, RAID systems, tape drives, and data backup storage systems, among others.
Through the above description of the embodiments, those skilled in the art will readily understand that the exemplary embodiments described herein may be implemented by software, or by software in combination with necessary hardware. Therefore, the technical solution according to the embodiments of the present disclosure may be embodied in the form of a software product, which may be stored in a non-volatile storage medium (which may be a CD-ROM, a usb disk, a removable hard disk, etc.) or on a network, and includes several instructions to enable a computing device (which may be a personal computer, a server, a terminal device, or a network device, etc.) to execute the method according to the embodiments of the present disclosure.
In an exemplary embodiment of the present disclosure, there is also provided a computer-readable storage medium having stored thereon a program product capable of implementing the above-described method of the present specification. In some possible embodiments, aspects of the invention may also be implemented in the form of a program product comprising program code means for causing a terminal device to carry out the steps according to various exemplary embodiments of the invention described in the above-mentioned "exemplary methods" section of the present description, when said program product is run on the terminal device.
Referring to fig. 21, a program product 2100 for implementing the above method according to an embodiment of the present invention is described, which may employ a portable compact disc read-only memory (CD-ROM), include program code, and run on a terminal device such as a personal computer. However, the program product of the present invention is not limited thereto; in this document, a readable storage medium may be any tangible medium that can contain or store a program for use by or in connection with an instruction execution system, apparatus, or device.
The program product may employ any combination of one or more readable media. The readable medium may be a readable signal medium or a readable storage medium. A readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples (a non-exhaustive list) of the readable storage medium include: an electrical connection having one or more wires, a portable disk, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
A computer readable signal medium may include a propagated data signal with readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A readable signal medium may also be any readable medium that is not a readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
Program code embodied on a readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing.
Program code for carrying out operations for aspects of the present invention may be written in any combination of one or more programming languages, including an object-oriented programming language such as Java or C++ and conventional procedural programming languages, such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computing device, partly on the user's device, as a stand-alone software package, partly on the user's computing device and partly on a remote computing device, or entirely on the remote computing device or server. In the case of a remote computing device, the remote computing device may be connected to the user computing device through any kind of network, including a local area network (LAN) or a wide area network (WAN), or may be connected to an external computing device (e.g., through the Internet using an Internet service provider).
Other embodiments of the disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the disclosure disclosed herein. This application is intended to cover any variations, uses, or adaptations of the disclosure following, in general, the principles of the disclosure and including such departures from the present disclosure as come within known or customary practice within the art to which the disclosure pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the disclosure being indicated by the following claims.