Detailed Description
In order that the above-recited objects, features, and advantages of the present application may be more clearly understood, the application is described in more detail below with reference to the appended drawings and the following detailed description. It should be noted that the embodiments of the present application and the features in the embodiments may be combined with each other, provided that no conflict arises.
In the following description, numerous specific details are set forth in order to provide a thorough understanding of the present invention, but the present invention may be practiced otherwise than as described herein, and therefore the scope of the present invention is not limited to the specific embodiments disclosed below.
In the description of the present invention, the term "plurality" means two or more unless explicitly defined otherwise. The orientation or positional relationship indicated by terms such as "upper" and "lower" is based on the orientation or positional relationship shown in the drawings, is used merely for convenience and simplicity of description, and does not indicate or imply that the apparatus or element referred to must have a specific orientation or be constructed and operated in a specific orientation; such terms should therefore not be construed as limiting the present invention. The terms "connected," "mounted," "secured," and the like are to be construed broadly.
For example, a connection may be a fixed connection, a removable connection, or an integral connection, and may be a direct connection or an indirect connection via an intermediary. The specific meaning of the above terms in the present invention can be understood by those of ordinary skill in the art according to the specific circumstances. Furthermore, the terms "first," "second," and the like are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, a feature qualified by "first," "second," etc. may explicitly or implicitly include one or more such features.
In the description of this specification, the terms "one embodiment," "some implementations," "particular embodiments," and the like, mean that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the invention. In this specification, schematic representations of the above terms do not necessarily refer to the same embodiment or example. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples.
An interactive control system and method for a remote imaging display according to some embodiments of the present invention are described below with reference to the accompanying drawings.
As shown in fig. 1, a first aspect of the present invention proposes an interactive control system for a remote imaging display, including a remote imaging display for performing remote enlarged imaging on display content and a terminal device for providing the display content. The remote imaging display includes a display screen disposed at the bottom of the remote imaging display and facing upward, a circuit board electrically connected to the display screen, a beam splitter disposed above the display screen, a concave mirror disposed on one side of the beam splitter, and an observation window disposed on the other side of the beam splitter. A communication module for communication connection with the terminal device is disposed on the circuit board, and the circuit board obtains display content from the terminal device through the communication module and provides it to the display screen. The interactive control system further includes a touch panel disposed on the observation window and an interactive control device connected to the touch panel and the terminal device.
The communication module may be a wireless communication module such as Wi-Fi or Bluetooth, and the terminal device may be an intelligent terminal device such as a personal computer, a smart phone, or a tablet computer. The touch panel can be arranged on the observation window of the remote imaging display by plug-in mounting or embedded mounting. In some embodiments of the present invention, the interactive control device is a control chip disposed on the circuit board of the remote imaging display, and the control chip is connected to the touch panel through a driving circuit of the touch panel to obtain the interactive operation of a user from the touch panel. In some embodiments of the present invention, the processor of the terminal device may be reused as the interaction control device; similarly, the touch panel is connected to the terminal device through a driving circuit, so that the terminal device may obtain the interactive operation of the user from the touch panel.
Further, the touch panel includes a plurality of capacitive sensors distributed on the touch panel. The touch panel is composed of two layers of mutually perpendicular, staggered ITO (Indium Tin Oxide) electrodes; the ITO electrodes serve as the capacitive sensors, and a scanning circuit extracts touch information by scanning the coupling capacitance between each ITO electrode and ground.
As shown in fig. 2, the interaction control means is configured to:
scanning a touch panel based on a pre-configured scanning frequency to acquire a capacitance image of the touch panel, wherein the capacitance image is a two-dimensional capacitance value image generated by mapping a coupling capacitance between each capacitive sensor on the touch panel and ground to a standard pixel value interval;
Determining a display projection area on an observation window of a remote imaging display, wherein the display projection area is the area of the observation window onto which the display picture on the display screen of the remote imaging display, as observed by a user through the observation window, is projected;
clipping the capacitance image according to the shape and the position of the display projection area to generate a touch operation image;
identifying a touch operation gesture input by a user from the touch operation image;
and executing the interaction control instruction corresponding to the touch operation gesture.
Specifically, the standard pixel value interval is a preconfigured standard interval for converting the ground coupling capacitance of a capacitive sensor of the touch panel into a pixel value of the capacitance image, and has a pixel value interval lower boundary pb and a pixel value interval upper boundary pt, where, illustratively, the lower boundary pb = 0 and the upper boundary pt = 255. By mapping the ground coupling capacitance of each capacitive sensor into this standard interval of pixel values, the capacitance data of the touch panel can be visualized and displayed as a gray image when the capacitance image is processed or tested; the two-dimensional capacitance value image is the gray image generated after the ground coupling capacitances of the capacitive sensors are mapped to the standard pixel value interval.
Further, in the step of scanning the touch panel based on a pre-configured scanning frequency to acquire a capacitance image of the touch panel, the interaction control device is configured to:
Sequentially reading the ground coupling capacitance Co of each capacitive sensor on the touch panel;
mapping the ground coupling capacitance Co to a standard pixel value interval to obtain a pixel value ps of the current capacitance value of the capacitive sensor on the capacitance image, wherein the current capacitance value is the ground coupling capacitance Co of the capacitive sensor;
and filling the pixel value ps into a pixel position of the capacitive sensor corresponding to the capacitive image of the current frame.
In the foregoing embodiment, in the step of mapping the ground coupling capacitance to a standard pixel value interval to obtain the pixel value ps of the current capacitance value of the capacitive sensor, the interaction control device is configured to:
Acquiring a capacitance value lower boundary Cb and a capacitance value upper boundary Ct of the touch panel;
Calculating a pixel value of a current capacitance value of the capacitive sensor on the capacitance image:
Because of the difference of hardware parameters, different touch panels often have different capacitance intervals, so the lower capacitance boundary Cb and the upper capacitance boundary Ct of the touch panels need to be obtained through periodic scan tests.
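The mapping formula itself is not reproduced above, but the surrounding description implies a linear normalization of each sensor's ground coupling capacitance from the panel's capacitance interval [Cb, Ct] into the standard pixel value interval [pb, pt]. The following is a minimal sketch of one scan pass under that assumption; the function name read_sensor and all parameter values are illustrative only.

```python
import numpy as np

def build_capacitance_image(read_sensor, ms, ns, Cb, Ct, pb=0, pt=255):
    """Scan an ms x ns grid of capacitive sensors and map each ground coupling
    capacitance Co into the standard pixel value interval [pb, pt]
    (linear mapping assumed)."""
    image = np.zeros((ns, ms), dtype=np.uint8)
    for y in range(ns):
        for x in range(ms):
            Co = read_sensor(x, y)                       # ground coupling capacitance of sensor (x, y)
            Co = min(max(Co, Cb), Ct)                    # clamp to the panel's capacitance interval
            ps = pb + (Co - Cb) / (Ct - Cb) * (pt - pb)  # assumed linear mapping to the pixel interval
            image[y, x] = int(round(ps))                 # fill the pixel position of this sensor
    return image
```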
Further, in the step of recognizing a touch operation gesture input by a user from the touch operation image, the interaction control device is configured to:
Identifying a touch operation point on the touch operation image, wherein the touch operation point is the coordinate point of a capacitive sensor on the touch panel whose ground coupling capacitance changes by more than a preset change threshold when a user approaches or touches the touch panel with a finger or a touch tool;
analyzing the position change and time sequence change of a touch operation point on the touch panel on a continuous multi-frame touch operation image;
and determining a touch operation gesture corresponding to the touch operation point according to the position change and the time sequence change of the touch operation point on the touch panel, wherein the touch operation gesture comprises clicking operation, long-press operation and sliding operation.
Specifically, the user may operate on the touch panel directly with a finger, or through a touch tool such as a stylus. Preferably, the touch panel is a self-capacitance projection type touch panel, and when a finger or a stylus approaches any capacitive sensor on the touch panel, the ground coupling capacitance of the corresponding capacitive sensor changes. Further, the touch operation gesture may also include a multi-point clicking operation, a multi-point sliding operation, a multi-point pinch-in operation, a multi-point pinch-out operation, and the like.
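As a rough illustration of how position change and time-sequence change can be turned into one of the single-point gestures listed above, the following sketch classifies a track of touch operation points taken from consecutive frames; the movement and dwell-time thresholds are assumed values, not ones specified by the embodiments.

```python
def classify_gesture(track, move_thresh=3.0, long_press_ms=500.0):
    """Classify a single-point track into 'click', 'long_press' or 'slide'.
    `track` is a list of (timestamp_ms, x, y) tuples taken from consecutive
    touch operation images; the thresholds are illustrative values."""
    if len(track) < 2:
        return "click"
    t0, x0, y0 = track[0]
    t1, x1, y1 = track[-1]
    displacement = ((x1 - x0) ** 2 + (y1 - y0) ** 2) ** 0.5
    duration = t1 - t0
    if displacement > move_thresh:
        return "slide"        # position change dominates: sliding operation
    if duration >= long_press_ms:
        return "long_press"   # little movement but long dwell: long-press operation
    return "click"            # little movement, short dwell: clicking operation
```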
Further, in the step of identifying a touch operation point on the touch operation image, the interaction control device is configured to:
generating an unoperated image of the touch panel, wherein the unoperated image is a capacitance image generated in a state without external operation after the touch panel is electrified;
Performing a calibration process on the touch operation image based on the no-operation image to generate a target calibration image;
extracting extreme points from the target calibration image;
and determining the extreme point as a touch operation point on the touch operation image.
Specifically, when the touch panel is powered on and there is no external operation, that is, no external conductor is close, each capacitive sensor, that is, each ITO electrode, has a base coupling capacitance to ground Cp. When a finger of a user or another touch tool approaches or contacts the touch panel, an additional coupling capacitance Ch is superimposed on the base coupling capacitance Cp of the capacitive sensor, so that the total coupling capacitance to ground of the capacitive sensor becomes Cp+Ch. Because of factors such as hardware differences and manufacturing errors, the base coupling capacitance to ground Cp of each capacitive sensor is not completely consistent. In order to improve the accuracy of touch operation detection, the no-operation image needs to be generated as a reference image while the touch panel is not being externally operated, so as to calibrate the touch operation image of the touch panel when a touch operation of the user is received.
Further, in the step of performing a calibration process on the touch operation image based on the no-operation image to generate a target calibration image, the interaction control means is configured to:
subtracting the pixel value of the corresponding coordinate in the non-operation image from the pixel value of each pixel in the touch operation image to generate a first calibration image;
Removing random noise with pixel values smaller than a preset noise threshold value from the first calibration image to generate a second calibration image;
determining the second calibration image as the target calibration image.
In the above-described embodiments, when a finger, a stylus, or the like approaches the touch panel, the ground coupling capacitance of the capacitive sensors at and around the approached position changes. This change is reflected on the touch operation image as a gray-scale region covering a plurality of capacitive sensors, and an extremum must be taken within that region to determine the position of the capacitive sensor closest to the user's finger, stylus, or the like, which is then determined as the touch operation point.
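A minimal sketch of the calibration and extreme-point extraction described above is given below, assuming the no-operation image and the touch operation image are available as two-dimensional arrays; the simple 8-neighbour local-maximum test stands in for whatever extremum search an implementation actually uses.

```python
import numpy as np

def calibrate_and_find_points(touch_img, idle_img, noise_thresh=8):
    """Subtract the no-operation (idle) image, zero out noise below the
    threshold, and return the extreme points of the remaining gray regions."""
    first = touch_img.astype(np.int16) - idle_img.astype(np.int16)  # first calibration image
    second = np.where(first < noise_thresh, 0, first)               # second (target) calibration image
    points = []
    h, w = second.shape
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            v = second[y, x]
            if v > 0 and v == second[y - 1:y + 2, x - 1:x + 2].max():
                points.append((x, y))                               # local extremum -> touch operation point
    return second, points
```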
Further, the touch panel is a self-capacitance projection type touch panel, and before the step of identifying the touch operation gesture input by the user from the touch operation image, the interaction control device is configured to:
acquiring a preconfigured icon display distance interval and a touch operation distance interval;
calculating the real-time distance between an operating body and the touch panel according to the extreme point, wherein the operating body comprises a finger or a touch pen of a user;
when the real-time distance falls into the icon display distance interval, displaying an operation icon at a corresponding position on a display screen of the remote imaging display;
And when the real-time distance falls into the touch operation distance section, executing the steps of determining a touch operation gesture corresponding to the touch operation point according to the position change and the time sequence change of the touch operation point on the touch panel and executing an interaction control instruction corresponding to the touch operation gesture.
Specifically, in the icon display distance interval, the operation icon moves along with the movement of the finger of the user above the touch panel, so that the user can intuitively understand the position of the touch point of the finger mapped to the display screen, and misoperation caused by calculation errors or calculation delays of the optical path is avoided.
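The following sketch illustrates how the two preconfigured distance intervals could drive the behaviour described above; the interval bounds and the way the real-time distance is obtained are assumptions, since the embodiments only state that the distance is calculated from the extreme point.

```python
def dispatch_by_distance(distance_mm,
                         icon_interval=(5.0, 15.0),
                         touch_interval=(0.0, 5.0)):
    """Map the estimated distance between the operating body and the touch
    panel to an action; the interval bounds are illustrative placeholders
    for the preconfigured intervals."""
    lo, hi = icon_interval
    if lo <= distance_mm < hi:
        return "show_icon"       # hover range: show the operation icon on the display screen
    lo, hi = touch_interval
    if lo <= distance_mm < hi:
        return "handle_gesture"  # touch range: recognise the gesture and run its control instruction
    return "ignore"              # outside both intervals: no action
```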
Further, in the step of generating the no-operation image of the touch panel, the interaction control means is configured to:
Acquiring capacitance images P(ti) in a sampling period, wherein ti ∈ (tb, tt), i ∈ [1, nt], tb is the starting point of the sampling period, tt is the ending point of the sampling period, ti is the i-th sampling time point between tb and tt, and nt is the number of sampling points between tb and tt;
Calculating the capacitance fluctuation amplitude pr (ti) of each frame of capacitance image P (ti), wherein the capacitance fluctuation amplitude pr (ti) is the capacitance fluctuation amplitude of the pixel position with the largest capacitance fluctuation amplitude on the ith frame of capacitance image P (ti);
Constructing a capacitance fluctuation amplitude variation curve f(t) in the sampling period (tb, tt) based on the capacitance fluctuation amplitude pr(ti);
Determining a plateau (tstableb, tstablet) on said capacitance fluctuation amplitude variation curve f(t);
a no-operation capacitance image is generated based on the capacitance image of the plateau (tstableb,tstablet).
In the foregoing embodiment, one sampling point in the sampling period is a time point at which one capacitance image is acquired, and the system generates the capacitance images at regular intervals; that is, the generation frame rate of the capacitance images corresponds to the pre-configured scanning frequency.
Further, in the step of constructing a capacitance fluctuation amplitude variation curve f (t) within the sampling period (tb,tt) based on the capacitance fluctuation amplitude pr (ti), the interaction control device is configured to:
acquiring the discrete sequence of capacitance fluctuation amplitudes pr(t1), pr(t2), …, pr(tnt) within the sampling time period (tb, tt);
And fitting the capacitance fluctuation amplitude discrete sequence into the capacitance fluctuation amplitude change curve f (t) by using a curve fitting algorithm, wherein the curve fitting algorithm can be any one of a polynomial fitting algorithm, a least square fitting algorithm, a linear interpolation algorithm or a spline interpolation algorithm.
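As an illustration of the curve construction, the sketch below fits the discrete sequence pr(ti) with polynomial fitting, one of the listed options; the degree of the polynomial is an arbitrary choice.

```python
import numpy as np

def fit_fluctuation_curve(t_samples, pr_samples, degree=5):
    """Fit the discrete capacitance fluctuation amplitudes pr(ti) to a smooth
    curve f(t) by polynomial (least-squares) fitting; returns a callable f."""
    degree = min(degree, len(t_samples) - 1)            # keep the fit well-posed for short sequences
    coeffs = np.polyfit(t_samples, pr_samples, degree)  # least-squares polynomial coefficients
    return np.poly1d(coeffs)

# usage sketch:
# f = fit_fluctuation_curve(t_samples, pr_samples)
# amplitude = f(some_time_point)
```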
Further, in the step of calculating the capacitance fluctuation width pr (ti) of each frame of the capacitance image P (ti), the interaction control means is configured to:
Acquiring the pixel value P[x, y](ti) at coordinates (x, y) in each frame of capacitance image P(ti) within the sampling time period (tb, tt);
Calculating a first pixel mean value of each pixel point position x, y in the sampling time period (tb,tt):
Calculating the relative pixel value of the pixel with the coordinate x and y in each frame of capacitance image P (ti):
Determining the maximum relative pixel value in each frame of capacitance image P (ti) as the capacitance fluctuation amplitude:
Where ms is the maximum column number of the capacitive image and ns is the maximum row number of the capacitive image.
Specifically, take the example that the capacitance image includes ms × ns pixels, that is, the touch panel has ms × ns capacitive sensors. In this case, 1 ≤ x ≤ ms and 1 ≤ y ≤ ns are satisfied for the capacitance image P(ti) at any ti. Similarly, in the step of calculating the first pixel mean value of each pixel position (x, y) over the sampling period (tb, tt), 1 ≤ x ≤ ms and 1 ≤ y ≤ ns.
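Because the formulas for the first pixel mean, the relative pixel value, and the per-frame maximum are not reproduced above, the following sketch assumes the relative pixel value is the absolute deviation of P[x, y](ti) from the per-pixel mean over the sampling period, and takes pr(ti) as the largest such deviation in each frame.

```python
import numpy as np

def fluctuation_amplitudes(frames):
    """frames: array of shape (nt, ns, ms), one capacitance image per sampling
    point.  Returns pr(ti) for every frame, assuming the relative pixel value
    is the absolute deviation from the per-pixel mean over the sampling period."""
    mean_img = frames.mean(axis=0)         # first pixel mean for every position (x, y)
    relative = np.abs(frames - mean_img)   # assumed relative pixel values per frame
    return relative.reshape(frames.shape[0], -1).max(axis=1)  # largest deviation per frame = pr(ti)
```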
Further, the step of determining the plateau (tstableb,tstablet) on the capacitance fluctuation amplitude variation curve f (t) specifically includes:
Acquiring a preset noise threshold p0;
Acquiring the peak points peakj whose amplitude is larger than the noise threshold p0 on the capacitance fluctuation amplitude change curve f(t), wherein j ∈ [1, npeak] and npeak is the number of peak points whose amplitude is larger than the noise threshold p0 on the capacitance fluctuation amplitude change curve f(t);
Calculating the time interval between two adjacent peak points peakj on the capacitance fluctuation amplitude change curve f (t):
Δt_peakj=t(peakj+1)-t(peakj);
Two adjacent peak points t (peakj0+1) and t (peakj0) are determined to satisfy:
Wherein j0 is any integer within the interval [1, npeak -1 ];
T (peakj0) is determined as the lower bound tstableb of the plateau and t (peakj0+1) is determined as the upper bound tstablet of the plateau.
Specifically, t (peakj+1) is a time point corresponding to the peak point (peakj+1) on the capacitance fluctuation amplitude variation curve f (t). Similarly, t (peakj) is a time point corresponding to the peak point (peakj) on the capacitance fluctuation amplitude variation curve f (t).
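The exact condition satisfied by the pair of peak points t(peakj0) and t(peakj0+1) is not reproduced above; the sketch below assumes the plateau is taken as the widest interval between adjacent peaks whose amplitude exceeds the noise threshold p0.

```python
import numpy as np

def find_plateau(f, t_grid, p0):
    """Sample f on t_grid, keep local peaks above the noise threshold p0, and
    return the pair of adjacent peaks that are farthest apart (assumed plateau
    criterion; the original condition is not shown in the text)."""
    values = f(t_grid)
    peaks = [t_grid[i] for i in range(1, len(t_grid) - 1)
             if values[i] > p0 and values[i] >= values[i - 1] and values[i] >= values[i + 1]]
    if len(peaks) < 2:
        return t_grid[0], t_grid[-1]      # fewer than two peaks: treat the whole period as stable
    gaps = np.diff(peaks)
    j0 = int(np.argmax(gaps))             # widest gap between adjacent above-threshold peaks
    return peaks[j0], peaks[j0 + 1]       # (tstableb, tstablet)
```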
Further, the step of generating a non-operational capacitance image based on the capacitance image of the plateau (tstableb,tstablet) specifically includes:
Reading each frame of capacitance image P(tk) in the plateau (tstableb, tstablet), where tk ∈ (tstableb, tstablet), k ∈ [1, nstable], and nstable is the number of capacitance image frames within the plateau (tstableb, tstablet);
Calculating a second pixel mean value of each pixel position (x, y) in the stable section (tstableb, tstablet):
Determining the second pixel mean value of each pixel position (x, y) in the stable section (tstableb, tstablet) as the pixel value of the no-operation capacitance image at the pixel position (x, y).
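A minimal sketch of assembling the no-operation capacitance image from the plateau frames, assuming the frames and their timestamps are held as arrays:

```python
import numpy as np

def build_idle_image(frames, timestamps, t_stable_b, t_stable_t):
    """Average the capacitance images whose timestamps fall inside the plateau
    (tstableb, tstablet) to obtain the no-operation reference image."""
    timestamps = np.asarray(timestamps)
    mask = (timestamps > t_stable_b) & (timestamps < t_stable_t)
    stable_frames = np.asarray(frames)[mask]   # the nstable frames inside the plateau
    return stable_frames.mean(axis=0)          # second pixel mean for every position (x, y)
```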
Further, the step of determining the display projection area on the observation window of the remote imaging display specifically includes:
acquiring real-time positions of two eyes of a user;
determining the midpoint of a real-time position connecting line of two eyes of a user as an observation point;
calculating four first optical paths between the observation point and the four corners of the display screen of the remote imaging display;
Determining the intersection points of the four first optical paths with the plane where the observation window is located;
And determining the area surrounded by the connecting lines of the four intersection points as the display projection area.
Specifically, a first optical path is a path that starts at one of the corners of the display screen, with the light reflected by the beam splitter toward the concave mirror, reflected by the concave mirror, and then passing through the beam splitter and the observation window to reach the observation point. Because the observation window lies on the optical path between the display screen and the eyes of the user, and the display picture of the display screen is reflected and magnified multiple times by the beam splitter and the concave mirror, the optical paths corresponding to the four corners of the display picture, that is, the paths connecting the observation point with the four corners of the display screen of the remote imaging display, differ when the positions of the eyes of the user differ; in other words, the projection position of the display picture on the observation window necessarily changes as the position of the eyes of the user shifts. In some embodiments of the present invention, the remote imaging display further includes an image sensor for monitoring the eye position of the user in real time, and the image sensor is used to acquire the eye positions of the user in real time so as to determine the projection area of the display screen on the observation window. In the technical solution of the foregoing embodiment, when the coordinates of the observation point and of the four corners of the display screen are known, the coordinates of the intersection points of the first optical paths with the plane where the observation window is located can be calculated.
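The embodiments do not specify the optics numerically, so the following sketch treats each folded first optical path as an effective straight ray from a virtual image of a screen corner to the observation point and intersects it with the observation window plane; the virtual corner positions and the plane parameters are assumptions supplied by the caller.

```python
import numpy as np

def projection_area(eye_left, eye_right, virtual_corners, window_point, window_normal):
    """Estimate the display projection area on the observation window.
    `virtual_corners` are the positions of the four screen-corner images formed
    by the beam splitter / concave mirror (assumed known from the optical
    design); each folded first optical path is treated as the straight ray from
    a virtual corner to the observation point."""
    observation_point = (np.asarray(eye_left, float) + np.asarray(eye_right, float)) / 2.0
    window_point = np.asarray(window_point, float)
    window_normal = np.asarray(window_normal, float)
    intersections = []
    for corner in virtual_corners:
        corner = np.asarray(corner, float)
        direction = observation_point - corner
        # ray/plane intersection: corner + s * direction lies on the window plane
        s = np.dot(window_point - corner, window_normal) / np.dot(direction, window_normal)
        intersections.append(corner + s * direction)
    return intersections  # the quadrilateral enclosed by these four points is the projection area
```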
Further, the step of determining the touch gesture corresponding to the touch operation point according to the position change and the time sequence change of the touch operation point on the touch panel specifically includes:
Determining, according to the position of the observation point, a second optical path corresponding to the line connecting the touch operation point and the observation point;
determining an intersection point of the second optical path and the display screen as a mapping point of the touch operation point on the display screen;
and determining touch control operation gestures corresponding to the mapping points according to the position changes and the time sequence changes of the mapping points on the display screen.
Specifically, the second optical path is a path that starts at the mapping point, with the light reflected by the beam splitter toward the concave mirror, reflected by the concave mirror, and then passing through the beam splitter and the observation window to reach the observation point; the intersection point of the second optical path with the observation window is the touch operation point. In the technical solution of the foregoing embodiment, when the coordinates of the observation point and of the touch operation point are known, the coordinates of the mapping point, that is, the point at which the second optical path falls onto the display screen, can be derived in reverse.
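Conversely, a geometric sketch of back-projecting a touch operation point onto an effective display screen plane along a straightened second optical path is shown below; the plane parameters are again assumptions, not the exact folded optical path.

```python
import numpy as np

def map_touch_to_screen(observation_point, touch_point, screen_point, screen_normal):
    """Back-project a touch operation point on the observation window onto an
    effective display screen plane along the straightened second optical path;
    `screen_point` / `screen_normal` define that plane and are assumptions."""
    observation_point = np.asarray(observation_point, float)
    touch_point = np.asarray(touch_point, float)
    screen_point = np.asarray(screen_point, float)
    screen_normal = np.asarray(screen_normal, float)
    direction = touch_point - observation_point
    s = np.dot(screen_point - observation_point, screen_normal) / np.dot(direction, screen_normal)
    return observation_point + s * direction  # mapping point of the touch operation on the screen
```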
As shown in fig. 2, a second aspect of the present invention proposes an interactive control method for a remote imaging display, including:
scanning a touch panel based on a pre-configured scanning frequency to acquire a capacitance image of the touch panel, wherein the capacitance image is a two-dimensional capacitance value image generated by mapping a coupling capacitance between each capacitive sensor on the touch panel and ground to a standard pixel value interval;
Determining a display projection area on an observation window of a remote imaging display, wherein the display projection area is the area of the observation window onto which the display picture on the display screen of the remote imaging display, as observed by a user through the observation window, is projected;
clipping the capacitance image according to the shape and the position of the display projection area to generate a touch operation image;
identifying a touch operation gesture input by a user from the touch operation image;
and executing the interaction control instruction corresponding to the touch operation gesture.
Specifically, the standard pixel value interval is a preconfigured standard interval for converting the ground coupling capacitance of a capacitive sensor of the touch panel into a pixel value of the capacitance image, and has a pixel value interval lower boundary pb and a pixel value interval upper boundary pt, where, illustratively, the lower boundary pb = 0 and the upper boundary pt = 255. By mapping the ground coupling capacitance of each capacitive sensor into this standard interval of pixel values, the capacitance data of the touch panel can be visualized and displayed as a gray image when the capacitance image is processed or tested; the two-dimensional capacitance value image is the gray image generated after the ground coupling capacitances of the capacitive sensors are mapped to the standard pixel value interval.
Further, the step of scanning the touch panel based on a pre-configured scanning frequency to obtain a capacitance image of the touch panel specifically includes:
Sequentially reading the ground coupling capacitance Co of each capacitive sensor on the touch panel;
mapping the ground coupling capacitance Co to a standard pixel value interval to obtain a pixel value ps of the current capacitance value of the capacitive sensor on the capacitance image, wherein the current capacitance value is the ground coupling capacitance Co of the capacitive sensor;
and filling the pixel value ps into a pixel position of the capacitive sensor corresponding to the capacitive image of the current frame.
In the technical solution of the foregoing embodiment, the step of mapping the ground coupling capacitance to a standard pixel value interval to obtain the pixel value ps of the current capacitance value of the capacitive sensor specifically includes:
Acquiring a capacitance value lower boundary Cb and a capacitance value upper boundary Ct of the touch panel;
Calculating a pixel value of a current capacitance value of the capacitive sensor on the capacitance image:
Because of the difference of hardware parameters, different touch panels often have different capacitance intervals, so the lower capacitance boundary Cb and the upper capacitance boundary Ct of the touch panels need to be obtained through periodic scan tests.
Further, the step of identifying the touch gesture input by the user from the touch operation image specifically includes:
Identifying a touch operation point on the touch operation image, wherein the touch operation point is the coordinate point of a capacitive sensor on the touch panel whose ground coupling capacitance changes by more than a preset change threshold when a user approaches or touches the touch panel with a finger or a touch tool;
analyzing the position change and time sequence change of a touch operation point on the touch panel on a continuous multi-frame touch operation image;
and determining a touch operation gesture corresponding to the touch operation point according to the position change and the time sequence change of the touch operation point on the touch panel, wherein the touch operation gesture comprises clicking operation, long-press operation and sliding operation.
Specifically, the user may operate on the touch panel directly with a finger, or through a touch tool such as a stylus. Preferably, the touch panel is a self-capacitance projection type touch panel, and when a finger or a stylus approaches any capacitive sensor on the touch panel, the ground coupling capacitance of the corresponding capacitive sensor changes. Further, the touch operation gesture may also include a multi-point clicking operation, a multi-point sliding operation, a multi-point pinch-in operation, a multi-point pinch-out operation, and the like.
Further, the step of identifying the touch operation point on the touch operation image specifically includes:
generating an unoperated image of the touch panel, wherein the unoperated image is a capacitance image generated in a state without external operation after the touch panel is electrified;
Performing a calibration process on the touch operation image based on the no-operation image to generate a target calibration image;
extracting extreme points from the target calibration image;
and determining the extreme point as a touch operation point on the touch operation image.
Specifically, when the touch panel is powered on and there is no external operation, that is, no external conductor is close, each capacitive sensor, that is, each ITO electrode, has a base coupling capacitance to ground Cp. When a finger of a user or another touch tool approaches or contacts the touch panel, an additional coupling capacitance Ch is superimposed on the base coupling capacitance Cp of the capacitive sensor, so that the total coupling capacitance to ground of the capacitive sensor becomes Cp+Ch. Because of factors such as hardware differences and manufacturing errors, the base coupling capacitance to ground Cp of each capacitive sensor is not completely consistent. In order to improve the accuracy of touch operation detection, the no-operation image needs to be generated as a reference image while the touch panel is not being externally operated, so as to calibrate the touch operation image of the touch panel when a touch operation of the user is received.
Further, the step of performing calibration processing on the touch operation image based on the no-operation image to generate a target calibration image specifically includes:
subtracting the pixel value of the corresponding coordinate in the non-operation image from the pixel value of each pixel in the touch operation image to generate a first calibration image;
Removing random noise with pixel values smaller than a preset noise threshold value from the first calibration image to generate a second calibration image;
determining the second calibration image as the target calibration image.
In the above-described embodiments, when a finger, a stylus, or the like approaches the touch panel, the ground coupling capacitance of the capacitive sensors at and around the approached position changes. This change is reflected on the touch operation image as a gray-scale region covering a plurality of capacitive sensors, and an extremum must be taken within that region to determine the position of the capacitive sensor closest to the user's finger, stylus, or the like, which is then determined as the touch operation point.
Further, the touch panel is a self-capacitance projection type touch panel, and before the step of recognizing the touch gesture input by the user from the touch operation image, the method further includes:
acquiring a preconfigured icon display distance interval and a touch operation distance interval;
calculating the real-time distance between an operating body and the touch panel according to the extreme point, wherein the operating body comprises a finger or a touch pen of a user;
when the real-time distance falls into the icon display distance interval, displaying an operation icon at a corresponding position on a display screen of the remote imaging display;
And when the real-time distance falls into the touch operation distance section, executing the steps of determining a touch operation gesture corresponding to the touch operation point according to the position change and the time sequence change of the touch operation point on the touch panel and executing an interaction control instruction corresponding to the touch operation gesture.
Specifically, in the icon display distance interval, the operation icon moves along with the movement of the finger of the user above the touch panel, so that the user can intuitively understand the position of the touch point of the finger mapped to the display screen, and misoperation caused by calculation errors or calculation delays of the optical path is avoided.
Further, the step of generating the no-operation image of the touch panel specifically includes:
Acquiring capacitance images P(ti) in a sampling period, wherein ti ∈ (tb, tt), i ∈ [1, nt], tb is the starting point of the sampling period, tt is the ending point of the sampling period, ti is the i-th sampling time point between tb and tt, and nt is the number of sampling points between tb and tt;
Calculating the capacitance fluctuation amplitude pr (ti) of each frame of capacitance image P (ti), wherein the capacitance fluctuation amplitude pr (ti) is the capacitance fluctuation amplitude of the pixel position with the largest capacitance fluctuation amplitude on the ith frame of capacitance image P (ti);
Constructing a capacitance fluctuation amplitude variation curve f (t) in the sampling period (tb,tt) based on the capacitance fluctuation amplitude pr (ti);
Determining a plateau (tstableb, tstablet) on said capacitance fluctuation amplitude variation curve f(t);
a no-operation capacitance image is generated based on the capacitance image of the plateau (tstableb,tstablet).
In the foregoing embodiment, one sampling point in the sampling period is a time point at which one capacitance image is acquired, and the system generates the capacitance images at regular intervals; that is, the generation frame rate of the capacitance images corresponds to the pre-configured scanning frequency.
Further, the step of constructing the capacitance fluctuation amplitude variation curve f (t) in the sampling period (tb,tt) based on the capacitance fluctuation amplitude pr (ti) specifically includes:
acquiring the discrete sequence of capacitance fluctuation amplitudes pr(t1), pr(t2), …, pr(tnt) within the sampling time period (tb, tt);
And fitting the capacitance fluctuation amplitude discrete sequence into the capacitance fluctuation amplitude change curve f (t) by using a curve fitting algorithm, wherein the curve fitting algorithm can be any one of a polynomial fitting algorithm, a least square fitting algorithm, a linear interpolation algorithm or a spline interpolation algorithm.
Further, the step of calculating the capacitance fluctuation width pr (ti) of each frame of the capacitance image P (ti) specifically includes:
acquiring the pixel value P[x, y](ti) at coordinates (x, y) in each frame of capacitance image P(ti) within the sampling time period (tb, tt);
Calculating a first pixel mean value of each pixel point position x, y in the sampling time period (tb,tt):
Calculating the relative pixel value of the pixel with the coordinate x and y in each frame of capacitance image P (ti):
Determining the maximum relative pixel value in each frame of capacitance image P (ti) as the capacitance fluctuation amplitude:
Where ms is the maximum column number of the capacitive image and ns is the maximum row number of the capacitive image.
Specifically, take the example that the capacitance image includes ms × ns pixels, that is, the touch panel has ms × ns capacitive sensors. In this case, 1 ≤ x ≤ ms and 1 ≤ y ≤ ns are satisfied for the capacitance image P(ti) at any ti. Similarly, in the step of calculating the first pixel mean value of each pixel position (x, y) over the sampling period (tb, tt), 1 ≤ x ≤ ms and 1 ≤ y ≤ ns.
Further, the step of determining the plateau (tstableb,tstablet) on the capacitance fluctuation amplitude variation curve f (t) specifically includes:
Acquiring a preset noise threshold p0;
Acquiring the peak points peakj whose amplitude is larger than the noise threshold p0 on the capacitance fluctuation amplitude change curve f(t), wherein j ∈ [1, npeak] and npeak is the number of peak points whose amplitude is larger than the noise threshold p0 on the capacitance fluctuation amplitude change curve f(t);
Calculating the time interval between two adjacent peak points peakj on the capacitance fluctuation amplitude change curve f (t):
Δt_peakj=t(peakj+1)-t(peakj);
Two adjacent peak points t (peakj0+1) and t (peakj0) are determined to satisfy:
Wherein j0 is any integer within the interval [1, npeak -1 ];
T (peakj0) is determined as the lower bound tstableb of the plateau and t (peakj0+1) is determined as the upper bound tstablet of the plateau.
Specifically, t (peakj+1) is a time point corresponding to the peak point (peakj+1) on the capacitance fluctuation amplitude variation curve f (t). Similarly, t (peakj) is a time point corresponding to the peak point (peakj) on the capacitance fluctuation amplitude variation curve f (t).
Further, the step of generating a non-operational capacitance image based on the capacitance image of the plateau (tstableb,tstablet) specifically includes:
Reading each frame of capacitance image P(tk) in the plateau (tstableb, tstablet), where tk ∈ (tstableb, tstablet), k ∈ [1, nstable], and nstable is the number of capacitance image frames within the plateau (tstableb, tstablet);
Calculating a second pixel mean value of each pixel position (x, y) in the stable section (tstableb, tstablet):
Determining the second pixel mean value of each pixel position (x, y) in the stable section (tstableb, tstablet) as the pixel value of the no-operation capacitance image at the pixel position (x, y).
Further, the step of determining the display projection area on the observation window of the remote imaging display specifically includes:
acquiring real-time positions of two eyes of a user;
determining the midpoint of a real-time position connecting line of two eyes of a user as an observation point;
calculating four first optical paths between the observation point and the four corners of the display screen of the remote imaging display;
Determining the intersection points of the four first optical paths with the plane where the observation window is located;
And determining the area surrounded by the connecting lines of the four intersection points as the display projection area.
Specifically, a first optical path is a path that starts at one of the corners of the display screen, with the light reflected by the beam splitter toward the concave mirror, reflected by the concave mirror, and then passing through the beam splitter and the observation window to reach the observation point. Because the observation window lies on the optical path between the display screen and the eyes of the user, and the display picture of the display screen is reflected and magnified multiple times by the beam splitter and the concave mirror, the optical paths corresponding to the four corners of the display picture, that is, the paths connecting the observation point with the four corners of the display screen of the remote imaging display, differ when the positions of the eyes of the user differ; in other words, the projection position of the display picture on the observation window necessarily changes as the position of the eyes of the user shifts. In some embodiments of the present invention, the remote imaging display further includes an image sensor for monitoring the eye position of the user in real time, and the image sensor is used to acquire the eye positions of the user in real time so as to determine the projection area of the display screen on the observation window. In the technical solution of the foregoing embodiment, when the coordinates of the observation point and of the four corners of the display screen are known, the coordinates of the intersection points of the first optical paths with the plane where the observation window is located can be calculated.
Further, the step of determining the touch gesture corresponding to the touch operation point according to the position change and the time sequence change of the touch operation point on the touch panel specifically includes:
Determining, according to the position of the observation point, a second optical path corresponding to the line connecting the touch operation point and the observation point;
determining an intersection point of the second optical path and the display screen as a mapping point of the touch operation point on the display screen;
and determining touch control operation gestures corresponding to the mapping points according to the position changes and the time sequence changes of the mapping points on the display screen.
Specifically, the second optical path is a path that starts at the mapping point, with the light reflected by the beam splitter toward the concave mirror, reflected by the concave mirror, and then passing through the beam splitter and the observation window to reach the observation point; the intersection point of the second optical path with the observation window is the touch operation point. In the technical solution of the foregoing embodiment, when the coordinates of the observation point and of the touch operation point are known, the coordinates of the mapping point, that is, the point at which the second optical path falls onto the display screen, can be derived in reverse.
The invention provides an interaction control system for a remote imaging display, which includes a remote imaging display for performing remote enlarged imaging on display content and a terminal device for providing the display content. The remote imaging display includes a display screen, a circuit board, a beam splitter, a concave mirror, and an observation window; the beam splitter is arranged above the display screen, the concave mirror is arranged on one side of the beam splitter, and the observation window is arranged on the other side of the beam splitter. The interaction control system further includes a touch panel for receiving touch input operations of a user, and an interaction control device connected to the touch panel and the terminal device. The circuit board is provided with a first communication module for communication connection with the terminal device, and the circuit board obtains display content from the terminal device through the first communication module and provides the display content to the display screen. Further, a second communication module for communication connection with the terminal device is arranged on the touch panel; after the first communication module and the second communication module establish their communication connections, the terminal device receives touch operations on the touch panel through the second communication module and, in response to the touch operations, controls the display content provided to the circuit board.
The first communication module and the second communication module may be wireless communication modules such as Wi-Fi or Bluetooth, or wired communication modules connected through a PS/2 interface, a USB interface, or the like. The touch panel is connected to the terminal device in a wired or wireless manner, and a user may input touch operations to the terminal device by holding the touch panel in the hand to control the display content on the remote imaging display. In some embodiments of the present invention, the interactive control device is a control chip disposed on the circuit board of the remote imaging display, and the control chip is connected to the touch panel through a driving circuit of the touch panel to obtain the interactive operation of a user from the touch panel. In some embodiments of the present invention, the processor of the terminal device may be reused as the interaction control device; similarly, the touch panel is connected to the terminal device through a driving circuit, so that the terminal device may obtain the interactive operation of the user from the touch panel.
Further, the touch panel includes a plurality of capacitive sensors distributed on the touch panel. The touch panel is composed of two layers of mutually perpendicular, staggered ITO (Indium Tin Oxide) electrodes; the ITO electrodes serve as the capacitive sensors, and a scanning circuit extracts touch information by scanning the coupling capacitance between each ITO electrode and ground.
In the above interactive control system for a remote imaging display, the interactive control device is configured to:
scanning a touch panel based on a pre-configured scanning frequency to acquire a capacitance image of the touch panel, wherein the capacitance image is a two-dimensional capacitance value image generated by mapping a coupling capacitance between each capacitive sensor on the touch panel and ground to a standard pixel value interval;
generating a touch operation image corresponding to the capacitance image;
identifying a touch operation gesture input by a user from the touch operation image;
and executing the interaction control instruction corresponding to the touch operation gesture.
Specifically, the standard pixel value interval is a preconfigured standard interval for converting the ground coupling capacitance of a capacitive sensor of the touch panel into a pixel value of the capacitance image, and has a pixel value interval lower boundary pb and a pixel value interval upper boundary pt, where, illustratively, the lower boundary pb = 0 and the upper boundary pt = 255. By mapping the ground coupling capacitance of each capacitive sensor into this standard interval of pixel values, the capacitance data of the touch panel can be visualized and displayed as a gray image when the capacitance image is processed or tested; the two-dimensional capacitance value image is the gray image generated after the ground coupling capacitances of the capacitive sensors are mapped to the standard pixel value interval.
Further, in the step of scanning the touch panel based on a pre-configured scanning frequency to acquire a capacitance image of the touch panel, the interaction control device is configured to:
Sequentially reading the ground coupling capacitance Co of each capacitive sensor on the touch panel;
mapping the ground coupling capacitance Co to a standard pixel value interval to obtain a pixel value ps of the current capacitance value of the capacitive sensor on the capacitance image, wherein the current capacitance value is the ground coupling capacitance Co of the capacitive sensor;
and filling the pixel value ps into a pixel position of the capacitive sensor corresponding to the capacitive image of the current frame.
In the foregoing embodiment, in the step of mapping the ground coupling capacitance to a standard pixel value interval to obtain the pixel value ps of the current capacitance value of the capacitive sensor, the interaction control device is configured to:
Acquiring a capacitance value lower boundary Cb and a capacitance value upper boundary Ct of the touch panel;
Calculating a pixel value of a current capacitance value of the capacitive sensor on the capacitance image:
Because of the difference of hardware parameters, different touch panels often have different capacitance intervals, so the lower capacitance boundary Cb and the upper capacitance boundary Ct of the touch panels need to be obtained through periodic scan tests.
Further, in the step of recognizing a touch operation gesture input by a user from the touch operation image, the interaction control device is configured to:
Identifying a touch operation point on the touch operation image, wherein the touch operation point is the coordinate point of a capacitive sensor on the touch panel whose ground coupling capacitance changes by more than a preset change threshold when a user approaches or touches the touch panel with a finger or a touch tool;
analyzing the position change and time sequence change of a touch operation point on the touch panel on a continuous multi-frame touch operation image;
and determining a touch operation gesture corresponding to the touch operation point according to the position change and the time sequence change of the touch operation point on the touch panel, wherein the touch operation gesture comprises clicking operation, long-press operation and sliding operation.
Specifically, the user may operate on the touch panel directly with a finger, or through a touch tool such as a stylus. Preferably, the touch panel is a self-capacitance projection type touch panel, and when a finger or a stylus approaches any capacitive sensor on the touch panel, the ground coupling capacitance of the corresponding capacitive sensor changes. Further, the touch operation gesture may also include a multi-point clicking operation, a multi-point sliding operation, a multi-point pinch-in operation, a multi-point pinch-out operation, and the like.
Further, in the step of identifying a touch operation point on the touch operation image, the interaction control device is configured to:
generating an unoperated image of the touch panel, wherein the unoperated image is a capacitance image generated in a state without external operation after the touch panel is electrified;
Performing a calibration process on the touch operation image based on the no-operation image to generate a target calibration image;
extracting extreme points from the target calibration image;
and determining the extreme point as a touch operation point on the touch operation image.
Specifically, when the touch panel is powered on and there is no external operation, that is, no external conductor is close, each capacitive sensor, that is, each ITO electrode, has a base coupling capacitance to ground Cp. When a finger of a user or another touch tool approaches or contacts the touch panel, an additional coupling capacitance Ch is superimposed on the base coupling capacitance Cp of the capacitive sensor, so that the total coupling capacitance to ground of the capacitive sensor becomes Cp+Ch. Because of factors such as hardware differences and manufacturing errors, the base coupling capacitance to ground Cp of each capacitive sensor is not completely consistent. In order to improve the accuracy of touch operation detection, the no-operation image needs to be generated as a reference image while the touch panel is not being externally operated, so as to calibrate the touch operation image of the touch panel when a touch operation of the user is received.
Further, in the step of performing a calibration process on the touch operation image based on the no-operation image to generate a target calibration image, the interaction control means is configured to:
subtracting the pixel value of the corresponding coordinate in the non-operation image from the pixel value of each pixel in the touch operation image to generate a first calibration image;
Removing random noise with pixel values smaller than a preset noise threshold value from the first calibration image to generate a second calibration image;
determining the second calibration image as the target calibration image.
In the above-described embodiments, when a finger, a stylus, or the like approaches the touch panel, the ground coupling capacitance of the capacitive sensors at and around the approached position changes. This change is reflected on the touch operation image as a gray-scale region covering a plurality of capacitive sensors, and an extremum must be taken within that region to determine the position of the capacitive sensor closest to the user's finger, stylus, or the like, which is then determined as the touch operation point.
Further, the touch panel is a self-capacitance projection type touch panel, and before the step of identifying the touch operation gesture input by the user from the touch operation image, the interaction control device is configured to:
acquiring a preconfigured icon display distance interval and a touch operation distance interval;
calculating the real-time distance between an operating body and the touch panel according to the extreme point, wherein the operating body comprises a finger or a touch pen of a user;
when the real-time distance falls into the icon display distance interval, displaying an operation icon at a corresponding position on a display screen of the remote imaging display;
And when the real-time distance falls into the touch operation distance section, executing the steps of determining a touch operation gesture corresponding to the touch operation point according to the position change and the time sequence change of the touch operation point on the touch panel and executing an interaction control instruction corresponding to the touch operation gesture.
Specifically, in the icon display distance interval, the operation icon moves along with the movement of the finger of the user above the touch panel, so that the user can intuitively understand the position of the touch point of the finger mapped to the display screen, and misoperation caused by calculation errors or calculation delays of the optical path is avoided.
Further, in the step of generating the no-operation image of the touch panel, the interaction control means is configured to:
Acquiring capacitance images P(ti) in a sampling period, wherein ti ∈ (tb, tt), i ∈ [1, nt], tb is the starting point of the sampling period, tt is the ending point of the sampling period, ti is the i-th sampling time point between tb and tt, and nt is the number of sampling points between tb and tt;
Calculating the capacitance fluctuation amplitude pr (ti) of each frame of capacitance image P (ti), wherein the capacitance fluctuation amplitude pr (ti) is the capacitance fluctuation amplitude of the pixel position with the largest capacitance fluctuation amplitude on the ith frame of capacitance image P (ti);
Constructing a capacitance fluctuation amplitude variation curve f (t) in the sampling period (tb,tt) based on the capacitance fluctuation amplitude pr (ti);
Determining a plateau (tstableb, tstablet) on said capacitance fluctuation amplitude variation curve f(t);
a no-operation capacitance image is generated based on the capacitance image of the plateau (tstableb,tstablet).
In the foregoing embodiment, one sampling point in the sampling period is a time point at which one capacitance image is acquired, and the system generates the capacitance images at regular intervals; that is, the generation frame rate of the capacitance images corresponds to the pre-configured scanning frequency.
Further, in the step of constructing a capacitance fluctuation amplitude variation curve f (t) within the sampling period (tb,tt) based on the capacitance fluctuation amplitude pr (ti), the interaction control device is configured to:
acquiring the discrete sequence of capacitance fluctuation amplitudes pr(t1), pr(t2), …, pr(tnt) within the sampling time period (tb, tt);
And fitting the capacitance fluctuation amplitude discrete sequence into the capacitance fluctuation amplitude change curve f (t) by using a curve fitting algorithm, wherein the curve fitting algorithm can be any one of a polynomial fitting algorithm, a least square fitting algorithm, a linear interpolation algorithm or a spline interpolation algorithm.
Further, in the step of calculating the capacitance fluctuation width pr (ti) of each frame of the capacitance image P (ti), the interaction control means is configured to:
Acquiring the pixel value P[x, y](ti) at coordinates (x, y) in each frame of capacitance image P(ti) within the sampling time period (tb, tt);
Calculating a first pixel mean value of each pixel point position x, y in the sampling time period (tb,tt):
Calculating the relative pixel value of the pixel with the coordinate x and y in each frame of capacitance image P (ti):
Determining the maximum relative pixel value in each frame of capacitance image P (ti) as the capacitance fluctuation amplitude:
Where ms is the maximum column number of the capacitive image and ns is the maximum row number of the capacitive image.
Specifically, take the example that the capacitance image includes ms × ns pixels, that is, the touch panel has ms × ns capacitive sensors. In this case, 1 ≤ x ≤ ms and 1 ≤ y ≤ ns are satisfied for the capacitance image P(ti) at any ti. Similarly, in the step of calculating the first pixel mean value of each pixel position (x, y) over the sampling period (tb, tt), 1 ≤ x ≤ ms and 1 ≤ y ≤ ns.
Further, the step of determining the plateau (tstableb,tstablet) on the capacitance fluctuation amplitude variation curve f (t) specifically includes:
Acquiring a preset noise threshold p0;
Acquiring the peak points peakj whose amplitude is larger than the noise threshold p0 on the capacitance fluctuation amplitude change curve f(t), wherein j ∈ [1, npeak] and npeak is the number of peak points whose amplitude is larger than the noise threshold p0 on the capacitance fluctuation amplitude change curve f(t);
Calculating the time interval between two adjacent peak points peakj on the capacitance fluctuation amplitude change curve f (t):
Δt_peakj=t(peakj+1)-t(peakj);
Two adjacent peak points t (peakj0+1) and t (peakj0) are determined to satisfy:
Wherein j0 is any integer within the interval [1, npeak -1 ];
T (peakj0) is determined as the lower bound tstableb of the plateau and t (peakj0+1) is determined as the upper bound tstablet of the plateau.
Specifically, t (peakj+1) is a time point corresponding to the peak point (peakj+1) on the capacitance fluctuation amplitude variation curve f (t). Similarly, t (peakj) is a time point corresponding to the peak point (peakj) on the capacitance fluctuation amplitude variation curve f (t).
Further, the step of generating a non-operational capacitance image based on the capacitance image of the plateau (tstableb,tstablet) specifically includes:
Reading each frame of capacitance image P(tk) in the plateau (tstableb, tstablet), where tk ∈ (tstableb, tstablet), k ∈ [1, nstable], and nstable is the number of capacitance image frames within the plateau (tstableb, tstablet);
Calculating a second pixel mean value of each pixel position (x, y) in the stable section (tstableb, tstablet):
Determining the second pixel mean value of each pixel position (x, y) in the stable section (tstableb, tstablet) as the pixel value of the no-operation capacitance image at the pixel position (x, y).
It should be noted that in this document relational terms such as first and second are used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. Moreover, the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a/an" does not exclude the presence of other like elements in the process, method, article, or apparatus that comprises the element.
The embodiments of the present invention described above are not intended to be exhaustive or to limit the invention to the precise forms disclosed. Obviously, many modifications and variations are possible in light of the above teaching. The embodiments were chosen and described in order to best explain the principles of the invention and its practical application, thereby enabling others skilled in the art to best utilize the invention with various modifications as are suited to the particular use contemplated. The invention is limited only by the claims and their full scope and equivalents.