Detailed Description
The technical solutions in the embodiments of the present application will be described below with reference to the drawings in the embodiments of the present application. In the description of the embodiments of the present application, unless otherwise indicated, "/" means "or"; for example, A/B may represent A or B. "And/or" herein merely describes an association relationship between associated objects and indicates that three relationships may exist; for example, "A and/or B" may mean: A exists alone, both A and B exist, or B exists alone. In addition, in the description of the embodiments of the present application, "plurality" means two or more than two.
The terms "first," "second," "third," and the like, are used below for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defining "a first", "a second", or a third "may explicitly or implicitly include one or more such feature.
Currently, many display screens of electronic devices have a touch function, that is, the display screen is a touch display screen. A touch display screen is also known as a touch-sensitive display or a touch screen. The touch display screen includes a touch panel (TP) and a display panel, where the touch panel is also called a touch sensor or the like. When a user uses the electronic device, a frozen screen fault may occur. When the frozen screen fault occurs, the picture of the electronic device is frozen, and the display screen does not respond to any touch operation performed by the user.
One cause of the frozen screen fault is a failure of the display screen itself. For example, when the power-on timing sequence and logic of the touch panel and display panel drivers are abnormal, the touch panel cannot be powered on normally and cannot report a touch event to the framework; as a result, the display screen does not respond, and the frozen screen fault occurs.
In general, when a frozen screen fault occurs on an electronic device, the electronic device needs to be forcibly powered off or restarted before it can be used again, resulting in poor user experience. Therefore, the frozen screen fault of the electronic device needs to be detected so that the frozen screen condition can be analyzed and processed, the frozen screen problem can be resolved, and the user experience can be improved. The frozen screen fault detection method provided by the embodiments of the present application aims to detect a frozen screen of the electronic device caused by a failure of the display screen, which facilitates later analysis of the specific cause of the frozen screen fault, effectively improves the performance of the display screen, and improves the user experience.
Before the freeze screen fault detection method provided by the embodiments of the present application is explained, the structure of an electronic device having a touch display screen and the process by which the electronic device implements the touch function are first described.
Optionally, the electronic device may be a mobile phone, a tablet computer, a wearable device, a vehicle-mounted device, an augmented reality (AR)/virtual reality (VR) device, a notebook computer, an ultra-mobile personal computer (UMPC), a netbook, a personal digital assistant (personal digital assistant, PDA), or another device having a touch display screen, and the specific type of the electronic device is not limited in the embodiments of the present application.
Fig. 1 is a schematic structural diagram of an electronic device 100 according to an embodiment of the present application. The electronic device 100 may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (universal serial bus, USB) interface 130, a charge management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, a sensor module 180, keys 190, a motor 191, an indicator 192, a camera 193, a display 194, and a subscriber identity module (subscriber identification module, SIM) card interface 195, etc. The sensor module 180 may include a pressure sensor 180A, a gyro sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, and the like.
It is to be understood that the structure illustrated in the embodiments of the present application does not constitute a specific limitation on the electronic device 100. In other embodiments of the present application, the electronic device 100 may include more or fewer components than shown, or some components may be combined, or some components may be split, or the components may be arranged differently. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
The processor 110 may include one or more processing units. For example, the processor 110 may include an application processor (application processor, AP), a modem processor, a graphics processor (graphics processing unit, GPU), an image signal processor (image signal processor, ISP), a controller, a memory, a video codec, a digital signal processor (digital signal processor, DSP), a baseband processor, and/or a neural network processor (neural-network processing unit, NPU), etc. The different processing units may be separate devices or may be integrated in one or more processors.
The controller may be the nerve center and command center of the electronic device 100. The controller may generate an operation control signal according to an instruction operation code and a timing signal to complete control of instruction fetching and instruction execution.
A memory may also be provided in the processor 110 for storing instructions and data. In some embodiments, the memory in the processor 110 is a cache memory. The memory may hold instructions or data that the processor 110 has just used or used cyclically. If the processor 110 needs to reuse the instructions or data, they may be called directly from the memory. Repeated accesses are thus avoided, and the waiting time of the processor 110 is reduced, thereby improving the efficiency of the system.
In some embodiments, theprocessor 110 may include one or more interfaces. The interfaces may include an integrated circuit (inter-integrated circuit, I2C) interface, an integrated circuit built-in audio (inter-integrated circuit sound, I2S) interface, a pulse code modulation (pulse code modulation, PCM) interface, a universal asynchronous receiver transmitter (universal asynchronous receiver/transmitter, UART) interface, a mobile industry processor interface (mobile industry processor interface, MIPI), a general-purpose input/output (GPIO) interface, a subscriber identity module (subscriber identity module, SIM) interface, and/or a universal serial bus (universal serial bus, USB) interface, among others.
It should be understood that the interfacing relationship between the modules illustrated in the embodiments of the present application is only illustrative, and does not limit the structure of the electronic device 100. In other embodiments of the present application, the electronic device 100 may also use different interfacing manners, or a combination of multiple interfacing manners in the foregoing embodiments.
The wireless communication module 160 may provide solutions for wireless communication applied to the electronic device 100, including wireless local area network (wireless local area networks, WLAN) (e.g., wireless fidelity (wireless fidelity, Wi-Fi) network), Bluetooth (BT), global navigation satellite system (global navigation satellite system, GNSS), frequency modulation (frequency modulation, FM), near field communication (near field communication, NFC), infrared (IR) technology, etc. The wireless communication module 160 may be one or more devices that integrate at least one communication processing module. The wireless communication module 160 receives electromagnetic waves via the antenna 2, frequency-modulates and filters the electromagnetic wave signals, and sends the processed signals to the processor 110. The wireless communication module 160 may also receive a signal to be transmitted from the processor 110, frequency-modulate and amplify it, and convert it into electromagnetic waves for radiation via the antenna 2.
It can be appreciated that in the embodiments of the present application, the electronic device 100 may perform wireless communication with other electronic devices such as a server through the wireless communication module 160, so as to send information such as a log file to the server.
The camera 193 is used to capture still images or video. In some embodiments, electronic device 100 may include 1 or N cameras 193, N being a positive integer greater than 1. Alternatively, the still image or video captured by the camera 193 may be stored in a memory or may be directly applied to an application program, for example, the captured still image or video may be applied to face unlocking or face recognition.
The electronic device 100 may implement audio functions, such as music playing and recording, through an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, an application processor, and the like.
The audio module 170 is used to convert digital audio information into an analog audio signal for output and also to convert an analog audio input into a digital audio signal. The audio module 170 may also be used to encode and decode audio signals. In some embodiments, the audio module 170 may be disposed in the processor 110, or a portion of the functional modules of the audio module 170 may be disposed in the processor 110.
For example, the electronic device 100 may perform voiceprint recognition by using the microphone 170C together with the audio module 170, the application processor, and the like, so as to implement voiceprint unlocking, application lock access, voiceprint photographing, and the like.
The fingerprint sensor 180H is used to collect a fingerprint. The electronic device 100 may use the collected fingerprint feature to implement fingerprint unlocking, application lock access, fingerprint-based photographing, fingerprint-based incoming call answering, and the like.
The touch sensor 180K is also referred to as a touch panel. The touch sensor 180K may be disposed on the display 194, and the touch sensor 180K and the display 194 form a touch display screen. The touch sensor 180K is used to detect a touch operation acting on or near it. The touch sensor 180K may generate a touch signal in response to a touch operation of a user and transmit the touch signal to the application processor to generate touch data. Visual output related to the touch operation may be provided through the display 194. In other embodiments, the touch sensor 180K may also be disposed on the surface of the electronic device 100 at a location different from the display 194.
The software system of the electronic device 100 may employ a layered architecture, an event driven architecture, a microkernel architecture, a microservice architecture, or a cloud architecture. In this embodiment, taking an Android system with a layered architecture as an example, a software structure of the electronic device 100 is illustrated.
Fig. 2 is a software structure block diagram of the electronic device 100 according to the embodiment of the present application. The layered architecture divides the software into several layers, and each layer has a clear role and division of labor. The layers communicate with each other through a software interface. In some embodiments, the Android system is divided into four layers, which are, from top to bottom, an application layer, an application framework layer, an Android runtime (Android runtime) and system libraries, and a kernel layer. The application layer may include a series of application packages.
As shown in fig. 2, the application package may include applications for cameras, gallery, calendar, phone calls, maps, navigation, WLAN, bluetooth, music, video, short messages, etc.
The application framework layer provides an application programming interface (application programming interface, API) and programming framework for application programs of the application layer. The application framework layer includes a number of predefined functions.
As shown in FIG. 2, the application framework layer may include a window manager, a content provider, a view system, a telephony manager, a resource manager, a notification manager, and the like.
The window manager is used for managing window programs. The window manager can acquire the size of the display screen, determine whether there is a status bar, lock the screen, take a screenshot, and the like.
The content provider is used to store and retrieve data and make such data accessible to applications. The data may include video, images, audio, calls made and received, browsing history and bookmarks, phonebooks, etc.
The view system includes visual controls, such as controls to display text, controls to display pictures, and the like. The view system may be used to build applications. The display interface may be composed of one or more views. For example, a display interface including a text message notification icon may include a view displaying text and a view displaying a picture.
The telephony manager is used to provide the communication functions of the electronic device 100, for example, management of a call status (including connected, hung up, and the like).
The resource manager provides various resources for the application program, such as localization strings, icons, pictures, layout files, video files, and the like.
The notification manager enables an application to display notification information in the status bar, and may be used to convey a notification-type message that automatically disappears after a short stay without requiring user interaction. For example, the notification manager is used to notify of a completed download, provide a message alert, and the like. The notification manager may also present a notification in the form of a chart or scroll-bar text in the status bar at the top of the system, such as a notification of an application running in the background, or a notification that appears on the screen in the form of a dialog window. For example, text information is prompted in the status bar, a prompt tone is emitted, the electronic device vibrates, and an indicator light blinks.
In addition, the application framework layer further includes modules related to event reporting and management, such as an event monitoring (Event Hub) module, an input reading (input reader) module, and an input distributing (input dispatcher) module.
The Android runtime includes a core library and a virtual machine. The Android runtime is responsible for scheduling and management of the Android system.
The core library consists of two parts: one part is functions that the Java language needs to call, and the other part is the core library of Android.
The application layer and the application framework layer run in a virtual machine. The virtual machine executes java files of the application program layer and the application program framework layer as binary files. The virtual machine is used for executing the functions of object life cycle management, stack management, thread management, security and exception management, garbage collection and the like.
The system library may include a plurality of functional modules. For example: surface manager (surface manager), media library (media library), three-dimensional graphics processing library (e.g., openGL ES), 2D graphics engine (e.g., SGL), etc.
The surface manager is used to manage the display subsystem and provides a fusion of 2D and 3D layers for multiple applications.
The media library supports playback and recording of a plurality of commonly used audio and video formats, still image files, and the like. The media library may support a variety of audio and video encoding formats, such as MPEG4, H.264, MP3, AAC, AMR, JPG, and PNG.
The three-dimensional graphic processing library is used for realizing three-dimensional graphic drawing, image rendering, synthesis, layer processing and the like.
The 2D graphics engine is a drawing engine for 2D drawing.
The kernel layer is a layer between hardware and software. The kernel layer includes at least a display driver, a camera driver, an audio driver, and a sensor driver. In addition, the kernel layer may further include a touch signal processing module, a touch data generating module, a log (log) generating module, a communication module, and the like. The communication module may be a wired communication module or a wireless communication module.
Fig. 3 is a schematic touch flow chart of an electronic device according to an embodiment of the present application. As shown in fig. 3, the structure and modules involved in implementing the touch function of the electronic device include: the touch panel 201 of the hardware layer, the touch signal processing module 202, the touch data generating module 203 and the log generating module 207 of the kernel layer, the event monitoring module 204 of the application framework layer, the input reading module 205 of the application framework layer, the input distributing module 206 of the application framework layer, and the like.
The user performs a touch operation through the touch panel 201. The touch operation includes, but is not limited to, a touch operation, a click operation, a slide operation, a long press operation, and the like. The touch panel 201 receives the touch operation of the user, generates a touch signal, and reports the generated touch signal to the touch signal processing module 202 of the kernel layer. The touch signal processing module 202 receives the touch signal, encapsulates the touch signal, and outputs the encapsulated touch signal to the touch data generating module 203 of the kernel layer. Optionally, the touch signal processing module 202 may report the encapsulated touch signal to the touch data generating module 203 through an I2C interface, an MIPI interface, an SPI interface, or the like. The touch data generating module 203 performs normalization processing, data calibration processing, and the like on the touch signal, so as to generate data (input.c) of the touch event.
It is understood that a touch operation performed by the user generates one or more touch events. One touch event corresponds to an operation of one touch contact object (such as a finger or a stylus pen), and one touch event corresponds to one group of data. For example, the user may perform a slide-up operation on the screen with one finger or with a plurality of fingers, and each finger operating on the screen generates one touch event corresponding to one group of data.
Each touch event includes at least one press (DOWN) event and one lift (UP) event. The press event corresponds to a pressing operation of the user's finger, a stylus, or the like, that is, the user's finger, the stylus, or the like comes into contact with the screen. The lift event corresponds to a lifting operation of the user's finger, a stylus, or the like, that is, the user's finger or stylus leaves the screen. Additionally, a touch event may also include one or more movement (MOVE) events. A movement event, also called a movement point, corresponds to the user's finger or a stylus remaining in contact with the screen or moving along the screen. It will be appreciated that the movement events of one touch event occur, in time, between the press event and the lift event of that touch event.
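For illustration only, the composition of a touch event described above may be sketched as a simple data structure. The following Python sketch is not part of the kernel implementation; the class and field names are assumptions introduced only to make the press/move/lift structure and the shared event identifier concrete.

```python
# Illustrative sketch only: the class and field names are assumptions, not the actual kernel data format.
from dataclasses import dataclass, field
from typing import List

@dataclass
class ReportPoint:
    x: int            # report point X coordinate
    y: int            # report point Y coordinate
    timestamp: float  # report point time
    rate: int = 0     # report point rate

@dataclass
class TouchEvent:
    event_id: int      # event identifier (tracking ID), shared by all sub-events
    down: ReportPoint  # press (DOWN) event
    up: ReportPoint    # lift (UP) event
    moves: List[ReportPoint] = field(default_factory=list)  # zero or more movement (MOVE) events

    def is_valid(self) -> bool:
        # The movement events of a touch event lie between its press and lift events in time.
        return all(self.down.timestamp <= m.timestamp <= self.up.timestamp for m in self.moves)
```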
The touch data generating module 203 reports the generated data of the touch event to the event monitoring module 204 and the input reading module 205 of the application framework layer for processing. After the data is processed by the event monitoring module 204 and the input reading module 205, the input distributing module 206 distributes the touch event to the corresponding application program of the application layer, and the corresponding application program responds accordingly.
Meanwhile, after the touch data generating module 203 generates the data of the touch event, the data of the touch event may be further sent to the log generating module 207. The log generating module 207 generates a log file according to the data of the touch event. Optionally, the touch data generating module 203 may send the data of the touch event to the log generating module 207 in real time, so that the log generating module 207 generates the log file; that is, each time the touch data generating module 203 generates data of one touch event, the data of the touch event is sent to the log generating module 207, and part or all of the data of the touch event is written into the log file by the log generating module 207. The log file may be stored in a memory of the electronic device or may be sent to other electronic devices, for example, to a server.
In addition, it can be appreciated that, during the running of each application program of the electronic device, the log generating module generates a corresponding application running log, that is, an app log. Moreover, the software system of the electronic device may also generate a system log, such as a kmsgcat log.
The freeze screen fault detection method provided by the embodiments of the present application is used for detecting a freeze screen fault of the electronic device having the structure shown in fig. 1 to fig. 3. Specifically, the freeze screen fault detection method detects the freeze screen fault according to the running log and the touch data of the electronic device. The touch data includes data of touch events, that is, the data (input.c) of touch events generated by the touch data generating module 203 in the embodiment of fig. 3.
It should be noted that the method for detecting the frozen screen fault provided by the embodiment of the application can be applied to electronic equipment. The electronic device may be an electronic device having a touch function, that is, capable of generating touch data, for example, a terminal device, and the structure of the terminal device may be as shown in fig. 1 to 3; the electronic device may also be another electronic device, such as a server, communicatively coupled to the electronic device that generated the touch data. When the method provided by the embodiment of the application is applied to the terminal equipment, optionally, the terminal equipment can acquire touch data from the kernel layer in real time and acquire the operation log from the application program layer and/or the kernel layer so as to detect the frozen screen fault. Optionally, the terminal device may also store the generated touch data and the running log in the memory, and the terminal device may obtain the touch data from the memory, and detect the freeze screen fault according to the touch data.
When the method provided by the embodiment of the application is applied to the server, the server can be a cloud server or a physical server. Alternatively, the terminal device may upload the running log to the server. Meanwhile, the terminal equipment can upload the generated touch data to a server, and the server processes the operation log and the touch data to detect the frozen screen fault.
For convenience of explanation, the following embodiments are described by taking as an example a case in which the frozen screen fault detection method is applied to a server and the electronic device that generates the touch data is a terminal device, specifically a mobile phone.
First, with reference to fig. 1 to fig. 3, the process in which the server acquires the touch data and the running log before the freeze screen fault detection method is performed is described. It may be understood that, in this embodiment, each module represents a module that implements a certain function, which may be implemented by hardware, by software, or by a combination of software and hardware, which is not limited in this application.
Fig. 4 is a schematic diagram illustrating a touch data acquisition process according to an embodiment of the present application. It can be understood that after the mobile phone generates the touch data, the touch data can be uploaded to the server in the form of a log file. In this embodiment, the log file including the touch data is referred to as a first log file. The following is described in connection with fig. 4:
As shown in fig. 4, the process of obtaining touch data by the server may include:
S401, a user executes a touch operation through a touch panel of the mobile phone.
S402, the touch panel receives touch operation of a user and generates a touch signal.
S403, the touch panel reports the generated touch signal to the touch signal processing module.
S404, the touch signal processing module receives the touch signal and packages the touch signal.
S405, the touch signal processing module sends the packaged touch signal to the touch data generating module.
S406, the touch data generation module performs normalization processing, data calibration and other processing on the packaged touch signals to generate touch data.
S407, the touch data generation module sends the generated touch data to the first log generation module.
S408, the first log generation module generates a first log file according to the touch data.
That is, the first log generation module writes part or all of the data of each touch event into the log file. In other words, the first log file includes touch data, and the touch data may include part or all of the data of the touch events, for example, data of press events and data of lift events in the touch events; or data of press events, data of lift events, and data of movement events in the touch events.
S409, the first log generation module sends the first log file to the communication module.
S4010, the communication module sends the first log file to the server.
Alternatively, the first log generation module may periodically package and send the generated first log file to the communication module, and the communication module sends the first log file to the server. Optionally, the first log generation module may also package and send the first log file according to its size, that is, when the amount of data in the first log file reaches a preset data amount (an illustrative sketch of this upload policy is given after step S4011 below).
S4011, the server receives the first log file sent by the communication module.
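The upload policy mentioned above, periodic upload or upload once the first log file reaches a preset data amount, can be sketched as follows. This is a minimal illustration under stated assumptions: the interval, the size threshold, and the send_to_server() placeholder are all hypothetical and do not correspond to an actual interface of the communication module.

```python
# Illustrative sketch of the upload policy in S4010; the interval, size threshold,
# and send_to_server() placeholder are hypothetical assumptions.
import os
import time

UPLOAD_INTERVAL_S = 3600              # assumed periodic upload interval
MAX_LOG_SIZE_BYTES = 4 * 1024 * 1024  # assumed preset data amount

def send_to_server(path: str) -> None:
    """Placeholder for the communication module that uploads the packaged first log file."""
    raise NotImplementedError

def maybe_upload_first_log(path: str, last_upload: float) -> float:
    """Package and send the first log file periodically or once it reaches the preset size."""
    now = time.time()
    size_reached = os.path.exists(path) and os.path.getsize(path) >= MAX_LOG_SIZE_BYTES
    if size_reached or now - last_upload >= UPLOAD_INTERVAL_S:
        send_to_server(path)
        return now
    return last_upload
```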
Fig. 5 is a schematic diagram of an acquisition flow of an application running log according to an embodiment of the present application. In this embodiment, the log file including the application running log is referred to as a second log file. As shown in fig. 5, the process of the server obtaining the application running log may include:
S501, an application program of the mobile phone runs a preset function.
Optionally, the application program of the mobile phone may passively run the preset function in response to an operation of the user, or may actively run the preset function according to a setting of the system or the user.
S502, in the process of running the preset function, the application program sends program running data to the second log generating module.
S503, the second log generating module generates a second log file according to the data of the program operation.
The second log file includes an application running log, and the application running log includes part or all of the program running data. For example, the application running log may include information such as a running identifier of the preset function, the time at which the preset function is run (hereinafter referred to as the running time of the preset function), and related data generated by running the function. Take a clock application as an example: the application includes an alarm clock function, a world clock function, a stopwatch function, a timer function, and the like. Taking the alarm clock function as an example, the application running log may include an alarm clock function running identifier, the running time of the alarm clock, and the like. The alarm clock function running identifier can be represented by a key statement. For example, the key statement "IHwDeskClock: AlarmReceiver info: onReceive: action=ALARM_ALERT_ACTION" may be included in the second log file, which indicates that the mobile phone has run the alarm clock function. In addition, the second log file may also include time information corresponding to the key statement, representing the time at which the mobile phone ran the alarm clock (an illustrative sketch of extracting this time is given after step S506 below).
S504, the second log generating module sends the second log file to the communication module.
S505, the communication module sends the second log file to the server.
The method for transmitting the second log file by the communication module is similar to that of the first log file, and will not be described herein.
S506, the server receives the second log file sent by the communication module.
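As noted for step S503, the running time of a preset function can be recovered from the second log file by locating the key statement and reading the time information on the same log line. The following sketch assumes the running log has been read into memory as text lines and that each line begins with a timestamp in a fixed format; both the timestamp format and the function name are illustrative assumptions.

```python
# Illustrative sketch: extract the running time of a preset function from the second
# log file by searching for a key statement. The timestamp format at the start of
# each log line is an assumption.
from datetime import datetime
from typing import List

def find_function_run_times(log_lines: List[str], key_statement: str) -> List[datetime]:
    """Return the times at which the preset function was run, one per matching log line."""
    run_times: List[datetime] = []
    for line in log_lines:
        if key_statement not in line:
            continue
        # Assume each log line begins with a timestamp such as "2023-01-01 07:00:00.123".
        stamp = " ".join(line.split()[:2])
        try:
            run_times.append(datetime.strptime(stamp, "%Y-%m-%d %H:%M:%S.%f"))
        except ValueError:
            continue  # line does not carry the assumed timestamp format
    return run_times
```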
Similarly, during the running of the software system of the mobile phone, a system running log may also be generated with reference to the process in fig. 5 to form a third log file, and the third log file is uploaded to the server, which is not described herein again.
After receiving the first log file, the second log file and the third log file, the server can detect the freeze screen fault according to touch data in the first log file, an application running log in the second log file, a system running log in the third log file and the like.
The following embodiments explain, based on the structures and flows shown in fig. 1 to fig. 5 and with reference to the drawings and application scenarios, the specific process in which the server detects the freeze screen fault according to the touch data in the first log file, the application running log in the second log file, and the system running log in the third log file.
Fig. 6 is a flowchart of an exemplary method for detecting a freeze screen fault according to an embodiment of the present application, where the method includes:
S601, acquiring a running log and touch data of a terminal device; the touch data is data generated by the terminal device according to a touch operation of the user.
Alternatively, the running log may be the application running log in the second log file, or may be the system running log in the third log file, or the like.
Optionally, the touch data may be data in the first log file. As described above, the touch data may include data of press events, data of lift events, and data of movement events among the touch events. Optionally, the data of the press event in each touch event may include the report point coordinates of the press event (including the abscissa and the ordinate, i.e., the X coordinate and the Y coordinate), the report point time of the press event, an event identifier, and the like. The event identifier may also be referred to as an event number, a tracking ID, or the like, and is used to characterize the unique identity of the touch event. One touch event corresponds to one event identifier, and the press event and the lift event in one touch event correspond to the same event identifier. In other words, one touch event corresponds to a pair of press/lift events, and one press event and one lift event having the same event identifier form a pair of press/lift events.
The data of the lift event in each touch event may include the report point coordinates of the lift event (including the abscissa and the ordinate, i.e., the X coordinate and the Y coordinate), the report point time of the lift event, the event identifier, and the like.
The data of each movement event in each touch event may include the report point coordinates of the movement event (including the abscissa and the ordinate, i.e., the X coordinate and the Y coordinate), the report point time of the movement event, the event identifier, and the like.
The event identifier of the press event, the event identifier of the lift event, and the event identifiers of the movement events in the same touch event are the same.
Optionally, the data of the press event, the data of the lift event, and the data of the movement event may further include other data such as a report point rate, which is not limited in any way in the embodiments of the present application.
Fig. 7 is a schematic diagram of data of an example press event according to an embodiment of the present application. As shown in fig. 7, "BTN_TOUCH DOWN" indicated at 704 characterizes the group of data as data of a press event. The data of the press event includes the report point coordinates of the press event: an X coordinate 701 and a Y coordinate 702. The data of the press event also includes an event identifier 703, a report point time 705 of the press event, a report point rate 706, and the like.
Fig. 8 is a schematic diagram of data of an example lift event according to an embodiment of the present application. As shown in fig. 8, "BTN_TOUCH UP" shown at 804 characterizes the group of data as data of a lift event. The data of the lift event includes the report point coordinates of the lift event: an X coordinate 801 and a Y coordinate 802. The data of the lift event further includes an event identifier 803, a report point time 805 of the lift event, a report point rate 806, and the like.
Fig. 9 is a schematic diagram of data of an example movement event according to an embodiment of the present application. As shown in fig. 9, the data of the movement event includes the report point coordinates of the movement event: an X coordinate 901 and a Y coordinate 902. The data of the movement event also includes an event identifier 903, a report point time 905 of the movement event, a report point rate 906, and the like.
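Based on the fields described for fig. 7 to fig. 9, a line of touch data can be classified by its identification field, and its report point coordinates, report point time, and event identifier can be extracted. The sketch below is illustrative only: the regular expressions assume field markers such as "X=", "Y=", "ID=", and a bracketed report point time, which are not the actual layout of the first log file.

```python
# Illustrative parser for touch data lines; the exact line layout in the first log
# file is an assumption based on the fields described for fig. 7 to fig. 9.
import re
from typing import Dict, Optional

PRESS_FIELD = "BTN_TOUCH DOWN"  # identification field of a press event
LIFT_FIELD = "BTN_TOUCH UP"     # identification field of a lift event

def parse_touch_line(line: str) -> Optional[Dict]:
    """Extract event type, event identifier, report point coordinates and time from one line."""
    if PRESS_FIELD in line:
        kind = "down"
    elif LIFT_FIELD in line:
        kind = "up"
    else:
        kind = "move"
    # Assumed field patterns; a real log may name these fields differently.
    coords = re.search(r"X=(\d+)\s+Y=(\d+)", line)
    event_id = re.search(r"ID=(\d+)", line)
    stamp = re.search(r"\[(\d+\.\d+)\]", line)
    if not (coords and event_id and stamp):
        return None
    return {
        "type": kind,
        "event_id": int(event_id.group(1)),
        "x": int(coords.group(1)),
        "y": int(coords.group(2)),
        "report_time": float(stamp.group(1)),
    }
```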
S602, determining whether preset information is included in the operation log; the preset information is used for indicating that the terminal equipment runs the preset function.
Alternatively, the preset information may be, for example, a preset key statement, a preset identifier, or the like in the running log. The server may search the running log for a preset keyword, a preset key statement, a preset identifier, or the like.
The preset information characterizes that the terminal device has run a preset function. Optionally, the preset function may be a function that the terminal device runs while the display screen is on and after which the user typically performs a further touch operation. In one embodiment, the preset function may include at least one of an unlocking function, a reminding function, and the like. It can be understood that the user performs an unlocking operation on the terminal device, and after the terminal device runs the unlocking function, the user generally further performs a touch operation. Similarly, after the terminal device runs the reminding function, the user generally further performs a touch operation to close the reminder, respond to the reminder, or the like. Therefore, if the terminal device has no touch event data after running a preset function such as the unlocking function and/or the reminding function, a frozen screen fault has most likely occurred. In this embodiment, after it is determined that the terminal device has run the unlocking function and/or the reminding function, whether data of a touch event exists is determined, so as to determine whether a frozen screen fault has occurred on the terminal device.
Alternatively, the preset function may be a function that the terminal device operates in response to a user operation or in response to communication information of other devices, that is, the preset function is a function that the terminal device passively operates. For example, an unlocking function or an incoming call alert function, etc.
Optionally, the preset function may also be a function of active operation of the terminal device. That is, the preset function is a function that the terminal device operates at a certain time according to a preset program. For example, an alarm clock alert function.
S603, if the operation log comprises preset information, determining the operation time of a preset function.
It can be understood that the running time of the preset function is the time when the terminal device runs the preset function. Optionally, the time corresponding to the preset information can be searched in the operation log, so as to obtain the operation time of the preset function.
S604, determining whether the touch data comprises data in a preset time period after the running time of the preset function.
That is, it is determined whether the data of the touch event is generated within a preset time period after the terminal device operates the preset function.
Optionally, the touch data may be searched to determine whether there is a touch event whose report point time falls within the preset duration after the running time of the preset function. Specifically, the report point time, a keyword, a key statement, an identifier, or the like may be searched for in the touch data to determine whether a touch event exists within the preset duration after the running time of the preset function. For example, it is searched whether the identification field of a press event (BTN_TOUCH DOWN) and/or the identification field of a lift event (BTN_TOUCH UP) exists in the data whose report point time falls within the preset duration after the running time of the preset function.
The preset time length can be set according to actual requirements. Specifically, the value of the preset duration can be set according to the preset function and in combination with the application scene of the user.
S605, if the touch data does not include the data in the preset time period after the operation time of the preset function, determining that the terminal equipment has a frozen screen fault.
As described above, after the terminal device runs the preset function, the user generally performs a related touch operation. If the terminal device does not generate any data of a touch event within the preset duration, it is determined that the terminal device has a frozen screen fault.
Optionally, after determining that the terminal device has a frozen screen fault, relevant data of the touch panel, for example, an operation log on a touch chip (TPIC) side, before and/or after the operation time of the preset function, may be acquired, so as to analyze the cause of the frozen screen fault.
According to the frozen screen fault detection method described above, the running log and the touch data of the terminal device are acquired, and the frozen screen fault is detected and identified according to the running log and the touch data, which facilitates later analysis of the specific cause of the frozen screen fault, helps improve the performance of the display screen, and improves the user experience. The method provided by this embodiment requires no manual intervention, can intelligently detect the frozen screen fault, and improves detection efficiency. In addition, it is determined whether the running log includes the preset information; if the running log includes the preset information, it is determined whether the touch data includes data within the preset duration after the running time of the preset function, so as to determine whether the terminal device has a frozen screen fault. That is, it is determined whether the terminal device generates data of a touch event within the preset duration after the preset function is run. This detection method matches the scenario in which a frozen screen actually occurs while the user is using the device, so the frozen screen fault can be accurately identified, and the detection accuracy of the frozen screen fault is improved.
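A minimal sketch of steps S601 to S605 is given below, assuming the running log and the report point times of the touch events have already been extracted from the log files (for example, with the parsing sketches above). All function and parameter names are illustrative assumptions.

```python
# A minimal sketch of steps S601 to S605; all names are illustrative assumptions.
from typing import List, Optional

def detect_frozen_screen(run_log: List[str],
                         touch_report_times: List[float],
                         key_statement: str,
                         run_time_of_function: Optional[float],
                         preset_duration_s: float) -> bool:
    """Return True when a frozen screen fault is detected."""
    # S602: determine whether the running log includes the preset information.
    if not any(key_statement in line for line in run_log):
        return False
    # S603: the running time of the preset function is assumed to have been extracted
    # from the matching log line (see the earlier key-statement sketch).
    if run_time_of_function is None:
        return False
    # S604: determine whether any touch event has a report point time within the
    # preset duration after the running time of the preset function.
    window_end = run_time_of_function + preset_duration_s
    has_touch = any(run_time_of_function <= t <= window_end for t in touch_report_times)
    # S605: no touch data within the window means a frozen screen fault.
    return not has_touch
```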
In the following embodiments, the process of detecting the frozen screen fault is described in detail by taking the unlocking function and the reminding function as the preset function, respectively.
Fig. 10 is a schematic diagram of an application scenario in which a frozen screen fault occurs after a mobile phone is unlocked according to an embodiment of the present application. As shown in fig. 10, when the mobile phone is locked, if the user needs to operate the mobile phone, the user first unlocks the mobile phone. Optionally, the user may perform the unlocking operation through fingerprint unlocking, face unlocking, or the like. After unlocking, the user further performs a touch operation. However, when a frozen screen fault occurs after unlocking, the mobile phone picture is frozen and the user cannot perform any operation; for example, after face unlocking, the user cannot perform the slide-up operation and thus cannot enter the mobile phone interface, as shown in fig. 11.
For this application scenario, on the basis of the foregoing embodiment, the frozen screen fault detection method provided by this embodiment can detect a frozen screen fault that occurs after unlocking. In this embodiment, the preset information includes unlocking information and the preset function includes an unlocking function, where the unlocking information is used to characterize that the terminal device has run the unlocking function. The preset duration includes a first preset duration. In this embodiment, if the running log includes the unlocking information and the touch data does not include data within the first preset duration after the running time of the unlocking function, it is determined that the terminal device has a frozen screen fault.
Alternatively, the first preset duration may be 10s to 15s.
Alternatively, the unlocking function may be a biometric unlocking function. For example, the unlocking function may include at least one of a fingerprint unlocking function, a face unlocking function, a lip movement unlocking function, an iris unlocking function, a voiceprint unlocking function, and the like.
Taking fingerprint unlocking as the unlocking function and 10s as the first preset duration as an example, in a specific embodiment, the server may identify, in the running log of the terminal device, the key statement "fpc_finger_hal: identification: result 2". The key statement is used to characterize that fingerprint unlocking of the terminal device succeeded. If the server recognizes the key statement, the running time of the fingerprint unlocking function (denoted as time A) is determined. The server determines whether the touch data contains touch event data whose report point time falls within 10s after time A, that is, whether there is touch event data whose report point time is between time A and time A+10s. Optionally, the server may identify, in the touch data, the identification field "BTN_TOUCH DOWN" of a press event whose report point time is between time A and time A+10s. If the identification field "BTN_TOUCH DOWN" of a press event whose report point time is between time A and time A+10s is not identified, the frozen screen fault is reported. A press event is an event generated by the terminal device according to a press operation performed by the user, and the identification field of the press event can uniquely identify the touch event; therefore, by identifying the identification field of the press event in the touch data, the data of the touch event can be identified more accurately, whether a frozen screen fault has occurred can be determined more accurately, and the detection accuracy of the frozen screen fault is improved.
Taking face unlocking as the unlocking function and 10s as the first preset duration as an example, in another specific embodiment, the server may identify, in the running log of the terminal device, the key statement "FaceDTSvc callback auth code:1 and error code:0". The key statement is used to characterize that face unlocking of the terminal device succeeded. If the key statement is identified, the running time of the face unlocking function (denoted as time B) is determined. The server determines whether the touch data contains touch event data whose report point time falls within 10s after time B, that is, whether there is touch event data whose report point time is between time B and time B+10s. Optionally, the server may identify, in the touch data, the identification field "BTN_TOUCH DOWN" of a press event whose report point time is between time B and time B+10s. If the identification field "BTN_TOUCH DOWN" of a press event whose report point time is between time B and time B+10s is not identified, the frozen screen fault is reported.
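The two unlocking examples above can be summarized in a single illustrative check. Only the key statements and the 10s first preset duration are taken from the examples above; the function and variable names, and the assumption that the report point times and the unlocking time share one time base, are illustrative.

```python
# Usage sketch for the unlocking scenario; only the key statements and the 10 s window
# come from the examples above, everything else is an illustrative assumption.
from typing import List

UNLOCK_KEY_STATEMENTS = [
    "fpc_finger_hal: identification: result 2",          # fingerprint unlock succeeded
    "FaceDTSvc callback auth code:1 and error code:0",   # face unlock succeeded
]
FIRST_PRESET_DURATION_S = 10.0  # first preset duration

def frozen_after_unlock(run_log: List[str],
                        touch_report_times: List[float],
                        unlock_time: float) -> bool:
    """Report a frozen screen fault if no press event follows a successful unlock within 10 s."""
    if not any(key in line for line in run_log for key in UNLOCK_KEY_STATEMENTS):
        return False
    window_end = unlock_time + FIRST_PRESET_DURATION_S  # time A to time A + 10 s
    return not any(unlock_time <= t <= window_end for t in touch_report_times)
```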
In this embodiment, the preset information includes unlocking information, where the unlocking information is used to characterize that the terminal device has run the unlocking function. If the running log includes the unlocking information and the touch data does not include data within the first preset duration after the running time of the unlocking function, it is determined that the terminal device has a frozen screen fault, and the frozen screen fault is reported. The frozen screen fault detection method provided by this embodiment matches the actual application scenario in which the user cannot perform any touch operation because a frozen screen appears after unlocking, so the frozen screen fault occurring after unlocking can be accurately detected.
Fig. 12 is a schematic diagram of an application scenario in which a frozen screen fault occurs when the alarm clock of a mobile phone sounds according to an embodiment of the present application. As shown in fig. 12, the user sets an alarm clock reminder for 07:00. At 07:00, the mobile phone alarm clock sounds. Under normal conditions, the user performs a slide operation on the display screen to turn off the alarm clock; or the user taps "remind after 10 minutes" on the display screen to set another alarm clock reminder 10 minutes later. However, if a frozen screen fault occurs on the mobile phone when the alarm clock sounds, the mobile phone picture is frozen, and the user cannot perform related operations such as turning off the reminder or setting a reminder 10 minutes later.
For this application scenario, on the basis of the foregoing embodiment, the frozen screen fault detection method provided by this embodiment can detect a frozen screen fault that occurs while the reminding function is running. In this embodiment, the preset information includes reminding information and the preset function includes a reminding function, where the reminding information is used to characterize that the terminal device has run the reminding function. The preset duration includes a second preset duration. In this embodiment, if the running log includes the reminding information and the touch data does not include data within the second preset duration after the running time of the reminding function, it is determined that the terminal device has a frozen screen fault.
Alternatively, the second preset time period may be 60s to 65s.
Optionally, the reminding function may be a ring reminder, a vibration reminder, or a ring-plus-vibration reminder.
Optionally, the reminding function may include at least one of an alarm clock reminding function, a timer reminding function, a schedule reminding function, an incoming call reminding function, and the like.
Alternatively, the timed reminder function may comprise at least one of a timed reminder function in an application "timer", a timed reminder function in an application "stopwatch", and a timed reminder function in other applications.
Alternatively, the calendar reminder function may include at least one of a calendar reminder function in an application "calendar" and a calendar reminder function in another application.
Optionally, the incoming call reminding function may include at least one of voice incoming call reminding, video incoming call reminding, and the like. The voice call reminder may include at least one of a voice call reminder in an application "phone", a voice call reminder in an application "WeChat", a voice call reminder in an application "QQ", and a voice call reminder in other communication programs and the like. Similarly, the video call reminder may include at least one of a video call reminder in an application "phone", a video call reminder in an application "WeChat", a video call reminder in an application "QQ", and a video call reminder in other communication programs, etc.
Taking the alarm clock reminding function as the reminding function and 60s as the second preset duration as an example, in a specific embodiment, the server may identify, in the running log of the terminal device, the key statement "I HwDeskClock: AlarmReceiver info: onReceive: action=ALARM_ALERT_ACTION" or "I HwDeskClock: AlarmReceiver info: onReceive->handleInteraction=com.android.deskclock". The key statement is used to characterize that the terminal device has run the alarm clock reminding function. If the key statement is identified, the running time of the alarm clock reminding function (denoted as time C) is determined. The server determines whether the touch data contains touch event data whose report point time falls within 60s after time C, that is, whether there is touch event data whose report point time is between time C and time C+60s. Optionally, the server may identify, in the touch data, the identification field "BTN_TOUCH DOWN" of a press event whose report point time is between time C and time C+60s. If the identification field "BTN_TOUCH DOWN" of a press event whose report point time is between time C and time C+60s is not identified, the frozen screen fault is reported.
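The alarm clock example can be sketched in the same way, using the 60s second preset duration. Only the partial key statement and the window length come from the example above; the remaining names are illustrative assumptions.

```python
# Usage sketch for the alarm clock reminding scenario; the key statement fragment and
# the 60 s window follow the example above, and the remaining names are assumptions.
from typing import List

ALARM_KEY_STATEMENT = "action=ALARM_ALERT_ACTION"  # part of the key statement quoted above
SECOND_PRESET_DURATION_S = 60.0                    # second preset duration

def frozen_during_alarm(run_log: List[str],
                        touch_report_times: List[float],
                        alarm_time: float) -> bool:
    """Report a frozen screen fault if no press event occurs within 60 s after the alarm rings."""
    if not any(ALARM_KEY_STATEMENT in line for line in run_log):
        return False
    window_end = alarm_time + SECOND_PRESET_DURATION_S  # time C to time C + 60 s
    return not any(alarm_time <= t <= window_end for t in touch_report_times)
```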
In this embodiment, the preset information includes reminding information, where the reminding information is used to characterize that the terminal device has run a reminding function. If the running log includes the reminding information and the touch data does not include data within the second preset duration after the running time of the reminding function, it is determined that the terminal device has a frozen screen fault, and the frozen screen fault is reported. The frozen screen fault detection method provided by this embodiment matches the actual application scenario in which the user cannot perform a related operation (such as turning off a reminder or answering an incoming call) because a frozen screen appears while the terminal device runs the reminding function, so the frozen screen fault occurring when the reminding function is run can be accurately detected.
It should be noted that the unlocking information and the reminding information are only some examples of the preset information, and in other embodiments, the preset information may further include other information characterizing other preset functions, which is not limited in any way in the present application. In addition, the preset information may include one or more of various kinds of unlocking information, various kinds of reminding information, and the like. When the preset information includes a plurality of pieces of information, the server performs steps S602 to S605 for each piece of information, so as to detect the frozen screen fault in various scenarios.
Examples of the freeze screen fault detection method provided by the embodiment of the application are described in detail above. It will be appreciated that the electronic device, in order to achieve the above-described functions, includes corresponding hardware and/or software modules that perform the respective functions. Those of skill in the art will readily appreciate that the elements and algorithm steps of the examples described in connection with the embodiments disclosed herein may be implemented as hardware or combinations of hardware and computer software. Whether a function is implemented as hardware or computer software driven hardware depends upon the particular application and design constraints imposed on the solution. Those skilled in the art may implement the described functionality using different approaches for each particular application in conjunction with the embodiments, but such implementation is not to be considered as outside the scope of this application.
Fig. 13 is a schematic structural diagram of a freeze-screen fault detection device according to an embodiment of the present application. As shown in fig. 13, the freeze screen fault detection device provided in this embodiment may include:
an acquiring module 1301, configured to acquire a running log and touch data of an electronic device; the touch data is data generated by the electronic device according to a touch operation of the user;
the detection module 1302 is configured to determine that a frozen screen fault occurs on the electronic device if the running log includes preset information and the touch data does not include data within a preset duration after a running time of a preset function; the preset information is used for indicating that the electronic device runs the preset function.
In one embodiment, the preset information includes unlocking information, the unlocking information is used for characterizing that the electronic device has run an unlocking function, and the preset duration includes a first preset duration; the detection module 1302 is specifically configured to:
if the operation log comprises unlocking information and the touch control data does not comprise data within a first preset duration after the operation time of the unlocking function, determining that the electronic equipment has a frozen screen fault.
In one embodiment, the first preset time period is 10s to 15s.
In one embodiment, the unlocking function is a biometric unlocking function.
In one embodiment, the unlocking function includes at least one of a fingerprint unlocking function, a face unlocking function, a lip movement unlocking function, an iris unlocking function, and a voiceprint unlocking function.
In one embodiment, the preset information includes reminding information, where the reminding information is used to characterize that the electronic device has run a reminding function, and the preset duration includes a second preset duration; the detection module 1302 is specifically configured to:
if the operation log comprises reminding information and the touch data does not comprise data within a second preset time period after the operation time of the reminding function, determining that the electronic equipment has a frozen screen fault.
In one embodiment, the alert function includes at least one of an alarm alert function, a timer alert function, a calendar alert function, and an incoming call alert function.
In one embodiment, the incoming call alert function includes at least one of a voice incoming call alert function and a video incoming call alert function.
In one embodiment, the second preset time period is 60s to 65s.
In one embodiment, the detection module 1302 is specifically configured to: determine whether the running log includes preset information; if the running log includes the preset information, determine the running time of the preset function; determine whether the touch data includes data within the preset duration after the running time of the preset function; and if the touch data does not include data within the preset duration after the running time of the preset function, determine that the electronic device has a frozen screen fault.
In one embodiment, the detection module 1302 is specifically configured to: determine whether the touch data includes data of a press event within the preset duration after the running time of the preset function; the data of the press event is data generated by the electronic device according to a press operation of the user.
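For illustration, the division into an acquiring module 1301 and a detection module 1302 described above may be sketched as a small class whose data sources are injected as callables. The class, callable, and parameter names are assumptions and do not represent the actual implementation of the device.

```python
# Illustrative sketch of the apparatus in fig. 13; all names are assumptions.
from typing import Callable, List

class FrozenScreenFaultDetector:
    def __init__(self,
                 acquire_running_log: Callable[[], List[str]],
                 acquire_touch_times: Callable[[], List[float]]):
        # Acquiring module 1301: obtains the running log and the touch data.
        self.acquire_running_log = acquire_running_log
        self.acquire_touch_times = acquire_touch_times

    def detect(self, key_statement: str, run_time: float, preset_duration_s: float) -> bool:
        # Detection module 1302: a frozen screen fault is reported when the running log
        # contains the preset information and no touch data falls within the preset
        # duration after the running time of the preset function.
        run_log = self.acquire_running_log()
        touch_times = self.acquire_touch_times()
        if not any(key_statement in line for line in run_log):
            return False
        window_end = run_time + preset_duration_s
        return not any(run_time <= t <= window_end for t in touch_times)
```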
The frozen screen fault detection device provided in this embodiment is configured to execute the frozen screen fault detection method, and the technical principle and the technical effect are similar and are not described herein again.
The embodiments of the present application further provide an electronic device. The electronic device may be the terminal device or the server in the above embodiments.
In the embodiments of the present application, the functional modules of the electronic device may be divided according to the above method examples. For example, each function may be divided into a corresponding functional module, such as a detection unit, a processing unit, or a display unit, or two or more functions may be integrated into one module. The integrated module may be implemented in the form of hardware or in the form of a software functional module. It should be noted that the division of the modules in the embodiments of the present application is schematic and is merely a logical function division; other division manners may be used in actual implementation.
When an integrated unit is employed, the electronic device may further include a processing module, a storage module, and a communication module. The processing module may be used to control and manage the actions of the electronic device. The storage module may be used to store program code, data, and the like for the electronic device. The communication module may be used to support communication between the electronic device and other devices.
The processing module may be a processor or a controller, and may implement or perform the various exemplary logic blocks, modules, and circuits described in connection with this disclosure. The processor may also be a combination that performs computing functions, for example, a combination of one or more microprocessors, or a combination of a digital signal processor (DSP) and a microprocessor. The storage module may be a memory. The communication module may be a radio frequency circuit, a Bluetooth chip, a Wi-Fi chip, or another device that interacts with other electronic devices.
In one embodiment, when the processing module is a processor and the storage module is a memory, the electronic device in this embodiment may be a device having the structure shown in fig. 1. The processor, the memory, and the interface of the electronic device cooperate with each other, so that the electronic device executes the frozen screen fault detection method of any of the above embodiments.
The embodiments of the present application further provide a computer-readable storage medium in which a computer program is stored; when the computer program is executed by a processor, the processor is caused to execute the frozen screen fault detection method of any of the above embodiments.
The embodiments of the present application further provide a computer program product which, when run on a computer, causes the computer to execute the above-mentioned related steps, so as to implement the frozen screen fault detection method in the above embodiments.
In addition, the embodiments of the present application further provide an apparatus, which may specifically be a chip, a component, or a module and may include a processor and a memory connected to each other. The memory is used to store computer-executable instructions, and when the apparatus runs, the processor may execute the computer-executable instructions stored in the memory, so that the chip executes the frozen screen fault detection method in the above method embodiments.
The electronic device, the computer-readable storage medium, the computer program product, and the chip provided in the embodiments are each used to execute the corresponding method provided above; therefore, for the beneficial effects achievable thereby, reference may be made to the beneficial effects of the corresponding method provided above, and details are not described herein again.
It will be appreciated by those skilled in the art that, for convenience and brevity of description, only the above division of the functional modules is illustrated by way of example. In practical application, the above functions may be allocated to different functional modules as needed, that is, the internal structure of the apparatus may be divided into different functional modules to perform all or part of the functions described above.
In the several embodiments provided in this application, it should be understood that the disclosed apparatus and method may be implemented in other ways. For example, the apparatus embodiments described above are merely illustrative, e.g., the division of modules or units is merely a logical function division, and there may be additional divisions when actually implemented, e.g., multiple units or components may be combined or integrated into another apparatus, or some features may be omitted or not performed. Alternatively, the coupling or direct coupling or communication connection shown or discussed with each other may be an indirect coupling or communication connection via some interfaces, devices or units, which may be in electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and the parts shown as units may be one physical unit or a plurality of physical units, may be located in one place, or may be distributed in a plurality of different places. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, each functional unit in each embodiment of the present application may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit. The integrated units may be implemented in hardware or in software functional units.
The integrated units, if implemented in the form of software functional units and sold or used as stand-alone products, may be stored in a readable storage medium. Based on such an understanding, the technical solutions of the embodiments of the present application, in essence, or the part contributing to the prior art, or all or part of the technical solutions, may be embodied in the form of a software product stored in a storage medium, which includes several instructions to cause a device (which may be a single-chip microcomputer, a chip, or the like) or a processor to perform all or part of the steps of the methods of the embodiments of the present application. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disc.
The foregoing is merely specific embodiments of the present application, but the protection scope of the present application is not limited thereto. Any person skilled in the art can readily conceive of changes or substitutions within the technical scope disclosed in the present application, and such changes and substitutions shall fall within the protection scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.