
View generation method/visual control method, system, medium and terminal

Info

Publication number
CN111339222A
Authority
CN
China
Prior art keywords
icon
view
intelligent household
household equipment
specified position
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201811549258.1A
Other languages
Chinese (zh)
Inventor
时红仁
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shanghai Qinggan Intelligent Technology Co Ltd
Original Assignee
Shanghai Qinggan Intelligent Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.): 2018-12-18
Filing date: 2018-12-18
Publication date: 2020-06-26
Application filed by Shanghai Qinggan Intelligent Technology Co Ltd
Priority to CN201811549258.1A
Publication of CN111339222A
Legal status: Pending

Abstract

The invention provides a view generation method, a visual control method, a system, a medium and a terminal. The view generation method comprises the following steps: searching for at least one designated location in an electronic map, at least one smart home device being installed at the designated location; configuring, for each smart home device, a first icon matching the device and a second icon matching its working state; and adding the first icon and the second icon to the electronic map at the designated location in a floating manner to generate a home map view of the designated location. With only simple operations, the method, system, medium and terminal allow the smart home to be visually controlled on the basis of a map and give users convenient, multi-region, multi-type visual map management.

Description

View generation method/visual control method, system, medium and terminal
Technical Field
The invention belongs to the technical field of smart home control, relates to a generation/control method and system, and particularly relates to a view generation method, a visual control method, a system, a medium and a terminal.
Background
A smart home (home automation) takes the home as a platform and integrates facilities related to home life by means of integrated wiring, network communication, security, automatic control and audio/video technologies, building an efficient management system for home facilities and household affairs that improves safety, convenience, comfort and artistry and realizes an environment-friendly, energy-saving living environment.
In actual use of a smart home system, the user wants to know the overall operation and deployment of the smart home in real time so that responses can be made conveniently and in a timely manner. However, ordinary prompt messages can only list the problems encountered during system operation one by one and cannot give the user a visual experience. They are also confined to a server or a terminal and cannot comprehensively show the running state of the system.
Therefore, how to provide a view generation method, a visual control method, a system, a medium and a terminal that overcome the drawback that existing smart home monitoring cannot be visualized and thus gives the user no intuitive experience has become a technical problem to be solved urgently in the field.
Disclosure of Invention
In view of the above drawbacks of the prior art, an object of the present invention is to provide a view generation method, a visual control method, a system, a medium and a terminal, which are used to solve the problem that the existing smart home monitoring mode cannot be visualized and therefore gives the user no intuitive experience.
To achieve the above and other related objects, one aspect of the present invention provides a view generation method, including: searching for at least one designated location in an electronic map, at least one smart home device being installed at the designated location; configuring, for each smart home device, a first icon matching the device and a second icon matching its working state; and adding the first icon and the second icon to the electronic map at the designated location in a floating manner to generate a home map view of the designated location.
In an embodiment of the present invention, the working state of the smart home device includes an on state or an off state of the smart home device; the view generation method further includes: lighting the second icon when the working state of the smart home device is the on state and the device is located at the designated location; or darkening the second icon when the working state of the smart home device is the off state and the device is located at the designated location.
In an embodiment of the present invention, the view generation method further includes marking at least one designated location on the electronic map; the marked designated location is used to receive a touch action for the home map view of that location, the touch action corresponding to an instruction to switch to the home map view of the designated location.
The invention also provides a view visualization control method based on the view generation method, including: judging whether a query mode for a designated location is currently entered; if so, calling the home map view corresponding to the currently queried designated location and displaying it; if not, returning to the judging step. A first icon matching each smart home device and a second icon matching its working state are displayed on the home map view of the designated location. The method further includes monitoring a touch action on the first icon or the second icon and sending a control instruction corresponding to the touch action to the smart home device.
In an embodiment of the present invention, the control instruction corresponding to a touch action on the first icon is an instruction to display the current working attribute of the smart home device, and the control instruction corresponding to a touch action on the second icon is an instruction to turn the smart home device on or off.
In an embodiment of the present invention, the view visualization control method further includes: after a control instruction corresponding to a touch action on the first icon is monitored and sent to the smart home device, receiving and displaying the current working attribute fed back by the smart home device; or after a control instruction corresponding to a touch action on the second icon is monitored and sent to the smart home device, replacing the lit second icon with a dark second icon or replacing the dark second icon with a lit second icon.
In another aspect, the present invention further provides a view generation system, including: a search module for searching for at least one designated location in the electronic map, at least one smart home device being installed at the designated location; a configuration module for configuring, for each smart home device, a first icon matching the device and a second icon matching its working state; and a view generation module for adding the first icon and the second icon to the electronic map at the designated location in a floating manner to generate a home map view of the designated location.
In another aspect, the present invention provides a view visualization control system based on the view generation system, including: a judging module for judging whether a query mode for a designated location is currently entered; if so, a calling module is started to call the home map view corresponding to the currently queried designated location, and a display module displays the home map view of the designated location; if not, the judging module continues judging. A first icon matching each smart home device and a second icon matching its working state are displayed on the home map view of the designated location. The system further includes a monitoring module for monitoring a touch action on the first icon or the second icon, so as to send a control instruction corresponding to the touch action to the smart home device through a communication module.
A further aspect of the present invention provides a medium on which a computer program is stored; when executed by a processor, the program implements the view generation method or the view visualization control method based on the view generation method.
A final aspect of the present invention provides a terminal, including a processor and a memory; the memory is used to store a computer program, and the processor is used to execute the computer program stored in the memory, so that the terminal performs the view generation method and/or the view visualization control method based on the view generation method according to an embodiment of the invention.
As described above, the view generation method, visual control method, system, medium and terminal of the present invention have the following advantages:
with only simple operations, they allow the smart home to be visually controlled on the basis of a map and give users convenient, multi-region, multi-type visual map management.
Drawings
Fig. 1 is a schematic diagram of the internet of things to which the present invention is applied.
Fig. 2 is a flowchart illustrating a method for generating a view according to an embodiment of the present invention.
Fig. 3 is a flowchart illustrating a view visualization control method according to an embodiment of the present invention.
Fig. 4A is a schematic structural diagram of a view generation system according to an embodiment of the present invention.
Fig. 4B is a schematic structural diagram of a view visualization control system based on the view generation system according to an embodiment of the present invention.
Description of the element reference numerals
1 internet of things
11 smart home device
12 mobile terminal
41 view generation system
411 search module
412 labeling module
413 configuration module
414 operation module
415 view generation module
42 view visualization control system based on the view generation system
421 judging module
422 calling module
423 display module
424 monitoring module
425 communication module
426 control module
S21-S25 Steps of the view generation method
S31-S34 Steps of the view visualization control method
Detailed Description
The embodiments of the present invention are described below with reference to specific embodiments, and other advantages and effects of the present invention will be easily understood by those skilled in the art from the disclosure of the present specification. The invention is capable of other and different embodiments and of being practiced or of being carried out in various ways, and its several details are capable of modification in various respects, all without departing from the spirit and scope of the present invention. It is to be noted that the features in the following embodiments and examples may be combined with each other without conflict.
It should be noted that the drawings provided in the following embodiments are only for illustrating the basic idea of the present invention, and the components related to the present invention are only shown in the drawings rather than drawn according to the number, shape and size of the components in actual implementation, and the type, quantity and proportion of the components in actual implementation may be changed freely, and the layout of the components may be more complicated.
Example one
This embodiment provides a view generation method, which includes:
searching for at least one designated location in an electronic map, at least one smart home device being installed at the designated location;
configuring, for each smart home device, a first icon matching the device and a second icon matching its working state;
and adding the first icon and the second icon to the electronic map at the designated location in a floating manner to generate a home map view of the designated location.
The view generation method provided by this embodiment will be described in detail below with reference to the drawings. The method described in this embodiment is applied to the internet of things 1 shown in fig. 1. The internet of things 1 comprises at least one smart home device 11 and a mobile terminal 12 in communication connection with the smart home device 11. In this embodiment, the smart home devices 11 include smart audio/video devices, lighting systems, curtain control, air conditioning control, security systems, digital cinema systems, audio/video servers, video cabinet systems, network appliances, and so on. The mobile terminal 12 includes devices with a touch display screen, such as a smart phone or a tablet computer.
The description below takes a smart phone as an example. The smart phone runs, for example, an Android or iOS operating system, or an operating system such as Palm OS, Symbian, BlackBerry OS, or Windows Phone. The smart phone has a touch display screen.
The touch display screen provides both an output interface and an input interface between the device and the user. The touch display controller receives electrical signals from, and sends electrical signals to, the touch display screen. The touch display screen then presents visual output to the user. This visual output may include text, graphics, video, and any combination thereof. Some or all of the visual output may correspond to user interface objects, further details of which are described below.
The touch display screen also accepts user input based on haptic and/or tactile contact. The touch display screen forms a touch-sensitive surface that accepts user input. The touch display screen and the touch display controller (along with any associated modules and/or sets of instructions in memory) detect contact (and any movement or breaking of the contact) on the touch display screen and transform the detected contact into interaction with user interface objects, such as one or more soft keys, displayed on the touch display screen. In one exemplary embodiment, the point of contact between the touch display screen and the user corresponds to one or more fingers of the user. The touch display screen may use LCD (liquid crystal display) technology or LPD (light emitting polymer display) technology, although other display technologies may be used in other embodiments. The touch display screen and the touch display controller can detect contact and its movement or breaking using any of a variety of touch technologies, including but not limited to capacitive, resistive, infrared and surface acoustic wave technologies, as well as other proximity sensor arrays or other technologies for determining one or more points of contact with the touch display screen. The user may contact the touch display screen using any suitable object or accessory, such as a stylus or a finger.
The contact/motion module, in conjunction with the touch display controller, detects contact with the touch display screen. The contact/motion module includes various software components for performing operations associated with detecting contact, such as determining whether contact has occurred, determining whether the contact has moved, tracking movement across the touch display screen, and determining whether the contact has been broken (i.e., whether contact has ceased). Determining movement of the point of contact may include determining a speed (magnitude), a velocity (magnitude and direction), and/or an acceleration (magnitude and/or direction) of the point of contact. In some embodiments, the contact/motion module and the touch display controller also detect contact on a touch panel. The touch display screen of the smart phone can display touch keys, which may be virtual keys provided for application software loaded in the system, i.e., soft keys operated through a key interface or UI displayed on the touch display screen. The smart phone comprises a touch identification chip, a driving circuit for that chip, an I/O interface circuit for communicating with the microprocessor, LEDs for lighting a number of key icons, a number of icon display windows, and other components. The I/O interface circuitry couples the input and output peripherals of the device to the CPU and the memory. One or more processors execute the software programs and/or sets of instructions stored in memory to perform the functions of the device and process data. In some embodiments, the interface circuit, the CPU and the memory controller may be implemented on a single chip, while in other embodiments they may be implemented on multiple discrete chips.
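As a rough, non-authoritative illustration of this contact-to-object mapping, the Kotlin sketch below hit-tests a touch point against icon bounds; the Rect and IconRegion types and the deviceId field are assumptions introduced here for illustration and are not part of the original disclosure.

// Minimal hit-test sketch: map a touch point to the icon (and device) it lands on.
data class Rect(val left: Float, val top: Float, val right: Float, val bottom: Float) {
    fun contains(x: Float, y: Float) = x in left..right && y in top..bottom
}

data class IconRegion(val deviceId: String, val bounds: Rect)

// Returns the device whose icon contains the touch point, or null if the touch misses every icon.
fun hitTest(icons: List<IconRegion>, x: Float, y: Float): String? =
    icons.firstOrNull { it.bounds.contains(x, y) }?.deviceId

In a real implementation the hit-testing would be done by the UI toolkit; the sketch only shows the logical step of turning a contact point into a user-interface object.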
Please refer to fig. 2, which is a flowchart illustrating a method for generating a view according to an embodiment. As shown in fig. 2, the method for generating the view includes the following steps:
and S21, searching at least one designated position in the electronic map. In this embodiment, at least one smart home device is installed in the designated location.
For example, the designated locations are home and office. The intelligent household equipment such as an intelligent air conditioner, an intelligent floor sweeping robot and an intelligent television are installed in the house. The office is provided with intelligent household equipment such as an intelligent sweeping robot, an intelligent air conditioner, a security system and a lighting system.
S22: mark at least one designated location on the electronic map. The marked designated location is used to receive a touch action for the home map view of that location; the touch action corresponds to an instruction to switch to the home map view of the designated location.
For example, the home location is marked with a black circle and the office location with a black triangle. Any method that can mark a designated location can be applied to the present invention.
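For illustration only, a minimal Kotlin sketch of how marked locations and the view-switching instruction could be represented; the marker names and shapes are hypothetical placeholders, not values taken from the patent.

// A marked location and the action of switching to its home map view when it is tapped.
enum class MarkerShape { BLACK_CIRCLE, BLACK_TRIANGLE }

data class LocationMarker(val name: String, val shape: MarkerShape)

// Tapping a marked location corresponds to an instruction to switch to that location's home map view.
fun onMarkerTapped(marker: LocationMarker, switchToView: (String) -> Unit) = switchToView(marker.name)

val markers = listOf(
    LocationMarker("home", MarkerShape.BLACK_CIRCLE),
    LocationMarker("office", MarkerShape.BLACK_TRIANGLE),
)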
S23: configure, for each smart home device, a first icon matching the device and a second icon matching its working state.
In this embodiment, the working state of the smart home device includes an on state or an off state of the smart home device.
S24: when the working state of the smart home device is the on state and the device is located at the designated location, light the second icon; or
when the working state of the smart home device is the off state and the device is located at the designated location, darken the second icon.
In this embodiment, lighting the second icon includes setting its color to a bright color, for example red, yellow or light green, when the smart home device reports that its current working state is the on state and the device is located at the designated location.
In this embodiment, darkening the second icon includes setting its color to a dark color, such as gray, when the smart home device reports that its current working state is the off state and the device is located at the designated location.
S25: add the first icon and the second icon to the electronic map at the designated location in a floating manner to generate a home map view of the designated location.
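The Kotlin sketch below is one possible, simplified reading of steps S21-S25, assuming a hypothetical device model, icon naming scheme and color values that are not specified in the patent text.

enum class WorkingState { ON, OFF }

data class SmartDevice(val id: String, val type: String, val state: WorkingState)

data class IconPair(val firstIcon: String, val secondIconColor: String)

// S23/S24: the first icon matches the device type; the second icon's color reflects the working state.
fun configureIcons(device: SmartDevice): IconPair {
    val first = "icon_${device.type}"                                            // e.g. "icon_air_conditioner"
    val second = if (device.state == WorkingState.ON) "#FFFF00" else "#808080"   // lit (yellow) or dark (gray)
    return IconPair(first, second)
}

// S21/S25: for each device installed at the designated location, prepare the icons to float on the map.
fun generateHomeMapView(devices: List<SmartDevice>): Map<String, IconPair> =
    devices.associate { it.id to configureIcons(it) }

fun main() {
    val home = listOf(
        SmartDevice("ac-1", "air_conditioner", WorkingState.ON),
        SmartDevice("tv-1", "television", WorkingState.OFF),
    )
    println(generateHomeMapView(home))
}

The sketch deliberately stops at producing icon descriptions per device; overlaying them on an actual map view would depend on the mapping toolkit used.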
This embodiment also provides a view visualization control method based on the view generation method, which includes:
judging whether a query mode for a designated location is currently entered; if so, calling the home map view corresponding to the currently queried designated location and displaying it; if not, returning to the judging step; a first icon matching each smart home device and a second icon matching its working state are displayed on the home map view of the designated location;
and monitoring a touch action on the first icon or the second icon, so as to send a control instruction corresponding to the touch action to the smart home device.
The view visualization control method based on the view generation method provided by this embodiment is described in detail below with reference to the drawings. The method described in this embodiment is applied to the internet of things 1 shown in fig. 1. The internet of things 1 comprises at least one smart home device 11 and a mobile terminal 12 in communication connection with the smart home device 11.
When the view visualization control method is executed, the home map view has already been generated by the view generation method. Referring to fig. 3, a flowchart of the view visualization control method based on the view generation method according to an embodiment is shown. As shown in fig. 3, the method specifically includes the following steps:
S31: judge whether a query mode for a designated location is currently entered; if yes, go to S32; if not, return to S31.
In this embodiment, judging whether a query mode for a designated location is currently entered means monitoring whether the electronic map display has been entered and receiving the designated location to be queried.
S32: call the home map view corresponding to the currently queried designated location, and display the home map view of that location.
In this embodiment, a first icon matching each smart home device and a second icon matching its working state are displayed on the home map view of the designated location.
For example, when the designated location to be queried is the user's home address or office address, the home map view corresponding to the home address or the office address, respectively, is called.
After the corresponding home map view is called, it is displayed.
S33: monitor a touch action on the first icon or the second icon, and send a control instruction corresponding to the touch action to the smart home device.
In this embodiment, the control instruction corresponding to a touch action on the first icon is an instruction to display the current working attribute of the smart home device, and the control instruction corresponding to a touch action on the second icon is an instruction to turn the smart home device on or off.
For example, if the user touches the first icon of the smart TV while its second icon is lit, information such as the name of the program currently playing on the smart TV is displayed.
S34: after a control instruction corresponding to a touch action on the first icon is monitored and sent to the smart home device, receive and display the current working attribute fed back by the smart home device; or
after a control instruction corresponding to a touch action on the second icon is monitored and sent to the smart home device, replace the lit second icon with a dark second icon or replace the dark second icon with a lit second icon.
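A minimal Kotlin sketch of one way steps S31-S34 could be wired together; the DeviceChannel interface, the method names and the "lit"/"dark" return values are assumptions for illustration, not the patent's actual modules or protocol.

interface DeviceChannel {
    fun queryAttributes(deviceId: String): String    // reply to a first-icon touch
    fun togglePower(deviceId: String): Boolean       // reply to a second-icon touch; returns the new on/off state
}

class HomeMapController(private val channel: DeviceChannel) {
    // S31/S32: only when a location query arrives is the corresponding home map view loaded.
    fun onQuery(location: String?, loadView: (String) -> Unit) {
        if (location != null) loadView(location)     // otherwise keep waiting, i.e. return to S31
    }

    // S33/S34: a touch on the first icon asks the device for its current working attributes.
    fun onFirstIconTouched(deviceId: String): String = channel.queryAttributes(deviceId)

    // S33/S34: a touch on the second icon toggles the device and reports which icon state to show next.
    fun onSecondIconTouched(deviceId: String): String =
        if (channel.togglePower(deviceId)) "lit" else "dark"
}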
This embodiment further provides a medium (also referred to as a computer-readable storage medium) on which a computer program is stored; when executed by a processor, the program implements the view generation method or the view visualization control method based on the view generation method described above.
Those of ordinary skill in the art will understand that all or part of the steps of the above method embodiments may be completed by hardware related to a computer program. The computer program may be stored in a computer-readable storage medium; when executed, it performs the steps of the method embodiments described above. The storage medium includes various media that can store program code, such as ROM, RAM, magnetic disks or optical disks.
The view generation method and the view visualization control method based on it provided by this embodiment allow the smart home to be visually controlled on the basis of a map with only simple operations, giving users convenient, multi-region, multi-type visual map management.
Example two
This embodiment provides a view generation system, which includes:
a search module for searching for at least one designated location in the electronic map, at least one smart home device being installed at the designated location;
a configuration module for configuring, for each smart home device, a first icon matching the device and a second icon matching its working state;
and a view generation module for adding the first icon and the second icon to the electronic map at the designated location in a floating manner to generate a home map view of the designated location.
This embodiment further provides a view visualization control system based on the view generation system, which includes:
a judging module for judging whether a query mode for a designated location is currently entered; if so, a calling module is started to call the home map view corresponding to the currently queried designated location, and a display module displays the home map view of that location; if not, the judging module continues judging; a first icon matching each smart home device and a second icon matching its working state are displayed on the home map view of the designated location;
and a monitoring module for monitoring a touch action on the first icon or the second icon, so as to send the control instruction corresponding to the touch action to the smart home device through a communication module.
The view generation system and the view visualization control system based on it provided by this embodiment are described in detail below with reference to the drawings. It should be noted that the division into the following modules is only a logical division; in actual implementation, the modules may be wholly or partially integrated into one physical entity or kept physically separate. These modules may all be implemented as software called by a processing element, all be implemented as hardware, or be implemented partly as software called by a processing element and partly as hardware. For example, the x module may be a separately established processing element, or it may be integrated into a chip of the apparatus. The x module may also be stored in the memory of the apparatus in the form of program code, to be called by a processing element of the apparatus to execute the functions of the x module. Other modules are implemented similarly. All or some of the modules may be integrated together or implemented independently. The processing element described here may be an integrated circuit with signal processing capability. In implementation, the steps of the above method or the following modules may be completed by hardware integrated logic circuits in a processor element or by instructions in the form of software. The following modules may be one or more integrated circuits configured to implement the above methods, for example one or more application specific integrated circuits (ASICs), one or more digital signal processors (DSPs), or one or more field programmable gate arrays (FPGAs). When one of the following modules is implemented in the form of program code scheduled by a processing element, the processing element may be a general-purpose processor, such as a central processing unit (CPU) or another processor capable of calling program code. These modules may also be integrated together and implemented in the form of a system-on-a-chip (SoC).
Referring to fig. 4A and 4B, the structures of a view generation system according to an embodiment and of a view visualization control system based on that view generation system are respectively described in detail.
As shown in fig. 4A, the view generation system 41 includes a search module 411, a labeling module 412, a configuration module 413, an operation module 414, and a view generation module 415.
The search module 411 is used to search for at least one designated location in the electronic map. In this embodiment, at least one smart home device is installed at the designated location.
For example, the designated locations are a home and an office. Smart home devices such as a smart air conditioner, a robot vacuum and a smart TV are installed in the home. The office is equipped with smart home devices such as a robot vacuum, a smart air conditioner, a security system and a lighting system.
The labeling module 412, coupled to the search module 411, is used to mark at least one designated location on the electronic map. The marked designated location is used to receive a touch action for the home map view of that location; the touch action corresponds to an instruction to switch to the home map view of the designated location.
The configuration module 413, coupled to the search module 411 and the labeling module 412, is configured to configure, for each smart home device, a first icon matching the device and a second icon matching its working state.
In this embodiment, the working state of the smart home device includes an on state or an off state of the smart home device.
The operation module 414, coupled to the configuration module 413, is configured to light the second icon when the working state of the smart home device is the on state and the device is located at the designated location, or to darken the second icon when the working state of the smart home device is the off state and the device is located at the designated location.
In this embodiment, when the current working state fed back by the smart home device is the on state and the device is located at the designated location, the operation module 414 sets the color of the second icon to a bright color, for example red, yellow or light green.
In this embodiment, when the current working state fed back by the smart home device is the off state and the device is located at the designated location, the operation module 414 sets the color of the second icon to a dark color, for example gray.
The view generation module 415, coupled to the configuration module 413 and the operation module 414, is configured to add the first icon and the second icon to the electronic map at the designated location in a floating manner, so as to generate a home map view of the designated location.
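Purely as a sketch of the module boundaries described above, the Kotlin interfaces below paraphrase modules 411-415; the method names and signatures are assumptions for illustration, and the real system is not limited to this split.

interface SearchModule { fun findDesignatedLocations(): List<String> }                                 // 411
interface LabelingModule { fun mark(location: String) }                                                // 412
interface ConfigurationModule { fun configureIcons(deviceId: String): Pair<String, String> }           // 413
interface OperationModule { fun setSecondIconLit(deviceId: String, lit: Boolean) }                     // 414
interface ViewGenerationModule { fun floatIcons(location: String, icons: List<Pair<String, String>>) } // 415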
As shown in fig. 4B, the view visualization control system 42 based on the view generation system includes a judging module 421, a calling module 422, a display module 423, a monitoring module 424, a communication module 425, and a control module 426.
The judging module 421 is configured to judge whether a query mode for a designated location is currently entered; if so, the calling module 422 is started; if not, the judging module 421 continues to judge whether a query mode for a designated location is currently entered.
In this embodiment, the judging module 421 judges whether a query mode for a designated location is entered by monitoring whether the electronic map display has been entered and receiving the designated location to be queried.
The calling module 422, coupled to the judging module 421, is configured to call the home map view corresponding to the currently queried designated location and to enable the display module 423 to display the home map view of that location.
In this embodiment, a first icon matching each smart home device and a second icon matching its working state are displayed on the home map view of the designated location.
After the calling module 422 calls the corresponding home map view, the display module 423 displays it.
The monitoring module 424, coupled to the calling module 422 and the display module 423, is configured to monitor a touch action on the first icon or the second icon, so as to send a control instruction corresponding to the touch action to the smart home device through the communication module 425.
In this embodiment, the control instruction corresponding to a touch action on the first icon is an instruction to display the current working attribute of the smart home device, and the control instruction corresponding to a touch action on the second icon is an instruction to turn the smart home device on or off.
After a control instruction corresponding to a touch action on the first icon is sent to the smart home device through the communication module 425, the current working attribute fed back by the smart home device is received and displayed through the display module 423; or
after a control instruction corresponding to a touch action on the second icon is sent to the smart home device through the communication module 425, the control module 426 replaces the lit second icon with a dark second icon or replaces the dark second icon with a lit second icon.
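Similarly, a hypothetical Kotlin paraphrase of modules 421-426; the names and signatures are assumptions introduced here for illustration only.

interface JudgingModule { fun inQueryMode(): Boolean }                                       // 421
interface CallingModule { fun callHomeMapView(location: String): String }                    // 422
interface DisplayModule { fun show(view: String) }                                           // 423
interface MonitoringModule { fun onIconTouched(iconId: String, handle: (String) -> Unit) }   // 424
interface CommunicationModule { fun send(deviceId: String, instruction: String) }            // 425
interface ControlModule { fun swapSecondIcon(deviceId: String, lit: Boolean) }               // 426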
Example three
This embodiment provides a terminal, including: a processor, a memory, a transceiver, a communication interface, and/or a system bus; the memory and the communication interface are connected with the processor and the transceiver through the system bus to enable mutual communication; the memory is used for storing a computer program, the communication interface is used for communicating with other devices, and the processor and the transceiver are used for running the computer program so that the terminal performs the steps of the view generation method and/or the view visualization control method based on the view generation method according to the above embodiment.
The system bus mentioned above may be a peripheral component interconnect (PCI) bus, an extended industry standard architecture (EISA) bus, or the like, and may be divided into an address bus, a data bus, a control bus, and so on. The communication interface is used to enable communication between the database access apparatus and other devices (such as a client, a read-write library, or a read-only library). The memory may include random access memory (RAM) and may also include non-volatile memory, such as at least one disk storage.
The processor may be a general-purpose processor, including a central processing unit (CPU), a network processor (NP), and the like; it may also be a digital signal processor (DSP), an application specific integrated circuit (ASIC), a field programmable gate array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, or discrete hardware components.
The protection scope of the view generation method and/or the view visualization control method based on the view generation method described in the present invention is not limited to the execution order of the steps listed in this embodiment; all solutions implemented by adding, removing or replacing steps on the basis of the prior art according to the principle of the present invention are included in the protection scope of the present invention.
The present invention also provides a view generation system and/or a view visualization control system based on the view generation system, which can implement the view generation method and/or the view visualization control method based on the view generation method described in the present invention; however, the apparatus for implementing these methods includes, but is not limited to, the structures of the view generation system and/or the view visualization control system enumerated in this embodiment, and all structural modifications and substitutions made in the prior art according to the principles of the present invention are included in the protection scope of the present invention.
In summary, the view generation method, visual control method, system, medium and terminal of the present invention allow the smart home to be visually controlled on the basis of a map with only simple operations, and give users convenient, multi-region, multi-type visual map management. The invention effectively overcomes various defects in the prior art and has high industrial utilization value.
The foregoing embodiments merely illustrate the principles and effects of the present invention and are not intended to limit it. Any person skilled in the art may modify or change the above embodiments without departing from the spirit and scope of the present invention. Accordingly, all equivalent modifications or changes made by those of ordinary skill in the art without departing from the spirit and technical idea disclosed by the present invention shall still be covered by the claims of the present invention.

Claims (10)


Priority Applications (1)

Application Number: CN201811549258.1A | Priority Date: 2018-12-18 | Filing Date: 2018-12-18 | Title: View generation method/visual control method, system, medium and terminal


Publications (1)

Publication Number: CN111339222A | Publication Date: 2020-06-26

Family

ID=71181373

Family Applications (1)

Application Number: CN201811549258.1A | Status: Pending | Publication: CN111339222A (en) | Priority Date: 2018-12-18 | Filing Date: 2018-12-18 | Title: View generation method/visual control method, system, medium and terminal

Country Status (1)

Country: CN | Publication: CN111339222A (en)


Patent Citations (5)

* Cited by examiner, † Cited by third party

CN105706395A* | Priority date: 2014-01-06 | Publication date: 2016-06-22 | Assignee: 三星电子株式会社 | Title: Control apparatus and method for controlling same
CN105897527A* | Priority date: 2016-05-30 | Publication date: 2016-08-24 | Assignee: 海信集团有限公司 | Title: Method and device for setting running parameter of smart home device in smart scene
CN107765555A* | Priority date: 2016-08-23 | Publication date: 2018-03-06 | Assignee: 广州零号软件科技有限公司 | Title: The smart home product human-computer interaction interface that icon in kind is shown
CN107193901A* | Priority date: 2017-05-11 | Publication date: 2017-09-22 | Assignee: 长威信息科技发展股份有限公司 | Title: A kind of resource visualizes the method and system of fast selecting
CN107819652A* | Priority date: 2017-10-10 | Publication date: 2018-03-20 | Assignee: 北京小米移动软件有限公司 | Title: The control method and device of intelligent home device


Legal Events

Code: PB01 | Title: Publication
Code: SE01 | Title: Entry into force of request for substantive examination
Code: WD01 | Title: Invention patent application deemed withdrawn after publication (application publication date: 2020-06-26)
