Disclosure of Invention
In view of the above, the present application provides a method and an apparatus for image screening, an electronic device, and a computer-readable storage medium.
According to a first aspect of embodiments of the present application, there is provided a method of image screening, the method including:
acquiring a color list of a candidate image for each candidate image in an image set to be screened, wherein the color list comprises a color value used for representing a color and the number of pixel points with the color value in the candidate image;
judging whether the candidate image meets a preset primary screening condition or not according to the length of the color list;
if so, processing the color list according to each color value in the color list and the number of pixel points corresponding to the color value, and judging whether the candidate image is a target image needing to be selected according to the processing result.
Optionally, the processing the color list according to each color value in the color list and the number of pixels corresponding to the color value includes:
determining a transparent color in the color list according to the color value, and filtering the transparent color in the color list;
and merging the similar colors in the color list according to the color value and the number of the pixels corresponding to the color value to obtain a merged color list.
Optionally, the determining, according to the processing result, whether the candidate image is a target image that needs to be selected includes:
filtering the merged colors which do not accord with the preset rule in the merged color list, and acquiring the length of the merged color list;
and if the length of the merged color list is matched with the number of colors configured in advance, determining the candidate image as a target image.
Optionally, assume that the length of the color list after the transparent colors are filtered out is M;
and merging the similar colors in the color list according to the color value and the number of the pixels corresponding to the color value to obtain a merged color list, wherein the merged color list comprises:
step S1, traversing the color list, and calculating a color difference between the N-th color value and the (N+1)-th color value in the color list, where N = 1, …, M;
step S2, when the color difference is smaller than a preset color difference threshold, generating a merged color based on the N-th color value and the (N+1)-th color value, and adding the merged color to a merged color list, where the number of pixel points of the merged color is the sum of the numbers of pixel points corresponding to the N-th and (N+1)-th color values, and the color value of the merged color is the N-th color value;
step S3, deleting the list element containing the (N+1)-th color value from the color list, and returning to step S1; after the N-th color value has been fully traversed, deleting the list element containing the N-th color value; repeating steps S1 to S3 until the color list is empty, at which point the process ends.
Optionally, the color value is an RGBA value, and the determining a transparent color in the color list according to the color value includes:
comparing the A component value in each RGBA value in the color list with a preset transparent color threshold value;
and taking the color with the A component value larger than the preset transparent color threshold value as the transparent color.
Optionally, the filtering the merged colors in the merged color list that do not meet the preset rule includes:
acquiring the number of pixel points of each merged color in the merged color list and the total number of the pixel points of the merged color list;
calculating, for each merged color, the ratio of the number of pixel points of the merged color to the total number of pixel points, to obtain the ratio of the merged color;
and filtering the merged color with the ratio lower than a preset ratio threshold value.
Optionally, the determining, according to the length of the color list, whether the candidate image meets a preset preliminary screening condition includes:
when the length of the color list is smaller than a preset total color class upper limit, judging that the candidate image meets a preset preliminary screening condition;
and when the length of the color list is greater than or equal to a preset total color class upper limit, judging that the candidate image does not meet a preset preliminary screening condition.
According to a second aspect of embodiments of the present application, there is provided an image screening apparatus, the apparatus including:
the color list acquisition module is used for acquiring a color list of each candidate image in the image set to be screened, wherein the color list comprises a color value used for representing a color and the number of pixel points with the color value in the candidate image;
the preliminary screening judging module is used for judging whether the candidate image meets a preset preliminary screening condition according to the length of the color list; if yes, calling a color processing module;
the color processing module is used for processing the color list according to each color value in the color list and the number of pixel points corresponding to the color value;
and the target image determining module is used for judging whether the candidate image is the target image required to be selected according to the processing result.
According to a third aspect of embodiments of the present application, there is provided an electronic device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, the processor implementing the steps of the above method when executing the program.
According to a fourth aspect of embodiments of the present application, there is provided a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, performs the steps of the above-described method.
The scheme provided by the application has the following beneficial effects:
in this embodiment, a color list of a candidate image is obtained, whether the candidate image meets a preliminary screening condition is determined according to the length of the color list, and, when the candidate image meets the preliminary screening condition, whether the candidate image is a target image that needs to be selected is determined by processing the color values in the color list and the number of pixel points of each color value. Processing the color list of the candidate images helps a user identify target images from the batch of images in the image set to be screened; compared with manually searching for target images, this method offers higher searching efficiency and accuracy. In addition, target images can be found quickly from a batch of pictures without artificial-intelligence-based recognition, so the method has a low barrier to use and is easy to implement.
Detailed Description
Reference will now be made in detail to exemplary embodiments, examples of which are illustrated in the accompanying drawings. Where the following description refers to the accompanying drawings, the same numbers in different drawings represent the same or similar elements unless otherwise indicated. The implementations described in the following exemplary embodiments do not represent all embodiments consistent with the present application; rather, they are merely examples of apparatus and methods consistent with certain aspects of the present application, as detailed in the appended claims.
The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the application. As used in this application and the appended claims, the singular forms "a", "an", and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It should also be understood that the term "and/or" as used herein refers to and encompasses any and all possible combinations of one or more of the associated listed items.
It is to be understood that although the terms first, second, third, etc. may be used herein to describe various information, such information should not be limited by these terms. These terms are only used to distinguish one type of information from another. For example, first information may also be referred to as second information, and similarly, second information may also be referred to as first information, without departing from the scope of the present application. The word "if" as used herein may be interpreted as "upon" or "when" or "in response to a determination", depending on the context.
Referring to fig. 1, a flowchart illustrating steps of an embodiment of a method for image screening according to an exemplary embodiment of the present application is shown, where the embodiment may specifically include the following steps:
step 101, aiming at each candidate image in an image set to be screened, obtaining a color list of the candidate image, wherein the color list comprises a color value used for representing a color and the number of pixel points with the color value in the candidate image.
In this step, each list element in the color list may represent a color, and each list element may include a color value and the number of pixel points corresponding to the color value. It should be noted that the color values may have different expressions depending on the color space. For example, if the color space is the three-primary-color model RGB (R represents Red, G represents Green, and B represents Blue), the color values are RGB values; if the color space is the hexcone model HSV (H represents Hue, S represents Saturation, and V represents Value), the color values are HSV values; if the color space is the LAB color model (L represents lightness, A represents the range from magenta to green, and B represents the range from yellow to blue), the color values are LAB values. The present embodiment does not limit the color space or the expression of the color values.
In one embodiment, the color list of candidate images may be obtained by calling a script function.
And step 102, judging whether the candidate image meets a preset preliminary screening condition or not according to the length of the color list.
In this step, the length of the color list may be the number of colors in the color list. For example, if there are 100 colors in the color list, the length of the color list is 100.
In a possible implementation manner, when the target image to be selected is an image with relatively few color types, an upper limit on the total number of color classes, colorThreshold, may be preset; when the length of the color list is smaller than the preset colorThreshold, it may be determined that the candidate image satisfies the preset preliminary screening condition.
Here, colorThreshold may be an empirical value or set according to actual requirements. For example, colorThreshold may be set to 100: when the length of the color list of a candidate image is less than 100, the candidate image is determined to satisfy the preliminary screening condition; otherwise, the candidate image is determined not to satisfy the preliminary screening condition, and it may be omitted or not selected.
Step 103, if the candidate image meets a preset preliminary screening condition, processing the color list according to each color value in the color list and the number of pixel points corresponding to the color value, and judging whether the candidate image is a target image needing to be selected according to the processing result.
When it is determined that the candidate image satisfies the preliminary screening condition, the candidate image may be secondarily confirmed through step 103. In step 103, it is determined whether the candidate image is a target image to be selected through processing of the color list of the candidate image.
In this embodiment, a color list of a candidate image is obtained, whether the candidate image meets a preliminary screening condition is determined according to the length of the color list, and, when the candidate image meets the preliminary screening condition, whether the candidate image is a target image that needs to be selected is determined by processing the color values in the color list and the number of pixel points of each color value. Processing the color list of the candidate images helps a user identify target images from the batch of images in the image set to be screened; compared with manually searching for target images, this method offers higher searching efficiency and accuracy. In addition, target images can be found quickly from a batch of pictures without artificial-intelligence-based recognition, so the method has a low barrier to use and is easy to implement.
Referring to fig. 2, a flowchart illustrating steps of another embodiment of a method for image screening according to an exemplary embodiment of the present application is shown, where the embodiment may specifically include the following steps:
step 201, aiming at each candidate image in the image set to be screened, obtaining a color list of the candidate image, where the color list includes a color value for representing a color and the number of pixels having the color value in the candidate image.
In a possible embodiment, the color list rgba_list of a candidate image may be obtained by calling a script function. For example, the color list of a candidate image may be obtained through the getcolors method of the PIL (Python Imaging Library) library, using Python (an object-oriented, dynamically typed language) as the scripting language.
In one example, each list element in the color list may be represented as a color, and each list element is a tuple including a color value and a number of pixels corresponding to the color value. For example, a tuple can be represented as (count, (RGBA)), where RGBA (a is a transparent channel) is a color value, and count is the number of pixels of the candidate image having the color value.
For example, the list of colors may be expressed as: [(1160,(0,0,0,0)),(332,(51,51,51,255)),(15,(51,51,51,91)),(14,(57,57,57,10)),(11,(50,50,50,236)),(10,(50,50,50,133)),(9,(51,51,51,32)),(9,(50,50,50,247)),(8,(50,50,50,220)),(6,(50,50,50,180)),(6,(51,51,51,228)),(5,(51,51,51,145)),(5,(51,51,51,108)),(5,(50,50,50,125)),(4,(50,50,50,204)),(1,(49,49,49,67))].
The first parameter in each tuple in the list is count, and the second parameter is the RGBA value. For example, in the tuple (332,(51,51,51,255)), the four RGBA color components have the values (51,51,51,255), and 332 indicates that the number of pixel points having the color value (51,51,51,255) in the candidate image is 332.
It should be noted that the image set to be screened may be acquired through a path specified by the user. In implementation, the user may specify the path of the image set to be screened through the script code, for example, through pathpng = xxx/xx.
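As an illustrative sketch of step 201, the color list described above can be obtained with the Pillow fork of PIL; the helper name get_color_list and the sorting step are assumptions for illustration, not part of the original scheme.

```python
# Sketch: obtaining a color list of (count, (R, G, B, A)) tuples with Pillow.
# The helper name get_color_list is illustrative, not from the scheme itself.
from PIL import Image

def get_color_list(path_or_file):
    img = Image.open(path_or_file).convert("RGBA")
    w, h = img.size
    # getcolors returns [(count, (R, G, B, A)), ...] when maxcolors is
    # large enough to hold every distinct color in the image
    colors = img.getcolors(maxcolors=w * h)
    # sort by pixel count in descending order, matching the example list above
    return sorted(colors, reverse=True)
```

Each returned tuple then has the (count, (RGBA)) shape described above, e.g. (332, (51, 51, 51, 255)).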
Step 202, judging whether the candidate image meets a preset preliminary screening condition or not according to the length of the color list, if so, executing step 204-step 207; if not, go to step 203.
In this step, the length of the color list may be the number of colors in the color list. For example, if there are 100 colors in the color list, the length of the color list is 100.
In a possible implementation manner, when the target image to be selected is an image with relatively few color types, an upper limit on the total number of color classes, colorThreshold, may be preset. When the length of the color list of the candidate image is smaller than the preset colorThreshold, it may be determined that the candidate image satisfies the preset preliminary screening condition, and steps 204 to 207 are then executed; otherwise, when the length of the color list of the candidate image is greater than or equal to the preset colorThreshold, it may be determined that the candidate image does not satisfy the preset preliminary screening condition, and step 203 is then executed.
Step 203, determining the candidate image as an image not needing to be selected.
As an example, an exemplary scenario in which a candidate image does not satisfy the preliminary screening condition is as follows. Suppose the images the user needs to find are images with a small number of colors and relatively monotonous coloring, such as monochrome images (1 color), two-color images (2 colors), or three-color images (3 colors). If the length of the color list of a candidate image is greater than 100 (assuming colorThreshold is 100), the colors of the candidate image are relatively rich, so it may be determined that the candidate image does not satisfy the preliminary screening condition, that is, it is not an image the user needs to find, and the candidate image may be directly ignored.
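The preliminary screening of steps 202 and 203 amounts to a single length comparison. A minimal sketch, assuming the helper name passes_prescreen and the example value colorThreshold = 100 from the text:

```python
# Sketch of the preliminary screening condition: a candidate passes when its
# color list is shorter than colorThreshold (100 is the example value above).
def passes_prescreen(color_list, color_threshold=100):
    # the "length" of the color list is its number of distinct colors
    return len(color_list) < color_threshold
```

Candidates for which this returns False correspond to step 203 and are skipped.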
And step 204, determining transparent colors in the color list according to the color values, and filtering the transparent colors in the color list.
In practice, the edges of an image often contain many transparent colors. In a scenario where the colors of an image are to be recognized, transparent colors contribute little, so they may be filtered out of the color list, leaving only the colors of interest.
In a possible implementation manner of this embodiment, if the color values in the color list are characterized by RGBA values, step 204 may include the following sub-steps:
substep S21, comparing an a component value in each RGBA value in the color list with a preset transparent color threshold value;
and a sub-step S22 of regarding a color of which the a component value is greater than the preset transparent color threshold value as a transparent color.
In one embodiment, a transparent color threshold ignoreOpaque may be set in the script. The A component value of each RGBA value in the color list is compared with ignoreOpaque; if an A component value is greater than ignoreOpaque, the color corresponding to that A component value may be regarded as a transparent color and filtered out of the color list.
For example, assuming that ignoreOpaque is 200, when the A component value of an RGBA value is greater than ignoreOpaque, the color may be removed from the color list rgba_list.
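A sketch of step 204 under the document's stated convention that a color whose A component exceeds ignoreOpaque is treated as transparent; the helper name filter_transparent is illustrative.

```python
# Sketch of transparent-color filtering. Per the convention stated above,
# colors whose A component is GREATER than ignoreOpaque are removed.
def filter_transparent(rgba_list, ignore_opaque=200):
    return [(count, rgba) for count, rgba in rgba_list if rgba[3] <= ignore_opaque]
```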
Step 205, merging the similar colors in the color list according to the color value and the number of pixels corresponding to the color value to obtain a merged color list.
In practice, some colors in the images are gradient colors, which can be merged to finally obtain a merged color list.
In a possible implementation manner of this embodiment, step 205 may include the following steps:
Step S1, traversing the color list, and calculating a color difference between the N-th color value and the (N+1)-th color value in the color list, where N = 1, …, M.
Where M is the length of the color list after filtering the transparent color.
In implementation, the color list rgba_list may be traversed; each color value is taken out and compared with the next color value in the list to obtain the color difference between the two.
In one embodiment, assuming that the color values are characterized by RGBA values, the color difference between the N-th color value and the (N+1)-th color value can be calculated as follows:
1. Calculating the R component color difference: subtracting the R component value of the (N+1)-th color value from the R component value of the N-th color value. For example, for the R component value C1,R of the first color value and the R component value C2,R of the second color value in the color list, the R component color difference is ΔR = C1,R - C2,R.
2. Calculating the G component color difference: subtracting the G component value of the (N+1)-th color value from the G component value of the N-th color value. For example, for the G component value C1,G of the first color value and the G component value C2,G of the second color value, the G component color difference is ΔG = C1,G - C2,G.
3. Calculating the B component color difference: subtracting the B component value of the (N+1)-th color value from the B component value of the N-th color value. For example, for the B component value C1,B of the first color value and the B component value C2,B of the second color value, the B component color difference is ΔB = C1,B - C2,B.
4. Calculating the overall color difference. For example, the color difference ΔC between the first color value and the second color value in the color list is calculated using the following formula:
ΔC=(ΔR^2+ΔG^2+ΔB^2)^(1/2)
where ΔR, ΔG, and ΔB are the component color differences obtained in steps 1 to 3.
in other embodiments, assuming that the color values are represented by LAB values, the color difference between two adjacent color values is determined as follows:
ΔE=(ΔL^2+ΔA^2+ΔB^2)^(1/2)
where ΔE represents the color difference between the two color values, and ΔL, ΔA, and ΔB represent the differences of the L, A, and B components, respectively, between the two color values.
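The RGBA-based distance above can be sketched directly; only the R, G, and B components enter the formula, and the function name color_diff is an assumption for illustration.

```python
import math

# Sketch of the component-wise color difference:
# ΔC = sqrt(ΔR^2 + ΔG^2 + ΔB^2), ignoring the A component.
def color_diff(c1, c2):
    dr, dg, db = (c1[i] - c2[i] for i in range(3))
    return math.sqrt(dr * dr + dg * dg + db * db)
```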
Step S2, when the color difference is smaller than a preset color difference threshold, generating a merged color based on the N-th color value and the (N+1)-th color value, and adding the merged color to a merged color list, where the number of pixel points of the merged color is the sum of the numbers of pixel points corresponding to the N-th and (N+1)-th color values, and the color value of the merged color is the N-th color value.
In this embodiment, a color difference threshold diffThreshold is preset, and a merged color list colorMergeList is set up for storing the merged colors.
When the color difference diff between two adjacent color values in the color list is smaller than diffThreshold, the colors corresponding to the two color values are similar; a merged color may then be generated based on the two color values and added to the merged color list.
In one embodiment, one way to generate the merged color may be: the count of the latter color value (i.e., the (N+1)-th color value) is added to the count of the former color value (i.e., the N-th color value) to give the count of the merged color, and the former color value is taken as the color value of the merged color.
Step S3, deleting the list element containing the (N+1)-th color value from the color list, and returning to step S1; after the N-th color value has been fully traversed, deleting the list element containing the N-th color value; repeating steps S1 to S3 until the color list is empty, at which point the process ends.
After a merged color is generated from the N-th color value and a similar (N+1)-th color value and added to the merged color list, the list element containing the (N+1)-th color value may be removed from the color list in step S3. The N-th color value and the new (N+1)-th color value then go through steps S1 to S3 again, until the N-th color value has absorbed all of its similar colors, after which the list element containing the N-th color value is deleted from the color list. Steps S1 to S3 are then executed again to search for and merge colors similar to the next color, until the color list is empty and the flow ends.
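Steps S1 to S3 leave some control flow implicit (for example, what happens when the next color is not similar); one reasonable reading is that the current head color absorbs every later similar color before being moved to the merged list. A sketch under that assumption, with illustrative names:

```python
import math

def color_diff(c1, c2):
    # ΔC = sqrt(ΔR^2 + ΔG^2 + ΔB^2), ignoring the A component
    dr, dg, db = (c1[i] - c2[i] for i in range(3))
    return math.sqrt(dr * dr + dg * dg + db * db)

def merge_similar(color_list, diff_threshold=10.0):
    # Steps S1-S3: the head color absorbs later colors whose difference is
    # below diffThreshold (counts are summed, the head's value is kept),
    # absorbed elements are deleted, then the head joins the merged list.
    colors = list(color_list)
    merged = []
    while colors:
        count, value = colors.pop(0)
        remaining = []
        for c, v in colors:
            if color_diff(value, v) < diff_threshold:
                count += c  # step S2: sum the pixel counts
            else:
                remaining.append((c, v))
        merged.append((count, value))  # merged color keeps the head's value
        colors = remaining
    return merged
```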
And step 206, filtering the merged colors which do not accord with the preset rule in the merged color list.
In steps 206 and 207, processing is performed for the merged color list. In step 206, the merged colors in the merged color list that do not meet the preset rule may be filtered out first.
In one possible implementation of this embodiment, step 206 may include the following sub-steps:
and a substep S31, obtaining the number of pixels of each merged color in the merged color list and the total number of pixels of the merged color list.
And a sub-step S32, calculating, for each merged color, the ratio of the number of pixel points of the merged color to the total number of pixel points, to obtain the ratio of the merged color.
And a substep S33 of filtering the merged color with the ratio lower than a preset ratio threshold.
For example, assuming that each merged color in the merged color list is represented as (count1, (RGBA)), the count1 values of all merged colors may be summed to obtain the total number Addcount of pixel points in the merged color list. Then, in sub-step S32, the ratio of count1 to Addcount is calculated for each merged color to obtain the ratio of that merged color. If the ratio is lower than a preset ratio threshold colorRatio (e.g., colorRatio = 1%), the merged color may be removed from the merged color list as an invalid color; if the ratio is greater than or equal to colorRatio, the merged color may be retained in the merged color list as a valid color.
In this embodiment, since edge colors generally account for a low proportion of pixels, a color whose ratio is lower than the preset ratio threshold colorRatio may be treated as an edge color and removed from the merged color list. This omits colors with a small number of pixel points, which facilitates subsequently identifying candidate images based on the valid colors.
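Sub-steps S31 to S33 can be sketched as a single pass; the names below are illustrative, and colorRatio = 1% is the example value from the text.

```python
# Sketch of ratio-based filtering: merged colors whose share of the total
# pixel count (Addcount in the text) falls below colorRatio are dropped
# as edge colors.
def filter_by_ratio(merged_list, color_ratio=0.01):
    total = sum(count for count, _ in merged_list)
    return [(c, v) for c, v in merged_list if c / total >= color_ratio]
```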
Step 207, obtaining the length of the merged color list; and if the length of the merged color list is matched with the number of colors configured in advance, determining the candidate image as a target image.
In one example, the length of the list of merged colors may be the number of valid merged colors remaining in the list of merged colors, and if the length matches a preconfigured number of colors, the candidate image is determined to be the target image.
For example, if the preconfigured number of colors is 1, that is, when monochrome images need to be found, it may be determined that a candidate image is a target image when the length of the merged color list colorMergeList is 1.
For another example, if the preconfigured number of colors is 2, that is, when two-color images need to be found, it may be determined that a candidate image is a target image when the length of the merged color list colorMergeList is 2.
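The final check in step 207 is then a length comparison against the preconfigured number of colors; a minimal sketch with an illustrative helper name:

```python
# Sketch of the target-image decision: the candidate is a target image when
# the filtered merged color list has exactly the preconfigured color count.
def is_target(merged_list, color_count=1):
    return len(merged_list) == color_count
```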
It should be noted that the pre-configured threshold values such as the number of colors, ignoreOpaque, diffThreshold, colorRatio, colorThreshold, etc. mentioned in this embodiment are all empirical values, and a user may set the threshold values according to actual requirements, so as to flexibly adjust the sensitivity of image recognition. When setting the threshold, the user may set the threshold through the script code, or may input the threshold through the UI interface, which is not limited in this embodiment.
In this embodiment, a color list of a candidate image is obtained, and whether the candidate image meets the preliminary screening condition is determined according to the length of the color list. When the candidate image meets the preliminary screening condition, the transparent colors in the color list are filtered out according to the color values, and similar colors are merged to obtain a merged color list. After the merged colors that do not meet the preset rule are filtered out of the merged color list, if the number of remaining merged colors matches the preconfigured number of colors, the candidate image is determined to be a target image. Processing the color list of the candidate images helps a user identify target images from the batch of images in the image set to be screened; compared with manually searching for target images, this method offers higher searching efficiency and accuracy. In addition, target images can be found quickly from a batch of pictures without artificial-intelligence-based recognition, so the method has a low barrier to use and is easy to implement.
Corresponding to the embodiment of the method, the application also provides an embodiment of an image screening device.
The apparatus embodiments of the present application can be applied to an electronic device. The apparatus embodiments may be implemented by software, by hardware, or by a combination of hardware and software. Taking a software implementation as an example, as a logical apparatus, it is formed by the processor of the device in which it is located reading corresponding computer program instructions from the nonvolatile memory into memory and running them. From a hardware perspective, fig. 3 shows a hardware structure diagram of the device in which the apparatus of the present application is located. In addition to the processor, memory, network interface, and nonvolatile memory shown in fig. 3, the device may further include other hardware according to the actual function of the apparatus, which is not described again here.
Referring to fig. 4, a block diagram of an embodiment of an image screening apparatus according to an exemplary embodiment of the present application is shown, where the apparatus may include the following modules:
a color list obtaining module 401, configured to obtain, for each candidate image in the image set to be filtered, a color list of the candidate image, where the color list includes a color value used for representing a color and the number of pixels having the color value in the candidate image;
a preliminary screening judgment module 402, configured to judge whether the candidate image meets a preset preliminary screening condition according to the length of the color list; if yes, calling a color processing module;
a color processing module 403, configured to process the color list according to each color value in the color list and the number of pixels corresponding to the color value;
and a target image determining module 404, configured to determine whether the candidate image is a target image that needs to be selected according to the processing result.
In a possible implementation manner of this embodiment, the color processing module 403 may include the following sub-modules:
the transparent color filtering submodule is used for determining transparent colors in the color list according to the color values and filtering the transparent colors in the color list;
and the similar color merging submodule is used for merging the similar colors in the color list according to the color value and the number of the pixels corresponding to the color value so as to obtain a merged color list.
In a possible implementation manner of this embodiment, the target image determination module 404 may include the following sub-modules:
the merged color filtering submodule is used for filtering the merged colors which do not accord with the preset rule in the merged color list and acquiring the length of the merged color list;
and the target image determining submodule is used for determining the candidate image as the target image if the length of the combined color list is matched with the number of colors configured in advance.
In a possible implementation manner of this embodiment, it is assumed that the length of the color list after filtering the transparent color is M;
the similar color merging submodule may include the following units:
a color difference calculating unit, configured to traverse the color list and calculate the color difference between the Nth color value and the (N+1)th color value in the color list, where N = 1, 2, …, M;
a merged color generating unit, configured to, when the color difference is smaller than a preset color difference threshold, generate a merged color based on the Nth color value and the (N+1)th color value and add the merged color to a merged color list, where the number of pixels of the merged color is the sum of the number of pixels corresponding to the Nth color value and the number of pixels corresponding to the (N+1)th color value, and the color value of the merged color is the Nth color value;
a traversal unit, configured to delete, from the color list, the list element in which the (N+1)th color value is located, and invoke the color difference calculating unit again; when the traversal for the Nth color value is complete, delete the list element in which the Nth color value is located; the color difference calculating unit, the merged color generating unit, and the traversal unit are invoked repeatedly until the color list is empty, at which point the process ends.
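The traversal carried out by these three units can be sketched in Python as follows. Two points are assumptions, since the application leaves them open: the color list is represented as (color_value, pixel_count) pairs, and the color difference is taken as the Euclidean distance over the RGB components (other metrics would work equally well here):

```python
import math

def color_diff(c1, c2):
    """Euclidean distance over the RGB components; one common choice of
    color difference metric, used here for illustration."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(c1[:3], c2[:3])))

def merge_similar_colors(color_list, diff_threshold=30.0):
    """Merge near colors by the traversal described above: compare the
    head color with each following color, fold similar ones into it
    (summing pixel counts, keeping the head's color value), then move
    the head into the merged color list; repeat until the list is empty."""
    remaining = list(color_list)
    merged = []
    while remaining:
        base_value, base_count = remaining.pop(0)   # the Nth color value
        i = 0
        while i < len(remaining):
            value, count = remaining[i]             # the (N+1)th color value
            if color_diff(base_value, value) < diff_threshold:
                base_count += count                 # pixel counts are summed
                del remaining[i]                    # drop the absorbed entry
            else:
                i += 1
        merged.append((base_value, base_count))     # color value stays the head's
    return merged
```

For example, merging `[((255, 0, 0, 255), 90), ((250, 5, 5, 255), 5), ((0, 0, 255, 255), 10)]` with the default threshold folds the near-red color into pure red (difference ≈ 8.7 < 30) and leaves blue separate, giving `[((255, 0, 0, 255), 95), ((0, 0, 255, 255), 10)]`.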
In a possible implementation manner of this embodiment, the color value is an RGBA value, and the transparent color filtering submodule is specifically configured to:
comparing the A component value in each RGBA value in the color list with a preset transparent color threshold value;
and taking the color with the A component value larger than the preset transparent color threshold value as the transparent color.
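A minimal sketch of this comparison, assuming (R, G, B, A) tuples as color values. Note that the text above treats a larger A component as more transparent; if the A channel in a given pixel format encodes opacity instead, the comparison would be inverted:

```python
def filter_transparent(color_list, alpha_threshold=200):
    """Keep only colors whose A component does not exceed the preset
    transparent color threshold; colors above it are treated as
    transparent, per the convention in the text above."""
    return [(value, count) for value, count in color_list
            if value[3] <= alpha_threshold]
```

Applied to a list containing an A = 255 color and an A = 10 color with a threshold of 200, this drops the former as transparent and keeps the latter.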
In a possible implementation manner of this embodiment, the merged color filtering sub-module is specifically configured to:
acquiring the number of pixels of each merged color in the merged color list and the total number of pixels over the merged color list;
calculating, for each merged color, the ratio of the number of pixels of the merged color to the total number of pixels, to obtain the ratio of the merged color;
and filtering out the merged colors whose ratio is lower than a preset ratio threshold.
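The ratio filtering just described can be sketched as follows; the (color_value, pixel_count) pair representation and the 5% default threshold are illustrative assumptions:

```python
def filter_minor_colors(merged_list, ratio_threshold=0.05):
    """Drop merged colors whose share of the total pixel count falls
    below the preset ratio threshold."""
    if not merged_list:
        return []  # guard against dividing by a zero total
    total = sum(count for _, count in merged_list)
    return [(value, count) for value, count in merged_list
            if count / total >= ratio_threshold]
```

For a 100-pixel list split 95/3/2, a 5% threshold keeps only the dominant color, so the merged color list shrinks to length 1 before the final length comparison.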
In a possible implementation manner of this embodiment, the preliminary screening judgment module is specifically configured to:
when the length of the color list is smaller than a preset upper limit on the total number of colors, judge that the candidate image meets the preset preliminary screening condition;
and when the length of the color list is greater than or equal to the preset upper limit on the total number of colors, judge that the candidate image does not meet the preset preliminary screening condition.
As for the apparatus embodiments, since they substantially correspond to the method embodiments, reference may be made to the description of the method embodiments for relevant details.
The above-described embodiments of the apparatus are merely illustrative, and the units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the modules can be selected according to actual needs to achieve the purpose of the scheme of the application. One of ordinary skill in the art can understand and implement it without inventive effort.
The present application further provides an electronic device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, wherein the processor implements the steps of the above-mentioned method embodiments when executing the program.
The present application also provides a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, implements the steps of the above-mentioned method embodiments.
Embodiments of the subject matter and the functional operations described in this specification can be implemented in: digital electronic circuitry, tangibly embodied computer software or firmware, computer hardware including the structures disclosed in this specification and their structural equivalents, or a combination of one or more of them. Embodiments of the subject matter described in this specification can be implemented as one or more computer programs, i.e., one or more modules of computer program instructions, encoded on a tangible, non-transitory program carrier for execution by, or to control the operation of, data processing apparatus. Alternatively or additionally, the program instructions may be encoded on an artificially generated propagated signal, e.g., a machine-generated electrical, optical, or electromagnetic signal, that is generated to encode and transmit information to suitable receiver apparatus for execution by the data processing apparatus. The computer storage medium may be a machine-readable storage device, a machine-readable storage substrate, a random or serial access memory device, or a combination of one or more of them.
The processes and logic flows described in this specification can be performed by one or more programmable computers executing one or more computer programs to perform corresponding functions by operating on input data and generating output. The processes and logic flows can also be performed by, and apparatus can also be implemented as, special purpose logic circuitry, e.g., an FPGA (field programmable gate array) or an ASIC (application-specific integrated circuit).
Computers suitable for executing computer programs include, for example, general and/or special purpose microprocessors, or any other type of central processing unit. Generally, a central processing unit will receive instructions and data from a read-only memory and/or a random access memory. The basic components of a computer include a central processing unit for implementing or executing instructions and one or more memory devices for storing instructions and data. Generally, a computer will also include, or be operatively coupled to receive data from or transfer data to, or both, one or more mass storage devices for storing data, e.g., magnetic, magneto-optical disks, or optical disks. However, a computer does not necessarily have such a device. Further, the computer may be embedded in another device, e.g., a vehicle-mounted terminal, a mobile telephone, a Personal Digital Assistant (PDA), a mobile audio or video player, a game console, a Global Positioning System (GPS) receiver, or a portable storage device such as a Universal Serial Bus (USB) flash drive, to name a few.
Computer-readable media suitable for storing computer program instructions and data include all forms of non-volatile memory, media and memory devices, including by way of example semiconductor memory devices (e.g., EPROM, EEPROM, and flash memory devices), magnetic disks (e.g., an internal hard disk or a removable disk), magneto-optical disks, and CD ROM and DVD-ROM disks. The processor and the memory can be supplemented by, or incorporated in, special purpose logic circuitry.
While this specification contains many specific implementation details, these should not be construed as limitations on the scope of any invention or of what may be claimed, but rather as descriptions of features specific to particular embodiments of particular inventions. Certain features that are described in this specification in the context of separate embodiments can also be implemented in combination in a single embodiment. In other instances, features described in connection with one embodiment may be implemented as discrete components or in any suitable subcombination. Moreover, although features may be described above as acting in certain combinations and even initially claimed as such, one or more features from a claimed combination can in some cases be excised from the combination, and the claimed combination may be directed to a subcombination or variation of a subcombination.
Similarly, while operations are depicted in the drawings in a particular order, this should not be understood as requiring that such operations be performed in the particular order shown or in sequential order, or that all illustrated operations be performed, to achieve desirable results. In some cases, multitasking and parallel processing may be advantageous. Moreover, the separation of various system modules and components in the embodiments described above should not be understood as requiring such separation in all embodiments, and it should be understood that the described program components and systems can generally be integrated together in a single software product or packaged into multiple software products.
Thus, particular embodiments of the subject matter have been described. Other embodiments are within the scope of the following claims. In some cases, the actions recited in the claims can be performed in a different order and still achieve desirable results. Further, the processes depicted in the accompanying figures do not necessarily require the particular order shown, or sequential order, to achieve desirable results. In some implementations, multitasking and parallel processing may be advantageous.
The above description is only exemplary of the present application and should not be taken as limiting the present application, as any modification, equivalent replacement, or improvement made within the spirit and principle of the present application should be included in the scope of protection of the present application.