Detailed Description
In order to more clearly illustrate the technical solutions of the embodiments of the present specification, the drawings that are required to be used in the description of the embodiments will be briefly described below. It is apparent that the drawings in the following description are only some examples or embodiments of the present specification, and it is possible for those of ordinary skill in the art to apply the present specification to other similar situations according to the drawings without inventive effort. Unless otherwise apparent from the context of the language or otherwise specified, like reference numerals in the figures refer to like structures or operations.
It will be appreciated that the terms "system," "apparatus," "unit," and/or "module" as used herein are one way of distinguishing between different components, elements, parts, portions, or assemblies at different levels. However, the words may be replaced by other expressions if those expressions achieve the same purpose.
As used in this specification and the claims, the terms "a," "an," and/or "the" do not refer specifically to the singular and may include the plural, unless the context clearly dictates otherwise. In general, the terms "comprises" and "comprising" merely indicate that the explicitly identified steps and elements are included; they do not constitute an exclusive list, and a method or apparatus may also include other steps or elements.
A flowchart is used in this specification to describe the operations performed by the system according to embodiments of the present specification. It should be appreciated that the preceding or following operations are not necessarily performed precisely in order. Rather, the steps may be processed in reverse order or simultaneously. Also, other operations may be added to or removed from these processes.
Current intelligent drawing tools generally rely on design templates produced inside the system to collage and combine pictures and text into target images (e.g., product publicity charts, public welfare publicity charts, etc.), so their output is strongly limited by the types and number of available design templates. In addition, such tools lack the capability to draw with natural colors as a substrate. Because there is no systematic constraint or rule governing how content information such as text, illustrations, and buttons in the foreground layer is matched to the background layer of the image, design templates and content information are often improperly color-matched. As a result, a large number of unusable waste images are generated (for example, images in which the color of foreground content such as text and illustrations does not contrast clearly with the color of the background layer), efficiency is low, and the time and operation cost spent by users screening images increase.
According to the image color matching method provided in some embodiments of the present specification, the background layer color of the target image is determined based on color system information, the foreground layer color of the target image is determined based on the background layer color, and the target image is generated according to the background layer color and the foreground layer color. This can save template production cost and improve the success rate and efficiency of target image generation.
Fig. 1 is a schematic illustration of an application scenario of an exemplary image color matching system according to some embodiments of the present specification.
As shown in fig. 1, image color matching system 100 may include a processing device 110, a terminal device 120, a storage device 130, and a network 140. In some embodiments, the processing device 110 may be part of the terminal device 120.
Processing device 110 may process data and/or information acquired from terminal device 120, storage device 130, or other components of image color matching system 100. For example, the processing device 110 may obtain image information (e.g., first image information, second image information, etc.) from the terminal device 120, and perform an analysis process thereon to determine the colors of the foreground layer 151 and the background layer 152 of the target image, and/or generate the target image 150 based on the foreground layer 151 and the background layer 152. In some embodiments, the processing device 110 may be local or remote. For example, processing device 110 may access information and/or data from terminal device 120 and/or storage device 130 via network 140.
In some embodiments, the processing device 110 may include input means and/or output means. Interaction with the user (e.g., capturing image information, displaying a target image, etc.) may be accomplished through an input device and/or an output device. In some embodiments, the input device and/or output device may include a display screen, a keyboard, a mouse, a microphone, etc., or any combination thereof.
Terminal device 120 may be coupled to and/or in communication with processing device 110 and/or storage device 130. For example, the terminal device 120 may acquire the generated target image from the processing device 110 and display it for output to the user. In some embodiments, terminal device 120 may include one or any combination of mobile device 121, tablet 122, notebook 123, server, processing device, etc., or other devices having data uploading, receiving, processing, outputting, and/or displaying functions. In some embodiments, the terminal device 120 (or all or part of its functionality) may be integrated in the processing device 110. In some embodiments, the processing device 110 and the terminal device 120 may be directly or indirectly connected, and the methods and/or functions described herein may be implemented by them in combination. In some embodiments, terminal device 120 may have one or more users, including users who directly use a service (e.g., a target image generation service) as well as other related users.
Storage device 130 may store data, instructions, and/or any other information. In some embodiments, the storage device 130 may store data (e.g., target images, first image information, second image information, etc.) acquired from the processing device 110 and/or the terminal device 120. In some embodiments, storage device 130 may store computer instructions or the like for implementing an image color matching method and/or an image generation method.
In some embodiments, storage device 130 may include one or more storage components, each of which may be a separate device or may be part of another device. In some embodiments, the storage device 130 may include Random Access Memory (RAM), Read-Only Memory (ROM), mass storage, removable memory, volatile read-write memory, and the like, or any combination thereof. By way of example, mass storage may include magnetic disks, optical disks, solid state disks, and the like. In some embodiments, storage device 130 may be implemented on a cloud platform.
Network 140 may include any suitable network capable of facilitating the exchange of information and/or data. In some embodiments, at least one component of image color matching system 100 (e.g., processing device 110, terminal device 120, storage device 130) may exchange information and/or data with at least one other component of image color matching system 100 via network 140. For example, the processing device 110 may obtain image information from the terminal device 120 through the network 140.
It should be noted that the image color matching system 100 is provided for illustrative purposes only and is not intended to limit the scope of the present description. Many modifications and variations will be apparent to those of ordinary skill in the art in light of the present description. For example, the image color matching system 100 may implement similar or different functions on other devices. However, such changes and modifications do not depart from the scope of the present specification.
FIG. 2 is a block diagram of an exemplary image color matching system according to some embodiments of the present specification.
As shown in fig. 2, in some embodiments, the image color matching system 200 may include a first acquisition module 210, a color determination module 220, and a color matching module 230. In some embodiments, the corresponding functions of image color matching system 200 may be performed by processing device 110.
The first acquisition module 210 may be configured to acquire first image information. In some embodiments, the first image information may include color system information and content information. In some embodiments, the content information may include at least one of text, artwork, buttons, labels, and the like.
The color determination module 220 may be used to determine a background layer color of the target image. In some embodiments, the color determination module 220 may determine a background layer color of the target image based on the color family information. In some embodiments, the color determination module 220 may determine a hue value, a saturation value, and a brightness value using a predetermined algorithm based on the color family information, and determine a background layer color of the target image based on the hue value, the saturation value, and the brightness value.
In some embodiments, the background layer color may comprise a single color or a gradient color. In some embodiments, the gradient color may include a single-color gradient and a cross-color gradient. In some embodiments, when the background layer color is a single color, the saturation value S and the brightness value B of the background layer color may satisfy mathematical expression (1): B - 100 ≥ 202/(S - 100). In some embodiments, when the background layer color is a single-color gradient, the hue value of the background layer color is fixed, and the saturation value S1 and the brightness value B1 of the gradient color may satisfy the following conditions: if S0 + X > 100, the saturation value of the gradient color is S1 = 100 and its brightness value is B1 = B0 + S0 + X - 100; if B0 + X > 100, the brightness value of the gradient color is B1 = 100 and its saturation value is S1 = B0 + S0 + X - 100, where S0 represents the saturation value corresponding to the base color of the background layer color, B0 represents the brightness value corresponding to the base color, and X may be any integer in the interval [20, 40]. In some embodiments, when the background layer color is a cross-color gradient, the saturation value and the brightness value of the background layer color are fixed, and the hue value H1 of the gradient color may satisfy the following conditions: if H0 + Y > 360, the hue value of the gradient color is H1 = H0 + Y - 360; if H0 - Y < 0, the hue value of the gradient color is H1 = 360 - (Y - H0), where H0 represents the hue value corresponding to the base color of the background layer color, and Y may be any integer in the interval [30, 60].
The color matching module 230 may be used to determine a foreground layer color of the target image based on the background layer color. In some embodiments, the foreground layer may include content information. In some embodiments, when the background layer color and the foreground layer color belong to the same hue, the background layer color and the foreground layer color may be located in a first preset area and a second preset area of the same hue, respectively, and when the background layer color and the foreground layer color belong to different hues, the hue of the foreground layer color may be located within a preset color gamut of the hue of the background layer color. In some embodiments, the preset gamut may be a 60 degree neighborhood of the hue of the background layer color. In some embodiments, when the first preset area is smaller than the second preset area, the second preset area may satisfy the following equation (2):
S≥B≥202/(S-100)+100,(100/8≤S≤100) (2)
wherein S represents a saturation value and B represents a brightness value.
In some embodiments, when the first preset area is greater than the second preset area, the second preset area may satisfy the following mathematical formula (3):
in some embodiments, when the background layer color and the foreground layer color belong to different hues, the foreground layer color may be determined based on the following equations (4) and (5):
B2-100≥202/(S2-100),(0<S2≤100,0<B2≤100) (4)
H-60≤H2≤H+60,(0≤H≤360) (5)
Wherein B2, S2, and H2 respectively represent a luminance value, a saturation value, and a hue value corresponding to a foreground layer color, with B2 = 0 when S2 = 0 and B2 = 100 when S2 = 100; H represents a hue value corresponding to a background layer color.
Further details regarding the functions of the first acquisition module 210, the color determination module 220, and the color matching module 230 may be found in fig. 4 and its related description, and are not repeated here.
FIG. 3 is a block diagram of an exemplary image generation system according to some embodiments of the present specification.
As shown in fig. 3, in some embodiments, the image generation system 300 may include an image color matching module 310, a second acquisition module 320, and an image generation module 330. In some embodiments, the corresponding functions of the image generation system 300 may be performed by the processing device 110 or the terminal device 120.
The image color matching module 310 may be used to determine the foreground layer color and the background layer color of the target image. In some embodiments, the image color matching module 310 may perform the same or similar functions as the image color matching system 200, and more details may be found in fig. 2 or fig. 4 and related descriptions thereof, which are not repeated herein. For example, the image color matching module 310 may further include an image information acquisition unit, a color determination unit, and a color matching unit (not shown in the figure). Wherein the image information acquisition unit may be configured to acquire the first image information, the color determination unit may be configured to determine a background layer color of the target image, and the color matching unit may be configured to determine a foreground layer color of the target image based on the background layer color.
The second acquisition module 320 may be used to acquire second image information. In some embodiments, the second image information may include at least one of an image theme, a size, a style, and the like.
The image generation module 330 may be configured to generate the target image based on the second image information, the foreground layer color, and the background layer color. For example, the image generation module 330 may generate the foreground layer 151 of the target image based on the second image information and the foreground layer color, generate the background layer 152 of the target image based on the background layer color, and generate the target image 150 based on the foreground layer 151 and the background layer 152.
It should be understood that the systems and modules thereof shown in fig. 2 and 3 may be implemented in a variety of ways. For example, in some embodiments the system and its modules may be implemented in hardware, software, or a combination of software and hardware.
It should be noted that the above description of system 200 and/or system 300 and their modules is for descriptive convenience only and is not intended to limit the present specification to the scope of the illustrated embodiments. It will be appreciated by those skilled in the art that, given the principles of the system, various modules may be combined arbitrarily, or a subsystem may be constructed in connection with other modules, without departing from such principles. For example, in some embodiments, the modules disclosed in fig. 2 or 3 may be different modules in one system, or one module may implement the functions of two or more of the modules described above. As another example, the modules may share one storage module, or each module may have its own storage module. Such variations are within the scope of the present specification.
Fig. 4 is a flow diagram of an exemplary image color matching method according to some embodiments of the present description.
In some embodiments, the process 400 may be performed by the image color matching system 100 (e.g., the processing device 110 or the terminal device 120) or the image color matching system 200. For example, the flow 400 may be stored in a storage device (e.g., the storage device 130, a memory unit of a system) in the form of a program or instructions that, when executed by a processor or module shown in fig. 2, may implement the flow 400. As shown in fig. 4, in some embodiments, the flow 400 may include the following steps.
In step 410, first image information is acquired. In some embodiments, step 410 may be performed by processing device 110, terminal device 120, or first acquisition module 210.
The image information (e.g., first image information, second image information) may reflect parameters of the target image to be generated. For example, the first image information may include color system information and content information. The color system information may reflect the hue family of the image color, for example, a red hue, a green hue, a blue hue, and the like. The content information may reflect the content that needs to be contained in the image. In some embodiments, the content information may include at least one of text, artwork, buttons, labels, and the like. The text may be promotional content related to the target image, such as a promotional slogan, an activity title, a product/service name, etc. The artwork may be a product picture, an animated picture, or the like related to the promotional information of the target image. The button may be a link that can jump to another page (e.g., a link button leading to a product purchase page). The label may be parameter information describing elements such as a logo picture, a button background picture, or a tool icon. For example, when generating a poster image advertising a green food item, the image information may include green (i.e., the color system), "organic, healthy" (i.e., the text), food pictures (i.e., the artwork), a button jumping to a detailed description of the product, etc. As another example, when generating a poster for a National Day promotional campaign, the image information may include red (i.e., the color system), the slogan "National Day gifts: discount benefits across ten thousand stores, XX off over XX" (i.e., the text), a button jumping to an online store purchase page, etc.
In some embodiments, the first image information may be acquired from a terminal device (e.g., terminal device 120) or a storage device (e.g., storage device 130). For example, the processing device 110 may obtain information such as the color system, text, and buttons input by the user through the terminal device 120. In some embodiments, the input means may include, but is not limited to, selection input, typing input, voice input, scanning input, handwriting input, and the like, or any combination thereof. In some embodiments, the first image information may be acquired in response to an image generation instruction. In some embodiments, the first image information may be acquired in response to an image editing instruction. For example, when an intermediate image has been generated and displayed on the display interface of the terminal device 120, first image information supplementarily input or re-input by the user may be acquired in response to the user clicking a button for adjustment, modification, addition, deletion, or the like.
Step 420, determining the background layer color of the target image based on the color system information. In some embodiments, step 420 may be performed by processing device 110, terminal device 120, or color determination module 220.
In some embodiments, the image may be divided into a background layer and a foreground layer according to the structure of the image. The background layer may be the base layer of the image, and the foreground layer may be a layer above the base layer. For example, as shown in fig. 5, the base layer c of the image represents the background layer, and the illustration b and text a above the base layer c correspond to the foreground layer. In some embodiments, the foreground layer may include one or more layers.
In some embodiments, the hue value, the saturation value, and the brightness value may be determined by a preset algorithm based on the color system information, and the background layer color of the target image may be determined based on the hue value, the saturation value, and the brightness value. In some embodiments, the preset algorithm may include a color model. For example, color models may include, but are not limited to, the HSI (Hue, Saturation, Intensity) color model, the HSV (Hue, Saturation, Value) color model, the RGB (Red, Green, Blue) color model, the CMYK (Cyan, Magenta, Yellow, Key) color model, and the like. In some embodiments, the preset algorithm may be the HSB (Hue, Saturation, Brightness) color model. By way of example only, as shown in fig. 6, fig. 6 (a) shows the hue H in the HSB color model, which includes a plurality of different hues with values from 0 to 360, and fig. 6 (b) shows the region corresponding to one of the hues, where the ordinate of the region represents brightness (Brightness), the abscissa represents saturation (Saturation), and both the saturation S and the brightness B take values from 0 to 100.
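By way of illustration only, the HSB convention above (hue in [0, 360], saturation and brightness in [0, 100]) can be converted to 8-bit RGB with Python's standard colorsys module, which expects all channels in [0, 1]. The following is a minimal sketch, not part of the claimed method; the helper name hsb_to_rgb is chosen here for illustration.

```python
import colorsys

def hsb_to_rgb(h: float, s: float, b: float) -> tuple[int, int, int]:
    """Convert an HSB color (H: 0-360, S and B: 0-100) to 8-bit RGB."""
    r, g, bl = colorsys.hsv_to_rgb(h / 360.0, s / 100.0, b / 100.0)
    return round(r * 255), round(g * 255), round(bl * 255)

print(hsb_to_rgb(0, 100, 100))   # (255, 0, 0): pure red
print(hsb_to_rgb(0, 15, 100))    # a light, low-saturation red
```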
In some embodiments, the background layer color may comprise a single color and/or a gradient color. In some embodiments, a single-color background layer color may be determined from the effective color gamut based on the color system information. In some embodiments, the effective color gamut may be determined in a preset manner (e.g., by expression (1)). For example only, when the color system is red, the processing device 110 may determine the hue value corresponding to red (e.g., hue value H = 0) based on the HSB color model, determine the effective color gamut (e.g., the area above the dotted line in fig. 6 (c)) within the region corresponding to red (e.g., the region shown in fig. 6 (b)), and determine the saturation value S and the brightness value B of the background layer color from the effective color gamut (e.g., (S = 15, B = 100), (S = 44, B = 100), (S = 86, B = 95), (S = 100, B = 67), (S = 100, B = 29)).
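Expression (1) can be transcribed directly as a gamut test. The sketch below is a hedged illustration; treating the fully saturated edge S = 100 as always valid (the bound tends to minus infinity there) is our assumption.

```python
def in_effective_gamut(s: float, b: float) -> bool:
    """Expression (1): keep (S, B) only if B - 100 >= 202 / (S - 100)."""
    if s >= 100:          # avoid division by zero at the fully saturated edge
        return True
    return b - 100 >= 202 / (s - 100)

# The sample points listed above all pass the test.
for s, b in [(15, 100), (44, 100), (86, 95), (100, 67), (100, 29)]:
    print((s, b), in_effective_gamut(s, b))   # all True
```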
In some embodiments, the gradient color may include a single-color gradient and a cross-color gradient. A single-color gradient means that the gradient color and the base color are colors with different saturation and/or brightness within the same hue. A cross-color gradient means that the gradient color and the base color are colors in different hues.
In some embodiments, the saturation value and/or the luminance value of the gradient color may be determined based on the saturation value and/or the luminance value of the base color, thereby determining a single-color-gradient background layer. In some embodiments, the saturation value and/or the luminance value of the base color may be determined based on the color system information (e.g., randomly determined from the region shown in fig. 6 (b)). In some embodiments, when the background layer color is a single-color gradient, the hue value of the background layer color is fixed, and the saturation value S1 and the brightness value B1 of the gradient color may satisfy a first preset condition. For example, the first preset condition may be: if S0 + X > 100, the saturation value of the gradient color is S1 = 100 and its brightness value is B1 = B0 + S0 + X - 100; if B0 + X > 100, the brightness value of the gradient color is B1 = 100 and its saturation value is S1 = B0 + S0 + X - 100, where S0 represents the saturation value corresponding to the base color of the background layer color, B0 represents the brightness value corresponding to the base color, and X is any integer in the interval [20, 40]. In some embodiments, X may take values in other numerical ranges, for example, [30, 60], [10, 50], etc., which is not limited in this specification.
For example only, based on the color system information and the first preset condition, the processing device 110 may determine a base color (S0 = 17, B0 = 100) of the background layer with a gradient color (S1 = 54, B1 = 100); or a base color (S0 = 100, B0 = 50) with a gradient color (S1 = 100, B1 = 15); or a base color (S0 = 69, B0 = 97) with a gradient color (S1 = 100, B1 = 71); or a base color (S0 = 99, B0 = 82) with a gradient color (S1 = 99, B1 = 42).
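A hedged sketch of the first preset condition follows. The overflow branches implement the condition as stated; the no-overflow branch (neither S0 + X nor B0 + X exceeds 100) is not spelled out above, so shifting both channels by X there is our assumption.

```python
import random

def monochrome_gradient(s0: float, b0: float) -> tuple[float, float]:
    """Return a gradient stop (S1, B1) for base color (S0, B0); the hue stays fixed."""
    x = random.randint(20, 40)           # X is any integer in [20, 40]
    if s0 + x > 100:                     # saturation would overflow: clamp S1,
        return 100, b0 + s0 + x - 100    # carry the remainder into brightness
    if b0 + x > 100:                     # brightness would overflow: clamp B1,
        return b0 + s0 + x - 100, 100    # carry the remainder into saturation
    return s0 + x, b0 + x                # assumed no-overflow case: shift both by X
    # Note: a full implementation might also clamp the carried channel to 100.

print(monochrome_gradient(17, 100))      # (54, 100) when X = 37, matching the first example
```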
In some embodiments, the hue value of the gradient color may be determined based on the hue value of the base color, thereby determining a cross-color-gradient background layer. In some embodiments, the saturation value, the brightness value, and/or the hue value of the base color may be determined based on the color system information (e.g., the base color is determined from the regions shown in fig. 6 (a) and (b) based on the color system information). In some embodiments, when the background layer color is a cross-color gradient, the saturation value and the brightness value of the background layer color are both fixed, and the hue value H1 of the gradient color may satisfy a second preset condition. For example, the second preset condition may include that the hue corresponding to the gradient color is located within a preset area of the hue corresponding to the base color (for example, within the adjacent hues 30° to 60° away from the hue corresponding to the base color). Specifically, the second preset condition may be: if H0 + Y > 360, the hue value of the gradient color is H1 = H0 + Y - 360; if H0 - Y < 0, the hue value of the gradient color is H1 = 360 - (Y - H0), where H0 represents the hue value corresponding to the base color of the background layer color, and Y is any integer in the interval [30, 60]. In some embodiments, Y may take other values, which is not limited in this specification.
By way of example only, based on the color system information and the second preset condition, the processing device 110 may determine that the base color and the gradient color of the background layer both have a saturation value of 75 and a brightness value of 100, with a base hue H0 = 50 and a gradient hue H1 = 0; or both have a saturation value of 71 and a brightness value of 95, with H0 = 35 and H1 = 339; or both have a saturation value of 70 and a brightness value of 96, with H0 = 19 and H1 = 329; or both have a saturation value of 75 and a brightness value of 94, with H0 = 353 and H1 = 293.
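The second preset condition amounts to stepping the base hue by Y degrees and wrapping around the 360-degree wheel. In the sketch below, allowing the step in either direction (H0 + Y or H0 - Y) is an assumption inferred from the worked examples above.

```python
import random

def cross_color_hue(h0: float) -> float:
    """Return a gradient hue H1 offset from the base hue H0 by Y in [30, 60]."""
    y = random.randint(30, 60) * random.choice([1, -1])   # signed offset; direction assumed
    h1 = h0 + y
    if h1 > 360:              # H0 + Y > 360: wrap forward, H1 = H0 + Y - 360
        h1 -= 360
    elif h1 < 0:              # H0 - Y < 0: wrap backward, H1 = 360 - (Y - H0)
        h1 += 360
    return h1

print(cross_color_hue(353))  # e.g. 293 when Y = -60, matching the last example
```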
In some embodiments, the background layer color may be determined based on the content information. For example, a background layer color (e.g., the base color) may be determined based on the color of the artwork. In some embodiments, when the content information contains multiple colors, the background layer color may be determined based on the coverage areas of those colors. For example, when the artwork contains multiple colors, the processing device 110 may extract the color with the largest coverage in the artwork and select the background layer color from within the 60-degree adjacent hues of that color's hue. As another example, when the artwork contains multiple colors, the processing device 110 may extract the two colors with the largest coverage in the artwork and use the colors matched to them as the base color and the gradient color of the background layer, respectively.
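A coverage-based extraction might look like the following sketch. The use of the Pillow library and the quantization to 16 colors are our assumptions (the specification does not name a library), and the file name is hypothetical.

```python
from PIL import Image

def dominant_colors(path: str, top: int = 2) -> list[tuple[int, int, int]]:
    """Return the `top` colors with the largest coverage in an image, as RGB tuples."""
    img = Image.open(path).convert("RGB").quantize(colors=16).convert("RGB")
    counts = img.getcolors(maxcolors=256 * 256 * 256)   # list of (pixel count, color)
    counts.sort(reverse=True)                           # largest coverage first
    return [color for _, color in counts[:top]]

# base, gradient = dominant_colors("artwork.png")  # hypothetical file name
```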
In some embodiments, the background layer color may be determined based on the color family information and the content information.
Step 430, determining a foreground layer color of the target image based on the background layer color. In some embodiments, step 430 may be performed by processing device 110, terminal device 120, or color matching module 230.
In some embodiments, the foreground layer may include the content information and, accordingly, the foreground layer color may include the color of the content information. In some embodiments, whether the foreground layer color and the background layer color belong to the same hue or different hues may be determined based on the color system information and/or the content information. In some embodiments, when the background layer color and the foreground layer color belong to the same hue, the background layer color and the foreground layer color may be located in a first preset area and a second preset area of the same hue, respectively; when the background layer color and the foreground layer color belong to different hues, the hue of the foreground layer color is located within a preset color gamut of the hue of the background layer color. The first preset area and the second preset area may be areas formed by different saturation values and/or brightness values within the same hue (for example, the first preset area and the second preset area shown in fig. 7 (a) and (b)).
In some embodiments, when the first preset area is smaller than the second preset area, the saturation value S and the brightness value B of the second preset area may satisfy the mathematical formula (2):
S≥B≥202/(S-100)+100,(100/8≤S≤100) (2)
For example, as shown in fig. 7 (a), the first preset region corresponds to the light gray rectangular region in the same hue, the second preset region corresponds to the light gray irregular triangular region in the same hue, and when the background layer color is located in the first preset region, the foreground layer color can be determined from the second preset region.
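For illustration, equation (2) can be checked directly as a membership test; the handling of the S = 100 edge, where the fraction is undefined, is our assumption.

```python
def in_second_preset_area(s: float, b: float) -> bool:
    """Equation (2): S >= B >= 202/(S - 100) + 100, with 100/8 <= S <= 100."""
    if not (100 / 8 <= s <= 100):
        return False
    if s == 100:                      # the lower bound tends to minus infinity here
        return b <= s
    return s >= b >= 202 / (s - 100) + 100
```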
In some embodiments, when the first preset area is greater than the second preset area, the saturation value S and the brightness value B of the second preset area may satisfy the mathematical formula (3):
For example, as shown in fig. 7 (b), the first preset region corresponds to the light gray irregular triangular region of the same hue, the second preset region corresponds to the light gray rectangular region of the same hue, and when the background layer color is located in the first preset region, the foreground layer color can be determined from the second preset region.
In some embodiments, the preset gamut may be the 60-degree adjacent hues of the hue of the background layer color (e.g., the 60-degree adjacent hues of the base color shown in fig. 8 (a)). In some embodiments, the preset gamut may be adjacent hues within other numerical ranges of the hue of the background layer color (e.g., 15-degree adjacent hues, 30-degree adjacent hues, 90-degree adjacent hues, 120-degree adjacent hues, 180-degree adjacent hues, etc.), which is not limited in this specification.
By way of example only, as shown in fig. 8 (b), where one sector represents one preset color gamut, the background layer color and the foreground layer color may be different colors in the same sector, and the processing device 110 may determine the foreground layer color using equations (4) and (5) based on the background layer color:
B2-100≥202/(S2-100),(0<S2≤100,0<B2≤100) (4)
H-60≤H2≤H+60,(0≤H≤360) (5)
Wherein B2, S2, and H2 respectively represent a brightness value, a saturation value, and a hue value corresponding to the color of the foreground layer, and H represents a hue value corresponding to the color of the background layer.
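As a hedged sketch, a foreground layer color satisfying equations (4) and (5) could be rejection-sampled as follows; wrapping the hue neighborhood modulo 360 is our assumption.

```python
import random

def pick_foreground_color(h: float) -> tuple[float, float, float]:
    """Sample (H2, S2, B2) satisfying equations (4) and (5) for background hue H."""
    h2 = (h + random.uniform(-60, 60)) % 360             # equation (5), wrapped at 360
    while True:                                          # rejection-sample equation (4);
        s2 = random.uniform(1, 100)                      # the constraint keeps B2 near
        b2 = random.uniform(1, 100)                      # the top of the range for low S2
        if s2 >= 100 or b2 - 100 >= 202 / (s2 - 100):
            return h2, s2, b2
```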
In some embodiments, the foreground layer color may be determined based on the background color of the background layer. In some embodiments, the foreground layer color may be determined based on the gradient color of the background layer.
In some embodiments, a background layer (e.g., background layer 152) of the target image may be generated based on the background layer color, and a foreground layer (e.g., foreground layer 151) of the target image may be generated based on the foreground layer color. In some embodiments, a target image (e.g., target image 150) may be generated based on the background layer and the foreground layer. For more details, see fig. 9 and the related description thereof, and are not repeated here.
It should be noted that the above description of the process 400 is for purposes of illustration and description only, and is not intended to limit the scope of applicability of the present disclosure. Various modifications and changes to flow 400 will be apparent to those skilled in the art in light of the present description. For example, in step 430, the foreground layer color may be determined based on the color of the content information and the background layer color, or the foreground layer color may be determined based on the color system information and the content information. However, such modifications and variations are still within the scope of the present description.
Fig. 9 is a flow diagram of an exemplary image generation method according to some embodiments of the present specification.
In some embodiments, the process 900 may be performed by the image color matching system 100 (e.g., the processing device 110 or the terminal device 120) or the image generation system 300. For example, the flow 900 may be stored in a storage device (e.g., the storage device 130, a memory unit of a system) in the form of a program or instructions that, when executed by a processor or module shown in fig. 3, may implement the flow 900. As an example only, as shown in fig. 9, after the processing device 110 acquires image information from the terminal device 120 or the storage device 130, the background layer color and the foreground layer color of the target image may be determined based on the image information, and then the target image may be generated based on the background layer color and the foreground layer color. Specifically:
At step 910, image information is acquired. In some embodiments, step 910 may be performed by processing device 110, terminal device 120, first acquisition module 210, or second acquisition module 320.
The image information may include the first image information and the second image information. In some embodiments, the second image information may include at least one of an image theme, a size, a style, and the like. The image theme may reflect the promotional purpose of the target image (e.g., an end-of-year promotional campaign, product promotion, public welfare promotion, etc.). The size may reflect the dimensions of the target image. The style may reflect the application scenario of the target image (e.g., finance, science and technology, etc.). For example, finance may include, but is not limited to, stocks, funds, concepts (e.g., electric vehicles), etc., and science and technology may include, but is not limited to, aerospace, navigation, etc.
In some embodiments, the image information may be acquired by a terminal device (e.g., terminal device 120). For example, the user may input the first image information and/or the second image information through the terminal device 120, and the terminal device 120 may transmit the related information to the processing device 110. In some embodiments, the first image information and the second image information may be acquired simultaneously or separately.
In step 920, the background layer color of the target image is determined. In some embodiments, step 920 may be performed by processing device 110, terminal device 120, color determination module 220, or image color matching module 310.
In some embodiments, the background layer color may be determined based on image information. For example, the processing device 110 may determine a background layer color based on the first image information. For more details on determining the background layer color, see fig. 4 and the related description thereof, which are not repeated here.
In step 930, a background layer of the target image is generated. In some embodiments, step 930 may be performed by processing device 110, terminal device 120, or image generation module 330.
In some embodiments, the background layer of the target image may be generated based on the background layer color. In some embodiments, one or more background layers may be generated. In some embodiments, the generated background layers may include one or more of monochromatic background layers, single-color gradient background layers, cross-color gradient background layers, and the like. For example, the processing device 110 may generate 3 monochromatic background layers; or 3 monochromatic background layers and 5 single-color gradient background layers; or 3 monochromatic background layers and 2 cross-color gradient background layers; or 10 monochromatic background layers, 10 single-color gradient background layers, and 10 cross-color gradient background layers; etc. In some embodiments, the number of background layers and/or the color categories (e.g., monochromatic and gradient colors) may be determined based on the image information. For example, a corresponding number of background layers may be generated based on the number of target images input by the user.
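For instance, monochromatic and gradient background layers could be rasterized as in the sketch below, with Pillow assumed as the rendering backend; the sizes and colors are illustrative only.

```python
from PIL import Image

def gradient_background(size: tuple[int, int], top_rgb, bottom_rgb) -> Image.Image:
    """Render a vertical linear gradient background layer from top_rgb to bottom_rgb."""
    w, h = size
    img = Image.new("RGB", size)
    for y in range(h):
        t = y / max(h - 1, 1)   # interpolation factor from 0 (top) to 1 (bottom)
        row = tuple(round(a + (b - a) * t) for a, b in zip(top_rgb, bottom_rgb))
        img.paste(row, (0, y, w, y + 1))
    return img

solid = Image.new("RGB", (600, 400), (220, 40, 40))                  # monochromatic layer
grad = gradient_background((600, 400), (220, 40, 40), (120, 0, 0))   # gradient layer
```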
In some embodiments, a background layer of the target image may be generated based on the image information (e.g., the second image information) and the background layer color. For example, a background layer of a corresponding size may be generated based on the image size and the background layer color. In some embodiments, the background layer may be post-processed (e.g., a background texture added) to generate the background layer of the corresponding target image.
Step 940, a foreground layer color of the target image is determined. In some embodiments, step 940 may be performed by processing device 110, terminal device 120, color matching module 230, or image color matching module 310.
In some embodiments, the foreground layer color of the target image may be determined based on the background layer color. In some embodiments, the foreground layer color of the target image may be determined based on the image information (e.g., the first image information) and the background layer color. For more details on determining the foreground layer color, see fig. 4 and its related description, which are not repeated here.
Step 950, obtaining foreground layer material from the database. In some embodiments, step 950 may be performed by the processing device 110, the terminal device 120, or the second acquisition module 320.
In some embodiments, the foreground layer material may include one or more of artwork, labels, buttons, and the like. In some embodiments, the foreground layer material may be retrieved from a database (e.g., storage device 130) based on the foreground layer color. For example, buttons of corresponding colors may be retrieved from the storage device 130 based on the foreground layer color. In some embodiments, foreground layer material may be retrieved from a database (e.g., storage device 130) based on the image information. For example, the processing device 110 may obtain the material of the artwork, buttons, labels, etc. of the corresponding color system from the storage device 130 based on the color system information and the content information. In some embodiments, step 950 may be omitted.
In step 960, a foreground layer of the target image is generated. In some embodiments, step 960 may be performed by processing device 110, terminal device 120, or image generation module 330.
In some embodiments, a foreground layer of the target image may be generated based on the foreground layer color. In some embodiments, a corresponding number of foreground layers may be generated based on the number of background layers. In some embodiments, a foreground layer of the target image may be generated based on the image information (e.g., the second image information) and the foreground layer color. For example, a foreground layer may be generated based on the content information, theme, style, and foreground layer color. In some embodiments, the foreground layer of the target image may be generated based on preset layout rules. For example, the processing device 110 may lay out content information such as text, illustrations, and buttons in the foreground layer based on preset layout rules, so as to generate a corresponding foreground layer.
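A foreground layer carrying text content might be rendered as in the following sketch; the coordinates, the default bitmap font, and the transparent RGBA canvas are illustrative assumptions, not the preset layout rules themselves.

```python
from PIL import Image, ImageDraw

def render_foreground(size: tuple[int, int], text: str, rgb) -> Image.Image:
    """Draw text content onto a transparent foreground layer."""
    layer = Image.new("RGBA", size, (0, 0, 0, 0))    # fully transparent canvas
    draw = ImageDraw.Draw(layer)
    draw.text((40, 40), text, fill=rgb + (255,))     # Pillow's default bitmap font
    return layer

foreground = render_foreground((600, 400), "Organic, healthy", (255, 255, 255))
```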
In step 970, a target image is generated. In some embodiments, step 970 may be performed by processing device 110, terminal device 120, or image generation module 330.
In some embodiments, the target image may be generated based on the foreground layer and the background layer. In some embodiments, the target image may be displayed at a terminal device (e.g., terminal device 120) for output to a user. In some embodiments, the target image may be adjusted based on the feedback information. For example, parameters such as layout, color, size, style, and the like of the target image may be adjusted based on the user's filtering record, editing record, and the like of the target image.
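Compositing the two layers into the target image can then be a single alpha composite, as in this self-contained sketch; the placeholder layer contents and the output path are hypothetical.

```python
from PIL import Image, ImageDraw

# Placeholder layers standing in for the outputs of the previous steps.
background = Image.new("RGBA", (600, 400), (220, 40, 40, 255))
foreground = Image.new("RGBA", (600, 400), (0, 0, 0, 0))
ImageDraw.Draw(foreground).text((40, 40), "Organic, healthy", fill=(255, 255, 255, 255))

target = Image.alpha_composite(background, foreground)   # foreground over background
target.convert("RGB").save("target_image.png")           # hypothetical output path
```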
It should be noted that the above description of the process 900 is for illustration and description only, and is not intended to limit the scope of the application of the present disclosure. Various modifications and changes to flow 900 will be apparent to those skilled in the art in light of the present description. However, such modifications and variations are still within the scope of the present description.
The image color matching and image generation methods provided by some embodiments of the present specification may: (1) improve the rationality of the color matching between the background layer and the foreground layer by determining the background layer color based on image information and determining the foreground layer color according to the background layer color; (2) make the colors of the foreground layer and the background layer better conform to the principles of human vision by determining hue values, saturation values, and brightness values; and (3) improve the aesthetic effect, generation success rate, and generation efficiency of the target image by generating the corresponding target image based on the determined background layer color and foreground layer color.
While the basic concepts have been described above, it will be apparent to those skilled in the art that the foregoing detailed disclosure is by way of example only and is not intended to be limiting. Although not explicitly described herein, various modifications, improvements, and adaptations of the present specification may occur to those skilled in the art. Such modifications, improvements, and adaptations are intended to be suggested by this specification, and therefore fall within the spirit and scope of the exemplary embodiments of this specification.
Meanwhile, the specification uses specific words to describe the embodiments of the specification. Reference to "one embodiment," "an embodiment," and/or "some embodiments" means that a particular feature, structure, or characteristic is associated with at least one embodiment of the present description. Thus, it should be emphasized and should be appreciated that two or more references to "an embodiment" or "one embodiment" or "an alternative embodiment" in various positions in this specification are not necessarily referring to the same embodiment. Furthermore, certain features, structures, or characteristics of one or more embodiments of the present description may be combined as suitable.
Furthermore, the order in which the elements and sequences are processed, the use of numerical letters, or other designations in the description are not intended to limit the order in which the processes and methods of the description are performed unless explicitly recited in the claims. While certain presently useful inventive embodiments have been discussed in the foregoing disclosure, by way of various examples, it is to be understood that such details are merely illustrative and that the appended claims are not limited to the disclosed embodiments, but, on the contrary, are intended to cover all modifications and equivalent arrangements included within the spirit and scope of the embodiments of the present disclosure. For example, while the system components described above may be implemented by hardware devices, they may also be implemented solely by software solutions, such as installing the described system on an existing server or mobile device.
Likewise, it should be noted that in order to simplify the presentation disclosed in this specification and thereby aid in the understanding of one or more inventive embodiments, various features are sometimes grouped together in a single embodiment, figure, or description thereof. This method of disclosure, however, does not imply that the subject matter of the present specification requires more features than are recited in the claims. Indeed, claimed subject matter may lie in less than all features of a single embodiment disclosed above.
In some embodiments, numbers describing quantities of components and attributes are used. It should be understood that such numbers used in the description of embodiments are modified in some examples by the terms "about," "approximately," or "substantially." Unless otherwise indicated, "about," "approximately," or "substantially" indicates that the stated number allows a variation of ±20%. Accordingly, in some embodiments, the numerical parameters set forth in the specification and claims are approximations that may vary depending upon the desired properties of individual embodiments. In some embodiments, the numerical parameters should take into account the specified significant digits and employ a general method of preserving digits. Although the numerical ranges and parameters used to confirm the breadth of ranges in some embodiments of this specification are approximations, in particular embodiments such numerical values are set as precisely as practicable.
Each patent, patent application publication, and other material, such as articles, books, specifications, publications, documents, etc., referred to in this specification is incorporated herein by reference in its entirety. Excluded are application history documents that are inconsistent with or conflict with the content of this specification, as well as any documents (currently or later attached to this specification) that limit the broadest scope of the claims of this specification. It is noted that, if the description, definition, and/or use of a term in material attached to this specification is inconsistent with or conflicts with what is described in this specification, the description, definition, and/or use of the term in this specification controls.
Finally, it should be understood that the embodiments described in this specification are merely illustrative of the principles of the embodiments of this specification. Other variations are possible within the scope of this description. Thus, by way of example, and not limitation, alternative configurations of embodiments of the present specification may be considered as consistent with the teachings of the present specification. Accordingly, the embodiments of the present specification are not limited to only the embodiments explicitly described and depicted in the present specification.