US8553045B2 - System and method for image color transfer based on target concepts - Google Patents
- Publication number
- US8553045B2 (Application No. US 12/890,049)
- Authority
- US
- United States
- Prior art keywords
- concept
- image
- color
- colors
- palette
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active, expires
Classifications
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G5/00—Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
- G09G5/02—Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators characterised by the way in which colour is displayed
- G09G5/06—Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators characterised by the way in which colour is displayed using colour palettes, e.g. look-up tables
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N7/00—Computing arrangements based on specific mathematical models
- G06N7/01—Probabilistic graphical models, e.g. probabilistic networks
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G2320/00—Control of display operating conditions
- G09G2320/06—Adjustment of display parameters
- G09G2320/0606—Manual adjustment
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G2320/00—Control of display operating conditions
- G09G2320/06—Adjustment of display parameters
- G09G2320/0666—Adjustment of display parameters for control of colour parameters, e.g. colour temperature
Definitions
- The exemplary embodiment relates to image processing, image presentation, photofinishing, and related arts. It finds particular application in the modification of the colors of an image to match a selected concept.
- The colors of an image can convey an abstract concept to a viewer.
- An abstract concept can be a particular mood, such as romantic, calm, or the like, which is distinct from the content of the image.
- Designers sometimes make color adjustments to an image to convey a particular mood.
- Image processing software provides a wide range of color adjustments, such as color balance, hue, saturation, and intensity, typically with fine control such as independent adjustment capability for the various channels (e.g., the red, green, and blue channels in an RGB color space).
- Such manipulation of image colors can be difficult and time-consuming.
- The exemplary embodiment provides a system and method for modifying the colors of an image by selecting a concept in natural language, thus making the task much simpler for unskilled users.
- U.S. Pub. No. 20090231355, published Sep. 17, 2009, entitled COLOR TRANSFER BETWEEN IMAGES THROUGH COLOR PALETTE ADAPTATION, by Florent Perronnin, discloses an image adjustment method. This includes adapting a universal palette to generate input image and reference image palettes statistically representative of pixels of input and reference images. Some of the pixels of the input image are adjusted to generate adjusted pixels that are statistically represented by the reference image palette.
- Color palettes are also disclosed, for example, in U.S. Pub. No. 20030146925, published Aug. 7, 2003, entitled GENERATING AND USING A COLOR PALETTE, by Jun Zhao, et al.; U.S. Pub. No. 20060066629, published Mar. 30, 2006, entitled SYSTEM AND METHOD FOR COLOR SELECTION, by Rebecca Norlander, et al.; U.S. Pat. No. 5,642,137, issued Jun. 24, 1997, entitled COLOR SELECTING METHOD, by Kitazumi; and U.S. Pub. No. 20040164991, published Aug. 26, 2004, entitled COLOR PALETTE PROVIDING CROSS-PLATFORM CONSISTENCY, by Brian Rose.
- A method for color transfer includes retrieving, from computer memory, a concept color palette corresponding to a user-selected concept, the concept color palette comprising a first set of colors.
- An image color palette for an input image is computed.
- The image color palette includes a second set of colors that are representative of pixels of the input image.
- Colors of the image color palette are mapped to colors of the concept color palette to identify, for colors of the image color palette, a corresponding color in the concept color palette.
- A transformation is computed based on the mapping. For pixels of the input image, modified color values are computed, based on the computed transformation, to generate a modified image.
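The mapping step above can be sketched as follows. This is a minimal illustration that compares colors by Euclidean distance in the (a, b) chroma plane; the metric and the toy palette values are assumptions, not taken from the patent.

```python
import numpy as np

def map_palettes(image_palette, concept_palette):
    """For each image-palette color, find the closest concept-palette color.

    Both palettes are arrays of shape (n, 2) holding (a, b) chroma values.
    Euclidean distance is an assumption; the patent does not fix the metric.
    """
    image_palette = np.asarray(image_palette, dtype=float)
    concept_palette = np.asarray(concept_palette, dtype=float)
    # Pairwise distance between every image color and every concept color.
    d = np.linalg.norm(image_palette[:, None, :] - concept_palette[None, :, :],
                       axis=2)
    # Index of the nearest concept color for each image color.
    return d.argmin(axis=1)

# Toy example with hypothetical (a, b) values:
img = [(10.0, 20.0), (-30.0, 5.0)]
concept = [(12.0, 18.0), (-25.0, 0.0), (50.0, 50.0)]
print(map_palettes(img, concept))  # prints [0 1]
```

The returned indices define the correspondence: concept color 0 stands in for the first image color, and concept color 1 for the second.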
- An image adjustment apparatus includes memory which stores a set of concept palettes, each concept palette comprising a respective set of colors and being associated in the memory with a respective natural language description. Instructions are stored in memory for retrieving one of the concept palettes from the memory corresponding to a user-selected concept, the concept palette comprising a first set of colors; receiving a user-selected input image and computing an image color palette for the input image, the image color palette comprising a second set of colors; mapping colors of the image color palette to colors of the concept color palette to identify, for each of the colors of the image color palette, a corresponding color in the concept color palette; computing a transformation based on the mapping; and, for pixels of the input image, computing modified color values based on the computed transformation to transform the input image.
- A processor in communication with the memory executes the instructions.
- A graphical user interface is implemented by a processor executing instructions stored in memory.
- The graphical user interface is configured for displaying an input image and a set of concepts for selection of one of the concepts by a user.
- The graphical user interface is further configured for displaying a modified image whose colors have been transformed based on a stored concept palette corresponding to the user-selected one of the concepts.
- FIG. 1 is a schematic diagram illustrating aspects of a system and method for color transfer in accordance with the exemplary embodiment;
- FIG. 2 is a screen shot illustrating a graphical user interface displaying an image, image palette, concept selector, and concept palette for a selected concept;
- FIG. 3 is a screen shot illustrating the graphical user interface of FIG. 2 displaying a transferred image and modified image palette;
- FIG. 4 is a flow chart illustrating a method for image color transfer in accordance with one aspect of the exemplary embodiment;
- FIG. 5 is a functional block diagram of a system for image color transfer in accordance with another aspect of the exemplary embodiment; and
- FIG. 6 illustrates generation of an image palette through Gaussian Mixture Model (GMM) adaptation.
- Aspects of the exemplary embodiment relate to a system and a method for concept-based color transfer.
- Concept-based transfer is an alteration of the colors of an input image. The operation is performed using, as a reference, a pre-computed color model related to the target concept.
- The term “concept” is used to refer to emotions, moods, and any other aesthetic concepts.
- A concept can be, for example, “romantic”, “adult”, “serene”, etc.
- The exemplary method is particularly useful for unskilled users who have to design graphic-rich documents while adhering to a particular theme.
- A user may be fully aware of the message he or she wants to convey, but uncertain about the colors to use for a document or image.
- The exemplary system and method provide for a change in the chromatic appearance of an image or document according to a specific concept.
- The concepts are represented through color models in the form of color palettes.
- A “color palette,” as used herein, is a limited set of different colors, which may be displayed as an ordered or unordered sequence of swatches, one for each color in the set.
- A “predefined color palette” is a color palette stored in memory. The colors in a predefined color palette may have been selected by a graphic designer, or other skilled artisan working with color, to harmonize with each other when used in various combinations. The predefined color palettes may each have the same number, or different numbers, of visually distinguishable colors.
- The predefined color palettes each include at least two colors, such as at least three colors, e.g., up to ten colors, such as three, four, five, or six different colors. These colors are manually selected, in combination, to express a particular aesthetic concept. It is not necessary for the predefined color palettes to all have the same number of swatches.
- Predefined color palettes of interest herein are generally those which are tagged with a corresponding label related to a concept.
- A “concept palette” is a color palette which has been derived from a set of predefined color palettes that have been associated with a target concept. As such, the colors of the concept palette are automatically, rather than manually, generated.
- The concept palette may include, for example, at least five or at least eight colors and, in one embodiment, sixteen colors.
- The concept palette for each predefined concept has no more than 50 colors.
- A concept palette includes substantially fewer colors than the total of all the colors of the predefined color palettes from which it is derived.
- An “image palette” refers to a color palette which has been extracted from a digital image, based on the colors present in the digital image.
- The exemplary image palette is statistically representative of colors in the image.
- Each image palette has the same number of colors.
- The extracted color palette may have at least four and, in one embodiment, at least five colors, such as eight or sixteen colors.
- The image palette has no more than 50 colors.
- A digital photographic image typically has far more different colors, and the extracted color palette includes many fewer colors than the digital image from which it is extracted.
- A concept palette has at least as many colors as an image palette.
- A “user” can be any person or persons having access to a database of digital images, such as a graphic designer, amateur photographer, or the like.
- The term “color” is used to refer to any aspect of image pixels, including, but not limited to, absolute color values, such as hue, chroma, and lightness, and relative color values, such as differences in hue, chroma, and lightness.
- Color can be expressed, for example, in a two-, three-, or more-coordinate color space, such as RGB, L*a*b*, YCbCr, Luv, XYZ, CMYK, etc.
- Color, as used herein, is not limited to chromatic colors but also includes black and white.
- The “color” may be characterized by one, two, or all three of the red, green, and blue pixel coordinates in an RGB color space representation, or by one, two, or all three of the L, a, and b pixel coordinates in a Lab color space representation.
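As an illustration of moving between the coordinate spaces mentioned above, the standard sRGB → XYZ → L*a*b* conversion (D65 white point) can be written as follows; this is generic colorimetry, not a formula taken from the patent:

```python
def srgb_to_lab(r, g, b):
    """Convert an sRGB color (components in [0, 1]) to CIE L*a*b* (D65 white).

    Standard formulas: undo the sRGB gamma, go through CIE XYZ, then apply
    the Lab cube-root compression.
    """
    # Undo the sRGB gamma to obtain linear RGB.
    def lin(c):
        return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4
    r, g, b = lin(r), lin(g), lin(b)
    # Linear RGB to CIE XYZ (sRGB primaries, D65 white point).
    x = 0.4124 * r + 0.3576 * g + 0.1805 * b
    y = 0.2126 * r + 0.7152 * g + 0.0722 * b
    z = 0.0193 * r + 0.1192 * g + 0.9505 * b
    # Normalize by the D65 reference white.
    xn, yn, zn = 0.95047, 1.0, 1.08883
    def f(t):
        return t ** (1 / 3) if t > (6 / 29) ** 3 else t / (3 * (6 / 29) ** 2) + 4 / 29
    fx, fy, fz = f(x / xn), f(y / yn), f(z / zn)
    return 116 * fy - 16, 500 * (fx - fy), 200 * (fy - fz)
```

For example, sRGB white (1, 1, 1) maps to approximately L* = 100 with a* ≈ 0 and b* ≈ 0, i.e., a bright achromatic color.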
- Although the color adjustment techniques are described herein with illustrative reference to two-dimensional images such as photographs or video frames, it is to be appreciated that these techniques are readily applied to three-dimensional images as well.
- A “digital image” can be in any convenient file format, such as JPEG, Graphics Interchange Format (GIF), JBIG, Windows Bitmap Format (BMP), Tagged Image File Format (TIFF), JPEG File Interchange Format (JFIF), Delrin Winfax, PCX, Portable Network Graphics (PNG), DCX, G3, G4, G3 2D, Computer Aided Acquisition and Logistics Support Raster Format (CALS), Electronic Arts Interchange File Format (IFF), IOCA, PCD, IGF, ICO, Mixed Object Document Content Architecture (MO:DCA), Windows Metafile Format (WMF), ATT, BRK, CLP, LV, GX2, IMG (GEM), IMG (Xerox), IMT, KFX, FLE, MAC, MSP, NCR, Portable Bitmap (PBM), Portable Greymap (PGM), SUN, PNM, Portable Pixmap (PPM), Adobe Photoshop (PSD), Sun Rasterfile (RAS), SGI, X BitMap (XBM), X PixMap (XPM), X Window Dump (XWD), AFX, Imara, Exif, WordPerfect Graphics Metafile (WPG), Macintosh Picture (PICT), Encapsulated PostScript (EPS), or other common file format used for images, which may optionally be converted to another suitable format prior to processing.
- Digital images may be individual photographs, graphics, video images, or combinations thereof.
- Each input digital image includes image data for an array of pixels forming the image.
- The image data may include colorant values (which may be referred to herein as grayscale values) for each of a set of at least three color separations, such as RGB, or be expressed in another color space in which different colors can be represented.
- The term “grayscale” refers to the optical density value of any single color channel, however expressed (RGB, YCbCr, L*a*b*, etc.).
- The exemplary embodiment is not intended for black and white (monochrome) images, although it could be modified to allow incorporation of such images.
- A reduced pixel resolution version (“thumbnail”) of a stored digital image may be used, which, for convenience of description, is considered to be the image.
- An offline stage includes inputting a set 10 of predefined color palettes 12, 14, etc., that are associated with a given concept 16, such as “playful,” to a palette training component 18.
- For example, at least 10 or at least 20 predefined color palettes per concept are used in the concept palette learning. There may be, for example, at least 5 or at least 10 predefined concepts, such as up to 100 concepts, each with an associated natural language description.
- A number N of colors 20 may also be input to the palette training component 18 (or established by default).
- The training component 18 outputs a concept palette 22 with N colors, based on the set 10 of predefined color palettes.
- This training step is repeated for each of the set of concepts 16, with a different set of color palettes for each concept.
- The result is a set of concept palettes 22, stored in memory such as the illustrated palettes database 24, where each concept palette is linked to its respective concept 16 or a reference thereto.
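The offline stage just described can be sketched as a loop over the tagged palette sets. `reduce_to_n_colors` is a hypothetical stand-in for the GMM-based palette learning; it is passed in as a callable so the sketch stays self-contained:

```python
def build_concept_palette_db(tagged_palettes, n_colors, reduce_to_n_colors):
    """Build a database mapping each concept label to its concept palette.

    tagged_palettes: dict mapping a concept label (e.g. "playful") to the
    list of predefined palettes tagged with it, each palette being a list
    of (a, b) swatch colors.
    reduce_to_n_colors: hypothetical callable reducing the pooled swatches
    to n_colors representative colors (the learning step, e.g. a GMM fit).
    """
    db = {}
    for concept, palettes in tagged_palettes.items():
        # Pool the swatches of every palette tagged with this concept.
        swatches = [color for palette in palettes for color in palette]
        db[concept] = reduce_to_n_colors(swatches, n_colors)
    return db
```

At run time, the palette for a user-selected concept is then a simple lookup by its natural-language label.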
- A user selects an input image 26, which may be the user's own image or one selected from a database created by other users, and selects a target concept 16 from the set of concepts.
- A concept palette retrieval component 28 retrieves the concept palette 22 from the database 24 which matches the selected concept 16.
- An image palette extractor 30 extracts an image palette 32 from the selected image 26.
- A palette mapping component 34 maps the image palette 32 to the retrieved concept palette 22 to identify a mapped concept palette 36.
- The mapped concept palette 36 includes some or all of the colors in the retrieved concept palette 22, which may be reordered to best match the colors of the image palette 32.
- A color transfer component 38 transfers colors to the image based on the colors of pixels in the input image 26, the mapped concept palette 36, and optionally a degree of transfer 40.
- The output is a modified image 42, in which colors of the input image 26 have been modified (“transferred”), based on the mapped concept palette 36 and optionally the degree of transfer 40.
- A modified image palette 44, derived from the modified image 42 by the image palette extractor 30, may also be output. For example, all of the colors, or the most dominant of the colors, in the modified image palette 44 may be displayed together with the modified image 42.
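The transfer itself can be sketched as below. This simplified version hard-assigns each pixel to its nearest image-palette color and shifts it by that color's offset to the mapped concept color; the hard assignment and the Euclidean metric are simplifying assumptions (a GMM-weighted transfer would blend the offsets softly), and the degree of transfer 40 appears as the blending factor `degree`:

```python
import numpy as np

def transfer_colors(pixels_ab, image_palette, mapped_concept, degree=1.0):
    """Shift each pixel's (a, b) values toward its mapped concept color.

    Simplified hard-assignment sketch: each pixel is assigned to its
    nearest image-palette color and shifted by that color's offset to the
    corresponding mapped concept color, scaled by degree in [0, 1]. The
    patent's transfer component weights the shift probabilistically.
    """
    pixels_ab = np.asarray(pixels_ab, dtype=float)
    image_palette = np.asarray(image_palette, dtype=float)
    mapped_concept = np.asarray(mapped_concept, dtype=float)
    # Nearest image-palette color for each pixel.
    d = np.linalg.norm(pixels_ab[:, None, :] - image_palette[None, :, :],
                       axis=2)
    idx = d.argmin(axis=1)
    # Offset from each image-palette color to its mapped concept color.
    offsets = mapped_concept - image_palette
    return pixels_ab + degree * offsets[idx]
```

With degree = 0 the image is unchanged; with degree = 1 each palette color's full offset is applied.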
- The colors of the palettes 12, 14, 22, 36, 44 in FIG. 1 are shown in black and white with different shading to indicate the different colors. Additionally, while each palette is shown with up to eight colors, it is to be appreciated that the palettes may include more colors, as previously discussed. Moreover, the palettes may be stored in computer memory as color values in any suitable color space, such as their a and b channel values in the Lab color space, and may be converted to another color space, such as RGB, for display on a screen, as illustrated graphically in the exemplary screenshots 50, 52 shown in FIGS. 2 and 3.
- The method takes as input an image 26 and a target concept 16 and modifies the image pixels based on colors of a corresponding concept palette 22.
- The method begins at S100.
- A concept palette 22 is learned for each aesthetic concept 16 in a set of concepts, each concept having a natural language description.
- The concept palettes are each learned from a corresponding set of predefined palettes 12, 14, etc.
- The concept palettes 22 are stored in memory. Once this step is complete, the predefined palettes 12, 14 are no longer needed.
- An image 26 to be modified is input.
- Provision is made, e.g., via a graphical user interface (FIG. 2), for a user to select an image to be modified in accordance with the exemplary method.
- An image 26 may be selected from a user's collection or a database of images.
- The displayed concepts may be all or only a computed subset of the available concepts.
- The user may also be provided with the opportunity to select a degree of transfer, at this time or later.
- A concept palette 22 corresponding to the target concept 16 is retrieved.
- The concept palette may be displayed to the user.
- The user may be permitted to modify the displayed concept palette, for example, by adding, deleting, or adjusting the colors in the concept palette.
- An image palette 32 is extracted from the input image 26.
- The image palette may be learned solely from the image pixels (S110A) or by adapting the retrieved concept palette 22 (S110B).
- Mapping (S112) and image palette extraction (S110B) may be accomplished at the same time, as described in greater detail below.
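A single step of the palette adaptation at S110B might look like the following MAP-style mean update, in which pixels are softly assigned to the concept-palette colors and each color is pulled toward the data. The shared spherical variance `sigma`, equal mixture weights, and relevance factor `tau` are assumptions made to keep the sketch small:

```python
import numpy as np

def adapt_palette_means(pixels_ab, concept_means, sigma=10.0, tau=8.0):
    """One MAP-style adaptation step of concept-palette means toward an image.

    Each concept color is treated as a Gaussian with shared spherical
    variance sigma**2 and equal weight (both assumptions). Pixels are
    softly assigned to the Gaussians; each mean is then interpolated
    between the posterior-weighted pixel average and the prior (concept)
    mean, with relevance factor tau controlling how much is retained.
    """
    X = np.asarray(pixels_ab, dtype=float)
    mu = np.asarray(concept_means, dtype=float)
    # Posterior responsibility of each concept color for each pixel.
    d2 = ((X[:, None, :] - mu[None, :, :]) ** 2).sum(axis=2)
    logp = -d2 / (2.0 * sigma ** 2)
    p = np.exp(logp - logp.max(axis=1, keepdims=True))
    p /= p.sum(axis=1, keepdims=True)      # shape (n_pixels, n_colors)
    n_k = p.sum(axis=0)                    # soft pixel count per color
    ex_k = p.T @ X                         # posterior-weighted color sums
    # MAP interpolation between the data mean and the prior mean.
    return (ex_k + tau * mu) / (n_k + tau)[:, None]
```

Colors that attract little pixel mass stay close to their concept values, which keeps the adapted image palette in correspondence with the concept palette.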
- The colors of the mapped concept palette 36 are transferred to the input image 26.
- The modified image is output, e.g., visualized on a screen (FIG. 3).
- The modified image palette 44 extracted from the modified image may also be displayed.
- The modified image, or a document containing it, may be output to another image processing component for further processing, output to a printer, or the like.
- The method ends at S118.
- The method illustrated in FIG. 4 may be implemented in a computer program product that may be executed on a computer.
- The computer program product may comprise a non-transitory, tangible computer-readable recording medium on which a control program is recorded, such as a disk, hard drive, or the like.
- Non-transitory computer-readable media include, for example, floppy disks, flexible disks, hard disks, magnetic tape or any other magnetic storage medium, CD-ROM, DVD, or any other optical medium, a RAM, a PROM, an EPROM, a FLASH-EPROM, or other memory chip or cartridge, or any other tangible medium from which a computer can read and use.
- Alternatively, the method may be implemented in transitory media, such as a transmittable carrier wave in which the control program is embodied as a data signal, using transmission media such as acoustic or light waves, including those generated during radio wave and infrared data communications, and the like.
- The exemplary method may be implemented on one or more general purpose computers, special purpose computer(s), a programmed microprocessor or microcontroller and peripheral integrated circuit elements, an ASIC or other integrated circuit, a digital signal processor, a hardwired electronic or logic circuit such as a discrete element circuit, a programmable logic device such as a PLD, PLA, FPGA, graphics processing unit (GPU), or PAL, or the like.
- In general, any device capable of implementing a finite state machine that is in turn capable of implementing the flowchart shown in FIG. 4 can be used to implement the exemplary method.
- The method can be implemented using a computer system 100 that is communicatively linked to a database 102 of digital images 26, to the database 24 of concept palettes 22, and optionally to a database 104 from which the sets 10 of predefined color palettes are obtained.
- These databases 24, 102 may be resident in the computer system 100 or stored in memory accessible thereto.
- The colors of the image can be modeled by the system 100 using a Gaussian mixture model (GMM).
- The exemplary GMM is composed of M Gaussians to provide an extracted image color palette 32, where M is the number of colors in the extracted image color palette 32 (16 in the exemplary embodiment).
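A rough stand-in for this extraction step is k-means clustering of the pixels' (a, b) values, which is the hard-assignment limit of EM for a spherical GMM; the deterministic initialization from the first `n_colors` pixels is a further simplification for the sketch:

```python
import numpy as np

def extract_image_palette(pixels_ab, n_colors=16, n_iter=20):
    """Cluster the pixels' (a, b) values; the cluster centers form the
    image palette, analogous to the means of the M fitted Gaussians."""
    X = np.asarray(pixels_ab, dtype=float)
    centers = X[:n_colors].copy()  # simplistic deterministic initialization
    for _ in range(n_iter):
        # Assign each pixel to its nearest current center.
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        # Move each center to the mean of its assigned pixels.
        for k in range(n_colors):
            if np.any(labels == k):
                centers[k] = X[labels == k].mean(axis=0)
    return centers
```

On an image whose chroma values form two tight clusters, two centers converge to the two cluster means, i.e., a two-color palette.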
- The selected image 26, extracted image palette 32, selected concept 16, and retrieved concept palette 22 may be temporarily stored in data memory 106 of the system 100.
- Main memory 108 of the system 100 stores instructions 110 for performing the exemplary method.
- The instructions are executed by a computer processor 112, such as the computer system's CPU.
- The instructions are illustrated as components 18, 28, 30, 34, 38, 114, 116, 118, which in the exemplary embodiment are in the form of software implemented by the processor 112. However, it is also contemplated that some or all of these components may be implemented as separate hardware components.
- The components include a concept palette training component 18, which generates a concept palette 22 for each concept.
- This component can be omitted once the concept palettes have been learned and stored, or it may be retained for developing new concept palettes as new concepts are added.
- An image retrieval component 114 may be provided for assisting a user to select an image from the database, e.g., based on user-input selection criteria.
- A concept computing component 116 computes a subset of the set of concepts for presentation to a user, which limits the user's choice of a target concept 16 to those in the subset.
- A concept palette retrieval component 28 retrieves a concept palette 22 from the database in response to the input target concept 16.
- An image palette extractor 30 extracts an image palette 32 from the selected image 26.
- This component may also serve to extract a modified image palette 44 from the color-transferred image 42.
- The palette mapping component 34 maps the image palette 32 to the retrieved concept palette 22 to identify a mapped concept palette 36.
- The mapping component may be integral with the image palette extractor, as described in further detail below.
- The color transfer component 38 computes, for each pixel of the image, its new color, based on its color values, the mapped concept palette, and the degree of transfer, to generate the color-transferred image 42.
- A visualization component 118, alone or in cooperation with software on a client computing device 120, generates a visualization of the modified image 42 and/or its extracted palette 44.
- The exemplary system 100 is in the form of a server computer which includes one or more input/output components 122, 124, such as a modem, for communicating with the client device 120 and databases 24, 102, e.g., via wired or wireless links 126, 128, 130, such as a local area network, wide area network, or the Internet.
- The various components 106, 110, 112, 122, 124 of the system 100 may be communicatively connected by a bus 132.
- The system 100 is hosted by a server computer which hosts a service provider's website, which is accessible via a user's web browser 134 on the client device 120.
- All or a portion of the system 100 may be resident on the user's computer 120.
- The computers 100, 120 may each comprise one or more computing devices, such as a PC, e.g., a desktop, a laptop, or palmtop computer, a portable digital assistant (PDA), a server computer, a cellular telephone, a digital camera, a pager, or other computing device capable of executing instructions for performing the exemplary method.
- The user's computer 120 may be similarly configured to the server computer 100, with memory, a processor, a system bus, and, in addition, a processor controlling a display screen 136 to display the GUI 50, 52 and receive user inputs, e.g., from one or more input device(s) 138, such as a cursor control device, keyboard, keypad, touch screen, joystick, voice activated control device, or combination thereof.
- The memory 106, 108 may be combined or separate and may represent any type of tangible computer readable medium, such as random access memory (RAM), read only memory (ROM), magnetic disk or tape, optical disk, flash memory, holographic memory, or combination thereof. In one embodiment, the memory 106, 108 comprises a combination of random access memory and read only memory. In some embodiments, the processor 112 and memory 106 and/or 108 may be combined in a single chip.
- The network interface(s) 122, 124 allow the computer 100 to communicate with other devices via a computer network, such as a local area network (LAN), wide area network (WAN), or the Internet, and may comprise a modulator/demodulator (MODEM). An analogous network interface (not shown) is provided on the user's computer 120.
- The digital processor 112 can be variously embodied, such as by a single-core processor, a dual-core processor (or, more generally, by a multiple-core processor), a digital processor and cooperating math coprocessor, a digital controller, or the like.
- The digital processor 112, in addition to controlling the operation of the computer 100, executes the instructions 110 stored in memory 108 for performing the method outlined in FIG. 4.
- The presented method can be integrated into a creative image search engine for retrieving images catering to graphic designers, as described, for example, in above-mentioned copending application Ser. No. 12/693,795.
- The system 100 generates a graphical user interface (GUI) on display screen 136 through which a user can select an image for modification.
- Concepts are displayed in a concept selector 140, here shown as a set of user-actuable areas of the screen, each identified by a respective concept label, which can be selected by clicking on the respective area of the screen using the cursor control device 138.
- The user can select an input image from a set of displayed images retrieved using keyword selectors.
- In FIG. 2, the user has selected an image with content “citylife” and then the concept “capricious.”
- The corresponding concept palette 22 is then retrieved and displayed and is used in the color transfer.
- A transfer selector 142 is displayed, which allows a user to variably select a degree of transfer between upper and lower limits, here shown as between 0 and 1, although the degree of transfer could be displayed in other ways, such as from “high” to “low.”
- FIG. 3 illustrates the output image 42 and its palette obtained by color transfer based on the “capricious” concept.
- the term “software,” as used herein, is intended to encompass any collection or set of instructions executable by a computer or other digital system so as to configure the computer or other digital system to perform the task that is the intent of the software.
- the term “software” as used herein is intended to encompass such instructions stored in storage medium such as RAM, a hard disk, optical disk, or so forth, and is also intended to encompass so-called “firmware” that is software stored on a ROM or so forth.
- Such software may be organized in various ways, and may include software components organized as libraries, Internet-based programs stored on a remote server or so forth, source code, interpretive code, object code, directly executable code, and so forth. It is contemplated that the software may invoke system-level code or calls to other software residing on a server or other location to perform certain functions.
- Exemplary concepts include emotions, moods, and other aesthetic concepts, such as capricious, classic, delicate, earthy, elegant, romantic, luscious, playful, robust, sensual, serene, spicy, spiritual, and warm (see, for example, the list of concepts in Eiseman, L., Pantone Guide to Communicating with Color, Graffix Press, Ltd., 2000) (hereinafter, “Eiseman”).
- a probabilistic representation of the concepts is learned by associating each concept, expressed in natural language, with a Gaussian Mixture Model (GMM). These models are referred to as concept palettes.
- the observation data for the GMM of a concept are color values in a perceptually coherent color space, such as CIE Lab or CIE Luv.
- the color values are the ab representations in the Lab color space of the swatches of all the palettes 12 , 14 , etc. associated with that concept 16 in the database 24 of palettes.
- the exemplary concept palettes are thus statistically representative of colors of the respective predefined color palettes.
- the training component 18 does not consider which predefined palette the swatches originated from, but simply takes as training data, color values of all the swatches.
- the lightness value L is not considered in the exemplary embodiment. Through this learning, it is ensured that the concept is well represented by a combination of colors. This step can be performed offline, i.e., before an image is selected.
- agglomerative clustering may be performed on a sub-sample of ab representations of swatches of the predefined color palette 12 , 14 , etc., until the desired number of clusters is obtained.
- This number is equivalent to the desired number of swatches in a concept palette, e.g., 16.
- the means of the Gaussian functions are initialized to the cluster centroid positions.
- the weights are then initialized to 1/N where N is the number of Gaussians in a concept GMM (i.e., 16).
- the covariance matrices are initialized to small values on the diagonal such that each is isotropic. This initialization is analogous to that described in the above-mentioned copending U.S. application.
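The initialization described above can be sketched as follows. This is a minimal illustration, assuming scikit-learn's agglomerative clustering as the clustering step; the function name and the variance floor `var0` are hypothetical.

```python
import numpy as np
from sklearn.cluster import AgglomerativeClustering

def init_gmm_params(swatch_ab, n_components=16, subsample=500, var0=1.0, seed=0):
    """Initialize GMM parameters from agglomerative clustering of a
    sub-sample of swatch (a, b) values: component means at the cluster
    centroids, uniform weights 1/N, and small isotropic diagonal
    covariances (stored here as 2-vectors of variances)."""
    rng = np.random.default_rng(seed)
    n = min(subsample, len(swatch_ab))
    sample = swatch_ab[rng.choice(len(swatch_ab), size=n, replace=False)]
    labels = AgglomerativeClustering(n_clusters=n_components).fit_predict(sample)
    means = np.vstack([sample[labels == k].mean(axis=0)
                       for k in range(n_components)])
    weights = np.full(n_components, 1.0 / n_components)  # w_i = 1/N
    covs = np.full((n_components, 2), var0)              # small isotropic diagonals
    return weights, means, covs

# Synthetic swatch data: two well-separated chromatic clusters
rng = np.random.default_rng(3)
swatches = np.vstack([rng.normal([0, 0], 1, (80, 2)),
                      rng.normal([50, 50], 1, (80, 2))])
w, m, c = init_gmm_params(swatches, n_components=2)
```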
- the concept palette learning component 18 then learns the parameters of the N Gaussian functions in the mixture, which in the exemplary embodiment are the means (a,b color values) of the learned Gaussian functions in the computed GMM and a corresponding mixture weight.
- the colors of the swatches in the concept palette 22 shown to the user are the values of the means converted from Lab to RGB color space for display. The displayed swatches can be ordered by the weights of their corresponding Gaussian functions.
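The palette-learning step described above can be sketched with an off-the-shelf GMM fit. This is an illustrative stand-in, not the patent's implementation: scikit-learn's EM with k-means initialization replaces the agglomerative initialization described in the text, and the swatch data are synthetic.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

def learn_concept_palette(swatch_ab, n_swatches=16, seed=0):
    """Fit a diagonal-covariance GMM (the 'concept palette') to the
    (a, b) chromaticity values of all swatches of all predefined
    palettes tagged with a concept. Returns the component means
    (the displayed swatch colors) ordered by mixture weight."""
    gmm = GaussianMixture(n_components=n_swatches, covariance_type="diag",
                          random_state=seed).fit(swatch_ab)
    order = np.argsort(gmm.weights_)[::-1]   # display swatches by weight
    return gmm.means_[order], gmm.weights_[order]

# Hypothetical training swatches for one concept: two chromatic clusters,
# with the first cluster represented more often in the palette database.
rng = np.random.default_rng(0)
swatches = np.vstack([rng.normal([40, 10], 3, (90, 2)),
                      rng.normal([-20, 30], 3, (60, 2))])
means, weights = learn_concept_palette(swatches, n_swatches=2)
```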
- where $w_i^c$, $\mu_i^c$, and $\Sigma_i^c$ are respectively the weight, mean vector, and covariance matrix of Gaussian $i$, and $N$ denotes the number of Gaussian components. It is assumed in the illustrative embodiments herein that the covariance matrices $\Sigma_i^c$ are diagonal.
- If $x$ denotes an observation (here, the vector comprising the two chromaticity coordinates of the color of a swatch of a palette in the input set of palettes) and $q$ denotes which Gaussian emitted $x$, the likelihood that $x$ was generated by the GMM is:
- $$p(x\mid\lambda^c)=\sum_{i=1}^{N} w_i^c\,p_i(x\mid\lambda^c),\qquad p_i(x\mid\lambda^c)=p(x\mid q=i,\lambda^c)$$
- The weights are subject to the constraint $\sum_{i=1}^{N} w_i^c=1$, and the components $p_i$ are given by:
- $$p_i(x\mid\lambda^c)=\frac{\exp\left\{-\tfrac{1}{2}(x-\mu_i^c)'\,(\Sigma_i^c)^{-1}(x-\mu_i^c)\right\}}{(2\pi)^{D/2}\,\lvert\Sigma_i^c\rvert^{1/2}},$$
- where $D$ is the dimensionality of the observations (here, $D=2$).
- The parameters of the GMM are estimated by Maximum Likelihood Estimation (MLE), e.g., with the iterative Expectation-Maximization (EM) algorithm, which alternates two steps:
- An expectation (E) step, where the posterior occupancy probabilities are computed based on the current estimates of the parameters:
- $$\gamma_i(x_t)=\frac{w_i^c\,p_i(x_t\mid\lambda^c)}{\sum_{j=1}^{N} w_j^c\,p_j(x_t\mid\lambda^c)}$$
- A maximization (M) step, where the parameters are updated based on the expected complete-data log-likelihood, given the occupancy probabilities computed in the E-step:
- $$\hat w_i^c=\frac{1}{T}\sum_{t=1}^{T}\gamma_i(x_t),\qquad \hat\mu_i^c=\frac{\sum_{t=1}^{T}\gamma_i(x_t)\,x_t}{\sum_{t=1}^{T}\gamma_i(x_t)},\qquad \hat\Sigma_i^c=\frac{\sum_{t=1}^{T}\gamma_i(x_t)\,x_t x_t'}{\sum_{t=1}^{T}\gamma_i(x_t)}-\hat\mu_i^c\,(\hat\mu_i^c)'$$
- where $T$ is the number of observations (swatch color values).
- the EM algorithm is only guaranteed to converge to a local optimum, not to a global one.
- the location of convergence is dependent on the initialization parameters. In other words, different initialization conditions will, in general, lead to different concept palettes.
- With the clustering-based initialization described above, convergence is more likely to achieve a global optimum.
- the EM algorithm can be iterated a fixed number of times or until the results approach convergence.
- Any suitable palette database 24 or combination of databases may be used as the source of color swatches for training the concept GMMs.
- the database may have been designed by graphic designers or may be one generated by an online community of users.
- one database was obtained from Eiseman and is referred to herein as CC.
- the predefined palettes in this database each include three swatches and are organized according to the fifteen concepts: serene, earthy, romantic, robust, playful, classic, cool, warm, luscious, spicy, capricious, spiritual, sensual, elegant, and delicate.
- There are 24 palettes in each category (i.e., this provides 72 swatches for training each concept palette) and therefore there are 360 palettes in total. These palettes are annotated by experienced color consultants.
- a second database CL includes 22,000,000 palettes downloaded from a popular social network intended for graphic designers (www.colourlovers.com) using the keywords associated with each concept of CC, derived from Eiseman. These palettes are often tagged by amateur designers and are therefore weakly annotated. In the exemplary embodiment, the labels of these palettes were not used subsequent to retrieval. Rather, these predefined palettes were classified into a respective one of the fifteen concept categories using a technique analogous to that described in above-mentioned U.S. patent application Ser. No. 12/632,107. In this method, the palette is assigned to its most likely concept by comparing its colors to those of the palettes in the CC set.
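The classification step described above can be sketched as a maximum-likelihood assignment. This is a hedged stand-in for the cited technique, not its actual implementation: the concept models, function name, and training data below are all hypothetical.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

def most_likely_concept(palette_ab, concept_gmms):
    """Assign a weakly annotated palette to the concept whose GMM yields
    the highest average log-likelihood of the palette's swatch colors."""
    scores = {name: gmm.score(palette_ab)   # mean log-likelihood per swatch
              for name, gmm in concept_gmms.items()}
    return max(scores, key=scores.get)

# Two hypothetical concept models trained on synthetic swatch clusters.
rng = np.random.default_rng(1)
warm = GaussianMixture(2, covariance_type="diag", random_state=0).fit(
    rng.normal([30, 25], 4, (100, 2)))
cool = GaussianMixture(2, covariance_type="diag", random_state=0).fit(
    rng.normal([-15, -20], 4, (100, 2)))

# A three-swatch palette whose (a, b) colors lie near the "warm" cluster.
palette = np.array([[28.0, 24.0], [33.0, 27.0], [29.0, 22.0]])
label = most_likely_concept(palette, {"warm": warm, "cool": cool})
```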
- An image 26 may be selected from a predefined database, online database, or provided directly by the user.
- the user is presented with a database 106 of images, from which he selects one.
- the user can acquire an input image from his own collection 106 .
- the database may include a single type of image, such as photographs, or may include a heterogeneous dataset containing photos, graphic design images, as well as templates such as those used in document editing applications (e.g., Microsoft Publisher, Microsoft PowerPoint).
- the user may select a concept from all the concepts for which a concept palette has been generated. In other embodiments, a subset of the concepts 16 is suggested to the user, based on the input image, from which the user selects one for concept transfer.
- a list 144 of concepts is presented to the user from which the user may select one (or more than one).
- the list may be presented in an ordered fashion according to a given measure of concept appropriateness.
- When the concepts are ordered, only a subset of the concepts may be shown on the user interface, e.g., the top five or those whose appropriateness measure exceeds a given threshold.
- Two measures for computing appropriateness are suggested by way of example:
- Image: Concepts which are reasonably close to the image are proposed, to avoid radical changes in the color transfer.
- the measure of appropriateness may be computed based on the similarity between the concept palette 22 and the image palette 32 .
- Since the palettes are probabilistic models (GMMs) and the image pixels can be considered to be observations emitted by these models, the similarity between an image and a concept can be expressed as the average log-likelihood of the pixels under the GMM (independence assumption).
- the scores output by this method for each of the concepts are used to rank the concepts. The appropriateness can then be based, in whole or in part, on the ranking.
- Reliability: Since different concepts have different degrees of reliability, a measure of the reliability may be considered in determining the appropriateness. While some concepts can clearly be associated with a small number of well-defined colors (most people would agree that the concept “romantic” is associated with the color “pink”), some concepts may be so abstract that the agreement between different users may be low. In such a case, the measure of appropriateness of a given concept can be based on a spread of its GMM in the color space. One suitable measure of degree of spread of a distribution is its entropy: low entropy indicates a concentrated distribution while high entropy indicates a spread distribution. The concepts can thus be ranked according to entropy.
- the first measure is image-dependent while the second one is image independent.
- One or both measures may be used in computing a ranking for display of a subset of suitable concepts or for ordering the list of concepts by appropriateness.
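The two appropriateness measures above can be sketched as follows. This is an illustrative sketch under stated assumptions: GMM differential entropy has no closed form, so a Monte Carlo estimate is used here as one reasonable choice; the source does not specify how the entropy is computed.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

def image_similarity(pixels_ab, concept_gmm):
    """Image-dependent measure: average log-likelihood of the image's
    pixel chromaticities under the concept GMM (independence assumption)."""
    return concept_gmm.score(pixels_ab)

def concept_entropy(concept_gmm, n_samples=4000):
    """Image-independent reliability measure: Monte Carlo estimate of the
    GMM's differential entropy H = -E[log p(x)]. Low entropy indicates a
    concentrated distribution, hence a more 'reliable' concept."""
    x, _ = concept_gmm.sample(n_samples)
    return -concept_gmm.score(x)

rng = np.random.default_rng(2)
# A concentrated concept (e.g., 'romantic': a few well-defined colors)
tight = GaussianMixture(1, covariance_type="diag", random_state=0).fit(
    rng.normal([10, 10], 1, (400, 2)))
# An abstract, spread-out concept
loose = GaussianMixture(1, covariance_type="diag", random_state=0).fit(
    rng.normal([0, 0], 25, (400, 2)))
# Pixels of a hypothetical image whose colors lie near the tight concept
pix = rng.normal([10, 10], 1.0, (50, 2))
```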
- Two methods may be used to generate the image palette 32 :
- S 110 A: Generate an image palette by fitting a GMM directly to the input image pixel data, without reference to the concept palette.
- S 110 B: Generate an image palette through adaptation of the concept palette GMM with the input image pixel data.
- the GMM may be initialized by clustering a subset of the image pixel color values, as described for the clustering of the swatches in the concept palette learning stage.
- If $x$ denotes an observation (here, a vector $(a,b)$ corresponding to the coordinate values of a pixel color from the image 26 ) and $q$ denotes which Gaussian emitted $x$, the likelihood that $x$ was generated by the GMM is:
- $$p(x\mid\lambda)=\sum_{i=1}^{N} w_i\,p_i(x\mid\lambda),\qquad p_i(x\mid\lambda)=p(x\mid q=i,\lambda)$$
- The components $p_i$ are given by:
- $$p_i(x\mid\lambda)=\frac{\exp\left\{-\tfrac{1}{2}(x-\mu_i)'\,\Sigma_i^{-1}(x-\mu_i)\right\}}{(2\pi)^{D/2}\,\lvert\Sigma_i\rvert^{1/2}}$$
- The parameters of the GMM are obtained by maximizing the log-likelihood function $\log p(X\mid\lambda)$ over the set $X=\{x_t,\;t=1\ldots T\}$ of pixel observations, i.e., by Maximum Likelihood Estimation (MLE), which may again be performed with the EM algorithm:
- the posterior occupancy probabilities are computed based on the current estimates of the parameters.
- $$\gamma_i(x_t)=\frac{w_i\,p_i(x_t\mid\lambda)}{\sum_{j=1}^{N} w_j\,p_j(x_t\mid\lambda)}$$
- the parameters are updated based on the expected complete-data log-likelihood given the occupancy probabilities computed in the E-step:
- the GMM for the image is initialized with the means, weights, and covariance matrices of the components of the concept palette GMM 22 retrieved at S 108 .
- the EM algorithm may be used for the MAP estimation, by alternating E and M steps, similarly to the image palette extraction described above.
- the posterior occupancy probability is computed based on the current estimates of the parameters as follows:
- $$\gamma_i(x_t)=\frac{w_i^{in}\,p_i(x_t\mid\lambda^{in})}{\sum_{j=1}^{N} w_j^{in}\,p_j(x_t\mid\lambda^{in})}$$
- the parameters are updated based on the expected complete-data log-likelihood given the occupancy probabilities computed in the E-step:
- where $\alpha$ is an adaptation factor which balances the new observations against the a priori information.
- each Gaussian function (swatch) of the concept palette may be multiplied by a factor. This factor is computed by dividing the number of input image pixels by the number of swatches belonging to that concept. The algorithm may be stopped based on a stopping criterion or when convergence is approached.
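One adaptation iteration can be sketched as follows. This is a hedged sketch, not the patent's exact update: a Reynolds-style relevance factor `r` (so that alpha_i = n_i / (n_i + r)) stands in for the adaptation factor described above, and only the weight and mean updates are shown.

```python
import numpy as np
from scipy.stats import multivariate_normal

def map_adapt_step(pixels, weights, means, covs, r=16.0):
    """One MAP-adaptation iteration. E-step: posterior occupancy
    probabilities under the current (concept-initialized) parameters.
    Update: blend the data statistics with the prior parameters via
    per-component adaptation factors alpha_i = n_i / (n_i + r)."""
    T = len(pixels)
    # E-step: gamma[t, i] = w_i p_i(x_t) / sum_j w_j p_j(x_t)
    lik = np.column_stack([
        w * multivariate_normal.pdf(pixels, mean=m, cov=np.diag(c))
        for w, m, c in zip(weights, means, covs)
    ])
    gamma = lik / lik.sum(axis=1, keepdims=True)
    n = gamma.sum(axis=0) + 1e-10            # soft counts per Gaussian
    x_bar = (gamma.T @ pixels) / n[:, None]  # data means per Gaussian
    alpha = n / (n + r)                      # adaptation factors
    # Blend new observations against the a priori information
    new_means = alpha[:, None] * x_bar + (1 - alpha[:, None]) * means
    new_weights = alpha * (n / T) + (1 - alpha) * weights
    new_weights /= new_weights.sum()
    return new_weights, new_means

# Concept GMM with two swatches; image pixels cluster near the first one.
weights = np.array([0.5, 0.5])
means = np.array([[30.0, 0.0], [-30.0, 0.0]])
covs = np.full((2, 2), 25.0)
pixels = np.random.default_rng(4).normal([25, 5], 2.0, (200, 2))
new_w, new_m = map_adapt_step(pixels, weights, means, covs)
```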
- FIG. 6 illustrates functions of an initialized GMM (a) corresponding to the concept palette and after three and five iterations of the EM algorithm (b and c). Arrows show correspondences between adapted Gaussian components.
- Swatches (colors) in the input palette are mapped to swatches (colors) in the concept palette.
- the mapping process may be dependent on the method used to derive the input image palette (S 110 A or S 110 B). In the exemplary embodiment, the mapping results in every image palette color being matched to a corresponding concept palette color. However, it is also contemplated that fewer than all the image palette colors (swatches) may be mapped, such as the most dominant (highly weighted) colors.
- When the image palette 32 is a GMM derived from the image pixels without reference to the concept palette (S 110 A), swatches of the concept palette 22 are mapped to swatches of the input palette 32 using the chromaticity distances between them. This method is referred to as distance-based mapping.
- When the image palette is instead obtained by adapting the concept palette GMM (S 110 B), the adaptation framework inherently maps input palette swatches to concept palette swatches. This method is referred to as adaptation-based mapping.
- the weights of the remaining concept palette colors may be automatically normalized so that they sum to 1.
- the image palette may be recomputed, based on the new concept palette. This is not necessary, of course, if the image palette is derived solely from pixels of the image. In both cases, the concept palette to input palette mapping is recomputed.
- each swatch of the image palette 32 is associated with a respective single swatch in the concept palette.
- the measure used as a criterion for mapping swatches may be the Euclidean distance between the ab representations of the swatches in the Lab color space. However, other distance metrics, as well as representations of swatches in other color spaces can also be used. The distance metric used should reflect the chromatic distances between swatches. The dominant (most highly weighted) swatches in the image palette may receive priority in the mapping.
- the input palette extracted from the image is an M-dimensional GMM.
- the concept palette is an N-dimensional GMM. In one embodiment, N≥M.
- Let the M input palette swatches be represented by $S_{in}^1 \ldots S_{in}^M$, and the N concept palette swatches by $S_c^1 \ldots S_c^N$.
- the output is M mapped concept palette swatches $S_c'^1 \ldots S_c'^M$ using the distance-based approach.
- $S_c^1 \ldots S_c^N$ and $S_{in}^1 \ldots S_{in}^M$ are the ab channel representations of the swatch colors of the concept and input palettes, respectively.
- $S_c'^1 \ldots S_c'^M$ are the ab channel representations of the swatch colors of the resulting mapped concept palette 36 .
- the goal of this approach is to obtain an M-dimensional mapped concept palette 36 , for which each swatch is matched with a corresponding swatch in the input palette.
- this is performed with a greedy algorithm, as illustrated below.
- the Euclidean distance between the first swatch (highest weighted swatch) of the input palette and each swatch from the concept palette is computed.
- the swatch from the concept palette that has the closest distance is placed as the first swatch of the mapped concept palette. This process is repeated for the remaining swatches of the input palette.
- the swatches of the input palette are sorted by the weights of their associated Gaussian functions. Therefore, the most highly weighted swatch chooses its closest color first and so on.
- a concept palette swatch can be mapped to more than one image palette swatch, but each image palette swatch is mapped to no more than one concept palette swatch.
- a swatch may be removed from consideration once it has been used in a predetermined number of mappings (e.g., 2 or 3).
- concept swatches may be selected up to three times (i.e., up to three image palette swatches can be mapped to the same concept palette swatch). This is achieved by initializing a counter (ctr) for each concept color and incrementing the counter each time that color is selected.
- the pseudocode for this algorithm may be as follows:

    A = {1, 2, ..., N}
    for k = 1 to N
        ctr[k] = 0
    for i = 1 to M (taken in decreasing order of weight)
        j* = argmin over j in A of the Euclidean distance between S_c^j and S_in^i
        S'_c^i = S_c^{j*}
        ctr[j*] = ctr[j*] + 1
        if ctr[j*] == 3
            A = A \ {j*}

- where j* is the concept swatch which gives the minimum Euclidean distance between it and the i-th image palette swatch, ctr[j*] counts how many times concept swatch j* has been selected, and A is the set of concept swatches still available for mapping.
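The greedy mapping above can be sketched in executable form. This is an illustrative sketch (the function name is hypothetical); it assumes, as in the pseudocode, that enough concept swatches remain available, i.e., N × max_reuse ≥ M.

```python
import numpy as np

def greedy_map(input_swatches, input_weights, concept_swatches, max_reuse=3):
    """Distance-based mapping: each input palette swatch, visited in
    decreasing order of weight, is matched to its nearest available
    concept swatch; a concept swatch is removed from the candidate set A
    once it has been selected max_reuse times."""
    order = np.argsort(input_weights)[::-1]        # most weighted first
    available = set(range(len(concept_swatches)))  # A = {1, ..., N}
    ctr = np.zeros(len(concept_swatches), dtype=int)
    mapped = np.empty_like(input_swatches)
    for i in order:
        cand = sorted(available)
        d = np.linalg.norm(concept_swatches[cand] - input_swatches[i], axis=1)
        j_star = cand[int(np.argmin(d))]           # closest concept swatch
        mapped[i] = concept_swatches[j_star]
        ctr[j_star] += 1
        if ctr[j_star] == max_reuse:
            available.discard(j_star)
    return mapped

concept = np.array([[0.0, 0.0], [10.0, 10.0]])
# Each input swatch is matched to its nearest concept swatch
mapped = greedy_map(np.array([[1.0, 1.0], [9.0, 9.0]]),
                    np.array([0.6, 0.4]), concept)
# With max_reuse=1, the heavier swatch claims the nearest concept color,
# forcing the second input swatch onto the remaining one.
mapped_cap = greedy_map(np.array([[1.0, 1.0], [2.0, 2.0]]),
                        np.array([0.7, 0.3]), concept, max_reuse=1)
```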
- In another embodiment, the mapping is obtained with the Earth Mover's Distance (EMD), i.e., by finding flows $f_{i,j}$ which minimize $\sum_{i,j} f_{i,j}\,G_{i,j}$ between the weighted input and concept palettes, where:
- G is the cost matrix containing the Euclidean distance between each input image swatch S i in and concept palette swatch S j c .
- the quantities w i in and w j c are the weights for the i-th input and j-th concept palette swatch respectively.
- the flow f i,j can be considered to be the part of swatch S i in which is mapped to S j c .
- Constraint (C 1 ) requires that each input image swatch has flows that sum to its weight.
- constraint (C 2 ) requires that each concept model swatch has flows that sum to its weight. Therefore each concept swatch is guaranteed to be associated with at least one input swatch and vice versa.
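The EMD flow problem with constraints (C1) and (C2) can be sketched as a linear program. This is an illustrative sketch, not the patent's solver; it assumes both weight vectors sum to 1 so the constraints are feasible.

```python
import numpy as np
from scipy.optimize import linprog

def emd_flows(input_sw, w_in, concept_sw, w_c):
    """Solve for flows f[i, j] minimizing sum_ij f_ij * G_ij subject to
    (C1) sum_j f_ij = w_in[i] and (C2) sum_i f_ij = w_c[j], f >= 0."""
    M, N = len(input_sw), len(concept_sw)
    # Cost matrix G: Euclidean distance between swatch chromaticities
    G = np.linalg.norm(input_sw[:, None, :] - concept_sw[None, :, :], axis=2)
    # Equality constraints over the flattened M*N flow variables
    A_eq = np.zeros((M + N, M * N))
    for i in range(M):
        A_eq[i, i * N:(i + 1) * N] = 1.0   # (C1): row sums = input weights
    for j in range(N):
        A_eq[M + j, j::N] = 1.0            # (C2): column sums = concept weights
    b_eq = np.concatenate([w_in, w_c])
    res = linprog(G.ravel(), A_eq=A_eq, b_eq=b_eq, bounds=(0, None),
                  method="highs")
    return res.x.reshape(M, N)

flows = emd_flows(np.array([[0.0, 0.0], [10.0, 10.0]]), np.array([0.5, 0.5]),
                  np.array([[1.0, 1.0], [9.0, 9.0]]), np.array([0.5, 0.5]))
```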
- the input image palette 32 has been obtained by adapting the concept GMM using pixels from the input image as observations for the GMM at S 110 B.
- This representation of the input ensures that the Gaussian parameters of the adapted model retain a correspondence with those of the concept palette GMM.
- This inherent correspondence eliminates the need for a heuristic swatch-mapping criterion such as the distance-based mapping scheme described above in S 112 A.
- the mapped concept palette 36 corresponds to the concept palette 22 , with its swatches ordered according to the swatches of the image palette to which its swatches are matched (this may involve reordering the image palette swatches or the concept palette swatches).
- the image pixels are modified based on the mapped concept palette 36 (target palette) (which is the same as the concept palette in the adaptation case).
- a linear transformation is computed between the input image palette 32 and the mapped concept palette 36 . This transformation is then applied to the pixels of the input image to yield the output image.
- the color-transferred, or output, image 42 is obtained by applying an affine transformation to the ab channel values of every pixel of the input image.
- the statistics (mean and standard deviation of the Gaussian function representation) of the i-th image palette swatch can be matched to those of the i-th target palette swatch by an affine map $x \mapsto A_i\,x + B_i$ with:
- $$A_i=(\Sigma_i^t)^{1/2}(\Sigma_i^{in})^{-1/2},\qquad B_i=\mu_i^t-A_i\,\mu_i^{in}$$
- where $\Sigma_i^t$ and $\Sigma_i^{in}$ denote the covariance matrices of the i-th components of the GMMs representing the target and image palettes, and $\mu_i^t$ and $\mu_i^{in}$ respectively denote their means.
- the covariances are chosen to be diagonal. The matrix square root is thus defined in a unique manner.
- the color transfer map functions used in the affine transformation are computed using $A_i$ and $B_i$ as well as the previously computed occupancy probabilities $\gamma_i(x)$, as follows:
- $$A(x)=\sum_{i=1}^{N}\gamma_i(x)\,A_i,\qquad B(x)=\sum_{i=1}^{N}\gamma_i(x)\,B_i$$
- $$x_{out}=A(x_{in})\,x_{in}+B(x_{in})$$
- A(x in ) and B(x in ) are the A(x) and B(x) values computed above.
- the lightness channel L values of the output image pixels are set to be the same as those of the input image pixels.
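The per-pixel transfer described above can be sketched as follows. This is an illustrative sketch assuming occupancy-probability-weighted blends of the per-swatch affine maps (consistent with the formulas above); the function name is hypothetical, and only the ab channels are transformed (L is left unchanged, as in the text).

```python
import numpy as np
from scipy.stats import multivariate_normal

def color_transfer(pixels_ab, weights, means_in, covs_in, means_t, covs_t):
    """Map each pixel x to A(x) x + B(x), where A(x) and B(x) are the
    occupancy-probability-weighted blends of the per-swatch affine maps
    A_i = (Cov_t_i)^(1/2) (Cov_in_i)^(-1/2), B_i = mu_t_i - A_i mu_in_i.
    Diagonal covariances make the matrix square roots elementwise, so
    A_i is stored as a 2-vector acting elementwise on (a, b)."""
    lik = np.column_stack([
        w * multivariate_normal.pdf(pixels_ab, mean=m, cov=np.diag(c))
        for w, m, c in zip(weights, means_in, covs_in)
    ])
    gamma = lik / lik.sum(axis=1, keepdims=True)   # occupancy probabilities
    A = np.sqrt(covs_t) / np.sqrt(covs_in)         # per-swatch diagonal A_i
    B = means_t - A * means_in                     # B_i = mu_t_i - A_i mu_in_i
    Ax = gamma @ A                                 # per-pixel A(x)
    Bx = gamma @ B                                 # per-pixel B(x)
    return Ax * pixels_ab + Bx

# Single-swatch example: pixel [2, 2] under A = 0.5*I, B = [10, 5]
out = color_transfer(np.array([[2.0, 2.0]]),
                     np.array([1.0]),
                     np.array([[0.0, 0.0]]), np.array([[4.0, 4.0]]),
                     np.array([[10.0, 5.0]]), np.array([[1.0, 1.0]]))
```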
- Different levels of transfer, denoted by K, can be performed on the image by modification of the affine transformation.
- K is introduced into the affine transformation using the following formulations of A i and B i :
- $$A_i=K\left((\Sigma_i^t)^{1/2}(\Sigma_i^{in})^{-1/2}\right)+(1-K)\,I,$$
- $$B_i=K\,\mu_i^t+(1-K)\,\mu_i^{in}-\left[K\left((\Sigma_i^t)^{1/2}(\Sigma_i^{in})^{-1/2}\right)+(1-K)\,I\right]\mu_i^{in},$$
- where I is the identity matrix. Setting K=0 leaves the pixel colors unchanged, while K=1 applies the full transfer.
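The degree-of-transfer blending can be sketched per swatch as follows (diagonal covariances stored as 2-vectors, so the identity matrix is represented by ones; the function name is hypothetical):

```python
import numpy as np

def blended_affine(K, mu_in, cov_in, mu_t, cov_t):
    """Degree-of-transfer blend of the per-swatch affine map:
        A_i = K (cov_t^1/2 cov_in^-1/2) + (1-K) I
        B_i = K mu_t + (1-K) mu_in - A_i mu_in
    K=0 yields the identity map; K=1 yields the full transfer."""
    A = K * (np.sqrt(cov_t) / np.sqrt(cov_in)) + (1.0 - K) * np.ones_like(mu_in)
    B = K * mu_t + (1.0 - K) * mu_in - A * mu_in
    return A, B

mu_in, cov_in = np.array([2.0, 3.0]), np.array([4.0, 4.0])
mu_t, cov_t = np.array([10.0, 5.0]), np.array([1.0, 1.0])
A0, B0 = blended_affine(0.0, mu_in, cov_in, mu_t, cov_t)  # identity map
A1, B1 = blended_affine(1.0, mu_in, cov_in, mu_t, cov_t)  # full transfer
```

With K=1, the input swatch mean is mapped exactly onto the target swatch mean, as the statistics-matching step requires.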
- all pixels of the image are subjected to the transformation. In other embodiments, fewer than all, i.e., only a selected set of pixels may be modified, which may be pixels in a user-selected region of the image.
- the user is shown the result of the transfer operation.
- the entire modified image palette 44 may also be displayed, or in some embodiments, its most highly weighted swatches. For example, as shown in FIG. 3 , only the five most highly weighted swatches are shown as the remaining ones have negligible weights in their respective GMMs.
- the user has the option to tune the result (using the selector 142 ) by altering the parameter regulating the degree of the transfer K.
- the user can also choose to modify the concept palette 22 , by adding, deleting, or replacing a swatch.
- the colors of the displayed concept palette may correspond to actuable area(s) of the screen. If a user clicks on one of the colors, a popup is displayed which asks the user if he or she would like to change the concept color palette by adding, deleting, or replacing a swatch.
- the popup may also display a set of colors, which allows a user to select a replacement or additional color swatch.
- the modified image may be output to an output device other than display device 136 , such as a printer, either directly, or after further workflow processing of a document which contains it.
- the modified image palette 44 need not be printed.
- the palette 44 may be used for modifying other components of the document, such as for generating background colors, image text colors, and the like.
- the exemplary embodiment may be used in variable data applications such as 1 to 1 personalization and direct mail marketing.
- Variable document creation poses various challenges to the assurance of a proper aesthetic level due to the portion of dynamic content such documents can include.
- One of the challenges is how to treat visual aspects dynamically within the variable data workflow, so that the enhancement or management operations are handled in a more context-sensitive fashion.
- the exemplary embodiment describes a method for addressing one of these needs, namely altering the colors in an image based on a concept selected by a user.
- Variable data printing is not the only application of the method.
- Other applications such as image and document asset management or document image/photograph set visualization, etc. can also profit from the method.
- the concept-based image color transfer approach described above was tested on three types of digital assets: natural images, graphic design images, and brochures.
- the concept used is selected by the user from a list of suggested concepts.
- the concept palettes used are obtained from the two databases of palettes: CC and CL, described above.
- the results obtained for different types of images show that the exemplary method provides modified images which can have a much higher variation in color than an existing method for modification of images described in Yang & Peng.
- In the Yang & Peng method, the variation is lost due to the histogram matching performed between the input image, which has one dominant color, and the target image, which also has one dominant color. Accordingly, a more realistic result is obtained with the present method.
- the results obtained also show that the Yang & Peng approach fails to maintain any color variation for images which do not satisfy their dominance constraint, and which therefore do not have one dominant color.
- the distance-based mapping approach and the adaptation-based approach gave slightly different results, with the adaptation-based approach sometimes showing a wider color variation than the distance-based approach (since the distance-based approach allows a concept palette swatch to be mapped to up to three image palette swatches).
US20080129750A1 (en) * | 2006-11-30 | 2008-06-05 | Adobe Systems Incorporated | Combined color harmony generation and artwork recoloring mechanism |
US20080240572A1 (en) | 2007-03-26 | 2008-10-02 | Seiko Epson Corporation | Image Search Apparatus and Image Search Method |
US20080317358A1 (en) | 2007-06-25 | 2008-12-25 | Xerox Corporation | Class-based image enhancement system |
US20090144033A1 (en) | 2007-11-30 | 2009-06-04 | Xerox Corporation | Object comparison, retrieval, and categorization methods and apparatuses |
US20090231355A1 (en) * | 2008-03-11 | 2009-09-17 | Xerox Corporation | Color transfer between images through color palette adaptation |
US20100040285A1 (en) | 2008-08-14 | 2010-02-18 | Xerox Corporation | System and method for object class localization and semantic class based image segmentation |
US20100092084A1 (en) | 2008-10-15 | 2010-04-15 | Xerox Corporation | Representing documents with runlength histograms |
US20100098343A1 (en) | 2008-10-16 | 2010-04-22 | Xerox Corporation | Modeling images as mixtures of image models |
- 2010
- 2010-09-24: US application US 12/890,049 filed; granted as US8553045B2 (status: Active)
Non-Patent Citations (36)
Title |
---|
An, et al. "User-controllable color transfer," Computer Graphics Forum (Eurographics) 29, 2 (May 2010), 263-271. |
Benavente, et al. "Parametric fuzzy sets for automatic color naming," Journal of the Optical Society of America, A 25, 10 (Oct. 2008), 2582-2593. |
Berlin, et al. Basic Color Terms: Their Universality and Evolution. Berkeley: University of California Press, 1969 (Abstract only). |
Charpiat, et al. "Automatic image colorization via multimodal predictions," In ECCV '08: Proceedings of the 10th European Conference on Computer Vision (Berlin, Heidelberg, 2008), Springer-Verlag, pp. 126-139. |
ColourLovers: www.colourlovers.com, accessed Sep. 23, 2010. |
Conway, D. "An experimental comparison of three natural language colour naming models," In East-West International Conference on Human-Computer Interactions (1992), pp. 328-339. |
Dempster, et al. "Maximum likelihood from incomplete data via the EM algorithm," Journal of the Royal Statistical Society, 39, 1-38, 1977. |
Fellner, et al. "Automatic Concept Transfer," Computer Graphics Forum, vol. 0 (1981), No. 0, pp. 1-11. |
Freedman, et al. "Object-to-object color transfer: optimal flows and smsp transformations," In IEEE Conference on Computer Vision and Pattern Recognition (Jun. 2010). |
Greenfield, et al. "A Palette-Driven Approach to Image Color Transfer," Eurographics Workshop on Computational Aesthetics in Graphics, Visualization and Imaging, 91-99, 2005. |
Greenfield, et al. "Image recoloring induced by palette color associations," In International Conference on Computer Graphics, Visualization and Computer Vision (2003). |
Hou, et al. "Color conceptualization," In Multimedia '07: Proceedings of the 15th international conference on Multimedia (New York, NY, USA, 2007), ACM, pp. 265-268. |
Huang, et al. "Landmark-based sparse color representations for color transfer," In Computer Vision, 2009 IEEE 12th International Conference on (Sep. 2009), pp. 199-204. |
Jegou, et al. "Improving Bag-of-Features for Large Scale Image Search," in IJCV, May 2010. |
Lalonde , et al. "Using color compatibility for assessing image realism," In Computer Vision, 2007. ICCV 2007. IEEE 11th International Conference on (2007), pp. 1-8. |
Lammens, J. "A Computational Model of Color Perception and Color Naming," PhD thesis, University of Buffalo, 1994. |
Lashkari, et al. "Convex clustering with exemplar-based models," In Advances in Neural Information Processing Systems 20, Platt J., Koller D., Singer Y., Roweis S., (Eds.) MIT Press, Cambridge, MA, 2008, pp. 825-832. |
Levin, et al. "Colorization using optimization," ACM Trans. Graph. 23, 3 (2004), 689-694. |
Ou, et al. "A study of colour emotion and colour preference. part i: Colour emotions for single colours," Color Research and Application 29, 3 (Jun. 2004), 232-240. |
Ou, et al. "A study of colour emotion and colour preference. part ii: Colour emotions for two-colour combinations," Color Research and Application 29: 292-298 (2004). |
Ou, et al. "A study of colour emotion and colour preference. part iii: Colour preference modeling," Color Research and Application 29: 381-389 (2004). |
Perronnin, et al. "Fisher Kernels on Visual Vocabularies for Image Categorization," in Proc. of the IEEE Conf. on Computer Vision and Pattern Recognition (CVPR), Minneapolis, MN, USA (Jun. 2007). |
Pitie, et al. "N-dimensional probability density function transfer and its application to colour transfer," In IEEE International Conference on Computer Vision (2005), vol. 2, pp. 1434-1439. |
Qu, et al. "Manga colorization," ACM Trans. Graph. 25, 3 (2006), 1214-1220. |
Reinhard, et al. "Color transfer between images," IEEE Computer Graphics Applications 21, 5 (Sep./Oct. 2001), 34-41. |
Reynolds, et al. "Speaker verification using adapted Gaussian mixture models," In Digital Signal Processing (2000), No. 10, pp. 19-41. |
Rubner, et al. "The earth mover's distance as a metric for image retrieval," Int. J. Comput. Vision 40, 2 (2000), 99-121. |
Tai, et al. "Local color transfer via probabilistic segmentation by expectation-maximization," In IEEE International Conference on Computer Vision and Pattern Recognition (2005), vol. 1, pp. 747-754. |
U.S. Appl. No. 12/512,209, filed Jul. 30, 2009, Perronnin, et al. |
U.S. Appl. No. 12/632,107, filed Dec. 7, 2009, Marchesotti, et al. |
U.S. Appl. No. 12/693,795, filed Jan. 26, 2010, Skaff, et al. |
Van De Weijer, et al. "Learning color names for real-world applications," IEEE Transactions on Image Processing 18, 7 (Jul. 2009), 1512-1523. |
Woolfe, et al. "Natural Language Color Editing," Proc. of the 1st XIG Research and Technology Conference, 2006. |
Xiao, et al. "Color transfer in correlated color space," In VRCIA '06: Proceedings of the 2006 ACM international conference on Virtual reality continuum and its applications (New York, NY, USA, 2006), ACM, pp. 305-309. |
Yang, et al. "Automatic mood-transferring between color images," IEEE Computer Graphics and Applications 28, 2 (Mar.-Apr. 2008), 52-61. |
Zheng, et al. "Tour the World: Building a web-scale landmark recognition engine," IEEE Computer Society Conference, 2009. |
Cited By (40)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10007679B2 (en) | 2008-08-08 | 2018-06-26 | The Research Foundation For The State University Of New York | Enhanced max margin learning on multimodal data mining in a multimedia database |
US9208549B2 (en) * | 2012-12-07 | 2015-12-08 | Thomson Licensing Sas | Method and apparatus for color transfer between images |
US10235389B2 (en) | 2014-06-26 | 2019-03-19 | Amazon Technologies, Inc. | Identifying data from keyword searches of color palettes |
US9396560B2 (en) | 2014-06-26 | 2016-07-19 | Amazon Technologies, Inc. | Image-based color palette generation |
US9524563B2 (en) | 2014-06-26 | 2016-12-20 | Amazon Technologies, Inc. | Automatic image-based recommendations using a color palette |
US9542704B2 (en) | 2014-06-26 | 2017-01-10 | Amazon Technologies Inc. | Automatic image-based recommendations using a color palette |
US9552656B2 (en) | 2014-06-26 | 2017-01-24 | Amazon Technologies, Inc. | Image-based color palette generation |
US9996579B2 (en) | 2014-06-26 | 2018-06-12 | Amazon Technologies, Inc. | Fast color searching |
US9652868B2 (en) | 2014-06-26 | 2017-05-16 | Amazon Technologies, Inc. | Automatic color palette based recommendations |
US9659032B1 (en) | 2014-06-26 | 2017-05-23 | Amazon Technologies, Inc. | Building a palette of colors from a plurality of colors based on human color preferences |
US11216861B2 (en) | 2014-06-26 | 2022-01-04 | Amazon Technologies, Inc. | Color based social networking recommendations |
US9697573B1 (en) | 2014-06-26 | 2017-07-04 | Amazon Technologies, Inc. | Color-related social networking recommendations using affiliated colors |
US9727983B2 (en) | 2014-06-26 | 2017-08-08 | Amazon Technologies, Inc. | Automatic color palette based recommendations |
US9741137B2 (en) | 2014-06-26 | 2017-08-22 | Amazon Technologies, Inc. | Image-based color palette generation |
US10691744B2 (en) | 2014-06-26 | 2020-06-23 | Amazon Technologies, Inc. | Determining affiliated colors from keyword searches of color palettes |
US9792303B2 (en) | 2014-06-26 | 2017-10-17 | Amazon Technologies, Inc. | Identifying data from keyword searches of color palettes and keyword trends |
US9836856B2 (en) | 2014-06-26 | 2017-12-05 | Amazon Technologies, Inc. | Color name generation from images and color palettes |
US9898487B2 (en) | 2014-06-26 | 2018-02-20 | Amazon Technologies, Inc. | Determining color names from keyword searches of color palettes |
US9916613B1 (en) | 2014-06-26 | 2018-03-13 | Amazon Technologies, Inc. | Automatic color palette based recommendations for affiliated colors |
US9922050B2 (en) | 2014-06-26 | 2018-03-20 | Amazon Technologies, Inc. | Identifying data from keyword searches of color palettes and color palette trends |
US9679532B2 (en) | 2014-06-26 | 2017-06-13 | Amazon Technologies, Inc. | Automatic image-based recommendations using a color palette |
US10402917B2 (en) | 2014-06-26 | 2019-09-03 | Amazon Technologies, Inc. | Color-related social networking recommendations using affiliated colors |
US10186054B2 (en) | 2014-06-26 | 2019-01-22 | Amazon Technologies, Inc. | Automatic image-based recommendations using a color palette |
US10049466B2 (en) | 2014-06-26 | 2018-08-14 | Amazon Technologies, Inc. | Color name generation from images and color palettes |
US10073860B2 (en) | 2014-06-26 | 2018-09-11 | Amazon Technologies, Inc. | Generating visualizations from keyword searches of color palettes |
US10120880B2 (en) | 2014-06-26 | 2018-11-06 | Amazon Technologies, Inc. | Automatic image-based recommendations using a color palette |
US10169803B2 (en) | 2014-06-26 | 2019-01-01 | Amazon Technologies, Inc. | Color based social networking recommendations |
US9401032B1 (en) * | 2014-06-26 | 2016-07-26 | Amazon Technologies, Inc. | Image-based color palette generation |
US10255295B2 (en) | 2014-06-26 | 2019-04-09 | Amazon Technologies, Inc. | Automatic color validation of image metadata |
US10223427B1 (en) | 2014-06-26 | 2019-03-05 | Amazon Technologies, Inc. | Building a palette of colors based on human color preferences |
US9514543B2 (en) | 2014-06-26 | 2016-12-06 | Amazon Technologies, Inc. | Color name generation from images and color palettes |
US10242396B2 (en) | 2014-06-26 | 2019-03-26 | Amazon Technologies, Inc. | Automatic color palette based recommendations for affiliated colors |
US10430857B1 (en) | 2014-08-01 | 2019-10-01 | Amazon Technologies, Inc. | Color name based search |
US9785649B1 (en) | 2014-09-02 | 2017-10-10 | Amazon Technologies, Inc. | Hue-based color naming for an image |
US10831819B2 (en) | 2014-09-02 | 2020-11-10 | Amazon Technologies, Inc. | Hue-based color naming for an image |
US9633448B1 (en) | 2014-09-02 | 2017-04-25 | Amazon Technologies, Inc. | Hue-based color naming for an image |
US10203730B2 (en) | 2014-09-12 | 2019-02-12 | InterDigital CE Patent Holdings | Method for obtaining an electronic device housing panel and corresponding housing, device and apparatus |
US11430194B2 (en) * | 2015-02-06 | 2022-08-30 | Unmade Limited | 2D graphical coding to create a 3D image |
CN107862063A (en) * | 2017-11-15 | 2018-03-30 | 广东交通职业技术学院 | A kind of image color transmission method and system |
US11232607B2 (en) * | 2020-01-24 | 2022-01-25 | Adobe Inc. | Adding color to digital images |
Also Published As
Publication number | Publication date |
---|---|
US20120075329A1 (en) | 2012-03-29 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US8553045B2 (en) | System and method for image color transfer based on target concepts | |
US8532377B2 (en) | Image ranking based on abstract concepts | |
US10762608B2 (en) | Sky editing based on image composition | |
US8379974B2 (en) | Convex clustering for chromatic content modeling | |
US9741137B2 (en) | Image-based color palette generation | |
US9552656B2 (en) | Image-based color palette generation | |
US9396560B2 (en) | Image-based color palette generation | |
US7978918B2 (en) | Digital image cropping using a blended map | |
US20150262549A1 (en) | Color Palette Generation | |
US8031202B2 (en) | Color transfer between images through color palette adaptation | |
US20080019574A1 (en) | Machine-controlled image cropping with default | |
US11024060B1 (en) | Generating neutral-pose transformations of self-portrait images | |
CN105981360A (en) | Image processing apparatus, image processing system, image processing method and recording medium | |
US8867829B2 (en) | Method and apparatus for editing color characteristics of electronic image | |
Murray et al. | Toward automatic and flexible concept transfer | |
US11930303B2 (en) | Automated digital parameter adjustment for digital images | |
Murray et al. | Towards automatic concept transfer | |
US11250542B2 (en) | Mosaic generation apparatus and method | |
US11615507B2 (en) | Automatic content-aware collage | |
CN112150347B (en) | Image modification patterns learned from a limited set of modified images | |
JP2018128950A (en) | Image processing and program | |
US20210004699A1 (en) | Learning apparatus, inferring apparatus, learning method, program, and inferring method | |
CN114529624A (en) | Image color matching method and system and image generation method and system | |
US20230410553A1 (en) | Semantic-aware auto white balance | |
Xue et al. | Integrating High‐Level Features for Consistent Palette‐based Multi‐image Recoloring |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment | Owner name: XEROX CORPORATION, CONNECTICUT Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:SKAFF, SANDRA;MURRAY, NAILA;MARCHESOTTI, LUCA;AND OTHERS;REEL/FRAME:025078/0036 Effective date: 20100920 | |
FEPP | Fee payment procedure | Free format text: PAYOR NUMBER ASSIGNED (ORIGINAL EVENT CODE: ASPN); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY | |
STCF | Information on status: patent grant | Free format text: PATENTED CASE | |
FPAY | Fee payment | Year of fee payment: 4 | |
FEPP | Fee payment procedure | Free format text: 7.5 YR SURCHARGE - LATE PMT W/IN 6 MO, LARGE ENTITY (ORIGINAL EVENT CODE: M1555); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY | |
MAFP | Maintenance fee payment | Free format text: PAYMENT OF MAINTENANCE FEE, 8TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1552); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY Year of fee payment: 8 | |
AS | Assignment | Owner name: CITIBANK, N.A., AS AGENT, DELAWARE Free format text: SECURITY INTEREST;ASSIGNOR:XEROX CORPORATION;REEL/FRAME:062740/0214 Effective date: 20221107 | |
AS | Assignment | Owner name: XEROX CORPORATION, CONNECTICUT Free format text: RELEASE OF SECURITY INTEREST IN PATENTS AT R/F 062740/0214;ASSIGNOR:CITIBANK, N.A., AS AGENT;REEL/FRAME:063694/0122 Effective date: 20230517 | |
AS | Assignment | Owner name: CITIBANK, N.A., AS COLLATERAL AGENT, NEW YORK Free format text: SECURITY INTEREST;ASSIGNOR:XEROX CORPORATION;REEL/FRAME:064760/0389 Effective date: 20230621 | |
AS | Assignment | Owner name: JEFFERIES FINANCE LLC, AS COLLATERAL AGENT, NEW YORK Free format text: SECURITY INTEREST;ASSIGNOR:XEROX CORPORATION;REEL/FRAME:065628/0019 Effective date: 20231117 | |
AS | Assignment | Owner name: CITIBANK, N.A., AS COLLATERAL AGENT, NEW YORK Free format text: SECURITY INTEREST;ASSIGNOR:XEROX CORPORATION;REEL/FRAME:066741/0001 Effective date: 20240206 |