CROSS-REFERENCE TO RELATED APPLICATIONS

[0001] This application is a continuation-in-part of U.S. application Ser. No. 09/898,476, filed Jul. 3, 2001, entitled "Image Tagging for Post-Processing," invented by John Feldis, which is incorporated herein by reference in its entirety.
BACKGROUND OF THE INVENTION

[0002] The present invention relates to digital camera technology. More specifically, the present invention relates to a method and apparatus for tagging images and videos to facilitate transferring them to a specified destination for easy playback and other post-processing.
[0003] Digital cameras have been gaining wide acceptance among consumers recently. At the same time, wide usage of email and the Internet has led to increased electronic communication between consumers. A natural result of these two factors is an increased desire among users to share still image and video files over the Internet. Still images and videos can currently be sent over the Internet by downloading the image data to a host, and then associating certain image data with a destination address on the host.
[0004] One type of digital camera is disclosed in U.S. Pat. No. 6,167,469, assigned to Agilent Technologies, Inc. The digital camera enables digital images to be transmitted to a selected destination. The patent discloses that the address of the destination is input into the camera. A voice message may be attached to the digital images and sent to the selected destination.
[0005] One recently adopted digital camera standard, DPOF Version 1.0, available at "http://www.panasonic.co.jp/avc/video/dpof/dpof_110/white_e.htm," discloses some functions that may be performed in certain digital cameras. DPOF Version 1.0 allows the following functions to be specified on the camera: (1) multi-image print, (2) specifying the size of printed images, (3) auto-transfer via Internet and fax, and (4) auto play for slide show. The multi-image-print feature enables one to specify the number of images to be printed on one sheet. The specifying-the-size-of-printed-images feature enables one to specify the size of the printed images, so that one could use the prints for a variety of applications, such as displays and certificate materials. The auto-transfer-via-Internet-and-fax feature enables one to attach a message to image data and send the resulting data via email. The auto-play-for-slide-show feature enables one to specify the images to be played back on the liquid crystal displays of digital cameras, video projectors, or PCs for a slide show.
[0006] Another digital camera standard, Exif Reader, available at "http://www.takenet.or.jp/˜ryuui/minisoft/exifread/english/," provides numerous TIFF/EP tags that may be attached to the image data generated by digital cameras.
[0007] None of the above prior art, however, appears to address the need for a digital camera whose data is easily transferable to a destination without inputting destination addresses into the camera itself. In addition, none of the above prior art appears to address the need to include image tags in video/audio data as well as in still images.

[0008] Thus, there exists a need for a digital camera from which still image data as well as video/audio data can be easily transferred to a destination without inputting destination addresses into the camera.
SUMMARY OF THE INVENTION

[0009] The present invention provides a method, and corresponding apparatus, for attaching a tag to data generated by an image capturing device for post processing in a remote location. It is to be noted that the data generated by an image capturing device can include, amongst others, still image data, video data, and/or audio data.

[0010] In one embodiment, a tag is affixed to the data (which can be still image data, video data, or audio data) within the image capturing device. In one embodiment of the present invention, the tag can be attached within the header of the data file. In another embodiment, the tag could be in the data portion of a still image file, or within a stream of data in a video file.

[0011] In one embodiment, the tag is an alias for predetermined instructions according to which the image data is to be processed. Amongst other things, the tag can be a resolution tag, a cropping tag, a red-eye removal tag, or a quick-send tag. In one embodiment, the tag comprises an alias for a destination address to which the image data is to be sent. For instance, the tag can be "Mom," while the destination address to which this alias corresponds can be Mom's email address. The image capturing device itself may contain only a listing of aliases, one or more of which can be selected by the user. In a remote location, these aliases can then be matched up with the actual destination addresses. This remote location can be a personal computer, a cell phone, a PDA (Personal Digital Assistant), a remote server, etc.

[0012] In one embodiment, the tag includes identifying information about the content of the image (e.g., names of the subjects, location, event, etc.). Further, in one embodiment, the tag contains data indicating the status of the tag information within the system, and the status of actions taken within the system to process the tagged image/video data. It will be understood by one skilled in the art that this concept of tags being matched up with other information at a remote location need not be limited to tags comprising an alias for a destination address, but rather can be applied to various other kinds of tags as well.

[0013] For a further understanding of the nature and advantages of the invention, reference should be made to the following description taken in conjunction with the accompanying drawings.
BRIEF DESCRIPTION OF THE DRAWINGS

[0014] FIG. 1A is a block diagram of a digital camera according to one embodiment of the invention.

[0015] FIG. 1B depicts a block diagram of an image data file according to one embodiment of the invention.

[0016] FIG. 2 depicts a block diagram of a computer system according to one embodiment of the invention.

[0017] FIG. 3 depicts a simplified flow chart of a method of image tagging for post processing according to one embodiment of the invention.

[0018] FIG. 4 depicts a simplified flow chart of attaching an image tag according to the method of FIG. 3.

[0019] FIG. 5 depicts a simplified flow chart of processing image data according to the method of FIG. 3.

[0020] FIG. 6 depicts a block diagram of a computer connected to a communication network according to one embodiment of the invention.

[0021] FIG. 7A is a block diagram of a system in accordance with one embodiment of the present invention.
[0022] FIG. 7B is a block diagram of another system in accordance with an embodiment of the present invention, and FIG. 7C is a flowchart illustrating the working of the systems of FIGS. 7A and 7B.
DESCRIPTION OF THE EMBODIMENTS

[0023] The figures (or drawings) depict a preferred embodiment of the present invention for purposes of illustration only. It is noted that similar or like reference numbers in the figures may indicate similar or like functionality. One of skill in the art will readily recognize from the following discussion that alternative embodiments of the structures and methods disclosed herein may be employed without departing from the principles of the invention(s) herein. It is to be noted that the present invention relates to any type of data that can be captured by a digital camera, such as, but not limited to, still image, video, or audio data. For convenience, in some places "image" or other similar terms may be used in this application. Where applicable, these are to be construed as including any such data capturable by a digital camera.
[0024] Referring to FIG. 1A, a digital camera 50 includes an imaging device 100 and a processing system 150. The imaging device includes a lens 102 having an iris, a filter 104, an image sensor 106, a timing generator 108, an analog signal processor (ASP) 110, an analog-to-digital (A/D) converter 112, a digital signal processor (DSP) 114, and one or more motors 116.
[0025] In operation, imaging device 100 captures an image of object 101 via reflected light impacting image sensor 106 along an optical path 118. Image sensor 106 generates a set of raw image data representing the captured image. The raw image data is then routed through ASP 110, A/D converter 112, and DSP 114. DSP 114 has outputs coupled to timing generator 108, ASP 110, and motors 116 to control these components. DSP 114 also has its output coupled to processing system 150 via a bus 151. The raw image data are transmitted to system 150 and processed therein.
[0026] In one embodiment, processing system 150 includes a bus interface 152, a processor 154, a read-only memory (ROM) 156, an input device 158, a random access memory (RAM) 160, an I/O interface 162, a flash memory 164, a non-volatile memory 166, and an internal bus 168.
[0027] Bus interface 152 is a bi-directional first-in, first-out interface for receiving the raw image data and control signals passed between system 150 and DSP 114. Processor 154 executes programming instructions stored in ROM 156 and RAM 160 to perform various operations. ROM 156 generally stores a set of computer-readable program instructions which control how processor 154 accesses, transforms, and outputs the image data. In one implementation, ROM 156 also stores a start-up program or file that enables a user to access the images stored in the flash memory using any computer, whether or not companion driver software is installed.
[0028] Input device 158 generally includes one or more control buttons (not shown) which are used to input operating signals that are translated by processor 154 into an image capture request, an operating mode selection request, and various control signals for imaging device 100. I/O interface 162 is coupled to internal bus 168 and has an external port connector (not shown) that can be used to couple digital camera 50 to a computer 200 for viewing and editing the image data stored in flash memory 164. The camera and computer may be coupled to each other via a communication link 163. In one implementation, I/O interface 162 is a universal serial bus (USB) port.
[0029] Flash memory 164 stores the image data processed by the processor as image data files (see FIG. 1B). In one implementation, flash memory 164 is a removable flash card or disk (e.g., SmartMedia™, CompactFlash™, SecureDigital (SD) card, etc.), so that a user may replace a full flash card with a new flash card to store additional image data. In other implementations, types of non-volatile memory other than flash cards may be used.
[0030] FIG. 1B illustrates a schematic block diagram of an image data file 180 including a header 182, compressed image data 184, and a tag field 186. Header 182 includes information identifying the corresponding image data file 180. Image data 184 represents an image captured with camera 50. The image data is generally in a compressed form, e.g., in JPEG format, to conserve memory space of flash card 164. Tag field 186 includes tags, e.g., a resolution tag 188, a cropping tag 190, a red-eye removal tag 192, and a quick-send tag 194, that provide instructions to computer 200 for post processing, as well as other types of tags.
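For purposes of illustration only, the image data file of FIG. 1B may be sketched as a simple data structure. The Python fragment below is not part of the disclosed invention; the field names and the numeric tag codes (reused from the figure's reference numerals) are assumptions chosen solely for clarity.

```python
from dataclasses import dataclass, field

# Hypothetical numeric codes, borrowed from FIG. 1B's reference numerals.
RESOLUTION_TAG = 188
CROPPING_TAG = 190
RED_EYE_REMOVAL_TAG = 192
QUICK_SEND_TAG = 194

@dataclass
class ImageDataFile:
    """Sketch of image data file 180: header 182, compressed data 184, tag field 186."""
    header: dict                  # identifying information, e.g., the image counter value
    compressed_image: bytes       # e.g., JPEG-compressed pixel data
    tag_field: list = field(default_factory=list)  # zero or more post-processing tags

    def attach_tag(self, tag):
        """Store a tag in the tag field for later interpretation by the host."""
        self.tag_field.append(tag)

# Example: a capture tagged for upscaling and quick-send to two aliases.
f = ImageDataFile(header={"image_id": 42}, compressed_image=b"\xff\xd8...")
f.attach_tag((RESOLUTION_TAG, {"target_megapixels": 3}))
f.attach_tag((QUICK_SEND_TAG, {"aliases": ["Jay", "Bryed"]}))
```

Note that the tags carry only instructions, not the processing logic itself, which is the point of deferring work to the host.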
[0031] Referring back to FIG. 1A, non-volatile memory 166 stores an image counter whose current value becomes an identifier for each new set of image data captured by camera 50. The counter is preferably incremented each time a new image is captured.
[0032] Referring to FIG. 2, computer 200 includes an I/O interface 202 which can be used to couple computer 200 to camera 50. The computer also includes a bus 204 for communicating data, a central processing unit (CPU) 206 coupled to bus 204 to process data, a memory 208 coupled to bus 204 to store data and instructions to be executed by CPU 206, and a communication interface 208 coupled to a network via a communication link 209. The communication interface may be an integrated services digital network (ISDN) card, a modem, a local area network (LAN) card, or the like. Computer 200 is coupled to a display device 210, e.g., a monitor, via bus 204 to display information, and to an input device 212, e.g., a keyboard, to input data to the computer. In operation, computer 200 serves as a host device for viewing, editing, and otherwise processing image data files received from camera 50 via I/O interface 202, as explained in more detail later in connection with FIG. 5. Alternatively, another electronic device, e.g., a cellular phone or personal digital assistant, may be used as the host device in place of the computer. In another embodiment, the system may consist of an interface (possibly wireless) in the camera itself communicating with a router through which the camera can send data directly to a server. In yet other implementations, the image data files can be transmitted to the host device via an intermediary electronic device, such as a flash card reader (not shown).
[0033] Referring to FIG. 3, a process 300 depicts a method of image tagging for post processing according to one embodiment of the present invention. At step 302, a user takes a picture using camera 50, from which raw image data is generated by image sensor 106. Processing unit 154 processes the raw image data, where the processing includes compressing the data to a more manageable size (step 304). In one implementation, the image data is compressed into the Joint Photographic Experts Group (JPEG) format. The user views the image corresponding to the saved image data and selects one or more tags to be attached to the image data (step 306) via the user interface of input device 158. Thereafter, using the tags, computer 200 can process the image data files automatically, without specific user initiative, upon receiving them from camera 50, according to the instructions specified in the tags. As a result, camera 50 does not require a powerful processor, since heavy data processing functions may be allocated to computer 200. At the same time, the inconvenience to the user of editing or otherwise processing the image data on computer 200 is reduced.
[0034] Once the user decides on the tags to be attached, they are attached to the image data and stored in flash memory or flash card 164. That is, one or more tags are stored in tag field 186 of the image data file. In other embodiments, the tags are stored in the tag field in the stream of a video file. In yet other embodiments, the tags are interleaved or encoded into the still image or video data itself. The image data file is transmitted to computer 200 either by linking camera 50 to the computer, or by removing the flash card and inserting it into a flash card reader that is coupled to the computer (step 308). Computer 200 processes the image data file according to the tags in the tag field (step 310). In one embodiment, the tags are extracted on the host PC, and each tag is then looked up in a database on the PC. As explained in further detail below, in one embodiment, each tag in the database has one or more destination addresses associated with it. In one embodiment, the PC sends the image data, along with these associated destination addresses, to a server. In one embodiment, the image data may be automatically modified on the PC for optimized delivery through the server to a recipient, based upon various factors (e.g., file type, connection, Internet congestion, recipient's platform and conduit, etc.). The server then delivers the image data to each of the specified destination addresses. Another example is as follows: if the image data has a tag instructing the computer to increase the pixel resolution of the image data from one megapixel to three megapixels, the computer performs an appropriate algorithm to increase the resolution of the image.
[0035] Referring back to step 306, a method 400 (FIG. 4) depicts a method of attaching tags to the image data according to one implementation of the present invention. At step 402, the user views the image data stored in RAM 160 or flash card 164. Generally, digital cameras, such as camera 50, include a liquid crystal display (not shown) for viewing images. While viewing images on the liquid crystal display, the user may select an action to be performed subsequently by a host device (step 404). A tag with appropriate instructions is attached to the image data (step 406). It should be noted that a tag may simply be an integer which is interpreted on the host to indicate an action, a set of data, or both. The user is prompted as to whether he or she is finished with the tagging (step 408). If so, method 400 ends and process 300 continues on to step 308. If not, steps 404 to 408 are repeated.
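The observation that a tag may simply be an integer interpreted on the host can be illustrated, for purposes of illustration only, by a host-side code table. The mapping and action names below are assumptions, not part of the disclosure.

```python
# Hypothetical host-side interpretation of bare integer tag codes
# (codes reuse FIG. 1B's reference numerals for readability).
TAG_ACTIONS = {
    188: "increase_resolution",
    190: "crop",
    192: "remove_red_eye",
    194: "quick_send",
}

def interpret_tag(code):
    """Translate an integer tag stored by the camera into a host action name."""
    try:
        return TAG_ACTIONS[code]
    except KeyError:
        # Tolerate unrecognized codes, e.g., from a newer camera model.
        return "ignore_unknown_tag"
```

Keeping the camera-side representation this small is what allows a camera with a modest processor to defer all heavy work to the host.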
[0036] In one implementation, camera 50 enables the user to attach one or more of the following tags to the image data: (1) resolution tag 188, (2) cropping tag 190, (3) red-eye removal tag 192, (4) quick-send tag 194 (see FIG. 1B), and various other types of tags. The resolution tag instructs a host device, e.g., computer 200, to automatically convert the image data from one resolution size to another. For example, camera 50 may be configured to save images at a resolution of one megapixel. The user may view the captured image, and if he or she likes the picture and wishes to enlarge it, the user may select to have the image data converted to a greater resolution, e.g., three megapixels. A method of converting an image from one resolution size to another is described in U.S. Pat. No. 6,058,248, which is incorporated by reference herein for all purposes.
[0037] In one implementation, the user may select from a plurality of resolution sizes or manually input the desired resolution size. In another implementation, the user may elect to have the image converted to a lower resolution size, particularly when the user wishes to email the image data to another person, to minimize use of the communication bandwidth. In yet another implementation, the camera may be programmed to attach a resolution tag automatically, without specific user initiative. For example, the user may set the default resolution size to two megapixels and require the camera to automatically attach a resolution tag to generated image data, where the resolution tag instructs a host device to convert the image data from two megapixels to one megapixel.
[0038] The cropping tag instructs computer 200 to automatically remove undesired portions of a picture. The user may view the captured image and decide which portion of the image to retain and which to delete. A method of cropping image data is described in U.S. Pat. No. 5,978,519, which is incorporated by reference herein for all purposes.
[0039] The red-eye removal tag instructs computer 200 to automatically edit the image to remove the red-eye effects on pictures taken in poorly lighted environments. Pictures taken in poorly lighted environments may cause the pupils of people or animals to take on a red tint. The user may review the picture taken and, if necessary, attach a tag instructing the computer to automatically remove the red-eye effects on the picture. In one implementation, the camera may be provided with a light sensor (not shown) and programmed to attach a red-eye removal tag automatically whenever a picture is taken in a poorly lighted environment. For example, the red-eye removal tags may be automatically attached to the images captured whenever a flash light (not shown) of the camera goes off. A method of removing the red-eye effects is described in U.S. Pat. No. 6,134,339, which is incorporated by reference herein for all purposes.
[0040] The quick-send tag instructs computer 200 to automatically send the image data to another person or entity via a communication network. Camera 50 may include a plurality of communication addresses, e.g., email addresses, in ROM 156. For each picture taken, the user may select one or more recipients to whom the picture should be sent. In one embodiment of the present invention, the quick-send tag may comprise an alias, rather than the actual address of the recipient. The use of such aliases is discussed in greater detail below with reference to FIGS. 7A and 7B. When the image data files corresponding to the pictures are received by computer 200, they are automatically sent to the selected addresses, as explained in more detail below.
[0041] As mentioned above, tags can be of various other types. For example, in one embodiment, the tag includes identifying information about the content of the image (e.g., names of the subjects, location, event, etc.). Further, in one embodiment, the tag contains data indicating the status of the tag information within the system, and the status of actions taken within the system to process the tagged image/video data. For example, a status tag could include information such as status=delivery attempted <date>; result=failed; retry pending. In other implementations, the user may attach types of tags other than those listed above, e.g., a stitching tag that instructs computer 200 to stitch two or more pictures together.
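For purposes of illustration only, the semicolon-separated status tag in the example above could be decoded as follows. This parsing scheme is an assumption drawn from the single example string; a real system may encode status differently, and the sample date is hypothetical.

```python
def parse_status_tag(raw):
    """Parse a semicolon-separated status tag into a dictionary.

    Fields containing '=' become key/value pairs; bare fields
    (e.g., 'retry pending') are recorded as flags set to True.
    """
    info = {}
    for part in raw.split(";"):
        part = part.strip()
        if not part:
            continue
        if "=" in part:
            key, value = part.split("=", 1)
            info[key.strip()] = value.strip()
        else:
            info[part] = True
    return info

# Example mirroring the status string in the text, with a sample date filled in.
status = parse_status_tag(
    "status=delivery attempted 2003-01-15; result=failed; retry pending"
)
```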
[0042] Referring back to step 310, a method 500 (FIG. 5) depicts a method of processing image data file 330 in computer 200. At step 502, computer 200 receives the image data file via I/O interface 202. In one implementation, I/O interface 202 of computer 200 is coupled to I/O interface 162 of camera 50 to enable the computer to receive the image data file. In another implementation, flash card 164 is removed from camera 50 and inserted into a flash card reader, which is coupled to I/O interface 202 of the computer, to transmit the image data file to computer 200. The camera and flash card reader may be coupled to the computer via a physical connection or wireless connection.
[0043] Using CPU 206, computer 200 checks tag field 186 of the received image data file to determine whether corresponding image data 184 needs to be processed in a particular manner according to tags in tag field 186 (step 504). In one implementation, the received image data file is first stored in memory 208 before the tag field is checked by CPU 206. If CPU 206 determines that the tag field does not contain any tag, then image data 184 is processed according to a default setting, i.e., the image data is decompressed and displayed on display device 210 (step 510). Thereafter, the user may edit, print, send, or perform other functions on image data 184 using input device 212.
[0044] On the other hand, at step 506, if there are one or more tags (e.g., resolution tag 188 and quick-send tag 194) in the image tag field, CPU 206 retrieves one of the tags to be processed (step 508). The tags may be retrieved in any order, or in the order in which they were attached in method 400. In this exemplary implementation, the resolution tag is retrieved first, where the resolution tag instructs the computer to convert the image data from a resolution of one megapixel to a resolution of three megapixels. The computer processes the image data by performing an appropriate algorithm to increase the resolution (step 510). In one implementation, a resulting image data file 180′ with new image data 184′ of three megapixels is saved in memory 208 of the computer. Thereafter, the image with the increased resolution is displayed on display device 210 to the user for viewing, editing, or performing other functions on the image data.
[0045] At step 512, the computer checks the tag field to determine if there are any other tags in the tag field. Since another tag exists in tag field 186 in this exemplary implementation, steps 508 to 510 are repeated. The remaining tag, quick-send tag 194, is retrieved (step 508). In one implementation, these subsequent steps may be performed prior to displaying the new image data 184′ on the display device. The tag instructs the computer to transmit the image data file to one or more recipients, e.g., Jay Feldis and Bryed Billerbeck. In one embodiment, the tag may include the email addresses of these recipients. In another embodiment, the tag is simply an alias or reference (e.g., an integer) to an entry in a database on the host. The entry in the database matches up each tag with one or more destination addresses, as explained below in more detail with reference to FIGS. 7A and 7B. The computer connects to a communication network via link 209. As shown in FIG. 6, computer 200 is coupled to a plurality of remote computers 230 to 234 via a communication network 240, e.g., the Internet.
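For purposes of illustration only, the tag-processing loop of steps 504 through 512 may be sketched as follows. The handler names and the dictionary representation of image data are assumptions made for clarity, not the disclosed implementation.

```python
# Hypothetical handlers for two of the tag types discussed in the text.
def increase_resolution(data, params):
    """Stand-in for the resolution-conversion algorithm (steps 508-510)."""
    return dict(data, megapixels=params["target_megapixels"])

def quick_send(data, params):
    """Stand-in for transmitting the file to each resolved recipient."""
    return dict(data, sent_to=params["recipients"])

HANDLERS = {"resolution": increase_resolution, "quick_send": quick_send}

def process_image_file(image_data, tag_field):
    """Steps 504-512: apply each tag in order; with no tags, take the default path."""
    if not tag_field:
        return dict(image_data, displayed=True)   # step 510 default: decompress/display
    for tag_name, params in tag_field:            # loop of steps 508-512
        image_data = HANDLERS[tag_name](image_data, params)
    return image_data

# Example mirroring the text: upscale to three megapixels, then quick-send.
result = process_image_file(
    {"megapixels": 1},
    [("resolution", {"target_megapixels": 3}),
     ("quick_send", {"recipients": ["jay@feldis.com", "bryed@yahoo.com"]})],
)
```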
[0046] The computer initiates an Internet connection, and once connected to the Internet, the image data file is sent to the email addresses of Jay Feldis and Bryed Billerbeck. In one embodiment, the data is sent via a local email client. In another embodiment, the data is sent through a server, which then sends the data to the recipients (e.g., via an SMTP mail software component on the server). Jay and Bryed, having access to remote computers 230 and 232, respectively, may retrieve the image data file transmitted by computer 200. In one implementation, the transmitted image data file is the original image data file 180 transmitted by camera 50, where the image data is formatted to have a resolution of one megapixel. Therefore, upon receipt of image data file 180, the remote computers may automatically increase the resolution of image data 184 to three megapixels according to the instructions provided in resolution tag 188 before displaying the image on their respective display devices. Alternatively, the transmitted image data file may be the new image data file 180′ with the new image data 184′ of three megapixels, thereby eliminating the need to process the resolution tag before displaying the image. One advantage of transmitting the original image data file 180 is bandwidth conservation during transmission of the data file.
[0047] FIG. 7A is a block diagram of a system in accordance with one embodiment of the present invention. FIG. 7A comprises an image capturing device 710, a host 720, a network 730, and a destination 740. Data from the image capturing device 710 can be downloaded to the host 720, and then be transferred to the destination 740 via the network 730.
[0048] The image capturing device 710 comprises a data storage module 712, an alias table 714, and a display module 716. In one embodiment, the image capturing device 710 is a digital camera. In one embodiment, the data storage module 712 comprises only internal memory, such as NAND Flash, etc. In another embodiment, the data storage module 712 comprises only external (or removable) memory, such as Compact Flash, Smart Media Card, SD, memory sticks, etc. In yet another embodiment, the data storage module 712 comprises both internal and external memory.
[0049] In one embodiment, the alias table 714 is a listing of various aliases set up by the user. In one embodiment of the present invention, the user sets up the aliases on the image capturing device 710 itself. In another embodiment of the present invention, the user sets up the aliases on a host 720, and these can then be downloaded to the image capturing device 710, either directly (e.g., if the host 720 is a personal computer) or via the network 730. It will be obvious to one of skill in the art that the use of the word "table" is simply illustrative. The aliases can be stored as a table, a list, or in any other format. Table 1 below provides an example of an alias table.
TABLE 1

| Alias   |
|---------|
| Mom     |
| Family  |
| Jay     |
| Bryed   |
| Friends |
| Work    |
[0050] The display module 716 can display several things, including but not limited to a preview of data about to be captured by the user, previously captured data for review by the user, and a choice of aliases from which the user can select one or more aliases. In one embodiment, the display module 716 comprises a Liquid Crystal Display (LCD) or a Liquid Crystal Monitor (LCM). Further, the display module 716 can also be used to receive user selections/instructions, such as which images are to be sent to what destinations, etc. In one embodiment, this can be done by displaying a user interface on the display module 716.
[0051] In one embodiment of the present invention, the host 720 is a remote server. In one embodiment, the image capturing device 710 communicates with the remote server via the network 730. As an example, a digital camera may not need to download pictures to a local personal computer, but instead may directly communicate with a remote server over the network. In other embodiments, the host 720 is a personal computer, a cell phone, a networked storage device (a media library), a Personal Digital Assistant (PDA), etc. In such embodiments, the image capturing device 710 communicates directly with the host 720, without having to go through the network 730. Such an embodiment is explained in further detail with reference to FIG. 7B below.
[0052] The host 720 includes a receiving module 722 and a look-up table 724. The receiving module 722 receives the image data from the image capturing device 710. The look-up table 724 can be created, in one embodiment, on the host 720. The tags, as well as the instructions associated with each tag, are entered on the host 720. The tags are then downloaded onto the image capturing device 710. In another embodiment, the look-up table 724 is harvested from other applications on the host 720, or from elsewhere on the network 730.
[0053] In one embodiment, the look-up table 724 comprises aliases mapped to destination addresses. Thus, if some data is associated with a specific alias, the look-up table 724 serves to translate the alias to a destination address. It is to be noted that a single alias could refer to a single destination address, or to multiple destination addresses (i.e., to a group of destination addresses). The destination addresses can be any type of address, such as email addresses, Instant Messenger (IM) addresses, cell phone addresses, etc. Moreover, it will be obvious to one of skill in the art that the use of the word "table" is simply illustrative. The information can be stored as a table, a list, or in any other format. Table 2 provides an example of a look-up table 724. It is to be noted that in other embodiments, table 724 also includes other information, such as IM buddy names, the address of a storage location such as a data storage location, media library, or website, etc.
TABLE 2

| Alias   | Address(es)                                                     |
|---------|-----------------------------------------------------------------|
| Mom     | jane@yahoo.com                                                  |
| Family  | john@hotmail.com, jane@yahoo.com, mary@abc123.com               |
| Jay     | jay@feldis.com                                                  |
| Bryed   | bryed@yahoo.com                                                 |
| Friends | susan@hotmail.com, tim@yahoo.com, joanne@xyz.com, peter@rst.com |
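For purposes of illustration only, the translation performed by look-up table 724 may be sketched as follows, using the example entries of Table 2. The function and its de-duplication behavior are assumptions; the disclosure specifies only that an alias maps to one or more destination addresses.

```python
# Look-up table 724, populated with Table 2's example entries.
LOOKUP_TABLE = {
    "Mom": ["jane@yahoo.com"],
    "Family": ["john@hotmail.com", "jane@yahoo.com", "mary@abc123.com"],
    "Jay": ["jay@feldis.com"],
    "Bryed": ["bryed@yahoo.com"],
}

def resolve_aliases(aliases, table=LOOKUP_TABLE):
    """Translate camera-side aliases into a de-duplicated list of addresses.

    A single alias (e.g., "Family") may expand to several addresses, and an
    address shared by two aliases is delivered to only once.
    """
    addresses = []
    for alias in aliases:
        for addr in table.get(alias, []):
            if addr not in addresses:
                addresses.append(addr)
    return addresses
```

Note that the camera never stores these addresses; it carries only the alias, and the host performs the expansion.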
[0054] The network 730 is any type of network, such as a wide area network (WAN) or a local area network (LAN). The wide area network may include the Internet, the Internet2, and the like. The local area network may include an Intranet, which may be a network based on, for example, TCP/IP, belonging to an organization and accessible only by the organization's members, employees, or others with authorization. The local area network may also be a network such as, for example, Netware™ from Novell Corporation (Provo, Utah) or Windows NT from Microsoft Corporation (Redmond, Wash.).
[0055] The destination 740 may be specified by an email address, an instant messenger address, an e-frame address, a cell phone number, and so on. It will be obvious to a person of ordinary skill in the art that the destination 740 is any destination where the data can be sent. For instance, it is possible for the destination to be the postal address of a person, where a DVD, video cassette, and/or hard copies of photos could be delivered (via a store, etc., where the DVDs, etc., are prepared).
[0056] FIG. 7B is a block diagram of another embodiment in accordance with the present invention. The components of the system include an image capturing device 710, a host 720, networks 730a and 730b, a server 735, and a destination 740.
[0057] The image capturing device 710 is as described above with reference to FIG. 7A. However, unlike in FIG. 7A, the image capturing device 710 does not connect directly to the network 730. Instead, the image capturing device connects to host 720, via which it connects to the network 730a.
[0058] In one embodiment of the present invention, the host 720 is a personal computer (PC). In other embodiments, the host 720 is a cell phone, a networked storage device (a media library), a Personal Digital Assistant (PDA), etc. The host 720 includes a receiving module 722 and a look-up table 724, which have been described with reference to FIG. 7A. In addition, in one embodiment, the host 720 also has an image modification module 725. Amongst other things, the image modification module 725 modifies (e.g., compresses) images/videos to allow quick download by recipients, and to keep costs down. In one embodiment, such compression and image modification is configurable, is specified by the server 735, and happens dynamically when the host 720 connects to the server 735.
The host 720 then connects to a server 735 through a network 730a. The server 735 connects to the destination 740 through a network 730b. It is to be noted that networks 730a and 730b could be the same network, or could be distinct networks. In one embodiment, the server 735 serves various functions. For instance, in one embodiment, the server 735 could create thumbnails of the images/videos to send to the destination 740. In one embodiment, the server 735 creates Hyper Text Markup Language (HTML) emails to send to the destination 740. In one embodiment, the server 735 creates email text in the appropriate language. In yet another embodiment, the server 735 reformats data to optimize delivery to the recipient based on the data type, connection, network traffic, etc. In still another embodiment, the server 735 selects appropriate communication channels (e.g., email, instant messenger, cell phone, etc.) based on factors such as address type, file size, etc.[0059]
FIG. 7C is a flowchart illustrating how the systems in FIGS. 7A & 7B would work in accordance with one embodiment of the present invention. Initially, the data (such as still image data, video data, or audio data) to be processed (e.g., to be sent to a destination) is selected by a user on the image capturing device 710. The image capturing device 710 thus receives selection of the data 750. In one embodiment, the data from the data storage 712 is displayed on the display module 716. This data may be displayed as individual data items (e.g., individual still images) or as a collection of these (e.g., thumbnails of several still images), etc. The user can then select desired data via the display module 716, and the image capturing device 710 receives selection of the data 750.[0060]
The user also selects one or more tags to be associated with the data. The image capturing device 710 thus receives the selection of a tag(s) to be associated with the selected image data 752. In one embodiment, the tag can refer to destination(s) to which the selected data is to be sent. In another embodiment, the tags can refer to the subject matter of the picture being taken (e.g., the picture being taken is of "Mom" or of "Christmas"). In yet another embodiment, when the data being selected is video data, the tag can be a video editing tag. Such a video editing tag may include instructions to trim one or both ends of the video file, mark one or more significant frames in the video file for special processing (e.g., display on the host), indicate which frame to represent as a thumbnail, etc.[0061]
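The tag varieties just described can be sketched as a single record type; this is only an illustration, and the field names and the `Tag` class itself are assumptions, not part of the specification:

```python
from dataclasses import dataclass
from typing import Optional

# Hypothetical tag record covering the tag kinds described above:
# destination aliases, subject-matter tags, and video editing tags.
@dataclass
class Tag:
    kind: str                              # "destination", "subject", or "video_edit"
    value: str                             # e.g., alias "Mom" or subject "Christmas"
    trim_start_s: Optional[float] = None   # video edit: trim head of clip (seconds)
    trim_end_s: Optional[float] = None     # video edit: trim tail of clip (seconds)
    thumbnail_frame: Optional[int] = None  # video edit: frame to use as thumbnail

# A user might attach both a destination tag and a video editing tag:
selected_tags = [
    Tag(kind="destination", value="Mom"),
    Tag(kind="video_edit", value="clip_0042",
        trim_start_s=1.5, thumbnail_frame=30),
]
```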
In one embodiment, the selection of the tags is performed by the user by interacting with the display module 716. It will be obvious to one skilled in the art that the user may select the tags in one of several ways. For instance, in one embodiment, the display module 716 displays a list of possible tags. An example of such a list comprising aliases is provided in Table 1 above. The user can thus select one of the tags displayed. For instance, the user may choose to send a particular video clip to "Mom." In another embodiment, the user inputs the tag using the display module 716 (e.g., using a stylus, or by using a "keyboard" that appears on the display module 716), rather than selecting it from a list. In another embodiment, the user manipulates certain input devices (e.g., buttons on the image capturing device 710) to make the selections (e.g., of the data and the tags). In any case, the image capturing device 710 receives 752 the selection of one or more tags to be associated with the selected data.[0062]
The selected tag(s) is then attached 754 to the selected data. In one embodiment, the tag is inserted into the header of the data. In another embodiment, the tag is encrypted and embedded into the data itself. This can be done by interleaving the tag with pixels of image data. In one embodiment, this interleaving is done in a way that is not visible to the user when the data is displayed. In another embodiment, this interleaved tag is extracted before the data is displayed to the user. In one embodiment, existing technologies (e.g., the JPEG-EXIF metadata standard) can be used to insert tags into still images and/or videos.[0063]
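The pixel-interleaving embodiment can be illustrated with a minimal sketch. This assumes 8-bit pixel values in a flat list and writes the tag bytes into the least-significant bits only (so each pixel value changes by at most one, which is why the tag is not visible when the data is displayed); the function names are hypothetical:

```python
# Minimal sketch of interleaving a tag with pixel data, assuming
# 8-bit grayscale pixels as a flat list of ints (0-255).
def embed_tag(pixels, tag: bytes):
    """Write each bit of the tag into the LSB of successive pixels."""
    bits = [(byte >> i) & 1 for byte in tag for i in range(8)]
    out = list(pixels)
    for i, bit in enumerate(bits):
        out[i] = (out[i] & ~1) | bit   # overwrite the LSB only
    return out

def extract_tag(pixels, length: int) -> bytes:
    """Recover `length` tag bytes from the pixel LSBs."""
    bits = [p & 1 for p in pixels[:length * 8]]
    return bytes(
        sum(bits[i * 8 + j] << j for j in range(8))
        for i in range(length)
    )
```

A production embodiment would also encrypt the tag before embedding and record its length, as the specification contemplates; both are omitted here for brevity.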
The modified data (i.e., the data including the destination alias) is downloaded 756 to a host 720. The table look-up is then performed 758 to identify the instructions corresponding to the tags. For example, a look-up may identify which destination or destinations are specified by each alias tag. An example of a look-up table is provided in Table 2 above. It can be seen from Table 2 that, for example, if "Mom" were the alias tag inserted 754 into the data, then the data would be emailed to jane@yahoo.com. If "Family" were the alias tag inserted 754 into the data, then the data would be emailed to john@hotmail.com, jane@yahoo.com, and mary@abc123.com. It will be obvious to one skilled in the art that the present invention is not limited to sending data to email addresses alone. Rather, the present invention can be used to send data to instant messenger addresses, web sites specified by URLs, and any other such destinations which can be specified using a destination address. Further, it should be noted that, as mentioned above, the tags could identify things other than destination addresses, such as the subject matter of the photograph, etc.[0064]
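A minimal sketch of the table look-up step 758, using the Table 2 entries quoted above (the `resolve` helper is hypothetical):

```python
# Alias look-up table in the spirit of Table 2 above: each alias tag
# maps to one or more destination addresses.
LOOKUP_TABLE = {
    "Mom": ["jane@yahoo.com"],
    "Family": ["john@hotmail.com", "jane@yahoo.com", "mary@abc123.com"],
}

def resolve(alias: str):
    """Return the destination addresses for an alias tag (empty if unknown)."""
    return LOOKUP_TABLE.get(alias, [])
```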
In one embodiment, the data is intelligently formatted 759. In one embodiment, when the tag specifies a destination to which the data is to be sent, intelligent formatting 759 includes sending each recipient a single message (e.g., email), even if, for example, multiple still images are to be sent to him. FIG. 7D illustrates some details.[0065]
Referring to FIG. 7D, it can be seen that the tags are associated with the data on the image capturing device 710. In one embodiment, each piece of data (e.g., a still image, a video file, etc.) is uploaded to the host 720 (and/or the server 735) only once. The tags associated with each piece of data are then looked up in a look-up table as described above. The instructions associated with each tag are thus identified (e.g., the specific destination address(es) corresponding to each tag, to which the still image and/or video data is to be sent). The data for transfer is then formatted intelligently such that each destination address is followed by a list of still images and/or video data to be sent to that address. In this way, each recipient only receives a single message, regardless of how many pieces of data are received by him.[0066]
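The one-message-per-recipient grouping just described amounts to inverting the tag-to-address mapping; a sketch under the assumption that each data item carries a filename and its alias tags (the file names and helper below are illustrative only):

```python
from collections import defaultdict

# Sketch of intelligent formatting 759: resolve each item's alias
# tags to addresses, then invert the mapping so every address is
# followed by the full list of items bound for it. Each recipient
# thus receives a single message.
def group_by_recipient(items, lookup):
    per_recipient = defaultdict(list)
    for filename, aliases in items:
        for alias in aliases:
            for address in lookup.get(alias, []):
                if filename not in per_recipient[address]:
                    per_recipient[address].append(filename)  # no duplicates
    return dict(per_recipient)

lookup = {"Mom": ["jane@yahoo.com"],
          "Family": ["john@hotmail.com", "jane@yahoo.com"]}
items = [("beach.jpg", ["Mom"]),
         ("party.mov", ["Family", "Mom"])]
# jane@yahoo.com appears under both aliases, yet receives one
# message containing both files.
```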
In one embodiment, when the tag specifies a destination to which the data is to be sent, intelligent formatting 759 includes optimizing data transfer. Such optimization may include generation of thumbnails of still image/video data, generating any text accompanying the data, and so on. In one embodiment, data is optimized based on the data type, the recipient's network connection, etc. In one embodiment, such information is obtained dynamically from the network. In another embodiment, such information is included in the look-up table.[0067]
Referring again to FIG. 7C, it can be seen that the data is processed 760 in accordance with the instructions associated with the tag. For example, if the tag is a destination tag, the data is sent to the destination specified by the destination address(es) corresponding to the alias tag(s). In one embodiment, the data can be sent through various communication channels (e.g., email, instant messaging, cell phone, etc.). In one embodiment, if a single recipient is identified by multiple different addresses (from the same or different communication channels), the recipient receives the message containing still image/video data on only one of these addresses. In another embodiment, the recipient receives the message on all of these addresses.[0068]
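The single-address embodiment can be sketched as a simple delivery policy; the recipient names, the ordering convention (first listed address wins), and the helper are all assumptions for illustration:

```python
# Sketch of the single-address delivery policy: when one recipient is
# known by several addresses (possibly on different channels), only
# one of them -- here, the first listed -- receives the message.
def pick_delivery_addresses(recipient_addresses):
    return {name: addrs[0]
            for name, addrs in recipient_addresses.items() if addrs}

recipients = {
    "Jane": ["jane@yahoo.com", "+1-555-0100 (cell)", "jane_im"],
}
```

The alternative embodiment (delivery to all addresses) would simply return the full address lists instead.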
As another example, if the tag is a subject matter tag, the data is sorted by it (e.g., stored in a folder named by the subject matter corresponding to the subject matter tag).[0069]
It is to be noted that the table look-up step 758, the intelligent formatting of the data step 759, and the processing of data step 760 may be performed on the same host, or on different hosts. For example, in one embodiment, the table look-up is performed on the host 720, while the intelligent formatting of the data 759 and the processing of data 760 are performed on the server 735. In another embodiment, each of these three steps is performed on the host 720. In yet another embodiment, each of these three steps is performed on the server 735. It will be obvious to one of ordinary skill in the art that any combination of these steps being performed on the host 720 and the server 735 is possible. It is also possible to have several hosts and/or servers for performing these various steps.[0070]
As will be understood by those skilled in the art, the present invention may be embodied in other specific forms without departing from the essential characteristics thereof. For example, the captured data could be audio data associated with an image, or separate from an image. The tag could be embedded in a field of the audio data, in the data itself, or in a header. Accordingly, the foregoing description is intended to be illustrative, but not limiting, of the scope of the invention which is set forth in the following claims.[0071]