CROSS-REFERENCE TO PROVISIONAL PATENT APPLICATIONS
The benefit of U.S. Provisional Patent Application Ser. No. 60/154,527, filed Sep. 16, 1999; Ser. No. 60/182,731, filed Feb. 15, 2000; and Ser. No. 60/221,104, filed Jul. 27, 2000 is claimed.
BACKGROUND OF THE INVENTION
The invention relates generally to fiber quality measurements for cotton classing and, more particularly, to image-based instrument measurements of Color and Trash.
Cotton standards are supported by the United States Department of Agriculture (USDA) through its Agricultural Marketing Service (AMS). Cotton standards, and the corresponding classing of cotton, are of great importance in determining the market value of a particular bale of cotton, as well as determining suitability of a particular bale of cotton from a gin for subsequent processing at a particular mill in view of the products and processes of that mill. AMS is responsible for preparing and maintaining such cotton standards and does so in its Standards Section located in Memphis, Tennessee.
In 1923, the United States and nine European countries entered into the Universal Cotton Standards Agreement. From that time, up until approximately 1965, USDA/AMS cotton classing “measurements” based on the Universal Standards were made entirely by humans. The human measurements included “grade,” “extraneous matter” (such as bark and grass), “preparation” (which relates to smoothness of the sample) and “staple length” (long fiber content). Instrument-based cotton classing was introduced in 1965, beginning with micronaire, followed in 1980 by High Volume Instruments (HVI), which added measurements of length and strength. HVIs currently measure the fiber qualities of Micronaire, Length, Strength, Color and Trash. Instruments for measuring Color (Rd, +b and +a, which refer to “reflectiveness,” “yellowness” and “redness”) and Trash (% Area) have also been developed, but Human Classer measurements of Color and Trash are still generally employed because of certain deficiencies in these methods.
Prior to 1993, Classer's Grade was a combination of color and leaf trash. In USDA/AMS classing, the human Classer now separately calls a color grade and a leaf grade. These Classer's calls for color and leaf grade are still the official measurements, along with a Classer's call about “Extraneous Matter” and “Preparation.” Although currently unofficial, instrumental measurements of Color and Trash, as well as exploratory measurements of Short Fiber Content are made, however deficient, both to satisfy mill requests for the information and to lay the foundation and to create demand for improvements.
The Classer measurements of color, trash, preparation and extraneous matter are primarily visual, and measurements of staple length are both tactile and visual. It is appropriate to call all human classifications “measurements” in a broad sense. This is because Classers are rigorously trained to compare their observations to cotton standards immediately available to them. That is, they measure first, by comparison, then put the sample into a “class.” If a Classer has any doubt about a classing call, the Classer can, sample in hand, go to a set of standards boxes to refresh the Classer's memory to improve the “call” by comparison to the standards for Grade and Staple.
Considering in particular the current methods of human visual fiber quality measurements, the Classer holds a sample from a bale and visually examines its appearance in good, meaning standardized, illumination. Making the measurements of Color Grade and Leaf Grade physiologically and psychologically amounts to comparing the images falling on the retinas and perceived by the brain with images from memory or, for confirmation, images from physical standards. The procedures of making such measurements are learned in extensive training. “Classer's Call” means that the Classer has measured the object and judged it to fall within or near to a standard or, if not, he or she comments why.
SUMMARY OF THE INVENTION
In embodiments of the invention, an optical imaging device having a defined object field of view acquires an image for classifying an unknown sample of cotton. The unknown sample of cotton is placed in the object field of view, as well as at least one reference material. As a result, the optical imaging device acquires images of both the unknown sample of cotton and the reference material in the same field of view, at the same time and under the same measurement conditions.
In an instrument classification embodiment, an instrumental “call” or classification of the sample under test is made without reconstructing an image of the object field for human viewing.
In a human classification embodiment, the image of the entire object field, including the sample under test and at least one reference material, is reconstructed for human classification, either locally or remotely. When the image file is transferred over the internet to a remote location, we refer to the embodiments as “Internet Classing.”
Inclusion of at least one reference material, having known and traceable optical properties, in the object field is an important feature in both embodiments, and overcomes the deficiencies of prior art methods. It is particularly advantageous to use reference materials which are similar to the samples under test. Thus cottons, whose color or trash properties are established by AMS or others, are included as reference materials. In the instrument classification embodiment, algorithms match the unknown to the knowns.
Embodiments of the invention can employ, as the optical imaging device, a high quality color scanner intended for office or graphics arts use in scanning documents. In the context of the invention, the inclusion of at least one reference material in the object field is not intended to refer to the usual and conventional calibration “strip” included in such color scanners. Such a calibration “strip” is internal to the scanner, and is generally employed to calibrate the intensity of the illumination within the scanner. The internal scanner calibration “strip” is not part of the acquired image data.
BRIEF DESCRIPTION OF THE DRAWINGS
FIG. 1 is an overview of a machine embodying the invention, which machine measures cotton samples to produce multiple data products, including images, and additionally internally and ultra-rapidly conditions samples;
FIG. 2 is a top view of the Ultra-Rapid Conditioning module and the Color and Trash module of the machine of FIG. 1, without a pressure/distribution cover plate in place;
FIG. 3 is a side view of the Ultra-Rapid Conditioning module and the Color and Trash module;
FIG. 4 is an end view of the Ultra-Rapid Conditioning module and the Color and Trash module;
FIG. 5 is a bottom view of the Ultra-Rapid Conditioning module and the Color and Trash module, showing the optical imaging device field of view;
FIG. 6 represents the manner in which a human Classer measures fiber quality;
FIG. 7 represents image based measurements of fiber quality;
FIG. 8 is a graph depicting Trash-measurement performance of an embodiment of the invention;
FIG. 9 is a graph depicting color (Rd) measurement performance of an embodiment of the invention;
FIG. 10 is a graph depicting color (+b) measurement performance of an embodiment of the invention;
FIG. 11 is a graph depicting color (Rd vs +b) measurement performance of an embodiment of the invention;
FIG. 12 is an image of a cotton sample acquired by an embodiment of the invention;
FIG. 13 shows a pattern of reference biscuits;
FIG. 14 shows interpolated reference biscuits; and
FIG. 15 shows an interpolated image.
DETAILED DESCRIPTION
Referring first to FIG. 1, the invention is embodied in a stand-alone instrument 100 which measures cotton samples to produce multiple data products, including images, and additionally internally and ultra-rapidly conditions samples. Instrument 100 is a robust, stand-alone platform upon which fiber quality measurement modules are placed to effect generation of multiple data products. By including internal, ultra-rapid sample conditioning, the instrument 100 eliminates the need for expensive conditioned laboratory space. The machine 100 thus does the work of several other instruments and an expensive laboratory air conditioning system, and does that work in the challenging ginning environment.
System Overview
Operator 101 in FIG. 1 selects a "Classer's Sample" 102 having an estimated weight of approximately 15 grams. Such a 15-gram sample is typically 5 inches (12.7 cm) wide×8 inches (20.32 cm) long×1 inch (2.54 cm) thick, when uncompressed. The operator "swipes" permanent bale identification (PBI) tag 104 through bar code reader 106, and prepares and introduces sample 102 into recessed conditioning/test chamber 110 of "stable table" top 111, when pressure/distribution plate 202 is retracted. (See also FIG. 2.) The operator 101 then initiates automatic conditioning/testing by causing pressure/distribution plate 202 to move over sample 102 in the recessed conditioning/testing chamber 110, compressing the sample to a thickness of less than 3 mm. Directed by a process control computer 112, the machine 100 then automatically effects "Ultra-Rapid Conditioning" in module 200, and additionally effects testing of the sample 102 for Color and Trash in module 300. (Operator 101 can monitor and control the progress of conditioning/testing, and of all other operations, as well as examine the data products produced, stored, and communicated by system 100 via computer 112 and touch-screen display 113.)
Conditioned gas for conditioning sample 102 in conditioning/testing chamber 110 and for transporting and processing sample 102 in subsequent steps is provided by air conditioning module 114. Air conditioning module 114 provides a conditioned gas flow 116 having controlled environmental parameters such as Relative Humidity of 65%, dry bulb Temperature of 70° F. (21° C.), and flow rates of 200 CFM (5.7 m3/min). Conditioned gas flow 116 is conducted to the entrance 117 for both the individualizer 120 flow 122 and for the sample conditioning module 200. In a variation, gas flow 116 is split into two components, one having the fixed, standard parameters just described and a second having variable humidity, temperature, flow rate and pressure, which variable parameters are automatically controlled by a separate controller within air conditioner 114, and which parameter values are determined in accordance with optimally conditioning sample 102 within conditioning/testing chamber 110.
In overview, sample 102, having been manually or automatically placed in recessed conditioning/testing chamber 110, with the pressure/distribution plate assembly 202 over it, is ultra-rapidly "conditioned" from above window 204 and "tested" for Color and Trash below it. Sample 102 may also be tested for moisture content in chamber 110, according to which data air conditioning module 114 is caused to optimally condition sample 102 under control of computer 112.
As a practical matter, the nominal transverse dimensions of the conditioning module 200 and Color and Trash testing module 300 are 8.5×11 inches (21.59×27.94 cm), the size of standard paper in the United States. This is because the Color and Trash module 300 is based on available high quality and high resolution color scanners intended for office and graphics arts use in scanning documents. However, any device for acquiring the raw image files and any transverse dimensions may be employed.
By way of example, there are two alternative embodiments involving primarily valve 208 (FIG. 3) and perforated plate 212 (FIG. 3). Downward force on sample 102 in recessed conditioning/testing chamber 110 is important for the Color and Trash measurements. However, when sample conditioning is not required, as it is not for most color and trash measurements, sample pressure is applied by simpler mechanical means. Mechanical means may also be used to compress sample 102 when suction forces are inappropriate.
In the first alternative for applying pressure to the sample 102 under test, valve 208 in FIG. 3 is open while conditioned air from module 114 is delivered to condition sample 102. In this first alternative, the holes in relatively thick and rigid perforated plate 212 are relatively large and the flow rate delivered for conditioning is high. After typically ten seconds, valve 208 partially closes and restricts flow 210 into Ultra-Rapid Conditioning module 200, thus causing a strong negative pressure or suction to be developed within pressure distribution plate 202. This suction causes atmospheric pressure to force plate 202 downward onto sample 102. Bellows 215 and seals 220 enable the downward movement and the suction, respectively. There is also an equal and opposite upward atmospheric pressure force on sample 102 exerted by window 204. Sample pressure is important for the Color and Trash measurement.
In the second, simpler alternative, there is no valve 208, and perforated plate 212 is preferably thinner and has fewer and/or smaller holes. These smaller holes in plate 212 inherently limit the flow 210 and thus develop the suction force across perforated plate 212, directly. Open areas of the order of 10% represent a satisfactory compromise between downward force, for Color and Trash measurements by module 300, and flow rate 210, for Ultra-Rapid Conditioning. This second alternative also enables parallel operations for Ultra-Rapid Conditioning processing by module 200 and for Color and Trash testing by module 300.
The substantially simultaneous Ultra-Rapid Conditioning by module 200 and image acquisition testing by module 300 lasts less than one minute and can be as short as approximately ten seconds, depending on the scanner resolution chosen and how close in moisture content the selected sample 102 lies to an acceptable value, such as 7.3% for cotton. Such ultra-rapid sample conditioning is particularly useful as preconditioning for subsequent testing.
At the completion of the conditioning/testing cycle, pressure/distribution cover 202 (or a pressure plate (not shown) in the event Ultra-Rapid Conditioning is not employed) is opened. The cover 202 may be opened manually, or automatically upon receipt of a signal from computer 112. Sample 102, which is now conditioned for further processing and testing, is automatically or manually moved onto belt 118 for quick transport to an individualizer 120, which thoroughly opens, i.e., "individualizes," sample 102 into its various constituent entities: fibers, neps, trash, seed coat fragments, sticky points, microdust, and the like. A suitable individualizer is disclosed in Shofner et al U.S. Pat. No. 5,890,264. An alternative is for individualizer 120 to also clean sample 102 by removing trash, microdust and other foreign matter. However, in the disclosed embodiment almost all of the individualized entities are transported in the same transport flow stream.
This processing by individualizer 120 causes the thoroughly individualized entities to be entrained in or transported by about 120 CFM (3.4 m3/min) of conditioned air flow 122 such that the fiber and other entity concentrations transported by the gas flow at the output 126 of individualizer 120 are very low. Accordingly, the Nep content of thus-individualized sample 102 is measured with a nep sensor 124 which advantageously is built into the individualizer 120. A suitable nep sensor 124 is as disclosed in Shofner et al U.S. Pat. No. 5,929,460.
Sample 102, whose weight was guessed by operator 101 at approximately 15 grams, is at the output 126 of individualizer 120 in a highly opened, individualized state that simulates the state of fiber in important textile processing machines, especially carding. Accordingly, the state of the fiber is ideal for testing the individual fibers and other entities in the gas flow 122. One such test is the Nep test made by nep sensor 124. Other tests are Micronaire-Maturity-Fineness (MMF), effected by module 400. For Neps and for MMF, it is required that the sample weight be known, not guessed, and sample masses of nominally ten grams are commonly used for both tests. The sample weight can be determined prior to or after the testing using known analytical balance technologies. Post-testing weighing can be automated.
The system aspects of the disclosed embodiment can be summarized:
1. Common flow;
2. Optimal sequence for sample tests, from surface measurement of Color and Trash to volume or weight measurements of Neps and Micronaire based on guessed weight or on precise weight;
3. Ideal sample state for simulations of actual processing (e.g., cleanability, processability, spinnability); and
4. Automatic except for selecting and introducing classer's sample, thus eliminating operator effort and errors. System and methods can be extended to complete automation.
Image-Based Color and Trash Measurements
FIGS. 2-5 show both the Ultra-Rapid Conditioning module 200 and the Color and Trash module 300 of the machine 100 of FIG. 1. FIG. 2 is a top view, without pressure/distribution cover plate 202; and FIG. 5 is a bottom view. FIGS. 3 and 4 are side and end views, respectively.
FIGS. 6 and 7 more generally disclose embodiments of the invention for measuring Color and Trash, including Extraneous Matter. Thus the Color and Trash module 300 is an image-based system which improves the measurements of these basically important fiber qualities over current methods. Preparation can also be measured, if that data product is needed.
Comparing FIGS. 6 and 7 reveals the formal analogies between human visual classification and machine vision classification. In the image acquisition step, both require good illumination of an object field and optimal positioning with respect to the imaging optics. Both form an image field on photosensitive surfaces having spatially discrete and color-resolving detection elements. And both communicate the image information to a powerful central processor for storage in a large and easily recalled memory. In the subsequent image analysis or pattern recognition step, both compare, analyze, or process the current image information with reference to corresponding image information stored in memory. Measurements or classing calls are made, based on that comparison.
Importantly, the (1) image acquisition and (2) image analysis steps are fundamentally distinct, sequential processes in the human and in the machine vision analogs. Also, image analysis is generally distinct from image reconstruction.
More specifically, an important feature of embodiments of the invention is seen in FIG. 7. Along with the unknown sample (or samples) under test, the object field includes one or more cotton standards whose colors are accurately, precisely and traceably known. Standards produced by USDA/AMS are prepared in boxes containing, typically, six known cottons which are referred to as "biscuits." The reference biscuits are viewed under the same illumination and through the same optical elements as the unknown sample(s) under test. The cotton in the biscuits is generally in the same state and pressed against the windows with the same pressures as the unknowns. The biscuits can be in sealed chambers with inert gases to diminish oxidation and other color-changing causes and thus extend the useful stable color life of the reference biscuits.
The system and method of FIG. 7 thus facilitate matching between unknown cottons and known cottons, or other reference materials. This is because the unknowns and knowns are in the same state and tested with the same optical system at very nearly the same instant in time, thus eliminating illumination, other electro-optical noise and drifts, and positioning errors. This computer-based image analysis and matching is seen to be rigorously analogous to the Classer's physiological actions, as described above and in FIG. 6, in both the image acquisition and image analysis/comparison steps, and in the electronic and in the human “classing call” step.
In an instrument classification embodiment, a high quality color scanner (intended for office or graphics arts use in scanning documents) is employed for image acquisition. About one fourth to one half of the object field of 8.5×11 inches (21.59×27.94 cm) is devoted to the reference biscuits and other known reference materials, and the remainder to the unknown sample(s) under test. Acquired image file sizes, depending on spatial resolution (pixel size) and electronic resolution (bit depth), range in the order of tens of megabytes to several hundred megabytes. Files near the lower end of this size range are acquired and analyzed in less than thirty seconds, currently. Accuracy and precision can approach fundamental limits associated with the color standards used.
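The matching step of the instrument classification embodiment can be illustrated with a minimal sketch: the unknown sample's measured color coordinates are compared against the coordinates measured for the in-field reference biscuits, and the nearest reference determines the call. All biscuit names and (Rd, +b) values below are hypothetical placeholders, not USDA-assigned data.

```python
# Sketch: nearest-reference color matching in (Rd, +b) space.
# Reference values are invented for illustration only.
import math

references = {
    "biscuit_1": (80.1, 8.2),
    "biscuit_2": (78.0, 9.5),
    "biscuit_3": (74.5, 10.8),
}

def classify(unknown, refs):
    """Return the reference whose (Rd, +b) point is closest to the unknown."""
    return min(refs, key=lambda k: math.dist(unknown, refs[k]))

print(classify((78.3, 9.3), references))  # biscuit_2
```

In practice the computer can also interpolate between the nearest references rather than snapping to a single one, yielding a continuous measurement instead of a discrete class.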
The same method can be extended to measurement of trash, including measurement of the color of each trash particle, as well as its shape. This enables classification of bark and grass and other types of foreign matter. Another data product or measurement is preparation via surface texture.
The reference or "target" cottons are formed from universal cotton color standards prepared by the USDA/AMS in Memphis. However, reference cottons from any country, or even from an individual gin or mill, or a mix, may be used for the reference biscuits. Other reference materials can be employed in addition to cottons. This will be important as there is movement toward absolute or Commission Internationale de l'Eclairage (CIE) color measurements.
FIGS. 2-5 more particularly depict an embodiment in which color scanner-based image acquisition/analysis technology is adapted to the measurement of cotton Color and Trash. The Ultra-Rapid Conditioning process of module 200 takes place above window 204 in FIGS. 3 and 4. Sample 102 is pressed against window 204 by the downward (on perforated plate 212) and upward (on window 204) suction forces explained above. The Color and Trash measurements of module 300 take place below window 204. Color scanner apparatus 302 is attached to the bottom of stable table 111. This stable table 111 is made of aluminum or heavy plastic and is mounted on soft springs (not shown) which are highly damped; this mass-spring-damper system has the purpose of isolating the scanner from vibrations. Conditioned gas flow 115 from conditioner 114 in FIG. 1, which includes filtering, is directed into cabinet 113 to maintain cleanliness of the optical components. Scanner 302 may be a Hewlett-Packard 6000 series scanner or equivalent and includes a scan head 304 which is driven along rails 306 by a stepper motor (not shown). Flexible signal cable 308 connects the several-thousand-pixel linear arrays 320, one for each of the three colors Red, Green and Blue, to on-board electronics module 310 which in turn is connected to computer 112.
Computer 112 controls scanner 302, acquires and stores its large quantities of data, and operates upon or analyzes these acquired data to generate scientific measurements, which is the functionality of interest in the first embodiment. In another embodiment, computer 112 is used to produce images for human classification, or for communications to other computers, including communications over the internet, where scientific measurements or human classifications can be made.
The processes of: (1) scanning an object field, and acquiring thereby an image file of raw data; and (2) operating upon those acquired raw data to produce a viewable, communicable or analyzable image field, are sequential and very sharply distinct. These processes can be characterized as (1) image acquisition and (2) image reconstruction or recording and reproduction. One analogy is acquisition of a photographic negative from which positive prints are made to reconstruct the image. Another analogy is music recording and subsequent reproduction. Appreciating the nuances of the separate processes of acquisition and analysis or recording and reproduction facilitates understanding the limitations of both processes, including spatial and electronic resolution, dynamic range, various nonlinearities, noise and most particularly, fidelity.
Scanner light source 312 in FIG. 4 illuminates the object field 330 above it (FIG. 5) with typical incident rays 313. Incident rays 313 originate directly from elongated lamp 312 or after collection by mirrors 314. Reflected light from object field 330, represented by typical reflected rays 315, hits mirror 316, is collected by lens 318, and is imaged onto linear array 320. Instability of the illumination source 312, with respect to intensity in time and in space, and to spectral emissions, is one of the major causes of poor color fidelity. Embodiments of the invention minimize this cause.
For an RGB color scanner with 600 dots/inch (236 dots/cm) or pixels/inch "optical" resolution and an 8.5 inch (21.59 cm) object field width, each scan line contains 600×8.5=5,100 pixels per color, so the three color channels of linear array 320 comprise 5,100×3=15,300 detector elements. For an 11 inch (27.94 cm) object field length at 600 dpi (236 dots/cm) resolution there are 600×11=6,600 scan lines. Thus there are 5,100×6,600×3, or approximately 101 million, color samples in the raw image. If the electronic resolution or "bit depth" is 12 bits, or 1.5 bytes per sample, it follows that a 600 dpi (236 dots/cm) resolution scanned color image contains approximately 151 megabytes of data.
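The raw-image arithmetic can be checked with a short computation, assuming 600 dpi optical resolution, an 8.5×11 inch object field, three color channels, and 12-bit (1.5-byte) samples:

```python
# Back-of-envelope check of the raw image dimensions described above.
DPI = 600
WIDTH_IN, LENGTH_IN = 8.5, 11.0   # object field, inches
CHANNELS = 3                      # Red, Green, Blue
BYTES_PER_SAMPLE = 1.5            # 12-bit electronic resolution

pixels_per_line = int(DPI * WIDTH_IN)        # 5,100 pixels across, per color
array_elements = pixels_per_line * CHANNELS  # 15,300 elements in the linear array
scan_lines = int(DPI * LENGTH_IN)            # 6,600 scan line positions

color_samples = pixels_per_line * scan_lines * CHANNELS
megabytes = color_samples * BYTES_PER_SAMPLE / 1e6

print(array_elements)   # 15300
print(color_samples)    # 100980000
print(round(megabytes)) # 151
```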
Notwithstanding the enormous potential for spatial and electronic resolution in color scanning technology, there are some serious limitations. Scan speed is one of them, but that limitation is rapidly disappearing as the technology matures and finds wide application. In the cotton classing context of the invention, lack of fidelity, that is, the inability to faithfully reproduce true colors with very small variations, is a very serious limitation. When applied to Color and Trash measurement of cotton sample 102 on window 204, currently available scanners 302 (and CCD cameras also) are so severely deficient with respect to fidelity in the recording and reproduction of color images that the technology has not previously been successfully applied to absolute, scientific color measurement. It can be noted that Color and Trash classifications, which are still primarily done by humans, have much higher standards of performance than might be expected. In the field of cotton measurements, color scanner and CCD camera image acquisition and analysis prior to the subject invention have been regarded as hopelessly deficient and inapplicable for cotton classing purposes because of color infidelity. This is particularly true for Colorimetric measurements. Embodiments of the invention overcome this limitation, and set the stage for eliminating human classers, who have their own limitations, in favor of all-instrument classing.
The bottom view of Color and Trash module 300 in FIG. 5 shows that scanned field of view 330 contains, in addition to the unknown sample 102 under test, which sample 102 is pressed against window 204, known reference materials 340, 342, 344, 346. These reference materials are also shown in FIG. 4. Reference material 344, like the others, is positioned above a window 348 that is identical in optical properties to window 204 upon which sample 102 is positioned. The spacings between the scan head 304 and the reference material windows 348 and sample window 204 are also identical, as are the pressures. Cotton reference materials having traceably known colors, described in greater detail below, are so positioned in fourteen chambers 342, 344. The pressure with which they are loaded is identical with the pressure on the unknown sample 102. The reference or "target" cottons are hermetically sealed within the chambers 342, 344.
The reference cottons are provided by the USDA AMS, Memphis, Tenn. and have precisely known color values assigned to them. By world-wide, between-government treaty agreements, these represent color data originating primarily with the cotton industry and traceable to this one laboratory. Most importantly, these standard cottons are competently and painstakingly produced by well-known methods used for calibrating human cotton classers.
Also seen in the object field are other color reference materials such as Munsell papers, portions of industry-accepted color charts such as Kodak Q-60, standardized paints or inks applied to the top of the reference chamber windows 348, or ceramic tiles (because of their long-term stability), and the like. All reference materials are "seen" through windows 348 whose optical properties are also identical.
Use of the reference materials circumvents the multitude of drifts, offsets, nonlinearities, illumination variabilities, misalignments, etc., that limit color scanning or CCD camera fidelity. And they can be extended beyond scientific measurement to provide for enhancements in fidelity of color image recording and reproduction in general.
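One way in-field references can circumvent such drifts is a per-scan correction: fit a map from the scanner's raw readings of the references to their traceably known values, then apply that map to the unknown sample's reading. The single-channel, linear sketch below uses invented numbers; a real implementation would operate per color channel and might use higher-order fits.

```python
# Sketch: per-scan linear correction using in-field reference readings.
# All numeric values are hypothetical.
measured_refs = [62.0, 70.5, 81.0]  # raw scanner readings of the references
known_refs    = [60.0, 70.0, 82.5]  # traceably known values

# Least-squares fit of known = a * measured + b
n = len(measured_refs)
mx = sum(measured_refs) / n
my = sum(known_refs) / n
a = sum((x - mx) * (y - my) for x, y in zip(measured_refs, known_refs)) \
    / sum((x - mx) ** 2 for x in measured_refs)
b = my - a * mx

def correct(raw):
    """Map a raw scanner reading onto the traceable reference scale."""
    return a * raw + b

print(round(correct(70.5), 2))
```

Because the references are scanned in the same field, at the same instant, and under the same illumination as the unknown, the fitted map absorbs lamp drift and channel offsets for that particular scan.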
In the disclosed embodiment of the invention for Color and Trash measurements, in module 300 color scanner 302 scans object field 330 which includes unknown sample 102 and reference materials 340, 342, 344, 346. The scanner 302 acquires or records a raw image file for all objects within object field 330 having 8.5×11 inch (21.59×27.94 cm) width and length. Computer 112 operates on the raw recorded data to compare the unknown sample 102 to the known reference materials 340, 342, 344, 346, pixel by pixel if necessary, calculates the best match for Color and, independently, for Trash, and records these measurement data into memory for later use. Examples are measurement and reporting of Color: Rd, +b, and +a of 78.3, 9.3 and 0.13; and Trash: 126 trash particles having 0.16% area fraction and 42% of the particles having equivalent diameters greater than 100 micrometers. These typical Color and Trash readings are according to definitions required by the USDA for the target materials located in chambers 342, 344.
Trash measurements proceed in the same manner as color measurements, wherein reference materials having known percentage areas of leaf and other foreign matter are placed in the object field and subsequently compared to the unknown samples. Care is taken in the preparation of these reference materials that the types of trash and their size distributions are representative of commercial cotton.
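A toy version of such a trash measurement can be sketched as follows: dark pixels are flagged as trash, connected groups of dark pixels are counted as particles, and the percentage area is the trash pixel count over the total pixel count. The character grid below is an invented miniature; real images contain millions of pixels and would use thresholded grayscale values rather than characters.

```python
# Sketch: counting trash particles and % area on a tiny binary image.
# 'X' marks a dark ("trash") pixel; the grid is illustrative only.
grid = [
    "..XX......",
    "..XX......",
    ".......X..",
    "..........",
]

def trash_stats(rows):
    """Return (particle count, % area) for dark pixels in a character grid."""
    cells = {(r, c) for r, row in enumerate(rows)
             for c, ch in enumerate(row) if ch == "X"}
    seen, particles = set(), 0
    for cell in cells:
        if cell in seen:
            continue
        particles += 1              # flood-fill one 4-connected particle
        stack = [cell]
        while stack:
            r, c = stack.pop()
            if (r, c) in seen or (r, c) not in cells:
                continue
            seen.add((r, c))
            stack.extend([(r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)])
    total = sum(len(row) for row in rows)
    return particles, 100.0 * len(cells) / total

count, pct_area = trash_stats(grid)
print(count, round(pct_area, 1))  # 2 particles, 12.5% area
```

Extending this sketch toward the disclosed measurements would add per-particle color and shape statistics (for bark versus grass classification) and comparison against reference materials of known trash area.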
It is significant in the instrument classification embodiment that scientific measurements can be the end result, not necessarily images with high fidelity, as in the human classification embodiment. Indeed, a limited objective for Color and Trash measurements starts with known USDA Color and Trash data for the known reference or target cottons; the limited objective ends with matching or hitting the targets of these USDA-generated values, as disclosed above.
Thus, no image of the test cotton need be displayed for the scientific tests disclosed herein. However, satisfactory results both for matching the internal targets, for Color and Trash measurements, and for producing viewable and analyzable images, for human and for automated classing, can be obtained when standard paints are applied to the top side of windows 340, 346, and for which paints the scientific colors are determinable and stable. This means that our apparatus and methods and system can be applied to more rigorous, absolute color measurements. Accordingly, our methods and apparatus, which embrace the two types of absolute, instrumental color referencing, to traceable materials in chambers 340, 346 and to "accepted" materials with "accepted" Colorimetric target values in chambers 342, 344, along with unknown samples 102 in identical optical environments 330, can be extended to enhance fidelity in the recording and reproduction of images for measurement purposes in general.
It will be appreciated that embodiments of the invention rely upon the availability of advanced digital technologies for high resolution, high quality, color image acquisition, storage, processing, and communication. One salient feature is very large digital files. One such color image file, acquired with maximum resolution available now, can contain as many megabytes of data as all of the currently-reported HVI data on one million bales. Of course, these large image files are analyzed and reduced to data products representing scientific measurements which are only a few bytes in length.
FIGS. 8-11 indicate the levels of performance achieved in the instrument classification embodiment. The scientific readings produced compare very favorably with the USDA/AMS results for Color and Trash reference materials over a wide range of color and trash values. FIG. 12 is a composite image containing an image of both samples from a bale and an overlay of the scientific measurements for the bale having the indicated permanent bale identification bar code. It may be appreciated that our scientific measurements are obtained with the new technologies and methods herein disclosed, whereas the reference readings are based on very old, known technologies. Particular attention is drawn to FIG. 11, which compares the color readings of our methods, marked STI, with those of conventional HVIs A, B and C, all in different laboratories but in a round test using the same reference cotton samples. USDA tolerances of +/−1 unit in Rd and +/−0.5 unit in +b are represented by the boxes. It is clearly seen that the STI data are within the tolerance boxes more frequently than the conventional HVI data.
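The tolerance-box comparison above can be sketched as a simple check. This is a hypothetical illustration, not part of the disclosed apparatus; the function name and example readings are assumptions, while the tolerance values (+/−1 unit in Rd, +/−0.5 unit in +b) are taken from the text.

```python
# Hypothetical sketch: test whether an instrument's (Rd, +b) reading for a
# reference cotton falls inside the USDA tolerance box around the accepted
# value. Tolerances per the disclosure: +/-1 unit in Rd, +/-0.5 unit in +b.

def within_tolerance(measured_rd, measured_b, accepted_rd, accepted_b,
                     rd_tol=1.0, b_tol=0.5):
    """Return True if the reading lies inside the USDA tolerance box."""
    return (abs(measured_rd - accepted_rd) <= rd_tol and
            abs(measured_b - accepted_b) <= b_tol)

# Illustrative readings against an accepted value of Rd=75.0, +b=9.0:
print(within_tolerance(75.4, 9.2, 75.0, 9.0))  # True: both deltas in range
print(within_tolerance(77.0, 9.2, 75.0, 9.0))  # False: Rd off by 2 units
```

A reading counts as a "hit" only when it satisfies both axis tolerances at once, which is what the boxes in FIG. 11 represent graphically.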
To summarize the procedural steps for producing scientific measurements of color and trash by our invention, given a calibrated apparatus including known reference materials:
1. Acquire an image file of the unknown sample under test along with known reference materials in a single object field;
2. Apply color and trash matching algorithms in an image analysis step so that the internal computer interpolates the measurements and makes the classification “call.”
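The two procedural steps above can be sketched in miniature. This is a minimal, hypothetical illustration, not the patented matching algorithm: it assumes that mean camera readings for the reference materials and the sample have already been extracted from the single image file of step 1, and that each reference's accepted USDA value is known; the interpolation "call" of step 2 is then a value interpolated between the bracketing references.

```python
# Minimal sketch of the interpolation "call" (step 2). All names and numbers
# are illustrative assumptions, not the disclosed algorithm itself.

def interpolate_call(sample_reading, references):
    """references: list of (camera_reading, accepted_usda_value) pairs.
    Linearly interpolates the sample's USDA-scale value between the two
    references that bracket its camera reading."""
    refs = sorted(references)
    lo = max((r for r in refs if r[0] <= sample_reading), default=refs[0])
    hi = min((r for r in refs if r[0] >= sample_reading), default=refs[-1])
    if hi[0] == lo[0]:
        return lo[1]
    t = (sample_reading - lo[0]) / (hi[0] - lo[0])
    return lo[1] + t * (hi[1] - lo[1])

# Known reference cottons: camera reading -> accepted USDA Rd value
refs = [(100.0, 70.0), (140.0, 78.0), (180.0, 84.0)]
print(interpolate_call(120.0, refs))  # 74.0: halfway between the first two
```

Because the references share the object field with the sample, the interpolation is anchored to readings taken under identical conditions.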
Internet Classing
Cotton Classing takes on or will take on two primary forms or transitional combinations thereof:
A. Traditional Classing. Collect samples and ship them to remote sites for human and/or instrument classing. Archivally store many physical samples remotely and the data in database(s); and
B. Internet Classing. Execute instrumental classing locally (in gins, warehouses, or mills) and transfer the fiber quality measurements, including images, over the Internet. Archivally store few samples for calibration confirmations and all data, including images, in databases.
One transitional combination of Traditional and Internet Classing begins with acquisition of images, along with other fiber quality data, at the gin. These data and images are then transferred over the Internet for classification and other judgements and decisions, such as buy/sell decisions, by humans. Thus a human classification is made by classers looking at a high-fidelity reproduction of the image file on an image display device such as a high-quality color video monitor.
To better appreciate performing cotton classing and selling/buying cotton over the Internet, operation of the instrument 100 is here briefly described: An operator removes the two cut samples from the two bale sides and places both samples on the Color and Trash scanning window. The operator starts the testing process, and the instrument then automatically completes the chosen fiber quality tests, which can include color, trash, micronaire, length, strength, neps and stickiness, for example. Each sample is internally conditioned to 65% RH and 21° C., standard laboratory conditions. The image and associated fiber quality data are then sent to a local network for viewing by the ginner, and then to the Internet to be viewed by potential buyers or the producers.
The image of FIG. 7 displays both classers' samples with their associated fiber quality measurements. Other data may be attached to inform the potential buyer of relevant information to make a purchasing decision.
Again, an important feature of embodiments of the invention is inclusion of known reference materials, especially cottons for this embodiment, in the object field of view. The primary commercial purpose for this "Internet Classing" embodiment is to enable communication of the acquired image files over the Internet with subsequent remote classification. By "Internet Classing" 500 in FIG. 13 we mean, procedurally:
1. acquisition of raw image files of object field 502 at one location, including unknown sample under test 504 and one or more known reference materials 506 in the object field;
2. communication of these files over the Internet 508 to one or more computers at one or more remote locations which reconstruct, with high fidelity, images 510 of the original object field 502; and
3. observation of the reconstructed images by trained human observers or "Cotton Classers" 512.
Most importantly, human observers 512 see a reconstructed image field containing the sample under test and the standard reference materials. This means that unavoidable distortions, nonlinearities and noise in the acquisition, communication, and reconstruction steps affect both the unknown sample under test and the known reference material.
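The reason this shared exposure to distortion matters can be made concrete. The sketch below is a deliberately simplified illustration under an assumed per-channel linear-gain model (the real chain's distortions are more complex): because the same unknown distortion acts on both the reference and the sample in one image, measuring the known reference lets the sample's true color be recovered. All function names and values are hypothetical.

```python
# Sketch of why shared distortions cancel: if the acquisition/communication
# chain applies an unknown per-channel gain to everything in the field,
# measuring the known reference in the SAME image lets us recover the
# sample's true color. The linear-gain model is an illustrative assumption.

def correct_with_reference(sample_measured, ref_measured, ref_known):
    """Per-channel correction: true = measured * (known_ref / measured_ref)."""
    return tuple(s * (k / m)
                 for s, m, k in zip(sample_measured, ref_measured, ref_known))

# An unknown channel gain of 0.8 distorts sample and reference alike:
gain = 0.8
true_sample = (200.0, 180.0, 150.0)
true_ref = (220.0, 210.0, 190.0)
measured_sample = tuple(gain * c for c in true_sample)
measured_ref = tuple(gain * c for c in true_ref)

recovered = correct_with_reference(measured_sample, measured_ref, true_ref)
print(tuple(round(c, 6) for c in recovered))  # (200.0, 180.0, 150.0)
```

The unknown gain divides out entirely, which is the essence of carrying known reference materials through every step alongside the sample.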
The acquisition of raw image files of the object field is precisely the same as described for the first embodiment and is further described in FIG. 7.
In another embodiment, scientific measurements can be made by a computer processor at the remote location, operating on the digital image file, without necessarily displaying it.
A few clarifying comments now complete the disclosure. Whereas the original object field 502 and reconstructed image field 510 are seen to have eight square reference biscuits 506 surrounding the square sample under test 504, it will be appreciated that the number and shape of reference biscuits and the size and shape of the sample under test may be chosen to suit the particular preferences of the user. For some applications, linear strips of reference material are advantageous.
Although our methods make human classification from a color monitor possible, the monitor used to reconstruct the images 510 should preferably be high fidelity and calibrated, and it should preferably be situated in an appropriate viewing environment. It will be appreciated that human classification based on passively reflected light from a physical sample and human classification based on actively emitted light from a monitor are fundamentally different. Without inclusion of known reference materials in the recording and reconstruction steps, human classification from a monitor would not be possible.
It will be appreciated that discerning small color and trash differences is not easy, physiologically or psychologically, and all of the "tricks" of modern pattern recognition must be employed to facilitate the human judgement call. Note again the eight reference biscuits 506 seen in FIG. 13; their primary purpose is to accommodate the distortions, nonlinearities and noise of real-world color recording and reproduction apparatus over a wide range of colors. We have found that interpolations in reconstruction can enable generation of a much larger set of equivalent reference materials with a narrower range of color differences in images having different shapes. Thus the human Classer 512 in FIG. 13 would determine that the color of the unknown sample lies between that of a few of the knowns. He or she would then cause the computer, via keyboard 514 or other control means, to produce interpolated biscuits with narrower color range in a pattern 517 as seen in FIG. 14. The unknown sample's image is in the center and the interpolated reference biscuits 518 surround it, from which a more accurate and precise color call is made.
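The generation of interpolated biscuits described above can be sketched as follows. This is an illustrative assumption about the interpolation, not the disclosed implementation: given two known reference colors that bracket the unknown, intermediate (Rd, +b) colors are synthesized at even spacing so the Classer compares against a narrower color range.

```python
# Sketch of generating "interpolated biscuits": given two bracketing known
# reference colors, synthesize a finer-grained set of intermediate (Rd, +b)
# colors for the Classer's narrowed comparison. Step count and color values
# are illustrative assumptions.

def interpolated_biscuits(color_a, color_b, n=8):
    """Return n colors evenly spaced from color_a to color_b (inclusive)."""
    return [tuple(a + (b - a) * i / (n - 1) for a, b in zip(color_a, color_b))
            for i in range(n)]

# Two bracketing reference biscuits in (Rd, +b):
biscuits = interpolated_biscuits((72.0, 8.0), (76.0, 10.0), n=5)
print(biscuits)
# [(72.0, 8.0), (73.0, 8.5), (74.0, 9.0), (75.0, 9.5), (76.0, 10.0)]
```

Displaying these synthesized colors around the unknown, as in pattern 517 of FIG. 14, effectively multiplies the number of physical reference materials without preparing new ones.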
Another color pattern recognition tool is seen in FIG. 15 where the unknown image is surrounded by an interpolated image 520 whose color coordinates may be varied by the Classer until a match is realized. We have found that this "trick" is particularly useful for human cotton classing. However, we have also found that the computer can surpass the human when the matching is automated.
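The automated form of this matching can be sketched simply: instead of a human varying the surround color until it matches, the computer selects the candidate color closest to the unknown. Euclidean distance in (Rd, +b) is an illustrative choice here, not the disclosed matching criterion, and all values are hypothetical.

```python
# Sketch of automating the match: pick the candidate reference color nearest
# the unknown. Euclidean distance in (Rd, +b) space is an illustrative
# assumption, not the patented matching algorithm.
import math

def best_match(unknown, candidates):
    """Return the candidate color nearest the unknown in (Rd, +b) space."""
    return min(candidates, key=lambda c: math.dist(unknown, c))

candidates = [(72.0, 8.0), (73.0, 8.5), (74.0, 9.0), (75.0, 9.5)]
print(best_match((73.8, 9.1), candidates))  # (74.0, 9.0) is the closest
```

A machine can evaluate every candidate exhaustively and consistently, which is one reason the automated match can surpass the human one.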
We note finally that the communicated digital image files may be analyzed remotely as well as be reconstructed for human viewing. That is, since the digital communication is almost error free, and since transfer of files of tens to hundreds of megabytes is increasingly feasible, the image analysis from which scientific data products are produced may take place remotely and at any later time. This capability enables different analytical procedures to be employed. For example, the general algorithms by which conventional "HVI" color and trash measurements are generated at one location may not serve the specific purposes of certain Customers at other locations, who could execute unique algorithms better matched to their requirements. This is particularly important for international commerce, since the standards used by sellers and buyers in different countries are generally different.
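A customer-specific remote analysis of the kind described can be sketched with a toy trash measurement. Trash is reported in the text as % Area; a common simplified approach, assumed here purely for illustration, is to count pixels darker than a threshold, and different customers could rerun the same stored image file with different thresholds or rules.

```python
# Sketch of a customer-specific remote analysis on a stored image file:
# % Area trash as the fraction of pixels darker than a threshold. The toy
# 4x4 grayscale "image" and the threshold value are illustrative assumptions.

def trash_percent_area(gray_pixels, threshold):
    """Percent of pixels darker than threshold (a simple % Area measure)."""
    dark = sum(1 for p in gray_pixels if p < threshold)
    return 100.0 * dark / len(gray_pixels)

# Toy 4x4 image: mostly bright cotton with two dark trash specks
pixels = [230, 228, 231, 229,
          232,  40, 227, 230,
          229, 231,  35, 228,
          230, 229, 231, 232]
print(trash_percent_area(pixels, threshold=100))  # 12.5 (2 of 16 pixels)
```

Because the full image file is archived, a customer could later rerun this analysis with, say, a stricter threshold, without retesting the physical sample.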
We further note in conclusion that embodiments of the invention provide for local viewing by humans and for viewing aids such as magnification and various pattern recognition enhancements known in the art.
While specific embodiments of the invention have been illustrated and described herein, it is realized that numerous modifications and changes will occur to those skilled in the art. It is therefore to be understood that the appended claims are intended to cover all such modifications and changes as fall within the true spirit and scope of the invention.