US6735327B1 - Color and trash measurements by image analysis - Google Patents


Info

Publication number
US6735327B1
Authority
US
United States
Prior art keywords
image
cotton
unknown sample
color
view
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Fee Related, expires
Application number
US09/663,502
Inventor
Frederick M. Shofner
Christopher K. Shofner
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
SHOFNER ENGINEERING ASSOCIATES Inc A Corp OF STATE OF TENNESSEE
Original Assignee
Shofner Engineering Associates Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shofner Engineering Associates Inc
Priority to US09/663,502
Assigned to SHOFNER ENGINEERING ASSOCIATES, INC., A CORPORATION OF THE STATE OF TENNESSEE. Assignment of assignors interest (see document for details). Assignors: SHOFNER, CHRISTOPHER K.; SHOFNER, FREDERICK M.
Application granted
Publication of US6735327B1
Adjusted expiration
Status: Expired - Fee Related

Abstract

An optical imaging device having a defined object field of view acquires an image for classifying an unknown sample of cotton. The unknown sample is placed in the object field together with at least one reference material having known and traceable optical properties, so that images of the unknown and of the references are acquired at the same time, under the same illumination and measurement conditions. In an instrument classification embodiment, algorithms match the unknown to the known references and make the classing call without reconstructing an image for human viewing. In a human classification embodiment, an image of the entire object field, including sample and references, is reconstructed for classing locally or remotely, including over the internet.

Description

CROSS-REFERENCE TO PROVISIONAL PATENT APPLICATIONS
The benefit of U.S. Provisional Patent Application Ser. No. 60/154,527, filed Sep. 16, 1999; Ser. No. 60/182,731, filed Feb. 15, 2000; and Ser. No. 60/221,104, filed Jul. 27, 2000 is claimed.
BACKGROUND OF THE INVENTION
The invention relates generally to fiber quality measurements for cotton classing and, more particularly, to image-based instrument measurements of Color and Trash.
Cotton standards are supported by the United States Department of Agriculture (USDA) through its Agricultural Marketing Service (AMS). Cotton standards, and the corresponding classing of cotton, are of great importance in determining the market value of a particular bale of cotton, as well as determining suitability of a particular bale of cotton from a gin for subsequent processing at a particular mill in view of the products and processes of that mill. AMS is responsible for preparing and maintaining such cotton standards and does so in its Standards Section located in Memphis, Tennessee.
In 1923, the United States and nine European countries entered into the Universal Cotton Standards Agreement. From that time, up until approximately 1965, USDA/AMS cotton classing “measurements” based on the Universal Standards were made entirely by humans. The human measurements included “grade,” “extraneous matter” (such as bark and grass), “preparation” (which relates to smoothness of the sample) and “staple length” (long fiber content). Instrument-based cotton classing was introduced in 1965, beginning with micronaire, followed in 1980 by High Volume Instruments (HVI), which added measurements of length and strength. HVIs currently measure the fiber qualities of Micronaire, Length, Strength, Color and Trash. Instruments for measuring Color (Rd, +b and +a, which refer to “reflectiveness,” “yellowness” and “redness”) and Trash (% Area) have also been developed, but Human Classer measurements of Color and Trash are still generally employed because of certain deficiencies in these methods.
Prior to 1993, Classer's Grade was a combination of color and leaf trash. In USDA/AMS classing, the human Classer now separately calls a color grade and a leaf grade. These Classer's calls for color and leaf grade are still the official measurements, along with a Classer's call about “Extraneous Matter” and “Preparation.” Although currently unofficial, instrumental measurements of Color and Trash, as well as exploratory measurements of Short Fiber Content are made, however deficient, both to satisfy mill requests for the information and to lay the foundation and to create demand for improvements.
The Classer measurements of color, trash, preparation and extraneous matter are primarily visual, and measurements of staple length are both tactile and visual. It is appropriate to call all human classifications “measurements” in a broad sense. This is because Classers are rigorously trained to compare their observations to cotton standards immediately available to them. That is, they measure first, by comparison, then put the sample into a “class.” If a Classer has any doubt about a classing call, the Classer can, sample in hand, go to a set of standards boxes to refresh the Classer's memory to improve the “call” by comparison to the standards for Grade and Staple.
Considering in particular the current methods of human visual fiber quality measurements, the Classer holds a sample from a bale and visually examines its appearance in good, meaning standardized, illumination. Making the measurements of Color Grade and Leaf Grade physiologically and psychologically amounts to comparing the images falling on the retinas and perceived by the brain with images from memory or, for confirmation, images from physical standards. The procedures of making such measurements are learned in extensive training. “Classer's Call” means that the Classer has measured the object and judged it to fall within or near to a standard or, if not, he or she comments why.
SUMMARY OF THE INVENTION
In embodiments of the invention, an optical imaging device having a defined object field of view acquires an image for classifying an unknown sample of cotton. The unknown sample of cotton is placed in the object field of view, as well as at least one reference material. As a result, the optical imaging device acquires images of both the unknown sample of cotton and the reference material in the same field of view, at the same time and under the same measurement conditions.
In an instrument classification embodiment, an instrumental “call” or classification of the sample under test is made without reconstructing an image of the object field for human viewing.
In a human classification embodiment, the image of the entire object field, including the sample under test and at least one reference material, is reconstructed for human classification, either locally or remotely. When the image file is transferred over the internet to a remote location, we refer to the embodiments as “Internet Classing.”
Inclusion of at least one reference material having known and traceable optical properties in the object field is an important feature of both embodiments, and overcomes the deficiencies of prior art methods. It is particularly advantageous to use reference materials which are similar to the samples under test. Thus cottons whose color or trash properties are established by AMS or others are included as reference materials. In the instrument classification, algorithms match the unknown to the knowns.
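As an illustration only (the grade labels and color values below are hypothetical, not actual USDA standards, and the patent does not specify this particular algorithm), matching an unknown to known in-image references can be sketched as a nearest-neighbor search in (Rd, +b) color space:

```python
# Hypothetical sketch of matching an unknown cotton to known in-image
# references. Grade labels and (Rd, +b) values are illustrative only.

def classify_by_reference(unknown, references):
    """Return the grade of the reference cotton closest to the unknown.

    unknown    -- (Rd, +b) measured for the sample under test
    references -- dict of grade -> (Rd, +b) measured for reference biscuits
                  in the same image, under the same illumination
    """
    def dist(a, b):
        return ((a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2) ** 0.5
    return min(references, key=lambda grade: dist(unknown, references[grade]))

# Illustrative reference biscuits and an unknown sample:
refs = {"31-3": (78.0, 9.0), "41-4": (74.5, 10.5), "51-5": (70.0, 12.0)}
print(classify_by_reference((78.3, 9.3), refs))  # prints 31-3
```

Because the references are imaged in the same field, at the same instant, the comparison is insensitive to common-mode illumination drift, which is the central advantage claimed for the method.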
Embodiments of the invention can employ, as the optical imaging device, a high quality color scanner intended for office or graphics arts use in scanning documents. In the context of the invention, the inclusion of at least one reference material in the object field is not intended to refer to the usual and conventional calibration “strip” included in such color scanners. Such a calibration “strip” is internal to the scanner, and is generally employed to calibrate the intensity of the illumination within the scanner. The internal scanner calibration “strip” is not part of the acquired image data.
BRIEF DESCRIPTION OF THE DRAWINGS
FIG. 1 is an overview of a machine embodying the invention, which machine measures cotton samples to produce multiple data products, including images, and additionally internally and ultra-rapidly conditions samples;
FIG. 2 is a top view of the Ultra-Rapid Conditioning module and the Color and Trash module of the machine of FIG. 1, without a pressure/distribution cover plate in place;
FIG. 3 is a side view of the Ultra-Rapid Conditioning module and the Color and Trash module;
FIG. 4 is an end view of the Ultra-Rapid Conditioning module and the Color and Trash module;
FIG. 5 is a bottom view of the Ultra-Rapid Conditioning module and the Color and Trash module, showing the optical imaging device field of view;
FIG. 6 represents the manner in which a human Classer measures fiber quality;
FIG. 7 represents image based measurements of fiber quality;
FIG. 8 is a graph depicting Trash-measurement performance of an embodiment of the invention;
FIG. 9 is a graph depicting color (Rd) measurement performance of an embodiment of the invention;
FIG. 10 is a graph depicting color (+b) measurement performance of an embodiment of the invention;
FIG. 11 is a graph depicting color (Rd vs +b) measurement performance of an embodiment of the invention;
FIG. 12 is an image of a cotton sample acquired by an embodiment of the invention;
FIG. 13 shows a pattern of reference biscuits;
FIG. 14 shows interpolated reference biscuits; and
FIG. 15 shows an interpolated image.
DETAILED DESCRIPTION
Referring first to FIG. 1, the invention is embodied in a stand-alone instrument 100 which measures cotton samples to produce multiple data products, including images, and additionally internally and ultra-rapidly conditions samples. Instrument 100 is a robust, stand-alone platform upon which fiber quality measurement modules are placed to effect generation of multiple data products. By including internal, ultra-rapid sample conditioning, the instrument 100 eliminates the need for expensive conditioned laboratory space. The machine 100 thus does the work of several other instruments and an expensive laboratory air conditioning system, and does that work in the challenging ginning environment.
System Overview
Operator 101 in FIG. 1 selects a “Classer's Sample” 102 having an estimated weight of approximately 15 grams. Such a 15-gram sample is typically 5 inches (12.7 cm) wide × 8 inches (20.32 cm) long × 1 inch (2.54 cm) thick, when uncompressed. The operator “swipes” permanent bale identification (PBI) tag 104 through bar code reader 106, and prepares and introduces sample 102 into recessed conditioning/test chamber 110 of “stable table” top 111, when pressure/distribution plate 202 is retracted. (See also FIG. 2.) The operator 101 then initiates automatic conditioning/testing by causing pressure/distribution plate 202 to move over sample 102 in the recessed conditioning/testing chamber 110, compressing the sample to a thickness of less than 3 mm. Directed by a process control computer 112, the machine 100 then automatically effects “Ultra-Rapid Conditioning” in module 200, and additionally effects testing of the sample 102 for Color and Trash in module 300. (Operator 101 can monitor and control the progress of conditioning/testing, and of all other operations, as well as examine the data products produced, stored, and communicated by system 100 via computer 112 and touch-screen display 113.)
Conditioned gas for conditioning sample 102 in conditioning/testing chamber 110, and for transporting and processing sample 102 in subsequent steps, is provided by air conditioning module 114. Air conditioning module 114 provides a conditioned gas flow 116 having controlled environmental parameters such as Relative Humidity of 65%, dry bulb Temperature of 70° F. (21° C.), and flow rates of 200 CFM (5.7 m3/min). Conditioned gas flow 116 is conducted to the entrance 117 both for the individualizer 120 flow 122 and for the sample conditioning module 200. In a variation, gas flow 116 is split into two components, one having the fixed, standard parameters just described and a second having variable humidity, temperature, flow rate and pressure, which variable parameters are automatically controlled by a separate controller within air conditioner 114, and whose values are determined in accordance with optimally conditioning sample 102 within conditioning/testing chamber 110.
In overview, sample 102, having been manually or automatically placed in recessed conditioning/testing chamber 110, with the pressure/distribution plate assembly 202 over it, is ultra-rapidly “conditioned” from above window 204 and “tested” for Color and Trash below it. Sample 102 may also be tested for moisture content in chamber 110, according to which data air conditioning module 114 is caused to optimally condition sample 102 under control of computer 112.
As a practical matter, the nominal transverse dimensions of the conditioning module 200 and Color and Trash testing module 300 are 8.5 × 11 inches (21.59 × 27.94 cm), the size of standard paper in the United States. This is because the Color and Trash module 300 is based on available high quality and high resolution color scanners intended for office and graphics arts use in scanning documents. However, any device for acquiring the raw image files, and any transverse dimensions, may be employed.
By way of example, there are two alternative embodiments involving primarily valve 208 (FIG. 3) and perforated plate 212 (FIG. 3). Downward force on sample 102 in recessed conditioning/testing chamber 110 is important for the Color and Trash measurements. However, when sample conditioning is not required, as it is not for most color and trash measurements, sample pressure is applied by simpler mechanical means. Mechanical means may also be used to compress sample 102 when suction forces are inappropriate.
In the first alternative for applying pressure to the sample 102 under test, valve 208 in FIG. 3 is open while conditioned air from module 114 is delivered to condition sample 102. In this first alternative, the holes in the relatively thick and rigid perforated plate 212 are relatively large and the flow rate delivered for conditioning is high. After typically ten seconds, valve 208 partially closes and restricts flow 210 into Ultra-Rapid Conditioning module 200, thus causing a strong negative pressure or suction to be developed within pressure distribution plate 202. This suction causes atmospheric pressure to force plate 202 downward onto sample 102. Bellows 215 and seals 220 enable the downward movement and the suction, respectively. There is also an equal and opposite upward atmospheric pressure force on sample 102 exerted by window 204. Sample pressure is important for the Color and Trash measurement.
In the second, simpler alternative, there is no valve 208, and perforated plate 212 is preferably thinner and has fewer and/or smaller holes. These smaller holes in plate 212 inherently limit the flow 210 and thus develop the suction force across perforated plate 212 directly. Open areas of the order of 10% represent a satisfactory compromise between downward force, for Color and Trash measurements by module 300, and flow rate 210, for Ultra-Rapid Conditioning. This second alternative also enables parallel operation of Ultra-Rapid Conditioning processing by module 200 and Color and Trash testing by module 300.
The substantially simultaneous Ultra-Rapid Conditioning by module 200 and image acquisition testing by module 300 lasts less than one minute and can be as short as approximately ten seconds, depending on the scanner resolution chosen and how close in moisture content the selected sample 102 lies to an acceptable value, such as 7.3% for cotton. Such ultra-rapid sample conditioning is particularly useful as preconditioning for subsequent testing.
At the completion of the conditioning/testing cycle, pressure/distribution cover 202 (or a pressure plate (not shown) in the event Ultra-Rapid Conditioning is not employed) is opened. The cover 202 may be opened manually, or automatically upon receipt of a signal from computer 112. Sample 102, which is now conditioned for further processing and testing, is automatically or manually moved onto belt 118 for quick transport to an individualizer 120, which thoroughly opens, i.e., “individualizes,” sample 102 into its various constituent entities: fibers, neps, trash, seed coat fragments, sticky points, microdust, and the like. A suitable individualizer is disclosed in Shofner et al., U.S. Pat. No. 5,890,264. An alternative is for individualizer 120 to also clean sample 102 by removing trash, microdust and other foreign matter. However, in the disclosed embodiment almost all of the individualized entities are transported in the same transport flow stream.
This processing by individualizer 120 causes the thoroughly individualized entities to be entrained in or transported by about 120 CFM (3.4 m3/min) of conditioned air flow 122, such that the fiber and other entity concentrations transported by the gas flow at the output 126 of individualizer 120 are very low. Accordingly, the Nep content of thus-individualized sample 102 is measured with a nep sensor 124 which advantageously is built into the individualizer 120. A suitable nep sensor 124 is disclosed in Shofner et al., U.S. Pat. No. 5,929,460.
Sample 102, whose weight was estimated by operator 101 at approximately 15 grams, is at the output 126 of individualizer 120 in a highly opened, individualized state that simulates the state of fiber in important textile processing machines, especially carding. Accordingly, the state of the fiber is ideal for testing the individual fibers and other entities in the gas flow 122. One such test is the Nep test made by nep sensor 124. Other tests are Micronaire-Maturity-Fineness (MMF), effected by module 400. For Neps and for MMF, the sample weight must be known, not estimated, and sample masses of nominally ten grams are commonly used for both tests. The sample weight can be determined prior to or after the testing using known analytical balance technologies. Post-testing weighing can be automated.
The system aspects of the disclosed embodiment can be summarized:
1. Common flow;
2. Optimal sequence for sample tests, from surface measurement of Color and Trash to volume or weight measurements of Neps and Micronaire based on guessed weight or on precise weight;
3. Ideal sample state for simulations of actual processing (e.g., cleanability, processability, spinnability); and
4. Automatic except for selecting and introducing classer's sample, thus eliminating operator effort and errors. System and methods can be extended to complete automation.
Image-Based Color and Trash Measurements
FIGS. 2-5 show both the Ultra-Rapid Conditioning module 200 and the Color and Trash module 300 of the machine 100 of FIG. 1. FIG. 2 is a top view, without pressure/distribution cover plate 202; and FIG. 5 is a bottom view. FIGS. 3 and 4 are side and end views, respectively.
FIGS. 6 and 7 more generally disclose embodiments of the invention for measuring Color and Trash, including Extraneous Matter. Thus the Color and Trash module 300 is an image-based system which improves the measurements of these basically important fiber qualities over current methods. Preparation can also be measured, if that data product is needed.
Comparing FIGS. 6 and 7 reveals the formal analogies between human visual classification and machine vision classification. In the image acquisition step, both require good illumination of an object field and optimal positioning with respect to the imaging optics. Both form an image field on photosensitive surfaces having spatially discrete and color-resolving detection elements. And both communicate the image information to a powerful central processor for storage in a large and easily recalled memory. In the subsequent image analysis or pattern recognition step, both compare, analyze, or process the current image information with reference to corresponding image information stored in memory. Measurements or classing calls are made, based on that comparison.
Importantly, the (1) image acquisition and (2) image analysis steps are fundamentally distinct, sequential processes in the human and in the machine vision analogs. Also, image analysis is generally distinct from image reconstruction.
More specifically, an important feature of embodiments of the invention is seen in FIG. 7. Along with the unknown sample (or samples) under test, the object field includes one or more cotton standards whose colors are accurately, precisely and traceably known. Standards produced by USDA/AMS are prepared in boxes containing, typically, six known cottons which are referred to as “biscuits.” The reference biscuits are viewed under the same illumination and through the same optical elements as the unknown sample(s) under test. The cotton in the biscuits is generally in the same state, and pressed against the windows with the same pressures, as the unknowns. The biscuits can be kept in sealed chambers with inert gases to diminish oxidation and other color-changing causes, and thus extend the useful stable color life of the reference biscuits.
The system and method of FIG. 7 thus facilitate matching between unknown cottons and known cottons, or other reference materials. This is because the unknowns and knowns are in the same state and tested with the same optical system at very nearly the same instant in time, thus eliminating illumination, other electro-optical noise and drifts, and positioning errors. This computer-based image analysis and matching is seen to be rigorously analogous to the Classer's physiological actions, as described above and in FIG. 6, in both the image acquisition and image analysis/comparison steps, and in the electronic and in the human “classing call” step.
In an instrument classification embodiment, a high quality color scanner (intended for office or graphics arts use in scanning documents) is employed for image acquisition. About one fourth to one half of the 8.5 × 11 inch (21.59 × 27.94 cm) object field is devoted to the reference biscuits and other known reference materials, and the remainder to the unknown sample(s) under test. Acquired image file sizes, depending on spatial resolution (pixel size) and electronic resolution (bit depth), range from the order of tens of megabytes to several hundred megabytes. Files near the lower end of this size range are currently acquired and analyzed in less than thirty seconds. Accuracy and precision can approach fundamental limits associated with the color standards used.
The same method can be extended to measurement of trash, including measurement of the color of each trash particle, as well as its shape. This enables classification of bark and grass and other types of foreign matter. Another data product or measurement is preparation via surface texture.
The reference or “target” cottons are formed from universal cotton color standards prepared by the USDA/AMS in Memphis. However, reference cottons from any country, or even from an individual gin or mill, or a mix, may be used for the reference biscuits. Other reference materials can be employed in addition to cottons. This will be important as there is movement toward absolute or Commission Internationale de l'Eclairage (CIE) color measurements.
FIGS. 2-5 more particularly depict an embodiment in which color scanner-based image acquisition/analysis technology is adapted to the measurement of cotton Color and Trash. The Ultra-Rapid Conditioning process of module 200 takes place above window 204 in FIGS. 3 and 4. Sample 102 is pressed against window 204 by the downward (on perforated plate 212) and upward (on window 204) suction forces explained above. The Color and Trash measurements of module 300 take place below window 204. Color scanner apparatus 302 is attached to the bottom of stable table 111. This stable table 111 is made of aluminum or heavy plastic and is mounted on soft, highly damped springs (not shown); this mass-spring-damper system has the purpose of isolating the scanner from vibrations. Conditioned gas flow 115 from conditioner 114 in FIG. 1, which includes filtering, is directed into cabinet 113 to maintain cleanliness of the optical components. Scanner 302 may be a Hewlett-Packard 6000 series scanner or equivalent, and includes a scan head 304 which is driven along rails 306 by a stepper motor (not shown). Flexible signal cable 308 connects the several-thousand-pixel linear arrays 320, one for each of the three colors Red, Green and Blue, to on-board electronics module 310, which is in turn connected to computer 112.
Computer 112 controls scanner 302, acquires and stores its large quantities of data, and operates upon or analyzes these acquired data to generate scientific measurements, which is the functionality of interest in the first embodiment. In another embodiment, computer 112 is used to produce images for human classification, or for communication to other computers, including communications over the internet, where scientific measurements or human classifications can be made.
The processes of: (1) scanning an object field, and acquiring thereby an image file of raw data; and (2) operating upon those acquired raw data to produce a viewable, communicable or analyzable image field, are sequential and very sharply distinct. These processes can be characterized as (1) image acquisition and (2) image reconstruction or recording and reproduction. One analogy is acquisition of a photographic negative from which positive prints are made to reconstruct the image. Another analogy is music recording and subsequent reproduction. Appreciating the nuances of the separate processes of acquisition and analysis or recording and reproduction facilitates understanding the limitations of both processes, including spatial and electronic resolution, dynamic range, various nonlinearities, noise and most particularly, fidelity.
Scanner light source 312 in FIG. 4 illuminates the object field 330 above it (FIG. 5) with typical incident rays 313. Incident rays 313 originate directly from elongated lamp 312 or after collection by mirrors 314. Reflected light from object field 330, represented by typical reflected rays 315, hits mirror 316, is collected by lens 318, and is imaged onto linear array 320. Instability of the illumination source 312, with respect to intensity in time and in space, and to spectral emissions, is one of the major causes of poor color fidelity. Embodiments of the invention minimize this cause.
For an RGB color scanner with 600 dots/inch (236 dots/cm) or pixels/inch (236 pixels/cm) “optical” resolution and an 8.5 inch (21.59 cm) object field width, the number of pixels in the three color channels of linear array 320 is 600 × 8.5 × 3 = 15,300 pixels. For an 11 inch (27.94 cm) object field length at 600 dpi (236 dots/cm), there are 11 × 600 = 6,600 scan lines. Thus there are 5,100 × 6,600, or about 33.7 million, pixels of spatial resolution in the raw image, or about 101 million color samples across the three channels. If the electronic resolution or “bit depth” is 12 bits, or 1.5 bytes per sample, it follows that a 600 dpi (236 dots/cm) resolution scanned color image contains roughly 151 megabytes of data.
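The scan-size arithmetic can be checked in a few lines (note that the three color channels should be counted once, not once per dimension):

```python
# Check of the scan-size arithmetic: a 600 dpi RGB scan of an
# 8.5 x 11 inch object field at 12 bits (1.5 bytes) per color sample.
dpi = 600
width_in, length_in = 8.5, 11.0
channels = 3

pixels_per_line = int(dpi * width_in)          # 5,100 pixels per scan line
scan_lines = int(dpi * length_in)              # 6,600 scan lines
spatial_pixels = pixels_per_line * scan_lines  # 33,660,000 pixels
color_samples = spatial_pixels * channels      # ~101 million samples
megabytes = color_samples * 1.5 / 1e6          # 12-bit depth = 1.5 bytes/sample
print(spatial_pixels, color_samples, round(megabytes))  # 33660000 100980000 151
```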
Notwithstanding the enormous potential for spatial and electronic resolution in color scanning technology, there are some serious limitations. Scan speed is one of them, but that limitation is rapidly disappearing as the technology matures and finds wide application. In the cotton classing context of the invention, lack of fidelity, that is, the inability to faithfully reproduce true colors with very small variations, is a very serious limitation. When applied to Color and Trash measurement of cotton sample 102 on window 204, currently available scanners 302 (and CCD cameras also) are so severely deficient with respect to fidelity in the recording and reproduction of color images that the technology has not previously been successfully applied to absolute, scientific color measurement. It can be noted that Color and Trash classifications, which are still primarily done by humans, have much higher standards of performance than might be expected. In the field of cotton measurements, color scanner and CCD camera technology, and image acquisition and analysis, have prior to the subject invention been regarded as hopelessly deficient and inapplicable for cotton classing purposes because of color infidelity. This is particularly true for Colorimetric measurements. Embodiments of the invention overcome this limitation, and set the stage for eliminating human classers, who have their own limitations, in favor of all-instrument classing.
The bottom view of Color and Trash module 300 in FIG. 5 shows that scanned field of view 330 contains, in addition to the unknown sample 102 under test, which sample 102 is pressed against window 204, known reference materials 340, 342, 344, 346. These reference materials are also shown in FIG. 4. Reference material 344, like the others, is positioned above a window 348 that is identical in optical properties to window 204 upon which sample 102 is positioned. The spacings between the scan head 304 and the reference material windows 348 and sample window 204 are also identical, as are the pressures. Cotton reference materials having traceably known colors, described in greater detail below, are so positioned in fourteen chambers 342, 344. The pressure with which they are loaded is identical to the pressure on the unknown sample 102. The reference or “target” cottons are hermetically sealed within the chambers 342, 344.
The reference cottons are provided by the USDA AMS, Memphis, Tenn. and have precisely known color values assigned to them. By world-wide, between-government treaty agreements, these represent color data originating primarily with the cotton industry and traceable to this one laboratory. Most importantly, these standard cottons are competently and painstakingly produced by well-known methods used for calibrating human cotton classers.
Also seen in the object field are other color reference materials such as Munsell papers, portions of industry-accepted color charts such as Kodak Q-60, standardized paints or inks applied to the tops of the reference chamber windows 348, or ceramic tiles (chosen for their long-term stability), and the like. All reference materials are “seen” through windows 348 whose optical properties are also identical.
Use of the reference materials circumvents the multitude of drifts, offsets, nonlinearities, illumination variabilities, misalignments, etc., that limit color scanning or CCD camera fidelity. The technique can also be extended beyond scientific measurement to provide for enhancements in fidelity of color image recording and reproduction in general.
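As a minimal sketch of how in-image references can cancel such drifts (the readings and reference values below are illustrative, not actual USDA data, and the patent does not prescribe this particular correction), one can fit a gain/offset mapping from the references' measured readings to their known values and apply it to the unknown sample's reading:

```python
# Hypothetical sketch: cancel scanner drift using reference materials that
# appear in the same image as the unknown sample. Fit a least-squares line
# known = gain * measured + offset over the references, then apply it to
# the unknown's reading. All numbers here are illustrative only.

def fit_gain_offset(measured, known):
    """Least-squares fit of known = gain * measured + offset."""
    n = len(measured)
    mean_x = sum(measured) / n
    mean_y = sum(known) / n
    sxx = sum((x - mean_x) ** 2 for x in measured)
    sxy = sum((x - mean_x) * (y - mean_y) for x, y in zip(measured, known))
    gain = sxy / sxx
    return gain, mean_y - gain * mean_x

# Suppose this particular scan reads every reference 2 Rd units low:
readings = [68.0, 72.5, 76.3]   # scanner Rd readings of three references
known_rd = [70.0, 74.5, 78.3]   # their traceably known Rd values
gain, offset = fit_gain_offset(readings, known_rd)
corrected = gain * 76.0 + offset  # drift-corrected Rd for a sample reading 76.0
```

Because the references and the unknown are scanned at very nearly the same instant, the fitted correction removes whatever drift affected that particular scan, rather than relying on a factory calibration.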
In the disclosed embodiment of the invention for Color and Trash measurements, in module 300 color scanner 302 scans object field 330, which includes unknown sample 102 and reference materials 340, 342, 344, 346. The scanner 302 acquires or records a raw image file for all objects within object field 330 having 8.5 × 11 inch (21.59 × 27.94 cm) width and length. Computer 112 operates on the raw recorded data to compare the unknown sample 102 to the known reference materials 340, 342, 344, 346, pixel by pixel if necessary, calculates the best match for Color and, independently, for Trash, and records these measurement data into memory for later use. Examples are measurement and reporting of Color: Rd, +b, and +a of 78.3, 9.3 and 0.13; and Trash: 126 trash particles having 0.16% area fraction and 42% of the particles having equivalent diameters greater than 100 micrometers. These typical Color and Trash readings are according to definitions required by the USDA for the target materials located in chambers 342, 344.
Trash measurements proceed in the same manner as color measurements, wherein reference materials having known percentage areas of leaf and other foreign matter are placed in the object field and subsequently compared to the unknown samples. Care is taken in the preparation of these reference materials that the types of trash and their size distributions are representative of commercial cotton.
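The trash measurement can be sketched as segmenting dark foreign-matter pixels from the lighter cotton background, counting connected particles, and computing percent area and equivalent diameters. This is a minimal pure-Python illustration; the threshold, pixel scale, and synthetic speck values are assumptions for the example, not disclosed parameters.

```python
import math

def label_particles(mask):
    """4-connected component labeling of a boolean 2-D mask.
    Returns the size (pixel count) of each particle found."""
    h, w = len(mask), len(mask[0])
    seen = [[False] * w for _ in range(h)]
    sizes = []
    for i in range(h):
        for j in range(w):
            if mask[i][j] and not seen[i][j]:
                stack, size = [(i, j)], 0
                seen[i][j] = True
                while stack:
                    y, x = stack.pop()
                    size += 1
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if 0 <= ny < h and 0 <= nx < w and mask[ny][nx] and not seen[ny][nx]:
                            seen[ny][nx] = True
                            stack.append((ny, nx))
                sizes.append(size)
    return sizes

def trash_stats(gray, threshold=100, microns_per_pixel=50.0):
    """gray: 2-D grayscale image; trash pixels are darker than `threshold`.
    Returns (particle count, percent area, equivalent diameters in um)."""
    mask = [[px < threshold for px in row] for row in gray]
    sizes = label_particles(mask)
    total = len(gray) * len(gray[0])
    area_fraction = 100.0 * sum(sizes) / total
    # equivalent diameter: diameter of a circle with the same pixel area
    diam_um = [2.0 * math.sqrt(s / math.pi) * microns_per_pixel for s in sizes]
    return len(sizes), area_fraction, diam_um

# Illustrative use: a bright cotton field with two dark specks
img = [[220.0] * 100 for _ in range(100)]
for i in range(10, 12):
    for j in range(10, 12):
        img[i][j] = 30.0   # 4-pixel particle
for i in range(50, 53):
    for j in range(50, 53):
        img[i][j] = 40.0   # 9-pixel particle
count, frac, diam = trash_stats(img)
```

In practice the threshold itself would be established from the trash reference materials imaged in the same field, in the same way the color references anchor the color scale.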
It is significant that, in the instrument classification embodiment, scientific measurements can be the end result, not necessarily high-fidelity images as in the human classification embodiment. Indeed, a limited objective for Color and Trash measurements starts with known USDA Color and Trash data for the known reference or target cottons; the limited objective ends with matching or hitting the targets of these USDA-generated values, as disclosed above.
Thus, no image of the test cotton need be displayed for the scientific tests disclosed herein. However, satisfactory results, both for matching the internal targets for Color and Trash measurements and for producing viewable and analyzable images for human and for automated classing, can be obtained when standard paints are applied to the top side of windows 340, 346, for which paints the scientific colors are determinable and stable. This means that our apparatus, methods, and system can be applied to more rigorous, absolute color measurements. Accordingly, our methods and apparatus, which embrace the two types of absolute, instrumental color referencing, to traceable materials in chambers 340, 346 and to "accepted" materials with "accepted" colorimetric target values in chambers 342, 344, along with unknown samples 102 in identical optical environments 330, can be extended to enhance fidelity in the recording and reproduction of images for measurement purposes in general.
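Absolute color referencing against patches of known, stable value can be illustrated by a simple per-channel gain/offset fit. This is a hypothetical sketch: the closed-form least-squares fit and all patch readings and target values are made up for illustration.

```python
def fit_gain_offset(raw, true):
    """Least-squares fit of true ~= gain*raw + offset for one channel,
    from paired readings of reference patches."""
    n = len(raw)
    mr, mt = sum(raw) / n, sum(true) / n
    cov = sum((r - mr) * (t - mt) for r, t in zip(raw, true))
    var = sum((r - mr) ** 2 for r in raw)
    gain = cov / var
    return gain, mt - gain * mr

# Raw scanner readings of three reference patches and their accepted values
# (illustrative numbers only):
raw  = [50.0, 120.0, 200.0]
true = [40.0, 110.0, 190.0]
gain, offset = fit_gain_offset(raw, true)
corrected = gain * 130.0 + offset  # correct an unknown raw reading
```

Because the reference patches sit in the same object field as the unknown, the fitted correction absorbs whatever drift or offset the scanner exhibited on that particular scan.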
It will be appreciated that embodiments of the invention rely upon the availability of advanced digital technologies for high resolution, high quality, color image acquisition, storage, processing, and communication. One salient feature is very large digital files. One such color image file, acquired with maximum resolution available now, can contain as many megabytes of data as all of the currently-reported HVI data on one million bales. Of course, these large image files are analyzed and reduced to data products representing scientific measurements which are only a few bytes in length.
FIGS. 8-11 indicate the levels of performance achieved in the instrument classification embodiment. The scientific readings produced compare very favorably with the USDA/AMS results for Color and Trash reference materials over a wide range of color and trash values. FIG. 12 is a composite image containing an image of both samples from a bale and an overlay of the scientific measurements for the bale having the indicated permanent bale identification bar code. It may be appreciated that our scientific measurements are obtained with the new technologies and methods herein disclosed, whereas the reference readings are based on very old, known technologies. Particular attention is drawn to FIG. 11, which compares the color readings of our methods, marked STI, with conventional HVIs A, B and C, all in different laboratories but in a round test using the same reference cotton samples. USDA tolerances of +/−1 unit in Rd and +/−0.5 unit in +b are represented by the boxes. It is clearly seen that the STI data are within the tolerance boxes more frequently than the conventional HVI data.
To summarize the procedural steps for producing scientific measurements of color and trash by our invention, given a calibrated apparatus including known reference materials:
1. Acquire an image file of the unknown sample under test along with known reference materials in a single object field;
2. Apply color and trash matching algorithms in an image analysis step so that the internal computer interpolates the measurements and makes the classification “call.”
Internet Classing
Cotton Classing takes on or will take on two primary forms or transitional combinations thereof:
A. Traditional Classing. Collect samples and ship them to remote sites for human and/or instrument classing. Archivally store many physical samples remotely and the data in database(s); and
B. Internet Classing. Execute instrumental classing locally (in gins, warehouses, or mills) and transfer the fiber quality measurements, including images, over the Internet. Archivally store few samples for calibration confirmations and all data, including images, in databases.
One transitional combination of Traditional and Internet Classing begins with acquisition of images, along with other fiber quality data, at the gin. These data and images are then transferred over the Internet for classification and other judgements and decisions, such as buy/sell decisions, by humans. Thus a human classification is made by classers looking at a high-fidelity reproduction of the image file on an image display device such as a high-quality color video monitor.
To better appreciate performing cotton classing and selling/buying cotton over the Internet, operation of the instrument 100 is briefly described here: An operator removes the two cut samples from the two bale sides and places both samples on the Color and Trash scanning window. The operator starts the testing process, and the instrument then automatically completes the chosen fiber quality tests, which can include color, trash, micronaire, length, strength, neps and stickiness, for example. Each sample is internally conditioned to 65% RH and 21° C., standard laboratory conditions. The image and associated fiber quality data are then sent to a local network for viewing by the ginner, and then to the Internet to be viewed by potential buyers or the producers.
The image of FIG. 7 displays both classers' samples with their associated fiber quality measurements. Other data may be attached to inform the potential buyer of relevant information to make a purchasing decision.
Again, an important feature of embodiments of the invention is inclusion of known reference materials, especially cottons for this embodiment, in the object field of view. The primary commercial purpose for this "Internet Classing" embodiment is to enable communication of the acquired image files over the Internet with subsequent remote classification. By "Internet Classing" 500 in FIG. 13 we mean, procedurally:
1. acquisition of raw image files of object field 502 at one location, including unknown sample under test 504 and one or more known reference materials 506 in the object field;
2. communication of these files over the Internet 508 to one or more computers at one or more remote locations which reconstruct, with high fidelity, images 510 of the original object field 502; and
3. observation of the reconstructed images by trained human observers or "Cotton Classers" 512.
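The first two procedural steps amount to a package-and-verify round trip over the network. The following is a hypothetical illustration only: the JSON metadata fields, the SHA-256 checksum scheme, and the bale identifier are our assumptions, not a disclosed protocol.

```python
import hashlib
import json

def package_for_classing(image_bytes, bale_id, measurements):
    """Bundle the raw image file with its fiber-quality metadata and a
    checksum so the remote station can verify integrity before use."""
    meta = {
        "bale_id": bale_id,
        "measurements": measurements,
        "sha256": hashlib.sha256(image_bytes).hexdigest(),
    }
    return json.dumps(meta).encode(), image_bytes

def verify_at_remote(meta_bytes, image_bytes):
    """At the remote classing station: check the received image against
    the checksum carried in the metadata."""
    meta = json.loads(meta_bytes)
    ok = hashlib.sha256(image_bytes).hexdigest() == meta["sha256"]
    return ok, meta

# Illustrative round trip (fake image payload):
img = b"\x89fake-image-data"
meta, payload = package_for_classing(img, "BALE-0001", {"Rd": 78.3, "+b": 9.3})
ok, received = verify_at_remote(meta, payload)
```

Since the reference materials travel inside the same image file as the unknown, any distortion the transfer or reconstruction introduces affects both alike, which is what makes the remote comparison valid.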
Most importantly, human observers 512 see a reconstructed image field containing the sample under test and the standard reference materials. This means that unavoidable distortions, nonlinearities and noise in the acquisition, communication, and reconstruction steps affect both the unknown sample under test and the known reference material.
The acquisition of raw image files of the object field is precisely the same as described for the first embodiment and is further described in FIG. 7.
In another embodiment, scientific measurements can be made by a computer processor at the remote location, operating on the digital image file, without necessarily displaying it.
A few clarifying comments now complete the disclosure. Whereas the original object field 502 and reconstructed image field 510 are seen to have eight square reference biscuits 506 surrounding the square sample under test 504, it will be appreciated that the number and shape of reference biscuits and the size and shape of the sample under test may be chosen to suit the particular preferences of the user. For some applications, linear strips of reference material are advantageous.
Although our methods make human classification from a color monitor possible, the monitor used to reconstruct the images 510 should preferably be high fidelity and calibrated, and it should preferably be situated in an appropriate viewing environment. It will be appreciated that human classification based on passively reflected light from a physical sample and classification based on actively emitted light from a monitor are fundamentally different. Without inclusion of known reference materials in the recording and reconstruction steps, human classification from a monitor would not be possible.
It will be appreciated that discerning small color and trash differences is not easy, physiologically or psychologically, and all of the "tricks" of modern pattern recognition must be employed to facilitate the human judgement call. Note again the eight reference biscuits 506 seen in FIG. 13; their primary purpose is to accommodate the distortions, nonlinearities and noise of real-world color recording and reproduction apparatus over a wide range of colors. We have found that interpolations in reconstruction can enable generation of a much larger set of equivalent reference materials with a narrower range of color differences in images having different shapes. Thus the human Classer 512 in FIG. 13 would determine that the color of the unknown sample lies between that of a few of the knowns. He or she would then cause the computer, via keyboard 514 or other control means, to produce interpolated biscuits with a narrower color range in a pattern 517 as seen in FIG. 14. The unknown sample's image is in the center and the interpolated reference biscuits 518 surround it, from which a more accurate and precise color call is made.
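Generating interpolated biscuits between two known reference colors can be sketched as simple linear interpolation in color space. The RGB triples and step count below are illustrative assumptions only; the actual interpolation basis used for classing is not specified by this sketch.

```python
def interpolate_biscuits(color_lo, color_hi, steps):
    """Evenly spaced colors between two reference colors, endpoints
    included, giving a 'ladder' of virtual reference biscuits."""
    out = []
    for k in range(steps):
        t = k / (steps - 1)
        out.append(tuple(lo + t * (hi - lo)
                         for lo, hi in zip(color_lo, color_hi)))
    return out

# Illustrative ladder of eight biscuits between two known biscuit colors:
ladder = interpolate_biscuits((180.0, 170.0, 150.0), (200.0, 186.0, 158.0), 8)
```

The displayed ladder narrows the color spacing the Classer must judge, so the final call is an interpolation between virtual references rather than a leap between widely spaced physical ones.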
Another color pattern recognition tool is seen in FIG. 15, where the unknown image is surrounded by an interpolated image 520 whose color coordinates may be varied by the Classer until a match is realized. We have found that this "trick" is particularly useful for human cotton classing. However, we have also found that the computer can surpass the human when the matching is automated.
We note finally that the communicated digital image files may be analyzed remotely as well as reconstructed for human viewing. That is, since digital communication is almost error free, and since transfer of files of tens to hundreds of megabytes is increasingly feasible, the image analysis from which scientific data products are produced may take place remotely and at any later time. This capability enables different analytical procedures to be employed. For example, the general algorithms by which conventional "HVI" color and trash measurements are generated at one location may not serve the specific purposes of certain Customers at other locations, who could execute unique algorithms better matched to their requirements. This is particularly important for international commerce, since the standards used by sellers and buyers in different countries are generally different.
We further note in conclusion that embodiments of the invention provide for local viewing by humans and for viewing aids such as magnification and various pattern recognition enhancements known in the art.
While specific embodiments of the invention have been illustrated and described herein, it is realized that numerous modifications and changes will occur to those skilled in the art. It is therefore to be understood that the appended claims are intended to cover all such modifications and changes as fall within the true spirit and scope of the invention.

Claims (21)

What is claimed is:
1. A system for acquiring an image for classifying an unknown sample of cotton, said system comprising:
an optical imaging device having a defined object field of view;
an element for positioning the unknown sample of cotton in the object field of view; and
at least one reference material in the object field of view;
whereby said optical imaging device acquires images of both the unknown sample of cotton and the reference material in the same field of view.
2. The system of claim 1, wherein said optical imaging device comprises a scanner intended for scanning documents.
3. The system of claim 1, wherein said reference material is selected from the group consisting of cotton standards, Munsell papers, color charts, standardized paints, and ceramic tiles.
4. The system of claim 1, wherein said optical imaging device outputs a digital image file.
5. A system for classifying an unknown sample of cotton, said system comprising:
an optical imaging device having a defined object field of view;
an element for positioning the unknown sample of cotton in the object field of view such that said optical imaging device acquires an image of the unknown sample;
at least one reference material in the object field of view such that said optical imaging device acquires an image of the reference material in the same field of view as the image of the unknown sample; and
a processor connected to receive a digital image file from said optical imaging device and operable to compare data corresponding to the image of the unknown sample with data corresponding to the image of the reference material to determine a cotton color measurement.
6. The system of claim 5, which further comprises a storage device for storing the digital image file for subsequent reconstruction and viewing.
7. The system of claim 5, wherein:
there are a plurality of cotton standards as reference materials in the object field of view; and wherein
said processor is operable to compare data corresponding to the image of the unknown sample with data corresponding to images of the reference materials to determine the closest match.
8. The system of claim 5, wherein:
there are a plurality of color samples as reference materials in the object field of view; and wherein
said processor is operable to determine a color calibration from images of the reference materials and to adjust data corresponding to the images of the unknown sample in view of the color calibration to determine color values.
9. The system of claim 5, wherein said optical imaging device comprises a scanner intended for scanning documents.
10. The system of claim 5, wherein said reference material is selected from the group consisting of cotton standards, Munsell papers, color charts, standardized paints, and ceramic tiles.
11. The system of claim 5, wherein said processor is co-located with said optical imaging device.
12. The system of claim 5, wherein said processor is located remotely from said optical imaging device.
13. The system of claim 5, wherein said processor is further operable to measure trash within the unknown sample.
14. The system of claim 5, wherein said processor is further operable to measure preparation of the unknown sample.
15. A method for classifying an unknown sample of cotton, said method comprising:
employing an optical imaging device to acquire, within the same object field of view, an image of the unknown sample of cotton and an image of at least one reference material; and
employing a processor to compare data corresponding to the image of the unknown sample with data corresponding to the image of the reference material to determine a cotton color measurement.
16. A system employing a computer network for classifying an unknown sample of cotton, said system comprising:
an optical imaging device having a defined object field of view;
an element for positioning the unknown sample of cotton in the object field of view such that said optical imaging device acquires an image of the unknown sample;
at least one reference material in the object field of view such that said optical imaging device acquires an image of the reference material in the same field of view as the image of the unknown sample; and
a connection via the computer network for transmitting a digital image file from said optical imaging device representing the object field of view to a remote location.
17. The system of claim 16, which further comprises an image display device at the remote location whereby a human observer can classify the cotton by comparing an image of the unknown sample with an image of the reference material.
18. The system of claim 16, which further comprises a processor at the remote location connected to receive the digital image file from said optical imaging device and operable to compare data corresponding to the image of the unknown sample with data corresponding to the image of the reference material to determine a cotton color measurement.
19. A method for classifying an unknown sample of cotton, said method comprising:
employing an optical imaging device to acquire, within the same object field of view, an image of the unknown sample of cotton and an image of at least one reference material;
constructing a digital image file representing the images; and
transmitting the digital image file via a computer network to a remote location.
20. The method of claim 19, which further comprises employing an image display device at the remote location to display images of the unknown sample and the reference material in the same field of view whereby a human observer can classify the cotton by comparing the images.
21. The method of claim 19, which further comprises employing a processor at the remote location to compare data corresponding to the image of the unknown sample with data corresponding to the image of the reference material to determine a cotton color measurement.
US6735327B1 — Color and trash measurements by image analysis. Application US09/663,502, filed 2000-09-15; published 2004-05-11. Status: Expired - Fee Related. Family ID: 32234346.

Applications Claiming Priority
US15452799P, filed 1999-09-16
US18273100P, filed 2000-02-15
US22110400P, filed 2000-07-27
US09/663,502, filed 2000-09-15




Legal Events
2000-12-12: Assignment to Shofner Engineering Associates, Inc. (assignors: Frederick M. Shofner; Christopher K. Shofner), Reel/Frame 011406/0157.
Fee payment: year 4.
Maintenance fee reminder mailed.
2012-05-11: Patent lapsed due to failure to pay maintenance fees under 37 CFR 1.362.

