WO2025069036A1 - Cargo inspection method - Google Patents

Cargo inspection method

Info

Publication number
WO2025069036A1
Authority
WO
WIPO (PCT)
Prior art keywords
data
manifest
images
machine learning
customs
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
PCT/IL2024/050967
Other languages
French (fr)
Inventor
Chiyya SMASON
Sergey SEVASTIAN
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Cargoseer Ltd
Original Assignee
Cargoseer Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Cargoseer Ltd
Publication of WO2025069036A1
Legal status: Pending

Abstract

A method for comparing manifest and other bill-of-goods data with physical Customs Inspection data such as X-ray imagery is disclosed. The method relies upon machine learning methods to make use of as wide a variety of possibly-relevant data sources as possible. Physical Customs Inspection data may come from any available invasive and non-invasive inspection method such as visible-light cameras, muon tomography, ultrasound, and density measurements, while manifest data may likewise be ingested from scanned documents, CRM packages, e-manifests, and other sources. Furthermore, historical data regarding shipping patterns is exploited to inform the machine learning methods of the invention, which can thus arrive at highly accurate analyses by use of the widest possible range of data. In particular, machine learning systems and methods are disclosed that are able to find discrepancies between multiple sources of information. Additionally, systems and methods for implementing immutable, long-term storage and retrieval of customs information using blockchain technology are disclosed.

Description

Cargo Inspection Method
BACKGROUND OF THE INVENTION
[1] The field of cargo inspection systems has undergone significant advancements and innovations in recent years, driven by the need for efficient and reliable methods to inspect and secure cargo shipments and the development of regional and international treaties and protocols to prevent transport of problematic materials or undocumented/misdocumented transport of material in general. These systems play a pivotal role in ensuring the safe and efficient movement of goods across borders while mitigating risks associated with fraud, contraband, hazardous materials, and security threats.
[2] Early forms of inspection primarily relied on manual processes and visual examination, often based on 'spot checks' of randomly sampled shipping containers. As global trade expanded, so did the need for more sophisticated inspection methods. The transition from manual inspection to automated cargo inspection marked a significant turning point in the evolution of these systems.
[3] Non-intrusive inspection (NII) was introduced with the advent of large-format X-ray technology into cargo inspection systems, marking a significant milestone in the technological evolution of these systems. X-ray scanners allowed for non-intrusive and detailed imaging of the contents of cargo containers, enabling inspectors to verify container contents for Customs Tax purposes, while detecting hidden contraband, weapons, and dangerous goods more effectively. Over time, X-ray systems evolved to include dual-energy and multi-view capabilities, enhancing the accuracy and capabilities of inspection.
SUBSTITUTE SHEET (RULE 26)
[4] Muon, gamma-ray and neutron imaging systems emerged as alternatives to X-ray technology for NII, offering unique advantages in detecting certain types of materials, such as nuclear substances and dense metals. These systems employ muon, gamma-ray and neutron sources to generate detailed images, providing inspectors with additional tools for identifying hidden threats.
[5] The integration of advanced sensor technologies, such as radiofrequency identification (RFID) and chemical detection sensors, further expanded the capabilities of cargo inspection systems. RFID technology enables real-time tracking and monitoring of cargo, improving supply chain visibility and security. Chemical sensors can detect hazardous materials, explosives, and narcotics, enhancing the safety of cargo transportation.
[6] Recent advancements in machine learning and artificial intelligence have further revolutionized cargo inspection systems. These technologies enable automated threat recognition and anomaly detection from images, reducing the reliance on human operators and significantly improving inspection efficiency. Machine learning algorithms analyze data from various sensors and imaging modalities to identify potential risks, making cargo inspection systems smarter and more responsive.
[7] As mentioned, the development of cargo inspection systems has been closely intertwined with regulatory and security initiatives. Government agencies and international organizations, such as the World Customs Organization (WCO) and the International Maritime Organization (IMO), have established standards and guidelines to ensure the effective implementation of these inspection systems while safeguarding trade and security interests.
[8] Despite significant advancements, cargo inspection systems continue to face challenges related to the rapid growth of global trade, emerging security threats, and the need for seamless integration with supply chain logistics. In particular,
reliable automated combination of cargo manifest documentation, imaging information, and other sources of data remains a challenge.
[9] As a first example of attempts to address such problems, consider US10422919, which discloses systems and methods for integrating manifest data for cargo and light vehicles with X-ray images. Manifest data is automatically imported into a software system for each container, helping security personnel to quickly determine the contents of cargo. In case of a mismatch between the cargo contents shown by manifest data and the X-ray images, the cargo may be withheld for further inspection. However, this method involves no neural network or machine learning methods, nor use of muon or tomographic imaging methods. Furthermore, the X-ray image is compared only with historical images based on cargo code, and does not take into account further information that would narrow down the exact type, quantity, volume, or value of cargo expected. Finally, other aspects of the shipment, such as historical quantities of a given item for a given shipper, are not taken into account.
[1] Similarly, DE102013222098A1 describes a freight control system having a material statistics estimator adapted to estimate material statistics for a cargo container. This estimator is configured to convert material specifications and corresponding quantities of a freighter declaration into a set of expected material statistics. A second material statistic estimator attempts to estimate actual material statistics by means of an image processing unit adapted to analyze radiographic transmission image data of the container. The expected and actual material statistics are compared and a corresponding method for checking a plausibility of a freighter declaration is also disclosed. Generally the material statistics involved are amounts of material, e.g. the weight of expected and measured material in kilograms.
[2] However, as before, this method as reduced to practice involves no neural network or machine learning methods, no use of muon or tomographic imaging methods, no use of morphological or shape data concerning the images, and no use of historical manifest and commercial data.
SUMMARY OF THE INVENTION
[3] The invention is a method for cargo/container inspection that centers on the use of multiple sources of information about cargo shipments to detect contraband, verify cargo contents for Customs and tax purposes, and uncover other problems. Documents including but not limited to the manifest, customs declaration, commercial invoice, packing list, and purchase order; images from multiple possible sources, including but not limited to e-commerce sites and other publicly available internet repositories; and other contextual and historical data are combined to reconcile the declared contents of the cargo against the multiple information sources, for the purpose of finding shipment irregularities (which include any variance between the manifest, regulations, and the actual cargo being shipped).
[4] Images from NII sources such as X-ray images, and other measured physical data such as cargo weight, may also be used as inputs to the system. Further images, such as external container photographs and images of the conveyance method (such as a flatbed truck or railcar), may also be used as inputs to the system.
[5] Manifest data obtained from software sources such as a Customs office CRM system, scanned using OCR, or otherwise obtained is input to the system to obtain information concerning declared materials being shipped. Other details from the manifest, such as origin country, country of passage, destination country, customs agent, and so on, may also be used.
[6] Contextual data of the same sort is also used for purposes of informing the subsequent stages of analysis. These data may include lists of historical shipment information obtained from internal or external data sources, including images, countries and locations of origin and destination, companies producing the goods
being shipped, shipping companies and services involved, the shipping route taken, and so on.
[7] The aforementioned data are used as input into algorithms (in particular machine learning methods) adapted to analyze and detect problem shipments. Results of the inventive analysis may be output in the form of pass/fail determinations, or more detailed reports. Advanced training methods including unsupervised learning, self-supervised learning, and synthetic data may also be used to overcome the scarcity of labelled training data.
[8] Another unique provision of the invention is that results of the analysis may be immutably associated with the container/shipment using means such as blockchain technology.
BRIEF DESCRIPTION OF DRAWINGS
[9] Fig. 1 shows a schematic flow chart consistent with some embodiments of the invention.
[10] Fig. 2 shows one possible network setup for a machine learning system of the invention.
DETAILED DESCRIPTION OF THE INVENTION
[1] The invention is a method for cargo/container inspection that centers on the congruent use of information from as many sources as possible to arrive at conclusions concerning cargo shipments. The manifest, images (potentially from a number of sources), and other data are combined and subsequently reconciled to find irregularities (which includes any variance between the manifest, regulations, and actual cargo being shipped.)
[2] Image modalities may include X-ray (possibly in multiple wavelength bands), muon imaging (direct or scattering), visible-spectrum images, gamma-ray images, ultrasound images, neutron time-of-flight images, photon time-of-flight images, LIDAR data, tomographic/3D methods for any of these, and any other source. Images may include photographs of shipping container exteriors, transport vehicles, and other image data.
[3] Visible images are of particular interest since they are easy to obtain with inexpensive means, and can provide a great deal of data that is otherwise not used in normal customs processes. For example, the unique container ID may be recognized from an image of the container's door, ensuring that the container is the same one being x-rayed for evidentiary purposes. Once the system has been in operation long enough, containers will begin to be seen more than once, allowing comparisons from shipment to shipment and revealing whether an ID has been tampered with. Likewise, a historical baseline of what the container physically looks like allows the system to detect additions or changes such as refrigeration units, which are sometimes used as hideouts, false walls which are used to conceal items being smuggled, and so on. Such images are also relevant for instance in the case of license-plate recognition on the vehicle carrying the container, allowing the system to build the same type of historical baseline for the transport vehicle (such as cab and trailer). This allows the system to flag changes in the transport vehicle(s) such as changes in the engine assembly, and additions/removals/changes of hardware such as the spare tire, toolkits, auxiliary fuel tanks, and so on.
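The historical-baseline idea described above can be sketched as follows. The feature names and the simple set-difference comparison are illustrative assumptions for this sketch, standing in for whatever visual features (container ID, attachments, license plate) a real recognition pipeline would extract:

```python
# Hypothetical sketch: compare features recognized in current images of a
# container against the record from previous sightings, flagging additions
# (e.g. a new refrigeration unit) and removals (e.g. a changed plate).

def baseline_check(history, container_id, observed_features):
    """history: dict mapping container_id -> set of previously seen features.
    Returns (added, removed) feature sets; both empty means no change."""
    baseline = history.get(container_id)
    if baseline is None:
        # First sighting: record it and report nothing anomalous.
        history[container_id] = set(observed_features)
        return set(), set()
    added = observed_features - baseline
    removed = baseline - observed_features
    history[container_id] = set(observed_features)  # update the baseline
    return added, removed

history = {}
baseline_check(history, "MSKU1234567", {"refrigeration_unit", "plate:AB-123"})
added, removed = baseline_check(
    history, "MSKU1234567", {"refrigeration_unit", "plate:ZZ-999"})
# A changed license plate shows up as one addition and one removal.
```

In a deployed system the features would of course come from the image-recognition stages rather than being supplied by hand.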
[4] Further physical data may be used by the inventive system including container type, container attachments, trailer type, cargo weight, center-of-mass, higher- order mass moments, acoustic conductivity, X-ray or other imaging source current and voltage, and other physical parameters that can be measured with or without opening the container.
[5] Manifest and related shipment documentation data in multiple different possible formats is used, either being pulled directly from a CRM or other software sources, scanned using OCR, manually input, or otherwise obtained. Manifest and related shipment documentation can come from a large variety of sources, including the manifest(s) itself, declarations, commercial invoices, purchase orders, phytosanitary documents, standards certificates, bills of lading, certificates of origin, logistical documents, physical inspection documents, as well as customs inspection data including assessment data, operator input, seal inspections, physical inspections, canine inspections, and so on. It is within provision of the invention that all these sources be used, whenever available.
[6] Contextual data (which may include historical data concerning shipping companies, routes, and types of cargo) is also used for purposes of informing the subsequent stages of analysis. These details may include historical data of the same sort as that used to document current shipments - images, countries of origin and destination, companies producing the goods being shipped, the shipping companies involved, the shipping route taken, shipping vehicles involved, and so on. Further context includes importer identity; container vessel loadings and reloadings; information from inside Customs or other government agencies; intelligence tips including OSINT and other sources; and open-source or publicly available data regarding any of these details, including the internet and the deep web. Further contextual data may include images generated from the manifest. For instance if the manifest mentions a new type of good that the image processing parts of the system has not been trained to detect, images of this good may be obtained for example using internet or database searches. These images may then be used in a number of ways - for retraining the machine learning algorithms of the system, or analyzed in terms of features and directly used by existing machine learning algorithms. In some cases, historical data of any nature may be used as input for training or otherwise developing machine learning systems.
[7] In order to deal with the scarcity of X-ray, muon, neutron, or other images of various goods (and especially of contraband goods) it is within provision of the invention to generate artificial images from visible-light images, thus greatly increasing the amount of training data available. The artificial images may be produced using a number of means, for instance by use of a GAN, or by use of other software provided with estimates of Z-number for various elements of an object.
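One simple physics-based route to such artificial images is the Beer-Lambert attenuation law, I = I0 · exp(-mu · t). The sketch below is illustrative only: the material-to-attenuation table is a placeholder assumption standing in for real Z-number-derived coefficients, and a GAN-based approach would look entirely different:

```python
import math

# Synthesize an X-ray-like transmission map from a segmented visible-light
# scene using Beer-Lambert attenuation: I = I0 * exp(-mu * t).
MU = {"air": 0.0, "organic": 0.2, "aluminium": 0.8, "steel": 2.3}  # assumed, per cm

def simulate_transmission(material_map, thickness_cm, i0=1.0):
    """material_map: 2D list of material names; returns transmitted intensity."""
    return [[i0 * math.exp(-MU[m] * thickness_cm) for m in row]
            for row in material_map]

scene = [["air", "organic"],
         ["steel", "aluminium"]]
image = simulate_transmission(scene, thickness_cm=10.0)
# Dense steel transmits far less than organic material or air.
```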
[8] To more fundamentally address the challenges of annotating images and the scarcity of such annotated images, a central aspect of our invention involves techniques for training large architectures (i.e. those having large numbers of weights) to enhance performance without relying on vast annotated datasets. While these larger architectures typically demand more data for training, obtaining such data isn't always feasible. Given the absence of a large, tailored dataset that aligns with our requirements, our solution harnesses publicly sourced web repositories, establishing a streamlined process to filter and obtain relevant data tailored to our specific task: dynamically recognizing a continually evolving set of customs goods.
[9] Employing a self-supervised learning methodology, analogous to techniques utilized in state-of-the-art text-based language models, our approach offers a potent and adaptable strategy for AI model training. This methodology circumvents the need for extensive labeled data, allowing for training on diverse image sets devoid of any supplementary metadata.
[10] Self-supervised learning involves training a network on a task related to the one actually desired, but for which labelled data is easily generated. In the case of generating embeddings for language models, for example, the task can be predicting a missing word eliminated from a sentence; practically unlimited examples of this sort may be generated, and the task is close enough to others of interest that the networks so generated may be easily fine-tuned for these other tasks.
[11] This inventive strategy, applied to the tasks of the invention, integrates a suite of data augmentations for X-ray and other expected image types, including but not limited to color variation, rotations, Gaussian blurring, solarization, and diverse cropping methods. These augmentations guide large-scale neural networks in understanding the statistical image representations of a wide array of customs goods. The self-supervised tasks in this case can be predicting or 'undoing' the (known) augmentations that have been applied; for example, what color variations have been applied, or to what angle an image has been rotated, or infilling the cropped section of a cropped image, or, for any of the transforms, what the original image looked like. Since these tasks have known 'answers' and can be generated at will, enough examples can be produced to train even large networks to a useful state. This pretrained network may then be fine-tuned for a different task using relatively few labelled examples, for example finding classified bounding boxes around objects of interest by means of a few hundred or thousand examples.
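The rotation-prediction pretext task described above can be sketched minimally as follows. The toy "image" is a 2D list and the rotation is restricted to 90-degree steps; a real system would use tensors, continuous augmentations, and a CNN, so this is an assumption-laden illustration of the label-generation idea only:

```python
import random

# Labels are generated for free by applying a known rotation; the network is
# then trained to predict which rotation was applied.

def rotate90(image, k):
    """Rotate a 2D list 90 degrees clockwise, k times."""
    for _ in range(k % 4):
        image = [list(row) for row in zip(*image[::-1])]
    return image

def make_pretext_pair(image, rng=random):
    """Return (augmented_image, label), where label in {0,1,2,3} encodes the
    rotation applied -- the 'free' supervision signal."""
    k = rng.randrange(4)
    return rotate90(image, k), k

img = [[1, 2],
       [3, 4]]
rotated, label = make_pretext_pair(img)
# `label` is known by construction, so unlimited training pairs can be made.
```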
[12] Furthermore, our system incorporates a self-distillation technique, fostering the ability to execute few-shot learning, which ensures progressive refinement and adaptability to the ever-changing customs landscape.
[13] Although inference processes for extensive models typically demand robust hardware, which could constrict various practical applications, our invention alleviates this constraint. We employ a model distillation process, enabling the encapsulation of a large model's insights into a more compact framework. Rooted in self-distillation principles, our proprietary algorithm seamlessly facilitates the conversion of expansive models into more concise counterparts.
[14] To gain access to further sources of data and also provide a seamless means for reporting results, software means of the invention are adapted to interface with various standard software, such as CMS (Customer Management Solution) packages and the like, being used by the various customs authorities, shippers, or other bodies carrying out inspections. An important part of the innovation is a set of software 'connectors' adapted to integrate data from a variety of customs and other relevant software systems, including CRMs, Digital Trade Platforms, and other relevant databases and information sources. In this way, extracting information directly from X-ray machines and similar sources can in many cases be avoided.
[15] All of the aforementioned sources of data are used as input into algorithms adapted to produce useful output therefrom, chief amongst these algorithms being machine learning models. These models may comprise both supervised and unsupervised methods, with labeled instances (where a ground-truth analysis of the shipment is known) being useful as training data for supervised models, while both labelled and unlabeled examples may be of use with unsupervised methods. For instance, a large set of examples of shipping data may be used to produce clusters indicating characteristic shipping instances, useful for techniques such as anomaly detection when a new shipment is found in an unlikely or 'out of distribution' area.
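The 'out of distribution' check mentioned above can be illustrated with a toy example. Here historical shipments are reduced to numeric feature vectors, a cluster centroid is computed, and a new shipment far from the centroid is flagged; real embodiments would use learned embeddings and proper clustering, so the features and threshold below are assumptions:

```python
import math

# Flag a shipment whose feature vector lies far from the centroid of
# historical shipments (a crude stand-in for cluster-based anomaly detection).

def centroid(points):
    n = len(points)
    return [sum(p[i] for p in points) / n for i in range(len(points[0]))]

def is_anomalous(history, shipment, threshold):
    """True if the Euclidean distance from the historical centroid exceeds
    `threshold` (an assumed, tunable cutoff)."""
    return math.dist(centroid(history), shipment) > threshold

past = [[10.0, 1.0], [11.0, 1.2], [9.5, 0.9]]  # e.g. (weight in t, value ratio)
assert not is_anomalous(past, [10.2, 1.0], threshold=3.0)
assert is_anomalous(past, [25.0, 5.0], threshold=3.0)
```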
[16] The invention may, in some embodiments, use a 'human-in-the-loop' methodology for more perspicaciously choosing useful features from the various data sources. The human operator may also take a supervisory role, indicating whether the AI results are correct.
[17] One possible simplified flowchart consistent with implementations of the invention is shown in Fig. 1. Starting at top left, the analysis is initiated and initial container identification is extracted. This identification will generally be of a form useful for extracting data from a CRM or other relevant software. The inventive method uses various connectors to extract all relevant data from the CRM (in this example), including bill of lading information, the manifest, certificate of origin, invoice, logistics documents, X-ray images, and so on. These are all used as input to the system directly, and are also added to a database of historical and context information. A step of determining external and internal intelligence data, if any, follows. Subsequently the contextual and historical data are read from the database filling this purpose. This will generally include the full shipping information (as seen in the CRM) on past shipments from a given supplier, amongst other data. Any missing data may now be searched for on the web; this may for example include images or other data concerning a new type of item being shipped. Data concerning previously-unseen shippers (such as base of operations, vehicles in fleet, etc.), previously-unseen vehicles, previously-unseen types of goods, and so on may all be searched for to provide context. These searches may be automatic, manual, or of a combined nature.
[18] Now that all the necessary information has been gathered, the system feeds the information into a data fusion module, which may consist of a neural network as shown in Fig. 2 but in general consists of one or more algorithms adapted to detect discrepancies between the various sources of data being used as input. Once this step has finished, a manual crosscheck may be used, whereby for instance a human operator agrees or disagrees with the conclusions from the automatic analysis. The operator feedback and automated analysis are combined into a report generally consisting of a set of scores, and this report is inserted into a blockchain. If an anomaly or other discrepancy is found (for example with a statistical certainty exceeding some threshold) then certain predetermined stakeholders may be informed by means of alerts alongside additional decisions or reports.
[19] In preferred embodiments, neural networks adapted for image processing, such as convolutional neural nets, are used for dealing with any incoming NII sensor data (such as X-ray, neutron, or muon images). The use of neural networks for self-supervised and unsupervised image analysis is also disclosed. These neural nets are used either in conjunction with or as part of a larger machine learning system that takes as input the NII sensor data, image data, manifest data, and contextual data, and produces an output in the form of one or more of: a set of scores, measurements, bounding boxes, and classifications. These results may be presented as a report for easy interpretation.
[20] The amalgamation of data from various sources is pivotal, particularly in contexts like our risk assessment framework. Our innovative system employs two distinct fusion techniques:
• Initial Fusion (Feature-based Integration): Here, features are derived from images using neural network methodologies, combined with attributes from historical tabular datasets and refined signal inputs. These combined features form a comprehensive vector. This extended feature vector is subsequently channeled through a neural network to facilitate risk assessment. Alternatively, the full image may be used as input to one part of the network, and other features input to other parts, the two paths eventually joining at some point deeper in the network; this approach allows a single network to determine both relevant image features and relevant elements of the other features.
One possible implementation of such a method is shown in Fig. 2, where one sees that inputs consisting of NII sensor data, manifest data, and context data are all input into a single network. The sensor data will generally be two- or three-dimensional (e.g. 512x512 pixels x 3 color planes), while the manifest and context data may be in the form of one-dimensional vectors, possibly after undergoing some preprocessing designed to produce such vectors from unstructured data.
• Subsequent Fusion (Decision-based Integration): In this approach, each data source undergoes individual processing. Distinct scores emanate from each of these isolated processes. The culminating risk assessment emanates from an aggregation of these standalone scores, which is then introduced into non-linear algorithms to finalize the decision-making process.
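The two fusion strategies can be sketched side by side. Feature extraction is faked with fixed vectors, and the logistic squashing used for the decision stage is one possible choice of non-linear aggregator, not the system's actual one:

```python
import math

def early_fusion(image_features, tabular_features, signal_features):
    """Feature-based integration: concatenate all features into one vector
    that a downstream risk-assessment network would consume."""
    return image_features + tabular_features + signal_features

def late_fusion(scores, weights):
    """Decision-based integration: aggregate per-source risk scores with a
    weighted sum passed through a logistic non-linearity."""
    z = sum(w * s for w, s in zip(weights, scores))
    return 1.0 / (1.0 + math.exp(-z))

vec = early_fusion([0.3, 0.7], [1.0, 0.0, 2.5], [0.9])
risk = late_fusion(scores=[0.2, 0.9, 0.6], weights=[1.0, 2.0, 1.5])
# `vec` is the combined feature vector; `risk` is a value in (0, 1).
```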
RESULTS AND REPORTING
[21] Results of the machine-learning methods of the invention may be output in a variety of forms. As mentioned above, the network output may be in the form of a set of scores, measurements, bounding boxes, and classifications. A pass/fail determination may be found useful, as it allows normal-looking cargo to pass, while flagging only suspicious cargo. To this end a risk decision mechanism may be employed that takes into account all the available data to make a single- or multiple-valued 'risk decision'.
A set of labeled bounding boxes and/or risk indication heatmaps may be returned as output, with the boxes indicating regions in images where items of different sorts have been detected. These locations may for instance then be checked by hand if the type of good indicated by the output label is of interest to verify. A set of confidences may be output, indicating for instance estimated likelihood of carrying contraband, estimated likelihood of misdeclared goods, estimated likelihood of undeclared goods, and so on. These likelihoods and other output may be presented in the form of an easily-readable report, as well as being input into fields of an external CRM system in communication with the inventive system. Further output may include amounts of material, in particular the volume and/or mass associated with each item in the shipment. These amounts may be summed to reach total volumes and masses, which may provide useful checks against the actual mass and volume of the container as measured or known independently.
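The totals cross-check described above can be sketched as follows; the field names and tolerance are illustrative assumptions:

```python
# Sum per-item mass estimates and compare with the independently measured
# container mass, within an assumed tolerance.

def mass_check(items, measured_total_kg, tolerance_kg=50.0):
    """items: list of dicts with 'quantity' and 'unit_mass_kg'.
    Returns (estimated_total, ok), where ok means the estimate agrees with
    the weighbridge measurement to within the tolerance."""
    estimated = sum(it["quantity"] * it["unit_mass_kg"] for it in items)
    return estimated, abs(estimated - measured_total_kg) <= tolerance_kg

manifest_items = [
    {"quantity": 100, "unit_mass_kg": 12.0},  # e.g. declared machine parts
    {"quantity": 40,  "unit_mass_kg": 5.5},
]
est, ok = mass_check(manifest_items, measured_total_kg=1420.0)
# Here the declared and measured masses agree; a large gap would be flagged.
```

The same pattern applies to per-item volumes summed against container volume.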
[22] After the system reports its results, in some embodiments a human crosscheck is employed, where a human being indicates either agreement or disagreement with the system. The system may in certain cases use this crosscheck data for retraining, under the assumption (for example) that the human operator is a reliable source of ground truth.
[23] In some cases, results may be reported immediately, in the form of real-time alerts sent to a customizable list of stakeholders when an anomaly is detected at a configurable sensitivity level. For example, the system may be configured to send the port director a secured push notification to his workstation or cellphone when any shipment has a smuggling-related anomaly detected with more than 90% confidence. Likewise, importers/agents may receive such notifications in the form of an alert or an immutable certificate.
[24] Another provision of the invention is that results of the analysis may be immutably associated with the container/shipment using means such as E-BL (electronic bill-of-lading), E-SEAL (cryptographically signed document), and blockchain technology. In particular, blockchain entries are verifiable and immutable, providing a secure and audited record of shipping information that may prove of use to governments, inspectors, shippers, and other interested parties. This may also prove of use in post-facto audits, such as may occur when a government suspects that a given shipper has split an illegal enterprise across multiple shipments. By maintaining an immutable historical record of all shipment inspections and the corresponding images, the inventive system comes into play as the relevant information will be available indefinitely. For example, an audit can count totals for various goods shipped by various entities long after the fact. Various government bodies may have their own tracking and reporting systems, and it is within provision of the invention that output of the inventive methods be reformulated into formats compatible with these systems, and insofar as possible sent or input directly into such official reporting systems, further automating the customs clearing process. Where possible, the blockchain methods of the invention may be used to provide proof of provenance or other data to such government sites or databases.
[25] The data reported by the system may be called a "Shipment Passport" and comprises the information that is immutably added to a blockchain. It can contain any data required, including customized visualizations of the X-ray image, the operator name, and any of the manifest data, contextual data, and results determined by the inventive system.
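The tamper-evidence property underpinning the Shipment Passport can be sketched with a minimal hash chain: each record's hash covers its predecessor's hash, so altering any past entry invalidates every later block. A real deployment would use an actual blockchain; the field names here are illustrative:

```python
import hashlib
import json

def block_hash(record, prev_hash):
    # Canonical serialization so the same record always hashes identically.
    payload = json.dumps(record, sort_keys=True) + prev_hash
    return hashlib.sha256(payload.encode()).hexdigest()

def append_passport(chain, record):
    prev = chain[-1]["hash"] if chain else "0" * 64
    chain.append({"record": record, "prev": prev,
                  "hash": block_hash(record, prev)})

def verify_chain(chain):
    prev = "0" * 64
    for block in chain:
        if block["prev"] != prev or block["hash"] != block_hash(block["record"], prev):
            return False
        prev = block["hash"]
    return True

chain = []
append_passport(chain, {"container": "MSKU1234567", "risk": 0.1})
append_passport(chain, {"container": "TGHU7654321", "risk": 0.8})
assert verify_chain(chain)
chain[0]["record"]["risk"] = 0.0  # tampering with history is detected:
assert not verify_chain(chain)
```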
[26] Given the insertion of data into one or more blockchains as described above, it is within provision of the invention to interact with smart contracts that (for example) require proof of inspection, proof of customs clearance, proof of origin or destination, proof that an item has been sent, and so on. By interacting with smart contracts in this way, processes requiring various proofs (of clearance/sending/receiving, etc.) may be automated, for example by releasing funds from escrow to pay for shipments once these shipments have been provably sent, cleared, or received.
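The escrow automation described above can be modeled as a toy 'smart contract' that releases funds only once a matching proof-of-inspection hash has been submitted. This is a plain-Python simulation of the idea, not code for any real blockchain virtual machine:

```python
class EscrowContract:
    """Holds funds until a proof (e.g. a proof-of-clearance hash recorded on
    chain) matching the required value is presented."""

    def __init__(self, amount, required_proof_hash):
        self.amount = amount
        self.required = required_proof_hash
        self.released = False

    def submit_proof(self, proof_hash):
        """Release escrowed funds if the submitted proof matches; a second
        submission (or a wrong proof) pays out nothing."""
        if proof_hash == self.required and not self.released:
            self.released = True
            return self.amount  # funds paid out to the shipper
        return 0

contract = EscrowContract(amount=10_000, required_proof_hash="abc123")
assert contract.submit_proof("wrong") == 0
assert contract.submit_proof("abc123") == 10_000
```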
[27] A further useful aspect of having customs and inspection data in a blockchain is that the seal, manifest, and various other physical mechanisms by which shipments are tracked and contents verified can now themselves be verified against the blockchain records of the invention. This may be found useful for various customs procedures, since while any physical seal can be tampered with, any lock opened, and in general any physical means for ensuring provenance manipulated, previous blocks of a blockchain cannot be changed after the fact; these blocks constitute an 'unpickable lock' or untamperable seal, allowing customs officials to compare actual vs. claimed data at any relevant stage of customs clearance, inspection, or other shipping processes.
[28] As an example of the utility of this aspect of the invention, consider the transit of containers through multiple countries and across multiple borders. Customs officials can receive immutable proof of inspection, at each border crossing, in advance of the arrival of the container. This allows Customs officials to pre-clear inspected containers based on immutable evidence, significantly reducing the waiting time and the number of duplicate inspections currently executed.
UNIQUE ASPECTS OF THE INVENTION
[29] While other systems adapted for non-intrusive cargo inspection with the imaging methods described above generally use image data and compare it to the manifest data, the inventive system takes into account a larger 'contextual universe', namely the entire shipment profile: importer identity; origin, en-route, and destination countries; route taken; container vessel loadings and reloadings; information from inside Customs or other government agencies; intelligence tips, including OSINT and other sources; and open-source or publicly available data, including the internet and the deep web. These are used to create singular patterns and profiles for correlation and reconciliation.
[30] These profiles are all used as auxiliary data to the machine learning methods of the invention, which may then (for instance) detect that a certain good for a certain importer is anomalous, or a certain route is anomalous, or a certain combination of good, vessel, and country of origin correlates with problematic shipments. The supervised and unsupervised methods used by embodiments of the invention may make use of all the aforementioned as well as the image and manifest data that are possibly used in the prior art.
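One concrete realization of feeding image, manifest, and contextual data jointly to a machine learning method is feature-vector concatenation. The sketch below stubs out the per-modality feature extractors (a real system would use, e.g., a CNN embedding for the image); all function names and feature choices are illustrative assumptions:

```python
# Sketch: concatenating per-modality feature vectors into one ML input vector.
# The extractors are stand-ins returning fixed-length lists of floats.

def extract_image_features(image) -> list[float]:
    # Stand-in for e.g. a CNN embedding of the X-ray image.
    return [0.12, 0.85, 0.33]

def extract_manifest_features(manifest: dict) -> list[float]:
    # Stand-in for parsed and encoded manifest fields (quantity, declared value, ...).
    return [float(manifest["quantity"]), float(manifest["declared_value"])]

def extract_context_features(context: dict) -> list[float]:
    # Stand-in for encoded route / importer-history / vessel features.
    return [float(context["route_risk"]), float(context["importer_history"])]

def combined_features(image, manifest, context) -> list[float]:
    """Concatenate all modalities into a single flat input vector."""
    return (extract_image_features(image)
            + extract_manifest_features(manifest)
            + extract_context_features(context))

vec = combined_features(
    image=None,
    manifest={"quantity": 1200, "declared_value": 54000.0},
    context={"route_risk": 0.2, "importer_history": 0.9},
)
assert len(vec) == 3 + 2 + 2  # one flat vector for the model's input layer
```

The resulting vector can be passed to any supervised or unsupervised model; an alternative, also contemplated above, is a multimodal network with separate input layers per modality.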
[31] Sensor and/or image data imported from the CRM may come in a variety of formats, and in the spirit of the interoperability and wide-ranging compatibility of the invention, a variety of common image formats as used in industry and elsewhere are supported by software of the invention, such that for instance DICOM, JPEG, TIFF, GIF, PNG, and other common and X-ray-specific 2D image formats, as well as 3D formats including STP, MAX, FBX, OBJ, VRML, STL, and so on may all be ingested by import and export routines of the invention. Furthermore, a wide range of standardized data formats for scanning operations may be supported by the software of the invention, for instance Unified File Format 1.0 and 2.0.
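Ingestion of this many formats is naturally structured as dispatch on file type. A sketch of such a routine follows; the loader functions are hypothetical placeholders (real implementations would delegate to libraries such as pydicom or Pillow), and the extension table is abbreviated:

```python
from pathlib import Path

# Hypothetical loaders, one per family of formats.
def load_dicom(path: Path) -> dict:
    return {"kind": "2d", "format": "DICOM", "path": str(path)}

def load_raster(path: Path) -> dict:
    return {"kind": "2d", "format": path.suffix.lstrip(".").upper(), "path": str(path)}

def load_mesh(path: Path) -> dict:
    return {"kind": "3d", "format": path.suffix.lstrip(".").upper(), "path": str(path)}

LOADERS = {
    ".dcm": load_dicom,
    ".jpg": load_raster, ".jpeg": load_raster, ".tiff": load_raster,
    ".gif": load_raster, ".png": load_raster,
    ".stp": load_mesh, ".fbx": load_mesh, ".obj": load_mesh, ".stl": load_mesh,
}

def ingest(filename: str) -> dict:
    """Dispatch a scan file to the loader for its format, case-insensitively."""
    path = Path(filename)
    loader = LOADERS.get(path.suffix.lower())
    if loader is None:
        raise ValueError(f"unsupported scan format: {path.suffix}")
    return loader(path)

assert ingest("scan001.dcm")["format"] == "DICOM"
assert ingest("hull.stl")["kind"] == "3d"
```

New formats are then supported by registering one more loader, without touching the analysis pipeline downstream.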
[32] In some embodiments, the inventive system may be found of particular use with tunnel or mobile scanners, which can now use the AI of the invention to automate the analysis process. Operators need only input information such as cargo identifiers, and the rest of the analysis can proceed largely automatically, speeding the inspection and increasing throughput.

Claims

1) A method for detecting customs anomalies, comprising:
   a. obtaining NII sensor data of a shipping unit;
   b. obtaining manifest data and contextual data associated with said shipping unit;
   c. combining said sensor data, manifest data, and contextual data to identify discrepancies between these sets of data;
   d. generating an anomaly report indicating the presence of said discrepancies.
2) The method of claim 1, wherein said NII sensor data is selected from the group consisting of: X-ray data; muon data; gamma-ray data; neutron bombardment data; ultrasound data; visible spectrum images; ultraviolet images; and infrared images.
3) The method of claim 1, wherein obtaining manifest data comprises retrieving shipment information from a database, physical or electronic shipping documentation, or a communication link with a logistics system.
4) The method of claim 1, wherein said step of combining said data comprises methods selected from the group comprising: image processing techniques, machine learning algorithms, and deep learning neural networks.
5) The method of claim 1, wherein processing said manifest data comprises parsing, extracting, and organizing relevant information from said manifest data.
6) The method of claim 1, wherein combining said data comprises using said data as input to a multimodal neural network adapted to detect characteristics of the combined data.
7) The method of claim 1, wherein combining said data comprises extracting vectors of features from said image data, said manifest data, and said contextual data, and concatenating said features for use as input to a machine learning algorithm.
8) The method of claim 1, wherein combining said data comprises using said image data as input into an input layer of a convolutional neural net, using said manifest data and contextual data as further inputs to further input layers of the same neural net, said convolutional layers and said further input layers having upstream communication in said neural net.
9) The method of claim 1, wherein said anomaly report comprises a confidence score based on the identified discrepancies between said image data, said manifest data, and said contextual data.
10) The method of claim 1, wherein said discrepancies between said image data, said sensor data, said manifest data, and said contextual data include inconsistencies in the type of items, quantity of items, value of items, or locations of items within said shipping unit.
11) The method of claim 1, further comprising notifying relevant authorities or personnel when anomalies are detected in said shipping unit, or of the general inspection status of said shipping unit.
12) The method of claim 1, further comprising sending real-time alerts to a customizable list of stakeholders when an anomaly of varying sensitivity is detected, or of the general inspection status of said shipping unit.
13) The method of claim 1, further comprising inserting the results of said analysis into an immutable blockchain.
WO2025069036A1 (en): Cargo inspection method. Application PCT/IL2024/050967; priority date 2023-09-28; filing date 2024-09-29; status: Pending.

Applications Claiming Priority (6)

Application Number   Priority Date   Filing Date
US202363541051P      2023-09-28      2023-09-28
US202363541074P      2023-09-28      2023-09-28
US202363541062P      2023-09-28      2023-09-28
US63/541,051         2023-09-28
US63/541,062         2023-09-28
US63/541,074         2023-09-28

Publications (1)

Publication Number    Publication Date
WO2025069036A1 (en)   2025-04-03

Family

ID=95202815

Family Applications (1)

Application Number   Title                     Priority Date   Filing Date   Status
PCT/IL2024/050967    Cargo inspection method   2023-09-28      2024-09-29    Pending   WO2025069036A1 (en)

Country Status (1)

Country   Link
WO (1)    WO2025069036A1 (en)

Citations (3)

* Cited by examiner, † Cited by third party

Publication Number   Priority Date   Publication Date   Assignee                         Title
US20220198381A1 *    2019-04-18      2022-06-23         Nicholas Alan Waters             Security system
US20220277261A1 *    2015-08-11      2022-09-01         Jeff STOLLMAN                    Systems and Methods to Ensure Asset and Supply Chain Integrity
US20220318747A1 *    2021-03-30      2022-10-06         Genpact Luxembourg S.à r.l. II   Method and system for identifying pallet shortages


Similar Documents

Publication       Title
US12174334B2      Distributed analysis X-ray inspection methods and systems
EP3764281B1       Methods of identifying firearms in radiographic images
CN108108744A      For the method and its system of radiation image assistant analysis
WO2025069036A1    Cargo inspection method
CN118822252A      A security supervision system for express delivery channels based on edge-cloud architecture
Männistö et al.   Customs Innovations for Fighting Fraud and Trafficking in Cross-border Parcel Flows
US20230281786A1   Image Inspection Systems and Methods for Integrating Third Party Artificial Intelligence Platforms
EP1706845A4       SYSTEM AND METHOD FOR IMPROVED SAFETY MANAGEMENT OF CARGO HANDLING
Min et al.        Integrating X-ray technologies with intelligent transportation systems for enhancing the international maritime security
CN118917674A      Big data visual digital system for freight management
US20240046635A1   Data and Algorithm Quality Checker for Scanning Systems
EP4371038A1       Image inspection systems and methods for integrating third party artificial intelligence platforms
HK40013183A       X-ray inspection system that integrates manifest data with imaging/detection processing

Legal Events

Code   Title / Description
121    Ep: the epo has been informed by wipo that ep was designated in this application
       Ref document number: 24871200
       Country of ref document: EP
       Kind code of ref document: A1
DPE1   Request for preliminary examination filed after expiration of 19th month from priority date (pct application filed from 20040101)
