US11244155B2 - Information processing system - Google Patents

Information processing system

Info

Publication number
US11244155B2
Authority
US
United States
Prior art keywords
blocks
paragraph
document
work
information
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
US16/878,688
Other versions
US20210110155A1 (en)
Inventor
Jun Ando
Shinya Nakamura
Tadao Michimura
Norio Yamamoto
Naoyuki Enomoto
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Fujifilm Business Innovation Corp
Original Assignee
Fujifilm Business Innovation Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Fujifilm Business Innovation Corp
Assigned to FUJI XEROX CO., LTD. Assignment of assignors interest (see document for details). Assignors: ANDO, JUN; ENOMOTO, NAOYUKI; MICHIMURA, TADAO; NAKAMURA, SHINYA; YAMAMOTO, NORIO
Publication of US20210110155A1
Assigned to FUJIFILM BUSINESS INNOVATION CORP. Change of name (see document for details). Assignor: FUJI XEROX CO., LTD.
Application granted
Publication of US11244155B2
Legal status: Active (current)
Anticipated expiration

Abstract

An information processing system includes a processor configured to analyze an obtained document, the processor being configured to: obtain plural documents where information to be shared by plural participants is recorded; calculate a degree of similarity in details in units of blocks between different documents, the blocks being blocks of sentences included in the documents; determine an execution order of executing work of sharing details of each of the blocks on the basis of the degree of similarity; and allocate work time for performing the work of sharing details of each of the blocks on the basis of a number of characters in each of the blocks.

Description

CROSS-REFERENCE TO RELATED APPLICATIONS
This application is based on and claims priority under 35 USC 119 from Japanese Patent Application No. 2019-186971 filed Oct. 10, 2019.
BACKGROUND
(i) Technical Field
The present disclosure relates to an information processing system.
(ii) Related Art
Discussions in group work have been analyzed to provide information to participants of the work, thereby supporting the progress of the work.
International Publication No. 2017/141338 discloses the following: among a plurality of groups sharing information on a group by group basis, the degree of similarity between information shared in a first group and information shared in a second group is calculated, and, in the case where the calculated degree of similarity satisfies a certain reference, information for promoting discussions is provided to at least one of the first group and the second group on the basis of information shared in that group. An example of information for promoting discussions includes antonyms of representative words that characterize information shared in each group.
In group work, each participant is required to share information related to the work. However, in the case where the work-related information held by each participant varies from one participant to another, it is not easy to share information efficiently within a predetermined time period.
SUMMARY
Aspects of non-limiting embodiments of the present disclosure relate to providing an information processing system for efficiently advancing the work of sharing details that each participant of group work learned individually in advance.
Aspects of certain non-limiting embodiments of the present disclosure address the above advantages and/or other advantages not described above. However, aspects of the non-limiting embodiments are not required to address the advantages described above, and aspects of the non-limiting embodiments of the present disclosure may not address advantages described above.
According to an aspect of the present disclosure, there is provided an information processing apparatus including a processor configured to analyze an obtained document, the processor being configured to: obtain a plurality of documents where information to be shared by a plurality of participants is recorded; calculate a degree of similarity in details in units of blocks between different documents, the blocks being blocks of sentences included in the documents; determine an execution order of executing work of sharing details of each of the blocks on the basis of the degree of similarity; and allocate work time for performing the work of sharing details of each of the blocks on the basis of a number of characters in each of the blocks.
BRIEF DESCRIPTION OF THE DRAWINGS
An exemplary embodiment of the present disclosure will be described in detail based on the following figures, wherein:
FIG. 1 is a diagram illustrating the overall configuration of an information processing system to which an exemplary embodiment is applied;
FIG. 2 is a diagram illustrating an exemplary hardware configuration of a shared information management server and a progress control server;
FIG. 3 is a diagram illustrating the functional configuration of the shared information management server and the progress control server;
FIG. 4 is a diagram illustrating an example of the degree of similarity in units of paragraphs, calculated by a degree-of-similarity analyzing unit of the shared information management server;
FIG. 5 is a diagram illustrating an example of an analysis result presenting screen generated by an analysis-result-presenting-screen generating unit of the shared information management server;
FIG. 6 is a diagram illustrating an example in which the target of the sharing work is determined by a facilitation determining unit of the progress control server;
FIG. 7 is a diagram illustrating an example of determination information for a related paragraph of the target of the sharing work;
FIG. 8 is a diagram illustrating an example of a facilitation information presenting screen generated by a facilitation-information-presenting-screen generating unit;
FIG. 9 is a diagram illustrating an example of a related paragraph displaying screen; and
FIG. 10 is a diagram illustrating an example of the configuration of a description form where to-be-shared information is described.
DETAILED DESCRIPTION
Hereinafter, an exemplary embodiment of the present disclosure will be described in detail with reference to the accompanying drawings.
System Configuration
FIG. 1 is a diagram illustrating the overall configuration of an information processing system to which an exemplary embodiment is applied. The information processing system according to the present exemplary embodiment includes a shared information management server 100, a progress control server 200, an input apparatus 300, and an output apparatus 400.
The shared information management server 100 is a server that manages information to be shared by a plurality of participants participating in group work. An example of information to be shared, which is the target of management, includes details that each participant learned in advance about the group work. By sharing details that each participant learned in advance with other participants, each participant may implement the group work with common understanding.
The progress control server 200 is a server that controls and supports the progress of the work of sharing to-be-shared information, which is the target of management performed by the shared information management server 100, by participants (the work implemented for sharing information). Various types of work are conceivable as the work of sharing information, such as presentation within the group, distribution of materials, distribution of materials to a terminal of each participant of the group, and displaying of materials on a display device viewable by each participant of the group. In the present exemplary embodiment, an example of presenting details that each participant learned in advance will be described as the sharing work.
The input apparatus 300 is an apparatus for the participants of the group work to input information to be shared in the group. In the case where to-be-shared information is described in a particular description form, an image processing apparatus may be used as the input apparatus 300. The image processing apparatus includes a so-called scanner apparatus, which optically reads an image on a set document to generate a read image (image data). Exemplary image reading methods include the following: the charge-coupled device (CCD) method of reducing, with the use of a lens, the size of light that is emitted from a light source onto a document and that is reflected from the document, and receiving the size-reduced light with the use of CCDs; and the contact image sensor (CIS) method of receiving, with the use of a CIS, light beams that are sequentially emitted from a light-emitting diode (LED) light source onto a document and that are reflected from the document. An image processing apparatus has a communication function of connecting to the shared information management server 100 via a network and transmitting image data of a read description form. In addition, in the case where to-be-shared information is described in a document converted into electronic data (electronic document), an information processing apparatus that transmits the electronic document to the shared information management server 100 may be used as the input apparatus 300. Hereinafter, the case in which an image processing apparatus is used as the input apparatus 300 will be described. Therefore, to-be-shared information is sent as image data, which is obtained by reading an image by the input apparatus 300, to the shared information management server 100.
The output apparatus 400 is an apparatus that outputs presentation information presented for supporting information sharing by the shared information management server 100 and the progress control server 200. As the output apparatus 400, for example, an information processing apparatus including a display device that displays various screens and an input device that receives user operations, such as a personal computer or a smartphone, is used. Outputting of presentation information is performed by, for example, displaying a screen including presentation information on the display device of the output apparatus 400.
Hardware Configuration of Shared Information Management Server 100 and Progress Control Server 200
FIG. 2 is a diagram illustrating an exemplary hardware configuration of the shared information management server 100 and the progress control server 200. The shared information management server 100 and the progress control server 200 are realized by a computer, which includes a central processing unit (CPU) 101, which is an arithmetic processing unit, and random-access memory (RAM) 102, read-only memory (ROM) 103, and a storage device 104, which are storage units. The RAM 102 is a main storage device (main memory), and is used as work memory when the CPU 101 performs arithmetic processing. The ROM 103 stores programs and data such as prepared setting values. The CPU 101 may directly load programs and data from the ROM 103 and execute processing. The storage device 104 is a unit for saving programs and data. The storage device 104 stores a program, and the CPU 101 loads the program stored in the storage device 104 to the main storage device and executes the program. In addition, the storage device 104 stores and saves the result of processing performed by the CPU 101. As the storage device 104, for example, a magnetic disk device or a solid state drive (SSD) is used.
In the case where the shared information management server 100 and the progress control server 200 are realized by the above-described computer, for example, the individual functions of these servers, which will be described hereinafter, are realized by executing the program by the CPU 101. The shared information management server 100 and the progress control server 200 are realized as, for example, servers configured on a network. Note that these servers are not limited to configurations with a single piece of hardware (such as a server machine), and may be configured in a distributed manner as a plurality of pieces of hardware or virtual machines. In addition, the functions of the shared information management server 100 and the functions of the progress control server 200 may be realized by one server.
Functional Configuration of Shared Information Management Server 100 and Progress Control Server 200
FIG. 3 is a diagram illustrating the functional configuration of the shared information management server 100 and the progress control server 200. The shared information management server 100 includes an image receiving unit 110, an image processor 120, a document vector analyzing unit 130, a degree-of-similarity analyzing unit 140, an analysis-result-presenting-screen generating unit 150, and an analysis-result display controller 160. The progress control server 200 includes an information-sharing-time setting unit 210, a facilitation determining unit 220, a facilitation-information-presenting-screen generating unit 230, and a facilitation-information display controller 240.
Functions of Shared Information Management Server 100
The image receiving unit 110 receives, from the input apparatus 300, image data of a description form where to-be-shared information is described (hereinafter referred to as a document image). To-be-shared information is submitted by each participant of the group work. Therefore, the image receiving unit 110 obtains a plurality of documents where to-be-shared information is recorded.
FIG. 10 is a diagram illustrating an example of the configuration of a description form where to-be-shared information is described. By specifying the configuration of a description form 500 to some extent, the load of processing image data of the description form 500 obtained by the image receiving unit 110 is reduced. In the example illustrated in FIG. 10, the description form 500 includes a document field 510 and an information code 520.
The document field 510 is provided in a specified area of the space of the description form 500. A document indicating to-be-shared information is described in the document field 510. Although it is unnecessary to specify the style and the format of the document field 510, a document may be explicitly described in blocks such as paragraphs. The following description assumes that a document in the document field 510 is explicitly described in paragraphs.
The information code 520 is a code for recording information regarding the document and information regarding the author of the document. Examples of information regarding the document include the identification information (document ID) of the document, the document creation date and time, the document submission date and time, and the identification information of group work in which the details of the document are used as to-be-shared information. Examples of information regarding the author include the identification information (user ID) of a user who is the author of the document as well as a participant of the group work, and the identification information (group ID) of a group to which the user belongs. In the case where the group work is done in school class or the like, information regarding the author may include the student number of a student who is the user. These items of information may be recorded in the information code 520 itself, or link information for a server where these items of information are managed may be recorded as the information code 520. For example, a two-dimensional code or the like is used as the information code 520.
Referring back to FIG. 3, the image processor 120 processes a document image obtained by the image receiving unit 110. Specifically, the image processor 120 reads the information code 520 recorded in the document image. On the basis of information read from the information code 520, for example, the document image is sorted by group which implements group work in which the details of the document are used as to-be-shared information. In addition, the image processor 120 applies optical character recognition (OCR) to the document image to convert the document described in the document image to electronic data. The following processing is performed for each of groups to which documents are sorted.
The document vector analyzing unit 130 performs a document vector analysis of the document (electronic document) obtained by electronic data conversion performed by the image processor 120. Specifically, the document vector analyzing unit 130 divides the electronic document into paragraphs, and calculates a document vector for the description of each paragraph.
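As a rough illustration of this step, the sketch below splits each submitted document into paragraphs and computes one vector per paragraph. The patent does not prescribe a particular vectorization method, so TF-IDF (via scikit-learn) is used here purely as an assumed, illustrative choice; the function and variable names are likewise hypothetical.

```python
# Minimal sketch of the paragraph-vector step (TF-IDF is an assumption,
# not a method mandated by the patent).
from sklearn.feature_extraction.text import TfidfVectorizer

def split_into_paragraphs(document_text):
    # Assumes paragraphs are separated by blank lines in the OCR output.
    return [p.strip() for p in document_text.split("\n\n") if p.strip()]

def paragraph_vectors(documents):
    """documents: dict mapping a document ID (e.g. "A") to its full text.
    Returns (vectorizer, {(doc_id, paragraph_no): sparse vector})."""
    paragraphs = {}
    for doc_id, text in documents.items():
        for i, para in enumerate(split_into_paragraphs(text), start=1):
            paragraphs[(doc_id, i)] = para
    vectorizer = TfidfVectorizer()
    matrix = vectorizer.fit_transform(paragraphs.values())
    keys = list(paragraphs.keys())
    return vectorizer, {key: matrix[i] for i, key in enumerate(keys)}
```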
The degree-of-similarity analyzing unit 140 calculates the degree of similarity in described details in units of paragraphs of the electronic document on the basis of the document vector of each paragraph, which is calculated by the document vector analyzing unit 130. Specifically, the degree-of-similarity analyzing unit 140 calculates the degree of similarity of each paragraph in a document with all the paragraphs of all the other documents. This is performed for all the documents. The degree of similarity between paragraphs in the same document is not calculated. In other words, the degree-of-similarity analyzing unit 140 calculates the degree of similarity in details in units of paragraphs between different documents, in which the paragraphs are one example of blocks of sentences included in documents.
In addition, the degree-of-similarity analyzing unit 140 identifies a combination of paragraphs with the highest degree of similarity between documents, on the basis of the calculated degree of similarity between paragraphs of the individual documents. Specifically, at first, attention is paid to a paragraph of interest in a document of interest, and, among paragraphs of the other documents, a paragraph with the highest degree of similarity with the paragraph of interest is identified in units of documents. This is performed for each paragraph of the document of interest, and further for each document. Accordingly, in units of paragraphs in each document, a paragraph with the highest degree of similarity in each document different from the document including the paragraph is identified. The shared information management server 100 and the progress control server 200 regard a combination of paragraphs with the highest degree of similarity between documents as paragraphs describing the same theme, and treat these paragraphs as related paragraphs. For a paragraph of interest in a document of interest, there is a related paragraph in each document different from the document of interest including the paragraph of interest.
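The following sketch illustrates one way to compute the cross-document similarities and pick, for each paragraph, the most similar paragraph in every other document (the related paragraphs). Cosine similarity is an assumption; the patent only speaks of a degree of similarity between document vectors.

```python
# Sketch of the cross-document similarity step, using the paragraph vectors
# produced above; cosine similarity is an illustrative choice.
from sklearn.metrics.pairwise import cosine_similarity

def related_paragraphs(vectors):
    """vectors: {(doc_id, para_no): sparse vector}.
    Returns {(doc_id, para_no): {other_doc_id: (para_no, similarity)}}."""
    related = {}
    for key_a, vec_a in vectors.items():
        doc_a, _ = key_a
        best = {}
        for key_b, vec_b in vectors.items():
            doc_b, para_b = key_b
            if doc_b == doc_a:
                continue  # similarity within the same document is not calculated
            sim = float(cosine_similarity(vec_a, vec_b)[0, 0])
            if doc_b not in best or sim > best[doc_b][1]:
                best[doc_b] = (para_b, sim)  # keep the most similar paragraph per document
        related[key_a] = best
    return related
```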
The analysis-result-presenting-screen generating unit 150 generates an analysis result presenting screen that presents the result of analyzing a document. The analysis-result-presenting-screen generating unit 150 generates, as an analysis result presenting screen, a graph indicating related paragraphs of each paragraph in each document on the basis of the relationship between paragraphs with the highest degree of similarity between documents, which is obtained as above. Specifically, the analysis-result-presenting-screen generating unit 150 generates a graph in which each paragraph in each document is set as a vertex (node), and a side (edge) is formed between vertices on the basis of the relation based on the degree of similarity between paragraphs. Each side of the graph may be weighted according to the degree of similarity between corresponding paragraphs. The weight added to each side may be represented by, for example, the thickness or length of the side. In addition, the value of the degree of similarity may be displayed at each side. In addition, the vertices of the graph may be rendered in a size corresponding to the size of document vectors calculated in analysis conducted by the document vector analyzing unit 130.
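A minimal sketch of such a graph is given below, using networkx as one possible representation (the patent only requires vertices per paragraph and similarity-weighted edges between related paragraphs). The vertex naming scheme ("DA1" for paragraph 1 of document A) follows the example in FIG. 5 and is otherwise an assumption.

```python
# Sketch of building the analysis-result graph from the related-paragraph map.
import networkx as nx

def build_relation_graph(related):
    """related: output of related_paragraphs(); edges point from a paragraph
    to its most similar paragraph in each other document."""
    graph = nx.DiGraph()
    for (doc_id, para_no), best in related.items():
        src = f"D{doc_id}{para_no}"               # e.g. "DA1" for document A, paragraph 1
        graph.add_node(src)
        for other_doc, (other_para, sim) in best.items():
            dst = f"D{other_doc}{other_para}"
            graph.add_edge(src, dst, weight=sim)  # weight may drive edge thickness or length
    return graph
```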
The analysis-result display controller 160 transmits an analysis result presenting screen generated by the analysis-result-presenting-screen generating unit 150 to the output apparatus 400, and displays the analysis result presenting screen. The analysis result presenting screen displayed on the output apparatus 400 is used also as a user interface (UI) screen for receiving an operation performed by a user who is a participant of the group work. The user may operate an input device such as a mouse or a touchscreen to update the graph on the analysis result presenting screen displayed on the output apparatus 400. For example, each side formed between vertices of the graph may be replaced or vertices may be moved. When each side of the graph is replaced, the analysis-result display controller 160 notifies the degree-of-similarity analyzing unit 140 of the operation details. The degree-of-similarity analyzing unit 140 updates information on the relation between paragraphs in accordance with the received operation details.
Functions of Progress Control Server 200
The information-sharing-time setting unit 210 of the progress control server 200 sets work time for performing the work of sharing information by participants of the group work. This setting is performed by, for example, the organizer of the group work. In the case where the group work is done in school class or the like, a teacher may set the work time beforehand. In addition, it is only necessary for the setting of the work time to be done before the operation of the later-described facilitation determining unit 220 starts.
The facilitation determining unit 220 determines the target of the sharing work, among combinations of related paragraphs identified by the shared information management server 100, and sets the time and the order of performing the sharing work for the combinations of related paragraphs determined as the target of the sharing work. At first, the facilitation determining unit 220 determines related paragraphs that serve as the target of the sharing work. Specifically, attention is paid to a paragraph of interest in a document of interest, and, among related paragraphs of the paragraph of interest, a related paragraph with the lowest degree of similarity is determined as the target of the sharing work regarding the paragraph of interest. This is performed for each paragraph of the document of interest, and further for each document. Accordingly, in units of paragraphs in each document, a related paragraph that serves as the target of the sharing work is determined. Here, a related paragraph with the lowest degree of similarity is determined as the target of the sharing work because a related paragraph with a lower degree of similarity with a paragraph of interest is more likely to contain details that are not described in the paragraph of interest, and it is thus worth doing the sharing work. If a related paragraph has an extremely low degree of similarity, it is more likely that the related paragraph contains no description on the common theme. To this end, a threshold may be set, and, among related paragraphs whose degrees of similarity are higher than the threshold, a related paragraph with the lowest degree of similarity may be determined as the target of the sharing work. Furthermore, in the case where all the related paragraphs of a certain paragraph have very high degrees of similarity and their descriptions contain substantially the same details, it is conceivable that there is little original information in the related paragraphs, and it is thus less worth doing the sharing work. To this end, another threshold different from the above threshold may be set, and, in the case where all the related paragraphs have degrees of similarity that are higher than the threshold, these related paragraphs may be excluded from being the target of the sharing work. Note that the specific method of doing the sharing work is not particularly limited. Mainly, the specific method may be presentation of details described in a related paragraph. The following methods are individually selectable and implementable according to the details of the group work or the attribute of the group: distribution of a document, distribution to the participants' terminals, and displaying on a display device viewable by the participants at the same time.
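The selection rule described in the preceding paragraph can be sketched as follows. The two threshold values are illustrative only; the patent does not fix concrete numbers, and the function name is hypothetical.

```python
# Sketch of choosing, per paragraph of interest, the related paragraph with the
# lowest degree of similarity, subject to the two exclusion rules described above.
THRESHOLD_LOW = 0.3    # assumed: below this, the paragraph likely covers a different theme
THRESHOLD_HIGH = 0.95  # assumed: above this for all related paragraphs, little original information

def sharing_targets(related):
    """related: {(doc_id, para_no): {other_doc: (para_no, similarity)}}.
    Returns a list of ((doc_id, para_no), similarity) chosen for the sharing work."""
    targets = []
    for _, best in related.items():
        sims = [(doc, para, sim) for doc, (para, sim) in best.items()]
        if all(sim > THRESHOLD_HIGH for _, _, sim in sims):
            continue  # all related paragraphs say essentially the same thing
        candidates = [(doc, para, sim) for doc, para, sim in sims if sim > THRESHOLD_LOW]
        if not candidates:
            continue  # nothing appears to cover the common theme
        doc, para, sim = min(candidates, key=lambda t: t[2])
        targets.append(((doc, para), sim))
    return targets
```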
In addition, the facilitation determining unit 220 allocates, for a related paragraph that serves as the target of the sharing work, work time determined according to the number of characters in the related paragraph. Specifically, for example, in the case where a presentation is performed as the sharing work, a presentation time of one minute per 200 characters may be allocated. The allocated work time may serve as the upper limit. That is, the work may be completed in a time shorter than the allocated time, and, in that case, the work of sharing the next related paragraph may be performed ahead of schedule.
In addition, the facilitation determining unit 220 determines the work order of performing the sharing work for related paragraphs determined as the target of the sharing work. This order is defined as the ascending order of degree of similarity. The facilitation determining unit 220 compares the cumulative time in the case where the sharing work is performed in the work order, with the work time allocated to each related paragraph, against the work time of the entire sharing work set by the information-sharing-time setting unit 210, and determines related paragraphs for which the sharing work is to be actually performed so that the cumulative time will fall within the entire work time.
The facilitation-information-presenting-screen generating unit 230 generates a facilitation information presenting screen for presenting the above-described facilitation information, which is the result of determination performed by the facilitation determining unit 220. The facilitation information includes the identification information of paragraphs serving as the target of the work, the work order, and the work time for each paragraph. The facilitation-information-presenting-screen generating unit 230 generates, for example, a list of identification information and work time of paragraphs, which are arranged in the work order, as a facilitation information presenting screen.
The facilitation-information display controller 240 transmits a facilitation information presenting screen generated by the facilitation-information-presenting-screen generating unit 230 to the output apparatus 400, and displays the facilitation information presenting screen. In the case where a facilitation information presenting screen is generated as the above-mentioned list, the facilitation-information display controller 240 may display items of related paragraphs in a range in which the cumulative time falls within the entire work time so as to be distinguishable from items of the other related paragraphs. In addition, the facilitation-information display controller 240 may display items of related paragraphs for which the sharing work has been completed so as to be distinguishable from items of the other related paragraphs.
Example of Degree-of-Similarity Analysis
FIG. 4 is a diagram illustrating an example of the degree of similarity, in units of paragraphs, calculated by the degree-of-similarity analyzing unit 140 of the shared information management server 100. In the example illustrated in FIG. 4, the degree of similarity of each paragraph in each document is calculated for five documents including document A to document E. The documents A to E are each composed of three paragraphs. In FIG. 4, attention is paid to paragraphs 1 to 3 of the document A. When attention is paid to the degree of similarity between the paragraph 1 and each paragraph of the document B, the degree of similarity with the paragraph 3 of the document B is 0.98, which is the highest. When attention is paid to the degree of similarity between the paragraph 1 and each paragraph of the document C, the degree of similarity with the paragraph 2 of the document C is 0.99, which is the highest. When attention is paid to the degree of similarity between the paragraph 1 and each paragraph of the document D, the degree of similarity with the paragraph 2 of the document D is 0.88, which is the highest. When attention is paid to the degree of similarity between the paragraph 1 and each paragraph of the document E, the degree of similarity with the paragraph 1 of the document E is 0.63, which is the highest. Therefore, the related paragraphs of the paragraph 1 of the document A are the paragraph 3 of the document B, the paragraph 2 of the document C, the paragraph 2 of the document D, and the paragraph 1 of the document E. Similarly, the related paragraphs of the paragraph 2 of the document A are the paragraph 2 of the document B, the paragraph 1 of the document C, the paragraph 2 of the document D, and the paragraph 2 of the document E. In addition, the related paragraphs of the paragraph 3 of the document A are the paragraph 3 of the document B, the paragraph 3 of the document C, the paragraph 3 of the document D, and the paragraph 2 of the document E.
Furthermore, when attention is paid to the paragraphs 1 to 3 of the document B, the related paragraphs of the paragraph 1 of the document B are the paragraph 1 of the document A, the paragraph 2 of the document C, the paragraph 1 of the document D, and the paragraph 3 of the document E. In addition, the related paragraphs of the paragraph 2 of the document B are the paragraph 1 of the document A, the paragraph 1 of the document C, the paragraph 3 of the document D, and the paragraph 1 of the document E. In addition, the related paragraphs of the paragraph 3 of the document B are the paragraph 3 of the document A, the paragraph 3 of the document C, the paragraph 1 of the document D, and the paragraph 1 of the document E. In this manner, related paragraphs are obtained for each of the paragraphs 1 to 3 of the documents C to E.
Here, the relation between paragraphs is not necessarily symmetric. That is, a combination of a paragraph in a first document and its most similar paragraph in a second document need not match a combination of that paragraph in the second document and its most similar paragraph in the first document. For example, looking at the document B from the document A, a paragraph of the document B that is the most similar to the paragraph 1 of the document A is the paragraph 3. However, a paragraph of the document A that is the most similar to the paragraph 3 of the document B is the paragraph 3, not the paragraph 1. In addition, a paragraph of the document B that is the most similar to the paragraph 2 of the document A is the paragraph 2. However, a paragraph of the document A that is the most similar to the paragraph 2 of the document B is the paragraph 1, not the paragraph 2. In contrast, looking at the document A from the document B, a paragraph of the document A that is the most similar to the paragraph 1 of the document B is the paragraph 1. In addition, a paragraph of the document A that is the most similar to the paragraph 2 of the document B is also the paragraph 1. However, a paragraph of the document B that is the most similar to the paragraph 1 of the document A is the paragraph 3, as has been described above, not the paragraph 1 or the paragraph 2.
Example of Analysis Result Presenting Screen
FIG. 5 is a diagram illustrating an example of an analysis result presenting screen generated by the analysis-result-presenting-screen generating unit 150 of the shared information management server 100. An analysis result presenting screen 410 is displayed on the output apparatus 400. The vertices of a graph on the analysis result presenting screen 410 illustrated in FIG. 5 correspond to the documents and paragraphs illustrated in FIG. 4. For example, "DA" of the vertex "DA1" indicates the document A, and "1" indicates the paragraph 1. In addition, the vertices are arranged in groups of documents in the example illustrated in FIG. 5. Note that, in the example illustrated in FIG. 5, each side formed between vertices is merely an example, and does not reflect similarity between paragraphs illustrated in FIG. 4. For example, the vertex "DA1" is connected to the vertex "DB3", the vertex "DC3", the vertex "DD2", and the vertex "DE3", which are different from the related paragraphs (the paragraph 3 of the document B, the paragraph 2 of the document C, the paragraph 2 of the document D, and the paragraph 1 of the document E) of the paragraph 1 of the document A illustrated with reference to FIG. 4.
Because FIG. 5 merely illustrates an example of display of the analysis result presenting screen 410, for the sake of simplicity, the graph is an undirected graph where each side has no direction. Actually, however, as described above, the relation between paragraphs is not symmetric. Thus, the graph on the analysis result presenting screen 410 is created as, for example, a directed graph. In this case, each side is described as an arrow directed from one vertex to another vertex in accordance with the relation between paragraphs. In addition, in the example illustrated in FIG. 5, in the case where the user selects a vertex on the analysis result presenting screen displayed on the output apparatus 400 by performing an operation such as clicking, the display screen of the output apparatus 400 is switched to display a paragraph corresponding to the selected vertex.
Example of Determining Target of Sharing Work
FIG. 6 is a diagram illustrating an example in which the target of the sharing work is determined by the facilitation determining unit 220 of the progress control server 200. FIG. 6 illustrates the degree of similarity of each of the related paragraphs with each of the paragraphs 1 to 3 of the documents A to E. For a paragraph of interest, there is only one related paragraph in each document. In FIG. 6, paragraphs are omitted in a document including a related paragraph serving as the target of comparison. The degree of similarity between each of the paragraphs 1 to 3 in each of the documents A to E and each related paragraph, illustrated in FIG. 6, is based on the example illustrated in FIG. 4.
As described above, the facilitation determining unit 220 determines, among related paragraphs, a related paragraph with the lowest degree of similarity as the target of the sharing work. In FIG. 6, when attention is paid to the paragraph 1 of the document A, the degree of similarity with (the paragraph 3 of) the document B is 0.99; the degree of similarity with (the paragraph 2 of) the document C is 0.99; the degree of similarity with (the paragraph 2 of) the document D is 0.88; and the degree of similarity with (the paragraph 1 of) the document E is 0.63. Therefore, because 0.63 is the lowest degree of similarity, the paragraph 1 of the document E serves as the target of the sharing work. Similarly, a related paragraph that serves as the target of the sharing work is determined for each of the paragraph 2 and the paragraph 3 of the document A, the paragraphs 1 to 3 of the document B, the paragraphs 1 to 3 of the document C, the paragraphs 1 to 3 of the document D, and the paragraphs 1 to 3 of the document E.
Determination Information for Related Paragraph Serving as Target of Sharing Work
FIG. 7 is a diagram illustrating an example of determination information for a related paragraph serving as the target of the sharing work. In the example illustrated in FIG. 7, the following items of information are recorded for each related paragraph: the identification information of "document" and "paragraph", "degree of similarity", "number of characters", "allocated time", "work time", and "cumulative time" of the paragraph. "Document" is the identification information of a document including the related paragraph, and "paragraph" is the identification information of the related paragraph. Here, information for identification to be performed by the facilitation determining unit 220 is recorded. "Degree of similarity" is the degree of similarity of the related paragraph. "Number of characters" is the number of characters described in the related paragraph. "Allocated time" is time allocated by the facilitation determining unit 220 for the work of sharing the related paragraph. This "allocated time" is a value calculated on the basis of the number of characters in the related paragraph. Here, the allocated time is calculated on the basis of 60 seconds per 200 characters. "Work time" is time allocated for the related paragraph in order to actually implement the sharing work within the preset work time of the entire sharing work. Because the entire work time is often set in minutes or the like, here, a fraction of the time indicated in "allocated time" is rounded up to a multiple of ten. "Cumulative time" is the accumulated "work time" in order from the top related paragraph. In addition, related paragraphs are arranged from the top in ascending order of degree of similarity. In the case where the work time of the entire sharing work is set to 15 minutes, with reference to "cumulative time" in FIG. 7, if the work of sharing the tenth related paragraph (document E, paragraph 1) is performed, the time exceeds the set time of 900 seconds (15 minutes). Thus, the sharing work is performed up to the ninth related paragraph (document A, paragraph 1).
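A sketch of the scheduling computation behind FIG. 7 is given below: work time is derived from the character count at 60 seconds per 200 characters, rounded up to a multiple of ten seconds, and related paragraphs are accumulated in ascending order of similarity until the preset total (900 seconds in the example) would be exceeded. Function and parameter names are hypothetical.

```python
# Sketch of allocating work time and cutting the schedule at the preset total.
import math

def schedule_sharing_work(targets, char_counts, total_seconds=900):
    """targets: [((doc, para), similarity)] from sharing_targets();
    char_counts: {(doc, para): number of characters}.
    Returns the rows that fit within total_seconds, in ascending order of similarity."""
    rows, cumulative = [], 0
    for (doc, para), sim in sorted(targets, key=lambda t: t[1]):
        allocated = char_counts[(doc, para)] * 60 / 200       # seconds, 60 s per 200 characters
        work_time = int(math.ceil(allocated / 10.0)) * 10     # round up to a multiple of 10 s
        if cumulative + work_time > total_seconds:
            break  # performing this paragraph would exceed the entire work time
        cumulative += work_time
        rows.append({"document": doc, "paragraph": para, "similarity": sim,
                     "work_time": work_time, "cumulative_time": cumulative})
    return rows
```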
Example of Facilitation Information Presenting Screen
FIG. 8 is a diagram illustrating an example of a facilitation information presenting screen generated by the facilitation-information-presenting-screen generating unit 230. A facilitation information presenting screen 420 is displayed on the output apparatus 400. The facilitation information presenting screen 420 illustrated in FIG. 8 indicates, as the target of the sharing work, the related paragraphs up to the ninth related paragraph of the determination information illustrated in FIG. 7, arranged in the work order. On the facilitation information presenting screen 420 illustrated in FIG. 8, the following items of information are displayed for the related paragraphs serving as the target of the sharing work: "order of priority", the identification information of "document", the identification information of "paragraph", and "work time". "Order of priority" is the order of implementing the sharing work, which is determined by the facilitation determining unit 220 on the basis of the degree of similarity of each related paragraph. The identification information of "document" is the identification information of a document including the related paragraph. Unlike the identification information illustrated in FIG. 7, this is information for a user who looks at the facilitation information presenting screen 420 to identify the document. For example, the document file name or the identification information of a user who is the author of the document is used. Like the identification information of "document", the identification information of "paragraph" is information for a user who looks at the facilitation information presenting screen 420 to identify the paragraph. The work time illustrated in FIG. 7 is recorded in "work time". On the facilitation information presenting screen 420 illustrated in FIG. 8, if the user selects the row of a related paragraph that serves as the target of the work, the display of the output apparatus 400 is switched to display the selected related paragraph.
Example of Related Paragraph Displaying Screen
FIG. 9 is a diagram illustrating an example of a related paragraph displaying screen. A related paragraph displaying screen 430 displayed on the output apparatus 400 is provided with a target information field 431, a related paragraph displaying area 432, and button objects 433 and 434. The target information field 431 indicates information for identifying a displayed related paragraph. In the example illustrated in FIG. 9, information described in the item "document" and the item "paragraph" on the facilitation information presenting screen illustrated in FIG. 8 is displayed to identify the document and the paragraph. That is, it is illustrated that the paragraph 1 of the document FX28816 is displayed on the related paragraph displaying screen 430 illustrated in FIG. 9. Sentences described in the related paragraph are displayed in the related paragraph displaying area 432.
The button object 433 indicated as "completed" is an object for confirming that the work of sharing the related paragraph displayed on the related paragraph displaying screen 430 has been completed. In response to an operation (clicking the mouse, for example) on the button object 433, a command for notifying that the sharing work has been completed is output to the progress control server 200. On receipt of the command, the progress control server 200 updates the status of the related paragraph displayed on the related paragraph displaying screen 430 to details indicating that the sharing work has been completed, and returns the display of the output apparatus 400 to the facilitation information presenting screen 420. At this time, on the facilitation information presenting screen 420, the row of the related paragraph for which the sharing work has been completed is displayed distinguishably from the other related paragraphs by changing the display color of the row, for example.
The button object 434 indicated as "return" is an object for returning to the facilitation information presenting screen 420 without completing the work of sharing the related paragraph displayed on the related paragraph displaying screen 430. In response to an operation (clicking the mouse, for example) on the button object 434, a command for ending the display without completing the sharing work is output to the progress control server 200. On receipt of the command, the progress control server 200 returns the display of the output apparatus 400 to the facilitation information presenting screen 420 without updating the status of the related paragraph displayed on the related paragraph displaying screen 430. At this time, the work of sharing the related paragraph displayed on the related paragraph displaying screen 430 is not completed, and therefore, the display mode of the row of the corresponding related paragraph on the facilitation information presenting screen 420 is not changed.
The configuration of the analysis result presenting screen 410 illustrated in FIG. 5, the configuration of the facilitation information presenting screen 420 illustrated in FIG. 8, and the configuration of the related paragraph displaying screen 430 illustrated in FIG. 9 are all examples, and are not limited to the illustrated configurations as long as they are screens including information and functions required at each stage. For example, although the work time of each related paragraph is presented on the facilitation information presenting screen 420 illustrated in FIG. 8, alternatively or additionally, the cumulative time illustrated in FIG. 7 may be presented. In the above-described example, the row of a related paragraph for which the sharing work has been completed is distinguishably displayed on the facilitation information presenting screen 420. In addition, if the time allocated to each related paragraph has elapsed, the corresponding row on the facilitation information presenting screen 420 may be distinguishably displayed. On the facilitation information presenting screen 420, a row selectable for displaying the related paragraph displaying screen 430 may be limited to that in the order indicated in "order of priority", or a row may be selectable regardless of the order indicated in "order of priority". Furthermore, selection of the row of a related paragraph for which the sharing work has been completed may be limited.
Although the exemplary embodiment of the present disclosure has been described above, the technical scope of the present disclosure is not limited to the above-described exemplary embodiment. For example, although to-be-shared information is described in the description form 500 (see FIG. 10) and is read by the input apparatus 300 in the above-described exemplary embodiment, a document describing to-be-shared information may be created as an electronic document using a personal computer or another information processing apparatus, and the document may be directly input to the shared information management server 100. In addition, various changes or replacements of configurations that do not depart from the scope of the technical idea of the present disclosure are included in the present disclosure.
In the embodiment above, the term "processor" refers to hardware in a broad sense. Examples of the processor include general processors (e.g., CPU: Central Processing Unit) and dedicated processors (e.g., GPU: Graphics Processing Unit, ASIC: Application Specific Integrated Circuit, FPGA: Field Programmable Gate Array, and programmable logic device).
In the embodiment above, the term “processor” is broad enough to encompass one processor or plural processors in collaboration which are located physically apart from each other but may work cooperatively. The order of operations of the processor is not limited to one described in the embodiment above, and may be changed.
The foregoing description of the exemplary embodiment of the present disclosure has been provided for the purposes of illustration and description. It is not intended to be exhaustive or to limit the disclosure to the precise forms disclosed. Obviously, many modifications and variations will be apparent to practitioners skilled in the art. The embodiment was chosen and described in order to best explain the principles of the disclosure and its practical applications, thereby enabling others skilled in the art to understand the disclosure for various embodiments and with the various modifications as are suited to the particular use contemplated. It is intended that the scope of the disclosure be defined by the following claims and their equivalents.

Claims (11)

What is claimed is:
1. An information processing system comprising:
a processor configured to analyze an obtained document, the processor being configured to:
obtain a plurality of documents where information to be shared by a plurality of participants is recorded,
calculate a degree of similarity in details in units of blocks between different documents, the blocks being blocks of sentences included in the documents,
determine an execution order of executing work of sharing details of each of the blocks on the basis of the degree of similarity, and
allocate work time for performing the work of sharing details of each of the blocks on the basis of a number of characters in each of the blocks.
2. The information processing system according to claim 1, wherein the processor calculates a vector degree of similarity of a document vector representing one or more sentences included in each of the blocks as a vector, and regards the calculated vector degree of similarity as the degree of similarity in details described in each of the blocks.
3. The information processing system according to claim 2, wherein the processor regards blocks with a highest degree of similarity between different documents as related blocks, and regards one of the related blocks as a target of the sharing work.
4. The information processing system according to claim 3, wherein the processor regards a block with a lowest degree of similarity among the related blocks as a target of the sharing work.
5. The information processing system according to claim 4, wherein the processor excludes, among the related blocks, one or more blocks whose degrees of similarity are lower than a predetermined threshold from being a target of the sharing work.
6. The information processing system according to claim 1, wherein the processor:
relates blocks with a highest degree of similarity between different documents,
presents information indicating relation between the blocks to the participants, and
on receipt of a command from any of the participants, updates the information indicating relation between the blocks.
7. The information processing system according to claim 6, wherein the processor:
displays, on a display device, a graph in which the blocks are set as vertices and a side is formed between vertices representing the blocks that are related, and
receives an operation performed by any of the participants for replacing the side, and, in accordance with the replaced side, updates the information indicating relation between the blocks.
8. The information processing system according to claim 1, wherein the processor presents to the participants the execution order and the work time of executing the work of sharing the blocks, and a cumulative value of the work time in a case where the sharing work is executed in the execution order.
9. The information processing system according to claim 8, wherein the processor distinguishably presents information on one or more blocks of which the cumulative value of the work time is included in a range less than or equal to a predetermined upper limit.
10. The information processing system according to claim 9, wherein the processor distinguishably presents information on one or more blocks for which the sharing work has been completed.
11. An information processing system comprising:
processing means for analyzing an obtained document, the processing means:
obtaining a plurality of documents where information to be shared by a plurality of participants is recorded,
calculating a degree of similarity in details in units of blocks between different documents, the blocks being blocks of sentences included in the documents,
determining an execution order of executing work of sharing details of each of the blocks on the basis of the degree of similarity, and
allocating work time for performing the work of sharing details of each of the blocks on the basis of a number of characters in each of the blocks.
US16/878,688 | Priority date: 2019-10-10 | Filing date: 2020-05-20 | Information processing system | Active | US11244155B2 (en)

Applications Claiming Priority (3)

Application Number | Priority Date | Filing Date | Title
JP2019-186971 | 2019-10-10 | |
JP2019186971A (granted as JP7354750B2) (en) | 2019-10-10 | 2019-10-10 | Information processing system
JP2019-186971 (JP) | 2019-10-10 | |

Publications (2)

Publication Number | Publication Date
US20210110155A1 (en) | 2021-04-15
US11244155B2 (en) | 2022-02-08

Family

ID=75383137

Family Applications (1)

Application Number | Priority Date | Filing Date | Title
US16/878,688 (Active, US11244155B2 (en)) | 2019-10-10 | 2020-05-20 | Information processing system

Country Status (2)

Country | Link
US (1) | US11244155B2 (en)
JP (1) | JP7354750B2 (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
JP7354750B2 (en)* | 2019-10-10 | 2023-10-03 | Fujifilm Business Innovation Corp. | Information processing system
CN116934068A (en)* | 2023-09-19 | 2023-10-24 | Jiangling Motors Co., Ltd. | Office flow node management method and system


Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
JP6225543B2 (en)* | 2013-07-30 | 2017-11-08 | Fujitsu Limited | Discussion support program, discussion support apparatus, and discussion support method
JP6232930B2 (en)* | 2013-10-30 | 2017-11-22 | Ricoh Company, Ltd. | Conference support device, conference support system, and conference support method
JP6488417B1 | 2018-03-27 | 2019-03-20 | Hitachi, Ltd. | Workshop support system and workshop support method

Patent Citations (32)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
US20080158261A1 (en)* | 1992-12-14 | 2008-07-03 | Eric Justin Gould | Computer user interface for audio and/or video auto-summarization
US20030079184A1 (en)* | 2000-05-05 | 2003-04-24 | International Business Machines Corporation | Dynamic image storage using domain-specific compression
US20020059222A1 (en)* | 2000-10-31 | 2002-05-16 | Kouichi Sasaki | Information management method and information management device
US20030182310A1 (en)* | 2002-02-04 | 2003-09-25 | Elizabeth Charnock | Method and apparatus for sociological data mining
US20060155513A1 (en)* | 2002-11-07 | 2006-07-13 | Invoke Solutions, Inc. | Survey system
US20060271526A1 (en)* | 2003-02-04 | 2006-11-30 | Cataphora, Inc. | Method and apparatus for sociological data analysis
US20060026231A1 (en)* | 2004-07-30 | 2006-02-02 | Wolfgang Degenhardt | Collaborative agent for a work environment
US10552536B2 (en)* | 2007-12-18 | 2020-02-04 | Apple Inc. | System and method for analyzing and categorizing text
US20130066750A1 (en)* | 2008-03-21 | 2013-03-14 | Dressbot, Inc. | System and method for collaborative shopping, business and entertainment
US20120060082A1 (en)* | 2010-09-02 | 2012-03-08 | Lexisnexis, A Division Of Reed Elsevier Inc. | Methods and systems for annotating electronic documents
US20120278388A1 (en)* | 2010-12-30 | 2012-11-01 | Kyle Kleinbart | System and method for online communications management
US20130027428A1 (en)* | 2011-07-27 | 2013-01-31 | Ricoh Company, Ltd. | Generating a Discussion Group in a Social Network Based on Metadata
US20140039887A1 (en)* | 2012-08-02 | 2014-02-06 | Steven C. Dzik | Identifying corresponding regions of content
US9542669B1 (en)* | 2013-03-14 | 2017-01-10 | Blab, Inc. | Encoding and using information about distributed group discussions
JP6142616B2 (en) | 2013-03-27 | 2017-06-07 | Fujitsu Limited | Discussion support program, discussion support method, and information processing apparatus
US20140297641A1 (en)* | 2013-03-27 | 2014-10-02 | Fujitsu Limited | Discussion support method, information processing apparatus, and storage medium
US20150012805A1 (en)* | 2013-07-03 | 2015-01-08 | Ofer Bleiweiss | Collaborative Matter Management and Analysis
US20210065320A1 (en)* | 2013-07-03 | 2021-03-04 | Ofer Bleiweiss | Collaborative matter management and analysis
US20150142888A1 (en)* | 2013-11-20 | 2015-05-21 | Blab, Inc. | Determining information inter-relationships from distributed group discussions
US10922719B1 (en)* | 2013-12-09 | 2021-02-16 | Groupon, Inc. | Systems and methods for providing group promotions
US20150263978A1 (en)* | 2014-03-14 | 2015-09-17 | Amazon Technologies, Inc. | Coordinated admission control for network-accessible block storage
US20180268253A1 (en)* | 2015-01-23 | 2018-09-20 | Highspot, Inc. | Systems and methods for identifying semantically and visually related content
US20190147402A1 (en)* | 2015-11-24 | 2019-05-16 | David H. Sitrick | Systems and methods providing collaborating among a plurality of users
US20170178265A1 (en)* | 2015-12-17 | 2017-06-22 | Korea University Research And Business Foundation | Method and server for providing online collaborative learning using social network service
US20180322073A1 (en)* | 2016-02-15 | 2018-11-08 | Fujitsu Limited | Information processing apparatus, information processing method, and recording medium
WO2017141338A1 (en) | 2016-02-15 | 2017-08-24 | Fujitsu Limited | Information processing device, information processing method, and information processing program
US20180032608A1 (en)* | 2016-07-27 | 2018-02-01 | Linkedin Corporation | Flexible summarization of textual content
US20190019022A1 (en)* | 2017-07-14 | 2019-01-17 | Adobe Systems Incorporated | Syncing digital and physical documents
US20210067475A1 (en)* | 2018-01-09 | 2021-03-04 | Lunkr Technology (Guangzhou) Co., Ltd. | Instant messaging method, apparatus and system based on email system
US20210026897A1 (en)* | 2019-07-23 | 2021-01-28 | Microsoft Technology Licensing, Llc | Topical clustering and notifications for driving resource collaboration
US20210110155A1 (en)* | 2019-10-10 | 2021-04-15 | Fuji Xerox Co., Ltd. | Information processing system
US20210117714A1 (en)* | 2019-10-17 | 2021-04-22 | Microsoft Technology Licensing, Llc | System for predicting document reuse

Also Published As

Publication number | Publication date
JP7354750B2 (en) | 2023-10-03
JP2021064048A (en) | 2021-04-22
US20210110155A1 (en) | 2021-04-15

Similar Documents

Publication | Publication Date | Title
CN108600781B (en) | Video cover generation method and server
CN111738764B (en) | Method and system for predicting effect of advertisement creativity and generating advertisement creativity
JP7031009B2 (en) | Avatar generator and computer program
CN111586319B (en) | Video processing method and device
US11244155B2 (en) | Information processing system
CN110580135B (en) | Image processing device, image processing method, image processing program, and recording medium storing the program
US12229643B2 (en) | Teaching data extending device, teaching data extending method, and program
US11941519B2 (en) | Machine learning training platform
CN109377508B (en) | Image processing method and device
CN104871122B (en) | Display control apparatus and display control method
US20170024426A1 (en) | Information processing apparatus, information processing method, and non-transitory computer readable medium
CN113298896A (en) | Picture generation method and device, electronic equipment and storage medium
JP7447251B2 (en) | Assessing the visual quality of digital content
US20240143911A1 (en) | Document difference viewing and navigation
EP3951710A1 (en) | Image processing device, method for operation of image processing device, and program for operation of image processing device
CN114241496B (en) | Pre-training model training method and device for reading task and electronic equipment thereof
CN114937188A (en) | Information identification method, device, equipment and medium for sharing screenshot by user
US11257212B2 (en) | Image analysis device
US9672299B2 (en) | Visualization credibility score
CN116909655A (en) | Data processing method and device
CN106547891A (en) | For the quick visualization method of the pictured text message of palm display device
CN112990366B (en) | Target labeling method and device
US11074392B2 (en) | Information processing apparatus and non-transitory computer readable medium for switching between an attribute information mode and an image information mode
CN112308074A (en) | Method and device for generating thumbnail
CN112085027A (en) | Image segmentation model generation system, method and device and computer equipment

Legal Events

Date | Code | Title | Description

AS | Assignment

Owner name: FUJI XEROX CO., LTD., JAPAN

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST; ASSIGNORS: ANDO, JUN; NAKAMURA, SHINYA; MICHIMURA, TADAO; AND OTHERS; REEL/FRAME: 052708/0472

Effective date: 20200330

FEPP | Fee payment procedure

Free format text: ENTITY STATUS SET TO UNDISCOUNTED (ORIGINAL EVENT CODE: BIG.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

AS | Assignment

Owner name: FUJIFILM BUSINESS INNOVATION CORP., JAPAN

Free format text: CHANGE OF NAME; ASSIGNOR: FUJI XEROX CO., LTD.; REEL/FRAME: 056078/0098

Effective date: 20210401

STPP | Information on status: patent application and granting procedure in general

Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION

STPP | Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STPP | Information on status: patent application and granting procedure in general

Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER

STPP | Information on status: patent application and granting procedure in general

Free format text: NOTICE OF ALLOWANCE MAILED -- APPLICATION RECEIVED IN OFFICE OF PUBLICATIONS

STPP | Information on status: patent application and granting procedure in general

Free format text: PUBLICATIONS -- ISSUE FEE PAYMENT VERIFIED

STCF | Information on status: patent grant

Free format text: PATENTED CASE

MAFP | Maintenance fee payment

Free format text: PAYMENT OF MAINTENANCE FEE, 4TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1551); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

Year of fee payment: 4

