This is a Continuation of application Ser. No. 16/127,331, filed Sep. 11, 2018, which claims the benefit of Japanese Priority Application No. 2017-175145, filed Sep. 12, 2017. The disclosure of each of the prior applications is hereby incorporated by reference herein in its entirety.
BACKGROUND

1. Technical Field

The disclosure relates to an information processing device, an information processing method, and a non-transitory computer readable medium.
2. Related Art

Hitherto, work flows have increasingly been utilized as methods for integrating, into a system, know-how and procedures including operations and processes. A work flow generally includes information defining a procedure of various processes to be executed as an operation, as well as the contents of those processes, and is also referred to as a business operation flow or an operation flow. As an example, a device configured to edit a work flow used to evaluate quality improvements in production and manufacturing has been proposed (e.g., see JP-A-2007-188481). With the configuration of JP-A-2007-188481, a user who operates a client personal computer (PC) uses a function of a work flow creation program to create or edit a business flow.
In JP-A-2007-188481, the user selects icons of engines indicative of processes and the like, drags and drops the selected icons onto a matrix, and arranges the engines on the matrix. In this example, an order of execution of the engines arranged on the matrix is determined from left to right and top to bottom of the matrix, and thus a work flow is created.
Incidentally, some processes and operations configuring an operation flow limit which other processes and operations can be executed after them. For example, some processes and operations require that one of two or more sub-processes or sub-operations be executed consecutively, while certain combinations of processes and operations cannot be executed consecutively. In such cases, when creating or editing an operation flow, a user must take into account, in addition to the order of execution of processes and operations, limitations on consecutive processes and operations. An operation flow is therefore normally created and edited by a highly skilled user, and a reduced burden on such a user has been demanded.
The disclosure has been made in consideration of the situation described above, and an advantage thereof is that an appropriate operation flow including a plurality of operations and the like can be easily created or edited.
SUMMARY

An information processing device according to the disclosure is configured to create an operation flow including a plurality of operation steps, the operation flow being specified with an order of execution of the plurality of operation steps, and includes an input unit configured to accept entries, a display unit configured to cause a display face to perform displaying, and a controller configured to cause the display face to display a work region, arrange objects indicative of the plurality of operation steps onto the work region in accordance with the entries accepted by the input unit, and create the operation flow based on the arrangement of the objects on the work region. The controller determines whether the objects are acceptable or unacceptable by comparing the objects arranged on the work region with a condition set with respect to the plurality of operation steps respectively corresponding to the objects, adds, when unacceptability is determined, a display indicative of condition unacceptability onto each of the corresponding objects, and continues, after adding the display indicative of condition unacceptability onto each of the corresponding objects, arranging the objects onto the work region in accordance with the entries.
With the disclosure, when a user makes entries to arrange objects onto the work region, if one of the objects being arranged is found not to satisfy the condition set with respect to the operation steps, the user can be notified of the unacceptability by the display. By making entries that cause the display indicative of condition unacceptability to disappear, the user can easily create or edit an operation flow satisfying the condition. Even when unacceptability is determined, the user can continue arranging objects, and can thus make an entry to resolve the unacceptability at a desired timing. Convenience in creating or editing an operation flow can therefore be improved.
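By way of a rough illustration only, the following Python sketch models the behavior described above: objects are arranged onto a work region, each arranged object is compared with a condition set for its operation step, and any object that does not satisfy the condition is merely flagged so that arranging can continue. All names, the required-setting table, and the flagging mechanism are assumptions made for this sketch and are not taken from the disclosure.

```python
from dataclasses import dataclass, field

@dataclass
class StepObject:
    """An object placed on the work region for one operation step."""
    step_type: str
    settings: dict = field(default_factory=dict)
    condition_unacceptable: bool = False  # drives the warning display

# Hypothetical condition table: setting items required per operation step type.
REQUIRED_SETTINGS = {
    "work_procedure": {"work_name", "work_id", "output_content"},
    "read_2d_code": {"read_data", "branch_mapping"},
    "enter_text": {"char_limit", "branch_mapping"},
}

def check_object(obj: StepObject) -> None:
    """Compare one arranged object with its condition and record the result."""
    required = REQUIRED_SETTINGS.get(obj.step_type, set())
    obj.condition_unacceptable = not required.issubset(obj.settings)

def arrange(work_region: list, obj: StepObject) -> None:
    """Add an object to the work region; editing continues even when unacceptable."""
    work_region.append(obj)
    check_object(obj)

region = []
arrange(region, StepObject("work_procedure",
                           {"work_name": "Inspect seal", "work_id": 1,
                            "output_content": "Check the package seal"}))
arrange(region, StepObject("read_2d_code"))  # settings missing: flagged, not rejected
for order, obj in enumerate(region, start=1):
    print(order, obj.step_type, "NG" if obj.condition_unacceptable else "OK")
```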
The controller may be of a configuration to create the operation flow that includes the plurality of operation steps respectively corresponding to the objects arranged on the work region, and that is specified with the order of execution of the plurality of operation steps, in accordance with an order of the arrangement of the objects.
With the disclosure, the user makes entries, arranges objects onto the work region, and creates an operation flow corresponding to an order of the arrangement of the objects. By arranging objects, the user easily creates or edits an operation flow.
In the disclosure, the controller may be of a configuration to execute processing to add information or an attribute to the objects arranged on the work region in accordance with entries, and may determine, when one of the objects arranged on the work region has not been added with the information or the attribute of a set type, that the one of the objects is condition unacceptable.
With the disclosure, when condition unacceptability is determined with respect to adding information or an attribute to objects, the user can be notified of the unacceptability. The user is thus supported in adding information or an attribute to objects, and can easily create or edit an appropriate operation flow.
In the disclosure, one of the plurality of operation steps may be at least one of processing of outputting of information, entering of information, and making a determination each executed by a computer, and the controller may be of a configuration to cause icons, which are indicative of processing of the plurality of the operation steps respectively corresponding to the objects, to be associated with the objects arranged on the work region, and to be displayed.
With the disclosure, by displaying icons on objects on the work region, a user can easily recognize computer processing represented by operation steps. The user thus easily creates or edits an appropriate operation flow.
In the disclosure, one of the plurality of operation steps may be at least one of processing of outputting of information, of entering of information, and of making a determination, each executed by a computer, and the controller may be of a configuration to cause icons, which are indicative of processing of the plurality of operation steps respectively corresponding to the objects, to be associated with the objects arranged on the work region and to be displayed, as well as may compare the objects arranged on the work region with the condition set in association with processing of the plurality of operation steps respectively corresponding to the objects, and may determine whether the objects are acceptable or unacceptable.
With the disclosure, icons are displayed on objects on the work region. When one of the objects is determined as condition unacceptable, a display indicative of unacceptability is provided. The user thus easily recognizes a process represented by an operation step and its acceptability based on the display. By taking into account contents of operation steps and a condition set for the operation steps, the user creates or edits an appropriate operation flow.
In the disclosure, the controller may be of a configuration to cause the display face to display an information input region configured to accept entries of information about the plurality of operation steps, and may cause, when one of the plurality of operation steps respectively corresponding to the objects arranged on the work region is determined as condition unacceptable, the information input region to perform at least one of providing the display of condition unacceptability and changing a display aspect of the information input region.
With the disclosure, while the information input region configured to accept entries of information about operation steps is displayed, when one of the operation steps corresponding to the objects is determined as condition unacceptable, the information input region can provide the display to notify the user of the condition unacceptability. The user thus easily recognizes the acceptability of an operation step based on the display and can resolve the condition unacceptability. By taking into account the contents of operation steps and the condition set for the operation steps, the user creates or edits an appropriate operation flow.
In the disclosure, the controller may be of a configuration to add, to operation flow data, information about the objects to which the display indicative of condition unacceptability is added, the operation flow data being used by a computer to execute the operation flow created based on the arrangement of the objects on the work region, and may output the operation flow data as data described in a format specified beforehand.
With the disclosure, in accordance with the objects arranged on the work region, an operation flow to be executed by a computer can be created, and information indicative of condition unacceptability can be added to the data of the operation flow. Thus, the created operation flow can be output as data that can be processed by another computer, and the data allows that computer to detect condition unacceptability.
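A minimal sketch of how such operation flow data might be written out, assuming, purely for illustration, an XML layout in which each step records its position in the order of execution and an attribute noting any unresolved condition unacceptability; the element and attribute names are hypothetical and are not the format specified by the disclosure.

```python
import xml.etree.ElementTree as ET

def to_output_xml(steps):
    """steps: list of (step_type, condition_unacceptable) pairs in execution order."""
    flow = ET.Element("operationFlow")
    for order, (step_type, unacceptable) in enumerate(steps, start=1):
        ET.SubElement(flow, "step", {
            "order": str(order),
            "type": step_type,
            # Information added for objects shown with the unacceptability display.
            "conditionUnacceptable": "true" if unacceptable else "false",
        })
    return ET.tostring(flow, encoding="unicode")

print(to_output_xml([("work_procedure", False), ("read_2d_code", True), ("end", False)]))
```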
In the disclosure, the controller may be of a configuration to execute, in accordance with an entry accepted by the input unit, a condition setting process configured to set a condition relating to the plurality of operation steps, may compare, in accordance with the condition set in the condition setting process, the plurality of operation steps respectively corresponding to the objects arranged on the work region with the set condition, and may determine whether the operation steps are acceptable or unacceptable.
With the disclosure, after a condition is set for an operation step, when the set condition is not satisfied, the display indicative of unacceptability is provided. Thus, a user can set a detailed condition relating to an operation flow, and can easily create or edit the operation flow satisfying the set condition.
In the disclosure, a storage unit may be further included, and the controller may be of a configuration to cause the storage unit to store condition definition information indicative of the condition set in the condition setting process.
With the disclosure, once a condition is set, the condition can be used continuously, based on the condition definition information, in creating or editing an operation flow.
In the disclosure, an information processing method of creating an operation flow by using an information processing device equipped with a display unit, the operation flow including a plurality of operation steps and being specified with an order of execution, includes causing the display unit to display a work region, arranging, in accordance with entries, objects indicative of the plurality of operation steps onto the work region, creating the operation flow based on the arrangement of the objects on the work region, comparing the objects arranged on the work region with a condition set with respect to the plurality of operation steps respectively corresponding to the objects, determining whether the objects are acceptable or unacceptable, adding, when unacceptability is determined, a display indicative of condition unacceptability onto each of the corresponding objects, and continuing, after the display indicative of condition unacceptability is added onto each of the corresponding objects, arranging the objects onto the work region in accordance with the entries.
With the disclosure, when a user makes entries to arrange objects onto the work region, if one of the objects being arranged is found not to satisfy the condition set with respect to the operation steps, the user can be notified of the unacceptability by the display. By making entries that cause the display indicative of condition unacceptability to disappear, the user can easily create or edit an operation flow satisfying the condition. Even when unacceptability is determined, the user can continue arranging objects, and can thus make an entry to resolve the unacceptability at a desired timing. Accordingly, convenience in creating or editing an operation flow is improved.
In the disclosure, a non-transitory computer readable medium stores a program for causing a computer equipped with a display unit to execute, as a controller, a process configured to create an operation flow including a plurality of operation steps and specified with an order of execution, the process including causing the display unit to display a work region, arranging, in accordance with entries, objects indicative of the plurality of operation steps onto the work region, creating the operation flow based on the arrangement of the objects on the work region, comparing the objects arranged on the work region with a condition set with respect to the plurality of operation steps respectively corresponding to the objects, determining whether the objects are acceptable or unacceptable, adding, when unacceptability is determined, a display indicative of condition unacceptability onto each of the corresponding objects, and continuing, after the display indicative of condition unacceptability is added onto each of the corresponding objects, arranging the objects onto the work region in accordance with the entries.
With the disclosure, when a user makes entries to arrange objects onto the work region, if one of the objects being arranged is found not to satisfy the condition set with respect to the operation steps, the user can be notified of the unacceptability by the display. By making entries that cause the display indicative of condition unacceptability to disappear, the user can easily create or edit an operation flow satisfying the condition. Even when unacceptability is determined, the user can continue arranging objects, and can thus make an entry to resolve the unacceptability at a desired timing. Accordingly, convenience in creating or editing an operation flow is improved.
BRIEF DESCRIPTION OF THE DRAWINGS

Embodiments of the disclosure will be described with reference to the accompanying drawings, wherein like numbers reference like elements.
FIG. 1 is a function block diagram of a PC.
FIG. 2 is a function block diagram of a work terminal.
FIG. 3 is a flowchart illustrating an operation of the PC.
FIG. 4 is a view illustrating a display example of the PC.
FIG. 5 is a view illustrating a display example of the PC.
FIG. 6 is a schematic view of condition definition information.
FIG. 7 is a flowchart illustrating an operation of the PC.
FIG. 8 is a view illustrating a display example of the PC.
FIG. 9 is a flowchart illustrating an operation of the PC.
FIG. 10 is a view illustrating a display example of the PC.
FIG. 11 is a view illustrating a display example of the PC.
FIG. 12 is a view illustrating a display example of the PC.
FIG. 13 is a flowchart illustrating an operation of the work terminal, based on a process flow.
FIG. 14 is a flowchart illustrating an operation of the PC.
DESCRIPTION OF EXEMPLARY EMBODIMENTS

Hereinafter, an exemplary embodiment to which the disclosure is applied will be described with reference to the accompanying drawings.
FIG. 1 is a function block diagram of a personal computer (PC) 1 to which an information processing device according to the disclosure is applied. The PC 1 includes a display 32 and an input device 34, described later. As long as the PC 1 is a computer operable by a user, a specific configuration of the PC 1 is not limited. For example, the PC 1 may be a desktop computer, or may be a portable computer, such as a laptop computer or a tablet computer. As long as the PC 1 includes the functions illustrated in FIG. 1, the PC 1 may be a small device, such as a smart phone.
As illustrated in FIG. 1, the PC 1 includes a controller 10, a storage unit 20, a display unit 31, an input unit 33, an interface (I/F) unit 36, and a communication unit 37, all of which are coupled to each other via a bus 39. The controller 10 includes a processor, such as a central processing unit (CPU), and is configured to control the PC 1 through execution of a program by the processor, and to achieve various functions of the PC 1. The controller 10 may include a random access memory (RAM) configured to prepare a work area for the processor. The controller 10 may include a read-only memory (ROM) configured to store, in a non-volatile manner, a basic control program executed by the processor.
The controller 10 is configured to execute a program and to operate together with software and hardware to achieve functional units, such as an input detection unit 11, a display controller 12, a determination unit 13, an output data generation unit 14, and a communication controller 15. Functions of the functional units of the controller 10 will be described later.
The storage unit 20 includes a magnetic storage medium, an optical storage medium, or a semiconductor storage device and the like, and has a storage region used to store programs and data. The storage unit 20 is configured to store, in a non-volatile manner, programs to be executed by the controller 10 and data to be processed by the controller 10. The programs and data stored in the storage unit 20 will be described later.
The display unit 31 is coupled to the display 32 (display face), and is configured to cause the display 32 to display various screens with text and/or images in accordance with a control by the controller 10.
The display 32 includes a liquid crystal display device, an organic electro luminescence (EL) display device, or another display device, and is driven by the display unit 31.
The input unit 33 is coupled to the input device 34, and is configured to detect an operation of the input device 34, and to accept an entry through the operation of the input device 34. The input unit 33 outputs, to the controller 10, data indicative of a content of the entry through the input device 34.
The input device 34 may be a text input device, such as a keyboard, or a pointing device, such as a mouse, a digitizer, or a pen tablet. The input device 34 may be a configuration integrated with the display 32, such as a touch panel. The input device 34 may be a software keyboard or a graphical user interface (GUI) incorporated into a screen displayed on the display 32.
The interface (I/F) unit 36 is an interface used to couple an external device, such as a storage device, to the PC 1, and includes a universal serial bus (USB) interface, for example. The I/F unit 36 exchanges data with the external device coupled to the I/F unit 36 in accordance with a control by the controller 10.
The communication unit 37 is configured to execute wired or wireless communications with the external device attached to the PC 1 in accordance with a control by the controller 10. The communication unit 37 executes communications in accordance with various protocols, such as an Ethernet (registered trademark) protocol, a wireless LAN (including Wi-Fi (registered trademark)), and Bluetooth (registered trademark).
As examples of programs and data to be stored in the storage unit 20, FIG. 1 illustrates an operating system (OS) 21, a process flow definition tool 22, display-related data 23, condition definition information 24, process flow data 25, output setting data 26, and output data 27. The operating system (OS) 21 is a control program used by the controller 10 to control the PC 1, and configures a platform allowing the controller 10 to operate an application program. When the controller 10 executes the OS 21, a basic function of the PC 1 is provided as an application program interface (API) for the application program to be executed by the controller 10. The basic function of the PC 1 includes a display process to be executed by the display unit 31, an input detection process to be executed by the input unit 33, a data input and output process to be executed by the I/F unit 36, a communication process to be executed by the communication unit 37, and other processes, for example.
The process flow definition tool 22 is an application program used to create and edit a process flow with the PC 1. When the controller 10 executes the process flow definition tool 22, an operation flow to be executed by a work terminal 5 (FIG. 2), described later, can be created and edited. A process flow created by the process flow definition tool 22 is a specific example of an operation flow.
Here, an operation flow created and edited with the PC 1 will now be described.
An operation flow includes a basic operation or a plurality of basic operations (also referred to as units of operation), and represents a sequence in which an order of execution of the basic operations is defined. A basic operation corresponds to an operation step in the disclosure, and represents a process, such as outputting of information on a display, entering of information onto the display, or making a determination. A unit of operation is used as a unit when creating and editing an operation flow, and is not intended to impose any particular limitation. Specifically, when an operation flow is created and edited, the content of a basic operation is not edited. Apart from this point, the basic operation is not limited. Accordingly, the content of a basic operation may be determined as desired for convenience of editing an operation flow. A basic operation may contain a plurality of operations or processes, for example.
One specific example of an operation flow is a business operation flow (a so-called work flow) representing an operation including a plurality of tasks. A business operation flow includes a plurality of tasks performed by an operator to achieve a set objective, and includes a process configured to output information to the operator engaged in the plurality of tasks. One unit of work performed by an operator and one output of information to the operator each correspond to an operation step in an operation flow.
As an example of an operation flow, how the PC 1 creates and edits a process flow will now be described. In this example, a process flow, which is a kind of work flow, is created. The process flow includes work blocks as basic operations. A process flow to be created with the PC 1 can include, as a work block, a process configured to display a screen with an image and/or text as an output of information to an operator. Various other processes can be included as work blocks, such as reading of a two-dimensional code (2D code), such as a QR code (registered trademark), entering of text, and making a determination of whether positive or negative (hereinafter referred to as a positive/negative determination). Entering of a selection using a check box (hereinafter referred to as selection of a check box) and entering of a selection using a radio button (hereinafter referred to as selection of a radio button) are also included as work blocks. In a process flow, a work block of "end" is placed at the end of the process flow. As described above, a work block may include a plurality of processes.
The display-related data 23 includes data of images and/or text to be displayed by the display unit 31 when the controller 10 executes the process flow definition tool 22. When the controller 10 executes the process flow definition tool 22, the controller 10 refers to the display-related data 23, and causes the display 32, via the display unit 31, to display an image and/or text based on the display-related data 23.
The condition definition information 24 is information defining conditions with respect to the work blocks included in a process flow, and includes information on some or all of the work blocks that can be created or edited when the process flow definition tool 22 is executed. A condition with respect to a work block refers to a condition set with respect to a content of the work block. For example, a work block corresponding to an entry of information is to be set with a format of the information to be entered, an entry method, and an action to be taken when no entry is made. The condition definition information 24 thus defines, as conditions for a work block corresponding to an entry of information, the format of the information to be entered, the entry method, and the action to be taken when no entry is made. The condition definition information 24 will be described later in detail with reference to FIG. 6.
The process flow data 25 is data of a process flow generated when the controller 10 has executed the process flow definition tool 22. The process flow data 25 includes the work blocks included in the process flow, an order of execution of the work blocks, and setting contents including entries with respect to the work blocks, and may include other information. The process flow data 25 has a data format that can be interpreted when the controller 10 executes the process flow definition tool 22. When the controller 10 executes the process flow definition tool 22, the controller 10 can read the process flow data 25 from the storage unit 20, and can edit the process flow represented by the process flow data 25 that is read. In this case, based on the edited process flow, the controller 10 can update the process flow data 25 stored in the storage unit 20.
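Purely as an illustration of the kind of information the process flow data 25 holds, the following Python sketch builds a small flow with work blocks, their order of execution via next-block links, and the setting contents entered for each block; the field names and values are assumptions made for the sketch, not the actual data format of the process flow data 25.

```python
import json

# Hypothetical in-tool representation: work blocks, their setting contents,
# and the order of execution expressed as links to next work blocks.
process_flow = {
    "flow_name": "Packaging inspection",
    "blocks": [
        {"id": "b1", "type": "work_procedure",
         "settings": {"work_name": "Scan label", "work_id": 101,
                      "output_content": "Hold the label under the camera"},
         "next": {"default": "b2"}},
        {"id": "b2", "type": "read_2d_code",
         "settings": {"read_data": {"digits": 13, "format": "numeric"}},
         "next": {"positive": "b3", "negative": "b1"}},  # branch per determination
        {"id": "b3", "type": "end", "settings": {}, "next": {}},
    ],
}

# The tool can read stored data of this kind back, edit the flow, and
# overwrite the stored data with the edited result.
print(json.dumps(process_flow, indent=2))
```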
The output setting data 26 is data relating to settings used when the process flow data 25 is converted into the output data 27. The output data 27 (operation flow data) is data used by the PC 1 or a device other than the PC 1 to execute the process flow represented by the process flow data 25, and differs in data format and the like from the process flow data 25.
In the exemplary embodiment, the work terminal 5 (FIG. 2) will be described as an example of a device configured to execute a process flow created with the PC 1. The work terminal 5 is a terminal device used by an operator who is engaged in an operation based on a process flow.
The controller 10 executes the process flow definition tool 22 to generate (convert) the output data 27 from the process flow data 25. In other words, the process flow definition tool 22 is a program having a function of generating the output data 27.
The output data 27 is data interpretable and executable by the work terminal 5, and is described in a general-purpose data format, for example. A general-purpose data format refers to a data format that can be processed by the work terminal 5 via a web browser, for example. Specifically, general-purpose data is data described in Extensible Markup Language (XML), Hypertext Markup Language (HTML), or the like.
The output data 27 may be general-purpose data used to execute a process flow on various devices including general-purpose devices, such as PCs. The output data 27 may instead be data corresponding to a type, a configuration, or a specification of the device used to execute the process flow. For example, the output data 27 conforming to a device equipped with a camera may specify, as a method for entering information, performing text recognition on image data captured by the camera. When such output data 27 is executed by a device equipped with a camera, an operator can use the camera to make entries easily and with less burden. The output data 27 conforming to a device equipped with a bar-code reader may be data that specifies acquiring data read by the bar-code reader as a method for entering information.
The output data 27 may be data conforming to a device equipped with a head mounted display unit (head mounted display device) attached to a head of an operator. In the output data 27 conforming to a head mounted display device, a background color of a screen to be displayed, a resolution of the screen, sizes of images and/or text to be displayed on the screen, and the like may be adjusted so as to reduce a burden on the visual function of an operator wearing the head mounted display device. For example, the output data 27 conforming to a head mounted display device equipped with a see-through type display unit that allows a user to see a scene in a transparent manner may differ in background color, for example, from the output data 27 conforming to a head mounted display device equipped with a closed type display unit that shields external light.
The output setting data 26 includes data specifying a data format for the output data 27. The output setting data 26 may include data instructing a change of data included in the process flow data 25. For example, when the output data 27 is generated in accordance with a type, a configuration, or a specification of the device used to execute the process flow, the output setting data 26 is stored in the storage unit 20 per type, configuration, or specification of the device conforming to the output data 27. In this case, while the controller 10 is executing the process flow definition tool 22, the controller 10 generates the output data 27 in accordance with a piece of the output setting data 26 selected from among the pieces of the output setting data 26 stored in the storage unit 20.
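A minimal sketch of how output setting data stored per device type might steer generation of output data, under the assumption, made only for this sketch, that each setting records a data format and an input method and that the generator merely copies the relevant fields; none of these names come from the disclosure.

```python
# Hypothetical per-device output settings (cf. the output setting data 26).
OUTPUT_SETTINGS = {
    "generic_browser": {"format": "html", "input_method": "keyboard"},
    "camera_device":   {"format": "html", "input_method": "camera_text_recognition"},
    "hmd_see_through": {"format": "html", "input_method": "camera_text_recognition",
                        "background": "dark", "font_scale": 1.5},
}

def generate_output(block_types, device_type):
    """Convert a list of work block types into device-specific output data."""
    setting = OUTPUT_SETTINGS[device_type]
    return {
        "format": setting["format"],
        "input_method": setting["input_method"],
        "display_adjustments": {k: v for k, v in setting.items()
                                if k in ("background", "font_scale")},
        "steps": list(block_types),
    }

print(generate_output(["work_procedure", "read_2d_code", "end"], "hmd_see_through"))
```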
The controller 10 can output the output data 27 via the I/F unit 36 or the communication unit 37. For example, the controller 10 copies the output data 27 to a storage device coupled to the I/F unit 36, or causes the communication unit 37 to send the output data 27 to an external device.
The input detection unit 11 included in the controller 10 is configured to detect an entry by a user of the PC 1 based on data entered via the input unit 33. For example, when the controller 10 executes the process flow definition tool 22, the input detection unit 11 detects text and drag and drop operations entered via the input device 34.
The display controller 12 is configured to control and cause the display unit 31 to display various screens. For example, when the controller 10 executes the process flow definition tool 22, the display controller 12 causes the editing screen 101 (FIG. 4) and the details setting screen 151 (FIG. 10), described later, to be displayed.
The determination unit 13 corresponds to a function of the controller 10 executing the process flow definition tool 22, and is configured to determine a state of a work block when a process flow is created. Specifically, the determination unit 13 determines whether or not a work block included in a process flow satisfies a condition defined in the condition definition information 24.
The output data generation unit 14 is configured to refer to the output setting data 26 and to generate the output data 27 from the process flow data 25.
The communication controller 15 is configured to control the communication unit 37 to execute communications with a device other than the PC 1. For example, the communication controller 15 sends the output data 27 stored in the storage unit 20 to an external device.
FIG. 2 is a function block diagram of the work terminal 5.
The work terminal 5 includes a display 54 and an input device 56, and is a terminal device used by an operator who is engaged in an operation in accordance with a process flow. As long as the work terminal 5 is a computer, a specific configuration of the work terminal 5 is not limited, although the work terminal 5 is advantageously a portable device, such as a laptop computer, a tablet computer, or a smart phone.
As illustrated in FIG. 2, the work terminal 5 includes a controller 51, a storage unit 52, a display unit 53, an input unit 55, an I/F unit 63, and a communication unit 64, all of which are coupled to each other via a bus 66.
The controller 51 includes a processor, such as a CPU, and is configured to control the work terminal 5 through execution of a program by the processor, and to achieve various functions of the work terminal 5. The controller 51 may include a RAM configured to prepare a work area for the processor. The controller 51 may include a ROM configured to store, in a non-volatile manner, a basic control program executed by the processor.
The controller 51 is configured to execute a program and to operate together with software and hardware to achieve functional units, such as an input detection unit 51a, a communication controller 51b, a display controller 51c, an information acquisition unit 51d, and a process flow execution unit 51e. Functions of the functional units of the controller 51 will be described later.
The storage unit 52 includes a magnetic storage medium, an optical storage medium, or a semiconductor storage device and the like, and has a storage region used to store programs and data. The storage unit 52 is configured to store, in a non-volatile manner, programs to be executed by the controller 51 and data to be processed by the controller 51. The programs and data stored in the storage unit 52 will be described later.
The display unit 53 is coupled to the display 54, and is configured to cause the display 54 to display various screens with text and/or images in accordance with a control by the controller 51.
The display 54 is a liquid crystal display device, an organic EL display device, or another display device, and is driven by the display unit 53.
The display 54 may be a head mounted type display attached to a head of an operator. In this case, the work terminal 5 corresponds to a head mounted display (HMD) device. In this case, the display 54 may be a see-through type display unit that allows an operator to see a scene in a transparent manner. The display 54 may otherwise be a closed type display unit that shields external light.
The input unit 55 is coupled to the input device 56, and is configured to detect an operation of the input device 56, and to output, to the controller 51, operation data indicative of a content of the operation.
The input device 56 is an input device, such as a keyboard, a mouse, a digitizer, a pen tablet, or a touch pad. The input device 56 may be a configuration integrated with the display 54, such as a touch panel. The input device 56 may be a software keyboard or a graphical user interface (GUI) incorporated into a screen displayed on the display 54.
A camera 57 is a digital camera equipped with an imaging element, such as a charge coupled device (CCD) image sensor or a complementary metal oxide semiconductor (CMOS) image sensor. The camera 57 is configured to capture an image in accordance with a control by the controller 51 and to output imaging data to the controller 51.
A sound processing unit 58 is coupled to a speaker 59 and a microphone 60. The sound processing unit 58 is configured to cause the speaker 59 to output sound in accordance with a control by the controller 51. The sound processing unit 58 also detects sound collected by the microphone 60 in accordance with a control by the controller 51, and outputs, to the controller 51, sound data representing the sound collected by the microphone 60.
The speaker 59 includes an analog or stereo speaker provided on a main body of the work terminal 5, or a headphone. The microphone 60 may be provided on the main body of the work terminal 5, or may be a microphone coupled to the work terminal 5 in a wired or wireless manner. For example, the speaker 59 and the microphone 60 may be integrated into a headset.
A motion detection unit 61 is coupled to a motion sensor 62, and is configured to output data indicative of a detection value of the motion sensor 62 to the controller 51. The motion sensor 62 is a sensor configured to detect a motion of the work terminal 5 or a motion of an operator wearing and/or using the work terminal 5, and includes an acceleration sensor, an angular velocity sensor (gyro sensor), a geomagnetic sensor, and the like. The motion sensor 62 may be a sensor unit equipped with a plurality of sensors, such as an inertial measurement unit (IMU), for example. The work terminal 5 may also include various other sensors, such as a vibration sensor and a body motion sensor. In this case, the various sensors may be coupled to the motion detection unit 61, for example. Detection values of the various sensors may be acquired by the input unit 55 as input values.
The work terminal 5 may include a position detection unit (not shown) configured to detect a position of the work terminal 5 based on a global positioning system (GPS). In this case, the motion detection unit 61 may be coupled to the position detection unit, may acquire the position of the work terminal 5 detected by the position detection unit, and may output position data to the controller 51.
The I/F unit 63 is an interface used to couple an external device, such as a storage device, to the work terminal 5, and includes a USB interface, for example. The I/F unit 63 exchanges data with the external device coupled to the I/F unit 63 in accordance with a control by the controller 51.
The communication unit 64 is configured to execute wired or wireless communications with the external device attached to the work terminal 5 in accordance with a control by the controller 51. The communication unit 64 executes communications in accordance with various protocols, such as an Ethernet protocol and a wireless LAN (including Wi-Fi).
A wireless communication I/F unit 65 is configured to execute wireless communications with an external device attached to the work terminal 5. The wireless communication I/F unit 65 executes, in particular, wireless communications in a close range based on Bluetooth, near field radio communication (NFC), Zigbee (registered trademark), and the like. An external device 68 is a device configured to perform wireless communications with the wireless communication I/F unit 65. The external device 68 is not particularly limited in number, and a plurality of the external devices 68 can be coupled to the wireless communication I/F unit 65. The external device 68 is, for example, a bar-code reader configured to optically read a one-dimensional bar-code. The external device 68 may be an input device including a keyboard and a mouse, as well as a card reader, a data storage device, a smart phone, or the like.
As examples of programs and data stored in the storage unit 52, FIG. 2 illustrates an OS 52a, an application program 52b, and process flow data 52c. The OS 52a is a control program used by the controller 51 to control the work terminal 5, and configures a platform allowing the controller 51 to operate the application program 52b. When the controller 51 executes the OS 52a, a basic function of the work terminal 5 is provided as an API for the application program 52b. The basic function of the work terminal 5 includes a display process to be executed by the display unit 53, an input detection process to be executed by the input unit 55, a data input and output process to be executed by the I/F unit 63, a communication process to be executed by the communication unit 64, and other processes, for example.
The application program 52b is a program configured to allow the work terminal 5 to execute a process flow created with the PC 1. The application program 52b may be a special application program configured to execute a process flow, or may be a general-purpose program, such as a web browser. A specific configuration of the application program 52b can be selected as desired in conformity to a specification of the process flow data 52c, described later.
The process flow data 52c is data of a process flow created with the PC 1, and corresponds to the output data 27 output from the PC 1. The controller 51 is configured to acquire the output data 27 created with the PC 1, and to cause the storage unit 52 to store the output data 27 as the process flow data 52c. The process flow data 52c may be data conforming to a specification of the work terminal 5. For example, the process flow data 52c may be data of a process flow including work blocks in which the camera 57 and the motion sensor 62 are used. The process flow data 52c may otherwise be general-purpose data executable by various devices including the work terminal 5 and work terminals of a different type or specification (e.g., a device without the camera 57).
When the controller 51 executes the application program 52b, the controller 51 reads the process flow data 52c, and executes the work blocks included in the process flow data 52c.
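As a rough sketch of the execution side, the following assumes, hypothetically, that the process flow data arrives as a table of work blocks keyed by an identifier, each naming the block that follows it per determination result; the shape of the data and the names are inventions of the sketch, not the format of the process flow data 52c.

```python
# Minimal sketch: follow next-block links until the "end" work block is reached.
def run_flow(blocks, start_id):
    current = start_id
    while current is not None:
        block = blocks[current]
        print("executing", block["type"])        # e.g. display a screen, read a code
        if block["type"] == "end":
            break
        result = "positive"                      # stand-in for the operator's result
        current = block["next"].get(result) or block["next"].get("default")

blocks = {
    "b1": {"type": "work_procedure", "next": {"default": "b2"}},
    "b2": {"type": "read_2d_code", "next": {"positive": "b3", "negative": "b1"}},
    "b3": {"type": "end", "next": {}},
}
run_flow(blocks, "b1")
```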
The input detection unit 51a included in the controller 51 is configured to detect an entry by an operator of the work terminal 5 based on data entered from the input unit 55.
The communication controller 51b is configured to control the communication unit 64 to execute communications with a device other than the work terminal 5. For example, the communication controller 51b executes communications with the PC 1, receives the output data 27 from the PC 1, and causes the storage unit 52 to store the output data 27 as the process flow data 52c.
The display controller 51c is configured to control the display unit 53 to display various screens. For example, when the controller 51 executes the application program 52b, the display controller 51c causes various screens respectively corresponding to work blocks to be displayed.
The information acquisition unit 51d is configured to execute various tasks, such as reading of data via the I/F unit 63. For example, when a storage device coupled to the I/F unit 63 is detected, the information acquisition unit 51d reads data from the storage device, and causes the storage unit 52 to store the data. The work terminal 5 can thus read and use the process flow data 52c stored in the storage device.
The process flow execution unit 51e is configured to execute the application program 52b stored in the storage unit 52, and to execute a process flow in accordance with the process flow data 52c.
FIG. 3 is a flowchart illustrating an operation of the PC 1.
The operation in FIG. 3 is executed while the controller 10 is executing the OS 21.
In response to an operation by a user, for example, the controller 10 reads the process flow definition tool 22 from the storage unit 20 and launches the process flow definition tool 22 (step S11). The controller 10 reads the display-related data 23, and causes the display unit 31 to display an editing screen arranged with an image and the like included in the display-related data 23 (step S12).
FIG. 4 is a view illustrating a display example of the PC 1, and illustrates a configuration example of an editing screen 101. FIG. 4 illustrates a view of a state of the editing screen 101 at a point of time when creating or editing of a process flow is started. This state corresponds to a state immediately after the editing screen 101 is displayed in step S12 (FIG. 3), for example.
The editing screen 101 is a screen displayed through a function of the process flow definition tool 22 for creating and editing a process flow. The editing screen 101 is roughly separated into a candidate region 110 and an editing region 120 (work region).
At an upper section of the editing screen 101, an instruction section 101a is arranged. The instruction section 101a includes an icon for instructing the end of creating and editing of a process flow, as well as an icon for instructing storing of the process flow.
The candidate region 110 displays a list of work blocks that can be incorporated into a process flow to be created or edited. The candidate region 110 is arranged with, in regions separated per work block, text describing each of the work blocks and icons that are images representing symbols indicative of the work blocks.
For example, in the editing screen 101 in FIG. 4, the candidate region 110 is arranged with work block display sections 111, 112, 113, 114, 115, 116, and 117 corresponding to seven work blocks. The work block display section 111 includes a work block description display section 111a, which is text describing that the section corresponds to a work block to which a work procedure is to be set, and an icon 111b indicative of the work block. Similarly, the work block display sections 112, 113, 114, 115, 116, and 117 respectively include work block description display sections 112a, 113a, 114a, 115a, 116a, and 117a, and respectively include icons 112b, 113b, 114b, 115b, 116b, and 117b. The icons 111b to 117b have respectively different images so that the work block display sections 111 to 117 can be visually identified. Advantageously, the icons are images representing the contents of the respective work blocks. The work block description display sections 111a to 117a may respectively be names of work blocks, or may be text simply describing the contents of the work blocks.
The work block display sections 111 to 117 arranged on the candidate region 110 are candidates for work blocks to be incorporated into a process flow to be created and edited. A user can operate the input device 34 to select any of the work block display sections 111 to 117, drag and drop the selected ones onto the editing region 120, and incorporate desired work blocks into a process flow.
The editing region 120 is a display region used to display a process flow. The editing region 120 is arranged with work blocks selected by the user from among the work block display sections 111 to 117 arranged on the candidate region 110. A process flow is created from one work block or a plurality of work blocks arranged on the editing region 120. As will be described later, the editing region 120 is arranged with objects represented by images corresponding to work blocks.
FIG. 5 is a view illustrating a display example of the PC 1, and illustrates a configuration example of the editing screen 101. FIG. 5 illustrates a state in which a process flow is being created or edited.
In the state in FIG. 5, the editing region 120 of the editing screen 101 is arranged with an object 201. The object 201 is an object dragged and dropped from the work block display section 111, and represents the work block in the work block display section 111. The object 201 includes an icon 201a with an image identical to the image of the icon 111b. With the icon 201a, the object 201 is easily and visually identified as representing the work block in the work block display section 111. The object 201 may have a desired shape. In the example illustrated in FIG. 5, however, the object 201 partially has a downward arrow shape indicating an order of execution of work blocks.
An order of arrangement of the work blocks arranged on the editing region 120 corresponds to an order of execution of the work blocks in a process flow. In the exemplary embodiment, an order of execution of work blocks on the editing region 120 is set from top to bottom and left to right. On the editing region 120, higher work blocks are executed earlier than lower work blocks, and work blocks positioned further left are executed earlier than work blocks positioned further right.
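The following is a minimal sketch of deriving an order of execution from the arrangement, under the simplifying assumption that each object carries a grid position and that sorting by row and then by column reproduces the top-to-bottom, left-to-right rule described above; the position fields are hypothetical.

```python
# Hypothetical objects with grid positions on the editing region.
objects = [
    {"name": "enter_text", "row": 1, "col": 0},
    {"name": "work_procedure", "row": 0, "col": 0},
    {"name": "branch_on_negative", "row": 1, "col": 1},
]

# Top to bottom first (row), then left to right (column) within a row.
execution_order = sorted(objects, key=lambda o: (o["row"], o["col"]))
print([o["name"] for o in execution_order])
# ['work_procedure', 'enter_text', 'branch_on_negative']
```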
On the editing region 120, an image guiding the user to a position onto which a work block (more specifically, an object indicative of a work block) can be added and arranged is displayed. This image is a guide 121 in FIG. 4 and a guide 122 in FIG. 5.
In FIG. 4, no work block is arranged on the editing region 120, and work blocks can thus be added and arranged onto the editing region 120. When work blocks can be added and arranged onto the editing region 120, the controller 10 causes the display unit 31 to display the guide 121 on the editing region 120. Since no object is arranged on the editing region 120, the guide 121 includes text 121a guiding the user to perform a drag and drop operation. In the state in FIG. 4, a work block to be executed first is to be dragged and dropped, and the guide 121 is thus arranged at an uppermost section of the editing region 120. However, the guide 121 may be placed at a desired position. To support creation of a process flow including many work blocks, the editing region 120 may be scrollable in a vertical direction and/or a horizontal direction.
In FIG. 5, the editing region 120 is arranged with the object 201. The work block represented by the object 201 and displayed in the work block display section 111 includes a process configured to perform outputting for guiding a work procedure to an operator who uses the device (e.g., the work terminal 5) configured to execute the process flow. Since one work block can be executed after this work block, the editing region 120 displays the guide 122 indicative of a position onto which an object can be added. The guide 122 lies below the object 201, indicating that a work block (hereinafter referred to as a next work block) to be executed after the object 201 can be arranged there.
The work block displayed in the work block display section 112 includes a process configured to read a 2D code. The process acquires read data read from an image code by a device configured to optically read the image code, such as the camera 57 of the work terminal 5. The work block is to be set with two next work blocks. In other words, a case where a 2D code is read successfully is specified as a positive determination, while a case where reading fails or no reading takes place is specified as a negative determination, and next work blocks respectively corresponding to the positive determination and the negative determination are to be set.
The work block displayed in the work block display section 113 includes a process configured to enter text. The process acquires data of text entered by an operator with the input device 56 of the work terminal 5. A method used in the process to detect and acquire an entry by an operator is not limited to a method using the input device 56. For example, a text recognition process using imaging data of the camera 57 or a sound recognition process for sound collected with the microphone 60 may be used to detect various entries including text. A bar-code reader may be used as the external device 68 to read a bar-code and to acquire read data, and the acquired data may then be detected as an entry by an operator. A motion acquired from a detection value of the motion sensor 62 may also be detected as an entry by an operator. These input detection methods are also applicable to the work blocks displayed in the work block display sections 114, 115, and 116, described later.
The work block displayed in the work block display section 113 is to be set with two next work blocks respectively corresponding to a positive determination and a negative determination. The positive determination corresponds to a case where text is entered, while the negative determination corresponds to a case where no text is entered.
The work block displayed in the work block display section 114 includes a process configured to make a positive/negative determination. In the process, an entry by an operator specifying positive or negative is to be detected. The work block is to be set with two next work blocks respectively corresponding to a positive determination and a negative determination. The positive determination corresponds to an entry representing positive, while the negative determination corresponds to an entry representing negative.
The work block displayed in the work block display section 115 includes a process configured to make a selection in a check box. The work block displayed in the work block display section 116 includes a process configured to make a selection with a radio button. These processes respectively display a check box and a radio button to allow an operator to view them, and detect the selections made by the operator in the check box or with the radio button. The work blocks displayed in the work block display sections 115 and 116 are each set with next work blocks in accordance with the number of check boxes or radio buttons.
The work block displayed in the work block display section 117 represents the end of the process flow. No next work block can be set to this work block.
The guide 121 is an image indicative of a position onto which a work block can be arranged on the editing region 120, and is displayed to guide the user to a position onto which a work block can be added. The guide 121 is advantageously an image that can be distinguished from the work blocks arranged in accordance with operations by the user, and is displayed with a broken line, for example, as illustrated in FIG. 4.
In FIG. 5, the guides 122 displayed in response to the object 201 are displayed in a number corresponding to the work blocks that can be set as next steps (hereinafter also referred to as next processes or next flows) for the object 201. The object 201 represents the work block displayed in the work block display section 111, and one next work block can be added after this work block. The guide 122 is therefore displayed to indicate that an object of the next step can be arranged below the object 201.
As described above, work blocks that can be incorporated into a process flow each have various attributes, such as the number of next work blocks. The PC 1 manages these attributes with the condition definition information 24.
FIG. 6 is a schematic view of the condition definition information 24. For description purposes, FIG. 6 illustrates, in a table, a configuration example of the condition definition information 24. However, a data format and a data arrangement configuration of the condition definition information 24 may be selected as desired.
In the example in FIG. 6, a setting content, the number of next work blocks, and an image of an object to be displayed on the editing region 120 are associated with each other per work block and are set in the condition definition information 24. The image of an object itself need not be included in the condition definition information 24; data of the image of the object may be included in the display-related data 23, for example. In this case, the condition definition information 24 may include at least information specifying, per work block, the data of the image of the object included in the display-related data 23.
The number of next work blocks can also be referred to as the number of objects to be coupled on the editing region 120. From this point of view, the number of next work blocks can be referred to as the number of objects to be coupled or the number of coupling points possessed by each object. In FIG. 6, the number of next work blocks is described as the number of coupling points. A coupling point denotes a position to which a next object is to be coupled.
Work blocks of objects arranged on the editing region 120 are specified beforehand with items to be set. These items are referred to as setting contents in FIG. 6. For example, a work block specifying a work procedure (corresponding to the work block display section 111 in FIG. 4) is to be set with a work name, a work ID, a content to be output to an operator as the work procedure (e.g., an image and/or text), and other work-related settings. If these items have not yet been set, the work terminal 5 cannot execute the work block. The setting items in the condition definition information 24 are thus conditions for allowing a work block to be executed.
Similarly, the setting contents for a work block in which a 2D code is to be read include read data and information about an association between a result of determination and coupling points. Read data refers to information specifying the data expected to be entered as the data of a read 2D code, such as the number of digits in the read data and a data format (e.g., numerical value, text, mathematical formula, or URL). The association between a result of determination and coupling points is information associating the two coupling points (bottom and right) on the editing region 120 with the case where a 2D code is read successfully (positive determination) and the case where reading fails (negative determination), respectively. Entering this information satisfies the condition for executing the work block in which a 2D code is to be read.
Similarly, the setting contents for a work block in which text is to be entered include a limit on the number of characters in the text to be entered and information about an association between a result of determination and coupling points. The association between a result of determination and coupling points is information associating the two coupling points (bottom and right) on the editing region 120 with a positive determination and a negative determination, respectively. Entering this information satisfies the condition for executing the work block in which text is to be entered.
The setting content for a work block in which a positive/negative determination is to be made includes information about an association between a result of determination and coupling points.
The setting contents for a work block in which a selection of a check box or a selection of a radio button is to be made include the number of check boxes or radio buttons and the text to be displayed and output for the check boxes or radio buttons.
The work block used as the end of a flow includes no setting item.
As described above, in the condition definition information 24, for each work block that can be included in a process flow to be created or edited with the process flow definition tool 22, a condition allowing the work block to be executed and an image (object) to be displayed on the editing region 120 are associated with each other. In creating and editing a process flow on the editing screen 101, the condition definition information 24 is referred to, and various processes are performed based on the condition definition information 24.
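As one way of picturing the association described above, the following sketch lays out a hypothetical condition definition table keyed by work block type, holding the required setting items, the number of coupling points, and a reference to the object image; all entries are illustrative stand-ins for the content of FIG. 6, not a reproduction of it.

```python
# Hypothetical condition definition information: per work block type, the
# setting items that constitute its condition, its coupling points, and the
# image used for the corresponding object on the editing region.
CONDITION_DEFINITIONS = {
    "work_procedure":    {"required": ["work_name", "work_id", "output_content"],
                          "coupling_points": 1, "object_image": "work_procedure.png"},
    "read_2d_code":      {"required": ["read_data", "branch_mapping"],
                          "coupling_points": 2, "object_image": "read_2d_code.png"},
    "enter_text":        {"required": ["char_limit", "branch_mapping"],
                          "coupling_points": 2, "object_image": "enter_text.png"},
    "positive_negative": {"required": ["branch_mapping"],
                          "coupling_points": 2, "object_image": "pos_neg.png"},
    "check_box":         {"required": ["box_count", "box_labels"],
                          "coupling_points": "per_box", "object_image": "check_box.png"},
    "end_of_flow":       {"required": [],
                          "coupling_points": 0, "object_image": "end.png"},
}

print(CONDITION_DEFINITIONS["read_2d_code"]["coupling_points"])  # 2
```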
Now back toFIG. 3, thecontroller10 displays the editing screen101 (step S12), and executes a process flow editing process configured to create or edit a process flow in accordance with an operation by a user (step S13). The process flow editing process will be described later in detail.
The controller 10 stores the process flow data 25 representing the process flow created or edited in the process flow editing process in the storage unit 20 (step S14). When an identical process flow is already stored in the process flow data 25, the process flow data 25 is overwritten and updated.
The controller 10 determines whether the process flow data 25 is to be output to an external device attached to the work terminal 5 and the like (step S15). When outputting of the process flow data 25 is instructed with an operation by the input device 34 and the like (step S15; YES), the controller 10 refers to the output setting data 26, and acquires settings corresponding to a type and a function of an output-destination device (step S16). The output setting data 26 may include a setting of one general-purpose type. In this case, the controller 10 may acquire a setting of one type, which is included in the output setting data 26.
The controller 10 follows the setting acquired in step S16 to generate and output the output data 27 based on the process flow data 25 (step S17). In step S17, the controller 10 may store the output data 27 in the storage unit 20, may output the output data 27 to a device coupled to the I/F unit 36, or may send the output data 27 via the communication unit 37. When the process flow data 25 is not to be output (step S15; NO), the controller 10 ends the process.
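As a non-authoritative sketch, step S17 could be realized as a small function that serializes the process flow data according to the acquired setting; the format names and field names below are assumptions for illustration.

```python
import json

def generate_output_data(process_flow, output_setting):
    """Hypothetical sketch of step S17: serialize the process flow
    according to a setting acquired for the output-destination device."""
    if output_setting.get("format") == "json":
        return json.dumps(process_flow, indent=2)
    # Fall back to a simple line-oriented text format for other device types.
    lines = [f'{step["id"]}\t{step["type"]}' for step in process_flow["steps"]]
    return "\n".join(lines)
```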
FIG. 7 is a flowchart illustrating an operation of the PC 1, and illustrates the process flow editing process described in step S13 in FIG. 3.
While the editing screen 101 is being displayed, the controller 10 accepts selection operations made onto the candidate region 110 or the editing region 120 (step S31). The controller 10 determines whether the accepted selection operations correspond to operations made on the candidate region 110 or operations made on the editing region 120 (step S32).
When the operations made on the candidate region 110 are accepted (step S32; candidate region), the controller 10 acquires a result of the selections (step S33). The result of the selection operations indicates the work blocks selected from among the work block display sections 111 to 117.
The controller 10 adds and arranges, onto the editing region 120, objects corresponding to the work blocks selected on the candidate region 110 (step S34). The controller 10 determines, in accordance with the number of coupling points in the condition definition information 24, the number of the guides 122 to be displayed and display positions for the objects arranged in step S34 (step S35). In the exemplary embodiment, since an order of execution is specified from top to bottom and left to right on the editing region 120, as an example, the guide 122 is arranged below when the number of coupling points is one, while the guide 122 is arranged below and another guide is arranged on the right when the number of coupling points is two. Such guides arranged on the right of objects are, for example, branch guides 202b and 204b, described later (FIG. 8). The controller 10 displays the guides accompanying the objects in accordance with the arrangement determined in step S35 (step S36).
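A minimal sketch of the guide placement in step S35 is given below, assuming the editing region 120 is treated as a grid of hypothetical (column, row) cells; one guide is placed below the object, and a branch guide is placed on the right when a second coupling point exists.

```python
def guide_positions(object_pos, coupling_points):
    """Sketch of step S35: decide where guides are displayed for a newly
    arranged object, based on its number of coupling points."""
    col, row = object_pos
    guides = []
    if coupling_points >= 1:
        guides.append((col, row + 1))  # guide below, for the main flow
    if coupling_points >= 2:
        guides.append((col + 1, row))  # branch guide on the right
    return guides
```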
Next, the controller 10 acquires, from the condition definition information 24, setting contents relating to all the objects arranged on the editing region 120 or the objects added and arranged in step S34 (step S37). The controller 10 determines whether all the objects arranged on the editing region 120 or the objects added and arranged in step S34 satisfy conditions stored in the condition definition information 24 (step S38). Satisfying a condition denotes that all setting contents in the condition definition information 24 have already been set; otherwise, the condition is determined as unsatisfied.
When an object is determined not to satisfy a condition (unsatisfied) (step S38; YES), the controller 10 adds and displays a notification icon on the object determined as unsatisfied (step S39). When the object in question is already displayed with the notification icon, the notification icon is kept displayed.
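Steps S37 to S39 can be pictured with the following sketch, assuming each object carries a hypothetical settings dictionary and that an object is unsatisfied whenever any required setting content is missing.

```python
def update_notification_icons(objects, condition_definition):
    """Sketch of steps S37 to S39: mark each object whose required
    setting contents have not all been entered."""
    for obj in objects:
        required = condition_definition[obj["type"]]["required_settings"]
        unsatisfied = any(key not in obj.get("settings", {}) for key in required)
        obj["show_notification_icon"] = unsatisfied  # add or keep the icon
    return objects
```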
FIG. 8 is a view illustrating a display example of the PC 1, and illustrates a configuration example of the editing screen 101. FIG. 8 illustrates a state where a plurality of objects are arranged on the editing region 120.
In the state in FIG. 8, the editing region 120 of the editing screen 101 is arranged with the object 201, as well as with the objects 204 and 202, in descending order. The guide 122 is displayed below the object 202.
The object 204 and the object 202 each have two coupling points. The branch guide 204b is displayed on the right of the object 204, indicating that a next object can be arranged there in addition to the object that can be arranged below. Similarly, the object 202 is displayed with the branch guide 202b. Similar to the guide 122, the branch guides 202b and 204b each indicate that an object corresponding to a work block can be added and arranged. An object having a plurality of coupling points causes a process flow to branch. An object corresponding to a work block having a plurality of coupling points can be referred to as a branch object. In the exemplary embodiment, for convenience, a process flow mainly including a series of objects extending downward from a branch object is created. On the other hand, a flow including objects extending in a direction other than downward (e.g., rightward) from a branch object is referred to as a branch flow.
In FIG. 8, a notification icon 125 is further displayed in an overlapped manner on the object 202. The notification icon 125 indicates that the object 202 is not set with a content specified in the condition definition information 24, i.e., the object 202 is in an unsatisfied state. The notification icon 125 is a display indicating that a condition is not satisfied. The notification icon 125 is an image notifying, to a user of the PC 1, that the object 202 requires a setting, and urging the user to provide the setting.
In FIG. 8, the notification icon 125 is displayed on the object 202, the guide 122 is displayed below the object 202, and the branch guide 202b is displayed on the right. Objects can be added and arranged below and to the right of the object 202. As described above, even when an object (a work block corresponding to an object) that does not satisfy a condition specified in the condition definition information 24 is present, the controller 10 can edit a process flow relating to another object. Thus, a user can create or edit a process flow in a desired procedure. For example, the user can first arrange objects in a process flow, and then resolve unsatisfied states. Without being forced into a fixed work procedure, a user can create and edit a process flow efficiently.
Now back to FIG. 7, after a notification icon is added on an object determined as unsatisfied (step S39), the controller 10 determines whether the process flow is to be edited or ended (step S40). In step S40, a determination is made in accordance with whether an operation to the instruction section 101a has been made, for example. When no object is determined as unsatisfied (step S38; NO), the controller 10 causes any notification icons already displayed to disappear, and performs step S40.
To end editing of the process flow (step S40; YES), the controller 10 creates the process flow data 25 in accordance with the states of the objects arranged on the editing region 120, and causes the storage unit 20 to store the process flow data 25 or updates the process flow data 25 (step S41).
To continue editing of the process flow (step S40; NO), the controller 10 returns to step S31.
When the entries of selections accepted in step S31 are entries onto the editing region 120 (step S32; editing region), the controller 10 executes a details setting process (step S42). The details setting process is a process configured to set the setting contents defined in the condition definition information 24 with respect to objects already arranged on the editing region 120. After the details setting process is executed (step S42), the controller 10 performs step S40.
FIG. 9 is a flowchart illustrating an operation of the PC 1, and illustrates the details setting process described in step S42 in FIG. 7. The details setting process corresponds to a process configured to add information or an attribute to an object.
The controller 10 acquires a result of the selection operations accepted in step S31 in FIG. 7 (step S61). The result acquired in step S61 indicates the operations of selecting any of the objects arranged on the editing region 120.
The controller 10 causes the display unit 31 to switch from the editing screen, and to display a details setting screen (step S62).
FIG. 10 is a view illustrating a display example of the PC 1, and illustrates a configuration example of a details setting screen 151. FIG. 10 illustrates an example in which the display is switched to the details setting screen 151 while a plurality of objects are arranged on the editing region 120 of the editing screen 101.
The details setting screen 151 includes a process flow display region 160, a view editing region 170, and a details setting region 180 (information input region). At an upper section of the details setting screen 151, an instruction section 151a is arranged. The instruction section 151a includes an icon for instructing storing of the details settings, as well as an icon for instructing ending of the details setting process.
The process flow display region 160 is a region on which a process flow 161 is displayed, and in which objects indicative of the work blocks configuring the process flow 161 are arranged in an order of execution. In the example in FIG. 10, the process flow 161 including the object 201 and the object 202 is displayed. Since the flow is still under creation, the process flow 161 does not include an object 207 indicative of the end of the flow. Therefore, the guide 122 is displayed at the bottom of the process flow 161.
The object 202 is a branch object allowing another object to be arranged on the right. In the process flow display region 160, a destination specifying section 202c is arranged on the right of the object 202. The destination specifying section 202c is an image indicative of a destination object when the flow branches rightward from the object 202. As a destination to be set at the coupling point on the right of the object 202, a new object can be set, or an object already included in the process flow 161 can be specified. For example, a setting can be made so that, when a positive determination or a negative determination is made at the object 202 in the process flow 161, the flow returns, in the order of execution, to the object 201 lying above. At the coupling point, the destination specifying section 202c indicates that one of the objects already included in the process flow 161 is specified as a destination. A destination represented by the destination specifying section 202c may be placed before or after the object 202 in the order of execution.
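As an illustration of the destination setting described above, the branch taken at the right-hand coupling point of the object 202 might be recorded as follows; the keys and identifiers are hypothetical.

```python
# Hypothetical record of the branch set at the right-hand coupling point of
# the object 202: the destination may be a new object or an object already
# included in the process flow 161 (here, looping back to the object 201).
branch_setting = {
    "source_object": "202",
    "result": "negative",         # taken when reading the 2D code fails
    "destination_object": "201",  # an object earlier in the order of execution
}
```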
The view editing region 170 is a region used to display a configuration (user view 171) of a screen displayed by the work terminal 5 when the work terminal 5 executes the process flow 161 created in the process. The view editing region 170 displays the configuration of the screen in accordance with any of the objects configuring the process flow 161 displayed on the process flow display region 160. For example, when the result of the selections, which is acquired in step S61 (FIG. 9), indicates the object 201 on the editing region 120 in the editing screen 101 (FIG. 5), the user view 171 in FIG. 10 is displayed.
In the example in FIG. 10, the user view 171 illustrates a configuration of a screen corresponding to the object 201 that is the first object (work block) in the process flow 161. The user view 171 includes a process display section 172 displaying an order of a plurality of work blocks included in the process flow 161. The process display section 172 is a display section notifying, to an operator, a position of a work block being executed in a whole process.
The user view 171 includes an operation guide display section 173. The operation guide display section 173 guides, to an operator, an association between an operation of the input device 56 and progress of the process flow 161. The operator can operate the input device 56 in accordance with the operation guide display section 173 being displayed to advance the process flow 161 to a next work block, or to return to a previous work block in an order of execution.
For example, when the work terminal 5 is used to display an image and/or text while the process flow 161 is being executed, the user view 171 can be used to set a display position, a display size, a display timing, and the like for the image and/or the text.
The details setting region 180 is a region used to enter the setting contents defined in the condition definition information 24 for an object selected in the process flow 161. In the details setting region 180 in FIG. 10, a title setting section 181 and a work information setting section 182 are arranged for accepting entries in accordance with the setting contents for the object 201. The title setting section 181 includes entry boxes used to enter a work name and a work ID. The work information setting section 182 includes entry boxes 183 used to enter contents to be displayed to an operator as a work procedure, as well as a check box 184 and the like. When the setting contents are appropriately entered into the details setting region 180, the object 201 satisfies the condition.
Now back to FIG. 9, after the controller 10 displays the details setting screen 151 (step S62), the controller 10 acquires data entered into the details setting region 180 (step S63). The controller 10 acquires the setting contents from the condition definition information 24 for the object in question on the details setting region 180 (step S64). The controller 10 determines whether the object in question on the details setting region 180 satisfies the condition stored in the condition definition information 24 (step S65).
When the controller 10 determines that the condition has not yet been satisfied (unsatisfied) (step S65; YES), the controller 10 adds and displays a notification icon on the object in question on the process flow display region 160 (step S66). When a notification icon is already displayed, the notification icon is kept displayed.
FIG. 11 is a view illustrating a display example of the PC 1, and illustrates a configuration example of the details setting screen 151.
In the example in FIG. 11, a configuration of a screen corresponding to the object 202 that is the second object in the process flow 161 is displayed on the user view 171. Specifically, the user view 171 is arranged with a reading frame 175, in addition to the process display section 172 and the operation guide display section 173. The reading frame 175 guides a position of a 2D code when the 2D code is to be read with the camera 57.
When the work terminal 5 executes the work block associated with the object 202, the display 54 displays an image captured with the camera 57. The reading frame 175 guides, to an operator, a position of the work terminal 5 and a position of the 2D code for an adjustment so that the 2D code to be read falls within the reading frame 175.
In the details setting screen 151 in FIG. 11, the details setting region 180 is arranged with an entry box 186 used to enter information corresponding to the object 202. A reference numeral 187 denotes a title of the data entered into the entry box 186.
The condition definition information 24 (FIG. 6) includes conditions, such as the number of digits and the data format of data expected to be entered, for a work block in which a 2D code is to be read. The entry box 186 is to be entered with an expected value of the data of a 2D code to be read with respect to the object 202. The work terminal 5 can determine whether a 2D code has been read successfully based on whether the data entered into the entry box 186 and the data of the 2D code detected from an image captured by the camera 57 match each other. In the example in FIG. 11, a piece of data or a plurality of pieces of data may be entered into the entry box 186.
The controller 10 determines whether the object 202 satisfies the condition based on whether data conforming to the condition definition information 24 has been entered into the entry box 186. In the example in FIG. 11, a numerical value is entered into the entry box 186. When the condition has been defined in the condition definition information 24 so that a URL is to be entered as input data for a work block in which a 2D code is to be read, the value entered into the entry box 186 in FIG. 11 does not satisfy the condition. As a result, a display aspect of the details setting region 180 is changed so as to notify that the data entered into the entry box 186 does not satisfy the condition.
Specifically, the title 187 is displayed in a bold font. A method of changing a display aspect is not limited to the method used in the example in FIG. 11. A method of changing a display color of the title 187 or a display color of the data entered into the entry box 186 from a default display color to a desired display color may be adopted. A method of changing a background color of the title 187 and the entry box 186 from a default display color to a desired display color or a method of displaying the notification icon 125 at a position adjacent to the entry box 186 may be adopted. An error message may be displayed. When the data entered into the entry box 186 is determined as unsatisfied, the process flow display region 160 displays the notification icon 125 on the object 202.
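The check behind FIG. 11 can be sketched as below, assuming hypothetical format names in the condition definition information 24; a numerical value entered where a URL is required does not satisfy the condition, so the title 187 would be emphasized and the notification icon 125 displayed.

```python
import re

def entry_satisfies_condition(value, expected_format):
    """Sketch only: the entered expected value for a 2D code must conform
    to the data format defined for the work block."""
    if expected_format == "numeric":
        return value.isdigit()
    if expected_format == "url":
        return re.match(r"https?://\S+$", value) is not None
    if expected_format == "text":
        return len(value) > 0
    return False

print(entry_satisfies_condition("12345", "url"))  # False: condition unsatisfied
```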
As described above, even while information about an object is entered or edited on the details setting screen 151, the fact that the object does not satisfy a condition can be notified to the user.
Now back to FIG. 9, after a notification icon is added on an object determined as unsatisfied (step S66), the controller 10 determines whether the details setting process is to be ended (step S67). In step S67, a determination is made in accordance with whether an operation is made on the instruction section 151a, for example. When an object is determined as not unsatisfied (satisfied) (step S65; NO), the controller 10 causes the notification icon already displayed to disappear, and performs step S67.
When the details setting process is to be ended (step S67; YES), the controller 10 updates the process flow data 25 so as to include the data entered in the details setting process, or causes the storage unit 20 or the RAM (not shown) to temporarily store the entered data (step S68). The controller 10 causes the display unit 31 to switch from the details setting screen 151 and to display the editing screen 101, and then returns to the process in FIG. 7.
When the details setting process is not to be ended (step S67; NO), the controller 10 returns to step S62.
In the exemplary embodiment, the details setting process in FIG. 9 is executed for each of the objects arranged on the editing region 120. Once the details setting process is executed for an object selected on the editing region 120, and the details setting process is then to be performed for another object, a process of first returning to the editing screen 101 is performed. The configuration can be changed as desired. For example, objects configuring the process flow 161 displayed on the details setting screen 151 may be sequentially selected so that the details setting process can be sequentially executed.
FIG. 12 is a view illustrating a display example of the PC 1, and illustrates a configuration example of the editing screen 101. FIG. 12 illustrates a state where a process flow has been created on the editing region 120.
In the state in FIG. 12, the editing region 120 of the editing screen 101 is arranged with the objects 201, 204, 202, 203, and 207 in descending order. The object 207 represents an end of a process flow, indicating that the process flow has been completed.
In the example in FIG. 12, the objects 204, 202, and 203 on the editing region 120 are branch objects. A branch flow icon 221 is displayed in line with the coupling point on the right of the object 204. The branch flow icon 221 illustrates another flow as a destination that is branched from the coupling point on the right and that corresponds to a negative determination. When the branch flow icon 221 is selected through an operation by a user, for example, the branch flow is displayed on the editing region 120. The object 202 and the object 203 are each set with the object 207 as a destination corresponding to a negative determination. In the example in FIG. 12, the branch guide 202b and a branch guide 203b are respectively displayed at the coupling points corresponding to the negative determinations to be made in the objects 202 and 203. However, branch flow icons may be added to and displayed for the objects 202 and 203.
FIG. 13 is a flowchart illustrating an operation of the work terminal 5 based on a process flow created with the PC 1, and illustrates an operation to be executed by the work terminal 5 in accordance with the process flow data 52c. The operation in FIG. 13 corresponds to the process flow created on the editing region 120 in FIG. 12.
The controller 51 of the work terminal 5 causes the display unit 53 to display an image and/or text of a work procedure on the display 54 (step SA11). Step SA11 corresponds to the object 201. When the controller 51 detects an entry by an operator through the input unit 55 and the like, the controller 51 performs step SA12.
In step SA12, the controller 51 executes a positive/negative determination (step SA12). Step SA12 corresponds to the object 204. The controller 51 detects an entry of the operator for positive or negative. When positive is determined (step SA12; positive), the controller 51 makes a positive determination, and performs step SA13. In step SA13, the controller 51 reads a 2D code (step SA13). Step SA13 corresponds to the object 202. When, in step SA12, negative is determined (step SA12; negative), the controller 51 makes a negative determination, and performs step SA15, described later.
When, in step SA13, a 2D code that is expected to be read and that is set in line with the object 202 is read (step SA13; success), the controller 51 makes a positive determination, and performs step SA14. When the 2D code that is expected to be read is not read (step SA13; negative), the controller 51 makes a negative determination, and performs step SA15, described later.
In step SA14, the controller 51 accepts an entry of text (step SA14). Step SA14 corresponds to the object 203. When, in step SA14, text is entered within the maximum number of characters set in line with the object 203 (step SA14; entered), the controller 51 makes a positive determination, and performs step SA15. When text has not been entered within the maximum number of characters (step SA14; negative), the controller 51 makes a negative determination, and performs step SA15.
In step SA15, the controller 51 ends the process flow.
The flowchart illustrated in FIG. 13 corresponds to the process flow edited on the editing screen 101, and reflects the order of execution of the objects, the branches, and the transitions along the branches on the editing region 120. The steps of the flowchart in FIG. 13 respectively correspond to the work blocks respectively corresponding to the objects 201 to 207, and include the setting contents set on the details setting screen 151.
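How the work terminal 5 might walk such a flow can be sketched as follows, under the assumption that each work block records a next block for a positive result and for a negative result; the callback, identifiers, and field names are hypothetical.

```python
def run_process_flow(blocks, execute_block):
    """Sketch only: execute blocks in order, following the branch that
    matches the positive or negative result, as in FIG. 13."""
    current = blocks["start"]
    while current is not None:
        block = blocks[current]
        result = execute_block(block)  # returns True (positive) or False
        current = block["next_positive"] if result else block["next_negative"]

# Hypothetical flow corresponding to FIG. 13 (identifiers are illustrative).
blocks = {
    "start": "201",
    "201": {"type": "work_procedure", "next_positive": "204", "next_negative": "204"},
    "204": {"type": "judgement",      "next_positive": "202", "next_negative": "207"},
    "202": {"type": "read_2d_code",   "next_positive": "203", "next_negative": "207"},
    "203": {"type": "text_entry",     "next_positive": "207", "next_negative": "207"},
    "207": {"type": "end_of_flow",    "next_positive": None,  "next_negative": None},
}
```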
As described above, a user who operates the PC 1 can cause the controller 10 to execute the process flow definition tool 22 in accordance with the display-related data 23 and the condition definition information 24 to easily create a process flow satisfying the defined conditions.
The condition definition information 24 defining conditions relating to work blocks in a process flow can be edited through operations by a user.
FIG. 14 is a flowchart illustrating an operation of the PC 1, and illustrates a condition setting process configured to create, edit, or update the condition definition information 24.
The process flow definition tool 22 may be an application program having a function of performing the condition setting process. In this case, the controller 10 is capable of executing the process flow definition tool 22 to execute the condition setting process in FIG. 14.
In accordance with an operation by a user, the controller 10 specifies a work block to be edited in the condition definition information 24 (step S81), and acquires a content entered by the user as a setting content for the specified work block (step S82).
The controller 10 determines whether the condition setting process is to be ended based on an operation by the user (step S83). When the condition setting process is not to be ended (step S83; NO), the controller 10 returns to step S81, where setting contents for other work blocks are to be entered and the like. When the condition setting process is to be ended (step S83; YES), the controller 10 updates the condition definition information 24 based on the contents acquired in step S82 (step S84), and then ends the process.
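The condition setting process in FIG. 14 might amount to updating the stored definition with the contents entered for each specified work block, as in the following sketch; the mapping passed in is a hypothetical stand-in for the entries acquired in steps S81 and S82.

```python
def condition_setting_process(condition_definition, entered_contents):
    """Sketch of FIG. 14: merge the setting contents entered for each
    specified work block into the condition definition (step S84)."""
    for block_type, new_settings in entered_contents.items():
        condition_definition.setdefault(block_type, {}).update(new_settings)
    return condition_definition
```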
As described above, the PC 1 according to the exemplary embodiment to which the disclosure is applied creates a process flow (operation flow) including a plurality of work blocks (operation steps). The process flow is specified with an order of execution of the plurality of work blocks. The PC 1 includes the input unit 33 configured to accept entries, the display unit 31 configured to cause the display 32 to perform displaying, and the controller 10. The controller 10 causes the display 32 to display the editing region 120 as the work region, and arranges objects indicative of work blocks onto the editing region 120 in accordance with entries accepted by the input unit 33. The controller 10 creates a process flow based on the arrangement of the objects on the editing region 120. The controller 10 compares the objects arranged on the editing region 120 with a condition set with respect to the work blocks corresponding to the objects, and determines at least either of acceptable and unacceptable. The controller 10 adds, when unacceptable is determined, a display indicative of condition unacceptability onto each of corresponding ones of the objects, and continues, after the display indicative of condition unacceptability is added onto each of the corresponding ones of the objects, arranging the objects onto the editing region 120 in accordance with the entries.
The program according to the disclosure corresponds to the process flow definition tool 22, and, when the controller 10 executes the process flow definition tool 22, achieves the information processing device and the information processing method according to the disclosure.
With the PC 1, while a user makes entries to arrange objects onto the editing region 120, if one of the objects being arranged is found unacceptable with respect to the condition set for the work blocks, the unacceptability can be notified with the display. By making entries that cause the display indicative of condition unacceptability to disappear, the user easily creates or edits a process flow satisfying the condition. Even when unacceptable is determined, the user can continue arranging objects. The user can make an entry to resolve the unacceptability at a desired timing. Accordingly, convenience in creating or editing a process flow is improved.
The controller 10 creates the process flow that includes the work blocks respectively corresponding to the objects arranged on the editing region 120, and that is specified with the order of execution of the work blocks in accordance with an order of the arrangement of the objects. With the configuration, the user makes entries, arranges objects onto the editing region 120, and creates a process flow corresponding to the order of the arrangement of the objects. By arranging objects, the user easily creates or edits a process flow.
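A minimal sketch of deriving the order of execution from the arrangement, assuming each object carries hypothetical grid coordinates on the editing region 120, is:

```python
def order_of_execution(objects):
    """Sketch only: order objects top to bottom, then left to right,
    matching how the order of execution is specified on the editing region."""
    return [o["id"] for o in sorted(objects, key=lambda o: (o["row"], o["col"]))]
```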
The controller 10 adds information or an attribute to the objects arranged on the editing region 120 in accordance with entries. When one of the objects arranged on the editing region 120 is not added with the information or the attribute of a set type, the controller 10 determines the one of the objects as condition unacceptable. With the configuration, when condition unacceptable is determined for the process of adding information or an attribute to objects, the user can be notified of the unacceptability. Accordingly, the user is supported in the process of adding information or an attribute to objects. Therefore, the user easily creates or edits an appropriate process flow.
A work block to be processed with the PC 1 is a process configured to perform at least one of outputting of information, entering of information, and making of a determination, each executed by a computer (e.g., the work terminal 5). The controller 10 causes icons, which are indicative of the processing of the work blocks corresponding to the objects, to be associated with the objects arranged on the editing region 120, and to be displayed. By displaying icons on the objects on the editing region 120, a user can easily recognize the computer processing represented by the work blocks. The user can thus easily create or edit a process flow.
The controller 10 causes the display 32 to display the details setting region 180 (FIG. 11) configured to accept entries of information about work blocks. When the controller 10 determines a work block corresponding to one of the objects arranged on the editing region 120 as condition unacceptable, the controller 10 causes the details setting region 180 to perform at least either of providing the display indicative of condition unacceptability and changing a display aspect. While the details setting region 180 is displayed, when an object is determined as condition unacceptable, the details setting region 180 can provide the display to notify the condition unacceptability. The user can easily recognize acceptability of a work block based on the display. The user can further take appropriate actions, such as making a re-entry to resolve the condition unacceptability.
A work block to be processed with the PC 1 is a process configured to perform at least one of outputting of information, entering of information, and making of a determination, each executed by a computer (e.g., the work terminal 5). The controller 10 causes icons, which are indicative of the processing of the work blocks corresponding to the objects, to be associated with the objects arranged on the editing region 120, and to be displayed. The controller 10 compares the objects arranged on the editing region 120 with a condition set in association with the process represented by the work block corresponding to an object, and determines at least either of acceptable and unacceptable. With the configuration, icons are displayed on the objects on the editing region 120. When one of the objects is determined as condition unacceptable, a display indicative of unacceptability is provided. The user thus easily recognizes the process represented by a work block and its acceptability based on the display. By taking into account the contents of work blocks and the condition set for the work blocks, the user creates or edits an appropriate process flow.
The controller 10 outputs the output data 27 representing process flow data used by the computer (e.g., the work terminal 5) to execute a process flow created based on the arrangement of the objects on the editing region 120. The controller 10 adds information about the objects each added with the display indicative of condition unacceptability to the output data 27, and outputs the output data 27 as data described in a format specified beforehand. In accordance with the objects arranged on the editing region 120, a process flow to be executed by the computer can be easily created, and information indicative of condition unacceptability can be added to the data of the process flow. The created process flow can be processed and output as data by another computer. The data allows the computer to detect condition unacceptability.
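As an illustration of such output, and assuming JSON as one example of a format specified beforehand, the output data 27 could mark every work block whose condition is unsatisfied:

```python
import json

def export_output_data(objects, condition_definition):
    """Sketch only: serialize the process flow and flag each work block
    whose required setting contents are not all entered."""
    steps = []
    for obj in objects:
        required = condition_definition[obj["type"]]["required_settings"]
        unsatisfied = any(k not in obj.get("settings", {}) for k in required)
        steps.append({"id": obj["id"], "type": obj["type"],
                      "condition_unsatisfied": unsatisfied})
    return json.dumps({"steps": steps}, indent=2)
```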
The controller 10 executes, in accordance with an entry accepted by the input unit 33, a condition setting process configured to set a condition relating to the work blocks (e.g., FIG. 14). The controller 10 compares the work blocks corresponding to the objects arranged on the editing region 120 with the condition set in the condition setting process, and determines at least either of acceptable and unacceptable. After a condition is set for a work block, when the set condition is not satisfied, the display indicative of unacceptability is provided. Thus, a user sets a detailed condition relating to a process flow, as well as easily creates or edits the process flow satisfying the set condition.
The PC 1 includes the storage unit 20, and the controller 10 causes the storage unit 20 to store the condition definition information indicative of the condition set in the condition setting process. Once a condition is set, the condition definition information allows the condition to be used continuously in creating or editing a process flow.
The disclosure is not limited to the exemplary embodiment configured as described above. The disclosure can be implemented in various aspects, as long as the aspects fall within the scope of the disclosure.
For example, in the above described exemplary embodiment, the PC 1 has been described as an information processing device configured to create a process flow as an operation flow. However, specific aspects of information processing devices may be selected as desired. For example, a server device communicably coupled with a plurality of the work terminals 5 may be used as an information processing device.
In the above described exemplary embodiment, the configuration has been exemplified, in which the PC 1 switches and displays the editing screen 101 and the details setting screen 151 arranged with a plurality of display regions, for example. However, the screens may be configured to be displayed simultaneously.
In the above described exemplary embodiment, the configuration has been described as an example, in which the controller 10 included in the PC 1 executes the process flow definition tool 22 representing a program stored in the storage unit 20 included in the PC 1. The disclosure is not limited to the configuration. For example, such a configuration may be adopted that a program to be executed by the controller 10 is stored in an external device attached to the PC 1, and the controller 10 acquires and executes the program via the I/F unit 36 or the communication unit 37. Similarly, in the work terminal 5, the process flow data 52c may be acquired via the I/F unit 63, the communication unit 64, or the wireless communication I/F unit 65, and may be executed by the controller 51.
Such a configuration may be adopted that at least some of the function blocks illustrated in FIGS. 1 and 2 are achieved with hardware, or achieved with a combination of hardware and software. For example, devices to which the disclosure is applied are not limited to configurations in which separate hardware resources are arranged as illustrated in FIGS. 1 and 2.
Other detailed configurations may obviously be modified as desired.
The present application is based on and claims priority from JP Application Serial Number 2017-175145, filed Sep. 12, 2017, the disclosure of which is hereby incorporated by reference herein in its entirety.