TECHNICAL FIELD
This disclosure relates generally to a simulation system and, more particularly, to a system that uses real-time performance data to remotely simulate operation of a machine at a worksite.
BACKGROUND
Machines such as, for example, excavators, loaders, dozers, motor graders, haul trucks, and other types of heavy equipment are used to perform a variety of tasks. During the performance of these tasks, the machines may operate in situations that are hazardous to an operator, under extreme environmental conditions uncomfortable for the operator, or at work locations remote from civilization. Because of these factors, the completion of some tasks by an onboard operator can be dangerous, expensive, labor intensive, time consuming, and inefficient.
One solution to this problem may include remotely controlling the machines. Specifically, an offboard operator located remotely from the machine, if provided with a visual representation of the machine and the work environment, could control operation of the machine from a more suitable location. This strategy has been implemented in the past and generally included providing the visual representation of the machine and work environment by way of live video feed broadcast from the worksite to the operator. The operator was then able to provide, via a graphical user interface, operational instructions that were subsequently sent to the machine for control thereof.
Although this strategy of remotely controlling the machines may have been successful in some situations, its use was limited and costly. Specifically, the visual representation of the machine and environment was typically limited to the number of cameras mounted to the machine and the view angles provided by those cameras. To improve visibility or provide different view angles, additional cameras had to be installed on the machine. Because the number of cameras on the machine related directly to cost and, because the harsh environment of the worksite reduced the component life of the cameras, the initial and operating costs of the system were significant. In addition, wireless video feed in real-time required large bandwidth, thereby further increasing the operating cost of the system.
An attempt at addressing the problems of high system cost and large bandwidth is described in U.S. Pat. No. 6,739,078 (the '078 patent) issued to Morley et al. on May 25, 2004. Specifically, the '078 patent describes a system utilized to remotely control construction equipment such as a backhoe at an isolated location via a data network, in which a user provides movement instructions via a graphical user interface (GUI) at a user PC. The GUI displays a side view and a top view visual representation of the movable elements of the backhoe (e.g., a boom, a stick, and a bucket). The visual representation is generated in response to movements of the boom, stick, and bucket of the backhoe, which are measured onboard the backhoe and transmitted to the user PC via radio frequencies. In this manner, an operator may remotely control hydraulic actuators onboard the backhoe to move the boom, stick, and bucket and, at the same time, view the resulting motions at a distant location in a cost effective manner.
Although the system of the '078 patent may provide a lower cost, more robust way to remotely view and control motions of construction equipment, its use may still be limited. In particular, because the system of the '078 patent provides a visual representation of only the boom, stick, and bucket, the operator may be unable to properly control engagement of the backhoe with its surrounding environment. Specifically, without a representation of the worksite or an excavation surface at the worksite, it may be very difficult, if not impossible, to adequately engage the bucket with the excavation surface. In addition, with the minimal visual representation described above, the operator may be unable to remotely move or orient the backhoe itself, or perform other necessary machine tasks.
The system of the present disclosure is directed towards overcoming one or more of the problems as set forth above.
SUMMARY OF THE INVENTION
In accordance with one aspect, the present disclosure is directed toward a machine simulation and control system. The machine simulation and control system may include a user interface configured to display a simulated environment. The machine simulation and control system may also include a controller in communication with the user interface and a remotely located machine. The controller may be configured to receive from the machine real-time information related to operation of the machine at a worksite. The controller may also be configured to simulate the worksite, operation of the machine, and movement of a machine tool based on the received information. The controller may further be configured to provide to the user interface the simulated worksite, operation, and movement in the simulated environment.
According to another aspect, the present disclosure is directed toward a method of remotely controlling a machine. The method may include monitoring machine operation and simulating in real-time a worksite, machine movement within the worksite, and machine tool movement within the worksite based on the monitored machine operation. The method may further include receiving machine control instructions at a location remote from the machine, and affecting operation of the machine in response to the received instructions.
BRIEF DESCRIPTION OF THE DRAWINGS
FIG. 1 is a pictorial illustration of an exemplary disclosed machine traveling about a worksite;
FIG. 2 is a schematic and diagrammatic illustration of an exemplary disclosed simulation and control system for use with the machine of FIG. 1;
FIG. 3 is a pictorial illustration of an exemplary disclosed graphical user interface for use with the system of FIG. 2; and
FIG. 4 is another pictorial illustration of the exemplary disclosed graphical user interface for use with the system of FIG. 2.
DETAILED DESCRIPTION
FIG. 1 illustrates an exemplary machine 10 performing a predetermined function at a worksite 12. Machine 10 may embody a stationary or mobile machine, with the predetermined function being associated with a particular industry such as mining, construction, farming, transportation, power generation, or any other industry known in the art. For example, machine 10 may be an earth moving machine such as the excavator depicted in FIG. 1, in which the predetermined function includes the removal of earthen material from worksite 12 that alters the geography of worksite 12 to an architecturally desired form. Machine 10 may alternatively embody a different earth moving machine such as a motor grader or a wheel loader, or a non-earth moving machine such as a passenger vehicle, a stationary generator set, or a pumping mechanism. Machine 10 may embody any suitable operation-performing machine.
As illustrated in FIG. 2, machine 10 may include a simulation system 14 having multiple components that interact to monitor the operation of machine 10 and perform analysis in response thereto. In particular, machine 10 may include a data module 16 in communication with a controller 18. It is contemplated that data module 16 and controller 18 may be integrated in a single unit, if desired. It is further contemplated that simulation system 14 may include additional or different components than those illustrated in FIG. 2.
Data module 16 may include a plurality of sensing devices 16a-f distributed throughout machine 10 to gather real-time data from various components and systems of machine 10. Sensing devices 16a-f may be associated with, for example, a work tool 20, a power source 22, a transmission device 24, one or more actuator devices 26, a position locating device 28, driven and/or steerable traction devices 30, a torque converter (not shown), a fluid supply (not shown), operator input devices (not shown), and/or other systems and components of machine 10. These sensing devices 16a-f may automatically gather real-time data from machine 10, such as manipulation of tool 20, operation of power source 22, and/or machine travel characteristics (e.g., speed, torque, track slip rate, etc.); orientation and position of machine 10; fluid pressure, flow rate, temperature, contamination level, and/or viscosity; electric current and/or voltage levels; fluid (i.e., fuel, oil, water, etc.) consumption rates; loading levels (e.g., payload value, percent of maximum allowable payload limit, payload history, payload distribution, etc.); transmission output ratio; cycle time; idle time; grade; recently performed maintenance and/or repair operations; and other such pieces of information. Additional information may be generated or maintained by data module 16 such as the date, time of day, and operator information. The gathered data may be indexed relative to the time, day, date, operator, or other pieces of information, and communicated to controller 18 to trend the various operational aspects of machine 10, if desired.
For example, a first sensing device 16a may be associated with position locating device 28 to gather real-time machine position, orientation (a direction machine 10 is facing), and/or ground speed information. In one aspect, locating device 28 may include a global positioning system (GPS) comprising one or more GPS antennae disposed at one or more locations on machine 10 (e.g., on work tool 20, the front and/or rear of machine 10, etc.). The GPS antennae may receive one or more signals from one or more satellites. Based on the trajectories of the one or more signals, locating device 28 may determine a global position and orientation of machine 10 in coordinates with respect to worksite 12. Further, by repeatedly sampling machine positions, locating device 28 may determine a real-time machine ground speed based on distances between samples and time indices associated therewith, or a time between samples. Alternatively, machine position, orientation, and/or ground speed information may similarly be determined in site coordinates with respect to a ground-based location. It is to be appreciated that other positioning methods known in the art may be used alternatively or additionally.
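The ground-speed determination described above — dividing the distance between successive position samples by the time between them — can be sketched as follows. This is an illustrative example only; the function and parameter names are hypothetical and not part of the disclosure.

```python
import math

def ground_speed(p1, p2, t1, t2):
    """Estimate ground speed (m/s) from two timestamped position
    samples given in flat site coordinates (x, y) in meters."""
    dx = p2[0] - p1[0]
    dy = p2[1] - p1[1]
    dt = t2 - t1
    if dt <= 0:
        raise ValueError("samples must be in increasing time order")
    # Planar distance between samples divided by elapsed time.
    return math.hypot(dx, dy) / dt
```

With site coordinates in meters and timestamps in seconds, two samples taken 5 m and 2 s apart yield a ground speed of 2.5 m/s.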
In a further aspect, sensing device 16a may gather pitch and roll data in order to determine a real-time inclination of machine 10 with respect to the surface of worksite 12. For example, if locating device 28 includes three (or more) GPS receivers disposed about machine 10 as discussed above, pitch and roll angles of machine 10 may be determined by comparing an orientation of a surface defined by the respective positions of the three (or more) receivers relative to a gravity vector and/or horizontal ground. Alternatively, sensing device 16a may be associated with conventional pitch and roll inclination electronics disposed on machine 10. The electronics may include, for example, electrodes disposed within a glass vial and submerged in an electrically conductive fluid, such that as machine inclination changes, submersion depths of the electrodes also change, and electrical resistances of paths between electrodes may change accordingly. As such, the pitch and roll of machine 10 may be defined in terms of the measured resistances. It is to be appreciated that other pitch and roll and/or inclination sensors known in the art may be used alternatively or additionally.
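The antenna-based inclination measurement can be illustrated with a small sketch: three antenna positions define two in-plane vectors on the machine, and pitch and roll follow from the inclination of those vectors relative to horizontal. The axis conventions and names below are assumptions made for illustration, not specified by the disclosure.

```python
import math

def pitch_and_roll(front, rear, side):
    """Estimate pitch and roll (degrees) from three antenna positions
    (x, y, z) in site coordinates, assuming `front` and `rear` lie on
    the machine's longitudinal axis and `side` is offset laterally."""
    # Two in-plane vectors spanning the surface defined by the antennae.
    fwd = tuple(f - r for f, r in zip(front, rear))   # forward direction
    lat = tuple(s - r for s, r in zip(side, rear))    # lateral direction
    # Pitch: rise of the forward vector over its horizontal extent.
    pitch = math.degrees(math.atan2(fwd[2], math.hypot(fwd[0], fwd[1])))
    # Roll: rise of the lateral vector over its horizontal extent.
    roll = math.degrees(math.atan2(lat[2], math.hypot(lat[0], lat[1])))
    return pitch, roll
```

For example, a front antenna raised 1 m over a 1 m horizontal run relative to the rear antenna indicates a 45-degree pitch.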
A second sensing device 16b, for example, may be associated with traction devices 30 to gather real-time speed and/or velocity data thereof. For example, sensing device 16b may be able to determine a real-time rotational speed of traction devices 30. It is to be appreciated that a track slip rate of traction devices 30 (i.e., a rate at which traction devices 30 are spinning in place) may be indicated by a detected difference between machine ground speed, as discussed above, and traction device speed. Alternatively, track slip rate may be indicated by a sudden increase in the speed of one or more of traction devices 30 detected by sensing device 16b.
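The slip determination described above — comparing a track's surface speed against the GPS-derived ground speed — reduces to a simple difference. A minimal sketch with hypothetical names:

```python
def track_slip_rate(track_speed, ground_speed):
    """Slip rate as the amount by which a track's surface speed
    exceeds the machine's actual ground speed (same units);
    clamped at zero when the track is not spinning in place."""
    return max(0.0, track_speed - ground_speed)
```

A track turning at 3.2 mph while the machine travels at 3.0 mph is slipping at 0.2 mph, matching the indicator example given later in the disclosure.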
In another aspect, sensing device 16b may gather real-time steering command information. For example, in a case where traction devices 30 comprise driven, non-steerable belts or tracks, a measured difference between rotational speeds thereof may indicate a corresponding turning rate and direction negotiated by machine 10. In another aspect, wherein traction devices 30 comprise steerable wheels, or the like, sensing device 16b may simply measure a current steering angle thereof.
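For the non-steerable track case, the turning rate implied by a speed differential can be sketched with a simple skid-steer model; the sign convention and names are illustrative assumptions.

```python
def turn_rate(left_speed, right_speed, track_spacing):
    """Approximate yaw rate (rad/s, positive for a left turn) of a
    skid-steer tracked machine from left/right track speeds (m/s)
    and the lateral spacing between track centerlines (m)."""
    return (right_speed - left_speed) / track_spacing
```

Equal track speeds give a zero turn rate (straight travel); a faster right track turns the machine left at a rate proportional to the speed difference.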
A third sensing device 16c, for example, may be associated with transmission device 24 to gather real-time data concerning a present transmission output (e.g., gear) utilized by machine 10. Additionally, sensing device 16c may gather real-time data concerning a torque output of transmission device 24. A fourth sensing device 16d may be associated with power source 22 in order to gather information regarding a speed output (RPM) and/or a torque output thereof.
A fifth sensing device 16e may be associated with hydraulic devices 26 to gather real-time data related to positioning of a linkage system 32 and/or tool 20. For example, actuator devices 26 may comprise hydraulic cylinders extendable throughout a range between a minimum length and a maximum length. In conjunction with known kinematics and geometry of linkage system 32 and/or tool 20, a three-dimensional position and orientation thereof, in site coordinates, may be determined based on sensed extension lengths of hydraulic devices 26.
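Once sensed cylinder extensions have been mapped to joint angles through the known linkage geometry, the tool position follows from forward kinematics. A simplified planar two-link sketch is shown below; the actual linkage geometry is not specified by the disclosure, and the dimensions and names are hypothetical.

```python
import math

def tool_position(boom_len, stick_len, boom_angle, stick_angle):
    """Planar forward kinematics for a boom/stick pair: (x, z)
    position of the stick tip relative to the boom pivot. Angles
    are absolute, in radians, measured from horizontal."""
    x = boom_len * math.cos(boom_angle) + stick_len * math.cos(stick_angle)
    z = boom_len * math.sin(boom_angle) + stick_len * math.sin(stick_angle)
    return x, z
```

With both links horizontal, a 5 m boom and 3 m stick place the tip 8 m forward of the boom pivot at the pivot's height.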
A sixth sensing device 16f, for example, may be associated with tool 20 to gather real-time data concerning a load applied thereto. The load may be represented as a force, weight, volume, and/or mass of material engaged or supported by tool 20. Additionally, the load may be determined as a percentage of a maximum capacity load (i.e., a full load) that may be engaged or supported by tool 20. The maximum capacity load may be based on known specifications of linkage system 32, tool 20, and/or other components of machine 10. For example, if tool 20 comprises a bucket, device 16f may include a scale mechanism that may directly determine a force, weight, volume, and/or mass of the material therein. Alternatively, device 16f may comprise one or more optical sensors disposed about an inner engagement surface of tool 20 to sense a capacity to which tool 20 is filled with the material. Based on known specifications, a volume of material engaged by tool 20 may be determined. In another aspect, sensing device 16f may measure a force exerted by hydraulic devices 26 to maintain tool 20 in a desired position. As such, the measured force, in conjunction with known torque relationships between linkage system 32 and tool 20, and other specifications of machine 10, may allow determination of the force, weight, mass, volume, and/or percent capacity of the load. It is to be appreciated that other methods of load sensing known in the art may be used alternatively or additionally.
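The force-based load determination — relating the holding force of a cylinder to the supported load through a torque balance about the linkage pivot — can be sketched as follows. The moment arms and names are illustrative assumptions, not values from the disclosure.

```python
def payload_mass(cylinder_force, cyl_moment_arm, load_moment_arm, g=9.81):
    """Estimate payload mass (kg) from the force (N) a lift cylinder
    exerts to hold the tool static: the cylinder's torque about the
    pivot balances the torque of the load's weight."""
    # Torque balance: cylinder_force * cyl_arm = load_weight * load_arm.
    load_weight = cylinder_force * cyl_moment_arm / load_moment_arm
    return load_weight / g
```

For instance, a 9810 N holding force acting through equal moment arms corresponds to a payload of roughly 1000 kg.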
Controller 18 may be in communication with data module 16 and include any means for monitoring, recording, storing, indexing, processing, and/or communicating the real-time data concerning operational aspects of machine 10 described above. These means may include components such as, for example, a memory, one or more data storage devices, a central processing unit, or any other components that may be used to run an application. Furthermore, although aspects of the present disclosure may be described generally as being stored in memory, one skilled in the art will appreciate that these aspects may be stored on or read from different types of computer program products or computer-readable media such as computer chips and secondary storage devices, including hard disks, floppy disks, flash drives, optical media, CD-ROM, or other forms of RAM or ROM.
Controller 18 may further include a means for communicating with an offboard, remotely-located user interface 34. For example, controller 18 may include hardware and/or software that enables transmitting and receiving of the data through a direct data link (not shown) or a wireless communication link (not shown). The wireless communications may include satellite, cellular, infrared, radio, microwave, or any other type of wireless electromagnetic communications that enable controller 18 to exchange information. It is contemplated that a separate module may alternatively be included within simulation system 14 to facilitate the communication of data between controller 18 and user interface 34, if desired. In one aspect, controller 18 may communicate the data to a base station 36 equipped to relay the communications to user interface 34. Other simulation-capable machines associated with worksite 12 may also similarly communicate data to base station 36. Subsequently, the data may be communicated to an intermediary, such as a server (not shown), which may appropriately package and transmit the received data to user interface 34 for simulation.
User interface 34 may represent one or more receiving, computing, and/or display systems of a business entity associated with machine 10, such as a manufacturer, dealer, retailer, owner, service provider, client, or any other entity that generates, maintains, sends, and/or receives information associated with machine 10. The one or more computing systems may embody, for example, a machine simulator, a mainframe, a work station, a laptop, a personal digital assistant, and other computing systems known in the art. Interface 34 may include components such as, for example, a memory, one or more data storage devices, a controller 38 (CPU), or any other components that may be used to run an application. In one aspect, interface 34 may include a firewall and/or require user authentication, such as a username and password, in order to prevent access thereto by unauthorized entities.
User interface 34 may be operatively coupled to communicate with a worksite terrain map 42. Terrain map 42 may include work surface data defining ground elevation, earthen material composition, and/or consistency at a plurality of locations at worksite 12 defined in site coordinates. Additionally, terrain map 42 may include the location, size, shape, composition, and/or consistency of above- or below-ground obstacles at, or in the proximity of, worksite 12, such as, for example, roads, utility lines, storage tanks, buildings, property boundaries, trees, bodies of water, and/or other obstacles. In one aspect, terrain map 42 may be a predetermined schematic CAD rendering or the like. In another aspect, terrain map 42 may be generated by geographic sensing equipment (not shown), such as, for example, a ground-penetrating radar (GPR) system associated with machine 10 and/or worksite 12, and/or satellite imagery equipment known in the art. It is to be appreciated that terrain map 42 may include work surface data concerning a plurality of predetermined worksites that may or may not be related to worksite 12.
Terrain map 42 may be stored within the memory, data storage devices, and/or central processing unit of controller 18 and communicated to user interface 34 in conjunction with the gathered real-time information. Alternatively, terrain map 42 may be stored within the memory, data storage devices, and/or controller 38 of user interface 34. In another aspect, terrain map 42 may be stored in a separate location and communicated to user interface 34. Further, terrain map 42 may comprise a database compatible with the real-time information gathered by data module 16. In one aspect, controller 18 may update terrain map 42 based on the received real-time data to reflect changes effected upon worksite 12 as a result of machine position during travel, and/or tool movement and loading sensed during excavation maneuvers. This feature will be discussed further in the next section to illustrate use of the disclosed simulation system 14.
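Treating the terrain map as a gridded elevation model, the update step described above reduces to lowering the elevation at the location of each excavation pass by the removed depth. A minimal sketch assuming a dictionary-backed grid; the disclosure does not prescribe any particular map representation.

```python
def update_terrain(elevations, cell, removed_volume, cell_area):
    """Return a copy of a gridded elevation map (cell -> elevation, m)
    with one cell lowered by the depth corresponding to an excavated
    volume (m^3) spread over the cell's area (m^2)."""
    updated = dict(elevations)
    updated[cell] = updated.get(cell, 0.0) - removed_volume / cell_area
    return updated
```

Removing 2 m^3 of material from a 4 m^2 cell lowers that cell's elevation by 0.5 m; the updated map can then be re-rendered and, as described above, communicated back to user interface 34.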
User interface 34 may further include one or more monitors 44 configured to actively and responsively display a simulated environment of machine 10 on worksite 12, as well as parameters indicating machine performance and functionality, in response to the received real-time data and terrain map 42. Monitor 44 may include, for example, a liquid crystal display (LCD), a CRT, a PDA, a plasma display, a touch-screen, a portable hand-held device, or any such display device known in the art. In one aspect, monitors 44 may comprise a full 360-degree display encompassing the operator for augmented, realistic display of the simulated worksite 12.
As illustrated in FIGS. 3 and 4, user interface 34 may generate and display one or more selectable 3-D viewpoint perspectives 48 of machine 10 and worksite 12 on monitor 44 in response to the received real-time data and based on terrain map 42 for remote control of machine 10. As such, the components of interface 34 may be tailored to render 3-D environments in conjunction with the machine control application. For example, one viewpoint 48a may correspond with a high-level, third-person view of machine 10, as it is controlled and moved about worksite 12, resembling the image of FIG. 3. From this viewpoint, an operator may discern and control, among other things, an optimal travel route and/or approach to an excavating location, work pile, or other point of interest on worksite 12.
A second viewpoint 48b may correspond with a close look at work tool movement from inside or outside of operator station 46, and may resemble the image of FIG. 4. From this viewpoint, an operator may discern and/or control, among other things, work tool and linkage system movement and loading during excavation passes, and the results of excavation from a given machine position. Additionally, details such as the contour and layout of work surface terrain proximate the machine position may be accurately depicted in second viewpoint 48b. Further, nearby obstacles included in terrain map 42, such as buildings, roads, trees, and/or property boundaries, etc., may also be accurately depicted in second viewpoint 48b. As such, the operator may easily determine if a current terrain of worksite 12, as indicated by terrain map 42, is compatible with a desired terrain, if additional excavation passes are required, or if a desired machine maneuver may be obstructed by the surrounding obstacles and/or work surface terrain.
In a further aspect, views may be selectable from any desired reference point, since simulation may not be limited to a finite number of stationary cameras, but generated according to the terrain map 42 and the received real-time data. However, it is to be appreciated that some video may be gathered by one or more cameras mounted on machine 10 and communicated to user interface 34 in addition to the gathered real-time data. The video feed may be utilized and enhanced with simulation based on the received data and terrain map 42, and provided to the operator by way of monitor 44.
As illustrated in FIGS. 3 and 4, offboard system 34 may provide to a remote operator of machine 10 an onboard visual indication of the performance of machine 10 based on the received real-time information. For example, an information panel 50 may be included within the display area on monitor 44. Information panel 50 may include a plurality of indicators 50a-i associated with respective parameter values derived from the received real-time information.
For example, panel 50 may include a machine ground speed indicator 50a to show the present ground speed of machine 10 (mph or km/h), an engine speed indicator 50b to show the present engine rotational speed (RPM), a fuel level indicator 50c, and/or a transmission output ratio (gear) indicator 50d. Further, panel 50 may include slip indicator 50e to identify a rate at which traction devices 30 may be slipping. For example, slip indicator 50e may show that the left track is slipping at a rate of 0.2 mph. Panel 50 may also include a machine roll and pitch indicator 50f to provide the operator with present inclination angles of machine 10 with respect to horizontal ground (e.g., 20-degree pitch and 12-degree roll). Additionally, panel 50 may include a loading indicator 50g to show a capacity to which tool 20 is filled (e.g., 25%), and/or a steering command indicator 50h to show a present steering angle of traction devices 30 (e.g., 22 degrees left). Panel 50 may include other indicators, such as, for example, a machine positioning indicator 50i showing a vertical overhead view of the position of machine 10 relative to worksite 12 (e.g., a machine icon positioned on a map of worksite 12). Alternatively or additionally, machine position indicator 50i may indicate present latitude and longitude, and/or other coordinates representing a current position of machine 10 with respect to worksite 12. It is to be appreciated that any other parameter values of interest may be selectively provided in panel 50 based on the received real-time data in order to provide an augmented reality for the machine operator.
Referring back to FIG. 2, user interface 34 may include an input device 40 for remotely initiating operator command signals that control operation of machine 10 at worksite 12. The command signals may be communicated from user interface 34 to controller 18. For example, interface 34 may include a machine control application to receive the operator command signals and appropriately package them for transmission to controller 18. As such, controller 18 may generate machine command signals to control the various operational aspects of machine 10 in response to the received operator command signals. For example, controller 18 may vary electrical signals, hydraulic pressure, fluid flow rates, fluid consumption levels, etc., in order to change engine speed, ground speed, transmission output ratio, steering angle, tool 20 and/or linkage system 32 positioning in accordance with the received operator commands.
In one aspect, input device 40 may resemble the operator interface included on machine 10. For example, input device 40 may include an arrangement of joysticks, wheels, levers, pedals, switches, and/or buttons similar (or identical) to that of machine 10. As such, operator manipulation of input device 40 may have the same effect on machine 10 as corresponding manipulation of the operator interface within machine 10. Input device 40 may be generic, and used for remote control of many different types of simulation-capable machines 10. Alternatively, device 40 may be customized for a specific type of machine (e.g., a 416E Backhoe Loader, or a 365C Hydraulic Excavator, manufactured by Caterpillar Inc., etc.), and include control features unique to the machine type. However, it is to be appreciated that device 40 may simply embody one or more conventional computer interface devices, such as, for example, a keyboard, touchpad, mouse, or any other interface devices known in the art.
Operation of the disclosed simulation system 14 will be discussed further in the following section.
INDUSTRIAL APPLICABILITY
The disclosed simulation system may be applicable to any machine where efficient control thereof from a remote location is desirable, and an augmented, simulated operational environment may provide certain advantages over live video feed. In particular, the disclosed simulation system may provide an augmented display based on real-time data measurements that includes multiple simulated views of the machine, the worksite, and various operational parameter values, such that an operator may comfortably and effectively control the machine. Operation of simulation system 14 will now be described.
In one aspect, an operator may log into user interface 34 by entering a username and password, and initiate the remote machine control application. The operator may then be prompted to select a desired worksite. For example, controller 38 of user interface 34 may retrieve a plurality of available worksites from terrain map 42 and display them on monitor 44. The operator may then use input device 40 to navigate through and select a desired worksite 12 from among the plurality.
Subsequently, controller 38 may receive terrain information about selected worksite 12 from terrain map 42 and generate a simulated 3-D environment of worksite 12. As discussed above, the environment may include a surface of the terrain, obstacles thereon, and/or plan lines associated with worksite 12. Controller 38 may then receive, from terrain map 42 or controllers on individual machines, position information regarding a plurality of available simulation-capable machines associated with worksite 12. As such, controller 38 may display each available machine on monitor 44, and prompt the operator to select a desired machine 10 for operation. It is to be appreciated that each operator may be authorized to access different worksites and/or machines for a variety of reasons. As such, the worksites and/or machines available to the operator may be a function of the operator's username and/or password or an operator profile associated therewith.
Once a worksite 12 and a machine 10 have been properly accessed, controller 38 may begin receiving the streaming real-time data gathered by the particular machine 10. Additionally, controller 38 may begin receiving streaming real-time data from other simulation-capable machines associated with worksite 12. In one aspect, in order to conserve bandwidth and/or processing power, the real-time data received from other machines may be limited to certain parameters of interest, such as, for example, position and/or travel speed thereof. However, it is to be appreciated that controller 38 may receive any desired proportion of gathered real-time data from any number of simulation-capable machines associated with worksite 12. For example, controller 38 may receive real-time data concerning tool movement and/or loading of other machines for augmented simulation of the worksite environment.
Upon receiving the streaming real-time data, controller 38 may actively populate the 3-D environment with the machines 10 associated with worksite 12, and display the populated environment to the operator on monitor 44. Controller 38 may then prompt or otherwise allow the operator to initiate a command by way of input device 40 to start machine 10. Subsequently, machine 10 may be controlled and moved about worksite 12 by way of input device 40, as discussed above.
Further, controller 38 may allow the operator to select, by way of input device 40, a desired viewpoint from one or more available commonly-used viewpoints, such as, for example, one of the viewpoints 48 discussed above in connection with FIGS. 3 and 4, or a viewpoint from the perspective of work tool 20. Alternatively or additionally, controller 38 may provide a mode allowing the operator to define a viewpoint from any desired perspective by way of input device 40. For example, the operator may be able to select a first-person view of worksite 12 from the interior and/or exterior of machine 10, a third-person view of machine 10 and worksite 12 that follows machine 10 during navigation, and/or a view of machine 10 and/or worksite 12 from a desired fixed location. Additionally, controller 38 may allow the operator to adjust a view angle and/or zoom level associated with each of these perspectives. It is to be appreciated that the operator may change the selected viewpoint during machine operation, if desired.
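A third-person view that follows the machine can be derived from the received position and orientation data alone, with no onboard camera. A minimal sketch of such a camera placement; the names and conventions are hypothetical.

```python
import math

def follow_camera(machine_pos, heading, distance, height):
    """Place a camera behind and above a machine: offset opposite its
    heading (radians, measured in the ground plane) by `distance`,
    then raise it by `height`. Positions are (x, y, z) site coordinates."""
    x = machine_pos[0] - distance * math.cos(heading)
    y = machine_pos[1] - distance * math.sin(heading)
    return (x, y, machine_pos[2] + height)
```

Adjusting `distance` and `height` corresponds to the zoom and view-angle controls described above, and re-evaluating the function each frame keeps the viewpoint trailing the machine as it navigates.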
In one aspect, the operator may select the high-level, third-person perspective 48a of FIG. 3 and navigate machine 10 to a point of interest on worksite 12, such as, for example, a predetermined excavating location delineated by plan lines 52, by manipulating input device 40, as discussed above. Accordingly, the received real-time information may responsively indicate machine navigation, and controller 38 may actively update the view perspective 48a and/or information panel 50 in response thereto. In other words, as machine 10 moves about worksite 12, the view perspective of the simulated 3-D environment provided on monitor 44 may change in accordance with the real-time measured machine ground speed, engine speed, fuel level, pitch and roll, transmission output ratio, tool position, etc.
Controller 38 may also provide certain augmented display features in order to improve operator control of machine 10. For example, if, during navigation, the received real-time data indicates that a traction device 30 is slipping, slip indicator 50e may indicate the appropriate traction device and the rate at which it is slipping (e.g., left at 0.2 mph). Additionally, if the operator is utilizing a high-level, third-person perspective 48a where the traction devices 30 are visible in the simulated environment (FIG. 4), controller 38 may indicate a slipping traction device by coloring, flashing, or otherwise visually distinguishing the traction device from the background environment. Alternatively or additionally, controller 38 may distinguish a slipping traction device simply by showing the traction device rotating or otherwise moving more quickly than the machine ground speed.
Upon reaching a point of interest on worksite 12, the operator may select a viewpoint 48b corresponding with a close look at work tool 20 from inside operator station 46 in order to facilitate excavation within plan lines 52. As the operator manipulates input device 40 in order to control work tool movement, controller 38 may actively simulate and display tool 20 and linkage system 32 movement in response to the received real-time data. For example, as the components of linkage system 32 (e.g., boom, stick, and bucket 20) are tilted downward or otherwise moved toward a work surface, sensors 16e may provide real-time position signals to module 16 for communication to user interface 34. Upon receiving the communication, controller 38 may show linkage system 32 (e.g., boom, stick, and bucket 20) moving at the measured velocity to the measured position based on the real-time position signals.
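The linkage motion described above can be sketched as planar forward kinematics: given measured boom, stick, and bucket joint angles, compute the bucket tip position the simulation should display. The link lengths and angle convention below are hypothetical values chosen for illustration.

```python
import math

# Assumed link lengths in meters (boom, stick, bucket) for illustration.
LINKS = [5.7, 2.9, 1.5]

def bucket_tip(angles):
    """Planar forward kinematics: each joint angle is measured relative
    to the previous link; returns (horizontal reach, height) of the
    bucket tip from the boom pivot."""
    x = y = 0.0
    total = 0.0
    for length, angle in zip(LINKS, angles):
        total += angle
        x += length * math.cos(total)
        y += length * math.sin(total)
    return x, y

# Fully extended, horizontal linkage: reach == sum(LINKS), height == 0
reach, height = bucket_tip([0.0, 0.0, 0.0])
```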
Additionally, as the operator manipulates input device 40 in order to make excavation passes with work tool 20, and earthen material is removed from worksite 12, controller 38 may actively update the simulated environment terrain shown in view perspective 48b. For example, as tool 20 engages and removes material from a given point on the work surface, sensor 16f may provide a real-time loading signal to module 18 for communication to user interface 34. Upon receiving the communication, and in conjunction with the linkage system 32 and tool 20 positioning communication discussed above, controller 38 may determine an amount of material removed from the work surface, and the location from which it was removed, upon completion of each excavation pass. As such, controller 38 may responsively update view perspective 48 and/or terrain map 42 during completion of an excavation pass to reflect geographical changes made to worksite 12. Further, if other simulation-capable machines are performing excavation on worksite 12, controller 38 may similarly update the view perspective 48b and/or terrain map 42 in response to received communications of real-time information concerning machine location, linkage system and work tool positioning, movement, and loading thereof.
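The terrain-update step can be sketched as maintaining a grid height map that is lowered at the cell where the tool removed material, by a depth derived from the loading and position data. The grid resolution and the volume-to-depth conversion are assumptions for illustration only.

```python
CELL_AREA = 1.0  # m^2 per grid cell; assumed terrain-map resolution

def apply_excavation(terrain, cell, removed_volume):
    """Lower the height map at `cell` (row, col) by the depth implied by
    the volume of material determined to have been removed there."""
    row, col = cell
    terrain[row][col] -= removed_volume / CELL_AREA
    return terrain

# 3x3 flat height map at elevation 10.0 m; one pass removes 0.5 m^3 at (1, 1):
terrain = [[10.0] * 3 for _ in range(3)]
apply_excavation(terrain, (1, 1), 0.5)
```

The same update could be applied for passes reported by other simulation-capable machines on the worksite, since the inputs are just a location and a removed volume.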
Although the foregoing disclosure relates to generating a 3-D simulation of a worksite environment, it is to be appreciated that supplemental video feed may be used in conjunction therewith. For example, machine 10 may be equipped with one or more cameras, and real-time video signals may be communicated to user interface 34 in addition to the real-time gathered data. As such, controller 38 may provide a live video feed of worksite 12 to the operator by way of monitor 44, which may, in turn, be augmented with simulation based on the real-time gathered data, as discussed above. The proportion of live video to augmented simulation may be selectable by the operator and determined, in part, based on a desired simulation quality and the availability of necessary system resources, such as, for example, processing power and bandwidth. For example, for a given amount of available resources, the operator may be able to select a certain degree of live video feed (e.g., three camera views) in addition to certain simulated parameters (e.g., pitch and roll, track slip, and machine ground speed). However, it is to be appreciated that any desired proportion or combination of live video feed and/or augmented simulation may be used, within available resource limitations, as desired.
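The operator-selectable trade-off between live video and simulated parameters can be sketched as a simple budget check: each camera feed and each simulated parameter is assigned an assumed bandwidth cost, and a selection is valid only if it fits the available link. The costs and parameter names are illustrative, not from the disclosure.

```python
# Assumed per-item bandwidth costs in Mbit/s (illustrative only).
CAMERA_COST = 4.0
PARAM_COSTS = {"pitch_roll": 0.01, "track_slip": 0.01, "ground_speed": 0.005}

def selection_fits(num_cameras, params, available_mbps):
    """Check whether the operator's chosen mix of live camera feeds and
    simulated parameters fits within the available link bandwidth."""
    cost = num_cameras * CAMERA_COST + sum(PARAM_COSTS[p] for p in params)
    return cost <= available_mbps

# Three camera views plus three simulated parameters on a 15 Mbit/s link:
ok = selection_fits(3, ["pitch_roll", "track_slip", "ground_speed"], 15.0)
```

Note how the simulated parameters cost orders of magnitude less bandwidth than a video feed, which is the motivation for substituting simulation for live video in the first place.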
Because controller 38 may generate a 3-D environment in response to received real-time data associated with various operational parameters of machine 10, remote control of machine 10 may be facilitated without, or with minimized use of, live video feed, which requires large bandwidth. In particular, the real-time data may be communicated to user interface 34 by way of radio signals or other low-bandwidth carriers, where it may be used by controller 38 to render a simulated 3-D environment of worksite 12. Moreover, since controller 38 may process the received data in order to provide different view perspectives of machine 10 with respect to worksite 12, visibility may not be limited to the number of cameras provided on the machine or the respective fields of view associated therewith.
It will be apparent to those skilled in the art that various modifications and variations can be made to the method and system of the present disclosure. Other embodiments of the method and system will be apparent to those skilled in the art from consideration of the specification and practice of the method and system disclosed herein. It is intended that the specification and examples be considered as exemplary only, with a true scope of the disclosure being indicated by the following claims and their equivalents.