Disclosure of Invention
In view of the above, the present invention provides a fire simulation system and method with image fusion, which simulate a plurality of environmental parameters of a fire occurrence process in a centralized manner.
In order to achieve the purpose of the invention, the invention adopts the technical scheme that:
a fire simulation system with image fusion comprises a controller, a server and a simulation room, wherein the controller is connected to the server and used for receiving preset input and sending the preset input to the server; the server is connected to the simulation room, determines information of one or more environmental parameters according to the preset input, and sends the determined information to the simulation room; the simulation room is provided with an execution unit for generating or changing one or more environmental parameters, and the one or more environmental parameters in the simulation room are generated or changed according to the information sent by the server; the execution unit further comprises an image fusion device, which performs brightness attenuation processing on the overlapping area of the projection images in the simulation room by using a gradient function, so as to achieve brightness consistency after image stitching.
Performing brightness attenuation processing on any point in the overlapping area comprises looking up, as a function of distance, the attenuation coefficient of the point in each of the two projection images, and obtaining the attenuated brightness of the point in each projection image.
Before the brightness attenuation processing is carried out on any point in the overlapping area, the width of the overlapping area is normalized.
Correspondingly, a power function is selected as an attenuation coefficient function to perform attenuation processing on the overlapping area.
Wherein the environmental parameter comprises one or more of an image in the simulation room, a temperature in the simulation room, smoke in the simulation room, and a sound in the simulation room.
Correspondingly, the server comprises a scene determining module, a temperature determining module, a smoke determining module and a sound determining module, and the scene determining module, the temperature determining module, the smoke determining module and the sound determining module respectively determine images in the simulation room, the temperature in the simulation room, the smoke in the simulation room and the sound in the simulation room.
And the execution unit in the simulation room for generating or changing one or more environmental parameters further comprises one or more of a projector, a heater, a sound box and a smoke sprayer.
A method of fire simulation with image fusion, the method comprising the steps of: A) presetting a geometric model and fire source parameters of a simulation chamber; B) determining information of one or more environmental parameters in the simulation chamber according to the geometric model and the fire source parameters of the simulation chamber, wherein the environmental parameters comprise one or more of images in the simulation chamber, temperature in the simulation chamber, smoke in the simulation chamber and sound in the simulation chamber; C) sending the determined environmental parameter information to an execution unit of the simulation room, which generates or changes one or more environmental parameters; D) the execution unit of the simulation room changes or generates one or more environmental parameters of the fire scene according to the acquired information; the execution unit further performs brightness attenuation processing on the overlapping area of the projection images in the simulation room by using a gradient function, so as to achieve brightness consistency after image stitching.
Performing brightness attenuation processing on any point in the overlapping area comprises looking up, as a function of distance, the attenuation coefficient of the point in each of the two projection images, and obtaining the attenuated brightness of the point in each projection image.
Before the brightness attenuation processing is carried out on any point in the overlapping area, the width of the overlapping area is normalized.
Specifically, a power function is selected as an attenuation coefficient function to perform attenuation processing on the overlapping region.
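The steps above can be sketched in a few lines. The following illustrative Python function (names and the two-projector setup are assumptions for illustration, not limitations of the claims) normalizes the overlap width to [0, 1] and uses a power function of each point's distance to the adjacent image boundary as the attenuation coefficient:

```python
def blend(x: float, l_left: float, l_right: float, alpha: float = 1.0) -> float:
    """Attenuated brightness at a point in the overlap of two projection images.

    x is the point's position across the overlap after the overlap width has
    been normalized to [0, 1]; l_left / l_right are the brightness values the
    left and right projectors would give at that point; alpha is the exponent
    of the power-function attenuation (alpha = 1 gives a linear blend).
    """
    d_left = 1.0 - x   # distance to the left image's adjacent (right) boundary
    d_right = x        # distance to the right image's adjacent (left) boundary
    total = d_left**alpha + d_right**alpha
    k_left, k_right = d_left**alpha / total, d_right**alpha / total
    return k_left * l_left + k_right * l_right
```

At x = 0 only the left image contributes and at x = 1 only the right; in between, the two attenuation coefficients always sum to 1, which is what keeps the total brightness consistent after stitching.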
By adopting the above technical scheme, one or more environmental parameters in the simulation room can be determined in a centralized manner, the environmental parameters can interact with one another, and a simulated projection image with a higher degree of fusion can be formed.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, the present invention will be described in further detail with reference to the accompanying drawings. This description is made by way of example, not limitation, of specific embodiments consistent with the principles of the invention, and is in sufficient detail to enable those skilled in the art to practice the invention. Other embodiments may be utilized, and the structure of various elements may be changed and/or substituted, without departing from the scope and spirit of the invention. The following detailed description is, therefore, not to be taken in a limiting sense.
To facilitate understanding of those skilled in the art, the following detailed description of the present invention is provided in conjunction with the accompanying drawings.
Detailed exemplary embodiments are disclosed herein; the specific structural and functional details disclosed are merely for purposes of describing example embodiments. The invention may be embodied in many alternate forms and should not be construed as limited to the exemplary embodiments set forth herein.
It should be understood, however, that the intention is not to limit the invention to the particular embodiments disclosed, but on the contrary, the intention is to cover all modifications, equivalents, and alternatives falling within the scope of the disclosure. Referring to the description of the drawings, like numbers indicate like elements.
It will be understood that, although the terms first, second, etc. may be used herein to describe various elements, these elements should not be limited by these terms. These terms are only used to distinguish one element from another. For example, a first element could be termed a second element, and, similarly, a second element could be termed a first element, without departing from the scope of example embodiments. As used herein, the term "and/or" includes any and all combinations of one or more of the associated listed items.
It will be understood that when an element is referred to as being "connected" or "coupled" to another element, it can be directly connected or coupled to the other element or intervening elements may also be present. In contrast, when an element is referred to as being "directly connected" or "directly coupled" to another element, there are no intervening elements present. Other words used to describe the relationship between elements should be understood in the same manner.
Fig. 1 is a block diagram showing the construction of a fire simulation system according to the present invention.
In an embodiment of the present invention, the fire simulation system includes a controller 11, a server 12, and a simulation room 13. The controller 11 is a digital device, usually a wireless communication device such as a tablet computer, and may also be an intelligent mobile terminal, a notebook computer, and the like. The controller 11 and the server 12 interact with each other through various connection methods, such as a common wireless communication protocol, specifically a protocol connection such as IEEE 802.11 or Bluetooth. The controller 11 has a dedicated functional unit that can access the simulation software on the server 12 via a B/S or C/S architecture and send signals to control the sound, temperature, scene and smoke effects inside the simulation chamber 13. The server 12 may be a desktop computer, including a CPU, random access memory, nonvolatile memory such as a hard disk, a display, a mouse, a keyboard, a video card, a sound card, a network card, a chassis, a power supply, a fan, and other common devices, and may also include a Bluetooth communication module, a wireless communication module, and the like to communicate with the controller 11. In particular, the controller 11 may also be integrated with the server 12, sending signals to the functional units of the simulation room 13.
The server 12 sends signals to the functional units of the simulation room 13 through various transmission paths. For example, the server 12 is connected to a speaker of the simulation room through an audio cable; and the projector and the fusion device are connected through a video transmission line. For the convenience of access, the server 12 is connected to INTERNET through a network cable or wireless mode and a communication protocol such as IEEE 802.11 or bluetooth, and various video files and audio files can be stored on the hard disk of the server 12.
In this embodiment, the server 12 includes the following functional modules: the sound determination module 21, the temperature determination module 22, the scene determination module 23 and the smoke determination module 24 respectively determine the sound in the simulation room, the temperature in the simulation room, the image in the simulation room and the smoke in the simulation room.
For explaining the operation of the fire simulation system according to the embodiment of the present invention, the fire simulation system according to the present invention will now be described in detail with reference to fig. 2 and 3.
As shown in FIG. 2, the simulation chamber 13 of the fire simulation system may include one or more of an inlet 31, an outlet 32, a projector 33, a fuser 34, a heater 35, a sound box 36, and a smoke sprayer 37, wherein the projector 33 is used for projecting fire images, the fuser 34 is used for eliminating gaps or overlapping areas between different projectors, the heater 35 is used for raising the indoor temperature, the sound box 36 is used for playing sound effects, and the smoke sprayer 37 is used for producing smoke effects. Preferably, the simulation room 13 further comprises a camera 38 and a bracket 39, wherein the camera 38 is used for monitoring indoor scenes, and the bracket 39 is used for placing the escape equipment 40.
Typically, the simulation chamber 13 has six walls, upper, lower, left, right, front, and rear. One or more walls are wrapped with a projection curtain to play images. For example, the front wall and the rear wall are wrapped with projection curtains. Those skilled in the art will appreciate that the simulation chamber may have other more complex or simpler internal structures, or have a projection curtain disposed in a different internal structure, and that such changes do not have an impact on the implementation of the present embodiment.
The simulation chamber 13 may be provided with an inlet 31 and/or an outlet 32; for example, the inlet 31 is provided on the left wall of the simulation chamber 13 and the outlet 32 is provided on the right wall of the simulation chamber 13.
On the other hand, in order to simulate the image in the simulation chamber 13, one or more projectors 33 are installed in the simulation chamber 13; for example, in the present embodiment, four projectors 33 are installed in the center of the upper wall of the simulation chamber 13. The projector 33 is connected to the server 12 by wire or wirelessly, receives preset or generated image information from the server 12, and projects the received image information onto the front and rear walls of the simulation room 13, respectively.
In particular, if only one projector 33 is used to project the entire wide projection screen, focusing is difficult, because if the projection screen is too wide it is hard to select a suitable reference focus point. According to the present invention, it is therefore preferable to use a plurality of projectors 33 so as to keep the arc-chord distance as small as possible, making it relatively easy to find a suitable focus on the screen.
Further, when more than one projector 33 is used, in order to realize seamless joining of the images projected by the projectors 33, the image fusion device 34 can eliminate gaps or overlapping areas between the projectors, so that the scene has more layering and stereoscopic impression, creating a vivid effect. According to a simpler aspect of the invention, two projectors 33, left and right, may be used; as will be clear to a person skilled in the art, this is not intended to limit the number of projectors 33 to 2. In order to achieve edge blending of the images projected by the left and right projectors 33, the server 12 transmits left and right image information, which have an overlapping portion, to the left and right projectors 33 respectively, and the fuser 34 achieves joining of the image information by changing the projection brightness of the left and right projectors 33. Thus, the brightness of the whole picture is uniform in the display effect.
According to one embodiment of the present invention, when multiple projectors 33 (including both left and right) are used, the fuser 34 is used to determine the luminance L at a point Q in the projection overlap region of the multiple projectors 33.
In particular,

L = K1·L1 + K2·L2 + … + Kn·Ln

and

Ki = di^α / (d1^α + d2^α + … + dn^α),

where Li is the brightness of the projected image of the i-th projector 33; di is the distance from the point Q to the adjacent boundary of the i-th projector 33 (for example, in the case of the aforementioned left and right 2 projectors, n = 2, d1 is the distance from the point Q to the right boundary of the left projection image P1, and d2 is the distance from the point Q to the left boundary of the right projection image P2); Ki is the brightness attenuation coefficient of the i-th projector, implemented in the fuser 34; and α > 0 is the exponent of the attenuation power function implemented in the fuser 34, which adjusts the degree of non-linearity of the attenuation.
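As a direct transcription of this weighted-sum rule (a sketch; the function name and NumPy usage are illustrative, not from the embodiment):

```python
import numpy as np

def blended_luminance(lums, dists, alpha: float = 1.0) -> float:
    """L = sum_i K_i * L_i with K_i = d_i**alpha / sum_j d_j**alpha.

    lums[i] is the i-th projector's brightness at the point Q, and dists[i]
    is the normalized distance from Q to the i-th image's adjacent boundary.
    """
    lums = np.asarray(lums, dtype=float)
    weights = np.asarray(dists, dtype=float) ** alpha  # d_i**alpha
    return float(np.dot(weights / weights.sum(), lums))
```

Because the coefficients Ki are normalized to sum to 1, a point lying on one image's boundary (distance 0 to that boundary's neighbour) takes its brightness entirely from the other projectors, which is what removes the visible seam.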
According to an embodiment of the present invention, the projectors 33 are all projectors having the same brightness, and the simplified process sets α to 1, so that the fusion of the edge regions is accomplished using a basically linear function. This approach is simple, efficient, and easy to implement. However, owing to lifetime, temporary replacement, and the like, the projectors 33 may not all have the same brightness, and under such conditions the basically linear attenuation may cause the brightness at the edge of the overlap region to change abruptly.
To overcome the above technical problem, in another embodiment of the present invention, α and Ki are dynamically adjusted at intervals according to a logistic regression model, and a more reasonable value range is determined, so that the brightness of the overlapping projection images is consistent and a higher degree of simulation is obtained.
According to another embodiment of the present invention (as shown in fig. 3), in order to make the fire scene more stereoscopic, an arc-shaped projection curtain can be used in the simulation chamber 13. However, most projectors 33 are designed for a flat screen, and when such a projector 33 projects an image onto a curved screen such as a dome or arc screen, the image is distorted; this is called nonlinear distortion.
In this regard, the server 12 further includes a nonlinear distortion correction unit configured to perform, when transmitting image information to the projector 33, nonlinear distortion correction on the image information with respect to the relative position of the projector 33 and the projection screen and the radian of the projection screen.
For example, for a particular projection screen, the nonlinear distortion correction unit of the server 12 generates a series of contour lines and vertical lines, which are then orthogonalized to form a grid, referred to as an equal-height vertical grid. The nonlinear distortion correction unit then transmits the equal-height vertical grid to the projector 33, and the projector 33 projects it onto the projection screen (near clipping plane), which is the corresponding location of the equal-height vertical grid in the frame buffer.
Next, through the projector 33, the nonlinear distortion correction unit redraws the equal-height vertical grid on the projection screen so that it appears equal-height and vertical in the actual visual effect, and obtains the relative displacement of the grid points.
Then, the server 12 takes out the image in the frame buffer, and performs texture mapping of the original image by using the newly obtained equal-height vertical grid through the nonlinear distortion correction unit, for example, performing texture re-mapping on each frame of data.
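The per-frame texture re-mapping step can be sketched as a lookup through the corrected grid. In this hedged sketch the displaced grid has already been densified into per-pixel coordinate maps (`map_y` and `map_x` are illustrative names, not from the embodiment):

```python
import numpy as np

def remap(image: np.ndarray, map_y: np.ndarray, map_x: np.ndarray) -> np.ndarray:
    """Nearest-neighbour texture re-mapping: out[i, j] = image[map_y[i, j], map_x[i, j]].

    map_y / map_x give, for each output pixel, the source coordinates derived
    from the displaced equal-height vertical grid; out-of-range lookups are
    clamped to the image border.
    """
    h, w = image.shape[:2]
    yy = np.clip(np.rint(map_y).astype(int), 0, h - 1)
    xx = np.clip(np.rint(map_x).astype(int), 0, w - 1)
    return image[yy, xx]
```

A production implementation would interpolate the sparse grid displacements bilinearly and sample sub-pixel positions, but the lookup structure is the same.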
By adopting the distortion correction image formed by the nonlinear correction method, higher simulation degree is obtained.
On the other hand, in order to realize the simulation of the temperature in the simulation chamber 13, a heater 35 is included in the simulation chamber 13. The heater 35 may be implemented using various heat generating devices, for example, a high-resistance resistor that does not emit light. In order to reproduce the corresponding temperature of a real fire scene, the environmental parameter information determined by the temperature determination module 22 of the server 12 can control which heaters 35 are turned on and their respective heating temperatures, so as to adjust the heating effect in the simulation chamber 13.
On the other hand, in order to realize the formation of the sound effect in the simulation room 13, the server 12 transmits the stored audio information to the power amplifier 41 and adjusts its power, and then plays the sound effect using the sound box 36. One or more sound boxes 36 may be disposed in the simulation room according to specific needs; for example, the sound boxes 36 are located at the four corners of the roof of the simulation room 13, and the sound determined by the sound determination module 21 of the server 12 may control the one or more sound boxes 36 to work simultaneously to create a stereo effect.
On the other hand, in order to realize the formation of the smoke effect in the simulation chamber 13, one or more smoke machines 37 may be arranged in the simulation chamber according to specific needs; for example, four smoke machines 37 are located at the four corners of the floor of the simulation chamber 13, and the number and operating rate of the smoke machines 37 that are turned on may be controlled by the smoke condition determined by the smoke determination module 24 of the server 12 to adjust the smoke effect in the simulation chamber 13.
In addition, one or more cameras 38 may be disposed in the simulation room 13 according to specific needs; for example, two cameras 38 are respectively located at the midpoints of the tops of the left and right walls of the simulation room 13, and are connected with the server 12 through wired or wireless connections for observing the conditions in the simulation room 13.
A bracket 39 can further be arranged in the simulation chamber 13, and escape equipment 40, preferably towels, gas masks and the like, is placed on the bracket 39.
The above description describes the apparatus of the fire simulation system, and the fire simulation method is further described below in conjunction with the apparatus of the fire simulation system.
Step one, a geometric model and fire source parameters of the simulation chamber 13 are preset. The geometric model of the simulation chamber 13 is determined mainly by the central axis equation and the cross-sectional dimensions of the simulation chamber 13. The central axis equation can be set by manually entering a linear or curve equation, or obtained directly by parsing an AUTOCAD graphic format or a DXF file of the central axis. DXF (Drawing Exchange Format) is a sequential graphic exchange file: a data file containing entity commands and geometric data under a defined set of code symbol rules, which can be converted from the AUTOCAD graphic format. The fire source parameters mainly set the scale, position and combustion mode of the fire source. The scale of the fire source mainly sets the heat release efficiency and total power of the fire source. The position of the fire source sets where the fire source is located in the simulation chamber; in this case, the position can only be selected at a location where a high-resistance heating element is installed. The combustion modes include gasoline, kerosene, crude oil, firewood and the like.
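As an illustration of reading the central axis from a DXF file, a minimal extractor for LINE entities is sketched below. It assumes a flat stream of (group code, value) line pairs and reads only the 2D start/end group codes 10/20 and 11/21; real DXF files contain header sections and many more entity types, so a production system would use a full DXF library.

```python
def read_axis_segments(dxf_text: str) -> list[tuple[float, float, float, float]]:
    """Extract (x1, y1, x2, y2) for each LINE entity in a DXF text stream."""
    rows = dxf_text.splitlines()
    segments, current = [], None
    for code, value in zip(rows[::2], rows[1::2]):
        code, value = code.strip(), value.strip()
        if code == "0":                       # group code 0 starts a new entity
            if current is not None and len(current) == 4:
                segments.append((current["10"], current["20"],
                                 current["11"], current["21"]))
            current = {} if value == "LINE" else None
        elif current is not None and code in ("10", "20", "11", "21"):
            current[code] = float(value)      # start/end point coordinates
    if current is not None and len(current) == 4:
        segments.append((current["10"], current["20"],
                         current["11"], current["21"]))
    return segments
```

The returned segments can then be chained into the central axis polyline of the simulation chamber's geometric model.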
Step two, information of one or more environmental parameters is determined according to the geometric model and the fire source parameters of the simulation chamber, wherein the environmental parameters comprise images in the simulation chamber, the temperature in the simulation chamber, smoke in the simulation chamber and sound in the simulation chamber.
The geometric model and the fire source parameters of the simulation chamber 13 are sent to the server 12; the scene determination module 23 of the server 12 performs grid segmentation on the simulation chamber 13, dividing it into a plurality of subspaces, and a suitable material is selected for rendering each subspace. In particular, only the model portion currently seen by the user may be rendered, to determine the image of the simulated fire scene.
After the fire source parameters are set, the temperature determination module 22 of the server 12 determines which heater orheaters 35 in the simulation room 13 perform the heating operation according to the location of the fire source, and determines the temperature of the simulated fire scene. The heating rate of theheater 35 is determined according to the heat release efficiency of the ignition source, the heating time of theheater 35 is determined according to the total power, and the power change curve of theheater 35 is determined according to the combustion method.
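One plausible reading of this mapping is sketched below; the function name, the growth factors per combustion mode, and the curve shape are all assumptions for illustration, not from the embodiment. The heat release efficiency scales the target heating power, and the combustion mode picks how fast the heater ramps up:

```python
def heater_power_curve(total_power_kw: float, efficiency: float,
                       mode: str = "gasoline", steps: int = 10) -> list[float]:
    """Illustrative heater power schedule derived from fire source parameters.

    gasoline flares up quickly; firewood ramps up slowly (assumed factors).
    """
    target = total_power_kw * efficiency          # effective heating power
    growth = {"gasoline": 0.9, "kerosene": 0.7,
              "crude oil": 0.5, "firewood": 0.2}[mode]
    # saturating ramp toward the target power over `steps` time steps
    return [target * (1.0 - (1.0 - growth) ** (t + 1)) for t in range(steps)]
```

A gasoline curve reaches most of the target power in the first step, while a firewood curve rises gradually, mirroring the power change curve chosen per combustion method.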
After the fire source parameters are set, the sound determination module 21 of the server 12 determines the sound simulating the fire scene. And calling a sound effect file prestored on the server according to the combustion mode of the fire source to generate the field effect of fire source combustion.
After the fire source parameters are set, the smoke determination module 24 of the server 12 determines the smoke ejection quantity according to the combustion mode of the fire source and thereby determines the smoke of the simulated fire scene; for example, in the firewood combustion mode, a large quantity of smoke is generated to simulate wet weather or insufficient contact between the wood and oxygen.
Step three, the information of the determined environmental parameters is sent to the execution unit of the simulation room 13 that generates or changes one or more environmental parameters.
Step four, the execution unit of the simulation room 13 generates or changes one or more environmental parameters of the fire scene according to the acquired information. For example, the projector of the simulation room 13 projects an image; the heater 35 of the simulation chamber 13 changes the temperature in the simulation chamber; the sound box of the simulation chamber 13 generates sound in the simulation chamber; the smoke machine of the simulation chamber changes the smoke condition in the simulation chamber; and the image fuser of the simulation room 13 fuses the overlapping images.
It will be appreciated by those skilled in the art that one or more of the above environmental parameters may interact, for example, to simulate an increase in room temperature accompanied by an increase in the pattern of fire and smoke; for example, the sound of combustion is generated along with the pattern of fire, thereby further enhancing the simulation effect.
It should be noted that the above-mentioned embodiments are only preferred embodiments of the present invention, and should not be construed as limiting the scope of the present invention, and any minor changes and modifications to the present invention are within the scope of the present invention without departing from the spirit of the present invention.