CN109557830B - A fire simulation system and method with image fusion - Google Patents

A fire simulation system and method with image fusion

Info

Publication number
CN109557830B
Authority
CN
China
Prior art keywords
simulation
room
simulation room
fire
server
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Fee Related
Application number
CN201811345838.9A
Other languages
Chinese (zh)
Other versions
CN109557830A (en)
Inventor
关猛
陶苏东
谷胜男
聂洪涛
徐法璐
李洪战
李文进
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
State Grid of China Technology College
State Grid Corp of China SGCC
Original Assignee
State Grid of China Technology College
State Grid Corp of China SGCC
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by State Grid of China Technology College and State Grid Corp of China SGCC
Priority to CN201811345838.9A
Publication of CN109557830A
Application granted
Publication of CN109557830B
Expired - Fee Related
Anticipated expiration

Abstract

The invention relates to a fire simulation system with image fusion and a corresponding method. The system comprises a controller, a server, and a simulation room; the simulation room further comprises an image fusion device that applies a gradient-function brightness attenuation to the overlapping area of the projection images in the simulation room, so that the brightness remains consistent after the images are spliced.

Description

Fire simulation system and method with image fusion
Technical Field
The invention relates to the field of virtual reality, in particular to a scene simulation technology with image fusion.
Background
At present, fire research methods can be divided into theoretical analysis and simulation research, and simulation research in turn comprises computer simulation and experimental simulation. Experimental simulation is the more direct approach: on one hand, through measurement and data analysis it can establish quantitative relations between specific phenomena and their influencing factors, and thereby induce the mechanisms and laws of fire evolution; on the other hand, it can verify theories or computational models derived from general principles. However, experimental simulation requires large investment, long cycles, and heavy labor, and tests are difficult to repeat. Computer simulation of fire remedies exactly these deficiencies. According to how the control volume and the mathematical model partition the research region of the fire physical model, computer fire simulation can be divided into three types: field simulation, zone simulation, and network simulation. It offers low cost and flexible parameter adjustment, and strongly promotes fire research. However, current computer fire simulations usually focus on a single environmental parameter, or include only a few parameters of the fire; they do not simultaneously simulate the various environmental parameters such as temperature, smoke, and light, and the fusion degree of the simulated images is low.
Disclosure of Invention
In view of the above, the present invention provides a fire simulation system and method with image fusion that simulate, in a centralized manner, a plurality of environmental parameters of the fire process.
In order to achieve the purpose of the invention, the invention adopts the technical scheme that:
a fire simulation system with image fusion comprises a controller, a server and a simulation room, wherein the controller is connected to the server and used for receiving preset input and sending the preset input to the server; the server is connected to the simulation room, determines information of one or more environmental parameters according to preset input, and sends the determined information to the simulation room; the simulation room is provided with an execution unit for generating or changing one or more environmental parameters, and the one or more environmental parameters in the simulation room are generated or changed according to the information sent by the server; the execution unit further comprises an image fusion device which is used for carrying out brightness attenuation processing on the overlapped area of the simulation indoor projection image by utilizing the gradient function so as to achieve brightness consistency after image splicing.
For any point in the overlapping area, the brightness attenuation processing comprises evaluating a distance-based attenuation coefficient function for that point in each of the two projection images, and obtaining the attenuated brightness of the point in each projection image.
Before the brightness attenuation is applied to any point in the overlapping area, the width of the overlapping area is normalized.
Preferably, a power function is selected as the attenuation coefficient function for the attenuation processing of the overlapping area.
The environmental parameters comprise one or more of: an image in the simulation room, the temperature in the simulation room, smoke in the simulation room, and sound in the simulation room.
Correspondingly, the server comprises a scene determination module, a temperature determination module, a smoke determination module, and a sound determination module, which respectively determine the image, temperature, smoke, and sound in the simulation room.
The execution unit in the simulation room for generating or changing one or more environmental parameters further comprises one or more of a projector, a heater, a sound box, and a smoke sprayer.
A method of fire simulation with image fusion comprises the following steps: A) presetting a geometric model and fire source parameters of the simulation room; B) determining, according to the geometric model and the fire source parameters, information on one or more environmental parameters in the simulation room, the environmental parameters comprising one or more of an image, the temperature, smoke, and sound in the simulation room; C) sending the determined environmental parameter information to the execution unit of the simulation room that generates or changes one or more environmental parameters; D) the execution unit of the simulation room generating or changing one or more environmental parameters of the fire scene according to the acquired information. The execution unit further applies a gradient-function brightness attenuation to the overlapping area of the projection images in the simulation room, so that the brightness remains consistent after image splicing.
For any point in the overlapping area, the brightness attenuation processing comprises evaluating a distance-based attenuation coefficient function for that point in each of the two projection images, and obtaining the attenuated brightness of the point in each image.
Before the brightness attenuation is applied to any point in the overlapping area, the width of the overlapping area is normalized.
Specifically, a power function is selected as the attenuation coefficient function for the attenuation processing of the overlapping region.
With this technical scheme, one or more environmental parameters in the simulation room can be determined in a centralized manner, the environmental parameters can interact with one another, and a simulated projection image with a higher degree of fusion can be formed.
Drawings
Fig. 1 is a block diagram of a fire simulation system according to a first embodiment of the present invention.
Fig. 2 is a block diagram of a simulation room in a fire simulation system according to an embodiment of the present invention.
Fig. 3 is a block diagram of a fire simulation system according to a second embodiment of the present invention.
Detailed Description
In order to make the objects, technical solutions, and advantages of the present invention more apparent, the invention is described in further detail below with reference to the accompanying drawings. This description illustrates, by way of example and not limitation, specific embodiments consistent with the principles of the invention, in sufficient detail to enable those skilled in the art to practice it. Other embodiments may be utilized, and the structure of various elements may be changed and/or substituted, without departing from the scope and spirit of the invention. The following detailed description is therefore not to be taken in a limiting sense.
To facilitate understanding of those skilled in the art, the following detailed description of the present invention is provided in conjunction with the accompanying drawings.
Detailed exemplary embodiments are disclosed herein. However, the specific structural and functional details disclosed are merely for the purpose of describing example embodiments; the invention may be embodied in many alternate forms and should not be construed as limited to the exemplary embodiments set forth herein.
It should be understood, however, that the intention is not to limit the invention to the particular embodiments disclosed, but on the contrary, the intention is to cover all modifications, equivalents, and alternatives falling within the scope of the disclosure. Referring to the description of the drawings, like numbers indicate like elements.
It will be understood that, although the terms first, second, etc. may be used herein to describe various elements, these elements should not be limited by these terms. These terms are only used to distinguish one element from another. For example, a first element could be termed a second element, and, similarly, a second element could be termed a first element, without departing from the scope of example embodiments. As used herein, the term "and/or" includes any and all combinations of one or more of the associated listed items.
It will be understood that when an element is referred to as being "connected" or "coupled" to another element, it can be directly connected or coupled to the other element or intervening elements may also be present. In contrast, when an element is referred to as being "directly connected" or "directly coupled" to another element, there are no intervening elements present. Other words used to describe the relationship between elements should be understood in the same manner.
Fig. 1 is a block diagram showing the construction of a fire simulation system according to the present invention.
In an embodiment of the present invention, the fire simulation system includes a controller 11, a server 12, and a simulation room 13. The controller 11 is a digital device, usually a wireless communication device such as a tablet computer, a smart mobile terminal, or a notebook computer. The controller 11 and the server 12 interact through various connection methods, for example a common wireless communication protocol such as IEEE 802.11 or Bluetooth. The controller 11 has a dedicated functional unit that can access the simulation software on the server 12 via a B/S or C/S architecture to send signals controlling the sound, temperature, scene, and smoke effects inside the simulation chamber 13. The server 12 may be a desktop computer comprising a CPU, random access memory, nonvolatile memory such as a hard disk, a display, a mouse, a keyboard, a video card, a sound card, a network card, a chassis, a power supply, fans, and other common devices, and may also include a Bluetooth or wireless communication module to communicate with the controller 11. In particular, the controller 11 may also be integrated with the server 12 and send signals directly to the functional units of the simulation room 13.
The server 12 sends signals to the functional units of the simulation room 13 through various transmission paths. For example, the server 12 is connected to the speaker of the simulation room through an audio cable, and to the projector and the fusion device through a video transmission line. For convenient access, the server 12 is connected to the Internet through a network cable or wirelessly, using a communication protocol such as IEEE 802.11 or Bluetooth, and various video and audio files can be stored on the hard disk of the server 12.
In this embodiment, the server 12 includes the following functional modules: the sound determination module 21, the temperature determination module 22, the scene determination module 23 and the smoke determination module 24 respectively determine the sound in the simulation room, the temperature in the simulation room, the image in the simulation room and the smoke in the simulation room.
The operation of the fire simulation system according to the embodiment of the present invention will now be described in detail with reference to fig. 2 and 3.
As shown in FIG. 2, the simulation chamber 13 of the fire simulation system may include one or more of an inlet 31, an outlet 32, a projector 33, a fuser 34, a heater 35, a sound box 36, and a smoke sprayer 37. The projector 33 projects fire images, the fuser 34 eliminates gaps or overlapping areas between different projectors, the heater 35 raises the indoor temperature, the sound box 36 plays sound effects, and the smoke sprayer 37 produces smoke effects. Preferably, the simulation room 13 further comprises a camera 38 and a bracket 39, where the camera 38 monitors indoor scenes and the bracket 39 holds the escape equipment 40.
Typically, the simulation chamber 13 has six walls, upper, lower, left, right, front, and rear. One or more walls are wrapped with a projection curtain to play images. For example, the front wall and the rear wall are wrapped with projection curtains. Those skilled in the art will appreciate that the simulation chamber may have other more complex or simpler internal structures, or have a projection curtain disposed in a different internal structure, and that such changes do not have an impact on the implementation of the present embodiment.
The simulation chamber 13 may be provided with an inlet 31 and/or an outlet 32; for example, the inlet 31 is provided on the left wall of the simulation chamber 13 and the outlet 32 on the right wall.
On the other hand, in order to simulate the image in the simulation chamber 13, one or more projectors 33 are installed in the simulation chamber 13; for example, in the present embodiment, four projectors 33 are installed in the center of the upper wall of the simulation chamber 13. The projector 33 is connected to the server 12 by wire or wirelessly, receives preset or generated image information from the server 12, and projects the received image information onto the front and rear walls of the simulation room 13, respectively.
In particular, if only one projector 33 were used to project the entire wide projection picture, focusing would be difficult, because a suitable reference focus point is hard to select when the projection picture is too wide. According to the present invention, it is therefore preferable to use a plurality of projectors 33 to keep the arc-chord distance as small as possible, so that a suitable focus can be found on the screen relatively easily.
Further, when more than one projector 33 is used, the image fusion device 34 eliminates the gaps or overlapping areas between the projectors in order to realize seamless joining of the projected images, giving the scene more layering and stereoscopic depth and creating a vivid effect. In a simpler aspect of the invention, two projectors 33 (left and right) may be used, although, as will be clear to a person skilled in the art, the number of projectors 33 is not limited to two. To achieve edge blending of the images projected by the left and right projectors 33, the server 12 transmits left and right image information, which share an overlapping portion, to the left and right projectors 33 respectively, and the fuser 34 joins the image information by changing the projection brightness of the two projectors 33. The brightness of the whole picture is thus uniform in the displayed result.
According to one embodiment of the present invention, when multiple projectors 33 (including the left and right ones) are used, the fuser 34 determines the luminance L at a point Q in the projection overlap region of the n projectors 33 as

L = Σ_{i=1}^{n} K_i·L_i, with Σ_{i=1}^{n} K_i = 1,

where L_i is the brightness of the image projected by the i-th projector 33; d_i is the distance from the point Q to the adjacent boundary of the i-th projection image (for example, with the aforementioned left and right projectors, n = 2, d_1 is the distance from Q to the right boundary of the left projection image P1, and d_2 is the distance from Q to the left boundary of the right projection image P2); and K_i is the brightness attenuation coefficient for the i-th projector implemented in the fuser 34,

K_i = d_i^α / Σ_{j=1}^{n} d_j^α,

where α > 0 is the exponent of the attenuation power function implemented in the fuser 34, which adjusts the degree of non-linearity of the attenuation.

According to an embodiment of the present invention, the projectors 33 all have the same brightness, and as a simplification α is set to 1, so that

K_i = d_i / Σ_{j=1}^{n} d_j,

and the fusion of the edge regions is accomplished with an essentially linear function. This approach is simple, efficient, and easy to realize. However, because of differing service life, temporary replacement, and the like, the projectors 33 may not in fact have the same brightness, and under such conditions the basically linear attenuation can cause an abrupt brightness change at the edge of the overlap region.
To overcome the above technical problem, in another embodiment of the present invention, α and K_i are dynamically adjusted at intervals according to a logistic regression model, which determines a more reasonable value range, achieves brightness consistency across the overlapped projection images, and yields a higher degree of realism.
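As a concrete illustration, the power-function blend described above can be sketched in a few lines of Python. The function name and the convention that the overlap width has already been normalized are our illustrative choices, not details fixed by the patent; the coefficients take the normalized form K_i = d_i^α / Σ_j d_j^α so that they sum to 1:

```python
import numpy as np

def blend_luminance(distances, luminances, alpha=1.0):
    """Luminance at one point Q of the projector overlap region.

    distances[i]  -- d_i: distance from Q to the adjacent boundary of
                     projector i's image (overlap width assumed normalized)
    luminances[i] -- L_i: luminance projector i contributes at Q
    alpha         -- exponent of the power-function attenuation, alpha > 0
    """
    d = np.asarray(distances, dtype=float)
    lum = np.asarray(luminances, dtype=float)
    k = d ** alpha
    k = k / k.sum()            # attenuation coefficients K_i sum to 1
    return float((k * lum).sum())
```

With two projectors of equal brightness and alpha = 1 this reduces to the linear blend of the simplified embodiment; raising alpha weights the result more strongly toward the nearer projector.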
According to another embodiment of the present invention (shown in fig. 3), an arc-shaped projection curtain can be used in the simulation chamber 13 to make the fire scene more stereoscopic. However, most projectors 33 are designed for a flat screen, and when such a projector 33 projects an image onto a dome or arc screen, the image is distorted; this is called nonlinear distortion.
In this regard, the server 12 further includes a nonlinear distortion correction unit configured, when transmitting image information to the projector 33, to correct the nonlinear distortion of the image information according to the relative position of the projector 33 and the projection screen and the curvature of the projection screen.
For example, for a particular projection screen, the nonlinear distortion correction unit of the server 12 generates a series of contour lines and vertical lines and orthogonalizes them into a grid, referred to here as the contour-vertical grid. The unit then transmits this grid to the projector 33, which projects it onto the projection screen (the near clipping plane); this gives the corresponding location of the grid in the frame buffer.
Next, the nonlinear distortion correction unit adjusts, through the projector 33, the grid drawn on the projection screen until it appears as evenly spaced horizontal and vertical lines to the viewer, and records the relative displacement of each grid point.
Then the server 12 takes the image out of the frame buffer and, through the nonlinear distortion correction unit, texture-maps the original image with the newly obtained grid, for example by re-mapping the texture of every frame of data.
By adopting the distortion-corrected image formed by this nonlinear correction method, a higher degree of realism is obtained.
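The grid-based correction above ultimately amounts to re-sampling each frame through a measured warp grid. The following Python sketch shows only that per-pixel resampling step; the warp maps are assumed to come from the measured grid displacements, and the bilinear sampling scheme is our illustrative choice, not a detail fixed by the patent:

```python
import numpy as np

def remap_bilinear(image, map_y, map_x):
    """Resample `image` so that output[i, j] = image[map_y[i, j], map_x[i, j]].

    map_y / map_x are float arrays of the output shape giving, for every
    output pixel, the source coordinate in the frame buffer; sampling is
    bilinear, and coordinates outside the image are clamped to its border.
    """
    h, w = image.shape[:2]
    y = np.clip(map_y, 0.0, h - 1.0)
    x = np.clip(map_x, 0.0, w - 1.0)
    y0 = np.floor(y).astype(int)
    x0 = np.floor(x).astype(int)
    y1 = np.minimum(y0 + 1, h - 1)
    x1 = np.minimum(x0 + 1, w - 1)
    wy = y - y0                      # fractional parts for interpolation
    wx = x - x0
    top = image[y0, x0] * (1.0 - wx) + image[y0, x1] * wx
    bottom = image[y1, x0] * (1.0 - wx) + image[y1, x1] * wx
    return top * (1.0 - wy) + bottom * wy
```

An identity map leaves the frame unchanged; a map built from the measured grid displacements pre-distorts the frame so that it appears undistorted on the curved screen.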
On the other hand, in order to simulate the temperature in the simulation chamber 13, a heater 35 is included in the simulation chamber 13. The heater 35 may be implemented with various heat generating devices, for example a high-resistance resistor that does not emit light. To reproduce the corresponding temperature of a real fire scene, the environmental parameter information determined by the temperature determination module 22 of the server 12 controls which heaters 35 are turned on and their respective heating temperatures, thereby adjusting the heating effect in the simulation chamber 13.
On the other hand, to produce the sound effect in the simulation room 13, the server 12 transmits the stored audio information to the power amplifier 41, adjusts its power, and plays the sound through the sound box 36. One or more sound boxes 36 may be arranged in the simulation room as needed; for example, sound boxes 36 at the four corners of the roof of the simulation room 13, controlled by the sound determined by the sound determination module 21 of the server 12, can work simultaneously to create a stereo effect.
On the other hand, to produce the smoke effect in the simulation chamber 13, one or more smoke machines 37 may be arranged as needed; for example, four smoke machines 37 at the four corners of the floor of the simulation chamber 13. The number of smoke machines 37 turned on and their operating rate are controlled by the smoke condition determined by the smoke determination module 24 of the server 12, thereby adjusting the smoke effect in the simulation chamber 13.
In addition, one or more cameras 38 may be arranged in the simulation room 13 as needed; for example, two cameras 38 at the midpoints of the tops of the left and right walls of the simulation room 13, connected to the server 12 by wire or wirelessly, for observing the conditions in the simulation room 13.
A bracket 39 can further be arranged in the simulation chamber 13, on which escape equipment 40, preferably towels, gas masks, and the like, is placed.
The above description describes the apparatus of the fire simulation system, and the fire simulation method is further described below in conjunction with the apparatus of the fire simulation system.
Step one: the geometric model and the fire source parameters of the simulation chamber 13 are preset. The geometric model of the simulation chamber 13 is mainly determined by the central-axis equation and the cross-sectional dimensions of the simulation chamber 13. The central-axis equation can be set by manually entering a linear or curve equation, or obtained directly by parsing the AUTOCAD graphic format or a DXF file of the central axis. DXF (Drawing Exchange Format) is a sequential graphics-exchange file: a data file that encodes entity commands and geometric data under a fixed set of code-symbol rules, and it can be converted from the AUTOCAD graphic format. The fire source parameters mainly set the scale, position, and combustion mode of the fire source. The scale of the fire source mainly sets its heat release efficiency and total power. The position of the fire source sets where the fire source lies in the simulation chamber; in this case the position can only be chosen at a location where a high-resistance heating element is installed. The combustion modes include gasoline, kerosene, crude oil, firewood, and the like.
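Since the central axis may be read from a DXF file, a minimal parsing sketch can help make step one concrete. An ASCII DXF file stores data as alternating group-code/value lines; for a LINE entity, codes 10/20 hold the start point and 11/21 the end point. The function name and the restriction to LINE entities are our simplifications, not requirements of the patent:

```python
def read_line_segments(dxf_path):
    """Collect (x1, y1, x2, y2) tuples for every LINE entity in an ASCII DXF.

    An ASCII DXF file is a sequence of (group code, value) line pairs;
    group code 0 starts a new entity, and for a LINE entity codes 10/20
    hold the start point and 11/21 the end point.
    """
    with open(dxf_path) as f:
        tokens = [line.strip() for line in f]
    segments, current = [], None
    for code, value in zip(tokens[0::2], tokens[1::2]):
        if code == "0":                       # a new entity begins
            if current is not None and len(current) == 4:
                segments.append((current["10"], current["20"],
                                 current["11"], current["21"]))
            current = {} if value == "LINE" else None
        elif current is not None and code in ("10", "20", "11", "21"):
            current[code] = float(value)
    if current is not None and len(current) == 4:  # flush the last entity
        segments.append((current["10"], current["20"],
                         current["11"], current["21"]))
    return segments
```

The returned segments can then be assembled into the piecewise central-axis description of the simulation chamber.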
Step two: information on one or more environmental parameters is determined according to the geometric model and the fire source parameters of the simulation chamber; the environmental parameters comprise the image, temperature, smoke, and sound in the simulation chamber.
The geometric model and fire source parameters of the simulation chamber 13 are sent to the server 12. The scene determination module 23 of the server 12 performs grid segmentation on the simulation chamber 13, dividing it into a number of subspaces, and selects a suitable material to render each subspace; in particular, only the part of the model currently seen by the user need be rendered to determine the image of the simulated fire scene.
After the fire source parameters are set, the temperature determination module 22 of the server 12 determines, according to the location of the fire source, which heater or heaters 35 in the simulation room 13 perform the heating operation, and thereby determines the temperature of the simulated fire scene. The heating rate of the heater 35 is determined from the heat release efficiency of the fire source, the heating time of the heater 35 from the total power, and the power curve of the heater 35 from the combustion mode.
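The mapping from fire source parameters to one heater's settings can be sketched as follows. Every numeric relation and curve shape in this sketch is an illustrative assumption of ours, not a value taken from the patent; it only mirrors the structure of the rule above (rate from efficiency, time from total power, curve from combustion mode):

```python
def heater_settings(heat_release_efficiency, total_energy_kwh, fuel,
                    rated_power_kw=5.0):
    """Map fire-source parameters onto one heater's operating schedule.

    Returns (heating_rate_kw, heating_time_h, power_curve), where the
    curve maps normalized time t in [0, 1] to a power fraction.
    """
    heating_rate_kw = heat_release_efficiency * rated_power_kw
    heating_time_h = total_energy_kwh / heating_rate_kw
    power_curves = {
        "gasoline": lambda t: 1.0,              # fast, near-constant burn
        "kerosene": lambda t: min(1.0, 2 * t),  # quick ramp-up
        "crude_oil": lambda t: t,               # slow linear growth
        "firewood": lambda t: t * (2 - t),      # rise, then taper off
    }
    return heating_rate_kw, heating_time_h, power_curves[fuel]
```

The temperature determination module would evaluate such a schedule for each heater selected by the fire source position.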
After the fire source parameters are set, the sound determination module 21 of the server 12 determines the sound of the simulated fire scene: according to the combustion mode of the fire source, a sound-effect file prestored on the server is called to reproduce the live effect of the burning fire source.
After the fire source parameters are set, the smoke determination module 24 of the server 12 determines the ejection quantity of smoke according to the combustion mode of the fire source, and thereby the smoke of the simulated fire scene; for example, in the firewood combustion mode, which corresponds to damp conditions and insufficient contact between the wood and oxygen, it controls the generation of a large quantity of smoke.
Step three: the determined environmental parameter information is sent to the execution unit of the simulation room 13 that generates or changes one or more environmental parameters.
Step four: the execution unit of the simulation room 13 generates or changes one or more environmental parameters of the fire scene according to the acquired information. For example, the projector of the simulation room 13 projects an image; the heater 35 changes the temperature in the simulation chamber; the sound box generates sound in the chamber; the smoke machine changes the smoke condition; and the image fuser of the simulation room 13 fuses the superimposed images.
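Steps two through four can be sketched as one determine-and-dispatch loop. The determination functions below are simplified stand-ins for the server's modules 21-24, and every formula, string, and file name in them is an illustrative assumption of ours:

```python
def determine_scene(geometry, fire_source):
    # Stand-in for the scene determination module 23.
    return f"render {geometry['model']} with fire at {fire_source['position']}"

def determine_temperature(fire_source):
    # Stand-in for the temperature determination module 22 (illustrative ramp).
    return 25.0 + 10.0 * fire_source["power_kw"]

def determine_sound(fire_source):
    # Stand-in for the sound determination module 21.
    return f"{fire_source['fuel']}_burning.wav"

def determine_smoke(fire_source):
    # Stand-in for the smoke determination module 24.
    return {"wood": "heavy", "gasoline": "moderate"}.get(fire_source["fuel"], "light")

def run_fire_simulation(geometry, fire_source, units):
    """Determine each environmental parameter from the preset model and
    fire-source parameters, then forward it to the matching execution unit.

    `units` maps parameter names to objects exposing an `apply` method
    (projector, heater, sound box, smoke machine, ...).
    """
    parameters = {
        "image": determine_scene(geometry, fire_source),
        "temperature": determine_temperature(fire_source),
        "sound": determine_sound(fire_source),
        "smoke": determine_smoke(fire_source),
    }
    for name, value in parameters.items():
        if name in units:          # step three: send; step four: actuate
            units[name].apply(value)
    return parameters
```

In a real deployment each `apply` call would drive the corresponding device, and the modules could consult each other's outputs so that the parameters interact.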
It will be appreciated by those skilled in the art that one or more of the above environmental parameters may interact; for example, a simulated rise in room temperature may be accompanied by a growing fire image and more smoke, or the sound of combustion may follow the fire image, further enhancing the simulation effect.
It should be noted that the above-mentioned embodiments are only preferred embodiments of the present invention, and should not be construed as limiting the scope of the present invention, and any minor changes and modifications to the present invention are within the scope of the present invention without departing from the spirit of the present invention.

Claims (5)

1. A fire simulation system with image fusion, comprising a controller, a server, and a simulation room, wherein: the controller is connected to the server and is used to receive preset input and send it to the server; the server is connected to the simulation room, determines information on one or more environmental parameters according to the preset input, and sends the determined information to the simulation room; the simulation room has an execution unit that generates or changes one or more environmental parameters according to the information sent by the server; and the execution unit comprises an image fuser that applies a gradient-function brightness attenuation to the overlapping area of the images projected by multiple projectors in the simulation room, so that the brightness remains consistent after image splicing, specifically: the luminance L at a point Q in the overlap region of the n projectors is determined as

L = Σ_{i=1}^{n} K_i·L_i, with Σ_{i=1}^{n} K_i = 1,

where L_i is the brightness of the image projected by the i-th projector, d_i is the distance from the point Q to the adjacent boundary of the i-th projector's image, and K_i is the brightness attenuation coefficient for the i-th projector implemented in the image fuser, satisfying

K_i = d_i^α / Σ_{j=1}^{n} d_j^α,

where α > 0 is the exponent of the attenuation power function implemented in the image fuser, which adjusts the nonlinearity of the attenuation; α and K_i are obtained by dynamic adjustment according to a logistic regression model.

2. The fire simulation system with image fusion according to claim 1, wherein the environmental parameters comprise one or more of an image in the simulation room, the temperature in the simulation room, smoke in the simulation room, and sound in the simulation room.

3. The fire simulation system with image fusion according to claim 2, wherein the server comprises a scene determination module, a temperature determination module, a smoke determination module, and a sound determination module, which respectively determine the image, temperature, smoke, and sound in the simulation room.

4. The fire simulation system with image fusion according to claim 1, wherein the execution unit that generates or changes one or more environmental parameters in the simulation room further comprises one or more of a projector, a heater, a sound box, and a smoke machine.

5. A method of fire simulation using the fire simulation system according to claim 1, comprising the following steps: A) presetting the geometric model and fire source parameters of the simulation room; B) determining, according to the geometric model and fire source parameters of the simulation room, information on one or more environmental parameters in the simulation room, the environmental parameters comprising one or more of an image, the temperature, smoke, and sound in the simulation room; C) sending the determined environmental parameter information to the execution unit of the simulation room that generates or changes one or more environmental parameters; D) the execution unit of the simulation room generating or changing one or more environmental parameters of the fire scene according to the acquired information; wherein the execution unit further applies the gradient-function brightness attenuation to the overlapping area of the projected images in the simulation room, so that the brightness remains consistent after image splicing, specifically: determining the luminance L at a point Q in the overlap region of the multiple projectors as L = Σ_{i=1}^{n} K_i·L_i with Σ_{i=1}^{n} K_i = 1 and K_i = d_i^α / Σ_{j=1}^{n} d_j^α, where L_i is the brightness of the image projected by the i-th projector, d_i is the distance from Q to the adjacent boundary of the i-th projector's image, and α > 0 is the exponent of the attenuation power function, α and K_i being dynamically adjusted according to a logistic regression model.
Figure FDA0003265838540000023
α>0 is the sign of the attenuation power function implemented in the image fusion device, which is used to adjust the nonlinear degree of attenuation; among them, α and Ki are dynamically adjusted according to the logistic regression model.
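The claimed edge-blending rule can be sketched in a few lines. This is an illustrative reconstruction, assuming the overlap brightness is L_Q = Σ_i K_i·L_i with normalized power-ramp weights K_i = d_i^α / Σ_j d_j^α; the function names are hypothetical and not from the patent, and the dynamic adjustment of α via logistic regression is not modeled here.

```python
def blend_weights(distances, alpha=1.0):
    """Attenuation coefficients K_i for projectors overlapping at a point Q.

    distances[i] is d_i, the distance from Q to the i-th projector's
    adjacent blend boundary; alpha > 0 tunes how nonlinear the ramp is.
    The weights are normalized so they sum to 1 (illustrative
    reconstruction, not the patent's original code).
    """
    powered = [d ** alpha for d in distances]
    total = sum(powered)
    return [p / total for p in powered]

def blended_brightness(luminances, distances, alpha=1.0):
    """L_Q = sum_i K_i * L_i over the overlap region."""
    weights = blend_weights(distances, alpha)
    return sum(k * li for k, li in zip(weights, luminances))

# Two projectors overlap: Q is 30 px from projector 1's blend boundary
# and 10 px from projector 2's, so projector 1 dominates.
w = blend_weights([30.0, 10.0], alpha=2.0)                          # [0.9, 0.1]
lq = blended_brightness([200.0, 180.0], [30.0, 10.0], alpha=2.0)    # 198.0
```

Larger α makes the hand-off between projectors sharper near the boundary, while α = 1 gives a linear ramp; in either case the weights sum to 1, which is what keeps the stitched image's brightness uniform across the seam.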
CN201811345838.9A | 2018-12-29 | 2018-12-29 | A fire simulation system and method with image fusion | Expired - Fee Related | CN109557830B (en)

Priority Applications (1)

Application Number | Priority Date | Filing Date | Title
CN201811345838.9A | 2018-12-29 | 2018-12-29 | A fire simulation system and method with image fusion

Applications Claiming Priority (1)

Application Number | Priority Date | Filing Date | Title
CN201811345838.9A | 2018-12-29 | 2018-12-29 | A fire simulation system and method with image fusion

Publications (2)

Publication NumberPublication Date
CN109557830A CN109557830A (en)2019-04-02
CN109557830Btrue CN109557830B (en)2022-02-11

Family

ID=65866274

Family Applications (1)

Application Number | Title | Priority Date | Filing Date
CN201811345838.9A | A fire simulation system and method with image fusion (CN109557830B, Expired - Fee Related) | 2018-12-29 | 2018-12-29

Country Status (1)

Country | Link
CN (1) | CN109557830B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
CN111292243B (en) * | 2020-03-09 | 2021-04-06 | 三亚至途科技有限公司 | Projection seamless edge fusion method and device
CN114999092A (en) * | 2022-06-10 | 2022-09-02 | 北京拙河科技有限公司 | Disaster early warning method and device based on multiple forest fire model

Citations (7)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
JPH0457086A (en) * | 1990-06-27 | 1992-02-24 | Kyoto Kagaku:Kk | Fire extinguisher operation training device
CN102053812A (en) * | 2010-10-09 | 2011-05-11 | 清华大学 | Multi-projector combined display feedback brightness correction method
CN201965785U (en) * | 2011-03-17 | 2011-09-07 | 重庆欧派信息科技有限责任公司 | Firefighting simulation rescue training system
CN103295441A (en) * | 2012-02-29 | 2013-09-11 | 上海工程技术大学 | Fire disaster working condition simulation system for urban railway traffic vehicle station
CN105427701A (en) * | 2015-11-30 | 2016-03-23 | 北京众威消防科技有限公司 | Fire-fighting service operation training system and method
CN106060493A (en) * | 2016-07-07 | 2016-10-26 | 广东技术师范学院 | Multi-source projection seamless edge stitching method and system
CN109523976A (en) * | 2018-10-09 | 2019-03-26 | 青岛海信电器股份有限公司 | A kind of VR screen display method and VR display device

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
US7936361B2 (en) * | 2007-04-30 | 2011-05-03 | Hewlett-Packard Development Company, L.P. | System and method for masking and overlaying images in multiple projector system
CN106162115A (en) * | 2015-03-25 | 2016-11-23 | 上海分众软件技术有限公司 | A kind of image interfusion method based on Play System


Also Published As

Publication number | Publication date
CN109557830A (en) | 2019-04-02

Similar Documents

Publication | Publication Date | Title
US10873741B2 (en)Image processing apparatus and method
CN104536397B (en)A kind of 3D Virtual Intelligents household exchange method
US10085008B2 (en)Image processing apparatus and method
JP7631198B2 (en) Room Acoustic Simulation Using Deep Learning Image Analysis
JP2021520584A (en) Housing data collection and model generation methods
CN109557830B (en) A fire simulation system and method with image fusion
JP6132344B2 (en) Information processing apparatus and program
CN108495102A (en)Splice the seamless spliced fusion method of multi-projector of emerging system based on Unity
US20190102947A1 (en)Electronic device determining setting value of device based on at least one of device information or environment information and controlling method thereof
WO2016078009A1 (en)Real estate display system
CN116228960A (en) Construction method, construction system and display system of virtual museum display system
KR101847996B1 (en)Image projection method for a curved projection area and projection system therefor
JP5332061B2 (en) Indoor renovation cost estimation system
CN113516761A (en)Optical illusion type naked eye 3D content manufacturing method and device
CN109557829B (en) A fire simulation system and method with nonlinear distortion correction
JP2818313B2 (en) Lighting control system
CN103733619A (en)Content processing device, content processing method, and recording medium
KR20200041548A (en)A mobile apparatus and a method for controlling the mobile apparatus
CN117456076A (en) A material map generation method and related equipment
TWM559476U (en)System device with virtual reality and mixed reality house purchase experience
CN114816197B (en)Control method and device of intelligent household equipment, storage medium and terminal equipment
CN116631244A (en) System, method, storage medium and electronic device for simulating operation of mechanical equipment
TWI671711B (en) Apparatus and method for simulating light distribution in environmental space
CN116634112A (en) Multi-projector plane fusion projection correction method and system based on Unreal Engine
TW201604811A (en)Selection method of projector and inquiry system

Legal Events

Code | Title/Description
PB01 | Publication
SE01 | Entry into force of request for substantive examination
GR01 | Patent grant
CF01 | Termination of patent right due to non-payment of annual fee (granted publication date: 2022-02-11)
