Detailed Description of the Embodiments
The technical solutions in the embodiments of the present invention will be described below clearly and completely with reference to the accompanying drawings of the embodiments of the present invention. Apparently, the described embodiments are merely some rather than all of the embodiments of the present invention. All other embodiments obtained by those of ordinary skill in the art based on the embodiments of the present invention without creative effort shall fall within the protection scope of the present invention.
Referring to Fig. 1, Fig. 1 is a flowchart of an image processing method according to an embodiment of the present invention, applied to a first terminal device. As shown in Fig. 1, the method includes the following steps:

Step 101: in a case that a video connection is established with a target terminal device, obtain a first image captured by the first terminal device.

Here, the target terminal device refers to a terminal device that has established the video connection with the first terminal device. In practical applications, the first terminal device may establish the video connection with the target terminal device through social software or the like.
Here, the first terminal device may capture the first image using its own camera.
Step 102: control a second terminal device to capture a second image.

Here, the second terminal device may be any one or more terminal devices located in the same local area network as the first terminal device. For example, the first terminal device and the second terminal device may be connected via Bluetooth or the like, and may be connected to the same network through the same WiFi (Wireless Fidelity) network or mobile-phone hotspot. The first image and the second image are images with different shooting contents.
The first image and the second image being images with different shooting contents includes: the first image contains image content of a first shooting range, the second image contains image content of a second shooting range, and the first shooting range and the second shooting range are located in the same video scene; or, the first image and the second image are respectively images of the same object taken from different shooting angles. Through the different contents contained in the first image and the second image, a more complete video scene can be presented and the displayed content can be enriched.

For example, if a video scene contains video objects A and B, the first image may contain video object A and the second image may contain video object B, so that all of the video objects are covered by these images.

Alternatively, if the video scene contains video object A, the shooting angle of video object A in the first image differs from the shooting angle of video object A in the second image, so that these images cover different angles of the video object.
In this step, the first terminal device may send an image capture request to the second terminal device, where the image capture request includes information of a target local area network. The second terminal device then connects to the target local area network according to the image capture request and captures the second image. The image capture request may be, for example, a Bluetooth connection request. This solution ensures a reliable connection with the second terminal device and enables the second terminal device to capture images in a timely manner.
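The image capture request carrying the target local area network information can be sketched as follows. This is an illustrative sketch only: the message fields (`ssid`, `passphrase`, `upload_server`) and the JSON encoding are assumptions for demonstration, not part of the disclosure.

```python
import json

def build_capture_request(ssid: str, passphrase: str, server: str) -> bytes:
    """Serialize an image capture request that tells the second terminal
    which local area network (WiFi or hotspot) to join before capturing."""
    msg = {
        "type": "image_capture_request",
        "target_lan": {"ssid": ssid, "passphrase": passphrase},
        "upload_server": server,  # hypothetical cloud server address
    }
    return json.dumps(msg).encode("utf-8")

def parse_capture_request(payload: bytes) -> dict:
    """Decode the request on the second terminal side and return the LAN info."""
    msg = json.loads(payload.decode("utf-8"))
    assert msg["type"] == "image_capture_request"
    return msg["target_lan"]

req = build_capture_request("scene-hotspot", "secret", "cloud.example.com")
print(parse_capture_request(req)["ssid"])  # scene-hotspot
```

In a real system this payload would travel over the Bluetooth link established between the two devices; only the round-trip of the LAN information is shown here.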
Step 103: send the first image to a server, so that the server determines a target image according to the first image and the second image sent by the second terminal device, and sends the target image to the target terminal device.

In this embodiment of the present invention, the above image processing method may be applied to a first terminal device such as a mobile phone, a tablet personal computer, a laptop computer, a PDA (Personal Digital Assistant), an MID (Mobile Internet Device), or a wearable device.

In the embodiments of the present invention, in a case that the first terminal device has established a video connection with the target terminal device, the first image captured by the first terminal device is obtained and the second terminal device is controlled to capture the second image; then the first terminal device and the second terminal device each send the image they obtained to the server, and the server forwards the images to the target terminal device. Since the first terminal device and the second terminal device are located in the same local area network, and the first image and the second image are images with different shooting contents, the target image obtained through the embodiments of the present invention can present a more complete video scene.
Optionally, in the above embodiment, before step 101, the method may further include: obtaining a first shooting-scene center position of the first terminal device and a second shooting-scene center position of the second terminal device. According to the obtained information, it can then be determined whether the first shooting-scene center position of the first terminal device and the second shooting-scene center position of the second terminal device are located on the same horizontal line. In this case, step 101 is specifically: in a case that the first shooting-scene center position of the first terminal device and the second shooting-scene center position of the second terminal device are located on the same horizontal line, obtain the first image captured by the first terminal device. Since the shooting-scene center positions of the two terminal devices are verified, it can be ensured that the captured images accurately reflect the video scene.

That is, in this manner, it is the first terminal device that determines whether the first shooting-scene center position of the first terminal device and the second shooting-scene center position of the second terminal device are located on the same horizontal line.

On this basis, the first terminal device may further send first indication information to the second terminal device, where the first indication information indicates whether the first shooting-scene center position and the second shooting-scene center position are located on the same horizontal line, so that the second terminal device can shoot in a timely manner and the synchronism of the images obtained by the first terminal device and the second terminal device is guaranteed.

Optionally, in the above embodiment, before step 101, the method may further include: receiving second indication information sent by the second terminal device, where the second indication information indicates whether the first shooting-scene center position of the first terminal device and the second shooting-scene center position of the second terminal device are located on the same horizontal line. In this case, step 101 is specifically: in a case that the second indication information indicates that the first shooting-scene center position and the second shooting-scene center position are located on the same horizontal line, obtain the first image captured by the first terminal device. Since the shooting-scene center positions of the two terminal devices are verified, it can be ensured that the captured images accurately reflect the video scene.

That is, in this manner, it is the second terminal device that determines whether the first shooting-scene center position of the first terminal device and the second shooting-scene center position of the second terminal device are located on the same horizontal line.

If the center positions are not located on the same horizontal line, the first terminal device or the second terminal device may be adjusted, or both terminal devices may be adjusted simultaneously, so that their shooting-scene center positions are located on the same horizontal line.
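The horizontal-line check described above can be sketched as a simple comparison of the two scene-center heights. The normalized coordinates and the tolerance value below are illustrative assumptions; the disclosure does not specify how the comparison is implemented.

```python
def centers_aligned(y1: float, y2: float, tolerance: float = 0.02) -> bool:
    """Return True when the two shooting-scene centers lie on (approximately)
    the same horizontal line. y1 and y2 are the vertical positions of the
    scene centers, normalized to [0, 1] of the frame height; the tolerance
    is an assumed threshold, not taken from the disclosure."""
    return abs(y1 - y2) <= tolerance

# First device's scene center vs. second device's scene center:
print(centers_aligned(0.50, 0.51))  # True  -> verification succeeds
print(centers_aligned(0.50, 0.60))  # False -> prompt the user to adjust
```

When the check fails, either device (or both) can be repositioned and the check run again, matching the adjust-and-reverify behavior described above.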
Referring to Fig. 2, Fig. 2 is a flowchart of an image processing method according to an embodiment of the present invention, applied to a second terminal device. As shown in Fig. 2, the method includes the following steps:

Step 201: capture a second image based on control of a first terminal device.

Specifically, in this step, the second terminal device receives an image capture request sent by the first terminal device, where the image capture request includes information of a target local area network. The second terminal device then connects to the target local area network according to the image capture request, and captures the second image according to the image capture request. For example, the second terminal device may capture the image using its own camera. Through the image capture request, a reliable connection with the first terminal device and timely image capture can be ensured.

Step 202: send the second image to a server, so that the server determines a target image according to the second image and a first image sent by the first terminal device, and sends the target image to a target terminal device.

Here, the first terminal device has established a video connection with the target terminal device, and the second terminal device is located in the same local area network as the first terminal device. For example, the second terminal device may be connected with the first terminal device via Bluetooth. The first image and the second image are images with different shooting contents.
The first image and the second image being images with different shooting contents includes:

the first image contains image content of a first shooting range, the second image contains image content of a second shooting range, and the first shooting range and the second shooting range are located in the same video scene; alternatively, the first image and the second image are respectively images of the same object taken from different shooting angles.

In this embodiment of the present invention, the above image processing method may be applied to a second terminal device such as a mobile phone, a tablet computer, a laptop computer, a PDA, an MID, or a wearable device. Through the different contents contained in the first image and the second image, a more complete video scene can be presented and the displayed content can be enriched.

In the embodiments of the present invention, in a case that the first terminal device has established a video connection with the target terminal device, the first image captured by the first terminal device is obtained and the second terminal device is controlled to capture the second image; then the first terminal device and the second terminal device each send the image they obtained to the server, and the server forwards the images to the target terminal device. Since the first terminal device and the second terminal device are located in the same local area network, and the first image and the second image are images with different shooting contents, the target image obtained through the embodiments of the present invention can present a more complete video scene.
Optionally, in the above embodiment, before step 201, the method may further include: obtaining a first shooting-scene center position of the first terminal device and a second shooting-scene center position of the second terminal device. The second terminal device can then determine whether the second shooting-scene center position of the second terminal device and the first shooting-scene center position of the first terminal device are located on the same horizontal line. In this case, step 201 is specifically: in a case that the first shooting-scene center position and the second shooting-scene center position are located on the same horizontal line, capture the second image. Since the shooting-scene center positions are verified, it can be guaranteed that the obtained images accurately reflect the video scene.

That is, in this manner, it is the second terminal device that determines whether the first shooting-scene center position of the first terminal device and the second shooting-scene center position of the second terminal device are located on the same horizontal line.

On this basis, the second terminal device may further send second indication information to the first terminal device, where the second indication information indicates whether the first shooting-scene center position and the second shooting-scene center position are located on the same horizontal line. Through this indication information, the first terminal device can shoot in a timely manner and the synchronism of the images obtained by the first terminal device and the second terminal device is guaranteed.

Optionally, in the above embodiment, before step 201, the method may further include: receiving first indication information sent by the first terminal device, where the first indication information indicates whether the first shooting-scene center position of the first terminal device and the second shooting-scene center position of the second terminal device are located on the same horizontal line. In this case, step 201 is specifically: in a case that the first indication information indicates that the first shooting-scene center position and the second shooting-scene center position are located on the same horizontal line, capture the second image. Since the shooting-scene center positions are verified, it can be ensured that the captured images accurately reflect the video scene.

That is, in this manner, it is the first terminal device that determines whether the first shooting-scene center position of the first terminal device and the second shooting-scene center position of the second terminal device are located on the same horizontal line.

If the center positions are not located on the same horizontal line, the first terminal device or the second terminal device may be adjusted, or both terminal devices may be adjusted simultaneously, so that their shooting-scene center positions are located on the same horizontal line.
Referring to Fig. 3, Fig. 3 is a flowchart of an image processing method according to an embodiment of the present invention, applied to a target terminal device. As shown in Fig. 3, the method includes the following steps:

Step 301: in a case that a video connection is established with a first terminal device, receive a target image sent by a server.

In the embodiments of the present invention, the target image sent by the server is obtained by the server according to a second image of a second terminal device and a first image of the first terminal device. For example, the target image may be the first image and the second image themselves, or a composite image obtained by performing synthesis processing on the first image and the second image. The second terminal device is located in the same local area network as the first terminal device, and the first image and the second image are images with different shooting contents; for the meaning of images with different shooting contents, reference may be made to the description in the foregoing embodiments.

In the embodiments of the present invention, before this step, the target terminal device may further send an image acquisition request message to the server. Correspondingly, this step is specifically: in a case that the image acquisition request message indicates obtaining images of a first type, receive the first image and the second image sent by the server; in a case that the image acquisition request message indicates obtaining an image of a second type, receive a composite image sent by the server, where the composite image is generated by synthesizing the first image and the second image. Since the image acquisition request message is obtained, the target image sent by the server can better meet the demand of the target terminal device.

Here, images of the first type refer to individual images, that is, the images that each terminal device sends to the server are directly forwarded by the server to the target terminal device; an image of the second type refers to a composite image or the like obtained after the server synthesizes the images sent by the terminal devices.
Step 302: display the target image.

In a case that the first image and the second image sent by the server are received, in this step, the first image is displayed in a first display window and the second image is displayed in a second display window. Afterwards, a second input may also be received; in response to the second input, the first image and the second image are synthesized to generate a third image, and the third image is displayed in a third display window. Since different display windows are used to display the images, the user can conveniently view the images and better understand the video scene.

Here, the second input may be a touch, a click, or the like. The third display window may be the above first display window or second display window, or may be a new display window.

In addition, for the convenience of the user, after the above steps, the method may further include: receiving a third input, and in response to the third input, splitting the third image into at least one sub-image and then displaying the at least one sub-image respectively. That is, in this case, the synthesized image can be split and displayed in separate display windows. The third input may be a touch, a click, or the like.
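The synthesize-then-split behavior above can be sketched with a side-by-side composition. Images are represented here as plain 2-D lists of pixel values purely for illustration; a real implementation would operate on camera frames and also handle seam blending and exposure alignment, which the disclosure does not detail.

```python
def synthesize(first_img, second_img):
    """Place the two images side by side to form the 'third image'.
    Both inputs are lists of pixel rows and must have the same height."""
    assert len(first_img) == len(second_img), "same height required"
    return [row_a + row_b for row_a, row_b in zip(first_img, second_img)]

def split(third_img, width_a):
    """Split the composite back into sub-images (the one-tap split),
    given the width of the first sub-image."""
    a = [row[:width_a] for row in third_img]
    b = [row[width_a:] for row in third_img]
    return a, b

img_a = [[1, 1], [1, 1]]          # 2x2 image from the first terminal
img_b = [[2, 2, 2], [2, 2, 2]]    # 2x3 image from the second terminal
merged = synthesize(img_a, img_b)
left, right = split(merged, 2)
print(merged[0])                          # [1, 1, 2, 2, 2]
print(left == img_a and right == img_b)   # True
```

The round trip (merge, then split back into the original sub-images) mirrors the one-tap merge and one-tap split inputs described for the target terminal device.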
In a case that the first image and the second image sent by the server are received, in this step, a fourth input may also be received; according to the first image and the second image, a stereoscopic image is generated and displayed. The fourth input may be a touch, a click, or the like. Since the stereoscopic image can be viewed, the user's image viewing modes are enriched.

In this embodiment of the present invention, the above image processing method may be applied to a target terminal device such as a mobile phone, a tablet computer, a laptop computer, a PDA, an MID, or a wearable device.

In the embodiments of the present invention, in a case that the first terminal device has established a video connection with the target terminal device, the first image captured by the first terminal device is obtained and the second terminal device is controlled to capture the second image; then the first terminal device and the second terminal device each send the image they obtained to the server, and the server forwards the images to the target terminal device. Since the first terminal device and the second terminal device are located in the same local area network, and the first image and the second image are images with different shooting contents, the target image obtained through the embodiments of the present invention can present a more complete video scene.
Referring to Fig. 4, Fig. 4 is a flowchart of an image processing method according to an embodiment of the present invention, applied to a server. As shown in Fig. 4, the method includes the following steps:

Step 401: in a case that a first terminal device has established a video connection with a target terminal device, obtain a first image sent by the first terminal device.

Step 402: obtain a second image sent by a second terminal device.

Here, the second terminal device and the first terminal device are located in the same local area network, and the first image and the second image are images with different shooting contents.

Step 403: determine a target image according to the first image and the second image.

Specifically, in this step, in order to meet different demands of the target terminal device, the first image and the second image may be directly determined as the target image; alternatively, an image obtained by synthesizing the first image and the second image is determined as the target image.

Step 404: send the target image to the target terminal device.

In this embodiment, an image acquisition request message of the target terminal device may also be obtained. When the image acquisition request message indicates that the target terminal device obtains images of a first type, the first image and the second image are sent to the target terminal device; when the image acquisition request message indicates that the target terminal device obtains an image of a second type, the first image and the second image are synthesized, and the synthesized image is sent to the target terminal device. Since the demand of the target terminal device is obtained through the image acquisition request message, the target image can better meet that demand.

Here, images of the first type refer to individual images, that is, the images that each terminal device sends to the server; in this case, the server only needs to forward the images obtained from the terminal devices directly to the target terminal device. An image of the second type refers to a composite image or the like obtained after the server synthesizes the images sent by the terminal devices; in this case, the server needs to perform synthesis processing on the video images obtained from the terminal devices.
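The server-side dispatch between the two image types can be sketched as follows. The type labels and the `compose` callback are illustrative assumptions standing in for whatever request encoding and synthesis routine the server actually uses.

```python
FIRST_TYPE, SECOND_TYPE = "first", "second"

def determine_target_image(request_type, first_img, second_img, compose):
    """Server-side dispatch: forward the individual images unchanged for a
    first-type request, or return a single composite for a second-type
    request. `compose` is a placeholder for the synthesis routine."""
    if request_type == FIRST_TYPE:
        return [first_img, second_img]           # forwarded as-is
    if request_type == SECOND_TYPE:
        return [compose(first_img, second_img)]  # one composite image
    raise ValueError(f"unknown request type: {request_type}")

side_by_side = lambda a, b: a + "|" + b          # toy stand-in composition
print(determine_target_image("first", "A", "B", side_by_side))   # ['A', 'B']
print(determine_target_image("second", "A", "B", side_by_side))  # ['A|B']
```

Keeping the decision keyed on the image acquisition request message, as described above, lets the same pipeline serve both display modes of the target terminal device.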
In the embodiments of the present invention, in a case that the first terminal device has established a video connection with the target terminal device, the second terminal device is triggered to capture the second image; then the first terminal device and the second terminal device each send the image they obtained to the server, and the server forwards the images to the target terminal device. Since the first terminal device and the second terminal device are located in the same local area network, and the first image and the second image are images with different shooting contents, the target image obtained through the embodiments of the present invention can present a more complete video scene.
Referring to Fig. 5, Fig. 5 is a schematic diagram of an image processing system according to an embodiment of the present invention. As shown in Fig. 5, the system includes: a first terminal device 51, a second terminal device 52, a server 53, and a third terminal device 54. A video connection may be established between the first terminal device and the third terminal device, and the first terminal device and the second terminal device are in the same video scene. The embodiments of the present invention are described by taking two terminal devices establishing a local area network connection as an example; in practical applications, the number of terminal devices may be increased accordingly as the number of video objects varies.

In the embodiments of the present invention, when the first terminal device and the third terminal device need to establish a video call, the user of the first terminal device and several friends may be in the same scene and wish to video-chat together with the user of the third terminal device. Because the wide angle of the first terminal device's own camera is limited, the video image cannot be displayed completely on the third terminal device side. At this moment, the friends of the user of the first terminal device can use their terminal devices to join and establish a same-scene video call, so that each terminal device shoots from a different position on the same horizontal line; these shot images are then processed (for example, synthesized) so that the video image achieves a panoramic display. Thereby, the third terminal device can watch a complete video image. The image processing method of this embodiment of the present invention includes:
Step 601: in a case that the first terminal device has established a video connection with the third terminal device, the first terminal device sends an image capture request to the second terminal device.

Step 602: the second terminal device establishes a network connection with the first terminal device according to the image capture request.

For example, the first terminal device and the second terminal device may turn on Bluetooth and establish a connection through a search function. After the Bluetooth connection is established successfully, the first terminal device sends, via Bluetooth, a request to join the video chat to the second terminal device. The first terminal device and the second terminal device join the same local area network, use the same WiFi or mobile-phone hotspot network, and connect to the same group of servers on the cloud.

Step 603: verify whether the shooting-scene center positions of the first terminal device and the second terminal device are on the same horizontal line.

After the connection is established, the front cameras of the first terminal device and the second terminal device are turned on. At this point, as shown in Fig. 6, a verification function is presented on the screen of the first terminal device or the second terminal device. This function mainly takes the scene shot by the terminal device as material and performs judgment and comparison on the scene center positions.

Here, taking verification by the first terminal device as an example, the first terminal device compares the shooting-scene center position of the first terminal device with the obtained shooting-scene center position of the second terminal device, and indicates the comparison result to the second terminal device.

As shown in Fig. 7, when the centers of the material shot by the two terminal devices are not on the same horizontal line, the color of the horizontal-check function on the screen changes, for example from yellow to red, and a verification failure is prompted. As shown in Fig. 8, when the center positions of the material shot by the two terminal devices are on the same horizontal line, the color of the horizontal-check function on the screen changes, for example from yellow to green, and a verification success is prompted. At this point, multiple terminal devices can shoot simultaneously.

Fig. 9 and Fig. 10 respectively illustrate two verification scenes. In Fig. 9, the center positions of the material shot by the two terminal devices are not on the same horizontal line; in Fig. 10, the center position of the material shot by each terminal device is on the same horizontal line. If the center positions are not on the same horizontal line, the shooting positions of the two terminal devices can be adjusted and the verification is then performed again.
Step 604: when the shooting-scene center positions of the first terminal device and the second terminal device are on the same horizontal line, the first terminal device and the second terminal device shoot and capture images of their respective shooting ranges.

Fig. 11 is a schematic diagram of a shooting scene. During shooting, the positions of the terminal devices and the people are roughly as shown in the figure. When one terminal device cannot capture everyone, two or even more terminal devices shoot simultaneously.

Step 605: the first terminal device and the second terminal device send the captured images to the server.

Step 606: the server sends the obtained images to the third terminal device.

Here, according to the requirement of the third terminal device, the server may choose to directly forward the obtained image of the first terminal device and the obtained image of the second terminal device to the third terminal device, or to synthesize the obtained image of the first terminal device with the image of the second terminal device and then forward the result to the third terminal device.

Step 607: the third terminal device displays the images obtained via the server.

On the third terminal device, if the obtained images are the image of the first terminal device and the image of the second terminal device, they are displayed respectively in different display windows, as shown by 1211 and 1212 in Fig. 12. In this case, as shown in Fig. 12, a one-tap merge button 1213 may be displayed on the screen of the third terminal device. After an input to the button is received, in response to the input, the separately displayed images are synthesized to obtain one complete video image, which is displayed in a display window.

In addition, as shown in Fig. 13, a one-tap split button 1311 may be displayed at the bottom of the screen. After an input to the button is received, in response to the input, the complete video image is split again into the separate video windows.
It can be seen from the above description that, in the embodiments of the present invention, the process of establishing a video call is no longer limited by the video distance or by the wide angle of the terminal device's own front camera, one-to-many and even many-to-many video chat scenes are satisfied, and a complete video image transmission is provided, achieving a better usage experience.

In the above embodiment, different terminal devices shoot different shooting ranges. In practical applications, multiple cameras may also be used to shoot the same shooting range; the captured images are then uploaded to the server and sent by the server to the third terminal device.

As shown in Fig. 14, this achieves multi-angle shooting of the same shooting range. In this way, since multi-angle shooting has been performed, a stereoscopic image of the shooting range can be displayed on the first terminal device side or the third terminal device side, thereby realizing an AR scene. Specifically, a trigger button for stereoscopic video may be displayed on the display screen of the first terminal device. After the user's operation on this button is received, the video image switches from the original 2D plane to a multi-dimensional stereoscopic image. Then, the stereoscopic image of the shooting range can be seen on the third terminal device side. On the third terminal device side, as shown in Fig. 15, by receiving a user operation on the display screen, such as a screen-sliding operation, the received stereoscopic image can be rotated, so that the other party of the video call can be watched from different angles during the video chat.
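One simple way the slide-to-rotate viewing above could be realized is by mapping the viewer's gesture to the nearest captured camera angle. This is a speculative sketch: the angle values, file names, and nearest-view strategy are assumptions, and a production system would more likely interpolate between neighboring views.

```python
def nearest_view(captured_angles, requested_angle):
    """Pick the captured camera angle closest to the angle the viewer
    rotated to with a slide gesture. Angles are in degrees relative to
    the front-facing view."""
    return min(captured_angles, key=lambda a: abs(a - requested_angle))

# Hypothetical mapping from capture angle to the frame shot at that angle:
views = {0: "front.jpg", 45: "left.jpg", -45: "right.jpg"}
swipe_angle = 30  # derived from the slide distance on the screen
print(views[nearest_view(views.keys(), swipe_angle)])  # left.jpg
```

Snapping to the nearest captured view keeps the sketch simple; the jump between discrete views is the visual cost that view interpolation would remove.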
It can be seen from the above description that, through the technical solutions of the embodiments of the present invention, users are no longer limited by the video distance or by the wide angle of the front camera itself during the establishment of a video call, one-to-many and even many-to-many video chat scenes are satisfied, and a complete video image transmission is provided to the user.
Referring to Fig. 16, Fig. 16 is a structural diagram of a terminal device according to an embodiment of the present invention. As shown in Fig. 16, the terminal device 1600 includes:
a first acquisition module 1601, configured to acquire, in a case that a video connection is established with a target terminal device, a first image captured by the terminal device;
a control module 1602, configured to control a second terminal device to capture a second image; and
a sending module 1603, configured to send the first image to a server, so that the server determines a target image according to the first image and the second image sent by the second terminal device, and sends the target image to the target terminal device;
wherein the second terminal device and the terminal device are located in a same local area network, and the first image and the second image are images with different shot contents.
Optionally, that the first image and the second image are images with different shot contents includes: the first image includes image content of a first shooting range, the second image includes image content of a second shooting range, and the first shooting range and the second shooting range are located in a same video scene; or the first image and the second image are images of a same object taken from different shooting angles.
Optionally, the control module 1602 is specifically configured to send an image capture request to the second terminal device, the image capture request including information of a target local area network, so that the second terminal device connects to the target local area network according to the image capture request and captures the second image.
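The request flow described above (the first terminal device sends an image capture request carrying the target LAN's information; the second terminal device joins that LAN and captures the second image) can be sketched as follows. This is an illustrative sketch only: the function names, the request fields, and the callback signatures are assumptions, not part of the claimed embodiment.

```python
def build_capture_request(lan_ssid, lan_password):
    """First terminal device side: the request carries the target LAN's information."""
    return {"type": "image_capture_request",
            "target_lan": {"ssid": lan_ssid, "password": lan_password}}

def handle_capture_request(request, connect_lan, capture_image):
    """Second terminal device side: join the indicated LAN, then capture the second image."""
    lan = request["target_lan"]
    connect_lan(lan["ssid"], lan["password"])  # step 1: connect to the target LAN
    return capture_image()                     # step 2: capture the second image
```

For example, `handle_capture_request(build_capture_request("home-net", "pw"), join_fn, camera_fn)` would first invoke `join_fn` with the LAN credentials and then return whatever `camera_fn` captures.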
Optionally, the terminal device further includes:
a second acquisition module 1603, configured to obtain a first shooting scene center position of the terminal device and a second shooting scene center position of the second terminal device; the first acquisition module 1601 is specifically configured to acquire the first image captured by the terminal device in a case that the first shooting scene center position and the second shooting scene center position are located on a same horizontal line.
Optionally, the terminal device further includes:
a sending module 1604, configured to send first indication information to the second terminal device, the first indication information indicating whether the first shooting scene center position and the second shooting scene center position are located on a same horizontal line.
Optionally, the terminal device further includes:
a receiving module 1605, configured to receive second indication information sent by the second terminal device, the second indication information indicating whether the first shooting scene center position of the terminal device and the second shooting scene center position of the second terminal device are located on a same horizontal line; the first acquisition module 1601 is specifically configured to acquire the first image captured by the terminal device in a case that the second indication information indicates that the first shooting scene center position and the second shooting scene center position are located on a same horizontal line.
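The "same horizontal line" condition used by the modules above can be sketched as a simple vertical-coordinate comparison. This is a minimal illustrative check, assuming scene center positions are (x, y) coordinates in a shared frame and allowing a small tolerance; the embodiment does not specify the coordinate representation.

```python
def on_same_horizontal_line(center_a, center_b, tolerance=0):
    """Two scene centers lie on the same horizontal line when their vertical
    coordinates match to within the given tolerance."""
    return abs(center_a[1] - center_b[1]) <= tolerance
```

Capture would proceed only when this predicate returns True; otherwise the indication information described above could be sent so the other device can adjust.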
The terminal device 1600 can implement each process implemented by the first terminal device in the method embodiments of Fig. 1 to Fig. 15; to avoid repetition, details are not described herein again.
In the embodiments of the present invention, in a case that the first terminal device has established a video connection with the target terminal device, the first image captured by the first terminal device is acquired, and the second terminal device is controlled to capture the second image; the first terminal device and the second terminal device then each send the image they obtained to the server, which forwards the result to the target terminal device. Since the first terminal device and the second terminal device are located in a same local area network, and the first image and the second image are images with different shot contents, the target image obtained by the embodiments of the present invention can present a more complete video scene.
Referring to Fig. 17, Fig. 17 is a structural diagram of a terminal device according to an embodiment of the present invention. As shown in Fig. 17, the terminal device 1700 includes:
a first acquisition module 1701, configured to capture a second image based on control of a first terminal device; and
a first sending module 1702, configured to send the second image to a server, so that the server determines a target image according to the second image and a first image sent by the first terminal device, and sends the target image to a target terminal device;
wherein the first terminal device has established a video connection with the target terminal device, the terminal device and the first terminal device are located in a same local area network, and the first image and the second image are images with different shot contents.
Optionally, the first image includes image content of a first shooting range, the second image includes image content of a second shooting range, and the first shooting range and the second shooting range are located in a same video scene; or
the first image and the second image are images of a same object taken from different shooting angles.
Optionally, the first acquisition module 1701 includes:
a receiving sub-module, configured to receive an image capture request sent by the first terminal device, the image capture request including information of a target local area network;
a processing sub-module, configured to connect to the target local area network according to the image capture request; and
a capture sub-module, configured to capture the second image according to the image capture request.
Optionally, the terminal device further includes:
a second acquisition module 1703, configured to obtain a first shooting scene center position of the first terminal device and a second shooting scene center position of the terminal device.
The first acquisition module 1701 is specifically configured to capture the second image in a case that the first shooting scene center position and the second shooting scene center position are located on a same horizontal line.
Optionally, the terminal device further includes:
a second sending module 1704, configured to send second indication information to the first terminal device, the second indication information indicating whether the first shooting scene center position and the second shooting scene center position are located on a same horizontal line.
Optionally, the terminal device further includes:
a third acquisition module 1705, configured to obtain first indication information sent by the first terminal device, the first indication information indicating whether the first shooting scene center position of the first terminal device and the second shooting scene center position of the terminal device are located on a same horizontal line.
The first acquisition module 1701 is specifically configured to capture the second image in a case that the first indication information indicates that the first shooting scene center position and the second shooting scene center position are located on a same horizontal line.
The terminal device 1700 can implement each process implemented by the second terminal device in the method embodiments of Fig. 1 to Fig. 15; to avoid repetition, details are not described herein again.
In the embodiments of the present invention, in a case that the first terminal device has established a video connection with the target terminal device, the first image captured by the first terminal device is acquired, and the second terminal device is controlled to capture the second image; the first terminal device and the second terminal device then each send the image they obtained to the server, which forwards the result to the target terminal device. Since the first terminal device and the second terminal device are located in a same local area network, and the first image and the second image are images with different shot contents, the target image obtained by the embodiments of the present invention can present a more complete video scene.
Referring to Fig. 18, Fig. 18 is a structural diagram of a terminal device according to an embodiment of the present invention. As shown in Fig. 18, the terminal device 1800 includes:
a first receiving module 1801, configured to receive, in a case that a video connection is established with a first terminal device, a target image sent by a server; and
a first display module 1802, configured to display the target image;
wherein the target image is determined by the server according to a first image of the first terminal device and a second image of a second terminal device; the second terminal device and the first terminal device are located in a same local area network; and the first image and the second image are images with different shot contents.
Optionally, the terminal device further includes:
a sending module 1803, configured to send an image acquisition request message to the server.
The first receiving module 1801 includes:
a first receiving sub-module, configured to receive the first image and the second image sent by the server in a case that the image acquisition request message indicates acquiring a first-type image; and a second receiving sub-module, configured to receive a composite image sent by the server in a case that the image acquisition request message indicates acquiring a second-type image, where the composite image is generated by synthesizing the first image and the second image.
Optionally, the terminal device further includes:
a second display module 1803, configured to display the first image in a first display window and display the second image in a second display window;
a second receiving module 1804, configured to receive a second input;
a first generation module 1805, configured to synthesize the first image and the second image into a third image in response to the second input; and
a third display module 1806, configured to display the third image in a third display window.
Optionally, the terminal device further includes:
a third receiving module 1807, configured to receive a third input;
a splitting module 1808, configured to split the third image into at least one sub-image in response to the third input; and
a fourth display module 1809, configured to display the at least one sub-image respectively.
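Splitting the third image back into sub-images is the inverse of the stitching step. The sketch below is an illustrative assumption (images as lists of pixel rows, split by column widths); the embodiment does not prescribe how the split boundaries are chosen.

```python
def split_image(composite, widths):
    """Split a composite image (rows of pixels) into sub-images of the given column widths."""
    subs, start = [], 0
    for w in widths:
        subs.append([row[start:start + w] for row in composite])
        start += w
    return subs
```

Splitting a side-by-side composite with the original widths recovers the original sub-images.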
Optionally, the terminal device further includes:
a fourth receiving module 1810, configured to receive a fourth input;
a second generation module 1811, configured to generate a stereoscopic image according to the first image and the second image in response to the fourth input; and
a fifth display module 1812, configured to display the stereoscopic image.
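A stereoscopic image can be derived from two views of the same object in several ways; a red-cyan anaglyph is one common possibility. The sketch below is purely illustrative (pixels as RGB tuples, naive channel mixing) and is not the embodiment's specified stereo method.

```python
def make_anaglyph(left, right):
    """Naive red-cyan anaglyph: red channel from the left view,
    green and blue channels from the right view."""
    return [[(l[0], r[1], r[2]) for l, r in zip(row_l, row_r)]
            for row_l, row_r in zip(left, right)]
```

Viewed through red-cyan glasses, each eye then sees predominantly one of the two views, producing a depth impression.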
The terminal device 1800 can implement each process implemented by the target terminal device in the method embodiments of Fig. 1 to Fig. 15; to avoid repetition, details are not described herein again.
In the embodiments of the present invention, in a case that the first terminal device has established a video connection with the target terminal device, the first image captured by the first terminal device is acquired, and the second terminal device is controlled to capture the second image; the first terminal device and the second terminal device then each send the image they obtained to the server, which forwards the result to the target terminal device. Since the first terminal device and the second terminal device are located in a same local area network, and the first image and the second image are images with different shot contents, the target image obtained by the embodiments of the present invention can present a more complete video scene.
Referring to Fig. 19, Fig. 19 is a structural diagram of a server according to an embodiment of the present invention. As shown in Fig. 19, the server 1900 includes:
a first acquisition module 1901, configured to obtain, in a case that a first terminal device has established a video connection with a target terminal device, a first image sent by the first terminal device;
a second acquisition module 1902, configured to obtain a second image sent by a second terminal device;
a determining module 1903, configured to determine a target image according to the first image and the second image; and
a sending module 1904, configured to send the target image to the target terminal device;
wherein the second terminal device and the first terminal device are located in a same local area network, and the first image and the second image are images with different shot contents.
Optionally, the determining module 1903 is specifically configured to determine the first image and the second image as the target image, or to determine an image synthesized from the first image and the second image as the target image.
Optionally, the sending module 1904 includes:
an acquisition sub-module, configured to obtain an image acquisition request message of the target terminal device;
a first sending sub-module, configured to send the first image and the second image to the target terminal device when the image acquisition request message indicates that the target terminal device acquires a first-type image; and
a second sending sub-module, configured to synthesize the first image and the second image and send the synthesized image to the target terminal device when the image acquisition request message indicates that the target terminal device acquires a second-type image.
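The server-side branching above (raw pair for first-type requests, composite for second-type requests) can be sketched as follows. The request-type labels and the injected `synthesize` callable are illustrative assumptions, not the embodiment's actual protocol.

```python
def build_target_image(request_type, first_image, second_image, synthesize):
    """Server side: first-type requests receive both source images;
    second-type requests receive the composite produced by `synthesize`."""
    if request_type == "first":
        return first_image, second_image
    return synthesize(first_image, second_image)
```

Injecting the synthesis step as a parameter keeps the dispatch logic independent of whichever compositing method the server actually uses.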
The server 1900 can implement each process implemented by the server in the method embodiments of Fig. 1 to Fig. 15; to avoid repetition, details are not described herein again.
In the embodiments of the present invention, in a case that the first terminal device has established a video connection with the target terminal device, the first image captured by the first terminal device is acquired, and the second terminal device is controlled to capture the second image; the first terminal device and the second terminal device then each send the image they obtained to the server, which forwards the result to the target terminal device. Since the first terminal device and the second terminal device are located in a same local area network, and the first image and the second image are images with different shot contents, the target image obtained by the embodiments of the present invention can present a more complete video scene.
Fig. 20 is a schematic diagram of a hardware structure of a terminal device implementing each embodiment of the present invention. The terminal device 2000 includes, but is not limited to: a radio frequency unit 2001, a network module 2002, an audio output unit 2003, an input unit 2004, a sensor 2005, a display unit 2006, a user input unit 2007, an interface unit 2008, a memory 2009, a processor 2010, a power supply 2011, and other components. Those skilled in the art will understand that the terminal device structure shown in Fig. 20 does not constitute a limitation on the terminal device; the terminal device may include more or fewer components than illustrated, combine certain components, or arrange the components differently. In the embodiments of the present invention, the terminal device includes, but is not limited to, a mobile phone, a tablet computer, a laptop computer, a palmtop computer, a vehicle-mounted terminal device, a wearable device, a pedometer, and the like.
The processor 2010 is configured to acquire, in a case that a video connection is established with a target terminal device, a first image captured by the first terminal device, and to control a second terminal device to capture a second image.
The radio frequency unit 2001 is configured to send the first image to a server, so that the server determines a target image according to the first image and the second image sent by the second terminal device, and sends the target image to the target terminal device.
The second terminal device and the first terminal device are located in a same local area network, and the first image and the second image are images with different shot contents.
In the embodiments of the present invention, in a case that the first terminal device has established a video connection with the target terminal device, the first image captured by the first terminal device is acquired, and the second terminal device is controlled to capture the second image; the first terminal device and the second terminal device then each send the image they obtained to the server, which forwards the result to the target terminal device. Since the first terminal device and the second terminal device are located in a same local area network, and the first image and the second image are images with different shot contents, the target image obtained by the embodiments of the present invention can present a more complete video scene.
That the first image and the second image are images with different shot contents includes:
the first image includes image content of a first shooting range, the second image includes image content of a second shooting range, and the first shooting range and the second shooting range are located in a same video scene; or
the first image and the second image are images of a same object taken from different shooting angles.
Optionally, the processor 2010 is configured to send an image capture request to the second terminal device, the image capture request including information of a target local area network, so that the second terminal device connects to the target local area network according to the image capture request and captures the second image.
Optionally, the processor 2010 is configured to obtain a first shooting scene center position of the first terminal device and a second shooting scene center position of the second terminal device, and to acquire the first image captured by the first terminal device in a case that the first shooting scene center position and the second shooting scene center position are located on a same horizontal line.
Optionally, the processor 2010 is configured to send first indication information to the second terminal device, the first indication information indicating whether the first shooting scene center position and the second shooting scene center position are located on a same horizontal line.
Optionally, the processor 2010 is configured to receive second indication information sent by the second terminal device, the second indication information indicating whether the first shooting scene center position of the first terminal device and the second shooting scene center position of the second terminal device are located on a same horizontal line, and to acquire the first image captured by the first terminal device in a case that the second indication information indicates that the first shooting scene center position and the second shooting scene center position are located on a same horizontal line.
It should be understood that, in the embodiments of the present invention, the radio frequency unit 2001 may be used to send and receive signals during information transmission and reception or during a call; specifically, after receiving downlink data from a base station, it delivers the data to the processor 2010 for processing, and it sends uplink data to the base station. In general, the radio frequency unit 2001 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low-noise amplifier, a duplexer, and the like. In addition, the radio frequency unit 2001 may also communicate with a network and other devices through a wireless communication system.
The terminal device provides the user with wireless broadband Internet access through the network module 2002, for example helping the user to send and receive e-mails, browse web pages, and access streaming media.
The audio output unit 2003 may convert audio data received by the radio frequency unit 2001 or the network module 2002, or stored in the memory 2009, into an audio signal and output it as sound. Moreover, the audio output unit 2003 may also provide audio output related to a specific function performed by the terminal device 2000 (for example, a call signal reception sound or a message reception sound). The audio output unit 2003 includes a loudspeaker, a buzzer, a receiver, and the like.
The input unit 2004 is configured to receive an audio or video signal. The input unit 2004 may include a GPU (Graphics Processing Unit) 20041 and a microphone 20042. The graphics processor 20041 processes image data of still pictures or video obtained by an image capture apparatus (such as a camera) in a video capture mode or an image capture mode. The processed image frames may be displayed on the display unit 2006. The image frames processed by the graphics processor 20041 may be stored in the memory 2009 (or another storage medium) or sent via the radio frequency unit 2001 or the network module 2002. The microphone 20042 may receive sound and may process such sound into audio data. In a telephone call mode, the processed audio data may be converted into a format that can be sent to a mobile communication base station via the radio frequency unit 2001 for output.
The terminal device 2000 further includes at least one sensor 2005, for example, an optical sensor, a motion sensor, and other sensors. Specifically, the optical sensor includes an ambient light sensor and a proximity sensor; the ambient light sensor may adjust the brightness of the display panel 20061 according to the brightness of ambient light, and the proximity sensor may turn off the display panel 20061 and/or the backlight when the terminal device 2000 is moved close to the ear. As a kind of motion sensor, an accelerometer sensor can detect the magnitude of acceleration in each direction (generally three axes), can detect the magnitude and direction of gravity when stationary, and can be used to identify the posture of the terminal device (such as landscape/portrait switching, related games, and magnetometer posture calibration) and vibration-recognition-related functions (such as a pedometer and tapping). The sensor 2005 may further include a fingerprint sensor, a pressure sensor, an iris sensor, a molecular sensor, a gyroscope, a barometer, a hygrometer, a thermometer, an infrared sensor, and the like, which are not described in detail herein.
The display unit 2006 is configured to display information input by the user or information provided to the user. The display unit 2006 may include a display panel 20061, and the display panel 20061 may be configured in the form of an LCD (Liquid Crystal Display), an OLED (Organic Light-Emitting Diode), or the like.
The user input unit 2007 may be configured to receive input numeric or character information and to generate key signal inputs related to user settings and function control of the terminal device. Specifically, the user input unit 2007 includes a touch panel 20071 and other input devices 20072. The touch panel 20071, also referred to as a touch screen, collects touch operations of the user on or near it (for example, operations of the user on or near the touch panel 20071 using any suitable object or accessory such as a finger or a stylus). The touch panel 20071 may include two parts: a touch detection apparatus and a touch controller. The touch detection apparatus detects the touch orientation of the user, detects the signal brought by the touch operation, and transmits the signal to the touch controller; the touch controller receives the touch information from the touch detection apparatus, converts it into contact coordinates, sends the coordinates to the processor 2010, and receives and executes commands sent by the processor 2010. The touch panel 20071 may be implemented in multiple types such as resistive, capacitive, infrared, and surface acoustic wave. Besides the touch panel 20071, the user input unit 2007 may further include other input devices 20072. Specifically, the other input devices 20072 may include, but are not limited to, a physical keyboard, function keys (such as a volume control key and a switch key), a trackball, a mouse, and a joystick, which are not described in detail herein.
Further, the touch panel 20071 may cover the display panel 20061. After the touch panel 20071 detects a touch operation on or near it, the operation is transmitted to the processor 2010 to determine the type of the touch event, and the processor 2010 then provides corresponding visual output on the display panel 20061 according to the type of the touch event. Although in Fig. 20 the touch panel 20071 and the display panel 20061 serve as two independent components to implement the input and output functions of the terminal device, in some embodiments the touch panel 20071 and the display panel 20061 may be integrated to implement the input and output functions of the terminal device, which is not specifically limited herein.
The interface unit 2008 is an interface through which an external apparatus is connected to the terminal device 2000. For example, the external apparatus may include a wired or wireless headset port, an external power supply (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting an apparatus having an identification module, an audio input/output (I/O) port, a video I/O port, an earphone port, and the like. The interface unit 2008 may be configured to receive input (for example, data information or electric power) from an external apparatus and transmit the received input to one or more elements within the terminal device 2000, or may be configured to transmit data between the terminal device 2000 and an external apparatus.
The memory 2009 may be configured to store software programs and various data. The memory 2009 may mainly include a program storage area and a data storage area, where the program storage area may store an operating system, an application required by at least one function (such as a sound playback function or an image playback function), and the like, and the data storage area may store data created according to the use of the mobile phone (such as audio data and a phone book). In addition, the memory 2009 may include a high-speed random access memory and may further include a non-volatile memory, for example, at least one disk storage device, a flash memory device, or another solid-state storage device.
The processor 2010 is the control center of the terminal device, using various interfaces and lines to connect each part of the entire terminal device; by running or executing software programs and/or modules stored in the memory 2009 and calling data stored in the memory 2009, it performs the various functions of the terminal device and processes data, thereby monitoring the terminal device as a whole. The processor 2010 may include one or more processing units; preferably, the processor 2010 may integrate an application processor and a modem processor, where the application processor mainly handles the operating system, the user interface, applications, and the like, and the modem processor mainly handles wireless communication. It can be understood that the modem processor may also not be integrated into the processor 2010.
The terminal device 2000 may further include a power supply 2011 (such as a battery) that supplies power to each component. Preferably, the power supply 2011 may be logically connected to the processor 2010 through a power management system, so that functions such as charging management, discharging management, and power consumption management are implemented through the power management system.
In addition, the terminal device 2000 includes some functional modules that are not shown, which are not described in detail herein.
Fig. 21 is a schematic diagram of a hardware structure of a terminal device implementing each embodiment of the present invention. The terminal device 2100 includes, but is not limited to: a radio frequency unit 2101, a network module 2102, an audio output unit 2103, an input unit 2104, a sensor 2105, a display unit 2106, a user input unit 2107, an interface unit 2108, a memory 2109, a processor 2110, a power supply 2111, and other components. Those skilled in the art will understand that the terminal device structure shown in Fig. 21 does not constitute a limitation on the terminal device; the terminal device may include more or fewer components than illustrated, combine certain components, or arrange the components differently. In the embodiments of the present invention, the terminal device includes, but is not limited to, a mobile phone, a tablet computer, a laptop computer, a palmtop computer, a vehicle-mounted terminal device, a wearable device, a pedometer, and the like.
The processor 2110 is configured to capture a second image based on control of a first terminal device.
The radio frequency unit 2101 is configured to send the second image to a server, so that the server determines a target image according to the second image and a first image sent by the first terminal device, and sends the target image to a target terminal device.
The first terminal device has established a video connection with the target terminal device; the terminal device and the first terminal device are located in a same local area network; and the first image and the second image are images with different shot contents.
In the embodiments of the present invention, in a case that the first terminal device has established a video connection with the target terminal device, the first image captured by the first terminal device is acquired, and the second terminal device is controlled to capture the second image; the first terminal device and the second terminal device then each send the image they obtained to the server, which forwards the result to the target terminal device. Since the first terminal device and the second terminal device are located in a same local area network, and the first image and the second image are images with different shot contents, the target image obtained by the embodiments of the present invention can present a more complete video scene.
That the first image and the second image are images with different shot contents includes:
the first image includes image content of a first shooting range, the second image includes image content of a second shooting range, and the first shooting range and the second shooting range are located in a same video scene; or
the first image and the second image are images of a same object taken from different shooting angles.
Optionally, the processor 2110 is configured to receive an image capture request sent by the first terminal device, the image capture request including information of a target local area network; to connect to the target local area network according to the image capture request; and to capture the second image according to the image capture request.
Optionally, the processor 2110 is configured to obtain a first shooting scene center position of the first terminal device and a second shooting scene center position of the terminal device, and to capture the second image in a case that the first shooting scene center position and the second shooting scene center position are located on a same horizontal line.
Optionally, the processor 2110 is configured to send second indication information to the first terminal device, the second indication information indicating whether the first shooting scene center position and the second shooting scene center position are located on a same horizontal line.
Optionally, the processor 2110 is configured to obtain first indication information sent by the first terminal device, the first indication information indicating whether the first shooting scene center position of the first terminal device and the second shooting scene center position of the terminal device are located on a same horizontal line, and to capture the second image in a case that the first indication information indicates that the first shooting scene center position and the second shooting scene center position are located on a same horizontal line.
It should be understood that, in the embodiment of the present invention, the radio frequency unit 2101 may be configured to receive and send signals during information transmission or a call. Specifically, after receiving downlink data from a base station, the radio frequency unit 2101 delivers the data to the processor 2110 for processing; in addition, it sends uplink data to the base station. Generally, the radio frequency unit 2101 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low-noise amplifier, a duplexer, and the like. In addition, the radio frequency unit 2101 may further communicate with a network and other devices through a wireless communication system.
The terminal device provides wireless broadband Internet access for the user through the network module 2102, for example, helping the user to send and receive e-mails, browse web pages, and access streaming video.
The audio output unit 2103 may convert audio data received by the radio frequency unit 2101 or the network module 2102, or stored in the memory 2109, into an audio signal and output the audio signal as sound. Moreover, the audio output unit 2103 may further provide audio output related to a specific function performed by the terminal device 2100 (for example, a call signal reception sound or a message reception sound). The audio output unit 2103 includes a loudspeaker, a buzzer, a receiver, and the like.
The input unit 2104 is configured to receive an audio or video signal. The input unit 2104 may include a GPU (Graphics Processing Unit) 21041 and a microphone 21042. The graphics processor 21041 processes image data of a static picture or a video obtained by an image capture apparatus (such as a camera) in a video capture mode or an image capture mode. A processed image frame may be displayed on the display unit 2106. The image frame processed by the graphics processor 21041 may be stored in the memory 2109 (or another storage medium), or sent via the radio frequency unit 2101 or the network module 2102. The microphone 21042 may receive sound and may process such sound into audio data. In a telephone call mode, the processed audio data may be converted into a format that can be sent to a mobile communication base station via the radio frequency unit 2101, and output.
The terminal device 2100 further includes at least one sensor 2105, for example, an optical sensor, a motion sensor, and other sensors. Specifically, the optical sensor includes an ambient light sensor and a proximity sensor, where the ambient light sensor may adjust the brightness of the display panel 21061 according to the brightness of ambient light, and the proximity sensor may turn off the display panel 21061 and/or the backlight when the terminal device 2100 is moved close to the ear. As a motion sensor, an accelerometer sensor can detect the magnitude of acceleration in each direction (generally three axes), and can detect the magnitude and direction of gravity when the device is static, which may be used to identify the terminal device posture (such as landscape/portrait switching, related games, and magnetometer posture calibration), vibration identification related functions (such as a pedometer and tapping), and the like. The sensor 2105 may further include a fingerprint sensor, a pressure sensor, an iris sensor, a molecular sensor, a gyroscope, a barometer, a hygrometer, a thermometer, an infrared sensor, and the like; details are not described herein.
The display unit 2106 is configured to display information input by the user or information provided to the user. The display unit 2106 may include a display panel 21061, and the display panel 21061 may be configured in a form of an LCD (Liquid Crystal Display), an OLED (Organic Light-Emitting Diode), or the like.
The user input unit 2107 may be configured to receive input numeric or character information, and generate key signal input related to user settings and function control of the terminal device. Specifically, the user input unit 2107 includes a touch panel 21071 and other input devices 21072. The touch panel 21071, also referred to as a touchscreen, collects a touch operation performed by the user on or near it (for example, an operation performed by the user on or near the touch panel 21071 with a finger, a stylus, or any suitable object or accessory). The touch panel 21071 may include two parts: a touch detection apparatus and a touch controller. The touch detection apparatus detects the touch orientation of the user, detects a signal brought by the touch operation, and transmits the signal to the touch controller; the touch controller receives touch information from the touch detection apparatus, converts the touch information into contact coordinates, sends the contact coordinates to the processor 2110, and receives and executes commands sent by the processor 2110. In addition, the touch panel 21071 may be implemented in multiple types such as resistive, capacitive, infrared, and surface acoustic wave. In addition to the touch panel 21071, the user input unit 2107 may further include other input devices 21072. Specifically, the other input devices 21072 may include, but are not limited to, a physical keyboard, function keys (such as a volume control key and a switch key), a trackball, a mouse, and a joystick; details are not described herein.
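The touch pipeline described above (detection apparatus, touch controller, contact coordinates, processor) can be sketched roughly as follows. The raw-signal layout and the coordinate conversion are hypothetical choices for illustration; real touch controllers are hardware-specific.

```python
def touch_controller(raw_signal):
    """Convert a raw signal from the touch detection apparatus into contact
    coordinates, as the touch controller described above does. The raw-signal
    format (normalized x/y plus panel size) is an illustrative assumption."""
    x = int(raw_signal["norm_x"] * raw_signal["panel_width"])
    y = int(raw_signal["norm_y"] * raw_signal["panel_height"])
    return (x, y)


def processor_handle_touch(coords, handlers):
    """The processor receives contact coordinates and dispatches them; a
    simple handler table stands in for the real event handling."""
    return handlers["touch"](coords)
```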
Further, the touch panel 21071 may cover the display panel 21061. After detecting a touch operation on or near the touch panel 21071, the touch panel 21071 transmits the touch operation to the processor 2110 to determine the type of the touch event, and then the processor 2110 provides corresponding visual output on the display panel 21061 according to the type of the touch event. Although in Figure 21 the touch panel 21071 and the display panel 21061 implement the input and output functions of the terminal device as two independent components, in some embodiments the touch panel 21071 and the display panel 21061 may be integrated to implement the input and output functions of the terminal device, which is not specifically limited herein.
The interface unit 2108 is an interface for connecting an external apparatus to the terminal device 2100. For example, the external apparatus may include a wired or wireless headset port, an external power supply (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting an apparatus having an identification module, an audio input/output (I/O) port, a video I/O port, an earphone port, and the like. The interface unit 2108 may be configured to receive input (for example, data information or electric power) from the external apparatus and transmit the received input to one or more elements in the terminal device 2100, or may be configured to transmit data between the terminal device 2100 and the external apparatus.
The memory 2109 may be configured to store software programs and various data. The memory 2109 may mainly include a program storage area and a data storage area, where the program storage area may store an operating system, an application program required by at least one function (such as a sound playback function and an image playback function), and the like; and the data storage area may store data created according to use of the mobile phone (such as audio data and a phone book), and the like. In addition, the memory 2109 may include a high-speed random access memory, and may further include a non-volatile memory, for example, at least one magnetic disk storage device, a flash memory device, or another non-volatile solid-state storage device.
The processor 2110 is the control center of the terminal device, and connects all parts of the entire terminal device by using various interfaces and lines. By running or executing the software programs and/or modules stored in the memory 2109 and invoking the data stored in the memory 2109, the processor 2110 executes the various functions of the terminal device and processes data, thereby performing overall monitoring of the terminal device. The processor 2110 may include one or more processing units; preferably, the processor 2110 may integrate an application processor and a modem processor, where the application processor mainly processes the operating system, the user interface, the application program, and the like, and the modem processor mainly processes wireless communication. It can be understood that the modem processor may alternatively not be integrated into the processor 2110.
The terminal device 2100 may further include a power supply 2111 (such as a battery) that supplies power to all components. Preferably, the power supply 2111 may be logically connected to the processor 2110 through a power management system, so as to implement functions such as charging management, discharging management, and power consumption management through the power management system.
In addition, the terminal device 2100 includes some functional modules that are not shown; details are not described herein.
Figure 22 is a schematic diagram of a hardware structure of a terminal device for implementing each embodiment of the present invention. The terminal device 2200 includes, but is not limited to, components such as a radio frequency unit 2201, a network module 2202, an audio output unit 2203, an input unit 2204, a sensor 2205, a display unit 2206, a user input unit 2207, an interface unit 2208, a memory 2209, a processor 2210, and a power supply 2211. Those skilled in the art may understand that the terminal device structure shown in Figure 22 does not constitute a limitation on the terminal device, and the terminal device may include more or fewer components than those shown, or combine some components, or have a different component arrangement. In the embodiment of the present invention, the terminal device includes, but is not limited to, a mobile phone, a tablet computer, a laptop computer, a palmtop computer, a vehicle-mounted terminal device, a wearable device, a pedometer, and the like.
The radio frequency unit 2201 is configured to receive, in a case where a video connection is established with the first terminal device, a target image sent by a server;
the display unit 2206 is configured to display the target image;
where the target image is determined by the server according to a first image of the first terminal device and a second image of a second terminal device; the second terminal device and the first terminal device are located in the same local area network; and the first image and the second image are images with different shooting contents.
In the embodiment of the present invention, in a case where the first terminal device establishes a video connection with the target terminal device, the first image acquired by the first terminal device is obtained, and the second terminal device is controlled to acquire the second image. Then, the first terminal device and the second terminal device each send the image they obtained to the server, which forwards the images to the target terminal device. Since the first terminal device and the second terminal device are located in the same local area network, and the first image and the second image are images with different shooting contents, the target image obtained through the embodiment of the present invention can present a more complete video scene.
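The end-to-end flow summarized above can be sketched as follows, with every capture and network step replaced by a stand-in callable. This is an illustrative sketch under those assumptions, not the claimed implementation.

```python
def relay_video_images(first_capture, second_capture, server_forward):
    """Sketch of the flow described above: the first terminal acquires the
    first image, triggers the second terminal (reachable over the same local
    area network) to acquire the second image, and both images are uploaded
    to the server, which forwards them toward the target terminal."""
    first_image = first_capture()    # first terminal's own camera
    second_image = second_capture()  # second terminal, triggered over the LAN
    return server_forward([first_image, second_image])
```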
Optionally, the processor 2210 is configured to send an image acquisition request message to the server; to receive, in a case where the image acquisition request message indicates obtaining a first-type image, the first image and the second image sent by the server; and to receive, in a case where the image acquisition request message indicates obtaining a second-type image, a composite image sent by the server, where the composite image is generated by synthesizing the first image and the second image.
Optionally, the processor 2210 is configured to display the first image in a first display window and display the second image in a second display window; to receive a second input; and, in response to the second input, to synthesize the first image and the second image to generate a third image,
and display the third image in a third display window.
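The synthesis step above is not fixed to any particular algorithm; one minimal way to realize it is horizontal concatenation of two equal-height images, sketched here with images as 2D lists of pixel values. The representation and function name are illustrative assumptions.

```python
def synthesize_third_image(first_image, second_image):
    """Synthesize the first and second images into a third image by placing
    them side by side. Images are 2D lists of pixels of equal height."""
    if len(first_image) != len(second_image):
        raise ValueError("images must have the same height to be concatenated")
    return [row1 + row2 for row1, row2 in zip(first_image, second_image)]
```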
Optionally, the processor 2210 is configured to receive a third input; in response to the third input, to split the third image into at least one sub-image; and to display the at least one sub-image respectively.
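A splitting step that undoes a side-by-side synthesis can be sketched as follows; describing the split by a list of column widths is an illustrative choice, since the text leaves the splitting rule open.

```python
def split_into_subimages(image, widths):
    """Split a composite image (2D list of pixels) into sub-images with the
    given column widths, inverting a side-by-side concatenation."""
    subimages, offset = [], 0
    for width in widths:
        subimages.append([row[offset:offset + width] for row in image])
        offset += width
    return subimages
```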
Optionally, the processor 2210 is configured to receive a fourth input; in response to the fourth input, to generate a stereo image according to the first image and the second image; and to display the stereo image.
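The text does not specify how the stereo image is generated from the two views; one common technique is a red-cyan anaglyph, sketched below with pixels as (r, g, b) tuples. This is an illustrative choice, not necessarily the method used in the embodiment.

```python
def make_anaglyph(left_view, right_view):
    """Build a red-cyan anaglyph stereo image: red channel from the left
    view, green and blue channels from the right view. Both views are 2D
    lists of (r, g, b) tuples with matching dimensions."""
    return [
        [(lp[0], rp[1], rp[2]) for lp, rp in zip(lrow, rrow)]
        for lrow, rrow in zip(left_view, right_view)
    ]
```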
It should be understood that, in the embodiment of the present invention, the radio frequency unit 2201 may be configured to receive and send signals during information transmission or a call. Specifically, after receiving downlink data from a base station, the radio frequency unit 2201 delivers the data to the processor 2210 for processing; in addition, it sends uplink data to the base station. Generally, the radio frequency unit 2201 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low-noise amplifier, a duplexer, and the like. In addition, the radio frequency unit 2201 may further communicate with a network and other devices through a wireless communication system.
The terminal device provides wireless broadband Internet access for the user through the network module 2202, for example, helping the user to send and receive e-mails, browse web pages, and access streaming video.
The audio output unit 2203 may convert audio data received by the radio frequency unit 2201 or the network module 2202, or stored in the memory 2209, into an audio signal and output the audio signal as sound. Moreover, the audio output unit 2203 may further provide audio output related to a specific function performed by the terminal device 2200 (for example, a call signal reception sound or a message reception sound). The audio output unit 2203 includes a loudspeaker, a buzzer, a receiver, and the like.
The input unit 2204 is configured to receive an audio or video signal. The input unit 2204 may include a GPU (Graphics Processing Unit) 22041 and a microphone 22042. The graphics processor 22041 processes image data of a static picture or a video obtained by an image capture apparatus (such as a camera) in a video capture mode or an image capture mode. A processed image frame may be displayed on the display unit 2206. The image frame processed by the graphics processor 22041 may be stored in the memory 2209 (or another storage medium), or sent via the radio frequency unit 2201 or the network module 2202. The microphone 22042 may receive sound and may process such sound into audio data. In a telephone call mode, the processed audio data may be converted into a format that can be sent to a mobile communication base station via the radio frequency unit 2201, and output.
The terminal device 2200 further includes at least one sensor 2205, for example, an optical sensor, a motion sensor, and other sensors. Specifically, the optical sensor includes an ambient light sensor and a proximity sensor, where the ambient light sensor may adjust the brightness of the display panel 22061 according to the brightness of ambient light, and the proximity sensor may turn off the display panel 22061 and/or the backlight when the terminal device 2200 is moved close to the ear. As a motion sensor, an accelerometer sensor can detect the magnitude of acceleration in each direction (generally three axes), and can detect the magnitude and direction of gravity when the device is static, which may be used to identify the terminal device posture (such as landscape/portrait switching, related games, and magnetometer posture calibration), vibration identification related functions (such as a pedometer and tapping), and the like. The sensor 2205 may further include a fingerprint sensor, a pressure sensor, an iris sensor, a molecular sensor, a gyroscope, a barometer, a hygrometer, a thermometer, an infrared sensor, and the like; details are not described herein.
The display unit 2206 is configured to display information input by the user or information provided to the user. The display unit 2206 may include a display panel 22061, and the display panel 22061 may be configured in a form of an LCD (Liquid Crystal Display), an OLED (Organic Light-Emitting Diode), or the like.
The user input unit 2207 may be configured to receive input numeric or character information, and generate key signal input related to user settings and function control of the terminal device. Specifically, the user input unit 2207 includes a touch panel 22071 and other input devices 22072. The touch panel 22071, also referred to as a touchscreen, collects a touch operation performed by the user on or near it (for example, an operation performed by the user on or near the touch panel 22071 with a finger, a stylus, or any suitable object or accessory). The touch panel 22071 may include two parts: a touch detection apparatus and a touch controller. The touch detection apparatus detects the touch orientation of the user, detects a signal brought by the touch operation, and transmits the signal to the touch controller; the touch controller receives touch information from the touch detection apparatus, converts the touch information into contact coordinates, sends the contact coordinates to the processor 2210, and receives and executes commands sent by the processor 2210. In addition, the touch panel 22071 may be implemented in multiple types such as resistive, capacitive, infrared, and surface acoustic wave. In addition to the touch panel 22071, the user input unit 2207 may further include other input devices 22072. Specifically, the other input devices 22072 may include, but are not limited to, a physical keyboard, function keys (such as a volume control key and a switch key), a trackball, a mouse, and a joystick; details are not described herein.
Further, the touch panel 22071 may cover the display panel 22061. After detecting a touch operation on or near the touch panel 22071, the touch panel 22071 transmits the touch operation to the processor 2210 to determine the type of the touch event, and then the processor 2210 provides corresponding visual output on the display panel 22061 according to the type of the touch event. Although in Figure 22 the touch panel 22071 and the display panel 22061 implement the input and output functions of the terminal device as two independent components, in some embodiments the touch panel 22071 and the display panel 22061 may be integrated to implement the input and output functions of the terminal device, which is not specifically limited herein.
The interface unit 2208 is an interface for connecting an external apparatus to the terminal device 2200. For example, the external apparatus may include a wired or wireless headset port, an external power supply (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting an apparatus having an identification module, an audio input/output (I/O) port, a video I/O port, an earphone port, and the like. The interface unit 2208 may be configured to receive input (for example, data information or electric power) from the external apparatus and transmit the received input to one or more elements in the terminal device 2200, or may be configured to transmit data between the terminal device 2200 and the external apparatus.
The memory 2209 may be configured to store software programs and various data. The memory 2209 may mainly include a program storage area and a data storage area, where the program storage area may store an operating system, an application program required by at least one function (such as a sound playback function and an image playback function), and the like; and the data storage area may store data created according to use of the mobile phone (such as audio data and a phone book), and the like. In addition, the memory 2209 may include a high-speed random access memory, and may further include a non-volatile memory, for example, at least one magnetic disk storage device, a flash memory device, or another non-volatile solid-state storage device.
The processor 2210 is the control center of the terminal device, and connects all parts of the entire terminal device by using various interfaces and lines. By running or executing the software programs and/or modules stored in the memory 2209 and invoking the data stored in the memory 2209, the processor 2210 executes the various functions of the terminal device and processes data, thereby performing overall monitoring of the terminal device. The processor 2210 may include one or more processing units; preferably, the processor 2210 may integrate an application processor and a modem processor, where the application processor mainly processes the operating system, the user interface, the application program, and the like, and the modem processor mainly processes wireless communication. It can be understood that the modem processor may alternatively not be integrated into the processor 2210.
The terminal device 2200 may further include a power supply 2211 (such as a battery) that supplies power to all components. Preferably, the power supply 2211 may be logically connected to the processor 2210 through a power management system, so as to implement functions such as charging management, discharging management, and power consumption management through the power management system.
In addition, the terminal device 2200 includes some functional modules that are not shown; details are not described herein.
As shown in Figure 23, the server of the embodiment of the present invention includes a processor 2300, configured to read a program in a memory 2320 and execute the following process:
obtaining, in a case where a first terminal device establishes a video connection with a target terminal device, a first image sent by the first terminal device;
obtaining a second image sent by a second terminal device;
determining a target image according to the first image and the second image; and
sending the target image to the target terminal device;
where the second terminal device and the first terminal device are located in the same local area network, and the first image and the second image are images with different shooting contents.
A transceiver 2310 is configured to send and receive data under the control of the processor 2300.
In Figure 23, the bus architecture may include any number of interconnected buses and bridges, which specifically link together various circuits of the one or more processors represented by the processor 2300 and the memory represented by the memory 2320. The bus architecture may also link together various other circuits such as peripheral devices, voltage regulators, and power management circuits, all of which are well known in the art; therefore, they are not further described herein. A bus interface provides an interface. The transceiver 2310 may be a plurality of elements, that is, it includes a transmitter and a receiver, providing a unit for communicating with various other apparatuses over a transmission medium. The processor 2300 is responsible for managing the bus architecture and general processing, and the memory 2320 may store data used by the processor 2300 when performing operations.
The processor 2300 is further configured to read the computer program and execute the following steps:
determining the first image and the second image as the target image;
or, determining an image obtained by synthesizing the first image and the second image as the target image.
The processor 2300 is further configured to read the computer program and execute the following steps:
obtaining an image acquisition request message of the target terminal device;
sending, in a case where the image acquisition request message indicates that the target terminal device obtains a first-type image, the first image and the second image to the target terminal device; and
synthesizing, in a case where the image acquisition request message indicates that the target terminal device obtains a second-type image, the first image and the second image, and sending the synthesized image to the target terminal device.
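The server-side dispatch between the two request types can be sketched as follows. The string labels for the request types and the concatenation used as a placeholder for real synthesis are assumptions introduced for illustration.

```python
def handle_image_acquisition_request(request_type, first_image, second_image):
    """Dispatch an image acquisition request as described above: forward
    both images for a first-type request, or one synthesized image for a
    second-type request."""
    if request_type == "first":
        return [first_image, second_image]
    if request_type == "second":
        return [first_image + second_image]  # placeholder for real synthesis
    raise ValueError(f"unknown request type: {request_type}")
```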
In the embodiment of the present invention, in a case where the first terminal device establishes a video connection with the target terminal device, the first image acquired by the first terminal device is obtained, and the second terminal device is controlled to acquire the second image. Then, the first terminal device and the second terminal device each send the image they obtained to the server, which forwards the images to the target terminal device. Since the first terminal device and the second terminal device are located in the same local area network, and the first image and the second image are images with different shooting contents, the target image obtained through the embodiment of the present invention can present a more complete video scene.
Preferably, the embodiment of the present invention further provides a terminal device, including a processor, a memory, and a computer program stored in the memory and executable on the processor. When the computer program is executed by the processor, each process of the foregoing image processing method embodiment is implemented, and the same technical effect can be achieved. To avoid repetition, details are not described herein again.
The embodiment of the present invention further provides a computer-readable storage medium on which a computer program is stored. When the computer program is executed by a processor, each process of the foregoing image processing method embodiment is implemented, and the same technical effect can be achieved. To avoid repetition, details are not described herein again. The computer-readable storage medium is, for example, a ROM (Read-Only Memory), a RAM (Random Access Memory), a magnetic disk, or an optical disc.
It should be noted that, in this document, the terms "include", "comprise", or any other variant thereof are intended to cover a non-exclusive inclusion, so that a process, method, article, or apparatus that includes a series of elements not only includes those elements but also includes other elements that are not explicitly listed, or further includes elements inherent to such a process, method, article, or apparatus. Without further restrictions, an element limited by the statement "including a ..." does not exclude the existence of other identical elements in the process, method, article, or apparatus that includes the element.
Through the foregoing description of the embodiments, those skilled in the art can clearly understand that the methods of the foregoing embodiments may be implemented by means of software plus a necessary general hardware platform, and certainly may also be implemented by hardware, but in many cases the former is the better implementation. Based on such an understanding, the technical solution of the present invention, or the part contributing to the prior art, may be embodied in the form of a software product. The computer software product is stored in a storage medium (such as a ROM/RAM, a magnetic disk, or an optical disc) and includes several instructions for enabling a terminal device (which may be a mobile phone, a computer, a server, an air conditioner, a network device, or the like) to perform the methods described in the embodiments of the present invention.
The embodiments of the present invention have been described above with reference to the accompanying drawings, but the present invention is not limited to the foregoing specific embodiments. The foregoing specific embodiments are merely illustrative rather than restrictive. Inspired by the present invention, those of ordinary skill in the art may also devise many other forms without departing from the scope protected by the purpose of the present invention and the claims, all of which fall within the protection of the present invention.