Shooting method and electronic equipment
Technical Field
Embodiments of the present application relate to the field of terminal technologies, and in particular, to a shooting method and an electronic device.
Background
Electronic devices such as mobile phones and tablets are usually provided with a photographing function. For example, a camera application installed in the electronic device may provide functions for shooting pictures and videos.
However, when the photographing function is used, if a user wants to photograph something to the side or behind, the user generally has to turn his or her body. In addition, if the user wants to capture both the scene in front of the eyes and himself or herself (or the scene behind) in one picture, this is difficult to achieve without help from other people.
Disclosure of Invention
The application provides a shooting method and an electronic device, which use the folding hinge of an electronic device with a foldable screen to flexibly realize shooting at various angles and large-angle panoramic shooting.
In order to achieve the above purpose, the application adopts the following technical scheme:
In a first aspect, a photographing method is provided, which is applied to an electronic device, such as a mobile phone in form three shown in fig. 4A and fig. 4B. The electronic device includes a display screen (hereinafter, a folding screen 401 or a folding screen 411) and a rear camera (such as a rear camera 403). The display screen includes a first sub-screen (hereinafter, a third screen 4013 or a third screen 4113), a second sub-screen (hereinafter, a second screen 4012 or a second screen 4112) and a third sub-screen (hereinafter, a first screen 4011 or a first screen 4111). The first sub-screen, the second sub-screen and the third sub-screen are three different display areas of the display screen; the second sub-screen is located between the first sub-screen and the third sub-screen when the display screen is in an unfolded state; the display directions of the first sub-screen and the second sub-screen are opposite, and the display directions of the second sub-screen and the third sub-screen are opposite; and the rear camera is disposed on the back of the third sub-screen when the display screen is in the folded state.
Specifically, a first operation of a user, such as a click operation on a camera icon on the desktop, is received. In response to the first operation, a camera application in the electronic device is started, and a first interface (i.e., the viewfinder interface of ordinary photographing, such as interface 602 below) is displayed on at least one sub-screen of the display screen, where the first interface includes a first view and a first control (e.g., the front-to-back switching control 6021 in interface 602 or "more" 7011 in interface 701). It may be understood that a view refers to the live viewfinder picture of the camera. A second operation of the user on the first control is received. In response to the second operation, a second interface (hereinafter interface 604) including a second view and an identification of rotation shooting (e.g., shooting entry 6031 in interface 604) is displayed on the first sub-screen. The identification of rotation shooting may be used to indicate that the device has currently switched to rotation shooting. That is, the first control may be used to trigger the electronic device to switch from ordinary shooting to rotation shooting.
After entering rotation shooting, the second interface remains displayed on the first sub-screen as the first included angle (delta hereinafter) between the first sub-screen and the second sub-screen and/or the second included angle (gamma hereinafter) between the second sub-screen and the third sub-screen changes; that is, the second interface is kept on the first sub-screen, and the screen displaying the user interface is not switched based on the device posture. A photographing operation of the user, such as a click operation on the photographing button, is received. In response to the photographing operation, an image is shot by the rear camera.
In summary, with the solution of the application, the electronic device can provide two shooting modes through the camera application: one is ordinary shooting, and the other is rotation shooting. After rotation shooting is triggered, the electronic device always displays the user interface on the first sub-screen, even if the second sub-screen and the third sub-screen are rotated to change the view range of the rear camera and thereby change the first included angle and the second included angle. The first sub-screen remains visible to the user and does not rotate with the rear camera, so the user can see the viewfinder picture throughout the rotation shooting process. Thus, in specific shooting scenarios, such as capturing the scene in front of the eyes and the user on one photo, or taking a multi-person group photo, the user can realize large-angle shooting merely by rotating the second sub-screen and the third sub-screen, without turning the body or the mobile phone, and a good preview effect can be maintained on the first sub-screen during shooting.
Further, in response to the second operation, the electronic device may record a first identification (identification a hereinafter) for indicating that the device has currently switched to rotation shooting. Then, as long as the first identification can be queried, the electronic device keeps displaying the second interface on the first sub-screen as the first included angle and the second included angle change. Of course, after exiting rotation shooting, the electronic device may clear the first identification; subsequently, the electronic device may switch the screen based on the device posture instead of always keeping the user interface displayed on the first sub-screen.
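For illustration only, the following Kotlin sketch shows how a flag such as the first identification could decide which sub-screen shows the user interface. All names, and the posture-based fallback rule, are hypothetical assumptions rather than the actual implementation of the application.

```kotlin
// Hypothetical sketch: gating screen selection on the rotation-shooting flag.
// Screen names and the posture-based fallback rule are illustrative assumptions.

enum class SubScreen { FIRST, SECOND, THIRD }

data class DevicePosture(val delta: Float, val gamma: Float) // first/second included angles, in degrees

class RotationShootingState {
    var identificationA: Boolean = false // the "first identification" recorded on entering rotation shooting

    fun enterRotationShooting() { identificationA = true }
    fun exitRotationShooting() { identificationA = false }

    // While the identification is set, the UI stays on the first sub-screen regardless of posture.
    fun screenForUi(posture: DevicePosture): SubScreen =
        if (identificationA) SubScreen.FIRST
        else screenFromPosture(posture)

    // Assumed fallback: pick a screen from the hinge angles when not in rotation shooting.
    private fun screenFromPosture(posture: DevicePosture): SubScreen =
        if (posture.delta > 90f && posture.gamma > 90f) SubScreen.SECOND else SubScreen.FIRST
}

fun main() {
    val state = RotationShootingState()
    state.enterRotationShooting()
    println(state.screenForUi(DevicePosture(delta = 30f, gamma = 150f)))  // FIRST: flag keeps UI on first sub-screen
    state.exitRotationShooting()
    println(state.screenForUi(DevicePosture(delta = 150f, gamma = 150f))) // SECOND: posture-based again
}
```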
After entering rotation shooting, the electronic device further provides a function of dynamically turning the rear mirror image on or off based on the first included angle and the second included angle (i.e., function two hereinafter). The following are respectively described:
In one possible design manner of the first aspect, the displaying the second interface in the first sub-screen in response to the second operation includes: in response to the second operation, displaying, in the first sub-screen, the second interface in which the second view is a mirror image if the orientation angle (θ hereinafter) of the rear camera is smaller than the first angle or larger than the fourth angle (angle 10 hereinafter); and displaying, in the first sub-screen, the second interface in which the second view is a non-mirror image if the orientation angle is larger than the second angle (angle 4 hereinafter) and smaller than the third angle (angle 9 hereinafter). The first angle and the second angle are greater than 0° and less than 180°, typically close to 90°, and the first angle is less than the second angle; the third angle and the fourth angle are greater than 180° and less than 360°, typically close to 270°, and the third angle is less than the fourth angle.
The orientation angle characterizes the lens direction of the rear camera.
Specifically, the display direction of the first sub-screen may be set as the 0° direction of the orientation angle. During rotation shooting, the first sub-screen generally faces the user, so the display direction of the first sub-screen is taken as the 0° direction of the orientation angle. That is, the orientation angle is 0° when the lens of the rear camera points in the direction opposite to the user's orientation (i.e., toward the user), and is 180° when the lens points in the same direction as the user's orientation (i.e., toward the scene in front of the user).
Of course, the 0° direction is not limited in practice. For example, the direction opposite to the display direction of the first sub-screen may be set as the 0° direction of the orientation angle, or the direction perpendicular to the display direction of the first sub-screen and to the left may be set as the 0° direction of the orientation angle. The embodiment of the present application is not particularly limited thereto.
Further, on the basis of the 0° direction, the clockwise or counterclockwise rotation direction may be used as the positive direction of the orientation angle. Taking clockwise rotation as the positive direction as an example, starting from the 0° direction, rotating one full turn clockwise changes the orientation angle from 0° to 360°, and rotating one full turn counterclockwise changes it from 360° to 0°.
Typically, the electronic device may determine the orientation angle based on the first included angle and the second included angle.
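The application does not give the mapping, but under the convention above (the first sub-screen facing the user, both included angles measured on the display side), one plausible illustrative mapping is sketched below; the formula is an assumption for explanation only.

```kotlin
// Illustrative only: one way the orientation angle could be derived from the two included angles,
// assuming the first sub-screen keeps facing the user and both angles are measured on the display side.
// With delta = gamma = 180° (fully unfolded) this yields 180°, i.e. the rear camera faces the scene in
// front of the user; folding either hinge alone toward 0° swings the camera back toward the user (0°).

fun orientationAngle(deltaDeg: Float, gammaDeg: Float): Float {
    val raw = deltaDeg + gammaDeg - 180f
    return ((raw % 360f) + 360f) % 360f // normalize into [0°, 360°)
}

fun main() {
    println(orientationAngle(180f, 180f)) // 180.0 -> camera points away from the user
    println(orientationAngle(180f, 0f))   // 0.0   -> camera points toward the user
    println(orientationAngle(90f, 180f))  // 90.0  -> camera points toward the user's side
}
```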
For the electronic device of the first aspect, when the vertical screen is in use and the first sub-screen faces the user, if the orientation angle is larger than the second angle and smaller than the third angle, the rear camera faces away from the user, toward the scene in front of the eyes; in this case, the rear mirror image is turned off, which matches the usage habit of the rear camera in ordinary photographing. If the orientation angle is smaller than the first angle or larger than the fourth angle, the rear camera is turned toward the user; in this case, the rear mirror image is turned on, so that the user can conveniently adjust the photographed scene according to the mirrored effect in the viewfinder picture, which better matches the usage habit of selfie preview.
In one possible design manner of the first aspect, maintaining the second interface displayed on the first sub-screen with the change of the first included angle between the first sub-screen and the second sub-screen and/or the change of the second included angle between the second sub-screen and the third sub-screen includes switching the second view in the second interface to a non-mirror image in response to the change of the orientation angle from less than the first angle or greater than the fourth angle to greater than the second angle and less than the third angle. The second view in the second interface is switched to a mirror image in response to the orientation angle changing from greater than the second angle and less than the third angle to less than the first angle or greater than the fourth angle.
That is, after the orientation angle is changed due to the change of the first included angle and the second included angle, the electronic device also switches the state of the rear mirror image, so that the state of the rear mirror image matches the orientation angle.
In addition, if the orientation angle is greater than or equal to the first angle and less than or equal to the second angle (i.e., between the first angle and the second angle), or greater than or equal to the third angle and less than or equal to the fourth angle (i.e., between the third angle and the fourth angle), the electronic device may close the rear mirror image by default, and open the rear mirror image after the orientation angle changes to be less than the first angle or greater than the fourth angle. Subsequently, the electronic device may switch the second view in the second interface to a non-mirror image in response to the orientation angle changing from less than the first angle or greater than the fourth angle to greater than the second angle and less than the third angle, and switch the second view in the second interface to a mirror image in response to the orientation angle changing from greater than the second angle and less than the third angle to less than the first angle or greater than the fourth angle. Therefore, whatever the orientation angle is when rotation shooting is entered, the electronic device can accurately open or close the rear mirror image based on the change of the orientation angle.
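Purely as an illustrative sketch, the rule above can be summarized as a threshold test with two dead bands in which the previous state is kept; the function and the concrete degree values below (placeholders near 90° and 270°) are assumptions, not values from the application.

```kotlin
// Sketch of the mirror decision with the four thresholds (values here are placeholders close to
// 90° and 270°, as the text suggests). Inside the two dead bands the previous state is kept,
// with "mirror off" as the assumed default when entering rotation shooting.

data class MirrorThresholds(
    val first: Float = 85f,   // first angle, < second
    val second: Float = 95f,  // second angle (angle 4)
    val third: Float = 265f,  // third angle (angle 9)
    val fourth: Float = 275f  // fourth angle (angle 10), > third
)

fun updateMirror(orientation: Float, previousMirror: Boolean, t: MirrorThresholds): Boolean = when {
    orientation < t.first || orientation > t.fourth -> true  // camera turned toward the user: mirror on
    orientation > t.second && orientation < t.third -> false // camera toward the scene ahead: mirror off
    else -> previousMirror                                    // dead band: keep the current state
}

fun main() {
    var mirror = false // assumed default inside a dead band
    mirror = updateMirror(40f, mirror, MirrorThresholds())  // true: facing the user
    mirror = updateMirror(92f, mirror, MirrorThresholds())  // true: dead band keeps previous state
    mirror = updateMirror(180f, mirror, MirrorThresholds()) // false: facing the scene ahead
    println(mirror)
}
```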
In one possible design manner of the first aspect, the method further includes displaying a third interface (i.e., a preview interface during shooting) in the process of shooting the image by the rear camera, where the third interface includes a third view being shot. If the second view in the second interface is a mirror image, the third view remains a mirror image as the orientation angle changes; if the second view in the second interface is a non-mirror image, the third view remains a non-mirror image as the orientation angle changes.
That is, after shooting starts, the electronic device locks the rear mirror image as always on or always off. In this way, the shooting is not disturbed by the rear mirror image being repeatedly opened or closed during the shooting process.
After entering the rotation shooting, and after entering the panoramic mode, the electronic device is also provided with a function of adjusting the guide arrow (i.e., function three hereinafter). The following are respectively described:
In one possible design manner of the first aspect, after the second interface is displayed in the first sub-screen, the method further includes receiving a third operation of the user, such as a selection operation of a mode option of the panoramic mode. In response to the third operation, switching to the panoramic mode, and displaying a fourth interface (i.e., a preview interface of panoramic shooting) in the first sub-screen based on the current orientation angle, wherein the fourth interface comprises a first guide arrow or a second guide arrow. Wherein the first guide arrow indicates a first direction (e.g., left to right) and the second guide arrow indicates a second direction (e.g., right to left).
That is, the electronic device may automatically adjust the direction indicated by the guide arrow based on the orientation angle to meet the shooting requirement of the user.
In one possible design of the first aspect, the electronic device adjusts the direction indicated by the guiding arrow (guiding arrow 36021 hereinafter) only around 180°.
Specifically, displaying the fourth interface based on the current orientation angle includes displaying the fourth interface including the first guide arrow in the first sub-screen if the orientation angle is less than the fifth angle (angle 11 hereinafter). In the case where the orientation angle is greater than the sixth angle (angle 12 hereinafter), a fourth interface including the second guide arrow is displayed in the first sub-screen. The fifth angle is greater than 90° and less than 180°, the sixth angle is greater than 180° and less than 270°, and typically both the fifth angle and the sixth angle are approximately 180°.
For the electronic device of the first aspect, when the vertical screen is in use and the first sub-screen faces the user, if the orientation angle is smaller than the fifth angle, it indicates that the user wants to shoot while the view range of the rear camera changes clockwise, similar to shooting from left to right. Thus, the electronic device displays the first guide arrow to guide the user to shoot from left to right. If the orientation angle is greater than the sixth angle, it indicates that the user wants to shoot while the view range of the rear camera changes counterclockwise, similar to shooting from right to left. Accordingly, the electronic device displays the second guide arrow to guide the user to shoot from right to left. In this way, the direction indicated by the guide arrow matches the shooting requirement of the user.
In one possible design manner of the first aspect, after displaying the fourth interface including the first guiding arrow in the first sub-screen or after displaying the fourth interface including the second guiding arrow in the first sub-screen, the method further includes switching the first guiding arrow to the second guiding arrow in response to the orientation angle changing from less than the fifth angle to greater than the sixth angle. The second guide arrow is switched to the first guide arrow in response to the orientation angle changing from greater than the sixth angle to less than the fifth angle.
That is, as the orientation angle changes, the electronic device also switches the guide arrow such that the direction indicated by the guide arrow matches the orientation angle.
In addition, if the orientation angle is greater than or equal to the fifth angle and less than or equal to the sixth angle, i.e., between the fifth angle and the sixth angle, when switching to the panoramic mode, the electronic device may display the first guide arrow by default, and after the orientation angle is changed to be greater than the sixth angle, the electronic device switches to display the second guide arrow. Subsequently, the electronic device may then switch the first guide arrow to the second guide arrow in response to the orientation angle changing from less than the fifth angle to greater than the sixth angle. The second guide arrow is switched to the first guide arrow in response to the orientation angle changing from greater than the sixth angle to less than the fifth angle. Thus, the electronic device can ensure that the direction indicated by the guiding arrow can be accurately adjusted based on the change of the orientation angle when entering the panoramic mode at any orientation angle.
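The same pattern applies to the guide arrow around 180°. A minimal sketch, assuming placeholder values for the fifth and sixth angles and a left-to-right default inside the dead band, might look as follows.

```kotlin
// Sketch of the guide-arrow choice around 180° (fifth/sixth angles as placeholders near 180°),
// including the hysteresis described above: inside [fifth, sixth] the current arrow is kept,
// with the first arrow (left-to-right) as the assumed default when entering panoramic mode.

enum class GuideArrow { LEFT_TO_RIGHT, RIGHT_TO_LEFT }

fun updateGuideArrow(
    orientation: Float,
    current: GuideArrow?,
    fifthAngle: Float = 175f, // placeholder, 90° < fifth < 180°
    sixthAngle: Float = 185f  // placeholder, 180° < sixth < 270°
): GuideArrow = when {
    orientation < fifthAngle -> GuideArrow.LEFT_TO_RIGHT // clockwise sweep expected
    orientation > sixthAngle -> GuideArrow.RIGHT_TO_LEFT // counter-clockwise sweep expected
    else -> current ?: GuideArrow.LEFT_TO_RIGHT          // between thresholds: keep current or default
}

fun main() {
    var arrow: GuideArrow? = null
    arrow = updateGuideArrow(120f, arrow) // LEFT_TO_RIGHT
    arrow = updateGuideArrow(182f, arrow) // unchanged (dead band)
    arrow = updateGuideArrow(200f, arrow) // RIGHT_TO_LEFT
    println(arrow)
}
```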
In one possible design manner of the first aspect, the electronic device may further adjust the guide arrow using an angle threshold corresponding to the shooting intent of the user. The shooting intents include shooting between [0°, 360°], shooting between [0°, 180°], and shooting between [180°, 360°].
First, when the shooting intent is to shoot between [0°, 180°], the eighth angle (angle 5 hereinafter) and the ninth angle (angle 6 hereinafter) are used as the angle thresholds for adjusting the guide arrow.
Displaying the fourth interface based on the current orientation angle includes: displaying a fourth interface including the first guide arrow in the first sub-screen when the second included angle is smaller than the seventh angle (i.e., the second included angle is approximately 0°, e.g., the seventh angle is 5°) and the orientation angle is smaller than the eighth angle; and displaying a fourth interface including the second guide arrow in the first sub-screen when the second included angle is smaller than the seventh angle and the orientation angle is larger than the ninth angle. The eighth angle and the ninth angle are greater than 0° and less than 180°, typically close to 90°, and the eighth angle is less than the ninth angle.
For the electronic device of the first aspect, when the vertical screen is in use and the first sub-screen faces the user, if the second included angle is close to 0°, it indicates that the user wants to shoot between [0°, 180°]. If the orientation angle is smaller than the eighth angle, the user wants to shoot clockwise between [0°, 180°], similar to shooting from left to right, so the electronic device displays the first guide arrow to guide the user to shoot from left to right. If the orientation angle is greater than the ninth angle, the user wants to shoot counterclockwise between [0°, 180°], similar to shooting from right to left, so the electronic device displays the second guide arrow to guide the user to shoot from right to left. In this way, the direction indicated by the guide arrow matches the shooting requirement of the user.
Second, when the shooting intent is to shoot between [180°, 360°], the tenth angle (angle 13 hereinafter) and the eleventh angle (angle 14 hereinafter) are used as the angle thresholds for adjusting the guide arrow.
Displaying the fourth interface based on the current orientation angle includes: displaying a fourth interface including the first guide arrow in the first sub-screen when the first included angle is smaller than the seventh angle (i.e., the first included angle is approximately 0°, e.g., the seventh angle is 5°) and the orientation angle is smaller than the tenth angle; and displaying a fourth interface including the second guide arrow in the first sub-screen when the first included angle is smaller than the seventh angle and the orientation angle is larger than the eleventh angle. The tenth angle and the eleventh angle are greater than 180° and less than 360°, typically close to 270°, and the tenth angle is less than the eleventh angle.
For the electronic device of the first aspect, when the vertical screen is in use and the first sub-screen faces the user, if the first included angle is close to 0°, it indicates that the user wants to shoot between [180°, 360°]. If the orientation angle is smaller than the tenth angle, the user wants to shoot clockwise between [180°, 360°], similar to shooting from left to right, so the electronic device displays the first guide arrow to guide the user to shoot from left to right. If the orientation angle is greater than the eleventh angle, the user wants to shoot counterclockwise between [180°, 360°], similar to shooting from right to left, so the electronic device displays the second guide arrow to guide the user to shoot from right to left. In this way, the direction indicated by the guide arrow matches the shooting requirement of the user.
Third, when the shooting intent is to shoot between [0°, 360°], the fifth angle (angle 11 hereinafter) and the sixth angle (angle 12 hereinafter) are used as the angle thresholds for adjusting the guide arrow.
Displaying the fourth interface based on the current orientation angle includes: displaying a fourth interface including the first guide arrow in the first sub-screen when the first included angle and the second included angle are both greater than or equal to the seventh angle and the orientation angle is smaller than the fifth angle; and displaying a fourth interface including the second guide arrow in the first sub-screen when the first included angle and the second included angle are both greater than or equal to the seventh angle and the orientation angle is greater than the sixth angle. The fifth angle is greater than 90° and less than 180°, the sixth angle is greater than 180° and less than 270°, and typically both the fifth angle and the sixth angle are close to 180°.
For the electronic device of the first aspect, when the vertical screen is in use and the first sub-screen faces the user, if neither the first included angle nor the second included angle is close to 0°, it indicates that the user wants to shoot between [0°, 360°]. If the orientation angle is smaller than the fifth angle, the user wants to shoot clockwise between [0°, 360°], similar to shooting from left to right, so the electronic device displays the first guide arrow to guide the user to shoot from left to right. If the orientation angle is greater than the sixth angle, the user wants to shoot counterclockwise between [0°, 360°], similar to shooting from right to left, so the electronic device displays the second guide arrow to guide the user to shoot from right to left. In this way, the direction indicated by the guide arrow matches the shooting requirement of the user.
Of course, for each of the three shooting intents, the electronic device may also adjust the guide arrow as the orientation angle changes, in the manner below. Specifically, the first guide arrow is switched to the second guide arrow in response to the orientation angle changing from less than the low threshold (e.g., the eighth angle, the tenth angle, or the fifth angle) to greater than the high threshold (e.g., the ninth angle, the eleventh angle, or the sixth angle). The second guide arrow is switched to the first guide arrow in response to the orientation angle changing from greater than the high threshold to less than the low threshold. The guide arrow can thus be switched as the orientation angle changes, so that the direction indicated by the guide arrow matches the orientation angle.
Further, after the panoramic mode is switched, the shooting intention of the user may also change, so after the panoramic mode is switched, the electronic device may update the shooting intention based on the current first included angle and/or the second included angle, and adjust the guiding arrow by adopting the corresponding angle threshold.
Specifically, in response to the second included angle changing from greater than or equal to the seventh angle to less than the seventh angle (i.e., the shooting intent is switched to shooting between [0°, 180°]), the electronic device may switch the guide arrow in the fourth interface using the eighth angle and the ninth angle, respectively. In response to the first included angle changing from greater than or equal to the seventh angle to less than the seventh angle (i.e., the shooting intent is switched to shooting between [180°, 360°]), the electronic device may switch the guide arrow in the fourth interface using the tenth angle and the eleventh angle, respectively. In response to both the first included angle and the second included angle changing to be greater than or equal to the seventh angle (i.e., the shooting intent is switched to shooting between [0°, 360°]), the electronic device may switch the guide arrow in the fourth interface using the fifth angle and the sixth angle, respectively.
In this way, the guide arrow can be accurately switched after the shooting intention is updated.
In addition, if the orientation angle is located just between the low threshold and the high threshold when determining the shooting intention or when updating the shooting intention, the electronic device may display the first guide arrow by default, and after the orientation angle is changed to be greater than the high threshold, the electronic device may switch to the second guide arrow. Subsequently, as the orientation angle changes, the electronic device will also adjust the guide arrow. That is, the first guide arrow is switched to the second guide arrow in response to the orientation angle changing from less than the low threshold to greater than the high threshold. The second guide arrow is switched to the first guide arrow in response to the change in orientation angle from greater than the high threshold to less than the low threshold. Therefore, the electronic equipment can accurately adjust the direction indicated by the guiding arrow based on the change of the orientation angle when the shooting intention is determined at any orientation angle.
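Putting the three intents together, a hedged sketch of how the intent could be derived from the two included angles and mapped to the corresponding threshold pair is given below; all names and degree values are placeholders.

```kotlin
// Sketch of how a shooting intent could be derived from the two included angles and mapped to the
// threshold pair used for the guide arrow. The seventh angle (e.g. 5°) and the three pairs follow
// the text above; the concrete degree values are placeholders, not values from the application.

enum class ShootingIntent { FULL_0_360, FRONT_0_180, BACK_180_360 }

data class ArrowThresholds(val low: Float, val high: Float)

fun shootingIntent(delta: Float, gamma: Float, seventhAngle: Float = 5f): ShootingIntent = when {
    gamma < seventhAngle -> ShootingIntent.FRONT_0_180  // second included angle ~0°
    delta < seventhAngle -> ShootingIntent.BACK_180_360 // first included angle ~0°
    else -> ShootingIntent.FULL_0_360
}

fun thresholdsFor(intent: ShootingIntent): ArrowThresholds = when (intent) {
    ShootingIntent.FRONT_0_180 -> ArrowThresholds(low = 85f, high = 95f)    // eighth/ninth angles (~90°)
    ShootingIntent.BACK_180_360 -> ArrowThresholds(low = 265f, high = 275f) // tenth/eleventh angles (~270°)
    ShootingIntent.FULL_0_360 -> ArrowThresholds(low = 175f, high = 185f)   // fifth/sixth angles (~180°)
}

fun main() {
    val intent = shootingIntent(delta = 170f, gamma = 3f)
    println(intent)                // FRONT_0_180
    println(thresholdsFor(intent)) // ArrowThresholds(low=85.0, high=95.0)
}
```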
In one possible design manner of the first aspect, the fourth interface includes a direction switching control for triggering the electronic device to adjust the guide arrow. After the fourth interface including the first guide arrow is displayed in the first sub-screen, the method further includes switching the first guide arrow to the second guide arrow in response to a fourth operation, such as a click operation, on the direction switching control. After the fourth interface including the second guide arrow is displayed in the first sub-screen, the method further includes switching the second guide arrow to the first guide arrow in response to a fourth operation on the direction switching control.
That is, in addition to automatically adjusting the guide arrow based on the orientation angle, the electronic device may provide an entry for manually switching the guide arrow, namely the direction switching control, so that the user can switch to a desired direction.
It should be noted that, in addition to switching between the first direction and the second direction, the direction switching control may trigger switching in more directions, such as from top to bottom and from bottom to top.
In one possible design manner of the first aspect, the shooting the image by the rear camera in response to the shooting operation includes shooting a panoramic photo by the rear camera in response to a shooting operation in the fourth interface, where, while the panoramic photo is being shot, the guide arrow in the fourth interface remains the same as the guide arrow displayed when the shooting operation was received, regardless of changes in the orientation angle.
That is, the electronic device may always lock the direction indicated by the guide arrow to the first direction or the second direction at the stage when the panoramic photograph is being taken. In this way, it is possible to avoid affecting the photographing of panoramic photographs.
After entering rotation shooting, for example after shooting is completed, the camera application may also be exited to run in the background (i.e., function six hereinafter). The following are respectively described:
In one possible design of the first aspect, after the first sub-screen displays the second interface, the method further includes displaying a fifth interface on the first sub-screen in response to an operation to exit the camera application to the background operation. The fifth interface may be a desktop or gallery interface, etc. For example, the operation of exiting the camera application to the background operation is an operation of returning to the desktop, and the fifth interface is the desktop. For another example, the operation of exiting the camera application to the background operation is a click operation on a gallery entry (typically in the lower left corner, in which a thumbnail of a most recently taken photo is displayed) in the second interface, and the fifth interface is a gallery interface.
For the electronic device provided in the first aspect, the first sub-screen alone also provides a good experience. For example, viewing, editing, and sharing of shooting results can all be completed on the first sub-screen. Based on this feature, in response to the operation of exiting the camera application to the background, the electronic device keeps displaying the user interface, such as the desktop or the gallery interface, on the first sub-screen regardless of the current device posture. In this way, after finishing viewing, editing, sharing, and the like, the user can conveniently return to the camera application and continue using rotation shooting without switching display screens in between.
In one possible design manner of the first aspect, after the fifth interface is displayed on the first sub-screen, the method further includes, in response to an update of the device posture of the electronic device, displaying the fifth interface on at least one sub-screen based on the current device posture, where the device posture is related to the first included angle and the second included angle.
After the camera application exits the foreground, if the device posture is updated, it indicates that the user wants to use the mobile phone in the new device posture. Therefore, the electronic device may decide, based on the device posture, which screen displays the user interface, instead of always keeping the user interface displayed on the first sub-screen. For example, the electronic device may clear the first identification after the device posture is updated; without the first identification, the electronic device then decides, based on the device posture, which screen displays the user interface.
It will be appreciated that, in general, the posture sensor reporting a new posture event and/or the Hall sensor reporting a new Hall event indicates that the device posture has changed.
In one possible design manner of the first aspect, before the device posture is updated, it is predicted that the user will resume using rotation shooting, so in response to an operation of bringing the camera application from the background back to the foreground, the second interface is displayed on the first sub-screen. This makes it convenient for the user to continue rotation shooting, for example, to return to the camera application after viewing, editing, or sharing an image and continue shooting a new image using rotation shooting.
After the device posture is updated, it is predicted that the user wants to use the electronic device in the new device posture, so the electronic device may exit rotation shooting. Then, in response to an operation of bringing the camera application from the background back to the foreground, the first interface is displayed on at least one sub-screen based on the current device posture. In this way, after rotation shooting is exited, the electronic device ensures that the screen that is lit up matches the device posture.
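As an illustrative sketch only, the background and foreground behaviour described in the preceding paragraphs can be modelled as follows; the class, the method names, and the posture-comparison tolerance are assumptions.

```kotlin
// Sketch of the background/foreground behaviour: whether the device posture was updated while the
// camera application sat in the background decides where the interface reappears.
// Names are illustrative; the posture comparison tolerance is an assumption.

data class Posture(val delta: Float, val gamma: Float)

class CameraSession(initialPosture: Posture) {
    private var postureWhenBackgrounded: Posture = initialPosture
    var rotationShootingActive: Boolean = true // identification recorded on entering rotation shooting

    fun onSentToBackground(current: Posture) { postureWhenBackgrounded = current }

    fun onBroughtToForeground(current: Posture, toleranceDeg: Float = 5f): String {
        val changed = kotlin.math.abs(current.delta - postureWhenBackgrounded.delta) > toleranceDeg ||
                kotlin.math.abs(current.gamma - postureWhenBackgrounded.gamma) > toleranceDeg
        return if (rotationShootingActive && !changed) {
            "second interface on first sub-screen"      // resume rotation shooting where it was left
        } else {
            rotationShootingActive = false              // posture changed: exit rotation shooting
            "first interface on posture-selected screen"
        }
    }
}

fun main() {
    val session = CameraSession(Posture(30f, 150f))
    session.onSentToBackground(Posture(30f, 150f))
    println(session.onBroughtToForeground(Posture(30f, 150f)))  // unchanged posture -> first sub-screen
    println(session.onBroughtToForeground(Posture(180f, 180f))) // posture changed -> posture-based screen
}
```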
After entering rotation shooting, for example after shooting is completed, exiting rotation shooting may also be actively triggered (i.e., function five hereinafter). The following are respectively described:
In one possible design manner of the first aspect, after the second interface is displayed in the first sub-screen, the method further includes receiving a fifth operation of the user on the identification of rotation shooting. That is, the identification of rotation shooting may also be a control. The fifth operation may be a click operation, a long-press operation, or the like. In response to the fifth operation, the first interface is displayed on at least one sub-screen based on the current device posture of the electronic device, where the device posture is related to the first included angle and the second included angle.
That is, after the user actively exits rotation shooting, the electronic device returns to ordinary shooting and no longer keeps the user interface displayed on the first sub-screen.
In one possible design manner of the first aspect, the first control is a front-rear camera switching control, and the second operation includes a sliding operation upward from the first control and a click operation on a second control, where the second control (such as the shooting entry 6031 in interface 603) is displayed in response to the sliding operation upward from the first control. In this way, after entering rotation shooting, the mode options of the various shooting modes can also be shared with ordinary shooting; for details, see the description of fig. 6B below.
In a second aspect, the present application also provides an electronic device including a display screen, a rear camera, a memory, and one or more processors. The display, the memory, and the processor are coupled. The memory is used to store computer program code, which includes computer instructions. The computer instructions, when executed by a processor, cause an electronic device to perform the method of the first aspect and any one of its possible designs.
In a third aspect, the application provides a chip system for use in an electronic device comprising a display screen and a memory, the chip system comprising one or more interface circuits and one or more processors, the interface circuits and processors being interconnected by wires, the interface circuits being arranged to receive signals from the memory of the electronic device and to send signals to the processor, the signals comprising computer instructions stored in the memory, the electronic device performing the method according to the first aspect and any one of its possible designs when the computer instructions are executed by the processor.
In a fourth aspect, the present application provides a computer storage medium comprising computer instructions which, when run on an electronic device, cause the electronic device to perform a method as in the first aspect and any one of its possible designs.
In a fifth aspect, the application provides a computer program product for causing a computer to carry out the method as in the first aspect and any one of its possible design approaches when the computer program product is run on a computer.
It will be appreciated that the advantages achieved by the electronic device of the second aspect, the chip system of the third aspect, the computer storage medium of the fourth aspect, and the computer program product of the fifth aspect provided above may refer to the advantages in the first aspect and any possible design manner thereof, and are not repeated herein.
Drawings
fig. 1 is a hardware configuration diagram of an electronic device according to an embodiment of the present application;
fig. 2 is a schematic diagram of a mobile phone in form one according to an embodiment of the present application;
fig. 3 is a schematic diagram of a mobile phone in form two according to an embodiment of the present application;
fig. 4A is one of schematic diagrams of a mobile phone in form three according to an embodiment of the present application;
fig. 4B is a second schematic diagram of a mobile phone in form three according to an embodiment of the present application;
fig. 5 is a schematic diagram of a mobile phone folded longitudinally according to an embodiment of the present application;
fig. 6A is a software architecture diagram of an electronic device according to an embodiment of the present application;
fig. 6B is one of interface diagrams of a mobile phone according to an embodiment of the present application;
fig. 7 is a second interface diagram of a mobile phone according to an embodiment of the present application;
fig. 8 is one of schematic diagrams of a mobile phone in form one implementing rotation shooting according to an embodiment of the present application;
fig. 9A is a second schematic diagram of a mobile phone in form one implementing rotation shooting according to an embodiment of the present application;
fig. 9B is a third schematic diagram of a mobile phone in form one implementing rotation shooting according to an embodiment of the present application;
fig. 10 is a fourth schematic diagram of a mobile phone in form one implementing rotation shooting according to an embodiment of the present application;
fig. 11A is a fifth schematic diagram of a mobile phone in form one implementing rotation shooting according to an embodiment of the present application;
fig. 11B is a sixth schematic diagram of a mobile phone in form one implementing rotation shooting according to an embodiment of the present application;
fig. 12 is a seventh schematic diagram of a mobile phone in form one implementing rotation shooting according to an embodiment of the present application;
fig. 13 is an eighth schematic diagram of a mobile phone in form one implementing rotation shooting according to an embodiment of the present application;
fig. 14 is a ninth schematic diagram of a mobile phone in form one implementing rotation shooting according to an embodiment of the present application;
fig. 15A is a tenth schematic diagram of a mobile phone in form one implementing rotation shooting according to an embodiment of the present application;
fig. 15B is an eleventh schematic diagram of a mobile phone in form one implementing rotation shooting according to an embodiment of the present application;
fig. 16A is a twelfth schematic diagram of a mobile phone in form one implementing rotation shooting according to an embodiment of the present application;
fig. 16B is a thirteenth schematic diagram of a mobile phone in form one implementing rotation shooting according to an embodiment of the present application;
fig. 17 is a fourteenth schematic diagram of a mobile phone in form one implementing rotation shooting according to an embodiment of the present application;
fig. 18 is a fifteenth schematic diagram of a mobile phone in form one implementing rotation shooting according to an embodiment of the present application;
fig. 19 is a sixteenth schematic diagram of a mobile phone in form one implementing rotation shooting according to an embodiment of the present application;
fig. 20 is a seventeenth schematic diagram of a mobile phone in form one implementing rotation shooting according to an embodiment of the present application;
fig. 21A is one of schematic diagrams of a mobile phone in form three implementing rotation shooting according to an embodiment of the present application;
fig. 21B is a second schematic diagram of a mobile phone in form three implementing rotation shooting according to an embodiment of the present application;
fig. 21C is a third schematic diagram of a mobile phone in form three implementing rotation shooting according to an embodiment of the present application;
fig. 22 is a fourth schematic diagram of a mobile phone in form three implementing rotation shooting according to an embodiment of the present application;
fig. 23 is a fifth schematic diagram of a mobile phone in form three implementing rotation shooting according to an embodiment of the present application;
fig. 24 is a sixth schematic diagram of a mobile phone in form three implementing rotation shooting according to an embodiment of the present application;
fig. 25 is a seventh schematic diagram of a mobile phone in form three implementing rotation shooting according to an embodiment of the present application;
fig. 26 is an eighth schematic diagram of a mobile phone in form three implementing rotation shooting according to an embodiment of the present application;
fig. 27 is a ninth schematic diagram of a mobile phone in form three implementing rotation shooting according to an embodiment of the present application;
fig. 28 is a tenth schematic diagram of a mobile phone in form three implementing rotation shooting according to an embodiment of the present application;
fig. 29 is an eleventh schematic diagram of a mobile phone in form three implementing rotation shooting according to an embodiment of the present application;
fig. 30A is a twelfth schematic diagram of a mobile phone in form three implementing rotation shooting according to an embodiment of the present application;
fig. 30B is a thirteenth schematic diagram of a mobile phone in form three implementing rotation shooting according to an embodiment of the present application;
fig. 30C is a fourteenth schematic diagram of a mobile phone in form three implementing rotation shooting according to an embodiment of the present application;
fig. 31A is a fifteenth schematic diagram of a mobile phone in form three implementing rotation shooting according to an embodiment of the present application;
fig. 31B is a sixteenth schematic diagram of a mobile phone in form three implementing rotation shooting according to an embodiment of the present application;
fig. 31C is a seventeenth schematic diagram of a mobile phone in form three implementing rotation shooting according to an embodiment of the present application;
fig. 32A is an eighteenth schematic diagram of a mobile phone in form three implementing rotation shooting according to an embodiment of the present application;
fig. 32B is a nineteenth schematic diagram of a mobile phone in form three implementing rotation shooting according to an embodiment of the present application;
fig. 32C is a twentieth schematic diagram of a mobile phone in form three implementing rotation shooting according to an embodiment of the present application;
fig. 33 is a twenty-first schematic diagram of a mobile phone in form three implementing rotation shooting according to an embodiment of the present application;
fig. 34 is a twenty-second schematic diagram of a mobile phone in form three implementing rotation shooting according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application are described below with reference to the accompanying drawings. In the description of the embodiments of the application, the terminology used below is for the purpose of describing particular embodiments only and is not intended to limit the application. As used in the specification of the present application and the appended claims, the singular forms "a," "an," and "the" are intended to include expressions such as "one or more," unless the context clearly indicates otherwise. It should also be understood that in the following embodiments of the present application, "at least one" and "one or more" mean one, two, or more than two. The term "and/or" describes an association relationship of associated objects and indicates that three relationships may exist; for example, A and/or B may indicate the following cases: A exists alone, A and B exist together, and B exists alone, where A and B may be singular or plural. The character "/" generally indicates an "or" relationship between the associated objects before and after it.
Reference in the specification to "one embodiment" or "some embodiments" or the like means that a particular feature, structure, or characteristic described in connection with the embodiment is included in one or more embodiments of the application. Thus, appearances of the phrases "in one embodiment," "in some embodiments," "in other embodiments," and the like in the specification are not necessarily all referring to the same embodiment, but mean "one or more but not all embodiments" unless expressly specified otherwise. The terms "comprising," "including," "having," and variations thereof mean "including but not limited to," unless expressly specified otherwise. The term "coupled" includes both direct and indirect connections, unless stated otherwise. The terms "first," "second," and the like are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features indicated.
In embodiments of the application, words such as "exemplary" or "such as" are used to mean serving as an example, instance, or illustration. Any embodiment or design described herein as "exemplary" or "for example" is not necessarily to be construed as preferred or advantageous over other embodiments or designs. Rather, the use of words such as "exemplary" or "such as" is intended to present related concepts in a concrete fashion.
In the conventional technology, the use experience of the shooting function provided by the electronic device in the following several scenarios needs to be improved:
Scene 1, a scene in front of the eyes and the user photographed on the same photo.
For example, a user traveling to a scenic spot may have beautiful views both in front and behind. In this scenario, if the user wants to capture the front and rear scenery together with himself or herself in one photo, this is often hard to achieve well with the conventional photographing function without help from others.
For another example, two users watching a star's performance may want to take a photo of the star and themselves together; without help from others, it is often difficult to achieve a good result.
For example, if the user uses the panoramic shooting mode of the conventional shooting function, only the front and rear scenery, such as the front and rear scenery in example one, or the star and the crowd behind in example two, can be captured on one panoramic photo; the user himself or herself cannot be captured on the same panoramic photo, so the effect is poor.
For example, if the user uses the multi-lens shooting mode of the conventional shooting function, the user and the scene behind the user are shot by the front camera, the scene in front of the eyes, such as the scenery in example one or the star in example two, is shot by the rear camera, and the pictures captured by the front camera and the rear camera are then simply stitched together, so the effect is poor.
Scene 2, scene of multi-person group photo.
When many people are in a group photo, if the rear camera is to be used, one person has to hold the phone to shoot, so the group photo is missing one person, or a passer-by has to be asked for help. In some optimized shooting schemes, although shooting with the rear camera can be triggered by a gesture or by sound, the shooting effect usually cannot be previewed, so repeated adjustment and confirmation may be required during the group photo, and the effect is poor.
Scene 3, a shot of a side or back scene.
In general, it is difficult to shoot a scene at the side or behind with the rear camera without turning around or asking others for help. In some optimized photographing schemes, an electronic device with an out-folding folding screen (see the description of fig. 3 below) is held in the folded state with the rear camera facing the user, a preview is provided on the back screen, and self-photographing is achieved through the rear camera. However, this photographing scheme cannot achieve shooting to the side while previewing.
In view of the above problems in conventional photographing functions, an embodiment of the present application provides a photographing method applied to an electronic device having a foldable screen. The foldable screen itself has a hinge for folding or unfolding. The viewing range of the rear camera may be changed during folding or unfolding of the foldable screen.
Based on the characteristics of the foldable screen, in the shooting method provided by the embodiments of the present application, the electronic device can realize rotation shooting as the view range of the rear camera changes during folding or unfolding of the foldable screen. By way of example, rotation shooting includes panoramic shooting between the folded state and the unfolded state, so that the scene in front of the eyes and the user can be captured in the same picture, and a multi-person group photo can also be shot. Also by way of example, rotation shooting may further include shooting at any angle between the folded state and the unfolded state, so that scenes at the side or behind can be flexibly photographed.
In addition, the rotation shooting may include other shooting modes, such as a video mode, a portrait mode, and the like. The electronic device may switch to a corresponding photographing mode to implement rotation photographing in response to a selection operation of any photographing mode. The embodiment of the present application is not particularly limited thereto.
The electronic device may be a mobile phone, a tablet, a notebook computer, or the like, which has a foldable screen and a rear camera, and the embodiments of the present application will be described mainly by taking the mobile phone as an example.
Referring to fig. 1, the electronic device may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (universal serial bus, USB) interface 130, a charge management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, a sensor 180, keys 190, a motor 191, an indicator 192, a camera 193, a display 194, and a subscriber identity module (subscriber identification module, SIM) card interface 195, etc.
It should be understood that the structure illustrated in the embodiments of the present application does not constitute a specific limitation on the electronic device. In other embodiments of the application, the electronic device may include more or less components than illustrated, or certain components may be combined, or certain components may be split, or different arrangements of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
The processor 110 may include one or more processing units. For example, the processor 110 may include an application processor (application processor, AP), a modem processor, a graphics processor (graphics processing unit, GPU), an image signal processor (image signal processor, ISP), a controller, a memory, a video codec, a digital signal processor (digital signal processor, DSP), a baseband processor, and/or a neural network processor (neural-network processing unit, NPU), etc. The different processing units may be separate devices or may be integrated in one or more processors.
The wireless communication function of the electronic device may be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, a modem processor, a baseband processor, and the like.
The electronic device implements display functions via a GPU, a display screen 194, an application processor, and the like. The GPU is a microprocessor for image processing, and is connected to the display 194 and the application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. Processor 110 may include one or more GPUs that execute program instructions to generate or change display information.
The display screen 194 is used to display images, videos, and the like. In an embodiment of the present application, display 194 includes a foldable screen that may be folded to form at least two screens. The foldable screen has various forms. For example, the folding screen may include a two-fold folding screen and a multi-fold (e.g., three-fold and more) folding screen, depending on the number of screens formed by folding. For another example, the folding screen may include an inward folding screen, an outward folding screen, and a folding screen in which the inward folding and the outward folding are combined, or the folding screen may include a folding screen that is folded laterally and a folding screen that is folded longitudinally, depending on the direction of folding.
Using a mobile phone as an example, several typical forms of the folding screen are described below:
Form one, a two-fold folding screen folded laterally inward.
Taking the folding screen 201 in fig. 2 as an example of a two-fold folding screen folded laterally inward, the folding screen 201 includes two sub-screens, such as a first screen 2011 and a second screen 2012, and the rotation axis is perpendicular to the width direction 1c of the folding screen. The folding screen 201 may be in the unfolded state shown in fig. 2. In the unfolded state, the angle between the first screen 2011 and the second screen 2012 is about 180°, and the folding screen 201 can be used as a large screen to display a user interface. When the folding screen 201 is folded inward along the rotation axis, for example the first screen 2011 is folded in the direction shown by 1a and/or the second screen 2012 is folded in the direction shown by 1b, the semi-folded state shown in fig. 2 may be formed. In the semi-folded state, the angle between the first screen 2011 and the second screen 2012 (referring to the angle on the folding-direction side, denoted as α) satisfies 0° < α < 180°. When the folding screen 201 continues to be folded inward along the rotation axis, for example the first screen 2011 continues to be folded in the direction shown by 1a and/or the second screen 2012 continues to be folded in the direction shown by 1b, the folded state shown in fig. 2 (e.g., folded state 1 or folded state 2) may be formed. In the folded state, the angle between the first screen 2011 and the second screen 2012 is about 0°; at this time, the display direction of the first screen 2011 and the display direction of the second screen 2012 are opposite, and neither the first screen 2011 nor the second screen 2012 is visible to the user.
The display direction is the direction that the front surface of a screen faces. Opposite display directions here mean that the two screens are in a "face-to-face" posture. For example, in the folded state 1 and the folded state 2 shown in fig. 2, the first screen 2011 and the second screen 2012 are in a "face-to-face" posture, so the display directions of the first screen 2011 and the second screen 2012 are opposite.
It should be noted that the folded state 1 and the folded state 2 in fig. 2 are merely the same mobile phone with the folding screen 201 seen from different viewing angles. The folded state 1 shows the viewing angle of the rear cover 203 provided with the rear camera 204, and the folded state 2 shows the viewing angle of the outer screen 202.
Typically, in a mobile phone with the folding screen 201, the rear camera 204 is arranged in the rear cover behind the second screen 2012. Also, in order to facilitate use after the folding screen 201 is folded, an outer screen 202 is further provided on the back of the first screen 2011. As shown in folded state 2 of fig. 2, the mobile phone may also display a user interface on the outer screen 202 when the folding screen 201 is in the folded state.
It should be appreciated that, in a mobile phone with the folding screen 201, the outer screen 202 is visible to the user whether the folding screen 201 is in the unfolded, semi-folded, or folded state, and the outer screen 202 does not have to rotate along with changes in the viewing angle of the rear camera 204. Based on this, in the embodiment of the present application, the outer screen 202 may be used as the front screen of the mobile phone with the folding screen 201 and display the preview interface during rotation shooting. In this way, the user can see the viewfinder picture throughout the rotation shooting process.
Form two, a two-fold folding screen folded laterally outward.
Taking the folding screen 301 in fig. 3 as an example of a two-fold folding screen folded laterally outward, the folding screen 301 includes two sub-screens, such as a first screen 3011 and a second screen 3012, and the rotation axis is perpendicular to the width direction 1c of the folding screen. The folding screen 301 may be in the unfolded state shown in fig. 3. In the unfolded state, the included angle between the first screen 3011 and the second screen 3012 is about 180°, and the folding screen 301 can be used as a large screen to display a user interface. Folding the folding screen 301 outward along the rotation axis, such as folding the first screen 3011 along the direction shown by 1d and/or the second screen 3012 along the direction shown by 1e, may form the semi-folded state shown in fig. 3. In the semi-folded state, the included angle between the first screen 3011 and the second screen 3012 (the angle measured on the side opposite to the folding direction, denoted as β) satisfies 0° < β < 180°. Continuing to fold the folding screen 301 outward along the rotation axis, such as continuing to fold the first screen 3011 along the direction shown by 1d and/or the second screen 3012 along the direction shown by 1e, may form the folded state shown in fig. 3 (e.g., folded state 1 or folded state 2). In the folded state, the included angle between the first screen 3011 and the second screen 3012 is about 0°; at this time, the display direction of the first screen 3011 and the display direction of the second screen 3012 are opposite, and both the first screen 3011 and the second screen 3012 are visible to the user.
Different from form one, here the display directions being opposite means that the two screens are in a "back-to-back" posture. For example, in the folded state 1 and the folded state 2 shown in fig. 3, the first screen 3011 and the second screen 3012 are in a "back-to-back" posture, so their display directions are opposite.
It should be noted that the folded state 1 and the folded state 2 in fig. 3 are simply the same mobile phone with the folding screen 301 seen from different viewing angles. Folded state 1 shows the viewing angle of the first screen 3011, and folded state 2 shows the viewing angle of the second screen 3012.
Typically, in a mobile phone with the folding screen 301, the rear camera 304 is provided in a specific region 305 behind the second screen 3012. Also, in the folded state, the specific region 305 and the first screen 3011 may together form a whole of the same size as one sub-screen, such as the second screen 3012.
It should be appreciated that, in a mobile phone with the folding screen 301, the first screen 3011 is visible to the user whether the folding screen 301 is in the unfolded, semi-folded, or folded state, and the first screen 3011 does not have to rotate along with changes in the viewing angle of the rear camera 304. Based on this, in the embodiment of the present application, the first screen 3011 may be used as the front screen of the mobile phone with the folding screen 301 and display the preview interface during rotation shooting. In this way, the user can see the viewfinder picture throughout the rotation shooting process.
Form three, a three-fold folding screen folded laterally, combining an inward fold and an outward fold.
Taking the folding screen 401 in fig. 4A as an example of a three-fold folding screen folded laterally, the folding screen 401 includes three sub-screens, such as a first screen 4011, a second screen 4012, and a third screen 4013, and both rotation axes (rotation axis 1 and rotation axis 2) are perpendicular to the width direction 1c of the folding screen. The folding screen 401 may be in the unfolded state shown in fig. 4A. In the unfolded state, the included angles among the first screen 4011, the second screen 4012, and the third screen 4013 are all about 180°, and the folding screen 401 can be used as a large screen to display a user interface. Folding the first screen 4011 and the second screen 4012 inward along rotation axis 1, such as folding the first screen 4011 along the direction shown by 1f and the second screen 4012 along the direction shown by 1g, and folding the second screen 4012 and the third screen 4013 outward along rotation axis 2, such as folding the second screen 4012 along the direction shown by 1g and the third screen 4013 along the direction shown by 1h, may form the semi-folded state shown in fig. 4A. In the semi-folded state, the included angle between the first screen 4011 and the second screen 4012 (the angle measured in the folding direction, denoted as γ) satisfies 0° < γ < 180°, and the included angle between the second screen 4012 and the third screen 4013 (the angle measured on the side opposite to the folding direction, denoted as δ) satisfies 0° < δ < 180°. Continuing to fold the first screen 4011 and the second screen 4012 inward along rotation axis 1, and the second screen 4012 and the third screen 4013 outward along rotation axis 2, may form the folded state shown in fig. 4A (e.g., folded state 1 or folded state 2). In the folded state, the included angle γ between the first screen 4011 and the second screen 4012 is about 0°, and the included angle δ between the second screen 4012 and the third screen 4013 is also about 0°. In the folded state, the display direction of the first screen 4011 and the display direction of the second screen 4012 are opposite, and neither of them is visible to the user; the display direction of the second screen 4012 and the display direction of the third screen 4013 are also opposite, and the third screen 4013 is visible to the user, so the user interface can be displayed on the third screen 4013.
It should be noted that the folded state 1 and the folded state 2 in fig. 4A are simply the same mobile phone with the folding screen 401 seen from different viewing angles. Folded state 1 shows the viewing angle of the rear cover 402 on which the rear camera 403 is located, and folded state 2 shows the viewing angle of the third screen 4013.
Typically, in a cell phone with a folding screen 401, a rear camera 403 is provided on a rear cover 402 behind the first screen 4011. Thus, in the folded state, the user interface can be displayed on the third screen 4013, and the rear camera 403 can be used for photographing.
It should be appreciated that, in a mobile phone with the folding screen 401, the third screen 4013 is visible to the user whether the folding screen 401 is in the unfolded, semi-folded, or folded state, and the third screen 4013 does not have to rotate along with changes in the viewing angle of the rear camera 403. Based on this, in the embodiment of the present application, the third screen 4013 may be used as the front screen of the mobile phone with the folding screen 401 and display the preview interface during rotation shooting. In this way, the user can see the viewfinder picture throughout the rotation shooting process.
It should be noted that, in the folding screen 401 described above, the first screen 4011 and the second screen 4012 connected by rotation axis 1 are folded inward, with the first screen 4011 folded along the direction shown by 1f and the second screen 4012 folded along the direction shown by 1g, while the second screen 4012 and the third screen 4013 connected by rotation axis 2 are folded outward, with the third screen 4013 folded along the direction shown by 1h. In the folding screen 401, the rear camera 403 is provided on the back of the first screen 4011.
In practice, however, the folding directions and the position of the rear camera are not limited to those shown in fig. 4A. Illustratively, the laterally folded three-fold folding screen may also be the folding screen 411 in fig. 4B. Unlike the folding screen 401 described above, the first screen 4111 and the second screen 4112 connected by rotation axis 2 are folded inward, with the first screen 4111 folded along the direction shown by 2h and the second screen 4112 folded along the direction shown by 2g, while the second screen 4112 and the third screen 4113 connected by rotation axis 1 are folded outward, with the third screen 4113 folded along the direction shown by 2f. Also, in the folding screen 411, a rear camera (not shown in the drawing) is provided on the back of the first screen 4111, that is, on the rear cover 412.
That is, the folding directions of the folding screen 401 and the folding screen 411 are opposite. In the following description of form three, the folding screen 401 is mainly taken as an example. If the folding screen 411 is used instead, it is only necessary to replace the first screen 4011 with the first screen 4111, the second screen 4012 with the second screen 4112, and the third screen 4013 with the third screen 4113.
The embodiment of the application can be applied to the folding screens in various forms. Of course, the embodiments of the present application may also be applied to folding screens other than those described above with respect to fig. 2-4B.
The folding screens shown in fig. 2-4B are all folding screens that are folded transversely (i.e., the rotation axis is perpendicular to the width direction of the folding screen), and the embodiments of the present application may also be applied to folding screens that are folded vertically (i.e., the rotation axis is perpendicular to the height direction of the folding screen). For example, the vertically folded folding screen may be the folding screen 501 shown in fig. 5, the folding screen 501 including two sub-screens, such as an a-screen and a B-screen, and the rotation axis being perpendicular to the height direction 1i of the folding screen. The folding screen 501 may be folded inwardly along an axis of rotation, such as the screen a being folded in the direction shown by 1j and/or the screen B being folded in the direction shown by 1k, with the angle between the screens a and B (referring to the angle of the folding direction) gradually decreasing as the fold progresses.
It should be noted that, for a vertically folded folding-screen mobile phone, as folding or unfolding proceeds, the viewing range of the rear camera mainly changes in the vertical direction. For example, as the folding screen 501 is folded, the viewing range of the rear camera 503 provided on the rear cover 502 behind the A screen changes from the ground toward the front. Of course, if the folding screen 501 is perpendicular to the ground in the unfolded state, the viewing range of the rear camera 503 changes from the front toward the sky as the folding screen 501 is folded. In general, the requirement of rotation shooting is mainly to change the viewing range of the rear camera in the horizontal direction, not in the vertical direction; for example, scenes 1-3 above require changing the viewing range of the rear camera in the horizontal direction. It can be seen that the vertically folded folding screen does not match rotation shooting particularly well, although scenes that require changing the viewing range of the rear camera in the vertical direction are not excluded, for example, a user who wants to photograph both the scene in front of the eyes and the sky.
Hereinafter, embodiments of the present application are described mainly in terms of the folding screens shown in form one to form three.
The electronic device may implement shooting functions through an ISP, a camera 193, a video codec, a GPU, a display screen 194, an application processor, and the like.
The electronic device may implement audio functions through an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, an application processor, and the like. Such as music playing, recording, etc.
Further, the sensor 180 may include, but is not limited to, a gyro sensor 180B, a magnetic sensor 180D, and an acceleration sensor 180E.
The gyro sensor 180B may be used to determine the motion posture of the electronic device. In some embodiments, the angular velocities of the electronic device about three axes (i.e., the x, y, and z axes) may be determined by the gyro sensor 180B. The gyro sensor 180B may be used for anti-shake during shooting. Illustratively, when the shutter is pressed, the gyro sensor 180B detects the shake angle of the electronic device, calculates the compensation distance for the lens module according to that angle, and drives the lens to move in the opposite direction to counteract the shake of the electronic device, thereby realizing anti-shake. The gyro sensor 180B may also be used in navigation and somatosensory gaming scenarios.
The acceleration sensor 180E may detect the magnitude of the acceleration of the electronic device in various directions (typically along three axes). The magnitude and direction of gravity can be detected when the electronic device is stationary. The acceleration sensor 180E may also be used to identify the posture of the electronic device, for applications such as landscape/portrait switching and pedometers.
In practice, in an electronic apparatus having a folding screen (hereinafter simply referred to as a foldable electronic apparatus), each of the sub-screens included in the folding screen is provided with a gyro sensor 180B and an acceleration sensor 180E correspondingly. For example, the first screen in fig. 2 to 4B is provided with a gyro sensor 180B and an acceleration sensor 180E correspondingly, the second screen is also provided with a gyro sensor 180B and an acceleration sensor 180E correspondingly, and the third screen in fig. 4A and 4B is also provided with a gyro sensor 180B and an acceleration sensor 180E correspondingly.
The magnetic sensor 180D includes a Hall sensor. The electronic device may detect the opening and closing of a flip holster using the magnetic sensor 180D. In some embodiments, when the electronic device is a clamshell device, the electronic device may detect the opening and closing of the flip cover via the magnetic sensor 180D, and then, according to the detected open/closed state of the holster or of the flip cover, set features such as automatic unlocking when the cover is opened.
In some possible cases, the Hall sensor may be implemented directly as a physical Hall element, or may be implemented as a virtual Hall sensor (also referred to as a digital Hall sensor) based on a magnetometer, without limitation.
Taking the virtual Hall sensor as an example, when the electronic device is a foldable electronic device, the virtual Hall sensor can determine, from a magnetic force threshold sensed by the magnetometer, whether the two screens are close together or apart, and the approximate angle range between them.
In some possible cases, the electronic device may further include an attitude sensor (not shown in the figure). The attitude sensor may be a physical sensor, or may be a virtual attitude sensor implemented by a fusion algorithm combining sensor data such as angle and acceleration, or virtual sensor data such as angle. The attitude sensor may provide richer and more accurate attitude data, such as folded, semi-folded, unfolded, tent, desk-calendar, etc. The electronic device may use the attitude data to determine whether to turn each sub-screen on or off.
The electronic device may calculate an included angle using the gyro sensor 180B and the acceleration sensor 180E correspondingly disposed on the two sub-screens forming that angle, may obtain the included angle using a digital Hall sensor, or may map the included angle to a range using a Hall sensor simulated by a magnetometer, for example, the Hall "near" state corresponds to an included angle within 5° and the "far" state corresponds to an included angle above 10°. The angle values of the included angles referred to below (such as α, β, γ, and δ) may come from calculated angle data, or from the angle range corresponding to the Hall near/far state; the angle and the Hall state may also be combined in a logical AND relationship or a logical OR relationship. The embodiment of the present application is not particularly limited thereto.
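As an illustration of how an angle value and a Hall state might be combined, the following sketch assumes a computed hinge angle plus a near/far Hall state, using the 5°/10° mapping mentioned above; all identifiers are hypothetical, and the AND/OR variants correspond to the two logical relationships described in this paragraph.

```kotlin
// Hypothetical combination of a computed hinge angle with a Hall near/far state.
// The 5° value follows the example in the text; all names are illustrative.
enum class HallState { NEAR, FAR }

// Per the example mapping: NEAR implies an included angle within 5° (i.e., folded).
fun hallImpliesFolded(hall: HallState): Boolean = hall == HallState.NEAR

// Logical AND: treat the screens as closed only if the angle and the Hall state agree.
fun isClosedStrict(computedAngle: Float, hall: HallState): Boolean =
    computedAngle <= 5f && hallImpliesFolded(hall)

// Logical OR: treat the screens as closed if either source says so.
fun isClosedLoose(computedAngle: Float, hall: HallState): Boolean =
    computedAngle <= 5f || hallImpliesFolded(hall)

fun main() {
    println(isClosedStrict(3f, HallState.NEAR))  // true
    println(isClosedLoose(8f, HallState.FAR))    // false
}
```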
The keys 190 may include a power on key, a volume key, etc. The motor 191 may generate a vibration cue. The motor 191 may be used for incoming call vibration alerting as well as for touch vibration feedback. The indicator 192 may be an indicator light, may be used to indicate a state of charge, a change in charge, a message indicating a missed call, a notification, etc. The SIM card interface 195 is used to connect a SIM card.
The software system of the electronic device may adopt a layered architecture, an event driven architecture, a microkernel architecture, a microservice architecture, or a cloud architecture. In the embodiment of the application, an Android system with a layered architecture is taken as an example, and the software structure of the electronic equipment is illustrated. At least one subsystem, such as an AP subsystem and a sensor management component (sensorhub) subsystem, may be included in the software architecture.
Referring to fig. 6A, a software architecture block diagram of an electronic device according to an embodiment of the present application is provided. As shown in fig. 6A, the layered architecture divides the software into several layers, each with its own role and division of labor. The layers communicate with each other through software interfaces.
In some embodiments, the AP subsystem is divided into four layers, from top to bottom, an application layer (applications), an application framework layer (framework), a hardware abstraction layer (hardware abstract layer, HAL), and a Kernel layer (Kernel, which may also be referred to as a driver layer). The sensorhub subsystem is divided into two layers, a framework layer (for distinction, denoted as co-framework layer) and a kernel layer (for distinction, denoted as co-kernel layer). It should be understood that in practical applications, more or fewer levels may be included in the AP subsystem and sensorhub subsystem in addition to the levels shown in fig. 6A. The embodiment of the present application is not limited thereto.
The application layer may include a series of application packages, among other things. For example, application packages may include cameras, calendars, WLANs, music, gallery, talk, navigation, bluetooth, etc. Wherein the camera application package is used for providing shooting functions. Further, the photographing function includes rotation photographing.
The application framework layer provides an application programming interface (application programming interface, API) and programming framework for the application of the application layer. The application framework layer includes a number of predefined functions.
The application framework layer may include a device state management service, a sensor service, a screen management module, a camera service, and the like.
The screen management module may be configured to receive a notification sent by the device state management service to light a screen, and to initiate the screen-on process based on that notification. In a specific implementation, the screen management module may include a power management service, a window management service, and the like. The power management service may be used to control powering the screen on and off during the screen-on process. The window management service may be used to draw the display window that is placed on the screen for the user to view after the screen is powered up. For example, upon entering rotation shooting, the power management service may control the front screen to power up, such as the outer screen 202, the first screen 3011, or the third screen 4013 described above, and the window management service may be used to draw the display window displayed on the front screen.
After registering with the sensor service for the attitude sensor and the Hall sensor, the device state management service can receive attitude events reported by the attitude sensor and Hall events reported by the Hall sensor. The device state management service then determines the final device form of the electronic device based on the attitude event and/or the Hall event, makes a screen-on decision based on the device form, and triggers the screen management module to start the screen-on process.
It will be appreciated that, as the folding screen is unfolded or folded, the included angles between the sub-screens of the folding screen, such as α, β, γ, and δ, change, and accordingly the device posture of the electronic device changes. Typically, the attitude sensor reporting a new attitude event and/or the Hall sensor reporting a new Hall event indicates that the device posture has changed. The attitude events include folding-screen postures such as the unfolded state, the semi-unfolded/semi-folded state, the tent state, and the desk-calendar state. The Hall events include an event in which the near state switches to the far state, and an event in which the far state switches to the near state. The electronic device may display the user interface on the screen that matches the device posture.
Of course, the device state management service may also make the screen-on decision based only on the attitude event, and trigger the screen management module to start the screen-on process. The embodiment of the present application is not particularly limited thereto. Hereinafter, the device posture decision determined based on attitude events and/or Hall events is mainly described as an example.
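A rough sketch of such a decision flow in the device state management service is given below, under the assumption that an attitude event and an optional Hall event feed a single decision function; all types, names, and the screen mapping (written for form one) are illustrative only.

```kotlin
// Hypothetical decision flow: an attitude event and/or a Hall event determine the device form,
// which in turn drives the screen-on decision. All identifiers are illustrative.
enum class AttitudeEvent { UNFOLDED, SEMI_FOLDED, TENT, DESK_CALENDAR, FOLDED }
enum class HallEvent { NEAR, FAR }
enum class DeviceForm { UNFOLDED, SEMI_FOLDED, TENT, DESK_CALENDAR, FOLDED }

fun decideDeviceForm(attitude: AttitudeEvent?, hall: HallEvent?): DeviceForm = when {
    // If the Hall sensor reports "near", the screens are (almost) closed regardless of attitude.
    hall == HallEvent.NEAR -> DeviceForm.FOLDED
    attitude == AttitudeEvent.UNFOLDED -> DeviceForm.UNFOLDED
    attitude == AttitudeEvent.TENT -> DeviceForm.TENT
    attitude == AttitudeEvent.DESK_CALENDAR -> DeviceForm.DESK_CALENDAR
    attitude == AttitudeEvent.FOLDED -> DeviceForm.FOLDED
    else -> DeviceForm.SEMI_FOLDED
}

// The screen management module is then asked to light the screen that matches the form
// (shown here for a form-one device with an outer screen).
fun screenForForm(form: DeviceForm): String = when (form) {
    DeviceForm.FOLDED -> "outer screen"
    else -> "folding screen"
}

fun main() {
    val form = decideDeviceForm(AttitudeEvent.SEMI_FOLDED, HallEvent.FAR)
    println("form=$form, light up: ${screenForForm(form)}")
}
```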
In a specific implementation manner, the device state management service may also make a decision to control the front screen to turn on after starting the rotation shooting, and trigger the screen management module to start the on-screen process, so as to ensure that the preview interface is kept displayed on the front screen after starting the rotation shooting.
The sensor service may be used to report the received gesture event as well as the hall event to the device state management service.
The camera service may be used for interactions between the camera application and the underlying layer, such as for the camera application to send a capture request to the layer, and for the underlying layer to transmit preview stream data to the camera application, etc.
The hardware abstraction layer is an interface layer located between the kernel layer and the hardware layer (not shown) for the purpose of abstracting the hardware to provide a virtual hardware platform for the operating system. The hardware abstraction layer may include a sensor interface service, an inter-core communication module a, a camera HAL (Camera Provider), and the like.
Wherein the sensor interface service may be used to register each sensor, and receive data sent by each sensor after registration. For example, a gesture sensor interface and a hall sensor interface may be included in the sensor interface service. Wherein the gesture sensor interface may be used to manage the registration of the gesture sensor and receive gesture events uploaded by the gesture sensor and send the gesture events to the sensor service. The hall sensor interface may be used to manage registering hall sensors and receive hall events uploaded by hall sensors and send the hall events to the sensor service.
The inter-core communication module a may be configured to manage communication between the AP subsystem and the sensorhub subsystem, so that the AP subsystem and the sensorhub subsystem may transmit data to each other. For example, a gesture event, a hall event, etc. is transmitted.
The camera HAL is used to provide a standard HAL interface definition language (HIDL) interface for the upper layer (e.g., the camera service) to call, so as to maintain normal communication with the upper layer. The camera HAL also controls the kernel layer downward through a standard HAL interface, such as HAL3.0, and obtains the preview stream data reported by the kernel layer.
The kernel layer is a layer between hardware and software. Camera drivers may be included in the kernel layer. The camera driver may receive a notification from an upper layer (e.g., a notification indicating to turn on or off the camera) and send a stream of functional processing parameters to hardware (e.g., camera module, ISP) based on the notification. And, the camera driver may also transfer the hardware-derived preview stream data to an upper layer.
The co-framework layer may include an event distribution manager, a sensor client manager, an inter-core communication module B, etc.
The event distribution manager may be configured to receive data uploaded by the kernel layer and distribute the data to other modules that have registered with the event distribution manager. For example, the event distribution manager may be configured to receive gesture events reported by the gesture sensors and then transmit the gesture events to the sensor client manager. The event distribution manager may also be configured to receive hall events reported by hall sensors and then transmit the hall events to the sensor client manager.
The sensor client manager may be configured to obtain data uploaded by the event distribution manager after registering with the event distribution manager, including, for example, gesture events, hall events, and the like.
The inter-core communication module B may be used to manage communication between the AP subsystem and the sensorhub subsystem, so that the two subsystems can transmit data to each other. For example, the inter-core communication module B may be configured to receive attitude events, Hall events, and the like transmitted by the sensor client manager, and transmit them to the inter-core communication module A.
The sensor driver may be used to start the sensors, e.g., the attitude sensor, the Hall sensor, and so on. After the attitude sensor is started, attitude events can be acquired and reported: upon start-up, the first-frame attitude event is reported to the event distribution manager; subsequently, if the attitude event of the current frame differs from that of the previous frame, the current-frame attitude event is also reported to the event distribution manager. After the Hall sensor is started, Hall events can be acquired and reported in the same way: the first-frame Hall event is reported to the event distribution manager, and subsequently, if the Hall event of the current frame differs from that of the previous frame, the current-frame Hall event is also reported to the event distribution manager.
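The "report only on change" behavior described above might look like the following sketch, in which the first frame is always delivered and later frames are delivered only when they differ from the previous one; the class and callback names are assumptions.

```kotlin
// Hypothetical sketch of change-only reporting by the sensor driver: the first frame is always
// reported; subsequent frames are reported only if they differ from the previous frame.
class ChangeOnlyReporter<T>(private val report: (T) -> Unit) {
    private var last: T? = null
    private var started = false

    fun onFrame(event: T) {
        if (!started || event != last) {
            report(event)  // deliver to the event distribution manager
            started = true
            last = event
        }
    }
}

fun main() {
    val reporter = ChangeOnlyReporter<String> { println("report: $it") }
    reporter.onFrame("SEMI_FOLDED")  // reported (first frame)
    reporter.onFrame("SEMI_FOLDED")  // suppressed (unchanged)
    reporter.onFrame("UNFOLDED")     // reported (changed)
}
```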
It should be understood that fig. 6A illustrates an exemplary software block diagram of an electronic device, and should not be construed as limiting embodiments of the present application. For example, the location (e.g., the hierarchy) of the various modules may vary in actual applications.
The shooting method provided by the embodiment of the application is further described below with reference to the software and hardware structure shown in fig. 6A:
After starting the rotation shooting, the electronic device may employ the communication path I in fig. 6A, i.e. through the camera application, the camera service, the camera HAL and the camera driver, to enable interaction between the camera application and hardware (e.g. camera module, ISP). For example, the preview stream data obtained by the hardware is finally reported to the camera application after being processed.
Also, the electronic device may employ communication path II in fig. 6A, that is, the camera application, the camera service, the device state management service, and the screen management module, to keep the front screen on and display the preview interface on the front screen at all times after entering rotation shooting. That is, the device state management service keeps the front screen lit the entire time after entering rotation shooting.
Meanwhile, after entering rotation shooting, the electronic device may still employ communication path III in fig. 6A, that is, the screen management module, the device state management service, the sensor interface service, the inter-core communication module A, the inter-core communication module B, the sensor client manager, the event distribution manager, and the sensor driver, so that attitude events and Hall events are still reported normally and the device posture is still determined. Subsequently, after rotation shooting is exited, the screen-on or screen-off of each screen is again decided based on the latest device posture. Thus, after exiting rotation shooting, the electronic device can promptly return to deciding the on/off state of each screen based on the device posture, without continuing to display the user interface on the front screen.
The shooting method provided by the embodiment of the application can be completed in the electronic equipment with the hardware structure. Hereinafter, an embodiment of the present application will be described by taking an example in which the electronic device is a mobile phone.
In a camera function of the mobile phone, such as an application interface of a camera application, a photographing portal for rotation photographing is provided, and the mobile phone enters the rotation photographing in response to a selection operation of the photographing portal.
For example, the mobile phone may display the interface 601 shown in fig. 6B, where the interface 601 is the desktop of the mobile phone. The interface 601 includes an application icon 6011 of the camera application. In response to a click operation on the application icon 6011 in the interface 601, the mobile phone may display the interface 602 (which may also be referred to as the first interface) shown in fig. 6B, where the interface 602 is an application interface of the camera application. The interface 602 includes mode options of various shooting modes, such as the photographing, portrait, and video modes. The interface 602 also includes a front-rear switching control 6021; in response to a click on the front-rear switching control 6021, the mobile phone can switch from the front camera to the rear camera or from the rear camera to the front camera.
In response to an upward slide operation starting from the front-rear switching control 6021, the mobile phone may display the interface 603 shown in fig. 6B, in which a photographing entry 6031 for rotation shooting is further displayed above the front-rear switching control 6021. In response to a selection operation (e.g., a click operation) on the photographing entry 6031, the mobile phone may enter the rotation shooting mode. For example, in response to the selection operation on the photographing entry 6031, the mobile phone may display the interface 604 (which may also be referred to as the second interface) shown in fig. 6B, where the photographing entry 6031 in the interface 604 is located at the original position of the front-rear switching control 6021, indicating that rotation shooting has been entered.
Also, the interface 604 includes various shooting mode options, such as a photographing mode option 6041, a portrait mode option 6042, a video mode option 6043, a panoramic mode option 6044, and the like. After entering rotation shooting, the mobile phone performs rotation shooting in the currently selected shooting mode by default, for example, the shooting mode selected by default in the interface 604. Subsequently, in response to a selection operation on another mode option, the mobile phone can switch to that shooting mode for rotation shooting. For example, in response to a selection operation on the panoramic mode option 6044, the mobile phone may switch to the panoramic mode for rotation shooting.
Of course, the photographing entry for rotation shooting is not limited to that shown in fig. 6B. For example, the mobile phone may display the interface 701 shown in fig. 7, which is the same as the interface 602 described above and is not repeated here. The interface 701 also includes "more" 7011. In response to a selection of "more" 7011, the mobile phone may display the interface 702, which includes mode options of more shooting modes provided by the camera, such as a high-pixel mode option 7021, a time-lapse mode option 7022, a dual-view video mode option 7023, and so on. The mode options of more shooting modes further include mode options of various rotation shooting modes, such as a rotation photographing mode option 7024, a rotation panorama mode option 7025, a rotation video mode option 7026, and the like. In response to a selection operation on a mode option of a rotation shooting mode, the mobile phone can enter rotation shooting in the corresponding mode. That is, the mode options of the rotation shooting modes are photographing entries for rotation shooting. For example, in response to a selection of the rotation photographing mode option 7024 in the interface 702, the mobile phone enters rotation shooting in the photographing mode.
Hereinafter, an embodiment of the present application will be described mainly by taking a form of a photographing portal shown in fig. 6B as an example.
In some embodiments, during rotation shooting, the electronic device always keeps lit a screen (hereinafter referred to as the front screen) that is visible to the user and does not rotate along with the rear camera, and displays the preview interface on the front screen, so that the user can conveniently preview the viewfinder picture. Note that the viewfinder picture refers to the picture acquired by the rear camera in real time.
Furthermore, in order to avoid accidental touches, while the preview interface is displayed on the front screen, the electronic device keeps the other screens off.
The above-mentioned function of always displaying the preview interface on the front screen may be referred to as a function one, and in particular, the following description of the function one in various product forms (e.g., form one-form three) will not be described herein.
After entering rotation shooting, the mobile phone is first in the preview stage before shooting; then, in response to a shooting operation, such as a click on the shooting button, the mobile phone enters the shooting stage. In modes that take a single photo other than panoramic shooting (such as the photographing mode, night mode, portrait mode, etc.), the mobile phone automatically ends the shot and returns to the preview stage. In panoramic shooting or video shooting modes (such as the video recording mode and the video mode), the mobile phone returns to the preview stage in response to an end-shooting operation, such as a click on the end-shooting button.
In some embodiments, during the preview stage of rotation shooting, that is, before shooting has started or after shooting has ended, if the rear camera is deflected toward the user as the folding screen is folded or unfolded, the mirror function of the rear camera may be turned on (which may be referred to simply as the rear mirror). It will be appreciated that after the rear mirror is turned on, the preview interface can show the same viewing effect as a mirror. Thus, when the rear camera captures the user and the scene behind the user, the electronic device presents a mirrored picture on the front screen, which makes it easy for the user to adjust the framing; for example, when the user moves left, the user in the viewfinder picture also moves left. If the rear camera is deflected toward the scene in front of the eyes, the electronic device can turn off the rear mirror, so that when the rear camera captures the scene in front of the eyes, the front screen shows a picture consistent with the orientation of the scene.
Further, unlike in the preview stage, after entering the shooting stage the electronic device keeps the rear mirror locked either on or off. If the rear mirror was on for the last frame before shooting started, it remains locked on throughout the shooting stage; if it was off for the last frame before shooting started, it remains locked off throughout the shooting stage. In this way, repeatedly opening or closing the rear mirror during shooting, which would interfere with the user's shot, is avoided.
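A minimal sketch of this locking behavior is given below, assuming a controller in which only the preview stage may change the rear-mirror state and the shooting stage freezes it; the identifiers are hypothetical.

```kotlin
// Hypothetical sketch of locking the rear-mirror state while shooting: whatever the mirror
// state was for the last preview frame is frozen for the whole capture.
class RearMirrorController {
    var mirrorOn: Boolean = false
        private set
    private var lockedDuringCapture = false

    fun onPreviewMirrorDecision(shouldMirror: Boolean) {
        if (!lockedDuringCapture) mirrorOn = shouldMirror  // only the preview stage may change it
    }

    fun onCaptureStart() { lockedDuringCapture = true }    // lock the current state
    fun onCaptureEnd() { lockedDuringCapture = false }     // preview stage resumes normal behavior
}

fun main() {
    val c = RearMirrorController()
    c.onPreviewMirrorDecision(true)   // mirror on in preview
    c.onCaptureStart()
    c.onPreviewMirrorDecision(false)  // ignored: state stays locked on during capture
    println(c.mirrorOn)               // true
}
```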
Further, when the rear mirror is turned on or off, the viewfinder picture in the preview interface is flipped left-right. Therefore, the mobile phone may display a transition effect, such as a blur transition, when the rear mirror is turned on or off, so that the switching process is smoother and less abrupt.
The above-mentioned function of turning on or off the rear mirror image may be referred to as a second function, and in detail, reference will be made to the following description of the second function, which will not be described herein.
Typically, after switching to the panoramic mode, the electronic device may display a guide arrow in the preview interface (which may also be referred to as the fourth interface) for guiding the user to move the electronic device to take a panoramic photo. In an ordinary shooting scene, the direction indicated by the guide arrow is from left to right.
In some embodiments, at least one of the following reasons is considered:
Reason one: in the portrait orientation, for a laterally folded product form (as shown in fig. 2-4B above), folding and unfolding the folding screen usually causes the view of the rear camera to move left and right, whereas for a vertically folded product form (as shown in fig. 5 above), folding and unfolding usually causes the view of the rear camera to move up and down.
Of course, the opposite holds in the landscape orientation, which is not repeated here.
Reason two: for a product of the same form, the view of the rear camera changes differently during folding and during unfolding. For the product of form one, the view of the rear camera changes counterclockwise during unfolding and clockwise during folding. For the product of form two, the view of the rear camera changes clockwise during unfolding and counterclockwise during folding. For the product of form three shown in fig. 4A, the view of the rear camera changes clockwise while the first screen 4011 and the second screen 4012 are being unfolded, and counterclockwise while they are being folded; the view of the rear camera changes counterclockwise while the second screen 4012 and the third screen 4013 are being unfolded, and clockwise while they are being folded.
Based on the above, it can be seen that, unlike an ordinary photographing scene, the view of the rear camera can change in many different ways during rotation shooting. Thus, the electronic device may provide a switching entry (e.g., a direction switching control as described below) for the user to manually switch the direction of the guide arrow, so as to flexibly realize rotation shooting in different directions. And/or, the electronic device may predict the user's shooting requirement based on the included angles between the sub-screens of the folding screen (such as α, β, γ, δ, etc.), and automatically adjust the direction indicated by the guide arrow accordingly, so that the direction indicated by the guide arrow matches the user's shooting requirement. The shooting requirements include clockwise shooting and counterclockwise shooting, which correspond to the left-to-right and right-to-left directions, respectively.
Of course, the manual adjustment and the automatic adjustment above may be combined. In addition, similar to the opening or closing of the rear mirror described above, the electronic device keeps the direction indicated by the guide arrow locked, either from left to right or from right to left, while the panoramic photo is being taken. That is, if the direction indicated by the guide arrow is from left to right for the last frame before shooting starts, the direction remains locked from left to right during the panoramic shot; if it is from right to left for the last frame before shooting starts, it remains locked from right to left during the panoramic shot. In this way, interference with the shooting of the panoramic photo is avoided.
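The direction selection and locking described above might be sketched as follows. The mapping from an increasing or decreasing α to the arrow direction follows the form-one example in reason two (unfolding gives a counterclockwise view change, i.e., right to left; folding gives a clockwise view change, i.e., left to right); the mapping and all identifiers are illustrative assumptions.

```kotlin
// Hypothetical sketch of choosing and locking the panorama guide-arrow direction for a
// form-one device. The unfolding/folding-to-direction mapping is illustrative only.
enum class ArrowDirection { LEFT_TO_RIGHT, RIGHT_TO_LEFT }

class GuideArrowController {
    var direction = ArrowDirection.LEFT_TO_RIGHT
        private set
    private var locked = false
    private var lastAlpha: Float? = null

    // Called with the latest included angle α in the preview stage.
    fun onAngleUpdate(alpha: Float) {
        val previous = lastAlpha
        lastAlpha = alpha
        if (locked || previous == null) return
        direction = if (alpha > previous) ArrowDirection.RIGHT_TO_LEFT  // unfolding: counterclockwise
                    else ArrowDirection.LEFT_TO_RIGHT                   // folding: clockwise
    }

    // A manual switching entry (e.g., a direction switching control) may also flip the direction.
    fun onManualSwitch() {
        if (!locked) direction =
            if (direction == ArrowDirection.LEFT_TO_RIGHT) ArrowDirection.RIGHT_TO_LEFT
            else ArrowDirection.LEFT_TO_RIGHT
    }

    fun onPanoramaStart() { locked = true }   // lock the direction used for the last preview frame
    fun onPanoramaEnd() { locked = false }
}

fun main() {
    val g = GuideArrowController()
    g.onAngleUpdate(90f)
    g.onAngleUpdate(120f)   // α increasing: unfolding, arrow becomes right-to-left
    g.onPanoramaStart()
    g.onAngleUpdate(80f)    // ignored while the panoramic photo is being taken
    println(g.direction)    // RIGHT_TO_LEFT
}
```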
The above-mentioned function of adjusting the direction indicated by the guide arrow may be referred to as a third function, and reference will be made specifically to the following description of the third function, and the description thereof will not be repeated here.
In some embodiments, after entering the rotation shooting, the user may adjust the view range of the rear camera by folding or unfolding the folding screen, and after adjusting to the appropriate view range, the user may perform a shooting operation, such as a click operation on a shooting button in the preview interface. In response to the shooting operation, the mobile phone can shoot an image, and the interface in shooting can be called a third interface.
The above-described function in response to the photographing operation may be referred to as a fourth function, and the description of the fourth function will be specifically referred to hereinafter, and will not be described in detail.
In some embodiments, after entering the rotation shooting, such as after the shooting is completed, the cell phone may exit the rotation shooting and enter the normal shooting in response to an operation to exit the rotation shooting, such as a click operation on the shooting portal 6031 in the previous interface 604. It will be appreciated that the operation of exiting the rotation capture may clearly indicate the intent to exit the rotation capture, and thereafter the user interface need not be displayed on the front screen at all times.
Thus, in response to the operation of exiting rotation shooting, the mobile phone can decide, based on the device posture, which screen displays the user interface, such as the ordinary shooting interface, without always keeping the user interface on the front screen.
The above-described function in response to the operation of exiting the rotation shooting may be referred to as a function five, and the description of the function five will be specifically referred to hereinafter, and will not be described in detail.
In some embodiments, after entering rotation shooting, such as after shooting is completed, in response to an operation that causes the camera application to exit the foreground, the mobile phone may move the camera application out of the foreground, that is, run the camera application in the background.
Unlike the operation of exiting rotation shooting, an operation that sends the camera application to the background does not clearly indicate that the user wants to exit rotation shooting; it is quite possible that the user only temporarily moves the camera application to the background in order to view, edit, or share the shooting result in the foreground.
Based on this, in response to the operation of exiting the camera application from the foreground, the cellular phone can decide whether to exit the rotation shooting based on the product morphology.
The above function in response to the operation of the camera application exiting the foreground may be referred to as the sixth function; for details, refer to the description of the sixth function below, which is not expanded here.
It should be noted that the above-mentioned first to sixth functions may be flexibly combined. Illustratively, during the preview stage of the panoramic mode, the electronic device can both dynamically turn the rear mirror on or off and adjust the direction indicated by the guide arrow. The embodiment of the present application is not particularly limited thereto.
The specific implementations of the above functions are described below with respect to form one to form three, respectively.
Form one
Function one
In form one, the front screen is the external screen 202. Therefore, after entering rotation shooting, as the included angle α between the first screen 2011 and the second screen 2012 of the folding screen 201 varies between 0° and 180°, the mobile phone always keeps the external screen 202 lit and displays the rotation shooting preview interface on the external screen 202.
Illustratively, after entering rotation shooting, starting from the unfolded state shown by the solid line in fig. 8, as the first screen 2011 (blocked by the rear cover 203 and not visible in the drawing) of the folding screen 201 is folded along the direction shown by 1b, α gradually decreases from 180°, and the folded state is reached when α decreases to 0°. Throughout this process, the mobile phone keeps displaying the preview interface on the external screen 202.
It should be noted that during the rotation process, the view finding range of the rear camera may be changed, so that the view finding picture collected in real time may also be changed, and accordingly, the real-time view finding picture may be displayed in the preview interface.
In order to always display the preview interface on the external screen 202, in the first aspect, the mobile phone needs to ensure that the external screen 202 is used for display after receiving the selection operation on the photographing entry, and in the second aspect, the mobile phone needs to ensure that the external screen 202 continues to be used for display until rotation shooting is exited. The implementations of these two aspects are described below.
First aspect:
In practice, the mobile phone may receive the user's selection operation on the photographing entry for rotation shooting while an application interface of the camera application is displayed on the external screen 202. In this case, the external screen 202 is already lit when rotation shooting is entered.
Alternatively, the mobile phone may receive the user's selection operation on the photographing entry 9011 for rotation shooting while the application interface 901 of the camera application shown in fig. 9A is displayed on the folding screen 201. It will be appreciated that when the folding screen 201 is lit, the external screen 202 is normally in the off state shown in fig. 9A. In this case, the external screen 202 is not lit when rotation shooting is entered, and the preview interface cannot be displayed on the external screen 202.
Of course, it is also possible for the mobile phone to display the user interface on both the folding screen 201 and the external screen 202. In this case, although the condition that the external screen 202 is lit is satisfied, the folding screen 201 is also lit, which easily causes accidental touches.
Thus, in some embodiments, in response to the selection operation on the photographing entry, the mobile phone may query whether only the external screen 202 is lit. If it is not the case that only the external screen 202 is lit, the mobile phone switches to the state shown in fig. 9A, in which the external screen 202 is lit and the folding screen 201 is off, and displays the rotation shooting preview interface on the external screen 202. If only the external screen 202 is lit, no switching is needed. In this way, the mobile phone can ensure that only the external screen 202 is lit when rotation shooting is entered.
Further, if it is not the case that only the external screen 202 is lit, the mobile phone may display a switching prompt on the screen that is currently lit (such as the folding screen 201), to prompt that the display is about to switch to the external screen 202 and that rotation shooting is about to start. For example, in response to the selection operation on the photographing entry 9011 in the application interface 901 shown in fig. 9A, the mobile phone may display "About to switch to the external screen 202 and enter rotation shooting" on the folding screen 201. Subsequently, after the switching prompt has been displayed for a preset duration (for example, 3 seconds), or after receiving a confirmation operation from the user, the mobile phone may switch to the state shown in fig. 9A, in which the external screen 202 is lit and the folding screen 201 is off.
Second aspect:
After entering rotation shooting, the mobile phone needs to keep displaying the preview interface on the external screen 202. Thus, in some embodiments, after entering rotation shooting, the mobile phone may record an identification A of rotation shooting to indicate that rotation shooting is in use. As long as the identification A is recorded, the mobile phone does not switch the user interface to the folding screen 201, even if the condition for switching the display to the folding screen 201 is satisfied. In this way, the mobile phone can ensure that the preview interface is not switched to the folding screen 201 during rotation shooting and remains displayed on the external screen 202.
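Illustratively, the effect of the identification A can be sketched as follows: as long as the identification is recorded, requests to route the user interface to the folding screen 201 are ignored. The class and method names are hypothetical.

```kotlin
// Hypothetical sketch of the identification A: while rotation shooting is active, requests to move
// the UI to the folding screen are ignored, so the preview stays on the external screen.
class DisplayRouter {
    private var rotationShootingActive = false  // "identification A"

    fun enterRotationShooting() { rotationShootingActive = true }
    fun exitRotationShooting() { rotationShootingActive = false }

    // Returns the screen that should display the user interface.
    fun resolveTargetScreen(deviceFormWantsFoldingScreen: Boolean): String =
        if (rotationShootingActive) "external screen 202"
        else if (deviceFormWantsFoldingScreen) "folding screen 201"
        else "external screen 202"
}

fun main() {
    val router = DisplayRouter()
    router.enterRotationShooting()
    // Even though the device form would normally put the UI on the folding screen, the flag wins.
    println(router.resolveTargetScreen(deviceFormWantsFoldingScreen = true))  // external screen 202
}
```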
Function two
In the preview stage, the mobile phone can dynamically turn the rear mirror on or off based on α, so that the rear mirror is turned on after the rear camera 204 is deflected toward the user and turned off after the rear camera 204 is deflected toward the scene in front of the eyes.
Specifically, the mobile phone may turn the rear mirror on or off based on angle 3 and angle 4 shown in fig. 9B. Both angle 3 and angle 4 are close to 90°, typically with angle 3 < 90° and angle 4 > 90°.
If α < angle 3 when entering the preview stage, the mobile phone may close the rear mirror; then, if α changes from α < angle 3 to α > angle 4, it opens the rear mirror; if α changes again from α > angle 4 to α < angle 3, it closes the rear mirror, and so on. If α > angle 4 when entering the preview stage, the mobile phone may open the rear mirror; then, if α changes from α > angle 4 to α < angle 3, it closes the rear mirror; if α changes again from α < angle 3 to α > angle 4, it opens the rear mirror, and so on.
In addition, if angle 3 ≤ α ≤ angle 4 when entering the preview stage, the mobile phone may close the rear mirror by default, and open the rear mirror after α changes from angle 3 ≤ α ≤ angle 4 to α > angle 4. Subsequently, if α changes from α > angle 4 to α < angle 3, the mobile phone may close the rear mirror; if α changes again from α < angle 3 to α > angle 4, the mobile phone may open the rear mirror, and so on.
The following description mainly covers the cases where α < angle 3 or α > angle 4 when entering the preview stage.
It will be appreciated that, for a mobile phone with the folding screen 201, the user usually takes photos in the portrait orientation. On this basis, when the user turns the external screen 202 toward himself for preview, the rear camera 204 provided on the rear cover 203 is usually deflected toward the user when α > 90°, as shown in states ①, ②, and ③ in fig. 8, and is usually deflected toward the scene in front of the eyes when α < 90°, as shown in state ④ in fig. 8.
In a specific implementation, there is a following region X between angle 3 and angle 4. Within the following region X, the mobile phone keeps the state of the rear mirror (on or off) the same as in the previous frame: if the rear mirror was on in the previous frame, it is also on in the current frame; if it was off in the previous frame, it is also off in the current frame. In this way, the mobile phone avoids frequently opening and closing the rear mirror when α jitters within a small angle range.
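A minimal sketch of the following region X is given below, using the 87°/93° example values from the next paragraph; within the region, the previous frame's mirror state is kept unchanged. The identifiers are hypothetical.

```kotlin
// Hypothetical sketch of the following region X: with angle 3 = 87° and angle 4 = 93° (the example
// values used below), the mirror is closed below angle 3, opened above angle 4, and kept unchanged
// inside [angle 3, angle 4] so that small jitter in α does not toggle it.
class MirrorHysteresis(private val angle3: Float = 87f, private val angle4: Float = 93f) {
    private var mirrorOn = false

    fun update(alpha: Float): Boolean {
        mirrorOn = when {
            alpha < angle3 -> false  // rear camera biased toward the scene in front: close the mirror
            alpha > angle4 -> true   // rear camera biased toward the user: open the mirror
            else -> mirrorOn         // following region X: keep the previous frame's state
        }
        return mirrorOn
    }
}

fun main() {
    val h = MirrorHysteresis()
    println(h.update(86f))  // false (closed)
    println(h.update(93f))  // false (inside region X, keeps previous state)
    println(h.update(94f))  // true  (opened)
}
```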
Taking angle 3 as 87° and angle 4 as 93°, as shown in fig. 10: when the mobile phone is in the state of α = 86° shown in fig. 11A, α < angle 3 is satisfied, the rear mirror is closed, and the mobile phone can display the preview interface 1101 shown in fig. 11A on the external screen 202. Subsequently, the second screen 2012 is unfolded along the direction 1m (opposite to the direction 1b above); before α > 93° is satisfied, the angle is within the following region X shown in fig. 10, and the rear mirror remains closed. For example, when the mobile phone is in the state of α = 93° shown in fig. 11A, it may display the preview interface 1102 shown in fig. 11A on the external screen 202; obviously, compared with the preview interface 1101, the viewfinder picture in the preview interface 1102 has only shifted left and has not been mirrored left-right. Subsequently, the second screen 2012 continues to be unfolded along the direction 1m; when the mobile phone reaches the state of α = 94° shown in fig. 11A, α > 93° is satisfied, the rear mirror is opened, and the mobile phone can display the preview interface 1103 shown in fig. 11A on the external screen 202, where the viewfinder picture has been mirrored left-right, that is, the preview interface shows the same effect as a mirror.
After the rear camera 204 is biased toward the user (for example, α > 93°) and the rear mirror image is opened, the user can conveniently adjust the photographed scene according to the mirror effect in the viewfinder picture. Illustratively, when the user moves left, the user in the preview interface also moves left, and when the user moves right, the user in the preview interface also moves right. For example, in the case where the mobile phone is in the state of α=94° shown in fig. 11A, the user appears too far to the right in the preview interface 1103; after the user moves left, the mobile phone can display the preview interface 1104 shown in fig. 11A on the external screen 202. It is apparent that, compared with the preview interface 1103, the user's image in the preview interface 1104 has shifted left. It can be seen that opening the rear mirror image after the rear camera 204 is biased toward the user better conforms to the usage habit of self-timer preview.
Similarly, after the rear camera 204 is biased toward the scene in front of the eyes (e.g., α < 87°) and the rear mirror image is closed, the user can conveniently adjust the photographed scene according to the imaging effect in the viewfinder. Illustratively, if the user feels that the scene in the preview interface is too far to the left, the user moves the mobile phone to the left. It can be seen that closing the rear mirror image after the rear camera 204 is biased toward the scene in front of the eyes accords with the usage habit of the rear camera in ordinary photographing.
During the shooting stage, the mobile phone always locks the rear mirror image in its current state, whether open or closed.
Taking the example that angle 3 is 87° and angle 4 is 93°, in the preview stage of the video mode, when the mobile phone is in the state of α=94° shown in fig. 11B, α > angle 4 is satisfied and the rear mirror image is opened, so the mobile phone can display the preview interface 1111 shown in fig. 11B on the external screen 202, in which the viewfinder picture is a mirrored image. In response to a click operation on the photographing button 11111 in the preview interface 1111, the mobile phone can display the photographing interface 1112 shown in fig. 11B on the external screen 202. On the one hand, the photographing interface 1112 includes an end photographing button 11121 and a photographing timer 11122 to indicate that photographing has started; on the other hand, compared with the preview interface 1111, the viewfinder picture in the photographing interface 1112 is not flipped left and right, indicating that the rear mirror image is still kept open when photographing starts. Subsequently, in the shooting stage, α decreases from 94° to 86°; although the change from α > angle 4 to α < angle 3 is satisfied, the mobile phone can display the photographing interface 1113 shown in fig. 11B on the external screen 202. Compared with the photographing interface 1112, the viewfinder picture in the photographing interface 1113 is not flipped left and right, meaning that the rear mirror image is still kept open after α decreases from 94° to 86°. That is, the handset keeps the rear mirror image locked open during the shooting process.
Function three
After entering the rotation shooting, the mobile phone may display the preview interface 1201 shown in fig. 12 on the external screen 202, where the photographing mode is selected in the preview interface 1201, that is, the rotation shooting is currently in the photographing mode. In response to a selection operation on the mode option 12011 of the panoramic mode in the preview interface 1201, the mobile phone may display the preview interface 1202 shown in fig. 12 on the external screen 202, where the panoramic mode is selected in the preview interface 1202, that is, the rotation shooting is currently in the panoramic mode.
With continued reference to the preview interface 1202 in fig. 12, in the panoramic mode of the rotation shooting, a guide arrow 12021 is included in the preview interface 1202. The guide arrow 12021 is used to indicate the moving direction of the panoramic shooting. By default, the guide arrow 12021 indicates movement from left to right.
To facilitate the user taking panoramic photographs in any direction of movement, during the preview phase, the handset may provide a direction switch control 12022 in the preview interface 1202 shown in fig. 12. The direction switch control 12022 is used to trigger the handset to switch the direction indicated by the guide arrow 12021.
The directions indicated by the guide arrow 12021 include a plurality of directions: left to right, right to left, top to bottom, and bottom to top.
In a specific implementation, the plurality of directions have a priority order. Illustratively, as shown in fig. 13, the priority order, from highest to lowest, is left to right, right to left, bottom to top, and top to bottom. In this implementation, in response to a switching operation (e.g., a clicking operation) on the direction switching control 12022, the mobile phone may switch sequentially according to the priority order.
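As an illustration of switching sequentially according to the priority order, a minimal sketch follows; the direction labels and the function name are assumptions, while the list order follows the priority described above.

```python
# Illustrative sketch of cycling the guide-arrow direction in priority order
# when the direction switch control is tapped. Names are assumptions.

DIRECTIONS = ["left_to_right", "right_to_left", "bottom_to_top", "top_to_bottom"]

def next_direction(current: str) -> str:
    """Switch to the next direction in priority order, wrapping around."""
    i = DIRECTIONS.index(current)
    return DIRECTIONS[(i + 1) % len(DIRECTIONS)]

# Example: repeated taps on the direction switch control, starting from the default.
direction = "left_to_right"
for _ in range(4):
    direction = next_direction(direction)
    print(direction)
# right_to_left, bottom_to_top, top_to_bottom, left_to_right
```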
For example, in the panoramic mode of rotation shooting, in the preview stage, the mobile phone may display a preview interface 1401 shown in fig. 14 on the external screen 202, which is the same as the preview interface 1202 described above, and will not be described herein. In response to a click operation of the direction switch control 12022 in the preview interface 1401, the handset can display a preview interface 1402 shown in fig. 14 in the external screen 202. Unlike preview interface 1401, the direction indicated by guide arrow 12021 in preview interface 1402 is from right to left. In response to a click operation of the direction switch control 12022 in the preview interface 1402, the handset can display the preview interface 1403 shown in fig. 14 in the external screen 202. Unlike preview interface 1402, the direction indicated by guide arrow 12021 in preview interface 1403 is from bottom to top. In response to a click operation of the direction switch control 12022 in the preview interface 1403, the handset can display a preview interface 1404 shown in fig. 14 in the external screen 202. Unlike preview interface 1403, the direction indicated by guide arrow 12021 in preview interface 1404 is from top to bottom.
In another specific implementation, the preset directions have no priority order. In this design, in response to a switching operation (e.g., a clicking operation) on the direction switching control 12022, the mobile phone may provide direction options for selecting among the multiple directions. Subsequently, in response to a selection operation on another one of the direction options, the mobile phone can switch to the corresponding direction.
It should be noted that, the form and the display position of the direction switching control 12022 are not limited to those shown in fig. 12 and 14. Of course, the user may also switch the direction indicated by the guide arrow 12021 in other ways. For example, the handset may also switch the direction indicated by the guide arrow 12021 based on a user's switch gesture, such as a double tap gesture, a tap gesture in various directions, or the like. The embodiment of the present application is not particularly limited thereto.
In order to facilitate the user taking panoramic pictures in any moving direction, the handset may also dynamically adjust the direction indicated by the guide arrow 12021 during the preview phase based on α. Considering a cellular phone having a folding screen 201, a user usually photographs in a portrait state, and thus the following description will mainly be given by taking left to right and right to left as an example.
The mobile phone can switch the direction indicated by the guide arrow 12021 based on angle 5 and angle 6 shown in fig. 15A. Angle 5 and angle 6 are close to 90°, typically with angle 5 < 90° and angle 6 > 90°.
Specifically, when switching to the panoramic mode, if α < angle 5, the direction indicated by the guide arrow 12021 is from right to left. Subsequently, if α changes from α < angle 5 to α > angle 6, the handset may switch the direction indicated by the guide arrow 12021 to left to right; if α changes again from α > angle 6 to α < angle 5, the handset may switch the direction indicated by the guide arrow 12021 to right to left, and so on. When switching to the panoramic mode, if α > angle 6, the direction indicated by the guide arrow 12021 is from left to right. Subsequently, if α changes from α > angle 6 to α < angle 5, the handset may switch the direction indicated by the guide arrow 12021 to right to left; if α changes again from α < angle 5 to α > angle 6, the handset may switch the direction indicated by the guide arrow 12021 to left to right.
In addition, when switching to the panoramic mode, if angle 5 ≤ α ≤ angle 6, the guide arrow 12021 may indicate the default direction, left to right; after α changes from angle 5 ≤ α ≤ angle 6 to α < angle 5, the handset may switch the direction indicated by the guide arrow 12021 to right to left. Subsequently, if α changes from α < angle 5 to α > angle 6, the handset may switch the direction indicated by the guide arrow 12021 to left to right; if α changes again from α > angle 6 to α < angle 5, the handset may switch the direction indicated by the guide arrow 12021 to right to left, and so on.
The description herein mainly addresses the case where α < angle 5 or α > angle 6 when switching to the panoramic mode.
Typically, α > 90° indicates that the user wants to start from a state closer to the unfolded state and shoot during a gradual folding process, in which the view range of the rear camera 204 changes clockwise from being biased toward the user to being biased toward the scene in front of the eyes, like shooting from left to right. α < 90° indicates that the user wants to start from a state closer to the folded state and shoot during a gradual unfolding process, in which the view range of the rear camera 204 changes in the opposite direction, from being biased toward the scene in front of the eyes to being biased toward the user, like shooting from right to left. Thus, by dynamically adjusting the direction indicated by the guide arrow 12021 based on α, the mobile phone makes the direction indicated by the guide arrow 12021 match the shooting intention of the user.
In a specific implementation, there is a following region Y between angle 5 and angle 6. In the following region Y, the handset keeps the direction indicated by the guide arrow 12021 the same as that indicated in the previous frame. That is, in the following region Y, if the direction indicated by the guide arrow 12021 in the previous frame is from left to right, the direction indicated in the current frame is also from left to right; if the direction indicated in the previous frame is from right to left, the direction indicated in the current frame is also from right to left. In this way, the handset can avoid frequently switching the direction indicated by the guide arrow 12021 when α jitters within a small angle range.
Taking the example that angle 5 is 87° and angle 6 is 93° as shown in fig. 15B, when the mobile phone is in the state of α=94° shown in fig. 16A, α > 93° is satisfied, the mobile phone can display the preview interface 1601 shown in fig. 16A on the external screen 202, and the direction indicated by the guide arrow 12021 in the preview interface 1601 is from left to right. Subsequently, the second screen 2012 is folded in the direction of 1b, and before α < 87° is satisfied, the mobile phone is in the following region Y shown in fig. 15B, so the direction indicated by the guide arrow 12021 remains left to right. For example, when the mobile phone is in the state of α=87° shown in fig. 16A, the mobile phone may display the preview interface 1602 shown in fig. 16A on the external screen 202, and the direction indicated by the guide arrow 12021 in the preview interface 1602 is still left to right, with no switch occurring. Subsequently, the second screen 2012 continues to fold in the direction of 1b, and when the mobile phone reaches the state of α=86° shown in fig. 16A, the change from α > 93° to α < 87° is satisfied, indicating that the user intends to start from a state closer to the folded state and shoot during a gradual unfolding process. Thus, the handset may display the preview interface 1603 shown in fig. 16A on the external screen 202, in which the direction indicated by the guide arrow 12021 is switched to right to left, so that the direction indicated by the guide arrow 12021 matches the shooting intention.
In the shooting stage, even if the switching condition for switching the direction indicated by the guide arrow 12021 is satisfied, such as a change from α < angle 5 to α > angle 6 or from α > angle 6 to α < angle 5, the mobile phone does not switch the direction indicated by the guide arrow 12021, but locks it to the direction indicated in the last frame before shooting started.
Still taking the example that angle 6 is 93° and angle 5 is 87°, when the mobile phone is in the state of α=94° shown in fig. 16B, α > angle 6 is satisfied, the mobile phone may display the preview interface 1611 shown in fig. 16B on the external screen 202, and the direction indicated by the guide arrow 12021 in the preview interface 1611 is from left to right. In response to a click operation on the shooting button 16111 in the preview interface 1611, the handset can start panoramic shooting and keep the direction indicated by the guide arrow 12021 left to right until shooting ends. For example, when α=94° during the shooting stage, the mobile phone may display the photographing interface 1612 shown in fig. 16B on the external screen 202, in which the direction indicated by the guide arrow 12021 is left to right. When α decreases from 94° to 85°, although the change from α > angle 6 to α < angle 5 is satisfied, the mobile phone may display the photographing interface 1613 shown in fig. 16B on the external screen 202, in which the direction indicated by the guide arrow 12021 remains left to right, because the direction is locked during the shooting stage.
Subsequently, upon exiting from the shooting stage back to the preview stage, the handset may continue to dynamically adjust the direction indicated by the guide arrow 12021 based on α. With continued reference to the example of fig. 16B, and taking the example that angle 5 is 87°, when the external screen 202 displays the photographing interface 1613 shown in fig. 16B and α < angle 5 is satisfied, then in response to a click operation on the end photographing button 16131 in the photographing interface 1613, the mobile phone may display the preview interface 1614 shown in fig. 16B on the external screen 202. Unlike the photographing interface 1613, the direction indicated by the guide arrow 12021 in the preview interface 1614 has changed to right to left.
In practice, the above embodiment of manually adjusting the direction indicated by the guide arrow 12021 and the embodiment of dynamically adjusting the direction indicated by the guide arrow 12021 by the mobile phone based on α may be combined. Continuing with the example of fig. 16A, the handset further includes a direction switch control 12022 in the preview interface displayed on the external screen 202, such as the preview interface 1601, the preview interface 1602, and the preview interface 1603, and the user can still switch the direction indicated by the guide arrow 12021 by operating the direction switch control 12022. After the user manually adjusts the direction indicated by the guide arrow 12021, the mobile phone further adjusts the direction indicated by the guide arrow 12021 when the switching condition is satisfied.
Function four
After entering the rotation photographing, the user can adjust the viewing range of the rear camera 204 by folding or unfolding the folding screen 201, and after adjusting to an appropriate viewing range, the user can perform a photographing operation, such as a clicking operation on a photographing button in the preview interface. In response to the photographing operation, the mobile phone may photograph an image.
In the photographing mode, after the user adjusts to a suitable viewing range, the mobile phone may, in response to the photographing operation, use the rear camera 204 to photograph a picture of the corresponding viewing range. Thus, the mobile phone can shoot scenes to the side and behind without requiring the user to turn around, and can also shoot the user himself or herself.
In the video recording mode, after the user adjusts to a suitable viewing range, the mobile phone may, in response to the shooting operation, use the rear camera 204 to start video recording from the corresponding viewing range; during recording, the user can continuously adjust the viewing range, and the rear camera 204 continuously captures images corresponding to the updated viewing range. In response to an operation of ending the recording, the mobile phone obtains a video whose viewing range changes over time. For example, the mobile phone may record a video whose viewing range sweeps from behind the user to in front of the eyes. Therefore, the mobile phone can flexibly record a video with a changing viewing range without requiring the user to turn around.
In the panoramic mode, after the user adjusts to a suitable viewing range, the mobile phone may, in response to the photographing operation, use the rear camera 204 to start panoramic shooting from the corresponding viewing range. Moreover, as the user continues to adjust the viewing range, the handset can adjust the position of the guide arrow 12021 in the preview interface and display the synthesized panoramic photograph. Continuing with fig. 16B as an example, in the shooting stage, as the viewing range changes from that of α=94° to that of α=85°, the guide arrow 12021 moves from its position in the photographing interface 1612 to its position in the photographing interface 1613, i.e., shifts to the right, and the panoramic photograph changes from the photograph 16121 in the photographing interface 1612 to the photograph 16132 in the photographing interface 1613.
Further, in the panoramic mode, the mobile phone may calculate the displacement of the guide arrow 12021 in the preview interface through signals acquired by the acceleration sensors (e.g., the acceleration sensor 180E) and the gyro sensors (e.g., the gyro sensor 180B) respectively provided on the first screen 2011 and the second screen 2012, thereby displaying the guide arrow 12021 in real time.
The displacement includes a left-right movement displacement and an up-down shake displacement. Illustratively, the up-down shake displacement may be used by the handset to adjust the up-and-down floating display of the guide arrow 12021 and to prompt the user to move smoothly.
Furthermore, in the panoramic mode, the mobile phone can synthesize the viewfinder frames of the successive viewing ranges into a panoramic photograph through a panoramic algorithm. By way of example, the panoramic algorithm may extract the common feature areas and differing pixel areas of two viewfinder frames, remove the overlapping portion, splice the frames together, and crop off the pixel portions that are uneven due to up-and-down shake, thereby synthesizing a panoramic photograph.
In a specific implementation, the panoramic algorithm obtains the angular velocity displacement of the panoramic shooting by using the gyro sensors respectively arranged on the first screen 2011 and the second screen 2012, and the angular velocity displacement is used to accelerate the feature analysis of the two viewfinder frames, that is, the identification of their common feature areas and differing pixel areas.
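As an illustration only, the splice-and-crop idea described above can be approximated with a generic feature-based stitcher; the sketch below uses OpenCV's high-level Stitcher as a stand-in and does not reflect the handset's actual panoramic algorithm or its gyroscope-assisted feature analysis.

```python
# Illustrative stand-in for the described panorama synthesis: stitch successive
# viewfinder frames and crop the rough top/bottom edges caused by up-down shake.
# Uses OpenCV's generic Stitcher; the real device algorithm is not specified here.
import cv2

def stitch_frames(frames, vertical_crop_ratio=0.05):
    """Stitch a list of BGR frames into one panorama and crop shaky edges."""
    stitcher = cv2.Stitcher_create(cv2.Stitcher_PANORAMA)
    status, panorama = stitcher.stitch(frames)
    if status != cv2.Stitcher_OK:
        raise RuntimeError(f"stitching failed with status {status}")
    h = panorama.shape[0]
    margin = int(h * vertical_crop_ratio)          # crude crop of uneven edges
    return panorama[margin:h - margin, :]

# Example usage (file names are placeholders):
# frames = [cv2.imread(p) for p in ("f0.jpg", "f1.jpg", "f2.jpg")]
# cv2.imwrite("panorama.jpg", stitch_frames(frames))
```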
Function five
After entering the rotation shooting, for example after shooting is completed, the cellular phone may exit the rotation shooting and enter the normal shooting in response to an operation of exiting the rotation shooting. For example, the operation of exiting the rotation shooting may be a click operation on the shooting entry 17011 of the rotation shooting in the preview interface 1701 shown in fig. 17 displayed on the external screen 202; in response to this click operation, the cellular phone exits the rotation shooting and enters the normal shooting.
In response to the operation of exiting the rotation shooting, the cellular phone may display the normal shooting interface using the folding screen 201 or the external screen 202 based on the device posture decision. Specifically, after exiting the rotation shooting, the mobile phone may clear the recorded identifier a. Without the identifier a, the handset may then switch the screen that displays the user interface between the folding screen 201 and the external screen 202 based on the device posture decision.
If the device posture is posture m, such as a folded state, the external screen 202 continues to be used to display the normal shooting interface; if the device posture is posture n, such as an unfolded state, the mobile phone switches to the folding screen 201 to display the normal shooting interface.
Taking as an example that the device posture of the mobile phone corresponding to α=4° shown in fig. 17 is posture m, in response to a click operation on the shooting entry 17011 of the rotation shooting in the preview interface 1701 shown in fig. 17, the mobile phone may continue to display the normal shooting interface on the external screen 202, such as the preview interface 1702 shown in fig. 17. In the preview interface 1702, the shooting entry of the rotation shooting is hidden, indicating that the rotation shooting has been turned off and the normal shooting has been entered.
Taking as an example that the device posture of the mobile phone corresponding to α=180° shown in fig. 18 is posture n, in response to a click operation on the shooting entry 18011 of the rotation shooting in the preview interface 1801 shown in fig. 18 displayed on the external screen 202, the mobile phone may switch to displaying the normal shooting interface on the folding screen 201, such as displaying the preview interface 1802 shown in fig. 18 on the folding screen 201. In the preview interface 1802, the shooting entry of the rotation shooting is hidden, indicating that the rotation shooting has been turned off and the normal shooting has been entered. In addition, while the preview interface 1802 is displayed, the external screen 202 is off. Note that in fig. 18 the mobile phone is turned over in the drawing for convenience of viewing the preview interface 1802; in practice, the external screen 202 of the mobile phone in fig. 18 is biased toward the user with α=180°, so that a self-timer viewfinder image can continue to be presented in the preview interface 1802.
Further, before switching to displaying the normal shooting interface using the folding screen 201, the mobile phone may also display a switching action on the external screen 202 to indicate that the user interface is to be switched from the external screen 202 to the folding screen 201.
Function six
After entering the rotation shooting, in response to an operation of exiting the camera application from the foreground, the cellular phone may exit the camera application from the foreground, i.e., run the camera application in the background. Meanwhile, on the mobile phone having the folding screen 201, the experience of using the external screen 202 alone is also good. For example, viewing, editing, sharing, and the like of the shooting result can be fully realized on the external screen 202. Based on this characteristic, in response to the operation of exiting the camera application from the foreground, the handset keeps displaying the user interface, such as the desktop or a gallery interface, on the external screen 202 regardless of the current device posture.
Illustratively, the handset is in the device posture shown in fig. 19, which corresponds to the folding screen 201, i.e., the user interface would normally be displayed on the folding screen 201. The operation of exiting the camera application from the foreground may be a slide-up operation from the bottom of the preview interface 1901 displayed on the external screen 202 shown in fig. 19 (as indicated by the arrow). However, in response to the slide-up operation from the bottom of the preview interface 1901, the handset still displays a user interface on the external screen 202, such as the desktop 1902 shown in fig. 19.
After the camera application exits the foreground, if the device posture is updated as α changes, it indicates that the user wants to change the device posture to use the handset; subsequently, the handset can switch the screen that displays the user interface between the folding screen 201 and the external screen 202 based on the device posture decision, instead of always keeping the user interface on the external screen 202. Illustratively, after the device posture is updated, the handset may clear the recorded identifier a. Without the identifier a, the handset can then switch the screen that displays the user interface between the folding screen 201 and the external screen 202 based on the device posture decision.
Further, when the camera application is not in the foreground, the handset may close the rotation shooting after the device posture is updated. Subsequently, in response to an operation of bringing the camera application from the background to the foreground, the mobile phone displays the normal shooting interface instead of continuing the rotation shooting.
Of course, after exiting the camera application running in the foreground, if the handset cleans up the background camera application process, indicating that the rotation shooting is finished, the handset may switch the display user interface between the folding screen 201 and the external screen 202 based on the device gesture decision instead of always maintaining the display user interface on the external screen 202.
It should be noted that the above response to the operation of exiting the camera application from the foreground takes into account the good experience of using the external screen 202 alone. In practice, for the operation of exiting the camera application from the foreground, the mobile phone may also respond in the same way as for the operation of exiting the rotation shooting. That is, in response to an operation of exiting the camera application from the foreground, the cellular phone can display the normal shooting interface using the folding screen 201 or the external screen 202 based on the α decision, without keeping the normal shooting interface displayed on the external screen 202. The embodiment of the present application is not particularly limited thereto.
An embodiment of the first aspect will be further described with reference to fig. 20. Taking the example of angles 3 and 5 being 87 ° and angles 4 and 6 being 93 °, as shown in fig. 20, in a specific implementation, the rotation shooting of the mobile phone of form one (i.e. the mobile phone with folding screen 201) includes:
S2001, entering rotation shooting.
In the case where the camera application is running in the foreground, in response to a selection operation on the shooting entry of the rotation shooting, the mobile phone enters the rotation shooting; for details, refer to fig. 6B, fig. 7 and their related descriptions, which are not repeated here.
S2002, inquiring whether only the external screen 202 is on, if yes, executing S2005, and if not, executing S2003.
Illustratively, the handset may determine whether only the external screen 202 is lit by querying the status of each display.
If only the external screen 202 is on, the preview interface of the rotation shooting can be displayed on the external screen 202 without switching, and the other screens are already off. Subsequently, the mobile phone may execute S2005 described below. Otherwise, if it is not the case that only the external screen 202 is on, the preview interface of the rotation shooting needs to be displayed on the external screen 202 by switching, and the other screens need to be controlled to turn off. Subsequently, the mobile phone may execute S2003 and S2004 described below to switch to displaying on the external screen 202.
S2003, displaying a switching prompt.
For example, the handset may prompt "switch to external screen 202 display, enter rotation shooting".
S2004, after the prompt duration reaches a preset duration, controlling only the external screen 202 to be on and displaying the preview interface of the rotation shooting on the external screen 202.
That is, the mobile phone switches to the external screen 202 to display a preview interface of the rotation shooting.
In practice, other events may trigger the mobile phone to switch to the external screen 202 for display, besides the prompt duration reaching the preset duration. For example, the other event may be an event in which the user performs a confirmation operation on the switching prompt. Alternatively, the mobile phone may switch to the external screen 202 without displaying the switch prompt. The embodiment of the present application is not particularly limited thereto.
S2005, recording the identifier a.
That is, the handset sets the camera function to the rotation shooting, so that the handset subsequently performs the relevant responses of the rotation shooting based on the identifier.
Note that S2005 may be executed after the other screens are off, whether it is determined in S2002 that only the external screen 202 is on, or the display is switched to the external screen 202 through S2003 and S2004.
S2006, when the device posture changes, updating the device posture without switching screens.
That is, in the case where the identifier a is recorded, the mobile phone only updates the device posture and does not switch the display screen that displays the user interface based on the device posture. The device posture is updated so that, after the rotation shooting is exited or the camera application is moved to the background, the mobile phone can decide the display screen for the user interface based on the device posture updated in real time. Reference is specifically made to figs. 17-19 and their associated descriptions, which are not repeated here. And the screen is not switched in order to ensure that the mobile phone always uses the external screen 202 for display during the rotation shooting.
Note that S2006 is a continuous operation after entering the rotation shooting, and the device posture is updated upon receiving an event that the device posture is changed, and is not limited to the execution timing in fig. 20.
For the parts of S2002-S2006 not described in detail above, reference may be made specifically to the descriptions related to the first aspect and the second aspect, and the details are not repeated here.
Similar to ordinary photographing, rotational photographing also provides a plurality of photographing modes, such as photographing mode, video mode, panoramic mode, and the like. In the panoramic mode, the mobile phone needs to complete more processing, such as the processing of switching the direction indicated by the guiding arrow, calculating the displacement of the guiding arrow, synthesizing the panoramic photo and the like. Therefore, after entering the rotation photographing, the cellular phone may perform S2007 described below to distinguish the panorama mode from other modes.
S2007, detecting whether the panoramic mode is adopted.
After the rotation shooting is entered, the mobile phone can switch to a corresponding shooting mode in response to a user operation of switching the shooting mode. That is, after entering the rotation shooting, the shooting mode may change. The mobile phone can record indication information of the current shooting mode and thereby detect whether the current shooting mode is the panoramic mode.
If the panoramic mode is adopted, the mobile phone performs the processing of S2008-S2017 below; if the panoramic mode is not adopted, the mobile phone performs the processing of S2018-S2024 below.
S2008, detecting whether the preview stage is performed.
In panoramic mode, the processing done by the phone is different during the preview phase and the shooting phase. For example, the mobile phone may complete S2009-S2015 described below during the preview phase, and the mobile phone may complete S2016-S2017 described below during the shooting phase.
S2009, acquiring alpha.
It will be appreciated that, as the user moves the handset, the relative positions of the first screen 2011 and the second screen 2012 change, and α changes accordingly. Therefore, the mobile phone can continuously execute S2009 in the preview stage to obtain the real-time α. Subsequently, the handset can dynamically open or close the rear mirror image based on α, and switch the direction indicated by the guide arrow 12021.
When the mobile phone switches to the panoramic mode, if α < 87°, the direction indicated by the displayed guide arrow 12021 is from right to left; if α > 93°, the direction indicated by the displayed guide arrow 12021 is from left to right. And when entering the preview stage, if α < 87°, the mobile phone closes the rear mirror image; if α > 93°, the mobile phone opens the rear mirror image.
S2010, detecting whether α changes from α < 87° to α > 93°.
If α changes from α < 87° to α > 93°, the handset may perform S2011 and S2012 described below.
If α does not change from α < 87° to α > 93°, the handset may perform S2013 described below.
S2011, opening a rear mirror image.
In this way, the handset can turn on the rear mirror when the view range of the rear camera 204 is biased towards itself. Reference should be made specifically to fig. 10, 11A and their related descriptions, and their descriptions are omitted here.
S2012, adjusting the direction indicated by the guide arrow to be from left to right.
In this way, the mobile phone can make the direction indicated by the guide arrow left to right when the rear camera 204 starts shooting from a position biased toward the user. Reference is made specifically to fig. 15B, fig. 16A and their associated descriptions, and no further description is given here.
S2013, detecting whether α changes from α > 93° to α < 87°.
If α changes from α > 93° to α < 87°, the handset may perform S2014 and S2015 described below.
If α does not change from α > 93° to α < 87°, the handset neither switches the state of the rear mirror image nor adjusts the direction indicated by the guide arrow.
S2014, closing the rear mirror image.
In this way, the handset can turn off the rear mirror when the range of view of the rear camera 204 is biased toward the scene in front of the eye. Reference should be made specifically to fig. 10, 11A and their related descriptions, and their descriptions are omitted here.
S2015, adjusting the direction indicated by the guide arrow to be from right to left.
In this way, the handset can make the direction indicated by the guide arrow right to left when the rear camera 204 starts shooting from a position biased toward the scene in front of the eyes. Reference is made specifically to fig. 15B, fig. 16A and their associated descriptions, and no further description is given here.
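Putting S2009-S2015 together, the preview-stage handling in the panoramic mode can be sketched as follows; the thresholds follow the 87°/93° example, and the state representation and function name are assumptions for the sketch, not the handset's actual implementation.

```python
# Illustrative consolidation of S2009-S2015: per-frame preview-stage handling in
# panoramic mode. Thresholds follow the 87/93 degree example; names are assumptions.

ANGLE_LOW, ANGLE_HIGH = 87.0, 93.0

def preview_step(alpha, state):
    """state: {"side": "low"|"high", "mirror_on": bool, "arrow": str}.

    "side" records whether alpha last lay below ANGLE_LOW or above ANGLE_HIGH;
    while alpha stays inside the following regions, nothing changes.
    """
    if alpha > ANGLE_HIGH and state["side"] == "low":      # S2010: changed from alpha<87 to alpha>93
        state.update(side="high",
                     mirror_on=True,                        # S2011: open rear mirror image
                     arrow="left_to_right")                 # S2012: arrow left to right
    elif alpha < ANGLE_LOW and state["side"] == "high":    # S2013: changed from alpha>93 to alpha<87
        state.update(side="low",
                     mirror_on=False,                       # S2014: close rear mirror image
                     arrow="right_to_left")                 # S2015: arrow right to left
    return state

# Example: entering the preview with alpha < 87 degrees, then unfolding past 93 degrees.
s = {"side": "low", "mirror_on": False, "arrow": "right_to_left"}
for a in (86, 90, 93, 94):
    s = preview_step(a, s)
print(s["mirror_on"], s["arrow"])   # True left_to_right
```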
S2016, the position of the guide arrow is adjusted based on the movement of the user.
For example, the position of the guide arrow may move left and right and also float up and down.
It should be noted that, if the user is not satisfied with the direction indicated by the guide arrow after the automatic switching, the direction indicated by the guide arrow may be further adjusted manually, and in particular, reference may be made to fig. 14 and the description of the related context thereof for adjusting the direction indicated by the guide arrow by using the direction switching control, which will not be repeated herein.
S2017, shooting to obtain a panoramic photo.
For the above S2016-S2017, refer to the relevant description of the mobile phone starting panoramic shooting in response to the shooting operation, and will not be repeated here.
S2018, detecting whether the preview stage is performed.
In other modes such as a photographing mode and a video recording mode than the panoramic mode, the processing performed by the mobile phone is different between the preview stage and the photographing stage. For example, the handset may complete S2019-S2023 described below during the preview phase, and may complete S2024 described below during the shooting phase.
S2019, acquiring alpha.
S2020, detecting whether α changes from α < 87° to α > 93°.
S2021, opening the rear mirror image.
S2022, detecting whether α changes from α > 93° to α < 87°.
S2023, closing the rear mirror image.
For the above S2019-S2023, reference may be specifically made to the descriptions of the corresponding steps in the foregoing S2010-S2015, which are not repeated herein.
S2024, shooting.
For S2024, refer to the foregoing description about taking a picture of a corresponding view range or starting recording by the mobile phone in response to the shooting operation, and will not be repeated here.
The execution sequence of S2007-S2024 is not limited to that in fig. 20. For example, the mobile phone may also adjust the direction indicated by the guide arrow first and then switch the state of the rear mirror image, for example, executing S2012 before S2011, or executing S2015 before S2014. For another example, the handset may perform S2013 first and, if no change from α > 93° to α < 87° is detected, then perform S2010; or it may perform S2022 first and, if no change from α > 93° to α < 87° is detected, then perform S2020. For still another example, the handset may first detect whether it is in the preview stage. In the preview stage, the handset opens or closes the rear mirror image based on α and then further detects whether it is in the panoramic mode: if so, it further adjusts the direction indicated by the guide arrow based on α; if not, it does not need to adjust the direction indicated by the guide arrow based on α. If it is not in the preview stage, the handset further detects whether it is in the panoramic mode: if so, it executes S2016-S2017; if not, it executes S2024.
In addition, not all of S2007-S2024 are mandatory. For example, if the mobile phone does not adjust the direction indicated by the guide arrow, S2012 and S2015 may be omitted; if the mobile phone does not switch the rear mirror image, S2011, S2014, S2021 and S2023 may be omitted.
Further, for modes other than the panoramic mode, such as the photographing mode and the video recording mode, if the mobile phone does not switch the rear mirror image, S2018-S2023 described above may be omitted.
Further, for the panoramic mode, if the handset neither adjusts the direction indicated by the guide arrow nor switches the rear mirror, S2008-S2015 described above may be omitted.
After entering the rotation shooting, the user may exit the rotation shooting or switch the camera application to the background either before starting the shooting or after ending the shooting. Accordingly, the mobile phone may respond through the following step S2025 and the following steps.
Note that in fig. 20, S2025 and subsequent steps are located after photographing is ended, as in S2017 and S2024. In practice, S2025 and subsequent steps may be triggered to be performed at any time after entering the rotation shooting.
S2025, detecting whether an operation to exit the rotation shooting is received.
If an operation to exit the rotation shooting is received, it is indicated that the user does not continue to use the rotation shooting, in which case the mobile phone may execute S2029 described below to mark exit from the rotation shooting.
If the operation of exiting the rotation shooting is not received, the cellular phone may execute S2026 described below.
S2026, detecting whether an operation of exiting the foreground from the camera application is received.
If an operation to exit the camera application from the foreground is received, it indicates that the user wants to keep the rotation shooting in the background, in which case the handset can execute S2027 described below.
If no operation of exiting the camera application from the foreground is received, the rotation shooting continues to be used normally in the foreground, in which case the handset may repeatedly perform S2006-S2026.
Note that the order of execution of S2025 to S2026 is not limited to fig. 20. For example, the mobile phone may first perform S2026, and then further perform S2025 if it is detected that an operation of exiting the camera application from the foreground is not received.
S2027, the user interface continues to be displayed on the external screen 202.
S2028, detecting whether the posture of the equipment changes.
If the device posture is changed, it indicates that the user wants to change the device posture to use the cellular phone, in which case the cellular phone may perform S2029 described below to mark exit from the rotation shooting.
If the device posture does not change, it indicates that the user wants to keep the current device posture, in which case the handset may continue to display the user interface on the external screen 202, i.e., repeatedly execute S2027.
S2029, clearing the identifier a.
After clearing the identifier a, the cellular phone no longer performs the relevant responses of the rotation shooting, for example, S2006-S2028.
S2030, querying whether the device posture is consistent with the external screen 202 being lit.
If the device gesture coincides with the external screen 202 being lit, indicating that the user interface needs to continue to be displayed on the external screen 202, the handset may execute S2031.
If the device gesture is inconsistent with the external screen 202 being lit, it indicates that a switch to the folding screen 201 to display the user interface is required, and the handset may execute S2032.
S2031, a user interface is displayed on the external screen 202.
S2032, switching to the folding screen 201 to display the user interface.
For the above S2025-S2032, refer to the descriptions of fig. 17-19 and their relevant contexts, and are not repeated here.
In addition, the mobile phone may execute S2029 to S2032 after receiving the operation of exiting the rotation shooting or the operation of exiting the camera application from the foreground, instead of distinguishing between the operation of exiting the rotation shooting and the operation of exiting the camera application from the foreground. The embodiment of the present application is not particularly limited thereto.
Form three
In the mobile phone with the folding screen 401, the user can change the viewing range of the rear camera 403 by unfolding or folding the folding screen 401 so that the gamma between the first screen 4011 and the second screen 4012 and/or the delta between the second screen 4012 and the third screen 4013 can be changed between 0 degrees and 180 degrees.
As γ gradually increases from 0° to 180°, the view range of the rear camera 403 may change by 180° clockwise. Taking as an example the case where δ is kept at 180° and the third screen 4013 faces the user, starting from the state of γ=0° shown in fig. 21A, the rear camera 403 faces the user. When changing from γ=0° to γ=90° shown in fig. 21A, the rear camera 403 turns clockwise to face the scene on the left side. When changing from γ=90° to γ=180° shown in fig. 21A, the rear camera 403 turns clockwise to face the scene in front of the eyes.
Similarly, as δ gradually decreases from 180° to 0°, the view range of the rear camera 403 may also change by 180° clockwise. Taking as an example the case where γ is kept at 180° and the third screen 4013 faces the user, starting from the state of δ=180° shown in fig. 21B, the rear camera 403 faces the scene in front of the eyes. When changing from δ=180° to δ=90° shown in fig. 21B, the rear camera 403 turns clockwise to face a scene biased to the right. When changing from δ=90° to δ=0° shown in fig. 21B, the rear camera 403 turns clockwise to be biased toward the user.
Then, if γ gradually increases from 0 ° to 180 °, and δ gradually decreases from 180 ° to 0 °, a clockwise 360 ° photographing can be achieved.
It should be noted here that, in the above examples of fig. 21A and 21B, γ is gradually increased from 0 ° to 180 ° and then δ is gradually decreased from 180 ° to 0 °. In practice, δ may be gradually reduced from 180 ° to 0 ° and γ may be gradually increased from 0 ° to 180 °, so that 360 ° clockwise photographing may be realized.
For example, the mobile phone may be in the state of γ=0° shown in fig. 21C, and the user may fold the first screen 4011 and the second screen 4012 as a whole toward the third screen 4013 so that δ gradually decreases from 180° to 0°; in this process, shooting over the clockwise range [0°, 180°] can be achieved. After δ decreases to 0°, the mobile phone may be in the state of δ=0° shown in fig. 21C, and the user may unfold the first screen 4011 and the second screen 4012 so that γ gradually increases from 0° to 180°; in this process, shooting over the clockwise range [180°, 360°] can continue to be achieved. In this way, 360° clockwise photographing is realized.
In addition, in the above examples of figs. 21A and 21B, δ is not adjusted until γ has increased to 180°; in practice, however, this order is not limiting. Illustratively, the user may also alternate between adjusting γ and δ, for example decreasing δ from 180° to 120° after γ increases to 90°, then increasing γ from 90° to 120°, and then decreasing δ from 120° to 80°; the viewing range of the rear camera 403 changes accordingly, and 360° clockwise photographing can likewise be achieved.
Of course, it is understood that if γ is gradually reduced from 180 ° to 0 °, and δ is gradually increased from 0 ° to 180 °, photographing of counterclockwise 360 ° can be achieved. And will not be described in detail herein.
In addition, in practice, γ and δ may not always increase or decrease. Wherein, as the user expands the first screen 4011 and the second screen 4012, γ increases, and as the user closes the first screen 4011 and the second screen 4012, γ decreases. Delta decreases as the user closes the second screen 4012 and the third screen 4013 and increases as the user expands the second screen 4012 and the third screen 4013.
For a mobile phone with the folding screen 401, the user usually shoots in the portrait state, and after entering the rotation shooting, the user faces the third screen 4013 toward himself or herself to preview the image. For convenience of description, with the third screen 4013 facing the user, the orientation of the rear camera 403 (denoted as θ) is defined as 0° when γ=0° and δ=180°, and as 360° when γ=180° and δ=0°. That is, the clockwise direction is the positive direction.
On this basis, rotation photographing of 0 ° to 360 ° can be achieved as θ varies in the clockwise direction, and rotation photographing of 360 ° to 0 ° can be achieved as θ varies in the counterclockwise direction.
Illustratively, θ is shown in FIG. 22. Where θ=0° (360 °) (when the rear camera 403 is facing itself), θ=90° when the rear camera 403 is facing the scene on the left side of the mobile phone, θ=180° when the rear camera 403 is facing the scene in front of the eyes, and θ=270° when the rear camera 403 is facing the scene on the right side of the mobile phone.
Since an increase of γ from 0° to 180° changes the orientation of the rear camera 403 in the clockwise direction (i.e., the positive direction), while an increase of δ from 0° to 180° changes the orientation of the rear camera 403 in the counterclockwise direction (i.e., the negative direction), the orientation of the rear camera 403 at any moment may be expressed by the orientation formula θ = γ + (180° - δ).
The effect of using the orientation formula is described below in several specific examples:
Illustratively, substituting the state of γ=0° shown in fig. 21A into the orientation formula gives θ = 0° + (180° - 180°) = 0°; substituting the state of γ=90° shown in fig. 21A gives θ = 90° + (180° - 180°) = 90°; and substituting the state of γ=180° shown in fig. 21A gives θ = 180° + (180° - 180°) = 180°. It can be seen that, for the case of δ=180°, the orientation calculated using the orientation formula matches the orientation shown in fig. 22 above.
Also illustratively, substituting the state of δ=180° shown in fig. 21B into the orientation formula gives θ = 180° + (180° - 180°) = 180°; substituting the state of δ=90° shown in fig. 21B gives θ = 180° + (180° - 90°) = 270°; and substituting the state of δ=0° shown in fig. 21B gives θ = 180° + (180° - 0°) = 360°. It can be seen that, for the case of γ=180°, the orientation calculated using the orientation formula matches the orientation shown in fig. 22 above.
Further illustratively, substituting the state of δ=135°, γ=45° shown in fig. 23 into the orientation formula gives θ = 45° + (180° - 135°) = 90°; substituting the state of δ=90°, γ=90° shown in fig. 23 gives θ = 90° + (180° - 90°) = 180°; and substituting the state of δ=0°, γ=90° shown in fig. 23 gives θ = 90° + (180° - 0°) = 270°. It can be seen that, for any combination of δ and γ, the orientation calculated using the orientation formula matches the orientation shown in fig. 22 above.
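For illustration, the orientation formula and the worked examples above can be checked with a short script; the function name is an assumption, and the asserted values reproduce the calculations in the text.

```python
# Orientation formula for form three: theta = gamma + (180 - delta), clockwise positive.
def orientation(gamma: float, delta: float) -> float:
    """Return the rear camera orientation theta in degrees for hinge angles gamma and delta."""
    return gamma + (180.0 - delta)

# Worked examples from the text:
assert orientation(0, 180) == 0       # fig. 21A: gamma=0,   delta=180 -> theta=0
assert orientation(90, 180) == 90     # fig. 21A: gamma=90,  delta=180 -> theta=90
assert orientation(180, 90) == 270    # fig. 21B: gamma=180, delta=90  -> theta=270
assert orientation(45, 135) == 90     # fig. 23:  gamma=45,  delta=135 -> theta=90
assert orientation(90, 0) == 270      # fig. 23:  gamma=90,  delta=0   -> theta=270
```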
Function one
In form three, the front screen is the third screen 4013. In order to always display the preview interface on the third screen 4013 regardless of the orientation of the rear camera 403, the mobile phone needs to ensure that the third screen 4013 is used for display after receiving the selection operation on the shooting entry, and needs to ensure that the third screen 4013 continues to be used for display until the mobile phone exits the rotation shooting. The implementation of these two aspects is described below.
First aspect:
The mobile phone with the folding screen 401 mainly has three use modes. In mode 1, the folding screen 401 is used as a large screen to display the user interface, and the first screen 4011, the second screen 4012 and the third screen 4013 are all lit. In mode 2, with δ=0° and γ close to 180°, the user interface is displayed on the screen formed by the first screen 4011 and the second screen 4012, and both the first screen 4011 and the second screen 4012 are lit. In mode 3, in the folded state, the user interface is displayed on the third screen 4013, and only the third screen 4013 is lit.
Thus, in some embodiments, in response to a selection operation on the shooting entry, the mobile phone may query the states of the first screen 4011, the second screen 4012 and the third screen 4013. If it is not the case that only the third screen 4013 is on, as in modes 1 and 2 above, the mobile phone can control the third screen 4013 to be on and display the preview interface while controlling the other screens to turn off; if only the third screen 4013 is on, as in mode 3 above, the mobile phone can display the user interface on the third screen 4013 without switching screens.
In the case where the user interface is displayed in the folding screen 401 shown in fig. 24 (before switching), as in the case of mode 1, the mobile phone can inquire that the first screen 4011, the second screen 4012, and the third screen 4013 are all lit. In this case, the mobile phone can switch to a state in which the third screen 4013 is on, the first screen 4011 and the second screen 4012 are off, and the preview interface 3301 shown in fig. 24 (after switching) is displayed in the third screen 4013.
Of course, as usage scenarios of the mobile phone continue to evolve, cases may later arise in which the user interface is displayed only on the first screen 4011 or the second screen 4012, or on the first screen 4011 and the second screen 4012, or on the first screen 4011 and the third screen 4013. Accordingly, in response to a selection operation on the shooting entry, the mobile phone may query whether only the third screen 4013 is on; if only the third screen 4013 is on, no screen switching is required, and if not, the mobile phone may switch to a state in which only the third screen 4013 is on and both the first screen 4011 and the second screen 4012 are off.
In this way, the handset can ensure that only the third screen 4013 is on the bright screen when entering the rotation shooting.
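As an illustration of the screen selection on entering the rotation shooting, a minimal sketch follows; is_screen_on and set_screen are hypothetical placeholders for the platform's display control interfaces, not actual APIs.

```python
# Illustrative sketch of selecting the display on entering rotation shooting in
# form three. is_screen_on / set_screen are hypothetical placeholders for the
# platform display APIs; "first"/"second"/"third" name the three sub-screens.

def enter_rotation_shooting(is_screen_on, set_screen, show_preview_on_third):
    lit = {s for s in ("first", "second", "third") if is_screen_on(s)}
    if lit == {"third"}:
        # Mode 3: only the third screen is lit, no switching needed.
        show_preview_on_third()
        return
    # Modes 1 and 2 (or other combinations): switch to the third screen only.
    set_screen("first", on=False)
    set_screen("second", on=False)
    set_screen("third", on=True)
    show_preview_on_third()

# Example: starting from mode 1 (all three screens lit).
lit_screens = {"first", "second", "third"}
enter_rotation_shooting(
    is_screen_on=lambda s: s in lit_screens,
    set_screen=lambda s, on: (lit_screens.add(s) if on else lit_screens.discard(s)),
    show_preview_on_third=lambda: print("preview displayed on the third screen"),
)
print(lit_screens)   # {'third'}
```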
Further, the mobile phone may display a switching prompt on the display screen of the bright screen, so as to prompt switching to the third screen 4013 for display, and start to rotate to shoot. Subsequently, after the display duration of the switching prompt reaches a preset duration (for example, 3 seconds), or after receiving the confirmation operation of the user, the mobile phone may switch to a state in which the third screen 4013 is on and the first screen 4011 and the second screen 4012 are off.
Second aspect:
Similarly to form one, in form three the mobile phone can also indicate that the rotation shooting is being used by recording the identifier a of the rotation shooting. In the case where the identifier a is recorded, the mobile phone does not switch screens even if the condition for switching the screen that displays the user interface is satisfied. In this way, the mobile phone can ensure that the screen is not switched during the rotation shooting and that the preview interface is always displayed on the third screen 4013.
Function two
During the preview stage, the handset can dynamically open or close the rear mirror image based on γ and δ. Specifically, the handset may dynamically open or close the rear mirror image based on the orientation calculated from γ and δ, i.e., θ.
As is clear from the orientations shown in fig. 22, when θ is in the range [0°, 90°) or (270°, 360°], the rear camera 403 is biased toward the user; when θ is in the range (90°, 270°), the rear camera 403 is biased toward the scene in front of the eyes.
Based on this, referring to fig. 25, upon entering the preview stage, if θ < angle 3 or θ > angle 10 (angle 10 is typically close to 270° but slightly greater than 270°, e.g., 273°), the handset opens the rear mirror image. Subsequently, if θ changes from θ < angle 3 to θ > angle 4, or from θ > angle 10 to θ < angle 9 (angle 9 is typically close to 270° but slightly less than 270°, e.g., 267°), the handset closes the rear mirror image; if θ changes from angle 4 < θ < angle 9 to θ < angle 3 or θ > angle 10, the handset opens the rear mirror image, and so on. Upon entering the preview stage, if angle 4 < θ < angle 9, the mobile phone closes the rear mirror image; subsequently, if θ changes from angle 4 < θ < angle 9 to θ < angle 3 or θ > angle 10, the mobile phone opens the rear mirror image; if θ changes again from θ < angle 3 to θ > angle 4, or from θ > angle 10 to θ < angle 9, the mobile phone closes the rear mirror image, and so on.
In addition, when entering the preview stage, if the angle 3 is less than or equal to theta and less than or equal to 4 or the angle 9 is less than or equal to theta and less than or equal to 10, the mobile phone can default to close the rear mirror image, and after the angle 3 is less than or equal to theta and less than or equal to 4 is changed to theta < angle 3 or the angle 9 is less than or equal to theta and less than or equal to 10 is changed to theta > angle 10, the mobile phone can open the rear mirror image. Subsequently, if changing from θ < angle 3 to θ > angle 4 or from θ > angle 10 to θ < angle 9, the handset turns off the back mirror, if changing from angle 4< θ < angle 9 to θ < angle 3 or θ > angle 10, the handset turns on the back mirror, and so on.
The cases where θ < angle 3, θ > angle 10, or angle 4 < θ < angle 9 when entering the preview stage are mainly described herein.
Taking as an example that angle 3 is 87°, angle 4 is 93°, angle 9 is 267°, and angle 10 is 273°: when the mobile phone is in the state of θ = 180° shown in fig. 26, angle 4 < θ < angle 9 is satisfied, and the mobile phone can display the preview interface 3501 on the third screen 4013; the viewfinder picture in the preview interface 3501 is not mirrored left-right, that is, the rear mirror image is off. Subsequently, when the mobile phone changes from θ = 180° to the state of θ = 0° shown in fig. 26, θ < angle 3 is satisfied, and the mobile phone can display the preview interface 3502 on the third screen 4013. Unlike the preview interface 3501, the viewfinder picture of the preview interface 3502 is mirrored left-right, that is, the rear mirror image is on.
Thus, based on θ, in the preview stage the handset may turn on the rear mirror image when the rear camera 403 is biased toward the user, and turn off the rear mirror image when the rear camera 403 is biased toward the scene in front of the user's eyes.
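The rear-mirror behavior of function two amounts to a hysteresis on θ. A minimal sketch, assuming the example thresholds angle 3 = 87°, angle 4 = 93°, angle 9 = 267°, and angle 10 = 273°, and leaving the computation of θ from γ and δ to the orientation described with fig. 22:

```python
# Sketch of the rear-mirror hysteresis in the preview stage.
# theta is the orientation (0°..360°) derived from the fold angles γ and δ.

ANGLE_3, ANGLE_4, ANGLE_9, ANGLE_10 = 87.0, 93.0, 267.0, 273.0

class RearMirrorController:
    def __init__(self, theta):
        # On entering the preview stage: mirror on if the rear camera faces the user
        # (theta < angle 3 or theta > angle 10); off otherwise, including the bands
        # [angle 3, angle 4] and [angle 9, angle 10], which default to off.
        self.mirror_on = theta < ANGLE_3 or theta > ANGLE_10

    def update(self, theta):
        if theta < ANGLE_3 or theta > ANGLE_10:
            self.mirror_on = True        # rear camera biased toward the user
        elif ANGLE_4 < theta < ANGLE_9:
            self.mirror_on = False       # rear camera biased toward the front scene
        # within [angle 3, angle 4] or [angle 9, angle 10]: keep the previous state
        return self.mirror_on
```

For example, RearMirrorController(theta=180.0).update(0.0) returns True, matching the change from the preview interface 3501 to the preview interface 3502 in fig. 26.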
Function three
After entering the rotation shooting, the mobile phone may display the preview interface 3601 shown in fig. 27 on the third screen 4013, where the photographing mode is selected in the preview interface 3601, that is, the rotation shooting is currently in the photographing mode. In response to a selection operation of the mode option 36011 of the panoramic mode in the preview interface 3601, the mobile phone may display the preview interface 3602 shown in fig. 27 on the third screen 4013, where the panoramic mode is selected in the preview interface 3602, that is, the rotation shooting is currently in the panoramic mode.
With continued reference to the preview interface 3602 in fig. 27, in the panoramic mode of the rotation shooting, a guide arrow 36021 is included in the preview interface 3602. The guide arrow 36021 is used to indicate the moving direction of the panoramic shooting. By default, the guide arrow 36021 indicates from left to right.
In some embodiments, to facilitate the user taking panoramic photographs in any direction of movement, during the preview phase, the handset may provide a direction switch control 36022 in the preview interface 3602 shown in fig. 27. The direction switch control 36022 is used to trigger the handset to switch the direction indicated by the guide arrow 36021. For the specific implementation of this embodiment, reference may be made to the related description in the first aspect, which is not repeated here. The guiding arrow 36021 corresponds to the guiding arrow 12021 in the first aspect, and the direction switching control 36022 corresponds to the direction switching control 12022 in the first aspect.
In other embodiments, to facilitate the user taking panoramic pictures in any moving direction, the handset may also dynamically adjust the direction indicated by the guide arrow 36021 during the preview stage based on γ and δ. Specifically, the mobile phone can dynamically adjust the direction indicated by the guide arrow 36021 based on θ.
In a specific implementation, the handset may adjust the direction indicated by the guide arrow 36021 only in the vicinity of θ = 180°. Referring to fig. 28, when switching to the panoramic mode, if θ < angle 11 (angle 11 is typically close to 180° but slightly less than 180°, e.g., 177°), the direction indicated by the guide arrow 36021 is from left to right. Subsequently, if θ changes from θ < angle 11 to θ > angle 12 (angle 12 is typically close to 180° but slightly greater than 180°, e.g., 183°), the handset switches the direction indicated by the guide arrow 36021 to from right to left; if θ changes again from θ > angle 12 to θ < angle 11, the handset switches the direction indicated by the guide arrow 36021 to from left to right, and so on. When switching to the panoramic mode, if θ > angle 12, the direction indicated by the guide arrow 36021 is from right to left. Subsequently, if θ changes from θ > angle 12 to θ < angle 11, the handset switches the direction indicated by the guide arrow 36021 to from left to right; if θ changes again from θ < angle 11 to θ > angle 12, the handset switches the direction indicated by the guide arrow 36021 to from right to left, and so on.
In addition, when switching to the panoramic mode, if angle 11 ≤ θ ≤ angle 12, the guide arrow 36021 may indicate the default direction from left to right, and after θ changes from angle 11 ≤ θ ≤ angle 12 to θ > angle 12, the handset may switch the direction indicated by the guide arrow 36021 to from right to left. Subsequently, if θ changes from θ > angle 12 to θ < angle 11, the handset switches the direction indicated by the guide arrow 36021 to from left to right; if θ changes again from θ < angle 11 to θ > angle 12, the handset switches the direction indicated by the guide arrow 36021 to from right to left, and so on.
The cases of θ < angle 11 or θ > angle 12 when switching to the panoramic mode are mainly described herein.
In this way, when θ has not reached half of 360° (e.g., θ is less than angle 11), the cell phone may set the direction indicated by the guide arrow 36021 to from left to right, so that the shooting starts at less than 180° and proceeds while rotating clockwise to more than 180°; and when θ has reached half of 360° (e.g., θ is greater than angle 12), the cell phone may set the direction indicated by the guide arrow 36021 to from right to left, so that the shooting starts at more than 180° and proceeds while rotating counterclockwise to less than 180°.
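This behavior is again a hysteresis, now around θ = 180°. A minimal sketch, assuming the example thresholds angle 11 = 177° and angle 12 = 183°:

```python
# Sketch of the guide-arrow direction hysteresis around θ = 180°.
ANGLE_11, ANGLE_12 = 177.0, 183.0

class GuideArrowDirection:
    def __init__(self, theta):
        # On switching to panoramic mode: right-to-left only if θ already exceeds
        # angle 12; otherwise (including the band [angle 11, angle 12]) left-to-right.
        self.direction = "right_to_left" if theta > ANGLE_12 else "left_to_right"

    def update(self, theta):
        if theta < ANGLE_11:
            self.direction = "left_to_right"     # start below 180°, sweep clockwise
        elif theta > ANGLE_12:
            self.direction = "right_to_left"     # start above 180°, sweep counterclockwise
        # within [angle 11, angle 12]: keep the previous direction
        return self.direction
```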
In another specific implementation, on the one hand, the handset may adjust the direction indicated by the guide arrow 36021 in the vicinity of θ = 180° according to angle 11 and angle 12 shown in fig. 29. On the other hand, the handset may further adjust the direction indicated by the guide arrow 36021 within [0°, 180°] according to angle 5 and angle 6 shown in fig. 29, and within [180°, 360°] according to angle 13 and angle 14 shown in fig. 29.
Specifically, the cellular phone may determine the photographing intention of the user based on γ and δ. The photographing intention includes three kinds: photographing between [0°, 360°], photographing between [0°, 180°], and photographing between [180°, 360°]. After determining the photographing intention, the handset may adjust the direction indicated by the guide arrow 36021 with the angle thresholds matching the photographing intention.
If the photographing intention is to photograph between [0°, 360°], the angle thresholds are angle 11 and angle 12 in fig. 29. If the photographing intention is to photograph between [0°, 180°], the angle thresholds are angle 5 and angle 6 in fig. 29. If the photographing intention is to photograph between [180°, 360°], the angle thresholds are angle 13 and angle 14 in fig. 29.
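In other words, each shooting intention selects one threshold pair. As an illustrative mapping only, using the example threshold values given later in the text:

```python
# Hypothetical mapping from shooting intention to the matched angle thresholds.
INTENT_THRESHOLDS = {
    "full_0_360":   (177.0, 183.0),   # angle 11, angle 12
    "front_0_180":  (87.0, 93.0),     # angle 5,  angle 6
    "rear_180_360": (267.0, 273.0),   # angle 13, angle 14
}
```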
The embodiment of the present application exemplarily illustrates how the mobile phone determines the shooting intention of the user based on γ and δ:
Considering γ = 0°: as shown in the state of γ = 0° in fig. 21C, as δ varies between [0°, 180°], photographing within [0°, 180°] can be achieved. Therefore, the mobile phone can determine that the photographing intention is to photograph between [0°, 180°] when γ is close to 0° (e.g., γ < 5°) and δ reaches a relatively large value (e.g., δ > 10°).
Similarly, considering δ = 0°: as shown in the state of δ = 0° in fig. 21C, as γ varies between [0°, 180°], photographing within [180°, 360°] can be achieved. Therefore, the mobile phone can determine that the photographing intention is to photograph between [180°, 360°] when δ is close to 0° (e.g., δ < 5°) and γ reaches a relatively large value (e.g., γ > 10°).
In cases other than the above two, the mobile phone may determine that the photographing intention is to photograph between [0°, 360°].
Of course, the mobile phone may also determine the shooting intention of the user based only on whether γ and δ are close to 0°, which is not specifically limited in the embodiment of the present application. For example, if γ < 5°, the cell phone may determine that the photographing intention is to photograph between [0°, 180°]; if δ < 5°, the cell phone may determine that the photographing intention is to photograph between [180°, 360°]; otherwise, the cell phone may determine that the photographing intention is to photograph between [0°, 360°].
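A minimal sketch of this decision, assuming the example thresholds of 5° and 10° and the intention labels used in the mapping above:

```python
# Illustrative sketch of determining the shooting intention from the fold angles.
def detect_intent(gamma, delta):
    if gamma < 5.0 and delta > 10.0:
        return "front_0_180"      # γ nearly 0° and δ clearly opened: shoot within [0°, 180°]
    if delta < 5.0 and gamma > 10.0:
        return "rear_180_360"     # δ nearly 0° and γ clearly opened: shoot within [180°, 360°]
    return "full_0_360"           # otherwise: assume shooting within [0°, 360°]
```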
It is understood that the cellular phone may determine the photographing intention of the user based on γ and δ when switching to the panoramic mode. On this basis, the mobile phone can update the shooting intention of the user based on γ and δ after switching to the panoramic mode. That is, the photographing intention may also change after entering the panoramic mode.
Illustratively, upon switching to the panoramic mode, the handset determines that the photographing intention is to photograph between [0°, 360°]. After switching to the panoramic mode, once δ < 5° and γ > 10° are satisfied, the photographing intention may be updated to photographing between [180°, 360°]; once γ < 5° and δ > 10° are satisfied, the photographing intention may be updated to photographing between [0°, 180°].
Taking as an example that angle 5 is 87°, angle 6 is 93°, angle 13 is 267°, angle 14 is 273°, angle 11 is 177°, and angle 12 is 183°, the following describes the specific implementation in which the mobile phone adjusts the direction indicated by the guide arrow 36021 with the matched angle thresholds for the different shooting intentions:
The first shooting intention is to shoot between [0°, 180°].
When photographing between [0°, 180°], if θ < angle 5, it indicates that the user wants to shoot clockwise within [0°, 180°], and the direction indicated by the guide arrow 36021 may be from left to right. If θ > angle 6, it indicates that the user wants to shoot counterclockwise within [0°, 180°], and the direction indicated by the guide arrow 36021 may be from right to left. If θ changes from θ < angle 5 to θ > angle 6, indicating that the user wants to switch to shooting counterclockwise within [0°, 180°], the direction indicated by the guide arrow 36021 may be switched to from right to left; if θ changes from θ > angle 6 to θ < angle 5, indicating that the user wants to switch to shooting clockwise within [0°, 180°], the direction indicated by the guide arrow 36021 may be switched to from left to right.
Referring to fig. 30A, the mobile phone determines the first shooting intention at time t0, where θ = 0°, satisfying θ < angle 5, so the mobile phone may display the direction indicated by the guide arrow 36021 as from left to right (abbreviated as L-R in the figure). Subsequently, θ changes from 0° at time t0 to 95° at time t1, satisfying the change from θ < angle 5 to θ > angle 6, and the mobile phone may switch the direction indicated by the guide arrow 36021 to from right to left (abbreviated as R-L in the figure). Subsequently, θ changes from 95° at time t1 to 85° at time t2, satisfying the change from θ > angle 6 to θ < angle 5, and the mobile phone may switch the direction indicated by the guide arrow 36021 to from left to right, and so on.
Referring to fig. 30B, the mobile phone determines the first shooting intention at time t0, where θ = 180°, satisfying θ > angle 6, so the direction indicated by the guide arrow 36021 is from right to left. Subsequently, θ changes from 180° at time t0 to 85° at time t1, satisfying the change from θ > angle 6 to θ < angle 5, and the mobile phone may switch the direction indicated by the guide arrow 36021 to from left to right. Subsequently, θ changes from 85° at time t1 to 95° at time t2, satisfying the change from θ < angle 5 to θ > angle 6, and the mobile phone may switch the direction indicated by the guide arrow 36021 to from right to left, and so on.
Referring to fig. 30C, the mobile phone determines the first shooting intention at time t0, where θ = 90°, i.e., θ is neither smaller than angle 5 nor larger than angle 6 but satisfies angle 5 ≤ θ ≤ angle 6, so the guide arrow 36021 may indicate the default direction from left to right. After θ changes from 90° at time t0 to 95° at time t3, θ > angle 6 is satisfied, and the mobile phone may switch the direction indicated by the guide arrow 36021 to from right to left. Subsequently, the mobile phone may adjust the direction indicated by the guide arrow 36021 in the manner corresponding to time t1 and time t2 shown in fig. 30B. That is, if θ changes from θ > angle 6 to θ < angle 5, the mobile phone may switch the direction indicated by the guide arrow 36021 to from left to right; if θ changes from θ < angle 5 to θ > angle 6, the mobile phone may switch the direction indicated by the guide arrow 36021 to from right to left, and so on.
The second shooting intention is to shoot between [180°, 360°].
When photographing between [180°, 360°], if θ < angle 13, it indicates that the user wants to shoot clockwise within [180°, 360°], and the direction indicated by the guide arrow 36021 may be from left to right. If θ > angle 14, it indicates that the user wants to shoot counterclockwise within [180°, 360°], and the direction indicated by the guide arrow 36021 may be from right to left. If θ changes from θ < angle 13 to θ > angle 14, indicating that the user wants to switch to shooting counterclockwise within [180°, 360°], the direction indicated by the guide arrow 36021 may be switched to from right to left; if θ changes from θ > angle 14 to θ < angle 13, indicating that the user wants to switch to shooting clockwise within [180°, 360°], the direction indicated by the guide arrow 36021 may be switched to from left to right.
Referring to fig. 31A, the mobile phone determines the second shooting intention at time t4, where θ = 180°, satisfying θ < angle 13, so the mobile phone may display the direction indicated by the guide arrow 36021 as from left to right (abbreviated as L-R in the figure). Subsequently, θ changes from 180° at time t4 to 275° at time t5, satisfying the change from θ < angle 13 to θ > angle 14, and the mobile phone may switch the direction indicated by the guide arrow 36021 to from right to left (abbreviated as R-L in the figure). Subsequently, θ changes from 275° at time t5 to 265° at time t6, satisfying the change from θ > angle 14 to θ < angle 13, and the mobile phone may switch the direction indicated by the guide arrow 36021 to from left to right, and so on.
Referring to fig. 31B, the mobile phone determines the second shooting intention at time t4, where θ = 360°, satisfying θ > angle 14, so the direction indicated by the guide arrow 36021 is from right to left. Subsequently, θ changes from 360° at time t4 to 265° at time t5, satisfying the change from θ > angle 14 to θ < angle 13, and the mobile phone may switch the direction indicated by the guide arrow 36021 to from left to right. Subsequently, θ changes from 265° at time t5 to 275° at time t6, satisfying the change from θ < angle 13 to θ > angle 14, and the mobile phone may switch the direction indicated by the guide arrow 36021 to from right to left, and so on.
Referring to fig. 31C, the mobile phone determines the second shooting intention at time t4, where θ = 270°, i.e., θ is neither smaller than angle 13 nor larger than angle 14 but satisfies angle 13 ≤ θ ≤ angle 14, so the guide arrow 36021 may indicate the default direction from left to right. After θ changes from 270° at time t4 to 275° at time t7, θ > angle 14 is satisfied, and the mobile phone may switch the direction indicated by the guide arrow 36021 to from right to left. Subsequently, the mobile phone may adjust the direction indicated by the guide arrow 36021 in the manner corresponding to time t5 and time t6 shown in fig. 31B. That is, if θ changes from θ > angle 14 to θ < angle 13, the mobile phone may switch the direction indicated by the guide arrow 36021 to from left to right; if θ changes from θ < angle 13 to θ > angle 14, the mobile phone may switch the direction indicated by the guide arrow 36021 to from right to left, and so on.
The third shooting intention is to shoot between [0°, 360°].
When photographing between [0°, 360°], if θ < angle 11, it indicates that the user wants to shoot clockwise within [0°, 360°], and the direction indicated by the guide arrow 36021 may be from left to right. If θ > angle 12, it indicates that the user wants to shoot counterclockwise within [0°, 360°], and the direction indicated by the guide arrow 36021 may be from right to left. If θ changes from θ < angle 11 to θ > angle 12, indicating that the user wants to switch to shooting counterclockwise within [0°, 360°], the direction indicated by the guide arrow 36021 may be switched to from right to left; if θ changes from θ > angle 12 to θ < angle 11, indicating that the user wants to switch to shooting clockwise within [0°, 360°], the direction indicated by the guide arrow 36021 may be switched to from left to right.
Referring to fig. 32A, the mobile phone determines the third shooting intention at time t8, where θ = 0°, satisfying θ < angle 11, so the mobile phone may display the direction indicated by the guide arrow 36021 as from left to right (abbreviated as L-R in the figure). Subsequently, θ changes from 0° at time t8 to 185° at time t9, satisfying the change from θ < angle 11 to θ > angle 12, and the mobile phone may switch the direction indicated by the guide arrow 36021 to from right to left (abbreviated as R-L in the figure). Subsequently, θ changes from 185° at time t9 to 175° at time t10, satisfying the change from θ > angle 12 to θ < angle 11, and the mobile phone may switch the direction indicated by the guide arrow 36021 to from left to right, and so on.
Referring to fig. 32B, the mobile phone determines the third shooting intention at time t8, where θ = 360°, satisfying θ > angle 12, so the direction indicated by the guide arrow 36021 is from right to left. Subsequently, θ changes from 360° at time t8 to 175° at time t9, satisfying the change from θ > angle 12 to θ < angle 11, and the mobile phone may switch the direction indicated by the guide arrow 36021 to from left to right. Subsequently, θ changes from 175° at time t9 to 185° at time t10, satisfying the change from θ < angle 11 to θ > angle 12, and the mobile phone may switch the direction indicated by the guide arrow 36021 to from right to left, and so on.
Referring to fig. 32C, the mobile phone determines the third shooting intention at time t8, where θ = 180°, i.e., θ is neither smaller than angle 11 nor larger than angle 12 but satisfies angle 11 ≤ θ ≤ angle 12, so the guide arrow 36021 may indicate the default direction from left to right. After θ changes from 180° at time t8 to 185° at time t11, θ > angle 12 is satisfied, and the mobile phone may switch the direction indicated by the guide arrow 36021 to from right to left. Subsequently, the mobile phone may adjust the direction indicated by the guide arrow 36021 in the manner corresponding to time t9 and time t10 shown in fig. 32B. That is, if θ changes from θ > angle 12 to θ < angle 11, the mobile phone may switch the direction indicated by the guide arrow 36021 to from left to right; if θ changes from θ < angle 11 to θ > angle 12, the mobile phone may switch the direction indicated by the guide arrow 36021 to from right to left, and so on.
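Putting the three intentions together, the direction adjustment can be sketched as one controller that applies the matched threshold pair; the threshold values and names remain illustrative assumptions, not a device API:

```python
# Sketch: guide-arrow direction with intent-matched thresholds and hysteresis.
THRESHOLDS = {
    "front_0_180":  (87.0, 93.0),     # angle 5,  angle 6
    "rear_180_360": (267.0, 273.0),   # angle 13, angle 14
    "full_0_360":   (177.0, 183.0),   # angle 11, angle 12
}

class IntentAwareArrow:
    def __init__(self, intent, theta):
        self.low, self.high = THRESHOLDS[intent]
        # Default left-to-right inside [low, high]; right-to-left only above high.
        self.direction = "right_to_left" if theta > self.high else "left_to_right"

    def update(self, theta):
        if theta < self.low:
            self.direction = "left_to_right"
        elif theta > self.high:
            self.direction = "right_to_left"
        return self.direction

# Reproducing the timeline of fig. 30A for the first shooting intention:
arrow = IntentAwareArrow("front_0_180", theta=0.0)   # t0: θ = 0°  -> left_to_right
arrow.update(95.0)                                   # t1: θ = 95° -> right_to_left
arrow.update(85.0)                                   # t2: θ = 85° -> left_to_right
```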
In the shooting stage, the mobile phone does not switch the direction indicated by the guide arrow 36021; instead, the direction is locked to the direction indicated by the guide arrow 36021 in the viewfinder frame before the shooting was started. Subsequently, upon exiting the shooting stage back to the preview stage, the handset may continue to dynamically adjust the direction indicated by the guide arrow 36021 based on θ.
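A small sketch of this locking rule, with the same illustrative naming:

```python
# Sketch: the arrow direction is frozen while capturing and follows θ otherwise.
def arrow_direction(theta, low, high, previous, capturing):
    if capturing:
        return previous            # shooting stage: keep the direction used at the start
    if theta < low:
        return "left_to_right"
    if theta > high:
        return "right_to_left"
    return previous                # inside [low, high]: keep the previous direction
```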
Likewise, the above-described embodiment in which the user manually adjusts the direction indicated by the guide arrow 36021 and the embodiment in which the handset dynamically adjusts the direction indicated by the guide arrow 36021 based on θ may be combined. After the user manually adjusts the direction indicated by the guide arrow 36021, the mobile phone automatically adjusts the direction indicated by the guide arrow 36021 again only after the switching condition is satisfied. In addition, the above-described embodiment of turning the rear mirror image on or off and the embodiment of adjusting the direction indicated by the guide arrow 36021 may also be combined.
Function four
After entering the rotation photographing, the user can adjust the view range of the rear camera 403 by folding or unfolding the folding screen 401, and after adjusting to an appropriate view range, the user can perform a photographing operation, such as a clicking operation on a photographing button in the preview interface. In response to the photographing operation, the mobile phone may photograph an image. For the content of this part, reference may also be made to the description of the relevant part in form one, which is not repeated here.
Function five
After entering the rotation shooting, in response to an operation of exiting the rotation shooting, the cellular phone may exit the rotation shooting and enter the normal shooting. In response to the operation of exiting the rotation shooting, the cellular phone may display the normal shooting interface on at least one screen of the folding screen 401 determined based on the device posture, instead of always displaying the user interface on the third screen 4013. For the content of the operation of exiting the rotation shooting, reference may also be made to the description of the relevant part in the first aspect, which is not repeated here.
Function six
After entering the rotation shooting, for example after the shooting is completed, the cell phone may stop running the camera application in the foreground in response to an operation of exiting the camera application from the foreground. Similar to the external screen 202 in the first aspect, in the third aspect the third screen 4013 can be used alone to view, edit, and share the shooting result. Thus, in form three, in response to the operation of exiting the camera application from the foreground, the handset may continue to display a user interface, such as a desktop or a gallery interface, on the third screen 4013.
Illustratively, the handset is in the device posture shown in fig. 33, which corresponds to the folding screen 401, i.e., the user interface should be displayed on the folding screen 401. The operation of exiting the camera application from the foreground may be a slide-up operation from the bottom of the preview interface 4201 displayed on the third screen 4013 shown in fig. 33 (as indicated by the arrow). However, in response to the slide-up operation from the bottom of the preview interface 4201, the mobile phone still displays a user interface on the third screen 4013, such as the desktop 4202 shown in fig. 33.
For content that is not described in detail regarding the operation of exiting the rotation shooting or the operation of exiting the camera application from the foreground, reference may also be made to the description of the relevant parts in the first aspect, such as fig. 17, fig. 18 and the related description thereof, which is not repeated here.
An embodiment of the third aspect will be further described with reference to fig. 34. Taking as an example that angle 5 is 87°, angle 6 is 93°, angle 13 is 267°, angle 14 is 273°, angle 11 is 177°, and angle 12 is 183°, and that the phone adjusts the direction indicated by the guide arrow 36021 according to the shooting intention, as shown in fig. 34, in a specific implementation, the rotation shooting of the phone of form three (i.e., the phone with the folding screen 401) includes:
S4301, entering into rotation shooting.
S4302, inquiring whether only the third screen 4013 is on, if yes, executing S4305, and if not, executing S4303.
Wherein, only the third screen 4013 is on means that the third screen 4013 is on, and the first screen 4011 and the second screen 4012 are off.
S4303, displaying a switching prompt.
For example, the cell phone may prompt "switch to third screen 4013 display, enter rotation shooting".
S4304, after the prompt duration reaches the preset duration, controlling only the third screen 4013 to be on, and displaying a preview interface of the rotation shooting on the third screen 4013.
S4305, recording an identification a.
S4306, when the device posture is updated, updating the device posture without switching the screen.
S4307, detecting whether the panoramic mode is adopted, if so, executing S4308, and if not, executing S4318.
For the specific implementation of S4301-S4307, reference may be made to the descriptions of S2001-S2007 in sequence, and the description is omitted here.
S4308, detecting whether the preview stage is in progress. If yes, S4309 is executed, and if not, S4316 is executed.
In panoramic mode, the processing done by the phone is different during the preview phase and the shooting phase. For example, in the preview stage, the mobile phone can turn on or off the rear mirror image and adjust the direction indicated by the guide arrow, and in the shooting stage, the mobile phone can adjust the position of the guide arrow and calculate to obtain a panoramic photo.
S4309, acquiring gamma and delta, and calculating to obtain theta.
It will be appreciated that, as the user moves the handset, the relative positions of the first screen 4011 and the second screen 4012 will change and γ will change, and the relative positions of the second screen 4012 and the third screen 4013 will change and δ will change. Therefore, the mobile phone may continuously execute S4309 in the preview stage to obtain γ and δ in real time and calculate θ using the orientation formula. Subsequently, the handset may dynamically turn on or off the rear mirror image and switch the direction indicated by the guide arrow based on θ.
It should be noted that, when entering the preview stage, the handset turns on the rear mirror image if θ is in the range [0°, 87°) or (273°, 360°], and turns off the rear mirror image if θ is in the range (93°, 267°). Subsequently, the handset can dynamically turn on or off the rear mirror image based on θ.
S4310, if θ changes from 93° < θ < 267° to θ < 87° or θ > 273°, turning on the rear mirror image.
In this way, the handset can turn on the rear mirror image when the rear camera 403 is biased toward the user. Reference is specifically made to fig. 25, fig. 26 and the related descriptions thereof, and the details are not repeated here.
S4311, if θ changes from θ < 87° or θ > 273° to 93° < θ < 267°, turning off the rear mirror image.
In this way, the handset can turn off the rear mirror image when the rear camera 403 is biased toward the scene in front of the user's eyes. Reference is specifically made to fig. 25, fig. 26 and the related descriptions thereof, and the details are not repeated here.
S4312, detecting shooting intention.
S4313, the shooting intention is to shoot within [0°, 180°]: if θ changes from θ < 87° to θ > 93°, the direction indicated by the guide arrow is switched to from right to left; if θ changes from θ > 93° to θ < 87°, the direction indicated by the guide arrow is switched to from left to right.
Thus, the guide arrow may indicate left to right when the user wants to shoot clockwise within [0 °,180 ° ], and right to left when the user wants to shoot counterclockwise within [0 °,180 ° ]. Reference may be made specifically to the foregoing descriptions of fig. 30A-30C, and details are not repeated here.
S4314, the shooting intention is to shoot within [180°, 360°]: if θ changes from θ < 267° to θ > 273°, the direction indicated by the guide arrow is switched to from right to left; if θ changes from θ > 273° to θ < 267°, the direction indicated by the guide arrow is switched to from left to right.
Thus, the guide arrow may indicate left to right when the user wants to shoot clockwise within [180 °,360 ° ], and right to left when the user wants to shoot counterclockwise within [180 °,360 ° ]. Reference may be made specifically to the foregoing descriptions of fig. 31A-31C, and details are not repeated here.
S4315, the shooting intention is to shoot within [0°, 360°]: if θ changes from θ < 177° to θ > 183°, the direction indicated by the guide arrow is switched to from right to left; if θ changes from θ > 183° to θ < 177°, the direction indicated by the guide arrow is switched to from left to right.
Thus, the guide arrow may indicate left to right when the user wants to shoot clockwise within [0 °,360 ° ], and right to left when the user wants to shoot counterclockwise within [0 °,360 ° ]. Reference may be made specifically to the foregoing descriptions of fig. 32A-32C, and details are not repeated here.
S4316, adjusting the position of the guide arrow based on the movement of the user.
It should be noted that, if the user is not satisfied with the direction indicated by the guiding arrow after the automatic switching, the direction indicated by the guiding arrow may be further adjusted by a manual manner, which is not described herein.
S4317, shooting to obtain a panoramic photo.
For the above S4316-S4317, reference may be made to the descriptions of S2016-S2017, and the details are not repeated here.
S4318, detecting whether the preview stage is in progress. If yes, S4319 is executed, and if not, S4324 is executed.
In modes other than the panoramic mode, such as the photographing mode and the video recording mode, the processing performed by the mobile phone also differs between the preview stage and the shooting stage. For example, the handset may dynamically turn on or off the rear mirror image during the preview stage and may complete the shooting during the shooting stage.
S4319, acquiring gamma and delta, and calculating to obtain theta.
S4320, if θ changes from 93° < θ < 267° to θ < 87° or θ > 273°, turning on the rear mirror image.
S4321, if θ changes from θ < 87° or θ > 273° to 93° < θ < 267°, turning off the rear mirror image.
For the above S4319-S4323, reference may be made to the corresponding steps in S4309-S4312 above, and the details are not repeated here.
S4324, shooting.
It should be noted that the execution sequence of S4307 to S4324 is not limited to that shown in fig. 34. For example, the mobile phone may also adjust the direction indicated by the guide arrow first and then switch the state of the rear mirror image, i.e., execute S4312-S4315 first and then execute S4310-S4311. For another example, the mobile phone may first detect whether it is in the preview stage; in the preview stage, the mobile phone turns on or off the rear mirror image based on θ and then further detects whether the panoramic mode is in use, and if so, further adjusts the direction indicated by the guide arrow, while if not, the direction indicated by the guide arrow does not need to be adjusted; if it is not in the preview stage, the mobile phone further detects whether the panoramic mode is in use, executes S4316-S4317 if so, and executes S4324 if not.
In addition, S4307 to S4324 are not necessarily all required. For example, the mobile phone may omit S4312-S4314 if the direction indicated by the guide arrow does not need to be adjusted, or may omit S4310-S4311 and S4320-S4321 if the rear mirror image does not need to be switched.
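For reference only, the preview-stage steps S4309-S4315 can be combined into one per-frame sketch; get_fold_angles and compute_theta are hypothetical placeholders, and the orientation formula for θ (described with fig. 22) is not reproduced here:

```python
# Per-frame sketch of the preview-stage processing (S4309-S4315) in panoramic mode.
def preview_frame(state, get_fold_angles, compute_theta, panoramic):
    gamma, delta = get_fold_angles()        # S4309: read the two fold angles
    theta = compute_theta(gamma, delta)     # S4309: orientation derived from γ and δ

    # S4310 / S4311: rear mirror with hysteresis (example thresholds 87°/93°, 267°/273°)
    if theta < 87.0 or theta > 273.0:
        state["mirror_on"] = True
    elif 93.0 < theta < 267.0:
        state["mirror_on"] = False

    if panoramic:
        # S4312: update the shooting intention when γ and δ clearly indicate it
        if gamma < 5.0 and delta > 10.0:
            state["intent"] = "front_0_180"
        elif delta < 5.0 and gamma > 10.0:
            state["intent"] = "rear_180_360"
        state.setdefault("intent", "full_0_360")

        # S4313-S4315: adjust the guide arrow with the matched thresholds
        low, high = {"front_0_180": (87.0, 93.0),
                     "rear_180_360": (267.0, 273.0),
                     "full_0_360": (177.0, 183.0)}[state["intent"]]
        if theta < low:
            state["arrow"] = "left_to_right"
        elif theta > high:
            state["arrow"] = "right_to_left"
    return state
```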
After entering the rotation shooting, the user may exit the rotation shooting or switch the camera application to the background either before starting the shooting or after ending the shooting. Accordingly, the mobile phone may respond through the following S4325 and subsequent steps.
Note that, in fig. 34, S4325 and the subsequent steps are located after the shooting is completed, i.e., after S4317 and S4324. In practice, S4325 and the subsequent steps may be triggered at any time after entering the rotation shooting.
S4325, detecting whether an operation of exiting the rotation shooting is received. If yes, S4329 is executed, and if not, S4326 is executed.
S4326, detecting whether the operation of exiting the foreground of the camera application is received, if yes, executing S4327, and if not, repeatedly executing S4306-S4326.
It should be noted that the execution sequence of S4325-S4326 is not limited to FIG. 34. For example, the mobile phone may first perform S4326, and then further perform S4325 if it is detected that the operation of exiting the camera application from the foreground is not received.
S4327, the user interface continues to be displayed on the third screen 4013.
S4328, detecting whether the posture of the equipment changes. If yes, S4329 is executed, and if not, S4327 is repeatedly executed.
S4329, clearing the mark a.
After the identification a is cleared, the mobile phone no longer performs the responses related to the rotation shooting, for example, S4306 to S4328.
S4330, querying whether the device posture is consistent with the third screen 4013 being lit. If yes, S4331 is executed, and if not, S4332 is executed.
S4331, a user interface is displayed on the third screen 4013.
S4332, switching to the corresponding screen of the folding screen 401 to display a user interface.
For the above S4325-S4332, refer to fig. 34 and the related descriptions, and the details are not repeated here.
In addition, the mobile phone may execute S4329-S4332 after receiving either the operation of exiting the rotation shooting or the operation of exiting the camera application from the foreground, without distinguishing between the two operations. The embodiment of the present application is not particularly limited thereto.
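A compact sketch of this exit handling (S4329-S4332), with hypothetical helper names for the posture query and display control:

```python
# Sketch of the response to exiting rotation shooting or exiting the camera
# application from the foreground (S4329-S4332).
def handle_exit(clear_identification_a, posture_matches_third_screen,
                display_on_third_screen, display_on_matching_screen):
    clear_identification_a()                 # S4329: stop the rotation-shooting responses
    if posture_matches_third_screen():       # S4330: does the posture call for the third screen?
        display_on_third_screen()            # S4331
    else:
        display_on_matching_screen()         # S4332: switch to the screen matching the posture
```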
Embodiments of the present application also provide an electronic device that may include a display screen, a memory, and one or more processors (e.g., CPU, GPU, NPU, etc.). The display, the memory, and the processor are coupled. The memory is for storing computer program code, the computer program code comprising computer instructions. When the processor executes the computer instructions, the electronic device may perform the various functions or steps performed by the device in the method embodiments described above.
The embodiment of the application also provides a chip system which comprises at least one processor and at least one interface circuit. The processors and interface circuits may be interconnected by wires. For example, the interface circuit may be used to receive signals from other devices (e.g., a memory of an electronic apparatus). For another example, the interface circuit may be used to send signals to other devices (e.g., processors). The interface circuit may, for example, read instructions stored in the memory and send the instructions to the processor. The instructions, when executed by a processor, may cause an electronic device to perform the various steps of the embodiments described above. Of course, the system-on-chip may also include other discrete devices, which are not particularly limited in accordance with embodiments of the present application.
The present embodiment also provides a computer storage medium having computer instructions stored therein which, when run on an electronic device, cause the electronic device to perform the above-described related method steps to implement the photographing method in the above embodiments.
The present embodiment also provides a computer program product which, when run on a computer, causes the computer to perform the above-described related steps to implement the photographing method in the above embodiments.
In addition, the embodiment of the application also provides a device, which can be a chip, a component, or a module. The device can comprise a processor and a memory which are connected, wherein the memory is used for storing computer execution instructions, and when the device runs, the processor can execute the computer execution instructions stored in the memory so that the chip performs the photographing method in the above method embodiments.
The electronic device, the computer storage medium, the computer program product, or the chip provided in this embodiment are used to execute the corresponding methods provided above, so that the beneficial effects thereof can be referred to the beneficial effects in the corresponding methods provided above, and will not be described herein.
From the foregoing description of the embodiments, it will be apparent to those skilled in the art that, for convenience and brevity of description, only the above-described division of functional modules is illustrated, and in practical application, the above-described functional allocation may be implemented by different functional modules according to needs, i.e. the internal structure of the apparatus is divided into different functional modules to implement all or part of the functions described above.
In the several embodiments provided by the present application, it should be understood that the disclosed apparatus and method may be implemented in other manners. For example, the apparatus embodiments described above are merely illustrative, e.g., the division of the modules or units is merely a logical function division, and there may be additional divisions when actually implemented, e.g., multiple units or components may be combined or integrated into another apparatus, or some features may be omitted or not performed. Alternatively, the coupling or direct coupling or communication connection shown or discussed with each other may be an indirect coupling or communication connection via some interfaces, devices or units, which may be in electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and the parts displayed as units may be one physical unit or a plurality of physical units, may be located in one place, or may be distributed in a plurality of different places. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, each functional unit in the embodiments of the present application may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit. The integrated units may be implemented in hardware or in software functional units.
The integrated unit may be stored in a readable storage medium if implemented in the form of a software functional unit and sold or used as a stand-alone product. Based on such understanding, the technical solution of the embodiments of the present application may be essentially or a part contributing to the prior art or all or part of the technical solution may be embodied in the form of a software product stored in a storage medium, including several instructions for causing a device (may be a single-chip microcomputer, a chip or the like) or a processor (processor) to perform all or part of the steps of the methods of the embodiments of the present application. The storage medium includes a U disk, a removable hard disk, a Read-Only Memory (ROM), a random access Memory (Random Access Memory, RAM), a magnetic disk, an optical disk, or other various media capable of storing program codes.
Finally, it should be noted that the above-mentioned embodiments are merely for illustrating the technical solution of the present application and not for limiting the same, and although the present application has been described in detail with reference to the preferred embodiments, it should be understood by those skilled in the art that modifications and equivalents may be made to the technical solution of the present application without departing from the spirit and scope of the technical solution of the present application.