Image stabilization (IS) is a family of techniques that reduce blurring associated with the motion of a camera or other imaging device during exposure.
Generally, it compensates for pan and tilt (angular movement, equivalent to yaw and pitch) of the imaging device, though electronic image stabilization can also compensate for rotation about the optical axis (roll).[1] It is mainly used in high-end image-stabilized binoculars, still and video cameras, astronomical telescopes, and also smartphones. With still cameras, camera shake is a particular problem at slow shutter speeds or with long focal length lenses (telephoto or zoom). With video cameras, camera shake causes visible frame-to-frame jitter in the recorded video. In astronomy, the problem of lens shake is added to variation in the atmosphere, which changes the apparent positions of objects over time.
In photography, image stabilization can facilitate shutter speeds 2 to 5.5 stops slower (exposures 4 to roughly 45 times longer), and even slower effective speeds have been reported.
A rule of thumb to determine the slowest shutter speed possible for hand-holding without noticeable blur due to camera shake is to take the reciprocal of the 35 mm equivalent focal length of the lens, also known as the "1/mm rule".[a] For example, at a focal length of 125 mm on a 35 mm camera, vibration or camera shake could affect sharpness if the shutter speed is slower than 1⁄125 second. As a result of the 2-to-4.5-stop slower shutter speeds allowed by IS, an image taken at 1⁄125 second with an ordinary lens could be taken at 1⁄15 or 1⁄8 second with an IS-equipped lens and produce almost the same quality. The sharpness obtainable at a given speed can increase dramatically.[3]

When calculating the effective focal length, it is important to take into account the image format a camera uses. For example, many digital SLR cameras use an image sensor that is 2⁄3, 5⁄8, or 1⁄2 the size of a 35 mm film frame. This means that the 35 mm frame is 1.5, 1.6, or 2 times the size of the digital sensor. The latter values are referred to as the crop factor, field-of-view crop factor, focal-length multiplier, or format factor. On a 2× crop factor camera, for instance, a 50 mm lens produces the same field of view as a 100 mm lens used on a 35 mm film camera, and can typically be handheld at 1⁄100 second.
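The rule of thumb and crop-factor arithmetic above can be expressed as a short calculation. This is a minimal sketch; the function name and the 3-stop stabilization figure in the example are illustrative, not taken from any manufacturer's specification:

```python
def slowest_handheld_speed(focal_length_mm, crop_factor=1.0, is_stops=0):
    """Slowest hand-holdable shutter speed in seconds, per the 1/mm rule.

    The 35 mm equivalent focal length is focal_length_mm * crop_factor;
    each stop of image stabilization doubles the usable exposure time.
    """
    equivalent_focal_length = focal_length_mm * crop_factor
    return (1.0 / equivalent_focal_length) * (2 ** is_stops)

# A 50 mm lens on a 2x crop-factor body frames like a 100 mm lens,
# so the rule suggests 1/100 s without stabilization:
print(slowest_handheld_speed(50, crop_factor=2))              # 0.01
# Three stops of stabilization allow an 8x longer exposure:
print(slowest_handheld_speed(50, crop_factor=2, is_stops=3))  # 0.08
```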
However, image stabilization does not prevent motion blur caused by the movement of the subject or by extreme movements of the camera. Image stabilization is only designed for and capable of reducing blur that results from normal, minute shaking of a lens due to hand-held shooting. Some lenses and camera bodies include a secondary panning mode or a more aggressive 'active mode', both described in greater detail below under optical image stabilization.
Astrophotography makes much use of long-exposure photography, which requires the camera to be fixed in place. However, fastening it to the Earth is not enough, since the Earth rotates. The Pentax K-5 and K-r, when equipped with the O-GPS1 GPS accessory for position data, can use their sensor-shift capability to reduce the resulting star trails.[4]
Stabilization can be applied in the lens, the camera body or both. Each method has distinctive advantages and disadvantages.[5]
An optical image stabilizer (OIS, IS, or OS) is a mechanism used in still or video cameras that stabilizes the recorded image by varying the optical path to the sensor. This technology is implemented in the lens itself, as distinct from in-body image stabilization (IBIS), which operates by moving the sensor as the final element in the optical path. The key element of all optical stabilization systems is that they stabilize the image projected on the sensor before the sensor converts the image into digital information. IBIS can have up to five axes of movement: X, Y, roll, yaw, and pitch. IBIS has the added advantage of working with all lenses.
Optical image stabilization extends the shutter speeds usable for handheld photography by reducing the likelihood that camera shake blurs the image during the exposure.
For handheld video recording, regardless of lighting conditions, optical image stabilization compensates for minor shakes whose appearance magnifies when watched on a large display such as a television set or computer monitor.[6][7][8]
Different companies have different names for the OIS technology; for example, Nikon calls it Vibration Reduction (VR) and Canon calls it Image Stabilizer (IS).
Most high-end smartphones as of late 2014 use optical image stabilization for photos and videos.[11]
In Nikon and Canon's implementation, it works by using a floating lens element that is moved orthogonally to the optical axis of the lens using electromagnets.[12] Vibration is detected using two piezoelectric angular velocity sensors (often called gyroscopic sensors), one to detect horizontal movement and the other to detect vertical movement.[13] As a result, this kind of image stabilizer corrects only for pitch and yaw axis rotations,[14][15] and cannot correct for rotation around the optical axis. Some lenses have a secondary mode that counteracts vertical-only camera shake. This mode is useful when using a panning technique. Some such lenses activate it automatically; others use a switch on the lens.
To compensate for camera shake when shooting video while walking, Panasonic introduced Power Hybrid OIS+ with five-axis correction: axis rotation (roll), horizontal rotation, vertical rotation, and horizontal and vertical motion.[16]
Some Nikon VR-enabled lenses offer an "active" mode for shooting from a moving vehicle, such as a car or boat, which is supposed to correct for larger shakes than the "normal" mode.[17] However, active mode used for normal shooting can produce poorer results than normal mode.[18] This is because active mode is optimized for reducing higher-angular-velocity movements (typically when shooting from a heavily moving platform using faster shutter speeds), whereas normal mode tries to reduce lower-angular-velocity movements over a larger amplitude and timeframe (typically body and hand movement when standing on a stationary or slowly moving platform while using slower shutter speeds).
Most manufacturers suggest that the IS feature of a lens be turned off when the lens is mounted on a tripod as it can cause erratic results and is generally unnecessary. Many modern image stabilization lenses (notably Canon's more recent IS lenses) are able to auto-detect that they are tripod-mounted (as a result of extremely low vibration readings) and disable IS automatically to prevent this and any consequent image quality reduction.[19] The system also draws battery power, so deactivating it when not needed extends the battery charge.
A disadvantage of lens-based image stabilization is cost. Each lens requires its own image stabilization system. Also, not every lens is available in an image-stabilized version. This is often the case for fast primes and wide-angle lenses. However, the fastest lens with image stabilization is the Nocticron with a speed of f/1.2. While the most obvious advantage for image stabilization lies with longer focal lengths, even normal and wide-angle lenses benefit from it in low-light applications.
Lens-based stabilization also has advantages over in-body stabilization. In low-light or low-contrast situations, the autofocus system (which has no stabilized sensors) is able to work more accurately when the image coming from the lens is already stabilized.[citation needed] In cameras with optical viewfinders, the image seen by the photographer through the stabilized lens (as opposed to in-body stabilization) reveals more detail because of its stability, and it also makes correct framing easier. This is especially the case with longer telephoto lenses. This is not an issue for mirrorless interchangeable-lens camera systems, because the sensor output to the screen or electronic viewfinder is stabilized.
The sensor capturing the image can be moved in such a way as to counteract the motion of the camera, a technology often referred to as mechanical image stabilization. When the camera rotates, causing angular error, gyroscopes encode information to the actuator that moves the sensor.[20] The sensor is moved to maintain the projection of the image onto the image plane, which is a function of the focal length of the lens being used. Modern cameras can automatically acquire focal length information from modern lenses made for that camera. Minolta and Konica Minolta used a technique called Anti-Shake (AS), now marketed as SteadyShot (SS) in the Sony α line and Shake Reduction (SR) in the Pentax K-series and Q series cameras, which relies on a very precise angular rate sensor to detect camera motion.[21] Olympus introduced image stabilization with their E-510 D-SLR body, employing a system built around their Supersonic Wave Drive.[22] Other manufacturers use digital signal processors (DSP) to analyze the image on the fly and then move the sensor appropriately. Sensor shifting is also used in some cameras by Fujifilm, Samsung, Casio Exilim, and Ricoh Caplio.[23]
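The geometry behind the correction can be sketched under simple assumptions (a distant subject and a small pitch or yaw error; the function below is illustrative, not any manufacturer's algorithm): a rotation of θ moves the projected image by roughly f·tan θ on the image plane, which is why the sensor-shift system needs the focal length of the mounted lens.

```python
import math

def required_sensor_shift_mm(shake_deg, focal_length_mm):
    """Sensor displacement that cancels a small pitch/yaw error.

    For a distant subject, a rotation of shake_deg moves the projected
    image by focal_length * tan(shake) on the image plane; the sensor
    is driven the same distance in the opposite direction.
    """
    return focal_length_mm * math.tan(math.radians(shake_deg))

# The same 0.1 degree shake needs four times the travel at 400 mm
# as at 100 mm, which is why long lenses strain sensor-shift systems:
print(round(required_sensor_shift_mm(0.1, 100), 3))  # 0.175
print(round(required_sensor_shift_mm(0.1, 400), 3))  # 0.698
```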
The advantage with moving the image sensor, instead of the lens, is that the image can be stabilized even on lenses made without stabilization. This may allow the stabilization to work with many otherwise-unstabilized lenses, and reduces the weight and complexity of the lenses. Further, when sensor-based image stabilization technology improves, it requires replacing only the camera to take advantage of the improvements, which is typically far less expensive than replacing all existing lenses if relying on lens-based image stabilization. Some sensor-based image stabilization implementations are capable of correcting camera roll rotation, a motion that is easily excited by pressing the shutter button. No lens-based system can address this potential source of image blur. A by-product of available "roll" compensation is that the camera can automatically correct for tilted horizons in the optical domain, provided it is equipped with an electronic spirit level, such as the Pentax K-7/K-5 cameras.
One of the primary disadvantages of moving the image sensor itself is that the image projected to the viewfinder is not stabilized. Similarly, the image projected to a phase-detection autofocus system that is not part of the image sensor, if used, is not stabilized. This is not an issue on cameras that use an electronic viewfinder (EVF), since the image projected on that viewfinder is taken from the image sensor itself.
Some, but not all, camera bodies capable of in-body stabilization can be pre-set manually to a given focal length. Their stabilization system corrects as if a lens of that focal length were attached, so the camera can stabilize older lenses and lenses from other makers. This is not viable with zoom lenses, because their focal length is variable. Some adapters communicate focal-length information from one maker's lens to another maker's body. Some lenses that do not report their focal length can be retrofitted with a chip that reports a pre-programmed focal length to the camera body. Sometimes, none of these techniques works, and image stabilization cannot be used with such lenses.
In-body image stabilization requires the lens to have a larger output image circle, because the sensor is moved during exposure and thus uses a larger part of the image. Compared to the lens movements in optical image stabilization systems, the sensor movements are quite large, so the effectiveness is limited by the maximum range of sensor movement, where a typical modern optically-stabilized lens has greater freedom. Both the speed and range of the required sensor movement increase with the focal length of the lens being used, making sensor-shift technology less suited for very long telephoto lenses, especially when using slower shutter speeds, because the available motion range of the sensor quickly becomes insufficient to cope with the increasing image displacement.
In September 2023, Nikon announced the Nikon Z f, which has the world's first Focus-Point VR technology, centering the axis of sensor-shift image stabilization at the autofocus point rather than at the center of the sensor as in conventional sensor-shift systems. This allows for vibration reduction at the focused point rather than just in the center of the image.[24]
Starting with the Panasonic Lumix DMC-GX8, announced in July 2015, and subsequently in the Panasonic Lumix DC-GH5, Panasonic, which formerly equipped only lens-based stabilization in its interchangeable-lens camera system (of the Micro Four Thirds standard), introduced sensor-shift stabilization that works in concert with the existing lens-based system ("Dual IS").
In the meantime (2016), Olympus also offered two lenses with image stabilization that can be synchronized with the in-built image stabilization system of the image sensors of Olympus' Micro Four Thirds cameras ("Sync IS"). With this technology a gain of 6.5 f-stops can be achieved without blurred images.[25] This is limited by the rotational movement of the surface of the Earth, which fools the accelerometers of the camera. Therefore, depending on the angle of view, the maximum exposure time should not exceed 1⁄3 second for long telephoto shots (with a 35 mm equivalent focal length of 800 millimeters) and a little more than ten seconds for wide-angle shots (with a 35 mm equivalent focal length of 24 millimeters), if the movement of the Earth is not taken into consideration by the image stabilization process.[26]
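Those exposure limits can be sanity-checked with a back-of-the-envelope calculation. This is a sketch assuming the worst case, in which the motion sensors register the full sidereal rotation rate (about 0.0042°/s) and the stabilizer drifts the image accordingly; the function is illustrative only:

```python
import math

SIDEREAL_RATE_DEG_S = 360 / 86164  # Earth's rotation, ~0.0042 deg/s

def earth_rotation_drift_mm(exposure_s, focal_length_mm):
    """Image-plane drift if the stabilizer follows Earth's rotation.

    Worst case: the motion sensors register the full rotation rate,
    so the correction walks the image off target by rate * time.
    """
    angle_rad = math.radians(SIDEREAL_RATE_DEG_S * exposure_s)
    return focal_length_mm * math.tan(angle_rad)

# Both quoted limits correspond to roughly 0.02 mm of image drift,
# suggesting they share a common blur tolerance:
print(round(earth_rotation_drift_mm(1 / 3, 800), 4))  # 800 mm eq., 1/3 s
print(round(earth_rotation_drift_mm(10, 24), 4))      # 24 mm eq., 10 s
```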
In 2015, the Sony E camera system also allowed combining the image stabilization systems of lenses and camera bodies, but without synchronizing the same degrees of freedom. In this case, only those compensation degrees of freedom of the in-built image sensor stabilization that the lens does not cover are activated to support lens stabilization.[27]
Canon and Nikon now have full-frame mirrorless bodies that have IBIS and also support each company's lens-based stabilization. Canon's first two such bodies, the EOS R and RP, do not have IBIS, but the feature was added for the more recent higher-end R3, R5, R6 (and its Mark II version) and the APS-C R7. However, the full-frame R8 and APS-C R10 do not have IBIS. All of Nikon's full-frame Z-mount bodies (the Z 6, Z 7, the Mark II versions of both, the Z 8, and the Z 9) have IBIS. However, its APS-C Z 50 lacks IBIS.
Real-time digital image stabilization, also called electronic image stabilization (EIS), is used in some video cameras. This technique shifts the cropped area read out from the image sensor for each frame to counteract the motion. This requires the resolution of the image sensor to exceed the resolution of the recorded video, and it slightly reduces the field of view because the area on the image sensor outside the visible frame acts as a buffer against hand movements.[28] This technique reduces distracting vibrations from videos by smoothing the transition from one frame to another.
This technique cannot do anything about existing motion blur, which may result in an image seemingly losing focus as motion is compensated, owing to movement during the exposure times of individual frames. This effect is more visible in darker scenes because of the prolonged exposure time per frame.
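The crop-and-shift mechanism described above can be sketched as follows. This is a minimal illustration: a 2-D list stands in for the sensor readout, `margin` is the buffer area outside the visible frame, and `motion_xy` is a hypothetical per-frame motion estimate in pixels.

```python
def eis_crop(frame, motion_xy, margin):
    """Shift the cropped readout window to counteract camera motion.

    frame: 2-D list of pixel rows (taller and wider than the output);
    margin: buffer pixels on each side of the visible frame;
    motion_xy: estimated camera motion (dx, dy) in pixels this frame.
    The window moves opposite the motion, clamped to the buffer.
    """
    dx, dy = motion_xy
    off_x = margin + max(-margin, min(margin, -dx))
    off_y = margin + max(-margin, min(margin, -dy))
    out_h = len(frame) - 2 * margin
    out_w = len(frame[0]) - 2 * margin
    return [row[off_x:off_x + out_w] for row in frame[off_y:off_y + out_h]]

# A 6x6 readout with a 1-pixel buffer yields a 4x4 stabilized frame;
# a 1-pixel rightward shake moves the window 1 pixel left:
sensor = [[(r, c) for c in range(6)] for r in range(6)]
steady = eis_crop(sensor, (1, 0), 1)
print(len(steady), len(steady[0]))  # 4 4
print(steady[0][0])                 # (1, 0)
```

The clamp is what the buffer zone buys: shakes larger than the margin can only be partially compensated, matching the field-of-view trade-off described above.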
Some still camera manufacturers marketed their cameras as having digital image stabilization when they really only had a high-sensitivity mode that uses a short exposure time—producing pictures with less motion blur, but more noise.[29] It reduces blur when photographing something that is moving, as well as from camera shake.
Others now also use digital signal processing (DSP) to reduce blur in stills, for example by sub-dividing the exposure into several shorter exposures in rapid succession, discarding blurred ones, re-aligning the sharpest sub-exposures and adding them together, and using the gyroscope to detect the best time to take each frame.[30][31][32]
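The exposure-subdivision idea can be sketched in simplified form (alignment and gyroscope timing are omitted; frames are 1-D pixel lists, and the sharpness scores are assumed to come from some measure such as gradient energy):

```python
def stack_sharpest(sub_exposures, sharpness_scores, keep=3):
    """Combine the sharpest sub-exposures of a sub-divided exposure.

    sub_exposures: equal-length pixel lists, one per short exposure;
    sharpness_scores: one score per sub-exposure (higher = sharper).
    Blurred frames are discarded and the remainder summed, trading one
    long blurry exposure for several short, sharp ones.
    """
    order = sorted(range(len(sub_exposures)),
                   key=lambda i: sharpness_scores[i], reverse=True)
    best = [sub_exposures[i] for i in order[:keep]]
    return [sum(pixels) for pixels in zip(*best)]

# Three short exposures, one blurred by shake; keep the two sharpest:
frames = [[10, 20], [11, 19], [14, 25]]
print(stack_sharpest(frames, [0.9, 0.8, 0.2], keep=2))  # [21, 39]
```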
Many video non-linear editing systems use stabilization filters that can correct a non-stabilized image by tracking the movement of pixels in the image and correcting the image by moving the frame.[33][34] The process is similar to digital image stabilization, but since there is no larger image to work with, the filter either crops the image down to hide the motion of the frame or attempts to recreate the lost image at the edge through spatial or temporal extrapolation.[35]
Online services, including YouTube, are also beginning to provide video stabilization as a post-processing step after content is uploaded. This has the disadvantage of not having access to the real-time gyroscopic data, but the advantage of more computing power and the ability to analyze images both before and after a particular frame.[36]
Used in astronomy, an orthogonal transfer CCD (OTCCD) actually shifts the image within the CCD itself while the image is being captured, based on analysis of the apparent motion of bright stars. This is a rare example of digital stabilization for still pictures. An example of this is in the upcoming gigapixel telescope Pan-STARRS being constructed in Hawaii.[37]
A technique that requires no additional capabilities of any camera body–lens combination consists of stabilizing the entire camera body externally rather than using an internal method. This is achieved by attaching a gyroscope to the camera body, usually using the camera's built-in tripod mount. This lets the external gyro (gimbal) stabilize the camera, and is typically used in photography from a moving vehicle, when a lens or camera offering another type of image stabilization is not available.[38]
A common way to stabilize moving cameras since around 2015 is by using a camera stabilizer such as a stabilized remote camera head. The camera and lens are mounted in a remote-controlled camera holder, which is then mounted on anything that moves, such as rail systems, cables, cars, or helicopters. An example of a remote stabilized head used to stabilize moving TV cameras broadcasting live is the Newton stabilized head.[39]
Another technique for stabilizing a video or motion picture camera body is the Steadicam system, which isolates the camera from the operator's body using a harness and a camera boom with a counterweight.[40]
A camera stabilizer is any device or object that externally stabilizes the camera. This can refer to a Steadicam, a tripod, the camera operator's hand, or a combination of these.
In close-up photography, using rotation sensors to compensate for changes in pointing direction becomes insufficient. Moving, rather than tilting, the camera up/down or left/right by a fraction of a millimeter becomes noticeable when trying to resolve millimeter-size details on the object. Linear accelerometers in the camera, coupled with information such as the lens focal length and focused distance, can feed a secondary correction into the drive that moves the sensor or optics, to compensate for linear as well as rotational shake.[41]
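The size of the effect can be sketched under a thin-lens approximation (illustrative only, not a manufacturer's correction algorithm): at magnification m = f/(d − f), a sideways camera translation t moves the image by t·m, a shift that rotation-only sensors never register.

```python
def translation_blur_mm(translation_mm, focal_length_mm, subject_distance_mm):
    """Image-plane shift caused by translating (not tilting) the camera.

    Thin-lens approximation: magnification m = f / (d - f), and a
    sideways translation t of the camera moves the image by t * m.
    Rotation sensors alone cannot detect this motion.
    """
    m = focal_length_mm / (subject_distance_mm - focal_length_mm)
    return translation_mm * m

# At 1:1 macro magnification (subject at 2f), a 0.1 mm hand tremor
# shifts the image a full 0.1 mm, swamping millimeter-scale detail;
# at 1 m subject distance the same tremor barely registers:
print(translation_blur_mm(0.1, 100, 200))             # 0.1
print(round(translation_blur_mm(0.1, 100, 1000), 4))  # 0.0111
```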
In many animals, including human beings, the inner ear functions as the biological analogue of an accelerometer in camera image stabilization systems, stabilizing the image by moving the eyes. When a rotation of the head is detected, an inhibitory signal is sent to the extraocular muscles on one side and an excitatory signal to the muscles on the other side. The result is a compensatory movement of the eyes. Typically eye movements lag the head movements by less than 10 ms.[42]