FirebaseMLVision Framework Reference

VisionImage

class VisionImage : NSObject

An image or image buffer used in vision detection, with optional metadata.

  • Metadata about the image (e.g. image orientation). If metadata is not specified, the default metadata values are used.

    Declaration

    Swift

    var metadata: VisionImageMetadata? { get set }
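As a sketch, metadata is typically used to tell the detector how the pixel data is oriented, so upright detection works on rotated camera frames. `VisionImageMetadata` is the Swift name of the Objective-C class `FIRVisionImageMetadata`; the `.rightTop` value below is one example orientation and should be derived from the actual device and camera position:

```swift
import FirebaseMLVision

// Describe the orientation of the pixel data so the detector can
// account for rotation before running detection.
let metadata = VisionImageMetadata()
metadata.orientation = .rightTop  // example value; compute from device/camera orientation

let visionImage = VisionImage(image: someUIImage)  // someUIImage is assumed to exist
visionImage.metadata = metadata
```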
  • Initializes a VisionImage object with the given image.

    Declaration

    Swift

    init(image: UIImage)

    Parameters

    image

    Image to use in vision detection. The given image should be rotated so that its imageOrientation property is set to UIImageOrientationUp. The UIImage must have a non-NULL CGImage property.

    Return Value

    A VisionImage instance with the given image.
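A minimal usage sketch for this initializer, assuming a bundled image named "sample.jpg" (hypothetical). The guard reflects the documented requirements: the image must be backed by a CGImage (a CIImage-backed UIImage has a nil cgImage) and should already be upright:

```swift
import FirebaseMLVision
import UIKit

// Hypothetical image; any UIImage backed by a CGImage works.
// If the image can arrive rotated (e.g. from the camera roll),
// either normalize it to .up or set metadata describing its orientation.
if let photo = UIImage(named: "sample.jpg"),
   photo.cgImage != nil,
   photo.imageOrientation == .up {
    let visionImage = VisionImage(image: photo)
    // Pass visionImage to a detector obtained from Vision.vision().
}
```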

  • Initializes a VisionImage object with the given image buffer. To improve performance, minimize the lifespan and number of instances of this class when initializing with a CMSampleBufferRef.

    Declaration

    Swift

    init(buffer sampleBuffer: CMSampleBuffer)

    Parameters

    sampleBuffer

    Image buffer to use in vision detection. The buffer must be based on a pixel buffer (not compressed data), and the pixel format must be one of:
      - kCVPixelFormatType_32BGRA
      - kCVPixelFormatType_420YpCbCr8BiPlanarFullRange
      - kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange
    In practice, this works with the video output of the phone's camera, but not with other arbitrary sources of CMSampleBufferRefs.

    Return Value

    A VisionImage instance with the given image buffer.
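A sketch of the typical call site for this initializer: an AVCaptureVideoDataOutputSampleBufferDelegate callback, where each camera frame arrives as a CMSampleBuffer in one of the supported pixel formats (e.g. kCVPixelFormatType_32BGRA, configured on the AVCaptureVideoDataOutput). The short-lived local VisionImage follows the performance recommendation above; the orientation value is an assumption to be computed from the real device state:

```swift
import AVFoundation
import FirebaseMLVision

extension FrameProcessor: AVCaptureVideoDataOutputSampleBufferDelegate {
    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        // Create the VisionImage per frame and let it go out of scope
        // quickly, per the recommendation above, rather than retaining it.
        let visionImage = VisionImage(buffer: sampleBuffer)

        let metadata = VisionImageMetadata()
        metadata.orientation = .rightTop  // assumed; derive from device/camera orientation
        visionImage.metadata = metadata

        // Hand visionImage to a detector obtained from Vision.vision().
    }
}
```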

  • Unavailable.

Except as otherwise noted, the content of this page is licensed under the Creative Commons Attribution 4.0 License, and code samples are licensed under the Apache 2.0 License. For details, see the Google Developers Site Policies. Java is a registered trademark of Oracle and/or its affiliates.

Last updated 2021-02-11 UTC.