Merge pull request #5096 from priankakariatyml:ios-doc-updates-ver2
PiperOrigin-RevId: 601860316
copybara-github committed Jan 26, 2024
2 parents c079ac2 + 2543d89 commit 4ff9922
Showing 4 changed files with 31 additions and 47 deletions.
26 changes: 10 additions & 16 deletions mediapipe/tasks/ios/vision/face_detector/sources/MPPFaceDetector.h
@@ -84,11 +84,9 @@ NS_SWIFT_NAME(FaceDetector)
* interest. Rotation will be applied according to the `orientation` property of the provided
* `MPImage`. Only use this method when the `FaceDetector` is created with running mode `.image`.
*
* This method supports classification of RGBA images. If your `MPImage` has a source type of
* `.pixelBuffer` or `.sampleBuffer`, the underlying pixel buffer must have one of the
* following pixel format types:
* 1. kCVPixelFormatType_32BGRA
* 2. kCVPixelFormatType_32RGBA
* This method supports performing face detection on RGBA images. If your `MPImage` has a source
* type of `.pixelBuffer` or `.sampleBuffer`, the underlying pixel buffer must use
* `kCVPixelFormatType_32BGRA` as its pixel format.
*
* If your `MPImage` has a source type of `.image` ensure that the color space is
* RGB with an Alpha channel.
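For reference, a minimal Swift sketch of image-mode face detection under these format constraints; the bundled model name (`blaze_face_short_range`), the option values, and the helper name are illustrative assumptions, not part of this commit.

```swift
import MediaPipeTasksVision
import UIKit

// Assumed setup: a face detection .tflite model bundled with the app and a UIImage
// whose color space is RGB with an alpha channel.
func detectFaces(in uiImage: UIImage) throws -> FaceDetectorResult {
  let options = FaceDetectorOptions()
  options.baseOptions.modelAssetPath =
    Bundle.main.path(forResource: "blaze_face_short_range", ofType: "tflite") ?? ""
  options.runningMode = .image

  let faceDetector = try FaceDetector(options: options)

  // MPImage created from a UIImage; rotation follows the image's `orientation`.
  let mpImage = try MPImage(uiImage: uiImage)
  return try faceDetector.detect(image: mpImage)
}
```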
@@ -109,11 +107,9 @@ NS_SWIFT_NAME(FaceDetector)
* the provided `MPImage`. Only use this method when the `FaceDetector` is created with running
* mode `.video`.
*
* This method supports classification of RGBA images. If your `MPImage` has a source type of
* `.pixelBuffer` or `.sampleBuffer`, the underlying pixel buffer must have one of the
* following pixel format types:
* 1. kCVPixelFormatType_32BGRA
* 2. kCVPixelFormatType_32RGBA
* This method supports performing face detection on RGBA images. If your `MPImage` has a source
* type of `.pixelBuffer` or `.sampleBuffer`, the underlying pixel buffer must use
* `kCVPixelFormatType_32BGRA` as its pixel format.
*
* If your `MPImage` has a source type of `.image` ensure that the color space is RGB with an Alpha
* channel.
@@ -145,17 +141,15 @@ NS_SWIFT_NAME(FaceDetector)
* It's required to provide a timestamp (in milliseconds) to indicate when the input image is sent
* to the face detector. The input timestamps must be monotonically increasing.
*
* This method supports classification of RGBA images. If your `MPImage` has a source type of
* `.pixelBuffer` or `.sampleBuffer`, the underlying pixel buffer must have one of the
* following pixel format types:
* 1. kCVPixelFormatType_32BGRA
* 2. kCVPixelFormatType_32RGBA
* This method supports performing face detection on RGBA images. If your `MPImage` has a source
* type of `.pixelBuffer` or `.sampleBuffer`, the underlying pixel buffer must use
* `kCVPixelFormatType_32BGRA` as its pixel format.
*
* If the input `MPImage` has a source type of `.image` ensure that the color
* space is RGB with an Alpha channel.
*
* If this method is used for classifying live camera frames using `AVFoundation`, ensure that you
* request `AVCaptureVideoDataOutput` to output frames in `kCMPixelFormat_32RGBA` using its
* request `AVCaptureVideoDataOutput` to output frames in `kCMPixelFormat_32BGRA` using its
* `videoSettings` property.
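A small Swift sketch of the `AVFoundation` configuration described above; the function name and parameters are placeholders for illustration.

```swift
import AVFoundation
import CoreVideo

// A minimal sketch: configure an AVCaptureVideoDataOutput to vend 32BGRA frames,
// the pixel format the detector accepts for `.pixelBuffer`/`.sampleBuffer` sources.
func addBGRAVideoOutput(to session: AVCaptureSession,
                        delegate: AVCaptureVideoDataOutputSampleBufferDelegate,
                        queue: DispatchQueue) {
  let videoOutput = AVCaptureVideoDataOutput()
  videoOutput.videoSettings = [
    kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32BGRA
  ]
  videoOutput.setSampleBufferDelegate(delegate, queue: queue)
  if session.canAddOutput(videoOutput) {
    session.addOutput(videoOutput)
  }
}
```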
*
* @param image A live stream image data of type `MPImage` on which face detection is to be
mediapipe/tasks/ios/vision/face_landmarker/sources/MPPFaceLandmarker.h
@@ -57,11 +57,9 @@ NS_SWIFT_NAME(FaceLandmarker)
* interest. Rotation will be applied according to the `orientation` property of the provided
* `MPImage`. Only use this method when the `FaceLandmarker` is created with `.image`.
*
* This method supports RGBA images. If your `MPImage` has a source type of `.pixelBuffer` or
* `.sampleBuffer`, the underlying pixel buffer must have one of the following pixel format
* types:
* 1. kCVPixelFormatType_32BGRA
* 2. kCVPixelFormatType_32RGBA
* This method supports performing face landmark detection on RGBA images. If your `MPImage` has a
* source type of `.pixelBuffer` or `.sampleBuffer`, the underlying pixel buffer must use
* `kCVPixelFormatType_32BGRA` as its pixel format.
*
* If your `MPImage` has a source type of `.image` ensure that the color space is RGB with an
* Alpha channel.
@@ -80,10 +78,9 @@ NS_SWIFT_NAME(FaceLandmarker)
* the provided `MPImage`. Only use this method when the `FaceLandmarker` is created with running
* mode `.video`.
*
* This method supports RGBA images. If your `MPImage` has a source type of `.pixelBuffer` or
* `.sampleBuffer`, the underlying pixel buffer must have one of the following pixel format types:
* 1. kCVPixelFormatType_32BGRA
* 2. kCVPixelFormatType_32RGBA
* This method supports performing face landmark detection on RGBA images. If your `MPImage` has a
* source type of `.pixelBuffer` or `.sampleBuffer`, the underlying pixel buffer must use
* `kCVPixelFormatType_32BGRA` as its pixel format.
*
* If your `MPImage` has a source type of `.image` ensure that the color space is RGB with an Alpha
* channel.
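A hedged Swift sketch of video-mode landmark detection under these constraints; the landmarker instance, the frame, and its timestamp are assumed to come from the caller.

```swift
import MediaPipeTasksVision

// Assumes `faceLandmarker` was created with `runningMode = .video` and that
// `timestampMs` increases monotonically across successive frames.
func landmarks(for frame: MPImage,
               at timestampMs: Int,
               using faceLandmarker: FaceLandmarker) throws -> FaceLandmarkerResult {
  // When the frame's source type is `.pixelBuffer` or `.sampleBuffer`, its underlying
  // pixel buffer must use kCVPixelFormatType_32BGRA.
  return try faceLandmarker.detect(videoFrame: frame, timestampInMilliseconds: timestampMs)
}
```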
@@ -113,16 +110,15 @@ NS_SWIFT_NAME(FaceLandmarker)
* It's required to provide a timestamp (in milliseconds) to indicate when the input image is sent
* to the face detector. The input timestamps must be monotonically increasing.
*
* This method supports RGBA images. If your `MPImage` has a source type of `.pixelBuffer` or
* `.sampleBuffer`, the underlying pixel buffer must have one of the following pixel format types:
* 1. kCVPixelFormatType_32BGRA
* 2. kCVPixelFormatType_32RGBA
* This method supports performing face landmark detection on RGBA images. If your `MPImage` has a
* source type of `.pixelBuffer` or `.sampleBuffer`, the underlying pixel buffer must use
* `kCVPixelFormatType_32BGRA` as its pixel format.
*
* If the input `MPImage` has a source type of `.image` ensure that the color space is RGB with an
* Alpha channel.
*
* If this method is used for classifying live camera frames using `AVFoundation`, ensure that you
* request `AVCaptureVideoDataOutput` to output frames in `kCMPixelFormat_32RGBA` using its
* request `AVCaptureVideoDataOutput` to output frames in `kCMPixelFormat_32BGRA` using its
* `videoSettings` property.
*
* @param image A live stream image data of type `MPImage` on which face landmark detection is to be
mediapipe/tasks/ios/vision/gesture_recognizer/sources/MPPGestureRecognizer.h
@@ -63,11 +63,9 @@ NS_SWIFT_NAME(GestureRecognizer)
* `MPImage`. Only use this method when the `GestureRecognizer` is created with running mode,
* `.image`.
*
* This method supports gesture recognition of RGBA images. If your `MPImage` has a source type of
* `.pixelBuffer` or `.sampleBuffer`, the underlying pixel buffer must have one of the following
* pixel format types:
* 1. kCVPixelFormatType_32BGRA
* 2. kCVPixelFormatType_32RGBA
* This method supports performing gesture recognition on RGBA images. If your `MPImage` has a
* source type of `.pixelBuffer` or `.sampleBuffer`, the underlying pixel buffer must use
* `kCVPixelFormatType_32BGRA` as its pixel format.
*
* If your `MPImage` has a source type of `.image` ensure that the color space is RGB with an Alpha
* channel.
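A minimal Swift sketch of the pixel-buffer path under these rules; the recognizer instance is assumed to have been created with running mode `.image`, and the format check and error type are illustrative.

```swift
import Foundation
import CoreVideo
import MediaPipeTasksVision

// Assumes `recognizer` was created with `runningMode = .image`.
func recognizeGestures(in pixelBuffer: CVPixelBuffer,
                       using recognizer: GestureRecognizer) throws -> GestureRecognizerResult {
  // Only 32BGRA pixel buffers are supported for the `.pixelBuffer` source type.
  guard CVPixelBufferGetPixelFormatType(pixelBuffer) == kCVPixelFormatType_32BGRA else {
    throw NSError(domain: "GestureRecognizerExample", code: 1,
                  userInfo: [NSLocalizedDescriptionKey: "Pixel buffer must be 32BGRA."])
  }
  let mpImage = try MPImage(pixelBuffer: pixelBuffer)
  return try recognizer.recognize(image: mpImage)
}
```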
@@ -92,11 +90,9 @@ NS_SWIFT_NAME(GestureRecognizer)
* It's required to provide the video frame's timestamp (in milliseconds). The input timestamps must
* be monotonically increasing.
*
* This method supports gesture recognition of RGBA images. If your `MPImage` has a source type of
* `.pixelBuffer` or `.sampleBuffer`, the underlying pixel buffer must have one of the following
* pixel format types:
* 1. kCVPixelFormatType_32BGRA
* 2. kCVPixelFormatType_32RGBA
* This method supports performing gesture recognition on RGBA images. If your `MPImage` has a
* source type of `.pixelBuffer` or `.sampleBuffer`, the underlying pixel buffer must use
* `kCVPixelFormatType_32BGRA` as its pixel format.
*
* If your `MPImage` has a source type of `.image` ensure that the color space is RGB with an Alpha
* channel.
@@ -129,18 +125,16 @@ NS_SWIFT_NAME(GestureRecognizer)
* It's required to provide a timestamp (in milliseconds) to indicate when the input image is sent
* to the gesture recognizer. The input timestamps must be monotonically increasing.
*
* This method supports gesture recognition of RGBA images. If your `MPImage` has a source type of
* `.pixelBuffer` or `.sampleBuffer`, the underlying pixel buffer must have one of the following
* pixel format types:
* 1. kCVPixelFormatType_32BGRA
* 2. kCVPixelFormatType_32RGBA
* This method supports performing gesture recognition on RGBA images. If your `MPImage` has a
* source type of `.pixelBuffer` or `.sampleBuffer`, the underlying pixel buffer must use
* `kCVPixelFormatType_32BGRA` as its pixel format.
*
* If the input `MPImage` has a source type of `.image` ensure that the color space is RGB with an
* Alpha channel.
*
* If this method is used for performing gesture recognition on live camera frames using
* `AVFoundation`, ensure that you request `AVCaptureVideoDataOutput` to output frames in
* `kCMPixelFormat_32RGBA` using its `videoSettings` property.
* `kCMPixelFormat_32BGRA` using its `videoSettings` property.
*
* @param image A live stream image data of type `MPImage` on which gesture recognition is to be
* performed.
mediapipe/tasks/ios/vision/image_classifier/sources/MPPImageClassifier.h
@@ -199,7 +199,7 @@ NS_SWIFT_NAME(ImageClassifier)
* Alpha channel.
*
* If this method is used for classifying live camera frames using `AVFoundation`, ensure that you
* request `AVCaptureVideoDataOutput` to output frames in `kCMPixelFormat_32RGBA` using its
* request `AVCaptureVideoDataOutput` to output frames in `kCMPixelFormat_32BGRA` using its
* `videoSettings` property.
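For illustration, a hedged sketch of a capture delegate that feeds such BGRA frames to an image classifier in live-stream mode; the class name is a placeholder, and the classifier is assumed to have been created with `runningMode = .liveStream` and its result delegate set elsewhere.

```swift
import AVFoundation
import MediaPipeTasksVision

// Assumes `classifier` was created with `runningMode = .liveStream` and its
// `imageClassifierLiveStreamDelegate` set; results arrive on that delegate, not here.
final class CameraFrameClassifier: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
  private let classifier: ImageClassifier

  init(classifier: ImageClassifier) {
    self.classifier = classifier
  }

  func captureOutput(_ output: AVCaptureOutput,
                     didOutput sampleBuffer: CMSampleBuffer,
                     from connection: AVCaptureConnection) {
    // Timestamps must increase monotonically; each frame's presentation time does.
    let seconds = CMTimeGetSeconds(CMSampleBufferGetPresentationTimeStamp(sampleBuffer))
    let timestampMs = Int(seconds * 1000)
    do {
      let mpImage = try MPImage(sampleBuffer: sampleBuffer)
      try classifier.classifyAsync(image: mpImage, timestampInMilliseconds: timestampMs)
    } catch {
      // Dropping the frame on error keeps the capture queue running.
    }
  }
}
```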
*
* @param image A live stream image data of type `MPImage` on which image classification is to be
