
ios – TrueDepth to real point cloud


I’m trying to convert TrueDepth depth maps into point clouds. Is there any way to convert a TrueDepth camera depth map into a real-scale point cloud using the intrinsics provided by Apple? Other questions don’t help. This is how I extract the parameters:

        if let calibrationData = depthData.cameraCalibrationData {
            let intrinsicMatrix = calibrationData.intrinsicMatrix
            let extrinsicMatrix = calibrationData.extrinsicMatrix
            let referenceDimensions = calibrationData.intrinsicMatrixReferenceDimensions
            let intrinsicReferenceDimensionWidth = referenceDimensions.width
        }

where depthData: AVDepthData
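Since `intrinsicMatrix` is expressed at `intrinsicMatrixReferenceDimensions` rather than at the depth map’s resolution, it has to be rescaled before use. A minimal sketch of that rescaling, using the iPhone 12 numbers quoted later in the post and assuming (hypothetically) a 4032×3024 reference size with a 640×480 depth map:

```python
import numpy as np

# Hypothetical sizes for illustration: intrinsics reported at a 4032-wide
# reference image, depth map delivered at 640 pixels wide.
ref_width, depth_width = 4032.0, 640.0
scale = depth_width / ref_width

# Intrinsic matrix as reported (shown row-major): focal lengths fx, fy on
# the diagonal, principal point (cx, cy) in the last column.
K_ref = np.array([[2742.1404, 0.0, 2015.808],
                  [0.0, 2742.1404, 1513.418],
                  [0.0, 0.0, 1.0]])

# fx, fy, cx, cy all scale linearly with image resolution, so scaling the
# first two rows rescales the whole matrix to the depth map's size.
K_depth = K_ref.copy()
K_depth[:2] *= scale
```

Note that Swift’s `simd_float3x3` is column-major, so the principal point sits in `intrinsicMatrix.columns.2` rather than the last row.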

My transformation code is here (with iPhone 12 intrinsics and reference dimensions divided by the depth map dimensions):

import numpy as np

def depth_to_point_cloud(depth,
                         intrinsics=[[2742.1404 / (4032 / 640), 0, 2015.808 / (4032 / 640)],
                                     [0, 2742.1404 / (4032 / 640), 1513.418 / (4032 / 640)],
                                     [0, 0, 1]]):
    height, width = depth.shape
    x, y = np.meshgrid(np.arange(width), np.arange(height))
    # Back-project each pixel: X = (u - cx) * Z / fx, Y = (v - cy) * Z / fy
    xrw = (x - intrinsics[0][2]) * depth / intrinsics[0][0]
    yrw = (y - intrinsics[1][2]) * depth / intrinsics[1][1]
    xyzw = np.stack([xrw, yrw, depth], axis=2)
    return xyzw.reshape(-1, 3)

The result is very stretched and doesn’t look like a point cloud with real distances. What am I missing? Any help, questions, suggestions, links, guides, or literature would be appreciated!
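One way to sanity-check the back-projection (not part of the original post) is to feed it a synthetic flat wall at a constant depth and verify that Z stays constant while the X extent matches the field of view implied by the intrinsics. A minimal sketch, assuming intrinsics already scaled to a hypothetical 640×480 depth map (fx = fy ≈ 435.26, cx ≈ 319.97, cy ≈ 240.23) and depth in meters:

```python
import numpy as np

def depth_to_point_cloud(depth, intrinsics):
    height, width = depth.shape
    x, y = np.meshgrid(np.arange(width), np.arange(height))
    # Pinhole back-projection: X = (u - cx) * Z / fx, Y = (v - cy) * Z / fy
    xrw = (x - intrinsics[0][2]) * depth / intrinsics[0][0]
    yrw = (y - intrinsics[1][2]) * depth / intrinsics[1][1]
    return np.stack([xrw, yrw, depth], axis=2).reshape(-1, 3)

# Synthetic flat wall 1 m away.
K = [[435.26, 0, 319.97], [0, 435.26, 240.23], [0, 0, 1]]
pts = depth_to_point_cloud(np.ones((480, 640), dtype=np.float32), K)
# Every Z should be exactly 1.0, and the horizontal extent should be
# roughly 639 / 435.26 ≈ 1.47 m; if the cloud is much wider or narrower,
# the intrinsics were not scaled to the depth map's resolution.
```

If the cloud passes this check but real captures still look stretched, the depth buffer itself may be the issue: `AVDepthData` can deliver disparity (`kCVPixelFormatType_DisparityFloat*`) rather than depth, and disparity must be converted (e.g. via `converting(toDepthDataType:)`) before back-projecting.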
