This is my second StackOverflow question about the behavior of a Metal kernel. In this code I overlay a color cube (LUT) on the image coming from the iPhone camera, but I pass the color cube data not as a texture, as the publicly available repositories do, but as a plain data buffer, the way the CIColorCube filter takes it. This is faster and uses fewer resources. However, I ran into an unexpected visual effect: the color cube changes the image differently than the Core Image filter does, which is especially noticeable with a black-and-white color cube. Perhaps I made a mistake in the math, since I implemented trilinear interpolation so that small color cubes would not ruin the gradients in the image, or perhaps the data is not prepared correctly for the filter. Here is the code of the compute kernel (a sketch of the host-side binding follows it):
kernel void cameraKernel(texture2d<half, access::read_write> lumaTexture [[ texture(0) ]],
                         texture2d<half, access::read_write> chromaTexture [[ texture(1) ]],
                         texture2d<half, access::write> presentTexture [[ texture(2) ]],
                         constant uint& lutSize [[ buffer(0) ]],
                         device float4* lutData [[ buffer(1) ]],
                         constant float& intensity [[ buffer(2) ]],
                         uint2 gid [[ thread_position_in_grid ]]
                         ) {
    const half3x3 yCbCrToRGBMatrix = half3x3(1.0, 1.0, 1.0, 0.0, -0.343, 1.765, 1.4, -0.711, 0.0);
    const half3x3 rgbToYCbCrMatrix = half3x3(0.299, -0.169, 0.5, 0.587, -0.331, -0.419, 0.114, 0.5, -0.081);

    half3 yuv;
    yuv.x = lumaTexture.read(gid).r;
    yuv.yz = chromaTexture.read(uint2(gid.x/2, gid.y/2)).rg - half2(0.5, 0.5);

    half3 rgb = clamp(yCbCrToRGBMatrix * yuv, half3(0.0), half3(1.0));

    if (lutSize > 2) {
        half3 luttedColor = half3(0.0);
        const half3 color = rgb * half(lutSize - 1);

        const uint minBlueIndex = floor(color.b) * lutSize * lutSize; // Calculate min blue index
        const uint maxBlueIndex = ceil(color.b) * lutSize * lutSize;  // Calculate max blue index
        const uint minGreenIndex = floor(color.g) * lutSize;          // Calculate min green index
        const uint maxGreenIndex = ceil(color.g) * lutSize;           // Calculate max green index
        const uint minRedIndex = floor(color.r);                      // Calculate min red index
        const uint maxRedIndex = ceil(color.r);                       // Calculate max red index

        // Fetch & interpolate blue colors
        luttedColor.b = mix(half(lutData[minBlueIndex].b), half(lutData[maxBlueIndex].b), fract(color.b));

        // Fetch & interpolate green colors from min blue rows
        const half minGreen = mix(half(lutData[minBlueIndex + minGreenIndex].g), half(lutData[minBlueIndex + maxGreenIndex].g), fract(color.g));
        // Fetch & interpolate green colors from max blue rows
        const half maxGreen = mix(half(lutData[maxBlueIndex + minGreenIndex].g), half(lutData[maxBlueIndex + maxGreenIndex].g), fract(color.g));
        luttedColor.g = mix(minGreen, maxGreen, fract(color.b)); // Interpolate resulting green color

        // Fetch & interpolate red color from min blue & min green indices
        const half minMinRed = mix(half(lutData[minBlueIndex + minGreenIndex + minRedIndex].r), half(lutData[minBlueIndex + minGreenIndex + maxRedIndex].r), fract(color.r));
        // Fetch & interpolate red color from min blue & max green indices
        const half minMaxRed = mix(half(lutData[minBlueIndex + maxGreenIndex + minRedIndex].r), half(lutData[minBlueIndex + maxGreenIndex + maxRedIndex].r), fract(color.r));
        // Fetch & interpolate red color from max blue & min green indices
        const half maxMinRed = mix(half(lutData[maxBlueIndex + minGreenIndex + minRedIndex].r), half(lutData[maxBlueIndex + minGreenIndex + maxRedIndex].r), fract(color.r));
        // Fetch & interpolate red color from max blue & max green indices
        const half maxMaxRed = mix(half(lutData[maxBlueIndex + maxGreenIndex + minRedIndex].r), half(lutData[maxBlueIndex + maxGreenIndex + maxRedIndex].r), fract(color.r));

        const half minRed = mix(minMinRed, minMaxRed, fract(color.g)); // Interpolate red colors from min blue indices
        const half maxRed = mix(maxMinRed, maxMaxRed, fract(color.g)); // Interpolate red colors from max blue indices
        luttedColor.r = mix(minRed, maxRed, fract(color.b)); // Interpolate resulting red color

        rgb = mix(rgb, luttedColor, half(intensity));
    }

    half3 result = rgbToYCbCrMatrix * rgb;
    result.yz += half2(0.5, 0.5);

    lumaTexture.write(result.x, gid);
    chromaTexture.write(half4(result.y, result.z, 1.0, 1.0), uint2(gid.x/2, gid.y/2));
    presentTexture.write(half4(rgb, 1.0), gid);
}
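For context, here is a minimal Swift sketch of how the kernel's textures and buffers could be bound and dispatched on the host side; the function and variable names are assumptions for illustration, not code taken from the project:

import Metal

// Hypothetical encoding helper; texture and buffer indices match the kernel signature above.
func encodeCameraKernel(encoder: MTLComputeCommandEncoder,
                        pipeline: MTLComputePipelineState,
                        lumaTexture: MTLTexture,
                        chromaTexture: MTLTexture,
                        presentTexture: MTLTexture,
                        lutBuffer: MTLBuffer,      // lutSize^3 RGBA float entries
                        lutSize: UInt32,
                        intensity: Float) {
    var lutSize = lutSize
    var intensity = intensity
    encoder.setComputePipelineState(pipeline)
    encoder.setTexture(lumaTexture, index: 0)
    encoder.setTexture(chromaTexture, index: 1)
    encoder.setTexture(presentTexture, index: 2)
    encoder.setBytes(&lutSize, length: MemoryLayout<UInt32>.size, index: 0)    // buffer(0)
    encoder.setBuffer(lutBuffer, offset: 0, index: 1)                          // buffer(1)
    encoder.setBytes(&intensity, length: MemoryLayout<Float>.size, index: 2)   // buffer(2)
    // One thread per full-resolution (luma) pixel.
    let w = pipeline.threadExecutionWidth
    let h = pipeline.maxTotalThreadsPerThreadgroup / w
    encoder.dispatchThreads(MTLSize(width: lumaTexture.width, height: lumaTexture.height, depth: 1),
                            threadsPerThreadgroup: MTLSize(width: w, height: h, depth: 1))
}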
Here is the whole project: https://github.com/VKostin8311/LiveEffectCamera
The color cube data has already been converted to binary in the format required by CIColorCube, and it works perfectly with that filter. The project also returns the processed textures as a biplanar buffer for subsequent recording with AVAssetWriter.
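For comparison, the same cube data can be applied with CIColorCube roughly like this (the helper name and the way the data is passed in are only a sketch; CIColorCube expects lutSize³ RGBA float32 values with red varying fastest, then green, then blue):

import CoreImage

// Hypothetical comparison path using the stock Core Image filter.
func applyColorCube(to image: CIImage, cubeData: Data, lutSize: Int) -> CIImage? {
    guard let filter = CIFilter(name: "CIColorCube") else { return nil }
    filter.setValue(image, forKey: kCIInputImageKey)
    filter.setValue(lutSize, forKey: "inputCubeDimension")
    filter.setValue(cubeData, forKey: "inputCubeData")
    return filter.outputImage
}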
I will be grateful for any help.