Bad Edges from Kinect V2 Scan Data
Where bad edges happen
When there is a large distance between the background and the object being scanned, data appears along the object's edges indicating that something is closer to the scanner than it actually is.
Let’s look at the bad data from a few different views. Because the bad data shifts around the edge, I’ve circled an example in purple in each image.
The bad edge data is read as part of the person: the scanner treats it as attached to the actual object rather than as background or as free-floating points.
The change in distance between each frame is shown in red. The bad data is not stable or consistent: mostly it changes from pixel to pixel, but sometimes it stays fairly stable, so excluding data that changes from frame to frame would not be enough on its own.
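A minimal sketch of why a purely temporal filter falls short, using NumPy and made-up depth values (the frame contents, the pixel positions, and the 50 mm threshold are all assumptions for illustration, not values from the actual scans):

```python
import numpy as np

# Two hypothetical consecutive depth frames (mm): a flat background at
# 4000 mm with an object block at 1500 mm. All values are made up.
frame1 = np.full((8, 8), 4000.0)
frame1[2:6, 2:6] = 1500.0
frame2 = frame1.copy()

# A flickering edge pixel that jumps between frames...
frame1[2, 6] = 3100.0
frame2[2, 6] = 2300.0
# ...and a semi-stable bad pixel that barely moves.
frame1[5, 6] = 2400.0
frame2[5, 6] = 2410.0

# Per-pixel change between the frames (the red overlay in the figure).
temporal_change = np.abs(frame2 - frame1)

# Flag pixels that moved more than an assumed 50 mm between frames.
unstable = temporal_change > 50.0

# The flickering pixel is caught, but the semi-stable bad one slips
# through, so a temporal filter alone cannot remove the bad edge.
print(unstable[2, 6], unstable[5, 6])
```

The semi-stable bad pixel passes the temporal check untouched, which is exactly why frame-to-frame exclusion alone is not enough.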
Here you can see the change in distance around each pixel; orange indicates a larger change than green. This means the pixels are not smooth: adjacent values jump up and down relative to each other.
Here we can see that blue is angled away from the camera and green is angled toward the camera; the bad edge is actually being read as a surface angled toward the camera.
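One way to see how a bad edge can read as an angled surface: the flying pixels sit at depths between the object and the background, forming a ramp, and a ramp in depth is what a tilted surface looks like to the sensor. A minimal sketch, with all depth values and the per-pixel metric pitch assumed for illustration:

```python
import numpy as np

# Hypothetical scan line of depth values (mm): object at 1500, background
# at 4000, with flying pixels forming a ramp of intermediate depths in
# between. All values here are made up.
line = np.array([1500.0, 1500.0, 1500.0, 2000.0,
                 2600.0, 3200.0, 4000.0, 4000.0])

# Depth slope along the line (mm per pixel), plus an assumed metric pixel
# pitch at this range; the tilt angle of the surface relative to a
# fronto-parallel plane follows from the two.
slope = np.gradient(line)
pixel_pitch_mm = 5.0  # assumption; depends on range and camera intrinsics
tilt_deg = np.degrees(np.arctan2(np.abs(slope), pixel_pitch_mm))

# Flat object and background pixels read as ~0 degrees of tilt, while the
# flying-pixel ramp reads as a steeply tilted surface; the sign of `slope`
# says which way it leans, which is how the edge shows up in the angle view.
print(tilt_deg)
```

The flat runs score essentially zero tilt while the ramp scores a near-vertical tilt, matching how the bad edge stands out in the angle rendering.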
The current solution was to add a filter called ‘pixel distance’: the total distance between a pixel and all of its adjacent pixels. The image on the left was taken without the filter; the one on the right has it applied. This filter effectively removes areas where neighboring pixels vary greatly from each other. Some incorrect color still remains on the edges, but that can be resolved by removing one more layer of pixels from all edges; better to have less data and have it be correct than more data and have it be wrong.
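A minimal sketch of a pixel-distance filter of this kind, followed by peeling one layer off every remaining edge. The depth values, the 4-neighbour definition of "adjacent", and the 1000 mm threshold are all assumptions for illustration, not the settings used on the actual scans:

```python
import numpy as np

# Hypothetical depth frame (mm): background at 4000, an object block at
# 1500, and a jittery flying-pixel column along the object's right edge.
depth = np.full((10, 10), 4000.0)
depth[2:8, 2:8] = 1500.0
depth[2:8, 8] = [2100.0, 3300.0, 1900.0, 3500.0, 2200.0, 3100.0]

# "Pixel distance": total absolute difference between each pixel and its
# four direct neighbours (borders handled by edge replication).
padded = np.pad(depth, 1, mode="edge")
pixel_distance = (
    np.abs(depth - padded[:-2, 1:-1])    # neighbour above
    + np.abs(depth - padded[2:, 1:-1])   # neighbour below
    + np.abs(depth - padded[1:-1, :-2])  # neighbour left
    + np.abs(depth - padded[1:-1, 2:])   # neighbour right
)

# Keep only pixels whose neighbourhood is smooth (assumed threshold).
valid = pixel_distance < 1000.0

# Then peel one more layer off every edge of the kept region (and the
# frame border), trading a thin band of good data for confidence.
v = np.pad(valid, 1, mode="constant", constant_values=False)
eroded = (valid & v[:-2, 1:-1] & v[2:, 1:-1]
          & v[1:-1, :-2] & v[1:-1, 2:])

filtered = np.where(eroded, depth, np.nan)  # NaN marks discarded pixels
```

The jittery column scores a huge pixel distance and is dropped by the threshold, while the erosion step removes the remaining boundary pixels whose color could still be wrong, leaving only the well-supported interior of the object.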