CAMTracker (D2) v2.5 Release Notes (1)

New features

1.   OCIO color space transform. This function, already familiar from UE4, is used here to convert the color space of the camera image. In practical terms, it converts footage from other color spaces into the linear color space used by 3D engines. Different camera brands can use different configuration files, and you can also apply a corresponding LUT for color correction. The parameters of the underlying node are described in the TD documentation: https://docs.derivative.ca/OpenColorIO_TOP
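The core of such a transform is a per-channel transfer function (plus matrix/LUT stages handled by OCIO itself). As an illustration of the "other color space to linear" step, here is a minimal sketch of the standard sRGB-to-linear conversion — one common case of what the OCIO node performs, not CAMTracker's internal code:

```python
def srgb_to_linear(c: float) -> float:
    """Convert one sRGB-encoded channel value (0..1) to linear light.

    This is the standard sRGB EOTF; an OCIO config applies transforms
    like this per the selected source and destination color spaces.
    """
    if c <= 0.04045:
        return c / 12.92
    return ((c + 0.055) / 1.055) ** 2.4

def linear_to_srgb(c: float) -> float:
    """Inverse transform: linear light back to sRGB encoding."""
    if c <= 0.0031308:
        return c * 12.92
    return 1.055 * c ** (1.0 / 2.4) - 0.055
```

A midtone of 0.5 in sRGB lands near 0.21 in linear light, which is why footage left in a display color space looks wrong inside a linear 3D engine.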

2.    Lens calibration. Every camera lens inevitably distorts the captured image, so to match the real footage with the engine output, the lens must be calibrated. Zhang Zhengyou's camera calibration method is the standard approach, and I also recently completed the UI of the companion camera calibration software. To calibrate: print a checkerboard pattern, mount the printout on a rigid board, use the calibration software to take at least 10 sharp shots of the board from different angles, and let the computer run the calibration computation. The resulting parameters can then be entered here in order.
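The coefficients a Zhang-style calibration produces (for example from OpenCV's `calibrateCamera`) describe the Brown-Conrady distortion model. As a sketch of what those numbers mean, here is the model applied to a normalized image point — the same family of parameters the Lens Distort TOP consumes:

```python
def distort_point(x, y, k1=0.0, k2=0.0, p1=0.0, p2=0.0):
    """Apply Brown-Conrady lens distortion to a normalized point (x, y).

    (x, y) are camera-plane coordinates (pixel coordinates minus the
    principal point, divided by focal length). k1/k2 are radial
    coefficients, p1/p2 tangential -- the values a Zhang-style
    calibration outputs.
    """
    r2 = x * x + y * y
    radial = 1.0 + k1 * r2 + k2 * r2 * r2
    x_d = x * radial + 2.0 * p1 * x * y + p2 * (r2 + 2.0 * x * x)
    y_d = y * radial + p1 * (r2 + 2.0 * y * y) + 2.0 * p2 * x * y
    return x_d, y_d
```

With all coefficients zero the point is unchanged; a negative k1 (barrel distortion) pulls points toward the image center, which is the typical case for wide lenses.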

The options of this node are described here: https://docs.derivative.ca/Lens_Distort_TOP

3.   The Align View button is mainly for Vive tracker users. If you have two trackers, one can serve as the tracking camera and the other can be used to align the real and virtual cameras, that is, to adjust the XYZ offset value (the offset from the tracker to the camera's entrance pupil). The camera's FOV needs to be set before doing this. In addition, you can switch between different tracker types via the tracker switch slider, including 2.0, 3.0, Tundra Tracker, and tracking-camera styles.
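Conceptually, the XYZ offset is a fixed vector from the tracker to the entrance pupil, expressed in the tracker's local frame and rotated by the tracker's current orientation every frame. A minimal sketch of that idea (yaw-only rotation for brevity; the actual tool of course handles full 3D orientation):

```python
import math

def camera_position(tracker_pos, tracker_yaw_deg, offset):
    """Compute the virtual camera position from a tracker pose.

    tracker_pos: (x, y, z) of the tracker in world space.
    tracker_yaw_deg: tracker heading about the vertical (Y) axis.
    offset: (x, y, z) from tracker to the camera's entrance pupil,
            in the tracker's local frame -- the value the Align View
            workflow lets you dial in. Axis convention is assumed.
    """
    yaw = math.radians(tracker_yaw_deg)
    ox, oy, oz = offset
    # Rotate the local offset into world space (yaw about Y only).
    wx = ox * math.cos(yaw) + oz * math.sin(yaw)
    wz = -ox * math.sin(yaw) + oz * math.cos(yaw)
    tx, ty, tz = tracker_pos
    return (tx + wx, ty + oy, tz + wz)
```

This is why the offset only has to be measured once: as long as the tracker stays rigidly mounted to the camera, rotating the rig rotates the offset with it.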

4.   Green screen configuration modes. The button behind Align View switches between billboard, 3-screen, and 4-screen modes. These let you model the dimensions of the real green screen in the software's 3D space, so that only the image inside the green-screen volume is output to D2. The 3-screen and 4-screen modes have nothing to do with D2 itself; D2 still receives a billboard, but the billboard image is clipped to the corresponding green-screen volume, and everything outside it is masked out. That way, even if the camera captures areas beyond the green screen, they will not appear in D2. This step is done after the camera image has been matched to the software image.
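At its core, the masking amounts to testing whether each reconstructed point falls inside the configured green-screen volume. A minimal sketch under the simplifying assumption of an axis-aligned box (the real 3-screen and 4-screen shapes are more involved):

```python
def inside_greenscreen(point, box_min, box_max):
    """Return True if a world-space point lies inside the green-screen
    volume, modeled here as an axis-aligned box (an assumption for
    illustration). Pixels whose points fall outside would be masked
    out before the billboard image reaches D2."""
    return all(lo <= p <= hi for p, lo, hi in zip(point, box_min, box_max))
```

Everything the camera sees that maps outside the box gets a zero-alpha mask, so stray set pieces beyond the screen edge never show up in D2.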

 

5.   LUT1 and LUT2. The previous version only provided the adjustable spill-removal mode of LUT2; this version adds the fixed spill-removal mode of LUT1. Note that apart from the color, these values usually do not need to be changed unless you have a special need to adjust the last three items. The color of the first item should be set to a color close to that of your green screen. Under normal circumstances, these two modes are already sufficient for despilling.

6.    ColorGrading. This function is mainly for professional cameras that need color correction applied to the image after keying. For what each control does, please refer to standard color-grading references.
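A common grading chain (shown here as a generic sketch, not necessarily CAMTracker's exact implementation) applies lift, gamma, and gain per channel:

```python
def grade_channel(v, lift=0.0, gamma=1.0, gain=1.0):
    """Apply lift/gamma/gain color grading to one channel value (0..1).

    lift raises blacks, gain scales whites, gamma bends the midtones.
    Order and formula follow one common convention; grading tools vary.
    """
    v = v * gain + lift * (1.0 - v)   # gain, then lift toward blacks
    v = max(0.0, min(1.0, v))         # clamp before the power curve
    return v ** (1.0 / gamma)
```

With the default values the function is an identity, which is the expected "no grade applied" state.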

7.    More intuitive tracker-offset adjustment and a one-key origin reset. Because the Vive tracker uses the SteamVR platform, its origin calibration procedure is quite cumbersome, so I developed a one-key calibration function that greatly simplifies the workflow. Specifically, place the tracker (2.0) in the middle of the green-screen floor with its indicator light facing the camera, oriented the same way as the green screen, then click the DefaultVal(V) button to the right of the Vive button to switch it to RestOrigin(V). That position is then used as the origin of the coordinate system, and the reset also covers all three rotation axes. The next time you power on, you only need to keep the same base-station order; normally the base station identified first is used as the reference. If the room is calibrated, it of course does not matter.
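Conceptually, RestOrigin(V) stores the tracker's current pose and thereafter reports every pose relative to it. A yaw-only sketch of that re-basing (assumed logic for illustration; the tool covers all three rotation axes):

```python
import math

def rebase_pose(pose, origin):
    """Express a tracker pose relative to a recorded origin pose.

    pose / origin: (x, z, yaw_deg) -- floor-plane position plus heading.
    Returns the pose in the coordinate system whose origin and forward
    direction are the recorded origin pose.
    """
    dx = pose[0] - origin[0]
    dz = pose[1] - origin[1]
    yaw = math.radians(-origin[2])  # undo the origin's heading
    rx = dx * math.cos(yaw) - dz * math.sin(yaw)
    rz = dx * math.sin(yaw) + dz * math.cos(yaw)
    return (rx, rz, pose[2] - origin[2])
```

The tracker's pose at the moment of the click maps to (0, 0, 0), and all subsequent motion is measured from that spot on the green-screen floor.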

The other addition is a Kalman filter dedicated to the Vive tracker; see the parameters in the blue-bordered part of the screenshot. When you hover the mouse over a parameter, a reference data waveform is displayed in the right pane to make tuning these parameters more intuitive. The default values are usually sufficient. It is recommended to use a wired connection with the tracker to reduce interference, and to keep reflective surfaces from interfering with base-station positioning.
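The parameters being tuned correspond to the process- and measurement-noise settings of a standard Kalman filter. A minimal one-dimensional sketch of the filtering applied per axis:

```python
class Kalman1D:
    """Scalar Kalman filter for one tracker axis.

    q: process noise (how fast the true value may drift).
    r: measurement noise (how jittery the raw tracker data is).
    Larger r / smaller q -> smoother but laggier output.
    """
    def __init__(self, q=1e-4, r=1e-2, x0=0.0):
        self.q, self.r = q, r
        self.x, self.p = x0, 1.0  # state estimate and its variance

    def update(self, z):
        self.p += self.q                   # predict: variance grows
        k = self.p / (self.p + self.r)     # Kalman gain
        self.x += k * (z - self.x)         # correct with measurement z
        self.p *= (1.0 - k)                # variance shrinks after update
        return self.x
```

Raising r relative to q trades responsiveness for smoothness, which is exactly the trade-off the waveform pane lets you judge by eye.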

8.    For the T265 tracking camera, a one-key origin-recording function has been added, so you no longer have to run cables back and forth for origin calibration. Do the origin calibration once, pick a fixed position, and click RecordOrigin; from then on you only need to power on at that position and angle and switch the tracking-device type. In addition, the CAMROT function lets you mount the T265 in any of four orientations to avoid strong light reflections and tracking-data jitter.
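CAMROT's four mounting orientations amount to remapping the T265's axes before they drive the virtual camera. A sketch of such a remap, assuming rotation about the camera's view (Z) axis in 90-degree steps and a particular axis convention:

```python
def camrot_remap(x, y, z, mount_deg=0):
    """Remap T265 translation for a camera mounted rotated about its
    view (Z) axis. mount_deg must be 0, 90, 180, or 270 -- the four
    CAMROT mounting orientations. Axis convention is assumed here."""
    if mount_deg == 0:
        return (x, y, z)
    if mount_deg == 90:
        return (y, -x, z)
    if mount_deg == 180:
        return (-x, -y, z)
    if mount_deg == 270:
        return (-y, x, z)
    raise ValueError("mount_deg must be one of 0, 90, 180, 270")
```

Because the remap is a pure axis permutation with sign flips, the 90- and 270-degree settings are exact inverses of each other.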

Note that the T265 must be connected before switching to it, and must not be unplugged while in use.

9.     The other three professional tracking protocols all support transmitting zoom data, which must be sent as regular DMX data. The newly added PSN protocol only supports regular positional data.
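Since DMX channels are 8-bit, continuous parameters like zoom are commonly sent as a 16-bit coarse/fine channel pair. A sketch of that common convention (the exact channel layout CAMTracker uses is not specified here):

```python
def zoom_to_dmx(zoom):
    """Encode a normalized zoom value (0.0..1.0) as a 16-bit DMX
    coarse/fine channel pair -- a common convention for smooth
    parameters over DMX; channel assignment is fixture-specific."""
    raw = round(max(0.0, min(1.0, zoom)) * 65535)
    return raw >> 8, raw & 0xFF  # (coarse channel, fine channel)

def dmx_to_zoom(coarse, fine):
    """Decode the coarse/fine pair back to a normalized zoom value."""
    return ((coarse << 8) | fine) / 65535
```

Two channels give 65536 steps instead of 256, which is what makes slow zoom ramps look smooth rather than stepped.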

10.  The TD development platform has been updated to the 2022 version, which supports more types of capture cards. Although it also supports genlock and timecode synchronization, D2's functional limitations mean images can only be transmitted via NDI, so those features will have to wait for the official development team to follow up.

To be continued.
