Figure 1. Nonrigid surface tracking using my approach.
Dense tracking using optical flow estimation locates most pixels across multiple images, and commonly builds on the fundamental brightness constancy assumption: the brightness of a pixel remains unchanged between frames. This technique underpins research in many higher-level subareas of computer vision such as video augmentation, motion capture and visual effects. Tracking a nonrigid surface in a real-world scene is a difficult task. The main challenge lies in the temporal violation of pixel brightness constancy, which is caused by the complex nature of nonrigid motion as well as accompanying difficulties during image capture, e.g. camera shake, repeated texture, arbitrary occlusions and large displacements. Such violations of the brightness constancy constraint lead to unpredictable errors during nonrigid surface tracking.
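As an illustration, the brightness constancy assumption can be checked directly: warping the second frame by a candidate flow field and subtracting the first frame gives a residual that is zero wherever the assumption holds. The sketch below (the function name and the nearest-neighbour warp are my own simplifications, not part of the thesis pipeline) demonstrates this on a synthetic one-pixel translation.

```python
import numpy as np

def brightness_constancy_residual(I0, I1, flow):
    """Residual I1(x + u) - I0(x) under nearest-neighbour warping.

    I0, I1 : (H, W) grayscale frames; flow : (H, W, 2) per-pixel (u, v).
    A correct flow field on a brightness-constant scene yields zero residual.
    """
    H, W = I0.shape
    ys, xs = np.mgrid[0:H, 0:W]
    # Warp the sampling coordinates by the flow, clamped to the image bounds.
    xw = np.clip(np.round(xs + flow[..., 0]).astype(int), 0, W - 1)
    yw = np.clip(np.round(ys + flow[..., 1]).astype(int), 0, H - 1)
    return I1[yw, xw] - I0

# A frame shifted one pixel right, paired with the matching constant flow:
I0 = np.random.rand(8, 8)
I1 = np.roll(I0, 1, axis=1)
flow = np.zeros((8, 8, 2))
flow[..., 0] = 1.0
res = brightness_constancy_residual(I0, I1, flow)
```

Away from the image border (where clamping and the roll wrap-around interfere), the residual is exactly zero, confirming that this flow satisfies brightness constancy.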
Violation of brightness constancy is a common issue that mathematically yields fewer equations than unknowns. Typical solvers follow one of two strategies. The first acquires more known variables from extra input data, in particular additional images or reliable landmarks; however, such extra information is often manually selected and difficult to obtain in practice. The other strategy imposes additional constraints on the problem, i.e. further assumptions on texture or motion behaviour derived beforehand. In mathematical terms, a constraint may encode a geometric relation, a physical model or a statistical assumption, and it strongly influences the quality and efficiency of the host algorithm. Hence, using constraints requires less input data and is often more practical for real-world sequences.
Within this thesis, I first introduce a novel constraint on pairwise correspondence estimation (LME), which is then extended to deal with blur effects in real-world images (moBlur). I also investigate dense tracking over long sequences using a long-term feature technique (APO). To evaluate dense tracking methods on real-world sequences, I capture dense ground truth by applying novel multispectral patterns to ordinary real-world surfaces (GT).
Figure 2. Frame-to-frame tracked mesh estimation on the k-th level of the coarse-to-fine framework.
I first present a local motion constraint, namely the Laplacian Mesh constraint, to improve pairwise optical flow estimation on typical nonrigid surfaces such as cloth. The Laplacian Mesh constraint captures the inherent geometric relation between a pixel and its adjacent neighbours: the connected vertices (pixels) of a deformable surface move similarly within a small neighbourhood, even when some vertices are occluded. This observation holds for every pixel on a real-world nonrigid surface. Combining this constraint with a variational optical flow framework yields highly accurate correspondence between image pairs containing nonrigidly deforming objects. The experiments demonstrate that the approach outperforms many previous methods on several benchmark datasets.
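The neighbourhood-similarity idea behind the constraint can be sketched numerically: a Laplacian-style energy penalises any vertex whose displacement deviates from the mean displacement of its one-ring neighbours, and vanishes for locally coherent motion. The following minimal example is my own simplified formulation, not the exact LME energy used in the thesis.

```python
import numpy as np

def laplacian_energy(verts_flow, neighbors):
    """Laplacian smoothness energy of a per-vertex flow field.

    verts_flow : (N, 2) displacement of each mesh vertex.
    neighbors  : list of index lists; neighbors[i] holds vertices adjacent to i.
    Penalises each vertex whose motion deviates from the mean motion of its
    one-ring neighbourhood; zero for any globally rigid translation.
    """
    energy = 0.0
    for i, nbrs in enumerate(neighbors):
        mean_nbr = np.mean([verts_flow[j] for j in nbrs], axis=0)
        energy += np.sum((verts_flow[i] - mean_nbr) ** 2)
    return energy
```

Because the energy depends only on each vertex relative to its neighbours, an occluded vertex can still be regularised toward the motion of its visible neighbourhood, which is the intuition the constraint exploits.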
W. Li, D. Cosker, Video Interpolation using Optical flow and Laplacian Smoothness, Neurocomputing 2016.
W. Li, D. Cosker, M. Brown, and R. Tang, Optical Flow Estimation using Laplacian Mesh Energy, in Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (CVPR’13), IEEE, June 2013, pp. 2435–2442.
Figure 3. RGB-Motion Imaging System.
Optical flow estimation is a difficult task given real-world video footage containing camera and object blur. I combine a 3D pose-and-position tracker with an RGB sensor, allowing video footage to be captured together with the 3D camera motion. The results show that this additional camera motion information can be embedded into a hybrid optical flow framework by interleaving an iterative blind deconvolution and a warping-based minimisation scheme. Such a hybrid framework significantly improves the accuracy of optical flow estimation in scenes with strong blur. The proposed approach yields improved overall performance against three state-of-the-art baseline methods on the proposed ground truth sequences, as well as on several other real-world sequences captured by this novel imaging system.
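One concrete use of known camera motion is to predict the direction of the blur it induces: a translating camera smears each pixel along the motion direction, which can be modelled as a linear motion-blur kernel. The sketch below is a common textbook construction, not the thesis implementation; it simply builds such a directional kernel from a 2D motion vector.

```python
import numpy as np

def motion_blur_kernel(motion, size=9):
    """Directional blur kernel derived from a 2D camera-motion vector.

    Distributes energy along the line through the kernel centre in the
    direction of `motion`, approximating the blur left on the image by a
    camera translating during the exposure. Returns a normalised kernel.
    """
    k = np.zeros((size, size))
    c = size // 2
    n = np.hypot(*motion)
    if n == 0:
        k[c, c] = 1.0  # no motion: identity (no-blur) kernel
        return k
    dx, dy = motion[0] / n, motion[1] / n
    # Rasterise the motion line by dense sampling along its length.
    for t in np.linspace(-c, c, 4 * size):
        x, y = int(round(c + t * dx)), int(round(c + t * dy))
        if 0 <= x < size and 0 <= y < size:
            k[y, x] += 1.0
    return k / k.sum()
```

A kernel like this could seed the blind deconvolution stage, so that deblurring starts from the blur direction implied by the tracked camera motion rather than from an uninformed guess.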
Figure 4. Visual comparison on the real-world LabDesk and Shoes sequences.
W. Li, Y. Chen, J. Lee, G. Ren, and D. Cosker, Blur Robust Optical Flow using Motion Channel, Neurocomputing 2016.
W. Li, Y. Chen, J. Lee, G. Ren, and D. Cosker, Robust Optical Flow Estimation for Continuous Blurred Scenes using RGB-Motion Imaging and Directional Filtering, in Proceedings of the IEEE Winter Conference on Applications of Computer Vision (WACV’14), 2014. (Best Student Paper Award)
Figure 4. Anchor patches (blue) are labelled on the non-anchor frames within every clip using SIFT feature matching and barycentric coordinate mapping between the reference frame and each non-anchor frame.
Tracking a nonrigid surface through long image sequences is a fundamental research issue in computer vision. The task relies on estimating correspondences between image pairs over time, where the accumulation of tracking error results in drift. In this thesis, I propose an optimisation framework built on a novel Anchor Patch algorithm which significantly reduces overall tracking error on long sequences containing nonrigidly deforming objects. The framework may be applied to any tracking algorithm that calculates dense correspondences between images, e.g. optical flow. This work demonstrates the success of the proposed approach by showing significant tracking error reduction when applying six existing optical flow algorithms to a range of nonrigid benchmarks, and also provides quantitative analysis of the approach under synthetic occlusions and image noise.
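The effect of anchoring can be illustrated with a one-dimensional simulation: composing noisy frame-to-frame estimates makes error grow without bound, while periodically snapping the track back to an independently matched reference, as an anchor-patch correction does, keeps the error bounded. The toy model below is my own illustration, not the thesis algorithm.

```python
import numpy as np

def tracked_positions(true_steps, noise, anchor_every=None, rng=None):
    """Accumulate frame-to-frame motion estimates for one tracked point.

    Each step adds Gaussian estimation error of scale `noise`; without
    anchoring these errors compound into drift. Re-anchoring every
    `anchor_every` frames (mimicking an anchor-patch correction against a
    reliably matched reference frame) resets the track to the true
    position. Returns the absolute drift after each frame.
    """
    if rng is None:
        rng = np.random.default_rng(0)
    pos, truth, drift = 0.0, 0.0, []
    for t, step in enumerate(true_steps, start=1):
        truth += step
        pos += step + rng.normal(0.0, noise)  # noisy pairwise estimate
        if anchor_every and t % anchor_every == 0:
            pos = truth  # anchor correction: drift is wiped out
        drift.append(abs(pos - truth))
    return drift
```

Running this with and without `anchor_every` shows the qualitative behaviour the framework targets: unbounded random-walk drift versus error that is reset at every anchor frame.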
W. Li, D. Cosker, and M. Brown, Drift Robust Non-rigid Optical Flow Enhancement for Long Sequences, Journal of Intelligent and Fuzzy Systems (JIFS'16), 2016.
W. Li, D. Cosker, and M. Brown, An Anchor Patch Based Optimisation Framework for Reducing Optical Flow Drift in Long Image Sequences, in Proceedings of the Asian Conference on Computer Vision (ACCV’12), Springer, November 2012, pp. 112–125.
Figure 5. RGB-NIR camera and the NIR-visible dyes. Top left: the internal structure of the camera. Bottom left: sample images captured by the RGB CCD sensor and the NIR CCD sensor respectively. Top right: the relative transmittance of the RGB CCD sensor and the NIR CCD sensor (yellow). Bottom right: the absorbance of the NIR-visible dyes with respect to wavelength.
I present the first ground truth dataset of nonrigidly deforming real-world scenes, containing both long and short video sequences. To construct ground truth for the RGB sequences, the proposed system simultaneously captures Near-Infrared (NIR) image sequences in which dense markers, visible only in NIR, represent the ground truth positions, allowing comparison against the RGB tracked positions and the computation of error metrics. This novel ground truth construction protocol may also be adopted to capture other types of deformable objects, opening ground truth opportunities for other difficult-to-track problems. Unlike previous nonrigid datasets built from synthetic data, capturing real-world objects yields realistic photometric effects, such as blur and illumination change, as well as occlusion and complex deformations. A public evaluation website allows ranking of RGB-based optical flow and other dense tracking algorithms under varying statistical measures. Furthermore, I present the first RGB-NIR multispectral optical flow formulation, which optimises the overall optical flow energy by exploiting the distinguishing information of both the RGB and the complementary NIR channels. In the experiments I evaluate eight existing optical flow methods on the new dataset, and examine the proposed multispectral optical flow algorithm by varying the input channels across RGB, NIR and RGB-NIR.
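Given NIR marker positions as ground truth, a natural error metric is the endpoint error between each tracked point and its corresponding marker. The small helper below is a generic sketch of such a metric (the within-1-pixel accuracy threshold is my own arbitrary choice for illustration, not a statistic defined by the dataset).

```python
import numpy as np

def endpoint_error(tracked, gt):
    """Endpoint error between RGB-tracked and NIR ground-truth positions.

    tracked, gt : (N, 2) arrays of 2D point positions in pixels.
    Returns (mean endpoint error, fraction of points within 1 px of truth).
    """
    err = np.linalg.norm(tracked - gt, axis=1)  # per-marker Euclidean error
    return err.mean(), np.mean(err <= 1.0)
```

Per-marker statistics like these, aggregated over frames, are the kind of quantity an evaluation website can rank competing dense tracking algorithms by.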
W. Li, D. Cosker, Z. Lv and M. Brown, A Nonrigid Ground Truth Dataset and Multispectral Optical Flow Estimation using Combined RGB and Near-Infrared Imaging, IEEE Robotics and Automation Letters (RA-L’16), 2016.
W. Li, D. Cosker, Z. Lv and M. Brown, Dense Nonrigid Ground Truth for Optical Flow in Real-World Scenes, IEEE Conference on Automation Science and Engineering (CASE), 2016.