|Lixin Yang*, Kailin Li, Haoyu Zhen, Mei Han, Cewu Lu|
International Conference on 3D Vision (3DV), 2024
Project Page / arXiv / Data / Code
Color-NeuS focuses on mesh reconstruction with color. We remove view-dependent color while using a relighting network to maintain volume rendering performance. The mesh is extracted from the SDF network, and vertex colors are derived from the global color network. We conceived an in-hand object scanning task and gathered several videos for it to evaluate Color-NeuS.
|Kailin Li*, Lixin Yang*, Haoyu Zhen, Zenan Lin, Xinyu Zhan, Licheng Zhong, Jian Xu, Kejian Wu, Cewu Lu|
International Conference on Computer Vision (ICCV), 2023
Project Page / arXiv / Code / PyBlend
We proposed CHORD, a new method that exploits the categorical shape prior to reconstruct the shapes of intra-class objects. In addition, we constructed COMIC, a new dataset of category-level hand-object interaction, which encompasses a diverse collection of object instances, materials, hand interactions, and viewing directions.
|Lixin Yang, Jian Xu, Licheng Zhong, Xinyu Zhan, Zhicheng Wang, Kejian Wu, Cewu Lu|
IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), 2023
Paper / arXiv / Code
POEM (Point Embedded Multi-view) focuses on reconstructing an articulated body with true scale and accurate pose from a set of sparsely arranged camera views. In practice, we use the hand as an example. POEM explores the power of points, using a cluster of (x, y, z) coordinates, which carry natural positional encoding, to find associations in multi-view stereo.
Journal of Shanghai Jiao Tong University (Science)
We trained a model to identify images of staplers (distinguishing them from mechanical arms, as an example). After image recognition, we analyzed the feature information contained in the recognized images to support product design. This method can also be applied to other product designs.
Machine Learning and Intelligent Systems Engineering (MLISE), 2022
We simulated two centralized and two decentralized methods for multi-robot safe navigation, each in its respective environment.