POS-BERT: Point cloud one-stage BERT pre-training

Published: 01 Jan 2024 · Last Modified: 28 Oct 2024 · Expert Syst. Appl. 2024 · License: CC BY-SA 4.0
Abstract:

Highlights
- Proposes a point cloud one-stage BERT-style pre-training method.
- Uses a momentum tokenizer to provide continuous and dynamic supervision signals (a minimal sketch follows below).
- Requires no extra tokenizer training step.
- Uses contrastive learning to learn better high-level semantic representations.
- Achieves the best performance on multiple downstream tasks.
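To make the "momentum tokenizer" highlight concrete, here is a minimal sketch of how an EMA-updated tokenizer can supply continuous, dynamic targets for masked point modeling in a single training stage. This is an illustration under stated assumptions, not the paper's implementation: the `PointEncoder` module, feature dimension, 60% masking ratio, and smooth-L1 loss are all hypothetical stand-ins; only the idea of an exponential-moving-average tokenizer that needs no separate training step comes from the highlights.

```python
import copy
import torch
import torch.nn as nn
import torch.nn.functional as F

# Hypothetical per-point encoder; in POS-BERT this role is played by the
# point cloud Transformer backbone.
class PointEncoder(nn.Module):
    def __init__(self, dim=384):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(3, dim), nn.GELU(), nn.Linear(dim, dim))

    def forward(self, points):        # points: (B, N, 3)
        return self.net(points)       # per-point features: (B, N, dim)

student = PointEncoder()
# Momentum tokenizer: an EMA copy of the student, never updated by gradients,
# so no extra tokenizer-training stage is required.
tokenizer = copy.deepcopy(student)
for p in tokenizer.parameters():
    p.requires_grad_(False)

@torch.no_grad()
def ema_update(student, tokenizer, m=0.999):
    """The tokenizer slowly tracks the student, yielding continuous and
    dynamic supervision signals as training progresses."""
    for ps, pt in zip(student.parameters(), tokenizer.parameters()):
        pt.mul_(m).add_(ps.detach(), alpha=1 - m)

# One illustrative training step (masked-feature prediction):
points = torch.randn(8, 1024, 3)      # a batch of point clouds
mask = torch.rand(8, 1024) < 0.6      # mask 60% of the points (assumed ratio)

with torch.no_grad():
    targets = tokenizer(points)       # supervision signal for masked points

pred = student(points)                # student predicts features everywhere
loss = F.smooth_l1_loss(pred[mask], targets[mask])
loss.backward()
# ... optimizer.step() ...
ema_update(student, tokenizer)        # refresh the tokenizer after each step
```

Because the tokenizer is just an EMA shadow of the student, its targets evolve with training rather than being frozen discrete tokens, which is what makes the pre-training one-stage.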