Abstract: An in-depth analysis of existing point cloud upsampling architectures reveals the underlying causes of their high computational cost and memory usage, which pose critical limitations for real-time applications. To overcome these inefficiencies, we introduce LET-UP, a lightweight and efficient binarized transformer (BiTransformer) network that substantially reduces model size and inference time at only a slight cost in accuracy. The BiTransformer framework comprises binarized multilayer perceptron (BiMLP) and binarized self-attention (BiAttention) modules, which exploit binarization to streamline feature extraction. Extensive experiments on both synthetic and real-world datasets show that LET-UP runs three times faster and is 15 times smaller than state-of-the-art methods, with only a 1%–2% drop in accuracy. These results underscore LET-UP's potential for real-time point cloud processing, offering a significant advance for both research and practical applications in the field.
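The abstract does not detail LET-UP's binarization scheme, but the idea behind BiMLP-style layers can be illustrated with a minimal sketch. Assuming XNOR-Net-style sign binarization with a per-layer scaling factor (an assumption; the paper's exact scheme may differ), full-precision weights are replaced by a two-valued approximation, so each weight needs only one bit of storage plus a shared scale:

```python
import numpy as np

def binarize_weights(w):
    # Sign binarization with a per-layer scaling factor alpha = mean(|w|).
    # This is the XNOR-Net-style scheme; the abstract does not specify
    # LET-UP's exact binarization, so this is illustrative only.
    alpha = np.abs(w).mean()
    return alpha * np.sign(w)

def bi_linear(x, w, b):
    # Forward pass of a hypothetical binarized linear layer: the
    # full-precision weights are swapped for their {-alpha, +alpha}
    # approximation before the matrix multiply.
    return x @ binarize_weights(w).T + b

rng = np.random.default_rng(0)
w = rng.normal(size=(4, 8))   # full-precision weights (1 bit each at deploy time)
x = rng.normal(size=(2, 8))   # a batch of 2 input feature vectors
b = np.zeros(4)
y = bi_linear(x, w, b)
print(y.shape)  # (2, 4)
```

Because every binarized weight is either +alpha or -alpha, the weight matrix compresses from 32 bits per entry to 1 bit plus one shared float, which is the mechanism behind the large model-size reductions binarized networks report.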