Revision History for Parameter Expanded Stochastic Gradient Markov Chain Monte Carlo

Camera Ready Revision Edit by Authors

  • 02 Mar 2025, 02:19 Coordinated Universal Time
  • Title: Parameter Expanded Stochastic Gradient Markov Chain Monte Carlo
  • Authors: Hyunsu Kim, Giung Nam, Chulhee Yun, Hongseok Yang, Juho Lee
  • Authorids: Hyunsu Kim, Giung Nam, Chulhee Yun, Hongseok Yang, Juho Lee
  • Keywords: SGMCMC, Bayesian Neural Network, Parameter Expansion
  • Abstract:

    Bayesian Neural Networks (BNNs) provide a promising framework for modeling predictive uncertainty and enhancing out-of-distribution (OOD) robustness by estimating the posterior distribution of network parameters. Stochastic Gradient Markov Chain Monte Carlo (SGMCMC) is one of the most powerful methods for scalable posterior sampling in BNNs, achieving efficiency by combining stochastic gradient descent with second-order Langevin dynamics. However, SGMCMC often suffers from limited sample diversity in practice, which affects uncertainty estimation and model performance. We propose a simple yet effective approach to enhance sample diversity in SGMCMC without the need for tempering or running multiple chains. Our approach reparameterizes the neural network by decomposing each of its weight matrices into a product of matrices, resulting in a sampling trajectory that better explores the target parameter space. This approach produces a more diverse set of samples, allowing faster mixing within the same computational budget. Notably, our sampler achieves these improvements without increasing the inference cost compared to standard SGMCMC. Extensive experiments on image classification tasks, including OOD robustness, diversity, loss surface analyses, and a comparative study with Hamiltonian Monte Carlo, demonstrate the superiority of the proposed approach.

  • PDF: pdf
  • Supplementary Material: zip
  • Primary Area: probabilistic methods (Bayesian methods, variational inference, sampling, UQ, etc.)
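
    The reparameterization described in the abstract can be illustrated with a minimal sketch. This is not the paper's implementation; the factor names `P` and `V` and the shapes are assumptions. The point it demonstrates is that a weight matrix decomposed into a product of matrices computes the same forward pass, so the effective weight can be collapsed before inference, keeping inference cost equal to standard SGMCMC.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Hypothetical decomposition W = P @ V of a single layer's weight matrix.
    # Sampling operates on the expanded parameters (P, V) rather than W.
    d_out, d_in = 4, 3
    P = rng.normal(size=(d_out, d_out))
    V = rng.normal(size=(d_out, d_in))
    W = P @ V  # effective weight, collapsible before inference

    x = rng.normal(size=(d_in,))

    # The layer computes the same function whether parameterized by W or (P, V):
    assert np.allclose(W @ x, P @ (V @ x))
    ```

    Under this kind of reparameterization the dynamics move through a higher-dimensional expanded space while the induced distribution over the effective weight W is what matters for prediction.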

    Edit Info


    Readers: Everyone
    Writers: ICLR 2025 Conference, ICLR 2025 Conference Submission13380 Authors
    Signatures: ICLR 2025 Conference Submission13380 Authors

Camera Ready Revision Edit by Authors

  • 28 Feb 2025, 17:47 Coordinated Universal Time
  • Title, Authors, Authorids, Keywords, Abstract, PDF, Supplementary Material, and Primary Area: identical to the 02 Mar 2025 camera-ready revision above.

Camera Ready Revision Edit by Authors

  • 22 Feb 2025, 00:52 Coordinated Universal Time
  • Title, Authors, Authorids, Keywords, Abstract, PDF, Supplementary Material, and Primary Area: identical to the 02 Mar 2025 camera-ready revision above.

Camera Ready Revision Edit by Authors

  • 21 Feb 2025, 21:55 Coordinated Universal Time
  • Title, Authors, Authorids, Keywords, Abstract, PDF, Supplementary Material, and Primary Area: identical to the 02 Mar 2025 camera-ready revision above.

Rebuttal Revision Edit by Authors

  • 19 Nov 2024, 07:37 Coordinated Universal Time
  • Abstract, PDF, and Supplementary Material: identical to those in the 02 Mar 2025 camera-ready revision above.

Rebuttal Revision Edit by Authors

  • 19 Nov 2024, 07:35 Coordinated Universal Time
  • Abstract, PDF, and Supplementary Material: identical to those in the 02 Mar 2025 camera-ready revision above.