Applications of Bayesian Learning, Part 2 (Machine Learning)

Monodeep Mukherjee
2 min read · Jan 15, 2023
1. Exploiting Tensor-based Bayesian Learning for Massive Grant-Free Random Access in LEO Satellite Internet of Things (arXiv)

Authors: Ming Ying, Xiaoming Chen, Xiaodan Shao

Abstract: With the rapid development of the Internet of Things (IoT), low earth orbit (LEO) satellite IoT is expected to provide low-power, massive-connectivity, wide-coverage IoT applications. In this context, this paper provides a massive grant-free random access (GF-RA) scheme for LEO satellite IoT. The scheme requires no changes to the transceiver; instead, it transforms the received signal into a tensor-decomposition form. By exploiting the characteristics of the tensor structure, a Bayesian learning algorithm for joint active device detection and channel estimation during massive GF-RA is designed. Theoretical analysis shows that the proposed algorithm has fast convergence and low complexity. Finally, extensive simulation results confirm that it outperforms baseline algorithms in LEO satellite IoT in terms of error probability for active device detection and normalized mean square error for channel estimation. In particular, the proposed algorithm requires only short preamble sequences and supports massive connectivity at low power, which is appealing for LEO satellite IoT.
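The abstract doesn't spell out the tensor-based algorithm itself, but the underlying idea of joint active device detection and channel estimation can be illustrated with a much simpler, non-tensor sketch: sparse Bayesian learning (EM form) on a matrix signal model Y = AX + noise, where each potential device has a known preamble and an all-zero channel row if inactive. All dimensions, the noise level, and the detection threshold below are made-up illustration values, not the paper's.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy grant-free random access model (all sizes are made up for illustration):
# N potential devices, K of them active, preamble length L, M receive antennas.
# Received signal Y = A @ X + noise, where column n of A is device n's known
# preamble and row n of X is device n's channel (all-zero if inactive).
N, K, L, M, sigma2 = 80, 8, 40, 8, 0.01
A = rng.normal(size=(L, N)) / np.sqrt(L)                 # roughly unit-norm preambles
active = np.sort(rng.choice(N, size=K, replace=False))   # ground-truth active set
X = np.zeros((N, M))
X[active] = rng.normal(size=(K, M))                      # channels of active devices
Y = A @ X + np.sqrt(sigma2) * rng.normal(size=(L, M))

# Sparse Bayesian learning via EM: gamma[n] is the prior variance of device n's
# channel row; EM drives the gammas of inactive devices toward zero.
gamma = np.ones(N)
for _ in range(50):
    Sigma = np.linalg.inv(A.T @ A / sigma2 + np.diag(1.0 / gamma))  # posterior covariance
    mu = Sigma @ A.T @ Y / sigma2                                   # posterior mean = channel estimate
    gamma = np.mean(mu**2, axis=1) + np.diag(Sigma)                 # EM hyperparameter update

# Activity detection: declare devices with non-negligible estimated power active.
detected = np.flatnonzero(gamma > 0.1)
```

After convergence, `mu[detected]` serves as the channel estimate for the detected devices; the paper's tensor formulation plays the role that the known preamble matrix `A` plays in this simplified sketch.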

2. Asynchronous Bayesian Learning over a Network (arXiv)

Authors: Kinjal Bhar, He Bai, Jemin George, Carl Busart

Abstract: We present a practical asynchronous data fusion model for networked agents to perform distributed Bayesian learning without sharing raw data. Our algorithm uses a gossip-based approach in which pairs of randomly selected agents employ unadjusted Langevin dynamics for parameter sampling. We also introduce an event-triggered mechanism to further reduce communication between gossiping agents. These mechanisms drastically reduce communication overhead and help avoid bottlenecks commonly experienced with distributed algorithms. In addition, the reduced link utilization is expected to increase resiliency to occasional link failure. We establish mathematical guarantees for our algorithm and demonstrate its effectiveness via numerical experiments.
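The three ingredients named in the abstract — pairwise gossip, unadjusted Langevin dynamics, and an event trigger — can be sketched on a toy one-parameter Bayesian regression. Everything below (the network size, step size, trigger threshold, and data model) is a hypothetical illustration, not the paper's actual algorithm or tuning.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: n_agents nodes, each holding private data from
# y = theta_true * x + noise; the network should sample the posterior of theta
# without ever sharing the raw (x, y) data.
theta_true, n_agents, n_local, sigma2 = 2.0, 4, 50, 0.01
X = [rng.normal(size=n_local) for _ in range(n_agents)]
Y = [theta_true * x + np.sqrt(sigma2) * rng.normal(size=n_local) for x in X]

def grad_log_post(theta, x, y):
    # Local gradient: N(0, 1) prior split evenly across agents, plus the
    # local Gaussian likelihood term.
    return -theta / n_agents + np.sum((y - theta * x) * x) / sigma2

theta = np.zeros(n_agents)      # each agent's current parameter sample
last_sent = np.zeros(n_agents)  # last value exchanged (for the event trigger)
step, trigger = 1e-4, 0.05      # assumed step size and trigger threshold

for _ in range(2000):
    i, j = rng.choice(n_agents, size=2, replace=False)   # random gossip pair
    for k in (i, j):                                     # unadjusted Langevin step
        theta[k] += step * grad_log_post(theta[k], X[k], Y[k]) \
                    + np.sqrt(2.0 * step) * rng.normal()
    # Event-triggered fusion: exchange and average only if either sample has
    # drifted enough since the last exchange, saving communication otherwise.
    if max(abs(theta[i] - last_sent[i]), abs(theta[j] - last_sent[j])) > trigger:
        theta[i] = theta[j] = 0.5 * (theta[i] + theta[j])
        last_sent[i] = last_sent[j] = theta[i]
```

The event trigger is what distinguishes this from plain gossip: a pair that has barely moved since its last exchange skips the communication round entirely, which is the source of the reduced link utilization the abstract mentions.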
