Abstract
This study proposes a regression-based approach for calculating the likelihood probability in time-varying multi-input multi-output (MIMO) systems that use one-bit analog-to-digital converters. Such time-varying MIMO systems often suffer performance degradation because changes in the likelihood probability are difficult to track. To address this challenge, the proposed method leverages channel statistics and decoded outputs to refine the likelihood probability. An optimization problem is then formulated to minimize the mean-squared error between the true and refined likelihood probabilities. A linear regression approach is derived to solve this problem, and a regularization technique is applied to further improve the calculation. Simulation results indicate that the proposed method effectively tracks temporal variations in the likelihood probability and thereby outperforms conventional methods in reliability.
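The core computation the abstract describes is a regularized least-squares (ridge-style) fit that maps regression features to refined likelihood probabilities. The sketch below is a minimal illustration under stated assumptions, not the paper's implementation: the feature matrix `Phi` (standing in for channel statistics and decoded outputs), the synthetic targets, and the regularization weight `lam` are all hypothetical placeholders.

```python
import numpy as np

# Minimal sketch, assuming the refinement reduces to regularized linear
# regression: fit weights that map features Phi (a stand-in for channel
# statistics and decoded outputs) to noisy likelihood-probability samples,
# minimizing mean-squared error plus an L2 penalty.

rng = np.random.default_rng(0)

T, d = 200, 4                        # time slots, feature dimension (assumed)
Phi = rng.uniform(size=(T, d))       # hypothetical regression features
w_true = rng.uniform(size=d)
w_true /= w_true.sum()               # convex weights keep probabilities in [0, 1]
p_true = Phi @ w_true                # "true" likelihood probabilities
p_noisy = np.clip(p_true + 0.05 * rng.normal(size=T), 0.0, 1.0)

lam = 0.1                            # regularization weight (assumed value)

# Ridge solution w = (Phi^T Phi + lam * I)^{-1} Phi^T p_noisy,
# which minimizes ||Phi w - p_noisy||^2 + lam * ||w||^2.
w_hat = np.linalg.solve(Phi.T @ Phi + lam * np.eye(d), Phi.T @ p_noisy)

p_refined = np.clip(Phi @ w_hat, 0.0, 1.0)     # refined likelihood estimates
mse = np.mean((p_refined - p_true) ** 2)
print(f"MSE of refined likelihood estimates: {mse:.4e}")
```

The L2 penalty plays the same role as the regularization mentioned in the abstract: it stabilizes the fit when the feature matrix is ill-conditioned, at the cost of a small bias in the recovered weights.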
Original language | English |
---|---|
Article number | 3957 |
Journal | Mathematics |
Volume | 12 |
Issue number | 24 |
DOIs | |
State | Published - Dec 2024 |
Keywords
- likelihood probability
- linear regression
- quantized MIMO
- time-varying channels