Title: Fade margins for minimum duration outages in lognormal shadow fading and Rayleigh fading
Abstract: Minimum duration outages have been used to characterize time-dependent performance through quantities such as the average outage duration, the outage probability, and the outage frequency, in lognormal shadow fading (Mandayam et al., 1996) and Rayleigh fading (Lai and Mandayam, 1997), respectively. In this paper, we compare and contrast the effect of minimum duration outages on fade margin selection in channels subject to lognormal shadow fading and Rayleigh fading. A comparative analysis of the relevant minimum durations for the two fading types reveals the widely different time-scales at work in outage considerations for each. In particular, lognormal shadow fading affects outage on a time-scale much larger than that of Rayleigh fading. These distinct time-scales imply that the time requirements of the application govern the relative importance of each type of fading.
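To make the time-scale contrast concrete, the following is a minimal sketch (not from the paper) using the standard average fade duration formula for Rayleigh fading under Clarke's isotropic scattering model; the function name and parameter choices are illustrative assumptions.

```python
import math

def rayleigh_avg_fade_duration(margin_db, doppler_hz):
    """Average fade duration (seconds) below a threshold `margin_db` dB
    beneath the RMS envelope level, for Rayleigh fading with maximum
    Doppler frequency `doppler_hz` (Clarke's isotropic model):
        AFD = (exp(rho^2) - 1) / (rho * f_d * sqrt(2*pi)),
    where rho is the threshold-to-RMS envelope ratio."""
    rho = 10 ** (-margin_db / 20)  # convert dB fade margin to envelope ratio
    return (math.exp(rho ** 2) - 1) / (rho * doppler_hz * math.sqrt(2 * math.pi))

# With a 10 dB fade margin and a 100 Hz maximum Doppler (roughly vehicular
# speed at cellular carrier frequencies), Rayleigh fades last on the order
# of a millisecond, whereas shadow-fading variations persist over seconds.
print(rayleigh_avg_fade_duration(10, 100))
```

For example, an application that can tolerate millisecond-scale dropouts may be limited mainly by shadow fading, while one sensitive to millisecond outages must budget its fade margin against Rayleigh fading as well.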
Publication Year: 2002
Publication Date: 2002-11-23
Language: en
Type: article
Indexed In: ['crossref']
Access and Citation
Cited By Count: 6