Wireless Institute Seminar Series
318 DeBartolo Hall
Scalable Wireless Digital Twins and ML Models for High-Fidelity RF Signal Mapping
Abstract: Radio frequency (RF) signal mapping, the process of analyzing and predicting RF signal strength and distribution within target areas, is crucial for cellular network planning and spectrum sharing applications. Traditional approaches rely on either analytical/statistical models, which offer low complexity but often lack accuracy, or ray tracing tools, which provide enhanced precision for the target area at the cost of increased computational complexity. Recently, machine learning (ML) has emerged as a data-driven approach to modeling RF signal propagation, leveraging models trained on synthetic datasets to perform RF signal mapping in “unseen” areas. However, such methods often require advanced proprietary software (e.g., ray tracing) to create the wireless digital twins and generate synthetic datasets, or rely on extensive measurements collected from the target areas to train the models effectively.
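For a point of reference on the low-complexity analytical models mentioned above, here is a minimal sketch of the classic log-distance path-loss model in Python; the path-loss exponent and reference loss are illustrative values, not parameters from the talk.

    import numpy as np

    def log_distance_path_loss(d_m, pl_d0_db=40.0, d0_m=1.0, n=3.0):
        """Predicted path loss in dB at distance d_m (meters), using
        PL(d) = PL(d0) + 10*n*log10(d/d0) with illustrative parameters."""
        return pl_d0_db + 10.0 * n * np.log10(d_m / d0_m)

    # Cheap to evaluate everywhere, but blind to buildings, terrain, and
    # materials -- the accuracy gap that motivates ray tracing and ML.
    print(log_distance_path_loss(np.array([10.0, 100.0, 500.0])))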
In this talk, I will present Geo2SigMap, an automated framework we developed to streamline the creation of scalable wireless digital twins and ML models, which can then be used to generate high-fidelity RF signal coverage maps. By integrating open-source databases and tools, Geo2SigMap enables the efficient generation of large-scale 3D scenes and ray tracing models based on OpenStreetMap, the USGS National Lidar Map, and NVIDIA’s Sionna RT and Aerial Omniverse Digital Twin (AODT) platforms. We also propose an ML model that is pre-trained on purely synthetic datasets and then employed to generate detailed RF signal maps, leveraging environmental information and sparse measurement data. We evaluate the performance of Geo2SigMap via real-world measurements, in which various types of user equipment (UE) collected over 50K data points of cellular information from more than 10 LTE cells operating in the Citizens Broadband Radio Service (CBRS) band. Our results show that Geo2SigMap achieves an average root mean square error (RMSE) of 6.0 dB when predicting the reference signal received power (RSRP) at the UE, an average RMSE improvement of 3.6 dB over existing methods. I will conclude by outlining the future roadmap for Geo2SigMap and discussing its potential applications in a variety of spectrum monitoring and sharing scenarios.
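To give a feel for the ray-tracing and evaluation steps described above, the sketch below computes a coverage map with Sionna RT and compares predicted against measured RSRP via RMSE. It assumes Sionna’s pre-1.0 sionna.rt API; the scene file, carrier frequency, transmitter position, and measurement values are illustrative placeholders, not details from the talk.

    import numpy as np
    from sionna.rt import load_scene, Transmitter, PlanarArray

    # Load a 3D scene in Sionna's XML format (Geo2SigMap automates building
    # such scenes from OpenStreetMap and lidar data); the path is hypothetical.
    scene = load_scene("scene.xml")
    scene.frequency = 3.6e9  # a CBRS-band carrier, for illustration

    # Single isotropic antennas at the transmitter and at each map cell.
    scene.tx_array = PlanarArray(num_rows=1, num_cols=1,
                                 vertical_spacing=0.5, horizontal_spacing=0.5,
                                 pattern="iso", polarization="V")
    scene.rx_array = scene.tx_array
    scene.add(Transmitter(name="tx", position=[0.0, 0.0, 25.0]))

    # Trace propagation paths and accumulate path gain over a grid of cells.
    cm = scene.coverage_map(max_depth=5, cm_cell_size=(5.0, 5.0))
    path_gain_db = 10.0 * np.log10(cm.as_tensor().numpy()[0] + 1e-12)

    # RMSE between predicted and measured RSRP at sparse measurement points
    # (values below are made up; real ones come from UE drive tests).
    pred_rsrp = np.array([-92.1, -85.4, -101.3])  # dBm
    meas_rsrp = np.array([-95.0, -84.0, -98.7])   # dBm
    print(f"RSRP RMSE: {np.sqrt(np.mean((pred_rsrp - meas_rsrp)**2)):.1f} dB")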
Bio: Tingjun Chen is the Nortel Networks Assistant Professor of Electrical & Computer Engineering at Duke University, with a secondary appointment in Computer Science. His research focuses on the networking, communication, sensing, and energy-efficient computing aspects of wireless, mobile, and optical networked systems, bridging theoretical foundations and experimental platforms. He received his Ph.D. in Electrical Engineering from Columbia University in 2020 and his B.Eng. in Electronic Engineering from Tsinghua University in 2014. From 2020 to 2021, he was a Postdoctoral Associate at Yale University. His awards include the NSF CAREER Award, the Google Research Scholar Award, IBM and NVIDIA Academic Awards, the Columbia Engineering Morton B. Friedman Memorial Prize for Excellence, the Columbia University Eli Jury Award, and the Facebook Fellowship. He is also a co-recipient of several paper awards from ACM CoNEXT, ACM MobiHoc, IEEE MTT-S IMS, IEEE/Optica OFC, and ECOC, and his Ph.D. thesis was recognized as runner-up for the ACM SIGMOBILE Doctoral Dissertation Award.