Hybrid CPU-GPU distributed Framework for Large Scale Mobile Networks Simulation

Ben Romdhanne, Bilel; Nikaein, Navid; Mosli Bouksiaa, Mohamed Said; Bonnet, Christian
Research Report RR-12-268

Most existing packet-level simulation tools are designed to perform experiments modeling small- to medium-scale networks. The main reason for this limitation is the amount of computation power and memory available in a quasi mono-process simulation environment. To enable efficient packet-level simulation of large-scale scenarios, we introduce a new CPU-GPU co-simulation framework in which synchronization and experiment design are performed on the CPU while node processes are executed in parallel on the GPU, following the master/worker model [13]. The framework is developed using the Compute Unified Device Architecture (CUDA) and is denoted Cunetsim [18], the CUDA network simulator. To study the performance gain when the GPU is used, we also introduce a CPU-legacy version of Cunetsim optimized for multi-core architectures. In this work, we present Cunetsim's architecture, design concepts, and features. We evaluate the performance of both versions of Cunetsim against Sinalgo and NS-3 using benchmark scenarios [20]. Evaluation results show that Cunetsim's execution time remains stable and that it achieves significantly lower computation time than CPU-based simulators for both static and mobile networks, with no degradation in the accuracy of the results. We also study the impact of the hardware configuration on the performance gain and on simulation correctness. Cunetsim is a proof of concept demonstrating the feasibility of fully GPU-based simulation, rather than GPU offloading or partial acceleration, achieved through an adequate architecture.

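As a minimal sketch of the master/worker scheme described above (not Cunetsim's actual code): the CPU drives the per-round synchronization loop while a CUDA kernel advances every node in parallel, one thread per node. The node_step kernel, the NodeState structure, and all parameters below are illustrative assumptions, not taken from the Cunetsim implementation.

    #include <cuda_runtime.h>
    #include <cstdio>

    // Hypothetical per-node state; Cunetsim's real data structures are not shown here.
    struct NodeState {
        float x, y;        // node position (mobility)
        int   tx_queue;    // pending packets to transmit
    };

    // Worker: each GPU thread advances one node by one simulation round.
    __global__ void node_step(NodeState *nodes, int num_nodes, float dt)
    {
        int i = blockIdx.x * blockDim.x + threadIdx.x;
        if (i >= num_nodes) return;

        // Placeholder per-node work: move the node and drain one queued packet.
        nodes[i].x += 0.1f * dt;
        nodes[i].y += 0.1f * dt;
        if (nodes[i].tx_queue > 0) nodes[i].tx_queue--;
    }

    int main(void)
    {
        const int   num_nodes = 1 << 16;   // 65,536 simulated nodes
        const int   rounds    = 100;       // simulation rounds
        const float dt        = 1.0f;      // time step per round

        NodeState *d_nodes;
        cudaMalloc((void **)&d_nodes, num_nodes * sizeof(NodeState));
        cudaMemset(d_nodes, 0, num_nodes * sizeof(NodeState));

        dim3 block(256);
        dim3 grid((num_nodes + block.x - 1) / block.x);

        // Master: the CPU handles synchronization and experiment control,
        // launching one parallel round per simulation step.
        for (int r = 0; r < rounds; ++r) {
            node_step<<<grid, block>>>(d_nodes, num_nodes, dt);
            cudaDeviceSynchronize();   // round barrier: all nodes finish before the next step
            // (event collection / statistics gathering would go here)
        }

        cudaFree(d_nodes);
        printf("Simulated %d nodes for %d rounds\n", num_nodes, rounds);
        return 0;
    }

In this sketch the only CPU-GPU interaction per round is the kernel launch and the barrier, which mirrors the idea of keeping node processing entirely on the GPU rather than offloading selected computations.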

Type: Report
Date: 2012-06-05
Department: Communication Systems
Eurecom Ref: 3892
Copyright: © EURECOM. Personal use of this material is permitted. The definitive version of this paper was published in Research Report RR-12-268 and is available at:

PERMALINK: https://www.eurecom.fr/publication/3892