Laha Ale, Texas A&M University – Corpus Christi

Proactive Caching in Mobile Edge Computing Using Bidirectional Deep Recurrent Neural Network

Abstract: With the proliferation of mobile devices, mobile traffic has been increasing dramatically. Such a massive increase in data traffic can cause severe congestion in core networks. Moreover, service latency cannot be guaranteed if requested contents are always fetched from remote cloud servers. Mobile Edge Computing (MEC) holds great potential to address these challenges by decentralizing cloud services and spreading the load of cloud servers across small Base Stations (BSs) at the network edge. In MEC, proactively caching popular contents at small BSs to serve nearby users can significantly reduce traffic loads in core networks and the service latency experienced by users. The performance of proactive caching relies heavily on the accuracy of content popularity prediction, yet content popularity is typically unknown and changes over time. Predicting it is very challenging due to the high dynamics of users' requests and mobility, as well as the ever-changing set of content files.
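The proactive caching idea above can be sketched in a few lines. This is a minimal illustration, not the paper's actual policy: it assumes each small BS simply keeps the contents with the highest predicted popularity in a fixed-size cache, and the content IDs and scores are hypothetical.

```python
def update_cache(predicted_popularity, cache_size):
    # Rank contents by predicted popularity and keep the top cache_size.
    ranked = sorted(predicted_popularity,
                    key=predicted_popularity.get, reverse=True)
    return set(ranked[:cache_size])

# Hypothetical popularity scores for five content IDs at one small BS.
scores = {"a": 0.9, "b": 0.2, "c": 0.7, "d": 0.4, "e": 0.8}
cached = update_cache(scores, cache_size=3)
print(sorted(cached))  # the three contents with the highest scores
```

Requests for any content in `cached` are served locally at the BS; only misses travel to the core network, which is where the latency and congestion savings come from.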

In this work, we propose a deep learning-based framework that predicts potentially requested contents and updates cached files using a Bidirectional Deep Recurrent Neural Network. First, a one-dimensional convolutional neural network forms the first layer of the framework to reduce the computational cost. Second, a bidirectional recurrent neural network is employed to predict time-variant requests at small BSs. Third, a fully connected neural network learns from the outputs of the bidirectional recurrent neural network to produce the final predictions. Experiments on a real-world dataset demonstrate that the proposed solution achieves high prediction accuracy (about 90%), which can greatly reduce service latency and mitigate congestion in core networks.
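The three-stage pipeline above (1-D convolution, then a bidirectional recurrence, then a fully connected head) can be sketched in plain Python with scalar features for readability. All weights, the kernel, and the toy request counts are illustrative assumptions, not the paper's trained model.

```python
import math

def conv1d(seq, kernel):
    # 1-D "valid" convolution: slides the kernel over the input sequence,
    # shortening it and so cutting downstream computation (the stated
    # motivation for the convolutional front end).
    k = len(kernel)
    return [sum(kernel[j] * seq[i + j] for j in range(k))
            for i in range(len(seq) - k + 1)]

def rnn_pass(seq, w_x, w_h):
    # Simple tanh recurrence over a scalar sequence.
    h, out = 0.0, []
    for x in seq:
        h = math.tanh(w_x * x + w_h * h)
        out.append(h)
    return out

def birnn(seq, w_x, w_h):
    # Bidirectional pass: forward and backward hidden states are paired,
    # so each time step sees both past and future context.
    fwd = rnn_pass(seq, w_x, w_h)
    bwd = list(reversed(rnn_pass(list(reversed(seq)), w_x, w_h)))
    return list(zip(fwd, bwd))

def predict(seq, kernel, w_x, w_h, fc_w, fc_b):
    # conv -> BiRNN -> fully connected head with a sigmoid, giving a
    # per-step score in (0, 1) for how likely a request is.
    feats = conv1d(seq, kernel)
    hidden = birnn(feats, w_x, w_h)
    return [1.0 / (1.0 + math.exp(-(fc_w[0] * hf + fc_w[1] * hb + fc_b)))
            for hf, hb in hidden]

# Toy request counts for one content item at a small BS over 8 time slots.
requests = [3, 5, 2, 8, 6, 7, 4, 9]
scores = predict(requests, kernel=[0.2, 0.5, 0.3],
                 w_x=0.4, w_h=0.3, fc_w=(0.8, 0.6), fc_b=-0.5)
print(scores)
```

In a real system each stage would be a trained layer operating on vector features across many contents and BSs; the sketch only shows how the stages compose and why the convolution shortens the sequence the recurrent layers must process.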

Presentation Author(s):
Laha Ale*
