YANG Jing, WU Jia, LI Hong-xia. User-aware edge-caching mechanism for mobile social network[J]. Chinese Journal of Engineering, 2020, 42(7): 930-938. DOI: 10.13374/j.issn2095-9389.2019.07.12.001

User-aware edge-caching mechanism for mobile social network

  • Abstract: To address the network congestion and degraded quality of user experience caused by the explosive growth of data traffic, a user-attribute-aware edge-caching mechanism is proposed. First, a latent factor model is used to learn each user's degree of interest in different categories of content and thereby estimate locally popular content; the small base stations then cooperatively cache the predicted locally popular content and update it in real time as user preferences change. To further reduce transmission delay, communities of interest are constructed according to user preferences; within each community, suitable caching users are selected, based on their caching willingness and caching capability, to cache target content and share it with ordinary users. Results show that the proposed mechanism outperforms random caching and most-popular-content caching algorithms: it improves the cache hit rate and reduces transmission delay while enhancing the quality of user experience.

     

    Abstract: With the rapid growth in the number of intelligent terminal devices and wireless multimedia applications, mobile communication traffic has exploded. The latest report from the Cisco Visual Networking Index (VNI) indicates that by 2022, global mobile data traffic will have grown to three times its 2017 level, which will exert tremendous pressure on the backhaul link. One key approach to solving this problem is to cache popular content at the edges (base stations and mobile devices) and serve requested content from an edge close to the user, instead of fetching it from the content server over the backhaul network. By satisfying mobile users' content requests locally, edge caching can effectively improve network performance and reduce the pressure on the backhaul link. However, owing to the limited storage capacity of edge nodes and the diversity of user demands, edge nodes can neither cache all the content held by the content server nor cache content at random. To solve these problems, an edge-caching mechanism based on user awareness was proposed. First, using a latent factor model, we predicted the popular content in a macro cell from the users' interests. Small base stations within the same macro cell cache content cooperatively and update the locally popular content as users' content preferences change. To further reduce the content delivery delay, users are grouped into communities of interest according to their content preferences. Within each community, the most appropriate user equipment (UE) is selected, considering its caching willingness and caching ability, to cache content for the other UEs in the same community of interest. Results show that the proposed mechanism outperforms the random caching approach and the most-popular-content caching algorithm; it improves the cache hit rate and reduces the transmission delay while enhancing the quality of user experience.
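The first step described above — predicting each user's interest in content via a latent factor model, then aggregating per-user predictions into a cell-level popularity estimate that drives what the small base stations cache — can be sketched as follows. This is an illustrative sketch only: the paper does not specify its factorization method or hyperparameters, so the SGD training loop, the rank, the toy interest scores, and the helper names (`train_latent_factors`, `local_popularity`) are all assumptions made here for demonstration.

```python
import numpy as np

def train_latent_factors(obs, n_factors=4, n_iters=200, lr=0.01, reg=0.05, seed=0):
    """Factorize a sparse user-content interest matrix R ~ P @ Q.T via SGD.

    obs: list of (user, item, interest) observations, interest in [0, 1].
    Returns user factors P (n_users x k) and item factors Q (n_items x k).
    """
    rng = np.random.default_rng(seed)
    n_users = max(u for u, _, _ in obs) + 1
    n_items = max(i for _, i, _ in obs) + 1
    P = 0.1 * rng.standard_normal((n_users, n_factors))
    Q = 0.1 * rng.standard_normal((n_items, n_factors))
    for _ in range(n_iters):
        for u, i, r in obs:
            err = r - P[u] @ Q[i]          # prediction error on this observation
            pu = P[u].copy()               # keep old value for Q's update
            P[u] += lr * (err * Q[i] - reg * P[u])
            Q[i] += lr * (err * pu - reg * Q[i])
    return P, Q

def local_popularity(P, Q, cell_users):
    """Cell-level popularity: mean predicted interest over the users in the cell."""
    scores = P[cell_users] @ Q.T           # (|cell_users|, n_items) predicted interests
    return scores.mean(axis=0)             # one popularity score per content item

# Toy cell: 3 users, 4 content items, sparse observed interests (hypothetical values).
obs = [(0, 0, 1.0), (0, 1, 0.8), (1, 0, 0.9), (1, 2, 0.2), (2, 1, 0.7), (2, 3, 0.1)]
P, Q = train_latent_factors(obs)
pop = local_popularity(P, Q, cell_users=[0, 1, 2])
cache_order = np.argsort(pop)[::-1]        # small base stations cache top items first
```

In a real deployment, `cache_order` would be re-computed as new interest observations arrive, which is how the mechanism's real-time update of locally popular content could be realized.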
