Publication in conference proceedings
Latent space transformers for generalizing deep networks
Hamed Farkhari (Farkhari, H.); Joseanne Viana (Viana, J.); Nidhi (Nidhi); Luis Miguel Campos (Campos, L. M.); Pedro Sebastião (Sebastião, P.); Albena Mihovska (Mihovska, A.); Purnima Lala Mehta (Mehta, P. L.); Luis Bernardo (Bernardo, L.); et al.
2021 IEEE Conference on Standards for Communications and Networking (CSCN)
Year (definitive publication)
2021
Language
English
Country
United States of America
More Information
Web of Science®: Times Cited: 0 (last checked: 2024-05-17 20:09)
Scopus: Times Cited: 0 (last checked: 2024-05-17 00:14)
Google Scholar: Times Cited: 0 (last checked: 2024-05-13 13:58)

Abstract
Sharing information between deep networks is not a simple task today. In the traditional approach, researchers retrain the layers at the end of a pretrained deep network while the remaining layers are kept unchanged, either to adapt the network to their purposes or to develop a new one. In this paper, we propose a novel concept for interoperability between deep networks. Generalizing the usability of such networks will facilitate the creation of new hybrid models, promoting innovation and disruptive use cases for deep networks in fifth-generation (5G) wireless networks and increasing the accessibility, usability, and affordability of these products. The main idea is to use a standard latent space transformation to share information between such networks. First, the creators of each deep network split it into two parts. Then they provide access to a standard latent space. Since every deep network should follow this procedure, we suggest a standard for it. With the latent space exposed, two deep networks can be combined through a latent transformer block, the only block that needs to be trained when connecting different pretrained deep networks. The combination yields a new network with a unique ability. This paper contributes a concept for the generalization of deep networks using latent transformers, optimizing the utilization of the edge and cloud in 5G telecommunications, controlling load balancing, saving bandwidth, and decreasing the latency caused by cumbersome computations. We review the current standardization associated with deep networks and Artificial Intelligence in general. Lastly, we present some use cases in 5G supporting the proposed concept.
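The split-and-connect idea in the abstract can be sketched numerically: the two halves of the pretrained networks stay frozen, and only a small latent transformer mapping one latent space onto the other is trained. The sketch below is a minimal toy illustration, not code from the paper; all dimensions, weights, and names are assumptions, and a single linear map stands in for the latent transformer block.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical frozen halves of two pretrained networks:
# encoder_a maps inputs to network A's latent space (dim 4),
# decoder_b maps network B's latent space (dim 3) to outputs.
W_enc = rng.normal(size=(5, 4))   # frozen encoder weights (network A)
W_dec = rng.normal(size=(3, 2))   # frozen decoder weights (network B)

def encoder_a(x):
    return np.tanh(x @ W_enc)     # frozen: never updated

def decoder_b(z):
    return z @ W_dec              # frozen: never updated

# Latent transformer block: the ONLY trainable component,
# mapping A's latent space onto B's latent space.
T = rng.normal(size=(4, 3)) * 0.1

# Toy targets produced by an "ideal" combined network.
T_true = rng.normal(size=(4, 3))
X = rng.normal(size=(64, 5))
Y = decoder_b(encoder_a(X) @ T_true)

init_loss = float(np.mean((decoder_b(encoder_a(X) @ T) - Y) ** 2))

# Train T by plain gradient descent on mean squared error;
# gradients flow only into the latent transformer.
lr = 0.01
for _ in range(500):
    Z = encoder_a(X)              # frozen forward pass through network A's half
    err = decoder_b(Z @ T) - Y
    grad_T = Z.T @ (err @ W_dec.T) / len(X)
    T -= lr * grad_T

final_loss = float(np.mean((decoder_b(encoder_a(X) @ T) - Y) ** 2))
```

Because both halves are frozen, only the 4x3 transformer matrix is fitted, which is what keeps the connection step cheap enough to run at the network edge in the 5G scenarios the abstract mentions.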
Acknowledgements
This research received funding from the European Union’s Horizon 2020 research and innovation programme under the Marie Sklodowska-Curie Project Number 813391.
Keywords
Deep learning, Sharing information, Latent space, Standardization
Funding Records
Funding Reference: 813391
Funding Entity: European Commission