Export Publication

The publication can be exported in the following formats: APA (American Psychological Association) reference, IEEE (Institute of Electrical and Electronics Engineers) reference, BibTeX, and RIS.

Export Reference (APA)
Farkhari, H., Viana, J., Nidhi, Campos, L. M., Sebastião, P., Mihovska, A., ... Bernardo, L. (2021). Latent space transformers for generalizing deep networks. In IEEE (Ed.), 2021 IEEE Conference on Standards for Communications and Networking (CSCN). Virtual Online: IEEE.
Export Reference (IEEE)
H. Farkhari et al., "Latent space transformers for generalizing deep networks", in 2021 IEEE Conf. on Standards for Communications and Networking (CSCN), IEEE, Ed., Virtual Online, IEEE, 2021.
Export BibTeX
@inproceedings{farkhari2021_1766221908605,
	author = "Farkhari, H. and Viana, J. and {Nidhi} and Campos, L. M. and Sebastião, P. and Mihovska, A. and Mehta, P. L. and Bernardo, L.",
	title = "Latent space transformers for generalizing deep networks",
	booktitle = "2021 IEEE Conference on Standards for Communications and Networking (CSCN)",
	year = "2021",
	editor = "{IEEE}",
	doi = "10.1109/CSCN53733.2021.9686099",
	publisher = "IEEE",
	address = "Virtual Online",
	url = "https://ieeexplore.ieee.org/xpl/conhome/9685016/proceeding"
}
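
The BibTeX record above can also be consumed programmatically. The following is a minimal Python sketch that reads a single flat entry of this shape into a dictionary; the file name farkhari2021.bib and the regex-based parser are illustrative assumptions, not part of the repository's export tooling (a dedicated library such as bibtexparser would be more robust for general BibTeX).

import re

# Illustrative parser for a single flat BibTeX entry with double-quoted
# field values, like the record above. Not a general BibTeX parser.
def parse_bibtex_entry(text):
    entry = {}
    header = re.search(r"@(\w+)\{([^,]+),", text)
    if header:
        entry["type"], entry["key"] = header.group(1), header.group(2)
    # Capture field = "value" pairs; assumes values contain no escaped quotes.
    for field, value in re.findall(r'(\w+)\s*=\s*"([^"]*)"', text):
        entry[field] = value
    return entry

if __name__ == "__main__":
    # Hypothetical file name holding the record shown above.
    with open("farkhari2021.bib", encoding="utf-8") as handle:
        entry = parse_bibtex_entry(handle.read())
    print(entry["key"])    # farkhari2021_1766221908605
    print(entry["doi"])    # 10.1109/CSCN53733.2021.9686099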
Export RIS
TY  - CPAPER
TI  - Latent space transformers for generalizing deep networks
T2  - 2021 IEEE Conference on Standards for Communications and Networking (CSCN)
AU  - Farkhari, H.
AU  - Viana, J.
AU  - Nidhi
AU  - Campos, L. M.
AU  - Sebastião, P.
AU  - Mihovska, A.
AU  - Mehta, P. L.
AU  - Bernardo, L.
PY  - 2021
DO  - 10.1109/CSCN53733.2021.9686099
CY  - Virtual Online
UR  - https://ieeexplore.ieee.org/xpl/conhome/9685016/proceeding
AB  - Sharing information between deep networks is not a simple task nowadays. In a traditional approach, researchers change and train the layers at the end of a pretrained deep network, while the other layers remain the same, to adapt it to their purposes or develop a new deep network. In this paper, we propose a novel concept for interoperability in deep networks. Generalizing such networks’ usability will facilitate the creation of new hybrid models, promoting innovation and disruptive use cases for deep networks in fifth-generation wireless communications (5G) networks and increasing the accessibility, usability, and affordability of these products. The main idea is to use a standard latent space transformation to share information between such networks. First, each deep network should be split into two parts by its creators. After that, they should provide access to a standard latent space. Since each deep network should do this, we suggest a standard for the procedure. By adding the latent space, we can combine two deep networks using the latent transformer block, the only block that needs to be trained when connecting different pretrained deep networks. The resulting combination creates a new network with a unique ability. This paper contributes a concept for the generalization of deep networks using latent transformers, optimizing the utilization of the edge and the cloud in 5G telecommunications, controlling load balancing, saving bandwidth, and decreasing the latency caused by cumbersome computations. We provide a review of the current standardization associated with deep networks and Artificial Intelligence in general. Lastly, we present some use cases in 5G supporting the proposed concept.
ER  -
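
The RIS record follows the usual "TAG  - value" line format, with repeated AU tags for the authors and ER marking the end of the record. The sketch below, assuming the record is saved as farkhari2021.ris (a hypothetical file name), shows one way to collect it into a dictionary of lists using only the Python standard library.

# Illustrative RIS reader for a record like the one above; repeated tags
# (e.g. AU) are accumulated into lists.
def parse_ris(text):
    record = {}
    for line in text.splitlines():
        if len(line) < 5 or line[2:5] != "  -":
            continue                 # not a "TAG  - value" line
        tag, value = line[:2], line[6:].strip()
        if tag == "ER":              # end-of-record marker
            break
        if value:
            record.setdefault(tag, []).append(value)
    return record

if __name__ == "__main__":
    with open("farkhari2021.ris", encoding="utf-8") as handle:
        record = parse_ris(handle.read())
    print(record["TI"][0])   # Latent space transformers for generalizing deep networks
    print(record["AU"])      # ['Farkhari, H.', 'Viana, J.', ...]
    print(record["DO"][0])   # 10.1109/CSCN53733.2021.9686099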