Export Publication

The publication can be exported in the following formats: APA (American Psychological Association) reference, IEEE (Institute of Electrical and Electronics Engineers) reference, BibTeX, and RIS.

Export Reference (APA)
Dias, J. M. S., Nande, P., Santos, P., Barata, N., & Correia, A. (2021). Image manipulation through gestures. In Marcos, A., Mendonça, A., Leitão, M., Costa, A., & Jorge, J. (Eds.), Atas do 12º Encontro Português de Computação Gráfica (pp. 111-118). Porto: Eurographics Association.
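
For illustration, here is a minimal sketch of how a reference string like the one above could be assembled from structured metadata. This is not the repository's actual export code, and all field and variable names are hypothetical:

record = {
    "authors": ["Dias, J. M. S.", "Nande, P.", "Santos, P.", "Barata, N.", "Correia, A."],
    "year": 2021,
    "title": "Image manipulation through gestures",
    "editors": ["Marcos, A.", "Mendonça, A.", "Leitão, M.", "Costa, A.", "Jorge, J."],
    "booktitle": "Atas do 12º Encontro Português de Computação Gráfica",
    "pages": "111-118",
    "city": "Porto",
    "publisher": "Eurographics Association",
}

def join_apa(names):
    # APA joins the final name with ", &" rather than "and".
    return names[0] if len(names) == 1 else ", ".join(names[:-1]) + ", & " + names[-1]

apa = (
    f"{join_apa(record['authors'])} ({record['year']}). {record['title']}. "
    f"In {join_apa(record['editors'])} (Eds.), {record['booktitle']} "
    f"(pp. {record['pages']}). {record['city']}: {record['publisher']}."
)
print(apa)
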
Export Reference (IEEE)
J. M. Dias et al., "Image manipulation through gestures," in Atas do 12º Encontro Português de Computação Gráfica, A. Marcos, A. Mendonça, M. Leitão, A. Costa, and J. Jorge, Eds., Porto: Eurographics Association, 2021, pp. 111-118.
Export BibTeX
@inproceedings{dias2021_1732403437298,
	author = "Dias, J. M. S. and Nande, P. and Santos, P. and Barata, N. and Correia, A.",
	title = "Image manipulation through gestures",
	booktitle = "Atas do 12º Encontro Português de Computação Gráfica",
	year = "2021",
	editor = "Marcos, A., Mendonça, A., Leitão, M., Costa, A., and Jorge, J.",
	volume = "",
	number = "",
	series = "",
	doi = "10.2312/pt.20031431",
	pages = "111-118",
	publisher = "Eurographics Association",
	address = "Porto",
	organization = "Eurographics Association",
	url = "https://diglib.eg.org/handle/10.2312/2633098"
}
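
As a rough sketch of how the BibTeX record above could be consumed programmatically, the fields can be pulled out with a simple regular expression. A real workflow would use a proper parser (for example, the bibtexparser Python package), and the file name below is hypothetical:

import re

# Minimal sketch: extract 'key = "value"' pairs from a single BibTeX entry.
# Real .bib files (braced values, nesting, multiple entries) need a proper parser.
def parse_bibtex_entry(entry: str) -> dict:
    fields = dict(re.findall(r'(\w+)\s*=\s*"([^"]*)"', entry))
    head = re.search(r'@(\w+)\{([^,]+),', entry)
    fields["_type"], fields["_key"] = head.groups()
    return fields

with open("dias2021.bib", encoding="utf-8") as f:  # hypothetical file name
    rec = parse_bibtex_entry(f.read())
print(rec["_key"], rec["doi"], rec["pages"])
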
Export RIS
TY  - CPAPER
TI  - Image manipulation through gestures
T2  - Atas do 12º Encontro Português de Computação Gráfica
AU  - Dias, J. M. S.
AU  - Nande, P.
AU  - Santos, P.
AU  - Barata, N.
AU  - Correia, A.
A2  - Marcos, A.
A2  - Mendonça, A.
A2  - Leitão, M.
A2  - Costa, A.
A2  - Jorge, J.
PY  - 2021
SP  - 111
EP  - 118
DO  - 10.2312/pt.20031431
CY  - Porto
PB  - Eurographics Association
UR  - https://diglib.eg.org/handle/10.2312/2633098
AB  - In this work, we present a novel free-hand gesture user interface based on detecting the trajectory of fiducial markers attached to the user's fingers and wrists, able to interact with a sequence of images of a digital video piece. The model adopted for the video representation is based on its decomposition into a sequence of frames, or filmstrip. Sensor-less and cable-less interfaces provide the means for a user to intuitively interact through gestures with the filmstrip within the framework of an Augmented Virtuality usage scenario. By simply gesturing, users can select at random, drag, release, delete or zoom image frames, browse the filmstrip at a controlled user-defined rate, and issue start, end, stop and play commands to better control the digital video sequence. A fixed video camera monitors the user's gestures with the fiducial markers. This scheme enables the system to sidestep the more complex problem of markerless free-hand gesture tracking. Once the computer vision layer detects and recognises the markers in real-time, the system obtains the marker centres' 3D pose (position and orientation) relative to a virtual camera reference frame whose mathematical model matches the real video camera. We are specifically interested in obtaining the poses of the left and right wrists, the left and right thumbs, and the left and right index fingers. By projecting the positions of these poses in the 2D visualisation window, a simple topological analysis based on the kinematic evolution of distances and angles can be implemented, enabling gesture recognition and the activation of system functions and, subsequently, of specific gesture-based user interaction for a given active functionality. This interaction affects the shape, scale factor, position and visualisation of scene objects, that is, filmstrip frames. For the computer vision layer, our system adopts ARToolKit, a C/OpenGL-based open-source library that uses accurate vision-based tracking methods to determine the virtual camera pose through the real-time detection of fiducial markers. The graphical output is implemented with C++/OpenGL. Our proposed system is general because it can interact with any filmstrip obtained 'a priori' from a digital video source.
ER  -
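
The RIS record follows a simple line-oriented format: a two-letter tag, two spaces, a hyphen, and the value, with ER closing the record. Below is a minimal reader, a sketch only: repeatable tags such as AU accumulate into lists, and the file name is hypothetical.

def parse_ris(text: str) -> dict:
    record = {}
    for line in text.splitlines():
        tag, sep, value = line[:2], line[2:6], line[6:].strip()
        if sep.rstrip() != "  -":
            continue  # not a tagged line; ignored in this sketch
        if tag == "ER":
            break  # end of record
        record.setdefault(tag, []).append(value)
    return record

with open("dias2021.ris", encoding="utf-8") as f:  # hypothetical file name
    rec = parse_ris(f.read())
print(rec["TI"][0], "by", ", ".join(rec["AU"]))
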