Export Publication

The publication can be exported in the following reference formats: APA (American Psychological Association), IEEE (Institute of Electrical and Electronics Engineers), BibTeX, and RIS.

Export Reference (APA)
Gaspar, F., Bastos, R., & Dias, M. S. (2011). Accurate infrared tracking system for immersive virtual environments. International Journal of Creative Interfaces and Computer Graphics, 2(2), 49-73. https://doi.org/10.4018/jcicg.2011070104
Export Reference (IEEE)
F. A. Gaspar, R. Bastos, and M. S. Dias, "Accurate infrared tracking system for immersive virtual environments," International Journal of Creative Interfaces and Computer Graphics, vol. 2, no. 2, pp. 49-73, 2011.
Export BibTeX
@article{gaspar2011_1716185473477,
	author = "Gaspar, F. and Bastos, R. and Dias, M. S.",
	title = "Accurate infrared tracking system for immersive virtual environments",
	journal = "International Journal of Creative Interfaces and Computer Graphics",
	year = "2011",
	volume = "2",
	number = "2",
	doi = "10.4018/jcicg.2011070104",
	pages = "49-73",
	url = "http://www.igi-global.com/gateway/article/60536"
}
Export RIS
TY  - JOUR
TI  - Accurate infrared tracking system for immersive virtual environments
T2  - International Journal of Creative Interfaces and Computer Graphics
VL  - 2
IS  - 2
AU  - Gaspar, F.
AU  - Bastos, R.
AU  - Dias, M. S.
PY  - 2011
SP  - 49
EP  - 73
SN  - 1947-3117
DO  - 10.4018/jcicg.2011070104
UR  - http://www.igi-global.com/gateway/article/60536
AB  - In large-scale immersive virtual reality (VR) environments, such as a CAVE, one of the most common problems is tracking the position of the user's head while he or she is immersed in this environment to reflect perspective changes in the synthetic stereoscopic images. In this paper, the authors describe the theoretical foundations and engineering approach adopted in the development of an infrared-optical tracking system designed for large scale immersive Virtual Environments (VE) or Augmented Reality (AR) settings. The system is capable of tracking independent retro-reflective markers arranged in a 3D structure in real time, recovering all possible 6DOF. These artefacts can be adjusted to the user's stereo glasses to track his or her head while immersed or used as a 3D input device for rich human-computer interaction (HCI). The hardware configuration consists of 4 shutter-synchronized cameras attached with band-pass infrared filters and illuminated by infrared array-emitters. Pilot lab results have shown a latency of 40 ms when simultaneously tracking the pose of two artefacts with 4 infrared markers, achieving a frame-rate of 24.80 fps and showing a mean accuracy of 0.93mm/0.51° and a mean precision of 0.19mm/0.04°, respectively, in overall translation/rotation, fulfilling the requirements initially defined.
ER  -
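
The exported RIS record above can also be read programmatically. The following is a minimal sketch using only the Python standard library, assuming a single-record file laid out exactly like the export above; the filename reference.ris is only an illustration.

# Minimal sketch: parse a single-record RIS export (layout as above) into a
# dict of tag -> values, using only the Python standard library.
# The filename "reference.ris" is hypothetical.
from collections import defaultdict

def parse_ris(path):
    """Return a dict mapping RIS tags (TY, AU, TI, ...) to lists of values."""
    record = defaultdict(list)
    with open(path, encoding="utf-8") as fh:
        for raw in fh:
            line = raw.rstrip("\n")
            # Every RIS line starts with a two-letter tag followed by "  - ".
            if len(line) >= 5 and line[2:5] == "  -":
                tag, value = line[:2], line[6:].strip()
                if tag == "ER":  # end-of-record marker
                    break
                if value:
                    record[tag].append(value)
    return dict(record)

if __name__ == "__main__":
    ref = parse_ris("reference.ris")
    authors = "; ".join(ref.get("AU", []))
    print(f'{authors} ({ref["PY"][0]}). {ref["TI"][0]}.')

Run against the record above, this prints the authors, year, and title on one line.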