Tobias Zimmermann

Contact

Phone: +49 631 205-5125
E-mail: t_zimmer [at] cs [dot] uni-kl [dot] de
Office: Building 48, Room 455
Gottlieb-Daimler-Straße
67663 Kaiserslautern
Germany

Research Interests

  • Machine learning
  • Deep learning
  • Wearable computing

Curriculum Vitae

02/2017 - now: AG wearHEALTH, University of Kaiserslautern
01/2013 - 07/2017: Study of Computer Science, University of Kaiserslautern
Degree: Master of Science
10/2005 - 12/2012: Study of Computer Science, University of Kaiserslautern
Degree: Bachelor of Science

Publications

Liwicki, M., Weber, M., Zimmermann, T. and Dengel, A. (2012), Seamless Integration of Handwriting Recognition into Pen-Enabled Displays for Fast User Interaction, In Proceedings of the 2012 10th IAPR International Workshop on Document Analysis Systems. Washington, DC, USA, March 2012. Vol. 12(5), pp. 364-368. IEEE Computer Society.
Abstract: This paper proposes a framework for the integration of handwriting recognition into natural user interfaces. As more and more pen-enabled touch displays are available, we make use of the distinction between touch actions and pen actions. Furthermore, we apply a recently introduced mode detection approach to distinguish between handwritten strokes and graphics drawn with the pen. These ideas are implemented in the Touch & Write SDK which can be used for various applications. In order to evaluate the effectiveness of our approach, we have conducted experiments for an annotation scenario. We asked several users to mark and label several objects in videos. We have measured the labeling time when using our novel user interaction system and compared it to the time needed when using common labeling tools. Furthermore, we compare our handwritten input paradigm to other existing systems. It turns out that the annotation is performed much faster when using our method and the user experience is also much better.
BibTeX:
@inproceedings{Liwicki:2012:SIH:2223950.2224138,
  author = {Liwicki, Marcus and Weber, Markus and Zimmermann, Tobias and Dengel, Andreas},
  title = {Seamless Integration of Handwriting Recognition into Pen-Enabled Displays for Fast User Interaction},
  booktitle = {Proceedings of the 2012 10th IAPR International Workshop on Document Analysis Systems},
  publisher = {IEEE Computer Society},
  year = {2012},
  volume = {12},
  number = {5},
  pages = {364--368},
  url = {http://dx.doi.org/10.1109/DAS.2012.78},
  doi = {10.1109/DAS.2012.78}
}
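
The interaction scheme described in the abstract above rests on two decisions: touch and pen events are routed to different handlers, and pen strokes are further classified as handwriting or graphics by a mode detector. The sketch below illustrates that control flow only, under stated assumptions: the class names and the bounding-box heuristic are hypothetical stand-ins, not the Touch & Write SDK's actual API or the paper's learned mode-detection approach.

Sketch (Python):
from dataclasses import dataclass
from enum import Enum, auto
from typing import List, Tuple

class InputSource(Enum):
    PEN = auto()
    TOUCH = auto()

class StrokeMode(Enum):
    HANDWRITING = auto()
    GRAPHICS = auto()

@dataclass
class Stroke:
    source: InputSource
    points: List[Tuple[float, float]]  # sampled (x, y) positions

def classify_stroke(stroke: Stroke) -> StrokeMode:
    # Hypothetical stand-in for the paper's mode detector: assume
    # handwriting produces small, compact strokes, while drawn
    # graphics (outlines, sketches) cover larger bounding boxes.
    xs = [x for x, _ in stroke.points]
    ys = [y for _, y in stroke.points]
    extent = max(max(xs) - min(xs), max(ys) - min(ys))
    return StrokeMode.HANDWRITING if extent < 60 else StrokeMode.GRAPHICS

def dispatch(stroke: Stroke) -> str:
    # Touch manipulates the view; only pen strokes reach the classifier.
    if stroke.source is InputSource.TOUCH:
        return "pan/zoom gesture"
    if classify_stroke(stroke) is StrokeMode.HANDWRITING:
        return "forward to handwriting recognizer"
    return "keep as drawn annotation"
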
Zimmermann, T., Weber, M., Liwicki, M. and Stricker, D. (2012), CoVidA: Pen-based Collaborative Video Annotation, In Proceedings of the 1st International Workshop on Visual Interfaces for Ground Truth Collection in Computer Vision Applications. New York, NY, USA, May 2012. Vol. 1(10), pp. 10:1-10:6. ACM.
Abstract: In this paper, we propose a pen-based annotation tool for videos. Annotating videos is an exhausting task, but it has a great benefit for several communities, as labeled ground truth data is the foundation for supervised machine learning approaches. Thus, there is a need for an easy-to-use tool which assists users with labeling even complex structures. For outlining and labeling the shape of an object, we introduce a pen-based interface which combines pen and touch input. In our experiments we show that especially for complex structures the usage of a pen device improves the effectiveness of the outlining process.
BibTeX:
@inproceedings{Zimmermann:2012:CPC:2304496.2304506,
  author = {Zimmermann, Tobias and Weber, Markus and Liwicki, Marcus and Stricker, Didier},
  title = {CoVidA: Pen-based Collaborative Video Annotation},
  booktitle = {Proceedings of the 1st International Workshop on Visual Interfaces for Ground Truth Collection in Computer Vision Applications},
  publisher = {ACM},
  year = {2012},
  volume = {1},
  number = {10},
  pages = {10:1--10:6},
  doi = {10.1145/2304496.2304506}
}
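
The core artifact of a tool like CoVidA is a per-frame set of labeled outlines: the pen traces an object's shape and the associated tag becomes its label. A minimal container for that data might look like the following sketch; the names (Annotation, VideoAnnotations) and the frame-indexed layout are illustrative assumptions, not the tool's actual data model.

Sketch (Python):
from dataclasses import dataclass, field
from typing import Dict, List, Tuple

Point = Tuple[float, float]

@dataclass
class Annotation:
    label: str            # e.g. text recognized from a handwritten tag
    outline: List[Point]  # polygon traced with the pen

@dataclass
class VideoAnnotations:
    # frame index -> annotations visible on that frame
    frames: Dict[int, List[Annotation]] = field(default_factory=dict)

    def add(self, frame: int, ann: Annotation) -> None:
        self.frames.setdefault(frame, []).append(ann)

    def labels_at(self, frame: int) -> List[str]:
        return [a.label for a in self.frames.get(frame, [])]

# Usage: outline an object on frame 42 and tag it "car".
va = VideoAnnotations()
va.add(42, Annotation("car", [(10, 10), (80, 12), (75, 60), (12, 55)]))
assert va.labels_at(42) == ["car"]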