National Repository of Grey Literature
Robotic Workplace Programming Using Microsoft HoloLens 2
Hiadlovská, Simona ; Beran, Vítězslav (referee) ; Bambušek, Daniel (advisor)
This thesis focuses on the usability of the mixed reality head-mounted display Microsoft HoloLens 2 for programming a robotic workplace. Use of the headset is demonstrated by a newly created user interface that builds on the existing interface AREditor, connected to ARServer. It allows the user to add 3D objects of robots and collision objects to the workplace scene and to manipulate them. Users can then add specific tasks to the created scenes, using 3D action objects and action points to determine the type of action and the place of its execution. Actions can be connected by links that determine the order in which they are performed. All functions are available in a simple menu, which is displayed whenever the user looks at their hand. The resulting user interface is evaluated in user experiments, in which participants completed simple tasks with both the proposed interface and the existing AREditor interface.
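
The sketch below is not code from the thesis; it is a minimal illustration, in Python, of the scene and task model the abstract describes: action points placed in the workplace, actions bound to those points, and links that fix the execution order. All class and variable names (ActionPoint, Action, Task, pick_part, place_part) are assumptions chosen for illustration; the actual AREditor/ARServer data structures may differ.

from dataclasses import dataclass, field


@dataclass
class ActionPoint:
    # A named 3D position in the workplace scene where actions can execute.
    name: str
    position: tuple[float, float, float]


@dataclass
class Action:
    # An action of a given type (e.g. pick, place) bound to an action point.
    name: str
    action_type: str
    point: ActionPoint


@dataclass
class Task:
    # A task: a set of actions plus links (ordered pairs) defining their order.
    actions: list[Action] = field(default_factory=list)
    links: list[tuple[Action, Action]] = field(default_factory=list)

    def execution_order(self) -> list[Action]:
        # Resolve the chain of links into a linear execution order.
        successors = {src.name: dst for src, dst in self.links}
        targets = {dst.name for _, dst in self.links}
        # The starting action is the one no link points to.
        current = next(a for a in self.actions if a.name not in targets)
        order = [current]
        while current.name in successors:
            current = successors[current.name]
            order.append(current)
        return order


# Example: a simple pick-and-place task composed of two linked actions.
pick_point = ActionPoint("bin", (0.4, 0.1, 0.2))
place_point = ActionPoint("tray", (0.1, 0.5, 0.2))
pick = Action("pick_part", "pick", pick_point)
place = Action("place_part", "place", place_point)
task = Task(actions=[pick, place], links=[(pick, place)])
print([a.name for a in task.execution_order()])  # ['pick_part', 'place_part']

In this reading, the links form a simple chain, so resolving the execution order amounts to following successor links from the action that has no incoming link.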
