Národní úložiště šedé literatury (National Repository of Grey Literature) — 6 records found. Search took 0.01 seconds.
User Interface for ARTable and Microsoft Hololens
Bambušek, Daniel ; Španěl, Michal (reviewer) ; Kapinus, Michal (advisor)
This thesis focuses on the usability of a mixed reality head-mounted display, the Microsoft HoloLens, in a human-robot collaborative workspace, the ARTable. Use of the headset is demonstrated by a purpose-built user interface that helps regular workers understand the ARTable system better and faster. It allows learned programs to be visualized spatially, without the need to run the robot itself. The user is guided by 3D animations of individual programs and by the device's voice, which gives a clear idea of what will happen when a program is run directly on the robot. The solution also provides interactive guidance for the user when programming the robot. Using mixed reality displays further makes it possible to visualize valuable spatial information, such as the robot's perception.
Visual Programming of Robotic Applications
Ling, David ; Bambušek, Daniel (reviewer) ; Kapinus, Michal (advisor)
This bachelor's thesis deals with extending the functionality of the ARTable system, which is built on the ROS framework. As part of the thesis, the user interface was extended so that new programs for the PR2 robot can be created, and existing ones modified, directly on the touch table. All extensions are implemented in Python using the ROS and Qt frameworks. The extension is fully functional and integrated with the rest of the ARTable system.
Human-Robot Interaction: Advanced Task-centered Interfaces for Non-Expert Users
Materna, Zdeněk ; Ao.Univ.Prof. Dipl.-Ing. Dr.tech Markus Vincze (reviewer) ; Míkovec, Zdeněk (reviewer) ; Smrž, Pavel (advisor)
Recent years have brought a growing trend of deploying robots in novel applications where they are not only supposed to co-exist with and work next to humans, but to closely collaborate with them on shared complex tasks. The capabilities of robotic systems need to be substantially expanded to make such close, rich, and natural human-robot interaction possible. Indeed, the interaction will no longer happen only between caged robots and highly specialized experts. More and more often, it will connect safe, interactive robots with non-expert users from various backgrounds. Consequently, the amazingly complex machines that current robots are will become even more complex, which poses further challenges for the design of their user interfaces. The objective of this thesis is to research and develop solutions for close interaction between non-expert users and complex robots. The research was done in two different contexts: assistive service robots and industrial collaborative robots. Although these two domains have diverse requirements, related concepts can be used when designing the human-robot interaction. To cope with the limitations of current approaches, a novel method for task-centered interaction has been proposed. The most important aspects of the method are the use of mixed reality and robot-integrated capabilities, communication of the robot's inner state, context sensitivity, and the use of task-appropriate modalities. For each of the two domains, a user interface was designed and implemented. Both interfaces were successfully evaluated with non-expert users, who were able to carry out non-trivial tasks in cooperation with a robot. The reported evaluation provides evidence that the realized method significantly improves close human-robot interaction in ways that had not been entirely possible with previous approaches. The method's key characteristics provide guidelines for the design of future user interfaces in collaborative robotics.
