The widespread adoption of collaborative robots depends heavily on reliable human-robot interaction in manufacturing. The basis for collaborative robot motion is a comprehensive model of the surrounding environment and, in particular, of the humans involved in the interaction. This requires, for instance, the recognition of speech, gestures, or emotions. Gesture recognition refers to perceiving meaningful expressions of human movement, involving the hands, arms, face, and body. Recently, there has been considerable interest in electromyogram (EMG) based natural interfaces that recognize the user's hand movements and translate them into digital commands. In this study, a 7-axis collaborative robot, an ABB YuMi®, was controlled via biosignals. The robot was adapted for remote control through the MYO armband, a wearable device that reads electromyography (EMG) signals from the upper forearm. In this way, the user could select robot movements with hand gestures, achieving interaction between the user and the robot. Experiments on an assembly line demonstrated the efficiency of the proposed solution.
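To make the control flow concrete, the following is a minimal Python sketch of how classified MYO gestures could be mapped to predefined robot motions. All identifiers here (GESTURE_TO_MOTION, RobotClient, dispatch) are illustrative assumptions rather than the study's actual implementation or the ABB/MYO APIs; only the five gesture labels correspond to the MYO armband's built-in poses.

```python
# Minimal sketch of the gesture-to-motion mapping described above.
# All names (GESTURE_TO_MOTION, RobotClient, dispatch) are illustrative
# assumptions, not the authors' implementation or the ABB/MYO APIs.

from dataclasses import dataclass

# Hypothetical mapping from the MYO armband's built-in poses
# to predefined YuMi motions.
GESTURE_TO_MOTION = {
    "fist": "close_gripper",
    "fingers_spread": "open_gripper",
    "wave_in": "move_to_pick_pose",
    "wave_out": "move_to_place_pose",
    "double_tap": "stop",
}

@dataclass
class RobotClient:
    """Placeholder for a connection to the YuMi controller
    (e.g. via ABB Robot Web Services); not a real API."""
    host: str

    def execute(self, motion: str) -> None:
        # A real system would trigger a RAPID routine or motion target here.
        print(f"[{self.host}] executing motion: {motion}")

def dispatch(robot: RobotClient, gesture: str) -> None:
    """Translate a classified EMG gesture into a robot motion command."""
    motion = GESTURE_TO_MOTION.get(gesture)
    if motion is None:
        return  # unknown or low-confidence gesture: do nothing (fail safe)
    robot.execute(motion)

if __name__ == "__main__":
    robot = RobotClient(host="yumi.local")
    # Simulated stream of classified gestures; a real system would receive
    # these from the MYO SDK's pose events after EMG classification.
    for gesture in ["wave_in", "fist", "wave_out", "fingers_spread", "double_tap"]:
        dispatch(robot, gesture)
```

Note the fail-safe design choice in the sketch: gestures that are unrecognized or classified with low confidence are simply ignored rather than mapped to a default motion, which is the conservative behavior one would want in a shared human-robot workspace.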