Machine learning (ML) has been applied in both research and clinical settings to make myoelectric prostheses more functional and more intuitive to use. ML techniques for myoelectric control require information about the environment in which a control system operates in order to make useful control decisions or predictions about a user’s desired control outcomes. Despite demonstrated gains in myoelectric control performance when additional information about users and their environments is included, the sensors in commercial prostheses are limited and typically do not provide diverse channels of contextual information to their respective control systems. Additional sensor information is crucial to demonstrating and evaluating the full potential of next-generation ML control systems. With this in mind, a novel, cost-effective research prosthesis was designed to provide real-time sensory information for ML-based myoelectric control. The device reports fingertip forces on independently controlled fingers, angular positions of individual finger joints, and visual information about the hand’s environment via a USB webcam integrated into the palm. Using 3D printing, the device was prototyped at a cost of less than $800 CAD. This work therefore contributes a new platform on which research groups can conduct ML research with prostheses, and allows researchers to develop new ML approaches with ample access to contextual information about prosthesis movement, prosthesis-environment interactions, and local changes to the environment surrounding the prosthesis. By providing an inexpensive, highly sensorized prosthetic hand, this work helps mitigate the cost of purchasing and retrofitting commercial prostheses with new sensors; it is therefore also expected to support related research into methods for sensory feedback from prostheses to users.
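To make the three sensory channels concrete, the sketch below shows one way the per-finger forces, per-joint angles, and a camera frame might be fused into a single context vector for an ML controller. All class names, field names, and units here are illustrative assumptions, not the device's actual API.

```python
from dataclasses import dataclass
from typing import List

# Hypothetical sensor packet mirroring the channels described above.
# Field names and units are assumptions for illustration only.
@dataclass
class HandState:
    fingertip_forces_n: List[float]  # one force reading (N) per finger
    joint_angles_deg: List[float]    # one angle (degrees) per finger joint
    camera_frame: List[List[int]]    # grayscale frame as rows of 0-255 pixels

def context_vector(state: HandState) -> List[float]:
    """Flatten the sensor channels into one feature vector that an
    ML control system could consume alongside EMG features."""
    image_features = [px / 255.0 for row in state.camera_frame for px in row]
    return state.fingertip_forces_n + state.joint_angles_deg + image_features

# Simulated reading: 5 fingers, 10 joints, and a tiny 2x2 "frame".
state = HandState(
    fingertip_forces_n=[0.5, 0.4, 0.3, 0.2, 0.1],
    joint_angles_deg=[10.0] * 10,
    camera_frame=[[0, 128], [255, 64]],
)
features = context_vector(state)
print(len(features))  # 5 forces + 10 angles + 4 pixels = 19
```

In a real deployment the camera frame would likely be reduced by a vision model rather than flattened raw, but the shape of the problem is the same: heterogeneous sensor streams concatenated into contextual input for the controller.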