The Mixed Initiative Experimental (MIX) Testbed is a research environment for Human-Robot Interaction (HRI) composed of two primary applications: the Operator Control Unit (OCU) and the Unmanned System Simulator (USSIM). The OCU interfaces with one or more Unmanned Systems (e.g., ground, air, and surface robots) simulated by USSIM applications, and is capable of simulating multiple concurrent tasks (e.g., radio communications, map updates). The Joint Architecture for Unmanned Systems (JAUS) provides the underlying communication protocol for MIX applications, allowing seamless integration of real or simulated Unmanned Systems developed by the ACTIVE Laboratory and its collaborators. This site contains additional documentation to help users get started with MIX.
MIX Version 6.X
NEWS (7/23/2013): MIX version 6.130723 released. Updated software and bugfixes. Includes the latest version of JAUS++ for improved communication, as well as RTSP video feed support.
NEWS (7/9/2012): Added additional papers by Jessie Y.C. Chen, Michael J. Barnes, and Zhihua Qu on supervisory control of multiple robots, all of which incorporated MIX.
NEWS (4/11/2012): MIX version 4.120411 released. Updated software and bugfixes.
NEWS (11/18/2011): MIX version 4.111118 released. Updated manual and tutorial programs.
NEWS (10/24/2011): MIX website opens.
The MIX Testbed features full logging capabilities for capturing performance data and synchronizing third-party logs in real time or post hoc. Virtual environments can be edited graphically, allowing quick creation of highly customized scenarios. The testbed is designed to be flexible enough to support multiple research questions: using different XML configuration files, the layout of the user interface and the set of concurrent tasks can be changed without editing any underlying source code. This allows rapid development of experimental scenarios with different scripts for simulated tasks and new graphical layouts. Through the use of JAUS, the OCU is regularly used to control real robotic platforms such as the Segway RMP. These robots are recognized automatically by the OCU, allowing control of multiple types of robots, with the ability to teleoperate via joystick, send GPS waypoints, and acquire streaming video, GPS, and compass data, all within a single user interface.
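To illustrate the XML-driven approach described above, a configuration file might resemble the sketch below. All element and attribute names here are invented for illustration; they do not reflect the actual MIX schema.

```xml
<!-- Hypothetical OCU configuration sketch; not the real MIX schema. -->
<ocu-config>
  <!-- Interface layout: which panels appear and where,
       changeable without touching source code -->
  <layout>
    <panel type="map" position="left"/>
    <panel type="video" position="right"/>
  </layout>
  <!-- Concurrent simulated tasks, each driven by its own script -->
  <tasks>
    <task name="radio-communications" script="radio_calls.xml" interval="60s"/>
    <task name="map-updates" script="map_events.xml"/>
  </tasks>
  <!-- Unmanned systems the OCU controls over JAUS -->
  <vehicles>
    <vehicle id="ugv-1" control="teleop,waypoints" video="rtsp"/>
  </vehicles>
</ocu-config>
```

In practice, swapping such a file between experimental conditions would let a researcher change the interface layout and task set without recompiling anything.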
- Open source, available for Windows and Linux
- Immersive 3D environment with multiple simulated Unmanned Systems
- Graphical scenario editor for actors, pathing, and events
- Configurable OCU Interface
- Real-time video streaming
- Terrain map display and interaction
- Real-time data logging and automatic data synchronization
- Configurable events such as audio cues, graphical prompts and environmental changes
MIX has been used in several experiments involving multiple simulated unmanned vehicles across a variety of scenarios, and a number of publications describe research conducted with the MIX Testbed.
- Barber, D.J., Leontyev, S., Sun, B., Davis, L., Chen, J.Y.C., & Nicholson, D. (2008). The Mixed-Initiative Experimental (MIX) Testbed for collaborative human robot interactions. Proceedings of the 2008 International Symposium on Collaborative Technologies and Systems (CTS), January 29, 483-489.
- Barber, D., Davis, L., Nicholson, D., Chen, J.Y.C., & Finkelstein, N. (2008). The Mixed Initiative Experimental (MIX) Testbed for human robot interactions with varied levels of automation. Proceedings of the 26th Annual Army Science Conference, December 1-4, ADA505701.
- Ortiz, E., Barber, D., Stevens, J., & Finkelstein, N. (2009). Simulation to assess an unmanned system's effect on team performance. Proceedings of the Interservice/Industry Training, Simulation, and Education Conference (I/ITSEC), November 30-December 3. Orlando, FL: I/ITSEC, 9105.
- Reinerman-Jones, L., Barber, D., Lackey, S.J., & Nicholson, D.M. (2010). Developing methods for utilizing physiological measures. To appear in Proceedings of the Applied Human Factors and Ergonomics Society Conference 2010 (AHFE 2010), July 17-20, 2010.
- Reinerman-Jones, L., Taylor, G., Sprouse, K., Barber, D., & Hudson, I. (2011). Adaptive automation as a task switching and task congruence challenge. Proceedings of the Human Factors and Ergonomics Society 55th Annual Meeting, September 19-23, 197-201.
- Chen, J.Y.C., Barnes, M.J., & Qu, Z. (2010). RoboLeader: An agent for supervisory control of multiple robots. Proceedings of the 5th ACM/IEEE International Conference on Human-Robot Interaction (HRI '10), March 2-5, 81-82.
- Chen, J.Y.C., & Barnes, M.J. (2012). Supervisory control of multiple robots: Effects of imperfect automation and individual differences. Human Factors: The Journal of the Human Factors and Ergonomics Society, April, 157-174.
- Talone, A., Fincannon, T., Schuster, D., Jentsch, F., & Hudson, I. (2013). Comparing physical and virtual simulation use in UGV research: Lessons learned from HRI research with two test beds. To appear in Proceedings of the Human Factors and Ergonomics Society International Annual Meeting 2013, September 30-October 4.
The layout of the MIX GUI, the OCU, is highly customizable, supporting multiple experimental questions. The OCU can also be used with any real robotic platform running JAUS, including that platform's vision systems. The 3D scenarios themselves can be edited extensively, mostly through a simple graphical editor.
Videos of MIX are available for download, including several kinds of scenarios, the scenario editor, and the view from a simulated XUV vehicle.
The latest version of MIX and its related content can be downloaded directly.