The Iris.3 mobile robot

Introduction & Current Status

The mobile robot (mobot) group is committed to the design and implementation of a robot capable of [mpe Insert ultimate mobot goal here].

The primary disciplines involved in the Iris.3 Mobot group are Computer Science and Industrial Technology. The immediate goal is to develop a robot capable of playing Tic-Tac-Toe. There are three main areas of focus in reaching this goal: sensors (vision), effectors (arms), and the Tic-Tac-Toe engine.

The primary sensor for the Tic-Tac-Toe project is a Connectix camera. We have been developing an interface to this camera that suits our needs, and it is nearly up and running. Once the camera interface is done, the next step will be to train the Artificial Neural Network (ANN) software to recognize the X's and O's on the playing field.
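
As a rough illustration of what that training step involves, the sketch below trains a single-unit network on hand-made 5x5 pixel images of an X and an O. The image size, the training images, and the network itself are all invented for illustration; this is not the ANN software actually used on the robot.

# Minimal sketch of training a classifier to tell X's from O's.
# Everything here (5x5 images, a single logistic unit, the labels) is
# illustrative, not the ANN package actually used on Iris.
import numpy as np

X_IMG = np.array([[1, 0, 0, 0, 1],
                  [0, 1, 0, 1, 0],
                  [0, 0, 1, 0, 0],
                  [0, 1, 0, 1, 0],
                  [1, 0, 0, 0, 1]], dtype=float)

O_IMG = np.array([[0, 1, 1, 1, 0],
                  [1, 0, 0, 0, 1],
                  [1, 0, 0, 0, 1],
                  [1, 0, 0, 0, 1],
                  [0, 1, 1, 1, 0]], dtype=float)

# Training set: flattened images, label 1 = "X", label 0 = "O".
data = np.stack([X_IMG.ravel(), O_IMG.ravel()])
labels = np.array([1.0, 0.0])

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(0)
w = rng.normal(scale=0.1, size=25)      # one weight per pixel
b = 0.0

# Plain gradient descent on the cross-entropy loss.
for _ in range(500):
    err = sigmoid(data @ w + b) - labels
    w -= 0.5 * (data.T @ err) / len(labels)
    b -= 0.5 * err.mean()

for img, name in [(X_IMG, "X"), (O_IMG, "O")]:
    print(f"{name}: P(image is an X) = {sigmoid(img.ravel() @ w + b):.2f}")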

The effector is a single arm, which is currently being redesigned. The arm is driven by a series of servo motors, which are in turn controlled by a Jameco controller card that interfaces to the computer over a 9-pin serial connection. Once the arm design is finalized, the next step will be to develop scripts to move the arm.
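
To give a feel for what those arm scripts might look like, here is a minimal sketch of sending servo positions to a controller over the serial port using the pyserial library. The port name, baud rate, and especially the command string are placeholders; the real command syntax has to come from the Jameco controller's documentation.

# Minimal sketch of commanding a servo over a 9-pin serial link.
# The command format below is a placeholder, NOT the Jameco SVO203's
# actual protocol; consult the controller documentation for that.
import time

import serial  # pyserial

PORT = "COM1"   # assumed port name; depends on the host machine
BAUD = 9600     # assumed baud rate

def move_servo(link, servo_id, position):
    """Send one (hypothetical) ASCII position command to the controller."""
    cmd = f"SV{servo_id} {position}\r"   # placeholder command syntax
    link.write(cmd.encode("ascii"))

with serial.Serial(PORT, BAUD, timeout=1) as link:
    # Sweep servo 0 through a few positions, pausing between moves.
    for pos in (50, 128, 200, 128):
        move_servo(link, 0, pos)
        time.sleep(0.5)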

We have two Tic-Tac-Toe engines that were developed by past students. The engines need to be evaluated so that we can choose which one to use; the chosen engine then needs to be integrated into the SIE environment.
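
Neither of those engines is reproduced here, but the sketch below illustrates the kind of interface a Tic-Tac-Toe engine needs to expose to the rest of the system: given a board position, return the move it wants to play. It uses plain minimax rather than learning, purely as an illustration; the names and board representation are made up.

# Illustrative Tic-Tac-Toe engine (plain minimax, not a learning program,
# and not either of the student-written engines). The board is a list of
# nine cells containing "X", "O", or " ".
LINES = [(0,1,2), (3,4,5), (6,7,8), (0,3,6), (1,4,7), (2,5,8), (0,4,8), (2,4,6)]

def winner(board):
    for a, b, c in LINES:
        if board[a] != " " and board[a] == board[b] == board[c]:
            return board[a]
    return None

def best_move(board, player):
    """Return (score, cell) of the best move for `player` (+1 win, 0 draw, -1 loss)."""
    opponent = "O" if player == "X" else "X"
    won = winner(board)
    if won == player:
        return 1, None
    if won == opponent:
        return -1, None
    if " " not in board:
        return 0, None                      # draw
    best = (-2, None)
    for cell in range(9):
        if board[cell] == " ":
            board[cell] = player
            score = -best_move(board, opponent)[0]
            board[cell] = " "
            if score > best[0]:
                best = (score, cell)
    return best

# Example: X to move; the engine should block O's threat in the right column.
board = ["X", " ", "O",
         " ", "X", " ",
         " ", " ", "O"]
print("X plays cell", best_move(board, "X")[1])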

SIE, the Shelley Integrated Environment, was developed by Illinois Wesleyan University's Shelley Research Group for their Shelley robot, and we have adapted it for our project. SIE forms the backbone of the mobot system: performing a role analogous to an operating system, it is responsible for resource allocation and communication between the different programs that make up the mobot. Please see the SIE page for more information. The current state of the software making up the Iris.3 project is outlined below.
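
We do not document SIE's protocol here, but the sketch below shows the general idea behind that "operating system" role: a central hub that relays messages between the other programs. The socket-based design, port number, and line-per-message format are assumptions made purely for illustration; they are not SIE's actual implementation.

# Illustrative message hub: relays each line received from one client to
# every other connected client. This is NOT SIE's design or protocol,
# just a sketch of a central control program's communication role.
import socket
import threading

HOST, PORT = "127.0.0.1", 9000          # assumed address for the hub
clients = []
lock = threading.Lock()

def handle(conn):
    with conn, conn.makefile("r") as reader:
        for line in reader:             # one message per line
            with lock:
                for other in clients:
                    if other is not conn:
                        try:
                            other.sendall(line.encode())
                        except OSError:
                            pass        # ignore clients that have gone away
    with lock:
        clients.remove(conn)

server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
server.bind((HOST, PORT))
server.listen()
print(f"hub listening on {HOST}:{PORT}")
while True:
    conn, _ = server.accept()
    with lock:
        clients.append(conn)
    threading.Thread(target=handle, args=(conn,), daemon=True).start()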

Cognitive Engines (Primary AI software)

There are several pieces of AI software that perform specialized tasks (e.g., speaking and understanding language, and learning to play Tic-Tac-Toe). There must also be a central control program that allows all of these pieces of software to communicate with one another.

  • Central Control Programs

    At the heart of our "top-down" robots is a piece of software that connects all other programs and devices together. For the first phase of the Iris.3 mobile robot project we will be using a program called SIE (Shelley Integrated Environment) developed by students working on the Shelley robot at Illinois Wesleyan University.

    • Shelley Integrated Environment (SIE)

    • Central Control Program

      While we are currently using SIE as our central control program (CCP), we plan to write our own program in the future. The CCP research group at Illinois State is committed to creating a new central control program for our future generations of robots. We are still in the planning stages and welcome any input that others might have as we seek to design an effective program for undergraduate research projects. For more information, visit the CCP page.

    • Student Researchers:

      Matt Eisenbraun (Student Leader)

  • ProtoThinker V4.11 for Windows

    This is the primary AI engine that we use for natural language processing, belief acquisition, inference engines, and moral reasoning.

  • TicTacToe

    The Iris robots presently have two Tic-Tac-Toe learning programs, one written by Matt Carlson and one written by Neil Ryan. Our commitment is to develop webpages on machine learning that explain a wide range of learning strategies.

Sensors (input devices)

  • Vision System

    The Iris.2 robot uses a Connectix camera running under Windows and an off-the-shelf neural net program that has been trained to recognize X's and O's. It also has a de-skewing program to compensate for the angle of the camera; a sketch of the de-skewing idea appears at the end of this section. One possibility for the next generation, Iris.3, is to run the Connectix camera under Linux (Linux drivers are now available for the standard color camera, though not for the USB model) and to use a new neural net program written by Matt Carlson.

    Student Researchers:

    • Andy Schmidgall (Screen Capture)

    • Marlon Jacobs (Neural Nets)

    Components of the Vision System:

    • Video Camera

    • Video Capture Program

    • Artificial Neural Net

    Vision Resources
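
    As mentioned in the description above, the camera sees the board at an angle, so each frame has to be de-skewed before the neural net looks at it. The sketch below shows one common way to do this, a perspective warp using OpenCV; the file name, corner coordinates, and output size are made-up values, and this is not the actual de-skewing program used on Iris.2.

    # Illustrative de-skew: map the four corners of the board, as seen by the
    # camera, onto a square image. All coordinates here are placeholders; in
    # practice they would come from calibration or from locating the board.
    import cv2
    import numpy as np

    frame = cv2.imread("board.jpg")             # captured camera frame (assumed file)

    # Board corners in the camera image (top-left, top-right, bottom-right,
    # bottom-left) -- placeholder pixel coordinates.
    src = np.float32([[102, 80], [517, 95], [560, 410], [75, 398]])

    # Where those corners should land in the corrected image: a 300x300 square.
    dst = np.float32([[0, 0], [300, 0], [300, 300], [0, 300]])

    M = cv2.getPerspectiveTransform(src, dst)   # 3x3 perspective (homography) matrix
    flat = cv2.warpPerspective(frame, M, (300, 300))

    cv2.imwrite("board_deskewed.jpg", flat)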

  • Voice Recognition

    We are exploring many different strategies for voice recognition software.  The first to be implemented will probably be Dragon Dictate.

  • Optical Mobile Tracking System

    We do not yet know how we are going to give Iris.3 the ability to navigate around its environment. In particular, it must be capable of finding two positions in the room: the Tic-Tac-Toe board on one table and its toy blocks on another. We need a sensor expert to survey all of our options.

Output Devices

  • Robotic Arms

    Student Researchers:

    • Jeremy Tessendorf

    • Marlon Jacobs

    Arms & Controllers (We have several options):

    • Robix RCS-6 Robotic Kit

      Robix RCS-6 Kit: Controller

      Iris makes use of the Robix LPT Controller. Software is available from the Robix Support Site.

      Iris.2 has an RCS-6 Drawbot (for drawing X's and O's) and an RCS-6 Gripper (for picking up blocks).

    • Jameco SVO203 Controller & Standard servos

  • Mechanical Wheels

  • Speech

    TTS (text-to-speech): AT&T . . . (courtesy of Lucent Technologies)

  • Speakers

Electrical system

Student Leader: Jeremy Tessendorf

  • Electrical System

  • Batteries

  • Re-charger

Basic Hardware (Robot Body)

Student Leader: Kevin Krawzik

Mobile Robot Body

  • Shell

  • Miscellaneous: Plexiglass windows, fan, etc.

Computers

  • Laptop: Gateway Solo 9100, PII-266, 96 MB RAM, 5 GB HD

  • (2) Motherboards: P75 and P9