Saya is the new departmental robot receptionist (as seen on TV). This page serves mainly as a reference to her programming interface for interested students.

Saya is mechanically controlled by 22 McKibben-type pneumatic actuators in the face and neck (each receiving a value, translated to a 0–10 V level, over a USB port) and by DC motors in the neck and eyes (controlled simultaneously by a microcontroller that accepts simple commands over an RS-232 port). We provide a Java wrapper API for these controls, which also restricts parameters to safe values (of course, Saya has physical protection against excessive air pressure as well).
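To illustrate the safety-clamping idea, here is a minimal sketch of what such a wrapper might look like. The class name, method names, and the exact 0–10 V clamping policy are assumptions for illustration; the real Java API may differ.

```java
/**
 * Hypothetical sketch of an actuator wrapper that restricts
 * commanded levels to a safe range before they reach the hardware.
 * (Illustrative only; not the actual Saya API.)
 */
public class SayaActuators {
    public static final int ACTUATOR_COUNT = 22;
    private static final double MIN_VOLTS = 0.0;
    private static final double MAX_VOLTS = 10.0;

    private final double[] levels = new double[ACTUATOR_COUNT];

    /** Clamp the requested level to the safe 0-10 V range, store it, and return it. */
    public double set(int actuator, double volts) {
        if (actuator < 0 || actuator >= ACTUATOR_COUNT) {
            throw new IllegalArgumentException("actuator out of range: " + actuator);
        }
        double safe = Math.max(MIN_VOLTS, Math.min(MAX_VOLTS, volts));
        levels[actuator] = safe;
        return safe;
    }

    public double get(int actuator) {
        return levels[actuator];
    }

    public static void main(String[] args) {
        SayaActuators a = new SayaActuators();
        System.out.println(a.set(0, 12.5)); // out-of-range request, clamped to 10.0
        System.out.println(a.set(1, 3.3));  // in-range request, passed through
    }
}
```

The point of the clamp is that student code can experiment freely without being able to command unsafe pressures, mirroring the physical pressure protection in software.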

Additionally, Saya has a video camera with a regular USB interface in her left eye, and similarly straightforward connections to a microphone and speakers. For more details, see, e.g., this paper by Kobayashi et al.

The software that arrived with Saya consists of device drivers, controller firmware, and two components operating in a closed loop. The first component interfaces with the Microsoft Speech SDK to synthesize speech and produce facial expressions driven by a precompiled dictionary. The second component uses DirectShow to capture camera input and apply simple color filtering to locate the desired objects; it then drives the DC motors to center on them. The projects have been reorganized and converted to MS VS 8 format. Still, our first objective: MS garbage out, advanced cross-platform projects in. :)
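The color-filtering step of the second component can be sketched in a few lines of plain Java. The threshold values and the choice of red as the target color are assumptions for illustration; the shipped component works on live DirectShow frames rather than a synthetic image.

```java
import java.awt.image.BufferedImage;

public class ColorLocator {
    /**
     * Return the {x, y} centroid of pixels whose red channel clearly
     * dominates, or null if no such pixels are found. The thresholds
     * here are illustrative, not taken from the original component.
     */
    static double[] locateRed(BufferedImage img) {
        long sx = 0, sy = 0, n = 0;
        for (int y = 0; y < img.getHeight(); y++) {
            for (int x = 0; x < img.getWidth(); x++) {
                int rgb = img.getRGB(x, y);
                int r = (rgb >> 16) & 0xFF;
                int g = (rgb >> 8) & 0xFF;
                int b = rgb & 0xFF;
                if (r > 128 && r > 2 * g && r > 2 * b) {
                    sx += x; sy += y; n++;
                }
            }
        }
        return n == 0 ? null : new double[] { (double) sx / n, (double) sy / n };
    }

    public static void main(String[] args) {
        // Synthetic test frame: a red square at (40..59, 40..59).
        BufferedImage img = new BufferedImage(100, 100, BufferedImage.TYPE_INT_RGB);
        for (int y = 40; y < 60; y++)
            for (int x = 40; x < 60; x++)
                img.setRGB(x, y, 0xFF0000);
        double[] c = locateRed(img);
        System.out.println(c[0] + " " + c[1]); // centroid of the square
    }
}
```

In the closed loop, the offset between this centroid and the image center would be turned into simple RS-232 commands to the neck and eye DC motors, so the gaze converges on the tracked object.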

We seek highly capable students who want to breathe life into Saya. There are many project possibilities: face tracking and recognition, free speech recognition, speech synthesis, graphical simulation, a web interface, or anything else you can think of. For details, contact Shlomi Dolev or Michael Orlov.

Project reports

Media coverage

Press articles