Monday, 14 October 2013

Update on the pi bot simulator

It's been a little while and I've got a little further with my pi bot simulator. There's not a lot to describe on top of my earlier plans. Basically I now have:
  • A unity program that runs a simulation of a robot
  • A unity plugin that runs a small tcp server. It stores some values such as 'motor speed', 'sensor value', 'servo position'.
  • A python script that connects to the server to set values that unity reads for controlling the robot, or to retrieve values that unity stores from the 'sensors'.
My simulated robot now has 2 motors, 2 neck servos, 2 eye servos, front, back, left, right and bottom range finders and a couple of eye cameras. 

The server code is a bit of bog standard c++ use of sockets. It reads in a 4 byte message length followed by a text based command, interprets it and sends a response. The code is longer than I'd like in true c++ style, but the python equivalent is much more concise:

import socket
import struct


HOST = 'localhost'    # The remote host
PORT = 5152           # The same port as used by the server
s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
s.connect((HOST, PORT))

#handy recvall function to block until a fixed number of bytes are read
def recvall(sock, requested_bytes):
    total_data = b''
    while len(total_data) < requested_bytes:
        data = sock.recv(requested_bytes - len(total_data))
        if not data:
            raise ConnectionError("socket closed before full message received")
        total_data = total_data + data
    return total_data

#Recv a block of text as a msg with 4 byte length header
def RecvText():
    lenbytes = struct.unpack("<i", recvall(s, 4))[0]
    data = recvall(s, lenbytes)
    return data.decode()

print("Begin client loop")
while True:
    #read in a message to send
    val = input("Please input something\n")
    if val == "q": #bail out if quit requested
        break

    #get the number of bytes
    lenbytes = struct.pack("<i", len(val))

    #send the length header and the message, then print out the response
    s.sendall(lenbytes + val.encode())
    print(RecvText())

s.close()

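Since the real server lives inside the c++ unity plugin, it can be handy to have a throwaway Python stand-in for testing the client without unity running. Here's a rough sketch of the same 4 byte length + text framing on the server side — the handle_one function and the "ok ..." reply format are just my own invention for testing, not the plugin's actual behaviour:

```python
import socket
import struct
import threading

def recvall(sock, requested_bytes):
    """Block until exactly requested_bytes have been read."""
    total_data = b''
    while len(total_data) < requested_bytes:
        data = sock.recv(requested_bytes - len(total_data))
        if not data:
            raise ConnectionError("socket closed before full message received")
        total_data = total_data + data
    return total_data

def handle_one(listener):
    """Accept one connection, read one length-prefixed command, echo a reply.

    Framing matches the client: 4-byte little-endian length header, then the
    text itself. The "ok <command>" reply is illustrative only.
    """
    conn, _ = listener.accept()
    length = struct.unpack("<i", recvall(conn, 4))[0]
    command = recvall(conn, length).decode()
    reply = ("ok " + command).encode()
    conn.sendall(struct.pack("<i", len(reply)) + reply)
    conn.close()
```

Bind a listener on the port the client expects, hand it to handle_one on a thread, and the client script above can talk to it as if it were the plugin.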

Using this little script I can send commands such as:

set motor0 0.5

This results in:

  • The script posting the text "set motor0 0.5" to the server
  • The server decoding this and assigning 0.5 to the variable RequestedMotorPower[0]
  • Unity querying the latest requested power for motor0 and assigning it to the motor joint in the simulation
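The decoding step in the c++ plugin is along these lines — this Python sketch is hypothetical (the real server stores values in C++ variables like RequestedMotorPower), but it shows the shape of the command handling:

```python
# Hypothetical sketch of the command decoding the c++ plugin performs.
# "set <name> <value>" stores a float, "get <name>" reads one back.
# Command names and replies are illustrative, not the plugin's exact protocol.
def handle_command(text, store):
    parts = text.split()
    if len(parts) == 3 and parts[0] == "set":
        store[parts[1]] = float(parts[2])  # e.g. store['motor0'] = 0.5
        return "ok"
    if len(parts) == 2 and parts[0] == "get":
        return str(store.get(parts[1], 0.0))
    return "error: unrecognised command"
```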
It's the sort of thing that's much better described in video though, so here's a little demo:

Next up I'll be getting those camera feeds through to python somehow, and getting a slightly prettier python project going, with some proper modules to make the whole thing a bit more readable. In theory, from there I should be able to run the very same scripts on the raspberry pi and have it controlling the simulation.


  1. Chris,

    Just got done watching the video and love it.

    I am starting a new project on a 'RoboOne' style walking mech-robot and while the individual components are not 'difficult' (ie: the RC servos are each being upgraded with an Arduino based controller for I2C bus, force feedback, abs/rel. position, movement profiles, etc...) and a mash-up of other standard sensors (ultrasonic distance, cams, pressure switches), it is the system integration that has been the stumbling block, as I knew I needed a simulator to work out all the algorithms and interactions.

    The way you're doing it with Unity as the robot mockup and pulling the video/sensor inputs from the Unity stage (assuming there is a gamecontroller assigned as each 'sensor') is great, but using the tcp 'server' as the robot controller is the piece that really ties it together for me. I was looking at writing all that background (RTOS simulation) processing within Unity and things were getting out of control ;-)

    I still have to figure out if the built-in NVIDIA PhysX engine has enough depth/precision to affect the 'robot' and thus its sensors in a life-like manner so the server-based 'controller' can react properly to the sensor changes. It has all the base features that I need like ball-joints, hinges, joints, soft/hard bodies, etc... but I'm unsure of how mass and center of mass assignments really work within it, as I need a real-world sim on those to get values from the 'sensors' that will reflect numbers that are close (or at least tweak-able) to the real thing.

    Plus I don't think that Unity supports the PhysX Visual Debugger (PVD) at all (?).

    Do you have the Unity and server code setup as an open-source project on GitHub (or such) so others can play with it? I would love to jump-start myself with it and use it as a 'tutorial' if possible.

    Thanks again for posting and hope to see more soon on your PiBot....

    1. I've been moving forward with using Unity as the visual front-end for my simulator and am re-using my older ActionScript/Away3d code via PlayScript, but for the server-based controller I'm working with ROS. Massive overkill in some ways, but once you have the framework up and running it handles more things than I could ever do as a sole developer.

    2. Hi Sushi

      I don't have any stuff uploaded at the moment (will do eventually), but would be happy to send you a zip file with the relevant stuff in if you give me an email address. One thing worth bearing in mind is that my current version requires the pro version of unity as it involves a c++ plugin. I've had to go down this route as I'm doing stuff with cameras. However I suspect you could use the same principle with unity's builtin tcp/ip support and avoid the need for a plugin.

      In terms of the physx simulation, my guess is that precision wouldn't be the issue if you turned up the accuracy high enough. Nothing will ever compete with the real world, but you'll certainly get a fair way with it. If anything it'll be the 'random' nature of the real world that the simulation is missing!