Categories: Python, Robotics, ROS

TensorFlow / Keras Model Predict Error

Unexpected error: Tensor Tensor("dense_1/BiasAdd:0", shape=(?, 3), dtype=float32) is not an element of this graph.

To Fix:

Make sure to include the following global:

import tensorflow as tf

global graph, model
graph = tf.get_default_graph()

And surround the prediction as such:

with graph.as_default():
    segment = self.model.predict(features)
Categories: Posts, Programming, ROS

ROS – rviz X Window System error.

rosrun rviz rviz
[ INFO] [1322561467.803234807]: rviz revision number 1.6.7
[ INFO] [1322561467.803351442]: ogre_tools revision number 1.6.2
[ INFO] [1322561467.803374070]: compiled against OGRE version 1.7.3 (Cthugha)
[ INFO] [1322561467.946771146]: Loading general config from [/home/alex/.rviz/config]
[ INFO] [1322561467.946953292]: Loading display config from [/home/alex/.rviz/display_config]
[ INFO] [1322561467.972674791]: RTT Preferred Mode is PBuffer.
The program 'rviz' received an X Window System error.
This probably reflects a bug in the program.
The error was 'BadDrawable (invalid Pixmap or Window parameter)'.
  (Details: serial 22 error_code 9 request_code 137 minor_code 3)
  (Note to programmers: normally, X errors are reported asynchronously;
   that is, you will receive the error a while after causing it.
   To debug your program, run it with the --sync command line
   option to change this behavior. You can then get a meaningful
   backtrace from your debugger if you break on the gdk_x_error() function.)

I got this error after incorrectly compiling rviz with:

rosmake rviz rviz

You’ll want to re-compile it correctly, with only one argument, as follows:

rosmake rviz

Now when you run it, everything should work.

Categories: C/C++, Linux, Posts, Programming, Robotics, ROS

ROS: Publishing and Subscribing to Arrays

In ROS (Robot Operating System) it’s really easy to publish variables to be shared between nodes running on the same roscore; however, I was having some difficulty doing the same for arrays. The problem was that there were no real examples of how to use them, so here is a working example of std_msgs/*MultiArray in C++.

Publishing:

[gist][/gist]
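In case the gist embed above doesn’t render, here is a minimal sketch of what such a publisher looks like, assuming std_msgs::Float64MultiArray and a hypothetical node/topic name ("array_publisher"/"array"):

```cpp
// Hypothetical publisher sketch: node and topic names are illustrative.
#include "ros/ros.h"
#include "std_msgs/Float64MultiArray.h"

int main(int argc, char **argv)
{
    ros::init(argc, argv, "array_publisher");
    ros::NodeHandle n;
    ros::Publisher pub =
        n.advertise<std_msgs::Float64MultiArray>("array", 100);

    ros::Rate loop_rate(10);
    while (ros::ok())
    {
        std_msgs::Float64MultiArray msg;
        // The data field behaves like a std::vector<double>:
        // just push_back the values you want to send.
        for (int i = 0; i < 5; i++)
            msg.data.push_back(i * 1.5);

        pub.publish(msg);
        ros::spinOnce();
        loop_rate.sleep();
    }
    return 0;
}
```

The key point is that a MultiArray message is published like any scalar message; only the filling of msg.data differs.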

Subscribing:

[gist][/gist]
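Likewise, if the subscribing gist doesn’t render, a minimal sketch of the receiving side (same assumed topic name "array") looks like this:

```cpp
// Hypothetical subscriber sketch: node and topic names are illustrative.
#include "ros/ros.h"
#include "std_msgs/Float64MultiArray.h"

// Callback: the received array is in msg->data, a std::vector<double>.
void arrayCallback(const std_msgs::Float64MultiArray::ConstPtr &msg)
{
    for (size_t i = 0; i < msg->data.size(); i++)
        ROS_INFO("element %zu: %f", i, msg->data[i]);
}

int main(int argc, char **argv)
{
    ros::init(argc, argv, "array_subscriber");
    ros::NodeHandle n;
    ros::Subscriber sub = n.subscribe("array", 100, arrayCallback);
    ros::spin();  // hand control to ROS; callbacks fire as messages arrive
    return 0;
}
```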

Categories: Blog, Kinect, Machine Vision, Posts, Programming, Robotics

Kinect Development – Day 1

Head over to this page if you want some tutorials on getting started with the Kinect and libfreenect, I’ll update more as time goes on and I have free time.

I’ve been meaning to grab myself an Xbox 360 Kinect for a while, not because I’m a big motion-controlled game fan but for machine vision development. Within the first month of the Kinect open source drivers being released the coolest things were seen, from motion-controlled media centres to 3D modelling. I’ll admit, I’m a little late to the game, mostly due to the amount of work in my final year at university and general busyness. Over the summer I’ll have plenty of time to do a couple of projects and hopefully come up with something cool and contribute to the scene.

Anyway, enough of the small talk. I’ve decided to blog the journey through the development in as much detail as possible, from the installation of the libraries to writing the first and last bit of code, as a sort of set of tutorials for anyone else who wants to get into it.

There are currently two main sets of drivers/libraries out there, libfreenect and OpenNI, both sporting hip, cool, open source names. So which one do you choose? Well, here’s a brief description of both.

Let’s start with OpenNI. These are the official PrimeSense drivers (PrimeSense being the company Microsoft paid to actually create the Kinect); they allow access to audio, video and depth, with the addition of PrimeSense’s NITE middleware. NITE is the software library used for skeletal tracking, hand gesture recognition and scene analysis (separating figures in the foreground from the background).

Alternatively there are the libfreenect libraries, from the community over at openkinect.org. While these are admittedly lacking slightly in features such as skeletal tracking and hand gesture recognition, they more than make up for it in their dedication to open source and the creation of the best suite available. They give access to video, microphones, motors and the LED, with speakers currently being worked on. They have language wrappers for most OSes and will of course be my personal library of choice.

Fortunately, you won’t have to decide which one you’d prefer, because you can run them both on the same machine. However, you’ll have to look into the licensing information for releasing projects with OpenNI, so it’s unlikely you’ll want to combine them (or even be allowed to).

libfreenect Installation:

OpenKinect’s getting started page provides a well-documented installation guide that should let anyone get up and running under Windows, Linux or OS X, with Ubuntu being the distro of choice for the guide. – http://openkinect.org/wiki/Getting_Started

If you’re running Arch, there are a few AUR packages available; however, they all seem to have lacked updates for a few months. The manual build is pretty simple and covered on the getting started page, but I’ve also added a quick list of commands to get you there:

Grab the git copy of the libraries:

git clone https://github.com/OpenKinect/libfreenect.git
cd libfreenect/

Make, install:

mkdir build
cd build/
cmake ..
make
sudo make install
sudo ldconfig /usr/local/lib64/

Allow your user access to the Kinect by creating a udev rule, creating a group called video, and adding your user to it:
note: this can be skipped if you don’t mind running as root/sudo

sudo nano /etc/udev/rules.d/66-kinect.rules
sudo usermod -aG video username
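The contents of the rules file aren’t shown above; the OpenKinect getting started guide lists rules along these lines (the vendor/product IDs here are taken from that guide, so double-check them against your device with lsusb):

```
# Microsoft Kinect devices: grant the video group read/write access
SUBSYSTEM=="usb", ATTR{idVendor}=="045e", ATTR{idProduct}=="02ae", MODE="0660", GROUP="video"  # camera
SUBSYSTEM=="usb", ATTR{idVendor}=="045e", ATTR{idProduct}=="02ad", MODE="0660", GROUP="video"  # audio
SUBSYSTEM=="usb", ATTR{idVendor}=="045e", ATTR{idProduct}=="02b0", MODE="0660", GROUP="video"  # motor
```

Note that usermod needs the -a flag alongside -G; without it your user is removed from all other supplementary groups. You’ll need to log out and back in for the group change to take effect.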

Test the kinect with the example program:

bin/glview

If all went well you should have seen a sight similar to the screenshot above. If not, check out the OpenKinect page for more information and see whether the problems you’re having have already been resolved.

Categories: Blog, Posts, Robotics

Inverted Pendulum

As part of my Robotics university course we used an inverted pendulum rig to learn some control applications and algorithms. For extra marks we had the challenge of swinging the pendulum from hanging down to an upright balancing position. Below is my first attempt. It still needs a little tweaking of the values to reduce the number of swings needed, increase reliability and reduce the travel distance of the balancing algorithm, but the general idea is there and it manages to get the pendulum up and balancing in a reasonable amount of time.

Due to the project being part of the course I’m unable to release more information about it.