
Summary

After a lot of work I have solved the mismatched message counts in the dataset and trained a new pilot with it.

Progress This Week

Topic frequency

The first thing we checked was the frequency of the topics being recorded with the ros2 bag. What I found out was that the image topic was being published at over 14 Hz, while the velocity commands were being published by my program at 10 Hz. This explained why I was recording a different number of messages from each topic.
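
For reference, ros2 topic hz already reports this from the command line; the sketch below does roughly the same check from a small node. It is a minimal sketch, and the camera topic name is an assumption, not the one from the actual project.

```python
import time

import rclpy
from rclpy.node import Node
from sensor_msgs.msg import Image


class HzChecker(Node):
    def __init__(self):
        super().__init__('hz_checker')
        self.stamps = []
        # The topic name is an assumption; point it at whatever the simulated camera publishes.
        self.create_subscription(Image, '/cam_f1_left/image_raw', self.callback, 10)

    def callback(self, msg):
        self.stamps.append(time.monotonic())
        if len(self.stamps) >= 30:  # average over the last 30 arrivals
            rate = (len(self.stamps) - 1) / (self.stamps[-1] - self.stamps[0])
            self.get_logger().info(f'image rate ~ {rate:.1f} Hz')
            self.stamps = [self.stamps[-1]]


def main():
    rclpy.init()
    rclpy.spin(HzChecker())


if __name__ == '__main__':
    main()
```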

Adjusting cmd_vel frequency

Given that the velocity commands topic was the one I had the most control over, I tried to adjust it first. However, with the image being published at such an odd rate (14.1…), it was quite difficult, and even though I managed to get a really similar number of published messages, it was not a very elegant solution, so I decided to try other options, keeping this one saved in case I did not find anything better.

Adjusting image frequency

Looking into the image frequency for the next solution, I found that the frequency of the published image was declared in the F1 car SDF. Oddly, it was set to 20 Hz even though it was being published at approximately 14 Hz. However, losing nothing by trying, I changed it to 10 Hz so it would match my own topic. This actually worked, and I got the exact same amount of data from each topic. However, lowering the image frequency made the car perform worse in the circuit, so once again I saved the solution and kept looking for a better one.
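
In case anyone wants to try the same thing, the relevant part is the camera sensor's update rate in the SDF. This is a trimmed sketch rather than the real file, and the sensor name is an assumption:

```xml
<sensor name="camera_sensor" type="camera">
  <!-- was 20; lowered to 10 so it matches the cmd_vel rate -->
  <update_rate>10</update_rate>
  <!-- ... rest of the camera definition ... -->
</sensor>
```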

For the record, I tried to get the frequency up, but I never found out why it got stuck at 14 Hz instead of the 20 Hz the SDF established, something that did not happen to my own topic.

Time stamps and synchronization

The next thing I tried, and what had been suggested by my coordinator, was to use the timestamps that ros2 topics have. I had tried other things first because this was not trivial. However, after some work on my code I was able to take the timestamp of each image and look for the closest velocity in the array of velocities… and it worked!
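
The matching itself is simple once the timestamps are extracted; this is a minimal sketch of the idea, assuming the stamps from the bag have already been converted into numpy arrays of seconds:

```python
import numpy as np


def match_closest(image_stamps, vel_stamps, velocities):
    """For each image stamp, pick the velocity whose stamp is closest in time."""
    matched = []
    for t in image_stamps:
        idx = int(np.argmin(np.abs(vel_stamps - t)))  # closest velocity in time
        matched.append(velocities[idx])
    return matched
```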

I was quite happy; after all that work it had proven to be worth it. However, inspired, I decided to go a little further and discard some of the images. Now that my computer can process the data faster I can be picky, and I did not want lots of images sharing the same velocities, because if the number of images was greater, the velocities would have to be reused… and that would probably make my trained pilot a lot worse.

So I jumped to putting in some time conditions: if the closest velocity was not close enough, it would be considered not to be an answer to that image and the image would get discarded. Happy with this idea I tried it and… none of the images were picked. I had overlooked something quite important: the timestamps of images and velocities looked similar, but they had to be taking their references from different places, because there were a few minutes of difference between them.
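
This is roughly the condition I mean, continuing the sketch above; the tolerance value is hypothetical:

```python
import numpy as np

MAX_DT = 0.05  # hypothetical tolerance in seconds


def match_with_tolerance(image_stamps, images, vel_stamps, velocities, max_dt=MAX_DT):
    """Keep an (image, velocity) pair only if the closest velocity is near enough in time."""
    pairs = []
    for t, img in zip(image_stamps, images):
        idx = int(np.argmin(np.abs(vel_stamps - t)))
        if abs(vel_stamps[idx] - t) <= max_dt:
            pairs.append((img, velocities[idx]))
    return pairs
```

With the image stamps in Gazebo sim time and the velocity stamps minutes away on another clock, this came back empty every time.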

I knew from looking into the SDFs and launchers that the images were using the Gazebo sim time, which looked quite good. However, I had not specified which clock my velocities were taking their time from. Thinking it would be easy, I imported from rclpy.clock import ROSClock and used the code found in some forums to add a concrete timestamp. However, the kind of message I was publishing (Twist) did not accept that. I tried to change it to TwistStamped, but the car was expecting a Twist and did not move. And I did not know how to change that, because it worked through a plugin and I didn't have the courage to start touching it.
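
For the record, this is the kind of thing I tried, as a minimal sketch with a hypothetical topic name. Twist has no header to stamp, while TwistStamped wraps a Twist together with a std_msgs/Header:

```python
import rclpy
from rclpy.node import Node
from geometry_msgs.msg import TwistStamped


class StampedPublisher(Node):
    def __init__(self):
        super().__init__('stamped_publisher')
        # Hypothetical topic name; the car's plugin still expects a plain Twist, so this did not move it.
        self.pub = self.create_publisher(TwistStamped, '/cmd_vel_stamped', 10)

    def send(self, linear_x, angular_z):
        msg = TwistStamped()
        msg.header.stamp = self.get_clock().now().to_msg()  # a concrete timestamp
        msg.twist.linear.x = linear_x
        msg.twist.angular.z = angular_z
        self.pub.publish(msg)
```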

I also found an interesting conversation on the ros2 forums asking to change the type of message used in navigation2 and other packages from Twist to TwistStamped for similar reasons.

However, all the work was not in vain. I had gotten to know all the code and the ros2 topics much better, and I had an idea that I had suggested before but had been discarded…

Just doing it

I could not meddle with the image topic to change its frequency, because it would lower the quality of the pilot, and I could not use the timestamps because, honestly, I did not know how to make them actually relate (I now guess I could have calculated the difference between two images, for example the first two, and assigned the first image the first velocity and the second one the velocity that corresponded to that time difference…). Sooooo, I created another topic: every time my program received an image it would copy it, and at the time of sending the velocities to cmd_vel it would send the last image received to car_image. This way I would ensure the same frequency, and at least I would know that every image was close in time to the velocity sent at the same instant.
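
A minimal sketch of that idea, assuming a hypothetical camera topic name (cmd_vel and car_image are the topics mentioned above):

```python
import rclpy
from rclpy.node import Node
from sensor_msgs.msg import Image
from geometry_msgs.msg import Twist


class PairedPublisher(Node):
    def __init__(self):
        super().__init__('paired_publisher')
        self.last_image = None
        # Camera topic name is an assumption; use whatever the F1 car publishes.
        self.create_subscription(Image, '/cam_f1_left/image_raw', self.image_callback, 10)
        self.vel_pub = self.create_publisher(Twist, '/cmd_vel', 10)
        self.img_pub = self.create_publisher(Image, '/car_image', 10)

    def image_callback(self, msg):
        # Just keep a copy of the most recent image.
        self.last_image = msg

    def send_velocity(self, linear_x, angular_z):
        # Publish the velocity and, at the same instant, the last image seen,
        # so both topics end up with the same number of messages.
        msg = Twist()
        msg.linear.x = linear_x
        msg.angular.z = angular_z
        self.vel_pub.publish(msg)
        if self.last_image is not None:
            self.img_pub.publish(self.last_image)
```

Here send_velocity would be called by whatever part of the program decides the commands, so the rosbag records exactly one image per velocity.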

After trying it out and recording a few rosbags in the simple circuit and the many-curves one, I had quite the dataset, with the same amount of images and velocities, each corresponding to the other… It was looking good.

PilotNet

I went to train it again; it again went way too fast, and when I tried it with a program that sent the images to the neural net, asked for the actions and sent those to cmd_vel, it crashed on the first curve. I had failed, but this time I did not think it was the dataset: I was testing the pilot in the simple circuit, and I had used a many-curves one to make sure there was plenty of data in curves.

So I looked again at my trainer and realised that, because my laptop was not nearly as fast, I had greatly reduced the number of epochs, just to be able to try things and see if they worked. Acknowledging this, I set it to go for 500 epochs, and in just a few minutes (10? 20? I was writing this, so I'm not sure how much time passed) I had it trained. I tested again… And you can see the results in this link: https://youtu.be/zCucXRBVxG4.
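
For the curious, the tester program is basically the reverse of the recorder. This is a rough sketch of the idea, not the actual code: it assumes a TorchScript export of the network, a hypothetical camera topic name, and that the net takes the full frame and outputs a (linear, angular) pair.

```python
import numpy as np
import torch

import rclpy
from rclpy.node import Node
from sensor_msgs.msg import Image
from geometry_msgs.msg import Twist


class PilotNode(Node):
    def __init__(self):
        super().__init__('pilot_node')
        self.model = torch.jit.load('pilotnet.pt')  # assumption: the trained net exported with TorchScript
        self.model.eval()
        self.pub = self.create_publisher(Twist, '/cmd_vel', 10)
        self.create_subscription(Image, '/cam_f1_left/image_raw', self.image_callback, 10)

    def image_callback(self, msg):
        # Turn the raw buffer into a tensor (assumes an rgb8 image; the real
        # code may crop and resize before feeding the network).
        img = np.frombuffer(msg.data, dtype=np.uint8).reshape(msg.height, msg.width, 3)
        tensor = torch.from_numpy(img.copy()).permute(2, 0, 1).float().unsqueeze(0) / 255.0
        with torch.no_grad():
            linear, angular = self.model(tensor)[0].tolist()  # assumption: the net outputs (v, w)
        cmd = Twist()
        cmd.linear.x = float(linear)
        cmd.angular.z = float(angular)
        self.pub.publish(cmd)


def main():
    rclpy.init()
    rclpy.spin(PilotNode())


if __name__ == '__main__':
    main()
```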