Wrapping up ideas
Updated status of the current networks
Comparing which circuits the brains complete
| Model | Dataset | Montmeló | Simple circuit (no red line) | Simple circuit (no line, no wall) | Simple circuit (white road) | Simple circuit (white road, no line) |
|---|---|---|---|---|---|---|
| Explicit | - | ✅ | ❌ | ❌ | ✅ | ❌ |
| OpenCV | - | ✅ | ❌ | ❌ | ✅ | ❌ |
| PilotNet | Old | ✅ | ❌ | ✅ | ✅ | ❌ |
| PilotNet | Explicit | ✅ | ❌ (fails in the last turns) | ❌ (some turns OK) | ✅ | ❌ |
| PilotNet | OpenCV (fastest) | ✅ (with much higher it/s) | ❌ | ❌ | ✅ | ❌ |
| Deepest LSTM | Old | ✅ | ❌ | ✅ | ❌ | ❌ |
| Deepest LSTM | Explicit | ✅ | ❌ | ❌ | ✅ | ❌ |
| Deepest LSTM | OpenCV (fastest) | ✅ (with higher it/s) | ❌ | ✅ (drives over the grass) | ✅ | ❌ |
| Frankenstein | Old | ✅ (with higher it/s) | ✅ | ✅ (drives over the grass) | ✅ | ❌ |
| Frankenstein | Explicit | ✅ | ❌ | ❌ | ✅ | ❌ |
| Frankenstein | OpenCV (fastest) | ✅ (with higher it/s) | ✅ (with higher it/s) | ✅ (over the grass, with higher it/s) | ✅ (with higher it/s) | ✅ (with higher it/s) |
| PilotNet 3D | Old | ✅ (with higher it/s) | ❌ | ❌ | ✅ | ❌ |
| PilotNet 3D | Explicit | ❌ | ❌ | ❌ | ❌ | ❌ |
| PilotNet 3D | OpenCV (fastest) | - | - | - | - | - |
Conclusions
- The neural brains “match” the PID controller's performance by learning from the examples recorded in the different circuits.
- The neural brains need a higher it/s rate to compensate for their inference time (which depends on the hardware); in general, the more it/s, the better (a small sketch after this list illustrates how inference time caps the rate).
- Generalization: the neural brains are able to complete a broad variety of circuits, and the memory-based brain generalizes even further.
- We present a new memory-based neural architecture, together with new datasets.
- Everything is open source and reproducible.
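As a back-of-the-envelope illustration of the it/s point above: the loop rate a brain can sustain is capped by its inference time. A minimal Python sketch, where the 25 ms inference time is a hypothetical figure, not a measured one:

```python
# Illustrative only: inference time puts a hard ceiling on the control rate.
def effective_rate(inference_time_s: float, target_rate_hz: float) -> float:
    """Rate actually achievable by a brain whose inference takes
    inference_time_s per frame, given a requested target rate."""
    max_rate_hz = 1.0 / inference_time_s  # inference alone limits the loop
    return min(target_rate_hz, max_rate_hz)

# Hypothetical 25 ms inference: the brain can never exceed 40 it/s,
# no matter how fast the simulator is pushed.
print(effective_rate(inference_time_s=0.025, target_rate_hz=100.0))  # 40.0
```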
Ideas
Generalization
- Generalization: the neural brain trained on the fastest dataset is still able to complete circuits where the red line is replaced by a white one (simple circuit), provided the number of iterations per second is increased. The it/s rate is key in this case because of the inference time.
Importance of the number of iterations per second
- Importance of the it/s rate: when the brain runs at a low number of iterations per second, performance degrades badly, since the low rate compounds with the inference time (see the loop sketch after this list).
- Does adding more iterations per second improve the performance of the PID controller? Yes: it lowers, for example, the time to complete a circuit. Lap time with more it/s -> 50 s.
- Does the same hold for the neural brains? Yes. For PilotNet, the lap time on the simple circuit with more it/s is 51 s.
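A minimal sketch of the kind of fixed-rate control loop being discussed, assuming a `brain_step` callable that wraps perception, inference and actuation (the name is a placeholder, not the project's API):

```python
import time

def control_loop(brain_step, rate_hz: float, duration_s: float) -> int:
    """Run brain_step at up to rate_hz iterations per second and return
    how many iterations actually completed within duration_s."""
    period = 1.0 / rate_hz
    iterations = 0
    start = time.monotonic()
    while time.monotonic() - start < duration_s:
        t0 = time.monotonic()
        brain_step()  # perception + inference + actuation
        elapsed = time.monotonic() - t0
        # A slow brain overruns its budget and the real rate drops.
        time.sleep(max(0.0, period - elapsed))
        iterations += 1
    return iterations
```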
Simple circuit
- PID lap time -> 52 s || with improved it/s: 50 s
- PilotNet lap time -> || with improved it/s: 51 s
- TinyPilotNet lap time -> 52 s || with improved it/s: 46 s
- Frankenstein (dataset with more extreme cases) lap time -> 65 s || with improved it/s: 56 s
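The relative improvement from the higher it/s rate can be computed directly from the numbers above (PilotNet is omitted because its baseline lap time is missing); a quick check:

```python
# Lap times (baseline, improved it/s) in seconds, taken from the list above.
laps = {
    "PID": (52, 50),
    "TinyPilotNet": (52, 46),
    "Frankenstein": (65, 56),
}
for brain, (before, after) in laps.items():
    print(f"{brain}: {100 * (before - after) / before:.1f}% faster")
# PID: 3.8% faster, TinyPilotNet: 11.5% faster, Frankenstein: 13.8% faster
```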
Montmeló line
- PID lap time -> 71 s (56 it/s simulated), 78 s (11 it/s simulated)
- TinyPilotNet lap time -> 83 s (14 it/s simulated)
- Frankenstein lap time -> 83 s (91 it/s simulated)
- Position deviation MAE PID -> 12 it/s -> 5.23, 8 it/s -> 5.35 || with improved it/s:
- PilotNet lap time -> || with improved it/s:
- Position deviation MAE Frankenstein (dataset with more extreme cases) -> 8 it/s -> 19.03 || with improved it/s:
- Position deviation MAE Frankenstein (trained longer!) with the more-extreme-cases dataset -> 91 it/s -> 5.0 (the MAE is computed as sketched after this list)
- What happens when the number of iterations per second is the same for both approaches?
- Comparing memory-based vs memory-less approaches: what are the differences?
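A minimal sketch of how a position deviation MAE like the figures above could be computed, assuming the simulator reports the car's lateral deviation from the circuit's center line once per iteration (the input values here are hypothetical):

```python
import numpy as np

def position_deviation_mae(deviations):
    """Mean absolute error of the car's lateral deviation from the
    center line, one measurement per iteration."""
    return float(np.mean(np.abs(np.asarray(deviations))))

# Hypothetical per-iteration deviations; real values come from the simulator.
print(position_deviation_mae([4.1, -6.3, 5.0, -4.8]))  # 5.05
```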
Frankenstein is not fully trained.
[Image of training error curves]
The training curves show that training has not converged yet, so longer training could help in this scenario.
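If the brains are Keras models (an assumption here; the checkpoint name and the random arrays below are placeholders for illustration, not the project's actual files or data), resuming training until the validation loss plateaus could look roughly like this:

```python
import numpy as np
import tensorflow as tf

# Placeholder checkpoint name; the real training data would be the
# project's image/speed pairs, faked here with random arrays.
model = tf.keras.models.load_model("frankenstein_checkpoint.h5")
x = np.random.rand(128, 66, 200, 3).astype("float32")
y = np.random.rand(128, 2).astype("float32")
history = model.fit(
    x, y,
    validation_split=0.2,
    epochs=50,  # keep training until validation loss flattens
    callbacks=[
        tf.keras.callbacks.EarlyStopping(
            monitor="val_loss", patience=5, restore_best_weights=True
        )
    ],
)
```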
What happens if the images are modified?
If the line is white instead of red, the PID-based brain can no longer drive, while PilotNet and Frankenstein are still able to complete part of the circuit.
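One way to produce such modified images for testing is a simple OpenCV recoloring pass; a sketch (the HSV thresholds are assumptions and would need tuning to the simulator's red):

```python
import cv2
import numpy as np

def whiten_red_line(bgr_image: np.ndarray) -> np.ndarray:
    """Recolor the red center line to white, leaving the rest untouched."""
    hsv = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2HSV)
    # Red wraps around the hue axis, so two ranges are combined.
    mask = cv2.inRange(hsv, (0, 100, 100), (10, 255, 255)) | \
           cv2.inRange(hsv, (170, 100, 100), (180, 255, 255))
    out = bgr_image.copy()
    out[mask > 0] = (255, 255, 255)
    return out
```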
Possible training strategy
Train a memory-based model with several outputs, for example the same number of outputs as inputs. With this training, the model should be able to predict the speeds at t, t+1 and t+2 given the inputs at t, t-1 and t-2.
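A minimal Keras sketch of this strategy, not the actual Frankenstein architecture: a sequence of three frames in, three future (linear, angular) speed commands out. The layer sizes and the 66x200 input resolution are assumptions for illustration:

```python
import tensorflow as tf
from tensorflow.keras import layers

frames_in, frames_out = 3, 3  # inputs t-2..t, outputs t..t+2

inputs = tf.keras.Input(shape=(frames_in, 66, 200, 3))  # image sequence
x = layers.TimeDistributed(layers.Conv2D(24, 5, strides=2, activation="relu"))(inputs)
x = layers.TimeDistributed(layers.Conv2D(36, 5, strides=2, activation="relu"))(x)
x = layers.TimeDistributed(layers.Flatten())(x)
x = layers.LSTM(64)(x)  # the "memory" over the input sequence
# One (linear speed, angular speed) pair per future step t, t+1, t+2.
x = layers.Dense(frames_out * 2)(x)
outputs = layers.Reshape((frames_out, 2))(x)

model = tf.keras.Model(inputs, outputs)
model.compile(optimizer="adam", loss="mse")
```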