Decoupled Neural Interfaces (DNI) using Synthetic Gradients can markedly improve training speed in deep neural networks by letting a layer update its weights from a predicted gradient instead of waiting for the full backward pass.
This project extends that idea by adding a 'pre-training' stage for the synthetic gradient modules and measuring the effect of this pre-training on the subsequent training process.
Technologies:
- Python
- TensorFlow
- Keras
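To make the core idea concrete, here is a minimal NumPy sketch of a DNI-style synthetic gradient module, not the project's actual TensorFlow/Keras code. The linear form `h @ W_sg`, the toy dimensions, and the hypothetical "true gradient" target `h @ G` are all assumptions for illustration: the module predicts the gradient of the loss with respect to a layer's activations `h`, and is itself trained by regressing onto the true gradient whenever one becomes available.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 4                              # activation width (toy size, assumed)
W_sg = np.zeros((d, d))            # synthetic gradient weights, zero-initialised
lr = 0.1

def synthetic_grad(h):
    """Predicted dL/dh, available without running a full backward pass."""
    return h @ W_sg

def update_sg(h, true_grad):
    """One SGD step on the MSE between predicted and true gradients."""
    global W_sg
    err = synthetic_grad(h) - true_grad    # residual of the prediction
    W_sg -= lr * h.T @ err / len(h)        # gradient of the MSE w.r.t. W_sg

# Demo: here the "true" gradient is a hypothetical linear function h @ G,
# so the module can learn it exactly from repeated regression steps.
G = rng.normal(size=(d, d))
for _ in range(500):
    h = rng.normal(size=(8, d))
    update_sg(h, h @ G)
```

In the full DNI setup, an upstream layer would apply `synthetic_grad(h)` immediately to update its own weights, decoupling it from the layers above; the pre-training explored in this project would amount to running a regression loop like the one above before the main training begins.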