Website for the graduate class on Representation Learning Algorithms, IFT6266 H14.
Results with ReLUs and different subjects: using ReLUs on the hidden layers, and using more hidden units in the NSNN. This effect was found to disappear after around 75 epochs.
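For reference, a ReLU hidden layer just applies max(0, x) elementwise after the affine transform. A minimal NumPy sketch with hypothetical layer sizes (the posts above do not give the actual NSNN dimensions):

```python
import numpy as np

def relu(x):
    """Rectified linear unit: max(0, x), applied elementwise."""
    return np.maximum(0.0, x)

# One ReLU hidden layer; the 240-in / 500-out sizes are illustrative only,
# not the dimensions used in the experiments above.
rng = np.random.RandomState(0)
W = rng.randn(240, 500) * 0.01
b = np.zeros(500)

def hidden_layer(x):
    """Affine transform followed by the ReLU nonlinearity."""
    return relu(x @ W + b)
```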
This blog records my experiments for the course ift6266h14 at Université de Montréal. To be honest, progress was much slower than I expected at the start of the project. I have gained an appreciation for how hard machine learning is in practice. Evaluating robustness of recursive generation.
So far, using more components seems only to improve results. I was able to train models longer and get better results using an exponential decay of 1.
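An exponential decay schedule shrinks the learning rate by a constant factor per update, which is what allows training to continue longer without diverging. A minimal sketch, assuming a Pylearn2-style schedule where the rate is divided by decay_factor**t; the base rate and decay factor below are hypothetical, not the post's values:

```python
# Hypothetical hyperparameters; the post does not state the exact values.
base_lr = 0.01
decay_factor = 1.00001  # a factor slightly above 1 gives a slow decay

def learning_rate(t):
    """Exponentially decayed learning rate after t minibatch updates."""
    return base_lr / decay_factor ** t
```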
Results with ReLUs and different subjects. Using ReLUs on the hidden layers. Using more hidden units in the NSNN. 1 Using ReLUs on the hidden layers. This effect was found to disappear after around 75 epochs.
My journal for IFT6266 projects. Increasing the number of batches per iteration: instead of going through the full training set, a random subset of the training examples is sampled on every epoch. This way, I get feedback from the monitoring channels more often, and the learning rate is also adjusted more often.
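One way to implement this per-epoch subsampling is to draw a fresh random index set before each pass. A minimal sketch, assuming an in-memory dataset; subset_size and the array names are hypothetical:

```python
import numpy as np

rng = np.random.RandomState(0)

def epoch_subset(X, y, subset_size):
    """Sample a random subset of training examples for one epoch."""
    idx = rng.choice(len(X), size=subset_size, replace=False)
    return X[idx], y[idx]

# Usage: each call yields a fresh subset, so monitoring (and learning-rate
# adjustment) happens after fewer examples than a full pass over the data.
# X_sub, y_sub = epoch_subset(X_train, y_train, subset_size=10000)
```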
I proposed to use an implementation composed of an SdA and an MLP with 2 hidden layers for each phoneme. In this last post, I present the results of this proposal. The hyperparameters of the implementation are almost the same as those used in my final experiment post.
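A minimal sketch of that per-phoneme setup: one small two-hidden-layer MLP per phoneme. The layer sizes, activation, and phoneme list are hypothetical, and in the actual proposal the weights would be initialized from SdA pretraining rather than at random:

```python
import numpy as np

rng = np.random.RandomState(0)

def init_mlp(sizes):
    """Random parameters for an MLP; sizes = [n_in, n_h1, n_h2, n_out]."""
    return [(rng.randn(a, b) * 0.01, np.zeros(b))
            for a, b in zip(sizes[:-1], sizes[1:])]

def forward(params, x):
    """Forward pass with tanh hidden layers and a linear output layer."""
    for i, (W, b) in enumerate(params):
        x = x @ W + b
        if i < len(params) - 1:
            x = np.tanh(x)
    return x

# One two-hidden-layer MLP per phoneme (illustrative phoneme subset and sizes).
phonemes = ["aa", "ae", "ah"]
models = {p: init_mlp([240, 500, 500, 1]) for p in phonemes}
```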