- TF Hub is a library and platform for transfer learning: it supports the publication, discovery, and consumption of reusable parts of machine learning models. We used it for the first layer of the network, creating a Keras layer that wraps a pre-trained model from the hub to embed the input sentences (see the sketch after this list).
- The optimizer’s purpose is to minimize the loss, and the script used Adam. The loss function was binary_crossentropy, since it is well suited to a model that outputs a probability for a two-class label. When I ran the code my output was loss: 0.327, accuracy: 0.851, so my model was about 85% accurate.
- The dots represent the training loss and the line represents the validation loss. Both losses decrease as the number of epochs increases. The graph compares the loss on the training data the model was built on with the loss on held-out validation data. The validation loss starts leveling off around 6 epochs, while the training loss continues to decrease toward 0 as the epochs increase. (figure 1)
- The dots represent the training accuracy and the line represents the validation accuracy. As the epochs increase, both accuracies increase, and the training accuracy is greater than the validation accuracy after 5 epochs. Both graphs indicate an overfit model, since the model fits the training data better than the validation data. (figure 2) A sketch of how these plots can be produced appears after the list.
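
A minimal sketch of the model setup described above, assuming the nnlm-en-dim50 sentence-embedding module and a small dense network on top; the exact hub module URL and layer sizes in the original script may differ.

```python
import tensorflow as tf
import tensorflow_hub as hub

# hub.KerasLayer wraps a pre-trained TF Hub model so it can be used as the
# first layer of a Keras model: each input sentence (a string) is mapped to
# a fixed-length embedding vector. The URL below is an assumed example module.
embedding_url = "https://tfhub.dev/google/nnlm-en-dim50/2"
hub_layer = hub.KerasLayer(embedding_url, input_shape=[],
                           dtype=tf.string, trainable=True)

model = tf.keras.Sequential([
    hub_layer,                                    # sentence -> embedding vector
    tf.keras.layers.Dense(16, activation="relu"),
    tf.keras.layers.Dense(1),                     # single logit for the positive class
])

# Adam minimizes the loss; binary cross-entropy suits a two-class
# probability output, as discussed above.
model.compile(optimizer="adam",
              loss=tf.keras.losses.BinaryCrossentropy(from_logits=True),
              metrics=["accuracy"])

# After training, model.evaluate(...) returns [loss, accuracy]; the run
# reported above gave roughly loss 0.327 / accuracy 0.851.
```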
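
Figures 1 and 2 are typically produced from the History object returned by model.fit(). The sketch below continues from the model above and is only an illustration: the variable names (train_data, validation_data), the number of epochs, and the plotting style are assumptions, not the original script.

```python
import matplotlib.pyplot as plt

# train_data and validation_data are assumed to be batched (text, label) datasets.
history = model.fit(train_data, epochs=10,
                    validation_data=validation_data, verbose=0)

hist = history.history
epochs = range(1, len(hist["loss"]) + 1)

# Figure 1: training loss (dots) vs. validation loss (line).
plt.figure()
plt.plot(epochs, hist["loss"], "bo", label="Training loss")
plt.plot(epochs, hist["val_loss"], "b", label="Validation loss")
plt.xlabel("Epochs")
plt.ylabel("Loss")
plt.legend()

# Figure 2: training accuracy (dots) vs. validation accuracy (line).
plt.figure()
plt.plot(epochs, hist["accuracy"], "bo", label="Training accuracy")
plt.plot(epochs, hist["val_accuracy"], "b", label="Validation accuracy")
plt.xlabel("Epochs")
plt.ylabel("Accuracy")
plt.legend()
plt.show()
```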