Responses

A1) NLP was used to build a sarcasm classifier, where 0 means not sarcastic and 1 means sarcastic. I took newspaper headlines from CNN, The Hill, and NBC News. According to my outputs, the most sarcastic headline was the first one below. That was probably one of the more serious headlines, so the model did not do a good job of classifying it. The least sarcastic headline was the last one, which I would actually deem more sarcastic than the rest. I did not really agree with this model and found it fairly useless. Based on the graphs, the model is overfit. I think the idea behind the model is great, but there is not really an efficient way to classify sarcasm; it's more about tone, and you can't pick up tone through text.

NHL misses the mark entirely with performative gestures regarding Black lives: 8.9462417e-01

Kavanaugh urged Supreme Court to avoid decisions on Trump finances: 3.1719434e-01

New netflix show is causing quite a stir: 2.6043197e-06

Man arrested on suspicion of setting fire to Arizona state Democratic Party headquarters: 5.6860987e-03

Biden slams Trump for promoting false COVID-19 claims from ‘crazy woman’: 1.2501055e-06
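For reference, here is a minimal sketch of how headline scores like the ones above could be produced with a trained Keras sarcasm model. The saved file names, the tokenizer, and the padding length are assumptions for illustration, not the exact setup from my run.

```python
import pickle
import tensorflow as tf
from tensorflow.keras.preprocessing.sequence import pad_sequences

model = tf.keras.models.load_model("sarcasm_model.h5")   # hypothetical saved model
with open("tokenizer.pickle", "rb") as f:                # hypothetical saved Tokenizer
    tokenizer = pickle.load(f)
max_length = 100                                         # assumed padding length from training

headlines = [
    "NHL misses the mark entirely with performative gestures regarding Black lives",
    "Kavanaugh urged Supreme Court to avoid decisions on Trump finances",
    "New netflix show is causing quite a stir",
    "Man arrested on suspicion of setting fire to Arizona state Democratic Party headquarters",
    "Biden slams Trump for promoting false COVID-19 claims from 'crazy woman'",
]

# Turn the raw headlines into padded integer sequences the model expects,
# then score them: each output is the predicted probability of sarcasm (1 = sarcastic).
sequences = tokenizer.texts_to_sequences(headlines)
padded = pad_sequences(sequences, maxlen=max_length, padding="post", truncating="post")
scores = model.predict(padded)

for headline, score in zip(headlines, scores):
    print(f"{headline}: {score[0]:.7e}")
```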

B1) The RNN model uses the last object (letter) to learn what to predict next. When it is run again, it uses the last two letters, and it keeps conditioning on one more of the most recent letters every time it is rerun. The first letter plays an important role, since it is the main thing that determines how the model predicts these probabilities and the output that is given.
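Below is a minimal sketch of that feedback loop, assuming a trained character-level model (`char_model`) and the character/index lookup tables already exist. The names, the model's output shape, and the argmax decoding are assumptions, not the exact assignment code.

```python
import numpy as np

def generate_text(char_model, char_to_index, index_to_char, seed, num_chars=50):
    """Grow `seed` one character at a time, feeding each prediction back in."""
    text = seed
    for _ in range(num_chars):
        # Encode everything generated so far; on each pass the model is
        # conditioned on one more character than on the pass before it.
        encoded = np.array([[char_to_index[c] for c in text]])
        # Assumed: the model returns one probability distribution over the
        # character vocabulary for the next character.
        probs = char_model.predict(encoded, verbose=0)[0]
        next_char = index_to_char[int(np.argmax(probs))]
        text += next_char
    return text

# Example: the seed (first letter) strongly shapes everything that follows.
# print(generate_text(char_model, char_to_index, index_to_char, seed="t"))
```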

C1)

1)

Spanish: hace mucho frio aqui

English: it’s very cold here

This translation did very well and translated the sentence correctly.

2)

esta es mi vida

this is my life

This translation did very well and translated the sentence correctly.

3)

¿todavia estan en casa?

are you still at home ?

This translation did very well and translated the sentence correctly.

4)

trata de averiguarlo

try to figure it out

The overall idea was right, so it wasn't completely wrong, but technically it should be 'try to find out'.
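For completeness, here is a minimal sketch of how one of these sentences could be greedily decoded with an encoder-decoder translator like this one. The `encoder`/`decoder` call signatures, the tokenizer names, the maximum lengths, and the `<start>`/`<end>` tokens are all assumptions rather than the exact setup used for the results above.

```python
import numpy as np
from tensorflow.keras.preprocessing.sequence import pad_sequences

# Assumed to be available from training: encoder, decoder,
# inp_tokenizer (Spanish), targ_tokenizer (English).
def translate(sentence, max_length_inp=16, max_length_targ=16):
    # Encode the Spanish sentence the same way the training data was encoded.
    inputs = inp_tokenizer.texts_to_sequences([f"<start> {sentence} <end>"])
    inputs = pad_sequences(inputs, maxlen=max_length_inp, padding="post")

    enc_out, state = encoder(inputs)  # assumed encoder signature
    dec_input = np.array([[targ_tokenizer.word_index["<start>"]]])

    result = []
    for _ in range(max_length_targ):
        # Assumed decoder signature: returns next-word logits and the new state.
        logits, state = decoder(dec_input, state, enc_out)
        predicted_id = int(np.argmax(logits[0]))
        word = targ_tokenizer.index_word.get(predicted_id, "<end>")
        if word == "<end>":
            break
        result.append(word)
        dec_input = np.array([[predicted_id]])  # feed the prediction back in
    return " ".join(result)

# print(translate("hace mucho frio aqui"))  # expected something like: it s very cold here
```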