Transformers QUIZ (MCQ QUESTIONS AND ANSWERS)

Question: 1

Which type of neural network architecture is commonly used for generative tasks, such as image generation or text generation?

Question: 2

In the context of reinforcement learning, what is the "reward signal"?

Question: 3

What is the primary advantage of using a deep neural network with multiple hidden layers?

Question: 4

What does the term "overfitting" refer to in the context of machine learning?

Question: 5

Which type of neural network layer is commonly used for image feature extraction?
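
(The conventional answer is the convolutional layer. Below is a minimal sketch of what one such layer does, assuming PyTorch is available; the channel counts and image size are illustrative.)

```python
import torch
import torch.nn as nn

# A single convolutional layer: slides learned filters over the image
# and produces one feature map per filter.
conv = nn.Conv2d(in_channels=3, out_channels=16, kernel_size=3, padding=1)

image = torch.randn(1, 3, 32, 32)   # (batch, channels, height, width)
features = conv(image)
print(features.shape)               # torch.Size([1, 16, 32, 32])
```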

Question: 6

What is the purpose of data augmentation in deep learning?
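
(Data augmentation expands the effective training set with label-preserving random transforms, so the model sees a slightly different version of every image each epoch. A sketch assuming torchvision is available; the specific transforms and parameters are illustrative.)

```python
from torchvision import transforms

# Random, label-preserving transforms applied on the fly during training.
augment = transforms.Compose([
    transforms.RandomHorizontalFlip(p=0.5),
    transforms.RandomCrop(32, padding=4),
    transforms.ColorJitter(brightness=0.2),
    transforms.ToTensor(),
])
```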

Question: 7

What is a common activation function used in the output layer of a binary classification neural network?
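
(The standard choice is the sigmoid function, which squashes a real-valued logit into (0, 1) so it can be read as a probability. A minimal NumPy sketch:)

```python
import numpy as np

def sigmoid(z):
    # Maps any real-valued logit into (0, 1), interpretable as P(class = 1).
    return 1.0 / (1.0 + np.exp(-z))

print(sigmoid(0.0))   # 0.5 — the decision boundary
print(sigmoid(4.0))   # ~0.982
```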

Question: 8

Which optimization algorithm is commonly used for training deep neural networks?
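
(Adam is the usual default. A hand-rolled sketch of a single Adam update in NumPy with the standard hyperparameters; in practice one would use a library optimizer such as torch.optim.Adam.)

```python
import numpy as np

def adam_step(param, grad, m, v, t, lr=1e-3, beta1=0.9, beta2=0.999, eps=1e-8):
    # Adam keeps running averages of the gradient (m) and its square (v),
    # with bias correction for the early steps. t is the 1-based step count.
    m = beta1 * m + (1 - beta1) * grad
    v = beta2 * v + (1 - beta2) * grad**2
    m_hat = m / (1 - beta1**t)
    v_hat = v / (1 - beta2**t)
    param = param - lr * m_hat / (np.sqrt(v_hat) + eps)
    return param, m, v
```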

Question: 9

What is the role of a loss function in training a neural network?
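
(The loss function is a scalar measure of how far the network's predictions are from the targets; training minimizes it by gradient descent. A NumPy sketch of binary cross-entropy as one concrete example:)

```python
import numpy as np

def binary_cross_entropy(y_true, y_pred):
    # Penalizes confident wrong predictions heavily; clipping avoids log(0).
    y_pred = np.clip(y_pred, 1e-7, 1 - 1e-7)
    return -np.mean(y_true * np.log(y_pred) + (1 - y_true) * np.log(1 - y_pred))

print(binary_cross_entropy(np.array([1, 0, 1]), np.array([0.9, 0.1, 0.8])))  # ~0.145
```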

Question: 10

What is the primary purpose of batch normalization in neural networks?
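
(Batch normalization normalizes each feature over the mini-batch to stabilize and speed up training. A NumPy sketch of the forward pass; gamma and beta are the learned scale and shift.)

```python
import numpy as np

def batch_norm(x, gamma, beta, eps=1e-5):
    # Normalize each feature across the batch, then rescale and shift
    # with the learned parameters gamma and beta.
    mean = x.mean(axis=0)
    var = x.var(axis=0)
    x_hat = (x - mean) / np.sqrt(var + eps)
    return gamma * x_hat + beta

x = np.random.randn(32, 4)                        # batch of 32, 4 features
out = batch_norm(x, np.ones(4), np.zeros(4))
print(out.mean(axis=0).round(6), out.std(axis=0).round(3))  # ~0 and ~1
```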

Question: 11

In neural network terminology, what is an "epoch"?
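
(An epoch is one full pass over the training set. A runnable PyTorch sketch; the toy data, model, and hyperparameters are illustrative.)

```python
import torch
import torch.nn as nn

# Toy data and model, just to make the loop concrete.
X, y = torch.randn(64, 8), torch.randn(64, 1)
model = nn.Linear(8, 1)
loss_fn = nn.MSELoss()
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)

loader = torch.utils.data.DataLoader(
    torch.utils.data.TensorDataset(X, y), batch_size=16)

for epoch in range(5):                  # 5 epochs = 5 full passes over X
    for batch_x, batch_y in loader:     # several mini-batches per epoch
        optimizer.zero_grad()
        loss = loss_fn(model(batch_x), batch_y)
        loss.backward()
        optimizer.step()
```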

Question: 12

What is transfer learning in the context of deep learning?
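
(Transfer learning reuses a model pretrained on one task as the starting point for another. A sketch assuming torchvision; the weights argument and the 10-class head are illustrative, and the exact API varies by torchvision version.)

```python
import torch.nn as nn
from torchvision import models

# Start from a network pretrained on ImageNet, freeze its features,
# and train only a new head for the target task (here: 10 classes).
backbone = models.resnet18(weights="IMAGENET1K_V1")
for p in backbone.parameters():
    p.requires_grad = False
backbone.fc = nn.Linear(backbone.fc.in_features, 10)  # new, trainable head
```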

Question: 13

Which type of neural network architecture is well-suited for sequential data, such as time series or natural language?

Question: 14

What is the role of the softmax activation function in a classification neural network?
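
(Softmax turns the final layer's raw scores into a probability distribution over classes. A NumPy sketch:)

```python
import numpy as np

def softmax(logits):
    # Exponentiate and normalize so the outputs are positive and sum to 1.
    z = logits - logits.max()          # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum()

print(softmax(np.array([2.0, 1.0, 0.1])))  # ~[0.659, 0.242, 0.099]
```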

Question: 15

In the context of natural language processing, what is the purpose of tokenization?
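
(Tokenization splits raw text into units the model can map to integer IDs. A deliberately simple whitespace sketch; real models use subword tokenizers such as BPE or WordPiece.)

```python
# Whitespace tokenization is the simplest scheme.
text = "Transformers process text as tokens."
tokens = text.lower().replace(".", "").split()
print(tokens)        # ['transformers', 'process', 'text', 'as', 'tokens']

vocab = {tok: i for i, tok in enumerate(sorted(set(tokens)))}
ids = [vocab[tok] for tok in tokens]
print(ids)           # the integer IDs the model actually consumes
```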

Question: 16

What is the primary purpose of transformers in deep learning?

Question: 17

What is the purpose of dropout regularization in neural networks?
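
(Dropout randomly zeroes activations during training to discourage co-adaptation and reduce overfitting. A NumPy sketch of "inverted" dropout, which scales the survivors so the expected activation stays the same:)

```python
import numpy as np

def dropout(x, p=0.5, training=True):
    # At test time dropout is a no-op; at training time each unit is
    # dropped with probability p and the rest are scaled up by 1/(1-p).
    if not training:
        return x
    mask = (np.random.rand(*x.shape) >= p) / (1.0 - p)
    return x * mask

x = np.ones((2, 6))
print(dropout(x, p=0.5))  # roughly half the entries zeroed, the rest doubled
```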

Question: 18

What is the vanishing gradient problem in deep learning?

Question: 19

What is the purpose of activation functions in neural networks?
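
(Activation functions introduce non-linearity; without them, stacked linear layers collapse into a single linear map. A small NumPy demonstration:)

```python
import numpy as np

def relu(z):
    return np.maximum(0.0, z)

# Without a non-linearity, two stacked linear layers are still one linear map:
W1, W2 = np.random.randn(4, 4), np.random.randn(4, 4)
x = np.random.randn(4)
print(np.allclose(W2 @ (W1 @ x), (W2 @ W1) @ x))  # True — still linear

# With ReLU in between, the composition is genuinely non-linear:
print(W2 @ relu(W1 @ x))
```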

Question: 20

What limitation of transformers might impact their use in certain scenarios?

Question: 21

What area of deep learning engineering is mentioned as a potential future application for transformers?

Question: 22

Which transformer-based model was developed by Google and is popular for general question-answering tasks?

Question: 23

In the context of deep learning engineering, what limitation is associated with using transformers?

Question: 24

What is one of the best practices for optimizing transformer model performance?

Question: 25

How is the transformer model trained to avoid overfitting?

Question: 26

What types of tasks do transformers perform well on, especially in natural language processing?

Question: 27

Which part of the transformer architecture is responsible for adding positional information to the input sequence?
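
(The positional encoding. A NumPy sketch of the sinusoidal variant from "Attention Is All You Need"; learned positional embeddings are a common alternative.)

```python
import numpy as np

def positional_encoding(seq_len, d_model):
    # Each position gets a unique pattern of sines and cosines,
    # which is added to the token embeddings.
    pos = np.arange(seq_len)[:, None]
    i = np.arange(d_model)[None, :]
    angles = pos / np.power(10000.0, (2 * (i // 2)) / d_model)
    return np.where(i % 2 == 0, np.sin(angles), np.cos(angles))

print(positional_encoding(seq_len=4, d_model=8).shape)  # (4, 8)
```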

Question: 28

What is the purpose of multi-head attention in the transformer architecture?
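
(Multi-head attention runs several attention heads in parallel, each in a lower-dimensional subspace, so the layer can attend to different kinds of relations at once. A sketch using PyTorch's built-in layer; the embedding size, head count, and tensor shapes are illustrative.)

```python
import torch
import torch.nn as nn

mha = nn.MultiheadAttention(embed_dim=64, num_heads=8, batch_first=True)

x = torch.randn(2, 10, 64)            # (batch, sequence, embedding)
out, weights = mha(x, x, x)           # query = key = value: self-attention
print(out.shape, weights.shape)       # (2, 10, 64) and (2, 10, 10)
```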

Question: 29

Which advanced LSTM technique involves two LSTMs, one processing the input in the forward direction and the other in the backward direction?
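
(The bidirectional LSTM. A PyTorch sketch; the sizes are illustrative.)

```python
import torch
import torch.nn as nn

# Two LSTMs read the sequence in opposite directions; their hidden states
# are concatenated, so the output size doubles.
bilstm = nn.LSTM(input_size=16, hidden_size=32, bidirectional=True,
                 batch_first=True)

x = torch.randn(4, 20, 16)            # (batch, time steps, features)
out, _ = bilstm(x)
print(out.shape)                      # (4, 20, 64): 32 forward + 32 backward
```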

Question: 30

In the context of transformers, what does self-attention refer to?
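
(Self-attention means each position attends to every position in the same sequence, including itself. A NumPy sketch of scaled dot-product self-attention; the projection matrices here are random placeholders for learned weights.)

```python
import numpy as np

def self_attention(X, Wq, Wk, Wv):
    # Each token derives a query, key, and value from itself, then attends
    # to every token in the same sequence (including itself).
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    scores = Q @ K.T / np.sqrt(K.shape[-1])        # scaled dot products
    scores -= scores.max(axis=-1, keepdims=True)   # numerical stability
    weights = np.exp(scores) / np.exp(scores).sum(axis=-1, keepdims=True)
    return weights @ V                             # weighted mix of values

d = 8
X = np.random.randn(5, d)                          # 5 tokens, d-dim each
out = self_attention(X, *(np.random.randn(d, d) for _ in range(3)))
print(out.shape)                                   # (5, 8)
```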