Modern AI Fundamentals


5.3 Quiz
1. What is the primary role of hidden layers in a neural network?
A. They store the final model predictions.
B. They prevent any form of backpropagation.
C. They remove data points that don’t fit well.
D. They transform raw inputs into increasingly abstract representations, capturing complex patterns.
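To see the idea behind the correct answer concretely, here is a minimal sketch of a forward pass through one hidden layer. The weights, biases, and layer sizes are made-up illustrative values, not anything from the course.

```python
def relu(v):
    # Zero out negative values; pass positives through unchanged.
    return [max(0.0, x) for x in v]

def layer(inputs, weights, biases):
    # Each neuron computes a weighted sum of all inputs plus a bias.
    return [sum(w * x for w, x in zip(row, inputs)) + b
            for row, b in zip(weights, biases)]

raw_input = [1.0, -2.0]
# The hidden layer re-expresses the raw inputs as a new representation...
hidden = relu(layer(raw_input, [[0.5, -0.5], [1.0, 1.0]], [0.0, 0.5]))
# ...which the output layer then combines into a prediction.
output = layer(hidden, [[1.0, -1.0]], [0.0])
print(hidden, output)  # → [1.5, 0.0] [1.5]
```

Stacking more such layers lets the network build increasingly abstract features, which is why answer D is correct.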
2. In backpropagation, what is the main purpose of the backward pass?
A. To randomize the weights after each forward pass.
B. To visualize how the model performs on the test set.
C. To calculate gradients that show how much each weight contributed to the error, and adjust them to minimize the loss.
D. To skip the training step and directly generate outputs.
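The backward pass in answer C can be sketched with a single weight and a squared-error loss. This is an illustrative toy example (the values 2.0, 4.0, and the learning rate are arbitrary), not the course's own code.

```python
def loss(w, x, target):
    # Squared error between the prediction w*x and the target.
    return (w * x - target) ** 2

w, x, target, lr = 0.0, 2.0, 4.0, 0.1
for _ in range(50):
    grad = 2 * (w * x - target) * x  # dL/dw via the chain rule (backward pass)
    w -= lr * grad                   # gradient descent step to reduce the loss
print(round(w, 4), loss(w, x, target))
```

The gradient tells us how much the weight contributed to the error, and repeated small adjustments drive the loss toward zero (here w converges to 2.0, so that w*x matches the target 4.0).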
3. Why is the ReLU (Rectified Linear Unit) activation function popular in deep networks?
A. It outputs zero for negative inputs and passes positive values through unchanged, keeping computation simple and helping gradients flow in deep networks.
B. It’s the only function that avoids overfitting.
C. It’s always guaranteed to produce 100% accuracy.
D. It requires no hidden layers to function.
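ReLU itself is simple enough to state in one line, which is part of why it is popular. A quick sketch of the function and its gradient (illustrative only):

```python
def relu(x):
    # ReLU(x) = max(0, x): zero for negatives, identity for positives.
    return max(0.0, x)

def relu_grad(x):
    # The gradient is 1 for positive inputs and 0 for negative ones,
    # so gradients on active neurons pass through without shrinking.
    return 1.0 if x > 0 else 0.0

print(relu(3.0), relu(-3.0))            # → 3.0 0.0
print(relu_grad(3.0), relu_grad(-3.0))  # → 1.0 0.0
```

Because the gradient on the positive side is exactly 1, deep stacks of ReLU layers suffer less from vanishing gradients than saturating activations such as sigmoid.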
