What is the exploding gradient problem when using the backpropagation technique?

By drakula david in 22 Sep 2023 | 01:49 pm
drakula david

Student
Posts: 128
Member since: 22 Sep 2023

What is the exploding gradient problem when using the backpropagation technique?

22 Sep 2023 | 01:49 pm
0 Likes
divas goyal

Student
Posts: 453
Member since: 22 Sep 2023

   - The exploding gradient problem occurs in deep networks when the gradients computed during backpropagation grow larger and larger as they are propagated back through many layers. The resulting oversized parameter updates cause numerical instability (overflowing or NaN losses) and training that oscillates or diverges instead of converging.

   - It can be mitigated with gradient clipping (capping the gradient norm before each parameter update) or by choosing an appropriate weight initialization (e.g., Xavier or He initialization); see the sketch after this list for a clipping example.
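A minimal sketch of gradient clipping inside a training loop, assuming PyTorch; the model, data, and hyperparameters here are made up purely for illustration:

```python
import torch
import torch.nn as nn

# Toy model and data, just to show where clipping fits in a training step.
model = nn.Sequential(nn.Linear(10, 32), nn.ReLU(), nn.Linear(32, 1))
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
loss_fn = nn.MSELoss()

x = torch.randn(64, 10)   # dummy inputs
y = torch.randn(64, 1)    # dummy targets

for step in range(5):
    optimizer.zero_grad()
    loss = loss_fn(model(x), y)
    loss.backward()
    # Rescale all gradients so their combined norm is at most 1.0;
    # this prevents a single oversized gradient from blowing up the update.
    torch.nn.utils.clip_grad_norm_(model.parameters(), max_norm=1.0)
    optimizer.step()
```

The key line is the `clip_grad_norm_` call placed between `backward()` and `step()`: the gradients are already computed at that point, so they can be rescaled before the optimizer applies them.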

23 Sep 2023 | 03:20 pm
0 Likes
divas goyal

Student
Posts: 453
Member since: 22 Sep 2023

  - Gradient Descent (GD) computes the gradient over the entire dataset for every parameter update, so each step is accurate but can be slow and memory-hungry on large datasets.

   - Stochastic Gradient Descent (SGD) updates the model's parameters using only one randomly selected data point (or a small mini-batch) per iteration. Each step is computationally cheap, but the noisy gradient estimates make convergence more erratic; see the sketch after this list for a side-by-side comparison.
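A small sketch contrasting the two update rules on a linear-regression toy problem, assuming NumPy; the dataset, batch size, and learning rate are hypothetical values chosen only to make the difference visible:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 3))                 # toy dataset
true_w = np.array([2.0, -1.0, 0.5])
y = X @ true_w + 0.1 * rng.normal(size=1000)

def mse_gradient(w, X_batch, y_batch):
    # Gradient of mean squared error for a linear model y = X @ w.
    return 2.0 * X_batch.T @ (X_batch @ w - y_batch) / len(y_batch)

# Full-batch gradient descent: every update looks at all 1000 points.
w_gd = np.zeros(3)
for _ in range(200):
    w_gd -= 0.1 * mse_gradient(w_gd, X, y)

# Mini-batch SGD: each update uses a random batch of 32 points,
# so steps are cheaper but noisier.
w_sgd = np.zeros(3)
for _ in range(200):
    idx = rng.choice(len(X), size=32, replace=False)
    w_sgd -= 0.1 * mse_gradient(w_sgd, X[idx], y[idx])

print("GD estimate: ", w_gd)    # both should approach [2.0, -1.0, 0.5]
print("SGD estimate:", w_sgd)
```

Both loops run the same number of updates; the GD version touches 1000 points per step while the SGD version touches 32, which is why SGD is preferred when the dataset is large.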

23 Sep 2023 | 03:20 pm
0 Likes
