https://365datascience.com/dwqa-answer/answer-for-when-performing-the-linear-model-with-l2-norm-loss-and-gradient-decent-loss-function-reaches-infinity-and-then-nan/
Hi there,
The reason is that your loss function is not converging to 0 but diverging to infinity. This usually happens when the learning rate you have chosen is too big: each gradient descent step overshoots the minimum, the parameters grow with every update, and the loss keeps increasing until it overflows to infinity and then turns into NaN. Please try a lower value, e.g. 0.0001.
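Here is a minimal sketch of the effect, assuming a simple one-feature linear model with L2-norm loss trained by gradient descent in NumPy. The data, variable names, and the two learning rates are purely illustrative, not taken from your code; the point is only that the same loop diverges with a large rate and converges with a small one.

```python
import numpy as np

rng = np.random.default_rng(42)
x = rng.uniform(-10, 10, size=1000)
t = 5 * x + 2 + rng.normal(0, 1, size=1000)      # targets: t = 5x + 2 + noise

def train(learning_rate, iterations=500):
    w, b = 0.0, 0.0
    for i in range(iterations):
        y = w * x + b                             # model prediction
        loss = np.mean((y - t) ** 2) / 2          # L2-norm loss
        if i % 100 == 0:
            print(f"iter {i:4d}  loss {loss}")
        dw = np.mean((y - t) * x)                 # gradient w.r.t. w
        db = np.mean(y - t)                       # gradient w.r.t. b
        w -= learning_rate * dw                   # gradient descent update
        b -= learning_rate * db

# Too large: the loss grows every step, overflows to inf, then becomes NaN
# (NumPy will also print overflow warnings along the way).
train(0.1)

# Small enough: the loss decreases steadily toward the noise floor.
train(0.0001)
```

If you run something like this with your own data, you should see the same pattern: once the loss starts growing from one iteration to the next, lowering the learning rate is the first thing to try.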
Best,
The 365 Team