Deep Learning

Three things to check when your validation loss is not decreasing

by 달죽 2020. 11. 10.


1. Data preprocessing: standardize and normalize the data (batch normalization, feature scaling); a minimal sketch follows below.
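A minimal sketch of this step, assuming the features live in NumPy arrays named X_train and X_val (illustrative names, not from the original post). Batch normalization itself is a model layer and appears in the sketch for step 2.

```python
import numpy as np

# Placeholder arrays standing in for real training/validation features (assumption).
X_train = np.random.rand(1000, 20)
X_val = np.random.rand(200, 20)

# Standardization: zero mean, unit variance, computed on the training set only
# and then applied unchanged to the validation set.
mean = X_train.mean(axis=0)
std = X_train.std(axis=0) + 1e-8              # small epsilon avoids division by zero
X_train_std = (X_train - mean) / std
X_val_std = (X_val - mean) / std

# Min-max scaling to [0, 1] is the other common option mentioned above.
lo, hi = X_train.min(axis=0), X_train.max(axis=0)
X_train_scaled = (X_train - lo) / (hi - lo + 1e-8)
X_val_scaled = (X_val - lo) / (hi - lo + 1e-8)
```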

 

2. Model complexity: check whether the model is too complex. Add dropout and reduce the number of layers or the number of neurons per layer; see the sketch below.
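A sketch of a deliberately small Keras model with dropout and batch normalization. The layer widths, dropout rate, and sigmoid output are illustrative assumptions, not values from the original post.

```python
import tensorflow as tf

# A small network: two hidden layers of modest width, dropout after each,
# plus batch normalization. All sizes here are illustrative, not tuned values.
model = tf.keras.Sequential([
    tf.keras.Input(shape=(20,)),                    # 20 features, matching the sketch above
    tf.keras.layers.Dense(64, activation="relu"),
    tf.keras.layers.BatchNormalization(),
    tf.keras.layers.Dropout(0.3),                   # drop 30% of units during training
    tf.keras.layers.Dense(32, activation="relu"),
    tf.keras.layers.Dropout(0.3),
    tf.keras.layers.Dense(1, activation="sigmoid"), # binary classification head (assumption)
])
```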

 

3. Learning rate and decay rate: reduce the learning rate!

A good starting value is usually between 0.0005 and 0.001. Also consider a decay rate of 1e-6; a minimal sketch follows below.
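A sketch of the optimizer setup with a lower learning rate and the suggested 1e-6 decay. InverseTimeDecay is used here to reproduce the classic Keras-style decay behaviour, and the choice of Adam is an assumption.

```python
import tensorflow as tf

# Start the learning rate in the suggested 0.0005-0.001 range and shrink it slowly.
# InverseTimeDecay with decay_steps=1 mirrors the classic Keras `decay` argument:
# lr(step) = initial_lr / (1 + decay_rate * step).
lr_schedule = tf.keras.optimizers.schedules.InverseTimeDecay(
    initial_learning_rate=0.001,
    decay_steps=1,
    decay_rate=1e-6,
)

optimizer = tf.keras.optimizers.Adam(learning_rate=lr_schedule)
# e.g. model.compile(optimizer=optimizer, loss="binary_crossentropy", metrics=["accuracy"])
```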

 

 

The model is overfitting right from epoch 10: the validation loss is increasing while the training loss is decreasing.

Dealing with such a model:

  1. Data Preprocessing: Standardizing and Normalizing the data.
  2. Model complexity: Check if the model is too complex. Add dropout, reduce the number of layers or the number of neurons in each layer.
  3. Learning Rate and Decay Rate: Reduce the learning rate, a good starting value is usually between 0.0005 to 0.001. Also consider a decay rate of 1e-6.

 

Source: <https://datascience.stackexchange.com/questions/43191/validation-loss-is-not-decreasing>

