수학 전공자의 개발 이야기 (A Math Major's Development Story)
    감자

    Parallel Programming Developer

    • South Korea
    • GitHub
    • 📚 Total posts: 133
      C++
      • C++ STL & Standard (14)
      Certifications
      • AI Certification (1)
      Cloud
      • Cloud Computing Theory (1)
      • Azure (22)
      Artificial Intelligence
      • Machine Learning (3)
      • Deep Learning (10)
      Graphics
      • OpenCV (2)
      Algorithm
      • Algorithm Theory (16)
      Computer Science
      Coding Test
      • Mathematics Theory (3)
      • Python (30)
      • C++ (6)
      Development
      • Git (3)
      • Web Development (7)
      • Dev Tools (1)
      • Environment Setup (5)
      etc
      • Blog (8)

    Recent Posts

    2025.02.06

    [Deep Learning] 8. Various Ways to Improve Deep Learning Model Training

    Deep Learning, AI, deep learning, dropout, hyperparameter, neural network, overfitting, parameter, training methods, weight decay

    2025.02.06

    [Deep Learning] 7. Batch Normalization

    Deep Learning, AI, batch normalization formula, batch normalization, deep learning, neural network

    2025.02.06

    [Deep Learning] 6. Methods for Setting Initial Weight Values

    Deep Learning, AI, deep learning, neural network, weight initialization methods, weight initialization

    2025.02.05

    [Deep Learning] 5. Various Methods for Updating Parameters

    Deep Learning, AI, deep learning, gradient descent, neural network, optimizer, stochastic gradient descent

    2025.02.05

    [Deep Learning] 4. Understanding Error Backpropagation

    Deep Learning, AI, backpropagation, deep learning, error backward propagation, gradient descent, loss function, neural network, training neural network
    © 2025 김기덕. Powered by Jekyll & Minimal Mistakes.