Convolutional Neural Networks - Hung-yi Lee's Machine Learning CNN = convolution + max pooling + flatten 2022-07-21 Study > Machine Learning #keras #AI #Machine Learning #Deep Learning #CNN #convolution #max pooling #flatten
Class4 Hidden Markov Models (HMM) The two basic assumptions of HMM! 2022-07-21 Study > Speech Recognition #Speech Recognition #ASR #HMM #Forward Algorithm #Backward Algorithm #Viterbi Algorithm
Optimizing Neural Network Training - Hung-yi Lee's Machine Learning local minima -> saddle point / Optimization with Batch / Gradient Descent + Momentum / Adaptive Learning Rate / loss function / batch normalization 2022-07-20 Study > Machine Learning #AI #Machine Learning #Deep Learning #critical point #local minima #saddle point #Optimization #Batch #momentum #gradient descent #adam #learning rate #RMSProp #learning rate decay #warm up #loss function #batch normalization #feature normalization #Root mean square #Cross-Entropy #Mean square error
Deep Learning and Backpropagation - Hung-yi Lee's Machine Learning DL is ML that uses neural networks. 2022-07-16 Study > Machine Learning #Machine Learning #Deep Learning #OPA #Neural Network #feedforward #layer #Matrix Operation #backpropagation #activation function
Error and Gradient Descent - Hung-yi Lee's Machine Learning error = bias + variance / Gradient Descent comes from Taylor Series 2022-07-14 Study > Machine Learning #AI #Gradient Descent #Machine Learning #Deep Learning #Adam #error #bias #variance #regularization
Regression - Hung-yi Lee's Machine Learning Off to cram school with Pokémon~ 2022-07-12 Study > Machine Learning #AI #Machine Learning #Deep Learning #Adam #regularization #Regression
Introduction to Machine Learning - Hung-yi Lee's Machine Learning Thanks to Datawhale for the team-study opportunity~ 2022-07-11 Study > Machine Learning #AI #Machine Learning #Deep Learning