- First choose the network layers and the number of neurons per layer, from the input layer to the output layer.
- Example: network = [2, 50, 20, 3] gives an input layer of 2, a first hidden layer of 50, a second hidden layer of 20, and a softmax output layer of 3.
- Then create the network object ANN and choose:
- Optimizer = ['adam', 'momentum', 'SGD']
- Regularization = ['L2', 'dropout', 'none']
- Activation_function = ['relu', 'sigmoid']
- Then choose the hyperparameters:
- Learning_rate
- Lambd, for L2 regularization
- Keep_prop, for dropout
- Beta, for optimizer='momentum'
- Batch_size, for minibatch
- Example (an alternative configuration is sketched right after this example):
net = ANN(network, iteration=300, optimizer='adam', regularization='L2', activation_function='relu', learning_rate=0.1, lambd=0.2, keep_prop=0.9, beta=0.9, batch_size=64)
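As a rough sketch only: the snippet below builds the layer-size list and a momentum + dropout configuration using the same ANN constructor keywords shown above. The toy data, its orientation (features-by-samples is assumed here), and the chosen hyperparameter values are illustrative assumptions, not part of the library's documentation.

```python
import numpy as np

# layer sizes: 2 input features, two hidden layers (50 and 20 neurons), 3 softmax outputs
network = [2, 50, 20, 3]

# hypothetical toy data -- the orientation (features x samples vs. samples x features)
# that fit() expects is not documented here, so treat these shapes as an assumption
X = np.random.randn(2, 200)                 # 2 features, 200 examples
labels = np.random.randint(0, 3, size=200)  # integer class labels for 3 classes
y_hot = np.eye(3)[labels].T                 # one-hot encoding, shape (3, 200)

# an alternative configuration: momentum optimizer with dropout regularization,
# using the same keyword names as the adam/L2 example above
net = ANN(network, iteration=300, optimizer='momentum', regularization='dropout',
          activation_function='sigmoid', learning_rate=0.05, keep_prop=0.8,
          beta=0.9, batch_size=64)
```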
- Then call the fit function and feed it the input and the output (one-hot encoded):
loss = net.fit(X, y_hot)  # fit returns the losses, which you can plot
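Since fit returns the losses, you can plot the training curve. A minimal sketch, assuming loss holds one value per iteration and that matplotlib is installed:

```python
import matplotlib.pyplot as plt

# loss comes from net.fit(X, y_hot) above; assumed to hold one value per iteration
plt.plot(loss)
plt.xlabel('iteration')
plt.ylabel('loss')
plt.show()
```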
- To check the training accuracy you can use:
net.predict(X, y)
- To make predictions use:
scores, zs = net.forward_prop(X)  # scores[-1] is the softmax output unit
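To turn the softmax output into class labels you can take the arg-max over the class dimension. A sketch under the same assumptions as above (classes along the first axis, integer labels available in a hypothetical labels array):

```python
import numpy as np

scores, zs = net.forward_prop(X)
probs = scores[-1]                 # softmax probabilities from the output layer

# axis=0 assumes classes lie along the first dimension (one column per example);
# switch to axis=1 if the library stores one row per example
y_pred = np.argmax(probs, axis=0)

# compare against the integer labels (not one-hot) to get training accuracy
accuracy = np.mean(y_pred == labels)
print('training accuracy:', accuracy)
```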