I've included a few of my favourite models, but this is not an exhaustive collection:
* MobileNet-V3 (work in progress, validating config)
* ChamNet (details hard to find, currently an educated guess)
* FBNet-C (TODO A/B variants)

The full list of model strings that can be passed to the model factory via the `--model` arg of the train, validation, and inference scripts:
```
chamnetv1_100
chamnetv2_100
densenet121
densenet161
densenet169
densenet201
dpn107
dpn131
dpn68
dpn68b
dpn92
dpn98
fbnetc_100
inception_resnet_v2
inception_v4
mnasnet_050
mnasnet_075
mnasnet_100
mnasnet_140
mnasnet_small
mobilenetv1_100
mobilenetv2_100
mobilenetv3_050
mobilenetv3_075
mobilenetv3_100
pnasnet5large
resnet101
resnet152
resnet18
resnet34
resnet50
resnext101_32x4d
resnext101_64x4d
resnext152_32x4d
resnext50_32x4d
semnasnet_050
semnasnet_075
semnasnet_100
semnasnet_140
seresnet101
seresnet152
seresnet18
seresnet34
seresnet50
seresnext101_32x4d
seresnext26_32x4d
seresnext50_32x4d
spnasnet_100
tflite_mnasnet_100
tflite_semnasnet_100
xception
```
## Features
Several (less common) features that I often utilize in my projects are included. Many of these additions are the reason I maintain my own set of models instead of using others' via PIP:
* All models have a common default configuration interface and API for

I've leveraged the training scripts in this repository to train a few of the models.
NOTE: For some reason I can't hit the stated accuracy with my impl of MNASNet and Google's tflite weights. Using a TF equivalent to 'SAME' padding was important to get > 70%, but something small is still missing. Trying to train my own weights from scratch with these models has so far leveled off in the same 72-73% range.

## Script Usage
### Training
The variety of training args is large and not all combinations of options (or even single options) have been fully tested. For the training dataset argument, specify the base folder that contains `train` and `validation` sub-folders.
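
For example, with a base folder of `/data/imagenet`, a layout along these lines is expected (class sub-folder and file names are illustrative):

```
/data/imagenet/
    train/
        class1/
            img1.jpeg
            ...
        class2/
            ...
    validation/
        class1/
            img2.jpeg
            ...
        class2/
            ...
```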
To train an SE-ResNet34 on ImageNet, locally distributed, 4 GPUs, one process per GPU w/ cosine schedule, random-erasing prob of 50% and per-pixel random value:
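
A sketch of such a command, assuming a `distributed_train.sh` launcher and the flag names below (launcher name and all flags are assumptions, not verified against the scripts' actual arguments):

```shell
# 4 = number of GPUs / processes; flag names are illustrative, check train script --help
./distributed_train.sh 4 /data/imagenet --model seresnet34 --sched cosine --reprob 0.5 --remode pixel
```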

NOTE: NVIDIA APEX should be installed to run in per-process distributed via DDP or to enable AMP mixed precision with the `--amp` flag.

### Validation / Inference
Validation and inference scripts are similar in usage. One outputs metrics on a validation set and the other outputs top-k class ids in a csv. Specify the folder containing the validation images, not the base folder as in the training script.
To validate with the model's pretrained weights (if they exist):
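
A hypothetical invocation, assuming a `validate.py` script and a `--pretrained` flag (both names are assumptions based on the description above):

```shell
# point at the validation image folder directly, not the dataset base folder
python validate.py /data/imagenet/validation/ --model seresnext26_32x4d --pretrained
```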