Hash | Message | Date
4a4df380e3 | if name == main | 2023-03-23 18:09:52 +00:00
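The message "if name == main" refers to the standard Python entry-point guard. A minimal sketch (the function name `main` is an assumption; the commit presumably wraps the script's top-level code this way so it can be imported without side effects):

```python
# Code under this guard runs only when the file is executed directly,
# not when it is imported as a module.
def main() -> int:
    print("running as a script")
    return 0

if __name__ == "__main__":
    main()
```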
81cad8e6b4 | newline | 2023-03-23 18:01:21 +00:00
1bd59dc038 | finish script to plot generic metrics | 2023-03-23 18:00:00 +00:00
698bbe2ffb | start working on plotting script, but it isn't finished yet | 2023-03-22 17:45:06 +00:00
6b17d45aad | dlr: plot all metrics | 2023-03-22 17:41:34 +00:00
e565c36149 | deeplabv3+: prepare for ConvNeXt | 2023-03-14 21:51:41 +00:00
779b546897 | dlr: log new env vars | 2023-03-14 20:28:22 +00:00
623208ba6d | dlr: add env var for water thresholding | 2023-03-14 20:18:39 +00:00
c5fc62c411 | dlr CHANGE: Add optional log(cosh(dice_loss)) (ref: https://doi.org/10.1109/cibcb48159.2020.9277638) | 2023-03-10 20:24:13 +00:00
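The log(cosh(dice_loss)) variant added in c5fc62c411 comes from the referenced paper (Jadon 2020): wrapping the soft Dice loss in log(cosh(·)) smooths the loss landscape, since log cosh x behaves like x²/2 near zero. A NumPy sketch under that assumption (the real implementation presumably uses TensorFlow ops; function names here are illustrative):

```python
import numpy as np

def dice_loss(y_true, y_pred, eps=1e-7):
    # soft Dice loss: 1 - 2|A ∩ B| / (|A| + |B|), with eps for stability
    intersection = np.sum(y_true * y_pred)
    return 1.0 - (2.0 * intersection + eps) / (np.sum(y_true) + np.sum(y_pred) + eps)

def log_cosh_dice_loss(y_true, y_pred):
    # log(cosh(x)) is smooth and non-negative; ~x^2/2 for small x
    return np.log(np.cosh(dice_loss(y_true, y_pred)))
```

A perfect prediction gives a Dice loss of 0, and log(cosh(0)) = 0, so the wrapped loss keeps the same optimum.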
f25d1b5b1a | dlr CHANGE: properly normalise the heightmap | 2023-03-10 19:13:32 +00:00
8e76ac2864 | dlr: typo | 2023-03-10 17:40:16 +00:00
5e79825292 | dlr: tf.Tensor → numpy array → list → json → disk | 2023-03-10 17:34:49 +00:00
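The chain named in 5e79825292 (tf.Tensor → numpy array → list → json → disk) can be sketched as below. A tf.Tensor exposes `.numpy()`; this version starts from a NumPy array so it runs without TensorFlow, and `write_raw_outputs` is a hypothetical name for whatever the script calls this step:

```python
import json
import numpy as np

def write_raw_outputs(batch: np.ndarray, filepath: str) -> None:
    # numpy array → nested Python lists → one JSON document per line (jsonl)
    with open(filepath, "a") as handle:
        handle.write(json.dumps(batch.tolist()) + "\n")
```

Because the file is opened in append mode, each batch becomes one line; that is presumably why cf37aeb11a truncates the jsonl file before starting, so runs do not mix.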
b287970032 | dlr: missing import | 2023-03-10 17:31:03 +00:00
e6cd0aa1b9 | dlr: fix crash | 2023-03-10 17:14:06 +00:00
cf37aeb11a | dlr: truncate jsonl before we start to avoid mixing things up | 2023-03-10 17:11:10 +00:00
5fdf229d06 | dlr: write raw outputs to jsonl file | 2023-03-10 17:07:44 +00:00
d24381babb | dlr: make size of prediction graph actually sensible | 2023-03-09 19:54:27 +00:00
8d932757b5 | dlr: variable name typo | 2023-03-09 19:44:39 +00:00
b6ca885fe2 | dlr: told you so | 2023-03-09 19:43:35 +00:00
f9df19cfd5 | dlr: next of several typos | 2023-03-09 19:34:45 +00:00
96eae636ea | dir: add missing functions to .load() custom objs; apparently metrics are also required to be included here... | 2023-03-09 19:26:57 +00:00
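What 96eae636ea (and later f70083bea4) ran into: Keras will not deserialise a saved model that references custom callables unless every one of them is passed via `custom_objects`, and that includes custom metrics, not just the loss. A sketch under that assumption; the metric names are guesses based on the surrounding commits, and `make_custom_objects` is a hypothetical helper:

```python
def make_custom_objects(**custom_callables):
    # Keras looks callables up by the name recorded in the saved model,
    # so the dict keys must match those recorded names exactly.
    return dict(custom_callables)

# Hypothetical usage (requires TensorFlow):
# model = tf.keras.models.load_model(
#     checkpoint_path,
#     custom_objects=make_custom_objects(
#         dice_loss=dice_loss,
#         sensitivity=sensitivity,
#         specificity=specificity,
#     ),
# )
```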
ad52ae9241 | dir: also plot inputs | 2023-03-09 19:13:25 +00:00
436ab78438 | dlr: when predicting, also display heatmap ...of positive predictions | 2023-03-09 18:54:28 +00:00
5195fe6b62 | jupyter test: heatmaps in matplotlib & seaborn | 2023-03-09 18:52:56 +00:00
45c76ba252 | typo | 2023-03-03 22:46:02 +00:00
c909cfd3d1 | fixup | 2023-03-03 22:45:34 +00:00
0734201107 | dlr: tf graph changes | 2023-03-03 22:44:49 +00:00
750f46dbd2 | debug | 2023-03-03 22:39:30 +00:00
5472729f5e | dlr: fixup argmax & y_true/y_pred | 2023-03-03 22:37:36 +00:00
bc734a29c6 | y_true is one-hot, convert to sparse | 2023-03-03 22:20:11 +00:00
c7b577ab29 | specificity: convert to plain tf | 2023-03-03 22:16:48 +00:00
26cc824ace | dlr: MeanIoU fixup | 2023-03-03 22:10:49 +00:00
e9dcbe3863 | dlr: fixup | 2023-03-03 22:09:05 +00:00
5c6789bf40 | meaniou: implement one-hot version; it expects sparse, but our output is one-hot. | 2023-03-03 22:04:21 +00:00
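The problem 5c6789bf40 and 3d051a8874 are wrestling with: Keras' MeanIoU expects sparse integer class labels, but the model emits one-hot/softmax outputs of shape [batch, h, w, classes], so both tensors must be argmax-ed over the last axis first. A NumPy sketch of the idea (the actual code presumably subclasses `tf.keras.metrics.MeanIoU` and applies `tf.math.argmax` instead; `mean_iou_one_hot` is an illustrative name):

```python
import numpy as np

def mean_iou_one_hot(y_true_onehot, y_pred_onehot, num_classes):
    # collapse the trailing class dimension, e.g. [64,128,128,2] → [64,128,128]
    y_true = np.argmax(y_true_onehot, axis=-1).ravel()
    y_pred = np.argmax(y_pred_onehot, axis=-1).ravel()
    ious = []
    for c in range(num_classes):
        intersection = np.sum((y_true == c) & (y_pred == c))
        union = np.sum((y_true == c) | (y_pred == c))
        if union > 0:  # skip classes absent from both tensors
            ious.append(intersection / union)
    return float(np.mean(ious))
```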
6ffda40d48 | fixup | 2023-03-03 21:54:45 +00:00
9b13e9ca5b | dlr: fixup argmax first | 2023-03-03 21:51:24 +00:00
7453c607ed | argmax for sensitivity & specificity too | 2023-03-03 21:49:33 +00:00
8470aec996 | dlr: fixup | 2023-03-03 21:45:51 +00:00
3d051a8874 | dlr: HACK: argmax to convert [64,128,128, 2] → [64,128,128] | 2023-03-03 21:41:26 +00:00
94a32e7144 | dlr: fix metrics | 2023-03-03 20:37:22 +00:00
c7f96ab6ab | dlr: typo | 2023-03-03 20:23:03 +00:00
06f956dc07 | dlr: default LOSS, EPOCHS, and PREDICT_COUNT to better values (ref: recent experiments) | 2023-03-03 20:17:08 +00:00
b435cc54dd | dlr: add sensitivity (aka recall) and specificity metrics | 2023-03-03 20:00:05 +00:00
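The sensitivity and specificity metrics from b435cc54dd and 483ecf11c8, sketched in NumPy for binary labels (the real versions are presumably TensorFlow metrics built from true/false positive/negative counts, applied after the argmax step from 7453c607ed):

```python
import numpy as np

def sensitivity(y_true, y_pred):
    # recall: TP / (TP + FN), the fraction of actual positives caught
    tp = np.sum((y_true == 1) & (y_pred == 1))
    fn = np.sum((y_true == 1) & (y_pred == 0))
    return tp / (tp + fn)

def specificity(y_true, y_pred):
    # TN / (TN + FP), the fraction of actual negatives correctly rejected
    tn = np.sum((y_true == 0) & (y_pred == 0))
    fp = np.sum((y_true == 0) & (y_pred == 1))
    return tn / (tn + fp)
```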
483ecf11c8 | add specificity metric | 2023-03-03 19:35:20 +00:00
d464c9f57d | dlr: add dice loss as metric; more metrics to go tho | 2023-03-03 19:34:55 +00:00
f70083bea4 | dlr eo: set custom_objects when loading model | 2023-03-01 17:19:10 +00:00
b5f23e76d1 | dlr eo: allow setting DIR_OUTPUT directly | 2023-03-01 16:54:15 +00:00
4fd9feba4f | dlr eo: tidyup | 2023-03-01 16:47:36 +00:00
69b5ae8838 | dlr eo: this should fix it | 2023-02-23 17:24:30 +00:00
9f1cee2927 | dlr eo: cheese it by upsampling and then downsampling again | 2023-02-23 16:47:00 +00:00
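One plausible reading of "cheese it by upsampling and then downsampling again" (9f1cee2927), sketched in NumPy: when a tensor's spatial size does not match what the next layer expects, nearest-neighbour upsample by an integer factor and then stride back down. In Keras this would likely be `UpSampling2D` followed by a strided or pooling layer; the factor of 2 and the function names here are assumptions:

```python
import numpy as np

def upsample2x(x):
    # repeat each pixel 2x along both spatial axes: [h, w] → [2h, 2w]
    return np.repeat(np.repeat(x, 2, axis=0), 2, axis=1)

def downsample2x(x):
    # take every second pixel: [2h, 2w] → [h, w]
    return x[::2, ::2]
```

The round trip is lossless for nearest-neighbour upsampling, which is what makes it a workable (if inelegant) shape fix.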