5472729f5e  2023-03-03 22:37:36 +00:00  dlr: fixup argmax & y_true/y_pred
bc734a29c6  2023-03-03 22:20:11 +00:00  y_true is one-hot, convert to sparse
c7b577ab29  2023-03-03 22:16:48 +00:00  specificity: convert to plain tf
26cc824ace  2023-03-03 22:10:49 +00:00  dlr: MeanIoU fixup
5c6789bf40  2023-03-03 22:04:21 +00:00  meaniou: implement one-hot version
    it expects sparse, but our output is one-hot.
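Commit 5c6789bf40 above works around `tf.keras.metrics.MeanIoU` expecting sparse integer labels while the model emits one-hot output. A minimal sketch of the idea (the class name and details here are illustrative, not taken from the repo; recent Keras releases also ship a built-in `tf.keras.metrics.OneHotMeanIoU`):

```python
import tensorflow as tf

class MeanIoUOneHot(tf.keras.metrics.MeanIoU):
	"""MeanIoU variant that accepts one-hot y_true / y_pred.

	tf.keras.metrics.MeanIoU expects sparse integer class labels,
	so collapse the channel axis with argmax before delegating."""

	def update_state(self, y_true, y_pred, sample_weight=None):
		y_true = tf.math.argmax(y_true, axis=-1)
		y_pred = tf.math.argmax(y_pred, axis=-1)
		return super().update_state(y_true, y_pred, sample_weight)
```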
6ffda40d48  2023-03-03 21:54:45 +00:00  fixup
9b13e9ca5b  2023-03-03 21:51:24 +00:00  dlr: fixup argmax first
7453c607ed  2023-03-03 21:49:33 +00:00  argmax for sensitivity & specificity too
8470aec996  2023-03-03 21:45:51 +00:00  dlr: fixup
3d051a8874  2023-03-03 21:41:26 +00:00  dlr: HACK: argmax to convert [64,128,128, 2] → [64,128,128]
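The HACK in commit 3d051a8874 above — collapsing `[64,128,128, 2]` one-hot class scores down to `[64,128,128]` class ids — is a single argmax over the channel axis:

```python
import tensorflow as tf

# Fake batch of per-pixel 2-class scores: [batch, height, width, classes]
scores = tf.random.uniform([64, 128, 128, 2])

# Collapse the class axis to integer labels: [batch, height, width]
labels = tf.math.argmax(scores, axis=-1)

print(labels.shape)  # (64, 128, 128)
```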
94a32e7144  2023-03-03 20:37:22 +00:00  dlr: fix metrics
b435cc54dd  2023-03-03 20:00:05 +00:00  dlr: add sensitivity (aka recall) and specificity metrics
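Commit b435cc54dd adds sensitivity and specificity. Keras has no built-in specificity metric function, so something along these lines is plausible for binary segmentation (a hedged sketch, not the repo's actual code):

```python
import tensorflow as tf

def sensitivity(y_true, y_pred):
	"""Recall / true positive rate: TP / (TP + FN)."""
	y_pred = tf.cast(y_pred > 0.5, tf.float32)
	y_true = tf.cast(y_true, tf.float32)
	tp = tf.reduce_sum(y_true * y_pred)
	fn = tf.reduce_sum(y_true * (1.0 - y_pred))
	return tp / (tp + fn + tf.keras.backend.epsilon())

def specificity(y_true, y_pred):
	"""True negative rate: TN / (TN + FP)."""
	y_pred = tf.cast(y_pred > 0.5, tf.float32)
	y_true = tf.cast(y_true, tf.float32)
	tn = tf.reduce_sum((1.0 - y_true) * (1.0 - y_pred))
	fp = tf.reduce_sum((1.0 - y_true) * y_pred)
	return tn / (tn + fp + tf.keras.backend.epsilon())
```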
483ecf11c8  2023-03-03 19:35:20 +00:00  add specificity metric
d464c9f57d  2023-03-03 19:34:55 +00:00  dlr: add dice loss as metric
    more metrics to go tho
9f1cee2927  2023-02-23 16:47:00 +00:00  dlr eo: cheese it by upsampling and then downsampling again
8446a842d1  2023-02-03 16:01:54 +00:00  typo
1a8f10339a  2023-02-02 16:22:08 +00:00  LayerConvNeXtGamma: fix for mixed precision mode
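Commit 1a8f10339a fixes LayerConvNeXtGamma under mixed precision. The usual failure mode for a ConvNeXt-style layer-scale multiply is a float16/float32 dtype mismatch; a speculative sketch of such a layer and fix (the real layer's fields and signature may differ):

```python
import tensorflow as tf

class LayerConvNeXtGamma(tf.keras.layers.Layer):
	"""ConvNeXt-style layer scale: multiply by a per-channel constant.

	Under mixed precision the inputs arrive as float16 while the
	stored constant is float32, so cast it to the input dtype."""

	def __init__(self, const_val=1e-6, dim=None, **kwargs):
		super().__init__(**kwargs)
		self.dim = dim
		self.const = tf.constant(const_val, shape=[dim])

	def call(self, inputs):
		# The cast is the mixed-precision fix: without it, multiplying a
		# float16 tensor by a float32 constant raises a dtype error.
		return inputs * tf.cast(self.const, inputs.dtype)
```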
e1ad16a213  2023-01-20 18:58:45 +00:00  debug A
65a2e16a4c  2023-01-20 18:55:52 +00:00  ds_eo: lower memory usage
b5e68fc1a3  2023-01-20 18:49:46 +00:00  eo: don't downsample ConvNeXt at beginning
5b54ceec48  2023-01-20 18:36:14 +00:00  ds eo: debug
a3787f0647  2023-01-20 18:34:56 +00:00  debug
9bd5e0f7a3  2023-01-20 17:25:22 +00:00  ds_eo: debug
    there's something fishy going on here.
aa3831bc4f  2023-01-17 19:07:16 +00:00  ds_eo: pass water to reshape
a01c49414f  2023-01-17 19:06:29 +00:00  ds_eo: typo
b9bea26d26  2023-01-17 19:03:14 +00:00  ds_eo: rogue variables
2f0ce0aa13  2023-01-13 18:27:03 +00:00  again
e04d6ab1b6  2023-01-13 18:23:32 +00:00  fixup again
37d1598b0b  2023-01-13 18:21:11 +00:00  loss cel+dice: fixup
0f0b691b5d  2023-01-13 18:09:31 +00:00  loss cel + dice: fixup
be77f035c8  2023-01-13 17:58:00 +00:00  dlr: add cross-entropy + dice loss fn option
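Commit be77f035c8 above adds a combined cross-entropy + dice loss option. A self-contained sketch of the standard formulation (function names are illustrative); note the `1.0 - dice` form, matching commit bcd2f1251e further down, which keeps the loss non-negative:

```python
import tensorflow as tf

def dice_loss(y_true, y_pred, smooth=1.0):
	"""1 - Dice coefficient, with smoothing to avoid division by zero."""
	y_true = tf.cast(y_true, tf.float32)
	y_pred = tf.cast(y_pred, tf.float32)
	intersection = tf.reduce_sum(y_true * y_pred)
	union = tf.reduce_sum(y_true) + tf.reduce_sum(y_pred)
	return 1.0 - (2.0 * intersection + smooth) / (union + smooth)

def cel_dice_loss(y_true, y_pred):
	"""Binary cross-entropy plus dice, as a single combined loss."""
	cel = tf.reduce_mean(
		tf.keras.losses.binary_crossentropy(y_true, y_pred))
	return cel + dice_loss(y_true, y_pred)
```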
b2a5acaf4e  2023-01-13 17:26:38 +00:00  dlr dsm: clash
3c4d1c5140  2023-01-13 16:57:26 +00:00  dlr: Add support for stripping isolated water pixels
    That is, water pixels that have no other water pixels immediately adjacent thereto (diagonals count).
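Commit 3c4d1c5140 strips isolated water pixels — water pixels with no adjacent water, diagonals included. One common way to implement this in plain TensorFlow is a 3×3 neighbour-count convolution (a sketch, not the repo's code):

```python
import tensorflow as tf

def strip_isolated_water(water):
	"""Zero out water pixels with no adjacent water pixel
	(8-connectivity, so diagonals count).

	`water` is a binary tensor of shape [batch, height, width]."""
	water = tf.cast(water, tf.float32)
	# 3x3 kernel of ones with a zero centre counts neighbours only.
	kernel = tf.constant([[1., 1., 1.],
	                      [1., 0., 1.],
	                      [1., 1., 1.]])
	kernel = tf.reshape(kernel, [3, 3, 1, 1])
	neighbours = tf.nn.conv2d(
		water[..., tf.newaxis], kernel,
		strides=1, padding="SAME")[..., 0]
	# Keep a water pixel only if at least one neighbour is water.
	return water * tf.cast(neighbours > 0, tf.float32)
```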
ce1467461d  2023-01-13 16:47:52 +00:00  fixup
0b676fa391  2023-01-13 16:47:35 +00:00  move shuffle to subdir
1958c4e6c2  2023-01-09 19:33:41 +00:00  encoderonly model: getting there
52cf66ca32  2023-01-06 19:55:52 +00:00  start working on a quick encoder test, but it's far from finished
4563fe6b27  2023-01-05 19:03:44 +00:00  dpl: fix moar crashes
fefeb5d531  2023-01-05 19:01:20 +00:00  fix water depth fiddling
46d1f5e4e0  2023-01-05 19:00:52 +00:00  dataset_mono: fix dataset parsing
aca7b83a78  2023-01-05 18:38:47 +00:00  dataset_mono: fix sizing
    it didn't account for rainfall_scale_up
19bb2fcac0  2023-01-05 18:32:22 +00:00  debug
ef5071b569  2022-12-15 19:33:14 +00:00  DeepLabV3+: start working on version for rainfall radar, but it's not finished yet
eb47f8f544  2022-12-13 18:37:38 +00:00  dataset_mono: adjust to suit DeepLabV3+ too
4e4d42a281  2022-12-12 18:34:20 +00:00  LossDice: add comment
449bc425a7  2022-12-12 17:20:32 +00:00  LossDice: explicitly cast inputs to float32
dbf8f5617c  2022-12-12 17:20:04 +00:00  drop activation function in last layers
bcd2f1251e  2022-12-09 19:41:32 +00:00  LossDice: Do 1 - thing instead of -thing
d0dbc50bb7  2022-12-09 19:33:28 +00:00  debug
2142bb039c  2022-12-09 19:30:01 +00:00  again