Commit graph

275 commits

b5b26d980b
dlr: print filepaths
we need to know which is which with this seed so we can visualise it for the paper
2023-07-13 19:58:38 +01:00
9efc72db73
dlr ds/mono: just why 2023-06-16 18:36:18 +01:00
a4e80229fb
dlr ds/mono: fixup 2023-06-16 18:35:02 +01:00
fe05dd33fc
dlr: missing quotes 2023-05-11 16:23:36 +01:00
20092c6829
shuffle: add random seed env var 2023-05-11 15:59:01 +01:00
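For reference, a minimal sketch of the env-var seeding pattern this commit describes (SHUFFLE_SEED is a hypothetical name; the actual variable isn't shown here):

```python
import os
import random

# Hypothetical variable name; the commit doesn't show the actual one.
seed = os.environ.get("SHUFFLE_SEED")
if seed is not None:
    random.seed(int(seed))  # same seed → same shuffle order

items = list(range(10))
random.shuffle(items)
print(items)
```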
31687da931
celldice: actually do log(cosh())
....just wow.
2023-05-03 15:00:10 +01:00
c5fc62c411
dlr CHANGE: Add optional log(cosh(dice_loss))
Ref https://doi.org/10.1109/cibcb48159.2020.9277638
2023-03-10 20:24:13 +00:00
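The referenced paper proposes passing the dice loss through log(cosh(·)) to smooth it. A minimal sketch of the idea, assuming a common soft-dice formulation rather than the repo's exact code:

```python
import tensorflow as tf

def dice_loss(y_true, y_pred, smooth=1e-6):
    # Soft dice loss; a common formulation, not necessarily the repo's.
    y_true = tf.cast(y_true, y_pred.dtype)
    intersection = tf.reduce_sum(y_true * y_pred)
    totals = tf.reduce_sum(y_true) + tf.reduce_sum(y_pred)
    return 1.0 - (2.0 * intersection + smooth) / (totals + smooth)

def log_cosh_dice_loss(y_true, y_pred):
    # The wrapper this commit adds (and which 31687da931 later fixes
    # to actually compute): log(cosh(dice)), per the DOI above.
    return tf.math.log(tf.math.cosh(dice_loss(y_true, y_pred)))
```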
f25d1b5b1a
dlr CHANGE: properly normalise the heightmap 2023-03-10 19:13:32 +00:00
45c76ba252
typo 2023-03-03 22:46:02 +00:00
c909cfd3d1
fixup 2023-03-03 22:45:34 +00:00
0734201107
dlr: tf graph changes 2023-03-03 22:44:49 +00:00
750f46dbd2
debug 2023-03-03 22:39:30 +00:00
5472729f5e
dlr: fixup argmax & y_true/y_pred 2023-03-03 22:37:36 +00:00
bc734a29c6
y_true is one-hot, convert to sparse 2023-03-03 22:20:11 +00:00
c7b577ab29
specificity: convert to plain tf 2023-03-03 22:16:48 +00:00
26cc824ace
dlr: MeanIoU fixup 2023-03-03 22:10:49 +00:00
5c6789bf40
meaniou: implement one-hot version
it expects sparse, but our output is one-hot.
2023-03-03 22:04:21 +00:00
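Keras's MeanIoU expects sparse integer labels, so a one-hot model output has to be argmax'd first. A plausible sketch of such a subclass (not necessarily the repo's implementation):

```python
import tensorflow as tf

class OneHotMeanIoU(tf.keras.metrics.MeanIoU):
    # tf.keras.metrics.MeanIoU expects sparse integer labels, so
    # convert one-hot → sparse via argmax over the channel axis.
    def update_state(self, y_true, y_pred, sample_weight=None):
        y_true = tf.math.argmax(y_true, axis=-1)
        y_pred = tf.math.argmax(y_pred, axis=-1)
        return super().update_state(y_true, y_pred, sample_weight)

metric = OneHotMeanIoU(num_classes=2)
```

Newer TensorFlow releases also ship tf.keras.metrics.OneHotMeanIoU, which performs the same conversion internally.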
6ffda40d48
fixup 2023-03-03 21:54:45 +00:00
9b13e9ca5b
dlr: fixup argmax first 2023-03-03 21:51:24 +00:00
7453c607ed
argmax for sensitivity & specificity too 2023-03-03 21:49:33 +00:00
8470aec996
dlr: fixup 2023-03-03 21:45:51 +00:00
3d051a8874
dlr: HACK: argmax to convert [64,128,128, 2] → [64,128,128] 2023-03-03 21:41:26 +00:00
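That shape conversion is an argmax over the trailing channel axis:

```python
import tensorflow as tf

y_pred = tf.random.uniform([64, 128, 128, 2])  # per-pixel class scores
labels = tf.math.argmax(y_pred, axis=-1)       # → shape [64, 128, 128]
```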
94a32e7144
dlr: fix metrics 2023-03-03 20:37:22 +00:00
b435cc54dd
dlr: add sensitivity (aka recall) and specificity metrics 2023-03-03 20:00:05 +00:00
483ecf11c8
add specificity metric 2023-03-03 19:35:20 +00:00
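Neither metric is hard to write in plain tf for binary masks; a sketch of both (sensitivity is equivalent to Keras's built-in Recall), not necessarily the repo's implementation:

```python
import tensorflow as tf

def sensitivity(y_true, y_pred):
    # aka recall: TP / (TP + FN), on binary {0,1} masks
    y_true = tf.cast(y_true, tf.float32)
    y_pred = tf.cast(y_pred, tf.float32)
    tp = tf.reduce_sum(y_true * y_pred)
    fn = tf.reduce_sum(y_true * (1.0 - y_pred))
    return tp / (tp + fn + tf.keras.backend.epsilon())

def specificity(y_true, y_pred):
    # TN / (TN + FP)
    y_true = tf.cast(y_true, tf.float32)
    y_pred = tf.cast(y_pred, tf.float32)
    tn = tf.reduce_sum((1.0 - y_true) * (1.0 - y_pred))
    fp = tf.reduce_sum((1.0 - y_true) * y_pred)
    return tn / (tn + fp + tf.keras.backend.epsilon())
```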
d464c9f57d
dlr: add dice loss as metric
more metrics to go tho
2023-03-03 19:34:55 +00:00
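Keras treats any (y_true, y_pred) → tensor callable as a metric, which is presumably how the dice loss gets tracked here without being optimised; a self-contained sketch:

```python
import tensorflow as tf

def dice_loss(y_true, y_pred):  # abbreviated from the sketch above
    y_true = tf.cast(y_true, y_pred.dtype)
    inter = tf.reduce_sum(y_true * y_pred)
    return 1.0 - 2.0 * inter / (
        tf.reduce_sum(y_true) + tf.reduce_sum(y_pred) + 1e-6)

model = tf.keras.Sequential(
    [tf.keras.layers.Dense(2, activation="softmax")])
model.compile(optimizer="adam",
              loss="categorical_crossentropy",
              metrics=[dice_loss])  # tracked each epoch, not optimised
```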
9f1cee2927
dlr eo: cheese it by upsampling and then downsampling again 2023-02-23 16:47:00 +00:00
8446a842d1
typo 2023-02-03 16:01:54 +00:00
1a8f10339a
LayerConvNeXtGamma: fix for mixed precision mode 2023-02-02 16:22:08 +00:00
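Under mixed precision, Keras keeps layer variables in float32 while activations run in float16, so a scale layer must cast its weight to the input dtype. A sketch of that fix pattern (the repo's actual layer isn't shown here):

```python
import tensorflow as tf

class LayerConvNeXtGamma(tf.keras.layers.Layer):
    # Learnable per-channel scale ("layer scale") as used in ConvNeXt;
    # a sketch, not the repo's actual implementation.
    def __init__(self, dim, init_value=1e-6, **kwargs):
        super().__init__(**kwargs)
        self.gamma = self.add_weight(
            name="gamma", shape=(dim,),
            initializer=tf.keras.initializers.Constant(init_value))

    def call(self, inputs):
        # The mixed-precision fix: under mixed_float16 the variable
        # stays float32 while the activations are float16, so cast
        # gamma to the input's dtype before multiplying.
        return inputs * tf.cast(self.gamma, inputs.dtype)
```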
e1ad16a213
debug A 2023-01-20 18:58:45 +00:00
65a2e16a4c
ds_eo: lower memory usage 2023-01-20 18:55:52 +00:00
b5e68fc1a3
eo: don't downsample ConvNeXt at beginning 2023-01-20 18:49:46 +00:00
5b54ceec48
ds eo: debug 2023-01-20 18:36:14 +00:00
a3787f0647
debug 2023-01-20 18:34:56 +00:00
9bd5e0f7a3
ds_eo: debug
there's something fishy going on here.
2023-01-20 17:25:22 +00:00
aa3831bc4f
ds_eo: pass water to reshape 2023-01-17 19:07:16 +00:00
a01c49414f
ds_eo: typo 2023-01-17 19:06:29 +00:00
b9bea26d26
ds_eo: rogue variables 2023-01-17 19:03:14 +00:00
2f0ce0aa13
again 2023-01-13 18:27:03 +00:00
e04d6ab1b6
fixup again 2023-01-13 18:23:32 +00:00
37d1598b0b
loss cel+dice: fixup 2023-01-13 18:21:11 +00:00
0f0b691b5d
loss cel + dice: fixup 2023-01-13 18:09:31 +00:00
be77f035c8
dlr: add cross-entropy + dice loss fn option 2023-01-13 17:58:00 +00:00
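Summing cross-entropy with dice is a common combination: cross-entropy supplies stable per-pixel gradients while dice targets region overlap directly. A sketch with equal weighting (the weighting is an assumption, not necessarily the repo's choice):

```python
import tensorflow as tf

def cel_dice_loss(y_true, y_pred, smooth=1e-6):
    # y_true is assumed one-hot, matching the softmax output y_pred.
    y_true = tf.cast(y_true, y_pred.dtype)
    cel = tf.reduce_mean(
        tf.keras.losses.categorical_crossentropy(y_true, y_pred))
    inter = tf.reduce_sum(y_true * y_pred)
    dice = 1.0 - (2.0 * inter + smooth) / (
        tf.reduce_sum(y_true) + tf.reduce_sum(y_pred) + smooth)
    return cel + dice
```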
b2a5acaf4e
dlr dsm: clash 2023-01-13 17:26:38 +00:00
3c4d1c5140
dlr: Add support for stripping isolated water pixels
That is, water pixels with no other water pixels immediately adjacent to them (diagonals count).
2023-01-13 16:57:26 +00:00
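Counting each pixel's 8-neighbourhood with a 3×3 convolution is one cheap way to find such pixels; a sketch of the idea (not the repo's actual code):

```python
import tensorflow as tf

def strip_isolated_water(water):
    # water: [H, W] binary mask. A pixel is "isolated" if none of its
    # 8 neighbours (diagonals included) are water; zero those out.
    w = tf.cast(water, tf.float32)[tf.newaxis, :, :, tf.newaxis]
    kernel = tf.ones((3, 3, 1, 1))
    # 3×3 sum counts self + neighbours; subtract self to get neighbours.
    neighbours = tf.nn.conv2d(w, kernel, strides=1, padding="SAME") - w
    keep = tf.cast(neighbours > 0.0, w.dtype)
    return tf.squeeze(w * keep, axis=[0, 3])
```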
ce1467461d
fixup 2023-01-13 16:47:52 +00:00
0b676fa391
move shuffle to subdir 2023-01-13 16:47:35 +00:00
1958c4e6c2
encoderonly model: getting there 2023-01-09 19:33:41 +00:00
52cf66ca32
start working on a quick encoder test, but it's far from finished 2023-01-06 19:55:52 +00:00
4563fe6b27
dpl: fix moar crashes 2023-01-05 19:03:44 +00:00