Commit graph

561 commits

SHA1 Message Date
483ecf11c8
add specificity metric 2023-03-03 19:35:20 +00:00
d464c9f57d
dlr: add dice loss as metric
more metrics to go tho
2023-03-03 19:34:55 +00:00
f70083bea4
dlr eo: set custom_objects when loading model 2023-03-01 17:19:10 +00:00
b5f23e76d1
dlr eo: allow setting DIR_OUTPUT directly 2023-03-01 16:54:15 +00:00
4fd9feba4f
dlr eo: tidyup 2023-03-01 16:47:36 +00:00
69b5ae8838
dlr eo: this should fix it 2023-02-23 17:24:30 +00:00
9f1cee2927
dlr eo: cheese it by upsampling and then downsampling again 2023-02-23 16:47:00 +00:00
96b94ec55b
upsampling test 2023-02-23 16:19:44 +00:00
747ddfd41b
weird, XLA_FLAGS cuda data dir wasn't needed before
libdevice not found at ./libdevice.10.bc
2023-02-10 13:28:34 +00:00
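For reference, the usual workaround for the `libdevice not found at ./libdevice.10.bc` error is to point XLA at the CUDA toolkit via the `XLA_FLAGS` environment variable before TensorFlow is imported. A minimal sketch only; the CUDA path below is a placeholder, not necessarily the path used in this repo:

```python
import os

# Tell XLA where the CUDA toolkit (and hence libdevice.10.bc) lives.
# Adjust the path to wherever CUDA is installed on your machine/cluster.
os.environ.setdefault("XLA_FLAGS", "--xla_gpu_cuda_data_dir=/usr/local/cuda")

import tensorflow as tf  # import only after XLA_FLAGS is set
```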
e43274cd91
dlr eo: add VAL_STEPS_PER_EPOCH 2023-02-03 16:41:30 +00:00
8446a842d1
typo 2023-02-03 16:01:54 +00:00
1a8f10339a
LayerConvNeXtGamma: fix for mixed precision mode 2023-02-02 16:22:08 +00:00
a630db2c49
dlr eo: fixup 2023-02-02 16:17:52 +00:00
2bf1872aca
dlr eo: add JIT_COMPILE and MIXED_PRECISION 2023-02-02 16:14:09 +00:00
71088b8c0b
typo 2023-02-02 15:48:49 +00:00
f7666865a0
dlr eo: add STEPS_PER_EXECUTION 2023-02-02 15:47:08 +00:00
f8202851a1
dlr eo: add LEARNING_RATE 2023-01-27 16:51:13 +00:00
fb898ea72b
slurm eo: seriously....? 2023-01-26 17:02:33 +00:00
be946091b1
slurm eo: DIR_OUTPUT → DIRPATH_OUTPUT 2023-01-26 16:52:14 +00:00
4703bdbea1
SLURM: add job file for encoderonly
It's pretty much bugfixed, but illykin doesn't have enough RAM to support it at the moment :-(
2023-01-20 20:32:35 +00:00
818d77c733
Make dirpath_rainfallwater consistent with other experiments 2023-01-20 20:31:26 +00:00
e72d3991b8
switch to a smaller ConvNeXt 2023-01-20 19:14:38 +00:00
e1ad16a213
debug A 2023-01-20 18:58:45 +00:00
65a2e16a4c
ds_eo: lower memory usage 2023-01-20 18:55:52 +00:00
b5e68fc1a3
eo: don't downsample ConvNeXt at beginning 2023-01-20 18:49:46 +00:00
d5fdab50ed
dlreo: missing import 2023-01-20 18:40:35 +00:00
4514086dc6
make_encoderonly: kwargs 2023-01-20 18:39:35 +00:00
35dbd3f8bc
ds eo: scale up rainfall data
It's taken most of the afternoon to spot this one 🤦
2023-01-20 18:37:08 +00:00
5b54ceec48
ds eo: debug 2023-01-20 18:36:14 +00:00
a3787f0647
debug 2023-01-20 18:34:56 +00:00
9bd5e0f7a3
ds_eo: debug
there's something fishy going on here.
2023-01-20 17:25:22 +00:00
aa3831bc4f
ds_eo: pass water to reshape 2023-01-17 19:07:16 +00:00
a01c49414f
ds_eo: typo 2023-01-17 19:06:29 +00:00
b9bea26d26
ds_eo: rogue variables 2023-01-17 19:03:14 +00:00
64c57bbc21
dlr: add no-requeue
Ref https://support.hull.ac.uk/tas/public/ssp/content/detail/incident?unid=652db7ac6e73485c9f7658db78b2b628
2023-01-17 18:20:26 +00:00
a28bbb9cf7
encoderonly: make executable 2023-01-17 17:55:10 +00:00
835b376c72
slurm dlr: log exit code 2023-01-17 15:18:26 +00:00
40a550f155
slurm dlr: fixup 2023-01-16 18:45:08 +00:00
7bcf13f8d8
dlr: typo 2023-01-16 18:02:09 +00:00
6ff2864d23
slurm dlr: shell out in conda; redirect stderr & stdout to disk inside the experiments folder
Also, if the job restarts, we still save the previous run's results because we append rather than overwrite
2023-01-16 17:32:22 +00:00
0b31c9e700
log start time, end time, and elapsed
just in case...!
2023-01-16 17:30:20 +00:00
1a4ac3ed66
encoderonly: add graph plotting 2023-01-13 19:08:38 +00:00
82e01da70b
fixup.... again
oops
2023-01-13 18:47:29 +00:00
7b10f5c5fe
dlr: add learning_rate env var 2023-01-13 18:29:39 +00:00
2f0ce0aa13
again 2023-01-13 18:27:03 +00:00
e04d6ab1b6
fixup again 2023-01-13 18:23:32 +00:00
37d1598b0b
loss cel+dice: fixup 2023-01-13 18:21:11 +00:00
0f0b691b5d
loss cel + dice: fixup 2023-01-13 18:09:31 +00:00
be77f035c8
dlr: add cross-entropy + dice loss fn option 2023-01-13 17:58:00 +00:00
b2a5acaf4e
dlr dsm: clash 2023-01-13 17:26:38 +00:00
f7672db599
annoying 2023-01-13 17:00:47 +00:00
3c4d1c5140
dlr: Add support for stripping isolated water pixels
That is, water pixels that have no other water pixels immediately adjacent thereto (diagonals count).
2023-01-13 16:57:26 +00:00
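As a rough illustration of the idea described above (not the code in this commit), isolated water pixels can be found by counting 8-connected neighbours with a small convolution and zeroing out any water pixel with no water neighbours. Function name and dependencies here are hypothetical:

```python
import numpy as np
from scipy.ndimage import convolve

def strip_isolated_water(water: np.ndarray) -> np.ndarray:
    """Remove water pixels with no adjacent water pixels (diagonals count)."""
    kernel = np.ones((3, 3), dtype=np.uint8)
    kernel[1, 1] = 0  # don't count the pixel itself
    neighbours = convolve((water > 0).astype(np.uint8), kernel,
                          mode="constant", cval=0)
    return np.where(neighbours > 0, water, 0)
```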
ce1467461d
fixup 2023-01-13 16:47:52 +00:00
0b676fa391
move shuffle to subdir 2023-01-13 16:47:35 +00:00
2edfb1a21f
dlr predict: comment debug 2023-01-12 19:20:22 +00:00
f0dd9711ed
dlr: fixup 2023-01-12 18:55:33 +00:00
176dc022a0
add moar env vars 2023-01-12 18:54:39 +00:00
0d41bbba94
dlr predict: output with higher quality 2023-01-12 18:43:48 +00:00
1b5bb14d8f
dlr: debug 2023-01-12 18:35:29 +00:00
20f7d34fd1
fixup 2023-01-12 18:21:20 +00:00
c0c6e81c01
dlr predict: allow for multiple outputs 2023-01-12 18:12:50 +00:00
864dfa802d
dlr/predict typo 2023-01-12 18:03:06 +00:00
0140e93bbd
dlr: trying a thing 2023-01-12 18:00:16 +00:00
e1666026ad
dlr/predict: let's try another way 2023-01-12 17:56:59 +00:00
e7c0328648
typo 2023-01-12 16:58:23 +00:00
376eecc29f
dlr: try plotting the label too
https://www.youtube.com/watch?v=03qwgVJbNas
2023-01-12 16:13:04 +00:00
be7dd91f88
fix crash 'cause we're only plotting 1 thing 2023-01-11 17:41:55 +00:00
3787155665
dlr: we can't plot the input tensor because it has 8 channels rather than 1 or 3 2023-01-11 17:39:14 +00:00
373dda03b5
missing comma 2023-01-11 17:28:13 +00:00
7be0509ac8
dlr: slurm PATH_CHECKPOINT 2023-01-11 17:27:26 +00:00
a69c809008
dlr: slurm, load checkpoint 2023-01-11 17:26:57 +00:00
93e663e45d
add optional PATH_CHECKPOINT env var 2023-01-11 17:20:19 +00:00
0e3de8f5fc
dlr: fix predictions 2023-01-10 19:19:30 +00:00
2591cbe6bc
slurm dlr: quiet, pip 2023-01-10 18:12:35 +00:00
1958c4e6c2
encoderonly model: getting there 2023-01-09 19:33:41 +00:00
4b7df39fac
bugfix 2023-01-09 18:25:16 +00:00
581006cbe6
dlr: save checkpoints 2023-01-09 18:03:23 +00:00
52cf66ca32
start working on a quick encoder test, but it's far from finished 2023-01-06 19:55:52 +00:00
f080af0b57
identity test: plot binarised heightmap 2023-01-06 19:03:06 +00:00
36859746ff
dlr: fix crash 2023-01-06 17:13:35 +00:00
bcf198d47b
mono identity test: output 2023-01-06 17:08:18 +00:00
db0b010814
slurm dlr: correct log file names 2023-01-05 19:47:51 +00:00
e01ecfb615
slurm dlr: fix output dir 2023-01-05 19:42:42 +00:00
aa76d754c1
slurm dlr: fix pathing 2023-01-05 19:35:56 +00:00
67b8a2c6c0
spaces → spaces 2023-01-05 19:17:44 +00:00
56a501f8a9
weights="imagenet" only works with 3 image channels 2023-01-05 19:09:31 +00:00
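In Keras, pretrained ImageNet weights assume a 3-channel RGB input, so a backbone fed with any other channel count has to start from random initialisation. A hedged sketch of that check; ResNet50 is used purely as a stand-in backbone and the channel count is hypothetical:

```python
import tensorflow as tf

CHANNELS = 8  # hypothetical multi-channel input rather than RGB

# "imagenet" weights only exist for 3-channel inputs; otherwise train from scratch.
weights = "imagenet" if CHANNELS == 3 else None

backbone = tf.keras.applications.ResNet50(
    include_top=False,
    weights=weights,
    input_shape=(256, 256, CHANNELS),
)
```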
4563fe6b27
dpl: fix moar crashes 2023-01-05 19:03:44 +00:00
fefeb5d531
fix water depth fiddling 2023-01-05 19:01:20 +00:00
46d1f5e4e0
dataset_mono: fix dataset parsing 2023-01-05 19:00:52 +00:00
aca7b83a78
dataset_mono: fix sizing
it didn't account for rainfall_scale_up
2023-01-05 18:38:47 +00:00
19bb2fcac0
debug 2023-01-05 18:32:22 +00:00
6a4f68a055
missing import 2023-01-05 18:26:33 +00:00
0d4cc63b76
dl rainfall: fix env var name 2023-01-05 17:42:20 +00:00
dd79fb6e68
fixup 2023-01-05 17:09:09 +00:00
11ccd4cbee
slurm deeplab rainfall: fix variable naming 2023-01-05 17:08:57 +00:00
c17e53ca75
deeplabv3+ for rainfall 2022-12-16 19:52:59 +00:00
677e39f820
work on slurm for deeplabv3+ rainfall, but it's NOT FINISHED YET 2022-12-16 19:52:44 +00:00
423e277ed1
add comment 2022-12-15 19:33:25 +00:00
ef5071b569
DeepLabV3+: start working on version for rainfall radar, but it's not finished yet 2022-12-15 19:33:14 +00:00
15a3150127
DeepLabV3+: close each matplotlib figure after writing it 2022-12-15 19:14:07 +00:00
6ce121f861
DeepLabV3+: have argument for number of channels 2022-12-14 17:36:30 +00:00
1dc2ec3a46
DeepLabV3+: pathing.... again 2022-12-13 18:51:09 +00:00
eb47f8f544
dataset_mono: adjust to suit DeepLabV3+ too 2022-12-13 18:37:38 +00:00
440e693dfc
DeepLabv3+: fix pathing again 2022-12-13 18:26:00 +00:00
7e1f271bf4
deeplabv3+: fix colourmap 2022-12-13 14:02:10 +00:00
4d8ce792c9
deeplabv3+: fix imports/pathing errors 2022-12-13 13:38:27 +00:00
fc43f145c2
if not 2022-12-13 13:28:09 +00:00
d907dc48e5
DeepLabv3+: add logging 2022-12-13 13:20:16 +00:00
be4d928319
deeplabv3+: chmod +x 2022-12-13 13:06:42 +00:00
91846079b2
deeplabv3+ test: add shebang 2022-12-13 12:56:14 +00:00
96e260fe82
slurm: add job file for deeplabv3 test 2022-12-12 19:31:49 +00:00
8866960017
TEST SCRIPT: deeplabv3
ref https://keras.io/examples/vision/deeplabv3_plus/
dataset ref https://drive.google.com/uc?id=1B9A9UCJYMwTL4oBEo4RZfbMZMaZhKJaz

(the code is *terrible* spaghetti....!)
2022-12-12 19:20:07 +00:00
4e4d42a281
LossDice: add comment 2022-12-12 18:34:20 +00:00
449bc425a7
LossDice: explicitly cast inputs to float32 2022-12-12 17:20:32 +00:00
dbf8f5617c
drop activation function in last layers 2022-12-12 17:20:04 +00:00
bcd2f1251e
LossDice: do 1 - thing instead of -thing 2022-12-09 19:41:32 +00:00
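The change referenced above moves the dice loss into its conventional `1 - dice` form, so a perfect overlap gives a loss of 0 rather than a negative value. A minimal sketch of that form (not this repo's LossDice class); the smoothing term is an assumption, and the explicit float32 casts mirror the commit a couple of entries above:

```python
import tensorflow as tf

def dice_loss(y_true, y_pred, smooth=1.0):
    """Dice loss in the 1 - dice form: 0 for a perfect match, ~1 for no overlap."""
    y_true = tf.cast(y_true, tf.float32)
    y_pred = tf.cast(y_pred, tf.float32)
    intersection = tf.reduce_sum(y_true * y_pred)
    total = tf.reduce_sum(y_true) + tf.reduce_sum(y_pred)
    return 1.0 - (2.0 * intersection + smooth) / (total + smooth)
```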
d0dbc50bb7
debug 2022-12-09 19:33:28 +00:00
2142bb039c
again 2022-12-09 19:30:01 +00:00
7000b0f193
fixup 2022-12-09 19:23:35 +00:00
85012d0616
fixup 2022-12-09 19:18:03 +00:00
719d8e9819
strip channels layer at end 2022-12-09 19:11:00 +00:00
0129c35a35
LossDice: remove weird K.* functions 2022-12-09 19:06:26 +00:00
659fc97fd4
fix crash 2022-12-09 18:39:27 +00:00
e22c0981e6
actually use dice loss 2022-12-09 18:35:17 +00:00
649c262960
mono: switch loss from crossentropy to dice 2022-12-09 18:13:37 +00:00
7fd7c750d6
jupyter: identity test
status: FAILED, as usual....!
Don't worry though, 'cause we has a *planses*..... MUHAHAHAHAHAHAHA
* cue evil laugh *
2022-12-09 18:07:56 +00:00
cf9e8aa237
jupyter: convnext-mono identity test 2022-12-09 15:50:27 +00:00
2a1772a211
convnext_inverse: add shallow 2022-12-08 19:10:12 +00:00
c27869630a
I hate VSCode's git commit interface
it doesn't let you amend
2022-12-08 18:58:54 +00:00
b3345963f3
missing arg pass 2022-12-08 18:58:32 +00:00
3dde9b69da
fixup 2022-12-08 18:56:32 +00:00
6fce39f696
WHY?!?!?! 2022-12-08 18:55:53 +00:00
26766366fc
I hate the python code intelligence
it's bad
2022-12-08 18:55:15 +00:00
ff56f591c7
I hate python 2022-12-08 18:53:37 +00:00
d37e7224f5
train-mono: tidy up arg passing 2022-12-08 18:47:03 +00:00
b53db648bf
fixup 2022-12-08 18:31:42 +00:00
18c0210704
typo 2022-12-08 17:00:25 +00:00
a3c9416cf0
LossCrossentropy: don't sum 2022-12-08 16:57:11 +00:00
08046340f4
dataset_mono: normalise heightmap 2022-12-08 16:10:34 +00:00
d997157f55
dataset_mono: log when using heightmap 2022-12-06 19:30:11 +00:00
468c150570
slurm-train-mono: add HEIGHTMAP 2022-12-06 19:28:06 +00:00
d0f2e3d730
readfile: do transparent gzip by default
....but there's a flag to turn it off if needed
2022-12-06 19:27:39 +00:00
eac6472c97
Implement support for (optionally) taking a heightmap in 2022-12-06 18:55:58 +00:00
f92b2b3472
according to the equation it looks like it's 2 2022-12-02 17:22:46 +00:00
cad82cd1bc
CBAM: unsure if it's 1 or 3 dense layers in the shared MLP 2022-12-02 17:21:13 +00:00
62f6a993bb
implement CBAM, but it's UNTESTED
Convolutional Block Attention Module.
2022-12-02 17:17:45 +00:00
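On the "1 or 3 dense layers" question above: in the CBAM paper the shared MLP of the channel-attention branch is two dense layers (a reduction layer W0 and a restore layer W1), which matches the "looks like it's 2" reading. A hedged sketch of the channel-attention half only, with hypothetical names, omitting the spatial-attention branch, and not taken from this repo's implementation:

```python
import tensorflow as tf
from tensorflow.keras import layers

def channel_attention(x, reduction=8):
    """CBAM channel attention: a 2-layer shared MLP over avg- and max-pooled features."""
    channels = x.shape[-1]
    dense_reduce = layers.Dense(channels // reduction, activation="relu")  # W0
    dense_restore = layers.Dense(channels)                                 # W1

    avg = layers.GlobalAveragePooling2D()(x)
    mx = layers.GlobalMaxPooling2D()(x)
    # Shared MLP applied to both pooled descriptors, then summed and squashed.
    attention = tf.sigmoid(dense_restore(dense_reduce(avg)) +
                           dense_restore(dense_reduce(mx)))
    return x * layers.Reshape((1, 1, channels))(attention)
```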
9d666c3b38
train mono: type=int → float 2022-12-01 15:39:44 +00:00
53dfa32685
model_mono: log learning rate 2022-12-01 15:10:51 +00:00
c384d55dff
add arg to adjust learning rate 2022-11-29 20:55:00 +00:00
8e23e9d341
model_segmenter: we're no longer using sparse 2022-11-29 19:28:27 +00:00