|
1a4ac3ed66
|
encoderonly: add graph plotting
|
2023-01-13 19:08:38 +00:00 |
|
|
82e01da70b
|
fixup.... again
oops
|
2023-01-13 18:47:29 +00:00 |
|
|
7b10f5c5fe
|
dlr: add learning_rate env var
|
2023-01-13 18:29:39 +00:00 |
|
|
2f0ce0aa13
|
again
|
2023-01-13 18:27:03 +00:00 |
|
|
e04d6ab1b6
|
fixup again
|
2023-01-13 18:23:32 +00:00 |
|
|
37d1598b0b
|
loss cel+dice: fixup
|
2023-01-13 18:21:11 +00:00 |
|
|
0f0b691b5d
|
loss cel + dice: fixup
|
2023-01-13 18:09:31 +00:00 |
|
|
be77f035c8
|
dlr: add cross-entropy + dice loss fn option
|
2023-01-13 17:58:00 +00:00 |
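The cross-entropy + dice option above combines the two losses; a minimal sketch, assuming one-hot labels and equal weighting of the two terms (the repository's exact formulation is not shown in the log):

```python
import tensorflow as tf

def crossentropy_dice_loss(y_true, y_pred, smooth=1e-6):
    # Cast explicitly so both terms use the same dtype.
    y_true = tf.cast(y_true, tf.float32)
    y_pred = tf.cast(y_pred, tf.float32)
    # Categorical cross-entropy per pixel, averaged to one value per sample.
    cel = tf.keras.losses.categorical_crossentropy(y_true, y_pred)
    cel = tf.reduce_mean(cel, axis=[1, 2])
    # Dice term: 1 - dice coefficient, computed per sample.
    intersection = tf.reduce_sum(y_true * y_pred, axis=[1, 2, 3])
    union = tf.reduce_sum(y_true + y_pred, axis=[1, 2, 3])
    dice = 1.0 - (2.0 * intersection + smooth) / (union + smooth)
    return cel + dice
```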
|
|
b2a5acaf4e
|
dlr dsm: clash
|
2023-01-13 17:26:38 +00:00 |
|
|
f7672db599
|
annoying
|
2023-01-13 17:00:47 +00:00 |
|
|
3c4d1c5140
|
dlr: Add support for stripping isolated water pixels
That is, water pixels that have no other water pixels immediately adjacent thereto (diagonals count).
|
2023-01-13 16:57:26 +00:00 |
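A minimal sketch of the isolated-pixel stripping described above, assuming the water mask is a 2D 0/1 array; the helper name and the scipy-based neighbour count are illustrative, not the repository's actual implementation:

```python
import numpy as np
from scipy.ndimage import convolve

def strip_isolated_water(water: np.ndarray) -> np.ndarray:
    # Count the water pixels in each pixel's 8-neighbourhood (diagonals count);
    # the centre weight is 0 so a pixel does not count itself.
    kernel = np.array([[1, 1, 1],
                       [1, 0, 1],
                       [1, 1, 1]])
    neighbours = convolve(water.astype(np.uint8), kernel, mode="constant", cval=0)
    # Keep a water pixel only if it has at least one adjacent water pixel.
    return np.where(neighbours > 0, water, 0)
```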
|
|
ce1467461d
|
fixup
|
2023-01-13 16:47:52 +00:00 |
|
|
0b676fa391
|
move shuffle to subdir
|
2023-01-13 16:47:35 +00:00 |
|
|
2edfb1a21f
|
dlr predict: comment debug
|
2023-01-12 19:20:22 +00:00 |
|
|
f0dd9711ed
|
dlr: fixup
|
2023-01-12 18:55:33 +00:00 |
|
|
176dc022a0
|
add moar env vars
|
2023-01-12 18:54:39 +00:00 |
|
|
0d41bbba94
|
dlr predict: output with higher quality
|
2023-01-12 18:43:48 +00:00 |
|
|
1b5bb14d8f
|
dlr: debug
|
2023-01-12 18:35:29 +00:00 |
|
|
20f7d34fd1
|
fixup
|
2023-01-12 18:21:20 +00:00 |
|
|
c0c6e81c01
|
dlr predict: allow for multiple outputs
|
2023-01-12 18:12:50 +00:00 |
|
|
864dfa802d
|
dlr/predict typo
|
2023-01-12 18:03:06 +00:00 |
|
|
0140e93bbd
|
dlr: trying a thing
|
2023-01-12 18:00:16 +00:00 |
|
|
e1666026ad
|
dlr/predict: let's try another way
|
2023-01-12 17:56:59 +00:00 |
|
|
e7c0328648
|
typo
|
2023-01-12 16:58:23 +00:00 |
|
|
376eecc29f
|
dlr: try plotting the label too
https://www.youtube.com/watch?v=03qwgVJbNas
|
2023-01-12 16:13:04 +00:00 |
|
|
be7dd91f88
|
fix crash 'cause we're only plotting 1 thing
|
2023-01-11 17:41:55 +00:00 |
|
|
3787155665
|
dlr: we can't plot the input tensor because it has 8 channels rather than 1 or 3
|
2023-01-11 17:39:14 +00:00 |
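As the commit above notes, matplotlib's imshow only renders 1- or 3-channel images, so an 8-channel input tensor has to be plotted one channel at a time; a sketch with a hypothetical helper:

```python
import matplotlib.pyplot as plt

def plot_input_channel(input_tensor, channel=0, filepath="input_channel.png"):
    # input_tensor: [height, width, channels]; imshow can't render 8 channels,
    # so pick a single channel and render it with a colourmap instead.
    plt.imshow(input_tensor[:, :, channel], cmap="viridis")
    plt.colorbar()
    plt.savefig(filepath)
    plt.close()  # close the figure after writing it to avoid leaking memory
```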
|
|
373dda03b5
|
missing comma
|
2023-01-11 17:28:13 +00:00 |
|
|
7be0509ac8
|
dlr: slurm PATH_CHECKPOINT
|
2023-01-11 17:27:26 +00:00 |
|
|
a69c809008
|
dlr: slurm, load checkpoint
|
2023-01-11 17:26:57 +00:00 |
|
|
93e663e45d
|
add optional PATH_CHECKPOINT env var
|
2023-01-11 17:20:19 +00:00 |
|
|
0e3de8f5fc
|
dlr: fix predictions
|
2023-01-10 19:19:30 +00:00 |
|
|
2591cbe6bc
|
slurm dlr: quiet, pip
|
2023-01-10 18:12:35 +00:00 |
|
|
1958c4e6c2
|
encoderonly model: getting there
|
2023-01-09 19:33:41 +00:00 |
|
|
4b7df39fac
|
bugfix
|
2023-01-09 18:25:16 +00:00 |
|
|
581006cbe6
|
dlr: save checkpoints
|
2023-01-09 18:03:23 +00:00 |
|
|
52cf66ca32
|
start working on a quick encoder test, but it's far from finished
|
2023-01-06 19:55:52 +00:00 |
|
|
f080af0b57
|
identity test: plot binarised heightmap
|
2023-01-06 19:03:06 +00:00 |
|
|
36859746ff
|
dlr: fix crash
|
2023-01-06 17:13:35 +00:00 |
|
|
bcf198d47b
|
mono identity test: output
|
2023-01-06 17:08:18 +00:00 |
|
|
db0b010814
|
slurm dlr: correct log file names
|
2023-01-05 19:47:51 +00:00 |
|
|
e01ecfb615
|
slurm dlr: fix output dir
|
2023-01-05 19:42:42 +00:00 |
|
|
aa76d754c1
|
slurm dlr: fix pathing
|
2023-01-05 19:35:56 +00:00 |
|
|
67b8a2c6c0
|
spaces → spaces
|
2023-01-05 19:17:44 +00:00 |
|
|
56a501f8a9
|
weights="imagenet" only works with 3 image channels
|
2023-01-05 19:09:31 +00:00 |
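The constraint noted above comes from the pretrained ImageNet weights in keras.applications expecting 3-channel RGB input; a sketch of the usual workaround of falling back to random initialisation when the channel count differs (ResNet50 and the 8-channel input shape here are illustrative assumptions):

```python
import tensorflow as tf

NUM_CHANNELS = 8  # hypothetical: rainfall radar channels + heightmap

backbone = tf.keras.applications.ResNet50(
    include_top=False,
    # Pretrained ImageNet weights assume a 3-channel RGB input, so fall back
    # to random initialisation for any other channel count.
    weights="imagenet" if NUM_CHANNELS == 3 else None,
    input_shape=(128, 128, NUM_CHANNELS),
)
```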
|
|
4563fe6b27
|
dpl: fix moar crashes
|
2023-01-05 19:03:44 +00:00 |
|
|
fefeb5d531
|
fix water depth fiddling
|
2023-01-05 19:01:20 +00:00 |
|
|
46d1f5e4e0
|
dataset_mono: fix dataset parsing
|
2023-01-05 19:00:52 +00:00 |
|
|
aca7b83a78
|
dataset_mono: fix sizing
it didn't account for rainfall_scale_up
|
2023-01-05 18:38:47 +00:00 |
|
|
19bb2fcac0
|
debug
|
2023-01-05 18:32:22 +00:00 |
|
|
6a4f68a055
|
missing import
|
2023-01-05 18:26:33 +00:00 |
|
|
0d4cc63b76
|
dl rainfall: fix env var name
|
2023-01-05 17:42:20 +00:00 |
|
|
dd79fb6e68
|
fixup
|
2023-01-05 17:09:09 +00:00 |
|
|
11ccd4cbee
|
slurm deeplab rainfall: fix variable naming
|
2023-01-05 17:08:57 +00:00 |
|
|
c17e53ca75
|
deeplabv3+ for rainfall
|
2022-12-16 19:52:59 +00:00 |
|
|
677e39f820
|
work on slurm for deeplabv3+ rainfall, but it's NOT FINISHED YET
|
2022-12-16 19:52:44 +00:00 |
|
|
423e277ed1
|
add comment
|
2022-12-15 19:33:25 +00:00 |
|
|
ef5071b569
|
DeepLabV3+: start working on version for rainfall radar, but it's not finished yet
|
2022-12-15 19:33:14 +00:00 |
|
|
15a3150127
|
DeepLabV3+: close each matplotlib figure after writing it
|
2022-12-15 19:14:07 +00:00 |
|
|
6ce121f861
|
DeepLabV3+: have argument for number of channels
|
2022-12-14 17:36:30 +00:00 |
|
|
1dc2ec3a46
|
DeepLabV3+: pathing.... again
|
2022-12-13 18:51:09 +00:00 |
|
|
eb47f8f544
|
dataset_mono: adjust to suit DeepLabV3+ too
|
2022-12-13 18:37:38 +00:00 |
|
|
440e693dfc
|
DeepLabv3+: fix pathing again
|
2022-12-13 18:26:00 +00:00 |
|
|
7e1f271bf4
|
deeplabv3+: fix colourmap
|
2022-12-13 14:02:10 +00:00 |
|
|
4d8ce792c9
|
deeplabv3+: fix imports/pathing errors
|
2022-12-13 13:38:27 +00:00 |
|
|
fc43f145c2
|
if not
|
2022-12-13 13:28:09 +00:00 |
|
|
d907dc48e5
|
DeepLabv3+: add logging
|
2022-12-13 13:20:16 +00:00 |
|
|
be4d928319
|
deeplabv3+: chmod +x
|
2022-12-13 13:06:42 +00:00 |
|
|
91846079b2
|
deeplabv3+ test: add shebang
|
2022-12-13 12:56:14 +00:00 |
|
|
96e260fe82
|
slurm: add job file for deeplabv3 test
|
2022-12-12 19:31:49 +00:00 |
|
|
8866960017
|
TEST SCRIPT: deeplabv3
ref https://keras.io/examples/vision/deeplabv3_plus/
dataset ref https://drive.google.com/uc?id=1B9A9UCJYMwTL4oBEo4RZfbMZMaZhKJaz
(the code is *terrible* spaghetti....!)
|
2022-12-12 19:20:07 +00:00 |
|
|
4e4d42a281
|
LossDice: add comment
|
2022-12-12 18:34:20 +00:00 |
|
|
449bc425a7
|
LossDice: explicitly cast inputs to float32
|
2022-12-12 17:20:32 +00:00 |
|
|
dbf8f5617c
|
drop activation function in last layers
|
2022-12-12 17:20:04 +00:00 |
|
|
bcd2f1251e
|
LossDice: Do 1 - thing instead of -thing
|
2022-12-09 19:41:32 +00:00 |
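A minimal sketch of the dice loss the LossDice commits describe, written as 1 minus the dice coefficient (so a perfect match gives 0 and the loss stays non-negative) rather than a bare negation; the reduction axes and smoothing term are assumptions:

```python
import tensorflow as tf

def dice_loss(y_true, y_pred, smooth=1e-6):
    # Explicit float32 casts avoid dtype mismatches between labels and predictions.
    y_true = tf.cast(y_true, tf.float32)
    y_pred = tf.cast(y_pred, tf.float32)
    intersection = tf.reduce_sum(y_true * y_pred, axis=[1, 2, 3])
    union = tf.reduce_sum(y_true, axis=[1, 2, 3]) + tf.reduce_sum(y_pred, axis=[1, 2, 3])
    dice_coefficient = (2.0 * intersection + smooth) / (union + smooth)
    return 1.0 - dice_coefficient  # "1 - thing" instead of "-thing"
```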
|
|
d0dbc50bb7
|
debug
|
2022-12-09 19:33:28 +00:00 |
|
|
2142bb039c
|
again
|
2022-12-09 19:30:01 +00:00 |
|
|
7000b0f193
|
fixup
|
2022-12-09 19:23:35 +00:00 |
|
|
85012d0616
|
fixup
|
2022-12-09 19:18:03 +00:00 |
|
|
719d8e9819
|
strip channels layer at end
|
2022-12-09 19:11:00 +00:00 |
|
|
0129c35a35
|
LossDice: remove weird K.* functions
|
2022-12-09 19:06:26 +00:00 |
|
|
659fc97fd4
|
fix crash
|
2022-12-09 18:39:27 +00:00 |
|
|
e22c0981e6
|
actually use dice loss
|
2022-12-09 18:35:17 +00:00 |
|
|
649c262960
|
mono: switch loss from crossentropy to dice
|
2022-12-09 18:13:37 +00:00 |
|
|
7fd7c750d6
|
jupyter: identity test
status: FAILED, as usual....!
Don't worry though, 'cause we has a *planses*..... MUHAHAHAHAHAHAHA
* cue evil laugh *
|
2022-12-09 18:07:56 +00:00 |
|
|
cf9e8aa237
|
jupyter: convnext-mono identity test
|
2022-12-09 15:50:27 +00:00 |
|
|
2a1772a211
|
convnext_inverse: add shallow
|
2022-12-08 19:10:12 +00:00 |
|
|
c27869630a
|
I hate VSCode's git commit interface
it doesn't let you amend
|
2022-12-08 18:58:54 +00:00 |
|
|
b3345963f3
|
missing arg pass
|
2022-12-08 18:58:32 +00:00 |
|
|
3dde9b69da
|
fixup
|
2022-12-08 18:56:32 +00:00 |
|
|
6fce39f696
|
WHY?!?!?!
|
2022-12-08 18:55:53 +00:00 |
|
|
26766366fc
|
I hate the python code intelligence
it's bad
|
2022-12-08 18:55:15 +00:00 |
|
|
ff56f591c7
|
I hate python
|
2022-12-08 18:53:37 +00:00 |
|
|
d37e7224f5
|
train-mono: tidy up arg passing
|
2022-12-08 18:47:03 +00:00 |
|
|
b53db648bf
|
fixup
|
2022-12-08 18:31:42 +00:00 |
|
|
18c0210704
|
typo
|
2022-12-08 17:00:25 +00:00 |
|
|
a3c9416cf0
|
LossCrossentropy: don't sum
|
2022-12-08 16:57:11 +00:00 |
|
|
08046340f4
|
dataset_mono: normalise heightmap
|
2022-12-08 16:10:34 +00:00 |
|
|
d997157f55
|
dataset_mono: log when using heightmap
|
2022-12-06 19:30:11 +00:00 |
|
|
468c150570
|
slurm-train-mono: add HEIGHTMAP
|
2022-12-06 19:28:06 +00:00 |
|
|
d0f2e3d730
|
readfile: do transparent gzip by default
....but there's a flag to turn it off if needed
|
2022-12-06 19:27:39 +00:00 |
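A sketch of what the transparent-gzip behaviour described above might look like; the readfile name and the transparent_gzip flag are guesses at the shape of the helper, not its real signature:

```python
import gzip

def readfile(filepath, transparent_gzip=True):
    with open(filepath, "rb") as handle:
        data = handle.read()
    # gzip files start with the magic bytes 0x1f 0x8b; decompress transparently
    # unless the caller turned the behaviour off.
    if transparent_gzip and data[:2] == b"\x1f\x8b":
        data = gzip.decompress(data)
    return data.decode("utf-8")
```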
|
|
eac6472c97
|
Implement support for (optionally) taking a heightmap in
|
2022-12-06 18:55:58 +00:00 |
|
|
f92b2b3472
|
according to the equation it looks like it's 2
|
2022-12-02 17:22:46 +00:00 |
|
|
cad82cd1bc
|
CBAM: unsure if it's 1 or 3 dense layers in the shared MLP
|
2022-12-02 17:21:13 +00:00 |
|
|
62f6a993bb
|
implement CBAM, but it's UNTESTED
Convolutional Block Attention Module.
|
2022-12-02 17:17:45 +00:00 |
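On the question raised in the commits above: the CBAM paper's channel-attention equation uses one hidden layer in the shared MLP, i.e. two Dense layers (reduce, then restore). A minimal Keras sketch of that channel-attention half, with the reduction ratio and names as assumptions rather than the repository's untested implementation:

```python
import tensorflow as tf
from tensorflow.keras import layers

def channel_attention(x, reduction=8):
    channels = x.shape[-1]
    # Shared MLP with one hidden layer, i.e. two Dense layers: reduce the
    # channel dimension by `reduction`, then restore it.
    dense_reduce = layers.Dense(channels // reduction, activation="relu")
    dense_restore = layers.Dense(channels)
    # Apply the *same* MLP to both the average-pooled and max-pooled descriptors.
    avg = dense_restore(dense_reduce(layers.GlobalAveragePooling2D()(x)))
    mx = dense_restore(dense_reduce(layers.GlobalMaxPooling2D()(x)))
    scale = layers.Activation("sigmoid")(avg + mx)
    scale = layers.Reshape((1, 1, channels))(scale)
    return x * scale  # broadcast the per-channel weights over height and width
```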
|
|
9d666c3b38
|
train mono: type=int → float
|
2022-12-01 15:39:44 +00:00 |
|
|
53dfa32685
|
model_mono: log learning rate
|
2022-12-01 15:10:51 +00:00 |
|
|
c384d55dff
|
add arg to adjust learning rate
|
2022-11-29 20:55:00 +00:00 |
|
|
8e23e9d341
|
model_segmenter: we're no longer using sparse
|
2022-11-29 19:28:27 +00:00 |
|
|
9a2b4c6838
|
dsseg: fix reshape/onehot ordering
|
2022-11-29 19:28:13 +00:00 |
|
|
df774146d9
|
dataset_segmenter: reshape, not squeeze
|
2022-11-29 19:24:54 +00:00 |
|
|
77b8a1a8db
|
dataset_segmenter: squeeze
|
2022-11-29 19:16:15 +00:00 |
|
|
2258b5a229
|
slurm-train: reduce RAM required by 10GB
|
2022-11-29 19:15:34 +00:00 |
|
|
01101ad30b
|
losscrossentropy: return the reduced value * facepalm *
|
2022-11-29 19:07:08 +00:00 |
|
|
ff65393e78
|
log file naming update
|
2022-11-29 18:41:14 +00:00 |
|
|
37f196a785
|
LossCrossentropy: add kwargs
|
2022-11-29 15:40:35 +00:00 |
|
|
838ff56a3b
|
mono: fix loading checkpoint
|
2022-11-29 15:25:11 +00:00 |
|
|
dba6cbffcd
|
WHY. * facepalms *
|
2022-11-28 19:33:42 +00:00 |
|
|
57b8eb93fb
|
fixup
|
2022-11-28 19:09:35 +00:00 |
|
|
6640a41bb7
|
almost got it....? it's not what I expected....!
|
2022-11-28 19:08:50 +00:00 |
|
|
f48473b703
|
fixup
|
2022-11-28 19:00:11 +00:00 |
|
|
f6feb125e3
|
this is some serious debugging.
This commit will produce an extremely large volume of output.
|
2022-11-28 18:57:41 +00:00 |
|
|
09f81b0746
|
train_mono: debug
this commit will generate a large amount of debug output.
|
2022-11-28 16:46:17 +00:00 |
|
|
f39e4ade70
|
LayerConvNextGamma: fix config serialisation bug
.....this is unlikely to be the problem as this bug is in an unused code path.
|
2022-11-25 21:16:31 +00:00 |
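The serialisation fix above is the usual custom-layer requirement that get_config() return every constructor argument; a sketch of a ConvNeXt-style layer-scale (gamma) layer under that assumption — the argument names here are illustrative, not the repository's:

```python
import tensorflow as tf

class LayerConvNextGamma(tf.keras.layers.Layer):
    def __init__(self, const_val=1e-6, dim=None, **kwargs):
        super().__init__(**kwargs)
        self.const_val = const_val
        self.dim = dim

    def build(self, input_shape):
        # Learnable per-channel scale, initialised to a small constant as in ConvNeXt.
        self.gamma = self.add_weight(
            name="gamma",
            shape=(self.dim,),
            initializer=tf.keras.initializers.Constant(self.const_val),
            trainable=True,
        )

    def call(self, inputs):
        return inputs * self.gamma

    def get_config(self):
        # Without const_val and dim here, rebuilding the layer from a saved
        # config (e.g. when loading a checkpoint) would fail.
        config = super().get_config()
        config.update({"const_val": self.const_val, "dim": self.dim})
        return config
```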
|
|
e7410fb480
|
train_mono_predict: limit label size to 64x64
that's the size the model predicts
|
2022-11-25 17:47:17 +00:00 |
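A sketch of the label-size limiting mentioned above, assuming a centre crop down to the model's 64×64 output size (whether the repository crops or resizes is not shown in the log):

```python
import numpy as np

def crop_center(label: np.ndarray, size: int = 64) -> np.ndarray:
    # label: [height, width, ...]; take the central size×size window so the
    # label matches the 64×64 region the model actually predicts.
    height, width = label.shape[0], label.shape[1]
    top = (height - size) // 2
    left = (width - size) // 2
    return label[top:top + size, left:left + size]
```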
|
|
51dd484d13
|
fixup
|
2022-11-25 16:55:45 +00:00 |
|
|
884c4eb150
|
rainfall_stats: formatting again
|
2022-11-24 19:08:07 +00:00 |
|
|
bfe038086c
|
rainfall_stats: formatting
|
2022-11-24 19:07:44 +00:00 |
|
|
7dba03200f
|
fixup
|
2022-11-24 19:06:48 +00:00 |
|
|
e5258b9c66
|
typo
|
2022-11-24 19:06:13 +00:00 |
|
|
64d646bb13
|
rainfall_stats: formatting
|
2022-11-24 19:05:35 +00:00 |
|
|
675c7a7448
|
fixup
|
2022-11-24 19:03:28 +00:00 |
|
|
afc1cdcf02
|
fixup
|
2022-11-24 19:02:58 +00:00 |
|
|
e4bea89c89
|
typo
|
2022-11-24 19:01:52 +00:00 |
|
|
a40cbe8705
|
rainfall_stats: remove unused imports
|
2022-11-24 19:01:18 +00:00 |
|
|
fe57d6aab2
|
rainfall_stats: initial implementation
this might reveal why we are having problems. If most/all of the rainfall radar data is very small numbers, normalising might help.
|
2022-11-24 18:58:16 +00:00 |
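A sketch of the kind of summary the rainfall_stats tool above could produce to test the "mostly very small numbers" hypothesis; the exact statistics the real implementation reports are an assumption:

```python
import numpy as np

def rainfall_stats(values: np.ndarray) -> dict:
    # If most of the distribution sits near zero, normalising the rainfall
    # radar data before training is likely to help.
    return {
        "min": float(np.min(values)),
        "max": float(np.max(values)),
        "mean": float(np.mean(values)),
        "stddev": float(np.std(values)),
        "median": float(np.median(values)),
        "p99": float(np.percentile(values, 99)),
    }
```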
|
|
3131b4f7b3
|
debug2
|
2022-11-24 18:25:32 +00:00 |
|
|
d55a13f536
|
debug
|
2022-11-24 18:24:03 +00:00 |
|
|
1f60f2a580
|
do_argmax
|
2022-11-24 18:11:03 +00:00 |
|
|
6c09d5254d
|
fixup
|
2022-11-24 17:57:48 +00:00 |
|
|
54a841efe9
|
train_mono_predict: convert to correct format
|
2022-11-24 17:56:07 +00:00 |
|
|
105dc5bc56
|
missing kwargs
|
2022-11-24 17:51:29 +00:00 |
|
|
1e1d6dd273
|
fixup
|
2022-11-24 17:48:19 +00:00 |
|
|
011e0aef78
|
update cli docs
|
2022-11-24 16:38:07 +00:00 |
|
|
773944f9fa
|
train_mono_predict: initial implementation
|
2022-11-24 16:33:50 +00:00 |
|
|
d31326cb30
|
slurm train mono: fix partition name
|
2022-11-22 17:02:02 +00:00 |
|
|
ce28ac4013
|
slurm: add job for train_mono
|
2022-11-22 16:58:46 +00:00 |
|
|
3a0356929c
|
mono: drop the sparse
|
2022-11-22 16:20:56 +00:00 |
|
|
7e8f63f8ba
|
fixup
|
2022-11-21 19:38:24 +00:00 |
|
|
ace4c8b246
|
dataset_mono: debug
|
2022-11-21 18:46:21 +00:00 |
|