Commit graph

560 commits

214afcc914
dlr: steps per execution is ugh 2023-05-04 19:54:51 +01:00
f0a2e7c450
dlr: print values of new env vars 2023-05-04 19:42:40 +01:00
8593999eb6
dlr: add JIT_COMPILE 2023-05-04 18:22:18 +01:00
dddc08c663
dlr: set steps_per_execution to 16 by default 2023-05-04 18:13:08 +01:00
e2e6a56b40
dlr: add UPSAMPLE env var
...AND actually add the functionality this time!
2023-05-04 17:40:16 +01:00
31687da931
celldice: actually do log(cosh())
....just wow.
2023-05-03 15:00:10 +01:00
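The referenced paper (doi:10.1109/cibcb48159.2020.9277638) proposes taking log(cosh(·)) of the soft Dice loss to smooth it, which is what this commit fixes the implementation to actually do. A minimal numpy sketch of the formula (the real code uses TensorFlow ops; the names and epsilon here are illustrative):

```python
import numpy as np

def dice_loss(y_true, y_pred, eps=1e-7):
    # Soft Dice loss: 1 - 2|X ∩ Y| / (|X| + |Y|), with eps to avoid 0/0.
    intersection = np.sum(y_true * y_pred)
    total = np.sum(y_true) + np.sum(y_pred)
    return 1.0 - (2.0 * intersection + eps) / (total + eps)

def log_cosh_dice_loss(y_true, y_pred):
    # log(cosh(x)) behaves like x**2/2 near zero and |x| - log(2) for
    # large x, smoothing the Dice loss curve.
    return np.log(np.cosh(dice_loss(y_true, y_pred)))
```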
4a4df380e3
if name == main 2023-03-23 18:09:52 +00:00
81cad8e6b4
newline 2023-03-23 18:01:21 +00:00
1bd59dc038
finish script to plot generic metrics 2023-03-23 18:00:00 +00:00
698bbe2ffb
start working on plotting script, but it isn't finished yet 2023-03-22 17:45:06 +00:00
6b17d45aad
dlr: plot all metrics 2023-03-22 17:41:34 +00:00
e565c36149
deeplabv3+: prepare for ConvNeXt 2023-03-14 21:51:41 +00:00
779b546897
dlr: log new env vars 2023-03-14 20:28:22 +00:00
623208ba6d
dlr: add env var for water thresholding 2023-03-14 20:18:39 +00:00
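The water-thresholding switch above follows the same pattern as the other environment variables in these commits: read a value from the environment, fall back to a default. A small sketch of that pattern (the variable name `WATER_THRESHOLD` and the default of 0.1 are assumptions for illustration, not the repo's actual values):

```python
import os
import numpy as np

def env_float(name: str, default: float) -> float:
    # Fetch a float-valued setting from the environment, with a fallback.
    value = os.environ.get(name)
    return float(value) if value is not None else default

# Hypothetical usage: binarise a water depth map at a configurable threshold.
threshold = env_float("WATER_THRESHOLD", 0.1)
water = np.array([0.05, 0.25, 0.8])
water_binary = (water > threshold).astype(np.float32)
```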
c5fc62c411
dlr CHANGE: Add optional log(cosh(dice_loss))
Ref https://doi.org/10.1109/cibcb48159.2020.9277638
2023-03-10 20:24:13 +00:00
f25d1b5b1a
dlr CHANGE: properly normalise the heightmap 2023-03-10 19:13:32 +00:00
8e76ac2864
dlr: typo 2023-03-10 17:40:16 +00:00
5e79825292
dlr: tf.Tensor → numpy array → list → json → disk 2023-03-10 17:34:49 +00:00
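The chain named in this commit message, tf.Tensor → numpy array → list → json → disk, is the usual route for getting tensors into a line-oriented .jsonl file, and the commits below add the writing and truncate-on-start behaviour. A minimal sketch with numpy standing in for TensorFlow (function name and details are assumptions, not the repo's actual code):

```python
import json
import numpy as np

def write_predictions_jsonl(batches, filepath):
    # One JSON line per batch: array -> nested list -> json string -> disk.
    # In the real script the input would be a tf.Tensor, converted with
    # .numpy() first; np.asarray stands in for that here.
    # Append mode means a restarted job adds to earlier results rather than
    # clobbering them; the log truncates the file once at startup instead.
    with open(filepath, "a") as handle:
        for batch in batches:
            arr = np.asarray(batch)
            handle.write(json.dumps(arr.tolist()) + "\n")
```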
b287970032
dlr: missing import 2023-03-10 17:31:03 +00:00
e6cd0aa1b9
dlr: fix crash 2023-03-10 17:14:06 +00:00
cf37aeb11a
dlr: truncate jsonl before we start to avoid mixing things up 2023-03-10 17:11:10 +00:00
5fdf229d06
dlr: write raw outputs to jsonl file 2023-03-10 17:07:44 +00:00
d24381babb
dlr: make size of prediction graph actually sensible 2023-03-09 19:54:27 +00:00
8d932757b5
dlr: variable name typo 2023-03-09 19:44:39 +00:00
b6ca885fe2
dlr: told you so 2023-03-09 19:43:35 +00:00
f9df19cfd5
dlr: next of several typos 2023-03-09 19:34:45 +00:00
96eae636ea
dlr: add missing functions to .load() custom objs
apparently metrics are also required to be included here...
2023-03-09 19:26:57 +00:00
ad52ae9241
dlr: also plot inputs 2023-03-09 19:13:25 +00:00
436ab78438
dlr: when predicting, also display heatmap
...of positive predictions
2023-03-09 18:54:28 +00:00
5195fe6b62
jupyter test: heatmaps in matplotlib & seaborn 2023-03-09 18:52:56 +00:00
45c76ba252
typo 2023-03-03 22:46:02 +00:00
c909cfd3d1
fixup 2023-03-03 22:45:34 +00:00
0734201107
dlr: tf graph changes 2023-03-03 22:44:49 +00:00
750f46dbd2
debug 2023-03-03 22:39:30 +00:00
5472729f5e
dlr: fixup argmax & y_true/y_pred 2023-03-03 22:37:36 +00:00
bc734a29c6
y_true is one-hot, convert to sparse 2023-03-03 22:20:11 +00:00
c7b577ab29
specificity: convert to plain tf 2023-03-03 22:16:48 +00:00
26cc824ace
dlr: MeanIoU fixup 2023-03-03 22:10:49 +00:00
e9dcbe3863
dlr: fixup 2023-03-03 22:09:05 +00:00
5c6789bf40
meaniou: implement one-hot version
it expects sparse, but our output is one-hot.
2023-03-03 22:04:21 +00:00
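Keras' MeanIoU metric expects sparse integer class labels, while a softmax segmentation head emits a per-class channel, hence the argmax conversion this run of commits keeps circling back to. A rough numpy sketch of the idea (names and details are assumptions; the repo's version presumably wraps tf.keras.metrics.MeanIoU rather than recomputing IoU by hand):

```python
import numpy as np

def mean_iou_one_hot(y_true, y_pred, num_classes):
    # Collapse the one-hot channel axis to sparse class indices first,
    # e.g. [64, 128, 128, 2] -> [64, 128, 128], then average per-class IoU.
    t = np.argmax(y_true, axis=-1)
    p = np.argmax(y_pred, axis=-1)
    ious = []
    for c in range(num_classes):
        inter = np.sum((t == c) & (p == c))
        union = np.sum((t == c) | (p == c))
        if union > 0:
            ious.append(inter / union)
    return float(np.mean(ious))
```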
6ffda40d48
fixup 2023-03-03 21:54:45 +00:00
9b13e9ca5b
dlr: fixup argmax first 2023-03-03 21:51:24 +00:00
7453c607ed
argmax for sensitivity & specificity too 2023-03-03 21:49:33 +00:00
8470aec996
dlr: fixup 2023-03-03 21:45:51 +00:00
3d051a8874
dlr: HACK: argmax to convert [64,128,128, 2] → [64,128,128] 2023-03-03 21:41:26 +00:00
94a32e7144
dlr: fix metrics 2023-03-03 20:37:22 +00:00
c7f96ab6ab
dlr: typo 2023-03-03 20:23:03 +00:00
06f956dc07
dlr: default LOSS, EPOCHS, and PREDICT_COUNT to better values
Ref recent experiments
2023-03-03 20:17:08 +00:00
b435cc54dd
dlr: add sensitivity (aka recall) and specificity metrics 2023-03-03 20:00:05 +00:00
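Sensitivity (recall) and specificity are the true-positive and true-negative rates of the binary water mask. A plain numpy sketch of both for flat binary labels (the repo's versions are TensorFlow metrics, so these names and signatures are illustrative only):

```python
import numpy as np

def sensitivity(y_true, y_pred):
    # Recall / true-positive rate: TP / (TP + FN).
    tp = np.sum((y_true == 1) & (y_pred == 1))
    fn = np.sum((y_true == 1) & (y_pred == 0))
    return tp / (tp + fn)

def specificity(y_true, y_pred):
    # True-negative rate: TN / (TN + FP).
    tn = np.sum((y_true == 0) & (y_pred == 0))
    fp = np.sum((y_true == 0) & (y_pred == 1))
    return tn / (tn + fp)
```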
483ecf11c8
add specificity metric 2023-03-03 19:35:20 +00:00
d464c9f57d
dlr: add dice loss as metric
more metrics to go tho
2023-03-03 19:34:55 +00:00
f70083bea4
dlr eo: set custom_objects when loading model 2023-03-01 17:19:10 +00:00
b5f23e76d1
dlr eo: allow setting DIR_OUTPUT directly 2023-03-01 16:54:15 +00:00
4fd9feba4f
dlr eo: tidyup 2023-03-01 16:47:36 +00:00
69b5ae8838
dlr eo: this should fix it 2023-02-23 17:24:30 +00:00
9f1cee2927
dlr eo: cheese it by upsampling and then downsampling again 2023-02-23 16:47:00 +00:00
96b94ec55b
upsampling test 2023-02-23 16:19:44 +00:00
747ddfd41b
weird, XLA_FLAGS cuda data dir wasn't needed before
libdevice not found at ./libdevice.10.bc
2023-02-10 13:28:34 +00:00
e43274cd91
dlr eo: add VAL_STEPS_PER_EPOCH 2023-02-03 16:41:30 +00:00
8446a842d1
typo 2023-02-03 16:01:54 +00:00
1a8f10339a
LayerConvNeXtGamma: fix for mixed precision mode 2023-02-02 16:22:08 +00:00
a630db2c49
dlr eo: fixup 2023-02-02 16:17:52 +00:00
2bf1872aca
dlr eo: add JIT_COMPILE and MIXED_PRECISION 2023-02-02 16:14:09 +00:00
71088b8c0b
typo 2023-02-02 15:48:49 +00:00
f7666865a0
dlr eo: add STEPS_PER_EXECUTION 2023-02-02 15:47:08 +00:00
f8202851a1
dlr eo: add LEARNING_RATE 2023-01-27 16:51:13 +00:00
fb898ea72b
slurm eo: seriously....? 2023-01-26 17:02:33 +00:00
be946091b1
slurm eo: DIR_OUTPUT → DIRPATH_OUTPUT 2023-01-26 16:52:14 +00:00
4703bdbea1
SLURM: add job file for encoderonly
It's pretty much bugfixed, but illykin doesn't have enough RAM to support it at the moment :-(
2023-01-20 20:32:35 +00:00
818d77c733
Make dirpath_rainfallwater consistent with other experiments 2023-01-20 20:31:26 +00:00
e72d3991b8
switch to a smaller ConvNeXt 2023-01-20 19:14:38 +00:00
e1ad16a213
debug A 2023-01-20 18:58:45 +00:00
65a2e16a4c
ds_eo: lower memory usage 2023-01-20 18:55:52 +00:00
b5e68fc1a3
eo: don't downsample ConvNeXt at beginning 2023-01-20 18:49:46 +00:00
d5fdab50ed
dlreo: missing import 2023-01-20 18:40:35 +00:00
4514086dc6
make_encoderonly: kwargs 2023-01-20 18:39:35 +00:00
35dbd3f8bc
ds eo: scale up rainfall data
It's taken most of the afternoon to spot this one 🤦
2023-01-20 18:37:08 +00:00
5b54ceec48
ds eo: debug 2023-01-20 18:36:14 +00:00
a3787f0647
debug 2023-01-20 18:34:56 +00:00
9bd5e0f7a3
ds_eo: debug
there's something fishy going on here.
2023-01-20 17:25:22 +00:00
aa3831bc4f
ds_eo: pass water to reshape 2023-01-17 19:07:16 +00:00
a01c49414f
ds_eo: typo 2023-01-17 19:06:29 +00:00
b9bea26d26
ds_eo: rogue variables 2023-01-17 19:03:14 +00:00
64c57bbc21
dlr: add no-requeue
Ref https://support.hull.ac.uk/tas/public/ssp/content/detail/incident?unid=652db7ac6e73485c9f7658db78b2b628
2023-01-17 18:20:26 +00:00
a28bbb9cf7
encoderonly: make executable 2023-01-17 17:55:10 +00:00
835b376c72
slurm dlr: log exit code 2023-01-17 15:18:26 +00:00
40a550f155
slurm dlr: fixup 2023-01-16 18:45:08 +00:00
7bcf13f8d8
dlr: typo 2023-01-16 18:02:09 +00:00
6ff2864d23
slurm dlr: shell out in conda; redirect stderr & stdout to disk inside the experiments folder
Also, if the job restarts, we still save the previous run's results because we append rather than overwrite
2023-01-16 17:32:22 +00:00
0b31c9e700
log start time, end time, and elapsed
just in case...!
2023-01-16 17:30:20 +00:00
1a4ac3ed66
encoderonly: add graph plotting 2023-01-13 19:08:38 +00:00
82e01da70b
fixup.... again
oops
2023-01-13 18:47:29 +00:00
7b10f5c5fe
dlr: add learning_rate env var 2023-01-13 18:29:39 +00:00
2f0ce0aa13
again 2023-01-13 18:27:03 +00:00
e04d6ab1b6
fixup again 2023-01-13 18:23:32 +00:00
37d1598b0b
loss cel+dice: fixup 2023-01-13 18:21:11 +00:00
0f0b691b5d
loss cel + dice: fixup 2023-01-13 18:09:31 +00:00
be77f035c8
dlr: add cross-entropy + dice loss fn option 2023-01-13 17:58:00 +00:00
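Combining cross-entropy with Dice loss is a common way to balance per-pixel accuracy against region overlap for segmentation. A hedged numpy sketch of such a combined loss (the equal weighting and all names here are assumptions; the option this commit adds operates on TensorFlow tensors):

```python
import numpy as np

def binary_crossentropy(y_true, y_pred, eps=1e-7):
    # Mean per-pixel cross-entropy, with clipping for numeric safety.
    p = np.clip(y_pred, eps, 1 - eps)
    return float(np.mean(-(y_true * np.log(p) + (1 - y_true) * np.log(1 - p))))

def dice_loss(y_true, y_pred, eps=1e-7):
    # Soft Dice loss: 1 - 2|X ∩ Y| / (|X| + |Y|).
    inter = np.sum(y_true * y_pred)
    return 1.0 - (2.0 * inter + eps) / (np.sum(y_true) + np.sum(y_pred) + eps)

def cel_dice_loss(y_true, y_pred):
    # Unweighted sum of the two terms; real code may weight them.
    return binary_crossentropy(y_true, y_pred) + dice_loss(y_true, y_pred)
```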
b2a5acaf4e
dlr dsm: clash 2023-01-13 17:26:38 +00:00
f7672db599
annoying 2023-01-13 17:00:47 +00:00