├── README.md ├── models_trans11.py ├── models_z_new_SC_03.py └── results ├── 01_KSDD2 ├── ksdd2_z_new_SC_00_grandient_open │ ├── ROC.pdf │ ├── loss_val.png │ ├── losses.csv │ ├── precision-recall.pdf │ ├── results.csv │ ├── run_params.txt │ └── training_log.txt ├── ksdd2_z_new_SC_01_grandient_open(对比实验_C_Net) │ ├── ROC.pdf │ ├── loss_val.png │ ├── losses.csv │ ├── precision-recall.pdf │ ├── results.csv │ ├── run_params.txt │ └── training_log.txt ├── ksdd2_z_new_SC_02(对比实验New_S_Net) │ ├── ROC.pdf │ ├── loss_val.png │ ├── losses.csv │ ├── precision-recall.pdf │ ├── results.csv │ ├── run_params.txt │ └── training_log.txt └── ksdd2_z_new_SC_03(对比实验SC_Net) │ ├── ROC.pdf │ ├── loss_val.png │ ├── losses.csv │ ├── precision-recall.pdf │ ├── results.csv │ ├── run_params.txt │ └── training_log.txt ├── 02_DAGM ├── dagm_z_SC_03_class4 │ └── FOLD_4 │ │ ├── ROC.pdf │ │ ├── loss_val.png │ │ ├── losses.csv │ │ ├── precision-recall.pdf │ │ ├── results.csv │ │ ├── run_params.txt │ │ └── training_log.txt └── dagm_z_SC_03_except_class4 │ ├── FOLD_1 │ ├── ROC.pdf │ ├── loss_val.png │ ├── losses.csv │ ├── precision-recall.pdf │ ├── results.csv │ ├── run_params.txt │ └── training_log.txt │ ├── FOLD_10 │ ├── ROC.pdf │ ├── loss_val.png │ ├── losses.csv │ ├── precision-recall.pdf │ ├── results.csv │ ├── run_params.txt │ └── training_log.txt │ ├── FOLD_2 │ ├── ROC.pdf │ ├── loss_val.png │ ├── losses.csv │ ├── precision-recall.pdf │ ├── results.csv │ ├── run_params.txt │ └── training_log.txt │ ├── FOLD_3 │ ├── ROC.pdf │ ├── loss_val.png │ ├── losses.csv │ ├── precision-recall.pdf │ ├── results.csv │ ├── run_params.txt │ └── training_log.txt │ ├── FOLD_5 │ ├── ROC.pdf │ ├── loss_val.png │ ├── losses.csv │ ├── precision-recall.pdf │ ├── results.csv │ ├── run_params.txt │ └── training_log.txt │ ├── FOLD_6 │ ├── ROC.pdf │ ├── loss_val.png │ ├── losses.csv │ ├── precision-recall.pdf │ ├── results.csv │ ├── run_params.txt │ └── training_log.txt │ ├── FOLD_7 │ ├── ROC.pdf │ ├── loss_val.png │ ├── losses.csv │ ├── precision-recall.pdf │ ├── results.csv │ ├── run_params.txt │ └── training_log.txt │ ├── FOLD_8 │ ├── ROC.pdf │ ├── loss_val.png │ ├── losses.csv │ ├── precision-recall.pdf │ ├── results.csv │ ├── run_params.txt │ └── training_log.txt │ └── FOLD_9 │ ├── ROC.pdf │ ├── loss_val.png │ ├── losses.csv │ ├── precision-recall.pdf │ ├── results.csv │ ├── run_params.txt │ └── training_log.txt ├── 03_KSDD ├── FOLD_0 │ ├── ROC.pdf │ ├── loss_val.png │ ├── losses.csv │ ├── precision-recall.pdf │ ├── results.csv │ ├── run_params.txt │ └── training_log.txt ├── FOLD_1 │ ├── ROC.pdf │ ├── loss_val.png │ ├── losses.csv │ ├── precision-recall.pdf │ ├── results.csv │ ├── run_params.txt │ └── training_log.txt ├── FOLD_2 │ ├── ROC.pdf │ ├── loss_val.png │ ├── losses.csv │ ├── precision-recall.pdf │ ├── results.csv │ ├── run_params.txt │ └── training_log.txt └── eval_log.txt └── 04_STEEL ├── ROC.pdf ├── loss_val.png ├── losses.csv ├── precision-recall.pdf ├── results.csv ├── run_params.txt └── training_log.txt /README.md: -------------------------------------------------------------------------------- 1 | # SC-Net: Weakly Supervised Surface Defect Detection 2 | ## 1. Paper and Code 3 | * This work: 4 | * Paper: accepted on 17 October 2022, awaiting publication (expected to appear in 2025; the core models are released ahead of time, and a star is appreciated if you find them useful) 5 | * Code: to be released once the paper is published (modified from the baseline paper's code) 6 | * Baseline paper (2021): 7 | * Journal: Computers in Industry (SCI impact factor 11.245, Q1) 8 | * Authors: Božič J, Tabernik D, Skočaj D.
9 | * Paper: [Mixed supervision for surface-defect detection: From weakly to fully supervised learning](https://www.webofscience.com/wos/alldb/full-record/WOS:000648879500012) 10 | * Code: [https://github.com/vicoslab/mixed-segdec-net-comind2021](https://github.com/vicoslab/mixed-segdec-net-comind2021) 11 | ## 2. Overview 12 | * Purpose: industrial surface defect detection (classification and segmentation) 13 | * Input: 14 | * sample images from the dataset 15 | * **image-level labels (binary classification labels: only the presence or absence of a defect needs to be annotated, i.e. True or False)** 16 | * Output: 17 | * segmentation map (outline and location of the defect) 18 | * classification prediction (probability that the sample image contains a defect) 19 | * Experimental datasets: KolektorSDD2 (KSDD2), DAGM 1-10, KolektorSDD (KSDD), Severstal Steel (descriptions and download links are at the end of this README) 20 | * Two models: 21 | * SC-Net (models_z_new_SC_03.py) 22 | * SSC-Net (models_trans11.py) 23 | * A note from the author: after a year away from this code, it took about an hour to work out which two models were actually used in the end (over a thousand experiments, with piles of models, run scripts and experiment records). Although the baseline code was modified substantially, the codebase is still messy, and there has not been time to curate a cleaned-up final version. 24 | * Core idea: for weakly supervised classification and segmentation with only binary image-level labels, the required model can be essentially the same as one trained with pixel-level labels or coarse region labels (e.g. ellipse or bounding-box annotations). The key step is to feed the segmentation model's output map into a simple classification model (for experiments, a 2-3 layer network is enough) and let the classifier's backpropagation supply the supervision signal that the segmentation model is missing for defective samples. In this way, in principle any fully supervised segmentation model can be converted into a weakly supervised classification and segmentation model that needs only binary labels ("any" is a conjecture, but the method is simple enough to try; a minimal sketch is given in Section 5 below). 25 | * Tips: the baseline model does not process samples in batches and therefore runs very slowly; this is easy to change. Some KSDD2 samples appear to be mislabeled, so do not expect to push classification performance to 100%. During experiments it helps to borrow the idea of CAM: in the added classification model, take the feature maps of the last convolutional layer, weight them by the fully connected layer's parameters, and the weighted sum is itself a rough segmentation map; if that map roughly localizes the defects, the classification model is genuinely working (and if the segmentation model itself worked every time, this check would be unnecessary). Surface defect detection is currently used mostly in small-sample settings, so a model this small is sufficient, but industry probably needs larger, general-purpose pre-trained models; anyone interested is welcome to take the core idea to large models and large datasets, which was not possible here for lack of hardware. 26 | 27 | ## 3. Results 28 | ### (1) Table 1: Average precision (AP) on the four datasets 29 | | Paper | Label type | KSDD2 | DAGM | KSDD | Severstal Steel | 30 | | :---: | --- | :---: | :---: | :---: | :---: | 31 | | Ours | image-level labels (harder setting) | 96.0 | 100 | 99.4 | 96.4 | 32 | | Baseline | image-level labels (harder setting) | 73.3 | 74.0 | 93.4 | 90.3 | 33 | | Baseline | pixel-level / ellipse / box labels | 95.4 | 100 | 100 | 97.7 | 34 | * Notes: 35 | * AP is better suited than accuracy-style metrics for defect-detection datasets with a severe imbalance between positive and negative samples 36 | * The experiment records also contain other metrics, including AUC, accuracy (AC), F1 and its threshold, FP and FN 37 | 38 | ### (2) Table 2: Comparison of model size and detection speed (KSDD2 dataset) 39 | | Paper | Model size | Training speed | Detection speed | Training epochs | Training time | 40 | | :---: | :---: | :---: | :---: | :---: | :---: | 41 | | Ours | 2.4M | 9.8 s/epoch | 125 samples/s | 91 | 20 min | 42 | | Baseline | 59.7M | 53.5 s/epoch | 33 samples/s | 31 | 30 min | 43 | 44 | ### (3) Segmentation maps (KSDD) 45 | 46 | 47 | ## 4. Datasets 48 | ### (1) Descriptions and download links 49 | * KolektorSDD2. 50 | * This dataset consists of color images of defective electrical commutators, provided and partially annotated by Kolektor Group d.o.o. The images were captured by a visual inspection system in a controlled environment and are of similar size, roughly 230 pixels wide and 630 pixels high. The dataset is split into a training subset with 2085 negative and 246 positive samples and a test subset with 894 negative and 110 positive samples. Defects are annotated with fine-grained segmentation masks and vary in shape, size and color, from small scratches and spots to large surface defects. 51 | * [Download](https://www.vicos.si/resources/kolektorsdd2/) 52 | * DAGM. 53 | * This is a well-known benchmark dataset for surface defect detection. It contains grayscale images of ten different computer-generated surfaces with various defects such as scratches or spots. Each surface is treated as a separate binary classification problem. Six classes were released initially and four more later, so some related methods report results only on the first six classes while others report on all ten. 54 | * [Download](https://hci.iwr.uni-heidelberg.de/content/weakly-supervised-learning-industrial-optical-inspection) 55 | * KolektorSDD. 56 | * This dataset contains grayscale images of real-world production items, many of which show visible surface cracks. Because the sample size is small, the images are split into three folds, following Tabernik, Šela et al. (2019), and the final result is reported as the average over three-fold cross-validation. 57 | * [Download](https://www.vicos.si/resources/kolektorsdd/) 58 | * Severstal Steel. 59 | * This defect dataset is considerably larger than the other three, containing 12,568 grayscale images with defects of 4 classes. Only a subset of the dataset is used in the evaluation: all negative images are used, but only those positive images that contain the most common defect class (class 3) are considered. The defects are highly diverse in size, shape and appearance, ranging from scratches and dents to excess material. Although the dataset is large and diverse, some defects are very ambiguous and may not be annotated correctly. 60 | * [Download](https://www.kaggle.com/c/severstal-steel-defect-detection/data) 61 | ### (2) Table 3: Dataset splits 62 | | Dataset | Class | KSDD2 | DAGM | KSDD | Severstal Steel | 63 | | :---: | :---: | :---: | :---: | :---: | :---: | 64 | | Training set | positive | 246 | 1046 | 34 | 300 | 65 | | | negative | 2086 | 7004 | 230 | 4143 | 66 | | Validation set | positive | 110 | 1054 | 110 | 559 | 67 | | | negative | 894 | 6996 | 894 | 559 | 68 | | Test set | positive | 110 | 1054 | 110 | 1200 | 69 | | | negative | 894 | 6996 | 894 | 1200 | 70 | * Notes: 71 | * Images containing defects are positive samples; defect-free images are negative samples 72 | * The first three datasets have no true validation set; the test set is used in its place
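## 5. Minimal sketch of the core idea

The snippet below is an illustrative sketch of the method described in Section 2, not the released training code: a segmentation network that outputs a 1-channel defect map is wrapped with a 2-3 layer classification head and trained end to end from binary image-level labels alone. The wrapper name `WeaklySupervisedWrapper`, the assumed `seg_net` interface, the layer sizes and the training/CAM lines in the comments are all assumptions made for this example.

```python
import torch
import torch.nn as nn


class WeaklySupervisedWrapper(nn.Module):
    """Wrap any segmentation net that outputs a (B, 1, H, W) defect map with a tiny classifier head."""

    def __init__(self, seg_net):
        super().__init__()
        self.seg_net = seg_net                           # any fully supervised segmentation model
        self.head = nn.Sequential(                       # small decision head, 2 layers is enough for a test
            nn.Conv2d(1, 32, kernel_size=5, padding=2),
            nn.ReLU(),
        )
        self.fc = nn.Linear(32, 1)

    def forward(self, x):
        seg_mask = self.seg_net(x)                       # (B, 1, H, W) defect map
        feat = self.head(seg_mask)                       # (B, 32, H, W) decision features
        pooled = feat.amax(dim=(-2, -1))                 # global max pooling -> (B, 32)
        logit = self.fc(pooled)                          # (B, 1) image-level defect logit
        return logit, seg_mask, feat


# Training uses only binary image-level labels; the classification loss
# backpropagates through seg_mask and supervises the segmentation net:
#   logit, seg_mask, feat = model(images)
#   loss = nn.functional.binary_cross_entropy_with_logits(logit.squeeze(1), labels.float())
#   loss.backward(); optimizer.step()
#
# CAM-style sanity check (see the tips in Section 2): weight the decision
# features by the fc weights and sum over channels; if the result roughly
# localizes the defects, the classification head is genuinely working:
#   cam = (feat * model.fc.weight.view(1, -1, 1, 1)).sum(dim=1)   # (B, H, W)
```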
73 | -------------------------------------------------------------------------------- /models_trans11.py: -------------------------------------------------------------------------------- 1 | import math 2 | import torch 3 | import torch.nn as nn 4 | from torch.nn import init 5 | 6 | # SegDecNet is the model; the remaining classes are utilities 7 | 8 | BATCHNORM_TRACK_RUNNING_STATS = False 9 | BATCHNORM_MOVING_AVERAGE_DECAY = 0.9997 10 | 11 | 12 | class BNorm_init(nn.BatchNorm2d): 13 | def reset_parameters(self): 14 | init.uniform_(self.weight, 0, 1) 15 | init.zeros_(self.bias) 16 | 17 | 18 | class Conv2d_init(nn.Conv2d): 19 | def __init__(self, in_channels, out_channels, kernel_size, stride=1, padding=0, dilation=1, groups=1, bias=True, 20 | padding_mode="zeros"): 21 | super(Conv2d_init, self).__init__(in_channels, out_channels, kernel_size, stride, padding, dilation, groups, 22 | bias, padding_mode) 23 | 24 | def reset_parameters(self): 25 | init.xavier_normal_(self.weight) 26 | if self.bias is not None: 27 | fan_in, _ = init._calculate_fan_in_and_fan_out(self.weight) 28 | bound = 1 / math.sqrt(fan_in) 29 | init.uniform_(self.bias, -bound, bound) 30 | 31 | 32 | def _conv_block(in_chanels, out_chanels, kernel_size, padding, stride=1): 33 | return nn.Sequential(Conv2d_init(in_channels=in_chanels, out_channels=out_chanels, 34 | kernel_size=kernel_size, stride=stride, padding=padding, bias=False), 35 | FeatureNorm(num_features=out_chanels, eps=0.001), 36 | nn.ReLU()) 37 | 38 | 39 | class FeatureNorm(nn.Module): 40 | def __init__(self, num_features, feature_index=1, rank=4, reduce_dims=(2, 3), eps=0.001, include_bias=True): 41 | super(FeatureNorm, self).__init__() 42 | self.shape = [1] * rank 43 | self.shape[feature_index] = num_features 44 | self.reduce_dims = reduce_dims 45 | 46 | self.scale = nn.Parameter(torch.ones(self.shape, requires_grad=True, dtype=torch.float)) 47 | self.bias = nn.Parameter( 48 | torch.zeros(self.shape, requires_grad=True, dtype=torch.float)) if include_bias else nn.Parameter( 49 | torch.zeros(self.shape, requires_grad=False, dtype=torch.float)) 50 | 51 | self.eps = eps 52 | 53 | def forward(self, features): 54 | f_std = torch.std(features, dim=self.reduce_dims, keepdim=True) 55 | f_mean = torch.mean(features, dim=self.reduce_dims, keepdim=True) 56 | return self.scale * ((features - f_mean) / (f_std + self.eps).sqrt()) + self.bias 57 | 58 | 59 | # Inherits from nn.Module (itself a class, able to keep track of state) 60 | class SegDecNet(nn.Module): 61 | def __init__(self, device, input_width, input_height, input_channels, drop_p): 62 | super(SegDecNet, self).__init__() 63 | if input_width % 8 != 0 or input_height % 8 != 0: 64 | raise Exception(f"Input size must be divisible by 8! width={input_width}, height={input_height}") 65 | self.input_width = input_width 66 | self.input_height = input_height 67 | self.input_channels = input_channels 68 | # Segmentation stage 1: convolutions (feature extraction) 69 | self.volume = nn.Sequential(_conv_block(self.input_channels, 32, 3, 1), 70 | _conv_block(32, 64, 3, 1), 71 | nn.MaxPool2d(2), 72 | _conv_block(64, 64, 3, 1), 73 | nn.MaxPool2d(2), 74 | _conv_block(64, 128, 3, 1), 75 | nn.MaxPool2d(2), 76 | _conv_block(128, 256, 3, 1), 77 | ) 78 | 79 | # Segmentation stage 2: produce the segmentation mask 80 | self.seg_mask = nn.Sequential( 81 | Conv2d_init(in_channels=256, out_channels=1, kernel_size=1, padding=0, bias=False), 82 | FeatureNorm(num_features=1, eps=0.001, include_bias=False), 83 | # nn.ReLU(), 84 | # nn.LeakyReLU(negative_slope=0.01, inplace=False), 85 | # nn.Sigmoid() 86 | # nn.Tanh() 87 | ) # no activation here: the mask is kept as FeatureNorm output (several activations were tried and commented out)
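        # Added commentary: the decision (classification) head defined below sees only the
        # 1-channel segmentation mask produced above. One 5x5 conv block (1 -> 36 channels),
        # a global max pool and a single linear unit turn the mask into an image-level defect
        # score, and the classification loss that flows back through this head is what
        # supervises the segmentation layers when only binary image-level labels are available.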
88 | 89 | # Decision stage 1: convolution (feature extraction) 90 | self.extractor = nn.Sequential( 91 | # nn.MaxPool2d(kernel_size=2), 92 | _conv_block(in_chanels=1, out_chanels=36, kernel_size=5, padding=2), 93 | # nn.MaxPool2d(kernel_size=2), 94 | # _conv_block(in_chanels=4, out_chanels=8, kernel_size=5, padding=2), 95 | # nn.MaxPool2d(kernel_size=2), 96 | # _conv_block(in_chanels=8, out_chanels=16, kernel_size=5, padding=2), 97 | # nn.MaxPool2d(kernel_size=2), 98 | # _conv_block(in_chanels=32, out_chanels=64, kernel_size=3, padding=1), 99 | ) 100 | 101 | # Decision stage 2: pooling 102 | # self.global_max_pool_feat = nn.MaxPool2d(kernel_size=32) 103 | # self.global_avg_pool_feat = nn.AvgPool2d(kernel_size=32) 104 | # self.global_max_pool_seg = nn.MaxPool2d(kernel_size=(self.input_height / 8, self.input_width / 8)) 105 | # self.global_avg_pool_seg = nn.AvgPool2d(kernel_size=(self.input_height / 8, self.input_width / 8)) 106 | 107 | # Decision stage 3: final 1-unit classifier 108 | self.fc = nn.Sequential(nn.Dropout(p=drop_p), 109 | nn.Linear(in_features=36, out_features=1)) 110 | 111 | # Custom autograd ops that scale the gradient flowing back from the decision head (see GradientMultiplyLayer at the bottom of this file) 112 | self.volume_lr_multiplier_layer = GradientMultiplyLayer().apply 113 | self.glob_max_lr_multiplier_layer = GradientMultiplyLayer().apply 114 | self.glob_avg_lr_multiplier_layer = GradientMultiplyLayer().apply 115 | 116 | # GPU 117 | self.device = device 118 | 119 | # torch.ones((1,)) == tensor([1.]) 120 | def set_gradient_multipliers(self, multiplier): 121 | self.volume_lr_multiplier_mask = (torch.ones((1,)) * multiplier).to(self.device) 122 | self.glob_max_lr_multiplier_mask = (torch.ones((1,)) * multiplier).to(self.device) 123 | self.glob_avg_lr_multiplier_mask = (torch.ones((1,)) * multiplier).to(self.device) 124 | 125 | def forward(self, input): 126 | # Segmentation stage 1: convolutions (feature extraction) 127 | volume = self.volume(input) 128 | # Segmentation stage 2: produce the segmentation mask 129 | seg_mask = self.seg_mask(volume) 130 | 131 | # Gradient-flow control: scales the gradient flowing back into the segmentation layers 132 | seg_mask = self.volume_lr_multiplier_layer(seg_mask, self.volume_lr_multiplier_mask) 133 | 134 | # Decision stage 1: convolution (feature extraction) 135 | features = self.extractor(seg_mask) 136 | # Decision stage 2: pooling 137 | global_max_feat = torch.max(torch.max(features, dim=-1, keepdim=True)[0], dim=-2, keepdim=True)[0] 138 | # global_avg_feat = torch.mean(features, dim=(-1, -2), keepdim=True) 139 | # global_max_seg = torch.max(torch.max(seg_mask, dim=-1, keepdim=True)[0], dim=-2, keepdim=True)[0] 140 | # global_avg_seg = torch.mean(seg_mask, dim=(-1, -2), keepdim=True) 141 | 142 | # Reshape (the custom gradient scaling for the pooled segmentation outputs is commented out below) 143 | global_max_feat = global_max_feat.reshape(global_max_feat.size(0), -1) 144 | # global_avg_feat = global_avg_feat.reshape(global_avg_feat.size(0), -1) 145 | 146 | # global_max_seg = global_max_seg.reshape(global_max_seg.size(0), -1) 147 | # print(global_max_seg.shape) 148 | # global_max_seg = self.glob_max_lr_multiplier_layer(global_max_seg, self.glob_max_lr_multiplier_mask) 149 | # global_avg_seg = global_avg_seg.reshape(global_avg_seg.size(0), -1) 150 | # global_avg_seg = self.glob_avg_lr_multiplier_layer(global_avg_seg, self.glob_avg_lr_multiplier_mask) 151 | 152 | # Concatenate and reshape the pooled results 153 | # fc_in = torch.cat([global_max_feat, global_avg_feat, global_max_seg, global_avg_seg], dim=1) 154 | # fc_in = torch.cat([global_max_feat, global_avg_feat], dim=1) 155 | # fc_in = torch.cat([global_max_seg, global_avg_seg], dim=1) 156 | # fc_in = fc_in.reshape(fc_in.size(0), -1) 157 | fc_in = global_max_feat.reshape(global_max_feat.size(0), -1) 158 | # Decision stage 3: final 1-unit classifier 159 | prediction = self.fc(fc_in) 160 | 161 | return prediction, seg_mask, features 162 | 163 | 164 | # PyTorch can differentiate most operations automatically, but some operations need a custom backward pass; this is what "Extending torch.autograd" is for
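# Added commentary: GradientMultiplyLayer below is the identity in the forward pass and
# multiplies the incoming gradient by mask_bw in the backward pass. Together with
# set_gradient_multipliers() it lets the training code scale, or block with multiplier 0,
# the gradient flowing from the decision head back into the segmentation layers
# (cf. GRADIENT_ADJUSTMENT in the run_params.txt files). Illustrative usage sketch,
# assuming x is a tensor with requires_grad=True:
#   scale = torch.tensor([0.1])
#   y = GradientMultiplyLayer.apply(x, scale)   # forward: y equals x
#   y.sum().backward()                          # backward: x.grad is 0.1 everywhere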
165 | class GradientMultiplyLayer(torch.autograd.Function): 166 | @staticmethod 167 | def forward(ctx, input, mask_bw): 168 | ctx.save_for_backward(mask_bw) 169 | return input 170 | 171 | @staticmethod 172 | def backward(ctx, grad_output): 173 | mask_bw, = ctx.saved_tensors 174 | return grad_output.mul(mask_bw), None 175 | -------------------------------------------------------------------------------- /models_z_new_SC_03.py: -------------------------------------------------------------------------------- 1 | import math 2 | import torch 3 | import torch.nn as nn 4 | from torch.nn import init 5 | 6 | # SegDecNet is the model (SC-Net); the remaining classes are utilities 7 | 8 | BATCHNORM_TRACK_RUNNING_STATS = False 9 | BATCHNORM_MOVING_AVERAGE_DECAY = 0.9997 10 | 11 | 12 | class BNorm_init(nn.BatchNorm2d): 13 | def reset_parameters(self): 14 | init.uniform_(self.weight, 0, 1) 15 | init.zeros_(self.bias) 16 | 17 | 18 | class Conv2d_init(nn.Conv2d): 19 | def __init__(self, in_channels, out_channels, kernel_size, stride=1, padding=0, dilation=1, groups=1, bias=True, 20 | padding_mode="zeros"): 21 | super(Conv2d_init, self).__init__(in_channels, out_channels, kernel_size, stride, padding, dilation, groups, 22 | bias, padding_mode) 23 | 24 | def reset_parameters(self): 25 | init.xavier_normal_(self.weight) 26 | if self.bias is not None: 27 | fan_in, _ = init._calculate_fan_in_and_fan_out(self.weight) 28 | bound = 1 / math.sqrt(fan_in) 29 | init.uniform_(self.bias, -bound, bound) 30 | 31 | 32 | def _conv_block(in_chanels, out_chanels, kernel_size, padding, stride=1): 33 | return nn.Sequential(Conv2d_init(in_channels=in_chanels, out_channels=out_chanels, 34 | kernel_size=kernel_size, stride=stride, padding=padding, bias=False), 35 | FeatureNorm(num_features=out_chanels, eps=0.001), 36 | nn.ReLU()) 37 | 38 | 39 | class FeatureNorm(nn.Module): 40 | def __init__(self, num_features, feature_index=1, rank=4, reduce_dims=(2, 3), eps=0.001, include_bias=True): 41 | super(FeatureNorm, self).__init__() 42 | self.shape = [1] * rank 43 | self.shape[feature_index] = num_features 44 | self.reduce_dims = reduce_dims 45 | 46 | self.scale = nn.Parameter(torch.ones(self.shape, requires_grad=True, dtype=torch.float)) 47 | self.bias = nn.Parameter( 48 | torch.zeros(self.shape, requires_grad=True, dtype=torch.float)) if include_bias else nn.Parameter( 49 | torch.zeros(self.shape, requires_grad=False, dtype=torch.float)) 50 | 51 | self.eps = eps 52 | 53 | def forward(self, features): 54 | f_std = torch.std(features, dim=self.reduce_dims, keepdim=True) 55 | f_mean = torch.mean(features, dim=self.reduce_dims, keepdim=True) 56 | return self.scale * ((features - f_mean) / (f_std + self.eps).sqrt()) + self.bias 57 | 58 | # # First version 59 | # class _res_block(nn.Module): 60 | # def __init__(self, in_chanels, out_chanels, kernel_size, padding): 61 | # super(_res_block, self).__init__() 62 | # self.same_block = _conv_block(in_chanels, out_chanels, 1, 0) 63 | # self.block = nn.Sequential(_conv_block(out_chanels, out_chanels, kernel_size, padding), 64 | # _conv_block(out_chanels, out_chanels, kernel_size, padding), 65 | # ) 66 | # self.relu = nn.ReLU(False) 67 | # 68 | # def forward(self, in_put): 69 | # x0 = self.same_block(in_put) 70 | # x1 = self.block(x0) 71 | # x = x0 + x1 72 | # x = self.relu(x) 73 | # return x 74 | 75 | 76 | # Second version 77 | class _res_block(nn.Module): 78 | def __init__(self, in_chanels, out_chanels, kernel_size, padding): 79 | super(_res_block, self).__init__() 80 | self.block = 
nn.Sequential(_conv_block(in_chanels, in_chanels, kernel_size, padding), 81 | _conv_block(in_chanels, in_chanels, kernel_size, padding), 82 | ) 83 | # self.same_block = _conv_block(in_chanels, out_chanels, 1, 0) 84 | self.relu = nn.ReLU(False) 85 | 86 | def forward(self, in_put): 87 | x0 = in_put 88 | x1 = self.block(x0) 89 | x = x0 + x1 90 | x = self.relu(x) 91 | # x = self.same_block(x) 92 | return x 93 | 94 | 95 | # # Second version 96 | # class _res_up_block(nn.Module): 97 | # def __init__(self, in_chanels, out_chanels, kernel_size, padding): 98 | # super(_res_up_block, self).__init__() 99 | # self.block = nn.Sequential(_conv_block(in_chanels, in_chanels, kernel_size, padding), 100 | # _conv_block(in_chanels, in_chanels, kernel_size, padding), 101 | # ) 102 | # self.same_block = _conv_block(in_chanels, out_chanels, 1, 0) 103 | # self.relu = nn.ReLU(False) 104 | # 105 | # def forward(self, in_put): 106 | # x0 = in_put 107 | # x1 = self.block(x0) 108 | # x = x0 + x1 109 | # x = self.relu(x) 110 | # x = self.same_block(x) 111 | # return x 112 | 113 | 114 | # Fourth version 115 | class _res_pool_block(nn.Module): 116 | def __init__(self, in_chanels, out_chanels, kernel_size, padding, stride=2): 117 | super(_res_pool_block, self).__init__() 118 | self.pool_block = _conv_block(in_chanels, out_chanels, 1, 0, stride) 119 | self.block = nn.Sequential(_conv_block(in_chanels, out_chanels, kernel_size, padding, stride), 120 | _conv_block(out_chanels, out_chanels, kernel_size, padding), 121 | ) 122 | self.relu = nn.ReLU(False) 123 | 124 | def forward(self, in_put): 125 | x0 = self.pool_block(in_put) 126 | x1 = self.block(in_put) 127 | x = x0 + x1 128 | x = self.relu(x) 129 | return x 130 | 131 | 132 | # # Third version 133 | # class _res_block(nn.Module): 134 | # def __init__(self, in_chanels, out_chanels, kernel_size, padding): 135 | # super(_res_block, self).__init__() 136 | # self.block = nn.Sequential(_conv_block(in_chanels, in_chanels, kernel_size, padding), 137 | # _conv_block(in_chanels, out_chanels, kernel_size, padding), 138 | # ) 139 | # self.same_block = _conv_block(in_chanels, out_chanels, 1, 0) 140 | # self.relu = nn.ReLU(False) 141 | # 142 | # def forward(self, in_put): 143 | # x0 = self.same_block(in_put) 144 | # x1 = self.block(in_put) 145 | # x = x0 + x1 146 | # x = self.relu(x) 147 | # return x 148 | 149 | 150 | # Inherits from nn.Module (itself a class, able to keep track of state) 151 | class SegDecNet(nn.Module): 152 | def __init__(self, device, input_width, input_height, input_channels, drop_p): 153 | super(SegDecNet, self).__init__() 154 | if input_width % 8 != 0 or input_height % 8 != 0: 155 | raise Exception(f"Input size must be divisible by 8! width={input_width}, height={input_height}") 156 | self.input_width = input_width 157 | self.input_height = input_height 158 | self.input_channels = input_channels 159 | # Segmentation stage 1: convolutions (feature extraction) 160 | self.volume = nn.Sequential(_conv_block(self.input_channels, 32, 3, 1), 161 | _conv_block(32, 64, 3, 1), 162 | nn.MaxPool2d(2), 163 | _conv_block(64, 64, 3, 1), 164 | nn.MaxPool2d(2), 165 | _conv_block(64, 128, 3, 1), 166 | nn.MaxPool2d(2), 167 | _conv_block(128, 256, 3, 1), 168 | ) 169 | 170 | # Segmentation stage 2: produce the segmentation mask 171 | self.seg_mask = nn.Sequential( 172 | Conv2d_init(in_channels=256, out_channels=1, kernel_size=1, padding=0, bias=False), 173 | FeatureNorm(num_features=1, eps=0.001, include_bias=False), 174 | # nn.Sigmoid() 175 | # nn.ReLU() 176 | ) # no activation here: the mask is kept as FeatureNorm output (alternatives were tried and commented out)
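        # Added commentary: unlike the decision head in models_trans11.py, the extractor
        # below takes the 256-channel feature volume concatenated with the 1-channel
        # segmentation mask (257 input channels), and the final classifier combines the
        # max- and average-pooled extractor features (32 + 32) with the max- and
        # average-pooled mask values (1 + 1), giving the 66 inputs of the linear layer.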
177 | 178 | # Decision stage 1: convolution (feature extraction) 179 | self.extractor = nn.Sequential( 180 | # nn.MaxPool2d(kernel_size=2), 181 | _conv_block(in_chanels=257, out_chanels=32, kernel_size=5, padding=2), 182 | # nn.MaxPool2d(kernel_size=2), 183 | # _conv_block(in_chanels=16, out_chanels=32, kernel_size=5, padding=2), 184 | # nn.MaxPool2d(kernel_size=2), 185 | # _conv_block(in_chanels=16, out_chanels=32, kernel_size=5, padding=2), 186 | # nn.MaxPool2d(kernel_size=2), 187 | # _conv_block(in_chanels=32, out_chanels=64, kernel_size=3, padding=1), 188 | ) 189 | 190 | # Decision stage 2: pooling 191 | # self.global_max_pool_feat = nn.MaxPool2d(kernel_size=32) 192 | # self.global_avg_pool_feat = nn.AvgPool2d(kernel_size=32) 193 | # self.global_max_pool_seg = nn.MaxPool2d(kernel_size=(self.input_height / 8, self.input_width / 8)) 194 | # self.global_avg_pool_seg = nn.AvgPool2d(kernel_size=(self.input_height / 8, self.input_width / 8)) 195 | 196 | # Decision stage 3: final 1-unit classifier (66 inputs = 32 + 32 pooled features plus 1 + 1 pooled mask values) 197 | self.fc = nn.Sequential(nn.Dropout(p=drop_p), 198 | nn.Linear(in_features=66, out_features=1)) 199 | 200 | # Custom autograd ops that scale the gradient flowing back from the decision head (see GradientMultiplyLayer at the bottom of this file) 201 | self.volume_lr_multiplier_layer = GradientMultiplyLayer().apply 202 | self.glob_max_lr_multiplier_layer = GradientMultiplyLayer().apply 203 | self.glob_avg_lr_multiplier_layer = GradientMultiplyLayer().apply 204 | 205 | # GPU 206 | self.device = device 207 | 208 | # torch.ones((1,)) == tensor([1.]) 209 | def set_gradient_multipliers(self, multiplier): 210 | self.volume_lr_multiplier_mask = (torch.ones((1,)) * multiplier).to(self.device) 211 | self.glob_max_lr_multiplier_mask = (torch.ones((1,)) * multiplier).to(self.device) 212 | self.glob_avg_lr_multiplier_mask = (torch.ones((1,)) * multiplier).to(self.device) 213 | 214 | def forward(self, input): 215 | # Segmentation stage 1: convolutions (feature extraction) 216 | volume = self.volume(input) 217 | # Segmentation stage 2: produce the segmentation mask 218 | seg_mask = self.seg_mask(volume) 219 | 220 | # Concatenate the feature volume with the segmentation mask: cat has 256 + 1 = 257 channels 221 | cat = torch.cat([volume, seg_mask], dim=1) 222 | 223 | # Gradient-flow control: scales the gradient flowing back into the segmentation layers 224 | cat = self.volume_lr_multiplier_layer(cat, self.volume_lr_multiplier_mask) 225 | 226 | # Decision stage 1: convolution (feature extraction) 227 | features = self.extractor(cat) 228 | # Decision stage 2: pooling 229 | global_max_feat = torch.max(torch.max(features, dim=-1, keepdim=True)[0], dim=-2, keepdim=True)[0] 230 | global_avg_feat = torch.mean(features, dim=(-1, -2), keepdim=True) 231 | global_max_seg = torch.max(torch.max(seg_mask, dim=-1, keepdim=True)[0], dim=-2, keepdim=True)[0] 232 | global_avg_seg = torch.mean(seg_mask, dim=(-1, -2), keepdim=True) 233 | 234 | # Reshape, and apply the custom gradient scaling to the pooled segmentation outputs 235 | global_max_feat = global_max_feat.reshape(global_max_feat.size(0), -1) 236 | global_avg_feat = global_avg_feat.reshape(global_avg_feat.size(0), -1) 237 | 238 | global_max_seg = global_max_seg.reshape(global_max_seg.size(0), -1) 239 | global_max_seg = self.glob_max_lr_multiplier_layer(global_max_seg, self.glob_max_lr_multiplier_mask) 240 | global_avg_seg = global_avg_seg.reshape(global_avg_seg.size(0), -1) 241 | global_avg_seg = self.glob_avg_lr_multiplier_layer(global_avg_seg, self.glob_avg_lr_multiplier_mask) 242 | 243 | # Concatenate and reshape the pooled results 244 | fc_in = torch.cat([global_max_feat, global_avg_feat, global_max_seg, global_avg_seg], dim=1) 245 | fc_in = fc_in.reshape(fc_in.size(0), -1) 246 | # Decision stage 3: final 1-unit classifier 247 | prediction = self.fc(fc_in) 248 | 249 | # Used for CAM computation 250 | cam_feature = torch.cat([features, features, seg_mask, seg_mask], dim=1) 251 | 252 | return prediction, seg_mask, cam_feature 253 | 254 | 255 | # PyTorch can differentiate most operations automatically, but some operations need a custom backward pass; this is what "Extending torch.autograd" is for 256 | class 
GradientMultiplyLayer(torch.autograd.Function): 257 | @staticmethod 258 | def forward(ctx, input, mask_bw): 259 | ctx.save_for_backward(mask_bw) 260 | return input 261 | 262 | @staticmethod 263 | def backward(ctx, grad_output): 264 | mask_bw, = ctx.saved_tensors 265 | return grad_output.mul(mask_bw), None 266 | -------------------------------------------------------------------------------- /results/01_KSDD2/ksdd2_z_new_SC_00_grandient_open/ROC.pdf: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/Fishdrink/SC-Net/51647e4cd1d665ce074f7a3286420523afd36487/results/01_KSDD2/ksdd2_z_new_SC_00_grandient_open/ROC.pdf -------------------------------------------------------------------------------- /results/01_KSDD2/ksdd2_z_new_SC_00_grandient_open/loss_val.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/Fishdrink/SC-Net/51647e4cd1d665ce074f7a3286420523afd36487/results/01_KSDD2/ksdd2_z_new_SC_00_grandient_open/loss_val.png -------------------------------------------------------------------------------- /results/01_KSDD2/ksdd2_z_new_SC_00_grandient_open/losses.csv: -------------------------------------------------------------------------------- 1 | validation_data,loss_dec,loss,epoch 2 | 0.6516417507233658,0.5266713466101546,1.1783130973335205,0 3 | 0.4815427636712547,0.36170087742611645,0.8432436410973712,1 4 | 0.37437240767285107,0.33262679150434044,0.7069991991771916,2 5 | 0.3022175484556493,0.3299857957091758,0.6322033441648252,3 6 | 0.2513740159631745,0.2856486290693283,0.5370226450325029,4 7 | 0.21407989854735088,0.2392869042914088,0.4533668028387597,5 8 | 0.18574350781557036,0.2619008948284436,0.44764440264401395,6 9 | 0.16349955399831137,0.21550854519615328,0.37900809919446465,7 10 | 0.14577493241162803,0.16531849488979433,0.31109342730142236,8 11 | 0.13122277628115522,0.14851600858496455,0.2797387848661198,9 12 | 0.11914764066052631,0.11722692138538128,0.2363745620459076,10 13 | 0.10904823013437473,0.18415546697390273,0.29320369710827743,11 14 | 0.10043769275269858,0.06822086861584245,0.16865856136854102,12 15 | 0.09307318901628013,0.15279041960593162,0.24586360862221174,13 16 | 0.08665070713050967,0.11311612257022198,0.19976682970073165,14 17 | 0.08104787003703234,0.1062134699668826,0.18726134000391495,15 18 | 0.07609333958083052,0.0677880974774196,0.14388143705825013,16 19 | 0.07167199290380245,0.025554687117900307,0.09722668002170276,17 20 | 0.06774072046202373,0.019465382067047482,0.08720610252907121,18 21 | 0.0641917450641229,0.03383524062460273,0.09802698568872563,19 22 | 0.06101378665222385,0.05988975577965015,0.12090354243187401,20 23 | 0.05811704610421405,0.01395783898973368,0.07207488509394773,21 24 | 0.05546101612773368,0.011187791623690022,0.0666488077514237,22 25 | 0.053050117885194176,0.2184780565718931,0.27152817445708727,23 26 | 0.050825572474216055,0.037946437655879954,0.088772010130096,24 27 | 0.048782120754079124,0.021248003059044118,0.07003012381312324,25 28 | 0.046890885364718554,0.006797382818171527,0.05368826818289008,26 29 | 0.04512568532935972,0.006938882705914538,0.052064568035274254,27 30 | 0.043503361625400014,0.03630198849974639,0.0798053501251464,28 31 | 0.04198698814564604,0.00967544348100276,0.0516624316266488,29 32 | 0.040568436609535685,0.004795374102678483,0.04536381071221417,30 33 | -------------------------------------------------------------------------------- 
/results/01_KSDD2/ksdd2_z_new_SC_00_grandient_open/precision-recall.pdf: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/Fishdrink/SC-Net/51647e4cd1d665ce074f7a3286420523afd36487/results/01_KSDD2/ksdd2_z_new_SC_00_grandient_open/precision-recall.pdf -------------------------------------------------------------------------------- /results/01_KSDD2/ksdd2_z_new_SC_00_grandient_open/run_params.txt: -------------------------------------------------------------------------------- 1 | AREA_RATIO_MIN:0.9 2 | BATCH_SIZE:12 3 | CONTINUE_LAST_TRAIN:None 4 | DATASET:KSDD2 5 | DATASET_PATH:./datasets/KSDD2/ 6 | DELTA_CLS_LOSS:1.0 7 | DILATE:15 8 | DROPOUT_P:0.0 9 | DYN_BALANCED_LOSS:False 10 | EPOCHS:31 11 | FOLD:None 12 | FREQUENCY_SAMPLING:True 13 | GPU:0 14 | GRADIENT_ADJUSTMENT:True 15 | INPUT_CHANNELS:3 16 | INPUT_HEIGHT:640 17 | INPUT_WIDTH:232 18 | LEARNING_RATE:0.06 19 | LOSS_SEG_THR:0.02 20 | MEMORY_FIT:12 21 | MODEL_NAME:models_z_new_SC_00 22 | NUM_SEGMENTED:0 23 | ON_DEMAND_READ:False 24 | OPTIMIZER:SGD 25 | REPRODUCIBLE_RUN:True 26 | RESULTS_PATH:./results_new 27 | SAMPLING:half_mixed 28 | SAVE_IMAGES:True 29 | TRAIN_NUM:-1 30 | TRANS_BRIGHT:1.0 31 | TRANS_KEEP_LOOP:1 32 | TRANS_NUM:4 33 | USE_BEST_MODEL:False 34 | VALIDATE:True 35 | VALIDATE_ON_TEST:True 36 | VALIDATION_N_EPOCHS:3 37 | VOLUME_CFG:None 38 | WEIGHTED_SEG_LOSS:False 39 | WEIGHTED_SEG_LOSS_MAX:3.0 40 | WEIGHTED_SEG_LOSS_P:2.0 41 | -------------------------------------------------------------------------------- /results/01_KSDD2/ksdd2_z_new_SC_01_grandient_open(对比实验_C_Net)/ROC.pdf: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/Fishdrink/SC-Net/51647e4cd1d665ce074f7a3286420523afd36487/results/01_KSDD2/ksdd2_z_new_SC_01_grandient_open(对比实验_C_Net)/ROC.pdf -------------------------------------------------------------------------------- /results/01_KSDD2/ksdd2_z_new_SC_01_grandient_open(对比实验_C_Net)/loss_val.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/Fishdrink/SC-Net/51647e4cd1d665ce074f7a3286420523afd36487/results/01_KSDD2/ksdd2_z_new_SC_01_grandient_open(对比实验_C_Net)/loss_val.png -------------------------------------------------------------------------------- /results/01_KSDD2/ksdd2_z_new_SC_01_grandient_open(对比实验_C_Net)/losses.csv: -------------------------------------------------------------------------------- 1 | validation_data,loss_dec,loss,epoch 2 | 0.6516417507233658,0.6218904666299743,1.27353221735334,0 3 | 0.4815427636712547,0.4440935254096985,0.9256362890809532,1 4 | 0.37437240767285107,0.4795424174971697,0.8539148251700208,2 5 | 0.3022175484556493,0.40904270027711137,0.7112602487327606,3 6 | 0.2513740159631745,0.40741433966450574,0.6587883556276802,4 7 | 0.21407989854735088,0.34892431483036135,0.5630042133777122,5 8 | 0.18574350781557036,0.36947550606436846,0.5552190138799389,6 9 | 0.16349955399831137,0.3693160232974262,0.5328155772957376,7 10 | 0.14577493241162803,0.329799223358069,0.475574155769697,8 11 | 0.13122277628115522,0.29078132677369,0.42200410305484526,9 12 | 0.11914764066052631,0.3476652312811797,0.466812871941706,10 13 | 0.10904823013437473,0.3344276365710468,0.4434758667054215,11 14 | 0.10043769275269858,0.24681541584129255,0.3472531085939911,12 15 | 0.09307318901628013,0.3077382405356663,0.40081142955194643,13 16 | 0.08665070713050967,0.2646090267150383,0.35125973384554793,14 17 | 
0.08104787003703234,0.3494898372791647,0.43053770731619706,15 18 | 0.07609333958083052,0.24045905131634657,0.31655239089717707,16 19 | 0.07167199290380245,0.23669384918137779,0.30836584208518025,17 20 | 0.06774072046202373,0.26862285982787126,0.336363580289895,18 21 | 0.0641917450641229,0.32192705644340047,0.38611880150752337,19 22 | 0.06101378665222385,0.3299574461894307,0.3909712328416545,20 23 | 0.05811704610421405,0.19928386983105806,0.2574009159352721,21 24 | 0.05546101612773368,0.22206678956262466,0.27752780569035834,22 25 | 0.053050117885194176,0.33577515239395744,0.3888252702791516,23 26 | 0.050825572474216055,0.20506107846532412,0.25588665093954016,24 27 | 0.048782120754079124,0.22906128032420708,0.2778434010782862,25 28 | 0.046890885364718554,0.22693504035715165,0.2738259257218702,26 29 | 0.04512568532935972,0.1730552084925698,0.2181808938219295,27 30 | 0.043503361625400014,0.18172241862469574,0.22522578025009576,28 31 | 0.04198698814564604,0.1365163988638215,0.17850338700946755,29 32 | 0.040568436609535685,0.21272348152185844,0.2532919181313941,30 33 | 0.03924390982563903,0.26357566050397674,0.30281957032961576,31 34 | 0.03799718824343953,0.15996056507394565,0.1979577533173852,32 35 | 0.0368344574197521,0.24455551649405946,0.28138997391381154,33 36 | 0.035736825529152784,0.23112114609741582,0.2668579716265686,34 37 | 0.034689461009773784,0.19090402308033733,0.2255934840901111,35 38 | 0.033712067073438226,0.17876383769742357,0.2124759047708618,36 39 | 0.032788269706373295,0.15110666183679083,0.18389493154316414,37 40 | 0.031913344150151665,0.16307154534066595,0.1949848894908176,38 41 | 0.03107518636114229,0.1689827156260731,0.20005790198721538,39 42 | 0.03028631973557356,0.23712835769827773,0.2674146774338513,40 43 | 0.02953473739023131,0.176010866339008,0.2055456037292393,41 44 | 0.028818380481343928,0.13651475383014214,0.16533313431148605,42 45 | 0.02813681614835088,0.14888116402354667,0.17701798017189754,43 46 | 0.02747988852301264,0.15387487369097344,0.18135476221398608,44 47 | 0.02686481468561219,0.20333855082773097,0.23020336551334317,45 48 | 0.026268515160413293,0.16994636790539192,0.1962148830658052,46 49 | 0.02569852793604378,0.15061824694578727,0.17631677488183103,47 50 | 0.025148198250832598,0.1364270994833451,0.1615752977341777,48 51 | 0.02463140544610295,0.1769544854002997,0.20158589084640266,49 52 | 0.024127464892902996,0.16311058308172033,0.18723804797462332,50 53 | 0.023647941648960114,0.20055216732548503,0.22420010897444514,51 54 | 0.02318082210736546,0.12806295018976296,0.15124377229712843,52 55 | 0.022735171020030975,0.09228217350394745,0.11501734452397842,53 56 | 0.022307543977489318,0.11623593410704194,0.13854347808453127,54 57 | 0.021893720922431324,0.09533556633273034,0.11722928725516166,55 58 | 0.021494375314654375,0.109764345671709,0.13125872098636338,56 59 | 0.021109142802595123,0.07122807621925585,0.09233721902185098,57 60 | 0.020738717865168562,0.11396521192074306,0.1347039297859116,58 61 | 0.020378762538113247,0.07514265708140726,0.0955214196195205,59 62 | 0.020034991477320834,0.12920419013930287,0.1492391816166237,60 63 | 0.019699696299990986,0.158843618355752,0.178543314655743,61 64 | 0.019375295931003927,0.04737960593396328,0.06675490186496721,62 65 | 0.019061455033658966,0.04760768104768623,0.0666691360813452,63 66 | 0.018758918241999014,0.13736304439361988,0.1561219626356189,64 67 | 0.018465383387193446,0.18930696131192087,0.20777234469911432,65 68 | 0.018183654037917534,0.09833260519596619,0.11651625923388373,66 69 | 
0.017902722991094355,0.06589097510720414,0.0837936980982985,67 70 | 0.017634866532029177,0.10150757900673926,0.11914244553876843,68 71 | 0.017375454155167912,0.09088422493569977,0.10825967909086767,69 72 | 0.017117043578527807,0.13536057509179036,0.15247761867031817,70 73 | 0.016873685113055918,0.06205370601614922,0.07892739112920513,71 74 | 0.016638445751211508,0.07512136161085067,0.09175980736206218,72 75 | 0.016406894881066267,0.043952401682007605,0.06035929656307387,73 76 | 0.016180915682296442,0.05611132185648733,0.07229223753878378,74 77 | 0.015960678458213806,0.07732157345588614,0.09328225191409995,75 78 | 0.01574486127592684,0.048822598304690386,0.06456745958061723,76 79 | 0.015537116918864289,0.0813780595298584,0.09691517644872269,77 80 | 0.015335759405440432,0.12221128404958219,0.1375470434550226,78 81 | 0.015136066703050116,0.10084498811482899,0.11598105481787911,79 82 | 0.014943454232884616,0.09606217243140791,0.11100562666429253,80 83 | 0.014755991779691805,0.07784741818223422,0.09260340996192602,81 84 | 0.014572530348853367,0.033413513496374274,0.047986043845227644,82 85 | 0.014393087446205014,0.07418931158244367,0.08858239902864869,83 86 | 0.014218112923265473,0.08450884218289842,0.09872695510616389,84 87 | 0.014047564711512589,0.09972822560164017,0.11377579031315276,85 88 | 0.01388230931952717,0.11932816439709527,0.13321047371662245,86 89 | 0.013720820953206318,0.07467401288146895,0.08839483383467527,87 90 | 0.013561876354421057,0.05005571255990403,0.06361758891432509,88 91 | 0.01340853698491081,0.046771237539596916,0.06017977452450773,89 92 | 0.013256231974053189,0.029739538671418538,0.04299577064547173,90 93 | 0.013105081439745135,0.025642277205332627,0.038747358645077765,91 94 | 0.012963162598813452,0.050986794728361734,0.06394995732717519,92 95 | 0.012819771874484008,0.20349655114871457,0.2163163230231986,93 96 | 0.01268010447180368,0.12365710506654852,0.1363372095383522,94 97 | 0.01254533025307384,0.05487445135396428,0.06741978160703813,95 98 | 0.01241348038722829,0.04663220152819181,0.059045681915420097,96 99 | 0.012281381532671005,0.04743050536277091,0.059711886895441914,97 100 | 0.012157042881822198,0.03661409080997715,0.04877113369179935,98 101 | 0.012030235241825988,0.0443666737310861,0.05639690897291209,99 102 | -------------------------------------------------------------------------------- /results/01_KSDD2/ksdd2_z_new_SC_01_grandient_open(对比实验_C_Net)/precision-recall.pdf: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/Fishdrink/SC-Net/51647e4cd1d665ce074f7a3286420523afd36487/results/01_KSDD2/ksdd2_z_new_SC_01_grandient_open(对比实验_C_Net)/precision-recall.pdf -------------------------------------------------------------------------------- /results/01_KSDD2/ksdd2_z_new_SC_01_grandient_open(对比实验_C_Net)/run_params.txt: -------------------------------------------------------------------------------- 1 | AREA_RATIO_MIN:0.9 2 | BATCH_SIZE:12 3 | CONTINUE_LAST_TRAIN:None 4 | DATASET:KSDD2 5 | DATASET_PATH:./datasets/KSDD2/ 6 | DELTA_CLS_LOSS:1.0 7 | DILATE:15 8 | DROPOUT_P:0.4 9 | DYN_BALANCED_LOSS:False 10 | EPOCHS:100 11 | FOLD:None 12 | FREQUENCY_SAMPLING:True 13 | GPU:0 14 | GRADIENT_ADJUSTMENT:True 15 | INPUT_CHANNELS:3 16 | INPUT_HEIGHT:640 17 | INPUT_WIDTH:232 18 | LEARNING_RATE:0.06 19 | LOSS_SEG_THR:0.02 20 | MEMORY_FIT:12 21 | MODEL_NAME:models_z_new_SC_01 22 | NUM_SEGMENTED:0 23 | ON_DEMAND_READ:False 24 | OPTIMIZER:SGD 25 | REPRODUCIBLE_RUN:True 26 | RESULTS_PATH:./results_new 27 | 
SAMPLING:half_mixed 28 | SAVE_IMAGES:True 29 | TRAIN_NUM:-1 30 | TRANS_BRIGHT:1.0 31 | TRANS_KEEP_LOOP:1 32 | TRANS_NUM:4 33 | USE_BEST_MODEL:False 34 | VALIDATE:True 35 | VALIDATE_ON_TEST:True 36 | VALIDATION_N_EPOCHS:3 37 | VOLUME_CFG:None 38 | WEIGHTED_SEG_LOSS:False 39 | WEIGHTED_SEG_LOSS_MAX:3.0 40 | WEIGHTED_SEG_LOSS_P:2.0 41 | -------------------------------------------------------------------------------- /results/01_KSDD2/ksdd2_z_new_SC_02(对比实验New_S_Net)/ROC.pdf: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/Fishdrink/SC-Net/51647e4cd1d665ce074f7a3286420523afd36487/results/01_KSDD2/ksdd2_z_new_SC_02(对比实验New_S_Net)/ROC.pdf -------------------------------------------------------------------------------- /results/01_KSDD2/ksdd2_z_new_SC_02(对比实验New_S_Net)/loss_val.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/Fishdrink/SC-Net/51647e4cd1d665ce074f7a3286420523afd36487/results/01_KSDD2/ksdd2_z_new_SC_02(对比实验New_S_Net)/loss_val.png -------------------------------------------------------------------------------- /results/01_KSDD2/ksdd2_z_new_SC_02(对比实验New_S_Net)/losses.csv: -------------------------------------------------------------------------------- 1 | validation_data,loss_dec,loss,epoch 2 | 0.6634052778647198,0.510997584195641,1.1744028620603608,0 3 | 0.5081648652146502,0.37438539324737174,0.882550258462022,1 4 | 0.4016264018004503,0.3402641454363257,0.741890547236776,2 5 | 0.32483976788637114,0.25849832739771866,0.5833380952840899,3 6 | 0.27448415223175915,0.23881168045648715,0.5132958326882463,4 7 | 0.2334736130101894,0.2393198152625464,0.4727934282727358,5 8 | 0.20132788894622305,0.18556736760992346,0.38689525655614654,6 9 | 0.17810700773223628,0.17091367799576704,0.3490206857280033,7 10 | 0.1558395822842916,0.08676609151610513,0.24260567380039672,8 11 | 0.1433686841794146,0.14855261655842386,0.29192130073783845,9 12 | 0.13007377172873272,0.15223943436048865,0.2823132060892214,10 13 | 0.11855583922649787,0.09291719739151195,0.2114730366180098,11 14 | 0.10843966234990252,0.046410966255679365,0.15485062860558188,12 15 | -------------------------------------------------------------------------------- /results/01_KSDD2/ksdd2_z_new_SC_02(对比实验New_S_Net)/precision-recall.pdf: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/Fishdrink/SC-Net/51647e4cd1d665ce074f7a3286420523afd36487/results/01_KSDD2/ksdd2_z_new_SC_02(对比实验New_S_Net)/precision-recall.pdf -------------------------------------------------------------------------------- /results/01_KSDD2/ksdd2_z_new_SC_02(对比实验New_S_Net)/run_params.txt: -------------------------------------------------------------------------------- 1 | AREA_RATIO_MIN:0.9 2 | BATCH_SIZE:12 3 | CONTINUE_LAST_TRAIN:None 4 | DATASET:KSDD2 5 | DATASET_PATH:./datasets/KSDD2/ 6 | DELTA_CLS_LOSS:1.0 7 | DILATE:15 8 | DROPOUT_P:0.0 9 | DYN_BALANCED_LOSS:False 10 | EPOCHS:13 11 | FOLD:None 12 | FREQUENCY_SAMPLING:True 13 | GPU:0 14 | GRADIENT_ADJUSTMENT:False 15 | INPUT_CHANNELS:3 16 | INPUT_HEIGHT:640 17 | INPUT_WIDTH:232 18 | LEARNING_RATE:0.06 19 | LOSS_SEG_THR:0.02 20 | MEMORY_FIT:12 21 | MODEL_NAME:models_z_new_SC_02 22 | NUM_SEGMENTED:0 23 | ON_DEMAND_READ:False 24 | OPTIMIZER:SGD 25 | REPRODUCIBLE_RUN:True 26 | RESULTS_PATH:./results_new 27 | SAMPLING:half_mixed 28 | SAVE_IMAGES:True 29 | TRAIN_NUM:-1 30 | TRANS_BRIGHT:1.0 31 | TRANS_KEEP_LOOP:1 32 | 
TRANS_NUM:4 33 | USE_BEST_MODEL:False 34 | VALIDATE:True 35 | VALIDATE_ON_TEST:True 36 | VALIDATION_N_EPOCHS:3 37 | VOLUME_CFG:None 38 | WEIGHTED_SEG_LOSS:False 39 | WEIGHTED_SEG_LOSS_MAX:3.0 40 | WEIGHTED_SEG_LOSS_P:2.0 41 | -------------------------------------------------------------------------------- /results/01_KSDD2/ksdd2_z_new_SC_02(对比实验New_S_Net)/training_log.txt: -------------------------------------------------------------------------------- 1 | 2022-09-18 23:16:43 ksdd2_z_new_SC_02 Executing run with path ./results_new\KSDD2\ksdd2_z_new_SC_02 2 | 2022-09-18 23:16:43 ksdd2_z_new_SC_02 AREA_RATIO_MIN : 0.9 3 | 2022-09-18 23:16:43 ksdd2_z_new_SC_02 BATCH_SIZE : 12 4 | 2022-09-18 23:16:43 ksdd2_z_new_SC_02 CONTINUE_LAST_TRAIN : None 5 | 2022-09-18 23:16:43 ksdd2_z_new_SC_02 DATASET : KSDD2 6 | 2022-09-18 23:16:43 ksdd2_z_new_SC_02 DATASET_PATH : ./datasets/KSDD2/ 7 | 2022-09-18 23:16:43 ksdd2_z_new_SC_02 DELTA_CLS_LOSS : 1.0 8 | 2022-09-18 23:16:43 ksdd2_z_new_SC_02 DILATE : 15 9 | 2022-09-18 23:16:43 ksdd2_z_new_SC_02 DROPOUT_P : 0.0 10 | 2022-09-18 23:16:43 ksdd2_z_new_SC_02 DYN_BALANCED_LOSS : False 11 | 2022-09-18 23:16:43 ksdd2_z_new_SC_02 EPOCHS : 13 12 | 2022-09-18 23:16:43 ksdd2_z_new_SC_02 FOLD : None 13 | 2022-09-18 23:16:43 ksdd2_z_new_SC_02 FREQUENCY_SAMPLING : True 14 | 2022-09-18 23:16:43 ksdd2_z_new_SC_02 GPU : 0 15 | 2022-09-18 23:16:43 ksdd2_z_new_SC_02 GRADIENT_ADJUSTMENT : False 16 | 2022-09-18 23:16:43 ksdd2_z_new_SC_02 INPUT_CHANNELS : 3 17 | 2022-09-18 23:16:43 ksdd2_z_new_SC_02 INPUT_HEIGHT : 640 18 | 2022-09-18 23:16:43 ksdd2_z_new_SC_02 INPUT_WIDTH : 232 19 | 2022-09-18 23:16:43 ksdd2_z_new_SC_02 LEARNING_RATE : 0.06 20 | 2022-09-18 23:16:43 ksdd2_z_new_SC_02 LOSS_SEG_THR : 0.02 21 | 2022-09-18 23:16:43 ksdd2_z_new_SC_02 MEMORY_FIT : 12 22 | 2022-09-18 23:16:43 ksdd2_z_new_SC_02 MODEL_NAME : models_z_new_SC_02 23 | 2022-09-18 23:16:43 ksdd2_z_new_SC_02 NUM_SEGMENTED : 0 24 | 2022-09-18 23:16:43 ksdd2_z_new_SC_02 ON_DEMAND_READ : False 25 | 2022-09-18 23:16:43 ksdd2_z_new_SC_02 OPTIMIZER : SGD 26 | 2022-09-18 23:16:43 ksdd2_z_new_SC_02 REPRODUCIBLE_RUN : True 27 | 2022-09-18 23:16:43 ksdd2_z_new_SC_02 RESULTS_PATH : ./results_new 28 | 2022-09-18 23:16:43 ksdd2_z_new_SC_02 SAMPLING : half_mixed 29 | 2022-09-18 23:16:43 ksdd2_z_new_SC_02 SAVE_IMAGES : True 30 | 2022-09-18 23:16:43 ksdd2_z_new_SC_02 TRAIN_NUM : -1 31 | 2022-09-18 23:16:43 ksdd2_z_new_SC_02 TRANS_BRIGHT : 1.0 32 | 2022-09-18 23:16:43 ksdd2_z_new_SC_02 TRANS_KEEP_LOOP : 1 33 | 2022-09-18 23:16:43 ksdd2_z_new_SC_02 TRANS_NUM : 4 34 | 2022-09-18 23:16:43 ksdd2_z_new_SC_02 USE_BEST_MODEL : False 35 | 2022-09-18 23:16:43 ksdd2_z_new_SC_02 VALIDATE : True 36 | 2022-09-18 23:16:43 ksdd2_z_new_SC_02 VALIDATE_ON_TEST : True 37 | 2022-09-18 23:16:43 ksdd2_z_new_SC_02 VALIDATION_N_EPOCHS : 3 38 | 2022-09-18 23:16:43 ksdd2_z_new_SC_02 VOLUME_CFG : None 39 | 2022-09-18 23:16:43 ksdd2_z_new_SC_02 WEIGHTED_SEG_LOSS : False 40 | 2022-09-18 23:16:43 ksdd2_z_new_SC_02 WEIGHTED_SEG_LOSS_MAX : 3.0 41 | 2022-09-18 23:16:43 ksdd2_z_new_SC_02 WEIGHTED_SEG_LOSS_P : 2.0 42 | 2022-09-18 23:16:43 ksdd2_z_new_SC_02 Reproducible run, fixing all seeds to:1337 43 | This is models_z_new_SC_02 44 | 2022-09-18 23:17:31 ksdd2_z_new_SC_02 Returning seg_loss_weight 1 and dec_loss_weight 1.0 45 | 2022-09-18 23:17:31 ksdd2_z_new_SC_02 Returning dec_gradient_multiplier 1 46 | 2022-09-18 23:17:42 ksdd2_z_new_SC_02 Epoch 1/13 ==> avg_loss_seg=0.66341, avg_loss_seg_pos=0.09999, avg_loss_dec=0.51100, avg_loss=1.17440, FP=56, FN=62, 
correct=374/492, in 10.97s/epoch 47 | 2022-09-18 23:17:50 ksdd2_z_new_SC_02 VALIDATION || AUC=0.943258, and AP=0.806242, with best thr=0.781618 at f-measure=0.737 and FP=10, FN=40, TOTAL SAMPLES=1004, avg_loss_seg_neg=0.57590, avg_loss_seg_pos=0.50415 48 | 2022-09-18 23:17:50 ksdd2_z_new_SC_02 Saving current models state to ./results_new\KSDD2\ksdd2_z_new_SC_02\models\ep_01.pth 49 | 2022-09-18 23:17:50 ksdd2_z_new_SC_02 Saving current models state to ./results_new\KSDD2\ksdd2_z_new_SC_02\models\best_state_dict.pth 50 | 2022-09-18 23:17:50 ksdd2_z_new_SC_02 Returning seg_loss_weight 1 and dec_loss_weight 1.0 51 | 2022-09-18 23:17:50 ksdd2_z_new_SC_02 Returning dec_gradient_multiplier 1 52 | 2022-09-18 23:17:59 ksdd2_z_new_SC_02 Epoch 2/13 ==> avg_loss_seg=0.50816, avg_loss_seg_pos=0.07616, avg_loss_dec=0.37439, avg_loss=0.88255, FP=24, FN=53, correct=415/492, in 9.49s/epoch 53 | 2022-09-18 23:17:59 ksdd2_z_new_SC_02 Returning seg_loss_weight 1 and dec_loss_weight 1.0 54 | 2022-09-18 23:17:59 ksdd2_z_new_SC_02 Returning dec_gradient_multiplier 1 55 | 2022-09-18 23:18:09 ksdd2_z_new_SC_02 Epoch 3/13 ==> avg_loss_seg=0.40163, avg_loss_seg_pos=0.06164, avg_loss_dec=0.34026, avg_loss=0.74189, FP=25, FN=49, correct=418/492, in 9.52s/epoch 56 | 2022-09-18 23:18:09 ksdd2_z_new_SC_02 Returning seg_loss_weight 1 and dec_loss_weight 1.0 57 | 2022-09-18 23:18:09 ksdd2_z_new_SC_02 Returning dec_gradient_multiplier 1 58 | 2022-09-18 23:18:19 ksdd2_z_new_SC_02 Epoch 4/13 ==> avg_loss_seg=0.32484, avg_loss_seg_pos=0.05152, avg_loss_dec=0.25850, avg_loss=0.58334, FP=15, FN=42, correct=435/492, in 9.53s/epoch 59 | 2022-09-18 23:18:26 ksdd2_z_new_SC_02 VALIDATION || AUC=0.963687, and AP=0.846973, with best thr=0.829498 at f-measure=0.775 and FP=15, FN=31, TOTAL SAMPLES=1004, avg_loss_seg_neg=0.29429, avg_loss_seg_pos=0.28513 60 | 2022-09-18 23:18:26 ksdd2_z_new_SC_02 Saving current models state to ./results_new\KSDD2\ksdd2_z_new_SC_02\models\ep_04.pth 61 | 2022-09-18 23:18:26 ksdd2_z_new_SC_02 Saving current models state to ./results_new\KSDD2\ksdd2_z_new_SC_02\models\best_state_dict.pth 62 | 2022-09-18 23:18:26 ksdd2_z_new_SC_02 Returning seg_loss_weight 1 and dec_loss_weight 1.0 63 | 2022-09-18 23:18:26 ksdd2_z_new_SC_02 Returning dec_gradient_multiplier 1 64 | 2022-09-18 23:18:36 ksdd2_z_new_SC_02 Epoch 5/13 ==> avg_loss_seg=0.27448, avg_loss_seg_pos=0.04591, avg_loss_dec=0.23881, avg_loss=0.51330, FP=25, FN=31, correct=436/492, in 9.56s/epoch 65 | 2022-09-18 23:18:36 ksdd2_z_new_SC_02 Returning seg_loss_weight 1 and dec_loss_weight 1.0 66 | 2022-09-18 23:18:36 ksdd2_z_new_SC_02 Returning dec_gradient_multiplier 1 67 | 2022-09-18 23:18:46 ksdd2_z_new_SC_02 Epoch 6/13 ==> avg_loss_seg=0.23347, avg_loss_seg_pos=0.03934, avg_loss_dec=0.23932, avg_loss=0.47279, FP=12, FN=36, correct=444/492, in 9.57s/epoch 68 | 2022-09-18 23:18:46 ksdd2_z_new_SC_02 Returning seg_loss_weight 1 and dec_loss_weight 1.0 69 | 2022-09-18 23:18:46 ksdd2_z_new_SC_02 Returning dec_gradient_multiplier 1 70 | 2022-09-18 23:18:56 ksdd2_z_new_SC_02 Epoch 7/13 ==> avg_loss_seg=0.20133, avg_loss_seg_pos=0.03581, avg_loss_dec=0.18557, avg_loss=0.38690, FP=16, FN=23, correct=453/492, in 9.63s/epoch 71 | 2022-09-18 23:19:03 ksdd2_z_new_SC_02 VALIDATION || AUC=0.960504, and AP=0.888462, with best thr=0.394276 at f-measure=0.845 and FP=13, FN=20, TOTAL SAMPLES=1004, avg_loss_seg_neg=0.18700, avg_loss_seg_pos=0.21252 72 | 2022-09-18 23:19:03 ksdd2_z_new_SC_02 Saving current models state to ./results_new\KSDD2\ksdd2_z_new_SC_02\models\ep_07.pth 73 
| 2022-09-18 23:19:03 ksdd2_z_new_SC_02 Saving current models state to ./results_new\KSDD2\ksdd2_z_new_SC_02\models\best_state_dict.pth 74 | 2022-09-18 23:19:03 ksdd2_z_new_SC_02 Returning seg_loss_weight 1 and dec_loss_weight 1.0 75 | 2022-09-18 23:19:03 ksdd2_z_new_SC_02 Returning dec_gradient_multiplier 1 76 | 2022-09-18 23:19:13 ksdd2_z_new_SC_02 Epoch 8/13 ==> avg_loss_seg=0.17811, avg_loss_seg_pos=0.03314, avg_loss_dec=0.17091, avg_loss=0.34902, FP=11, FN=22, correct=459/492, in 9.57s/epoch 77 | 2022-09-18 23:19:13 ksdd2_z_new_SC_02 Returning seg_loss_weight 1 and dec_loss_weight 1.0 78 | 2022-09-18 23:19:13 ksdd2_z_new_SC_02 Returning dec_gradient_multiplier 1 79 | 2022-09-18 23:19:23 ksdd2_z_new_SC_02 Epoch 9/13 ==> avg_loss_seg=0.15584, avg_loss_seg_pos=0.03093, avg_loss_dec=0.08677, avg_loss=0.24261, FP=3, FN=10, correct=479/492, in 9.62s/epoch 80 | 2022-09-18 23:19:23 ksdd2_z_new_SC_02 Returning seg_loss_weight 1 and dec_loss_weight 1.0 81 | 2022-09-18 23:19:23 ksdd2_z_new_SC_02 Returning dec_gradient_multiplier 1 82 | 2022-09-18 23:19:33 ksdd2_z_new_SC_02 Epoch 10/13 ==> avg_loss_seg=0.14337, avg_loss_seg_pos=0.03014, avg_loss_dec=0.14855, avg_loss=0.29192, FP=9, FN=16, correct=467/492, in 9.61s/epoch 83 | 2022-09-18 23:19:40 ksdd2_z_new_SC_02 VALIDATION || AUC=0.966087, and AP=0.896264, with best thr=0.834959 at f-measure=0.836 and FP=7, FN=26, TOTAL SAMPLES=1004, avg_loss_seg_neg=0.13485, avg_loss_seg_pos=0.17866 84 | 2022-09-18 23:19:40 ksdd2_z_new_SC_02 Saving current models state to ./results_new\KSDD2\ksdd2_z_new_SC_02\models\ep_10.pth 85 | 2022-09-18 23:19:40 ksdd2_z_new_SC_02 Saving current models state to ./results_new\KSDD2\ksdd2_z_new_SC_02\models\best_state_dict.pth 86 | 2022-09-18 23:19:40 ksdd2_z_new_SC_02 Returning seg_loss_weight 1 and dec_loss_weight 1.0 87 | 2022-09-18 23:19:40 ksdd2_z_new_SC_02 Returning dec_gradient_multiplier 1 88 | 2022-09-18 23:19:50 ksdd2_z_new_SC_02 Epoch 11/13 ==> avg_loss_seg=0.13007, avg_loss_seg_pos=0.02861, avg_loss_dec=0.15224, avg_loss=0.28231, FP=11, FN=15, correct=466/492, in 9.59s/epoch 89 | 2022-09-18 23:19:50 ksdd2_z_new_SC_02 Returning seg_loss_weight 1 and dec_loss_weight 1.0 90 | 2022-09-18 23:19:50 ksdd2_z_new_SC_02 Returning dec_gradient_multiplier 1 91 | 2022-09-18 23:20:00 ksdd2_z_new_SC_02 Epoch 12/13 ==> avg_loss_seg=0.11856, avg_loss_seg_pos=0.02711, avg_loss_dec=0.09292, avg_loss=0.21147, FP=6, FN=10, correct=476/492, in 9.60s/epoch 92 | 2022-09-18 23:20:00 ksdd2_z_new_SC_02 Returning seg_loss_weight 1 and dec_loss_weight 1.0 93 | 2022-09-18 23:20:00 ksdd2_z_new_SC_02 Returning dec_gradient_multiplier 1 94 | 2022-09-18 23:20:10 ksdd2_z_new_SC_02 Epoch 13/13 ==> avg_loss_seg=0.10844, avg_loss_seg_pos=0.02552, avg_loss_dec=0.04641, avg_loss=0.15485, FP=1, FN=4, correct=487/492, in 9.60s/epoch 95 | 2022-09-18 23:20:17 ksdd2_z_new_SC_02 VALIDATION || AUC=0.977934, and AP=0.922398, with best thr=0.547365 at f-measure=0.854 and FP=4, FN=25, TOTAL SAMPLES=1004, avg_loss_seg_neg=0.10133, avg_loss_seg_pos=0.16043 96 | 2022-09-18 23:20:17 ksdd2_z_new_SC_02 Saving current models state to ./results_new\KSDD2\ksdd2_z_new_SC_02\models\ep_13.pth 97 | 2022-09-18 23:20:18 ksdd2_z_new_SC_02 Saving current models state to ./results_new\KSDD2\ksdd2_z_new_SC_02\models\best_state_dict.pth 98 | now best_AP=0.9223983970728719,epoch=13 99 | 2022-09-18 23:20:18 ksdd2_z_new_SC_02 Saving current models state to ./results_new\KSDD2\ksdd2_z_new_SC_02\models\final_state_dict.pth 100 | 2022-09-18 23:20:18 ksdd2_z_new_SC_02 Keeping same model 
state 101 | ksdd2_z_new_SC_02 EVAL AUC=0.977934, and AP=0.922398, w/ best thr=0.547365 at f-m=0.854 and FP=4, FN=25 102 | run_time 11.8min 103 | -------------------------------------------------------------------------------- /results/01_KSDD2/ksdd2_z_new_SC_03(对比实验SC_Net)/ROC.pdf: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/Fishdrink/SC-Net/51647e4cd1d665ce074f7a3286420523afd36487/results/01_KSDD2/ksdd2_z_new_SC_03(对比实验SC_Net)/ROC.pdf -------------------------------------------------------------------------------- /results/01_KSDD2/ksdd2_z_new_SC_03(对比实验SC_Net)/loss_val.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/Fishdrink/SC-Net/51647e4cd1d665ce074f7a3286420523afd36487/results/01_KSDD2/ksdd2_z_new_SC_03(对比实验SC_Net)/loss_val.png -------------------------------------------------------------------------------- /results/01_KSDD2/ksdd2_z_new_SC_03(对比实验SC_Net)/losses.csv: -------------------------------------------------------------------------------- 1 | validation_data,loss_dec,loss,epoch 2 | 0.6422829560148038,0.7653828331125461,1.4076657891273499,0 3 | 0.48314084948562996,0.5507439183510416,1.0338847678366716,1 4 | 0.3792846338535712,0.5702304229503725,0.9495150568039437,2 5 | 0.30829046460671156,0.42722576759695036,0.7355162322036619,3 6 | 0.25662283538802855,0.42944796419725184,0.6860707995852804,4 7 | 0.21863631940469508,0.3564938664194045,0.5751301858240996,5 8 | 0.19006200486082372,0.3971103337722096,0.5871723386330333,6 9 | 0.16708354857878957,0.36172774228138654,0.5288112908601761,7 10 | 0.14889566975880444,0.36213406387383373,0.5110297336326382,8 11 | 0.13432445637578885,0.34005261824383,0.47437707461961887,9 12 | 0.12181386739257874,0.29074645575469105,0.41256032314726976,10 13 | 0.11161846528208352,0.2991830876203087,0.41080155290239223,11 14 | 0.1029395003144334,0.33302829808335965,0.43596779839779304,12 15 | 0.09577068371501396,0.30177611227684875,0.3975467959918627,13 16 | 0.08878751014306294,0.2609249156543879,0.3497124257974509,14 17 | 0.08345183512059653,0.2694080273916082,0.3528598625122047,15 18 | 0.07840061211973672,0.2320244547615691,0.31042506688130583,16 19 | 0.07362782385775714,0.25689696441820964,0.33052478827596676,17 20 | 0.06954664682469718,0.17548227285951135,0.24502891968420853,18 21 | 0.06592028606228711,0.16120988418295132,0.22713017024523843,19 22 | 0.06261789362605025,0.25026734318675065,0.3128852368128009,20 23 | 0.05954229904384148,0.20711142079132358,0.26665371983516506,21 24 | 0.05700324088092742,0.13749484745104137,0.1944980883319688,22 25 | 0.054666520376515586,0.2758453544683573,0.3305118748448729,23 26 | 0.0524633622508708,0.1374319630667446,0.1898953253176154,24 27 | 0.0503414918979009,0.2604857504488976,0.3108272423467985,25 28 | 0.04842229163259026,0.18268571427561403,0.2311080059082043,26 29 | 0.04669277624386113,0.26523631426498173,0.3119290905088429,27 30 | 0.04508107621979907,0.12845515839876684,0.1735362346185659,28 31 | 0.0434936537006037,0.31915581523160624,0.36264946893220995,29 32 | 0.04210503696183848,0.13071744964738202,0.1728224866092205,30 33 | 0.040629312335475676,0.13324300387525947,0.17387231621073515,31 34 | 0.03930466343474582,0.11005329081743229,0.1493579542521781,32 35 | 0.038109827090085036,0.1008665926633327,0.13897641975341773,33 36 | 0.03691767507452306,0.11597851041431834,0.1528961854888414,34 37 | 0.035832611283635706,0.07662931711571973,0.11246192839935543,35 38 | 
0.03480773667494456,0.09590591293403773,0.1307136496089823,36 39 | 0.03382948277200141,0.08929762291358556,0.12312710568558696,37 40 | 0.03289596191266688,0.10326868575066328,0.13616464766333017,38 41 | 0.032051209027205055,0.1359191727938085,0.16797038182101354,39 42 | 0.03119520497758214,0.24998581336765754,0.2811810183452397,40 43 | 0.030528542653816503,0.09214223972786732,0.12267078238168383,41 44 | 0.029795283587967476,0.15381671585203185,0.18361199943999934,42 45 | 0.029041329111025586,0.0705019323821596,0.09954326149318518,43 46 | 0.028327580874528344,0.08486536389777088,0.11319294477229923,44 47 | 0.02765417850114466,0.19366599642860938,0.22132017492975403,45 48 | 0.027152562105074163,0.10158164478172131,0.1287342068867955,46 49 | 0.0265299363470659,0.05998487157626002,0.08651480792332591,47 50 | 0.026005017442431878,0.11720504713191734,0.14321006457434923,48 51 | 0.025443212954494043,0.05584522716065005,0.08128844011514409,49 52 | 0.024962921452716114,0.13765316671593403,0.16261608816865014,50 53 | 0.024403296471611272,0.05217640830309895,0.07657970477471022,51 54 | 0.023942165864192372,0.04708590337294873,0.07102806923714111,52 55 | 0.02345812296479698,0.06487433886064625,0.08833246182544323,53 56 | 0.023044922729817834,0.09839768421919487,0.12144260694901271,54 57 | 0.022699274122714996,0.1948960778432164,0.21759535196593138,55 58 | 0.022289075502535192,0.07170043507121443,0.09398951057374963,56 59 | 0.02185077613931361,0.04604201755539431,0.06789279369470792,57 60 | 0.02149662048351474,0.09637528853263796,0.1178719090161527,58 61 | 0.02110801061721352,0.05376518130060134,0.07487319191781486,59 62 | 0.02076607548851308,0.1133751395400765,0.13414121502858958,60 63 | 0.020453274825481864,0.0931253064408656,0.11357858126634746,61 64 | 0.020102614854894032,0.0447166844922292,0.06481929934712323,62 65 | 0.019777647757191,0.027894379574866072,0.04767202733205707,63 66 | 0.019433542995191202,0.04119928052073451,0.06063282351592571,64 67 | 0.019086078415072063,0.04123665821905119,0.060322736634123256,65 68 | 0.01879192746388234,0.06340145560102613,0.08219338306490846,66 69 | 0.018577338658212646,0.03937198751704479,0.05794932617525744,67 70 | 0.01831308985507585,0.020769624031959026,0.03908271388703488,68 71 | 0.018019235897354963,0.03094375601257917,0.04896299190993413,69 72 | 0.01776150283895857,0.033878568003363,0.05164007084232157,70 73 | 0.01749347268444736,0.025485361935722272,0.04297883462016963,71 74 | 0.017248865762134878,0.03292962273855398,0.05017848850068886,72 75 | 0.016986190757858073,0.05571168827033049,0.07269787902818856,73 76 | 0.016788870277928143,0.07784905184128481,0.09463792211921296,74 77 | 0.016537108284428836,0.050314312100380176,0.06685142038480901,75 78 | 0.016301032064891443,0.026840191413989155,0.043141223478880594,76 79 | 0.016088611105593238,0.03625079846706211,0.05233940957265535,77 80 | 0.01586860345631111,0.1292480542721999,0.145116657728511,78 81 | 0.01569474191685033,0.06206644429423945,0.07776118621108978,79 82 | 0.015456382821245892,0.05660133923752218,0.07205772205876808,80 83 | 0.015250834537957742,0.042729539898928345,0.057980374436886084,81 84 | 0.015071230781514471,0.04034979278373161,0.05542102356524608,82 85 | 0.014890451680838577,0.04901746658956617,0.06390791827040475,83 86 | 0.014721510310967764,0.024206649867923764,0.03892816017889153,84 87 | 0.014562684770037488,0.05846762475349403,0.07303030952353151,85 88 | 0.014364837509829824,0.014616394337887565,0.02898123184771739,86 89 | 
0.014199227034076443,0.06783391263835677,0.08203313967243321,87 90 | 0.014044479292824985,0.027020714177591044,0.041065193470416025,88 91 | 0.013886487823191697,0.02538313091225256,0.03926961873544425,89 92 | 0.013724295770734305,0.015716265077793563,0.02944056084852787,90 93 | -------------------------------------------------------------------------------- /results/01_KSDD2/ksdd2_z_new_SC_03(对比实验SC_Net)/precision-recall.pdf: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/Fishdrink/SC-Net/51647e4cd1d665ce074f7a3286420523afd36487/results/01_KSDD2/ksdd2_z_new_SC_03(对比实验SC_Net)/precision-recall.pdf -------------------------------------------------------------------------------- /results/01_KSDD2/ksdd2_z_new_SC_03(对比实验SC_Net)/run_params.txt: -------------------------------------------------------------------------------- 1 | AREA_RATIO_MIN:0.9 2 | BATCH_SIZE:12 3 | CONTINUE_LAST_TRAIN:None 4 | DATASET:KSDD2 5 | DATASET_PATH:./datasets/KSDD2/ 6 | DELTA_CLS_LOSS:1.0 7 | DILATE:15 8 | DROPOUT_P:0.4 9 | DYN_BALANCED_LOSS:False 10 | EPOCHS:91 11 | FOLD:None 12 | FREQUENCY_SAMPLING:True 13 | GPU:0 14 | GRADIENT_ADJUSTMENT:False 15 | INPUT_CHANNELS:3 16 | INPUT_HEIGHT:640 17 | INPUT_WIDTH:232 18 | LEARNING_RATE:0.06 19 | LOSS_SEG_THR:0.02 20 | MEMORY_FIT:12 21 | MODEL_NAME:models_z_new_SC_03 22 | NUM_SEGMENTED:0 23 | ON_DEMAND_READ:False 24 | OPTIMIZER:SGD 25 | REPRODUCIBLE_RUN:True 26 | RESULTS_PATH:./results_new 27 | SAMPLING:half_mixed 28 | SAVE_IMAGES:True 29 | TRAIN_NUM:-1 30 | TRANS_BRIGHT:1.0 31 | TRANS_KEEP_LOOP:1 32 | TRANS_NUM:4 33 | USE_BEST_MODEL:False 34 | VALIDATE:True 35 | VALIDATE_ON_TEST:True 36 | VALIDATION_N_EPOCHS:3 37 | VOLUME_CFG:None 38 | WEIGHTED_SEG_LOSS:False 39 | WEIGHTED_SEG_LOSS_MAX:3.0 40 | WEIGHTED_SEG_LOSS_P:2.0 41 | -------------------------------------------------------------------------------- /results/02_DAGM/dagm_z_SC_03_class4/FOLD_4/ROC.pdf: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/Fishdrink/SC-Net/51647e4cd1d665ce074f7a3286420523afd36487/results/02_DAGM/dagm_z_SC_03_class4/FOLD_4/ROC.pdf -------------------------------------------------------------------------------- /results/02_DAGM/dagm_z_SC_03_class4/FOLD_4/loss_val.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/Fishdrink/SC-Net/51647e4cd1d665ce074f7a3286420523afd36487/results/02_DAGM/dagm_z_SC_03_class4/FOLD_4/loss_val.png -------------------------------------------------------------------------------- /results/02_DAGM/dagm_z_SC_03_class4/FOLD_4/losses.csv: -------------------------------------------------------------------------------- 1 | validation_data,loss_dec,loss,epoch 2 | 0.7035769432783127,1.1892578691244124,1.8928348124027252,0 3 | 0.630057829618454,0.7122308611869812,1.3422886908054352,1 4 | 0.5725852012634277,0.7036916941404343,1.276276895403862,2 5 | 0.5240444868803025,0.730686554312706,1.2547310411930086,3 6 | 0.4818494886159897,0.7302584975957871,1.2121079862117767,4 7 | 0.44410407841205596,0.6948590248823165,1.1389631032943726,5 8 | 0.41027798056602477,0.719087952375412,1.1293659329414367,6 9 | 0.3802764669060707,0.6921477258205414,1.072424192726612,7 10 | 0.355304579436779,0.6858810514211655,1.0411856308579446,8 11 | 0.333024850487709,0.7122981786727905,1.0453230291604996,9 12 | 0.31232941150665283,0.6734563022851944,0.9857857137918472,10 13 | 
0.2936579316854477,0.6703038036823272,0.963961735367775,11 14 | 0.27612723112106324,0.6420904949307442,0.9182177260518074,12 15 | 0.26064055114984513,0.6622613027691842,0.9229018539190292,13 16 | 0.24672590047121049,0.6216069594025612,0.8683328598737716,14 17 | 0.23331597074866295,0.6316298604011535,0.8649458311498165,15 18 | 0.2214138500392437,0.5473032340407371,0.7687170840799809,16 19 | 0.2127397373318672,0.5824621006846428,0.79520183801651,17 20 | 0.20172909423708915,0.568429148197174,0.7701582424342632,18 21 | 0.19355609342455865,0.5505090102553367,0.7440651036798953,19 22 | 0.18300239816308023,0.5530695229768753,0.7360719211399556,20 23 | 0.17450727000832558,0.44878921955823897,0.6232964895665646,21 24 | 0.16842004805803298,0.5168478392064572,0.6852678872644902,22 25 | 0.16034601107239724,0.3858899913728237,0.546236002445221,23 26 | 0.15196873992681503,0.37833785116672514,0.5303065910935402,24 27 | 0.14526605978608131,0.37724385038018227,0.5225099101662636,25 28 | 0.14188894256949425,0.37116139382123947,0.5130503363907337,26 29 | 0.13563731536269188,0.3180449493229389,0.4536822646856308,27 30 | 0.12880677357316017,0.21633436419069768,0.34514113776385785,28 31 | 0.12766031809151174,0.4355671674013138,0.5632274854928255,29 32 | 0.11923417150974273,0.25276250913739207,0.3719966806471348,30 33 | 0.1143695130944252,0.1910702720284462,0.3054397851228714,31 34 | 0.10940971709787846,0.22886126572266222,0.3382709828205407,32 35 | 0.10550793260335922,0.24462428148835896,0.3501322140917182,33 36 | 0.10290272124111652,0.16770811565220356,0.2706108368933201,34 37 | 0.10022638216614724,0.21670147976838053,0.31692786193452777,35 38 | 0.09532218649983407,0.14249452406074853,0.2378167105605826,36 39 | 0.09307675585150718,0.17741223312914373,0.2704889889806509,37 40 | 0.091157291457057,0.11031362153589726,0.20147091299295428,38 41 | 0.08813399933278561,0.29025973740499467,0.37839373673778026,39 42 | 0.10664608739316464,0.8469473153352738,0.9535934027284384,40 43 | 0.10330339036881923,0.7632613033056259,0.8665646936744451,41 44 | 0.09853668548166752,0.6962395042181015,0.794776189699769,42 45 | 0.0938396867364645,0.5367519415915012,0.6305916283279657,43 46 | 0.09167959317564964,0.3891793988645077,0.48085899204015736,44 47 | 0.08295831270515919,0.2999633714556694,0.3829216841608286,45 48 | 0.0816695123910904,0.3710447397083044,0.4527142520993948,46 49 | 0.07679404690861702,0.2229162845760584,0.2997103314846754,47 50 | 0.07488783784210681,0.16813164930790664,0.24301948715001345,48 51 | 0.07208401747047902,0.13293541818857194,0.20501943565905095,49 52 | 0.07547882869839669,0.3449997631832957,0.4204785918816924,50 53 | 0.06934981010854244,0.14836872527375816,0.2177185353823006,51 54 | 0.06799790039658546,0.18931557657197118,0.25731347696855666,52 55 | 0.06685736700892449,0.07345637427642941,0.1403137412853539,53 56 | 0.064408341050148,0.2083350616041571,0.2727434026543051,54 57 | 0.06340057235211134,0.11327432366088033,0.1766748960129917,55 58 | 0.06237173210829496,0.1167810252867639,0.17915275739505887,56 59 | 0.06052133049815893,0.06849074247293174,0.12901207297109069,57 60 | 0.059409772790968415,0.07909780194750056,0.13850757473846897,58 61 | 0.05873186439275742,0.07383500847499817,0.1325668728677556,59 62 | 0.05762695521116257,0.09294519077520817,0.15057214598637075,60 63 | 0.05632381364703178,0.08452801627572626,0.14085182992275805,61 64 | 0.05666108652949333,0.1260805486701429,0.18274163519963624,62 65 | 0.05470929685980082,0.09871220847126097,0.1534215053310618,63 66 | 
0.053831365145742896,0.08853534017689527,0.14236670532263818,64 67 | 0.053218649327754976,0.15037448469083756,0.20359313401859253,65 68 | 0.05302081592381001,0.16513800285756589,0.21815881878137589,66 69 | -------------------------------------------------------------------------------- /results/02_DAGM/dagm_z_SC_03_class4/FOLD_4/precision-recall.pdf: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/Fishdrink/SC-Net/51647e4cd1d665ce074f7a3286420523afd36487/results/02_DAGM/dagm_z_SC_03_class4/FOLD_4/precision-recall.pdf -------------------------------------------------------------------------------- /results/02_DAGM/dagm_z_SC_03_class4/FOLD_4/run_params.txt: -------------------------------------------------------------------------------- 1 | AREA_RATIO_MIN:0.9 2 | BATCH_SIZE:8 3 | CONTINUE_LAST_TRAIN:None 4 | DATASET:DAGM 5 | DATASET_PATH:./datasets/DAGM/ 6 | DELTA_CLS_LOSS:1.0 7 | DILATE:0 8 | DROPOUT_P:0.4 9 | DYN_BALANCED_LOSS:False 10 | EPOCHS:67 11 | FOLD:4 12 | FREQUENCY_SAMPLING:True 13 | GPU:0 14 | GRADIENT_ADJUSTMENT:False 15 | INPUT_CHANNELS:1 16 | INPUT_HEIGHT:512 17 | INPUT_WIDTH:512 18 | LEARNING_RATE:0.04 19 | LOSS_SEG_THR:0.02 20 | MEMORY_FIT:8 21 | MODEL_NAME:models_trans10 22 | NUM_SEGMENTED:0 23 | ON_DEMAND_READ:False 24 | OPTIMIZER:SGD 25 | REPRODUCIBLE_RUN:True 26 | RESULTS_PATH:./results_new 27 | SAMPLING:half_mixed 28 | SAVE_IMAGES:True 29 | TRAIN_NUM:None 30 | TRANS_BRIGHT:1.0 31 | TRANS_KEEP_LOOP:1 32 | TRANS_NUM:4 33 | USE_BEST_MODEL:False 34 | VALIDATE:True 35 | VALIDATE_ON_TEST:True 36 | VALIDATION_N_EPOCHS:3 37 | VOLUME_CFG:None 38 | WEIGHTED_SEG_LOSS:False 39 | WEIGHTED_SEG_LOSS_MAX:10.0 40 | WEIGHTED_SEG_LOSS_P:1.0 41 | -------------------------------------------------------------------------------- /results/02_DAGM/dagm_z_SC_03_except_class4/FOLD_1/ROC.pdf: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/Fishdrink/SC-Net/51647e4cd1d665ce074f7a3286420523afd36487/results/02_DAGM/dagm_z_SC_03_except_class4/FOLD_1/ROC.pdf -------------------------------------------------------------------------------- /results/02_DAGM/dagm_z_SC_03_except_class4/FOLD_1/loss_val.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/Fishdrink/SC-Net/51647e4cd1d665ce074f7a3286420523afd36487/results/02_DAGM/dagm_z_SC_03_except_class4/FOLD_1/loss_val.png -------------------------------------------------------------------------------- /results/02_DAGM/dagm_z_SC_03_except_class4/FOLD_1/losses.csv: -------------------------------------------------------------------------------- 1 | validation_data,loss_dec,loss,epoch 2 | 0.7091114897476999,0.6079762327043634,1.3170877224520634,0 3 | 0.6473650179411236,0.1542571392890654,0.8016221572301889,1 4 | 0.592352851441032,0.07926015149017698,0.671613002931209,2 5 | 0.5445504627729717,0.0563426351566848,0.6008930979296565,3 6 | 0.5002683008971968,0.038945466966221205,0.539213767863418,4 7 | 0.46234344024407237,0.021262227557599545,0.4836056678016719,5 8 | 0.4280734940579063,0.01661460697773452,0.4446881010356408,6 9 | 0.4024986954111802,0.15415517142728755,0.5566538668384677,7 10 | 0.37331281837664154,0.1870183483697474,0.5603311667463889,8 11 | 0.347637215727254,0.04422560195119953,0.3918628176784535,9 12 | 0.3236434114606757,0.006648592056559497,0.3302920035172352,10 13 | 0.30399097423804433,0.007961236449007533,0.31195221068705187,11 14 | 
0.2863670179718419,0.0042852445468796714,0.29065226251872156,12 15 | 0.2704002198420073,0.005368193991384224,0.2757684138333915,13 16 | 0.25580948355950806,0.0014073516481163863,0.25721683520762445,14 17 | 0.2427230838098024,0.00460846883269321,0.2473315526424956,15 18 | 0.23065876176482752,0.0017158346753077286,0.23237459644013525,16 19 | 0.21958199613972715,0.0028158085432981976,0.22239780468302534,17 20 | 0.2093683222406789,0.0016878497308904403,0.21105617197156934,18 21 | 0.2001507431268692,0.0014825986842257216,0.20163334181109493,19 22 | 0.19159667350743947,0.0007213350082134926,0.19231800851565298,20 23 | 0.18357217860849281,0.0023965373199500485,0.18596871592844286,21 24 | 0.1764311178734428,0.003212342421695786,0.17964346029513859,22 25 | 0.17014383563869878,0.012354719446432827,0.1824985550851316,23 26 | 0.16373007313201302,0.004263486630454855,0.16799355976246788,24 27 | -------------------------------------------------------------------------------- /results/02_DAGM/dagm_z_SC_03_except_class4/FOLD_1/precision-recall.pdf: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/Fishdrink/SC-Net/51647e4cd1d665ce074f7a3286420523afd36487/results/02_DAGM/dagm_z_SC_03_except_class4/FOLD_1/precision-recall.pdf -------------------------------------------------------------------------------- /results/02_DAGM/dagm_z_SC_03_except_class4/FOLD_1/run_params.txt: -------------------------------------------------------------------------------- 1 | AREA_RATIO_MIN:0.9 2 | BATCH_SIZE:8 3 | CONTINUE_LAST_TRAIN:None 4 | DATASET:DAGM 5 | DATASET_PATH:./datasets/DAGM/ 6 | DELTA_CLS_LOSS:1.0 7 | DILATE:0 8 | DROPOUT_P:0.2 9 | DYN_BALANCED_LOSS:False 10 | EPOCHS:25 11 | FOLD:1 12 | FREQUENCY_SAMPLING:True 13 | GPU:0 14 | GRADIENT_ADJUSTMENT:False 15 | INPUT_CHANNELS:1 16 | INPUT_HEIGHT:512 17 | INPUT_WIDTH:512 18 | LEARNING_RATE:0.04 19 | LOSS_SEG_THR:0.02 20 | MEMORY_FIT:8 21 | MODEL_NAME:models_trans10 22 | NUM_SEGMENTED:0 23 | ON_DEMAND_READ:False 24 | OPTIMIZER:SGD 25 | REPRODUCIBLE_RUN:True 26 | RESULTS_PATH:./results_new 27 | SAMPLING:half_mixed 28 | SAVE_IMAGES:True 29 | TRAIN_NUM:None 30 | TRANS_BRIGHT:1.0 31 | TRANS_KEEP_LOOP:1 32 | TRANS_NUM:4 33 | USE_BEST_MODEL:False 34 | VALIDATE:True 35 | VALIDATE_ON_TEST:True 36 | VALIDATION_N_EPOCHS:3 37 | VOLUME_CFG:None 38 | WEIGHTED_SEG_LOSS:False 39 | WEIGHTED_SEG_LOSS_MAX:10.0 40 | WEIGHTED_SEG_LOSS_P:1.0 41 | -------------------------------------------------------------------------------- /results/02_DAGM/dagm_z_SC_03_except_class4/FOLD_1/training_log.txt: -------------------------------------------------------------------------------- 1 | 2022-09-30 00:22:17 dagm_z_SC_03_except_class4_FOLD_1 Executing run with path ./results_new\DAGM\dagm_z_SC_03_except_class4\FOLD_1 2 | 2022-09-30 00:22:17 dagm_z_SC_03_except_class4_FOLD_1 AREA_RATIO_MIN : 0.9 3 | 2022-09-30 00:22:17 dagm_z_SC_03_except_class4_FOLD_1 BATCH_SIZE : 8 4 | 2022-09-30 00:22:17 dagm_z_SC_03_except_class4_FOLD_1 CONTINUE_LAST_TRAIN : None 5 | 2022-09-30 00:22:17 dagm_z_SC_03_except_class4_FOLD_1 DATASET : DAGM 6 | 2022-09-30 00:22:17 dagm_z_SC_03_except_class4_FOLD_1 DATASET_PATH : ./datasets/DAGM/ 7 | 2022-09-30 00:22:17 dagm_z_SC_03_except_class4_FOLD_1 DELTA_CLS_LOSS : 1.0 8 | 2022-09-30 00:22:17 dagm_z_SC_03_except_class4_FOLD_1 DILATE : 0 9 | 2022-09-30 00:22:17 dagm_z_SC_03_except_class4_FOLD_1 DROPOUT_P : 0.2 10 | 2022-09-30 00:22:17 dagm_z_SC_03_except_class4_FOLD_1 DYN_BALANCED_LOSS : False 11 | 2022-09-30 00:22:17 
dagm_z_SC_03_except_class4_FOLD_1 EPOCHS : 25 12 | 2022-09-30 00:22:17 dagm_z_SC_03_except_class4_FOLD_1 FOLD : 1 13 | 2022-09-30 00:22:17 dagm_z_SC_03_except_class4_FOLD_1 FREQUENCY_SAMPLING : True 14 | 2022-09-30 00:22:17 dagm_z_SC_03_except_class4_FOLD_1 GPU : 0 15 | 2022-09-30 00:22:17 dagm_z_SC_03_except_class4_FOLD_1 GRADIENT_ADJUSTMENT : False 16 | 2022-09-30 00:22:17 dagm_z_SC_03_except_class4_FOLD_1 INPUT_CHANNELS : 1 17 | 2022-09-30 00:22:17 dagm_z_SC_03_except_class4_FOLD_1 INPUT_HEIGHT : 512 18 | 2022-09-30 00:22:17 dagm_z_SC_03_except_class4_FOLD_1 INPUT_WIDTH : 512 19 | 2022-09-30 00:22:17 dagm_z_SC_03_except_class4_FOLD_1 LEARNING_RATE : 0.04 20 | 2022-09-30 00:22:17 dagm_z_SC_03_except_class4_FOLD_1 LOSS_SEG_THR : 0.02 21 | 2022-09-30 00:22:17 dagm_z_SC_03_except_class4_FOLD_1 MEMORY_FIT : 8 22 | 2022-09-30 00:22:17 dagm_z_SC_03_except_class4_FOLD_1 MODEL_NAME : models_trans10 23 | 2022-09-30 00:22:17 dagm_z_SC_03_except_class4_FOLD_1 NUM_SEGMENTED : 0 24 | 2022-09-30 00:22:17 dagm_z_SC_03_except_class4_FOLD_1 ON_DEMAND_READ : False 25 | 2022-09-30 00:22:17 dagm_z_SC_03_except_class4_FOLD_1 OPTIMIZER : SGD 26 | 2022-09-30 00:22:17 dagm_z_SC_03_except_class4_FOLD_1 REPRODUCIBLE_RUN : True 27 | 2022-09-30 00:22:17 dagm_z_SC_03_except_class4_FOLD_1 RESULTS_PATH : ./results_new 28 | 2022-09-30 00:22:17 dagm_z_SC_03_except_class4_FOLD_1 SAMPLING : half_mixed 29 | 2022-09-30 00:22:17 dagm_z_SC_03_except_class4_FOLD_1 SAVE_IMAGES : True 30 | 2022-09-30 00:22:17 dagm_z_SC_03_except_class4_FOLD_1 TRAIN_NUM : None 31 | 2022-09-30 00:22:17 dagm_z_SC_03_except_class4_FOLD_1 TRANS_BRIGHT : 1.0 32 | 2022-09-30 00:22:17 dagm_z_SC_03_except_class4_FOLD_1 TRANS_KEEP_LOOP : 1 33 | 2022-09-30 00:22:17 dagm_z_SC_03_except_class4_FOLD_1 TRANS_NUM : 4 34 | 2022-09-30 00:22:17 dagm_z_SC_03_except_class4_FOLD_1 USE_BEST_MODEL : False 35 | 2022-09-30 00:22:17 dagm_z_SC_03_except_class4_FOLD_1 VALIDATE : True 36 | 2022-09-30 00:22:17 dagm_z_SC_03_except_class4_FOLD_1 VALIDATE_ON_TEST : True 37 | 2022-09-30 00:22:17 dagm_z_SC_03_except_class4_FOLD_1 VALIDATION_N_EPOCHS : 3 38 | 2022-09-30 00:22:17 dagm_z_SC_03_except_class4_FOLD_1 VOLUME_CFG : None 39 | 2022-09-30 00:22:17 dagm_z_SC_03_except_class4_FOLD_1 WEIGHTED_SEG_LOSS : False 40 | 2022-09-30 00:22:17 dagm_z_SC_03_except_class4_FOLD_1 WEIGHTED_SEG_LOSS_MAX : 10.0 41 | 2022-09-30 00:22:17 dagm_z_SC_03_except_class4_FOLD_1 WEIGHTED_SEG_LOSS_P : 1.0 42 | 2022-09-30 00:22:17 dagm_z_SC_03_except_class4_FOLD_1 Reproducible run, fixing all seeds to:1337 43 | This is models_trans10 44 | 2022-09-30 00:22:40 dagm_z_SC_03_except_class4_FOLD_1 Returning seg_loss_weight 1 and dec_loss_weight 1.0 45 | 2022-09-30 00:22:40 dagm_z_SC_03_except_class4_FOLD_1 Returning dec_gradient_multiplier 1 46 | 2022-09-30 00:22:48 dagm_z_SC_03_except_class4_FOLD_1 Epoch 1/25 ==> avg_loss_seg=0.70911, avg_loss_seg_pos=0.17313, avg_loss_dec=0.60798, avg_loss=1.31709, FP=28, FN=28, correct=96/152, in 7.68s/epoch 47 | 2022-09-30 00:22:54 dagm_z_SC_03_except_class4_FOLD_1 VALIDATION || AUC=1.000000, and AP=1.000000, with best thr=0.769863 at f-measure=1.000 and FP=0, FN=0, TOTAL SAMPLES=575, avg_loss_seg_neg=0.67547, avg_loss_seg_pos=0.63027 48 | 2022-09-30 00:22:54 dagm_z_SC_03_except_class4_FOLD_1 Saving current models state to ./results_new\DAGM\dagm_z_SC_03_except_class4\FOLD_1\models\ep_01.pth 49 | 2022-09-30 00:22:54 dagm_z_SC_03_except_class4_FOLD_1 Saving current models state to ./results_new\DAGM\dagm_z_SC_03_except_class4\FOLD_1\models\best_state_dict.pth 50 | 2022-09-30 
00:22:54 dagm_z_SC_03_except_class4_FOLD_1 Returning seg_loss_weight 1 and dec_loss_weight 1.0 51 | 2022-09-30 00:22:54 dagm_z_SC_03_except_class4_FOLD_1 Returning dec_gradient_multiplier 1 52 | 2022-09-30 00:22:59 dagm_z_SC_03_except_class4_FOLD_1 Epoch 2/25 ==> avg_loss_seg=0.64737, avg_loss_seg_pos=0.14983, avg_loss_dec=0.15426, avg_loss=0.80162, FP=0, FN=4, correct=148/152, in 5.16s/epoch 53 | 2022-09-30 00:22:59 dagm_z_SC_03_except_class4_FOLD_1 Returning seg_loss_weight 1 and dec_loss_weight 1.0 54 | 2022-09-30 00:22:59 dagm_z_SC_03_except_class4_FOLD_1 Returning dec_gradient_multiplier 1 55 | 2022-09-30 00:23:05 dagm_z_SC_03_except_class4_FOLD_1 Epoch 3/25 ==> avg_loss_seg=0.59235, avg_loss_seg_pos=0.13488, avg_loss_dec=0.07926, avg_loss=0.67161, FP=1, FN=3, correct=148/152, in 5.18s/epoch 56 | 2022-09-30 00:23:05 dagm_z_SC_03_except_class4_FOLD_1 Returning seg_loss_weight 1 and dec_loss_weight 1.0 57 | 2022-09-30 00:23:05 dagm_z_SC_03_except_class4_FOLD_1 Returning dec_gradient_multiplier 1 58 | 2022-09-30 00:23:10 dagm_z_SC_03_except_class4_FOLD_1 Epoch 4/25 ==> avg_loss_seg=0.54455, avg_loss_seg_pos=0.12316, avg_loss_dec=0.05634, avg_loss=0.60089, FP=0, FN=1, correct=151/152, in 5.23s/epoch 59 | 2022-09-30 00:23:16 dagm_z_SC_03_except_class4_FOLD_1 VALIDATION || AUC=1.000000, and AP=1.000000, with best thr=0.997295 at f-measure=1.000 and FP=0, FN=0, TOTAL SAMPLES=575, avg_loss_seg_neg=0.51970, avg_loss_seg_pos=0.47232 60 | 2022-09-30 00:23:16 dagm_z_SC_03_except_class4_FOLD_1 Saving current models state to ./results_new\DAGM\dagm_z_SC_03_except_class4\FOLD_1\models\ep_04.pth 61 | 2022-09-30 00:23:16 dagm_z_SC_03_except_class4_FOLD_1 Returning seg_loss_weight 1 and dec_loss_weight 1.0 62 | 2022-09-30 00:23:16 dagm_z_SC_03_except_class4_FOLD_1 Returning dec_gradient_multiplier 1 63 | 2022-09-30 00:23:21 dagm_z_SC_03_except_class4_FOLD_1 Epoch 5/25 ==> avg_loss_seg=0.50027, avg_loss_seg_pos=0.11429, avg_loss_dec=0.03895, avg_loss=0.53921, FP=0, FN=1, correct=151/152, in 5.26s/epoch 64 | 2022-09-30 00:23:21 dagm_z_SC_03_except_class4_FOLD_1 Returning seg_loss_weight 1 and dec_loss_weight 1.0 65 | 2022-09-30 00:23:21 dagm_z_SC_03_except_class4_FOLD_1 Returning dec_gradient_multiplier 1 66 | 2022-09-30 00:23:27 dagm_z_SC_03_except_class4_FOLD_1 Epoch 6/25 ==> avg_loss_seg=0.46234, avg_loss_seg_pos=0.10580, avg_loss_dec=0.02126, avg_loss=0.48361, FP=0, FN=1, correct=151/152, in 5.20s/epoch 67 | 2022-09-30 00:23:27 dagm_z_SC_03_except_class4_FOLD_1 Returning seg_loss_weight 1 and dec_loss_weight 1.0 68 | 2022-09-30 00:23:27 dagm_z_SC_03_except_class4_FOLD_1 Returning dec_gradient_multiplier 1 69 | 2022-09-30 00:23:32 dagm_z_SC_03_except_class4_FOLD_1 Epoch 7/25 ==> avg_loss_seg=0.42807, avg_loss_seg_pos=0.09839, avg_loss_dec=0.01661, avg_loss=0.44469, FP=0, FN=0, correct=152/152, in 5.21s/epoch 70 | 2022-09-30 00:23:38 dagm_z_SC_03_except_class4_FOLD_1 VALIDATION || AUC=1.000000, and AP=1.000000, with best thr=0.999806 at f-measure=1.000 and FP=0, FN=0, TOTAL SAMPLES=575, avg_loss_seg_neg=0.41208, avg_loss_seg_pos=0.37726 71 | 2022-09-30 00:23:38 dagm_z_SC_03_except_class4_FOLD_1 Saving current models state to ./results_new\DAGM\dagm_z_SC_03_except_class4\FOLD_1\models\ep_07.pth 72 | 2022-09-30 00:23:38 dagm_z_SC_03_except_class4_FOLD_1 Saving current models state to ./results_new\DAGM\dagm_z_SC_03_except_class4\FOLD_1\models\best_state_dict.pth 73 | 2022-09-30 00:23:38 dagm_z_SC_03_except_class4_FOLD_1 Returning seg_loss_weight 1 and dec_loss_weight 1.0 74 | 2022-09-30 00:23:38 
dagm_z_SC_03_except_class4_FOLD_1 Returning dec_gradient_multiplier 1 75 | 2022-09-30 00:23:44 dagm_z_SC_03_except_class4_FOLD_1 Epoch 8/25 ==> avg_loss_seg=0.40250, avg_loss_seg_pos=0.09429, avg_loss_dec=0.15416, avg_loss=0.55665, FP=3, FN=4, correct=145/152, in 5.22s/epoch 76 | 2022-09-30 00:23:44 dagm_z_SC_03_except_class4_FOLD_1 Returning seg_loss_weight 1 and dec_loss_weight 1.0 77 | 2022-09-30 00:23:44 dagm_z_SC_03_except_class4_FOLD_1 Returning dec_gradient_multiplier 1 78 | 2022-09-30 00:23:49 dagm_z_SC_03_except_class4_FOLD_1 Epoch 9/25 ==> avg_loss_seg=0.37331, avg_loss_seg_pos=0.08775, avg_loss_dec=0.18702, avg_loss=0.56033, FP=5, FN=4, correct=143/152, in 5.22s/epoch 79 | 2022-09-30 00:23:49 dagm_z_SC_03_except_class4_FOLD_1 Returning seg_loss_weight 1 and dec_loss_weight 1.0 80 | 2022-09-30 00:23:49 dagm_z_SC_03_except_class4_FOLD_1 Returning dec_gradient_multiplier 1 81 | 2022-09-30 00:23:54 dagm_z_SC_03_except_class4_FOLD_1 Epoch 10/25 ==> avg_loss_seg=0.34764, avg_loss_seg_pos=0.08225, avg_loss_dec=0.04423, avg_loss=0.39186, FP=1, FN=3, correct=148/152, in 5.23s/epoch 82 | 2022-09-30 00:24:00 dagm_z_SC_03_except_class4_FOLD_1 VALIDATION || AUC=1.000000, and AP=1.000000, with best thr=0.999962 at f-measure=1.000 and FP=0, FN=0, TOTAL SAMPLES=575, avg_loss_seg_neg=0.33398, avg_loss_seg_pos=0.31384 83 | 2022-09-30 00:24:00 dagm_z_SC_03_except_class4_FOLD_1 Saving current models state to ./results_new\DAGM\dagm_z_SC_03_except_class4\FOLD_1\models\ep_10.pth 84 | 2022-09-30 00:24:00 dagm_z_SC_03_except_class4_FOLD_1 Returning seg_loss_weight 1 and dec_loss_weight 1.0 85 | 2022-09-30 00:24:00 dagm_z_SC_03_except_class4_FOLD_1 Returning dec_gradient_multiplier 1 86 | 2022-09-30 00:24:06 dagm_z_SC_03_except_class4_FOLD_1 Epoch 11/25 ==> avg_loss_seg=0.32364, avg_loss_seg_pos=0.07645, avg_loss_dec=0.00665, avg_loss=0.33029, FP=0, FN=0, correct=152/152, in 5.23s/epoch 87 | 2022-09-30 00:24:06 dagm_z_SC_03_except_class4_FOLD_1 Returning seg_loss_weight 1 and dec_loss_weight 1.0 88 | 2022-09-30 00:24:06 dagm_z_SC_03_except_class4_FOLD_1 Returning dec_gradient_multiplier 1 89 | 2022-09-30 00:24:11 dagm_z_SC_03_except_class4_FOLD_1 Epoch 12/25 ==> avg_loss_seg=0.30399, avg_loss_seg_pos=0.07319, avg_loss_dec=0.00796, avg_loss=0.31195, FP=0, FN=0, correct=152/152, in 5.24s/epoch 90 | 2022-09-30 00:24:11 dagm_z_SC_03_except_class4_FOLD_1 Returning seg_loss_weight 1 and dec_loss_weight 1.0 91 | 2022-09-30 00:24:11 dagm_z_SC_03_except_class4_FOLD_1 Returning dec_gradient_multiplier 1 92 | 2022-09-30 00:24:17 dagm_z_SC_03_except_class4_FOLD_1 Epoch 13/25 ==> avg_loss_seg=0.28637, avg_loss_seg_pos=0.06962, avg_loss_dec=0.00429, avg_loss=0.29065, FP=0, FN=0, correct=152/152, in 5.31s/epoch 93 | 2022-09-30 00:24:23 dagm_z_SC_03_except_class4_FOLD_1 VALIDATION || AUC=1.000000, and AP=1.000000, with best thr=0.999920 at f-measure=1.000 and FP=0, FN=0, TOTAL SAMPLES=575, avg_loss_seg_neg=0.27771, avg_loss_seg_pos=0.27264 94 | 2022-09-30 00:24:23 dagm_z_SC_03_except_class4_FOLD_1 Saving current models state to ./results_new\DAGM\dagm_z_SC_03_except_class4\FOLD_1\models\ep_13.pth 95 | 2022-09-30 00:24:23 dagm_z_SC_03_except_class4_FOLD_1 Returning seg_loss_weight 1 and dec_loss_weight 1.0 96 | 2022-09-30 00:24:23 dagm_z_SC_03_except_class4_FOLD_1 Returning dec_gradient_multiplier 1 97 | 2022-09-30 00:24:28 dagm_z_SC_03_except_class4_FOLD_1 Epoch 14/25 ==> avg_loss_seg=0.27040, avg_loss_seg_pos=0.06646, avg_loss_dec=0.00537, avg_loss=0.27577, FP=0, FN=0, correct=152/152, in 5.23s/epoch 98 | 2022-09-30 
00:24:28 dagm_z_SC_03_except_class4_FOLD_1 Returning seg_loss_weight 1 and dec_loss_weight 1.0 99 | 2022-09-30 00:24:28 dagm_z_SC_03_except_class4_FOLD_1 Returning dec_gradient_multiplier 1 100 | 2022-09-30 00:24:34 dagm_z_SC_03_except_class4_FOLD_1 Epoch 15/25 ==> avg_loss_seg=0.25581, avg_loss_seg_pos=0.06390, avg_loss_dec=0.00141, avg_loss=0.25722, FP=0, FN=0, correct=152/152, in 5.24s/epoch 101 | 2022-09-30 00:24:34 dagm_z_SC_03_except_class4_FOLD_1 Returning seg_loss_weight 1 and dec_loss_weight 1.0 102 | 2022-09-30 00:24:34 dagm_z_SC_03_except_class4_FOLD_1 Returning dec_gradient_multiplier 1 103 | 2022-09-30 00:24:39 dagm_z_SC_03_except_class4_FOLD_1 Epoch 16/25 ==> avg_loss_seg=0.24272, avg_loss_seg_pos=0.06111, avg_loss_dec=0.00461, avg_loss=0.24733, FP=0, FN=0, correct=152/152, in 5.23s/epoch 104 | 2022-09-30 00:24:45 dagm_z_SC_03_except_class4_FOLD_1 VALIDATION || AUC=1.000000, and AP=1.000000, with best thr=0.999953 at f-measure=1.000 and FP=0, FN=0, TOTAL SAMPLES=575, avg_loss_seg_neg=0.23610, avg_loss_seg_pos=0.24194 105 | 2022-09-30 00:24:45 dagm_z_SC_03_except_class4_FOLD_1 Saving current models state to ./results_new\DAGM\dagm_z_SC_03_except_class4\FOLD_1\models\ep_16.pth 106 | 2022-09-30 00:24:45 dagm_z_SC_03_except_class4_FOLD_1 Returning seg_loss_weight 1 and dec_loss_weight 1.0 107 | 2022-09-30 00:24:45 dagm_z_SC_03_except_class4_FOLD_1 Returning dec_gradient_multiplier 1 108 | 2022-09-30 00:24:50 dagm_z_SC_03_except_class4_FOLD_1 Epoch 17/25 ==> avg_loss_seg=0.23066, avg_loss_seg_pos=0.05920, avg_loss_dec=0.00172, avg_loss=0.23237, FP=0, FN=0, correct=152/152, in 5.24s/epoch 109 | 2022-09-30 00:24:50 dagm_z_SC_03_except_class4_FOLD_1 Returning seg_loss_weight 1 and dec_loss_weight 1.0 110 | 2022-09-30 00:24:50 dagm_z_SC_03_except_class4_FOLD_1 Returning dec_gradient_multiplier 1 111 | 2022-09-30 00:24:56 dagm_z_SC_03_except_class4_FOLD_1 Epoch 18/25 ==> avg_loss_seg=0.21958, avg_loss_seg_pos=0.05737, avg_loss_dec=0.00282, avg_loss=0.22240, FP=0, FN=0, correct=152/152, in 5.24s/epoch 112 | 2022-09-30 00:24:56 dagm_z_SC_03_except_class4_FOLD_1 Returning seg_loss_weight 1 and dec_loss_weight 1.0 113 | 2022-09-30 00:24:56 dagm_z_SC_03_except_class4_FOLD_1 Returning dec_gradient_multiplier 1 114 | 2022-09-30 00:25:01 dagm_z_SC_03_except_class4_FOLD_1 Epoch 19/25 ==> avg_loss_seg=0.20937, avg_loss_seg_pos=0.05545, avg_loss_dec=0.00169, avg_loss=0.21106, FP=0, FN=0, correct=152/152, in 5.24s/epoch 115 | 2022-09-30 00:25:07 dagm_z_SC_03_except_class4_FOLD_1 VALIDATION || AUC=1.000000, and AP=1.000000, with best thr=0.999875 at f-measure=1.000 and FP=0, FN=0, TOTAL SAMPLES=575, avg_loss_seg_neg=0.20446, avg_loss_seg_pos=0.21914 116 | 2022-09-30 00:25:07 dagm_z_SC_03_except_class4_FOLD_1 Saving current models state to ./results_new\DAGM\dagm_z_SC_03_except_class4\FOLD_1\models\ep_19.pth 117 | 2022-09-30 00:25:07 dagm_z_SC_03_except_class4_FOLD_1 Returning seg_loss_weight 1 and dec_loss_weight 1.0 118 | 2022-09-30 00:25:07 dagm_z_SC_03_except_class4_FOLD_1 Returning dec_gradient_multiplier 1 119 | 2022-09-30 00:25:13 dagm_z_SC_03_except_class4_FOLD_1 Epoch 20/25 ==> avg_loss_seg=0.20015, avg_loss_seg_pos=0.05374, avg_loss_dec=0.00148, avg_loss=0.20163, FP=0, FN=0, correct=152/152, in 5.24s/epoch 120 | 2022-09-30 00:25:13 dagm_z_SC_03_except_class4_FOLD_1 Returning seg_loss_weight 1 and dec_loss_weight 1.0 121 | 2022-09-30 00:25:13 dagm_z_SC_03_except_class4_FOLD_1 Returning dec_gradient_multiplier 1 122 | 2022-09-30 00:25:18 dagm_z_SC_03_except_class4_FOLD_1 Epoch 21/25 ==> 
avg_loss_seg=0.19160, avg_loss_seg_pos=0.05196, avg_loss_dec=0.00072, avg_loss=0.19232, FP=0, FN=0, correct=152/152, in 5.24s/epoch 123 | 2022-09-30 00:25:18 dagm_z_SC_03_except_class4_FOLD_1 Returning seg_loss_weight 1 and dec_loss_weight 1.0 124 | 2022-09-30 00:25:18 dagm_z_SC_03_except_class4_FOLD_1 Returning dec_gradient_multiplier 1 125 | 2022-09-30 00:25:24 dagm_z_SC_03_except_class4_FOLD_1 Epoch 22/25 ==> avg_loss_seg=0.18357, avg_loss_seg_pos=0.05083, avg_loss_dec=0.00240, avg_loss=0.18597, FP=0, FN=0, correct=152/152, in 5.24s/epoch 126 | 2022-09-30 00:25:29 dagm_z_SC_03_except_class4_FOLD_1 VALIDATION || AUC=1.000000, and AP=1.000000, with best thr=0.999804 at f-measure=1.000 and FP=0, FN=0, TOTAL SAMPLES=575, avg_loss_seg_neg=0.17969, avg_loss_seg_pos=0.20187 127 | 2022-09-30 00:25:29 dagm_z_SC_03_except_class4_FOLD_1 Saving current models state to ./results_new\DAGM\dagm_z_SC_03_except_class4\FOLD_1\models\ep_22.pth 128 | 2022-09-30 00:25:29 dagm_z_SC_03_except_class4_FOLD_1 Returning seg_loss_weight 1 and dec_loss_weight 1.0 129 | 2022-09-30 00:25:29 dagm_z_SC_03_except_class4_FOLD_1 Returning dec_gradient_multiplier 1 130 | 2022-09-30 00:25:35 dagm_z_SC_03_except_class4_FOLD_1 Epoch 23/25 ==> avg_loss_seg=0.17643, avg_loss_seg_pos=0.04923, avg_loss_dec=0.00321, avg_loss=0.17964, FP=0, FN=0, correct=152/152, in 5.24s/epoch 131 | 2022-09-30 00:25:35 dagm_z_SC_03_except_class4_FOLD_1 Returning seg_loss_weight 1 and dec_loss_weight 1.0 132 | 2022-09-30 00:25:35 dagm_z_SC_03_except_class4_FOLD_1 Returning dec_gradient_multiplier 1 133 | 2022-09-30 00:25:40 dagm_z_SC_03_except_class4_FOLD_1 Epoch 24/25 ==> avg_loss_seg=0.17014, avg_loss_seg_pos=0.04724, avg_loss_dec=0.01235, avg_loss=0.18250, FP=0, FN=1, correct=151/152, in 5.23s/epoch 134 | 2022-09-30 00:25:40 dagm_z_SC_03_except_class4_FOLD_1 Returning seg_loss_weight 1 and dec_loss_weight 1.0 135 | 2022-09-30 00:25:40 dagm_z_SC_03_except_class4_FOLD_1 Returning dec_gradient_multiplier 1 136 | 2022-09-30 00:25:46 dagm_z_SC_03_except_class4_FOLD_1 Epoch 25/25 ==> avg_loss_seg=0.16373, avg_loss_seg_pos=0.04654, avg_loss_dec=0.00426, avg_loss=0.16799, FP=0, FN=0, correct=152/152, in 5.23s/epoch 137 | 2022-09-30 00:25:52 dagm_z_SC_03_except_class4_FOLD_1 VALIDATION || AUC=1.000000, and AP=1.000000, with best thr=0.999983 at f-measure=1.000 and FP=0, FN=0, TOTAL SAMPLES=575, avg_loss_seg_neg=0.16059, avg_loss_seg_pos=0.18814 138 | 2022-09-30 00:25:52 dagm_z_SC_03_except_class4_FOLD_1 Saving current models state to ./results_new\DAGM\dagm_z_SC_03_except_class4\FOLD_1\models\ep_25.pth 139 | now best_AP=1.0,epoch=7 140 | 2022-09-30 00:25:52 dagm_z_SC_03_except_class4_FOLD_1 Saving current models state to ./results_new\DAGM\dagm_z_SC_03_except_class4\FOLD_1\models\final_state_dict.pth 141 | 2022-09-30 00:25:52 dagm_z_SC_03_except_class4_FOLD_1 Keeping same model state 142 | dagm_z_SC_03_except_class4_FOLD_1 EVAL AUC=1.000000, and AP=1.000000, w/ best thr=0.999983 at f-m=1.000 and FP=0, FN=0 143 | run_time 8.0min 144 | -------------------------------------------------------------------------------- /results/02_DAGM/dagm_z_SC_03_except_class4/FOLD_10/ROC.pdf: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/Fishdrink/SC-Net/51647e4cd1d665ce074f7a3286420523afd36487/results/02_DAGM/dagm_z_SC_03_except_class4/FOLD_10/ROC.pdf -------------------------------------------------------------------------------- /results/02_DAGM/dagm_z_SC_03_except_class4/FOLD_10/loss_val.png: 
-------------------------------------------------------------------------------- https://raw.githubusercontent.com/Fishdrink/SC-Net/51647e4cd1d665ce074f7a3286420523afd36487/results/02_DAGM/dagm_z_SC_03_except_class4/FOLD_10/loss_val.png -------------------------------------------------------------------------------- /results/02_DAGM/dagm_z_SC_03_except_class4/FOLD_10/losses.csv: -------------------------------------------------------------------------------- 1 | validation_data,loss_dec,loss,epoch 2 | 0.6783757225887196,0.5940936864227861,1.2724694090115056,0 3 | 0.5669282803664336,0.17105111955488855,0.7379793999213222,1 4 | 0.4807767964698173,0.08101396495161729,0.5617907614214346,2 5 | 0.4124763221354098,0.019352699596925663,0.43182902173233545,3 6 | 0.3589315180842941,0.012980792347366947,0.371912310431661,4 7 | 0.3179283166253889,0.04223516587375638,0.36016348249914526,5 8 | 0.28510779625660665,0.02628550255699461,0.3113932988136013,6 9 | 0.254318169645361,0.0051677860760104815,0.2594859557213715,7 10 | 0.2303504420293344,0.013853551811071401,0.2442039938404058,8 11 | 0.21010127824706,0.0068638082420629624,0.21696508648912297,9 12 | 0.1922955436481012,0.015717354827486544,0.20801289847558774,10 13 | 0.17812342659847155,0.008237630672394173,0.18636105727086572,11 14 | 0.1645657428213068,0.0020997714710648398,0.1666655142923716,12 15 | 0.15312655431193276,0.0010650470271561848,0.15419160133908894,13 16 | 0.1437605736223427,0.015930423875672842,0.15969099749801555,14 17 | 0.1346031185742971,0.002180920173142678,0.13678403874743977,15 18 | 0.12814912884622007,0.08562599631693808,0.21377512516315816,16 19 | 0.12044944473215051,0.0030349682860400505,0.12348441301819056,17 20 | 0.11364075279719121,0.0010216896353577377,0.11466244243254894,18 21 | 0.10768980975892092,0.01679589083308914,0.12448570059201006,19 22 | 0.10259469095114115,0.021263030200734157,0.12385772115187531,20 23 | 0.09758557641022914,0.0051083914668934825,0.10269396787712262,21 24 | 0.0930446100396079,0.0011186244820065268,0.09416323452161443,22 25 | 0.08994267800369778,0.02615592604352797,0.11609860404722575,23 26 | 0.08581067239110535,0.0017045591890023756,0.08751523158010772,24 27 | -------------------------------------------------------------------------------- /results/02_DAGM/dagm_z_SC_03_except_class4/FOLD_10/precision-recall.pdf: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/Fishdrink/SC-Net/51647e4cd1d665ce074f7a3286420523afd36487/results/02_DAGM/dagm_z_SC_03_except_class4/FOLD_10/precision-recall.pdf -------------------------------------------------------------------------------- /results/02_DAGM/dagm_z_SC_03_except_class4/FOLD_10/run_params.txt: -------------------------------------------------------------------------------- 1 | AREA_RATIO_MIN:0.9 2 | BATCH_SIZE:8 3 | CONTINUE_LAST_TRAIN:None 4 | DATASET:DAGM 5 | DATASET_PATH:./datasets/DAGM/ 6 | DELTA_CLS_LOSS:1.0 7 | DILATE:0 8 | DROPOUT_P:0.2 9 | DYN_BALANCED_LOSS:False 10 | EPOCHS:25 11 | FOLD:10 12 | FREQUENCY_SAMPLING:True 13 | GPU:0 14 | GRADIENT_ADJUSTMENT:False 15 | INPUT_CHANNELS:1 16 | INPUT_HEIGHT:512 17 | INPUT_WIDTH:512 18 | LEARNING_RATE:0.04 19 | LOSS_SEG_THR:0.02 20 | MEMORY_FIT:8 21 | MODEL_NAME:models_trans10 22 | NUM_SEGMENTED:0 23 | ON_DEMAND_READ:False 24 | OPTIMIZER:SGD 25 | REPRODUCIBLE_RUN:True 26 | RESULTS_PATH:./results_new 27 | SAMPLING:half_mixed 28 | SAVE_IMAGES:True 29 | TRAIN_NUM:None 30 | TRANS_BRIGHT:1.0 31 | TRANS_KEEP_LOOP:1 32 | TRANS_NUM:4 33 | 
USE_BEST_MODEL:False 34 | VALIDATE:True 35 | VALIDATE_ON_TEST:True 36 | VALIDATION_N_EPOCHS:3 37 | VOLUME_CFG:None 38 | WEIGHTED_SEG_LOSS:False 39 | WEIGHTED_SEG_LOSS_MAX:10.0 40 | WEIGHTED_SEG_LOSS_P:1.0 41 | -------------------------------------------------------------------------------- /results/02_DAGM/dagm_z_SC_03_except_class4/FOLD_2/ROC.pdf: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/Fishdrink/SC-Net/51647e4cd1d665ce074f7a3286420523afd36487/results/02_DAGM/dagm_z_SC_03_except_class4/FOLD_2/ROC.pdf -------------------------------------------------------------------------------- /results/02_DAGM/dagm_z_SC_03_except_class4/FOLD_2/loss_val.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/Fishdrink/SC-Net/51647e4cd1d665ce074f7a3286420523afd36487/results/02_DAGM/dagm_z_SC_03_except_class4/FOLD_2/loss_val.png -------------------------------------------------------------------------------- /results/02_DAGM/dagm_z_SC_03_except_class4/FOLD_2/losses.csv: -------------------------------------------------------------------------------- 1 | validation_data,loss_dec,loss,epoch 2 | 0.7175667397677898,0.7129249237477779,1.4304916635155678,0 3 | 0.6645813509821892,0.6718929037451744,1.3364742547273636,1 4 | 0.6158261895179749,0.35614402079954743,0.9719702103175223,2 5 | 0.5731661655008793,0.13164918893016875,0.704815354431048,3 6 | 0.5329582616686821,0.0959888641955331,0.6289471258642152,4 7 | 0.49534452706575394,0.04284106282284483,0.5381855898885988,5 8 | 0.462944807484746,0.07653826611931436,0.5394830736040603,6 9 | 0.43317195773124695,0.02536679818877019,0.45853875592001714,7 10 | 0.40663851983845234,0.02218657646153588,0.4288250962999882,8 11 | 0.382412014529109,0.008838769514113665,0.39125078404322267,9 12 | 0.3605274371802807,0.019395387593249325,0.37992282477353,10 13 | 0.3411402367055416,0.004733528534416109,0.3458737652399577,11 14 | 0.32455606386065483,0.04477264169690898,0.3693287055575638,12 15 | 0.3077436275780201,0.012063428497640416,0.3198070560756605,13 16 | 0.2921931501477957,0.005468051218485925,0.2976612013662816,14 17 | 0.27782580628991127,0.003130487210000865,0.28095629349991214,15 18 | 0.264958880841732,0.005107693772515631,0.27006657461424766,16 19 | 0.2530263075605035,0.0018407190327707212,0.2548670265932742,17 20 | 0.24233126174658537,0.008028410738916136,0.2503596724855015,18 21 | 0.23203278798609972,0.015381919947685674,0.2474147079337854,19 22 | 0.22269933857023716,0.0068424855708144605,0.22954182414105162,20 23 | 0.213677859865129,0.00364916873331822,0.21732702859844721,21 24 | 0.2055952027440071,0.010990741073328536,0.21658594381733565,22 25 | 0.1978379748761654,0.003663822826638352,0.20150179770280374,23 26 | 0.19072203524410725,0.0020466889509407338,0.19276872419504798,24 27 | -------------------------------------------------------------------------------- /results/02_DAGM/dagm_z_SC_03_except_class4/FOLD_2/precision-recall.pdf: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/Fishdrink/SC-Net/51647e4cd1d665ce074f7a3286420523afd36487/results/02_DAGM/dagm_z_SC_03_except_class4/FOLD_2/precision-recall.pdf -------------------------------------------------------------------------------- /results/02_DAGM/dagm_z_SC_03_except_class4/FOLD_2/run_params.txt: -------------------------------------------------------------------------------- 1 | AREA_RATIO_MIN:0.9 2 | 
BATCH_SIZE:8 3 | CONTINUE_LAST_TRAIN:None 4 | DATASET:DAGM 5 | DATASET_PATH:./datasets/DAGM/ 6 | DELTA_CLS_LOSS:1.0 7 | DILATE:0 8 | DROPOUT_P:0.2 9 | DYN_BALANCED_LOSS:False 10 | EPOCHS:25 11 | FOLD:2 12 | FREQUENCY_SAMPLING:True 13 | GPU:0 14 | GRADIENT_ADJUSTMENT:False 15 | INPUT_CHANNELS:1 16 | INPUT_HEIGHT:512 17 | INPUT_WIDTH:512 18 | LEARNING_RATE:0.04 19 | LOSS_SEG_THR:0.02 20 | MEMORY_FIT:8 21 | MODEL_NAME:models_trans10 22 | NUM_SEGMENTED:0 23 | ON_DEMAND_READ:False 24 | OPTIMIZER:SGD 25 | REPRODUCIBLE_RUN:True 26 | RESULTS_PATH:./results_new 27 | SAMPLING:half_mixed 28 | SAVE_IMAGES:True 29 | TRAIN_NUM:None 30 | TRANS_BRIGHT:1.0 31 | TRANS_KEEP_LOOP:1 32 | TRANS_NUM:4 33 | USE_BEST_MODEL:False 34 | VALIDATE:True 35 | VALIDATE_ON_TEST:True 36 | VALIDATION_N_EPOCHS:3 37 | VOLUME_CFG:None 38 | WEIGHTED_SEG_LOSS:False 39 | WEIGHTED_SEG_LOSS_MAX:10.0 40 | WEIGHTED_SEG_LOSS_P:1.0 41 | -------------------------------------------------------------------------------- /results/02_DAGM/dagm_z_SC_03_except_class4/FOLD_3/ROC.pdf: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/Fishdrink/SC-Net/51647e4cd1d665ce074f7a3286420523afd36487/results/02_DAGM/dagm_z_SC_03_except_class4/FOLD_3/ROC.pdf -------------------------------------------------------------------------------- /results/02_DAGM/dagm_z_SC_03_except_class4/FOLD_3/loss_val.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/Fishdrink/SC-Net/51647e4cd1d665ce074f7a3286420523afd36487/results/02_DAGM/dagm_z_SC_03_except_class4/FOLD_3/loss_val.png -------------------------------------------------------------------------------- /results/02_DAGM/dagm_z_SC_03_except_class4/FOLD_3/losses.csv: -------------------------------------------------------------------------------- 1 | validation_data,loss_dec,loss,epoch 2 | 0.7126001007854939,0.7003240212798119,1.4129241220653057,0 3 | 0.6571685113012791,0.45514613203704357,1.1123146433383226,1 4 | 0.6068841181695461,0.1225988648366183,0.7294829830061644,2 5 | 0.5630993209779263,0.12958619964774698,0.6926855206256732,3 6 | 0.525486595928669,0.3939531891664956,0.9194397850951646,4 7 | 0.48916654475033283,0.227365902508609,0.7165324472589418,5 8 | 0.4588235355913639,0.10094440320972353,0.5597679388010874,6 9 | 0.4292065743356943,0.0158026691278792,0.4450092434635735,7 10 | 0.4050256498157978,0.0782655167276971,0.4832911665434949,8 11 | 0.38062725588679314,0.02522093918378232,0.40584819507057546,9 12 | 0.35861819609999657,0.007770907468511723,0.3663891035685083,10 13 | 0.3391517661511898,0.007049129177175928,0.34620089532836573,11 14 | 0.3214058596640825,0.004465384332434041,0.32587124399651657,12 15 | 0.3051272798329592,0.0062424617462966125,0.3113697415792558,13 16 | 0.2903824094682932,0.01525258048059186,0.30563498994888505,14 17 | 0.27664410695433617,0.0030932827030483168,0.2797373896573845,15 18 | 0.2638857625424862,0.006085003531552502,0.2699707660740387,16 19 | 0.2522041853517294,0.0031915093240968417,0.25539569467582623,17 20 | 0.24141072016209364,0.00200537899218034,0.24341609915427398,18 21 | 0.23139235097914934,0.001983213385756244,0.23337556436490559,19 22 | 0.22249016165733337,0.019652627754112473,0.24214278941144585,20 23 | 0.21387375704944134,0.02662334261549404,0.24049709966493538,21 24 | 0.20556414034217596,0.0012969326598977204,0.20686107300207368,22 25 | 0.1981368688866496,0.010977310972521082,0.2091141798591707,23 26 | 
0.1909741871058941,0.0024933825820880884,0.19346756968798218,24 27 | -------------------------------------------------------------------------------- /results/02_DAGM/dagm_z_SC_03_except_class4/FOLD_3/precision-recall.pdf: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/Fishdrink/SC-Net/51647e4cd1d665ce074f7a3286420523afd36487/results/02_DAGM/dagm_z_SC_03_except_class4/FOLD_3/precision-recall.pdf -------------------------------------------------------------------------------- /results/02_DAGM/dagm_z_SC_03_except_class4/FOLD_3/run_params.txt: -------------------------------------------------------------------------------- 1 | AREA_RATIO_MIN:0.9 2 | BATCH_SIZE:8 3 | CONTINUE_LAST_TRAIN:None 4 | DATASET:DAGM 5 | DATASET_PATH:./datasets/DAGM/ 6 | DELTA_CLS_LOSS:1.0 7 | DILATE:0 8 | DROPOUT_P:0.2 9 | DYN_BALANCED_LOSS:False 10 | EPOCHS:25 11 | FOLD:3 12 | FREQUENCY_SAMPLING:True 13 | GPU:0 14 | GRADIENT_ADJUSTMENT:False 15 | INPUT_CHANNELS:1 16 | INPUT_HEIGHT:512 17 | INPUT_WIDTH:512 18 | LEARNING_RATE:0.04 19 | LOSS_SEG_THR:0.02 20 | MEMORY_FIT:8 21 | MODEL_NAME:models_trans10 22 | NUM_SEGMENTED:0 23 | ON_DEMAND_READ:False 24 | OPTIMIZER:SGD 25 | REPRODUCIBLE_RUN:True 26 | RESULTS_PATH:./results_new 27 | SAMPLING:half_mixed 28 | SAVE_IMAGES:True 29 | TRAIN_NUM:None 30 | TRANS_BRIGHT:1.0 31 | TRANS_KEEP_LOOP:1 32 | TRANS_NUM:4 33 | USE_BEST_MODEL:False 34 | VALIDATE:True 35 | VALIDATE_ON_TEST:True 36 | VALIDATION_N_EPOCHS:3 37 | VOLUME_CFG:None 38 | WEIGHTED_SEG_LOSS:False 39 | WEIGHTED_SEG_LOSS_MAX:10.0 40 | WEIGHTED_SEG_LOSS_P:1.0 41 | -------------------------------------------------------------------------------- /results/02_DAGM/dagm_z_SC_03_except_class4/FOLD_5/ROC.pdf: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/Fishdrink/SC-Net/51647e4cd1d665ce074f7a3286420523afd36487/results/02_DAGM/dagm_z_SC_03_except_class4/FOLD_5/ROC.pdf -------------------------------------------------------------------------------- /results/02_DAGM/dagm_z_SC_03_except_class4/FOLD_5/loss_val.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/Fishdrink/SC-Net/51647e4cd1d665ce074f7a3286420523afd36487/results/02_DAGM/dagm_z_SC_03_except_class4/FOLD_5/loss_val.png -------------------------------------------------------------------------------- /results/02_DAGM/dagm_z_SC_03_except_class4/FOLD_5/losses.csv: -------------------------------------------------------------------------------- 1 | validation_data,loss_dec,loss,epoch 2 | 0.7141051993650549,0.5431154904996648,1.2572206898647198,0 3 | 0.6563190993140725,0.29299432170741696,0.9493134210214895,1 4 | 0.603133780114791,0.15287080294836095,0.756004583063152,2 5 | 0.5572474108022802,0.04020535860977629,0.5974527694120565,3 6 | 0.5168858994455898,0.03664233031518319,0.553528229760773,4 7 | 0.4806221145040849,0.05425383420387173,0.5348759487079566,5 8 | 0.4476502204642576,0.005643251411931808,0.4532934718761894,6 9 | 0.41824371323866005,0.02617288749281536,0.44441660073147543,7 10 | 0.39179688692092896,0.005531391526079353,0.3973282784470083,8 11 | 0.3680405634291032,0.011590429364279023,0.3796309927933822,9 12 | 0.34642654131440553,0.003409910454031299,0.3498364517684368,10 13 | 0.327125889413497,0.02436267615035724,0.35148856556385427,11 14 | 0.309740052503698,0.020145513617214474,0.32988556612091247,12 15 | 
0.2933594303972581,0.004815715317176107,0.2981751457144342,13 16 | 0.2784752915887272,0.001813463488360867,0.28028875507708806,14 17 | 0.2657458589357488,0.06520612957298427,0.3309519885087331,15 18 | 0.25432026999838214,0.01524074058568872,0.26956101058407084,16 19 | 0.24190822681959936,0.011532115585663739,0.2534403424052631,17 20 | 0.23106136392144597,0.005731264766141334,0.2367926286875873,18 21 | 0.2212752077509375,0.05848040849121068,0.2797556162421482,19 22 | 0.2122407985084197,0.016023549278650212,0.22826434778706992,20 23 | 0.20356117802507737,0.0020670811406008024,0.20562825916567817,21 24 | 0.19559628998532014,0.0009593165369177073,0.19655560652223786,22 25 | 0.18822145023766687,0.002014665981679333,0.1902361162193462,23 26 | 0.18168662137844982,0.0754546292232322,0.257141250601682,24 27 | -------------------------------------------------------------------------------- /results/02_DAGM/dagm_z_SC_03_except_class4/FOLD_5/precision-recall.pdf: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/Fishdrink/SC-Net/51647e4cd1d665ce074f7a3286420523afd36487/results/02_DAGM/dagm_z_SC_03_except_class4/FOLD_5/precision-recall.pdf -------------------------------------------------------------------------------- /results/02_DAGM/dagm_z_SC_03_except_class4/FOLD_5/run_params.txt: -------------------------------------------------------------------------------- 1 | AREA_RATIO_MIN:0.9 2 | BATCH_SIZE:8 3 | CONTINUE_LAST_TRAIN:None 4 | DATASET:DAGM 5 | DATASET_PATH:./datasets/DAGM/ 6 | DELTA_CLS_LOSS:1.0 7 | DILATE:0 8 | DROPOUT_P:0.2 9 | DYN_BALANCED_LOSS:False 10 | EPOCHS:25 11 | FOLD:5 12 | FREQUENCY_SAMPLING:True 13 | GPU:0 14 | GRADIENT_ADJUSTMENT:False 15 | INPUT_CHANNELS:1 16 | INPUT_HEIGHT:512 17 | INPUT_WIDTH:512 18 | LEARNING_RATE:0.04 19 | LOSS_SEG_THR:0.02 20 | MEMORY_FIT:8 21 | MODEL_NAME:models_trans10 22 | NUM_SEGMENTED:0 23 | ON_DEMAND_READ:False 24 | OPTIMIZER:SGD 25 | REPRODUCIBLE_RUN:True 26 | RESULTS_PATH:./results_new 27 | SAMPLING:half_mixed 28 | SAVE_IMAGES:True 29 | TRAIN_NUM:None 30 | TRANS_BRIGHT:1.0 31 | TRANS_KEEP_LOOP:1 32 | TRANS_NUM:4 33 | USE_BEST_MODEL:False 34 | VALIDATE:True 35 | VALIDATE_ON_TEST:True 36 | VALIDATION_N_EPOCHS:3 37 | VOLUME_CFG:None 38 | WEIGHTED_SEG_LOSS:False 39 | WEIGHTED_SEG_LOSS_MAX:10.0 40 | WEIGHTED_SEG_LOSS_P:1.0 41 | -------------------------------------------------------------------------------- /results/02_DAGM/dagm_z_SC_03_except_class4/FOLD_5/training_log.txt: -------------------------------------------------------------------------------- 1 | 2022-09-30 00:45:24 dagm_z_SC_03_except_class4_FOLD_5 Executing run with path ./results_new\DAGM\dagm_z_SC_03_except_class4\FOLD_5 2 | 2022-09-30 00:45:24 dagm_z_SC_03_except_class4_FOLD_5 AREA_RATIO_MIN : 0.9 3 | 2022-09-30 00:45:24 dagm_z_SC_03_except_class4_FOLD_5 BATCH_SIZE : 8 4 | 2022-09-30 00:45:24 dagm_z_SC_03_except_class4_FOLD_5 CONTINUE_LAST_TRAIN : None 5 | 2022-09-30 00:45:24 dagm_z_SC_03_except_class4_FOLD_5 DATASET : DAGM 6 | 2022-09-30 00:45:24 dagm_z_SC_03_except_class4_FOLD_5 DATASET_PATH : ./datasets/DAGM/ 7 | 2022-09-30 00:45:24 dagm_z_SC_03_except_class4_FOLD_5 DELTA_CLS_LOSS : 1.0 8 | 2022-09-30 00:45:24 dagm_z_SC_03_except_class4_FOLD_5 DILATE : 0 9 | 2022-09-30 00:45:24 dagm_z_SC_03_except_class4_FOLD_5 DROPOUT_P : 0.2 10 | 2022-09-30 00:45:24 dagm_z_SC_03_except_class4_FOLD_5 DYN_BALANCED_LOSS : False 11 | 2022-09-30 00:45:24 dagm_z_SC_03_except_class4_FOLD_5 EPOCHS : 25 12 | 2022-09-30 00:45:24 
dagm_z_SC_03_except_class4_FOLD_5 FOLD : 5 13 | 2022-09-30 00:45:24 dagm_z_SC_03_except_class4_FOLD_5 FREQUENCY_SAMPLING : True 14 | 2022-09-30 00:45:24 dagm_z_SC_03_except_class4_FOLD_5 GPU : 0 15 | 2022-09-30 00:45:24 dagm_z_SC_03_except_class4_FOLD_5 GRADIENT_ADJUSTMENT : False 16 | 2022-09-30 00:45:24 dagm_z_SC_03_except_class4_FOLD_5 INPUT_CHANNELS : 1 17 | 2022-09-30 00:45:24 dagm_z_SC_03_except_class4_FOLD_5 INPUT_HEIGHT : 512 18 | 2022-09-30 00:45:24 dagm_z_SC_03_except_class4_FOLD_5 INPUT_WIDTH : 512 19 | 2022-09-30 00:45:24 dagm_z_SC_03_except_class4_FOLD_5 LEARNING_RATE : 0.04 20 | 2022-09-30 00:45:24 dagm_z_SC_03_except_class4_FOLD_5 LOSS_SEG_THR : 0.02 21 | 2022-09-30 00:45:24 dagm_z_SC_03_except_class4_FOLD_5 MEMORY_FIT : 8 22 | 2022-09-30 00:45:24 dagm_z_SC_03_except_class4_FOLD_5 MODEL_NAME : models_trans10 23 | 2022-09-30 00:45:24 dagm_z_SC_03_except_class4_FOLD_5 NUM_SEGMENTED : 0 24 | 2022-09-30 00:45:24 dagm_z_SC_03_except_class4_FOLD_5 ON_DEMAND_READ : False 25 | 2022-09-30 00:45:24 dagm_z_SC_03_except_class4_FOLD_5 OPTIMIZER : SGD 26 | 2022-09-30 00:45:24 dagm_z_SC_03_except_class4_FOLD_5 REPRODUCIBLE_RUN : True 27 | 2022-09-30 00:45:24 dagm_z_SC_03_except_class4_FOLD_5 RESULTS_PATH : ./results_new 28 | 2022-09-30 00:45:24 dagm_z_SC_03_except_class4_FOLD_5 SAMPLING : half_mixed 29 | 2022-09-30 00:45:24 dagm_z_SC_03_except_class4_FOLD_5 SAVE_IMAGES : True 30 | 2022-09-30 00:45:24 dagm_z_SC_03_except_class4_FOLD_5 TRAIN_NUM : None 31 | 2022-09-30 00:45:24 dagm_z_SC_03_except_class4_FOLD_5 TRANS_BRIGHT : 1.0 32 | 2022-09-30 00:45:24 dagm_z_SC_03_except_class4_FOLD_5 TRANS_KEEP_LOOP : 1 33 | 2022-09-30 00:45:24 dagm_z_SC_03_except_class4_FOLD_5 TRANS_NUM : 4 34 | 2022-09-30 00:45:24 dagm_z_SC_03_except_class4_FOLD_5 USE_BEST_MODEL : False 35 | 2022-09-30 00:45:24 dagm_z_SC_03_except_class4_FOLD_5 VALIDATE : True 36 | 2022-09-30 00:45:24 dagm_z_SC_03_except_class4_FOLD_5 VALIDATE_ON_TEST : True 37 | 2022-09-30 00:45:24 dagm_z_SC_03_except_class4_FOLD_5 VALIDATION_N_EPOCHS : 3 38 | 2022-09-30 00:45:24 dagm_z_SC_03_except_class4_FOLD_5 VOLUME_CFG : None 39 | 2022-09-30 00:45:24 dagm_z_SC_03_except_class4_FOLD_5 WEIGHTED_SEG_LOSS : False 40 | 2022-09-30 00:45:24 dagm_z_SC_03_except_class4_FOLD_5 WEIGHTED_SEG_LOSS_MAX : 10.0 41 | 2022-09-30 00:45:24 dagm_z_SC_03_except_class4_FOLD_5 WEIGHTED_SEG_LOSS_P : 1.0 42 | 2022-09-30 00:45:24 dagm_z_SC_03_except_class4_FOLD_5 Reproducible run, fixing all seeds to:1337 43 | This is models_trans10 44 | 2022-09-30 00:45:46 dagm_z_SC_03_except_class4_FOLD_5 Returning seg_loss_weight 1 and dec_loss_weight 1.0 45 | 2022-09-30 00:45:46 dagm_z_SC_03_except_class4_FOLD_5 Returning dec_gradient_multiplier 1 46 | 2022-09-30 00:45:52 dagm_z_SC_03_except_class4_FOLD_5 Epoch 1/25 ==> avg_loss_seg=0.71411, avg_loss_seg_pos=0.18042, avg_loss_dec=0.54312, avg_loss=1.25722, FP=22, FN=20, correct=94/136, in 6.27s/epoch 47 | 2022-09-30 00:45:58 dagm_z_SC_03_except_class4_FOLD_5 VALIDATION || AUC=1.000000, and AP=1.000000, with best thr=0.773299 at f-measure=1.000 and FP=0, FN=0, TOTAL SAMPLES=575, avg_loss_seg_neg=0.68316, avg_loss_seg_pos=0.70667 48 | 2022-09-30 00:45:58 dagm_z_SC_03_except_class4_FOLD_5 Saving current models state to ./results_new\DAGM\dagm_z_SC_03_except_class4\FOLD_5\models\ep_01.pth 49 | 2022-09-30 00:45:58 dagm_z_SC_03_except_class4_FOLD_5 Saving current models state to ./results_new\DAGM\dagm_z_SC_03_except_class4\FOLD_5\models\best_state_dict.pth 50 | 2022-09-30 00:45:58 dagm_z_SC_03_except_class4_FOLD_5 Returning seg_loss_weight 1 
and dec_loss_weight 1.0 51 | 2022-09-30 00:45:58 dagm_z_SC_03_except_class4_FOLD_5 Returning dec_gradient_multiplier 1 52 | 2022-09-30 00:46:03 dagm_z_SC_03_except_class4_FOLD_5 Epoch 2/25 ==> avg_loss_seg=0.65632, avg_loss_seg_pos=0.16618, avg_loss_dec=0.29299, avg_loss=0.94931, FP=10, FN=5, correct=121/136, in 4.62s/epoch 53 | 2022-09-30 00:46:03 dagm_z_SC_03_except_class4_FOLD_5 Returning seg_loss_weight 1 and dec_loss_weight 1.0 54 | 2022-09-30 00:46:03 dagm_z_SC_03_except_class4_FOLD_5 Returning dec_gradient_multiplier 1 55 | 2022-09-30 00:46:08 dagm_z_SC_03_except_class4_FOLD_5 Epoch 3/25 ==> avg_loss_seg=0.60313, avg_loss_seg_pos=0.15342, avg_loss_dec=0.15287, avg_loss=0.75600, FP=3, FN=5, correct=128/136, in 4.62s/epoch 56 | 2022-09-30 00:46:08 dagm_z_SC_03_except_class4_FOLD_5 Returning seg_loss_weight 1 and dec_loss_weight 1.0 57 | 2022-09-30 00:46:08 dagm_z_SC_03_except_class4_FOLD_5 Returning dec_gradient_multiplier 1 58 | 2022-09-30 00:46:13 dagm_z_SC_03_except_class4_FOLD_5 Epoch 4/25 ==> avg_loss_seg=0.55725, avg_loss_seg_pos=0.14240, avg_loss_dec=0.04021, avg_loss=0.59745, FP=0, FN=1, correct=135/136, in 4.63s/epoch 59 | 2022-09-30 00:46:18 dagm_z_SC_03_except_class4_FOLD_5 VALIDATION || AUC=1.000000, and AP=1.000000, with best thr=0.998382 at f-measure=1.000 and FP=0, FN=0, TOTAL SAMPLES=575, avg_loss_seg_neg=0.53525, avg_loss_seg_pos=0.55117 60 | 2022-09-30 00:46:18 dagm_z_SC_03_except_class4_FOLD_5 Saving current models state to ./results_new\DAGM\dagm_z_SC_03_except_class4\FOLD_5\models\ep_04.pth 61 | 2022-09-30 00:46:18 dagm_z_SC_03_except_class4_FOLD_5 Returning seg_loss_weight 1 and dec_loss_weight 1.0 62 | 2022-09-30 00:46:18 dagm_z_SC_03_except_class4_FOLD_5 Returning dec_gradient_multiplier 1 63 | 2022-09-30 00:46:23 dagm_z_SC_03_except_class4_FOLD_5 Epoch 5/25 ==> avg_loss_seg=0.51689, avg_loss_seg_pos=0.13304, avg_loss_dec=0.03664, avg_loss=0.55353, FP=0, FN=1, correct=135/136, in 4.64s/epoch 64 | 2022-09-30 00:46:23 dagm_z_SC_03_except_class4_FOLD_5 Returning seg_loss_weight 1 and dec_loss_weight 1.0 65 | 2022-09-30 00:46:23 dagm_z_SC_03_except_class4_FOLD_5 Returning dec_gradient_multiplier 1 66 | 2022-09-30 00:46:28 dagm_z_SC_03_except_class4_FOLD_5 Epoch 6/25 ==> avg_loss_seg=0.48062, avg_loss_seg_pos=0.12418, avg_loss_dec=0.05425, avg_loss=0.53488, FP=0, FN=1, correct=135/136, in 4.63s/epoch 67 | 2022-09-30 00:46:28 dagm_z_SC_03_except_class4_FOLD_5 Returning seg_loss_weight 1 and dec_loss_weight 1.0 68 | 2022-09-30 00:46:28 dagm_z_SC_03_except_class4_FOLD_5 Returning dec_gradient_multiplier 1 69 | 2022-09-30 00:46:33 dagm_z_SC_03_except_class4_FOLD_5 Epoch 7/25 ==> avg_loss_seg=0.44765, avg_loss_seg_pos=0.11634, avg_loss_dec=0.00564, avg_loss=0.45329, FP=0, FN=0, correct=136/136, in 4.65s/epoch 70 | 2022-09-30 00:46:39 dagm_z_SC_03_except_class4_FOLD_5 VALIDATION || AUC=1.000000, and AP=1.000000, with best thr=0.996797 at f-measure=1.000 and FP=0, FN=0, TOTAL SAMPLES=575, avg_loss_seg_neg=0.43144, avg_loss_seg_pos=0.45110 71 | 2022-09-30 00:46:39 dagm_z_SC_03_except_class4_FOLD_5 Saving current models state to ./results_new\DAGM\dagm_z_SC_03_except_class4\FOLD_5\models\ep_07.pth 72 | 2022-09-30 00:46:39 dagm_z_SC_03_except_class4_FOLD_5 Returning seg_loss_weight 1 and dec_loss_weight 1.0 73 | 2022-09-30 00:46:39 dagm_z_SC_03_except_class4_FOLD_5 Returning dec_gradient_multiplier 1 74 | 2022-09-30 00:46:44 dagm_z_SC_03_except_class4_FOLD_5 Epoch 8/25 ==> avg_loss_seg=0.41824, avg_loss_seg_pos=0.10944, avg_loss_dec=0.02617, avg_loss=0.44442, FP=0, FN=1, 
correct=135/136, in 4.65s/epoch 75 | 2022-09-30 00:46:44 dagm_z_SC_03_except_class4_FOLD_5 Returning seg_loss_weight 1 and dec_loss_weight 1.0 76 | 2022-09-30 00:46:44 dagm_z_SC_03_except_class4_FOLD_5 Returning dec_gradient_multiplier 1 77 | 2022-09-30 00:46:49 dagm_z_SC_03_except_class4_FOLD_5 Epoch 9/25 ==> avg_loss_seg=0.39180, avg_loss_seg_pos=0.10311, avg_loss_dec=0.00553, avg_loss=0.39733, FP=0, FN=0, correct=136/136, in 4.65s/epoch 78 | 2022-09-30 00:46:49 dagm_z_SC_03_except_class4_FOLD_5 Returning seg_loss_weight 1 and dec_loss_weight 1.0 79 | 2022-09-30 00:46:49 dagm_z_SC_03_except_class4_FOLD_5 Returning dec_gradient_multiplier 1 80 | 2022-09-30 00:46:54 dagm_z_SC_03_except_class4_FOLD_5 Epoch 10/25 ==> avg_loss_seg=0.36804, avg_loss_seg_pos=0.09752, avg_loss_dec=0.01159, avg_loss=0.37963, FP=0, FN=1, correct=135/136, in 4.65s/epoch 81 | 2022-09-30 00:46:59 dagm_z_SC_03_except_class4_FOLD_5 VALIDATION || AUC=1.000000, and AP=1.000000, with best thr=0.999837 at f-measure=1.000 and FP=0, FN=0, TOTAL SAMPLES=575, avg_loss_seg_neg=0.35624, avg_loss_seg_pos=0.38066 82 | 2022-09-30 00:46:59 dagm_z_SC_03_except_class4_FOLD_5 Saving current models state to ./results_new\DAGM\dagm_z_SC_03_except_class4\FOLD_5\models\ep_10.pth 83 | 2022-09-30 00:46:59 dagm_z_SC_03_except_class4_FOLD_5 Returning seg_loss_weight 1 and dec_loss_weight 1.0 84 | 2022-09-30 00:46:59 dagm_z_SC_03_except_class4_FOLD_5 Returning dec_gradient_multiplier 1 85 | 2022-09-30 00:47:04 dagm_z_SC_03_except_class4_FOLD_5 Epoch 11/25 ==> avg_loss_seg=0.34643, avg_loss_seg_pos=0.09229, avg_loss_dec=0.00341, avg_loss=0.34984, FP=0, FN=0, correct=136/136, in 4.66s/epoch 86 | 2022-09-30 00:47:04 dagm_z_SC_03_except_class4_FOLD_5 Returning seg_loss_weight 1 and dec_loss_weight 1.0 87 | 2022-09-30 00:47:04 dagm_z_SC_03_except_class4_FOLD_5 Returning dec_gradient_multiplier 1 88 | 2022-09-30 00:47:09 dagm_z_SC_03_except_class4_FOLD_5 Epoch 12/25 ==> avg_loss_seg=0.32713, avg_loss_seg_pos=0.08842, avg_loss_dec=0.02436, avg_loss=0.35149, FP=0, FN=1, correct=135/136, in 4.65s/epoch 89 | 2022-09-30 00:47:09 dagm_z_SC_03_except_class4_FOLD_5 Returning seg_loss_weight 1 and dec_loss_weight 1.0 90 | 2022-09-30 00:47:09 dagm_z_SC_03_except_class4_FOLD_5 Returning dec_gradient_multiplier 1 91 | 2022-09-30 00:47:14 dagm_z_SC_03_except_class4_FOLD_5 Epoch 13/25 ==> avg_loss_seg=0.30974, avg_loss_seg_pos=0.08468, avg_loss_dec=0.02015, avg_loss=0.32989, FP=0, FN=1, correct=135/136, in 4.65s/epoch 92 | 2022-09-30 00:47:20 dagm_z_SC_03_except_class4_FOLD_5 VALIDATION || AUC=1.000000, and AP=1.000000, with best thr=0.999994 at f-measure=1.000 and FP=0, FN=0, TOTAL SAMPLES=575, avg_loss_seg_neg=0.30086, avg_loss_seg_pos=0.33102 93 | 2022-09-30 00:47:20 dagm_z_SC_03_except_class4_FOLD_5 Saving current models state to ./results_new\DAGM\dagm_z_SC_03_except_class4\FOLD_5\models\ep_13.pth 94 | 2022-09-30 00:47:20 dagm_z_SC_03_except_class4_FOLD_5 Returning seg_loss_weight 1 and dec_loss_weight 1.0 95 | 2022-09-30 00:47:20 dagm_z_SC_03_except_class4_FOLD_5 Returning dec_gradient_multiplier 1 96 | 2022-09-30 00:47:25 dagm_z_SC_03_except_class4_FOLD_5 Epoch 14/25 ==> avg_loss_seg=0.29336, avg_loss_seg_pos=0.08054, avg_loss_dec=0.00482, avg_loss=0.29818, FP=0, FN=0, correct=136/136, in 4.66s/epoch 97 | 2022-09-30 00:47:25 dagm_z_SC_03_except_class4_FOLD_5 Returning seg_loss_weight 1 and dec_loss_weight 1.0 98 | 2022-09-30 00:47:25 dagm_z_SC_03_except_class4_FOLD_5 Returning dec_gradient_multiplier 1 99 | 2022-09-30 00:47:30 
dagm_z_SC_03_except_class4_FOLD_5 Epoch 15/25 ==> avg_loss_seg=0.27848, avg_loss_seg_pos=0.07715, avg_loss_dec=0.00181, avg_loss=0.28029, FP=0, FN=0, correct=136/136, in 4.66s/epoch 100 | 2022-09-30 00:47:30 dagm_z_SC_03_except_class4_FOLD_5 Returning seg_loss_weight 1 and dec_loss_weight 1.0 101 | 2022-09-30 00:47:30 dagm_z_SC_03_except_class4_FOLD_5 Returning dec_gradient_multiplier 1 102 | 2022-09-30 00:47:35 dagm_z_SC_03_except_class4_FOLD_5 Epoch 16/25 ==> avg_loss_seg=0.26575, avg_loss_seg_pos=0.07416, avg_loss_dec=0.06521, avg_loss=0.33095, FP=2, FN=2, correct=132/136, in 4.67s/epoch 103 | 2022-09-30 00:47:40 dagm_z_SC_03_except_class4_FOLD_5 VALIDATION || AUC=1.000000, and AP=1.000000, with best thr=0.985986 at f-measure=1.000 and FP=0, FN=0, TOTAL SAMPLES=575, avg_loss_seg_neg=0.26113, avg_loss_seg_pos=0.29867 104 | 2022-09-30 00:47:40 dagm_z_SC_03_except_class4_FOLD_5 Saving current models state to ./results_new\DAGM\dagm_z_SC_03_except_class4\FOLD_5\models\ep_16.pth 105 | 2022-09-30 00:47:40 dagm_z_SC_03_except_class4_FOLD_5 Returning seg_loss_weight 1 and dec_loss_weight 1.0 106 | 2022-09-30 00:47:40 dagm_z_SC_03_except_class4_FOLD_5 Returning dec_gradient_multiplier 1 107 | 2022-09-30 00:47:45 dagm_z_SC_03_except_class4_FOLD_5 Epoch 17/25 ==> avg_loss_seg=0.25432, avg_loss_seg_pos=0.07198, avg_loss_dec=0.01524, avg_loss=0.26956, FP=0, FN=0, correct=136/136, in 4.67s/epoch 108 | 2022-09-30 00:47:45 dagm_z_SC_03_except_class4_FOLD_5 Returning seg_loss_weight 1 and dec_loss_weight 1.0 109 | 2022-09-30 00:47:45 dagm_z_SC_03_except_class4_FOLD_5 Returning dec_gradient_multiplier 1 110 | 2022-09-30 00:47:50 dagm_z_SC_03_except_class4_FOLD_5 Epoch 18/25 ==> avg_loss_seg=0.24191, avg_loss_seg_pos=0.06908, avg_loss_dec=0.01153, avg_loss=0.25344, FP=0, FN=1, correct=135/136, in 4.66s/epoch 111 | 2022-09-30 00:47:50 dagm_z_SC_03_except_class4_FOLD_5 Returning seg_loss_weight 1 and dec_loss_weight 1.0 112 | 2022-09-30 00:47:50 dagm_z_SC_03_except_class4_FOLD_5 Returning dec_gradient_multiplier 1 113 | 2022-09-30 00:47:55 dagm_z_SC_03_except_class4_FOLD_5 Epoch 19/25 ==> avg_loss_seg=0.23106, avg_loss_seg_pos=0.06662, avg_loss_dec=0.00573, avg_loss=0.23679, FP=0, FN=0, correct=136/136, in 4.66s/epoch 114 | 2022-09-30 00:48:01 dagm_z_SC_03_except_class4_FOLD_5 VALIDATION || AUC=1.000000, and AP=1.000000, with best thr=0.999963 at f-measure=1.000 and FP=0, FN=0, TOTAL SAMPLES=575, avg_loss_seg_neg=0.22563, avg_loss_seg_pos=0.26544 115 | 2022-09-30 00:48:01 dagm_z_SC_03_except_class4_FOLD_5 Saving current models state to ./results_new\DAGM\dagm_z_SC_03_except_class4\FOLD_5\models\ep_19.pth 116 | 2022-09-30 00:48:01 dagm_z_SC_03_except_class4_FOLD_5 Returning seg_loss_weight 1 and dec_loss_weight 1.0 117 | 2022-09-30 00:48:01 dagm_z_SC_03_except_class4_FOLD_5 Returning dec_gradient_multiplier 1 118 | 2022-09-30 00:48:06 dagm_z_SC_03_except_class4_FOLD_5 Epoch 20/25 ==> avg_loss_seg=0.22128, avg_loss_seg_pos=0.06427, avg_loss_dec=0.05848, avg_loss=0.27976, FP=1, FN=1, correct=134/136, in 4.67s/epoch 119 | 2022-09-30 00:48:06 dagm_z_SC_03_except_class4_FOLD_5 Returning seg_loss_weight 1 and dec_loss_weight 1.0 120 | 2022-09-30 00:48:06 dagm_z_SC_03_except_class4_FOLD_5 Returning dec_gradient_multiplier 1 121 | 2022-09-30 00:48:11 dagm_z_SC_03_except_class4_FOLD_5 Epoch 21/25 ==> avg_loss_seg=0.21224, avg_loss_seg_pos=0.06262, avg_loss_dec=0.01602, avg_loss=0.22826, FP=0, FN=1, correct=135/136, in 4.66s/epoch 122 | 2022-09-30 00:48:11 dagm_z_SC_03_except_class4_FOLD_5 Returning seg_loss_weight 1 
and dec_loss_weight 1.0 123 | 2022-09-30 00:48:11 dagm_z_SC_03_except_class4_FOLD_5 Returning dec_gradient_multiplier 1 124 | 2022-09-30 00:48:16 dagm_z_SC_03_except_class4_FOLD_5 Epoch 22/25 ==> avg_loss_seg=0.20356, avg_loss_seg_pos=0.05993, avg_loss_dec=0.00207, avg_loss=0.20563, FP=0, FN=0, correct=136/136, in 4.67s/epoch 125 | 2022-09-30 00:48:21 dagm_z_SC_03_except_class4_FOLD_5 VALIDATION || AUC=1.000000, and AP=1.000000, with best thr=0.999979 at f-measure=1.000 and FP=0, FN=0, TOTAL SAMPLES=575, avg_loss_seg_neg=0.19924, avg_loss_seg_pos=0.23965 126 | 2022-09-30 00:48:21 dagm_z_SC_03_except_class4_FOLD_5 Saving current models state to ./results_new\DAGM\dagm_z_SC_03_except_class4\FOLD_5\models\ep_22.pth 127 | 2022-09-30 00:48:21 dagm_z_SC_03_except_class4_FOLD_5 Returning seg_loss_weight 1 and dec_loss_weight 1.0 128 | 2022-09-30 00:48:21 dagm_z_SC_03_except_class4_FOLD_5 Returning dec_gradient_multiplier 1 129 | 2022-09-30 00:48:26 dagm_z_SC_03_except_class4_FOLD_5 Epoch 23/25 ==> avg_loss_seg=0.19560, avg_loss_seg_pos=0.05825, avg_loss_dec=0.00096, avg_loss=0.19656, FP=0, FN=0, correct=136/136, in 4.67s/epoch 130 | 2022-09-30 00:48:26 dagm_z_SC_03_except_class4_FOLD_5 Returning seg_loss_weight 1 and dec_loss_weight 1.0 131 | 2022-09-30 00:48:26 dagm_z_SC_03_except_class4_FOLD_5 Returning dec_gradient_multiplier 1 132 | 2022-09-30 00:48:31 dagm_z_SC_03_except_class4_FOLD_5 Epoch 24/25 ==> avg_loss_seg=0.18822, avg_loss_seg_pos=0.05655, avg_loss_dec=0.00201, avg_loss=0.19024, FP=0, FN=0, correct=136/136, in 4.67s/epoch 133 | 2022-09-30 00:48:31 dagm_z_SC_03_except_class4_FOLD_5 Returning seg_loss_weight 1 and dec_loss_weight 1.0 134 | 2022-09-30 00:48:31 dagm_z_SC_03_except_class4_FOLD_5 Returning dec_gradient_multiplier 1 135 | 2022-09-30 00:48:36 dagm_z_SC_03_except_class4_FOLD_5 Epoch 25/25 ==> avg_loss_seg=0.18169, avg_loss_seg_pos=0.05539, avg_loss_dec=0.07545, avg_loss=0.25714, FP=2, FN=2, correct=132/136, in 4.67s/epoch 136 | 2022-09-30 00:48:42 dagm_z_SC_03_except_class4_FOLD_5 VALIDATION || AUC=1.000000, and AP=1.000000, with best thr=0.999997 at f-measure=1.000 and FP=0, FN=0, TOTAL SAMPLES=575, avg_loss_seg_neg=0.17802, avg_loss_seg_pos=0.22350 137 | 2022-09-30 00:48:42 dagm_z_SC_03_except_class4_FOLD_5 Saving current models state to ./results_new\DAGM\dagm_z_SC_03_except_class4\FOLD_5\models\ep_25.pth 138 | now best_AP=1.0,epoch=1 139 | 2022-09-30 00:48:42 dagm_z_SC_03_except_class4_FOLD_5 Saving current models state to ./results_new\DAGM\dagm_z_SC_03_except_class4\FOLD_5\models\final_state_dict.pth 140 | 2022-09-30 00:48:42 dagm_z_SC_03_except_class4_FOLD_5 Keeping same model state 141 | dagm_z_SC_03_except_class4_FOLD_5 EVAL AUC=1.000000, and AP=1.000000, w/ best thr=0.999997 at f-m=1.000 and FP=0, FN=0 142 | run_time 7.7min 143 | -------------------------------------------------------------------------------- /results/02_DAGM/dagm_z_SC_03_except_class4/FOLD_6/ROC.pdf: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/Fishdrink/SC-Net/51647e4cd1d665ce074f7a3286420523afd36487/results/02_DAGM/dagm_z_SC_03_except_class4/FOLD_6/ROC.pdf -------------------------------------------------------------------------------- /results/02_DAGM/dagm_z_SC_03_except_class4/FOLD_6/loss_val.png: -------------------------------------------------------------------------------- 
https://raw.githubusercontent.com/Fishdrink/SC-Net/51647e4cd1d665ce074f7a3286420523afd36487/results/02_DAGM/dagm_z_SC_03_except_class4/FOLD_6/loss_val.png -------------------------------------------------------------------------------- /results/02_DAGM/dagm_z_SC_03_except_class4/FOLD_6/losses.csv: -------------------------------------------------------------------------------- 1 | validation_data,loss_dec,loss,epoch 2 | 0.7115654319524765,0.6380034625530243,1.3495688945055009,0 3 | 0.6455729007720947,0.43586455956101416,1.081437460333109,1 4 | 0.5870946556329727,0.2431777659803629,0.8302724216133356,2 5 | 0.5341200172901154,0.10872150249779225,0.6428415197879076,3 6 | 0.4913279190659523,0.16414860282093285,0.6554765218868852,4 7 | 0.44918117523193357,0.030245164642110467,0.47942633987404404,5 8 | 0.41387332379817965,0.02020283353049308,0.4340761573286727,6 9 | 0.38634833246469497,0.3740379904396832,0.7603863229043781,7 10 | 0.3574646309018135,0.06936874939128757,0.4268333802931011,8 11 | 0.3336899787187576,0.03041381755610928,0.3641037962748669,9 12 | 0.3123889654874802,0.02315998326521367,0.33554894875269387,10 13 | 0.29299225509166715,0.00929333473322913,0.3022855898248963,11 14 | 0.27543880641460416,0.006552090562763624,0.2819908969773678,12 15 | 0.2597988322377205,0.007818945168401115,0.2676177774061216,13 16 | 0.24561187028884887,0.006996183306910098,0.252608053595759,14 17 | 0.2327621817588806,0.008539481356274336,0.24130166311515494,15 18 | 0.22176935151219368,0.03557467277860269,0.2573440242907964,16 19 | 0.21110331639647484,0.019587049182155168,0.23069036557863,17 20 | 0.20095710158348085,0.003910617626388557,0.2048677192098694,18 21 | 0.19180023223161696,0.0054071390361059455,0.19720737126772292,19 22 | 0.18338676989078523,0.0036356179967697243,0.18702238788755496,20 23 | 0.17563704028725624,0.0023538185785582756,0.1779908588658145,21 24 | 0.16848389953374862,0.002237921336200088,0.17072182086994872,22 25 | 0.1618635945022106,0.0027608273434452713,0.16462442184565587,23 26 | 0.15573875606060028,0.0031086083181435242,0.1588473643787438,24 27 | -------------------------------------------------------------------------------- /results/02_DAGM/dagm_z_SC_03_except_class4/FOLD_6/precision-recall.pdf: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/Fishdrink/SC-Net/51647e4cd1d665ce074f7a3286420523afd36487/results/02_DAGM/dagm_z_SC_03_except_class4/FOLD_6/precision-recall.pdf -------------------------------------------------------------------------------- /results/02_DAGM/dagm_z_SC_03_except_class4/FOLD_6/run_params.txt: -------------------------------------------------------------------------------- 1 | AREA_RATIO_MIN:0.9 2 | BATCH_SIZE:8 3 | CONTINUE_LAST_TRAIN:None 4 | DATASET:DAGM 5 | DATASET_PATH:./datasets/DAGM/ 6 | DELTA_CLS_LOSS:1.0 7 | DILATE:0 8 | DROPOUT_P:0.2 9 | DYN_BALANCED_LOSS:False 10 | EPOCHS:25 11 | FOLD:6 12 | FREQUENCY_SAMPLING:True 13 | GPU:0 14 | GRADIENT_ADJUSTMENT:False 15 | INPUT_CHANNELS:1 16 | INPUT_HEIGHT:512 17 | INPUT_WIDTH:512 18 | LEARNING_RATE:0.04 19 | LOSS_SEG_THR:0.02 20 | MEMORY_FIT:8 21 | MODEL_NAME:models_trans10 22 | NUM_SEGMENTED:0 23 | ON_DEMAND_READ:False 24 | OPTIMIZER:SGD 25 | REPRODUCIBLE_RUN:True 26 | RESULTS_PATH:./results_new 27 | SAMPLING:half_mixed 28 | SAVE_IMAGES:True 29 | TRAIN_NUM:None 30 | TRANS_BRIGHT:1.0 31 | TRANS_KEEP_LOOP:1 32 | TRANS_NUM:4 33 | USE_BEST_MODEL:False 34 | VALIDATE:True 35 | VALIDATE_ON_TEST:True 36 | VALIDATION_N_EPOCHS:3 37 | VOLUME_CFG:None 38 
| WEIGHTED_SEG_LOSS:False 39 | WEIGHTED_SEG_LOSS_MAX:10.0 40 | WEIGHTED_SEG_LOSS_P:1.0 41 | -------------------------------------------------------------------------------- /results/02_DAGM/dagm_z_SC_03_except_class4/FOLD_7/ROC.pdf: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/Fishdrink/SC-Net/51647e4cd1d665ce074f7a3286420523afd36487/results/02_DAGM/dagm_z_SC_03_except_class4/FOLD_7/ROC.pdf -------------------------------------------------------------------------------- /results/02_DAGM/dagm_z_SC_03_except_class4/FOLD_7/loss_val.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/Fishdrink/SC-Net/51647e4cd1d665ce074f7a3286420523afd36487/results/02_DAGM/dagm_z_SC_03_except_class4/FOLD_7/loss_val.png -------------------------------------------------------------------------------- /results/02_DAGM/dagm_z_SC_03_except_class4/FOLD_7/losses.csv: -------------------------------------------------------------------------------- 1 | validation_data,loss_dec,loss,epoch 2 | 0.6696458088385092,0.5664164011140127,1.236062209952522,0 3 | 0.5595072701170638,0.039257297915331016,0.5987645680323949,1 4 | 0.47590884324666616,0.023906624378837848,0.499815467625504,2 5 | 0.4104943790951291,0.009655620782428447,0.4201499998775575,3 6 | 0.35823119572691015,0.01282175151140404,0.37105294723831417,4 7 | 0.31601962930447347,0.018324011573379206,0.3343436408778527,5 8 | 0.28133501877655853,0.002353714854531997,0.28368873363109054,6 9 | 0.2526717129591349,0.0011961152777075768,0.25386782823684245,7 10 | 0.22907975718781753,0.005089897812049949,0.23416965499986747,8 11 | 0.20932044934582067,0.014419930389251661,0.22374037973507233,9 12 | 0.19193606400811994,0.0015376918615672636,0.1934737558696872,10 13 | 0.18131668744860469,0.1281171745866084,0.30943386203521306,11 14 | 0.16691578038640925,0.012768919119529577,0.17968469950593882,12 15 | 0.1546803592024623,0.002476794922266571,0.15715715412472886,13 16 | 0.14441131095628482,0.0020139891500081364,0.14642530010629295,14 17 | 0.1353585361628919,0.0013568455556319473,0.13671538171852385,15 18 | 0.1273314487692472,0.0006787377013195608,0.12801018647056678,16 19 | 0.12024402598271498,0.003835748576887921,0.12407977455960291,17 20 | 0.11379199475049973,0.0007077476691700065,0.11449974241966973,18 21 | 0.1080354868962958,0.0067973873688921185,0.11483287426518791,19 22 | 0.10278934059110847,0.002081324973374108,0.10487066556448259,20 23 | 0.09796648392000713,0.0007311147765486148,0.09869759869655574,21 24 | 0.09356232249253504,0.0003411836162716075,0.09390350610880666,22 25 | 0.08954817701030422,0.0008378881636442227,0.09038606517394844,23 26 | 0.0858254728687776,0.000203049505216768,0.08602852237399437,24 27 | -------------------------------------------------------------------------------- /results/02_DAGM/dagm_z_SC_03_except_class4/FOLD_7/precision-recall.pdf: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/Fishdrink/SC-Net/51647e4cd1d665ce074f7a3286420523afd36487/results/02_DAGM/dagm_z_SC_03_except_class4/FOLD_7/precision-recall.pdf -------------------------------------------------------------------------------- /results/02_DAGM/dagm_z_SC_03_except_class4/FOLD_7/run_params.txt: -------------------------------------------------------------------------------- 1 | AREA_RATIO_MIN:0.9 2 | BATCH_SIZE:8 3 | CONTINUE_LAST_TRAIN:None 4 | DATASET:DAGM 5 | 
DATASET_PATH:./datasets/DAGM/ 6 | DELTA_CLS_LOSS:1.0 7 | DILATE:0 8 | DROPOUT_P:0.2 9 | DYN_BALANCED_LOSS:False 10 | EPOCHS:25 11 | FOLD:7 12 | FREQUENCY_SAMPLING:True 13 | GPU:0 14 | GRADIENT_ADJUSTMENT:False 15 | INPUT_CHANNELS:1 16 | INPUT_HEIGHT:512 17 | INPUT_WIDTH:512 18 | LEARNING_RATE:0.04 19 | LOSS_SEG_THR:0.02 20 | MEMORY_FIT:8 21 | MODEL_NAME:models_trans10 22 | NUM_SEGMENTED:0 23 | ON_DEMAND_READ:False 24 | OPTIMIZER:SGD 25 | REPRODUCIBLE_RUN:True 26 | RESULTS_PATH:./results_new 27 | SAMPLING:half_mixed 28 | SAVE_IMAGES:True 29 | TRAIN_NUM:None 30 | TRANS_BRIGHT:1.0 31 | TRANS_KEEP_LOOP:1 32 | TRANS_NUM:4 33 | USE_BEST_MODEL:False 34 | VALIDATE:True 35 | VALIDATE_ON_TEST:True 36 | VALIDATION_N_EPOCHS:3 37 | VOLUME_CFG:None 38 | WEIGHTED_SEG_LOSS:False 39 | WEIGHTED_SEG_LOSS_MAX:10.0 40 | WEIGHTED_SEG_LOSS_P:1.0 41 | -------------------------------------------------------------------------------- /results/02_DAGM/dagm_z_SC_03_except_class4/FOLD_7/training_log.txt: -------------------------------------------------------------------------------- 1 | 2022-09-30 01:01:10 dagm_z_SC_03_except_class4_FOLD_7 Executing run with path ./results_new\DAGM\dagm_z_SC_03_except_class4\FOLD_7 2 | 2022-09-30 01:01:10 dagm_z_SC_03_except_class4_FOLD_7 AREA_RATIO_MIN : 0.9 3 | 2022-09-30 01:01:10 dagm_z_SC_03_except_class4_FOLD_7 BATCH_SIZE : 8 4 | 2022-09-30 01:01:10 dagm_z_SC_03_except_class4_FOLD_7 CONTINUE_LAST_TRAIN : None 5 | 2022-09-30 01:01:10 dagm_z_SC_03_except_class4_FOLD_7 DATASET : DAGM 6 | 2022-09-30 01:01:10 dagm_z_SC_03_except_class4_FOLD_7 DATASET_PATH : ./datasets/DAGM/ 7 | 2022-09-30 01:01:10 dagm_z_SC_03_except_class4_FOLD_7 DELTA_CLS_LOSS : 1.0 8 | 2022-09-30 01:01:10 dagm_z_SC_03_except_class4_FOLD_7 DILATE : 0 9 | 2022-09-30 01:01:10 dagm_z_SC_03_except_class4_FOLD_7 DROPOUT_P : 0.2 10 | 2022-09-30 01:01:10 dagm_z_SC_03_except_class4_FOLD_7 DYN_BALANCED_LOSS : False 11 | 2022-09-30 01:01:10 dagm_z_SC_03_except_class4_FOLD_7 EPOCHS : 25 12 | 2022-09-30 01:01:10 dagm_z_SC_03_except_class4_FOLD_7 FOLD : 7 13 | 2022-09-30 01:01:10 dagm_z_SC_03_except_class4_FOLD_7 FREQUENCY_SAMPLING : True 14 | 2022-09-30 01:01:10 dagm_z_SC_03_except_class4_FOLD_7 GPU : 0 15 | 2022-09-30 01:01:10 dagm_z_SC_03_except_class4_FOLD_7 GRADIENT_ADJUSTMENT : False 16 | 2022-09-30 01:01:10 dagm_z_SC_03_except_class4_FOLD_7 INPUT_CHANNELS : 1 17 | 2022-09-30 01:01:10 dagm_z_SC_03_except_class4_FOLD_7 INPUT_HEIGHT : 512 18 | 2022-09-30 01:01:10 dagm_z_SC_03_except_class4_FOLD_7 INPUT_WIDTH : 512 19 | 2022-09-30 01:01:10 dagm_z_SC_03_except_class4_FOLD_7 LEARNING_RATE : 0.04 20 | 2022-09-30 01:01:10 dagm_z_SC_03_except_class4_FOLD_7 LOSS_SEG_THR : 0.02 21 | 2022-09-30 01:01:10 dagm_z_SC_03_except_class4_FOLD_7 MEMORY_FIT : 8 22 | 2022-09-30 01:01:10 dagm_z_SC_03_except_class4_FOLD_7 MODEL_NAME : models_trans10 23 | 2022-09-30 01:01:10 dagm_z_SC_03_except_class4_FOLD_7 NUM_SEGMENTED : 0 24 | 2022-09-30 01:01:10 dagm_z_SC_03_except_class4_FOLD_7 ON_DEMAND_READ : False 25 | 2022-09-30 01:01:10 dagm_z_SC_03_except_class4_FOLD_7 OPTIMIZER : SGD 26 | 2022-09-30 01:01:10 dagm_z_SC_03_except_class4_FOLD_7 REPRODUCIBLE_RUN : True 27 | 2022-09-30 01:01:10 dagm_z_SC_03_except_class4_FOLD_7 RESULTS_PATH : ./results_new 28 | 2022-09-30 01:01:10 dagm_z_SC_03_except_class4_FOLD_7 SAMPLING : half_mixed 29 | 2022-09-30 01:01:10 dagm_z_SC_03_except_class4_FOLD_7 SAVE_IMAGES : True 30 | 2022-09-30 01:01:10 dagm_z_SC_03_except_class4_FOLD_7 TRAIN_NUM : None 31 | 2022-09-30 01:01:10 dagm_z_SC_03_except_class4_FOLD_7 
TRANS_BRIGHT : 1.0 32 | 2022-09-30 01:01:10 dagm_z_SC_03_except_class4_FOLD_7 TRANS_KEEP_LOOP : 1 33 | 2022-09-30 01:01:10 dagm_z_SC_03_except_class4_FOLD_7 TRANS_NUM : 4 34 | 2022-09-30 01:01:10 dagm_z_SC_03_except_class4_FOLD_7 USE_BEST_MODEL : False 35 | 2022-09-30 01:01:10 dagm_z_SC_03_except_class4_FOLD_7 VALIDATE : True 36 | 2022-09-30 01:01:10 dagm_z_SC_03_except_class4_FOLD_7 VALIDATE_ON_TEST : True 37 | 2022-09-30 01:01:10 dagm_z_SC_03_except_class4_FOLD_7 VALIDATION_N_EPOCHS : 3 38 | 2022-09-30 01:01:10 dagm_z_SC_03_except_class4_FOLD_7 VOLUME_CFG : None 39 | 2022-09-30 01:01:10 dagm_z_SC_03_except_class4_FOLD_7 WEIGHTED_SEG_LOSS : False 40 | 2022-09-30 01:01:10 dagm_z_SC_03_except_class4_FOLD_7 WEIGHTED_SEG_LOSS_MAX : 10.0 41 | 2022-09-30 01:01:10 dagm_z_SC_03_except_class4_FOLD_7 WEIGHTED_SEG_LOSS_P : 1.0 42 | 2022-09-30 01:01:10 dagm_z_SC_03_except_class4_FOLD_7 Reproducible run, fixing all seeds to:1337 43 | This is models_trans10 44 | 2022-09-30 01:01:53 dagm_z_SC_03_except_class4_FOLD_7 Returning seg_loss_weight 1 and dec_loss_weight 1.0 45 | 2022-09-30 01:01:53 dagm_z_SC_03_except_class4_FOLD_7 Returning dec_gradient_multiplier 1 46 | 2022-09-30 01:02:05 dagm_z_SC_03_except_class4_FOLD_7 Epoch 1/25 ==> avg_loss_seg=0.66965, avg_loss_seg_pos=0.17223, avg_loss_dec=0.56642, avg_loss=1.23606, FP=47, FN=40, correct=209/296, in 11.92s/epoch 47 | 2022-09-30 01:02:17 dagm_z_SC_03_except_class4_FOLD_7 VALIDATION || AUC=1.000000, and AP=1.000000, with best thr=0.711760 at f-measure=1.000 and FP=0, FN=0, TOTAL SAMPLES=1150, avg_loss_seg_neg=0.60812, avg_loss_seg_pos=0.65738 48 | 2022-09-30 01:02:17 dagm_z_SC_03_except_class4_FOLD_7 Saving current models state to ./results_new\DAGM\dagm_z_SC_03_except_class4\FOLD_7\models\ep_01.pth 49 | 2022-09-30 01:02:17 dagm_z_SC_03_except_class4_FOLD_7 Saving current models state to ./results_new\DAGM\dagm_z_SC_03_except_class4\FOLD_7\models\best_state_dict.pth 50 | 2022-09-30 01:02:17 dagm_z_SC_03_except_class4_FOLD_7 Returning seg_loss_weight 1 and dec_loss_weight 1.0 51 | 2022-09-30 01:02:17 dagm_z_SC_03_except_class4_FOLD_7 Returning dec_gradient_multiplier 1 52 | 2022-09-30 01:02:27 dagm_z_SC_03_except_class4_FOLD_7 Epoch 2/25 ==> avg_loss_seg=0.55951, avg_loss_seg_pos=0.15440, avg_loss_dec=0.03926, avg_loss=0.59876, FP=0, FN=2, correct=294/296, in 10.32s/epoch 53 | 2022-09-30 01:02:27 dagm_z_SC_03_except_class4_FOLD_7 Returning seg_loss_weight 1 and dec_loss_weight 1.0 54 | 2022-09-30 01:02:27 dagm_z_SC_03_except_class4_FOLD_7 Returning dec_gradient_multiplier 1 55 | 2022-09-30 01:02:38 dagm_z_SC_03_except_class4_FOLD_7 Epoch 3/25 ==> avg_loss_seg=0.47591, avg_loss_seg_pos=0.13616, avg_loss_dec=0.02391, avg_loss=0.49982, FP=0, FN=1, correct=295/296, in 10.34s/epoch 56 | 2022-09-30 01:02:38 dagm_z_SC_03_except_class4_FOLD_7 Returning seg_loss_weight 1 and dec_loss_weight 1.0 57 | 2022-09-30 01:02:38 dagm_z_SC_03_except_class4_FOLD_7 Returning dec_gradient_multiplier 1 58 | 2022-09-30 01:02:48 dagm_z_SC_03_except_class4_FOLD_7 Epoch 4/25 ==> avg_loss_seg=0.41049, avg_loss_seg_pos=0.12090, avg_loss_dec=0.00966, avg_loss=0.42015, FP=0, FN=0, correct=296/296, in 10.35s/epoch 59 | 2022-09-30 01:03:00 dagm_z_SC_03_except_class4_FOLD_7 VALIDATION || AUC=1.000000, and AP=1.000000, with best thr=0.994380 at f-measure=1.000 and FP=0, FN=0, TOTAL SAMPLES=1150, avg_loss_seg_neg=0.38122, avg_loss_seg_pos=0.45915 60 | 2022-09-30 01:03:00 dagm_z_SC_03_except_class4_FOLD_7 Saving current models state to 
./results_new\DAGM\dagm_z_SC_03_except_class4\FOLD_7\models\ep_04.pth 61 | 2022-09-30 01:03:00 dagm_z_SC_03_except_class4_FOLD_7 Returning seg_loss_weight 1 and dec_loss_weight 1.0 62 | 2022-09-30 01:03:00 dagm_z_SC_03_except_class4_FOLD_7 Returning dec_gradient_multiplier 1 63 | 2022-09-30 01:03:11 dagm_z_SC_03_except_class4_FOLD_7 Epoch 5/25 ==> avg_loss_seg=0.35823, avg_loss_seg_pos=0.10810, avg_loss_dec=0.01282, avg_loss=0.37105, FP=0, FN=1, correct=295/296, in 10.38s/epoch 64 | 2022-09-30 01:03:11 dagm_z_SC_03_except_class4_FOLD_7 Returning seg_loss_weight 1 and dec_loss_weight 1.0 65 | 2022-09-30 01:03:11 dagm_z_SC_03_except_class4_FOLD_7 Returning dec_gradient_multiplier 1 66 | 2022-09-30 01:03:21 dagm_z_SC_03_except_class4_FOLD_7 Epoch 6/25 ==> avg_loss_seg=0.31602, avg_loss_seg_pos=0.09960, avg_loss_dec=0.01832, avg_loss=0.33434, FP=0, FN=2, correct=294/296, in 10.37s/epoch 67 | 2022-09-30 01:03:21 dagm_z_SC_03_except_class4_FOLD_7 Returning seg_loss_weight 1 and dec_loss_weight 1.0 68 | 2022-09-30 01:03:21 dagm_z_SC_03_except_class4_FOLD_7 Returning dec_gradient_multiplier 1 69 | 2022-09-30 01:03:32 dagm_z_SC_03_except_class4_FOLD_7 Epoch 7/25 ==> avg_loss_seg=0.28134, avg_loss_seg_pos=0.09150, avg_loss_dec=0.00235, avg_loss=0.28369, FP=0, FN=0, correct=296/296, in 10.37s/epoch 70 | 2022-09-30 01:03:44 dagm_z_SC_03_except_class4_FOLD_7 VALIDATION || AUC=1.000000, and AP=1.000000, with best thr=0.999757 at f-measure=1.000 and FP=0, FN=0, TOTAL SAMPLES=1150, avg_loss_seg_neg=0.26570, avg_loss_seg_pos=0.35419 71 | 2022-09-30 01:03:44 dagm_z_SC_03_except_class4_FOLD_7 Saving current models state to ./results_new\DAGM\dagm_z_SC_03_except_class4\FOLD_7\models\ep_07.pth 72 | 2022-09-30 01:03:44 dagm_z_SC_03_except_class4_FOLD_7 Returning seg_loss_weight 1 and dec_loss_weight 1.0 73 | 2022-09-30 01:03:44 dagm_z_SC_03_except_class4_FOLD_7 Returning dec_gradient_multiplier 1 74 | 2022-09-30 01:03:54 dagm_z_SC_03_except_class4_FOLD_7 Epoch 8/25 ==> avg_loss_seg=0.25267, avg_loss_seg_pos=0.08498, avg_loss_dec=0.00120, avg_loss=0.25387, FP=0, FN=0, correct=296/296, in 10.38s/epoch 75 | 2022-09-30 01:03:54 dagm_z_SC_03_except_class4_FOLD_7 Returning seg_loss_weight 1 and dec_loss_weight 1.0 76 | 2022-09-30 01:03:54 dagm_z_SC_03_except_class4_FOLD_7 Returning dec_gradient_multiplier 1 77 | 2022-09-30 01:04:05 dagm_z_SC_03_except_class4_FOLD_7 Epoch 9/25 ==> avg_loss_seg=0.22908, avg_loss_seg_pos=0.08022, avg_loss_dec=0.00509, avg_loss=0.23417, FP=0, FN=0, correct=296/296, in 10.38s/epoch 78 | 2022-09-30 01:04:05 dagm_z_SC_03_except_class4_FOLD_7 Returning seg_loss_weight 1 and dec_loss_weight 1.0 79 | 2022-09-30 01:04:05 dagm_z_SC_03_except_class4_FOLD_7 Returning dec_gradient_multiplier 1 80 | 2022-09-30 01:04:16 dagm_z_SC_03_except_class4_FOLD_7 Epoch 10/25 ==> avg_loss_seg=0.20932, avg_loss_seg_pos=0.07652, avg_loss_dec=0.01442, avg_loss=0.22374, FP=0, FN=1, correct=295/296, in 10.37s/epoch 81 | 2022-09-30 01:04:27 dagm_z_SC_03_except_class4_FOLD_7 VALIDATION || AUC=1.000000, and AP=1.000000, with best thr=1.000000 at f-measure=1.000 and FP=0, FN=0, TOTAL SAMPLES=1150, avg_loss_seg_neg=0.19996, avg_loss_seg_pos=0.30419 82 | 2022-09-30 01:04:27 dagm_z_SC_03_except_class4_FOLD_7 Saving current models state to ./results_new\DAGM\dagm_z_SC_03_except_class4\FOLD_7\models\ep_10.pth 83 | 2022-09-30 01:04:27 dagm_z_SC_03_except_class4_FOLD_7 Returning seg_loss_weight 1 and dec_loss_weight 1.0 84 | 2022-09-30 01:04:27 dagm_z_SC_03_except_class4_FOLD_7 Returning dec_gradient_multiplier 1 85 | 
2022-09-30 01:04:38 dagm_z_SC_03_except_class4_FOLD_7 Epoch 11/25 ==> avg_loss_seg=0.19194, avg_loss_seg_pos=0.07327, avg_loss_dec=0.00154, avg_loss=0.19347, FP=0, FN=0, correct=296/296, in 10.37s/epoch 86 | 2022-09-30 01:04:38 dagm_z_SC_03_except_class4_FOLD_7 Returning seg_loss_weight 1 and dec_loss_weight 1.0 87 | 2022-09-30 01:04:38 dagm_z_SC_03_except_class4_FOLD_7 Returning dec_gradient_multiplier 1 88 | 2022-09-30 01:04:49 dagm_z_SC_03_except_class4_FOLD_7 Epoch 12/25 ==> avg_loss_seg=0.18132, avg_loss_seg_pos=0.06919, avg_loss_dec=0.12812, avg_loss=0.30943, FP=5, FN=7, correct=284/296, in 10.40s/epoch 89 | 2022-09-30 01:04:49 dagm_z_SC_03_except_class4_FOLD_7 Returning seg_loss_weight 1 and dec_loss_weight 1.0 90 | 2022-09-30 01:04:49 dagm_z_SC_03_except_class4_FOLD_7 Returning dec_gradient_multiplier 1 91 | 2022-09-30 01:04:59 dagm_z_SC_03_except_class4_FOLD_7 Epoch 13/25 ==> avg_loss_seg=0.16692, avg_loss_seg_pos=0.06741, avg_loss_dec=0.01277, avg_loss=0.17968, FP=0, FN=1, correct=295/296, in 10.39s/epoch 92 | 2022-09-30 01:05:11 dagm_z_SC_03_except_class4_FOLD_7 VALIDATION || AUC=1.000000, and AP=1.000000, with best thr=0.994284 at f-measure=1.000 and FP=0, FN=0, TOTAL SAMPLES=1150, avg_loss_seg_neg=0.16023, avg_loss_seg_pos=0.26846 93 | 2022-09-30 01:05:11 dagm_z_SC_03_except_class4_FOLD_7 Saving current models state to ./results_new\DAGM\dagm_z_SC_03_except_class4\FOLD_7\models\ep_13.pth 94 | 2022-09-30 01:05:11 dagm_z_SC_03_except_class4_FOLD_7 Returning seg_loss_weight 1 and dec_loss_weight 1.0 95 | 2022-09-30 01:05:11 dagm_z_SC_03_except_class4_FOLD_7 Returning dec_gradient_multiplier 1 96 | 2022-09-30 01:05:22 dagm_z_SC_03_except_class4_FOLD_7 Epoch 14/25 ==> avg_loss_seg=0.15468, avg_loss_seg_pos=0.06468, avg_loss_dec=0.00248, avg_loss=0.15716, FP=0, FN=0, correct=296/296, in 10.41s/epoch 97 | 2022-09-30 01:05:22 dagm_z_SC_03_except_class4_FOLD_7 Returning seg_loss_weight 1 and dec_loss_weight 1.0 98 | 2022-09-30 01:05:22 dagm_z_SC_03_except_class4_FOLD_7 Returning dec_gradient_multiplier 1 99 | 2022-09-30 01:05:32 dagm_z_SC_03_except_class4_FOLD_7 Epoch 15/25 ==> avg_loss_seg=0.14441, avg_loss_seg_pos=0.06226, avg_loss_dec=0.00201, avg_loss=0.14643, FP=0, FN=0, correct=296/296, in 10.42s/epoch 100 | 2022-09-30 01:05:32 dagm_z_SC_03_except_class4_FOLD_7 Returning seg_loss_weight 1 and dec_loss_weight 1.0 101 | 2022-09-30 01:05:32 dagm_z_SC_03_except_class4_FOLD_7 Returning dec_gradient_multiplier 1 102 | 2022-09-30 01:05:43 dagm_z_SC_03_except_class4_FOLD_7 Epoch 16/25 ==> avg_loss_seg=0.13536, avg_loss_seg_pos=0.06068, avg_loss_dec=0.00136, avg_loss=0.13672, FP=0, FN=0, correct=296/296, in 10.41s/epoch 103 | 2022-09-30 01:05:55 dagm_z_SC_03_except_class4_FOLD_7 VALIDATION || AUC=1.000000, and AP=1.000000, with best thr=0.995330 at f-measure=1.000 and FP=0, FN=0, TOTAL SAMPLES=1150, avg_loss_seg_neg=0.13108, avg_loss_seg_pos=0.24311 104 | 2022-09-30 01:05:55 dagm_z_SC_03_except_class4_FOLD_7 Saving current models state to ./results_new\DAGM\dagm_z_SC_03_except_class4\FOLD_7\models\ep_16.pth 105 | 2022-09-30 01:05:55 dagm_z_SC_03_except_class4_FOLD_7 Returning seg_loss_weight 1 and dec_loss_weight 1.0 106 | 2022-09-30 01:05:55 dagm_z_SC_03_except_class4_FOLD_7 Returning dec_gradient_multiplier 1 107 | 2022-09-30 01:06:05 dagm_z_SC_03_except_class4_FOLD_7 Epoch 17/25 ==> avg_loss_seg=0.12733, avg_loss_seg_pos=0.05905, avg_loss_dec=0.00068, avg_loss=0.12801, FP=0, FN=0, correct=296/296, in 10.42s/epoch 108 | 2022-09-30 01:06:05 dagm_z_SC_03_except_class4_FOLD_7 Returning 
seg_loss_weight 1 and dec_loss_weight 1.0 109 | 2022-09-30 01:06:05 dagm_z_SC_03_except_class4_FOLD_7 Returning dec_gradient_multiplier 1 110 | 2022-09-30 01:06:16 dagm_z_SC_03_except_class4_FOLD_7 Epoch 18/25 ==> avg_loss_seg=0.12024, avg_loss_seg_pos=0.05799, avg_loss_dec=0.00384, avg_loss=0.12408, FP=0, FN=0, correct=296/296, in 10.41s/epoch 111 | 2022-09-30 01:06:16 dagm_z_SC_03_except_class4_FOLD_7 Returning seg_loss_weight 1 and dec_loss_weight 1.0 112 | 2022-09-30 01:06:16 dagm_z_SC_03_except_class4_FOLD_7 Returning dec_gradient_multiplier 1 113 | 2022-09-30 01:06:27 dagm_z_SC_03_except_class4_FOLD_7 Epoch 19/25 ==> avg_loss_seg=0.11379, avg_loss_seg_pos=0.05720, avg_loss_dec=0.00071, avg_loss=0.11450, FP=0, FN=0, correct=296/296, in 10.41s/epoch 114 | 2022-09-30 01:06:38 dagm_z_SC_03_except_class4_FOLD_7 VALIDATION || AUC=1.000000, and AP=1.000000, with best thr=0.997776 at f-measure=1.000 and FP=0, FN=0, TOTAL SAMPLES=1150, avg_loss_seg_neg=0.11070, avg_loss_seg_pos=0.23081 115 | 2022-09-30 01:06:38 dagm_z_SC_03_except_class4_FOLD_7 Saving current models state to ./results_new\DAGM\dagm_z_SC_03_except_class4\FOLD_7\models\ep_19.pth 116 | 2022-09-30 01:06:38 dagm_z_SC_03_except_class4_FOLD_7 Returning seg_loss_weight 1 and dec_loss_weight 1.0 117 | 2022-09-30 01:06:38 dagm_z_SC_03_except_class4_FOLD_7 Returning dec_gradient_multiplier 1 118 | 2022-09-30 01:06:49 dagm_z_SC_03_except_class4_FOLD_7 Epoch 20/25 ==> avg_loss_seg=0.10804, avg_loss_seg_pos=0.05664, avg_loss_dec=0.00680, avg_loss=0.11483, FP=0, FN=1, correct=295/296, in 10.41s/epoch 119 | 2022-09-30 01:06:49 dagm_z_SC_03_except_class4_FOLD_7 Returning seg_loss_weight 1 and dec_loss_weight 1.0 120 | 2022-09-30 01:06:49 dagm_z_SC_03_except_class4_FOLD_7 Returning dec_gradient_multiplier 1 121 | 2022-09-30 01:07:00 dagm_z_SC_03_except_class4_FOLD_7 Epoch 21/25 ==> avg_loss_seg=0.10279, avg_loss_seg_pos=0.05581, avg_loss_dec=0.00208, avg_loss=0.10487, FP=0, FN=0, correct=296/296, in 10.40s/epoch 122 | 2022-09-30 01:07:00 dagm_z_SC_03_except_class4_FOLD_7 Returning seg_loss_weight 1 and dec_loss_weight 1.0 123 | 2022-09-30 01:07:00 dagm_z_SC_03_except_class4_FOLD_7 Returning dec_gradient_multiplier 1 124 | 2022-09-30 01:07:10 dagm_z_SC_03_except_class4_FOLD_7 Epoch 22/25 ==> avg_loss_seg=0.09797, avg_loss_seg_pos=0.05483, avg_loss_dec=0.00073, avg_loss=0.09870, FP=0, FN=0, correct=296/296, in 10.40s/epoch 125 | 2022-09-30 01:07:22 dagm_z_SC_03_except_class4_FOLD_7 VALIDATION || AUC=1.000000, and AP=1.000000, with best thr=0.999938 at f-measure=1.000 and FP=0, FN=0, TOTAL SAMPLES=1150, avg_loss_seg_neg=0.09563, avg_loss_seg_pos=0.22202 126 | 2022-09-30 01:07:22 dagm_z_SC_03_except_class4_FOLD_7 Saving current models state to ./results_new\DAGM\dagm_z_SC_03_except_class4\FOLD_7\models\ep_22.pth 127 | 2022-09-30 01:07:22 dagm_z_SC_03_except_class4_FOLD_7 Returning seg_loss_weight 1 and dec_loss_weight 1.0 128 | 2022-09-30 01:07:22 dagm_z_SC_03_except_class4_FOLD_7 Returning dec_gradient_multiplier 1 129 | 2022-09-30 01:07:33 dagm_z_SC_03_except_class4_FOLD_7 Epoch 23/25 ==> avg_loss_seg=0.09356, avg_loss_seg_pos=0.05366, avg_loss_dec=0.00034, avg_loss=0.09390, FP=0, FN=0, correct=296/296, in 10.41s/epoch 130 | 2022-09-30 01:07:33 dagm_z_SC_03_except_class4_FOLD_7 Returning seg_loss_weight 1 and dec_loss_weight 1.0 131 | 2022-09-30 01:07:33 dagm_z_SC_03_except_class4_FOLD_7 Returning dec_gradient_multiplier 1 132 | 2022-09-30 01:07:43 dagm_z_SC_03_except_class4_FOLD_7 Epoch 24/25 ==> avg_loss_seg=0.08955, avg_loss_seg_pos=0.05264, 
avg_loss_dec=0.00084, avg_loss=0.09039, FP=0, FN=0, correct=296/296, in 10.41s/epoch 133 | 2022-09-30 01:07:43 dagm_z_SC_03_except_class4_FOLD_7 Returning seg_loss_weight 1 and dec_loss_weight 1.0 134 | 2022-09-30 01:07:43 dagm_z_SC_03_except_class4_FOLD_7 Returning dec_gradient_multiplier 1 135 | 2022-09-30 01:07:54 dagm_z_SC_03_except_class4_FOLD_7 Epoch 25/25 ==> avg_loss_seg=0.08583, avg_loss_seg_pos=0.05200, avg_loss_dec=0.00020, avg_loss=0.08603, FP=0, FN=0, correct=296/296, in 10.41s/epoch 136 | 2022-09-30 01:08:06 dagm_z_SC_03_except_class4_FOLD_7 VALIDATION || AUC=1.000000, and AP=1.000000, with best thr=0.999027 at f-measure=1.000 and FP=0, FN=0, TOTAL SAMPLES=1150, avg_loss_seg_neg=0.08401, avg_loss_seg_pos=0.21215 137 | 2022-09-30 01:08:06 dagm_z_SC_03_except_class4_FOLD_7 Saving current models state to ./results_new\DAGM\dagm_z_SC_03_except_class4\FOLD_7\models\ep_25.pth 138 | now best_AP=1.0,epoch=1 139 | 2022-09-30 01:08:06 dagm_z_SC_03_except_class4_FOLD_7 Saving current models state to ./results_new\DAGM\dagm_z_SC_03_except_class4\FOLD_7\models\final_state_dict.pth 140 | 2022-09-30 01:08:06 dagm_z_SC_03_except_class4_FOLD_7 Keeping same model state 141 | dagm_z_SC_03_except_class4_FOLD_7 EVAL AUC=1.000000, and AP=1.000000, w/ best thr=0.999027 at f-m=1.000 and FP=0, FN=0 142 | run_time 15.6min 143 | -------------------------------------------------------------------------------- /results/02_DAGM/dagm_z_SC_03_except_class4/FOLD_8/ROC.pdf: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/Fishdrink/SC-Net/51647e4cd1d665ce074f7a3286420523afd36487/results/02_DAGM/dagm_z_SC_03_except_class4/FOLD_8/ROC.pdf -------------------------------------------------------------------------------- /results/02_DAGM/dagm_z_SC_03_except_class4/FOLD_8/loss_val.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/Fishdrink/SC-Net/51647e4cd1d665ce074f7a3286420523afd36487/results/02_DAGM/dagm_z_SC_03_except_class4/FOLD_8/loss_val.png -------------------------------------------------------------------------------- /results/02_DAGM/dagm_z_SC_03_except_class4/FOLD_8/losses.csv: -------------------------------------------------------------------------------- 1 | validation_data,loss_dec,loss,epoch 2 | 0.6739911420925243,0.24246803286007126,0.9164591749525955,0 3 | 0.5621082927729633,0.011245161944660484,0.5733534547176238,1 4 | 0.4785027077069154,0.17892580408590367,0.657428511792819,2 5 | 0.4145901783092602,0.01728415128623916,0.4318743295954994,3 6 | 0.35906169462848353,0.01892724065456187,0.3779889352830454,4 7 | 0.31812015256366216,0.027851529753722903,0.34597168231738507,5 8 | 0.2820962844668208,0.001970666759990382,0.28406695122681114,6 9 | 0.2530398638667287,0.0007674396541473028,0.25380730352087605,7 10 | 0.22902124797975695,0.0011560222072246786,0.23017727018698164,8 11 | 0.20879765781196388,0.0005915019937674515,0.20938915980573133,9 12 | 0.19157041005186132,0.000438766762739318,0.19200917681460064,10 13 | 0.17677001937015638,0.00048760014984460166,0.177257619520001,11 14 | 0.16392929610368367,0.0003214673160982784,0.16425076341978195,12 15 | 0.15271563787718076,0.0004069100753907618,0.15312254795257152,13 16 | 0.1428275913805575,0.00025099777947714855,0.14307858916003466,14 17 | 0.1340667500689223,0.00020229028956925282,0.13426904035849155,15 18 | 0.1267729065305478,0.0037948210469826212,0.1305677275775304,16 19 | 
0.11960480080263035,0.00032840900791130924,0.11993320981054166,17 20 | 0.11320279719861778,0.0006400795824644061,0.1138428767810822,18 21 | 0.1074646999304359,0.00023496736962811524,0.10769966730006401,19 22 | 0.10242638998740428,0.004302484020135543,0.10672887400753982,20 23 | 0.09763596307586979,0.0006560797907720396,0.09829204286664182,21 24 | 0.09325684546618848,0.0004142363627702375,0.09367108182895872,22 25 | 0.08925379571076986,0.0002471864498000139,0.08950098216056987,23 26 | 0.08557288348674774,0.0003764948677771089,0.08594937835452485,24 27 | -------------------------------------------------------------------------------- /results/02_DAGM/dagm_z_SC_03_except_class4/FOLD_8/precision-recall.pdf: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/Fishdrink/SC-Net/51647e4cd1d665ce074f7a3286420523afd36487/results/02_DAGM/dagm_z_SC_03_except_class4/FOLD_8/precision-recall.pdf -------------------------------------------------------------------------------- /results/02_DAGM/dagm_z_SC_03_except_class4/FOLD_8/run_params.txt: -------------------------------------------------------------------------------- 1 | AREA_RATIO_MIN:0.9 2 | BATCH_SIZE:8 3 | CONTINUE_LAST_TRAIN:None 4 | DATASET:DAGM 5 | DATASET_PATH:./datasets/DAGM/ 6 | DELTA_CLS_LOSS:1.0 7 | DILATE:0 8 | DROPOUT_P:0.2 9 | DYN_BALANCED_LOSS:False 10 | EPOCHS:25 11 | FOLD:8 12 | FREQUENCY_SAMPLING:True 13 | GPU:0 14 | GRADIENT_ADJUSTMENT:False 15 | INPUT_CHANNELS:1 16 | INPUT_HEIGHT:512 17 | INPUT_WIDTH:512 18 | LEARNING_RATE:0.04 19 | LOSS_SEG_THR:0.02 20 | MEMORY_FIT:8 21 | MODEL_NAME:models_trans10 22 | NUM_SEGMENTED:0 23 | ON_DEMAND_READ:False 24 | OPTIMIZER:SGD 25 | REPRODUCIBLE_RUN:True 26 | RESULTS_PATH:./results_new 27 | SAMPLING:half_mixed 28 | SAVE_IMAGES:True 29 | TRAIN_NUM:None 30 | TRANS_BRIGHT:1.0 31 | TRANS_KEEP_LOOP:1 32 | TRANS_NUM:4 33 | USE_BEST_MODEL:False 34 | VALIDATE:True 35 | VALIDATE_ON_TEST:True 36 | VALIDATION_N_EPOCHS:3 37 | VOLUME_CFG:None 38 | WEIGHTED_SEG_LOSS:False 39 | WEIGHTED_SEG_LOSS_MAX:10.0 40 | WEIGHTED_SEG_LOSS_P:1.0 41 | -------------------------------------------------------------------------------- /results/02_DAGM/dagm_z_SC_03_except_class4/FOLD_8/training_log.txt: -------------------------------------------------------------------------------- 1 | 2022-09-30 01:16:47 dagm_z_SC_03_except_class4_FOLD_8 Executing run with path ./results_new\DAGM\dagm_z_SC_03_except_class4\FOLD_8 2 | 2022-09-30 01:16:47 dagm_z_SC_03_except_class4_FOLD_8 AREA_RATIO_MIN : 0.9 3 | 2022-09-30 01:16:47 dagm_z_SC_03_except_class4_FOLD_8 BATCH_SIZE : 8 4 | 2022-09-30 01:16:47 dagm_z_SC_03_except_class4_FOLD_8 CONTINUE_LAST_TRAIN : None 5 | 2022-09-30 01:16:47 dagm_z_SC_03_except_class4_FOLD_8 DATASET : DAGM 6 | 2022-09-30 01:16:47 dagm_z_SC_03_except_class4_FOLD_8 DATASET_PATH : ./datasets/DAGM/ 7 | 2022-09-30 01:16:47 dagm_z_SC_03_except_class4_FOLD_8 DELTA_CLS_LOSS : 1.0 8 | 2022-09-30 01:16:47 dagm_z_SC_03_except_class4_FOLD_8 DILATE : 0 9 | 2022-09-30 01:16:47 dagm_z_SC_03_except_class4_FOLD_8 DROPOUT_P : 0.2 10 | 2022-09-30 01:16:47 dagm_z_SC_03_except_class4_FOLD_8 DYN_BALANCED_LOSS : False 11 | 2022-09-30 01:16:47 dagm_z_SC_03_except_class4_FOLD_8 EPOCHS : 25 12 | 2022-09-30 01:16:47 dagm_z_SC_03_except_class4_FOLD_8 FOLD : 8 13 | 2022-09-30 01:16:47 dagm_z_SC_03_except_class4_FOLD_8 FREQUENCY_SAMPLING : True 14 | 2022-09-30 01:16:47 dagm_z_SC_03_except_class4_FOLD_8 GPU : 0 15 | 2022-09-30 01:16:47 dagm_z_SC_03_except_class4_FOLD_8 
GRADIENT_ADJUSTMENT : False 16 | 2022-09-30 01:16:47 dagm_z_SC_03_except_class4_FOLD_8 INPUT_CHANNELS : 1 17 | 2022-09-30 01:16:47 dagm_z_SC_03_except_class4_FOLD_8 INPUT_HEIGHT : 512 18 | 2022-09-30 01:16:47 dagm_z_SC_03_except_class4_FOLD_8 INPUT_WIDTH : 512 19 | 2022-09-30 01:16:47 dagm_z_SC_03_except_class4_FOLD_8 LEARNING_RATE : 0.04 20 | 2022-09-30 01:16:47 dagm_z_SC_03_except_class4_FOLD_8 LOSS_SEG_THR : 0.02 21 | 2022-09-30 01:16:47 dagm_z_SC_03_except_class4_FOLD_8 MEMORY_FIT : 8 22 | 2022-09-30 01:16:47 dagm_z_SC_03_except_class4_FOLD_8 MODEL_NAME : models_trans10 23 | 2022-09-30 01:16:47 dagm_z_SC_03_except_class4_FOLD_8 NUM_SEGMENTED : 0 24 | 2022-09-30 01:16:47 dagm_z_SC_03_except_class4_FOLD_8 ON_DEMAND_READ : False 25 | 2022-09-30 01:16:47 dagm_z_SC_03_except_class4_FOLD_8 OPTIMIZER : SGD 26 | 2022-09-30 01:16:47 dagm_z_SC_03_except_class4_FOLD_8 REPRODUCIBLE_RUN : True 27 | 2022-09-30 01:16:47 dagm_z_SC_03_except_class4_FOLD_8 RESULTS_PATH : ./results_new 28 | 2022-09-30 01:16:47 dagm_z_SC_03_except_class4_FOLD_8 SAMPLING : half_mixed 29 | 2022-09-30 01:16:47 dagm_z_SC_03_except_class4_FOLD_8 SAVE_IMAGES : True 30 | 2022-09-30 01:16:47 dagm_z_SC_03_except_class4_FOLD_8 TRAIN_NUM : None 31 | 2022-09-30 01:16:47 dagm_z_SC_03_except_class4_FOLD_8 TRANS_BRIGHT : 1.0 32 | 2022-09-30 01:16:47 dagm_z_SC_03_except_class4_FOLD_8 TRANS_KEEP_LOOP : 1 33 | 2022-09-30 01:16:47 dagm_z_SC_03_except_class4_FOLD_8 TRANS_NUM : 4 34 | 2022-09-30 01:16:47 dagm_z_SC_03_except_class4_FOLD_8 USE_BEST_MODEL : False 35 | 2022-09-30 01:16:47 dagm_z_SC_03_except_class4_FOLD_8 VALIDATE : True 36 | 2022-09-30 01:16:47 dagm_z_SC_03_except_class4_FOLD_8 VALIDATE_ON_TEST : True 37 | 2022-09-30 01:16:47 dagm_z_SC_03_except_class4_FOLD_8 VALIDATION_N_EPOCHS : 3 38 | 2022-09-30 01:16:47 dagm_z_SC_03_except_class4_FOLD_8 VOLUME_CFG : None 39 | 2022-09-30 01:16:47 dagm_z_SC_03_except_class4_FOLD_8 WEIGHTED_SEG_LOSS : False 40 | 2022-09-30 01:16:47 dagm_z_SC_03_except_class4_FOLD_8 WEIGHTED_SEG_LOSS_MAX : 10.0 41 | 2022-09-30 01:16:47 dagm_z_SC_03_except_class4_FOLD_8 WEIGHTED_SEG_LOSS_P : 1.0 42 | 2022-09-30 01:16:47 dagm_z_SC_03_except_class4_FOLD_8 Reproducible run, fixing all seeds to:1337 43 | This is models_trans10 44 | 2022-09-30 01:17:30 dagm_z_SC_03_except_class4_FOLD_8 Returning seg_loss_weight 1 and dec_loss_weight 1.0 45 | 2022-09-30 01:17:30 dagm_z_SC_03_except_class4_FOLD_8 Returning dec_gradient_multiplier 1 46 | 2022-09-30 01:17:42 dagm_z_SC_03_except_class4_FOLD_8 Epoch 1/25 ==> avg_loss_seg=0.67399, avg_loss_seg_pos=0.16749, avg_loss_dec=0.24247, avg_loss=0.91646, FP=16, FN=20, correct=260/296, in 11.96s/epoch 47 | 2022-09-30 01:17:54 dagm_z_SC_03_except_class4_FOLD_8 VALIDATION || AUC=1.000000, and AP=1.000000, with best thr=0.997533 at f-measure=1.000 and FP=0, FN=0, TOTAL SAMPLES=1150, avg_loss_seg_neg=0.61509, avg_loss_seg_pos=0.60826 48 | 2022-09-30 01:17:54 dagm_z_SC_03_except_class4_FOLD_8 Saving current models state to ./results_new\DAGM\dagm_z_SC_03_except_class4\FOLD_8\models\ep_01.pth 49 | 2022-09-30 01:17:54 dagm_z_SC_03_except_class4_FOLD_8 Saving current models state to ./results_new\DAGM\dagm_z_SC_03_except_class4\FOLD_8\models\best_state_dict.pth 50 | 2022-09-30 01:17:54 dagm_z_SC_03_except_class4_FOLD_8 Returning seg_loss_weight 1 and dec_loss_weight 1.0 51 | 2022-09-30 01:17:54 dagm_z_SC_03_except_class4_FOLD_8 Returning dec_gradient_multiplier 1 52 | 2022-09-30 01:18:04 dagm_z_SC_03_except_class4_FOLD_8 Epoch 2/25 ==> avg_loss_seg=0.56211, avg_loss_seg_pos=0.13917, 
avg_loss_dec=0.01125, avg_loss=0.57335, FP=0, FN=0, correct=296/296, in 10.29s/epoch 53 | 2022-09-30 01:18:04 dagm_z_SC_03_except_class4_FOLD_8 Returning seg_loss_weight 1 and dec_loss_weight 1.0 54 | 2022-09-30 01:18:04 dagm_z_SC_03_except_class4_FOLD_8 Returning dec_gradient_multiplier 1 55 | 2022-09-30 01:18:15 dagm_z_SC_03_except_class4_FOLD_8 Epoch 3/25 ==> avg_loss_seg=0.47850, avg_loss_seg_pos=0.11913, avg_loss_dec=0.17893, avg_loss=0.65743, FP=7, FN=5, correct=284/296, in 10.33s/epoch 56 | 2022-09-30 01:18:15 dagm_z_SC_03_except_class4_FOLD_8 Returning seg_loss_weight 1 and dec_loss_weight 1.0 57 | 2022-09-30 01:18:15 dagm_z_SC_03_except_class4_FOLD_8 Returning dec_gradient_multiplier 1 58 | 2022-09-30 01:18:25 dagm_z_SC_03_except_class4_FOLD_8 Epoch 4/25 ==> avg_loss_seg=0.41459, avg_loss_seg_pos=0.10650, avg_loss_dec=0.01728, avg_loss=0.43187, FP=0, FN=1, correct=295/296, in 10.34s/epoch 59 | 2022-09-30 01:18:37 dagm_z_SC_03_except_class4_FOLD_8 VALIDATION || AUC=1.000000, and AP=1.000000, with best thr=0.875329 at f-measure=1.000 and FP=0, FN=0, TOTAL SAMPLES=1150, avg_loss_seg_neg=0.38284, avg_loss_seg_pos=0.39324 60 | 2022-09-30 01:18:37 dagm_z_SC_03_except_class4_FOLD_8 Saving current models state to ./results_new\DAGM\dagm_z_SC_03_except_class4\FOLD_8\models\ep_04.pth 61 | 2022-09-30 01:18:37 dagm_z_SC_03_except_class4_FOLD_8 Returning seg_loss_weight 1 and dec_loss_weight 1.0 62 | 2022-09-30 01:18:37 dagm_z_SC_03_except_class4_FOLD_8 Returning dec_gradient_multiplier 1 63 | 2022-09-30 01:18:48 dagm_z_SC_03_except_class4_FOLD_8 Epoch 5/25 ==> avg_loss_seg=0.35906, avg_loss_seg_pos=0.09213, avg_loss_dec=0.01893, avg_loss=0.37799, FP=0, FN=1, correct=295/296, in 10.37s/epoch 64 | 2022-09-30 01:18:48 dagm_z_SC_03_except_class4_FOLD_8 Returning seg_loss_weight 1 and dec_loss_weight 1.0 65 | 2022-09-30 01:18:48 dagm_z_SC_03_except_class4_FOLD_8 Returning dec_gradient_multiplier 1 66 | 2022-09-30 01:18:58 dagm_z_SC_03_except_class4_FOLD_8 Epoch 6/25 ==> avg_loss_seg=0.31812, avg_loss_seg_pos=0.08090, avg_loss_dec=0.02785, avg_loss=0.34597, FP=2, FN=2, correct=292/296, in 10.37s/epoch 67 | 2022-09-30 01:18:58 dagm_z_SC_03_except_class4_FOLD_8 Returning seg_loss_weight 1 and dec_loss_weight 1.0 68 | 2022-09-30 01:18:58 dagm_z_SC_03_except_class4_FOLD_8 Returning dec_gradient_multiplier 1 69 | 2022-09-30 01:19:09 dagm_z_SC_03_except_class4_FOLD_8 Epoch 7/25 ==> avg_loss_seg=0.28210, avg_loss_seg_pos=0.07185, avg_loss_dec=0.00197, avg_loss=0.28407, FP=0, FN=0, correct=296/296, in 10.36s/epoch 70 | 2022-09-30 01:19:21 dagm_z_SC_03_except_class4_FOLD_8 VALIDATION || AUC=1.000000, and AP=1.000000, with best thr=0.995980 at f-measure=1.000 and FP=0, FN=0, TOTAL SAMPLES=1150, avg_loss_seg_neg=0.26626, avg_loss_seg_pos=0.27176 71 | 2022-09-30 01:19:21 dagm_z_SC_03_except_class4_FOLD_8 Saving current models state to ./results_new\DAGM\dagm_z_SC_03_except_class4\FOLD_8\models\ep_07.pth 72 | 2022-09-30 01:19:21 dagm_z_SC_03_except_class4_FOLD_8 Returning seg_loss_weight 1 and dec_loss_weight 1.0 73 | 2022-09-30 01:19:21 dagm_z_SC_03_except_class4_FOLD_8 Returning dec_gradient_multiplier 1 74 | 2022-09-30 01:19:31 dagm_z_SC_03_except_class4_FOLD_8 Epoch 8/25 ==> avg_loss_seg=0.25304, avg_loss_seg_pos=0.06482, avg_loss_dec=0.00077, avg_loss=0.25381, FP=0, FN=0, correct=296/296, in 10.38s/epoch 75 | 2022-09-30 01:19:31 dagm_z_SC_03_except_class4_FOLD_8 Returning seg_loss_weight 1 and dec_loss_weight 1.0 76 | 2022-09-30 01:19:31 dagm_z_SC_03_except_class4_FOLD_8 Returning dec_gradient_multiplier 1 
77 | 2022-09-30 01:19:42 dagm_z_SC_03_except_class4_FOLD_8 Epoch 9/25 ==> avg_loss_seg=0.22902, avg_loss_seg_pos=0.05901, avg_loss_dec=0.00116, avg_loss=0.23018, FP=0, FN=0, correct=296/296, in 10.38s/epoch 78 | 2022-09-30 01:19:42 dagm_z_SC_03_except_class4_FOLD_8 Returning seg_loss_weight 1 and dec_loss_weight 1.0 79 | 2022-09-30 01:19:42 dagm_z_SC_03_except_class4_FOLD_8 Returning dec_gradient_multiplier 1 80 | 2022-09-30 01:19:53 dagm_z_SC_03_except_class4_FOLD_8 Epoch 10/25 ==> avg_loss_seg=0.20880, avg_loss_seg_pos=0.05414, avg_loss_dec=0.00059, avg_loss=0.20939, FP=0, FN=0, correct=296/296, in 10.38s/epoch 81 | 2022-09-30 01:20:04 dagm_z_SC_03_except_class4_FOLD_8 VALIDATION || AUC=1.000000, and AP=1.000000, with best thr=0.989359 at f-measure=1.000 and FP=0, FN=0, TOTAL SAMPLES=1150, avg_loss_seg_neg=0.19950, avg_loss_seg_pos=0.20715 82 | 2022-09-30 01:20:04 dagm_z_SC_03_except_class4_FOLD_8 Saving current models state to ./results_new\DAGM\dagm_z_SC_03_except_class4\FOLD_8\models\ep_10.pth 83 | 2022-09-30 01:20:04 dagm_z_SC_03_except_class4_FOLD_8 Returning seg_loss_weight 1 and dec_loss_weight 1.0 84 | 2022-09-30 01:20:04 dagm_z_SC_03_except_class4_FOLD_8 Returning dec_gradient_multiplier 1 85 | 2022-09-30 01:20:15 dagm_z_SC_03_except_class4_FOLD_8 Epoch 11/25 ==> avg_loss_seg=0.19157, avg_loss_seg_pos=0.05000, avg_loss_dec=0.00044, avg_loss=0.19201, FP=0, FN=0, correct=296/296, in 10.40s/epoch 86 | 2022-09-30 01:20:15 dagm_z_SC_03_except_class4_FOLD_8 Returning seg_loss_weight 1 and dec_loss_weight 1.0 87 | 2022-09-30 01:20:15 dagm_z_SC_03_except_class4_FOLD_8 Returning dec_gradient_multiplier 1 88 | 2022-09-30 01:20:26 dagm_z_SC_03_except_class4_FOLD_8 Epoch 12/25 ==> avg_loss_seg=0.17677, avg_loss_seg_pos=0.04642, avg_loss_dec=0.00049, avg_loss=0.17726, FP=0, FN=0, correct=296/296, in 10.41s/epoch 89 | 2022-09-30 01:20:26 dagm_z_SC_03_except_class4_FOLD_8 Returning seg_loss_weight 1 and dec_loss_weight 1.0 90 | 2022-09-30 01:20:26 dagm_z_SC_03_except_class4_FOLD_8 Returning dec_gradient_multiplier 1 91 | 2022-09-30 01:20:36 dagm_z_SC_03_except_class4_FOLD_8 Epoch 13/25 ==> avg_loss_seg=0.16393, avg_loss_seg_pos=0.04336, avg_loss_dec=0.00032, avg_loss=0.16425, FP=0, FN=0, correct=296/296, in 10.40s/epoch 92 | 2022-09-30 01:20:48 dagm_z_SC_03_except_class4_FOLD_8 VALIDATION || AUC=1.000000, and AP=1.000000, with best thr=0.990689 at f-measure=1.000 and FP=0, FN=0, TOTAL SAMPLES=1150, avg_loss_seg_neg=0.15792, avg_loss_seg_pos=0.16716 93 | 2022-09-30 01:20:48 dagm_z_SC_03_except_class4_FOLD_8 Saving current models state to ./results_new\DAGM\dagm_z_SC_03_except_class4\FOLD_8\models\ep_13.pth 94 | 2022-09-30 01:20:48 dagm_z_SC_03_except_class4_FOLD_8 Returning seg_loss_weight 1 and dec_loss_weight 1.0 95 | 2022-09-30 01:20:48 dagm_z_SC_03_except_class4_FOLD_8 Returning dec_gradient_multiplier 1 96 | 2022-09-30 01:20:59 dagm_z_SC_03_except_class4_FOLD_8 Epoch 14/25 ==> avg_loss_seg=0.15272, avg_loss_seg_pos=0.04068, avg_loss_dec=0.00041, avg_loss=0.15312, FP=0, FN=0, correct=296/296, in 10.41s/epoch 97 | 2022-09-30 01:20:59 dagm_z_SC_03_except_class4_FOLD_8 Returning seg_loss_weight 1 and dec_loss_weight 1.0 98 | 2022-09-30 01:20:59 dagm_z_SC_03_except_class4_FOLD_8 Returning dec_gradient_multiplier 1 99 | 2022-09-30 01:21:09 dagm_z_SC_03_except_class4_FOLD_8 Epoch 15/25 ==> avg_loss_seg=0.14283, avg_loss_seg_pos=0.03808, avg_loss_dec=0.00025, avg_loss=0.14308, FP=0, FN=0, correct=296/296, in 10.41s/epoch 100 | 2022-09-30 01:21:09 dagm_z_SC_03_except_class4_FOLD_8 Returning 
seg_loss_weight 1 and dec_loss_weight 1.0 101 | 2022-09-30 01:21:09 dagm_z_SC_03_except_class4_FOLD_8 Returning dec_gradient_multiplier 1 102 | 2022-09-30 01:21:20 dagm_z_SC_03_except_class4_FOLD_8 Epoch 16/25 ==> avg_loss_seg=0.13407, avg_loss_seg_pos=0.03614, avg_loss_dec=0.00020, avg_loss=0.13427, FP=0, FN=0, correct=296/296, in 10.41s/epoch 103 | 2022-09-30 01:21:32 dagm_z_SC_03_except_class4_FOLD_8 VALIDATION || AUC=1.000000, and AP=1.000000, with best thr=0.980788 at f-measure=1.000 and FP=0, FN=0, TOTAL SAMPLES=1150, avg_loss_seg_neg=0.12992, avg_loss_seg_pos=0.14011 104 | 2022-09-30 01:21:32 dagm_z_SC_03_except_class4_FOLD_8 Saving current models state to ./results_new\DAGM\dagm_z_SC_03_except_class4\FOLD_8\models\ep_16.pth 105 | 2022-09-30 01:21:32 dagm_z_SC_03_except_class4_FOLD_8 Returning seg_loss_weight 1 and dec_loss_weight 1.0 106 | 2022-09-30 01:21:32 dagm_z_SC_03_except_class4_FOLD_8 Returning dec_gradient_multiplier 1 107 | 2022-09-30 01:21:42 dagm_z_SC_03_except_class4_FOLD_8 Epoch 17/25 ==> avg_loss_seg=0.12677, avg_loss_seg_pos=0.03430, avg_loss_dec=0.00379, avg_loss=0.13057, FP=0, FN=0, correct=296/296, in 10.41s/epoch 108 | 2022-09-30 01:21:42 dagm_z_SC_03_except_class4_FOLD_8 Returning seg_loss_weight 1 and dec_loss_weight 1.0 109 | 2022-09-30 01:21:42 dagm_z_SC_03_except_class4_FOLD_8 Returning dec_gradient_multiplier 1 110 | 2022-09-30 01:21:53 dagm_z_SC_03_except_class4_FOLD_8 Epoch 18/25 ==> avg_loss_seg=0.11960, avg_loss_seg_pos=0.03267, avg_loss_dec=0.00033, avg_loss=0.11993, FP=0, FN=0, correct=296/296, in 10.40s/epoch 111 | 2022-09-30 01:21:53 dagm_z_SC_03_except_class4_FOLD_8 Returning seg_loss_weight 1 and dec_loss_weight 1.0 112 | 2022-09-30 01:21:53 dagm_z_SC_03_except_class4_FOLD_8 Returning dec_gradient_multiplier 1 113 | 2022-09-30 01:22:04 dagm_z_SC_03_except_class4_FOLD_8 Epoch 19/25 ==> avg_loss_seg=0.11320, avg_loss_seg_pos=0.03116, avg_loss_dec=0.00064, avg_loss=0.11384, FP=0, FN=0, correct=296/296, in 10.40s/epoch 114 | 2022-09-30 01:22:16 dagm_z_SC_03_except_class4_FOLD_8 VALIDATION || AUC=1.000000, and AP=1.000000, with best thr=0.999375 at f-measure=1.000 and FP=0, FN=0, TOTAL SAMPLES=1150, avg_loss_seg_neg=0.11018, avg_loss_seg_pos=0.12112 115 | 2022-09-30 01:22:16 dagm_z_SC_03_except_class4_FOLD_8 Saving current models state to ./results_new\DAGM\dagm_z_SC_03_except_class4\FOLD_8\models\ep_19.pth 116 | 2022-09-30 01:22:16 dagm_z_SC_03_except_class4_FOLD_8 Returning seg_loss_weight 1 and dec_loss_weight 1.0 117 | 2022-09-30 01:22:16 dagm_z_SC_03_except_class4_FOLD_8 Returning dec_gradient_multiplier 1 118 | 2022-09-30 01:22:26 dagm_z_SC_03_except_class4_FOLD_8 Epoch 20/25 ==> avg_loss_seg=0.10746, avg_loss_seg_pos=0.02983, avg_loss_dec=0.00023, avg_loss=0.10770, FP=0, FN=0, correct=296/296, in 10.40s/epoch 119 | 2022-09-30 01:22:26 dagm_z_SC_03_except_class4_FOLD_8 Returning seg_loss_weight 1 and dec_loss_weight 1.0 120 | 2022-09-30 01:22:26 dagm_z_SC_03_except_class4_FOLD_8 Returning dec_gradient_multiplier 1 121 | 2022-09-30 01:22:37 dagm_z_SC_03_except_class4_FOLD_8 Epoch 21/25 ==> avg_loss_seg=0.10243, avg_loss_seg_pos=0.02873, avg_loss_dec=0.00430, avg_loss=0.10673, FP=0, FN=1, correct=295/296, in 10.39s/epoch 122 | 2022-09-30 01:22:37 dagm_z_SC_03_except_class4_FOLD_8 Returning seg_loss_weight 1 and dec_loss_weight 1.0 123 | 2022-09-30 01:22:37 dagm_z_SC_03_except_class4_FOLD_8 Returning dec_gradient_multiplier 1 124 | 2022-09-30 01:22:47 dagm_z_SC_03_except_class4_FOLD_8 Epoch 22/25 ==> avg_loss_seg=0.09764, avg_loss_seg_pos=0.02762, 
avg_loss_dec=0.00066, avg_loss=0.09829, FP=0, FN=0, correct=296/296, in 10.40s/epoch 125 | 2022-09-30 01:22:59 dagm_z_SC_03_except_class4_FOLD_8 VALIDATION || AUC=1.000000, and AP=1.000000, with best thr=0.999998 at f-measure=1.000 and FP=0, FN=0, TOTAL SAMPLES=1150, avg_loss_seg_neg=0.09532, avg_loss_seg_pos=0.10744 126 | 2022-09-30 01:22:59 dagm_z_SC_03_except_class4_FOLD_8 Saving current models state to ./results_new\DAGM\dagm_z_SC_03_except_class4\FOLD_8\models\ep_22.pth 127 | 2022-09-30 01:22:59 dagm_z_SC_03_except_class4_FOLD_8 Returning seg_loss_weight 1 and dec_loss_weight 1.0 128 | 2022-09-30 01:22:59 dagm_z_SC_03_except_class4_FOLD_8 Returning dec_gradient_multiplier 1 129 | 2022-09-30 01:23:10 dagm_z_SC_03_except_class4_FOLD_8 Epoch 23/25 ==> avg_loss_seg=0.09326, avg_loss_seg_pos=0.02660, avg_loss_dec=0.00041, avg_loss=0.09367, FP=0, FN=0, correct=296/296, in 10.39s/epoch 130 | 2022-09-30 01:23:10 dagm_z_SC_03_except_class4_FOLD_8 Returning seg_loss_weight 1 and dec_loss_weight 1.0 131 | 2022-09-30 01:23:10 dagm_z_SC_03_except_class4_FOLD_8 Returning dec_gradient_multiplier 1 132 | 2022-09-30 01:23:20 dagm_z_SC_03_except_class4_FOLD_8 Epoch 24/25 ==> avg_loss_seg=0.08925, avg_loss_seg_pos=0.02564, avg_loss_dec=0.00025, avg_loss=0.08950, FP=0, FN=0, correct=296/296, in 10.40s/epoch 133 | 2022-09-30 01:23:20 dagm_z_SC_03_except_class4_FOLD_8 Returning seg_loss_weight 1 and dec_loss_weight 1.0 134 | 2022-09-30 01:23:20 dagm_z_SC_03_except_class4_FOLD_8 Returning dec_gradient_multiplier 1 135 | 2022-09-30 01:23:31 dagm_z_SC_03_except_class4_FOLD_8 Epoch 25/25 ==> avg_loss_seg=0.08557, avg_loss_seg_pos=0.02476, avg_loss_dec=0.00038, avg_loss=0.08595, FP=0, FN=0, correct=296/296, in 10.40s/epoch 136 | 2022-09-30 01:23:43 dagm_z_SC_03_except_class4_FOLD_8 VALIDATION || AUC=1.000000, and AP=1.000000, with best thr=0.999987 at f-measure=1.000 and FP=0, FN=0, TOTAL SAMPLES=1150, avg_loss_seg_neg=0.08378, avg_loss_seg_pos=0.09639 137 | 2022-09-30 01:23:43 dagm_z_SC_03_except_class4_FOLD_8 Saving current models state to ./results_new\DAGM\dagm_z_SC_03_except_class4\FOLD_8\models\ep_25.pth 138 | now best_AP=1.0,epoch=1 139 | 2022-09-30 01:23:43 dagm_z_SC_03_except_class4_FOLD_8 Saving current models state to ./results_new\DAGM\dagm_z_SC_03_except_class4\FOLD_8\models\final_state_dict.pth 140 | 2022-09-30 01:23:43 dagm_z_SC_03_except_class4_FOLD_8 Keeping same model state 141 | dagm_z_SC_03_except_class4_FOLD_8 EVAL AUC=1.000000, and AP=1.000000, w/ best thr=0.999987 at f-m=1.000 and FP=0, FN=0 142 | run_time 15.6min 143 | -------------------------------------------------------------------------------- /results/02_DAGM/dagm_z_SC_03_except_class4/FOLD_9/ROC.pdf: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/Fishdrink/SC-Net/51647e4cd1d665ce074f7a3286420523afd36487/results/02_DAGM/dagm_z_SC_03_except_class4/FOLD_9/ROC.pdf -------------------------------------------------------------------------------- /results/02_DAGM/dagm_z_SC_03_except_class4/FOLD_9/loss_val.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/Fishdrink/SC-Net/51647e4cd1d665ce074f7a3286420523afd36487/results/02_DAGM/dagm_z_SC_03_except_class4/FOLD_9/loss_val.png -------------------------------------------------------------------------------- /results/02_DAGM/dagm_z_SC_03_except_class4/FOLD_9/losses.csv: -------------------------------------------------------------------------------- 1 | 
validation_data,loss_dec,loss,epoch 2 | 0.6830731936403223,0.5794860360187453,1.2625592296590675,0 3 | 0.566590703822471,0.09575863162408003,0.6623493354465511,1 4 | 0.481217664641303,0.07594896255790987,0.5571666271992128,2 5 | 0.4159732348210103,0.09838250083093708,0.5143557356519474,3 6 | 0.3620442111749907,0.027968956506811082,0.3900131676818018,4 7 | 0.31712683793660756,0.018462532781995833,0.3355893707186034,5 8 | 0.2810749492129764,0.006190091794721682,0.28726504100769806,6 9 | 0.2519757260341902,0.0026194871937371537,0.2545952132279274,7 10 | 0.22833255899919047,0.011442697754859723,0.2397752567540502,8 11 | 0.21136614237282728,0.05198056988632055,0.26334671225914785,9 12 | 0.19162553629359683,0.006463580823037773,0.1980891171166346,10 13 | 0.17832333936884598,0.019457862255434076,0.19778120162428006,11 14 | 0.16442884504795074,0.0048434259806526825,0.16927227102860343,12 15 | 0.15341919056467107,0.015865190693246503,0.16928438125791756,13 16 | 0.1438034233209249,0.018717617326851847,0.16252104064777675,14 17 | 0.1337150625280432,0.017525620152465872,0.15124068268050905,15 18 | 0.12571123806205955,0.03022011903168105,0.15593135709374062,16 19 | 0.11827846737326803,0.001982119252175294,0.12026058662544332,17 20 | 0.11189246097126522,0.0032418773885552043,0.11513433835982043,18 21 | 0.11105167603976018,0.288485762457099,0.3995374384968592,19 22 | 0.10332328342908137,0.01353186313566324,0.1168551465647446,20 23 | 0.09776088756483954,0.04102770993297265,0.1387885974978122,21 24 | 0.09317803503693761,0.011780092814414939,0.10495812785135256,22 25 | 0.08887182478163694,0.02615959064821033,0.11503141542984727,23 26 | 0.08497032100284421,0.002813825197360554,0.08778414620020476,24 27 | -------------------------------------------------------------------------------- /results/02_DAGM/dagm_z_SC_03_except_class4/FOLD_9/precision-recall.pdf: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/Fishdrink/SC-Net/51647e4cd1d665ce074f7a3286420523afd36487/results/02_DAGM/dagm_z_SC_03_except_class4/FOLD_9/precision-recall.pdf -------------------------------------------------------------------------------- /results/02_DAGM/dagm_z_SC_03_except_class4/FOLD_9/run_params.txt: -------------------------------------------------------------------------------- 1 | AREA_RATIO_MIN:0.9 2 | BATCH_SIZE:8 3 | CONTINUE_LAST_TRAIN:None 4 | DATASET:DAGM 5 | DATASET_PATH:./datasets/DAGM/ 6 | DELTA_CLS_LOSS:1.0 7 | DILATE:0 8 | DROPOUT_P:0.2 9 | DYN_BALANCED_LOSS:False 10 | EPOCHS:25 11 | FOLD:9 12 | FREQUENCY_SAMPLING:True 13 | GPU:0 14 | GRADIENT_ADJUSTMENT:False 15 | INPUT_CHANNELS:1 16 | INPUT_HEIGHT:512 17 | INPUT_WIDTH:512 18 | LEARNING_RATE:0.04 19 | LOSS_SEG_THR:0.02 20 | MEMORY_FIT:8 21 | MODEL_NAME:models_trans10 22 | NUM_SEGMENTED:0 23 | ON_DEMAND_READ:False 24 | OPTIMIZER:SGD 25 | REPRODUCIBLE_RUN:True 26 | RESULTS_PATH:./results_new 27 | SAMPLING:half_mixed 28 | SAVE_IMAGES:True 29 | TRAIN_NUM:None 30 | TRANS_BRIGHT:1.0 31 | TRANS_KEEP_LOOP:1 32 | TRANS_NUM:4 33 | USE_BEST_MODEL:False 34 | VALIDATE:True 35 | VALIDATE_ON_TEST:True 36 | VALIDATION_N_EPOCHS:3 37 | VOLUME_CFG:None 38 | WEIGHTED_SEG_LOSS:False 39 | WEIGHTED_SEG_LOSS_MAX:10.0 40 | WEIGHTED_SEG_LOSS_P:1.0 41 | -------------------------------------------------------------------------------- /results/02_DAGM/dagm_z_SC_03_except_class4/FOLD_9/training_log.txt: -------------------------------------------------------------------------------- 1 | 2022-09-30 01:32:26 
dagm_z_SC_03_except_class4_FOLD_9 Executing run with path ./results_new\DAGM\dagm_z_SC_03_except_class4\FOLD_9 2 | 2022-09-30 01:32:26 dagm_z_SC_03_except_class4_FOLD_9 AREA_RATIO_MIN : 0.9 3 | 2022-09-30 01:32:26 dagm_z_SC_03_except_class4_FOLD_9 BATCH_SIZE : 8 4 | 2022-09-30 01:32:26 dagm_z_SC_03_except_class4_FOLD_9 CONTINUE_LAST_TRAIN : None 5 | 2022-09-30 01:32:26 dagm_z_SC_03_except_class4_FOLD_9 DATASET : DAGM 6 | 2022-09-30 01:32:26 dagm_z_SC_03_except_class4_FOLD_9 DATASET_PATH : ./datasets/DAGM/ 7 | 2022-09-30 01:32:26 dagm_z_SC_03_except_class4_FOLD_9 DELTA_CLS_LOSS : 1.0 8 | 2022-09-30 01:32:26 dagm_z_SC_03_except_class4_FOLD_9 DILATE : 0 9 | 2022-09-30 01:32:26 dagm_z_SC_03_except_class4_FOLD_9 DROPOUT_P : 0.2 10 | 2022-09-30 01:32:26 dagm_z_SC_03_except_class4_FOLD_9 DYN_BALANCED_LOSS : False 11 | 2022-09-30 01:32:26 dagm_z_SC_03_except_class4_FOLD_9 EPOCHS : 25 12 | 2022-09-30 01:32:26 dagm_z_SC_03_except_class4_FOLD_9 FOLD : 9 13 | 2022-09-30 01:32:26 dagm_z_SC_03_except_class4_FOLD_9 FREQUENCY_SAMPLING : True 14 | 2022-09-30 01:32:26 dagm_z_SC_03_except_class4_FOLD_9 GPU : 0 15 | 2022-09-30 01:32:26 dagm_z_SC_03_except_class4_FOLD_9 GRADIENT_ADJUSTMENT : False 16 | 2022-09-30 01:32:26 dagm_z_SC_03_except_class4_FOLD_9 INPUT_CHANNELS : 1 17 | 2022-09-30 01:32:26 dagm_z_SC_03_except_class4_FOLD_9 INPUT_HEIGHT : 512 18 | 2022-09-30 01:32:26 dagm_z_SC_03_except_class4_FOLD_9 INPUT_WIDTH : 512 19 | 2022-09-30 01:32:26 dagm_z_SC_03_except_class4_FOLD_9 LEARNING_RATE : 0.04 20 | 2022-09-30 01:32:26 dagm_z_SC_03_except_class4_FOLD_9 LOSS_SEG_THR : 0.02 21 | 2022-09-30 01:32:26 dagm_z_SC_03_except_class4_FOLD_9 MEMORY_FIT : 8 22 | 2022-09-30 01:32:26 dagm_z_SC_03_except_class4_FOLD_9 MODEL_NAME : models_trans10 23 | 2022-09-30 01:32:26 dagm_z_SC_03_except_class4_FOLD_9 NUM_SEGMENTED : 0 24 | 2022-09-30 01:32:26 dagm_z_SC_03_except_class4_FOLD_9 ON_DEMAND_READ : False 25 | 2022-09-30 01:32:26 dagm_z_SC_03_except_class4_FOLD_9 OPTIMIZER : SGD 26 | 2022-09-30 01:32:26 dagm_z_SC_03_except_class4_FOLD_9 REPRODUCIBLE_RUN : True 27 | 2022-09-30 01:32:26 dagm_z_SC_03_except_class4_FOLD_9 RESULTS_PATH : ./results_new 28 | 2022-09-30 01:32:26 dagm_z_SC_03_except_class4_FOLD_9 SAMPLING : half_mixed 29 | 2022-09-30 01:32:26 dagm_z_SC_03_except_class4_FOLD_9 SAVE_IMAGES : True 30 | 2022-09-30 01:32:26 dagm_z_SC_03_except_class4_FOLD_9 TRAIN_NUM : None 31 | 2022-09-30 01:32:26 dagm_z_SC_03_except_class4_FOLD_9 TRANS_BRIGHT : 1.0 32 | 2022-09-30 01:32:26 dagm_z_SC_03_except_class4_FOLD_9 TRANS_KEEP_LOOP : 1 33 | 2022-09-30 01:32:26 dagm_z_SC_03_except_class4_FOLD_9 TRANS_NUM : 4 34 | 2022-09-30 01:32:26 dagm_z_SC_03_except_class4_FOLD_9 USE_BEST_MODEL : False 35 | 2022-09-30 01:32:26 dagm_z_SC_03_except_class4_FOLD_9 VALIDATE : True 36 | 2022-09-30 01:32:26 dagm_z_SC_03_except_class4_FOLD_9 VALIDATE_ON_TEST : True 37 | 2022-09-30 01:32:26 dagm_z_SC_03_except_class4_FOLD_9 VALIDATION_N_EPOCHS : 3 38 | 2022-09-30 01:32:26 dagm_z_SC_03_except_class4_FOLD_9 VOLUME_CFG : None 39 | 2022-09-30 01:32:26 dagm_z_SC_03_except_class4_FOLD_9 WEIGHTED_SEG_LOSS : False 40 | 2022-09-30 01:32:26 dagm_z_SC_03_except_class4_FOLD_9 WEIGHTED_SEG_LOSS_MAX : 10.0 41 | 2022-09-30 01:32:26 dagm_z_SC_03_except_class4_FOLD_9 WEIGHTED_SEG_LOSS_P : 1.0 42 | 2022-09-30 01:32:26 dagm_z_SC_03_except_class4_FOLD_9 Reproducible run, fixing all seeds to:1337 43 | This is models_trans10 44 | 2022-09-30 01:33:08 dagm_z_SC_03_except_class4_FOLD_9 Returning seg_loss_weight 1 and dec_loss_weight 1.0 45 | 2022-09-30 01:33:08 
dagm_z_SC_03_except_class4_FOLD_9 Returning dec_gradient_multiplier 1 46 | 2022-09-30 01:33:20 dagm_z_SC_03_except_class4_FOLD_9 Epoch 1/25 ==> avg_loss_seg=0.68307, avg_loss_seg_pos=0.16937, avg_loss_dec=0.57949, avg_loss=1.26256, FP=45, FN=38, correct=213/296, in 11.86s/epoch 47 | 2022-09-30 01:33:32 dagm_z_SC_03_except_class4_FOLD_9 VALIDATION || AUC=1.000000, and AP=1.000000, with best thr=0.253642 at f-measure=1.000 and FP=0, FN=0, TOTAL SAMPLES=1150, avg_loss_seg_neg=0.61258, avg_loss_seg_pos=0.60568 48 | 2022-09-30 01:33:32 dagm_z_SC_03_except_class4_FOLD_9 Saving current models state to ./results_new\DAGM\dagm_z_SC_03_except_class4\FOLD_9\models\ep_01.pth 49 | 2022-09-30 01:33:32 dagm_z_SC_03_except_class4_FOLD_9 Saving current models state to ./results_new\DAGM\dagm_z_SC_03_except_class4\FOLD_9\models\best_state_dict.pth 50 | 2022-09-30 01:33:32 dagm_z_SC_03_except_class4_FOLD_9 Returning seg_loss_weight 1 and dec_loss_weight 1.0 51 | 2022-09-30 01:33:32 dagm_z_SC_03_except_class4_FOLD_9 Returning dec_gradient_multiplier 1 52 | 2022-09-30 01:33:43 dagm_z_SC_03_except_class4_FOLD_9 Epoch 2/25 ==> avg_loss_seg=0.56659, avg_loss_seg_pos=0.13948, avg_loss_dec=0.09576, avg_loss=0.66235, FP=1, FN=6, correct=289/296, in 10.31s/epoch 53 | 2022-09-30 01:33:43 dagm_z_SC_03_except_class4_FOLD_9 Returning seg_loss_weight 1 and dec_loss_weight 1.0 54 | 2022-09-30 01:33:43 dagm_z_SC_03_except_class4_FOLD_9 Returning dec_gradient_multiplier 1 55 | 2022-09-30 01:33:53 dagm_z_SC_03_except_class4_FOLD_9 Epoch 3/25 ==> avg_loss_seg=0.48122, avg_loss_seg_pos=0.11873, avg_loss_dec=0.07595, avg_loss=0.55717, FP=0, FN=6, correct=290/296, in 10.34s/epoch 56 | 2022-09-30 01:33:53 dagm_z_SC_03_except_class4_FOLD_9 Returning seg_loss_weight 1 and dec_loss_weight 1.0 57 | 2022-09-30 01:33:53 dagm_z_SC_03_except_class4_FOLD_9 Returning dec_gradient_multiplier 1 58 | 2022-09-30 01:34:04 dagm_z_SC_03_except_class4_FOLD_9 Epoch 4/25 ==> avg_loss_seg=0.41597, avg_loss_seg_pos=0.10264, avg_loss_dec=0.09838, avg_loss=0.51436, FP=3, FN=8, correct=285/296, in 10.34s/epoch 59 | 2022-09-30 01:34:16 dagm_z_SC_03_except_class4_FOLD_9 VALIDATION || AUC=1.000000, and AP=1.000000, with best thr=0.854182 at f-measure=1.000 and FP=0, FN=0, TOTAL SAMPLES=1150, avg_loss_seg_neg=0.38576, avg_loss_seg_pos=0.37996 60 | 2022-09-30 01:34:16 dagm_z_SC_03_except_class4_FOLD_9 Saving current models state to ./results_new\DAGM\dagm_z_SC_03_except_class4\FOLD_9\models\ep_04.pth 61 | 2022-09-30 01:34:16 dagm_z_SC_03_except_class4_FOLD_9 Returning seg_loss_weight 1 and dec_loss_weight 1.0 62 | 2022-09-30 01:34:16 dagm_z_SC_03_except_class4_FOLD_9 Returning dec_gradient_multiplier 1 63 | 2022-09-30 01:34:26 dagm_z_SC_03_except_class4_FOLD_9 Epoch 5/25 ==> avg_loss_seg=0.36204, avg_loss_seg_pos=0.08916, avg_loss_dec=0.02797, avg_loss=0.39001, FP=0, FN=3, correct=293/296, in 10.39s/epoch 64 | 2022-09-30 01:34:26 dagm_z_SC_03_except_class4_FOLD_9 Returning seg_loss_weight 1 and dec_loss_weight 1.0 65 | 2022-09-30 01:34:26 dagm_z_SC_03_except_class4_FOLD_9 Returning dec_gradient_multiplier 1 66 | 2022-09-30 01:34:37 dagm_z_SC_03_except_class4_FOLD_9 Epoch 6/25 ==> avg_loss_seg=0.31713, avg_loss_seg_pos=0.07784, avg_loss_dec=0.01846, avg_loss=0.33559, FP=0, FN=2, correct=294/296, in 10.38s/epoch 67 | 2022-09-30 01:34:37 dagm_z_SC_03_except_class4_FOLD_9 Returning seg_loss_weight 1 and dec_loss_weight 1.0 68 | 2022-09-30 01:34:37 dagm_z_SC_03_except_class4_FOLD_9 Returning dec_gradient_multiplier 1 69 | 2022-09-30 01:34:47 
dagm_z_SC_03_except_class4_FOLD_9 Epoch 7/25 ==> avg_loss_seg=0.28107, avg_loss_seg_pos=0.06955, avg_loss_dec=0.00619, avg_loss=0.28727, FP=0, FN=0, correct=296/296, in 10.38s/epoch 70 | 2022-09-30 01:34:59 dagm_z_SC_03_except_class4_FOLD_9 VALIDATION || AUC=1.000000, and AP=1.000000, with best thr=0.884457 at f-measure=1.000 and FP=0, FN=0, TOTAL SAMPLES=1150, avg_loss_seg_neg=0.26507, avg_loss_seg_pos=0.26403 71 | 2022-09-30 01:34:59 dagm_z_SC_03_except_class4_FOLD_9 Saving current models state to ./results_new\DAGM\dagm_z_SC_03_except_class4\FOLD_9\models\ep_07.pth 72 | 2022-09-30 01:34:59 dagm_z_SC_03_except_class4_FOLD_9 Returning seg_loss_weight 1 and dec_loss_weight 1.0 73 | 2022-09-30 01:34:59 dagm_z_SC_03_except_class4_FOLD_9 Returning dec_gradient_multiplier 1 74 | 2022-09-30 01:35:10 dagm_z_SC_03_except_class4_FOLD_9 Epoch 8/25 ==> avg_loss_seg=0.25198, avg_loss_seg_pos=0.06259, avg_loss_dec=0.00262, avg_loss=0.25460, FP=0, FN=0, correct=296/296, in 10.39s/epoch 75 | 2022-09-30 01:35:10 dagm_z_SC_03_except_class4_FOLD_9 Returning seg_loss_weight 1 and dec_loss_weight 1.0 76 | 2022-09-30 01:35:10 dagm_z_SC_03_except_class4_FOLD_9 Returning dec_gradient_multiplier 1 77 | 2022-09-30 01:35:20 dagm_z_SC_03_except_class4_FOLD_9 Epoch 9/25 ==> avg_loss_seg=0.22833, avg_loss_seg_pos=0.05694, avg_loss_dec=0.01144, avg_loss=0.23978, FP=1, FN=1, correct=294/296, in 10.39s/epoch 78 | 2022-09-30 01:35:20 dagm_z_SC_03_except_class4_FOLD_9 Returning seg_loss_weight 1 and dec_loss_weight 1.0 79 | 2022-09-30 01:35:20 dagm_z_SC_03_except_class4_FOLD_9 Returning dec_gradient_multiplier 1 80 | 2022-09-30 01:35:31 dagm_z_SC_03_except_class4_FOLD_9 Epoch 10/25 ==> avg_loss_seg=0.21137, avg_loss_seg_pos=0.05262, avg_loss_dec=0.05198, avg_loss=0.26335, FP=1, FN=3, correct=292/296, in 10.40s/epoch 81 | 2022-09-30 01:35:43 dagm_z_SC_03_except_class4_FOLD_9 VALIDATION || AUC=1.000000, and AP=1.000000, with best thr=0.164139 at f-measure=1.000 and FP=0, FN=0, TOTAL SAMPLES=1150, avg_loss_seg_neg=0.20012, avg_loss_seg_pos=0.20077 82 | 2022-09-30 01:35:43 dagm_z_SC_03_except_class4_FOLD_9 Saving current models state to ./results_new\DAGM\dagm_z_SC_03_except_class4\FOLD_9\models\ep_10.pth 83 | 2022-09-30 01:35:43 dagm_z_SC_03_except_class4_FOLD_9 Returning seg_loss_weight 1 and dec_loss_weight 1.0 84 | 2022-09-30 01:35:43 dagm_z_SC_03_except_class4_FOLD_9 Returning dec_gradient_multiplier 1 85 | 2022-09-30 01:35:53 dagm_z_SC_03_except_class4_FOLD_9 Epoch 11/25 ==> avg_loss_seg=0.19163, avg_loss_seg_pos=0.04793, avg_loss_dec=0.00646, avg_loss=0.19809, FP=0, FN=0, correct=296/296, in 10.42s/epoch 86 | 2022-09-30 01:35:53 dagm_z_SC_03_except_class4_FOLD_9 Returning seg_loss_weight 1 and dec_loss_weight 1.0 87 | 2022-09-30 01:35:53 dagm_z_SC_03_except_class4_FOLD_9 Returning dec_gradient_multiplier 1 88 | 2022-09-30 01:36:04 dagm_z_SC_03_except_class4_FOLD_9 Epoch 12/25 ==> avg_loss_seg=0.17832, avg_loss_seg_pos=0.04451, avg_loss_dec=0.01946, avg_loss=0.19778, FP=1, FN=1, correct=294/296, in 10.41s/epoch 89 | 2022-09-30 01:36:04 dagm_z_SC_03_except_class4_FOLD_9 Returning seg_loss_weight 1 and dec_loss_weight 1.0 90 | 2022-09-30 01:36:04 dagm_z_SC_03_except_class4_FOLD_9 Returning dec_gradient_multiplier 1 91 | 2022-09-30 01:36:15 dagm_z_SC_03_except_class4_FOLD_9 Epoch 13/25 ==> avg_loss_seg=0.16443, avg_loss_seg_pos=0.04114, avg_loss_dec=0.00484, avg_loss=0.16927, FP=0, FN=0, correct=296/296, in 10.41s/epoch 92 | 2022-09-30 01:36:26 dagm_z_SC_03_except_class4_FOLD_9 VALIDATION || AUC=1.000000, and AP=1.000000, 
with best thr=0.999241 at f-measure=1.000 and FP=0, FN=0, TOTAL SAMPLES=1150, avg_loss_seg_neg=0.15784, avg_loss_seg_pos=0.15843 93 | 2022-09-30 01:36:26 dagm_z_SC_03_except_class4_FOLD_9 Saving current models state to ./results_new\DAGM\dagm_z_SC_03_except_class4\FOLD_9\models\ep_13.pth 94 | 2022-09-30 01:36:26 dagm_z_SC_03_except_class4_FOLD_9 Returning seg_loss_weight 1 and dec_loss_weight 1.0 95 | 2022-09-30 01:36:26 dagm_z_SC_03_except_class4_FOLD_9 Returning dec_gradient_multiplier 1 96 | 2022-09-30 01:36:37 dagm_z_SC_03_except_class4_FOLD_9 Epoch 14/25 ==> avg_loss_seg=0.15342, avg_loss_seg_pos=0.03875, avg_loss_dec=0.01587, avg_loss=0.16928, FP=1, FN=2, correct=293/296, in 10.42s/epoch 97 | 2022-09-30 01:36:37 dagm_z_SC_03_except_class4_FOLD_9 Returning seg_loss_weight 1 and dec_loss_weight 1.0 98 | 2022-09-30 01:36:37 dagm_z_SC_03_except_class4_FOLD_9 Returning dec_gradient_multiplier 1 99 | 2022-09-30 01:36:48 dagm_z_SC_03_except_class4_FOLD_9 Epoch 15/25 ==> avg_loss_seg=0.14380, avg_loss_seg_pos=0.03625, avg_loss_dec=0.01872, avg_loss=0.16252, FP=0, FN=1, correct=295/296, in 10.41s/epoch 100 | 2022-09-30 01:36:48 dagm_z_SC_03_except_class4_FOLD_9 Returning seg_loss_weight 1 and dec_loss_weight 1.0 101 | 2022-09-30 01:36:48 dagm_z_SC_03_except_class4_FOLD_9 Returning dec_gradient_multiplier 1 102 | 2022-09-30 01:36:58 dagm_z_SC_03_except_class4_FOLD_9 Epoch 16/25 ==> avg_loss_seg=0.13372, avg_loss_seg_pos=0.03403, avg_loss_dec=0.01753, avg_loss=0.15124, FP=0, FN=2, correct=294/296, in 10.40s/epoch 103 | 2022-09-30 01:37:10 dagm_z_SC_03_except_class4_FOLD_9 VALIDATION || AUC=1.000000, and AP=1.000000, with best thr=0.994237 at f-measure=1.000 and FP=0, FN=0, TOTAL SAMPLES=1150, avg_loss_seg_neg=0.12934, avg_loss_seg_pos=0.13314 104 | 2022-09-30 01:37:10 dagm_z_SC_03_except_class4_FOLD_9 Saving current models state to ./results_new\DAGM\dagm_z_SC_03_except_class4\FOLD_9\models\ep_16.pth 105 | 2022-09-30 01:37:10 dagm_z_SC_03_except_class4_FOLD_9 Returning seg_loss_weight 1 and dec_loss_weight 1.0 106 | 2022-09-30 01:37:10 dagm_z_SC_03_except_class4_FOLD_9 Returning dec_gradient_multiplier 1 107 | 2022-09-30 01:37:21 dagm_z_SC_03_except_class4_FOLD_9 Epoch 17/25 ==> avg_loss_seg=0.12571, avg_loss_seg_pos=0.03259, avg_loss_dec=0.03022, avg_loss=0.15593, FP=0, FN=1, correct=295/296, in 10.41s/epoch 108 | 2022-09-30 01:37:21 dagm_z_SC_03_except_class4_FOLD_9 Returning seg_loss_weight 1 and dec_loss_weight 1.0 109 | 2022-09-30 01:37:21 dagm_z_SC_03_except_class4_FOLD_9 Returning dec_gradient_multiplier 1 110 | 2022-09-30 01:37:31 dagm_z_SC_03_except_class4_FOLD_9 Epoch 18/25 ==> avg_loss_seg=0.11828, avg_loss_seg_pos=0.03090, avg_loss_dec=0.00198, avg_loss=0.12026, FP=0, FN=0, correct=296/296, in 10.41s/epoch 111 | 2022-09-30 01:37:31 dagm_z_SC_03_except_class4_FOLD_9 Returning seg_loss_weight 1 and dec_loss_weight 1.0 112 | 2022-09-30 01:37:31 dagm_z_SC_03_except_class4_FOLD_9 Returning dec_gradient_multiplier 1 113 | 2022-09-30 01:37:42 dagm_z_SC_03_except_class4_FOLD_9 Epoch 19/25 ==> avg_loss_seg=0.11189, avg_loss_seg_pos=0.02927, avg_loss_dec=0.00324, avg_loss=0.11513, FP=0, FN=0, correct=296/296, in 10.41s/epoch 114 | 2022-09-30 01:37:54 dagm_z_SC_03_except_class4_FOLD_9 VALIDATION || AUC=1.000000, and AP=1.000000, with best thr=0.999542 at f-measure=1.000 and FP=0, FN=0, TOTAL SAMPLES=1150, avg_loss_seg_neg=0.10895, avg_loss_seg_pos=0.11447 115 | 2022-09-30 01:37:54 dagm_z_SC_03_except_class4_FOLD_9 Saving current models state to 
./results_new\DAGM\dagm_z_SC_03_except_class4\FOLD_9\models\ep_19.pth 116 | 2022-09-30 01:37:54 dagm_z_SC_03_except_class4_FOLD_9 Returning seg_loss_weight 1 and dec_loss_weight 1.0 117 | 2022-09-30 01:37:54 dagm_z_SC_03_except_class4_FOLD_9 Returning dec_gradient_multiplier 1 118 | 2022-09-30 01:38:04 dagm_z_SC_03_except_class4_FOLD_9 Epoch 20/25 ==> avg_loss_seg=0.11105, avg_loss_seg_pos=0.02858, avg_loss_dec=0.28849, avg_loss=0.39954, FP=13, FN=13, correct=270/296, in 10.43s/epoch 119 | 2022-09-30 01:38:04 dagm_z_SC_03_except_class4_FOLD_9 Returning seg_loss_weight 1 and dec_loss_weight 1.0 120 | 2022-09-30 01:38:04 dagm_z_SC_03_except_class4_FOLD_9 Returning dec_gradient_multiplier 1 121 | 2022-09-30 01:38:15 dagm_z_SC_03_except_class4_FOLD_9 Epoch 21/25 ==> avg_loss_seg=0.10332, avg_loss_seg_pos=0.02660, avg_loss_dec=0.01353, avg_loss=0.11686, FP=0, FN=1, correct=295/296, in 10.43s/epoch 122 | 2022-09-30 01:38:15 dagm_z_SC_03_except_class4_FOLD_9 Returning seg_loss_weight 1 and dec_loss_weight 1.0 123 | 2022-09-30 01:38:15 dagm_z_SC_03_except_class4_FOLD_9 Returning dec_gradient_multiplier 1 124 | 2022-09-30 01:38:26 dagm_z_SC_03_except_class4_FOLD_9 Epoch 22/25 ==> avg_loss_seg=0.09776, avg_loss_seg_pos=0.02543, avg_loss_dec=0.04103, avg_loss=0.13879, FP=2, FN=2, correct=292/296, in 10.43s/epoch 125 | 2022-09-30 01:38:37 dagm_z_SC_03_except_class4_FOLD_9 VALIDATION || AUC=1.000000, and AP=1.000000, with best thr=0.957014 at f-measure=1.000 and FP=0, FN=0, TOTAL SAMPLES=1150, avg_loss_seg_neg=0.09511, avg_loss_seg_pos=0.09900 126 | 2022-09-30 01:38:37 dagm_z_SC_03_except_class4_FOLD_9 Saving current models state to ./results_new\DAGM\dagm_z_SC_03_except_class4\FOLD_9\models\ep_22.pth 127 | 2022-09-30 01:38:37 dagm_z_SC_03_except_class4_FOLD_9 Returning seg_loss_weight 1 and dec_loss_weight 1.0 128 | 2022-09-30 01:38:37 dagm_z_SC_03_except_class4_FOLD_9 Returning dec_gradient_multiplier 1 129 | 2022-09-30 01:38:48 dagm_z_SC_03_except_class4_FOLD_9 Epoch 23/25 ==> avg_loss_seg=0.09318, avg_loss_seg_pos=0.02417, avg_loss_dec=0.01178, avg_loss=0.10496, FP=0, FN=1, correct=295/296, in 10.41s/epoch 130 | 2022-09-30 01:38:48 dagm_z_SC_03_except_class4_FOLD_9 Returning seg_loss_weight 1 and dec_loss_weight 1.0 131 | 2022-09-30 01:38:48 dagm_z_SC_03_except_class4_FOLD_9 Returning dec_gradient_multiplier 1 132 | 2022-09-30 01:38:59 dagm_z_SC_03_except_class4_FOLD_9 Epoch 24/25 ==> avg_loss_seg=0.08887, avg_loss_seg_pos=0.02309, avg_loss_dec=0.02616, avg_loss=0.11503, FP=0, FN=1, correct=295/296, in 10.40s/epoch 133 | 2022-09-30 01:38:59 dagm_z_SC_03_except_class4_FOLD_9 Returning seg_loss_weight 1 and dec_loss_weight 1.0 134 | 2022-09-30 01:38:59 dagm_z_SC_03_except_class4_FOLD_9 Returning dec_gradient_multiplier 1 135 | 2022-09-30 01:39:09 dagm_z_SC_03_except_class4_FOLD_9 Epoch 25/25 ==> avg_loss_seg=0.08497, avg_loss_seg_pos=0.02242, avg_loss_dec=0.00281, avg_loss=0.08778, FP=0, FN=0, correct=296/296, in 10.41s/epoch 136 | 2022-09-30 01:39:21 dagm_z_SC_03_except_class4_FOLD_9 VALIDATION || AUC=1.000000, and AP=1.000000, with best thr=0.994240 at f-measure=1.000 and FP=0, FN=0, TOTAL SAMPLES=1150, avg_loss_seg_neg=0.08301, avg_loss_seg_pos=0.08776 137 | 2022-09-30 01:39:21 dagm_z_SC_03_except_class4_FOLD_9 Saving current models state to ./results_new\DAGM\dagm_z_SC_03_except_class4\FOLD_9\models\ep_25.pth 138 | now best_AP=1.0,epoch=1 139 | 2022-09-30 01:39:21 dagm_z_SC_03_except_class4_FOLD_9 Saving current models state to 
./results_new\DAGM\dagm_z_SC_03_except_class4\FOLD_9\models\final_state_dict.pth 140 | 2022-09-30 01:39:21 dagm_z_SC_03_except_class4_FOLD_9 Keeping same model state 141 | dagm_z_SC_03_except_class4_FOLD_9 EVAL AUC=1.000000, and AP=1.000000, w/ best thr=0.994240 at f-m=1.000 and FP=0, FN=0 142 | run_time 15.6min 143 | -------------------------------------------------------------------------------- /results/03_KSDD/FOLD_0/ROC.pdf: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/Fishdrink/SC-Net/51647e4cd1d665ce074f7a3286420523afd36487/results/03_KSDD/FOLD_0/ROC.pdf -------------------------------------------------------------------------------- /results/03_KSDD/FOLD_0/loss_val.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/Fishdrink/SC-Net/51647e4cd1d665ce074f7a3286420523afd36487/results/03_KSDD/FOLD_0/loss_val.png -------------------------------------------------------------------------------- /results/03_KSDD/FOLD_0/losses.csv: -------------------------------------------------------------------------------- 1 | validation_data,loss_dec,loss,epoch 2 | 0.6757097104016472,1.113409591948285,1.789119302349932,0 3 | 0.5455585209762349,1.161954534404418,1.707513055380653,1 4 | 0.4427218770279604,0.8820251870681258,1.3247470640960861,2 5 | 0.3824431764728883,0.8127651701078695,1.195208346580758,3 6 | 0.33307527356288014,0.7851950029897339,1.118270276552614,4 7 | 0.27511802491019755,0.780345902832992,1.0554639277431894,5 8 | 0.24161149210789623,0.5819164410452632,0.8235279331531594,6 9 | 0.2179562633528429,0.8054598553434891,1.023416118696332,7 10 | 0.19767600297927856,0.33097616426290616,0.5286521672421847,8 11 | 0.1842074065523989,0.5426216640934238,0.7268290706458227,9 12 | 0.16448780368356145,0.5237301252782345,0.6882179289617959,10 13 | 0.15000459025887883,0.33066532324386677,0.48066991350274557,11 14 | 0.1370142483535935,0.43802423167097215,0.5750384800245656,12 15 | 0.1263242414330735,0.7302193119916517,0.8565435534247252,13 16 | 0.12070816141717575,0.4303100266748943,0.55101818809207,14 17 | 0.1158113527823897,0.235618065474281,0.3514294182566707,15 18 | 0.10671762333196752,0.6571203034288962,0.7638379267608637,16 19 | 0.10188434031956337,0.2888260980424307,0.39071043836199404,17 20 | 0.09624619606663198,0.35177346041789004,0.448019656484522,18 21 | 0.09057528503677424,0.25797499926000667,0.3485502842967809,19 22 | 0.08853035858448814,0.311000204343787,0.39953056292827516,20 23 | 0.08480201749240651,0.07654625461573768,0.16134827210814418,21 24 | 0.08040029511732213,0.03508580583515782,0.11548610095247996,22 25 | 0.07706511327449013,0.09864192850278307,0.1757070417772732,23 26 | 0.07332228036487803,0.07870717795894426,0.1520294583238223,24 27 | -------------------------------------------------------------------------------- /results/03_KSDD/FOLD_0/precision-recall.pdf: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/Fishdrink/SC-Net/51647e4cd1d665ce074f7a3286420523afd36487/results/03_KSDD/FOLD_0/precision-recall.pdf -------------------------------------------------------------------------------- /results/03_KSDD/FOLD_0/results.csv: -------------------------------------------------------------------------------- 1 | prediction,decision,ground_truth,img_name 2 | 0.5859120488166809,True,1.0,kos02_Part6 3 | 0.42828625440597534,True,1.0,kos08_Part2 4 | 0.999998927116394,True,1.0,kos09_Part6 
5 | 0.9999995231628418,True,1.0,kos10_Part3 6 | 0.9988240599632263,True,1.0,kos15_Part3 7 | 0.9993172883987427,True,1.0,kos19_Part5 8 | 0.9999949932098389,True,1.0,kos21_Part5 9 | 0.9996832609176636,True,1.0,kos25_Part4 10 | 0.9999996423721313,True,1.0,kos27_Part0 11 | 0.9995588660240173,True,1.0,kos28_Part3 12 | 0.9999935626983643,True,1.0,kos30_Part0 13 | 0.9999780654907227,True,1.0,kos37_Part0 14 | 0.9999704360961914,True,1.0,kos38_Part0 15 | 0.9955292344093323,True,1.0,kos38_Part1 16 | 0.9999998807907104,True,1.0,kos41_Part7 17 | 1.0,True,1.0,kos43_Part1 18 | 0.9972443580627441,True,1.0,kos46_Part7 19 | 0.9999313354492188,True,1.0,kos47_Part2 20 | 0.001797763048671186,False,0.0,kos02_Part0 21 | 0.001282497774809599,False,0.0,kos02_Part1 22 | 0.0002409068838460371,False,0.0,kos02_Part2 23 | 0.0008108346373774111,False,0.0,kos02_Part3 24 | 0.0027350683230906725,False,0.0,kos02_Part4 25 | 0.002245614305138588,False,0.0,kos02_Part5 26 | 0.0007731056539341807,False,0.0,kos02_Part7 27 | 0.0016181421233341098,False,0.0,kos08_Part0 28 | 0.00045714358566328883,False,0.0,kos08_Part1 29 | 0.000718968512956053,False,0.0,kos08_Part3 30 | 0.0002593817189335823,False,0.0,kos08_Part4 31 | 0.00018380122492089868,False,0.0,kos08_Part5 32 | 0.00038494530599564314,False,0.0,kos08_Part6 33 | 0.0005270836409181356,False,0.0,kos08_Part7 34 | 0.0020871576853096485,False,0.0,kos09_Part0 35 | 0.0018099835142493248,False,0.0,kos09_Part1 36 | 0.001276228460483253,False,0.0,kos09_Part2 37 | 0.0008123998995870352,False,0.0,kos09_Part3 38 | 0.007846186868846416,False,0.0,kos09_Part4 39 | 0.001690796809270978,False,0.0,kos09_Part5 40 | 0.0009912390960380435,False,0.0,kos09_Part7 41 | 0.0021456144750118256,False,0.0,kos10_Part0 42 | 0.0006059015868231654,False,0.0,kos10_Part1 43 | 0.0061993966810405254,False,0.0,kos10_Part2 44 | 0.0016190777532756329,False,0.0,kos10_Part4 45 | 0.0005660248571075499,False,0.0,kos10_Part5 46 | 0.04003962501883507,False,0.0,kos10_Part6 47 | 0.029072951525449753,False,0.0,kos10_Part7 48 | 0.0009336316725239158,False,0.0,kos15_Part0 49 | 0.002130520297214389,False,0.0,kos15_Part1 50 | 0.0021277081687003374,False,0.0,kos15_Part2 51 | 0.003186824731528759,False,0.0,kos15_Part4 52 | 0.018666421994566917,False,0.0,kos15_Part5 53 | 0.001166732981801033,False,0.0,kos15_Part6 54 | 0.0021702710073441267,False,0.0,kos15_Part7 55 | 0.005619360134005547,False,0.0,kos19_Part0 56 | 0.0003046188212465495,False,0.0,kos19_Part1 57 | 0.04489516094326973,False,0.0,kos19_Part2 58 | 0.0009754070779308677,False,0.0,kos19_Part3 59 | 0.0002679404860828072,False,0.0,kos19_Part4 60 | 0.0006128360400907695,False,0.0,kos19_Part6 61 | 0.0020676227286458015,False,0.0,kos19_Part7 62 | 0.0005575675168074667,False,0.0,kos21_Part0 63 | 0.0032129010651260614,False,0.0,kos21_Part1 64 | 0.005090865306556225,False,0.0,kos21_Part2 65 | 0.0010002856142818928,False,0.0,kos21_Part3 66 | 0.0006633377633988857,False,0.0,kos21_Part4 67 | 0.0020417734049260616,False,0.0,kos21_Part6 68 | 0.0007382508483715355,False,0.0,kos25_Part0 69 | 0.0017712884582579136,False,0.0,kos25_Part1 70 | 0.0007717770640738308,False,0.0,kos25_Part2 71 | 0.0007139657973311841,False,0.0,kos25_Part3 72 | 0.007215107791125774,False,0.0,kos25_Part5 73 | 0.0009961985051631927,False,0.0,kos25_Part6 74 | 0.0008652856340631843,False,0.0,kos25_Part7 75 | 0.005315128713846207,False,0.0,kos27_Part1 76 | 0.0058343796990811825,False,0.0,kos27_Part2 77 | 0.007489267271012068,False,0.0,kos27_Part3 78 | 0.0008145289029926062,False,0.0,kos27_Part4 79 | 
0.0008911046897992492,False,0.0,kos27_Part5 80 | 0.0035541148390620947,False,0.0,kos27_Part6 81 | 0.0007831405964680016,False,0.0,kos27_Part7 82 | 0.0015164049109444022,False,0.0,kos28_Part0 83 | 0.0010114122414961457,False,0.0,kos28_Part1 84 | 0.0011778066400438547,False,0.0,kos28_Part2 85 | 0.0046878657303750515,False,0.0,kos28_Part4 86 | 0.000598300015553832,False,0.0,kos28_Part5 87 | 0.001964923460036516,False,0.0,kos28_Part6 88 | 0.0014104008441790938,False,0.0,kos28_Part7 89 | 0.001992578851059079,False,0.0,kos30_Part1 90 | 0.0005195182748138905,False,0.0,kos30_Part2 91 | 0.00035107359872199595,False,0.0,kos30_Part3 92 | 0.0004762908793054521,False,0.0,kos30_Part4 93 | 0.00211762310937047,False,0.0,kos30_Part5 94 | 0.0014410457806661725,False,0.0,kos30_Part6 95 | 0.0013093200977891684,False,0.0,kos30_Part7 96 | 0.017918327823281288,False,0.0,kos37_Part1 97 | 0.07973974943161011,False,0.0,kos37_Part2 98 | 0.0037703358102589846,False,0.0,kos37_Part3 99 | 0.002907325280830264,False,0.0,kos37_Part4 100 | 0.018506918102502823,False,0.0,kos37_Part5 101 | 0.0005337600014172494,False,0.0,kos37_Part6 102 | 0.34439992904663086,False,0.0,kos37_Part7 103 | 0.0005491137271746993,False,0.0,kos38_Part2 104 | 0.0022206909488886595,False,0.0,kos38_Part3 105 | 0.0020442360546439886,False,0.0,kos38_Part4 106 | 0.0010469765402376652,False,0.0,kos38_Part5 107 | 0.0020397398620843887,False,0.0,kos38_Part6 108 | 0.002952628070488572,False,0.0,kos38_Part7 109 | 0.0010710903443396091,False,0.0,kos41_Part0 110 | 0.0021421960555016994,False,0.0,kos41_Part1 111 | 0.0009356202790513635,False,0.0,kos41_Part2 112 | 0.0009023224120028317,False,0.0,kos41_Part3 113 | 0.008052044548094273,False,0.0,kos41_Part4 114 | 0.001094347913749516,False,0.0,kos41_Part5 115 | 0.0007504905806854367,False,0.0,kos41_Part6 116 | 0.006524579133838415,False,0.0,kos43_Part0 117 | 0.0013336227275431156,False,0.0,kos43_Part2 118 | 0.0028886343352496624,False,0.0,kos43_Part3 119 | 0.005243326537311077,False,0.0,kos43_Part4 120 | 0.0038730979431420565,False,0.0,kos43_Part5 121 | 0.004861411172896624,False,0.0,kos43_Part6 122 | 0.03674124553799629,False,0.0,kos43_Part7 123 | 0.0006239901413209736,False,0.0,kos46_Part0 124 | 0.0038911355659365654,False,0.0,kos46_Part1 125 | 0.038179509341716766,False,0.0,kos46_Part2 126 | 0.0027535813860595226,False,0.0,kos46_Part3 127 | 0.2901425063610077,False,0.0,kos46_Part4 128 | 0.0018136934377253056,False,0.0,kos46_Part5 129 | 0.0570296011865139,False,0.0,kos46_Part6 130 | 0.0103477006778121,False,0.0,kos47_Part0 131 | 0.0009587452514097095,False,0.0,kos47_Part1 132 | 0.014342887327075005,False,0.0,kos47_Part3 133 | 0.0018389163305982947,False,0.0,kos47_Part4 134 | 0.006012987345457077,False,0.0,kos47_Part5 135 | 0.0022473307326436043,False,0.0,kos47_Part6 136 | 0.0005906732985749841,False,0.0,kos47_Part7 137 | -------------------------------------------------------------------------------- /results/03_KSDD/FOLD_0/run_params.txt: -------------------------------------------------------------------------------- 1 | AREA_RATIO_MIN:0.9 2 | BATCH_SIZE:2 3 | CONTINUE_LAST_TRAIN:None 4 | DATASET:KSDD 5 | DATASET_PATH:./datasets/KSDD/ 6 | DELTA_CLS_LOSS:1.0 7 | DILATE:7 8 | DROPOUT_P:0.1 9 | DYN_BALANCED_LOSS:False 10 | EPOCHS:25 11 | FOLD:0 12 | FREQUENCY_SAMPLING:True 13 | GPU:0 14 | GRADIENT_ADJUSTMENT:False 15 | INPUT_CHANNELS:1 16 | INPUT_HEIGHT:1408 17 | INPUT_WIDTH:512 18 | LEARNING_RATE:0.05 19 | LOSS_SEG_THR:0.02 20 | MEMORY_FIT:2 21 | MODEL_NAME:models_z_new_SC_03 22 | NUM_SEGMENTED:0 23 | 
ON_DEMAND_READ:False 24 | OPTIMIZER:SGD 25 | REPRODUCIBLE_RUN:True 26 | RESULTS_PATH:./results_new 27 | SAMPLING:half_mixed 28 | SAVE_IMAGES:False 29 | TRAIN_NUM:33 30 | TRANS_BRIGHT:1.0 31 | TRANS_KEEP_LOOP:1 32 | TRANS_NUM:4 33 | USE_BEST_MODEL:False 34 | VALIDATE:True 35 | VALIDATE_ON_TEST:True 36 | VALIDATION_N_EPOCHS:3 37 | VOLUME_CFG:None 38 | WEIGHTED_SEG_LOSS:False 39 | WEIGHTED_SEG_LOSS_MAX:1.0 40 | WEIGHTED_SEG_LOSS_P:2.0 41 | -------------------------------------------------------------------------------- /results/03_KSDD/FOLD_1/ROC.pdf: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/Fishdrink/SC-Net/51647e4cd1d665ce074f7a3286420523afd36487/results/03_KSDD/FOLD_1/ROC.pdf -------------------------------------------------------------------------------- /results/03_KSDD/FOLD_1/loss_val.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/Fishdrink/SC-Net/51647e4cd1d665ce074f7a3286420523afd36487/results/03_KSDD/FOLD_1/loss_val.png -------------------------------------------------------------------------------- /results/03_KSDD/FOLD_1/losses.csv: -------------------------------------------------------------------------------- 1 | validation_data,loss_dec,loss,epoch 2 | 0.6723639842341927,1.0832013838431414,1.7555653680773342,0 3 | 0.5478930946658639,0.881832695182632,1.4297257898484959,1 4 | 0.4521384589812335,0.8815070241689682,1.3336454831502018,2 5 | 0.37841975425972657,0.7349391071235433,1.11335886138327,3 6 | 0.3268251173636493,0.71480526139631,1.0416303787599592,4 7 | 0.2853642798521939,0.6840963394326323,0.9694606192848262,5 8 | 0.25217392470906763,0.8577917163424632,1.1099656410515308,6 9 | 0.22434010882587993,0.7337377492119285,0.9580778580378084,7 10 | 0.2024941532050862,0.5846135178033043,0.7871076710083905,8 11 | 0.18478565443964565,0.5638619813629809,0.7486476358026266,9 12 | 0.16999943291439729,0.744022304082618,0.9140217369970154,10 13 | 0.1567775002297233,0.3809966382515781,0.5377741384813014,11 14 | 0.14747646889265845,0.5051764164099415,0.6526528853025999,12 15 | 0.1466913144378101,0.8860280976411613,1.0327194120789713,13 16 | 0.138024083831731,0.16505519779371647,0.3030792816254475,14 17 | 0.12398626677253667,0.24943166676059583,0.3734179335331325,15 18 | 0.12338171307654942,0.5528981775089906,0.6762798905855401,16 19 | 0.1125325297608095,0.5320374594268609,0.6445699891876704,17 20 | 0.1045307733995073,0.33000645471517653,0.4345372281146838,18 21 | 0.10170302079880938,0.33438748047474826,0.43609050127355764,19 22 | 0.0973044443218147,0.09838820115649416,0.19569264547830886,20 23 | 0.09001376896220095,0.532073577380609,0.6220873463428099,21 24 | 0.08328250269679462,0.06988905236785821,0.15317155506465283,22 25 | 0.0781638280433767,0.3051565880288014,0.3833204160721781,23 26 | 0.07582898109274752,0.40459378726946477,0.4804227683622123,24 27 | -------------------------------------------------------------------------------- /results/03_KSDD/FOLD_1/precision-recall.pdf: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/Fishdrink/SC-Net/51647e4cd1d665ce074f7a3286420523afd36487/results/03_KSDD/FOLD_1/precision-recall.pdf -------------------------------------------------------------------------------- /results/03_KSDD/FOLD_1/results.csv: -------------------------------------------------------------------------------- 1 | prediction,decision,ground_truth,img_name 2 | 
0.9272087812423706,True,1.0,kos01_Part5 3 | 0.9810066223144531,True,1.0,kos03_Part2 4 | 0.46174538135528564,True,1.0,kos04_Part3 5 | 0.9261100888252258,True,1.0,kos06_Part7 6 | 0.9958113431930542,True,1.0,kos11_Part4 7 | 0.9977511763572693,True,1.0,kos17_Part5 8 | 0.9961453676223755,True,1.0,kos22_Part6 9 | 0.1181362196803093,True,1.0,kos24_Part1 10 | 0.9956743121147156,True,1.0,kos29_Part0 11 | 0.8125565648078918,True,1.0,kos31_Part1 12 | 0.06142754107713699,False,1.0,kos32_Part2 13 | 0.9482282996177673,True,1.0,kos36_Part3 14 | 0.8911421298980713,True,1.0,kos39_Part6 15 | 0.5865265727043152,True,1.0,kos39_Part7 16 | 0.9979357719421387,True,1.0,kos40_Part5 17 | 0.9994179010391235,True,1.0,kos42_Part3 18 | 0.9980048537254333,True,1.0,kos44_Part6 19 | 0.9958160519599915,True,1.0,kos48_Part5 20 | 0.08433721214532852,False,0.0,kos01_Part0 21 | 0.02041173353791237,False,0.0,kos01_Part1 22 | 0.0011668069055303931,False,0.0,kos01_Part2 23 | 0.004828223027288914,False,0.0,kos01_Part3 24 | 0.03006509505212307,False,0.0,kos01_Part4 25 | 0.06430912017822266,False,0.0,kos01_Part6 26 | 0.021707117557525635,False,0.0,kos01_Part7 27 | 0.01634896732866764,False,0.0,kos03_Part0 28 | 0.008317516185343266,False,0.0,kos03_Part1 29 | 0.005299803335219622,False,0.0,kos03_Part3 30 | 0.001930911443196237,False,0.0,kos03_Part4 31 | 0.007972823455929756,False,0.0,kos03_Part5 32 | 0.027995651587843895,False,0.0,kos03_Part6 33 | 0.010939856059849262,False,0.0,kos03_Part7 34 | 0.05670679360628128,False,0.0,kos04_Part0 35 | 0.010685532353818417,False,0.0,kos04_Part1 36 | 0.013199759647250175,False,0.0,kos04_Part2 37 | 0.012629097327589989,False,0.0,kos04_Part4 38 | 0.00969122163951397,False,0.0,kos04_Part5 39 | 0.03647918254137039,False,0.0,kos04_Part6 40 | 0.054772600531578064,False,0.0,kos04_Part7 41 | 0.004795114975422621,False,0.0,kos06_Part0 42 | 0.012960495427250862,False,0.0,kos06_Part1 43 | 0.009600675664842129,False,0.0,kos06_Part2 44 | 0.04515352472662926,False,0.0,kos06_Part3 45 | 0.003958260174840689,False,0.0,kos06_Part4 46 | 0.0047016870230436325,False,0.0,kos06_Part5 47 | 0.03789742290973663,False,0.0,kos06_Part6 48 | 0.04037877544760704,False,0.0,kos11_Part0 49 | 0.028348788619041443,False,0.0,kos11_Part1 50 | 0.01942165568470955,False,0.0,kos11_Part2 51 | 0.008758117444813251,False,0.0,kos11_Part3 52 | 0.008725990541279316,False,0.0,kos11_Part5 53 | 0.011462591588497162,False,0.0,kos11_Part6 54 | 0.005747831426560879,False,0.0,kos11_Part7 55 | 0.010315993800759315,False,0.0,kos17_Part0 56 | 0.012753555551171303,False,0.0,kos17_Part1 57 | 0.015384217724204063,False,0.0,kos17_Part2 58 | 0.07499492913484573,False,0.0,kos17_Part3 59 | 0.08268212527036667,False,0.0,kos17_Part4 60 | 0.04205913841724396,False,0.0,kos17_Part6 61 | 0.004893834702670574,False,0.0,kos17_Part7 62 | 0.009933014400303364,False,0.0,kos22_Part0 63 | 0.0241264495998621,False,0.0,kos22_Part1 64 | 0.049989763647317886,False,0.0,kos22_Part2 65 | 0.009895583614706993,False,0.0,kos22_Part3 66 | 0.006402574945241213,False,0.0,kos22_Part4 67 | 0.0028665547724813223,False,0.0,kos22_Part5 68 | 0.005825430620461702,False,0.0,kos22_Part7 69 | 0.005311764311045408,False,0.0,kos24_Part0 70 | 0.007827194407582283,False,0.0,kos24_Part2 71 | 0.004890733398497105,False,0.0,kos24_Part3 72 | 0.0054267761297523975,False,0.0,kos24_Part4 73 | 0.015432007610797882,False,0.0,kos24_Part5 74 | 0.009087003767490387,False,0.0,kos24_Part6 75 | 0.008465091697871685,False,0.0,kos24_Part7 76 | 0.004838420078158379,False,0.0,kos29_Part1 77 | 
0.008980263024568558,False,0.0,kos29_Part2 78 | 0.005615604110062122,False,0.0,kos29_Part3 79 | 0.0275754202157259,False,0.0,kos29_Part4 80 | 0.007544112391769886,False,0.0,kos29_Part5 81 | 0.004269969649612904,False,0.0,kos29_Part6 82 | 0.0031844174955040216,False,0.0,kos29_Part7 83 | 0.0013691748026758432,False,0.0,kos31_Part0 84 | 0.0016180465463548899,False,0.0,kos31_Part2 85 | 0.0025175237096846104,False,0.0,kos31_Part3 86 | 0.005321328993886709,False,0.0,kos31_Part4 87 | 0.007689739111810923,False,0.0,kos31_Part5 88 | 0.004719672724604607,False,0.0,kos31_Part6 89 | 0.008795605972409248,False,0.0,kos31_Part7 90 | 0.008515875786542892,False,0.0,kos32_Part0 91 | 0.004540703259408474,False,0.0,kos32_Part1 92 | 0.004149454180151224,False,0.0,kos32_Part3 93 | 0.005352798383682966,False,0.0,kos32_Part4 94 | 0.00443453760817647,False,0.0,kos32_Part5 95 | 0.042718853801488876,False,0.0,kos32_Part6 96 | 0.005429744254797697,False,0.0,kos32_Part7 97 | 0.0045296745374798775,False,0.0,kos36_Part0 98 | 0.004649391397833824,False,0.0,kos36_Part1 99 | 0.044968992471694946,False,0.0,kos36_Part2 100 | 0.004671592265367508,False,0.0,kos36_Part4 101 | 0.0014216738054528832,False,0.0,kos36_Part5 102 | 0.004411662928760052,False,0.0,kos36_Part6 103 | 0.0051092286594212055,False,0.0,kos36_Part7 104 | 0.0060562873259186745,False,0.0,kos39_Part0 105 | 0.018324365839362144,False,0.0,kos39_Part1 106 | 0.0071446895599365234,False,0.0,kos39_Part2 107 | 0.004125253297388554,False,0.0,kos39_Part3 108 | 0.004897458478808403,False,0.0,kos39_Part4 109 | 0.013809963129460812,False,0.0,kos39_Part5 110 | 0.00614294596016407,False,0.0,kos40_Part0 111 | 0.005019659176468849,False,0.0,kos40_Part1 112 | 0.001844144775532186,False,0.0,kos40_Part2 113 | 0.0037565664388239384,False,0.0,kos40_Part3 114 | 0.005220925901085138,False,0.0,kos40_Part4 115 | 0.010843480937182903,False,0.0,kos40_Part6 116 | 0.005214114673435688,False,0.0,kos40_Part7 117 | 0.007881974801421165,False,0.0,kos42_Part0 118 | 0.006414920557290316,False,0.0,kos42_Part1 119 | 0.009368162602186203,False,0.0,kos42_Part2 120 | 0.013188062235713005,False,0.0,kos42_Part4 121 | 0.029509318992495537,False,0.0,kos42_Part5 122 | 0.007729936391115189,False,0.0,kos42_Part6 123 | 0.03233882784843445,False,0.0,kos42_Part7 124 | 0.0067450846545398235,False,0.0,kos44_Part0 125 | 0.01162141002714634,False,0.0,kos44_Part1 126 | 0.0314629040658474,False,0.0,kos44_Part2 127 | 0.00909604225307703,False,0.0,kos44_Part3 128 | 0.03595675155520439,False,0.0,kos44_Part4 129 | 0.008519995026290417,False,0.0,kos44_Part5 130 | 0.006093203090131283,False,0.0,kos44_Part7 131 | 0.020512543618679047,False,0.0,kos48_Part0 132 | 0.025453560054302216,False,0.0,kos48_Part1 133 | 0.02674575336277485,False,0.0,kos48_Part2 134 | 0.009170901030302048,False,0.0,kos48_Part3 135 | 0.08464127779006958,False,0.0,kos48_Part4 136 | 0.010484717786312103,False,0.0,kos48_Part6 137 | 0.006211842410266399,False,0.0,kos48_Part7 138 | -------------------------------------------------------------------------------- /results/03_KSDD/FOLD_1/run_params.txt: -------------------------------------------------------------------------------- 1 | AREA_RATIO_MIN:0.9 2 | BATCH_SIZE:2 3 | CONTINUE_LAST_TRAIN:None 4 | DATASET:KSDD 5 | DATASET_PATH:./datasets/KSDD/ 6 | DELTA_CLS_LOSS:1.0 7 | DILATE:7 8 | DROPOUT_P:0.1 9 | DYN_BALANCED_LOSS:False 10 | EPOCHS:25 11 | FOLD:1 12 | FREQUENCY_SAMPLING:True 13 | GPU:0 14 | GRADIENT_ADJUSTMENT:False 15 | INPUT_CHANNELS:1 16 | INPUT_HEIGHT:1408 17 | INPUT_WIDTH:512 18 | 
LEARNING_RATE:0.05 19 | LOSS_SEG_THR:0.02 20 | MEMORY_FIT:2 21 | MODEL_NAME:models_z_new_SC_03 22 | NUM_SEGMENTED:0 23 | ON_DEMAND_READ:False 24 | OPTIMIZER:SGD 25 | REPRODUCIBLE_RUN:True 26 | RESULTS_PATH:./results_new 27 | SAMPLING:half_mixed 28 | SAVE_IMAGES:False 29 | TRAIN_NUM:33 30 | TRANS_BRIGHT:1.0 31 | TRANS_KEEP_LOOP:1 32 | TRANS_NUM:4 33 | USE_BEST_MODEL:False 34 | VALIDATE:True 35 | VALIDATE_ON_TEST:True 36 | VALIDATION_N_EPOCHS:3 37 | VOLUME_CFG:None 38 | WEIGHTED_SEG_LOSS:False 39 | WEIGHTED_SEG_LOSS_MAX:1.0 40 | WEIGHTED_SEG_LOSS_P:2.0 41 | -------------------------------------------------------------------------------- /results/03_KSDD/FOLD_2/ROC.pdf: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/Fishdrink/SC-Net/51647e4cd1d665ce074f7a3286420523afd36487/results/03_KSDD/FOLD_2/ROC.pdf -------------------------------------------------------------------------------- /results/03_KSDD/FOLD_2/loss_val.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/Fishdrink/SC-Net/51647e4cd1d665ce074f7a3286420523afd36487/results/03_KSDD/FOLD_2/loss_val.png -------------------------------------------------------------------------------- /results/03_KSDD/FOLD_2/losses.csv: -------------------------------------------------------------------------------- 1 | validation_data,loss_dec,loss,epoch 2 | 0.6620423363314735,1.250384317504035,1.9124266538355084,0 3 | 0.5358014297154214,0.9825470050175985,1.5183484347330198,1 4 | 0.438238053686089,0.8807340204301808,1.3189720741162698,2 5 | 0.3725285645988252,0.9631095754189624,1.3356381400177875,3 6 | 0.3152499886022674,0.7554845372732315,1.070734525875499,4 7 | 0.27431229915883804,0.5626054221971167,0.8369177213559548,5 8 | 0.24034223498569596,0.5361069334127629,0.7764491683984589,6 9 | 0.2152968690627151,0.5198779968648322,0.7351748659275472,7 10 | 0.19385742644468942,0.5389636986268064,0.7328211250714958,8 11 | 0.17554956840144265,0.4467901258160257,0.6223396942174684,9 12 | 0.16059443768527773,0.5303887559825348,0.6909831936678125,10 13 | 0.14986294301019776,0.33277545544681036,0.4826383984570081,11 14 | 0.13832969218492508,0.18046597253770516,0.3187956647226302,12 15 | 0.13125982363190916,0.6164140703234201,0.7476738939553292,13 16 | 0.1223473126689593,0.5511360438152527,0.673483356484212,14 17 | 0.11713344831433561,0.31385575205139404,0.43098920036572963,15 18 | 0.11243923670715755,0.7194930223955048,0.8319322591026623,16 19 | 0.10049114603963163,0.249753829512176,0.35024497555180767,17 20 | 0.09450748542116748,0.2695078071653067,0.3640152925864742,18 21 | 0.08863348389665286,0.15572305968897934,0.2443565435856322,19 22 | 0.08547977585759428,0.2166405523247603,0.30212032818235457,20 23 | 0.08312915948530038,0.4951459498818925,0.5782751093671928,21 24 | 0.0822654529992077,0.276571557075335,0.3588370100745427,22 25 | 0.07473548232681221,0.49540712622021804,0.5701426085470302,23 26 | 0.0720982686099079,0.3984494693755146,0.4705477379854225,24 27 | -------------------------------------------------------------------------------- /results/03_KSDD/FOLD_2/precision-recall.pdf: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/Fishdrink/SC-Net/51647e4cd1d665ce074f7a3286420523afd36487/results/03_KSDD/FOLD_2/precision-recall.pdf -------------------------------------------------------------------------------- /results/03_KSDD/FOLD_2/results.csv: 
-------------------------------------------------------------------------------- 1 | prediction,decision,ground_truth,img_name 2 | 0.9999520778656006,True,1.0,kos05_Part5 3 | 0.9985924363136292,True,1.0,kos07_Part1 4 | 0.18676447868347168,False,1.0,kos12_Part5 5 | 0.9988623857498169,True,1.0,kos13_Part3 6 | 0.5766454339027405,True,1.0,kos14_Part7 7 | 0.9993953704833984,True,1.0,kos16_Part5 8 | 0.9973614811897278,True,1.0,kos18_Part3 9 | 0.999541163444519,True,1.0,kos20_Part4 10 | 0.9999970197677612,True,1.0,kos23_Part6 11 | 0.9821735620498657,True,1.0,kos26_Part5 12 | 0.9997286200523376,True,1.0,kos33_Part6 13 | 0.9999575614929199,True,1.0,kos34_Part0 14 | 0.9432442784309387,True,1.0,kos35_Part5 15 | 0.9951510429382324,True,1.0,kos45_Part3 16 | 0.9994495511054993,True,1.0,kos49_Part2 17 | 0.4797040522098541,True,1.0,kos50_Part4 18 | 0.009008847177028656,False,0.0,kos05_Part0 19 | 0.07581401616334915,False,0.0,kos05_Part1 20 | 0.009229222312569618,False,0.0,kos05_Part2 21 | 0.006128465291112661,False,0.0,kos05_Part3 22 | 0.005555222742259502,False,0.0,kos05_Part4 23 | 0.03424542769789696,False,0.0,kos05_Part6 24 | 0.06808332353830338,False,0.0,kos05_Part7 25 | 0.009676611982285976,False,0.0,kos07_Part0 26 | 0.0098160021007061,False,0.0,kos07_Part2 27 | 0.01844630017876625,False,0.0,kos07_Part3 28 | 0.0049216789193451405,False,0.0,kos07_Part4 29 | 0.012254196219146252,False,0.0,kos07_Part5 30 | 0.010580003261566162,False,0.0,kos07_Part6 31 | 0.013379756361246109,False,0.0,kos07_Part7 32 | 0.0179468784481287,False,0.0,kos12_Part0 33 | 0.04270469769835472,False,0.0,kos12_Part1 34 | 0.022795097902417183,False,0.0,kos12_Part2 35 | 0.4610491096973419,False,0.0,kos12_Part3 36 | 0.032165203243494034,False,0.0,kos12_Part4 37 | 0.008861968293786049,False,0.0,kos12_Part6 38 | 0.0043816938996315,False,0.0,kos12_Part7 39 | 0.09113141894340515,False,0.0,kos13_Part0 40 | 0.004006832372397184,False,0.0,kos13_Part1 41 | 0.005875874310731888,False,0.0,kos13_Part2 42 | 0.012031658552587032,False,0.0,kos13_Part4 43 | 0.016061412170529366,False,0.0,kos13_Part5 44 | 0.014831370674073696,False,0.0,kos13_Part6 45 | 0.3763234317302704,False,0.0,kos13_Part7 46 | 0.09055718034505844,False,0.0,kos14_Part0 47 | 0.044688474386930466,False,0.0,kos14_Part1 48 | 0.1461181491613388,False,0.0,kos14_Part2 49 | 0.04277887940406799,False,0.0,kos14_Part3 50 | 0.04886261001229286,False,0.0,kos14_Part4 51 | 0.013403796590864658,False,0.0,kos14_Part5 52 | 0.057308875024318695,False,0.0,kos14_Part6 53 | 0.018306421115994453,False,0.0,kos16_Part0 54 | 0.01222469937056303,False,0.0,kos16_Part1 55 | 0.029639918357133865,False,0.0,kos16_Part2 56 | 0.06891371309757233,False,0.0,kos16_Part3 57 | 0.020587608218193054,False,0.0,kos16_Part4 58 | 0.014358286745846272,False,0.0,kos16_Part6 59 | 0.01752842590212822,False,0.0,kos16_Part7 60 | 0.004159735050052404,False,0.0,kos18_Part0 61 | 0.030874498188495636,False,0.0,kos18_Part1 62 | 0.02522597648203373,False,0.0,kos18_Part2 63 | 0.006391339004039764,False,0.0,kos18_Part4 64 | 0.04161732271313667,False,0.0,kos18_Part5 65 | 0.0032258681021630764,False,0.0,kos18_Part6 66 | 0.01231292262673378,False,0.0,kos18_Part7 67 | 0.06076651066541672,False,0.0,kos20_Part0 68 | 0.004168781451880932,False,0.0,kos20_Part1 69 | 0.004052634350955486,False,0.0,kos20_Part2 70 | 0.005762362852692604,False,0.0,kos20_Part3 71 | 0.0160816740244627,False,0.0,kos20_Part5 72 | 0.11657080054283142,False,0.0,kos20_Part6 73 | 0.07204021513462067,False,0.0,kos20_Part7 74 | 0.015794964507222176,False,0.0,kos23_Part0 75 | 
0.00770812900736928,False,0.0,kos23_Part1 76 | 0.0054634311236441135,False,0.0,kos23_Part2 77 | 0.02450030855834484,False,0.0,kos23_Part3 78 | 0.06069999933242798,False,0.0,kos23_Part4 79 | 0.05925077944993973,False,0.0,kos23_Part5 80 | 0.025120049715042114,False,0.0,kos23_Part7 81 | 0.024776069447398186,False,0.0,kos26_Part0 82 | 0.10717673599720001,False,0.0,kos26_Part1 83 | 0.008718358352780342,False,0.0,kos26_Part2 84 | 0.0052630663849413395,False,0.0,kos26_Part3 85 | 0.010418234393000603,False,0.0,kos26_Part4 86 | 0.032960109412670135,False,0.0,kos26_Part6 87 | 0.026533043012022972,False,0.0,kos26_Part7 88 | 0.0542936846613884,False,0.0,kos33_Part0 89 | 0.004480691626667976,False,0.0,kos33_Part1 90 | 0.007308995816856623,False,0.0,kos33_Part2 91 | 0.07480946183204651,False,0.0,kos33_Part3 92 | 0.028573431074619293,False,0.0,kos33_Part4 93 | 0.0056165787391364574,False,0.0,kos33_Part5 94 | 0.008218592964112759,False,0.0,kos33_Part7 95 | 0.018169015645980835,False,0.0,kos34_Part1 96 | 0.004711559973657131,False,0.0,kos34_Part2 97 | 0.006395446136593819,False,0.0,kos34_Part3 98 | 0.004675418604165316,False,0.0,kos34_Part4 99 | 0.005864270962774754,False,0.0,kos34_Part5 100 | 0.003954550717025995,False,0.0,kos34_Part6 101 | 0.004404008854180574,False,0.0,kos34_Part7 102 | 0.005346039310097694,False,0.0,kos35_Part0 103 | 0.004459219519048929,False,0.0,kos35_Part1 104 | 0.004959597252309322,False,0.0,kos35_Part2 105 | 0.008449206128716469,False,0.0,kos35_Part3 106 | 0.005725472699850798,False,0.0,kos35_Part4 107 | 0.004160183481872082,False,0.0,kos35_Part6 108 | 0.013109304010868073,False,0.0,kos35_Part7 109 | 0.020684827119112015,False,0.0,kos45_Part0 110 | 0.003935525193810463,False,0.0,kos45_Part1 111 | 0.017721185460686684,False,0.0,kos45_Part2 112 | 0.016862643882632256,False,0.0,kos45_Part4 113 | 0.015166249126195908,False,0.0,kos45_Part5 114 | 0.01815323904156685,False,0.0,kos45_Part6 115 | 0.009488711133599281,False,0.0,kos45_Part7 116 | 0.011381448246538639,False,0.0,kos49_Part0 117 | 0.018440529704093933,False,0.0,kos49_Part1 118 | 0.07001245766878128,False,0.0,kos49_Part3 119 | 0.02134465053677559,False,0.0,kos49_Part4 120 | 0.021302083507180214,False,0.0,kos49_Part5 121 | 0.016375890001654625,False,0.0,kos49_Part6 122 | 0.007291986607015133,False,0.0,kos49_Part7 123 | 0.042571358382701874,False,0.0,kos50_Part0 124 | 0.036367274820804596,False,0.0,kos50_Part1 125 | 0.04218028485774994,False,0.0,kos50_Part2 126 | 0.04066936671733856,False,0.0,kos50_Part3 127 | 0.022151166573166847,False,0.0,kos50_Part5 128 | 0.01592046022415161,False,0.0,kos50_Part6 129 | 0.026799505576491356,False,0.0,kos50_Part7 130 | -------------------------------------------------------------------------------- /results/03_KSDD/FOLD_2/run_params.txt: -------------------------------------------------------------------------------- 1 | AREA_RATIO_MIN:0.9 2 | BATCH_SIZE:2 3 | CONTINUE_LAST_TRAIN:None 4 | DATASET:KSDD 5 | DATASET_PATH:./datasets/KSDD/ 6 | DELTA_CLS_LOSS:1.0 7 | DILATE:7 8 | DROPOUT_P:0.1 9 | DYN_BALANCED_LOSS:False 10 | EPOCHS:25 11 | FOLD:2 12 | FREQUENCY_SAMPLING:True 13 | GPU:0 14 | GRADIENT_ADJUSTMENT:False 15 | INPUT_CHANNELS:1 16 | INPUT_HEIGHT:1408 17 | INPUT_WIDTH:512 18 | LEARNING_RATE:0.05 19 | LOSS_SEG_THR:0.02 20 | MEMORY_FIT:2 21 | MODEL_NAME:models_z_new_SC_03 22 | NUM_SEGMENTED:0 23 | ON_DEMAND_READ:False 24 | OPTIMIZER:SGD 25 | REPRODUCIBLE_RUN:True 26 | RESULTS_PATH:./results_new 27 | SAMPLING:half_mixed 28 | SAVE_IMAGES:False 29 | TRAIN_NUM:33 30 | TRANS_BRIGHT:1.0 31 | 
TRANS_KEEP_LOOP:1 32 | TRANS_NUM:4 33 | USE_BEST_MODEL:False 34 | VALIDATE:True 35 | VALIDATE_ON_TEST:True 36 | VALIDATION_N_EPOCHS:3 37 | VOLUME_CFG:None 38 | WEIGHTED_SEG_LOSS:False 39 | WEIGHTED_SEG_LOSS_MAX:1.0 40 | WEIGHTED_SEG_LOSS_P:2.0 41 | -------------------------------------------------------------------------------- /results/03_KSDD/eval_log.txt: -------------------------------------------------------------------------------- 1 | Running evaluation for RUN ./results_new\KSDD\ksdd1_z_new_SC_03_change08_25epoch 2 | RUN ksdd1_z_new_SC_03_change08_25epoch: AP:0.99366, AUC:0.99884, FP=0, FN=2, FN@.5=6, FP@.5=0, FP@FN0=7 3 | -------------------------------------------------------------------------------- /results/04_STEEL/ROC.pdf: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/Fishdrink/SC-Net/51647e4cd1d665ce074f7a3286420523afd36487/results/04_STEEL/ROC.pdf -------------------------------------------------------------------------------- /results/04_STEEL/loss_val.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/Fishdrink/SC-Net/51647e4cd1d665ce074f7a3286420523afd36487/results/04_STEEL/loss_val.png -------------------------------------------------------------------------------- /results/04_STEEL/losses.csv: -------------------------------------------------------------------------------- 1 | validation_data,loss_dec,loss,epoch 2 | 0.6147701243559519,0.7225943497816721,1.337364474137624,0 3 | 0.44499163309733075,0.6148672940333685,1.0598589271306993,1 4 | 0.34039474646250406,0.5787205259005229,0.9191152723630269,2 5 | 0.2646811930338542,0.5696714943647385,0.8343526873985927,3 6 | 0.21818197190761565,0.5652489838997523,0.783430955807368,4 7 | 0.18472751716772715,0.4839269302288691,0.6686544473965963,5 8 | 0.15770718614260357,0.49552750299374265,0.6532346891363462,6 9 | 0.1378169051806132,0.4948830262819926,0.6326999314626058,7 10 | 0.12309767305850983,0.43038605084021886,0.5534837238987287,8 11 | 0.1102469136317571,0.4534775697688262,0.5637244834005833,9 12 | 0.10006226321061452,0.4449517442037662,0.5450140074143808,10 13 | 0.09182438512643178,0.4396657798439264,0.5314901649703582,11 14 | 0.08475622529784839,0.4056070293734471,0.4903632546712955,12 15 | 0.07774525135755539,0.31131271199633676,0.38905796335389214,13 16 | 0.07230729733904202,0.33457935020327567,0.4068866475423177,14 17 | 0.06719952836632728,0.26655549419422947,0.3337550225605568,15 18 | 0.0641836126645406,0.40691281305626037,0.47109642572080096,16 19 | 0.060185690522193906,0.3066388912002246,0.3668245817224185,17 20 | 0.056418000062306725,0.2738515409330527,0.33026954099535943,18 21 | 0.05356040358543396,0.3223021074881156,0.37586251107354957,19 22 | 0.0510107905169328,0.30324696347738306,0.35425775399431586,20 23 | 0.048966319809357325,0.2112918768171221,0.2602581966264794,21 24 | 0.0474343541264534,0.37618496573840576,0.4236193198648592,22 25 | 0.046207478443781536,0.24858804017305375,0.2947955186168353,23 26 | 0.044127550721168515,0.27044403777147336,0.3145715884926419,24 27 | 0.04191833540797234,0.26648941073566673,0.3084077461436391,25 28 | 0.040027746732036275,0.22601826761538785,0.26604601434742414,26 29 | 0.03898701330025991,0.19102161260942618,0.2300086259096861,27 30 | 0.038273116225997605,0.16245583961407342,0.20072895584007103,28 31 | 0.03684098646044731,0.26204239410969116,0.29888338057013847,29 32 | 0.035590408692757286,0.17757335069899757,0.21316375939175486,30 33 | 
0.03394493830700716,0.17834950524692733,0.21229444355393448,31 34 | 0.03275168587764104,0.16878217216581107,0.2015338580434521,32 35 | 0.03183964282274246,0.11312330076781411,0.14496294359055656,33 36 | 0.03098798784116904,0.18729490351785597,0.218282891359025,34 37 | 0.030105661948521933,0.20934794081995883,0.23945360276848077,35 38 | 0.029092484340071677,0.17330333308937648,0.20239581742944815,36 39 | 0.028425830602645873,0.1531467562254208,0.18157258682806665,37 40 | 0.027648483936985335,0.13098437872249633,0.15863286265948165,38 41 | 0.02685405984520912,0.11285849024813312,0.13971255009334224,39 42 | 0.026186387116710345,0.09264841972772653,0.11883480684443687,40 43 | 0.025747208595275878,0.13800255694581817,0.16374976554109405,41 44 | 0.025184478933612506,0.11049342306486021,0.1356779019984727,42 45 | 0.024582830170790354,0.13819535787799395,0.1627781880487843,43 46 | 0.0239755783478419,0.07157230708612285,0.09554788543396475,44 47 | 0.02374828634162744,0.15214672955994804,0.17589501590157547,45 48 | 0.0231091870367527,0.0498408965334238,0.0729500835701765,46 49 | 0.022444187303384145,0.11588844728500892,0.13833263458839307,47 50 | 0.021977167228857675,0.11117737425335994,0.13315454148221761,48 51 | 0.021668821970621744,0.15904202287861458,0.18071084484923633,49 52 | 0.02140103214730819,0.09992291199353834,0.12132394414084653,50 53 | 0.02086541460206111,0.10275797455493982,0.12362338915700093,51 54 | 0.020406942466894784,0.08895033271677676,0.10935727518367154,52 55 | 0.020245550411442916,0.09547344609122471,0.11571899650266762,53 56 | 0.020138905520240467,0.10362488561455394,0.12376379113479441,54 57 | 0.01934184600909551,0.13563645622790016,0.15497830223699566,55 58 | 0.01916098358730475,0.07090997154979656,0.09007095513710131,56 59 | 0.0187407527739803,0.06313398337243901,0.08187473614641931,57 60 | 0.018205085210502146,0.10638336147496981,0.12458844668547195,58 61 | 0.017903497827549776,0.10222802760225022,0.12013152542979999,59 62 | -------------------------------------------------------------------------------- /results/04_STEEL/precision-recall.pdf: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/Fishdrink/SC-Net/51647e4cd1d665ce074f7a3286420523afd36487/results/04_STEEL/precision-recall.pdf -------------------------------------------------------------------------------- /results/04_STEEL/run_params.txt: -------------------------------------------------------------------------------- 1 | AREA_RATIO_MIN:0.9 2 | BATCH_SIZE:6 3 | CONTINUE_LAST_TRAIN:None 4 | DATASET:STEEL 5 | DATASET_PATH:./datasets/STEEL/ 6 | DELTA_CLS_LOSS:0.1 7 | DILATE:1 8 | DROPOUT_P:0.1 9 | DYN_BALANCED_LOSS:False 10 | EPOCHS:60 11 | FOLD:None 12 | FREQUENCY_SAMPLING:True 13 | GPU:0 14 | GRADIENT_ADJUSTMENT:False 15 | INPUT_CHANNELS:1 16 | INPUT_HEIGHT:256 17 | INPUT_WIDTH:1600 18 | LEARNING_RATE:0.03 19 | LOSS_SEG_THR:0.02 20 | MEMORY_FIT:6 21 | MODEL_NAME:models_z_new_SC_03 22 | NUM_SEGMENTED:0 23 | ON_DEMAND_READ:True 24 | OPTIMIZER:SGD 25 | REPRODUCIBLE_RUN:True 26 | RESULTS_PATH:./results_new 27 | SAMPLING:half_mixed 28 | SAVE_IMAGES:True 29 | TRAIN_NUM:300 30 | TRANS_BRIGHT:1.0 31 | TRANS_KEEP_LOOP:1 32 | TRANS_NUM:4 33 | USE_BEST_MODEL:True 34 | VALIDATE:True 35 | VALIDATE_ON_TEST:False 36 | VALIDATION_N_EPOCHS:3 37 | VOLUME_CFG:None 38 | WEIGHTED_SEG_LOSS:False 39 | WEIGHTED_SEG_LOSS_MAX:1.0 40 | WEIGHTED_SEG_LOSS_P:2.0 41 | --------------------------------------------------------------------------------
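The metrics printed in `eval_log.txt` and in the `VALIDATION`/`EVAL` lines of the training logs (AP, AUC, FP, FN) can be reproduced directly from the `results.csv` files above, whose columns are `prediction,decision,ground_truth,img_name`. The snippet below is a minimal sketch and is **not** part of the repository code: it assumes `pandas` and `scikit-learn` are available, and the file path is only a placeholder pointing at one of the result folders shown in this dump.

```python
# Sketch: recompute AP / AUC / FP / FN from a results.csv of this repo.
# Column names follow the results.csv layout above; the path is a placeholder.
import pandas as pd
from sklearn.metrics import average_precision_score, roc_auc_score

df = pd.read_csv("results/03_KSDD/FOLD_0/results.csv")  # placeholder path

y_true = df["ground_truth"].astype(int)          # 1 = defective (positive), 0 = defect-free
y_score = df["prediction"].astype(float)         # classification probability in [0, 1]
y_pred = df["decision"].astype(str) == "True"    # thresholded decision stored in the CSV

ap = average_precision_score(y_true, y_score)    # "AP" in eval_log.txt
auc = roc_auc_score(y_true, y_score)             # "AUC" in eval_log.txt
fp = int(((y_pred) & (y_true == 0)).sum())       # false positives at the stored threshold
fn = int((~y_pred & (y_true == 1)).sum())        # false negatives at the stored threshold

print(f"AP={ap:.5f}, AUC={auc:.5f}, FP={fp}, FN={fn}")
```

For example, running this against `results/03_KSDD/FOLD_0/results.csv` should give AP/AUC close to the per-fold values logged during training; the aggregate numbers in `eval_log.txt` combine all three KSDD folds, so they will only match when the three `results.csv` files are concatenated first.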