J7 - Thoughts on the ResNeXt-50 Algorithm

  • 🍨 This post is a learning-log entry for the 🔗365天深度学习训练营 (365-day deep learning training camp)
  • 🍖 Original author: K同学啊 | tutoring and custom projects available

A piece of code from week J6 raised the following question.
Question: in the residual blocks of each stack, the incoming x and the block's filters imply different channel counts, yet the two branches are simply added. Why doesn't this fail, and is broadcasting involved?

Thinking process

  1. On first reading the problem description, my guess was that the broadcasting mechanism of tensor operations might be involved.
  2. To verify that, the most direct way was to run the J6 TensorFlow code again.
  3. I printed every layer of the model with model.summary() and post-processed the output to make it easier to read (stripping out the huge pile of layers produced by the grouped convolutions).
  4. The channel counts turned out to match everywhere, so broadcasting is not used at all.
  5. Walking carefully through the model-building code then led to the explanation below.

Verification process

Raw output of model.summary() (too large to include in full; only part of it is shown):

Model: "model"
__________________________________________________________________________________________________
Layer (type)                                 Output Shape            Param #  Connected to
==================================================================================================
input_4 (InputLayer)                         [(None, 224, 224, 3)]   0        []
zero_padding2d_6 (ZeroPadding2D)             (None, 230, 230, 3)     0        ['input_4[0][0]']
conv2d_555 (Conv2D)                          (None, 112, 112, 64)    9472     ['zero_padding2d_6[0][0]']
batch_normalization_59 (BatchNormalization)  (None, 112, 112, 64)    256      ['conv2d_555[0][0]']
re_lu_53 (ReLU)                              (None, 112, 112, 64)    0        ['batch_normalization_59[0][0]']
zero_padding2d_7 (ZeroPadding2D)             (None, 114, 114, 64)    0        ['re_lu_53[0][0]']
max_pooling2d_3 (MaxPooling2D)               (None, 56, 56, 64)      0        ['zero_padding2d_7[0][0]']
conv2d_557 (Conv2D)                          (None, 56, 56, 128)     8192     ['max_pooling2d_3[0][0]']
batch_normalization_61 (BatchNormalization)  (None, 56, 56, 128)     512      ['conv2d_557[0][0]']
re_lu_54 (ReLU)                              (None, 56, 56, 128)     0        ['batch_normalization_61[0][0]']
lambda_514 (Lambda)                          (None, 56, 56, 4)       0        ['re_lu_54[0][0]']
lambda_515 (Lambda)                          (None, 56, 56, 4)       0        ['re_lu_54[0][0]']
lambda_516 (Lambda)                          (None, 56, 56, 4)       0        ['re_lu_54[0][0]']
lambda_517 (Lambda)                          (None, 56, 56, 4)       0        ['re_lu_54[0][0]']
lambda_518 (Lambda)                          (None, 56, 56, 4)       0        ['re_lu_54[0][0]']
lambda_519 (Lambda)                          (None, 56, 56, 4)       0        ['re_lu_54[0][0]']
lambda_520 (Lambda)                          (None, 56, 56, 4)       0        ['re_lu_54[0][0]']
lambda_521 (Lambda)                          (None, 56, 56, 4)       0        ['re_lu_54[0][0]']
lambda_522 (Lambda)                          (None, 56, 56, 4)       0        ['re_lu_54[0][0]']
lambda_523 (Lambda)                          (None, 56, 56, 4)       0        ['re_lu_54[0][0]']
lambda_524 (Lambda)                          (None, 56, 56, 4)       0        ['re_lu_54[0][0]']
lambda_525 (Lambda)                          (None, 56, 56, 4)       0        ['re_lu_54[0][0]']
lambda_526 (Lambda)                          (None, 56, 56, 4)       0        ['re_lu_54[0][0]']
lambda_527 (Lambda)                          (None, 56, 56, 4)       0        ['re_lu_54[0][0]']
lambda_528 (Lambda)                          (None, 56, 56, 4)       0        ['re_lu_54[0][0]']
lambda_529 (Lambda)                          (None, 56, 56, 4)       0        ['re_lu_54[0][0]']
lambda_530 (Lambda)                          (None, 56, 56, 4)       0        ['re_lu_54[0][0]']
lambda_531 (Lambda)                          (None, 56, 56, 4)       0        ['re_lu_54[0][0]']
lambda_532 (Lambda)                          (None, 56, 56, 4)       0        ['re_lu_54[0][0]']
lambda_533 (Lambda)                          (None, 56, 56, 4)       0        ['re_lu_54[0][0]']
lambda_534 (Lambda)                          (None, 56, 56, 4)       0        ['re_lu_54[0][0]']
lambda_535 (Lambda)                          (None, 56, 56, 4)       0        ['re_lu_54[0][0]']
lambda_536 (Lambda)                          (None, 56, 56, 4)       0        ['re_lu_54[0][0]']
lambda_537 (Lambda)                          (None, 56, 56, 4)       0        ['re_lu_54[0][0]']
lambda_538 (Lambda)                          (None, 56, 56, 4)       0        ['re_lu_54[0][0]']
lambda_539 (Lambda)                          (None, 56, 56, 4)       0        ['re_lu_54[0][0]']
lambda_540 (Lambda)                          (None, 56, 56, 4)       0        ['re_lu_54[0][0]']
lambda_541 (Lambda)                          (None, 56, 56, 4)       0        ['re_lu_54[0][0]']
lambda_542 (Lambda)                          (None, 56, 56, 4)       0        ['re_lu_54[0][0]']
lambda_543 (Lambda)                          (None, 56, 56, 4)       0        ['re_lu_54[0][0]']
lambda_544 (Lambda)                          (None, 56, 56, 4)       0        ['re_lu_54[0][0]']
lambda_545 (Lambda)                          (None, 56, 56, 4)       0        ['re_lu_54[0][0]']
conv2d_558 (Conv2D)                          (None, 56, 56, 4)       144      ['lambda_514[0][0]']
conv2d_559 (Conv2D)                          (None, 56, 56, 4)       144      ['lambda_515[0][0]']
conv2d_560 (Conv2D)                          (None, 56, 56, 4)       144      ['lambda_516[0][0]']
conv2d_561 (Conv2D)                          (None, 56, 56, 4)       144      ['lambda_517[0][0]']
conv2d_562 (Conv2D)                          (None, 56, 56, 4)       144      ['lambda_518[0][0]']
conv2d_563 (Conv2D)                          (None, 56, 56, 4)       144      ['lambda_519[0][0]']
conv2d_564 (Conv2D)                          (None, 56, 56, 4)       144      ['lambda_520[0][0]']
conv2d_565 (Conv2D)                          (None, 56, 56, 4)       144      ['lambda_521[0][0]']
conv2d_566 (Conv2D)                          (None, 56, 56, 4)       144      ['lambda_522[0][0]']
conv2d_567 (Conv2D)                          (None, 56, 56, 4)       144      ['lambda_523[0][0]']
conv2d_568 (Conv2D)                          (None, 56, 56, 4)       144      ['lambda_524[0][0]']
conv2d_569 (Conv2D)                          (None, 56, 56, 4)       144      ['lambda_525[0][0]']
conv2d_570 (Conv2D)                          (None, 56, 56, 4)       144      ['lambda_526[0][0]']
conv2d_571 (Conv2D)                          (None, 56, 56, 4)       144      ['lambda_527[0][0]']
conv2d_572 (Conv2D)                          (None, 56, 56, 4)       144      ['lambda_528[0][0]']
conv2d_573 (Conv2D)                          (None, 56, 56, 4)       144      ['lambda_529[0][0]']
conv2d_574 (Conv2D)                          (None, 56, 56, 4)       144      ['lambda_530[0][0]']
conv2d_575 (Conv2D)                          (None, 56, 56, 4)       144      ['lambda_531[0][0]']
conv2d_576 (Conv2D)                          (None, 56, 56, 4)       144      ['lambda_532[0][0]']
conv2d_577 (Conv2D)                          (None, 56, 56, 4)       144      ['lambda_533[0][0]']
conv2d_578 (Conv2D)                          (None, 56, 56, 4)       144      ['lambda_534[0][0]']
conv2d_579 (Conv2D)                          (None, 56, 56, 4)       144      ['lambda_535[0][0]']
conv2d_580 (Conv2D)                          (None, 56, 56, 4)       144      ['lambda_536[0][0]']
conv2d_581 (Conv2D)                          (None, 56, 56, 4)       144      ['lambda_537[0][0]']
conv2d_582 (Conv2D)                          (None, 56, 56, 4)       144      ['lambda_538[0][0]']
conv2d_583 (Conv2D)                          (None, 56, 56, 4)       144      ['lambda_539[0][0]']
conv2d_584 (Conv2D)                          (None, 56, 56, 4)       144      ['lambda_540[0][0]']
conv2d_585 (Conv2D)                          (None, 56, 56, 4)       144      ['lambda_541[0][0]']
conv2d_586 (Conv2D)                          (None, 56, 56, 4)       144      ['lambda_542[0][0]']
conv2d_587 (Conv2D)                          (None, 56, 56, 4)       144      ['lambda_543[0][0]']
conv2d_588 (Conv2D)                          (None, 56, 56, 4)       144      ['lambda_544[0][0]']
conv2d_589 (Conv2D)                          (None, 56, 56, 4)       144      ['lambda_545[0][0]']
concatenate_16 (Concatenate)                 (None, 56, 56, 128)     0        ['conv2d_558[0][0]', 'conv2d_559[0][0]', 'conv2d_560[0][0]', 'conv2d_561[0][0]',
                                                                               'conv2d_562[0][0]', 'conv2d_563[0][0]', 'conv2d_564[0][0]', 'conv2d_565[0][0]',
                                                                               'conv2d_566[0][0]', 'conv2d_567[0][0]', 'conv2d_568[0][0]', 'conv2d_569[0][0]',
                                                                               'conv2d_570[0][0]', 'conv2d_571[0][0]', 'conv2d_572[0][0]', 'conv2d_573[0][0]',
                                                                               'conv2d_574[0][0]', 'conv2d_575[0][0]', 'conv2d_576[0][0]', 'conv2d_577[0][0]',
                                                                               'conv2d_578[0][0]', 'conv2d_579[0][0]', 'conv2d_580[0][0]', 'conv2d_581[0][0]',
                                                                               'conv2d_582[0][0]', 'conv2d_583[0][0]', 'conv2d_584[0][0]', 'conv2d_585[0][0]',
                                                                               'conv2d_586[0][0]', 'conv2d_587[0][0]', 'conv2d_588[0][0]', 'conv2d_589[0][0]']
batch_normalization_62 (BatchNormalization)  (None, 56, 56, 128)     512      ['concatenate_16[0][0]']
re_lu_55 (ReLU)                              (None, 56, 56, 128)     0        ['batch_normalization_62[0][0]']
conv2d_590 (Conv2D)                          (None, 56, 56, 256)     32768    ['re_lu_55[0][0]']
conv2d_556 (Conv2D)                          (None, 56, 56, 256)     16384    ['max_pooling2d_3[0][0]']
batch_normalization_63 (BatchNormalization)  (None, 56, 56, 256)     1024     ['conv2d_590[0][0]']
batch_normalization_60 (BatchNormalization)  (None, 56, 56, 256)     1024     ['conv2d_556[0][0]']
add_16 (Add)                                 (None, 56, 56, 256)     0        ['batch_normalization_63[0][0]', 'batch_normalization_60[0][0]']
re_lu_56 (ReLU)                              (None, 56, 56, 256)     0        ['add_16[0][0]']
conv2d_591 (Conv2D)                          (None, 56, 56, 128)     32768    ['re_lu_56[0][0]']
batch_normalization_64 (BatchNormalization)  (None, 56, 56, 128)     512      ['conv2d_591[0][0]']
re_lu_57 (ReLU)                              (None, 56, 56, 128)     0        ['batch_normalization_64[0][0]']
lambda_546 (Lambda)                          (None, 56, 56, 4)       0        ['re_lu_57[0][0]']
lambda_547 (Lambda)                          (None, 56, 56, 4)       0        ['re_lu_57[0][0]']
lambda_548 (Lambda)                          (None, 56, 56, 4)       0        ['re_lu_57[0][0]']
lambda_549 (Lambda)                          (None, 56, 56, 4)       0        ['re_lu_57[0][0]']
lambda_550 (Lambda)                          (None, 56, 56, 4)       0        ['re_lu_57[0][0]']
lambda_551 (Lambda)                          (None, 56, 56, 4)       0        ['re_lu_57[0][0]']
lambda_552 (Lambda)                          (None, 56, 56, 4)       0        ['re_lu_57[0][0]']
lambda_553 (Lambda)                          (None, 56, 56, 4)       0        ['re_lu_57[0][0]']
lambda_554 (Lambda)                          (None, 56, 56, 4)       0        ['re_lu_57[0][0]']
lambda_555 (Lambda)                          (None, 56, 56, 4)       0        ['re_lu_57[0][0]']
lambda_556 (Lambda)                          (None, 56, 56, 4)       0        ['re_lu_57[0][0]']
lambda_557 (Lambda)                          (None, 56, 56, 4)       0        ['re_lu_57[0][0]']
lambda_558 (Lambda)                          (None, 56, 56, 4)       0        ['re_lu_57[0][0]']
lambda_559 (Lambda)                          (None, 56, 56, 4)       0        ['re_lu_57[0][0]']
lambda_560 (Lambda)                          (None, 56, 56, 4)       0        ['re_lu_57[0][0]']
lambda_561 (Lambda)                          (None, 56, 56, 4)       0        ['re_lu_57[0][0]']
lambda_562 (Lambda)                          (None, 56, 56, 4)       0        ['re_lu_57[0][0]']
lambda_563 (Lambda)                          (None, 56, 56, 4)       0        ['re_lu_57[0][0]']
lambda_564 (Lambda)                          (None, 56, 56, 4)       0        ['re_lu_57[0][0]']
lambda_565 (Lambda)                          (None, 56, 56, 4)       0        ['re_lu_57[0][0]']
lambda_566 (Lambda)                          (None, 56, 56, 4)       0        ['re_lu_57[0][0]']
lambda_567 (Lambda)                          (None, 56, 56, 4)       0        ['re_lu_57[0][0]']
lambda_568 (Lambda)                          (None, 56, 56, 4)       0        ['re_lu_57[0][0]']
lambda_569 (Lambda)                          (None, 56, 56, 4)       0        ['re_lu_57[0][0]']
lambda_570 (Lambda)                          (None, 56, 56, 4)       0        ['re_lu_57[0][0]']
lambda_571 (Lambda)                          (None, 56, 56, 4)       0        ['re_lu_57[0][0]']
lambda_572 (Lambda)                          (None, 56, 56, 4)       0        ['re_lu_57[0][0]']
lambda_573 (Lambda)                          (None, 56, 56, 4)       0        ['re_lu_57[0][0]']
lambda_574 (Lambda)                          (None, 56, 56, 4)       0        ['re_lu_57[0][0]']
lambda_575 (Lambda)                          (None, 56, 56, 4)       0        ['re_lu_57[0][0]']
lambda_576 (Lambda)                          (None, 56, 56, 4)       0        ['re_lu_57[0][0]']
lambda_577 (Lambda)                          (None, 56, 56, 4)       0        ['re_lu_57[0][0]']
conv2d_592 (Conv2D)                          (None, 56, 56, 4)       144      ['lambda_546[0][0]']
conv2d_593 (Conv2D)                          (None, 56, 56, 4)       144      ['lambda_547[0][0]']
conv2d_594 (Conv2D)                          (None, 56, 56, 4)       144      ['lambda_548[0][0]']
conv2d_595 (Conv2D)                          (None, 56, 56, 4)       144      ['lambda_549[0][0]']
conv2d_596 (Conv2D)                          (None, 56, 56, 4)       144      ['lambda_550[0][0]']
conv2d_597 (Conv2D)                          (None, 56, 56, 4)       144      ['lambda_551[0][0]']
conv2d_598 (Conv2D)                          (None, 56, 56, 4)       144      ['lambda_552[0][0]']
conv2d_599 (Conv2D)                          (None, 56, 56, 4)       144      ['lambda_553[0][0]']
conv2d_600 (Conv2D)                          (None, 56, 56, 4)       144      ['lambda_554[0][0]']
conv2d_601 (Conv2D)                          (None, 56, 56, 4)       144      ['lambda_555[0][0]']
conv2d_602 (Conv2D)                          (None, 56, 56, 4)       144      ['lambda_556[0][0]']
conv2d_603 (Conv2D)                          (None, 56, 56, 4)       144      ['lambda_557[0][0]']
conv2d_604 (Conv2D)                          (None, 56, 56, 4)       144      ['lambda_558[0][0]']
conv2d_605 (Conv2D)                          (None, 56, 56, 4)       144      ['lambda_559[0][0]']
conv2d_606 (Conv2D)                          (None, 56, 56, 4)       144      ['lambda_560[0][0]']
conv2d_607 (Conv2D)                          (None, 56, 56, 4)       144      ['lambda_561[0][0]']
conv2d_608 (Conv2D)                          (None, 56, 56, 4)       144      ['lambda_562[0][0]']
conv2d_609 (Conv2D)                          (None, 56, 56, 4)       144      ['lambda_563[0][0]']
conv2d_610 (Conv2D)                          (None, 56, 56, 4)       144      ['lambda_564[0][0]']
conv2d_611 (Conv2D)                          (None, 56, 56, 4)       144      ['lambda_565[0][0]']
conv2d_612 (Conv2D)                          (None, 56, 56, 4)       144      ['lambda_566[0][0]']
conv2d_613 (Conv2D)                          (None, 56, 56, 4)       144      ['lambda_567[0][0]']
conv2d_614 (Conv2D)                          (None, 56, 56, 4)       144      ['lambda_568[0][0]']
conv2d_615 (Conv2D)                          (None, 56, 56, 4)       144      ['lambda_569[0][0]']
conv2d_616 (Conv2D)                          (None, 56, 56, 4)       144      ['lambda_570[0][0]']
conv2d_617 (Conv2D)                          (None, 56, 56, 4)       144      ['lambda_571[0][0]']
conv2d_618 (Conv2D)                          (None, 56, 56, 4)       144      ['lambda_572[0][0]']
conv2d_619 (Conv2D)                          (None, 56, 56, 4)       144      ['lambda_573[0][0]']
conv2d_620 (Conv2D)                          (None, 56, 56, 4)       144      ['lambda_574[0][0]']
conv2d_621 (Conv2D)                          (None, 56, 56, 4)       144      ['lambda_575[0][0]']
conv2d_622 (Conv2D)                          (None, 56, 56, 4)       144      ['lambda_576[0][0]']
conv2d_623 (Conv2D)                          (None, 56, 56, 4)       144      ['lambda_577[0][0]']
concatenate_17 (Concatenate)                 (None, 56, 56, 128)     0        ['conv2d_592[0][0]', 'conv2d_593[0][0]', 'conv2d_594[0][0]', 'conv2d_595[0][0]',
                                                                               'conv2d_596[0][0]', 'conv2d_597[0][0]', 'conv2d_598[0][0]', 'conv2d_599[0][0]',
                                                                               'conv2d_600[0][0]', 'conv2d_601[0][0]', 'conv2d_602[0][0]', 'conv2d_603[0][0]',
                                                                               'conv2d_604[0][0]', 'conv2d_605[0][0]', 'conv2d_606[0][0]', 'conv2d_607[0][0]',
                                                                               'conv2d_608[0][0]', 'conv2d_609[0][0]', 'conv2d_610[0][0]', 'conv2d_611[0][0]',
                                                                               'conv2d_612[0][0]', 'conv2d_613[0][0]', 'conv2d_614[0][0]', 'conv2d_615[0][0]',
                                                                               'conv2d_616[0][0]', 'conv2d_617[0][0]', 'conv2d_618[0][0]', 'conv2d_619[0][0]',
                                                                               'conv2d_620[0][0]', 'conv2d_621[0][0]', 'conv2d_622[0][0]', 'conv2d_623[0][0]']
batch_normalization_65 (BatchNormalization)  (None, 56, 56, 128)     512      ['concatenate_17[0][0]']
re_lu_58 (ReLU)                              (None, 56, 56, 128)     0        ['batch_normalization_65[0][0]']

The printout contains a large number of Lambda layers. Checking against the source code, those Lambda operations live inside the grouped convolution: everything from the pile of Lambda layers down to the Concatenate below them can be read as one grouped convolution. A grouped convolution does not change the number of channels; it only reduces the number of parameters, as sketched below.
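For reference, a minimal sketch of what such a grouped-convolution block looks like when written with Lambda slices in Keras (function and variable names here are illustrative assumptions, not the exact J6 code):

# Minimal sketch of a grouped convolution built from Lambda slices
# (illustrative only; the original J6 implementation may differ in details).
import tensorflow as tf
from tensorflow.keras import layers

def grouped_convolution_block(x, filters, strides=1, groups=32):
    # each group convolves only filters // groups channels, e.g. 128 // 32 = 4
    group_channels = filters // groups
    branches = []
    for g in range(groups):
        # one Lambda per group slices out its 4 channels -> the many lambda_xxx layers above
        sliced = layers.Lambda(
            lambda z, i=g: z[:, :, :, i * group_channels:(i + 1) * group_channels])(x)
        # a small 3x3 conv per group -> the many conv2d_xxx layers with 144 (= 3*3*4*4) params
        branches.append(layers.Conv2D(group_channels, 3, strides=strides,
                                      padding='same', use_bias=False)(sliced))
    # concatenating the 32 groups restores the original channel count: 32 * 4 = 128
    return layers.Concatenate()(branches)

A plain 3x3 convolution from 128 to 128 channels would need 3*3*128*128 = 147,456 weights, while the 32 grouped branches need only 32 * 144 = 4,608: the channel count is unchanged, only the parameter count shrinks.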

# Dump model.summary() to a file, then use this script to strip out the pile of lambda layers
# open the file and read its content
f = open('summary')
content = f.read()
# split into lines
lines = content.split('\n')
clean_lines = []
# filter the lines
for line in lines:
    if len(line.strip()) == 0:
        continue
    # drop the extra "connected to" continuation lines under each Concatenate
    # (they carry 78 or 79 characters of whitespace padding in the raw summary)
    if len(line) - len(line.strip()) in (78, 79):
        continue
    if 'lambda' in line:
        continue
    clean_lines.append(line)
for line in clean_lines:
    print(line)
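The summary file read by this script was produced beforehand; a minimal sketch of that step (assuming model is the Keras ResNeXt-50 model already built by the J6 code) looks like this:

# Sketch: write model.summary() into a text file named 'summary' for the clean-up script above
# (assumes `model` is the ResNeXt-50 Keras model built by the J6 code).
with open('summary', 'w') as f:
    model.summary(line_length=100, print_fn=lambda line: f.write(line + '\n'))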

The model structure after this clean-up looks like this:

Model: "model"
__________________________________________________________________________________________________
Layer (type)                                 Output Shape            Param #  Connected to
==================================================================================================
input_4 (InputLayer)                         [(None, 224, 224, 3)]   0        []
zero_padding2d_6 (ZeroPadding2D)             (None, 230, 230, 3)     0        ['input_4[0][0]']
conv2d_555 (Conv2D)                          (None, 112, 112, 64)    9472     ['zero_padding2d_6[0][0]']
batch_normalization_59 (BatchNormalization)  (None, 112, 112, 64)    256      ['conv2d_555[0][0]']
re_lu_53 (ReLU)                              (None, 112, 112, 64)    0        ['batch_normalization_59[0][0]']
zero_padding2d_7 (ZeroPadding2D)             (None, 114, 114, 64)    0        ['re_lu_53[0][0]']
max_pooling2d_3 (MaxPooling2D)               (None, 56, 56, 64)      0        ['zero_padding2d_7[0][0]']
conv2d_557 (Conv2D)                          (None, 56, 56, 128)     8192     ['max_pooling2d_3[0][0]']
batch_normalization_61 (BatchNormalization)  (None, 56, 56, 128)     512      ['conv2d_557[0][0]']
re_lu_54 (ReLU)                              (None, 56, 56, 128)     0        ['batch_normalization_61[0][0]']
concatenate_16 (Concatenate)                 (None, 56, 56, 128)     0        ['conv2d_558[0][0]', 'conv2d_559[0][0]',
batch_normalization_62 (BatchNormalization)  (None, 56, 56, 128)     512      ['concatenate_16[0][0]']
re_lu_55 (ReLU)                              (None, 56, 56, 128)     0        ['batch_normalization_62[0][0]']
conv2d_590 (Conv2D)                          (None, 56, 56, 256)     32768    ['re_lu_55[0][0]']
conv2d_556 (Conv2D)                          (None, 56, 56, 256)     16384    ['max_pooling2d_3[0][0]']
batch_normalization_63 (BatchNormalization)  (None, 56, 56, 256)     1024     ['conv2d_590[0][0]']
batch_normalization_60 (BatchNormalization)  (None, 56, 56, 256)     1024     ['conv2d_556[0][0]']
add_16 (Add)                                 (None, 56, 56, 256)     0        ['batch_normalization_63[0][0]', 'batch_normalization_60[0][0]']
re_lu_56 (ReLU)                              (None, 56, 56, 256)     0        ['add_16[0][0]']
conv2d_591 (Conv2D)                          (None, 56, 56, 128)     32768    ['re_lu_56[0][0]']
batch_normalization_64 (BatchNormalization)  (None, 56, 56, 128)     512      ['conv2d_591[0][0]']
re_lu_57 (ReLU)                              (None, 56, 56, 128)     0        ['batch_normalization_64[0][0]']
concatenate_17 (Concatenate)                 (None, 56, 56, 128)     0        ['conv2d_592[0][0]', 'conv2d_593[0][0]',
batch_normalization_65 (BatchNormalization)  (None, 56, 56, 128)     512      ['concatenate_17[0][0]']
re_lu_58 (ReLU)                              (None, 56, 56, 128)     0        ['batch_normalization_65[0][0]']
conv2d_624 (Conv2D)                          (None, 56, 56, 256)     32768    ['re_lu_58[0][0]']
batch_normalization_66 (BatchNormalization)  (None, 56, 56, 256)     1024     ['conv2d_624[0][0]']
add_17 (Add)                                 (None, 56, 56, 256)     0        ['batch_normalization_66[0][0]', 're_lu_56[0][0]']
re_lu_59 (ReLU)                              (None, 56, 56, 256)     0        ['add_17[0][0]']
conv2d_625 (Conv2D)                          (None, 56, 56, 128)     32768    ['re_lu_59[0][0]']
batch_normalization_67 (BatchNormalization)  (None, 56, 56, 128)     512      ['conv2d_625[0][0]']
re_lu_60 (ReLU)                              (None, 56, 56, 128)     0        ['batch_normalization_67[0][0]']
concatenate_18 (Concatenate)                 (None, 56, 56, 128)     0        ['conv2d_626[0][0]', 'conv2d_627[0][0]',
batch_normalization_68 (BatchNormalization)  (None, 56, 56, 128)     512      ['concatenate_18[0][0]']
re_lu_61 (ReLU)                              (None, 56, 56, 128)     0        ['batch_normalization_68[0][0]']
conv2d_658 (Conv2D)                          (None, 56, 56, 256)     32768    ['re_lu_61[0][0]']
batch_normalization_69 (BatchNormalization)  (None, 56, 56, 256)     1024     ['conv2d_658[0][0]']
add_18 (Add)                                 (None, 56, 56, 256)     0        ['batch_normalization_69[0][0]', 're_lu_59[0][0]']
re_lu_62 (ReLU)                              (None, 56, 56, 256)     0        ['add_18[0][0]']
conv2d_660 (Conv2D)                          (None, 56, 56, 256)     65536    ['re_lu_62[0][0]']
batch_normalization_71 (BatchNormalization)  (None, 56, 56, 256)     1024     ['conv2d_660[0][0]']
re_lu_63 (ReLU)                              (None, 56, 56, 256)     0        ['batch_normalization_71[0][0]']
concatenate_19 (Concatenate)                 (None, 28, 28, 256)     0        ['conv2d_661[0][0]', 'conv2d_662[0][0]',
batch_normalization_72 (BatchNormalization)  (None, 28, 28, 256)     1024     ['concatenate_19[0][0]']
re_lu_64 (ReLU)                              (None, 28, 28, 256)     0        ['batch_normalization_72[0][0]']
conv2d_693 (Conv2D)                          (None, 28, 28, 512)     131072   ['re_lu_64[0][0]']
conv2d_659 (Conv2D)                          (None, 28, 28, 512)     131072   ['re_lu_62[0][0]']
batch_normalization_73 (BatchNormalization)  (None, 28, 28, 512)     2048     ['conv2d_693[0][0]']
batch_normalization_70 (BatchNormalization)  (None, 28, 28, 512)     2048     ['conv2d_659[0][0]']
add_19 (Add)                                 (None, 28, 28, 512)     0        ['batch_normalization_73[0][0]', 'batch_normalization_70[0][0]']
re_lu_65 (ReLU)                              (None, 28, 28, 512)     0        ['add_19[0][0]']
conv2d_694 (Conv2D)                          (None, 28, 28, 256)     131072   ['re_lu_65[0][0]']
batch_normalization_74 (BatchNormalization)  (None, 28, 28, 256)     1024     ['conv2d_694[0][0]']
re_lu_66 (ReLU)                              (None, 28, 28, 256)     0        ['batch_normalization_74[0][0]']
concatenate_20 (Concatenate)                 (None, 28, 28, 256)     0        ['conv2d_695[0][0]', 'conv2d_696[0][0]',
batch_normalization_75 (BatchNormalization)  (None, 28, 28, 256)     1024     ['concatenate_20[0][0]']
re_lu_67 (ReLU)                              (None, 28, 28, 256)     0        ['batch_normalization_75[0][0]']
conv2d_727 (Conv2D)                          (None, 28, 28, 512)     131072   ['re_lu_67[0][0]']
batch_normalization_76 (BatchNormalization)  (None, 28, 28, 512)     2048     ['conv2d_727[0][0]']
add_20 (Add)                                 (None, 28, 28, 512)     0        ['batch_normalization_76[0][0]', 're_lu_65[0][0]']
re_lu_68 (ReLU)                              (None, 28, 28, 512)     0        ['add_20[0][0]']
conv2d_728 (Conv2D)                          (None, 28, 28, 256)     131072   ['re_lu_68[0][0]']
batch_normalization_77 (BatchNormalization)  (None, 28, 28, 256)     1024     ['conv2d_728[0][0]']
re_lu_69 (ReLU)                              (None, 28, 28, 256)     0        ['batch_normalization_77[0][0]']
concatenate_21 (Concatenate)                 (None, 28, 28, 256)     0        ['conv2d_729[0][0]', 'conv2d_730[0][0]',
batch_normalization_78 (BatchNormalization)  (None, 28, 28, 256)     1024     ['concatenate_21[0][0]']
re_lu_70 (ReLU)                              (None, 28, 28, 256)     0        ['batch_normalization_78[0][0]']
conv2d_761 (Conv2D)                          (None, 28, 28, 512)     131072   ['re_lu_70[0][0]']
batch_normalization_79 (BatchNormalization)  (None, 28, 28, 512)     2048     ['conv2d_761[0][0]']
add_21 (Add)                                 (None, 28, 28, 512)     0        ['batch_normalization_79[0][0]', 're_lu_68[0][0]']
re_lu_71 (ReLU)                              (None, 28, 28, 512)     0        ['add_21[0][0]']
conv2d_762 (Conv2D)                          (None, 28, 28, 256)     131072   ['re_lu_71[0][0]']
batch_normalization_80 (BatchNormalization)  (None, 28, 28, 256)     1024     ['conv2d_762[0][0]']
re_lu_72 (ReLU)                              (None, 28, 28, 256)     0        ['batch_normalization_80[0][0]']
concatenate_22 (Concatenate)                 (None, 28, 28, 256)     0        ['conv2d_763[0][0]', 'conv2d_764[0][0]',
batch_normalization_81 (BatchNormalization)  (None, 28, 28, 256)     1024     ['concatenate_22[0][0]']
re_lu_73 (ReLU)                              (None, 28, 28, 256)     0        ['batch_normalization_81[0][0]']
conv2d_795 (Conv2D)                          (None, 28, 28, 512)     131072   ['re_lu_73[0][0]']
batch_normalization_82 (BatchNormalization)  (None, 28, 28, 512)     2048     ['conv2d_795[0][0]']
add_22 (Add)                                 (None, 28, 28, 512)     0        ['batch_normalization_82[0][0]', 're_lu_71[0][0]']
re_lu_74 (ReLU)                              (None, 28, 28, 512)     0        ['add_22[0][0]']
conv2d_797 (Conv2D)                          (None, 28, 28, 512)     262144   ['re_lu_74[0][0]']
batch_normalization_84 (BatchNormalization)  (None, 28, 28, 512)     2048     ['conv2d_797[0][0]']
re_lu_75 (ReLU)                              (None, 28, 28, 512)     0        ['batch_normalization_84[0][0]']
concatenate_23 (Concatenate)                 (None, 14, 14, 512)     0        ['conv2d_798[0][0]', 'conv2d_799[0][0]',
batch_normalization_85 (BatchNormalization)  (None, 14, 14, 512)     2048     ['concatenate_23[0][0]']
re_lu_76 (ReLU)                              (None, 14, 14, 512)     0        ['batch_normalization_85[0][0]']
conv2d_830 (Conv2D)                          (None, 14, 14, 1024)    524288   ['re_lu_76[0][0]']
conv2d_796 (Conv2D)                          (None, 14, 14, 1024)    524288   ['re_lu_74[0][0]']
batch_normalization_86 (BatchNormalization)  (None, 14, 14, 1024)    4096     ['conv2d_830[0][0]']
batch_normalization_83 (BatchNormalization)  (None, 14, 14, 1024)    4096     ['conv2d_796[0][0]']
add_23 (Add)                                 (None, 14, 14, 1024)    0        ['batch_normalization_86[0][0]', 'batch_normalization_83[0][0]']
re_lu_77 (ReLU)                              (None, 14, 14, 1024)    0        ['add_23[0][0]']
conv2d_831 (Conv2D)                          (None, 14, 14, 512)     524288   ['re_lu_77[0][0]']
batch_normalization_87 (BatchNormalization)  (None, 14, 14, 512)     2048     ['conv2d_831[0][0]']
re_lu_78 (ReLU)                              (None, 14, 14, 512)     0        ['batch_normalization_87[0][0]']
concatenate_24 (Concatenate)                 (None, 14, 14, 512)     0        ['conv2d_832[0][0]', 'conv2d_833[0][0]',
batch_normalization_88 (BatchNormalization)  (None, 14, 14, 512)     2048     ['concatenate_24[0][0]']
re_lu_79 (ReLU)                              (None, 14, 14, 512)     0        ['batch_normalization_88[0][0]']
conv2d_864 (Conv2D)                          (None, 14, 14, 1024)    524288   ['re_lu_79[0][0]']
batch_normalization_89 (BatchNormalization)  (None, 14, 14, 1024)    4096     ['conv2d_864[0][0]']
add_24 (Add)                                 (None, 14, 14, 1024)    0        ['batch_normalization_89[0][0]', 're_lu_77[0][0]']
re_lu_80 (ReLU)                              (None, 14, 14, 1024)    0        ['add_24[0][0]']
conv2d_865 (Conv2D)                          (None, 14, 14, 512)     524288   ['re_lu_80[0][0]']
batch_normalization_90 (BatchNormalization)  (None, 14, 14, 512)     2048     ['conv2d_865[0][0]']
re_lu_81 (ReLU)                              (None, 14, 14, 512)     0        ['batch_normalization_90[0][0]']
concatenate_25 (Concatenate)                 (None, 14, 14, 512)     0        ['conv2d_866[0][0]', 'conv2d_867[0][0]',
batch_normalization_91 (BatchNormalization)  (None, 14, 14, 512)     2048     ['concatenate_25[0][0]']
re_lu_82 (ReLU)                              (None, 14, 14, 512)     0        ['batch_normalization_91[0][0]']
conv2d_898 (Conv2D)                          (None, 14, 14, 1024)    524288   ['re_lu_82[0][0]']
batch_normalization_92 (BatchNormalization)  (None, 14, 14, 1024)    4096     ['conv2d_898[0][0]']
add_25 (Add)                                 (None, 14, 14, 1024)    0        ['batch_normalization_92[0][0]', 're_lu_80[0][0]']
re_lu_83 (ReLU)                              (None, 14, 14, 1024)    0        ['add_25[0][0]']
conv2d_899 (Conv2D)                          (None, 14, 14, 512)     524288   ['re_lu_83[0][0]']
batch_normalization_93 (BatchNormalization)  (None, 14, 14, 512)     2048     ['conv2d_899[0][0]']
re_lu_84 (ReLU)                              (None, 14, 14, 512)     0        ['batch_normalization_93[0][0]']
concatenate_26 (Concatenate)                 (None, 14, 14, 512)     0        ['conv2d_900[0][0]', 'conv2d_901[0][0]',
batch_normalization_94 (BatchNormalization)  (None, 14, 14, 512)     2048     ['concatenate_26[0][0]']
re_lu_85 (ReLU)                              (None, 14, 14, 512)     0        ['batch_normalization_94[0][0]']
conv2d_932 (Conv2D)                          (None, 14, 14, 1024)    524288   ['re_lu_85[0][0]']
batch_normalization_95 (BatchNormalization)  (None, 14, 14, 1024)    4096     ['conv2d_932[0][0]']
add_26 (Add)                                 (None, 14, 14, 1024)    0        ['batch_normalization_95[0][0]', 're_lu_83[0][0]']
re_lu_86 (ReLU)                              (None, 14, 14, 1024)    0        ['add_26[0][0]']
conv2d_933 (Conv2D)                          (None, 14, 14, 512)     524288   ['re_lu_86[0][0]']
batch_normalization_96 (BatchNormalization)  (None, 14, 14, 512)     2048     ['conv2d_933[0][0]']
re_lu_87 (ReLU)                              (None, 14, 14, 512)     0        ['batch_normalization_96[0][0]']
concatenate_27 (Concatenate)                 (None, 14, 14, 512)     0        ['conv2d_934[0][0]', 'conv2d_935[0][0]',
batch_normalization_97 (BatchNormalization)  (None, 14, 14, 512)     2048     ['concatenate_27[0][0]']
re_lu_88 (ReLU)                              (None, 14, 14, 512)     0        ['batch_normalization_97[0][0]']
conv2d_966 (Conv2D)                          (None, 14, 14, 1024)    524288   ['re_lu_88[0][0]']
batch_normalization_98 (BatchNormalization)  (None, 14, 14, 1024)    4096     ['conv2d_966[0][0]']
add_27 (Add)                                 (None, 14, 14, 1024)    0        ['batch_normalization_98[0][0]', 're_lu_86[0][0]']
re_lu_89 (ReLU)                              (None, 14, 14, 1024)    0        ['add_27[0][0]']
conv2d_967 (Conv2D)                          (None, 14, 14, 512)     524288   ['re_lu_89[0][0]']
batch_normalization_99 (BatchNormalization)  (None, 14, 14, 512)     2048     ['conv2d_967[0][0]']
re_lu_90 (ReLU)                              (None, 14, 14, 512)     0        ['batch_normalization_99[0][0]']
concatenate_28 (Concatenate)                 (None, 14, 14, 512)     0        ['conv2d_968[0][0]', 'conv2d_969[0][0]',
batch_normalization_100 (BatchNormalization) (None, 14, 14, 512)     2048     ['concatenate_28[0][0]']
re_lu_91 (ReLU)                              (None, 14, 14, 512)     0        ['batch_normalization_100[0][0]']
conv2d_1000 (Conv2D)                         (None, 14, 14, 1024)    524288   ['re_lu_91[0][0]']
batch_normalization_101 (BatchNormalization) (None, 14, 14, 1024)    4096     ['conv2d_1000[0][0]']
add_28 (Add)                                 (None, 14, 14, 1024)    0        ['batch_normalization_101[0][0]', 're_lu_89[0][0]']
re_lu_92 (ReLU)                              (None, 14, 14, 1024)    0        ['add_28[0][0]']
conv2d_1002 (Conv2D)                         (None, 14, 14, 1024)    1048576  ['re_lu_92[0][0]']
batch_normalization_103 (BatchNormalization) (None, 14, 14, 1024)    4096     ['conv2d_1002[0][0]']
re_lu_93 (ReLU)                              (None, 14, 14, 1024)    0        ['batch_normalization_103[0][0]']
concatenate_29 (Concatenate)                 (None, 7, 7, 1024)      0        ['conv2d_1003[0][0]', 'conv2d_1004[0][0]',
batch_normalization_104 (BatchNormalization) (None, 7, 7, 1024)      4096     ['concatenate_29[0][0]']
re_lu_94 (ReLU)                              (None, 7, 7, 1024)      0        ['batch_normalization_104[0][0]']
conv2d_1035 (Conv2D)                         (None, 7, 7, 2048)      2097152  ['re_lu_94[0][0]']
conv2d_1001 (Conv2D)                         (None, 7, 7, 2048)      2097152  ['re_lu_92[0][0]']
batch_normalization_105 (BatchNormalization) (None, 7, 7, 2048)      8192     ['conv2d_1035[0][0]']
batch_normalization_102 (BatchNormalization) (None, 7, 7, 2048)      8192     ['conv2d_1001[0][0]']
add_29 (Add)                                 (None, 7, 7, 2048)      0        ['batch_normalization_105[0][0]', 'batch_normalization_102[0][0]']
re_lu_95 (ReLU)                              (None, 7, 7, 2048)      0        ['add_29[0][0]']
conv2d_1036 (Conv2D)                         (None, 7, 7, 1024)      2097152  ['re_lu_95[0][0]']
batch_normalization_106 (BatchNormalization) (None, 7, 7, 1024)      4096     ['conv2d_1036[0][0]']
re_lu_96 (ReLU)                              (None, 7, 7, 1024)      0        ['batch_normalization_106[0][0]']
concatenate_30 (Concatenate)                 (None, 7, 7, 1024)      0        ['conv2d_1037[0][0]', 'conv2d_1038[0][0]',
batch_normalization_107 (BatchNormalization) (None, 7, 7, 1024)      4096     ['concatenate_30[0][0]']
re_lu_97 (ReLU)                              (None, 7, 7, 1024)      0        ['batch_normalization_107[0][0]']
conv2d_1069 (Conv2D)                         (None, 7, 7, 2048)      2097152  ['re_lu_97[0][0]']
batch_normalization_108 (BatchNormalization) (None, 7, 7, 2048)      8192     ['conv2d_1069[0][0]']
add_30 (Add)                                 (None, 7, 7, 2048)      0        ['batch_normalization_108[0][0]', 're_lu_95[0][0]']
re_lu_98 (ReLU)                              (None, 7, 7, 2048)      0        ['add_30[0][0]']
conv2d_1070 (Conv2D)                         (None, 7, 7, 1024)      2097152  ['re_lu_98[0][0]']
batch_normalization_109 (BatchNormalization) (None, 7, 7, 1024)      4096     ['conv2d_1070[0][0]']
re_lu_99 (ReLU)                              (None, 7, 7, 1024)      0        ['batch_normalization_109[0][0]']
concatenate_31 (Concatenate)                 (None, 7, 7, 1024)      0        ['conv2d_1071[0][0]', 'conv2d_1072[0][0]',
batch_normalization_110 (BatchNormalization) (None, 7, 7, 1024)      4096     ['concatenate_31[0][0]']
re_lu_100 (ReLU)                             (None, 7, 7, 1024)      0        ['batch_normalization_110[0][0]']
conv2d_1103 (Conv2D)                         (None, 7, 7, 2048)      2097152  ['re_lu_100[0][0]']
batch_normalization_111 (BatchNormalization) (None, 7, 7, 2048)      8192     ['conv2d_1103[0][0]']
add_31 (Add)                                 (None, 7, 7, 2048)      0        ['batch_normalization_111[0][0]', 're_lu_98[0][0]']
re_lu_101 (ReLU)                             (None, 7, 7, 2048)      0        ['add_31[0][0]']
global_average_pooling2d_1 (GlobalAveragePooling2D)  (None, 2048)    0        ['re_lu_101[0][0]']
dense_1 (Dense)                              (None, 1000)            2049000  ['global_average_pooling2d_1[0][0]']

Looking at the "Connected to" column of every Add layer, the two inputs always have identical shapes; there is no mismatch anywhere. Contrary to my initial guess, no broadcasting mechanism is used at all. Only after walking through the model-building code carefully did I see why: inside each stack's blocks, the incoming x and filters do not have the same channel count, and adding them directly would raise an error, so the first block of a stack applies an extra convolution on the shortcut that brings the channels up to filters * 2. Since filters does not change within the stack, every later block's output also has filters * 2 channels, so the shortcut can then be added directly. A sketch of this block logic follows.
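Here is a hedged sketch of that block logic (argument names follow common Keras ResNeXt tutorials and are assumptions; the real J6 code may differ in details):

# Hedged sketch of the residual block described above (names are illustrative, not the exact J6 code).
import tensorflow as tf
from tensorflow.keras import layers

def block(x, filters, strides=1, conv_shortcut=True):
    if conv_shortcut:
        # First block of a stack: x has fewer channels than filters * 2, so a 1x1 conv
        # brings the shortcut up to filters * 2 channels before the Add.
        shortcut = layers.Conv2D(filters * 2, 1, strides=strides, use_bias=False)(x)
        shortcut = layers.BatchNormalization()(shortcut)
    else:
        # Later blocks: x already has filters * 2 channels, so it can be added directly.
        shortcut = x

    y = layers.Conv2D(filters, 1, use_bias=False)(x)
    y = layers.BatchNormalization()(y)
    y = layers.ReLU()(y)
    # stand-in for the grouped 3x3 convolution (the Lambda/Conv2D/Concatenate pile)
    y = layers.Conv2D(filters, 3, strides=strides, padding='same', use_bias=False)(y)
    y = layers.BatchNormalization()(y)
    y = layers.ReLU()(y)
    y = layers.Conv2D(filters * 2, 1, use_bias=False)(y)
    y = layers.BatchNormalization()(y)

    out = layers.Add()([y, shortcut])  # both inputs now have filters * 2 channels
    return layers.ReLU()(out)

This matches the summary above: in the first stack (filters = 128) the shortcut convolution is conv2d_556, which takes the 64-channel max_pooling2d_3 output to 256 = filters * 2 channels with 1*1*64*256 = 16384 parameters, and every Add therefore receives two 256-channel inputs.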
