How can I solve "Detected Infinity or NaN in output 0 of eagerly-executing op 'Conv2D' (# of outputs: 1)"?

The input is normal, but when it is passed through the VGG-19 network structure, the output of a convolution contains NaN values. How can I solve this problem?
The network structure is as follows:
self.main_layer_list += [
    self.conv_block(n_filter=64, in_channels=self.in_channels, filter_size=(3, 3), strides=(1, 1),
                    act=tf.nn.relu, name="conv1_1"),
    self.conv_block(n_filter=64, in_channels=64, filter_size=(3, 3), strides=(1, 1), act=tf.nn.relu,
                    name="conv1_2"),
    MaxPool2d(filter_size=(2, 2), strides=(2, 2), data_format=self.data_format, name="maxpool_1"),
    self.conv_block(n_filter=128, in_channels=64, filter_size=(3, 3), strides=(1, 1), act=tf.nn.relu,
                    name="conv2_1"),
    self.conv_block(n_filter=128, in_channels=128, filter_size=(3, 3), strides=(1, 1), act=tf.nn.relu,
                    name="conv2_2"),
    MaxPool2d(filter_size=(2, 2), strides=(2, 2), data_format=self.data_format, name="maxpool_2"),
    self.conv_block(n_filter=256, in_channels=128, filter_size=(3, 3), strides=(1, 1), act=tf.nn.relu,
                    name="conv3_1"),
    self.conv_block(n_filter=256, in_channels=256, filter_size=(3, 3), strides=(1, 1), act=tf.nn.relu,
                    name="conv3_2"),
    self.conv_block(n_filter=256, in_channels=256, filter_size=(3, 3), strides=(1, 1), act=tf.nn.relu,
                    name="conv3_3"),
    self.conv_block(n_filter=256, in_channels=256, filter_size=(3, 3), strides=(1, 1), act=tf.nn.relu,
                    name="conv3_4"),
    MaxPool2d(filter_size=(2, 2), strides=(2, 2), data_format=self.data_format, name="maxpool_3"),
    self.conv_block(n_filter=512, in_channels=256, filter_size=(3, 3), strides=(1, 1), act=tf.nn.relu,
                    name="conv4_1"),
    self.conv_block(n_filter=512, in_channels=512, filter_size=(3, 3), strides=(1, 1), act=tf.nn.relu,
                    name="conv4_2"),
]
self.out_channels = 512
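
For reference, one way to confirm that the batch entering the backbone really is finite (a minimal sketch assuming TF 2.x eager execution; the name image and its shape are placeholders, not taken from the training code):

import tensorflow as tf

# Placeholder batch; in practice, replace with the real tensor fed to conv1_1.
image = tf.random.uniform((3, 368, 432, 3))
# Raises InvalidArgumentError if any element is NaN or Inf.
tf.debugging.assert_all_finite(image, message="input batch contains NaN or Inf")
print("input range:", float(tf.reduce_min(image)), float(tf.reduce_max(image)))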

The following error message is displayed:
Exception raised: InvalidArgumentError
!!! Detected Infinity or NaN in output 0 of eagerly-executing op "Conv2D" (# of outputs: 1) !!!
dtype: <dtype: 'float32'>
shape: (3, 368, 432, 64)

# of +NaN elements: 8400896

Input tensors (2):
0: tf.Tensor([[[[0.1500127 0.01813917 0.20477234 ... 0.04642985 0.0603441 ]]]], shape=(3, 368, 432, 64), dtype=float32)
1: tf.Tensor([[[[-1.11088110e-02 6.34150505e-02 -5.76001639e-03 ... 1.12402043e-03 -1.01087840e-30]]]], shape=(3, 3, 64, 64), dtype=float32)

: Tensor had NaN values [Op:CheckNumericsV2]
tensorflow.python.eager.core._NotOkStatusException: InvalidArgumentError:

!!! Detected Infinity or NaN in output 0 of eagerly-executing op "Conv2D" (# of outputs: 1) !!!
(the same dtype, shape, NaN count, and input tensor dumps as above are repeated here)

: Tensor had NaN values [Op:CheckNumericsV2]

During handling of the above exception, another exception occurred:

File "/private/privacy_optics_hpe/privacy_optics_hpe/hyperpose/Model/backbones.py", line 353, in forward
x = layer.forward(x)
File "/private/privacy_optics_hpe/privacy_optics_hpe/hyperpose/Model/openpose/model/openpose.py", line 69, in forward
backbone_features = self.backbone.forward(x)
File "/private/privacy_optics_hpe/privacy_optics_hpe/hyperpose/Model/openpose/model/private_model.py", line 75, in forward
openpose_output = self.pose_model.forward(x_sensor, is_train=True)
File "/private/privacy_optics_hpe/privacy_optics_hpe/hyperpose/Model/openpose/train.py", line 418, in one_step
pose_out, x_sensor = train_model.forward(image, is_train=True)
File "/private/privacy_optics_hpe/privacy_optics_hpe/hyperpose/Model/openpose/train.py", line 565, in single_train
None)
File "/private/privacy_optics_hpe/privacy_optics_hpe/train.py", line 154, in
train(model, dataset)
tensorflow.python.framework.errors_impl.InvalidArgumentError:

!!! Detected Infinity or NaN in output 0 of eagerly-executing op "Conv2D" (# of outputs: 1) !!!
dtype: <dtype: 'float32'>
shape: (3, 368, 432, 64)

# of +NaN elements: 8400896

: Tensor had NaN values [Op:CheckNumericsV2]

Hi @Wei, this error occurs when an op generates an output containing Infinity or NaN values. The error message gives the name of the operation that is responsible; according to the message above, Conv2D is producing output that contains Infinity or NaN. To resolve this, check whether the tensors being passed to that Conv2D layer contain any NaN or Infinity values. Thank you.
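
A minimal diagnostic sketch along those lines, assuming TF 2.x eager execution (the helper report_finite and the demo tensor are illustrative, not part of the original code). Since the dump shows two input tensors, the incoming activations and a (3, 3, 64, 64) convolution kernel, it is worth running the same check on the layer's weights as well as on the image batch:

import tensorflow as tf

def report_finite(name, t):
    # Count NaN/Inf entries and print the value range of a tensor.
    t = tf.convert_to_tensor(t)
    nan_count = int(tf.reduce_sum(tf.cast(tf.math.is_nan(t), tf.int32)))
    inf_count = int(tf.reduce_sum(tf.cast(tf.math.is_inf(t), tf.int32)))
    print(f"{name}: NaN={nan_count} Inf={inf_count} "
          f"min={float(tf.reduce_min(t))} max={float(tf.reduce_max(t))}")

# Demo with a deliberately bad tensor; in the real model, call this on the batch fed
# to the backbone and on each conv layer's kernel before the forward pass, e.g.
# report_finite("conv1_2 kernel", <the layer's weight tensor>).
report_finite("demo", tf.constant([[1.0, float("nan")], [2.0, 3.0]]))

If the activations are clean but a kernel already contains NaN, the NaN was introduced in an earlier training step (for example during a weight update) rather than in this forward pass.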