刘二大人 PyTorch Deep Learning Practice Notes, P4: Backpropagation


    P4: Backpropagation

    Goal: propagate gradients on a computational graph, which makes it easier to build more flexible model structures.

    1. Review of P3

    (figure)

    2. Backpropagation

    For a very complex network, the gradients cannot be written out by hand directly.
    But if the network is viewed as a graph and gradients are propagated through that graph, they can all be computed; this is backpropagation.
    (figure)
    Reference for matrix calculus: The Matrix Cookbook, https://www.math.uwaterloo.ca/~hwolkowi/matrixcookbook.pdf
    (figure)
    Each layer can be written as y = w * x + b. Stacking such layers alone does not make the network more expressive, because a composition of linear maps is still linear, so a nonlinear activation function is added after each layer.
    (figure)
    Chain rule
    (figure)
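    To make the point about the activation function concrete, the linear-collapse argument can be written out (my own elaboration, not from the slides): two stacked purely linear layers satisfy

    \hat{y} = W_2 (W_1 x + b_1) + b_2 = (W_2 W_1)\, x + (W_2 b_1 + b_2),

    which is again a single linear layer. Inserting a nonlinearity \sigma, as in \hat{y} = W_2\, \sigma(W_1 x + b_1) + b_2, is what actually increases the model's capacity.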

    3. Computation procedure

    1. Build the computational graph and run the forward pass to obtain the output z.
    2. Compute the local gradient at each node.
    3. Run the backward pass to obtain the partial derivative of the Loss with respect to z.
    4. Apply the chain rule to obtain the partial derivatives of the Loss with respect to x and with respect to y (a worked example follows below).
      (figure)
      Concrete example:
      (figure)
      Once the gradients are available, the weights can be updated.
      (figure)
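    A small worked illustration of these four steps (the numbers are chosen here for concreteness, not taken from the slides): for a single multiplication node z = x \cdot y, the forward pass stores x and y, the local gradients are \partial z / \partial x = y and \partial z / \partial y = x, and the chain rule combines them with the upstream gradient:

    \frac{\partial L}{\partial x} = \frac{\partial L}{\partial z} \cdot \frac{\partial z}{\partial x} = \frac{\partial L}{\partial z} \cdot y, \qquad
    \frac{\partial L}{\partial y} = \frac{\partial L}{\partial z} \cdot \frac{\partial z}{\partial y} = \frac{\partial L}{\partial z} \cdot x.

    With x = 2, y = 3 and \partial L / \partial z = 5, this gives \partial L / \partial x = 15 and \partial L / \partial y = 10.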

    4. Assignments

    Assignment 4-1: compute the gradient of y = x * w by hand and work through the forward and backward passes.

    (figure)
    Then extend the model by adding a bias term, y = x * w + b, and repeat the computation.
    (figure)
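    A sketch of the hand computation the assignment asks for, using the squared-error loss from the lecture:

    loss = (\hat{y} - y)^2 = (xw - y)^2, \qquad \frac{\partial\, loss}{\partial w} = 2\,(xw - y)\, x .

    With the bias added, \hat{y} = xw + b, so

    \frac{\partial\, loss}{\partial w} = 2\,(xw + b - y)\, x, \qquad \frac{\partial\, loss}{\partial b} = 2\,(xw + b - y).

    For the first training sample (x = 1, y = 2, w = 1), the first formula gives 2(1 - 2)(1) = -2, which matches the first gradient printed by the code below.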

    Assignment 4-2: implement 4-1 in code

    import torch
    import matplotlib.pyplot as plt

    x_data = [1.0, 2.0, 3.0]
    y_data = [2.0, 4.0, 6.0]

    # torch.Tensor() is the tensor class; constructing with it yields torch.FloatTensor (float32) data
    # torch.tensor() is a factory function: the input can be a scalar, vector, matrix, or higher-dimensional data;
    # it copies the data and infers the dtype, producing torch.LongTensor, torch.FloatTensor, torch.DoubleTensor, etc.

    # a tensor carries two parts: data and grad
    w = torch.tensor([1.0])
    w.requires_grad = True  # gradients must be computed for w

    # forward pass
    def forward(x):
        return x * w  # w is a tensor, so the product is also a tensor and joins the computational graph

    # squared-error loss for a single sample
    def loss(x, y):
        y_pred = forward(x)
        return (y_pred - y) ** 2

    # extracting plain values does not build a computational graph:
    # tensor.item() pulls the value out as a Python scalar, with no gradient tracking
    # tensor.data reads/modifies the underlying data as a tensor, also with no gradient tracking
    print('Predict (before training)', 4, forward(4).item())

    epoch_list = []
    mse_list = []
    for epoch in range(100):
        for x, y in zip(x_data, y_data):
            l = loss(x, y)
            l.backward()  # backward pass: compute gradients and release the graph
            print('\tgrad:', x, y, w.grad.item())
            w.data = w.data - 0.01 * w.grad.data  # update via .data so the step itself is not tracked

            w.grad.data.zero_()  # after updating the weight, zero the gradient so the next backward pass starts fresh

        epoch_list.append(epoch)
        mse_list.append(l.item() / len(x_data))  # rough per-epoch cost: the last sample's loss averaged over the sample count
        print('progress:', epoch, l.item())

    print('predict (after training)', 4, forward(4).item())

    plt.plot(epoch_list, mse_list)
    plt.xlabel('Epoch')
    plt.ylabel('Cost')
    plt.show()
    

    Output:

    Predict (before training) 4 4.0
    	grad: 1.0 2.0 -2.0
    	grad: 2.0 4.0 -7.840000152587891
    	grad: 3.0 6.0 -16.228801727294922
    progress: 0 7.315943717956543
    	grad: 1.0 2.0 -1.478623867034912
    	grad: 2.0 4.0 -5.796205520629883
    	grad: 3.0 6.0 -11.998146057128906
    progress: 1 3.9987640380859375
    	grad: 1.0 2.0 -1.0931644439697266
    	grad: 2.0 4.0 -4.285204887390137
    	grad: 3.0 6.0 -8.870372772216797
    progress: 2 2.1856532096862793
    	grad: 1.0 2.0 -0.8081896305084229
    	grad: 2.0 4.0 -3.1681032180786133
    	grad: 3.0 6.0 -6.557973861694336
    progress: 3 1.1946394443511963
    	grad: 1.0 2.0 -0.5975041389465332
    	grad: 2.0 4.0 -2.3422164916992188
    	grad: 3.0 6.0 -4.848389625549316
    progress: 4 0.6529689431190491
    	grad: 1.0 2.0 -0.4417421817779541
    	grad: 2.0 4.0 -1.7316293716430664
    	grad: 3.0 6.0 -3.58447265625
    progress: 5 0.35690122842788696
    	grad: 1.0 2.0 -0.3265852928161621
    	grad: 2.0 4.0 -1.2802143096923828
    	grad: 3.0 6.0 -2.650045394897461
    progress: 6 0.195076122879982
    	grad: 1.0 2.0 -0.24144840240478516
    	grad: 2.0 4.0 -0.9464778900146484
    	grad: 3.0 6.0 -1.9592113494873047
    progress: 7 0.10662525147199631
    	grad: 1.0 2.0 -0.17850565910339355
    	grad: 2.0 4.0 -0.699742317199707
    	grad: 3.0 6.0 -1.4484672546386719
    progress: 8 0.0582793727517128
    	grad: 1.0 2.0 -0.1319713592529297
    	grad: 2.0 4.0 -0.5173273086547852
    	grad: 3.0 6.0 -1.070866584777832
    progress: 9 0.03185431286692619
    	grad: 1.0 2.0 -0.09756779670715332
    	grad: 2.0 4.0 -0.3824653625488281
    	grad: 3.0 6.0 -0.7917022705078125
    progress: 10 0.017410902306437492
    	grad: 1.0 2.0 -0.07213282585144043
    	grad: 2.0 4.0 -0.2827606201171875
    	grad: 3.0 6.0 -0.5853137969970703
    progress: 11 0.009516451507806778
    	grad: 1.0 2.0 -0.053328514099121094
    	grad: 2.0 4.0 -0.2090473175048828
    	grad: 3.0 6.0 -0.43272972106933594
    progress: 12 0.005201528314501047
    	grad: 1.0 2.0 -0.039426326751708984
    	grad: 2.0 4.0 -0.15455150604248047
    	grad: 3.0 6.0 -0.3199195861816406
    progress: 13 0.0028430151287466288
    	grad: 1.0 2.0 -0.029148340225219727
    	grad: 2.0 4.0 -0.11426162719726562
    	grad: 3.0 6.0 -0.23652076721191406
    progress: 14 0.0015539465239271522
    	grad: 1.0 2.0 -0.021549701690673828
    	grad: 2.0 4.0 -0.08447456359863281
    	grad: 3.0 6.0 -0.17486286163330078
    progress: 15 0.0008493617060594261
    	grad: 1.0 2.0 -0.01593184471130371
    	grad: 2.0 4.0 -0.062453269958496094
    	grad: 3.0 6.0 -0.12927818298339844
    progress: 16 0.00046424579340964556
    	grad: 1.0 2.0 -0.011778593063354492
    	grad: 2.0 4.0 -0.046172142028808594
    	grad: 3.0 6.0 -0.09557533264160156
    progress: 17 0.0002537401160225272
    	grad: 1.0 2.0 -0.00870823860168457
    	grad: 2.0 4.0 -0.03413581848144531
    	grad: 3.0 6.0 -0.07066154479980469
    progress: 18 0.00013869594840798527
    	grad: 1.0 2.0 -0.006437778472900391
    	grad: 2.0 4.0 -0.025236129760742188
    	grad: 3.0 6.0 -0.052239418029785156
    progress: 19 7.580435340059921e-05
    	grad: 1.0 2.0 -0.004759550094604492
    	grad: 2.0 4.0 -0.018657684326171875
    	grad: 3.0 6.0 -0.038620948791503906
    progress: 20 4.143271507928148e-05
    	grad: 1.0 2.0 -0.003518819808959961
    	grad: 2.0 4.0 -0.0137939453125
    	grad: 3.0 6.0 -0.028553009033203125
    progress: 21 2.264650902361609e-05
    	grad: 1.0 2.0 -0.00260162353515625
    	grad: 2.0 4.0 -0.010198593139648438
    	grad: 3.0 6.0 -0.021108627319335938
    progress: 22 1.2377059647405986e-05
    	grad: 1.0 2.0 -0.0019233226776123047
    	grad: 2.0 4.0 -0.0075397491455078125
    	grad: 3.0 6.0 -0.0156097412109375
    progress: 23 6.768445018678904e-06
    	grad: 1.0 2.0 -0.0014221668243408203
    	grad: 2.0 4.0 -0.0055751800537109375
    	grad: 3.0 6.0 -0.011541366577148438
    progress: 24 3.7000872907810844e-06
    	grad: 1.0 2.0 -0.0010514259338378906
    	grad: 2.0 4.0 -0.0041217803955078125
    	grad: 3.0 6.0 -0.008531570434570312
    progress: 25 2.021880391112063e-06
    	grad: 1.0 2.0 -0.0007772445678710938
    	grad: 2.0 4.0 -0.0030469894409179688
    	grad: 3.0 6.0 -0.006305694580078125
    progress: 26 1.1044940038118511e-06
    	grad: 1.0 2.0 -0.0005745887756347656
    	grad: 2.0 4.0 -0.0022525787353515625
    	grad: 3.0 6.0 -0.0046634674072265625
    progress: 27 6.041091182851233e-07
    	grad: 1.0 2.0 -0.0004248619079589844
    	grad: 2.0 4.0 -0.0016651153564453125
    	grad: 3.0 6.0 -0.003444671630859375
    progress: 28 3.296045179013163e-07
    	grad: 1.0 2.0 -0.0003139972686767578
    	grad: 2.0 4.0 -0.0012311935424804688
    	grad: 3.0 6.0 -0.0025491714477539062
    progress: 29 1.805076408345485e-07
    	grad: 1.0 2.0 -0.00023221969604492188
    	grad: 2.0 4.0 -0.0009107589721679688
    	grad: 3.0 6.0 -0.0018854141235351562
    progress: 30 9.874406714516226e-08
    	grad: 1.0 2.0 -0.00017189979553222656
    	grad: 2.0 4.0 -0.0006742477416992188
    	grad: 3.0 6.0 -0.00139617919921875
    progress: 31 5.4147676564753056e-08
    	grad: 1.0 2.0 -0.0001270771026611328
    	grad: 2.0 4.0 -0.0004978179931640625
    	grad: 3.0 6.0 -0.00102996826171875
    progress: 32 2.9467628337442875e-08
    	grad: 1.0 2.0 -9.393692016601562e-05
    	grad: 2.0 4.0 -0.0003681182861328125
    	grad: 3.0 6.0 -0.0007610321044921875
    progress: 33 1.6088051779661328e-08
    	grad: 1.0 2.0 -6.937980651855469e-05
    	grad: 2.0 4.0 -0.00027179718017578125
    	grad: 3.0 6.0 -0.000560760498046875
    progress: 34 8.734787115827203e-09
    	grad: 1.0 2.0 -5.125999450683594e-05
    	grad: 2.0 4.0 -0.00020122528076171875
    	grad: 3.0 6.0 -0.0004177093505859375
    progress: 35 4.8466972657479346e-09
    	grad: 1.0 2.0 -3.790855407714844e-05
    	grad: 2.0 4.0 -0.000148773193359375
    	grad: 3.0 6.0 -0.000308990478515625
    progress: 36 2.6520865503698587e-09
    	grad: 1.0 2.0 -2.8133392333984375e-05
    	grad: 2.0 4.0 -0.000110626220703125
    	grad: 3.0 6.0 -0.0002288818359375
    progress: 37 1.4551915228366852e-09
    	grad: 1.0 2.0 -2.09808349609375e-05
    	grad: 2.0 4.0 -8.20159912109375e-05
    	grad: 3.0 6.0 -0.00016880035400390625
    progress: 38 7.914877642178908e-10
    	grad: 1.0 2.0 -1.5497207641601562e-05
    	grad: 2.0 4.0 -6.103515625e-05
    	grad: 3.0 6.0 -0.000125885009765625
    progress: 39 4.4019543565809727e-10
    	grad: 1.0 2.0 -1.1444091796875e-05
    	grad: 2.0 4.0 -4.482269287109375e-05
    	grad: 3.0 6.0 -9.1552734375e-05
    progress: 40 2.3283064365386963e-10
    	grad: 1.0 2.0 -8.344650268554688e-06
    	grad: 2.0 4.0 -3.24249267578125e-05
    	grad: 3.0 6.0 -6.580352783203125e-05
    progress: 41 1.2028067430946976e-10
    	grad: 1.0 2.0 -5.9604644775390625e-06
    	grad: 2.0 4.0 -2.288818359375e-05
    	grad: 3.0 6.0 -4.57763671875e-05
    progress: 42 5.820766091346741e-11
    	grad: 1.0 2.0 -4.291534423828125e-06
    	grad: 2.0 4.0 -1.71661376953125e-05
    	grad: 3.0 6.0 -3.719329833984375e-05
    progress: 43 3.842615114990622e-11
    	grad: 1.0 2.0 -3.337860107421875e-06
    	grad: 2.0 4.0 -1.33514404296875e-05
    	grad: 3.0 6.0 -2.86102294921875e-05
    progress: 44 2.2737367544323206e-11
    	grad: 1.0 2.0 -2.6226043701171875e-06
    	grad: 2.0 4.0 -1.049041748046875e-05
    	grad: 3.0 6.0 -2.288818359375e-05
    progress: 45 1.4551915228366852e-11
    	grad: 1.0 2.0 -1.9073486328125e-06
    	grad: 2.0 4.0 -7.62939453125e-06
    	grad: 3.0 6.0 -1.430511474609375e-05
    progress: 46 5.6843418860808015e-12
    	grad: 1.0 2.0 -1.430511474609375e-06
    	grad: 2.0 4.0 -5.7220458984375e-06
    	grad: 3.0 6.0 -1.1444091796875e-05
    progress: 47 3.637978807091713e-12
    	grad: 1.0 2.0 -1.1920928955078125e-06
    	grad: 2.0 4.0 -4.76837158203125e-06
    	grad: 3.0 6.0 -1.1444091796875e-05
    progress: 48 3.637978807091713e-12
    	grad: 1.0 2.0 -9.5367431640625e-07
    	grad: 2.0 4.0 -3.814697265625e-06
    	grad: 3.0 6.0 -8.58306884765625e-06
    progress: 49 2.0463630789890885e-12
    	grad: 1.0 2.0 -7.152557373046875e-07
    	grad: 2.0 4.0 -2.86102294921875e-06
    	grad: 3.0 6.0 -5.7220458984375e-06
    progress: 50 9.094947017729282e-13
    	grad: 1.0 2.0 -7.152557373046875e-07
    	grad: 2.0 4.0 -2.86102294921875e-06
    	grad: 3.0 6.0 -5.7220458984375e-06
    progress: 51 9.094947017729282e-13
    	grad: 1.0 2.0 -7.152557373046875e-07
    	grad: 2.0 4.0 -2.86102294921875e-06
    	grad: 3.0 6.0 -5.7220458984375e-06
    progress: 52 9.094947017729282e-13
    	grad: 1.0 2.0 -7.152557373046875e-07
    	grad: 2.0 4.0 -2.86102294921875e-06
    	grad: 3.0 6.0 -5.7220458984375e-06
    progress: 53 9.094947017729282e-13
    	grad: 1.0 2.0 -7.152557373046875e-07
    	grad: 2.0 4.0 -2.86102294921875e-06
    	grad: 3.0 6.0 -5.7220458984375e-06
    progress: 54 9.094947017729282e-13
    	grad: 1.0 2.0 -7.152557373046875e-07
    	grad: 2.0 4.0 -2.86102294921875e-06
    	grad: 3.0 6.0 -5.7220458984375e-06
    progress: 55 9.094947017729282e-13
    	grad: 1.0 2.0 -7.152557373046875e-07
    	grad: 2.0 4.0 -2.86102294921875e-06
    	grad: 3.0 6.0 -5.7220458984375e-06
    progress: 56 9.094947017729282e-13
    	grad: 1.0 2.0 -7.152557373046875e-07
    	grad: 2.0 4.0 -2.86102294921875e-06
    	grad: 3.0 6.0 -5.7220458984375e-06
    progress: 57 9.094947017729282e-13
    	grad: 1.0 2.0 -7.152557373046875e-07
    	grad: 2.0 4.0 -2.86102294921875e-06
    	grad: 3.0 6.0 -5.7220458984375e-06
    progress: 58 9.094947017729282e-13
    	grad: 1.0 2.0 -7.152557373046875e-07
    	grad: 2.0 4.0 -2.86102294921875e-06
    	grad: 3.0 6.0 -5.7220458984375e-06
    progress: 59 9.094947017729282e-13
    	grad: 1.0 2.0 -7.152557373046875e-07
    	grad: 2.0 4.0 -2.86102294921875e-06
    	grad: 3.0 6.0 -5.7220458984375e-06
    progress: 60 9.094947017729282e-13
    	grad: 1.0 2.0 -7.152557373046875e-07
    	grad: 2.0 4.0 -2.86102294921875e-06
    	grad: 3.0 6.0 -5.7220458984375e-06
    progress: 61 9.094947017729282e-13
    	grad: 1.0 2.0 -7.152557373046875e-07
    	grad: 2.0 4.0 -2.86102294921875e-06
    	grad: 3.0 6.0 -5.7220458984375e-06
    progress: 62 9.094947017729282e-13
    	grad: 1.0 2.0 -7.152557373046875e-07
    	grad: 2.0 4.0 -2.86102294921875e-06
    	grad: 3.0 6.0 -5.7220458984375e-06
    progress: 63 9.094947017729282e-13
    	grad: 1.0 2.0 -7.152557373046875e-07
    	grad: 2.0 4.0 -2.86102294921875e-06
    	grad: 3.0 6.0 -5.7220458984375e-06
    progress: 64 9.094947017729282e-13
    	grad: 1.0 2.0 -7.152557373046875e-07
    	grad: 2.0 4.0 -2.86102294921875e-06
    	grad: 3.0 6.0 -5.7220458984375e-06
    progress: 65 9.094947017729282e-13
    	grad: 1.0 2.0 -7.152557373046875e-07
    	grad: 2.0 4.0 -2.86102294921875e-06
    	grad: 3.0 6.0 -5.7220458984375e-06
    progress: 66 9.094947017729282e-13
    	grad: 1.0 2.0 -7.152557373046875e-07
    	grad: 2.0 4.0 -2.86102294921875e-06
    	grad: 3.0 6.0 -5.7220458984375e-06
    progress: 67 9.094947017729282e-13
    	grad: 1.0 2.0 -7.152557373046875e-07
    	grad: 2.0 4.0 -2.86102294921875e-06
    	grad: 3.0 6.0 -5.7220458984375e-06
    progress: 68 9.094947017729282e-13
    	grad: 1.0 2.0 -7.152557373046875e-07
    	grad: 2.0 4.0 -2.86102294921875e-06
    	grad: 3.0 6.0 -5.7220458984375e-06
    progress: 69 9.094947017729282e-13
    	grad: 1.0 2.0 -7.152557373046875e-07
    	grad: 2.0 4.0 -2.86102294921875e-06
    	grad: 3.0 6.0 -5.7220458984375e-06
    progress: 70 9.094947017729282e-13
    	grad: 1.0 2.0 -7.152557373046875e-07
    	grad: 2.0 4.0 -2.86102294921875e-06
    	grad: 3.0 6.0 -5.7220458984375e-06
    progress: 71 9.094947017729282e-13
    	grad: 1.0 2.0 -7.152557373046875e-07
    	grad: 2.0 4.0 -2.86102294921875e-06
    	grad: 3.0 6.0 -5.7220458984375e-06
    progress: 72 9.094947017729282e-13
    	grad: 1.0 2.0 -7.152557373046875e-07
    	grad: 2.0 4.0 -2.86102294921875e-06
    	grad: 3.0 6.0 -5.7220458984375e-06
    progress: 73 9.094947017729282e-13
    	grad: 1.0 2.0 -7.152557373046875e-07
    	grad: 2.0 4.0 -2.86102294921875e-06
    	grad: 3.0 6.0 -5.7220458984375e-06
    progress: 74 9.094947017729282e-13
    	grad: 1.0 2.0 -7.152557373046875e-07
    	grad: 2.0 4.0 -2.86102294921875e-06
    	grad: 3.0 6.0 -5.7220458984375e-06
    progress: 75 9.094947017729282e-13
    	grad: 1.0 2.0 -7.152557373046875e-07
    	grad: 2.0 4.0 -2.86102294921875e-06
    	grad: 3.0 6.0 -5.7220458984375e-06
    progress: 76 9.094947017729282e-13
    	grad: 1.0 2.0 -7.152557373046875e-07
    	grad: 2.0 4.0 -2.86102294921875e-06
    	grad: 3.0 6.0 -5.7220458984375e-06
    progress: 77 9.094947017729282e-13
    	grad: 1.0 2.0 -7.152557373046875e-07
    	grad: 2.0 4.0 -2.86102294921875e-06
    	grad: 3.0 6.0 -5.7220458984375e-06
    progress: 78 9.094947017729282e-13
    	grad: 1.0 2.0 -7.152557373046875e-07
    	grad: 2.0 4.0 -2.86102294921875e-06
    	grad: 3.0 6.0 -5.7220458984375e-06
    progress: 79 9.094947017729282e-13
    	grad: 1.0 2.0 -7.152557373046875e-07
    	grad: 2.0 4.0 -2.86102294921875e-06
    	grad: 3.0 6.0 -5.7220458984375e-06
    progress: 80 9.094947017729282e-13
    	grad: 1.0 2.0 -7.152557373046875e-07
    	grad: 2.0 4.0 -2.86102294921875e-06
    	grad: 3.0 6.0 -5.7220458984375e-06
    progress: 81 9.094947017729282e-13
    	grad: 1.0 2.0 -7.152557373046875e-07
    	grad: 2.0 4.0 -2.86102294921875e-06
    	grad: 3.0 6.0 -5.7220458984375e-06
    progress: 82 9.094947017729282e-13
    	grad: 1.0 2.0 -7.152557373046875e-07
    	grad: 2.0 4.0 -2.86102294921875e-06
    	grad: 3.0 6.0 -5.7220458984375e-06
    progress: 83 9.094947017729282e-13
    	grad: 1.0 2.0 -7.152557373046875e-07
    	grad: 2.0 4.0 -2.86102294921875e-06
    	grad: 3.0 6.0 -5.7220458984375e-06
    progress: 84 9.094947017729282e-13
    	grad: 1.0 2.0 -7.152557373046875e-07
    	grad: 2.0 4.0 -2.86102294921875e-06
    	grad: 3.0 6.0 -5.7220458984375e-06
    progress: 85 9.094947017729282e-13
    	grad: 1.0 2.0 -7.152557373046875e-07
    	grad: 2.0 4.0 -2.86102294921875e-06
    	grad: 3.0 6.0 -5.7220458984375e-06
    progress: 86 9.094947017729282e-13
    	grad: 1.0 2.0 -7.152557373046875e-07
    	grad: 2.0 4.0 -2.86102294921875e-06
    	grad: 3.0 6.0 -5.7220458984375e-06
    progress: 87 9.094947017729282e-13
    	grad: 1.0 2.0 -7.152557373046875e-07
    	grad: 2.0 4.0 -2.86102294921875e-06
    	grad: 3.0 6.0 -5.7220458984375e-06
    progress: 88 9.094947017729282e-13
    	grad: 1.0 2.0 -7.152557373046875e-07
    	grad: 2.0 4.0 -2.86102294921875e-06
    	grad: 3.0 6.0 -5.7220458984375e-06
    progress: 89 9.094947017729282e-13
    	grad: 1.0 2.0 -7.152557373046875e-07
    	grad: 2.0 4.0 -2.86102294921875e-06
    	grad: 3.0 6.0 -5.7220458984375e-06
    progress: 90 9.094947017729282e-13
    	grad: 1.0 2.0 -7.152557373046875e-07
    	grad: 2.0 4.0 -2.86102294921875e-06
    	grad: 3.0 6.0 -5.7220458984375e-06
    progress: 91 9.094947017729282e-13
    	grad: 1.0 2.0 -7.152557373046875e-07
    	grad: 2.0 4.0 -2.86102294921875e-06
    	grad: 3.0 6.0 -5.7220458984375e-06
    progress: 92 9.094947017729282e-13
    	grad: 1.0 2.0 -7.152557373046875e-07
    	grad: 2.0 4.0 -2.86102294921875e-06
    	grad: 3.0 6.0 -5.7220458984375e-06
    progress: 93 9.094947017729282e-13
    	grad: 1.0 2.0 -7.152557373046875e-07
    	grad: 2.0 4.0 -2.86102294921875e-06
    	grad: 3.0 6.0 -5.7220458984375e-06
    progress: 94 9.094947017729282e-13
    	grad: 1.0 2.0 -7.152557373046875e-07
    	grad: 2.0 4.0 -2.86102294921875e-06
    	grad: 3.0 6.0 -5.7220458984375e-06
    progress: 95 9.094947017729282e-13
    	grad: 1.0 2.0 -7.152557373046875e-07
    	grad: 2.0 4.0 -2.86102294921875e-06
    	grad: 3.0 6.0 -5.7220458984375e-06
    progress: 96 9.094947017729282e-13
    	grad: 1.0 2.0 -7.152557373046875e-07
    	grad: 2.0 4.0 -2.86102294921875e-06
    	grad: 3.0 6.0 -5.7220458984375e-06
    progress: 97 9.094947017729282e-13
    	grad: 1.0 2.0 -7.152557373046875e-07
    	grad: 2.0 4.0 -2.86102294921875e-06
    	grad: 3.0 6.0 -5.7220458984375e-06
    progress: 98 9.094947017729282e-13
    	grad: 1.0 2.0 -7.152557373046875e-07
    	grad: 2.0 4.0 -2.86102294921875e-06
    	grad: 3.0 6.0 -5.7220458984375e-06
    progress: 99 9.094947017729282e-13
    predict (after training) 4 7.999998569488525
    

    (figure: training curve, Cost vs. Epoch)
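    One detail in the training loop above that is easy to miss is the call to w.grad.data.zero_(). PyTorch accumulates gradients across backward() calls by default, so skipping it would add each new gradient onto the previous one. A minimal standalone check of that behaviour (not part of the lecture code):

    import torch

    w = torch.tensor([1.0], requires_grad=True)

    (w * 3).backward()
    print(w.grad)   # tensor([3.])
    (w * 3).backward()
    print(w.grad)   # tensor([6.])  -- accumulated, not overwritten

    w.grad.zero_()  # reset before the next backward pass
    (w * 3).backward()
    print(w.grad)   # tensor([3.])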

    Assignment 4-3: compute the gradients of y = w1 * x^2 + w2 * x + b by hand

    (figure)
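    A sketch of the required gradients, again using the squared-error loss loss = (\hat{y} - y)^2 with \hat{y} = w_1 x^2 + w_2 x + b:

    \frac{\partial\, loss}{\partial w_1} = 2\,(\hat{y} - y)\, x^2, \qquad
    \frac{\partial\, loss}{\partial w_2} = 2\,(\hat{y} - y)\, x, \qquad
    \frac{\partial\, loss}{\partial b} = 2\,(\hat{y} - y).

    For the initial values w_1 = 1, w_2 = 2, b = 3 and the first sample (x = 1, y = 2), \hat{y} = 6 and all three gradients equal 8, matching the first line printed by the code below.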

    Assignment 4-4: implement 4-3 in code

    import torch
    import matplotlib.pyplot as plt

    # quadratic model: y_hat = w1 * x^2 + w2 * x + b
    def forward(x):
        return x ** 2 * w1 + x * w2 + b

    # squared-error loss for a single sample
    def loss(x, y):
        return (forward(x) - y) ** 2

    x_data = [1.0, 2.0, 3.0]
    y_data = [2.0, 4.0, 6.0]

    w1 = torch.tensor([1.0])
    w2 = torch.tensor([2.0])
    b = torch.tensor([3.0])
    w1.requires_grad = True
    w2.requires_grad = True
    b.requires_grad = True
    r = 0.01  # learning rate

    print('predict(before training):', 4, forward(4).item())

    epoch_list = []
    loss_list = []

    for epoch in range(100):
        for x, y in zip(x_data, y_data):
            l = loss(x, y)
            l.backward()
            print('\tgrad:', x, y, w1.grad.item(), w2.grad.item(), b.grad.item())
            # update each parameter via .data so the step is not tracked by autograd
            w1.data -= r * w1.grad.data
            w2.data -= r * w2.grad.data
            b.data -= r * b.grad.data

            # zero all three gradients so they do not accumulate into the next backward pass
            w1.grad.data.zero_()
            w2.grad.data.zero_()
            b.grad.data.zero_()

        epoch_list.append(epoch)
        loss_list.append(l.item())  # record the last sample's loss for this epoch
        print('progress:', epoch, l.item())

    print('predict(after training):', 4, forward(4).item())

    plt.plot(epoch_list, loss_list)
    plt.xlabel('Epoch')
    plt.ylabel('Loss')
    plt.show()
    

    Output:

    predict(before training): 4 27.0
    	grad: 1.0 2.0 8.0 8.0 8.0
    	grad: 2.0 4.0 51.52000427246094 25.76000213623047 12.880001068115234
    	grad: 3.0 6.0 97.58880615234375 32.52960205078125 10.84320068359375
    progress: 0 29.39375114440918
    	grad: 1.0 2.0 2.8975682258605957 2.8975682258605957 2.8975682258605957
    	grad: 2.0 4.0 -9.041646957397461 -4.5208234786987305 -2.2604117393493652
    	grad: 3.0 6.0 -69.30754089355469 -23.10251235961914 -7.7008376121521
    progress: 1 14.825724601745605
    	grad: 1.0 2.0 5.042389869689941 5.042389869689941 5.042389869689941
    	grad: 2.0 4.0 18.422988891601562 9.211494445800781 4.605747222900391
    	grad: 3.0 6.0 9.384512901306152 3.128170967102051 1.0427236557006836
    progress: 2 0.2718181610107422
    	grad: 1.0 2.0 3.8239336013793945 3.8239336013793945 3.8239336013793945
    	grad: 2.0 4.0 4.956966400146484 2.478483200073242 1.239241600036621
    	grad: 3.0 6.0 -26.23502540588379 -8.74500846862793 -2.9150028228759766
    progress: 3 2.124310255050659
    	grad: 1.0 2.0 4.1789045333862305 4.1789045333862305 4.1789045333862305
    	grad: 2.0 4.0 10.562461853027344 5.281230926513672 2.640615463256836
    	grad: 3.0 6.0 -8.704278945922852 -2.901426315307617 -0.9671421051025391
    progress: 4 0.2338409572839737
    	grad: 1.0 2.0 3.809941291809082 3.809941291809082 3.809941291809082
    	grad: 2.0 4.0 7.3196258544921875 3.6598129272460938 1.8299064636230469
    	grad: 3.0 6.0 -15.941230773925781 -5.313743591308594 -1.7712478637695312
    progress: 5 0.7843297719955444
    	grad: 1.0 2.0 3.785682201385498 3.785682201385498 3.785682201385498
    	grad: 2.0 4.0 8.218498229980469 4.109249114990234 2.054624557495117
    	grad: 3.0 6.0 -11.68948745727539 -3.896495819091797 -1.2988319396972656
    progress: 6 0.4217410981655121
    	grad: 1.0 2.0 3.6085901260375977 3.6085901260375977 3.6085901260375977
    	grad: 2.0 4.0 7.213901519775391 3.6069507598876953 1.8034753799438477
    	grad: 3.0 6.0 -12.817611694335938 -4.2725372314453125 -1.4241790771484375
    progress: 7 0.5070714950561523
    	grad: 1.0 2.0 3.5098752975463867 3.5098752975463867 3.5098752975463867
    	grad: 2.0 4.0 7.117706298828125 3.5588531494140625 1.7794265747070312
    	grad: 3.0 6.0 -11.475434303283691 -3.8251447677612305 -1.2750482559204102
    progress: 8 0.4064370095729828
    	grad: 1.0 2.0 3.3816747665405273 3.3816747665405273 3.3816747665405273
    	grad: 2.0 4.0 6.620697021484375 3.3103485107421875 1.6551742553710938
    	grad: 3.0 6.0 -11.314312934875488 -3.771437644958496 -1.257145881652832
    progress: 9 0.39510393142700195
    	grad: 1.0 2.0 3.2739076614379883 3.2739076614379883 3.2739076614379883
    	grad: 2.0 4.0 6.331199645996094 3.165599822998047 1.5827999114990234
    	grad: 3.0 6.0 -10.634078979492188 -3.5446929931640625 -1.1815643310546875
    progress: 10 0.34902358055114746
    	grad: 1.0 2.0 3.163087844848633 3.163087844848633 3.163087844848633
    	grad: 2.0 4.0 5.965343475341797 2.9826717376708984 1.4913358688354492
    	grad: 3.0 6.0 -10.224632263183594 -3.4082107543945312 -1.1360702514648438
    progress: 11 0.32266390323638916
    	grad: 1.0 2.0 3.0598936080932617 3.0598936080932617 3.0598936080932617
    	grad: 2.0 4.0 5.654441833496094 2.827220916748047 1.4136104583740234
    	grad: 3.0 6.0 -9.717287063598633 -3.239095687866211 -1.0796985626220703
    progress: 12 0.2914372384548187
    	grad: 1.0 2.0 2.9591164588928223 2.9591164588928223 2.9591164588928223
    	grad: 2.0 4.0 5.336635589599609 2.6683177947998047 1.3341588973999023
    	grad: 3.0 6.0 -9.282554626464844 -3.0941848754882812 -1.0313949584960938
    progress: 13 0.26594388484954834
    	grad: 1.0 2.0 2.862949848175049 2.862949848175049 2.862949848175049
    	grad: 2.0 4.0 5.0399932861328125 2.5199966430664062 1.2599983215332031
    	grad: 3.0 6.0 -8.839994430541992 -2.946664810180664 -0.9822216033935547
    progress: 14 0.24118982255458832
    	grad: 1.0 2.0 2.7701501846313477 2.7701501846313477 2.7701501846313477
    	grad: 2.0 4.0 4.750751495361328 2.375375747680664 1.187687873840332
    	grad: 3.0 6.0 -8.426067352294922 -2.8086891174316406 -0.9362297058105469
    progress: 15 0.21913151443004608
    	grad: 1.0 2.0 2.681084632873535 2.681084632873535 2.681084632873535
    	grad: 2.0 4.0 4.4746551513671875 2.2373275756835938 1.1186637878417969
    	grad: 3.0 6.0 -8.022817611694336 -2.6742725372314453 -0.8914241790771484
    progress: 16 0.1986592710018158
    	grad: 1.0 2.0 2.595376968383789 2.595376968383789 2.595376968383789
    	grad: 2.0 4.0 4.2083892822265625 2.1041946411132812 1.0520973205566406
    	grad: 3.0 6.0 -7.637712478637695 -2.5459041595458984 -0.8486347198486328
    progress: 17 0.1800452172756195
    	grad: 1.0 2.0 2.513005256652832 2.513005256652832 2.513005256652832
    	grad: 2.0 4.0 3.9528884887695312 1.9764442443847656 0.9882221221923828
    	grad: 3.0 6.0 -7.266348838806152 -2.422116279602051 -0.8073720932006836
    progress: 18 0.1629624217748642
    	grad: 1.0 2.0 2.433791160583496 2.433791160583496 2.433791160583496
    	grad: 2.0 4.0 3.707111358642578 1.853555679321289 0.9267778396606445
    	grad: 3.0 6.0 -6.909919738769531 -2.3033065795898438 -0.7677688598632812
    progress: 19 0.14736725389957428
    	grad: 1.0 2.0 2.3576345443725586 2.3576345443725586 2.3576345443725586
    	grad: 2.0 4.0 3.470977783203125 1.7354888916015625 0.8677444458007812
    	grad: 3.0 6.0 -6.56707763671875 -2.18902587890625 -0.72967529296875
    progress: 20 0.1331065148115158
    	grad: 1.0 2.0 2.2844080924987793 2.2844080924987793 2.2844080924987793
    	grad: 2.0 4.0 3.2439804077148438 1.6219902038574219 0.8109951019287109
    	grad: 3.0 6.0 -6.237607955932617 -2.079202651977539 -0.6930675506591797
    progress: 21 0.12008565664291382
    	grad: 1.0 2.0 2.2140016555786133 2.2140016555786133 2.2140016555786133
    	grad: 2.0 4.0 3.02581787109375 1.512908935546875 0.7564544677734375
    	grad: 3.0 6.0 -5.920875549316406 -1.9736251831054688 -0.6578750610351562
    progress: 22 0.10819990187883377
    	grad: 1.0 2.0 2.146306037902832 2.146306037902832 2.146306037902832
    	grad: 2.0 4.0 2.8161354064941406 1.4080677032470703 0.7040338516235352
    	grad: 3.0 6.0 -5.61646842956543 -1.8721561431884766 -0.6240520477294922
    progress: 23 0.09736023843288422
    	grad: 1.0 2.0 2.0812158584594727 2.0812158584594727 2.0812158584594727
    	grad: 2.0 4.0 2.6146163940429688 1.3073081970214844 0.6536540985107422
    	grad: 3.0 6.0 -5.3238115310668945 -1.7746038436889648 -0.5915346145629883
    progress: 24 0.08747830241918564
    	grad: 1.0 2.0 2.0186309814453125 2.0186309814453125 2.0186309814453125
    	grad: 2.0 4.0 2.4209213256835938 1.2104606628417969 0.6052303314208984
    	grad: 3.0 6.0 -5.0425615310668945 -1.6808538436889648 -0.5602846145629883
    progress: 25 0.07847971469163895
    	grad: 1.0 2.0 1.9584546089172363 1.9584546089172363 1.9584546089172363
    	grad: 2.0 4.0 2.2347793579101562 1.1173896789550781 0.5586948394775391
    	grad: 3.0 6.0 -4.772186279296875 -1.590728759765625 -0.530242919921875
    progress: 26 0.07028938829898834
    	grad: 1.0 2.0 1.9005932807922363 1.9005932807922363 1.9005932807922363
    	grad: 2.0 4.0 2.055877685546875 1.0279388427734375 0.5139694213867188
    	grad: 3.0 6.0 -4.512325286865234 -1.5041084289550781 -0.5013694763183594
    progress: 27 0.06284283846616745
    	grad: 1.0 2.0 1.8449583053588867 1.8449583053588867 1.8449583053588867
    	grad: 2.0 4.0 1.883941650390625 0.9419708251953125 0.47098541259765625
    	grad: 3.0 6.0 -4.262515068054199 -1.4208383560180664 -0.47361278533935547
    progress: 28 0.0560772679746151
    	grad: 1.0 2.0 1.7914619445800781 1.7914619445800781 1.7914619445800781
    	grad: 2.0 4.0 1.7186965942382812 0.8593482971191406 0.4296741485595703
    	grad: 3.0 6.0 -4.022438049316406 -1.3408126831054688 -0.44693756103515625
    progress: 29 0.04993829503655434
    	grad: 1.0 2.0 1.7400236129760742 1.7400236129760742 1.7400236129760742
    	grad: 2.0 4.0 1.5598983764648438 0.7799491882324219 0.38997459411621094
    	grad: 3.0 6.0 -3.791656494140625 -1.263885498046875 -0.421295166015625
    progress: 30 0.04437240585684776
    	grad: 1.0 2.0 1.6905627250671387 1.6905627250671387 1.6905627250671387
    	grad: 2.0 4.0 1.4072799682617188 0.7036399841308594 0.3518199920654297
    	grad: 3.0 6.0 -3.569835662841797 -1.1899452209472656 -0.3966484069824219
    progress: 31 0.03933249041438103
    	grad: 1.0 2.0 1.6430025100708008 1.6430025100708008 1.6430025100708008
    	grad: 2.0 4.0 1.2606124877929688 0.6303062438964844 0.3151531219482422
    	grad: 3.0 6.0 -3.356649398803711 -1.1188831329345703 -0.37296104431152344
    progress: 32 0.034774985164403915
    	grad: 1.0 2.0 1.5972709655761719 1.5972709655761719 1.5972709655761719
    	grad: 2.0 4.0 1.1196670532226562 0.5598335266113281 0.27991676330566406
    	grad: 3.0 6.0 -3.1517200469970703 -1.0505733489990234 -0.3501911163330078
    progress: 33 0.030658453702926636
    	grad: 1.0 2.0 1.5532960891723633 1.5532960891723633 1.5532960891723633
    	grad: 2.0 4.0 0.9842185974121094 0.4921092987060547 0.24605464935302734
    	grad: 3.0 6.0 -2.95477294921875 -0.98492431640625 -0.32830810546875
    progress: 34 0.026946552097797394
    	grad: 1.0 2.0 1.5110101699829102 1.5110101699829102 1.5110101699829102
    	grad: 2.0 4.0 0.8540611267089844 0.4270305633544922 0.2135152816772461
    	grad: 3.0 6.0 -2.7654476165771484 -0.9218158721923828 -0.30727195739746094
    progress: 35 0.02360401302576065
    	grad: 1.0 2.0 1.4703483581542969 1.4703483581542969 1.4703483581542969
    	grad: 2.0 4.0 0.728973388671875 0.3644866943359375 0.18224334716796875
    	grad: 3.0 6.0 -2.5835208892822266 -0.8611736297607422 -0.28705787658691406
    progress: 36 0.02060055546462536
    	grad: 1.0 2.0 1.4312481880187988 1.4312481880187988 1.4312481880187988
    	grad: 2.0 4.0 0.6087875366210938 0.3043937683105469 0.15219688415527344
    	grad: 3.0 6.0 -2.408632278442383 -0.8028774261474609 -0.2676258087158203
    progress: 37 0.017905892804265022
    	grad: 1.0 2.0 1.3936481475830078 1.3936481475830078 1.3936481475830078
    	grad: 2.0 4.0 0.4932861328125 0.24664306640625 0.123321533203125
    	grad: 3.0 6.0 -2.2405757904052734 -0.7468585968017578 -0.24895286560058594
    progress: 38 0.01549438200891018
    	grad: 1.0 2.0 1.3574919700622559 1.3574919700622559 1.3574919700622559
    	grad: 2.0 4.0 0.3823051452636719 0.19115257263183594 0.09557628631591797
    	grad: 3.0 6.0 -2.079008102416992 -0.6930027008056641 -0.2310009002685547
    progress: 39 0.013340353965759277
    	grad: 1.0 2.0 1.3227224349975586 1.3227224349975586 1.3227224349975586
    	grad: 2.0 4.0 0.27565765380859375 0.13782882690429688 0.06891441345214844
    	grad: 3.0 6.0 -1.9237918853759766 -0.6412639617919922 -0.21375465393066406
    progress: 40 0.011422762647271156
    	grad: 1.0 2.0 1.2892866134643555 1.2892866134643555 1.2892866134643555
    	grad: 2.0 4.0 0.17319488525390625 0.08659744262695312 0.04329872131347656
    	grad: 3.0 6.0 -1.7745494842529297 -0.5915164947509766 -0.1971721649169922
    progress: 41 0.009719215333461761
    	grad: 1.0 2.0 1.2571325302124023 1.2571325302124023 1.2571325302124023
    	grad: 2.0 4.0 0.07473373413085938 0.03736686706542969 0.018683433532714844
    	grad: 3.0 6.0 -1.631169319152832 -0.5437231063842773 -0.18124103546142578
    progress: 42 0.008212078362703323
    	grad: 1.0 2.0 1.2262120246887207 1.2262120246887207 1.2262120246887207
    	grad: 2.0 4.0 -0.019863128662109375 -0.009931564331054688 -0.004965782165527344
    	grad: 3.0 6.0 -1.4933252334594727 -0.4977750778198242 -0.1659250259399414
    progress: 43 0.0068827783688902855
    	grad: 1.0 2.0 1.1964750289916992 1.1964750289916992 1.1964750289916992
    	grad: 2.0 4.0 -0.1107635498046875 -0.05538177490234375 -0.027690887451171875
    	grad: 3.0 6.0 -1.3608970642089844 -0.4536323547363281 -0.15121078491210938
    progress: 44 0.005716175306588411
    	grad: 1.0 2.0 1.1678781509399414 1.1678781509399414 1.1678781509399414
    	grad: 2.0 4.0 -0.198089599609375 -0.0990447998046875 -0.04952239990234375
    	grad: 3.0 6.0 -1.2336015701293945 -0.41120052337646484 -0.13706684112548828
    progress: 45 0.0046968297101557255
    	grad: 1.0 2.0 1.140376091003418 1.140376091003418 1.140376091003418
    	grad: 2.0 4.0 -0.2819938659667969 -0.14099693298339844 -0.07049846649169922
    	grad: 3.0 6.0 -1.11126708984375 -0.37042236328125 -0.12347412109375
    progress: 46 0.0038114646449685097
    	grad: 1.0 2.0 1.1139264106750488 1.1139264106750488 1.1139264106750488
    	grad: 2.0 4.0 -0.36260414123535156 -0.18130207061767578 -0.09065103530883789
    	grad: 3.0 6.0 -0.99371337890625 -0.33123779296875 -0.11041259765625
    progress: 47 0.0030477354303002357
    	grad: 1.0 2.0 1.088489055633545 1.088489055633545 1.088489055633545
    	grad: 2.0 4.0 -0.4400444030761719 -0.22002220153808594 -0.11001110076904297
    	grad: 3.0 6.0 -0.8807430267333984 -0.2935810089111328 -0.09786033630371094
    progress: 48 0.002394161419942975
    	grad: 1.0 2.0 1.0640249252319336 1.0640249252319336 1.0640249252319336
    	grad: 2.0 4.0 -0.5144424438476562 -0.2572212219238281 -0.12861061096191406
    	grad: 3.0 6.0 -0.7721672058105469 -0.2573890686035156 -0.08579635620117188
    progress: 49 0.0018402537098154426
    	grad: 1.0 2.0 1.0404958724975586 1.0404958724975586 1.0404958724975586
    	grad: 2.0 4.0 -0.5859127044677734 -0.2929563522338867 -0.14647817611694336
    	grad: 3.0 6.0 -0.6678228378295898 -0.22260761260986328 -0.0742025375366211
    progress: 50 0.0013765041949227452
    	grad: 1.0 2.0 1.0178661346435547 1.0178661346435547 1.0178661346435547
    	grad: 2.0 4.0 -0.6545791625976562 -0.3272895812988281 -0.16364479064941406
    	grad: 3.0 6.0 -0.5675897598266602 -0.18919658660888672 -0.0630655288696289
    progress: 51 0.0009943152545019984
    	grad: 1.0 2.0 0.9961013793945312 0.9961013793945312 0.9961013793945312
    	grad: 2.0 4.0 -0.7205257415771484 -0.3602628707885742 -0.1801314353942871
    	grad: 3.0 6.0 -0.4712362289428711 -0.15707874298095703 -0.052359580993652344
    progress: 52 0.0006853814120404422
    	grad: 1.0 2.0 0.9751672744750977 0.9751672744750977 0.9751672744750977
    	grad: 2.0 4.0 -0.7838821411132812 -0.3919410705566406 -0.1959705352783203
    	grad: 3.0 6.0 -0.37866783142089844 -0.1262226104736328 -0.04207420349121094
    progress: 53 0.00044255965622141957
    	grad: 1.0 2.0 0.9550323486328125 0.9550323486328125 0.9550323486328125
    	grad: 2.0 4.0 -0.8447341918945312 -0.4223670959472656 -0.2111835479736328
    	grad: 3.0 6.0 -0.2897043228149414 -0.09656810760498047 -0.032189369201660156
    progress: 54 0.000259038875810802
    	grad: 1.0 2.0 0.9356646537780762 0.9356646537780762 0.9356646537780762
    	grad: 2.0 4.0 -0.9031887054443359 -0.45159435272216797 -0.22579717636108398
    	grad: 3.0 6.0 -0.2042255401611328 -0.06807518005371094 -0.022691726684570312
    progress: 55 0.0001287286140723154
    	grad: 1.0 2.0 0.9170365333557129 0.9170365333557129 0.9170365333557129
    	grad: 2.0 4.0 -0.9593276977539062 -0.4796638488769531 -0.23983192443847656
    	grad: 3.0 6.0 -0.12209415435791016 -0.04069805145263672 -0.013566017150878906
    progress: 56 4.6009205107111484e-05
    	grad: 1.0 2.0 0.8991179466247559 0.8991179466247559 0.8991179466247559
    	grad: 2.0 4.0 -1.0132503509521484 -0.5066251754760742 -0.2533125877380371
    	grad: 3.0 6.0 -0.043181419372558594 -0.014393806457519531 -0.004797935485839844
    progress: 57 5.755046004196629e-06
    	grad: 1.0 2.0 0.8818821907043457 0.8818821907043457 0.8818821907043457
    	grad: 2.0 4.0 -1.065032958984375 -0.5325164794921875 -0.26625823974609375
    	grad: 3.0 6.0 0.03263282775878906 0.010877609252929688 0.0036258697509765625
    progress: 58 3.28673286276171e-06
    	grad: 1.0 2.0 0.8653020858764648 0.8653020858764648 0.8653020858764648
    	grad: 2.0 4.0 -1.1147651672363281 -0.5573825836181641 -0.27869129180908203
    	grad: 3.0 6.0 0.10549449920654297 0.035164833068847656 0.011721611022949219
    progress: 59 3.4349042834946886e-05
    	grad: 1.0 2.0 0.8493533134460449 0.8493533134460449 0.8493533134460449
    	grad: 2.0 4.0 -1.1625251770019531 -0.5812625885009766 -0.2906312942504883
    	grad: 3.0 6.0 0.17549800872802734 0.05849933624267578 0.019499778747558594
    progress: 60 9.506034257356077e-05
    	grad: 1.0 2.0 0.834010124206543 0.834010124206543 0.834010124206543
    	grad: 2.0 4.0 -1.2083892822265625 -0.6041946411132812 -0.3020973205566406
    	grad: 3.0 6.0 0.24274635314941406 0.08091545104980469 0.026971817016601562
    progress: 61 0.00018186973466072232
    	grad: 1.0 2.0 0.8192505836486816 0.8192505836486816 0.8192505836486816
    	grad: 2.0 4.0 -1.2524280548095703 -0.6262140274047852 -0.3131070137023926
    	grad: 3.0 6.0 0.30735111236572266 0.10245037078857422 0.034150123596191406
    progress: 62 0.0002915577497333288
    	grad: 1.0 2.0 0.805051326751709 0.805051326751709 0.805051326751709
    	grad: 2.0 4.0 -1.2947158813476562 -0.6473579406738281 -0.32367897033691406
    	grad: 3.0 6.0 0.36942386627197266 0.12314128875732422 0.041047096252441406
    progress: 63 0.0004212160420138389
    	grad: 1.0 2.0 0.7913913726806641 0.7913913726806641 0.7913913726806641
    	grad: 2.0 4.0 -1.3353118896484375 -0.6676559448242188 -0.3338279724121094
    	grad: 3.0 6.0 0.4290504455566406 0.14301681518554688 0.047672271728515625
    progress: 64 0.0005681613693013787
    	grad: 1.0 2.0 0.7782487869262695 0.7782487869262695 0.7782487869262695
    	grad: 2.0 4.0 -1.3742942810058594 -0.6871471405029297 -0.34357357025146484
    	grad: 3.0 6.0 0.48633384704589844 0.1621112823486328 0.05403709411621094
    progress: 65 0.0007300018914975226
    	grad: 1.0 2.0 0.7656049728393555 0.7656049728393555 0.7656049728393555
    	grad: 2.0 4.0 -1.4117164611816406 -0.7058582305908203 -0.35292911529541016
    	grad: 3.0 6.0 0.5413684844970703 0.18045616149902344 0.06015205383300781
    progress: 66 0.0009045674232766032
    	grad: 1.0 2.0 0.7534389495849609 0.7534389495849609 0.7534389495849609
    	grad: 2.0 4.0 -1.4476470947265625 -0.7238235473632812 -0.3619117736816406
    	grad: 3.0 6.0 0.5942230224609375 0.1980743408203125 0.0660247802734375
    progress: 67 0.0010898178443312645
    	grad: 1.0 2.0 0.7417335510253906 0.7417335510253906 0.7417335510253906
    	grad: 2.0 4.0 -1.4821281433105469 -0.7410640716552734 -0.3705320358276367
    	grad: 3.0 6.0 0.6450004577636719 0.21500015258789062 0.07166671752929688
    progress: 68 0.0012840295676141977
    	grad: 1.0 2.0 0.7304706573486328 0.7304706573486328 0.7304706573486328
    	grad: 2.0 4.0 -1.5152320861816406 -0.7576160430908203 -0.37880802154541016
    	grad: 3.0 6.0 0.6937780380249023 0.23125934600830078 0.0770864486694336
    progress: 69 0.001485580112785101
    	grad: 1.0 2.0 0.7196331024169922 0.7196331024169922 0.7196331024169922
    	grad: 2.0 4.0 -1.547006607055664 -0.773503303527832 -0.386751651763916
    	grad: 3.0 6.0 0.7406158447265625 0.2468719482421875 0.0822906494140625
    progress: 70 0.0016929376870393753
    	grad: 1.0 2.0 0.709205150604248 0.709205150604248 0.709205150604248
    	grad: 2.0 4.0 -1.5774974822998047 -0.7887487411499023 -0.39437437057495117
    	grad: 3.0 6.0 0.7856168746948242 0.2618722915649414 0.08729076385498047
    progress: 71 0.0019049193942919374
    	grad: 1.0 2.0 0.6991701126098633 0.6991701126098633 0.6991701126098633
    	grad: 2.0 4.0 -1.6067638397216797 -0.8033819198608398 -0.4016909599304199
    	grad: 3.0 6.0 0.8288154602050781 0.2762718200683594 0.09209060668945312
    progress: 72 0.0021201700437813997
    	grad: 1.0 2.0 0.6895127296447754 0.6895127296447754 0.6895127296447754
    	grad: 2.0 4.0 -1.6348438262939453 -0.8174219131469727 -0.40871095657348633
    	grad: 3.0 6.0 0.8703403472900391 0.2901134490966797 0.09670448303222656
    progress: 73 0.002337939338758588
    	grad: 1.0 2.0 0.6802182197570801 0.6802182197570801 0.6802182197570801
    	grad: 2.0 4.0 -1.661794662475586 -0.830897331237793 -0.4154486656188965
    	grad: 3.0 6.0 0.9101743698120117 0.3033914566040039 0.10113048553466797
    progress: 74 0.0025568436831235886
    	grad: 1.0 2.0 0.6712737083435059 0.6712737083435059 0.6712737083435059
    	grad: 2.0 4.0 -1.687643051147461 -0.8438215255737305 -0.42191076278686523
    	grad: 3.0 6.0 0.9484634399414062 0.31615447998046875 0.10538482666015625
    progress: 75 0.0027764905244112015
    	grad: 1.0 2.0 0.6626648902893066 0.6626648902893066 0.6626648902893066
    	grad: 2.0 4.0 -1.7124481201171875 -0.8562240600585938 -0.4281120300292969
    	grad: 3.0 6.0 0.985224723815918 0.32840824127197266 0.10946941375732422
    progress: 76 0.0029958882369101048
    	grad: 1.0 2.0 0.6543788909912109 0.6543788909912109 0.6543788909912109
    	grad: 2.0 4.0 -1.7362480163574219 -0.8681240081787109 -0.43406200408935547
    	grad: 3.0 6.0 1.0205097198486328 0.34016990661621094 0.11338996887207031
    progress: 77 0.00321432133205235
    	grad: 1.0 2.0 0.6464033126831055 0.6464033126831055 0.6464033126831055
    	grad: 2.0 4.0 -1.7590675354003906 -0.8795337677001953 -0.43976688385009766
    	grad: 3.0 6.0 1.054387092590332 0.35146236419677734 0.11715412139892578
    progress: 78 0.003431271994486451
    	grad: 1.0 2.0 0.6387262344360352 0.6387262344360352 0.6387262344360352
    	grad: 2.0 4.0 -1.7809600830078125 -0.8904800415039062 -0.4452400207519531
    	grad: 3.0 6.0 1.0869426727294922 0.36231422424316406 0.12077140808105469
    progress: 79 0.0036464333534240723
    	grad: 1.0 2.0 0.6313357353210449 0.6313357353210449 0.6313357353210449
    	grad: 2.0 4.0 -1.8019561767578125 -0.9009780883789062 -0.4504890441894531
    	grad: 3.0 6.0 1.1181678771972656 0.3727226257324219 0.12424087524414062
    progress: 80 0.003858948824927211
    	grad: 1.0 2.0 0.6242218017578125 0.6242218017578125 0.6242218017578125
    	grad: 2.0 4.0 -1.8220863342285156 -0.9110431671142578 -0.4555215835571289
    	grad: 3.0 6.0 1.1481657028198242 0.3827219009399414 0.12757396697998047
    progress: 81 0.004068779293447733
    	grad: 1.0 2.0 0.6173720359802246 0.6173720359802246 0.6173720359802246
    	grad: 2.0 4.0 -1.8413944244384766 -0.9206972122192383 -0.46034860610961914
    	grad: 3.0 6.0 1.1769447326660156 0.3923149108886719 0.13077163696289062
    progress: 82 0.004275305196642876
    	grad: 1.0 2.0 0.6107778549194336 0.6107778549194336 0.6107778549194336
    	grad: 2.0 4.0 -1.8599014282226562 -0.9299507141113281 -0.46497535705566406
    	grad: 3.0 6.0 1.2045907974243164 0.40153026580810547 0.13384342193603516
    progress: 83 0.004478515591472387
    	grad: 1.0 2.0 0.6044282913208008 0.6044282913208008 0.6044282913208008
    	grad: 2.0 4.0 -1.8776435852050781 -0.9388217926025391 -0.46941089630126953
    	grad: 3.0 6.0 1.2311210632324219 0.4103736877441406 0.13679122924804688
    progress: 84 0.004677960183471441
    	grad: 1.0 2.0 0.5983142852783203 0.5983142852783203 0.5983142852783203
    	grad: 2.0 4.0 -1.8946495056152344 -0.9473247528076172 -0.4736623764038086
    	grad: 3.0 6.0 1.2565698623657227 0.4188566207885742 0.1396188735961914
    progress: 85 0.004873357247561216
    	grad: 1.0 2.0 0.5924272537231445 0.5924272537231445 0.5924272537231445
    	grad: 2.0 4.0 -1.9109458923339844 -0.9554729461669922 -0.4777364730834961
    	grad: 3.0 6.0 1.281005859375 0.427001953125 0.142333984375
    progress: 86 0.005064740777015686
    	grad: 1.0 2.0 0.5867581367492676 0.5867581367492676 0.5867581367492676
    	grad: 2.0 4.0 -1.9265632629394531 -0.9632816314697266 -0.4816408157348633
    	grad: 3.0 6.0 1.3044633865356445 0.43482112884521484 0.14494037628173828
    progress: 87 0.005251928232610226
    	grad: 1.0 2.0 0.5812978744506836 0.5812978744506836 0.5812978744506836
    	grad: 2.0 4.0 -1.9415245056152344 -0.9707622528076172 -0.4853811264038086
    	grad: 3.0 6.0 1.3269596099853516 0.4423198699951172 0.14743995666503906
    progress: 88 0.00543463509529829
    	grad: 1.0 2.0 0.5760388374328613 0.5760388374328613 0.5760388374328613
    	grad: 2.0 4.0 -1.955862045288086 -0.977931022644043 -0.4889655113220215
    	grad: 3.0 6.0 1.3485546112060547 0.44951820373535156 0.1498394012451172
    progress: 89 0.005612961482256651
    	grad: 1.0 2.0 0.5709733963012695 0.5709733963012695 0.5709733963012695
    	grad: 2.0 4.0 -1.9695930480957031 -0.9847965240478516 -0.4923982620239258
    	grad: 3.0 6.0 1.3692569732666016 0.4564189910888672 0.15213966369628906
    progress: 90 0.005786619149148464
    	grad: 1.0 2.0 0.5660943984985352 0.5660943984985352 0.5660943984985352
    	grad: 2.0 4.0 -1.9827346801757812 -0.9913673400878906 -0.4956836700439453
    	grad: 3.0 6.0 1.3891525268554688 0.46305084228515625 0.15435028076171875
    progress: 91 0.00595600251108408
    	grad: 1.0 2.0 0.5613937377929688 0.5613937377929688 0.5613937377929688
    	grad: 2.0 4.0 -1.995330810546875 -0.9976654052734375 -0.49883270263671875
    	grad: 3.0 6.0 1.4081897735595703 0.46939659118652344 0.1564655303955078
    progress: 92 0.0061203655786812305
    	grad: 1.0 2.0 0.5568656921386719 0.5568656921386719 0.5568656921386719
    	grad: 2.0 4.0 -2.0073795318603516 -1.0036897659301758 -0.5018448829650879
    	grad: 3.0 6.0 1.4265060424804688 0.47550201416015625 0.15850067138671875
    progress: 93 0.006280615925788879
    	grad: 1.0 2.0 0.5525016784667969 0.5525016784667969 0.5525016784667969
    	grad: 2.0 4.0 -2.0189208984375 -1.00946044921875 -0.504730224609375
    	grad: 3.0 6.0 1.4440240859985352 0.4813413619995117 0.1604471206665039
    progress: 94 0.006435819435864687
    	grad: 1.0 2.0 0.5482978820800781 0.5482978820800781 0.5482978820800781
    	grad: 2.0 4.0 -2.0299606323242188 -1.0149803161621094 -0.5074901580810547
    	grad: 3.0 6.0 1.460855484008789 0.4869518280029297 0.16231727600097656
    progress: 95 0.006586724426597357
    	grad: 1.0 2.0 0.544245719909668 0.544245719909668 0.544245719909668
    	grad: 2.0 4.0 -2.04052734375 -1.020263671875 -0.5101318359375
    	grad: 3.0 6.0 1.4769744873046875 0.4923248291015625 0.1641082763671875
    progress: 96 0.006732881534844637
    	grad: 1.0 2.0 0.5403413772583008 0.5403413772583008 0.5403413772583008
    	grad: 2.0 4.0 -2.050628662109375 -1.0253143310546875 -0.5126571655273438
    	grad: 3.0 6.0 1.4924583435058594 0.4974861145019531 0.16582870483398438
    progress: 97 0.006874789949506521
    	grad: 1.0 2.0 0.5365777015686035 0.5365777015686035 0.5365777015686035
    	grad: 2.0 4.0 -2.0602989196777344 -1.0301494598388672 -0.5150747299194336
    	grad: 3.0 6.0 1.5072813034057617 0.5024271011352539 0.16747570037841797
    progress: 98 0.007012027781456709
    	grad: 1.0 2.0 0.5329499244689941 0.5329499244689941 0.5329499244689941
    	grad: 2.0 4.0 -2.0695419311523438 -1.0347709655761719 -0.5173854827880859
    	grad: 3.0 6.0 1.5214862823486328 0.5071620941162109 0.1690540313720703
    progress: 99 0.007144816219806671
    predict(after training): 4 8.724994659423828
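    A remark on the final prediction (my own observation, not from the lecture): the training data follow y = 2x, so the quadratic model admits an exact fit at

    w_1 = 0, \qquad w_2 = 2, \qquad b = 0,

    but gradient descent here settles on a different parameter combination that fits the three training points almost as well (final per-sample loss around 0.007) and extrapolates to \hat{y}(4) \approx 8.72 rather than 8.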
    

    (figure: training curve, Loss vs. Epoch)

    Original post: https://blog.csdn.net/qq_44948213/article/details/126382608