
Problem:

RuntimeError: Trying to backward through the graph a second time, but the saved intermediate results have already been freed. Specify retain_graph=True when calling .backward() or autograd.grad() the first time.

Solution: First, understand one thing: every variable that participates in the computation of `XXX` in `XXX.backward()` carries a `grad_fn` link in the autograd graph. The purpose of backpropagation is to update the network's weights and biases, so any variable that is reused across iterations but is not needed for computing those weight and bias gradients should be processed with `.detach()`. Otherwise, a later `backward()` call will try to traverse the already-freed graph of an earlier iteration, triggering this error.
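A minimal sketch of the situation described above (the variable names `w`, `x`, and `h` are illustrative, not from the original post): a hidden state `h` that keeps its graph history across loop iterations makes the second `backward()` re-traverse the first iteration's freed graph; detaching `h` after each update cuts the history and avoids the error.

```python
import torch

w = torch.randn(1, requires_grad=True)
x = torch.randn(1)

# Buggy version: h accumulates graph history across iterations,
# so the second backward() walks through the first (freed) graph.
h = torch.zeros(1)
failed = False
for step in range(2):
    h = h + w * x
    loss = (h ** 2).sum()
    try:
        loss.backward()
    except RuntimeError:
        # "Trying to backward through the graph a second time ..."
        failed = True
print("error reproduced:", failed)

# Fixed version: detach h after each backward() so it no longer
# carries a grad_fn into the next iteration's graph.
w.grad = None
h = torch.zeros(1)
for step in range(2):
    h = h + w * x
    loss = (h ** 2).sum()
    loss.backward()
    h = h.detach()  # cut the graph; h is now a plain value
print("fixed version ran; grad =", w.grad)
```

Passing `retain_graph=True` also silences the error, but it keeps every iteration's graph alive and grows memory; `.detach()` is the right fix when the old graph is genuinely no longer needed.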

Reference:

"RuntimeError: Trying to backward through the graph a second time, but the saved intermediate results have already been freed" - Rogn - 博客园 (cnblogs)
