
Problem description

Continuing from the previous post, this is another problem I ran into in an attack/defense project: RuntimeError: Trying to backward through the graph a second time (or directly access saved tensors after they have already been freed). Saved intermediate values of the graph are freed when you call .backward() or autograd.grad(). Specify retain_graph=True if you need to backward through the graph a second time or if you need to access saved tensors after calling backward.

Cause analysis and fix

From the message we can see that by the time backward is called a second time, the saved tensors (or some of the intermediate results of the graph) have already been freed, which is what raises the runtime error. The fix is simple:
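The freeing behavior described above is easy to reproduce. Below is a minimal sketch (assuming PyTorch is installed) that triggers the exact error from the post by calling backward twice on the same graph:

```python
import torch

x = torch.tensor([2.0], requires_grad=True)
y = x ** 2

y.backward()  # first call succeeds; the graph's saved buffers are freed afterwards
try:
    y.backward()  # second call fails: the intermediate values are already gone
except RuntimeError as e:
    print("RuntimeError:", e)
```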

Change .backward() 
to .backward(retain_graph=True) 

With that change, run it again and the problem is gone.
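The fix can be sketched as follows. With retain_graph=True on the first call, the saved intermediate values survive, so a second backward pass works; note that gradients accumulate in .grad across the two calls:

```python
import torch

x = torch.tensor([2.0], requires_grad=True)
y = x ** 2

y.backward(retain_graph=True)  # keep the graph's saved tensors alive
y.backward()                   # second backward now succeeds

# Gradients accumulate: d(x^2)/dx = 2x = 4.0 per pass, so 8.0 in total
print(x.grad)  # tensor([8.])
```

If you only need one more pass, call zero_() on the gradient (or use an optimizer's zero_grad) before reusing it, since the two passes are summed.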

All done, time to celebrate

But then a new problem showed up, and it's a headache: RuntimeError: element 0 of tensors does not require grad and does not have a grad_fn
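For reference, one common way to hit this second error (a sketch, not necessarily what happened in the project) is calling backward on a tensor that was never part of the autograd graph, i.e. nothing in its history has requires_grad=True:

```python
import torch

x = torch.tensor([2.0])  # requires_grad defaults to False
y = x ** 2               # y therefore has no grad_fn
try:
    y.backward()         # autograd has nothing to differentiate through
except RuntimeError as e:
    print("RuntimeError:", e)
```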
