
python - How to use complex variables in TensorFlow eager mode?


In non-eager (graph) mode, I can run this without any problem:

import tensorflow as tf

s = tf.complex(tf.Variable(1.0), tf.Variable(1.0))
train_op = tf.train.AdamOptimizer(0.01).minimize(tf.abs(s))

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    for i in range(5):
        _, s_ = sess.run([train_op, s])
        print(s_)

(1+1j)
(0.99+0.99j)
(0.98+0.98j)
(0.9700001+0.9700001j)
(0.9600001+0.9600001j)

But I can't seem to find an equivalent formulation for eager mode. I have tried the following, but TF complains:

tfe = tf.contrib.eager
s = tf.complex(tfe.Variable(1.0), tfe.Variable(1.0))

def obj(s):
    return tf.abs(s)

with tf.GradientTape() as tape:
    loss = obj(s)
grads = tape.gradient(loss, [s])
optimizer.apply_gradients(zip(grads, [s]))  # optimizer defined elsewhere

The dtype of the source tensor must be floating (e.g. tf.float32) when calling GradientTape.gradient, got tf.complex64

No gradients provided for any variable: ['tf.Tensor((1+1j), shape=(), dtype=complex64)']

How can I train a complex variable in eager mode?

Best Answer

In TensorFlow 2 with eager execution, keep the real and imaginary parts as real variables and build the complex value inside the tape. The errors above arise because GradientTape.gradient only accepts floating-point sources, and s is a derived tf.Tensor rather than a variable the optimizer can update. Differentiating with respect to r and i avoids both problems:

r, i = tf.Variable(1.0), tf.Variable(1.0)

def obj(s):
    return tf.abs(s)

with tf.GradientTape() as tape:
    s = tf.complex(r, i)  # build the complex value inside the tape
    loss = obj(s)
grads = tape.gradient(loss, [r, i])
optimizer.apply_gradients(zip(grads, [r, i]))
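
For completeness, here is a minimal end-to-end sketch of the same idea as a training loop. It assumes tf.keras.optimizers.Adam with the 0.01 learning rate from the question; the complex value must be rebuilt inside the tape on every step so the gradients reach r and i:

import tensorflow as tf

# Keep the trainable state real; derive the complex value on demand.
r, i = tf.Variable(1.0), tf.Variable(1.0)
optimizer = tf.keras.optimizers.Adam(0.01)

for step in range(5):
    with tf.GradientTape() as tape:
        s = tf.complex(r, i)  # rebuilt inside the tape each step
        loss = tf.abs(s)      # real-valued objective
    grads = tape.gradient(loss, [r, i])  # gradients w.r.t. the real parts
    optimizer.apply_gradients(zip(grads, [r, i]))
    print(tf.complex(r, i).numpy())

This prints the value after each update, decreasing from (1+1j) roughly as in the graph-mode run above.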

Regarding "python - How to use complex variables in TensorFlow eager mode?", we found a similar question on Stack Overflow: https://stackoverflow.com/questions/58052841/
