
python - Error when trying to replicate GPflow's Stochastic Variational Inference for Scalability SVGP notebook - Stack Overflow


I'm learning about Gaussian processes and I'm using GPflow to do some exercises and tests.

I was studying the notebook on Stochastic Variational Inference for scalability with SVGP (https://gpflow.github.io/GPflow/2.4.0/notebooks/advanced/gps_for_big_data.html). I have managed to replicate most of it, but I have run into an error associated with the use of the Adam optimizer. For clarity, here is the block of code that causes it:

minibatch_size = 100

# We turn off training for inducing point locations
gpflow.set_trainable(m.inducing_variable, False)


def run_adam(model, iterations):
    """
    Utility function running the Adam optimizer

    :param model: GPflow model
    :param iterations: number of iterations
    """
    # Create an Adam Optimizer action
    logf = []
    train_iter = iter(train_dataset.batch(minibatch_size))
    training_loss = model.training_loss_closure(train_iter, compile=True)
    optimizer = tf.optimizers.Adam()

    @tf.function
    def optimization_step():
        optimizer.minimize(training_loss, model.trainable_variables)

    for step in range(iterations):
        optimization_step()
        if step % 10 == 0:
            elbo = -training_loss().numpy()
            logf.append(elbo)
    return logf

And the error is as follows:

AttributeError: in user code:

File "C:\Users\D\AppData\Local\Temp\ipykernel_16304\2914892647.py", line 16, in optimization_step  *
    optimizer.minimize(training_loss, model.trainable_variables)

AttributeError: 'Adam' object has no attribute 'minimize'

I don't really know how to solve this problem, because I'm just starting to learn how to use TensorFlow.

P.S. I'm using GPflow 2.4.0 and TensorFlow 2.8.1

Thank you very much for your time

asked Mar 18 at 8:22 by Samuel M

1 Answer


It looks like I finally found the solution.

The problem was related to the TensorFlow version. Using the versions recommended on the installation page, there is no problem:

https://gpflow.github.io/GPflow/develop/installation.html

I retried the example with GPflow 2.9.1 and TensorFlow 2.12.0 and everything worked fine.
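
For anyone hitting the same thing, a quick sanity check is to confirm which versions are actually active in the notebook's environment before re-running. This is a minimal sketch; the version pair in the comments is just the one that worked for me:

import gpflow
import tensorflow as tf

# Print the versions the notebook is actually using.
# To upgrade: pip install --upgrade gpflow==2.9.1 tensorflow==2.12.0
print("GPflow:", gpflow.__version__)      # worked for me: 2.9.1
print("TensorFlow:", tf.__version__)      # worked for me: 2.12.0

And if upgrading is not an option, one possible workaround (untested on TensorFlow 2.8.1, so treat it as a sketch) is to avoid optimizer.minimize() entirely and apply the gradients manually with tf.GradientTape and optimizer.apply_gradients, which are available across recent TensorFlow versions:

@tf.function
def optimization_step():
    # Evaluate the loss closure under a tape so the gradients can be
    # taken explicitly, then apply them with the optimizer.
    with tf.GradientTape() as tape:
        loss = training_loss()
    grads = tape.gradient(loss, model.trainable_variables)
    optimizer.apply_gradients(zip(grads, model.trainable_variables))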

Best regards
