Why is my TensorFlow code for polynomial regression not working?

I am trying to fit a polynomial regression using TensorFlow 2. Here is the code:

import tensorflow as tf
import numpy as np
import matplotlib.pyplot as plt

trX = np.linspace(-1, 1, 101)

iterations = 0
num_coeffs = 6
trY_coeffs = [1, 2, 3, 4, 5, 6]
trY = 0
for i in range(num_coeffs):
    trY += trY_coeffs[i] * np.power(trX, i)
trY += np.random.randn(*trX.shape) * 1.5

plt.scatter(trX, trY)  # Uncomment to visualize data

learning_rate = 0.0085
# "momentum" accelerates gradient descent in the relevant direction
# and dampens oscillations. Used for the Keras gradient descent implementation.
momentum = 0.5
training_epochs = 40

X = tf.constant(trX, dtype=tf.float32)
Y = tf.constant(trY, dtype=tf.float32)

w = tf.Variable([0.] * num_coeffs, name='parameters')

model = lambda _X, _w: tf.add_n([tf.multiply(_w[i], tf.pow(_X, i)) for i in range(num_coeffs)])
y_model = lambda: model(X, w)

# Consider using mean squared error for better averaging
cost = lambda: tf.reduce_mean(tf.pow(Y - y_model(), 2))

train_op = tf.keras.optimizers.SGD(learning_rate, momentum=momentum)

for _ in range(training_epochs):
    with tf.GradientTape() as tape:
        loss_value = cost()  # Calculate the loss
    gradients = tape.gradient(loss_value, w)  # Compute gradients
    train_op.apply_gradients(zip(gradients, [w]))  # Apply gradients to update weights

w_val = w.numpy()  # Get the numpy array of weights
print("Weights after training: \n")
print(w_val)

trY2 = 0
for i in range(num_coeffs):
    trY2 += w_val[i] * np.power(trX, i)

plt.plot(trX, trY2, 'r')  # Plot the predicted data
plt.show()

The final output is [1.73181 1.73181 1.73181 1.73181 1.73181 1.73181], which is far from what is expected. Any help is appreciated! Thanks.

Hi @Han_Fei, I have made a few changes to your code: instead of initializing the weights to zero, I used random initial values, and I tried different hyperparameters to get a better regression curve. Please refer to this gist for a working code example. Thank you.
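Since the gist itself isn't reproduced in this thread, here is a rough sketch of the kind of changes described, i.e. random initial weights instead of zeros and adjusted hyperparameters. The specific initialization, learning rate, momentum, and epoch count below are illustrative assumptions, not necessarily the gist's values; the gradient is also computed with respect to the list [w] so that the full gradient vector is paired with the variable when the update is applied.

```python
import numpy as np
import tensorflow as tf

num_coeffs = 6
trX = np.linspace(-1, 1, 101)
trY_coeffs = [1, 2, 3, 4, 5, 6]
trY = sum(c * np.power(trX, i) for i, c in enumerate(trY_coeffs))
trY += np.random.randn(*trX.shape) * 1.5

X = tf.constant(trX, dtype=tf.float32)
Y = tf.constant(trY, dtype=tf.float32)

# Random initial weights instead of zeros (the stddev is an assumption).
w = tf.Variable(tf.random.normal([num_coeffs], stddev=0.5), name='parameters')

def model(_X, _w):
    # Same polynomial model as in the original post.
    return tf.add_n([_w[i] * tf.pow(_X, i) for i in range(num_coeffs)])

def cost():
    return tf.reduce_mean(tf.square(Y - model(X, w)))

# Adjusted hyperparameters (illustrative values only).
optimizer = tf.keras.optimizers.SGD(learning_rate=0.05, momentum=0.9)

for epoch in range(5000):
    with tf.GradientTape() as tape:
        loss_value = cost()
    # Differentiate with respect to the list [w] so the full gradient
    # vector is paired with the variable when applying the update.
    grads = tape.gradient(loss_value, [w])
    optimizer.apply_gradients(zip(grads, [w]))

print("Weights after training:", w.numpy())
```

With enough epochs, this setup should drive the fitted weights close to the generating coefficients [1, 2, 3, 4, 5, 6], up to the noise added to trY.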

Thanks @Kiran_Sai_Ramineni for your help. Your insights were invaluable.