In today’s rapidly changing markets, the ability to dynamically adjust prices in real-time can provide a significant competitive edge. By leveraging online learning techniques, businesses can continuously refine their pricing strategies based on incoming data. In this post, we’ll walk through a practical implementation using Recursive Least Squares (RLS) to dynamically update prices. This example serves as a starting point, highlighting the complexities involved and the need for professional expertise to handle real-world scenarios effectively.

Introduction

Price optimization is crucial for maximizing revenue. Traditional methods often rely on static models that don’t adapt to changes in the market. Online learning, however, offers a dynamic approach, allowing for real-time updates to pricing strategies based on observed data. In this post, we use a simple RLS algorithm to adjust prices dynamically. While the example provides a foundation, real-world implementation requires addressing numerous factors such as promotions, seasonal effects, and holidays. Barnes Analytics is here to help you navigate these complexities and implement a robust pricing strategy tailored to your business needs.

The Algorithm

The RLS algorithm updates the model coefficients iteratively as new data points arrive. Here’s a step-by-step breakdown of the implementation.
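For readers who want the math, the code below implements the standard RLS recursions. With feature vector $x_t$ (including the intercept term), observed quantity $y_t$, coefficient vector $\theta_t$, inverse covariance matrix $P_t$, and forgetting factor $\lambda$, each new observation triggers:

$$
K_t = \frac{P_{t-1} x_t}{\lambda + x_t^\top P_{t-1} x_t}, \qquad
\theta_t = \theta_{t-1} + K_t \left(y_t - x_t^\top \theta_{t-1}\right), \qquad
P_t = \frac{1}{\lambda}\left(P_{t-1} - K_t x_t^\top P_{t-1}\right)
$$

The gain $K_t$ plays the same role as a Kalman gain, and $\lambda$ slightly discounts older observations so the model can track a drifting demand curve.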

Initialize Parameters

We start by initializing the parameters needed for the RLS algorithm. These include the number of features, the forgetting factor (λ), the initial value for the inverse covariance matrix (P), and the initial coefficients.

import numpy as np
import pandas as pd
import seaborn as sns
import matplotlib.pyplot as plt
sns.set()

# Initialize parameters
n_features = 1  # Number of features in the data (excluding the intercept)
lambda_ = 0.99  # Forgetting factor, should be slightly less than 1
delta = 1.0  # Scaling for the initial inverse covariance matrix (P = I / delta)
np.random.seed(42) # Set seed value for reproducibility

# Initialize the coefficient vector (including intercept)
coefficients = np.array([100.0, -5.0])  # Starting guess for [intercept, price slope]

# Initialize the inverse covariance matrix
P = np.eye(n_features + 1) / delta

# True coefficients of the simulated demand curve: quantity = 500 - 1.5 * price
true_coefficients = [500, -1.5]

# Burn-in period: 25 historical prices (drawn around 150) and the quantities they generated
x = np.random.normal(150, 40, 25)
y = true_coefficients[0] + true_coefficients[1] * x

# Container for the simulated stream of (price, quantity) pairs
data_stream = []

Define Prediction and Update Functions

Next, we define the functions for predicting the quantity demanded based on the current coefficients and for updating the coefficients using the RLS algorithm. These two functions define our model. Obviously, this code is NOT production grade, and it certainly does not follow scikit-learn’s API standards. However, for demonstrating the idea of a feedback loop between your sales and the prices that you set, this code is sufficient.

# Function to predict based on current coefficients
def predict(x, coefficients):
    x = np.insert(x, 0, 1)  # Add intercept term
    return np.dot(coefficients, x)

# Function to update coefficients using RLS
def update_coefficients(x, y, coefficients, P, lambda_):
    x = np.insert(x, 0, 1)  # Add intercept term
    x = x.reshape(-1, 1)
    
    # Compute the Kalman gain
    P_x = np.dot(P, x)
    gain = P_x / (lambda_ + np.dot(x.T, P_x))
    
    # Update the coefficients
    error = y - np.dot(coefficients, x.flatten())
    coefficients += gain.flatten() * error
    
    # Update the inverse covariance matrix
    P = (P - np.dot(gain, x.T.dot(P))) / lambda_
    
    return coefficients, P
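As a quick sanity check (illustrative only, not part of the original workflow), you can call predict directly. Evaluating the true demand curve at a price of 150 should return 500 − 1.5 × 150 = 275 units:

# Quick sanity check: evaluate the demand model at a price of 150
# using the true coefficients defined above. Expected output: 275.0
print(predict(np.array([150.0]), np.array(true_coefficients)))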

Simulate Initial Data Stream

To begin, we simulate a burn-in period with some initial data points. This helps the model stabilize before we start updating it with new observations. You can think of this as the existing pricing data you have on hand before implementing the model; it gets the parameters to a reasonable starting point so that the algorithm doesn’t behave unexpectedly at the beginning.

# Load the burn-in observations into the data stream
for price, qty in zip(x, y):
    data_stream.append((price, qty))

# Run the RLS update over the burn-in data to warm up the coefficients
for price, qty in data_stream:
    coefficients, P = update_coefficients(price, qty, coefficients, P, lambda_)
    print(f"Updated coefficients: {coefficients}")

This gives the following output (note that I am suppressing the middle 21 updates):

Updated coefficients: [100.03446405   0.85435799]
Updated coefficients: [105.24113811   0.99627374]
...
Updated coefficients: [358.24864689  -0.5751082 ]
Updated coefficients: [359.89412476  -0.57974991]

Handle New Data Points And Simulate

We define a function to handle new data points as they arrive. This function updates the model’s coefficients using the RLS algorithm. The handle_new_data_point function saves a history of (price, quantity) pairs and then updates the coefficients. The get_new function uses the closed form for the revenue-optimal price under a linear demand curve to set the next price; it also contains logic that keeps prices between an upper and lower bound, which should be dictated by business rules. Finally, I add an exploration step to the algorithm, which lets it escape a locally optimal price if the randomness we assumed in the true demand lands it there.
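For reference, that closed form falls straight out of the linear demand model. If demand is $q(p) = \beta_0 + \beta_1 p$ with $\beta_1 < 0$, then revenue is $R(p) = p\,q(p) = \beta_0 p + \beta_1 p^2$, and setting $R'(p) = \beta_0 + 2\beta_1 p = 0$ gives

$$p^* = -\frac{\beta_0}{2\beta_1}.$$

With the true coefficients above (500 and −1.5), that works out to $p^* = 500/3 \approx 166.67$, which is the red dashed line in the chart further down.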

# Example function to handle new data point in the stream
def handle_new_data_point(x, y, coefficients, P, lambda_):
    data_stream.append((x,y))
    coefficients, P = update_coefficients(x, y, coefficients, P, lambda_)
    return coefficients, P

def get_new(coefficients):
    if np.random.random() > 0.1:
        # Exploit: revenue-optimizing price when demand = coef[0] + coef[1]*price,
        # since revenue = price*demand, so revenue' = coef[0] + 2*coef[1]*price = 0 at the optimum
        x = np.clip(-coefficients[0] / (2 * coefficients[1]), 0.01, 1000)
    else:
        # Explore: perturb the current best-guess price to keep testing the surrounding region
        x = np.clip(np.random.normal(np.clip(-coefficients[0] / (2 * coefficients[1]), 0.01, 1000), 10), 0.01, None)
    # Simulate the quantity sold at that price using the true, noisy demand curve
    y = np.clip(true_coefficients[0] + true_coefficients[1] * x + np.random.normal(0, 50), 0, None)
    return x, y


# Simulate receiving 1000 new data points, one at a time
for i in range(1000):
    new_price, new_qty = get_new(coefficients)
    coefficients, P = handle_new_data_point(new_price, new_qty, coefficients, P, lambda_)
    # print(f"{i} Updated coefficients after new data point: {coefficients}")
print(coefficients)  # Final coefficient estimates

After running the simulation for 1000 steps, the generated data is saved in data_stream, so we can plot our estimate of the best price against the time step.

df = pd.DataFrame(data_stream, columns=['price', 'quantity'])
p_star = -true_coefficients[0] / (2 * true_coefficients[1])  # True revenue-optimal price
df['price'].iloc[25:].plot(label='Price History')  # Skip the 25 burn-in observations
plt.axhline(p_star, color='red', linestyle='--', label='Optimal Price')
plt.legend()
plt.title('Price History')
plt.xlabel('Time Step')
plt.ylabel('Suggested Price')
plt.show()

This generates the following chart:

Price History of the algorithm

The odd jumps up and down are due to the exploration rate that we set for the algorithm, which keeps it testing nearby prices to confirm that its current price is in fact optimal.
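If those jumps are too disruptive in practice, one simple variation (a sketch only, not part of the code above) is to let the exploration probability decay over time, so the algorithm explores aggressively at first and settles down once its estimates are stable:

# Hypothetical variant of get_new with a decaying exploration probability.
# eps_0 and decay are illustrative tuning knobs, not values from the post.
def get_new_decaying(coefficients, t, eps_0=0.3, decay=0.005):
    eps_t = eps_0 * np.exp(-decay * t)  # exploration probability shrinks as time step t grows
    p_hat = np.clip(-coefficients[0] / (2 * coefficients[1]), 0.01, 1000)  # current best-guess price
    if np.random.random() > eps_t:
        x = p_hat  # exploit the estimated revenue-optimal price
    else:
        x = np.clip(np.random.normal(p_hat, 10), 0.01, None)  # explore around it
    y = np.clip(true_coefficients[0] + true_coefficients[1] * x + np.random.normal(0, 50), 0, None)
    return x, y

Passing the loop index i in as t reproduces the same feedback loop, with exploration that fades out over time.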

Key Considerations

While this example demonstrates the basics of real-time price optimization using RLS, it’s important to note that this code is not production-grade. Real-world pricing strategies must account for:

  • Promotions and Discounts: Temporary price changes can significantly impact demand.
  • Seasonal Effects: Demand can vary with seasons, holidays, and other cyclical events (one way to start folding such a signal into the model is sketched after this list).
  • Market Conditions: Competitor pricing, market trends, and economic factors should be considered.
  • Customer Segmentation: Different customer segments may respond differently to price changes.
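As a minimal sketch of the direction this takes (the holiday flag, its starting coefficient, and the example observation below are all assumptions for illustration), seasonal or holiday effects can be folded into the same RLS machinery simply by widening the feature vector:

# Minimal sketch: add a binary holiday indicator as a second feature.
# The update_coefficients function above already handles an arbitrary-length feature vector.
n_features = 2                               # price + holiday flag
coefficients = np.array([100.0, -5.0, 0.0])  # Initial guesses for [intercept, price effect, holiday effect]
P = np.eye(n_features + 1) / delta

# One made-up observation: a price of 149 on a holiday that sold 310 units
x_t = np.array([149.0, 1.0])
y_t = 310.0
coefficients, P = update_coefficients(x_t, y_t, coefficients, P, lambda_)

The pricing rule in get_new would need a matching change so that the extra features are held at their current values when solving for the price, but the online update itself carries over unchanged.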

Conclusion

Implementing a real-time price optimization system can significantly boost your revenue by ensuring prices are always aligned with current market demand. However, the complexities of real-world scenarios require a tailored approach. At Barnes Analytics, we specialize in developing sophisticated pricing strategies that account for all the variables impacting your business. Contact us today to learn how we can help you implement a robust, dynamic pricing model.

Tell Us How We Can Help You!