
As part of Behind the Scenes @RevX, we sat down with Mayur Dangar, Data Scientist at RevX, to learn about all things data!

Real-time bidding enables advertisers to bid for the right inventory shown to the right audience at the right time. It’s what makes programmatic an automated, data-driven, and highly effective channel. But what goes into optimizing this for the best results?

Our Data Scientist, Mayur Dangar, shares how the CPM bargainer model leverages machine learning to reduce costs for advertisers and drive higher revenue.

Data science enthusiasts, be sure to stick around! This interview is packed with exclusive glimpses of everything from data preparation and pipeline to modeling and monitoring.

We are familiar with ads that are bought and sold programmatically through RTB (real-time bidding) auctions. This means that every time ad inventory becomes available from SSPs (Supply-Side Platforms), DSPs (Demand-Side Platforms) bid for those placements on behalf of advertisers. Whoever wins the bid gets to serve their ad in that spot. Programmatic platforms have to be able to respond within 50 ms of receiving the request.

This is where programmatic tech comes into play: it makes this rapid bidding possible. For it all to work, we need machine-to-machine automation of advertising and media transactions. No human intervention is needed, and the content delivered is far more personalized.

Here we explain how we optimize this bidding process to work at a higher level of efficacy for our advertisers.

CPM Bargainer: Project Goal

Historically, all exchanges ran on second-price auctions, but SSPs moved towards first-price auctions to earn higher revenue. In a second-price auction the winner pays just above the second-highest bid, whereas in a first-price auction the winner pays exactly what they bid, so bidding strategies tuned for second-price auctions overpay. Though the shift was profitable for the exchanges, it wasn't so for the DSP side, which was bleeding money under first-price auctions.

The ultimate goal of the CPM bargainer project was to optimize the bid value for advertising inventory, so that DSPs keep control over their bids and don't end up overbidding on inventory.

DSPs use various metrics to measure the value of advertising inventory. One of the most effective KPIs is eCPM (effective cost per mille), a commonly used measurement that allows you to calculate the effective cost of an ad for every thousand impressions.
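To make the metric concrete, here is the standard eCPM calculation as a tiny, illustrative Python snippet (the numbers are made up):

```python
def ecpm(total_cost: float, impressions: int) -> float:
    """Effective cost per mille: cost per 1,000 impressions."""
    return total_cost / impressions * 1000

# e.g. $150 spent on 60,000 impressions -> eCPM of $2.50 (illustrative numbers)
print(ecpm(150.0, 60_000))  # 2.5
```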

How Can Data Science Help?

Data Science is an essential part of programmatic advertising. DSPs rely on data science to optimize their UA (User Acquisition) and RT (Retargeting) campaigns.

With the support of data science, we gathered and analyzed the historical bid data, along with the outcomes of those bids, to train complex machine learning/deep learning models to:

🔹 Build predictive models that estimate the max eCPM for every incoming ad request

🔹 Control the bid price and save money by not bidding higher than the predicted max eCPM (see the sketch after this list)

🔹 And do it without compromising the win rate.
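Taken together, the bid-side logic amounts to capping whatever we would otherwise bid at the predicted max eCPM, while still respecting the floor that comes with the request. Below is a minimal, illustrative sketch of that capping decision; the function name and values are hypothetical, not RevX's production code:

```python
from typing import Optional

def decide_bid(base_bid: float, predicted_max_ecpm: float, bid_floor: float) -> Optional[float]:
    """Cap the bid at the predicted max eCPM; return None for requests we should skip."""
    capped_bid = min(base_bid, predicted_max_ecpm)
    if capped_bid < bid_floor:
        # A bid below the publisher/SSP floor would be filtered anyway
        # (see the monitoring section below), so there is no point submitting it.
        return None
    return capped_bid

# Illustrative values: base bid $1.20, predicted max eCPM $0.90, floor $0.50
print(decide_bid(1.20, 0.90, 0.50))  # 0.9
```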

For this project, we used GCP BigQuery for the datasets, TensorFlow with Keras to build the deep learning model, and Cloud Storage to store the outcomes.

Let’s get into the details.

Introduction to Data

For the historical data, we gathered the last month's bids that we successfully won and served as impressions, using BigQuery SQL (all our datasets are stored in GCP BigQuery). The resulting dataset ran into millions of rows, and given its huge size, it was stored as a table in GCP BigQuery.
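As a rough illustration of that step (the table and column names below are assumptions, not the actual RevX schema), pulling last month's won-and-served bids into a destination table with the BigQuery Python client could look like this:

```python
from google.cloud import bigquery

client = bigquery.Client(project="my-project")  # project ID is illustrative

# Write the query result straight to a destination table, since the
# result is far too large to pull locally.
job_config = bigquery.QueryJobConfig(
    destination="my-project.ads.cpm_bargainer_training",
    write_disposition="WRITE_TRUNCATE",
)

sql = """
SELECT publisher_site, aggregator, creative_size, country, cpm
FROM `my-project.ads.bid_logs`
WHERE bid_won AND impression_served
  AND bid_date >= DATE_SUB(CURRENT_DATE(), INTERVAL 30 DAY)
"""

client.query(sql, job_config=job_config).result()  # wait for the job to finish
```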

For training the ML/DL model, we considered supplier-level features such as the site URL and the aggregator. The aim was a CPM bargainer model that predicts the max CPM for different publisher sites, requesting different creative sizes and coming from different countries.

Data Preparation and Data Pipeline

For the data preparation part of the project, we performed the following steps:

🔹 Data Cleaning: Fortunately, there were only a few missing values, which we filled with predefined default values. We also removed outliers with very high CPMs according to the IQR (interquartile range) rule.

🔹 Data Preprocessing: To handle high cardinality (a high number of unique categories), we converted feature values to hash values, as we did for the publisher site feature. We also hashed all the creative size values (sketched below).
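Here is one way to express both steps in pandas, as a small illustrative sketch; the column names, bucket count, and hashing choice are assumptions, not the exact production implementation:

```python
import hashlib
import pandas as pd

def remove_cpm_outliers(df: pd.DataFrame) -> pd.DataFrame:
    """Drop rows whose CPM lies above the upper IQR fence (Q3 + 1.5 * IQR)."""
    q1, q3 = df["cpm"].quantile([0.25, 0.75])
    upper_fence = q3 + 1.5 * (q3 - q1)
    return df[df["cpm"] <= upper_fence]

def hash_feature(values: pd.Series, num_buckets: int = 10_000) -> pd.Series:
    """Map high-cardinality categorical values to stable integer hash buckets."""
    return values.astype(str).map(
        lambda v: int(hashlib.md5(v.encode()).hexdigest(), 16) % num_buckets
    )

# Illustrative usage on a tiny made-up frame (the real data lives in BigQuery)
df = pd.DataFrame({
    "publisher_site": ["site-a.com", "site-b.com", "site-c.com",
                       "site-a.com", "site-b.com", "site-c.com"],
    "cpm": [0.8, 0.9, 1.0, 1.1, 1.2, 50.0],  # 50.0 falls above the IQR fence
})
df = remove_cpm_outliers(df)
df["publisher_site_hash"] = hash_feature(df["publisher_site"])
```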

To feed this huge amount of preprocessed data to the ML/DL model quickly, we integrated the powerful GCP BigQuery I/O streaming API. With this API, we were able to create the train and test data loaders directly from the BigQuery dataset.
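For reference, the pattern looks roughly like the following condensed sketch using the tensorflow-io BigQuery connector; the project, dataset, table, and field names here are illustrative assumptions:

```python
import tensorflow as tf
from tensorflow_io.bigquery import BigQueryClient

PROJECT_ID, DATASET_ID, TABLE_ID = "my-project", "ads", "cpm_bargainer_training"

client = BigQueryClient()
session = client.read_session(
    "projects/" + PROJECT_ID,                              # parent
    PROJECT_ID,
    TABLE_ID,
    DATASET_ID,
    ["publisher_site_hash", "creative_size_hash", "cpm"],  # selected fields
    [tf.int64, tf.int64, tf.float64],                      # their output types
    requested_streams=2,
)

def to_features_and_label(row):
    # Rows arrive as an ordered dict of column tensors; the CPM is the target.
    label = row.pop("cpm")
    return row, label

train_ds = session.parallel_read_rows().map(to_features_and_label).batch(1024)
```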

Modeling

So, after setting up the data pipeline for the data streaming, we arrived at the most anticipated part of this project: implementing the deep learning model.

We trained separate neural network regressor models for different country and creative size combinations using TensorFlow and Keras.

Below is the model configuration (a minimal Keras sketch follows the list):

🔹 Two input branches: an embedding layer for categorical features and a normal input layer for numeric features, followed by a concatenate layer combining the two.

🔹 The concatenate layer was then followed by a fully connected network of 3–5 hidden layers, each containing 32–64 neurons.

🔹 ReLU activation for all the hidden layers, and Softplus activation for the output layer.

🔹 For compiling, we used the Adam optimizer with mean absolute error as the loss.
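Here is a minimal Keras sketch of that configuration; the hash-bucket count, embedding dimension, number of numeric features, and exact layer widths are assumptions (the article only specifies 3–5 hidden layers of 32–64 neurons, ReLU, a Softplus output, Adam, and MAE):

```python
from tensorflow.keras import layers, Model

# Illustrative sizes: bucket count, embedding dim, and feature counts are assumptions.
HASH_BUCKETS = 10_000
EMBED_DIM = 16
NUM_CATEGORICAL = 2   # e.g. hashed publisher site and creative size
NUM_NUMERIC = 4       # number of numeric features

# Branch 1: hashed categorical features (integer hash-bucket indices)
cat_in = layers.Input(shape=(NUM_CATEGORICAL,), dtype="int32", name="categorical")
emb = layers.Embedding(input_dim=HASH_BUCKETS, output_dim=EMBED_DIM)(cat_in)
emb = layers.Flatten()(emb)

# Branch 2: numeric features
num_in = layers.Input(shape=(NUM_NUMERIC,), name="numeric")

# Concatenate both branches, then a few fully connected hidden layers
x = layers.Concatenate()([emb, num_in])
for units in (64, 64, 32):
    x = layers.Dense(units, activation="relu")(x)

# Softplus keeps the predicted max eCPM non-negative
out = layers.Dense(1, activation="softplus", name="max_ecpm")(x)

model = Model(inputs=[cat_in, num_in], outputs=out)
model.compile(optimizer="adam", loss="mean_absolute_error")
```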

After creating the model, we fed it the train and test data loaders already generated by the BigQuery streaming API and completed the model's training process.

This whole process was repeated for all the different combinations.

Automation and Monitoring

We followed an offline training approach in this project, but we retrained the same set of models every day.

To automate the whole process, we created a cron job with crontab. It runs a shell script containing the run commands for the Python scripts daily on a dedicated GCP VM and generates the max eCPM caps for all the keys.
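Since the outcomes land in Cloud Storage (as noted earlier), the daily job's final step might look something like this sketch; the bucket, path, and key format are purely illustrative:

```python
import json
from google.cloud import storage

def upload_caps(caps: dict, bucket_name: str, blob_path: str) -> None:
    """Write the per-key max eCPM caps as a JSON blob to Cloud Storage."""
    client = storage.Client()
    blob = client.bucket(bucket_name).blob(blob_path)
    blob.upload_from_string(json.dumps(caps), content_type="application/json")

# Illustrative call: the key format and paths are assumptions, not from the article.
upload_caps(
    {"IN|320x480|site-a.com": 0.42},
    bucket_name="cpm-bargainer-output",
    blob_path="max_ecpm_caps/2024-01-01.json",
)
```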

This process runs every day and updates the max eCPM caps, which can affect the whole bidding process for better or worse; because of this, daily monitoring is crucial for this project.

There are three factors we need to monitor that are highly dependent on the eCPM max cap (see the sketch after this list for how they're computed):

1. Impression win rate: total impressions/total bids
2. eCPA: effective cost per action
3. eCPC: effective cost per click
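For reference, these metrics are simple ratios over the day's aggregates; a quick illustrative sketch:

```python
def win_rate(total_impressions: int, total_bids: int) -> float:
    """Impression win rate: impressions won per bid submitted."""
    return total_impressions / total_bids

def ecpa(total_cost: float, total_actions: int) -> float:
    """Effective cost per action (install, purchase, etc.)."""
    return total_cost / total_actions

def ecpc(total_cost: float, total_clicks: int) -> float:
    """Effective cost per click."""
    return total_cost / total_clicks

# Illustrative daily aggregates
print(win_rate(120_000, 1_000_000))  # 0.12
print(ecpa(300.0, 150))              # 2.0
print(ecpc(300.0, 6_000))            # 0.05
```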

There are two other critical situations we pay close attention to:

🔹 Very high eCPM max cap: In this scenario, we might win more impressions, but eCPC and eCPA could rise as well, which could hurt overall revenue.

🔹 eCPM max cap below the min_cpm: In this scenario, we will most likely lose many impressions because of the min_CPM floor that comes with the request from the publisher/SSP. We are required to bid at least the minimum CPM in the local currency, or our bid will be filtered. For example, if the minimum is $0.05 and our max eCPM cap is $0.04, we will lose the bid for sure.

Conclusion

In this article, we went through all the steps that helped us complete our CPM bargainer project. In terms of improvement, the project allowed us to save 20–25% of total media spend, compared with only 0–3% under the static max CPM cap. This supported our efforts to reduce costs for advertisers, resulting in higher revenue.

 


Shivani Salhotra
