WP Rest API too slow?

Make your own entry point for fast data delivery

Yair Levy
Mar 12 · 6 min read

tl;dr: Learn how to create your own entry point to the WordPress backend in order to deliver data to your frontend app as fast as possible.

Disclosure: I love WordPress… really :) I find it the easiest, fastest way to bootstrap a website. And the WordPress ecosystem & community make it even easier to create almost any kind of functionality.

(here comes the ‘but’)

But… web development is rapidly changing. This is the era of SPAs and PWAs, and JavaScript frameworks such as Angular, React & Vue give us amazing tools to create awesome, reactive, responsive frontend applications.

Frontend apps need to consume data… data comes from the backend, and it seems that the WordPress infrastructure makes data delivery too slow for modern web apps.

How slow? Well… let's try the old-school WordPress way of getting data delivered to the frontend (jQuery.ajax):

//Frontend script
jQuery.ajax({
    url: 'https://fastshopping.online/wp-admin/admin-ajax.php',
    type: 'post',
    data: {
      action: 'getsomeproducts'
    },
    dataType: 'json',
    success: function(response) {
      console.log('success', response);
    },
    error: function(jqXHR, textStatus, errorThrown) {
      console.log('error', jqXHR, textStatus, errorThrown);
    }
});

//Backend (functions.php)
add_action( 'wp_ajax_getsomeproducts', 'getsomeproducts' );
// Also register for logged-out visitors, or the frontend request fails
add_action( 'wp_ajax_nopriv_getsomeproducts', 'getsomeproducts' );
function getsomeproducts(){
  $query = new WC_Product_Query( array(
    'limit' => 12,
  ) );
  $products = $query->get_products();
  wp_send_json( $products );
}

In the example above I am pulling 12 products from the database and sending them to my frontend app. I am using WC_Product_Query, which is a WooCommerce wrapper for WP_Query, on a fresh WordPress install with 5000 products in the database, half a dozen plugins, and a shared NGINX host. This is an extremely simple and common data request, no special effects involved… but it took about 1 second to retrieve the data.

Analyzing the request exposes a TTFB (time to first byte) of 952ms… That is practically forever :(
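You can measure TTFB yourself from the browser's Resource Timing API. A minimal sketch (the entry shape below is mocked, and the classification thresholds are my own illustrative choices, not an official standard):

```javascript
// Compute time-to-first-byte from a Resource Timing entry. In a browser,
// entries come from performance.getEntriesByType('resource'); here the
// entry shape is assumed for illustration.
function ttfb(entry) {
  return entry.responseStart - entry.requestStart;
}

// Illustrative thresholds for judging a response time.
function classifyTtfb(ms) {
  if (ms < 300) return 'fast';
  if (ms < 800) return 'acceptable';
  return 'slow'; // e.g. the 952ms measured above
}
```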

OK… old-school is not good enough. Let's try the 'new' kid on the block, the WP REST API (WooCommerce implemented its own WC REST API on top of it):

fetch('https://fastshopping.online/wp-json/wc/v3/products?consumer_key=ck_XXXXX&consumer_secret=cs_XXXXX&per_page=12', {
   method: 'get'
  }).then(function(response) {
    return response.json();
  }).then(function(products) {
    console.log(products);
  }).catch(function(err) {
    console.log(err);
  });

Nice, huh? No backend code needed, and I switched to fetch instead of jQuery.ajax. Note the consumer key and secret query parameters. You must create them to authenticate requests (personally I don't see any reason for authenticating GET requests, but that's for another article).

Again… a simple request, nothing complicated, but the response time hasn't changed much.

TTFB is now 853ms… Not good enough. Deal breaker?

While looking for possible ways to solve this problem, I ran into this article, which displayed similar results to my own and also shed some light on what happens 'behind the scenes' of the WordPress data retrieval process.

At this stage, I decided to do something a little nontraditional (maybe a bit radical) and create my own process for retrieving data. It may seem like a lot of overhead for accomplishing something extremely simple, but if you care about performance (and I care), the results are astonishing.


  1. Maintain a static file with all product data. (Hook on woocommerce_update_product to schedule a WP-Cron job that queries the database and stores the results.)
  2. Create your own entry point (a simple PHP script) that reads the file content, extracts the desired data, and sends it to the frontend.

The first thing to notice is that queries take time and consume your server resources. Since product data is relatively static, we can store it in a JSON file for faster retrieval. This file needs to be updated whenever a product updates, so we will hook on 'woocommerce_update_product'. We will also utilize WP-Cron because we don't want this process to interfere with or block other server activity.
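The pattern here — react to an update by queueing at most one deferred cache rebuild, off the request path — is not WordPress-specific. A minimal JavaScript analogue (illustrative; the schedule function stands in for WP-Cron and would typically be a setTimeout wrapper):

```javascript
// On each product update, queue at most one deferred cache rebuild
// instead of rebuilding inline on the request. The schedule function is
// injected so the rebuild runs off the hot path.
function makeCacheRebuilder(rebuild, schedule) {
  let pending = false;
  return function onProductUpdate() {
    if (pending) return; // a rebuild is already queued
    pending = true;
    schedule(function () {
      pending = false;
      rebuild();
    });
  };
}
```

However many updates arrive in a burst, the cache is rebuilt once, which is also why the WP-Cron approach keeps bulk product imports cheap.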

add_action( 'woocommerce_update_product', 'run_my_cron_job' );
function run_my_cron_job(){
  wp_schedule_single_event( time(), 'my_cron' );
}
add_action( 'my_cron', 'update_products_cache_file' );

So now we have a scheduled event where the function ‘update_products_cache_file’ will run whenever a product is updated.

Let’s take a look at the function:

The first thing we do is query the database for all products. Note that I am using a raw $wpdb query. This is because WP_Query and WC_Product_Query might break with large data-sets (remember, I have 5000 products in the database).

For smaller data-sets I would recommend using WC_Product_Query; it can save this entire code block.

function update_products_cache_file() {
  global $wpdb;
  // The separator string ('||') is illustrative; pick anything that
  // cannot appear inside your meta values.
  $query = "
    SELECT p.*, 
    GROUP_CONCAT(pm.meta_key ORDER BY pm.meta_key DESC SEPARATOR '||') 
    as meta_keys, 
    GROUP_CONCAT(pm.meta_value ORDER BY pm.meta_key DESC SEPARATOR '||') 
    as meta_values 
    FROM $wpdb->posts p 
    LEFT JOIN $wpdb->postmeta pm on pm.post_id = p.ID 
    WHERE p.post_type = 'product' and p.post_status = 'publish' 
    GROUP BY p.ID";
  $products = $wpdb->get_results($query);
  // Recombine the concatenated keys/values into a meta array per product
  function map_meta($a){
    $a->meta = array_combine(
      explode('||', $a->meta_keys),
      explode('||', $a->meta_values)
    );
    return $a;
  }
  $products = array_map('map_meta', $products);
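The recombination step above — turning two parallel, separator-joined strings back into one key/value lookup — is what PHP's array_combine() does. The same idea sketched in JavaScript (the '||' separator is illustrative):

```javascript
// Rebuild a meta lookup object from the two parallel strings the
// GROUP_CONCAT query returns per product.
function combineMeta(metaKeys, metaValues, sep) {
  const keys = metaKeys.split(sep);
  const values = metaValues.split(sep);
  const meta = {};
  keys.forEach(function (key, i) {
    meta[key] = values[i];
  });
  return meta;
}
```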

Now we have an array of products, with their meta-data, stored in $products. I will loop through the product array to extract the data I need and arrange it in the structure I want (again, you can change this part to any desired data structure). Note that we can save a lot of space (and, later on, bandwidth) because we are only saving the data we need. (If you don't have SKUs, or you don't manage stock or cross-sells, you don't have to include them.)

$_wfs_cache = [];
foreach($products as $product){
  $_wfs_cache[$product->ID] = array(
    'name' => $product->post_title,
    'desc' => $product->post_content,
    'date' => strtotime($product->post_modified),
    'sku' => $product->meta['_sku'],
    'price' => $product->meta['_price'],
    'sale_price' => $product->meta['_sale_price'],
    'stock' => $product->meta['_stock'],
    'rating' => $product->meta['_wc_average_rating'],
    'upsell' => $product->meta['_upsell_ids'],
    'crosssell' => $product->meta['_crosssell_ids'],
    'sales' => $product->meta['total_sales'],
  );
}
file_put_contents( plugin_dir_path( __FILE__ ) .
  'data/_wfs_cache.json', json_encode($_wfs_cache) );

Done! From now on, whenever a product updates, our _wfs_cache.json file will update too.

Now let's create an entry point to retrieve this data… This is a simple PHP file; it does not need WordPress to be loaded. It just reads our _wfs_cache.json file, processes the data, and returns it.

//Backend ep_1.php
  // Read file content and decode it to a PHP object
  $products = json_decode(file_get_contents( __DIR__ . 
    '/data/_wfs_cache.json' ));
  //Use query parameters
  $per_page = (int) $_GET["per_page"];
  $result = [];
  custom_sort($products); // sorting logic depends on your needs (see notes)
  //Loop through the data
  foreach ($products as $key=>$value){
    if (custom_filter($value)){ // filtering logic depends on your needs
      $result[$key] = $value;
    }
    if (count($result) >= $per_page){
      break;
    }
  }
  echo json_encode($result);

//Frontend script
fetch('https://fastshopping.online/shop/ep1.php?per_page=12', {
   method: 'get'
  })
  .then(function(response) {
    return response.json();
  })
  .then(function(myJson) {
    console.log(myJson);
  })
  .catch(function(err) {
    console.log(err);
  });

Notes on the above:

For demonstration purposes I did not include all the logic embedded inside the PHP script (basic authentication, sorting, and filtering the results). It really depends on your own needs and requirements.
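Still, the core filter-and-paginate pass is simple enough to sketch. In this JavaScript version, the predicate stands in for custom_filter, whose real logic is site-specific:

```javascript
// Walk the cached products, keep those matching the predicate, and stop
// once a page worth of results has been collected.
function pickProducts(products, perPage, predicate) {
  const result = [];
  for (const product of products) {
    if (predicate(product)) {
      result.push(product);
    }
    if (result.length >= perPage) break;
  }
  return result;
}
```

Because the loop breaks as soon as the page is full, response time stays flat even as the cache file grows.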

OK… We have done that. Is it worth the effort? Let's check the results.

TTFB is now only 208ms! That's about 80% less than our original jQuery.ajax request, and this is something you can actually feel when browsing through a website.


WordPress is awesome as a CMS, and it can also be used as a backend for your web apps. A notable downside of this setup is a ridiculously long response time for data retrieval using the traditional methods.

Luckily, we can create our own entry points to the database, bypass WordPress, and deliver our data much, much faster.