There are a multitude of profilers available for WordPress, but they all fall short in some way of providing the real results you need.
Query Monitor, for example, is fantastic for profiling database queries, which is where performance problems usually come from, but it lets you down a little when it comes to finding out which plugins or code paths are consuming the most PHP time.
P3 Profiler is next to useless, and I've found nothing else, plugin-wise, that really attempts to profile PHP execution paths.
Enter Xdebug – it can profile *anything* that uses PHP and now works with PHP 7.
Installing the Xdebug profiler
Run this command:
php -i > ~/phpinfo.txt
Then grab the contents of that phpinfo.txt file and paste them into the Xdebug installation wizard on xdebug.org.
The wizard will then tell you how to install Xdebug, with instructions specific to your server.
For my servers, installation looks like this:
tar -xvzf xdebug-2.4.0.tgz
sudo apt-get install php7.0-dev
cd xdebug-2.4.0 && phpize && ./configure && make
sudo cp modules/xdebug.so /usr/lib/php/20151012
Now edit: /etc/php/7.0/fpm/php.ini (and optionally /etc/php/7.0/cli/php.ini), find the modules area and insert the following lines:
zend_extension = /usr/lib/php/20151012/xdebug.so
xdebug.profiler_enable_trigger = 1
xdebug.profiler_output_dir = /var/log/xprofiler/
xdebug.profiler_output_name = cachegrind.out.%t.%p
xdebug.profiler_enable_trigger_value = secretpassword
Note: I set profiler_output_name because the default is cachegrind.out.%p – the %p is the process ID, so when the same PHP-FPM process gets reused to serve another request, the previous log file is overwritten. Adding the timestamp (%t) avoids this. If you do not want a lot of log files, skip this setting and you will only ever have as many log files as you have PHP processes.
Make sure to choose your own value for the secret trigger. The profiler saves a LOT of information to disk (one file per profiled request if you set profiler_output_name as above), so if a nasty robot finds your site and appends this parameter it could bring your site down very quickly. Save the file, then create the output folder and change its owner so PHP can write to it:
mkdir -p /var/log/xprofiler
chown www-data:www-data /var/log/xprofiler
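If you want a trigger value that is harder to guess than a dictionary word, you can generate a random one. A minimal sketch, assuming openssl is installed (it is on most Ubuntu servers):

```shell
# Generate a 32-character hexadecimal trigger value
secret=$(openssl rand -hex 16)
# Print the php.ini line ready to paste in place of "secretpassword"
echo "xdebug.profiler_enable_trigger_value = ${secret}"
```

Paste the printed line into php.ini in place of the secretpassword line above.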
Restart the php7.0-fpm service with this command:
service php7.0-fpm restart
Get the Xdebug profiler to run all the time
If you don’t want to use the URL trigger and instead want to profile everything that happens on your server, you can modify the following setting in php.ini:
xdebug.profiler_enable = 1
And set your profiler_enable_trigger value back to 0.
xdebug.profiler_enable_trigger = 0
Obviously make sure you have enough space because this will dump a lot of data to disk. For more in-depth information check this page out: https://xdebug.org/docs/profiler
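Because always-on profiling can fill a disk quickly, it is worth pruning old output. A minimal sketch using find; the /var/log/xprofiler path matches the php.ini above, but the demo below runs against a scratch directory (and GNU touch -d) so it is safe to try anywhere:

```shell
# Scratch directory standing in for /var/log/xprofiler
dir=$(mktemp -d)
touch "$dir/cachegrind.out.1472601600.12345"                  # a fresh profile file
touch -d '3 days ago' "$dir/cachegrind.out.1472300000.12345"  # an old one
# Delete profiler output older than two days
find "$dir" -name 'cachegrind.out.*' -mtime +2 -delete
ls "$dir"
```

On a real server you would point find at /var/log/xprofiler and run it from cron.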
Using the Xdebug profiler to figure out why a WordPress page is slow
Visit one of your problem website pages with the trigger parameter appended to the URL; the value must match the profiler_enable_trigger_value you set in php.ini:

?XDEBUG_PROFILE=secretpassword
You will get a whole bunch of profiling data saved to /var/log/xprofiler. You can analyse this data with the visual tool KCacheGrind on Linux (on Windows I use QCacheGrind): download the cachegrind.out.* file(s) to your computer, open them, and figure out where your poor performance is coming from.
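If you only have SSH access, you can also get a rough ranking of where time is spent without downloading anything, by summing the self-cost lines per function in a cachegrind file with awk. The sample file below mimics the cachegrind format Xdebug writes; the function names and numbers are made up for illustration:

```shell
# A tiny sample in the cachegrind format Xdebug writes (names are illustrative)
cat > /tmp/cachegrind.sample <<'EOF'
events: Time

fl=php:internal
fn=php::curl_exec
5 98000000

fl=/var/www/wp-content/plugins/datafeedr.php
fn=Datafeedr->request
3 120000
cfl=php:internal
cfn=php::curl_exec
calls=1 0 0
7 98000000

fl=/var/www/index.php
fn={main}
2 5000
cfl=/var/www/wp-content/plugins/datafeedr.php
cfn=Datafeedr->request
calls=1 0 0
9 98120000
EOF

# Sum self cost per function; the cost line after each calls= entry is the
# callee's inclusive cost, so skip it to avoid double counting
awk '/^fn=/{fn=substr($0,4)}
     /^calls=/{skip=1;next}
     /^[0-9]/{if(skip){skip=0}else{cost[fn]+=$2}}
     END{for(f in cost) printf "%d %s\n", cost[f], f}' /tmp/cachegrind.sample | sort -rn
```

On this sample, php::curl_exec comes out on top by a wide margin, which is the same picture KCacheGrind shows graphically. On a real file, point awk at /var/log/xprofiler/cachegrind.out.* instead.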
For example, I was getting an elusive 90-second page load on the widgets page of one of my sites. Using the Xdebug profiler and QCacheGrind I got this image:
Working from the top downwards, you can see 2 calls to curl_exec – made by the DatafeedrAPI plugin in the datafeedr.php file. These calls are taking 98 seconds. That means I instantly know where to go look to fix this problem.
Turns out, in this case, my server had been updating so many products that the datafeedr website had automatically banned my site!
It’s now unlocked and my WordPress back-end is working at sub-second speeds again.