Command Line WordPress, Node.js, Unit Testing

Dockunit: Containerized Unit Testing Across Any Platform and Programming Language

Problem:

I want my application to work on all the environments I claim to support, but I have no easy way of testing updates in each environment.

Bad Solution:

Continuously push to your remote to force Travis CI to test changes in multiple environments.

Good Solution:

Containerized unit testing with Dockunit.

Dockunit is a simple Node command that lets you run your test framework of choice across a number of predefined containers. Each container can have its own test command and Docker image. It was born out of the need to ensure my PHP applications (WordPress specifically) were compatible with a spectrum of environments. Travis CI dropping support for PHP 5.2 was the final straw, so I created this command for myself and decided to make it public.

Install/Usage Instructions for Dockunit

1. Ensure dependencies are met. See the requirements on GitHub.

2. Install Dockunit

npm install -g dockunit

3. Set up project(s) with predefined containers.

To do this, create a file called Dockunit.json in the root of any project you want to run Dockunit on. GitHub contains specific guidelines for how to write Dockunit.json files. Here is an example file for testing a WordPress plugin on PHP 5.2 FPM with WordPress 4.1, PHP 5.5 FPM with WordPress 4.0, and PHP 5.5 for Apache with WordPress 3.9:

{
  "containers": [
    {
      "prettyName": "PHP 5.2 FPM WordPress 4.1",
      "image": "tlovett1/php-5.2-phpunit-3.5",
      "beforeScripts": [
        "service mysql start",
        "bash bin/install-wp-tests.sh wordpress_test root '' localhost 4.1"
      ],
      "testCommand": "phpunit"
    },
    {
      "prettyName": "PHP 5.5 FPM WordPress 4.0",
      "image": "tlovett1/php-fpm-phpunit-wp",
      "beforeScripts": [
        "service mysql start",
        "bash bin/install-wp-tests.sh wordpress_test2 root '' localhost 4.0"
      ],
      "testCommand": "phpunit"
    },
    {
      "prettyName": "PHP 5.5 for Apache WordPress 3.9",
      "image": "tlovett1/php-apache-phpunit-wp",
      "beforeScripts": [
        "service mysql start",
        "bash bin/install-wp-tests.sh wordpress_test3 root '' localhost 3.9"
      ],
      "testCommand": "phpunit"
    }
  ]
}

4. Run the Dockunit command.

dockunit

Note: This assumes you have changed directories to your project root. You may need to use sudo.

Detailed command usage instructions can be found on GitHub. Contributions and bug reports are encouraged :).

Database Theory, Elasticsearch, Search, WordCamps, WordPress Plugins

ElasticPress at WordCamp Paris

This weekend I presented at WordCamp Paris 2015. My session was titled “Modernizing WordPress Search with Elasticsearch”. The talk ran through issues with WordPress search, what Elasticsearch is, setting up an Elasticsearch cluster, and configuring ElasticPress.

Elasticsearch is a very exciting technology, and I am thrilled at the chance to spread information about it. I (and 10up in general) am very proud of the work we have done on ElasticPress. My hope is that, as a result of the talk, more people will install the plugin and give us feedback.

Here are my slides for the talk:

Don’t forget that comprehensive documentation for ElasticPress lives on GitHub.

Publishing Experience, WordPress Plugins, WP API

Custom Contact Forms Redux

In 2009-2010 Custom Contact Forms was one of the first WordPress plugins I wrote. The purpose of the plugin was to solve a simple problem: easy contact form building on the web.

As I was building the plugin, WordPress 3.0 had not yet been released, so custom post types and other useful APIs did not yet exist. I wrote the plugin the best I could using custom database tables (ouch!). The plugin became decently popular, and I continued development on it for the next year or so. In the process I learned a lot about writing code for WordPress. Along the way I also landed a job at 10up, where I learned (and continue to learn) more than I ever thought I would.

Years later, in 2015, here I am an experienced WordPress developer and an avid open source contributor. I have used most of the popular WordPress form plugins: Gravity Forms, Ninja Forms, Formidable, etc. All of these plugins, despite lots of great functionality, have left me with the nagging feeling that something is missing.

That something was the WordPress experience. With the release of the media manager in WordPress 3.5, we have all grown accustomed to a high quality media management experience through smooth JavaScript interactions. Form management should be no different: forms should be built within a media manager-esque modal, no page reloads should be necessary, and we should get live previews of what we are building. Simple task, right? Eight months later… :)

Detailed installation and usage instructions are on GitHub. Download the plugin from WordPress.org.

For Users

Within the post edit screen, a simple “Add Form” button next to the “Add Media” button brings up the form manager modal:

Within the form management modal, you can also see your existing forms and edit them if you choose:

View all your forms within the form management modal for easy access.

After inserting a form into a post, you see a nice preview within TinyMCE:

Form previews even show within TinyMCE.

While the meat of the plugin lies within the form manager, single form views exist. Within a single form view you can see a live form preview:

Live previews of your forms are generated on the fly. No more guess and test.

Also within the single form view lives the form submissions table. You can easily paginate through results and add/remove columns as you please:

Easily scroll through form submissions in tabular format. You can configure columns you would like to see to ensure an uncluttered view.

For Developers

Most of the plugin is written in JavaScript. Custom Backbone views, models, and collections are used to model and display forms and fields. The plugin is extremely extensible: you can easily hook in to modify existing fields and views as well as create your own.

The plugin includes the new JSON REST API for WordPress. Right now, it is included as a Composer dependency for various reasons until the API is added to WordPress core.

Note: While the plugin is suitable for production environments, version 6 is still somewhat in beta. Please let me know on GitHub if you experience any problems.

Elasticsearch, Search, WordPress Plugins

Valuable Lessons Learned in ElasticPress

ElasticPress is a 10up WordPress plugin project that integrates Elasticsearch with WordPress. As we all know, search in WordPress is not a great experience. Why? Well, MySQL is not a database optimized for search. Thus ElasticPress was born.

1. Search result relevancy scores on sites with few posts per shard can vary depending on the order of indexing.

We first noticed this in our integration testing suite, where we were using three shards on a single node. Depending on the order in which posts were indexed, different relevancy scores were returned for the same search.

Elasticsearch relevancy scores are calculated using term frequency/inverse document frequency (TF-IDF). Term frequency is the number of times a term appears in the queried field of the current document (or post). Inverse document frequency measures how often the term appears across all documents in the index of the current shard; the rarer a term, the more weight it carries. Notice I said shard, NOT index. The shard a post lives on is determined by routing, essentially a hash of the document ID modulo the number of shards, so we can't exactly predict relevancy scores for a search on an index with more than one shard. The Elasticsearch documentation has a great article on this.
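
To make the shard dependence concrete, here is a rough sketch of the classic Lucene scoring Elasticsearch used at the time (heavily simplified; boosts and length normalization are omitted):

\mathrm{score}(t, d) \propto \mathrm{tf}(t, d) \cdot \mathrm{idf}(t)^2, \qquad \mathrm{idf}(t) = 1 + \ln\left(\frac{\mathrm{numDocs}}{\mathrm{docFreq}(t) + 1}\right)

Both numDocs and docFreq(t) are computed per shard by default, which is exactly why the same query can score the same post differently depending on which shard its neighbors landed on.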

The solution for testing purposes is to use only one shard. In the real world this shouldn't matter, as the inconsistencies plateau as index sizes grow. However, it is still something to be aware of.

2. There is no right search algorithm for WordPress. Fine-tuning algorithms is an ongoing, collaborative process.

As of ElasticPress 1.1, the meat of our default search query looked like this:

{
  "query": {
    "bool": {
      "must": {
        "fuzzy_like_this": {
          "fields": [
            "post_title",
            "post_excerpt",
            "post_content"
          ],
          "like_text": "search phrase",
          "min_similarity": 0.75
        }
      }
    }
  }
}

fuzzy_like_this is great. It combines fuzzy and more_like_this queries. fuzzy searches against a set of fuzzified terms (generated using the Levenshtein distance algorithm). more_like_this selects “interesting” terms based on a number of factors, like document frequency, and checks each document against those terms.

The problem we encountered was that, in certain established indexes, exact matches were not getting boosted to the very top of the results. This was due to the way the fuzzy_like_this algorithm works. In 1.2 we added an extra query to our search algorithm to boost exact matches:

{
  "query": {
    "bool": {
      "should": [
        {
          "multi_match": {
            "query": "search phrase",
            "boost": 2,
            "fields": ["post_title", "post_content", "post_excerpt"]
          }
        },
        {
          "fuzzy_like_this": {
            "fields": ["post_title", "post_excerpt", "post_content"],
            "like_text": "search phrase",
            "min_similarity": 0.75
          }
        }
      ]
    }
  }
}

The should query tells Elasticsearch that at least one of the multi_match or fuzzy_like_this queries must match for a document to be returned. Anything matched by multi_match is then boosted by a factor of two.

This solved our immediate problem but is not the perfect algorithm. We expect to continually optimize this for WordPress over time. (Note that ElasticPress allows you to filter the search query entirely if you want to customize it.)
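
As a sketch of what that filtering can look like, here is how you might tweak the generated query from a theme or plugin. This assumes the ep_formatted_args filter and the array structure shown above; check which filters exist in the version of ElasticPress you are running:

function myplugin_boost_exact_matches( $formatted_args ) {
	// Hypothetical example: boost the multi_match portion of the query even further.
	if ( isset( $formatted_args['query']['bool']['should'][0]['multi_match'] ) ) {
		$formatted_args['query']['bool']['should'][0]['multi_match']['boost'] = 4;
	}

	return $formatted_args;
}
add_filter( 'ep_formatted_args', 'myplugin_boost_exact_matches' );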

3. Disable indexing during imports.

By default, ElasticPress indexes a post when it is created. This is great until you try to import a few thousand posts and your Elasticsearch instance gets overloaded. This bit us pretty hard. As of newer versions, ElasticPress disables syncing during WordPress imports, big or small.
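
If you sync content to an external service from your own code, a guard like the following avoids the same problem. This is a minimal sketch, not ElasticPress's exact implementation, and the myplugin_* function names are made up:

function myplugin_maybe_index_post( $post_id ) {
	// WordPress importers define WP_IMPORTING; skip per-post syncing during
	// an import and run a bulk index afterwards instead.
	if ( defined( 'WP_IMPORTING' ) && WP_IMPORTING ) {
		return;
	}

	myplugin_index_post( $post_id ); // hypothetical single-post indexing function
}
add_action( 'save_post', 'myplugin_maybe_index_post' );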

Command Line WordPress, Unit Testing, WP API

Using the WordPress JSON REST API in Testing Suites with Travis CI

Problem:

We are writing custom endpoints and routes that extend WP API in a theme or plugin. We want to write tests for this code and run tests in Travis CI. Right now, WP API is not in WordPress core and must be included as a plugin. Therefore, we have a WordPress plugin/theme that depends on another plugin. We want to make sure all our dependencies are installed/included when we bootstrap our testing suite. WP API is not a registered Composer package.

Bad Solution #1:

Complain about WP API not being a registered Composer package. Just joking. The reason it isn't registered is that WP API must remain backwards compatible after being merged into WordPress core.

Bad Solution #2:

Include WP API in your theme/plugin as a Git submodule. Not only are Git submodules annoying, but this doesn't even make sense. Git submodules point to a specific commit. We just want to install the latest stable version, which isn't possible with submodules unless we manually update the submodule commit hash every time WP API is updated.

Final Solution:

We need to manually include WP API in our plugin/theme, but we want to keep it up to date. Essentially, we want to pretend WP API is a Composer package. If it's not already there, create a vendor/ folder in the root of your project. This is the same folder in which Composer will store your packages. Add vendor/ to your .gitignore, since we don't need this code in production (assuming you don't have other packages in vendor/ that are needed in production).

We can execute Bash scripts whenever composer install or composer update is called. Let's write a simple Bash script to clone or update WP API into vendor/. Place this code in bin/install-wp-api.sh:

#!/bin/bash

cloneOutput=$(git clone https://github.com/WP-API/WP-API.git ./vendor/wp-api 2>&1)

if [[ $cloneOutput =~ "destination path './vendor/wp-api' already exists" ]]; then
  cd vendor/wp-api
  git reset --hard &>/dev/null
  git checkout master &>/dev/null
  git reset --hard &>/dev/null
  git pull origin master &>/dev/null
fi

This script will clone WP API if it doesn't already exist. Otherwise it will check out the master branch and pull the latest from GitHub.

Now we need to execute this script. We will use Composer scripts. Here is an example composer.json file:

{
    "name": "tlovett1/package-name",
    "description": "Description of my package.",
    "license": "MIT",
    "authors": [
        {
            "name": "Taylor Lovett",
            "email": "email@email.com"
        }
    ],
    "minimum-stability": "stable",
    "require": {},
    "scripts": {
        "pre-install-cmd": [
            "./bin/install-wp-api.sh"
        ],
        "pre-update-cmd": [
            "./bin/install-wp-api.sh"
        ]
    }
}

composer.json defines your package as a Composer package, registers its dependencies, and more. As I mentioned before, we can call scripts as well. We use the pre-install-cmd hook to execute our script every time composer install is called, and the pre-update-cmd hook to execute it every time composer update is called. On both of these hooks we are executing the script at ./bin/install-wp-api.sh.

Now we simply add the following to our .travis.yml file:

before_script:
   - composer install

In your test suite bootstrap file, you can insert the following PHP to manually load WP API, assuming you are using the standard WordPress unit testing suite:

function _manually_load_plugin() {
	require( dirname( __FILE__ ) . '/../vendor/wp-api/plugin.php' );
}
tests_add_filter( 'muplugins_loaded', '_manually_load_plugin' );

EDIT: An even easier solution is to use a Composer inline package.

Command Line WordPress, Node.js, WordPress for Enterprise

Run WordPress Cron on Real Unix Cron with Node.js

WordPress cron is a confusing beast. Most people don't understand it or use it intentionally, and it is not the same technology as Unix cron. Scheduled post functionality actually depends on WordPress cron. By default, on every page load WordPress checks whether any cron events are due to fire. If an event is due, it sends an asynchronous request to wp-cron.php to execute the event(s).
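
For context, registering an event on WordPress cron typically looks something like this (a minimal sketch; the myplugin_* hook and function names are made up):

// Schedule a recurring event if it isn't already scheduled.
if ( ! wp_next_scheduled( 'myplugin_hourly_task' ) ) {
	wp_schedule_event( time(), 'hourly', 'myplugin_hourly_task' );
}

// The callback WordPress cron fires when the event comes due.
add_action( 'myplugin_hourly_task', 'myplugin_do_hourly_task' );

function myplugin_do_hourly_task() {
	// Resource-intensive work goes here.
}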

So what?

This system works great for small websites running simple theme and plugin setups. When building WordPress applications for the enterprise, however, enough resource-intensive events often get set up on cron that the HTTP requests time out before they complete. We can circumvent this problem by executing WordPress cron using actual cron.

wp-cron-node is a simple Node command that runs WordPress cron events via PHP CLI and Unix cron. It will execute any WordPress cron events that are due. The command requires WP-CLI.

Let’s run our scheduled events using actual Unix cron and Node:

  1. Make sure npm and WP-CLI are installed.
  2. Run the following command:
    npm install -g wp-cron-node
  3. Disable WordPress cron. This will prevent HTTP requests from triggering cron events. Put this code in wp-config.php:
    define( 'DISABLE_WP_CRON', true );
  4. Finally, set up your crontab file to your liking. Edit your crontab with the following command:
    crontab -e

    Here is an example crontab entry that will check for scheduled events every 10 minutes.

    */10 * * * * wp-cron-node /path/to/wp

    Note: Running this every 10 minutes means cron events could fire up to 10 minutes late. Ten is a conservative number for performance reasons. Change this to whatever makes you comfortable.

Backbone, WordPress Code Techniques

Syncing Backbone Models and Collections to admin-ajax.php

Backbone is framed around the assumption that your models and collections are backed by RESTful APIs. If this isn't the case, life becomes difficult. WordPress provides an AJAX API for backend AJAX requests. These requests are sent through admin-ajax.php, which is not RESTful.

I’ve written some code to force Backbone models and collections to play nicely with admin-ajax.php. First I’ll show code, then provide explanation:

/**
 * A mixin for collections/models
 */
var adminAjaxSyncableMixin = {
	url: ajaxurl,

	sync: function( method, object, options ) {
		if ( typeof options.data === 'undefined' ) {
			options.data = {};
		}

		options.data.nonce = localizedSettings.nonce;
		options.data.action_type = method;
		options.data.action = 'myaction';

		var json = this.toJSON();
		var formattedJSON = {};

		if ( json instanceof Array ) {
			formattedJSON.models = json;
		} else {
			formattedJSON.model = json;
		}

		_.extend( options.data, formattedJSON );

		options.emulateJSON = true;

		return Backbone.sync.call( this, 'create', object, options );
	}
};

/**
 * A model for all your syncable models to extend
 */
var BaseModel = Backbone.Model.extend( _.defaults( {
	parse: function( response ) {
		// Implement me depending on your response from admin-ajax.php!

		return response;
	}

}, adminAjaxSyncableMixin ));

/**
 * A collection for all your syncable collections to extend
 */
var BaseCollection = Backbone.Collection.extend( _.defaults( {
	parse: function( response ) {
		// Implement me depending on your response from admin-ajax.php!

		return response;
	}

}, adminAjaxSyncableMixin ));

The bulk of the action happens in our mixin:

var adminAjaxSyncableMixin = {
	url: ajaxurl,

	sync: function( method, object, options ) {}
};

We mix this object into the objects passed to Backbone.Model.extend and Backbone.Collection.extend. Therefore BaseModel and BaseCollection will have this object's properties (unless they are overwritten; more on that later).

The url property defines the location to which syncing will occur. In this case we are using ajaxurl, a variable that WordPress localizes by default on admin pages and that contains the full URL to admin-ajax.php.

The sync property defines a function that will be called in front of Backbone.sync. Backbone.sync is called whenever we call Backbone.Model.save, Backbone.Model.destroy, Backbone.Model.fetch, Backbone.Collection.save, and Backbone.Collection.fetch. By providing a sync property to our base model and collection, we are forcing our sync function to be called instead of Backbone.sync.

adminAjaxSyncableMixin.sync has a few parameters:

sync: function( method, object, options ) {

}

By default, method is set by the function called. For example, calling Backbone.Model.save or Backbone.Collection.save will call Backbone.sync where method is create, update, or patch. method ultimately determines the HTTP method used (GET, POST, PUT, PATCH, or DELETE). object is the model or collection being synced. options lets us, among other things, send extra data along to our sync destination.

Let’s look at the body of this function.

if ( typeof options.data === 'undefined' ) {
	options.data = {};
}

Setting the data property on options lets us manually override the data sent to the server. We are going to use it, so we make sure it's defined if it isn't already.

options.data.nonce = localizedSettings.nonce;
options.data.action_type = method;
options.data.action = 'myaction';

Now we are just setting up information to pass to admin-ajax.php. By default we pass a nonce that has been localized to our script. options.data.action should contain the action slug registered within WordPress using the wp_ajax_ hook. We will force our request to be an HTTP POST, so we send the original method along inside action_type for later use.

var json = this.toJSON();
var formattedJSON = {};

if ( json instanceof Array ) {
	formattedJSON.models = json;
} else {
	formattedJSON.model = json;
}
_.extend( options.data, formattedJSON );

This code sets up our model or collection data to be passed to the server. If the toJSON() representation of the current object is an Array, we know we have a collection. We extend the options.data object with the formattedJSON object we create.

options.emulateJSON = true;

This sends our data as application/x-www-form-urlencoded (classic form style) instead of application/json, saving us from having to decode JSON in our endpoint.

return Backbone.sync.call( this, 'create', object, options );

Finally, we call Backbone.sync in the current object context. We pass create as the method (forcing a POST request). object is simply passed along. We pass options having extended it with our own data. Essentially, our sync function is an intermediary between Backbone save/fetch and Backbone.sync.

var BaseModel = Backbone.Model.extend( _.defaults( {
	idAttribute: 'ID',
	parse: function( response ) {
		// Implement me depending on your response from admin-ajax.php!

		return response;
	}
}, adminAjaxSyncableMixin ));

var BaseCollection = Backbone.Collection.extend( _.defaults( {
	parse: function( response ) {
		// Implement me depending on your response from admin-ajax.php!

		return response;
	}
}, adminAjaxSyncableMixin ));

We define BaseModel by extending Backbone.Model and mixing in adminAjaxSyncableMixin. _.defaults returns an object, filling in undefined properties of the first parameter object with the corresponding properties from the second. We define BaseCollection the same way, extending Backbone.Collection and mixing in adminAjaxSyncableMixin.

Backbone.Model.parse and Backbone.Collection.parse intercept sync responses before they are processed into models and model data. Depending on how you write your admin-ajax.php endpoints, you may need to write some parsing code.
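
For reference, the PHP side of such an endpoint might look something like the sketch below. It mirrors the 'myaction' slug, nonce, and action_type values set in the mixin above; the nonce action string and the routing logic are assumptions you would adapt to your own plugin:

function myplugin_handle_myaction() {
	// The nonce action here must match whatever was passed to wp_create_nonce()
	// when localizing localizedSettings.nonce.
	check_ajax_referer( 'myaction_nonce', 'nonce' );

	$method = isset( $_POST['action_type'] ) ? sanitize_key( $_POST['action_type'] ) : 'read';

	switch ( $method ) {
		case 'create':
		case 'update':
		case 'patch':
			// Persist $_POST['model'] (or $_POST['models'] for a collection) here.
			break;
		case 'delete':
			// Remove the record here.
			break;
		default:
			// 'read': look up and return the record(s).
			break;
	}

	// Whatever you send back is what parse() receives on the JavaScript side.
	wp_send_json( array() );
}
add_action( 'wp_ajax_myaction', 'myplugin_handle_myaction' );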

Finally, we can define, instantiate, and utilize new models and collections based on BaseModel and BaseCollection:

var myModel = BaseModel.extend({});
var myCollection = BaseCollection.extend({});

var modelInstance = new myModel( { ID: 1 } );
modelInstance.fetch();
modelInstance.set( 'key', 'value' );
modelInstance.save();

var collectionInstance = new myCollection();
collectionInstance.fetch();
collectionInstance.at( 0 ).set( 'key', 'value' );
collectionInstance.save();

Presentations, WordPress Core

JSON REST API for WordPress at the DC API User Group

Today I am presenting on the JSON REST API for WordPress at the DC API User Group. This is a shorter talk geared toward both developers and API users, with or without WordPress experience.

With core integration coming in the near future, it's important for developers of ALL backgrounds to understand that it will be available on ~23% of the websites on the internet. Here are the slides for my talk: