Gutenberg: Beta usability test: comparison test script

Created on 4 May 2017 · 10 comments · Source: WordPress/gutenberg

Background: To catch up on our current state of usability testing, check out this post: https://make.wordpress.org/design/2017/04/27/testing-the-gutenberg-user-test/.

Goal: Is the new plugin more usable than the existing editor?

Task: Write a usability script using the alpha plugin. Base it on the existing testing script, but add the following tasks:

  • full-width image
  • text
  • headlines
  • lists
  • quotes

The test will run twice per user: once on the new plugin (using the demo page) and once on an existing WordPress install to test the existing editor. Keep the goal in mind and focus the tasks on it.

Format: The script should be made into a Google Form like the first one. However, let's first iterate on the script in this issue and then move it into the form. This task does not yet include running the tests; that will come later.

The script should also come with a setup sheet for the testers. Our goal is to have as many people as possible run this test, as easily as possible.

[Type] Documentation

All 10 comments

I took the liberty of editing the goal from "Is the new plugin more usable than the existing editor?" to "Compare tasks in the old editor and the new editor", and set the milestone to beta, where this test is probably better run.

The freeform block (#365) won't be ready for alpha, and even when it is, a comparison of the old editor to the equivalent block in the new editor doesn't directly make sense. But comparing tasks between the two certainly does!

Great idea to iterate on a test script to compare how participants solve the same tasks in the old vs. the new editor. This will give us an awesome set of data!!

One small thing we may want to keep in mind is the length of the test: the current test (18 tasks) took my participants 20-30 minutes to complete. If we ask participants to complete this twice, with potentially a few extra questions, it will make for a long test. Will people lose focus?

Perhaps we should cap the test at 10-15 tasks if it needs to be done twice?

It's definitely a good idea to restrict the length of the test. I started a Google Doc with questions in it. I don't know if that's helpful or if it should be in some other format, but feel free to copy/paste it to wherever it should really live...

@inhll I don't seem to be able to add comments to your Google Doc, which is a shame, as I wanted to. Could you either open it up or paste it here?

Just to note: we are doing this particular test in beta, so I'm updating the title to be more correct. But let's be prepared for the future.

@karmatosed I'd originally set it to 'anyone with a link can edit', so you should be good to go. If you're still having trouble with it, please just duplicate the document or paste it all out here.

I added a lot of the boilerplate stuff I do normally, assuming we can extract the questions into a Google Form and leave the doc in script format for the moderators to follow.

As confirmed via DM, it works now, thanks; let's put that down to Google gremlins. Thanks for this, it's awesome.

Just at the start, I would like us to take a step back. Let's do the following, step by step:

  • Work out a script skeleton
  • Make it into a digital form (this way it is accessible to many). @jasmussen, I would love your insights into what to use for that so we can centralise the feedback.
  • Then let's get a list of tasks that we all agree on
  • Write each test

@karmatosed should we get a list of tasks we agree on before creating the digital form? It seems like the back and forth could be handled better in Google Doc comments and then merged into a digital form once the questions are finalized.

Or possibly just add another step at the end to merge the test tasks into the forms?

@inhll I think we should get the list of tasks agreed on first.

Closing, as we now have the tests live.
