Related to #4687, which was a symptom of the actual problem noted below.
We have a massive entryversions table (~900 MB/~2 million records). The query that is run to convert this data into the new entry revisions format during a 3.2.x upgrade never completes because of how large this table is for us.
The $query being built in the convertVersions() method never finishes executing, so this process never completes for us.
Here's the raw SQL that I got by running $query->getRawSql() instead of ->count():
SELECT `id`, `entryId`, `creatorId`, `siteId`, `num`, `notes`, `data`, `dateCreated`, `uid`
FROM `craft_entryversions` `v`
WHERE (NOT EXISTS (SELECT * FROM `craft_entryversionerrors` WHERE `versionId` = `v`.`id`)) AND (`num` > (SELECT max(cast(`num` as signed)) - 10 FROM `craft_entryversions` WHERE `entryId` = `v`.`entryId`))
ORDER BY `id` DESC
With a table the size of ours, this never finishes pulling the data. I'm happy to supply a SQL file if that's helpful!
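(For anyone wanting to reproduce this: in Yii 2, which Craft's query classes are built on, the raw SQL with parameters interpolated comes from the command object, so a dump like the one above can be produced with something along these lines, $query standing in for whatever convertVersions() builds:)

// Debugging sketch: print the SQL a Yii 2 / Craft query object would run,
// with its bound parameters interpolated
echo $query->createCommand()->getRawSql();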
Hey Aaron, in theory you probably don’t really need any revisions that are more than a month or two old, so one possibility is to just delete any revisions that are older than that:
DELETE FROM craft_entryversions
WHERE dateCreated < '2019-05-31'
(Keep the original table backed up somewhere first, in case you do end up needing any of that revision data.)
Then try re-running the “Updating entry drafts and revisions” job, with the smaller dataset.
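For reference, one way to take that backup before deleting (a sketch assuming MySQL and the default craft_ table prefix; the _backup table name is just an example):

-- Copy the table structure, then its data, into a backup table
CREATE TABLE craft_entryversions_backup LIKE craft_entryversions;
INSERT INTO craft_entryversions_backup SELECT * FROM craft_entryversions;

-- Then trim the old revisions as suggested above
DELETE FROM craft_entryversions
WHERE dateCreated < '2019-05-31';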
Thanks, @brandonkelly! Will this cause any issues if there's an entry that has only one version and it was created after this date?
You’d just end up losing all revision history for that entry. It won’t impact the current revision of any entries though.
Looks like that worked, @brandonkelly! It's now processing the entry drafts and revisions.

Since this is working, should I just plan on trashing a number of these entryversions records before performing the 3.2.x update? Any concerns I should have about data loss if I do this?
This is the first case I’ve heard of where deleting existing revisions was necessary in order to get the job to work properly. As long as that first query is able to execute, it shouldn’t really matter how big that entryversions table is, since the query only selects the last 50 (or whatever maxRevisions is set to) revisions for each entry, ignoring the rest. So I’d say that, in general, you shouldn’t worry about it, unless you run into the same issue again.
Thanks, @brandonkelly! Sounds like when I do the official upgrade for the live site I'll need to run that SQL deletion if I want these to process through.
And, sorry, just to be 100% clear: I can run that DELETE statement without worrying about data loss, correct? This won't delete any entries; it'll just remove the revisions of all entries prior to a certain date?
Yep exactly.
Thanks again, @brandonkelly!
I just ran into this issue upgrading from Craft 2 to Craft 3.2. Our craft_entryversions table had ~500,000 rows, and even though we are only loading the first 50 versions of each entry, could looping through this many entries "tweak out" PHP/MySQL?
MySQL's SHOW FULL PROCESSLIST; showed several stalled queries looping through craft_entryversions and inserting into entryversionerrors.
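(For anyone else who hits this: a stalled query can be terminated manually using the Id column from that processlist output. This is standard MySQL, with 12345 as a placeholder id:)

-- Kill a stalled query by its processlist Id (12345 is a placeholder)
KILL 12345;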
Luckily, all of these entry versions could safely be removed, which also reduced the DB size from ~900 MB to ~350 MB, and after restarting PHP/MySQL we haven't had an issue.
I had this same issue with a recent update, but I'm struggling to see how to re-run the “Updating entry drafts and revisions” job now that I have a smaller dataset. Is there any way I can do it from the command line?
@Harry-Harrison You should be able to retry it from the Control Panel if it wasn’t able to complete successfully. Otherwise, you could copy your craft executable file as reconvert or something (also saved w/ executable permissions), and replace these lines:
$exitCode = $app->run();
exit($exitCode);
with:
$app->queue->push(new \craft\queue\jobs\ConvertEntryRevisions());
exit(0);
Then from your terminal, run
./reconvert
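For anyone else doing this, here's a rough sketch of what the copied reconvert file could end up looking like. The bootstrap lines should stay whatever your existing craft executable already contains (the paths below assume the default Craft 3 project layout); only the final two lines are swapped out as described above:

#!/usr/bin/env php
<?php

// Hypothetical "reconvert" script: a copy of the stock `craft` console
// entry script with the run/exit lines replaced, so it only queues the
// conversion job and exits.
define('CRAFT_BASE_PATH', __DIR__);
require_once CRAFT_BASE_PATH . '/vendor/autoload.php';

// Bootstrap the console app the same way the `craft` executable does
$app = require CRAFT_BASE_PATH . '/vendor/craftcms/cms/bootstrap/console.php';

// Replaces: $exitCode = $app->run(); exit($exitCode);
$app->queue->push(new \craft\queue\jobs\ConvertEntryRevisions());
exit(0);

Once queued, the job gets picked up the next time the queue runs (e.g. on a Control Panel visit, or via ./craft queue/run).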