Created 1 year ago by @ricricucit
It would be great to have some alternatives to just MySQL!
This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions.
From @puco — https://github.com/directus/directus/issues/1020#issuecomment-473614013
Guys, what is the status of this? It has been sitting at the top as the most requested feature, after all.
So to give an update here, this is indeed the (second) most requested feature (behind GraphQL support, which is almost ready for beta). The reality is: we're a small open-source team and have to triage tickets based on priority. As of now that means that we are focusing on:
Essentially, we're trying to limit the breadth of what we need to test/support until things are in a good place. Adding Postgres will be awesome, and we're moving in the direction by continuing to identify, document, and abstract any MySQL specific queries/features. That way we'll be able to _hit the ground running_ when the time is right to add this support.
It's looking like we'll be starting on this in Q3 2019, unless other priority issues arise (pushing it back) or a company "sponsors" its development (pulling it forward). In the meantime, if you want to help out, try contributing to our public doc where we're consolidating info on what needs to happen to fully abstract our database layer, become SQL vendor agnostic, and support Postgres:
https://docs.google.com/document/d/17ZUuUxb2qMotYduTzLuntvdj45ogfvU-d_RsDMMEhOA/edit?usp=sharing
To achieve better clarity/visibility, we are now tracking feature requests within the Feature Request project board.
This issue being closed does not mean it's not being considered.
What's the current state of PostgreSQL support for Directus? The release board for v2.2.0 says it's DONE, but the code seems to only be merged into the pgsql branch.
Is this actually done, and if so, how can I use it? I can't find any documentation. (https://docs.directus.io/guides/cli.html#help, for example, still says only MySQL is supported.)
I thought our PostgreSQL PR was merged in. @directus/api-team ?
We're collecting community feedback on the postgres implementation before releasing it to the public
@benhaynes
It's in the develop branch. Until we have automated testing in place, we have to check the stability of PostgreSQL manually from time to time (mostly when we land any DB-related fixes).
@shealavington, please make sure the test cases you are writing also work for PostgreSQL, so that in the future we have automated testing for PostgreSQL too.
@urvashithakar @bjgajjar
Please plan to test PostgreSQL on your local machines with all the test cases mentioned in #1099.
Are there any plans on when PostgreSQL support will be available for the general public - any estimated time frame?
@matleh like @bjgajjar mentioned above, it's going to be very hard to know if it's stable enough to be "production ready", as that relies on manually testing some use cases. We might merge it into master but not "officially support"* it for a little while.
* Officially supporting something basically means pointing people to it in the docs. We'll add a page in there saying that postgres support is experimental
I've just seen your comment @hemratna
Please make sure the test cases you are writing also work for PostgreSQL, so that in the future we have automated testing for PostgreSQL too.
As far as I'm aware, my test suite will be a simple input -> output test set. These tests won't depend on which database is used; they'll check that the API can be used as expected, based on the documented endpoints, and you shouldn't have to change which requests you make based on which database is used.
Ultimately, as long as the API works as expected, tests should pass, and I shouldn't have to write any separate tests for different databases.
You're right, those tests should work regardless of database / tech stack used in the API
Maybe the tests could add a test table to the database with known data before running, and remove after? That way it works on any database, but still has the info it needs to perform the tests?
It definitely has to do that, as we should assume an empty installation of directus for these tests (ping @shealavington )
Maybe the tests could add a test table to the database with known data before running
I could write the test setup and teardown specifically for MySQL, using a Node MySQL client to do the following; however, that would require anyone running the tests to be using MySQL specifically:
Either that, or I was going to rely entirely on the API to do it all. The downside there is that if the setup (before) or teardown (after) fails, the test fails because it was incorrectly configured, but this approach would let us run the suite against any instance.
But yeah, all in all, it has nothing to do with whether you're using MySQL, MariaDB, or Postgres; we're simply checking whether the API performs as we expect.
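The API-only approach above could be sketched as a vendor-agnostic "seed plan": a plain list of HTTP request descriptors that any runner can replay against any instance. This is a hypothetical sketch, not the actual test suite; the endpoint paths are illustrative assumptions, not Directus' confirmed API surface.

```javascript
// Hypothetical sketch: build vendor-agnostic seed/teardown steps as plain
// request descriptors, so the same plan works against any database backend.
// Endpoint paths (`/items/...`, `/collections/...`) are illustrative only.
function buildSeedPlan(collection, items) {
  const seed = items.map((item) => ({
    method: 'POST',
    path: `/items/${collection}`,
    body: item,
  }));
  const teardown = [{ method: 'DELETE', path: `/collections/${collection}` }];
  return { seed, teardown };
}

const plan = buildSeedPlan('test_articles', [{ title: 'a' }, { title: 'b' }]);
console.log(plan.seed.length);      // 2
console.log(plan.teardown[0].path); // /collections/test_articles
```

Because the plan is just data, the same suite can run against MySQL, MariaDB, or Postgres without any vendor-specific setup code.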
That makes sense. We'll definitely want to make sure that this is database vendor agnostic. One more option to add into the mix:
If we want to ensure the setup/teardown works every time (doesn't rely on the API), then we could also have specific commands for each vendor. Since this "dummy" data shouldn't really change often, that shouldn't be a crazy amount of work.
Question 1: Can we keep the dummy data as one pure SQL file? Or does it need to be individual commands to work properly?
Question 2: Would we have setup/teardown before/after _each individual_ test, or before/after _all tests_?
I really want to keep it simple (K.I.S.S.); adding vendor-specific commands creates more complexity. Ideally we could use the API for the entire thing, so that if you make a change, run the tests, and they fail anywhere, there's likely an issue with the code, even if the failure was in the setup (before) or teardown (after).
Q1: I believe we could have one SQL file that we use every time: run it before all tests and drop the data after all tests have run. Though we'd need to ensure that when someone adds a new test, the seed data it needs is present, and the file would have to work for each vendor, which means someone has to verify the tests against each vendor.
Q2: The original thought was before/after each suite. For example, to test that the API can delete items, I'd make a suite of tests that delete 1 item and then 4 items; but before they run, I'd need to insert 5 items for the tests to use.
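The per-suite setup/teardown idea (Q2) could look something like this. The "api" here is an in-memory stand-in I'm inventing for illustration, not the real Directus API; the point is only the shape: seed exactly the records the suite needs, run the tests, and leave nothing behind.

```javascript
// Sketch of per-suite setup/teardown: each suite seeds the records it needs
// and removes them afterwards. `makeFakeApi` is a hypothetical in-memory
// stand-in, not the real Directus API.
function makeFakeApi() {
  const store = new Map();
  let nextId = 1;
  return {
    create(collection, item) {
      const id = nextId++;
      store.set(`${collection}/${id}`, { id, ...item });
      return id;
    },
    delete(collection, id) {
      return store.delete(`${collection}/${id}`);
    },
    count(collection) {
      return [...store.keys()].filter((k) => k.startsWith(`${collection}/`)).length;
    },
  };
}

const api = makeFakeApi();

// beforeAll: insert the 5 items the delete-tests will use
const ids = [1, 2, 3, 4, 5].map((n) => api.create('items', { n }));

// tests: delete 1 item, then the remaining 4
api.delete('items', ids[0]);
ids.slice(1).forEach((id) => api.delete('items', id));

console.log(api.count('items')); // 0 — the suite cleaned up after itself
```

With a real runner (Jest, Mocha, etc.) the seed and cleanup would live in the suite's before/after hooks, but the flow is the same.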
Maybe we just write all tests in a format that allow them to take care of themselves?
So I would create a suite of tests called Items, and that suite would
Originally I was thinking we could use the demo SQL file for the tests, but I think we should be able to change that file whenever we want without worrying about breaking the tests.
Ooooh, I like the CRUD tests... testing themselves. If we use the API to create collections/fields/relationships, create items, then read them, update it, revert them, check revisions, delete, etc... then we can just catch and report whenever/wherever things break. Good thinking Shea! Then we don't even need a "seed schema".
Maybe the dummy data could even be language agnostic. So a lot of numbers, Lorem ipsum, and/or using "test names" as the data itself.
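The "tests testing themselves" idea reduces each suite to a CRUD round trip that verifies its own state at every step. A minimal sketch of that shape, with an in-memory store standing in for the API (so no seed schema is needed, as suggested above):

```javascript
// Sketch of a self-verifying CRUD round trip: create, read, update, delete,
// checking state after each step and confirming nothing is left behind.
// The Map stands in for the API; no pre-seeded schema is required.
function crudRoundTrip() {
  const store = new Map();

  // Create
  store.set(1, { id: 1, title: 'draft' });

  // Read — verify the create worked
  const created = store.get(1);
  if (created.title !== 'draft') throw new Error('read failed');

  // Update — verify the change took effect
  store.set(1, { ...created, title: 'published' });
  if (store.get(1).title !== 'published') throw new Error('update failed');

  // Delete — the suite cleans up after itself
  store.delete(1);
  return store.size; // 0 when everything was removed
}

console.log(crudRoundTrip()); // 0
```

Against the real API, each step would be an HTTP request instead of a Map operation, but any break in create/read/update/delete would surface at exactly the step that failed.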
Also I just realized all this useful info is on the PostgreSQL issue... whoops!
I tried to test the code as well but can't seem to connect. Can someone post what the config file for Postgres should look like? Or how to turn on better logging? The only thing I see in the logs currently is the rather unspecific: api[_].ERROR: Directus\Database\Exception\ConnectionFailedException: Failed to connect to the database in /var/www/src/core/Directus/Application/CoreServicesProvider.php:776 (running with Docker Compose built from the pgsql branch).
Re connection: found the issue. I needed to install pdo_pgsql (the error reporting was not very helpful, as it does not log the original exception thrown in the Zend adapter). After that I managed to connect. I needed to refactor schema.sql for pgsql; after adjusting the data types it starts and I could log in. But I can't create any new collections (it fails on creating the id column, which I guess is created as int(11), which pgsql does not know what to do with). I also can't manage my existing tables (I can add the table but can't manage any fields due to some app issue).
Hey @puco — I assume you're testing with the pgsql branch?
@benhaynes yes I am. I can provide you more feedback as I test along if you are interested. But I don't think this is the right place for it.
Thank you! This is probably the best place for it, for now. Any feedback is very much appreciated!
The pgsql branch is now 198 commits behind the develop branch. Can I open a new pull request with this branch rebased on develop so it can be merged into develop?
I'll give this branch a try because I really need PostgreSQL for one of my projects.
Yeah, let's get this up-to-date and tested... then we can finally look into it getting merged. The problem in the past has been the added time investment and complexity of maintaining a new database vendor, so the more help we get from the community the better.