I have a Datastream set up that is streaming several tables from one of
our cloud SQL Postgres databases into a BigQuery dataset. We were under
the impression that Datastream should gracefully handle schema changes.
In this case we have a "message" co...
I thought I posted this but now I cannot find the post. Starting last
night, our calls to JSON_SET are no longer working. We get an error:
"Invalid number of parameters passed to function: expected at least 4,
got 3". The hardcoded examples in the docs fo...
We're having a frustrating problem with Datastream. We set up a user in
our Postgres (CloudSQL) db with all the settings as listed in the
Datastream documentation. We set up a publication from the database we
wanted, with 7 of the tables marked for str...
I'm using a service role that has storage admin and storage object admin
project-wide. But when I try to do a gcloud sql export to a bucket in
that project, it returns a 403 error. (gcloud.sql.export.csv) HTTPError
403: The service account does not ha...
The response to my support ticket indicates they are working on the
problem; if you are in us-east4 you probably are still seeing this but
other regions are likely fine. (I can't repro in US, for example)
The problem turned out to be order-of-events. The publication has to be
created before the replication slot, or else the slot will not find the
publication once it does exist. Probably not normally an issue, but if
you are experimenting trying to learn how t...
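For anyone hitting the same thing, the working order can be sketched roughly as follows. This is a hedged example: the host, user, database, publication, slot, and table names are all placeholders, not values from the original post.

```shell
# 1. Create the publication FIRST (names here are placeholders).
psql -h 10.0.0.5 -U datastream_user -d mydb -c \
  "CREATE PUBLICATION datastream_pub FOR TABLE orders, customers;"

# 2. Only then create the logical replication slot that Datastream reads from.
#    If the slot is created before the publication exists, replication can
#    fail to find the publication even after it is later created.
psql -h 10.0.0.5 -U datastream_user -d mydb -c \
  "SELECT pg_create_logical_replication_slot('datastream_slot', 'pgoutput');"
```

If you created them in the wrong order while experimenting, dropping the slot with `pg_drop_replication_slot` and recreating it after the publication exists is one way to recover.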
Do you think it's just the service role I'm using that needs the perms,
or does the CloudSQL Postgres instance role also need write perms to the
bucket for the gcloud sql export to work?