Comments
je...@panerabread.com <je...@panerabread.com> #7
As mentioned, the ability to clone a production DB to another non-prod project is something that would be used often.
ji...@platforma.one <ji...@platforma.one> #10
gcloud sql export sql db-production gs://someproject-development-bucket/db-production_dump.sql --project someproject-production && \
gcloud sql import sql db-development gs://someproject-development-bucket/db-production_dump.sql --project someproject-development
As long as permissions are set, you can import and export between any two projects.
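A note on the permissions part: the export file is written by the Cloud SQL instance's own service account, so that account needs access to the target bucket. A minimal sketch, assuming the instance and bucket names from the commands above and objectAdmin/objectViewer as the role choices:

```shell
# Find the service account of the production Cloud SQL instance;
# it is the identity that writes the export file to the bucket.
SA_PROD=$(gcloud sql instances describe db-production \
  --project someproject-production \
  --format='value(serviceAccountEmailAddress)')

# Let that service account write objects into the development bucket.
gsutil iam ch "serviceAccount:${SA_PROD}:roles/storage.objectAdmin" \
  gs://someproject-development-bucket

# The development instance's service account needs read access
# for the subsequent import.
SA_DEV=$(gcloud sql instances describe db-development \
  --project someproject-development \
  --format='value(serviceAccountEmailAddress)')
gsutil iam ch "serviceAccount:${SA_DEV}:roles/storage.objectViewer" \
  gs://someproject-development-bucket
```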
---
Our setup:
project-development:
- Cloud SQL: db-development
- Cloud Storage: bucket-development
- Cloud Build: clone-prod-to-dev
project-production:
- Cloud SQL: db-production
# YAML from the Cloud Build trigger:
steps:
# export SQL databases
- name: 'gcr.io/cloud-builders/gcloud'
args: [
'sql', 'export', 'sql',
'db-production',
'gs://someproject-development-bucket/db-production_dump-$BUILD_ID.sql',
'--project', 'someproject-production'
]
waitFor: ['-']
id: 'sql-export'
# import SQL databases
- name: 'gcr.io/cloud-builders/gcloud'
args: [
'sql', 'import', 'sql',
'db-development',
'gs://someproject-development-bucket/db-production_dump-$BUILD_ID.sql',
'--project', 'someproject-development'
]
waitFor: ['sql-export']
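The same config can also be run outside of a trigger; a sketch of a manual submission, assuming the YAML above is saved locally as cloudbuild.yaml (the file name is an assumption):

```shell
# Submit the build by hand from the development project; --no-source
# because the steps above need no local files, only gcloud commands.
gcloud builds submit --no-source \
  --config cloudbuild.yaml \
  --project someproject-development
```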
aj...@google.com <aj...@google.com> #17
Best,
Akhil
Product Manager, Cloud SQL
ca...@google.com <ca...@google.com> #19
For now a good option is to use the
s....@ndr.de <s....@ndr.de> #20
This is in fact possible now!
payload.json:
{
"restoreBackupContext":
{
"backupRunId": 1647594254833,
"project": "my-source-project",
"instanceId": "my-source-database"
}
}
curl -X POST \
-H "Authorization: Bearer $(gcloud auth print-access-token)" \
-H "Content-Type: application/json; charset=utf-8" \
-d @payload.json \
"https://sqladmin.googleapis.com/v1/projects/my-destination-project/instances/my-destination-database/restoreBackup"
Just adjust the values for backupRunId, project, and instanceId in payload.json, as well as the URL in the curl call.
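The request can also be assembled programmatically. This is only a helper sketch that builds the URL and payload shown above (all project, instance, and backup-run values are the placeholders from the example, not real resources); the actual POST is left to curl or any HTTP client:

```python
import json

def build_restore_request(src_project, src_instance, backup_run_id,
                          dest_project, dest_instance):
    """Return (url, payload) for the Cloud SQL Admin restoreBackup call."""
    url = (
        "https://sqladmin.googleapis.com/v1/projects/"
        f"{dest_project}/instances/{dest_instance}/restoreBackup"
    )
    payload = {
        "restoreBackupContext": {
            "backupRunId": backup_run_id,
            "project": src_project,
            "instanceId": src_instance,
        }
    }
    return url, payload

url, payload = build_restore_request(
    "my-source-project", "my-source-database", 1647594254833,
    "my-destination-project", "my-destination-database",
)
print(url)
print(json.dumps(payload, indent=2))
# POST this payload to the URL with an
# "Authorization: Bearer <access token>" header, as in the curl call above.
```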
b....@gmail.com <b....@gmail.com> #21
Could you please share any hints on whether it is possible to set up cross-project replicas for Cloud SQL in GCP?
Thank you!
br...@gmail.com <br...@gmail.com> #22
The only way I have seen is to use Database Migration Service to replicate a DB instance from your source project to a new instance in your destination project, and then clone that to another DB in the destination project; but now you are paying for an extra DB server.
I also need the ability to clone across projects; it's preventing me from migrating about 120 databases (all > 200 GB) to Cloud SQL from Postgres running on GCE instances. (We can clone database disks from one project to another.)
Description
Unfortunately, in order to move a database from one project to another, you have only the following existing options:
- set up an external read replica, wait for it to catch up, and then promote it to master.
- export the data, copy it to the new project, then import it.
The first option is rather cumbersome, but has the benefit of allowing a fairly clean cutover with limited downtime. The second option requires more downtime and a bunch of extra administrative steps.
Having the option to easily clone your database to a new one in a different project, or to restore from a backup in a different project, would greatly simplify this administrative process and require much less planning.
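For comparison, a same-project clone is already a single command; cross-project support is what this request would add (the instance names below are hypothetical):

```shell
# Clone an instance within one project; today both instances
# must live in the same project.
gcloud sql instances clone db-production db-production-copy \
  --project someproject-production
```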