How do I transfer data directly from one Google Cloud Storage project to another?

How do I transfer data from one Google Cloud Storage project to another? I understand how to download and how to upload, but I want to transfer directly between projects.

+21
8 answers

To copy any single object from one GCS location to another, you can use the copy command. This can be done using any of our public APIs or using the gsutil command-line client.

With gsutil, the cp command can be used like this:

 gsutil cp gs://bucket1/obj gs://bucket2/obj2 

Edit:
Since I wrote this, the Google Cloud Storage Transfer Service has become available, which is well suited to copying entire buckets between GCS projects, or to copying entire buckets from S3 to GCS.
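Recent gcloud releases also expose the Transfer Service on the command line as `gcloud transfer jobs create`. The sketch below only assembles and prints the command rather than running it, since executing it needs real credentials and buckets; the bucket names are placeholders:

```shell
# Sketch: creating a one-off Storage Transfer Service job from the CLI.
# Bucket names are placeholders; the command is printed, not executed.
SRC="gs://source-bucket"
DST="gs://destination-bucket"
CMD="gcloud transfer jobs create ${SRC} ${DST}"
echo "$CMD"
```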

+22

Open the web console at Storage > Transfer to create a new transfer.

Select the source you want to copy from. As the answer above mentions, bucket names are globally unique (this is the key to the solution). So when you reach the transfer destination field of the form, you can type or paste the destination bucket name directly into the text field, even if it is a bucket from another project. A green icon appears once the destination is confirmed as an existing bucket, and you can then continue through the form to complete the setup.

After you start the transfer from the form, you can monitor its progress by clicking the refresh button at the top of the console.

+7

Bucket names in GCS are unique across all projects. For example, Project1 and Project2 cannot both have a bucket named "images", although each of them can have a folder named "images" inside its buckets.

This may seem confusing, because gsutil may ask you to choose a project to work with. For a copy command, that selection can be ignored.

gsutil cp gs://bucket1/obj gs://bucket2/obj

copies an object from Project1/bucket1 to Project2/bucket2.

+5

Using Google Cloud Shell

Switch to the first project, the one that owns the bucket you want to copy:
gcloud config set project [PROJECT1 ID]

Make a directory where you can mount the bucket:
mkdir test

Mount the bucket in that directory:
gcsfuse [BUCKET1] test

Switch to the second project, the one that owns the destination bucket:
gcloud config set project [PROJECT2 ID]

Copy the contents of the mounted directory into the second bucket:
gsutil cp -r /home/user/test gs://[BUCKET2]
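The steps above can be collected into a single sketch. Project IDs and bucket names are placeholders, and the commands are only printed rather than executed, since running them requires gcloud, gcsfuse, and real buckets:

```shell
# Sketch of the mount-and-copy flow: print each command in order.
PROJECT1="project1-id"
PROJECT2="project2-id"
BUCKET1="bucket1"
BUCKET2="bucket2"
for CMD in \
  "gcloud config set project ${PROJECT1}" \
  "mkdir test" \
  "gcsfuse ${BUCKET1} test" \
  "gcloud config set project ${PROJECT2}" \
  "gsutil cp -r test gs://${BUCKET2}"
do
  echo "$CMD"
done
```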

+1

According to the Moving Buckets docs, you can just use gsutil:

 gsutil cp -r gs://[SOURCE_BUCKET]/* gs://[DESTINATION_BUCKET] 

Note: if you are using zsh, make sure you wrap the source in single quotes, because zsh will try to expand the wildcard before gsutil sees it.
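The quoting issue is easy to demonstrate locally, with ordinary files standing in for objects: an unquoted `*` is expanded by the shell before the command ever runs, while a quoted one reaches the command as a literal pattern.

```shell
# Demonstrate shell glob expansion vs. quoting with local files.
rm -rf /tmp/glob_demo
mkdir -p /tmp/glob_demo
touch /tmp/glob_demo/a.txt /tmp/glob_demo/b.txt
UNQUOTED=$(echo /tmp/glob_demo/*)   # shell expands the pattern first
QUOTED=$(echo '/tmp/glob_demo/*')   # pattern is passed through literally
echo "$UNQUOTED"
echo "$QUOTED"
```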

You can find more details in the gsutil reference documentation.

0

If you want to use the console, follow @Martin van Dam's answer above.

If you want to use a shell:

Step 1. Open the Google Cloud Shell.

Step 2. Run gcloud init and follow the prompts to connect to the cloud project that bucket1 belongs to.

Step 3. Run gsutil cp -r gs://[bucket1]/* gs://[bucket2]

You're done!


* Now there is a catch! If both buckets belong to the same project, these steps work flawlessly. But if the two buckets belong to different projects, or to different Google Cloud accounts, it will not work. You need to fix the permissions first.

If they belong to the same GCP account:

Go to Storage > Browser > select the bucket > options menu > Edit bucket permissions > Add member > enter the service account email ID of the project that owns bucket2 > assign the role Storage > Storage Admin > Save. Then run gsutil cp.

If they belong to separate GCP accounts:

Go to Storage > Browser > select the bucket > options menu > Edit bucket permissions > Add member > enter the Gmail ID that owns the project that owns bucket2 > assign the role Storage > Storage Admin > Save. Then run gsutil cp.
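The console clicks above can also be done from the command line with `gsutil iam ch`. In the sketch below, the member and bucket are placeholders, and the command is only assembled and printed, since granting the role needs a real bucket and member:

```shell
# Sketch: grant a member from the other project Storage Admin on the bucket.
MEMBER="serviceAccount:project2-sa@project2.iam.gserviceaccount.com"
BUCKET="gs://bucket1"
CMD="gsutil iam ch ${MEMBER}:roles/storage.admin ${BUCKET}"
echo "$CMD"
```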

0

If you have a key or service account that gives you access to both projects, using gsutil is very simple and very fast.

This is what I did from my local Mac, and it synchronized terabytes of data in minutes (yes, minutes, not hours):

 gsutil -m rsync -r gs://my/source/project/bucket/files/ gs://my/target/project/bucket/directory/ 

The key here is to use the -m flag.

Check out the documentation at https://cloud.google.com/storage/docs/gsutil/commands/rsync for more information.
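gsutil rsync also accepts `-n` for a dry run, which is worth a first pass before moving terabytes. The sketch below only assembles the two command lines (the paths are placeholders) rather than executing them:

```shell
# Sketch: a dry-run pass (-n) before the real parallel (-m) rsync.
SRC="gs://my-source-bucket/files/"
DST="gs://my-target-bucket/files/"
DRY_RUN="gsutil -m rsync -n -r ${SRC} ${DST}"
REAL="gsutil -m rsync -r ${SRC} ${DST}"
echo "$DRY_RUN"
echo "$REAL"
```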

0

If you want to copy data from one project to another, I tried the approach below and it works: transfer the file from the Project 1 Compute Engine server to bucket1, then download it on the Project 2 Compute Engine server and upload it to bucket2.
1. Upload the data to the first project's bucket using gsutil.
2. Log in to the web UI at cloud.google.com, select the project you transferred the file to in step 1, and go to Storage; you will see the file there. Select the file and tick the "Share publicly" checkbox on the right side.
3. Right-click the public link and copy the link address.
4. Log in to the other project's Compute Engine server and use wget with the link from step 3 to download the file. Then use gsutil again to upload the file to the other project's storage bucket.
5. Important: disable the public link afterwards.

Hope this helps.

-2

Source: https://habr.com/ru/post/989741/

