
Import or Migrate from Amazon S3 or GCS to TiDB Cloud

This document describes how to use Amazon Simple Storage Service (Amazon S3) or Google Cloud Storage (GCS) as a staging area for importing or migrating data into TiDB Cloud.

Import or migrate from Amazon S3 to TiDB Cloud

If your organization is using TiDB Cloud as a service on AWS, you can use Amazon S3 as a staging area for importing or migrating data into TiDB Cloud.

Prerequisites

Before migrating data from Amazon S3 to TiDB Cloud, ensure you have administrator access to your corporate-owned AWS account.

Step 1. Create an Amazon S3 bucket and prepare source data files

  1. Create an Amazon S3 bucket in your corporate-owned AWS account.

    For more information, see Creating a bucket in the AWS User Guide.

  2. If you are migrating data from an upstream database, you need to export the source data first.

    For more information, see Migrate Data from MySQL-Compatible Databases.

  3. If your source data is in local files, you can upload the files to the Amazon S3 bucket using either the Amazon S3 console or the AWS CLI.

    • To upload files using the Amazon S3 console, see Uploading objects in the AWS User Guide.

    • To upload files using the AWS CLI, use the following command:

      aws s3 sync <Local path> <Amazon S3 bucket URI>
      

      For example:

      aws s3 sync ./tidbcloud-samples-us-west-2/ s3://tidb-cloud-source-data
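
      After the upload finishes, you can optionally verify that all files are in the bucket before importing. The following check is only a sketch and assumes the example bucket name used above; it lists the objects together with a total count and size:

      aws s3 ls s3://tidb-cloud-source-data --recursive --summarize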
      

Step 2. Configure Amazon S3 access

To allow TiDB Cloud to access the source data in your Amazon S3 bucket, you need to configure the bucket access for TiDB Cloud and get the Role-ARN. Once the configuration is done for one TiDB cluster in a project, all TiDB clusters in that project can use the same Role-ARN to access your Amazon S3 bucket.

For detailed steps, see Configure Amazon S3 access.
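For reference, the following is a minimal sketch of what that configuration amounts to when done with the AWS CLI: an IAM role that TiDB Cloud is allowed to assume, plus a read-only policy on the source bucket. The account ID and external ID are the values that the TiDB Cloud console displays during the configuration; the role name, policy name, and bucket name below are placeholders rather than required values:

    # Create a role that TiDB Cloud can assume. Replace the account ID and external ID
    # with the values shown in the TiDB Cloud console.
    aws iam create-role \
      --role-name tidb-cloud-import \
      --assume-role-policy-document '{
        "Version": "2012-10-17",
        "Statement": [{
          "Effect": "Allow",
          "Principal": {"AWS": "arn:aws:iam::<TiDB Cloud account ID>:root"},
          "Action": "sts:AssumeRole",
          "Condition": {"StringEquals": {"sts:ExternalId": "<TiDB Cloud external ID>"}}
        }]
      }'

    # Attach a read-only policy scoped to the source bucket.
    aws iam put-role-policy \
      --role-name tidb-cloud-import \
      --policy-name tidb-cloud-s3-read \
      --policy-document '{
        "Version": "2012-10-17",
        "Statement": [
          {"Effect": "Allow",
           "Action": ["s3:GetObject", "s3:GetObjectVersion"],
           "Resource": "arn:aws:s3:::tidb-cloud-source-data/*"},
          {"Effect": "Allow",
           "Action": ["s3:ListBucket", "s3:GetBucketLocation"],
           "Resource": "arn:aws:s3:::tidb-cloud-source-data"}
        ]
      }'

    # Print the Role-ARN to paste into the TiDB Cloud console.
    aws iam get-role --role-name tidb-cloud-import --query 'Role.Arn' --output text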

Step 3. Import data into TiDB Cloud

  1. Open the Import page for your target cluster.

    1. Log in to the TiDB Cloud console and navigate to the Clusters page of your project.

    2. Click the name of your target cluster to go to its overview page, and then click Import in the left navigation pane.

  2. On the Import page, click Import Data in the upper-right corner, select From S3, and then fill in the following parameters:

    • Data format: choose the format of your data.
    • Bucket URI: fill in the bucket URI of your source data.
    • Role ARN: enter the Role-ARN you obtained in Step 2.

    If the region of the bucket is different from that of your cluster, confirm the compliance of cross-region data transfer, and then click Next.

    TiDB Cloud starts validating whether it can access your data in the specified bucket URI. After validation, TiDB Cloud tries to scan all the files in the data source using the default file naming pattern, and returns a scan summary result on the left side of the next page. If you get the AccessDenied error, see Troubleshoot Access Denied Errors during Data Import from S3.

  3. Modify the file patterns and add the table filter rules if needed.

  4. Click Next.

  5. On the Preview page, confirm the data to be imported and then click Start Import.

After the data is imported, if you want to remove the Amazon S3 access of TiDB Cloud, delete the policy that you added in Step 2: Configure Amazon S3 access.
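If you created the role with the AWS CLI as in the sketch above, removing the access could look like this, using the same placeholder names (inline policies must be deleted before the role itself):

    aws iam delete-role-policy --role-name tidb-cloud-import --policy-name tidb-cloud-s3-read
    aws iam delete-role --role-name tidb-cloud-import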

Import or migrate from GCS to TiDB Cloud

If your organization is using TiDB Cloud as a service on Google Cloud Platform (GCP), you can use Google Cloud Storage (GCS) as a staging area for importing or migrating data into TiDB Cloud.

Prerequisites

Before migrating data from GCS to TiDB Cloud, ensure the following:

  • You have administrator access to your corporate-owned GCP account.
  • You have administrator access to the TiDB Cloud Management Portal.

Step 1. Create a GCS bucket and prepare source data files

  1. Create a GCS bucket in your corporate-owned GCP account.

    For more information, see Creating storage buckets in the Google Cloud Storage documentation.

  2. If you are migrating data from an upstream database, you need to export the source data first.

    For more information, see Install TiUP and Export data from MySQL-compatible databases.
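
    For example, after installing TiUP, an export with Dumpling might look like the following. This is only a sketch: the connection parameters are placeholders, and the exact options depend on your source database and the guides above.

      # Install the Dumpling component and export the source database to local files.
      tiup install dumpling
      tiup dumpling -h <source host> -P 3306 -u <user> -p <password> \
        --filetype sql -t 8 -F 256MiB -o ./export-data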

Step 2. Configure GCS access

To allow TiDB Cloud to access the source data in your GCS bucket, you need to configure the GCS access for each combination of TiDB Cloud project and GCS bucket. Once the configuration is done for one cluster in a project, all database clusters in that project can access the GCS bucket.

For detailed steps, see Configure GCS access.
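As a rough sketch of what that configuration does, it grants a TiDB Cloud service account read access to your bucket. Assuming the service account ID shown in the TiDB Cloud console and a placeholder bucket name, granting and checking the access with gsutil could look like this:

    # Grant the TiDB Cloud service account the Storage Object Viewer role on the bucket.
    gsutil iam ch serviceAccount:<TiDB Cloud service account>:objectViewer gs://your-source-bucket

    # Confirm the resulting IAM binding.
    gsutil iam get gs://your-source-bucket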

Step 3. Copy source data files to GCS and import data into TiDB Cloud

  1. Copy your source data files to your GCS bucket using either the Google Cloud console or gsutil.

    • To upload data using the Google Cloud console, see Uploading objects in the Google Cloud Storage documentation.

    • To upload data using gsutil, use the following command:

      gsutil rsync -r <Local path> <GCS URI>
      

      For example:

      gsutil rsync -r ./tidbcloud-samples-us-west-2/ gs://target-url-in-gcs
      
  2. Log in to the TiDB Cloud console and navigate to the Clusters page of your project.

  3. Click the name of your target cluster to go to its overview page, and then click Import in the left navigation pane.

  4. On the Import page, click Import Data in the upper-right corner, and then fill in the import-related information.
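If the import does not find the files you expect, you can list the bucket contents to confirm that the upload completed, for example with the example bucket URI used above:

    gsutil ls -r gs://target-url-in-gcs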
