Bitbucket has a service called Pipelines that can deploy code to AWS services. Use Pipelines to package and push updates from your master branch to an S3 bucket, which CodePipeline then picks up as its Source.
Remarks:
You must enable Pipelines in your repository settings
Pipelines expects a file called bitbucket-pipelines.yml at the root of your project
Make sure you set AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY as secured repository variables in the Bitbucket Pipelines settings. Secured variables are stored encrypted, so your credentials are safe (the sketch below shows how they reach the deployment script)
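Bitbucket exposes repository variables as environment variables inside the build container, and boto3 reads AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY from the environment by default. The following minimal sketch (not part of the deployment script itself; the region fallback is an assumption) shows why no credentials need to be hard-coded anywhere in the repository:

import os
import boto3

# Bitbucket Pipelines injects repository variables as environment variables
# into the build container, so the secured AWS keys are available here.
assert "AWS_ACCESS_KEY_ID" in os.environ
assert "AWS_SECRET_ACCESS_KEY" in os.environ

# boto3 picks these variables up automatically; nothing is stored in the repo.
# The region default below is an assumption for this sketch.
client = boto3.client("s3", region_name=os.environ.get("AWS_DEFAULT_REGION", "us-east-1"))
print(client.list_buckets()["Buckets"])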
Here is an example bitbucket-pipelines.yml that zips the contents of a directory named DynamoDb and copies the archive to the S3 bucket (replace the placeholder bucket name with the name of your own landing bucket):
pipelines:
  branches:
    master:
      - step:
          script:
            - apt-get update                 # the default build image is Debian-based
            - apt-get install -y zip         # needed to package the directory
            - zip -r DynamoDb.zip DynamoDb   # the key name matches S3ObjectKey in the CodePipeline template below
            - pip install boto3              # required by s3_upload.py; assumes pip is available in the build image
            # arguments: existing S3 bucket, artefact to upload, bucket key
            - python s3_upload.py <your-landing-bucket> DynamoDb.zip DynamoDb.zip
Here is a working example Python boto3 script that should be deployed alongside bitbucket-pipelines.yml in your project. Above, I named this script s3_upload.py:
from __future__ import print_function
import os
import sys
import argparse
import boto3
from botocore.exceptions import ClientError


def upload_to_s3(bucket, artefact, bucket_key):
    """
    Uploads an artefact to Amazon S3
    """
    try:
        client = boto3.client('s3')
    except ClientError as err:
        print("Failed to create boto3 client.\n" + str(err))
        return False
    try:
        client.put_object(
            Body=open(artefact, 'rb'),
            Bucket=bucket,
            Key=bucket_key
        )
    except ClientError as err:
        print("Failed to upload artefact to S3.\n" + str(err))
        return False
    except IOError as err:
        print("Failed to access artefact in this directory.\n" + str(err))
        return False
    return True


def main():
    parser = argparse.ArgumentParser()
    parser.add_argument("bucket", help="Name of the existing S3 bucket")
    parser.add_argument("artefact", help="Name of the artefact to be uploaded to S3")
    parser.add_argument("bucket_key", help="Name of the S3 Bucket key")
    args = parser.parse_args()

    if not upload_to_s3(args.bucket, args.artefact, args.bucket_key):
        sys.exit(1)


if __name__ == "__main__":
    main()
Here is a CodePipeline CloudFormation example with just one Source stage (you can add more stages):
Pipeline: Type: "AWS::CodePipeline::Pipeline" Properties: ArtifactStore: # Where codepipeline copies and unpacks the uploaded artifact # Must be versioned Location: !Ref "StagingBucket" Type: "S3" DisableInboundStageTransitions: [] RoleArn: !GetAtt "CodePipelineRole.Arn" Stages: - Name: "Source" Actions: - Name: "SourceTemplate" ActionTypeId: Category: "Source" Owner: "AWS" Provider: "S3" Version: "1" Configuration: # Where PipeLines uploads the artifact # Must be versioned S3Bucket: !Ref "LandingBucket" S3ObjectKey: "DynamoDb.zip" # Zip file that is uploaded OutputArtifacts: - Name: "DynamoDbArtifactSource" RunOrder: "1" LandingBucket: Type: "AWS::S3::Bucket" Properties: AccessControl: "Private" VersioningConfiguration: Status: "Enabled" StagingBucket: Type: "AWS::S3::Bucket" Properties: AccessControl: "Private" VersioningConfiguration: Status: "Enabled"
This Python code, along with other examples, can be found here: https://bitbucket.org/account/user/awslabs/projects/BP