How do I integrate Bitbucket with AWS CodePipeline? What is the best approach?

I want to integrate my code from Bitbucket into AWS CodePipeline, but I cannot find suitable examples of how to do this. My source code is in .NET. Can someone please guide me? Thank you.

+18
7 answers

You can integrate Bitbucket with AWS CodePipeline using webhooks that call AWS API Gateway, which invokes a Lambda function (which in turn calls CodePipeline). There is an AWS blog post that walks you through this: Integrating Git with AWS CodePipeline.
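For a rough idea of the glue involved, here is a minimal sketch (not the blog post's actual code) of a Lambda handler that API Gateway could invoke when the Bitbucket webhook fires; the pipeline name is a placeholder, and the real solution in the blog post does more work, such as fetching the repository and uploading it to S3:

# Hypothetical sketch of the Lambda behind API Gateway; the pipeline name is a placeholder
import json
import boto3

codepipeline = boto3.client("codepipeline")

def lambda_handler(event, context):
    # A real handler should first validate the Bitbucket webhook payload/signature
    response = codepipeline.start_pipeline_execution(name="my-dotnet-pipeline")
    return {
        "statusCode": 200,
        "body": json.dumps({"pipelineExecutionId": response["pipelineExecutionId"]}),
    }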

+15

Bitbucket has a service called Pipelines that can deploy code to AWS services. Use Pipelines to package and push updates from your master branch to an S3 bucket, which is connected to CodePipeline.

Notes:

  • You must enable Pipelines in your repository

  • Pipelines expects a file called bitbucket-pipelines.yml to be present inside your project

  • Make sure you set your AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY account variables in the Bitbucket Pipelines UI. They are stored encrypted, so they are safe

Here is an example bitbucket-pipelines.yml that copies the contents of a directory named DynamoDb to an S3 bucket:

pipelines:
  branches:
    master:
      - step:
          script:
            - apt-get update # required to install zip
            - apt-get install -y zip # required if you want to zip repository objects
            - zip -r DynamoDb.zip .
            - apt-get install -y python-pip
            - pip install boto3==1.3.0 # required for s3_upload.py
            # the first argument is the name of the existing S3 bucket to upload the artefact to
            # the second argument is the artefact to be uploaded
            # the third argument is the bucket key
            - python s3_upload.py LandingBucketName DynamoDb.zip DynamoDb.zip # run the deployment script

Here is a working example of the Python upload script that should be deployed alongside the bitbucket-pipelines.yml file in your project. Above, I named my Python script s3_upload.py:

from __future__ import print_function
import os
import sys
import argparse
import boto3
from botocore.exceptions import ClientError


def upload_to_s3(bucket, artefact, bucket_key):
    """Uploads an artefact to Amazon S3"""
    try:
        client = boto3.client('s3')
    except ClientError as err:
        print("Failed to create boto3 client.\n" + str(err))
        return False
    try:
        client.put_object(
            Body=open(artefact, 'rb'),
            Bucket=bucket,
            Key=bucket_key
        )
    except ClientError as err:
        print("Failed to upload artefact to S3.\n" + str(err))
        return False
    except IOError as err:
        print("Failed to access artefact in this directory.\n" + str(err))
        return False
    return True


def main():
    parser = argparse.ArgumentParser()
    parser.add_argument("bucket", help="Name of the existing S3 bucket")
    parser.add_argument("artefact", help="Name of the artefact to be uploaded to S3")
    parser.add_argument("bucket_key", help="Name of the S3 Bucket key")
    args = parser.parse_args()

    if not upload_to_s3(args.bucket, args.artefact, args.bucket_key):
        sys.exit(1)


if __name__ == "__main__":
    main()

Here is a CloudFormation example of a CodePipeline with just one Source stage (you can add more):

Pipeline:
  Type: "AWS::CodePipeline::Pipeline"
  Properties:
    ArtifactStore:
      # Where CodePipeline copies and unpacks the uploaded artifact
      # Must be versioned
      Location: !Ref "StagingBucket"
      Type: "S3"
    DisableInboundStageTransitions: []
    RoleArn: !GetAtt "CodePipelineRole.Arn"
    Stages:
      - Name: "Source"
        Actions:
          - Name: "SourceTemplate"
            ActionTypeId:
              Category: "Source"
              Owner: "AWS"
              Provider: "S3"
              Version: "1"
            Configuration:
              # Where Pipelines uploads the artifact
              # Must be versioned
              S3Bucket: !Ref "LandingBucket"
              S3ObjectKey: "DynamoDb.zip" # Zip file that is uploaded
            OutputArtifacts:
              - Name: "DynamoDbArtifactSource"
            RunOrder: "1"

LandingBucket:
  Type: "AWS::S3::Bucket"
  Properties:
    AccessControl: "Private"
    VersioningConfiguration:
      Status: "Enabled"
StagingBucket:
  Type: "AWS::S3::Bucket"
  Properties:
    AccessControl: "Private"
    VersioningConfiguration:
      Status: "Enabled"

A link to this Python code along with other examples can be found here: https://bitbucket.org/account/user/awslabs/projects/BP

+11

In case someone finds this now:

AWS CodeBuild now supports Atlassian Bitbucket Cloud as a source type, making it the fourth supported source alongside the existing ones: AWS CodeCommit, Amazon S3, and GitHub.

This means that you no longer need to implement a Lambda function, as suggested in the link @Kirkaiya posted, to integrate with Bitbucket - that is still a valid solution depending on your use case, or if you are integrating with a non-cloud version of Bitbucket.

Posted on the AWS Blog on August 10, 2017 - https://aws.amazon.com/about-aws/whats-new/2017/08/aws-codebuild-now-supports-atlassian-bitbucket-cloud-as-a-source-type/

And to clarify for the commenters: this link talks about integration with CodeBuild, not with CodePipeline. You still need to find a way to trigger the pipeline, but when it is triggered, CodeBuild will pull the code from Bitbucket instead of having to copy the code to S3 or AWS CodeCommit before the pipeline starts.

+10

AWS CodeBuild now supports building Bitbucket pull requests, and we can use this for a better solution without using webhooks / API Gateway / Lambda.

You can use CodeBuild to zip your code and upload it to S3, and then use that as the source in your CodePipeline.

https://lgallardo.com/2018/09/07/codepipeline-bitbucket
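As a sketch of what that could look like (not from the linked post; the name, role, bucket, and image are placeholders, and your Bitbucket account must already be connected to CodeBuild via OAuth), the CodeBuild project might be declared in CloudFormation roughly as follows:

# Hypothetical CloudFormation sketch: CodeBuild pulls from Bitbucket and drops a zipped artifact in S3
Build:
  Type: "AWS::CodeBuild::Project"
  Properties:
    Name: "bitbucket-to-s3-build"                        # placeholder name
    ServiceRole: !GetAtt "CodeBuildRole.Arn"             # placeholder IAM role
    Source:
      Type: "BITBUCKET"
      Location: "https://bitbucket.org/your-team/your-repo.git"  # placeholder repository
    Artifacts:
      Type: "S3"
      Location: !Ref "LandingBucket"                     # the bucket your pipeline's S3 source watches
      Packaging: "ZIP"
    Environment:
      Type: "LINUX_CONTAINER"
      ComputeType: "BUILD_GENERAL1_SMALL"
      Image: "aws/codebuild/standard:2.0"                # placeholder build image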

+4

An alternative to @binary's answer, and a clarification of @OllyTheNinja's answer:

In short: let CodeBuild listen to Bitbucket's webhook and write the build output to an S3 object. In the pipeline, listen for update events on that object.

In AWS:

  1. Define a CodeBuild project, with

    • Source: Bitbucket, using its webhook to listen for git push events.
    • Buildspec: build the project according to buildspec.yml (a minimal sketch follows this list).
    • Artifacts: store the build output directly in an S3 bucket.
  2. Define the pipeline:

    • Source: listen for updates to the previously defined S3 object
    • remove the build step
    • add the other steps, and configure the deployment step
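A minimal buildspec.yml for step 1 could look like the sketch below (assuming a .NET Core project and the standard CodeBuild image; the commands and runtime version are assumptions to adapt):

# Hypothetical buildspec.yml sketch for a .NET Core project; adjust commands and versions to your project
version: 0.2

phases:
  install:
    runtime-versions:
      dotnet: 2.2
  build:
    commands:
      - dotnet restore
      - dotnet publish -c Release -o ./publish
artifacts:
  files:
    - '**/*'
  base-directory: publish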
+4

If you are looking for a way to automate the build and deployment process using AWS CodePipeline with Bitbucket as the source, without using a Lambda function, follow these steps:

  1. Create a CodeBuild project, which now supports Bitbucket as a source: https://docs.aws.amazon.com/codebuild/latest/userguide/sample-bitbucket-pull-request.html Also create a webhook that rebuilds every time code is pushed to the repository. You cannot use a webhook if you use a public Bitbucket repository.
  2. CodeBuild will be triggered automatically on every commit, create a zip file, and store it in an S3 bucket.
  3. Create a CodePipeline with S3 as the source and deploy it using CodeDeploy, since S3 is a valid source.

Notes:

  1. To create the webhook you need Bitbucket admin access, so the process from commit to deployment is fully automated.
  2. At the moment (April 2019) CodeBuild does not support webhooks on pull request merge. If you want, you can create a trigger that runs the build, say, every day.
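If it helps, the webhook from step 1 can also be created programmatically once the CodeBuild project exists; a minimal boto3 sketch (the project name is a placeholder) might be:

# Hypothetical sketch: add a webhook to an existing CodeBuild project so pushes trigger builds
import boto3

codebuild = boto3.client("codebuild")

codebuild.create_webhook(
    projectName="bitbucket-to-s3-build",                     # placeholder project name
    filterGroups=[[{"type": "EVENT", "pattern": "PUSH"}]],   # only build on push events
)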

You can also create triggers to build your code periodically: https://docs.aws.amazon.com/codebuild/latest/userguide/trigger-create.html

Update (June 2019): Pull request builds for PR_Merge events are now supported in CodeBuild. Reference: https://docs.aws.amazon.com/codebuild/latest/userguide/sample-bitbucket-pull-request.html#sample-bitbucket-pull-request-filter-webhook-events

+4

For me, the best way to integrate Bitbucket with any AWS service is to use Pipelines to mirror every commit into a (mirror) AWS CodeCommit repository. From there you have first-class integration with any AWS service. You can find an excellent how-to here:
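A rough sketch of such a mirroring bitbucket-pipelines.yml is below; the region, repository name, and the $CODECOMMIT_* variables are placeholders (they would be HTTPS Git credentials for CodeCommit stored as secured Pipelines variables, URL-encoded if they contain special characters):

# Hypothetical bitbucket-pipelines.yml sketch: mirror every push on master to a CodeCommit repository
clone:
  depth: full   # fetch full history so the branch can be pushed to the mirror
pipelines:
  branches:
    master:
      - step:
          script:
            - git remote add codecommit "https://$CODECOMMIT_USER:$CODECOMMIT_PASSWORD@git-codecommit.us-east-1.amazonaws.com/v1/repos/my-mirror-repo"
            - git push codecommit master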

+2

Source: https://habr.com/ru/post/1262901/

