Jenkins is one of a number of CI/CD tools you can connect to Bitbucket Cloud. However, if you’re already using Bitbucket Cloud to manage your code, migrating from Jenkins to Bitbucket Pipelines can improve your overall experience by offering a highly scalable CI/CD tool fully integrated with Bitbucket Cloud’s features and interface.
Learn more about the benefits of migrating to Bitbucket Pipelines.
To migrate from a Jenkins server to Bitbucket Pipelines, you’ll need to translate the configuration file for each of your Jenkins pipelines to a format that Bitbucket Pipelines can understand.
Jenkins configuration files are written in groovy format, while Bitbucket Pipelines use YAML. To save time converting from one format to another, we’ve created an automated migration tool to translate your groovy-based Jenkins configuration files to Bitbucket Pipelines YAML files.
Before you get started using the tool, it’s important to familiarise yourself with:
Once you’re ready to proceed, move on to:
The automated migration tool only supports declarative Jenkins pipelines. Scripted pipelines are not supported.
It’s important to understand that our migration tool can only provide a strong start to the conversion process. Because Jenkins servers are plugin-based, we cannot account for every plugin your Jenkins configuration files reference.
Our tool supports the most common Jenkins plugins used in typical CI/CD workflows. When you use the tool to convert a Jenkins configuration file, the tool clearly marks what commands could not be translated and need manual review in the resultant Bitbucket Pipelines configuration file.
For example:
# The migration assistant couldn't convert the following plugins in your Jenkinsfile. They may require manual conversion.
# Unhandled plugin (<plugin_name>) String Param: (<string_params>) params: (<keyword_params>)
In most cases, you will need to complete this conversion process yourself, either by manually reviewing and updating the Bitbucket Pipelines files the tool creates, or by customising the tool itself to accommodate your specific Jenkins plugins.
We provide the migration tool as a self-contained docker image and as customisable source code.
We recommend that most users use the Docker image to run the tool. The image requires minimal setup and translates the most common plugin-based commands in your Jenkins configuration files.
If you’d like to extend the migration tool to translate additional plugin commands, we provide the source code for you to customise as necessary. This method requires more effort to get up and running, but allows you to extend the tool’s functionality to cover your specific Jenkins plugin needs.
Regardless of how you use the migration tool, it’s important to review the Bitbucket Pipelines files the migration tool creates. Use our syntax examples to help you make manual revisions.
Jenkins plugins supported by the automated migration tool are continually updated. View the current list of supported plugins.
For mission critical workflows, we recommend rolling out a migration from Jenkins to Bitbucket Pipelines progressively. This strategy allows you to gradually transition your existing repositories to Bitbucket Pipelines while still leveraging Jenkins where necessary.
You can start by migrating less critical repositories or parts of your build processes to Bitbucket Pipelines. This lets you test and refine your configurations as you go.
During this transitional period, you can configure Bitbucket Pipelines to trigger Jenkins jobs using the jenkins-job-trigger pipe. This hybrid approach enables you to gradually familiarize your team with Bitbucket Pipelines.
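As a sketch of this hybrid approach, a Pipelines step can call the jenkins-job-trigger pipe to kick off a job on your existing Jenkins server. The pipe version, server URL, job name, and variable values below are illustrative; check the pipe’s listing for its current version and required variables:

```yaml
pipelines:
  default:
    - step:
        name: Trigger legacy Jenkins job
        script:
          - pipe: atlassian/jenkins-job-trigger:0.1.1  # version illustrative; check the pipe's page
            variables:
              JENKINS_URL: 'https://jenkins.example.com'  # hypothetical server URL
              JENKINS_USER: $JENKINS_USER                 # repository variables set in the UI
              JENKINS_TOKEN: $JENKINS_TOKEN
              JOB_NAME: 'legacy-build'                    # hypothetical job name
```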
Don’t forget that you can enhance your CI/CD workflows and minimize boilerplate code with our powerful CI/CD integrations, known as Pipes. Choose from our curated list of 100+ Pipes, or create and share your own custom solutions within your organization.
Most users should use the Docker image for simplicity and ease of use.
Docker: Ensure Docker is installed and running on your system.
Print out the help message:
docker run -it --rm atlassian/bitbucket-pipelines-importer --help
Run the Docker Container:
docker run -it -v ${PWD}:/files --rm atlassian/bitbucket-pipelines-importer migrate jenkins -i /files/Jenkinsfile -o /files/bitbucket-pipelines.yml
If you’re unfamiliar with how Docker volume mounts work: in ${PWD}:/files, the left-hand side of the colon (the current directory on the host machine) is mounted into the right-hand side (the /files directory inside the Docker container).
This allows the CLI within the container to read files from and write files to the current directory. Because the current directory is mounted at /files, any input or output paths must be prefixed with /files so that the CLI inside the container can find the files, even though they live in the current directory of the host machine.
Sample Jenkinsfile
pipeline {
    agent {
        docker { image 'alpine:3.20' }
    }
    stages {
        stage('Back-end') {
            agent {
                docker { image 'maven:3.9.9-eclipse-temurin-21-alpine' }
            }
            steps {
                sh 'mvn --version'
            }
        }
        stage('Front-end') {
            agent {
                docker { image 'node:20.18.0-alpine3.20' }
            }
            steps {
                sh 'node --version'
            }
        }
    }
}
Check the output directory for the generated YAML file.
Sample Bitbucket Pipelines YAML file output
image: alpine:3.20
pipelines:
  default:
    - step:
        name: Back-end
        image: maven:3.9.9-eclipse-temurin-21-alpine
        script:
          - sh -c 'mvn --version'
    - step:
        name: Front-end
        image: node:20.18.0-alpine3.20
        script:
          - sh -c 'node --version'
For users who wish to extend or customise the tool.
Java Development Kit (JDK): Version 21 or higher.
Apache Maven: Version 3.6.0 or higher.
Docker: For users opting to use the Docker image.
Clone the Repository:
git clone https://bitbucket.org/bitbucketpipelines/bitbucket-pipelines-importer.git
Navigate into the directory
cd bitbucket-pipelines-importer
Build the Project:
mvn package spring-boot:repackage
Running the Tool
java -jar target/bitbucket-pipelines-importer-0.0.3.jar --help
Build the Docker image
docker build -t bitbucket-pipelines-importer .
Running the tool
docker run -it --rm bitbucket-pipelines-importer --help
Input: the Jenkins pipeline Groovy file, specified as the path to the input file.
Output: the Bitbucket Pipelines YAML file, specified as the path to the output file.
External developers are welcome to contribute to the tool and add support for additional Jenkins plugins. In most cases, though, we expect users to fork the migration tool repository when using source code and make changes privately to support their custom use cases.
The tool's pluggable architecture allows users to:
Define Custom Plugin Translation Handlers: Add support for custom Jenkins plugins not covered by the default implementation.
Override Existing Plugins: Replace the default translation of a plugin with a custom implementation
Create a New Plugin Handler:
Implement the PluginMappingStrategy interface provided by the tool.
public class CustomPluginHandler implements PluginMappingStrategy {
    @Override
    public PluginMappingResult map(MappingContext context, JenkinsStep jenkinsStep) {
        return ...;
    }
}
Register the Handler:
Register your custom handler in the Spring context by defining it as a Spring component, declare which commands it supports, and set the plugin precedence.
@Component
@Order(BeanPrecedence.DEFAULT_MID_IMPLEMENTATION)
@SupportedCommands({"custom"})
public class CustomPluginHandler implements PluginMappingStrategy {
Create a Custom Implementation:
Implement a new handler for the existing plugin you wish to override.
Assign Higher Priority:
Use the @Order annotation to give your handler a higher priority.
public interface BeanPrecedence {
    int CUSTOM_IMPLEMENTATION = 1;
    int DEFAULT_MID_IMPLEMENTATION = 10;
    int DEFAULT_LOW_FALLBACK = 100;
}

@Component
@Order(BeanPrecedence.CUSTOM_IMPLEMENTATION)
@SupportedCommands({"sh"})
public class CustomPluginHandler implements PluginMappingStrategy {
Below are several examples and syntax comparisons for manually migrating from Jenkins to Bitbucket Pipelines.
Jenkins (Groovy):

pipeline {
    agent any
    stages {
        stage('Example') {
            steps {
                echo 'Hello World'
            }
        }
    }
}

Bitbucket Pipelines (YAML):

image: atlassian/default-image:4
pipelines:
  default:
    - step:
        name: Example
        script:
          - echo 'Hello World'
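If your Jenkins setup relied on multibranch behaviour, steps can also be scoped to specific branches rather than running on every push. A minimal sketch (the branch name is illustrative):

```yaml
image: atlassian/default-image:4
pipelines:
  default:
    - step:
        name: Example
        script:
          - echo 'Hello World'
  branches:
    main:                  # runs only on pushes to main (illustrative branch name)
      - step:
          name: Main-only build
          script:
            - echo 'Building main'
```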
Jenkins (Groovy):

pipeline {
    agent any
    stages {
        stage('Parallel Steps') {
            parallel {
                stage('Build and Test') {
                    steps {
                        echo "Your build and test goes here..."
                    }
                }
                stage('Lint') {
                    steps {
                        echo "Your linting goes here..."
                    }
                }
                stage('Security scan') {
                    steps {
                        echo "Your security scan goes here..."
                    }
                }
            }
        }
    }
}

Bitbucket Pipelines (YAML):

image: atlassian/default-image:4
pipelines:
  default:
    - parallel:
        - step:
            name: 'Build and Test'
            script:
              - echo "Your build and test goes here..."
        - step:
            name: 'Lint'
            script:
              - echo "Your linting goes here..."
        - step:
            name: 'Security scan'
            script:
              - echo "Your security scan goes here..."
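Bitbucket Pipelines can also abort the remaining parallel steps as soon as one fails. A sketch using the fail-fast option (verify the exact syntax against the current Pipelines reference):

```yaml
pipelines:
  default:
    - parallel:
        fail-fast: true      # stop remaining parallel steps when one fails
        steps:
          - step:
              name: 'Build and Test'
              script:
                - echo "Your build and test goes here..."
          - step:
              name: 'Lint'
              script:
                - echo "Your linting goes here..."
```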
When migrating from Jenkins to Bitbucket Pipelines, it's important to note that existing variables and secrets used in Jenkins need to be manually recreated in Bitbucket Pipelines. The two systems handle environment variables differently, and there is no automatic transfer of these credentials. For more details on how to manage variables and secrets in Bitbucket Pipelines, refer to Variables and secrets in the Bitbucket Cloud support documentation.
Jenkins (Groovy):

pipeline {
    agent any
    stages {
        stage('Example') {
            steps {
                echo "Running ${env.BUILD_ID} on ${env.JENKINS_URL}"
            }
        }
    }
}

Bitbucket Pipelines (YAML):

image: atlassian/default-image:4
pipelines:
  default:
    - step:
        name: Example
        script:
          - echo "Running $BITBUCKET_BUILD_NUMBER on $BITBUCKET_GIT_HTTP_ORIGIN"
The following Pipeline code shows an example of how to create a Pipeline using environment variables for secret text credentials.
Jenkins (Groovy):

pipeline {
    agent any
    environment {
        AWS_ACCESS_KEY_ID = credentials('jenkins-aws-secret-key-id')
        AWS_SECRET_ACCESS_KEY = credentials('jenkins-aws-secret-access-key')
    }
    stages {
        stage('Example stage 1') {
            steps {
                // logic requiring credentials
            }
        }
    }
}

Bitbucket Pipelines (YAML):

image: atlassian/default-image:4
pipelines:
  default:
    - step:
        name: Example
        script:
          - pipe: atlassian/bitbucket-upload-file:0.7.1
            variables:
              BITBUCKET_ACCESS_TOKEN: $BITBUCKET_ACCESS_TOKEN
              FILENAME: 'package.json'
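In Bitbucket Pipelines, the closest equivalent of Jenkins credentials is a secured repository or workspace variable defined in the UI and referenced by name in the YAML. A minimal sketch, assuming hypothetical variables named DEPLOY_USER and DEPLOY_TOKEN have been created under Repository settings:

```yaml
pipelines:
  default:
    - step:
        name: Deploy
        script:
          - echo "Deploying as $DEPLOY_USER"  # DEPLOY_USER: repository variable set in the UI
          - ./deploy.sh "$DEPLOY_TOKEN"       # DEPLOY_TOKEN: secured variable, masked in logs
```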
Jenkins (Groovy):

pipeline {
    agent any
    stages {
        stage('Example') {
            options {
                timeout(time: 1, unit: 'HOURS')
            }
            steps {
                echo 'Hello World'
            }
        }
    }
}

Bitbucket Pipelines (YAML):

image: atlassian/default-image:4
pipelines:
  default:
    - step:
        name: Example
        script:
          - echo 'Hello World'
        max-time: 60 # Timeout in minutes
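A timeout can also be applied once to every step via the global options block, with individual steps overriding it where needed:

```yaml
options:
  max-time: 60          # default timeout in minutes for every step
pipelines:
  default:
    - step:
        name: Example
        script:
          - echo 'Hello World'
    - step:
        name: Long-running step
        max-time: 120    # overrides the global default for this step
        script:
          - echo 'Slow task'
```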
Jenkins (Groovy):

pipeline {
    agent any
    triggers {
        cron('H 0 * * *') // This cron expression triggers the pipeline once every day at midnight
    }
    stages {
        stage('Example') {
            steps {
                echo 'Hello World'
            }
        }
    }
}

Bitbucket Pipelines: Scheduled builds are configured in the user interface, where you can set up and manage your build schedules and specify the frequency and timing of builds.
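A schedule created in the UI runs an existing pipeline definition, so a Jenkins cron trigger typically maps to a custom pipeline that the schedule targets. A minimal sketch (the pipeline name nightly is illustrative):

```yaml
pipelines:
  custom:
    nightly:             # select this pipeline when creating the schedule in the UI
      - step:
          name: Nightly build
          script:
            - echo 'Hello World'
```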
Jenkins (Groovy):

pipeline {
    agent any
    tools { nodejs "nodejs" }
    environment {
        HEROKU_API_KEY = credentials('jenkins-heroku-api-key')
        HEROKU_APP_NAME = credentials('jenkins-heroku-app-name')
    }
    stages {
        stage('Build') {
            steps {
                sh 'npm install'
                sh 'npm run build'
                sh 'zip -r build.zip build/'
            }
        }
        stage('Test') {
            steps {
                sh 'npm test'
            }
        }
        stage('Deploy to Heroku') {
            steps {
                script {
                    def deploymentUrl = "https://api.heroku.com/sources"
                    def response = sh(script: """
                        curl -X POST \
                            -H "Content-Type: application/json" \
                            -H "Accept: application/vnd.heroku+json; version=3" \
                            -H "Authorization: Bearer ${HEROKU_API_KEY}" \
                            "${deploymentUrl}"
                    """, returnStdout: true).trim()
                    def jsonSlurper = new groovy.json.JsonSlurper()
                    def parsedResponse = jsonSlurper.parseText(response)
                    def putUrl = parsedResponse.source_blob.put_url
                    def getUrl = parsedResponse.source_blob.get_url
                    sh """
                        curl "${putUrl}" \
                            -X PUT \
                            -H "Content-Type:" \
                            --data-binary @build.zip
                    """
                    sh """
                        curl -X POST \
                            -H "Content-Type: application/json" \
                            -H "Accept: application/vnd.heroku+json; version=3" \
                            -H "Authorization: Bearer ${HEROKU_API_KEY}" \
                            -d '{"source_blob":{"url":"${getUrl}","version":"${BUILD_NUMBER}"}}' \
                            "https://api.heroku.com/apps/${HEROKU_APP_NAME}/builds"
                    """
                }
            }
        }
    }
}

Bitbucket Pipelines (YAML):

image: atlassian/default-image:4
pipelines:
  default:
    - step:
        name: Build
        script:
          - npm install
          - npm run build
          - zip -r build.zip build/
        artifacts:
          - build.zip
    - step:
        name: Test
        caches:
          - node
        script:
          - npm install
          - npm test
    - step:
        name: Deploy to Heroku
        services:
          - docker
        script:
          - pipe: atlassian/heroku-deploy:2.4.0
            variables:
              HEROKU_API_KEY: $HEROKU_API_KEY
              HEROKU_APP_NAME: $HEROKU_APP_NAME
              ZIP_FILE: 'build.zip'
              WAIT: 'true'