Bitbucket salesforce

Supported triggers and actions

  • Bitbucket triggers, actions, and search
    New Commit

    Triggers when a new commit is added to a selected repo.

    Trigger

  • Bitbucket triggers, actions, and search
    New Event

    Triggers on the event stream of a selectable repo.

    Trigger

  • Bitbucket triggers, actions, and search
    New Issue

    Triggers when a new issue is added in a selected repository.

    Trigger

  • Bitbucket triggers, actions, and search
    New Repo

    Triggers when a new repository is added.

    Trigger

  • Bitbucket triggers, actions, and search
    Create Issue

    Creates an issue on any repository you specify.

    Action

  • Bitbucket triggers, actions, and search
    Find Issue

    Finds an issue whose title matches a search criterion.

    Action

  • Bitbucket triggers, actions, and search
    Find or Create Issue

    Finds an issue by title; optionally creates one if not found.

    Action

  • Salesforce triggers, actions, and search
    New Outbound Message

    Triggers when a new outbound message is received in Salesforce.

    Trigger

  • Salesforce

    Triggers when a record of the specified Salesforce object (e.g. Contact, Lead, Opportunity) is created.

    Scheduled

    Trigger

  • Salesforce

    Triggers when a field of your choosing (e.g. email address, status) is updated on any record of a specified Salesforce object (e.g. Contact, Lead, Opportunity).

    Scheduled

    Trigger

  • Salesforce

    Triggers when any record of the specified Salesforce object (e.g. Contact, Lead, Opportunity) is updated.

    Scheduled

    Trigger

  • Salesforce

    Adds an existing contact to an existing campaign.

    Scheduled

    Action

  • Salesforce

    Adds an existing lead to an existing campaign.

    Scheduled

    Action

  • Salesforce

    Creates a new attachment (max 25MB).

    Scheduled

    Action

  • Salesforce

    Creates a new record of a specified Salesforce object (e.g. Contact, Lead, Opportunity).

    Scheduled

    Action

  • Salesforce

    Updates an existing record of a specified Salesforce object (e.g. Contact, Lead, Opportunity).

    Scheduled

    Action

  • Salesforce

    Finds a record of a specified Salesforce object by a field and value you choose.

    Scheduled

    Action

  • Salesforce

    Finds a record of a specified Salesforce object by a field and value you choose (with line-item support).

    Scheduled

    Action

  • Salesforce

    Finds a record of a Salesforce object using a Salesforce Object Query Language (SOQL) WHERE clause.

    Scheduled

    Action

  • Salesforce

    Find a record. Optionally create a record if not found.

    Scheduled

    Action

Source: https://zapier.com/apps/bitbucket/integrations/salesforce

sfdx-bitbucket-package

For a fully guided walkthrough of setting up and configuring continuous integration using scratch orgs and Salesforce CLI, see the Continuous Integration Using Salesforce DX Trailhead module.

This repository shows one way you can successfully use scratch orgs to create new package versions with Bitbucket Pipelines. We make a few assumptions in this README. Continue only if you have completed these critical configuration prerequisites.

Getting Started

  1. Fork this repo to your GitHub account using the fork link at the top of the page.

  2. Clone your forked repo locally:

  3. Make sure that you have the Salesforce CLI installed. Run sfdx with no arguments and confirm you see the command output. If you don't have it installed, download and install it from here.

  4. Set up a JWT-based auth flow for the target orgs that you want to deploy to. This step creates a server.key file that is used in subsequent steps. (https://developer.salesforce.com/docs/atlas.en-us.sfdx_dev.meta/sfdx_dev/sfdx_dev_auth_jwt_flow.htm)

  5. Confirm that you can perform a JWT-based auth:

    Note: For more info on setting up JWT-based auth, see Authorize an Org Using the JWT-Based Flow in the Salesforce DX Developer Guide.
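A sketch of that check follows; the placeholders and the HubOrg alias are illustrative, not values taken from this repo.

```shell
# Sketch only -- <consumer key> and <Dev Hub username> come from your connected
# app and org; server.key is the private key generated in the previous step.
sfdx force:auth:jwt:grant \
  --clientid "<consumer key>" \
  --username "<Dev Hub username>" \
  --jwtkeyfile assets/server.key \
  --setdefaultdevhubusername --setalias HubOrg
```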

  6. Encrypt and store the generated server.key. IMPORTANT! Don't store the unencrypted server.key within the project.

  • First, generate a key and initialization vector (iv) to encrypt your server.key file locally. The key and iv are used by Bitbucket Pipelines to decrypt your server key in the build environment.
$ openssl enc -aes-256-cbc -k <passphrase here> -P -md sha1 -nosalt
key=E5E9FA1BA31ECD1AE84F75CAAA474F3A663F05F412028F81DA65D26EE56424B2
iv =E93DA465B309C53FEC5FF93C9637DA58

Make note of the key and iv values output to the screen. You'll use the values following key= and iv= to encrypt your server.key.

  • Encrypt the server.key using the newly generated key and iv values. Use the key and iv values only once. Don't use them to encrypt anything more than the server.key. While you could re-use this pair to encrypt other things, it's considered a security violation to do so. Every time you run the command above, it generates a new key and iv pair. You can't regenerate the same pair. If you lose these values, generate new ones and encrypt again.
openssl enc -nosalt -aes-256-cbc -in assets/server.key -out assets/server.key.enc -base64 -K <key from above> -iv <iv from above>

This step replaces the existing server.key.enc with your encrypted version.

  • Store the key and iv values somewhere safe. You'll use them in a subsequent step in the Bitbucket Pipelines UI. These values are considered secret, so please treat them as such.
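The decrypt step the pipeline eventually runs is just the inverse of the encrypt command above. The following self-contained sketch demonstrates the round trip with throwaway values; in real use, keep the key/iv pair printed by the -P step and a real private key.

```shell
# Generate a throwaway key/iv pair for illustration.
KEY=$(openssl rand -hex 32)
IV=$(openssl rand -hex 16)

# Stand-in for the real private key, so this sketch runs end to end.
printf 'fake private key material\n' > server.key

# Encrypt locally (same shape as the command above).
openssl enc -nosalt -aes-256-cbc -in server.key -out server.key.enc -base64 -K "$KEY" -iv "$IV"

# Decrypt, as the pipeline would before authenticating.
openssl enc -nosalt -aes-256-cbc -d -in server.key.enc -out server.key.dec -base64 -K "$KEY" -iv "$IV"

# The decrypted file must match the original byte for byte.
cmp server.key server.key.dec && echo "round trip OK"
```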
  1. From your JWT-based connected app on Salesforce, retrieve the consumer key generated from your Dev Hub org.

  2. Using the Bitbucket Pipelines UI, set your consumer key in one Bitbucket Pipelines environment variable and your Dev Hub username in another. Note that this username is the username that you use to access your Dev Hub.

  3. Store the key and iv values used above in two more Bitbucket Pipelines environment variables. When finished, all four values should appear on the environment variables setup screen.

  1. Copy all the contents of into and save.

  2. Create the sample package:

  3. Create the first package version:

  4. In your pipeline configuration: update the package ID variable to be the package ID in your sfdx-project.json file.

  5. Commit the updated files.

Now you're ready to go! When you commit and push a change, your change kicks off a Bitbucket Pipelines build.

Enjoy!

Contributing to the Repository

If you find any issues or opportunities for improving this repository, fix them! Feel free to contribute to this project by forking this repository and making changes to the content. Once you've made your changes, share them back with the community by sending a pull request. See How to send pull requests for more information about contributing to GitHub projects.

Reporting Issues

If you find any issues with this demo that you can't fix, feel free to report them in the issues section of this repository.

Source: https://github.com/forcedotcom/sfdx-bitbucket-package
Ivano Guerini is a Salesforce Senior Developer at Webresults, part of Engineering Group, since 2015.
He started his career on Salesforce during his university studies and based his final thesis on it.
He's passionate about technology and development; in his spare time he enjoys developing applications, mainly on Node.js.


In this article, I’m going to walk you through the steps to set up CI with Salesforce DX.

For this, I decided to take advantage of Bitbucket and its integrated tool, Bitbucket Pipelines.

This choice was not made after a comparison of the various version control systems and CI tools; it is driven by business needs for which we decided to fully embrace cloud solutions, in particular the Atlassian suite, of which Bitbucket is part.

What is Continuous Integration?

In software engineering, continuous integration (often abbreviated to CI) is a practice applied in contexts where software development takes place through a versioning system. It consists of frequently aligning the developers' work environments with a shared environment.

In particular, it is generally assumed that automatic tests have been prepared that developers can execute immediately before releasing their contributions to the shared environment, so as to ensure that the changes do not introduce errors into the existing software.

Let’s apply this concept to our Salesforce development process using sfdx.

First of all, we have a production org where we want to deploy and maintain the application; then we typically have one or more sandboxes, such as for UAT, integration testing, and development.

With sfdx, we also have the concept of the scratch org: disposable, preconfigured organizations where we, as developers, can deploy and test our work before pushing it into the deployment process.

In the image below you can see an approach to CI with Salesforce DX. Once a developer has finished a feature, he can push it to the main developer branch; from there the CI takes place, creating a scratch org to run automated tests, such as Apex unit tests or even Selenium-like test automation. If there are no errors, the dev can create a pull request, moving forward in the deployment process.
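As a sketch, the validation flow just described maps to a handful of CLI calls. The ci-scratch alias and the wait times below are arbitrary choices, and an already-authorized Dev Hub is assumed:

```shell
# Create a disposable scratch org (1-day lifetime) and make it the default.
sfdx force:org:create -s -f config/project-scratch-def.json -a ci-scratch -d 1

# Push the feature-branch source to it.
sfdx force:source:push -u ci-scratch

# Run the Apex unit tests, waiting up to 10 minutes for results.
sfdx force:apex:test:run -u ci-scratch -r human -w 10

# Throw the org away once the checks pass.
sfdx force:org:delete -u ci-scratch -p
```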

In this article, I’ll show you how to set up all the required tools and as an example, we will only set up an auto-deploy to our Salesforce org on every git push operation.

Toolbox

Let’s start with a brief description of the tools we’re going to use:

  • Git – is a version control system for tracking changes in files and coordinating work on those files across the team. All metadata items, whether modified on the server or locally, are tracked via Git. This provides us with a version history as well as traceability.
  • Bitbucket – is a cloud-based GIT server from Atlassian used for hosting our repository. It provides a UI to navigate the GIT repository and has many additional features like pull requests. These are used for approving and merging changes.
  • Docker – provides a way to run applications securely, packaged with all its dependencies and libraries. So, we will be using it to create an environment for running sfdx commands.
  • Bitbucket Pipelines – is an add-on for Bitbucket cloud that will allow us to kick off deployments and validations when updates are made to the branches in Bitbucket.

If you have always worked in Salesforce, it's quite possible that Docker containers sound alien to you. So what is Docker? In simple terms, Docker can be thought of as a virtual machine in the cloud. Docker provides an environment in the cloud where applications can run. Bitbucket Pipelines supports Docker images for running the continuous integration scripts. So, instead of installing sfdx on your local system, you'd specify it to be installed in your Docker image, so that our CI scripts can run.

Create a developer Org and enable the DevHub

We made a brief introduction about what CI is and the tools we’re going to use, now it’s time to get to the heart of it and start configuring our tools. Starting from our Salesforce Org.

We are going to enable the devhub to be able to work with sfdx and we are going to set up a connected app that allows us to handle the login process inside our docker container.

For this article, I created a dedicated developer Org in order to have a clean environment.

We can do this simply filling out the form from the Salesforce site: https://developer.salesforce.com/signup and complete the registration process.

In this way, we will obtain a new environment on which to perform all the tests we want.

Let’s go immediately to enable the DevHub: Setup → Development → DevHub click on the Enable DevHub toggle.

Once enabled it can’t be disabled but this is a requirement to be able to work with SFDX.

Now you can install the sfdx CLI tool on your computer.

Create a connected app

Now that we have our new org and the sfdx CLI installed, we can run sfdx commands that make it easy for us to manage the entire application development life cycle from the command line, including creating scripts that facilitate automation.

However, our CI will run in a separate environment over which we don't have direct control, such as for logging in. So we need a way to manage the authorization process inside the Docker container when the CI automation job runs.

To do this we'll use the OAuth JSON Web Token (JWT) bearer flow that's supported in the Salesforce CLI. This OAuth flow gives you the ability to authenticate using the CLI without having to log in interactively. This headless flow is perfect for automated builds and scripting.

Create a Self-Signed SSL Certificate and Private Key

For a CI solution to work, you’ll generate a private key for signing the JWT bearer token payload, and you’ll create a connected app in the Dev Hub org that contains a certificate generated from that private key.

To create an SSL certificate you need a private key and a certificate signing request. You can generate these files using OpenSSL CLI with a few simple commands.

If you use a Unix-based system, you can install the OpenSSL CLI from the official OpenSSL website.

If you use Windows instead, you can download an installer from Shining Light Productions, although there are plenty of alternatives.

We will follow some specific commands to create a certificate for our needs; if you want to better understand how OpenSSL works, you can find a handy guide in this article.

  1. Create a folder on your PC to store the generated files
    mkdir certificates
  2. Generate an RSA private key
    openssl genrsa -des3 -passout pass:<password> -out server.pass.key 2048
  3. Create a key file from the server.pass.key file using the same password from before:
    openssl rsa -passin pass:<password> -in server.pass.key -out server.key
  4. Delete the server.pass.key:
    rm server.pass.key
  5. Request and generate the certificate, when prompted for the challenge password press enter to skip the step:
    openssl req -new -key server.key -out server.csr
  6. Generate the SSL certificate:
    openssl x509 -req -sha256 -days 365 -in server.csr -signkey server.key -out server.crt

The self-signed SSL certificate is generated from the server.key private key and server.csr files.
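The six steps above can be run non-interactively as a single script. The passphrase and the -subj subject fields below are placeholder values to replace with your own (the -subj flag pre-fills the CSR prompts so no interaction is needed):

```shell
# Step 1: a folder for the generated files.
mkdir -p certificates

# Steps 2-4: generate a passphrase-protected RSA key, strip the passphrase, delete the original.
openssl genrsa -des3 -passout pass:changeit -out certificates/server.pass.key 2048
openssl rsa -passin pass:changeit -in certificates/server.pass.key -out certificates/server.key
rm certificates/server.pass.key

# Steps 5-6: create the signing request, then self-sign the certificate for one year.
openssl req -new -key certificates/server.key -out certificates/server.csr -subj "/C=US/ST=CA/O=Example/CN=sfdx-ci"
openssl x509 -req -sha256 -days 365 -in certificates/server.csr -signkey certificates/server.key -out certificates/server.crt

# Sanity check: print the subject of the freshly signed certificate.
openssl x509 -in certificates/server.crt -noout -subject
```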

Create the Connected App

The next step is to create a connected app on Salesforce that includes the certificate we just created.

  1. From Setup, enter App Manager in the Quick Find box, then select App Manager.
  2. Click New Connected App.
  3. Enter the connected app name and your email address:
    1. Connected App Name: sfdx ci
    2. Contact Email: <your email address>
  4. Select Enable OAuth Settings.
  5. Enter the callback URL: http://localhost:1717/OauthRedirect
  6. Select Use digital signatures.
  7. To upload your server.crt file, click Choose File.
  8. For OAuth scopes, add:
    • Access and manage your data (api)
    • Perform requests on your behalf at any time (refresh_token, offline_access)
    • Provide access to your data via the Web (web)
  9. Click Save.

Edit Policies to avoid authorization step

After you’ve saved your connected app, edit the policies to enable the connected app to circumvent the manual login process.

  1. Click Manage.
  2. Click Edit Policies.
  3. In the OAuth policies section, for Permitted Users select Admin approved users are pre-authorized, then click OK.
  4. Click Save.

Create a Permission Set

Lastly, create a permission set and assign pre-authorized users for this connected app.

  1. From Setup, enter Permission in the Quick Find box, then select Permission Sets.
  2. Click New.
  3. For the Label, enter: sfdx ci
  4. Click Save.
  5. Click sfdx ci | Manage Assignments | Add Assignments.
  6. Select the checkbox next to your Dev Hub username, then click Assign | Done.
  7. Go back to your connected app.
    1. From Setup, enter App Manager in the Quick Find box, then select App Manager.
    2. Next to sfdx ci, click the list item drop-down arrow, then select Manage.
    3. In the Permission Sets section, click Manage Permission Sets.
    4. Select the checkbox next to sfdx ci, then click Save.

Test the JWT Auth Flow

Open your Dev Hub org.

  • If you already authorized the Dev Hub, open it:
    sfdx force:org:open -u DevHub
  • If you haven’t yet logged in to your Dev Hub org:
    sfdx force:auth:web:login -d -a DevHub

Adding the -d flag sets this org as the default Dev Hub. To set an alias for the org, use the -a flag with an argument.

To test the JWT auth flow you’ll use some of the information that we asked you to save previously. We’ll use the consumer key that was generated when you created the connected app (CONSUMER_KEY), the absolute path to the location where you generated your OpenSSL server.key file (JWT_KEY_FILE) and the username for the Dev Hub (HUB_USERNAME).

  1. On the command line, create these three session-based environment variables:
    export CONSUMER_KEY=<connected app consumer key>
    export JWT_KEY_FILE=../certificates/server.key
    export HUB_USERNAME=<your Dev Hub username>


    These environment variables facilitate running the JWT auth command.
  2. Enter the following command as-is on a single line:
    sfdx force:auth:jwt:grant --clientid ${CONSUMER_KEY} --username ${HUB_USERNAME} --jwtkeyfile ${JWT_KEY_FILE} --setdefaultdevhubusername

This command logs in to the Dev Hub using only the consumer key (client ID), the username, and the JWT key file. And best of all, it doesn’t require you to interactively log in, which is important when you want your scripts to run automatically.

Congratulations, you’ve created your connected app and you are able to login using it with the SFDX CLI.

Set up your development environment

In this section we will configure our local environment, creating a remote repository in Bitbucket and linking it to our local sfdx project folder.

If you are already familiar with these steps you can skip and pass directly to the next section.

Create a Git Repository on Bitbucket

If you don’t have a bitbucket account, you can create a free one registering to the following link: https://bitbucket.org/account/signup/

Just insert your email and follow the first registration procedure.

Once logged in you will be able to create a new git repository from the plus button on the right menu.

You will be prompted with a window like the following; just insert a name for the repository (in my case I'll name it sfdx-ci), leaving Git selected as the version control system.

We're in, but our repo is totally empty. Bitbucket provides some quick commands to initialize our repo; select the clone command:

git clone https://username@bitbucket.org/username/sfdx-ci.git

Move to your desktop, open the command line tool, then paste and execute the git clone command. This command will create a folder, named like the Bitbucket repository, already linked to it as a remote.

Initialize SFDX project

Without moving from our position, execute the sfdx project create command:
sfdx force:project:create -n sfdx-ci

Using the -n parameter gives the project the same name as the folder we just cloned from git.

Try deploy commands

Before we pass to configure our CLI operations let’s try to do it in our local environment.

First of all, we must create our sfdx project.

The general sfdx deployment flow into a sandbox or production org is:

  1. Convert from source form to metadata api form
    sfdx force:source:convert -d <target directory>
  2. Use the metadata api to deploy
    sfdx force:mdapi:deploy -d <same directory as step 1> -u <username or alias>

These commands are the same ones we are going to use inside our Bitbucket Pipelines; you can try them in your local environment to see how they work.

Set up Continuous Integration

In previous sections, we talked mostly about common Salesforce project procedures. In the next ones, we go deeper into the CI world, starting with a brief introduction to Docker and Bitbucket Pipelines.

Lastly, we’ll see how to create a Docker image with SFDX CLI installed and how to use it in our pipeline to run sfdx deploy commands.

Docker

Wikipedia defines Docker as

an open-source project that automates the deployment of software applications inside containers by providing an additional layer of abstraction and automation of OS-level virtualization on Linux.

In simpler words, Docker is a tool that allows developers, sys-admins, etc. to easily deploy their applications in a sandbox (called containers) to run on the host operating system i.e. Linux. The key benefit of Docker is that it allows users to package an application with all of its dependencies into a standardized unit for software development.

Docker Terminology

Before we go further, let me clarify some terminology that is used frequently in the Docker ecosystem.

  • Images – The blueprints of our application which form the basis of containers.
  • Containers – Containers offer a logical packaging mechanism in which applications can be abstracted from the environment in which they actually run.
  • Docker Daemon – The background service running on the host that manages building, running and distributing Docker containers. The daemon is the process that runs in the operating system and that clients talk to.
  • Docker Client – The command line tool that allows the user to interact with the daemon.
  • Docker Hub – A registry of Docker images. You can think of the registry as a directory of all available Docker images.
  • Dockerfile – A Dockerfile is a simple text file that contains a list of commands that the Docker client calls while creating an image. It’s a simple way to automate the image creation process. The best part is that the commands you write in a Dockerfile are almost identical to their equivalent Linux commands.

Build our personal Docker Image with SFDX CLI installed

Most Dockerfiles start from a parent image. If you need to completely control the contents of your image, you might need to create a base image instead. A parent image is an image that your image is based on. It refers to the contents of the FROM directive in the Dockerfile. Each subsequent declaration in the Dockerfile modifies this parent image.

Most Dockerfiles start from a parent image rather than a base image; this will be our case, as we will start from a Node parent image.

Create a folder on your machine and create a file named Dockerfile, and paste the following code:

FROM node:lts-alpine
RUN apk add --update --no-cache git openssh ca-certificates openssl curl
RUN npm install sfdx-cli --global
RUN sfdx --version
USER node

Let’s explain what this code means, in order:

  1. We use a Node parent image, which comes with NPM and Node.js preinstalled. This is the official Node.js Docker image; the lts-alpine tag selects the latest LTS release on Alpine Linux, whose apk package manager we use in the next step;
  2. Next, with the apk add command we are going to install some additional utility tools mainly git and openssl to handle sfdx login using certificates;
  3. Then, using the npm command, we install the SFDX CLI tools;
  4. Just a check for the installed version;
  5. And finally the USER instruction sets the user name to use when running the image.

Now we have to build our image and publish it to Docker Hub, so it's ready to use in our Pipelines.

  1. Create an account on Docker Hub.
  2. Download and install Docker Desktop. If on Linux, download Docker Engine – Community
  3. Login to Docker Hub with your credentials. 
    docker login --username=yourhubusername --password=yourpassword
  4. Build your Docker image
    docker build -t <your_username>/sfdxci .
  5. Test your docker image locally:
    docker run <your_username>/sfdxci
  6. Push your Docker image to your Docker Hub repository
    docker push <your_username>/sfdxci

Pushing a Docker image to Docker Hub makes it available for use in Bitbucket Pipelines.

Bitbucket Pipelines

Now that we have a working Docker image with sfdx installed, we can continue configuring the pipeline, which is the core of our CI procedure.

Bitbucket Pipelines is an integrated CI/CD service, built into Bitbucket. It allows you to automatically build, test and even deploy your code, based on a configuration file in your repository. Essentially, it creates containers in the cloud for you.

Inside these containers, you can run commands (like you might on a local machine) but with all the advantages of a fresh system, custom configured for your needs.

To set up Pipelines you need to create and configure the bitbucket-pipelines.yml file in the root directory of your repository. If you are working with branches, this file must be present in the root directory of each branch in order to be executed.

A bitbucket-pipelines.yml file looks like the following:

image: atlassian/default-image:2

pipelines:
  default:
    - step:
        script:
          - echo "Hello world default"
  branches:
    features/*:
      - step:
          script:
            - echo "Hello world feature branch"

There is a lot you can configure in the bitbucket-pipelines.yml file, but at its most basic the required keywords are:

  • image – the Docker image that will be used to create the Docker container. You can use the default image (atlassian/default-image:latest), but a personal one is preferred to avoid the time spent installing required tools (e.g. the SFDX CLI). To specify an image, use image: <your_dockerHub_account/repository_details>:<tag>
  • pipelines – contains all your pipeline definitions.
  • default – contains the steps that run on every push, unless they match one of the other sections.
  • branches – specify the name of a branch on which to run the defined steps, or use a glob pattern (to learn more about glob patterns, refer to the Bitbucket official guide).
  • step – each step starts a new Docker container with a clone of your repository, then runs the contents of your script section.
  • script – a list of cli commands that are executed in sequence.

Other than default and branches there are more section keywords to identify which steps must run, such as pull-request, but I refer you to the official documentation; we are going to use only these two.

Keep in mind that each step in your pipeline starts a separate Docker container, and the script runs the commands you provide in this environment with the repository folder available.

Configure SFDX deployment Pipelines

Before configuring our pipeline, let’s review for a moment the steps needed to deploy to a production org using sfdx cli.

First of all we need to log in to our SF org. To do so we created a Salesforce connected app that allows us to log in without any manual operation, simply using the following command:

sfdx force:auth:jwt:grant --clientid <CONSUMER_KEY> --username <SFDC_PROD_USER> --jwtkeyfile keys/server.key --setdefaultdevhubusername --setalias sfdx-ci --instanceurl <SFDC_PROD_URL>

As you can see there are three parameters that we have to set in this command line:

  • CONSUMER_KEY
  • SFDC_PROD_USER
  • SFDC_PROD_URL

Bitbucket offers a way to store variables that can be used in our pipelines, avoiding hard-coded values.

Under Bitbucket repository Settings → Pipelines → Repository Variables create three variables and fill them in with the data at your disposal.

Another parameter required by this command is the server.key file; in this case I simply added it to my repository under the keys folder.

It's not good practice, and I will move it to a more secure position, but for this demonstration it's enough.

Now that you are logged in, you need only two sfdx commands to deploy your metadata: one to convert your project to Metadata API format, and one to deploy it to the SF org:
sfdx force:source:convert -d mdapi
sfdx force:mdapi:deploy -d mdapi -u <SFDC_PROD_USER>

Like in the login command, we are going to use a pipeline variable for the target org username under the -u parameter.

OK, now that we know how to deploy an SFDX project, we can put all this into our pipeline.

Move to the root of your sfdx project, create the bitbucket-pipelines.yml file, and paste the following code (replace the image name with your own Docker image):

image: ivanoguerini/sfdx:latest

pipelines:
  default:
    - step:
        script:
          - echo $SFDC_PROD_URL
          - echo $SFDC_PROD_USER
          - sfdx force:auth:jwt:grant --clientid $CONSUMER_KEY --username $SFDC_PROD_USER --jwtkeyfile keys/server.key --setdefaultdevhubusername --setalias sfdx-ci --instanceurl $SFDC_PROD_URL
          - sfdx force:source:convert -d mdapi
          - sfdx force:mdapi:deploy -d mdapi -u $SFDC_PROD_USER

Commit and push these changes to the git repository.

Test the CI

OK, we have our CI up and running; let's do a quick test.

In your project create a new apex class and put some code in it. Then commit and push your changes.

git add .
git commit -m "Test CI"
git push

As we said, the pipeline will run on every push to the remote repository; you can check the running status under the Pipelines menu. You will see something like this:

As you know, the mdapi:deploy command is asynchronous, so to check whether there were errors during the deploy you have to run the mdapi:deploy:report command, specifying the jobId. Or, if you prefer, you can check the deploy directly in the Salesforce org under the Deployment section.
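A sketch of that check follows; the job ID is a placeholder printed by the deploy command itself, and the -w 0 flag makes the deploy return immediately instead of waiting:

```shell
# Kick off the deploy without waiting; the command prints a job ID.
sfdx force:mdapi:deploy -d mdapi -u $SFDC_PROD_USER -w 0

# Later, poll the result using that job ID (placeholder below).
sfdx force:mdapi:deploy:report -u $SFDC_PROD_USER -i <jobId>
```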

Conclusions

With this article I wanted to provide you with the necessary knowledge to start configuring a CI using the BitBucket Pipelines.

Obviously, what I showed you is not enough for a CI that can be used in an enterprise project; there is still a lot to do.

Here are some starting points to improve what we have seen:

  1. Store the server.key in a safe place so that it is not directly accessible from your repository.
  2. Manage the CI in the various sandbox environments used
  3. For the developer branch, consider automating the creation of a scratch org and running Apex unit tests.

But, I leave this to you.

Source: https://blog.enree.co/2019/06/setting-up-sfdx-continuous-integration-using-bitbucket-pipelines-with-docker-image.html
Salesforce and Bitbucket

Atlassian is one of our favorite companies. They build a wide variety of great developer tools, they have thoughtful values and they are a true unicorn in that they IPO’d via bootstrapping their business solely on customer revenue.

They've created a brilliantly complementary suite of products that covers the entire software development lifecycle. Many of us at Blue Canvas have used their products throughout our careers. We've used Confluence to come up with ideas and collaborate on solutions. We've used JIRA and Trello to make plans and prioritize work. And we've used Bamboo and Bitbucket to actually get code released. Atlassian's toolbelt rivals many of the enterprise platforms, like Microsoft's, in its completeness, but surpasses all of them in its design aesthetic and simplicity.

Atlassian and Salesforce

But on top of all of that, Atlassian has a very cool Salesforce development strategy.

Many of our customers are using Atlassian products across their stack to make their Salesforce teams more agile. Most often we hear about JIRA and Bitbucket. We’ve written before about how to leverage Salesforce and JIRA with Blue Canvas.

As a source control tool, we at Blue Canvas wanted to provide some more information on how you can integrate Blue Canvas with Bitbucket to make development a bit simpler for the specific Salesforce use case.

Challenges with Salesforce and Git

Due to the popularity of our blog post on how Atlassian does Salesforce development, we’re often asked how other companies can do this themselves without investing as heavily in a team to do it.

Whether you have Atlassian's development resources or not, the problem always starts with the difficulty of getting declarative Salesforce metadata into Git. This happens for a few reasons. First of all, getting Admins into a Git flow can bring its own challenges. Sometimes they are not comfortable with the command line. And even when they can use tools like SourceTree, they find themselves too busy to really adopt a proper Git flow for Salesforce. It's too tempting to just click save when you're adding a field or making a quick change in the Salesforce UI.

Furthermore, the Salesforce SOAP APIs and Ant migration tool really don’t do any favors for making this easier.

Using Blue Canvas to Get Changes into Git

Blue Canvas can help you get your Salesforce declarative metadata into Bitbucket. Under the hood, Blue Canvas is a standard Git repository. This makes it simple and intuitive to push from our repo into external repositories.

Since Blue Canvas automatically syncs admin changes into Git on save, we're effectively a nice piece of middleware that can get your Salesforce admin metadata into Git. If you have a Blue Canvas Git repository, this is happening automatically and in real time.

Once you’ve connected your Salesforce orgs to Blue Canvas, the next step is simple: just configure Bitbucket as an external remote and push to it. As you can see our UI provides instructions for pulling code down locally into a Git repo. You can then simply push to a branch in your Bitbucket repository. It’s that easy.

The next step of course is automating this. Adding this kind of automatic push is on our roadmap. In the meantime, you can schedule a job with Jenkins, Bitbucket Pipelines or Bamboo. Or you can set up a cron job.

Source: https://bluecanvas.io/blog/2018-11-15-bitbucket-and-salesforce
