Introduction
Continuous Integration and Continuous Delivery (CI/CD) accelerates development while improving quality. This guide walks through setting up Jenkins pipelines that automatically build, test, and deploy applications, from git commit through production deployment and automated rollback, including advanced strategies like blue-green deployments.
Jenkins Installation and Setup
Install Jenkins on Ubuntu:
wget -q -O - https://pkg.jenkins.io/debian-stable/jenkins.io-2023.key | sudo apt-key add -
sudo sh -c 'echo deb https://pkg.jenkins.io/debian-stable binary/ > /etc/apt/sources.list.d/jenkins.list'
sudo apt update
sudo apt install jenkins
(On current Ubuntu releases apt-key is deprecated; store the key under /usr/share/keyrings and reference it with a signed-by option instead.) Start the service with sudo systemctl enable jenkins and sudo systemctl start jenkins. Access http://localhost:8080 and retrieve the initial password with sudo cat /var/lib/jenkins/secrets/initialAdminPassword. Install the suggested plugins or choose a custom set: Git, Pipeline, Blue Ocean, Docker, Kubernetes, Slack Notification, Credentials Binding. Create the first admin user. Register JDK, Maven, Node.js, and Python installations under Global Tool Configuration.
Pipeline Concepts and Syntax
Jenkins Pipeline as Code (a Jenkinsfile stored in the repository) defines the entire build process. Declarative pipeline (recommended): pipeline { agent any; stages { stage('Build') { steps { echo 'Building...' } } stage('Test') { steps { echo 'Testing...' } } stage('Deploy') { steps { echo 'Deploying...' } } } post { always { echo 'Cleanup' } success { echo 'Success!' } failure { echo 'Failed!' } } }. Scripted pipeline (more flexible): node { stage('Build') { sh 'make build' } }. The Blue Ocean UI provides a visual pipeline editor and test visualization.
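Laid out as it would actually appear in a Jenkinsfile, the declarative skeleton above looks like this:

```groovy
pipeline {
    agent any
    stages {
        stage('Build')  { steps { echo 'Building...' } }
        stage('Test')   { steps { echo 'Testing...' } }
        stage('Deploy') { steps { echo 'Deploying...' } }
    }
    post {
        always  { echo 'Cleanup' }   // runs regardless of result
        success { echo 'Success!' }
        failure { echo 'Failed!' }
    }
}
```

Note that stages and post conditions are nested blocks, not comma-separated entries.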
Source Control Integration
Multibranch Pipeline automatically creates pipelines for each branch. Jenkinsfile at repository root. Git integration: checkout scm step. Branch detection: branch matching patterns (master, release/**, feature/*). Build triggers: Poll SCM (cron-based), GitHub webhooks (immediate), Bitbucket webhooks, GitLab webhooks. GitHub organization scanning: discovers all repositories in organization. Credentials: SSH keys (for git clone), personal access tokens (for GitHub API). For feature branches: build automatically on push, run unit tests, optionally deploy to ephemeral review environment.
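A minimal sketch of a multibranch Jenkinsfile that builds every branch but deploys a review environment only for feature branches; the deploy-review.sh script is a hypothetical placeholder for whatever provisions the ephemeral environment:

```groovy
pipeline {
    agent any
    stages {
        stage('Checkout')   { steps { checkout scm } }
        stage('Unit Tests') { steps { sh 'npm test' } }
        stage('Review Environment') {
            when { branch 'feature/*' }   // matches only feature branches
            steps {
                // Hypothetical script that spins up an ephemeral environment
                sh "./deploy-review.sh ${env.BRANCH_NAME}"
            }
        }
    }
}
```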
Build Stage: Compilation and Packaging
Stage examples for different stacks: Node.js: stage('Install') { sh "npm ci" }, stage('Build') { sh "npm run build" }, stage('Test') { sh "npm test" }, stage('Lint') { sh "npm run lint" }. Maven Java: stage('Compile') { sh "mvn compile" }, stage('Package') { sh "mvn package -DskipTests" }, stage('Unit Test') { sh "mvn test" }. Docker: stage('Build Image') { script { docker.build("myapp:${env.BUILD_ID}") } }, stage('Push Image') { docker.withRegistry('https://registry.example.com', 'docker-creds') { docker.image("myapp:${env.BUILD_ID}").push() } }. Python: stage('Install') { sh "pip install -r requirements.txt" }, stage('Test') { sh "pytest tests/" }, stage('Package') { sh "python setup.py sdist" }.
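The Docker stages above can be combined into one pipeline; this sketch assumes the Docker Pipeline plugin is installed, and the registry URL and the 'docker-creds' credential ID are placeholders from the text:

```groovy
pipeline {
    agent any
    stages {
        stage('Build Image') {
            steps {
                script {
                    // Tag with the Jenkins build ID for traceability
                    dockerImage = docker.build("myapp:${env.BUILD_ID}")
                }
            }
        }
        stage('Push Image') {
            steps {
                script {
                    // 'docker-creds' is a username/password credential stored in Jenkins
                    docker.withRegistry('https://registry.example.com', 'docker-creds') {
                        dockerImage.push()
                        dockerImage.push('latest')
                    }
                }
            }
        }
    }
}
```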
Testing Stage: Automated Quality Gates
Comprehensive testing strategy: Unit tests (fast, isolated), Integration tests (database, APIs), End-to-end tests (Selenium, Cypress), Contract tests (Pact), Performance tests (JMeter, k6). Parallel testing for speed: parallel { stage('Unit Tests') {...} stage('Integration Tests') {...} }. Collect test results: junit '**/target/surefire-reports/*.xml' (JUnit format). Publish coverage reports: publishHTML for Cobertura/JaCoCo output. Quality gates: fail the build if coverage is below 80% or any test fails. Integration with SonarQube for static analysis: withSonarQubeEnv('SonarQube') { sh "mvn sonar:sonar" }. Wait for the quality gate result: timeout(time: 1, unit: 'HOURS') { waitForQualityGate abortPipeline: true }.
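Putting these pieces together, a sketch of a test stage with parallel execution, result collection, and a SonarQube quality gate; the Maven goals and the 'SonarQube' server name (configured under Manage Jenkins) are illustrative:

```groovy
pipeline {
    agent any
    stages {
        stage('Tests') {
            parallel {
                stage('Unit Tests')        { steps { sh 'mvn test' } }
                stage('Integration Tests') { steps { sh 'mvn verify' } }
            }
        }
        stage('Static Analysis') {
            steps {
                // 'SonarQube' is the server name configured in Jenkins
                withSonarQubeEnv('SonarQube') { sh 'mvn sonar:sonar' }
            }
        }
        stage('Quality Gate') {
            steps {
                // Abort the pipeline if SonarQube reports a failed gate
                timeout(time: 1, unit: 'HOURS') {
                    waitForQualityGate abortPipeline: true
                }
            }
        }
    }
    post {
        // Record test results even when a stage fails
        always { junit '**/target/surefire-reports/*.xml' }
    }
}
```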
Artifact Management
Store build artifacts for deployment and audit. Built-in: stash/unstash for passing between stages (small artifacts). Nexus Repository (Sonatype): stage('Upload to Nexus') { nexusArtifactUploader artifacts: [ [artifactId: 'myapp', file: "target/myapp-${version}.war", type: 'war'] ], nexusUrl: 'nexus.example.com', repository: 'releases' }. Artifactory: rtUpload serverId: 'Artifactory', spec: '{ "files": [ { "pattern": "target/*.war", "target": "generic-local/myapp/${BUILD_NUMBER}/" } ] }'. Docker registry as artifact repository: tag images with git commit hash and latest. S3 bucket for large artifacts (database dumps, compiled assets). Version artifacts with git commit hash, build number, or semantic version from git tag.
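A sketch of stash/unstash passing a small artifact between stages that run on different agents; the agent labels and deploy.sh script are hypothetical:

```groovy
pipeline {
    agent none
    stages {
        stage('Build') {
            agent { label 'build' }        // hypothetical build agent label
            steps {
                sh 'mvn package -DskipTests'
                // Save the packaged artifact for a later stage
                stash name: 'war', includes: 'target/*.war'
            }
        }
        stage('Deploy') {
            agent { label 'deploy' }       // hypothetical deploy agent label
            steps {
                unstash 'war'              // restore the artifact on this agent
                sh './deploy.sh target/*.war'   // hypothetical deploy script
            }
        }
    }
}
```

Stash is intended for small files; larger artifacts belong in Nexus, Artifactory, a Docker registry, or S3 as described above.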
Deployment Strategies
Blue-Green Deployment: Keep two environments (blue = current, green = new). Deploy to green, run smoke tests, switch router. Zero downtime, easy rollback (switch back). Jenkins implementation: stage('Deploy to Green') { sh "ansible-playbook deploy-green.yml" }, stage('Smoke Test') { sh "curl -f http://green.example.com/health || exit 1" }, stage('Switch Router') { sh "aws elbv2 modify-listener --listener-arn ${listener_arn} --default-actions Type=forward,TargetGroupArn=${green_tg_arn}" }. Canary Deployment: deploy to 5% of traffic initially, monitor for errors, increment to 100% if successful. Implement with load balancer weight adjustments or service mesh (Istio, Linkerd). Rolling Update: gradually replace old pods with new ones (Kubernetes native). Feature flags: deploy code but hide features until toggled (LaunchDarkly, Flagr).
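The blue-green stages above, assembled into one pipeline; the ELB listener and target group ARNs are placeholders that would normally come from configuration:

```groovy
pipeline {
    agent any
    environment {
        // Placeholder ARNs; in practice these come from configuration
        LISTENER_ARN = 'arn:aws:elasticloadbalancing:region:account:listener/placeholder'
        GREEN_TG_ARN = 'arn:aws:elasticloadbalancing:region:account:targetgroup/green/placeholder'
    }
    stages {
        stage('Deploy to Green') {
            steps { sh 'ansible-playbook deploy-green.yml' }
        }
        stage('Smoke Test') {
            // curl -f exits non-zero on HTTP errors, failing the stage
            steps { sh 'curl -f http://green.example.com/health' }
        }
        stage('Switch Router') {
            steps {
                sh """
                    aws elbv2 modify-listener --listener-arn ${LISTENER_ARN} \
                        --default-actions Type=forward,TargetGroupArn=${GREEN_TG_ARN}
                """
            }
        }
    }
}
```

Rollback is the same router switch pointed back at the blue target group.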
Environment Management
Pipeline stages: Dev (feature branch), Staging (master branch), Production (release tag). Promote artifacts through environments. Credentials per environment: store in Jenkins Credentials Provider with different IDs (DB_PASSWORD_DEV, DB_PASSWORD_PROD). Use withCredentials step: withCredentials([string(credentialsId: 'DB_PASSWORD_PROD', variable: 'DB_PASSWORD')]) { sh "deploy.sh" }. Environment-specific variables in Jenkinsfile: if (env.BRANCH_NAME == 'master') { env.DEPLOY_ENV = 'staging' } else if (env.TAG_NAME =~ /^v\d+\.\d+\.\d+$/) { env.DEPLOY_ENV = 'production' } else { env.DEPLOY_ENV = 'dev' }. Configuration file providers: replace tokens in config files with environment values (${ENVIRONMENT}).
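A sketch combining branch/tag-based environment selection with per-environment credential injection; the deploy.sh script is a hypothetical placeholder:

```groovy
// Choose a target environment from the branch or tag being built
def deployEnv = 'dev'
if (env.BRANCH_NAME == 'master') {
    deployEnv = 'staging'
} else if (env.TAG_NAME && env.TAG_NAME ==~ /v\d+\.\d+\.\d+/) {
    // A semantic-version tag (v1.2.3) marks a production release
    deployEnv = 'production'
}

node {
    stage('Deploy') {
        // Inject the per-environment secret; Jenkins masks it in the build log
        withCredentials([string(credentialsId: "DB_PASSWORD_${deployEnv.toUpperCase()}",
                                variable: 'DB_PASSWORD')]) {
            sh "./deploy.sh ${deployEnv}"   // hypothetical deploy script
        }
    }
}
```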
Pipeline as Code Best Practices
DRY (Don't Repeat Yourself): use shared libraries for common code. Shared library structure: src/ (Groovy classes), vars/ (global variables, each exposed as a pipeline step). Load the library: @Library('my-shared-library') _. Use in the pipeline by calling a global variable from vars/, e.g. deployToKubernetes(environment). Keep the Jenkinsfile under 500 lines. Externalize configuration: YAML files in the repository (env/dev.yaml, env/prod.yaml), read in the pipeline with def config = readYaml file: "env/${DEPLOY_ENV}.yaml". Use the when directive for conditional stages: when { branch 'master' }, when { expression { return env.DEPLOY_ENV == 'production' } }. Explicit stages per environment make promotion criteria visible and enforceable.
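A sketch of a shared-library global variable and its use from a pipeline; deployToKubernetes and its kubectl context names are hypothetical:

```groovy
// vars/deployToKubernetes.groovy in the shared-library repository
// (hypothetical step; context names are placeholders)
def call(String environment) {
    sh "kubectl --context ${environment} apply -f k8s/"
}
```

```groovy
// Jenkinsfile in the application repository
@Library('my-shared-library') _

pipeline {
    agent any
    stages {
        // The file name under vars/ becomes the step name
        stage('Deploy') { steps { deployToKubernetes('staging') } }
    }
}
```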
Notifications and Alerts
Slack notifications: slackSend channel: '#jenkins', color: 'good', message: "Build ${env.JOB_NAME} - ${env.BUILD_NUMBER} succeeded (<${env.BUILD_URL}|Open>)". Different colors: good (success), danger (failure), warning (unstable). Mention users (@here for failures). Email notifications: emailext to: 'team@example.com', subject: "Build Status", body: "Build ${env.JOB_NAME} ${env.BUILD_NUMBER} ${currentBuild.currentResult}". Include test report summaries. Teams integration: Webhook to Microsoft Teams cards. Custom webhooks: call external systems for incident management (PagerDuty, Opsgenie) on build failures. Dashboard: Build Monitor View shows all pipelines status. Blue Ocean for visual pipeline representation.
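Notifications fit naturally into the post section, so every pipeline reports its outcome; this sketch assumes the Slack Notification and Email Extension plugins are configured:

```groovy
pipeline {
    agent any
    stages {
        stage('Build') { steps { sh 'make build' } }
    }
    post {
        success {
            slackSend channel: '#jenkins', color: 'good',
                message: "Build ${env.JOB_NAME} - ${env.BUILD_NUMBER} succeeded (<${env.BUILD_URL}|Open>)"
        }
        failure {
            slackSend channel: '#jenkins', color: 'danger',
                message: "Build ${env.JOB_NAME} - ${env.BUILD_NUMBER} failed (<${env.BUILD_URL}|Open>)"
            emailext to: 'team@example.com',
                subject: "Build ${env.JOB_NAME} #${env.BUILD_NUMBER}: ${currentBuild.currentResult}",
                body: "See ${env.BUILD_URL} for details."
        }
    }
}
```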
Rollback Automation
Automate rollback on deployment failure. Release version tracking: save the currently deployed version before deploying. Deployment step: mv /app/current /app/rollback, then ln -s /app/new /app/current. Rollback trigger: wrap the deploy in catchError(buildResult: 'FAILURE', stageResult: 'FAILURE') { deployToProduction() }, and on failure run sh "./rollback.sh ${ROLLBACK_VERSION}". Database rollbacks: migration versioning (Flyway, Liquibase) tracks applied migrations; the rollback step becomes flyway undo -target=previous_version. Blue-green rollback: just switch the router back to blue. Feature flag rollbacks: disable the flag via API without redeploying. Practice rollback drills monthly.
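A sketch of the save-then-rollback flow in scripted pipeline form; get-current-version.sh, deploy.sh, and rollback.sh are hypothetical scripts:

```groovy
// Remember the currently deployed version, attempt the deploy,
// and roll back automatically if any deployment step fails.
node {
    stage('Deploy with Rollback') {
        def rollbackVersion = sh(script: './get-current-version.sh',
                                 returnStdout: true).trim()
        try {
            sh './deploy.sh'
        } catch (err) {
            echo "Deployment failed, rolling back to ${rollbackVersion}"
            sh "./rollback.sh ${rollbackVersion}"
            throw err   // still mark the build as failed
        }
    }
}
```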
Security in CI/CD
Credentials management: never hardcode secrets. Use Jenkins Credentials Provider with folder-level isolation. Credentials binding injects as environment variable or file. Secrets detection in code: git-secrets or truffleHog pre-commit hook. Pipeline scanning: detect secrets in logs (mask passwords). Docker image scanning: Trivy or Clair scan for vulnerabilities before push. Supply chain security: check npm packages (npm audit), pip dependencies (safety), Maven dependencies (OWASP Dependency Check). Signed commits verification: require GPG signatures on release tags. Access control: Role-based strategy with least privilege (developers can't deploy production).
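A sketch of image scanning as a gate before the push stage; it assumes the Trivy CLI is installed on the agent, and the severity threshold is illustrative:

```groovy
pipeline {
    agent any
    stages {
        stage('Build Image') {
            steps { sh "docker build -t myapp:${env.BUILD_ID} ." }
        }
        stage('Scan Image') {
            steps {
                // --exit-code 1 fails the stage when HIGH/CRITICAL findings exist,
                // blocking the image from ever reaching the registry
                sh "trivy image --exit-code 1 --severity HIGH,CRITICAL myapp:${env.BUILD_ID}"
            }
        }
    }
}
```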
Performance Optimization
Parallel stage execution for speed: parallel { stage('Unit Tests') {...}, stage('Integration Tests') {...}, stage('UI Tests') {...} }. Build agents: Jenkins agents for distributed builds (Docker agents, Kubernetes pods). Provision agents on demand: Kubernetes plugin launches pod per build. Cache dependencies: reuse Maven ~/.m2/repository, npm ~/.npm, pip cache. Docker layer caching: pull previous image, mount as cache. Build once, deploy many: same artifact through all environments. Avoid git clones in every stage (use checkout scm once, stash workspace). Use incremental builds for compiled languages.
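A sketch of an on-demand Kubernetes build agent with a cached Maven repository, assuming the Kubernetes plugin is configured; the pod spec, image tag, and PersistentVolumeClaim name are hypothetical:

```groovy
pipeline {
    agent {
        kubernetes {
            yaml '''
apiVersion: v1
kind: Pod
spec:
  containers:
  - name: maven
    image: maven:3.9-eclipse-temurin-17
    command: ["sleep"]
    args: ["infinity"]
    volumeMounts:
    - name: m2-cache
      mountPath: /root/.m2        # dependency cache survives across builds
  volumes:
  - name: m2-cache
    persistentVolumeClaim:
      claimName: jenkins-m2-cache  # hypothetical pre-provisioned PVC
'''
        }
    }
    stages {
        stage('Build') {
            steps {
                container('maven') { sh 'mvn -B package' }
            }
        }
    }
}
```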
Conclusion
Jenkins pipeline mastery transforms the development workflow. Start with simple build and test stages, add deployment progressively, then adopt advanced strategies (blue-green, canary). Shared libraries, proper credentials management, and rollback automation enable reliable, high-velocity deployments.