We have seen the Continuous Delivery workflow in the previous question; now let's walk through the step-by-step process that explains why Jenkins is called a Continuous Delivery tool:
1. Developers work in their local environments, make changes to the source code, and push them to the code repository.
2. When a change is detected, Jenkins runs a series of tests and code-standard checks to determine whether the changes are fit to deploy.
3. Upon a successful build, the results are reviewed by the developers.
4. The change is then deployed manually to a staging environment, where the client can review it.
5. Once the changes are approved by the developers, testers, and clients, the final outcome is deployed manually to the production server for use by the product's end users.
In this way, Jenkins follows a Continuous Delivery approach and is therefore called a Continuous Delivery tool.
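The workflow above maps naturally onto a declarative pipeline. The following Jenkinsfile is a minimal sketch; the stage names, shell commands, and the staging/production deploy scripts are illustrative assumptions, not a prescribed setup:

```groovy
pipeline {
    agent any
    stages {
        stage('Build') {
            steps { sh 'make build' }            // compile the pushed changes
        }
        stage('Test') {
            steps { sh 'make test' }             // automated tests and code-standard checks
        }
        stage('Deploy to Staging') {
            steps { sh './deploy.sh staging' }   // the client reviews the change here
        }
        stage('Deploy to Production') {
            // manual gate: approval before the change reaches end users
            input { message 'Approved by developers, testers, and client?' }
            steps { sh './deploy.sh production' }
        }
    }
}
```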
The build can be triggered in the following ways:
1. After the completion of other builds.
2. By a source code management (SCM) commit.
3. At a specific time.
4. By requesting manual builds.
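In a declarative pipeline, the first three of these can be expressed with the triggers directive (the upstream job name and cron spellings below are illustrative); manual builds need no directive, since they are started from the UI or CLI:

```groovy
pipeline {
    agent any
    triggers {
        // 1. after the completion of another build
        upstream(upstreamProjects: 'libs-build', threshold: hudson.model.Result.SUCCESS)
        // 2. on SCM changes: poll the repository roughly every five minutes
        pollSCM('H/5 * * * *')
        // 3. at a specific time: nightly around 2 a.m.
        cron('H 2 * * *')
    }
    stages {
        stage('Build') { steps { echo 'Building...' } }
    }
}
```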
To install Jenkins, you need the following system configuration:
1. Java 7 or above.
2. Servlet 3.1 support.
3. RAM ranging from 200 MB to 70+ GB, depending on the project's build needs.
4. 1 GB or more of drive space.
Jenkins is an open source automation server written in Java. As an extensible automation server, Jenkins can be used as a simple CI server or turned into the continuous delivery hub for any project.
In Jenkins, flow control follows the pipeline structure (in a scripted pipeline), which is executed from top to bottom of the Jenkinsfile.
The three mechanisms are as follows:
* Jenkins uses an internal database to store user data and credentials.
* Jenkins can use a Lightweight Directory Access Protocol (LDAP) server to authenticate users.
* We can configure Jenkins to employ the application server's authentication mechanism upon which we deploy it.
A trigger defines when and how a pipeline should be executed. There can be several kinds of triggers: for example, a pull request trigger used to deploy a pull request, or a stage trigger used to configure how each stage in the release is triggered.
It includes job configurations, plugins, logs, plugin configuration, etc. Jenkins provides a backup plugin that can be used to back up this critical configuration. This is most important when there is a failure, as it prevents the loss of any settings.
There are two ways to configure Jenkins node agent to communicate with Jenkins master:
1. Browser – If we launch the Jenkins node agent from a browser, a Java Web Start (JNLP) file is downloaded. The downloaded file launches a new process on the client machine to run jobs.
2. Command line – To start the node agent from the command line, you need the executable agent.jar file. When this file runs, it launches a client process that communicates with the Jenkins master to run build jobs.
To create a Multibranch Pipeline in Jenkins, follow the following steps:
<> Open the Jenkins dashboard and create a new item by clicking on 'new item'
<> Enter the project name and, from the options, select 'Multibranch pipeline'
<> Click on OK
The steps to deploy a custom build of a core plugin are:
First, copy the .hpi file to $JENKINS_HOME/plugins
Then remove the plugin's development directory
Next, create an empty file called <plugin>.hpi.pinned
Finally, restart Jenkins and use your custom build of a core plugin
Initially, open the console output where the broken build was created and check whether any file changes were missed. If no issues are found there, update your local workspace, replicate the problem, and then try to solve it.
Some of the Jenkins environmental variables are:
$JOB_NAME - The name that you give your job when it is first set up.
$NODE_NAME - This is the name of the node on which the current build is running.
$WORKSPACE - Refers to the path of the workspace.
$BUILD_URL - Indicates the URL where the results of the builds can be found.
$JENKINS_URL - This is set to the URL of the Jenkins master that is responsible for running the build.
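These variables are exposed to Pipeline code via the env global, and as ordinary environment variables inside sh steps; a minimal sketch:

```groovy
pipeline {
    agent any
    stages {
        stage('Report') {
            steps {
                echo "Job:       ${env.JOB_NAME}"
                echo "Node:      ${env.NODE_NAME}"
                echo "Workspace: ${env.WORKSPACE}"
                echo "Build URL: ${env.BUILD_URL}"
                // the same variables are visible to shell steps
                sh 'echo "Jenkins master is at $JENKINS_URL"'
            }
        }
    }
}
```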
The process to configure Third-party tools in Jenkins can be seen in four significant steps:
<> Install the third-party software
<> Then install a Jenkins plugin supporting the third-party tool
<> Now, configure the tool from the Manage Jenkins section
<> Finally, your plugin is ready to be used
The JENKINS_HOME folder contains a file named config.xml. When you enable security, the useSecurity XML element in this file is set to true. If you change this setting to false, security will be disabled the next time Jenkins is restarted.
However, we must understand that disabling security should always be both a last resort and a temporary measure. Once you resolve the authentication issues, make sure that you re-enable Jenkins security and reboot the CI server.
Continuous Testing is the process where you execute automated tests as part of the software delivery pipeline. This is done so that you get the feedback on the business risks associated with software as early as possible. It consists of evolving and extending test automation to address the increased complexity and pace of modern application development and delivery.
Continuous Testing means that testing takes place on a continuous basis without any disruption of any kind. In a Continuous DevOps process, a software change is continuously moving from Development to Testing to Deployment. The code undergoes continuous development, delivery, testing and deployment.
It is a project that was started with the purpose to rethink the user experience of Jenkins, modeling and presenting the process of software delivery by surfacing information that’s important to development teams. This is done with as few clicks as possible, while still staying true to the extensibility that is core to Jenkins. While this project is in the alpha stage of development, the intent is that Jenkins users can install Blue Ocean side-by-side with the Jenkins Classic UI via a plugin.
The Multibranch Pipeline project type enables you to implement different Jenkinsfiles for different branches of the same project. In a Multibranch Pipeline project, Jenkins automatically discovers, manages and executes Pipelines for branches that contain a Jenkinsfile in source control.
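Since every discovered branch runs the same Jenkinsfile, branch-specific behaviour is usually expressed with the when directive and the branch condition; a sketch (the branch name and commands are illustrative):

```groovy
pipeline {
    agent any
    stages {
        stage('Build') {
            steps { sh 'make build' }    // runs for every discovered branch
        }
        stage('Deploy') {
            // only builds of the main branch deploy
            when { branch 'main' }
            steps { sh './deploy.sh' }
        }
    }
}
```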
In software development, multiple developers or teams work on different segments of the same web application, so you have to perform integration testing by integrating all the modules. To do that, an automated process runs for each piece of code on a daily basis so that all the code gets tested. This process is known as Continuous Integration.
There are numerous environment variables that are available by default in any Jenkins build job, such as $BUILD_NUMBER, $JOB_NAME, and $WORKSPACE.
Note that, as new Jenkins plug-ins are configured, more environment variables become available. For example, when the Jenkins Git plug-in is configured, new Jenkins Git environment variables, such as $GIT_COMMIT and $GIT_URL, become available to be used in scripts.
DevOps is a software development practice that blends software development (Dev) with IT operations (Ops), making the whole development lifecycle simpler and shorter by constantly delivering builds, fixes, updates, and features. Jenkins plays a crucial role because it helps in this integration by automating the build, test, and deployment processes.
There are 3 types –
1. CI/CD pipeline (Continuous Integration/Continuous Delivery)
2. Scripted pipeline
3. Declarative pipeline
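The scripted and declarative styles differ mainly in syntax. The same single-stage job in both forms (each form belongs in its own Jenkinsfile; the echo step is illustrative):

```groovy
// Declarative pipeline: structured syntax, begins with a pipeline block
pipeline {
    agent any
    stages {
        stage('Build') {
            steps { echo 'Building...' }
        }
    }
}

// --- the scripted equivalent, in its own Jenkinsfile ---
// Scripted pipeline: plain Groovy, begins with a node block
node {
    stage('Build') {
        echo 'Building...'
    }
}
```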
There are 3 ways –
* The default way is to store user data and credentials in an internal database.
* Configure Jenkins to use the authentication mechanism defined by the application server on which it is deployed.
* Configure Jenkins to authenticate against LDAP server.
There are 2 ways to start the node agent –
<> Browser – if Jenkins node agent is launched from a browser, a JNLP (Java Web Start) file is downloaded. This file launches a new process on the client machine to run jobs.
<> Command-line – to start the node agent using the command line, the client needs the executable agent.jar file. When this file is run, it simply launches a process on the client to communicate with the Jenkins master to run build jobs.
A build can take several input parameters to execute. For example, if you have multiple test suites but want to run only one, you can set a parameter that decides which one should be run. To use parameters in a job, you need to declare them when defining the job. A parameter can be of any supported type, such as a string, a file, or a custom parameter type.
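For instance, a choice parameter selecting which test suite to run might look like this in a declarative pipeline (the suite names and script are illustrative assumptions):

```groovy
pipeline {
    agent any
    parameters {
        // shown as a drop-down on the "Build with Parameters" page
        choice(name: 'TEST_SUITE', choices: ['smoke', 'regression', 'full'],
               description: 'Which test suite to run')
        string(name: 'VERSION', defaultValue: '1.0',
               description: 'Version under test')
    }
    stages {
        stage('Test') {
            steps {
                sh "./run_tests.sh ${params.TEST_SUITE} --version ${params.VERSION}"
            }
        }
    }
}
```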
Common monitoring platforms like Datadog, Prometheus, and JavaMelody have corresponding Jenkins plugins which, when configured, send metrics to the monitoring platform, where they can be observed with modern tooling. These platforms can also be configured with alarms and notifications so that problems receive immediate attention.
Jenkins, and several plugins, allow users to execute Groovy scripts in Jenkins. To protect Jenkins from the execution of malicious scripts, these plugins execute user-provided scripts in a Groovy Sandbox that limits what internal APIs are accessible.
This protection is provided by the Script Security plugin. As soon as an unsafe method is used in any of the scripts, the "In-process Script Approval" action should appear in "Manage Jenkins" to allow Administrators to make a decision about which unsafe methods, if any, should be allowed in the Jenkins environment.
This in-process script approval inherently improves the security of the overall Jenkins ecosystem.
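As a sketch of how this surfaces in practice, a scripted step like the following touches internal Jenkins APIs that the sandbox blocks by default, so running it fails with a rejected-signature error until an administrator approves the signatures under "In-process Script Approval":

```groovy
// Scripted pipeline step using internal APIs outside the sandbox whitelist.
// Jenkins.instance (and the plugin-manager calls below) require script approval.
node {
    def plugins = Jenkins.instance.pluginManager.plugins
    plugins.each { p ->
        echo "${p.getShortName()}: ${p.getVersion()}"
    }
}
```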
To integrate Git with Jenkins, you can follow the following steps:
* First, create a new Jenkins job and open the Jenkins dashboard.
* Now, enter the desired project name and select the job type.
* Click on OK.
* Then enter the project information.
* After that, visit the 'Source Code Management' tab.
Integrating Selenium allows Jenkins to run tests whenever there are software changes or changes in the environment. When the Selenium test suite is integrated with Jenkins, testing is also automated as part of the build process.
To clone a repository via Jenkins, you need to enter your login credentials in the Jenkins system.
To achieve this, enter the Jenkins job directory and execute the git config command.
Jenkins integrates with:
1. Build tools or build scripts, such as Maven.
2. Version control systems or accessible source code repositories, such as Git.
Parameters are supported by the agent section and are used to support various pipeline use cases. Parameters are defined at the top level of the pipeline or inside an individual stage directive.
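For example, the agent section accepts parameters such as label or docker, either at the top level of the pipeline or per stage (the label and image names below are illustrative):

```groovy
pipeline {
    agent none   // no global agent; each stage picks its own
    stages {
        stage('Build') {
            agent { label 'linux' }               // run on any node labelled "linux"
            steps { sh 'make build' }
        }
        stage('Test') {
            agent { docker { image 'node:20' } }  // run inside a Docker container
            steps { sh 'npm test' }
        }
    }
}
```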
To create a slave node in Jenkins:
1. Go to Manage Jenkins, and scroll down to Manage Nodes
2. Click on New Node
3. Set the node name, choose the Dumb slave option, and then click on OK
4. Enter the node slave machine details, and click on Save
Builds can be scheduled or triggered in Jenkins in the following ways:
* By a source code management commit.
* After the completion of other builds.
* At a specified, scheduled time.
* By a manual build request.
First, we need to copy our jobs directory from the old server to the new one. There are multiple ways to do this: we can either move the job from the installation by simply copying the corresponding job directory, or we can clone the job directory by making a copy of an existing job. The copy needs a different name, which we can rename later.
First, we need to open the console output where the broken build is created and then see if there are any file changes that were missed. If we do not find any issues in this manner, then we can update our local workspace and replicate the problem and then try to solve it.
A stage block defines a conceptually distinct subset of tasks performed through the entire Pipeline. The stages section contains a sequence of one or more stage directives and is where the bulk of the "work" described by a Pipeline is located. At a minimum, it is recommended that stages contain at least one stage directive for each discrete part of the continuous delivery process, such as Build, Test, and Deploy.
The environment directive specifies a sequence of key-value pairs that will be defined as environment variables for all steps, or for stage-specific steps, depending on where the environment directive is located within the Pipeline. This directive supports a special helper method, credentials(), which can be used to access pre-defined credentials by their identifier in the Jenkins environment.
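A sketch showing both a pipeline-level and a stage-level environment directive, plus the credentials() helper (the credential ID 'aws-creds' is an assumed identifier, not a real one):

```groovy
pipeline {
    agent any
    environment {
        APP_ENV = 'ci'   // pipeline-level: visible to every stage
    }
    stages {
        stage('Deploy') {
            environment {
                // stage-level: binds a Jenkins credential by its ID; for
                // username/password credentials this also defines the derived
                // variables AWS_CREDS_USR and AWS_CREDS_PSW
                AWS_CREDS = credentials('aws-creds')
            }
            steps {
                sh 'echo "Deploying in $APP_ENV as $AWS_CREDS_USR"'
            }
        }
    }
}
```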
Initially, Jenkins was called Hudson. After Oracle acquired Sun Microsystems, a dispute arose between Oracle and the Hudson community, and in 2011 the community renamed the project from Hudson to Jenkins.
First, we need to enable global security and make sure that Jenkins is integrated with the user directory through an appropriate plugin. Next, enable the project matrix to fine-tune access, using a custom version-controlled script to automate the granting of rights and privileges in Jenkins. Limit access to the Jenkins data folder, and run periodic security audits.
The JENKINS_HOME folder contains a file named config.xml. Change the useSecurity setting in this file to false, and security will be disabled the next time Jenkins is started.
These are the mechanisms for starting a Jenkins node agent:
* From the browser window, launch a Jenkins node agent
* From the command line, launch a Jenkins node agent
When we launch a Jenkins node agent from the browser, a JNLP file is downloaded; running it launches a new process on the client machine to run jobs.
Whenever one wants to integrate Jenkins with GitHub projects, the GitHub plugin can be used. It is used to enable the scheduling of a build, pulling data and code files from the GitHub repository to the Jenkins machine, and triggering every build automatically on the Jenkins server after each commit on the Git repository. This saves time and allows one to incorporate the specific project into the CI process.
Jenkins provides a remote access API to most of its functionalities (though some functionalities are programming language-dependent). Currently, it comes in three flavors:
<> XML
<> JSON with JSONP support
<> Python
Remote access API is offered in a REST-like style. That is, there is no single entry point for all features, and instead, they are available under the ".../api/" URL where the "..." portion is the data that it acts on.
For example, if your Jenkins installation sits at interviewbit.com, visiting /api/ will show just the top-level API features available – primarily a listing of the configured jobs for this Jenkins instance.
Or if we want to access information about a particular build, e.g. https://ci.jenkins.io/job/Infra/job/jenkins.io/job/master/lastSuccessfulBuild/, then go to https://ci.jenkins.io/job/Infra/job/jenkins.io/job/master/lastSuccessfulBuild/api/ and you’ll see the list of functionalities for that build.
Using the Jenkins CLI console command:
java -jar jenkins-cli.jar console JOB [BUILD] [-f] [-n N]
This produces the console output of a specific build to stdout, as if you were running 'cat build.log':
<> JOB: Name of the job
<> BUILD: Build number or permalink to point to the build. Defaults to the last build
<> -f: If the build is in progress, append console output as it comes, like tail -f
<> -n N: Display the last N lines.
Global Tools are tools that are installed outside the Jenkins environment but controlled from within it, so each needs a corresponding Jenkins plugin. The steps to use a Global Tool generally include:
<> Install the tool Plugin into the Jenkins instance, to include the global tool into a list of global tools used by Jenkins.
<> Install the tool in the Jenkins instance, or provide a way (for example, a command to download and install it) to install the tool during runtime.
<> Go to Manage Jenkins -> Global Tool Configuration, scroll through the tool list, and configure the tool-specific settings.
<> Make use of the installed global Tool in your job/pipeline.