INDUSTRY PRACTICES AND TOOLS 2


1.  Discuss the importance of maintaining the quality of the code, explaining the different aspects of the code quality
Software development takes a long time and painstaking effort. This is why most part-time volunteers cannot start big projects by themselves; it is much easier and more rewarding to contribute to existing projects, as this yields results that are immediately visible and usable.
Thus, we conclude that it is very important for existing projects to make it as easy as possible for people to contribute to them. One way of doing this is by making sure that programs are easy to read, understand, modify, and maintain.
Messy code is hard to read, and people may lose interest if they cannot decipher what the code tries to do. Also, it is important that programmers be able to understand the code quickly so that they can start contributing with bug fixes and enhancements in a short amount of time. Source code is a form of communication, and it is more for people than for computers. Just as someone would not like to read a novel with spelling errors, bad grammar, and sloppy punctuation, programmers should strive to write good code that is easy to understand and modify by others.
The following are some important qualities of good code:
Cleanliness
Clean code can be read with minimum effort, which lets people start understanding it quickly. Cleanliness covers both the coding style itself (brace placement, indentation, variable names) and the actual control flow of the code.
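As a hypothetical illustration (the function and names are invented for this example), compare a messy routine with a cleaner rewrite of the same logic:

```python
# Messy: cryptic names, magic numbers, nested control flow.
def calc(p, t):
    if t == 1:
        return p - p * 0.1
    else:
        if t == 2:
            return p - p * 0.2
        else:
            return p

# Clean: descriptive names, named constants, a single clear code path.
DISCOUNT_RATES = {"member": 0.10, "employee": 0.20}

def discounted_price(price, customer_type):
    """Return the price after applying the customer's discount, if any."""
    rate = DISCOUNT_RATES.get(customer_type, 0.0)
    return price * (1 - rate)
```

Both versions compute the same result, but the second can be read, and extended with a new customer type, without deciphering the control flow.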
Consistency
Consistent code makes it easy for people to understand how a program works. When reading consistent code, one subconsciously forms a number of assumptions and expectations about how the code works, so it is easier and safer to make modifications to it. Code that looks the same in two places should work the same, too.
Extensibility
General-purpose code is easier to reuse and modify than very specific code with lots of hardcoded assumptions. When someone wants to add a new feature to a program, it will obviously be easier to do so if the code was designed to be extensible from the beginning. Code that was not written this way may force people to implement ugly hacks to add features.
Correctness
Finally, code that is designed to be correct lets people spend less time worrying about bugs, and more time enhancing the features of a program. Users also appreciate correct code, since nobody likes software that crashes. Code that is written for correctness and safety (i.e. code that explicitly tries to ensure that the program remains in a consistent state) prevents many kinds of silly bugs.
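A small, hypothetical sketch of this defensive style (the class and rules are invented for the example): each operation validates its inputs before mutating state, so the object can never end up in an inconsistent state.

```python
class Account:
    """A sketch of code written for safety: every operation validates its
    inputs before mutating state, so the balance stays consistent."""

    def __init__(self, balance=0):
        if balance < 0:
            raise ValueError("initial balance cannot be negative")
        self.balance = balance

    def withdraw(self, amount):
        if amount <= 0:
            raise ValueError("amount must be positive")
        if amount > self.balance:
            raise ValueError("insufficient funds")
        self.balance -= amount
        return self.balance
```

Because invalid operations are rejected up front, a whole class of "silly bugs" (negative balances, phantom withdrawals) simply cannot occur.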


Different aspects of the code quality

There’s no one right way to think about software quality—it’s a complicated area. It is useful, however, to group its various components into three broad aspects.



The three aspects of software quality are functional quality, structural quality, and process quality. Each one is worth looking at in more detail. Functional quality means that the software correctly performs the tasks it’s intended to do for its users. Among the attributes of functional quality are:


• Meeting the specified requirements. Whether they come from the project’s sponsors or the software’s intended users, meeting requirements is the sine qua non of functional quality. In some cases, this might even include compliance with applicable laws and regulations. And since requirements commonly change throughout the development process, achieving this goal requires the development team to understand and implement the correct requirements throughout, not just those initially defined for the project.

• Creating software that has few defects. Among these are bugs that reduce the software’s reliability, compromise its security, or limit its functionality. Achieving zero defects is too much to ask for most projects, but users are rarely happy with software they perceive as buggy.

• Good enough performance. From a user’s point of view, there’s no such thing as a good, slow application.

• Ease of learning and ease of use. To its users, the software’s user interface is the application, and so these attributes of functional quality are most commonly provided by an effective interface and a well-thought-out user workflow. The aesthetics of the interface—how beautiful it is—can also be important, especially in consumer applications.

Software testing commonly focuses on functional quality. All of the characteristics just listed can be tested, at least to some degree, and so a large part of ensuring functional quality boils down to testing.

The second aspect of software quality, structural quality, means that the code itself is well structured. Unlike functional quality, structural quality is hard to test for (although there are tools to help measure it, as described later). The attributes of this type of quality include:

• Code testability. Is the code organized in a way that makes testing easy?

• Code maintainability. How easy is it to add new code or change existing code without introducing bugs?

• Code understandability. Is the code readable? Is it more complex than it needs to be? These have a large impact on how quickly new developers can begin working with an existing code base.

• Code efficiency. Especially in resource-constrained situations, writing efficient code can be critically important.

• Code security. Does the software allow common attacks such as buffer overruns and SQL injection? Is it insecure in other ways?
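To make the SQL injection risk concrete, here is a small sketch using Python's built-in sqlite3 module (the table and data are invented for the example). The unsafe version splices user input into the query text; the safe version uses a parameterized query:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, role TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 'admin')")

def find_user_unsafe(name):
    # Vulnerable: attacker-controlled input is spliced into the SQL text.
    return conn.execute(
        "SELECT role FROM users WHERE name = '%s'" % name).fetchall()

def find_user_safe(name):
    # Safe: the driver passes the value separately from the SQL text.
    return conn.execute(
        "SELECT role FROM users WHERE name = ?", (name,)).fetchall()
```

With a payload like `' OR '1'='1`, the unsafe query returns every row, while the parameterized query simply finds no user with that literal name.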

The third aspect of software quality, process quality, concerns the quality of the development process itself. Its most obvious attributes include these:

• Meeting delivery dates. Was the software delivered on time?

• Meeting budgets. Was the software delivered for the expected amount of money?

• A repeatable development process that reliably delivers quality software. If a process has the first two attributes—software delivered on time and on budget—but so stresses the development team that its best members quit, it isn’t a quality process. True process quality means being consistent from one project to the next.






2.  Explain different approaches and measurements used to measure the quality of code


Here are five of the key traits to measure for higher quality.
Reliability
Reliability measures the probability that a system will run without failure over a specific period of operation. It relates to the number of defects and availability of the software.
The number of defects can be measured by running a static analysis tool. Software availability can be measured using the mean time between failures (MTBF). A low defect count is especially important for developing a reliable codebase.
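MTBF, and the availability it implies, can be computed directly. The formulas below are the standard definitions; the numbers are invented for illustration:

```python
def mtbf(total_uptime_hours, failure_count):
    """Mean time between failures: total operating time divided by the
    number of failures observed in that period."""
    return total_uptime_hours / failure_count

def availability(mtbf_hours, mttr_hours):
    """Fraction of time the system is operational, given the mean time
    between failures and the mean time to repair (MTTR)."""
    return mtbf_hours / (mtbf_hours + mttr_hours)

# Example: 1000 hours of operation with 4 failures, 1 hour average repair
print(mtbf(1000, 4))            # hours between failures
print(availability(250.0, 1.0)) # fraction of time operational
```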
Maintainability
Maintainability measures how easily software can be maintained. It relates to the size, consistency, structure, and complexity of the codebase. Ensuring maintainable source code relies on a number of factors, such as testability and understandability.
You can’t use a single metric to ensure maintainability. Some metrics you may consider to improve maintainability are the number of stylistic warnings and the Halstead complexity measures. Both automation and human reviewers are essential for developing maintainable codebases.
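As an illustration, the core Halstead measures can be computed from four raw counts taken from the source code. The formulas below are the standard definitions:

```python
import math

def halstead(n1, n2, N1, N2):
    """Halstead measures from distinct operators (n1), distinct operands (n2),
    total operators (N1) and total operands (N2)."""
    vocabulary = n1 + n2
    length = N1 + N2
    volume = length * math.log2(vocabulary)
    difficulty = (n1 / 2) * (N2 / n2)
    effort = difficulty * volume
    return {"vocabulary": vocabulary, "length": length,
            "volume": volume, "difficulty": difficulty, "effort": effort}
```

Static analysis tools extract the operator/operand counts automatically; the derived volume, difficulty and effort numbers are what get tracked over time.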
Testability
Testability measures how well the software supports testing efforts. It relies on how well you can control, observe, isolate, and automate testing, among other factors.
Testability can be measured by how many test cases you need to find potential faults in the system. The size and complexity of the software can impact testability, so applying measures at the code level, such as cyclomatic complexity, can help you improve the testability of a component.
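As a rough sketch of the idea, cyclomatic complexity can be approximated by counting decision points in the code. This is a simplified version; real tools (radon, lizard, etc.) handle many more cases:

```python
import ast

def cyclomatic_complexity(source):
    """Rough cyclomatic complexity for Python source: 1 plus one for each
    decision point found in the syntax tree. A deliberately simplified sketch."""
    tree = ast.parse(source)
    decisions = (ast.If, ast.For, ast.While, ast.ExceptHandler,
                 ast.BoolOp, ast.IfExp)
    return 1 + sum(isinstance(node, decisions) for node in ast.walk(tree))
```

A straight-line function scores 1; every branch or loop adds one more path that a test suite has to cover, which is why high numbers signal poor testability.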
Portability
Portability measures how usable the same software is in different environments. It relates to platform independency.
There isn’t a specific measure of portability. But there are several ways you can ensure portable code. It’s important to regularly test code on different platforms, rather than waiting until the end of development. It’s also a good idea to set your compiler warning levels as high as possible — and use at least two compilers. Enforcing a coding standard also helps with portability.

Reusability
Reusability measures whether existing assets — such as code — can be used again. Assets are more easily reused if they have characteristics such as modularity or loose coupling.
Reusability can be measured by the number of interdependencies. Running a static analyzer can help you identify these interdependencies.
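As a minimal sketch of counting interdependencies, one can count a module's import statements with Python's ast module (a real static analyzer tracks far more, such as call graphs and coupling between classes):

```python
import ast

def count_imports(source):
    """Count a module's import statements as a crude proxy for its
    interdependencies; fewer dependencies usually means easier reuse."""
    tree = ast.parse(source)
    return sum(isinstance(node, (ast.Import, ast.ImportFrom))
               for node in ast.walk(tree))
```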


3.   Identify and compare some available tools to maintain the code quality

 1) Collaborator
Collaborator is the most comprehensive peer code review tool, built for teams working on projects where code quality is critical.
  • See code changes, identify defects, and make comments on specific lines. Set review rules and automatic notifications to ensure that reviews are completed on time.
  • Custom review templates are unique to Collaborator. Set custom fields, checklists, and participant groups to tailor peer reviews to your team’s ideal workflow.
  • Easily integrate with 11 different SCMs, as well as IDEs like Eclipse and Visual Studio.
  • Build custom review reports to drive process improvement and make auditing easy.
  • Conduct peer document reviews in the same tool so that teams can easily align on requirements, design changes, and compliance burdens.
2) Review Assistant
Review Assistant is a code review tool. This code review plug-in helps you to create review requests and respond to them without leaving Visual Studio. Review Assistant supports TFS, Subversion, Git, Mercurial, and Perforce. Simple setup: up and running in 5 minutes.
Key features:
  • Flexible code reviews
  • Discussions in code
  • Iterative review with defect fixing
  • Team Foundation Server integration
  • Flexible email notifications
  • Rich integration features
  • Reporting and Statistics
  • Drop-in Replacement for Visual Studio Code Review Feature and much more
3) Codebrag
  • Codebrag is a simple, light-weight, free and open source code review tool which makes the review entertaining and structured.
  • Codebrag offers features like non-blocking code review, inline comments and likes, smart email notifications, etc.
  • With Codebrag one can focus on workflow to find out and eliminate issues along with joint learning and teamwork.
  • Codebrag helps in delivering enhanced software using its agile code review.
  • Codebrag is open source software licensed under the AGPL.

4) Gerrit
  • Gerrit is a free, web-based code review tool that software developers use to review code in a web browser and approve or reject changes.
  • Gerrit can be integrated with Git, a distributed version control system.
  • Gerrit provides repository management for Git.
  • Using Gerrit, project members can follow a streamlined code review process along with a highly configurable hierarchy.
  • Gerrit is also used for discussing detailed segments of the code and deciding on the right changes to be made.


5) Codestriker

  • Codestriker is a free, open source, online code reviewing web application that supports collaborative code review.
  • Using Codestriker one can record the issues, comments, and decisions in a database which can be further used for code inspections.
  • Codestriker supports traditional documents review. It can be integrated with ClearCase, Bugzilla, CVS etc.
  • Codestriker is licensed under GPL.
6) Rhodecode
  • Rhodecode is an open source, secure, integrated enterprise source code management tool.
  • Rhodecode serves as an integrated tool for Git, Subversion, and Mercurial.
  • Rhodecode's main features are team collaboration, repository management, and code security and authentication.
  • Rhodecode has 2 editions: Community Edition (CE), which is free and open source, and Enterprise Edition (EE), which is licensed per user.
  • Rhodecode automates workflows to execute faster.
7) Phabricator
Phabricator is a complete suite of open-source software development applications which includes light-weight web-based code review, planning, testing, repository browsing, audits, bug finding etc.
  • The code review tool from the Phabricator suite is termed “Differential”. It is used to minimize the effort required to create the best quality code.
  • Phabricator has two types of code review workflows, namely “pre-push”, also termed “review”, and “post-push”, termed “audit”.
  • Phabricator can be integrated with Git, Subversion, and Mercurial.

8) Crucible
Crucible is a web-based collaborative code review application used by developers for code review, finding defects, discussing the changes and knowledge sharing etc.

  • Crucible is a flexible application that accommodates a wide range of work approaches and team sizes.
  • Crucible is a lightweight peer code review tool that is used in pre-commit and post-commit reviews.
  • Code review has become easy for SVN, Perforce, CVS, etc. using Crucible.
9) Veracode
Veracode (now acquired by CA Technologies) is a company which delivers various solutions for automated and on-demand application security testing, automated code review etc.
  • Veracode is used by developers to create secure software by scanning binary code or bytecode in place of source code.
  • Using Veracode one can identify improperly encrypted functions, malicious code, and backdoors in source code.
  • Veracode can review a large amount of code and return results immediately.
  • To use Veracode there is no need to buy any software or hardware, you just need to pay for the analysis services you need.

10) Review Board
Review Board is a web-based, collaborative, free and open source tool used for code review and document review by open source projects and companies.
  • Using Review Board for code review one can save money and time. Time saved can be used in concentrating on creating great software.
  • Review Board can be integrated with Clear Case, CVS, Perforce, Plastic etc.
  • In a code review by Review Board tool, the code is syntax highlighted which makes it be read faster.
  • Review Board supports pre-commit reviews and post-commit reviews.
Additional Code Review Tools for consideration:
Below are a few more additional tools that are used by developers in reviewing the source code.
11) Barkeep
Using Barkeep, one can have fun reviewing code, which makes reviews faster. With this tool, one can email comments to fellow committers.
12) JArchitect
JArchitect is a wonderful tool for analyzing Java code. After each review, it generates a report describing the state of your project or software, which eases the task of improving the code.
13) Code Review Tool
Code Review Tool uses a light-weight review technique, providing all the advantages of formal inspections while reducing the required effort and time.
14) Reviewable
Reviewable is a fresh, light-weight and powerful code review tool which makes code review faster and more thorough. It helps improve code quality through a clean user interface, customizable code fonts, bug and issue detection, syntax highlighting, etc.
15) Rietveld
Rietveld is a web-based collaborative code review tool from Google. It was originally developed to demonstrate Google App Engine, but it is now used by many open source projects for code review.
16) Peer Review Plugin
Peer Review Plugin is a web-based environment that makes code review user-friendly. It allows developers to review code on their own time, and in a distributed manner. The ultimate purpose of this plug-in is to review files from the repository and comment on them.






4. Discuss the need for dependency/package management tools in software development
In Maven, dependency management is used to pull all dependency information into a common POM file, simplifying the references in the child POM files.
It becomes useful when you have multiple attributes that you don’t want to retype in multiple child projects. Finally, dependency management can be used to define a standard version of an artifact to use across multiple projects.
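A hypothetical example of this arrangement (the artifact and version are invented for illustration): the parent POM pins the version once under dependencyManagement, and each child POM references the artifact without repeating it.

```xml
<!-- Parent POM: declares the standard version once -->
<dependencyManagement>
  <dependencies>
    <dependency>
      <groupId>com.google.guava</groupId>
      <artifactId>guava</artifactId>
      <version>32.1.3-jre</version>
    </dependency>
  </dependencies>
</dependencyManagement>

<!-- Child POM: inherits the version from the parent -->
<dependencies>
  <dependency>
    <groupId>com.google.guava</groupId>
    <artifactId>guava</artifactId>
  </dependency>
</dependencies>
```

Upgrading the library across all modules then means editing a single line in the parent.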
Managing dependencies manually in any programming language is a huge pain. This is why most programming languages today have some implementation of a dependency management system, sometimes called a package manager. Below is a list of 15 of the best dependency management tools for developers, with which you can easily manage your dependencies.

1. NuGet


NuGet is the package manager for the Microsoft development platform including .NET. The NuGet client tools provide the ability to produce and consume packages. The NuGet Gallery is the central package repository used by all package authors and consumers.

When you use NuGet to install a package, it copies the library files to your solution and automatically updates your project (add references, change config files, etc.). If you remove a package, NuGet reverses whatever changes it made so that no clutter is left.

2. Composer


This dependency manager for PHP lets you create a composer.json file in your project root, run a single command, and all your dependencies are downloaded ready to use.

Composer is not a package manager in the same sense as Yum or Apt are. Yes, it deals with “packages” or libraries, but it manages them on a per-project basis, installing them in a directory (e.g. vendor) inside your project. By default it does not install anything globally. Thus, it is a dependency manager. It does however support a “global” project for convenience via the global command.
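A minimal hypothetical composer.json (the package name and version constraint are illustrative):

```json
{
    "require": {
        "monolog/monolog": "^2.0"
    }
}
```

Running `composer install` in the project root then downloads the listed packages and their dependencies into the vendor/ directory and generates an autoloader.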

 

3. David


David is a tool for getting an overview of your Node dependencies. It creates a badge showing the current status of each dependency, which you can embed on your website if you choose.

4. Nanny


Nanny is a dependency management tool for managing dependencies between your projects. Unlike tools like Maven, Nanny can be used for arbitrary dependencies and is easy to use.

Nanny lets you specify dependencies to your project, and Nanny will go ahead and pull in all the dependencies (and everything those dependencies are dependent on) into the _deps folder in your project. Nanny makes it easy to create dependencies and manage dependency versions.

5. Bower


Bower is a package manager for the web. Bower lets you easily install assets such as images, CSS and JavaScript, and manages dependencies for you.
Bower can manage components that contain HTML, CSS, JavaScript, fonts or even image files. Bower doesn’t concatenate or minify code or do anything else – it just installs the right versions of the packages you need and their dependencies.

6. Sprockets
Sprockets is a Ruby library for compiling and serving web assets. It features declarative dependency management for JavaScript and CSS assets, as well as a powerful preprocessor pipeline that allows you to write assets in languages like CoffeeScript, Sass and SCSS.

7. Pintjs


Pint is a small, asynchronous, dependency-aware wrapper around Grunt that attempts to solve some of the problems that accompany a build process at scale. A typical Gruntfile starts with, at a minimum, some variation of a jsHint, jasmine, LESS, handlebars, uglify, copy, and clean stack. Just these half dozen or so plugins can balloon your Gruntfile upwards of 300 lines, and adding complex concatenation, cache busting, and versioning can cause it to grow well into 1000+ lines. Pint allows you to break up and organize your build into small testable pieces.

8. Ender.js


Ender is a full-featured package manager for your browser. It allows you to search, install, manage and compile front-end JavaScript packages and their dependencies for the web. Ender is not a jQuery replacement, nor a static asset; it is a tool for making the consumption of front-end JavaScript packages dead simple and powerful.

With Ender, if one library goes bad or unmaintained, it can be replaced with another.

 

 9. Jam


Jam is a package manager for JavaScript. Unlike other repositories, they put the browser first. Using a stack of script tags isn’t the most maintainable way of managing dependencies; with Jam packages and loaders like RequireJS you get automatic dependency resolution.

You can achieve faster load times with asynchronous loading and the ability to optimize downloads. JavaScript modules and packages provide properly namespaced and more modular code.

 10. Browserify


Browserify optimizes required modules and libraries by bundling them together. These bundles are supported in the browser which means you can include and merge modules with plain JavaScript. All you need is NPM to get started and then Browserify to get moving.

11. Volo

Volo is a tool for creating browser-based, front-end projects from project templates, and for adding dependencies by fetching them from GitHub. Once your project is set up, it automates common tasks.
volo is a dependency manager and project creation tool that favors GitHub as the package repository. At its heart, volo is a generic command runner — you can create new commands for volo, and you can use commands others have created.

12. GemLou.pe


GemLou.pe is a bookmarklet that lets you easily view the full dependency tree for any Ruby gem before you install it. Use it directly from RubyGems.org or Ruby-Toolbox.com, or type in the name of the gem from anywhere else on the web.

 13. Mantri


Mantri is an open source tool built for more complex web applications that require large bundles of dependencies. Mantri aims to follow modular programming practices and hopes to encourage developers onto the same path.

14. PIP


pip is a package management system used to install and manage software packages written in Python.

15. NPM


npm is the package manager for JavaScript. Find, share, and reuse packages of code from hundreds of thousands of developers — and assemble them in powerful new ways. Dependencies can be updated and optimized right from the terminal, and you can build new projects with dependency files and version numbers automatically pulled from the package.json file.
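A minimal hypothetical package.json (names and versions are illustrative); running `npm install` reads this file and pulls the listed dependencies into node_modules:

```json
{
  "name": "example-app",
  "version": "1.0.0",
  "dependencies": {
    "express": "^4.18.0"
  },
  "devDependencies": {
    "jest": "^29.0.0"
  }
}
```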







5. Explain the role of dependency/package management tools in software development


A package manager or package management system is a collection of software tools that automates the process of installing, upgrading, configuring, and removing computer programs for a computer's operating system in a consistent manner.
A package manager deals with packages, distributions of software and data in archive files. Packages contain metadata, such as the software's name, description of its purpose, version number, vendor, checksum, and a list of dependencies necessary for the software to run properly. Upon installation, metadata is stored in a local package database. Package managers typically maintain a database of software dependencies and version information to prevent software mismatches and missing prerequisites. They work closely with software repositories, binary repository managers, and app stores.
Package managers are designed to eliminate the need for manual installs and updates. This can be particularly useful for large enterprises whose operating systems are based on Linux and other Unix-like systems, typically consisting of hundreds or even tens of thousands of distinct software packages.
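The core of how a package manager "prevents missing prerequisites" is dependency resolution: ordering installs so every dependency is present before the packages that need it. A minimal sketch (package names invented; cycle detection omitted for brevity):

```python
def install_order(packages):
    """Return an install order in which every dependency comes before the
    packages that need it (a depth-first topological sort).
    `packages` maps a package name to the list of packages it depends on."""
    order, seen = [], set()

    def visit(name):
        if name in seen:
            return
        seen.add(name)
        for dep in packages.get(name, []):
            visit(dep)      # install dependencies first
        order.append(name)

    for name in packages:
        visit(name)
    return order
```

Real package managers layer version-constraint solving and conflict detection on top of this basic ordering.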




7. What is a build tool? Indicate the significance of using a build tool in large-scale software development, distinguishing it from small-scale software development



What is a build tool

Build tools are programs that automate the creation of executable applications from source code. Building incorporates compiling, linking and packaging the code into a usable or executable form. In small projects, developers will often manually invoke the build process. This is not practical for larger projects, where it is very hard to keep track of what needs to be built, in what sequence and what dependencies there are in the building process. Using an automation tool allows the build process to be more consistent.
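A classic example of such automation is a Makefile: each target declares what it depends on, so the tool rebuilds only what changed, in the correct order (the file names here are hypothetical):

```makefile
CC = gcc

# Link step: runs only if an object file is newer than the executable
app: main.o utils.o
	$(CC) -o app main.o utils.o

# Compile steps: each .o is rebuilt when its sources change
main.o: main.c utils.h
	$(CC) -c main.c

utils.o: utils.c utils.h
	$(CC) -c utils.c

clean:
	rm -f app *.o
```

Running `make` then performs exactly the compile and link steps that are out of date, in dependency order.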



Software development is the process of conceiving, specifying, designing, programming, documenting, testing, and bug fixing involved in creating and maintaining applications, frameworks, or other software components. Software development is a process of writing and maintaining the source code, but in a broader sense, it includes all that is involved between the conception of the desired software through to the final manifestation of the software, sometimes in a planned and structured process.[1] Therefore, software development may include research, new development, prototyping, modification, reuse, re-engineering, maintenance, or any other activities that result in software products.
Software can be developed for a variety of purposes, the three most common being to meet specific needs of a specific client/business (the case with custom software), to meet a perceived need of some set of potential users (the case with commercial and open source software), or for personal use (e.g. a scientist may write software to automate a mundane task). Embedded software development, that is, the development of embedded software, such as used for controlling consumer products, requires the development process to be integrated with the development of the controlled physical product. System software underlies applications and the programming process itself, and is often developed separately.
The need for better quality control of the software development process has given rise to the discipline of software engineering, which aims to apply the systematic approach exemplified in the engineering paradigm to the process of software development.
There are many approaches to software project management, known as software development life cycle models, methodologies, processes, or models. The waterfall model is a traditional version, contrasted with the more recent innovation of agile software development.




Small Projects:
  • Are usually completed by an individual or a small team
  • Typical time for development is in the order of hours to weeks
  • Emphasis is placed on "Getting it done!"

Large Projects:
  • Usually have large teams working on them
  • Usually take months to years to complete
  • Maintenance plays a bigger role because the software life cycle is long in duration

As we can see from the comparison above, small projects and large projects usually have different requirements. With the help of these requirements, we are able to identify the most important aspects of programming languages for each of them. They are listed below:
 
 
Small Projects
• Familiarity with the programming language
• Development environment should be friendly
• Code generation
• Language simplicity
• Ease of documentation
• Availability of libraries

Large Projects


• Need to reduce module dependencies, e.g. through:
  • Information hiding
  • Inheritance
• Low run-time costs
• Good software design
• Good debugging
• Security of the virtual machine
• Easy interfacing between the modules
• Support for team-work
• Language simplicity
• Ease of documentation
• Availability of libraries


8. Explain the role of build automation in build tools, indicating the need for build automation


What is Build Automation (BA)?
BA, also sometimes referred to as Continuous Integration (CI), is the process of automating on-demand build creation, which encompasses some or all of the steps below:
1. Download code from a central repository – Git, SVN, TFS etc.
2. Make updates to the code structure if needed
3. Download required external packages via Maven, NuGet, Ant etc.
4. Build the code using gcc, javac, MSBuild etc.
5. Create a build share with binaries and default configuration – JAR, WAR, EXE, XML, INI etc.
6. Propagate the build output to cloud or network shares
7. Deploy to web servers and other servers
8. Configure new or upgraded deployments
9. Run BVT (build verification tests) on the deployments
10. Inform relevant stakeholders
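As a sketch, steps like these map naturally onto a CI configuration file. This hypothetical GitHub Actions workflow (assuming a Maven-based Java project; names and paths are invented) covers several of them:

```yaml
name: build
on: [push]                    # CI-style trigger: runs on every commit
jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4         # step 1: download code
      - run: mvn -B package               # steps 3-5: fetch packages, compile, create the jar
      - run: mvn -B verify                # step 9: run the verification tests
      - uses: actions/upload-artifact@v4  # step 6: propagate the build output
        with:
          name: app-jar
          path: target/*.jar
```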
A CI build is usually triggered when a code commit is made or a particular tag is created. A basic BA job is usually triggered at a fixed time; dev teams need to finish their commits by that time.
Continuous Integration vs. Build Automation
CI’s benefit lies in giving every team member responsibility for individual commits. Faults are uncovered fast. It is a complex process even with licensed software or services, and it needs a skilled DevOps team. Despite claims of configuration-only setup, some scripting always needs to be done.
In contrast, basic BA takes more time to uncover faults, but its predictable timeline reduces anxiety for team members. It is easy to implement, leaving few manual tasks. It can be developed by anyone with basic scripting knowledge, and it can be done using the native shell of an OS without any licensed software.

9. Compare and contrast different build tools used in industry


Types of Build DevOps Tools:

One of the basic needs for any DevOps setup in an organization is the build process, along with the requirements that come after it: unit testing, continuous integration, regression testing, performance testing, etc. Let us now take a look at the available tools that can help us get through this phase of the project lifecycle:

1. Scala oriented Build Tool (SBT):

SBT, an acronym that stands for Scala-oriented build tool, is a build tool meant for Scala, Java and other languages. Unlike other build tools, SBT specifically targets Scala and Java projects. In addition, SBT provides a special interactive mode that makes Scala builds significantly faster by reusing the same JVM, as opposed to the traditional batch operation where a number of build tasks are executed in sequence. SBT is perfectly capable of compiling Scala code, packaging archive artifacts, executing tests, and supporting many other build operations.
Advantages:
  • SBT can be used effortlessly on smaller and simpler projects.
  • Commonly seen in use on Scala open source projects.
  • Provides excellent integration if you are using the IntelliJ IDEA IDE for your development.
  • Most of the tasks that you will need (such as compile, test, run, doc, publish-local and console) are ready to use with SBT.
  • It is also pointed out by some as a feature that dependencies can be open source repositories grabbed directly from GitHub.

2. CMake:

CMake is cross-platform, free and open-source software that manages the build process of software in a compiler-independent manner. CMake also supports directory hierarchies and applications that depend on multiple libraries. CMake can work in conjunction with additional build environments such as Xcode (from Apple) and Visual Studio (from Microsoft). The only prerequisite for CMake is a C++ compiler on the build system.
Advantages:
  • Enables cross-platform discovery of available system libraries.
  • Provides automatic discovery and configuration of the toolchain.
  • Easier to use than its predecessor, plain Make.
  • Makes it easy to compile your own files into a shared library in a platform-agnostic way.
  • Can do more than Make and can perform more complex tasks.
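
As a sketch of how CMake stays compiler-independent, a minimal CMakeLists.txt might look like the following; the project and file names are illustrative:

```cmake
# CMakeLists.txt -- minimal sketch; names are illustrative
cmake_minimum_required(VERSION 3.10)
project(SampleApp CXX)

# compile our own sources into a shared library, platform-agnostically
add_library(samplelib SHARED src/samplelib.cpp)

add_executable(sample src/main.cpp)
target_link_libraries(sample PRIVATE samplelib)
```

Running `cmake -S . -B build` generates Makefiles, an Xcode project, or a Visual Studio solution depending on the chosen generator, and `cmake --build build` performs the actual build.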

3. Terraform:

Terraform is a tool for building, changing, and versioning infrastructure safely and efficiently. It can manage the most popular service providers as well as in-house solutions with ease. Its configuration files describe the infrastructure that Terraform should create in the datacenter.
From these files, Terraform generates an execution plan that describes what it will do to reach the desired state, and then executes that plan to build the described infrastructure. Terraform can detect changes made to the configuration and create incremental execution plans that can then be applied. It can manage both low-level components, such as compute instances, storage, and networking, and high-level components, such as DNS entries and SaaS features. Having discussed these features, let us now look at the advantages Terraform offers to individuals or organizations who adopt it.
Advantages:
  • Raw JSON templates (such as CloudFormation templates, where most lines are just braces and brackets) are hard to read and cannot be commented; Terraform's custom HCL makes templates easy to write, document, and comment.
  • User-data scripts can be kept in separate files, written exactly as you would write them locally on the server.
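
A minimal sketch of a Terraform configuration in HCL might look like this; the provider, region, and AMI ID are hypothetical placeholders:

```hcl
# main.tf -- minimal sketch; region and AMI ID are hypothetical
provider "aws" {
  region = "us-east-1"
}

resource "aws_instance" "web" {
  ami           = "ami-0123456789abcdef0"   # placeholder AMI
  instance_type = "t2.micro"
}
```

`terraform plan` then prints the execution plan described above, and `terraform apply` executes it; editing the file and re-running produces an incremental plan.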

4. Bower:

Bower is a package management system for client-side web programming; it depends on Node.js and npm and works with Git and GitHub repositories. Bower offers a generic, unopinionated solution to the problem of front-end package management, exposing the package dependency model via an API that can be consumed by a more opinionated build stack. Bower runs over Git and is package-agnostic.
Advantages:
  • There is no need to manually download each required dependency.
  • Packages that are part of the scaffolding can optionally be installed based on user prompts.
  • There is no need to commit dependencies to your version control.
  • Dependencies are managed declaratively by listing them in bower.json.
  • There is no need to maintain separate builds just to handle dependencies.
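
Declarative dependency management in Bower centers on the bower.json file; a minimal sketch (package names and versions are illustrative) might look like:

```json
{
  "name": "sample-frontend",
  "version": "1.0.0",
  "dependencies": {
    "jquery": "^3.3.1",
    "bootstrap": "^3.3.7"
  }
}
```

Running `bower install` then fetches everything listed here, so nothing needs to be downloaded manually or committed to version control.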

5. Gradle:

Gradle is an easier way to build your projects, and one of its biggest achievements is the elimination of XML from build scripts. To be precise, Gradle uses a domain-specific language (DSL) based on Groovy, a programming language that runs on the JVM. Like an Ant build script, Gradle's DSL lets you define both the core parts of the build file and the individual steps, called tasks; the difference is that you write them in Groovy rather than XML. Gradle is also very extensible, which makes it easy to define custom tasks.
Gradle build files are conventionally named build.gradle and start out by configuring the build. Having said all of this, you might wonder whether the Gradle team spent a lot of time reinventing the wheel. The answer is a firm no: Gradle relies on the existing Maven and Ivy dependency ecosystems. Gradle also chose a fairly reasonable name, task, for its build steps, as compared to Ant's obscure target and Maven's confusing phase.
Advantages:
  • Adding dependencies and applying plugins are among the easiest tasks here.
  • The build process is fully automated, and multi-module builds are supported.
  • It is easier to use than Ant or Apache Maven, supports finalizer tasks and dry runs, and rebuilds your project automatically based on source changes.
  • It integrates with most popular IDEs, such as IntelliJ IDEA and Eclipse, via plugins.
  • It can detect changes in Gradle files and update the project structure accordingly.
  • The learning curve is surprisingly flat.
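
To illustrate the Groovy-based DSL and a custom task, here is a minimal sketch of a build.gradle file; the dependency coordinates are illustrative:

```groovy
// build.gradle -- minimal sketch; coordinates are illustrative
plugins {
    id 'java'
}

repositories {
    mavenCentral()   // reuses the existing Maven dependency ecosystem
}

dependencies {
    testImplementation 'junit:junit:4.13.2'
}

// a custom task, written in Groovy rather than XML
tasks.register('hello') {
    doLast {
        println 'Hello from Gradle'
    }
}
```

`gradle build` compiles and tests the project; `gradle hello` runs the custom task.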

6. Apache ANT:

One of the oldest and strongest competitors in the current DevOps build space is Apache Ant, a Java library and command-line tool that drives processes described in Ant build files as targets and tasks. Its most common use is building Java applications, but Ant is equally comfortable as a build tool for non-Java languages such as C and C++. More generally, Apache Ant can be used to pilot any kind of process that can be described in terms of targets and tasks.

Advantages:
  • Provides great control over the whole build process and is ideal for projects that need their build tailored precisely to the project's requirements.
  • Apache Ant can be used to build almost anything.
  • It is one of the most customizable and configurable build tools available in this space of DevOps tools.
  • It is also an ideal build tool when your project mixes technologies such as C, C++, and Java.
  • Apache Ant provides many HTTP hooks for taking up tasks on that side as well.
  • On top of all the points above, it is extremely fast and works on almost all platforms.
  • For dependency management, Apache Ant relies heavily on Apache Ivy.
  • Ant streamlines the whole build into targets and tasks, which makes the build process easy to structure and reason about.
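
A minimal sketch of an Ant build file, expressed as targets and tasks (directory and project names are illustrative):

```xml
<!-- build.xml -- minimal sketch; names are illustrative -->
<project name="sample-app" default="jar" basedir=".">
  <target name="compile">
    <mkdir dir="build/classes"/>
    <javac srcdir="src" destdir="build/classes" includeantruntime="false"/>
  </target>

  <!-- the jar target depends on compile, so "ant jar" runs both -->
  <target name="jar" depends="compile">
    <jar destfile="build/sample-app.jar" basedir="build/classes"/>
  </target>
</project>
```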

7. Apache Maven:

Another build tool worth looking at is Apache Maven. Apart from being a build tool, Apache Maven is also a software project management and comprehension tool. Based on the concept of a Project Object Model (POM), it can manage a project's build, reporting, and documentation from a central piece of information. To be precise, Apache Maven is really two tools in one: a dependency manager and a very strong build tool. Like Apache Ant, it uses an XML build file, but at the same time it defines very rigid standards for itself.
Apache Maven can also be thought of as declarative: it lets you define what your build process should do rather than how it should do it. These features make it an ideal choice for both dependency management and build automation. Build automation becomes much cleaner and standardized across platforms and individuals, so less time is spent on the build process itself and more can be safely spent on the tasks at hand. It is fair to say that Apache Maven is the de facto standard for build automation in the Java world.

Advantages:
  • The whole project can be understood and configured by glancing through one important file, the pom.xml.
  • Apache Maven reduces the burden of keeping dependencies up to date, as its dependency management module takes care of such activities.
  • An added benefit is the ease with which you can build a Cucumber or TestNG project.
  • Once the configuration is done, developers have only one task left: development, with peace of mind.
  • Maven's project management strengths become very clear when it is left to handle a relatively large project end to end: build automation, dependency management, and so on.
  • Supports any kind of platform for the actual build process.
  • Provides wonderful support for unit testing and logging requirements.
  • Profile-based support lets the POM configuration vary based on the profile it is executed with.
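
As a sketch of how the dependency management module works, a dependency is declared once in the pom.xml and Maven fetches it (and its transitive dependencies) automatically; the coordinates below are illustrative:

```xml
<dependencies>
  <dependency>
    <groupId>junit</groupId>
    <artifactId>junit</artifactId>
    <version>4.13.2</version>
    <scope>test</scope>  <!-- only on the test classpath -->
  </dependency>
</dependencies>
```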

8. Apache Continuum:

Apache Continuum, a partner to the famous build tool Apache Maven, is a continuous integration server that runs builds on a user-configurable schedule. It shares many features with CruiseControl, such as the number of plugins available for use with it. One of its nicest features is that it identifies the developer whose commit broke the build, so that he or she can fix the issue.

Advantages:
  • The major advantages of using Apache Continuum are faster delivery, higher quality, and reduced risk in product deployments.
  • Provides easy integration with tools such as Jenkins, Chef, Git, Subversion, Docker, Selenium, AWS, and many more (the list goes on).
  • Provides better tracking of business value throughout the lifecycle of the project.
  • Has built-in integration with VersionOne Lifecycle, the ALM product from the VersionOne family, alongside an integration with JIRA as well.
  • Provides a better way to track stories, defects, and issues through the delivery lifecycle.

9. CruiseControl:

CruiseControl is yet another tool that fits into both the continuous integration and build spaces of the DevOps tools discussed here. It is both a continuous integration tool and an extensible framework for creating a custom continuous build process. It has plugins for a wide variety of source control and build technologies, which is what makes it such a strong tool. It also offers notification schemes such as email and even instant messaging. An interactive web UI provides all the necessary details about the previous builds of the current project.
Advantages:
  • CruiseControl is one of the best choices for automating build processes all the way from initial development environments up to production.
  • It is pretty straightforward, and it is a big advantage that anybody in the user group can pick up the tasks relevant to them.
  • Very strong community support, backed by the large number of add-ins and plugins available for use.
  • It can comfortably be considered one of the pioneers of automated builds and continuous integration.
  • The setup process is very easy, and the tool can be customized heavily according to user requirements.
  • Other tools, such as NAnt, can be configured through it to automate your builds.
  • Provides an interactive dashboard that gives you all the information about your builds and lets users drill down into fine-grained details.

10. Hudson:

Hudson can be classified into two categories of DevOps tools that an organization may choose to work with: it can be used as a continuous integration (CI) tool or as a build tool. In general, Hudson monitors the execution of repeated jobs, such as building software projects or jobs triggered on a time schedule (like cron jobs). Hudson can be relied upon for two important things: building and testing software projects continuously, and monitoring the execution of externally run jobs.
As a CI tool, Hudson (written in Java) runs in a servlet container such as Apache Tomcat or the GlassFish application server. It supports SCM tools such as CVS, Subversion, Git, Perforce, and RTC. Alongside these, it can execute Apache Maven and Apache Ant based projects as well as ad hoc shell scripts and Windows batch commands. Hudson gained prominence over CruiseControl and the other open-source build servers of its day.
There is an interesting story about how Jenkins was created as a fork of the Hudson project and then gained its own importance in the build and continuous integration space of the DevOps tools. The Hudson developers regarded Jenkins as the fork, while the Jenkins developers regarded Hudson as the fork; in time, interest in Hudson collapsed. To be precise, Hudson is no longer maintained, and Jenkins has comfortably replaced it.
Advantages:
  • One of the best advantages of Hudson is that it is open source and free to use.
  • The huge number of available plugins makes the tool easier to use, and installation is a breeze.
  • Software build automation, email notifications for jobs, and so on: the most commonly requested features are all available in Hudson.
  • It is multi-platform, feature-rich, reliable, and configurable.
  • Hudson can also be used as a continuous integration platform, combining the build process with running automated tests on the generated build.
  • It integrates with various other industry-standard tools such as Gerrit, Jenkins, and GitHub.
  • It had one of the largest communities working solely on plugins for general use with Hudson.

 

11. Apache Buildr:

Apache Buildr is a popular build system for Java-based applications that has been gaining importance over the years. It also supports other programming languages, such as Scala and Groovy. Apache Buildr is an intelligent build tool: you tell it only a few things, and it takes care of the rest of the responsibilities. It is fast and, at the same time, reliable, and it has grown an outstanding dependency management feature as well. Thanks to the ease of use it provides, you can comfortably build your own projects with it.
Advantages:
  • Easy to use across a varied range of projects.
  • Can build large, complex projects, and can also combine a set of smaller projects into one large build.
  • Preconfigured tasks that require almost no configuration keep the build scripts on the DRY side and simple to maintain.
  • Supports APT source code generation, Javadoc, and so on.
  • Provides the base features of any build tool: compiling, copying and filtering resources, and running the project's JUnit and TestNG test cases.
  • An intelligent dependency mechanism rebuilds only what has changed since the previous build.
  • Provides a drop-in replacement for Maven 2.0, using the same file layout and artifact specifications and maintaining local and remote repositories.
  • A very generic statement goes like this: anything that can be done through Ant can also be done through Apache Buildr.
  • There is no additional overhead of building plugins or further configuration; you simply write new tasks or functions to achieve what is required.
  • Because Apache Buildr is written in Ruby, you can use variables, functions, and objects in your build scripts, even for one-off tasks.
  • Upgrades to newer versions are easy to manage.
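
Because Buildr build scripts are plain Ruby, a minimal buildfile sketch looks like the following; the project name and dependency coordinates are illustrative:

```ruby
# buildfile -- minimal Buildr sketch; names and coordinates are illustrative
repositories.remote << 'https://repo1.maven.org/maven2'

define 'sample-app' do
  project.version = '1.0.0'
  # dependencies use Maven-style coordinates
  compile.with 'org.apache.commons:commons-lang3:jar:3.12.0'
  package :jar
end
```

Running `buildr package` compiles the sources, runs the tests, and produces the JAR.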
12. NAnt:

NAnt is yet another build tool; it is very similar to Ant, hence the name. It is a free build tool famously used for .NET projects. In theory, NAnt is very much like Make without Make's wrinkles and drawbacks; in practice, it is very much like Ant. NAnt is extended through task classes rather than through shell-based commands. Its configuration files are XML-based: when a target tree is invoked, all the necessary tasks get executed, and each task is executed by an object that implements a particular Task interface.
NAnt's build scripts are somewhat less expressive, but in exchange they can run unchanged across platforms, making your build scripts cross-platform. If the need of the hour is just to execute a shell command, NAnt provides a task that allows different commands to be executed depending on the operating system it runs on.

Advantages:
  • NAnt is widely used to compile, build, and run .NET projects.
  • NAnt uses an XML format for its build scripts, which makes them platform-independent.
  • NAnt can easily handle building modules written in different programming languages, such as C# or VB.NET.
  • NAnt provides wonderful integration with the unit testing tool NUnit 2.0.
  • NAnt, just like Ant, can manage other tasks such as creating files and directories, copying and deleting them, and ad hoc jobs such as sending emails and zipping files.
  • NAnt, just like Ant, can group a number of tasks under a target.
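
A minimal sketch of a NAnt build file for a C# project, showing the target tree with one task per step (file and project names are illustrative):

```xml
<!-- default.build -- minimal NAnt sketch; names are illustrative -->
<project name="SampleApp" default="build">
  <target name="build">
    <!-- the csc task compiles C# sources into an executable -->
    <csc target="exe" output="bin/SampleApp.exe">
      <sources>
        <include name="src/*.cs"/>
      </sources>
    </csc>
  </target>
</project>
```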



10. Explain the build life cycle, using an example (Java, .NET, etc.)
In this section, we discuss the Maven build life cycle in detail. A Maven build life cycle is a sequence of phases in which goals are executed in a well-defined order; a phase is simply the representation of a stage in the life cycle. Let's understand the build life cycle using Maven as the example.



The following are the three standard lifecycles in Maven.
• clean
• default (or build)
• site
Clean Lifecycle of Maven Build
Open a command prompt and navigate to the project path (C:\work\project\maven_demo) created in the last tutorial. Now execute the mvn post-clean command. This directs Maven to invoke the clean lifecycle, which consists of the following phases.
• pre-clean
• clean
• post-clean



Default (or Build) Lifecycle of Maven Build
This is the main life cycle of a Maven build, used to build the application. The default (or build) lifecycle has the following 23 phases.



1. validate: Validates the project and checks that all information necessary to complete the build is available.
2. initialize: Initializes the build state, e.g., sets properties.
3. generate-sources: Generates any source code to be included in the compilation phase.
4. process-sources: Processes the source code, e.g., to filter any values.
5. generate-resources: Generates the resources to be included in the package.
6. process-resources: Copies and processes the resources into the destination directory, ready for the packaging phase.
7. compile: Compiles the source code of the project.
8. process-classes: Post-processes the files generated by compilation, e.g., bytecode enhancement or optimization of Java classes.
9. generate-test-sources: Generates any test source code to be included in the test compilation phase.
10. process-test-sources: Processes the test source code, e.g., to filter any values.
11. generate-test-resources: Creates the resources needed for testing.
12. process-test-resources: Copies and processes the resources into the test destination directory.
13. test-compile: Compiles the test source code into the test destination directory.
14. process-test-classes: Post-processes the files generated by test compilation.
15. test: Runs the tests using a suitable unit testing framework (e.g., JUnit, TestNG).
16. prepare-package: Performs any operations necessary to prepare a package before the actual packaging.
17. package: Takes the compiled code and packages it in its distributable format (e.g., a JAR, WAR, or EAR file).
18. pre-integration-test: Performs actions required before integration tests are executed, e.g., setting up the required environment.
19. integration-test: Processes and deploys the package, if necessary, into an environment where integration tests can be executed.
20. post-integration-test: Performs actions required after integration tests have been executed, e.g., cleaning up the environment.
21. verify: Runs any checks to verify that the package is valid and meets quality criteria.
22. install: Installs the package into the local repository.
23. deploy: Copies the final package to the remote repository for sharing with other developers and projects.

There are a few Maven concepts to keep in mind:
• When we invoke a Maven phase through a Maven command (say, mvn compile), all phases up to and including that phase are executed.
• Depending on the packaging type (EAR, JAR, WAR, etc.), different Maven goals are bound to different Maven phases.
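
The first concept can be sketched with a few commands; each invocation runs every earlier phase of the default lifecycle as well:

```
mvn compile    # runs validate ... compile
mvn package    # runs validate ... package, including test
mvn install    # runs every phase up to and including install
```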
Site Lifecycle of Maven Build
The Maven Site plugin is mostly used to create reports and deploy the project site. The site lifecycle has the following phases.
• pre-site
• site
• post-site
• site-deploy



11. What is Maven: a dependency/package management tool, a build tool, or something more?


Maven, a Yiddish word meaning accumulator of knowledge, originally started as an attempt to simplify the build processes in the Jakarta Turbine project. There were several projects, each with its own Ant build files that were all slightly different, and JARs were checked into CVS.
I hope you now have a clear idea about dependency management and build automation tools. I'll come straight to the point by listing the objectives that Maven attempts to achieve (these are mentioned on its official website):
1.     Make the build process easy: programmers do not need to know much about how Maven works or provide every configuration setting manually.
2.     Provide a uniform build system: working on one Maven project is sufficient preparation for working on other Maven projects.
3.     Provide quality project information: Maven lets users specify project information in the POM and makes it available whenever needed; some information is generated from the project's source.
4.     Provide guidelines for development best practices.
5.     Allow transparent migration to new features: provide the ability to take advantage of updated versions of dependencies and plugins for the same project (programmers can use updated or newly added plugins).


12. Discuss how Maven uses convention over configuration, explaining Maven's approach to managing configurations

Convention over configuration is one of the main design philosophies behind Apache Maven. Let's go through a few examples.
A complete Maven project can be created using the following configuration file (pom.xml):
<project>
  <modelVersion>4.0.0</modelVersion>
  <groupId>com.packt</groupId>
  <artifactId>sample-one</artifactId>
  <version>1.0.0</version>
</project>

 

 

Note

The Maven POM file starts with the <project> element. Always define the <project> element with the schema. Some tools can't validate the file without it:
<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi=………
         xsi:schemaLocation="…">
The pom.xml file is the heart of any Maven project and is discussed in detail in Chapter 2, Understanding the Project Object Model (POM). Copy the previous configuration element and create a pom.xml file out of it. Then, place it in a directory called chapter-01, and then create the following child directories under it:
·         chapter-01/src/main/java
·         chapter-01/src/test/java
Now, you can place your Java code under chapter-01/src/main/java and test cases under chapter-01/src/test/java. Use the following command to run the Maven build from where the pom.xml is:
$ mvn clean install
This little configuration that you found in the sample pom.xml file is tied up with many conventions:
·         Java source code is available at {base-dir}/src/main/java
·         Test cases are available at {base-dir}/src/test/java
·         The type of the artifact produced is a JAR file
·         Compiled class files are copied to {base-dir}/target/classes
·         The final artifact is copied to {base-dir}/target
·         http://repo.maven.apache.org/maven2 is used as the repository URL
If someone needs to override the default, conventional behavior of Maven, then it is possible too. The following sample pom.xml file shows how to override some of the preceding default values:
 
<project>
  <modelVersion>4.0.0</modelVersion>
  <groupId>com.packt</groupId>
  <artifactId>sample-one</artifactId>
  <version>1.0.0</version>
  <packaging>jar</packaging>

  <build>
    <sourceDirectory>${basedir}/src/main/java</sourceDirectory>
    <testSourceDirectory>${basedir}/src/test/java</testSourceDirectory>
    <outputDirectory>${basedir}/target/classes</outputDirectory>
  </build>
</project>




13. Discuss the terms build phase, build life cycle, build profile, and build goal in Maven


Maven Build Life Cycle
What is a build life cycle? The sequence of steps defined in order to execute the tasks and goals of a Maven project is known as the build life cycle. Since version 2.0, Maven has been organized around the build life cycle: these steps are well defined so that the desired output is produced after a successful execution of the life cycle.
Maven comes with three built-in build life cycles, as shown below:
  • clean: cleans the project (for a fresh build and deployment)
  • default: handles the complete build and deployment of the project
  • site: handles generating the project's documentation.
Now let us dig deeper into the phases involved in each of these built-in life cycles.


Build Life Cycle of clean phase
As mentioned above, the clean lifecycle is used to clean up the project and make it ready for a fresh compile and deployment. The command used is mvn post-clean. When this command is invoked, Maven executes the following phases in order:
1.     pre-clean
2.     clean
3.     post-clean
Maven's clean is a goal bound to the clean phase; executing it cleans up the output directory (the target folder) by deleting all the compiled files.


NOTE: whenever a Maven command for any life cycle is invoked, Maven executes the phases up to and including the invoked phase. E.g., when mvn clean is invoked, Maven executes pre-clean and clean, but not post-clean, and no compile, deployment, or site phase is invoked.


Build Lifecycle (Default)
Below are the phases of Maven's default (build) lifecycle. These phases are invoked through Maven commands.
  • validate: Validates the project and ensures that all the information required for the build is available.
  • generate-sources: Generates any source code to be included in the compilation.
  • process-sources: Processes the source code, in case some filter needs to be applied.
  • generate-resources: Generates resources to be included in the package.
  • process-resources: Copies the resources to the destination folder, ready for packaging.
  • compile: Compiles the project's source code.
  • process-classes: Performs bytecode enhancements on the class files generated by compilation.
  • generate-test-sources: Generates any test source code to be included in the test compilation.
  • test-compile: Compiles the test source code into the test destination directory.
  • test: Runs the tests using a suitable test framework. Note: these test cases are not considered for packaging and deploying.
  • prepare-package: Performs any final changes and validations before the final packaging.
  • package: Packages the successfully compiled and tested code into a distributable format: JAR, WAR, or EAR.
  • pre-integration-test: Performs actions before integration tests are executed; this may involve setting up environmental changes for the app.
  • integration-test: Deploys the application to an environment where integration tests can be run.
  • post-integration-test: Cleans up the environment that was set up in the pre-integration-test phase.
  • verify: Performs quality checks and ensures the required criteria are met.
  • install: Installs the application in the local repository; any other project can then use it as a dependency.
  • deploy: Copies the final package to a remote repository, perhaps as a formal release, making it available to other developers.


Site Lifecycle (site)
Apart from cleaning, compiling the source code, and building a deployable format of the application, Maven has a lifecycle that does more than these: the site lifecycle, one of Maven's vital features, generates detailed documentation for a Java project.
This project documentation has its own dedicated phases, listed below:
  • pre-site
  • site
  • post-site
  • site-deploy
The Maven command used to generate the documentation for a given project is mvn site. When this command is invoked, Maven calls the 'Doxia' document generation and other report-generating plugins.
Doxia is a framework used by Maven for content generation; it generates content both statically and dynamically.


Build Profiles in Maven
A profile in Maven is simply a subset of elements that lets you customize builds for a particular environment. Profiles are also portable across different build environments.
A build environment basically means a specific environment set up for a production or development instance. During development, developers tend to work against a development database, while in production the live database is used.
So, in order to configure these instances, Maven provides the feature of build profiles. Any number of build profiles can be configured, and each can override any other settings in the pom.xml.
These profiles effectively modify the pom.xml at build time, i.e., they configure separate environments for the development and production instances. Based on the parameters passed, the corresponding profiles are activated; e.g., profiles can be set up for the dev, test, and production phases.
Types of build profiles
The types of build profiles in Maven, and where each is defined:
  • Per project: defined in pom.xml
  • Per user/developer: defined in the Maven settings.xml (%USER_HOME%/.m2/settings.xml)
  • Global: defined in the global Maven settings.xml (%M2_HOME%/conf/settings.xml)


Build Portability
As mentioned above, different environments can be set up based on the requirements of a given project; in this way the portability of the project can be secured and handled effectively.
Build portability is the ability of a project to be compiled and deployed successfully across different environments, applying the different environmental configurations each one requires. A portable project should work without any customization of properties.
A portable project also eliminates the complexities and issues associated with contributing to the project.


Activating profiles
Below are the ways in which Maven build profiles can be activated or triggered:
  • Explicitly, via the command line
  • Through Maven settings
  • Based on environment variables
  • Based on operating system settings
  • Based on present or missing files
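
A sketch of how two hypothetical profiles might be declared in the pom.xml and then activated, either explicitly or through a property; the ids, property names, and database URLs are illustrative assumptions:

```xml
<profiles>
  <profile>
    <id>dev</id>
    <properties>
      <db.url>jdbc:h2:mem:devdb</db.url>  <!-- hypothetical dev database -->
    </properties>
  </profile>
  <profile>
    <id>prod</id>
    <activation>
      <property>
        <name>env</name>
        <value>prod</value>
      </property>
    </activation>
    <properties>
      <db.url>jdbc:mysql://db.example.com/app</db.url>
    </properties>
  </profile>
</profiles>
```

`mvn install -P dev` activates the dev profile explicitly, while `mvn install -Denv=prod` triggers the prod profile through its property-based activation.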



15. Identify and discuss some other contemporary tools and practices widely used in the software industry


1.     Wrike
2.     Monday.com
3.     Clarizen
4.     ProjectManager
5.     MeisterTask
6.     Zoho Projects
7.     Workamajig Platinum
8.     Backlog
9.     Bitrix24
10.     Nuvro

