Trifork Blog

Category ‘Business’

Trainings at GOTO Academy - AngularJS, Spring, Axon, iBeacon

August 7th, 2014 by
(http://blog.trifork.com/2014/08/07/trainings-at-goto-academy-angularjs-spring-axon-ibeacon/)


Take a look at the upcoming trainings schedule at GOTO Academy.

We will also have FREE evening events on the topics of iBeacon and AngularJS. Stay tuned!

FREE EVENTS

Read the rest of this entry »

My Goto Amsterdam 2014

July 17th, 2014 by
(http://blog.trifork.com/2014/07/17/my-goto-amsterdam-2014/)


People who have worked with me know I'm a bit of a technical conservative. I'm very wary of quickly adopting the latest fads and trends because I've seen the collective hype and the following disillusionment too many times, including software being built with the then-latest-hype framework or platform and a year later being stuck with now-obsolete technology that only the original developers and a handful of other people still have any real experience with.

For the same reason I've avoided software tech conferences in the past years. A few visits to conferences several years ago each left me with the feeling that I'd heard a lot about a lot, but that it wasn't really going to improve my daily software development work.

Luckily, Goto Amsterdam 2014 was different.

Many, if not all, of the talks were relevant to my actual, day to day, software development job. I learned about looking at Agile in a different way. I heard people speak on real life problems being solved with actual, current, widely adopted technology. I even listened to talks that weren't really that much about software development at all.

So let me walk you through my Goto Friday.

Read the rest of this entry »

Brightcenter, the multi-user classroom solution for educative app(lication)s

May 15th, 2014 by
(http://blog.trifork.com/2014/05/15/brightcenter-the-multi-user-classroom-solution-for-educative-applications/)

Tablets inside the classroom


For years now, PCs have been present in the classroom, providing children and students with digital learning environments. These environments help teachers by offering interactive learning aids with which children and students can work and learn independently. In primary schools, classrooms typically have just a couple of PCs so that children can learn how to use a computer. Many of these PCs run special software written specifically for kids, with which they can practice basic mathematics, writing, language, etc. Most of the time these PCs are not even connected to the Internet; because of the nature of the applications, everything runs on the local machine. Read the rest of this entry »

Scrum: Just follow the solutions provided below!

May 8th, 2014 by
(http://blog.trifork.com/2014/05/08/scrum-just-follow-the-solutions-provided-below/)

Doing Scrum is easy! Just follow and implement the checklist below and everything will go well, right?

  • Deliver in cycles of two to six weeks
  • Work in a team sized six plus or minus three
  • Every day stand together answering what was done since the last standup and what will be done before the next
  • The standup should take no more than 15 minutes
  • Every sprint you have to review your process in a retro
  • The length of your retro should not exceed the length of the sprint in hours divided by 40
  • Stories in sprint should not take more effort than a team can do in two days
  • The stories should be broken up in tasks that can be completed in two hours max

Read the rest of this entry »

GOTO AMSTERDAM NEWSLETTER

May 2nd, 2014 by
(http://blog.trifork.com/2014/05/02/goto-amsterdam-newsletter-2/)


It's Time to Sign up!

We hope you can make it to our next GOTO Amsterdam (June 18th-20th).

90% of the conference schedule is now final and live. The next early-bird discount, saving €200, ends May 14th. Get your seat now!

We encourage you to take a look at the training courses, which take place on June 18th. Leading industry experts will provide a day of hands-on tutorials on a wide range of topics.

See the complete training overview here.

Sign up now

Read the rest of this entry »

Secure Digital Assessments with QTI - demo

December 12th, 2013 by
(http://blog.trifork.com/2013/12/12/secure-digital-assessments-with-qti-demo/)

Over the last year we have been working very hard on our new and improved QTI Assessment Delivery Engine: version 3. With the previous versions we were more or less limited to the QTI rendering and implemented a lot of custom code around it to get things working. Many of these features have been rewritten and built into the core of the version 3 product, of course taking IMS QTI conformance into account.

Read the rest of this entry »

Lessons learned how to do Scrum in a fixed price project

August 22nd, 2013 by
(http://blog.trifork.com/2013/08/22/lessons-learned-how-to-do-scrum-in-a-fixed-price-project/)

As a Scrum Master I find doing Scrum in combination with a fixed price, fixed functionality and a fixed deadline somewhat tricky. However, it is still common that in many projects fixed price is simply the norm. For instance, this is often the case in public tenders for government or educational institutions, such as the procurement of a new software system, to name an example.

So if you and your company win the tender, it's up to you and your team to deal with the “fixed everything” aspect of the project. Interested in how to deal with ongoing changes in requirements and deadlines, and how to keep both the customer and the team happy? Read on: in this blog I will share with you our experiences with fixed price projects and Scrum.
Read the rest of this entry »

Latest news from Trifork Amsterdam

June 17th, 2013 by
(http://blog.trifork.com/2013/06/17/latest-news-from-trifork-amsterdam/)

Just 1 day to go until #3 GOTO Amsterdam

The team behind GOTO Amsterdam is raring to go, and this time it's already set to be the best year to date: not only in terms of an impressive speaker line-up and a record number of delegates, but also because the sponsors have pulled out all the stops this year.

We at Trifork Amsterdam & Elasticsearch will be partners in crime this year and have a host of fantastic FREE giveaways, including training seats & conference tickets to be redeemed across the globe. There's also a chance to hear about the customers using Elasticsearch and get insights into how best to implement Elasticsearch in a production environment. So if you're at the event come and visit us (hint: if you want to locate us, follow the scent of delicious warm waffles!).

Read the rest of this entry »

Ansible - Simple module

April 18th, 2013 by
(http://blog.trifork.com/2013/04/18/ansible-simple-module/)

In this post, we'll review Ansible module development.
I have chosen to write a maven module; not very fancy, but it provides good support for the subject.
This module will execute a maven phase for a project (designated by its pom.xml).
You can always refer to the Ansible Module Development page.

Which language?

The de facto language in Ansible is Python (you benefit from the provided boilerplate), but any language can be used. The only requirement is being able to read/write files and write to stdout.
We will be using bash.

Module input

The maven module needs two parameters, the phase and the pom.xml location (pom).
For non-Python modules, Ansible provides the parameters in a file (passed as the first argument) with the following format:
pom=/home/mohamed/myproject/pom.xml phase=test

You then need to read this file and extract the parameters.

In bash you can do that in two ways:
source $1

This can cause problems because the whole file is evaluated, so any code in there will be executed. In that case we trust that Ansible will not put any harmful stuff in there.

You can also parse the file using sed (or any way you like):
eval $(sed -e "s/\([a-z]*\)=\([a-zA-Z0-9\/\.]*\)/\1='\2'/g" $1)
This is good enough for this exercise.

We now have two variables (pom and phase) with the expected values.
We can continue and execute the maven phase for the given project (pom.xml).

Module processing

Basically, we can check if the parameters have been provided and then execute the maven command:


#!/bin/bash

# Read the key=value parameters Ansible passes in the file given as the first argument
eval $(sed -e "s/\([a-z]*\)=\([a-zA-Z0-9\/\.]*\)/\1='\2'/g" $1)

if [ -z "${pom}" ] || [ -z "${phase}" ]; then
  echo 'failed=True msg="Module needs pom file (pom) and phase name (phase)"'
  exit 0
fi

# Capture the maven output in a temporary file
maven_output=$(mktemp /tmp/ansible-maven.XXX)
mvn ${phase} -f ${pom} > ${maven_output} 2>&1
if [ $? -ne 0 ]; then
  echo "failed=True msg=\"Failed to execute maven ${phase} with ${pom}\""
  exit 0
fi

echo "changed=True"
exit 0

In order to communicate the result, the module needs to return JSON.
To simplify the JSON output step, Ansible also accepts key=value pairs as output.
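For instance, a key=value line such as

failed=True msg="Failed to execute maven test with /home/mohamed/myproject/pom.xml"

is converted by Ansible into the corresponding JSON document, as shown in the 'Use it' section below.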

Module output

You may have noticed that output is always returned. If an error occurred, failed=True is returned together with an error message.
If everything went fine, changed=True (or changed=False) is returned.

If the maven command fails, a generic error message is returned. We can change that by parsing the content of the maven output file and returning only what we need.

In some situations, your module doesn't do anything (no action is needed). In that case you'll need to return changed=False to let Ansible know that nothing happened (this is important if the rest of the tasks in your playbook depend on it).
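As a minimal sketch of such a no-op branch (using a hypothetical skip_if_exists parameter that is not part of the module above), this could look like:

# Hypothetical parameter skip_if_exists=<path>: if that file already exists,
# report that nothing was changed and stop.
if [ -n "${skip_if_exists}" ] && [ -e "${skip_if_exists}" ]; then
  echo "changed=False"
  exit 0
fi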

Use it

You can run your module with the following command:

ansible buildservers -m maven -M /home/mohamed/ansible/mymodules/ --args="pom=/home/mohamed/myproject/pom.xml phase=test" -u mohamed -k

If it goes well, you get something like the following output:

localhost | success >> {
"changed": true
}

Otherwise:

localhost | FAILED >> {
"failed": true,
"msg": "Failed to execute maven test with /home/mohamed/myproject/pom.xml"
}

To install the module, put it in the directory pointed to by ANSIBLE_LIBRARY (by default /usr/share/ansible), and you can start using it inside your playbooks.
It goes without saying that this module has some dependencies: an obvious one is the presence of maven. You can ensure that maven is installed by adding a task to your playbook before using this module.
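For example, on a Debian-based host such a task could look like the sketch below (the package name 'maven' is an assumption and may differ per distribution):

  - name: Install maven package (Debian based)
    action: apt pkg='maven' state=installed
    only_if: "'$ansible_pkg_mgr' == 'apt'"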

Conclusion

Module development is as easy as what we briefly saw here, and it can be done in any language. That's another point I wanted to make, and it is one of the things that makes Ansible very nice to use.

Ansible - Example playbook to setup Jenkins slave

April 2nd, 2013 by
(http://blog.trifork.com/2013/04/02/ansible-example-playbook-to-setup-jenkins-slave/)

As mentioned in my previous post about Ansible, we will now proceed with writing an Ansible playbook. Playbooks are files containing instructions that can be processed by Ansible; they are written in YAML. In this blog post I will show you how to create a playbook that sets up a remote computer as a Jenkins slave.

What do we need?

We need a few components to let a computer execute Jenkins jobs:

  • JVM 7
  • A dedicated user that will run the Jenkins agent
  • Subversion
  • Maven (with our configuration)
  • Jenkins Swarm Plugin and Client

Why Jenkins Swarm Plugin

We use the Swarm Plugin because it allows a slave to auto-discover a master and join it automatically. We therefore don't need any actions on the master.

JDK7

We now proceed with adding the JDK7 installation tasks. We will not use a packaged version (for example a dedicated Ubuntu PPA or the RedHat/Fedora repos); we will use the JDK7 archive from oracle.com.
There are multiple steps required:

* We need wget to be installed. This is needed to download the JDK
* To download the JDK you need to accept the terms; we can't do that in a batch run, so we need to wrap the wget call in a shell script that sends extra HTTP headers
* Set the platform-wide JDK links (java and jar executables)

Install wget

We want to verify that wget is installed on the remote computer and, if not, install it from the distribution repos. To install packages, there are modules available: yum and apt (there are others, but we will focus on these).
To run the correct task depending on the ansible_pkg_mgr value, we can use only_if:

  - name: Install wget package (Debian based)
    action: apt pkg='wget' state=installed
    only_if: "'$ansible_pkg_mgr' == 'apt'"

  - name: Install wget package (RedHat based)
    action: yum name='wget' state=installed
    only_if: "'$ansible_pkg_mgr' == 'yum'"

Download JDK7

To download JDK7 from oracle.com, we need to accept the terms, but we can't do that in a batch run, so we need to work around it:

Create a script that contains the wget call:

#!/bin/bash

wget --no-cookies --header "Cookie: gpw_e24=http%3A%2F%2Fwww.oracle.com" http://download.oracle.com/otn-pub/java/jdk/7/$1 -O $1

The parameter is the archive name.

  - name: Copy download JDK7 script
    copy: src=files/download-jdk7.sh dest=/tmp mode=0555

  - name: Download JDK7 (Ubuntu)
    action: command creates=${jvm_folder}/jdk1.7.0 chdir=${jvm_folder} /tmp/download-jdk7.sh $jdk_archive

These two tasks copy the script to /tmp and then execute it. $jdk_archive is a variable containing the archive name; it can differ depending on the distribution and the architecture.

Ansible provides a way to load variable files:

  vars_files:

    - [ "vars/defaults.yml" ]
    - [ "vars/$ansible_distribution-$ansible_architecture.yml", "vars/$ansible_distribution.yml" ]

This will load the file vars/defaults.yml (note that all these files are written in YAML) and then look for the file vars/$ansible_distribution-$ansible_architecture.yml.
The variables are replaced by their values on the remote computer; for example, on an Ubuntu 32-bit i386 distribution, Ansible will look for the file vars/Ubuntu-i386.yml. If it doesn't find it, it will fall back to vars/Ubuntu.yml.

For example, Ubuntu-i386.yml would contain:

---
jdk_archive: jdk-7-linux-i586.tar.gz

Fedora-i686.yml would contain:

---
jdk_archive: jdk-7-linux-i586.rpm

Unpack/Install JDK

You will notice that for Ubuntu we use the tar.gz archive but for Fedora we use an rpm archive. That means that the installation of the JDK will differ depending on the distribution.

  - name: Unpack JDK7
    action: command creates=${jvm_folder}/jdk1.7.0 chdir=${jvm_folder} tar zxvf ${jvm_folder}/$jdk_archive --owner=root
    register: jdk_installed
    only_if: "'$ansible_pkg_mgr' == 'apt'"

  - name: Install JDK7 RPM package
    action: command creates=${jvm_folder}/latest chdir=${jvm_folder} rpm --force -Uvh ${jvm_folder}/$jdk_archive
    register: jdk_installed
    only_if: "'$ansible_pkg_mgr' == 'yum'"

On Ubuntu, we just unpack the downloaded archive, but for Fedora we install it using rpm.
You might want to review the condition (only_if), particularly if you use SuSE.
jvm_folder is just an extra variable that can be global or per distribution; you need to place it in a vars file.
Note that the command module takes a 'creates' parameter. It is useful if you don't want to rerun the command: the module checks whether the file or directory provided via this parameter exists, and if it does, it skips the task.
In this task we also use register. With register you can store the result of a task in a variable (in this case we called it jdk_installed).

Set links

To make the java and jar executables accessible to anybody (particularly our jenkins user) from anywhere, we set symbolic links (actually, we just install an alternative).

  - name: Set java link
    action: command update-alternatives --install /usr/bin/java java ${jvm_folder}/jdk1.7.0/bin/java 1
    only_if: '${jdk_installed.changed}'

  - name: Set jar link
    action: command update-alternatives --install /usr/bin/jar jar ${jvm_folder}/jdk1.7.0/bin/jar 1
    only_if: '${jdk_installed.changed}'

Here we reuse the stored register, jdk_installed. We can access its changed attribute: if the unpacking/installation of the JDK did do something, changed will be true and the update-alternatives command will be run.

Cleanup

To keep things clean, you can remove the downloaded archive using the file module.

  - name: Remove JDK7 archive
    file: path=${jvm_folder}/$jdk_archive state=absent

We are done with the JDK.

Obviously you might want to reuse this process in other playbooks, and Ansible lets you do that:
just create a file with all these tasks and include it in a playbook.

- include: tasks/jdk7-tasks.yml jvm_folder=${jvm_folder} jdk_archive=${jdk_archive}

jenkins user

Creation

With the user module, we can easily handle users.

  - name: Create jenkins user
    user: name=jenkins comment="Jenkins slave user" home=${jenkins_home} shell=/bin/bash

The variable jenkins_home can be defined in one of the vars files.

Password less from Jenkins master

We first create the .ssh folder in the jenkins home directory with the correct rights. Then, with the authorized_key module, we add the public key of the jenkins user on the jenkins master to the authorized keys of the jenkins user on the new slave. Finally, we verify that the new authorized_keys file has the correct rights.

  - name: Create .ssh folder
    file: path=${jenkins_home}/.ssh state=directory mode=0700 owner=jenkins

  - name: Add passwordless connection for jenkins
    authorized_key: user=jenkins key="xxxxxxxxxxxxxx jenkins@master"

  - name: Update authorized_keys rights
    file: path=${jenkins_home}/.ssh/authorized_keys state=file mode=0600 owner=jenkins

If you want jenkins to execute any command with sudo without having to provide a password (basically updating /etc/sudoers), the lineinfile module can do that for you.
That module checks 'regexp' against 'dest'; if it matches, it doesn't do anything, otherwise it adds 'line' to 'dest'.

  - name: Jenkins can run any command with no password
    lineinfile: "line='jenkins ALL=NOPASSWD: ALL' dest=/etc/sudoers regexp='^jenkins'"

Subversion

This one is straightforward.

  - name: Install subversion package (Debian based)
    action: apt pkg='subversion' state=installed
    only_if: "'$ansible_pkg_mgr' == 'apt'"

  - name: Install subversion package (RedHat based)
    action: yum name='subversion' state=installed
    only_if: "'$ansible_pkg_mgr' == 'yum'"

Maven

We will put maven under /opt so we first need to create that directory.

  - name: Create /opt directory
    file: path=/opt state=directory

We then download the maven3 archive; this time it is simpler, as we can directly use the get_url module.

  - name: Download Maven3
    get_url: dest=/opt/maven3.tar.gz url=http://apache.proserve.nl/maven/maven-3/3.0.4/binaries/apache-maven-3.0.4-bin.tar.gz

We can then unpack the archive and create a symbolic link to the maven location.

  - name: Unpack Maven3
    action: command creates=/opt/maven chdir=/opt tar zxvf /opt/maven3.tar.gz

  - name: Create Maven3 directory link
    file: path=/opt/maven src=/opt/apache-maven-3.0.4 state=link

We again use update-alternatives to make mvn accessible platform-wide.

  - name: Set mvn link
    action: command update-alternatives --install /usr/bin/mvn mvn /opt/maven/bin/mvn 1

We put our settings.xml in place by creating the .m2 directory on the remote computer and copying a settings.xml into it (any already existing settings.xml is backed up).

  - name: Create .m2 folder
    file: path=${jenkins_home}/.m2 state=directory owner=jenkins

  - name: Copy maven configuration
    copy: src=files/settings.xml dest=${jenkins_home}/.m2/ backup=yes

Clean things up.

  - name: Remove Maven3 archive
    file: path=/opt/maven3.tar.gz state=absent

Swarm client

You first need to install the Swarm plugin as mentioned here.
Then you can proceed with the client installation.

First create the jenkins slave working directory.

  - name: Create Jenkins slave directory
    file: path=${jenkins_home}/jenkins-slave state=directory owner=jenkins

Download the Swarm Client.

  - name: Download Jenkins Swarm Client
    get_url: dest=${jenkins_home}/swarm-client-1.8-jar-with-dependencies.jar url=http://maven.jenkins-ci.org/content/repositories/releases/org/jenkins-ci/plugins/swarm-client/1.8/swarm-client-1.8-jar-with-dependencies.jar owner=jenkins

When you start the swarm client, it will connect to the master and the master will automatically create a new node for it.
There are a couple of parameters for starting the client. You still need to provide a login/password in order to authenticate, and you obviously want this information to be parameterizable.

First we need a script/configuration to start the swarm client at boot time (SysV init, upstart or systemd, it is up to you). In that script/configuration, you need to add the swarm client run command:

java -jar {{jenkins_home}}/swarm-client-1.8-jar-with-dependencies.jar -name {{jenkins_slave_name}} -password {{jenkins_password}} -username {{jenkins_username}} -fsroot {{jenkins_home}}/jenkins-slave -master https://jenkins.trifork.nl -disableSslVerification &> {{jenkins_home}}/swarm-client.log &
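For illustration only, a minimal SysV-style sketch of what templates/jenkins-swarm-client.tmpl could look like with that command embedded (hypothetical; the actual template is not shown in this post):

#!/bin/bash
# Hypothetical minimal /etc/init.d/jenkins-swarm-client (SysV style)
case "$1" in
  start)
    su - jenkins -c "java -jar {{jenkins_home}}/swarm-client-1.8-jar-with-dependencies.jar -name {{jenkins_slave_name}} -password {{jenkins_password}} -username {{jenkins_username}} -fsroot {{jenkins_home}}/jenkins-slave -master https://jenkins.trifork.nl -disableSslVerification &> {{jenkins_home}}/swarm-client.log &"
    ;;
  stop)
    pkill -f swarm-client-1.8-jar-with-dependencies.jar
    ;;
esac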

Then, using the template module, we process the script/configuration template (using Jinja2) into a file that is placed at the given location.

  - name: Install swarm client script
    template: src=templates/jenkins-swarm-client.tmpl dest=/etc/init.d/jenkins-swarm-client mode=0700

The file mode is 700 because we have a login/password in that file, and we don't want people (who can log on to the remote computer) to be able to see it.

Instead of putting jenkins_username and jenkins_password in vars files, you can prompt for them.

  vars_prompt:

    - name: jenkins_username
      prompt: "What is your jenkins user?"
      private: no
    - name: jenkins_password
      prompt: "What is your jenkins password?"
      private: yes

And then you can verify that they have been set.

  - fail: msg="Missing parameters!"
    when_string: $jenkins_username == '' or $jenkins_password == ''

You can now start the swarm client using the service module and enable it to start at boot time.

  - name: Start Jenkins swarm client
    action: service name=jenkins-swarm-client state=started enabled=yes

Run it!

ansible-playbook jenkins.yml --extra-vars "host=myhost user=myuser" --ask-sudo-pass

By passing '--ask-sudo-pass', you tell Ansible that 'myuser' requires a password to be typed in order to be able to run the tasks in the playbook.
'--extra-vars' passes a list of variables to the playbook. The beginning of the playbook will look like this:

---
 
- hosts: $host
  user: $user
  sudo: yes

'sudo: yes' tells Ansible to run all tasks as root, acquiring the privileges via sudo.
You can also use 'sudo_user: admin' if you want Ansible to sudo to admin instead of root.
Note that if you don't need facts, you can add 'gather_facts: no'; this will speed up the playbook execution, but it requires that you already know everything you need about the remote computer.
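For example (a sketch based on the header above), disabling fact gathering would look like this:

---

- hosts: $host
  user: $user
  sudo: yes
  gather_facts: no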

Conclusion

The playbook is ready. You can now easily add new Jenkins slave nodes thanks to Ansible.