16 Jul 2021 - tsp
Last update 16 Jul 2021
3 mins
Sometimes one wants to keep track of Jenkins build status in a semi-realtime fashion. A nice way to do this is using the ATOM feeds (which are sometimes incorrectly called RSS feeds). This can usually be done with a feed reader - it just has to be capable of performing authentication and accessing the feed using a POST request. Sometimes, though, it might be interesting to publish a specific set of feeds to clients that should not know the credentials - or the build server at all. A simple solution to that problem is to run a shell script (or a periodic Jenkins job itself) that fetches the feed(s) of a given user and pushes them to a reachable webserver. This webserver can then implement access control or allow public access to the feeds - how this is handled mainly depends on the task one wants to solve and on the confidentiality of the content.
Basically this is really simple. One just cannot rely on Jenkins responding in a way that triggers the client to perform authentication, so the credentials have to be supplied up front - and programmatic access does not use the user's password but an authentication token.
First one has to generate an API token for the given user. This can be done after logging into the web interface of Jenkins. Select your username in the upper right corner of the UI, switch to the Configure section, scroll to the API Token area and select Add new token. Give this token a meaningful name - this helps to remove unnecessary (or compromised) tokens later on. Copy the token - it won't be visible later and it won't be stored by Jenkins in clear text, so it's not recoverable once it has been hidden for the first time.
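To check that the token works one can perform a quick authenticated request against the Jenkins JSON API (hostname, username and token below are of course placeholders):

curl --user myusername:MYAPITOKEN https://jenkins.example.com/api/json

If authentication succeeds this returns a JSON description of the Jenkins instance instead of an error page.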
In this case I'm going to use curl over the usual fetch since one has to perform authentication without being asked - and send a POST request.
Basically all that one has to do boils down to a simple
curl -X POST --user ${username}:${apitoken} https://jenkins.example.com/rssAll > rssAll.xml
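When this runs unattended it might be a good idea to let curl fail on HTTP errors and write the output file explicitly instead of relying on shell redirection - a small variation of the command above (username, token and hostname are again placeholders):

curl -f -s -X POST --user ${username}:${apitoken} -o rssAll.xml https://jenkins.example.com/rssAll

With -f a failed request (for example due to a revoked token) does not leave a truncated or HTML error page behind as the published feed.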
After that one can use a simple tool such as rsync to publish the feeds to the desired webserver - this is the same method that I also use to deploy my static webpage to the public-facing webserver.
The pipeline script that I’m using basically is:
pipeline {
    agent none

    stages {
        stage('Fetch RSS feeds') {
            agent {
                label 'freebsd && amd64'
            }
            stages {
                stage('Fetch ALL feed') {
                    steps {
                        sh 'curl -X POST --user USERNAME:APITOKEN http://jenkins.example.com/rssAll > rssAll.xml'
                    }
                }
                stage('Fetch FAILED feed') {
                    steps {
                        sh 'curl -X POST --user USERNAME:APITOKEN http://jenkins.example.com/rssFailed > rssFailed.xml'
                    }
                }
                stage('Deploy to webserver') {
                    steps {
                        sh 'rsync -av rssAll.xml deployrss@www.example.com:/usr/www/www.example.com/www/jenkins/'
                        sh 'rsync -av rssFailed.xml deployrss@www.example.com:/usr/www/www.example.com/www/jenkins/'
                    }
                }
            }
        }
    }
}
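Instead of hardcoding USERNAME and APITOKEN in the pipeline one can also keep them in the Jenkins credentials store and inject them via the environment block. The following is just a sketch of the fetch stage - jenkins-feed-token is a hypothetical ID of a username/password credential that would have to be created first:

stage('Fetch RSS feeds') {
    agent {
        label 'freebsd && amd64'
    }
    environment {
        /* Hypothetical credentials ID; FEED_USR and FEED_PSW are provided automatically */
        FEED = credentials('jenkins-feed-token')
    }
    stages {
        stage('Fetch ALL feed') {
            steps {
                sh 'curl -X POST --user "$FEED_USR:$FEED_PSW" http://jenkins.example.com/rssAll > rssAll.xml'
            }
        }
    }
}

This keeps the token out of the pipeline script and out of the job's console output.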
One can of course also use an even simpler cron job to perform the same four basic steps - a Jenkins job, on the other hand, fits nicely into the same infrastructure and can be triggered by all the usual means like the completion of other jobs, external build triggers, etc.
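A minimal sketch of such a cron-driven variant could look like the following - paths, hostnames, username and token are placeholders and would have to be adapted:

#!/bin/sh
# Fetch the Jenkins ATOM feeds with an API token and publish them via rsync.
set -e

WORKDIR=/tmp/jenkinsfeeds
mkdir -p "$WORKDIR"
cd "$WORKDIR"

curl -f -s -X POST --user USERNAME:APITOKEN https://jenkins.example.com/rssAll -o rssAll.xml
curl -f -s -X POST --user USERNAME:APITOKEN https://jenkins.example.com/rssFailed -o rssFailed.xml

rsync -av rssAll.xml rssFailed.xml deployrss@www.example.com:/usr/www/www.example.com/www/jenkins/

Running it for example every five minutes from the crontab of an unprivileged user would then just be a single line:

*/5 * * * * /usr/local/bin/fetchjenkinsfeeds.sh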