World class IT support

I attended a talk by James Stanger from CompTIA at the SITS17 conference in 2017 (I totally forgot I’d written these notes until now). He talked a lot about the sort of skills and knowledge that world class Service Desk staff needed to have, and it dovetailed nicely with a list I drew up myself a couple of years earlier, and also with ideas from other presentations from the same conference.

What follows are the notes I took at the time (with my observations in italics).

Tech support today

  • Evolving endpoint – IOT
  • Analytics
  • Automation
  • The cloud and mobility – BYOD
  • Cyber / security
  • Complexity of privacy
  • Device diversity
  • We are all knowledge workers

The Helpdesk is the first line of defence for security.
More Mac and Linux now than 10 years ago.
More business units than ever before.
Service desk and service management is a growing industry and there are more jobs available. This should be an increasing trend.
We need a more diverse skill set – an increase in cloud, and the move from Helpdesk to service desk.

Technical skills that Service Desk staff need

  • Security
  • Database
  • PC
  • Storage
  • Backups
  • Cloud
  • Telecoms
  • Web dev
  • Server
  • Mobile
  • Etc.

Top skills

  • Troubleshooting
  • App management
  • PC
  • Security
  • Data
  • Mobility
  • Repair
  • Ticketing
  • Cloud
  • Permissions and directory services

It’s about accessing data from any device rather than the device itself. If a device breaks, you just access the data from another device.

Five essential skills

  1. Cybersecurity
  2. Linux
  3. Programming – shell scripts and python
  4. Networking – TCP/IP, network segmentation, VPN
  5. Soft skills – customer service mindset, project management, ability to turn negatives into positives.

Can we take these 5 skills and turn them into a blueprint for the sort of person we want to recruit into IT support roles (and I’m talking across the institution here, not just Service Desk people)? We should also add some stuff from the other lists though – especially around cloud, troubleshooting and application management. Do our people need to be less skilled but more knowledgeable? Or is it just that they need to be skilled and knowledgeable about different things?

I look at this list and see a lot of things that have previously been on our “nice to have” list. Things like cybersecurity, Linux and the ability to script solutions to IT problems have never been things we have tried to recruit at first line, and maybe even most second line teams. That is going to have to change if we are going to meet the demands of our customers and provide the sort of service they need from us.

Other things:

We want someone who loves problems

Can this person really look beneath the hood?

Can they see around the corners of a problem? Trend analysis and documenting new things and fixes to hard problems.

Hackers will go after the points where one technology interacts with another (and people count as a technology here – the interface between people and technology is a big vulnerability).

Security trends

  • Notice the unknown (Ransomware, social engineering etc.)
  • Zero day attacks
  • Malware

Top security skills

The usual, plus social engineering and authentication methods.

Attacks occur where people and technology converge. This isn’t news.

We have too much information and too much data; if we had less, there would be less to attack.

A good Helpdesk person will spot trends in security, and will help visualise data in a way that other people can understand.

Multi-factor authentication – something you know and something you hold. The third factor is something you are (fingerprint, iris scan, face etc.).

Understanding monitoring and performance via command line tools is an important skill. All the stuff I have been talking about for years.
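
As an illustration (these examples are mine rather than from the talk), this is the sort of thing I mean – the basic Linux performance and monitoring commands:

top                    # live view of CPU and memory usage per process
free -h                # memory and swap usage in human-readable units
df -h                  # disk space per filesystem
iostat 5               # CPU and disk utilisation sampled every 5 seconds (from the sysstat package)
ping -c 4 example.com  # quick check of network reachability and latency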

Saying things the right way is absolutely vital. Change geek speak into plain language. This is a key skill.

Give suggestions / say no / transfer a call

  • Identify the problem
  • Define the problem
  • Explore and examine the options
  • Act on the solutions
  • Look back at the solution and the consequences, or learn from the problem.

80% technical, 20% soft skills. And technical skills are much easier to teach, so recruit people who already have the soft skills.

Using Pi-hole as an ad blocker

I’ve used various ad-blockers over the years, and while they have all largely worked, they have also started to slow my browser down (especially on older computers). I read about Pi-hole a few times, but didn’t get around to actually installing it until this week. Now I have installed it I’m wishing I hadn’t waited, because not only does it lead to a largely ad-free browsing experience, but it also makes my older and slower computers noticeably faster.

Pi-hole should work on any Debian or Red Hat derived Linux distribution, but I went for the obvious solution of putting it on one of my always-on Raspberry Pis (which also runs WordPress and a command-line IRC client). To install just type curl -sSL https://install.pi-hole.net | bash in a terminal, and then visit the /admin URL of the machine it’s installed on to view the admin console.

Configuring machines to use it is just a case of defining a custom DNS server (how to do that varies from OS to OS, but it was trivial on Ubuntu and ChromeOS – I’ve not tried anything else yet). Just add the IP address of the Pi as a DNS server, and it will block anything on the block list, and then forward everything else on to be dealt with as normal. If you want to do this for everything on your network then there are various options detailed here that range from configuring one machine to routing everything through Pi-hole.
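
A quick way to check that a client is actually using the Pi-hole is to query it directly and then look up a domain you’d expect to be on the block list (the IP address below is just a placeholder for your own Pi):

dig example.com @192.168.1.10   # query the Pi-hole directly
nslookup doubleclick.net        # if the client is using the Pi-hole, a blocked domain should come back as the Pi’s address or 0.0.0.0 rather than a real IP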

The admin page will tell you how much blocking is going on. With me it was about 1% of all traffic, and it will even tell you which domains it is blocking so you can whitelist anything you actually want to see (not all ads are bad). I don’t really notice a performance increase on my main computer, but older and slower computers definitely seem snappier, and can maintain about twice as many open tabs before they start to slow down, which is a bonus feature that I wasn’t really expecting.
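
Whitelisting can be done through the admin page, or (as far as I can tell from the documentation) from the command line on the Pi itself – the exact syntax may change between releases:

pihole -w example.com       # whitelist a domain
pihole -b ads.example.net   # add a domain to the block list (placeholder domain)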

How I use social media

I started this as a bit of an FAQ for strangers who try and get me to connect with them on Linkedin, or who want to post guest content on my blog, but I thought it was actually worth putting together something that articulates who I choose to follow and interact with on social media, and what criteria I use to make decisions around this sort of thing.

First things first, I have a number of communication channels that I use regularly. I have a public blog and Twitter account, locked Facebook and Google+ accounts, and two email accounts (one for work, one for everything else). I also have a Linkedin profile that I largely use for tracking my professional network, and writing nice things about people I know who are engaged in job hunting, but that I don’t really use for communication as such.

I’ll start with my public social media. I’ve maintained a blog for the best part of 10 years, and anyone is welcome to read it, subscribe to email alerts, read it through an RSS reader, or consume it in any other way. What you won’t be able to do is leave comments (I turned those off years ago), or write content for my blog (because it’s mine and it’s part of my public internet presence so I want it to reflect me).

My Twitter account is also public, and I’m not choosy about who follows it, but I’ll generally only follow people back if I know them, I’m interested in the sort of content they post, or I’m interested in having actual conversations with them over social media (Twitter mentions and DMs are the only synchronous online conversations I regularly engage in). I will initiate connections, and often follow accounts that look unloved in the hope that I can help people I like see the wonders of Twitter (and thus talk to them more). I also cross-post to Twitter every time I write a blog post, and am happy to engage with people about the content of the blog post via Twitter. Twitter is also where to look for music recommendations, random snippets of life, occasional banter, and sporadic requests for social contact. It’s also the one place I’ll still post when I’m neglecting everything else (140 characters helps with this).

I suppose Linkedin classes as public social media too, although I use it in a very different way. I occasionally cross-post work-related content from Twitter, but I mainly maintain it to track my professional network, endorse and recommend people I know, and to do anything else I can think of to help other people with their job hunting and career progression. I’ll connect with anyone I’ve ever known professionally, anyone I know personally whose area of interest overlaps mine (so people who work in Universities, or are interested in psychology or personality, or work in IT, or are involved in any sort of people, project or service management), and anyone I don’t know who looks like they might be a useful addition to my professional network (although I never initiate these connections). I’m a lot pickier about people in recruitment and sales, especially if I don’t know them. I also tend not to initiate connections with people who are direct reports or where I am perceived to be more powerful than them in an organisation (although I’ll happily reciprocate invitations if they come in). That’s not a hard and fast rule though – it very much depends what sort of personal connection I’ve already got with the person. I’m also quite sporadic with using Linkedin, and have not done any endorsements for about 3 months (I need to fix that soon).

I use Google+ to communicate with a specific (fairly large) group of people I’ve known for ages. Most of the friendships predate G+, and have followed me through the IRC, Livejournal, Facebook, and Buzz days, and I suspect anyone else would regard my account as being unused, as all my content is locked. I initiate G+ connections a lot, and check the site several times a day (although I have email notifications turned off globally), and while I’ll accept requests from anyone I know, I don’t promise to post anything too interesting.

I’ve used Facebook for a long time, but these days I only really cross-post from Twitter, comment on what other people post, or use it to organise my social life with groups of people who don’t use G+ or Twitter. My friends list is a weird mix of family, friends, colleagues, and people I’ve not seen for years. I’ll generally accept requests from anyone I know (including people I know through work), although I’m fairly bad at initiating requests unless I’ve identified someone who I want to connect with and it looks like Facebook is the only option. I also have notifications turned off, and rarely use the IM function, so it’s not the best method if you need a quick response (weirdly, that’s probably still email).

I like email a lot (if you really don’t have my address then it’s somewhere on this site I’m sure). I try and maintain inbox zero, although I am quite discerning about what I’ll reply to (I get a lot of email), and a lot of what I get actually gets converted to a Trello card if it requires me to do something that takes longer than about five minutes. Before there was social media I used email a lot for socialising – now I find that doesn’t happen unless I know the person really well or the topic of conversation is confidential, but I’m not against using email for social contact if that’s what someone is most comfortable with.

One day I’ll sit down and consolidate my social networks so that they represent everyone I know (for someone with such a clear preference for introversion I know a lot of people), but that day is not today, and I suspect that it’s a job I’ll not get round to for a long time. In the meantime I hope this blog post gives people an idea of what they can expect if they choose to engage with me on social media.

Building the Debian Handbook

What follows are instructions for creating a local HTML copy of the Debian Administrator’s Handbook (which is a very useful source of information for anyone working with any Debian derivative, including Ubuntu and Raspbian). All work related to this project was done on a Raspberry Pi Zero running Raspbian, so I suspect it will work on anything running any Debian derivative (although Ubuntu 16.04 is the only other system I’ve tested this on so far).

Open up a terminal, and issue the following commands to get hold of the source code:

sudo apt install git
sudo git clone git://anonscm.debian.org/debian-handbook/debian-handbook.git

Install the packages required for building:

sudo apt install publican publican-debian

Build the html files:

cd debian-handbook/
sudo ./build/build-html

It might take a while to build, especially on the sort of hardware I’ve been using. This might be the point to make a cup of tea.

Copy the HTML files into the root of your web server:

sudo cp -R publish/en-US/Debian/8/html/debian-handbook/ /var/www/html/

At this point you should be able to read the handbook by browsing to the debian-handbook directory on your web server (i.e. the hostname or IP address of the server followed by /debian-handbook/).
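
This assumes a web server is already installed and serving from /var/www/html. If it isn’t, installing Apache first (or previewing the build output with Python’s built-in web server) should do the trick:

sudo apt install apache2
# or, for a quick preview without a full web server:
cd publish/en-US/Debian/8/html/debian-handbook/
python3 -m http.server 8000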

Simple CCTV setup using a Raspberry Pi

This weekend I’ve been setting up my latest Raspberry Pi (a version III, in a blue lego case, running Ubuntu) to display a video stream of what’s going on outside my house so I can watch out for deliveries etc.

It’s something I’ve done before on different hardware, but I thought it was worth documenting as it’s a good project for any model of Raspberry Pi, and requires nothing more than the Pi, a USB webcam (or camera module), and 15 minutes of your time. I’m using a piece of software called motion, which is available in the Debian/Raspbian/Ubuntu repositories.

Install motion:

sudo apt-get install motion

Enable motion to start at boot:

sudo nano /etc/default/motion

Find the line that says start_motion_daemon=no and change it to start_motion_daemon=yes.

Enable the stream to be viewed from other computers on the local network, and also make the output a little bigger:

sudo nano /etc/motion/motion.conf

Change the following values:

daemon on
width 640
height 480
framerate 100
stream_localhost off

Reboot, and then browse to port 8081 on the computer you’ve set it up on.
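
If the stream doesn’t appear, it’s worth checking that motion is actually running and restarting it after any config changes (the IP address below is a placeholder for your Pi’s address):

sudo service motion status    # check the daemon is running
sudo service motion restart   # restart after changing motion.conf
# then, from another machine on the network, browse to:
# http://192.168.1.20:8081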

Making professional presentations

Over the last couple of weeks I’ve been writing a presentation that I have to give as part of my ILM5 qualification. I give presentations fairly regularly (in fact I’ve given two since I started writing this one), but this one is different in that I’m being assessed on every aspect of it, and the assessment criteria are fairly specific.

As part of this process I attended a one day workshop covering all the key aspects of presentation skills, which also gave us the opportunity to practise standing up and talking in front of other people who then provided feedback. I found this useful, and none of the feedback I received was a surprise. The only thing I could look to change about my delivery is the amount I move while I’m presenting, but I suspect I’m not going to be able to move less without feeling really self-conscious and detracting from the quality of the presentation – I’m certainly willing to give it a try though.

We didn’t have to create slides as part of the training, but the other piece of feedback I generally get is around my slides, and specifically how they don’t contain a great deal of text and therefore often require further information to make sense to anyone who wasn’t actually at the presentation. I’ve not changed the style of my slides as a result of this, but I have worked on ensuring they flow in a sensible chronological order, and I’ve also prepared a longer slide set that intersperses the slides I’ll be showing with slides containing what I’ll actually be saying. Hopefully this version of the presentation will be useful as a handout, and will add context to the slides I’ll be showing (which are largely diagrams, graphs and charts). I’m a big believer that slides should enhance a talk rather than acting as a script, and I’d much rather the audience were listening to what I say rather than reading it off a screen.

Over the years I’ve experimented with a few different ways of creating slides, although in recent years I’ve either presented from a PDF file or created them straight in Keynote (for more complex presentations). This time I ended up doing a bit of both, as I wanted to create the slides/notes as markdown files, but also wanted to take advantage of Keynote’s presentation mode. I created my slides as a markdown file, converted them to a PDF using Pandoc and Beamer (the process is detailed here), and then used a tool called PDF to Keynote to convert them. I prefer working in markdown because it allows me to convert the same file to a Word document, PDF, ebook and presentation, but it means I have to go through a couple of extra layers of processing to be able to present from Powerpoint. I’ve made sure I can do that this time, although it’s not usually something I’ll bother with, especially if I’m the only person presenting.

My plan is to present from my laptop in Keynote and to use my phone as a remote (or just to use the trackpad of the laptop as it’s a fairly small room). Mitigations for technical difficulties include PDF, Keynote and Powerpoint versions on a USB device and in Dropbox, a second laptop in my bag, and adaptors to allow me to connect either my phone or iPad to the projector and present from that (I had to do that once when my laptop decided to reboot just as I was about to present). I’ll also have the source markdown with me so I have the ability to create slides on the fly should I need to. A lot of this may be overkill, but I’d rather be prepared.

An updated guide to using Pandoc for document conversion

I wrote about Pandoc last year, but I’m using it more and more and I’ve found myself editing the original post a fair few times. This is the updated 2016 version that gathers together useful commands I’ve learned so far.

Last year I found myself needing to do a lot of document conversion, and maintaining documentation that needs to be available in a variety of formats (HTML, Word documents, Markdown and PDF). My tool of choice for this sort of thing is Pandoc, which is available for Windows, Mac OS X and Linux, although most of my usage so far has been on Linux and Mac OS X (it’s a command line package that can output to Dropbox, so it doesn’t matter where it runs really).

There are instructions for installing Pandoc on quite a few platforms. I’ve found that following these is generally enough, although it’s worth installing the latest .deb package rather than the one in the repositories.

On Debian/Ubuntu I also add the texlive-latex-extra package, but that’s largely because it gives me a specific Beamer theme I like to use.
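
For reference, on those systems that’s just:

sudo apt install texlive-latex-extra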

If you’re using Pandoc on Mac OS X there is one more command you’ll need to issue prior to the first time you want to create a PDF file:

sudo ln -s /Library/TeX/texbin/pdflatex /usr/local/bin/

This will ensure Pandoc knows where to find pdflatex. If this step isn’t followed then you’ll likely get an error message along the lines of pandoc: pdflatex not found. pdflatex is needed for pdf output.

Pandoc works for me because I write everything in markdown, and Pandoc is great at taking markdown and converting it into almost anything else. It’s also good if you need to create a PDF, a Word document and a slide show from the same document. The syntax is fairly simple for most document types:

For example:

pandoc input.md -s -o output.docx
pandoc input.md -s -o output.html
pandoc input.md -s -o output.epub

Conversion to PDF works the same, although I’m not a fan of wide margins, so I tweak it slightly:

pandoc -V geometry:margin=1in input.md -s -o output.pdf

For a Beamer slide show you’ll need something like:

pandoc -t beamer input.md -V theme:metropolis -o output.pdf

Pandoc does a lot more, but the documentation is great, and the commands above should be enough to get you started. If you want to try out the functionality in a web browser then http://pandoc.org/try/ should be able to handle most types of conversions.

Setting up WordPress

The following instructions describe how I install WordPress on Ubuntu. The instructions may differ slightly for other server environments, but the basic principles should be the same. This requires shell access to the server, but once it’s finished the WordPress instance(s) should be capable of being administered through a web browser.
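
These steps assume a working web server, PHP and MySQL are already in place. If they aren’t, something along these lines should cover the prerequisites on Ubuntu (exact package names vary between releases, so treat this as a sketch rather than a definitive list):

sudo apt-get install apache2 mysql-server php libapache2-mod-php php-mysql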

Part 1 – Installing WordPress

Download WordPress and move it to /var/www/html/ so it runs from the root directory of the web server.

cd /var/www/html
sudo apt-get install unzip
sudo wget http://wordpress.org/latest.zip
sudo unzip latest.zip
cd wordpress
sudo mv * /var/www/html/
cd /var/www/html
sudo mv index.html index.html_old

Log into mysql:

mysql -u root -p

Create a new database (calling it something different to the example below)

mysql> CREATE DATABASE wordpress;
mysql> GRANT ALL PRIVILEGES ON wordpress.* TO wp_user@localhost IDENTIFIED BY "<password>";
mysql> exit

Install WordPress, following the instructions at http://codex.wordpress.org/Installing_WordPress. Remember to make a note of the username and password you set up for the admin account.

At one point in the installation (it will be obvious) you may need to issue the following command to manually create a config file.

sudo nano wp-config.php
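
The bits that matter in wp-config.php are the database settings, which need to match what was created in the previous step. Based on the bundled wp-config-sample.php, with placeholder values, that boils down to something like:

define('DB_NAME', 'wordpress');
define('DB_USER', 'wp_user');
define('DB_PASSWORD', '<password>');
define('DB_HOST', 'localhost');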

Once WordPress is installed, navigate to /var/www (cd .. or cd /var/www/) and issue the following command:

sudo chown -R www-data html

This will ensure you can install plugins and themes through the WordPress web interface.


Part 2 – Configuring WordPress

Log in using the account you just created.

Install and activate some plugins (Acunetix WP Security, Jetpack by WordPress.com, WP-Markdown and WordPress Importer), via the web interface in WordPress (if you’ve not issued the command above then this won’t work).

Navigate to the left hand menu item for Acunetix WP Security, tick all boxes and click on “update settings”. This will apply all recommended security changes.

Use WordPress Importer to import content (posts, tags, files) from other instances of WordPress.

If you want to compose posts in markdown then you’ll need to navigate to Settings –> Writing and tick the boxes for the interfaces you want to default to markdown.

Note: you won’t be able to activate Jetpack unless the server is visible on the public internet.

Remove the “Hello World!” post and the sample page (both should be obvious if they have not been removed!)


Part 3 – WordPress Multisite (optional)

This allows you to run more than one blog/site in a single instance of WordPress. The instructions at http://codex.wordpress.org/Create_A_Network are good, and are mostly enough to get it up and running.

There are two more things to do on Ubuntu servers:

Enable mod_rewrite

sudo a2enmod rewrite

Open /etc/apache2/apache2.conf and find the part that says:

<Directory /var/www/>
    Options Indexes FollowSymLinks
    AllowOverride None
    Require all granted
</Directory>

Replace AllowOverride None with AllowOverride All
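
Apache needs to pick up both of these changes, so finish with:

sudo service apache2 restart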


Part 4 – SSL (optional)

Enable mod_rewrite (if you’ve not already done it as part of step 3)

sudo a2enmod rewrite

Enable ssl

sudo a2enmod ssl
sudo a2ensite default-ssl.conf

Amend your apache config to enable pages to be served on port 443

sudo nano /etc/apache2/sites-available/default-ssl.conf

<VirtualHost _default_:443>
ServerName yourdomain.com
DocumentRoot /var/www/html

#Enable/Disable SSL for this virtual host.
SSLEngine on
SSLProtocol all -SSLv2 -SSLv3

Amend the port 80 config (e.g. 000-default.conf) to redirect to 443

sudo nano /etc/apache2/sites-available/000-default.conf

DocumentRoot /var/www/html
ServerName yourdomain.com
Redirect "/" "https://yourdomain.com"

Restart apache

sudo service apache2 restart

Change the site address in the WordPress settings (through the UI) to https://yourdomain.com

Create a certificate request (csr):

sudo mkdir /etc/apache2/ssl
sudo openssl req -new -newkey rsa:2048 -nodes -keyout /etc/apache2/ssl/yourdomain.key -out /etc/apache2/ssl/yourdomain.csr

Country Name (2 letter code) [XX]:GB
State or Province Name (full name) []:West Midlands
Locality Name (eg, city) [Default City]:Your city
Organization Name (eg, company) [Default Company Ltd]:Your Company
Organizational Unit Name (eg, section) []:
Common Name (eg, your name or your server's hostname) []:yourdomain.com

Copy the certificate request somewhere sensible

sudo cp /etc/apache2/ssl/yourdomain.csr /home/username/yourdomain.csr

What you’ll need to do then is grab the certificate request from your home directory, save it somewhere safe and then do whatever you do in your organisation/environment to generate/buy/get a root certificate (there are so many different ways).

Once you have a root certificate, follow instructions at http://askubuntu.com/questions/73287/how-do-i-install-a-root-certificate

Configure apache to use certificate

sudo nano /etc/apache2/sites-available/default-ssl.conf

Then add/edit the following lines:

SSLEngine on
SSLProtocol all -SSLv2 -SSLv3
SSLCertificateFile  /etc/apache2/ssl/yourdomain.com.cer
SSLCertificateKeyFile /etc/apache2/ssl/yourdomain.key

Restart apache

sudo service apache2 restart

At that point your site should serve web pages on https with no error messages.

What I did on my holidays

I’m quite pleased with what I’ve achieved over the last two weeks. This holiday was supposed to be a chance to recharge prior to a very busy period at work, but I think I’ve actually been about as productive as I normally am (just in different ways).

I’ve done a lot of technical things while I’ve been off, including dismantling (and throwing away) 5 old computers, building a server/workstation using a lot of spare parts and a new case/motherboard, and setting up WordPress Multisite on the new server (and then building a site to host my Continuous Professional Development Portfolio which I have to do as part of ILM5). I’ve also decluttered my study, set up a new Raspberry Pi Zero, written a lot of notes about fixing specific technical issues I’ve encountered whilst doing all these things, and ripped about 100 CDs to MP3.

The decluttering has felt very liberating, and I plan on doing more of it (and throwing out more computers) in the summer. Of course, all this means is that I have an even larger pile of old hard drives and memory (even after using 3 of each in the new server) that I need to dispose of at some point.

As well as technical things I’ve also visited the Sea Life Centre, been out for two meals, and booked tickets for various shows. I’ve certainly spent a lot less money than a two week holiday abroad would have cost, and I’m feeling like my technology setup is moving in the right direction again.

WordPress troubleshooting

I’ve done a fair bit of WordPress troubleshooting over the last few weeks, including moving sites from one server to another and upgrading server operating systems. While a lot of this probably isn’t that interesting, I did come across a few things that might help other people undertaking similar tasks.

My method for moving content between sites generally involves exporting the content to XML, installing a fresh instance of WordPress on the new server, importing the content using the WordPress Importer plugin, and then seeing what doesn’t work. What usually doesn’t work is uploaded images and files, but it’s just a case of copying across the whole contents of /var/www/html/wp-content/uploads to the new server to fix that.
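
One way to do that copy is rsync over ssh (the hostname and paths below are just placeholders):

rsync -avz /var/www/html/wp-content/uploads/ user@newserver:/var/www/html/wp-content/uploads/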

Talking of which, Debian (and derivatives) changed the default location for websites from /var/www to /var/www/html last year. If you upgrade Debian to the latest version and all your WordPress sites break, then it should just be a case of editing /etc/apache2/sites-available/000-default.conf so that the DocumentRoot value is set back to /var/www/.
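
In practice that’s a one-line change followed by an Apache restart:

sudo nano /etc/apache2/sites-available/000-default.conf

DocumentRoot /var/www

sudo service apache2 restart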

My last revelation is around hardwired links. When moving a site to a new server (with a new URL) there are likely to be many references to the old site URL in config files and in the WordPress database. There is a plugin called Search and Replace which does all the heavy lifting for you, replacing the old URL with the new one everywhere it needs to appear. I’ve used this a few times now and it works really well.

I’m also half way through writing up how I install sites from scratch, but that’s going to be quite a lengthy document and probably deserves a separate post.