Business Process Modelling with BPMN

Having moved away from software development and design towards the management of IT processes and services, I have found that Business Process Modelling is better suited than UML to describing the kinds of processes I now encounter. This is not surprising, as UML is more IT-centric and I needed more flexibility to capture the realities of how things work in real life. Yes, you can use a combination of UML diagrams to capture a real-world process, but this is not as intuitive for non-IT people, whom I encounter far more often these days.

My first attempts at modelling a business process used activity diagrams, sequence diagrams and use case models. The use case model defined all of the actors involved – both people and systems – and the sequence diagram showed the message flow between them.

Figure 1 – Use Case Model Diagram
Figure 2 – UML Activity Diagram

However, this was still too low-level and I needed something that would capture the “big picture”. After all, a high-level process (e.g. a sales process) can naturally be broken down into sub-processes. Each level of detail provides meaning to the different layers of the organization as appropriate. Of course, UML is still important for helping to formally describe the resulting IT systems implementation.

The nice thing about BPMN is that you can practice it all the time. With UML you generally want to be working on something IT-related, but BPM can be applied to any process. For instance, how do people get something to eat for lunch? Do they eat out, or do they bring a lunch box? This process can be described using BPMN.

Figure 3 – Process for eating lunch using BPMN

If BPM interests you and you are reading this article, the chances are that you are a pioneer in your organisation. BPMN is an industry-standard notation, so if you are learning it, the quicker you learn the rules and follow best practice, the more rewarding the result will be. I highly recommend the following two books:

Spending time formally documenting a process may seem like a waste of time in some ways. In the real world, situations change, people adapt or take shortcuts, and the process model may be out of date in no time, but your BPMN model should not try to capture every detail or variation anyway. More importantly, modelling a process using BPMN is an excellent aid to understanding how a given process currently works (even if it is dysfunctional). This analysis can be much more complete when using a comprehensive notation like BPMN: if something can't be modelled in BPMN, then there is probably a wrong assumption or something hidden in the process that needs to be investigated. BPMN gives you the confidence to pursue a process analysis to its proper conclusion.

I will finish with an example of a process model I was grappling with recently. Systems integration is often done using messaging, typical of a Service Oriented Architecture. Files are transferred from one server to another and then imported into the recipient software system. (As this is an IT-centric problem, I could of course have used UML to model it.) File transfer is either push or pull; in this case it is push: the sender places files on the recipient's file system, and the receiver checks for new files every few seconds and processes any that it finds.

In BPMN, modelling the interaction between systems is called a collaboration. The collaboration is named after the process, in this case "File transfer", and the lanes are named after the actors. The first thing I had to figure out was whether to use events to show that a message had arrived. At the same time, the recipient is busy polling the directory looking for files, and will continue to do so as long as the service is available.

The sender and receiver are modelled as two separate processes. The sender sends the file using a message activity with a message flow symbol attached.

Figure 4 – File transfer using BPMN

The message is sent to the recipient’s polling subprocess which can generate a non-interrupting escalation event (ooh!) (the little arrow in the dotted circle) to trigger the next activity that processes the files. The subprocess is looped (the little circular arrow), so it will continue to run after the escalation occurs (forever in this case).

So how did I know how to use a non-interrupting escalation? Well, the non-interrupting part just says that the event does not interrupt the subprocess flow, i.e. polling will continue even when files have been found. The escalation part just means that the polling process has found files and needs someone else to deal with them, so it notifies the parent process (the escalation).
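
For comparison, here is a minimal sketch (in Python, with a hypothetical incoming directory and a made-up process_file helper) of what the looped polling subprocess and the escalated "process files" activity might look like in code. It illustrates the pattern in the model, not the actual integration.

import time
from pathlib import Path

INCOMING = Path("/data/incoming")   # hypothetical directory the sender drops files into
POLL_INTERVAL = 5                   # seconds between polls ("every few seconds")

def process_file(path: Path) -> None:
    # Placeholder for the activity triggered by the escalation.
    print(f"Processing {path.name}")
    path.unlink()  # remove the file once it has been handled

def poll_forever() -> None:
    # The looped polling subprocess: it never stops, it only hands work on.
    while True:
        for new_file in sorted(INCOMING.glob("*")):
            # The non-interrupting escalation: hand the file over for processing
            # while polling itself carries on afterwards.
            process_file(new_file)
        time.sleep(POLL_INTERVAL)

if __name__ == "__main__":
    poll_forever()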

The diagrams were produced using Visio Professional 2016 which includes a function to validate the diagram according to BPMN 2.0 (“Check diagram”).

The agile way to migrate from Gmail to Office 365

I was recently working on a migration from Google Apps to Office 365 and was not happy with the big-bang approach for migrating email suggested by Microsoft. It was just too big a risk, since email is a critical service for communication within the company – and with customers. It also meant that everyone would start using Office 365 at the same time, which provided no opportunity to improve the migration process once it was set in motion.

So I worked out a way to do an agile migration, where users could be migrated in batches and the administrator could refine the migration process with each iteration, Kaizen-style. I decided to publish a generalised procedure that will hopefully be of use to others looking for a better way. At the very least, it should provide some insights into how to plan your own Office 365 migration.

Thanks to Finn McCann for reviewing the document and providing valuable insights. Enjoy!

Retro games

So I bought a Raspberry Pi 3 and installed OpenELEC's build of Kodi, the media centre application. This would finally replace the Windows Media Center (WMC) PC that I'd mothballed some time ago. Back then I had decided to convert my DVDs into ISOs in order to capture any extra material that came with the film, and (apart from WMC) Kodi was the only mainstream app I could find that could play back ISOs.

I have had a Synology DS412+ for a while now to back up files, photos and home videos, and I had also transferred my ISO collection to it. The Synology does have DLNA support and I can navigate the video/music libraries on it from my Samsung TV. However, the DS412+ with its four bays is more for business users, and has limited transcoding support compared to the Synology “Play” variants. But even the Play devices cannot compare to Kodi’s transcoding capabilities, and Synology cannot play back ISOs. Converting to some other container format seemed like the wrong way to solve the problem.

Kodi

Once the OpenELEC bundle was installed on the SanDisk 32GB micro SD card and the Pi was connected to the TV, Kodi started up automatically. Kodi can be navigated using the TV's remote control thanks to HDMI-CEC, eliminating the need for an extra remote control. The setup was fairly straightforward; I needed to do the following:

  1. Make my Synology media available in Kodi. There are some default sources set up in Kodi that point to the local filesystem; I edited these to point to the relevant folders on the NAS using the Synology's NFS service.
  2. Get Kodi to fit properly on the screen. On larger screens Kodi can be too big, but there is an option called Zoom to resize it to fit the screen. I set this to -4%, which was perfect.
  3. Display the time and date correctly. Firstly, Kodi needs to be synced with an NTP server so that it displays the correct time and date. I also wanted it to display both the time and the date in the correct format. I navigated to System -> OpenELEC -> Network and added the standard three NTP servers to the list of Timeservers:
0.pool.ntp.org
1.pool.ntp.org
2.pool.ntp.org

After that, everything was set up and ready to play.

Arcade console

The Raspberry Pi is a general-purpose computer, and a media centre is just one of the uses it can be put to. I had played old '80s arcade games on MAME about 15 years ago on my PC, so I thought: why not use the Pi for that now?

There are a couple of ways to turn a Pi into an arcade game emulator. One is to use RetroPie, a dedicated arcade-gaming Linux setup; however, that would mean replacing OpenELEC, which I didn't want to do for obvious reasons. The other option is to use RetroArch, which plugs nicely into Kodi. In fact, RetroPie is built on RetroArch. RetroArch works as a launcher for many different emulators, including MAME. The emulators are included in the RetroArch distribution, but not the game ROMs themselves.

RetroArch

I installed RetroArch and tested the one game that was included (a Sega Genesis game), which worked fine. To start a game, go to Program -> Advanced Launcher -> Default and select an emulator and then a game to play. Before we go any further, I will explain the parts of the RetroArch filesystem that were most relevant to my setup:

/storage/emulators/RetroArch/config/retroarch.cfg

This is where all of the many configuration options of RetroArch are stored. There is also a GUI (called RGUI) which can be used to edit these settings. More on that later.

/storage/emulators/RetroArch/roms

This is where the ROMs go. In Kodi, select the emulator you want to use to run the new game(s) and use the context menu to "Add items". I used the option to scan for new items, which are then automatically added to the list of games under the emulator. The scan will also remove items whose ROMs have been deleted.

/storage/.kodi/addons/emulator.tools.retroarch/lib/libretro

Here is the list of emulators that ship with RetroArch. Only some of them are preconfigured in the Kodi Advanced Launcher menu. Set up more of these emulators in Kodi as needed.

/storage/.kodi/addons/emulator.tools.retroarch/config/retroarch.cfg

Here is the reference configuration. This is a handy cheatsheet that explains what each setting in retroarch.cfg does, as well as showing you the default value.

First ROM: Hardhat

On the MAME website there are a few free ROMs to download. So I copied Hardhat into the ROMs directory using WinSCP. Then I added the game to "MAME / iMame4All" in Kodi and that ran fine too.

When RetroArch starts from Kodi, Kodi is replaced by the emulator and the TV remote control can no longer be used. So I plugged in a USB keyboard, which was all I had available. RetroArch uses default bindings for keyboards out of the box. Here are the basics:

  • Right shift: Insert coins
  • Enter: Start game
  • Left/Right arrow keys: Move left/right
  • Space: Shoot

Once I could use the keyboard to play games, I started looking for a pair of SNES joypads to make the experience more authentic. The USB joypads were a small investment. Of course, RetroArch can bind to all kinds of game controllers, but for most of the early arcade games the SNES joypads have sufficient functionality. I plugged the first one in and fired up Hardhat. RetroArch found the joypad but complained that the controller was "not configured". What to do?

RetroArch does of course have a (very large) configuration file which includes the settings for binding game controllers. RetroArch also provides a GUI (called RGUI) for editing the same settings. There is no obvious way to start RGUI from Kodi, but I accidentally stumbled across it when I renamed the "hardhat.zip" ROM to "Hardhat.zip" (Linux is case-sensitive). When Kodi tried to launch the emulator using "hardhat.zip" it failed and RGUI started instead (which I assume is the default behaviour).

In RGUI I used the keyboard to navigate the menus. Here are the most relevant bindings:

  • Up/Down arrow keys: Move up and down the menus
  • Left/Right arrow keys: Hop up and down the menus
  • x: Enter submenu or edit value
  • z: Leave submenu or stop editing
  • Esc: Quit RGUI

SNES controller

So I navigated to Settings->Input->Input User 1 binds and bound the joypad to each control field. There were 10 in all: Left, Right, Up, Down, A, B, X, Y, Start and Select.

Super Nintendo controller

My plan was to have only the joypads plugged into the Pi; I wanted to avoid having a keyboard lying around just so I could press "Esc" to return to Kodi. This is where the RetroArch hotkeys come in. The SNES controller includes the "L" and "R" shoulder buttons, which are not needed for most early arcade games. So I bound "L" as the RetroArch hotkey enabler (Settings->Input->Input Hotkey Binds->Enable hotkeys) and "R" as the "Quit RetroArch" hotkey (…->Input Hotkey Binds->Quit RetroArch). So now when I press "L" and "R" together the game exits and Kodi is restored. Bye bye keyboard.

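The resulting entries in retroarch.cfg (on this joypad, "L" is button 4 and "R" is button 5):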
input_enable_hotkey_btn = "4"
input_exit_emulator_btn = "5"

When I plugged in the second SNES joypad, RetroArch automatically applied the same bindings to it, which was nice.

The last problem was that the games themselves were too big for the TV screen. The top and bottom were not visible, which meant I couldn't see vital information like the score and the number of lives left. RetroArch solved this too: changing Settings->Video->Integer Scale to ON fixed it.

Finally, I changed a setting on the Advanced Launcher to activate the "Launching Application" notification. This is so that I could see that Kodi was responding even if it took a few seconds for RetroArch to warm up.

iMame4All

MAME is built for PCs, which means it expects the user to be sitting in front of the keyboard, able to type in commands or use hotkeys. iMame4All is built on MAME (currently MAME version 0.37b), is aimed at mobile phones and other touchscreen platforms, and is therefore better suited to a media centre platform like Kodi.

RetroArch ships with MAME, iMame4All and lots of other emulators but only a handful are preconfigured in Kodi. The “MAME / iMame4All” menu item is preconfigured to run the iMame4All emulator but can be changed to run one of the MAME emulators included with RetroArch if desired.

MAME 0.37b is a very old version of MAME from 2000, so finding ROMs that work with that version of the emulator via the normal ROM websites was not going to be easy. So I searched for “mame 0.37b5 roms download” instead.

Once I had a few games up and running, I added a thumbnail to each game, usually a screenshot, to give a visual clue about what type of game it is. Of course you can add more metadata to the Kodi menu items to aid filtering if you have a lot of ROMs.

And that's it. Just got to find the time to play now.

Big Data and the new EU regulations

On Tuesday, the new EU regulations regarding Big Data came into force. They affect all companies and authorities that register and store personal data, and replace the patchwork of rules and regulations that exists today:

On 4 May 2016, the official texts of the Regulation and the Directive have been published in the EU Official Journal in all the official languages. While the Regulation will enter into force on 24 May 2016, it shall apply from 25 May 2018. The Directive enters into force on 5 May 2016 and EU Member States have to transpose it into their national law by 6 May 2018. (Read more)

The major points of the legislation are (source: Wikipedia):

  1. Responsibility and accountability: controllers have much more responsibility for the proper management of personal data.
  2. Consent: valid consent must be explicit for data collected. Consent for children under 16 must be given by the child's parent or custodian.
  3. Data Protection Officer: a person with expert knowledge of data protection law and practices should assist the controller.
  4. Data breaches: controllers must report breaches to the Supervisory Authority as soon as they become aware of them.
  5. Right to erasure: the data subject has the right to request erasure of personal data relating to them.
  6. Data portability: a person shall be able to transfer their personal data from one electronic processing system to and into another.

Further reading: The EU Data Protection Reform and Big Data Factsheet (PDF)

With regard to exporting data outside the EU, the now-invalid Safe Harbour agreement has been replaced with the new EU-U.S. Privacy Shield, which promises to improve the handling of EU citizens' data by U.S. authorities and companies.

Further reading: EU-U.S. Privacy Shield (PDF)

What is an IT Manager?

About five months ago, just before Christmas, I started looking for a new job. I was working as an IT Manager and find this type of role very enjoyable, with its combination of strategic and operative responsibilities. I have a very broad IT background from smaller companies (<100 employees) with a focus on process development and automation.

For the first month or so I focused purely on applying for IT Manager jobs. Later I broadened my horizons to include System Architect, Requirements Analyst and Project Manager roles. This was for several reasons. One was that there are only a few IT Manager roles advertised at any one time that match my salary expectations, travel-time limits and required experience. Applying for other types of roles also meant more interview practice, but more importantly, I would rather be in a job sooner, with the potential to advance, than later.

It turned out that IT Manager can mean many different things depending on the industry or just the company in question: from purely internal back-office IT management, to a more organisational development role, to product development. Managing the IT systems of a retail company does not exploit much of my experience from IT product management and delivery, for instance, and the salary was, accordingly, not that exciting. But then again, I was more interested in moving away from tech stuff and working with a larger organisation that wanted to leverage outsourced, offshored and cloud services. More "what can IT do for you" than "what I can do with IT".

So I applied for all types of advanced IT roles like architect and analyst, usually in bigger companies where the work would be at least as challenging as being IT Manager in a smaller company (for which I also applied). Some sectors were just a no-go, it seemed: the banking industry requires financial systems experience, and government agencies want experience of working in the public sector and with public tenders. In short, the path to my next IT job was tough going. Every time I applied for jobs that were not an exact match, there were always other candidates better suited, and I never got an interview.

A changing role

The problem as I see it is that the traditional IT Manager role is changing radically, mainly due to outsourcing and pay-as-you-go cloud services. IT Managers need fewer technical skills and more business knowledge nowadays. So either the IT Manager adapts, or the role becomes diminished as strategic IT decisions happen elsewhere. Either way, this affects IT Manager salaries negatively.

At the same time there were oodles of consultancies looking for architects, analysts and project managers. So there is obviously work in this area, with the chance of a decent salary too.

Is there a connection between the two? I speculate that companies have access to more high-quality IT products than ever in a pay-as-you-go model, and that this requires more analysis, architecture and integration expertise than classical nuts-and-bolts IT department know-how. That's not to say that the IT Manager couldn't do the job; it just means that as the use of IT grows, it is not increasing the status of, and resources available to, the IT Manager, but rather the opposite.

Go with the flow

One of my principles as a system integrator and IT Manager has always been to phase myself out by helping the organisation to help itself. It is the job of IT to help the organisation become more efficient and to scale. Well, maybe that is just what happened, so about two months ago I started contacting consultancy companies.

(In Sweden the consultancy market is very well developed. This is because Sweden has very strict employment laws but companies still need, and want, to be flexible. In my home country, Ireland, a consultant was always a specialist, someone you called in to do a specific job. In Sweden, consultants ("konsulter") are mostly manpower, although there are of course still consultants who are specialists. More and more large companies now have frame agreements with consultancy firms to provide resources at pre-negotiated rates. And sometimes it is hard for such companies to understand why they must pay more for consultants who really are specialists.)

So, being a consultant will give me the chance to find out what the market for IT competence looks like nowadays, and to find out what my market worth is. Consultancy companies work in specific niches that are good to be familiar with. For instance, ework and ZeroChaos function as de facto recruitment departments for some companies and are very good at pressing down the rates paid to consultants. Nox, on the other hand, works as an umbrella organisation for small consultancies and independent consultants, and works on their behalf instead.

Polar Cape

In the end I went to work for Polar Cape, who rang me up and made me feel right at home. It is a small company, but one whose colleagues have a similar level of IT industry experience to mine. This is not as daring as being an independent consultant, but I feel I have a lot to learn about marketing and promoting myself, getting assignments and building my network. So now I have the chance to work on interesting IT projects in different industries while leveraging my broad technical experience and observing the rapid transformation of the IT landscape.

At a recent CIO Excellence conference, the final debate was about IT management’s role. My argument was that once you strip away all the back office IT management and maintenance activities, the company will still need IT governance regardless of whether IT services are provided internally or externally. Specifically, IT security will be a central part of IT governance in this future scenario and I am working towards a CISSP certification.

So what of the IT Manager? Well, as a consultant who has helped companies with their IT transformation process, I will be in a position to see whether this role still exists in 5-10 years' time. Interesting times indeed.

A year with OneDrive for Business

As a completely cloud-based organisation, we had no backup service in place; instead, an ad hoc Dropbox solution was used to store files off-site. Each user simply created a free personal account, which usually had sufficient capacity. It was time to migrate to something better: OneDrive for Business, the final piece in Microsoft's cloud puzzle that is Office 365.

We were really looking forward to rolling out OneDrive and we started with a few pilot users. Here are some of the use cases that came up as part of the general rollout.

Backup

At a minimum, OneDrive functions as a backup for files that otherwise only exist on employees' computers and laptops. All business-related files were moved from My Documents or the Dropbox folder to the new OneDrive folder. The OneDrive application then automatically uploads the files to the user's 1TB personal storage space in the corporate Office 365 environment. This storage is part of the SharePoint Online file system, and version control can be enabled to provide even more security in case of accidental changes or deletion.

Now, OneDrive maintains synchronisation in the background; it is completely transparent to the user, and Dropbox works the same way. When users were using Dropbox they were working on local copies of their Word and Excel files, and when a file was saved it was synced automatically to the cloud. (This is similar to classic version control systems like Subversion and CVS, except that there the synchronisation ("checking in") is done manually.)

However, Microsoft Office turns this principle on its head. When the user opens an Office file, such as a Word document located in the OneDrive folder, what really happens is that Word fetches the server (cloud) copy of the file and opens that instead. When the file is saved, it is saved to the cloud and then the OneDrive client updates the local copy. In other words, rather than letting OneDrive do its job, Office also gets involved. Paul Thurrott's blog describes the behaviour more exactly and how to work around some of the excesses.

Normally none of this concerns the user, as it is all completely transparent. Unfortunately, OneDrive for Business turned out to be not so robust, and there were frequent problems with files being stuck out of sync and other generic "server errors" that defied analysis. Our road warriors could be in a 3G brown spot, and the slow network connection could play havoc with the OneDrive/Office acrobatics described above. From an administration point of view this was difficult to troubleshoot until we understood what was happening with the files. But for most users, who had never worked with version-control-like systems before, it was almost impossible to explain.

These recurring problems, and the lack of understanding of what was happening, caused a real crisis of confidence with some users. File synchronisation simply has to work; otherwise it is worse than useless. There were calls to roll back to Dropbox. I explained that Microsoft just has to fix these stability issues, since OneDrive is an essential component in the Office 365 service suite, and also that we gain so much functionality with OneDrive, such as integration with SharePoint.

Syncing other libraries

Once we got going with synchronising personal files using OneDrive, it was possible to start leveraging all the other features. SharePoint document libraries can also be synchronised to the local computer – a big step up from the limited file management functions in the SharePoint library web view. However, there are some limitations on which libraries can be synced, which we managed to work around.

Publishing on SharePoint

Now that any SharePoint document library could be synchronised, users could also update local copies of documents that were embedded in a SharePoint web part or webpage. For example, a user could update the local, synchronised copy of an Excel spreadsheet, and the web part or webpage would immediately be updated with the new table values or graphs.

Project collaboration

SharePoint websites are a great way to manage projects and a document library is often used to store the project documents. With OneDrive, all of the team members can synchronise with a common document library for the project. That way when one member adds or updates a file, the local copy for all the other team members is updated as well. No more emailing documents! There is even a OneDrive feature to allow multiple users to simultaneously edit the same file if needed.

External collaboration

Customers and suppliers can also use OneDrive for Business to access document libraries in the corporate SharePoint Online. This is an extension of the project collaboration use case above where the project team comprises both employees and external users. Just be very sure to restrict the privileges of the external users to just the document library or at most the project sub-site.

Mobility

There is of course a mobile app that is handy for viewing your OneDrive files. However, it only shows files from the user’s personal OneDrive space and not any other SharePoint document libraries that were synced to the user’s computer.

Document templates

This is one of my favourite applications of OneDrive. The company has many document templates for various types of Word and PowerPoint documents. Normally in Office it is possible to configure a location for custom templates; this has to be a folder on the computer or a file share, and this is still true for the Office 365 apps.

With OneDrive, it was possible to create a document library in SharePoint dedicated to storing document templates. This library was set up with a folder hierarchy for categorising the templates. Then every employee could simply synchronise the document library containing the templates. The local copy of the library could then be set as the custom template location in Office. So now, users can start Word or PowerPoint and select the correct template from within the application as normal.

Finally, using the SharePoint library permission settings, write access could be restricted to the template administrators, and all other employees were given read-only access which allowed them to use the templates but not to be able to change or delete them. Furthermore, when the administrators made updates to the templates, OneDrive would automatically sync the changes to every user’s local copy so that new documents would always be created using the latest templates.

Summary

OneDrive for Business is an essential tool for all Office 365 customers. It still has some robustness issues, but it delivers huge productivity benefits in project collaboration, web publishing and template portfolio management.

Thinking about migrating to SaaS

In my last article we looked at what factors influence a company's choice of IaaS solution. A more advanced strategy would be to migrate to SaaS, with the potential for even bigger savings.

If a company is looking to upgrade an existing IT system, then there should always be some research into what cloud alternatives are available. More and more companies are offering cloud versions of their services, or someone else is offering an equivalent competing SaaS.

Compared to IaaS, SaaS takes management of the IT systems completely out of the hands of the IT department. So much so that any executive with purchasing power can start paying as they go for a cloud service. It requires no IT expertise to register for a Salesforce subscription, for instance.

However, this is a flawed approach for two reasons. The first is that it undermines whatever IT strategy the company may have, and can lead to a proliferation of SaaS subscriptions that provide overlapping functionality and are difficult to integrate. What we are talking about is IT governance. While SaaS simplifies the business case for adopting a new IT system (i.e. zero CAPEX, aka "pay-as-you-go"), it still needs to be done in coordination with the IT function (e.g. the CIO). The second aspect of IT governance is security. While any decent SaaS provides good security functionality, it still requires the application of a security posture that is in line with the organisation's security policies and standards.

To rephrase, if a company uses only SaaS solutions for its IT needs, then IT governance is reduced to managing the SaaS portfolio (which functionality is available where and how they could or should be integrated) and maintaining the organisation’s IT security posture.

Subscribing to a new SaaS is easy, as it should be. The pay-as-you-go model simplifies testing a service, and the setup and roll-out of the service in the organisation is not under the same time pressure as one that has required a huge upfront CAPEX. However, it is a different proposition if the company needs to migrate from an existing legacy system.

There are two types of migration. One is from an on-premise product to the cloud version of the same product; this can happen because it is cheaper and/or the vendor has phased out the server version in favour of the cloud version of the product. The other is a migration to a cloud service based on a different product.

Regardless of which type of migration is being performed, there are some challenges (I hesitate to say limitations; read on) with leaving a legacy server-based solution. When a company owns and manages its own copy of a product, it has complete control over how it is deployed and integrated into the corporate IT environment. The product may provide APIs (or not), and there is the possibility to customise the product to meet the organisation's needs. But the same product delivered as a SaaS will not allow the same customisation. And here is where SaaS really comes into its own, I believe.

SaaS is very attractive from a licensing and management point of view, provided the company does not want to do a lot of customisation. Vendors, however, understand that one-size-fits-all will limit the number of customers they will have, so they invest heavily in providing lots of configuration possibilities. In the extreme, they provide layers of abstraction and deliver what is essentially a toolbox of functionality that the customer can use to build the equivalent of their proprietary functionality. Jira Cloud, for instance, provides enormous flexibility when building issue-tracking workflows.

Vendors will provide this toolbox-like functionality as long as there is a market willing to pay for it. However, this may still not be enough for customers with very specific needs. But cloud vendors are not done yet. They can also provide APIs, typically REST, to allow the customer to fulfil its requirements by encapsulating the custom functionality in a separate service. Jira Cloud and Salesforce's Force.com provide this type of integration, for instance.
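
As a rough illustration of that pattern, here is a minimal Python sketch of a company-specific service calling a SaaS REST API to create an issue. The site URL, credentials and project key are placeholders, and while the endpoint follows the general shape of the Jira Cloud REST API, treat the details as assumptions rather than a reference.

import requests

# Placeholder values: your own Jira Cloud site, an API token and a project key
BASE_URL = "https://your-company.atlassian.net"
AUTH = ("integration-user@your-company.com", "api-token")

def create_issue(summary: str, description: str) -> str:
    # The custom business logic lives in our own service; the SaaS only
    # receives standard API calls.
    payload = {
        "fields": {
            "project": {"key": "OPS"},      # placeholder project key
            "summary": summary,
            "description": description,
            "issuetype": {"name": "Task"},
        }
    }
    response = requests.post(
        f"{BASE_URL}/rest/api/2/issue",
        json=payload,
        auth=AUTH,
        timeout=30,
    )
    response.raise_for_status()
    return response.json()["key"]       # e.g. "OPS-123"

if __name__ == "__main__":
    print(create_issue("Nightly stock sync failed",
                       "Raised automatically by the in-house integration service."))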

And so here it is, the customer can migrate to the cloud using a standardised, configurable SaaS with an integration to a company-specific service that meets all of their requirements. Now, suddenly you have cost visibility! On one side you have a standard SaaS that probably provides 95% of the functionality for a very reasonable monthly cost, and on the other side the customisations that deliver 5% of the functionality but probably cost more per month.

But the whole point is not for the customer to have to migrate to the cloud in this way. SaaS makes the real cost of maintaining proprietary solutions painfully visible to management, with the result that there is more incentive to analyse why they are needed in the first place. And guess what, the organisation can often adapt their business processes to behave in a more standard fashion; after all cloud services exist because they are a successful way for lots of companies to leverage IT in their businesses.

In summary, when an organisation has to choose between changing a work process and making proprietary changes to an on-premise IT system, it is IT that most often gets the job. This creates a legacy that is dragged into the light when the company wants to leverage the benefits of very economical pay-as-you-go services. These customisations acquire a very real maintenance cost, and companies will only retain those that are essential. The IT department's budget will start to correlate more with maintaining these customisations.

What does tomorrow’s IT department look like? I will explore this topic in another article.

Cloudily

So where are we today? Well, just about everything can be done in the cloud nowadays. At least, there are SaaS services available that offer standard functionality to support most business processes. And then there are all the integration possibilities both between services (e.g. data-mining) and with business specific solutions (e.g. via Force.com or some form of REST API or similar). In short, it is possible for companies to choose their comfort zone when adopting cloud technologies.

We mustn't forget that virtualisation is the backbone of all cloud infrastructures. Functionality like thin provisioning provides economies of scale and a more efficient use of resources. Virtualisation started in the enterprise as private clouds and is still just as relevant today, even with the fantastic array of public cloud offerings. So picking your comfort zone raises some important questions for companies, the most important being: what are the business drivers for adopting cloud technologies?

There is the obvious cost driver, which can be realised with virtualisation coupled with hosting (aka IaaS). Virtualisation provides efficiency, and hosting eliminates the need for resources for management and maintenance activities. This also brings us to the competence driver. IT organisations are implicitly expected to remain small and basically invisible to the rest of the organisation. Certainly, in the Cloud Age, if you need more IT people instead of outsourcing, then the view is that you're doing it wrong.

Virtualisation products also deliver a host of other functionality that can be an important business driver. High availability, scalability and performance monitoring features are common to probably all IaaS platforms. So outsourcing infrastructure can provide cost savings plus lots of new features in a like-for-like migration, i.e. moving from on-premise to an equivalently performing cloud infrastructure with respect to the potentially different workloads of each IT system (e.g. IOPS vs. data mining).

The next driver to look at is flexibility: the ability to bring more resources online in response to a temporarily increased workload. The question is, does the organisation really need flexibility? Yes, loads vary, but this must be weighed against the cost of hosting in an on-demand infrastructure instead of, say, a (virtual) private cloud (VPC). A VPC can of course be extended to meet increased workloads, but this is a more long-term activity, such as meeting the needs of a growing organisation. If just a single application requires extra capacity, then this does not justify putting the entire IT environment in an on-demand IaaS solution such as AWS or Microsoft Azure. Certainly this is true of larger organisations, which have a bigger and more stable infrastructure footprint.

Flexibility then is achieved using a hybrid solution. The backbone of the IT infrastructure should be virtualised as much as possible using a private cloud (on-premise or hosted). The private cloud will have a predictable workload and will expand or change at a strategic pace. The private cloud is then complemented with on-demand resources from an on-demand infrastructure supplier such as AWS for handling peak workloads. Finding a balance between these two costs can be achieved using the resource monitoring tools in both platforms. Vertical vs. horizontal scalability is also a deciding factor.

Security concerns are usually not a business driver. By this I mean the confidentiality and integrity of data when some or all of the IT systems reside in one or more IaaS platforms. These platforms usually provide networking capabilities such as VPN to allow the corporate network to extend into one or more cloud infrastructures and so provide seamless integration between the various IT systems. Central authentication systems can also use the same VPN tunnels to control authentication and authorisation in all the relevant systems.

To summarise, this walkthrough of the IaaS landscape shows that you can pick your comfort zone with a proper analysis of the infrastructure requirements. The benefits are huge, so some of the cost savings should be used to over-dimension the infrastructure solution and so save time performing difficult workload analysis. Hopefully the company will grow into its new suit, and this will also provide important input for future infrastructure investments.