"Software is eating the world!"
The above quote has become ever more pertinent in today's modern enterprise. With the increased adoption of cloud computing platforms, and with organisations continuing to leverage the benefits of outsourcing their entire IT infrastructure, there is a growing need to control and manage these systems through software integrations.
Many organisations are also transforming their business units to adopt Lean and Agile methodologies, with many going further and creating innovation colonies. The result of most business transformations is that job titles, roles, responsibilities, business units and departments may change over time, but one thing remains the same: a specific function or job will still need to be done, whether by an external outsourced service or by automation of some kind.
Over the years, Software Development and IT Operations have evolved into two distinctly different units in many organisations. This wasn't always the case: when computers first came on the scene, developers were required to operate the machines they were working on, mounting tapes, flicking switches, replacing vacuum tubes and debugging the equipment.
It was during the 1960s that the need to split out these functions arose: programmers and analysts would dump boxes of computer punch cards into readers, while the computer operators ran around behind walls of glass, mounting tapes, pulling printouts and shoving them into labelled cubby holes.
During the 1970s and 80s, with the advent of the personal desktop computer, the roles evolved further, marking the coming of age of systems and network administrators and the birth of modern-era IT Operations. Most computer users of this era typically possessed just enough computing wisdom to be dangerous, with most software packages requiring all manner of user interaction with the operating system. This inevitably led to computer failures and issues, requiring trained computer professionals on hand to continually repair systems and keep the IT infrastructure running.
The 1990s and 2000s, with the introduction of the Internet, saw many organisations adopt websites and Internet commerce, distributed computing and remote working capabilities. Just keeping the infrastructure running demanded ever greater resources, and IT departments became commonly referred to as the Fire Department, because their primary responsibility was firefighting.
If we take a look at a modern organisation's network topology, we'll notice the increased adoption of hybrid, public and private cloud infrastructures. While cloud computing enables organisations to overcome old limitations, it brings with it a whole new set of challenges.
Infrastructure as code
Every good software developer instinctively knows that if you need to do anything more than once, you should either re-use or automate it; every good IT operations engineer instinctively knows that reliable operations must be reproducible and programmatic. It is at this juncture that the two disciplines merge.
We have seen the increased adoption of tools like Puppet and Chef to automate machine configuration, ensuring that all machines are configured identically and that all the right services are enabled. We can expand on this further with tools like Vagrant and Docker, creating reproducible virtual environments for both development and production.
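To make the idea concrete, here is a minimal sketch of what "machine configuration as code" looks like in a Puppet manifest. The package, file path and module name are hypothetical examples, not taken from any particular setup; the point is that you declare the desired state once, and the tool converges every machine to match it:

```puppet
# Declare desired state; Puppet enforces it on every run.
package { 'nginx':
  ensure => installed,
}

file { '/etc/nginx/nginx.conf':
  ensure  => file,
  # Hypothetical module path for the managed config file.
  source  => 'puppet:///modules/webserver/nginx.conf',
  require => Package['nginx'],
  notify  => Service['nginx'],   # restart the service when the config changes
}

service { 'nginx':
  ensure  => running,
  enable  => true,
  require => Package['nginx'],
}
```

Because the manifest lives in version control alongside application code, a fleet of a hundred machines can be rebuilt, audited or rolled back the same way software is.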
It is common for technical staff to focus on the exciting technical challenges when it comes to DevOps. Many man-hours can be spent tuning automated continuous deployments to scalable cloud environments. However, as much as it pains the geek inside me to say, this is not even close to the real challenge behind a comprehensive DevOps strategy. The real challenges, and the major obstacles for any organisation to overcome, are entirely political.
Human beings and their interaction with DevOps cause well over 90% of the real challenges. In my experience, the most underrated skills for effective DevOps engineers are all of the soft variety.
There is a completely natural tendency in organisations to silo information. This is not done with any malice or nefarious intent; rather, it is a natural result of like-minded individuals communicating with each other, forming their own common working practices and internal jargon, and even creating information systems to help them communicate better. This often results in certain departments or business units customising or developing an application to suit their method of working.
Technical communities tend to gravitate towards common toolkits: Quality Assurance professionals across the industry will implement the de facto industry standard, HP ALM; developers may prefer Atlassian JIRA; Business Analysts may choose IBM Rational DOORS; service desk professionals will opt for ServiceNow; and the entire Project Management Office may have a preference for VersionOne.
Organisations may never overcome the tendency to silo, but they can get information to flow in and out of the silos. The focus should be on ensuring that information flows freely between silos while still enabling staff to use and update that information within their tool of choice. This is where a connected lifecycle tool like Tasktop Sync excels: it helps your organisation ensure that the right information gets to its respective audience, in the format they understand and in the tool of their choice.
It's naive to think that all it takes for a development group to overcome the challenges of high-performance, distributed applications, and to develop software that won't fail, is to simply integrate development into operations. Nor does merely merging the two disciplines somehow reduce an organisation's support overhead. There are certainly great rewards for an organisation that chooses to adopt a DevOps strategy, but it needs to be mindful of the challenges it will face en route, and of the fact that successful execution will depend on the clarity and efficiency of its communication channels.