I'm a big fan of the internet of things. I sorta like the idea of my refrigerator being able to keep in contact with Amazon to make sure that I am always well stocked with cream cheese and orange juice. It's comforting in a way.
That's the good news.
The bad news is that there is a good argument to be made that, given the lack of standards, the internet of things (IoT) is going to set DevOps practices and principles back 20 years.
Allow me to elaborate.
Before the advent of operating systems with graphical user interfaces, such as Windows and MacOS, there was the disk operating system (DOS). DOS came in a variety of flavors. Microsoft had MS-DOS. There was also DR-DOS from Digital Research. The Apple II ran on its own version of DOS -- Apple DOS.
DOS was character-based. You interacted with DOS via the command line, much the way modern developers use the terminal. One of the drawbacks of the DOS era was that there was no real standard for supporting devices. For example, if I made an application that allowed printing, I needed to ensure that my application worked directly with a given printer. Maybe there was a library I could use in my code to interact with a particular model of an Epson dot matrix printer, for example. If not, I had to write a driver. In other words, it was up to me to make sure that my application ran with all the popular printers. It was not unusual for serious development shops to have a farm of printers against which printing tests were executed.
When GUI-based operating systems showed up, printer support was pushed over to the OS. Applications no longer needed to know anything about printers. The application interacts with the OS, and the OS interacts with the printer. Abstracting away hardware was an important breakthrough for application developers. Hardware abstraction increased the speed of the software development lifecycle significantly.
DevOps and IoT
So, what does this have to do with my refrigerator?
A fundamental notion in DevOps practices and principles is to automate as much as possible. We automate testing. We automate provisioning. We automate deployment. We'll even automate system recovery, should things go bad. The reason we can automate to such a high level is because most hardware has been abstracted to a software representation. Today, we call this environment as code. Again, this is very good news.
Most IoT hardware in the world, though, has yet to be abstracted -- my refrigerator, for example. Without abstraction, using standard DevOps techniques to facilitate continuous integration and deployment becomes an almost impossible task. For example, let's say I want to do a simple test to ensure that when I open the door to my refrigerator, two things happen. First, the light inside goes on, and, second, my fridge does a scan of all the items stored within and reports back to me via audio any item that is missing. For example, "Bob, you are out of cream cheese. Should I order some?"
So, let's do the first part of the test: open the door. Turns out, I can't do it. Unlike mobile application testing processes that allow me to connect to a cellphone directly or use an emulator, my refrigerator has not been standardized to the degree where I can just do:
Refrigerator fridge = new Refrigerator(DEVICE_ID);
//run some tests
Rather, I have to do something like send a robot over to the door to do the opening and closing. Then, hopefully, I can monitor the digital interactions within the device to assert that the expected behavior is being performed. Right now, not being able to open and close the door automatically is a showstopper in terms of DevOps continuous integration/continuous delivery.
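To make the contrast concrete, here is a sketch of what that door test could look like if refrigerators sat behind a standard software abstraction, the way printers sit behind the OS. Everything in this example -- the Refrigerator interface, its methods and the fake device backing it -- is invented for illustration; no such vendor-neutral API exists today.

```java
import java.util.List;

public class FridgeDoorTest {

    // Stand-in for a hypothetical, vendor-neutral device abstraction.
    interface Refrigerator {
        void openDoor();
        boolean isLightOn();
        List<String> missingItems(); // items the fridge thinks are out of stock
    }

    // In-memory fake used for the test; a real abstraction layer would
    // talk to the physical device (or an emulator) behind this interface.
    static class FakeRefrigerator implements Refrigerator {
        private boolean doorOpen = false;

        public void openDoor() { doorOpen = true; }

        // The interior light follows the door state.
        public boolean isLightOn() { return doorOpen; }

        public List<String> missingItems() {
            // A real device would scan its contents; the fake returns a fixture.
            return doorOpen ? List.of("cream cheese") : List.of();
        }
    }

    public static void main(String[] args) {
        Refrigerator fridge = new FakeRefrigerator();

        // Part one of the test: open the door.
        fridge.openDoor();

        // Assert the two expected behaviors from the scenario above.
        if (!fridge.isLightOn())
            throw new AssertionError("light should be on when the door is open");
        if (!fridge.missingItems().contains("cream cheese"))
            throw new AssertionError("scan should report cream cheese as missing");

        System.out.println("fridge door test passed");
    }
}
```

With an abstraction like this in place, the same test could run against an emulator in a CI pipeline and against the physical device in a lab -- which is exactly the kind of automation the lack of IoT standards currently blocks.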
Back in the disk operating system (DOS) days
Now, this is not to say that the folks at Samsung, LG and Maytag haven't figured out a way to perform effective testing particular to the devices they manufacture. They probably have. Industrial manufacturing places a lot of emphasis on stringent testing. The costs of failure and subsequent recall are too high to fall short. But device testing by the manufacturer does not help those of us for whom IoT development will become a way of life. In other words, going back to the DOS days, while Epson ensured that its printers worked when leaving the loading dock, the company never ensured that the device worked with the applications I developed. It was up to me to write the driver and ensure that the driver worked.
Presently, IoT is in its infancy as a technology. It's back where microcomputing was in the early 1980s when a lot of the work was DIY. Once the microcomputing industry standardized among manufacturers, hardware abstraction was possible. Hardware abstraction leads to easier automation; hardware abstraction coupled with easy automation gives rise to DevOps.
There will be billions of new IoT devices coming onto our digital landscape within the next five years. According to Business Insider:
In total, we forecast there will be 34 billion devices connected to the internet by 2020, up from 10 billion in 2015. IoT devices will account for 24 billion, while traditional computing devices (e.g. smartphones, tablets, smartwatches, etc.) will comprise 10 billion.
This is a lot of opportunity for those with the entrepreneurial spirit. But, it will be a lot of headache for those invested in DevOps practices and principles, at least until the essentials of the various device types become standardized. Part of the work on the road to standardization might be to develop a set of robotic agents that do the physical activities required to support automated activities -- think an inexpensive, programmable robot that puts a glass in the water dispenser on the outside of a refrigerator door. Or it might be making it so device manipulation can be performed via wireless connection.
Regardless of how it is done, implementing hardware standardization and abstraction throughout the IoT landscape needs to happen soon. Otherwise, those DevOps practices and principles we've come to know -- which many have devoted their careers to promoting -- will come to a screeching halt. It would be a shame. Going back to the future makes for a great movie. Going back in the future is opportunity lost.