IT Vocabulary
16 Oct 2013
"Cloud = Server + X", with X being background synchronization. While many professions have a very stable dictionary of technical vocabulary, software terms are often defined by marketing departments rather than by practitioners in the field. The hipness of computers, the web, and smartphones is a blessing and a curse. With the enormous interest in all things computer science also comes the buzzword curse. "Cloud" is a very old one at that: storing data on a central server instead of on your local machine was the norm when computers were invented.
Central servers were connected to thin terminals in order to use the resources of the extremely expensive big machines efficiently. Then came the personal computer, on which you could get everything done on your own. Now, with smaller devices with limited resources such as tablets and smartphones, we see the same shift going in the opposite direction. The curious thing about computer science is that we apparently need a new word for this. "Server-based", "centralized storage", "datacenter-based", and "synchronized with a data center" have all been dismissed in favor of the vague word "cloud". When we use these vague marketing terms, we deprive ourselves of an efficient communication mechanism: forced to be more verbose, we make communication harder. Take the current top suppliers of virtual servers, like DigitalOcean or Rackspace. They need a lot of copy text to explain what exactly they offer. I myself even had to contact their live-chat support to understand what they were offering in their recent developer love initiative. The issue is that this very new stuff is not yet well named, and that this happens a lot in the computer domain.
Other domains have well-defined terms that have slowly evolved over decades. I imagine that mechanical engineering has much more clearly defined terms because: a) it is an older domain, b) its terms originate from definitions by committees instead of loose consensus, and c) it is not moving as fast and is consequently less "sexy". So what can we do about it? Of course a) and c) cannot really be changed, but the definition of a common vocabulary has in fact been tried. In 1966 there was a commendable attempt that resulted in an ISO standard. Neville Holmes writes very eloquently about this problem in his essay "The Great Term Robbery" in IEEE's Computer. He also writes about the interesting distinction between "data", which are bits stored in a computer, and "information", which only a human can interpret from them. I believe that the standardization attempt was unsuccessful because computer vocabulary evolved in a different direction -- as language often naturally does -- but I ask myself whether we as a community shouldn't try something similar again. Technically we see a lot of new inventions: open-source libraries and hardware, Kickstarter for funding. But we don't have a strong approach to more subtle, over-arching problems such as vocabulary or the data-driven evaluation of programming methods. I wonder whether we will ever have something like this, and whether it will be created a) because computer science matured or b) because there was a conscious combined effort to create it.