The business community has witnessed – and embraced – many technological transformations since our first cloud computing implementation in 2000. Here is a perspective on the significant changes in IT and what businesses can expect in the decade ahead.
Q. With all of the utility and cloud computing options available, why would someone want to invest in in-house IT resources?
Until recently, many business executives viewed IT as a sinkhole rather than a profit center. As a result, many opted for what, at first glance, was the least expensive option. I think most now recognize that IT has become more integrated into the business process.
Some IT professionals believe that you can better control your own destiny by using your own IT equipment. This way you’re not at the mercy of third-party computer services or tied into their long-term contracts. Owning your own IT resources is fine if you’re just starting a company, but as your business grows so does the complexity of your IT system.
Q. Should businesses be concerned about being ‘locked-in’ by an IT consulting firm?
Unfortunately, yes. Most of the firms that provide IT support are credible and competent. However, you should always be careful about signing a long-term contract. Computer technology evolves so quickly that it’s hard to know whether your vendor has the resources to keep pace. This is especially true of cloud or utility computing, which sounds great in theory, but you don’t want your vendor learning new systems at your expense.
Q. What are the primary disadvantages of owning and managing your own IT equipment?
As I mentioned earlier, buying your own equipment seems fairly innocuous for a small business just starting up. However, over time, key components – such as CPU, disk, RAM, and, to a certain extent, the network – begin to bottleneck, resulting in latency or system crashes. These problems grow exponentially in organizations with satellite locations and remote users, or when remote PC access is required. All too often, the typical in-house answer to these problems is to throw more money at them. Buy another server. Add more redundancy. Hire another IT support person. It becomes a vicious, never-ending cycle.
For these reasons, budget forecasting for IT is also an issue when you invest in your own equipment. You never know when IT problems can arise, and to what extent. You find yourself having to assume more capital expenditures and risk.
Q. What should a prospective client look for in an IT vendor?
Certainly be wary of long-term contracts. Also, if you’re venturing into “newer” IT solutions, such as cloud computing, make certain that the prospective vendor has significant experience in that area. We cover this topic extensively in our whitepaper Nine Tips for Choosing the Right Cloud Computing Provider.
Q. Is there a downside to the rapid-pace evolution of today’s technology?
Moore’s Law (named after Intel co-founder Gordon E. Moore) suggests that the number of transistors that can be placed inexpensively on an integrated circuit doubles every two years. Unfortunately, computers are evolving so quickly that ‘hard drives’ (storage devices for digital data) simply can’t keep up with the pace. This may partly account for why so many organizations have transitioned from traditional computer services to cloud and utility computing in the past decade.
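For readers who want a feel for what that doubling rate implies, here is a rough back-of-the-envelope sketch in Python; the starting transistor count and time span are illustrative assumptions, not published Intel figures:

    # Rough illustration of Moore's Law: transistor counts doubling roughly every two years.
    start_count = 40_000_000   # assumed starting transistor count (illustrative only)
    years = 10                 # assumed time span in years
    doublings = years / 2      # one doubling roughly every two years
    projected = start_count * 2 ** doublings
    print(f"After {years} years: about {projected:,.0f} transistors ({2 ** doublings:.0f}x growth)")

Over ten years that works out to roughly a 32-fold increase, which gives a sense of how quickly the rest of the system has to keep up.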
Conversely, one of the key ‘upsides’ of the IT evolution is the trend toward simpler “usability”, a phenomenon Gartner Research calls the “democratization of technology”. In the past few decades alone, technology has transformed from 100+ lb. machines managed and used only by highly trained professionals to something that is built into nearly everything we use, including business processes.
Q. Moore’s Law has been around for decades. Why haven’t Houston IT consulting firms offered cloud computing in the past?
The growth of the Internet has been the key factor in the advancement of services such as remote PC access, remote PC support and, eventually, cloud computing. Ten years ago, remote access was still in its infancy. Most consumers and businesses relied on 56k modems, and faster lines were cost-prohibitive. Now that cheaper alternatives, such as DSL and cable, are readily available to all budgets, remote desktops run at the same speed as, or even faster than, locally-run systems.
Q. So all we’ve been waiting for was the Internet?
Historically, remote access has always been a step behind ‘local access’. In the 1980s, there was only text. In the late 1980s, Microsoft and Apple made local computing two-dimensional with graphics and images; by then, text had become accessible remotely. In the 1990s, Windows NT4 and Citrix made graphical elements available remotely. By that point, however, local access had evolved to offer three-dimensional capabilities (aka “multimedia”, the blend of sound and video).
This remained the status quo for more than ten years, largely because the business demand wasn’t there. More recently, IT advancements in the oil and gas and healthcare sectors have paved the way for four-dimensional imaging available in real time. Experts predict that eventually there will no longer be a need for locally-owned systems.
Author: Yehuda Cagen is the Director of Client Services for Xvand Technology Corporation, provider of IsUtility, a pioneer in utility and cloud computing services for Houston’s small and midsized businesses, backed by nearly two decades of research and proven client implementations.