On Saturday, September 10th, it was reported that ING Bank in Romania had an entire regional datacentre taken offline for ten hours, resulting in the loss of all banking services. A disaster of this size might have been expected to be caused by a long-term regional blackout, an earthquake, or possibly a terrorist bombing, but no, it was caused by sound… a very loud sound. Read more in “Be Kind To Your Hard Drives, They Are Sensitive”.
As part of our thirty-year celebration, many of us have spent time looking back with warm affection for the early days of personal computing. There was a feeling of excitement for the changes in technology which truly felt like magic was at work. For those who did not experience this era, you may well not understand how important the “PC Revolution” was.
My first experience with computers didn’t involve seeing a computer, a keyboard, or a monitor… what? We would write programs by using a pencil to fill in little boxes on computer cards (rectangular pieces of thin cardboard). You would put an elastic band around your stack of cards, your “program”, and send them off to a distant computer centre. A week later the cards would be returned to you, informing you that your program stopped on card 23, because the corner of the card got bent. You would then prepare a replacement card, put your elastic band around your stack, and send it away again. This process would repeat many times as you debugged your simple program, and after a couple of months you would have it running. Interactive, it was not. Enjoyable? Not for me.
In later years we had access to teletype terminals, which gave you direct access to a remote computer. They did not have a monitor, but would print out responses from the computer. This was a big step forward as it gave you interaction with the computer system. But generally you were just accessing some running program, and not writing software.
For me, the “magic” started to show itself when video terminals began to be available. It appeared to behave largely the way today’s computers do. You ran programs on it, and could write programs, and everything was displayed on a monitor almost instantly. The only catch was you couldn’t take it home, as the actual computer it was connected to was larger than your house, and from home few people would have any way to connect to it. But the future was being hinted at, and coming fast.
By Jason Scott – Flickr: IMG_9976, CC BY 2.0, https://commons.wikimedia.org/w/index.php?curid=29457452
The PC revolution started with a large number of different companies, each making their own PC, with little to no compatibility in software or hardware. Many of the early computers were actually designed to be gaming platforms, but that didn’t detract from their ability to run more serious applications. People were ecstatic that they could afford a computer… a personal version of a multi-million dollar mainframe, which would allow them to compute in their own homes. It was felt that a home PC, in some small way, allowed you to compete with big business, and the government… you had a Computer!
The next big event in the PC Revolution was the move to a standardized platform, starting with the IBM PC, and quickly followed by Compaq and others. By the mid-eighties, you could purchase PCs from many different companies, and for the most part, your software would run on all of them. This standardization introduced competition in the industry, and drove pricing down so that most people could afford a computer if they wanted one.
I spoke about the “magic” at the beginning. I’m not sure that I can properly describe what that magic was for me, and for many other enthusiasts. It had something to do with being able to write software to do whatever I wanted it to do. For many of us it was as simple as creating a program to catalog recipes… not a big deal, but it rewarded us with the joy of creating something successfully. From the perspective of 2016 this appears trivial, but before the PC came along, it was something you could not do.
Today this enthusiasm, this “magic”, is still very much alive. It is alive in the thousands of Open Source software projects that exist, to which tens of thousands of people freely give their time and skills. Hardware is not being left out either; there is a huge surge in the use of open-source Arduino microcontroller boards, and of single-board computers such as the Raspberry Pi. I don’t want to forget the Maker movement, which is about doing just about anything, with anything, and having fun at the same time. If you are not familiar with any of the above, Google is your friend.
Perhaps the magic is about the joy in people all over the world sharing their enthusiasm for these projects, and taking the time to help others who are interested in learning as well. You can experience the magic if you want. You just have to want it.
Computers are magnificent tools for the realization of our dreams, but no machine can replace the human spark of spirit, compassion, love, and understanding.
Louis V. Gerstner, Jr.
Don’t forget to enter our “Blast from the Past” contest!
Part 2 of my Cloud Assumptions is alternately titled “Show me the money”, because many businesses are looking to move their servers and services to the Cloud with the expectation of saving money. Whether you save money, or spend twice as much in the Cloud, is a question unique to your business.
Let me take one step back, and give you a really simple definition of the “Cloud”. If you are accessing some computer/software, and it isn’t running in your office, it is in the “Cloud”. Someone other than you is running the computer/software, and you are accessing it across the Internet. Before the Internet we used to call this accessing a Mainframe using a Terminal, but no one had a soft cuddly name like “cloud” for it.
Some companies’ requirements make the Cloud almost a “no brainer”. For example, suppose you have extreme peak periods in your business, say for Christmas online orders, when you need 100 web servers available instead of your normal 10. Cloud providers allow you to rent those additional 90 web servers for just the Christmas season, then shut them down and stop paying for them, which is an easy “win”. However, putting your company’s general computing needs into the Cloud is much more complex, and will require considerable research into not only the costs, but the risks.
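To make the bursting “win” concrete, here is a rough back-of-the-envelope comparison. All of the dollar figures below are hypothetical placeholders, not real vendor pricing; substitute your own quotes.

```python
# Rough comparison of seasonal cloud bursting vs. owning peak capacity
# year-round. All prices are hypothetical placeholders -- substitute
# your own vendor quotes before drawing any conclusions.

HOURLY_RATE = 0.10          # assumed cost of one cloud web server, $/hour
SEASON_HOURS = 6 * 7 * 24   # e.g. a six-week Christmas rush
EXTRA_SERVERS = 90          # bursting from 10 servers up to 100
OWNED_SERVER_ANNUAL = 1500  # assumed annual cost to own and run one server

burst_cost = EXTRA_SERVERS * HOURLY_RATE * SEASON_HOURS
owned_cost = EXTRA_SERVERS * OWNED_SERVER_ANNUAL

print(f"Rent the extra capacity for the season: ${burst_cost:,.0f}")
print(f"Own the extra capacity year-round:      ${owned_cost:,.0f}")
```

Even with made-up numbers, the shape of the result is the point: paying only for the weeks you actually need the capacity can be an order of magnitude cheaper than provisioning for your peak.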
So how do you start determining whether a Cloud service makes sense for your business? I would start by creating a worksheet. If you are comparing the cost of running a specific software package locally versus in the Cloud, start entering the relevant costs, as well as requirements, required resources, and any compliance regulations you may need to meet. For example, if your industry compliance rules don’t allow customer data to be stored outside of Canada, and your proposed Cloud solution is based in the U.S., that will quickly make your decision for you.
When estimating your costs, make sure you include any startup/migration professional services costs, as these can be quite significant for those migrating to the Cloud. For completeness, I would also investigate the costs of returning from the Cloud, back to an in-house system. In the past, customers have found that this could be very difficult. Vendors may need a week, or more, to provide you with your data, and may not provide that data in a file format which is immediately useful to you. The more responsible vendors are however providing the necessary tools for customers who wish to migrate away from their Cloud.
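Such a worksheet doesn’t need to be anything fancy. The sketch below shows the idea as a small script; every line item and dollar figure is illustrative only, and your own list will certainly differ.

```python
# A minimal cost-comparison worksheet, sketched as a script.
# Line items and dollar figures are illustrative placeholders only.

on_premises = {
    "hardware (amortized per year)": 12000,
    "software licences per year": 8000,
    "power and cooling per year": 3000,
    "support staff share per year": 20000,
}

cloud = {
    "subscription per year": 30000,
    "migration services (one-time, spread over 3 years)": 5000,
    "bandwidth upgrade per year": 2400,
    "exit/repatriation reserve per year": 1500,  # budget for leaving the Cloud too
}

for name, items in (("On-premises", on_premises), ("Cloud", cloud)):
    total = sum(items.values())
    print(f"{name}: ${total:,} per year")
```

Note the last cloud line item: budgeting for an eventual return from the Cloud, as discussed above, belongs in the worksheet from day one.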
Don’t forget that with a Cloud based solution, there may be additional costs should you need to upgrade your Internet bandwidth to handle the increased traffic. Depending on how heavily utilized your Internet connection will be, you may require a firewall capable of prioritizing traffic, so that your Cloud services don’t slow to a crawl. If your Cloud services are critical to your business operation, perhaps you need redundant Internet connections from two different Internet Service Providers.
Each business’s requirements and situation are unique, and with thousands of Cloud based services, it is impossible to come up with a definitive checklist for you. What will save you money is to do as much planning as you can, and then do even a bit more. You will find that as you revise the costs in your worksheet, you will discover additional expenses that you forgot about. For example, moving to a new software version may require updating or replacing existing desktops and laptops. Don’t try to shortcut the planning process.
Some companies expect that moving to the Cloud will allow them to benefit by cutting the number of support staff. Be realistic as to whether a Cloud based solution has any real impact on your staffing requirements. For example, if you have a Microsoft Exchange Administrator today, moving to a Cloud based Exchange solution will not change your need for that person’s expertise and assistance. Cloud based solutions may increase your need for staff, or decrease your needs, so be careful before you make any assumptions regarding staff cost savings.
Lanworks has moved customers into the Cloud, and moved customers out of the Cloud. We have seen what works well, and what doesn’t, and are happy to share our experiences. Lanworks has its own Cloud based Disaster Recovery and Backup services. We welcome your questions, and are always happy to assist you in your journey.
You have likely heard the expression that when you assume something, you make an ass out of u and me (ass-u-me). The common interpretation of this is that when we make assumptions, we may be embarrassed and/or disappointed when we find out we were wrong in making those assumptions. For example, when you purchase a used car, you should not assume the engine is in good health, and that it comes with a good spare tire. Making such assumptions could be very costly to you.
This brings me to the blog title of “Cloud Assumptions.” Almost every service is having a “Cloud” sticker put on it. Low prices are offered, and grand promises of how wonderful the Cloud is are freely given. However, I cannot emphasize strongly enough that you should not assume any functionality whatsoever that the vendor will not verify in writing. Over a series of blogs, I will research and bring you examples of functionality in Cloud services that you may make assumptions about, but should not.
Let’s start with one such service today.
Performance Not Always as Expected
Assumption: “I’m going to use an extremely well-known Backup vendor, who heavily advertises how great they are. I have an Internet pipe of 250 Mbps, so performance should be acceptable.”
I picked an unnamed vendor as I hear their advertising very often. They have been around for a long time and sound like a decent offering. My area of interest was in how fast I could get my data back from them, should I need to restore due to loss or encryption by malware. When I first researched them a few months ago, I found that when you attempted to restore large amounts of data, they would throttle your bandwidth to slow you down. As of today, though, good news! Or is it? From the FAQ on their site, Company X “does not throttle your restore (download) speeds.” That sounds like good news, does it not? Continuing on in their FAQ we read “Your data can typically be restored at speeds up to 10Mbps” and “it is possible to restore as much as 100GB per day.”
Restoring at (“up to”) 100 GB a day is not a realistic option for most businesses. Even a small business is likely going to be shut down for most of a week restoring their data. For businesses with over 200 GB of data (almost everyone), this solution would not be acceptable under any conditions. Personally, I think they are misleading their customers. To say “does not throttle” and “speeds up to 10 Mbps” in the same paragraph doesn’t sit well with me. This is still throttling, regardless of how they phrase it.
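The vendor’s two numbers are easy to check against each other, and the arithmetic also tells you how long your own restore would take. A quick sanity check:

```python
# Sanity-check the vendor's numbers: does "up to 10 Mbps" really work
# out to roughly 100 GB per day, and how long would a full restore take?

def restore_days(data_gb: float, mbps: float) -> float:
    """Days to download data_gb gigabytes at a sustained rate of mbps megabits/s."""
    seconds = (data_gb * 8 * 1000) / mbps   # GB -> gigabits -> seconds at mbps
    return seconds / 86400                  # seconds -> days

# 10 Mbps expressed in gigabytes per day:
gb_per_day = 10 / 8 / 1000 * 86400
print(f"10 Mbps sustains about {gb_per_day:.0f} GB per day")
print(f"Restoring 500 GB would take about {restore_days(500, 10):.1f} days")
```

So the two FAQ claims are at least internally consistent (10 Mbps works out to roughly 108 GB per day), but that consistency is exactly the problem: a business with 500 GB of data is looking at the better part of a week offline.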
Upload and download speeds are just two of the factors you should examine carefully before signing on the dotted line. There are many more details to be wary of, and I look forward to sharing these in future blogs.
As a side note, I will not name this vendor. Primarily, I don’t wish to hear from their lawyers. Perhaps more importantly, vendor policies can change at any time. What is important to your business is their policy at the time you sign a contract with them, not what they may have done in the past.
Stay tuned for the next in this series of blogs on the assumptions made about the Cloud.
Providing commentary on hyper-converged infrastructure (HCI) is challenging. After a few years on the market, practically every major vendor now has some sort of HCI offering… well, at least their interpretation of what HCI is. In the time it takes you to read this blog, you will probably have received another vendor email touting how wonderful their HCI solution is. So if every vendor is touting how great HCI solutions are, how can I possibly question their assertions?
In my opinion, many vendors’ HCI solutions don’t look any different from traditional IT solutions. With HCI, I feel as if vendors just sprinkled some magic dust on an existing management interface. But HCI solutions demand a premium price, so I am looking for some significant benefits for our customers.
The Claimed Benefits Get a Reality Check
Using a checklist from a well-known HCI vendor, I’ve evaluated each list item to determine whether the claim is beneficial or not.
- “Single Vendor Solution” from Delivery, through Support.
I would agree that this is an ideal situation; however, with the consequence that it greatly narrows your flexibility and choice of products. You are typically locked into purchasing from that vendor and there is no interoperability between vendors. So if you find any aspect of your HCI solution lacking, you are stuck with it. Maybe this was not the benefit you were looking for?
- “Single Shared Pool of x86 Resources”
Is this not what Virtualization is about? I think it is important that we evaluate vendor statements to determine if there is a real benefit, or is it just some “nice sounding words”?
- “Ease of Scale” “Easily scales by adding x86 building blocks”
Depending on your situation this may be a “win” for you, or perhaps for your vendor. The nature of this “building block” architecture means that in order to add any one of CPU, memory, or storage, you need to add all three of them in an HCI “block”. So they are right, it is “easy”, but “easy” is not the same as “inexpensive”.
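The granularity problem is easy to illustrate with a small sketch. The block specifications and workload numbers below are made up for the example, not any real vendor’s figures; the point is that one hungry axis forces you to buy all three.

```python
# Illustrates the granularity problem with "building block" scaling:
# to satisfy a need along one axis, you buy capacity on all three.
# Block specs and workload needs are made-up numbers for illustration.

import math

BLOCK = {"cpu_cores": 32, "ram_gb": 512, "storage_tb": 20}   # one HCI block
need  = {"cpu_cores": 16, "ram_gb": 1536, "storage_tb": 10}  # a RAM-hungry workload

# You must buy enough blocks to cover the *largest* per-axis demand.
blocks = max(math.ceil(need[k] / BLOCK[k]) for k in BLOCK)
print(f"Blocks required: {blocks}")
for k in BLOCK:
    bought = blocks * BLOCK[k]
    print(f"  {k}: need {need[k]}, bought {bought} ({bought - need[k]} surplus)")
```

Here a workload that only needed more memory ends up paying for three blocks’ worth of CPU and storage it will never use, which is the sense in which “easy to scale” can be easy mostly for the vendor.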
- “Centralized Management”
This is one of my favourites, as it sounds like “Datacenter Control for Dummies”. From what I have seen, this single dashboard approach has been obtained by removing much of the rich functionality that traditional products offer.
- “Hyper-Efficient Use of Resources” “Data center components are not idle resources”
Isn’t this the problem that virtualization solved a decade ago?
- “VM-Centricity” “The management paradigm shifts”
You should probably avoid any sentence like this. This vendor goes on to say “eliminates the need for infrastructure specialists” … sure, you can have Kathy in Accounting responsible for your infrastructure, and that will be fine until the first time it goes down.
- “Native Data Protection”
This is in regard to these solutions including “backup, recovery and disaster recovery”. It sounds good, but the implementation may be too simplistic to be of real value to you, only allowing you to restore entire VMs. Just because a product says “recovery”, don’t assume any functionality until it is shown to you.
- “Software-Centric Design” “to meet software-defined data center requirements”
Their last item managed to fit in the magic “software-defined data center” phrase. You have likely heard this phrase hundreds of times, and although it sounds impressive, it does not necessarily do anything different from what you are doing now.
At the end of the day, all I ask of you (and my clients) is to carefully read their literature, and write notes about what is important to you if evaluating an HCI solution. Ask the vendor detailed questions, and insist on a live demonstration to determine which functionality is important to you.
Many HCI solutions will require a minimum number of “blocks” to be purchased initially. Each time you need more computing/storage resources, you may also need to purchase multiple “blocks”. You want to identify all aspects of living with a single vendor’s HCI, as you don’t have an easy way to get out. If your newest ERP application requires more compute resources than you have, you don’t want to be surprised that you need $50,000 for the next incremental block upgrade.
There can be a good fit between your needs and what an HCI vendor can provide. It is just that the vendor lock-in aspect of HCI requires more due diligence than normal. With more traditional IT, if you buy the wrong switch for your needs, you can simply buy another switch. With HCI, if you buy the wrong solution, you have to replace the entire infrastructure… that would be a bad day for anyone.
It is difficult to turn on the news without hearing something more about Ashley Madison, and the confidential information of their customers being leaked into the press. The nature of this website makes much of our society giddy with thoughts of friends, co-workers, and acquaintances possibly showing up in the leaked data. I myself have been swept up in lunchtime conversations about this site.
A question that surfaces during each discussion is whether you should look at the personal data being released. I’m going to say that in my opinion, the answer is unequivocally “NO”. Looking at this question from an ethical angle I considered a few factors. Ashley Madison operates a legal business in Ontario, involving consenting adults of legal age. Second, the information is of a confidential nature, and was released publicly without the permission of the users.
Is the release of this information to the public any different from the personal data released by black hat hackers from other companies? When Sony and Home Depot were hacked and customer information released, did you seek it out? No, you likely didn’t, partly because you didn’t care about the information, but hopefully also because you believed it would not be ethical to look.
Is the Ashley Madison hack really different?
I am a HUGE advocate for privacy of both personal, and corporate information. My personal information is not anyone else’s business. Your personal information is yours, and everyone else should respect it. Businesses often survive only because their competitors do not have access to their confidential business information. Lanworks survives because we respect our customers’ absolute right to privacy regarding all aspects of their business. This is not restricted to their data, but to all aspects of how they run their businesses.
I would suggest that we take the high road and stand up for our desire for protection of our personal information, by not violating the personal information of others. If no one was to look at the stolen information, we would disempower those who steal and release such information. By viewing personal information obtained illegally, you are supporting criminal elements.
What if it was your information? What if next time, it is the release of critically important competitive information from your company? What can you do to prevent this from happening? The type of product you may want to explore is called Data Loss Prevention (DLP).
One concept of a DLP is to have it profile your important documents. Then should someone try to send these documents (or even a modified form of them), the DLP can be set to log the event, email an alert to your system administrator, and stop the document transfer immediately. DLP products go well beyond this capability, and are designed specifically to prevent inappropriate data from leaving your company.
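The “profiling” idea can be sketched in a few lines. This is a toy illustration of document fingerprinting, not how any particular DLP product works: real products use far more robust techniques, but the core idea of matching overlapping fragments so that even a modified copy is caught looks something like this.

```python
# A toy sketch of how a DLP product might "profile" a document: hash
# overlapping word shingles, then flag outbound text that shares too
# many shingles with a protected document. Real DLP products are far
# more sophisticated; this only illustrates the fingerprinting idea.

import hashlib

def fingerprint(text: str, k: int = 5) -> set:
    """Return the set of hashes of every k-word shingle in the text."""
    words = text.lower().split()
    return {
        hashlib.sha1(" ".join(words[i:i + k]).encode()).hexdigest()
        for i in range(max(len(words) - k + 1, 1))
    }

def similarity(a: str, b: str) -> float:
    """Jaccard similarity of the two texts' shingle fingerprints."""
    fa, fb = fingerprint(a), fingerprint(b)
    return len(fa & fb) / max(len(fa | fb), 1)

secret = "quarterly revenue projections for the new product line remain confidential"
leaked = ("the quarterly revenue projections for the new product line "
          "remain confidential see attached")

if similarity(secret, leaked) > 0.3:
    print("ALERT: outbound message matches a protected document")
```

Because the match is on overlapping fragments rather than the whole file, lightly edited copies of the document (a changed title, an added sentence) still score high and trigger the alert, which is what lets a DLP catch “even a modified form” of a protected document.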
This is but one step you can take, in a series of steps, from simple to complex. Think that perhaps you need help with your security posture? Let us know, and we can help.