Mozy Celebrates World Backup Day 2012

Celebrate World Backup Day with Mozy

Mozy is excited to celebrate World Backup Day 2012. As you know, we’re passionate about backup, and we want to help people protect their important memories and business files. Mozy online backup is easy, secure, and affordable. This celebration is a good time to review your current backup situation to make sure everything is in order. Mozy makes this easy to do. Check out the following links for more information.

Why Mozy?

Mozy is the leader in data protection and access

For years we’ve served more than 3 million users and 70,000 businesses. With regular updates and improvements under our belt, we know how to protect your data in the best possible way.

70+ petabytes

When we say we have experience in data, we mean it! One petabyte = 1,024 terabytes. So Mozy manages over 71,680 terabytes. That’s a lot of data! With that comes experience and expertise. It’s no small task, but Mozy has perfected the art.

Not backing up yet? Check out our recent blog post series, “What to look for in a backup vendor?”

Are you backing up, but not using an online backup provider? It’s never too late to get started. Check out our blog post, “The Top 5 Signs it’s Time to Move to the Cloud,” for a fun look at how easy it is to move to the cloud.

Contests:

It wouldn’t be a celebration without a few contests! To commemorate World Backup Day 2012, Mozy is giving away 3 Zaggsparq device chargers – the best way to power and charge your devices. So, how can you get your hands on one of these goodies? There are two ways to enter:

Retweet on Twitter:

Click here to RT the following tweet and enter to win. (We’ll only count 1 of your tweets per day)

“I’m entering the @Mozy contest to win awesome prizes. You should RT to enter too! Back up with Mozy: http://ow.ly/9XxL8 #worldbackupday”

FB quiz:

Click here to take our Facebook World Backup Day quiz. Ace the quiz and you’ll be entered in the drawing for the prizes. You can only complete the quiz once, so make sure to study first!

Speeding up your applications in the Cloud with Blue Coat MACH5

If you think that moving to cloud-based applications will slow down your business, take a closer look at Blue Coat’s MACH5 WAN optimization appliance. It has some nifty features that can really improve cloud performance. Slow application delivery is an old problem; what’s new is how MACH5 solves it for the many companies now using cloud computing. Putting your apps in the cloud doesn’t mean they have to run slower; in fact, with MACH5, they can be tuned to run at close to local load times.

Today’s networks are evolving toward more Internet-based cloud deployments using Web applications and protocols, incorporating more multimedia content such as Flash and Silverlight, as well as shared documents and other collaboration tools. There are thousands of software-as-a-service applications today, each catering to an important facet of your business.

Blue Coat believes that the greater adoption of the cloud will lead more companies to bring direct Internet access to branch offices. The company calls this “Branch to Cloud.” The service accomplishes several things: it protects your branch offices from Web-based malware in real time, moves information back and forth securely, and delivers documents from cloud-based repositories quickly, keeping bandwidth demands lean.

The idea is to install one of their appliances in a local office, and set it up to optimize for cloud delivery as you see in our series of configuration screens. The appliance will cache frequently used content, partial Web pages, images, video and downloaded files, so that subsequent accesses happen quickly.

But the real genius here is what’s missing from the solution. As you can see in our diagram above, you only need a single Blue Coat appliance on your local network. Unlike competitors’ gear, you don’t need a matched pair when it comes to software-as-a-service, which is a good thing, because getting that kind of access to most cloud providers’ networks would be difficult to impossible. Blue Coat’s CloudCaching Engine provides asymmetric, or “one-sided,” WAN optimization by using specialized caching and SSL decryption capabilities.
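To make the “one-sided” idea concrete, here is a minimal, purely illustrative Python sketch of an edge cache: the branch-office box keeps recently fetched objects and serves repeat requests locally, while the cloud side needs no matching appliance. This is not Blue Coat’s implementation, just the general caching pattern described above; `fetch_from_cloud` stands in for whatever actually retrieves the object over the WAN.

```python
from collections import OrderedDict

class EdgeCache:
    """Toy LRU cache illustrating one-sided ("asymmetric") WAN optimization:
    only the branch appliance keeps state; the cloud service is untouched."""

    def __init__(self, max_items=1000):
        self.store = OrderedDict()
        self.max_items = max_items

    def get(self, url, fetch_from_cloud):
        if url in self.store:                    # cache hit: serve locally, no WAN trip
            self.store.move_to_end(url)
            return self.store[url]
        body = fetch_from_cloud(url)             # cache miss: one WAN round trip
        self.store[url] = body
        if len(self.store) > self.max_items:     # evict the least recently used object
            self.store.popitem(last=False)
        return body
```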

Better wide-area performance is just one feature of MACH5, which is part of an integrated line of WAN optimization appliances from Blue Coat. These appliances also accelerate remote access to email, centralized files, storage and enterprise applications, and they optimize live and on-demand video, improving the user experience while reducing bandwidth consumption to a thousandth or less of what it would otherwise be. If you have any of these applications on your network and your users are complaining about sluggish performance, you might want to check them out on Blue Coat’s website.


World Backup Day – Are you backed up?

“World Backup Day” is March 31. Is your business backed up properly?

This weekend is a great excuse to make sure your business data is protected.

Consider, in the last year:

  • 50% of all businesses reported that an employee’s hard drive had crashed
  • 11% had a laptop stolen while on business
  • In 72% of all cases above, the data was not fully recovered.
  • 60% of businesses surveyed do not budget for any form of backup
  • Of those that do, 40% only backup to a single location
  • ~30% of businesses allow employees to select their own method of backup

Backup methods include:

[Chart: Business Backup Methods]

These statistics come from a recent Mozy survey of more than 640 small-to-midsize businesses in the U.S. The survey, conducted by the independent research firm Compass Partners, examined employees’ and executives’ habits and attitudes about backup and data security. It found that a significant number of SMBs don’t implement safe backup strategies, despite the well-documented risks of losing sensitive client and company data.

In a press release issued today, Mozy’s director of product management Gytis Barzdukas suggested: “Professionals should take the following steps to implement backup practices. First, find a secure and reliable cloud service to complement a local backup device, which by itself can easily be destroyed, damaged or misplaced. Second, the offsite service chosen should automatically back up data, be user-friendly and should emphasize data security and privacy through a strong encryption method. Finally, companies should extend backup policies to include strategies for protecting the data on mobile devices, as analysts predict a surge in employees using personal smartphones or tablets for business purposes throughout 2012.”

So be sure your business data is securely backed up this weekend!

P.S. While this survey was focused on businesses, personal data also needs protection. And for that we have MozyHome.

Be safe,

The Mozy Team

How to Select a Cloud Backup and Recovery Vendor – Part 3

(This article is the third in a three-part series exploring how to evaluate and select a cloud backup and recovery service. Previous articles explored how to evaluate your data needs and how different data types are treated by backup services. Read Part 1 here and Part 2 here.)

Backups, whether they’re local or to the cloud (or “hybrid,” doing both at the same time), can be done in a variety of ways. Just as you’d want to know whether a car can accelerate quickly before buying it to drive on New Jersey’s Garden State Parkway, whose entrance ramps are often very short, you need to know whether a cloud service’s operation matches your needs. Some services do scheduled backups, e.g., daily at 2 a.m. That means if your hard drive crashes at 4 p.m., you’ve lost everything you’ve been working on all day.

Other services run on a more frequent schedule, perhaps every three or six hours – again, possibly not good enough for your needs. Still other services back up continuously, meaning they keep checking your files for changes. Even here, there are various ways this may be done. Some services back up only when you exit an application, e.g., close down Microsoft Word. Some back up a file when you save and close it, which still may not be good enough if it’s a spreadsheet or other file you typically keep open all day long. (Or you have to change the way you work.) Other backup services save changes to a file every few minutes, or however frequently you specify, or even every time the application writes to the computer’s disk.
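As an illustration of the “continuous” end of that spectrum, here is a minimal Python sketch that polls a set of files and backs up only the ones whose modification time has advanced. The `upload` callable is a hypothetical stand-in for whatever a given service actually does; real products typically hook file-system events rather than polling, and back up on a much finer schedule.

```python
import os
import time

def changed_files(paths, last_mtimes):
    """Return the files whose modification time advanced since the last pass."""
    changed = []
    for p in paths:
        mtime = os.path.getmtime(p)
        if mtime > last_mtimes.get(p, 0):
            changed.append(p)
            last_mtimes[p] = mtime
    return changed

def watch_and_backup(paths, upload, interval_seconds=300):
    """Every `interval_seconds`, back up only the files that changed."""
    last_mtimes = {}
    while True:
        for path in changed_files(paths, last_mtimes):
            upload(path)              # hypothetical upload function
        time.sleep(interval_seconds)
```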

Incremental and versioning

Obviously, continuous or very frequent backups offer the most protection for your data.

But you also need to know two important aspects of what the cloud backup is doing.

One, does the backup have to upload the entire file each time? For big files, like an Outlook .PST file, this can take a long time, and consume a lot of bandwidth. Or does the backup do an “incremental” backup — upload only the changes, applying them to the cloud backup?

And you also need to know, and be able to set, backup “versioning.” What if you want to get back a file the way it was the day before, for example? Does the service offer “versioning”? If so, how many “versions” will it maintain?
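To see how incremental upload and versioning fit together, here is a rough Python sketch (not any particular vendor’s method): each file is split into fixed-size chunks, only chunks the service has not already stored are uploaded, and each version is recorded as the ordered list of chunk hashes, so older versions can still be reassembled. The chunk size and helper names are illustrative.

```python
import hashlib

CHUNK_SIZE = 4 * 1024 * 1024   # 4 MB chunks (arbitrary choice)

def chunk_hashes(path):
    """Split a file into fixed-size chunks and hash each one."""
    hashes = []
    with open(path, "rb") as f:
        while chunk := f.read(CHUNK_SIZE):
            hashes.append(hashlib.sha256(chunk).hexdigest())
    return hashes

def incremental_backup(path, already_stored, upload_chunk, versions):
    """Upload only chunks the service hasn't seen; record the new version
    as the list of chunk hashes, so any old version can be reassembled."""
    hashes = chunk_hashes(path)
    with open(path, "rb") as f:
        for h in hashes:
            chunk = f.read(CHUNK_SIZE)
            if h not in already_stored:
                upload_chunk(h, chunk)   # hypothetical upload call
                already_stored.add(h)
    versions.setdefault(path, []).append(hashes)   # newest version last
```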

Backup considerations

Other important things to determine about a cloud backup provider (by reading their website, and, if need be, asking a sales person) include:

  • If you delete a file from your hard drive, does the backup service delete its copy? Or does it preserve deleted files, and if so, for how long?
  • Recovering files: Can you do it through any web browser on any computer (assuming you haven’t lost your password)? How long does it take to recover a few files or a few directories?
  • Recovering large amounts of data: How long does it take the cloud service to make a gigabyte, or many gigabytes, of recovered data available, and how long will the download take? For very large recoveries (many gigabytes to hundreds of gigabytes), can you request that the data be sent to you on a DVD or hard drive? If so, how much extra does that cost, and how quickly can it be done? (A rough transfer-time estimate is sketched after this list.)
  • How much does the cloud backup service cost, and how are charges determined? Are charges based on the amount of data being protected (what’s on your hard drive(s)) or on the size of the stored backup? Can you back up several computers to a single account, and if so, are there per-computer charges? Are you charged for retrievals? For customer service calls?
  • What operating system(s) and version(s) does the cloud backup service support? For example, if you’ve got a Mac and the service only supports Windows, that doesn’t do you any good.
  • How long will the first backup take? If you have a lot of data, it can take quite a while. Can you “prime the pump” by sending in a copy of your data on a DVD or hard drive (making sure it’s secured with encryption and a password!)? And if so, what does that cost?
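For the recovery-time and first-backup questions above, a rough back-of-the-envelope calculation helps set expectations. The sketch below ignores protocol overhead, compression, and de-duplication, so treat the result as an order-of-magnitude estimate only; the 100 GB and 5 Mbit/s figures are just example inputs.

```python
def transfer_hours(gigabytes, megabits_per_second):
    """Rough transfer-time estimate: convert GB to bits, divide by link speed."""
    bits = gigabytes * 8 * 1000**3
    return bits / (megabits_per_second * 1000**2) / 3600

# e.g. a 100 GB first backup (or recovery) over a 5 Mbit/s link:
print(round(transfer_hours(100, 5), 1), "hours")   # ~44.4 hours
```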

Once you know what a cloud backup service is doing, you can see if it’s a match for all, or some, of your data backup needs. If you’ve done your homework, you’ll know when you look at cloud services which ones may be a match, and which ones clearly aren’t.

Then you can look at which of these is the best match, based on the way you work and your backup needs.


Top 5 Signs It’s Time to Upgrade and Enter the Cloud

Hard as it is, it just may be time to let go of yesterday’s technology and get on with your life. The signs have been getting clearer over the last few years. There was that incident with the cat photos at work. And that visit from the uptight suit from the IRS. So before things really get out of control, do yourself, your family and Huey Lewis a favor and trade up to the 21st century and the magic of all that is cloud computing.

Wait For the Beep…

You don’t have voicemail. You have an answering machine. A big, hulking Panasonic monstrosity that requires a team to lift it when you rearrange the home office.

Solution: Upgrade to a smartphone, or any VoIP setup.

Catastrophe

A pile of work documents has somehow co-mingled with dozens of 8×10 prints of your great-aunt’s feline companion “Buttons” dressed as characters from “The Wizard of Oz.” It doesn’t take a wizard to see this is headed nowhere good.

Solution: Explain to the senior vice president that yes, it is a cat, and yes, it is also a flying monkey, but no, it has no bearing on the Rooney account. Hook up Aunt Mable with a Facebook account, and buy yourself a document management solution and add some backup and storage options for your work files.

Tax Alot

You use your buddy who can crunch and store large numbers in his head as your accountant. Sure, he was featured on Stan Lee’s “Superhumans,” but this doesn’t seem to impress the guy from the IRS who’s looking at you in a funny way.

Solution: Utilize the benefits of cloud computing to compile and store important receipts and tax documents.

Mix Signals

Your cassette player is undoubtedly cool, and ’80s retro, and a conversation starter and ironic. But you’re spending $87 a week on AA batteries. Even Huey Lewis would surely understand that simple economics suggest it’s time to hang up those foam-covered headphones and pick up a cloud-based music service.

Solution: Pandora, iTunes, Rhapsody, watching episodes of VH1’s “I Love the ’80s.”

Fine Print

Print may not be dead, but the cost of printing out photos is nearly killin’ ya. The home office is beginning to resemble a second-rate law library, with dozens and dozens of brown, imitation-leather-bound photo albums lining shelves and cluttering tabletops. We know it was a pretty rainbow, but did you really need to print out 45 shots of it? Less is more. For real.

Solution: Store and share the majority of your photos in the cloud, and pick a handful of special ones to print out and display. Like that one of the cat dressed as the munchkin mayor.


Cloud Computing and Links – March 26

‘Personal Cloud’ to Replace PC by 2014, Says Gartner

The cloud has certainly grabbed the attention of both big business and the typical consumer, but the technology’s impact may signal the end of the PC as we know it. Research firm Gartner believes the personal cloud will replace the PC as the center of our digital lives sooner than you might think: 2014.

“Major trends in client computing have shifted the market away from a focus on personal computers to a broader device perspective that includes smartphones, tablets and other consumer devices,” Steve Kleynhans, research vice president at Gartner, said in a statement. “Emerging cloud services will become the glue that connects the Web of devices that users choose to access during the different aspects of their daily life.”

Mike Barton, of Wired’s Cloudline blog, delves into the subject here. Barton draws former Microsoft chief software architect Ray Ozzie into the discussion.

“People argue about, ‘Are we in a post-PC world?’ Why are we arguing? Of course we are in a post-PC world,” Ozzie said at a recent GeekWire-sponsored conference. “That doesn’t mean the PC dies; that just means that the scenarios that we use them in, we stop referring to them as PCs, we refer to them as other things.”

And You Thought Your Utility Bills Were High

How many servers does it take to power Amazon’s huge cloud computing operation? Like most large Internet companies, Amazon doesn’t disclose such details. But a researcher estimates that Amazon Web Services is using at least 454,400 servers in seven data center hubs around the globe, according to a post at Data Center Knowledge.

Huan Liu, a research manager at Accenture Technology Labs, analyzed Amazon’s EC2 compute service using internal and external IP addresses, which he extrapolated to come up with estimates for the number of racks in each data center location. Liu then applied an assumption of 64 blade servers per rack – four 10U chassis, each holding eight blades – to arrive at the estimate.
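The headline figure is easy to sanity-check. Under the stated 64-servers-per-rack assumption, 454,400 servers works out to about 7,100 racks spread across the seven hubs; a two-line check:

```python
# Quick check of the extrapolation described above (figures are Liu's assumptions)
servers_per_rack = 64
estimated_servers = 454_400
print(estimated_servers // servers_per_rack)   # 7100 racks across the seven hubs
```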

Liu’s estimate is bound to generate some debate. But it provides an additional point of reference for Amazon’s scale, along with earlier analyses. It clearly places the size of Amazon’s infrastructure well above the hosting providers that have publicly disclosed their server counts, but still well below the estimated 900,000 servers in Google’s data center network.

Cloud Hunts for the Origins of the Universe

When your day job is figuring out the workings of the universe, you need some heavy-duty computing power at your disposal.

That’s why researchers at CERN, the Swiss research lab that is home to the Large Hadron Collider (LHC) particle accelerator, are dialing up additional muscle from the cloud, according to ZDNet.

CERN is taking part in the Helix Nebula initiative, a pilot project designed to kick start the European cloud computing industry by carrying out scientific research in the cloud.

“On the CERN site we can’t increase the size of our data center much more. Two or three years down the line we’re going to be limited by space and by electrical consumption. We have to think of what other options are open to us and the on-demand, elastic cloud computing provided by a number of these companies seems like a very good option for us to explore,” said Bob Jones, CERN’s head of openlab, the public-private partnership that helps CERN identify new IT that could benefit the lab.

CERN’s mission is to answer fundamental questions such as “What is the origin of mass?” Heavy stuff.


How to Make the Private Cloud More Secure

Security concerns remain one of the biggest obstacles to cloud computing adoption, even as spending on cloud-based solutions accelerates. Users welcome the affordability and scalability of cloud solutions, but many remain fearful about the potential for network breaches and leaks. These fears typically focus on public cloud offerings, which opens up opportunities for tools that secure private cloud environments.

Just as in the physical world, security in the virtual world takes a multi-pronged approach. You need basic anti-virus/anti-malware protection, just like any desktop or server across your enterprise receives; access controls, so that a random employee can’t bring down your entire virtual infrastructure; firewalls and intrusion prevention products to keep network-based attackers out; and auditing and compliance tools to make sure your security is up to snuff. That is a lot of gear to handle, and all of it has to be cloud-aware, otherwise it won’t be much use. Let’s look at some typical products in each category.

Reflex’ Virtual Management Center is the most comprehensive security solution, with modules in three broad areas (auditing/compliance, firewall/intrusion detection, and access controls). The product is actually four separate protective modules that are knit together with separate reporting and management consoles:

  • vTrust for virtual firewall protection
  • vCapacity for capacity management
  • vWatch for performance and resource monitoring
  • vProfile for configuration management

Trend Micro purchased Third Brigade and has incorporated its features into its Deep Security product. The product has a variety of protective modules, including agent or agentless firewall/IDS, anti-malware, and web application protection. As you might suspect from a consumer software company, its Web management interface is very attractive and the dashboard has a lot going on. At a glance you can see your entire VM collection, whether any protective measures have been installed, and what alerts have been reported. You have to use the maps generated by VMware to see a visual picture of your network of VMs and their hosts.

Then there is Dome9.com, which is trying to make the cloud more secure by providing an automated service that centralizes and consolidates security management across both private and public clouds, inside and outside of your data center, including VMs residing on Rackspace, Amazon’s EC2 and GoGrid. It will manage the existing built-in firewalls on all of your Windows and Linux servers. The product uses either agents or talks directly to VMware and other cloud provider APIs to automate secure access. For example, you can open and close RDP ports on a timed schedule to make sure that someone didn’t inadvertently leave them open after finishing a remote connection.

It can also close ports while still letting legitimate server admins get in on an as-needed basis, without their having to ask the overall security administrator to temporarily grant that access.
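The timed-access idea is simple enough to sketch. The snippet below is not Dome9’s API; it just illustrates the pattern of granting a port “lease” and guaranteeing the port is closed again afterward, using a hypothetical `firewall` object with `open()` and `close()` methods.

```python
import time

def open_port_temporarily(firewall, port, lease_seconds):
    """Grant on-demand access by opening a port for a fixed lease, then
    closing it again so it can't be left open by accident.
    `firewall` is a hypothetical object exposing open()/close()."""
    firewall.open(port)
    try:
        time.sleep(lease_seconds)    # e.g. 3600 for a one-hour RDP session
    finally:
        firewall.close(port)         # always re-close, even if something fails
```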

Tier 3’s Environment Engine can help automate various Microsoft and Linux server deployments. Each deployment can be configured to be private, shared publicly, or shared only with specific individuals. You can add multiple VMs so that an entire Web app can be brought up with a single command, even though it is deployed across multiple Web, database, and app servers on different VMs. You can script out an entire installation, adding monitoring, backups and firewall rule sets – in short, you can replicate your entire computing environment in the cloud.

As you can see, the range of products and services available to secure cloud computing is huge, and it is only growing as the importance of the cloud increases for many IT managers. You should try out some of these services and experiment with the kinds of protective features you need to feel comfortable with your cloud deployment.

We have only touched on a few of the products in this space, so feel free to share the ones that you recommend as well.


How to Select a Cloud Backup and Recovery Vendor – Part 2

(This article is the second in a three-part series exploring how to evaluate and select a cloud backup and recovery service. The previous article explored how to evaluate your data needs, and the next article will cover the different backup methods. Read Part 1 here, and Part 3 here.)

In terms of backup requirements, not all of your data is the same.

One way of looking at your data is by importance: What data can’t you live without? What would you be unable to reconstruct or rebuild? For example, you can re-rip new copies of your audio CDs or re-scan your old photographs, if you still have them, but you won’t be able to rewrite your project report or your novel manuscript from memory; you won’t be able to re-take pictures of your dog from five years ago.

Another question: What data do you need back as soon as possible, and how soon is “as soon as possible”? This is what backup experts typically call the “Recovery Point Objective” (RPO), roughly how much recent work you can afford to lose, and the “Recovery Time Objective” (RTO), how quickly you need the data back.

For example, I’m a freelance writer; the files for my active projects, plus some key calendar, to-do list and other files, typically total maybe a quarter of a gigabyte. My “archives” — files for projects I’m no longer actively working on — and other less-critical files represent maybe a gigabyte or so.

Not prepared to lose

But I’ve also got 50+ gigabytes of photos, 25+ gigabytes of video, some audio, dozens of scanned images, and gigabytes of assorted sundry stuff.

And when I get to digitizing my older photos and negatives, record albums, and CDs, I’m sure I’ll have a terabyte or so of additional multimedia files.

None of which I am prepared to lose — so it all must be backed up.

For you, essential data you need available may include three large databases, many spreadsheets, several presentations, the past three months’ worth of email, and client billing and payment data for the past six months. If you’re a professional photographer or designer, you may need a ready archive of tens, even hundreds of gigabytes of photos and images.

And you may have lots of personal multimedia — photos, video, scans, etc. — that you don’t want to lose.

RPOs and RTOs

So I’ve really got several sets of RPOs and RTOs, and yours might look similar to mine (a rough way to write these tiers down is sketched after the list):

  • For the RPO consisting of “Projects that I am actively working on, plus roughly half a dozen files of to-do, calendaring, etc.” my RTO would be “two to three hours at most.” Ideally, for the half-dozen or so files relating to projects I’m working on immediately, I’d prefer an RTO of “one hour or less.”
  • For the RPO that also includes other current projects, along with marketing and pitching, I could probably live with an RTO of 1-2 days.
  • For all my other files, I’m sure I could wait a week, even weeks to months — as long as I knew for sure that I’d get them all back.
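One way to make tiers like these concrete is to write them down so you can check a backup plan (or a vendor conversation) against them. The values below are illustrative and only roughly mirror the list above; a minimal Python sketch:

```python
from dataclasses import dataclass

@dataclass
class BackupTier:
    """One tier of data with its recovery objectives (illustrative values)."""
    name: str
    rpo_hours: float   # how much recent work you can afford to lose
    rto_hours: float   # how quickly you need the data back

tiers = [
    BackupTier("active projects + calendar/to-do",  rpo_hours=1,   rto_hours=3),
    BackupTier("other current projects, marketing", rpo_hours=24,  rto_hours=48),
    BackupTier("archives, photos, video, audio",    rpo_hours=168, rto_hours=720),
]
```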

All this, of course, is just for data. I’d also want a working computer with my core productivity applications on it. (Having recently bought a new, small notebook computer, I’ve got that covered — although there’s more I could be doing in that area… but that’s straying from “data backup.”)

Create and change

The next question: How often do I create or change files — and how much do I care about saving those changes?

For example, my multimedia files are pretty “static” — once I’ve created, organized and named or tagged them, I don’t expect to edit or change them, as a rule.

But the files for whatever I’m working on are created or changed throughout the day. If I lose a file that I have been working on all day (and my most recent backup was at midnight) I’ve lost hours of effort.

So you not only have to know how much data you have, but also how much of it changes frequently, and which and how much data you need near-continuous access to versus what you can wait a few days or even weeks to regain access to.

Now you’re ready to look at cloud backup services, and see which of these match your requirements.


Do We Need A Desktop OS Anymore?

Microsoft fought a long battle to achieve a near monopoly of the desktop operating system market, a position that may stand forever. But does it even matter? Do we even need a desktop OS anymore?

As we see what is happening with Windows 8 and Metro, I am coming to the conclusion that the answer to this question is “no.” We may be reaching the point where the desktop OS is no longer important, eclipsed by the developments of the browser and ironically a victim of better integration of the Web by Microsoft and others.

My prediction is that Windows 8 will become the OS/2 of the modern era: an OS that is elegant but instantly made obsolete by events, designed for the wrong chip (in the case of Windows 8, the mobile ARM CPUs) and based on a cellphone design ethos that no one could care less about. Yeah, but it has a great new set of APIs!

It wasn’t all that long ago that Internet Explorer became almost indistinguishable from Windows Explorer. And with the rise of Chromebooks and how much of our time is spent online, the particular desktop OS you run is almost irrelevant now. Who really cares what OS we run?

Remember when the desktop OS did things like keep track of directories, protect us from viruses (and Windows still doesn’t really do that all that well), make copies of files to removable media, and handle printing? Yes, I know I still can’t print my Web pages out with any kind of fidelity, and if I have an iPad, printing is almost an afterthought. But is that the browser’s fault or my OS’s?

Now that you can get gigabytes of free file storage in the cloud (thanks Mozy!), do you really care what is on your hard drive? Well, some of us dinosaurs (and I count myself among them) still cling to our hard drives, but soon they will be totems from another era, much the way many of you look upon 5.25-inch floppy disks, or even 8-inch ones if you can recall back that far. Wow, we could carry an entire 360 KB of something around with us! (Of course, we didn’t have MP3s or videos either, but still.) And all this cloud storage is happening as hard drives are getting so cheap that they will soon be given away in cereal boxes: a 2 TB drive can be had for less than $50.

Meanwhile, Adobe has big plans for Flash, where it will take over the kinds of OS-like services I mentioned above (though, so far, not the job of protecting us from malware). And Google is trying mightily to rejigger Web programming with its Dart language. And VMware has a new version of its View desktop virtualization product too, which is probably the “OS” I will really end up spending most of my time with going forward. Whatever comes of these efforts, it almost doesn’t matter whether we are running Windows or Mac or Linux, because we don’t need them anymore for our online lives.

Now stop and reconsider that last paragraph. Whom have we trusted with the next OS? It isn’t Microsoft, and it isn’t Apple. It is a bunch of folks from the valley who have never built an OS before (well, give Google half credit). Think about that for a moment.

Back at the dawn of the personal computing era in the 1980s, we all wrote dBase apps (and saved them on those darn floppies, too). Then we moved up to Lotus Notes, before the Web took root. Then we branched out in a dozen different directions, using all sorts of programming languages built on top of HTTP. That was the beginning of the end for the desktop OS.

Now we’ll still have desktops of one sort or another. And yes, Windows isn’t going away, much as Microsoft is determined to pry every last copy of XP from our cold, shaking hands. But when Adobe, Google and VMware all get done with their stuff, it won’t matter what will be running on our desktops. If we even have them around much longer.


Geeking Google Earth with the Mozy App

I’m a total geek for Google Earth. I use it to find places for my wife and me to explore in the Utah outback.

I scout out campsites with it. It helps me to locate minor roads and two tracks. Photos from Panoramio reveal lesser-known points of interest for us to explore. To me, Google Earth is one of the most wondrous innovations of the information age.

Google Earth has a native file format, known as “KML,” that you can use to share placemarks, routes, photos, and complex shapes for defining whole areas. For example, the United States Geological Survey (USGS) provides KML files for the surface geology of each US state. Also, many geolocation-enabled apps provide an “export to KML” option (Runkeeper is a great example).
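For readers who have never looked inside one, a KML file is just XML. The following Python sketch writes a minimal, hand-rolled placemark file; the coordinates and file name are arbitrary examples, and in practice you would usually let Google Earth or a library generate KML for you.

```python
def placemark_kml(name, lon, lat):
    """Return a minimal KML document containing one placemark.
    Note that KML coordinates are longitude,latitude (in that order)."""
    return f"""<?xml version="1.0" encoding="UTF-8"?>
<kml xmlns="http://www.opengis.net/kml/2.2">
  <Placemark>
    <name>{name}</name>
    <Point><coordinates>{lon},{lat},0</coordinates></Point>
  </Placemark>
</kml>"""

with open("campsite.kml", "w") as f:   # file name and point are just examples
    f.write(placemark_kml("Canyon overlook", -109.55, 38.73))
```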

Unfortunately, there has always been a big shortcoming with KML files. Although you could create them with Google Earth on your computer, you could not open them in the Google Earth app on your mobile device. At least, you couldn’t until this week. Google just updated the Android and iOS versions of Google Earth to v6.2.1, so the app can now accept a KML file when you send one to it. And this is awesome for me as a Mozy user!

I have numerous KML files in my Stash – many that I have created, others that I have downloaded. So now, I can use the Mozy app to send these files to the Google Earth app. I’m so stoked about this that I created a quick demo video for any other Mozy users who are as geeky about Google Earth as I am.

If you want to try it out, but you need a KML file, you can grab the one I used for the demo video from IntrepidXJ’s Adventure Blog (and also see some fantastic trip reports showing Utah scenery).


Until next time, be safe,

Ted