
Migrating to a new primary data center

Posted 5 years ago in Xero news by Duncan Ritchie

Last Sunday we migrated all of our customers, data and systems to a new primary data center to lay the foundations for further growth. The migration involved synchronizing approximately 10 terabytes (TB) across the 1,500 km between our old primary data center in Dallas and a newly prepared data center in Chicago. 10 TB is roughly the equivalent of the entire printed collection of the US Library of Congress.
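For a sense of the timescales involved, here is a back-of-the-envelope calculation of how long a 10 TB sync takes. The link speed and utilization figures are assumptions for illustration only; the post does not disclose the actual inter-site link capacity.

```python
# Rough transfer-time estimate for synchronizing 10 TB between data centers.
# The 1 Gb/s link speed and 70% effective utilization are assumptions, not
# figures from the post.

def transfer_hours(data_tb, link_gbps, utilization=0.7):
    """Hours needed to move data_tb terabytes over a link_gbps link."""
    data_bits = data_tb * 1e12 * 8              # terabytes -> bits
    effective_bps = link_gbps * 1e9 * utilization
    return data_bits / effective_bps / 3600     # seconds -> hours

# 10 TB over an assumed 1 Gb/s link at 70% utilization:
print(round(transfer_hours(10, 1), 1))  # roughly 31.7 hours
```

Numbers like these are why a bulk sync has to start, and largely finish, well before the cutover weekend, with only a final incremental catch-up left for the day itself.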

On the day we had 34 staff across multiple offices performing data migration, database and infrastructure tasks; testing the application to ensure the migration had completed successfully; handling social media and customer tickets; and standing by in case we hit any code issues. The team atmosphere was great, with people pitching in where required, from making coffees to triaging issues.

We completed the whole process about an hour early and have had very few issues since going live. Those of you who have been involved in large IT projects will know that migrations are rarely this smooth. The key for Xero was a great deal of preparation and practice, combined with a very talented and committed team.

Since the migration, the Xero applications have been a little more than 10% faster. While our goals weren’t performance related, it is good to see a material improvement.

The project

The project to plan for, design, build and migrate to our new hosting environment started in late June 2012 with a planning trip to San Francisco to meet with our CTO and then on to San Antonio, Texas to meet with Rackspace – our primary hosting partner.

We looked at the various options open to us from Platform as a Service (PaaS) offerings such as Microsoft Azure, Amazon AWS and Rackspace OpenCloud through to building and managing our own equipment. Ultimately we settled on the Rackspace Private Cloud solution based on a number of factors that allowed us to meet timeframes, performance and capacity drivers, the need for rapid scaling and of course cost effectiveness.

We have been operating on the Rackspace Managed Hosting and Private Cloud solutions since March 2008. Rackspace offer a flexible and comprehensive solution that gives us a good balance between the benefits of our own equipment and Infrastructure as a Service.

The detail

A number of people have asked us for the technical details of our new hosting environment, so the remainder of this post covers the aspects of the environment we are able to disclose.

The new hosting environment has completely new equipment, new versions of our virtualization software, operating systems, database software and SAN hardware. Every aspect of the environment has redundancy, which means that the failure of one (or often multiple) components won’t affect our customers.

We run a 10 Gb network backbone, with smaller servers and devices connected by one or two 1 Gb connections. We have F5 load balancers, plus intrusion detection systems and firewalls whose details we won’t disclose.

We moved our storage to an EMC VNX SAN, which has a mixture of SSD flash and traditional spinning disks. Overall we have over 100 TB of usable storage due to the need for multiple redundant copies of key data.

The servers in our new hosting environment are almost entirely virtualized on dedicated physical servers running VMware vSphere 5.1. We have two clusters – one runs application and utility servers and the other runs some of our database servers. Most of our servers run Windows Server 2008 R2, although we have a small number running Linux.

Our database layer was upgraded from Microsoft SQL Server 2008 to SQL Server 2012. SQL Server 2012 has a number of new features, but the most important for us was AlwaysOn Availability Groups. Our old platform used Windows Clustering failover SQL instances for database resilience, and we wanted something better. By comparison, Availability Groups are much easier to implement and maintain, and they recover rapidly in the event of a failure.
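From an application's point of view, the benefit of an Availability Group failover is that clients simply reconnect and are routed to the new primary replica. A minimal sketch of that client-side retry pattern is below; the `connect` callable and the simulated failover are hypothetical stand-ins, and real code would use a SQL Server driver connecting to the group's listener:

```python
import time

def connect_with_retry(connect, retries=5, delay=0.01):
    """Retry a connection attempt, as a client might do while an
    availability group fails over to a secondary replica.

    `connect` is any zero-argument callable that returns a connection
    object or raises ConnectionError while failover is in progress."""
    for attempt in range(retries):
        try:
            return connect()
        except ConnectionError:
            if attempt == retries - 1:
                raise
            time.sleep(delay)  # real code would back off for seconds, not ms

# Simulated failover: the first two attempts fail, then the new primary answers.
attempts = {"n": 0}
def fake_connect():
    attempts["n"] += 1
    if attempts["n"] < 3:
        raise ConnectionError("listener not yet redirected")
    return "connection-to-new-primary"

print(connect_with_retry(fake_connect))  # connection-to-new-primary
```

This is the pattern that makes the "recover rapidly" property visible to end users: a failover shows up as a few failed connection attempts rather than an outage.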


Gayle Buchanan
January 31, 2013 at 10.09 am

Thanks Duncan … from all the bookie chics who just ‘drive’ Xero knowing you take care of all the ‘under the hood stuff’ (way over my head that’s for sure!)

Rod Drury Xero
January 31, 2013 at 2.44 pm

@Jonathan we prefer real names and email addresses please, but happy to respond in this case.

Yes we noticed this as well.

All our data is locked away and only available to a small number of Xero staff with a strict authorisation process.

The US Government, in the unlikely event they cared about one of our customers, would have to approach us to get specific data.

Our position has always been that if asked, we would push hard to alert our customer prior to granting access so they had the chance to maintain their privacy.

Currently each access to the Xero application is logged and visible to the end customer. So effectively any access through the UI (the best way to see things quickly) would be logged and visible. We do not have any ‘hidden’ features that would let an agency see information in Xero.

While it’s an emotional issue, we haven’t seen this as a real problem in practice. I suspect that anyone who was nervous about any government seeing their data would be unlikely to use online services such as ours.

Cloud computing is still compelling, but it is interesting to see these issues come up.

James
January 31, 2013 at 3.09 pm

Maybe there is a market for a service with browser encryption and auto bank feeds from Cayman Island accounts?
Good job on the migration. I can only imagine the pain involved. Our database is only 100G and it takes the better part of a day to backup, move and restore on another machine.
Do you mind sharing how you scale out your SQL server databases? Do you use any tools to help or was the application designed to shard different customers into different databases?

Robbie Dellow
January 31, 2013 at 3.28 pm

quote – ‘Every aspect of the environment has redundancy’
Please advise what is in place with regards to Disaster Recovery – offsite.
Is the old primary data centre in Dallas planned to be decommissioned?

Duncan Ritchie Xero
January 31, 2013 at 6.52 pm

@James – Yes we shard customers into suitably sized databases to allow horizontal scaling.
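One common way to implement the kind of sharding Duncan describes is a deterministic mapping from customer to shard database. The sketch below uses hash-based placement; the database names, shard count and hashing scheme are illustrative assumptions, not Xero's actual design:

```python
import hashlib

# Hypothetical shard databases; a real deployment would size and name these
# to suit its own capacity planning.
SHARDS = ["customers_db_01", "customers_db_02",
          "customers_db_03", "customers_db_04"]

def shard_for(customer_id: str) -> str:
    """Deterministically map a customer to one of the shard databases.

    Hashing is one option; many systems instead keep a directory table
    recording each customer's shard, which makes it easier to rebalance
    customers between databases later."""
    digest = hashlib.sha1(customer_id.encode()).digest()
    index = int.from_bytes(digest[:4], "big") % len(SHARDS)
    return SHARDS[index]

# The same customer always lands on the same database:
assert shard_for("acme-ltd") == shard_for("acme-ltd")
```

Because each shard is an independent database, horizontal scaling becomes a matter of adding databases and placing new customers on them, and per-database operations (backup, restore, failover) stay a manageable size.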

Duncan Ritchie Xero
January 31, 2013 at 6.56 pm

@Robbie – Yes our secondary site has moved from Chicago to Dallas. Data is replicated to the secondary site continuously. We also have offsite tape backup which gives us at least four copies of production data.

February 1, 2013 at 10.50 am

Wow, it’s interesting to see this sort of transparency. I have almost no idea what most of that jargon means, but kudos to Xero for opening themselves up to scrutiny, shown in some of the comments here. Facebook opens up my life to Google and we don’t hear about it for months. That being said, as cloud services become the new standard, the laws are being written. Looks like Xero has taken a pretty responsible path. Respect.

February 1, 2013 at 11.53 am

Great job. As someone who has done many production migrations, I am impressed with the smoothness of this one. Also, as a user who left Wave Accounting for Xero recently due to Wave’s complete and utter failure of a major rollout, I am very happy to see everything working well. Big props to your IT team – excellent job. I honestly didn’t even notice the downtime or the transition, which is the best possible outcome.

March 12, 2013 at 10.22 pm

Great job done, looking forward to your session at the Auckland User Group meeting next week.

Mike Block CPA
November 10, 2013 at 12.13 am

I strongly agree with James: “Maybe there is a market for a service with browser encryption and auto bank feeds from Cayman Island accounts?”

However, I would go further. Internal Revenue and states now demand QuickBooks files for audits, so demands for Xero access are sure to follow soon. We also have NSA and Obamacare snooping. There are already servers in Swiss bank vaults for privacy, so there is definitely a market for such a service.

Our corporations may not have Fifth Amendment rights, but we still have this individual right. We must report international accounts on U.S. income tax returns, but need not report Bitcoin, physical gold or other accounts.

Transferring all data to a different service will present major problems. This includes not only list and Bank Rule data transfers, but outstanding bank deposits and payments, inventory detail, open payables and receivables and much more. Therefore, while I doubt any of my clients would use an offshore service, Xero should license its code to an offshore company.

I predict that this would instantly attract many international clients, especially non-U.S. clients. The Xero business – computer service model is especially suited for this.

Glennis Stuckey
June 18, 2014 at 8.18 am

Hi All,

Katalyst Office Management are focused on sustainability and the ability to be 100% paperless using Xero Files – which we call XFiles ;-). I was wondering if you can tell me the type of energy the new data centre uses – is it clean energy, or a percentage clean? For example, Apple use 100% clean energy in their data centres, whereas eBay use only 6% clean energy.

A printing friend of mine who is very ‘lean’ and green says that online is less sustainable than print due to data centre energy usage. I see, though, that a lot of work has been done on data centres to swing this around in the past few years.

I would like to blog about the energy that Xero uses, and have a win for paper vs cloud.

Any information would be very much appreciated.

Thanks so much, Glennis
