In 2017 we launched our new Companies House service (CHS) on a public cloud platform, Amazon Web Services (AWS). At the time only a small number of AWS regions were available in Europe, and we chose to host our service in eu-west-1 (Dublin). Cloud-provisioned services bring many benefits, such as economies of scale, flexibility, service protection and increased speed of delivery. As a result of cloud hosting, our search service has achieved an uptime of 100% for the last 2 years.
Following the UK’s decision to leave the European Union, we received notification that, upon our withdrawal, we might not be able to host the register data and images used for our service outside of the UK.
So, we began to plan how to move the service back to the UK. Although it was a daunting task, as CHS serves 18 million requests for data each day, the move proved to be easier than we first thought.
How did we do it?
When we built the service initially, we adopted Infrastructure as Code (IaC). This is an approach in which the service is built and provisioned entirely from source code.
Because that source code is version controlled, it guarantees repeatability: the same code can be used to build multiple identical environments. So, rather than trying to move the service gradually, we decided it would be easier to create an entire duplicate service in London and switch users to it using the magic of the Domain Name System (DNS) - the phone book of the internet.
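The repeatability idea can be sketched in a few lines. This is a minimal illustration, not the actual CHS tooling - the function and resource names are hypothetical - but it shows how keeping the region as a parameter lets the same source code build a duplicate environment in a new region.

```python
def build_environment_config(region: str, environment: str) -> dict:
    """Return the provisioning parameters for one environment.

    The region is an input, never a hard-coded value, so the same
    source code can stand up eu-west-1 and eu-west-2 identically.
    (All names here are illustrative placeholders.)
    """
    return {
        "region": region,
        "environment": environment,
        "image_bucket": f"chs-images-{environment}-{region}",
        "dns_zone": "example.service.gov.uk",  # placeholder DNS zone
    }

# The duplicate London service is the same code with a different argument:
dublin = build_environment_config("eu-west-1", "live")
london = build_environment_config("eu-west-2", "live")
```

Because both environments come from identical code, the only differences between Dublin and London are the values derived from the region parameter.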
Obstacles and challenges
Our scripts needed to change. There were hard-coded eu-west-1 values in the provisioning scripts that had to be replaced before we could run the scripts against eu-west-2.
We also needed to update our software. Some of the software versions we were running either did not recognise the eu-west-2 region or were not available there.
The machine types we had selected in Ireland were not available in London, so we had to pick new ones. Some services available in Ireland, such as AWS WAF and AWS Data Pipeline, were also not available in London.
Size of the data involved
- 40TB of image data (180 million images) to be moved
- 300 million database rows
- 300GB of data
- Over 30 S3 Buckets
These challenges were overcome using services AWS offers, such as S3 sync, cross-region replication and AWS Data Pipeline. We tested the whole approach by creating a new test site in London, which gave us the confidence that the approach we had taken was the correct one.
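To give a flavour of what cross-region replication involves, here is a sketch of the shape of configuration the S3 replication API accepts. The bucket names and role ARN are hypothetical, and this is only one of the transfer methods mentioned above - it is not presented as the exact configuration we used.

```python
def replication_config(role_arn: str, destination_bucket: str) -> dict:
    """Build a replication configuration of the shape accepted by
    the S3 PutBucketReplication API (illustrative values only)."""
    return {
        "Role": role_arn,
        "Rules": [
            {
                "ID": "replicate-to-london",
                "Status": "Enabled",
                "Priority": 1,
                "Filter": {},  # empty filter: replicate every object
                "Destination": {
                    "Bucket": f"arn:aws:s3:::{destination_bucket}",
                },
                "DeleteMarkerReplication": {"Status": "Disabled"},
            }
        ],
    }

config = replication_config(
    "arn:aws:iam::123456789012:role/chs-replication",  # hypothetical role
    "chs-images-eu-west-2",                            # hypothetical bucket
)
# With boto3, this dict would be applied to the source bucket via
# s3.put_bucket_replication(Bucket=source, ReplicationConfiguration=config)
```

Once a rule like this is enabled, new objects written to the Ireland bucket are copied to London automatically, while a one-off sync handles the existing objects.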
Implementation day and the migration
On 26 March, we moved the service. We:
- stopped Deltas (the process that updates CHS from our internal database processing) and Load Elastic Search (the process that loads the CHS search engine)
- moved the image system using DNS so customers were served images from London rather than Ireland
- extended our MongoDB cluster so the database spanned both regions - this allowed us to replicate the data to London and then shut down Ireland (once the primary node was in London, all data was served from there)
- used DNS to switch the customers to the new service
Success
This was a huge collaborative effort across the organisation, involving our front-end systems support, database administration, networks, architecture and development teams. We moved the service within 4 hours without any disruption to customers and we did not receive a single alert from our external monitoring services.
Moving and rebuilding the service also let us review the machine sizes we had built in Ireland in terms of allocated disk, CPU and memory. In many cases we were able to build smaller machines, as the existing ones were under-utilised, saving on AWS costs.
As we know we’ll be staying in London for the foreseeable future, we’re able to purchase reserved instances, which will allow us to benefit from further cost savings.
By working together in innovative ways, we’re able to ensure that our services are fit for the future and make the best use of cloud capabilities. We know the value of our services to the UK, so everyone here at Companies House is focused on delivering the best service we can to our users.
To keep in touch, sign up to email updates from this blog or follow us on Twitter.