RateCity is one of Australia’s leading online financial comparison services. Partnered with NineMSN and drawing on a constant data feed from providers such as Canstar Cannex, they offer a one-stop shop for anyone seeking financial product recommendations. Through their website, comparisons can be made not only across major credit card, loan and insurance vendors, but also across the offerings of most major savings and investment brokers in the industry.
Through this data feed, RateCity receives periodic updates from more than 250 financial institutions. This allows over 13,000 products to be evaluated and presented in an easy-to-consume format on their website.
When RateCity contacted us in early 2010, they were interested in having Anchor take on responsibility for managing their hosting infrastructure. The specifics of the brief included:
The existing website was running on infrastructure owned and operated by Canstar Cannex.
This infrastructure needed to be separated out onto RateCity’s own machines, which would become entirely independent.
The RateCity technical team was made up purely of development personnel – they did not want to have to worry about running infrastructure themselves.
All internet-facing services needed to be reliable. Deploying redundant, highly available infrastructure was mandatory.
The hosting company needed to accept responsibility for managing the entire stack, which included Ruby on Rails frontends and a relational database backend, as well as a suitable development environment.
The Anchor Solution
Like many of the projects that Anchor works on, designing this solution involved a number of very specific considerations. On this occasion, the primary goal was to successfully extract a portion of an existing web stack and redeploy it on a stand-alone basis in an entirely different location. Throughout this process, Anchor held a number of meetings with the RateCity development team to tailor a solution that would meet their requirements.
The final infrastructure relied heavily on virtualisation, running a number of virtual machines as frontend web servers. The software stack comprised the Nginx and Unicorn web servers, serving Ruby on Rails applications. In front of the web servers sat a pair of load balancers in a high-availability configuration, distributing requests across the frontend infrastructure. For the backend, two PostgreSQL database servers were configured in an active/passive arrangement to ensure there was no single point of failure. In addition, two search servers running Solr did the “heavy lifting” of processing the data feed supplied by Canstar Cannex.
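To illustrate the Nginx-plus-Unicorn pairing on each frontend, a typical arrangement has Nginx serve static assets directly and proxy dynamic requests to a local Unicorn master over a Unix socket. The following is a minimal sketch only; the socket path, server name and directory layout shown are hypothetical placeholders, not RateCity’s actual configuration:

```nginx
# Hypothetical frontend configuration: Nginx proxies Rails requests
# to a local Unicorn master listening on a Unix socket.
upstream rails_app {
    # All Unicorn workers accept on this shared socket.
    # fail_timeout=0 retries a briefly-busy worker instead of
    # marking the backend as down.
    server unix:/var/run/unicorn/app.sock fail_timeout=0;
}

server {
    listen 80;
    server_name frontend.example.internal;  # placeholder name

    # Static assets from the Rails public/ directory.
    root /srv/app/current/public;

    location / {
        # Serve files directly if they exist; otherwise hand the
        # request to the Rails application.
        try_files $uri @rails;
    }

    location @rails {
        proxy_set_header Host $http_host;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
        proxy_pass http://rails_app;
    }
}
```

Using a Unix socket rather than a TCP port keeps the Nginx-to-Unicorn hop local to each virtual machine, while the load balancers in front handle distribution between the frontends themselves.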
Deploying the infrastructure in this fashion addressed all of the concerns in the original brief, making it robust enough to handle existing traffic loads and scalable enough to meet future traffic demands.