We have crafted solutions for industry leaders, combining years of expertise with agility and innovation.
Data is gold, but only when mined and managed properly. With this in mind, a Top 4 Bank’s Data Engineering team foresaw the benefits of big data and real-time streaming very early on. However, its on-premises infrastructure was not catering for its data needs and was causing a series of problems. The bank wanted to migrate all its data to AWS, and swiftly. It would need to migrate both live and archived data to the cloud en masse, with automated deployments to ensure high availability. The bank needed to find a way to migrate this data without customer disruption and with a future-focused architecture.
The Bank’s strategic intention to be cloud-native and AWS-first meant it had to migrate its data platform to AWS quickly and seamlessly. The Bank’s Data Engineering team runs two large platforms: an on-premises Hadoop platform and an on-premises Kafka platform. The Bank needed to migrate and archive petabytes of live data into the cloud in a structured way, without compromising data integrity or quality. It needed to determine how to build a Hadoop platform in the cloud with two strategic intentions: first, building for the future to gain valuable data insights, and second, ensuring minimal disruption to its end users.
The Bank had a team of experts in Hadoop and Kafka, but not in migrating, architecting and managing these platforms on AWS Cloud. The Bank had previously partnered with Synthesis, which had already created some of its data patterns through a Technology Risk and Security Data Platform and had proven cloud expertise. The bank decided to partner with the software development company once again to begin its data-driven cloud journey.
Synthesis is now working with the Bank to provide both cloud administration expertise and data expertise. Synthesis has set up a UAT environment and is ready to deploy to production with automated CI/CD pipelines and Infrastructure as Code using Terraform. Next, new workloads will move into a strategic AWS Lakehouse architecture.

Synthesis is working to create a data mesh-based architecture, which decouples data management from a central team, making data more accessible, scalable and discoverable in-house with greater throughput, while adhering to the bank’s strict privacy and security regulations. Synthesis is creating a visualisation layer which orchestrates the data so the Bank’s data analysts only have access to relevant and correct data. This also creates agility, as the team has easy and fast access to the correct curated data products. The secret sauce is validating data quality and schema at source, solving the age-old problem of incomplete and inaccurate data. A data lake allows all the data to be centralised, with a central team managing reliability, governance and security while decentralised teams manage their own data and access. A distributed model allows the data to be quickly and easily shared, ensuring the Bank receives data-driven value at scale in a federated way.
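To illustrate the idea of validating data quality and schema at source, here is a minimal Python sketch. The field names, types and quality rules are illustrative assumptions for a banking-style record, not the Bank’s actual schema or tooling; a production pipeline would typically enforce equivalent checks with a schema registry or a data quality framework before records ever reach the lake.

```python
# Minimal sketch: reject records at the source if they break the agreed
# schema or a quality rule, so only complete, accurate data enters the lake.
# All field names and rules below are hypothetical examples.

EXPECTED_SCHEMA = {
    "transaction_id": str,
    "account_id": str,
    "amount_cents": int,
    "currency": str,
}

def validate_record(record: dict) -> list[str]:
    """Return a list of violations; an empty list means the record passes."""
    errors = []
    for field, expected_type in EXPECTED_SCHEMA.items():
        if field not in record:
            errors.append(f"missing field: {field}")
        elif not isinstance(record[field], expected_type):
            errors.append(
                f"wrong type for {field}: {type(record[field]).__name__}"
            )
    # Example quality rule: currency must be a 3-letter ISO code.
    currency = record.get("currency")
    if isinstance(currency, str) and len(currency) != 3:
        errors.append("currency must be a 3-letter ISO code")
    return errors

good = {"transaction_id": "t1", "account_id": "a1",
        "amount_cents": 250, "currency": "ZAR"}
bad = {"transaction_id": "t2", "amount_cents": "250", "currency": "R"}

print(validate_record(good))  # passes: no violations
print(validate_record(bad))   # fails: missing field, wrong type, bad code
```

Because every producing team applies the same contract before publishing, downstream data products in the mesh can trust what they consume without re-validating centrally.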
When the migration is complete, the Bank will have successfully migrated its two platforms with automated deployments, an ability to scale, bank-grade security and, most importantly, an ability to alchemise its data to benefit its customers.