A stubborn myth in the tech sector dictates that mastery demands absolute tunnel vision: you are either a software developer or a data scientist. Pick your lane and stay there. Grace Abayomi’s career dismantles that assumption.

Now an application engineer at Vanguard, Abayomi entered the industry by an unorthodox route. Graduating from Olabisi Onabanjo University in 2017 with a BSc in Geography, she was meant to map terrain and analyse spatial data, not architect cloud-native financial systems. Then came a single Java class.

“That class changed everything,” she recalls. “I realised technology wasn’t just a tool; it was a way to solve real problems at scale.” What followed was a deliberate pivot through full-stack software engineering, data engineering, data science, design thinking, and product management. We sat down with Abayomi to understand how she built a truly multidisciplinary profile in the UK tech industry and why it is only now catching up to her approach.

Q: Your degree is in geography, not computer science. How have you weaponised that background?

Abayomi: Geography is fundamentally a systems discipline.

You are constantly modelling how physical and human variables interact at scale. When I assessed factory risks in residential areas or mapped flood zones using spatial data, I was already doing data analysis and engineering: collecting datasets, cleaning them, building predictions, and communicating risks. I just lacked the vocabulary.

When I encountered Java, everything clicked because I understood constraint modelling and systems-level abstraction. Geography also gave me an obsession with the end-user. A technically correct but unreadable map is useless.

That same thinking drives my API design today.

Q: What does engineering a live payment gateway at scale demand?

Abayomi: Payment infrastructure is unforgiving.

There is no margin for latency spikes when millions of real-time transactions pass through your system. At SeerBit, I worked on the checkout and gateway layer. The technical demands were intense: idempotent API design for handling duplicate requests; thread-safe transaction processing in Java and Spring Boot; and rigorous end-to-end testing, as a defect here would result in a failed payment for a real business.
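The idempotency requirement she describes can be sketched in a few lines. This is a minimal Python illustration of the pattern (not SeerBit’s actual Java/Spring Boot implementation): an idempotency key turns a retried request into a safe replay of the stored result rather than a second charge.

```python
import threading

class IdempotentProcessor:
    """Toy sketch of idempotent payment processing. A retry carrying the
    same idempotency key gets the cached result instead of a new charge.
    All names here are illustrative, not any real gateway's API."""

    def __init__(self):
        self._results = {}             # idempotency_key -> stored result
        self._lock = threading.Lock()  # make check-then-act thread-safe

    def process(self, idempotency_key, amount):
        with self._lock:
            if idempotency_key in self._results:
                # Duplicate request: replay the original outcome.
                return self._results[idempotency_key]
            result = {"status": "charged", "amount": amount}
            self._results[idempotency_key] = result
            return result

p = IdempotentProcessor()
first = p.process("txn-123", 50)
duplicate = p.process("txn-123", 50)  # client retried; same key
assert first is duplicate             # charged exactly once
```

In a production gateway the key-to-result store would be a durable database with expiry, not an in-memory dict, but the check-then-act-under-a-lock shape is the same.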

This taught me that reliability is an architectural property, not a testing afterthought. I began designing for failure modes first, a mental model I still use today.

Q: How did building blockchain-based banking tools at Appzone Group change your approach to security?

Abayomi: In banking, blockchain is about immutable audit trails and distributed trust, not cryptocurrency. African financial infrastructure struggles with interoperability among multiple institutions, legacy systems, and fragmented regulations. Blockchain lets us create shared ledgers where no single institution can unilaterally alter a record.
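The "no single institution can unilaterally alter a record" property comes from hash chaining: each entry’s hash covers the previous entry’s hash. A toy sketch using Python’s `hashlib` (not any Appzone system) shows how editing one historical record invalidates every later link:

```python
import hashlib
import json

def append_entry(chain, record):
    """Append a record whose hash covers the previous entry's hash,
    so any retroactive edit breaks every subsequent link."""
    prev_hash = chain[-1]["hash"] if chain else "0" * 64
    payload = json.dumps({"record": record, "prev": prev_hash}, sort_keys=True)
    chain.append({
        "record": record,
        "prev": prev_hash,
        "hash": hashlib.sha256(payload.encode()).hexdigest(),
    })
    return chain

def verify(chain):
    """Re-derive every hash from genesis; any tampering surfaces as a mismatch."""
    prev = "0" * 64
    for entry in chain:
        payload = json.dumps({"record": entry["record"], "prev": prev}, sort_keys=True)
        if entry["prev"] != prev or entry["hash"] != hashlib.sha256(payload.encode()).hexdigest():
            return False
        prev = entry["hash"]
    return True

ledger = []
append_entry(ledger, {"from": "A", "to": "B", "amount": 100})
append_entry(ledger, {"from": "B", "to": "C", "amount": 40})
assert verify(ledger)
ledger[0]["record"]["amount"] = 999  # tamper with history
assert not verify(ledger)            # the whole chain now fails verification
```

A real distributed ledger adds consensus among institutions on which chain is authoritative; the hash chain alone only makes tampering detectable, not impossible.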

Architecturally, it forced me to design consensus upfront rather than bolting on security later. It also deepened my appreciation for cryptographic principles like hash integrity and public-key infrastructure, which directly informed my later privacy-preserving data work at Vanguard.

Q: How did your software engineering background influence your MSc in data science?

Abayomi: I entered the programme already strong in system design and production engineering, so I could focus deliberately on building my statistical and ML toolkits. That foundation let me go further than a pure statistician might. When building ML pipelines, I didn’t just train models; I designed them to be production-ready from day one, baking in modularisation, test coverage, and infrastructure considerations.

My dissertation on NLP and semantic search directly fed into my tools at Vanguard, making the MSc a deliberate knowledge investment rather than an academic detour.

Q: Your synthetic data work at Vanguard is a frontier area. What did you build and why does it matter?

Abayomi: To test financial software, you need realistic data: account balances and transaction histories. But using real customer data triggers GDPR compliance issues. The traditional workaround, anonymisation, destroys statistical properties and breaks test scenarios. Using tools like the Synthetic Data Vault (SDV), I built pipelines to generate entirely artificial datasets that preserve real-world distributions and edge-case frequencies without exposing actual records.
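The underlying principle can be illustrated with a deliberately crude sketch: fit each column’s distribution from real rows, then sample fresh rows from the fitted model. (SDV itself uses far richer models, such as copulas that capture dependencies between columns; the independent per-column Gaussian below is an assumption purely for illustration.)

```python
import random
import statistics

def fit_column(values):
    """Learn a column's marginal distribution. Here that is just a
    mean/stdev pair, a much cruder model than SDV's, for illustration."""
    return statistics.mean(values), statistics.stdev(values)

def synthesize(real_rows, n, seed=0):
    """Generate n artificial rows that mimic each column's distribution
    without copying any real record."""
    rng = random.Random(seed)
    cols = {k: fit_column([row[k] for row in real_rows]) for k in real_rows[0]}
    return [{k: rng.gauss(mu, sd) for k, (mu, sd) in cols.items()}
            for _ in range(n)]

real = [{"balance": b, "txn_count": t}
        for b, t in [(120.0, 4), (90.0, 7), (200.0, 2), (150.0, 5)]]
fake = synthesize(real, 1000)
# No synthetic row is a real customer, but aggregates stay close,
# so tests exercised against `fake` behave like tests against `real`.
```

The commercial point is exactly this trade: the generated rows are statistically useful for testing while containing no actual customer record.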

This capability is commercially critical. It empowers teams to test aggressively, move faster, and maintain regulatory compliance, preventing systemic failures.

Q: Before Vanguard, you applied NLP to law enforcement intelligence at Crimestoppers. What were the technical demands?

Abayomi: Crimestoppers receives massive volumes of unstructured text. I built an NLP pipeline for entity extraction, semantic search, and topic clustering so analysts could surface patterns without manual reading.
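A toy version of the entity-extraction step shows how pulling structured entities out of free text links otherwise unconnected tips. The regex patterns here are hypothetical stand-ins; a production pipeline would use a trained NER model rather than hand-written rules.

```python
import re
from collections import defaultdict

# Hypothetical patterns, purely for illustration.
PATTERNS = {
    "vehicle_reg": r"\b[A-Z]{2}\d{2} [A-Z]{3}\b",        # UK plate, e.g. "AB12 CDE"
    "postcode": r"\b[A-Z]{1,2}\d[A-Z\d]? \d[A-Z]{2}\b",  # e.g. "SW1A 1AA"
}

def extract_entities(reports):
    """Extract entities from free-text tips and group the report indices
    that mention each one, so analysts see cross-report links directly."""
    index = defaultdict(set)
    for i, text in enumerate(reports):
        for label, pattern in PATTERNS.items():
            for match in re.findall(pattern, text):
                index[(label, match)].add(i)
    return index

tips = [
    "Saw a van, plate AB12 CDE, near SW1A 1AA last night.",
    "The vehicle AB12 CDE was parked there again today.",
]
links = extract_entities(tips)
assert links[("vehicle_reg", "AB12 CDE")] == {0, 1}  # same vehicle links both tips
```

The grouping step is what makes the pipeline useful: two tips that no analyst would read back-to-back are surfaced together because they share an entity.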

The domain-specific challenge was prioritising precision over recall. In a co