Morné Fourie - AWS Certified Solutions Architect | .NET Technical Lead

Hello, I'm Morné Fourie

AWS Solutions Architect | .NET Technical Lead | Cloud Infrastructure Expert | DevOps & SRE Advocate | Platform Architect | Full-Stack Engineer | Passionate Mentor

 Located in Bellville, South Africa


I'm an AWS Certified Solutions Architect and .NET Technical Lead based in Bellville, South Africa, currently working as a Lead Product Engineer at ABSA Corporate & Investment Banking since 2018. I specialize in designing and developing scalable cloud infrastructure and application platforms, with deep expertise in AWS services (EKS, EC2, RDS, Lambda, S3), Kubernetes, Docker, Terraform, and Infrastructure-as-Code. My technical stack centers on .NET Core/ASP.NET, microservices architecture, PostgreSQL databases, event-driven systems using Kafka, and CI/CD automation with Azure DevOps. I have extensive experience in zero-trust security architectures, OAuth2/OpenID Connect, observability systems, and data analytics using Apache Iceberg and S3-compatible data lakes. With certifications including AWS Solutions Architect Associate and Kanban University KMP1 & KMP2, I combine technical leadership with mentoring capabilities, having previously worked on projects ranging from peer-to-peer lending platforms to billing engines and database migrations across financial services, telematics, and education sectors. My career spans from early work at Curro Holdings (2010-2014) through DVT consulting engagements to my current role architecting identity, permissions, and platform services for corporate investment banking.

AWS Solutions Architect

I have in-depth production experience in the following AWS patterns:

Automated infrastructure provisioning

Secure ingress patterns (mTLS/WAF)

Migrating workloads to AWS

Migrating databases to AWS

Serverless architecture

Containerized workloads

Custom AMIs for EC2 instances

AutoScaling and Load Balancing

IAM Roles Anywhere

Service Catalog Engine for Terraform

.NET Technical Lead

As a Tech Lead, I ensure that .NET developers push quality code to Production

ASP.NET Core

Project Templates

Blazor / Angular

Identity Server, OAuth2, OpenID Connect, JWT

Resilience / Health Checks

Entity Framework, Dapper, Marten

Dependency Injection / configuration

Serilog / Fluentd / Kibana

OpenTelemetry

xUnit, NUnit, MS Test

Active Directory, LDAP Integration


Micro-services

Docker, Kubernetes, Helm

EKS / Rancher

Service Fabric


DevOps

CI/CD & GitOps

Azure Pipelines

GitHub Actions


Front-end Development

Modern Angular / TypeScript

Project Templates / Code Generators

Tailwind CSS / Material Design

End to End test automation


Database Development

Deep experience in SQL Server and PostgreSQL

Database design

Database source control

MongoDB high throughput

DR/HA configuration

Redis Caching Strategies

Database indexing and performance tuning

S3 Data Lake architecture


Event Driven Architecture

Apache Kafka / RabbitMQ


I also have Java skills

Handy when .NET and Java teams integrate on larger projects.

Spring Boot / MVC / IOC

SOAP / REST Web Services / RetroFit

JPA / Hibernate

JUnit

Log4J

LDAP / Active Directory Integration

Kafka Streams



Professional Profile

 I started programming at the age of 14, and received a national award in the same year.

 

Today I am a customer-focused Technical Lead and Solutions Architect at Absa CIB in Cape Town. I have a passion for scalable, maintainable and well-architected solutions that deliver business value early on. The detail is important to me.

 

 The developer productivity of my team is very important to me. For any given project, I lay down the rails so that they can follow the path to success in a natural and intuitive way. These often include project templates and code generators. Who wants to write front-end DTOs by hand if a code generator on the back-end could do it in a few seconds? And who starts from scratch if a project template could generate a starter kit, including the back-end, front-end and CI/CD pipeline yaml?
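
To illustrate the kind of productivity tooling I mean, here is a minimal, hypothetical sketch of a back-end code generator that emits TypeScript interfaces from C# DTOs via reflection. The CustomerDto type and the type mappings are illustrative placeholders, not an actual template from my projects.

```csharp
// Hypothetical sketch: emit TypeScript interfaces from back-end C# DTOs via reflection.
using System;
using System.Reflection;
using System.Text;

public record CustomerDto(Guid Id, string Name, decimal Balance, DateTime CreatedAt);

public static class TsDtoGenerator
{
    // Map a handful of common CLR types to their TypeScript equivalents.
    private static string MapType(Type t) =>
        t == typeof(string) || t == typeof(Guid) || t == typeof(DateTime) ? "string" :
        t == typeof(int) || t == typeof(long) || t == typeof(decimal) || t == typeof(double) ? "number" :
        t == typeof(bool) ? "boolean" : "unknown";

    // Walk the DTO's public properties and write a camelCased TypeScript interface.
    public static string Emit(Type dto)
    {
        var sb = new StringBuilder();
        sb.AppendLine($"export interface {dto.Name} {{");
        foreach (var p in dto.GetProperties(BindingFlags.Public | BindingFlags.Instance))
            sb.AppendLine($"  {char.ToLowerInvariant(p.Name[0])}{p.Name[1..]}: {MapType(p.PropertyType)};");
        sb.AppendLine("}");
        return sb.ToString();
    }

    public static void Main() => Console.WriteLine(Emit(typeof(CustomerDto)));
}
```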

 

As a full-stack software engineer, I specialise in modern tech stacks. My efforts are focused on .NET and Angular LTS versions. I sometimes use Blazor Server and WASM when a project has a shorter deadline, since the developer productivity gains are substantial for certain projects. I believe in evergreen architectures and prefer to upgrade early in the release cycle. I have in-depth production experience in micro-service architectures on Kubernetes, both on-premise and in the Cloud. I love Kubernetes because it's one of the most portable technologies on the planet; migrating Kubernetes applications between on-premise and Cloud data centers is usually painless. I have extensive production experience in Docker, Kubernetes, Helm, Rancher and AWS EKS. With 8 years of experience in corporate bank-level security, I excel in zero-trust architectures, end-to-end encryption and cyber-security tooling for static and dynamic analysis of code bases at a macro level. I prefer SonarQube, Trivy and AquaSec where suitable.

 

My solutions are secure by default, with multiple layers of security for defense in depth. I work with OAuth2 and OpenID Connect on a daily basis and ensure that all API endpoints are secure, whether they're exposed to the public internet or running in an isolated environment. Attackers find it hard to move laterally in such a zero-trust environment. Observability is key. That's why OpenTelemetry and Serilog are always part of the project template. I have a solid understanding of the .NET ecosystem, such as dependency injection and the configuration system. I enjoy authoring internal NuGet libraries that help tenant teams consume our platform services so they can stay productive and focus on their business features. xUnit is my framework of choice for unit testing on the back-end. I constantly develop and update guidelines and patterns to improve test coverage. In a micro-service there are added benefits to testing the full API surface area and reducing brittle mock-based testing. These techniques have led to some of the most stable micro-services in the bank. My focus is primarily on the latest versions of .NET Core, and I am always up to date with the latest trends and best practices.
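
As a rough illustration of that secure-by-default template, the sketch below wires up JWT bearer authentication, Serilog and OpenTelemetry in a minimal ASP.NET Core Program.cs. It assumes the Serilog.AspNetCore, Microsoft.AspNetCore.Authentication.JwtBearer, OpenTelemetry.Extensions.Hosting, OpenTelemetry.Instrumentation.AspNetCore and OpenTelemetry.Exporter.OpenTelemetryProtocol packages; the authority URL and endpoint are hypothetical.

```csharp
using Microsoft.AspNetCore.Authentication.JwtBearer;
using OpenTelemetry;
using OpenTelemetry.Trace;
using Serilog;

var builder = WebApplication.CreateBuilder(args);

// Structured logging with Serilog, driven by appsettings configuration.
builder.Host.UseSerilog((ctx, cfg) => cfg.ReadFrom.Configuration(ctx.Configuration));

// Every API endpoint requires a valid token issued by the identity provider.
builder.Services
    .AddAuthentication(JwtBearerDefaults.AuthenticationScheme)
    .AddJwtBearer(o =>
    {
        o.Authority = "https://identity.example.internal"; // hypothetical OIDC authority
        o.Audience = "platform-api";
    });
builder.Services.AddAuthorization();

// Traces exported over OTLP to the observability stack.
builder.Services.AddOpenTelemetry()
    .WithTracing(t => t.AddAspNetCoreInstrumentation().AddOtlpExporter());

var app = builder.Build();

app.UseAuthentication();
app.UseAuthorization();

// Even a trivial endpoint is locked down by default.
app.MapGet("/api/ping", () => Results.Ok("pong")).RequireAuthorization();

app.Run();
```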

 

In a microservice environment, Event Driven Architecture (EDA) decouples the services from one another. Some processes don't require synchronous communication and can be handled later in an asynchronous manner. This is where I apply my RabbitMQ and Kafka experience. Since Kafka is a highly scalable and reliable platform, my focus has shifted there over the last 5 years. My Kafka producers and consumers deliver high-throughput messages on critical data pipelines. I sometimes cross over to Java when I need to develop bespoke Kafka Streams applications to join data across existing topics. I'm also learning how to build massively scalable data pipelines with Apache Flink and Flink SQL.
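
For a flavour of what such a producer looks like in .NET, here is a minimal Confluent.Kafka sketch; the broker address, topic name and payload are hypothetical placeholders rather than an actual pipeline.

```csharp
using System;
using Confluent.Kafka;

var config = new ProducerConfig
{
    BootstrapServers = "broker-1:9092", // hypothetical broker
    Acks = Acks.All,                    // wait for all in-sync replicas
    EnableIdempotence = true            // avoid duplicate writes on retries
};

using var producer = new ProducerBuilder<string, string>(config).Build();

// Keyed messages keep all events for one account on the same partition, preserving order.
var result = await producer.ProduceAsync(
    "payments.settlement-events",
    new Message<string, string>
    {
        Key = "account-42",
        Value = "{\"amount\": 100.00, \"currency\": \"ZAR\"}"
    });

Console.WriteLine($"Delivered to {result.TopicPartitionOffset}");
```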

 

For operational data stores I standardise on PostgreSQL database technologies where possible. I find its extensible ecosystem very useful, and the PostgreSQL community is at the forefront in many areas of the industry. I use Entity Framework and Dapper for relational data and native JSONB or Marten for NoSQL use cases. Database schemas and even test data are checked into source control and deployed to the target environments, reducing click-ops and scaling productivity. I also have production experience with Microsoft SQL Server, MongoDB and Redis caches, making performance improvements for different types of workloads through indexing and cluster-specific parameter tuning. Each workload is different, and performance tuning is part of delivering the solution to our customers. I often fulfill the role of DBA for our team, knowing that there are other specialists I can reach out to when needed.
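
A minimal sketch of the Dapper-over-PostgreSQL approach is shown below, assuming the Dapper and Npgsql packages; the connection string, table and columns are hypothetical.

```csharp
using System;
using System.Threading.Tasks;
using Dapper;
using Npgsql;

public record Tenant(Guid Id, string Name, DateTime CreatedAt);

public static class TenantQueries
{
    public static async Task Main()
    {
        // Hypothetical connection string; real secrets come from configuration.
        await using var conn = new NpgsqlConnection(
            "Host=localhost;Database=platform;Username=app;Password=example");

        // Column aliases line the result columns up with the record's constructor parameters.
        var recent = await conn.QueryAsync<Tenant>(
            @"SELECT id, name, created_at AS CreatedAt
              FROM tenants
              WHERE created_at > @Since
              ORDER BY created_at DESC",
            new { Since = DateTime.UtcNow.AddDays(-7) });

        foreach (var t in recent)
            Console.WriteLine($"{t.Id} {t.Name} {t.CreatedAt:O}");
    }
}
```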

 

For data analytics I prefer to standardise on the Iceberg format in an S3-compatible data lake. When developing data analytics solutions in AWS, the existing S3, Glue and Athena services are the primary building blocks. When regulatory requirements demand an on-premise solution, the same is achieved with S3-compatible storage such as Dell ECS, the Apache Hive Metastore and Starburst Trino. I have extensive experience in delivering high-value data pipelines, from raw sources through standardised layers to curated data assets, whether in real time or through batch processing. I dabble in Apache Spark and Scala where needed, but prefer ANSI SQL transformations where possible.
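
As a small illustration, the sketch below kicks off an Athena query from .NET using the AWSSDK.Athena package; the database, query and S3 output location are hypothetical placeholders.

```csharp
using System;
using Amazon.Athena;
using Amazon.Athena.Model;

var athena = new AmazonAthenaClient();

// Start an asynchronous Athena query; results land in the configured S3 location.
var start = await athena.StartQueryExecutionAsync(new StartQueryExecutionRequest
{
    QueryString = "SELECT trade_date, SUM(notional) FROM curated.trades GROUP BY trade_date",
    QueryExecutionContext = new QueryExecutionContext { Database = "curated" },
    ResultConfiguration = new ResultConfiguration
    {
        OutputLocation = "s3://example-athena-results/queries/" // hypothetical bucket
    }
});

Console.WriteLine($"Started query {start.QueryExecutionId}");
```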

 

As a well-rounded solutions architect I believe in the principle of micro-service ownership: a team should own their application from concept all the way to production. That is why I spend a lot of time building out Site Reliability Engineering (SRE) principles in my teams. Infrastructure-as-Code (IaC) and CI/CD automation are key enablers that reduce click-ops and manual toil in software engineering teams. I love reliable and repeatable outcomes. That's why I invest time in setting up development, staging and production environments with IaC technologies such as Terraform. I train my top engineers to contribute to the IaC code-base by writing reusable modules that are easily configured for any environment. A solid understanding of Terraform state is part of the training. Not only do we build out our AWS infrastructure with Terraform, but we also use it to build out our Cisco ThousandEyes observability dashboards.

 

My SRE mindset has enabled me to build up extensive production experience in designing and developing observability and alerting systems for our application platform. My dashboards highlight critical and even preventative health checks for applications, databases, infrastructure and network throughput, and give advance notice of TLS certificate expiry. They also continuously monitor advanced scenarios like WAF and mTLS traffic. I always use the tool that's best for the use case, whether it's Cisco ThousandEyes, IBM Instana, AWS CloudWatch Alarms, Grafana or a custom .NET solution. Sometimes a combination of these toolsets delivers the best coverage.
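
A stripped-down example of the application-level piece is shown below: ASP.NET Core health check endpoints that a monitoring stack can scrape, assuming the AspNetCore.HealthChecks.NpgSql and AspNetCore.HealthChecks.Redis packages; the connection string names are hypothetical.

```csharp
var builder = WebApplication.CreateBuilder(args);

// Register dependency health checks for the database and cache.
builder.Services.AddHealthChecks()
    .AddNpgSql(builder.Configuration.GetConnectionString("Platform")!, name: "postgres")
    .AddRedis(builder.Configuration.GetConnectionString("Cache")!, name: "redis");

var app = builder.Build();

// Separate liveness and readiness endpoints for the monitoring and alerting stack.
app.MapHealthChecks("/healthz/live");
app.MapHealthChecks("/healthz/ready");

app.Run();
```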

 

 I particularly enjoy the principle of continuous improvement. Our initial CI/CD processes were designed to migrate us from a server architecture to a micro-service architecture. I designed our Azure DevOps build pipelines to form the rails for other teams in the organisation to seamlessly and reliably deploy their applications to shared environments. Over the years my team and I have built shared steps and standardised deployment pipelines for others to use. This year alone we rolled out hundreds of changes to production with minimal downtime. We also developed custom build agents with tooling pre-installed to accelerate the build and deployment processes. Our CI/CD pipelines also include automated DB schema migrations and indexing from source control, as well as static security scanning with the AquaSec tooling. Docker and AWS AMI patching happens independently of application code changes. Our teams often patch a given image at a moment's notice with zero downtime or customer impact. For applications running on AWS EC2 instances, we use AutoScaling Groups and AWS CodeDeploy to seamlessly upgrade the OS or the applications on the instances. Our developers don't have to know about all the moving parts. They simply run the release process in Azure DevOps.

 

I am an AWS Certified Solutions Architect who enjoys delivering scalable and maintainable solutions for our customers. Financial institutions demand a higher degree of security. Over the years I have partnered with security and Cloud architects in the bank to implement secure ingress patterns into our application environments, using AWS services like CloudFront, AWS WAF and AWS Shield. Furthermore, we enabled our high-value customers to add an additional layer of security via mTLS client certificates. Other AWS services I use on a daily basis include Route53, IAM roles and permissions, IAM Roles Anywhere (for on-premise workloads), RDS PostgreSQL, ElastiCache Redis / Valkey, EC2 Amazon Linux 2023, EKS, ECR, SSM Parameter Store and Secrets Manager. I also experiment with serverless patterns such as Lambda, Fargate, API Gateway, S3, DynamoDB, SQS and RDS Postgres Serverless when a valid use case surfaces.
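
As a small example of day-to-day AWS SDK usage from .NET, the sketch below reads a SecureString value from SSM Parameter Store using the AWSSDK.SimpleSystemsManagement package; the parameter path is hypothetical.

```csharp
using System;
using Amazon.SimpleSystemsManagement;
using Amazon.SimpleSystemsManagement.Model;

// Credentials and region come from the instance profile or environment.
var ssm = new AmazonSimpleSystemsManagementClient();

var response = await ssm.GetParameterAsync(new GetParameterRequest
{
    Name = "/platform/db/connection-string", // hypothetical parameter path
    WithDecryption = true                    // SecureString values are decrypted via KMS
});

Console.WriteLine(response.Parameter.Value);
```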

 

I believe in mentoring and continuous training. For this reason I set up an AWS Sandbox account where developers can freely experiment with new technologies, including AWS Bedrock LLMs. After manually testing out a pattern in the Sandbox account, I encourage and train them to import the existing AWS resources into Terraform to accelerate an IaC-driven infrastructure pipeline. My Terraform skills also enable me to publish modular, reusable, pre-approved AWS Service Catalog products that other teams in the organisation can use for their scenarios.

 

I'm very comfortable with bash when I develop Amazon Linux bootstrap scripts. Similarly, I use PowerShell when bootstrapping Windows EC2 instances. My solutions include custom instances that process ingress traffic and terminate mTLS for advanced security use cases. I'm well versed in ModSecurity WAF solutions for both Nginx and Apache web servers, and the Core Rule Sets are often customised for bespoke business scenarios.

 

In my role as Technical Lead I often gravitate to the realm of Platform Architecture, where micro-services are categorised into tiers of criticality to reliably deliver a set of business capabilities, even when there's a degree of service degradation on the platform. Such platforms often contain identity providers, permission systems, internationalization, notifications, workflows, audit trails and more. I have extensive experience in all of these business domains and excel at evolving the code-bases in an isolated manner to reduce the impact on consuming services.

 

As a Technical Lead I'm comfortable leading a team of strong software engineers. I also enjoy mentoring and upskilling junior and intermediate developers, and I really enjoy the career development aspect when I shape a team. I'm actively involved in the recruitment process, interviewing .NET developers, Angular developers and DevOps engineers. Leading a team in the office or remotely makes no difference to me. I have learned to manage technical teams of diverse nationalities, genders, beliefs and orientations. I enjoy forming human connections with my team members, and encourage them to bond as a team. Beware! I am an elusive office prankster and I love to make things fun for my team members. I use Sprint Retros as an opportunity to remove any obstacles from the team and to forge unity of vision for the next project.

 

 As a certified Kanban practitioner I periodically analyse the flow of work items to see how we can improve as a team during the next iteration. I prefer the Nave analytics tool for this.

 

 I charge a little more because I invest a lot of time in sharpening my skills and staying up to date with the latest technologies. As a generalist, I have a wide view of the technology landscape, and as a specialist, I do a deep dive into specific areas where I can add the most value. I make sure that I'm ahead of the curve, so that I can guide my team and organisation in the right direction. Not everyone can do this. As the saying goes...

“If you think hiring a professional is expensive, wait ‘til you see what an amateur costs you”


What I do

My Skills and Experience.

AWS Cloud Architecture

I specialise in AWS cloud architecture, with a strong focus on scalable and resilient solutions.

  • AWS Well-Architected Framework
  • AWS Serverless architectures
  • AWS IaC / Terraform
  • AWS RDS / Aurora
  • AWS API Gateway
  • AWS S3 / Glue / Athena
  • AWS ECS / EKS
  • AWS Control Tower
  • AWS IAM / Verified Permissions
  • AWS Networking

.NET Development

I specialise in ASP.NET Core web development, with a strong focus on Web APIs and MVC frameworks.

  • ASP.NET Core
  • Project Templates / Code Generators
  • Blazor / Angular
  • Identity Server, OAuth2, OpenID Connect, JWT
  • Resilience / Health Checks
  • Entity Framework, Dapper, Marten
  • Kafka / RabbitMQ / Redis
  • Dependency Injection / Configuration
  • Serilog / Fluentd / Kibana
  • OpenTelemetry
  • xUnit, NUnit, MS Test
  • Active Directory, LDAP Integration

Micro-services

I help teams on their containerisation journey:

  • Deep experience in Docker, Kubernetes, Helm
  • EKS / Rancher production experience
  • Service Fabric experience

Front-end Development

On front-end projects, I spend my time with:

  • Modern Angular (17+)
  • TypeScript
  • Project Templates / Code Generators
  • Tailwind CSS / Material Design
  • Automated End to End testing

Database Development & Analytics

When database requirements go beyond CRUD and ORMs, I focus on:

  • SQL Server & PostgreSQL
  • MongoDB & Redis
  • Repository patterns
  • Database design
  • Database source control
  • Views and Stored Procedures
  • Indexing and Performance Tuning
  • SQL Server Data Replication
  • High Availability
  • Disaster Recovery
  • Kafka Connectors
  • S3 Data Lake architecture

Java Development

My Java skills come in handy when .NET and Java teams integrate on larger projects. I have experience in:

  • Spring Boot / MVC / IOC
  • SOAP / REST Web Services
  • JPA / Hibernate
  • JUnit
  • Log4J
  • LDAP / Active Directory Integration
  • Kafka Streams

Integration with Financial Systems

I have extensive experience in the financial sector. My billing engine (2011-2016) passed Deloitte audits every year. I have a deep understanding of:

  • Accounting Systems
  • Corporate Investment Banking platforms
  • Peer to peer lending platforms
  • Credit Bureau APIs
  • Payment Gateways

I'm a Solution Architect and .NET Tech Lead at ABSA Corporate and Investment Banking in Cape Town, South Africa, where I work with an amazing team to design, develop and evolve an application platform that addresses cross-cutting concerns such as identity, permissions, auditing, internationalization, notifications and more. Our platform is ISO 27001 certified and is used by multiple business domains to deliver value and drive innovation for our customers.

Curriculum Vitae

My education and experience.

Education

Self-study

I keep up to date with the latest technologies by following trending open source repositories, podcasts, blogs and Social Media feeds. I also watch training videos and read books on the latest development topics.

Self

2005 - Current

AWS Certified Solutions Architect

The AWS CSA certification showcases knowledge and skills in AWS technology, across a wide range of AWS services. The focus of this certification is on the design of cost and performance optimized solutions, demonstrating a strong understanding of the AWS Well-Architected Framework.

AWS

2021 - Current

KMP1 & KMP2

The KMP1 Kanban Systems Design certification lays the groundwork in designing a Kanban system (or improving an existing system) for optimal flow and faster feature delivery. The KMP2 Kanban Systems Improvement certification concentrates on the complex demands of a multi-teamed organization and explores how to maintain momentum beyond initial improvements realized from a successful Kanban implementation.

Kanban University

2022 - Current

National Certificate in Datametrics

My studies centered around Delphi and C++, as well as systems analysis and various design methodologies. I use UML, Use Case, Flow Control, Timelines and other diagrams in my technical documents.

UNISA

2003 - 2004

Advanced Delphi and Oracle Course

My first in-house CRM was written in Delphi. This course helped me to master the more advanced concepts, and also introduced me to Oracle databases.

Dakota Training Centre

2002

Microsoft Certified Systems Engineer (MCSE)

My MCSE certification enabled me to administer and maintain Microsoft Windows Servers and provided me with the necessary network skills.

Dynamix Training Centre

2000

Experience

Application platform

I'm a Solution Architect and .NET Tech Lead at ABSA Corporate and Investment Banking in Cape Town, South Africa, where I work with an amazing team to design, develop and evolve an application platform that addresses cross-cutting concerns such as identity, permissions, auditing, internationalization, notifications and more. Our platform is ISO 27001 certified and is used by multiple business domains to deliver value and drive innovation for our customers.

ABSA CIB

2018 - Current

Peer to Peer Lending Platform

RainFin's lending marketplace is an alternative way to borrow or lend money. It directly connects borrowers and lenders, allowing for cheaper credit for borrowers and better returns for lenders.

As a DVT contractor I was responsible for completing the SME version of the application, along with 3 other developers. I learnt from one of the best architects in South Africa.

We also ported the PHP version of the Personal Loans application to a .NET MVC application.

The Personal and SME applications consumed web services exposed by the core system, and also did live credit checks.

We also implemented a lending platform for the 4AX exchange, who partnered with RainFin to boost SME growth in Africa.

DVT / RainFin

2015 - 2017

Data Analytics

Mix Telematics is a global provider of fleet and asset management solutions, offering a range of services to improve operational efficiency and reduce costs.

At the time they had OLAP cubes built on SQL Server Analysis Services, which were used to generate reports in SQL Server Reporting Services (SSRS). My role was to integrate REST API data sources into the existing OLAP cubes, so that reports could be generated with a more holistic view of the data.

DVT / MiX Telematics

2015 - 2016

Billing Engine

I designed and implemented the Synergy billing engine which was at the heart of Curro's cashflow provisioning.

The billing system took inputs such as attendance records and class lists from the 3rd party academic system for monthly billing. I implemented an ASP.NET solution that would generate batches for import into Pastel Partner. Exception reporting helped the bursars to validate the external inputs before generating the monthly invoices.

In 2015 Synergy generated more than 30,000 monthly invoices for study fees and other ad hoc fees and stored them in Sage Evolution.

Parents could view their statements online and pay online.

Curro Holdings

2010 - 2015