// portfolio/data/projects.ts

import { Project } from "@/types/project";

export const PROJECT_REGISTRY: Project[] = [
  {
    slug: "ratoong",
    category: "web",
    title: "Ratoong",
    subtitle: "High-Performance Ski & Travel Engine",
    role: "Full-Stack Engineer",
    duration: "2020 — 2022",
    stack: ["Angular", "Firebase", "GCP Cloud Functions", "TypeScript"],
    metrics: [
      "< 200ms Search Latency",
      "10,000+ Active Data Points",
      "Fully Responsive Design",
    ],
    description:
      "A comprehensive ski resort planning and rating platform featuring real-time weather integration and complex multi-parameter search filters.",
    engineeringStory: `
Building Ratoong was an exercise in managing **High-Density Data** within a reactive frontend ecosystem. The core challenge was transforming thousands of resort data points—ranging from piste lengths to real-time weather—into a lightning-fast, searchable interface.
#### Data Orchestration & Efficiency
Leveraging a **Document-Based Architecture (Firestore)**, I designed a schema that balanced read efficiency with real-time updates. To handle complex filtering (altitude, lift types, pricing) without taxing the client, I utilized **GCP Cloud Functions** as a middleware layer to process and normalize data from various third-party APIs, including Google Maps and Weather services.
#### Modern Angular & Responsive UI
The frontend was built using modern **Angular**, focusing on a component-based architecture that ensured high performance across both desktop and mobile. I implemented a custom state management flow to handle resort ratings and trip planning, ensuring that user interactions were instantly reflected in the UI while syncing seamlessly with **Firebase Authentication**.
#### Lessons in Scalability
Working with a **Backend-as-a-Service (BaaS)** model taught me the importance of cost-effective query design and the power of event-driven triggers. I was responsible for maintaining the development, staging, and production environments, ensuring a clean CI/CD flow from localhost to the Firebase cloud.
#### Security & Data Governance
A key architectural pillar was the implementation of a robust **Security Rules** layer within Firebase. By moving the logic from the client to the database level, we ensured that resort metadata was globally searchable while sensitive user planning data remained strictly isolated. This event-driven security model allowed us to scale the user base without increasing the risk surface area of the platform.
    `,
    images: [
      "/projects/ratoong/ratoong-1.jpg",
      "/projects/ratoong/ratoong-2.jpg",
      "/projects/ratoong/ratoong-3.jpg",
      "/projects/ratoong/ratoong-4.jpg",
      "/projects/ratoong/ratoong-5.jpg",
    ],
    liveUrl: "https://ratoong.com",
    isPrivate: false,
    mermaidChart: `
graph LR
subgraph Client_Side [Frontend]
A[Angular Web App]:::traffic
end
subgraph Firebase_GCP [Cloud Infrastructure]
direction TB
Hub((Firebase SDK)):::hub
B[Firebase Auth]:::node
C[Firestore DB]:::node
D[Cloud Functions]:::node
E[Partner API Proxy]:::node
end
subgraph External [Third Party]
direction TB
F[Weather API]:::traffic
G[Google Maps API]:::traffic
H[Affiliate Partners]:::traffic
end
A ==> Hub
Hub -->|Identity| B
Hub <-->|Data Sync| C
Hub -->|Triggers| D
D --> F
D --> G
D --> H
E -.->|Internal Access| C
classDef traffic fill:#2563eb,stroke:#3b82f6,color:#fff
classDef node fill:#16a34a,stroke:#22c55e,color:#fff
classDef hub fill:#f59e0b,stroke:#d97706,color:#fff,stroke-width:2px
    `,
    storyLabel: "DATA // UI EFFICIENCY",
  },
  {
    slug: "datasaur",
    category: "web",
    title: "Datasaur",
    subtitle: "Automated Statistical Analysis Engine",
    role: "Lead Architect & Creator",
    duration: "2019 — 2021",
    stack: ["Python", "Flask", "MongoDB", "Pandas", "SciPy"],
    metrics: [
      "Automated Stat-Testing",
      "Multi-Format ETL",
      "Self-Hosted Architecture",
    ],
    description:
      "A comprehensive survey data platform that automates complex statistical workflows, from raw data aggregation to advanced hypothesis testing and visualization.",
    storyLabel: "ALGORITHMIC // STATISTICAL PROCESSING",
    images: [
      "/projects/datasaur/datasaur-1.jpg",
      "/projects/datasaur/datasaur-2.jpg",
      "/projects/datasaur/datasaur-3.jpg",
      "/projects/datasaur/datasaur-4.jpg",
      "/projects/datasaur/datasaur-5.jpg",
      "/projects/datasaur/datasaur-6.jpg",
    ],
    repoUrl: "https://git.georgew.dev/georgew/datasaur",
    liveUrl: "https://datasaur.georgew.dev",
    isPrivate: false,
    engineeringStory: `
Datasaur was born of the need to bridge the gap between raw survey data and academic-grade statistical insights. The challenge wasn't just displaying data, but architecting a system capable of performing complex mathematical computations on the fly.
#### Statistical Automation Pipeline
The core of the application is a robust processing engine built on **Pandas** and **SciPy**. I implemented automated workflows for non-parametric tests like **Kruskal-Wallis** and **Mann-Whitney U**, ensuring that the platform could intelligently suggest and execute the correct statistical test based on the data distribution.
#### Data Visualization & Export
To translate these numbers into insights, I built a visualization layer supporting everything from standard histograms to complex **Box and Whisker** plots. Using **XlsxWriter**, I developed a custom export engine that allowed users to pull processed data directly into professional-grade spreadsheets with pre-formatted statistical summaries.
#### Infrastructure & Monolithic Integrity
The project follows a classic monolithic architecture, which proved highly efficient for keeping memory-intensive dataframes close to the processing logic. Today, the platform is self-hosted using a **Caddy** reverse proxy and **MongoDB Atlas**, demonstrating the longevity and stability of a well-architected Flask ecosystem.
    `,
    mermaidChart: `
graph LR
subgraph Client_Layer [User Interface]
A[Vanilla JS / Browser]:::traffic
end
subgraph Server_Layer [Application Logic]
B[Caddy Reverse Proxy]:::node
C[Flask / Python Monolith]:::node
end
subgraph Processing_Engine [Data Science Core]
D[Pandas ETL]:::node
E[SciPy / Pingouin Stats]:::node
F[XlsxWriter Export]:::node
end
subgraph Storage [Data Persistence]
G[MongoDB Atlas]:::node
end
A <-->|HTTPS| B
B <-->|WSGI| C
C <-->|Query/Write| G
C ==>|Dataframes| D
D --> E
D --> F
%% Styles %%
classDef traffic fill:#2563eb,stroke:#3b82f6,color:#fff
classDef node fill:#16a34a,stroke:#22c55e,color:#fff
    `,
  },
  {
    slug: "ayla",
    category: "infrastructure",
    title: "Ayla",
    subtitle: "Regulatory-Compliant Medical Platform",
    role: "Tech Lead & Scrum Master",
    duration: "2022 — 2024",
    stack: [
      "Kubernetes",
      "Ruby on Rails",
      "Flutter",
      "Terraform",
      "GCP",
      "OTC",
    ],
    metrics: [
      "Multi-Region Data Residency",
      "ISO 27001 Compliant",
      "Single-Click IaC Deployment",
    ],
    description:
      "A high-availability medical device platform supporting dementia treatment, featuring multi-cloud infrastructure, automated pipelines, cross-platform web and mobile deployments, and strict regulatory compliance.",
    storyLabel: "GOVERNANCE // CLOUD ORCHESTRATION",
    images: [
      "/projects/ayla/ayla-1.jpg",
      "/projects/ayla/ayla-2.jpg",
      "/projects/ayla/ayla-3.jpg",
      "/projects/ayla/ayla-4.jpg",
      "/projects/ayla/ayla-5.jpg",
    ],
    isPrivate: true,
    engineeringStory: `
As Tech Lead for Ayla, I was responsible for architecting a platform that met the rigorous safety and security standards of a certified medical device. This required a "Security-by-Design" approach, balancing strict availability SLAs with rigid data residency requirements across the UK and EU.
#### Multi-Cloud Infrastructure & IaC
To satisfy GDPR and local health data regulations, I architected a dual-cloud strategy: **Open Telekom Cloud (OTC)** for European users and **GCP** for the UK. Using **Terraform**, I codified the entire infrastructure, enabling us to spin up identical, audit-ready Kubernetes clusters or Cloud Run environments in minutes. This automation was critical for maintaining the "Release-Pre-Release" protocols required for medical certification.
#### Full-Stack Delivery & Automation
The platform featured a **Flutter** frontend for Web, iOS, and Android, all managed through automated **CI/CD** pipelines. I implemented a layered automation strategy, combining **GitHub Actions** for web deployments and server-side logic with **Fastlane** for mobile app store distribution. The backend was a high-performance **Ruby on Rails** API, architected as a stateless "mini-service" to ensure horizontal scalability within Kubernetes. I also integrated **Squidex CMS** to empower non-technical colleagues to manage content without compromising the system's core integrity.
#### Leadership & Compliance
Beyond the code, I served as Scrum Master and Product Owner, leading sprint planning, retrospectives, and demos. I worked closely with regulatory partners and personally oversaw the creation of **DPIAs**, **Cyber Essentials** certification, and the path to **ISO 27001** compliance. In the absence of a dedicated IT department, I managed the MDM systems and sysadmin duties, ensuring that every layer of the organization met the strict regulatory bar.
    `,
    mermaidChart: `
graph TB
%% Direction and Layout
direction TB
subgraph Shared_Ops [DevOps & CMS]
I[GitHub Actions CI/CD]:::traffic
J[Terraform IaC]:::traffic
K[Squidex CMS]:::node
end
subgraph Frontend_Layer [Omni-Channel]
A[Flutter Web / Mobile]:::traffic
B[Bunny CDN / Edge Storage]:::node
end
subgraph UK_Region [GCP]
G[Cloud Run Containers]:::node
H[Cloud SQL]:::node
end
subgraph EU_Region [Open Telekom Cloud]
D[NGINX Ingress]:::node
C[K8s Cluster]:::node
F[Object Storage]:::node
E[PostgreSQL RDS]:::node
end
%% Connections
A <--> B
I -->|Fastlane| A
J -->|Provision| G
J -->|Provision| C
B <-->|UK Traffic| G
B <-->|EU Traffic| D
D --> C
G <--> H
C <--> E
C --- F
G <--> K
C <--> K
%% Styles
classDef traffic fill:#2563eb,stroke:#3b82f6,color:#fff
classDef node fill:#16a34a,stroke:#22c55e,color:#fff
    `,
  },
  {
    slug: "flutter-1",
    category: "mobile",
    title: "Flutter-1",
    subtitle: "Personal R&D Pipeline",
    role: "Architect & Creator",
    duration: "2025 — Present",
    stack: ["Python", "FastAPI", "Next.js", "Redis"],
    metrics: ["Sub-50ms Latency", "Automated ETL", "Self-Hosted"],
    description:
      "A data science pipeline tool built to explore high-speed processing and real-time visualization of large datasets.",
    images: ["/datasaur-1.jpg"],
    repoUrl: "https://git.georgew.dev/georgew/datasaur",
    liveUrl: "https://ratoong.com",
    engineeringStory:
      "In this high-stakes medical environment, I implemented a custom audit logging system that ensured every state change was immutable...",
    storyLabel: "DATA EFFICIENCY",
    isPrivate: true,
  },
];
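Since consumers of this registry typically look a project up by its slug or filter by category, a few helpers could live alongside it. The sketch below is illustrative, not part of the real module: `ProjectLike`, `findBySlug`, `groupByCategory`, and `duplicateSlugs` are hypothetical names, and the local `ProjectLike` shape stands in for the `Project` type imported from `@/types/project`, assuming only that it exposes `slug` and `category`.

```typescript
// Hypothetical stand-in for the real Project type from "@/types/project";
// only the two fields used by these helpers are assumed to exist.
type ProjectLike = { slug: string; category: string };

// Find a single project by its URL slug (e.g. for a dynamic detail route).
export function findBySlug<T extends ProjectLike>(
  projects: readonly T[],
  slug: string
): T | undefined {
  return projects.find((p) => p.slug === slug);
}

// Bucket projects by category, preserving registry order within each bucket.
export function groupByCategory<T extends ProjectLike>(
  projects: readonly T[]
): Map<string, T[]> {
  const groups = new Map<string, T[]>();
  for (const p of projects) {
    const bucket = groups.get(p.category);
    if (bucket) {
      bucket.push(p);
    } else {
      groups.set(p.category, [p]);
    }
  }
  return groups;
}

// Dev-time guard: returns any slugs that appear more than once,
// since slugs double as route segments and must stay unique.
export function duplicateSlugs(projects: readonly ProjectLike[]): string[] {
  const seen = new Set<string>();
  const dupes = new Set<string>();
  for (const { slug } of projects) {
    (seen.has(slug) ? dupes : seen).add(slug);
  }
  return [...dupes];
}
```

A call like `findBySlug(PROJECT_REGISTRY, "ratoong")` would back a detail page, while `duplicateSlugs(PROJECT_REGISTRY)` could run in a unit test to catch accidental copy-paste entries.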