Khaled Ezzat


16/07/2025 Free Supabase BaaS: Self-Host with Edge Functions on VPS

Supabase has rapidly emerged as a powerful open-source alternative to Firebase, offering developers a suite of tools built on top of the rock-solid foundation of PostgreSQL. While its managed cloud platform provides an excellent and easy entry point, the true power of open-source lies in the freedom to run it yourself. This article explores the compelling proposition of self-hosting the entire Supabase stack, including the Deno-based Edge Functions, on your own Virtual Private Server (VPS). We will delve into how you can achieve this powerful setup for free, bypassing the limitations of managed free tiers and gaining complete control over your data, infrastructure, and scalability. This is your guide to building a production-ready Backend-as-a-Service without the recurring monthly bill.

Why Self-Host Supabase? The Allure of Full Control

Opting to self-host Supabase is a strategic decision that shifts the balance of power from the platform provider to you, the developer. The most immediate benefit is sovereignty over your data and infrastructure. When you run Supabase on your own VPS, your PostgreSQL database, authentication services, and storage files reside in a server environment you manage. This eliminates vendor lock-in and gives you the freedom to choose your server’s geographic region, a critical factor for data compliance regulations like GDPR. Furthermore, self-hosting allows you to completely bypass the limitations inherent in the official managed free tier. Forget about projects being paused due to inactivity, restrictive database sizes, or limited API request quotas. Your only constraints are the resources of your VPS.

Beyond breaking free from limitations, self-hosting unlocks a deeper level of customization. You gain direct, unfettered access to the underlying PostgreSQL database. This means you can install any trusted Postgres extension you need, perform fine-grained performance tuning, and implement complex backup and replication strategies that go beyond the offerings of the managed platform. While this guide focuses on leveraging free VPS tiers, it’s crucial to understand that this model is also incredibly cost-effective at scale. As your application grows, the predictable cost of a more powerful VPS will often be significantly lower than the equivalent paid tiers on a managed service.

Securing Your Free VPS: The Foundation of Your Stack

Before deploying any application, you must first build a secure foundation. The “free” in “free VPS” is meaningless if your server is vulnerable to attack. Fortunately, several cloud providers offer “Always Free” tiers that are more than capable of running a Supabase instance for development or small-to-medium production workloads. Oracle Cloud’s Free Tier is a popular choice due to its generous offerings, including Ampere A1 ARM-based instances with multiple cores and ample RAM. Alternatives include the free tiers from AWS EC2 and Google Cloud Platform, though their terms can be more restrictive.

Once you’ve provisioned your virtual machine (typically running a Linux distribution like Ubuntu), the next step is server hardening. This is not optional. Follow these essential security practices:

  • Create a Non-Root User: Immediately create a new user account with `sudo` privileges. You will perform all subsequent actions from this account, disabling the root login to reduce the attack surface.
  • Implement a Firewall: A firewall is your first line of defense. Using a simple tool like `ufw` (Uncomplicated Firewall) on Ubuntu, you can set a default policy to deny all incoming traffic and explicitly allow only the necessary ports, such as SSH (port 22), HTTP (port 80), and HTTPS (port 443).
  • Use SSH Key Authentication: Passwords can be cracked. Disable password-based authentication for SSH entirely and rely solely on SSH keys. This method is vastly more secure and ensures that only machines with a corresponding private key can access your server.
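The hardening steps above can be sketched as a short command sequence, run as root on a fresh Ubuntu instance. This is a minimal sketch, not a complete hardening guide; the user name `deploy` is an example:

```bash
# Create a non-root user with sudo privileges ("deploy" is an example name)
adduser deploy
usermod -aG sudo deploy

# Default-deny firewall, allowing only SSH, HTTP, and HTTPS
ufw default deny incoming
ufw default allow outgoing
ufw allow 22/tcp
ufw allow 80/tcp
ufw allow 443/tcp
ufw --force enable

# In /etc/ssh/sshd_config, disable password and root logins:
#   PasswordAuthentication no
#   PermitRootLogin no
systemctl reload ssh
```

Copy your public key to the new user's `~/.ssh/authorized_keys` and confirm you can log in with it before closing your current session, or you risk locking yourself out.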

Only after these hardening steps is your server truly ready to host your Supabase stack securely.

Deploying Supabase with Docker: A Step-by-Step Overview

The officially supported and most straightforward method for self-hosting Supabase is through Docker. This containerization approach encapsulates each component of the Supabase stack—from the database to the API gateway—into isolated, manageable services. The primary prerequisite is to install Docker and Docker Compose on your hardened VPS.

The deployment process is methodical:

  1. Clone the Official Repository: Begin by cloning the Supabase Docker setup files directly from their GitHub repository. You can do this with the command: `git clone --depth 1 https://github.com/supabase/docker`.
  2. Configure Your Environment: Navigate into the new `docker/` directory. Here you will find a file named `.env.example`. Copy this to a new file named `.env`. This file is the control panel for your entire stack.
  3. Generate Secure Secrets: This is the most critical configuration step. The `.env` file contains placeholders for essential secrets like `POSTGRES_PASSWORD`, `JWT_SECRET`, and the `ANON_KEY` and `SERVICE_ROLE_KEY` for the API. Do not use the default example values. Use a strong password generator to create unique, complex strings for each of these secrets. Compromising these keys would mean compromising your entire application.
  4. Launch the Stack: Once your `.env` file is populated with secure secrets, you can bring your Supabase instance to life with a single command: `docker-compose up -d`. This command tells Docker to pull all the necessary images and start all the services (Kong, GoTrue, PostgREST, Realtime, Storage, etc.) in the background.
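As a sketch of step 3, strong random values can be generated with `openssl`. Note that `ANON_KEY` and `SERVICE_ROLE_KEY` are JWTs that must be signed with your `JWT_SECRET`; the Supabase self-hosting docs provide a generator for those, so only the raw secrets are shown here:

```bash
# Generate unique secrets for the .env file (a sketch; adapt names to your .env)
POSTGRES_PASSWORD=$(openssl rand -base64 32)
JWT_SECRET=$(openssl rand -hex 32)   # 64 hex characters, well above the minimum length

echo "POSTGRES_PASSWORD=${POSTGRES_PASSWORD}"
echo "JWT_SECRET=${JWT_SECRET}"
```

Paste the resulting values into `.env`, and store them somewhere safe: losing `JWT_SECRET` invalidates every key derived from it.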

After a few minutes, your private, fully functional Supabase backend will be running on your VPS.

Enabling Edge Functions and Securing with a Reverse Proxy

A base Supabase installation is powerful, but modern applications demand serverless logic. The self-hosted stack includes the Deno-based Edge Functions service, allowing you to deploy custom TypeScript code that runs close to your data. This service is part of the `docker-compose.yml` configuration and runs alongside the other components. You can deploy functions to your new instance using the Supabase CLI, but you must first configure it to point to your self-hosted domain instead of the official Supabase cloud.

However, you should never expose the myriad of ports from your Docker containers directly to the internet. This is insecure and impractical. The professional solution is to set up a reverse proxy. A web server like Nginx or Caddy is installed on the host VPS and acts as a single, secure entry point for all web traffic. It then intelligently routes incoming requests to the correct internal Supabase service based on the URL path. For example, a request to `https://your-domain.com/auth/v1/` would be forwarded by the reverse proxy to the internal GoTrue authentication service running on its specific Docker port.
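As a minimal sketch: in the default docker-compose setup, the Kong gateway (published on port 8000 by default) already multiplexes paths like `/auth/v1/` and `/rest/v1/` to the right internal service, so an Nginx server block mainly needs to forward everything to Kong. The domain and port below are assumptions; adjust them to your configuration:

```nginx
# Minimal reverse-proxy sketch: forward all traffic to the Kong gateway
server {
    listen 80;
    server_name your-domain.com;

    location / {
        proxy_pass http://localhost:8000;   # Kong's default published port
        proxy_http_version 1.1;
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
        # Upgrade headers are needed for the Realtime service's websockets
        proxy_set_header Upgrade $http_upgrade;
        proxy_set_header Connection "upgrade";
    }
}
```

Once this forwards correctly over plain HTTP, the TLS setup described below slots into the same server block.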

Finally, to make your service production-ready, you must enable SSL/TLS. No modern application should operate over unencrypted HTTP. A reverse proxy makes this simple. Using a free tool like Certbot with Nginx, you can automatically obtain and renew SSL certificates from Let’s Encrypt. This ensures all communication between your users’ clients and your Supabase backend is fully encrypted and secure, completing your professional, self-hosted setup.

In conclusion, self-hosting a Supabase and Edge Functions stack on a free VPS is not only possible but also a strategically sound decision for developers seeking ultimate control and cost-efficiency. We’ve walked through the entire process, starting with the compelling reasons to self-host, such as data sovereignty and the removal of platform limitations. We then established a secure foundation by hardening a free-tier VPS, a non-negotiable first step. From there, we detailed the Docker-based deployment, emphasizing the critical importance of securing your configuration secrets. Finally, we elevated the setup to a production-grade service by integrating a reverse proxy for secure traffic management and enabling SSL/TLS encryption. While this path requires more initial setup than a one-click managed solution, the reward is immense: a powerful, scalable, and entirely free backend infrastructure that you truly own.

16/07/2025 AI & Developer Evolution: Architect, Critic, Prompt Engineer

The landscape of software development is undergoing a seismic shift, powered by the rapid advancements in Artificial Intelligence. Tools like GitHub Copilot and ChatGPT are no longer novelties; they are becoming integrated into the daily workflows of developers worldwide. This integration sparks a compelling and often debated question: is AI merely the next step in developer productivity tools, or does it signal a fundamental transformation of the developer’s role itself? As AI models become more adept at writing, debugging, and even designing code, we must explore whether the core skill of a developer will transition from writing explicit code to crafting precise instructions for an AI. This article delves into how AI is reshaping coding and considers the emerging reality: will developers evolve into prompt engineers?

The AI Co-Pilot: Augmenting Developer Productivity

Before we can talk about a full-blown role change, it’s crucial to understand AI’s current position in the software development world: that of a powerful co-pilot. Tools based on Large Language Models (LLMs) have proven exceptionally effective at handling the repetitive and time-consuming tasks that often bog down development cycles. This includes:

  • Boilerplate Code Generation: Setting up project structures, writing standard functions, or creating data models can be done in seconds with a simple prompt, freeing up developers to focus on more complex, unique business logic.
  • Accelerated Debugging: Instead of spending hours searching for a cryptic bug, a developer can present the problematic code snippet to an AI, ask for an explanation, and receive potential fixes. The AI acts as an ever-present, knowledgeable partner for troubleshooting.
  • Learning and Exploration: When encountering a new framework or library, developers can use AI as an interactive tutor, asking it to explain concepts or generate example implementations. This dramatically shortens the learning curve.

In this capacity, AI is not replacing the developer’s critical thinking. Instead, it’s augmenting it. The developer is still the architect, the decision-maker, and the one responsible for the final product. The AI is a tool, albeit an incredibly sophisticated one, that handles the “how” once the developer has figured out the “what” and the “why”. It removes friction, allowing for a more fluid and creative coding process.

The Rise of Prompt Engineering: A New Layer of Abstraction

As developers become more reliant on AI co-pilots, a new skill is naturally coming to the forefront: prompt engineering. This is the art and science of communicating effectively with an AI to achieve a desired output. Simply asking an AI to “write a user authentication system” will yield generic, likely insecure, and incomplete code. Effective prompt engineering is far more nuanced and demonstrates a deep understanding of the underlying technical requirements.

An expert developer-turned-prompt-engineer would provide detailed context, such as:

  • The Technology Stack: “Using Node.js with Express and a PostgreSQL database…”
  • Specific Libraries and Methods: “…implement password hashing with bcrypt and session management using JWTs (JSON Web Tokens)…”
  • Constraints and Requirements: “…ensure the function handles invalid input gracefully and returns specific HTTP error codes for different failure scenarios.”

This isn’t just asking a question; it’s designing a micro-task through natural language. In a way, this is the next logical step in the history of programming abstraction. We moved from machine code to assembly, then to high-level languages like C++ and Python. Each step allowed us to communicate our intent to the machine more abstractly. Prompt engineering is the newest layer, allowing developers to orchestrate complex code generation using structured natural language. It’s a skill that requires just as much precision as writing the code itself.

Beyond Code: AI’s Integration into the Full Software Development Lifecycle

The impact of AI extends far beyond the act of writing code. It is beginning to weave itself into the entire Software Development Lifecycle (SDLC), changing how we approach everything from planning to deployment. A developer’s job has always been more than just coding, and AI is becoming a partner in these other critical areas as well.

Consider AI’s role in:

  • Automated Testing: AI can analyze a function and automatically generate a suite of unit tests to cover various edge cases, significantly improving code coverage and reliability with minimal manual effort.
  • Intelligent Code Reviews: AI tools can act as a preliminary reviewer, flagging potential bugs, security vulnerabilities, or deviations from team-specific style guides before a human reviewer even sees the code.
  • Documentation Generation: Maintaining up-to-date documentation is a common pain point. AI can parse code and comments to automatically generate and update technical documentation, ensuring it never falls out of sync.
  • Project Planning: AI can assist project managers and senior developers in breaking down large, complex features into smaller, manageable user stories and even provide rough time estimates based on historical data.

This holistic integration means the developer’s role is shifting from a pure implementer to more of a systems orchestrator. Their time is spent less on the granular details of implementation and more on high-level design, quality assurance, and strategic decision-making, using AI as a force multiplier across all stages of a project.

The Future Developer: A Hybrid of Architect, Critic, and Communicator

So, will developers become just prompt engineers? The answer is a definitive no. They will, however, undoubtedly need to become excellent prompt engineers as part of a new, hybrid skillset. The idea that one can simply type commands without a deep understanding of software engineering is a fallacy. AI-generated code is not infallible; it can contain subtle bugs, security flaws, or be inefficient. It lacks true business context and an understanding of a system’s long-term architectural goals.

The developer of the future, therefore, is a hybrid professional who excels in three key areas:

  1. The Architect: They hold the high-level vision for the software, designing scalable, maintainable, and robust systems. They decide what needs to be built and why.
  2. The Critic: They possess the deep technical knowledge to critically evaluate, debug, and refine AI-generated code, ensuring it meets quality and security standards. They are the ultimate quality gate.
  3. The Communicator (Prompt Engineer): They are masters of instructing the AI, translating complex architectural requirements into precise, context-rich prompts that yield useful, high-quality code.

The core, irreplaceable skills will be critical thinking, problem-solving, and systems design. Writing code line-by-line may become less frequent, but understanding code intimately will be more important than ever.

In conclusion, AI is not leading to the obsolescence of the developer but to their evolution. We are moving from a world where a developer’s primary value was in writing code to one where their value lies in directing, validating, and integrating it. The journey began with AI as a co-pilot, augmenting productivity and handling rote tasks. This necessitated the rise of prompt engineering, a new abstraction layer for communicating technical intent. Now, we see AI permeating the entire development lifecycle. The developer of tomorrow will not be a simple prompt engineer; they will be a technical leader who wields AI as a powerful tool. They will absorb prompt engineering as a core competency, but their foundational skills in architecture, critical analysis, and problem-solving will be what truly defines their expertise and indispensability.

16/07/2025 The Ultimate Guide to Self-Hosting in 2025: Why, How, and SEO Wins

**Meta Description**: Discover why self-hosting is booming in 2025, how to launch your stack in minutes, and SEO best practices to grow your site traffic organically.

Self-hosting isn’t just for tech nerds anymore — it’s become a legit lifestyle. Whether you’re aiming to take back data sovereignty, avoid SaaS subscription creep, or rank your blog on Google _without_ paying Squarespace, this guide covers it all.

## 🔥 Why Self-Hosting Is Booming in 2025

- **Privacy**: Host your own cloud, mail, and password manager — _no third-party eyes_.
- **Cost**: One VPS = 10+ apps for the price of one SaaS plan.
- **Freedom**: Customize everything from themes to database configs.

> “Control your stack, control your story.”

## 🚀 Getting Started: Self-Hosting Stack in 10 Minutes

Here’s a quick-start VPS stack that works great for blogging, personal sites, and file hosting:

### 🧰 Tools
- **Docker**: containerized app deployments
- **Traefik**: reverse proxy + HTTPS
- **Portainer**: manage containers in a GUI
- **Uptime Kuma**: service monitoring

### 🧱 Deployment
```bash
git clone https://github.com/yourname/selfhosted-starter.git
cd selfhosted-starter
docker-compose up -d
```

And you’re live. No Nginx configs. No pain.

## 🌍 SEO Best Practices for Self-Hosters

Getting Google to notice your self-hosted blog is easier than most think.

### ✅ Checklist
- **Use HTTPS** (Traefik + Let’s Encrypt)
- **Add meta descriptions**
- **Fast load times** (use [Lighthouse](https://pagespeed.web.dev/))
- **OpenGraph tags** (great for social shares)
- **Sitemap.xml + robots.txt**
- **Clean permalinks** (via static site generator or CMS)
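To make the meta-description and OpenGraph items concrete, here is a sketch of the relevant `<head>` markup. The titles, description text, and URLs are placeholders:

```html
<head>
  <!-- Search snippet -->
  <meta name="description" content="A one-sentence summary shown in search results.">

  <!-- OpenGraph tags for social shares -->
  <meta property="og:title" content="Your Post Title">
  <meta property="og:description" content="Summary shown when the link is shared.">
  <meta property="og:image" content="https://example.com/cover.png">
  <meta property="og:url" content="https://example.com/your-post/">
</head>
```

Most CMSs on the list below (Ghost, WordPress, Hugo themes) emit these tags for you; the sketch is mainly useful if you hand-roll templates.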

### Recommended SEO-Friendly CMS
- **Ghost**: beautiful, lightweight
- **WordPress**: familiar, powerful plugins
- **Hugo**: blazing fast static sites

## 🛠️ My Current SEO-Optimized Self-Hosted Setup

| Service | Purpose | Why I Use It |
|------------|---------------------|-----------------------------------------|
| Ghost | Blog engine | Clean URLs, fast performance |
| Plausible | Analytics | GDPR-safe, no cookies |
| Cloudflare | DNS & cache | Speed + protection |
| Traefik | Proxy + HTTPS | Auto SSL, integrates with everything |

> Hosting tip: SEO begins with speed and structure. Ghost + Traefik + CDN = gold.

## 📚 Bonus Resources

- [Awesome Self-Hosted](https://github.com/awesome-selfhosted/awesome-selfhosted)
- [r/selfhosted](https://www.reddit.com/r/selfhosted/)
- [StaticGen](https://www.staticgen.com/)

## 🧠 Final Thoughts

Self-hosting isn’t about rejecting the cloud — it’s about _owning_ it. With the right stack, you get control, privacy, and serious SEO potential.

Let’s build the web we want to see. One container at a time.

> 🧠 Ready to start your self-hosted setup?
>
> I personally use [this server provider](https://www.kqzyfj.com/click-101302612-15022370) to host my stack — fast, affordable, and reliable for self-hosting projects.
> 👉 If you’d like to support this blog, feel free to sign up through [this affiliate link](https://www.kqzyfj.com/click-101302612-15022370) — it helps me keep the lights on!

Want help deploying Ghost, Plausible, or your own SEO-optimized stack? Just say the word!

16/07/2025 Self-Hosted Diaries (2025.1): My Personal Homelab Stack and Journey

**Meta Description**: Explore my complete 2025 self-hosting stack — from Proxmox and Traefik to Nextcloud, Vaultwarden, and self-hosted LLMs. Insights, tools, and lessons learned.

Self-hosting isn’t just a tech hobby anymore — it’s how I’ve taken back control over my data, my apps, and even my budget. Here’s the full rundown of my current 2025 stack: what I run, how it’s configured, and why each piece earns its place in my homelab.

## 🧱 Core Infrastructure

- **Server**: Intel NUC (i7, 64GB RAM, 1TB NVMe)
- **Hypervisor**: Proxmox VE 8.2
- **Networking**: Unifi Dream Router, VLANs for IoT/guests
- **Backups**: BorgBackup + Rclone to Wasabi S3
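The backup pipeline can be sketched as two cron entries: Borg snapshots the data locally, then rclone mirrors the repo off-site. All paths, the repo name, and the `wasabi:` remote are placeholders for your own setup:

```text
# crontab sketch: nightly Borg backup, then mirror the repo to Wasabi with rclone
0 3 * * *  borg create --stats /backups/borg-repo::{hostname}-{now} /srv/data
30 3 * * * rclone sync /backups/borg-repo wasabi:my-backup-bucket
```

Staggering the two jobs matters: the rclone sync should only start after the Borg archive has finished writing.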

## 🌐 Reverse Proxy & Auth

- **Reverse Proxy**: Traefik v2 with Docker provider
- **SSL**: Let’s Encrypt + DNS challenge (Cloudflare API)
- **SSO**: Authelia (paired with Traefik)

**Tip**: Offload auth to Authelia early — simplifies service security massively.
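As a sketch of how a service plugs into this setup, here are hypothetical Traefik labels on a container in `docker-compose.yml`. The domain, router name, and certresolver name are placeholders, and the `authelia@docker` middleware assumes Authelia is already registered with Traefik:

```yaml
# docker-compose.yml fragment: expose a container through Traefik with auto-TLS
services:
  nextcloud:
    image: nextcloud:latest
    labels:
      - "traefik.enable=true"
      - "traefik.http.routers.nextcloud.rule=Host(`cloud.example.com`)"
      - "traefik.http.routers.nextcloud.entrypoints=websecure"
      - "traefik.http.routers.nextcloud.tls.certresolver=letsencrypt"
      # Route through Authelia's forward-auth middleware for SSO
      - "traefik.http.routers.nextcloud.middlewares=authelia@docker"
```

Because Traefik watches the Docker socket, adding a new service is usually just a matter of attaching labels like these; no proxy restart needed.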

## 🧠 Core Services

| Service | Purpose | Notes |
|-----------------|-------------------|-------------------------------------|
| Portainer | Docker GUI | Easy container management |
| Watchtower | Auto-updates | Monitors and updates containers |
| Uptime Kuma | Monitoring | Self-hosted Statuspage & alerts |
| Homer Dashboard | Landing Page | Quick access for all services |

## 💾 File & Sync

- **Nextcloud Hub 7**: File sync, calendar, contacts
- **Syncthing**: Peer-to-peer file sync (laptops ↔ server)

> **Optimization**: Offload preview generation in Nextcloud with a `previewgenerator` cron job.

## 🔐 Security Tools

- **Vaultwarden**: Password manager
- **Gitea**: Git server (private repos)
- **Dozzle**: Real-time Docker logs
- **CrowdSec**: Logs + ban IPs via Traefik bouncer

## 📬 Mail Stack

- **Mailserver**: Mailu (Postfix, Dovecot, Rspamd)
- **Webmail**: Snappymail (fast, beautiful)
- **MX / DNS**: Hosted externally with Mailu’s DNS templates

> **Pro tip**: SPF/DKIM/DMARC tuning is critical. Use tools like mail-tester.com to verify.
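For reference, minimal SPF and DMARC records look like the sketch below. The domain, DKIM selector, and policy values are placeholders; the actual DKIM public key comes from Mailu's generated DNS templates:

```text
example.com.                  TXT  "v=spf1 mx -all"
_dmarc.example.com.           TXT  "v=DMARC1; p=quarantine; rua=mailto:postmaster@example.com"
dkim._domainkey.example.com.  TXT  "v=DKIM1; k=rsa; p=<public key from Mailu>"
```

Start DMARC at `p=none` while you verify reports, then tighten to `quarantine` or `reject` once legitimate mail passes consistently.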

## 📺 Media & Extras

- **Plex**: For streaming backed by local RAID
- **Mealie**: Recipe manager for the household
- **FreshRSS**: RSS reader

## 🧠 Self-Hosted LLMs

- **Ollama**: Local models like Mistral 7B
- **Frontend**: OpenWebUI

> This combo gives me a local ChatGPT-style interface — fast, no API limits.

## 🔁 CI/CD & Automation

- **Woodpecker CI**: Lightweight CI for personal repos
- **Webhooks**: Trigger redeploys, alerts
- **Cronicle**: UI cron job manager

## 🔐 VPN & Remote

- **Tailscale**: Remote access + subnet routing
- **Pi-hole**: Ad-blocking DNS (via split-tunnel)

## Lessons from 2024

- **Don’t over-engineer**: Simplicity scales better.
- **Automation wins**: Every webhook or cron job you configure saves hours later.
- **Monitoring > troubleshooting**: Set up alerts _before_ things break.



Want a deep dive on any tool above? Just let me know!