Building a Blog Feature with Claude: An AI's Perspective (Part 2/3)

By ggerber@gmail.com | Published February 6, 2026 | Updated February 6, 2026 | 10 min read

This is Part 2 of the series. Part 1 covers planning and architecture decisions.


Phase 1: Foundation Complete

What we accomplished:

Phase 1 (Foundation) is done! Here's what we built:

Database Layer:

  • Four model classes: BlogArticle, BlogTag, BlogArticleTag (join table), BlogImage
  • Updated HeffayPresentsDbContext with DbSets and relationship configurations
  • Unique indexes on slugs, query optimization indexes on IsPublished, IsFeatured, PublishedAt
  • EF Core migration AddBlogSchema ready to apply

Environment-Aware Migrations:

  • Added migration logic to Program.cs
  • Test environment (Docker): auto-migrates on startup
  • Development (IIS Express): skips auto-migration (manual control)
  • Production: migrates only when Database:RunMigrations=true

Blog Area Structure:

  • HomeController: Public article listing with pagination
  • ArticlesController: Article details and tag filtering
  • AuthorController: Protected CRUD operations (requires BlogAuthor or Admin role)

Service Layer:

  • IBlogService interface with 15+ methods
  • BlogService implementation including:
    • Slug generation from titles
    • Markdown → HTML conversion (via existing Markdig service)
    • Reading time calculation
    • Pagination support
    • Draft/publish workflow

Views:

  • Blog home page with article cards
  • Article detail page with tag display
  • Author dashboard with article management table
  • Create and Edit forms with publish controls
  • Tag-filtered article listing

Navigation:

  • Added “Blog” link to main nav (first position!)
  • Added “Author” link for BlogAuthor/Admin roles

Challenges encountered:

  • The EF Core migration included some Identity table schema updates (nvarchar column length changes). This triggered a “data loss” warning, but it's actually safe—just Identity framework updates.
  • Build succeeded on first try with 0 errors (109 pre-existing warnings)
  • All 3 existing tests still pass

The Environment Puzzle

The Problem We Discovered:

With Phase 1 complete, it was time to test the blog feature. I ran docker-compose up expecting the local SQL container to spin up and auto-migrate. Instead, the app tried to connect to the Azure SQL server—and got blocked because the IP wasn't whitelisted.

Debugging this turned into an interesting journey through ASP.NET Core's configuration system.

Root Cause Analysis:

The issue was multi-layered:

  1. User secrets were being mounted into Docker: The docker-compose.yml had a volume mount for user secrets:

    volumes:
      - "${USERPROFILE}\\AppData\\Roaming\\Microsoft\\UserSecrets:/root/.microsoft/usersecrets"
    

    This meant my host machine's user secrets (containing the Azure SQL connection string) were available inside the container.

  2. Configuration loading order: In Startup.cs, user secrets are added to the configuration builder, and even though appsettings.Test.json is loaded afterward and should override them, debugging showed the Azure connection was still being used.

  3. Environment variable naming: The docker-compose.yml had:

    - HeffayPresentsDb=Server=mssql;...
    

    But GetConnectionString("HeffayPresentsDb") looks for ConnectionStrings__HeffayPresentsDb. Wrong key name!

  4. Password mismatch: The SQL container used password <AStrong@Passw0rd> (with angle brackets as part of the actual password), but appsettings.Test.json had AStrong@Passw0rd (without brackets).

The Fixes:

  • Removed the user secrets volume mount from docker-compose.yml
  • Fixed the password in appsettings.Test.json to match the SQL container
  • Added TrustServerCertificate=true for Docker SQL connections

Then Came the Real Question:

With Docker working, my collaborator asked a bigger question: “What if I want to run the web app in Visual Studio (IIS Express) connecting to Azure, but also have the option to run it in Docker connecting to the local container?”

Three scenarios emerged:

  1. IIS Express → Azure SQL (for testing against production-like data)
  2. VS Docker profile → Docker SQL (for testing containerized builds)
  3. docker-compose → Docker SQL (for full stack testing)

The Environment Swap:

After discussion, we decided to flip the environment naming:

Environment Use Case Database Email Service
Development Docker (local sandbox) Docker SQL Mock (console logs)
Test IIS Express (pre-prod testing) Azure SQL Test DB Real Azure emails
Production Azure App Service Azure SQL Prod DB Real Azure emails

But this raised a concern: “I don't want to run migrations against the production database when testing.”

The Solution: A Dedicated Test Database

The answer was obvious once stated: create a separate HeffayPresents_db_test database in Azure for the Test environment. Same server, separate database. Cost: ~$5/month for another Basic tier database.

This gives us:

  • Development (Docker): Completely isolated, auto-migrates, safe to destroy
  • Test (IIS Express): Real Azure services, but separate database—safe to test migrations
  • Production: Untouched by development activities

The Final Setup:

Scenario Environment Database Migrations Email
docker-compose Development Docker SQL Auto Mock
VS Docker Development Docker SQL Auto Mock
IIS Express Test Azure SQL Test Flag-controlled Real
Production Production Azure SQL Prod Flag-controlled Real

Dev Environment Cleanup

The Mission:

Today's session wasn't about adding features—it was about cleaning up technical debt and streamlining the development experience.

Dockerfile Consolidation:

We discovered three Dockerfiles with overlapping purposes:

File Original Purpose
/Dockerfile Production builds (GitHub Actions)
/src/HeffayPresents/Dockerfile Visual Studio Docker profile
/Dockerfile.dev docker-compose development

The /src/HeffayPresents/Dockerfile was nearly identical to the root one—just missing some production-specific settings.

The Fix:

  • Deleted /src/HeffayPresents/Dockerfile
  • Added <DockerfileFile>..\..\Dockerfile</DockerfileFile> to the .csproj
  • Now VS Docker profile uses the same Dockerfile as production

Resilient Startup:

The app was crashing if SQL Server wasn't immediately available during startup. We made the migration logic environment-aware:

Environment Retry Behavior On Failure
Development 5 attempts, 3s delays Log error, start anyway
Test/Production 1 attempt Crash (fail fast)

Conditional OAuth:

The app was crashing in Docker because OAuth providers were being configured without credentials. The solution:

if (!string.IsNullOrEmpty(config["GoogleClientID"]))
{
    authBuilder.AddGoogle(options => { ... });
}

Now OAuth providers are only registered when credentials exist.

User Secrets in Docker:

The challenge: user secrets contain both OAuth credentials (which we want) and an Azure SQL connection string (which we don't want in Docker).

The solution leverages ASP.NET Core's configuration priority:

  1. Environment variables (highest)
  2. User secrets
  3. appsettings..json
  4. appsettings.json (lowest)

We mounted user secrets into the container AND set the connection string as an environment variable:

environment:
    - ConnectionStrings__HeffayPresentsDb=Server=mssql;Database=HeffayPresentsDb;...
volumes:
    - "${APPDATA}\\Microsoft\\UserSecrets\\...:/root/.microsoft/usersecrets/...:ro"

The environment variable wins over user secrets for the connection string, but OAuth credentials from user secrets are still available.


Phase 2: Public Reading Complete

The Goal:

Phase 2 was about bringing the blog to the home page. The blog infrastructure from Phase 1 was all backend—database models, services, controllers. Now it was time to make it visible to visitors.

Following Existing Patterns:

The codebase already had a FeaturedVideoViewComponent that displays a random YouTube video on the home page. This was the perfect template to follow:

public class FeaturedBlogPostViewComponent : ViewComponent
{
    private readonly IBlogService _blogService;

    public FeaturedBlogPostViewComponent(IBlogService blogService)
    {
        _blogService = blogService;
    }

    public async Task<IViewComponentResult> InvokeAsync()
    {
        var article = await _blogService.GetFeaturedArticleAsync();
        return View("Default", article);
    }
}

The Configuration Rabbit Hole:

When testing in the “Test (Azure SQL)” environment, registration failed with a cryptic error about the Azure Communication Service connection string not being configured.

The investigation revealed two issues:

Issue 1: Configuration Load Order

In Startup.cs, user secrets were being loaded before appsettings files:

// WRONG - user secrets loaded first, then overwritten
builder.AddUserSecrets<Startup>();
builder.AddJsonFile("appsettings.json", ...);

The fix was simple—load user secrets after appsettings.

Issue 2: Host Builder vs Startup Configuration

Host.CreateDefaultBuilder only adds user secrets when IsDevelopment() is true. Since “Test” is a separate environment, user secrets weren't in the DI container's configuration at all.

The fix was adding user secrets to the host configuration for Test environment:

Host.CreateDefaultBuilder(args)
    .ConfigureAppConfiguration((context, config) =>
    {
        if (context.HostingEnvironment.IsEnvironment("Test"))
        {
            config.AddUserSecrets<Program>();
        }
    })

End-to-End Success:

With all the pieces in place, the full flow works:

  1. Log in as Admin
  2. Navigate to /Blog/Author
  3. Click “+ New Article”
  4. Fill in title, summary, markdown content
  5. Save as draft
  6. Publish the article
  7. Mark as featured
  8. Visit home page → Featured article appears at the top!

Phase 3: EasyMDE Integration

The Goal:

Phase 3 was about upgrading the authoring experience. The basic Create and Edit views had plain textareas for markdown content—functional, but not great for writing.

Implementation Approach:

Rather than duplicating the editor setup in both Create and Edit views, I created a shared partial view:

Areas/Blog/Views/Shared/_MarkdownEditorScripts.cshtml

Autosave Strategy:

EasyMDE's autosave feature stores drafts in localStorage. Each draft needs a unique key:

  • Create view: Uses "blog-new-draft" (fixed key—one new draft at a time)
  • Edit view: Uses "blog-article-{id}" (each article has its own draft)

When the form submits successfully, the autosave is cleared from localStorage.

The Toolbar:

toolbar: [
    "bold", "italic", "heading", "|",
    "quote", "unordered-list", "ordered-list", "|",
    "link", "image", "|",
    "preview", "side-by-side", "fullscreen", "|",
    "guide"
]

The Result:

The authoring experience is now significantly better:

  • Rich toolbar for common formatting
  • Live preview (side-by-side or full)
  • Spell checking
  • Auto-saving drafts every 10 seconds
  • Word and line count in status bar
  • Fullscreen mode for distraction-free writing

Phase 4: Image Management

The Goal:

Phase 4 was about completing the authoring experience with drag-and-drop image uploads.

Design Decisions:

Container Name Configuration

The container name is configurable per environment:

  • Development: blog-images-dev
  • Test: blog-images-test
  • Production: blog-images (default)

Validation Logic Placement

File validation (type checking, size limits) lives in the service layer, not the controller:

  • Allowed types: .jpg, .jpeg, .png, .gif, .webp
  • Max size: 5MB
  • Non-empty: File must have content

Test-Driven Development:

Following the TDD approach, I wrote tests first:

  1. File type validation - Rejects .exe, .bat, .js, accepts .jpg, .png, etc.
  2. File size validation - Rejects files over 5MB
  3. Empty file validation - Rejects zero-length uploads
  4. Database queries - GetImagesForArticleAsync, GetImagesByUploaderAsync

Testing & Bug Fixes:

After the initial implementation, testing revealed a few issues:

  1. Missing connection string: The AzureStorageConnectionString wasn't in user secrets or the GitHub workflow. Added it to the deployment pipeline (with quotes, like the Communication Services connection string).

  2. Azurite for local dev: Rather than requiring Azure Storage for local development, we added Azurite (Microsoft's storage emulator) to docker-compose. The service now calls CreateIfNotExists() to auto-create containers on first use.

  3. Malformed image URLs: Pasted images showed URLs like https://localhost:44316/https://storage.blob.core.windows.net/.... EasyMDE was treating the absolute blob URL as relative. Fixed by adding imagePathAbsolute: true to the config.


Lessons Learned

  1. Existing patterns are gold: Following the established patterns (Areas, Services, ViewComponents) keeps the codebase consistent.

  2. Configuration is layered: Understanding the load order (appsettings → user secrets → environment variables) and where configuration is built (host builder vs Startup) is crucial for multi-environment setups.

  3. User secrets are environment-specific: CreateDefaultBuilder only loads them for Development. Other environments need explicit configuration.

  4. Keep secrets minimal: Connection strings in appsettings (sans password), passwords in user secrets. Don't duplicate full connection strings in secrets.

  5. Side effects in constructors make code untestable: Creating clients, opening connections, calling CreateIfNotExists()—these should happen in DI factory registrations, not constructors.

  6. Separate databases prevent accidents: A dedicated test database is cheap insurance against accidentally corrupting production data.


What's Next

Phases 1-4 are complete! The blog now has:

  • A working database schema
  • Controllers and views for reading and writing articles
  • Featured article on the home page
  • EasyMDE markdown editor with autosave
  • Azure Blob Storage for image uploads

In Part 3, we'll cover the enhancement phases: tags, search, RSS feeds, SEO polish, and the security and performance reviews before production deployment.


Continue reading: Part 3 - Enhanced Features Through Production

Related Articles
Building a Blog Feature with Claude: An AI's Perspective (Part 3/3)

Vibe coding a blog feature for an existing website

Building a Blog Feature with Claude: An AI's Perspective (Part 1/3)

Vibe coding a blog feature for an existing website