#ai-collaboration

2 posts

After Intelligence Microsite

A quick prototype for CCA's upcoming AI lecture series.

California College of the Arts is hosting After Intelligence, a lecture series exploring AI through artistic, ethical, and speculative lenses.

I prototyped this microsite using Claude’s frontend skills in a single afternoon session. Dark minimal aesthetic, responsive nav, custom button system, view transitions.

View the site

Designing a Collaborative AI Workflow: 24 Hours from Concept to Deployment

When a promotion review required a portfolio with minimal notice, I didn't just scramble to document my work—I pioneered a methodology for AI-augmented creative development that fundamentally changed how I approach projects.

The Challenge That Changed Everything

My promotion review was coming up, and my manager asked a reasonable question: “Can you share a portfolio that demonstrates the scope of work you’ve been doing?”

Simple request. Totally fair. One problem: I didn’t have a portfolio.

Not just “my portfolio needs updating.” I mean I had nothing. No case studies, no project showcases, no structured demonstration of my work. Building a professional portfolio from scratch typically takes weeks of planning, design iteration, content writing, and development. I had days, maybe a week at most before the review.

This is the story of what happened next—but more importantly, it’s about the methodology I discovered in the process. Because this wasn’t just about building something fast. It was about fundamentally rethinking how humans and AI can collaborate on creative work.

The Conventional Wisdom I Rejected

Faced with a tight deadline for something important, conventional wisdom offered two bad options:

  1. Rush through a basic portfolio - Sacrifice quality for speed, throw up a simple site with bullet points, hope it’s good enough for the review
  2. Ask for more time - Acknowledge that good work takes time, delay the review, lose momentum

I rejected both. Not out of stubbornness, but because I sensed there was a third option that nobody was talking about: What if the bottleneck isn’t time, but how we allocate human attention?

The Core Insight: Strategic vs. Tactical

Here’s what I realized: Most development time isn’t spent on high-level creative decisions. It’s spent on:

  • Writing boilerplate code
  • Debugging syntax errors
  • Looking up documentation
  • Implementing patterns you’ve done before
  • Repetitive styling and layout work

These tasks are necessary, but they’re not where creative value comes from. The real value is in:

  • Strategic decisions - What story should this portfolio tell?
  • Creative direction - What emotional experience should users have?
  • Content quality - How do I articulate my unique value?
  • Architectural choices - How should this site be structured?

Traditional development forces you to do both. AI collaboration lets you focus on the second while delegating the first.

The Four-Phase Methodology

Phase 1: Strategic Interview (Not Planning)

I didn’t start by planning what to build. I started by having Claude interview me like a recruiter.

The prompt: “Before we build anything, interview me to understand my professional narrative and value proposition.”

What followed was a 30-minute conversation that clarified:

  • Which projects actually demonstrated transferable skills
  • What made my approach unique vs. generic
  • The story arc I wanted potential employers to follow
  • My actual strengths (not what I assumed they were)

The insight: AI isn’t just for implementation—it’s for strategic clarification. By forcing me to articulate my value proposition conversationally, Claude helped me understand what I was really trying to communicate.

What I learned: This interview approach works because AI asks follow-up questions humans often skip. It probes assumptions. It asks “why does that matter?” and “how is that different from what anyone else would do?”

Phase 2: Rapid Prototyping with Live Feedback

With clarity on the narrative, we moved to prototyping—but not in the traditional way.

Instead of:

  • Static mockups → Development → Testing → Refinement

We did:

  • Live demo pages → Immediate feedback → Real-time iteration → Ship

The process:

  • Claude would build a working demo of a concept (a hero section, a project card, a transition effect)
  • I’d see it live in the browser within minutes
  • I’d give feedback: “This is too formal,” “The hierarchy is wrong,” “This feels sterile”
  • Claude would iterate immediately

The insight: When iteration cycles go from days to minutes, you can explore creative directions you’d normally dismiss as “too risky” or “not enough time.”

What I learned: Speed unlocks creativity. When trying something new doesn’t cost you half a day, you’re willing to experiment. I tested design systems I’d never attempted before because the cost of failure was 10 minutes, not 10 hours.

Phase 3: Iterative Development Partnership

This is where the methodology got interesting. I wasn’t directing AI to build my vision. I was partnering with AI to explore possibilities I hadn’t imagined.

Example: The modal-based project showcases.

My initial idea: “I want interactive project cards that expand to show details.”

Claude’s interpretation: “What about a SPA-like experience with smooth transitions, modal overlays, and no page reloads?”

My response: “I’ve never built that before—can we try it?”

The result: A sophisticated interaction pattern I wouldn’t have attempted on my own because I assumed it would be too complex for the timeline.

The insight: AI partnership isn’t about speed-running your existing ideas. It’s about expanding the possibility space of what you can attempt.

What I learned: The best AI collaboration happens when you’re willing to be surprised. I went in thinking I knew what I wanted. The partnership revealed possibilities I hadn’t considered because I’d mentally pre-rejected them as “too hard for the timeline.”

Phase 4: Seamless Deployment Integration

The final phase was deployment—and this is where the methodology proved its completeness.

Traditional workflow:

  • Build locally → Learn deployment platform → Configure settings → Troubleshoot issues → Deploy

AI-augmented workflow:

  • “I need to deploy this to Netlify” → Claude walks me through MCP integration → Deployed in one workflow

The insight: AI collaboration can encompass the entire development lifecycle, not just the coding phase.

What I learned: The methodology works when it removes all the tactical bottlenecks, not just some of them. Deployment is often where timelines explode—days lost to configuration issues, documentation gaps, and platform-specific quirks. AI eliminated that entirely.

What Worked (And What Didn’t)

What Worked Brilliantly

1. Chunked conversations with clear goals

  • Instead of “build me a portfolio,” I had focused 30-minute sessions: “Let’s design the hero section,” “Now let’s structure the case studies,” etc.
  • Each chunk had a clear success metric
  • This prevented scope creep and kept momentum high

2. Strategic briefs instead of technical specs

  • I didn’t say “create a div with flexbox layout and these specific styles”
  • I said “this section needs to feel aspirational but grounded”
  • AI translated strategic intent into technical implementation

3. Rapid feedback loops

  • Within minutes of seeing something, I could say “this isn’t working”
  • We’d iterate immediately instead of scheduling a refinement session
  • This compressed weeks of typical design iteration into hours

What Didn’t Work (Honest Assessment)

1. Perfect code quality

  • The code works, but it’s not as DRY or optimized as what I’d write with more time
  • Trade-off: Speed vs. perfect architecture
  • Decision: For a portfolio showcase, working > perfect

2. Comprehensive testing

  • No unit tests, limited cross-browser testing
  • Risk accepted: Personal portfolio with limited traffic
  • Mitigation: Manual visual checks as I used the site myself

3. Deep customization

  • Some design choices were “good enough” rather than “exactly what I envisioned”
  • Trade-off: Perfect vision vs. timeline constraints
  • Decision: 90% of vision in 10% of time is worthwhile

The Broader Implications

This methodology isn’t just about portfolios. It’s about a fundamental shift in how creative professionals can work.

The old model: Human does everything, slower but controlled.

The new model: Human focuses on strategy/creativity, AI handles implementation, faster and often better.

But here’s the nuance: This only works when you’re clear about what AI can’t do.

AI can’t:

  • Decide what story resonates with your audience
  • Know which of your projects demonstrates unique value
  • Make aesthetic judgments about emotional tone
  • Understand what makes your work different from everyone else’s

AI can:

  • Implement technical patterns you describe strategically
  • Iterate rapidly on design concepts
  • Handle documentation and deployment workflows
  • Explore possibility spaces you haven’t considered

The key: Keep strategic control, delegate tactical execution.

What I’d Do Differently Next Time

  1. Start with even more strategic clarity - The interview phase was valuable, but I’d spend more time on competitive analysis and differentiation strategy before building

  2. Build in quality checkpoints - Schedule explicit moments to step back and assess: “Is this still aligned with my strategic goals?”

  3. Document the methodology as I work - I discovered this process in real-time, but having it explicit would make it reproducible

  4. Test assumptions earlier - Some design choices I’d have validated with quick user feedback before committing to implementation

The Meta Lesson

The ironic thing? This blog post exists because the portfolio worked too well as a demonstration of AI collaboration—and that became its weakness as a portfolio piece.

When people saw the portfolio, they thought: “Did Joel really build this, or did AI do everything?” The very thing that made it impressive (rapid AI-augmented development) raised questions about my actual capabilities.

That’s why it’s now a blog post instead.

As a portfolio piece, it was self-referential in a weird way. As thought leadership about AI-augmented workflows, it’s actually interesting. It shows not just what I can build, but how I think about collaboration, methodology, and creative work.

And that’s ultimately more valuable than any portfolio piece could be.

Conclusion: AI as Amplifier, Not Replacement

The 24-hour portfolio sprint taught me something crucial: AI doesn’t replace creative work—it amplifies creative capacity by removing tactical bottlenecks.

The portfolio itself? It helped make the case for my promotion. But more importantly, it gave me a methodology I now use for all creative projects:

  1. Strategic clarity first - Interview-driven understanding of goals
  2. Rapid prototyping - Live demos over static planning
  3. Iterative partnership - Exploration over execution
  4. End-to-end workflows - Deployment integrated, not appended

This isn’t just about building portfolios faster. It’s about building anything faster while maintaining—or even increasing—creative quality.

The question isn’t “Can AI build this?” The question is “Where should I focus my irreplaceable human attention to create the most value?”

Everything else? That’s what AI partnership is for.


Want to explore this methodology further? I’m continuing to document my experiences with AI-augmented creative workflows. Check out my other posts about AI learning reflections and thought leadership on modern development practices.

Have questions or thoughts on AI collaboration? I’d love to hear your experiences. Connect with me through the contact form or find me on the platforms listed there.