How I turned days of manual data exploration into hours of collaborative reasoning—and finally made AI useful
The Pain We All Know
Data migration sounds simple—until you’re the one doing it.
I was recently tasked with migrating a MongoDB setup to PostgreSQL. The source system held years of critical business data, spread across multiple collections with inconsistent structures, embedded relationships, and undocumented logic.
The usual path would look like this:
- Write exploratory queries just to understand the shape of the data
- Reverse-engineer relationships and business rules
- Document everything in spreadsheets
- Map fields manually
- Hold meetings to confirm every assumption
It’s long, painful, and error-prone.
But this time, I approached it differently—by turning the migration planning into a collaborative reasoning session with AI, powered by something called MCP.
Quick Detour: Why AI Coding Often Fails
In an earlier blog, I wrote about why coding with LLMs can actually be harder than you think. I explored how:
- LLMs lack context over time
- They struggle with edge cases
- They generate confident but incorrect code
- You often spend more time correcting than creating
The core insight from that blog was this:
The problem isn’t the model—it’s the process.
Throwing raw prompts at an LLM won’t work. You need a systematic, structured way to interact, iterate, and co-create.
And that’s exactly what Model Context Protocol (MCP) provides.
What is MCP?
MCP (Model Context Protocol) is an open protocol that lets an AI model talk to external systems (like a database) through well-defined tools, while keeping the context it builds up available throughout the session. It transforms a generic chat into a purposeful, reasoning-based dialogue where the AI can:
- Connect directly to data sources
- Ask you smart questions
- Learn from your answers
- Build a shared mental model of the system
- Generate specs, mappings, and code with rationale
Think of it as pair programming meets system design meets product thinking, with AI as your assistant architect.
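To make that concrete, here is a rough sketch of what an MCP client session looks like using the official Python SDK (the `mcp` package). In practice a client like Claude Desktop or Cursor handles this wiring from a config file; the MongoDB server command and its flags below are illustrative assumptions, not a specific product's interface.

```python
# A minimal sketch of an MCP client session via the Python `mcp` SDK.
# The server command, flags, and tool names are illustrative assumptions.
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

async def main():
    # Launch a hypothetical MongoDB MCP server as a subprocess.
    server = StdioServerParameters(
        command="npx",
        args=["-y", "some-mongodb-mcp-server",
              "--connection-string", "mongodb://localhost:27017/data_db"],
    )
    async with stdio_client(server) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            # Discover which tools the server exposes to the model.
            tools = await session.list_tools()
            print([tool.name for tool in tools.tools])

asyncio.run(main())
```

Once the session is up, the model can invoke those tools on its own (list collections, run a query), which is what makes the discovery phase below possible.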
How I Started the Conversation with AI
To kick things off, I shared the real-world context and gave the AI a clear instruction to reason with me step-by-step. Here’s the first prompt I used:
```
You are my partner in this migration.
I have a system called **XYZ** with a MongoDB datastore. It’s been rewritten to use **PostgreSQL** instead.
The MongoDB database is named `data_db`.
I need your help to do a **gap analysis** for migrating data from Mongo to Postgres.
Let’s go step by step — ask clear, specific questions. Don’t assume anything.
```
This prompt set the tone. Instead of throwing static instructions at the model, I treated it as a collaborator. I didn’t just want output—I wanted a conversation. That changed everything.
My AI-Powered Migration Plan (Step-by-Step)
🧭 1. Discovery: Letting the AI Explore
I gave the AI my MongoDB connection string (via MCP).
It connected, scanned the collections, and built a structural understanding:
- Identified 4 key collections
- Spotted nested documents and references
- Surfaced anomalies (nulls, missing links)
Then it started asking questions—smart ones:
“Some records appear in multiple groups—should we treat them as duplicates or distinct?”
“There are null values in rating thresholds—should we default them or skip these records?”
I didn’t have to instruct it line by line. The AI did the discovery, and I just had to guide its understanding.
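For a feel of what that discovery looked like under the hood, here is a sketch of equivalent probes written directly in pymongo. The collection and field names mirror this post's examples and are assumptions about the real schema.

```python
# Sketch of the kind of discovery queries the AI ran via MCP,
# expressed in plain pymongo. Names are assumptions from this post.
from pymongo import MongoClient

db = MongoClient("mongodb://localhost:27017")["data_db"]

# 1. Which collections exist, and roughly how big are they?
for name in db.list_collection_names():
    print(name, db[name].estimated_document_count())

# 2. Sample documents to infer structure: nested objects, arrays, refs.
field_shapes = {}
for doc in db["agents"].aggregate([{"$sample": {"size": 100}}]):
    for key, value in doc.items():
        field_shapes.setdefault(key, set()).add(type(value).__name__)
print(field_shapes)  # e.g. {'skills': {'list'}, 'nested_rating': {'dict'}}

# 3. Surface anomalies, such as null or missing rating thresholds.
nulls = db["agents"].count_documents(
    {"nested_rating.rating_type.min_value": None}
)
print(f"{nulls} documents have a null/missing rating threshold")
```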
🧠 2. Clarifying Business Logic
Instead of me documenting every rule up front, the AI interviewed me through focused prompts.
We discussed:
- Availability logic
- Prioritization rules
- Inconsistencies across collections
- What missing data really meant in the business context
And when I gave an answer, the AI updated its assumptions and adjusted its strategy.
🔄 3. Mapping Strategy: Working Through Solutions
Once the logic was understood, we worked through field mappings together. Here’s the kind of dialogue we had:
```sql
-- MongoDB: nested_rating.rating_type.min_value
-- Postgres: rating_ranges.range_minimum
-- Reason: Normalize nested object
-- Special Handling: Convert nulls to 0; log this in migration audit
```
Another:
“Capacity records exist without skill assignments—should these be skipped or marked as incomplete?”
I’d respond, and it would modify the mapping spec and document the rationale.
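To show how one of those agreed rules becomes code, here is a sketch of the null-handling decision from the mapping above. The column names come from the spec; the `migration_audit` table and `source_id` column are assumptions about how the audit log might be modeled.

```python
# Sketch: apply the agreed rule "convert null min_value to 0, and log it".
# rating_ranges/range_minimum follow the spec; migration_audit is assumed.
import psycopg2

def migrate_rating_range(cur, source_id, min_value):
    if min_value is None:
        min_value = 0  # agreed default for null thresholds
        cur.execute(
            "INSERT INTO migration_audit (source_id, note) VALUES (%s, %s)",
            (source_id, "min_value was null; defaulted to 0"),
        )
    cur.execute(
        "INSERT INTO rating_ranges (source_id, range_minimum) VALUES (%s, %s)",
        (source_id, min_value),
    )

conn = psycopg2.connect("dbname=target_db")
with conn, conn.cursor() as cur:
    # In the real migration these rows come from MongoDB via pymongo.
    for source_id, min_value in [("abc123", None), ("def456", 5)]:
        migrate_rating_range(cur, source_id, min_value)
```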
📄 4. The Final Output: A Living Spec with Rationale
By the end of our session, we had:
✅ Business Logic Documentation
```yaml
agent_availability:
  logic: "Based on active schedules"
  edge_cases:
    - "Overlapping schedules → take most recent"
    - "No data → default to available"
  validation: "Mapped statuses must exist in Postgres schema"
```
✅ Mapping Definitions
```sql
-- Source: agent.skills[].type
-- Target: skill_assignments.skill_type
-- Notes: Flattened array → join table; log unmapped values
```
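In practice, that flattening rule looks roughly like this; the known skill types and the helper name are hypothetical:

```python
# Sketch: flatten agent.skills[] into the skill_assignments join table,
# logging any skill types that don't exist in the Postgres schema.
KNOWN_SKILL_TYPES = {"voice", "chat", "email"}  # assumed reference data

def flatten_skills(agent_doc):
    rows, unmapped = [], []
    for skill in agent_doc.get("skills", []):
        skill_type = skill.get("type")
        if skill_type in KNOWN_SKILL_TYPES:
            rows.append((agent_doc["_id"], skill_type))
        else:
            unmapped.append(skill_type)  # log per the mapping notes
    return rows, unmapped

agent = {"_id": "a1", "skills": [{"type": "voice"}, {"type": "fax"}]}
rows, unmapped = flatten_skills(agent)
print(rows)      # [('a1', 'voice')] -> rows for skill_assignments
print(unmapped)  # ['fax'] -> goes to the migration audit log
```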
✅ Migration Phases
- Reference data (e.g., rating configs)
- Core entities (e.g., agent profiles)
- Relationship data (e.g., skills)
- Derived data (e.g., availability)
Each phase came with:
- Migration SQL
- Validation queries
- Rollback scripts
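To illustrate what "validation queries" meant in practice, here is a sketch of a per-phase row-count reconciliation. The collection/table pairings are assumed names from the earlier examples; relationship data flattened from arrays needs a custom check rather than a 1:1 count.

```python
# Sketch: per-phase validation comparing MongoDB and Postgres row counts.
# Collection/table names follow the earlier examples and are assumptions.
import psycopg2
from pymongo import MongoClient

mongo = MongoClient("mongodb://localhost:27017")["data_db"]
pg = psycopg2.connect("dbname=target_db")

PHASES = [
    # (phase, mongo collection, postgres table)
    ("reference data", "rating_configs", "rating_ranges"),
    ("core entities", "agents", "agent_profiles"),
]

with pg, pg.cursor() as cur:
    for phase, collection, table in PHASES:
        source = mongo[collection].count_documents({})
        cur.execute(f"SELECT count(*) FROM {table}")  # trusted names only
        target = cur.fetchone()[0]
        status = "OK" if source == target else "MISMATCH - investigate"
        print(f"{phase}: mongo={source} postgres={target} -> {status}")
```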
What Made This Different
🔍 1. AI Did Discovery, Not Just Execution
It didn’t just wait for prompts. It explored the data and drove the conversation.
⚡ 2. Real-Time Iteration
Instead of a long planning phase, we adjusted course quickly—answer, iterate, document, repeat.
🧾 3. Reasoning Over Output
The AI didn’t just give code. It gave decisions with explanations—which made validation and debugging easier.
Try This Yourself
If you’re facing a complex data migration, or just tired of AI generating “half-right” code, here’s how to approach it:
- Configure MCP – Give the AI access to your system (DB/API/docs)
- Collaborate – Let it explore and ask questions
- Clarify – Answer business logic, edge cases, and rules
- Refine – Review and tweak the proposed logic
- Capture – Let it document everything: logic, mappings, rationale
- Generate – Use the output to produce real implementation code
Final Thoughts
This experience reminded me: AI isn’t your coder—it’s your collaborator.
It won’t replace your understanding of the system. But with the right process, it can accelerate your understanding, catch blind spots, and help you build something robust, fast.
I didn’t just generate migration scripts—I co-designed a data model, documented logic, and built trust in the outcome.
The shift? Not in the AI’s capabilities. But in how I worked with it.