MFS Application Notes

How a Voice of the Customer agent system maps to Multifamily Strategy's existing tools, data sources, and workflows.

MFS Data Sources Available

What MFS Already Has

| Source | Tool | VoC Signal | Integration Path |
|---|---|---|---|
| Sales calls | Fathom | Objections, requests, competitor mentions, sentiment | Fathom API or transcript export |
| CRM data | GoHighLevel (GHL) | Deal stage, lead source, pipeline, tags, notes | GHL MCP server |
| Community | Skool | Member questions, pain points, engagement patterns | Skool DM responder data + scraping |
| DM conversations | Skool + GHL | Qualification signals, objections, intent | DM setter bot logs |
| Ad engagement | Meta Ads | Which hooks/offers resonate, audience segments | Meta API |
| Website analytics | (if applicable) | Content interest, drop-off points | Analytics API |

What's Missing vs. Ramp's Stack

  • No Gong — Fathom is the call recorder; different API, similar data
  • No Snowflake — Would need a warehouse or use Postgres on the VPS
  • No enterprise CDP — GHL is the closest thing to a unified customer view
  • No Zendesk — Support happens through DMs and community, not ticketing

Fathom as the Gong Equivalent

Fathom replaces Gong in the MFS stack. Key differences:

| Capability | Gong | Fathom |
|---|---|---|
| Transcript API | Full REST API | Export-based (check current API status) |
| CRM sync | Native Salesforce | GHL via Zapier or direct |
| AI summaries | Built-in | Built-in |
| Coaching scorecards | Yes | Limited |
| MCP support | Yes (new) | Not yet |

Approach: Pull Fathom transcripts (API or export) and process them the same way as Gong data. See Fathom Sales Call Analysis for current integration details.

Fathom-Specific Pipeline

  1. Export or API-pull call transcripts from Fathom
  2. Parse into speaker turns with timestamps
  3. Apply the same chunking strategy (512-1024 tokens)
  4. Enrich with GHL deal data (match by contact email/phone)
  5. Store in Postgres on VPS (or local SQLite for prototyping)
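Steps 2-3 of the pipeline above can be sketched as follows. The transcript line format ("MM:SS Speaker: text") is an assumption, not the actual Fathom export layout, and token counts are approximated as word counts for sizing:

```python
import re

def parse_turns(transcript: str):
    """Parse a plain-text transcript into speaker turns with timestamps.

    Assumes lines like "00:42 Alice: We can't do that price" — the real
    Fathom export format may differ, so adjust the regex to match it.
    """
    turns = []
    for line in transcript.splitlines():
        m = re.match(r"(\d{1,2}:\d{2})\s+([^:]+):\s*(.+)", line)
        if m:
            turns.append({"ts": m.group(1), "speaker": m.group(2), "text": m.group(3)})
    return turns

def chunk_turns(turns, max_tokens=512):
    """Group consecutive turns into chunks of roughly max_tokens,
    approximating one token per word (close enough for sizing)."""
    chunks, current, count = [], [], 0
    for turn in turns:
        n = len(turn["text"].split())
        if current and count + n > max_tokens:
            chunks.append(current)
            current, count = [], 0
        current.append(turn)
        count += n
    if current:
        chunks.append(current)
    return chunks

sample = "00:01 Setter: Thanks for hopping on.\n00:05 Prospect: Happy to be here."
turns = parse_turns(sample)
chunks = chunk_turns(turns, max_tokens=512)
```

Step 4 (GHL enrichment) would then join each chunk to a contact by the email or phone attached to the call's calendar invite.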

GHL as the Salesforce Equivalent

GHL replaces Salesforce. Integration is already documented:

| GHL Object | Maps to Salesforce | VoC Signal |
|---|---|---|
| Contact | Contact + Lead | Name, email, phone, tags, source |
| Opportunity | Opportunity | Pipeline stage, value, close date |
| Conversation | Activity | DM history, email threads |
| Tags | Custom fields | Qualification status, segment |
| Notes | Notes | Manual observations |

Existing integration: GHL + Claude Code via MCP server — agents can already query GHL data.

Skool Community as a Unique VoC Source

Ramp doesn't have an equivalent — this is an MFS-specific signal:

  • Member questions — What do people ask about? (pain points, knowledge gaps)
  • Engagement patterns — Which content gets reactions? (value indicators)
  • DM conversations — The DM setter bot captures qualification data
  • Churn signals — Members going quiet, not attending calls

Extracting Skool Signals

  1. Scrape or export community posts and comments
  2. Classify by topic: deal analysis, financing, operations, mindset, tools
  3. Track question frequency — recurring questions = content/product gaps
  4. Cross-reference with GHL contact data — which members are in the pipeline?
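Steps 2-3 above could start as simply as keyword classification plus a question counter. The topic keywords below are illustrative assumptions; a production version would classify with an LLM call instead:

```python
from collections import Counter

# Hypothetical topic keywords — a real pipeline would classify via LLM.
TOPIC_KEYWORDS = {
    "financing": ["loan", "lender", "dscr", "interest", "down payment"],
    "deal analysis": ["cap rate", "noi", "underwrite", "cash flow"],
    "operations": ["property manager", "tenant", "maintenance"],
}

def classify_post(text: str) -> str:
    """Assign a post to the first topic whose keyword appears; else 'other'."""
    lowered = text.lower()
    for topic, keywords in TOPIC_KEYWORDS.items():
        if any(kw in lowered for kw in keywords):
            return topic
    return "other"

def question_frequency(posts):
    """Count questions per topic — recurring topics signal content gaps."""
    questions = [p for p in posts if "?" in p]
    return Counter(classify_post(q) for q in questions)

posts = [
    "What DSCR loan terms are typical right now?",
    "How do you underwrite a 12-unit deal?",
    "Closed my first duplex this week!",
]
freq = question_frequency(posts)
```

Step 4 then joins the asker's name against GHL contacts to see which questioners are already in the pipeline.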

Proposed MFS VoC Architecture

[Fathom Transcripts]  [GHL CRM Data]  [Skool Community]  [DM Bot Logs]
        |                   |                |                  |
        v                   v                v                  v
    [PostgreSQL on VPS — unified customer_signals table]
        |
        v
    [Claude Code VoC Skill]
        |
        ├── [Call Analyzer Agent] — Fathom transcripts
        ├── [Pipeline Agent] — GHL opportunities + contacts
        ├── [Community Agent] — Skool posts + DMs
        └── [Ad Intel Agent] — Meta ad performance
        |
        v
    [Synthesis Agent] — cross-source patterns
        |
        v
    [Output: briefings, content ideas, offer optimization]
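A minimal sketch of the unified customer_signals table from the diagram above — the column names are assumptions, not an existing MFS schema. SQLite is used here for prototyping (as the pipeline notes suggest); the same DDL ports to Postgres on the VPS with minor type changes:

```python
import sqlite3

# In-memory SQLite for prototyping; swap for Postgres on the VPS later.
conn = sqlite3.connect(":memory:")
conn.execute("""
CREATE TABLE customer_signals (
    id          INTEGER PRIMARY KEY,
    source      TEXT NOT NULL CHECK (source IN
                  ('fathom', 'ghl', 'skool', 'dm_bot')),
    contact_id  TEXT,              -- GHL contact id, when matchable
    signal_type TEXT,              -- objection, question, stage_change, ...
    content     TEXT,              -- transcript chunk, post, note
    occurred_at TEXT,              -- ISO-8601 timestamp
    metadata    TEXT               -- JSON blob: call id, deal stage, tags
)
""")
conn.execute(
    "INSERT INTO customer_signals (source, signal_type, content, occurred_at) "
    "VALUES (?, ?, ?, ?)",
    ("fathom", "objection", "Worried about financing terms", "2025-01-15T10:00:00"),
)
rows = conn.execute("SELECT source, signal_type FROM customer_signals").fetchall()
```

The CHECK constraint on source keeps all four feeds (Fathom, GHL, Skool, DM bot) in one table so the synthesis agent can query across them.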

Key Differences from Ramp

  • Postgres instead of Snowflake — simpler, already on VPS, sufficient for MFS scale
  • Fathom instead of Gong — same concept, different integration
  • GHL instead of Salesforce — already integrated via MCP
  • Skool instead of Zendesk — community-based support, not ticketing
  • Claude Code skills instead of LangChain — already the MFS automation backbone

MFS-Specific Use Cases

1. Sales Call Intelligence

  • "What objections are most common in setter calls this month?"
  • "Which lead sources produce the most engaged prospects on calls?"
  • "What competitor programs are prospects mentioning?"

2. Content & Offer Optimization

  • "What questions keep coming up in Skool that we haven't addressed?"
  • "Which ad hooks led to the best call-to-close rates?"
  • "What language do our best customers use to describe their goals?"

3. Pipeline Health

  • "Which deals are stalling and why?"
  • "What do closed-won deals have in common vs. closed-lost?"
  • "Are DM-qualified leads converting better than ad leads?"

4. Community Health

  • "Which Skool members are most at-risk of churning?"
  • "What topics drive the most engagement?"
  • "Are community members progressing through the pipeline?"

Implementation Priority for MFS

Phase 1: Quick Win (1-2 weeks)

  • Build a Claude Code skill that pulls Fathom call summaries + GHL pipeline data
  • Simple synthesis: "Here's what happened in calls this week + pipeline status"
  • No vector store needed — just API pulls and LLM synthesis
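The Phase 1 skill is mostly prompt assembly. A sketch of the synthesis step, where the call summaries and pipeline rows are assumed to come from hypothetical Fathom/GHL pull functions (not real API signatures):

```python
# Phase 1 sketch: no vector store — just pulled data plus LLM synthesis.
# The inputs stand in for hypothetical fetch_fathom_summaries() /
# fetch_ghl_pipeline() helpers wrapping the real API pulls.

def build_briefing_prompt(call_summaries, pipeline):
    """Concatenate this week's call summaries and pipeline snapshot into a
    single synthesis prompt for the LLM."""
    calls = "\n".join(f"- {s}" for s in call_summaries)
    deals = "\n".join(
        f"- {d['name']}: {d['stage']} (${d['value']:,})" for d in pipeline
    )
    return (
        "Summarize this week's Voice of the Customer signals.\n\n"
        f"Call summaries:\n{calls}\n\n"
        f"Pipeline:\n{deals}\n\n"
        "List top objections, stalled deals, and suggested follow-ups."
    )

prompt = build_briefing_prompt(
    ["Prospect objected to price, asked about financing"],
    [{"name": "J. Smith", "stage": "proposal", "value": 8000}],
)
```

The skill then sends this prompt to the model and posts the result — "here's what happened in calls this week plus pipeline status" — on a weekly schedule.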

Phase 2: Deeper Analysis (2-4 weeks)

  • Add Fathom transcript processing (full text, not just summaries)
  • Add Skool community data extraction
  • Store in Postgres, build the unified customer_signals table
  • Cross-reference: which Skool questions match call objections?

Phase 3: Full VoC System (4-8 weeks)

  • Parallel agent architecture with all four sources
  • Automated weekly VoC briefings
  • Ad creative optimization loop: VoC signals -> ad creative testing
  • Feed into Hormozi copywriting for offer refinement

Key Takeaways

  • MFS has the data sources needed for a VoC agent — they're just different tools than Ramp's (Fathom/GHL/Skool instead of Gong/Salesforce/Zendesk)
  • The existing GHL MCP integration and Claude Code skills are the foundation — no new orchestration framework needed
  • Skool community data is a unique VoC source that Ramp doesn't have — exploit this advantage
  • Start with a simple skill (Phase 1) that pulls Fathom summaries + GHL data — prove value before building the full pipeline
  • Postgres on the VPS is sufficient — don't over-engineer with Snowflake at MFS scale