Prepared Answers
TOUGH QUESTIONS
Jason will push back. Here are 12 questions he's likely to ask and exactly how to answer each one.
"What does Phase 1 actually look like day-to-day?"
Richard
Week 1: Working sessions with your teams — product, engineering, QA. We map how each team builds software today, where time is lost, and how AI tools are currently being used. Week 2: We deliver the playbook — specific tool recommendations wired to your Jira and GitHub pipelines, top 3-5 automation opportunities ranked by impact, and a phased rollout plan. You'll have something actionable, not theoretical.
"How do you handle our different tech stacks?"
Richard
Claude's Model Context Protocol (MCP) connects to Jira and GitHub at the operational layer: it reads repo context regardless of whether the code is PHP, .NET, Ruby, or Node. We don't need to standardize your languages; we standardize the orchestration on top of them. CodeRabbit parses each language's abstract syntax tree natively, so it understands legacy PHP as well as modern Node.
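If Jason wants the "language-agnostic" claim made concrete, here is a minimal illustrative sketch (not a real MCP server; the extension map and function names are our own) of the idea: the orchestration layer inventories a repo by file type, so the same pipeline runs whether the codebase is PHP, .NET, Ruby, or Node.

```python
from collections import Counter
from pathlib import Path

# Illustrative extension map (an assumption, not an exhaustive list):
# the orchestration layer only needs to know what it is looking at,
# not how to compile it.
LANGUAGES = {
    ".php": "PHP",
    ".cs": ".NET",
    ".rb": "Ruby",
    ".js": "Node",
    ".ts": "Node",
}

def repo_language_profile(repo_root):
    """Count source files per language; the pipeline itself stays language-neutral."""
    counts = Counter()
    for path in Path(repo_root).rglob("*"):
        if path.is_file() and path.suffix in LANGUAGES:
            counts[LANGUAGES[path.suffix]] += 1
    return dict(counts)
```

The point of the sketch: nothing downstream branches on the language, which is why we can sit on top of mixed stacks without standardizing them first.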
"What is Kelly actually doing with AI internally?"
Whole team — be honest
We're building our AI practice right now — and that's actually an advantage. We're not selling a legacy framework. We're building in real-time using the same tools we're recommending to you. This playbook we just showed you? Built with AI. The podcast? AI-generated from our call notes. We practice what we preach every day. The companies ahead on AI hired partners who learned alongside them. We're offering to be that partner.
"You lost to an offshore competitor last time. Why should I trust you now?"
Roxy + Treasure
We learned from that. Last time we came with staffing — headcount, not strategy. This time we're bringing a specific technical approach built around your exact situation. The playbook you just saw proves we did the homework no one else is doing. We're not throwing bodies at this. We're bringing practitioners who build AI agent pipelines and embed with your teams.
"I don't need a $15,000 PDF telling me what I already know."
Treasure
Neither do we. Phase 1 isn't a study — it's an implementation playbook. You get exact wiring diagrams for how AI agents connect to your Jira and GitHub pipelines via MCP. You get specific tool configurations for your stacks. You get a ranked list of automation opportunities with estimated cycle time impact. You can use it with us or without us — that's how confident we are that you'll want us for Phase 2.
"How does this integrate with what Nova is already doing?"
Wade
We augment, not replace. Your Nova teams are in rapid experimentation right now. We come alongside them — shoulder to shoulder — and help them evaluate what's working, fill the gaps, and build the orchestration layer that connects the individual tools into a system. We're not asking them to stop what they're doing. We're helping them do it faster and make it repeatable.
"What about product managers? They can't keep up with faster engineering."
Richard
You identified this on our last call and it's exactly right. If we only accelerate engineering, we just move the bottleneck to product. Our approach includes the product agent specifically — AI that helps generate and refine requirements so product keeps pace with engineering. The whole pipeline has to move together or you get no benefit.
"What about security? Our code is proprietary."
Richard
All AI tools we deploy are configured for enterprise data isolation — no training on your code. We use enterprise tiers with data segregation. For your UK operations like Slick and TeamUp, we ensure GDPR compliance on all AI tool usage. API keys and secrets never enter AI prompts. And all AI-generated code goes through human security review before it touches production. This is baked into our approach from day one, not bolted on later.
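If he pushes on "API keys and secrets never enter AI prompts," we can show the shape of the guardrail. This is a hedged sketch, not our production tooling: the patterns and function name are illustrative, and a real deployment would use a dedicated secret scanner. The idea is simply that anything resembling a credential is redacted before text reaches a model.

```python
import re

# Illustrative patterns only (an assumption, not an exhaustive ruleset):
# key=value credential assignments plus two common token shapes.
SECRET_PATTERNS = [
    re.compile(r"(?i)(api[_-]?key|secret|token|password)\s*[:=]\s*\S+"),
    re.compile(r"ghp_[A-Za-z0-9]{36}"),   # GitHub personal access token shape
    re.compile(r"sk-[A-Za-z0-9]{20,}"),   # common API-key prefix shape
]

def scrub_secrets(text):
    """Replace anything that looks like a credential with a placeholder
    before the text is included in an AI prompt."""
    for pattern in SECRET_PATTERNS:
        text = pattern.sub("[REDACTED]", text)
    return text
```

Ordinary code and prose pass through untouched; only credential-shaped strings are replaced, which is why the guardrail can sit in the pipeline without degrading context quality.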
"Can you actually deliver in 2 weeks or is that optimistic?"
Treasure
Two weeks is realistic because we've already done significant homework. We're not starting from zero. We know your tech stacks, your tooling, your processes, and your goals. Phase 1 is about validating our assumptions with your teams and turning our research into specific implementation plans. The heavy lifting of understanding DaySmart has already been done.
"What if Phase 1 doesn't show us anything we don't already know?"
Wade
If all we delivered was a summary of your current state, that would be a failure and we'd deserve the criticism. Phase 1 delivers the HOW — the specific wiring, the tool configurations, the agent pipeline architecture mapped to your processes. You know what needs to change. We show you exactly how to change it without disrupting what works.
"Who specifically would be working with our teams?"
Treasure + Roxy
Richard leads the AI architecture and tool integration. Wade brings the agentic workflow expertise — he builds 2-3 agents a week and has done this for AT&T and healthcare clients. Treasure is your day-to-day point of contact for execution. Roxy stays involved at the relationship level. These are senior people, not junior consultants. You'll be working directly with the people on this call.
"How is this different from what Accenture or Deloitte would pitch?"
Richard
They'd send you a 100-page assessment over 3 months with a team of 20 juniors. We're four senior practitioners who embed with your teams for 2 weeks and deliver an implementation playbook you can execute immediately. We don't sell frameworks — we build working systems. And we built our own pitch using the exact tools and approach we're proposing for you. No big firm is doing that.