AI Prompting Standardization Gap
Blake Oliver
DURABLE
Demand: Documented
Speaker explicitly describes people paying or seeking this.
Buildability: Needs New Concept
One new concept needed — The technical components exist (Process Street, Zapier, OpenAI API) but need to be packaged as a turnkey solution for accounting firms without integration capabilities.
Solution Status: Partial
Something exists but has a gap: Blake demonstrates a working solution, but it requires technical integration expertise that most firms lack.
Problem Statement
Teams use AI inconsistently with varying prompts, making quality assurance impossible and creating liability exposure. There is no systematic way to ensure prompt consistency across team members, or to track which inputs generated which outputs for client work.
Job to Be Done
Give me confidence that every team member is using the exact same AI prompts for client work, with full audit trails of inputs and outputs.
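A minimal sketch of what that job implies in code: a firm-wide prompt registry plus an append-only audit log, so every run of a task uses the one approved prompt and leaves a traceable record. All names here (`PROMPTS`, `run_standard_prompt`, the task key) are illustrative assumptions, not Blake's actual implementation; the LLM call is injected as a callable so the governance logic stands alone.

```python
import datetime
import hashlib

# Hypothetical firm-wide prompt registry: one approved prompt per task.
PROMPTS = {
    "client_financials_summary": (
        "Summarize the attached financial statements. "
        "Flag any balances that changed more than 10% year over year."
    ),
}

# Append-only audit trail of every AI run (inputs, prompt, output).
AUDIT_LOG = []

def run_standard_prompt(task, document_text, complete):
    """Run the one approved prompt for `task` and record the run.

    `complete` is any callable (prompt, document) -> str that sends the
    request to an LLM; injecting it keeps this audit logic testable
    without a live API call.
    """
    prompt = PROMPTS[task]  # unknown tasks raise KeyError: no ad-hoc prompts
    output = complete(prompt, document_text)
    AUDIT_LOG.append({
        "timestamp": datetime.datetime.now(datetime.timezone.utc).isoformat(),
        "task": task,
        "prompt_sha256": hashlib.sha256(prompt.encode()).hexdigest(),
        "input_sha256": hashlib.sha256(document_text.encode()).hexdigest(),
        "output": output,
    })
    return output
```

Hashing the prompt and input (rather than storing client documents verbatim) is one way to keep an audit trail without duplicating sensitive financial data in the log; a real deployment would need to weigh that against regulators wanting the full inputs.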
Assessment
Helmer Power
Switching costs (workflow integration creates stickiness)
Network effects (standardized prompts improve with usage)
Lenses Triggered
Variable Cost to Zero
Human Behavior Constant
Contrarian Signal
Variable Cost
Current model: each team member crafts individual prompts with variable quality. Systematic prompting collapses prompt development cost to zero marginal cost per use.
Why This Is Durable
Quality control and standardization challenges are permanent features of professional services. As AI adoption scales, the gap between ad-hoc usage and systematic implementation becomes a structural business risk.
Solution Gap
Blake demonstrates a working solution, but it requires technical integration expertise that most firms lack.
Demand Evidence
Blake describes this as his implemented solution to a problem he actively experienced — ungoverned AI usage making quality control impossible.
Human Behavior Insight
People optimize for immediate task completion over systematic process adherence when tools are informal and accessible.
Paradigm Challenge
AI tools should be available for individual team member experimentation and learning.
Source Quote
"it's really hard to do quality assurance on the outputs if you don't know what people are using for the prompts to begin with and the only way that I found to ensure that people are doing the same thing is to automate the prompting"
Broad Tags
manual_process_ripe_for_automation
Blake explicitly describes teams doing 'willy-nilly' AI usage with copy-paste workflows that should be systematized through workflow automation.
institutional_buyer_unfulfilled
Professional service firms need systematic AI implementation but current solutions require technical expertise most firms don't possess.
incentive_misalignment
Individual team members optimize for speed (quick ChatGPT queries) while firm needs standardization and quality control — these objectives conflict without systematic infrastructure.
Specific Tags (structural patterns for cross-referencing)
quality_assurance_impossible_without_standardization
ai_prompting_as_ungoverned_individual_activity
professional_liability_from_inconsistent_ai_usage
audit_trail_nonexistent_for_client_deliverables
team_members_optimize_speed_over_consistency
copy_paste_workflow_creates_variance
systematic_prompting_infrastructure_missing
technical_integration_expertise_barrier
client_data_governance_ad_hoc
prompt_refinement_trapped_in_individual_practice
Constraints Blocking Progress
⚙
TECHNICAL
API integration expertise required
Setting up Process Street + Zapier + OpenAI requires technical knowledge most accounting firms lack.
💰
COST
multiple software subscriptions per user
Blake's solution requires Process Street + Zapier + OpenAI subscriptions, adding $50-100+ per user monthly.
📋
REGULATORY
client data AI usage permissions
Firms must update engagement letters and obtain client consent for AI processing of financial data.
This problem represents the transition from individual AI adoption to institutional AI governance — the exact inflection point where firms either systematize successfully or create operational chaos. Blake is documenting the gap between 'everyone should try AI' (current state) and 'AI should be invisible infrastructure' (required end state).
What makes this especially actionable is Blake's demonstration that the technical components already exist and work together. The constraint isn't technological capability — it's packaging these components into a turnkey solution. Most accounting firms won't build what Blake built, but they would buy it as a packaged service.
The professional liability angle is crucial. As AI becomes standard in professional services, 'we don't govern our AI usage' becomes a malpractice risk, not just an efficiency problem. Blake is essentially describing the compliance infrastructure that will be required, not optional.
[15:20] Blake explains: 'I want to automate this and that's where zapier comes in... I don't want my team to have to go into chat GPT and upload a PDF and then copy and paste a prompt and do multiple prompts like this is not a reliable thing I want to automate this... it's really important if we are going to use AI in our practices that we standardize that we don't just have people using it willy-nilly because it's really hard to do quality assurance on the outputs if you don't know what people are using for the prompts to begin with and the only way that I found to ensure that people are doing the same thing is to automate the prompting.'
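Blake's Zapier setup could be approximated as a fixed, ordered prompt chain fired by a workflow trigger, so no team member ever copies prompts into ChatGPT by hand. This is a sketch under assumptions: the step names, templates, and `run_chain` function are hypothetical stand-ins for whatever his Process Street + Zapier flow actually runs, and the LLM call is again an injected callable.

```python
# Hypothetical fixed prompt chain: each step's output feeds the next,
# mirroring Blake's "multiple prompts" workflow without manual copy-paste.
PROMPT_CHAIN = [
    ("extract", "Extract all account balances from this document:\n{input}"),
    ("analyze", "Identify unusual variances in these balances:\n{input}"),
    ("draft",   "Draft a client-ready summary of these findings:\n{input}"),
]

def run_chain(document_text, complete):
    """Run every step in order; `complete(prompt) -> str` is the LLM call.

    Returns the final output plus a per-step trail for quality assurance,
    so reviewers can see exactly which prompt produced which output.
    """
    current, trail = document_text, []
    for step, template in PROMPT_CHAIN:
        prompt = template.format(input=current)
        current = complete(prompt)
        trail.append({"step": step, "prompt": prompt, "output": current})
    return current, trail
```

Because the chain is data (a list), changing a prompt is a one-line edit that instantly applies to every team member's next run, which is the standardization property Blake is after.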
answer
TRUE
explanation
Professional services always require quality control and standardization. AI makes this need more acute, not less.
findable
TRUE
explanation
Blake explicitly mentions professional liability concerns and client data governance — acute pain points for this exact segment.
specific group
Accounting firms with 10-50 staff implementing AI for client work
acute enough to pay
TRUE
underlying job
Let my team use AI power without creating professional liability exposure or quality variance
not surface task
Surface task is 'train people to use AI better.' Real job is 'systematic quality control for AI outputs.'
claim
Firms should systematize AI usage rather than let teams use it freely
contrarian
TRUE
explanation
Most firms are encouraging individual AI exploration. Blake argues the opposite — systematic control from day one.
structurally sound
TRUE
explanation
Once prompts are systematized in workflow software, switching costs are high. Network effects: as more firms use standardized prompts, shared prompt libraries improve.
helmer powers
Switching costs, Network effects
opens up
Systematic AI integration with full audit trails and consistent outputs
inversion
What if AI prompting happened automatically through workflow triggers?
constraint identified
AI tools must be used manually by each team member
if zero
Every team member uses perfect prompts without development time
who pays
Team members (time) and firm (inconsistent quality)
per unit cost
Prompt development time per team member per use case
collapsible components
Prompt crafting, quality testing, refinement iterations
mechanism
Chemical trails encode successful strategies that all colony members follow automatically, preventing wasteful individual exploration of inferior paths
transferable
TRUE
domain distance
MEDIUM
natural example
Ant colony pheromone standardization — successful foraging paths get reinforced system-wide, not reinvented by each ant
nature solved analogous
TRUE
if parallel
All team members access optimized prompts simultaneously through workflow automation
bottleneck removed
Individual prompt development and testing
sequential assumption
Team members must individually learn AI prompting before firm can benefit
insight
People optimize for immediate task completion over systematic process adherence when tools are informal. This appears in email usage, documentation practices, and now AI adoption.
across eras
TRUE
across domains
TRUE