Prompts matching the #metrics tag
Measure which prompt performs better. Features: 1. Two versions of a prompt (Variant A vs. Variant B). 2. Key metrics: Relevancy Score, Accuracy, Response Speed, Token Usage. 3. A 'Winner' badge awarded only when the difference is statistically significant. 4. User feedback collection tool for manual evaluation. 5. Chart comparing costs over 1,000 runs.
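One way the 'Winner' badge logic could work is sketched below, assuming each run yields a numeric relevancy score per variant; the Welch's t-test, the alpha threshold, and the `pick_winner` helper are illustrative assumptions, not a prescribed implementation.

```python
# Sketch: decide whether Variant A or B can be declared a statistically
# significant winner on relevancy score. Thresholds are illustrative assumptions.
from scipy import stats

def pick_winner(scores_a, scores_b, alpha=0.05):
    """Return 'A', 'B', or None if the difference is not significant."""
    t_stat, p_value = stats.ttest_ind(scores_a, scores_b, equal_var=False)  # Welch's t-test
    if p_value >= alpha:
        return None  # no significant difference -> no Winner badge yet
    mean_a = sum(scores_a) / len(scores_a)
    mean_b = sum(scores_b) / len(scores_b)
    return "A" if mean_a > mean_b else "B"

# Example usage with hypothetical relevancy scores on a 0-1 scale
winner = pick_winner([0.82, 0.79, 0.91, 0.85], [0.74, 0.77, 0.80, 0.72])
print(winner or "No significant winner yet")
```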
Build a comprehensive product metrics dashboard. Key metrics: 1. Acquisition (signups, activation rate). 2. Engagement (DAU/MAU, feature adoption). 3. Retention (cohort analysis, churn rate). 4. Revenue (MRR, ARPU, LTV). 5. Referral (viral coefficient, NPS). Use AARRR (Pirate Metrics) framework. Create separate views for different stakeholders. Include trend lines and targets. Set up automated alerts for anomalies. Use tools like Mixpanel, Amplitude, or custom SQL dashboards.
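A minimal sketch of how two of these metrics (DAU/MAU and monthly churn) could be computed from a raw activity log; the `(user_id, date)` event shape is an assumption for illustration.

```python
# Sketch: compute two of the dashboard metrics above from a simple activity log.
# Field shapes (user_id, date) are illustrative assumptions.
from datetime import date, timedelta

def dau_mau_ratio(activity, as_of):
    """activity: iterable of (user_id, date) events."""
    dau = {u for u, d in activity if d == as_of}
    mau = {u for u, d in activity if as_of - timedelta(days=29) <= d <= as_of}
    return len(dau) / len(mau) if mau else 0.0

def churn_rate(customers_start, customers_lost):
    """Simple monthly churn: customers lost / customers at period start."""
    return customers_lost / customers_start if customers_start else 0.0

activity = [("u1", date(2024, 6, 30)), ("u2", date(2024, 6, 15)), ("u1", date(2024, 6, 2))]
print(dau_mau_ratio(activity, date(2024, 6, 30)))  # 1 DAU / 2 MAU = 0.5
print(churn_rate(200, 10))                          # 5% monthly churn
```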
Measure and validate product-market fit. Indicators: 1. Sean Ellis test (>40% of users say they would be 'very disappointed' if the product disappeared). 2. Retention cohorts (curve flattens after the initial drop). 3. Organic growth rate (word-of-mouth, low CAC). 4. NPS (>50 is excellent). 5. Sales cycle length (decreasing over time). 6. Customer feedback themes (solving real pain). Conduct surveys and analyze usage data. If PMF is not achieved, pivot or iterate. Document assumptions and validate continuously.
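A rough sketch of how the Sean Ellis threshold and the retention-flattening check could be scored; the response labels, window size, and tolerance are assumptions for illustration.

```python
# Sketch: score the Sean Ellis test and check whether a retention curve is
# flattening. The 40% threshold follows the indicator above; other values are assumptions.
def sean_ellis_pmf(responses):
    """responses: list of survey answers, e.g. 'very disappointed', 'somewhat disappointed', ..."""
    very = sum(1 for r in responses if r == "very disappointed")
    share = very / len(responses)
    return share, share > 0.40  # >40% suggests product-market fit

def retention_flattening(cohort_retention, window=3, tolerance=0.03):
    """cohort_retention: retention rates by period, e.g. [1.0, 0.55, 0.41, 0.39, 0.38].
    Flat if the last `window` periods each change by less than `tolerance`."""
    tail = cohort_retention[-window:]
    return all(abs(a - b) < tolerance for a, b in zip(tail, tail[1:]))

share, has_pmf = sean_ellis_pmf(["very disappointed"] * 45 + ["somewhat disappointed"] * 55)
print(share, has_pmf)                                         # 0.45 True
print(retention_flattening([1.0, 0.55, 0.41, 0.39, 0.38]))    # True
```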
Implement an NPS program. Process: 1. Survey timing (post-interaction or periodic). 2. Question: 'How likely are you to recommend us, on a scale of 0-10?' 3. Categorize: Promoters (9-10), Passives (7-8), Detractors (0-6). 4. Calculate: %Promoters - %Detractors. 5. Follow-up questions for context. 6. Close the loop with respondents. 7. Root cause analysis. 8. Track trends over time. Use as an ongoing measure of customer sentiment. Benchmarks vary by industry. Focus on improving the score by addressing detractors.
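The calculation in step 4 is simple enough to sketch directly; the `nps` helper below is illustrative, assuming raw 0-10 scores are available.

```python
# Sketch: compute NPS from raw 0-10 survey scores (step 4 above).
def nps(scores):
    promoters = sum(1 for s in scores if s >= 9)   # 9-10
    detractors = sum(1 for s in scores if s <= 6)  # 0-6
    return 100 * (promoters - detractors) / len(scores)

# 50 promoters, 30 passives, 20 detractors -> NPS of +30
print(nps([9] * 50 + [7] * 30 + [5] * 20))  # 30.0
```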
Set up comprehensive funnel analytics to optimize conversion. Define key funnels: 1. Acquisition: landing page → signup → activation. 2. Conversion: trial start → paid conversion. 3. Engagement: login → core action → return visit. Track events: use event-based analytics (Amplitude, Mixpanel) not just pageviews. Event properties: user_id, timestamp, device, traffic source, feature variant. Conversion benchmarks: signup to activation 20-40%, trial to paid 15-25%, varies by industry. Analysis techniques: cohort analysis (retention over time), segmentation (power users vs. casual), funnel drop-off identification. Actionable insights: if 60% drop from signup to first use, focus on onboarding. A/B testing: experiment with different funnel steps. Reporting: weekly dashboards, monthly deep-dives, quarterly strategy reviews.
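A minimal sketch of funnel drop-off analysis over raw events, assuming each event is a `(user_id, event_name)` pair; the step names mirror the acquisition funnel above and are not tied to any specific analytics tool.

```python
# Sketch: step-to-step conversion and drop-off for a funnel, computed from raw
# events. Event names and the (user_id, event_name) shape are illustrative assumptions.
from collections import defaultdict

FUNNEL = ["landing_view", "signup", "activation"]

def funnel_report(events):
    users_by_step = defaultdict(set)
    for user_id, name in events:
        users_by_step[name].add(user_id)
    report = []
    for prev, curr in zip(FUNNEL, FUNNEL[1:]):
        entered = len(users_by_step[prev])
        converted = len(users_by_step[curr] & users_by_step[prev])
        rate = converted / entered if entered else 0.0
        report.append((prev, curr, rate, 1 - rate))  # (from, to, conversion, drop-off)
    return report

events = [("u1", "landing_view"), ("u1", "signup"), ("u2", "landing_view"),
          ("u3", "landing_view"), ("u3", "signup"), ("u3", "activation")]
for step in funnel_report(events):
    print(step)  # landing_view -> signup converts ~67%; a large drop-off here points at onboarding
```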
Set sales quotas that are fair, achievable, and appropriately stretching. Inputs: 1. Company revenue goal. 2. Number of reps. 3. Historical attainment (what % hit quota). 4. Market capacity (total addressable market, saturation). 5. Sales cycle length. 6. Average deal size. Bottom-up calculation: the company needs $10M revenue with 10 reps, so the baseline individual quota is $1M each (100% coverage); adding a buffer for 80% expected attainment gives $1M ÷ 0.8 = $1.25M quota per rep. Ramping: new reps 0% months 1-2, 50% month 3, 75% month 4, 100% month 5+. Segmentation: enterprise reps carry higher quotas ($2M), SMB reps lower ($750k). Validation: sanity-check against outside data (SaaS benchmarks show companies of similar size set $1-1.5M quotas). Adjustments: ramp-up time, territory quality, available leads. Review quarterly: if <60% of reps hit quota, quotas are too high; if >90% hit, quotas are too low. Sweet spot: 60-70% attainment rate. Communicate transparently: show the math, the rationale, and an FAQ doc.
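The bottom-up math and ramp schedule above can be sketched as follows; the helper names and the exact ramp mapping are illustrative assumptions.

```python
# Sketch of the bottom-up quota math described above. Numbers mirror the worked
# example; the ramp schedule is the one listed (0/0/50/75/100%).
def per_rep_quota(revenue_goal, num_reps, expected_attainment=0.8):
    return revenue_goal / num_reps / expected_attainment

RAMP = {1: 0.0, 2: 0.0, 3: 0.5, 4: 0.75}  # month on the job -> share of full quota; 100% from month 5

def ramped_quota(full_quota, month_on_job):
    return full_quota * RAMP.get(month_on_job, 1.0)

full = per_rep_quota(10_000_000, 10)   # $10M goal, 10 reps, 80% attainment buffer
print(full)                             # 1250000.0 -> $1.25M per rep
print(ramped_quota(full, 3))            # month 3 at 50% -> 625000.0
```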
Calculate and optimize CAC. Formula: Total Sales & Marketing Costs ÷ Number of New Customers. Best practices: 1. Segment by channel. 2. Include all costs (tools, salaries, ads). 3. Track cohorts over time. 4. Compare to LTV (the LTV:CAC ratio should be at least 3:1). 5. Payback period (ideally under 12 months). 6. Optimize or cut high-CAC channels. 7. Increase conversion rates. 8. Improve retention, which lowers the effective CAC needed to sustain growth.
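A small sketch of the CAC, LTV:CAC, and payback-period checks, assuming monthly gross margin per customer is known; the numbers are illustrative.

```python
# Sketch of the CAC, LTV:CAC, and payback-period checks listed above.
# Input figures are illustrative assumptions.
def cac(sales_marketing_costs, new_customers):
    return sales_marketing_costs / new_customers

def ltv_cac_ratio(ltv, customer_acquisition_cost):
    return ltv / customer_acquisition_cost

def payback_months(customer_acquisition_cost, monthly_gross_margin_per_customer):
    return customer_acquisition_cost / monthly_gross_margin_per_customer

acquisition_cost = cac(50_000, 125)             # $50k spend, 125 new customers -> $400
print(acquisition_cost)                          # 400.0
print(ltv_cac_ratio(1_200, acquisition_cost))    # 3.0 -> meets the 3:1 guideline
print(payback_months(acquisition_cost, 40))      # 10 months -> under the 12-month target
```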
Set and track Objectives and Key Results for product success. OKR structure: Objective (qualitative goal) + 3-5 Key Results (quantitative outcomes). Example: Objective: 'Improve user onboarding experience.' Key Results: 1. Increase DAU/MAU ratio from 15% to 25%. 2. Reduce time-to-first-value from 7 days to 3 days. 3. Achieve 70% completion rate for onboarding flow. Quarterly cycle: 1. Set OKRs at quarter start (team input + leadership alignment). 2. Weekly check-ins on progress. 3. Monthly OKR reviews with adjustments if needed. 4. Quarterly retrospective and grading (0-1.0 scale, 0.7 is good). Dashboard setup: automated tracking where possible, manual updates weekly. Leading vs. lagging indicators: track both activity metrics (features shipped) and outcome metrics (user satisfaction). Transparency: share OKRs across company for alignment.
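One possible way to grade key results on the 0-1.0 scale described above, assuming each KR has a start value, a target, and an actual; averaging KR scores into an objective grade is a common convention, not the only one.

```python
# Sketch: grade key results on the 0-1.0 scale described above. Each KR is scored
# by progress from its starting value toward its target; averaging is an assumption.
def kr_score(start, target, actual):
    if target == start:
        return 1.0
    return max(0.0, min(1.0, (actual - start) / (target - start)))

def objective_grade(key_results):
    """key_results: list of (start, target, actual) tuples."""
    return sum(kr_score(*kr) for kr in key_results) / len(key_results)

# The onboarding objective from the example: DAU/MAU 15%->25%, time-to-first-value 7->3 days,
# onboarding completion 0%->70%
grade = objective_grade([(15, 25, 22), (7, 3, 4), (0, 70, 55)])
print(round(grade, 2))  # ~0.75 -> at or above the 0.7 'good' bar
```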
Calculate customer lifetime value. Formula: Average Order Value × Purchase Frequency × Customer Lifespan. Approaches: 1. Historical (actual data). 2. Predictive (cohort analysis). 3. Segment by customer type. 4. Factor in churn rate. 5. Include gross margin. 6. Discount future cash flows (NPV). 7. Track LTV trends. 8. Optimize to increase LTV. LTV:CAC ratio crucial for sustainability. Aim to maximize LTV through retention.
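A sketch of the LTV formula with the gross-margin and NPV adjustments applied on an annual basis; the yearly discounting granularity is a simplifying assumption.

```python
# Sketch of the LTV formula above, with the gross-margin and discounting
# adjustments applied annually. This is an illustrative simplification, not a standard API.
def ltv(avg_order_value, purchases_per_year, lifespan_years,
        gross_margin=1.0, annual_discount_rate=0.0):
    total = 0.0
    for year in range(1, int(lifespan_years) + 1):
        yearly_profit = avg_order_value * purchases_per_year * gross_margin
        total += yearly_profit / (1 + annual_discount_rate) ** year  # NPV of that year's profit
    return total

# $100 AOV, 4 purchases/year, 3-year lifespan, 70% gross margin, 10% discount rate
print(round(ltv(100, 4, 3, gross_margin=0.7, annual_discount_rate=0.10)))  # ~696
```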
Build comprehensive analytics infrastructure for data-driven decisions. Analytics architecture: 1. Data collection: event tracking, user interactions, system metrics. 2. Data pipeline: ETL processes, data validation, transformation. 3. Data warehouse: centralized storage, dimensional modeling. 4. Business intelligence: dashboards, reports, self-service analytics. Key metrics framework: 1. Acquisition: traffic sources, conversion rates, cost per acquisition. 2. Activation: onboarding completion, time-to-first-value, feature adoption. 3. Retention: DAU/MAU, cohort retention, churn analysis. 4. Revenue: ARPU, LTV, conversion rates, expansion revenue. 5. Referral: viral coefficient, NPS, organic growth rate. Reporting strategy: 1. Executive dashboards: KPIs, trends, alerts. 2. Product dashboards: feature usage, user flows, experimentation results. 3. Operational reports: performance monitoring, error tracking. Tools: Segment for data collection, Snowflake for warehousing, Tableau for visualization. Data governance: quality monitoring, access controls, privacy compliance, documentation standards.
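A minimal sketch of the kind of event record the collection layer above might produce; the field names are illustrative and do not reflect Segment's actual schema.

```python
# Sketch: a minimal event-collection payload of the kind the pipeline above would
# ingest. Field names are illustrative assumptions, not any vendor's real schema.
import json
from datetime import datetime, timezone

def track(user_id, event, properties=None):
    """Build a validated event record ready to hand to the data pipeline."""
    record = {
        "user_id": user_id,
        "event": event,
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "properties": properties or {},
    }
    if not record["user_id"] or not record["event"]:
        raise ValueError("user_id and event are required")  # basic data-validation step
    return json.dumps(record)

print(track("u_123", "feature_adopted", {"feature": "dashboard_export", "plan": "pro"}))
```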