# What We've Learned So Far
Building Beet is a journey of constant learning. This page captures the insights, mistakes, and "aha moments" that are shaping how we build and grow.
## Our Learning Philosophy

We believe in:

- Fast feedback loops: Learn quickly, adjust quickly
- Shared knowledge: What one person learns, the whole team benefits from
- Honest retrospectives: Celebrate wins, own mistakes, extract lessons
- Applied learning: Turn insights into better processes and decisions
## Key Lessons by Category

### Product Development

#### Lesson #001: Start Narrow, Go Deep

- What Happened: Initially planned to build events + movies + restaurants + news simultaneously.
- What We Learned: Users were confused about our core value. It's better to dominate one area first.
- How We Changed: Events-first approach with phased expansion.
- Impact: Clearer positioning, faster development, better user experience.
#### Lesson #002: Manual First, Automate Second

- What Happened: Planned to build automated event scraping from day one.
- What We Learned: Manual curation teaches you what good content looks like.
- How We Changed: Hand-pick the first 100+ events, then build automation based on learned patterns.
- Impact: Higher quality content, better understanding of user preferences.
### User Research

#### Lesson #003: Talk to Users Early and Often

- What Happened: Made assumptions about what Indian diaspora users wanted.
- What We Learned: Real user needs were different from our assumptions.
- How We Changed: Weekly user interviews, with feedback collection built into the product.
- Impact: Product-market fit insights, feature prioritization clarity.
#### Lesson #004: Geography Matters More Than We Thought

- What Happened: Assumed users would engage with events anywhere in their region.
- What We Learned: Users primarily care about events within 30-45 minutes of their location.
- How We Changed: Hyper-local focus within the SF Bay Area first.
- Impact: Higher engagement, better venue partnerships.
### Team & Process

#### Lesson #005: Document Decisions in Real-Time

- What Happened: Had to re-discuss the same decisions multiple times.
- What We Learned: Writing down "why" prevents re-litigation and helps new team members.
- How We Changed: Adopted this decision-log process, with immediate documentation.
- Impact: Faster onboarding, consistent decision-making.
#### Lesson #006: Weekly Goals Beat Monthly Targets

- What Happened: Monthly planning led to end-of-month scrambling.
- What We Learned: Weekly cycles create better momentum and course correction opportunities.
- How We Changed: Weekly sprint planning with daily standups.
- Impact: More consistent progress, earlier problem detection.
## Market & Business Insights

### Competition

#### What We've Discovered About Our Space
- Incumbents are complacent: Ticketmaster, Eventbrite treat our niche as afterthought
- Niche players lack execution: Good intent but poor user experience
- Content is king: Better curation beats better technology
- Trust is everything: Users need to believe we'll have accurate, current information
### User Behavior

#### Patterns We're Seeing
- Discovery happens on mobile: 85%+ of event browsing is on phones
- Decision-making is social: Users share events before committing
- Timing matters: Peak engagement is Thursday-Sunday for weekend planning
- Price sensitivity varies: Free events get high engagement, premium events need clear value
### Partnerships

#### What Works in Venue Relationships
- Lead with value: Show them engaged users before asking for integration
- Start small: Begin with listing partnership before booking integration
- Local relationships matter: Personal connections trump cold outreach
- Mutual success metrics: Align on what success looks like for both sides
## Technical Learnings

### Architecture

#### Lesson #007: Simple Infrastructure First

- What Happened: Considered a microservices architecture from the start.
- What We Learned: Complexity without scale is just complexity.
- How We Changed: Monolithic architecture with clear module boundaries.
- Impact: Faster development, easier debugging, lower operational overhead.
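A "monolith with clear module boundaries" can be sketched roughly as below: modules live in one codebase but talk only through each other's public methods, so they stay easy to debug now and could be split out later. All names here (`EventsModule`, `DiscoveryModule`, the `Event` fields) are illustrative assumptions, not Beet's actual code.

```python
from dataclasses import dataclass

@dataclass
class Event:
    id: str
    title: str
    city: str

class EventsModule:
    """Owns event storage; no other module touches its internals."""
    def __init__(self):
        self._events: dict[str, Event] = {}

    def add(self, event: Event) -> None:
        self._events[event.id] = event

    def list_by_city(self, city: str) -> list[Event]:
        return [e for e in self._events.values() if e.city == city]

class DiscoveryModule:
    """Depends only on EventsModule's public interface, never its storage."""
    def __init__(self, events: EventsModule):
        self._events = events

    def feed(self, city: str) -> list[str]:
        return [e.title for e in self._events.list_by_city(city)]

# Wiring happens in one place, which keeps the dependency direction explicit.
events = EventsModule()
events.add(Event("1", "Diwali Night", "San Jose"))
events.add(Event("2", "Holi Fest", "Fremont"))
discovery = DiscoveryModule(events)
```

The point of the boundary is that `DiscoveryModule` could later be extracted into its own service by swapping `EventsModule` for an API client with the same interface.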
#### Lesson #008: Real Users Break Everything

- What Happened: Alpha testing revealed edge cases we never considered.
- What We Learned: No amount of internal testing replaces real user behavior.
- How We Changed: Earlier alpha release, better error handling and monitoring.
- Impact: More robust product, faster bug detection.
### Data & Analytics

#### What We Track vs. What Actually Matters
- Vanity metrics: Total signups, page views
- Reality metrics: Weekly active users, event page to signup conversion
- Leading indicators: Content engagement, user return visits
- Lagging indicators: Revenue, partnerships, retention
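As a rough illustration, the "reality metrics" above can be computed from a raw action log. The log format, action names, and sample data here are hypothetical, not Beet's actual schema:

```python
from datetime import datetime, timedelta

# Hypothetical action log: (user_id, action, timestamp) rows.
log = [
    ("u1", "event_page_view", datetime(2024, 5, 6)),
    ("u1", "signup", datetime(2024, 5, 6)),
    ("u2", "event_page_view", datetime(2024, 5, 7)),
    ("u3", "event_page_view", datetime(2024, 5, 9)),
    ("u3", "signup", datetime(2024, 5, 10)),
]

def weekly_active_users(log, week_start):
    """Distinct users with any action in the 7 days from week_start."""
    week_end = week_start + timedelta(days=7)
    return {user for user, _, ts in log if week_start <= ts < week_end}

def page_to_signup_conversion(log):
    """Share of event-page viewers who also signed up."""
    viewers = {user for user, action, _ in log if action == "event_page_view"}
    signups = {user for user, action, _ in log if action == "signup"}
    return len(viewers & signups) / len(viewers) if viewers else 0.0

wau = weekly_active_users(log, datetime(2024, 5, 6))
conversion = page_to_signup_conversion(log)
```

Total signups or raw page views would be the vanity counterparts; the conversion ratio ties event-page interest to actual activation.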
## Mistakes & Course Corrections

### What Didn't Work

#### Mistake #001: Trying to Launch in 3 Cities Simultaneously

- The Problem: Divided attention, inconsistent content quality, harder to measure success.
- The Fix: SF Bay Area focus first, then expand city by city.
- The Lesson: Geographic focus creates network effects and operational efficiency.
#### Mistake #002: Building Features Before Validating Need

- The Problem: Built an event rating system before users were even discovering events consistently.
- The Fix: Focus on the core user journey first, add bells and whistles later.
- The Lesson: Solve the main problem completely before adding secondary features.
#### Mistake #003: Underestimating Content Curation Effort

- The Problem: Thought we could automate event data collection immediately.
- The Fix: Manual curation process with gradual automation.
- The Lesson: Understanding your content deeply is a prerequisite to automating it.
## Future Learning Priorities

### What We Need to Figure Out

Short-term (next 3 months):

- Optimal event curation workflow
- User onboarding conversion optimization
- Partnership negotiation best practices
- Mobile app vs. web engagement patterns
Medium-term (next 6 months):

- Multi-city expansion playbook
- Revenue model optimization
- Team scaling and culture maintenance
- Competitive response strategies
Long-term (next 12 months):

- Super-app feature integration
- Enterprise/B2B market approach
- International expansion considerations
- Platform and ecosystem development
## How We Keep Learning

### Regular Learning Rituals

Weekly:

- Team retrospectives after each sprint
- User feedback review and categorization
- Competitive intelligence updates
- Metric reviews and trend analysis
Monthly:

- Deep-dive lessons-learned sessions
- Process improvement workshops
- Market research and user interview synthesis
- Partnership and business development learnings
Quarterly:

- Strategic assumption validation
- Major decision reviews and outcomes assessment
- Team skill development planning
- Market positioning and messaging refinement
### Learning Artifacts

We create:

- Decision documentation (this section!)
- User research synthesis reports
- Competitive analysis updates
- Process improvement recommendations
- Success and failure case studies
We share:

- Weekly team learning highlights
- Monthly all-hands lessons learned
- Quarterly board/advisor insights
- Industry conference presentations
## Quick Reference

### Top 5 Lessons So Far
1. Start narrow, go deep - Better to own one thing than be mediocre at many
2. Manual first, automate second - Understand the problem before building the solution
3. Geography matters - Local focus creates stronger network effects
4. Users break everything - Real usage reveals problems internal testing misses
5. Document decisions immediately - Prevents re-litigation and helps team alignment
### Current Learning Focus
- User onboarding optimization
- Event curation efficiency
- Partnership development
- Revenue model validation
### Archived Learnings
- Initial market assumptions (mostly wrong!)
- Technology stack decisions (mostly right!)
- Launch strategy approaches (evolved significantly)
Owner: Entire team contributes; Ranga Reddy synthesizes.

Update Cycle: Weekly new insights, monthly deep synthesis.
Learning faster than the competition is our sustainable advantage.