Questbook Grant Proposal Guide
Based on an analysis of 100 approved and 100 rejected real applications (data: March 2026)
Overview
Questbook is the leading web3 grant management platform. Major ecosystems using it include Arbitrum, Compound, Polygon, Uniswap, and others. Grants range from $5K tooling projects to $150K protocol builds.
Current data coverage: Primarily Arbitrum grants (Developer Tooling, New Protocols, Education, Gaming, Stylus Sprint).
Ecosystem Approval Rates
| Ecosystem | Approved | Rejected | Rate |
|---|---|---|---|
| Developer Tooling on Arbitrum One and Stylus 3.0 | 25 | 40 | 38% |
| Arbitrum New Protocols and Ideas 3.0 | 13 | 33 | 28% |
| Arbitrum Education, Community Growth and Events 3.0 | 16 | 16 | 50% |
| Arbitrum Gaming 3.0 | 14 | 0 | 100% |
| Arbitrum - Orbit domain | 5 | 9 | 36% |
| Arbitrum Stylus Sprint | 10 | 0 | 100% |
| Final Grantees | 5 | 0 | 100% |
| Compound : Dapps and Ideas Domain | 3 | 0 | 100% |
| iExec Developer Rewards Program | 2 | 0 | 100% |
| Community Events | 2 | 0 | 100% |
| comps | 1 | 1 | 50% |
| AngelHack x Polygon Community Grants Program | 1 | 0 | 100% |
| Open Strategic Missions | 1 | 0 | 100% |
| Compound : Multichain and Cross Chain Domain | 1 | 0 | 100% |
| Compound dapps and protocol ideas (CGP 2.0) | 1 | 0 | 100% |
Key insight: Arbitrum's approval rate varies widely by domain. Curated programs (Gaming 3.0, Stylus Sprint) run near 100%, while competitive open programs (Developer Tooling, New Protocols) run 28–38%.
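The table's rates are simple ratios of approved applications to total reviewed. A quick sketch, with counts copied from the table above and program names abbreviated for brevity:

```python
# Approval rate = approved / (approved + rejected), per the table above.
programs = {
    "Developer Tooling 3.0": (25, 40),
    "New Protocols and Ideas 3.0": (13, 33),
    "Education, Community Growth and Events 3.0": (16, 16),
    "Orbit domain": (5, 9),
}

for name, (approved, rejected) in programs.items():
    rate = approved / (approved + rejected)
    print(f"{name}: {rate:.0%}")
```

The spread between 28% (New Protocols) and 50% (Education) is the practical takeaway: the same proposal quality bar buys very different odds depending on the domain.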
What Approved Proposals Have in Common
Field Completion Rate (Approved Applications)
- projectdetails: 100% of approved proposals
- projectname: 100% of approved proposals
- teammembers: 100% of approved proposals
- what innovation or value will your project bring to arbitrum? what previously un: 70% of approved proposals
- what is the current stage of your project: 70% of approved proposals
- do you have a target audience? if so, which one: 59% of approved proposals
- website: 56% of approved proposals
- please provide a detailed breakdown of the budget in term of utilizations, costs: 56% of approved proposals
- provide a list of the milestones, with the usd amount of the grant associated to: 56% of approved proposals
- are milestones clearly defined, time-bound, and measurable with quantitative met: 56% of approved proposals
Rule #1: Fill every field. Approved proposals treat every question — even optional ones — as an opportunity to demonstrate depth.
The Anatomy of a Winning Proposal
1. Project Description (customField6 / "What is your idea")
What approved proposals do:
- Open with a one-sentence problem statement naming the specific pain point
- Explain the technical solution with enough depth to show feasibility
- Explicitly name which Arbitrum features/protocols it leverages
- Quantify the gap: "X developers face Y problem, existing tools solve Z% of it"
Example pattern from approved applications:
"[Tool] is a specialized [type] for Arbitrum [technology] that provides [specific capability]. Developers input [specific input] to receive: (1) [output 1], (2) [output 2], (3) [output 3]."
2. Milestones (customField12)
Structure that gets approved — 3–5 milestones, each containing:
- Milestone name + amount (USD)
- Deliverables: specific repo, deployed address, documentation URL
- Completion criteria: testable, not subjective
- Timeline: "4 weeks" not "Month 1"
- Partial payment: tied to milestone completion, not project end
Example from a real approved proposal:
"Milestone 1: WASM Analysis Engine — Amount: $14,000 Deliverables: Functional WASM binary parser that validates contract size, calculates deployment costs using live Arbitrum gas prices, and provides size optimization recommendations. KPI: Successfully parse 100% of valid WASM contracts, <200ms analysis time."
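One way to keep this structure honest is to draft each milestone as structured data before writing the prose. A minimal sketch in Python; the field names are this guide's suggestion (not a Questbook schema), and the figures mirror the example above:

```python
from dataclasses import dataclass

@dataclass
class Milestone:
    name: str
    amount_usd: int
    deliverables: list[str]    # concrete artifacts: repo, deployed address, docs URL
    completion_criteria: str   # testable, not subjective
    duration_weeks: int        # "4 weeks", not "Month 1"

milestones = [
    Milestone(
        name="WASM Analysis Engine",
        amount_usd=14_000,
        deliverables=["WASM binary parser", "deployment cost calculator",
                      "size optimization warnings"],
        completion_criteria="Parses 100% of valid WASM contracts in <200ms",
        duration_weeks=6,
    ),
    Milestone(
        name="Function Signature Collision Analyzer",
        amount_usd=7_000,
        deliverables=["4-byte selector collision detection tool"],
        completion_criteria="Flags selector collisions with security warnings",
        duration_weeks=4,
    ),
]

# Sanity check: milestone amounts must sum to the requested grant total.
requested_total = 21_000
assert sum(m.amount_usd for m in milestones) == requested_total
```

If a milestone can't be expressed this way (no testable criterion, no dollar amount, no duration), that is usually the weak-milestone pattern reviewers reject.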
3. Budget Breakdown (customField11)
Approved format — itemized by component:
- Component name: amount
- Justification: hours × rate or fixed cost with market rationale
- Include: dev hours, infrastructure, audits, documentation
Example:
"- WASM Parser & Analysis Engine: $14,000 — Binary parsing system (280 hrs × $50/hr)
- Code Generator: $7,000 — Contract interaction layer
- CLI Interface: $5,000 — User-facing tooling"
Typical approved ranges:
- Small tooling/scripts: $5K–$20K
- SDK/library: $20K–$60K
- Full dApp/protocol: $50K–$150K
- Research + implementation: $25K–$80K
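The hours × rate arithmetic should check out line by line, because reviewers do the multiplication. A small illustrative check, using the figures from the example above (fixed-cost items marked with `hours=None`):

```python
# Each line item: (label, hours, hourly_rate, quoted_amount).
# hours=None marks a fixed cost with a market-rationale justification instead.
line_items = [
    ("WASM Parser & Analysis Engine", 280, 50, 14_000),
    ("Code Generator", None, None, 7_000),
    ("CLI Interface", None, None, 5_000),
]

total = 0
for label, hours, rate, quoted in line_items:
    if hours is not None and hours * rate != quoted:
        print(f"MISMATCH on {label}: {hours} x ${rate} = ${hours * rate:,}, quoted ${quoted:,}")
    total += quoted
print(f"Total requested: ${total:,}")
```

A budget whose line items don't sum to the requested amount, or whose hours × rate doesn't match the quoted figure, signals carelessness before a reviewer reads a single deliverable.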
4. Deliverables (customField7)
List specific artifacts:
- GitHub repo URL (can be placeholder, but name the org)
- Deployed contract address (even testnet)
- Documentation site or README
- Test coverage percentage target
- User/developer metrics (active addresses, integrations)
5. Team Section (customField28)
- Full names or verified pseudonyms with GitHub/Twitter links
- Relevant prior work — link to deployed contracts or repos
- Previous grants with outcomes (even failed ones show experience)
- If solo: acknowledge it and explain why scope is right for one person
6. Ecosystem Alignment (customField8)
This is the "why Arbitrum" section. Reviewers are gatekeepers for ecosystem money — they need to justify approval to their community. Give them the argument:
- Name specific Arbitrum features you leverage (Stylus, Orbit, L2 gas model)
- Explain the multiplier effect: how does your tool help N other developers/projects
- Reference Arbitrum's stated priorities (found in grant program descriptions)
Common Rejection Patterns (from 99 rejection messages)
- Weak milestone structure: 50/99 rejection messages (51%)
- Technical concerns: 50/99 rejection messages (51%)
- Team credibility issues: 49/99 rejection messages (49%)
- Out of ecosystem scope: 39/99 rejection messages (39%)
- Scope too broad: 39/99 rejection messages (39%)
Approved Proposal Examples
Example 1: Analytics and Complementary CLI for Arbitrum Stylus (Developer Tooling on Arbitrum One and Stylus 3.0)
Project Idea: Stylus Analytics Suite is a specialized CLI tool for Arbitrum Stylus that provides WASM contract analysis and code generation. Developers input compiled WASM files and ABIs to receive: (1) instant size validation with deployment warnings, (2) detailed cost calculations for Arbitrum deployment, (3) function signature collision detection with security warnings, (4) auto-generated type-safe integration code for contract interaction, and (5) contract interface exports for Stylus-to-Stylus communicat
Milestones: Milestone 1: WASM Analysis Engine Amount: $14,000 Deliverables: Functional WASM binary parser that validates contract size, calculates deployment costs using live Arbitrum gas prices, and provides size optimization warnings Estimated Completion: 6 weeks from grant approval
Milestone 2: Function Signature Collision Analyzer Amount: $7,000 Deliverables: Analysis tool that detects 4-byte selector co
Budget:
- WASM Parser & Analysis Engine: $14,000 - Binary parsing system for WASM contract analysis, size validation, and deployment cost calculation
- Code Generator for Contract Interaction: $7,000 - Development for type-safe function call generation with proper error handling for Stylus contract interact
Reviewer Feedback (Approved):
Please resubmit your proposal
Example 2: Kaiju Finance (Arbitrum New Protocols and Ideas 3.0)
Project Idea: High-level description: Kaiju is deploying Kaiju v1 on Arbitrum: a payment-native revolving credit protocol that allows users to spend against on-chain collateral while keeping their crypto exposure. Kaiju integrates with existing card programs so they can offer a Kaiju-powered credit line to their current users using their existing rails and UX.
Step 1 - Kaiju v1 MVP launch (closed beta, whitelist cohort; no partner integrations)
Deliverables:
- Kaiju v1 smart contract deployed on testnet and
Milestones: Milestones and Grant Allocation
Step 1: Deploy Kaiju v1 protocol on both testnet & mainnet ($10,000)
Team Cost: $7,000; Marketing: $1,000; Funding initial credit line: $2,000
Deliverables:
- Kaiju v1 smart contract deployed on testnet
- Third-party smart contract audit
- Mainnet deployment on Arbitrum
Step 2: Integrate with 2 payment partners to perform an end-to-end settlement drawdown ($1
Detailed breakdown: Step 1: Deploy Kaiju v1 protocol on both testnet & mainnet
Third-party auditing: $22,000
Step 2: Integrate with a payment partner to perform an end-to-end se
Reviewer Feedback (Approved):
Please change the wallet address
Example 3: Participation Architecture: Governance Data Pipeline & Deterministic Triage Rule (Developer Tooling on Arbitrum One and Stylus 3.0)
Project Idea: “Participation Architecture” — Governance Data Pipe + Deterministic Triage Rules
A modular backend service that:
- Ingests governance items from defined sources (initial focus: Snapshot/Tally-style feeds).
- Normalizes them into a stable schema (proposal metadata, dates, links, category hints, amounts when available).
- Applies a Deterministic Rule Engine that outputs: (3.1) priority_score (0–100), (3.2) labels (treasury, elections, parameter_change, routine_ops, etc.), (3.3) reasons (rule ID
Milestones: Milestone 1 — Pipeline Hardening + Rulebook v1 + API v1 $3,500 — Weeks 1–5
Deliverables:
- Ingestion + normalization into a documented schema
- Deterministic rule engine implemented
- Rulebook v1 (rulebook.yaml + rulebook.md) with versioning
- API endpoints in staging: GET /proposals/feed, GET /proposals/{id}, GET /health
- Swagger + Quickstart draft
Acceptance / KPIs: Reproducible local run via Docker (docke
Budget: Link: https://docs.google.com/spreadsheets/d/1Ipb3nLzsL6yjMkP3T99qK-cSAup0HvsgQzvBeyufZNE/edit?usp=sharing
Total: $6,500 USD
- Engineering (pipeline + rule engine + fatigue index) — $4,900 (~70 hours @ $70/hr): (1.1) ingestion + normalization hardening, (1.2) deterministic rule engine + rulebook + test
Example 4: OAK RESEARCH (Final Grantees)
Project Idea: Private company
Milestones: France
Budget: French audience is our main target as we originated in France, but we aim to expand our international reach in 2025 through various collaborations with projects across the space.
This content targets
Application Checklist
Before submitting:
- All fields filled (no blanks, even on optional questions)
- Problem statement is specific and quantified (not "we will improve...")
- Technical approach names specific stack/libraries/protocols
- 3–5 milestones with concrete deliverables + verifiable completion criteria
- Budget broken down by line item with hour/rate or fixed-cost justification
- Team section has verifiable GitHub/social links
- At least one prior completed project linked
- "Why Arbitrum specifically" answered — not just "L2 benefits"
- KPIs are measurable (numbers, not "improved ecosystem")
- Timeline is realistic: 2–6 months typical, not >12 months
Ecosystem-specific tips (Arbitrum):
- Reference the Stylus/Orbit/gaming ecosystem if relevant
- Mention composability with existing Arbitrum protocols
- Specify if targeting Arbitrum One vs Nova vs Orbit chains
- Include audit plan for any contract work
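Rule #1 ("fill every field") is easy to automate before submitting. A minimal sketch; the field names below are this guide's shorthand for the fields listed earlier, not Questbook's actual form keys:

```python
# Shorthand required-field names (hypothetical, based on the completion-rate
# list earlier in this guide; Questbook's real form keys may differ).
REQUIRED = ["projectdetails", "projectname", "teammembers",
            "milestones", "budget", "website", "target_audience"]

def missing_fields(proposal: dict) -> list[str]:
    """Return required fields that are absent or left blank."""
    return [f for f in REQUIRED
            if not str(proposal.get(f, "")).strip()]

draft = {
    "projectname": "Stylus Analytics Suite",
    "projectdetails": "CLI tool for WASM contract analysis",
    "budget": "",  # blank counts as missing
}
print(missing_fields(draft))
```

Anything this check flags is a field where approved applicants demonstrated depth and rejected applicants left a blank.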
Resources
- Platform: https://questbook.app
- Arbitrum grants: https://arbitrum.questbook.app
- Docs: https://docs.questbook.app
Data: AgentRel analysis of Questbook GraphQL API. 200 applications, March 2026.