Playbooks
Real strategies from real use cases. Each one comes with runnable code, live API endpoints, and zero hand-waving.
16 playbooks
I Built a Tool That Scores Congressional Trades for Suspicious Timing — and Found Some Interesting Patterns
Cross-reference congressional stock trades with earnings dates, SEC filings, and lobbying disclosures to score trades on 'suspiciousness' using one API.
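The core of a timing score is simple: the closer a trade lands to a market-moving event, the higher it scores. Here is a minimal, SDK-independent sketch of that idea — the function name, 30-day window, and linear decay are illustrative assumptions, not the playbook's actual model:

```python
from datetime import date

def timing_score(trade_date: date, event_dates: list[date], window: int = 30) -> float:
    """Score a trade 0..1 by proximity to the nearest upcoming event.

    A trade the day before an earnings date or filing scores near 1.0;
    a trade more than `window` days out scores 0.0. (Illustrative
    linear decay — the playbook's scoring may differ.)
    """
    gaps = [(e - trade_date).days for e in event_dates if e >= trade_date]
    if not gaps:
        return 0.0
    nearest = min(gaps)
    if nearest > window:
        return 0.0
    return round(1.0 - nearest / window, 3)

# A purchase 3 days before an earnings date scores high:
score = timing_score(date(2024, 1, 12), [date(2024, 1, 15), date(2024, 3, 1)])
# score -> 0.9
```

In the full playbook, the event dates come from earnings calendars, SEC filings, and lobbying disclosures pulled through the API; the scoring logic itself stays this small.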
I Built a 'What Could You Have Known?' Tool That Pulls Every Signal Before Any Market Event
For any ticker and date, pull all available signals from congressional trades, insider selling, SEC filings, earnings, news, Reddit, lobbying, and patents to reconstruct the pre-event information landscape.
I Tracked How a Topic Goes From arXiv Paper to Reddit Hype to Market Mover — in One API Call Per Source
Map the information cascade from academic papers through YouTube, Reddit, and news to understand where a narrative sits in its lifecycle.
I Connected Weather Forecasts to Earnings Calls and Found Signals Nobody Else Was Watching
Combine NWS severe weather alerts with company earnings, news, and SEC filings to identify weather-exposed stocks before the market reacts.
Alt Data Pulse: How Google Trends Predicts Market Moves
Cross-reference Google Trends with insider trading, congressional activity, and SEC filings to detect market-moving signals before they hit the tape.
I Built a Narrative Radar That Told Me About the AI Agent Meta 3 Weeks Before CT Caught On
How to use YouTube transcripts, Reddit, and podcast data to detect crypto narrative shifts before they show up on Crypto Twitter.
I Cross-Referenced Lobbying Disclosures, Congressional Trades, and Earnings Calls to Find a Story Nobody Else Was Covering
How to use one API to connect lobbying filings, congressional stock trading, and corporate earnings transcripts into an investigative lead pipeline.
I Built a Picks Aggregator That Finds Consensus Across Reddit and Podcasts Before Tip-Off
Scan Reddit POTD threads and betting podcasts for specific game picks, extract structured picks with AI, and surface consensus plays with confidence scores — all before the lines move.
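Once picks are extracted into structured form, consensus detection is a counting problem. A minimal sketch, independent of the Trawl SDK — the input shape, the two-source threshold, and the "share of sources" confidence measure are illustrative assumptions:

```python
from collections import Counter

def consensus_picks(picks: list[tuple[str, str]], min_sources: int = 2) -> list[tuple[str, float]]:
    """Surface picks backed by multiple independent sources.

    `picks` is (source, pick) pairs; confidence is the share of distinct
    sources backing each pick. Repeats from one source count once.
    """
    distinct = {(source, pick) for source, pick in picks}
    counts = Counter(pick for _, pick in distinct)
    n_sources = len({source for source, _ in picks})
    return sorted(
        ((pick, round(c / n_sources, 2)) for pick, c in counts.items() if c >= min_sources),
        key=lambda item: -item[1],
    )

plays = consensus_picks([
    ("r/sportsbook", "Lakers -3.5"),
    ("podcast_a", "Lakers -3.5"),
    ("podcast_b", "Celtics ML"),
    ("podcast_a", "Lakers -3.5"),  # duplicate from the same source
])
# plays -> [("Lakers -3.5", 0.67)]
```

The playbook's AI extraction step produces the (source, pick) pairs; this aggregation is the last mile before surfacing consensus plays.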
I Built a Working Product in a Weekend Because I Didn't Have to Write a Single Scraper
How a college student built a content monitoring dashboard in 48 hours using Trawl instead of spending weeks learning Selenium, Puppeteer, and API authentication.
I Do More Due Diligence in 2 Hours Than Our Associates Do in 2 Weeks
How a VC analyst uses one API to cross-reference earnings calls, SEC filings, news sentiment, patents, and congressional trading for deal evaluation.
Explore Alternative Data for Quant Trading with One API
Pull earnings calls, SEC filings, news, Reddit posts, and papers through one API. Interactive demo included.
I Built a Compliance Monitor That Tracks Any Stock Across 4 Data Sources — In 50 Lines of Python
How one API replaced 500 lines of web scraping code. SEC filings, news, YouTube analyst transcripts, and AI entity extraction — all from pip install trawl-sdk.
I Built a Prediction Market Trading Bot That Trawls 5 Data Sources for Geopolitical Intelligence
552K characters from news, YouTube transcripts, podcasts, SEC filings, and academic papers in 42 API calls. One Python package.
How to Build a YouTube RAG Pipeline in 10 Minutes
Step-by-step guide to building a RAG pipeline that searches YouTube, extracts transcripts, chunks them for embeddings, and loads into a vector database. Zero quotas, zero friction.
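The chunking step in the middle of that pipeline is SDK-independent and worth seeing in isolation. A minimal sketch — the 800-character size and 100-character overlap are illustrative defaults, not the guide's prescribed values:

```python
def chunk_transcript(text: str, size: int = 800, overlap: int = 100) -> list[str]:
    """Split a transcript into overlapping character chunks for embedding.

    The overlap keeps sentences that straddle a chunk boundary
    retrievable from both neighboring chunks.
    """
    if overlap >= size:
        raise ValueError("overlap must be smaller than chunk size")
    chunks, start = [], 0
    while start < len(text):
        chunks.append(text[start : start + size])
        start += size - overlap
    return chunks

chunks = chunk_transcript("word " * 1000)  # ~5,000 characters -> 8 chunks
```

Each chunk then gets embedded and loaded into the vector database; the transcript text itself comes from the YouTube search and extraction calls earlier in the pipeline.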
Giving Claude Access to YouTube Transcripts via MCP
How I set up Trawl's MCP server so Claude can search YouTube, extract transcripts, and analyze video content mid-conversation.
How I Transcribed 50 Podcast Episodes for a RAG Pipeline
Building a podcast transcript pipeline with Trawl — search 4M+ shows, resolve Apple/Spotify links, and extract full transcripts for downstream AI.