# Real-Time Content Aggregation Platform
Next.js 14 application aggregating 50+ content sources with AI-powered analysis and personalized feeds
## Overview
Built a Twitter/X-style content aggregation platform that pulls from 50+ YouTube channels, 15+ Bluesky accounts, 10+ Reddit subreddits, and major news RSS feeds. Features AI-powered content analysis, bookmarking, and personalized feed curation.
## The Problem
Modern content consumption is fragmented across platforms. You check YouTube, then Twitter, then Reddit, then news sites. Each platform optimizes for engagement, not information quality. There's no unified way to consume content from diverse sources with intelligent filtering.
## Architecture Decisions

### Why Next.js 14 App Router?

- Server components fetch RSS on the server, so browser CORS restrictions never apply
- Streaming for a fast initial paint
- Built-in API routes for the Claude integration
- Seamless Vercel deployment
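As a sketch of the server-side fetching point above, here is a minimal, hypothetical RSS fetch-and-parse helper. The regex-based `parseRss` is for illustration only (a production build would use a real XML parser), and the `revalidate: 300` option is Next.js's built-in fetch caching:

```typescript
interface RssItem {
  title: string;
  link: string;
  pubDate: string;
}

// Minimal regex-based RSS parser (illustrative; not robust to CDATA etc.)
export function parseRss(xml: string): RssItem[] {
  const items: RssItem[] = [];
  const blocks = xml.match(/<item>[\s\S]*?<\/item>/g) ?? [];
  for (const block of blocks) {
    const pick = (tag: string) =>
      (block.match(new RegExp(`<${tag}>([\\s\\S]*?)</${tag}>`)) ?? [])[1]?.trim() ?? "";
    items.push({ title: pick("title"), link: pick("link"), pubDate: pick("pubDate") });
  }
  return items;
}

// Called from a Server Component or route handler, this runs on the server,
// so the browser's CORS policy is never involved.
export async function fetchFeed(url: string): Promise<RssItem[]> {
  // Next.js extends fetch with per-request caching (5-minute TTL here).
  const res = await fetch(url, { next: { revalidate: 300 } } as RequestInit);
  return parseRss(await res.text());
}
```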
### Why Supabase over Firebase?

- PostgreSQL handles the complex queries feed filtering requires
- Row-level security for user data
- Real-time subscriptions built in
- Better developer experience
## Key Features
- Unified Timeline: All content sources in single chronological feed
- AI Reflection: Claude-powered deeper analysis of any post
- Smart Filtering: Category-based filtering (Finance, Tech, Sports, News)
- Bookmarks: Save posts with localStorage persistence
- Two Modes: "Following" (curated) vs "Everything" (firehose)
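The bookmarking feature above can be sketched as a small store. This is a hypothetical implementation: persistence goes through a minimal `Storage`-like interface, so the app can pass `window.localStorage` while tests use an in-memory map; the `"bookmarks"` key is an assumption:

```typescript
interface StringStore {
  getItem(key: string): string | null;
  setItem(key: string, value: string): void;
}

const KEY = "bookmarks"; // assumed storage key

export function createBookmarks(store: StringStore) {
  const load = (): string[] => JSON.parse(store.getItem(KEY) ?? "[]");
  const save = (ids: string[]) => store.setItem(KEY, JSON.stringify(ids));
  return {
    // Toggle a bookmark; returns true if the post is now bookmarked.
    toggle(postId: string): boolean {
      const ids = load();
      const i = ids.indexOf(postId);
      if (i >= 0) ids.splice(i, 1);
      else ids.push(postId);
      save(ids);
      return i < 0;
    },
    has: (postId: string) => load().includes(postId),
    all: () => load(),
  };
}
```

In the browser this would be instantiated as `createBookmarks(window.localStorage)`.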
## Challenges & Solutions
| Challenge | Solution |
|---|---|
| CORS blocking RSS feeds | Server components fetch server-side, no browser CORS |
| Rate limiting from sources | Implemented caching layer with 5-min TTL |
| Inconsistent feed formats | Built adapter pattern for each source type |
| Real-time feel without WebSockets | SWR with revalidation on focus |
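The adapter pattern row above can be sketched as follows. Each source type gets an adapter that maps its raw shape onto one `Post` interface the timeline renders; the raw field names here are illustrative, not the real APIs' schemas:

```typescript
interface Post {
  id: string;
  source: "youtube" | "bluesky" | "reddit" | "rss";
  title: string;
  url: string;
  publishedAt: Date;
}

type Adapter<Raw> = (raw: Raw) => Post;

// Example adapter for a YouTube feed entry (assumed raw shape):
interface YouTubeEntry { videoId: string; title: string; published: string; }

export const fromYouTube: Adapter<YouTubeEntry> = (e) => ({
  id: `yt:${e.videoId}`,
  source: "youtube",
  title: e.title,
  url: `https://www.youtube.com/watch?v=${e.videoId}`,
  publishedAt: new Date(e.published),
});

// Once every source is adapted to Post, merging into one chronological
// timeline is a flatten-and-sort:
export function mergeTimeline(...feeds: Post[][]): Post[] {
  return feeds.flat().sort((a, b) => b.publishedAt.getTime() - a.publishedAt.getTime());
}
```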
## Results

- ✓ Aggregates 500+ posts per refresh
- ✓ Sub-second feed loading with streaming
- ✓ AI analysis available on any post
- ✓ Works offline with service worker caching
## Code Sample - AI Reflection Feature

```typescript
// API Route: app/api/reflect/route.ts
import Anthropic from "@anthropic-ai/sdk";

const anthropic = new Anthropic(); // reads ANTHROPIC_API_KEY from the environment

export async function POST(request: Request) {
  const { content, title, source } = await request.json();

  const response = await anthropic.messages.create({
    model: "claude-sonnet-4-20250514",
    max_tokens: 1024,
    messages: [{
      role: "user",
      content: `Analyze this content and provide deeper insights:

Title: ${title}
Source: ${source}
Content: ${content}

Provide:
1. Key takeaways (3 bullet points)
2. Why this matters
3. Questions this raises
4. Related topics to explore`,
    }],
  });

  // content is a union of block types; narrow to text before reading .text
  const block = response.content[0];
  return Response.json({
    reflection: block.type === "text" ? block.text : "",
  });
}
```

## What I'd Do Differently
- → Add algorithmic ranking option (not just chronological)
- → Implement cross-post deduplication
- → Build recommendation engine based on reading history
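The deduplication idea above could be sketched as keying posts by a normalized URL, since the same article often arrives via Reddit, RSS, and Bluesky. This is a hypothetical future-work sketch; the tracking-parameter list and the keep-earliest policy are assumptions:

```typescript
// Normalize a URL for dedup: drop scheme, common tracking params
// (illustrative list), and trailing slashes.
export function canonicalUrl(raw: string): string {
  const u = new URL(raw);
  for (const p of [...u.searchParams.keys()]) {
    if (p.startsWith("utm_") || p === "ref") u.searchParams.delete(p);
  }
  const query = u.searchParams.toString();
  return `${u.hostname}${u.pathname.replace(/\/+$/, "")}${query ? "?" + query : ""}`;
}

// Keep only the earliest copy of each cross-posted item.
export function dedupe<T extends { url: string; publishedAt: Date }>(posts: T[]): T[] {
  const seen = new Map<string, T>();
  for (const post of posts) {
    const key = canonicalUrl(post.url);
    const prior = seen.get(key);
    if (!prior || post.publishedAt < prior.publishedAt) seen.set(key, post);
  }
  return [...seen.values()];
}
```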