Interactive Research Library
From research papers
to interactive products
PaperMap turns landmark AI research papers into visual, interactive learning experiences.
Explore the Transformer, GPT-3, and InstructGPT — rebuilt as production-grade explainers.
Paper Library
Published Explainers
Each paper is a self-contained interactive guide with visual demos, architecture breakdowns, quizzes, and research-accurate detail from the original publications.
Live
Vaswani et al. · 2017
Attention Is All You Need
The paper that started it all. Interactive walkthrough of the Transformer architecture with tokenization, embeddings, self-attention, multi-head attention, positional encoding, and full encoder-decoder visualization.
Transformer
Self-Attention
NLP
Architecture
Open Explainer →
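The core equation behind this explainer, Attention(Q, K, V) = softmax(QKᵀ/√d_k)V from Vaswani et al. (2017), can be sketched in a few lines of NumPy. This is a minimal single-head illustration only; the function names are illustrative and not part of PaperMap or the paper's reference code.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the last axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def scaled_dot_product_attention(Q, K, V):
    # Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V  (Eq. 1 of the paper).
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)   # pairwise query-key similarities
    weights = softmax(scores)          # each row is a distribution over keys
    return weights @ V, weights

rng = np.random.default_rng(0)
Q = rng.standard_normal((4, 8))  # 4 query positions, d_k = 8
K = rng.standard_normal((4, 8))
V = rng.standard_normal((4, 8))
out, weights = scaled_dot_product_attention(Q, K, V)
```

Multi-head attention simply runs several such heads in parallel on learned projections of Q, K, and V and concatenates the results.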
Live
Brown et al. · 2020
Language Models are Few-Shot Learners
How 175 billion parameters unlocked in-context learning. Interactive few-shot demos, scaling law visualizations, benchmark tables, data contamination analysis, and the full GPT-3 model family.
GPT-3
Few-Shot
Scaling Laws
175B
Open Explainer →
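In-context learning needs no gradient updates: the "training" is just demonstrations placed in the prompt. A minimal sketch of the few-shot prompt format explored in the explainer, using the English-to-French example from the GPT-3 paper (the helper function is hypothetical, not PaperMap code):

```python
def few_shot_prompt(examples, query):
    # Format k demonstrations followed by the query; the model is expected
    # to infer the task from the pattern and complete the final line.
    lines = [f"{x} => {y}" for x, y in examples]
    lines.append(f"{query} =>")
    return "\n".join(lines)

prompt = few_shot_prompt(
    [("sea otter", "loutre de mer"), ("cheese", "fromage")],
    "plush giraffe",
)
print(prompt)
```

A key finding of the paper is that the benefit of adding such demonstrations grows with model scale, which is what the scaling-law visualizations illustrate.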
Live
Ouyang et al. · 2022
Training Language Models to Follow Instructions with Human Feedback
The RLHF paper that paved the way for ChatGPT. Interactive 3-step pipeline (SFT → Reward Model → PPO), human evaluation results, bias analysis, and qualitative examples.
InstructGPT
RLHF
PPO
Alignment
Open Explainer →
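Step 2 of the pipeline trains a reward model on human preference pairs with a pairwise loss, -log σ(r(x, y_w) - r(x, y_l)), pushing the reward of the preferred completion above the rejected one. A minimal sketch of that loss for a single pair (scalar rewards assumed already computed; the function name is illustrative):

```python
import math

def reward_model_loss(r_chosen, r_rejected):
    # Pairwise preference loss from the InstructGPT paper:
    # -log sigmoid(r(x, y_w) - r(x, y_l)).
    margin = r_chosen - r_rejected
    return -math.log(1.0 / (1.0 + math.exp(-margin)))
```

When the chosen completion already scores higher, the loss is small; when the ranking is inverted, the loss grows, which is what drives the reward model toward human preferences before PPO uses it as a training signal.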
About
What is PaperMap?
A scalable, static-first platform that transforms dense research papers into interactive learning products — no frameworks, no build tools, no dependencies.
🎯
Research-Accurate
Every number, formula, and architectural detail is sourced directly from the original paper. Section references included throughout.
⚡
Interactive Demos
Tokenizers, attention maps, few-shot playgrounds, scaling charts, and quiz blocks — learn by building intuition, not just reading.
📱
Mobile-First Design
Responsive layouts, generous touch targets, and reduced-motion support. Works equally well on phones, tablets, and desktops.
🚀
Zero Build Step
Pure HTML, CSS, and vanilla JavaScript. Deploy anywhere — GitHub Pages, Netlify, Vercel, or any static host.
♿
Accessible by Default
Semantic HTML, ARIA labels, keyboard navigation, skip links, focus styles, and prefers-reduced-motion support.
🎨
Consistent Design System
Shared typography, color palette, components, and interaction patterns across all paper explainers.
Roadmap
Building the Library
PaperMap grows one paper at a time. Each explainer is handcrafted to the same production-quality standard.
✓ Attention Is All You Need
Transformer architecture, tokenization, embeddings, self-attention, multi-head attention, positional encoding, training.
✓ GPT-3: Few-Shot Learners
In-context learning, scaling laws, 175B architecture, benchmark results, data contamination, societal impact.
✓ InstructGPT & RLHF
3-step RLHF pipeline, reward modeling, PPO, human evaluations, alignment tax, bias analysis.
→ Next: More Papers
BERT, diffusion models, retrieval-augmented generation, and more landmark papers coming soon.