introduction
diy-productivity
- Lossless learning
- Good Resumes
- LNO framework for productivity
- How I journal
- How I take notes
- Inbox zero and mail categorization
thoughtful-ramblings
research-papers
- Things read in 2025 JFM
- CLIP
- Attention sinks
- Quantization
- LLM Inference Part 1
- Things read in 2024 JFM
- Things read in 2023 OND
- Whisper
- AST: Audio Spectrogram Transformer
- Swin transformers
deep-learning
- CLIP
- Short intro to VQ VAEs
- Whisper
- Quickstart on LLMs
- Batchnorm
- AST: Audio Spectrogram Transformer
- Swin transformers
book-notes
ml-ops
tech
machine-learning
learnings
llms
- Quantization
- LLM Inference Part 1
- Things read in 2024 JFM
- Agents
- Things read in 2023 OND
- Quickstart on LLMs
computer-vision
multimodal
agents
gpus
- CUTLASS primer
- Mechanics of FP8 for LLMs
- What is Systems for LLM
- GPU chip design tradeoffs
- LLM Inference Part 1
llm-optimization
- CUTLASS primer
- The curious case of weight tying in LoRA
- Mechanics of FP8 for LLMs
- Attention sinks
- LLM Inference Part 1