Welcome to the 2,948 newly Not Boring people who have joined us since last Monday! If you haven’t subscribed, join 165,475 smart, curious folks by subscribing here:
Today’s Not Boring is brought to you by… Secureframe
Answering RFPs and security questionnaires is critical to closing deals and growing revenue. But the manual process can delay deals or stop them cold. Secureframe uses machine learning to help you respond to both quickly, so you can build trust with customers, close more deals, and accelerate speed to revenue.
Click here to schedule a demo of AI-powered Secureframe Questionnaires today.
Hi friends 👋 ,
Happy Monday!
Normally, when I write something, no matter how long and complex, I try to get it done in a week. Whenever I try to plan ahead, to write out a list of potential essay topics and start researching weeks in advance, I get bored with them by the time I actually sit down to write the piece.
Currently, though, I’m in the middle of researching and writing a piece I’ve been thinking about for weeks and that’s fully captured my imagination. I think it’s going to pull together a lot of the threads we’ve been talking about in Not Boring for the past couple of years and frame the way we look at the world going forward.
On Friday, I was trying to jam on that essay to get it done for today, and I realized that if I rushed it out, I was going to fall way short of what I was going for. I panicked a little bit, bummed that I wouldn’t be able to do the idea justice, but marched on. Then, miraculously, two things happened: 1) Anton and I recorded the first episode of a new series, and 2) Elliot sent me a draft of the piece he was working on this week.
So instead of a typical Not Boring essay, we’re hitting you with a 1-2 punch. Two geniuses in two of our favorite areas guiding us through their worlds, and two more weeks for me to work on that essay.
Anton Teaches Packy AI
A couple of weeks ago, I joked on Twitter that I had no idea what I was looking at when I read an AI research paper. Anton Troynikov, the co-founder of Not Boring portfolio company Chroma, replied offering to teach me, paper-by-paper.
anton 🏴‍☠️ (@atroyn), replying to @packym: “packy i would absolutely do a weekly ‘anton explains an ML paper to packy’ type thing with you”
After a little back and forth, we decided that there are probably a lot of people who want to understand what’s going on in AI a lot better than they do, and that we should probably just record our sessions and share them. Anton Teaches Packy AI was born.
For the first session, we dug into Attention Is All You Need, the 2017 paper that proposed the Transformer architecture at the heart of so much of the innovation happening in AI today. I first read about the paper in Ben Thompson’s excellent interview with Daniel Gross and Nat Friedman and heard about it again in Lex Fridman’s conversation with AI master Andrej Karpathy. It’s a biggie.
We decided to make Anton Teaches Packy AI a YouTube series in order to use visual aids to explain the concepts, and I’m excited to unveil our first episode: