The meaning of DeepSeek, or: the power of constraints
👋 Hello again
Issue #261: the early days of Etsy, morality rankings, and getting out there
By Harris Sockel
I was playing around with DeepSeek last night… and asked it to write an issue of the Medium Newsletter. I gave it yesterday’s issue as an example. It talks in this weird hype-y voice that’s not really me. And it thinks today is Thursday. But I tried. I’ll remain employed for now, I guess.
DeepSeek does basically the same things as ChatGPT and Gemini, but, according to some, it does them better. It can write mildly interesting derivative fiction. It’s decent at math. And in the words of one poster on Hacker News, “It is simply smarter… more careful, more astute, more aware, more meta-aware.” DeepSeek is also “open weight,” meaning its weights (the numerical parameters the model learned during training, which determine how it responds) are free to download. (That’s not quite the same as “open source,” which would typically also include the training code and data.)
On Medium, Alberto Romero explains why this AI product is getting so much coverage. Despite facing significant constraints — like U.S. export controls on NVIDIA chips — a previously unknown Chinese company built a bot that is 10 times cheaper than its leading U.S. competitors and arguably performs better. Weeks after OpenAI announced a $500 billion investment in AI, DeepSeek is making everyone question whether these models actually need to be so expensive.
We all know constraints make you more creative. They give you something to push against. Marketing and creativity facilitator Rachel Audige uses the term “creative desperation,” which sounds extreme, but I like it. It’s the feeling you get when you’re working against a tight deadline, when you just have to finish something and, in those last moments before it’s due, you find workarounds or extra reserves of energy to get it done. (Have I mentioned I’ve been writing this newsletter every day for a year? I’ve definitely had some “creative desperation” moments.)
In the Interesting Engineering newsletter, AI enthusiast Suhnylla Kler walks through the software and architectural elegance of DeepSeek’s model. Elements like its “DualPipe” algorithm let the model compute and communicate (i.e., move data between chips) at the same time, rather than doing one and then the other. Kler comes to a similar conclusion about constraints: “DeepSeek V3 was able to keep their AI model running efficiently, showing that innovation isn’t just about having the best tools but also about using what you have in the smartest way possible.” She compares it to cooking a big meal in a kitchen with fewer appliances than you’d like. And she expects U.S. AI models to copy these tricks.
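To make “compute and communicate at the same time” a little more concrete, here’s a toy Python sketch (not DeepSeek’s actual code; the function names and half-second timings are made up) of the general idea: instead of waiting for a data transfer to finish before starting the next chunk of math, you kick the transfer off in the background and keep computing while it’s in flight.

```python
import threading
import time

def communicate(chunk_id):
    # stand-in for sending activations/gradients between chips over a network
    time.sleep(0.5)

def compute(chunk_id):
    # stand-in for the matrix math on one chunk of data
    time.sleep(0.5)

start = time.time()
for chunk in range(4):
    # start the transfer in the background...
    transfer = threading.Thread(target=communicate, args=(chunk,))
    transfer.start()
    # ...and do this chunk's math while the transfer is still in flight
    compute(chunk)
    transfer.join()

print(f"overlapped: {time.time() - start:.1f}s (vs. ~4.0s if each step waited for the last)")
```

Run one after the other, those eight half-second steps would take about four seconds; overlapped, they take about two. DeepSeek’s real pipeline is far more involved, but the payoff is the same kind of thing: less time spent waiting around.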
⚡ 1 story, 1 sentence
- A pattern: Tech platforms foster weird, fragile communities early on — until they grow big enough that they develop hyperpersonalized algorithms and community erodes as a result. (Abby Paradis on the early days of Etsy)
- One way to view political differences: Pretty much all of us agree that our primary moral obligation is to our family, but after that we rank our affinities (to community, coworkers, and country) differently. (Prof. Nichola Raihani)
- If I ever write a book, I too will sleep with it. (Emily J. Smith)
🎠 A dose of practical wisdom: get out there
A new study out of Cambridge tracking 2,000 British adults over age 50 found one factor that reliably decreased their risk of depression: getting out to enjoy some kind of cultural event (a movie, show, museum, etc.) just once a month. (Jessica Stillman)
Deepen your understanding every day with the Medium Newsletter. Sign up here.
Edited and produced by Scott Lamb & Carly Rose Gillis
Questions, feedback, or story suggestions? Email us: tips@medium.com
Like what you see in this newsletter but not already a Medium member? Read without limits or ads, fund great writers, and join a community that believes in human storytelling.