Quantization, Linear Regression, and Hardware for AI: Our Best Recent Deep Dives

By TDS Editors · Towards Data Science · April 2024

There are times when brevity is a blessing; sometimes you just need to figure something out quickly to move ahead with your day. More often than not, though, if you’d like to truly learn about a new topic, there is no substitute for spending some time with it.

This is where our Deep Dives excel: these articles tend to be on the longer side (some of them could easily become a short book!), but they reward readers with top-notch writing, nuanced explanations, and a well-rounded approach to the question or problem at hand. We’ve published some excellent articles in this category recently, and wanted to make sure you don’t miss out.

Happy reading (and bookmarking)!

  • Quantizing the AI Colossi
    Sure, Nate Cibik’s comprehensive guide to quantization might be an 81-minute read, but we promise it’s worth the investment: it’s a one-stop resource to understand the mathematical underpinnings of this ever-relevant approach, catch up with recent research, and learn about the practical aspects of implementation, too. (For a taste of the core idea, see the first sketch after this list.)
  • Linear Regressions for Causal Conclusions
    For a thorough and highly accessible intro to linear regression, especially in the context of business problems and decision-making scenarios, head right over to Mariya Mansurova’s latest explainer, which shows how a relatively straightforward method can yield sophisticated insights. (A quick OLS sketch follows this list.)
  • Groq, and the Hardware of AI — Intuitively and Exhaustively Explained
    We all know that recent advances in AI depend on major improvements in computing technology, but far fewer of us can explain in detail how this evolution unfolded. This is where Daniel Warfield’s stellar overview comes in, taking us through the recent history of AI hardware, from CPUs and GPUs to TPUs and beyond.
  • Deep Dive into Sora’s Diffusion Transformer (DiT) by Hand
    We have a soft spot for patient, well-illustrated guides that focus on one thing—a model, a tool, a workflow—and unpack its nuances with care. Srijanie Dey, PhD offers precisely that in her recent deep dive, which focuses on the inner workings of OpenAI’s video-generating (and buzz-generating) model, Sora.
  • Create an Agent with OpenAI Function Calling Capabilities
    If your idea of an immersive read entails less language and more code, we suspect you’ll appreciate Tianyi Li and Selina Li’s patient tutorial, which walks us through the process of creating a trip-assistant agent, one that leverages function calling for a more streamlined and efficient workflow. (A minimal function-calling sketch appears after this list.)
  • 5 Powerful Strategies To Make Sure AI Doesn’t Steal Your Job — A Spotify Data Scientist’s Survival Guide
    In times of economic and technological uncertainty, it’s always a good idea to invest in your core skills. Khouloud El Alami shares several actionable insights on the areas you should focus on to make your career more resilient to disruption (whether AI-induced or otherwise).
  • Overwriting in Python: Tricky. Dangerous. Powerful.
    Looking to expand your Python toolkit and gain a deeper understanding of a common programming procedure? Marcin Kozak is back with another Python must-read, this one zooming in on overwriting: how it works, why using it comes with more than a few risks, and how to harness its power effectively and safely. (A small cautionary example appears after this list.)
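
To make the quantization idea concrete, here is a minimal sketch of symmetric (absmax) int8 quantization. It is not code from Nate Cibik’s article; the function names and the per-tensor scaling choice are our own illustrative assumptions.

```python
# A minimal sketch of absmax (symmetric) int8 quantization: one scale per tensor.
import numpy as np

def quantize_int8(x: np.ndarray):
    """Map float32 values to int8 using a single per-tensor scale."""
    scale = np.abs(x).max() / 127.0              # largest magnitude maps to 127
    q = np.clip(np.round(x / scale), -127, 127).astype(np.int8)
    return q, scale

def dequantize(q: np.ndarray, scale: float) -> np.ndarray:
    """Recover an approximation of the original float values."""
    return q.astype(np.float32) * scale

weights = np.random.randn(4, 4).astype(np.float32)
q, scale = quantize_int8(weights)
print("max abs error:", np.abs(weights - dequantize(q, scale)).max())
```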
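
Likewise, a bare-bones ordinary least squares example with simulated data rather than anything from Mariya Mansurova’s piece: the variable names and the simulated effect sizes are hypothetical, but the comparison between an adjusted and a naive model is the kind of reasoning a causal reading of linear regression revolves around.

```python
# A minimal OLS sketch with simulated data (hypothetical, not from the article):
# the treatment effect is recovered only when the confounder is adjusted for.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 1_000
confounder = rng.normal(size=n)                       # drives both treatment and outcome
treatment = (confounder + rng.normal(size=n) > 0).astype(float)
outcome = 2.0 * treatment + 1.5 * confounder + rng.normal(size=n)

# Adjusted model: the coefficient on `treatment` (index 1) lands near the true 2.0.
adjusted = sm.OLS(outcome, sm.add_constant(np.column_stack([treatment, confounder]))).fit()

# Naive model without the confounder: the treatment coefficient is biased upward.
naive = sm.OLS(outcome, sm.add_constant(treatment)).fit()

print("adjusted estimate:", adjusted.params[1])
print("naive estimate:   ", naive.params[1])
```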
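
For the function-calling tutorial, here is a minimal sketch of the OpenAI chat-completions tools interface. The get_weather tool, its schema, the prompt, and the model name are illustrative stand-ins, not the trip-assistant functions from Tianyi Li and Selina Li’s tutorial.

```python
# A minimal function-calling sketch: declare a tool, let the model request it.
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

tools = [{
    "type": "function",
    "function": {
        "name": "get_weather",                       # hypothetical tool, not from the article
        "description": "Get the current weather for a city.",
        "parameters": {
            "type": "object",
            "properties": {"city": {"type": "string"}},
            "required": ["city"],
        },
    },
}]

response = client.chat.completions.create(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": "Do I need an umbrella in Melbourne today?"}],
    tools=tools,
)

# If the model decides a tool is needed, it returns the call instead of plain text;
# the agent then runs the function and feeds the result back in a follow-up message.
message = response.choices[0].message
if message.tool_calls:
    call = message.tool_calls[0]
    print(call.function.name, call.function.arguments)
else:
    print(message.content)
```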
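
And finally, a small cautionary example of the overwriting behavior Marcin Kozak examines: rebinding a built-in name is perfectly legal Python, but it silently changes what later code does.

```python
# Shadowing the built-in `list` breaks later calls until the shadowing name is removed.
list = [1, 2, 3]                    # rebinds the built-in name in this module

try:
    numbers = list(range(5))        # fails: a list object is not callable
except TypeError as err:
    print("after overwriting:", err)

del list                            # drop the shadowing name...
numbers = list(range(5))            # ...and the built-in is reachable again
print("after cleanup:", numbers)
```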
