DeepSeek Has an Updated Model That Is Wowing Coders
DeepSeek has just dropped an upgraded version of its already impressive V3 model, and it has developers talking. The Chinese AI startup released the V3 and R1 models earlier this year, and they immediately grabbed attention by offering performance that rivals top-tier models from OpenAI and Google while being completely open-source and free.
Now, they are back at it again with an updated version of the V3 model: DeepSeek-V3-0324. It is already generating buzz for writing hundreds of lines of code without breaking a sweat.

Let’s break it down.
What’s New in DeepSeek-V3-0324?
The big change here is power. The parameter count jumped from 671 billion to 685 billion, giving it more capacity while still using the efficient Mixture-of-Experts (MoE) architecture. Only 37 billion parameters activate per token, so it's smart with how it uses resources.
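To see why only a fraction of the parameters fire at once, here's a toy sketch of top-k expert routing, the general idea behind MoE. This is a simplified illustration, not DeepSeek's actual routing code:

```python
import numpy as np

def moe_forward(token, experts, router_weights, top_k=2):
    # Score every expert for this token, then keep only the best top_k.
    scores = token @ router_weights
    top = np.argsort(scores)[-top_k:]
    gates = np.exp(scores[top]) / np.exp(scores[top]).sum()
    # Only the chosen experts actually run, so compute scales with top_k,
    # not with the total number of experts in the model.
    return sum(g * experts[i](token) for g, i in zip(gates, top))

# Tiny demo: 4 experts, only 2 active per token.
dim, n_experts = 8, 4
experts = [lambda x, W=np.random.randn(dim, dim): x @ W for _ in range(n_experts)]
router = np.random.randn(dim, n_experts)
print(moe_forward(np.random.randn(dim), experts, router))
```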
They also switched to the MIT license, which is developer-friendly and makes integration much easier.

Benchmarks also show strong gains over the previous V3 release.
This isn't just benchmark fluff, either; the improvements show up as soon as you start using the new model.
How Does It Perform?
People have tested it—and the results are impressive.
Petri Kuittinen, a Finnish lecturer, got it to generate a fully responsive landing page for an AI company—958 lines of working code. Jasper Zhang, a Math Olympiad gold medalist, gave it a 2025 AIME problem. It solved it flawlessly.
Apple's Awni Hannun ran it on a 512GB M3 Ultra Mac. The speed was around 20+ tokens per second, while peak memory usage was just 381GB, which is solid for a model this size.

We tested it too.
When we asked it to create a Python web app using Flask, including login functionality and hashed password security, it generated the code. To our surprise, it worked, too.
We tried the same on ChatGPT and Gemini. ChatGPT kept restarting the output. Gemini managed to finish it after a few tries, but the code was incomplete and didn’t work without serious fixing.
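For context, here's a minimal sketch of the kind of app we prompted for. It isn't DeepSeek's exact output, just an illustration of the login-plus-hashed-password pattern, using Flask sessions and Werkzeug's password helpers with an in-memory user store standing in for a real database:

```python
from flask import Flask, request, redirect, session
from werkzeug.security import generate_password_hash, check_password_hash

app = Flask(__name__)
app.secret_key = "change-me"          # required for session cookies

users = {}                            # in-memory store: {username: password_hash}

FORM = '''<form method="post">
            <input name="username" placeholder="username">
            <input name="password" type="password" placeholder="password">
            <button>Submit</button>
          </form>'''

@app.route("/register", methods=["GET", "POST"])
def register():
    if request.method == "POST":
        # Never store plain-text passwords -- hash them before saving.
        users[request.form["username"]] = generate_password_hash(request.form["password"])
        return redirect("/login")
    return FORM

@app.route("/login", methods=["GET", "POST"])
def login():
    if request.method == "POST":
        stored = users.get(request.form["username"])
        if stored and check_password_hash(stored, request.form["password"]):
            session["user"] = request.form["username"]
            return f"Welcome, {session['user']}!"
        return "Invalid credentials", 401
    return FORM

if __name__ == "__main__":
    app.run(debug=True)
```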

How to Access the Latest DeepSeek V3?
You can access V3 directly from the DeepSeek website and the mobile app. By default, it uses the new DeepSeek-V3-0324 model, so you can just hop on and try it right away.
Developers can integrate DeepSeek into their applications and websites using the API, which costs the same as before. The existing API endpoint works unchanged (model=deepseek-chat).
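Since DeepSeek's API is OpenAI-compatible, switching mostly means pointing an OpenAI client at DeepSeek's base URL. A minimal sketch, assuming you already have a DeepSeek API key:

```python
from openai import OpenAI

# DeepSeek's API speaks the OpenAI chat-completions protocol,
# so the standard openai client works with a custom base_url.
client = OpenAI(api_key="YOUR_DEEPSEEK_API_KEY",
                base_url="https://api.deepseek.com")

response = client.chat.completions.create(
    model="deepseek-chat",  # now served by DeepSeek-V3-0324
    messages=[{"role": "user", "content": "Write a haiku about open-source AI."}],
)
print(response.choices[0].message.content)
```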

To download and run the model locally, you can get it from the Hugging Face platform.
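As a minimal sketch, you can pull the weights with the huggingface_hub library. This assumes the deepseek-ai/DeepSeek-V3-0324 repository and enough disk space for several hundred gigabytes of weights:

```python
from huggingface_hub import snapshot_download

# Downloads the full model repo; serving a 685B-parameter MoE locally
# also requires serious hardware, so check your setup before starting.
snapshot_download(repo_id="deepseek-ai/DeepSeek-V3-0324",
                  local_dir="DeepSeek-V3-0324")
```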
What’s Next?
Rumors point to an upcoming R2 reasoning model—possibly even sooner than expected. And based on how good V3-0324 is, R2 could make an even bigger splash.
However, not everyone’s thrilled. With its rising influence, DeepSeek is under U.S. government scrutiny over national security and data privacy. There’s talk of banning its apps from official devices. Still, DeepSeek-V3-0324 is proving that open-source AI can be powerful, practical, and cost-effective. If you’re a coder, builder, or just curious about what’s next in AI, you should try it for yourself.
Ravi Teja KNTS
Tech writer with over 4 years of experience at TechWiser, where he has authored more than 700 articles on AI, Google apps, Chrome OS, Discord, and Android. His journey started with a passion for discussing technology and helping others in online forums, which naturally grew into a career in tech journalism. Ravi’s writing focuses on simplifying technology, making it accessible and jargon-free for readers. When he’s not breaking down the latest tech, he’s often immersed in a classic film – a true cinephile at heart.