Learn how to fine-tune Liquid Foundation Models (LFM 2) using Direct Preference Optimization for efficient, edge-ready small language models.
Explore Google T5Gemma-2, a compact multimodal AI model with long context and laptop-friendly performance, through a hands-on demo.
Explore how a TRM achieves maximal reasoning through a minimal architecture, proving to be a viable alternative to HRMs.
Modern LLMs are no longer just about more GPUs; they rely on clever design tricks for effective results. Read on to find out what these are.
Explore the GitHub repository for LLM Datasets and transform your AI projects with quality data for model training.
In this hands-on guide, we test the capabilities of ERNIE X1.1, Baidu's latest model, which focuses on practical reliability and a reasoning-first design.
Explore the two newly launched Qwen3 models and use their impressive context lengths to build a RAG model.
Explore the decoder-only Transformer: attention, normalization, and classification. Master text generation and translation.