online training
LLM and
Generative AI
Building Applications Without Complex Algorithms
Complex topics in simple language, clear and structured
Mentor support: no limit on answers to your questions
Up-to-date materials: regular updates every 3 months
Practical assignments: we teach with real-life examples
About the course
This course is your quick start into the world of generative AI, without delving into complex math and algorithms. There is no dry theory here: only practice, real projects, and working tools that you can apply to your own tasks immediately.
What will you get?
Basics of Generative AI
understand how LLMs and neural networks work, without complex math.
Practice from the first lesson
creating chatbots, RAG systems, and content generators.
Finished projects
from recommendation systems to business task automation.
Modern tools
LangChain, Hugging Face, vector databases (Pinecone, ChromaDB).
Support
GitHub repository, tests, 2025 updates.
Who will benefit from this course?
Developers
Learn how to build AI applications from scratch and integrate them into real projects.
Managers
Understand how to apply AI to automate business processes and create smart features in your products.
Beginners
Get started in AI without complicated math - just practice and real projects.
Digital Professionals
Automate content creation and enhance creativity with generative AI.
Course Requirements
For the course you will need: basic Python skills (functions, loops, working with libraries).
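As a rough self-check of that level, here is a hypothetical snippet (not from the course materials): if you can read a function, a loop, and a standard-library import like this comfortably, your Python is sufficient.

```python
import statistics  # working with a (standard) library

def grade(scores):
    """Return 'pass' for every score at or above the mean, else 'fail'."""
    mean = statistics.mean(scores)
    results = []
    for s in scores:  # a plain loop over a list
        results.append("pass" if s >= mean else "fail")
    return results

print(grade([70, 85, 90]))  # ['fail', 'pass', 'pass']
```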
What will you learn?
01
Work with modern LLMs - learn GPT-4, Gemini, Claude and open-source models from Hugging Face.
02
Apply advanced prompt engineering - learn how to “talk” to AI for accurate results.
03
Create chatbots with memory - so that they maintain dialog context, like ChatGPT.
04
Develop RAGs - to search and generate answers from your data (documents, knowledge bases).
05
Automate content - to write text, generate ideas and even code with AI.
06
Work with LangChain - to assemble complex AI chains for real-world problems.
07
Use vector databases (Pinecone, ChromaDB) - for smart information retrieval and storage.
08
Deploy AI solutions - from a prototype in Jupyter Notebook to a working MVP.
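The RAG and vector-database items above all rest on one idea: turn text into vectors and retrieve by similarity. Here is a minimal, self-contained sketch of that retrieval step; the documents and their 3-dimensional "embeddings" are made up for illustration (a real system would use an embedding model and a vector database such as Pinecone or ChromaDB).

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

# Toy document "embeddings" -- stand-ins for real model output
docs = {
    "refund policy":  [0.9, 0.1, 0.0],
    "shipping times": [0.1, 0.9, 0.1],
    "api reference":  [0.0, 0.2, 0.9],
}

def retrieve(query_vec, k=1):
    """Return the k most similar documents -- the 'R' in RAG."""
    ranked = sorted(docs, key=lambda d: cosine(query_vec, docs[d]), reverse=True)
    return ranked[:k]

print(retrieve([0.8, 0.2, 0.1]))  # ['refund policy']
```

The retrieved documents are then placed into the prompt so the LLM can generate an answer grounded in your own data.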
Course Experts
Our experts are high-level specialists with real-world experience developing generative AI applications at large companies.
Anna Kosach
Lead AI Engineer, Sber AI. Develops industrial solutions based on generative AI to automate customer service and document processing.
Konstantin Yaritsa
Senior ML Engineer, Mail.ru Group. Specializes in implementing LLMs in search engines and recommendation algorithms.
Katya Sluch
AI Product Manager, MTS AI. Manages the development of AI products, including chatbots and voice assistants for the telecom sector.
Artem Zhukov
CTO, AI Startup (ex-Yandex.Taxi). Creates scalable architectures for AI applications with a focus on natural language processing.
Course program
Module 1: Introduction
  • Introduction: Get the most out of this course
Module 2: Setting up the development environment
  • Overview
  • Setting up the development environment
  • Install Python UV tool
  • Dev setup: Install Anaconda
  • Dev setup: UV
  • Quizzes, exercises and projects
  • Accessing the Large Language Models
Module 3: Generative AI: Fundamentals
  • Overview
  • Introduction to AI, MoE, neural networks, generative AI
  • Neurons, Neural & Deep Learning Networks
  • Exercise: Try out a Neural Network for Solving Math Equations
  • Viewing the generative AI model as a black box
  • Quiz: Fundamentals of Generative AI Models
  • An Overview of Generative AI Applications
  • Exercise: Setup Access to Google Gemini Models
  • Introduction to Hugging Face
  • Exercise: Checkout the Hugging Face Portal
  • Exercise: Join the Community and Explore Hugging Face
  • Quiz: Generative AI and Hugging Face
  • Intro to Natural Language Processing (NLP, NLU, NLG)
  • NLP with LLMs
  • Exercise: Try Out NLP Tasks with Hugging Face Models
  • Test: NLP with LLMs
Module 4: Generative AI applications
  • Overview
  • Introduction to Ollama
  • Ollama model hosting
  • Model naming scheme
  • Instruct, embedding, and chat models
  • Quiz: Instruct, Embedding, and Chat Models
  • Next Word Prediction by LLM and Fill Mask Task
  • Model Inference Control Parameters
  • Randomness Control Inference Parameters
  • Exercise: Setup Cohere Key and Try Out Randomness Control Parameters
  • Diversity Control Inference Parameters
  • Output Length Control Parameters
  • Exercise: Try Out Decoding or Inference Parameters
  • Quiz: Decoding Hyper-parameters
  • Introduction to In-Context Learning
  • Quiz: In-Context Learning
Module 5: Hugging Face Models: Fundamentals
  • Overview
  • Exercise: Install & Work with Hugging Face Transformers Library
  • Transformers Library Pipeline Classes
  • Quiz: Hugging Face Transformers Library
  • Hugging Face Hub Library & Working with Endpoints
  • Quiz: Hugging Face Hub Library
  • Exercise: Proof of Concept (PoC) for Summarization Task
  • Hugging Face CLI Tools and Model Caching
Module 6: Hugging Face Models: Advanced
  • Overview
  • Model Input/Output and Tensors
  • Hugging Face Model Configuration Classes
  • Model Tokenizers & Tokenization Classes
  • Working with Logits
  • Hugging Face Models Auto Classes
  • Quiz: Hugging Face Classes
  • Exercise: Build a Question Answering System
Module 7: LLM challenges and operational design
  • Overview
  • Challenges with Large Language Models
  • Model Grounding and Conditioning
  • Exercise: Explore the Domain Adapted Models
  • Prompt Engineering and Practices (1)
  • Prompt Engineering and Practices (2)
  • Quiz & Exercise: Prompting Best Practices
  • Few-Shot & Zero-Shot Prompts
  • Quiz & Exercise: Few-Shot Prompts
  • Chain of Thought Prompting Technique
  • Quiz & Exercise: Chain of Thought
  • Self-Consistency Prompting Technique
  • Tree of Thoughts Prompting Technique
  • Exercise: Tree of Thought
  • Creative Writing Workbench (v1)
Module 8: LangChain: Prompts, chains, LCEL
  • Overview
  • Prompt Templates
  • Few-Shot Prompt Template & Example Selectors
  • Prompt Model Specificity
  • LLM Invoke, Streams, Batches & Fake LLM
  • Exercise: Interacting with LLM using LangChain
  • Exercise: LLM Client Utility
  • Quiz: Prompt Templates, LLM, and Fake LLM
  • Introduction to LangChain Expression Language (LCEL)
  • Exercise: Create Compound Sequential Chain
  • LCEL: Runnable Classes (1)
  • LCEL: Runnable Classes (2)
  • Exercise: Try Out Common LCEL Patterns
  • Exercise: Creative Writing Workbench v2
  • Quiz: LCEL, Chains and Runnables
Module 9: Dealing with structured responses from LLM
  • Overview
  • Challenges with Structured Responses
  • LangChain Output Parsers
  • Exercise: Use the EnumOutputParser
  • Exercise: Use the PydanticOutputParser
  • Project: Creative Writing Workbench
  • Project: Solution Walkthrough (1)
  • Project: Solution Walkthrough (2)
  • Handling Parsing Errors
  • Quiz and Exercise: Parsers, Error Handling
Module 10: Datasets for model training and testing
  • Overview
  • Datasets for LLM Pre-training
  • Hugging Face Datasets and the Datasets Library
  • Exercise: Use Features of Datasets Library
  • Exercise: Create and Publish a Dataset on Hugging Face
Module 11: Vectors, embeddings and semantic search
  • What is the meaning of contextualized understanding?
  • Building Blocks of Transformer Architecture
  • Intro to Vectors, Vector Spaces, and Embeddings
  • Measuring semantic similarity with distance
  • Quiz: Vectors, Embeddings, Similarity
  • Sentence transformer models (SBERT)
  • Working with sentence transformers
  • Exercise: Work with Classification and Mining Tasks
  • Creating embeddings with LangChain
  • Exercise: CacheBackedEmbeddings Classes
  • Lexical, semantic, and kNN search
  • Search Efficiency and Search Performance Metrics
  • Search Algorithms, Indexing, ANN, FAISS
  • Quiz & Exercise: Try Out FAISS for Similarity Search
  • Search Algorithm: Locality-Sensitive Hashing (LSH)
  • Search Algorithm: Inverted File Index (IVF)
  • Search Algorithm: Product Quantization (PQ)
  • Search Algorithm: HNSW (1)
  • Search Algorithm: HNSW (2)
  • Quiz & Exercise: Search Algorithms & Metrics
  • Project: Build a Movie Recommendation Engine
  • Benchmarking ANN Algorithms
  • Exercise: Benchmark the ANN Algorithms
Module 12: Vector Databases
  • Challenges with semantic search libraries
  • Introduction to Vector Databases
  • Exercise: Try out ChromaDB
  • Exercise: Custom embeddings
  • Chunking, Symmetric & Asymmetric Searches
  • LangChain Document Loaders
  • LangChain Text Splitters for Chunking
  • LangChain Retrievers & Vector Stores
  • Search Scores and Maximal Marginal Relevance (MMR)
  • Project: Pinecone Adoption @ Company
  • Quiz: Vector Databases, Chunking, Text Splitters
Module 13: Conversational user interface
  • Introduction to Streamlit Framework
  • Exercise: Build a Hugging Face LLM Playground
  • Building Conversational User Interfaces
  • Exercise: Build a Chatbot with Streamlit
  • LangChain Conversation Memory
  • Quiz & Exercise: Building Chatbots with LangChain
  • Project: PDF Document Summarizer Application
Module 14: Advanced retrieval augmented generation
  • Introduction to Retrieval Augmented Generation (RAG)
  • LangChain RAG Pipelines
  • Exercise: Build a Smart Retriever with LangChain
  • Quiz: RAG and Retrievers
  • Pattern: Multi Query Retriever (MQR)
  • Pattern: Parent Document Retriever (PDR)
  • Pattern: Multi Vector Retriever (MVR)
  • Quiz: MQR, PDR and MVR
  • Ranking, Sparse, Dense & Ensemble Retrievers
  • Pattern: Long Context Reorder (LCR)
  • Quiz: Ensemble & Long Context Retrievers
  • Pattern: Contextual Compressor
  • Pattern: Merger Retriever
  • Quiz: Contextual Compressors and Merger Retriever Patterns
Module 15: Agentic RAG
  • Introduction to Agents, Tools, and Agentic RAG
  • Exercise: Build a Single-Step Agent without LangChain
  • LangChain Tools and Toolkits
  • Quiz: Agents, Tools & Toolkits
  • Exercise: Try Out the FileManagement Toolkit
  • How Do We Humans & LLMs Think?
  • ReAct Framework & Multi-Step Agents
  • Exercise: Build a Question-Answering ReAct Agent
  • Exercise: Build a Multi-Step ReAct Agent
  • LangChain Utilities for Building Agentic-RAG Solutions
  • Exercise: Build an Agentic-RAG Solution using LangChain
  • Quiz: Agentic RAG and ReAct
Module 16: Model Context Protocol
  • Overview
  • Introduction to Model Context Protocol (1)
  • Exercise: Install an MCP server on Claude Desktop
  • Introduction to Model Context Protocol (2)
  • Quiz: Fundamentals of MCP
  • Exercise: Set up a project with the MCP Python SDK
  • Exercise: Explore MCP using the Inspector tool
  • Exercise: Code an MCP server
  • MCP messaging and transport
  • Exercise: Set up a standalone MCP server with notifications
  • Quiz: MCP Servers
  • MCP: Client
  • Exercise: Code the MCP client
  • Exercise: Question-answering application
  • Quiz: MCP Clients
Module 17: Fine-Tuning
  • Introduction to Fine-tuning
  • Fine-tuning: Reasons
  • Fine-tuning process overview
  • Tools for fine-tuning
  • Exercise: Fine-tune a Cohere model for toxicity classification
  • Creating a dataset for fine-tuning
  • Exercise: Prepare a dataset and fine-tune the OpenAI 4o model
  • Project: Build a credit card fraud detection dataset
Module 18: Dataset Preparation for Fine-tuning
  • Introduction to Alpaca
  • Fine-tuning dataset formats
  • Exercise: Explore instruct fine-tuning datasets
  • Fine-tuning datasets for chats
  • Exercise: Prepare a chat dataset for fine-tuning
  • Quiz: Fine-tuning datasets
Module 19: Pre-training and Fine-tuning with Hugging Face Trainer
  • Fine-tuning under the covers
  • Hyperparameters (1)
  • Hyperparameters (2)
  • Understanding checkpointing
  • Exercise: Explore Hyperparameters
  • Intro to Hugging Face Trainer and Data Collator classes
  • Exercise: Try out the DataCollators
  • Exercise: Pre-train a RoBERTa model
  • Exercise: Full fine-tune of BERT for sentiment analysis
  • Quiz: Hyperparameters & Fine-tuning
Module 20: Quantization
  • LLM Training Compute Needs
  • Inference Compute Needs
  • Quiz: Check your understanding of GPU & CUDA
  • Introduction to Quantization
  • Exercise: Quantization math (affine technique)
  • Applying quantization: static & dynamic
  • Exercise: Dynamic quantization with PyTorch
  • Exercise: Static quantization with AutoGPTQ
  • Quiz: Check your understanding of quantization
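To give a flavor of the quantization module: the affine scheme maps a float range onto small integers with a scale and a zero point. This is a toy sketch of the math only (it assumes the value range is non-degenerate, and it is not the course's exact exercise code):

```python
def affine_quantize(values, num_bits=8):
    """Map floats onto unsigned integers via a scale and zero point (affine scheme).

    Assumes max(values) > min(values), so the scale is non-zero.
    """
    qmin, qmax = 0, 2 ** num_bits - 1
    lo, hi = min(values), max(values)
    scale = (hi - lo) / (qmax - qmin)
    zero_point = round(qmin - lo / scale)
    # Quantize, clamping to the representable integer range
    q = [max(qmin, min(qmax, round(v / scale) + zero_point)) for v in values]
    return q, scale, zero_point

def dequantize(q, scale, zero_point):
    """Recover approximate floats from the quantized integers."""
    return [(qi - zero_point) * scale for qi in q]

q, s, z = affine_quantize([-1.0, 0.0, 0.5, 1.0])
print(q)                    # integers in [0, 255]
print(dequantize(q, s, z))  # values close to the originals
```

Real libraries (PyTorch, AutoGPTQ) apply this idea tensor-wise to model weights, trading a small accuracy loss for much lower memory and compute cost.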
How the training goes
Video lessons in a convenient format
Learn at your own pace with effective instructional videos.
Practice in every case
You immediately apply what you've learned by building your app step by step.
Mentor support
Assignment checking and guidance: mentors review every submission and give practical tips for improvement.
Community
Get quick answers from experts, share your experiences with other course participants and find inspiration in our private chat room.
Official certificate
and career prospects
Upon successful completion of the course, you will receive a personalized certificate recognizing your skills in developing generative AI applications.
We offer flexible pricing plans
Introductory
$11
  • 3 modules
  • Video Lessons
  • Practical assignments
  • Assignment Check
  • Chat for students and tutors
  • Access to the course - 1 month
  • No certificate
Checkout
Basic
$39
  • 16 modules
  • Video Lessons
  • Practical assignments
  • Checking and error analysis
  • Feedback
  • Chat for students and tutors
  • Access: 6 months
  • Certificate
Checkout
Standard
$45
  • 20 modules
  • Video Lessons
  • Practical assignments
  • Checking and error correction
  • Feedback
  • Chat for students and tutors
  • Access: 9 months
  • Certificate
Checkout
Comfortable
$59
  • Personalized support
  • Error analysis and recommendations
  • 18 modules
  • Video lessons
  • Practical exercises
  • Assignment Check
  • Chat for students and tutors
  • Access: 12 months
  • Certificate
Checkout
Corporate
$440
  • Corporate groups of 5–10 people
  • 20 modules
  • Video lessons
  • Practical assignments
  • Checking and error correction
  • Feedback
  • Chat for students and tutors
  • Access: 12 months
  • Certificate
Checkout
Our students are satisfied with their training
Data from an independent survey of graduates
95%
of graduates say the course helped them achieve their goals
89%
of graduates are ready to recommend studying with us
Course Reviews
Alexey
Python developer
I took the course to learn how to integrate AI into our products. Within 2 weeks I had already built a prototype chatbot for internal analytics. LangChain and RAG are now my main tools. A particular plus: examples from real cases, not abstract assignments.
Marina
Product Manager
As a person with no technical background, I was afraid I wouldn't be able to cope. But the course is perfectly structured: from simple prompts to complex chains in LangChain. Now I write the technical requirements for AI features for our developers myself. The cases on automating routine reports were especially useful. Thank you for the quality training!
Ivan
founder of edtech startup
I was looking for a practical LLM course without fluff. This one has the perfect balance: minimum theory, maximum action. Within a month I launched an AI assistant for checking homework. The final project on fine-tuning helped me save $1.5k/month on API costs!
Olga
Digital marketer
After the course I completely changed my approach to content generation. I now produce five times more SEO texts with the help of custom GPT templates. I was surprised that even with basic Python I was able to get the hang of Hugging Face.
Dmitry
Team lead
I recommend it to teams that want to adopt AI quickly. We took the course together with our junior developers, and within 3 weeks we had a working prototype of a RAG system for searching technical documentation. The experts teach exactly the skills you need in production.
Our guarantee:
you risk nothing!
We remain flexible to suit your needs. That's why we guarantee a full refund if you change your mind within three days.
Get Started
Questions and Answers
Do I need experience in machine learning to take the course?
No, the course is designed for beginners. You only need basic knowledge of Python (functions, loops, working with libraries). All generative AI concepts are explained from scratch, without going deep into the math.
What tools and technologies will we learn?
The main stack: LangChain (for building AI applications), Hugging Face (open-source models), vector databases (Pinecone, ChromaDB), and the GPT-4, Gemini, and Claude APIs. Additionally, Streamlit for deploying web interfaces.
Will there be practice or just theory?
The course is 80% practice. You will create chatbots with contextual memory, develop RAG systems for document search, and generate content (texts, code, ideas).
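"Contextual memory" simply means resending earlier turns of the dialog with each new request. A toy sketch of that mechanism (hypothetical class and method names; the course builds the real thing with LangChain's conversation memory):

```python
class ChatMemory:
    """Keep a rolling window of conversation turns to resend as context."""

    def __init__(self, max_turns=4):
        self.max_turns = max_turns
        self.turns = []  # list of (role, text) pairs

    def add(self, role, text):
        self.turns.append((role, text))
        self.turns = self.turns[-self.max_turns:]  # drop the oldest turns

    def build_prompt(self, user_message):
        """Prepend remembered turns so the model sees the dialog context."""
        history = "\n".join(f"{role}: {text}" for role, text in self.turns)
        suffix = f"user: {user_message}"
        return f"{history}\n{suffix}" if history else suffix

memory = ChatMemory(max_turns=2)
memory.add("user", "My name is Anna.")
memory.add("assistant", "Nice to meet you, Anna!")
print(memory.build_prompt("What is my name?"))  # prints remembered turns, then the question
```

The window cap keeps the prompt within the model's context limit; frameworks offer fancier variants (summary memory, token-budget memory) built on the same idea.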
How much time do you need to devote per week?
We recommend studying 4-6 hours per week. This is the optimal pace for absorbing the material and building skills without undue stress, but you can choose whatever pace suits you.
How is feedback organized?
Mentors check your assignments, and you can ask questions in the private student chat and get prompt answers.