Gen AI & Agentic AI Crash Course ...
Become Industry-Ready in Just 2 Months
For Working Professionals Across All Industries | 7 Days | 6 Hours/Day | 42 Hours Total Python → Deep Learning → Gen AI → Agentic AI → Real-World Applications
Why Choose This Program?
AI is no longer a future topic — it is happening right now across every industry, and the professionals who understand it will lead the next decade. This course is your fast-track to becoming one of them.
In just 7 days, you will go from zero to building real AI systems that organisations are deploying today — systems that automate complex workflows, serve customers 24/7, qualify leads intelligently, process documents in minutes, and make decisions that used to take days.
You do not need to be a programmer or data scientist to start. Every concept is taught step by step, with real-world examples drawn from banking, e-commerce, sales, HR, and operations.
By the end, you will know how AI thinks, how to direct it, and how to apply it to the real challenges your organisation faces.
Overview
DAY 1 — Python & Deep Learning Foundations
Duration: 6 Hours | Level: Foundations
Build the technical base — Python programming, data handling, neural networks, and NLP fundamentals that power every modern AI system.
Session 1 — Python Fundamentals
Topics Covered – Variables, Data Types, Control Flow, Functions, OOP – Lists, Dictionaries, Sets, Comprehensions – File I/O, Exception Handling, Modules
Session 2 — NumPy & Pandas for Data Analysis
Topics Covered – NumPy Arrays, Broadcasting, Vectorized operations – Pandas DataFrames — slicing, filtering, groupby, merge – Time-series handling, rolling windows, aggregations
Session 3 — Data Visualization
Topics Covered – Matplotlib — line, bar, scatter, histogram plots – Seaborn — heatmaps, pairplots, distribution plots – Storytelling with data
Session 4 — Neural Networks & Deep Learning
Topics Covered – Perceptrons & Multi-Layer Perceptrons (MLPs) – Activation Functions: ReLU, Sigmoid, Tanh, Softmax – Forward & Backward Propagation – Loss Functions & Optimizers: SGD, Adam, RMSprop – Regularization: Dropout, L1/L2
Session 5 — RNNs & LSTMs
Topics Covered – Sequential data processing, Hidden states, Temporal dependencies – Vanishing/Exploding Gradient Problem – LSTM architecture: Forget, Input, Output gates – Bidirectional LSTMs – CNNs: Convolutional layers, pooling, feature maps
Session 6 — NLP Fundamentals
Topics Covered – Text preprocessing: tokenization, stop words, stemming, lemmatization – Bag of Words (BoW), TF-IDF – Word Embeddings: Word2Vec, GloVe – Encoder-Decoder Seq2Seq architecture
DAY 2 — Transformers, LLMs & Prompt Engineering
Duration: 6 Hours | Level: Intermediate
Understand the architecture behind ChatGPT, Claude, and every modern AI assistant. Master prompts that work in production.
Session 1 — Transformer Architecture Mastery
Topics Covered – Self-Attention: Query, Key, Value (QKV) matrices – Scaled Dot-Product Attention – Multi-Head Attention – Why Attention beats RNNs for long documents
Session 2 — Positional Encodings & Tokenization
Topics Covered – Why position matters in transformers – Sinusoidal positional encoding – Relative positional encodings: RoPE, ALiBi – Tokenization: BPE, WordPiece, SentencePiece – Subword tokenization advantages
Session 3 — LLM Architecture & Variants
Topics Covered – BERT (Encoder-only) — classification, NER – GPT (Decoder-only) — generation, completion – T5 (Encoder-Decoder) — summarization, translation – Embedding layers, Transformer blocks, LayerNorm – Context length limitations, KV cache optimization
Session 4 — Sampling, Generation & Alignment
Topics Covered – Temperature, Top-k, Top-p (nucleus) sampling – Beam search, Greedy decoding – RLHF — Reinforcement Learning from Human Feedback – DPO — Direct Preference Optimization – Constitutional AI
Session 5 — Prompt Engineering Fundamentals
Topics Covered – Zero-shot, One-shot, Few-shot prompting – System, User, Assistant role structure – Chain-of-Thought (CoT) prompting – Tree of Thoughts (ToT) – Self-consistency
Session 6 — Structured Outputs & Prompt Security
Topics Covered – JSON mode and schema enforcement – Function calling patterns – Prompt injection awareness and delimiter usage – Token optimization, prompt versioning – LangSmith for prompt debugging
DAY 3 — Agentic AI Foundations & RAG Systems
Duration: 6 Hours | Level: Intermediate
Move from chat to action — build AI agents that can reason, plan, and retrieve information from your organisation’s documents.
Session 1 — Agentic AI Fundamentals
Topics Covered – AI Agents vs Models — key distinctions – Autonomous vs Semi-Autonomous agents – Perception-Action Loop: Observe → Think → Act → Feedback – Autonomy levels and use cases – Agent limitations and failure modes
Session 2 — Planning, Reasoning & Tool Use
Topics Covered – Task decomposition and goal setting – Planning algorithms: A*, MCTS – ReAct paradigm (Reasoning + Acting) – Plan-and-Execute pattern – Tool use: function calling, web browsing, code execution
Session 3 — Agent Memory Systems
Topics Covered – Working memory (context window) – Short-term memory: conversation history – Long-term memory: vector stores, databases – Episodic vs Semantic memory – State tracking and session management
Session 4 — RAG Fundamentals
Topics Covered – RAG architecture: Retriever + Reader + Generator pattern – Why RAG? Hallucination mitigation – When to use RAG vs fine-tuning – Embedding models: OpenAI, Cohere, BGE, E5 – Cosine similarity and retrieval metrics
Session 5 — Vector Databases & Document Processing
Topics Covered – Vector DBs: FAISS, Pinecone, Weaviate, Chroma, Qdrant, Milvus – Indexing strategies: HNSW, IVF – Metadata filtering – Chunking strategies: fixed-size, semantic, overlap – Document processing: OCR, PDF parsing, table extraction
Session 6 — Advanced RAG Patterns
Topics Covered – Hybrid search: dense + sparse (BM25) – Re-ranking mechanisms – Hypothetical Document Embeddings (HyDE) – Query decomposition, multi-query retrieval – Graph RAG: knowledge graphs + entity linking – RAG Evaluation: RAGAS, ARES frameworks
DAY 4 — Agentic Frameworks & Multi-Agent Systems
Duration: 6 Hours | Level: Advanced
Build production-grade AI pipelines using LangChain, LangGraph, AutoGen, and CrewAI.
Session 1 — LangChain
Topics Covered – LCEL (LangChain Expression Language) and chains – Agents and tools – Memory management in chains – Callbacks and tracing – Custom tool development
Session 2 — LangGraph for Stateful Workflows
Topics Covered – Graph-based stateful workflows – State machines and conditional routing – Checkpointing and persistence – Human-in-the-Loop (HITL) integration
Session 3 — AutoGen for Multi-Agent Systems
Topics Covered – Conversable agents and group chat – Multi-agent orchestration – Code execution agents – Human-in-the-loop patterns
Session 4 — CrewAI for Role-Based Agent Teams
Topics Covered – Role-based agent design – Task assignment and delegation – Crew orchestration patterns – Framework comparison: LangChain vs LangGraph vs AutoGen vs CrewAI
Session 5 — Multi-Agent Architecture & Communication
Topics Covered – Centralized vs Decentralized control – Communication protocols: message passing, pub-sub, event-driven – Agent routing and semantic routing – Fallback mechanisms and error handling
Session 6 — Hierarchical Agents & Human-in-the-Loop
Topics Covered – Manager-Worker patterns – Task decomposition (RACI matrix) – Collaborative vs competitive agents – Approval workflows and human feedback integration – Reflection agents and critique-refine loops
DAY 5 — Fine-Tuning, Multimodal AI & Agent Memory
Duration: 6 Hours | Level: Advanced
Customize AI models for your domain, process documents and audio visually, and build persistent agent memory.
Session 1 — Fine-Tuning Fundamentals
Topics Covered – When to fine-tune vs RAG vs prompt engineering – Full fine-tuning vs PEFT – Data requirements: JSONL format, train/validation split – Quality over quantity principle
Session 2 — Parameter-Efficient Fine-Tuning (PEFT)
Topics Covered – LoRA (Low-Rank Adaptation) — how it works – QLoRA (Quantized LoRA) — memory-efficient training – Adapter layers and Prefix tuning – Instruction tuning and SFT (Supervised Fine-Tuning)
Session 3 — Agent Memory & Cross-Session Persistence
Topics Covered – Multi-tier memory architecture – Short-term: context window and conversation history – Long-term: vector stores and database storage – Episodic memory: event sequence storage – Semantic memory: fact storage, knowledge graph integration – State management: SQLite, Redis, PostgreSQL – State checkpointing, privacy considerations
DAY 6 — Safety, Evaluation, MCP & Observability
Duration: 6 Hours | Level: Advanced
Deploy AI responsibly — guardrails, evaluation frameworks, the Model Context Protocol, and production monitoring.
Session 1 — AI Safety & Content Moderation
Topics Covered – AI Safety Principles: alignment, robustness, transparency – Content Moderation: toxicity detection, PII detection – Input validation and sanitization: length limits, format checks – Output validation: factuality verification, hallucination detection – Constitutional AI and value alignment
Session 2 — Prompt Injection Defense & Hallucination Mitigation
Topics Covered – Understanding prompt injection attacks – Delimiter-based defenses – Instruction hierarchy and system message hardening – Jailbreak prevention: refusal training, multi-layer defenses – Hallucination Mitigation: Chain-of-Verification (CoVe), confidence scoring, citation requirements
Session 3 — Evaluation Frameworks
Topics Covered – Evaluation Framework Design: metric selection, benchmark creation – Offline evaluation: automated metrics (BLEU, ROUGE, F1, embedding similarity) – Online evaluation: A/B testing, canary deployment, champion-challenger – LLM-as-Judge: criteria-based evaluation, bias considerations – RAG-specific: context relevance, answer faithfulness, RAGAS and ARES – Agent evaluation: task completion rate, tool usage accuracy
Session 4 — Observability & Monitoring
Topics Covered – Observability fundamentals: metrics, logs, traces – Distributed tracing for AI pipelines – LangSmith platform: trace visualization, dataset management, prompt playground – Grafana dashboards: metrics visualization and alerting – Cost tracking, latency analysis, token usage monitoring
Session 5 — Model Context Protocol (MCP)
Topics Covered – MCP protocol: architecture, MCP vs traditional APIs – MCP Server Development: tool registration, schema definition – MCP Client Integration: tool discovery, error handling – LLM Connectors: connecting LLMs to external systems – MCP Security: authentication, permission model, rate limiting – MCP Deployment: containerization, orchestration, monitoring
Session 6 — Red-Teaming & Responsible AI
Topics Covered – Red-teaming methodology for AI systems – Adversarial testing and penetration testing – OWASP Top 10 for LLMs – Human oversight: approval workflows, audit logging, escalation – Safety classifiers: multi-stage filtering – Building audit trails for compliance and governance
DAY 7 — Real-World Projects & Capstone
Duration: 6 Hours | Level: Production
Build 3 complete AI systems across different domains — from design to deployment. Portfolio-ready projects you can showcase immediately.
PROJECT 1 — Loan Application & Processing Agent
Domain: Banking & Financial Services
Type: Agentic AI Frameworks: LangGraph (7-node workflow)
Problem:
Manual loan processing takes 5–7 days, involves 28+ processors, and handles only 600 loans per month with high error rates and poor customer experience.
What You Build:
An end-to-end autonomous loan processing agent — from application submission to disbursement — with no manual intervention for standard cases.
Workflow Nodes:
Document intake + OCR extraction
ID authenticity verification
Credit score analysis
Debt-to-income ratio calculation
Property / collateral valuation
Risk scoring + interest rate determination
Auto-approve / escalate to human / disburse
Agentic Capabilities:
Multi-step autonomous decision making
Human-in-the-loop escalation for edge cases
Document verification using Vision AI
Real-time status updates to applicant
Tech Stack:
LangGraph, GPT-4o, Qdrant, Unstructured.io (OCR), e-signature API, Streamlit
Real Benchmark:
HDFC Bank > Loan approval: 5 days → 2 minutes | Capacity: 600 → 15,000 loans/month (25x) | Default accuracy: 78% → 94%
PROJECT 2 — Customer Support Chatbot
Domain: E-commerce / Customer Service
Type: Multi-Agent + MCP Frameworks: LangGraph + MCP
Problem:
Support costs $1.3 trillion globally. Average wait time is 11 minutes, causing 75% customer abandonment. Agents spend 60% of time on repetitive tasks. Traditional chatbots fail at multi-step problems and cannot take real actions like processing refunds. Support is unavailable 16 hours/day for most businesses.
What You Build:
An autonomous multi-agent support system handling 70% of queries end-to-end — checking orders, processing refunds, updating records, and escalating only when needed.
Multi-Agent System:
Triage Agent — classifies intent, routes to correct agent
Order Management Agent — checks order status, tracking, delivery
Product Info Agent — answers product queries, availability, specs
Billing Agent — handles payments, invoices, refunds
Technical Support Agent — troubleshooting, step-by-step guidance
Escalation Agent — hands off to human with full context
MCP Integration:
Centralized tool registry connecting all business systems — order database, inventory system, payment processor, email service, ticket creation
Decision Logic:
Business rules engine for refunds, returns, and escalation thresholds
Tech Stack:
LangGraph, GPT-4o-mini, Weaviate, MCP (business system connectors), WebSocket, Streamlit
Key Metrics:
70% queries resolved end-to-end | Response time < 30 seconds | 24/7 availability | Support cost reduced by 60%+
PROJECT 3 — Lead Scoring Agent
Domain: Sales & Marketing
Type: Agentic AI + Web Search Frameworks: CrewAI + Web Intelligence
Problem:
Sales teams waste 60% of their time on unqualified leads — contacting 100 prospects to close just 3 deals. Manual research takes 15–30 minutes per lead with no systematic qualification process, resulting in $50,000+ wasted effort per sales rep annually. There is no data-driven way to prioritize high-probability prospects.
What You Build:
An autonomous lead qualification agent that researches, scores, and prioritizes prospects — giving sales teams a ranked list of high-conversion leads with personalized talking points ready to go.
Agentic Workflow:
Research Agent — scrapes company website, LinkedIn, news, funding data, tech stack
Analysis Agent — matches lead against Ideal Customer Profile (ICP)
Contact Agent — identifies decision-makers and generates contact information
Scoring Agent — scores lead 0–100 based on conversion probability
Data Sources:
Web search, company websites, LinkedIn, tech stack databases, funding databases, news APIs
Scoring Model Inputs:
ICP matching (industry, company size, revenue range)
Buying signals (recent funding, hiring trends, tech adoption)
Engagement signals (website visits, content downloads)
Decision-maker accessibility
Output Per Lead:
Lead score (0–100) with reasoning
Decision-maker contacts and roles
Personalized talking points tailored to company context
Recommended outreach timing and channel
Tech Stack:
CrewAI, GPT-4o, Tavily (web search), Qdrant, LinkedIn API, Streamlit
Key Metrics:
15–30 min manual research → 2 min automated | Sales team focuses only on top 20% leads | Conversion rate improved 3–5x
🚀 Ready to Become an Expert in Generative AI & Agentic AI?
Enroll Now and Secure Your AI Future
Training Supports & Benefits
- Learn from the World’s Best Faculty & Industry Experts
- Learn with fun Hands-on Exercises & Assignments
- Participate in Hackathons & Group Activities
- Dedicated Faculty
- 9AM to 6 PM Support
- Participate in Hackathons & Group Activities
- Resume Building & Mock Interview Preparation.
- We offer personalized access to our Learning Management System (LMS).
- Moc Interview Practise + Real time Test
Our Student Placed in...
Students
Reviews
-
DV analytics is One of the best Institutes, giving excellent training and placement with in 6 months duration, The comprehensive curriculum, expert instructors, and hands-on projects make it an unparalleled learning experience. Plus, their commitment to placements ensures you're not just trained; you're career-ready!
Gayathri S
Brings out the best in an employee. Provides multiple opportunities and avenues to excel. The management has been supportive of new ideas therein enabling a diverse learning curve. The overall culture of the organization is grounded and the fostering in a pleasant manner. The people in charge have set the standards high by practicing a hands on approach. Highly recommended for budding professionals to explore DV Analytics as a career option.
Vivian Peter
DV Institute is a place which you can look up for carrier change and also personally for me it's a course content and delivery is all what let you into a good profile as Data Analyst, DV is one of the India's best institute as many training institute and online platform are available these day but a content is not been well aligned in most of these places as my personal experience you will end up with no results after working hard, however when it's comes to DV content is well aligned with industry requirements and assignments are been designed in such a fashion like if you practice those no-one, can stop you bagging a very good offer,as it's all about developing a skill set and don't worry at all about anything if you do your part. DV will always been supportive to every student.
raina goswami
DV Analytics Training Institute boasts a friendly work culture that fosters collaboration. The training environment is conducive to effective learning, with supportive instructors and a management team that genuinely cares about your success. A fantastic place to grow your skills and build a solid foundation for a successful career.
Prasantika Mohapatra
I'm a student at DV Analytics right now, and I couldn't be more satisfied. I wholeheartedly endorse DV Analytics to anyone wishing to advance their data analytics skills, whether they are novices or seasoned professionals. The course material is interesting and applicable. Deb Sir is a true Data Scientist expert, with a wealth of knowledge and experience in the field. Thank you, DV Analytics!
ajit
Best institute for learning data science for your career opportunities. I have earned a high package job thank you dv analytic
Lokeshwari Loki
-
Certainly! DV Analytics stands out as the best data science training institute with placement support. The comprehensive curriculum, expert trainers, and hands-on projects provide a robust learning experience. The institute's emphasis on real-world skills and industry connections ensures students are well-prepared for the job market. The dedicated support staff and effective placement assistance make it an ideal choice for aspiring data scientists. I am grateful for my experience there, and I am now confidently pursuing a career in data science, all thanks to DV Analytics.
Sri Rangam
I am a student of DV Analytics. And I m not from IT background but because of the #faculties and #Dev sir I feel I can be a data scientist nd I will definitely achieve my goals. Thank you 😊 #DVAnalytics
Supriya Mona
Any one who wants to get into the field of Data can definitely check this place. At this place you wont just learn a lot, the placement support DV gives its students is insanely strong. Deb sir and Venky sir are genius in their field and to learn from them was just amazing experience.
Vijith Visweswaran
This is the best career decision I have made till date to join in DV Analytics. Dev sir explanation is top notch. He covers a lot of content in very short time while making sure it is easy to understand. Assignments helped me to get deeper understanding of the concepts explained in class. Materials and recording sessions are to the point for quick revision as well as to clear our doubts on our own. This is my experience till now. Looking forward to update it on curriculum and placements.
Manideep Kasina
It's the Best institute with awesome and enthusiastic mentors.I was searching for the training institute for long time and got this. If you want to go for Data scientist/Data Analyst course must join DV.
Priyanshi
Good institute Well experienced faculties. It is the most rated data science training institute in Bangalore. They have a great team of teachers. Appreciate the fact that doubt clearing sessions are conducted regularly. Dv analytics is also great platform to solve queries.
tarun somisetty
-
Highly recommended to all the people who believe sky is the limit and sees themselves succeeding as well as growing in life. On the basis of my personal experience, DV is one the best training institute. The way all the programs has been designed, be it mock interviews, classes or assignments, all these makes you ready for the competitive world out there. Also the faculty members are way too supportive and motives you at each and every step.
kritika raina
DV Analytics is not just an institute, It is a temple of knowledge. The people there are so down to earth and supportive, They are one of the best institute that I have been to. The entire training was a fun session with lots of learning and creativity. Their team is very good and they have excellent people in the organization who supports us every step of our journey. The student mentors support us to make sure that we complete the assignment on time. The placement team supports us until we get placed. Every person in the organization has supported us in some way or the other. Thanks DV for shaping my future.
Shashank Nr
It's a best institute to start a carrier in Data science, trust me you have the best teachers who are teaching here. "DV Analytics Training Institute" is a second home for me. I am a beginner to this field but before coming here i thought that this may be difficult for me, but no problem every path is hard before walking into it. Ask Dev sir for the guidance, I am 100% sure that he will guide you throughout your entire journey of Data Science. Thanks to DV analytics for providing me a nice platform where i am feeling much confident.
raj pahan
DV Analytics Training Institute boasts a friendly work culture that fosters collaboration. The training environment is conducive to effective learning, with supportive instructors and a management team that genuinely cares about your CAREER
sahal roshan
On the basis of my personal experience, DV is one the best Data Science training institute in India. I really appreciate that the team of Dv Analytics take care of Each and every student from their joining into Dv to getting success in his/her life.
Suryakanta lenka
Dv analytics is an excellent training institute which has a very good job placement track record.
Rᴇᴠᴀᴛʜʏ ᴘ s
-
DV is best choice if you are deciding to build Data Science as a professional career. The best thing about DV is the Mentor- Mentee Strategy adopted by them for hand holding of each student till they get Placement. Further, they conduct various Trainings/ Workshops by Industry Experts who helps students to understand the use of various Data Science Tools in real world.
Duryadha Sethi
If someone wants to build his career in Data Science field than DV Analytics is the best place for this. Faculties are best in their respective subjects and specially Dev sir , he is teacher cum guide for us . 👍 Best place to learn.
Rahul Ghorpade
Dv analytics is a very good training institute with very good trainers and has helped my daughter secure a nice job in the corporate company.
Sunitha Kumari B
It’s a great place to learn and make a carrier in data science. The atmosphere of this institute is very good I am glad that I took this decision..to all people those are looking to make a carrier in this field must join DV Analytics 👍🏻
Payal Udhwani
Best Institute to have a great training and learning experience of DATA SCIENCE. Thanks to all the support staffs and specially Dev Sir for grooming me to be in IT sector from a NON-IT background. Before joining DV I really have negative thoughts to join IT sector but now it seems to be easier for me due to the best guidance by Dev Sir.
Tophanranjan Khuntia
DV Analytics is the best place were one can transform their career in Data Science . Thanks to DV Analytics ... They have a well planned approach in their training and course content so that each student is made sure that gets individual attention and guidance till their placements in this institute.
silpa Ajithkumar
























SINCE 2010