Probability & Stats 3: Markov Chains & Stochastic Processes

Course Description

In this third and final series on Probability and Statistics, Michel van Biezen introduces Markov chains and stochastic processes and shows how they can be used to predict the probability of future outcomes.
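
As a rough illustration of the idea behind the course (the numbers below are invented for this page and are not taken from the lectures), a Markov chain is described by a transition matrix whose rows sum to 1; multiplying the current state distribution by that matrix gives the predicted distribution one step into the future. A minimal Python sketch:

    import numpy as np

    # Hypothetical 2-state transition matrix (each row sums to 1):
    # from state 0: stay with prob 0.9, move to state 1 with prob 0.1
    # from state 1: move to state 0 with prob 0.5, stay with prob 0.5
    P = np.array([[0.9, 0.1],
                  [0.5, 0.5]])

    d0 = np.array([1.0, 0.0])  # start in state 0 with certainty
    d1 = d0 @ P                # predicted distribution after one step
    print(d1)                  # [0.9 0.1]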

Visual representation of a Markov chain: a random process that undergoes transitions from one state to another on a state space.

Video Lectures & Study Materials

Visit the official course website for more study materials: http://www.ilectureonline.com/lectures/subject/MATH/18/164

#   Lecture
1   What are Markov Chains: An Introduction
2   Markov Chains: An Introduction (Another Method)
3   Why Are Markov Chains Called "Markov Chains"?
4   Another Way to Calculate the Markov Chains
5   What Happens if the Markov Chain Continues?
6   Markov Chain Applied to Market Penetration
7   Power of the Probability Matrix
8   What is a Stochastic Matrix?
9   What is a Regular Matrix?
10  Regular Markov Chain
11  How to Check for a Stable Distribution Matrix
12  How to Find a Stable 2x2 Matrix - Ex. 1
13  How to Find a Stable 2x2 Matrix - Ex. 2
14  How to Find a Stable 2x2 Matrix - Ex. 3
15  How to Find a Stable 3x3 Matrix
16  Application Problem #1, Charity Contributions
17  Application Problem #2, Grocery Stores
18  Application Problem #3, Brand Loyalty
19  Absorbing Markov Chains - Definition 1
20  Absorbing Markov Chains - Definition 2
21  Absorbing Markov Chains - Example 1
22  Absorbing Markov Chains - Example 2
23  Absorbing and Non-Absorbing Markov Chain
24  Absorbing Markov Chain in Standard Form
25  Absorbing Markov Chain: Stable Matrix=?
26  Absorbing Markov Chain: Stable Matrix=? Ex. 1
27  Absorbing Markov Chain: Stable Matrix=? Ex. 2
28  Absorbing Markov Chain: Stable Distribution Matrix I
29  Absorbing Markov Chain: Stable Distribution Matrix II
30  Basics of Solving Markov Chains
31  Powers of a Transition Matrix
32  Finding Stable State Matrix
33  What is an Absorbing Markov Chain
34  Finding the Stable State Matrix
35  Finding the Stable State & Transition Matrices
36  Absorbing Markov Chain: Standard Form - Ex.
37  Absorbing Markov Chain: Changing to Standard Form
38  Absorbing Markov Chain: Standard Form - Ex.
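
Several of the lectures above (for example, "Power of the Probability Matrix" and "Finding Stable State Matrix") revolve around raising the transition matrix to higher and higher powers until it stops changing, at which point its rows give the stable (stationary) distribution. A minimal Python sketch of that idea, again with made-up probabilities rather than numbers from the course:

    import numpy as np

    # Hypothetical regular 2x2 stochastic matrix (each row sums to 1).
    P = np.array([[0.8, 0.2],
                  [0.3, 0.7]])

    # Repeatedly square the matrix; for a regular chain the rows of P^n
    # converge to the stable distribution.
    Pn = P.copy()
    for _ in range(20):
        Pn = Pn @ Pn

    print(Pn)  # both rows approach [0.6, 0.4]

    # Cross-check by hand: solving s P = s with s0 + s1 = 1 gives
    # s0 = 0.3 / (0.2 + 0.3) = 0.6 and s1 = 0.2 / (0.2 + 0.3) = 0.4.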

Comments

Displaying 1 comment:

melissa V wrote 2 years ago:
Thanks very much. This is the best explanation on the net!
Very clear and nice examples.


Disclaimer:
CosmoLearning is promoting these materials solely for nonprofit educational purposes, and to recognize the contributions made by Michel van Biezen (iLecturesOnline) to online education. We do not host or upload any copyrighted materials, including videos hosted on video websites like YouTube*, except with explicit permission from the author(s). All intellectual property rights are reserved to iLecturesOnline and the involved parties. CosmoLearning is not endorsed by iLecturesOnline, and we are not affiliated with them, unless otherwise specified. Any questions, claims or concerns regarding this content should be directed to its creator(s).

*If any embedded videos constitute copyright infringement, we strongly recommend contacting the website hosts directly to have such videos taken down. Once removed, these videos will no longer be playable on CosmoLearning or other websites.