{"id":4392,"date":"2025-10-30T12:24:35","date_gmt":"2025-10-30T12:24:35","guid":{"rendered":"https:\/\/172-234-197-23.ip.linodeusercontent.com\/?p=4392"},"modified":"2025-10-30T13:53:53","modified_gmt":"2025-10-30T13:53:53","slug":"hmm-viterbi-algorithm-a-clear-step-by-step-explanation","status":"publish","type":"post","link":"https:\/\/172-234-197-23.ip.linodeusercontent.com\/?p=4392","title":{"rendered":"HMM Viterbi Algorithm: A Clear, Step-by-Step Explanation"},"content":{"rendered":"\n<hr class=\"wp-block-separator has-alpha-channel-opacity\"\/>\n\n\n\n<figure class=\"wp-block-embed is-type-wp-embed is-provider-spectrcyde wp-block-embed-spectrcyde\"><div class=\"wp-block-embed__wrapper\">\n<blockquote class=\"wp-embedded-content\" data-secret=\"bK6bH9kvCy\"><a href=\"https:\/\/172-234-197-23.ip.linodeusercontent.com\/?page_id=4407\">Viterbi Decoding Algorithm: Complete Technical Explanation<\/a><\/blockquote>\n<\/div><\/figure>\n\n\n\n<p><strong>What is the Viterbi Algorithm?<\/strong><\/p>\n\n\n\n<p>The 
<strong>Viterbi algorithm<\/strong> is a <strong>dynamic programming<\/strong> method to find the <strong>most likely sequence of hidden states<\/strong> (called the <strong>Viterbi path<\/strong>) in a <strong>Hidden Markov Model (HMM)<\/strong> given an observation sequence.<\/p>\n\n\n\n<p>It\u2019s used in your <strong>inner speech decoding paper<\/strong> to convert noisy RF-inferred neural features into a <strong>coherent word sequence<\/strong>.<\/p>\n\n\n\n<hr class=\"wp-block-separator has-alpha-channel-opacity\"\/>\n\n\n\n<h2 class=\"wp-block-heading\"><strong>HMM Components (in Your Paper)<\/strong><\/h2>\n\n\n\n<figure class=\"wp-block-table\"><table class=\"has-fixed-layout\"><thead><tr><th>Component<\/th><th>Meaning<\/th><\/tr><\/thead><tbody><tr><td><strong>States<\/strong> $ w_t $<\/td><td>Words: <code>tango<\/code>, <code>foxtrot<\/code>, <code>bravo<\/code>, etc.<\/td><\/tr><tr><td><strong>Observations<\/strong> $ x_t $<\/td><td>RF-inferred frame features (Gaussian noise)<\/td><\/tr><tr><td><strong>Emission prob.<\/strong> $ p(x_t | w_t) $<\/td><td>$ \\mathcal{N}(x_t; \\mu_w, \\Sigma) $<\/td><\/tr><tr><td><strong>Transition prob.<\/strong> $ p(w_t | w_{t-1}) $<\/td><td>Bigram or GPT-2 prior<\/td><\/tr><tr><td><strong>Goal<\/strong><\/td><td>Find $ \\arg\\max_{w_{1:T}} P(w_{1:T} | x_{1:T}) $<\/td><\/tr><\/tbody><\/table><\/figure>\n\n\n\n<hr class=\"wp-block-separator has-alpha-channel-opacity\"\/>\n\n\n\n<h2 class=\"wp-block-heading\"><strong>Why Viterbi?<\/strong><\/h2>\n\n\n\n<p>Naive approach: Try all $ N^T $ word sequences \u2192 <strong>exponential<\/strong>.<\/p>\n\n\n\n<p>Viterbi: <strong>Polynomial<\/strong> $ O(T N^2) $ using <strong>dynamic programming<\/strong>.<\/p>\n\n\n\n<hr class=\"wp-block-separator has-alpha-channel-opacity\"\/>\n\n\n\n<h2 class=\"wp-block-heading\"><strong>Viterbi Step-by-Step (with Example)<\/strong><\/h2>\n\n\n\n<p>Let\u2019s decode:<br><strong>True sequence<\/strong>: <code>tango foxtrot 
bravo<\/code><br><strong>Observations<\/strong>: $ x_1, x_2, \\dots, x_{15} $ (5 frames per word)<\/p>\n\n\n\n<h3 class=\"wp-block-heading\"><strong>Step 1: Initialization<\/strong> ($ t = 1 $)<\/h3>\n\n\n\n<p>For each state $ w $:<br>$$<br>V_1(w) = p(w) \\cdot p(x_1 | w)<br>$$<br>$$<br>\\psi_1(w) = 0<br>$$<\/p>\n\n\n\n<blockquote class=\"wp-block-quote is-layout-flow wp-block-quote-is-layout-flow\">\n<p>$ V_t(w) $ = max probability of being in state $ w $ at time $ t $<br>$ \\psi_t(w) $ = previous state that leads to $ w $<\/p>\n<\/blockquote>\n\n\n\n<hr class=\"wp-block-separator has-alpha-channel-opacity\"\/>\n\n\n\n<h3 class=\"wp-block-heading\"><strong>Step 2: Recursion<\/strong> ($ t = 2 $ to $ T $)<\/h3>\n\n\n\n<p>For each state $ w $ at time $ t $:<br>$$<br>V_t(w) = \\max_{w'} \\left[ V_{t-1}(w') \\cdot p(w | w') \\cdot p(x_t | w) \\right]<br>$$<br>$$<br>\\psi_t(w) = \\arg\\max_{w'} \\left[ V_{t-1}(w') \\cdot p(w | w') \\right]<br>$$<\/p>\n\n\n\n<blockquote class=\"wp-block-quote is-layout-flow wp-block-quote-is-layout-flow\">\n<p><strong>Key Insight<\/strong>: At each step, <strong>keep only the best path<\/strong> to each state.<\/p>\n<\/blockquote>\n\n\n\n<hr class=\"wp-block-separator has-alpha-channel-opacity\"\/>\n\n\n\n<h3 class=\"wp-block-heading\"><strong>Step 3: Termination<\/strong><\/h3>\n\n\n\n<p>Best path probability:<br>$$<br>P^* = \\max_w V_T(w)<br>$$<br>Final state:<br>$$<br>q_T^* = \\arg\\max_w V_T(w)<br>$$<\/p>\n\n\n\n<hr class=\"wp-block-separator has-alpha-channel-opacity\"\/>\n\n\n\n<h3 class=\"wp-block-heading\"><strong>Step 4: Backtracking<\/strong><\/h3>\n\n\n\n<p>$$<br>q_t^* = \\psi_{t+1}(q_{t+1}^*), \\quad t = T-1, \\dots, 1<br>$$<\/p>\n\n\n\n<p>\u2192 Recovers full sequence: <code>tango \u2192 foxtrot \u2192 bravo<\/code><\/p>\n\n\n\n<hr class=\"wp-block-separator has-alpha-channel-opacity\"\/>\n\n\n\n<h2 class=\"wp-block-heading\"><strong>Visual Example (Your Fig. 
1)<\/strong><\/h2>\n\n\n<div class=\"wp-block-image\">\n<figure class=\"aligncenter size-full\"><img data-opt-id=668914020  fetchpriority=\"high\" decoding=\"async\" width=\"478\" height=\"683\" src=\"https:\/\/ml6vmqguit1n.i.optimole.com\/w:auto\/h:auto\/q:mauto\/f:best\/https:\/\/172-234-197-23.ip.linodeusercontent.com\/wp-content\/uploads\/2025\/10\/image-56.png\" alt=\"\" class=\"wp-image-4394\" srcset=\"https:\/\/ml6vmqguit1n.i.optimole.com\/w:478\/h:683\/q:mauto\/f:best\/https:\/\/172-234-197-23.ip.linodeusercontent.com\/wp-content\/uploads\/2025\/10\/image-56.png 478w, https:\/\/ml6vmqguit1n.i.optimole.com\/w:210\/h:300\/q:mauto\/f:best\/https:\/\/172-234-197-23.ip.linodeusercontent.com\/wp-content\/uploads\/2025\/10\/image-56.png 210w\" sizes=\"(max-width: 478px) 100vw, 478px\" \/><\/figure>\n<\/div>\n\n\n<pre class=\"wp-block-code\"><code>graph TD\n    subgraph Frame 1-5\n        A1&#91;tango] --&gt; A2&#91;foxtrot]\n        A1 --&gt; A3&#91;bravo]\n    end\n    subgraph Frame 6-10\n        A2 --&gt; B1&#91;foxtrot]\n        A3 --&gt; B2&#91;quebec]\n    end\n    subgraph Frame 11-15\n        B1 --&gt; C1&#91;bravo]\n        B2 --&gt; C2&#91;charlie]\n    end\n\n    style A1 fill:#90EE90\n    style B1 fill:#90EE90\n    style C1 fill:#90EE90<\/code><\/pre>\n\n\n\n<ul class=\"wp-block-list\">\n<li><strong>Green path<\/strong> = Viterbi path (highest $ V_t $)<\/li>\n\n\n\n<li><strong>Red paths<\/strong> = pruned (lower probability)<\/li>\n<\/ul>\n\n\n\n<hr class=\"wp-block-separator has-alpha-channel-opacity\"\/>\n\n\n\n<h2 class=\"wp-block-heading\"><strong>Why Priors Help (Your Fig. 
2)<\/strong><\/h2>\n\n\n\n<figure class=\"wp-block-table\"><table class=\"has-fixed-layout\"><thead><tr><th>SNR<\/th><th>No Prior WER<\/th><th>Bigram WER<\/th><th>GPT-2 WER<\/th><\/tr><\/thead><tbody><tr><td>10 dB<\/td><td>2.8<\/td><td>1.6<\/td><td><strong>1.1<\/strong><\/td><\/tr><\/tbody><\/table><\/figure>\n\n\n\n<p><strong>Bigram prior<\/strong> $ p(\\text{foxtrot} | \\text{tango}) $ high \u2192 boosts $ V_t(\\text{foxtrot}) $<br><strong>No prior<\/strong> \u2192 treats all transitions equally \u2192 more errors<\/p>\n\n\n\n<hr class=\"wp-block-separator has-alpha-channel-opacity\"\/>\n\n\n\n<h2 class=\"wp-block-heading\"><strong>Viterbi in Your Code<\/strong><\/h2>\n\n\n\n<pre class=\"wp-block-code\"><code>def viterbi(obs, states, start_p, trans_p, emit_p):\n    T = len(obs)\n    N = len(states)\n    V = &#91;{}]\n    path = {}\n\n    # Init\n    for w in states:\n        V&#91;0]&#91;w] = start_p&#91;w] * emit_p(w, obs&#91;0])\n        path&#91;w] = &#91;w]\n\n    # Recursion\n    for t in range(1, T):\n        V.append({})\n        new_path = {}\n        for w in states:\n            (prob, state) = max(\n                (V&#91;t-1]&#91;w0] * trans_p(w0, w) * emit_p(w, obs&#91;t]), w0)\n                for w0 in states\n            )\n            V&#91;t]&#91;w] = prob\n            new_path&#91;w] = path&#91;state] + &#91;w]\n        path = new_path\n\n    # Termination\n    prob, state = max((V&#91;T-1]&#91;w], w) for w in states)\n    return path&#91;state]<\/code><\/pre>\n\n\n\n<hr class=\"wp-block-separator has-alpha-channel-opacity\"\/>\n\n\n\n<h2 class=\"wp-block-heading\"><strong>Key Takeaways<\/strong><\/h2>\n\n\n\n<figure class=\"wp-block-table\"><table class=\"has-fixed-layout\"><thead><tr><th>Concept<\/th><th>Your Paper<\/th><\/tr><\/thead><tbody><tr><td><strong>Dynamic Programming<\/strong><\/td><td>Avoids $ N^T $ search<\/td><\/tr><tr><td><strong>Transition Prior<\/strong><\/td><td>Bigram\/GPT-2 \u2192 lower 
WER<\/td><\/tr><tr><td><strong>Emission Model<\/strong><\/td><td>Gaussian on RF features<\/td><\/tr><tr><td><strong>Output<\/strong><\/td><td>Word sequence + posterior traces<\/td><\/tr><\/tbody><\/table><\/figure>\n\n\n\n<hr class=\"wp-block-separator has-alpha-channel-opacity\"\/>\n\n\n\n<h2 class=\"wp-block-heading\"><strong>LaTeX Figure<\/strong><\/h2>\n\n\n\n<pre class=\"wp-block-code\"><code>\\begin{figure}&#91;t]\n\\centering\n\\begin{mermaid}\ngraph LR\n    subgraph t=1\n        A&#91;tango] --&gt;|V=0.8| B&#91;tango]\n    end\n    subgraph t=2\n        B --&gt;|p(foxtrot\\|tango)| C&#91;foxtrot]\n        B --&gt;|p(bravo\\|tango)| D&#91;bravo]\n    end\n    style C fill:#90EE90\n    style D fill:#FFB6C1\n\\end{mermaid}\n\\caption{Viterbi path (green) vs pruned path (red). Transition prior favors \"foxtrot\" after \"tango\".}\n\\label{fig:viterbi}\n\\end{figure}<\/code><\/pre>\n\n\n\n<hr class=\"wp-block-separator has-alpha-channel-opacity\"\/>\n\n\n\n<p><strong>Bottom Line<\/strong>:<br>Viterbi is the <strong>workhorse<\/strong> that turns your noisy RF features into <strong>coherent inner speech<\/strong> \u2014 and <strong>priors make it robust<\/strong>.<\/p>\n\n\n\n<p><strong>Viterbi Algorithm for Hidden Markov Models (HMMs)<\/strong><\/p>\n\n\n\n<p>The Viterbi algorithm is a <strong>dynamic programming technique<\/strong> used to find the most probable sequence of hidden states (also called the &#8220;Viterbi path&#8221;) in a Hidden Markov Model (HMM), given a sequence of observed events. 
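<\/p>\n\n\n\n<p>Concretely, the algorithm solves (in the standard HMM notation defined below, with hidden states $ q_t $ and observations $ O_t $):<br>$$<br>q_{1:T}^* = \\arg\\max_{q_{1:T}} \\, \\pi_{q_1} \\, b_{q_1}(O_1) \\prod_{t=2}^{T} a_{q_{t-1} q_t} \\, b_{q_t}(O_t)<br>$$<br>a product of one initial term, $ T-1 $ transition terms, and $ T $ emission terms, maximized jointly over all state sequences.<\/p>\n\n\n\n<p>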
It is widely applied in fields like <strong>speech recognition<\/strong>, <strong>bioinformatics<\/strong>, and <strong>natural language processing<\/strong>.<\/p>\n\n\n\n<hr class=\"wp-block-separator has-alpha-channel-opacity\"\/>\n\n\n\n<h3 class=\"wp-block-heading\"><strong>Key Components of HMM<\/strong><\/h3>\n\n\n\n<ol class=\"wp-block-list\">\n<li><strong>States<\/strong>: The hidden states of the system (e.g., weather: sunny, rainy).<\/li>\n\n\n\n<li><strong>Observations<\/strong>: The visible outputs (e.g., activities: walking, shopping).<\/li>\n\n\n\n<li><strong>Transition Probabilities<\/strong>: The probability of moving from one state to another.<\/li>\n\n\n\n<li><strong>Emission Probabilities<\/strong>: The probability of an observation given a state.<\/li>\n\n\n\n<li><strong>Initial Probabilities<\/strong>: The probability of starting in a particular state.<\/li>\n<\/ol>\n\n\n\n<hr class=\"wp-block-separator has-alpha-channel-opacity\"\/>\n\n\n\n<h3 class=\"wp-block-heading\"><strong>Steps of the Viterbi Algorithm<\/strong><\/h3>\n\n\n\n<ol class=\"wp-block-list\">\n<li><strong>Initialization<\/strong>: Compute the probability of starting in each state and observing the first event:<br>$$<br>V_1(i) = \\pi_i \\cdot b_i(O_1)<br>$$<br>where $ \\pi_i $ is the initial probability of state $ i $ and $ b_i(O_1) $ is the emission probability of observing $ O_1 $ in state $ i $.<\/li>\n\n\n\n<li><strong>Recursion<\/strong>: For each subsequent observation, compute the probability of being in each state by considering all possible transitions from previous states:<br>$$<br>V_t(j) = \\max_i \\big( V_{t-1}(i) \\cdot a_{ij} \\big) \\cdot b_j(O_t)<br>$$<br>where $ a_{ij} $ is the transition probability from state $ i $ to state $ j $ and $ b_j(O_t) $ is the emission probability of observing $ O_t $ in state $ j $.<\/li>\n\n\n\n<li><strong>Backtracking<\/strong>: Trace back through the states that led to the maximum probabilities to reconstruct the most likely sequence of hidden states.<\/li>\n<\/ol>\n\n\n\n<hr class=\"wp-block-separator has-alpha-channel-opacity\"\/>\n\n\n\n<h3 class=\"wp-block-heading\"><strong>Output<\/strong><\/h3>\n\n\n\n<p>The algorithm outputs:<\/p>\n\n\n\n<ul class=\"wp-block-list\">\n<li>The <strong>most probable sequence of hidden states<\/strong> (the Viterbi path).<\/li>\n\n\n\n<li>The <strong>probability<\/strong> of this sequence.<\/li>\n<\/ul>\n\n\n\n<hr class=\"wp-block-separator has-alpha-channel-opacity\"\/>\n\n\n\n<h3 class=\"wp-block-heading\"><strong>Example<\/strong><\/h3>\n\n\n\n<p>Suppose we have:<\/p>\n\n\n\n<ul class=\"wp-block-list\">\n<li>States: {Sunny, Rainy}<\/li>\n\n\n\n<li>Observations: {Walk, Shop, Clean}<\/li>\n\n\n\n<li>Transition probabilities $ a_{ij} $, emission probabilities $ b_j(O_t) $, and initial probabilities $ \\pi_i $<\/li>\n<\/ul>\n\n\n\n<p>Given a sequence of observations (e.g., Walk \u2192 Shop \u2192 Clean), the Viterbi algorithm calculates the most likely sequence of weather states (e.g., Sunny \u2192 Rainy \u2192 Rainy).<\/p>\n\n\n\n<hr class=\"wp-block-separator has-alpha-channel-opacity\"\/>\n\n\n\n<p>This algorithm is efficient, with a time complexity of $ O(T \\cdot N^2) $, where $ T $ is the number of observations and $ N $ is the number of states. 
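<\/p>\n\n\n\n<p>As a sketch, the recurrence above can be written directly in index form; this is a minimal illustration of the $ O(T \\cdot N^2) $ loop structure using the weather example (the function and variable names are ours, and the probability tables are illustrative assumptions, not values fixed by this article):<\/p>\n\n\n\n
```python
def viterbi_path(pi, A, B, obs):
    """pi[i]: initial prob; A[i][j]: transition i -> j; B[j][k]: prob of observation k in state j."""
    N, T = len(pi), len(obs)
    V = [[0.0] * N for _ in range(T)]    # V[t][j], as in the recursion above
    back = [[0] * N for _ in range(T)]   # best predecessor of state j at time t
    for j in range(N):                   # initialization
        V[0][j] = pi[j] * B[j][obs[0]]
    for t in range(1, T):                # recursion: O(N^2) work per time step
        for j in range(N):
            best_i = max(range(N), key=lambda i: V[t - 1][i] * A[i][j])
            back[t][j] = best_i
            V[t][j] = V[t - 1][best_i] * A[best_i][j] * B[j][obs[t]]
    q = max(range(N), key=lambda j: V[T - 1][j])  # termination
    path = [q]
    for t in range(T - 1, 0, -1):        # backtracking
        q = back[t][q]
        path.append(q)
    return path[::-1], max(V[T - 1])

# Illustrative numbers: states 0 = Rainy, 1 = Sunny; observations 0 = Walk, 1 = Shop, 2 = Clean
pi = [0.6, 0.4]
A = [[0.7, 0.3], [0.4, 0.6]]
B = [[0.1, 0.4, 0.5], [0.6, 0.3, 0.1]]
path, prob = viterbi_path(pi, A, B, [0, 1, 2])  # path == [1, 0, 0], i.e. Sunny, Rainy, Rainy
```
\n\n\n\n<p>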
It ensures optimal results by avoiding redundant calculations through dynamic programming.<\/p>\n\n\n\n<hr class=\"wp-block-separator has-alpha-channel-opacity\"\/>\n\n\n\n<p>The&nbsp;<strong>Viterbi algorithm<\/strong>, introduced by Andrew Viterbi in 1967, is a dynamic programming technique used to find the most probable sequence of hidden states in a&nbsp;<strong>Hidden Markov Model (HMM)<\/strong>.<\/p>\n\n\n\n<h3 class=\"wp-block-heading\"><strong>Key Concepts<\/strong><\/h3>\n\n\n\n<p>The algorithm operates on the following components of an HMM:<\/p>\n\n\n\n<ul class=\"wp-block-list\">\n<li><strong>States<\/strong>: Hidden conditions that are not directly observable.<\/li>\n\n\n\n<li><strong>Observations<\/strong>: Observable events influenced by the hidden states.<\/li>\n\n\n\n<li><strong>Transition Probabilities<\/strong>: Probabilities of moving from one state to another.<\/li>\n\n\n\n<li><strong>Emission Probabilities<\/strong>: Probabilities of observing a particular event given a state.<\/li>\n\n\n\n<li><strong>Initial Probabilities<\/strong>: Probabilities of starting in a particular state.<\/li>\n<\/ul>\n\n\n\n<p>The Viterbi algorithm computes the most likely sequence of hidden states by iteratively calculating probabilities for each state at every time step and backtracking to determine the optimal path.<\/p>\n\n\n\n<h3 class=\"wp-block-heading\"><strong>Implementation Example<\/strong><\/h3>\n\n\n\n<p>Below is a Python implementation of the Viterbi algorithm for a simple HMM:<\/p>\n\n\n\n<pre 
class=\"wp-block-code\"><code>def viterbi(obs, states, start_p, trans_p, emit_p):\n   V = &#91;{}] # Table to store probabilities\n   path = {} # To store the most probable path\n   # Initialization step\n   for y in states:\n       V&#91;0]&#91;y] = start_p&#91;y] * emit_p&#91;y]&#91;obs&#91;0]]\n       path&#91;y] = &#91;y]\n   # Recursion step\n   for t in range(1, len(obs)):\n       V.append({})\n       newpath = {}\n       for y in states:\n           (prob, state) = max(\n               &#91;(V&#91;t - 1]&#91;y0] * trans_p&#91;y0]&#91;y] * emit_p&#91;y]&#91;obs&#91;t]], y0) for y0 in states]\n           )\n           V&#91;t]&#91;y] = prob\n           newpath&#91;y] = path&#91;state] + &#91;y]\n       path = newpath\n   # Termination step\n   (prob, state) = max(&#91;(V&#91;-1]&#91;y], y) for y in states])\n   return (prob, path&#91;state])\n# Example HMM parameters\nstates = ('Rainy', 'Sunny')\nobservations = ('Walk', 'Shop', 'Clean')\nstart_probability = {'Rainy': 0.6, 'Sunny': 0.4}\ntransition_probability = {\n   'Rainy': {'Rainy': 0.7, 'Sunny': 0.3},\n   'Sunny': {'Rainy': 0.4, 'Sunny': 0.6},\n}\nemission_probability = {\n   'Rainy': {'Walk': 0.1, 'Shop': 0.4, 'Clean': 0.5},\n   'Sunny': {'Walk': 0.6, 'Shop': 0.3, 'Clean': 0.1},\n}\n# Run the Viterbi algorithm\nresult = viterbi(observations, states, start_probability, transition_probability, emission_probability)\nprint(\"Most probable sequence:\", result&#91;1])\nprint(\"Probability of the sequence:\", result&#91;0])<\/code><\/pre>\n\n\n\n<p><strong>Output:<\/strong><\/p>\n\n\n\n<pre class=\"wp-block-code\"><code>Most probable sequence: &#91;'Sunny', 'Rainy', 'Rainy']\nProbability of the sequence: 0.01344<\/code><\/pre>\n\n\n\n<h3 class=\"wp-block-heading\"><strong>Explanation of the Output<\/strong><\/h3>\n\n\n\n<ul class=\"wp-block-list\">\n<li><strong>Most Probable Sequence<\/strong>: The sequence of hidden states (<em>Sunny<\/em>,&nbsp;<em>Rainy<\/em>,&nbsp;<em>Rainy<\/em>) that maximizes the probability of observing the given 
sequence (<em>Walk<\/em>,&nbsp;<em>Shop<\/em>,&nbsp;<em>Clean<\/em>).<\/li>\n\n\n\n<li><strong>Probability<\/strong>: The likelihood of this sequence occurring, calculated as&nbsp;<em>0.01344<\/em>.<\/li>\n<\/ul>\n\n\n\n<h3 class=\"wp-block-heading\"><strong>Applications<\/strong><\/h3>\n\n\n\n<ol class=\"wp-block-list\">\n<li><strong>Speech Recognition<\/strong>: Converts acoustic signals into text by identifying the most likely phoneme or word sequence.<\/li>\n\n\n\n<li><strong>Bioinformatics<\/strong>: Aligns DNA sequences, predicts protein structures, and identifies gene sequences.<\/li>\n\n\n\n<li><strong>Natural Language Processing<\/strong>: Used in tasks like part-of-speech tagging and named entity recognition.<\/li>\n\n\n\n<li><strong>Error Correction<\/strong>: Decodes noisy signals in digital communication systems.<\/li>\n<\/ol>\n\n\n\n<h3 class=\"wp-block-heading\"><strong>Considerations<\/strong><\/h3>\n\n\n\n<p>The algorithm can be optimized using techniques like&nbsp;<strong>logarithmic probabilities<\/strong>&nbsp;to avoid underflow,&nbsp;<strong>pruning<\/strong>&nbsp;to reduce state space, or&nbsp;<strong>beam search<\/strong>&nbsp;to track only the top paths. 
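<\/p>\n\n\n\n<p>A minimal sketch of the log-probability variant (using our own illustrative names and the same toy weather numbers as the example above, not code from a specific library): products in the recursion become sums of logs, so long observation sequences no longer underflow.<\/p>\n\n\n\n
```python
import math

def viterbi_log(pi, A, B, obs):
    """Viterbi in log space: max over sums of logs instead of products of probs."""
    def log(p):
        return math.log(p) if p > 0 else float("-inf")  # log(0) -> -inf

    N, T = len(pi), len(obs)
    V = [[log(pi[j]) + log(B[j][obs[0]]) for j in range(N)]]  # initialization
    back = []                                                 # backpointers per step
    for t in range(1, T):                                     # recursion: sums, not products
        ptr = [max(range(N), key=lambda i: V[-1][i] + log(A[i][j])) for j in range(N)]
        V.append([V[-1][ptr[j]] + log(A[ptr[j]][j]) + log(B[j][obs[t]]) for j in range(N)])
        back.append(ptr)
    q = max(range(N), key=lambda j: V[-1][j])                 # termination
    path = [q]
    for ptr in reversed(back):                                # backtracking
        q = ptr[q]
        path.append(q)
    return path[::-1], max(V[-1])

# Toy weather model: states 0 = Rainy, 1 = Sunny; observations 0 = Walk, 1 = Shop, 2 = Clean
pi = [0.6, 0.4]
A = [[0.7, 0.3], [0.4, 0.6]]
B = [[0.1, 0.4, 0.5], [0.6, 0.3, 0.1]]
path, logp = viterbi_log(pi, A, B, [0, 1, 2])  # same path as before; logp is a log-probability
```
\n\n\n\n<p>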
Its efficiency and accuracy make it a cornerstone in sequence analysis tasks.<\/p>\n\n\n\n<p><a href=\"https:\/\/www.cis.upenn.edu\/~cis2620\/notes\/Example-Viterbi-DNA.pdf\">viterbi_01.pptx<\/a><\/p>\n\n\n\n<figure class=\"wp-block-image size-large\"><img data-opt-id=1030830035  data-opt-src=\"https:\/\/ml6vmqguit1n.i.optimole.com\/w:333\/h:1024\/q:mauto\/f:best\/https:\/\/172-234-197-23.ip.linodeusercontent.com\/wp-content\/uploads\/2025\/10\/image-58.png\"  decoding=\"async\" width=\"333\" height=\"1024\" src=\"data:image/svg+xml,%3Csvg%20viewBox%3D%220%200%20333%201024%22%20width%3D%22333%22%20height%3D%221024%22%20xmlns%3D%22http%3A%2F%2Fwww.w3.org%2F2000%2Fsvg%22%3E%3Crect%20width%3D%22333%22%20height%3D%221024%22%20fill%3D%22transparent%22%2F%3E%3C%2Fsvg%3E\" alt=\"\" class=\"wp-image-4397\" old-srcset=\"https:\/\/ml6vmqguit1n.i.optimole.com\/w:333\/h:1024\/q:mauto\/f:best\/https:\/\/172-234-197-23.ip.linodeusercontent.com\/wp-content\/uploads\/2025\/10\/image-58.png 333w, https:\/\/ml6vmqguit1n.i.optimole.com\/w:98\/h:300\/q:mauto\/f:best\/https:\/\/172-234-197-23.ip.linodeusercontent.com\/wp-content\/uploads\/2025\/10\/image-58.png 98w, https:\/\/ml6vmqguit1n.i.optimole.com\/w:500\/h:1536\/q:mauto\/f:best\/https:\/\/172-234-197-23.ip.linodeusercontent.com\/wp-content\/uploads\/2025\/10\/image-58.png 500w, https:\/\/ml6vmqguit1n.i.optimole.com\/w:588\/h:1808\/q:mauto\/f:best\/https:\/\/172-234-197-23.ip.linodeusercontent.com\/wp-content\/uploads\/2025\/10\/image-58.png 588w\" \/><\/figure>\n\n\n\n<figure class=\"wp-block-image size-large\"><img data-opt-id=2006752339  data-opt-src=\"https:\/\/ml6vmqguit1n.i.optimole.com\/w:334\/h:1024\/q:mauto\/f:best\/https:\/\/172-234-197-23.ip.linodeusercontent.com\/wp-content\/uploads\/2025\/10\/image-59.png\"  decoding=\"async\" width=\"334\" height=\"1024\" 
src=\"data:image/svg+xml,%3Csvg%20viewBox%3D%220%200%20334%201024%22%20width%3D%22334%22%20height%3D%221024%22%20xmlns%3D%22http%3A%2F%2Fwww.w3.org%2F2000%2Fsvg%22%3E%3Crect%20width%3D%22334%22%20height%3D%221024%22%20fill%3D%22transparent%22%2F%3E%3C%2Fsvg%3E\" alt=\"\" class=\"wp-image-4398\" old-srcset=\"https:\/\/ml6vmqguit1n.i.optimole.com\/w:334\/h:1024\/q:mauto\/f:best\/https:\/\/172-234-197-23.ip.linodeusercontent.com\/wp-content\/uploads\/2025\/10\/image-59.png 334w, https:\/\/ml6vmqguit1n.i.optimole.com\/w:98\/h:300\/q:mauto\/f:best\/https:\/\/172-234-197-23.ip.linodeusercontent.com\/wp-content\/uploads\/2025\/10\/image-59.png 98w, https:\/\/ml6vmqguit1n.i.optimole.com\/w:501\/h:1536\/q:mauto\/f:best\/https:\/\/172-234-197-23.ip.linodeusercontent.com\/wp-content\/uploads\/2025\/10\/image-59.png 501w, https:\/\/ml6vmqguit1n.i.optimole.com\/w:590\/h:1808\/q:mauto\/f:best\/https:\/\/172-234-197-23.ip.linodeusercontent.com\/wp-content\/uploads\/2025\/10\/image-59.png 590w\" \/><\/figure>\n\n\n\n<figure class=\"wp-block-image size-large\"><img data-opt-id=2048400297  data-opt-src=\"https:\/\/ml6vmqguit1n.i.optimole.com\/w:331\/h:1024\/q:mauto\/f:best\/https:\/\/172-234-197-23.ip.linodeusercontent.com\/wp-content\/uploads\/2025\/10\/image-60.png\"  decoding=\"async\" width=\"331\" height=\"1024\" src=\"data:image/svg+xml,%3Csvg%20viewBox%3D%220%200%20331%201024%22%20width%3D%22331%22%20height%3D%221024%22%20xmlns%3D%22http%3A%2F%2Fwww.w3.org%2F2000%2Fsvg%22%3E%3Crect%20width%3D%22331%22%20height%3D%221024%22%20fill%3D%22transparent%22%2F%3E%3C%2Fsvg%3E\" alt=\"\" class=\"wp-image-4399\" old-srcset=\"https:\/\/ml6vmqguit1n.i.optimole.com\/w:331\/h:1024\/q:mauto\/f:best\/https:\/\/172-234-197-23.ip.linodeusercontent.com\/wp-content\/uploads\/2025\/10\/image-60.png 331w, https:\/\/ml6vmqguit1n.i.optimole.com\/w:97\/h:300\/q:mauto\/f:best\/https:\/\/172-234-197-23.ip.linodeusercontent.com\/wp-content\/uploads\/2025\/10\/image-60.png 97w, 
https:\/\/ml6vmqguit1n.i.optimole.com\/w:497\/h:1536\/q:mauto\/f:best\/https:\/\/172-234-197-23.ip.linodeusercontent.com\/wp-content\/uploads\/2025\/10\/image-60.png 497w, https:\/\/ml6vmqguit1n.i.optimole.com\/w:585\/h:1808\/q:mauto\/f:best\/https:\/\/172-234-197-23.ip.linodeusercontent.com\/wp-content\/uploads\/2025\/10\/image-60.png 585w\" \/><\/figure>\n\n\n\n<figure class=\"wp-block-image size-large\"><img data-opt-id=1931745263  data-opt-src=\"https:\/\/ml6vmqguit1n.i.optimole.com\/w:332\/h:1024\/q:mauto\/f:best\/https:\/\/172-234-197-23.ip.linodeusercontent.com\/wp-content\/uploads\/2025\/10\/image-61.png\"  decoding=\"async\" width=\"332\" height=\"1024\" src=\"data:image/svg+xml,%3Csvg%20viewBox%3D%220%200%20332%201024%22%20width%3D%22332%22%20height%3D%221024%22%20xmlns%3D%22http%3A%2F%2Fwww.w3.org%2F2000%2Fsvg%22%3E%3Crect%20width%3D%22332%22%20height%3D%221024%22%20fill%3D%22transparent%22%2F%3E%3C%2Fsvg%3E\" alt=\"\" class=\"wp-image-4400\" old-srcset=\"https:\/\/ml6vmqguit1n.i.optimole.com\/w:332\/h:1024\/q:mauto\/f:best\/https:\/\/172-234-197-23.ip.linodeusercontent.com\/wp-content\/uploads\/2025\/10\/image-61.png 332w, https:\/\/ml6vmqguit1n.i.optimole.com\/w:97\/h:300\/q:mauto\/f:best\/https:\/\/172-234-197-23.ip.linodeusercontent.com\/wp-content\/uploads\/2025\/10\/image-61.png 97w\" \/><\/figure>\n\n\n\n<figure class=\"wp-block-image size-large\"><img data-opt-id=1005073619  data-opt-src=\"https:\/\/ml6vmqguit1n.i.optimole.com\/w:444\/h:1024\/q:mauto\/f:best\/https:\/\/172-234-197-23.ip.linodeusercontent.com\/wp-content\/uploads\/2025\/10\/image-62.png\"  decoding=\"async\" width=\"444\" height=\"1024\" src=\"data:image/svg+xml,%3Csvg%20viewBox%3D%220%200%20444%201024%22%20width%3D%22444%22%20height%3D%221024%22%20xmlns%3D%22http%3A%2F%2Fwww.w3.org%2F2000%2Fsvg%22%3E%3Crect%20width%3D%22444%22%20height%3D%221024%22%20fill%3D%22transparent%22%2F%3E%3C%2Fsvg%3E\" alt=\"\" class=\"wp-image-4401\" 
old-srcset=\"https:\/\/ml6vmqguit1n.i.optimole.com\/w:444\/h:1024\/q:mauto\/f:best\/https:\/\/172-234-197-23.ip.linodeusercontent.com\/wp-content\/uploads\/2025\/10\/image-62.png 444w, https:\/\/ml6vmqguit1n.i.optimole.com\/w:130\/h:300\/q:mauto\/f:best\/https:\/\/172-234-197-23.ip.linodeusercontent.com\/wp-content\/uploads\/2025\/10\/image-62.png 130w, https:\/\/ml6vmqguit1n.i.optimole.com\/w:586\/h:1350\/q:mauto\/f:best\/https:\/\/172-234-197-23.ip.linodeusercontent.com\/wp-content\/uploads\/2025\/10\/image-62.png 586w\" \/><\/figure>\n\n\n\n<p><\/p>\n","protected":false},"excerpt":{"rendered":"<p>What is the Viterbi Algorithm? The Viterbi algorithm is a dynamic programming method to find the most likely sequence of hidden states (called the Viterbi path) in a Hidden Markov Model (HMM) given an observation sequence. 
It\u2019s used in your inner speech decoding paper to convert noisy RF-inferred neural features into a coherent word sequence.&hellip;&nbsp;<a href=\"https:\/\/172-234-197-23.ip.linodeusercontent.com\/?p=4392\" rel=\"bookmark\"><span class=\"screen-reader-text\">HMM Viterbi Algorithm: A Clear, Step-by-Step Explanation<\/span><\/a><\/p>\n","protected":false},"author":1,"featured_media":3991,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"neve_meta_sidebar":"","neve_meta_container":"","neve_meta_enable_content_width":"","neve_meta_content_width":0,"neve_meta_title_alignment":"","neve_meta_author_avatar":"","neve_post_elements_order":"","neve_meta_disable_header":"","neve_meta_disable_footer":"","neve_meta_disable_title":"","footnotes":""},"categories":[7],"tags":[],"class_list":["post-4392","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-the-truben-show"],"_links":{"self":[{"href":"https:\/\/172-234-197-23.ip.linodeusercontent.com\/index.php?rest_route=\/wp\/v2\/posts\/4392","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/172-234-197-23.ip.linodeusercontent.com\/index.php?rest_route=\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/172-234-197-23.ip.linodeusercontent.com\/index.php?rest_route=\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/172-234-197-23.ip.linodeusercontent.com\/index.php?rest_route=\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/172-234-197-23.ip.linodeusercontent.com\/index.php?rest_route=%2Fwp%2Fv2%2Fcomments&post=4392"}],"version-history":[{"count":3,"href":"https:\/\/172-234-197-23.ip.linodeusercontent.com\/index.php?rest_route=\/wp\/v2\/posts\/4392\/revisions"}],"predecessor-version":[{"id":4409,"href":"https:\/\/172-234-197-23.ip.linodeusercontent.com\/index.php?rest_route=\/wp\/v2\/posts\/4392\/revisions\/4409"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/172-234-197-2
3.ip.linodeusercontent.com\/index.php?rest_route=\/wp\/v2\/media\/3991"}],"wp:attachment":[{"href":"https:\/\/172-234-197-23.ip.linodeusercontent.com\/index.php?rest_route=%2Fwp%2Fv2%2Fmedia&parent=4392"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/172-234-197-23.ip.linodeusercontent.com\/index.php?rest_route=%2Fwp%2Fv2%2Fcategories&post=4392"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/172-234-197-23.ip.linodeusercontent.com\/index.php?rest_route=%2Fwp%2Fv2%2Ftags&post=4392"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}