04. Optimal Binary Search Tree
✅ Minimum Cost Optimal BST solution using Dynamic Programming, including optimized techniques for efficient lookup cost computation. Perfect for DSA and competitive programming.
The problem can be found at the following link: 🔗 Question Link
🧩 Problem Description
You are given a set of distinct keys in sorted order, represented by keys[]. Each key $k_i$ represents a data record accessed during search operations. You are also given a frequency array freq[], where freq[i] denotes how many times key $k_i$ is searched for.
The cost of accessing a key in a binary search tree is calculated by multiplying its access frequency by the level at which it appears in the tree. Therefore, different arrangements of keys in the BST result in different total search costs.
Your task is to calculate the minimum total search cost required to construct a binary search tree containing all the keys.
Note: Consider the root of the BST is at level 1.
📘 Examples
Example 1
Input: keys[] = [10, 12], freq[] = [34, 50]
Output: 118
Explanation: There can be two possible BSTs:
Tree I: 10 as root, 12 as right child → Cost = 34*1 + 50*2 = 134
Tree II: 12 as root, 10 as left child → Cost = 50*1 + 34*2 = 118
The minimum cost is 118.
🔒 Constraints
$1 \le \text{keys.size()} = \text{freq.size()} \le 100$
$1 \le \text{keys}[i], \text{freq}[i] \le 10^4$
✅ My Approach
The optimal approach uses Dynamic Programming with Prefix Sum Optimization to efficiently compute the minimum cost BST:
Dynamic Programming with Prefix Sum
Initialize DP Table:
Create a 2D DP table `dp[i][j]` representing the minimum cost to construct a BST with keys from index `i` to `j`.
Base case: `dp[i][i] = freq[i]` (a single key costs its frequency at level 1).
Precompute Prefix Sums:
Build a prefix sum array `ps[]` where `ps[i]` stores the cumulative sum of frequencies up to index `i-1`.
This allows O(1) range sum queries: `sum(i, j) = ps[j+1] - ps[i]`.
Fill DP Table:
Iterate over all possible subarray lengths `l` from 2 to n.
For each subarray `[i, j]`, try every key `r` in the range as the root.
The cost when key `r` is the root is `cost = dp[i][r-1] + dp[r+1][j] + sum(i, j)`.
The frequency sum represents the cost increment as every key in the range moves one level deeper.
Recurrence Relation:
`dp[i][j] = min(dp[i][r-1] + dp[r+1][j]) + sum(freq[i..j])` for all `r` in `[i, j]`: left subtree cost + right subtree cost + total frequency sum of the current range.
Return Result:
`dp[0][n-1]` contains the minimum cost of constructing a BST with all keys.
📝 Time and Auxiliary Space Complexity
Expected Time Complexity: O(n³), as we have three nested loops: one for subarray length (O(n)), one for starting position (O(n)), and one for choosing the root within the range (O(n)).
Expected Auxiliary Space Complexity: O(n²), as we use a 2D DP table of size n×n and a prefix sum array of size n+1.
🧑💻 Code (C++)
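A minimal C++ sketch of the prefix-sum DP described above (the function name `optimalBSTCost` is illustrative, not taken from the original repository):

```cpp
#include <bits/stdc++.h>
using namespace std;

// Minimum total search cost of a BST over sorted keys with the given
// access frequencies (keys themselves are not needed, only freq[]).
int optimalBSTCost(const vector<int>& freq) {
    int n = freq.size();
    // ps[i] = sum of freq[0..i-1], so sum(i..j) = ps[j+1] - ps[i] in O(1).
    vector<int> ps(n + 1, 0);
    for (int i = 0; i < n; i++) ps[i + 1] = ps[i] + freq[i];

    // dp[i][j] = minimum cost of a BST built from keys i..j.
    vector<vector<int>> dp(n, vector<int>(n, 0));
    for (int i = 0; i < n; i++) dp[i][i] = freq[i];  // single key at level 1

    for (int len = 2; len <= n; len++) {
        for (int i = 0; i + len - 1 < n; i++) {
            int j = i + len - 1;
            dp[i][j] = INT_MAX;
            for (int r = i; r <= j; r++) {            // try each key as root
                int left  = (r > i) ? dp[i][r - 1] : 0;
                int right = (r < j) ? dp[r + 1][j] : 0;
                dp[i][j] = min(dp[i][j], left + right);
            }
            dp[i][j] += ps[j + 1] - ps[i];            // all keys drop one level
        }
    }
    return dp[0][n - 1];
}
```

For Example 1 (`freq = {34, 50}`), this returns 118, matching the explanation above.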
⚡ View Alternative Approaches with Code and Analysis
📊 2️⃣ Memoization Approach
💡 Algorithm Steps:
Use recursive top-down approach with memoization for subproblem results.
For each range calculate cost by trying each node as root.
Store computed results in memo table to avoid recomputation.
Return the minimum cost among all possible root choices.
📝 Complexity Analysis:
Time: ⏱️ O(n³) - O(n²) ranges, each trying up to O(n) roots
Auxiliary Space: 💾 O(n²) - Memoization table and recursion stack
✅ Why This Approach?
Natural recursive structure matches problem definition
Easy to understand and debug recursively
Avoids redundant calculations through memoization
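The memoized top-down variant can be sketched as follows (the `ObstMemo` struct and its member names are illustrative):

```cpp
#include <bits/stdc++.h>
using namespace std;

// Top-down DP: solve(i, j) = best cost for keys i..j, cached in memo.
struct ObstMemo {
    vector<int> ps;                 // prefix sums of frequencies
    vector<vector<int>> memo;       // memo[i][j], -1 = not computed yet

    int solve(int i, int j) {
        if (i > j) return 0;                    // empty range costs nothing
        if (memo[i][j] != -1) return memo[i][j];
        int best = INT_MAX;
        for (int r = i; r <= j; r++)            // try each key as root
            best = min(best, solve(i, r - 1) + solve(r + 1, j));
        // Adding the range's frequency sum pushes every key one level deeper.
        return memo[i][j] = best + ps[j + 1] - ps[i];
    }

    int run(const vector<int>& freq) {
        int n = freq.size();
        ps.assign(n + 1, 0);
        for (int i = 0; i < n; i++) ps[i + 1] = ps[i] + freq[i];
        memo.assign(n, vector<int>(n, -1));
        return solve(0, n - 1);
    }
};
```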
📊 3️⃣ Space-Optimized DP
💡 Algorithm Steps:
Use bottom-up DP with optimized space for frequency sum calculation.
Precompute prefix sums to get range sum in O(1) time.
Build DP table for increasing subproblem sizes.
Optimize by reusing computed subproblem results efficiently.
📝 Complexity Analysis:
Time: ⏱️ O(n³) - Three loops for DP table construction
Auxiliary Space: 💾 O(n²) - Only DP table storage needed
✅ Why This Approach?
Iterative approach avoids recursion overhead
Gap-based iteration provides clear subproblem ordering
Compact code structure with inline sum calculation
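One way to realize this compact variant is to accumulate the range sum inline during the gap-based iteration, dropping the separate prefix array (a sketch; `obstInlineSum` is an illustrative name):

```cpp
#include <bits/stdc++.h>
using namespace std;

// Gap-based bottom-up DP with the range sum computed inline.
int obstInlineSum(const vector<int>& freq) {
    int n = freq.size();
    vector<vector<int>> dp(n, vector<int>(n, 0));
    for (int gap = 0; gap < n; gap++) {              // subproblem size - 1
        for (int i = 0, j = gap; j < n; i++, j++) {
            int sum = 0;
            for (int k = i; k <= j; k++) sum += freq[k];   // inline range sum
            int best = (i == j) ? 0 : INT_MAX;
            for (int r = i; r <= j && i != j; r++) { // try each root
                int left  = (r > i) ? dp[i][r - 1] : 0;
                int right = (r < j) ? dp[r + 1][j] : 0;
                best = min(best, left + right);
            }
            dp[i][j] = best + sum;
        }
    }
    return dp[0][n - 1];
}
```

The inline sum keeps the code self-contained at the cost of an extra O(n) pass per range, which does not change the overall O(n³) bound.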
📊 4️⃣ Knuth Optimization
💡 Algorithm Steps:
Apply Knuth's optimization to reduce search space for optimal root.
For each subproblem maintain the root index that gives minimum cost.
Use property that optimal root lies between roots of adjacent subproblems.
Narrow the root candidates for `[i, j]` to the interval `[root[i][j-1], root[i+1][j]]`, reducing the amortized work per subproblem from O(n) to O(1).
📝 Complexity Analysis:
Time: ⏱️ O(n²) - Optimized root search reduces complexity
Auxiliary Space: 💾 O(n²) - DP and root tracking tables
✅ Why This Approach?
Best asymptotic time complexity for this problem
Leverages optimal substructure property efficiently
Significant speedup for large input sizes
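A sketch of Knuth's optimization under the monotonicity property stated above (`obstKnuth` is an illustrative name; `root[i][j]` records the optimal root of range `[i, j]`):

```cpp
#include <bits/stdc++.h>
using namespace std;

// Knuth's optimization: the optimal root of [i, j] lies between the
// optimal roots of [i, j-1] and [i+1, j], so only that window is searched.
int obstKnuth(const vector<int>& freq) {
    int n = freq.size();
    vector<int> ps(n + 1, 0);
    for (int i = 0; i < n; i++) ps[i + 1] = ps[i] + freq[i];

    vector<vector<int>> dp(n, vector<int>(n, 0));
    vector<vector<int>> root(n, vector<int>(n, 0));
    for (int i = 0; i < n; i++) { dp[i][i] = freq[i]; root[i][i] = i; }

    for (int len = 2; len <= n; len++) {
        for (int i = 0; i + len - 1 < n; i++) {
            int j = i + len - 1;
            dp[i][j] = INT_MAX;
            // Restricted root window instead of the full range [i, j].
            for (int r = root[i][j - 1]; r <= root[i + 1][j]; r++) {
                int left  = (r > i) ? dp[i][r - 1] : 0;
                int right = (r < j) ? dp[r + 1][j] : 0;
                if (left + right < dp[i][j]) {
                    dp[i][j] = left + right;
                    root[i][j] = r;
                }
            }
            dp[i][j] += ps[j + 1] - ps[i];
        }
    }
    return dp[0][n - 1];
}
```

Summed along each diagonal of the table, the root windows telescope, which is what yields the O(n²) total running time.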
🆚 🔍 Comparison of Approaches
🚀 Approach
⏱️ Time Complexity
💾 Space Complexity
✅ Pros
⚠️ Cons
🏷️ Prefix Sum DP
🟡 O(n³)
🟢 O(n²)
🚀 Fast range sum queries
🐌 Cubic time complexity
🔍 Memoization
🟡 O(n³)
🟡 O(n²)
📖 Intuitive recursive logic
📚 Recursion stack overhead
📊 Space-Optimized DP
🟡 O(n³)
🟢 O(n²)
🎯 Clean iterative structure
🐌 Same time as basic DP
🔄 Knuth Optimization
🟢 O(n²)
🟢 O(n²)
⭐ Best time complexity
🔧 More complex implementation
🏆 Best Choice Recommendation
🎯 Scenario
🎖️ Recommended Approach
🔥 Performance Rating
🏅 Large input sizes
🥇 Knuth Optimization
★★★★★
📖 Learning/Understanding
🥈 Memoization
★★★★☆
🔧 Interview setting
🥉 Prefix Sum DP
★★★★☆
🎯 Balance simplicity/speed
🏅 Space-Optimized DP
★★★★☆
☕ Code (Java)
🐍 Code (Python)
🧠 Contribution and Support
For discussions, questions, or doubts related to this solution, feel free to connect on LinkedIn: 📬 Any Questions?. Let's make this learning journey more collaborative!
⭐ If you find this helpful, please give this repository a star! ⭐