1) Output Type: Regression trees yield continuous numerical predictions (e.g., price, temperature, age). Classification trees output discrete categories (e.g., ‘Yes/No’, ‘Spam/Not Spam’, ‘Pass/Fail’).
2) Splitting Criterion: Regression trees choose node splits by minimizing an error metric such as Mean Squared Error (MSE). Classification trees use measures like the Gini Index or Entropy, choosing splits that maximize the class purity of the resulting nodes.
While both are decision tree algorithms, they serve distinct prediction purposes.