# How can we draw an ROC curve for decision trees?

Normally we cannot draw an ROC curve for discrete classifiers like decision trees.
Am I right?
Is there any way to draw an ROC curve for decision trees?

If your classifier produces only factor outcomes (only labels), without scores, you can still draw an ROC curve. However, this ROC curve is only a single point. In ROC space, this point is $(x,y) = (\text{FPR}, \text{TPR})$, where $\text{FPR}$ is the false positive rate and $\text{TPR}$ is the true positive rate.

You can extend this point to look like an ROC curve by drawing a line from $(0,0)$ to your point, and from there to $(1,1)$. Thus you have a curve.
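The idea above can be sketched in a few lines of plain Python. The labels below are invented for illustration; the point is how a single $(\text{FPR}, \text{TPR})$ pair becomes a two-segment "curve":

```python
# Minimal sketch: turn a label-only classifier's predictions into a
# single ROC point, then a two-segment "curve" (0,0) -> (FPR,TPR) -> (1,1).
# The labels below are made up for illustration.

y_true = [1, 1, 1, 0, 0, 0, 1, 0]   # ground truth (1 = positive)
y_pred = [1, 0, 1, 0, 1, 0, 1, 0]   # hard label predictions

tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)

tpr = tp / (tp + fn)   # true positive rate  (y-coordinate)
fpr = fp / (fp + tn)   # false positive rate (x-coordinate)

# Two-segment "curve" through the single point:
curve = [(0.0, 0.0), (fpr, tpr), (1.0, 1.0)]
print(curve)  # -> [(0.0, 0.0), (0.25, 0.75), (1.0, 1.0)]
```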

However, for a decision tree it is easy to extend the label output to a numeric output. Note that when you predict with a decision tree, you go down from the root node to a leaf node, where you predict the majority class. If instead of that class you returned the proportion of each class in that leaf node, you would have a score for each class. Suppose that you have two classes $\text{T}$ and $\text{F}$, and your leaf node contains 10 instances of $\text{T}$ and 5 instances of $\text{F}$; then you can return a vector of scores: $(\text{score}_T, \text{score}_F) = ( \frac{\text{count}_T}{\text{count}_T + \text{count}_F}, \frac{\text{count}_F}{\text{count}_T + \text{count}_F}) = (10/15, 5/15) \approx (0.67, 0.33)$. Take care that this is really not a proper scoring rule (these proportions are not the best estimates of the class probabilities), but it is better than nothing, I believe, and this is how scores are usually retrieved for decision trees.
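A small pure-Python sketch of the leaf-proportion idea: score each test instance by the positive-class fraction in the leaf it reaches, then sweep thresholds over those scores to obtain several ROC points instead of one. The leaf counts, leaf names, and test labels below are all invented for illustration:

```python
# Hypothetical leaves: (count_T, count_F) observed during training.
leaves = {
    "leaf_a": (10, 5),   # score_T = 10/15
    "leaf_b": (2, 8),    # score_T = 2/10
}

def score_T(leaf):
    """Positive-class proportion in the leaf the instance reached."""
    t, f = leaves[leaf]
    return t / (t + f)

# Test instances: (leaf reached, true label) -- labels invented.
test_set = [("leaf_a", "T"), ("leaf_a", "F"), ("leaf_b", "F"), ("leaf_b", "T")]
scores = [(score_T(leaf), label) for leaf, label in test_set]

def roc_point(threshold):
    """(FPR, TPR) when predicting T iff score >= threshold."""
    tp = sum(1 for s, y in scores if s >= threshold and y == "T")
    fp = sum(1 for s, y in scores if s >= threshold and y == "F")
    pos = sum(1 for _, y in scores if y == "T")
    neg = sum(1 for _, y in scores if y == "F")
    return fp / neg, tp / pos

# One ROC point per distinct leaf score:
for thr in sorted({s for s, _ in scores}, reverse=True):
    print(thr, roc_point(thr))
```

This is, as far as I know, also what implementations like scikit-learn's `DecisionTreeClassifier.predict_proba` do under the hood (returning leaf class fractions), which you can then feed straight into a standard ROC routine.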