# Determining the best-fitting function out of linear, exponential, and logarithmic curves

### Context:

From a question on Mathematics Stack Exchange (Can I build a program), someone has a set of $x$–$y$ points and wants to fit a curve to them: linear, exponential, or logarithmic.
The usual method is to start by choosing one of these (which specifies the model) and then do the statistical calculations.

But what is really wanted is to find the ‘best’ curve out of linear, exponential or logarithmic.

Ostensibly, one could try all three and choose the best-fitting curve of the three according to the best correlation coefficient.

But somehow I’m feeling this is not quite kosher. The generally accepted method is to pick your model first, one of those three (or some other link function), and then calculate the coefficients from the data. Picking the best of all post facto is cherry-picking. But to me, whether you’re determining a function or coefficients from the data, it is still the same thing: your procedure is discovering the best…thing (let’s say that which function to use is *also* another coefficient to be discovered).
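The "try all three and compare" idea can be sketched as follows. This is a minimal illustration, not a recommendation on the statistical question: it fits each of the three families by least squares with `scipy.optimize.curve_fit` and compares them by $R^2$ (the sample data and starting guesses are made up for the example).

```python
import numpy as np
from scipy.optimize import curve_fit

# Hypothetical sample data (roughly linear, for illustration only)
x = np.array([1, 2, 3, 4, 5, 6, 7, 8], dtype=float)
y = np.array([1.2, 2.1, 2.9, 3.6, 4.8, 5.9, 7.1, 8.2])

# Each candidate family: (model function, initial parameter guess)
models = {
    "linear":      (lambda x, a, b: a * x + b,          (1.0, 0.0)),
    "exponential": (lambda x, a, b: a * np.exp(b * x),  (1.0, 0.1)),
    "logarithmic": (lambda x, a, b: a * np.log(x) + b,  (1.0, 0.0)),
}

best = None
for name, (f, p0) in models.items():
    try:
        params, _ = curve_fit(f, x, y, p0=p0, maxfev=10000)
    except RuntimeError:
        continue  # fit failed to converge; skip this family
    residuals = y - f(x, *params)
    ss_res = np.sum(residuals ** 2)
    ss_tot = np.sum((y - y.mean()) ** 2)
    r2 = 1.0 - ss_res / ss_tot
    if best is None or r2 > best[1]:
        best = (name, r2, params)

print(f"best family: {best[0]}, R^2 = {best[1]:.4f}")
```

Note that comparing raw $R^2$ across families with the same number of parameters is at least on equal footing here (each model has two parameters); with families of different complexity, a penalized criterion would be needed instead.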

### Questions:

• Is it appropriate to choose the best fitting model out of linear, exponential, and logarithmic models, based on a comparison of fit statistics?
• If so, what is the most appropriate way to do this?
• If regression helps find parameters (coefficients) in a function, why can’t there be a discrete parameter that chooses which of the three curve families the best fit comes from?