Hi!

The problem I have looks simple but I could not find an efficient method to solve it.

I have a set of data points that needs to be fitted to a model (function) of the form Y = aX^n / (1 + aX^n), where "a" and "n" are the unknown constants to be found for the best fit. The number of data points available is between 40 and 100. I know that the best "a" and "n" should minimize the sum of squared errors between the Y values from the model and the actual Y data values at the given X values. However, I find it a bit too involved to translate this into a minimization algorithm. I wonder if anyone has coded a solution to a problem like this.
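One common trick for exactly this model (it is the Hill / logistic-type saturation curve): since Y/(1−Y) = aX^n, taking logs gives log(Y/(1−Y)) = log a + n·log X, which is a straight line in the transformed variables, so ordinary linear least squares recovers a and n. Note the caveat: this minimizes squared error in the transformed space, not the original Y-space, so it is best used as-is for clean data or as a starting guess for a nonlinear least-squares routine (e.g. scipy.optimize.curve_fit). A minimal sketch, assuming all Y values lie strictly between 0 and 1 and all X values are positive (function and variable names here are just illustrative):

```python
import math

def fit_hill(xs, ys):
    """Estimate a and n in y = a*x**n / (1 + a*x**n).

    Linearization: y/(1-y) = a*x**n  =>  log(y/(1-y)) = log(a) + n*log(x),
    then fit the ordinary least-squares line through the transformed data.
    Requires 0 < y < 1 and x > 0 for every point.
    """
    u = [math.log(x) for x in xs]                # independent variable: log(x)
    v = [math.log(y / (1.0 - y)) for y in ys]    # dependent variable: logit-like transform
    m = len(u)
    ubar = sum(u) / m
    vbar = sum(v) / m
    # Slope of the least-squares line is the exponent n
    n = (sum((ui - ubar) * (vi - vbar) for ui, vi in zip(u, v))
         / sum((ui - ubar) ** 2 for ui in u))
    # Intercept is log(a)
    a = math.exp(vbar - n * ubar)
    return a, n

# Synthetic check: generate noiseless data from known a = 2.0, n = 1.5
a_true, n_true = 2.0, 1.5
xs = [0.1 * i for i in range(1, 41)]
ys = [a_true * x**n_true / (1 + a_true * x**n_true) for x in xs]
a_hat, n_hat = fit_hill(xs, ys)
print(a_hat, n_hat)  # recovers a = 2.0, n = 1.5 on noiseless data
```

With noisy data, the linearized estimates can then be handed to a general nonlinear minimizer as the initial guess to minimize the true sum of squared errors in Y.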

Thanks,

Sampath

## Comment