### Fitting Multiple-Response Data

Bruce Miller

Mathematica has a built-in function `Fit` and a standard add-on package function ``Statistics`NonlinearFit` `` for finding the least-squares fit of an expression with adjustable parameters to data. They fit only one expression at a time, however, and often it is convenient to fit more than one expression to a set of data, with the parameters being adjusted to minimize the combined error in all the expressions being fit. This note gives a step-by-step example of how to perform a simultaneous fit to multiple functions, and provides a fairly general function for performing such fits.

The example below shows two functions, `y1[x1,x2]` and `y2[x1,x2]`, fit to `{x1,x2,y1,y2}` data using the same parameters for the two functions. This is most easily done by building an expression for the total error in both functions in terms of the parameters, then finding the parameters which minimize that error. This is an example of a general technique used for fitting irregular functions or data. (One example from the Technical Support FAQ site is at http://support.wolfram.com/Math/Statistics/NonlinearOptimization.html.)

#### Create Sample Data

Here are some fake data (with a little noise). I use `Flatten` because the `Table` command yields a matrix of points, which puts an extra level of braces around each row of points; the rest of the example needs the data as a single flat list of points of the form `{x1, x2, y1, y2}`. `Flatten` eliminates the division of the matrix into rows of points and gives just one long list of points.
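The data-generation cell did not survive conversion from the original notebook. A sketch consistent with the description might look like the following; the coefficient values and noise amplitude are assumptions, chosen to match the `functofit` model forms used later in this note:

```mathematica
SeedRandom[1234];
dat = Flatten[
    Table[{x1, x2,
        2. + 3. x1 + 0.5 x2^2 + 0.05 (2 Random[] - 1),    (* noisy y1 data *)
        Sqrt[1 + 2. x1 + 3. x2] + 0.05 (2 Random[] - 1)}, (* noisy y2 data *)
      {x1, 0., 4., 0.5}, {x2, 0., 3., 0.5}],
    1];
```

The level specification `1` in `Flatten` removes only the outermost grouping, so each element of `dat` remains a four-element point.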

#### The Basic Technique

Define the two functions describing the expressions to fit. The function arguments are the free parameters to be fitted and the independent variables.
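The definitions themselves were lost in conversion; judging from the calls `y1[a,b,c,x1,x2]` and `y2[a,b,x1,x2]` below, and the `functofit` models in the next section, they were presumably of this form:

```mathematica
y1[a_, b_, c_, x1_, x2_] := a + b x1 + c x2^2
y2[a_, b_, x1_, x2_] := Sqrt[1 + a x1 + b x2]
```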

Define an error function which returns the squared error at a single data point for given parameter values. Map this function over all the data points, add the individual point errors together, and find the parameters which minimize the total.
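The corresponding code cell is also missing. A sketch, assuming the flattened data list is named `dat` and that `y1` and `y2` take the parameters and independent variables as arguments as described above, is:

```mathematica
(* squared error at one data point, for given parameter values *)
errfunc[a_, b_, c_, {x1_, x2_, f1_, f2_}] :=
  (y1[a, b, c, x1, x2] - f1)^2 + (y2[a, b, x1, x2] - f2)^2

(* list of squared errors, one entry per data point *)
chisq = Map[errfunc[a, b, c, #] &, dat];
```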

The `Apply[Plus, chisq]` in the `FindMinimum` call below turns the list of squared errors, `chisq`, into a sum of those errors.

If `FindMinimum` complains about complex values, put the first argument inside an `Abs` or `Chop`, as in `FindMinimum[Abs[Apply[Plus, chisq]], {a,1}, {b,1}, {c,1}]`.

This will eliminate tiny imaginary components which sometimes arise in finite-precision arithmetic.

``soln = FindMinimum[Apply[Plus,chisq],{a,1},{b,1},{c,1}]``

The output from `FindMinimum` is a list containing the final value of the function being minimized and a sublist of replacement rules for the fitted parameters. Use the result to create final fitted functions that can be used for calculations or plotting.

``y1[x1_,x2_] = y1[a,b,c,x1,x2] /. soln[[2]]``
``y2[x1_,x2_] = y2[a,b,x1,x2] /. soln[[2]]``
```mathematica
Plot3D[y1[x1, x2], {x1, 0, 4}, {x2, 0, 3}, Lighting -> False,
  ColorFunction -> Hue];
```

#### General Functions

To make the above procedure easier to use, the algorithm can be placed into a function which can be called with different data and expressions. The job performed by the `errfunc` function above is in the separate function `onept` (one-point). The `fitmult` function returns a list of rules for the fitted parameters.

`xsys` is the data in a list where each point pairs the independent variables with the responses (e.g., `{{x1,x2},{y1,y2}}`). `models` is a list of expressions with parameters to fit; the number of expressions in `models` must be the same as the number of responses in the data. `var` is a list of the independent variables, one for each independent variable in the data and in the same order. `parms` is a list of parameters with optional initial values, in a format acceptable to `FindMinimum`. `opts` is zero or more options to be passed to `FindMinimum`. The use of `Simplify` is not strictly necessary, but the expression for the total error can be huge (one term per data point), and `FindMinimum` may run faster if the many terms are combined.
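The code cells defining these functions were lost in conversion. A sketch consistent with the description in this section, and with the `Chop[dfs.dfs]` mentioned in the caveats below, is:

```mathematica
(* error at one point: evaluate each model at the independent
   variables, subtract the responses, and sum the squares *)
onept[{xs_, ys_}, models_, var_] :=
  Module[{dfs = (models /. Thread[var -> xs]) - ys},
    Chop[dfs . dfs]]

(* minimize the simplified total error; return only the rules *)
fitmult[xsys_, models_, var_, parms_, opts___] :=
  Last[Apply[FindMinimum,
    Join[{Simplify[Apply[Plus, Map[onept[#, models, var] &, xsys]]]},
      parms, {opts}]]]
```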

The `fitmult` function requires its data in a different format from the simple example shown at the beginning. The simple pure function below groups the four elements of each data point into the form `{{x1,x2},{y1,y2}}`.
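The pure function itself was lost in conversion; given points of the form `{x1,x2,y1,y2}` in a list `dat`, it was presumably something like:

```mathematica
dat3 = Map[{{#[[1]], #[[2]]}, {#[[3]], #[[4]]}} &, dat];
```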

``Take[dat3,2]``

The functions to fit are as follows.

``functofit={a+b*x1+c*x2^2, Sqrt[1+a*x1+b*x2]}``
```mathematica
solu = fitmult[dat3, functofit,
  {x1, x2}, {{a, 1}, {b, 1}, {c, 1}}]
```
``{f[x1_,x2_],g[x1_,x2_]}=functofit /. solu``
``f[2,1.5]``

This routine can also be used to fit one expression at a time to ordinary single-response data, such as `{x, y}` pairs. Here I also show a different format for the parameters, giving each one two starting values (useful if the models cannot be symbolically differentiated), and two options which are passed to `FindMinimum`.

```mathematica
dat4 = Table[{x, BesselJ[0, x]}, {x, -3, 3}];
fitmult[dat4, {a*x^2 + b*x + c}, {x},
  {{a, {0, 2}}, {b, {0, 2}}, {c, {-2, 2}}},
  MaxIterations -> 20, AccuracyGoal -> 4]
```

##### Suggestions and Caveats

The `fitmult` function wants the models, independent variables, and parameter specifications in lists, even if there is only one of each. This means that if there is only one parameter to fit, its specification must still be doubly nested (i.e., `{{a,1}}`).

If you want a quick estimate of the goodness-of-fit, change the `Last` at the bottom of `fitmult` to `Identity`. This will return the final value of the sum of squared errors along with the parameter values. (The effect is the same as eliminating the `Last` and its square brackets, but this makes it easier to switch back and forth.) Remember to extract the rules from the combined output before using them, as in the following example.

```mathematica
sol = qefitmult[dat3, functofit,
  {x1, x2}, {{a, 1}, {b, 1}, {c, 1}}]
```
``{afit,bfit,cfit} = {a,b,c} /. Last[sol]``

To use these functions with complex data and models, change the `Chop[dfs.dfs]` at the end of `onept` to `Chop[dfs.Conjugate[dfs]]`. Because `Conjugate` is not analytically differentiable, this necessitates specifying two starting values for each parameter, as in the `dat4` example above.

If you have a different way to calculate the error (with small-number Poisson statistics, for example), just modify the `onept` function.

Nonlinear fitting codes seek minima in the total error function, not distinguishing between local minima and global minima. Careful selection of parameter starting values and model functions may be necessary to obtain satisfactory results.

Converted by Mathematica      October 8, 1999