source:branches/devel-peter/Ipopt/contrib/MatlabInterface/examples/examplelasso.m@1339

Last change on this file since 1339 was 1339, checked in by pcarbo, 5 years ago

I added a new example to the "examples" directory and fixed a few
small bugs in the MATLAB interface.

File size: 1.3 KB
% In this small MATLAB script, we compute the least-squares solution to a
% regression problem subject to L1 regularization, which rewards "sparse"
% models that have many regression coefficients equal to zero. See, for
% instance, "Regression Shrinkage and Selection via the Lasso" by the
% statistician Robert Tibshirani.

% Experiment parameters.
lambda = 1;                      % Level of L1 regularization.
n      = 100;                    % Number of training examples.
e      = 1;                      % Std. dev. of noise in outputs.
beta   = [ 0 0 2 -4 0 0 -1 3 ]'; % "True" regression coefficients.

% Set the random number generator seed.
seed = 7;
rand('state',seed);
randn('state',seed);

% CREATE DATA SET.
% Generate the input vectors from the standard normal, then generate the
% real-valued outputs from the linear regression model with additive
% Gaussian noise. The variable "beta" is the set of true regression
% coefficients.
m     = length(beta);      % Number of features.
A     = randn(n,m);        % The n x m matrix of examples.
noise = e * randn(n,1);    % Noise in outputs.
y     = A * beta + noise;  % The observed outputs.

% COMPUTE SOLUTION WITH IPOPT.
% Compute the L1-regularized least-squares estimate.
w = lasso(A,y,lambda);
fprintf('Solution:\n');
disp(w);
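The `lasso` helper called above is a companion function in this examples directory that poses the problem to IPOPT. As a rough, independent cross-check of what it computes, the same L1-regularized least-squares estimate can be sketched with plain cyclic coordinate descent and soft-thresholding. This is a minimal NumPy sketch, not the IPOPT-based implementation: the function name `lasso_cd`, the 0.5 factor in the objective, the iteration count, and the RNG are all illustrative assumptions, and the random draws differ from MATLAB's, so the numbers will not match exactly — only the sparsity pattern should.

```python
import numpy as np

def soft_threshold(z, t):
    # Soft-thresholding operator: the proximal map of t * |.|.
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def lasso_cd(A, y, lam, n_iter=500):
    # Minimize 0.5*||A w - y||^2 + lam*||w||_1 by cyclic coordinate descent.
    n, m = A.shape
    w = np.zeros(m)
    col_sq = np.sum(A**2, axis=0)  # Precompute ||A[:, j]||^2 for each column.
    for _ in range(n_iter):
        for j in range(m):
            # Partial residual with coordinate j's contribution removed.
            r = y - A @ w + A[:, j] * w[j]
            # Exact minimizer of the 1-D subproblem in w[j].
            w[j] = soft_threshold(A[:, j] @ r, lam) / col_sq[j]
    return w

# Reproduce the experiment from the MATLAB script above (different RNG, so
# values differ slightly, but the zero coefficients in beta should stay
# near zero while the large ones are recovered).
rng = np.random.default_rng(7)
beta = np.array([0, 0, 2, -4, 0, 0, -1, 3], dtype=float)
n = 100
A = rng.standard_normal((n, len(beta)))
y = A @ beta + rng.standard_normal(n)
w = lasso_cd(A, y, lam=1.0)
print(np.round(w, 2))
```

With 100 examples and this modest regularization level, the estimate shrinks slightly toward zero (the usual lasso bias) but clearly separates the truly nonzero coefficients from the zeros.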