Linear Regression in JavaScript

Published 2020-05-17 04:07

I want to do least squares fitting in JavaScript in a web browser.

Currently users enter data point information using HTML text inputs and then I grab that data with jQuery and graph it with Flot.

After the user has entered their data points, I would like to present them with a "line of best fit". I imagine I would calculate the linear, polynomial, exponential and logarithmic equations and then choose the one with the highest R^2 value.
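To make that idea concrete, here is a rough sketch of how the comparison could work: the exponential and logarithmic fits reduce to a linear fit on transformed data, and the R^2 values are then compared. leastSquares and bestFit are hypothetical helper names of my own, not functions from any existing library, and a general polynomial fit needs a small linear-algebra solve, so it is left out here.

// Rough sketch: ordinary least squares on (xs, ys), returning slope, intercept and R^2.
function leastSquares(xs, ys) {
    var n = xs.length, sx = 0, sy = 0, sxx = 0, sxy = 0, syy = 0;
    for (var i = 0; i < n; i++) {
        sx += xs[i];
        sy += ys[i];
        sxx += xs[i] * xs[i];
        sxy += xs[i] * ys[i];
        syy += ys[i] * ys[i];
    }
    var slope = (n * sxy - sx * sy) / (n * sxx - sx * sx);
    var intercept = (sy - slope * sx) / n;
    var r = (n * sxy - sx * sy) /
            Math.sqrt((n * sxx - sx * sx) * (n * syy - sy * sy));
    return { slope: slope, intercept: intercept, r2: r * r };
}

// Fit a few model families by transforming the data, then keep the highest R^2.
// Exponential y = A*e^(B*x) is linear in ln(y); logarithmic y = A + B*ln(x) is
// linear in ln(x). Caveats: this requires y > 0 (resp. x > 0), the returned
// parameters are on the transformed scale, and comparing R^2 across transformed
// scales is only a rough heuristic.
function bestFit(xs, ys) {
    var candidates = [
        { name: 'linear',      fit: leastSquares(xs, ys) },
        { name: 'exponential', fit: leastSquares(xs, ys.map(Math.log)) },
        { name: 'logarithmic', fit: leastSquares(xs.map(Math.log), ys) }
    ];
    candidates.sort(function (a, b) { return b.fit.r2 - a.fit.r2; });
    return candidates[0]; // model family with the highest R^2
}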

I can't seem to find any libraries that will help me do this, though. I stumbled upon jStat, but it appears to be completely undocumented (as far as I can tell), and after digging through the source code it doesn't seem to have any linear regression functionality built in; I'm basing this purely on the function names, however.

Does anyone know any JavaScript libraries that offer simple regression analysis?


The hope would be that I could use the library like so...

If I had a set of scatter points in an array, var points = [[3,4],[15,45],...[23,78]], I would be able to hand that to some function like lin_reg(points), and it would return something like [7.12,3] if the linear equation were y = 7.12x + 3.
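For illustration only, a bare-bones implementation with exactly that signature might look like the following (lin_reg is the hypothetical name from the question, not an actual library function):

// lin_reg(points): least-squares fit over [[x, y], ...] pairs,
// returning [slope, intercept].
function lin_reg(points) {
    var n = points.length, sx = 0, sy = 0, sxx = 0, sxy = 0;
    for (var i = 0; i < n; i++) {
        sx += points[i][0];
        sy += points[i][1];
        sxx += points[i][0] * points[i][0];
        sxy += points[i][0] * points[i][1];
    }
    var slope = (n * sxy - sx * sy) / (n * sxx - sx * sx);
    var intercept = (sy - slope * sx) / n;
    return [slope, intercept];
}

lin_reg([[0, 3], [1, 10.12], [2, 17.24]]); // → approximately [7.12, 3]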

7 answers
Reply 2 · 2020-05-17 04:37

I found this great JavaScript library.

It's very simple, and seems to work perfectly.

I also can't recommend Math.JS enough.

Reply 3 · 2020-05-17 04:37

Simple linear regression with measures of variation (total sum of squares = regression sum of squares + error sum of squares), the standard error of the estimate SEE (residual standard error), the coefficient of determination R2, and the correlation coefficient R.

const regress = (x, y) => {
    const n = y.length;
    let sx = 0;  // sum of x
    let sy = 0;  // sum of y
    let sxy = 0; // sum of x*y
    let sxx = 0; // sum of x^2
    let syy = 0; // sum of y^2
    for (let i = 0; i < n; i++) {
        sx += x[i];
        sy += y[i];
        sxy += x[i] * y[i];
        sxx += x[i] * x[i];
        syy += y[i] * y[i];
    }
    const mx = sx / n; // mean of x
    const my = sy / n; // mean of y
    const yy = n * syy - sy * sy;
    const xx = n * sxx - sx * sx;
    const xy = n * sxy - sx * sy;
    const slope = xy / xx;
    const intercept = my - slope * mx;
    const r = xy / Math.sqrt(xx * yy); // correlation coefficient
    const r2 = Math.pow(r, 2);         // coefficient of determination
    let sst = 0;                       // total sum of squares
    for (let i = 0; i < n; i++) {
        sst += Math.pow(y[i] - my, 2);
    }
    const sse = sst - r2 * sst;           // error (residual) sum of squares
    const see = Math.sqrt(sse / (n - 2)); // standard error of the estimate
    const ssr = sst - sse;                // regression sum of squares
    return {slope, intercept, r, r2, sse, ssr, sst, sy, sx, see};
};
regress([1, 2, 3, 4, 5], [1, 2, 3, 4, 3]);
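For the sample call above, the returned object works out to a slope of 0.6, an intercept of 0.8, and an r2 of roughly 0.69.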
Reply 4 · 2020-05-17 04:38

Check out https://web.archive.org/web/20150523035452/https://cgwb.nci.nih.gov/cgwbreg.html (a JavaScript regression calculator). It is pure JavaScript, with no CGI calls to a server, so the data and processing remain on your computer. It provides complete R-style results, R code to check the work, and a visualization of the results.

See the source code for the embedded JavaScript implementations of OLS and statistics associated with the results.

The code is my effort to port the GSL library functions to JavaScript.

The code is released under the GPL because it is basically a line-for-line port of the GPL-licensed GNU Scientific Library (GSL) code.

EDIT: Paul Lutus also provides some GPL code for regression at: http://arachnoid.com/polysolve/index.html

Reply 5 · 2020-05-17 04:39

Here is a snippet that takes an array of triplets (x, y, r), where r is the weight of the (x, y) data point, and returns [a, b] such that Y = a*X + b approximates the data.

// return (a, b) that minimize
// sum_i r_i * (a*x_i+b - y_i)^2
function linear_regression( xyr )
{
    var i, 
        x, y, r,
        sumx=0, sumy=0, sumx2=0, sumy2=0, sumxy=0, sumr=0,
        a, b;

    for(i=0;i<xyr.length;i++)
    {   
        // this is our data pair
        x = xyr[i][0]; y = xyr[i][1]; 

        // this is the weight for that pair
        // set to 1 (and simplify the code accordingly, i.e. sumr becomes xyr.length) if weighting is not needed
        r = xyr[i][2];  

        // consider checking for NaN in the x, y and r variables here 
        // (add a continue statement in that case)

        sumr += r;
        sumx += r*x;
        sumx2 += r*(x*x);
        sumy += r*y;
        sumy2 += r*(y*y);
        sumxy += r*(x*y);
    }

    // note: the denominator is proportional to the (weighted) variance of X;
    // it is zero only in the degenerate case where X is constant
    b = (sumy*sumx2 - sumx*sumxy)/(sumr*sumx2-sumx*sumx);
    a = (sumr*sumxy - sumx*sumy)/(sumr*sumx2-sumx*sumx);

    return [a, b];
}
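A quick usage sketch (my own example data, chosen so that with unit weights the points lie exactly on y = 2x + 1):

// Three (x, y, r) triplets on the line y = 2x + 1, all with weight 1.
var fit = linear_regression([[0, 1, 1], [1, 3, 1], [2, 5, 1]]);
// fit ≈ [2, 1], i.e. a = 2 and b = 1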
Reply 6 · 2020-05-17 04:39

Somewhat based on Nic Mabon's answer.

function linearRegression(x, y)
{
    var xs = 0;  // sum(x)
    var ys = 0;  // sum(y)
    var xxs = 0; // sum(x*x)
    var xys = 0; // sum(x*y)
    var yys = 0; // sum(y*y)

    var n = 0;
    for (; n < x.length && n < y.length; n++)
    {
        xs += x[n];
        ys += y[n];
        xxs += x[n] * x[n];
        xys += x[n] * y[n];
        yys += y[n] * y[n];
    }

    var div = n * xxs - xs * xs;
    var gain = (n * xys - xs * ys) / div;
    var offset = (ys * xxs - xs * xys) / div;
    var correlation = Math.abs((xys * n - xs * ys) / Math.sqrt((xxs * n - xs * xs) * (yys * n - ys * ys)));

    return { gain: gain, offset: offset, correlation: correlation };
}

Then y' = x * gain + offset.
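For example (my own quick check, not part of the original answer):

// Three points on the line y = 2x + 1.
var fit = linearRegression([0, 1, 2], [1, 3, 5]);
// fit.gain ≈ 2, fit.offset ≈ 1, fit.correlation ≈ 1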

Reply 7 · 2020-05-17 05:00

What kind of linear regression? For something simple like least squares, I'd just program it myself:

http://mathworld.wolfram.com/LeastSquaresFitting.html

The math there is not too hard to follow; give it a shot for an hour or so, and let me know if it's too hard and I can give it a try.
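For reference, the least-squares formulas on that page boil down to the following for n points (written in sum() notation):

a = (n*sum(x*y) - sum(x)*sum(y)) / (n*sum(x^2) - sum(x)^2)
b = (sum(y) - a*sum(x)) / n

which gives the best-fit line y = a*x + b; this is exactly what the snippets in the other answers compute.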

EDIT:

Found someone who has done it:

http://dracoblue.net/dev/linear-least-squares-in-javascript/159/
