"最小二乗で"の翻訳 英語に:


  Japanese-English dictionary

最小二乗で - translations:

  Examples (external sources, not reviewed)

Ordinary least squares is the approach we'll take first.
Then we'll talk about the idea of estimation of regression coefficients.
And as a result we get the sum of squared residuals, and with this least squares method
Just like the sum of squares, the sum of squared deviations. So we'll square them.
So we minimize that quantity: the sum of squared residuals.
And that would give us the sum of squared residuals.
We want to minimize it, so we want to minimize the sum of squared residuals, just like with simple regression.
So we want to minimize the difference between the observed scores on Y and the scores on Y predicted by the model.
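To make the fragments above concrete, here is a minimal sketch of ordinary least squares for simple regression in Python (numpy and the toy data are my additions; the closed-form slope and intercept formulas are the standard ones, not quoted from the lecture):

    import numpy as np

    # Hypothetical toy data: x is the predictor, y the observed response.
    x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
    y = np.array([2.1, 3.9, 6.2, 8.1, 9.8])

    # Standard closed-form OLS estimates for simple regression.
    slope = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)
    intercept = y.mean() - slope * x.mean()

    # The quantity the fragments say we minimize: the residual sum of squares.
    residuals = y - (intercept + slope * x)
    rss = np.sum(residuals ** 2)
    print(slope, intercept, rss)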
And to understand what the least squares method actually is.
X and Y: we are still assuming that X and Y are both continuous and both normal, and that there's a linear association between them, and we will move beyond those assumptions.
Then in this case the minimization problem is 10 times (u − 5) squared
Now if I want to take this objective function and multiply it by 10, my minimization problem here becomes the minimum over u of 10 times (u minus 5) squared, plus 10.
We can think of this as picking the cluster centroid with the smallest squared distance to the training example xi. But of course, whether you try to minimize the squared distance
So we think of Ci as picking the cluster centroid with the smallest squared distance to my training example Xi.
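As a rough illustration of that assignment step, a sketch assuming numpy (the names X, centroids, and assign_clusters are mine, not the course's):

    import numpy as np

    def assign_clusters(X, centroids):
        # For each training example x_i, pick the index c_i of the centroid
        # with the smallest squared distance ||x_i - mu_k||^2.
        dists = ((X[:, None, :] - centroids[None, :, :]) ** 2).sum(axis=2)
        return dists.argmin(axis=1)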
This is where the idea of least squares comes in. The idea is very simple.
How does R do it, or, if we want to calculate this by hand, how do we come up with these estimates?
So, first of all, x squared, and then
So let me just graph those.
Square them, add them up, and that's the sum of squares.
Lynn example.
AB 59 squared, OK, and finally? Audience
59 squared, OK, and finally?
or minimize the plain distance, it should give you the same value of ci, but we usually put in the square.
But of course minimizing squared distance, and minimizing distance that should give you the same value of Ci, but we usually put in the square there, just as the convention that people use for K means.
2x squared.
2x squared.
Only one point is off the line. When you try to minimize the squared error
This is an interesting one where we seem to have a linear relationship that is flatter than the linear regression indicates, but there is one outlier.
Sigma squared. The sample variance is the SD squared.
Again, there's variance in the population, sigma squared.
it becomes the element-wise square of A, so 1 squared is
This gives me the element-wise squaring of A.
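In numpy, for instance, element-wise squaring of a matrix A looks like this (a sketch; the lecture may well have been using a different environment such as Octave/MATLAB, where the operator is .^):

    import numpy as np

    A = np.array([[1, 2], [3, 4]])
    print(A ** 2)  # element-wise: [[1, 4], [9, 16]], so 1 squared is 1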
Now, what about y equals x squared?
And I think you're familiar with what that looks like.
2x squared, and
We only have one x-squared term, so let's write that down.
So if we go from x squared to x squared plus 3
It's going to look like that.
In the denominator, the sum of squares of X and the sum of squares of Y.
Look what's in the numerator, sum of cross products.
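These two fragments describe the usual Pearson correlation coefficient, presumably

    r = \frac{\sum_i (x_i - \bar{x})(y_i - \bar{y})}{\sqrt{\sum_i (x_i - \bar{x})^2 \, \sum_i (y_i - \bar{y})^2}}

with the sum of cross products in the numerator and the two sums of squares under the root in the denominator.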
It's 5x squared minus 6x.
And their sum is equal to this up here.
5 squared is 25.
So that's equal to 25.
It's the square of a binomial.
Or we could write this as x minus 7 squared is equal to 0.
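Expanding the binomial square mentioned here (a standard identity, not a quote from the source): (x - 7)^2 = x^2 - 14x + 49, so x^2 - 14x + 49 = 0 has the double root x = 7.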
Let's do the x-squared terms: 4x squared and
OK, now we can simplify.
For small n, up to about this point, (1/2)n squared is actually smaller, since its constant is smaller, so
And we see here in the graph exactly the behavior that we discussed earlier, which is that the small
16t squared, 10t, 84
I swapped the sides.
37 squared, OK.
Arthur Benjamin
23 squared, OK.
AB 23 squared, OK. Audience
equals 4, because it's 2 squared. In the figure, with mu 1 and sigma squared 1
So, sigma squared 1 of course would be equal to 4, for example, as the square of 2.
Suppose we have an optimization problem like the following: choose a real number u so as to minimize (u − 5) squared plus 1.
Here is what I mean. To give you a concrete example, suppose I had a minimization problem: minimize over a real number u, (u minus 5) squared plus 1, right.
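Worked out, based only on what the two fragment pairs state: the minimum over u of (u − 5)^2 + 1 is attained at u = 5 with value 1, and the scaled version from the earlier pair, the minimum over u of 10(u − 5)^2 + 10, is attained at the same u = 5 with value 10; multiplying an objective by a positive constant and adding a constant never changes the minimizer.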
The variance, sigma squared, is 0.5 squared, which is 0.25.
So the standard deviation is one half and the variance sigma squared would therefore be the square of 0.5 would be 0.25.
What still remains is meters squared, seconds squared.
So we put in one density.
it will be r squared. What is 15 squared?
So 15 squared plus 20 squared is going to be equal to r squared.
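The arithmetic being set up here: 15^2 + 20^2 = 225 + 400 = 625 = 25^2, so r = 25.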
This is the residual sum of squares, the RSS. Remember? What we are trying to minimize is
First I'm gonna bring your attention to this RSS column.
The square of 5 is 25.
The square of 5 is 25.
Since 20 squared is 400
It's 225.
Minus b squared is 4.
Plus or minus the square root of b squared.
It's 0.454 divided by 0.0254 squared.
So I can say, Okay, well now how about weight in pounds and height in inches squared? And I have to multiply, right? Because I have to multiply by the conversion.
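The conversion being computed is the usual pounds-and-inches-to-kg/m^2 factor for BMI: 0.454 kg per pound divided by (0.0254 m per inch) squared, i.e. 0.454 / 0.0254^2 ≈ 703.7 (the rounded value is my arithmetic, not stated in the source).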
Then what about sigma squared?
And that just gives me the center of this distribution. How about sigma squared?
Multiply it by 3, and in the end it's a positive number. Since the minimum value here is 0
So this expression with the squared here is going to be positive, and you multiply it by 3, and it's going to be positive.
is equal to the square of the longest side. The longest side is called the hypotenuse.
And all that tells us is it the sum of the squares of the shorter sides of the triangle are going to be equal to the square of the longer side.
The signs flip. That is, minus 1 times x squared is minus x squared, and
And now we distribute this minus over this whole expression.
because I wanted to show it to you: the sum of squares of X, the sum of squares of Y, and
We don't do that. But I did it here just to demonstrate the similarity of some cross products.
Calculating the regression coefficients in multiple regression. Here again we're doing least squares, just as with simple regression.
So the main topic of this segment is just again estimation of regression coefficients in multiple regression.
987 squared is 974,169.
AB 987 squared is 974,169.
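The figure checks out, and it matches the standard mental-math identity a^2 = (a + b)(a − b) + b^2, though whether the performer used exactly this route is my assumption: 987^2 = (987 + 13)(987 − 13) + 13^2 = 1000 × 974 + 169 = 974,169.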

 

Related searches: least squares - least squares method - least squares solution - least squares fit - fitting least squares - least squares algorithm - two-stage least squares - linear least squares - partial least squares - least-squares mean - least squares regression