[{"Name":"Eigenvectors and Eigenvalues of a Matrix","TopicPlaylistFirstVideoID":0,"Duration":null,"Videos":[{"Watched":false,"Name":"Introduction","Duration":"7m 56s","ChapterTopicVideoID":24838,"CourseChapterTopicPlaylistID":7320,"HasSubtitles":true,"ThumbnailPath":"https://www.proprep.uk/Images/Videos_Thumbnails/24838.jpeg","UploadDate":"2021-02-23T10:13:10.9200000","DurationForVideoObject":"PT7M56S","Description":null,"MetaTitle":"Introduction: Video + Workbook | Proprep","MetaDescription":"Eigenvectors Eigenvalues and Diagonalization - Eigenvectors and Eigenvalues of a Matrix. Watch the video made by an expert in the field. Download the workbook and maximize your learning.","Canonical":"https://www.proprep.uk/general-modules/all/linear-algebra/eigenvectors-eigenvalues-and-diagonalization/eigenvectors-and--eigenvalues-of-a-matrix/vid25751","VideoComments":[],"Subtitles":[{"Start":"00:00.000 ","End":"00:03.570","Text":"In this clip, I\u0027m going to introduce the concept of"},{"Start":"00:03.570 ","End":"00:08.265","Text":"eigenvalues and eigenvectors of a matrix."},{"Start":"00:08.265 ","End":"00:11.970","Text":"Suppose we have this square matrix A,"},{"Start":"00:11.970 ","End":"00:14.435","Text":"which is as follows,"},{"Start":"00:14.435 ","End":"00:20.565","Text":"and I\u0027m going to multiply this matrix by a few column vectors."},{"Start":"00:20.565 ","End":"00:22.440","Text":"Here\u0027s the first one,"},{"Start":"00:22.440 ","End":"00:25.845","Text":"multiply it by minus 1, 1, 0."},{"Start":"00:25.845 ","End":"00:30.270","Text":"You know how to do this, do the computation, we get this."},{"Start":"00:30.270 ","End":"00:33.390","Text":"Another example, the same matrix A,"},{"Start":"00:33.390 ","End":"00:35.520","Text":"this time by column vector 1,"},{"Start":"00:35.520 ","End":"00:38.610","Text":"2, 3 and we get this."},{"Start":"00:38.610 ","End":"00:41.420","Text":"The third example."},{"Start":"00:41.420 ","End":"00:43.190","Text":"Multiply it by 1,"},{"Start":"00:43.190 ","End":"00:45.395","Text":"1, 1, we get 2, 2, 2."},{"Start":"00:45.395 ","End":"00:52.590","Text":"The last example will be this one and here\u0027s the result."},{"Start":"00:52.590 ","End":"01:00.320","Text":"Now, notice that something interesting happens in 3 out of the 4 cases."},{"Start":"01:00.320 ","End":"01:01.550","Text":"Let\u0027s say look at this one,"},{"Start":"01:01.550 ","End":"01:03.275","Text":"it\u0027s easiest to see."},{"Start":"01:03.275 ","End":"01:07.760","Text":"When we multiply the matrix by the vector,"},{"Start":"01:07.760 ","End":"01:09.770","Text":"we got a multiple of this vector,"},{"Start":"01:09.770 ","End":"01:15.860","Text":"we got exactly twice this vector here."},{"Start":"01:15.860 ","End":"01:18.720","Text":"This one, certainly good."},{"Start":"01:18.720 ","End":"01:20.670","Text":"If you look at the first one,"},{"Start":"01:20.670 ","End":"01:27.845","Text":"then we see that the result is exactly minus 4 times this vector."},{"Start":"01:27.845 ","End":"01:33.270","Text":"Here, this vector is not a multiple of this vector."},{"Start":"01:33.270 ","End":"01:39.700","Text":"Here again, we do get a multiple exactly 6 times this gives us this."},{"Start":"01:39.700 ","End":"01:42.670","Text":"In some cases, what we get after"},{"Start":"01:42.670 ","End":"01:48.835","Text":"the multiplication is a scalar multiple of the original vector."},{"Start":"01:48.835 ","End":"01:50.800","Text":"I want to spell it out a bit more,"},{"Start":"01:50.800 
","End":"01:52.870","Text":"add some color, give some labels,"},{"Start":"01:52.870 ","End":"02:02.545","Text":"call this one vector v. We see that this times this is exactly minus 4 times this,"},{"Start":"02:02.545 ","End":"02:07.890","Text":"because we previously saw 4 minus 4 is 0."},{"Start":"02:07.890 ","End":"02:12.595","Text":"Here are the other 2 examples where we did get a multiple."},{"Start":"02:12.595 ","End":"02:15.820","Text":"We found that if we multiply A by this,"},{"Start":"02:15.820 ","End":"02:22.539","Text":"we get twice what that was and in the last case we got 6 times the original."},{"Start":"02:22.539 ","End":"02:25.945","Text":"Now when this phenomenon happens,"},{"Start":"02:25.945 ","End":"02:32.490","Text":"meaning that matrix A times the vector is some scalar times the vector,"},{"Start":"02:32.490 ","End":"02:36.975","Text":"and we assume the vector is not 0."},{"Start":"02:36.975 ","End":"02:38.865","Text":"Otherwise, it\u0027s not interesting."},{"Start":"02:38.865 ","End":"02:41.715","Text":"Because if vector v is 0,"},{"Start":"02:41.715 ","End":"02:47.110","Text":"any matrix A times the 0 vector will always give the 0 vector,"},{"Start":"02:47.110 ","End":"02:49.429","Text":"and 0 is a multiple of itself."},{"Start":"02:49.429 ","End":"02:50.960","Text":"It\u0027s anything times itself,"},{"Start":"02:50.960 ","End":"02:54.590","Text":"7 times 0 is 0 and a 100 times 0 is 0."},{"Start":"02:54.590 ","End":"02:57.355","Text":"Anyway, it is not interesting case."},{"Start":"02:57.355 ","End":"03:00.790","Text":"When this phenomenon happens,"},{"Start":"03:00.790 ","End":"03:06.095","Text":"then we call v an eigenvector."},{"Start":"03:06.095 ","End":"03:07.770","Text":"This is the general v,"},{"Start":"03:07.770 ","End":"03:09.170","Text":"not specifically this one,"},{"Start":"03:09.170 ","End":"03:10.775","Text":"I\u0027m just in general."},{"Start":"03:10.775 ","End":"03:15.050","Text":"When we have matrix times v is some scalar multiple of v,"},{"Start":"03:15.050 ","End":"03:24.095","Text":"v is called an eigenvector of A corresponding to the eigenvalue x."},{"Start":"03:24.095 ","End":"03:26.150","Text":"Here we could mark x,"},{"Start":"03:26.150 ","End":"03:28.159","Text":"this would be the x here,"},{"Start":"03:28.159 ","End":"03:31.910","Text":"this would be the x here,"},{"Start":"03:31.910 ","End":"03:35.210","Text":"and this would be the x here."},{"Start":"03:35.210 ","End":"03:37.945","Text":"Spelling it out yet again."},{"Start":"03:37.945 ","End":"03:45.575","Text":"This vector is an eigenvector corresponding to the eigenvalue x equals 4,"},{"Start":"03:45.575 ","End":"03:49.615","Text":"because A times v is minus 4v."},{"Start":"03:49.615 ","End":"03:52.440","Text":"We saw that this column vector,"},{"Start":"03:52.440 ","End":"03:54.180","Text":"call it u,"},{"Start":"03:54.180 ","End":"03:58.760","Text":"is an eigenvector corresponding to eigenvalue 2,"},{"Start":"03:58.760 ","End":"04:01.740","Text":"because A times u is 2u."},{"Start":"04:02.420 ","End":"04:04.485","Text":"The last one,"},{"Start":"04:04.485 ","End":"04:05.895","Text":"call it w,"},{"Start":"04:05.895 ","End":"04:08.510","Text":"is an eigenvector corresponding to eigenvalue 6,"},{"Start":"04:08.510 ","End":"04:12.980","Text":"because A times w gave us 6w."},{"Start":"04:12.980 ","End":"04:16.025","Text":"Notice that there\u0027s an asterisk here."},{"Start":"04:16.025 ","End":"04:18.740","Text":"I looked up what these words mean,"},{"Start":"04:18.740 
","End":"04:23.675","Text":"where they come from on the Wikipedia."},{"Start":"04:23.675 ","End":"04:26.600","Text":"Well, I\u0027ll just leave you to read that."},{"Start":"04:26.600 ","End":"04:29.825","Text":"I just copy pasted it from the Wikipedia."},{"Start":"04:29.825 ","End":"04:33.750","Text":"Eigen, it\u0027s a German prefix."},{"Start":"04:33.750 ","End":"04:37.395","Text":"Sometimes use the following notation,"},{"Start":"04:37.395 ","End":"04:43.220","Text":"and I would read this to say that the vector minus 1, 1,"},{"Start":"04:43.220 ","End":"04:50.595","Text":"0 is an eigenvector v for the eigenvalue minus 4,"},{"Start":"04:50.595 ","End":"04:54.740","Text":"and u which is 1, 1,"},{"Start":"04:54.740 ","End":"04:58.490","Text":"1 is an eigenvector for eigenvalue 2,"},{"Start":"04:58.490 ","End":"05:05.675","Text":"and this w is an eigen vector for eigenvalue 6."},{"Start":"05:05.675 ","End":"05:11.584","Text":"I said an eigenvector because there could be more than 1 as we\u0027ll see in a moment."},{"Start":"05:11.584 ","End":"05:16.040","Text":"Note that it\u0027s possible for 0 to be an eigenvalue."},{"Start":"05:16.040 ","End":"05:20.420","Text":"We didn\u0027t allow 0 vector to be an eigenvector,"},{"Start":"05:20.420 ","End":"05:22.505","Text":"but as an eigenvalue it\u0027s possible."},{"Start":"05:22.505 ","End":"05:29.055","Text":"For example, if you take this A and multiply by this v,"},{"Start":"05:29.055 ","End":"05:31.600","Text":"then we get 0, 0, 0."},{"Start":"05:31.600 ","End":"05:37.850","Text":"You can check that this matrix times this column vector just gives us 0,"},{"Start":"05:37.850 ","End":"05:44.645","Text":"0, 0, which I can write as scalar 0 times 1, 0, 0,"},{"Start":"05:44.645 ","End":"05:48.200","Text":"so 0 can be an eigenvalue."},{"Start":"05:48.200 ","End":"05:54.365","Text":"Now I\u0027m going to relate to the comment I made before about more than 1 eigenvector,"},{"Start":"05:54.365 ","End":"06:01.235","Text":"one way of getting another eigenvector is just multiplying by a scalar."},{"Start":"06:01.235 ","End":"06:03.730","Text":"Multiply an eigenvector by,"},{"Start":"06:03.730 ","End":"06:05.500","Text":"has to be non 0 scalar,"},{"Start":"06:05.500 ","End":"06:07.855","Text":"it\u0027s also going to be an eigenvector."},{"Start":"06:07.855 ","End":"06:18.760","Text":"For example, above we saw that this matrix times this is minus 4 times this."},{"Start":"06:18.990 ","End":"06:28.570","Text":"Now if I take this vector and multiply it by 3, I\u0027ll get this."},{"Start":"06:28.570 ","End":"06:34.510","Text":"You can check this is also an eigenvector corresponding to the same eigenvalue,"},{"Start":"06:34.510 ","End":"06:39.495","Text":"is the computation so that any multiple"},{"Start":"06:39.495 ","End":"06:47.560","Text":"of an eigenvector will still be an eigenvector with the same eigenvalue."},{"Start":"06:47.560 ","End":"06:54.440","Text":"Now that\u0027s not the only way that 2 eigenvectors can correspond to the same eigenvalue."},{"Start":"06:54.440 ","End":"06:58.115","Text":"I\u0027ll give an example when one\u0027s not a multiple of the other."},{"Start":"06:58.115 ","End":"07:04.730","Text":"I\u0027ll let you check this computation that this matrix times this vector is this,"},{"Start":"07:04.730 ","End":"07:09.879","Text":"which is 3 times the original vector so that"},{"Start":"07:09.879 ","End":"07:16.925","Text":"eigenvalue 3 has this as an eigenvector."},{"Start":"07:16.925 ","End":"07:22.805","Text":"But also if we 
multiply by 1,"},{"Start":"07:22.805 ","End":"07:26.945","Text":"1, 0, which is not a multiple of this,"},{"Start":"07:26.945 ","End":"07:28.580","Text":"and you do the computation,"},{"Start":"07:28.580 ","End":"07:32.870","Text":"you also get 3 times the original vector here called v,"},{"Start":"07:32.870 ","End":"07:37.190","Text":"here called w. In the next clip,"},{"Start":"07:37.190 ","End":"07:40.100","Text":"we\u0027ll see how to find eigenvalues and"},{"Start":"07:40.100 ","End":"07:43.985","Text":"eigenvectors of a matrix and what it\u0027s all good for."},{"Start":"07:43.985 ","End":"07:45.410","Text":"Because as it is,"},{"Start":"07:45.410 ","End":"07:49.310","Text":"I just pulled these examples out of a hat just from"},{"Start":"07:49.310 ","End":"07:53.885","Text":"nowhere and you\u0027ll see how I got such computations."},{"Start":"07:53.885 ","End":"07:56.700","Text":"For this clip, we\u0027re done."}],"ID":25751},{"Watched":false,"Name":"Computing Eigenvalues and Eigenvectors","Duration":"9m 2s","ChapterTopicVideoID":24822,"CourseChapterTopicPlaylistID":7320,"HasSubtitles":true,"ThumbnailPath":null,"UploadDate":null,"DurationForVideoObject":null,"Description":null,"MetaTitle":null,"MetaDescription":null,"Canonical":null,"VideoComments":[],"Subtitles":[{"Start":"00:00.000 ","End":"00:05.055","Text":"In the previous clip, we learned what an eigenvalue and an eigenvector are."},{"Start":"00:05.055 ","End":"00:07.680","Text":"This clip will be more practical,"},{"Start":"00:07.680 ","End":"00:10.480","Text":"how to actually find them."},{"Start":"00:10.550 ","End":"00:13.635","Text":"We\u0027ll take the same matrix."},{"Start":"00:13.635 ","End":"00:15.300","Text":"It\u0027s a 3-by-3,"},{"Start":"00:15.300 ","End":"00:17.160","Text":"that we had before,"},{"Start":"00:17.160 ","End":"00:19.575","Text":"and find its eigenvalues and eigenvectors."},{"Start":"00:19.575 ","End":"00:23.340","Text":"Of course, this technique could be generalized to any order,"},{"Start":"00:23.340 ","End":"00:25.485","Text":"but we\u0027ll take Order 3."},{"Start":"00:25.485 ","End":"00:30.780","Text":"The first step is to compute what is called the characteristic matrix."},{"Start":"00:30.780 ","End":"00:35.790","Text":"The characteristic matrix contains a variable x."},{"Start":"00:35.790 ","End":"00:41.785","Text":"We take x times the identity matrix minus our matrix,"},{"Start":"00:41.785 ","End":"00:49.145","Text":"which in our case is diagonal x x x minus the matrix."},{"Start":"00:49.145 ","End":"00:51.230","Text":"This is the result."},{"Start":"00:51.230 ","End":"00:56.540","Text":"The next step, is to compute the characteristic polynomial."},{"Start":"00:56.540 ","End":"01:02.705","Text":"The characteristic polynomial is the determinant of the characteristic matrix."},{"Start":"01:02.705 ","End":"01:05.365","Text":"Instead of determinants, I can write bars."},{"Start":"01:05.365 ","End":"01:10.520","Text":"Easiest is to expand along the third column because it\u0027s got a couple of 0s here."},{"Start":"01:10.520 ","End":"01:14.510","Text":"It\u0027s just this entry, times the minor,"},{"Start":"01:14.510 ","End":"01:20.690","Text":"which is the 2-by-2 determinant that\u0027s left, we get this."},{"Start":"01:20.690 ","End":"01:23.600","Text":"Then use difference of squares from algebra,"},{"Start":"01:23.600 ","End":"01:26.180","Text":"a squared minus b squared is a minus b,"},{"Start":"01:26.180 ","End":"01:27.980","Text":"a plus b,"},{"Start":"01:27.980 ","End":"01:32.410","Text":"and 
this is what we get."},{"Start":"01:32.410 ","End":"01:35.390","Text":"I wanted it to be as simplified as possible,"},{"Start":"01:35.390 ","End":"01:39.120","Text":"factorized is good as you\u0027ll see."},{"Start":"01:39.550 ","End":"01:44.540","Text":"Once again, here\u0027s the characteristic polynomial."},{"Start":"01:44.540 ","End":"01:48.595","Text":"Now, we\u0027re ready to compute the eigenvalues."},{"Start":"01:48.595 ","End":"01:53.020","Text":"The way we do this is by solving the equation, p of x is 0."},{"Start":"01:53.020 ","End":"01:56.230","Text":"We set the characteristic polynomial to 0."},{"Start":"01:56.230 ","End":"02:00.070","Text":"This is sometimes also called the characteristic equation."},{"Start":"02:00.070 ","End":"02:02.620","Text":"Because we already had it factorized,"},{"Start":"02:02.620 ","End":"02:04.330","Text":"it\u0027s very easy to solve."},{"Start":"02:04.330 ","End":"02:07.390","Text":"In this case, the 3 solutions,"},{"Start":"02:07.390 ","End":"02:12.415","Text":"are 6, 2 and minus 4."},{"Start":"02:12.415 ","End":"02:14.985","Text":"These are the eigenvalues."},{"Start":"02:14.985 ","End":"02:18.025","Text":"Now, we\u0027re going to compute the eigenvectors."},{"Start":"02:18.025 ","End":"02:19.825","Text":"This is a bit more work."},{"Start":"02:19.825 ","End":"02:25.240","Text":"For each eigenvalue, we want to find an eigenvector."},{"Start":"02:25.240 ","End":"02:27.240","Text":"We\u0027ll do them 1 at a time,"},{"Start":"02:27.240 ","End":"02:30.400","Text":"and we\u0027ll start with x equals 6."},{"Start":"02:31.340 ","End":"02:37.790","Text":"We return to the characteristic matrix and substitute x equals 6."},{"Start":"02:37.790 ","End":"02:39.475","Text":"That\u0027s the eigenvalue."},{"Start":"02:39.475 ","End":"02:41.840","Text":"If we do that,"},{"Start":"02:41.890 ","End":"02:45.140","Text":"we get this matrix."},{"Start":"02:45.140 ","End":"02:47.210","Text":"Now, from this matrix,"},{"Start":"02:47.210 ","End":"02:51.800","Text":"we get a homogeneous system of linear equations."},{"Start":"02:51.800 ","End":"02:57.215","Text":"Just by taking the coefficients and applying it to x, y, and z."},{"Start":"02:57.215 ","End":"03:03.600","Text":"In this case, z is missing because all the coefficients are 0."},{"Start":"03:03.710 ","End":"03:05.900","Text":"To solve this system,"},{"Start":"03:05.900 ","End":"03:09.410","Text":"we start converting this to echelon form."},{"Start":"03:09.410 ","End":"03:11.885","Text":"First, I create 0s here,"},{"Start":"03:11.885 ","End":"03:15.800","Text":"add 3 times this row to 7 times this row,"},{"Start":"03:15.800 ","End":"03:18.070","Text":"and so on and so on."},{"Start":"03:18.070 ","End":"03:23.990","Text":"From here, we get to here by dividing this row by 40 and this by 19,"},{"Start":"03:23.990 ","End":"03:27.515","Text":"and subtract second from third row."},{"Start":"03:27.515 ","End":"03:33.270","Text":"Here we are in echelon form and we have a row of 0s."},{"Start":"03:33.440 ","End":"03:38.870","Text":"This gives us this system of equations where z,"},{"Start":"03:38.870 ","End":"03:39.950","Text":"which is not here,"},{"Start":"03:39.950 ","End":"03:44.575","Text":"is a free variable and x and y are bound."},{"Start":"03:44.575 ","End":"03:47.600","Text":"Z is free, we can take wherever we want."},{"Start":"03:47.600 ","End":"03:50.690","Text":"We usually take z equals 1,"},{"Start":"03:50.690 ","End":"03:55.040","Text":"and the rest are fixed, no choice."},{"Start":"03:55.040 
","End":"03:56.630","Text":"Y has to be 0,"},{"Start":"03:56.630 ","End":"04:00.780","Text":"and if y is 0, this gives us that x is 0."},{"Start":"04:01.280 ","End":"04:07.625","Text":"We found an eigenvector for the eigenvalue 6,"},{"Start":"04:07.625 ","End":"04:10.535","Text":"which is 0 0 1,"},{"Start":"04:10.535 ","End":"04:12.860","Text":"that\u0027s just 6,"},{"Start":"04:12.860 ","End":"04:15.025","Text":"now we have 2 others."},{"Start":"04:15.025 ","End":"04:19.540","Text":"The next 1 is eigenvalue 2."},{"Start":"04:19.540 ","End":"04:21.920","Text":"We proceed as before,"},{"Start":"04:21.920 ","End":"04:28.810","Text":"but this time we substitute x equals 2 into the characteristic matrix."},{"Start":"04:28.810 ","End":"04:32.645","Text":"This is what we get after the substitution."},{"Start":"04:32.645 ","End":"04:38.815","Text":"The system of equations is this."},{"Start":"04:38.815 ","End":"04:44.240","Text":"Then we bring this to row echelon form of the details."},{"Start":"04:44.240 ","End":"04:46.130","Text":"Here, we can just strike out the row of"},{"Start":"04:46.130 ","End":"04:48.980","Text":"0s or we could have pushed it to the third row."},{"Start":"04:48.980 ","End":"04:53.555","Text":"Anyway, this gives us this system."},{"Start":"04:53.555 ","End":"04:56.420","Text":"Once again, z is the free variable."},{"Start":"04:56.420 ","End":"05:04.680","Text":"The leading term gives us the not free or bound variables x and y."},{"Start":"05:04.680 ","End":"05:06.955","Text":"If z is free,"},{"Start":"05:06.955 ","End":"05:11.300","Text":"and we usually take z equals 1,"},{"Start":"05:11.300 ","End":"05:13.145","Text":"and then the rest are determined."},{"Start":"05:13.145 ","End":"05:16.430","Text":"Once z is 1, y is also 1."},{"Start":"05:16.430 ","End":"05:17.915","Text":"If y is 1,"},{"Start":"05:17.915 ","End":"05:20.160","Text":"then x is 1."},{"Start":"05:20.270 ","End":"05:22.580","Text":"The vector 1, 1,"},{"Start":"05:22.580 ","End":"05:26.285","Text":"1 is an eigenvector for eigenvalue 2."},{"Start":"05:26.285 ","End":"05:30.770","Text":"I wrote it as a row vector and I did before just for convenience."},{"Start":"05:30.770 ","End":"05:33.079","Text":"Now, we still have another,"},{"Start":"05:33.079 ","End":"05:38.015","Text":"the third eigenvector to find for eigenvalue minus 4."},{"Start":"05:38.015 ","End":"05:40.670","Text":"As you might have guessed, this time we substitute x"},{"Start":"05:40.670 ","End":"05:43.924","Text":"equals minus 4 in the characteristic matrix,"},{"Start":"05:43.924 ","End":"05:47.455","Text":"which gives us this SLE."},{"Start":"05:47.455 ","End":"05:50.160","Text":"Bring this to row echelon form,"},{"Start":"05:50.160 ","End":"05:52.350","Text":"I\u0027ll spare you the details,"},{"Start":"05:52.350 ","End":"05:55.695","Text":"which gives us this system."},{"Start":"05:55.695 ","End":"06:00.435","Text":"Here, y is the free variable."},{"Start":"06:00.435 ","End":"06:02.595","Text":"We let y is 1,"},{"Start":"06:02.595 ","End":"06:05.715","Text":"b equal to 1, z has to be 0."},{"Start":"06:05.715 ","End":"06:09.135","Text":"If y is 1, x is forced to be minus 1."},{"Start":"06:09.135 ","End":"06:16.740","Text":"An eigenvector for the eigenvalue minus 4 is minus 1, 1, 0."},{"Start":"06:16.740 ","End":"06:19.560","Text":"That\u0027s 3 eigenvectors found,"},{"Start":"06:19.560 ","End":"06:22.770","Text":"and that\u0027s the basic technique."},{"Start":"06:22.770 ","End":"06:28.880","Text":"Now, I\u0027d like to run through some of 
the main points and add remarks,"},{"Start":"06:28.880 ","End":"06:31.980","Text":"I want to add remarks to what I did earlier."},{"Start":"06:32.030 ","End":"06:34.215","Text":"Here we are again,"},{"Start":"06:34.215 ","End":"06:38.330","Text":"this matrix you want to find its eigenvalues and eigenvectors."},{"Start":"06:38.330 ","End":"06:41.150","Text":"Step 1, just the same as before."},{"Start":"06:41.150 ","End":"06:42.964","Text":"Now a remark."},{"Start":"06:42.964 ","End":"06:47.100","Text":"Typically, instead of the letter x,"},{"Start":"06:47.100 ","End":"06:55.050","Text":"most books, professors use the Greek letter Lambda instead of x."},{"Start":"06:55.750 ","End":"07:06.745","Text":"In that case, the characteristic matrix would be Lambda i minus a instead of xi minus a."},{"Start":"07:06.745 ","End":"07:12.140","Text":"Everything proceeds the same except that we have a Lambda in place of x."},{"Start":"07:12.140 ","End":"07:14.345","Text":"Now the next remark,"},{"Start":"07:14.345 ","End":"07:23.150","Text":"is that sometimes we see the definition backwards instead of a minus xi,"},{"Start":"07:23.150 ","End":"07:28.875","Text":"you might see xi minus a or instead of a minus Lambda i,"},{"Start":"07:28.875 ","End":"07:31.935","Text":"you might see Lambda i minus a."},{"Start":"07:31.935 ","End":"07:36.275","Text":"Any of these combinations is possible, it\u0027s not consequential."},{"Start":"07:36.275 ","End":"07:41.430","Text":"The results of the eigenvalues and eigenvectors turn out the same."},{"Start":"07:41.540 ","End":"07:43.950","Text":"Now, here we are in Step 2,"},{"Start":"07:43.950 ","End":"07:46.975","Text":"same as before, but another remark."},{"Start":"07:46.975 ","End":"07:51.650","Text":"Notice that the characteristic polynomial is of degree 3."},{"Start":"07:51.650 ","End":"07:53.450","Text":"It\u0027s got 3 factors, or if you like,"},{"Start":"07:53.450 ","End":"07:57.065","Text":"multiply it out and you\u0027ll see it\u0027s x cubed plus something."},{"Start":"07:57.065 ","End":"08:01.585","Text":"That\u0027s so, because the matrix was of order 3."},{"Start":"08:01.585 ","End":"08:05.595","Text":"We can generalize from 3 to n,"},{"Start":"08:05.595 ","End":"08:10.670","Text":"and if our matrix is of order n,"},{"Start":"08:10.670 ","End":"08:13.055","Text":"square matrix order n by n,"},{"Start":"08:13.055 ","End":"08:17.420","Text":"then the polynomial will be of degree n. 
Now,"},{"Start":"08:17.420 ","End":"08:22.550","Text":"Step 3, we compute the eigenvalues just as before,"},{"Start":"08:22.550 ","End":"08:24.650","Text":"and I think I may have mentioned that"},{"Start":"08:24.650 ","End":"08:27.455","Text":"this equation is called the characteristic equation."},{"Start":"08:27.455 ","End":"08:29.690","Text":"The characteristic polynomial equals 0,"},{"Start":"08:29.690 ","End":"08:31.160","Text":"is the characteristic equation."},{"Start":"08:31.160 ","End":"08:35.145","Text":"The remark is that we had it easy."},{"Start":"08:35.145 ","End":"08:37.640","Text":"This polynomial was all factorized,"},{"Start":"08:37.640 ","End":"08:41.945","Text":"so we can easily solve the equation and find the roots,"},{"Start":"08:41.945 ","End":"08:46.040","Text":"but sometimes when we compute this,"},{"Start":"08:46.040 ","End":"08:48.170","Text":"it\u0027s a bit more involved,"},{"Start":"08:48.170 ","End":"08:51.365","Text":"and it might not be so easy to solve."},{"Start":"08:51.365 ","End":"08:54.020","Text":"Such cases are taken care of and"},{"Start":"08:54.020 ","End":"08:58.840","Text":"demonstrated in the solved examples following the tutorial."},{"Start":"08:58.840 ","End":"09:03.480","Text":"That\u0027s the last remark and that\u0027s the end of this clip."}],"ID":25735},{"Watched":false,"Name":"Exercise 1","Duration":"5m 28s","ChapterTopicVideoID":24823,"CourseChapterTopicPlaylistID":7320,"HasSubtitles":true,"ThumbnailPath":null,"UploadDate":null,"DurationForVideoObject":null,"Description":null,"MetaTitle":null,"MetaDescription":null,"Canonical":null,"VideoComments":[],"Subtitles":[{"Start":"00:00.000 ","End":"00:06.765","Text":"In this exercise, we have to find the eigenvalues and eigenvectors of this matrix A."},{"Start":"00:06.765 ","End":"00:12.840","Text":"We start by finding the characteristic matrix defined like this."},{"Start":"00:12.840 ","End":"00:18.030","Text":"Here x is along the diagonal and we subtract our matrix A,"},{"Start":"00:18.030 ","End":"00:20.785","Text":"and this is what we get."},{"Start":"00:20.785 ","End":"00:25.670","Text":"Next, we need the characteristic polynomial,"},{"Start":"00:25.670 ","End":"00:30.030","Text":"which is just the determinant of this matrix."},{"Start":"00:30.490 ","End":"00:33.335","Text":"I don\u0027t see any 0s here."},{"Start":"00:33.335 ","End":"00:38.400","Text":"Let\u0027s just expand along the first row,"},{"Start":"00:38.530 ","End":"00:41.990","Text":"I\u0027m assuming you are very familiar with this stuff."},{"Start":"00:41.990 ","End":"00:45.605","Text":"We get the X minus 4 times this determinant here,"},{"Start":"00:45.605 ","End":"00:51.060","Text":"then 1 times this determinant, which is here."},{"Start":"00:51.060 ","End":"00:55.085","Text":"Then this 1 times this determinant, which is here,"},{"Start":"00:55.085 ","End":"01:00.320","Text":"except that we have to have a minus here because the signs go plus,"},{"Start":"01:00.320 ","End":"01:03.925","Text":"minus, plus, and so on like a checkerboard."},{"Start":"01:03.925 ","End":"01:07.925","Text":"Now we need to simplify this a bit."},{"Start":"01:07.925 ","End":"01:10.540","Text":"Each of these 3 determinants,"},{"Start":"01:10.540 ","End":"01:13.980","Text":"this diagonal product minus this diagonal product."},{"Start":"01:13.980 ","End":"01:15.340","Text":"This is what we get for this."},{"Start":"01:15.340 ","End":"01:17.920","Text":"This gives minus X plus 3,"},{"Start":"01:17.920 ","End":"01:21.265","Text":"and this 1 gives us 
this."},{"Start":"01:21.265 ","End":"01:23.335","Text":"I\u0027m not going to do all the steps."},{"Start":"01:23.335 ","End":"01:24.655","Text":"If you simplify this,"},{"Start":"01:24.655 ","End":"01:25.690","Text":"this is what we get."},{"Start":"01:25.690 ","End":"01:32.910","Text":"We get a cubic polynomial and this is our characteristic polynomial."},{"Start":"01:32.910 ","End":"01:35.925","Text":"Next, we want to find the eigenvalues."},{"Start":"01:35.925 ","End":"01:44.780","Text":"The way to find the eigenvalues is to set the characteristic polynomial to 0."},{"Start":"01:45.230 ","End":"01:49.520","Text":"This gives us a cubic equation to solve."},{"Start":"01:49.520 ","End":"01:53.885","Text":"The way we\u0027re going to do this is using the theorem that"},{"Start":"01:53.885 ","End":"01:59.520","Text":"whole number solutions must be divisors of the free co-efficient."},{"Start":"01:59.520 ","End":"02:03.800","Text":"Divisors of minus 18, here\u0027s a list of them."},{"Start":"02:03.800 ","End":"02:07.405","Text":"There\u0027s quite a few, just 1-by-1 substitute."},{"Start":"02:07.405 ","End":"02:09.420","Text":"I\u0027ve done that for you,"},{"Start":"02:09.420 ","End":"02:11.370","Text":"the only 1 from this list,"},{"Start":"02:11.370 ","End":"02:13.335","Text":"and there are actually 12 things in this list,"},{"Start":"02:13.335 ","End":"02:17.045","Text":"but only these 2 will satisfy this equation."},{"Start":"02:17.045 ","End":"02:19.075","Text":"We have a 2 and 3,"},{"Start":"02:19.075 ","End":"02:22.285","Text":"but this is a cubic and we want 3 solutions."},{"Start":"02:22.285 ","End":"02:25.360","Text":"Perhaps 1 of these 2 is a double root,"},{"Start":"02:25.360 ","End":"02:31.775","Text":"we can find this out by seeing if it\u0027s a root of the derivative."},{"Start":"02:31.775 ","End":"02:35.720","Text":"Now, the derivative of this polynomial is this."},{"Start":"02:35.720 ","End":"02:41.420","Text":"If you check, you see that this 3 also satisfies the derivative,"},{"Start":"02:41.420 ","End":"02:45.125","Text":"3 times 9 is 27 plus 21 is 48."},{"Start":"02:45.125 ","End":"02:47.310","Text":"Minus 16 times 3 is also 48."},{"Start":"02:47.310 ","End":"02:51.065","Text":"Anyway, works, 2, 3,"},{"Start":"02:51.065 ","End":"02:55.710","Text":"and 3, but the double root is still just 1 eigenvalue."},{"Start":"02:55.710 ","End":"02:57.830","Text":"Only 2 and 3 are eigenvalues."},{"Start":"02:57.830 ","End":"03:01.690","Text":"So we have to find the eigenvectors for 2 and for 3."},{"Start":"03:01.690 ","End":"03:05.645","Text":"Let\u0027s start with 2, the single root."},{"Start":"03:05.645 ","End":"03:10.630","Text":"We take the characteristic matrix and substitute 2."},{"Start":"03:10.630 ","End":"03:14.745","Text":"This gives us 2 minus 4 is minus 2 and so on."},{"Start":"03:14.745 ","End":"03:20.225","Text":"This is what we get, and the corresponding system of linear equations is this."},{"Start":"03:20.225 ","End":"03:25.500","Text":"Now we want to use row operations to try and solve that."},{"Start":"03:25.880 ","End":"03:31.580","Text":"This row minus twice this row gives us this,"},{"Start":"03:31.580 ","End":"03:36.184","Text":"and this row minus twice this row gives us this."},{"Start":"03:36.184 ","End":"03:40.115","Text":"Add this row to this row and it gives us the 0s."},{"Start":"03:40.115 ","End":"03:42.680","Text":"This is the system we have,"},{"Start":"03:42.680 ","End":"03:45.500","Text":"we see that x and y are dependent variables 
and"},{"Start":"03:45.500 ","End":"03:48.305","Text":"z is independent and can be whatever we like."},{"Start":"03:48.305 ","End":"03:50.875","Text":"We usually let it be 1."},{"Start":"03:50.875 ","End":"03:52.650","Text":"If z is 1 from here,"},{"Start":"03:52.650 ","End":"03:56.250","Text":"we get that y is 1 and if z and y are both 1,"},{"Start":"03:56.250 ","End":"03:59.280","Text":"then we get that x is 1."},{"Start":"03:59.280 ","End":"04:05.995","Text":"The eigenvector for eigenvalue 2 is the vector 1, 1, 1."},{"Start":"04:05.995 ","End":"04:08.430","Text":"Moving on to the other eigenvalue,"},{"Start":"04:08.430 ","End":"04:12.120","Text":"x equals 3, let\u0027s see what eigenvector we\u0027ll get."},{"Start":"04:12.120 ","End":"04:13.400","Text":"Because it\u0027s a double root,"},{"Start":"04:13.400 ","End":"04:16.895","Text":"we should expect to get 2 eigenvectors. Let\u0027s see if we do."},{"Start":"04:16.895 ","End":"04:19.490","Text":"We start with the characteristic matrix,"},{"Start":"04:19.490 ","End":"04:21.490","Text":"then substitute x equals 3."},{"Start":"04:21.490 ","End":"04:27.135","Text":"This is the matrix we get and the corresponding system of linear equations,"},{"Start":"04:27.135 ","End":"04:29.530","Text":"but all the rows are the same."},{"Start":"04:29.530 ","End":"04:32.410","Text":"Actually, you could see it here already."},{"Start":"04:35.960 ","End":"04:38.985","Text":"We just need to take it once."},{"Start":"04:38.985 ","End":"04:43.214","Text":"We see that only x is the dependent variable,"},{"Start":"04:43.214 ","End":"04:45.670","Text":"y and z can be anything we want."},{"Start":"04:45.670 ","End":"04:50.060","Text":"Now, we want to get 2 linearly independent eigenvectors."},{"Start":"04:50.060 ","End":"04:53.440","Text":"One way to ensure that is to take y is 1,"},{"Start":"04:53.440 ","End":"04:55.360","Text":"z equals 0 for the first,"},{"Start":"04:55.360 ","End":"04:57.640","Text":"and then the other way round y is 0,"},{"Start":"04:57.640 ","End":"04:59.945","Text":"z is 1 for the second."},{"Start":"04:59.945 ","End":"05:03.825","Text":"Z1, y0, if we plug that in,"},{"Start":"05:03.825 ","End":"05:06.960","Text":"then we get x equals 1."},{"Start":"05:06.960 ","End":"05:08.670","Text":"If it\u0027s the other way around,"},{"Start":"05:08.670 ","End":"05:10.200","Text":"y plus z is still 1,"},{"Start":"05:10.200 ","End":"05:12.910","Text":"so that makes x 1."},{"Start":"05:13.580 ","End":"05:18.705","Text":"We actually have 2 eigenvectors for eigenvalue 3."},{"Start":"05:18.705 ","End":"05:20.475","Text":"We have from here 1,"},{"Start":"05:20.475 ","End":"05:22.290","Text":"0, 1, That\u0027s this,"},{"Start":"05:22.290 ","End":"05:25.230","Text":"and from here 1, 1, 0,"},{"Start":"05:25.230 ","End":"05:28.990","Text":"that\u0027s this. 
Now, we\u0027re done."}],"ID":25736},{"Watched":false,"Name":"Exercise 2","Duration":"6m 6s","ChapterTopicVideoID":24824,"CourseChapterTopicPlaylistID":7320,"HasSubtitles":true,"ThumbnailPath":null,"UploadDate":null,"DurationForVideoObject":null,"Description":null,"MetaTitle":null,"MetaDescription":null,"Canonical":null,"VideoComments":[],"Subtitles":[{"Start":"00:00.000 ","End":"00:02.670","Text":"Here we want to find the eigenvalues,"},{"Start":"00:02.670 ","End":"00:06.120","Text":"and the eigenvectors of this 3-by-3 matrix A."},{"Start":"00:06.120 ","End":"00:09.810","Text":"As usual, we start with the characteristic matrix."},{"Start":"00:09.810 ","End":"00:18.540","Text":"This comes out to be a matrix with just x\u0027s on the diagonal minus our original matrix,"},{"Start":"00:18.540 ","End":"00:20.880","Text":"and that gives us this."},{"Start":"00:20.880 ","End":"00:23.204","Text":"After the characteristic matrix,"},{"Start":"00:23.204 ","End":"00:26.610","Text":"we also want the characteristic polynomial."},{"Start":"00:26.610 ","End":"00:30.860","Text":"That\u0027s defined to be the determinant of this matrix,"},{"Start":"00:30.860 ","End":"00:35.940","Text":"which is this just the same numbers from here but inside bars."},{"Start":"00:36.860 ","End":"00:40.150","Text":"There\u0027s nothing, no shortcut that I can see."},{"Start":"00:40.150 ","End":"00:43.280","Text":"We\u0027ll just expand along the first row."},{"Start":"00:43.280 ","End":"00:45.740","Text":"We\u0027ve done this kind of expansion before,"},{"Start":"00:45.740 ","End":"00:48.530","Text":"I won\u0027t go into all the details."},{"Start":"00:48.530 ","End":"00:51.145","Text":"This is what we get."},{"Start":"00:51.145 ","End":"00:53.280","Text":"We spend all 3,"},{"Start":"00:53.280 ","End":"00:58.445","Text":"2-by-2 determinants as product of this diagonal minus the product of the other diagonal."},{"Start":"00:58.445 ","End":"01:00.305","Text":"This is the expression we get."},{"Start":"01:00.305 ","End":"01:01.940","Text":"We simplify."},{"Start":"01:01.940 ","End":"01:04.370","Text":"We get this."},{"Start":"01:04.370 ","End":"01:07.685","Text":"Take the x minus 1 out, simplify more."},{"Start":"01:07.685 ","End":"01:12.950","Text":"We get this, which is just the characteristic polynomial."},{"Start":"01:12.950 ","End":"01:17.630","Text":"Don\u0027t expand it because we\u0027re going to want to factor it if anything."},{"Start":"01:17.630 ","End":"01:19.160","Text":"To find the eigenvalues,"},{"Start":"01:19.160 ","End":"01:22.465","Text":"we set the characteristic polynomial to 0."},{"Start":"01:22.465 ","End":"01:25.395","Text":"This was our characteristic polynomial."},{"Start":"01:25.395 ","End":"01:27.350","Text":"If the first factor is 0,"},{"Start":"01:27.350 ","End":"01:30.110","Text":"then x is 1, the second factor is 0."},{"Start":"01:30.110 ","End":"01:34.285","Text":"We solve a quadratic the formula or factoring."},{"Start":"01:34.285 ","End":"01:39.110","Text":"Either way, this is what we get and these are the 3 eigenvalues,"},{"Start":"01:39.110 ","End":"01:41.795","Text":"1, 3, and minus 2."},{"Start":"01:41.795 ","End":"01:45.050","Text":"For each of them, we\u0027re going to find an eigenvector."},{"Start":"01:45.050 ","End":"01:49.340","Text":"We\u0027ll begin with the x equals 1."},{"Start":"01:49.340 ","End":"01:53.570","Text":"We start with the characteristic matrix and then substitute x equals"},{"Start":"01:53.570 ","End":"01:58.160","Text":"1 into the characteristic matrix. 
This is what we get."},{"Start":"01:58.160 ","End":"02:00.440","Text":"For example here, 1 minus 1 is 0,"},{"Start":"02:00.440 ","End":"02:03.005","Text":"1 minus 2 is minus 1, and so on."},{"Start":"02:03.005 ","End":"02:08.390","Text":"This matrix corresponds to a system of linear equations in x,"},{"Start":"02:08.390 ","End":"02:12.229","Text":"y, z. I\u0027m going to do row operations."},{"Start":"02:12.229 ","End":"02:15.980","Text":"The first 1 I did was switch the first and last rows because I want to get it"},{"Start":"02:15.980 ","End":"02:20.765","Text":"into row echelon form and I don\u0027t want to 0 at the top left,"},{"Start":"02:20.765 ","End":"02:23.210","Text":"and so we have this."},{"Start":"02:23.210 ","End":"02:25.460","Text":"Then if we take,"},{"Start":"02:25.460 ","End":"02:32.880","Text":"twice this minus 3 times this, we get this."},{"Start":"02:33.470 ","End":"02:36.610","Text":"You can see that these 2 rows are the same,"},{"Start":"02:36.610 ","End":"02:39.445","Text":"so I really just need the top 2 rows."},{"Start":"02:39.445 ","End":"02:42.905","Text":"These give me this system of linear equations,"},{"Start":"02:42.905 ","End":"02:46.140","Text":"3 unknowns, but only 2 equations."},{"Start":"02:46.140 ","End":"02:49.920","Text":"Z is an independent variable,"},{"Start":"02:49.920 ","End":"02:54.690","Text":"and we usually let it equal 1 could be anything non 0."},{"Start":"02:54.690 ","End":"02:57.899","Text":"Anyway, if z is 1,"},{"Start":"02:57.899 ","End":"03:02.040","Text":"then from the last 1 we get that y equals 4."},{"Start":"03:02.040 ","End":"03:06.945","Text":"Now that we know z and y from the first 1 we get that x is minus 1."},{"Start":"03:06.945 ","End":"03:13.690","Text":"That gives us our eigenvector for the eigenvalue 1."},{"Start":"03:13.690 ","End":"03:14.980","Text":"We have minus 1, 4,"},{"Start":"03:14.980 ","End":"03:18.425","Text":"1 and take it in the order x, y, z."},{"Start":"03:18.425 ","End":"03:21.125","Text":"Next eigenvalue is 3."},{"Start":"03:21.125 ","End":"03:25.330","Text":"Let\u0027s look for an eigenvector for that start with the characteristic matrix."},{"Start":"03:25.330 ","End":"03:27.160","Text":"Here it is, you know the routine,"},{"Start":"03:27.160 ","End":"03:29.585","Text":"I\u0027m going to substitute x equals 3."},{"Start":"03:29.585 ","End":"03:34.630","Text":"We get this matrix and its corresponding system of linear equations."},{"Start":"03:34.630 ","End":"03:39.500","Text":"We\u0027re going to start doing all kinds of row operations on the matrix."},{"Start":"03:39.500 ","End":"03:41.775","Text":"First 1 as is,"},{"Start":"03:41.775 ","End":"03:49.695","Text":"then 3 times this plus twice this will give us a 0 here, which is good."},{"Start":"03:49.695 ","End":"03:52.805","Text":"If we just add the top and the bottom,"},{"Start":"03:52.805 ","End":"03:54.310","Text":"that will give us all zeros."},{"Start":"03:54.310 ","End":"03:56.270","Text":"We just have 2 equations."},{"Start":"03:56.270 ","End":"03:59.149","Text":"Here they are, z is independent,"},{"Start":"03:59.149 ","End":"04:00.500","Text":"x and y are dependent."},{"Start":"04:00.500 ","End":"04:02.735","Text":"As usual, we\u0027ll let z equals 1."},{"Start":"04:02.735 ","End":"04:04.879","Text":"But as I said before,"},{"Start":"04:04.879 ","End":"04:07.250","Text":"and z could be any non-zero."},{"Start":"04:07.250 ","End":"04:10.450","Text":"But whatever\u0027s convenient, 1 is convenient."},{"Start":"04:10.450 
","End":"04:13.785","Text":"If c is 1, then from here y is 2,"},{"Start":"04:13.785 ","End":"04:16.050","Text":"but here z is 1 and y is 2,"},{"Start":"04:16.050 ","End":"04:17.790","Text":"you get that 2,"},{"Start":"04:17.790 ","End":"04:20.370","Text":"x is 2, so x is 1."},{"Start":"04:20.370 ","End":"04:28.005","Text":"We found the eigenvector for the value 1 and it\u0027s 1, 2, 1."},{"Start":"04:28.005 ","End":"04:30.185","Text":"Now the third and last,"},{"Start":"04:30.185 ","End":"04:33.125","Text":"the eigenvalue minus 2,"},{"Start":"04:33.125 ","End":"04:38.450","Text":"the characteristic matrix substitute x equals minus 2. This is what we get."},{"Start":"04:38.450 ","End":"04:40.490","Text":"For example, if x is minus 2, minus 2,"},{"Start":"04:40.490 ","End":"04:43.415","Text":"minus 1 is minus 3."},{"Start":"04:43.415 ","End":"04:49.445","Text":"Here\u0027s the corresponding system of linear equations."},{"Start":"04:49.445 ","End":"04:52.205","Text":"Row operations."},{"Start":"04:52.205 ","End":"04:58.430","Text":"First row minus the second row gives us this,"},{"Start":"04:58.430 ","End":"05:03.470","Text":"or is it the second row minus the first row actually?"},{"Start":"05:03.470 ","End":"05:06.110","Text":"Then here, let\u0027s see."},{"Start":"05:06.110 ","End":"05:11.585","Text":"We want twice the first row plus"},{"Start":"05:11.585 ","End":"05:19.260","Text":"3 times 0 minus 3 times the last row will give us this."},{"Start":"05:19.550 ","End":"05:24.200","Text":"Well, this is the same as this multiplied by minus 1."},{"Start":"05:24.200 ","End":"05:28.855","Text":"This is redundant, so we just need the top 2 rows,"},{"Start":"05:28.855 ","End":"05:31.430","Text":"or you could add these and see that it\u0027s 0."},{"Start":"05:31.430 ","End":"05:33.675","Text":"Anyway, these 2 rows are what count,"},{"Start":"05:33.675 ","End":"05:37.550","Text":"We get this system of equations, 3 unknowns,"},{"Start":"05:37.550 ","End":"05:39.995","Text":"2 equations, z is an independent,"},{"Start":"05:39.995 ","End":"05:41.510","Text":"can be whatever we like."},{"Start":"05:41.510 ","End":"05:43.280","Text":"X and y depend on z."},{"Start":"05:43.280 ","End":"05:45.430","Text":"Let\u0027s let z equal 1."},{"Start":"05:45.430 ","End":"05:47.700","Text":"From the last equation."},{"Start":"05:47.700 ","End":"05:48.860","Text":"When z is 1,"},{"Start":"05:48.860 ","End":"05:50.525","Text":"we\u0027ll get that y is 1,"},{"Start":"05:50.525 ","End":"05:53.030","Text":"and then substitute both of these in here."},{"Start":"05:53.030 ","End":"05:57.545","Text":"We\u0027ll get minus 3x is 3 or x is minus 1."},{"Start":"05:57.545 ","End":"06:03.125","Text":"Which brings us to the eigenvector for the eigenvalue minus 2,"},{"Start":"06:03.125 ","End":"06:06.840","Text":"minus 1, 1, 1, and we\u0027re done."}],"ID":25737},{"Watched":false,"Name":"Exercise 3","Duration":"7m 42s","ChapterTopicVideoID":24825,"CourseChapterTopicPlaylistID":7320,"HasSubtitles":true,"ThumbnailPath":null,"UploadDate":null,"DurationForVideoObject":null,"Description":null,"MetaTitle":null,"MetaDescription":null,"Canonical":null,"VideoComments":[],"Subtitles":[{"Start":"00:00.740 ","End":"00:06.115","Text":"Here we have to find the eigenvalues and eigenvectors of this matrix A."},{"Start":"00:06.115 ","End":"00:09.885","Text":"We start by finding the characteristic matrix,"},{"Start":"00:09.885 ","End":"00:11.400","Text":"which is defined thus."},{"Start":"00:11.400 ","End":"00:14.700","Text":"But in practice, it just 
means that we take matrix with"},{"Start":"00:14.700 ","End":"00:19.065","Text":"all x\u0027s on the diagonal and subtract our matrix."},{"Start":"00:19.065 ","End":"00:21.240","Text":"Here we get this,"},{"Start":"00:21.240 ","End":"00:23.460","Text":"which is our characteristic matrix."},{"Start":"00:23.460 ","End":"00:26.370","Text":"Then we\u0027re going to use the characteristic matrix to find"},{"Start":"00:26.370 ","End":"00:34.100","Text":"the characteristic polynomial which is just the determinant of the characteristic matrix."},{"Start":"00:34.100 ","End":"00:41.645","Text":"It\u0027s like this. Let\u0027s compute it by expansion along the first row."},{"Start":"00:41.645 ","End":"00:43.850","Text":"If we do that,"},{"Start":"00:43.850 ","End":"00:46.005","Text":"this is what we\u0027ll get, we\u0027ve done it so many times,"},{"Start":"00:46.005 ","End":"00:47.450","Text":"I won\u0027t go into the details."},{"Start":"00:47.450 ","End":"00:49.040","Text":"Then we compute all 3,"},{"Start":"00:49.040 ","End":"00:51.040","Text":"2 by 2 determinants."},{"Start":"00:51.040 ","End":"00:58.850","Text":"We get this, this square brackets comes out to be x squared minus 3x plus 1."},{"Start":"00:58.850 ","End":"01:00.560","Text":"If we add these 2,"},{"Start":"01:00.560 ","End":"01:01.610","Text":"these are linear terms."},{"Start":"01:01.610 ","End":"01:02.930","Text":"It comes out to be this."},{"Start":"01:02.930 ","End":"01:05.810","Text":"Now, we take x minus 1 up and what we\u0027re left with is x"},{"Start":"01:05.810 ","End":"01:10.415","Text":"squared minus 3x plus 1 minus 5, which is this."},{"Start":"01:10.415 ","End":"01:13.520","Text":"This is our characteristic polynomial."},{"Start":"01:13.520 ","End":"01:16.070","Text":"We\u0027ll use it to find the eigenvalues."},{"Start":"01:16.070 ","End":"01:20.335","Text":"The way we do this is we start with the characteristic matrix."},{"Start":"01:20.335 ","End":"01:25.315","Text":"I meant the characteristic polynomial and set it to 0."},{"Start":"01:25.315 ","End":"01:29.690","Text":"We get what we had before equals 0."},{"Start":"01:29.690 ","End":"01:34.220","Text":"Either x is 1 from the linear term and the quadratic,"},{"Start":"01:34.220 ","End":"01:36.830","Text":"you can do from the formula or by factorization,"},{"Start":"01:36.830 ","End":"01:40.410","Text":"whatever, you get these 2 solutions."},{"Start":"01:40.410 ","End":"01:43.875","Text":"These are our 3 eigenvalues."},{"Start":"01:43.875 ","End":"01:47.150","Text":"For each of them we\u0027ll find an eigenvector."},{"Start":"01:47.150 ","End":"01:52.420","Text":"We\u0027ll do them one at a time starting with x equals 1."},{"Start":"01:52.420 ","End":"01:58.310","Text":"We take the characteristic matrix and substitute x equals 1,"},{"Start":"01:58.310 ","End":"02:00.260","Text":"and we get this matrix."},{"Start":"02:00.260 ","End":"02:05.630","Text":"Then we write the corresponding system of linear equations, which is this."},{"Start":"02:05.630 ","End":"02:08.270","Text":"Then some row operations."},{"Start":"02:08.270 ","End":"02:13.385","Text":"If we take this row and subtract twice this row,"},{"Start":"02:13.385 ","End":"02:16.130","Text":"we will get this."},{"Start":"02:16.130 ","End":"02:20.825","Text":"If we just negate the last row, we get this."},{"Start":"02:20.825 ","End":"02:30.110","Text":"Next I want to get the last row to be 0 here."},{"Start":"02:30.110 ","End":"02:36.125","Text":"I subtract twice the first row from the last 
row,"},{"Start":"02:36.125 ","End":"02:38.440","Text":"and that gives us this."},{"Start":"02:38.440 ","End":"02:42.020","Text":"Then we note that this is essentially the same as this,"},{"Start":"02:42.020 ","End":"02:43.339","Text":"just with a minus."},{"Start":"02:43.339 ","End":"02:45.470","Text":"We could multiply it by minus,"},{"Start":"02:45.470 ","End":"02:48.035","Text":"or we could just add this to this and get all 0\u0027s."},{"Start":"02:48.035 ","End":"02:52.850","Text":"In any event, we can ignore the last row and we get the following system of"},{"Start":"02:52.850 ","End":"02:59.630","Text":"linear equations where we see that z is independent and x and y are dependent."},{"Start":"02:59.630 ","End":"03:04.335","Text":"We typically set z to equal 1."},{"Start":"03:04.335 ","End":"03:06.980","Text":"If we do that from the last equation,"},{"Start":"03:06.980 ","End":"03:09.515","Text":"we get that y is minus 2."},{"Start":"03:09.515 ","End":"03:11.360","Text":"Now that we have z and y here,"},{"Start":"03:11.360 ","End":"03:14.960","Text":"we substitute and you will check that x is equal to 1."},{"Start":"03:14.960 ","End":"03:18.890","Text":"If we just put these in the right order, 1 minus 2,"},{"Start":"03:18.890 ","End":"03:24.100","Text":"1, we\u0027ve got the eigenvector for the eigenvalue 1."},{"Start":"03:24.100 ","End":"03:27.825","Text":"Next, we\u0027ll take the eigenvalue 4."},{"Start":"03:27.825 ","End":"03:31.759","Text":"We want to take this characteristic matrix and substitute"},{"Start":"03:31.759 ","End":"03:39.215","Text":"4 which gives us this matrix and the corresponding system of linear equations."},{"Start":"03:39.215 ","End":"03:42.980","Text":"Then some row operations like if we take"},{"Start":"03:42.980 ","End":"03:48.485","Text":"3 times this row and add it to this row we get this."},{"Start":"03:48.485 ","End":"03:55.965","Text":"If we take twice this and 3 times this and add,"},{"Start":"03:55.965 ","End":"04:01.130","Text":"then we get this last row is just the minus times this row,"},{"Start":"04:01.130 ","End":"04:02.855","Text":"so we don\u0027t need that."},{"Start":"04:02.855 ","End":"04:06.920","Text":"From the first 2, we get a system of linear equations."},{"Start":"04:06.920 ","End":"04:13.850","Text":"We see that z can be an independent variable and x and y computed from it."},{"Start":"04:13.850 ","End":"04:18.230","Text":"Let\u0027s let z equal, say 1."},{"Start":"04:18.230 ","End":"04:21.020","Text":"Then from the last equation,"},{"Start":"04:21.020 ","End":"04:23.450","Text":"z is 1, y is also 1."},{"Start":"04:23.450 ","End":"04:27.860","Text":"Then plug into the top 1 and we get that x is 1."},{"Start":"04:27.860 ","End":"04:30.800","Text":"We found the eigenvector 1,"},{"Start":"04:30.800 ","End":"04:33.850","Text":"1, 1 for the eigenvalue 4."},{"Start":"04:33.850 ","End":"04:38.540","Text":"One more to go. 
We still have the minus 1 eigenvalue."},{"Start":"04:38.540 ","End":"04:44.120","Text":"The characteristic matrix substitute x equals minus 1."},{"Start":"04:44.120 ","End":"04:46.040","Text":"That\u0027s a typo,"},{"Start":"04:46.040 ","End":"04:49.070","Text":"I meant to say x equals 4."},{"Start":"04:49.070 ","End":"04:53.120","Text":"Then we get 4 minus 1 is 3 and so on."},{"Start":"04:53.120 ","End":"04:57.830","Text":"We get this matrix and the corresponding system of linear equations."},{"Start":"04:57.830 ","End":"05:00.440","Text":"Then row operations like, I don\u0027t know,"},{"Start":"05:00.440 ","End":"05:04.235","Text":"3 times this plus this will give us this."},{"Start":"05:04.235 ","End":"05:09.305","Text":"Also if we take twice this and 3 times this, we get this."},{"Start":"05:09.305 ","End":"05:13.505","Text":"This is the same as this just multiplied by minus 1 so we only need the first 2."},{"Start":"05:13.505 ","End":"05:18.335","Text":"They give us this system of equations where we see that"},{"Start":"05:18.335 ","End":"05:25.405","Text":"z can be anything we like and x and y are dependent so let\u0027s take z equals 1."},{"Start":"05:25.405 ","End":"05:27.705","Text":"From the last equation,"},{"Start":"05:27.705 ","End":"05:29.670","Text":"we see that y is also 1."},{"Start":"05:29.670 ","End":"05:30.950","Text":"Then from the first equation,"},{"Start":"05:30.950 ","End":"05:32.470","Text":"that will give us x equals 1."},{"Start":"05:32.470 ","End":"05:34.730","Text":"If we take them in the order x, y,"},{"Start":"05:34.730 ","End":"05:39.690","Text":"z, because here the order doesn\u0027t matter because they\u0027re all one, but yeah."},{"Start":"05:39.690 ","End":"05:43.070","Text":"The eigenvector 1, 1,"},{"Start":"05:43.070 ","End":"05:46.855","Text":"1 belongs to the eigenvalue 4."},{"Start":"05:46.855 ","End":"05:50.210","Text":"Finally, the third eigenvalue,"},{"Start":"05:50.210 ","End":"05:52.100","Text":"x equals minus 1,"},{"Start":"05:52.100 ","End":"05:56.930","Text":"we substituted into the characteristic matrix."},{"Start":"05:56.930 ","End":"06:00.300","Text":"Wherever we see x, we put minus 1."},{"Start":"06:00.530 ","End":"06:08.555","Text":"We get this with the corresponding system of linear equations."},{"Start":"06:08.555 ","End":"06:12.655","Text":"Then we want to bring this into row echelon form."},{"Start":"06:12.655 ","End":"06:15.410","Text":"You must be pretty used to doing this by now,"},{"Start":"06:15.410 ","End":"06:18.425","Text":"I\u0027ll just quote the result."},{"Start":"06:18.425 ","End":"06:24.570","Text":"Briefly say I took this row and subtracted 1/2 of this row."},{"Start":"06:24.570 ","End":"06:26.775","Text":"We get this."},{"Start":"06:26.775 ","End":"06:33.675","Text":"This row at minus the top row gives all 0\u0027s."},{"Start":"06:33.675 ","End":"06:36.440","Text":"Now, I\u0027ve got a lot of negatives and I have fractions."},{"Start":"06:36.440 ","End":"06:42.040","Text":"What I\u0027m going to do is multiply everything by minus 1."},{"Start":"06:42.040 ","End":"06:47.195","Text":"Well, the top row by minus 1 and this will multiply by minus 2."},{"Start":"06:47.195 ","End":"06:52.075","Text":"That gives us this with just a little bit easier to deal with."},{"Start":"06:52.075 ","End":"06:55.620","Text":"The corresponding system of linear equations,"},{"Start":"06:55.620 ","End":"06:58.115","Text":"they\u0027re 2 equations, 3 unknowns."},{"Start":"06:58.115 ","End":"07:02.515","Text":"The free variable I put it in a 
different color is the z."},{"Start":"07:02.515 ","End":"07:07.130","Text":"Then we can compute x and y from z."},{"Start":"07:07.130 ","End":"07:11.285","Text":"To get a basis, we could let z equals 1."},{"Start":"07:11.285 ","End":"07:13.490","Text":"Regardless of z,"},{"Start":"07:13.490 ","End":"07:15.995","Text":"y is going to be 0."},{"Start":"07:15.995 ","End":"07:18.740","Text":"If we plug in z equals 1,"},{"Start":"07:18.740 ","End":"07:20.945","Text":"y equals 0 in here,"},{"Start":"07:20.945 ","End":"07:24.285","Text":"we get 2x plus 2 equals 0."},{"Start":"07:24.285 ","End":"07:26.030","Text":"X is minus 1,"},{"Start":"07:26.030 ","End":"07:28.610","Text":"and we take them in order x, y, and z."},{"Start":"07:28.610 ","End":"07:33.670","Text":"We can get a basis for the eigenvectors,"},{"Start":"07:33.670 ","End":"07:38.250","Text":"which will be minus 1, 0, 1."},{"Start":"07:38.250 ","End":"07:42.880","Text":"That\u0027s the third and final one and we are done."}],"ID":25738},{"Watched":false,"Name":"Exercise 4","Duration":"3m 15s","ChapterTopicVideoID":24826,"CourseChapterTopicPlaylistID":7320,"HasSubtitles":true,"ThumbnailPath":null,"UploadDate":null,"DurationForVideoObject":null,"Description":null,"MetaTitle":null,"MetaDescription":null,"Canonical":null,"VideoComments":[],"Subtitles":[{"Start":"00:00.480 ","End":"00:06.880","Text":"Here we need to find the eigenvalues and eigenvectors of this 2-by-2 matrix A,"},{"Start":"00:06.880 ","End":"00:12.490","Text":"and we start by computing the characteristic matrix defined like this."},{"Start":"00:12.490 ","End":"00:16.660","Text":"What it is, is x is on the diagonal, that\u0027s this part,"},{"Start":"00:16.660 ","End":"00:18.860","Text":"minus A is this part,"},{"Start":"00:18.860 ","End":"00:21.730","Text":"and this was the result of the subtraction."},{"Start":"00:21.730 ","End":"00:24.804","Text":"Now that we have the characteristic matrix,"},{"Start":"00:24.804 ","End":"00:27.805","Text":"we can use it to find the characteristic polynomial."},{"Start":"00:27.805 ","End":"00:33.814","Text":"We find this by simply taking the determinant of the characteristic matrix."},{"Start":"00:33.814 ","End":"00:37.780","Text":"Like so, or you could have written it with vertical bars,"},{"Start":"00:37.780 ","End":"00:40.105","Text":"or however you write it, it\u0027s okay."},{"Start":"00:40.105 ","End":"00:43.190","Text":"This diagonal minus this diagonal,"},{"Start":"00:43.190 ","End":"00:50.560","Text":"so x minus 1 squared minus 4 because of all the minus signs."},{"Start":"00:50.560 ","End":"00:53.140","Text":"This comes out x squared minus 2x,"},{"Start":"00:53.140 ","End":"00:54.665","Text":"minus 3,"},{"Start":"00:54.665 ","End":"00:57.190","Text":"and that\u0027s our characteristic polynomial."},{"Start":"00:57.190 ","End":"00:59.500","Text":"Next we move on to eigenvalues."},{"Start":"00:59.500 ","End":"01:06.200","Text":"We get the eigenvalues by setting the characteristic polynomial to 0,"},{"Start":"01:06.200 ","End":"01:09.285","Text":"which gives us this quadratic equation,"},{"Start":"01:09.285 ","End":"01:12.870","Text":"and you can do this by formula,"},{"Start":"01:12.870 ","End":"01:14.250","Text":"by factorization, anyway,"},{"Start":"01:14.250 ","End":"01:16.605","Text":"you get minus 1 and 3."},{"Start":"01:16.605 ","End":"01:18.505","Text":"These are our 2 eigenvalues,"},{"Start":"01:18.505 ","End":"01:21.595","Text":"and for each of them we\u0027ll search for an eigenvector."},{"Start":"01:21.595 
","End":"01:24.340","Text":"Let\u0027s start with the minus 1."},{"Start":"01:24.340 ","End":"01:30.485","Text":"We take the characteristic matrix and substitute x equals minus 1,"},{"Start":"01:30.485 ","End":"01:36.070","Text":"like so, and this is what we get."},{"Start":"01:36.070 ","End":"01:41.480","Text":"If we subtract twice this row from this row,"},{"Start":"01:41.480 ","End":"01:44.470","Text":"we\u0027ll get 0, 0 here."},{"Start":"01:44.470 ","End":"01:49.250","Text":"Our system of equations is really just 1 equation which is this."},{"Start":"01:49.250 ","End":"01:51.410","Text":"Here x would be, say,"},{"Start":"01:51.410 ","End":"01:53.839","Text":"the dependent variable and y the independent."},{"Start":"01:53.839 ","End":"01:55.535","Text":"We can let it be whatever we want."},{"Start":"01:55.535 ","End":"01:56.990","Text":"We often use the value 1,"},{"Start":"01:56.990 ","End":"01:58.475","Text":"but that will make x a fraction."},{"Start":"01:58.475 ","End":"02:00.925","Text":"Let\u0027s take y equals 2."},{"Start":"02:00.925 ","End":"02:05.730","Text":"Then if y is 2, we get that,"},{"Start":"02:05.730 ","End":"02:07.789","Text":"bring y to the other side,"},{"Start":"02:07.789 ","End":"02:09.830","Text":"minus 2x is 2,"},{"Start":"02:09.830 ","End":"02:12.100","Text":"so x is minus 1."},{"Start":"02:12.100 ","End":"02:13.860","Text":"Get them in the right order,"},{"Start":"02:13.860 ","End":"02:15.120","Text":"first x, and first y,"},{"Start":"02:15.120 ","End":"02:19.775","Text":"and this is our eigenvector corresponding to the eigenvalue minus 1."},{"Start":"02:19.775 ","End":"02:23.270","Text":"Remember the other eigenvalue was x equals 3,"},{"Start":"02:23.270 ","End":"02:26.915","Text":"and this is what we substitute in the characteristic matrix."},{"Start":"02:26.915 ","End":"02:31.195","Text":"To get this 2-by-2 matrix,"},{"Start":"02:31.195 ","End":"02:33.170","Text":"can do row operations."},{"Start":"02:33.170 ","End":"02:37.805","Text":"Specifically, I want to add twice the top row to the last row,"},{"Start":"02:37.805 ","End":"02:40.160","Text":"and that will make the last row 0,"},{"Start":"02:40.160 ","End":"02:41.945","Text":"0, so we can ignore it,"},{"Start":"02:41.945 ","End":"02:43.910","Text":"so really just the first row counts."},{"Start":"02:43.910 ","End":"02:47.790","Text":"Our system of equations is just the 1 equation,"},{"Start":"02:47.790 ","End":"02:50.370","Text":"2x minus y is 0."},{"Start":"02:50.370 ","End":"02:53.500","Text":"I let y be anything I want."},{"Start":"02:54.410 ","End":"02:57.570","Text":"I\u0027m thinking y equals 2 because if I take"},{"Start":"02:57.570 ","End":"02:59.985","Text":"1x it will come out a fraction and I don\u0027t want fractions."},{"Start":"02:59.985 ","End":"03:01.230","Text":"I\u0027ll take y equals 2,"},{"Start":"03:01.230 ","End":"03:03.690","Text":"then 2x is 1,"},{"Start":"03:03.690 ","End":"03:09.095","Text":"and that gives us our eigenvector for the value 3, the eigenvalue."},{"Start":"03:09.095 ","End":"03:10.610","Text":"First the x, of course,"},{"Start":"03:10.610 ","End":"03:11.630","Text":"and then the y 1,"},{"Start":"03:11.630 ","End":"03:15.000","Text":"2, and that\u0027s it. 
We\u0027re done."}],"ID":25739},{"Watched":false,"Name":"Exercise 5","Duration":"8m 8s","ChapterTopicVideoID":24827,"CourseChapterTopicPlaylistID":7320,"HasSubtitles":true,"ThumbnailPath":null,"UploadDate":null,"DurationForVideoObject":null,"Description":null,"MetaTitle":null,"MetaDescription":null,"Canonical":null,"VideoComments":[],"Subtitles":[{"Start":"00:00.500 ","End":"00:04.050","Text":"In this exercise, we want to find the eigenvalues and"},{"Start":"00:04.050 ","End":"00:07.830","Text":"eigenvectors of this 3 by 3 matrix A."},{"Start":"00:07.830 ","End":"00:11.205","Text":"We begin by computing the characteristic matrix,"},{"Start":"00:11.205 ","End":"00:13.920","Text":"which is defined as follows."},{"Start":"00:13.920 ","End":"00:19.770","Text":"This is what it means, x is on the diagonal and we subtract our matrix."},{"Start":"00:19.770 ","End":"00:22.125","Text":"This is what we get."},{"Start":"00:22.125 ","End":"00:24.900","Text":"After the characteristic matrix,"},{"Start":"00:24.900 ","End":"00:27.795","Text":"we want the characteristic polynomial,"},{"Start":"00:27.795 ","End":"00:32.455","Text":"which is actually just the determinant of the characteristic matrix."},{"Start":"00:32.455 ","End":"00:35.900","Text":"We have to compute this 3 by 3 determinant."},{"Start":"00:35.900 ","End":"00:41.855","Text":"Let\u0027s expand along the first row."},{"Start":"00:41.855 ","End":"00:44.225","Text":"You know how to do this."},{"Start":"00:44.225 ","End":"00:46.405","Text":"This is what we get,"},{"Start":"00:46.405 ","End":"00:49.610","Text":"x minus 1 times a 2 by 2 determinants and so on."},{"Start":"00:49.610 ","End":"00:52.184","Text":"Just be careful with the signs."},{"Start":"00:52.184 ","End":"00:54.825","Text":"This is plus, minus, plus."},{"Start":"00:54.825 ","End":"00:56.250","Text":"The minus with the minus 1,"},{"Start":"00:56.250 ","End":"00:58.820","Text":"makes it a plus 1 just be careful there."},{"Start":"00:58.820 ","End":"01:01.624","Text":"This is fairly routine to compute."},{"Start":"01:01.624 ","End":"01:04.585","Text":"We just have 2 by 2 determinants."},{"Start":"01:04.585 ","End":"01:07.380","Text":"This is what we get."},{"Start":"01:07.380 ","End":"01:09.945","Text":"A bit of simplification."},{"Start":"01:09.945 ","End":"01:15.870","Text":"Then we\u0027ll take x minus 1 out of the brackets and get this expression."},{"Start":"01:15.870 ","End":"01:22.050","Text":"This is our characteristic polynomial p of x."},{"Start":"01:22.050 ","End":"01:25.735","Text":"Next we\u0027re going to find the eigenvalues."},{"Start":"01:25.735 ","End":"01:32.275","Text":"We get the eigenvalues by setting the characteristic polynomial to 0,"},{"Start":"01:32.275 ","End":"01:35.085","Text":"which gives us this equation."},{"Start":"01:35.085 ","End":"01:38.375","Text":"We know that either this is 0 or this quadratic is 0."},{"Start":"01:38.375 ","End":"01:41.855","Text":"This is the easy part, if x minus 1 is 0, then x is 1."},{"Start":"01:41.855 ","End":"01:44.930","Text":"Now we\u0027ll take the other expression as a quadratic."},{"Start":"01:44.930 ","End":"01:49.535","Text":"But this quadratic has complex roots."},{"Start":"01:49.535 ","End":"01:53.180","Text":"We can simplify it by writing the minus 12,"},{"Start":"01:53.180 ","End":"01:55.745","Text":"as minus 1 times 4 times 3."},{"Start":"01:55.745 ","End":"02:01.450","Text":"This will be good because we know that root 4 is 2 and root of minus 1 is i."},{"Start":"02:01.450 ","End":"02:06.455","Text":"We get 
this and obviously now we\u0027re just going to divide by"},{"Start":"02:06.455 ","End":"02:11.550","Text":"2 and this just gives us 1 plus or minus root 3i."},{"Start":"02:11.550 ","End":"02:18.960","Text":"Summarizing, we had x equals 1 and then we had 1 plus or minus root 3i,"},{"Start":"02:18.960 ","End":"02:21.830","Text":"that\u0027s these 2 complex conjugates."},{"Start":"02:21.830 ","End":"02:27.020","Text":"Next, we want to find eigenvectors for each of these 3."},{"Start":"02:27.020 ","End":"02:30.455","Text":"Let\u0027s start with the real one,"},{"Start":"02:30.455 ","End":"02:33.815","Text":"the x equals 1 eigenvalue."},{"Start":"02:33.815 ","End":"02:37.910","Text":"Hope you recall, we start with the characteristic matrix and"},{"Start":"02:37.910 ","End":"02:42.340","Text":"substitute this eigenvalue in here."},{"Start":"02:42.340 ","End":"02:46.085","Text":"That gives us this matrix,"},{"Start":"02:46.085 ","End":"02:51.600","Text":"which has a corresponding system of linear equations."},{"Start":"02:51.670 ","End":"02:58.265","Text":"Let\u0027s do some row operations on the matrix that will help us."},{"Start":"02:58.265 ","End":"03:04.625","Text":"I get this by adding the second row to the third row."},{"Start":"03:04.625 ","End":"03:09.550","Text":"Then add the first row to the second row and get this."},{"Start":"03:09.550 ","End":"03:13.820","Text":"At this point I noticed that the third and second row are almost the same."},{"Start":"03:13.820 ","End":"03:16.310","Text":"The third row is just minus 1 times this,"},{"Start":"03:16.310 ","End":"03:18.590","Text":"but it\u0027s the same equation."},{"Start":"03:18.590 ","End":"03:23.120","Text":"I could actually add these two and get just 0s here."},{"Start":"03:23.120 ","End":"03:27.090","Text":"We only get an equation from the top 2,"},{"Start":"03:27.090 ","End":"03:31.535","Text":"so we have 2 equations in three unknowns, x, y, z."},{"Start":"03:31.535 ","End":"03:33.680","Text":"Z is an independent variable."},{"Start":"03:33.680 ","End":"03:35.195","Text":"X and y are dependent."},{"Start":"03:35.195 ","End":"03:37.250","Text":"We let z be whatever we want."},{"Start":"03:37.250 ","End":"03:39.295","Text":"I\u0027ll choose z equals 1."},{"Start":"03:39.295 ","End":"03:42.030","Text":"If Z is 1, that forces y to be 1,"},{"Start":"03:42.030 ","End":"03:45.180","Text":"and if z is 1 and y is 1,"},{"Start":"03:45.180 ","End":"03:48.360","Text":"then x is also 1,"},{"Start":"03:48.360 ","End":"03:51.450","Text":"so we have our eigenvector 1, 1,"},{"Start":"03:51.450 ","End":"03:55.210","Text":"1 for the eigenvalue 1."},{"Start":"03:55.210 ","End":"04:03.440","Text":"Next we\u0027ll take one of the complex eigenvalues 1 plus root 3i. Same principle."},{"Start":"04:03.440 ","End":"04:09.695","Text":"We start with the characteristic matrix and then substitute this eigenvalue for x."},{"Start":"04:09.695 ","End":"04:14.330","Text":"This substitution here leads to this matrix and here\u0027s"},{"Start":"04:14.330 ","End":"04:19.745","Text":"the corresponding system of equations. Let\u0027s see."},{"Start":"04:19.745 ","End":"04:22.630","Text":"Again, we\u0027re going to do some row operations."},{"Start":"04:22.630 ","End":"04:25.170","Text":"I\u0027d like to have the 1 here,"},{"Start":"04:25.170 ","End":"04:30.385","Text":"so I just switch row 1 with row 2 and we get this."},{"Start":"04:30.385 ","End":"04:32.855","Text":"Next. 
We\u0027ll do two things at once."},{"Start":"04:32.855 ","End":"04:40.610","Text":"We\u0027ll replace row 2 by row 2 minus root 3i times row 1."},{"Start":"04:40.610 ","End":"04:43.850","Text":"In other words, I multiply this by root"},{"Start":"04:43.850 ","End":"04:47.900","Text":"3i and then subtract from here to give me a 0 here."},{"Start":"04:47.900 ","End":"04:50.540","Text":"I also, for row 3,"},{"Start":"04:50.540 ","End":"04:53.110","Text":"just add to it row 1."},{"Start":"04:53.110 ","End":"04:57.175","Text":"Row 3 plus row 1 goes into row 3."},{"Start":"04:57.175 ","End":"05:03.600","Text":"If we do those calculations, we get this."},{"Start":"05:03.600 ","End":"05:06.535","Text":"This is what we wanted, to have 0s here."},{"Start":"05:06.535 ","End":"05:09.515","Text":"We\u0027re aiming for the row echelon form."},{"Start":"05:09.515 ","End":"05:10.860","Text":"I want to get a 0 here,"},{"Start":"05:10.860 ","End":"05:20.580","Text":"so what I do is I replace the third row by twice the third row,"},{"Start":"05:20.580 ","End":"05:26.045","Text":"like from this 2, I multiply by this and I subtract this times the second row."},{"Start":"05:26.045 ","End":"05:29.340","Text":"If you do the computation,"},{"Start":"05:29.660 ","End":"05:32.250","Text":"you\u0027ll see that we get this."},{"Start":"05:32.250 ","End":"05:34.190","Text":"Now we have the 0s here,"},{"Start":"05:34.190 ","End":"05:37.460","Text":"here and here, which is row echelon form."},{"Start":"05:37.460 ","End":"05:39.920","Text":"This is the problematic elements,"},{"Start":"05:39.920 ","End":"05:45.270","Text":"I Just expanded the square here and this,"},{"Start":"05:45.270 ","End":"05:48.100","Text":"if I just collect it,"},{"Start":"05:48.100 ","End":"05:50.360","Text":"we end up with this,"},{"Start":"05:50.360 ","End":"05:53.390","Text":"and this actually comes out to be 0."},{"Start":"05:53.390 ","End":"05:57.420","Text":"We only have two equations,"},{"Start":"05:57.890 ","End":"06:04.225","Text":"so z will be the independent variable."},{"Start":"06:04.225 ","End":"06:07.170","Text":"I chose to set z to be minus 2."},{"Start":"06:07.170 ","End":"06:09.390","Text":"I messed with it. If it\u0027s 1, you get a fraction."},{"Start":"06:09.390 ","End":"06:12.360","Text":"If you put it as 2 it\u0027s okay,"},{"Start":"06:12.360 ","End":"06:14.745","Text":"but minus 2 comes out a bit neater."},{"Start":"06:14.745 ","End":"06:18.000","Text":"I let z be minus 2."},{"Start":"06:18.000 ","End":"06:24.350","Text":"Then if you isolate y,"},{"Start":"06:24.350 ","End":"06:29.060","Text":"you take this to the other side and divide and we get through the"},{"Start":"06:29.060 ","End":"06:35.280","Text":"the computation y equals 1 plus root 3i."},{"Start":"06:37.640 ","End":"06:44.630","Text":"2y goes to the other side and we divide by minus 2 and the z is minus 2,"},{"Start":"06:44.630 ","End":"06:48.620","Text":"so we\u0027re left with just this thing. 
That\u0027s z and y."},{"Start":"06:48.620 ","End":"06:50.665","Text":"And then we put them in here."},{"Start":"06:50.665 ","End":"06:53.850","Text":"We can compute x once we have y and z,"},{"Start":"06:53.850 ","End":"06:56.205","Text":"and it comes out to be this."},{"Start":"06:56.205 ","End":"06:57.890","Text":"Arrange them in the right order,"},{"Start":"06:57.890 ","End":"07:00.455","Text":"first x, then y, then z."},{"Start":"07:00.455 ","End":"07:06.650","Text":"This is the vector we get for the eigenvalue 1 plus root 3i."},{"Start":"07:06.650 ","End":"07:09.530","Text":"That was quite a lot of work."},{"Start":"07:09.530 ","End":"07:15.525","Text":"But we can take a shortcut for the 1 minus root 3i."},{"Start":"07:15.525 ","End":"07:19.865","Text":"For this we use a proposition you may have seen this before,"},{"Start":"07:19.865 ","End":"07:25.060","Text":"which basically says that if you have an eigenvector for a complex eigenvalue,"},{"Start":"07:25.060 ","End":"07:32.615","Text":"like the 1 plus root 3i and you now want the eigenvector for the conjugate eigenvalue,"},{"Start":"07:32.615 ","End":"07:40.445","Text":"all you have to do is take this vector and in each of these three entries,"},{"Start":"07:40.445 ","End":"07:43.205","Text":"we replace it by its conjugate."},{"Start":"07:43.205 ","End":"07:46.085","Text":"That\u0027s what this proposition says essentially."},{"Start":"07:46.085 ","End":"07:49.250","Text":"If I take the conjugate of each of these three,"},{"Start":"07:49.250 ","End":"07:51.560","Text":"then without any major computations,"},{"Start":"07:51.560 ","End":"07:53.840","Text":"we say, 1 minus root 3i,"},{"Start":"07:53.840 ","End":"07:57.255","Text":"so it\u0027s plus 1 plus root 3i."},{"Start":"07:57.255 ","End":"07:58.895","Text":"This one is a minus,"},{"Start":"07:58.895 ","End":"08:00.110","Text":"this is a real number,"},{"Start":"08:00.110 ","End":"08:01.685","Text":"it stays the same,"},{"Start":"08:01.685 ","End":"08:08.850","Text":"so this is our answer for the last eigenvalue, and we\u0027re done."}],"ID":25740},{"Watched":false,"Name":"Exercise 6","Duration":"2m 12s","ChapterTopicVideoID":24828,"CourseChapterTopicPlaylistID":7320,"HasSubtitles":true,"ThumbnailPath":null,"UploadDate":null,"DurationForVideoObject":null,"Description":null,"MetaTitle":null,"MetaDescription":null,"Canonical":null,"VideoComments":[],"Subtitles":[{"Start":"00:00.000 ","End":"00:05.490","Text":"In this exercise, we\u0027re given the following 3-by-3 matrix A,"},{"Start":"00:05.490 ","End":"00:09.570","Text":"which involves a parameter k. 
The question is,"},{"Start":"00:09.570 ","End":"00:17.385","Text":"for which value or values of the parameter k will 2 be an eigenvalue of A?"},{"Start":"00:17.385 ","End":"00:23.795","Text":"Well, the eigenvalues of A are just the solutions of the characteristic equation."},{"Start":"00:23.795 ","End":"00:26.435","Text":"The 2 bars mean determinant, of course,"},{"Start":"00:26.435 ","End":"00:31.955","Text":"the determinant of xI minus A equals 0 and I is the identity matrix,"},{"Start":"00:31.955 ","End":"00:34.610","Text":"3 by 3 in this case."},{"Start":"00:34.610 ","End":"00:37.730","Text":"If 2 is an eigenvalue,"},{"Start":"00:37.730 ","End":"00:42.665","Text":"that means we can plug in x equals 2 and it will satisfy the equation."},{"Start":"00:42.665 ","End":"00:44.800","Text":"This is what we get."},{"Start":"00:44.800 ","End":"00:47.040","Text":"2I is just,"},{"Start":"00:47.040 ","End":"00:50.655","Text":"2 is on the diagonal and A copied from here."},{"Start":"00:50.655 ","End":"00:55.380","Text":"We need to do the subtraction and the determinant."},{"Start":"00:55.380 ","End":"00:59.690","Text":"2 minus k minus 2 is 4 minus k,"},{"Start":"00:59.690 ","End":"01:01.220","Text":"and so on for the rest of them."},{"Start":"01:01.220 ","End":"01:06.240","Text":"We now have a determinant to compute and assign to 0."},{"Start":"01:07.040 ","End":"01:13.905","Text":"Easiest would be to expand along the 3rd row because there\u0027s a 0 here."},{"Start":"01:13.905 ","End":"01:17.960","Text":"Remember there\u0027s a checkerboard pattern of pluses and minuses."},{"Start":"01:17.960 ","End":"01:20.270","Text":"Plus, minus, plus, minus, plus."},{"Start":"01:20.270 ","End":"01:22.925","Text":"Both of these are pluses."},{"Start":"01:22.925 ","End":"01:29.560","Text":"We have k times what we get,"},{"Start":"01:29.560 ","End":"01:33.604","Text":"the determinant when we erase the row and column."},{"Start":"01:33.604 ","End":"01:39.870","Text":"You know how to do this. 8 times the determinant of this is here."},{"Start":"01:40.460 ","End":"01:43.110","Text":"The rest of it is just algebra."},{"Start":"01:43.110 ","End":"01:45.920","Text":"The 2 by 2 determinant is this diagonal product"},{"Start":"01:45.920 ","End":"01:49.325","Text":"minus this diagonal product and this is what we get."},{"Start":"01:49.325 ","End":"01:54.529","Text":"Then we just open it up and we get this quadratic equation."},{"Start":"01:54.529 ","End":"01:58.415","Text":"I\u0027m not going to waste time on solving quadratic equations."},{"Start":"01:58.415 ","End":"02:00.560","Text":"I\u0027ll just give you the answers."},{"Start":"02:00.560 ","End":"02:03.170","Text":"There\u0027s 2 values of k that satisfy this."},{"Start":"02:03.170 ","End":"02:08.615","Text":"One of them is 3 and the other is minus 32 over 9."},{"Start":"02:08.615 ","End":"02:12.570","Text":"That\u0027s the answer to the question. 
We\u0027re done."}],"ID":25741},{"Watched":false,"Name":"Exercise 7","Duration":"5m 5s","ChapterTopicVideoID":24829,"CourseChapterTopicPlaylistID":7320,"HasSubtitles":true,"ThumbnailPath":null,"UploadDate":null,"DurationForVideoObject":null,"Description":null,"MetaTitle":null,"MetaDescription":null,"Canonical":null,"VideoComments":[],"Subtitles":[{"Start":"00:00.000 ","End":"00:05.730","Text":"Here we have an exercise involving a square matrix A."},{"Start":"00:05.730 ","End":"00:06.810","Text":"There are 4 parts."},{"Start":"00:06.810 ","End":"00:08.865","Text":"Each of them is a true or false."},{"Start":"00:08.865 ","End":"00:16.660","Text":"Part a, 0 is an eigenvalue of A if and only if A is non-invertible."},{"Start":"00:17.480 ","End":"00:24.075","Text":"Part b, if A is invertible and Lambda is an eigenvalue of it,"},{"Start":"00:24.075 ","End":"00:28.995","Text":"then 1 over Lambda is an eigenvalue of the inverse of A."},{"Start":"00:28.995 ","End":"00:32.910","Text":"Part c and d talk about A and A transpose."},{"Start":"00:32.910 ","End":"00:36.750","Text":"They have the same characteristic polynomial, true or false?"},{"Start":"00:36.750 ","End":"00:40.810","Text":"They have the same eigenvectors, true or false?"},{"Start":"00:47.390 ","End":"00:51.150","Text":"Let\u0027s see. Turns out a is true."},{"Start":"00:51.150 ","End":"00:53.595","Text":"We have to prove it."},{"Start":"00:53.595 ","End":"00:56.760","Text":"Note this is an if and only if question,"},{"Start":"00:56.760 ","End":"00:58.335","Text":"so we have to go both ways."},{"Start":"00:58.335 ","End":"01:02.660","Text":"We start off with 0 is an eigenvalue of A."},{"Start":"01:02.660 ","End":"01:05.260","Text":"What does it mean to be an eigenvalue?"},{"Start":"01:05.260 ","End":"01:10.170","Text":"Means that Av is 0 times v for sum non-zero"},{"Start":"01:10.170 ","End":"01:15.425","Text":"v. This is just the same as this because 0v is 0,"},{"Start":"01:15.425 ","End":"01:17.665","Text":"0 vector that is."},{"Start":"01:17.665 ","End":"01:25.610","Text":"Now, this means that the kernel contains more than just the 0 because it\u0027s got sum v,"},{"Start":"01:25.610 ","End":"01:27.875","Text":"which is non-zero in it."},{"Start":"01:27.875 ","End":"01:33.830","Text":"When the kernel is non-zero for a square matrix,"},{"Start":"01:33.830 ","End":"01:41.550","Text":"that is true if and only if the matrix is non-invertible."},{"Start":"01:42.250 ","End":"01:44.360","Text":"Now onto b,"},{"Start":"01:44.360 ","End":"01:49.270","Text":"which turns out also to be true. 
We have to prove it."},{"Start":"01:49.270 ","End":"01:52.410","Text":"We\u0027re given that A is invertible and Lambda is"},{"Start":"01:52.410 ","End":"01:56.745","Text":"an eigenvalue of A and we\u0027re also given that Lambda is not 0."},{"Start":"01:56.745 ","End":"02:03.885","Text":"The eigenvalue means that Av is Lambda v for sum v, which is not 0."},{"Start":"02:03.885 ","End":"02:06.990","Text":"Multiply both sides by A inverse,"},{"Start":"02:06.990 ","End":"02:09.525","Text":"so the A inverse cancels out here."},{"Start":"02:09.525 ","End":"02:13.765","Text":"The A inverse slides past the scalar here."},{"Start":"02:13.765 ","End":"02:18.075","Text":"Next, multiply both sides by 1 over Lambda."},{"Start":"02:18.075 ","End":"02:21.485","Text":"Here it disappears and it comes out here."},{"Start":"02:21.485 ","End":"02:25.045","Text":"If you just look at it the other way,"},{"Start":"02:25.045 ","End":"02:29.750","Text":"this just means that 1 over Lambda is an eigenvalue of A inverse."},{"Start":"02:29.750 ","End":"02:34.250","Text":"Of course v is still non-zero. That was b."},{"Start":"02:34.250 ","End":"02:40.755","Text":"Now onto c. This 1 is also true."},{"Start":"02:40.755 ","End":"02:45.320","Text":"Remember we want to show that A and A transpose of the same characteristic polynomial."},{"Start":"02:45.320 ","End":"02:48.560","Text":"This is the definition of the characteristic polynomial,"},{"Start":"02:48.560 ","End":"02:52.315","Text":"the determinant of xI minus the matrix."},{"Start":"02:52.315 ","End":"02:54.150","Text":"Now remember, in general,"},{"Start":"02:54.150 ","End":"02:57.815","Text":"for any square matrix M,"},{"Start":"02:57.815 ","End":"03:02.620","Text":"the determinant is the same as the determinant of the transpose."},{"Start":"03:02.620 ","End":"03:07.930","Text":"In our case, the determinant of xI minus A,"},{"Start":"03:07.930 ","End":"03:09.845","Text":"which I took from the left side,"},{"Start":"03:09.845 ","End":"03:15.535","Text":"is equal to the determinant of the same thing transpose from the general principle."},{"Start":"03:15.535 ","End":"03:18.485","Text":"The transpose of a sum or difference,"},{"Start":"03:18.485 ","End":"03:22.405","Text":"you can apply it to each term separately."},{"Start":"03:22.405 ","End":"03:28.205","Text":"Transpose of I or anything times I is itself because this is asymmetric,"},{"Start":"03:28.205 ","End":"03:30.935","Text":"and the transpose of A is just A transpose."},{"Start":"03:30.935 ","End":"03:33.770","Text":"Look, we got this equals this,"},{"Start":"03:33.770 ","End":"03:35.165","Text":"and that\u0027s what we need."},{"Start":"03:35.165 ","End":"03:38.420","Text":"That\u0027s part c proven."},{"Start":"03:38.420 ","End":"03:43.325","Text":"However, part d turns out to be false."},{"Start":"03:43.325 ","End":"03:46.760","Text":"Which is somewhat surprising because we saw that"},{"Start":"03:46.760 ","End":"03:49.895","Text":"A and A transpose of the same characteristic polynomial."},{"Start":"03:49.895 ","End":"03:52.100","Text":"They have the same eigenvalues,"},{"Start":"03:52.100 ","End":"03:55.085","Text":"but they don\u0027t necessarily have the same eigenvectors."},{"Start":"03:55.085 ","End":"03:58.915","Text":"All I need is to produce 1 counter example."},{"Start":"03:58.915 ","End":"04:00.480","Text":"After playing around,"},{"Start":"04:00.480 ","End":"04:03.720","Text":"I found 1 in a 2 by 2 matrix, A,"},{"Start":"04:03.720 ","End":"04:07.935","Text":"as this 1, 1,1,1 with a 4 
here."},{"Start":"04:07.935 ","End":"04:12.400","Text":"Let\u0027s take a vector 1, 2."},{"Start":"04:12.410 ","End":"04:16.260","Text":"If you multiply A times v,"},{"Start":"04:16.260 ","End":"04:21.455","Text":"you get 3 times v. That means that"},{"Start":"04:21.455 ","End":"04:25.895","Text":"the vector v is an eigenvector of"},{"Start":"04:25.895 ","End":"04:31.120","Text":"A corresponding to eigenvalue 3 but it\u0027s less important."},{"Start":"04:31.120 ","End":"04:37.775","Text":"But if I take A transpose and multiply it by v,"},{"Start":"04:37.775 ","End":"04:40.780","Text":"I get 9, 3,"},{"Start":"04:40.780 ","End":"04:44.765","Text":"which no way is a multiple of 1, 2."},{"Start":"04:44.765 ","End":"04:46.940","Text":"There\u0027s nothing you can multiply 1,"},{"Start":"04:46.940 ","End":"04:49.285","Text":"2 by to get 9, 3."},{"Start":"04:49.285 ","End":"04:52.680","Text":"I mean, it would have to be 9 and it would have to be 1-and-a-half."},{"Start":"04:52.680 ","End":"04:57.170","Text":"It\u0027s a contradiction. We found a vector v,"},{"Start":"04:57.170 ","End":"05:01.160","Text":"which is an eigenvector of A but not of A transpose."},{"Start":"05:01.160 ","End":"05:05.950","Text":"That shows that d is false, and we\u0027re done."}],"ID":25742},{"Watched":false,"Name":"Exercise 8","Duration":"5m 2s","ChapterTopicVideoID":24830,"CourseChapterTopicPlaylistID":7320,"HasSubtitles":true,"ThumbnailPath":null,"UploadDate":null,"DurationForVideoObject":null,"Description":null,"MetaTitle":null,"MetaDescription":null,"Canonical":null,"VideoComments":[],"Subtitles":[{"Start":"00:00.000 ","End":"00:03.900","Text":"In this exercise, we\u0027re given 2 square matrices,"},{"Start":"00:03.900 ","End":"00:07.785","Text":"A and B of the same size, I should say."},{"Start":"00:07.785 ","End":"00:12.495","Text":"We have 2 parts, each of them true or false, prove or disprove."},{"Start":"00:12.495 ","End":"00:18.390","Text":"Part A says that AB and BA have the same eigenvalues."},{"Start":"00:18.390 ","End":"00:24.180","Text":"The second part says that if we have a non-zero eigenvector,"},{"Start":"00:24.180 ","End":"00:26.040","Text":"well all eigenvectors are non-zero,"},{"Start":"00:26.040 ","End":"00:30.600","Text":"but okay, an eigenvector of both A and B,"},{"Start":"00:30.600 ","End":"00:36.525","Text":"then it\u0027s also an eigenvector of 4A plus 10B."},{"Start":"00:36.525 ","End":"00:39.430","Text":"We\u0027ll start with part a."},{"Start":"00:39.430 ","End":"00:47.405","Text":"Turns out that this is true and we\u0027ll break it up into 2 cases, 0 and non-zero."},{"Start":"00:47.405 ","End":"00:52.940","Text":"The case where the eigenvalue is 0, separately,"},{"Start":"00:52.940 ","End":"00:55.880","Text":"if 0 is an eigenvalue of AB,"},{"Start":"00:55.880 ","End":"00:59.285","Text":"that means that AB is non-invertible."},{"Start":"00:59.285 ","End":"01:03.780","Text":"We know this is the condition for non-invertibility."},{"Start":"01:03.890 ","End":"01:09.380","Text":"At least 1 of A and B is non-invertible"},{"Start":"01:09.380 ","End":"01:11.390","Text":"because if you have the product of"},{"Start":"01:11.390 ","End":"01:14.960","Text":"2 invertible matrices and that will also be invertible."},{"Start":"01:14.960 ","End":"01:17.345","Text":"If 1 of these is non-invertible,"},{"Start":"01:17.345 ","End":"01:26.170","Text":"then BA is non-invertible and non-invertible means that 0 is eigenvalue of BA."},{"Start":"01:26.540 ","End":"01:30.170","Text":"We\u0027ve shown that if 0 
is an eigenvalue of AB,"},{"Start":"01:30.170 ","End":"01:33.110","Text":"then it\u0027s an eigenvalue of BA and by symmetry,"},{"Start":"01:33.110 ","End":"01:35.030","Text":"it\u0027s the other way around also."},{"Start":"01:35.030 ","End":"01:38.910","Text":"Now let\u0027s take non-zero case."},{"Start":"01:40.340 ","End":"01:44.090","Text":"If we have a non-zero eigenvalue,"},{"Start":"01:44.090 ","End":"01:51.110","Text":"then there is some v not 0 such that AB times v is"},{"Start":"01:51.110 ","End":"02:00.260","Text":"Lambda times v. We want to show that Lambda is also an eigenvalue of BA."},{"Start":"02:00.260 ","End":"02:05.840","Text":"I\u0027m not saying that the vector v will also be an eigenvector of BA,"},{"Start":"02:05.840 ","End":"02:11.450","Text":"but for Lambda the eigenvalue so we proceed as follows."},{"Start":"02:11.450 ","End":"02:14.240","Text":"Let\u0027s evaluate the following expression."},{"Start":"02:14.240 ","End":"02:18.950","Text":"Doesn\u0027t matter where I got it from, by reverse engineering."},{"Start":"02:18.950 ","End":"02:27.170","Text":"But BA times Bv is B times ABv, it\u0027s associativity."},{"Start":"02:27.440 ","End":"02:34.739","Text":"We already know that ABv is Lambda v. We can pull the Lambda in front."},{"Start":"02:34.739 ","End":"02:36.500","Text":"I meant to put it in front,"},{"Start":"02:36.500 ","End":"02:40.080","Text":"just indicate that it should be here."},{"Start":"02:40.210 ","End":"02:42.350","Text":"Notice what we have here."},{"Start":"02:42.350 ","End":"02:48.500","Text":"We have BA times this thing is equal to Lambda times this thing."},{"Start":"02:48.500 ","End":"02:53.585","Text":"What I\u0027ve highlighted, let\u0027s say we call this thing u."},{"Start":"02:53.585 ","End":"03:00.245","Text":"Then what we\u0027ve just shown is that BA times u is Lambda u."},{"Start":"03:00.245 ","End":"03:07.090","Text":"Now, what this means is that Lambda is an eigenvalue of BA,"},{"Start":"03:07.090 ","End":"03:11.225","Text":"with a different eigenvector this time u."},{"Start":"03:11.225 ","End":"03:16.415","Text":"To be rigorous, I really should say why u is non-zero."},{"Start":"03:16.415 ","End":"03:18.830","Text":"Suppose it is 0, u is just Bv."},{"Start":"03:18.830 ","End":"03:20.885","Text":"Then we\u0027d have Bv is 0."},{"Start":"03:20.885 ","End":"03:22.760","Text":"Then if we multiply by A,"},{"Start":"03:22.760 ","End":"03:25.235","Text":"we get ABv is 0."},{"Start":"03:25.235 ","End":"03:32.210","Text":"But that\u0027s a contradiction because ABv is Lambda v, that\u0027s not 0,"},{"Start":"03:32.210 ","End":"03:36.390","Text":"because v is not 0 and Lambda is not 0 so u"},{"Start":"03:36.390 ","End":"03:41.270","Text":"really is non-zero and so Lambda really is an eigenvalue of BA."},{"Start":"03:41.270 ","End":"03:45.170","Text":"Every eigenvalue of AB is an eigenvalue of BA and"},{"Start":"03:45.170 ","End":"03:50.155","Text":"the other way around is just by symmetry and so that completes part a."},{"Start":"03:50.155 ","End":"03:53.880","Text":"Just going back to see what was b."},{"Start":"03:53.880 ","End":"03:59.890","Text":"B was to show that if we have an eigenvector of A and of B,"},{"Start":"03:59.890 ","End":"04:04.370","Text":"then it\u0027s also an eigenvector of 4A plus 10B."},{"Start":"04:04.400 ","End":"04:08.155","Text":"From the given, it follows,"},{"Start":"04:08.155 ","End":"04:10.980","Text":"because it\u0027s an eigenvector of A."},{"Start":"04:10.980 ","End":"04:15.510","Text":"Then Av is some 
Lambda times v. Also"},{"Start":"04:15.510 ","End":"04:22.155","Text":"Bv is maybe another Lambda times v and v is not equal to 0."},{"Start":"04:22.155 ","End":"04:28.030","Text":"Now we want to compute 4A plus 10B times v. By linearity,"},{"Start":"04:28.030 ","End":"04:32.130","Text":"we can break it up into 4A times v plus 10B times"},{"Start":"04:32.130 ","End":"04:38.000","Text":"v. Then we apply these equalities here so this"},{"Start":"04:38.000 ","End":"04:44.430","Text":"becomes 4 Lambda 1 v plus 10 Lambda 2 v. Then we can"},{"Start":"04:44.430 ","End":"04:47.690","Text":"take v outside the brackets and we\u0027ve got 4 Lambda 1 plus"},{"Start":"04:47.690 ","End":"04:51.685","Text":"10 Lambda 2 and that\u0027s another Lambda."},{"Start":"04:51.685 ","End":"04:53.730","Text":"Just call this thing Lambda,"},{"Start":"04:53.730 ","End":"04:57.570","Text":"so we have the 4A plus 10B times v is Lambda v and v"},{"Start":"04:57.570 ","End":"05:02.440","Text":"is still non-zero and so we are done."}],"ID":25743},{"Watched":false,"Name":"Exercise 9","Duration":"6m 54s","ChapterTopicVideoID":24831,"CourseChapterTopicPlaylistID":7320,"HasSubtitles":true,"ThumbnailPath":null,"UploadDate":null,"DurationForVideoObject":null,"Description":null,"MetaTitle":null,"MetaDescription":null,"Canonical":null,"VideoComments":[],"Subtitles":[{"Start":"00:00.080 ","End":"00:03.150","Text":"Let me paraphrase this exercise."},{"Start":"00:03.150 ","End":"00:08.760","Text":"We have some v which is an eigenvector of"},{"Start":"00:08.760 ","End":"00:14.550","Text":"dimension n and we consider those n by n matrices,"},{"Start":"00:14.550 ","End":"00:16.815","Text":"i.e the subset of M,"},{"Start":"00:16.815 ","End":"00:18.675","Text":"n by n,"},{"Start":"00:18.675 ","End":"00:23.940","Text":"as those matrices for which v is an eigenvector."},{"Start":"00:23.940 ","End":"00:28.080","Text":"W is all the matrices n by n,"},{"Start":"00:28.080 ","End":"00:34.575","Text":"for which v is an eigenvector and we have to prove, first of all,"},{"Start":"00:34.575 ","End":"00:40.035","Text":"that this subset is actually a vector subspace"},{"Start":"00:40.035 ","End":"00:49.940","Text":"and in the particular case where n equals 2 and our vector v is this 4,0,"},{"Start":"00:49.940 ","End":"00:59.070","Text":"we have to find a basis for this subspace W and once we found the basis,"},{"Start":"00:59.070 ","End":"01:02.410","Text":"so say what the dimension of W is."},{"Start":"01:03.500 ","End":"01:10.430","Text":"We can write W in a different form or explicit."},{"Start":"01:10.430 ","End":"01:17.210","Text":"W is the set of all n by n matrices A and to say that"},{"Start":"01:17.210 ","End":"01:24.760","Text":"v is an eigenvector of A is to say that Av is Lambda v for some Lambda."},{"Start":"01:24.760 ","End":"01:30.290","Text":"Different As might have different lambdas and we have to show that W"},{"Start":"01:30.290 ","End":"01:36.470","Text":"satisfies the usual 3 subspace axioms and these are,"},{"Start":"01:36.470 ","End":"01:37.910","Text":"I just wrote them briefly,"},{"Start":"01:37.910 ","End":"01:42.375","Text":"W has to contain the 0,"},{"Start":"01:42.375 ","End":"01:45.020","Text":"in this case it would be the 0 matrix,"},{"Start":"01:45.020 ","End":"01:46.760","Text":"the 0, 2 by 2 matrix."},{"Start":"01:46.760 ","End":"01:55.920","Text":"It has to be closed under addition and it has to be closed under scalar multiplication."},{"Start":"01:56.470 ","End":"01:58.580","Text":"For part 1, like I 
said,"},{"Start":"01:58.580 ","End":"02:00.500","Text":"0 means the 0 matrix,"},{"Start":"02:00.500 ","End":"02:04.405","Text":"I wrote it as 0 n by n. Now,"},{"Start":"02:04.405 ","End":"02:08.315","Text":"the 0 matrix times this particular vector,"},{"Start":"02:08.315 ","End":"02:10.825","Text":"and I emphasized it with arrow over it,"},{"Start":"02:10.825 ","End":"02:15.515","Text":"it\u0027s going to be the 0 vector always and that\u0027s the same"},{"Start":"02:15.515 ","End":"02:20.270","Text":"as the scalar 0 times vector v. 0 will be"},{"Start":"02:20.270 ","End":"02:26.060","Text":"our Lambda and that means that really v"},{"Start":"02:26.060 ","End":"02:32.885","Text":"is an eigenvector corresponding to eigenvalue Lambda equals 0."},{"Start":"02:32.885 ","End":"02:40.130","Text":"It is in W and just to be explicit what the 0 matrix is,"},{"Start":"02:40.130 ","End":"02:43.025","Text":"it\u0027s just 0s everywhere,"},{"Start":"02:43.025 ","End":"02:49.040","Text":"n by n. Now we need to do the closure under addition."},{"Start":"02:49.040 ","End":"02:52.235","Text":"We take 2 matrices,"},{"Start":"02:52.235 ","End":"02:55.145","Text":"or if you like you can think of them as vectors,"},{"Start":"02:55.145 ","End":"03:03.890","Text":"in W and each of them has to satisfy that v is an eigenvector,"},{"Start":"03:03.890 ","End":"03:06.830","Text":"so Av is Lambda v,"},{"Start":"03:06.830 ","End":"03:09.650","Text":"but there could be different lambdas so I called them,"},{"Start":"03:09.650 ","End":"03:15.110","Text":"call this 1 Lambda 1 v and Bv is maybe some other Lambda times"},{"Start":"03:15.110 ","End":"03:23.420","Text":"v. Let\u0027s see what happens when we compute A plus B times v. First of all,"},{"Start":"03:23.420 ","End":"03:29.235","Text":"by linearity, it\u0027s Av plus Bv."},{"Start":"03:29.235 ","End":"03:35.450","Text":"Now Av is Lambda 1 v and Bv is Lambda 2 v and I can take v outside"},{"Start":"03:35.450 ","End":"03:39.470","Text":"the brackets and get Lambda 1 plus Lambda"},{"Start":"03:39.470 ","End":"03:44.225","Text":"2 times v and this Lambda 1 plus Lambda 2 is like another Lambda."},{"Start":"03:44.225 ","End":"03:48.110","Text":"We\u0027ve got A plus B times v is Lambda v. So v is"},{"Start":"03:48.110 ","End":"03:52.230","Text":"an eigenvector of A plus B and that means that A plus"},{"Start":"03:52.230 ","End":"04:00.845","Text":"B is also in W. That does the closure under addition."},{"Start":"04:00.845 ","End":"04:06.820","Text":"Now we go on to the closure under scalar multiplication."},{"Start":"04:06.820 ","End":"04:13.740","Text":"I\u0027m assuming that A is in W. KA times v is,"},{"Start":"04:13.740 ","End":"04:19.620","Text":"I can regroup it and call it K times Av and if v is in W,"},{"Start":"04:19.620 ","End":"04:24.965","Text":"the Av is Lambda v and then I can just put the k with the Lambda,"},{"Start":"04:24.965 ","End":"04:28.910","Text":"the 2 scalars together and this is a new Lambda,"},{"Start":"04:28.910 ","End":"04:35.210","Text":"call it Lambda tilde times v. KA times v is some Lambda times"},{"Start":"04:35.210 ","End":"04:41.690","Text":"v. That shows that kA is also in W. I guess I should have started off by saying,"},{"Start":"04:41.690 ","End":"04:50.100","Text":"assuming A belongs to W. 
If A belongs to W,"},{"Start":"04:50.100 ","End":"04:53.950","Text":"then so does multiplication by a scalar."},{"Start":"04:54.620 ","End":"05:02.150","Text":"That was the conclusion of part a and now we\u0027re moving on to part b and in part b,"},{"Start":"05:02.150 ","End":"05:08.900","Text":"remember we were given the vector v is 4,0 and we were dealing with 2 by 2 matrices."},{"Start":"05:08.900 ","End":"05:11.960","Text":"W is all the matrices a, b, c,"},{"Start":"05:11.960 ","End":"05:17.660","Text":"d lets called them such that 4,0 is an eigenvector,"},{"Start":"05:17.660 ","End":"05:18.680","Text":"which means that a, b, c,"},{"Start":"05:18.680 ","End":"05:23.525","Text":"d times 4,0 is Lambda times 4,0 for some Lambda."},{"Start":"05:23.525 ","End":"05:26.905","Text":"Different matrix could give us a different Lambda."},{"Start":"05:26.905 ","End":"05:33.150","Text":"Now if we perform the multiplication this times this gives us 4a,"},{"Start":"05:33.150 ","End":"05:39.810","Text":"4c and Lambda times 4,0 is 4 Lambda times 0."},{"Start":"05:39.810 ","End":"05:41.360","Text":"If these 2 vectors are equal,"},{"Start":"05:41.360 ","End":"05:43.615","Text":"they\u0027re equal component-wise,"},{"Start":"05:43.615 ","End":"05:45.560","Text":"4a equals 4 Lambda,"},{"Start":"05:45.560 ","End":"05:49.975","Text":"meaning a equals Lambda and c equals 0."},{"Start":"05:49.975 ","End":"05:55.595","Text":"Now a equals Lambda is not really any condition because Lambda could be anything."},{"Start":"05:55.595 ","End":"06:04.099","Text":"Really what we can conclude is that W is defined by c equals 0."},{"Start":"06:04.099 ","End":"06:12.600","Text":"There I wrote it. Now, c equals 0 means that a,"},{"Start":"06:12.600 ","End":"06:13.905","Text":"b, and d could be anything,"},{"Start":"06:13.905 ","End":"06:16.950","Text":"they\u0027re free variables, and c has to be 0."},{"Start":"06:16.950 ","End":"06:19.620","Text":"It\u0027s a, b,0, d for any a, b,"},{"Start":"06:19.620 ","End":"06:24.045","Text":"d. There are 3 free variables"},{"Start":"06:24.045 ","End":"06:29.415","Text":"and each time if 1 of them equal 1 and the other 0, I get a basis."},{"Start":"06:29.415 ","End":"06:33.510","Text":"A possible basis for W are these 3,"},{"Start":"06:33.510 ","End":"06:35.790","Text":"notice that in each case the c is still 0,"},{"Start":"06:35.790 ","End":"06:37.350","Text":"I just letting a equals 1,"},{"Start":"06:37.350 ","End":"06:38.520","Text":"then I let b equals 1,"},{"Start":"06:38.520 ","End":"06:40.140","Text":"and then I let d equals 1."},{"Start":"06:40.140 ","End":"06:43.785","Text":"If I count 1, 2, 3,"},{"Start":"06:43.785 ","End":"06:48.695","Text":"that gives you the dimension of W is 3,"},{"Start":"06:48.695 ","End":"06:54.690","Text":"because the dimension is the number of elements in a basis. 
We\u0027re done."}],"ID":25744},{"Watched":false,"Name":"Exercise 10","Duration":"8m 1s","ChapterTopicVideoID":24832,"CourseChapterTopicPlaylistID":7320,"HasSubtitles":true,"ThumbnailPath":null,"UploadDate":null,"DurationForVideoObject":null,"Description":null,"MetaTitle":null,"MetaDescription":null,"Canonical":null,"VideoComments":[],"Subtitles":[{"Start":"00:00.000 ","End":"00:02.550","Text":"In this exercise is 2 parts."},{"Start":"00:02.550 ","End":"00:07.290","Text":"In part a, we suppose that a square matrix A,"},{"Start":"00:07.290 ","End":"00:08.310","Text":"if I don\u0027t say anything,"},{"Start":"00:08.310 ","End":"00:10.170","Text":"you can assume it\u0027s over the reals,"},{"Start":"00:10.170 ","End":"00:14.535","Text":"it has an eigenvector v for eigenvalue 4."},{"Start":"00:14.535 ","End":"00:20.400","Text":"Then we define another square matrix B to be equal to this expression."},{"Start":"00:20.400 ","End":"00:22.350","Text":"It\u0027s a polynomial in A,"},{"Start":"00:22.350 ","End":"00:24.420","Text":"it will also be a square matrix."},{"Start":"00:24.420 ","End":"00:26.415","Text":"We have to prove that v,"},{"Start":"00:26.415 ","End":"00:28.530","Text":"which is an eigenvector for A,"},{"Start":"00:28.530 ","End":"00:33.525","Text":"is also an eigenvector of B and find the corresponding eigenvalue."},{"Start":"00:33.525 ","End":"00:37.320","Text":"Part b is somewhat of a generalization"},{"Start":"00:37.320 ","End":"00:41.140","Text":"of A and we\u0027ll read it when we come to it or you can pause and look at it."},{"Start":"00:41.140 ","End":"00:43.785","Text":"I want to start with A already."},{"Start":"00:43.785 ","End":"00:46.745","Text":"Just to interpret what it says here,"},{"Start":"00:46.745 ","End":"00:50.735","Text":"it means that there is some vector which is non-zero,"},{"Start":"00:50.735 ","End":"00:59.180","Text":"such that Av is 4 times v. Now we want to show that v is an eigenvector of B."},{"Start":"00:59.180 ","End":"01:03.110","Text":"Let\u0027s see what happens when we apply B to v. Well,"},{"Start":"01:03.110 ","End":"01:06.605","Text":"B is this expression."},{"Start":"01:06.605 ","End":"01:14.060","Text":"We apply that to v. Then there\u0027s a distributive law or it\u0027s a linearity."},{"Start":"01:14.060 ","End":"01:23.370","Text":"We can just apply each piece to v. 
We get A^4v minus 2A squared v plus 10"},{"Start":"01:23.370 ","End":"01:26.760","Text":"Av minus 4 and Iv is just v."},{"Start":"01:26.760 ","End":"01:33.200","Text":"Now we want to use this property that Av is 4v so we can rewrite this a bit."},{"Start":"01:33.200 ","End":"01:36.090","Text":"This is A cube times Av."},{"Start":"01:36.090 ","End":"01:39.170","Text":"When we have Av, we can use it here."},{"Start":"01:39.170 ","End":"01:42.560","Text":"Second bit is 2A times Av."},{"Start":"01:42.560 ","End":"01:44.930","Text":"Then here we already have Av,"},{"Start":"01:44.930 ","End":"01:48.680","Text":"just bracket it and minus 4v."},{"Start":"01:48.680 ","End":"01:51.040","Text":"Now Av is 4v,"},{"Start":"01:51.040 ","End":"01:52.980","Text":"and again here,"},{"Start":"01:52.980 ","End":"01:55.540","Text":"and again here."},{"Start":"01:56.240 ","End":"01:58.380","Text":"More rearranging."},{"Start":"01:58.380 ","End":"02:02.670","Text":"We can do this trick again with Av equals 4v."},{"Start":"02:02.670 ","End":"02:07.020","Text":"We\u0027ll bring the 4 in front and push 1 of the As into the bracket,"},{"Start":"02:07.020 ","End":"02:08.600","Text":"so we get this."},{"Start":"02:08.600 ","End":"02:14.445","Text":"Likewise here, bring the 4 in front that makes it 8 and push A inside."},{"Start":"02:14.445 ","End":"02:17.030","Text":"Here, just evaluate it,"},{"Start":"02:17.030 ","End":"02:21.240","Text":"10 times 4 is 40 minus 4 is 36. We have 36v."},{"Start":"02:21.340 ","End":"02:29.548","Text":"Again, replace Av by 4v in 2 places."},{"Start":"02:29.548 ","End":"02:31.365","Text":"Again, bring the 4 in front, it\u0027s 16,"},{"Start":"02:31.365 ","End":"02:33.240","Text":"push 1 of the As inside,"},{"Start":"02:33.240 ","End":"02:34.620","Text":"so we get this."},{"Start":"02:34.620 ","End":"02:40.875","Text":"Here it\u0027s minus 32 plus 36, which is 4v."},{"Start":"02:40.875 ","End":"02:43.350","Text":"Then Av equals 4v,"},{"Start":"02:43.350 ","End":"02:48.420","Text":"again bring the 4 in front, it\u0027s 64 Av."},{"Start":"02:48.420 ","End":"02:52.125","Text":"Another time, last time Av is"},{"Start":"02:52.125 ","End":"02:58.980","Text":"4v and 64 times 4 plus 4 gives us 260,"},{"Start":"02:58.980 ","End":"03:01.755","Text":"this stands for 256 plus 4."},{"Start":"03:01.755 ","End":"03:08.930","Text":"What we have is that v is an eigenvector of B corresponding to eigenvalue 260."},{"Start":"03:08.930 ","End":"03:12.950","Text":"I\u0027d like to show you an alternative way to do this computation."},{"Start":"03:12.950 ","End":"03:16.470","Text":"We have that Av is 4v."},{"Start":"03:16.470 ","End":"03:20.975","Text":"You might have seen in other exercises that whenever we have an eigenvalue,"},{"Start":"03:20.975 ","End":"03:23.240","Text":"if we take the matrix to the power of n,"},{"Start":"03:23.240 ","End":"03:26.585","Text":"we can also take the eigenvalues to the power of n,"},{"Start":"03:26.585 ","End":"03:31.970","Text":"A^n v is 4^n v. 
But even if you didn\u0027t remember that,"},{"Start":"03:31.970 ","End":"03:34.175","Text":"we can just do it directly."},{"Start":"03:34.175 ","End":"03:36.800","Text":"A squared v is A times A times v,"},{"Start":"03:36.800 ","End":"03:38.555","Text":"which is 8 times 4v,"},{"Start":"03:38.555 ","End":"03:41.905","Text":"just 4Av, 4 times 4, 4 squared."},{"Start":"03:41.905 ","End":"03:44.535","Text":"Then if you have A cubed, again,"},{"Start":"03:44.535 ","End":"03:49.050","Text":"multiply by A and it comes out 4 cubed."},{"Start":"03:49.050 ","End":"03:51.990","Text":"Similarly for A^4 v, you follow this,"},{"Start":"03:51.990 ","End":"03:59.940","Text":"you get 4^4 v. Bv is equal to this expression."},{"Start":"03:59.940 ","End":"04:04.940","Text":"We can use this thing with the powers to rewrite this."},{"Start":"04:04.940 ","End":"04:06.630","Text":"This is A^4 v,"},{"Start":"04:06.630 ","End":"04:11.760","Text":"is 4^4 v. A squared v is 4 squared v. Av is 4v,"},{"Start":"04:11.760 ","End":"04:13.725","Text":"and this as is."},{"Start":"04:13.725 ","End":"04:16.760","Text":"We already have it collected in terms of v. We have"},{"Start":"04:16.760 ","End":"04:21.050","Text":"4^4 minus 2 times 4 squared minus 10 times 4 minus 4."},{"Start":"04:21.050 ","End":"04:25.635","Text":"If you do the computation, you get 260."},{"Start":"04:25.635 ","End":"04:30.755","Text":"Again, v is an eigenvector of B corresponding to eigenvalue 260."},{"Start":"04:30.755 ","End":"04:35.890","Text":"That\u0027s part a, but I want to continue a bit so you can see the motivations for part b."},{"Start":"04:35.890 ","End":"04:40.715","Text":"Now this 260, this was the computation."},{"Start":"04:40.715 ","End":"04:45.410","Text":"But this is exactly what you would get if you defined a polynomial,"},{"Start":"04:45.410 ","End":"04:49.010","Text":"x^4 minus 2x squared plus 10x minus 4."},{"Start":"04:49.010 ","End":"04:52.670","Text":"If you put 4 instead of x,"},{"Start":"04:52.670 ","End":"04:55.280","Text":"then that\u0027s p of 4."},{"Start":"04:55.280 ","End":"04:57.380","Text":"P of 4 is 260."},{"Start":"04:57.380 ","End":"05:00.065","Text":"B is p of A."},{"Start":"05:00.065 ","End":"05:06.290","Text":"We see that v is an eigenvector of p of A corresponding to eigenvalue p of 4."},{"Start":"05:06.290 ","End":"05:09.455","Text":"Part b is going to generalize this."},{"Start":"05:09.455 ","End":"05:11.325","Text":"Now here\u0027s part b."},{"Start":"05:11.325 ","End":"05:13.460","Text":"It\u0027s similar to Part a,"},{"Start":"05:13.460 ","End":"05:16.310","Text":"except that instead of the eigenvalue 4,"},{"Start":"05:16.310 ","End":"05:19.555","Text":"we have a general eigenvalue Lambda."},{"Start":"05:19.555 ","End":"05:22.790","Text":"Here A^4 and so on,"},{"Start":"05:22.790 ","End":"05:29.030","Text":"we have B equals some polynomial in A. 
P of x is a polynomial."},{"Start":"05:29.030 ","End":"05:36.815","Text":"Once again, we\u0027re going to show that v is an eigenvector of B as well as A."},{"Start":"05:36.815 ","End":"05:40.010","Text":"We have to find the corresponding eigenvalue."},{"Start":"05:40.010 ","End":"05:42.680","Text":"Here it came out to be p of 4,"},{"Start":"05:42.680 ","End":"05:46.950","Text":"here it will come out to be p of Lambda as we\u0027ll see."},{"Start":"05:47.630 ","End":"05:52.370","Text":"I just wrote what I said that we\u0027re going to expect to get p of"},{"Start":"05:52.370 ","End":"05:59.060","Text":"Lambda as the eigenvalue of v as an eigenvector of B."},{"Start":"05:59.060 ","End":"06:02.030","Text":"Now, I mentioned before"},{"Start":"06:02.030 ","End":"06:09.550","Text":"that A^n v is Lambda^n v only I mentioned it with 4 instead of Lambda."},{"Start":"06:09.550 ","End":"06:11.240","Text":"Let\u0027s just prove this here."},{"Start":"06:11.240 ","End":"06:12.650","Text":"You may have proved it before,"},{"Start":"06:12.650 ","End":"06:16.490","Text":"but we\u0027ll do it again quickly and if you\u0027ve seen it,"},{"Start":"06:16.490 ","End":"06:17.960","Text":"you can skip this bit."},{"Start":"06:17.960 ","End":"06:21.160","Text":"This is certainly true for n equals 1."},{"Start":"06:21.160 ","End":"06:26.180","Text":"We just need the induction step that if A^n v is Lambda^n v,"},{"Start":"06:26.180 ","End":"06:35.775","Text":"does it then follow that A^n plus 1 v equals Lambda^n plus 1 v. Here\u0027s the computation."},{"Start":"06:35.775 ","End":"06:39.285","Text":"We breakup, the A^n plus 1 as A^n times A."},{"Start":"06:39.285 ","End":"06:40.990","Text":"Now Av is Lambda v,"},{"Start":"06:40.990 ","End":"06:42.800","Text":"bring the Lambda in front,"},{"Start":"06:42.800 ","End":"06:46.580","Text":"then use the induction hypothesis that A^n v is"},{"Start":"06:46.580 ","End":"06:51.455","Text":"Lambda^n v. Then combine and we get Lambda^n plus 1."},{"Start":"06:51.455 ","End":"06:55.145","Text":"That\u0027s the proof of this claim."},{"Start":"06:55.145 ","End":"06:57.130","Text":"Now back to the exercise."},{"Start":"06:57.130 ","End":"07:03.500","Text":"Suppose that p of x is written out from lowest power to the highest,"},{"Start":"07:03.500 ","End":"07:04.700","Text":"and the coefficients a_naught,"},{"Start":"07:04.700 ","End":"07:06.620","Text":"a_1 up to a_n,"},{"Start":"07:06.620 ","End":"07:09.835","Text":"and B is p of A."},{"Start":"07:09.835 ","End":"07:14.430","Text":"We\u0027re about to show that Bv is p of Lambda v."},{"Start":"07:14.430 ","End":"07:21.495","Text":"Bv, replace B by this expression in a."},{"Start":"07:21.495 ","End":"07:23.810","Text":"Then just like before,"},{"Start":"07:23.810 ","End":"07:27.230","Text":"we can buy this distributive law."},{"Start":"07:27.230 ","End":"07:31.380","Text":"Write v in each part here."},{"Start":"07:31.380 ","End":"07:35.460","Text":"We already showed that Av is Lambda v,"},{"Start":"07:35.460 ","End":"07:37.980","Text":"A squared v is Lambda squared v. In general,"},{"Start":"07:37.980 ","End":"07:45.650","Text":"A^n v is Lambda^n v. Then we can take v outside the brackets and we get this."},{"Start":"07:45.650 ","End":"07:48.500","Text":"But what is this? This is exactly p of x,"},{"Start":"07:48.500 ","End":"07:50.120","Text":"but with Lambda instead of x."},{"Start":"07:50.120 ","End":"07:57.865","Text":"So it\u0027s p of Lambda times v. Bv is p of Lambda v. 
That\u0027s what we had to show."},{"Start":"07:57.865 ","End":"08:01.360","Text":"We\u0027re done with part b."}],"ID":25745},{"Watched":false,"Name":"Exercise 11","Duration":"5m 5s","ChapterTopicVideoID":24833,"CourseChapterTopicPlaylistID":7320,"HasSubtitles":true,"ThumbnailPath":null,"UploadDate":null,"DurationForVideoObject":null,"Description":null,"MetaTitle":null,"MetaDescription":null,"Canonical":null,"VideoComments":[],"Subtitles":[{"Start":"00:00.000 ","End":"00:02.200","Text":"In this exercise, there are 3 parts,"},{"Start":"00:02.200 ","End":"00:04.720","Text":"they all concern a matrix A,"},{"Start":"00:04.720 ","End":"00:07.360","Text":"which is a 2 by 2 real matrix,"},{"Start":"00:07.360 ","End":"00:10.855","Text":"but it has a parameter a here."},{"Start":"00:10.855 ","End":"00:16.000","Text":"First part, we\u0027re given that a is 3 and we have to"},{"Start":"00:16.000 ","End":"00:21.085","Text":"find an example of a vector which is not an eigenvector of A."},{"Start":"00:21.085 ","End":"00:23.035","Text":"Read the other 2 as we come to them."},{"Start":"00:23.035 ","End":"00:26.780","Text":"Let start already with part a, when a is 3,"},{"Start":"00:26.780 ","End":"00:33.390","Text":"just put 3 instead of the a and we want to find a non-eigenvector."},{"Start":"00:33.390 ","End":"00:36.700","Text":"Method 1 would be to first find"},{"Start":"00:36.700 ","End":"00:42.850","Text":"the eigenvalues and then find the eigenspace for each eigenvalue."},{"Start":"00:42.850 ","End":"00:46.525","Text":"Then we want something that\u0027s not in any of the eigenspaces."},{"Start":"00:46.525 ","End":"00:51.790","Text":"Just find a vector that\u0027s not in the union of these 2,"},{"Start":"00:51.790 ","End":"00:59.410","Text":"but it\u0027s quite a bit of work relatively to find the eigenvalues and the eigenspaces."},{"Start":"00:59.410 ","End":"01:02.320","Text":"Let\u0027s go for a simpler method."},{"Start":"01:02.320 ","End":"01:04.900","Text":"Even if it\u0027s less systematic."},{"Start":"01:04.900 ","End":"01:06.700","Text":"Trial and error."},{"Start":"01:06.700 ","End":"01:13.705","Text":"Just choose a random vector and see if it\u0027s an eigenvector or not."},{"Start":"01:13.705 ","End":"01:19.880","Text":"If it is, then we just try another 1 until we get 1 that\u0027s not an eigenvector."},{"Start":"01:19.880 ","End":"01:24.165","Text":"Let\u0027s go with method 2 and choose, for example,"},{"Start":"01:24.165 ","End":"01:30.700","Text":"1,1 a times the vector comes out to be 4,"},{"Start":"01:30.700 ","End":"01:34.900","Text":"5, certainly not some Lambda times 1,"},{"Start":"01:34.900 ","End":"01:38.200","Text":"1 because the 4 and the 5 are not equal."},{"Start":"01:38.200 ","End":"01:41.060","Text":"We got lucky first time."},{"Start":"01:41.060 ","End":"01:43.665","Text":"On to part b,"},{"Start":"01:43.665 ","End":"01:45.745","Text":"just summarized it for you again,"},{"Start":"01:45.745 ","End":"01:48.610","Text":"we\u0027re given this matrix with the parameter a."},{"Start":"01:48.610 ","End":"01:52.690","Text":"We want to find the parameter a such that 1,"},{"Start":"01:52.690 ","End":"01:55.270","Text":"2 is an eigenvector of it."},{"Start":"01:55.270 ","End":"02:00.090","Text":"We have that A times 1,2 is some Lambda times 1,"},{"Start":"02:00.090 ","End":"02:04.345","Text":"2, and a is 1 little a 4, 1."},{"Start":"02:04.345 ","End":"02:08.190","Text":"We have this equation to vector equation."},{"Start":"02:08.190 ","End":"02:11.500","Text":"First, do the 
multiplication to get this."},{"Start":"02:11.500 ","End":"02:15.995","Text":"Here we get this and then we can compare this to this and this to this."},{"Start":"02:15.995 ","End":"02:19.415","Text":"We get that 1 plus 2a equals Lambda,"},{"Start":"02:19.415 ","End":"02:22.040","Text":"6 equals 2 Lambda."},{"Start":"02:22.040 ","End":"02:23.570","Text":"From the second equation,"},{"Start":"02:23.570 ","End":"02:25.595","Text":"we get that Lambda is 3,"},{"Start":"02:25.595 ","End":"02:27.755","Text":"put Lambda equals 3 here,"},{"Start":"02:27.755 ","End":"02:30.380","Text":"and we get 1 plus 2a equals 3."},{"Start":"02:30.380 ","End":"02:33.785","Text":"Solve this, we\u0027ve got that a equals 1."},{"Start":"02:33.785 ","End":"02:36.505","Text":"That answers part b."},{"Start":"02:36.505 ","End":"02:40.050","Text":"Part c is not related to parts a and b."},{"Start":"02:40.050 ","End":"02:47.810","Text":"This matrix A is 2 by 2 but it\u0027s not the same 1 as we had in the first 2 parts."},{"Start":"02:47.810 ","End":"02:54.170","Text":"We have a nonzero matrix A and a nonzero vector v in R^2,"},{"Start":"02:54.170 ","End":"02:56.945","Text":"which is not an eigenvector."},{"Start":"02:56.945 ","End":"03:01.580","Text":"We have to prove that the pair of vectors v,"},{"Start":"03:01.580 ","End":"03:05.990","Text":"Av forms a basis for R^2."},{"Start":"03:05.990 ","End":"03:10.100","Text":"In general, in R^n,"},{"Start":"03:10.100 ","End":"03:13.595","Text":"any n linearly independent vectors are a basis."},{"Start":"03:13.595 ","End":"03:15.770","Text":"In particular, when n equals 2,"},{"Start":"03:15.770 ","End":"03:20.030","Text":"if we have 2 linearly independent vectors, then there are basis."},{"Start":"03:20.030 ","End":"03:25.600","Text":"What we have to do is show that this pair are linearly independent."},{"Start":"03:25.600 ","End":"03:30.835","Text":"Suppose linear combination is 0 with Alpha and Beta."},{"Start":"03:30.835 ","End":"03:36.290","Text":"We have to show that the only way this can happen is if Alpha and Beta are both 0,"},{"Start":"03:36.290 ","End":"03:40.985","Text":"we\u0027ll do a proof by contradiction that this is not the case."},{"Start":"03:40.985 ","End":"03:43.765","Text":"There are 3 alternatives,"},{"Start":"03:43.765 ","End":"03:49.380","Text":"could be that Alpha is not equal to 0 and Beta equals 0."},{"Start":"03:49.380 ","End":"03:54.430","Text":"I\u0027ll show you already and then we can have the Alpha 0 and Beta is not 0,"},{"Start":"03:54.430 ","End":"03:57.340","Text":"or we can have both of them are not 0."},{"Start":"03:57.340 ","End":"03:59.685","Text":"In the first case,"},{"Start":"03:59.685 ","End":"04:03.180","Text":"if Alpha is not 0 and Beta is 0,"},{"Start":"04:03.180 ","End":"04:06.750","Text":"we have that Alpha v is 0 but Alpha is not 0,"},{"Start":"04:06.750 ","End":"04:07.950","Text":"so v is 0,"},{"Start":"04:07.950 ","End":"04:09.765","Text":"and that\u0027s a contradiction."},{"Start":"04:09.765 ","End":"04:14.410","Text":"If we have Alpha 0 and Beta not 0,"},{"Start":"04:15.530 ","End":"04:20.445","Text":"this part disappears Beta is not 0 and Beta times something is 0."},{"Start":"04:20.445 ","End":"04:21.810","Text":"This has to be 0."},{"Start":"04:21.810 ","End":"04:23.250","Text":"Av is 0."},{"Start":"04:23.250 ","End":"04:29.355","Text":"So v is an eigenvector of A with eigenvalue 0."},{"Start":"04:29.355 ","End":"04:33.730","Text":"That\u0027s also a contradiction because our v is not an 
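The two computations in parts (a) and (b) are easy to replay numerically. The sketch below uses the entries as read from the clip (the matrix is [[1, a], [4, 1]]); the helper is a 2x2-only parallelism test I added for illustration.

```python
import numpy as np

def is_eigenvector(A, v, tol=1e-12):
    """For 2D vectors only: A v is a scalar multiple of v iff det([v, Av]) = 0."""
    w = A @ v
    return abs(v[0] * w[1] - v[1] * w[0]) < tol

# Part (a): with a = 3 the matrix is [[1, 3], [4, 1]]
A = np.array([[1.0, 3.0],
              [4.0, 1.0]])
v = np.array([1.0, 1.0])
print(A @ v, is_eigenvector(A, v))    # [4. 5.] False -> (1, 1) is not an eigenvector

# Part (b): A (1, 2) = (1 + 2a, 6) = lam (1, 2) forces lam = 3 and a = 1
A1 = np.array([[1.0, 1.0],
               [4.0, 1.0]])
print(A1 @ np.array([1.0, 2.0]))      # [3. 6.] = 3 * (1, 2)
```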
eigenvector."},{"Start":"04:33.730 ","End":"04:37.175","Text":"The third case is where they\u0027re both nonzero,"},{"Start":"04:37.175 ","End":"04:39.200","Text":"then we can extract,"},{"Start":"04:39.200 ","End":"04:40.835","Text":"this should be Av,"},{"Start":"04:40.835 ","End":"04:46.405","Text":"we can get that Av is minus Alpha v over Beta,"},{"Start":"04:46.405 ","End":"04:48.650","Text":"and that means that v is"},{"Start":"04:48.650 ","End":"04:53.195","Text":"an eigenvector because this minus Alpha over Beta is like Lambda."},{"Start":"04:53.195 ","End":"04:55.040","Text":"That\u0027s also a contradiction."},{"Start":"04:55.040 ","End":"04:57.920","Text":"All the 3 alternatives give us a contradiction."},{"Start":"04:57.920 ","End":"05:01.740","Text":"It must be that Alpha equals Beta equals 0."},{"Start":"05:01.740 ","End":"05:05.950","Text":"That completes the proof. We\u0027re done."}],"ID":25746},{"Watched":false,"Name":"Exercise 12","Duration":"2m 1s","ChapterTopicVideoID":24834,"CourseChapterTopicPlaylistID":7320,"HasSubtitles":true,"ThumbnailPath":null,"UploadDate":null,"DurationForVideoObject":null,"Description":null,"MetaTitle":null,"MetaDescription":null,"Canonical":null,"VideoComments":[],"Subtitles":[{"Start":"00:00.000 ","End":"00:08.730","Text":"In this exercise, A and B are both n by n matrices such that AB equals BA,"},{"Start":"00:08.730 ","End":"00:13.320","Text":"because in general, matrix multiplication is not commutative, but here it is."},{"Start":"00:13.320 ","End":"00:16.905","Text":"We\u0027re given that the rank of A is n minus 1."},{"Start":"00:16.905 ","End":"00:25.080","Text":"Now suppose that v is an eigenvector of A corresponding to eigenvalue 0."},{"Start":"00:25.080 ","End":"00:30.030","Text":"We have to prove that v is also an eigenvector of B,"},{"Start":"00:30.030 ","End":"00:33.345","Text":"but maybe not for eigenvalue 0."},{"Start":"00:33.345 ","End":"00:36.240","Text":"We\u0027ll use the rank-nullity theorem."},{"Start":"00:36.240 ","End":"00:40.940","Text":"What it says in our case is that the rank of A plus the nullity,"},{"Start":"00:40.940 ","End":"00:43.655","Text":"which is the dimension of the kernel of A,"},{"Start":"00:43.655 ","End":"00:49.010","Text":"is equal to n. We get that the dimension of the kernel is 1,"},{"Start":"00:49.010 ","End":"00:51.110","Text":"then minus 1 plus something equals n,"},{"Start":"00:51.110 ","End":"00:53.600","Text":"and that something has to be equal to 1."},{"Start":"00:53.600 ","End":"00:56.860","Text":"Now, if we compute Av,"},{"Start":"00:56.860 ","End":"01:02.855","Text":"Av is 0v because it\u0027s an eigenvector for eigenvalue 0."},{"Start":"01:02.855 ","End":"01:05.690","Text":"This means that v is in the kernel of A."},{"Start":"01:05.690 ","End":"01:09.980","Text":"In general, the kernel is the same as the 0 eigenspace."},{"Start":"01:09.980 ","End":"01:11.660","Text":"We could also say that."},{"Start":"01:11.660 ","End":"01:14.720","Text":"Now, v is not equal to 0,"},{"Start":"01:14.720 ","End":"01:21.395","Text":"so that means that v is a basis for the kernel of A,"},{"Start":"01:21.395 ","End":"01:24.380","Text":"because we only need 1 vector if its dimension 1."},{"Start":"01:24.380 ","End":"01:29.550","Text":"Now, Av equals 0 implies that BAv is 0,"},{"Start":"01:29.550 ","End":"01:34.170","Text":"just multiply both sides on the left by B. 
BA is equal to AB,"},{"Start":"01:34.170 ","End":"01:36.540","Text":"so we get ABv is 0."},{"Start":"01:36.540 ","End":"01:40.255","Text":"Which means that Bv is in the kernel of A."},{"Start":"01:40.255 ","End":"01:48.185","Text":"We already know that the kernel of A is the span v. Bv must be some constant times v,"},{"Start":"01:48.185 ","End":"01:50.710","Text":"say Lambda v, for some Lambda."},{"Start":"01:50.710 ","End":"01:57.020","Text":"That means that v is an eigenvector of B corresponding to eigenvalue Lambda."},{"Start":"01:57.020 ","End":"01:59.285","Text":"This is what we had to show,"},{"Start":"01:59.285 ","End":"02:01.680","Text":"and so we\u0027re done."}],"ID":25747},{"Watched":false,"Name":"Exercise 13","Duration":"8m 33s","ChapterTopicVideoID":24835,"CourseChapterTopicPlaylistID":7320,"HasSubtitles":true,"ThumbnailPath":null,"UploadDate":null,"DurationForVideoObject":null,"Description":null,"MetaTitle":null,"MetaDescription":null,"Canonical":null,"VideoComments":[],"Subtitles":[{"Start":"00:00.000 ","End":"00:03.435","Text":"This exercise has several parts."},{"Start":"00:03.435 ","End":"00:06.765","Text":"In part a, we have a square 2 by 2 matrix,"},{"Start":"00:06.765 ","End":"00:08.400","Text":"and we have to prove that"},{"Start":"00:08.400 ","End":"00:16.720","Text":"the characteristic polynomial can be computed in terms of the trace and the determinant."},{"Start":"00:17.420 ","End":"00:20.085","Text":"There\u0027s also a second part."},{"Start":"00:20.085 ","End":"00:22.710","Text":"You know what? We\u0027ll read these as we get to them."},{"Start":"00:22.710 ","End":"00:27.420","Text":"Let\u0027s just start with part a where we have a 2 by 2 matrix and we have to"},{"Start":"00:27.420 ","End":"00:32.729","Text":"prove that the characteristic polynomial is given as follows."},{"Start":"00:32.729 ","End":"00:38.015","Text":"X squared minus the trace of A times x plus the determinant of A."},{"Start":"00:38.015 ","End":"00:41.675","Text":"What we\u0027ll do is first take an example."},{"Start":"00:41.675 ","End":"00:45.050","Text":"Let\u0027s say 1, 2, 3, 4 for A."},{"Start":"00:45.050 ","End":"00:47.899","Text":"Then the trace of A is 5,"},{"Start":"00:47.899 ","End":"00:51.125","Text":"it\u0027s the sum of the diagonal,"},{"Start":"00:51.125 ","End":"00:56.570","Text":"and the determinant is the product 1 times 4 minus 2 times 3,"},{"Start":"00:56.570 ","End":"00:58.660","Text":"so that\u0027s minus 2."},{"Start":"00:58.660 ","End":"01:00.765","Text":"If we plug it into this,"},{"Start":"01:00.765 ","End":"01:02.360","Text":"note there\u0027s a minus here."},{"Start":"01:02.360 ","End":"01:06.920","Text":"Then we get x squared minus 5x minus 2."},{"Start":"01:06.920 ","End":"01:09.320","Text":"That\u0027s our characteristic polynomial."},{"Start":"01:09.320 ","End":"01:11.150","Text":"That was just an example."},{"Start":"01:11.150 ","End":"01:16.070","Text":"Now let\u0027s actually do it in general and we\u0027ll let the matrix A be little a,"},{"Start":"01:16.070 ","End":"01:20.375","Text":"b, c, d. 
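The commuting-matrices argument of the preceding exercise can be illustrated numerically. The matrices below are my own choices (they are not given in the exercise): A has rank n - 1 = 2, and B is taken as a polynomial in A so that AB = BA automatically.

```python
import numpy as np

# Illustrative example: rank(A) = n - 1, B commutes with A,
# and the vector spanning ker(A) is an eigenvector of B as well.
A = np.array([[0.0, 1.0, 0.0],
              [0.0, 1.0, 2.0],
              [0.0, 0.0, 3.0]])
B = 2 * A @ A + 3 * A + 5 * np.eye(3)

print(np.allclose(A @ B, B @ A))      # True: A and B commute
print(np.linalg.matrix_rank(A))       # 2, i.e. n - 1, so dim ker(A) = 1

v = np.array([1.0, 0.0, 0.0])         # A v = 0, so v spans ker(A)
print(A @ v)                          # [0. 0. 0.]
print(B @ v)                          # [5. 0. 0.] = 5 * v, so v is an eigenvector of B
```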
Let\u0027s compute the characteristic polynomial."},{"Start":"01:20.375 ","End":"01:26.580","Text":"Well, we have this formula that the polynomial is the determinant of xI minus A,"},{"Start":"01:26.580 ","End":"01:30.430","Text":"and that is equal to the following."},{"Start":"01:30.430 ","End":"01:37.310","Text":"If we multiply this diagonal and subtract this product of the diagonal, we get this."},{"Start":"01:37.310 ","End":"01:45.955","Text":"This comes out if we arrange it to be x squared minus a plus dx plus ad minus bc."},{"Start":"01:45.955 ","End":"01:47.880","Text":"I\u0027ve colored it to help us,"},{"Start":"01:47.880 ","End":"01:49.915","Text":"look what is a plus d?"},{"Start":"01:49.915 ","End":"01:52.490","Text":"That is exactly the trace of A,"},{"Start":"01:52.490 ","End":"01:55.130","Text":"and what\u0027s ad minus bc?"},{"Start":"01:55.130 ","End":"01:58.030","Text":"Well, that\u0027s the determinant of A."},{"Start":"01:58.030 ","End":"02:02.370","Text":"We\u0027ve got this and that proves Part 1."},{"Start":"02:02.370 ","End":"02:06.650","Text":"There was also a second part, 2 part a."},{"Start":"02:06.650 ","End":"02:14.015","Text":"That given that the trace of A is 4 and given also that A has only 1 eigenvalue,"},{"Start":"02:14.015 ","End":"02:17.000","Text":"compute the determinant of A."},{"Start":"02:17.000 ","End":"02:25.200","Text":"The characteristic polynomial is x squared minus 4x plus determinant of A."},{"Start":"02:25.200 ","End":"02:27.560","Text":"Now we know it has 1 eigenvalue."},{"Start":"02:27.560 ","End":"02:31.600","Text":"Well, let\u0027s call it Lambda as a polynomial of degree 2."},{"Start":"02:31.600 ","End":"02:35.020","Text":"So x minus Lambda factors into it."},{"Start":"02:35.020 ","End":"02:41.160","Text":"The other linear factor also has to be x minus Lambda because we only have 1 eigenvalue."},{"Start":"02:41.160 ","End":"02:45.255","Text":"The polynomial is x minus Lambda squared."},{"Start":"02:45.255 ","End":"02:52.360","Text":"What we have is that x squared minus 4x plus determinant of A,"},{"Start":"02:52.360 ","End":"02:54.445","Text":"which is this on the 1 hand,"},{"Start":"02:54.445 ","End":"03:00.190","Text":"is equal to squaring this x squared minus 2 Lambda plus Lambda squared."},{"Start":"03:00.190 ","End":"03:02.080","Text":"Now it\u0027s not just inequality,"},{"Start":"03:02.080 ","End":"03:04.480","Text":"this is identity of polynomials."},{"Start":"03:04.480 ","End":"03:06.920","Text":"We can compare coefficients."},{"Start":"03:06.920 ","End":"03:11.580","Text":"What we can get is that 4 equals 2 Lambda,"},{"Start":"03:11.580 ","End":"03:14.010","Text":"so that Lambda equals 2."},{"Start":"03:14.010 ","End":"03:16.115","Text":"Now that we have Lambda,"},{"Start":"03:16.115 ","End":"03:21.220","Text":"since we have that the determinant of A is Lambda squared and Lambda is 2."},{"Start":"03:21.220 ","End":"03:24.275","Text":"The determinant of A is equal to 4,"},{"Start":"03:24.275 ","End":"03:28.110","Text":"and that\u0027s the answer to a Part 2."},{"Start":"03:28.110 ","End":"03:35.350","Text":"I\u0027ll show you a different way we can do it without using this x minus Lambda squared."},{"Start":"03:35.350 ","End":"03:44.625","Text":"What we can say is that this x squared minus 4x plus determinant A has only 1 solution."},{"Start":"03:44.625 ","End":"03:47.500","Text":"As a quadratic, if it only has 1 solution,"},{"Start":"03:47.500 ","End":"03:50.050","Text":"it\u0027s discriminant is 0."},{"Start":"03:50.050 
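For part (a), the claim p(x) = x^2 - trace(A) x + det(A) can be checked directly with the 2x2 example used in the clip, A = [[1, 2], [3, 4]], whose characteristic polynomial came out as x^2 - 5x - 2.

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0]])

coeffs = np.poly(A)                      # coefficients of det(xI - A), highest power first
print(coeffs)                            # [ 1. -5. -2.]  ->  x^2 - 5x - 2
print(-np.trace(A), np.linalg.det(A))    # -5.0 and -2.0, matching the two lower coefficients
```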
","End":"03:52.475","Text":"The discriminant of the quadratic,"},{"Start":"03:52.475 ","End":"03:55.700","Text":"remember it\u0027s b squared minus 4ac."},{"Start":"03:55.700 ","End":"03:58.930","Text":"I didn\u0027t want to write b squared minus 4ac because we already have a,"},{"Start":"03:58.930 ","End":"04:00.685","Text":"b, c, and d here."},{"Start":"04:00.685 ","End":"04:04.900","Text":"Let\u0027s just mentally do at b squared minus 4 squared"},{"Start":"04:04.900 ","End":"04:09.280","Text":"minus 4 times 1 times determinant of A. Yeah,"},{"Start":"04:09.280 ","End":"04:15.485","Text":"16 minus 4 times 1 times determinant of A is 0."},{"Start":"04:15.485 ","End":"04:22.760","Text":"Then from this, we can get that the determinant of A has to equal 4,"},{"Start":"04:22.760 ","End":"04:26.210","Text":"which luckily is the same answer as we had here."},{"Start":"04:26.210 ","End":"04:28.880","Text":"That\u0027s Part 2 of part a."},{"Start":"04:28.880 ","End":"04:33.155","Text":"We\u0027ve done part a and onto part b."},{"Start":"04:33.155 ","End":"04:37.715","Text":"This is actually a generalization of part a 2,"},{"Start":"04:37.715 ","End":"04:40.400","Text":"from degree 2 degree n."},{"Start":"04:40.400 ","End":"04:44.960","Text":"Where if we have the characteristic polynomial of a monic polynomial,"},{"Start":"04:44.960 ","End":"04:48.230","Text":"then we now the next leading coefficient and"},{"Start":"04:48.230 ","End":"04:53.075","Text":"the constant coefficient in terms of the trace and the determinant."},{"Start":"04:53.075 ","End":"04:57.304","Text":"This 1 here is minus the trace,"},{"Start":"04:57.304 ","End":"05:02.450","Text":"and the last 1 is plus or minus the determinant depending on the degree"},{"Start":"05:02.450 ","End":"05:10.005","Text":"n. The definition of the characteristic polynomial is the determinant of xI minus A."},{"Start":"05:10.005 ","End":"05:13.444","Text":"We\u0027ll evaluate p of 0 in 2 ways."},{"Start":"05:13.444 ","End":"05:20.180","Text":"On the 1 hand, we can just substitute 0 in here and get the determinant of minus A."},{"Start":"05:20.180 ","End":"05:23.885","Text":"The n by n is just to remind you of the size of it."},{"Start":"05:23.885 ","End":"05:28.520","Text":"On the other hand, if we substitute x equals 0 here,"},{"Start":"05:28.520 ","End":"05:32.390","Text":"all of these are 0 and we just get a naught."},{"Start":"05:32.390 ","End":"05:34.490","Text":"We have this equality,"},{"Start":"05:34.490 ","End":"05:40.910","Text":"which means that if we use the formula that the determinant of k times"},{"Start":"05:40.910 ","End":"05:49.240","Text":"something is k^n of the determinant without the k. 
Let k equal minus 1."},{"Start":"05:49.240 ","End":"05:51.090","Text":"Then we get this formula,"},{"Start":"05:51.090 ","End":"05:53.525","Text":"minus 1^n determinant of A."},{"Start":"05:53.525 ","End":"05:55.070","Text":"This path was the easy part."},{"Start":"05:55.070 ","End":"05:57.535","Text":"The trace 1 is the harder part."},{"Start":"05:57.535 ","End":"06:02.300","Text":"Let\u0027s A be a square matrix consisting of"},{"Start":"06:02.300 ","End":"06:09.060","Text":"coefficient c_ij from c_11 up to c_nn."},{"Start":"06:09.230 ","End":"06:14.150","Text":"The determinant of xI minus A means that we make all"},{"Start":"06:14.150 ","End":"06:18.995","Text":"this negative and then add x\u0027s along the diagonal."},{"Start":"06:18.995 ","End":"06:24.755","Text":"The characteristic polynomial, which is equal to this determinant"},{"Start":"06:24.755 ","End":"06:31.430","Text":"of xI minus A is if we expand by the first row,"},{"Start":"06:31.430 ","End":"06:34.875","Text":"we have this diagonal product,"},{"Start":"06:34.875 ","End":"06:39.590","Text":"and then we have each 1 of these times A minor."},{"Start":"06:39.590 ","End":"06:41.900","Text":"Where we delete a row and a column."},{"Start":"06:41.900 ","End":"06:44.165","Text":"If we do that,"},{"Start":"06:44.165 ","End":"06:47.550","Text":"we delete 2 of the x\u0027s at 1 time."},{"Start":"06:47.550 ","End":"06:51.380","Text":"So we have at most degree n minus 2."},{"Start":"06:51.380 ","End":"06:57.775","Text":"There is nothing that will give us a degree n minus 1 other than this diagonal."},{"Start":"06:57.775 ","End":"07:02.105","Text":"That\u0027s all we care about. We have degree less than or equal to n minus 2."},{"Start":"07:02.105 ","End":"07:06.275","Text":"But we do have a degree n minus 1 hidden in here."},{"Start":"07:06.275 ","End":"07:10.250","Text":"Now, this part here is,"},{"Start":"07:10.250 ","End":"07:12.415","Text":"first of all, x^n,"},{"Start":"07:12.415 ","End":"07:14.625","Text":"that\u0027s the only thing that\u0027ll give us x^n."},{"Start":"07:14.625 ","End":"07:18.480","Text":"To get the coefficient of x^n minus 1,"},{"Start":"07:18.480 ","End":"07:24.305","Text":"each time we take n minus 1x\u0027s and a minus c something."},{"Start":"07:24.305 ","End":"07:29.660","Text":"So altogether we have n such terms, minus c_11,"},{"Start":"07:29.660 ","End":"07:32.000","Text":"and minus c_22, and so on,"},{"Start":"07:32.000 ","End":"07:37.825","Text":"x^n minus 1 and all the rest of it will be degrees less than or equal to n minus 2."},{"Start":"07:37.825 ","End":"07:41.735","Text":"If we combine that with this,"},{"Start":"07:41.735 ","End":"07:50.285","Text":"we have that the original p of x is equal to x^n minus this."},{"Start":"07:50.285 ","End":"07:53.970","Text":"Then the polynomial in degree,"},{"Start":"07:53.970 ","End":"07:56.960","Text":"less than or equal to n minus 2 plus this polynomial,"},{"Start":"07:56.960 ","End":"08:01.180","Text":"it\u0027s still a polynomial of degree less than or equal to n minus 2."},{"Start":"08:01.180 ","End":"08:10.490","Text":"Again, this disappears and the blue part gets replaced by x^n minus this, plus this."},{"Start":"08:10.490 ","End":"08:16.545","Text":"Then all we have to do is add this and these 2 combine to give us this."},{"Start":"08:16.545 ","End":"08:20.140","Text":"This, what\u0027s in brackets is the trace of A,"},{"Start":"08:20.140 ","End":"08:26.270","Text":"and so that gives us that the coefficient of x^n minus 1,"},{"Start":"08:26.270 
","End":"08:28.430","Text":"which is a_n minus 1,"},{"Start":"08:28.430 ","End":"08:30.845","Text":"is minus the trace of A."},{"Start":"08:30.845 ","End":"08:32.360","Text":"That\u0027s all that remained,"},{"Start":"08:32.360 ","End":"08:34.770","Text":"and we are done."}],"ID":25748},{"Watched":false,"Name":"Algebraic and Geometric Multiplicity","Duration":"11m 2s","ChapterTopicVideoID":25336,"CourseChapterTopicPlaylistID":7320,"HasSubtitles":true,"ThumbnailPath":null,"UploadDate":null,"DurationForVideoObject":null,"Description":null,"MetaTitle":null,"MetaDescription":null,"Canonical":null,"VideoComments":[],"Subtitles":[{"Start":"00:00.000 ","End":"00:04.650","Text":"Now we\u0027ll learn about the concept of a multiplicity of an eigenvalue."},{"Start":"00:04.650 ","End":"00:07.125","Text":"Actually, there\u0027s 2 kinds of multiplicity,"},{"Start":"00:07.125 ","End":"00:10.770","Text":"algebraic multiplicity and geometric multiplicity."},{"Start":"00:10.770 ","End":"00:15.345","Text":"We\u0027ll start with the algebraic multiplicity."},{"Start":"00:15.345 ","End":"00:17.100","Text":"We\u0027ll start with an example."},{"Start":"00:17.100 ","End":"00:22.695","Text":"The following 2 matrices are given A and B as follows."},{"Start":"00:22.695 ","End":"00:25.035","Text":"They are very similar,"},{"Start":"00:25.035 ","End":"00:27.570","Text":"just as a 4 here and a 1 here."},{"Start":"00:27.570 ","End":"00:29.505","Text":"Everything else is the same."},{"Start":"00:29.505 ","End":"00:35.790","Text":"Now, if you would compute the characteristic polynomial for matrix A,"},{"Start":"00:35.790 ","End":"00:37.905","Text":"I\u0027ve spared the details,"},{"Start":"00:37.905 ","End":"00:42.810","Text":"we get the following characteristic polynomial."},{"Start":"00:42.810 ","End":"00:48.530","Text":"If we do it for B, then we get this characteristic polynomial."},{"Start":"00:48.530 ","End":"00:54.170","Text":"They\u0027re somewhat similar, the squared here is all over the x minus 4,"},{"Start":"00:54.170 ","End":"00:57.005","Text":"and here it\u0027s over the x minus 1."},{"Start":"00:57.005 ","End":"00:59.240","Text":"Now in both cases,"},{"Start":"00:59.240 ","End":"01:01.850","Text":"there are just 2 eigenvalues,"},{"Start":"01:01.850 ","End":"01:04.190","Text":"1 and 4,"},{"Start":"01:04.190 ","End":"01:08.380","Text":"even though the polynomials are different."},{"Start":"01:08.380 ","End":"01:11.670","Text":"The polynomial for A is x minus 1,"},{"Start":"01:11.670 ","End":"01:13.320","Text":"x minus 4, x minus 4,"},{"Start":"01:13.320 ","End":"01:18.620","Text":"and the one for B has twice the factor x minus 1 and only wants the x minus 4."},{"Start":"01:18.620 ","End":"01:23.330","Text":"The solutions, the roots of the polynomial,"},{"Start":"01:23.330 ","End":"01:27.520","Text":"including multiple roots,"},{"Start":"01:27.520 ","End":"01:30.325","Text":"here it\u0027s 1, 4, and 4,"},{"Start":"01:30.325 ","End":"01:33.650","Text":"and here it\u0027s 1, 1, and 4."},{"Start":"01:33.650 ","End":"01:37.504","Text":"This brings us to the new concept,"},{"Start":"01:37.504 ","End":"01:40.850","Text":"the algebraic multiplicity of an eigenvalue."},{"Start":"01:40.850 ","End":"01:45.860","Text":"For the matrix A because of the 1, 4, 4,"},{"Start":"01:45.860 ","End":"01:51.815","Text":"we say that the eigenvalue x equals 1 has multiplicity"},{"Start":"01:51.815 ","End":"01:58.770","Text":"1 and x equals 4 has algebraic multiplicity 2."},{"Start":"01:58.790 ","End":"02:02.800","Text":"Can you guess 
what it\u0027s going to be for B?"},{"Start":"02:02.800 ","End":"02:06.650","Text":"Well, naturally just the opposite because of the 1, 1, 4."},{"Start":"02:06.650 ","End":"02:13.705","Text":"Here, the eigenvalue 1 has multiplicity 2 and the eigenvalue 4 has multiplicity 1."},{"Start":"02:13.705 ","End":"02:18.635","Text":"I mean algebraic multiplicity when I just say multiplicity."},{"Start":"02:18.635 ","End":"02:25.310","Text":"That brings us to the definition of the algebraic multiplicity of an eigenvalue,"},{"Start":"02:25.310 ","End":"02:30.343","Text":"is just the number of times it appears as a root of the characteristic polynomial,"},{"Start":"02:30.343 ","End":"02:33.055","Text":"so as we saw here,"},{"Start":"02:33.055 ","End":"02:35.145","Text":"4 appeared twice,"},{"Start":"02:35.145 ","End":"02:42.670","Text":"so 4 has multiplicity 2 in the matrix A and so on."},{"Start":"02:42.890 ","End":"02:45.695","Text":"Here\u0027s another example exercise."},{"Start":"02:45.695 ","End":"02:51.170","Text":"If A is a square matrix whose characteristic polynomial is this,"},{"Start":"02:51.170 ","End":"02:53.375","Text":"p of x is all of this,"},{"Start":"02:53.375 ","End":"03:01.519","Text":"what is the order of A and what are the eigenvalues and for each eigenvalue,"},{"Start":"03:01.519 ","End":"03:04.355","Text":"what is its algebraic multiplicity?"},{"Start":"03:04.355 ","End":"03:10.070","Text":"For Part a, we see that the polynomial has degree 12,"},{"Start":"03:10.070 ","End":"03:11.780","Text":"4 plus 2 is 6,"},{"Start":"03:11.780 ","End":"03:13.520","Text":"plus 5 is 11,"},{"Start":"03:13.520 ","End":"03:15.065","Text":"plus 1 is 12."},{"Start":"03:15.065 ","End":"03:19.445","Text":"It has degree 12. Then the matrix is order 12,"},{"Start":"03:19.445 ","End":"03:23.610","Text":"or a square 12 by 12 matrix."},{"Start":"03:23.660 ","End":"03:27.335","Text":"For Part b, we just have to look at this,"},{"Start":"03:27.335 ","End":"03:31.985","Text":"look at the exponents and look at the roots."},{"Start":"03:31.985 ","End":"03:35.820","Text":"We\u0027ll see from the first term,"},{"Start":"03:35.820 ","End":"03:37.800","Text":"this is x minus 0^4,"},{"Start":"03:37.800 ","End":"03:42.900","Text":"so 0 is an eigenvalue with multiplicity 4 and similarly for the rest,"},{"Start":"03:42.900 ","End":"03:45.465","Text":"1 has multiplicity 2,"},{"Start":"03:45.465 ","End":"03:48.750","Text":"negative 2 has multiplicity 5,"},{"Start":"03:48.750 ","End":"03:52.455","Text":"and 7 has multiplicity 1."},{"Start":"03:52.455 ","End":"03:55.810","Text":"That\u0027s that example."},{"Start":"03:56.450 ","End":"04:02.710","Text":"Now it\u0027s time to move on to the geometric multiplicity of an eigenvalue."},{"Start":"04:02.710 ","End":"04:06.890","Text":"Now we saw earlier that an eigenvalue can have"},{"Start":"04:06.890 ","End":"04:13.085","Text":"more than 1 eigenvector and not just a multiple of the other."},{"Start":"04:13.085 ","End":"04:20.370","Text":"The example we had was the eigenvalue x equals 3 for this matrix."},{"Start":"04:20.370 ","End":"04:22.050","Text":"We saw that both 1, 0,"},{"Start":"04:22.050 ","End":"04:26.410","Text":"1 and 1, 1, 0 are eigenvectors."},{"Start":"04:26.410 ","End":"04:31.730","Text":"Note that these two are linearly independent."},{"Start":"04:31.730 ","End":"04:37.130","Text":"This brings us to the definition that the geometric multiplicity of"},{"Start":"04:37.130 ","End":"04:45.810","Text":"an eigenvalue is the number of linearly independent eigenvectors 
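The example exercise above, where p(x) = x^4 (x - 1)^2 (x + 2)^5 (x - 7), can be replayed symbolically; the sketch below simply reads off the degree and the root multiplicities.

```python
from sympy import symbols, roots, Poly

x = symbols('x')
p = x**4 * (x - 1)**2 * (x + 2)**5 * (x - 7)

print(Poly(p, x).degree())   # 12 -> A is a 12 x 12 matrix
print(roots(p, x))           # {0: 4, 1: 2, -2: 5, 7: 1}: eigenvalues with algebraic multiplicities
```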
corresponding to it."},{"Start":"04:46.470 ","End":"04:49.420","Text":"In fact, it turns out that in this case,"},{"Start":"04:49.420 ","End":"04:53.410","Text":"the geometric multiplicity of 3 is exactly 2."},{"Start":"04:53.410 ","End":"04:56.740","Text":"We can\u0027t find a third independent eigenvector."},{"Start":"04:56.740 ","End":"05:03.280","Text":"Anyway, let\u0027s continue and we\u0027ll analyze the example above in a bit more detail,"},{"Start":"05:03.280 ","End":"05:05.800","Text":"so here again is our matrix A,"},{"Start":"05:05.800 ","End":"05:09.490","Text":"and let\u0027s compute its characteristic polynomial."},{"Start":"05:09.490 ","End":"05:17.950","Text":"I\u0027ve spared the details just the result turns out to be x minus 2 and x minus 3 squared."},{"Start":"05:17.950 ","End":"05:22.260","Text":"The eigenvalues are, well,"},{"Start":"05:22.260 ","End":"05:26.290","Text":"they\u0027re 2 and 3 but if you write the solutions,"},{"Start":"05:26.290 ","End":"05:28.300","Text":"the roots of the polynomial,"},{"Start":"05:28.300 ","End":"05:31.205","Text":"then 3 appears twice."},{"Start":"05:31.205 ","End":"05:38.015","Text":"In fact, the algebraic multiplicity of eigenvalue 3 is 2 and the multiplicity of 2 is 1."},{"Start":"05:38.015 ","End":"05:39.980","Text":"That\u0027s the algebraic."},{"Start":"05:39.980 ","End":"05:45.095","Text":"We want to find the geometric multiplicity of x equals 3."},{"Start":"05:45.095 ","End":"05:49.790","Text":"What we do is we start by finding the eigenvectors."},{"Start":"05:49.790 ","End":"05:53.960","Text":"Remember we do this by substituting the eigenvalue 3 in"},{"Start":"05:53.960 ","End":"05:58.340","Text":"this case into the characteristic matrix,"},{"Start":"05:58.340 ","End":"06:06.260","Text":"which gives us this and its corresponding homogeneous SLE, which is this."},{"Start":"06:06.260 ","End":"06:07.970","Text":"Well, all 3 rows are the same,"},{"Start":"06:07.970 ","End":"06:10.295","Text":"all 3 equations are the same."},{"Start":"06:10.295 ","End":"06:11.900","Text":"It\u0027s easy to solve,"},{"Start":"06:11.900 ","End":"06:13.370","Text":"but you start to solve one equation."},{"Start":"06:13.370 ","End":"06:20.719","Text":"Y and z would be the free variables and x is bound,"},{"Start":"06:20.719 ","End":"06:23.045","Text":"it\u0027s dependent on y and z."},{"Start":"06:23.045 ","End":"06:31.625","Text":"We can get 2 independent solutions vectors."},{"Start":"06:31.625 ","End":"06:34.070","Text":"First we let z equal 1,"},{"Start":"06:34.070 ","End":"06:37.310","Text":"y equals 0, and then we\u0027ll do the other way around."},{"Start":"06:37.310 ","End":"06:39.890","Text":"If z is 1, y is 0,"},{"Start":"06:39.890 ","End":"06:46.545","Text":"then we get that x is 1 and if z is 0, y is 1."},{"Start":"06:46.545 ","End":"06:49.440","Text":"Then we also get x equals 1."},{"Start":"06:49.440 ","End":"06:51.840","Text":"Writing these as vectors,"},{"Start":"06:51.840 ","End":"06:53.809","Text":"let us do it as row vectors."},{"Start":"06:53.809 ","End":"06:55.610","Text":"This one gives us x,"},{"Start":"06:55.610 ","End":"06:57.440","Text":"y, z is 1, 0, 1."},{"Start":"06:57.440 ","End":"06:59.705","Text":"The other one gives us 1, 1, 0."},{"Start":"06:59.705 ","End":"07:08.510","Text":"That means that there are just 2 linearly independent eigenvectors for the eigenvalue 3."},{"Start":"07:08.510 ","End":"07:14.160","Text":"In conclusion, the geometric multiplicity is 2."},{"Start":"07:14.360 ","End":"07:17.045","Text":"Now 
one more concept,"},{"Start":"07:17.045 ","End":"07:20.635","Text":"the eigenspace of an eigenvalue."},{"Start":"07:20.635 ","End":"07:23.150","Text":"Returning to the example above,"},{"Start":"07:23.150 ","End":"07:26.180","Text":"the same matrix A and a reminder,"},{"Start":"07:26.180 ","End":"07:31.235","Text":"these are the 2 eigenvectors of eigenvalue 3."},{"Start":"07:31.235 ","End":"07:39.340","Text":"Now, the eigenvectors came as solutions to a homogeneous system of linear equations."},{"Start":"07:39.340 ","End":"07:42.625","Text":"Whenever we have solutions to an SLE,"},{"Start":"07:42.625 ","End":"07:48.560","Text":"the sum and scalar product of any one of them is also one of them,"},{"Start":"07:48.560 ","End":"07:55.865","Text":"so that means that any linear combination of such eigenvectors is also an eigenvector."},{"Start":"07:55.865 ","End":"07:58.070","Text":"Let\u0027s take an example of a linear combination."},{"Start":"07:58.070 ","End":"08:00.935","Text":"I\u0027ll take 4 of these and 10 of those."},{"Start":"08:00.935 ","End":"08:02.480","Text":"If I compute that,"},{"Start":"08:02.480 ","End":"08:04.705","Text":"that comes out to be this."},{"Start":"08:04.705 ","End":"08:10.765","Text":"Now my claim is also an eigenvector of the same eigenvalue 3."},{"Start":"08:10.765 ","End":"08:13.415","Text":"Here\u0027s the computation."},{"Start":"08:13.415 ","End":"08:18.980","Text":"Well, I\u0027ll leave you to check this times this 14,"},{"Start":"08:18.980 ","End":"08:21.260","Text":"10, 4 which I\u0027ve now written as a column vector,"},{"Start":"08:21.260 ","End":"08:24.900","Text":"gives us this, which is 3 times."},{"Start":"08:26.540 ","End":"08:32.035","Text":"Sorry, I meant 14, 10, 4."},{"Start":"08:32.035 ","End":"08:34.040","Text":"Now, what this means,"},{"Start":"08:34.040 ","End":"08:36.740","Text":"this property of linear combinations,"},{"Start":"08:36.740 ","End":"08:38.450","Text":"also being an eigenvector,"},{"Start":"08:38.450 ","End":"08:41.375","Text":"it means that if we take the subset of eigenvectors,"},{"Start":"08:41.375 ","End":"08:43.990","Text":"it\u0027s actually a subspace."},{"Start":"08:43.990 ","End":"08:47.290","Text":"That brings us to a definition."},{"Start":"08:47.290 ","End":"08:54.155","Text":"The subspace consisting of all eigenvectors of an eigenvalue,"},{"Start":"08:54.155 ","End":"08:58.130","Text":"throw in the 0 also, I should have said,"},{"Start":"08:58.130 ","End":"09:02.120","Text":"is called the eigenspace of this eigenvalue."},{"Start":"09:02.120 ","End":"09:10.820","Text":"Better write that in plus the 0 vector because the vector space has to have the 0 in it,"},{"Start":"09:10.820 ","End":"09:14.450","Text":"so in the example that we had above,"},{"Start":"09:14.450 ","End":"09:17.290","Text":"the eigenspace of the eigenvalue,"},{"Start":"09:17.290 ","End":"09:23.300","Text":"3 is the span of the 2 vectors, 1,"},{"Start":"09:23.300 ","End":"09:25.475","Text":"0, 1 and 1, 1, 0,"},{"Start":"09:25.475 ","End":"09:30.505","Text":"I guess I meant square brackets here."},{"Start":"09:30.505 ","End":"09:38.845","Text":"I think you got it. 
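The 3x3 matrix A in this example is shown on screen rather than in the captions, so the matrix below is a stand-in chosen to be consistent with the quoted numbers (characteristic polynomial (x - 2)(x - 3)^2, eigenvectors (1, 0, 1) and (1, 1, 0) for the eigenvalue 3, and the combination (14, 10, 4)). It illustrates the geometric multiplicity and the eigenspace property.

```python
import numpy as np

# Stand-in matrix consistent with the clip's results (not the on-screen matrix itself).
A = np.array([[2.0, 1.0, 1.0],
              [0.0, 3.0, 0.0],
              [0.0, 0.0, 3.0]])

print(np.poly(A))                                     # [ 1. -8. 21. -18.] = (x-2)(x-3)^2 expanded

# Geometric multiplicity of 3 = dim ker(A - 3I) = 3 - rank(A - 3I)
print(3 - np.linalg.matrix_rank(A - 3 * np.eye(3)))   # 2

# A linear combination of eigenvectors of 3 is again an eigenvector of 3
w = 4 * np.array([1.0, 0.0, 1.0]) + 10 * np.array([1.0, 1.0, 0.0])   # (14, 10, 4)
print(A @ w)                                          # [42. 30. 12.] = 3 * (14, 10, 4)
```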
The way we denoted is v for x equals 3,"},{"Start":"09:38.845 ","End":"09:43.590","Text":"the eigenspace of the eigenvalue 3."},{"Start":"09:43.590 ","End":"09:51.095","Text":"That brings us to an alternative definition of geometric multiplicity."},{"Start":"09:51.095 ","End":"09:57.485","Text":"The geometric multiplicity of an eigenvalue is the dimension,"},{"Start":"09:57.485 ","End":"10:04.115","Text":"which means the number of elements in a basis of the eigenspace of the eigenvalue."},{"Start":"10:04.115 ","End":"10:09.925","Text":"We\u0027ll conclude this clip with an important and well-known theorem."},{"Start":"10:09.925 ","End":"10:13.940","Text":"It goes like this, for any eigenvalue,"},{"Start":"10:13.940 ","End":"10:17.779","Text":"its geometric multiplicity can never exceed"},{"Start":"10:17.779 ","End":"10:22.525","Text":"its algebraic multiplicity less than or equal to."},{"Start":"10:22.525 ","End":"10:27.790","Text":"Sometimes we throw in also the obvious fact that this is bigger or equal to 0"},{"Start":"10:27.790 ","End":"10:33.634","Text":"and the algebraic multiplicity can never exceed the dimension of the space itself."},{"Start":"10:33.634 ","End":"10:38.630","Text":"Here we have this theorem which shows the relation between the 2 multiplicities,"},{"Start":"10:38.630 ","End":"10:41.030","Text":"the geometric and the algebraic."},{"Start":"10:41.030 ","End":"10:44.290","Text":"But also write this in mathematical symbols."},{"Start":"10:44.290 ","End":"10:52.340","Text":"Notation often is Gamma for geometric multiplicity and Mu for algebraic."},{"Start":"10:52.340 ","End":"10:55.760","Text":"I\u0027ve also seen m sub g multiplicity,"},{"Start":"10:55.760 ","End":"10:59.580","Text":"geometric and multiplicity algebraic."},{"Start":"10:59.580 ","End":"11:02.960","Text":"That concludes this clip."}],"ID":26153},{"Watched":false,"Name":"Exercise 14","Duration":"6m 50s","ChapterTopicVideoID":24836,"CourseChapterTopicPlaylistID":7320,"HasSubtitles":true,"ThumbnailPath":null,"UploadDate":null,"DurationForVideoObject":null,"Description":null,"MetaTitle":null,"MetaDescription":null,"Canonical":null,"VideoComments":[],"Subtitles":[{"Start":"00:00.000 ","End":"00:04.830","Text":"This exercise is about a square matrix B 4 by 4."},{"Start":"00:04.830 ","End":"00:06.735","Text":"We can assume it\u0027s over the reals,"},{"Start":"00:06.735 ","End":"00:08.309","Text":"it doesn\u0027t say otherwise."},{"Start":"00:08.309 ","End":"00:11.265","Text":"Suppose that it has a rank of 1,"},{"Start":"00:11.265 ","End":"00:13.875","Text":"we have to prove 5 things."},{"Start":"00:13.875 ","End":"00:16.920","Text":"That 0 is an eigenvalue of B,"},{"Start":"00:16.920 ","End":"00:22.775","Text":"that the geometric multiplicity of the eigenvalue is 3,"},{"Start":"00:22.775 ","End":"00:28.245","Text":"that the algebraic multiplicity is either 3 or 4,"},{"Start":"00:28.245 ","End":"00:34.395","Text":"that B has at most 2 eigenvalues, and finally,"},{"Start":"00:34.395 ","End":"00:38.950","Text":"that if lambda is not 0 and is an eigenvalue of B,"},{"Start":"00:38.950 ","End":"00:42.835","Text":"then lambda is the trace of B."},{"Start":"00:42.835 ","End":"00:44.790","Text":"Let\u0027s start, the first 1,"},{"Start":"00:44.790 ","End":"00:47.495","Text":"we have to show the 0 is an eigenvalue of B."},{"Start":"00:47.495 ","End":"00:54.410","Text":"Now, 0 is an eigenvalue of a square matrix if and only if it\u0027s non-invertible."},{"Start":"00:54.410 ","End":"01:01.210","Text":"That\u0027s true if 
and only if the rank of that matrix is less than 4."},{"Start":"01:01.210 ","End":"01:03.720","Text":"Now, the rank here is 1,"},{"Start":"01:03.720 ","End":"01:06.615","Text":"which is for sure less than 4."},{"Start":"01:06.615 ","End":"01:08.370","Text":"The result follows."},{"Start":"01:08.370 ","End":"01:11.495","Text":"We can also prove it using the rank nullity theorem."},{"Start":"01:11.495 ","End":"01:14.930","Text":"We can say that the rank of B plus the nullity,"},{"Start":"01:14.930 ","End":"01:18.445","Text":"which is a dimension of the kernel is 4."},{"Start":"01:18.445 ","End":"01:23.780","Text":"The dimension of the kernel is 4 minus 1, which is 3."},{"Start":"01:23.780 ","End":"01:27.410","Text":"3 is definitely bigger than 0,"},{"Start":"01:27.410 ","End":"01:32.840","Text":"which means that the kernel is not trivial."},{"Start":"01:32.840 ","End":"01:35.825","Text":"It\u0027s got some vector in it."},{"Start":"01:35.825 ","End":"01:41.695","Text":"Any vector in kernel B is an eigenvector because"},{"Start":"01:41.695 ","End":"01:49.229","Text":"Bv is 0 and 0 is lambda v. We have Bv is lambda v, lambda is 0."},{"Start":"01:49.229 ","End":"01:55.745","Text":"0 is an eigenvalue of B with the corresponding eigenvector v. That was the first part."},{"Start":"01:55.745 ","End":"02:01.010","Text":"The second part was to show the geometric multiplicity is 3."},{"Start":"02:01.010 ","End":"02:06.484","Text":"The eigenspace for eigenvalue 0 is the same as the kernel."},{"Start":"02:06.484 ","End":"02:09.440","Text":"The kernel has dimension 3."},{"Start":"02:09.440 ","End":"02:12.935","Text":"We just showed this here, dimension 3,"},{"Start":"02:12.935 ","End":"02:18.080","Text":"which means that 0 has a geometric multiplicity of 3."},{"Start":"02:18.080 ","End":"02:21.590","Text":"Because the geometric multiplicity is the same as"},{"Start":"02:21.590 ","End":"02:27.580","Text":"the dimension of the eigenspace for that particular eigenvalue."},{"Start":"02:27.580 ","End":"02:30.095","Text":"That\u0027s Part B."},{"Start":"02:30.095 ","End":"02:36.350","Text":"Now, the geometric multiplicity of the eigenvalue 0 is either 3 or 4."},{"Start":"02:36.350 ","End":"02:38.095","Text":"That\u0027s what we have to show."},{"Start":"02:38.095 ","End":"02:46.025","Text":"Let mu be the algebraic multiplicity and Gamma, the geometric multiplicity."},{"Start":"02:46.025 ","End":"02:49.730","Text":"Is like a Greek M and a Greek G. 
G for geometric,"},{"Start":"02:49.730 ","End":"02:52.450","Text":"M for multiplicity I suppose."},{"Start":"02:52.450 ","End":"02:56.010","Text":"Respectively of the matrix B."},{"Start":"02:56.010 ","End":"02:59.510","Text":"We always have that the algebraic is bigger or equal to"},{"Start":"02:59.510 ","End":"03:03.860","Text":"the geometric multiplicity and it\u0027s always less than or equal to the size,"},{"Start":"03:03.860 ","End":"03:06.170","Text":"the order of the matrix."},{"Start":"03:06.170 ","End":"03:08.095","Text":"Here n is 4."},{"Start":"03:08.095 ","End":"03:10.170","Text":"Since gamma is 3,"},{"Start":"03:10.170 ","End":"03:13.995","Text":"we have the mu is between 3 and 4."},{"Start":"03:13.995 ","End":"03:20.690","Text":"The only integer between and including is either 3 or 4; it\u0027s 2 possibilities."},{"Start":"03:20.690 ","End":"03:24.860","Text":"Now it turns out that both these possibilities can occur."},{"Start":"03:24.860 ","End":"03:27.575","Text":"We can\u0027t rule out any further."},{"Start":"03:27.575 ","End":"03:34.940","Text":"Here\u0027s an example of a 4 by 4 matrix which has a rank of 1."},{"Start":"03:34.940 ","End":"03:37.645","Text":"Clearly in each case the rank is 1."},{"Start":"03:37.645 ","End":"03:43.490","Text":"Here, the characteristic polynomial is x cubed, x minus 1."},{"Start":"03:43.490 ","End":"03:45.845","Text":"0 is a triple root,"},{"Start":"03:45.845 ","End":"03:49.205","Text":"means it has algebraic multiplicity 3."},{"Start":"03:49.205 ","End":"03:54.290","Text":"Here, the characteristic polynomial comes out to be exactly x to"},{"Start":"03:54.290 ","End":"03:59.535","Text":"the 4th meaning 0 is a root of degree 4 of that polynomial."},{"Start":"03:59.535 ","End":"04:02.570","Text":"That\u0027s the algebraic multiplicity here."},{"Start":"04:02.570 ","End":"04:05.635","Text":"We can\u0027t say. 
It could be 3 it could be 4."},{"Start":"04:05.635 ","End":"04:07.760","Text":"Next, Part D,"},{"Start":"04:07.760 ","End":"04:11.120","Text":"we have to show that B has at most 2 eigenvalues."},{"Start":"04:11.120 ","End":"04:13.745","Text":"Well, we know that 0 is an eigenvalue."},{"Start":"04:13.745 ","End":"04:17.240","Text":"Most have 1 more besides the 0."},{"Start":"04:17.240 ","End":"04:24.095","Text":"Now, here we say that the algebraic multiplicity of 0 is 3 or 4."},{"Start":"04:24.095 ","End":"04:26.094","Text":"Let\u0027s distinguish the 2 cases."},{"Start":"04:26.094 ","End":"04:28.560","Text":"First, the case when it\u0027s 4."},{"Start":"04:28.560 ","End":"04:33.890","Text":"Then we know that the characteristic polynomial is x to the 4th."},{"Start":"04:33.890 ","End":"04:36.560","Text":"That\u0027s in fact what we got in this example,"},{"Start":"04:36.560 ","End":"04:40.365","Text":"that we got x to the 4th as the characteristic polynomial."},{"Start":"04:40.365 ","End":"04:46.145","Text":"If the algebraic multiplicity of 0 is 3,"},{"Start":"04:46.145 ","End":"04:48.254","Text":"as it was here,"},{"Start":"04:48.254 ","End":"04:52.450","Text":"then we know that the characteristic polynomial is divisible by x cubed."},{"Start":"04:52.450 ","End":"04:56.200","Text":"Here we said it was x minus 1 times x cubed,"},{"Start":"04:56.200 ","End":"04:59.370","Text":"but it\u0027s not divisible by x to the 4th."},{"Start":"04:59.370 ","End":"05:03.880","Text":"Our characteristic polynomial is x cubed times x minus something,"},{"Start":"05:03.880 ","End":"05:06.385","Text":"and this something is not 0."},{"Start":"05:06.385 ","End":"05:08.800","Text":"There\u0027s 2 possibilities."},{"Start":"05:08.800 ","End":"05:10.825","Text":"In the first case,"},{"Start":"05:10.825 ","End":"05:13.765","Text":"the eigenvalue is just 0,"},{"Start":"05:13.765 ","End":"05:19.400","Text":"and the other case could be 0 or a, 2 eigenvalues."},{"Start":"05:19.400 ","End":"05:23.730","Text":"That\u0027s Part D. 
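The two 4x4 rank-1 matrices mentioned in part (c) are shown on screen; the ones below are stand-ins with the same behaviour, and they also illustrate parts (b) and (e): the geometric multiplicity of 0 is 3 in both cases, while the algebraic multiplicity is 3 for the first (whose nonzero eigenvalue equals its trace) and 4 for the second.

```python
import numpy as np

B1 = np.outer([1, 0, 0, 0], [1, 1, 1, 1]).astype(float)   # rank 1, trace 1
B2 = np.zeros((4, 4)); B2[0, 1] = 1.0                      # rank 1, trace 0

for B in (B1, B2):
    print(np.linalg.matrix_rank(B),           # 1
          4 - np.linalg.matrix_rank(B),       # 3 = geometric multiplicity of 0
          np.round(np.poly(B), 10))           # characteristic polynomial coefficients

# B1: x^3 (x - 1) -> algebraic multiplicity of 0 is 3; nonzero eigenvalue 1 = trace(B1)
# B2: x^4         -> algebraic multiplicity of 0 is 4
```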
Finally,"},{"Start":"05:23.730 ","End":"05:27.580","Text":"Part E says that if lambda which is not 0,"},{"Start":"05:27.580 ","End":"05:29.425","Text":"is an eigenvalue of B."},{"Start":"05:29.425 ","End":"05:33.320","Text":"That\u0027s like in Part D with are these 2 cases,"},{"Start":"05:33.320 ","End":"05:37.565","Text":"either 0 or 0 and a, well, this is like the a."},{"Start":"05:37.565 ","End":"05:42.380","Text":"We\u0027re going to show that in the second case that other eigenvalue,"},{"Start":"05:42.380 ","End":"05:45.575","Text":"which is not 0, is equal to the trace of B."},{"Start":"05:45.575 ","End":"05:46.685","Text":"From the above,"},{"Start":"05:46.685 ","End":"05:49.355","Text":"or in the case where P of x is x cubed,"},{"Start":"05:49.355 ","End":"05:52.345","Text":"we said x minus a equals x minus lambda,"},{"Start":"05:52.345 ","End":"05:55.835","Text":"which is x to the 4th minus lambda x cubed."},{"Start":"05:55.835 ","End":"06:02.345","Text":"Now, we know from a previous exercise that if we have a square n by n matrix,"},{"Start":"06:02.345 ","End":"06:04.970","Text":"then its characteristic polynomial,"},{"Start":"06:04.970 ","End":"06:07.235","Text":"if we write it in this form,"},{"Start":"06:07.235 ","End":"06:13.010","Text":"has the property that this coefficient is minus the trace,"},{"Start":"06:13.010 ","End":"06:16.835","Text":"and the last coefficient is the determinant."},{"Start":"06:16.835 ","End":"06:18.050","Text":"Anyway, don\u0027t need that last bit."},{"Start":"06:18.050 ","End":"06:23.485","Text":"We just need the fact that this 1 here is minus the trace of A."},{"Start":"06:23.485 ","End":"06:25.305","Text":"Now in our case,"},{"Start":"06:25.305 ","End":"06:27.525","Text":"we don\u0027t have A, we have B."},{"Start":"06:27.525 ","End":"06:30.910","Text":"Instead of n, we have 4."},{"Start":"06:30.910 ","End":"06:39.710","Text":"So we just have to substitute and say that minus the trace of B is minus lambda."},{"Start":"06:39.710 ","End":"06:41.060","Text":"Throw out the minuses."},{"Start":"06:41.060 ","End":"06:43.475","Text":"There should be a minus here and a minus here."},{"Start":"06:43.475 ","End":"06:44.900","Text":"Dispose of those."},{"Start":"06:44.900 ","End":"06:46.835","Text":"Got lambda is trace B."},{"Start":"06:46.835 ","End":"06:48.275","Text":"That\u0027s what we wanted to show,"},{"Start":"06:48.275 ","End":"06:51.660","Text":"and this is the last part. We\u0027re done."}],"ID":25749},{"Watched":false,"Name":"Exercise 15","Duration":"3m 54s","ChapterTopicVideoID":24837,"CourseChapterTopicPlaylistID":7320,"HasSubtitles":true,"ThumbnailPath":null,"UploadDate":null,"DurationForVideoObject":null,"Description":null,"MetaTitle":null,"MetaDescription":null,"Canonical":null,"VideoComments":[],"Subtitles":[{"Start":"00:00.000 ","End":"00:02.700","Text":"This exercise might seem familiar."},{"Start":"00:02.700 ","End":"00:07.605","Text":"Some of its pieces were covered in previous exercises, but never mind."},{"Start":"00:07.605 ","End":"00:13.830","Text":"B is a square matrix of order n and its rank is k,"},{"Start":"00:13.830 ","End":"00:17.175","Text":"which is strictly less than n. A,"},{"Start":"00:17.175 ","End":"00:20.490","Text":"we have to prove that 0 is an eigenvalue of B."},{"Start":"00:20.490 ","End":"00:25.770","Text":"Part b, we have to prove that the geometric multiplicity of this eigenvalue"},{"Start":"00:25.770 ","End":"00:31.140","Text":"0 is exactly n minus k. 
In part c,"},{"Start":"00:31.140 ","End":"00:34.110","Text":"we have to say what values are possible for"},{"Start":"00:34.110 ","End":"00:38.980","Text":"the algebraic multiplicity of the eigenvalue 0."},{"Start":"00:39.710 ","End":"00:44.820","Text":"Start with a to prove that 0 is an eigenvalue."},{"Start":"00:44.820 ","End":"00:52.990","Text":"0 is an eigenvalue if and only if the matrix is non-invertible."},{"Start":"00:52.990 ","End":"00:57.710","Text":"This is true if and only if the rank is less than"},{"Start":"00:57.710 ","End":"01:06.905","Text":"n. The rank is less than n so it\u0027s non-invertible so 0 is an eigenvalue,"},{"Start":"01:06.905 ","End":"01:09.440","Text":"and that proves part a. Yeah,"},{"Start":"01:09.440 ","End":"01:11.570","Text":"this line just repeats what I said."},{"Start":"01:11.570 ","End":"01:16.190","Text":"However, there is another way of doing this might be interested."},{"Start":"01:16.190 ","End":"01:18.170","Text":"Using the rank-nullity theorem."},{"Start":"01:18.170 ","End":"01:21.470","Text":"We can say that the rank of B plus the nullity of"},{"Start":"01:21.470 ","End":"01:25.970","Text":"B is n. Nullity being the dimension of the kernel."},{"Start":"01:25.970 ","End":"01:31.270","Text":"If we take this minus this we\u0027ll get this dimension."},{"Start":"01:31.270 ","End":"01:35.030","Text":"This is n minus k and this is positive."},{"Start":"01:35.030 ","End":"01:38.735","Text":"If the kernel has a positive dimension,"},{"Start":"01:38.735 ","End":"01:40.700","Text":"it\u0027s not just 0,"},{"Start":"01:40.700 ","End":"01:45.400","Text":"it contains some non-zero vector."},{"Start":"01:45.400 ","End":"01:48.240","Text":"If this v is in the kernel,"},{"Start":"01:48.240 ","End":"01:50.344","Text":"then Bv is 0,"},{"Start":"01:50.344 ","End":"01:54.980","Text":"which means it\u0027s an eigenvalue because we can take Lambda equals 0 and we get Bv equals"},{"Start":"01:54.980 ","End":"02:01.940","Text":"Lambda v. That\u0027s just an alternative for a and now onto b,"},{"Start":"02:01.940 ","End":"02:08.060","Text":"we have to show that the geometric multiplicity of this eigenvalue 0 is n minus k. Now,"},{"Start":"02:08.060 ","End":"02:12.710","Text":"the eigenspace of this eigenvalue 0 is"},{"Start":"02:12.710 ","End":"02:18.635","Text":"exactly the kernel and the kernel has dimension n minus k,"},{"Start":"02:18.635 ","End":"02:20.535","Text":"as we showed here,"},{"Start":"02:20.535 ","End":"02:26.450","Text":"and so 0 has geometric multiplicity n minus k. 
This is the definition of"},{"Start":"02:26.450 ","End":"02:33.995","Text":"multiplicity it\u0027s the dimension of the eigenspace corresponding to the eigenvalue."},{"Start":"02:33.995 ","End":"02:36.445","Text":"That\u0027s part b."},{"Start":"02:36.445 ","End":"02:38.480","Text":"Now lastly part c,"},{"Start":"02:38.480 ","End":"02:43.325","Text":"what are the possible values for the algebraic multiplicity of 0?"},{"Start":"02:43.325 ","End":"02:45.500","Text":"Let\u0027s give them names."},{"Start":"02:45.500 ","End":"02:47.870","Text":"Often use a letter mu for"},{"Start":"02:47.870 ","End":"02:54.605","Text":"the algebraic multiplicity and gamma for the geometric multiplicity."},{"Start":"02:54.605 ","End":"02:58.310","Text":"These are the multiplicities of 0."},{"Start":"02:58.310 ","End":"03:04.265","Text":"We always know that the algebraic is bigger than the geometric multiplicity."},{"Start":"03:04.265 ","End":"03:08.060","Text":"Of course, this is less than or equal to the order of the matrix."},{"Start":"03:08.060 ","End":"03:09.260","Text":"We have this inequality,"},{"Start":"03:09.260 ","End":"03:17.285","Text":"so n minus k and n some which the possibilities for the algebraic multiplicity."},{"Start":"03:17.285 ","End":"03:21.260","Text":"Now, we can\u0027t in general pin it down any further."},{"Start":"03:21.260 ","End":"03:24.350","Text":"In fact, it could be any of the values between"},{"Start":"03:24.350 ","End":"03:27.590","Text":"this and this and I\u0027ll just leave you with an example."},{"Start":"03:27.590 ","End":"03:30.395","Text":"Suppose n is 4 and k is 2,"},{"Start":"03:30.395 ","End":"03:36.590","Text":"then all we can say about the algebraic multiplicity of 0 is that it\u0027s between 2 and 4."},{"Start":"03:36.590 ","End":"03:38.875","Text":"In other words, it\u0027s 2,3, or 4."},{"Start":"03:38.875 ","End":"03:41.285","Text":"Actually all 3 are possible."},{"Start":"03:41.285 ","End":"03:45.380","Text":"I\u0027ll leave you with this example to look at if you want to examine it."},{"Start":"03:45.380 ","End":"03:47.599","Text":"I won\u0027t go into this in detail,"},{"Start":"03:47.599 ","End":"03:51.650","Text":"but I just wanted to say that we can\u0027t do any better than this estimate."},{"Start":"03:51.650 ","End":"03:54.780","Text":"That concludes this exercise."}],"ID":25750}],"Thumbnail":null,"ID":7320},{"Name":"Matrix Diagonalization","TopicPlaylistFirstVideoID":0,"Duration":null,"Videos":[{"Watched":false,"Name":"Motivation for matrix diagonalization","Duration":"15m 13s","ChapterTopicVideoID":24862,"CourseChapterTopicPlaylistID":118359,"HasSubtitles":true,"ThumbnailPath":null,"UploadDate":null,"DurationForVideoObject":null,"Description":null,"MetaTitle":null,"MetaDescription":null,"Canonical":null,"VideoComments":[],"Subtitles":[{"Start":"00:00.000 ","End":"00:02.609","Text":"This is an optional topic."},{"Start":"00:02.609 ","End":"00:04.380","Text":"I\u0027ve called it the theory,"},{"Start":"00:04.380 ","End":"00:09.270","Text":"but it\u0027s really like background or introduction or motivation for"},{"Start":"00:09.270 ","End":"00:16.035","Text":"the concept of matrix diagonalization just for enrichment."},{"Start":"00:16.035 ","End":"00:23.070","Text":"Let\u0027s go back first of all and discuss some properties of square matrices."},{"Start":"00:23.070 ","End":"00:25.380","Text":"You may have come across these, maybe not,"},{"Start":"00:25.380 ","End":"00:27.480","Text":"but assume you haven\u0027t."},{"Start":"00:27.480 
","End":"00:29.279","Text":"Some additional properties."},{"Start":"00:29.279 ","End":"00:33.150","Text":"The first 1, if you multiply a matrix by"},{"Start":"00:33.150 ","End":"00:37.720","Text":"a column vector which has only a single 1 and the rest 0s,"},{"Start":"00:37.720 ","End":"00:40.010","Text":"and the 1 is in the kth place,"},{"Start":"00:40.010 ","End":"00:42.860","Text":"then we get the Kth column of the matrix."},{"Start":"00:42.860 ","End":"00:44.945","Text":"Here\u0027s some examples."},{"Start":"00:44.945 ","End":"00:52.370","Text":"Here\u0027s a square matrix and here\u0027s a column vector with a single 1 and the rest 0s."},{"Start":"00:52.370 ","End":"00:54.500","Text":"The 1 is in the first place,"},{"Start":"00:54.500 ","End":"00:57.475","Text":"so the result is the first column."},{"Start":"00:57.475 ","End":"01:00.525","Text":"If we put the 1 in the 2nd place,"},{"Start":"01:00.525 ","End":"01:03.585","Text":"then the result would be the 2nd column,"},{"Start":"01:03.585 ","End":"01:07.080","Text":"and the 3rd place would give us the 3rd column."},{"Start":"01:07.080 ","End":"01:09.860","Text":"Really this example says it all,"},{"Start":"01:09.860 ","End":"01:11.330","Text":"but let\u0027s give some more examples."},{"Start":"01:11.330 ","End":"01:12.770","Text":"A 2 by 2,"},{"Start":"01:12.770 ","End":"01:14.585","Text":"here we have 0, 1,"},{"Start":"01:14.585 ","End":"01:16.130","Text":"1 is in the 2nd place,"},{"Start":"01:16.130 ","End":"01:19.145","Text":"so the result will be the second column, 2, 4."},{"Start":"01:19.145 ","End":"01:23.540","Text":"Of course you can check these by doing matrix multiplication."},{"Start":"01:23.540 ","End":"01:27.770","Text":"Let\u0027s go for a 4 by 4 example."},{"Start":"01:27.770 ","End":"01:31.485","Text":"Here we have a 1 in the 3rd place,"},{"Start":"01:31.485 ","End":"01:35.900","Text":"so we take the 3rd column and this is the result"},{"Start":"01:35.900 ","End":"01:42.500","Text":"K. 
Now this actually is true also for other matrices besides square matrices."},{"Start":"01:42.500 ","End":"01:49.390","Text":"For example, if we have here a 1 in the 3rd place out of 4,"},{"Start":"01:49.390 ","End":"01:52.560","Text":"then here we have the 3rd column out of 4."},{"Start":"01:52.560 ","End":"01:58.805","Text":"The answer is 3,7,2 because the matrices must be compatible for multiplication."},{"Start":"01:58.805 ","End":"02:02.500","Text":"I want to generalize this a little bit."},{"Start":"02:02.500 ","End":"02:06.300","Text":"If you don\u0027t have 1, we have another number."},{"Start":"02:06.300 ","End":"02:08.890","Text":"Well, explained with an example,"},{"Start":"02:08.890 ","End":"02:11.770","Text":"suppose we had a 10 here and the rest 0s,"},{"Start":"02:11.770 ","End":"02:13.930","Text":"the 10 is in the 1st place."},{"Start":"02:13.930 ","End":"02:18.250","Text":"We take the first column and you multiply it by the 10,"},{"Start":"02:18.250 ","End":"02:22.525","Text":"so the answer is 10,14,70 doesn\u0027t matter the actual answer."},{"Start":"02:22.525 ","End":"02:25.210","Text":"Here we have a single non 0,"},{"Start":"02:25.210 ","End":"02:27.770","Text":"it\u0027s minus 4 in the 2nd place."},{"Start":"02:27.770 ","End":"02:32.335","Text":"Take the 2nd column and multiply it by minus 4,"},{"Start":"02:32.335 ","End":"02:36.160","Text":"and here we have a 21 in the 3rd place."},{"Start":"02:36.160 ","End":"02:40.240","Text":"We take the 3rd column and multiply it by 21."},{"Start":"02:40.240 ","End":"02:41.950","Text":"Now the next property,"},{"Start":"02:41.950 ","End":"02:44.805","Text":"I\u0027ll bring the examples first and then we\u0027ll state it."},{"Start":"02:44.805 ","End":"02:47.090","Text":"I\u0027m just multiplying 2 matrices here."},{"Start":"02:47.090 ","End":"02:52.945","Text":"This times, this gives me 1 times 5 plus 2 times 7 here and so on,"},{"Start":"02:52.945 ","End":"02:56.175","Text":"and it comes out to be 19, etc."},{"Start":"02:56.175 ","End":"02:58.290","Text":"This is not the example yet."},{"Start":"02:58.290 ","End":"03:00.815","Text":"I want to show you a different way of looking at this."},{"Start":"03:00.815 ","End":"03:04.640","Text":"We can think of the second matrix as being 2 columns,"},{"Start":"03:04.640 ","End":"03:07.325","Text":"the 5,7 column and the 6,8 column."},{"Start":"03:07.325 ","End":"03:13.085","Text":"What we can do is take this matrix and multiply it by this column separately,"},{"Start":"03:13.085 ","End":"03:17.285","Text":"and then take this matrix and multiply it by this column."},{"Start":"03:17.285 ","End":"03:21.100","Text":"This just means separate into columns,"},{"Start":"03:21.100 ","End":"03:23.805","Text":"and what we get is,"},{"Start":"03:23.805 ","End":"03:26.280","Text":"this is 1 times 5 plus 2 times 7,"},{"Start":"03:26.280 ","End":"03:29.685","Text":"and then 3 times 5 plus 4 times 7,"},{"Start":"03:29.685 ","End":"03:31.785","Text":"we get the same thing."},{"Start":"03:31.785 ","End":"03:37.505","Text":"What we\u0027re saying is that you can do matrix multiplication column by column."},{"Start":"03:37.505 ","End":"03:41.990","Text":"The matrix on the right just take each column separately and multiply"},{"Start":"03:41.990 ","End":"03:47.315","Text":"the left matrix by this column and then do it column by column is another example."},{"Start":"03:47.315 ","End":"03:51.440","Text":"Multiply this by this which has 3,3 columns."},{"Start":"03:51.440 ","End":"03:55.070","Text":"We take this 
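Both column properties are easy to verify with the 2x2 numbers used in the clip ([[1, 2], [3, 4]] multiplied by [[5, 6], [7, 8]]); the sketch below checks that A times a standard basis column picks out a column of A (scaled if the entry is not 1), and that a product can be built column by column.

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0]])
e2 = np.array([0.0, 1.0])
print(A @ e2)             # [2. 4.] -> the 2nd column of A
print(A @ (10 * e2))      # [20. 40.] -> 10 times the 2nd column

B = np.array([[5.0, 6.0],
              [7.0, 8.0]])
C = A @ B                 # [[19. 22.], [43. 50.]]
print(np.allclose(C[:, 0], A @ B[:, 0]),
      np.allclose(C[:, 1], A @ B[:, 1]))   # True True: column k of AB is A times column k of B
```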
matrix and multiply by the 1st column,"},{"Start":"03:55.070 ","End":"03:58.730","Text":"then we\u0027ll multiply it by the 2nd column and then by the 3rd column,"},{"Start":"03:58.730 ","End":"04:00.805","Text":"and I won\u0027t continue past this."},{"Start":"04:00.805 ","End":"04:02.355","Text":"That\u0027s the idea."},{"Start":"04:02.355 ","End":"04:06.020","Text":"If you multiply a matrix A by a matrix B,"},{"Start":"04:06.020 ","End":"04:10.580","Text":"the result matrix C is such that the kth column is"},{"Start":"04:10.580 ","End":"04:15.880","Text":"the product of A by the column in the kth place of B."},{"Start":"04:15.880 ","End":"04:17.930","Text":"Now they don\u0027t actually have to be"},{"Start":"04:17.930 ","End":"04:22.355","Text":"square matrices, just as long as they can be multiplied."},{"Start":"04:22.355 ","End":"04:24.510","Text":"You can write it symbolically:"},{"Start":"04:24.510 ","End":"04:29.705","Text":"A times another matrix which is split into columns v1 to vn."},{"Start":"04:29.705 ","End":"04:33.425","Text":"Take matrix A multiplied by v1,"},{"Start":"04:33.425 ","End":"04:37.925","Text":"matrix A by v2 and so on for all the columns."},{"Start":"04:37.925 ","End":"04:42.870","Text":"Continuing with this introduction to matrix diagonalization,"},{"Start":"04:42.870 ","End":"04:45.910","Text":"most people ask, what is it good for?"},{"Start":"04:45.910 ","End":"04:51.740","Text":"1 good example is to raise a square matrix to a power."},{"Start":"04:51.740 ","End":"04:55.910","Text":"For example, suppose we have a square matrix,"},{"Start":"04:55.910 ","End":"05:01.430","Text":"this, and we want to calculate A to the 10th."},{"Start":"05:01.430 ","End":"05:04.895","Text":"In principle, you could just multiply it out."},{"Start":"05:04.895 ","End":"05:07.310","Text":"Just take this times itself, times itself,"},{"Start":"05:07.310 ","End":"05:10.355","Text":"and you have 10 factors,"},{"Start":"05:10.355 ","End":"05:13.070","Text":"which would be 9 multiplications."},{"Start":"05:13.070 ","End":"05:15.185","Text":"It\u0027s not a very pleasant task,"},{"Start":"05:15.185 ","End":"05:19.190","Text":"but it is doable. But suppose I now say,"},{"Start":"05:19.190 ","End":"05:22.100","Text":"what about A to the power of 400?"},{"Start":"05:22.100 ","End":"05:26.600","Text":"I don\u0027t think you\u0027d really want to multiply it out longhand,"},{"Start":"05:26.600 ","End":"05:30.250","Text":"so we\u0027re going to look for a better way of doing that."},{"Start":"05:30.250 ","End":"05:34.580","Text":"The idea is to first look for matrices which we"},{"Start":"05:34.580 ","End":"05:39.230","Text":"can easily raise to the power of 10 or 400 or whatever,"},{"Start":"05:39.230 ","End":"05:43.280","Text":"and something that comes to mind is diagonal matrices."},{"Start":"05:43.280 ","End":"05:47.150","Text":"For example, if we have a 3 by 3 diagonal matrix D with diagonal entries a, b,"},{"Start":"05:47.150 ","End":"05:53.550","Text":"and c, then D squared would be this times itself."},{"Start":"05:53.550 ","End":"05:55.940","Text":"Whenever you have 2 diagonal matrices,"},{"Start":"05:55.940 ","End":"05:59.065","Text":"you just multiply corresponding entries."},{"Start":"05:59.065 ","End":"06:03.735","Text":"a times a is a squared, then b squared, then c squared."},{"Start":"06:03.735 ","End":"06:08.630","Text":"Continuing, D cubed will give us a cubed, b cubed,"},{"Start":"06:08.630 ","End":"06:11.555","Text":"c cubed, and in general,"},{"Start":"06:11.555 ","End":"06:13.745","Text":"D to the power of 
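The two column facts just described, that a vector with a single nonzero entry in the kth place picks out the kth column (scaled), and that a matrix product can be computed one column of the right-hand factor at a time, are easy to check numerically. A minimal NumPy sketch, using the 2-by-2 pair from the clip (the product that starts with 1 times 5 plus 2 times 7 = 19); the "10 in the first place" check reuses the same small matrix rather than the larger one on the board.

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0]])
B = np.array([[5.0, 6.0],
              [7.0, 8.0]])

# A single nonzero entry in the k-th place scales the k-th column of A.
v = np.array([10.0, 0.0])                 # 10 in the 1st place, 0 elsewhere
assert np.allclose(A @ v, 10 * A[:, 0])

# Column-by-column multiplication:
# the k-th column of A @ B is A times the k-th column of B.
AB = A @ B
for k in range(B.shape[1]):
    assert np.allclose(AB[:, k], A @ B[:, k])

print(AB)   # [[19. 22.] [43. 50.]] -- 19 is the 1*5 + 2*7 entry from the clip
```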
n,"},{"Start":"06:13.745 ","End":"06:15.995","Text":"by induction you could prove it."},{"Start":"06:15.995 ","End":"06:20.400","Text":"It\u0027s just clear that we have A to the n, B to the n,"},{"Start":"06:20.400 ","End":"06:22.849","Text":"C to the n. With diagonal matrices,"},{"Start":"06:22.849 ","End":"06:25.550","Text":"we can easily raise them to a power."},{"Start":"06:25.550 ","End":"06:27.410","Text":"Now how does this help?"},{"Start":"06:27.410 ","End":"06:29.555","Text":"Our matrix is not diagonal."},{"Start":"06:29.555 ","End":"06:36.890","Text":"Suppose we could find an invertible matrix P such that A times P is P times D,"},{"Start":"06:36.890 ","End":"06:39.325","Text":"seems a strange thing, but you\u0027ll see."},{"Start":"06:39.325 ","End":"06:45.170","Text":"If we found a matrix P such that that holds where D is a diagonal matrix,"},{"Start":"06:45.170 ","End":"06:50.945","Text":"then we would get that A equals PDP inverse."},{"Start":"06:50.945 ","End":"06:59.990","Text":"A is similar to D. Then if we try to do A to the power of 10 each time instead of A,"},{"Start":"06:59.990 ","End":"07:03.390","Text":"you could put PDP inverse."},{"Start":"07:04.610 ","End":"07:08.120","Text":"Every time you get P inverse P,"},{"Start":"07:08.120 ","End":"07:10.355","Text":"that\u0027s the identity matrix,"},{"Start":"07:10.355 ","End":"07:13.365","Text":"those would cancel out."},{"Start":"07:13.365 ","End":"07:18.440","Text":"Well, just putting eyes in the middle here and you can just throw out the identities,"},{"Start":"07:18.440 ","End":"07:22.160","Text":"and we have D to the tenth."},{"Start":"07:22.160 ","End":"07:28.400","Text":"All in all, we have P times D to the 10th, P inverse."},{"Start":"07:28.400 ","End":"07:32.950","Text":"If instead of 10 you put n by induction,"},{"Start":"07:32.950 ","End":"07:39.880","Text":"you can generalize and say A to the nth is P times D to the n P inverse."},{"Start":"07:39.880 ","End":"07:44.995","Text":"A to the power of 4,100, let\u0027s say,"},{"Start":"07:44.995 ","End":"07:50.050","Text":"will be PD to the power of 4,100 P inverse."},{"Start":"07:50.050 ","End":"07:52.660","Text":"If D is a, b, c,"},{"Start":"07:52.660 ","End":"08:01.089","Text":"then D to the power of 4,100 is just over diagonals raised to that power."},{"Start":"08:01.089 ","End":"08:03.955","Text":"Once you have that computed,"},{"Start":"08:03.955 ","End":"08:11.305","Text":"you get A to the 4,100 by multiplying on the left by P and on the right by P inverse."},{"Start":"08:11.305 ","End":"08:13.075","Text":"That\u0027s not that much work,"},{"Start":"08:13.075 ","End":"08:16.720","Text":"but we still haven\u0027t got any idea of how to find"},{"Start":"08:16.720 ","End":"08:20.605","Text":"such a matrix P. That\u0027s the key to doing this."},{"Start":"08:20.605 ","End":"08:22.900","Text":"Let\u0027s start with the definition."},{"Start":"08:22.900 ","End":"08:30.745","Text":"We can\u0027t always find such P and D. 
But if it so happens that these P and D exists,"},{"Start":"08:30.745 ","End":"08:35.005","Text":"then A is said to be diagonalizable."},{"Start":"08:35.005 ","End":"08:36.685","Text":"It\u0027s not diagonal,"},{"Start":"08:36.685 ","End":"08:41.620","Text":"it\u0027s similar to a diagonal matrix D if you multiply"},{"Start":"08:41.620 ","End":"08:47.065","Text":"on the left and right by P and P inverse."},{"Start":"08:47.065 ","End":"08:52.690","Text":"Now, assuming A is diagonalizable and in principle we know that P and D exist,"},{"Start":"08:52.690 ","End":"08:54.925","Text":"how do we find them?"},{"Start":"08:54.925 ","End":"08:57.970","Text":"Let\u0027s do some investigation and for simplicity,"},{"Start":"08:57.970 ","End":"09:01.165","Text":"we\u0027ll take 3 by 3 matrices."},{"Start":"09:01.165 ","End":"09:07.720","Text":"If we write P in column form as 3 columns or n columns in general,"},{"Start":"09:07.720 ","End":"09:14.439","Text":"then AP equals PD can be written using the column theory that we had earlier."},{"Start":"09:14.439 ","End":"09:22.165","Text":"Start by breaking up P into columns like so and D we can express as X_1,"},{"Start":"09:22.165 ","End":"09:25.195","Text":"X_2 up to X_N on the diagonal."},{"Start":"09:25.195 ","End":"09:27.610","Text":"Then, like what we said earlier,"},{"Start":"09:27.610 ","End":"09:31.930","Text":"we can take matrix A and multiply it by column 1,"},{"Start":"09:31.930 ","End":"09:35.455","Text":"which is v1, multiply it by each of the columns."},{"Start":"09:35.455 ","End":"09:40.585","Text":"Here also, this matrix P times the first column."},{"Start":"09:40.585 ","End":"09:45.445","Text":"This is a column with all 0s except at this place."},{"Start":"09:45.445 ","End":"09:52.390","Text":"Like we saw before, we take this number and put it in front of this column."},{"Start":"09:52.390 ","End":"09:58.525","Text":"This is a scalar multiplied by the column and similarly for X_2 and X_3."},{"Start":"09:58.525 ","End":"10:04.265","Text":"That means that each of these columns equals each of these columns."},{"Start":"10:04.265 ","End":"10:07.140","Text":"If I is 1, 2, or 3,"},{"Start":"10:07.140 ","End":"10:08.985","Text":"we have a times V_i,"},{"Start":"10:08.985 ","End":"10:11.970","Text":"like AV_2 is X_2 V_2."},{"Start":"10:11.970 ","End":"10:16.070","Text":"Similarly for I equals 1 and 3."},{"Start":"10:16.070 ","End":"10:22.810","Text":"By the way, these Vs are not 0 because P is invertible,"},{"Start":"10:22.810 ","End":"10:25.225","Text":"so it can\u0027t have a 0 column."},{"Start":"10:25.225 ","End":"10:31.780","Text":"From this, we get the motivation to define eigenvalues and eigenvectors from this here,"},{"Start":"10:31.780 ","End":"10:33.295","Text":"this gives us the inspiration."},{"Start":"10:33.295 ","End":"10:39.385","Text":"If A times vector V is some scalar X times vector V,"},{"Start":"10:39.385 ","End":"10:44.020","Text":"then V is called an eigenvector of A,"},{"Start":"10:44.020 ","End":"10:47.500","Text":"and X is the corresponding eigenvalue."},{"Start":"10:47.500 ","End":"10:49.810","Text":"I\u0027m not sure what Eigen comes from."},{"Start":"10:49.810 ","End":"10:53.739","Text":"It\u0027s from German and I\u0027m not sure exactly the meaning."},{"Start":"10:53.739 ","End":"10:59.455","Text":"The columns of P will then be eigenvectors of A"},{"Start":"10:59.455 ","End":"11:05.305","Text":"and they will be linearly independent because P is invertible,"},{"Start":"11:05.305 ","End":"11:10.855","Text":"the columns 
of an invertible matrix are linearly independent."},{"Start":"11:10.855 ","End":"11:16.825","Text":"The entries on the diagonal of D are the corresponding eigenvalues."},{"Start":"11:16.825 ","End":"11:18.745","Text":"Like we saw here,"},{"Start":"11:18.745 ","End":"11:22.070","Text":"these eigenvalues are the diagonal."},{"Start":"11:23.100 ","End":"11:26.199","Text":"Now let\u0027s get back to this equation."},{"Start":"11:26.199 ","End":"11:32.169","Text":"Av equals xv and remember v is not 0."},{"Start":"11:32.169 ","End":"11:36.685","Text":"We can write this as xv minus Av equals 0,"},{"Start":"11:36.685 ","End":"11:40.240","Text":"the scalar x we could write as x times"},{"Start":"11:40.240 ","End":"11:44.140","Text":"the identity matrix times v. That would be the same thing."},{"Start":"11:44.140 ","End":"11:49.179","Text":"So we have a difference of 2 matrices times v is 0."},{"Start":"11:49.179 ","End":"11:53.620","Text":"Therefore, the determinant of this is 0."},{"Start":"11:53.620 ","End":"11:59.215","Text":"Because if a matrix multiplied by a nonzero vector is 0,"},{"Start":"11:59.215 ","End":"12:02.079","Text":"then its determinant is 0."},{"Start":"12:02.079 ","End":"12:07.300","Text":"This matrix xI minus A is called the characteristic matrix."},{"Start":"12:07.300 ","End":"12:11.860","Text":"The determinant involves a variable x,"},{"Start":"12:11.860 ","End":"12:14.260","Text":"it\u0027s called the characteristic polynomial."},{"Start":"12:14.260 ","End":"12:18.670","Text":"This expands into a polynomial of degree n,"},{"Start":"12:18.670 ","End":"12:21.640","Text":"3 in our case in x."},{"Start":"12:21.640 ","End":"12:27.040","Text":"Sometimes we write a subscript A to remind us what the name of the matrix was,"},{"Start":"12:27.040 ","End":"12:28.855","Text":"where we got the polynomial from."},{"Start":"12:28.855 ","End":"12:31.990","Text":"The characteristic equation is to"},{"Start":"12:31.990 ","End":"12:36.670","Text":"take the characteristic polynomial and set it equal to 0."},{"Start":"12:36.670 ","End":"12:40.240","Text":"The roots of this characteristic polynomial"},{"Start":"12:40.240 ","End":"12:42.640","Text":"or the solutions of the characteristic equation,"},{"Start":"12:42.640 ","End":"12:45.515","Text":"same thing, are called the eigenvalues."},{"Start":"12:45.515 ","End":"12:50.235","Text":"The eigenvectors for a particular eigenvalue Lambda,"},{"Start":"12:50.235 ","End":"12:53.780","Text":"Lambda is a common letter for eigenvalues."},{"Start":"12:53.780 ","End":"12:57.400","Text":"These are the vectors v which are not 0,"},{"Start":"12:57.400 ","End":"13:01.855","Text":"such that Lambda I minus Av is 0."},{"Start":"13:01.855 ","End":"13:05.635","Text":"It\u0027s like this, but x Lambda."},{"Start":"13:05.635 ","End":"13:14.560","Text":"These vectors V for a given Lambda form a subspace of R^n or R^3 in our case,"},{"Start":"13:14.560 ","End":"13:19.135","Text":"and it\u0027s actually called the Lambda eigenspace."},{"Start":"13:19.135 ","End":"13:21.805","Text":"Sometimes they use the term eigenspace, so yeah,"},{"Start":"13:21.805 ","End":"13:25.075","Text":"this vector space of"},{"Start":"13:25.075 ","End":"13:30.445","Text":"eigenvectors belonging to a particular Lambda is the Lambda eigenspace."},{"Start":"13:30.445 ","End":"13:36.265","Text":"The way we build P is what we do is for each eigenvalue,"},{"Start":"13:36.265 ","End":"13:41.380","Text":"we successively take a basis of this subspace of eigenvectors,"},{"Start":"13:41.380 
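The chain just described, form xI minus A, take its determinant to get the characteristic polynomial, set it to 0 and solve for the eigenvalues, then take the nullspace of lambda I minus A for each eigenvalue to get its eigenspace, can be run symbolically. A sketch with SymPy on a small 2-by-2 matrix of my own choosing, since the 3-by-3 from the clip is not reproduced in this transcript.

```python
import sympy as sp

x = sp.symbols('x')
A = sp.Matrix([[2, 1],
               [1, 2]])                      # arbitrary small example

char_matrix = x * sp.eye(2) - A              # characteristic matrix xI - A
char_poly = char_matrix.det()                # characteristic polynomial
print(sp.factor(char_poly))                  # (x - 1)*(x - 3)

eigenvalues = sp.solve(sp.Eq(char_poly, 0), x)   # characteristic equation
print(eigenvalues)                               # [1, 3]

for lam in eigenvalues:
    # The lambda-eigenspace is the nullspace of (lambda*I - A).
    basis = (lam * sp.eye(2) - A).nullspace()
    print(lam, basis)                            # 1 -> span{(-1, 1)}, 3 -> span{(1, 1)}
```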
","End":"13:43.240","Text":"considered as column vectors,"},{"Start":"13:43.240 ","End":"13:47.245","Text":"and they will be n of them all together."},{"Start":"13:47.245 ","End":"13:52.660","Text":"You go along each eigenvalue and it will have 1 or more eigenvectors and collect together"},{"Start":"13:52.660 ","End":"13:55.690","Text":"all the basis eigenvectors and put them"},{"Start":"13:55.690 ","End":"13:59.995","Text":"together so we have a set of column vectors and there should be n of them,"},{"Start":"13:59.995 ","End":"14:04.150","Text":"provided that A is diagonalizable that can be proved,"},{"Start":"14:04.150 ","End":"14:07.870","Text":"and from these n columns you build"},{"Start":"14:07.870 ","End":"14:12.445","Text":"the matrix P. It doesn\u0027t matter what order you take them in."},{"Start":"14:12.445 ","End":"14:17.785","Text":"P that we built this way will satisfy AP equals PD."},{"Start":"14:17.785 ","End":"14:20.455","Text":"I forgot to tell you what D is in a moment,"},{"Start":"14:20.455 ","End":"14:23.290","Text":"and it can be proven to be invertible."},{"Start":"14:23.290 ","End":"14:25.600","Text":"We build D, the diagonal matrix,"},{"Start":"14:25.600 ","End":"14:29.155","Text":"by taking the eigenvalues in the corresponding order."},{"Start":"14:29.155 ","End":"14:34.330","Text":"Meaning here in P we have eigenvectors in order."},{"Start":"14:34.330 ","End":"14:36.070","Text":"Each eigenvector corresponds to"},{"Start":"14:36.070 ","End":"14:40.989","Text":"an eigenvalue and you put them in the same order along the diagonal."},{"Start":"14:40.989 ","End":"14:42.430","Text":"That\u0027s how we get the D,"},{"Start":"14:42.430 ","End":"14:45.920","Text":"and I forgot to write that earlier."},{"Start":"14:46.260 ","End":"14:51.220","Text":"Eigenvalues and eigenvectors have many uses in mathematics other than that,"},{"Start":"14:51.220 ","End":"14:56.805","Text":"and we gave main example of raising a square matrix to a power."},{"Start":"14:56.805 ","End":"15:01.640","Text":"Another example would be an ordinary differential equations."},{"Start":"15:01.640 ","End":"15:05.270","Text":"It\u0027s used in solving systems of linear equations,"},{"Start":"15:05.270 ","End":"15:09.515","Text":"the eigenvalues and eigenvectors and diagonalization."},{"Start":"15:09.515 ","End":"15:13.710","Text":"That\u0027s it for now, for the introduction."}],"ID":25775},{"Watched":false,"Name":"Diagonalization of Matrices","Duration":"12m 39s","ChapterTopicVideoID":24863,"CourseChapterTopicPlaylistID":118359,"HasSubtitles":true,"ThumbnailPath":null,"UploadDate":null,"DurationForVideoObject":null,"Description":null,"MetaTitle":null,"MetaDescription":null,"Canonical":null,"VideoComments":[],"Subtitles":[{"Start":"00:00.000 ","End":"00:05.610","Text":"In this clip, we\u0027ll learn how to raise a square matrix to a power"},{"Start":"00:05.610 ","End":"00:10.575","Text":"or exponent and we\u0027ll learn some other things too while we\u0027re at it."},{"Start":"00:10.575 ","End":"00:12.660","Text":"Anyway, let\u0027s start with an example."},{"Start":"00:12.660 ","End":"00:14.190","Text":"He\u0027s a square matrix A,"},{"Start":"00:14.190 ","End":"00:17.475","Text":"a 3-by-3, we\u0027ve seen it before."},{"Start":"00:17.475 ","End":"00:23.565","Text":"Suppose you wanted to compute A to the power of some big number, say, 414."},{"Start":"00:23.565 ","End":"00:27.285","Text":"Ordinarily, if we didn\u0027t have any special techniques,"},{"Start":"00:27.285 ","End":"00:36.830","Text":"we just 
have to multiply it by itself 414 factors and that could take quite a while."},{"Start":"00:36.830 ","End":"00:39.920","Text":"But we\u0027re going to learn how to do it in"},{"Start":"00:39.920 ","End":"00:45.440","Text":"about half an hour and we\u0027re going to use eigenvalues and eigenvectors."},{"Start":"00:45.440 ","End":"00:48.320","Text":"But before we get started properly,"},{"Start":"00:48.320 ","End":"00:54.320","Text":"let\u0027s just note that there\u0027s a class of matrices which can be easily raised to a power;"},{"Start":"00:54.320 ","End":"00:57.515","Text":"these are the diagonal matrices."},{"Start":"00:57.515 ","End":"01:04.460","Text":"Take a general 3-by-3 matrix and if we want to raise it to the power of n,"},{"Start":"01:04.460 ","End":"01:06.890","Text":"where n is some natural number,"},{"Start":"01:06.890 ","End":"01:12.080","Text":"then it turns out that all you have to do is raise each of"},{"Start":"01:12.080 ","End":"01:17.960","Text":"the elements on the diagonal to that power n. This is actually very easy to prove,"},{"Start":"01:17.960 ","End":"01:19.825","Text":"but I shan\u0027t prove it."},{"Start":"01:19.825 ","End":"01:22.955","Text":"Of course, it works for any square matrix,"},{"Start":"01:22.955 ","End":"01:25.295","Text":"n by n, not just the 3-by-3."},{"Start":"01:25.295 ","End":"01:31.730","Text":"For instance, if I had this diagonal matrix and I want to raise it to the power of 200,"},{"Start":"01:31.730 ","End":"01:37.640","Text":"we just raise 4^200, 10^200, and 21^200."},{"Start":"01:37.640 ","End":"01:42.330","Text":"Of course, you wouldn\u0027t actually compute this,"},{"Start":"01:42.330 ","End":"01:45.540","Text":"this would be 1 with 200 zeros following it,"},{"Start":"01:45.540 ","End":"01:47.365","Text":"so we would leave it in this form."},{"Start":"01:47.365 ","End":"01:49.220","Text":"Now, that\u0027s all very well,"},{"Start":"01:49.220 ","End":"01:51.740","Text":"but our matrix isn\u0027t diagonal,"},{"Start":"01:51.740 ","End":"01:54.510","Text":"so what do we do?"},{"Start":"01:54.580 ","End":"01:57.980","Text":"Fortunately, there is a recipe,"},{"Start":"01:57.980 ","End":"02:00.845","Text":"a technique, method."},{"Start":"02:00.845 ","End":"02:04.890","Text":"Back to our given example,"},{"Start":"02:04.890 ","End":"02:12.510","Text":"the first step is to find the eigenvalues and eigenvectors of A."},{"Start":"02:12.610 ","End":"02:18.170","Text":"But we\u0027ve had this before and we\u0027ve solved it already,"},{"Start":"02:18.170 ","End":"02:20.299","Text":"so I\u0027m just bringing the results."},{"Start":"02:20.299 ","End":"02:24.030","Text":"The eigenvalues are 6, 2,"},{"Start":"02:24.030 ","End":"02:25.260","Text":"and minus 4,"},{"Start":"02:25.260 ","End":"02:29.905","Text":"and their corresponding eigenvectors are here."},{"Start":"02:29.905 ","End":"02:35.720","Text":"The second step is to build a diagonal matrix"},{"Start":"02:35.720 ","End":"02:42.680","Text":"such that the main diagonal consists of the eigenvalues of A and we call this one D,"},{"Start":"02:42.680 ","End":"02:43.840","Text":"D for diagonal,"},{"Start":"02:43.840 ","End":"02:47.810","Text":"so 6, 2, and minus 4."},{"Start":"02:47.810 ","End":"02:51.605","Text":"The next step is to build a matrix,"},{"Start":"02:51.605 ","End":"02:58.925","Text":"also 3-by-3, whose columns are the eigenvectors corresponding to the eigenvalues."},{"Start":"02:58.925 ","End":"03:01.510","Text":"I\u0027ll just show you what it is."},{"Start":"03:01.510 
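Two things from this part can be checked exactly with SymPy: the entrywise power rule for diagonal matrices, and steps 2-3 of the recipe (eigenvalues stacked into D, matching eigenvectors stacked as the columns of P). The eigenvalues 6, 2, -4 and the first two eigenvectors (0,0,1) and (1,1,1) are taken from the clip; the third eigenvector is not legible in this transcript, so (-1,1,0), the vector paired with -4 in the introduction clip, is assumed here, and the reconstructed A below is only "a matrix with these eigenpairs", not necessarily the one on the board.

```python
import sympy as sp

# Raising a diagonal matrix to a power just raises each diagonal entry
# (a small exponent here; the same rule is why 4^200 etc. are left symbolic).
D_small = sp.diag(4, 10, 21)
assert D_small**5 == sp.diag(4**5, 10**5, 21**5)

# Steps 2-3: eigenvalues on the diagonal of D, matching eigenvectors as columns of P.
D = sp.diag(6, 2, -4)
P = sp.Matrix([[0, 1, -1],
               [0, 1,  1],
               [1, 1,  0]])    # columns for 6, 2, -4; third column assumed (see note above)
assert P.det() != 0            # P is invertible, so P^{-1} exists

A = P * D * P.inv()                 # a matrix with exactly these eigenpairs
assert A**3 == P * D**3 * P.inv()   # A^n = P D^n P^{-1}
```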
","End":"03:06.275","Text":"We call it P and I still have these."},{"Start":"03:06.275 ","End":"03:08.360","Text":"Notice that here 0,"},{"Start":"03:08.360 ","End":"03:11.210","Text":"0, 1 and the first column 0, 0,"},{"Start":"03:11.210 ","End":"03:13.580","Text":"1, then 1, 1, 1,"},{"Start":"03:13.580 ","End":"03:17.705","Text":"and that\u0027s this column and so on."},{"Start":"03:17.705 ","End":"03:22.630","Text":"I\u0027ve colored it because it is important to put them in the right order."},{"Start":"03:22.630 ","End":"03:29.870","Text":"Now, it turns out that this P is invertible and there\u0027s"},{"Start":"03:29.870 ","End":"03:39.020","Text":"a useful formula that if you compute P times D times the inverse of P,"},{"Start":"03:39.020 ","End":"03:42.680","Text":"then you get the original matrix A."},{"Start":"03:42.680 ","End":"03:45.545","Text":"From that formula I just wrote,"},{"Start":"03:45.545 ","End":"03:55.975","Text":"it\u0027s easy to prove that A^n is P times D^n P inverse."},{"Start":"03:55.975 ","End":"04:00.890","Text":"In our case, it means that the result"},{"Start":"04:00.890 ","End":"04:05.885","Text":"of our problem can be gotten by multiplying these 3,"},{"Start":"04:05.885 ","End":"04:09.500","Text":"well, we also have to raise this to the power of 414,"},{"Start":"04:09.500 ","End":"04:11.795","Text":"but this is a diagonal matrix,"},{"Start":"04:11.795 ","End":"04:14.075","Text":"so that\u0027s easy to do."},{"Start":"04:14.075 ","End":"04:16.820","Text":"An inverse, there\u0027s a little bit of work,"},{"Start":"04:16.820 ","End":"04:18.845","Text":"but not too much."},{"Start":"04:18.845 ","End":"04:21.530","Text":"Most of the work of the exercise, in fact,"},{"Start":"04:21.530 ","End":"04:24.780","Text":"is finding the eigenvalues and eigenvectors."},{"Start":"04:24.780 ","End":"04:27.380","Text":"We put the exponent,"},{"Start":"04:27.380 ","End":"04:34.160","Text":"the power on each of the elements on the diagonal, and the inverse I computed for you."},{"Start":"04:34.160 ","End":"04:40.865","Text":"You can see the computations in one of the solved exercises following the tutorial."},{"Start":"04:40.865 ","End":"04:45.125","Text":"This is now a product of 3 matrices,"},{"Start":"04:45.125 ","End":"04:47.990","Text":"and I\u0027ll just skip that part,"},{"Start":"04:47.990 ","End":"04:50.525","Text":"it just looks messy, but it\u0027s easy."},{"Start":"04:50.525 ","End":"04:53.525","Text":"Notice I put an asterisk here."},{"Start":"04:53.525 ","End":"04:56.930","Text":"I\u0027ll just show you the proof of this,"},{"Start":"04:56.930 ","End":"04:59.335","Text":"but you can skip it."},{"Start":"04:59.335 ","End":"05:02.340","Text":"Just for those who are interested, in fact,"},{"Start":"05:02.340 ","End":"05:03.510","Text":"I won\u0027t even go through it,"},{"Start":"05:03.510 ","End":"05:05.855","Text":"I\u0027ve written it here and if you\u0027re interested,"},{"Start":"05:05.855 ","End":"05:07.685","Text":"try and follow this."},{"Start":"05:07.685 ","End":"05:13.705","Text":"In a sense, we\u0027re done and this is the solution."},{"Start":"05:13.705 ","End":"05:19.250","Text":"In what follows, I\u0027m just going to be making some comments about how to"},{"Start":"05:19.250 ","End":"05:24.785","Text":"generalize this and various snags that you might encounter,"},{"Start":"05:24.785 ","End":"05:28.790","Text":"so I\u0027m going to go back to the beginning and make comments."},{"Start":"05:28.790 ","End":"05:35.959","Text":"Here we are and my first 
remark is that the recipe or the technique"},{"Start":"05:35.959 ","End":"05:43.645","Text":"that we just showed for this matrix is not applicable to every square matrix."},{"Start":"05:43.645 ","End":"05:47.990","Text":"In what follows, I\u0027ll explain why and how we can"},{"Start":"05:47.990 ","End":"05:53.290","Text":"determine in advance if it applies to a given matrix."},{"Start":"05:53.290 ","End":"05:59.350","Text":"Remember that step 1 was to find the eigenvalues and eigenvectors and for this step,"},{"Start":"05:59.350 ","End":"06:01.850","Text":"I have no comments."},{"Start":"06:01.850 ","End":"06:08.145","Text":"Then we had step 2 about building this diagonal matrix,"},{"Start":"06:08.145 ","End":"06:10.875","Text":"and here I do want to comment."},{"Start":"06:10.875 ","End":"06:12.580","Text":"In our particular case,"},{"Start":"06:12.580 ","End":"06:18.545","Text":"we got 3 different eigenvalues and each with algebraic multiplicity 1."},{"Start":"06:18.545 ","End":"06:21.640","Text":"I have to tell you what to do because it sometimes"},{"Start":"06:21.640 ","End":"06:25.780","Text":"happens that we have multiplicity greater than 1."},{"Start":"06:25.780 ","End":"06:31.760","Text":"What we do is we just repeat the eigenvalue n times on the diagonal."},{"Start":"06:31.760 ","End":"06:36.230","Text":"If, for example, the characteristic polynomial turns"},{"Start":"06:36.230 ","End":"06:40.805","Text":"out to be x minus 2 times x minus 3 squared,"},{"Start":"06:40.805 ","End":"06:48.270","Text":"then the roots of this polynomial are 2, 3, and 3."},{"Start":"06:48.640 ","End":"06:52.995","Text":"The eigenvalue 3 has multiplicity 2,"},{"Start":"06:52.995 ","End":"06:58.070","Text":"so we put it twice on the diagonal and our D,"},{"Start":"06:58.070 ","End":"07:01.180","Text":"in this case, would be as follows."},{"Start":"07:01.180 ","End":"07:03.440","Text":"In Step 3, if you remember,"},{"Start":"07:03.440 ","End":"07:09.185","Text":"we built a matrix whose columns were the eigenvectors and we called it"},{"Start":"07:09.185 ","End":"07:15.740","Text":"P. 
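The bookkeeping for a repeated eigenvalue, as in the (x - 2)(x - 3) squared example just given, looks like this in SymPy: the roots come back together with their algebraic multiplicities, and each eigenvalue is then repeated that many times on the diagonal of D.

```python
import sympy as sp

x = sp.symbols('x')
p = (x - 2) * (x - 3)**2          # the example characteristic polynomial

multiplicities = sp.roots(p, x)   # {2: 1, 3: 2} -- eigenvalue -> algebraic multiplicity
print(multiplicities)

# Repeat each eigenvalue according to its multiplicity along the diagonal of D.
diagonal = [lam for lam, k in multiplicities.items() for _ in range(k)]
D = sp.diag(*diagonal)
print(D)                          # 2 once and 3 twice on the diagonal (order may vary)
```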
It turns out that this is not always possible."},{"Start":"07:15.740 ","End":"07:20.345","Text":"But I want to give a definition now that when it is possible to do it,"},{"Start":"07:20.345 ","End":"07:24.410","Text":"then the matrix is called diagonalizable,"},{"Start":"07:24.410 ","End":"07:27.830","Text":"otherwise, we say it\u0027s not diagonalizable."},{"Start":"07:27.830 ","End":"07:33.035","Text":"The main thing is that if we have like a 3-by-3 matrix,"},{"Start":"07:33.035 ","End":"07:40.205","Text":"then we have to find 3 eigenvectors using our method but we might not."},{"Start":"07:40.205 ","End":"07:43.085","Text":"Nothing special about number 3, in general,"},{"Start":"07:43.085 ","End":"07:50.930","Text":"the square matrix of order n needs to have n eigenvectors by using this method."},{"Start":"07:50.930 ","End":"07:54.635","Text":"There is an equivalent requirement, the requirement,"},{"Start":"07:54.635 ","End":"07:57.470","Text":"which we could phrase as the matrix has to have"},{"Start":"07:57.470 ","End":"08:01.100","Text":"the same number of eigenvectors as the order of the matrix,"},{"Start":"08:01.100 ","End":"08:05.300","Text":"is equivalent to the following requirement, and that is,"},{"Start":"08:05.300 ","End":"08:07.445","Text":"that for each eigenvalue,"},{"Start":"08:07.445 ","End":"08:12.560","Text":"the algebraic multiplicity has to be the same as the geometric multiplicity."},{"Start":"08:12.560 ","End":"08:15.260","Text":"This requirement is equivalent to this requirement."},{"Start":"08:15.260 ","End":"08:17.760","Text":"You know what? I\u0027ll just summarize."},{"Start":"08:18.620 ","End":"08:24.140","Text":"An n by n matrix is diagonalizable if and only if,"},{"Start":"08:24.140 ","End":"08:28.025","Text":"either using that first requirement,"},{"Start":"08:28.025 ","End":"08:34.085","Text":"the process that we used above yields n eigenvectors for A,"},{"Start":"08:34.085 ","End":"08:38.825","Text":"no less; or equivalently,"},{"Start":"08:38.825 ","End":"08:40.580","Text":"for each eigenvalue of A,"},{"Start":"08:40.580 ","End":"08:44.790","Text":"its algebraic and geometric multiplicities are equal."},{"Start":"08:45.440 ","End":"08:48.380","Text":"Either one although naturally,"},{"Start":"08:48.380 ","End":"08:55.415","Text":"we would be using the method and we would see if we do get n eigenvectors or less."},{"Start":"08:55.415 ","End":"08:59.595","Text":"Another useful remark. Turns out,"},{"Start":"08:59.595 ","End":"09:05.160","Text":"it\u0027s a theorem, that each eigenvalue has at least 1 eigenvector."},{"Start":"09:05.480 ","End":"09:11.210","Text":"If we have an n by n matrix and it has n different eigenvalues,"},{"Start":"09:11.210 ","End":"09:13.310","Text":"then we will get at least n,"},{"Start":"09:13.310 ","End":"09:20.605","Text":"meaning exactly n eigenvectors and A will be diagonalizable."},{"Start":"09:20.605 ","End":"09:25.985","Text":"I realized I never really explained why the word diagonalizable."},{"Start":"09:25.985 ","End":"09:36.515","Text":"Remember that we had that A is equal to P times D times P inverse."},{"Start":"09:36.515 ","End":"09:42.320","Text":"A is in some ways like D or at least when we raise it to the power,"},{"Start":"09:42.320 ","End":"09:44.960","Text":"we just raise D to the power, so in some ways,"},{"Start":"09:44.960 ","End":"09:48.590","Text":"A is similar to a diagonal matrix"},{"Start":"09:48.590 ","End":"09:53.720","Text":"D. 
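SymPy can check these criteria directly. A short sketch with two matrices of my own: one with distinct eigenvalues, which is therefore automatically diagonalizable, and one where a repeated eigenvalue has too small an eigenspace, so it is not.

```python
import sympy as sp

# Distinct eigenvalues -> guaranteed diagonalizable.
A = sp.Matrix([[2, 1],
               [1, 2]])                  # eigenvalues 1 and 3
print(A.is_diagonalizable())             # True
P, D = A.diagonalize()                   # columns of P are eigenvectors, D is diagonal
assert A == P * D * P.inv()

# A repeated eigenvalue with too few eigenvectors -> not diagonalizable.
J = sp.Matrix([[3, 1],
               [0, 3]])                  # eigenvalue 3 has algebraic multiplicity 2 ...
print(J.eigenvects())                    # ... but only a 1-dimensional eigenspace
print(J.is_diagonalizable())             # False
```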
There is actually a very technical sense in which A and D is similar,"},{"Start":"09:53.720 ","End":"09:56.135","Text":"but I don\u0027t really want to get into it."},{"Start":"09:56.135 ","End":"09:59.970","Text":"It\u0027s just a word, diagonalizable."},{"Start":"10:00.050 ","End":"10:05.760","Text":"This is what we had in Step 4 in the beginning and then here at this point,"},{"Start":"10:05.760 ","End":"10:07.929","Text":"I want to make a comment."},{"Start":"10:07.980 ","End":"10:14.130","Text":"You see, we assume that there is a P minus 1 for this formula,"},{"Start":"10:14.130 ","End":"10:17.920","Text":"I meant to say P inverse, P^minus 1."},{"Start":"10:17.920 ","End":"10:22.030","Text":"But how do we know that P actually has an inverse?"},{"Start":"10:22.030 ","End":"10:25.915","Text":"How do we know that P inverse exists?"},{"Start":"10:25.915 ","End":"10:28.675","Text":"Well, it\u0027s possible to prove,"},{"Start":"10:28.675 ","End":"10:29.920","Text":"but I won\u0027t do it,"},{"Start":"10:29.920 ","End":"10:33.250","Text":"that the eigenvectors that we get,"},{"Start":"10:33.250 ","End":"10:35.650","Text":"if we follow our recipe,"},{"Start":"10:35.650 ","End":"10:39.245","Text":"they are always linearly independent."},{"Start":"10:39.245 ","End":"10:42.195","Text":"Now, for the whole process to work,"},{"Start":"10:42.195 ","End":"10:46.240","Text":"we have to assume that A is diagonalizable."},{"Start":"10:46.240 ","End":"10:48.470","Text":"This is the assumption I\u0027m going to make, otherwise,"},{"Start":"10:48.470 ","End":"10:51.215","Text":"the process doesn\u0027t work and there\u0027s no point in continuing."},{"Start":"10:51.215 ","End":"10:57.515","Text":"Remember one of the ways of defining diagonalizable means that we"},{"Start":"10:57.515 ","End":"11:04.655","Text":"get n eigenvectors using our recipe."},{"Start":"11:04.655 ","End":"11:09.115","Text":"Now, the eigenvectors are the columns of P and"},{"Start":"11:09.115 ","End":"11:14.000","Text":"these are going to be linearly independent because of this,"},{"Start":"11:14.000 ","End":"11:15.050","Text":"what I said here,"},{"Start":"11:15.050 ","End":"11:19.090","Text":"can be proven that the eigenvectors we get are always linearly independent,"},{"Start":"11:19.090 ","End":"11:23.945","Text":"so we have n linearly independent columns"},{"Start":"11:23.945 ","End":"11:28.570","Text":"of P and that\u0027s equivalent to saying that P is invertible."},{"Start":"11:28.570 ","End":"11:36.130","Text":"That\u0027s how we know that P is invertible and P^minus 1 or P inverse exists."},{"Start":"11:36.130 ","End":"11:41.450","Text":"This idea is often stated as a theorem that an n by n matrix A is"},{"Start":"11:41.450 ","End":"11:48.850","Text":"diagonalizable if and only if it has n linearly independent eigenvectors."},{"Start":"11:48.850 ","End":"11:52.875","Text":"Another useful theorem is the following,"},{"Start":"11:52.875 ","End":"11:58.865","Text":"that if we have a set of eigenvectors corresponding to different eigenvalues,"},{"Start":"11:58.865 ","End":"12:02.345","Text":"then that set is linearly independent."},{"Start":"12:02.345 ","End":"12:06.875","Text":"Eigenvectors from different eigenvalues are linearly independent."},{"Start":"12:06.875 ","End":"12:08.840","Text":"That\u0027s pretty useful later on."},{"Start":"12:08.840 ","End":"12:10.970","Text":"Just to re-emphasize,"},{"Start":"12:10.970 ","End":"12:16.595","Text":"if we get n eigenvectors using our method,"},{"Start":"12:16.595 
","End":"12:20.750","Text":"then you can skip the check that they\u0027re linearly independent and we\u0027re"},{"Start":"12:20.750 ","End":"12:25.925","Text":"guaranteed that P is invertible and A is diagonalizable,"},{"Start":"12:25.925 ","End":"12:33.210","Text":"and that\u0027s the main thing to make sure that you get n eigenvectors using our method."},{"Start":"12:33.410 ","End":"12:39.450","Text":"That\u0027s all I want to say for this clip and we are done."}],"ID":25776},{"Watched":false,"Name":"Exercise 1 parts a-f","Duration":"12m 55s","ChapterTopicVideoID":24864,"CourseChapterTopicPlaylistID":118359,"HasSubtitles":true,"ThumbnailPath":null,"UploadDate":null,"DurationForVideoObject":null,"Description":null,"MetaTitle":null,"MetaDescription":null,"Canonical":null,"VideoComments":[],"Subtitles":[{"Start":"00:00.000 ","End":"00:01.920","Text":"In this exercise,"},{"Start":"00:01.920 ","End":"00:08.160","Text":"we\u0027re given this 3 by 3 matrix and let\u0027s say it\u0027s over the real numbers,"},{"Start":"00:08.160 ","End":"00:11.850","Text":"so it\u0027s a member of this."},{"Start":"00:11.850 ","End":"00:15.435","Text":"We have 6 parts to this question."},{"Start":"00:15.435 ","End":"00:18.090","Text":"I won\u0027t read them out, you can pause and read them."},{"Start":"00:18.090 ","End":"00:20.890","Text":"I\u0027ll just take them 1 at a time."},{"Start":"00:20.900 ","End":"00:24.300","Text":"First, the characteristic matrix."},{"Start":"00:24.300 ","End":"00:27.960","Text":"Well, that\u0027s defined to be x times I,"},{"Start":"00:27.960 ","End":"00:32.130","Text":"where I is the identity matrix, minus A."},{"Start":"00:32.130 ","End":"00:35.580","Text":"The identity matrix has all 1s on the diagonal."},{"Start":"00:35.580 ","End":"00:36.990","Text":"When you multiply it by x,"},{"Start":"00:36.990 ","End":"00:39.510","Text":"you have x\u0027s on the diagonal and 0 elsewhere."},{"Start":"00:39.510 ","End":"00:42.315","Text":"This is our matrix A."},{"Start":"00:42.315 ","End":"00:45.410","Text":"This is the result of the subtraction."},{"Start":"00:45.410 ","End":"00:49.250","Text":"Yes, this is our characteristic matrix."},{"Start":"00:49.250 ","End":"00:57.240","Text":"Related to the characteristic matrix is the characteristic polynomial, that\u0027s Part B."},{"Start":"00:57.240 ","End":"01:00.980","Text":"That\u0027s simply the determinant of what we found here because look,"},{"Start":"01:00.980 ","End":"01:03.875","Text":"this xI minus A is the same as that."},{"Start":"01:03.875 ","End":"01:08.510","Text":"All we have to do is take the determinant of the matrix,"},{"Start":"01:08.510 ","End":"01:13.865","Text":"which means replacing the brackets by bars, essentially."},{"Start":"01:13.865 ","End":"01:16.650","Text":"Now, we have to compute this."},{"Start":"01:17.300 ","End":"01:22.775","Text":"I noticed that this first column has a couple of 0s in it."},{"Start":"01:22.775 ","End":"01:25.915","Text":"Let\u0027s expand along the first column."},{"Start":"01:25.915 ","End":"01:31.100","Text":"Then we just get x times, well,"},{"Start":"01:31.100 ","End":"01:34.280","Text":"if I cross out this row and column,"},{"Start":"01:34.280 ","End":"01:37.190","Text":"then I have a 2 by 2 determinant."},{"Start":"01:37.190 ","End":"01:40.745","Text":"That would be x minus 2 times x here,"},{"Start":"01:40.745 ","End":"01:45.745","Text":"minus 1 which makes it plus 1."},{"Start":"01:45.745 ","End":"01:51.250","Text":"Now, we just need to simplify it a bit."},{"Start":"01:51.470 
","End":"01:54.090","Text":"This bit comes out like this."},{"Start":"01:54.090 ","End":"01:58.065","Text":"This is a perfect square so this is what we get."},{"Start":"01:58.065 ","End":"02:00.140","Text":"I want to point out, in this exercise,"},{"Start":"02:00.140 ","End":"02:02.600","Text":"you want your polynomials factorized."},{"Start":"02:02.600 ","End":"02:07.970","Text":"We could have expanded it and said x cubed minus 2x squared plus x or something."},{"Start":"02:07.970 ","End":"02:12.090","Text":"But no, we want it in factorized form."},{"Start":"02:13.060 ","End":"02:16.670","Text":"Put a nice little box around it and give it a name,"},{"Start":"02:16.670 ","End":"02:20.240","Text":"p of x, the characteristic polynomial."},{"Start":"02:20.240 ","End":"02:24.230","Text":"Onto the next section,"},{"Start":"02:24.230 ","End":"02:29.525","Text":"eigenvalues and algebraic multiplicity."},{"Start":"02:29.525 ","End":"02:38.790","Text":"I guess I forgot to say this is Part C. Let\u0027s go for the eigenvalues."},{"Start":"02:39.140 ","End":"02:46.075","Text":"The eigenvalues of a matrix are just the roots of the characteristic polynomial."},{"Start":"02:46.075 ","End":"02:49.330","Text":"Now, what is a root of a polynomial?"},{"Start":"02:49.330 ","End":"02:57.085","Text":"It means it\u0027s a solution to the equation where we let the polynomial equal 0."},{"Start":"02:57.085 ","End":"02:59.440","Text":"Just to throw in another term,"},{"Start":"02:59.440 ","End":"03:01.420","Text":"we had characteristic matrix,"},{"Start":"03:01.420 ","End":"03:04.460","Text":"characteristic polynomial, and"},{"Start":"03:04.460 ","End":"03:06.970","Text":"characteristic equation is just what you get"},{"Start":"03:06.970 ","End":"03:10.000","Text":"when you set the characteristic polynomial to 0."},{"Start":"03:10.000 ","End":"03:12.490","Text":"The eigenvalues are the solutions"},{"Start":"03:12.490 ","End":"03:16.100","Text":"of the characteristic equation is another way of putting it."},{"Start":"03:16.220 ","End":"03:22.955","Text":"In our case, this is the characteristic polynomial equation."},{"Start":"03:22.955 ","End":"03:27.500","Text":"The eigenvalues are therefore, 0 and 1."},{"Start":"03:27.500 ","End":"03:32.180","Text":"Although some people would say there were 3 solutions, 0,"},{"Start":"03:32.180 ","End":"03:33.410","Text":"1, and 1,"},{"Start":"03:33.410 ","End":"03:34.430","Text":"and 1 is a double root,"},{"Start":"03:34.430 ","End":"03:38.310","Text":"but I\u0027m just counting the different ones, 0 and 1."},{"Start":"03:38.930 ","End":"03:43.880","Text":"Next, we come to the algebraic multiplicity."},{"Start":"03:43.880 ","End":"03:45.740","Text":"Before I get into it formally,"},{"Start":"03:45.740 ","End":"03:50.480","Text":"I can tell you informally that multiplicity means how many times it"},{"Start":"03:50.480 ","End":"03:55.520","Text":"appears if I do it this way and 0 appears once and 1 appears twice."},{"Start":"03:55.520 ","End":"03:57.170","Text":"This has multiplicity 1,"},{"Start":"03:57.170 ","End":"03:58.835","Text":"this has multiplicity 2."},{"Start":"03:58.835 ","End":"04:03.300","Text":"Anyway, let\u0027s do it more precisely."},{"Start":"04:03.550 ","End":"04:10.730","Text":"If the characteristic polynomial and I should have added factorized,"},{"Start":"04:10.730 ","End":"04:14.880","Text":"even just write that down, factorized."},{"Start":"04:16.340 ","End":"04:19.505","Text":"If it\u0027s expanded, you won\u0027t see it."},{"Start":"04:19.505 
","End":"04:26.945","Text":"If it contains a factor of the form x minus a to the power of k,"},{"Start":"04:26.945 ","End":"04:29.105","Text":"a is the eigenvalue,"},{"Start":"04:29.105 ","End":"04:34.070","Text":"k is the multiplicity of a."},{"Start":"04:34.070 ","End":"04:36.455","Text":"In our case, well,"},{"Start":"04:36.455 ","End":"04:37.550","Text":"let me just remind you,"},{"Start":"04:37.550 ","End":"04:41.315","Text":"we had x times x minus 1 squared,"},{"Start":"04:41.315 ","End":"04:46.810","Text":"but I could write this as x minus 0 to the 1,"},{"Start":"04:46.810 ","End":"04:49.125","Text":"x minus 1 squared."},{"Start":"04:49.125 ","End":"04:51.555","Text":"Then each factor looks like this."},{"Start":"04:51.555 ","End":"04:54.615","Text":"We see that 0 has multiplicity 1,"},{"Start":"04:54.615 ","End":"04:57.345","Text":"and 1 has multiplicity 2."},{"Start":"04:57.345 ","End":"05:01.840","Text":"They wrote that down in the opposite order, it doesn\u0027t matter."},{"Start":"05:02.290 ","End":"05:09.140","Text":"In Part D, we\u0027ll be discussing another multiplicity called geometric multiplicity,"},{"Start":"05:09.140 ","End":"05:14.005","Text":"and that\u0027s related to a concept called eigenspaces."},{"Start":"05:14.005 ","End":"05:17.550","Text":"Each eigenvalue has its eigenspace."},{"Start":"05:17.550 ","End":"05:23.945","Text":"The way we find it is to substitute the eigenvalue in the characteristic matrix."},{"Start":"05:23.945 ","End":"05:25.490","Text":"Remember, this contains an x,"},{"Start":"05:25.490 ","End":"05:27.830","Text":"instead of x, we put that eigenvalue,"},{"Start":"05:27.830 ","End":"05:33.480","Text":"and then we solve the corresponding system of linear equations."},{"Start":"05:33.520 ","End":"05:41.120","Text":"The solution space of this SLE is called the eigenspace of that particular eigenvalue."},{"Start":"05:41.120 ","End":"05:49.480","Text":"The dimension of the eigenspace is the geometric multiplicity of that eigenvalue."},{"Start":"05:49.480 ","End":"05:51.995","Text":"Just as a reminder,"},{"Start":"05:51.995 ","End":"05:55.385","Text":"our eigenvalues are 1 and 0."},{"Start":"05:55.385 ","End":"05:58.310","Text":"I\u0027ll do the computations separately."},{"Start":"05:58.310 ","End":"06:01.195","Text":"We\u0027ll first take 1 then the other."},{"Start":"06:01.195 ","End":"06:03.815","Text":"Let\u0027s go with x equals 1,"},{"Start":"06:03.815 ","End":"06:05.390","Text":"this eigenvalue first,"},{"Start":"06:05.390 ","End":"06:07.909","Text":"and afterwards, we\u0027ll do the case 0."},{"Start":"06:07.909 ","End":"06:11.555","Text":"Here\u0027s the characteristic matrix,"},{"Start":"06:11.555 ","End":"06:12.995","Text":"it has the x in it,"},{"Start":"06:12.995 ","End":"06:14.890","Text":"and we substitute,"},{"Start":"06:14.890 ","End":"06:18.365","Text":"instead of x, we put 1, the eigenvalue."},{"Start":"06:18.365 ","End":"06:20.660","Text":"This is what we get, x is 1,"},{"Start":"06:20.660 ","End":"06:22.400","Text":"1 minus 2 is minus 1,"},{"Start":"06:22.400 ","End":"06:25.300","Text":"here again, 1, everything else the same."},{"Start":"06:25.300 ","End":"06:28.540","Text":"Here\u0027s the corresponding SLE."},{"Start":"06:28.540 ","End":"06:33.305","Text":"Just take the coefficients and use xyz."},{"Start":"06:33.305 ","End":"06:37.829","Text":"This would not be the same x as the x here."},{"Start":"06:38.420 ","End":"06:42.110","Text":"We bring this matrix to row echelon form."},{"Start":"06:42.110 
","End":"06:43.700","Text":"Here, it\u0027s pretty easy."},{"Start":"06:43.700 ","End":"06:47.980","Text":"We just have to subtract the second equation from the third equation,"},{"Start":"06:47.980 ","End":"06:53.100","Text":"and that gives us a row of 0s and pretend like it\u0027s not there."},{"Start":"06:53.100 ","End":"06:59.565","Text":"Then we get from this the row echelon form of the system,"},{"Start":"06:59.565 ","End":"07:06.790","Text":"and z is the free variable."},{"Start":"07:06.950 ","End":"07:09.705","Text":"We\u0027re going to use our technique,"},{"Start":"07:09.705 ","End":"07:16.205","Text":"we called it the wondering 1s when we need to find a basis for the solution space,"},{"Start":"07:16.205 ","End":"07:20.690","Text":"and we just take each free variable in turn and make it 1."},{"Start":"07:20.690 ","End":"07:22.010","Text":"Well, there is only 1 here,"},{"Start":"07:22.010 ","End":"07:24.840","Text":"so let z equal 1."},{"Start":"07:24.860 ","End":"07:28.725","Text":"Everything else that was not free, it follows."},{"Start":"07:28.725 ","End":"07:30.420","Text":"Everything else is forced."},{"Start":"07:30.420 ","End":"07:32.315","Text":"Once Z is 1,"},{"Start":"07:32.315 ","End":"07:35.165","Text":"then using back substitution from here,"},{"Start":"07:35.165 ","End":"07:39.910","Text":"we get that y is also equal to 1."},{"Start":"07:39.910 ","End":"07:42.080","Text":"If you put z and y in here,"},{"Start":"07:42.080 ","End":"07:44.045","Text":"we get that x equals 1."},{"Start":"07:44.045 ","End":"07:47.885","Text":"If you arrange them in the order x, y, z,"},{"Start":"07:47.885 ","End":"07:51.740","Text":"then the set containing the vector 1, 1,"},{"Start":"07:51.740 ","End":"07:55.029","Text":"1 is the basis for the solution space."},{"Start":"07:55.029 ","End":"07:56.945","Text":"If we take the span of that,"},{"Start":"07:56.945 ","End":"07:58.490","Text":"that is the solution space."},{"Start":"07:58.490 ","End":"08:00.815","Text":"In other words, this is the eigenspace."},{"Start":"08:00.815 ","End":"08:06.355","Text":"This is the eigenspace for eigenvalue 1."},{"Start":"08:06.355 ","End":"08:11.040","Text":"The dimension is 1 because there\u0027s only 1 vector in the basis,"},{"Start":"08:11.040 ","End":"08:15.050","Text":"so that\u0027s the geometric multiplicity is 1."},{"Start":"08:15.050 ","End":"08:18.890","Text":"For the case where the eigenvalue is 0,"},{"Start":"08:18.890 ","End":"08:24.610","Text":"we substitute x equals 0 in the characteristic matrix."},{"Start":"08:24.610 ","End":"08:27.495","Text":"This is what we get,"},{"Start":"08:27.495 ","End":"08:31.590","Text":"and this is the corresponding SLE."},{"Start":"08:31.590 ","End":"08:35.540","Text":"Here, if we subtract the first row from the second row,"},{"Start":"08:35.540 ","End":"08:39.475","Text":"we get a row of 0s so I can cross that out."},{"Start":"08:39.475 ","End":"08:43.235","Text":"This is the SLE in row echelon form."},{"Start":"08:43.235 ","End":"08:47.630","Text":"The free variable doesn\u0027t actually appear explicitly,"},{"Start":"08:47.630 ","End":"08:50.915","Text":"but that would be x, the free variable."},{"Start":"08:50.915 ","End":"08:55.235","Text":"Using our usual technique,"},{"Start":"08:55.235 ","End":"09:01.025","Text":"we let x equal 1 and then we can compute y and z,"},{"Start":"09:01.025 ","End":"09:03.240","Text":"which are forced,"},{"Start":"09:03.500 ","End":"09:06.675","Text":"z has to be 0,"},{"Start":"09:06.675 ","End":"09:08.025","Text":"regardless 
of x,"},{"Start":"09:08.025 ","End":"09:10.935","Text":"and when z is 0 and y is also 0,"},{"Start":"09:10.935 ","End":"09:12.740","Text":"it doesn\u0027t really depend on x."},{"Start":"09:12.740 ","End":"09:14.750","Text":"If you put them in the right order,"},{"Start":"09:14.750 ","End":"09:16.880","Text":"x, y, z, 1, 0, 0,"},{"Start":"09:16.880 ","End":"09:22.564","Text":"that vector is a basis for the solution space,"},{"Start":"09:22.564 ","End":"09:25.445","Text":"which means that the solution space is the span of this,"},{"Start":"09:25.445 ","End":"09:33.600","Text":"and this is what we call the eigenspace corresponding to eigenvalue 0."},{"Start":"09:34.040 ","End":"09:39.765","Text":"This has dimension 1 because there\u0027s only 1 vector in the basis,"},{"Start":"09:39.765 ","End":"09:45.985","Text":"that\u0027s the geometric multiplicity of eigenvalue 0."},{"Start":"09:45.985 ","End":"09:49.860","Text":"Now, on to the eigenvectors,"},{"Start":"09:49.860 ","End":"09:54.340","Text":"each eigenvalue contributes some eigenvectors."},{"Start":"09:54.340 ","End":"09:56.290","Text":"For a given eigenvalue,"},{"Start":"09:56.290 ","End":"09:59.830","Text":"we just take the basis of the eigenspace."},{"Start":"09:59.830 ","End":"10:03.430","Text":"Actually, the word the is not really right"},{"Start":"10:03.430 ","End":"10:08.830","Text":"here because there\u0027s really more than 1 way to choose a basis,"},{"Start":"10:08.830 ","End":"10:11.130","Text":"but we just pick 1 of them."},{"Start":"10:11.130 ","End":"10:14.875","Text":"Those vectors are the eigenvectors."},{"Start":"10:14.875 ","End":"10:17.930","Text":"For x equals 1,"},{"Start":"10:17.930 ","End":"10:22.630","Text":"we had that the eigenspace is spanned by this."},{"Start":"10:22.630 ","End":"10:26.505","Text":"This vector will be the, or rather an,"},{"Start":"10:26.505 ","End":"10:30.930","Text":"eigenvector for x equals 1."},{"Start":"10:30.930 ","End":"10:35.449","Text":"Similarly, for the eigenvalue 0,"},{"Start":"10:35.449 ","End":"10:38.150","Text":"we got this was the eigenspace,"},{"Start":"10:38.150 ","End":"10:40.460","Text":"and so 1, 0,"},{"Start":"10:40.460 ","End":"10:45.680","Text":"0 would be an eigenvector for the eigenvalue 0."},{"Start":"10:45.680 ","End":"10:48.860","Text":"Notice that this is a big V and this is a little v. 
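For parts d-e, the same hypothetical stand-in can be pushed through the eigenspace computation. Its eigenspace bases come out different from the vectors (1, 1, 1) and (1, 0, 0) found on the board, since it is not the same matrix, but the dimensions, i.e. the geometric multiplicities, are the same: 1 for each eigenvalue.

```python
import sympy as sp

# Same hypothetical stand-in as before (characteristic polynomial x*(x - 1)**2).
M = sp.Matrix([[1, 1, 0],
               [0, 1, 0],
               [0, 0, 0]])

for lam in (1, 0):
    E = lam * sp.eye(3) - M            # substitute the eigenvalue into xI - A
    print(E.rref())                    # row echelon form of the corresponding SLE
    basis = E.nullspace()              # basis of the eigenspace
    print(lam, basis, "geometric multiplicity:", len(basis))
# Both eigenspaces are 1-dimensional, so each eigenvalue contributes one eigenvector.
```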
This"},{"Start":"10:48.860 ","End":"10:53.000","Text":"is for the eigenspace and this is for the eigenvector."},{"Start":"10:53.000 ","End":"10:57.680","Text":"Then we put all the eigenvectors together from the different eigenvalues."},{"Start":"10:57.680 ","End":"11:02.800","Text":"Altogether, we have this and this, are eigenvectors."},{"Start":"11:02.800 ","End":"11:09.110","Text":"Now the last part is the matrix diagonalizable."},{"Start":"11:09.110 ","End":"11:13.565","Text":"Let\u0027s remember the definition of diagonalizable."},{"Start":"11:13.565 ","End":"11:17.850","Text":"There\u0027s 2 ways of phrasing it."},{"Start":"11:17.890 ","End":"11:24.500","Text":"We want to know if there\u0027s an invertible matrix P such that this equation holds true,"},{"Start":"11:24.500 ","End":"11:27.005","Text":"A times P is P times D,"},{"Start":"11:27.005 ","End":"11:29.720","Text":"where D is some diagonal matrix."},{"Start":"11:29.720 ","End":"11:30.995","Text":"But instead of this,"},{"Start":"11:30.995 ","End":"11:33.050","Text":"you could write it this way."},{"Start":"11:33.050 ","End":"11:36.635","Text":"You could really multiply both on the left by P minus 1,"},{"Start":"11:36.635 ","End":"11:39.200","Text":"inverse of P, and get this."},{"Start":"11:39.200 ","End":"11:43.110","Text":"I actually prefer this form to this."},{"Start":"11:43.220 ","End":"11:46.805","Text":"There\u0027s a theorem which really helps us here."},{"Start":"11:46.805 ","End":"11:48.590","Text":"It talks about in general,"},{"Start":"11:48.590 ","End":"11:50.360","Text":"an n by n matrix."},{"Start":"11:50.360 ","End":"11:57.745","Text":"It\u0027s diagonalizable if and only if it has n linearly independent eigenvectors."},{"Start":"11:57.745 ","End":"12:00.030","Text":"Now, in our case, n is 3,"},{"Start":"12:00.030 ","End":"12:05.100","Text":"we have a 3 by 3 and we only have 2 eigenvectors,"},{"Start":"12:05.100 ","End":"12:07.440","Text":"so 2 is not equal to 3,"},{"Start":"12:07.440 ","End":"12:11.575","Text":"so A is not diagonalizable."},{"Start":"12:11.575 ","End":"12:16.760","Text":"There\u0027s also another way using a different theorem which says that"},{"Start":"12:16.760 ","End":"12:21.830","Text":"an n by n matrix is diagonalizable if and only if for each eigenvalue,"},{"Start":"12:21.830 ","End":"12:23.960","Text":"the 2 multiplicities are the same."},{"Start":"12:23.960 ","End":"12:28.920","Text":"The geometric and the algebraic are both equal."},{"Start":"12:28.990 ","End":"12:32.810","Text":"What we had is 1 of the eigenvalues,"},{"Start":"12:32.810 ","End":"12:34.840","Text":"the eigenvalue 1,"},{"Start":"12:34.840 ","End":"12:38.480","Text":"does not satisfy this because it\u0027s"},{"Start":"12:38.480 ","End":"12:44.000","Text":"algebraic multiplicity is 2 but the geometric multiplicity is only 1,"},{"Start":"12:44.000 ","End":"12:46.480","Text":"and 1 is not equal to 2."},{"Start":"12:46.480 ","End":"12:56.280","Text":"Once again, A is not diagonalizable confirmation. 
That\u0027s it."}],"ID":25777},{"Watched":false,"Name":"Exercise 1 parts i-j","Duration":"7m 27s","ChapterTopicVideoID":24865,"CourseChapterTopicPlaylistID":118359,"HasSubtitles":true,"ThumbnailPath":null,"UploadDate":null,"DurationForVideoObject":null,"Description":null,"MetaTitle":null,"MetaDescription":null,"Canonical":null,"VideoComments":[],"Subtitles":[{"Start":"00:00.000 ","End":"00:02.520","Text":"Continuing with the exercise,"},{"Start":"00:02.520 ","End":"00:04.500","Text":"we already did A through F."},{"Start":"00:04.500 ","End":"00:08.310","Text":"There is no G, H that\u0027s reserved"},{"Start":"00:08.310 ","End":"00:13.560","Text":"for diagonalizable matrices and our matrix isn\u0027t."},{"Start":"00:13.560 ","End":"00:18.960","Text":"The next 1 is to find the minimal polynomial for A."},{"Start":"00:18.960 ","End":"00:23.760","Text":"I\u0027ll read the other part when I get to it."},{"Start":"00:23.760 ","End":"00:25.980","Text":"The minimal polynomial."},{"Start":"00:25.980 ","End":"00:29.685","Text":"We start out with the Cayley-Hamilton theorem,"},{"Start":"00:29.685 ","End":"00:35.010","Text":"which in essence says every matrix satisfies its own characteristic equation."},{"Start":"00:35.010 ","End":"00:36.735","Text":"What does that mean?"},{"Start":"00:36.735 ","End":"00:40.920","Text":"We take the characteristic polynomial of a p of x"},{"Start":"00:40.920 ","End":"00:45.440","Text":"and we substitute for x instead of a number,"},{"Start":"00:45.440 ","End":"00:48.990","Text":"we can actually substitute a matrix."},{"Start":"00:49.370 ","End":"00:52.605","Text":"If we substitute the matrix A,"},{"Start":"00:52.605 ","End":"00:56.210","Text":"then p of A is 0."},{"Start":"00:56.210 ","End":"01:01.140","Text":"That\u0027s the Cayley-Hamilton theorem, the 0 matrix."},{"Start":"01:02.090 ","End":"01:05.385","Text":"This shows that there is a polynomial,"},{"Start":"01:05.385 ","End":"01:09.455","Text":"which when you plug in A gives 0,"},{"Start":"01:09.455 ","End":"01:14.750","Text":"but there might be 1 with a smaller degree."},{"Start":"01:14.750 ","End":"01:19.390","Text":"That\u0027s where the concept of minimal polynomial comes in."},{"Start":"01:19.390 ","End":"01:24.285","Text":"The minimal polynomial is a polynomial,"},{"Start":"01:24.285 ","End":"01:25.550","Text":"maybe it\u0027s p of x,"},{"Start":"01:25.550 ","End":"01:27.620","Text":"but maybe there\u0027s 1 with a smaller degree."},{"Start":"01:27.620 ","End":"01:32.090","Text":"Anyway, it\u0027s the polynomial with least degree"},{"Start":"01:32.090 ","End":"01:36.655","Text":"that when you plug A in, you get 0."},{"Start":"01:36.655 ","End":"01:42.350","Text":"Not very essential, but I added the word monic,"},{"Start":"01:42.350 ","End":"01:44.660","Text":"meaning the leading coefficient is 1."},{"Start":"01:44.660 ","End":"01:51.820","Text":"Otherwise I can\u0027t say the,"},{"Start":"01:53.090 ","End":"02:00.855","Text":"because we could multiply a polynomial by 2 or by any non-zero number."},{"Start":"02:00.855 ","End":"02:06.170","Text":"That will also satisfy this and it won\u0027t change the degree."},{"Start":"02:06.170 ","End":"02:10.580","Text":"To make it unique we just say that the leading coefficient is 1."},{"Start":"02:10.580 ","End":"02:12.440","Text":"It isn\u0027t, we can divide by it."},{"Start":"02:12.440 ","End":"02:14.890","Text":"Anyway this is not crucial."},{"Start":"02:14.890 ","End":"02:18.530","Text":"There\u0027s a theorem which helps us here,"},{"Start":"02:18.530 
","End":"02:21.890","Text":"which says that the minimal polynomial and"},{"Start":"02:21.890 ","End":"02:27.335","Text":"the characteristic polynomial have the same irreducible factors."},{"Start":"02:27.335 ","End":"02:30.215","Text":"I\u0027ll explain what this means in a moment."},{"Start":"02:30.215 ","End":"02:37.700","Text":"The method is that we start off with the characteristic polynomial."},{"Start":"02:37.700 ","End":"02:40.970","Text":"In our case, this is not the characteristic polynomial."},{"Start":"02:40.970 ","End":"02:44.195","Text":"I\u0027m just using 1 for the sake of example."},{"Start":"02:44.195 ","End":"02:49.870","Text":"I want 1 that\u0027s more interesting for the purposes of illustration."},{"Start":"02:49.870 ","End":"02:51.664","Text":"Let\u0027s take this 1."},{"Start":"02:51.664 ","End":"02:59.339","Text":"The irreducible factors, the x minus 1,"},{"Start":"02:59.339 ","End":"03:02.560","Text":"and the x squared plus 4."},{"Start":"03:02.560 ","End":"03:04.460","Text":"They are irreducible."},{"Start":"03:04.460 ","End":"03:06.764","Text":"We can\u0027t factorized anymore."},{"Start":"03:06.764 ","End":"03:09.950","Text":"Remember, we\u0027re working over the real numbers."},{"Start":"03:09.950 ","End":"03:12.530","Text":"As some of you who\u0027ve studied complex numbers,"},{"Start":"03:12.530 ","End":"03:15.570","Text":"and then this would be factorizable."},{"Start":"03:16.070 ","End":"03:18.680","Text":"If it has the same factors,"},{"Start":"03:18.680 ","End":"03:19.760","Text":"it has to have an x,"},{"Start":"03:19.760 ","End":"03:21.545","Text":"it has to have an x minus 1,"},{"Start":"03:21.545 ","End":"03:23.645","Text":"and it has to have an x squared plus 4,"},{"Start":"03:23.645 ","End":"03:25.820","Text":"but possibly of lower degree."},{"Start":"03:25.820 ","End":"03:28.885","Text":"Instead of 2, this could be 1."},{"Start":"03:28.885 ","End":"03:33.220","Text":"Instead of this, it could be 1 or 2 or 3."},{"Start":"03:33.220 ","End":"03:36.335","Text":"That actually gives us 6 combinations."},{"Start":"03:36.335 ","End":"03:38.915","Text":"This has to stay degree."},{"Start":"03:38.915 ","End":"03:41.010","Text":"I mean, yeah, 1 here."},{"Start":"03:41.010 ","End":"03:45.495","Text":"Well, we can choose 2 choice of 2, choice of 3."},{"Start":"03:45.495 ","End":"03:47.740","Text":"2 times 3 is 6,"},{"Start":"03:47.740 ","End":"03:50.930","Text":"and that gives us a choice of 6 possibilities."},{"Start":"03:50.930 ","End":"03:56.670","Text":"I take all these combinations where here it\u0027s 1 or 2 and here 1, 2 or 3."},{"Start":"03:57.200 ","End":"04:02.270","Text":"I arrange them in order of increasing degree."},{"Start":"04:02.270 ","End":"04:05.165","Text":"If it\u0027s a tie, it doesn\u0027t matter which order."},{"Start":"04:05.165 ","End":"04:11.675","Text":"Notice that the last 1 is actually the characteristic polynomial itself."},{"Start":"04:11.675 ","End":"04:16.040","Text":"What we do is we substitute the matrix,"},{"Start":"04:16.040 ","End":"04:17.940","Text":"in our case it\u0027s A,"},{"Start":"04:17.940 ","End":"04:21.230","Text":"successively in the polynomials,"},{"Start":"04:21.230 ","End":"04:25.185","Text":"but we do it from the lowest degree,"},{"Start":"04:25.185 ","End":"04:28.950","Text":"successively, increasing the degree."},{"Start":"04:28.950 ","End":"04:32.770","Text":"The first 1 we get that gives us 0 that\u0027s"},{"Start":"04:32.770 ","End":"04:36.680","Text":"where we stop and that m is the 1 that we 
want."},{"Start":"04:36.680 ","End":"04:39.165","Text":"That would be the minimal polynomial."},{"Start":"04:39.165 ","End":"04:40.050","Text":"M for minimal."},{"Start":"04:40.050 ","End":"04:43.310","Text":"Now, returning to our example."},{"Start":"04:43.310 ","End":"04:45.650","Text":"This wasn\u0027t our example."},{"Start":"04:45.650 ","End":"04:49.805","Text":"Our example, this was the characteristic polynomial."},{"Start":"04:49.805 ","End":"04:52.280","Text":"This was the matrix A."},{"Start":"04:52.280 ","End":"04:55.760","Text":"I can write this as x^1."},{"Start":"04:55.760 ","End":"04:59.735","Text":"Just like before we vary,"},{"Start":"04:59.735 ","End":"05:02.140","Text":"this could be 2, 0, 1."},{"Start":"05:02.140 ","End":"05:04.310","Text":"If we have to have all the irreducible factors,"},{"Start":"05:04.310 ","End":"05:08.900","Text":"we have to have x and we have to have x minus 1."},{"Start":"05:08.900 ","End":"05:13.595","Text":"The only way to reduce the degree is to take a 1 here instead."},{"Start":"05:13.595 ","End":"05:15.770","Text":"So there\u0027s 2 possibilities."},{"Start":"05:15.770 ","End":"05:17.630","Text":"If we take the 1 here,"},{"Start":"05:17.630 ","End":"05:19.280","Text":"then we get x, x minus 1."},{"Start":"05:19.280 ","End":"05:20.645","Text":"If we take the 2 here,"},{"Start":"05:20.645 ","End":"05:22.490","Text":"it\u0027s x, x minus 1 squared."},{"Start":"05:22.490 ","End":"05:25.055","Text":"This 1 has degree 2, this 1 has degree 3."},{"Start":"05:25.055 ","End":"05:30.270","Text":"We start with this 1 and substitute A into it."},{"Start":"05:30.270 ","End":"05:32.940","Text":"We want to compute A times."},{"Start":"05:32.940 ","End":"05:35.050","Text":"It\u0027s not A minus 1."},{"Start":"05:35.050 ","End":"05:37.320","Text":"When you substitute a matrix,"},{"Start":"05:37.320 ","End":"05:41.540","Text":"1 is the identity matrix."},{"Start":"05:41.540 ","End":"05:43.625","Text":"That\u0027s how it works."},{"Start":"05:43.625 ","End":"05:48.120","Text":"Here\u0027s the matrix A, the original 1."},{"Start":"05:48.500 ","End":"05:52.725","Text":"The identity matrix just has 1s along the diagonal."},{"Start":"05:52.725 ","End":"05:56.310","Text":"Just have to subtract 1 from here, here, and here."},{"Start":"05:56.310 ","End":"05:58.860","Text":"That gives me this matrix."},{"Start":"05:58.860 ","End":"06:03.060","Text":"Now, we do a matrix multiplication."},{"Start":"06:03.310 ","End":"06:06.770","Text":"But I didn\u0027t do the multiplication all the way."},{"Start":"06:06.770 ","End":"06:08.540","Text":"I started doing these."},{"Start":"06:08.540 ","End":"06:10.685","Text":"But as soon as you get,"},{"Start":"06:10.685 ","End":"06:15.035","Text":"if we take this with this,"},{"Start":"06:15.035 ","End":"06:16.924","Text":"that gives us this entry,"},{"Start":"06:16.924 ","End":"06:22.450","Text":"it comes out to be 1, 0 times 2, 2 times 1 minus 1."},{"Start":"06:22.450 ","End":"06:25.490","Text":"As soon as you get a non-zero entry, you can stop."},{"Start":"06:25.490 ","End":"06:27.440","Text":"There\u0027s no point continuing because"},{"Start":"06:27.440 ","End":"06:31.620","Text":"it\u0027s already not going to equal the 0 matrix."},{"Start":"06:32.630 ","End":"06:35.970","Text":"This m is no good for us."},{"Start":"06:35.970 ","End":"06:38.885","Text":"We have to take the other m,"},{"Start":"06:38.885 ","End":"06:44.240","Text":"this 1, and that 1 is the characteristic polynomial is also the minimal."},{"Start":"06:44.240 
","End":"06:47.450","Text":"We don\u0027t have to substitute because Cayley-Hamilton guarantees"},{"Start":"06:47.450 ","End":"06:52.020","Text":"that if we substituted here, we\u0027ll get 0."},{"Start":"06:52.370 ","End":"06:58.540","Text":"The last part is A invertible."},{"Start":"06:58.540 ","End":"07:01.475","Text":"The theorem that would help us here,"},{"Start":"07:01.475 ","End":"07:06.442","Text":"that a matrix is invertible if and only if its eigenvalues"},{"Start":"07:06.442 ","End":"07:09.200","Text":"and all of them are non-zero."},{"Start":"07:09.200 ","End":"07:13.699","Text":"Which is not true in our case because the eigenvalues,"},{"Start":"07:13.699 ","End":"07:15.515","Text":"as you remember are 0 and 1,"},{"Start":"07:15.515 ","End":"07:17.645","Text":"so I can\u0027t say they\u0027re all non-zero."},{"Start":"07:17.645 ","End":"07:19.565","Text":"1 of them is specifically 0,"},{"Start":"07:19.565 ","End":"07:22.830","Text":"so it\u0027s not invertible."},{"Start":"07:23.690 ","End":"07:27.730","Text":"That concludes this exercise."}],"ID":25778},{"Watched":false,"Name":"Exercise 2 parts a-f","Duration":"12m 7s","ChapterTopicVideoID":24839,"CourseChapterTopicPlaylistID":118359,"HasSubtitles":true,"ThumbnailPath":null,"UploadDate":null,"DurationForVideoObject":null,"Description":null,"MetaTitle":null,"MetaDescription":null,"Canonical":null,"VideoComments":[],"Subtitles":[{"Start":"00:00.050 ","End":"00:05.700","Text":"In this exercise, we\u0027re given a 3 by 3 matrix A as follows"},{"Start":"00:05.700 ","End":"00:10.830","Text":"and there are 6 parts here and I won\u0027t read them out"},{"Start":"00:10.830 ","End":"00:15.960","Text":"the same as was in the previous exercise, so let\u0027s just start."},{"Start":"00:15.960 ","End":"00:22.065","Text":"The first 1 is to find the characteristic matrix,"},{"Start":"00:22.065 ","End":"00:29.440","Text":"and this is defined as x times the identity matrix minus A."},{"Start":"00:29.440 ","End":"00:37.560","Text":"Remember that the identity matrix has 1s along the diagonal and 0 elsewhere."},{"Start":"00:37.560 ","End":"00:40.310","Text":"If we multiply the identity by x,"},{"Start":"00:40.310 ","End":"00:44.900","Text":"we just have xs on the diagonal and this is our original A,"},{"Start":"00:44.900 ","End":"00:47.965","Text":"and then subtraction as usual,"},{"Start":"00:47.965 ","End":"00:49.980","Text":"and that gives us this,"},{"Start":"00:49.980 ","End":"00:52.580","Text":"and this is the characteristic matrix."},{"Start":"00:52.580 ","End":"00:56.480","Text":"Just want to point out that in some places,"},{"Start":"00:56.480 ","End":"00:58.610","Text":"instead of the letter x,"},{"Start":"00:58.610 ","End":"01:01.130","Text":"the letter lambda is often used"},{"Start":"01:01.130 ","End":"01:05.395","Text":"and you also sometimes see the subtraction done in the other order,"},{"Start":"01:05.395 ","End":"01:09.240","Text":"A minus x times I."},{"Start":"01:09.240 ","End":"01:11.820","Text":"All these are good,"},{"Start":"01:11.820 ","End":"01:16.960","Text":"some professors, some books, use 1 or the other."},{"Start":"01:18.460 ","End":"01:22.490","Text":"Next section is the characteristic polynomial."},{"Start":"01:22.490 ","End":"01:27.875","Text":"This is just the determinant of the characteristic matrix."},{"Start":"01:27.875 ","End":"01:30.260","Text":"Remember the characteristic matrix is this"},{"Start":"01:30.260 ","End":"01:32.540","Text":"and its determinants characteristic polynomial."},{"Start":"01:32.540 
","End":"01:37.120","Text":"In our case, we just replace the brackets by bars,"},{"Start":"01:37.120 ","End":"01:40.425","Text":"like so, and determinants,"},{"Start":"01:40.425 ","End":"01:43.840","Text":"we can expand along the row or column."},{"Start":"01:44.320 ","End":"01:47.945","Text":"You could do it by the first column"},{"Start":"01:47.945 ","End":"01:51.245","Text":"and take this and multiply by the determinant of this."},{"Start":"01:51.245 ","End":"01:55.999","Text":"Then you\u0027d see it\u0027s just the product of the diagonal."},{"Start":"01:55.999 ","End":"01:58.490","Text":"In fact, there\u0027s also a theorem that if you have"},{"Start":"01:58.490 ","End":"02:03.785","Text":"an upper diagonal matrix or a lower diagonal,"},{"Start":"02:03.785 ","End":"02:08.460","Text":"this case below the diagonal, we have 0s,"},{"Start":"02:08.460 ","End":"02:12.490","Text":"then the product of the entries on the diagonal is the determinant."},{"Start":"02:12.490 ","End":"02:18.305","Text":"It\u0027s x minus 1 times x minus 1 times x minus 2, like so."},{"Start":"02:18.305 ","End":"02:22.749","Text":"That\u0027s the characteristic polynomial and we put it in a nice box."},{"Start":"02:22.749 ","End":"02:30.665","Text":"Next, we have eigenvalues and algebraic multiplicity of each eigenvalue."},{"Start":"02:30.665 ","End":"02:33.605","Text":"Let\u0027s start with the eigenvalues."},{"Start":"02:33.605 ","End":"02:38.825","Text":"The eigenvalues are just the roots of the characteristic polynomial."},{"Start":"02:38.825 ","End":"02:41.225","Text":"When I say roots of a polynomial, I mean,"},{"Start":"02:41.225 ","End":"02:49.470","Text":"you set the polynomial to 0 and you find the solution to this."},{"Start":"02:49.470 ","End":"02:55.220","Text":"I\u0027m going to point out that sometimes we use the concept characteristic equation."},{"Start":"02:55.220 ","End":"02:57.860","Text":"When I set the characteristic polynomial to 0,"},{"Start":"02:57.860 ","End":"02:59.885","Text":"that\u0027s called the characteristic equation."},{"Start":"02:59.885 ","End":"03:01.865","Text":"You could say that"},{"Start":"03:01.865 ","End":"03:07.205","Text":"the eigenvalues are solutions of the characteristic equation, same thing."},{"Start":"03:07.205 ","End":"03:11.900","Text":"In our case, the characteristic polynomial is this,"},{"Start":"03:11.900 ","End":"03:15.865","Text":"we set it to 0 and we solve it."},{"Start":"03:15.865 ","End":"03:20.080","Text":"There are 2 solutions, 1 and 2."},{"Start":"03:20.080 ","End":"03:25.805","Text":"Although you could say that there are 3 solutions, 1, 1 and 2,"},{"Start":"03:25.805 ","End":"03:28.555","Text":"and that the 1 is a double solution."},{"Start":"03:28.555 ","End":"03:33.355","Text":"That actually ties in with algebraic multiplicity, as we shall see."},{"Start":"03:33.355 ","End":"03:38.940","Text":"Now, each eigenvalue has an algebraic multiplicity."},{"Start":"03:38.940 ","End":"03:42.650","Text":"Let\u0027s do it for x equals 1 and for x equals 2."},{"Start":"03:42.650 ","End":"03:45.100","Text":"But first a bit of theory,"},{"Start":"03:45.100 ","End":"03:47.800","Text":"which should be just a reminder,"},{"Start":"03:47.800 ","End":"03:54.815","Text":"that if the characteristic polynomial contains a factor of the form x minus a^k,"},{"Start":"03:54.815 ","End":"03:57.460","Text":"just like here we have x minus 1 squared,"},{"Start":"03:57.460 ","End":"04:01.775","Text":"then the algebraic multiplicity of a,"},{"Start":"04:01.775 
","End":"04:04.370","Text":"that would be the eigenvalue is k."},{"Start":"04:04.370 ","End":"04:07.250","Text":"I should have said that I\u0027m going to assume"},{"Start":"04:07.250 ","End":"04:13.385","Text":"that all our polynomials are given in factorized form."},{"Start":"04:13.385 ","End":"04:15.890","Text":"Otherwise you won\u0027t see these things,"},{"Start":"04:15.890 ","End":"04:17.180","Text":"I mean, if I expanded this,"},{"Start":"04:17.180 ","End":"04:18.425","Text":"it would be some cubic."},{"Start":"04:18.425 ","End":"04:21.920","Text":"You wouldn\u0027t see all this so we\u0027re assuming in this context"},{"Start":"04:21.920 ","End":"04:24.380","Text":"that everything\u0027s factorized."},{"Start":"04:24.380 ","End":"04:29.270","Text":"Anyway, follows from the definition especially if I put a 1 here,"},{"Start":"04:29.270 ","End":"04:36.655","Text":"that 1 has a multiplicity of 2 and 2 has a multiplicity of 1,"},{"Start":"04:36.655 ","End":"04:39.855","Text":"algebraic multiplicity that is,"},{"Start":"04:39.855 ","End":"04:41.970","Text":"and that corresponds to,"},{"Start":"04:41.970 ","End":"04:44.345","Text":"I mean, you can see here we have 1, 1, 2,"},{"Start":"04:44.345 ","End":"04:50.160","Text":"1 appears twice, so it has multiplicity 2, 2 appears once."},{"Start":"04:50.450 ","End":"04:55.490","Text":"Now we\u0027re going to find the eigenspaces for each eigenvalue"},{"Start":"04:55.490 ","End":"05:01.400","Text":"and also the geometric multiplicity as opposed to the algebraic multiplicity."},{"Start":"05:01.400 ","End":"05:05.510","Text":"To find the eigenspace of a given eigenvalue,"},{"Start":"05:05.510 ","End":"05:09.050","Text":"what you do, is you substitute the eigenvalue"},{"Start":"05:09.050 ","End":"05:15.664","Text":"in the characteristic matrix and corresponding to the matrix,"},{"Start":"05:15.664 ","End":"05:18.785","Text":"we have a system of linear equations."},{"Start":"05:18.785 ","End":"05:22.480","Text":"We solve that system of linear equations,"},{"Start":"05:22.480 ","End":"05:28.460","Text":"and the solution space is the eigenspace."},{"Start":"05:28.460 ","End":"05:29.800","Text":"Here I wrote that."},{"Start":"05:29.800 ","End":"05:34.340","Text":"The dimension of the eigenspace is called"},{"Start":"05:34.340 ","End":"05:38.810","Text":"the geometric multiplicity of the eigenvalue."},{"Start":"05:38.810 ","End":"05:42.390","Text":"I don\u0027t know why it\u0027s called geometric."},{"Start":"05:42.550 ","End":"05:46.423","Text":"Remember that in our case we have eigenvalues 1 and 0"},{"Start":"05:46.423 ","End":"05:49.490","Text":"and let\u0027s take care of each 1 separately."},{"Start":"05:49.490 ","End":"05:53.345","Text":"Let\u0027s start with the case where x equals 1."},{"Start":"05:53.345 ","End":"05:55.700","Text":"We take the characteristic matrix,"},{"Start":"05:55.700 ","End":"05:59.420","Text":"which is this, and substitute x equals 1."},{"Start":"05:59.420 ","End":"06:01.685","Text":"If we do that, we get this."},{"Start":"06:01.685 ","End":"06:07.360","Text":"Notice that we have a row of 0s and I just throw that out,"},{"Start":"06:07.360 ","End":"06:14.150","Text":"and so we get this peculiar set of 2 equations and 3 variables"},{"Start":"06:14.150 ","End":"06:18.785","Text":"but x just happens not to be mentioned by name."},{"Start":"06:18.785 ","End":"06:26.065","Text":"Still x is the free variable and y and z are constrained."},{"Start":"06:26.065 ","End":"06:30.380","Text":"What we do is we let the free variable be 
1"},{"Start":"06:30.380 ","End":"06:33.260","Text":"and there\u0027s no way to substitute it."},{"Start":"06:33.260 ","End":"06:38.015","Text":"We can just straight away get that y is 0 and z is 0,"},{"Start":"06:38.015 ","End":"06:43.190","Text":"and so the solution space corresponding"},{"Start":"06:43.190 ","End":"06:49.850","Text":"to the eigenvalue 1 is the subspace spanned"},{"Start":"06:49.850 ","End":"06:56.110","Text":"by the vector 1, 0, 0 from here, x, y, z in order."},{"Start":"06:56.110 ","End":"07:00.000","Text":"Now the dimension of this is 1,"},{"Start":"07:00.000 ","End":"07:04.605","Text":"I mean, this is a base 1, 0, 0 which is a single vector."},{"Start":"07:04.605 ","End":"07:07.400","Text":"That\u0027s the geometric multiplicity."},{"Start":"07:07.400 ","End":"07:10.190","Text":"It\u0027s the dimension of the eigenspace"},{"Start":"07:10.190 ","End":"07:12.705","Text":"and so the answer to that is 1."},{"Start":"07:12.705 ","End":"07:15.965","Text":"Let\u0027s go on to the next eigenvalue."},{"Start":"07:15.965 ","End":"07:21.980","Text":"We proceed similarly for the eigenvalue 2,"},{"Start":"07:21.980 ","End":"07:25.910","Text":"we take the characteristic matrix and instead of x,"},{"Start":"07:25.910 ","End":"07:31.819","Text":"we plug in 2 and then we get this matrix and the row of 0s,"},{"Start":"07:31.819 ","End":"07:33.380","Text":"you just throw it out."},{"Start":"07:33.380 ","End":"07:37.760","Text":"This gives us 2 equations in 3 unknowns, x, y, and z."},{"Start":"07:37.760 ","End":"07:40.115","Text":"Only z doesn\u0027t appear explicitly,"},{"Start":"07:40.115 ","End":"07:42.595","Text":"so z is the free variable."},{"Start":"07:42.595 ","End":"07:45.760","Text":"Then x and y are constrained."},{"Start":"07:45.760 ","End":"07:49.565","Text":"Use the method of the wandering 1s and we get a basis,"},{"Start":"07:49.565 ","End":"07:52.250","Text":"we let the free variable be 1."},{"Start":"07:52.250 ","End":"07:53.630","Text":"If there\u0027s more than 1,"},{"Start":"07:53.630 ","End":"07:58.870","Text":"we let each 1 be 1 and the others 0 and keep changing."},{"Start":"07:58.870 ","End":"08:03.380","Text":"Here we have just that and then y is 0,"},{"Start":"08:03.380 ","End":"08:05.720","Text":"doesn\u0027t really depend on z at all."},{"Start":"08:05.720 ","End":"08:09.230","Text":"When y is 0, then it forces x to be 0,"},{"Start":"08:09.230 ","End":"08:12.500","Text":"if we put them in order, it\u0027s 0, 0, 1"},{"Start":"08:12.500 ","End":"08:22.535","Text":"and so the eigenspace corresponding to eigenvalue 2 is the span of 0, 0, 1."},{"Start":"08:22.535 ","End":"08:27.230","Text":"This space has also dimension 1"},{"Start":"08:27.230 ","End":"08:33.710","Text":"and that means that the geometric multiplicity is 1."},{"Start":"08:33.710 ","End":"08:38.030","Text":"Next we come to the eigenvectors."},{"Start":"08:38.030 ","End":"08:45.710","Text":"What we do for a given eigenvalue is take a basis of the eigenspace"},{"Start":"08:45.710 ","End":"08:55.730","Text":"and those vectors, this is not ideally phrased."},{"Start":"08:55.730 ","End":"08:59.180","Text":"What I mean is we just take a basis for the eigenspace"},{"Start":"08:59.180 ","End":"09:03.395","Text":"and those we call the eigenvectors."},{"Start":"09:03.395 ","End":"09:07.970","Text":"To be pedantic theoretically there\u0027s more than 1 way to choose a basis,"},{"Start":"09:07.970 ","End":"09:09.440","Text":"but it doesn\u0027t really matter."},{"Start":"09:09.440 ","End":"09:11.675","Text":"But 
once you choose, you stick to it,"},{"Start":"09:11.675 ","End":"09:15.600","Text":"choose a basis, then those are your eigenvectors."},{"Start":"09:16.040 ","End":"09:20.420","Text":"In our case, for the eigenvalue 1,"},{"Start":"09:20.420 ","End":"09:27.590","Text":"we had that this was the eigenspace and the basis for it is just 1 0, 0,"},{"Start":"09:27.590 ","End":"09:32.580","Text":"and this would be the eigenvector for x equals 1."},{"Start":"09:33.170 ","End":"09:36.260","Text":"Similarly with x equals 2,"},{"Start":"09:36.260 ","End":"09:41.620","Text":"we found that the solution space was spanned by this 1 vector,"},{"Start":"09:41.620 ","End":"09:43.735","Text":"so that is the basis,"},{"Start":"09:43.735 ","End":"09:50.810","Text":"and this would be the eigenvector for x equals 2."},{"Start":"09:50.810 ","End":"09:53.390","Text":"If we take them together, we could say that,"},{"Start":"09:53.390 ","End":"09:57.140","Text":"these 2 are the 2 eigenvectors."},{"Start":"09:57.140 ","End":"10:03.710","Text":"The final section asks if the matrix is diagonalizable."},{"Start":"10:04.520 ","End":"10:08.460","Text":"Let\u0027s remember what that even means,"},{"Start":"10:08.460 ","End":"10:12.695","Text":"that means that there exists an invertible matrix P,"},{"Start":"10:12.695 ","End":"10:16.675","Text":"such that A P equals P D,"},{"Start":"10:16.675 ","End":"10:19.610","Text":"and I actually prefer this alternate version."},{"Start":"10:19.610 ","End":"10:22.610","Text":"If you bring the P over to the other side, minus 1,"},{"Start":"10:22.610 ","End":"10:24.890","Text":"P minus 1 A P is D,"},{"Start":"10:24.890 ","End":"10:28.070","Text":"where D is a diagonal matrix."},{"Start":"10:28.070 ","End":"10:32.255","Text":"Now there\u0027s a theorem that\u0027s going to help us here,"},{"Start":"10:32.255 ","End":"10:37.220","Text":"and this theorem states that if we have an n by n matrix,"},{"Start":"10:37.220 ","End":"10:43.505","Text":"then it\u0027s diagonalizable if and only if it has n linearly independent eigenvectors."},{"Start":"10:43.505 ","End":"10:46.240","Text":"In our case, n equals 3,"},{"Start":"10:46.240 ","End":"10:50.370","Text":"so we would expect 3 linearly independent eigenvectors"},{"Start":"10:50.370 ","End":"10:57.135","Text":"and in our case, we only found 2,"},{"Start":"10:57.135 ","End":"11:04.085","Text":"2 is not equal to 3 and so A is not diagonalizable."},{"Start":"11:04.085 ","End":"11:06.360","Text":"Now, we could have got this another way,"},{"Start":"11:06.360 ","End":"11:08.329","Text":"I wanted to show you another theorem"},{"Start":"11:08.329 ","End":"11:16.460","Text":"which also gives a condition for an n by n matrix to be diagonalizable."},{"Start":"11:16.460 ","End":"11:20.840","Text":"This condition is that the geometric"},{"Start":"11:20.840 ","End":"11:26.720","Text":"and algebraic multiplicities have to be the same for each eigenvalue."},{"Start":"11:26.720 ","End":"11:29.470","Text":"Note the word each."},{"Start":"11:29.470 ","End":"11:31.355","Text":"Now in our case,"},{"Start":"11:31.355 ","End":"11:33.500","Text":"eigenvalue 2 is okay,"},{"Start":"11:33.500 ","End":"11:36.020","Text":"they both had multiplicity 1,"},{"Start":"11:36.020 ","End":"11:38.180","Text":"the algebraic and the geometric."},{"Start":"11:38.180 ","End":"11:44.520","Text":"But the eigenvalue 1 has different multiplicities algebraic it\u0027s 2"},{"Start":"11:44.520 ","End":"11:47.610","Text":"and geometric multiplicity is 1,"},{"Start":"11:47.610 
","End":"11:50.050","Text":"which is not the same."},{"Start":"11:50.450 ","End":"11:54.845","Text":"Again, we see that A is not diagonalizable."},{"Start":"11:54.845 ","End":"11:57.590","Text":"We got contradicting answers here and here,"},{"Start":"11:57.590 ","End":"11:59.660","Text":"that would be very bad but we\u0027re okay,"},{"Start":"11:59.660 ","End":"12:02.900","Text":"so A is not diagonalizable, it\u0027s double-checked."},{"Start":"12:02.900 ","End":"12:07.290","Text":"We are done."}],"ID":25752},{"Watched":false,"Name":"Exercise 2 parts i-j","Duration":"10m 3s","ChapterTopicVideoID":24840,"CourseChapterTopicPlaylistID":118359,"HasSubtitles":true,"ThumbnailPath":null,"UploadDate":null,"DurationForVideoObject":null,"Description":null,"MetaTitle":null,"MetaDescription":null,"Canonical":null,"VideoComments":[],"Subtitles":[{"Start":"00:00.110 ","End":"00:03.870","Text":"We\u0027re continuing with the same exercise we did."},{"Start":"00:03.870 ","End":"00:08.850","Text":"parts A through F. G and H are"},{"Start":"00:08.850 ","End":"00:14.295","Text":"omitted because they relate to diagonalizable matrices."},{"Start":"00:14.295 ","End":"00:16.515","Text":"We go onto I,J."},{"Start":"00:16.515 ","End":"00:21.195","Text":"First define the minimal polynomial for A,"},{"Start":"00:21.195 ","End":"00:25.320","Text":"then to determine if A is invertible,"},{"Start":"00:25.320 ","End":"00:28.510","Text":"and we\u0027ll continue reading it when we get to it."},{"Start":"00:28.790 ","End":"00:34.110","Text":"I\u0027m going to remind you what a minimal polynomial is,"},{"Start":"00:34.110 ","End":"00:37.765","Text":"starting with the Cayley-Hamilton theorem,"},{"Start":"00:37.765 ","End":"00:44.390","Text":"which in condensed form just says every matrix satisfy its own characteristic equation."},{"Start":"00:44.390 ","End":"00:46.655","Text":"If we spell it out,"},{"Start":"00:46.655 ","End":"00:49.655","Text":"what it means is that if we have a matrix,"},{"Start":"00:49.655 ","End":"00:57.845","Text":"then we take its characteristic polynomial p of x and if we substitute instead of x,"},{"Start":"00:57.845 ","End":"00:59.540","Text":"instead of a real number,"},{"Start":"00:59.540 ","End":"01:04.895","Text":"we can substitute a matrix instead of x and get p of a,"},{"Start":"01:04.895 ","End":"01:09.590","Text":"then it comes out to be 0, the 0 matrix."},{"Start":"01:09.590 ","End":"01:13.025","Text":"You can substitute matrices instead of numbers."},{"Start":"01:13.025 ","End":"01:15.950","Text":"You just have to remember that you don\u0027t have constants."},{"Start":"01:15.950 ","End":"01:19.565","Text":"A constant like 3 is interpreted as 3I,"},{"Start":"01:19.565 ","End":"01:22.930","Text":"where I is the identity matrix, so we\u0027ll see."},{"Start":"01:22.930 ","End":"01:29.810","Text":"We know that there is a polynomial that when you plug A in gives us 0."},{"Start":"01:29.810 ","End":"01:36.005","Text":"What we\u0027re interested in is finding such a polynomial but with the least degree possible."},{"Start":"01:36.005 ","End":"01:39.040","Text":"That is the minimal polynomial."},{"Start":"01:39.040 ","End":"01:42.985","Text":"I could say A minimal polynomial."},{"Start":"01:42.985 ","End":"01:47.990","Text":"Because if I multiply a polynomial by a non-zero constant,"},{"Start":"01:47.990 ","End":"01:51.990","Text":"it will also be of the same degree in everything."},{"Start":"01:52.210 ","End":"01:57.095","Text":"What is customary to do is to take a monic 
polynomial."},{"Start":"01:57.095 ","End":"02:02.305","Text":"The leading coefficient should be 1."},{"Start":"02:02.305 ","End":"02:07.045","Text":"I wrote it here. If we restrict polynomials to have a leading coefficient 1,"},{"Start":"02:07.045 ","End":"02:10.790","Text":"then it\u0027s the minimal polynomial,"},{"Start":"02:10.790 ","End":"02:18.515","Text":"is the monic polynomial with least degree m of x such that m of A is 0."},{"Start":"02:18.515 ","End":"02:22.780","Text":"To help us find this minimal polynomial,"},{"Start":"02:22.780 ","End":"02:29.110","Text":"there\u0027s a theorem which looks strange and we\u0027ll explain it."},{"Start":"02:29.110 ","End":"02:35.770","Text":"The minimal polynomial and the characteristic polynomial,"},{"Start":"02:35.770 ","End":"02:39.415","Text":"they have the same irreducible factors."},{"Start":"02:39.415 ","End":"02:44.815","Text":"Remember that we\u0027re taking all our polynomials to be factorized already."},{"Start":"02:44.815 ","End":"02:49.520","Text":"Let\u0027s see how we put that in practice to actually find it."},{"Start":"02:49.520 ","End":"02:52.720","Text":"First of all, you find the characteristic polynomial."},{"Start":"02:52.720 ","End":"02:58.420","Text":"I don\u0027t want to use our example with A, it\u0027s too small."},{"Start":"02:58.420 ","End":"03:01.000","Text":"Let\u0027s say we got,"},{"Start":"03:01.000 ","End":"03:04.235","Text":"this as our characteristic polynomial."},{"Start":"03:04.235 ","End":"03:06.300","Text":"Notice that this has an exponent,"},{"Start":"03:06.300 ","End":"03:09.335","Text":"2 here 3 and here 1."},{"Start":"03:09.335 ","End":"03:12.145","Text":"Now, these are the irreducible factors,"},{"Start":"03:12.145 ","End":"03:16.480","Text":"x, x minus 1 and x squared plus 4."},{"Start":"03:16.480 ","End":"03:18.100","Text":"I guess I should have said earlier,"},{"Start":"03:18.100 ","End":"03:20.980","Text":"we\u0027re working over the real numbers because this actually"},{"Start":"03:20.980 ","End":"03:25.420","Text":"factorizes over complex numbers and if you don\u0027t know what complex numbers are,"},{"Start":"03:25.420 ","End":"03:27.495","Text":"then just forget I said that."},{"Start":"03:27.495 ","End":"03:30.450","Text":"Anyway, we\u0027re working with real numbers."},{"Start":"03:30.450 ","End":"03:33.310","Text":"These are all irreducible."},{"Start":"03:33.310 ","End":"03:37.160","Text":"What we do is we get a list of possibilities."},{"Start":"03:37.160 ","End":"03:42.275","Text":"Now each of them has to have an x and x minus 1 and an x squared plus 4."},{"Start":"03:42.275 ","End":"03:45.920","Text":"But the exponent could be lower for the minimal,"},{"Start":"03:45.920 ","End":"03:47.675","Text":"like instead of 2,"},{"Start":"03:47.675 ","End":"03:49.910","Text":"it could be 1 instead of 3,"},{"Start":"03:49.910 ","End":"03:51.860","Text":"it could be 2 or 1."},{"Start":"03:51.860 ","End":"03:53.640","Text":"The 1 has to be,"},{"Start":"03:53.640 ","End":"03:55.200","Text":"so this part\u0027s fixed,"},{"Start":"03:55.200 ","End":"03:57.600","Text":"then we have 6 possibilities."},{"Start":"03:57.600 ","End":"03:58.905","Text":"We could have 1 or 2,"},{"Start":"03:58.905 ","End":"04:00.960","Text":"and here 1,2 or 3,"},{"Start":"04:00.960 ","End":"04:03.125","Text":"2 times 3 is 6 possibilities."},{"Start":"04:03.125 ","End":"04:04.520","Text":"I\u0027ve listed them all,"},{"Start":"04:04.520 ","End":"04:08.875","Text":"and we list them in order of increasing 
degree."},{"Start":"04:08.875 ","End":"04:11.850","Text":"We go from degree 4 up to degree 7."},{"Start":"04:11.850 ","End":"04:13.860","Text":"Some of them are a tie."},{"Start":"04:13.860 ","End":"04:19.310","Text":"What we do, is we substitute the original matrix"},{"Start":"04:19.310 ","End":"04:24.440","Text":"that we had in each of these polynomials successively here,"},{"Start":"04:24.440 ","End":"04:32.030","Text":"then here, then here, and so on from the lowest degree until you get a 0 as the result,"},{"Start":"04:32.030 ","End":"04:35.855","Text":"the 0 matrix, and then you stop."},{"Start":"04:35.855 ","End":"04:38.075","Text":"The place you stopped,"},{"Start":"04:38.075 ","End":"04:42.735","Text":"the way you got the 0, that 1 would be the minimal polynomial."},{"Start":"04:42.735 ","End":"04:47.525","Text":"Okay, so that\u0027s the method let\u0027s get back to our example."},{"Start":"04:47.525 ","End":"04:51.665","Text":"I\u0027ll just remind you this was our original matrix"},{"Start":"04:51.665 ","End":"04:56.285","Text":"and the characteristic polynomial was this."},{"Start":"04:56.285 ","End":"05:00.890","Text":"Now in our case, we only get 2 possibilities because to the power of 1,"},{"Start":"05:00.890 ","End":"05:02.150","Text":"this has to stay."},{"Start":"05:02.150 ","End":"05:07.280","Text":"All we could possibly have is this could either be 2 or 1,"},{"Start":"05:09.050 ","End":"05:13.580","Text":"because of the theorem that we had."},{"Start":"05:13.580 ","End":"05:17.600","Text":"We either have minimal polynomial is this or this."},{"Start":"05:17.600 ","End":"05:23.155","Text":"The last one in each case is always the characteristic polynomial itself."},{"Start":"05:23.155 ","End":"05:29.425","Text":"Let\u0027s substitute in each one of these our matrix A."},{"Start":"05:29.425 ","End":"05:31.310","Text":"Remember we start with the lowest,"},{"Start":"05:31.310 ","End":"05:36.810","Text":"so we first of all check to see if we put A in here."},{"Start":"05:36.860 ","End":"05:40.745","Text":"Remember that constant is I,"},{"Start":"05:40.745 ","End":"05:43.460","Text":"like minus 2 we write minus 2I."},{"Start":"05:43.460 ","End":"05:48.110","Text":"That\u0027s how we work when we put a matrix into a polynomial."},{"Start":"05:48.110 ","End":"05:50.825","Text":"We don\u0027t have just numbers."},{"Start":"05:50.825 ","End":"05:53.210","Text":"Okay, I\u0027m not going to scroll back,"},{"Start":"05:53.210 ","End":"05:59.750","Text":"but if you look at A and if you take away ones from the diagonal, we get this."},{"Start":"05:59.750 ","End":"06:05.860","Text":"If you were to take 2 away from each of the diagonals, we get this."},{"Start":"06:05.860 ","End":"06:09.405","Text":"Then we multiply them together."},{"Start":"06:09.405 ","End":"06:12.320","Text":"They started doing the multiplication."},{"Start":"06:12.320 ","End":"06:15.049","Text":"But then I came to this,"},{"Start":"06:15.049 ","End":"06:19.330","Text":"times this, to get this entry."},{"Start":"06:19.850 ","End":"06:23.120","Text":"You get 0 times 1, 1 times minus 1,"},{"Start":"06:23.120 ","End":"06:26.750","Text":"0 times 0, it comes out to be minus 1, which is not 0."},{"Start":"06:26.750 ","End":"06:30.530","Text":"At that point, obviously you stop doing"},{"Start":"06:30.530 ","End":"06:34.520","Text":"the computation because as soon as an entry is non-zero,"},{"Start":"06:34.520 ","End":"06:42.450","Text":"you\u0027re not going to get the 0 matrix and so this one\u0027s ruled out so we go to 
the next 1."},{"Start":"06:42.450 ","End":"06:45.360","Text":"It has to be the next 1 because we know it\u0027s one of these two and"},{"Start":"06:45.360 ","End":"06:49.550","Text":"that last one is the same as the characteristic polynomial."},{"Start":"06:49.550 ","End":"06:55.090","Text":"In this case, the characteristic polynomial is also the minimal polynomial."},{"Start":"06:55.090 ","End":"07:01.475","Text":"The last part asks if A is invertible and if so,"},{"Start":"07:01.475 ","End":"07:04.655","Text":"then I should have given a hint,"},{"Start":"07:04.655 ","End":"07:11.890","Text":"use Cayley-Hamilton theorem to find A inverse in terms of A and I."},{"Start":"07:11.890 ","End":"07:14.000","Text":"There\u0027s a theorem that helps us."},{"Start":"07:14.000 ","End":"07:19.775","Text":"It says that a matrix is invertible if and only if all its eigenvalues are non-zero."},{"Start":"07:19.775 ","End":"07:28.990","Text":"In this case, we had 1 and 2 and they\u0027re both non-zero so our matrix is invertible."},{"Start":"07:29.030 ","End":"07:35.420","Text":"The way to compute A inverse is to use the Cayley-Hamilton theorem."},{"Start":"07:35.420 ","End":"07:38.290","Text":"By doing a bit of manipulation on the equation,"},{"Start":"07:38.290 ","End":"07:42.055","Text":"we can easily get to the form A times something equals I,"},{"Start":"07:42.055 ","End":"07:46.280","Text":"and then that something will be a inverse."},{"Start":"07:46.550 ","End":"07:50.875","Text":"Recall that our characteristic polynomial was this"},{"Start":"07:50.875 ","End":"07:55.850","Text":"and the Cayley-Hamilton theorem says that p of A is 0."},{"Start":"07:55.850 ","End":"08:03.020","Text":"If I substitute, remember 1 becomes I and 2 becomes 2I, we get this."},{"Start":"08:03.020 ","End":"08:05.380","Text":"Now we\u0027re going to expand all the brackets."},{"Start":"08:05.380 ","End":"08:09.500","Text":"First of all, I\u0027ll do the A minus I squared"},{"Start":"08:09.500 ","End":"08:16.040","Text":"and its just like special binomial expansion in algebra."},{"Start":"08:16.220 ","End":"08:20.100","Text":"Its A squared minus twice A times I,"},{"Start":"08:20.100 ","End":"08:22.880","Text":"which is just 2A plus I squared,"},{"Start":"08:22.880 ","End":"08:26.244","Text":"which is just I times this."},{"Start":"08:26.244 ","End":"08:28.590","Text":"Now, do the multiplication,"},{"Start":"08:28.590 ","End":"08:31.310","Text":"we\u0027ll take this A and multiply it on the right"},{"Start":"08:31.310 ","End":"08:34.265","Text":"by each of these and we get the first 3 terms."},{"Start":"08:34.265 ","End":"08:39.715","Text":"Then the minus 2I times each of these gives us this."},{"Start":"08:39.715 ","End":"08:42.350","Text":"Collect like terms."},{"Start":"08:42.350 ","End":"08:46.475","Text":"I might just mention that there is a variation."},{"Start":"08:46.475 ","End":"08:52.550","Text":"You could just have expanded p of x and got it as a cubic polynomial."},{"Start":"08:52.550 ","End":"08:58.295","Text":"Presumably you would have got x cubed minus 4x squared plus 5x minus 2 is 0."},{"Start":"08:58.295 ","End":"09:03.345","Text":"Then you could have replaced A at that stage that it doesn\u0027t really matter."},{"Start":"09:03.345 ","End":"09:07.250","Text":"Now here\u0027s the thing. 
I take the 2I to"},{"Start":"09:07.250 ","End":"09:10.870","Text":"the other side like the constant part of the part without A,"},{"Start":"09:10.870 ","End":"09:12.700","Text":"I throw it over to the right."},{"Start":"09:12.700 ","End":"09:17.525","Text":"Then, for the rest of it I take A outside the brackets."},{"Start":"09:17.525 ","End":"09:21.085","Text":"Can you see how this is going to help us?"},{"Start":"09:21.085 ","End":"09:25.590","Text":"Okay, divide by 2 and put the 2 in this part here."},{"Start":"09:25.590 ","End":"09:30.605","Text":"We have A times something is the identity matrix,"},{"Start":"09:30.605 ","End":"09:35.755","Text":"so that something is precisely the inverse matrix"},{"Start":"09:35.755 ","End":"09:41.170","Text":"to A and so this is the expression for a inverse."},{"Start":"09:41.170 ","End":"09:43.310","Text":"If you have some extra time,"},{"Start":"09:43.310 ","End":"09:49.020","Text":"you could actually compute this A squared minus 4A plus 5I divide by"},{"Start":"09:49.020 ","End":"09:51.920","Text":"2 and check that it really is the inverse by"},{"Start":"09:51.920 ","End":"09:55.160","Text":"multiplying them and see you get the identity."},{"Start":"09:55.160 ","End":"09:57.200","Text":"That\u0027s if you have nothing better to do."},{"Start":"09:57.200 ","End":"10:03.120","Text":"Anyway, we\u0027ve finished with this exercise. That\u0027s it."}],"ID":25753},{"Watched":false,"Name":"Exercise 3 parts a-f","Duration":"15m 25s","ChapterTopicVideoID":24841,"CourseChapterTopicPlaylistID":118359,"HasSubtitles":true,"ThumbnailPath":null,"UploadDate":null,"DurationForVideoObject":null,"Description":null,"MetaTitle":null,"MetaDescription":null,"Canonical":null,"VideoComments":[],"Subtitles":[{"Start":"00:00.000 ","End":"00:04.980","Text":"In this exercise, we\u0027re given a matrix A,"},{"Start":"00:04.980 ","End":"00:09.420","Text":"which is 3 by 3 over the real numbers."},{"Start":"00:09.420 ","End":"00:15.495","Text":"I could say that it belongs to M_3 over the reals."},{"Start":"00:15.495 ","End":"00:19.080","Text":"There are 6 parts to this question."},{"Start":"00:19.080 ","End":"00:21.825","Text":"I\u0027ll let you read them on your own."},{"Start":"00:21.825 ","End":"00:24.610","Text":"I\u0027ll just take them 1 at a time."},{"Start":"00:24.610 ","End":"00:28.285","Text":"A is to find the characteristic matrix."},{"Start":"00:28.285 ","End":"00:32.915","Text":"It\u0027s just a matter of applying the definition of a characteristic matrix,"},{"Start":"00:32.915 ","End":"00:39.650","Text":"which is x times the identity matrix minus A."},{"Start":"00:39.650 ","End":"00:43.505","Text":"Now, the identity matrix is ones along the diagonal."},{"Start":"00:43.505 ","End":"00:45.170","Text":"If I multiply that by x,"},{"Start":"00:45.170 ","End":"00:47.530","Text":"we\u0027ll get x is along the diagonal."},{"Start":"00:47.530 ","End":"00:50.765","Text":"Here\u0027s x times identity."},{"Start":"00:50.765 ","End":"00:53.150","Text":"Here\u0027s our matrix A,"},{"Start":"00:53.150 ","End":"00:55.300","Text":"just subtract them,"},{"Start":"00:55.300 ","End":"00:58.620","Text":"x minus 1 is x minus 1,"},{"Start":"00:58.620 ","End":"01:02.730","Text":"and so on for the rest and this is basically it."},{"Start":"01:02.730 ","End":"01:10.220","Text":"But I would like to note that some books will"},{"Start":"01:10.220 ","End":"01:17.300","Text":"write the Greek letter Lambda instead of x. 
I\u0027ve seen it frequently,"},{"Start":"01:17.300 ","End":"01:19.485","Text":"so you should be aware of that."},{"Start":"01:19.485 ","End":"01:23.255","Text":"Also, some do it the other way around."},{"Start":"01:23.255 ","End":"01:26.930","Text":"A minus x times I,"},{"Start":"01:26.930 ","End":"01:29.390","Text":"or A minus Lambda times I."},{"Start":"01:29.390 ","End":"01:35.315","Text":"That\u0027s also possible and it\u0027s all essentially the same in the end."},{"Start":"01:35.315 ","End":"01:42.575","Text":"Now the characteristic polynomial is just the determinant of the characteristic matrix."},{"Start":"01:42.575 ","End":"01:45.300","Text":"All we have to do is to,"},{"Start":"01:45.300 ","End":"01:47.090","Text":"let\u0027s just see it here,"},{"Start":"01:47.090 ","End":"01:50.840","Text":"is to take the determinant of this,"},{"Start":"01:50.840 ","End":"01:55.865","Text":"which just means replacing the brackets by bars and"},{"Start":"01:55.865 ","End":"02:02.074","Text":"the determinant we can compute by expanding along."},{"Start":"02:02.074 ","End":"02:05.990","Text":"We look for a row or a column with a lot of 0s."},{"Start":"02:05.990 ","End":"02:11.170","Text":"Let\u0027s say the middle row."},{"Start":"02:11.170 ","End":"02:14.145","Text":"Maybe I\u0027ll highlight it this way."},{"Start":"02:14.145 ","End":"02:16.520","Text":"The 0s don\u0027t contribute anything,"},{"Start":"02:16.520 ","End":"02:18.215","Text":"so it\u0027s just this."},{"Start":"02:18.215 ","End":"02:23.420","Text":"Then we erase the row and"},{"Start":"02:23.420 ","End":"02:29.700","Text":"the column that this belongs to and take the determinant of what\u0027s left."},{"Start":"02:30.500 ","End":"02:32.960","Text":"There\u0027s also a matter of a sign."},{"Start":"02:32.960 ","End":"02:34.880","Text":"Remember the checkerboard plus, minus,"},{"Start":"02:34.880 ","End":"02:38.110","Text":"plus, minus, plus, this is a plus."},{"Start":"02:38.110 ","End":"02:43.880","Text":"We get the plus times the x minus 1 times the determinant of what\u0027s left."},{"Start":"02:43.880 ","End":"02:49.770","Text":"It\u0027s a 2 by 2, so it\u0027s this diagonal x minus 1 squared minus the other diagonal,"},{"Start":"02:49.770 ","End":"02:50.850","Text":"it\u0027s minus, minus,"},{"Start":"02:50.850 ","End":"02:53.625","Text":"minus 1, which is minus 1."},{"Start":"02:53.625 ","End":"02:56.960","Text":"This 1 comes out to be x squared minus 2x plus 1 minus"},{"Start":"02:56.960 ","End":"03:00.790","Text":"1 so it\u0027s just x squared minus 2x."},{"Start":"03:00.790 ","End":"03:06.050","Text":"Let\u0027s take x out of here and bring it right up front so we get x, x minus 1,"},{"Start":"03:06.050 ","End":"03:09.620","Text":"x minus 2, and put a nice box around it,"},{"Start":"03:09.620 ","End":"03:13.070","Text":"and that is the characteristic polynomial."},{"Start":"03:13.070 ","End":"03:19.670","Text":"The next section is eigenvalues and algebraic multiplicity for each eigenvalue."},{"Start":"03:19.670 ","End":"03:22.225","Text":"First of all, the eigenvalues."},{"Start":"03:22.225 ","End":"03:29.230","Text":"The eigenvalues are defined to be the roots of the characteristic polynomial."},{"Start":"03:29.230 ","End":"03:34.935","Text":"The roots means that we set it to zero and solve it."},{"Start":"03:34.935 ","End":"03:38.530","Text":"By the way, there\u0027s a name for this equation."},{"Start":"03:38.530 ","End":"03:42.520","Text":"This is called the characteristic equation."},{"Start":"03:42.520 
","End":"03:46.415","Text":"Its solutions are the eigenvalues."},{"Start":"03:46.415 ","End":"03:55.370","Text":"Solutions to this equation is the same thing as the roots of p. In our case,"},{"Start":"03:55.370 ","End":"03:58.100","Text":"what we had was x,"},{"Start":"03:58.100 ","End":"04:00.380","Text":"x minus 1, x minus 2."},{"Start":"04:00.380 ","End":"04:02.755","Text":"Setting this equal to 0,"},{"Start":"04:02.755 ","End":"04:05.520","Text":"we get 3 eigenvalues,"},{"Start":"04:05.520 ","End":"04:08.040","Text":"0 from here, 1 from here,"},{"Start":"04:08.040 ","End":"04:10.210","Text":"and 2 from here."},{"Start":"04:11.660 ","End":"04:18.485","Text":"This brings us to the other part of this section, the multiplicity."},{"Start":"04:18.485 ","End":"04:25.850","Text":"The algebraic multiplicity of an eigenvalue is defined as follows."},{"Start":"04:25.850 ","End":"04:34.090","Text":"That if the characteristic polynomial contains the factor of the form x minus a^k,"},{"Start":"04:34.090 ","End":"04:39.920","Text":"a is the eigenvalue and k is its algebraic multiplicity."},{"Start":"04:39.920 ","End":"04:42.139","Text":"Now in our case,"},{"Start":"04:42.139 ","End":"04:45.500","Text":"perhaps I better scroll back so we can see it."},{"Start":"04:45.500 ","End":"04:47.930","Text":"Yeah, there it is."},{"Start":"04:47.930 ","End":"04:55.505","Text":"We can write this as x minus 0^1,"},{"Start":"04:55.505 ","End":"05:03.130","Text":"x minus 1^1, x minus 2^1."},{"Start":"05:03.130 ","End":"05:05.310","Text":"Each of the eigenvalues 0, 1,"},{"Start":"05:05.310 ","End":"05:10.490","Text":"and 2 they each have a multiplicity of 1 by this definition."},{"Start":"05:10.490 ","End":"05:14.980","Text":"Yeah, the algebraic multiplicity is 1 for each of them."},{"Start":"05:14.980 ","End":"05:19.070","Text":"Now there\u0027s another kind of multiplicity besides the algebraic,"},{"Start":"05:19.070 ","End":"05:23.390","Text":"and it\u0027s called the geometric multiplicity."},{"Start":"05:23.390 ","End":"05:26.495","Text":"I don\u0027t know why it\u0027s called geometric."},{"Start":"05:26.495 ","End":"05:34.990","Text":"But to find it, we first have to find the eigenspace of each eigenvalue."},{"Start":"05:34.990 ","End":"05:37.740","Text":"This is defined as follows,"},{"Start":"05:37.740 ","End":"05:43.260","Text":"we substitute the eigenvalue into the characteristic matrix and"},{"Start":"05:43.260 ","End":"05:49.850","Text":"then we solve the system of linear equations that corresponds to that matrix."},{"Start":"05:49.850 ","End":"05:54.980","Text":"The solution space is called the eigenspace and"},{"Start":"05:54.980 ","End":"06:01.205","Text":"the dimension of that is called the geometric multiplicity of that eigenvalue."},{"Start":"06:01.205 ","End":"06:04.145","Text":"We\u0027ll illustrate this in a moment."},{"Start":"06:04.145 ","End":"06:07.575","Text":"Remember in our case we have 3 eigenvalues 0, 1,"},{"Start":"06:07.575 ","End":"06:11.375","Text":"and 2, so we have to do some work on each 1 of them."},{"Start":"06:11.375 ","End":"06:14.750","Text":"Again, we\u0027ll start with the eigenvalue 0."},{"Start":"06:14.750 ","End":"06:21.850","Text":"We just take this characteristic matrix and substitute the value 0 in it."},{"Start":"06:21.850 ","End":"06:26.925","Text":"The matrix we get is this."},{"Start":"06:26.925 ","End":"06:32.210","Text":"The system of linear equations that corresponds to this in 3 variables,"},{"Start":"06:32.210 ","End":"06:34.790","Text":"x, y, z, is 
this."},{"Start":"06:34.790 ","End":"06:39.780","Text":"This x has no relationship to this x, just pointing out."},{"Start":"06:41.420 ","End":"06:46.280","Text":"Now we do row operations on this to bring to echelon form,"},{"Start":"06:46.280 ","End":"06:48.860","Text":"subtract the first row from"},{"Start":"06:48.860 ","End":"06:55.460","Text":"the third row to get a 0 here and the whole row comes out to be 0."},{"Start":"06:55.460 ","End":"06:58.715","Text":"This is the system we get."},{"Start":"06:58.715 ","End":"07:04.520","Text":"Note that z is the free variable."},{"Start":"07:04.520 ","End":"07:06.210","Text":"We can let it be whatever we want,"},{"Start":"07:06.210 ","End":"07:13.200","Text":"and then x and y depend on whatever value of z it is, the constrained."},{"Start":"07:13.260 ","End":"07:17.695","Text":"We have our technique called the wondering 1s,"},{"Start":"07:17.695 ","End":"07:23.920","Text":"and we just let z equal 1 and this will give us a basis."},{"Start":"07:23.920 ","End":"07:26.470","Text":"If z is 1,"},{"Start":"07:26.470 ","End":"07:33.910","Text":"then x is going to be minus 1 and y regardless of z is 0."},{"Start":"07:33.910 ","End":"07:37.765","Text":"Now we take these in order x, y, and z,"},{"Start":"07:37.765 ","End":"07:41.980","Text":"and that will give us the vector minus 1,"},{"Start":"07:41.980 ","End":"07:46.330","Text":"0, 1, which is a basis for the eigenspace."},{"Start":"07:46.330 ","End":"07:50.230","Text":"In other words, the eigenspace is the span of this vector."},{"Start":"07:50.230 ","End":"07:53.365","Text":"That\u0027s the solution space of the system,"},{"Start":"07:53.365 ","End":"07:55.045","Text":"and we write it like this."},{"Start":"07:55.045 ","End":"07:58.705","Text":"The eigenspace corresponding to x equals 0 is"},{"Start":"07:58.705 ","End":"08:04.570","Text":"this and because the basis only has 1 element in it,"},{"Start":"08:04.570 ","End":"08:09.550","Text":"1 vector than the geometric multiplicity is 1."},{"Start":"08:09.550 ","End":"08:11.950","Text":"Onto the next eigenvalue,"},{"Start":"08:11.950 ","End":"08:13.105","Text":"which is 1,"},{"Start":"08:13.105 ","End":"08:16.510","Text":"we substitute 1 for x in"},{"Start":"08:16.510 ","End":"08:23.275","Text":"the characteristic matrix and this makes the whole diagonal come out to be 0."},{"Start":"08:23.275 ","End":"08:25.120","Text":"This is the matrix we get,"},{"Start":"08:25.120 ","End":"08:30.084","Text":"and the corresponding system of linear equations is this."},{"Start":"08:30.084 ","End":"08:34.900","Text":"In this case, we don\u0027t actually see y but y would be"},{"Start":"08:34.900 ","End":"08:43.975","Text":"the free variable and x and z are dependent or constrained,"},{"Start":"08:43.975 ","End":"08:51.790","Text":"so we use a technique called The Wandering 1s to find a basis."},{"Start":"08:51.790 ","End":"08:54.700","Text":"We let the free variable be 1,"},{"Start":"08:54.700 ","End":"08:58.030","Text":"and then we can figure out the rest."},{"Start":"08:58.030 ","End":"09:00.820","Text":"It actually doesn\u0027t even matter that y is 1."},{"Start":"09:00.820 ","End":"09:05.245","Text":"Whatever you do, x and z are going to be 0,"},{"Start":"09:05.245 ","End":"09:07.090","Text":"and we take them in order."},{"Start":"09:07.090 ","End":"09:08.800","Text":"First x, then y, then z."},{"Start":"09:08.800 ","End":"09:15.085","Text":"That gives us 0, 1, 0 and this will be a basis for the solution space."},{"Start":"09:15.085 
","End":"09:20.215","Text":"The solution space is the span of this vector"},{"Start":"09:20.215 ","End":"09:27.385","Text":"and this is the eigenspace for the eigenvalue 1,"},{"Start":"09:27.385 ","End":"09:31.900","Text":"and the geometric multiplicity is 1 because there\u0027s only 1 vector in"},{"Start":"09:31.900 ","End":"09:37.150","Text":"the basis and onto the third and last eigenvalue 2."},{"Start":"09:37.150 ","End":"09:40.795","Text":"We substitute 2 in the characteristic matrix,"},{"Start":"09:40.795 ","End":"09:48.130","Text":"and we get this matrix and its corresponding system of linear equations is this."},{"Start":"09:48.130 ","End":"09:53.185","Text":"We bring the matrix to row echelon form by adding the first to the last,"},{"Start":"09:53.185 ","End":"09:54.730","Text":"and then we can cross this out."},{"Start":"09:54.730 ","End":"09:56.170","Text":"It\u0027s just zeros,"},{"Start":"09:56.170 ","End":"10:01.780","Text":"and so we get this system of equations,"},{"Start":"10:01.780 ","End":"10:08.530","Text":"and here z is the free variable and x and y depend on it."},{"Start":"10:08.530 ","End":"10:11.170","Text":"As before, we want to find a basis,"},{"Start":"10:11.170 ","End":"10:15.790","Text":"so we let the free variable be 1 and then figure out"},{"Start":"10:15.790 ","End":"10:21.835","Text":"the other variables y is 0 and from here if z is 1 then x is 1,"},{"Start":"10:21.835 ","End":"10:23.950","Text":"I want to put them in order, x, y,"},{"Start":"10:23.950 ","End":"10:26.650","Text":"z, which is 1, 0,"},{"Start":"10:26.650 ","End":"10:30.880","Text":"1 so this set is the basis for"},{"Start":"10:30.880 ","End":"10:36.320","Text":"the solution space and the solution space is the span of that."},{"Start":"10:36.840 ","End":"10:40.960","Text":"The basis only has 1 vector in it,"},{"Start":"10:40.960 ","End":"10:45.580","Text":"so the geometric multiplicity is 1."},{"Start":"10:45.580 ","End":"10:49.675","Text":"Yeah, and the solution space is called the eigenspace,"},{"Start":"10:49.675 ","End":"10:53.140","Text":"I should say, of the eigenvalue 2."},{"Start":"10:53.140 ","End":"10:56.860","Text":"That\u0027s the third of them out of the 3."},{"Start":"10:56.860 ","End":"11:00.895","Text":"Onto the next section, eigenvectors."},{"Start":"11:00.895 ","End":"11:05.365","Text":"Now, the eigenvectors of"},{"Start":"11:05.365 ","End":"11:13.690","Text":"each eigenvalue we get them by taking a basis for the corresponding eigenspace,"},{"Start":"11:13.690 ","End":"11:22.660","Text":"so each eigenvalue contributes a certain number of eigenvectors and to be pedantic,"},{"Start":"11:22.660 ","End":"11:27.205","Text":"I just want to point out that there\u0027s more than 1 way possibly of choosing a basis,"},{"Start":"11:27.205 ","End":"11:30.920","Text":"it doesn\u0027t matter, but you stick with your choice."},{"Start":"11:30.920 ","End":"11:35.370","Text":"For example, for the eigenvalue 0,"},{"Start":"11:35.370 ","End":"11:42.165","Text":"we can take the eigenvector as minus 1, 0, 1."},{"Start":"11:42.165 ","End":"11:48.750","Text":"There was only 1 eigenvector because the basis for this is just the set containing"},{"Start":"11:48.750 ","End":"11:55.665","Text":"that 1 vector and similarly for the eigenvalues 1 and 2,"},{"Start":"11:55.665 ","End":"12:03.450","Text":"we get this eigenvector for this eigenvalue and this eigenvector for this eigenvalue."},{"Start":"12:03.450 ","End":"12:08.010","Text":"That is, I\u0027ve used a little v for the eigenvectors and a capital 
V"},{"Start":"12:08.010 ","End":"12:12.550","Text":"for the eigenspace and if I take all the eigenvectors together,"},{"Start":"12:12.550 ","End":"12:15.980","Text":"we have 3 eigenvectors."},{"Start":"12:16.920 ","End":"12:19.075","Text":"It could have been,"},{"Start":"12:19.075 ","End":"12:22.900","Text":"we get more than 1 eigenvector for a given eigenvalue,"},{"Start":"12:22.900 ","End":"12:24.310","Text":"but it didn\u0027t happen here,"},{"Start":"12:24.310 ","End":"12:28.195","Text":"in a future exercise I believe we will see such a case."},{"Start":"12:28.195 ","End":"12:32.635","Text":"Next the final part,"},{"Start":"12:32.635 ","End":"12:38.050","Text":"f is the matrix a diagonalizable?"},{"Start":"12:38.050 ","End":"12:40.990","Text":"Hard word to pronounce."},{"Start":"12:40.990 ","End":"12:44.005","Text":"Now I\u0027ll remind you what this means."},{"Start":"12:44.005 ","End":"12:50.980","Text":"The diagonalizable, it means there exists some invertible matrix p such"},{"Start":"12:50.980 ","End":"13:00.355","Text":"that a times p is p times d. I\u0027ll tell you what d is in a moment or equivalently,"},{"Start":"13:00.355 ","End":"13:06.910","Text":"I actually like the equivalent version better that p minus 1ap is d, p is invertible,"},{"Start":"13:06.910 ","End":"13:13.255","Text":"so p minus 1 makes sense and d stands for some diagonal matrix."},{"Start":"13:13.255 ","End":"13:15.085","Text":"In other words, if we can find such a p,"},{"Start":"13:15.085 ","End":"13:17.500","Text":"that when we do compute p minus 1ap,"},{"Start":"13:17.500 ","End":"13:23.335","Text":"we get a diagonal matrix then a is diagonalizable."},{"Start":"13:23.335 ","End":"13:28.420","Text":"There\u0027s a theorem that\u0027s going to help us out here to decide whether it"},{"Start":"13:28.420 ","End":"13:32.589","Text":"is or it isn\u0027t and the theorem states that in general,"},{"Start":"13:32.589 ","End":"13:34.030","Text":"an n by n matrix,"},{"Start":"13:34.030 ","End":"13:36.250","Text":"in our case it\u0027s a 3 by 3 matrix,"},{"Start":"13:36.250 ","End":"13:43.150","Text":"is diagonalizable if and only if it has n linearly independent eigenvectors."},{"Start":"13:43.150 ","End":"13:46.660","Text":"Now in our case, we have a 3 by 3 matrix and"},{"Start":"13:46.660 ","End":"13:50.665","Text":"there are 3 linearly independent eigenvectors."},{"Start":"13:50.665 ","End":"13:57.400","Text":"We saw 3 eigenvectors it remains for me to answer why they are"},{"Start":"13:57.400 ","End":"14:04.015","Text":"linearly independent and hence A is diagonalizable."},{"Start":"14:04.015 ","End":"14:06.759","Text":"Now I\u0027ll come to answer that question."},{"Start":"14:06.759 ","End":"14:10.240","Text":"Why? 
Well, once again,"},{"Start":"14:10.240 ","End":"14:12.775","Text":"I need to pull a theorem out of the hat."},{"Start":"14:12.775 ","End":"14:17.620","Text":"There is also a theorem that says that eigenvectors"},{"Start":"14:17.620 ","End":"14:22.750","Text":"belonging to different eigenvalues are linearly independent and this was our case."},{"Start":"14:22.750 ","End":"14:24.955","Text":"Remember we had 0,"},{"Start":"14:24.955 ","End":"14:30.140","Text":"1 and 2 and each 1 of them"},{"Start":"14:30.180 ","End":"14:38.205","Text":"had a different eigenvector so different eigenvalues they are linearly independent."},{"Start":"14:38.205 ","End":"14:40.550","Text":"If you\u0027re not totally happy with that,"},{"Start":"14:40.550 ","End":"14:42.290","Text":"that I pull out 2 theorems,"},{"Start":"14:42.290 ","End":"14:44.660","Text":"I can do it with just 1 theorem."},{"Start":"14:44.660 ","End":"14:48.920","Text":"There\u0027s also a theorem that says that an n by n matrix is"},{"Start":"14:48.920 ","End":"14:54.320","Text":"diagonalizable if and only if for each eigenvalue,"},{"Start":"14:54.320 ","End":"14:56.900","Text":"the 2 multiplicities are the same,"},{"Start":"14:56.900 ","End":"15:01.370","Text":"the geometric and the algebraic and in our case,"},{"Start":"15:01.370 ","End":"15:05.569","Text":"remember all the geometric multiplicities and all the algebraic multiplicities,"},{"Start":"15:05.569 ","End":"15:09.940","Text":"they all came out to be 1 and so 1 equals 1,"},{"Start":"15:09.940 ","End":"15:11.930","Text":"3 times and yeah,"},{"Start":"15:11.930 ","End":"15:15.885","Text":"so once again, we see that A is diagonalizable."},{"Start":"15:15.885 ","End":"15:19.040","Text":"I\u0027ll be very surprised if we got different answers using"},{"Start":"15:19.040 ","End":"15:25.170","Text":"the 2 different methods and that concludes this clip"}],"ID":25754},{"Watched":false,"Name":"Exercise 3 parts g-h","Duration":"8m 6s","ChapterTopicVideoID":24842,"CourseChapterTopicPlaylistID":118359,"HasSubtitles":true,"ThumbnailPath":null,"UploadDate":null,"DurationForVideoObject":null,"Description":null,"MetaTitle":null,"MetaDescription":null,"Canonical":null,"VideoComments":[],"Subtitles":[{"Start":"00:00.000 ","End":"00:02.865","Text":"This exercise is a continuation."},{"Start":"00:02.865 ","End":"00:11.650","Text":"In the previous clip we did parts A through F and now we have 4 more parts, G, H, I, J."},{"Start":"00:12.370 ","End":"00:16.820","Text":"The first part here is to diagonalize A."},{"Start":"00:16.820 ","End":"00:18.350","Text":"In the previous clip,"},{"Start":"00:18.350 ","End":"00:22.460","Text":"we showed that A is diagonalizable,"},{"Start":"00:22.460 ","End":"00:26.110","Text":"but how do we actually diagonalize it?"},{"Start":"00:26.110 ","End":"00:30.920","Text":"Now, if we go to the definition of diagonalizable,"},{"Start":"00:30.920 ","End":"00:38.180","Text":"what it means is that there\u0027s a matrix P. There were 2 versions,"},{"Start":"00:38.180 ","End":"00:39.920","Text":"this is 1 of the versions we had."},{"Start":"00:39.920 ","End":"00:42.905","Text":"It was that P minus 1 AP is equal to D,"},{"Start":"00:42.905 ","End":"00:46.280","Text":"and D is a diagonal matrix."},{"Start":"00:46.280 ","End":"00:50.820","Text":"Really, we need to find both of them."},{"Start":"00:50.820 ","End":"00:58.520","Text":"We have to find P and we have to find D. 
Start with D,"},{"Start":"00:58.520 ","End":"01:02.590","Text":"that\u0027s simpler, and what it is,"},{"Start":"01:02.590 ","End":"01:05.970","Text":"is just to take the eigenvalues we found."},{"Start":"01:05.970 ","End":"01:07.650","Text":"Remember we found 0,"},{"Start":"01:07.650 ","End":"01:10.110","Text":"1, and 2 were the eigenvalues."},{"Start":"01:10.110 ","End":"01:11.340","Text":"We put 0, 1,"},{"Start":"01:11.340 ","End":"01:13.185","Text":"and 2 along the diagonal,"},{"Start":"01:13.185 ","End":"01:15.375","Text":"and everywhere else 0."},{"Start":"01:15.375 ","End":"01:17.655","Text":"That\u0027s the D part."},{"Start":"01:17.655 ","End":"01:20.680","Text":"Next, the P part."},{"Start":"01:20.680 ","End":"01:27.665","Text":"What we do is we build a matrix where the columns are the eigenvectors."},{"Start":"01:27.665 ","End":"01:30.170","Text":"We found an eigenvector for each,"},{"Start":"01:30.170 ","End":"01:31.505","Text":"there was just 1 for each,"},{"Start":"01:31.505 ","End":"01:35.470","Text":"but we have to keep them in the right order."},{"Start":"01:35.470 ","End":"01:39.570","Text":"First we found that minus 1, 0,"},{"Start":"01:39.570 ","End":"01:44.505","Text":"1 was an eigenvector for the eigenvalue 0."},{"Start":"01:44.505 ","End":"01:46.710","Text":"For 1, we had this,"},{"Start":"01:46.710 ","End":"01:47.865","Text":"and I\u0027ve color coded it."},{"Start":"01:47.865 ","End":"01:50.210","Text":"You can see the blue with the blue, the red with the red,"},{"Start":"01:50.210 ","End":"01:54.165","Text":"green with the green. That\u0027s basically it."},{"Start":"01:54.165 ","End":"01:57.140","Text":"Now, ideally, we should really verify it and actually"},{"Start":"01:57.140 ","End":"02:02.275","Text":"compute P minus 1 AP and see that it is equal to D,"},{"Start":"02:02.275 ","End":"02:05.690","Text":"but I\u0027m not going to waste time doing that."},{"Start":"02:05.690 ","End":"02:08.785","Text":"If you like, it\u0027s good exercise."},{"Start":"02:08.785 ","End":"02:11.190","Text":"Now, we come to Section H,"},{"Start":"02:11.190 ","End":"02:19.655","Text":"in which I\u0027ll show you 1 of the applications of diagonalizing a matrix."},{"Start":"02:19.655 ","End":"02:22.800","Text":"If A is diagonalizable,"},{"Start":"02:22.800 ","End":"02:25.310","Text":"then we can compute it to any power."},{"Start":"02:25.310 ","End":"02:27.950","Text":"I took, for example, a large number,"},{"Start":"02:27.950 ","End":"02:32.900","Text":"2,017, I just chose the year I wrote this in."},{"Start":"02:32.900 ","End":"02:35.510","Text":"Let\u0027s see how we do that."},{"Start":"02:35.510 ","End":"02:42.630","Text":"This is the formula and it\u0027s not hard to explain why this formula is so,"},{"Start":"02:42.630 ","End":"02:49.010","Text":"but you just take it as a formula where P and D,"},{"Start":"02:49.010 ","End":"02:53.110","Text":"are the P and D we found in the previous section."},{"Start":"02:53.110 ","End":"02:58.065","Text":"For example, if n is 2,017,"},{"Start":"02:58.065 ","End":"03:00.510","Text":"then we get this."},{"Start":"03:00.510 ","End":"03:04.875","Text":"Now, we know what P is and we know what D is,"},{"Start":"03:04.875 ","End":"03:07.920","Text":"we don\u0027t know what P minus 1 is."},{"Start":"03:07.920 ","End":"03:10.730","Text":"I\u0027m going to tell you what P minus 1 is,"},{"Start":"03:10.730 ","End":"03:13.235","Text":"but I don\u0027t expect you to take it on trust."},{"Start":"03:13.235 ","End":"03:16.415","Text":"I will do the 
computation later."},{"Start":"03:16.415 ","End":"03:21.305","Text":"I just don\u0027t want to interrupt the flow with this technical computation."},{"Start":"03:21.305 ","End":"03:27.250","Text":"At the end I\u0027ll show you how I got to this."},{"Start":"03:27.250 ","End":"03:30.045","Text":"For this formula we have P,"},{"Start":"03:30.045 ","End":"03:33.225","Text":"now we have P minus 1, P inverse."},{"Start":"03:33.225 ","End":"03:37.920","Text":"We need now D^2,017."},{"Start":"03:37.920 ","End":"03:45.425","Text":"Now, D is this and there is a special property of diagonal matrices."},{"Start":"03:45.425 ","End":"03:47.330","Text":"This is a diagonal matrix,"},{"Start":"03:47.330 ","End":"03:51.485","Text":"that if you raise it to a power,"},{"Start":"03:51.485 ","End":"03:55.760","Text":"then you just take each entry and raise it to that power."},{"Start":"03:55.760 ","End":"03:59.770","Text":"It\u0027s a special property for diagonal matrices,"},{"Start":"03:59.770 ","End":"04:01.790","Text":"it won\u0027t work for a general matrix."},{"Start":"04:01.790 ","End":"04:05.630","Text":"You can\u0027t take the exponent of each element separately,"},{"Start":"04:05.630 ","End":"04:07.740","Text":"but here you can."},{"Start":"04:07.810 ","End":"04:11.275","Text":"Let\u0027s create some space."},{"Start":"04:11.275 ","End":"04:13.260","Text":"Here\u0027s what we get."},{"Start":"04:13.260 ","End":"04:14.460","Text":"We had this formula,"},{"Start":"04:14.460 ","End":"04:22.830","Text":"that A^2,017 is P. P is equal to this,"},{"Start":"04:22.830 ","End":"04:25.530","Text":"D^ 2,017 is this,"},{"Start":"04:25.530 ","End":"04:27.405","Text":"P minus 1 is this."},{"Start":"04:27.405 ","End":"04:33.585","Text":"Now, we just have to multiply these 3. 
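The two facts used in this step, that a diagonal matrix is raised to a power entry by entry and that A to the power n equals P times D to the power n times P inverse, are easy to confirm symbolically. A small sketch with a stand-in matrix (eigenvalues 0, 1 and 2, not the exercise's matrix) and a small exponent so the printout stays readable; 2017 works the same way, the numbers just get long:

```python
from sympy import Matrix

# Stand-in matrix with eigenvalues 0, 1 and 2 (not the matrix from the exercise).
A = Matrix([[0, 1, 1],
            [0, 1, 1],
            [0, 0, 2]])
P, D = A.diagonalize()

# Raising the diagonal matrix D to a power acts entry by entry ...
n = 5
entrywise = Matrix(3, 3, lambda i, j: D[i, j]**n)
print(D**n == entrywise)                # True, and only because D is diagonal

# ... which is what makes the formula A**n = P * D**n * P**-1 cheap to evaluate.
print(A**n == P * D**n * P.inv())       # True
```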
Let\u0027s see."},{"Start":"04:33.585 ","End":"04:39.005","Text":"By the associates of law of matrices for multiplication,"},{"Start":"04:39.005 ","End":"04:44.040","Text":"I can take this times this first and then multiply by that."},{"Start":"04:44.040 ","End":"04:47.255","Text":"If I take this times this, check it,"},{"Start":"04:47.255 ","End":"04:53.850","Text":"you will get this."},{"Start":"04:53.850 ","End":"04:55.980","Text":"I mean this too we can compute,"},{"Start":"04:55.980 ","End":"05:02.610","Text":"0^2,017 is 0 and 1^2,017 is 1."},{"Start":"05:02.610 ","End":"05:06.050","Text":"The only thing I\u0027m going to leave as an expression, is this."},{"Start":"05:06.050 ","End":"05:09.270","Text":"Certainly you don\u0027t want to actually compute this,"},{"Start":"05:09.270 ","End":"05:11.335","Text":"it\u0027s a huge number."},{"Start":"05:11.335 ","End":"05:14.000","Text":"Anyway, if you check this times this is this,"},{"Start":"05:14.000 ","End":"05:17.600","Text":"and now we have to do another product, this times this."},{"Start":"05:17.600 ","End":"05:20.840","Text":"I\u0027ll leave you to check that what we get is this."},{"Start":"05:20.840 ","End":"05:25.265","Text":"The only thing I want to point out that may not be immediately obvious is that"},{"Start":"05:25.265 ","End":"05:28.010","Text":"0.5 times"},{"Start":"05:28.010 ","End":"05:35.575","Text":"2^2,017 is 2^2,016,"},{"Start":"05:35.575 ","End":"05:38.380","Text":"because this is really 1 1/2."},{"Start":"05:38.380 ","End":"05:41.239","Text":"I really should have used fractions, not decimals."},{"Start":"05:41.239 ","End":"05:43.880","Text":"1 1/2 is 2 to the minus 1."},{"Start":"05:43.880 ","End":"05:45.560","Text":"Using the rules of exponents,"},{"Start":"05:45.560 ","End":"05:49.865","Text":"taking 1 1/2 of it just knocks the exponent down by 1."},{"Start":"05:49.865 ","End":"05:53.225","Text":"We end up getting this and that\u0027s the answer."},{"Start":"05:53.225 ","End":"05:57.445","Text":"This is 1 of the uses of diagonalization."},{"Start":"05:57.445 ","End":"05:59.460","Text":"I still have that debt,"},{"Start":"05:59.460 ","End":"06:06.380","Text":"where I have to show you how we would compute P inverse."},{"Start":"06:06.380 ","End":"06:08.360","Text":"Remember how to compute an inverse,"},{"Start":"06:08.360 ","End":"06:11.705","Text":"we write a matrix like this with a separator,"},{"Start":"06:11.705 ","End":"06:14.535","Text":"and we put the matrix,"},{"Start":"06:14.535 ","End":"06:19.430","Text":"in this case P on 1 side and the identity matrix on the other side,"},{"Start":"06:19.430 ","End":"06:25.610","Text":"and then we do row operations until we get the identity on the left,"},{"Start":"06:25.610 ","End":"06:28.825","Text":"and what\u0027s on the right will be the inverse."},{"Start":"06:28.825 ","End":"06:38.900","Text":"The first row operation will be to add the first row to the third row and get a 0 here."},{"Start":"06:38.900 ","End":"06:44.540","Text":"If we do that, we\u0027ll get this and we also have to do it on the right side also,"},{"Start":"06:44.540 ","End":"06:47.140","Text":"that gives us that 1 here."},{"Start":"06:47.140 ","End":"06:51.175","Text":"The next thing, here it is,"},{"Start":"06:51.175 ","End":"07:01.340","Text":"is to add twice the first row minus the third row into the first row,"},{"Start":"07:02.360 ","End":"07:05.980","Text":"because my aim was to get a 0 here."},{"Start":"07:05.980 ","End":"07:07.900","Text":"I mean, it\u0027s already in echelon 
form."},{"Start":"07:07.900 ","End":"07:15.320","Text":"Now, I have to work on the top right part to get it to be 0."},{"Start":"07:15.320 ","End":"07:20.575","Text":"To get this 0, I double this row and then subtract the last row,"},{"Start":"07:20.575 ","End":"07:22.480","Text":"and that gives me this."},{"Start":"07:22.480 ","End":"07:25.960","Text":"Now, we\u0027re almost at the identity matrix because we have a diagonal."},{"Start":"07:25.960 ","End":"07:33.430","Text":"We just have to divide this row by minus 2 and this row by 2."},{"Start":"07:33.620 ","End":"07:36.620","Text":"Instead of saying divide by minus 2,"},{"Start":"07:36.620 ","End":"07:40.625","Text":"I can say multiply by minus 1/2 and I\u0027ll do it in decimal,"},{"Start":"07:40.625 ","End":"07:44.275","Text":"similarly 1 1/2 of the third row."},{"Start":"07:44.275 ","End":"07:45.800","Text":"On the right-hand side,"},{"Start":"07:45.800 ","End":"07:48.785","Text":"we get all these 0.5s."},{"Start":"07:48.785 ","End":"07:51.715","Text":"You could work in fractions also."},{"Start":"07:51.715 ","End":"07:55.294","Text":"Since we have the identity matrix here,"},{"Start":"07:55.294 ","End":"07:58.205","Text":"then on the other side of the partition,"},{"Start":"07:58.205 ","End":"08:00.800","Text":"that would be the inverse of P,"},{"Start":"08:00.800 ","End":"08:04.010","Text":"and that\u0027s exactly what we got earlier, you can check."},{"Start":"08:04.010 ","End":"08:06.900","Text":"Now we are done."}],"ID":25755},{"Watched":false,"Name":"Exercise 3 parts i-j","Duration":"6m ","ChapterTopicVideoID":24843,"CourseChapterTopicPlaylistID":118359,"HasSubtitles":true,"ThumbnailPath":null,"UploadDate":null,"DurationForVideoObject":null,"Description":null,"MetaTitle":null,"MetaDescription":null,"Canonical":null,"VideoComments":[],"Subtitles":[{"Start":"00:00.000 ","End":"00:02.895","Text":"Continuing the same exercise,"},{"Start":"00:02.895 ","End":"00:04.560","Text":"remember we had several parts."},{"Start":"00:04.560 ","End":"00:07.125","Text":"We had A through F,"},{"Start":"00:07.125 ","End":"00:11.640","Text":"and then I started doing G through J."},{"Start":"00:11.640 ","End":"00:13.350","Text":"But after H,"},{"Start":"00:13.350 ","End":"00:18.390","Text":"it was too long I stopped and now we\u0027re continuing with I and J."},{"Start":"00:18.390 ","End":"00:22.395","Text":"The question itself, is in the previous clip."},{"Start":"00:22.395 ","End":"00:25.035","Text":"The minimal polynomial."},{"Start":"00:25.035 ","End":"00:27.780","Text":"I\u0027ll review the concept."},{"Start":"00:27.780 ","End":"00:31.230","Text":"Start off with the Cayley-Hamilton theorem"},{"Start":"00:31.230 ","End":"00:36.690","Text":"that says that every matrix satisfies its own characteristic equation."},{"Start":"00:36.690 ","End":"00:38.490","Text":"In other words, if we find"},{"Start":"00:38.490 ","End":"00:47.510","Text":"the characteristic polynomial of A and if we plug in instead of the variable x,"},{"Start":"00:47.510 ","End":"00:49.730","Text":"we put in the matrix A,"},{"Start":"00:49.730 ","End":"00:53.615","Text":"then we get 0 but 0 matrix."},{"Start":"00:53.615 ","End":"00:57.365","Text":"We can actually substitute a matrix into a polynomial."},{"Start":"00:57.365 ","End":"01:01.610","Text":"The only thing you have to watch out for is that the constant,"},{"Start":"01:01.610 ","End":"01:08.910","Text":"the part without x has got to be like if you had 3 you would put 3 I."},{"Start":"01:08.960 ","End":"01:16.740","Text":"Now 
A satisfies this polynomial equals 0."},{"Start":"01:16.740 ","End":"01:23.495","Text":"Maybe there\u0027s a lower degree polynomial that will do the same thing and"},{"Start":"01:23.495 ","End":"01:30.865","Text":"we let m be the least degree polynomial such that this is true."},{"Start":"01:30.865 ","End":"01:33.840","Text":"Now, I like to be a bit pedantic."},{"Start":"01:33.840 ","End":"01:38.555","Text":"There\u0027s going to be more than 1 because if I multiply or divide by a non-zero constant,"},{"Start":"01:38.555 ","End":"01:42.380","Text":"it\u0027s not going to change the degree and it\u0027s still going to be 0 for A."},{"Start":"01:42.380 ","End":"01:47.419","Text":"So by convention, we take the monic polynomial,"},{"Start":"01:47.419 ","End":"01:51.200","Text":"meaning that the leading coefficient is 1."},{"Start":"01:51.200 ","End":"01:57.380","Text":"Yeah, just so that we can really say the minimal polynomial"},{"Start":"01:57.380 ","End":"02:03.130","Text":"and not twice the minimal polynomial will still satisfy."},{"Start":"02:03.130 ","End":"02:12.080","Text":"Anyway, yeah. We need the theorem to help us to do that and the theorem says"},{"Start":"02:12.080 ","End":"02:16.565","Text":"that the minimal polynomial"},{"Start":"02:16.565 ","End":"02:22.685","Text":"and the characteristic polynomial have the same irreducible factors."},{"Start":"02:22.685 ","End":"02:25.390","Text":"What does that mean?"},{"Start":"02:25.390 ","End":"02:29.435","Text":"I\u0027ll explain it by use of an example."},{"Start":"02:29.435 ","End":"02:33.890","Text":"We start with the characteristic polynomial,"},{"Start":"02:33.890 ","End":"02:40.185","Text":"but I didn\u0027t take our characteristic polynomial because it\u0027s too small."},{"Start":"02:40.185 ","End":"02:43.365","Text":"I mean, the degree is to low I wanted a better example,"},{"Start":"02:43.365 ","End":"02:47.950","Text":"more juicy, so I just invented this 1."},{"Start":"02:48.980 ","End":"02:54.260","Text":"This will be better for explaining the concept."},{"Start":"02:54.260 ","End":"02:57.620","Text":"Something I should have mentioned that I\u0027m going to assume that"},{"Start":"02:57.620 ","End":"03:02.240","Text":"the polynomials here are all factorized."},{"Start":"03:02.240 ","End":"03:04.070","Text":"I mean, if you expand them,"},{"Start":"03:04.070 ","End":"03:07.010","Text":"you won\u0027t see anything because you\u0027ll just get polynomial."},{"Start":"03:07.010 ","End":"03:09.050","Text":"I\u0027m assuming everything\u0027s factorized,"},{"Start":"03:09.050 ","End":"03:11.150","Text":"like here it\u0027s factorized."},{"Start":"03:11.150 ","End":"03:13.255","Text":"The irreducible pieces,"},{"Start":"03:13.255 ","End":"03:14.820","Text":"there is x,"},{"Start":"03:14.820 ","End":"03:16.755","Text":"there is x minus 1,"},{"Start":"03:16.755 ","End":"03:19.900","Text":"and there is x squared plus 4."},{"Start":"03:20.030 ","End":"03:24.020","Text":"Now, the minimal polynomial also got to contain x,"},{"Start":"03:24.020 ","End":"03:27.335","Text":"x minus 1 and x squared plus 4 by the theorem,"},{"Start":"03:27.335 ","End":"03:29.120","Text":"but it\u0027s got a lower degree."},{"Start":"03:29.120 ","End":"03:33.700","Text":"What we will have is x to the power of either 2 or 1,"},{"Start":"03:33.700 ","End":"03:37.125","Text":"x minus 1 to the power of 3 or 2,"},{"Start":"03:37.125 ","End":"03:41.420","Text":"or 1 and this has to stay to the power of 1,"},{"Start":"03:41.420 ","End":"03:43.970","Text":"we 
can\u0027t get any less."},{"Start":"03:43.970 ","End":"03:46.760","Text":"We actually have 6 possibilities,"},{"Start":"03:46.760 ","End":"03:49.250","Text":"2 possibilities here times 3 possibilities"},{"Start":"03:49.250 ","End":"03:54.810","Text":"here and combinatorically there\u0027ll be 6 of them."},{"Start":"03:54.810 ","End":"04:01.565","Text":"Here they are and I\u0027ve arranged them in order of increasing degree."},{"Start":"04:01.565 ","End":"04:05.480","Text":"If it\u0027s a tie like here and here, it doesn\u0027t matter."},{"Start":"04:05.480 ","End":"04:10.940","Text":"The last 1 is always going to be the original characteristic polynomial,"},{"Start":"04:10.940 ","End":"04:12.920","Text":"which happens to have degree 7,"},{"Start":"04:12.920 ","End":"04:20.640","Text":"2 plus 3 plus this is also degree 2 is 7."},{"Start":"04:20.640 ","End":"04:23.355","Text":"The idea is to find the minimum,"},{"Start":"04:23.355 ","End":"04:29.460","Text":"we just keep going at it 1 by 1 and we substitute A."},{"Start":"04:29.460 ","End":"04:32.955","Text":"Well, I put it in writing."},{"Start":"04:32.955 ","End":"04:36.740","Text":"We substitute in our case A or whatever the matrix is,"},{"Start":"04:36.740 ","End":"04:40.190","Text":"successively 1 by 1 in the polynomials."},{"Start":"04:40.190 ","End":"04:41.630","Text":"We start with the lowest degree,"},{"Start":"04:41.630 ","End":"04:46.665","Text":"work our way down and as soon as we get the 0 matrix, we stop."},{"Start":"04:46.665 ","End":"04:51.775","Text":"At that point, that particular m is our minimal polynomial."},{"Start":"04:51.775 ","End":"04:53.490","Text":"Now that was theoretical,"},{"Start":"04:53.490 ","End":"04:55.025","Text":"that\u0027s not our example."},{"Start":"04:55.025 ","End":"04:57.455","Text":"Our example is much simpler."},{"Start":"04:57.455 ","End":"04:59.420","Text":"This was our example."},{"Start":"04:59.420 ","End":"05:05.165","Text":"There were 3 irreducible factors,"},{"Start":"05:05.165 ","End":"05:07.655","Text":"and each of them appears to the power of 1,"},{"Start":"05:07.655 ","End":"05:09.275","Text":"and we don\u0027t write the 1,"},{"Start":"05:09.275 ","End":"05:11.860","Text":"so we can\u0027t get any less."},{"Start":"05:11.860 ","End":"05:17.660","Text":"In this case, the only possibility is this."},{"Start":"05:17.660 ","End":"05:22.630","Text":"In other words, the minimal polynomial is the characteristic polynomial."},{"Start":"05:22.630 ","End":"05:25.950","Text":"Finally Part J, which is the last in"},{"Start":"05:25.950 ","End":"05:29.130","Text":"the whole series I think there were 10 parts altogether."},{"Start":"05:29.130 ","End":"05:35.675","Text":"Sure, J is the 10th letter of the alphabet and there\u0027s the theorem that we use for this."},{"Start":"05:35.675 ","End":"05:41.990","Text":"A matrix is invertible if and only if all its eigenvalues are non-zero."},{"Start":"05:41.990 ","End":"05:44.060","Text":"But in our case, well,"},{"Start":"05:44.060 ","End":"05:47.240","Text":"let\u0027s remind you we had 0, 1, and 2."},{"Start":"05:47.240 ","End":"05:51.950","Text":"This is non-zero and this is non-zero, but this is 0."},{"Start":"05:51.950 ","End":"05:54.530","Text":"So no, not all of them are non-zero,"},{"Start":"05:54.530 ","End":"05:57.890","Text":"so it is not invertible."},{"Start":"05:57.890 ","End":"06:00.810","Text":"That\u0027s it, we\u0027re done."}],"ID":25756},{"Watched":false,"Name":"Exercise 4 parts a-f","Duration":"15m 
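The minimal-polynomial procedure described above, substitute the matrix into each candidate in order of increasing degree and stop at the first one that gives the zero matrix, is straightforward to put into code. The sketch below uses a small illustrative matrix (not one from these exercises) whose characteristic polynomial is (x - 1)^2 (x - 2) while its minimal polynomial is only (x - 1)(x - 2), so the search actually stops before the last candidate:

```python
from sympy import Matrix, eye, zeros, symbols

x = symbols('x')

# Illustrative matrix (not from the exercises): characteristic polynomial (x - 1)**2 * (x - 2),
# minimal polynomial (x - 1)*(x - 2).
A = Matrix([[1, 0, 0],
            [0, 1, 0],
            [0, 0, 2]])
I = eye(3)

# Cayley-Hamilton: substituting A into its own characteristic polynomial gives the zero matrix.
print((A - 1*I)**2 * (A - 2*I) == zeros(3, 3))      # True

# Candidates share the irreducible factors of the characteristic polynomial,
# listed in order of increasing degree; the first one that annihilates A is minimal.
candidates = [
    ((x - 1) * (x - 2),     (A - 1*I) * (A - 2*I)),
    ((x - 1)**2 * (x - 2),  (A - 1*I)**2 * (A - 2*I)),
]
for poly_expr, evaluated in candidates:
    if evaluated == zeros(3, 3):
        print("minimal polynomial:", poly_expr)
        break
```

The invertibility question from part J is even shorter: something like `0 in A.eigenvals()` reports whether 0 is an eigenvalue, and the matrix is invertible exactly when it is not.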
42s","ChapterTopicVideoID":24844,"CourseChapterTopicPlaylistID":118359,"HasSubtitles":true,"ThumbnailPath":null,"UploadDate":null,"DurationForVideoObject":null,"Description":null,"MetaTitle":null,"MetaDescription":null,"Canonical":null,"VideoComments":[],"Subtitles":[{"Start":"00:00.000 ","End":"00:04.560","Text":"In this exercise, we\u0027re given a matrix A,"},{"Start":"00:04.560 ","End":"00:06.450","Text":"a 3 by 3,"},{"Start":"00:06.450 ","End":"00:10.875","Text":"and we assume it\u0027s over the real numbers."},{"Start":"00:10.875 ","End":"00:13.770","Text":"Six part question."},{"Start":"00:13.770 ","End":"00:16.680","Text":"I won\u0027t read them all in advance,"},{"Start":"00:16.680 ","End":"00:19.575","Text":"so each 1 as we come to it."},{"Start":"00:19.575 ","End":"00:24.705","Text":"Part A is to find the characteristic matrix."},{"Start":"00:24.705 ","End":"00:28.469","Text":"In general, it\u0027s defined to be this,"},{"Start":"00:28.469 ","End":"00:33.030","Text":"where I is the identity matrix of the appropriate size,"},{"Start":"00:33.030 ","End":"00:36.900","Text":"3 by 3 and x remains a variable."},{"Start":"00:36.900 ","End":"00:40.779","Text":"Let\u0027s see what this would be in our case."},{"Start":"00:40.779 ","End":"00:44.240","Text":"The identity matrix has ones along the diagonal,"},{"Start":"00:44.240 ","End":"00:45.560","Text":"so if I multiply it by x,"},{"Start":"00:45.560 ","End":"00:48.350","Text":"we have x is along the diagonal 0 elsewhere."},{"Start":"00:48.350 ","End":"00:51.575","Text":"This such as copied as this is our matrix A."},{"Start":"00:51.575 ","End":"00:53.935","Text":"Now we do a simple subtraction."},{"Start":"00:53.935 ","End":"00:56.630","Text":"This is the characteristic matrix."},{"Start":"00:56.630 ","End":"00:58.940","Text":"It\u0027s a matrix with a variable in it,"},{"Start":"00:58.940 ","End":"01:03.065","Text":"x. I want to point out that not everyone uses x."},{"Start":"01:03.065 ","End":"01:07.480","Text":"Some books, some professors like to use the Greek letter lambda."},{"Start":"01:07.480 ","End":"01:11.690","Text":"In fact, in some places it\u0027s also written as the other way around,"},{"Start":"01:11.690 ","End":"01:17.060","Text":"A minus xI or a minus lambda I. 
These are all good."},{"Start":"01:17.060 ","End":"01:19.285","Text":"We use this form."},{"Start":"01:19.285 ","End":"01:23.225","Text":"Now, part B is the characteristic polynomial."},{"Start":"01:23.225 ","End":"01:25.100","Text":"The characteristic polynomial,"},{"Start":"01:25.100 ","End":"01:30.445","Text":"one way of defining it is just saying it\u0027s the determinant of the characteristic matrix."},{"Start":"01:30.445 ","End":"01:34.215","Text":"See matrix is xI minus A."},{"Start":"01:34.215 ","End":"01:39.065","Text":"The characteristic polynomial is the determinant of the same thing."},{"Start":"01:39.065 ","End":"01:41.370","Text":"Let\u0027s compute it."},{"Start":"01:42.010 ","End":"01:48.065","Text":"Really just take off the brackets and put vertical bars instead."},{"Start":"01:48.065 ","End":"01:53.630","Text":"That is the matrix we take the determinant and we\u0027re going to get a polynomial,"},{"Start":"01:53.630 ","End":"01:56.030","Text":"as you\u0027ll see, because there is a variable in here,"},{"Start":"01:56.030 ","End":"01:57.260","Text":"we don\u0027t get a number,"},{"Start":"01:57.260 ","End":"02:00.330","Text":"we\u0027ll get an expression with x."},{"Start":"02:00.590 ","End":"02:03.935","Text":"Since there are 2 0s in this column,"},{"Start":"02:03.935 ","End":"02:06.649","Text":"we\u0027ll do an expansion along this column."},{"Start":"02:06.649 ","End":"02:14.890","Text":"Really only this entry contributes and then we strike out the row and column it\u0027s in."},{"Start":"02:14.890 ","End":"02:21.530","Text":"What we get is this x minus 6 times the determinant of what\u0027s left,"},{"Start":"02:21.530 ","End":"02:25.175","Text":"which is x plus 1 times x plus 1 here,"},{"Start":"02:25.175 ","End":"02:28.510","Text":"minus 3 times minus 3,"},{"Start":"02:28.510 ","End":"02:31.560","Text":"3 minuses so it\u0027s a minus."},{"Start":"02:31.560 ","End":"02:35.210","Text":"Then I use the difference of squares formula."},{"Start":"02:35.210 ","End":"02:40.295","Text":"Remember, a squared minus b squared is a minus b,"},{"Start":"02:40.295 ","End":"02:42.080","Text":"a plus b,"},{"Start":"02:42.080 ","End":"02:45.510","Text":"and that gives us this."},{"Start":"02:46.190 ","End":"02:52.450","Text":"This is what we get and we prefer to have it in factorized form and to expand it,"},{"Start":"02:52.450 ","End":"02:55.779","Text":"you could expand it and get a cubic polynomial."},{"Start":"02:55.779 ","End":"02:57.790","Text":"But in this section,"},{"Start":"02:57.790 ","End":"03:01.750","Text":"this chapter we prefer factorized."},{"Start":"03:01.750 ","End":"03:04.400","Text":"This is the answer."},{"Start":"03:04.400 ","End":"03:11.360","Text":"Let\u0027s put it in a nice box and give it a name p of x, the characteristic polynomial."},{"Start":"03:11.400 ","End":"03:16.240","Text":"Next, we come to eigenvalues and algebraic multiplicity."},{"Start":"03:16.240 ","End":"03:18.700","Text":"We start with eigenvalues."},{"Start":"03:18.700 ","End":"03:25.645","Text":"The eigenvalues of a matrix are just the roots of the characteristic polynomial."},{"Start":"03:25.645 ","End":"03:28.540","Text":"Remember what roots of a polynomial are?"},{"Start":"03:28.540 ","End":"03:31.820","Text":"They\u0027re just the solutions to the equation,"},{"Start":"03:31.820 ","End":"03:34.564","Text":"the polynomial equals 0."},{"Start":"03:34.564 ","End":"03:38.975","Text":"The term characteristic equation is sometimes used."},{"Start":"03:38.975 ","End":"03:41.870","Text":"When 
you say the characteristic polynomial equals 0,"},{"Start":"03:41.870 ","End":"03:44.030","Text":"that\u0027s called the characteristic equation,"},{"Start":"03:44.030 ","End":"03:48.780","Text":"so the eigenvalues of the solutions of the characteristic equation."},{"Start":"03:48.800 ","End":"03:54.570","Text":"What it means in practice, we take out p of x and just set it equal to 0."},{"Start":"03:54.650 ","End":"04:01.655","Text":"We easily see it, the roots which are the eigenvalues would be 6,"},{"Start":"04:01.655 ","End":"04:05.125","Text":"2, and minus 4,"},{"Start":"04:05.125 ","End":"04:07.500","Text":"that was the eigenvalues part."},{"Start":"04:07.500 ","End":"04:12.440","Text":"Now we\u0027ll go to the algebraic multiplicity."},{"Start":"04:12.440 ","End":"04:22.320","Text":"Turns out that each eigenvalue has its algebraic multiplicity."},{"Start":"04:22.540 ","End":"04:25.010","Text":"We define it as follows."},{"Start":"04:25.010 ","End":"04:27.260","Text":"If the characteristic polynomial,"},{"Start":"04:27.260 ","End":"04:29.870","Text":"and I\u0027m assuming it\u0027s factorized,"},{"Start":"04:29.870 ","End":"04:36.315","Text":"contains a factor of the form x minus a^k,"},{"Start":"04:36.315 ","End":"04:38.510","Text":"well, a would be an eigenvalue."},{"Start":"04:38.510 ","End":"04:44.260","Text":"The algebraic multiplicity of a is k. In our case, you know what?"},{"Start":"04:44.260 ","End":"04:47.240","Text":"I\u0027m just going to scroll back so we can see it."},{"Start":"04:47.240 ","End":"04:50.840","Text":"In our case we have x minus 6^1,"},{"Start":"04:50.840 ","End":"04:56.940","Text":"x minus 2^1, x plus 4^1."},{"Start":"04:56.940 ","End":"04:58.880","Text":"All 3 of these eigenvalues,"},{"Start":"04:58.880 ","End":"05:00.875","Text":"6, 2, and minus 4,"},{"Start":"05:00.875 ","End":"05:05.930","Text":"all have multiplicity 1, algebraic multiplicity."},{"Start":"05:05.930 ","End":"05:08.775","Text":"That\u0027s the 1, the 1, and the 1."},{"Start":"05:08.775 ","End":"05:12.440","Text":"Soon we\u0027ll come to another kind of multiplicity."},{"Start":"05:12.440 ","End":"05:15.230","Text":"There is also a geometric multiplicity,"},{"Start":"05:15.230 ","End":"05:21.220","Text":"but before that we have to talk about another concept, eigenspaces."},{"Start":"05:21.220 ","End":"05:24.860","Text":"Eigenspace is associated with a particular eigenvalue."},{"Start":"05:24.860 ","End":"05:26.765","Text":"Each eigenvalue has an eigenspace,"},{"Start":"05:26.765 ","End":"05:29.600","Text":"and the way you find it is as follows."},{"Start":"05:29.600 ","End":"05:39.460","Text":"What you do is you substitute the eigenvalue in place of x in the characteristic matrix."},{"Start":"05:39.460 ","End":"05:43.580","Text":"Each matrix has a corresponding system of linear equations,"},{"Start":"05:43.580 ","End":"05:48.175","Text":"and we solve that system of linear equations."},{"Start":"05:48.175 ","End":"05:56.335","Text":"The solution space of that SLE is called the eigenspace of that particular eigenvalue."},{"Start":"05:56.335 ","End":"06:03.065","Text":"Furthermore, the dimension of the eigenspace is called the geometric multiplicity."},{"Start":"06:03.065 ","End":"06:06.110","Text":"I don\u0027t know why it\u0027s called geometric multiplicity,"},{"Start":"06:06.110 ","End":"06:08.110","Text":"that\u0027s just its name."},{"Start":"06:08.110 ","End":"06:10.400","Text":"Just as a reminder,"},{"Start":"06:10.400 ","End":"06:12.920","Text":"we have eigenvalues 
6,"},{"Start":"06:12.920 ","End":"06:14.465","Text":"2, and minus 4."},{"Start":"06:14.465 ","End":"06:19.250","Text":"I\u0027m going to do this procedure 3 times,"},{"Start":"06:19.250 ","End":"06:22.080","Text":"once for each eigenvalue."},{"Start":"06:22.100 ","End":"06:26.565","Text":"Let\u0027s start with the eigenvalue 6."},{"Start":"06:26.565 ","End":"06:29.220","Text":"Here\u0027s our characteristic matrix,"},{"Start":"06:29.220 ","End":"06:30.680","Text":"and it has an x in it,"},{"Start":"06:30.680 ","End":"06:34.295","Text":"and we just substitute x equals 6."},{"Start":"06:34.295 ","End":"06:37.115","Text":"If we do that substitution,"},{"Start":"06:37.115 ","End":"06:39.745","Text":"this is what we will get."},{"Start":"06:39.745 ","End":"06:44.705","Text":"The corresponding SLE is this."},{"Start":"06:44.705 ","End":"06:53.430","Text":"Just taking the coefficients here and getting a system of equations in xyz."},{"Start":"06:53.600 ","End":"06:57.485","Text":"As usual, we want to bring this to row echelon form."},{"Start":"06:57.485 ","End":"06:59.315","Text":"I want to get 0s here,"},{"Start":"06:59.315 ","End":"07:05.535","Text":"so I add 3 times this row,"},{"Start":"07:05.535 ","End":"07:07.700","Text":"2, 7 times this row,"},{"Start":"07:07.700 ","End":"07:09.575","Text":"and that gives me a 0 here."},{"Start":"07:09.575 ","End":"07:13.760","Text":"If I subtract twice this row from 7 times this row,"},{"Start":"07:13.760 ","End":"07:16.015","Text":"this is what I get here."},{"Start":"07:16.015 ","End":"07:24.950","Text":"Then I can divide this row by 40 and get this and divide this row by 20 and get this,"},{"Start":"07:24.950 ","End":"07:27.265","Text":"now I\u0027ve got 2 rows are the same."},{"Start":"07:27.265 ","End":"07:33.500","Text":"We just subtract this from this and that\u0027s a 0 and we ignore it, just throw it out."},{"Start":"07:33.500 ","End":"07:39.155","Text":"We have these 2 equations and then this gives this system of linear equations."},{"Start":"07:39.155 ","End":"07:44.570","Text":"Note that z is absent and that means that z"},{"Start":"07:44.570 ","End":"07:50.675","Text":"is our free variable and the others depend on it, they\u0027re constrained."},{"Start":"07:50.675 ","End":"07:54.110","Text":"We use our technique, I called it the wondering 1 so"},{"Start":"07:54.110 ","End":"07:58.695","Text":"we let z equal 1, the free variable."},{"Start":"07:58.695 ","End":"08:04.155","Text":"Then the rest of them compute to y comes out 0,"},{"Start":"08:04.155 ","End":"08:06.840","Text":"and x comes out 0."},{"Start":"08:06.840 ","End":"08:09.545","Text":"This, if you put it in the right order,"},{"Start":"08:09.545 ","End":"08:11.330","Text":"will give us a basis vector,"},{"Start":"08:11.330 ","End":"08:13.685","Text":"would be 0, 0,1."},{"Start":"08:13.685 ","End":"08:22.865","Text":"The solution space is spanned by the basis and the basis consists of just 0, 0, 1."},{"Start":"08:22.865 ","End":"08:25.040","Text":"This is called the eigenspace."},{"Start":"08:25.040 ","End":"08:31.860","Text":"The solution space is the eigenspace of the particular eigenvalue 6."},{"Start":"08:32.010 ","End":"08:34.750","Text":"Because the basis,"},{"Start":"08:34.750 ","End":"08:36.895","Text":"which is this bit without the sp,"},{"Start":"08:36.895 ","End":"08:39.775","Text":"just has 1 element in it, 1 vector,"},{"Start":"08:39.775 ","End":"08:41.200","Text":"the dimension is 1,"},{"Start":"08:41.200 ","End":"08:45.320","Text":"and that\u0027s the geometric 
multiplicity is 1."},{"Start":"08:45.330 ","End":"08:48.955","Text":"The next eigenvalue, x equal 2,"},{"Start":"08:48.955 ","End":"08:50.395","Text":"that was a bit quicker,"},{"Start":"08:50.395 ","End":"08:58.045","Text":"is the eigenmatrix and the characteristic matrix."},{"Start":"08:58.045 ","End":"09:01.270","Text":"We substitute x equals 2."},{"Start":"09:01.270 ","End":"09:03.055","Text":"We get this."},{"Start":"09:03.055 ","End":"09:06.925","Text":"From this, we go to the system of linear equations,"},{"Start":"09:06.925 ","End":"09:08.559","Text":"which we want to solve."},{"Start":"09:08.559 ","End":"09:10.910","Text":"We want to bring this to echelon form."},{"Start":"09:10.910 ","End":"09:15.060","Text":"After we\u0027ve made the first column below the 3,"},{"Start":"09:15.060 ","End":"09:17.970","Text":"0, we notice that a 0 row."},{"Start":"09:17.970 ","End":"09:20.530","Text":"I\u0027m going to cross that out."},{"Start":"09:22.620 ","End":"09:24.955","Text":"Well, this is crossed out,"},{"Start":"09:24.955 ","End":"09:27.655","Text":"the last row we can divide by 12."},{"Start":"09:27.655 ","End":"09:36.385","Text":"We have 2 equations and the system of equations becomes this."},{"Start":"09:36.385 ","End":"09:41.950","Text":"Z would be the free variable and as before,"},{"Start":"09:41.950 ","End":"09:47.185","Text":"we let the free variable be 1 and then we compute from this the other 2."},{"Start":"09:47.185 ","End":"09:50.200","Text":"If you put them in the right order x, y, z."},{"Start":"09:50.200 ","End":"09:52.495","Text":"Well, it doesn\u0027t matter, they\u0027re all 1,"},{"Start":"09:52.495 ","End":"09:56.140","Text":"the solution space is spanned by"},{"Start":"09:56.140 ","End":"10:01.045","Text":"the basis and the base just consists of the vector 1, 1, 1."},{"Start":"10:01.045 ","End":"10:08.365","Text":"This solution space is the eigenspace for the eigenvalue 2."},{"Start":"10:08.365 ","End":"10:12.865","Text":"The dimension is 1 because there\u0027s only 1 vector in the basis,"},{"Start":"10:12.865 ","End":"10:16.970","Text":"and so that\u0027s the geometric multiplicity is 1."},{"Start":"10:17.010 ","End":"10:20.395","Text":"Now, the third eigenvalue,"},{"Start":"10:20.395 ","End":"10:22.480","Text":"which is minus 4."},{"Start":"10:22.480 ","End":"10:26.530","Text":"Plug it in to the characteristic matrix."},{"Start":"10:26.530 ","End":"10:28.615","Text":"This is what we get."},{"Start":"10:28.615 ","End":"10:31.705","Text":"This is the corresponding SLE."},{"Start":"10:31.705 ","End":"10:34.945","Text":"We want to bring this to echelon form."},{"Start":"10:34.945 ","End":"10:40.465","Text":"We zero out the rest of the first column below the minus 3,"},{"Start":"10:40.465 ","End":"10:44.320","Text":"subtracting this from this,"},{"Start":"10:44.320 ","End":"10:51.630","Text":"and then adding 3 times this plus twice this."},{"Start":"10:51.630 ","End":"10:53.265","Text":"Anyway, we get this,"},{"Start":"10:53.265 ","End":"10:55.605","Text":"we get a row of 0s."},{"Start":"10:55.605 ","End":"11:00.570","Text":"Divide the top row by minus 3,"},{"Start":"11:00.570 ","End":"11:03.330","Text":"divide the bottom row by minus 30."},{"Start":"11:03.330 ","End":"11:06.240","Text":"This is the system we have."},{"Start":"11:06.240 ","End":"11:09.625","Text":"These 2 equations, here they are."},{"Start":"11:09.625 ","End":"11:14.335","Text":"This time the free variable is y."},{"Start":"11:14.335 ","End":"11:18.970","Text":"For a basis, we let y equal 
1."},{"Start":"11:18.970 ","End":"11:22.090","Text":"From that, we just compute."},{"Start":"11:22.090 ","End":"11:26.500","Text":"Well, z is 0 and then x becomes minus 1."},{"Start":"11:26.500 ","End":"11:29.875","Text":"The basis just contains 1 vector,"},{"Start":"11:29.875 ","End":"11:31.270","Text":"make sure you get the order right,"},{"Start":"11:31.270 ","End":"11:34.465","Text":"minus 1, 1, 0."},{"Start":"11:34.465 ","End":"11:36.865","Text":"The solution space is the span of that."},{"Start":"11:36.865 ","End":"11:45.415","Text":"This is the eigenspace of the eigenvalue minus 4,"},{"Start":"11:45.415 ","End":"11:51.710","Text":"and the geometric multiplicity of the eigenvalue is therefore also 1."},{"Start":"11:51.930 ","End":"11:59.110","Text":"All 3 eigenvalues had a geometric multiplicity of 1."},{"Start":"11:59.110 ","End":"12:03.895","Text":"Next, the eigenvectors."},{"Start":"12:03.895 ","End":"12:07.330","Text":"The eigenvectors are sorted by eigenvalue."},{"Start":"12:07.330 ","End":"12:09.550","Text":"For any given eigenvalue,"},{"Start":"12:09.550 ","End":"12:13.480","Text":"what we do is we take a basis for the eigenspace and"},{"Start":"12:13.480 ","End":"12:18.470","Text":"those vectors in the basis are the eigenvectors."},{"Start":"12:19.380 ","End":"12:23.365","Text":"There\u0027s more than 1 way possibly of choosing a basis."},{"Start":"12:23.365 ","End":"12:26.260","Text":"It doesn\u0027t matter, just choose 1 and stick with it,"},{"Start":"12:26.260 ","End":"12:30.745","Text":"and that will be the eigenvectors."},{"Start":"12:30.745 ","End":"12:33.190","Text":"Remember or you can look back,"},{"Start":"12:33.190 ","End":"12:35.980","Text":"for x equals 6, for this eigenvalue,"},{"Start":"12:35.980 ","End":"12:44.170","Text":"we got the solution space was spanned by this and the basis contains just 1 vector,"},{"Start":"12:44.170 ","End":"12:53.289","Text":"0, 0, 1 so this will be the eigenvector for x equals 6."},{"Start":"12:53.289 ","End":"13:01.375","Text":"I call it with v. This is V for the eigenspace and v for the eigenvector."},{"Start":"13:01.375 ","End":"13:03.580","Text":"Similarly for the other 2."},{"Start":"13:03.580 ","End":"13:07.570","Text":"For x equals 2, this was the eigenvector."},{"Start":"13:07.570 ","End":"13:09.730","Text":"We can take it like that."},{"Start":"13:09.730 ","End":"13:11.110","Text":"Now, for minus 4,"},{"Start":"13:11.110 ","End":"13:13.225","Text":"we could take this 1."},{"Start":"13:13.225 ","End":"13:16.885","Text":"At the end, you put them all together."},{"Start":"13:16.885 ","End":"13:21.415","Text":"These will be our 3 eigenvectors."},{"Start":"13:21.415 ","End":"13:23.830","Text":"Now, the last part,"},{"Start":"13:23.830 ","End":"13:29.200","Text":"F. 
Is the matrix a diagonalizable?"},{"Start":"13:29.200 ","End":"13:33.954","Text":"Let\u0027s review the concept of what this means."},{"Start":"13:33.954 ","End":"13:35.740","Text":"It means, in other words,"},{"Start":"13:35.740 ","End":"13:39.220","Text":"that there exists an invertible matrix p,"},{"Start":"13:39.220 ","End":"13:42.595","Text":"such that p minus 1,"},{"Start":"13:42.595 ","End":"13:44.965","Text":"when I say p minus 1, I mean the inverse matrix,"},{"Start":"13:44.965 ","End":"13:48.475","Text":"p minus 1 ap is equal to d,"},{"Start":"13:48.475 ","End":"13:51.550","Text":"and d is a diagonal matrix."},{"Start":"13:51.550 ","End":"13:53.440","Text":"Sometimes we write it like this."},{"Start":"13:53.440 ","End":"13:55.855","Text":"Instead of this, this is an equivalent form,"},{"Start":"13:55.855 ","End":"13:59.690","Text":"ap equals pd. Same thing."},{"Start":"14:02.610 ","End":"14:07.900","Text":"An n by n matrix is diagonalizable if and"},{"Start":"14:07.900 ","End":"14:12.400","Text":"only if it has n linearly independent eigenvectors."},{"Start":"14:12.400 ","End":"14:13.929","Text":"Now, in our case,"},{"Start":"14:13.929 ","End":"14:17.515","Text":"we have a 3-by-3 matrix."},{"Start":"14:17.515 ","End":"14:22.735","Text":"We saw that we had 3 eigenvectors."},{"Start":"14:22.735 ","End":"14:29.035","Text":"I have to show you in a moment why they are linearly independent."},{"Start":"14:29.035 ","End":"14:30.610","Text":"If I just said it like that,"},{"Start":"14:30.610 ","End":"14:31.840","Text":"you\u0027d say, \"Why is that so?\""},{"Start":"14:31.840 ","End":"14:33.385","Text":"I\u0027ll explain in a minute."},{"Start":"14:33.385 ","End":"14:36.760","Text":"A is diagonalizable."},{"Start":"14:36.760 ","End":"14:38.799","Text":"Now, I\u0027ll answer this."},{"Start":"14:38.799 ","End":"14:41.590","Text":"I just pulled another theorem out of the hat."},{"Start":"14:41.590 ","End":"14:48.130","Text":"It says that eigenvectors belonging to different eigenvalues are linearly independent."},{"Start":"14:48.130 ","End":"14:54.190","Text":"Each of the 3 eigenvectors we got belong to a different eigenvalue, so that\u0027s okay."},{"Start":"14:54.190 ","End":"14:56.335","Text":"It is a diagonalizable."},{"Start":"14:56.335 ","End":"14:57.640","Text":"I\u0027d like to show you another way."},{"Start":"14:57.640 ","End":"15:00.265","Text":"It doesn\u0027t use 2 theorems, just 1 theorem."},{"Start":"15:00.265 ","End":"15:02.350","Text":"Here\u0027s the phrasing. 
I won\u0027t read it out,"},{"Start":"15:02.350 ","End":"15:04.705","Text":"I will just explain what it means."},{"Start":"15:04.705 ","End":"15:09.850","Text":"Each eigenvalue has 2 kinds of multiplicity,"},{"Start":"15:09.850 ","End":"15:13.270","Text":"geometric multiplicity, and algebraic multiplicity."},{"Start":"15:13.270 ","End":"15:15.400","Text":"If in each case, for each eigenvalue."},{"Start":"15:15.400 ","End":"15:17.635","Text":"These 2 multiplicities are the same,"},{"Start":"15:17.635 ","End":"15:21.190","Text":"then the matrix is diagonalizable."},{"Start":"15:21.190 ","End":"15:24.820","Text":"Now, in our case, for each eigenvalue,"},{"Start":"15:24.820 ","End":"15:30.190","Text":"we got geometric multiplicity 1 and algebraic multiplicity 1 and 1 equals"},{"Start":"15:30.190 ","End":"15:32.050","Text":"1 so we\u0027re okay."},{"Start":"15:32.050 ","End":"15:34.820","Text":"1 equals 1, 3 times."},{"Start":"15:35.520 ","End":"15:37.825","Text":"I\u0027ve got the same result,"},{"Start":"15:37.825 ","End":"15:39.955","Text":"a is diagonalizable."},{"Start":"15:39.955 ","End":"15:42.410","Text":"We\u0027re now done."}],"ID":25757},{"Watched":false,"Name":"Exercise 4 parts g-h","Duration":"8m 16s","ChapterTopicVideoID":24845,"CourseChapterTopicPlaylistID":118359,"HasSubtitles":true,"ThumbnailPath":null,"UploadDate":null,"DurationForVideoObject":null,"Description":null,"MetaTitle":null,"MetaDescription":null,"Canonical":null,"VideoComments":[],"Subtitles":[{"Start":"00:00.050 ","End":"00:05.040","Text":"Here we are continuing from the previous clip,"},{"Start":"00:05.040 ","End":"00:06.810","Text":"we did A through F,"},{"Start":"00:06.810 ","End":"00:14.850","Text":"and actually Part F was to show that matrix A is diagonalizable and in G,"},{"Start":"00:14.850 ","End":"00:18.300","Text":"we actually have to diagonalize it."},{"Start":"00:18.300 ","End":"00:23.160","Text":"In general, if you want to diagonalize a matrix A,"},{"Start":"00:23.160 ","End":"00:30.660","Text":"what it means is that we have to find an invertible matrix P such that"},{"Start":"00:30.660 ","End":"00:34.655","Text":"P inverse times A times P is"},{"Start":"00:34.655 ","End":"00:41.065","Text":"a diagonal matrix D. Actually we have to also find D. 
In fact,"},{"Start":"00:41.065 ","End":"00:46.640","Text":"if we find P and D,"},{"Start":"00:46.640 ","End":"00:52.105","Text":"that\u0027s considered to be a solution to the exercise."},{"Start":"00:52.105 ","End":"00:55.375","Text":"It\u0027s easier to start with D actually,"},{"Start":"00:55.375 ","End":"00:59.735","Text":"this is a diagonal matrix and what you do is along the diagonal,"},{"Start":"00:59.735 ","End":"01:02.560","Text":"you put all the eigenvalue."},{"Start":"01:02.560 ","End":"01:07.040","Text":"You might remember that our eigenvalues were 6,"},{"Start":"01:07.040 ","End":"01:09.650","Text":"2, and minus 4."},{"Start":"01:09.650 ","End":"01:12.830","Text":"Now at this point it doesn\u0027t matter what order you do"},{"Start":"01:12.830 ","End":"01:15.350","Text":"it in but once you\u0027ve chosen an order,"},{"Start":"01:15.350 ","End":"01:19.945","Text":"what we do next will have to correspond with this order."},{"Start":"01:19.945 ","End":"01:22.080","Text":"To get P,"},{"Start":"01:22.080 ","End":"01:28.250","Text":"what you do is you take the matrix comprised of columns,"},{"Start":"01:28.250 ","End":"01:29.990","Text":"which are the eigenvectors."},{"Start":"01:29.990 ","End":"01:31.730","Text":"We also had 3 eigenvectors,"},{"Start":"01:31.730 ","End":"01:34.385","Text":"but you have to get them in the right order."},{"Start":"01:34.385 ","End":"01:38.150","Text":"In this column, you\u0027d put the eigenvector for 6,"},{"Start":"01:38.150 ","End":"01:41.620","Text":"this column the eigenvector for 2, and so on."},{"Start":"01:41.620 ","End":"01:44.570","Text":"If you look back, these were the values of"},{"Start":"01:44.570 ","End":"01:48.800","Text":"the 3 eigenvectors and they\u0027re color-coded to show you that this goes with this,"},{"Start":"01:48.800 ","End":"01:54.870","Text":"so you have to get them in the right order."},{"Start":"01:55.910 ","End":"01:58.250","Text":"When you found D and P,"},{"Start":"01:58.250 ","End":"02:01.010","Text":"that\u0027s considered as having solved the problem,"},{"Start":"02:01.010 ","End":"02:03.875","Text":"you\u0027ve diagonalized the matrix."},{"Start":"02:03.875 ","End":"02:09.510","Text":"The next part, it looks strange to compute A^2017."},{"Start":"02:10.120 ","End":"02:19.530","Text":"I needed some large number and I just chose the year that I\u0027m saying this in, 2017."},{"Start":"02:19.530 ","End":"02:22.665","Text":"Obviously, you\u0027re not going to multiply A times A times A times A,"},{"Start":"02:22.665 ","End":"02:28.685","Text":"and here\u0027s 1 of the applications of diagonalization of a matrix."},{"Start":"02:28.685 ","End":"02:32.270","Text":"There\u0027s a formula which is actually easy to prove,"},{"Start":"02:32.270 ","End":"02:39.260","Text":"but I want to just give it to you that A^n is given"},{"Start":"02:39.260 ","End":"02:47.299","Text":"by this formula where we found already P and D. 
In particular,"},{"Start":"02:47.299 ","End":"02:49.640","Text":"if n is 2017,"},{"Start":"02:49.640 ","End":"02:52.400","Text":"then we get this."},{"Start":"02:52.400 ","End":"02:59.000","Text":"Now you\u0027ll see the advantage of a diagonal matrix or you will in a moment."},{"Start":"02:59.000 ","End":"03:01.070","Text":"Let\u0027s just get some order here."},{"Start":"03:01.070 ","End":"03:03.680","Text":"We have P, and that\u0027s here."},{"Start":"03:03.680 ","End":"03:09.125","Text":"We have D as I said in the moment and P inverse,"},{"Start":"03:09.125 ","End":"03:11.150","Text":"I will compute at the end."},{"Start":"03:11.150 ","End":"03:13.160","Text":"I don\u0027t want to interrupt the flow,"},{"Start":"03:13.160 ","End":"03:16.700","Text":"I\u0027ll give you the answer now that the inverse of P is this,"},{"Start":"03:16.700 ","End":"03:20.770","Text":"and IOU, write an IOU,"},{"Start":"03:20.770 ","End":"03:23.600","Text":"to do this at the end,"},{"Start":"03:23.600 ","End":"03:26.630","Text":"the computation for those who wanted."},{"Start":"03:26.630 ","End":"03:31.565","Text":"Now here\u0027s the thing about diagonal matrices that if you have"},{"Start":"03:31.565 ","End":"03:36.005","Text":"something that\u0027s diagonal and you want to raise it to any power,"},{"Start":"03:36.005 ","End":"03:39.575","Text":"you just raise each entry to that power."},{"Start":"03:39.575 ","End":"03:44.390","Text":"If D is this and I want to raise it to the power of 2017,"},{"Start":"03:44.390 ","End":"03:47.150","Text":"then this is what I get."},{"Start":"03:47.150 ","End":"03:50.030","Text":"I\u0027m not going to actually compute it as a decimal number,"},{"Start":"03:50.030 ","End":"03:53.450","Text":"we just going to leave it as an expression like this."},{"Start":"03:53.450 ","End":"03:57.200","Text":"But this is what it is."},{"Start":"03:57.200 ","End":"04:00.845","Text":"I could take the minus out, it doesn\u0027t matter."},{"Start":"04:00.845 ","End":"04:06.030","Text":"Now let\u0027s piece this together in this formula."},{"Start":"04:07.510 ","End":"04:10.675","Text":"Here we are now looking here."},{"Start":"04:10.675 ","End":"04:15.200","Text":"I\u0027ve got A^2017 is P and this is P,"},{"Start":"04:15.200 ","End":"04:21.595","Text":"D^2017 is here from here."},{"Start":"04:21.595 ","End":"04:26.810","Text":"P inverse, what we take on trust meanwhile is this."},{"Start":"04:26.810 ","End":"04:29.975","Text":"Now I just have to multiply these,"},{"Start":"04:29.975 ","End":"04:35.450","Text":"and the 3 factors to multiply,"},{"Start":"04:35.450 ","End":"04:37.945","Text":"I can do them in any order."},{"Start":"04:37.945 ","End":"04:42.380","Text":"I can do this pair first or this pair first,"},{"Start":"04:42.380 ","End":"04:44.290","Text":"the associative law,"},{"Start":"04:44.290 ","End":"04:47.310","Text":"and I chose to multiply these 2,"},{"Start":"04:47.310 ","End":"04:53.210","Text":"and also the scalar 1/2 can come out in front is also a property of matrices."},{"Start":"04:53.210 ","End":"04:56.240","Text":"You can put the scalar basically anywhere."},{"Start":"04:56.420 ","End":"04:59.580","Text":"This times this, just check the computations."},{"Start":"04:59.580 ","End":"05:01.155","Text":"This is what we get,"},{"Start":"05:01.155 ","End":"05:05.510","Text":"and then this times this will give us this mess here,"},{"Start":"05:05.510 ","End":"05:10.850","Text":"but I mean it\u0027s an answer each entries and some arithmetical expression,"},{"Start":"05:10.850 
","End":"05:14.855","Text":"but this is A^2017."},{"Start":"05:14.855 ","End":"05:17.620","Text":"I haven\u0027t forgotten my debt,"},{"Start":"05:17.620 ","End":"05:21.140","Text":"and I\u0027m going to use the technique for computing the inverse,"},{"Start":"05:21.140 ","End":"05:23.660","Text":"or we start with the matrix here and"},{"Start":"05:23.660 ","End":"05:28.430","Text":"the identity matrix here in some augmented matrix with the separator."},{"Start":"05:28.430 ","End":"05:32.785","Text":"We do row operations until we get the identity on the left,"},{"Start":"05:32.785 ","End":"05:36.179","Text":"and what you get on the right will be the inverse."},{"Start":"05:37.480 ","End":"05:39.860","Text":"As I said, the identity matrix,"},{"Start":"05:39.860 ","End":"05:41.300","Text":"you want this 1 up here,"},{"Start":"05:41.300 ","End":"05:46.180","Text":"not down there so let\u0027s just swap the 1st and 3rd rows,"},{"Start":"05:46.180 ","End":"05:49.380","Text":"and notice that also on the right I do that"},{"Start":"05:49.380 ","End":"05:52.555","Text":"so this 1 goes down there and this 1 comes up here."},{"Start":"05:52.555 ","End":"05:55.165","Text":"Next, I want to have a 0 here,"},{"Start":"05:55.165 ","End":"06:02.805","Text":"so what I\u0027m going to do is subtract the 2nd row from the 3rd row,"},{"Start":"06:02.805 ","End":"06:04.200","Text":"and this gives us this,"},{"Start":"06:04.200 ","End":"06:09.260","Text":"don\u0027t forget to work on the left and the right of the partition."},{"Start":"06:09.260 ","End":"06:11.460","Text":"Now we want the 0 here"},{"Start":"06:11.460 ","End":"06:20.430","Text":"so I\u0027m going to"},{"Start":"06:20.430 ","End":"06:27.800","Text":"take this row and add it to twice this row,"},{"Start":"06:28.250 ","End":"06:32.060","Text":"and then we get this here,"},{"Start":"06:32.060 ","End":"06:33.140","Text":"there\u0027s a 0 here."},{"Start":"06:33.140 ","End":"06:38.830","Text":"Now, I just need to get this to be 0 and then it\u0027s going to be diagonal."},{"Start":"06:38.830 ","End":"06:46.830","Text":"What I\u0027ll do is double the top row and then subtract the second row from it,"},{"Start":"06:46.830 ","End":"06:50.095","Text":"and that will give us 2, 0, 0 here."},{"Start":"06:50.095 ","End":"06:51.740","Text":"Now we\u0027re getting close to"},{"Start":"06:51.740 ","End":"06:55.520","Text":"the identity matrix because all we have to do is divide each row"},{"Start":"06:55.520 ","End":"07:00.760","Text":"by the appropriate constant or multiplication;"},{"Start":"07:00.760 ","End":"07:03.980","Text":"dividing by 2 or multiplying by 1/2 same thing."},{"Start":"07:03.980 ","End":"07:05.390","Text":"We have to divide this by 2,"},{"Start":"07:05.390 ","End":"07:08.720","Text":"this by 2 and this by minus 2 so we multiply by 1/2,"},{"Start":"07:08.720 ","End":"07:10.795","Text":"1/2, minus 1/2."},{"Start":"07:10.795 ","End":"07:13.820","Text":"Then we finally get the identity matrix here,"},{"Start":"07:13.820 ","End":"07:15.125","Text":"and after we\u0027ve done that,"},{"Start":"07:15.125 ","End":"07:17.830","Text":"this is what we get here,"},{"Start":"07:17.830 ","End":"07:24.530","Text":"and so this is the identity matrix that means that this is the inverse of P,"},{"Start":"07:24.530 ","End":"07:29.010","Text":"but that\u0027s not quite what I wrote earlier."},{"Start":"07:29.150 ","End":"07:31.340","Text":"I mean, this is a correct answer,"},{"Start":"07:31.340 ","End":"07:33.215","Text":"but it\u0027s not the same as what we 
had before."},{"Start":"07:33.215 ","End":"07:35.315","Text":"We had 1/2 in front of something."},{"Start":"07:35.315 ","End":"07:38.670","Text":"This part here, I guess I forgot a step,"},{"Start":"07:38.670 ","End":"07:45.595","Text":"I take 1/2 outside the brackets and then we\u0027re left with minus 1,"},{"Start":"07:45.595 ","End":"07:50.670","Text":"and basically if we take 1/2 and we just double everything,"},{"Start":"07:50.670 ","End":"07:52.110","Text":"so that\u0027s minus 1,"},{"Start":"07:52.110 ","End":"07:55.325","Text":"minus 1, 2, here we get 1,"},{"Start":"07:55.325 ","End":"07:59.930","Text":"1, 0, and here we double this minus 1, 1,"},{"Start":"07:59.930 ","End":"08:03.395","Text":"0, and this is what we had earlier,"},{"Start":"08:03.395 ","End":"08:06.540","Text":"I just left that step out."},{"Start":"08:07.160 ","End":"08:09.810","Text":"That\u0027s this part,"},{"Start":"08:09.810 ","End":"08:12.890","Text":"and I think it\u0027s time for a break and we\u0027ll finish it"},{"Start":"08:12.890 ","End":"08:16.620","Text":"off the last 2 sections in the next clip."}],"ID":25758},{"Watched":false,"Name":"Exercise 4 parts i-j","Duration":"9m ","ChapterTopicVideoID":24846,"CourseChapterTopicPlaylistID":118359,"HasSubtitles":true,"ThumbnailPath":null,"UploadDate":null,"DurationForVideoObject":null,"Description":null,"MetaTitle":null,"MetaDescription":null,"Canonical":null,"VideoComments":[],"Subtitles":[{"Start":"00:00.000 ","End":"00:03.270","Text":"Back from the break, continuing,"},{"Start":"00:03.270 ","End":"00:05.955","Text":"we still have sections I and J to do,"},{"Start":"00:05.955 ","End":"00:08.850","Text":"I was the minimal polynomial."},{"Start":"00:08.850 ","End":"00:11.670","Text":"Now what is it about this minimal polynomial?"},{"Start":"00:11.670 ","End":"00:15.945","Text":"Well, let\u0027s start with the Cayley-Hamilton theorem that says"},{"Start":"00:15.945 ","End":"00:21.420","Text":"that every matrix satisfies its own characteristic equation."},{"Start":"00:21.420 ","End":"00:24.030","Text":"Or in terms of characteristic polynomial,"},{"Start":"00:24.030 ","End":"00:31.340","Text":"if you substitute A instead of x in the characteristic polynomial,"},{"Start":"00:31.340 ","End":"00:34.330","Text":"then you get the 0 matrix."},{"Start":"00:34.330 ","End":"00:37.640","Text":"Besides numbers, when you have a polynomial,"},{"Start":"00:37.640 ","End":"00:39.785","Text":"you can also substitute a matrix."},{"Start":"00:39.785 ","End":"00:43.475","Text":"That\u0027s just 1 small difference that when you see a constant,"},{"Start":"00:43.475 ","End":"00:48.210","Text":"you have to make it a constant times I, we\u0027ll see that."},{"Start":"00:49.340 ","End":"00:55.880","Text":"There is a polynomial which when you plug A in is 0,"},{"Start":"00:55.880 ","End":"01:00.100","Text":"but maybe there\u0027s 1 of smaller degree,"},{"Start":"01:00.100 ","End":"01:02.540","Text":"and that\u0027s where the minimal polynomial comes in."},{"Start":"01:02.540 ","End":"01:05.105","Text":"You want the least degree possible,"},{"Start":"01:05.105 ","End":"01:07.430","Text":"and so here\u0027s a definition,"},{"Start":"01:07.430 ","End":"01:09.965","Text":"ignore this word monic for the moment."},{"Start":"01:09.965 ","End":"01:20.709","Text":"The minimal polynomial is the polynomial m of x of least degree such that m of A is 0."},{"Start":"01:20.709 ","End":"01:22.830","Text":"Just like p of A is 0,"},{"Start":"01:22.830 ","End":"01:25.860","Text":"we want to find m of A 
but with the smallest degree."},{"Start":"01:25.860 ","End":"01:31.515","Text":"Now, maybe I\u0027m being pedantic, but strictly there\u0027s no such thing as the minimal polynomial,"},{"Start":"01:31.515 ","End":"01:33.125","Text":"because if I find 1,"},{"Start":"01:33.125 ","End":"01:37.760","Text":"I can always multiply it by 2 or divide it by 3."},{"Start":"01:37.760 ","End":"01:41.405","Text":"It will still have the same degree and it will still satisfy this."},{"Start":"01:41.405 ","End":"01:43.280","Text":"So to make it unique,"},{"Start":"01:43.280 ","End":"01:47.195","Text":"we usually require that the leading coefficient be 1,"},{"Start":"01:47.195 ","End":"01:49.550","Text":"and if its leading coefficient isn\u0027t 1,"},{"Start":"01:49.550 ","End":"01:54.495","Text":"you can just divide by it. Anyway, that\u0027s not so important."},{"Start":"01:54.495 ","End":"02:01.435","Text":"But what is important is this theorem which will help us to find the minimal polynomial."},{"Start":"02:01.435 ","End":"02:06.640","Text":"What it says is that the minimal polynomial and the characteristic polynomial,"},{"Start":"02:06.640 ","End":"02:10.530","Text":"they have the same irreducible factors,"},{"Start":"02:10.530 ","End":"02:15.180","Text":"and this assumes that I have both of them in factorized form."},{"Start":"02:15.180 ","End":"02:16.815","Text":"I\u0027ll just make a note of it,"},{"Start":"02:16.815 ","End":"02:24.065","Text":"that I need both of them factorized, not left as some cubic or whatever, but split into factors."},{"Start":"02:24.065 ","End":"02:26.930","Text":"I\u0027ll explain this with an example,"},{"Start":"02:26.930 ","End":"02:30.010","Text":"I\u0027m not going to use our example of p of x."},{"Start":"02:30.010 ","End":"02:31.930","Text":"It\u0027s just not interesting enough,"},{"Start":"02:31.930 ","End":"02:33.775","Text":"I want a better example."},{"Start":"02:33.775 ","End":"02:40.655","Text":"Let\u0027s say from another exercise we got this characteristic polynomial,"},{"Start":"02:40.655 ","End":"02:44.925","Text":"then the irreducible factors,"},{"Start":"02:44.925 ","End":"02:49.985","Text":"x is an irreducible factor because we can\u0027t go any further with it."},{"Start":"02:49.985 ","End":"02:54.140","Text":"This x minus 1 is an irreducible factor,"},{"Start":"02:54.140 ","End":"02:58.130","Text":"and x squared plus 4 is an irreducible factor."},{"Start":"02:58.130 ","End":"03:01.955","Text":"Assuming that we\u0027re working with real numbers, which we are,"},{"Start":"03:01.955 ","End":"03:03.740","Text":"those of you who know complex numbers,"},{"Start":"03:03.740 ","End":"03:06.035","Text":"know that it can be factorized further there,"},{"Start":"03:06.035 ","End":"03:08.120","Text":"but with real numbers it can\u0027t."},{"Start":"03:08.120 ","End":"03:11.045","Text":"There\u0027s 3 irreducible factors,"},{"Start":"03:11.045 ","End":"03:14.440","Text":"and each of them has a degree."},{"Start":"03:14.440 ","End":"03:17.720","Text":"Now, the minimal polynomial has got to have x in it,"},{"Start":"03:17.720 ","End":"03:19.010","Text":"it\u0027s got to have x minus 1."},{"Start":"03:19.010 ","End":"03:20.735","Text":"It\u0027s got to have x squared plus 4,"},{"Start":"03:20.735 ","End":"03:23.510","Text":"but it could have a lower power."},{"Start":"03:23.510 ","End":"03:28.080","Text":"This one is 1, instead of 2,"},{"Start":"03:28.080 ","End":"03:29.660","Text":"I could have a 1 here,"},{"Start":"03:29.660 ","End":"03:30.800","Text":"instead of 3,"},{"Start":"03:30.800 ","End":"03:33.140","Text":"I might have a 2 or a 
1."},{"Start":"03:33.140 ","End":"03:34.940","Text":"There\u0027s actually 6 combinations,"},{"Start":"03:34.940 ","End":"03:36.635","Text":"this have to take 1 or 2,"},{"Start":"03:36.635 ","End":"03:38.375","Text":"this 1, 2, or 3,"},{"Start":"03:38.375 ","End":"03:41.200","Text":"and here I have to take it as 1."},{"Start":"03:41.200 ","End":"03:45.705","Text":"I\u0027m going to make a list of the 6 possibilities."},{"Start":"03:45.705 ","End":"03:52.435","Text":"Like I said, I have all the combinations of this being 1 or 2."},{"Start":"03:52.435 ","End":"03:55.620","Text":"Like here it\u0027s 1,"},{"Start":"03:55.620 ","End":"03:57.420","Text":"here it\u0027s 2, here it\u0027s 1, here it\u0027s 2."},{"Start":"03:57.420 ","End":"03:59.295","Text":"I don\u0027t write the 1 of course,"},{"Start":"03:59.295 ","End":"04:03.810","Text":"and also this could be 1,2 or 3."},{"Start":"04:03.810 ","End":"04:05.160","Text":"Like here it\u0027s 1,"},{"Start":"04:05.160 ","End":"04:07.725","Text":"here it\u0027s 3, here it\u0027s 2."},{"Start":"04:07.725 ","End":"04:10.530","Text":"We arrange them in order of degree,"},{"Start":"04:10.530 ","End":"04:12.210","Text":"this is a degree 4 polynomial,"},{"Start":"04:12.210 ","End":"04:15.900","Text":"this is a degree 5 but the tie doesn\u0027t matter what order."},{"Start":"04:15.900 ","End":"04:19.135","Text":"The last one is actually the original one,"},{"Start":"04:19.135 ","End":"04:21.665","Text":"that\u0027s degree 7,"},{"Start":"04:21.665 ","End":"04:25.835","Text":"we\u0027re hoping we will get one less than 7."},{"Start":"04:25.835 ","End":"04:29.870","Text":"What we do is we try each one out in turn,"},{"Start":"04:29.870 ","End":"04:33.520","Text":"going from the smallest and working our way upward."},{"Start":"04:33.520 ","End":"04:35.840","Text":"Here we substitute successively."},{"Start":"04:35.840 ","End":"04:39.365","Text":"We put A in here and see if we get the 0 polynomial."},{"Start":"04:39.365 ","End":"04:41.810","Text":"Next we try in here,"},{"Start":"04:41.810 ","End":"04:43.355","Text":"and then in here and so on."},{"Start":"04:43.355 ","End":"04:49.215","Text":"Once you get a 0 matrix,"},{"Start":"04:49.215 ","End":"04:53.470","Text":"you stop, and that\u0027s the minimal polynomial."},{"Start":"04:54.800 ","End":"04:58.255","Text":"It\u0027s time to get back to our example,"},{"Start":"04:58.255 ","End":"05:00.425","Text":"this wasn\u0027t from our example."},{"Start":"05:00.425 ","End":"05:02.930","Text":"In our example, this is our polynomial."},{"Start":"05:02.930 ","End":"05:04.490","Text":"This is the power of 1,"},{"Start":"05:04.490 ","End":"05:06.905","Text":"this is the power of 1, power of 1."},{"Start":"05:06.905 ","End":"05:10.450","Text":"The minimal has to have all 3 of them,"},{"Start":"05:10.450 ","End":"05:13.030","Text":"and I can\u0027t take anything less than 1,"},{"Start":"05:13.030 ","End":"05:14.790","Text":"then it wouldn\u0027t appear."},{"Start":"05:14.790 ","End":"05:16.925","Text":"Actually, my hands are tied,"},{"Start":"05:16.925 ","End":"05:18.215","Text":"we have no choice."},{"Start":"05:18.215 ","End":"05:24.840","Text":"The minimal is the same as the original characteristic polynomial,"},{"Start":"05:24.970 ","End":"05:27.800","Text":"the minimal polynomial is this,"},{"Start":"05:27.800 ","End":"05:30.275","Text":"I just copied this."},{"Start":"05:30.275 ","End":"05:34.060","Text":"That\u0027s this section,"},{"Start":"05:34.060 ","End":"05:38.240","Text":"and now we come to the last 
section or part or whatever."},{"Start":"05:38.240 ","End":"05:42.470","Text":"Part J, is A invertible?"},{"Start":"05:42.470 ","End":"05:52.280","Text":"If it is, we have to express the inverse of A in terms of A and the identity matrix."},{"Start":"05:52.280 ","End":"05:56.240","Text":"I should have said using the Cayley-Hamilton Theorem."},{"Start":"05:56.240 ","End":"05:59.765","Text":"This is the theorem that we need that comes in handy."},{"Start":"05:59.765 ","End":"06:05.045","Text":"A matrix is invertible if and only if all its eigenvalues are non-zero,"},{"Start":"06:05.045 ","End":"06:06.950","Text":"and if you look at our eigenvalues,"},{"Start":"06:06.950 ","End":"06:09.565","Text":"0 was not one of them."},{"Start":"06:09.565 ","End":"06:13.155","Text":"We know that A is invertible,"},{"Start":"06:13.155 ","End":"06:16.100","Text":"and now we have to do the second part of the question."},{"Start":"06:16.100 ","End":"06:17.360","Text":"If it is invertible,"},{"Start":"06:17.360 ","End":"06:20.000","Text":"we have to find the inverse of A."},{"Start":"06:20.000 ","End":"06:22.085","Text":"Here\u0027s the plan,"},{"Start":"06:22.085 ","End":"06:26.360","Text":"we use the Cayley-Hamilton to get an equation of"},{"Start":"06:26.360 ","End":"06:30.560","Text":"the form A times something is equal to I."},{"Start":"06:30.560 ","End":"06:34.875","Text":"Just by a little bit of manipulation of the Cayley-Hamilton theorem,"},{"Start":"06:34.875 ","End":"06:36.815","Text":"and when this is the case,"},{"Start":"06:36.815 ","End":"06:38.990","Text":"then what\u0027s written in the asterisk,"},{"Start":"06:38.990 ","End":"06:44.030","Text":"will be the inverse of A because A times it is the identity."},{"Start":"06:44.030 ","End":"06:46.820","Text":"Now our characteristic polynomial,"},{"Start":"06:46.820 ","End":"06:48.950","Text":"if you look back was this,"},{"Start":"06:48.950 ","End":"06:54.770","Text":"and Cayley-Hamilton theorem says that if I plug in A,"},{"Start":"06:54.770 ","End":"06:58.345","Text":"I get 0, the 0 matrix,"},{"Start":"06:58.345 ","End":"07:00.120","Text":"and that means this."},{"Start":"07:00.120 ","End":"07:03.155","Text":"Remember that I told you that if we use matrices,"},{"Start":"07:03.155 ","End":"07:08.390","Text":"we have to change the constants to constant times I."},{"Start":"07:08.390 ","End":"07:12.400","Text":"This is what we get in matrix form,"},{"Start":"07:12.400 ","End":"07:15.110","Text":"and here\u0027s the part where I don\u0027t want it"},{"Start":"07:15.110 ","End":"07:18.890","Text":"factorized everywhere up till now we\u0027ve been doing everything factorize,"},{"Start":"07:18.890 ","End":"07:22.260","Text":"this time, actually want to expand."},{"Start":"07:23.330 ","End":"07:27.560","Text":"We multiply lets say these 2 first,"},{"Start":"07:27.560 ","End":"07:35.940","Text":"and we get this number A times I or I times A in both cases it\u0027s A,"},{"Start":"07:35.940 ","End":"07:39.300","Text":"and I times I is I and A times A is A squared."},{"Start":"07:39.300 ","End":"07:40.935","Text":"This is what we get,"},{"Start":"07:40.935 ","End":"07:47.209","Text":"then we do another multiplication and keep the order."},{"Start":"07:47.209 ","End":"07:51.020","Text":"What I did was I multiplied each of these"},{"Start":"07:51.020 ","End":"07:55.220","Text":"by A on the right and then I multiplied them by 4I,"},{"Start":"07:55.220 ","End":"07:56.960","Text":"but 4 comes up front anyway,"},{"Start":"07:56.960 ","End":"07:58.235","Text":"this 
is the next step."},{"Start":"07:58.235 ","End":"08:00.275","Text":"Then collect like terms,"},{"Start":"08:00.275 ","End":"08:02.200","Text":"and this is what we get."},{"Start":"08:02.200 ","End":"08:04.610","Text":"Actually could have done this another way,"},{"Start":"08:04.610 ","End":"08:08.330","Text":"we could have expanded the characteristic polynomial and you would"},{"Start":"08:08.330 ","End":"08:12.815","Text":"have got x cubed minus 4x squared minus 20x plus 48,"},{"Start":"08:12.815 ","End":"08:16.400","Text":"and then plugged A in, doesn\u0027t really matter."},{"Start":"08:16.400 ","End":"08:21.140","Text":"Now, the next step is to take the bit with the I onto the other side,"},{"Start":"08:21.140 ","End":"08:23.310","Text":"so that\u0027s minus 48I."},{"Start":"08:23.560 ","End":"08:27.965","Text":"Now remember we wanted to have just I on the right-hand side."},{"Start":"08:27.965 ","End":"08:33.140","Text":"What we do is we divide left and right by minus 48."},{"Start":"08:33.140 ","End":"08:38.990","Text":"I just put minus 1 over 48 in front and I can put that inside the brackets,"},{"Start":"08:38.990 ","End":"08:41.800","Text":"doesn\u0027t matter whether constants go."},{"Start":"08:41.800 ","End":"08:45.340","Text":"If A times this thing is I,"},{"Start":"08:45.340 ","End":"08:48.380","Text":"then this thing is the inverse of A."},{"Start":"08:48.380 ","End":"08:51.410","Text":"The inverse of A is this expression,"},{"Start":"08:51.410 ","End":"08:56.130","Text":"and that\u0027s what we wanted in terms of A and I."},{"Start":"08:57.370 ","End":"09:00.450","Text":"We are done."}],"ID":25759},{"Watched":false,"Name":"Exercise 5","Duration":"10m 59s","ChapterTopicVideoID":24848,"CourseChapterTopicPlaylistID":118359,"HasSubtitles":true,"ThumbnailPath":null,"UploadDate":null,"DurationForVideoObject":null,"Description":null,"MetaTitle":null,"MetaDescription":null,"Canonical":null,"VideoComments":[],"Subtitles":[{"Start":"00:00.050 ","End":"00:06.150","Text":"In this exercise, we\u0027re given a 2 by 2 matrix A,"},{"Start":"00:06.150 ","End":"00:14.880","Text":"and we have to find the eigenvalues and the corresponding eigenvectors for A."},{"Start":"00:14.880 ","End":"00:25.325","Text":"Then, if it\u0027s diagonalizable to find the invertible matrix P such that this holds,"},{"Start":"00:25.325 ","End":"00:26.790","Text":"and we also have to find D,"},{"Start":"00:26.790 ","End":"00:28.875","Text":"which is a diagonal matrix."},{"Start":"00:28.875 ","End":"00:30.510","Text":"Now, here\u0027s the thing."},{"Start":"00:30.510 ","End":"00:32.445","Text":"We have to solve it twice,"},{"Start":"00:32.445 ","End":"00:35.880","Text":"once when this is considered over the field of"},{"Start":"00:35.880 ","End":"00:40.260","Text":"real numbers and once over the field of complex numbers."},{"Start":"00:40.260 ","End":"00:43.110","Text":"I\u0027m assuming you\u0027ve studied complex numbers,"},{"Start":"00:43.110 ","End":"00:46.065","Text":"if not, you should skip this exercise."},{"Start":"00:46.065 ","End":"00:48.390","Text":"To find the eigenvalues,"},{"Start":"00:48.390 ","End":"00:51.260","Text":"we start off with the characteristic matrix then we"},{"Start":"00:51.260 ","End":"00:54.305","Text":"get the characteristic polynomial and find its roots."},{"Start":"00:54.305 ","End":"01:01.315","Text":"Anyway, the characteristic matrix is x times the identity matrix minus A."},{"Start":"01:01.315 ","End":"01:04.620","Text":"The identity here, the 2 by 2 
identity,"},{"Start":"01:04.620 ","End":"01:07.440","Text":"which is just 1s on the diagonal,"},{"Start":"01:07.440 ","End":"01:08.780","Text":"1 here and 1 here."},{"Start":"01:08.780 ","End":"01:11.030","Text":"But when we multiply by x,"},{"Start":"01:11.030 ","End":"01:15.720","Text":"it\u0027s x and x and I just copied from there."},{"Start":"01:15.880 ","End":"01:23.150","Text":"Let\u0027s see, this is what we get and this is the characteristic matrix."},{"Start":"01:23.150 ","End":"01:26.300","Text":"Next, the characteristic polynomial is"},{"Start":"01:26.300 ","End":"01:29.370","Text":"just the determinant of the characteristic matrix;"},{"Start":"01:29.370 ","End":"01:32.315","Text":"what\u0027s written here is the same as what\u0027s written here."},{"Start":"01:32.315 ","End":"01:35.760","Text":"Maybe there\u0027s a dot in there also, anyway."},{"Start":"01:36.950 ","End":"01:39.290","Text":"To get the determinant,"},{"Start":"01:39.290 ","End":"01:45.545","Text":"we just replace the square brackets with these bars which indicate determinant."},{"Start":"01:45.545 ","End":"01:48.860","Text":"Now to compute the determinant of a 2 by 2 is easy,"},{"Start":"01:48.860 ","End":"01:52.205","Text":"just the product of this diagonal minus the product of this diagonal,"},{"Start":"01:52.205 ","End":"01:55.580","Text":"so x minus 3 times x plus 1 for the first diagonal,"},{"Start":"01:55.580 ","End":"01:58.795","Text":"and 2 times negative 4 for the other."},{"Start":"01:58.795 ","End":"02:03.905","Text":"After simplification, this is the characteristic polynomial."},{"Start":"02:03.905 ","End":"02:09.185","Text":"I just like to frame it and call it a name p of x."},{"Start":"02:09.185 ","End":"02:12.425","Text":"Now that we have the characteristic polynomial,"},{"Start":"02:12.425 ","End":"02:15.935","Text":"we can start talking about eigenvalues."},{"Start":"02:15.935 ","End":"02:23.330","Text":"The eigenvalues of a matrix suggest the roots of its characteristic polynomial."},{"Start":"02:23.330 ","End":"02:25.805","Text":"The roots of a polynomial,"},{"Start":"02:25.805 ","End":"02:28.310","Text":"the solutions to the equation,"},{"Start":"02:28.310 ","End":"02:30.290","Text":"that polynomial equals 0."},{"Start":"02:30.290 ","End":"02:32.280","Text":"There\u0027s also a name for this equation,"},{"Start":"02:32.280 ","End":"02:35.660","Text":"this is called the characteristic equation."},{"Start":"02:35.660 ","End":"02:38.820","Text":"In our case, this is the equation we get,"},{"Start":"02:38.820 ","End":"02:40.625","Text":"it\u0027s a quadratic equation."},{"Start":"02:40.625 ","End":"02:43.430","Text":"Now if you try to solve this over the real numbers,"},{"Start":"02:43.430 ","End":"02:47.030","Text":"you\u0027ll find that there are no solutions because"},{"Start":"02:47.030 ","End":"02:50.930","Text":"what you get from the formula is something negative under the square root,"},{"Start":"02:50.930 ","End":"02:54.365","Text":"you have to do the square root of minus 16 or something."},{"Start":"02:54.365 ","End":"02:57.050","Text":"Anyway, there are no solutions and this is"},{"Start":"02:57.050 ","End":"03:00.875","Text":"the big difference between the real numbers and the complex numbers here."},{"Start":"03:00.875 ","End":"03:03.890","Text":"If you\u0027re doing this question for the real numbers,"},{"Start":"03:03.890 ","End":"03:05.060","Text":"we say there\u0027s no solution,"},{"Start":"03:05.060 ","End":"03:06.350","Text":"so there\u0027s no 
eigenvalues,"},{"Start":"03:06.350 ","End":"03:08.660","Text":"there\u0027s no eigenvectors,"},{"Start":"03:08.660 ","End":"03:13.730","Text":"and you can\u0027t diagonalize it or do anything and we\u0027re just done."},{"Start":"03:13.730 ","End":"03:16.415","Text":"If we\u0027re doing it over the reals,"},{"Start":"03:16.415 ","End":"03:19.070","Text":"then we\u0027re done, we\u0027re finished, that\u0027s it."},{"Start":"03:19.070 ","End":"03:22.020","Text":"From here on down,"},{"Start":"03:22.020 ","End":"03:24.710","Text":"we\u0027re going to assume that we are working now with"},{"Start":"03:24.710 ","End":"03:29.225","Text":"the complex numbers and see if we make some headway there."},{"Start":"03:29.225 ","End":"03:33.740","Text":"Now we saw over the complex numbers we do get somewhere."},{"Start":"03:33.740 ","End":"03:35.150","Text":"We get a couple of solutions,"},{"Start":"03:35.150 ","End":"03:42.710","Text":"we get using the formula minus b plus or minus the square root of b squared minus 4ac,"},{"Start":"03:42.710 ","End":"03:48.720","Text":"which is 4 minus 20 is minus 16 all over 2a."},{"Start":"03:48.720 ","End":"03:52.460","Text":"Anyway, you\u0027ve seen this plenty of times,"},{"Start":"03:52.460 ","End":"03:56.460","Text":"comes down to 1 plus or minus 2i."},{"Start":"03:56.500 ","End":"04:02.640","Text":"2 solutions, 1 plus 2i and 1 minus 2i."},{"Start":"04:02.930 ","End":"04:05.690","Text":"Now that we have the eigenvalues,"},{"Start":"04:05.690 ","End":"04:07.910","Text":"we can talk about eigenvectors,"},{"Start":"04:07.910 ","End":"04:13.980","Text":"so let\u0027s find the eigenvectors first of all for this and then for this."},{"Start":"04:14.060 ","End":"04:19.555","Text":"Like I said, we start with 1 plus 2i and later we\u0027ll do the 1 minus 2i."},{"Start":"04:19.555 ","End":"04:26.770","Text":"We begin with the characteristic matrix and then we need to substitute this for x."},{"Start":"04:26.770 ","End":"04:29.805","Text":"If we make that substitution,"},{"Start":"04:29.805 ","End":"04:32.790","Text":"check the calculation, not very difficult,"},{"Start":"04:32.790 ","End":"04:36.795","Text":"we get this matrix without x."},{"Start":"04:36.795 ","End":"04:40.490","Text":"Here\u0027s the corresponding system of linear equations."},{"Start":"04:40.490 ","End":"04:49.270","Text":"Now, we need to solve this system and get the solution space,"},{"Start":"04:49.270 ","End":"04:57.265","Text":"and the basis for that will be the eigenvectors for this particular eigenvalue."},{"Start":"04:57.265 ","End":"04:59.820","Text":"Anyway, back to the matrix form,"},{"Start":"04:59.820 ","End":"05:02.090","Text":"we want to bring it to row echelon form."},{"Start":"05:02.090 ","End":"05:03.620","Text":"Everything\u0027s divisible by 2,"},{"Start":"05:03.620 ","End":"05:08.600","Text":"so let\u0027s just take 1/2 of the first row and 1/2 of the second row,"},{"Start":"05:08.600 ","End":"05:12.170","Text":"and this is what we get and the numbers on neater,"},{"Start":"05:12.170 ","End":"05:13.730","Text":"easier to work with."},{"Start":"05:13.730 ","End":"05:17.000","Text":"For row echelon form we need a 0 here,"},{"Start":"05:17.000 ","End":"05:26.455","Text":"so take 1 minus i times this row plus twice this row and put it in the second row."},{"Start":"05:26.455 ","End":"05:29.240","Text":"If you do the computation,"},{"Start":"05:29.240 ","End":"05:31.460","Text":"you\u0027ll see that not only do we get 0 here,"},{"Start":"05:31.460 ","End":"05:34.115","Text":"but we get 
a 0 here also."},{"Start":"05:34.115 ","End":"05:37.800","Text":"What if we just cross this row out?"},{"Start":"05:37.820 ","End":"05:42.010","Text":"Here\u0027s the corresponding system of linear equations,"},{"Start":"05:42.010 ","End":"05:45.365","Text":"it\u0027s a system even though it only has 1 equation in it,"},{"Start":"05:45.365 ","End":"05:47.570","Text":"and 1 equation and 2 unknowns."},{"Start":"05:47.570 ","End":"05:53.640","Text":"Y is the free variable and x depends on it, x is constrained."},{"Start":"05:53.640 ","End":"06:00.170","Text":"Our usual trick is to let y equal 1 and that\u0027s fine."},{"Start":"06:00.170 ","End":"06:03.095","Text":"The thing is, if we let y equal 1 here,"},{"Start":"06:03.095 ","End":"06:07.175","Text":"you do the computation, you get x,"},{"Start":"06:07.175 ","End":"06:09.110","Text":"which looks a bit messy,"},{"Start":"06:09.110 ","End":"06:11.000","Text":"doesn\u0027t have to be 1,"},{"Start":"06:11.000 ","End":"06:16.565","Text":"and this is like a trick anything but 0."},{"Start":"06:16.565 ","End":"06:18.440","Text":"If you want to get nicer numbers,"},{"Start":"06:18.440 ","End":"06:20.470","Text":"let\u0027s try y equals 2,"},{"Start":"06:20.470 ","End":"06:25.025","Text":"and then if you do the computation you get x, which looks nicer."},{"Start":"06:25.025 ","End":"06:26.569","Text":"This is not wrong,"},{"Start":"06:26.569 ","End":"06:28.885","Text":"but this is neater."},{"Start":"06:28.885 ","End":"06:35.870","Text":"This technique gives us a basis for the solution space."},{"Start":"06:35.870 ","End":"06:38.390","Text":"In this case, we only get 1,"},{"Start":"06:38.390 ","End":"06:40.945","Text":"and this is it."},{"Start":"06:40.945 ","End":"06:48.840","Text":"That\u0027s our eigenvector for the eigenvalue 1 plus 2i,"},{"Start":"06:48.940 ","End":"06:55.835","Text":"so we took care of 1 plus 2i and now the other 1, 1 minus 2i."},{"Start":"06:55.835 ","End":"06:58.370","Text":"Just like before we start with"},{"Start":"06:58.370 ","End":"07:03.660","Text":"the characteristic matrix and substitute the eigenvalue in it."},{"Start":"07:03.660 ","End":"07:05.445","Text":"This is what we get,"},{"Start":"07:05.445 ","End":"07:09.115","Text":"and the corresponding system of linear equations,"},{"Start":"07:09.115 ","End":"07:11.720","Text":"want to bring this to echelon form."},{"Start":"07:11.720 ","End":"07:13.950","Text":"First, let\u0027s get the numbers smaller,"},{"Start":"07:13.950 ","End":"07:24.315","Text":"everything\u0027s divisible by 2 and minus 1/2 would be more convenient, I just checked it."},{"Start":"07:24.315 ","End":"07:28.080","Text":"This is nice, the smaller numbers and less minuses,"},{"Start":"07:28.080 ","End":"07:36.740","Text":"and now we want to get a 0 in this position here."},{"Start":"07:36.740 ","End":"07:43.055","Text":"We can do a row operation twice this minus i plus 1 times this,"},{"Start":"07:43.055 ","End":"07:44.720","Text":"and if we do that,"},{"Start":"07:44.720 ","End":"07:49.025","Text":"we get a 0 here but we also get a 0 here."},{"Start":"07:49.025 ","End":"07:52.320","Text":"It was like this row didn\u0027t exist and"},{"Start":"07:52.320 ","End":"07:57.465","Text":"our system now just contains 1 equation which is this,"},{"Start":"07:57.465 ","End":"08:05.580","Text":"and y would be the free variable and x is the constrained or whatever."},{"Start":"08:06.470 ","End":"08:09.820","Text":"Just like before, I\u0027m going to use the trick."},{"Start":"08:09.820 
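For reference, the quadratic-formula step described a little earlier can be written out as a worked equation. Expanding the determinant quoted in the clip, (x - 3)(x + 1) - 2(-4), gives the characteristic polynomial explicitly, and the formula then produces the two complex eigenvalues:

$$p(x) = (x-3)(x+1) - 2\cdot(-4) = x^2 - 2x + 5, \qquad x = \frac{2 \pm \sqrt{4 - 20}}{2} = \frac{2 \pm 4i}{2} = 1 \pm 2i.$$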
","End":"08:11.600","Text":"If we let y equals 1,"},{"Start":"08:11.600 ","End":"08:12.620","Text":"we\u0027ll get fractions,"},{"Start":"08:12.620 ","End":"08:15.875","Text":"so let\u0027s let y equal 2."},{"Start":"08:15.875 ","End":"08:18.710","Text":"If you do that and compute x,"},{"Start":"08:18.710 ","End":"08:21.520","Text":"it comes out to be this,"},{"Start":"08:21.520 ","End":"08:26.165","Text":"and this actually can be simplified to 1 minus i,"},{"Start":"08:26.165 ","End":"08:31.400","Text":"multiply top and bottom by the conjugate,"},{"Start":"08:31.400 ","End":"08:40.490","Text":"and so the eigenvector we get is first of all x which is 1 minus i,"},{"Start":"08:40.490 ","End":"08:42.275","Text":"and then y which is 2,"},{"Start":"08:42.275 ","End":"08:45.900","Text":"and that\u0027s the other eigenvector."},{"Start":"08:46.090 ","End":"08:49.700","Text":"Now that we\u0027ve found the eigenvalues and the eigenvectors"},{"Start":"08:49.700 ","End":"08:52.910","Text":"we can get to the matter of diagonalization."},{"Start":"08:52.910 ","End":"08:56.165","Text":"Let\u0027s just summarize what we found earlier."},{"Start":"08:56.165 ","End":"08:59.000","Text":"We found the eigenvalue"},{"Start":"08:59.000 ","End":"09:05.335","Text":"1 plus 2i and there\u0027s an eigenvector associated with it, which is this."},{"Start":"09:05.335 ","End":"09:08.815","Text":"The other eigenvalue, 1 minus 2i,"},{"Start":"09:08.815 ","End":"09:12.875","Text":"and we get this eigenvector."},{"Start":"09:12.875 ","End":"09:15.874","Text":"Using a theorem we had earlier,"},{"Start":"09:15.874 ","End":"09:21.730","Text":"it was dated with n by n matrices and n linearly independent vectors,"},{"Start":"09:21.730 ","End":"09:24.255","Text":"then we know it\u0027s diagonalizable."},{"Start":"09:24.255 ","End":"09:27.890","Text":"How do we know that these 2 vectors are linearly independent?"},{"Start":"09:27.890 ","End":"09:32.150","Text":"There was another theorem that if they come from different eigenvalues,"},{"Start":"09:32.150 ","End":"09:38.405","Text":"then they are linearly independent and in that case we know it\u0027s diagonalizable."},{"Start":"09:38.405 ","End":"09:41.570","Text":"The question is, how do we actually diagonalize it?"},{"Start":"09:41.570 ","End":"09:43.370","Text":"What our task is,"},{"Start":"09:43.370 ","End":"09:45.020","Text":"is to find 2 matrices,"},{"Start":"09:45.020 ","End":"09:49.430","Text":"P and D, where P is invertible and D is diagonal,"},{"Start":"09:49.430 ","End":"09:57.350","Text":"such that this equation holds P inverse times A times P is D. 
D is"},{"Start":"09:57.350 ","End":"10:05.515","Text":"found by just placing the eigenvalues along the diagonal and 0s elsewhere,"},{"Start":"10:05.515 ","End":"10:08.600","Text":"and we have to make note of the order we did it in."},{"Start":"10:08.600 ","End":"10:11.720","Text":"First the 1 plus 2i then the 1 minus 2i,"},{"Start":"10:11.720 ","End":"10:14.560","Text":"because the next part depends on this."},{"Start":"10:14.560 ","End":"10:18.345","Text":"The matrix P is gotten by taking the 2 eigenvectors,"},{"Start":"10:18.345 ","End":"10:20.390","Text":"but we have to do it in the right order."},{"Start":"10:20.390 ","End":"10:22.820","Text":"Corresponding to the 1 plus 2i,"},{"Start":"10:22.820 ","End":"10:26.870","Text":"we have this and we write it as a column vector,"},{"Start":"10:26.870 ","End":"10:29.205","Text":"and I\u0027ve color-coded this to help."},{"Start":"10:29.205 ","End":"10:31.410","Text":"1 minus 2i, so we look here,"},{"Start":"10:31.410 ","End":"10:35.850","Text":"so it\u0027s 1 minus i2 which is the second column."},{"Start":"10:35.850 ","End":"10:39.800","Text":"Basically, we\u0027re done but I\u0027d just like to make a suggestion."},{"Start":"10:39.800 ","End":"10:43.415","Text":"If this is on the exam and you have the time,"},{"Start":"10:43.415 ","End":"10:46.220","Text":"or if you need extra homework,"},{"Start":"10:46.220 ","End":"10:51.710","Text":"you could compute the inverse of P and then compute"},{"Start":"10:51.710 ","End":"10:59.360","Text":"P inverse times A times P and see that you really do get D. Anyway, totally optional."}],"ID":25761},{"Watched":false,"Name":"Exercise 5 - Shortcut","Duration":"11m 11s","ChapterTopicVideoID":24847,"CourseChapterTopicPlaylistID":118359,"HasSubtitles":true,"ThumbnailPath":null,"UploadDate":null,"DurationForVideoObject":null,"Description":null,"MetaTitle":null,"MetaDescription":null,"Canonical":null,"VideoComments":[],"Subtitles":[{"Start":"00:00.050 ","End":"00:06.150","Text":"In this exercise, we\u0027re given a 2 by 2 matrix A,"},{"Start":"00:06.150 ","End":"00:14.910","Text":"and we have to find the eigenvalues and the corresponding eigenvectors for A."},{"Start":"00:14.910 ","End":"00:25.430","Text":"Then if it\u0027s diagonalizable to find the invertible matrix P such that this holds,"},{"Start":"00:25.430 ","End":"00:26.785","Text":"we also have to find D,"},{"Start":"00:26.785 ","End":"00:28.875","Text":"which is a diagonal matrix."},{"Start":"00:28.875 ","End":"00:30.510","Text":"Now, here\u0027s the thing."},{"Start":"00:30.510 ","End":"00:32.250","Text":"We have to solve it twice."},{"Start":"00:32.250 ","End":"00:35.880","Text":"Once, when this is considered over the field of"},{"Start":"00:35.880 ","End":"00:40.260","Text":"real numbers and once over the field of complex numbers."},{"Start":"00:40.260 ","End":"00:43.110","Text":"I\u0027m assuming you\u0027ve studied complex numbers,"},{"Start":"00:43.110 ","End":"00:46.065","Text":"if not, you should skip this exercise."},{"Start":"00:46.065 ","End":"00:48.390","Text":"To find the eigenvalues,"},{"Start":"00:48.390 ","End":"00:50.809","Text":"we start off with the characteristic matrix,"},{"Start":"00:50.809 ","End":"00:54.305","Text":"then we get the characteristic polynomial and find its roots."},{"Start":"00:54.305 ","End":"01:01.315","Text":"Anyway, the characteristic matrix is x times the identity matrix minus A."},{"Start":"01:01.315 ","End":"01:04.620","Text":"The identity here, the 2 by 2 identity,"},{"Start":"01:04.620 
","End":"01:07.440","Text":"which is just 1s on the diagonal,"},{"Start":"01:07.440 ","End":"01:08.780","Text":"1 here and 1 here."},{"Start":"01:08.780 ","End":"01:11.030","Text":"But when we multiply by x,"},{"Start":"01:11.030 ","End":"01:15.720","Text":"it\u0027s x and x and A I just copied from there."},{"Start":"01:15.880 ","End":"01:18.625","Text":"Let\u0027s see."},{"Start":"01:18.625 ","End":"01:23.150","Text":"This is what we get and this is the characteristic matrix."},{"Start":"01:23.150 ","End":"01:29.420","Text":"Next, the characteristic polynomial is just the determinant of the characteristic matrix."},{"Start":"01:29.420 ","End":"01:32.420","Text":"What\u0027s written here is the same as what\u0027s written here."},{"Start":"01:32.420 ","End":"01:35.760","Text":"There\u0027s a dot in there also, anyway."},{"Start":"01:36.950 ","End":"01:39.290","Text":"To get the determinant,"},{"Start":"01:39.290 ","End":"01:45.545","Text":"we just replace the square brackets with these bars which indicate determinant."},{"Start":"01:45.545 ","End":"01:48.860","Text":"Now to compute the determinant of a 2 by 2 is easy"},{"Start":"01:48.860 ","End":"01:52.119","Text":"just the product of this diagonal minus the product of this diagonal."},{"Start":"01:52.119 ","End":"01:58.795","Text":"X minus 3 times x plus 1 for the first diagonal and 2 times negative 4 for the other."},{"Start":"01:58.795 ","End":"02:03.905","Text":"After simplification, this is the characteristic polynomial."},{"Start":"02:03.905 ","End":"02:09.185","Text":"I just like to frame it and call it a name P of x."},{"Start":"02:09.185 ","End":"02:12.425","Text":"Now that we have the characteristic polynomial,"},{"Start":"02:12.425 ","End":"02:15.935","Text":"we can start talking about eigenvalues."},{"Start":"02:15.935 ","End":"02:23.330","Text":"The eigenvalues of a matrix suggest the roots of its characteristic polynomial."},{"Start":"02:23.330 ","End":"02:25.805","Text":"The roots of a polynomial,"},{"Start":"02:25.805 ","End":"02:28.310","Text":"the solutions to the equation,"},{"Start":"02:28.310 ","End":"02:30.290","Text":"that polynomial equals 0."},{"Start":"02:30.290 ","End":"02:32.330","Text":"There\u0027s also a name for this equation."},{"Start":"02:32.330 ","End":"02:35.665","Text":"This is called the characteristic equation."},{"Start":"02:35.665 ","End":"02:38.840","Text":"In our case this is the equation we get."},{"Start":"02:38.840 ","End":"02:40.625","Text":"It\u0027s a quadratic equation."},{"Start":"02:40.625 ","End":"02:43.430","Text":"Now if you try to solve this over the real numbers,"},{"Start":"02:43.430 ","End":"02:46.685","Text":"you\u0027ll find that there are no solutions."},{"Start":"02:46.685 ","End":"02:50.930","Text":"Because what you get from the formula is something negative under the square root,"},{"Start":"02:50.930 ","End":"02:54.365","Text":"you have to do the square root of minus 16 or something."},{"Start":"02:54.365 ","End":"02:56.315","Text":"Anyway, there are no solutions."},{"Start":"02:56.315 ","End":"03:00.875","Text":"This is the big difference between the real numbers and the complex numbers here."},{"Start":"03:00.875 ","End":"03:03.890","Text":"If you\u0027re doing this question for the real numbers,"},{"Start":"03:03.890 ","End":"03:05.060","Text":"we say there\u0027s no solution,"},{"Start":"03:05.060 ","End":"03:07.370","Text":"so there are no eigenvalues and if no eigenvalues there is"},{"Start":"03:07.370 ","End":"03:12.410","Text":"no eigenvectors and you can\u0027t 
diagonalize it or do anything,"},{"Start":"03:12.410 ","End":"03:13.550","Text":"and we\u0027re just done."},{"Start":"03:13.550 ","End":"03:16.415","Text":"If we\u0027re doing it over the reals,"},{"Start":"03:16.415 ","End":"03:19.070","Text":"then we\u0027re done, we\u0027re finished. That\u0027s it."},{"Start":"03:19.070 ","End":"03:23.630","Text":"From here on down we\u0027re going to assume that we are"},{"Start":"03:23.630 ","End":"03:29.225","Text":"working now with the complex numbers and see if we make some headway there."},{"Start":"03:29.225 ","End":"03:31.940","Text":"Now if we solve over the complex numbers,"},{"Start":"03:31.940 ","End":"03:33.740","Text":"we do get somewhere."},{"Start":"03:33.740 ","End":"03:36.110","Text":"We get a couple of solutions,"},{"Start":"03:36.110 ","End":"03:42.710","Text":"using the formula minus b plus or minus the square root of b squared minus 4ac,"},{"Start":"03:42.710 ","End":"03:44.995","Text":"which is 4 minus 20,"},{"Start":"03:44.995 ","End":"03:48.930","Text":"that is minus 16, all over 2a."},{"Start":"03:48.930 ","End":"03:52.460","Text":"Anyway, you\u0027ve seen this plenty of times,"},{"Start":"03:52.460 ","End":"03:56.460","Text":"comes down to 1 plus or minus 2i."},{"Start":"03:56.500 ","End":"04:02.640","Text":"Two solutions, 1 plus 2i and 1 minus 2i."},{"Start":"04:02.930 ","End":"04:05.690","Text":"Now that we have the eigenvalues,"},{"Start":"04:05.690 ","End":"04:08.315","Text":"we can talk about eigenvectors."},{"Start":"04:08.315 ","End":"04:13.980","Text":"Let\u0027s find the eigenvectors first of all for this and then for this."},{"Start":"04:14.060 ","End":"04:19.555","Text":"Like I said, we start with 1 plus 2i and later we\u0027ll do the 1 minus 2i."},{"Start":"04:19.555 ","End":"04:22.790","Text":"We begin with the characteristic matrix,"},{"Start":"04:22.790 ","End":"04:26.784","Text":"and then we need to substitute this for x."},{"Start":"04:26.784 ","End":"04:31.790","Text":"If we make that substitution, check the calculation."},{"Start":"04:31.790 ","End":"04:36.790","Text":"Not very difficult. 
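Spelled out, the substitution described here presumably looks as follows. The entries of A are not quoted in the transcript, only the determinant of xI - A, so the matrix below is an inference from that description rather than the clip's on-screen working; it does match the remark that everything is divisible by 2:

$$(1+2i)I - A = \begin{pmatrix} -2+2i & 2 \\ -4 & 2+2i \end{pmatrix} \;\longrightarrow\; \tfrac{1}{2}\begin{pmatrix} -2+2i & 2 \\ -4 & 2+2i \end{pmatrix} = \begin{pmatrix} -1+i & 1 \\ -2 & 1+i \end{pmatrix},$$

and the second row is (1 + i) times the first, which is why the row operation below wipes it out completely.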
We get this matrix without an x."},{"Start":"04:36.790 ","End":"04:40.490","Text":"Here\u0027s the corresponding system of linear equations."},{"Start":"04:40.490 ","End":"04:49.270","Text":"Now, we need to solve this system and get the solution space,"},{"Start":"04:49.270 ","End":"04:57.260","Text":"and the basis for that will be the eigenvectors for this particular eigenvalue."},{"Start":"04:57.260 ","End":"04:59.810","Text":"Anyway, back to the matrix form."},{"Start":"04:59.810 ","End":"05:02.090","Text":"We want to bring it to echelon form."},{"Start":"05:02.090 ","End":"05:03.770","Text":"Everything\u0027s divisible by 2."},{"Start":"05:03.770 ","End":"05:08.735","Text":"Let\u0027s just take 1/2 of the first row and 1/2 of the second row."},{"Start":"05:08.735 ","End":"05:13.730","Text":"This is what we get, and the numbers are neater and easier to work with."},{"Start":"05:13.730 ","End":"05:17.210","Text":"For row echelon form we need a 0 here."},{"Start":"05:17.210 ","End":"05:26.440","Text":"Take 1 minus i times this row plus twice this row and put it in the second row."},{"Start":"05:26.440 ","End":"05:29.240","Text":"If you do the computation,"},{"Start":"05:29.240 ","End":"05:31.460","Text":"you\u0027ll see that not only do we get 0 here,"},{"Start":"05:31.460 ","End":"05:34.115","Text":"but we get a 0 here also."},{"Start":"05:34.115 ","End":"05:37.800","Text":"Why don\u0027t we just cross this row out?"},{"Start":"05:37.820 ","End":"05:41.990","Text":"Here\u0027s the corresponding system of linear equations."},{"Start":"05:41.990 ","End":"05:45.365","Text":"It\u0027s a system even though it only has 1 equation in it,"},{"Start":"05:45.365 ","End":"05:47.660","Text":"1 equation and 2 unknowns."},{"Start":"05:47.660 ","End":"05:53.640","Text":"Y is the free variable and x depends on it, x is constrained."},{"Start":"05:53.640 ","End":"06:00.200","Text":"Our usual trick is to let y equal 1 and that\u0027s fine."},{"Start":"06:00.200 ","End":"06:02.840","Text":"The thing is if we let y equal 1 here,"},{"Start":"06:02.840 ","End":"06:04.925","Text":"if you do the computation,"},{"Start":"06:04.925 ","End":"06:09.110","Text":"you get x, which looks a bit messy,"},{"Start":"06:09.110 ","End":"06:11.210","Text":"doesn\u0027t have to be 1."},{"Start":"06:11.210 ","End":"06:16.565","Text":"This is like a trick, it can be anything but 0."},{"Start":"06:16.565 ","End":"06:18.440","Text":"If you want to get nicer numbers,"},{"Start":"06:18.440 ","End":"06:20.470","Text":"let\u0027s try y equals 2,"},{"Start":"06:20.470 ","End":"06:22.280","Text":"and then if you do the computation,"},{"Start":"06:22.280 ","End":"06:25.025","Text":"you get x, which looks nicer."},{"Start":"06:25.025 ","End":"06:26.569","Text":"This is not wrong,"},{"Start":"06:26.569 ","End":"06:28.885","Text":"but this is neater."},{"Start":"06:28.885 ","End":"06:35.870","Text":"This technique gives us a basis for the solution space."},{"Start":"06:35.870 ","End":"06:40.945","Text":"In this case, we only get 1 and this is it."},{"Start":"06:40.945 ","End":"06:48.840","Text":"That\u0027s our eigenvector for the eigenvalue 1 plus 2i."},{"Start":"06:49.450 ","End":"06:54.780","Text":"We did the case for 1 plus 2i,"},{"Start":"06:54.780 ","End":"06:57.570","Text":"and now we come to 1 minus 2i."},{"Start":"06:57.570 ","End":"06:59.850","Text":"But there is a shortcut,"},{"Start":"06:59.850 ","End":"07:03.080","Text":"we don\u0027t have to work so hard as we did with 1 plus 2i,"},{"Start":"07:03.080 ","End":"07:07.180","Text":"because 
the other 1 is the conjugate of this."},{"Start":"07:07.180 ","End":"07:10.560","Text":"There is the following shortcut,"},{"Start":"07:10.560 ","End":"07:14.010","Text":"and it works whenever we have the 2 solutions,"},{"Start":"07:14.010 ","End":"07:17.270","Text":"2 eigenvalues that are complex conjugates."},{"Start":"07:17.270 ","End":"07:21.480","Text":"There\u0027s a proposition, like a mini theorem."},{"Start":"07:21.640 ","End":"07:29.404","Text":"It looks scary, but it\u0027s actually quite easy when you see what we do in practice."},{"Start":"07:29.404 ","End":"07:33.005","Text":"But in general, suppose we have"},{"Start":"07:33.005 ","End":"07:39.155","Text":"an eigenvector and these are its components for some complex eigenvalue,"},{"Start":"07:39.155 ","End":"07:42.360","Text":"say a plus bi."},{"Start":"07:44.840 ","End":"07:49.385","Text":"For some real n by n matrix, in our case 2 by 2."},{"Start":"07:49.385 ","End":"07:56.470","Text":"Then if you take this vector and take the conjugate of each component,"},{"Start":"07:56.470 ","End":"07:59.905","Text":"so we get this vector and call it v star,"},{"Start":"07:59.905 ","End":"08:05.335","Text":"that will be an eigenvector for the other eigenvalue, the a minus bi."},{"Start":"08:05.335 ","End":"08:09.075","Text":"The conjugate looks complicated,"},{"Start":"08:09.075 ","End":"08:11.740","Text":"let me show you what happens in our case."},{"Start":"08:11.740 ","End":"08:20.590","Text":"Earlier, we had that for the eigenvalue 1 plus 2i we get this eigenvector."},{"Start":"08:20.590 ","End":"08:27.520","Text":"What the proposition says is that if you want it for 1 minus 2i just"},{"Start":"08:27.520 ","End":"08:34.400","Text":"take the conjugate in each place so the conjugate of 1 plus i is 1 minus i,"},{"Start":"08:34.400 ","End":"08:36.800","Text":"and the conjugate of 2 is 2,"},{"Start":"08:36.800 ","End":"08:40.200","Text":"since 2 is 2 plus 0i and its conjugate is 2 minus 0i."},{"Start":"08:40.200 ","End":"08:41.985","Text":"That\u0027s all there is to it."},{"Start":"08:41.985 ","End":"08:46.085","Text":"That thing about the conjugates really saves work."},{"Start":"08:46.085 ","End":"08:47.450","Text":"Once you see this example,"},{"Start":"08:47.450 ","End":"08:50.420","Text":"you see that this looks frightening, but it\u0027s not."},{"Start":"08:50.420 ","End":"08:52.900","Text":"Have an eigenvalue,"},{"Start":"08:52.900 ","End":"08:54.950","Text":"eigenvector, take the conjugate,"},{"Start":"08:54.950 ","End":"08:57.810","Text":"just take the conjugate in each place."},{"Start":"08:58.540 ","End":"09:02.150","Text":"Now that we\u0027ve found the eigenvalues and the eigenvectors"},{"Start":"09:02.150 ","End":"09:05.330","Text":"we can get to the matter of diagonalization."},{"Start":"09:05.330 ","End":"09:08.614","Text":"Let\u0027s just summarize what we found earlier."},{"Start":"09:08.614 ","End":"09:12.230","Text":"We found the eigenvalue 1 plus"},{"Start":"09:12.230 ","End":"09:17.750","Text":"2i and there\u0027s an eigenvector associated with it, which is this."},{"Start":"09:17.750 ","End":"09:21.260","Text":"The other eigenvalue, 1 minus 2i,"},{"Start":"09:21.260 ","End":"09:25.315","Text":"and we get this eigenvector."},{"Start":"09:25.315 ","End":"09:28.310","Text":"Using a theorem we had earlier,"},{"Start":"09:28.310 ","End":"09:34.145","Text":"it was stated with n by n matrices and n linearly independent vectors."},{"Start":"09:34.145 ","End":"09:36.570","Text":"Then we know it\u0027s diagonalizable,"},{"Start":"09:36.570 
","End":"09:40.325","Text":"and how do we know that these 2 vectors are linearly independent?"},{"Start":"09:40.325 ","End":"09:44.570","Text":"There was another theorem that if they come from different eigenvalues,"},{"Start":"09:44.570 ","End":"09:47.680","Text":"then they are linearly independent."},{"Start":"09:47.680 ","End":"09:50.840","Text":"In that case we know it\u0027s diagonalizable."},{"Start":"09:50.840 ","End":"09:53.990","Text":"The question is, how do we actually diagonalize it?"},{"Start":"09:53.990 ","End":"09:55.790","Text":"What our task is,"},{"Start":"09:55.790 ","End":"09:58.835","Text":"is to find 2 matrices, P and D,"},{"Start":"09:58.835 ","End":"10:01.865","Text":"where P is invertible and D is diagonal,"},{"Start":"10:01.865 ","End":"10:05.390","Text":"such that this equation holds."},{"Start":"10:05.390 ","End":"10:11.480","Text":"P inverse times A times P is D. D is found by just placing"},{"Start":"10:11.480 ","End":"10:17.630","Text":"the eigenvalues along the diagonal and zeros elsewhere."},{"Start":"10:17.630 ","End":"10:21.020","Text":"We have to make note of the order we did it in."},{"Start":"10:21.020 ","End":"10:24.160","Text":"First the 1 plus 2i then the 1 minus 2i,"},{"Start":"10:24.160 ","End":"10:27.010","Text":"because the next part depends on this."},{"Start":"10:27.010 ","End":"10:30.785","Text":"The matrix P is gotten by taking the 2 eigenvectors,"},{"Start":"10:30.785 ","End":"10:32.825","Text":"but we have to do it in the right order,"},{"Start":"10:32.825 ","End":"10:35.570","Text":"corresponding to the 1 plus 2i."},{"Start":"10:35.570 ","End":"10:39.385","Text":"We have this and we write it as a column vector."},{"Start":"10:39.385 ","End":"10:42.960","Text":"I\u0027ve color coded this to help.1 minus 2i,"},{"Start":"10:42.960 ","End":"10:45.650","Text":"so we look here, so it\u0027s 1 minus i2,"},{"Start":"10:45.650 ","End":"10:48.295","Text":"which is the second column."},{"Start":"10:48.295 ","End":"10:50.420","Text":"Basically we\u0027re done."},{"Start":"10:50.420 ","End":"10:55.835","Text":"But I\u0027d just like to make a suggestion if it\u0027s on an exam and you have the time,"},{"Start":"10:55.835 ","End":"10:58.670","Text":"or if you need extra homework,"},{"Start":"10:58.670 ","End":"11:04.145","Text":"you could compute the inverse of P and then compute"},{"Start":"11:04.145 ","End":"11:11.820","Text":"P inverse times A times P and see that you really do get D. 
Anyway it\u0027s totally optional."}],"ID":25760},{"Watched":false,"Name":"Exercise 6","Duration":"4m 44s","ChapterTopicVideoID":24849,"CourseChapterTopicPlaylistID":118359,"HasSubtitles":true,"ThumbnailPath":null,"UploadDate":null,"DurationForVideoObject":null,"Description":null,"MetaTitle":null,"MetaDescription":null,"Canonical":null,"VideoComments":[],"Subtitles":[{"Start":"00:00.000 ","End":"00:05.895","Text":"In this exercise, we have matrix A which is 2 by 2."},{"Start":"00:05.895 ","End":"00:14.280","Text":"Our task is to find the eigenvalues and their corresponding eigenvectors for this matrix."},{"Start":"00:14.280 ","End":"00:18.959","Text":"Then to see if it\u0027s diagonalizable and if so,"},{"Start":"00:18.959 ","End":"00:28.815","Text":"we want to find the matrix P and the matrix D such that this equality holds,"},{"Start":"00:28.815 ","End":"00:33.010","Text":"where P is invertible, D is diagonal."},{"Start":"00:33.020 ","End":"00:35.340","Text":"We have to do this twice,"},{"Start":"00:35.340 ","End":"00:42.075","Text":"once when we consider this as a matrix of real numbers and once with complex numbers."},{"Start":"00:42.075 ","End":"00:43.880","Text":"Real numbers, complex numbers,"},{"Start":"00:43.880 ","End":"00:45.560","Text":"if you haven\u0027t studied complex numbers,"},{"Start":"00:45.560 ","End":"00:48.680","Text":"then you might want to skip this exercise."},{"Start":"00:48.680 ","End":"00:52.640","Text":"We start by finding the characteristic matrix,"},{"Start":"00:52.640 ","End":"00:53.810","Text":"which as you recall,"},{"Start":"00:53.810 ","End":"00:59.570","Text":"is x I minus A. I in this case is the 2 by 2 identity matrix,"},{"Start":"00:59.570 ","End":"01:02.810","Text":"just 1s on the diagonal here, 0 here,"},{"Start":"01:02.810 ","End":"01:05.600","Text":"multiply it by x, so we get x x,"},{"Start":"01:05.600 ","End":"01:13.265","Text":"subtract this original A from it and the result is this."},{"Start":"01:13.265 ","End":"01:19.460","Text":"The next step is going to be to find the characteristic polynomial,"},{"Start":"01:19.460 ","End":"01:24.565","Text":"which is just the determinant of this characteristic matrix."},{"Start":"01:24.565 ","End":"01:26.880","Text":"This looks like this. 
Here it\u0027s brackets,"},{"Start":"01:26.880 ","End":"01:29.185","Text":"here it\u0027s bars here is the determinant."},{"Start":"01:29.185 ","End":"01:30.865","Text":"A 2 by 2 determinant,"},{"Start":"01:30.865 ","End":"01:35.275","Text":"take this diagonal product minus the other diagonal\u0027s product."},{"Start":"01:35.275 ","End":"01:42.865","Text":"If we do that, we get this and we want to simplify that and this is what it is."},{"Start":"01:42.865 ","End":"01:48.220","Text":"This factorizes as x minus 3 squared and we give it a name p of x,"},{"Start":"01:48.220 ","End":"01:51.590","Text":"and that\u0027s our characteristic polynomial."},{"Start":"01:51.590 ","End":"01:54.280","Text":"Now that we have the characteristic polynomial,"},{"Start":"01:54.280 ","End":"01:57.400","Text":"we can go ahead and find the eigenvalues."},{"Start":"01:57.400 ","End":"02:01.660","Text":"To find them, we just take the characteristic polynomial,"},{"Start":"02:01.660 ","End":"02:05.090","Text":"equate it to 0 and solve."},{"Start":"02:05.090 ","End":"02:07.520","Text":"In our case it\u0027s this, by the way,"},{"Start":"02:07.520 ","End":"02:10.880","Text":"there\u0027s a name for this equation is called the characteristic equation."},{"Start":"02:10.880 ","End":"02:12.620","Text":"This in our case."},{"Start":"02:12.620 ","End":"02:15.050","Text":"Now it doesn\u0027t matter if we\u0027re doing this over"},{"Start":"02:15.050 ","End":"02:17.570","Text":"the real numbers are over the complex numbers."},{"Start":"02:17.570 ","End":"02:23.970","Text":"In either case, there is just 1 solution, x equals 3."},{"Start":"02:24.950 ","End":"02:27.860","Text":"Basically, the paths are emerging for"},{"Start":"02:27.860 ","End":"02:30.710","Text":"the reals and the complex is going to be the same continuation."},{"Start":"02:30.710 ","End":"02:34.730","Text":"We only have the eigenvalue x equals 3, like I said."},{"Start":"02:34.730 ","End":"02:40.560","Text":"Now we\u0027ll find the corresponding eigenvectors."},{"Start":"02:43.790 ","End":"02:46.560","Text":"What we do is we take"},{"Start":"02:46.560 ","End":"02:56.040","Text":"this characteristic matrix and we substitute in place of x, the value 3"},{"Start":"02:56.040 ","End":"02:57.870","Text":"which is our eigenvalue."},{"Start":"02:57.870 ","End":"03:01.130","Text":"This now becomes 3 minus 2 is 1,"},{"Start":"03:01.130 ","End":"03:02.450","Text":"3 minus 4 is minus 1."},{"Start":"03:02.450 ","End":"03:04.165","Text":"This is what we get."},{"Start":"03:04.165 ","End":"03:08.880","Text":"This is the corresponding system of linear equations."},{"Start":"03:08.950 ","End":"03:15.605","Text":"The eigenvectors are going to be a basis for the solution space of this system."},{"Start":"03:15.605 ","End":"03:17.810","Text":"We bring to row echelon form."},{"Start":"03:17.810 ","End":"03:21.214","Text":"Just add the first equation to the second."},{"Start":"03:21.214 ","End":"03:26.945","Text":"In fact we get a row of 0s which we can just strike out."},{"Start":"03:26.945 ","End":"03:34.555","Text":"Our system is equivalent to this system which has just 1 equation and 2 unknowns."},{"Start":"03:34.555 ","End":"03:42.970","Text":"Y is the free variable and x is constrained."},{"Start":"03:43.070 ","End":"03:47.420","Text":"We use our technique of letting y be 1."},{"Start":"03:47.420 ","End":"03:48.784","Text":"It could be anything non-zero."},{"Start":"03:48.784 ","End":"03:52.110","Text":"Usually we let the free variable be 1."},{"Start":"03:52.150 
","End":"03:57.390","Text":"If we do that, the next comes out to be minus 1."},{"Start":"03:57.390 ","End":"03:59.340","Text":"Put it in the right order,"},{"Start":"03:59.340 ","End":"04:08.325","Text":"first x and then y and that gives us an eigenvector for the eigenvalue 3."},{"Start":"04:08.325 ","End":"04:14.010","Text":"But we only get 1 eigenvector and that\u0027s not good,"},{"Start":"04:14.010 ","End":"04:17.080","Text":"we wanted 2 eigenvectors."},{"Start":"04:17.080 ","End":"04:22.790","Text":"Because there\u0027s that theorem that an n by n matrix is"},{"Start":"04:22.790 ","End":"04:26.105","Text":"diagonalizable if and only if you have"},{"Start":"04:26.105 ","End":"04:30.320","Text":"n eigenvectors and they have to be linearly independent"},{"Start":"04:30.320 ","End":"04:33.410","Text":"but we don\u0027t have 2 eigenvectors, we only have 1,"},{"Start":"04:33.410 ","End":"04:38.120","Text":"so not diagonalizable, doesn\u0027t matter reals or complex."},{"Start":"04:38.120 ","End":"04:40.670","Text":"This is the point at which we stop,"},{"Start":"04:40.670 ","End":"04:44.610","Text":"we can\u0027t continue. We\u0027re done."}],"ID":25762},{"Watched":false,"Name":"Exercise 7","Duration":"10m 23s","ChapterTopicVideoID":24850,"CourseChapterTopicPlaylistID":118359,"HasSubtitles":true,"ThumbnailPath":null,"UploadDate":null,"DurationForVideoObject":null,"Description":null,"MetaTitle":null,"MetaDescription":null,"Canonical":null,"VideoComments":[],"Subtitles":[{"Start":"00:00.000 ","End":"00:05.055","Text":"In this exercise, we\u0027re given the following 3 by 3 matrix A."},{"Start":"00:05.055 ","End":"00:09.045","Text":"Notice that it has 2 parameters: a and b."},{"Start":"00:09.045 ","End":"00:10.920","Text":"First question is,"},{"Start":"00:10.920 ","End":"00:14.610","Text":"for which value or values of a and b will"},{"Start":"00:14.610 ","End":"00:20.310","Text":"the eigenvalues of matrix A be just 1 and minus 1?"},{"Start":"00:20.310 ","End":"00:25.695","Text":"Then continuing, using the values of a and b that we found,"},{"Start":"00:25.695 ","End":"00:31.270","Text":"we have to decide if this matrix A is diagonalizable."},{"Start":"00:33.080 ","End":"00:37.995","Text":"Now there\u0027s 2 possibilities here for the roots,"},{"Start":"00:37.995 ","End":"00:41.315","Text":"I should have said, of the characteristic polynomial, of course."},{"Start":"00:41.315 ","End":"00:43.820","Text":"The roots is going to be 3 of them,"},{"Start":"00:43.820 ","End":"00:48.540","Text":"but we only have 2 different ones: 1 and minus 1."},{"Start":"00:48.540 ","End":"00:54.230","Text":"There\u0027s either 1 as a double root and minus 1 as a single root or the other way"},{"Start":"00:54.230 ","End":"01:00.770","Text":"around as the 2 possibilities for the 3 roots or eigenvalues."},{"Start":"01:00.770 ","End":"01:04.640","Text":"Now we need to compute the characteristic polynomial,"},{"Start":"01:04.640 ","End":"01:08.295","Text":"which is the determinant of xI minus A,"},{"Start":"01:08.295 ","End":"01:11.055","Text":"I is a 3 by 3 identity matrix."},{"Start":"01:11.055 ","End":"01:12.735","Text":"We\u0027ve seen this many times."},{"Start":"01:12.735 ","End":"01:17.570","Text":"What we do is we just negate all the entries here and add"},{"Start":"01:17.570 ","End":"01:23.565","Text":"x\u0027s on the diagonal and we\u0027ve got this determinant."},{"Start":"01:23.565 ","End":"01:30.380","Text":"Expanding by the first row and remembering that it\u0027s plus minus plus,"},{"Start":"01:30.380 
","End":"01:35.630","Text":"that\u0027s the reason we have the plus here because it\u0027s a minus with a minus, we get this."},{"Start":"01:35.630 ","End":"01:40.685","Text":"Well, I\u0027m not doing all the details because you know how to do this thing."},{"Start":"01:40.685 ","End":"01:43.760","Text":"I\u0027ll just combine the last 2 terms."},{"Start":"01:43.760 ","End":"01:45.170","Text":"I won\u0027t go all the way."},{"Start":"01:45.170 ","End":"01:47.760","Text":"This will be enough for me."},{"Start":"01:48.370 ","End":"01:51.505","Text":"Now let\u0027s start with Case I,"},{"Start":"01:51.505 ","End":"01:57.250","Text":"which is where 1 is a double root and minus 1 is a single root."},{"Start":"01:58.250 ","End":"02:07.135","Text":"What we have is that 1 and minus 1 satisfy the polynomial p roots."},{"Start":"02:07.135 ","End":"02:10.310","Text":"But because 1 is a double root,"},{"Start":"02:10.310 ","End":"02:15.635","Text":"it also satisfies the derivative of p equals 0."},{"Start":"02:15.635 ","End":"02:20.860","Text":"Let\u0027s see what each of these 3 equations gives us."},{"Start":"02:20.860 ","End":"02:23.250","Text":"P of 1 equals 0,"},{"Start":"02:23.250 ","End":"02:29.360","Text":"so we put x equals 1 here and here and here, everywhere there\u0027s x."},{"Start":"02:29.360 ","End":"02:31.145","Text":"This is what we get."},{"Start":"02:31.145 ","End":"02:37.740","Text":"What it boils down to is this equation in a and b."},{"Start":"02:37.960 ","End":"02:45.575","Text":"Substituting minus 1 in p just boils down to 0 equals 0,"},{"Start":"02:45.575 ","End":"02:49.310","Text":"which means we don\u0027t get any extra information because,"},{"Start":"02:49.310 ","End":"02:51.110","Text":"of course, always 0 is equal to 0,"},{"Start":"02:51.110 ","End":"02:53.315","Text":"so this doesn\u0027t give us anything."},{"Start":"02:53.315 ","End":"02:55.415","Text":"But the third one,"},{"Start":"02:55.415 ","End":"02:58.760","Text":"well, we\u0027ll have to first of all find p prime."},{"Start":"02:58.760 ","End":"03:01.520","Text":"We haven\u0027t done all the work yet."},{"Start":"03:01.520 ","End":"03:04.315","Text":"Here was p again."},{"Start":"03:04.315 ","End":"03:06.000","Text":"Well, not quite the same,"},{"Start":"03:06.000 ","End":"03:07.260","Text":"I slightly expanded it."},{"Start":"03:07.260 ","End":"03:10.380","Text":"I multiplied x minus a with these 2."},{"Start":"03:10.380 ","End":"03:16.850","Text":"Anyway, we don\u0027t have to open it all the way to do the derivative using the product rule."},{"Start":"03:16.850 ","End":"03:20.070","Text":"The derivative of this product of 3 things."},{"Start":"03:20.070 ","End":"03:21.230","Text":"It\u0027s got 3 bits."},{"Start":"03:21.230 ","End":"03:24.890","Text":"Each time we differentiate 1 and leave the others untouched,"},{"Start":"03:24.890 ","End":"03:26.735","Text":"but each 1 has a derivative of 1."},{"Start":"03:26.735 ","End":"03:28.910","Text":"It\u0027s this with this,"},{"Start":"03:28.910 ","End":"03:33.050","Text":"and then the first and the last and then the last 2."},{"Start":"03:33.050 ","End":"03:43.275","Text":"16 x minus a gives us just 16 and minus bx plus a constant gives us just minus b."},{"Start":"03:43.275 ","End":"03:46.655","Text":"Now we can plug in the value 1,"},{"Start":"03:46.655 ","End":"03:51.365","Text":"that\u0027s the double root here, into this bit."},{"Start":"03:51.365 ","End":"03:54.320","Text":"There\u0027s some calculations there."},{"Start":"03:54.320 
","End":"03:59.300","Text":"I will just leave you to follow up on this."},{"Start":"03:59.300 ","End":"04:02.540","Text":"We get this equation basically."},{"Start":"04:02.540 ","End":"04:07.205","Text":"This and this is 2 equations in 2 unknowns."},{"Start":"04:07.205 ","End":"04:09.530","Text":"This is the solution."},{"Start":"04:09.530 ","End":"04:11.570","Text":"I\u0027ll do it this time."},{"Start":"04:11.570 ","End":"04:12.860","Text":"I\u0027ll show you how we get that."},{"Start":"04:12.860 ","End":"04:19.940","Text":"Let\u0027s say from here, I can isolate b is equal to 2 minus 2a."},{"Start":"04:19.940 ","End":"04:23.180","Text":"Then substitute that in here."},{"Start":"04:23.180 ","End":"04:29.415","Text":"I can say that 8 minus 4a is equal to b,"},{"Start":"04:29.415 ","End":"04:32.550","Text":"which is 2 minus 2a."},{"Start":"04:32.550 ","End":"04:39.645","Text":"Then we get 6 equals 2a, a equals 3."},{"Start":"04:39.645 ","End":"04:44.055","Text":"Plug in a equals 3 in here."},{"Start":"04:44.055 ","End":"04:49.610","Text":"We\u0027ve got the b is 2 minus 6,"},{"Start":"04:49.610 ","End":"04:51.560","Text":"which is minus 4,"},{"Start":"04:51.560 ","End":"04:53.120","Text":"and there we are."},{"Start":"04:53.120 ","End":"04:59.150","Text":"In future, I won\u0027t be doing this tedious routine stuff."},{"Start":"04:59.150 ","End":"05:01.850","Text":"Let\u0027s get onto the other case."},{"Start":"05:01.850 ","End":"05:08.135","Text":"Remember the other case is where minus 1 is the double root and 1 is the single root."},{"Start":"05:08.135 ","End":"05:10.735","Text":"It\u0027s going to be similar to before."},{"Start":"05:10.735 ","End":"05:15.725","Text":"Like before, 1 and minus 1 satisfy the polynomial."},{"Start":"05:15.725 ","End":"05:22.325","Text":"But this time, minus 1 satisfies the derivative polynomial."},{"Start":"05:22.325 ","End":"05:28.670","Text":"There\u0027s a bunch of stuff that\u0027s just copy-paste from the previous case."},{"Start":"05:28.670 ","End":"05:34.810","Text":"The difference is that now we\u0027re going to substitute minus 1 in here instead of 1."},{"Start":"05:34.810 ","End":"05:36.725","Text":"When we do that,"},{"Start":"05:36.725 ","End":"05:40.985","Text":"the computation comes down to b equals 0."},{"Start":"05:40.985 ","End":"05:43.670","Text":"If b equals 0 and you plug that in here,"},{"Start":"05:43.670 ","End":"05:46.900","Text":"then you\u0027re going to get that a is 1."},{"Start":"05:46.900 ","End":"05:49.625","Text":"That concludes part a."},{"Start":"05:49.625 ","End":"05:52.900","Text":"Now we need to move on to part b."},{"Start":"05:52.900 ","End":"05:57.740","Text":"Here also we\u0027re going to divide up into Case I and Case II."},{"Start":"05:57.740 ","End":"06:01.670","Text":"It\u0027s scrolled off but remember we had a equals 3,"},{"Start":"06:01.670 ","End":"06:04.370","Text":"b equals minus 4 in Case 1."},{"Start":"06:04.370 ","End":"06:09.780","Text":"What we want to do now is substitute that into the matrix A."},{"Start":"06:09.780 ","End":"06:11.730","Text":"Remember here we had a,"},{"Start":"06:11.730 ","End":"06:13.020","Text":"b, b,"},{"Start":"06:13.020 ","End":"06:15.795","Text":"that\u0027s the 3 minus 4 minus 4,"},{"Start":"06:15.795 ","End":"06:18.975","Text":"and this is our matrix."},{"Start":"06:18.975 ","End":"06:23.345","Text":"Our goal is to decide if it\u0027s diagonalizable."},{"Start":"06:23.345 ","End":"06:28.120","Text":"Here\u0027s the characteristic matrix xI minus A."},{"Start":"06:28.120 
","End":"06:30.770","Text":"The strategy is to find"},{"Start":"06:30.770 ","End":"06:35.345","Text":"the eigenvectors for each eigenvalue and count how many there are."},{"Start":"06:35.345 ","End":"06:38.870","Text":"Starting with 1, what we do as well, of course,"},{"Start":"06:38.870 ","End":"06:41.285","Text":"we substitute x equals 1,"},{"Start":"06:41.285 ","End":"06:46.145","Text":"and then we get this."},{"Start":"06:46.145 ","End":"06:49.130","Text":"Just 1 minus 3 is minus 2."},{"Start":"06:49.130 ","End":"06:55.160","Text":"Then we do row operations to bring it to echelon form."},{"Start":"06:55.160 ","End":"07:04.280","Text":"For example, if I subtract twice this row or rather add twice this row to this row,"},{"Start":"07:04.280 ","End":"07:08.855","Text":"I get a row of 0s and then I can interchange these 2,"},{"Start":"07:08.855 ","End":"07:13.400","Text":"and this is the echelon form now."},{"Start":"07:13.400 ","End":"07:15.925","Text":"We have a row of 0s."},{"Start":"07:15.925 ","End":"07:22.565","Text":"Now Imagine this matrix as being like a system of linear equations, homogeneous."},{"Start":"07:22.565 ","End":"07:27.855","Text":"When we get a row of 0s and this is,"},{"Start":"07:27.855 ","End":"07:30.390","Text":"you can\u0027t reduce it any further,"},{"Start":"07:30.390 ","End":"07:33.620","Text":"we get just 1 free variable,"},{"Start":"07:33.620 ","End":"07:35.915","Text":"that\u0027s the third variable."},{"Start":"07:35.915 ","End":"07:39.980","Text":"That means that we only have 1 eigenvector or"},{"Start":"07:39.980 ","End":"07:44.465","Text":"rather it has a dimension 1 for the eigenspace."},{"Start":"07:44.465 ","End":"07:46.970","Text":"Anyway, 1 eigenvector."},{"Start":"07:46.970 ","End":"07:51.375","Text":"But 1 is a double root."},{"Start":"07:51.375 ","End":"07:55.280","Text":"The algebraic multiplicity is 2,"},{"Start":"07:55.280 ","End":"07:59.850","Text":"but the geometric multiplicity is only 1."},{"Start":"08:00.710 ","End":"08:06.590","Text":"The condition for a matrix to be diagonalizable is that"},{"Start":"08:06.590 ","End":"08:08.240","Text":"each eigenvalue should have"},{"Start":"08:08.240 ","End":"08:14.629","Text":"the same geometric and algebraic multiplicity but already for eigenvalue 1,"},{"Start":"08:14.629 ","End":"08:20.430","Text":"we see that 1 is not equal to 2."},{"Start":"08:20.430 ","End":"08:25.220","Text":"The algebraic is 2 and the geometric multiplicity is 1,"},{"Start":"08:25.220 ","End":"08:27.845","Text":"so A is not diagonalizable."},{"Start":"08:27.845 ","End":"08:30.415","Text":"That\u0027s for Case I."},{"Start":"08:30.415 ","End":"08:33.420","Text":"Now we move on to Case II,"},{"Start":"08:33.420 ","End":"08:38.770","Text":"where minus 1 was a double root and we found a and b to be 1 and 0."},{"Start":"08:38.770 ","End":"08:44.650","Text":"We substitute that into our matrix and we get the 1, 0, 0."},{"Start":"08:44.650 ","End":"08:45.910","Text":"This is the a, b, b,"},{"Start":"08:45.910 ","End":"08:48.059","Text":"the rest of it is the same."},{"Start":"08:48.059 ","End":"08:51.665","Text":"Here\u0027s the characteristic matrix."},{"Start":"08:51.665 ","End":"08:56.690","Text":"As before, we\u0027re going to substitute the eigenvalue with an x equal minus 1 and"},{"Start":"08:56.690 ","End":"09:02.620","Text":"then start bringing this to row echelon form."},{"Start":"09:02.620 ","End":"09:08.730","Text":"Here we are. 
First, the substitution x equals minus 1 gives us this."},{"Start":"09:08.800 ","End":"09:17.629","Text":"Then we switch the first and the third rows."},{"Start":"09:17.629 ","End":"09:25.180","Text":"It\u0027s more convenient because we already have 0s here and then do some canceling,"},{"Start":"09:25.180 ","End":"09:29.040","Text":"divide this row by minus 2 and this row also."},{"Start":"09:29.040 ","End":"09:37.580","Text":"We get this and then we subtract the first row from the second and from the third."},{"Start":"09:38.330 ","End":"09:48.300","Text":"Finally, we interchange these 2 rows."},{"Start":"09:48.300 ","End":"09:52.530","Text":"Now we have a row of 0s then as before."},{"Start":"09:52.530 ","End":"09:57.340","Text":"We have only 1 free variable for the corresponding SLE,"},{"Start":"09:57.340 ","End":"10:00.985","Text":"which means that only 1 eigenvector."},{"Start":"10:00.985 ","End":"10:03.400","Text":"Same story as before,"},{"Start":"10:03.400 ","End":"10:07.310","Text":"geometric multiplicity of the eigenvalue"},{"Start":"10:07.310 ","End":"10:12.340","Text":"minus 1 is 1 whereas the algebraic multiplicity is 2, they\u0027re not equal."},{"Start":"10:12.340 ","End":"10:22.890","Text":"Once again, it is not diagonalizable. We\u0027re done."}],"ID":25763},{"Watched":false,"Name":"Exercise 8","Duration":"5m 4s","ChapterTopicVideoID":24851,"CourseChapterTopicPlaylistID":118359,"HasSubtitles":true,"ThumbnailPath":null,"UploadDate":null,"DurationForVideoObject":null,"Description":null,"MetaTitle":null,"MetaDescription":null,"Canonical":null,"VideoComments":[],"Subtitles":[{"Start":"00:00.000 ","End":"00:03.330","Text":"This exercise is a reverse exercise."},{"Start":"00:03.330 ","End":"00:09.750","Text":"Usually you\u0027re given a matrix A and you have to find its eigenvalues and eigenvectors."},{"Start":"00:09.750 ","End":"00:11.820","Text":"Here, it\u0027s the opposite."},{"Start":"00:11.820 ","End":"00:13.965","Text":"We have a 3 by 3 matrix,"},{"Start":"00:13.965 ","End":"00:18.260","Text":"and we\u0027re given that the 3 eigenvectors,"},{"Start":"00:18.260 ","End":"00:23.465","Text":"these corresponding respectively to eigenvalues."},{"Start":"00:23.465 ","End":"00:26.030","Text":"This with this, this with this,"},{"Start":"00:26.030 ","End":"00:27.830","Text":"this with this, and so on."},{"Start":"00:27.830 ","End":"00:32.370","Text":"We have to find the matrix A."},{"Start":"00:32.370 ","End":"00:38.735","Text":"Now it\u0027s not hard to check that these 3 eigenvectors are linearly independent."},{"Start":"00:38.735 ","End":"00:41.509","Text":"Well, actually they have to be because"},{"Start":"00:41.509 ","End":"00:46.865","Text":"eigenvectors corresponding to different eigenvalues are always linearly independent."},{"Start":"00:46.865 ","End":"00:49.955","Text":"We know that A is diagonalizable,"},{"Start":"00:49.955 ","End":"00:52.025","Text":"we just have to construct it."},{"Start":"00:52.025 ","End":"01:02.415","Text":"The thing is that we know that P minus 1 AP equals D,"},{"Start":"01:02.415 ","End":"01:12.510","Text":"where P is the matrix built from these 3 eigenvectors, this."},{"Start":"01:12.510 ","End":"01:22.780","Text":"D is simply the diagonal with the 3 eigenvalues in other words, this."},{"Start":"01:22.780 ","End":"01:28.550","Text":"Now from this equation we can extract A and do the computation."},{"Start":"01:28.550 ","End":"01:33.340","Text":"From here, we can get that A is PD P inverse,"},{"Start":"01:33.340 ","End":"01:35.230","Text":"but 
we don\u0027t have P inverse,"},{"Start":"01:35.230 ","End":"01:36.910","Text":"so we\u0027ll have to compute that."},{"Start":"01:36.910 ","End":"01:40.295","Text":"I\u0027ll postpone that to later and just give you the answer."},{"Start":"01:40.295 ","End":"01:48.740","Text":"Here it is, the inverse of P. If you do the computation multiply,"},{"Start":"01:48.740 ","End":"01:55.120","Text":"let\u0027s see, P times D times P inverse, we get this."},{"Start":"01:55.120 ","End":"01:57.520","Text":"Now I skimped on the computations,"},{"Start":"01:57.520 ","End":"02:01.580","Text":"I didn\u0027t show you how we multiply this out and I didn\u0027t show you how we"},{"Start":"02:01.580 ","End":"02:07.580","Text":"got the inverse of P. If you\u0027re not interested,"},{"Start":"02:07.580 ","End":"02:08.960","Text":"you can stop now."},{"Start":"02:08.960 ","End":"02:11.950","Text":"Otherwise, I\u0027m going to show you the computations."},{"Start":"02:11.950 ","End":"02:15.620","Text":"We\u0027ll start with the computation for P inverse."},{"Start":"02:15.620 ","End":"02:17.060","Text":"Remember how we do that?"},{"Start":"02:17.060 ","End":"02:26.630","Text":"We have extended augmented matrix with P on 1 side and the identity on the other."},{"Start":"02:26.630 ","End":"02:28.790","Text":"Then through a series of row operations,"},{"Start":"02:28.790 ","End":"02:32.750","Text":"we want to get it so that the identity matrix is"},{"Start":"02:32.750 ","End":"02:38.520","Text":"here on the left and then the inverse of P will be on the right."},{"Start":"02:38.990 ","End":"02:46.770","Text":"First thing is to invert the first and third rows if we want to have 1 here."},{"Start":"02:46.770 ","End":"02:52.440","Text":"Already the rest of the first column is zeroed out."},{"Start":"02:53.270 ","End":"02:58.250","Text":"In the next step we subtracted the second row from the third row."},{"Start":"02:58.250 ","End":"03:02.620","Text":"So now we\u0027re getting close, well,"},{"Start":"03:02.620 ","End":"03:04.400","Text":"it\u0027s an echelon form,"},{"Start":"03:04.400 ","End":"03:10.735","Text":"but we have to keep going because we want also 0s above the diagonal."},{"Start":"03:10.735 ","End":"03:13.930","Text":"But it\u0027s easiest if we have a 1 here,"},{"Start":"03:13.930 ","End":"03:17.585","Text":"so we divide the last row by minus 2."},{"Start":"03:17.585 ","End":"03:21.175","Text":"Don\u0027t forget to do it to the right-hand side also."},{"Start":"03:21.175 ","End":"03:29.745","Text":"This is very good. 
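As a quick aside before finishing the row reduction: a reconstruction of this kind, A = PDP^{-1}, is easy to sanity-check numerically. A minimal sketch, with placeholder eigenvectors and eigenvalues rather than the actual data of this exercise:

```python
import numpy as np

# Placeholder eigenvectors (columns of P) and eigenvalues (diagonal of D);
# illustrative values only, not the data from this exercise.
P = np.array([[1.0, 0.0, 1.0],
              [0.0, 1.0, 1.0],
              [1.0, 1.0, 0.0]])
D = np.diag([2.0, 3.0, -1.0])

# Reconstruct A from P^{-1} A P = D, i.e. A = P D P^{-1}.
A = P @ D @ np.linalg.inv(P)

# Each column of P should now be an eigenvector of A with the matching eigenvalue.
for i, lam in enumerate(np.diag(D)):
    assert np.allclose(A @ P[:, i], lam * P[:, i])
```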
Now we could subtract the third row from the second row to get this."},{"Start":"03:29.745 ","End":"03:36.640","Text":"Now all that we need is to 0 this element by subtracting the second from the first."},{"Start":"03:36.640 ","End":"03:40.320","Text":"That gives us the identity here."},{"Start":"03:40.320 ","End":"03:45.315","Text":"What\u0027s here is P inverse,"},{"Start":"03:45.315 ","End":"03:47.940","Text":"and there\u0027s all this 0.5 stuff,"},{"Start":"03:47.940 ","End":"03:56.965","Text":"so let\u0027s just take 1/2 out of the brackets and we have that P inverse is 1/2 of this."},{"Start":"03:56.965 ","End":"03:59.180","Text":"That\u0027s 1 computation."},{"Start":"03:59.180 ","End":"04:01.295","Text":"But there was another,"},{"Start":"04:01.295 ","End":"04:07.410","Text":"which was this computation to compute A from PD P inverse."},{"Start":"04:09.170 ","End":"04:13.020","Text":"Here\u0027s P, here\u0027s D,"},{"Start":"04:13.020 ","End":"04:16.825","Text":"and 1/2 with this is P inverse."},{"Start":"04:16.825 ","End":"04:21.845","Text":"Took the opportunity to put the 1/2 in with this because everything here is even."},{"Start":"04:21.845 ","End":"04:25.460","Text":"So this becomes 3, 1 minus 2."},{"Start":"04:25.460 ","End":"04:31.715","Text":"Then leaving the third matrix alone, multiplying these 2."},{"Start":"04:31.715 ","End":"04:36.455","Text":"For example, we take this first row with this first column,"},{"Start":"04:36.455 ","End":"04:38.650","Text":"and we get 0 times 3, 1 times 0,"},{"Start":"04:38.650 ","End":"04:40.040","Text":"minus 1 times 0,"},{"Start":"04:40.040 ","End":"04:42.020","Text":"that gives us the 0 here and so on."},{"Start":"04:42.020 ","End":"04:44.015","Text":"You know how to multiply matrices."},{"Start":"04:44.015 ","End":"04:46.490","Text":"Then we just have to multiply these 2,"},{"Start":"04:46.490 ","End":"04:47.840","Text":"for example, 0, 1,"},{"Start":"04:47.840 ","End":"04:50.135","Text":"2 with minus 1, 1,"},{"Start":"04:50.135 ","End":"04:56.840","Text":"1 will give us 1 times 1 minus 2 is minus 1,"},{"Start":"04:56.840 ","End":"04:58.760","Text":"and so on for the rest."},{"Start":"04:58.760 ","End":"05:01.325","Text":"That\u0027s what we got before."},{"Start":"05:01.325 ","End":"05:04.590","Text":"Now we are done."}],"ID":25764},{"Watched":false,"Name":"Exercise 9","Duration":"2m 2s","ChapterTopicVideoID":24852,"CourseChapterTopicPlaylistID":118359,"HasSubtitles":true,"ThumbnailPath":null,"UploadDate":null,"DurationForVideoObject":null,"Description":null,"MetaTitle":null,"MetaDescription":null,"Canonical":null,"VideoComments":[],"Subtitles":[{"Start":"00:00.000 ","End":"00:06.240","Text":"In this exercise, we have to determine if there is a 3 by 3 matrix which"},{"Start":"00:06.240 ","End":"00:12.870","Text":"has these 3 eigenvectors and corresponding eigenvalues."},{"Start":"00:12.870 ","End":"00:14.145","Text":"This, this and this,"},{"Start":"00:14.145 ","End":"00:15.870","Text":"and this goes with 1,"},{"Start":"00:15.870 ","End":"00:21.480","Text":"this goes with 2, this goes with 3 and if it exists then find it."},{"Start":"00:21.480 ","End":"00:26.790","Text":"In fact, there is no such matrix and we\u0027ll show that."},{"Start":"00:26.790 ","End":"00:30.210","Text":"The strategy will be to show that we have"},{"Start":"00:30.210 ","End":"00:35.820","Text":"3 different eigenvalues and we should expect 3 linearly independent eigenvectors."},{"Start":"00:35.820 ","End":"00:39.595","Text":"If I show that these 3 are not linearly 
independent, then we\u0027re done."},{"Start":"00:39.595 ","End":"00:44.370","Text":"I just wrote down what I just said."},{"Start":"00:44.370 ","End":"00:49.950","Text":"Let\u0027s go show that these 3 are not linearly independent."},{"Start":"00:49.950 ","End":"00:54.890","Text":"We just take the columns and put them into a 3 by"},{"Start":"00:54.890 ","End":"01:02.200","Text":"3 matrix and then do row operations on this and see if we get a row of 0s at the end."},{"Start":"01:02.200 ","End":"01:07.190","Text":"First, we zero out the rest of the first column by subtracting multiples of"},{"Start":"01:07.190 ","End":"01:14.015","Text":"this 4 times from here and 7 times from here and we end up with this."},{"Start":"01:14.015 ","End":"01:23.240","Text":"Next, we\u0027re going to just add twice this row to this row and we get a row of 0s,"},{"Start":"01:23.240 ","End":"01:28.550","Text":"which means that these 3 vectors are not linearly independent and"},{"Start":"01:28.550 ","End":"01:35.450","Text":"so there is no such matrix because we\u0027d get a contradiction."},{"Start":"01:35.450 ","End":"01:38.930","Text":"By the way, it\u0027s also easy to see that these 3 are not linearly"},{"Start":"01:38.930 ","End":"01:43.160","Text":"independent by noticing because of the numbers"},{"Start":"01:43.160 ","End":"01:49.250","Text":"that the middle vector is the average of the 2 on the outside and"},{"Start":"01:49.250 ","End":"01:56.765","Text":"then we can have a linear combination like this minus twice this plus this is 0."},{"Start":"01:56.765 ","End":"02:02.250","Text":"Anyway, this is the more standard approach. We\u0027re done."}],"ID":25765},{"Watched":false,"Name":"Exercise 10","Duration":"4m 42s","ChapterTopicVideoID":24853,"CourseChapterTopicPlaylistID":118359,"HasSubtitles":true,"ThumbnailPath":null,"UploadDate":null,"DurationForVideoObject":null,"Description":null,"MetaTitle":null,"MetaDescription":null,"Canonical":null,"VideoComments":[],"Subtitles":[{"Start":"00:00.020 ","End":"00:03.855","Text":"This exercise has 4 parts,"},{"Start":"00:03.855 ","End":"00:04.965","Text":"and each of them,"},{"Start":"00:04.965 ","End":"00:08.595","Text":"it\u0027s true or false, prove or disprove."},{"Start":"00:08.595 ","End":"00:13.305","Text":"Part A, every diagonalizable matrix is invertible,"},{"Start":"00:13.305 ","End":"00:17.970","Text":"and B, every diagonalizable matrix is not invertible."},{"Start":"00:17.970 ","End":"00:24.090","Text":"Part C, every matrix is diagonalizable, and in part D,"},{"Start":"00:24.090 ","End":"00:27.450","Text":"there exists a matrix A which has"},{"Start":"00:27.450 ","End":"00:34.320","Text":"an eigenvector like so with corresponding eigenvalue of 14."},{"Start":"00:34.320 ","End":"00:36.850","Text":"Okay, let\u0027s begin."},{"Start":"00:36.850 ","End":"00:39.320","Text":"For part A,"},{"Start":"00:39.320 ","End":"00:41.075","Text":"we just have to find"},{"Start":"00:41.075 ","End":"00:47.060","Text":"1 counter example of something that\u0027s diagonalizable, but not invertible."},{"Start":"00:47.060 ","End":"00:52.640","Text":"Just put 1 of the entries on the diagonal as a 0,"},{"Start":"00:52.640 ","End":"00:54.230","Text":"or more of them,"},{"Start":"00:54.230 ","End":"00:56.450","Text":"and it\u0027s already not invertible."},{"Start":"00:56.450 ","End":"01:00.590","Text":"Not invertible because it has a row of 0s,"},{"Start":"01:00.590 ","End":"01:03.579","Text":"so can\u0027t be invertible."},{"Start":"01:03.579 ","End":"01:06.630","Text":"Part B, 
same idea,"},{"Start":"01:06.630 ","End":"01:09.620","Text":"we have to find 1 counterexample,"},{"Start":"01:09.620 ","End":"01:11.680","Text":"or to give a proof."},{"Start":"01:11.720 ","End":"01:14.310","Text":"This time, as an example,"},{"Start":"01:14.310 ","End":"01:18.770","Text":"I take all non-0s on the diagonal,"},{"Start":"01:18.770 ","End":"01:20.720","Text":"and then it is invertible."},{"Start":"01:20.720 ","End":"01:25.285","Text":"In other words, diagonalizable matrix could be non-invertible,"},{"Start":"01:25.285 ","End":"01:27.895","Text":"or invertible, you can\u0027t say."},{"Start":"01:27.895 ","End":"01:29.685","Text":"What about part C,"},{"Start":"01:29.685 ","End":"01:32.465","Text":"is every matrix diagonalizable?"},{"Start":"01:32.465 ","End":"01:35.555","Text":"Turns out that the answer is no,"},{"Start":"01:35.555 ","End":"01:38.975","Text":"and we can find a 2-by-2 example."},{"Start":"01:38.975 ","End":"01:41.840","Text":"If we take ones on this diagonal,"},{"Start":"01:41.840 ","End":"01:44.555","Text":"but put an extra 1 here,"},{"Start":"01:44.555 ","End":"01:48.060","Text":"it\u0027s invertible, that\u0027s easy to check."},{"Start":"01:48.060 ","End":"01:49.470","Text":"Just check its determinant,"},{"Start":"01:49.470 ","End":"01:51.235","Text":"the determinant is 1,"},{"Start":"01:51.235 ","End":"01:53.550","Text":"but it\u0027s not diagonalizable,"},{"Start":"01:53.550 ","End":"01:57.329","Text":"and then we\u0027re going to have to prove that this isn\u0027t."},{"Start":"01:57.350 ","End":"02:01.010","Text":"The strategy will be the 1 with"},{"Start":"02:01.010 ","End":"02:06.010","Text":"the geometric multiplicity and the algebraic multiplicity."},{"Start":"02:06.010 ","End":"02:12.810","Text":"The characteristic matrix xI minus A is this,"},{"Start":"02:12.810 ","End":"02:19.220","Text":"and the characteristic polynomial is the determinant of this,"},{"Start":"02:19.220 ","End":"02:20.570","Text":"well, this times this is 0,"},{"Start":"02:20.570 ","End":"02:23.345","Text":"so it\u0027s just x minus 1 squared,"},{"Start":"02:23.345 ","End":"02:27.855","Text":"it\u0027s a polynomial, it has a double root of 1,"},{"Start":"02:27.855 ","End":"02:31.350","Text":"so it only has an eigenvalue of 1,"},{"Start":"02:31.350 ","End":"02:34.190","Text":"or you could say it\u0027s a double eigenvalue,"},{"Start":"02:34.190 ","End":"02:40.400","Text":"1 and 1, but only 1 distinct eigenvalue."},{"Start":"02:40.400 ","End":"02:44.600","Text":"We want to see how many eigenvectors it has, 1 or 2."},{"Start":"02:44.600 ","End":"02:49.080","Text":"We substitute x equals 1 in here,"},{"Start":"02:49.100 ","End":"02:51.810","Text":"and we get 0,0,"},{"Start":"02:51.810 ","End":"02:53.625","Text":"we have minus 1 here,"},{"Start":"02:53.625 ","End":"02:56.000","Text":"and as a system of linear equations,"},{"Start":"02:56.000 ","End":"03:00.140","Text":"it only has 1 free variable, the second variable,"},{"Start":"03:00.140 ","End":"03:06.960","Text":"which means that there is only 1 eigenvector,"},{"Start":"03:06.960 ","End":"03:12.020","Text":"so the eigenvalue 1 has algebraic multiplicity 2,"},{"Start":"03:12.020 ","End":"03:14.915","Text":"but geometric multiplicity only 1,"},{"Start":"03:14.915 ","End":"03:17.420","Text":"and when that happens for an eigenvalue,"},{"Start":"03:17.420 ","End":"03:20.710","Text":"then the matrix is not diagonalizable."},{"Start":"03:20.710 ","End":"03:25.305","Text":"Now part D, let\u0027s just scroll back and see what it 
was."},{"Start":"03:25.305 ","End":"03:31.025","Text":"We were asking if there is"},{"Start":"03:31.025 ","End":"03:37.145","Text":"a matrix with this as an eigenvector with eigenvalue 14."},{"Start":"03:37.145 ","End":"03:42.530","Text":"Well, turns out we can make any vector have eigenvalue of 14."},{"Start":"03:42.530 ","End":"03:48.520","Text":"Just let the matrix be 14 times the identity matrix."},{"Start":"03:48.520 ","End":"03:50.775","Text":"Then for any vector,"},{"Start":"03:50.775 ","End":"03:53.085","Text":"not just the 1 given,"},{"Start":"03:53.085 ","End":"03:54.870","Text":"A times v,"},{"Start":"03:54.870 ","End":"04:01.135","Text":"which is 14 identity times v is v,"},{"Start":"04:01.135 ","End":"04:04.850","Text":"is 14v, so av is 14."},{"Start":"04:04.850 ","End":"04:08.350","Text":"V means that 14 is an eigenvalue for this."},{"Start":"04:08.350 ","End":"04:10.085","Text":"In particular, it\u0027s true for this."},{"Start":"04:10.085 ","End":"04:15.080","Text":"The other way of showing it would be by direct computation."},{"Start":"04:15.080 ","End":"04:17.330","Text":"Here\u0027s matrix A,"},{"Start":"04:17.330 ","End":"04:19.210","Text":"which is 14i,"},{"Start":"04:19.210 ","End":"04:21.165","Text":"this 14 is on the diagonal,"},{"Start":"04:21.165 ","End":"04:23.355","Text":"here\u0027s the vector we were given,"},{"Start":"04:23.355 ","End":"04:26.930","Text":"and if you multiply it out,"},{"Start":"04:26.930 ","End":"04:30.690","Text":"you would get 56, 14,"},{"Start":"04:30.690 ","End":"04:34.680","Text":"140, and then you took the 14 out,"},{"Start":"04:34.680 ","End":"04:37.430","Text":"you\u0027d get 14 times this,"},{"Start":"04:37.430 ","End":"04:39.560","Text":"just to check directly."},{"Start":"04:39.560 ","End":"04:42.750","Text":"Okay, so we are done."}],"ID":25766},{"Watched":false,"Name":"Exercise 11","Duration":"4m 32s","ChapterTopicVideoID":24854,"CourseChapterTopicPlaylistID":118359,"HasSubtitles":true,"ThumbnailPath":null,"UploadDate":null,"DurationForVideoObject":null,"Description":null,"MetaTitle":null,"MetaDescription":null,"Canonical":null,"VideoComments":[],"Subtitles":[{"Start":"00:00.000 ","End":"00:07.140","Text":"In this exercise, we\u0027re given A which is a diagonalizable square matrix."},{"Start":"00:07.140 ","End":"00:12.270","Text":"Part a, asks us to prove that for any scalar k,"},{"Start":"00:12.270 ","End":"00:16.365","Text":"the matrix A plus kI is also diagonalizable."},{"Start":"00:16.365 ","End":"00:18.105","Text":"In part b,"},{"Start":"00:18.105 ","End":"00:23.640","Text":"we\u0027re given that 4 is an eigenvalue of A,"},{"Start":"00:23.640 ","End":"00:28.570","Text":"and we have to find an eigenvalue of this A plus kI."},{"Start":"00:30.050 ","End":"00:34.715","Text":"Now, we\u0027re going to spell it out what it means for A to be diagonalizable."},{"Start":"00:34.715 ","End":"00:38.614","Text":"It means that there are matrices P and D,"},{"Start":"00:38.614 ","End":"00:40.535","Text":"where P is invertible,"},{"Start":"00:40.535 ","End":"00:46.440","Text":"D is a diagonal matrix and P minus 1 AP is"},{"Start":"00:46.440 ","End":"00:50.360","Text":"D. 
What we have to do to show that A plus"},{"Start":"00:50.360 ","End":"00:55.790","Text":"kI is diagonalizable is to find different P and D,"},{"Start":"00:55.790 ","End":"00:58.759","Text":"let\u0027s call them P tilde and D tilde,"},{"Start":"00:58.759 ","End":"01:01.505","Text":"such that for this A plus kI,"},{"Start":"01:01.505 ","End":"01:05.330","Text":"the same basic equation holds just with P"},{"Start":"01:05.330 ","End":"01:12.875","Text":"tilde inverse on the left and P tilde on the right and D tilde,"},{"Start":"01:12.875 ","End":"01:16.260","Text":"which is also a diagonal matrix."},{"Start":"01:16.790 ","End":"01:18.860","Text":"We\u0027ll get to this."},{"Start":"01:18.860 ","End":"01:24.290","Text":"We\u0027ll start off with what we know is that A is PDP inverse."},{"Start":"01:24.290 ","End":"01:28.675","Text":"Add kI to both sides."},{"Start":"01:28.675 ","End":"01:34.965","Text":"Now, we can write I as PIP inverse."},{"Start":"01:34.965 ","End":"01:45.235","Text":"If this becomes this what the idea is that we want to make this as P something P inverse."},{"Start":"01:45.235 ","End":"01:49.720","Text":"Clearly, we can do that if we just take D plus kI."},{"Start":"01:50.210 ","End":"01:55.340","Text":"It\u0027s like we did 2 distributive laws."},{"Start":"01:55.340 ","End":"02:00.565","Text":"We took P out from the left and P inverse out on the right and this is what we get."},{"Start":"02:00.565 ","End":"02:04.010","Text":"Now, if I multiply both sides of the equation on"},{"Start":"02:04.010 ","End":"02:07.175","Text":"the left by P inverse and on the right by P,"},{"Start":"02:07.175 ","End":"02:10.295","Text":"then on the right we\u0027re left with the D plus kI,"},{"Start":"02:10.295 ","End":"02:12.990","Text":"on the left we have this."},{"Start":"02:13.820 ","End":"02:19.430","Text":"If you compare this with this,"},{"Start":"02:19.430 ","End":"02:27.985","Text":"then you see that if we just take P tilde as P and we take D tilde as D plus kI,"},{"Start":"02:27.985 ","End":"02:35.205","Text":"then we found invertible and diagonal matrix."},{"Start":"02:35.205 ","End":"02:39.875","Text":"Of course, it goes without saying or maybe it does need saying that"},{"Start":"02:39.875 ","End":"02:46.820","Text":"a diagonal matrix plus a multiple of the identity matrix is still going to be diagonal,"},{"Start":"02:46.820 ","End":"02:50.045","Text":"because this is just going to have ks on the diagonal,"},{"Start":"02:50.045 ","End":"02:53.880","Text":"so D tilde is also diagonal."},{"Start":"02:53.930 ","End":"02:57.300","Text":"Now, we\u0027re on to part b."},{"Start":"02:57.300 ","End":"03:02.780","Text":"Remember, we were told that 4 is an eigenvalue of A,"},{"Start":"03:02.780 ","End":"03:06.180","Text":"and we have to find an eigenvalue of A, plus kI."},{"Start":"03:06.830 ","End":"03:09.875","Text":"What will help us is this computation,"},{"Start":"03:09.875 ","End":"03:13.020","Text":"we take P minus 1 in front,"},{"Start":"03:13.020 ","End":"03:15.345","Text":"and P after the A plus kI,"},{"Start":"03:15.345 ","End":"03:23.110","Text":"we get by breaking it up to P minus 1 AP plus this."},{"Start":"03:23.660 ","End":"03:29.080","Text":"The second term is just k times the identity."},{"Start":"03:29.080 ","End":"03:37.385","Text":"Now, this p minus 1 AP is the diagonal matrix consisting of eigenvalues of A."},{"Start":"03:37.385 ","End":"03:39.670","Text":"One of them is 4."},{"Start":"03:39.670 ","End":"03:42.045","Text":"I just put it at the top 
left,"},{"Start":"03:42.045 ","End":"03:43.890","Text":"actually it could have been anywhere here,"},{"Start":"03:43.890 ","End":"03:46.415","Text":"and the proof continues the same."},{"Start":"03:46.415 ","End":"03:48.635","Text":"Just leave it at the top left."},{"Start":"03:48.635 ","End":"03:52.855","Text":"K times the identity is just K is along the diagonal."},{"Start":"03:52.855 ","End":"03:57.130","Text":"Of course, the 0 is everywhere else here and here."},{"Start":"03:57.530 ","End":"03:59.600","Text":"If we add these,"},{"Start":"03:59.600 ","End":"04:01.730","Text":"then wherever there was a 4,"},{"Start":"04:01.730 ","End":"04:06.320","Text":"we add k to everything,"},{"Start":"04:06.320 ","End":"04:09.210","Text":"but it\u0027s still just something, something, something."},{"Start":"04:09.800 ","End":"04:15.995","Text":"This is a diagonal matrix and 1 of the values is 4 plus k,"},{"Start":"04:15.995 ","End":"04:24.110","Text":"which goes to show that 4 plus k is an eigenvalue of A plus kI."},{"Start":"04:24.110 ","End":"04:32.460","Text":"This is the D for A plus kI. That\u0027s it."}],"ID":25767},{"Watched":false,"Name":"Exercise 12","Duration":"7m 20s","ChapterTopicVideoID":24855,"CourseChapterTopicPlaylistID":118359,"HasSubtitles":true,"ThumbnailPath":null,"UploadDate":null,"DurationForVideoObject":null,"Description":null,"MetaTitle":null,"MetaDescription":null,"Canonical":null,"VideoComments":[],"Subtitles":[{"Start":"00:00.000 ","End":"00:06.675","Text":"In this exercise, we\u0027re given a 3 by 3 matrix A,"},{"Start":"00:06.675 ","End":"00:12.450","Text":"and suppose that we have 2 eigenvectors, v_1,"},{"Start":"00:12.450 ","End":"00:16.065","Text":"v_2 for eigenvalue 1,"},{"Start":"00:16.065 ","End":"00:18.795","Text":"and we have v_3,"},{"Start":"00:18.795 ","End":"00:22.890","Text":"an eigenvector for minus 1."},{"Start":"00:22.890 ","End":"00:26.415","Text":"There are 3 true false questions."},{"Start":"00:26.415 ","End":"00:28.184","Text":"Prove disprove."},{"Start":"00:28.184 ","End":"00:35.585","Text":"A, v_3 is a linear combination of the vectors v_1, v_2."},{"Start":"00:35.585 ","End":"00:40.805","Text":"Part b, if v_1 and v_2 are linearly independent,"},{"Start":"00:40.805 ","End":"00:47.040","Text":"then A to the power of 2018 is equal to the identity matrix,"},{"Start":"00:47.040 ","End":"00:50.555","Text":"and part c, A is diagonalizable."},{"Start":"00:50.555 ","End":"00:53.095","Text":"Start with a, of course,"},{"Start":"00:53.095 ","End":"00:56.625","Text":"and it\u0027s definitely false,"},{"Start":"00:56.625 ","End":"00:59.160","Text":"and let\u0027s prove that."},{"Start":"00:59.160 ","End":"01:01.560","Text":"We\u0027ll use proof by contradiction,"},{"Start":"01:01.560 ","End":"01:04.170","Text":"so suppose it is a linear combination,"},{"Start":"01:04.170 ","End":"01:09.885","Text":"then we have v_3 is some scalar v_1 plus another scalar v_2,"},{"Start":"01:09.885 ","End":"01:16.380","Text":"and what we get if we apply A to both sides is this,"},{"Start":"01:16.380 ","End":"01:18.225","Text":"then open the brackets,"},{"Start":"01:18.225 ","End":"01:28.440","Text":"and then remember that v_1 and v_2 are eigenvectors for eigenvalue 1,"},{"Start":"01:28.440 ","End":"01:33.195","Text":"which means that Av_1 is v_1 and Av_2 is v_2,"},{"Start":"01:33.195 ","End":"01:35.415","Text":"and this is what we get,"},{"Start":"01:35.415 ","End":"01:37.740","Text":"and this is just v_3,"},{"Start":"01:37.740 ","End":"01:41.115","Text":"so we get Av_3 is 
v_3,"},{"Start":"01:41.115 ","End":"01:43.215","Text":"but on the other hand,"},{"Start":"01:43.215 ","End":"01:47.460","Text":"because v_3 is an eigenvector for a minus 1,"},{"Start":"01:47.460 ","End":"01:50.550","Text":"Av_3 is minus v_3,"},{"Start":"01:50.550 ","End":"01:57.720","Text":"and that can\u0027t be because we would then get that v_3 is minus v_3."},{"Start":"01:57.720 ","End":"02:00.130","Text":"Bring this to the other side twice, v_3 is 0,"},{"Start":"02:00.130 ","End":"02:03.565","Text":"divide by 2, and we\u0027ve got v_3 equals 0."},{"Start":"02:03.565 ","End":"02:06.090","Text":"That\u0027s a contradiction because v_3 is an eigenvector,"},{"Start":"02:06.090 ","End":"02:07.395","Text":"it can\u0027t be 0,"},{"Start":"02:07.395 ","End":"02:11.085","Text":"so part a is false. Let\u0027s move on."},{"Start":"02:11.085 ","End":"02:14.015","Text":"Part b turns out to be true."},{"Start":"02:14.015 ","End":"02:21.160","Text":"That was the 1 about matrix A to the power of 2018. You can go back and see."},{"Start":"02:22.070 ","End":"02:29.290","Text":"Now, where I\u0027m heading for is to show that A is diagonalizable."},{"Start":"02:29.450 ","End":"02:32.580","Text":"My first claim is that v_1, v_2,"},{"Start":"02:32.580 ","End":"02:35.705","Text":"v_3 are linearly independent."},{"Start":"02:35.705 ","End":"02:38.550","Text":"Otherwise, meaning if they\u0027re dependent,"},{"Start":"02:38.550 ","End":"02:42.375","Text":"then we can get v_3 as a combination of v_1 and v_2."},{"Start":"02:42.375 ","End":"02:46.114","Text":"Perhaps I should spell this out in more detail."},{"Start":"02:46.114 ","End":"02:55.970","Text":"Suppose we have that a_1 v_1 plus a_2 v_2 plus a_3 v_3 is 0,"},{"Start":"02:55.970 ","End":"02:59.220","Text":"and that all the coefficients is 0."},{"Start":"02:59.220 ","End":"03:02.700","Text":"I say that a_3 cannot be 0,"},{"Start":"03:02.700 ","End":"03:04.490","Text":"because if a_3 is 0,"},{"Start":"03:04.490 ","End":"03:07.040","Text":"if you just omit the last term,"},{"Start":"03:07.040 ","End":"03:10.580","Text":"it will give you that v_1 and v_2 are linearly dependent,"},{"Start":"03:10.580 ","End":"03:14.355","Text":"and that\u0027s against what is given."},{"Start":"03:14.355 ","End":"03:16.755","Text":"So a_3 is not 0,"},{"Start":"03:16.755 ","End":"03:18.450","Text":"and if a_3 is not 0,"},{"Start":"03:18.450 ","End":"03:23.850","Text":"it means I can bring v_3 to the other side and get v_3 is what?"},{"Start":"03:23.850 ","End":"03:29.340","Text":"Minus a_1 over a_3 v_1,"},{"Start":"03:29.340 ","End":"03:33.870","Text":"plus or minus,"},{"Start":"03:33.870 ","End":"03:39.465","Text":"doesn\u0027t matter a_2 over a_3 v_2."},{"Start":"03:39.465 ","End":"03:44.410","Text":"We\u0027ve got v_3 is a combination of v_1, v_2."},{"Start":"03:44.660 ","End":"03:49.890","Text":"We already showed in part a,"},{"Start":"03:49.890 ","End":"03:55.505","Text":"that v_3 is not a linear combination of v_1, v_2."},{"Start":"03:55.505 ","End":"03:59.300","Text":"Now we do have 3 linearly independent eigenvectors,"},{"Start":"03:59.300 ","End":"04:01.210","Text":"v_1, v_2, v_3,"},{"Start":"04:01.210 ","End":"04:02.900","Text":"and whenever that happens,"},{"Start":"04:02.900 ","End":"04:06.235","Text":"then the matrix is diagonalizable,"},{"Start":"04:06.235 ","End":"04:09.420","Text":"and that means as an infertile matrix P,"},{"Start":"04:09.420 ","End":"04:16.580","Text":"such that P minus 1 or P inverse AP is D. 
But we know what D the diagonal matrix is."},{"Start":"04:16.580 ","End":"04:20.210","Text":"It has the 3 eigenvalues on it."},{"Start":"04:20.210 ","End":"04:22.340","Text":"The order doesn\u0027t necessarily matter,"},{"Start":"04:22.340 ","End":"04:24.270","Text":"but we had 1,"},{"Start":"04:24.270 ","End":"04:26.040","Text":"1, and minus 1,"},{"Start":"04:26.040 ","End":"04:28.685","Text":"so we have this situation now."},{"Start":"04:28.685 ","End":"04:32.300","Text":"Now we can bring the Ps to the other side,"},{"Start":"04:32.300 ","End":"04:37.475","Text":"multiply on the left by P and on the right by P inverse and we\u0027ve got this,"},{"Start":"04:37.475 ","End":"04:43.000","Text":"and now we\u0027re ready to take it to the power of 2018,"},{"Start":"04:43.000 ","End":"04:45.860","Text":"we\u0027ve seen this computation before."},{"Start":"04:45.860 ","End":"04:50.060","Text":"When you have something sandwiched between a matrix and its inverse,"},{"Start":"04:50.060 ","End":"04:56.100","Text":"all you have to do is raise the middle bit to the power of 2018."},{"Start":"04:56.750 ","End":"05:02.794","Text":"Because 2018 is an even number and this is diagonal,"},{"Start":"05:02.794 ","End":"05:06.760","Text":"we raise each of the entries to the power of 2018,"},{"Start":"05:06.760 ","End":"05:10.040","Text":"1 to the power of 2018 is 1,"},{"Start":"05:10.040 ","End":"05:13.950","Text":"1, and again 1 because this is an even number."},{"Start":"05:13.950 ","End":"05:17.695","Text":"Basically we just get the identity matrix in the middle."},{"Start":"05:17.695 ","End":"05:23.570","Text":"PIP minus 1 is just equal to I,"},{"Start":"05:23.570 ","End":"05:28.120","Text":"and this is what we wanted to show and that concludes part b."},{"Start":"05:28.120 ","End":"05:32.250","Text":"Let\u0027s move on. 
Part c turns out to be false,"},{"Start":"05:32.250 ","End":"05:35.345","Text":"so part c said that A is diagonalizable."},{"Start":"05:35.345 ","End":"05:37.070","Text":"When I say false, I mean,"},{"Start":"05:37.070 ","End":"05:39.680","Text":"it may or may not be diagonalizable."},{"Start":"05:39.680 ","End":"05:42.185","Text":"We\u0027ll have to do is give a counterexample."},{"Start":"05:42.185 ","End":"05:44.629","Text":"But I\u0027ll show you that it could go both ways."},{"Start":"05:44.629 ","End":"05:51.345","Text":"For example, if we took the matrix which is diagonal and 1,"},{"Start":"05:51.345 ","End":"05:52.470","Text":"1 minus 1,"},{"Start":"05:52.470 ","End":"05:54.570","Text":"it satisfies the conditions,"},{"Start":"05:54.570 ","End":"05:56.600","Text":"and it already is diagonal,"},{"Start":"05:56.600 ","End":"05:58.430","Text":"so it\u0027s diagonalizable,"},{"Start":"05:58.430 ","End":"06:00.810","Text":"but if we take 1,"},{"Start":"06:00.810 ","End":"06:04.650","Text":"1 minus 1 and stick an extra 1 here,"},{"Start":"06:04.650 ","End":"06:07.170","Text":"the eigenvalues are still 1,"},{"Start":"06:07.170 ","End":"06:08.460","Text":"1 and minus 1,"},{"Start":"06:08.460 ","End":"06:11.175","Text":"but this time it\u0027s not diagonalizable,"},{"Start":"06:11.175 ","End":"06:13.815","Text":"and this is what we\u0027re going to show."},{"Start":"06:13.815 ","End":"06:17.360","Text":"Here, just to justify what I said,"},{"Start":"06:17.360 ","End":"06:19.940","Text":"if you take the characteristic polynomial,"},{"Start":"06:19.940 ","End":"06:24.230","Text":"this 1 is not going to have any effect and we\u0027ll still going to get the same as here,"},{"Start":"06:24.230 ","End":"06:25.910","Text":"x minus 1 squared, x plus 1,"},{"Start":"06:25.910 ","End":"06:29.350","Text":"so the eigenvalues are 1 and minus 1,"},{"Start":"06:29.350 ","End":"06:34.685","Text":"and 1 has algebraic multiplicity 2."},{"Start":"06:34.685 ","End":"06:38.390","Text":"Now I\u0027m skipping the computation details."},{"Start":"06:38.390 ","End":"06:42.395","Text":"The eigenvectors for Eigenvalue 1,"},{"Start":"06:42.395 ","End":"06:44.600","Text":"turns out that there is only 1,"},{"Start":"06:44.600 ","End":"06:47.710","Text":"so that the geometric multiplicity is 1,"},{"Start":"06:47.710 ","End":"06:52.930","Text":"and I could already stop here and say it\u0027s not diagonalizable,"},{"Start":"06:52.930 ","End":"06:56.585","Text":"but just to complete the picture,"},{"Start":"06:56.585 ","End":"06:59.990","Text":"this is the eigenvalue for minus 1,"},{"Start":"06:59.990 ","End":"07:04.120","Text":"the algebraic and geometric multiplicity are 1,"},{"Start":"07:04.120 ","End":"07:08.210","Text":"so altogether we only have 2 and not 3 eigenvectors."},{"Start":"07:08.210 ","End":"07:09.530","Text":"That\u0027s another way of looking at it,"},{"Start":"07:09.530 ","End":"07:11.285","Text":"so A is not diagonalizable,"},{"Start":"07:11.285 ","End":"07:17.560","Text":"but we could have already stopped here when we only got 1 eigenvector for eigenvalue 1."},{"Start":"07:17.560 ","End":"07:21.120","Text":"That\u0027s part c and we\u0027re done."}],"ID":25768},{"Watched":false,"Name":"Exercise 13","Duration":"4m 26s","ChapterTopicVideoID":24856,"CourseChapterTopicPlaylistID":118359,"HasSubtitles":true,"ThumbnailPath":null,"UploadDate":null,"DurationForVideoObject":null,"Description":null,"MetaTitle":null,"MetaDescription":null,"Canonical":null,"VideoComments":[],"Subtitles":[{"Start":"00:00.000 
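One concrete matrix of the kind described in part (c) above (the exact placement of the extra 1 on screen may differ, but any such placement behaves the same way):

\[
A=\begin{pmatrix}1&1&0\\0&1&0\\0&0&-1\end{pmatrix},\qquad
A-I=\begin{pmatrix}0&1&0\\0&0&0\\0&0&-2\end{pmatrix},\qquad
\dim\ker(A-I)=1<2 ,
\]

so the eigenvalue 1 has geometric multiplicity 1 but algebraic multiplicity 2, and this A is not diagonalizable even though its eigenvalues are 1, 1 and minus 1.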
","End":"00:04.410","Text":"In this exercise, we\u0027re given 2 n by n matrices."},{"Start":"00:04.410 ","End":"00:08.445","Text":"1 of them B, which is known to be diagonalizable,"},{"Start":"00:08.445 ","End":"00:13.740","Text":"and the other 1 Q, which is invertible and we have 2 claims."},{"Start":"00:13.740 ","End":"00:16.875","Text":"We have to say true or false, prove or disprove."},{"Start":"00:16.875 ","End":"00:22.890","Text":"The first 1, that Q inverse BQ is a diagonal matrix."},{"Start":"00:22.890 ","End":"00:28.390","Text":"The second part the Q inverse BQ is diagonalizable."},{"Start":"00:28.610 ","End":"00:31.424","Text":"Part a is false,"},{"Start":"00:31.424 ","End":"00:35.175","Text":"and all we have to do is bring 1 counterexample."},{"Start":"00:35.175 ","End":"00:37.350","Text":"I need to give you B and Q."},{"Start":"00:37.350 ","End":"00:39.075","Text":"Lets take B as this,"},{"Start":"00:39.075 ","End":"00:41.355","Text":"1, 2, 0, 3."},{"Start":"00:41.355 ","End":"00:46.910","Text":"It\u0027s diagonalizable, partly I leave it to you as an exercise,"},{"Start":"00:46.910 ","End":"00:52.100","Text":"but if you take the characteristic polynomial,"},{"Start":"00:52.100 ","End":"00:55.040","Text":"you\u0027ll find that it\u0027s x minus 1, x minus 3."},{"Start":"00:55.040 ","End":"00:58.430","Text":"You\u0027ll see that it has 2 different eigenvalues, 1 and 3,"},{"Start":"00:58.430 ","End":"01:02.380","Text":"when we have 2 different eigenvalues,"},{"Start":"01:02.380 ","End":"01:06.615","Text":"for a 2 by 2 matrix, then it\u0027s diagonalizable."},{"Start":"01:06.615 ","End":"01:09.235","Text":"Now for Q,"},{"Start":"01:09.235 ","End":"01:11.870","Text":"I\u0027ll just take the identity matrix."},{"Start":"01:11.870 ","End":"01:17.490","Text":"So you can see that Q minus 1 BQ is just going to be B."},{"Start":"01:17.660 ","End":"01:23.605","Text":"Certainly this thing is not diagonal."},{"Start":"01:23.605 ","End":"01:25.910","Text":"It is diagonalizable we\u0027ve shown,"},{"Start":"01:25.910 ","End":"01:28.130","Text":"but it\u0027s not diagonal."},{"Start":"01:28.130 ","End":"01:31.975","Text":"Let\u0027s get on to part B."},{"Start":"01:31.975 ","End":"01:34.565","Text":"This turns out to be true."},{"Start":"01:34.565 ","End":"01:37.040","Text":"In other words, we can\u0027t conclude that"},{"Start":"01:37.040 ","End":"01:42.125","Text":"this Q inverse BQ is diagonal but it is diagonalizable,"},{"Start":"01:42.125 ","End":"01:46.480","Text":"just like it was in the previous example."},{"Start":"01:46.480 ","End":"01:50.730","Text":"For this to be diagonalizable,"},{"Start":"01:50.730 ","End":"01:57.035","Text":"usually we use P and D. This time I\u0027m going to use M and D. We need to find"},{"Start":"01:57.035 ","End":"02:05.960","Text":"an invertible matrix such that if I put the inverse in front and the matrix afterwards,"},{"Start":"02:05.960 ","End":"02:08.600","Text":"I\u0027ll get something that\u0027s diagonal."},{"Start":"02:08.600 ","End":"02:11.300","Text":"I wanted to save P for something else,"},{"Start":"02:11.300 ","End":"02:16.835","Text":"so I use M and D. Now B was given to be diagonalizable."},{"Start":"02:16.835 ","End":"02:22.160","Text":"So we can find P and D. I shouldn\u0027t really use the same letter D,"},{"Start":"02:22.160 ","End":"02:27.665","Text":"but it\u0027s going to turn out to be the same D. I should use D prime or E or something."},{"Start":"02:27.665 ","End":"02:32.900","Text":"Anyway. 
There\u0027s a P and a D invertible and diagonals such that"},{"Start":"02:32.900 ","End":"02:39.380","Text":"P inverse BP is D. I\u0027m claiming that this is the D that we need here,"},{"Start":"02:39.380 ","End":"02:42.120","Text":"so I don\u0027t need a separate letter."},{"Start":"02:43.940 ","End":"02:53.440","Text":"We\u0027ll take M to be Q inverse P. P is invertible and Q is invertible."},{"Start":"02:53.440 ","End":"02:55.600","Text":"So Q minus 1 is invertible,"},{"Start":"02:55.600 ","End":"02:59.965","Text":"so M is invertible and I got this just by reverse engineering it."},{"Start":"02:59.965 ","End":"03:04.780","Text":"Just going to the end and seeing what it was that I needed to make it work."},{"Start":"03:04.780 ","End":"03:07.255","Text":"The D we take from here,"},{"Start":"03:07.255 ","End":"03:13.310","Text":"and the M we take is Q inverse P. Let\u0027s see if this does the job for us."},{"Start":"03:13.310 ","End":"03:16.330","Text":"Here I just wrote what I said before that M is invertible"},{"Start":"03:16.330 ","End":"03:18.835","Text":"because Q and P are and now we\u0027re going to compute"},{"Start":"03:18.835 ","End":"03:24.810","Text":"this expression which is here and I\u0027ll show you that we get to D in the end."},{"Start":"03:24.810 ","End":"03:28.910","Text":"First I just dropped the brackets because we have an associativity."},{"Start":"03:28.910 ","End":"03:30.775","Text":"It doesn\u0027t matter the order."},{"Start":"03:30.775 ","End":"03:35.480","Text":"Then I group the QM here and notice that M inverse,"},{"Start":"03:35.480 ","End":"03:41.180","Text":"Q inverse is the same as Q M inverse I have to change the order as well."},{"Start":"03:41.180 ","End":"03:44.180","Text":"I guess I should have mentioned that from here,"},{"Start":"03:44.180 ","End":"03:49.295","Text":"that P"},{"Start":"03:49.295 ","End":"03:54.480","Text":"is equal to QM."},{"Start":"03:54.480 ","End":"03:58.565","Text":"All I have to do is multiply both sides on the left by Q."},{"Start":"03:58.565 ","End":"04:01.565","Text":"So if P is QM,"},{"Start":"04:01.565 ","End":"04:07.060","Text":"I can replace QM here by P and we get this."},{"Start":"04:07.060 ","End":"04:09.480","Text":"Now just look over here,"},{"Start":"04:09.480 ","End":"04:16.370","Text":"this is equal to D. 
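Assembled into one chain, with M = Q^{-1}P as chosen above:

\[
M^{-1}\left(Q^{-1}BQ\right)M
= \left(Q^{-1}P\right)^{-1} Q^{-1} B\, Q \left(Q^{-1}P\right)
= P^{-1}Q\,Q^{-1}\,B\,Q\,Q^{-1}P
= P^{-1}BP = D .
\]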
This is equal to D as required."},{"Start":"04:16.370 ","End":"04:19.430","Text":"This is what we wanted to find and so B"},{"Start":"04:19.430 ","End":"04:26.520","Text":"is true and proven and that completes the exercise."}],"ID":25769},{"Watched":false,"Name":"Exercise 14","Duration":"4m 15s","ChapterTopicVideoID":24857,"CourseChapterTopicPlaylistID":118359,"HasSubtitles":true,"ThumbnailPath":null,"UploadDate":null,"DurationForVideoObject":null,"Description":null,"MetaTitle":null,"MetaDescription":null,"Canonical":null,"VideoComments":[],"Subtitles":[{"Start":"00:00.000 ","End":"00:06.615","Text":"In this exercise, we have the following 3 by 3 matrix A,"},{"Start":"00:06.615 ","End":"00:09.090","Text":"and you can see the pattern this is a, b,"},{"Start":"00:09.090 ","End":"00:12.450","Text":"c, this is multiplied by 4, multiplied by 10."},{"Start":"00:12.450 ","End":"00:16.635","Text":"Now, suppose that a has a non-zero eigenvalue,"},{"Start":"00:16.635 ","End":"00:19.785","Text":"we have to prove that a is diagonalizable."},{"Start":"00:19.785 ","End":"00:23.280","Text":"I\u0027d like you to note that a can\u0027t be"},{"Start":"00:23.280 ","End":"00:29.170","Text":"the 0 matrix because that only has 0 as an eigenvalue."},{"Start":"00:29.390 ","End":"00:32.130","Text":"Not sure why that is."},{"Start":"00:32.130 ","End":"00:33.675","Text":"Here\u0027s an explanation."},{"Start":"00:33.675 ","End":"00:37.080","Text":"If we have an eigenvector v,"},{"Start":"00:37.080 ","End":"00:39.570","Text":"which is not the 0 vector of course,"},{"Start":"00:39.570 ","End":"00:43.840","Text":"and av is lambda v. If a is a 0 matrix,"},{"Start":"00:43.840 ","End":"00:45.020","Text":"on the left-hand side,"},{"Start":"00:45.020 ","End":"00:47.094","Text":"we get the 0 vector."},{"Start":"00:47.094 ","End":"00:52.729","Text":"On the right-hand side, we have lambda times v. 
V is not 0."},{"Start":"00:52.729 ","End":"00:56.480","Text":"The co-efficient, the scalar lambda has to be 0."},{"Start":"00:56.480 ","End":"01:00.275","Text":"The only eigenvalue lambda can be 0."},{"Start":"01:00.275 ","End":"01:03.110","Text":"Now, by doing row operations,"},{"Start":"01:03.110 ","End":"01:04.835","Text":"we can get from here to here,"},{"Start":"01:04.835 ","End":"01:08.420","Text":"like subtract 4 times the first row from"},{"Start":"01:08.420 ","End":"01:14.225","Text":"the 2nd row and subtract 10 times the first row from the 3rd row."},{"Start":"01:14.225 ","End":"01:20.175","Text":"We get to this and this is not the 0 matrix as we just said."},{"Start":"01:20.175 ","End":"01:22.515","Text":"The rank has to be 1,"},{"Start":"01:22.515 ","End":"01:23.600","Text":"the rank is 1."},{"Start":"01:23.600 ","End":"01:28.370","Text":"We can apply the rank nullity theorem and conclude that the kernel of"},{"Start":"01:28.370 ","End":"01:36.830","Text":"a has dimension 3 minus 1 because the rank plus the nullity is 3 in this case,"},{"Start":"01:36.830 ","End":"01:39.710","Text":"and the nullity is just the kernel of a."},{"Start":"01:39.710 ","End":"01:44.105","Text":"Now, the kernel is the same as the 0 eigenspace."},{"Start":"01:44.105 ","End":"01:47.645","Text":"All the vectors that have a 0 eigenvalue."},{"Start":"01:47.645 ","End":"01:55.450","Text":"We know that 0 has 2 linearly independent eigenvectors because the dimension is 2."},{"Start":"01:55.450 ","End":"01:58.645","Text":"Let\u0027s call these v_1 and v_2."},{"Start":"01:58.645 ","End":"02:05.030","Text":"Let lambda be the non-zero eigenvalue that we were promised here."},{"Start":"02:05.030 ","End":"02:08.945","Text":"Suppose we have a non-zero eigenvalue, it\u0027s called the lambda."},{"Start":"02:08.945 ","End":"02:11.865","Text":"It has to have an eigenvector corresponding,"},{"Start":"02:11.865 ","End":"02:14.900","Text":"called that v_3, which is not 0."},{"Start":"02:14.900 ","End":"02:18.850","Text":"The claim is that if you take this v_1, v_2,"},{"Start":"02:18.850 ","End":"02:21.860","Text":"and this v_3 together,"},{"Start":"02:21.860 ","End":"02:24.700","Text":"they are linearly independent."},{"Start":"02:24.700 ","End":"02:31.055","Text":"For the proof, suppose that we have a linear combination 0,"},{"Start":"02:31.055 ","End":"02:34.475","Text":"we have to show that each of the coefficients is 0."},{"Start":"02:34.475 ","End":"02:38.105","Text":"Multiply both sides by matrix a,"},{"Start":"02:38.105 ","End":"02:43.880","Text":"and that gives us 0 on this side."},{"Start":"02:43.880 ","End":"02:45.860","Text":"On the left-hand side,"},{"Start":"02:45.860 ","End":"02:52.930","Text":"we can take the scalar in front and we have this equality,"},{"Start":"02:53.020 ","End":"02:58.084","Text":"and Av_1 is 0, Av_2 is 0,"},{"Start":"02:58.084 ","End":"03:00.240","Text":"and Av_3 not 0,"},{"Start":"03:00.240 ","End":"03:04.705","Text":"it\u0027s lambda v_3 because here the eigenvalue is lambda."},{"Start":"03:04.705 ","End":"03:07.070","Text":"We have that this is 0."},{"Start":"03:07.070 ","End":"03:11.720","Text":"Now, this is 0, this part here."},{"Start":"03:11.720 ","End":"03:15.595","Text":"All we\u0027re left with is that this equals 0."},{"Start":"03:15.595 ","End":"03:21.885","Text":"But lambda is not 0 and v_3 is not 0."},{"Start":"03:21.885 ","End":"03:26.250","Text":"We have to have that a_3 is equal to 0."},{"Start":"03:26.250 ","End":"03:28.785","Text":"If a_3 is 0,"},{"Start":"03:28.785 
","End":"03:35.540","Text":"then we have the a_1v_1 plus a_2 v_2 is 0 from the first equation here."},{"Start":"03:35.540 ","End":"03:42.300","Text":"But that means that a_1 equals a_2 equals 0 because we know that v_1,"},{"Start":"03:42.300 ","End":"03:45.215","Text":"v_2 are linearly independent."},{"Start":"03:45.215 ","End":"03:49.420","Text":"Now we\u0027ve got that all the 3 coefficients are 0."},{"Start":"03:49.420 ","End":"03:53.420","Text":"These are linearly independent and therefore they\u0027re a basis."},{"Start":"03:53.420 ","End":"03:58.730","Text":"We have a basis consisting of eigenvectors and that\u0027s"},{"Start":"03:58.730 ","End":"04:04.490","Text":"a condition for a to be diagonalizable and can even say what a diagonal matrix is,"},{"Start":"04:04.490 ","End":"04:10.200","Text":"it going to be all 0s except for a single lambda on the diagonal,"},{"Start":"04:10.200 ","End":"04:15.700","Text":"and 1 of the 3 places doesn\u0027t really matter. We\u0027re done."}],"ID":25770},{"Watched":false,"Name":"Exercise 15","Duration":"7m 27s","ChapterTopicVideoID":24858,"CourseChapterTopicPlaylistID":118359,"HasSubtitles":true,"ThumbnailPath":null,"UploadDate":null,"DurationForVideoObject":null,"Description":null,"MetaTitle":null,"MetaDescription":null,"Canonical":null,"VideoComments":[],"Subtitles":[{"Start":"00:00.000 ","End":"00:06.240","Text":"In this exercise, we have A which is a row matrix 1 by n,"},{"Start":"00:06.240 ","End":"00:07.830","Text":"and we can write it as a_1,"},{"Start":"00:07.830 ","End":"00:09.180","Text":"a_2 up to a_n."},{"Start":"00:09.180 ","End":"00:12.330","Text":"Then we\u0027ll assume that n is bigger than 1."},{"Start":"00:12.330 ","End":"00:15.150","Text":"Now we define another matrix B,"},{"Start":"00:15.150 ","End":"00:18.645","Text":"which is A transpose times A."},{"Start":"00:18.645 ","End":"00:21.420","Text":"We have to show that B is a square matrix,"},{"Start":"00:21.420 ","End":"00:23.880","Text":"and to find its eigenvalues."},{"Start":"00:23.880 ","End":"00:26.100","Text":"But first of all, let\u0027s see if this product is really"},{"Start":"00:26.100 ","End":"00:29.475","Text":"defined because you can\u0027t multiply just any 2 matrices."},{"Start":"00:29.475 ","End":"00:31.620","Text":"In general, if you have,"},{"Start":"00:31.620 ","End":"00:35.895","Text":"let\u0027s say k by l and another matrix l by m,"},{"Start":"00:35.895 ","End":"00:40.240","Text":"as long as this number is the same, then we\u0027re okay."},{"Start":"00:40.240 ","End":"00:43.040","Text":"The new size will be the 2 outer numbers,"},{"Start":"00:43.040 ","End":"00:46.550","Text":"k by m. In our case we have an n by"},{"Start":"00:46.550 ","End":"00:52.285","Text":"1 matrix because that\u0027s the transpose of a 1 by n. You just switch the numbers here,"},{"Start":"00:52.285 ","End":"00:57.260","Text":"and a is 1 by n. We have n by 1 times 1 by n,"},{"Start":"00:57.260 ","End":"01:02.995","Text":"eliminate the 1 we\u0027ve got an n by n. 
It is a square matrix."},{"Start":"01:02.995 ","End":"01:07.040","Text":"Now there\u0027s a proposition about the rank of a product."},{"Start":"01:07.040 ","End":"01:14.900","Text":"The rank of the product of 2 matrices is less than or equal to the rank of each of the matrices,"},{"Start":"01:14.900 ","End":"01:21.560","Text":"or you could say it\u0027s less than or equal to the minimum of the ranks of the 2 matrices."},{"Start":"01:21.560 ","End":"01:25.100","Text":"Anyway, rank of MN is less than or equal to rank of N,"},{"Start":"01:25.100 ","End":"01:29.790","Text":"and it\u0027s also less than or equal to the rank of M. In our case,"},{"Start":"01:29.790 ","End":"01:35.460","Text":"the rank of A transpose A is less than or equal to the rank of A."},{"Start":"01:35.460 ","End":"01:36.960","Text":"The rank of A,"},{"Start":"01:36.960 ","End":"01:38.865","Text":"it\u0027s a row matrix,"},{"Start":"01:38.865 ","End":"01:42.225","Text":"at most it can have rank 1."},{"Start":"01:42.225 ","End":"01:45.860","Text":"The number of linearly independent rows can at most be 1."},{"Start":"01:45.860 ","End":"01:49.370","Text":"The rank of B has to be 0 or 1."},{"Start":"01:49.370 ","End":"01:54.270","Text":"It\u0027s an integer, it\u0027s non-negative and less than or equal to 1,"},{"Start":"01:54.270 ","End":"01:56.205","Text":"so 0 or 1."},{"Start":"01:56.205 ","End":"01:58.130","Text":"We have 2 cases."},{"Start":"01:58.130 ","End":"01:59.930","Text":"The first case would be where the rank is"},{"Start":"01:59.930 ","End":"02:03.080","Text":"0 and the second case will be where the rank is 1."},{"Start":"02:03.080 ","End":"02:06.125","Text":"Now if the rank of a matrix is 0,"},{"Start":"02:06.125 ","End":"02:08.150","Text":"matrix of any size,"},{"Start":"02:08.150 ","End":"02:11.705","Text":"then the matrix has to be 0,"},{"Start":"02:11.705 ","End":"02:17.045","Text":"and the only eigenvalue for the 0 matrix is 0."},{"Start":"02:17.045 ","End":"02:19.690","Text":"We\u0027ve seen this before."},{"Start":"02:20.090 ","End":"02:22.590","Text":"Just by the way though,"},{"Start":"02:22.590 ","End":"02:23.810","Text":"didn\u0027t ask for this,"},{"Start":"02:23.810 ","End":"02:27.350","Text":"the geometric multiplicity and the algebraic multiplicity of"},{"Start":"02:27.350 ","End":"02:32.060","Text":"the 0 matrix are both n. 
That takes care of case 1,"},{"Start":"02:32.060 ","End":"02:33.230","Text":"which is the easy case."},{"Start":"02:33.230 ","End":"02:36.725","Text":"Now let\u0027s take case 2 where the rank is 1."},{"Start":"02:36.725 ","End":"02:42.500","Text":"We\u0027ll apply the rank nullity theorem that the rank plus the nullity is n,"},{"Start":"02:42.500 ","End":"02:45.415","Text":"and we know what the rank is."},{"Start":"02:45.415 ","End":"02:48.815","Text":"The nullity, which is the dimension of the kernel,"},{"Start":"02:48.815 ","End":"02:50.705","Text":"is n minus 1."},{"Start":"02:50.705 ","End":"02:57.965","Text":"Then we know that the kernel of B is exactly the 0-eigenspace of B,"},{"Start":"02:57.965 ","End":"03:03.665","Text":"meaning the set of all eigenvectors for eigenvalue 0."},{"Start":"03:03.665 ","End":"03:09.350","Text":"So 0 is an eigenvalue with geometric multiplicity,"},{"Start":"03:09.350 ","End":"03:13.320","Text":"which is always the dimension of the eigenspace,"},{"Start":"03:13.320 ","End":"03:15.630","Text":"and it\u0027s n minus 1."},{"Start":"03:15.630 ","End":"03:21.685","Text":"This Gamma is the geometric multiplicity. The algebraic multiplicity,"},{"Start":"03:21.685 ","End":"03:26.020","Text":"this letter Mu, has to be bigger or equal to the geometric"},{"Start":"03:26.020 ","End":"03:31.685","Text":"multiplicity and less than or equal to n. If this is n and this is n minus 1,"},{"Start":"03:31.685 ","End":"03:33.210","Text":"so there\u0027s 2 cases."},{"Start":"03:33.210 ","End":"03:40.845","Text":"This is either n minus 1 or n. Now we\u0027ll branch again into 2 sub cases,"},{"Start":"03:40.845 ","End":"03:44.360","Text":"1 where the algebraic multiplicity"},{"Start":"03:44.360 ","End":"03:49.295","Text":"is n and then the other case will be where it\u0027s n minus 1."},{"Start":"03:49.295 ","End":"03:51.015","Text":"Now in this case,"},{"Start":"03:51.015 ","End":"03:55.760","Text":"the characteristic polynomial is x^n,"},{"Start":"03:55.760 ","End":"03:58.505","Text":"and 0 is the only eigenvalue."},{"Start":"03:58.505 ","End":"03:59.990","Text":"Remember, that\u0027s what we\u0027re looking for."},{"Start":"03:59.990 ","End":"04:02.470","Text":"The question says find the eigenvalues."},{"Start":"04:02.470 ","End":"04:10.725","Text":"We had here that the only 1 is 0, and in case 2a it is also 0."},{"Start":"04:10.725 ","End":"04:13.450","Text":"Let\u0027s see what happens in case 2b."},{"Start":"04:13.450 ","End":"04:17.615","Text":"Then the algebraic multiplicity is n minus 1."},{"Start":"04:17.615 ","End":"04:24.230","Text":"The characteristic polynomial is x^n minus 1 times something which is not x,"},{"Start":"04:24.230 ","End":"04:27.005","Text":"x minus Lambda, where Lambda is not 0."},{"Start":"04:27.005 ","End":"04:30.575","Text":"We have a second eigenvalue Lambda."},{"Start":"04:30.575 ","End":"04:33.559","Text":"Let\u0027s see what that Lambda is."},{"Start":"04:33.559 ","End":"04:35.060","Text":"Now multiply out."},{"Start":"04:35.060 ","End":"04:39.560","Text":"We\u0027ve got x^n minus Lambda x^n minus 1."},{"Start":"04:39.560 ","End":"04:48.070","Text":"By a formula from a previous exercise, the second leading coefficient is minus the trace."},{"Start":"04:48.070 ","End":"04:50.745","Text":"Minus Lambda is minus the trace,"},{"Start":"04:50.745 ","End":"04:52.820","Text":"Lambda is the trace of B."},{"Start":"04:52.820 ","End":"05:00.515","Text":"Now, the trace of B I claim is a_1 squared plus a_2 squared and so on up to a_n squared."},{"Start":"05:00.515 
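The conclusion being built here, that B = A transpose A has the eigenvalue 0 with geometric multiplicity n minus 1 plus the single nonzero eigenvalue Lambda equal to the trace a_1 squared + ... + a_n squared, with A transpose as a corresponding eigenvector (shown just below), is easy to sanity-check numerically. A minimal Python/NumPy sketch, using a concrete row vector chosen only for illustration (it is not the one from the clip):

```python
import numpy as np

a = np.array([[1.0, 2.0, 3.0]])             # hypothetical 1-by-n row matrix A, here n = 3
B = a.T @ a                                  # n-by-n matrix of rank 1

print(np.round(np.linalg.eigvals(B), 8))     # 0 (twice) and 14, where 14 = 1^2 + 2^2 + 3^2
print(np.trace(B))                           # 14.0, the nonzero eigenvalue Lambda
print(B @ a.T)                               # equals 14 * a.T, so A transpose is an eigenvector for Lambda
```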
","End":"05:05.580","Text":"Let\u0027s see, this B which is A transpose times A,"},{"Start":"05:05.580 ","End":"05:07.635","Text":"this is the matrix A,"},{"Start":"05:07.635 ","End":"05:09.930","Text":"this is its transpose."},{"Start":"05:09.930 ","End":"05:13.415","Text":"In each case we take a row from here and a column from here."},{"Start":"05:13.415 ","End":"05:15.500","Text":"We get a_1 times a_1,"},{"Start":"05:15.500 ","End":"05:18.485","Text":"then a_1 times a_2, and so on."},{"Start":"05:18.485 ","End":"05:20.255","Text":"Anyway, this is what we get."},{"Start":"05:20.255 ","End":"05:23.750","Text":"The trace, which is the sum of the members on the diagonal,"},{"Start":"05:23.750 ","End":"05:26.580","Text":"will be exactly this."},{"Start":"05:26.580 ","End":"05:29.510","Text":"This case there are 2 eigenvalues,"},{"Start":"05:29.510 ","End":"05:31.800","Text":"0 and this Lambda,"},{"Start":"05:31.800 ","End":"05:35.230","Text":"which is the sum of the a_i squared."},{"Start":"05:35.230 ","End":"05:36.860","Text":"Now we\u0027re done with this exercise,"},{"Start":"05:36.860 ","End":"05:41.420","Text":"but I\u0027d just like to say a little bit more just for interest sake,"},{"Start":"05:41.420 ","End":"05:44.660","Text":"that the eigenvector for this Lambda I claim"},{"Start":"05:44.660 ","End":"05:47.780","Text":"is the column vector which is the transpose of A."},{"Start":"05:47.780 ","End":"05:48.950","Text":"A was the row vector,"},{"Start":"05:48.950 ","End":"05:51.785","Text":"its transpose is the column vector, and I\u0027ll show you this."},{"Start":"05:51.785 ","End":"05:54.745","Text":"First of all it\u0027s not the 0 vector,"},{"Start":"05:54.745 ","End":"05:56.864","Text":"that if A transpose is 0,"},{"Start":"05:56.864 ","End":"06:00.510","Text":"then B is also the 0 matrix,"},{"Start":"06:00.510 ","End":"06:02.865","Text":"and that contradicts the rank,"},{"Start":"06:02.865 ","End":"06:09.234","Text":"and A times A transpose is a row times a column."},{"Start":"06:09.234 ","End":"06:12.730","Text":"This time it gives us a 1-by-1 matrix."},{"Start":"06:12.730 ","End":"06:15.760","Text":"Previously, we did it the other way round with n by n matrix."},{"Start":"06:15.760 ","End":"06:17.665","Text":"Here we get a 1-by-1 matrix,"},{"Start":"06:17.665 ","End":"06:20.045","Text":"which is just Lambda."},{"Start":"06:20.045 ","End":"06:23.920","Text":"I\u0027ll show you. Take this a_1 to a_n times this."},{"Start":"06:23.920 ","End":"06:26.800","Text":"We take this row times this column,"},{"Start":"06:26.800 ","End":"06:32.395","Text":"meaning we multiply in pairs and add that a_1 times a_1 plus a_2 times a_2."},{"Start":"06:32.395 ","End":"06:34.075","Text":"This is what we get."},{"Start":"06:34.075 ","End":"06:40.150","Text":"Now the final bit, we have to show that Bv is Lambda v. Well, Bv is,"},{"Start":"06:40.150 ","End":"06:45.394","Text":"B is A transpose times A and v,"},{"Start":"06:45.394 ","End":"06:48.480","Text":"we already said is A transpose."},{"Start":"06:48.480 ","End":"06:50.540","Text":"This first 2 is B."},{"Start":"06:50.540 ","End":"06:54.590","Text":"The last part is v. 
We know that AA transpose from"},{"Start":"06:54.590 ","End":"06:59.585","Text":"here is the matrix with a single entry Lambda."},{"Start":"06:59.585 ","End":"07:02.675","Text":"A transpose is the column vector."},{"Start":"07:02.675 ","End":"07:09.020","Text":"Now if you multiply a column vector by a 1-by-1 matrix Lambda, if you think about it,"},{"Start":"07:09.020 ","End":"07:13.445","Text":"it just comes out to be Lambda times this column matrix,"},{"Start":"07:13.445 ","End":"07:17.030","Text":"which is Lambda times v. Bv is Lambda v,"},{"Start":"07:17.030 ","End":"07:27.520","Text":"and so this shows us that the eigenvector for this Lambda is this. That\u0027s enough."}],"ID":25771},{"Watched":false,"Name":"Exercise 16","Duration":"7m 2s","ChapterTopicVideoID":24859,"CourseChapterTopicPlaylistID":118359,"HasSubtitles":true,"ThumbnailPath":null,"UploadDate":null,"DurationForVideoObject":null,"Description":null,"MetaTitle":null,"MetaDescription":null,"Canonical":null,"VideoComments":[],"Subtitles":[{"Start":"00:00.000 ","End":"00:06.975","Text":"In this exercise, we have a 5 by 5 real square matrix called a,"},{"Start":"00:06.975 ","End":"00:14.445","Text":"and we have 5 statements about a and we have to prove or disprove each of them."},{"Start":"00:14.445 ","End":"00:22.665","Text":"The first one a, says that there exists a Lambda such that the Lambda eigenspace,"},{"Start":"00:22.665 ","End":"00:28.605","Text":"which is the set of all v such that Av equals Lambda v,"},{"Start":"00:28.605 ","End":"00:33.900","Text":"that this is non-trivial as a subspace of R^5."},{"Start":"00:33.900 ","End":"00:40.535","Text":"Non-trivial means that it has a positive dimension or in other words,"},{"Start":"00:40.535 ","End":"00:43.325","Text":"that it\u0027s not trivial,"},{"Start":"00:43.325 ","End":"00:47.120","Text":"that it\u0027s not just the set containing 0."},{"Start":"00:47.120 ","End":"00:49.480","Text":"Let\u0027s start solving this."},{"Start":"00:49.480 ","End":"00:54.590","Text":"What it really says is that there is some eigenvalue Lambda."},{"Start":"00:54.590 ","End":"00:59.210","Text":"The Lambda eigenspace W_Lambda will be nontrivial,"},{"Start":"00:59.210 ","End":"01:02.060","Text":"which means it contains something other than 0."},{"Start":"01:02.060 ","End":"01:05.600","Text":"This question boils down to the question,"},{"Start":"01:05.600 ","End":"01:09.320","Text":"does there exist an eigenvalue of a?"},{"Start":"01:09.320 ","End":"01:13.355","Text":"Well, turns out this is true and we\u0027ll prove it."},{"Start":"01:13.355 ","End":"01:18.635","Text":"The characteristic polynomial for the matrix a has degree 5,"},{"Start":"01:18.635 ","End":"01:21.450","Text":"and 5 is an odd number."},{"Start":"01:21.900 ","End":"01:27.265","Text":"Now any odd degree polynomial over R has a root,"},{"Start":"01:27.265 ","End":"01:29.499","Text":"and let\u0027s call it Lambda."},{"Start":"01:29.499 ","End":"01:32.230","Text":"This is going to be an eigenvalue of a,"},{"Start":"01:32.230 ","End":"01:36.820","Text":"then the root of the characteristic polynomial is an eigenvalue,"},{"Start":"01:36.820 ","End":"01:38.950","Text":"so that does Part a."},{"Start":"01:38.950 ","End":"01:42.080","Text":"Now we come to Part b."},{"Start":"01:42.080 ","End":"01:49.135","Text":"This says that if v_1 and v_2 are 2 eigenvectors of the matrix a,"},{"Start":"01:49.135 ","End":"01:53.585","Text":"then, so is there some v_1 plus v_2."},{"Start":"01:53.585 ","End":"02:01.195","Text":"Let me remark that if 
the 2 eigenvectors correspond to the same eigenvalue,"},{"Start":"02:01.195 ","End":"02:06.770","Text":"then the claim is true because they belong to the same eigenspace,"},{"Start":"02:06.770 ","End":"02:09.650","Text":"which is a subspace so closed under addition."},{"Start":"02:09.650 ","End":"02:13.289","Text":"But we\u0027re not told that and in general,"},{"Start":"02:13.289 ","End":"02:15.570","Text":"the claim is false."},{"Start":"02:15.570 ","End":"02:22.505","Text":"We\u0027ll give you a counter-example: take A to be the diagonal matrix with a 3 and a 4 here."},{"Start":"02:22.505 ","End":"02:27.845","Text":"V_1 and v_2 will be just the regular basis vectors."},{"Start":"02:27.845 ","End":"02:32.880","Text":"Now, a times v_1 is going to be 3, 0,"},{"Start":"02:32.880 ","End":"02:35.115","Text":"which is 3 times v_1,"},{"Start":"02:35.115 ","End":"02:37.980","Text":"and a times v_2 will be 0,"},{"Start":"02:37.980 ","End":"02:40.320","Text":"4, which is 4 times v_2."},{"Start":"02:40.320 ","End":"02:44.450","Text":"So v_1 and v_2 are eigenvectors but with different eigenvalues."},{"Start":"02:44.450 ","End":"02:48.460","Text":"1 has 3 and the other has 4 as an eigenvalue."},{"Start":"02:48.460 ","End":"02:51.390","Text":"What about v_1 plus v_2?"},{"Start":"02:51.390 ","End":"02:53.130","Text":"Well, the sum is 1,"},{"Start":"02:53.130 ","End":"02:57.375","Text":"1 and if we apply the matrix A to this,"},{"Start":"02:57.375 ","End":"02:59.100","Text":"then we have 3, 0, 0,"},{"Start":"02:59.100 ","End":"03:02.775","Text":"4 times 1,1 which comes out to be 3, 4."},{"Start":"03:02.775 ","End":"03:06.390","Text":"But 3,4 is not equal to Lambda times 1,"},{"Start":"03:06.390 ","End":"03:09.015","Text":"1 for any Lambda."},{"Start":"03:09.015 ","End":"03:13.620","Text":"V_1 plus v_2 is not an eigenvector."},{"Start":"03:13.620 ","End":"03:16.750","Text":"That concludes Part B."},{"Start":"03:16.750 ","End":"03:19.130","Text":"Let me say that from here on,"},{"Start":"03:19.130 ","End":"03:24.800","Text":"a is no longer the 5 by 5 real matrix."},{"Start":"03:24.800 ","End":"03:27.140","Text":"That\u0027s just for parts a and b."},{"Start":"03:27.140 ","End":"03:32.000","Text":"Part c, if A and B are row equivalent square matrices,"},{"Start":"03:32.000 ","End":"03:36.540","Text":"then they have the same eigenvalues and the answer"},{"Start":"03:36.540 ","End":"03:41.120","Text":"to that is false and to disprove the claim,"},{"Start":"03:41.120 ","End":"03:43.505","Text":"we just need to give a counter-example."},{"Start":"03:43.505 ","End":"03:45.455","Text":"Let\u0027s look at these 2."},{"Start":"03:45.455 ","End":"03:48.890","Text":"This one is 1,1 on the diagonal here,"},{"Start":"03:48.890 ","End":"03:53.035","Text":"3,4 on the diagonal, and the rest is 0."},{"Start":"03:53.035 ","End":"03:55.830","Text":"A and b are row equivalent."},{"Start":"03:55.830 ","End":"04:00.260","Text":"Just take 3 times this row and put it into the first row"},{"Start":"04:00.260 ","End":"04:05.210","Text":"and also 4 times the second row and put that into the new second row."},{"Start":"04:05.210 ","End":"04:07.045","Text":"They\u0027re row equivalent,"},{"Start":"04:07.045 ","End":"04:11.660","Text":"but the eigenvalue of a is just 1,"},{"Start":"04:11.660 ","End":"04:14.210","Text":"whereas the eigenvalues of b are 3,"},{"Start":"04:14.210 ","End":"04:17.255","Text":"4 that\u0027s different obviously."},{"Start":"04:17.255 ","End":"04:19.825","Text":"So that\u0027s a 
counter-example."},{"Start":"04:19.825 ","End":"04:28.790","Text":"Now in D, we have an n by n square matrix and the claim is that if a is diagonalizable,"},{"Start":"04:28.790 ","End":"04:32.155","Text":"all the eigenvalues are distinct."},{"Start":"04:32.155 ","End":"04:36.170","Text":"The answer to this is definitely no, false."},{"Start":"04:36.170 ","End":"04:43.160","Text":"A counter-example would be for a to take the identity matrix of"},{"Start":"04:43.160 ","End":"04:51.395","Text":"size n and note that a has n eigenvalues,"},{"Start":"04:51.395 ","End":"04:53.210","Text":"but they\u0027re all the same,"},{"Start":"04:53.210 ","End":"04:55.175","Text":"they\u0027re all equal to 1."},{"Start":"04:55.175 ","End":"04:58.880","Text":"Or we can just say that it only has 1 eigenvalue."},{"Start":"04:58.880 ","End":"05:01.430","Text":"It all depends if you count multiplicity or not."},{"Start":"05:01.430 ","End":"05:05.600","Text":"In any event it doesn\u0027t have n distinct eigenvalues."},{"Start":"05:05.600 ","End":"05:08.020","Text":"All the eigenvalues are 1."},{"Start":"05:08.020 ","End":"05:12.940","Text":"That was d. Now, Part e."},{"Start":"05:12.940 ","End":"05:17.195","Text":"If a is a square n by n matrix over the reals,"},{"Start":"05:17.195 ","End":"05:21.245","Text":"and if all its eigenvalues are distinct,"},{"Start":"05:21.245 ","End":"05:27.500","Text":"then a is diagonalizable over R. A converse of part"},{"Start":"05:27.500 ","End":"05:35.390","Text":"d. I\u0027ll remark that if we were talking about the complex numbers instead of the reals,"},{"Start":"05:35.390 ","End":"05:44.505","Text":"then the characteristic polynomial would have n roots all distinct."},{"Start":"05:44.505 ","End":"05:47.705","Text":"The point is that there are n of them."},{"Start":"05:47.705 ","End":"05:52.340","Text":"That\u0027s the full amount and so it would be diagonalizable."},{"Start":"05:52.340 ","End":"05:57.005","Text":"But over the reals there could be fewer than n roots,"},{"Start":"05:57.005 ","End":"05:59.570","Text":"so the answer is false,"},{"Start":"05:59.570 ","End":"06:02.065","Text":"and here is a counter-example."},{"Start":"06:02.065 ","End":"06:08.000","Text":"Let\u0027s take a to be the following matrix,"},{"Start":"06:08.000 ","End":"06:09.545","Text":"1 and a 2 here,"},{"Start":"06:09.545 ","End":"06:11.765","Text":"1 and a minus 1 here."},{"Start":"06:11.765 ","End":"06:18.410","Text":"The characteristic polynomial, I\u0027ll leave you to do the computation, is x minus 1,"},{"Start":"06:18.410 ","End":"06:21.980","Text":"x minus 2 times x squared plus 1."},{"Start":"06:21.980 ","End":"06:25.175","Text":"This is an irreducible quadratic,"},{"Start":"06:25.175 ","End":"06:30.115","Text":"and this only has 2 real eigenvalues, 1 and 2."},{"Start":"06:30.115 ","End":"06:34.940","Text":"They are different, so all the eigenvalues are different."},{"Start":"06:34.940 ","End":"06:38.195","Text":"But a is not diagonalizable."},{"Start":"06:38.195 ","End":"06:43.650","Text":"Note that to be diagonalizable over R, it would need to have 4 real eigenvalues."},{"Start":"06:43.690 ","End":"06:51.140","Text":"Remark that over C it is diagonalizable because we can keep factorizing this."},{"Start":"06:51.140 ","End":"06:53.645","Text":"We\u0027ll get x plus i, x minus i."},{"Start":"06:53.645 ","End":"06:59.575","Text":"So we have 4 eigenvalues and then it would be diagonalizable."},{"Start":"06:59.575 ","End":"07:03.220","Text":"We\u0027re done with this 
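The part b counterexample above (the diagonal matrix with 3 and 4) is small enough to check by machine as well; a quick Python/NumPy sketch using only what the clip states:

```python
import numpy as np

A = np.diag([3.0, 4.0])
v1 = np.array([1.0, 0.0])     # eigenvector for eigenvalue 3
v2 = np.array([0.0, 1.0])     # eigenvector for eigenvalue 4

w = v1 + v2
print(A @ w)                  # [3. 4.], not a scalar multiple of [1. 1.],
                              # so the sum of two eigenvectors need not be an eigenvector
```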
exercise."}],"ID":25772},{"Watched":false,"Name":"Exercise 17","Duration":"6m ","ChapterTopicVideoID":24860,"CourseChapterTopicPlaylistID":118359,"HasSubtitles":true,"ThumbnailPath":null,"UploadDate":null,"DurationForVideoObject":null,"Description":null,"MetaTitle":null,"MetaDescription":null,"Canonical":null,"VideoComments":[],"Subtitles":[{"Start":"00:00.000 ","End":"00:05.310","Text":"In this exercise, A is a 4-by-4 real matrix,"},{"Start":"00:05.310 ","End":"00:10.485","Text":"and we\u0027re given that A has 4 eigenvalues."},{"Start":"00:10.485 ","End":"00:15.810","Text":"Let me say that it\u0027s ambiguous sometimes whether it means"},{"Start":"00:15.810 ","End":"00:21.815","Text":"4 are all different or in this case including multiplicity,"},{"Start":"00:21.815 ","End":"00:23.760","Text":"so there could be repeats."},{"Start":"00:23.760 ","End":"00:29.835","Text":"Anyway, we\u0027re told that the smallest of the eigenvalues is 2 and the largest is 4."},{"Start":"00:29.835 ","End":"00:33.825","Text":"We have to say which of the following a through e,"},{"Start":"00:33.825 ","End":"00:35.700","Text":"are true or false,"},{"Start":"00:35.700 ","End":"00:38.145","Text":"and each case prove or disprove."},{"Start":"00:38.145 ","End":"00:41.260","Text":"We\u0027ll read each 1 as we come to it."},{"Start":"00:41.260 ","End":"00:43.940","Text":"This is just a summary of what we\u0027re given,"},{"Start":"00:43.940 ","End":"00:45.260","Text":"starting with part a,"},{"Start":"00:45.260 ","End":"00:48.380","Text":"which claims that rank A is equal to 4."},{"Start":"00:48.380 ","End":"00:51.625","Text":"Turns out that this is true,"},{"Start":"00:51.625 ","End":"00:54.150","Text":"so we need to prove that."},{"Start":"00:54.150 ","End":"00:57.425","Text":"Well, 0 isn\u0027t an eigenvalue."},{"Start":"00:57.425 ","End":"00:59.765","Text":"Why isn\u0027t 0 an eigenvalue?"},{"Start":"00:59.765 ","End":"01:04.665","Text":"Because it\u0027s in the range of 2 to 4."},{"Start":"01:04.665 ","End":"01:07.890","Text":"Smallest is 2, so can\u0027t be 0."},{"Start":"01:07.890 ","End":"01:10.500","Text":"If 0 isn\u0027t an eigenvalue,"},{"Start":"01:10.500 ","End":"01:13.175","Text":"then A is invertible."},{"Start":"01:13.175 ","End":"01:17.240","Text":"That\u0027s actually an if and only if condition for invertibility."},{"Start":"01:17.240 ","End":"01:19.015","Text":"If it\u0027s invertible,"},{"Start":"01:19.015 ","End":"01:22.480","Text":"that means it has the full rank of 4,"},{"Start":"01:22.480 ","End":"01:26.660","Text":"whatever the order of the square matrix is."},{"Start":"01:26.660 ","End":"01:30.970","Text":"The rank is 4 and that proves a."},{"Start":"01:30.970 ","End":"01:34.635","Text":"B claims that A is diagonalizable,"},{"Start":"01:34.635 ","End":"01:37.010","Text":"and as a matter of fact,"},{"Start":"01:37.010 ","End":"01:40.310","Text":"if we knew that all the eigenvalues were different,"},{"Start":"01:40.310 ","End":"01:42.170","Text":"we were not counting multiplicity,"},{"Start":"01:42.170 ","End":"01:43.490","Text":"then it would be true,"},{"Start":"01:43.490 ","End":"01:47.630","Text":"because for different eigenvalues or n different ones in general,"},{"Start":"01:47.630 ","End":"01:49.910","Text":"would mean that it is diagonalizable"},{"Start":"01:49.910 ","End":"01:54.365","Text":"but in this case we allow repeats and the answer is false."},{"Start":"01:54.365 ","End":"01:58.700","Text":"Here is a counterexample that we have 2,"},{"Start":"01:58.700 
","End":"02:03.830","Text":"2, and 4, 4 here and a 1 here."},{"Start":"02:03.830 ","End":"02:06.560","Text":"Now, why is it not diagonalizable?"},{"Start":"02:06.560 ","End":"02:09.380","Text":"If only we didn\u0027t have this 1 here it would be diagonal,"},{"Start":"02:09.380 ","End":"02:11.225","Text":"but this throws it off."},{"Start":"02:11.225 ","End":"02:14.330","Text":"If you compute the characteristic polynomial,"},{"Start":"02:14.330 ","End":"02:16.760","Text":"it comes out to be x minus 2 squared,"},{"Start":"02:16.760 ","End":"02:19.145","Text":"x minus 4 squared."},{"Start":"02:19.145 ","End":"02:23.750","Text":"The eigenvalues are 2 and 4."},{"Start":"02:23.750 ","End":"02:27.080","Text":"Each has algebraic multiplicity 2."},{"Start":"02:27.080 ","End":"02:29.810","Text":"A 2 from here and the 2 from here."},{"Start":"02:29.810 ","End":"02:36.155","Text":"But the claim is the geometric multiplicity of the eigenvalue 2 is only 1,"},{"Start":"02:36.155 ","End":"02:39.845","Text":"which is less than the algebraic multiplicity."},{"Start":"02:39.845 ","End":"02:43.405","Text":"If we just show this, then we\u0027re done."},{"Start":"02:43.405 ","End":"02:47.610","Text":"Let\u0027s look at 2I minus A."},{"Start":"02:47.610 ","End":"02:54.215","Text":"In general, we look at Lambda I minus A and look for its kernel to get the eigenspace."},{"Start":"02:54.215 ","End":"02:56.210","Text":"Let\u0027s see what the eigenspace for 2 is."},{"Start":"02:56.210 ","End":"03:01.610","Text":"Well, 2I minus A comes out this and it has rank 3."},{"Start":"03:01.610 ","End":"03:05.390","Text":"For example, the columns are linearly independent,"},{"Start":"03:05.390 ","End":"03:08.045","Text":"these 3 or these 3 rows."},{"Start":"03:08.045 ","End":"03:10.010","Text":"If it has a rank of 3,"},{"Start":"03:10.010 ","End":"03:11.720","Text":"the nullity is n,"},{"Start":"03:11.720 ","End":"03:14.000","Text":"which is 4 minus 3,"},{"Start":"03:14.000 ","End":"03:15.050","Text":"which is 1,"},{"Start":"03:15.050 ","End":"03:17.000","Text":"by the rank-nullity theorem."},{"Start":"03:17.000 ","End":"03:22.580","Text":"The kernel, which is the 2 eigenspace has dimension 1."},{"Start":"03:22.580 ","End":"03:27.190","Text":"This dimension is exactly the geometric multiplicity."},{"Start":"03:27.190 ","End":"03:30.955","Text":"Whenever we have even 1 eigenvalue for which"},{"Start":"03:30.955 ","End":"03:36.280","Text":"the geometric multiplicity is less than the algebraic multiplicity,"},{"Start":"03:36.280 ","End":"03:40.150","Text":"it means that the matrix is not diagonalizable,"},{"Start":"03:40.150 ","End":"03:43.280","Text":"so that proves part b."},{"Start":"03:43.280 ","End":"03:51.105","Text":"Now we\u0027re on to c which claims that the trace of A must be bigger than 10."},{"Start":"03:51.105 ","End":"03:53.670","Text":"This turns out to be false,"},{"Start":"03:53.670 ","End":"03:55.860","Text":"and we\u0027re going to give a counterexample."},{"Start":"03:55.860 ","End":"04:00.775","Text":"The counterexample is to take the following,"},{"Start":"04:00.775 ","End":"04:02.260","Text":"which is 2, 2,"},{"Start":"04:02.260 ","End":"04:04.600","Text":"2, 4 on the diagonal."},{"Start":"04:04.600 ","End":"04:09.740","Text":"It meets the conditions that the minimum is 2 and the maximum is 4."},{"Start":"04:09.740 ","End":"04:13.280","Text":"It has got 4 eigenvalues, 2, 2, 2, and 4."},{"Start":"04:13.280 ","End":"04:15.925","Text":"Remember we\u0027re counting multiplicity."},{"Start":"04:15.925 
","End":"04:19.720","Text":"The trace of A is 10,"},{"Start":"04:19.720 ","End":"04:23.135","Text":"which is not bigger than 10, it\u0027s equal to."},{"Start":"04:23.135 ","End":"04:24.665","Text":"That\u0027s part c,"},{"Start":"04:24.665 ","End":"04:32.600","Text":"next on to part d. In part d the claim is that the determinant of A is less than 127,"},{"Start":"04:32.600 ","End":"04:35.385","Text":"and this turns out to be false."},{"Start":"04:35.385 ","End":"04:37.835","Text":"Here is a counterexample."},{"Start":"04:37.835 ","End":"04:41.450","Text":"Once again, you see that it has 4 eigenvalues."},{"Start":"04:41.450 ","End":"04:42.470","Text":"The smallest is 2,"},{"Start":"04:42.470 ","End":"04:44.045","Text":"the largest is 4,"},{"Start":"04:44.045 ","End":"04:48.905","Text":"but the determinant is 2 times 4 times 4 times 4,"},{"Start":"04:48.905 ","End":"04:51.300","Text":"which is 128,"},{"Start":"04:51.300 ","End":"04:53.895","Text":"which is not less than 127,"},{"Start":"04:53.895 ","End":"04:57.600","Text":"so that\u0027s part d. Part e,"},{"Start":"04:57.600 ","End":"05:05.060","Text":"asks if there exists an eigenvector v of A such that A squared v is 2v."},{"Start":"05:05.060 ","End":"05:07.820","Text":"This turns out to be false,"},{"Start":"05:07.820 ","End":"05:10.664","Text":"and we\u0027ll prove that it\u0027s false."},{"Start":"05:10.664 ","End":"05:13.635","Text":"Suppose that v is an eigenvector,"},{"Start":"05:13.635 ","End":"05:17.195","Text":"meaning Av is Lambda v for some Lambda,"},{"Start":"05:17.195 ","End":"05:20.350","Text":"which by the conditions has to be between 2 and 4,"},{"Start":"05:20.350 ","End":"05:25.250","Text":"then A squared v is Lambda squared v. We\u0027ve seen this many times,"},{"Start":"05:25.250 ","End":"05:28.995","Text":"A to the nv is Lambda to the nv when you have an eigenvector."},{"Start":"05:28.995 ","End":"05:36.870","Text":"Lambda squared is not equal to 2 because this condition,"},{"Start":"05:36.870 ","End":"05:38.190","Text":"if you square it,"},{"Start":"05:38.190 ","End":"05:43.820","Text":"tells us that lambda squared is between 2 squared and 4 squared."},{"Start":"05:43.820 ","End":"05:46.780","Text":"In other words, between 4 and 16."},{"Start":"05:46.780 ","End":"05:52.515","Text":"Lambda squared can\u0027t be 2 because 2 is not between 4 and 16."},{"Start":"05:52.515 ","End":"06:00.420","Text":"This proves what we wanted to prove and we are done with part e and with this exercise."}],"ID":25773},{"Watched":false,"Name":"Exercise 18","Duration":"5m 16s","ChapterTopicVideoID":24861,"CourseChapterTopicPlaylistID":118359,"HasSubtitles":true,"ThumbnailPath":null,"UploadDate":null,"DurationForVideoObject":null,"Description":null,"MetaTitle":null,"MetaDescription":null,"Canonical":null,"VideoComments":[],"Subtitles":[{"Start":"00:00.000 ","End":"00:04.440","Text":"In this exercise, A is a square matrix of order n,"},{"Start":"00:04.440 ","End":"00:07.055","Text":"which is a natural number bigger than 1."},{"Start":"00:07.055 ","End":"00:10.975","Text":"We have a series of true, false questions."},{"Start":"00:10.975 ","End":"00:13.980","Text":"In a, v is an eigenvector of A,"},{"Start":"00:13.980 ","End":"00:17.655","Text":"does that imply that v is also an eigenvector of A to the n,"},{"Start":"00:17.655 ","End":"00:19.875","Text":"and b is the other way around."},{"Start":"00:19.875 ","End":"00:22.440","Text":"c, if a is diagonalizable,"},{"Start":"00:22.440 ","End":"00:25.965","Text":"so is A to the n and d is the other 
way around."},{"Start":"00:25.965 ","End":"00:28.365","Text":"Let\u0027s start with a,"},{"Start":"00:28.365 ","End":"00:30.360","Text":"v is an eigenvector of A,"},{"Start":"00:30.360 ","End":"00:33.250","Text":"then v is also an eigenvector of A to the n."},{"Start":"00:33.250 ","End":"00:35.060","Text":"Turns out that\u0027s true."},{"Start":"00:35.060 ","End":"00:40.015","Text":"We\u0027ve seen many times that if v is an eigenvector of A,"},{"Start":"00:40.015 ","End":"00:42.025","Text":"with eigenvalue Lambda,"},{"Start":"00:42.025 ","End":"00:48.575","Text":"then it\u0027s also an eigenvector of A to the n with eigenvalue Lambda to the n."},{"Start":"00:48.575 ","End":"00:53.430","Text":"That\u0027s a. Now, b is the opposite,"},{"Start":"00:53.430 ","End":"00:55.130","Text":"if v is an eigenvector of A to the n,"},{"Start":"00:55.130 ","End":"00:56.960","Text":"is it also an eigenvector of A?"},{"Start":"00:56.960 ","End":"00:58.250","Text":"Turns outs that\u0027s false,"},{"Start":"00:58.250 ","End":"01:01.040","Text":"and all we need is 1 counterexample."},{"Start":"01:01.040 ","End":"01:04.385","Text":"Let A be this matrix,"},{"Start":"01:04.385 ","End":"01:07.390","Text":"then A squared is this."},{"Start":"01:07.390 ","End":"01:10.939","Text":"We\u0027ll take v to be the column vector 1, 0."},{"Start":"01:10.939 ","End":"01:15.320","Text":"Just by the way, if you know about rotation matrices in the plane,"},{"Start":"01:15.320 ","End":"01:20.530","Text":"this represents a 90 degrees counterclockwise rotation,"},{"Start":"01:20.530 ","End":"01:24.065","Text":"and this represents 180 degrees rotation,"},{"Start":"01:24.065 ","End":"01:26.750","Text":"which is just minus the identity matrix."},{"Start":"01:26.750 ","End":"01:33.920","Text":"Really every vector is an eigenvector of a squared with Lambda equals minus 1."},{"Start":"01:33.920 ","End":"01:38.240","Text":"Anyway, I digress v is an eigenvector of A square but not of A, let\u0027s check."},{"Start":"01:38.240 ","End":"01:44.120","Text":"A squared v is minus v for this particular vector,"},{"Start":"01:44.120 ","End":"01:45.485","Text":"actually for any vector,"},{"Start":"01:45.485 ","End":"01:48.860","Text":"but Av, if you apply it to 1,"},{"Start":"01:48.860 ","End":"01:57.545","Text":"0 you get 0 minus 1 and there\u0027s no Lambda that can possibly make this equal lambda 0."},{"Start":"01:57.545 ","End":"02:00.860","Text":"I mean, 0 is not equal to minus 1."},{"Start":"02:00.860 ","End":"02:02.990","Text":"That was b."},{"Start":"02:02.990 ","End":"02:08.410","Text":"Now c, if A is diagonalizable, so is A to the n."},{"Start":"02:08.410 ","End":"02:11.160","Text":"Turns out that\u0027s true so we have to prove it."},{"Start":"02:11.160 ","End":"02:13.145","Text":"If A is diagonalizable,"},{"Start":"02:13.145 ","End":"02:18.950","Text":"then it\u0027s similar to a diagonal matrix D. Similar means that we can find"},{"Start":"02:18.950 ","End":"02:25.100","Text":"invertible P such that P minus 1 AP is equal to D. Now,"},{"Start":"02:25.100 ","End":"02:27.995","Text":"if we raise this to the power of n,"},{"Start":"02:27.995 ","End":"02:29.525","Text":"and we\u0027ve seen this before,"},{"Start":"02:29.525 ","End":"02:32.840","Text":"we just have to take the A and raise it to"},{"Start":"02:32.840 ","End":"02:36.980","Text":"the power of n. 
May if you\u0027ve forgotten and you want to see the proof,"},{"Start":"02:36.980 ","End":"02:39.395","Text":"here\u0027s a quick proof by induction."},{"Start":"02:39.395 ","End":"02:41.350","Text":"Obviously true for n equals 1."},{"Start":"02:41.350 ","End":"02:43.570","Text":"We just need the induction step."},{"Start":"02:43.570 ","End":"02:46.240","Text":"We raise it to the power of n plus 1,"},{"Start":"02:46.240 ","End":"02:49.740","Text":"so we can break the n plus 1 up into n and 1."},{"Start":"02:49.740 ","End":"02:55.020","Text":"By the induction hypothesis this is P minus 1 A to the nP."},{"Start":"02:55.020 ","End":"02:57.135","Text":"The second part as is."},{"Start":"02:57.135 ","End":"02:59.340","Text":"The P cancels."},{"Start":"02:59.340 ","End":"03:02.505","Text":"A to the n combines with A, so we get this."},{"Start":"03:02.505 ","End":"03:04.515","Text":"Now that we have that,"},{"Start":"03:04.515 ","End":"03:09.770","Text":"suppose that P minus 1 AP is D raised to the power of n both sides,"},{"Start":"03:09.770 ","End":"03:13.250","Text":"then we have P minus 1 A to the n P is D to the n."},{"Start":"03:13.250 ","End":"03:15.050","Text":"Label D to the n,"},{"Start":"03:15.050 ","End":"03:16.940","Text":"call it D with a hat on it."},{"Start":"03:16.940 ","End":"03:19.110","Text":"A different D. Now,"},{"Start":"03:19.110 ","End":"03:23.410","Text":"D is also a diagonal because when you take a diagonal matrix and raise it to a power,"},{"Start":"03:23.410 ","End":"03:27.565","Text":"you just raise each of the elements on the diagonal."},{"Start":"03:27.565 ","End":"03:31.200","Text":"The fact that it\u0027s equal to P minus 1 A to the n,"},{"Start":"03:31.200 ","End":"03:34.940","Text":"P means that A to the n is similar to a diagonal matrix,"},{"Start":"03:34.940 ","End":"03:37.865","Text":"which means that A to the n is diagonalizable."},{"Start":"03:37.865 ","End":"03:43.155","Text":"That was c. Now d is the converse of that."},{"Start":"03:43.155 ","End":"03:45.560","Text":"If A and the n is diagonalizable,"},{"Start":"03:45.560 ","End":"03:47.240","Text":"then so is A."},{"Start":"03:47.240 ","End":"03:49.160","Text":"That happens to be false."},{"Start":"03:49.160 ","End":"03:53.030","Text":"As a counterexample, we can take the matrix A,"},{"Start":"03:53.030 ","End":"03:56.090","Text":"which is 0, 1, 0, 0."},{"Start":"03:56.090 ","End":"03:57.620","Text":"If you square it,"},{"Start":"03:57.620 ","End":"04:00.055","Text":"you get all 0s."},{"Start":"04:00.055 ","End":"04:03.525","Text":"This is a diagonal matrix trivially,"},{"Start":"04:03.525 ","End":"04:06.570","Text":"it has 0s off the diagonal."},{"Start":"04:06.570 ","End":"04:10.745","Text":"It\u0027s diagonal and hence diagonalizable."},{"Start":"04:10.745 ","End":"04:15.610","Text":"What about A? Is it diagonalizable? 
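The multiplicity comparison made next can also be verified numerically for the nilpotent counterexample A = [[0, 1], [0, 0]] given above; a minimal NumPy sketch:

```python
import numpy as np

A = np.array([[0.0, 1.0],
              [0.0, 0.0]])

print(A @ A)                        # the zero matrix: A^2 is (trivially) diagonal, hence diagonalizable
print(np.linalg.matrix_rank(A))     # 1, so the 0-eigenspace of A has dimension 2 - 1 = 1,
                                    # while the algebraic multiplicity of 0 is 2 (characteristic polynomial x^2)
```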
It\u0027s not."},{"Start":"04:15.610 ","End":"04:18.870","Text":"We just have to compare the 2 multiplicities,"},{"Start":"04:18.870 ","End":"04:20.915","Text":"the geometric and the algebraic."},{"Start":"04:20.915 ","End":"04:24.860","Text":"The algebraic we get by looking at the characteristic polynomial,"},{"Start":"04:24.860 ","End":"04:26.660","Text":"which is just x squared,"},{"Start":"04:26.660 ","End":"04:28.870","Text":"or if you like x minus 0 squared."},{"Start":"04:28.870 ","End":"04:33.620","Text":"It\u0027s either 0 is an eigenvalue with multiplicity 2."},{"Start":"04:33.620 ","End":"04:36.455","Text":"That\u0027s the algebraic multiplicity."},{"Start":"04:36.455 ","End":"04:39.005","Text":"What about the geometric multiplicity?"},{"Start":"04:39.005 ","End":"04:43.535","Text":"We want the eigenspace corresponding to the eigenvalue 0."},{"Start":"04:43.535 ","End":"04:48.920","Text":"This comes out to be the same as the kernel of this matrix."},{"Start":"04:48.920 ","End":"04:52.495","Text":"We can use the rank-nullity theorem here."},{"Start":"04:52.495 ","End":"04:55.755","Text":"The rank is 1 clearly,"},{"Start":"04:55.755 ","End":"04:58.790","Text":"so n minus the rank,"},{"Start":"04:58.790 ","End":"05:02.410","Text":"which is 2 minus 1 is 1."},{"Start":"05:02.410 ","End":"05:06.550","Text":"That\u0027s the geometric multiplicity."},{"Start":"05:06.740 ","End":"05:09.510","Text":"1 here, 2 here,"},{"Start":"05:09.510 ","End":"05:10.920","Text":"1 is not equal to 2,"},{"Start":"05:10.920 ","End":"05:13.505","Text":"so A is not diagonalizable."},{"Start":"05:13.505 ","End":"05:16.740","Text":"That concludes part d and we\u0027re done."}],"ID":25774}],"Thumbnail":null,"ID":118359},{"Name":"Cayley-Hamilton theorem and the Minimal Polynomial","TopicPlaylistFirstVideoID":0,"Duration":null,"Videos":[{"Watched":false,"Name":"Cayley-Hamilton Theorem","Duration":"3m 17s","ChapterTopicVideoID":25335,"CourseChapterTopicPlaylistID":118360,"HasSubtitles":true,"ThumbnailPath":null,"UploadDate":null,"DurationForVideoObject":null,"Description":null,"MetaTitle":null,"MetaDescription":null,"Canonical":null,"VideoComments":[],"Subtitles":[{"Start":"00:00.000 ","End":"00:04.275","Text":"In this clip, I\u0027ll present the Cayley-Hamilton theorem,"},{"Start":"00:04.275 ","End":"00:06.975","Text":"and in the next clip, I\u0027ll prove it."},{"Start":"00:06.975 ","End":"00:12.940","Text":"These are 2 mathematicians, Cayley and Hamilton."},{"Start":"00:13.850 ","End":"00:20.520","Text":"The theorem says that every square matrix, over a field,"},{"Start":"00:20.520 ","End":"00:24.015","Text":"satisfies its own characteristic equation."},{"Start":"00:24.015 ","End":"00:25.935","Text":"I can rephrase that,"},{"Start":"00:25.935 ","End":"00:31.395","Text":"that every square matrix zeros its own characteristic polynomial."},{"Start":"00:31.395 ","End":"00:34.380","Text":"We talked about to 0, a polynomial means that"},{"Start":"00:34.380 ","End":"00:36.480","Text":"if you plug in the matrix, its 0."},{"Start":"00:36.480 ","End":"00:38.805","Text":"It is equivalent."},{"Start":"00:38.805 ","End":"00:43.515","Text":"An example, let\u0027s take a 3 by 3 real matrix, A,"},{"Start":"00:43.515 ","End":"00:45.555","Text":"which is the following,"},{"Start":"00:45.555 ","End":"00:48.690","Text":"and a characteristic polynomial,"},{"Start":"00:48.690 ","End":"00:50.130","Text":"I computed it for you,"},{"Start":"00:50.130 ","End":"00:52.770","Text":"we don\u0027t want to spend time on doing 
that,"},{"Start":"00:52.770 ","End":"00:54.255","Text":"you should check it,"},{"Start":"00:54.255 ","End":"00:56.730","Text":"this is the polynomial we get,"},{"Start":"00:56.730 ","End":"01:01.400","Text":"by the Cayley-Hamilton, p of A is 0,"},{"Start":"01:01.400 ","End":"01:06.904","Text":"which means if you put a instead of x here,"},{"Start":"01:06.904 ","End":"01:11.675","Text":"and don\u0027t forget to put the identity matrix instead of 1,"},{"Start":"01:11.675 ","End":"01:20.575","Text":"then we get A cube minus 4A squared minus 28 plus 48I equals 0."},{"Start":"01:20.575 ","End":"01:22.670","Text":"To spell it out,"},{"Start":"01:22.670 ","End":"01:27.710","Text":"this matrix cubed minus 4 times this squared minus 20"},{"Start":"01:27.710 ","End":"01:34.100","Text":"times this plus 48 times the identity is 0,"},{"Start":"01:34.100 ","End":"01:36.715","Text":"meaning the 0 matrix."},{"Start":"01:36.715 ","End":"01:39.995","Text":"As before, I\u0027m not going to actually do the computation."},{"Start":"01:39.995 ","End":"01:42.355","Text":"I\u0027ll leave you to check this."},{"Start":"01:42.355 ","End":"01:47.975","Text":"Let\u0027s do an exercise now using this Cayley-Hamilton theorem."},{"Start":"01:47.975 ","End":"01:50.225","Text":"We\u0027re given the matrix."},{"Start":"01:50.225 ","End":"01:52.475","Text":"That\u0027s the same 1 as above."},{"Start":"01:52.475 ","End":"01:57.625","Text":"Use the Cayley-Hamilton theorem to prove that A is invertible,"},{"Start":"01:57.625 ","End":"01:59.980","Text":"that\u0027s Part 1, and Part 2,"},{"Start":"01:59.980 ","End":"02:03.650","Text":"to express the inverse in terms of A."},{"Start":"02:03.650 ","End":"02:08.450","Text":"First of all, we compute the characteristic polynomial."},{"Start":"02:08.450 ","End":"02:09.920","Text":"We\u0027ve done that already,"},{"Start":"02:09.920 ","End":"02:12.940","Text":"just copied it from here."},{"Start":"02:12.940 ","End":"02:17.160","Text":"Then substitute A in it,"},{"Start":"02:17.160 ","End":"02:24.300","Text":"So we have A cubed minus 4A squared minus 28 plus 48I equals 0."},{"Start":"02:24.300 ","End":"02:27.510","Text":"Then what we can do is bring"},{"Start":"02:27.510 ","End":"02:31.685","Text":"the 48I to the other side because I want to take A outside the bracket."},{"Start":"02:31.685 ","End":"02:33.830","Text":"You can see where I\u0027m heading."},{"Start":"02:33.830 ","End":"02:41.500","Text":"Divide by the minus 48 and divide it into the bracket part,"},{"Start":"02:41.500 ","End":"02:43.690","Text":"so minus 1 over 48 times this,"},{"Start":"02:43.690 ","End":"02:45.420","Text":"times A is I."},{"Start":"02:45.420 ","End":"02:47.855","Text":"Look, A times something is I."},{"Start":"02:47.855 ","End":"02:51.080","Text":"Whenever A times some other matrix is I,"},{"Start":"02:51.080 ","End":"02:53.165","Text":"doesn\u0027t matter on the left or the right,"},{"Start":"02:53.165 ","End":"02:56.315","Text":"then these 2 are inverses of each other."},{"Start":"02:56.315 ","End":"03:01.920","Text":"So we get that A inverse is just what\u0027s written here,"},{"Start":"03:01.920 ","End":"03:03.420","Text":"minus 1 over 48,"},{"Start":"03:03.420 ","End":"03:07.590","Text":"times A squared minus 4A minus 20I."},{"Start":"03:07.590 ","End":"03:09.950","Text":"That solves this exercise,"},{"Start":"03:09.950 ","End":"03:12.215","Text":"and that\u0027s the only example I wanted to give."},{"Start":"03:12.215 ","End":"03:16.095","Text":"Next clip is the proof of the Cayley-Hamilton 
theorem,"},{"Start":"03:16.095 ","End":"03:18.220","Text":"and we are done."}],"ID":26152},{"Watched":false,"Name":"Cayley-Hamilton Proof","Duration":"6m 43s","ChapterTopicVideoID":24873,"CourseChapterTopicPlaylistID":118360,"HasSubtitles":true,"ThumbnailPath":null,"UploadDate":null,"DurationForVideoObject":null,"Description":null,"MetaTitle":null,"MetaDescription":null,"Canonical":null,"VideoComments":[],"Subtitles":[{"Start":"00:00.000 ","End":"00:03.870","Text":"Now we come to the proof of the Cayley-Hamilton theorem."},{"Start":"00:03.870 ","End":"00:06.089","Text":"But before we get properly started,"},{"Start":"00:06.089 ","End":"00:08.100","Text":"I want to do some preparation."},{"Start":"00:08.100 ","End":"00:10.680","Text":"Let, A, it could be any matrix,"},{"Start":"00:10.680 ","End":"00:13.470","Text":"but here let\u0027s just take it as a 3 by 3 matrix."},{"Start":"00:13.470 ","End":"00:14.970","Text":"It happens to be diagonal,"},{"Start":"00:14.970 ","End":"00:16.485","Text":"but in general not."},{"Start":"00:16.485 ","End":"00:23.010","Text":"The characteristic matrix is x times the identity matrix minus A."},{"Start":"00:23.010 ","End":"00:25.825","Text":"In some books it\u0027s the other way around, doesn\u0027t really matter."},{"Start":"00:25.825 ","End":"00:28.490","Text":"But in our case, it will be x minus 1, x minus 2,"},{"Start":"00:28.490 ","End":"00:31.555","Text":"x minus 3, and 0s everywhere else."},{"Start":"00:31.555 ","End":"00:34.685","Text":"Now, the adjugate matrix,"},{"Start":"00:34.685 ","End":"00:37.460","Text":"sometimes called the classical adjoint,"},{"Start":"00:37.460 ","End":"00:39.470","Text":"there\u0027s a new thing called adjoint now."},{"Start":"00:39.470 ","End":"00:41.495","Text":"It\u0027s like Coke and Classic Coke."},{"Start":"00:41.495 ","End":"00:48.455","Text":"Anyway, classical adjoint is what we get when instead of each element, sorry, in here,"},{"Start":"00:48.455 ","End":"00:55.220","Text":"we delete the column and row and take the determinant of what\u0027s left,"},{"Start":"00:55.220 ","End":"00:59.860","Text":"and we also multiply it by plus or minus in a checkerboard fashion,"},{"Start":"00:59.860 ","End":"01:01.815","Text":"but here it\u0027s all pluses."},{"Start":"01:01.815 ","End":"01:05.735","Text":"At the end we put a transpose on it."},{"Start":"01:05.735 ","End":"01:10.120","Text":"What we get, just multiplying these out,"},{"Start":"01:10.120 ","End":"01:13.570","Text":"we get this quadratic polynomials here."},{"Start":"01:13.570 ","End":"01:18.305","Text":"Then we can split it up into pieces as follows."},{"Start":"01:18.305 ","End":"01:21.835","Text":"Next we can separate out the x squared,"},{"Start":"01:21.835 ","End":"01:25.155","Text":"the x, and the constant terms."},{"Start":"01:25.155 ","End":"01:30.480","Text":"X squared, from here,"},{"Start":"01:30.480 ","End":"01:32.190","Text":"from here, and from here,"},{"Start":"01:32.190 ","End":"01:36.010","Text":"we have the matrix which is 1, 1, and 1."},{"Start":"01:36.010 ","End":"01:37.930","Text":"It\u0027s this. 
Then for x,"},{"Start":"01:37.930 ","End":"01:39.460","Text":"we have minus 5 here,"},{"Start":"01:39.460 ","End":"01:42.680","Text":"minus 4 here, and minus 3 here."},{"Start":"01:42.680 ","End":"01:45.405","Text":"The constants, we have 6 here,"},{"Start":"01:45.405 ","End":"01:47.370","Text":"a 3 here, and a 2 here,"},{"Start":"01:47.370 ","End":"01:50.460","Text":"6,3,2 times a 1."},{"Start":"01:50.460 ","End":"01:55.135","Text":"Give these matrices names according to the power of x."},{"Start":"01:55.135 ","End":"01:57.700","Text":"This will be B_2, B_1, B_0."},{"Start":"01:57.700 ","End":"02:01.780","Text":"Notice that these matrices don\u0027t have any xs in them,"},{"Start":"02:01.780 ","End":"02:04.010","Text":"they\u0027re independent of x."},{"Start":"02:04.010 ","End":"02:06.780","Text":"That\u0027s true in general,"},{"Start":"02:06.780 ","End":"02:09.375","Text":"even with an n by n matrix."},{"Start":"02:09.375 ","End":"02:13.365","Text":"We\u0027ll get the adjoint of xI minus A will be,"},{"Start":"02:13.365 ","End":"02:14.860","Text":"this is in backwards order,"},{"Start":"02:14.860 ","End":"02:17.890","Text":"starting from B_0 plus B_1x, B_2x squared,"},{"Start":"02:17.890 ","End":"02:21.565","Text":"and so on up to B_n minus 1 x^n minus 1."},{"Start":"02:21.565 ","End":"02:25.590","Text":"These B\u0027s don\u0027t depend on x."},{"Start":"02:25.590 ","End":"02:27.630","Text":"That\u0027s the preliminaries,"},{"Start":"02:27.630 ","End":"02:30.950","Text":"now let\u0027s get started on the proof proper."},{"Start":"02:30.950 ","End":"02:33.554","Text":"The proof of the theorem,"},{"Start":"02:33.554 ","End":"02:37.280","Text":"and I\u0027ll remind you what the theorem is, I\u0027ll leave it up here."},{"Start":"02:37.280 ","End":"02:38.870","Text":"Let\u0027s start the proof."},{"Start":"02:38.870 ","End":"02:43.090","Text":"Now in general, for any square matrix M,"},{"Start":"02:43.090 ","End":"02:48.650","Text":"we have the property that M times the adjoint of M is the determinant of M times I,"},{"Start":"02:48.650 ","End":"02:54.920","Text":"where I is square matrix of the same size n by n. 
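The preliminaries above, namely the adjugate of xI minus A for the diagonal example, its coefficient matrices B_0, B_1, B_2, and the identity M times adj(M) equals det(M) times I used next, can be reproduced symbolically; a short SymPy sketch:

```python
import sympy as sp

x = sp.symbols('x')
A = sp.diag(1, 2, 3)                  # the diagonal example from the preliminaries
M = x * sp.eye(3) - A                 # the characteristic matrix xI - A
B = M.adjugate()                      # classical adjoint (adjugate)

print((M * B).applyfunc(sp.expand))   # diagonal matrix with (x-1)(x-2)(x-3) expanded on the diagonal
print(sp.expand(M.det()))             # the same cubic, confirming M * adj(M) = det(M) * I

# Coefficient matrices B_0, B_1, B_2 in B = B_0 + B_1*x + B_2*x^2; they contain no x.
for k in range(3):
    print(k, B.applyfunc(lambda entry: sp.expand(entry).coeff(x, k)))
```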
In our case,"},{"Start":"02:54.920 ","End":"02:58.940","Text":"let\u0027s take M as xI minus A and apply this here,"},{"Start":"02:58.940 ","End":"03:02.900","Text":"and we get xI minus A times the adjoint of xI minus A."},{"Start":"03:02.900 ","End":"03:05.015","Text":"This part we can call B,"},{"Start":"03:05.015 ","End":"03:11.105","Text":"is equal to the determinant of xI minus A times the identity of the right size."},{"Start":"03:11.105 ","End":"03:19.820","Text":"Now, the elements of this adjoint cofactors of this matrix,"},{"Start":"03:19.820 ","End":"03:24.440","Text":"which is cofactors\u0027 plus or minus the minus."},{"Start":"03:24.440 ","End":"03:29.120","Text":"It\u0027s a determinant of an n by 1 by n minus 1 matrix"},{"Start":"03:29.120 ","End":"03:34.130","Text":"which we get by deleting a row and a column."},{"Start":"03:34.130 ","End":"03:38.805","Text":"What\u0027s left, there\u0027s at most n minus 1 xs,"},{"Start":"03:38.805 ","End":"03:40.815","Text":"1 in each row at most."},{"Start":"03:40.815 ","End":"03:44.750","Text":"We have a polynomial in x of degree n minus 1,"},{"Start":"03:44.750 ","End":"03:47.600","Text":"just like in the example before the proof."},{"Start":"03:47.600 ","End":"03:50.915","Text":"We can write B as the sum,"},{"Start":"03:50.915 ","End":"03:52.805","Text":"just like we did before,"},{"Start":"03:52.805 ","End":"04:00.810","Text":"where the B_i or n by n matrices and we have powers of x from 1 to x^n minus 1."},{"Start":"04:00.970 ","End":"04:03.605","Text":"If we go back to here,"},{"Start":"04:03.605 ","End":"04:05.885","Text":"we get that xI minus A,"},{"Start":"04:05.885 ","End":"04:08.900","Text":"times the adjoint, which is B,"},{"Start":"04:08.900 ","End":"04:10.505","Text":"which is this,"},{"Start":"04:10.505 ","End":"04:12.220","Text":"is equal to,"},{"Start":"04:12.220 ","End":"04:14.840","Text":"and this is the characteristic polynomial,"},{"Start":"04:14.840 ","End":"04:16.490","Text":"the determinant of xI minus A,"},{"Start":"04:16.490 ","End":"04:19.130","Text":"so a_0 plus a_1x and so on."},{"Start":"04:19.130 ","End":"04:23.365","Text":"Characteristic polynomial times the identity matrix of the right size."},{"Start":"04:23.365 ","End":"04:26.615","Text":"The B_i\u0027s don\u0027t depend on x."},{"Start":"04:26.615 ","End":"04:33.290","Text":"Now let\u0027s compute this another way by expanding this minus this times this."},{"Start":"04:33.290 ","End":"04:38.400","Text":"On the 1 hand, we have xI times this,"},{"Start":"04:38.440 ","End":"04:41.120","Text":"which is just like here,"},{"Start":"04:41.120 ","End":"04:43.280","Text":"but x times,"},{"Start":"04:43.280 ","End":"04:45.160","Text":"the i doesn\u0027t make any difference."},{"Start":"04:45.160 ","End":"04:47.990","Text":"Just increase the powers of x."},{"Start":"04:47.990 ","End":"04:55.590","Text":"Then the minus A with this gives us minus A times B_0 minus AB_1 x, and so on."},{"Start":"04:55.590 ","End":"04:58.070","Text":"The left-hand side of this expands to this."},{"Start":"04:58.070 ","End":"05:03.200","Text":"Now let\u0027s take the right-hand side and we can just write"},{"Start":"05:03.200 ","End":"05:08.480","Text":"it by multiplying each term by I, like so."},{"Start":"05:08.480 ","End":"05:11.990","Text":"What we can do is because this is equal to this,"},{"Start":"05:11.990 ","End":"05:18.125","Text":"we can get n equations by just taking columns."},{"Start":"05:18.125 ","End":"05:20.735","Text":"Well, I\u0027ll write it out and you\u0027ll 
see."},{"Start":"05:20.735 ","End":"05:28.810","Text":"Like here, we have B_1 minus AB_2 is equal to a_2I."},{"Start":"05:28.810 ","End":"05:32.690","Text":"That\u0027s by comparing the x squares and by comparing each power of x,"},{"Start":"05:32.690 ","End":"05:38.310","Text":"we get these n equations."},{"Start":"05:39.080 ","End":"05:46.310","Text":"Next, multiply both sides of each equation by the appropriate power of A,"},{"Start":"05:46.310 ","End":"05:47.330","Text":"like this 1,"},{"Start":"05:47.330 ","End":"05:50.245","Text":"we\u0027ll multiply by A squared."},{"Start":"05:50.245 ","End":"05:53.190","Text":"What we\u0027ll get, like for this 1,"},{"Start":"05:53.190 ","End":"05:59.144","Text":"we\u0027ll get A squared B_1 minus A cubed B_2 equals a_2 A squared."},{"Start":"05:59.144 ","End":"06:01.365","Text":"We do this for each 1."},{"Start":"06:01.365 ","End":"06:06.900","Text":"Then notice that diagonals will get cancellations."},{"Start":"06:06.900 ","End":"06:08.370","Text":"Like this will cancel with this,"},{"Start":"06:08.370 ","End":"06:11.725","Text":"the A squared B_1 will cancel with the A squared B_1,"},{"Start":"06:11.725 ","End":"06:16.670","Text":"and all we\u0027ll be left with is 0 because there\u0027s nothing here and there\u0027s nothing here,"},{"Start":"06:16.670 ","End":"06:20.180","Text":"so there are just pairs that cancel each other out."},{"Start":"06:20.180 ","End":"06:23.800","Text":"This will equal the sum of the right-hand sides."},{"Start":"06:23.800 ","End":"06:29.600","Text":"What we get as 0 is a_0I plus a_1A plus so on."},{"Start":"06:29.600 ","End":"06:32.630","Text":"Now this is exactly what we needed to prove."},{"Start":"06:32.630 ","End":"06:35.930","Text":"This is exactly what happens when you substitute"},{"Start":"06:35.930 ","End":"06:40.670","Text":"matrix A in the characteristic polynomial and we get 0."},{"Start":"06:40.670 ","End":"06:43.800","Text":"That completes the proof."}],"ID":25786},{"Watched":false,"Name":"Minimal Polynomial","Duration":"8m 25s","ChapterTopicVideoID":24872,"CourseChapterTopicPlaylistID":118360,"HasSubtitles":true,"ThumbnailPath":null,"UploadDate":null,"DurationForVideoObject":null,"Description":null,"MetaTitle":null,"MetaDescription":null,"Canonical":null,"VideoComments":[],"Subtitles":[{"Start":"00:00.000 ","End":"00:05.805","Text":"In this clip, we\u0027ll learn about the minimal polynomial of a square matrix."},{"Start":"00:05.805 ","End":"00:10.020","Text":"Now we know from the Cayley-Hamilton theorem that"},{"Start":"00:10.020 ","End":"00:14.430","Text":"every square matrix is a 0 of a polynomial."},{"Start":"00:14.430 ","End":"00:16.440","Text":"We could take, for example,"},{"Start":"00:16.440 ","End":"00:20.715","Text":"the characteristic polynomial for that matrix."},{"Start":"00:20.715 ","End":"00:23.700","Text":"Let\u0027s now take an example."},{"Start":"00:23.700 ","End":"00:26.085","Text":"Let\u0027s take a square real matrix,"},{"Start":"00:26.085 ","End":"00:28.320","Text":"3 by 3 this 1."},{"Start":"00:28.320 ","End":"00:33.960","Text":"A with 0 the polynomial x cubed minus 5x squared plus 7x minus 3"},{"Start":"00:33.960 ","End":"00:39.075","Text":"which I did the computation comes out to be the characteristic polynomial for this."},{"Start":"00:39.075 ","End":"00:42.120","Text":"As a jargon, we don\u0027t always say A is a 0 of p,"},{"Start":"00:42.120 ","End":"00:45.030","Text":"usually the scalar is a 0 of a polynomial."},{"Start":"00:45.030 ","End":"00:50.625","Text":"We say 
sometimes that A zeros p or p is zeroed by A."},{"Start":"00:50.625 ","End":"00:54.120","Text":"Means if you plug in A and you replace the constant"},{"Start":"00:54.120 ","End":"00:57.765","Text":"by a constant times the identity matrix then you get 0."},{"Start":"00:57.765 ","End":"01:00.330","Text":"This p of x has degree 3."},{"Start":"01:00.330 ","End":"01:06.570","Text":"The question is, if we\u0027re looking for a polynomial which A zeros,"},{"Start":"01:06.570 ","End":"01:08.070","Text":"can we do better than 3?"},{"Start":"01:08.070 ","End":"01:09.645","Text":"Can we get a smaller degree?"},{"Start":"01:09.645 ","End":"01:11.685","Text":"Answer turns out to be yes."},{"Start":"01:11.685 ","End":"01:16.800","Text":"We can actually find a degree 2 polynomial if you plug this into q of x,"},{"Start":"01:16.800 ","End":"01:22.110","Text":"which is this, and A zeroes it to q of A is 0."},{"Start":"01:22.110 ","End":"01:28.490","Text":"In general, we might want to ask is 2 the lowest degree in this case?"},{"Start":"01:28.490 ","End":"01:30.940","Text":"The answer is yes."},{"Start":"01:30.940 ","End":"01:34.830","Text":"How did I get to this polynomial q?"},{"Start":"01:34.830 ","End":"01:37.755","Text":"If m is a square matrix,"},{"Start":"01:37.755 ","End":"01:39.930","Text":"is there always a polynomial with"},{"Start":"01:39.930 ","End":"01:43.020","Text":"a lower degree than the characteristic polynomial of m,"},{"Start":"01:43.020 ","End":"01:45.090","Text":"which is zeroed by m?"},{"Start":"01:45.090 ","End":"01:46.110","Text":"The answer to this is no,"},{"Start":"01:46.110 ","End":"01:48.900","Text":"sometimes there is or sometimes there isn\u0027t."},{"Start":"01:48.900 ","End":"01:51.705","Text":"Time for a definition."},{"Start":"01:51.705 ","End":"01:57.180","Text":"The minimal polynomial of a matrix A is a polynomial,"},{"Start":"01:57.180 ","End":"01:59.010","Text":"but we require that it be monic,"},{"Start":"01:59.010 ","End":"02:04.695","Text":"have a leading coefficient of 1 like it did here and here."},{"Start":"02:04.695 ","End":"02:10.010","Text":"We\u0027ll call it little m of x of lowest degree,"},{"Start":"02:10.010 ","End":"02:12.845","Text":"which is zeroed by A."},{"Start":"02:12.845 ","End":"02:17.060","Text":"There\u0027s a reason for this requirement of monic."},{"Start":"02:17.060 ","End":"02:22.840","Text":"In general, we know that there is a non-empty set of polynomials which zero A,"},{"Start":"02:22.840 ","End":"02:25.460","Text":"because we have the characteristic for example."},{"Start":"02:25.460 ","End":"02:28.520","Text":"There is 1 of lowest degree."},{"Start":"02:28.520 ","End":"02:33.005","Text":"If you insist on the monic as we do,"},{"Start":"02:33.005 ","End":"02:37.040","Text":"then it can be proven that there is only 1 minimal polynomial."},{"Start":"02:37.040 ","End":"02:40.505","Text":"It\u0027s unique, we say the minimal polynomial."},{"Start":"02:40.505 ","End":"02:42.080","Text":"We need the monic for that,"},{"Start":"02:42.080 ","End":"02:44.345","Text":"like I said, because for example,"},{"Start":"02:44.345 ","End":"02:47.435","Text":"if we have q of x,"},{"Start":"02:47.435 ","End":"02:50.210","Text":"in the example above, it\u0027s monic."},{"Start":"02:50.210 ","End":"02:54.175","Text":"But if we say multiply it by 2,"},{"Start":"02:54.175 ","End":"02:58.395","Text":"this polynomial also zeros A and also has degree 2,"},{"Start":"02:58.395 ","End":"03:00.075","Text":"but it\u0027s not monic."},{"Start":"03:00.075 
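The degree 2 polynomial q(x) is only shown on screen, but given the factorization of the characteristic polynomial and the divisibility facts discussed a little further on, the only monic degree 2 candidate is (x minus 1)(x minus 3). A small SymPy sketch of that reasoning (the cubic is the clip's; the explicit q is an inference):

```python
import sympy as sp

x = sp.symbols('x')
p = x**3 - 5*x**2 + 7*x - 3
print(sp.factor(p))                  # (x - 3)*(x - 1)**2

# The minimal polynomial is monic, divides p, and keeps every irreducible factor,
# so the only possible degree-2 choice is (x - 1)*(x - 3):
q = sp.expand((x - 1) * (x - 3))
print(q)                             # x**2 - 4*x + 3
```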
","End":"03:02.415","Text":"It\u0027s a normalization."},{"Start":"03:02.415 ","End":"03:05.460","Text":"Just always divide by the leading coefficient to"},{"Start":"03:05.460 ","End":"03:09.015","Text":"get a monic polynomial and then it will be unique."},{"Start":"03:09.015 ","End":"03:14.565","Text":"Now we come to the question of how to find the minimal polynomial for a square matrix."},{"Start":"03:14.565 ","End":"03:16.620","Text":"Let\u0027s take, for example,"},{"Start":"03:16.620 ","End":"03:20.445","Text":"a matrix which is 7 by 7 over the reals,"},{"Start":"03:20.445 ","End":"03:23.145","Text":"and you want to find its minimal polynomial."},{"Start":"03:23.145 ","End":"03:26.610","Text":"The first step is to compute the characteristic polynomial"},{"Start":"03:26.610 ","End":"03:30.450","Text":"and then factorize it into irreducible factors."},{"Start":"03:30.450 ","End":"03:33.840","Text":"Let\u0027s say we get p of x is x squared,"},{"Start":"03:33.840 ","End":"03:35.010","Text":"x minus 1 cubed,"},{"Start":"03:35.010 ","End":"03:36.615","Text":"x squared plus 4."},{"Start":"03:36.615 ","End":"03:41.775","Text":"Note that over the reals that will always have linear and quadratic factors,"},{"Start":"03:41.775 ","End":"03:45.015","Text":"it doesn\u0027t have any cubic parts."},{"Start":"03:45.015 ","End":"03:47.880","Text":"This 1 has degree 7, 2 from here,"},{"Start":"03:47.880 ","End":"03:49.545","Text":"3 from here, and the x squared,"},{"Start":"03:49.545 ","End":"03:52.560","Text":"there is another 2, so it\u0027s 7 altogether."},{"Start":"03:52.560 ","End":"03:54.750","Text":"The field was the complex,"},{"Start":"03:54.750 ","End":"03:57.240","Text":"not the reals, that would be different."},{"Start":"03:57.240 ","End":"04:02.535","Text":"The x squared plus 4 would factorize into x plus 2i x minus 2i."},{"Start":"04:02.535 ","End":"04:06.750","Text":"The next step is that the theorem that the characteristic and"},{"Start":"04:06.750 ","End":"04:10.860","Text":"the minimal polynomials have the same irreducible factors,"},{"Start":"04:10.860 ","End":"04:12.495","Text":"the same building blocks,"},{"Start":"04:12.495 ","End":"04:15.480","Text":"just maybe with different coefficients."},{"Start":"04:15.480 ","End":"04:23.715","Text":"I forgot to write that the minimal polynomial divides the characteristic polynomial."},{"Start":"04:23.715 ","End":"04:26.820","Text":"It\u0027s made up of the same building blocks,"},{"Start":"04:26.820 ","End":"04:30.360","Text":"but the powers here will"},{"Start":"04:30.360 ","End":"04:34.275","Text":"be less than or equal to the powers in the characteristic polynomial."},{"Start":"04:34.275 ","End":"04:39.500","Text":"If you take all the combinations like x has to be to the power of 1 or 2."},{"Start":"04:39.500 ","End":"04:42.130","Text":"X minus 1 needs to be to the power of 1,"},{"Start":"04:42.130 ","End":"04:45.690","Text":"2 or 3, x squared plus 4 has to be as is."},{"Start":"04:45.690 ","End":"04:52.530","Text":"If you take all the combinations and arrange them in order of degree,"},{"Start":"04:52.530 ","End":"04:55.545","Text":"there is 6 combinations."},{"Start":"04:55.545 ","End":"04:57.700","Text":"Once you have this done,"},{"Start":"04:57.700 ","End":"05:03.200","Text":"what you do is you successively substitute matrix A in m_1 of x,"},{"Start":"05:03.200 ","End":"05:08.810","Text":"in m_2 of x in each 1 of these in order until you get the 0 matrix."},{"Start":"05:08.810 ","End":"05:10.220","Text":"At that 
point,"},{"Start":"05:10.220 ","End":"05:14.515","Text":"you stop and you got the minimal polynomial."},{"Start":"05:14.515 ","End":"05:18.620","Text":"But the theorem that the minimal polynomial"},{"Start":"05:18.620 ","End":"05:22.640","Text":"is independent of the field over which it\u0027s calculated."},{"Start":"05:22.640 ","End":"05:26.870","Text":"It would be the same minimal polynomial over the reals of the complex."},{"Start":"05:26.870 ","End":"05:29.075","Text":"If the entries are real,"},{"Start":"05:29.075 ","End":"05:30.600","Text":"then it will be"},{"Start":"05:30.600 ","End":"05:37.650","Text":"the same minimal polynomial for any other larger field than the real say the complex."},{"Start":"05:37.650 ","End":"05:42.495","Text":"Now let\u0027s take an example and we\u0027ll follow the steps that we just covered."},{"Start":"05:42.495 ","End":"05:48.240","Text":"Let\u0027s take a 3 by 3 matrix as follows,"},{"Start":"05:48.240 ","End":"05:52.020","Text":"and see if we can find its minimal polynomial."},{"Start":"05:52.020 ","End":"05:54.810","Text":"Step 1 was to compute"},{"Start":"05:54.810 ","End":"06:00.375","Text":"the characteristic polynomial and express it as a product of irreducible factors."},{"Start":"06:00.375 ","End":"06:01.950","Text":"I\u0027ve done that for you."},{"Start":"06:01.950 ","End":"06:07.665","Text":"You could check it, you should check it and this is what we get."},{"Start":"06:07.665 ","End":"06:11.640","Text":"Step 2 is to take the possibilities for"},{"Start":"06:11.640 ","End":"06:16.950","Text":"the minimal polynomial in increasing order of degree."},{"Start":"06:16.950 ","End":"06:19.110","Text":"If we just take x minus 1,"},{"Start":"06:19.110 ","End":"06:23.730","Text":"x minus 3 here we get degree 2 and if we take x minus 1 squared x minus 3,"},{"Start":"06:23.730 ","End":"06:25.125","Text":"we get degree 3."},{"Start":"06:25.125 ","End":"06:29.070","Text":"We have to have each of the factors at least to the power of 1."},{"Start":"06:29.070 ","End":"06:34.245","Text":"There\u0027s really only 2 possibilities this 1 could be taken to the power of 1 or 2,"},{"Start":"06:34.245 ","End":"06:36.360","Text":"and this 1 has to be taken."},{"Start":"06:36.360 ","End":"06:39.900","Text":"We plug it into this 1 first see if it fits, and if not,"},{"Start":"06:39.900 ","End":"06:43.380","Text":"then we go to the next 1. 
Let\u0027s see."},{"Start":"06:43.380 ","End":"06:45.195","Text":"If we plug it into m_1,"},{"Start":"06:45.195 ","End":"06:47.100","Text":"we get A minus I,"},{"Start":"06:47.100 ","End":"06:49.830","Text":"A minus 3I from here."},{"Start":"06:49.830 ","End":"06:52.710","Text":"This is what we get for A minus I, A minus 3I,"},{"Start":"06:52.710 ","End":"06:55.680","Text":"we just take A and subtract 1 from the diagonal,"},{"Start":"06:55.680 ","End":"06:58.620","Text":"and then we subtract 3 from the diagonal."},{"Start":"06:58.620 ","End":"06:59.970","Text":"We get these 2."},{"Start":"06:59.970 ","End":"07:01.770","Text":"If you multiply these out,"},{"Start":"07:01.770 ","End":"07:04.170","Text":"we actually get the 0 matrix."},{"Start":"07:04.170 ","End":"07:06.570","Text":"For example, 1, 2,"},{"Start":"07:06.570 ","End":"07:10.080","Text":"minus 5 multiplied by minus 1, 3,"},{"Start":"07:10.080 ","End":"07:14.340","Text":"1 gives us minus 1 plus 6 minus 5,"},{"Start":"07:14.340 ","End":"07:15.570","Text":"which is 0,"},{"Start":"07:15.570 ","End":"07:17.490","Text":"and similarly for all the rest."},{"Start":"07:17.490 ","End":"07:18.780","Text":"Once we hit 0,"},{"Start":"07:18.780 ","End":"07:22.545","Text":"we can stop and say this is the minimal polynomial,"},{"Start":"07:22.545 ","End":"07:26.230","Text":"is x minus 1, x minus 3."},{"Start":"07:27.650 ","End":"07:30.780","Text":"I\u0027ll finish off the clip by just giving you"},{"Start":"07:30.780 ","End":"07:35.160","Text":"some well-known theorems about the minimal polynomial."},{"Start":"07:35.160 ","End":"07:37.155","Text":"A is a square matrix,"},{"Start":"07:37.155 ","End":"07:38.595","Text":"and then we\u0027ll have some theorems."},{"Start":"07:38.595 ","End":"07:41.010","Text":"One of them is that the minimal polynomial of"},{"Start":"07:41.010 ","End":"07:45.120","Text":"A divides any polynomial which is zeroed by A."},{"Start":"07:45.120 ","End":"07:50.550","Text":"For example, the minimal polynomial divides the characteristic polynomial."},{"Start":"07:50.550 ","End":"07:54.630","Text":"The characteristic and minimal polynomials have the same irreducible factors."},{"Start":"07:54.630 ","End":"07:57.150","Text":"We also mentioned that before."},{"Start":"07:57.150 ","End":"08:06.240","Text":"Lambda is an eigenvalue if and only if Lambda is a root of the minimal polynomial of A."},{"Start":"08:06.240 ","End":"08:08.800","Text":"That\u0027s an important theorem."},{"Start":"08:09.170 ","End":"08:14.385","Text":"As we mentioned, the minimal polynomial exists and it\u0027s unique."},{"Start":"08:14.385 ","End":"08:20.070","Text":"By the way, 3 is also true if instead of minimal we take characteristic polynomial."},{"Start":"08:20.070 ","End":"08:22.170","Text":"I think that\u0027s enough."},{"Start":"08:22.170 ","End":"08:25.210","Text":"We\u0027ll end the clip here."}],"ID":25785},{"Watched":false,"Name":"Exercise 1","Duration":"5m 28s","ChapterTopicVideoID":24874,"CourseChapterTopicPlaylistID":118360,"HasSubtitles":true,"ThumbnailPath":null,"UploadDate":null,"DurationForVideoObject":null,"Description":null,"MetaTitle":null,"MetaDescription":null,"Canonical":null,"VideoComments":[],"Subtitles":[{"Start":"00:00.000 ","End":"00:06.630","Text":"In this exercise, A is a square n by n matrix, which is idempotent."},{"Start":"00:06.630 ","End":"00:08.130","Text":"In case you\u0027ve forgotten,"},{"Start":"00:08.130 ","End":"00:12.150","Text":"that means that A squared equals A."},{"Start":"00:12.150 
","End":"00:18.195","Text":"We have 5 parts and we\u0027ll just read each part as we come to it and solve it."},{"Start":"00:18.195 ","End":"00:19.905","Text":"In part a,"},{"Start":"00:19.905 ","End":"00:25.455","Text":"we have to prove that each eigenvalue of A is either a 0 or a 1."},{"Start":"00:25.455 ","End":"00:29.550","Text":"Let Lambda be an eigenvalue and does an eigenvalue,"},{"Start":"00:29.550 ","End":"00:33.440","Text":"it has an eigenvector which is not 0."},{"Start":"00:33.440 ","End":"00:40.055","Text":"We\u0027ve seen several times before that if we have an eigenvalue for a matrix,"},{"Start":"00:40.055 ","End":"00:43.160","Text":"and not only is Av equals Lambda v,"},{"Start":"00:43.160 ","End":"00:48.155","Text":"but if we put a power of n here and here, it also works."},{"Start":"00:48.155 ","End":"00:51.435","Text":"Now in our case, A squared equals A."},{"Start":"00:51.435 ","End":"00:54.950","Text":"Let\u0027s apply both sides to a vector v."},{"Start":"00:54.950 ","End":"00:58.070","Text":"We have A squared v equals Av."},{"Start":"00:58.070 ","End":"01:03.080","Text":"Now using this, we have that Lambda squared v equals"},{"Start":"01:03.080 ","End":"01:10.120","Text":"Lambda v so that Lambda squared minus Lambda times v equals 0."},{"Start":"01:10.120 ","End":"01:13.510","Text":"V is a non-zero vector,"},{"Start":"01:13.510 ","End":"01:17.690","Text":"so the only scalar that can multiply it to give 0 is 0."},{"Start":"01:17.690 ","End":"01:20.425","Text":"So Lambda squared minus Lambda is 0."},{"Start":"01:20.425 ","End":"01:24.320","Text":"Factorizing, Lambda times Lambda minus 1 is 0."},{"Start":"01:24.320 ","End":"01:26.960","Text":"Lambda has to be 0 or 1."},{"Start":"01:26.960 ","End":"01:30.180","Text":"That\u0027s part a."},{"Start":"01:30.180 ","End":"01:37.830","Text":"Now onto part b, we have to list all the possibilities for the minimal polynomial of A."},{"Start":"01:37.940 ","End":"01:40.305","Text":"A squared is A,"},{"Start":"01:40.305 ","End":"01:42.645","Text":"so A squared minus A is 0."},{"Start":"01:42.645 ","End":"01:45.000","Text":"If we look at the polynomial p of x,"},{"Start":"01:45.000 ","End":"01:50.325","Text":"which equals x squared minus x and substitute A, we get a 0."},{"Start":"01:50.325 ","End":"01:52.995","Text":"If A is 0 is this polynomial,"},{"Start":"01:52.995 ","End":"01:55.960","Text":"which is xx minus 1,"},{"Start":"01:55.960 ","End":"02:04.020","Text":"then the minimal polynomial has to be a divisor of this polynomial."},{"Start":"02:04.430 ","End":"02:11.120","Text":"The only possibilities for a divisor of this are these 3 possibilities,"},{"Start":"02:11.120 ","End":"02:15.915","Text":"xx minus 1, or just x, or just x minus 1."},{"Start":"02:15.915 ","End":"02:17.685","Text":"That\u0027s part b."},{"Start":"02:17.685 ","End":"02:22.460","Text":"Now part c, we have to prove that the characteristic polynomial of A"},{"Start":"02:22.460 ","End":"02:25.760","Text":"can be factored into linear factors."},{"Start":"02:25.760 ","End":"02:30.380","Text":"Now, every irreducible factor of the characteristic polynomial"},{"Start":"02:30.380 ","End":"02:34.520","Text":"is also an irreducible factor of the minimal polynomial."},{"Start":"02:34.520 ","End":"02:38.150","Text":"But the minimal polynomial,"},{"Start":"02:38.150 ","End":"02:40.115","Text":"whichever 1 of these 3 it is,"},{"Start":"02:40.115 ","End":"02:42.020","Text":"only has linear factors."},{"Start":"02:42.020 ","End":"02:45.885","Text":"There\u0027s no x squared 
plus 1, for example."},{"Start":"02:45.885 ","End":"02:52.475","Text":"The characteristic polynomial also is made up of x and x minus 1,"},{"Start":"02:52.475 ","End":"02:53.750","Text":"this to the power of something,"},{"Start":"02:53.750 ","End":"02:55.280","Text":"this to the power of something,"},{"Start":"02:55.280 ","End":"02:57.260","Text":"anyway, has linear factors only."},{"Start":"02:57.260 ","End":"02:58.850","Text":"That\u0027s part c."},{"Start":"02:58.850 ","End":"03:03.455","Text":"Now part d, we have to prove that A is diagonalizable."},{"Start":"03:03.455 ","End":"03:06.200","Text":"That\u0027s pretty straightforward because"},{"Start":"03:06.200 ","End":"03:09.050","Text":"we know there\u0027s a theorem that a is diagonalizable"},{"Start":"03:09.050 ","End":"03:13.880","Text":"if and only if its minimal polynomial can be factored into linear factors."},{"Start":"03:13.880 ","End":"03:15.380","Text":"We just showed this,"},{"Start":"03:15.380 ","End":"03:18.680","Text":"it consists of factors of x or x minus 1."},{"Start":"03:18.680 ","End":"03:20.380","Text":"This is true in our case,"},{"Start":"03:20.380 ","End":"03:22.760","Text":"so A is diagonalizable."},{"Start":"03:22.760 ","End":"03:24.650","Text":"Now the last part e,"},{"Start":"03:24.650 ","End":"03:28.520","Text":"we have to prove that the trace of A is equal to the rank of A."},{"Start":"03:28.520 ","End":"03:31.675","Text":"Now, we just said that A is diagonalizable,"},{"Start":"03:31.675 ","End":"03:33.800","Text":"and what is diagonalizable mean?"},{"Start":"03:33.800 ","End":"03:36.995","Text":"It means it\u0027s similar to a diagonal matrix D,"},{"Start":"03:36.995 ","End":"03:43.940","Text":"similar in the sense that we have an invertible matrix P such that D is P minus 1AP."},{"Start":"03:43.940 ","End":"03:45.290","Text":"To remind you."},{"Start":"03:45.290 ","End":"03:50.165","Text":"Now, the diagonal D consists of eigenvalues,"},{"Start":"03:50.165 ","End":"03:53.435","Text":"and these are all 0 or 1,"},{"Start":"03:53.435 ","End":"04:00.790","Text":"and so the number of 1s on the diagonal is both the trace of D."}],"ID":25787},{"Watched":false,"Name":"Exercise 2","Duration":"2m 9s","ChapterTopicVideoID":24875,"CourseChapterTopicPlaylistID":118360,"HasSubtitles":true,"ThumbnailPath":null,"UploadDate":null,"DurationForVideoObject":null,"Description":null,"MetaTitle":null,"MetaDescription":null,"Canonical":null,"VideoComments":[],"Subtitles":[{"Start":"00:00.000 ","End":"00:05.655","Text":"In this exercise, A is a 3 by 3 matrix over the reals,"},{"Start":"00:05.655 ","End":"00:07.820","Text":"and we\u0027re given some fact about A."},{"Start":"00:07.820 ","End":"00:09.765","Text":"Its trace is 0,"},{"Start":"00:09.765 ","End":"00:12.195","Text":"its determinant is 0,"},{"Start":"00:12.195 ","End":"00:17.175","Text":"and lambda equals 1 is an eigenvalue of A."},{"Start":"00:17.175 ","End":"00:23.715","Text":"We have to show that A is diagonalizable and find all its eigenvalues."},{"Start":"00:23.715 ","End":"00:30.180","Text":"The characteristic polynomial has degree 3 and also a leading coefficient of 1."},{"Start":"00:30.180 ","End":"00:35.010","Text":"It\u0027s of the form x cubed plus bx squared plus cx plus d."},{"Start":"00:35.010 ","End":"00:37.400","Text":"Now, from our previous exercise,"},{"Start":"00:37.400 ","End":"00:42.590","Text":"we know that the second highest coefficient is minus the trace"},{"Start":"00:42.590 ","End":"00:49.775","Text":"and the free coefficient is minus 1 to the 
power of the degree times the determinant."},{"Start":"00:49.775 ","End":"00:54.860","Text":"That means that b is 0 because the trace of A is 0,"},{"Start":"00:54.860 ","End":"00:58.940","Text":"and d is also 0 because the determinant of A is 0."},{"Start":"00:58.940 ","End":"01:01.760","Text":"What we\u0027re left if we put here b and d is 0,"},{"Start":"01:01.760 ","End":"01:04.945","Text":"is that p of x is x cubed plus cx."},{"Start":"01:04.945 ","End":"01:07.040","Text":"Now, we haven\u0027t used this fact yet."},{"Start":"01:07.040 ","End":"01:08.960","Text":"Lambda equals 1 is an eigenvalue,"},{"Start":"01:08.960 ","End":"01:12.905","Text":"so it must be a root of the characteristic polynomial."},{"Start":"01:12.905 ","End":"01:18.410","Text":"P of 1 is 0, meaning 1 cubed plus c times 1 is 0."},{"Start":"01:18.410 ","End":"01:22.055","Text":"That means that c is minus 1."},{"Start":"01:22.055 ","End":"01:27.200","Text":"Plug that in and now we have that p of x is x cubed minus x."},{"Start":"01:27.200 ","End":"01:31.320","Text":"Factorizing this, we get x, x squared minus 1,"},{"Start":"01:31.320 ","End":"01:34.085","Text":"which is x, x minus 1, x plus 1."},{"Start":"01:34.085 ","End":"01:36.770","Text":"The minimal polynomial has to have"},{"Start":"01:36.770 ","End":"01:42.475","Text":"each of the irreducible factors of the characteristic polynomial,"},{"Start":"01:42.475 ","End":"01:43.740","Text":"so it has to have an x,"},{"Start":"01:43.740 ","End":"01:45.780","Text":"an x minus 1 and an x plus 1,"},{"Start":"01:45.780 ","End":"01:48.740","Text":"and it also has to be of degree less than or equal to this."},{"Start":"01:48.740 ","End":"01:53.225","Text":"The minimum polynomial is also x, x minus 1, x plus 1."},{"Start":"01:53.225 ","End":"01:57.050","Text":"Now we\u0027re using the theorem that A is diagonalizable"},{"Start":"01:57.050 ","End":"02:02.270","Text":"if and only if its minimal polynomial can be factored into linear factors,"},{"Start":"02:02.270 ","End":"02:10.150","Text":"which it is, and so A is diagonalizable and we\u0027re done."}],"ID":25788},{"Watched":false,"Name":"Exercise 3","Duration":"4m 55s","ChapterTopicVideoID":24870,"CourseChapterTopicPlaylistID":118360,"HasSubtitles":true,"ThumbnailPath":null,"UploadDate":null,"DurationForVideoObject":null,"Description":null,"MetaTitle":null,"MetaDescription":null,"Canonical":null,"VideoComments":[],"Subtitles":[{"Start":"00:00.000 ","End":"00:02.370","Text":"In this exercise, A is"},{"Start":"00:02.370 ","End":"00:09.900","Text":"a 3 by 3 real matrix and we\u0027re given a certain condition involving the rank."},{"Start":"00:09.900 ","End":"00:13.440","Text":"Then there are 4 parts."},{"Start":"00:13.440 ","End":"00:17.685","Text":"We\u0027ll read each part as we come to it and solve it."},{"Start":"00:17.685 ","End":"00:20.175","Text":"This is just what it says here."},{"Start":"00:20.175 ","End":"00:22.245","Text":"Now, part a,"},{"Start":"00:22.245 ","End":"00:29.520","Text":"we have to find the eigenvalues of the matrix A and the geometric multiplicity."},{"Start":"00:29.520 ","End":"00:34.475","Text":"Now, this thing is really just a bit of a trick to make things harder,"},{"Start":"00:34.475 ","End":"00:36.095","Text":"to make you work a bit more."},{"Start":"00:36.095 ","End":"00:38.449","Text":"This is an integer and this is an integer."},{"Start":"00:38.449 ","End":"00:40.289","Text":"If you have 2 integers,"},{"Start":"00:40.289 ","End":"00:43.755","Text":"m and n stuck between 0 and 
3,"},{"Start":"00:43.755 ","End":"00:46.710","Text":"then this one has to be 1 and this one has to be 2."},{"Start":"00:46.710 ","End":"00:51.930","Text":"The only one way to cram 2 more integers between 0 and 3."},{"Start":"00:51.930 ","End":"00:57.325","Text":"We might as well have said that this rank is 1 and this rank is 2."},{"Start":"00:57.325 ","End":"01:01.640","Text":"That was just a bit of trickery to make it harder."},{"Start":"01:01.640 ","End":"01:06.650","Text":"Now, if we apply the rank nullity theorem to each of these,"},{"Start":"01:06.650 ","End":"01:11.725","Text":"then the nullity or the dimension of the kernel for this one,"},{"Start":"01:11.725 ","End":"01:13.890","Text":"it\u0027s n minus 1."},{"Start":"01:13.890 ","End":"01:16.515","Text":"N is 3, 3 minus 1 is 2."},{"Start":"01:16.515 ","End":"01:20.270","Text":"Here, 3 minus 2 is 1 for this one."},{"Start":"01:20.270 ","End":"01:26.510","Text":"Now in general, the kernel of A minus Lambda I is just the Lambda eigenspace of"},{"Start":"01:26.510 ","End":"01:33.445","Text":"A and its dimension is the geometric multiplicity of Lambda as an eigenvalue."},{"Start":"01:33.445 ","End":"01:39.075","Text":"We have that 10 and 4 are eigenvalues of A."},{"Start":"01:39.075 ","End":"01:47.300","Text":"The 10 has a geometric multiplicity of 2 and the 4 has a geometric multiplicity of 1."},{"Start":"01:47.300 ","End":"01:50.240","Text":"Together 2 and 1 gives us 3,"},{"Start":"01:50.240 ","End":"01:53.855","Text":"which is our whole dimension of the space,"},{"Start":"01:53.855 ","End":"01:55.900","Text":"so there aren\u0027t anymore."},{"Start":"01:55.900 ","End":"01:59.865","Text":"Just 10 and 4 and these are the geometric multiplicities."},{"Start":"01:59.865 ","End":"02:01.430","Text":"In part b, we have to find"},{"Start":"02:01.430 ","End":"02:06.295","Text":"the algebraic multiplicities and the characteristic polynomial."},{"Start":"02:06.295 ","End":"02:09.110","Text":"Now, we know that the algebraic multiplicity is"},{"Start":"02:09.110 ","End":"02:12.395","Text":"bigger or equal to the geometric multiplicity."},{"Start":"02:12.395 ","End":"02:17.330","Text":"For 10 it\u0027s bigger or equal to 2 and for 4 its big or equal to 1,"},{"Start":"02:17.330 ","End":"02:21.670","Text":"but altogether it has to be at most 3."},{"Start":"02:21.670 ","End":"02:24.430","Text":"They can\u0027t be any larger."},{"Start":"02:24.430 ","End":"02:28.040","Text":"The algebraic multiplicities are the same as a"},{"Start":"02:28.040 ","End":"02:32.015","Text":"geometric and for 10 it\u0027s 2 and for 4 it\u0027s 1."},{"Start":"02:32.015 ","End":"02:35.660","Text":"We also have to find the characteristic polynomial."},{"Start":"02:35.660 ","End":"02:37.925","Text":"For the 2, 10,"},{"Start":"02:37.925 ","End":"02:42.590","Text":"we get x minus 10 squared and for the 4 and the 1,"},{"Start":"02:42.590 ","End":"02:45.670","Text":"we get x minus 4 to the power of 1."},{"Start":"02:45.670 ","End":"02:49.265","Text":"It\u0027s just by the definition of the algebraic multiplicity."},{"Start":"02:49.265 ","End":"02:51.904","Text":"This is our answer for b."},{"Start":"02:51.904 ","End":"02:57.710","Text":"Just like to make a remark because it\u0027s an error that sometimes students make."},{"Start":"02:57.710 ","End":"03:01.040","Text":"Since this is the characteristic polynomial,"},{"Start":"03:01.040 ","End":"03:03.620","Text":"we know by Cayley-Hamilton\u0027s theorem that"},{"Start":"03:03.620 ","End":"03:06.860","Text":"the matrix A satisfies 
its own characteristic polynomial,"},{"Start":"03:06.860 ","End":"03:09.645","Text":"so we get that this thing holds."},{"Start":"03:09.645 ","End":"03:15.935","Text":"But you can\u0027t conclude that A equals 4I or A equals 10I from this,"},{"Start":"03:15.935 ","End":"03:18.320","Text":"just like we do with numbers."},{"Start":"03:18.320 ","End":"03:19.730","Text":"If this was 0,"},{"Start":"03:19.730 ","End":"03:22.070","Text":"we could say that x is 10 or x is 4,"},{"Start":"03:22.070 ","End":"03:24.110","Text":"but we can\u0027t do this for the matrix."},{"Start":"03:24.110 ","End":"03:28.069","Text":"In fact, it\u0027s false because with matrices,"},{"Start":"03:28.069 ","End":"03:30.860","Text":"unlike with numbers, if a product is 0,"},{"Start":"03:30.860 ","End":"03:34.445","Text":"it doesn\u0027t mean that either 1 of them need be 0."},{"Start":"03:34.445 ","End":"03:38.450","Text":"We get what is called the 0 divisors."},{"Start":"03:38.450 ","End":"03:40.820","Text":"You can multiply 2 to get 0."},{"Start":"03:40.820 ","End":"03:42.650","Text":"That\u0027s just a remark."},{"Start":"03:42.650 ","End":"03:46.010","Text":"Let\u0027s get on to part c, which says,"},{"Start":"03:46.010 ","End":"03:50.945","Text":"to determine if A is invertible and the answer is yes."},{"Start":"03:50.945 ","End":"03:54.110","Text":"The reason it\u0027s invertible is that 0 isn\u0027t"},{"Start":"03:54.110 ","End":"03:57.110","Text":"an eigenvalue because we found all the eigenvalues,"},{"Start":"03:57.110 ","End":"04:00.130","Text":"only 4 and 10 are eigenvalues and 0 is not."},{"Start":"04:00.130 ","End":"04:02.040","Text":"If 0 isn\u0027t an eigenvalue,"},{"Start":"04:02.040 ","End":"04:03.680","Text":"then A is invertible."},{"Start":"04:03.680 ","End":"04:05.270","Text":"The reverse is also true."},{"Start":"04:05.270 ","End":"04:07.205","Text":"It\u0027s an if and only if."},{"Start":"04:07.205 ","End":"04:09.530","Text":"Finally, d,"},{"Start":"04:09.530 ","End":"04:12.455","Text":"determine if A is diagonalizable,"},{"Start":"04:12.455 ","End":"04:14.435","Text":"and the answer is yes."},{"Start":"04:14.435 ","End":"04:16.670","Text":"The reason is that the sum of"},{"Start":"04:16.670 ","End":"04:21.635","Text":"the geometric multiplicities of the eigenvalues is the order of A,"},{"Start":"04:21.635 ","End":"04:25.010","Text":"that\u0027s this 2 plus 1 equals 3."},{"Start":"04:25.010 ","End":"04:29.090","Text":"But another way of seeing this would be if"},{"Start":"04:29.090 ","End":"04:35.150","Text":"the algebraic multiplicity equals the geometric multiplicity for each eigenvalue,"},{"Start":"04:35.150 ","End":"04:37.655","Text":"then that\u0027s also a sufficient reason."},{"Start":"04:37.655 ","End":"04:41.330","Text":"In this case, we said that they have the same."},{"Start":"04:41.330 ","End":"04:45.540","Text":"10 has both algebraic and geometric multiplicity of"},{"Start":"04:45.540 ","End":"04:50.050","Text":"2 and 4 has geometric and algebraic multiplicity of 1."},{"Start":"04:50.050 ","End":"04:55.550","Text":"That\u0027s another way of seeing it. 
We are done."}],"ID":25783},{"Watched":false,"Name":"Exercise 4","Duration":"2m 21s","ChapterTopicVideoID":24871,"CourseChapterTopicPlaylistID":118360,"HasSubtitles":true,"ThumbnailPath":null,"UploadDate":null,"DurationForVideoObject":null,"Description":null,"MetaTitle":null,"MetaDescription":null,"Canonical":null,"VideoComments":[],"Subtitles":[{"Start":"00:00.000 ","End":"00:03.690","Text":"In this exercise, A is a square matrix"},{"Start":"00:03.690 ","End":"00:08.160","Text":"with minimal polynomial x minus 1 squared."},{"Start":"00:08.160 ","End":"00:09.840","Text":"We have another matrix B,"},{"Start":"00:09.840 ","End":"00:13.020","Text":"which is defined as A squared plus 4A plus 3I."},{"Start":"00:13.020 ","End":"00:17.250","Text":"We have to show that B is invertible."},{"Start":"00:17.250 ","End":"00:23.955","Text":"Note that the only root of this minimal polynomial is 1."},{"Start":"00:23.955 ","End":"00:30.650","Text":"This implies that 1 is the only eigenvalue of A and the reason is that"},{"Start":"00:30.650 ","End":"00:33.440","Text":"Lambda is an eigenvalue of A"},{"Start":"00:33.440 ","End":"00:37.460","Text":"if and only if Lambda is a root of the minimal polynomial."},{"Start":"00:37.460 ","End":"00:39.560","Text":"You could also put the word"},{"Start":"00:39.560 ","End":"00:42.935","Text":"characteristic instead of minimal here and that will work too."},{"Start":"00:42.935 ","End":"00:48.455","Text":"Now by the definition of minimal polynomial,"},{"Start":"00:48.455 ","End":"00:52.250","Text":"the matrix satisfies the minimal polynomial"},{"Start":"00:52.250 ","End":"00:54.865","Text":"and it\u0027s also the smallest as such."},{"Start":"00:54.865 ","End":"00:58.580","Text":"Anyway, put in A instead of x"},{"Start":"00:58.580 ","End":"01:00.530","Text":"and we have to replace the 1 by I,"},{"Start":"01:00.530 ","End":"01:06.810","Text":"and we get A minus I squared is 0 or A squared minus 2A plus I is 0."},{"Start":"01:06.810 ","End":"01:10.010","Text":"Now B, just copying from here,"},{"Start":"01:10.010 ","End":"01:11.960","Text":"we can rearrange it."},{"Start":"01:11.960 ","End":"01:18.390","Text":"We can take A squared minus 2A plus I first to get a 0 and see what\u0027s missing."},{"Start":"01:18.390 ","End":"01:23.235","Text":"We need another plus 6A and another plus 2I."},{"Start":"01:23.235 ","End":"01:26.515","Text":"Because this is 0, it\u0027s 6A plus 2I."},{"Start":"01:26.515 ","End":"01:31.625","Text":"Now a matrix is invertible if and only if it has a non 0 determinant."},{"Start":"01:31.625 ","End":"01:34.760","Text":"I\u0027d like to take the 6 outside of the determinant."},{"Start":"01:34.760 ","End":"01:36.950","Text":"But by the rules of the determinant,"},{"Start":"01:36.950 ","End":"01:41.660","Text":"we have to take 6 out as 6^n,"},{"Start":"01:41.660 ","End":"01:43.940","Text":"where n is the order of the matrix."},{"Start":"01:43.940 ","End":"01:50.700","Text":"We get that 6^n times this determinant of A plus 1/3I is 0, divide by 6^n."},{"Start":"01:50.700 ","End":"01:53.130","Text":"The determinant of this is 0."},{"Start":"01:53.130 ","End":"01:57.465","Text":"It is A minus minus 1/3I is 0."},{"Start":"01:57.465 ","End":"02:05.340","Text":"This just means that minus 1/3 is an eigenvalue of A,"},{"Start":"02:05.340 ","End":"02:08.480","Text":"and that\u0027s a contradiction because"},{"Start":"02:08.480 ","End":"02:13.325","Text":"we saw that the only eigenvalue of A is 1."},{"Start":"02:13.325 ","End":"02:16.025","Text":"Minus 1/3 is not 
equal to 1."},{"Start":"02:16.025 ","End":"02:19.550","Text":"This contradiction completes the proof."},{"Start":"02:19.550 ","End":"02:21.840","Text":"We are done."}],"ID":25784}],"Thumbnail":null,"ID":118360},{"Name":"Matrix Similarity","TopicPlaylistFirstVideoID":0,"Duration":null,"Videos":[{"Watched":false,"Name":"Introduction through exercise","Duration":"6m 51s","ChapterTopicVideoID":24867,"CourseChapterTopicPlaylistID":7321,"HasSubtitles":true,"ThumbnailPath":null,"UploadDate":null,"DurationForVideoObject":null,"Description":null,"MetaTitle":null,"MetaDescription":null,"Canonical":null,"VideoComments":[],"Subtitles":[{"Start":"00:00.000 ","End":"00:03.420","Text":"This exercise is a bit theoretical."},{"Start":"00:03.420 ","End":"00:06.165","Text":"I don\u0027t think you would get something like this on an exam."},{"Start":"00:06.165 ","End":"00:08.340","Text":"Anyway, let\u0027s go through it."},{"Start":"00:08.340 ","End":"00:13.725","Text":"The first part to define the concept of similarity of matrices."},{"Start":"00:13.725 ","End":"00:15.720","Text":"In part b,"},{"Start":"00:15.720 ","End":"00:19.110","Text":"we\u0027re using the similarity concept,"},{"Start":"00:19.110 ","End":"00:21.090","Text":"and we\u0027re given that A and B"},{"Start":"00:21.090 ","End":"00:24.300","Text":"are similar matrices and we have to prove that they have"},{"Start":"00:24.300 ","End":"00:25.605","Text":"the same determinant,"},{"Start":"00:25.605 ","End":"00:27.030","Text":"the same trace,"},{"Start":"00:27.030 ","End":"00:30.570","Text":"and the same characteristic polynomial."},{"Start":"00:30.570 ","End":"00:32.805","Text":"As for the definition,"},{"Start":"00:32.805 ","End":"00:36.570","Text":"A and B are similar matrices."},{"Start":"00:36.570 ","End":"00:40.790","Text":"I should say that they\u0027re both square matrices of the same size."},{"Start":"00:40.790 ","End":"00:44.375","Text":"If there exists an invertible matrix P,"},{"Start":"00:44.375 ","End":"00:49.900","Text":"such that this holds P inverse times A times P is B."},{"Start":"00:49.900 ","End":"00:53.190","Text":"There\u0027s an equivalent version,"},{"Start":"00:53.190 ","End":"00:57.760","Text":"we can also use this AP is equal to PB."},{"Start":"00:57.760 ","End":"01:00.440","Text":"Each of these is derivable from the other,"},{"Start":"01:00.440 ","End":"01:02.270","Text":"whichever is more convenient,"},{"Start":"01:02.270 ","End":"01:04.380","Text":"that\u0027s the one we use."},{"Start":"01:04.930 ","End":"01:10.420","Text":"Now on to b subpart 1,"},{"Start":"01:10.420 ","End":"01:16.160","Text":"we\u0027re given that A and B are similar and I\u0027ll use this definition,"},{"Start":"01:16.160 ","End":"01:19.250","Text":"so AP is equal to PB,"},{"Start":"01:19.250 ","End":"01:24.285","Text":"and our goal is to show that they have the same determinant."},{"Start":"01:24.285 ","End":"01:31.075","Text":"These are equal, so I can put determinant around each,"},{"Start":"01:31.075 ","End":"01:34.240","Text":"and there\u0027s a property of determinants that"},{"Start":"01:34.240 ","End":"01:38.110","Text":"the determinant of a product is the product of the determinants."},{"Start":"01:38.110 ","End":"01:42.220","Text":"Left-hand side is the determinant of A times determinant of P,"},{"Start":"01:42.220 ","End":"01:47.080","Text":"and the right-hand side is determinant of P times determinant of B."},{"Start":"01:47.080 ","End":"01:52.420","Text":"Now, P is invertible and an invertible matrix has"},{"Start":"01:52.420 
","End":"01:59.605","Text":"a non 0 determinant so we could actually cancel this from both sides,"},{"Start":"01:59.605 ","End":"02:01.080","Text":"these are numbers,"},{"Start":"02:01.080 ","End":"02:10.030","Text":"and that gives us the desired result that determinant of A equals determinant of B."},{"Start":"02:10.190 ","End":"02:16.290","Text":"Now on to the second part of b,"},{"Start":"02:16.290 ","End":"02:22.235","Text":"and I\u0027m going to use the other definition of similar matrices."},{"Start":"02:22.235 ","End":"02:24.800","Text":"There was this, but there\u0027s equivalent one this,"},{"Start":"02:24.800 ","End":"02:31.170","Text":"and remember our goal is to show that they have the same trace."},{"Start":"02:31.210 ","End":"02:36.040","Text":"These are equal, so these are the same trace."},{"Start":"02:36.040 ","End":"02:39.065","Text":"Now what I can do,"},{"Start":"02:39.065 ","End":"02:42.380","Text":"multiplication is what we call associative."},{"Start":"02:42.380 ","End":"02:46.905","Text":"I can group it together any way I want,"},{"Start":"02:46.905 ","End":"02:55.235","Text":"and I\u0027d like to consider this as this P inverse times the result of AP."},{"Start":"02:55.235 ","End":"03:01.930","Text":"This is like a product of 2 things and that\u0027s equal to the trace of B."},{"Start":"03:01.930 ","End":"03:04.095","Text":"Now there\u0027s a theorem,"},{"Start":"03:04.095 ","End":"03:06.135","Text":"when we learn trace,"},{"Start":"03:06.135 ","End":"03:08.265","Text":"is that the trace,"},{"Start":"03:08.265 ","End":"03:12.310","Text":"if we have a product of 2 matrices, xy,"},{"Start":"03:12.310 ","End":"03:18.335","Text":"it\u0027s the same as the trace of the product yx."},{"Start":"03:18.335 ","End":"03:24.155","Text":"If we have a product, we can change the order of the multiplication."},{"Start":"03:24.155 ","End":"03:26.900","Text":"I mean xy is not equal to yx in general,"},{"Start":"03:26.900 ","End":"03:29.280","Text":"but the trace is."},{"Start":"03:29.360 ","End":"03:33.725","Text":"This is like the x and this is like the y."},{"Start":"03:33.725 ","End":"03:38.670","Text":"Then we can switch the order and put the y in front of the x."},{"Start":"03:42.110 ","End":"03:46.820","Text":"Again, because of the associative law,"},{"Start":"03:46.820 ","End":"03:52.760","Text":"I can do it in the order of first multiplying P by P inverse and then A times that."},{"Start":"03:52.760 ","End":"03:54.110","Text":"But this times this"},{"Start":"03:54.110 ","End":"03:56.075","Text":"is the identity matrix."},{"Start":"03:56.075 ","End":"03:58.855","Text":"A times identity is just A,"},{"Start":"03:58.855 ","End":"04:01.235","Text":"and that gives us what we wanted,"},{"Start":"04:01.235 ","End":"04:04.955","Text":"that the trace of A equals the trace of B."},{"Start":"04:04.955 ","End":"04:06.890","Text":"Then part 3,"},{"Start":"04:06.890 ","End":"04:11.830","Text":"we wanted to show that A and B have the same characteristic polynomial."},{"Start":"04:11.830 ","End":"04:15.860","Text":"But the definition of the characteristic polynomial is the"},{"Start":"04:15.860 ","End":"04:19.760","Text":"determinant of xI minus A if it\u0027s A"},{"Start":"04:19.760 ","End":"04:27.115","Text":"and determinant of xI minus B for B. 
I\u0027ll start a new page."},{"Start":"04:27.115 ","End":"04:31.160","Text":"Remember this is what we\u0027re going to prove now and we know that A and B are"},{"Start":"04:31.160 ","End":"04:36.300","Text":"similar and we\u0027ll use this definition of similarity."},{"Start":"04:36.620 ","End":"04:39.480","Text":"Now here\u0027s the trick we\u0027re going to use."},{"Start":"04:39.480 ","End":"04:41.180","Text":"If these 2 are equal,"},{"Start":"04:41.180 ","End":"04:45.370","Text":"I can subtract each of them from the same thing,"},{"Start":"04:45.370 ","End":"04:54.739","Text":"xP minus this will equal xP minus the other because these are equal."},{"Start":"04:54.739 ","End":"05:04.740","Text":"Now I\u0027m going to use distributive law."},{"Start":"05:04.740 ","End":"05:08.660","Text":"There\u0027s a right distributive and the left distributive."},{"Start":"05:08.660 ","End":"05:11.350","Text":"Here I take P out on the right,"},{"Start":"05:11.350 ","End":"05:13.970","Text":"but what\u0027s left is not just x,"},{"Start":"05:13.970 ","End":"05:15.965","Text":"but x times the identity."},{"Start":"05:15.965 ","End":"05:18.740","Text":"You could think of it like there\u0027s an identity I"},{"Start":"05:18.740 ","End":"05:21.605","Text":"in here or you can just multiply it out,"},{"Start":"05:21.605 ","End":"05:25.070","Text":"x times I times P is just xP."},{"Start":"05:25.070 ","End":"05:31.760","Text":"Similarly here, I take P out on the left."},{"Start":"05:31.760 ","End":"05:33.290","Text":"x is a scalar,"},{"Start":"05:33.290 ","End":"05:35.330","Text":"so it doesn\u0027t matter where you put it."},{"Start":"05:35.330 ","End":"05:40.355","Text":"If you multiply PxI, it\u0027s the same as x times PI, which is xP."},{"Start":"05:40.355 ","End":"05:43.830","Text":"This works out also."},{"Start":"05:45.590 ","End":"05:51.385","Text":"Now I take the determinant of both sides of the equality."},{"Start":"05:51.385 ","End":"05:54.560","Text":"Again, I\u0027m going to use the property that the determinant"},{"Start":"05:54.560 ","End":"05:57.260","Text":"of a product is the product of the determinants,"},{"Start":"05:57.260 ","End":"06:00.035","Text":"and so I can break it up the determinant to this part"},{"Start":"06:00.035 ","End":"06:03.800","Text":"times determinant of this part and here, this and this."},{"Start":"06:03.800 ","End":"06:06.380","Text":"Now P is invertible,"},{"Start":"06:06.380 ","End":"06:08.509","Text":"so its determinant is not 0."},{"Start":"06:08.509 ","End":"06:14.980","Text":"I can divide both sides by this non 0 constant."},{"Start":"06:14.980 ","End":"06:20.465","Text":"That leaves us with this equality and that\u0027s what we were trying to show."},{"Start":"06:20.465 ","End":"06:26.275","Text":"This means that the characteristic polynomials for A and B is the same."},{"Start":"06:26.275 ","End":"06:32.925","Text":"I\u0027d like to just to point out something that follows from this,"},{"Start":"06:32.925 ","End":"06:37.610","Text":"and that is that similar matrices have the same eigenvalues because"},{"Start":"06:37.610 ","End":"06:43.655","Text":"the eigenvalues just depend on the characteristic polynomials, its roots."},{"Start":"06:43.655 ","End":"06:47.115","Text":"Same characteristic polynomial, so same eigenvalues."},{"Start":"06:47.115 ","End":"06:52.540","Text":"That\u0027s just something to note. 
We\u0027re done."}],"ID":25780},{"Watched":false,"Name":"Exercise 2","Duration":"4m 13s","ChapterTopicVideoID":9605,"CourseChapterTopicPlaylistID":7321,"HasSubtitles":true,"ThumbnailPath":null,"UploadDate":null,"DurationForVideoObject":null,"Description":null,"MetaTitle":null,"MetaDescription":null,"Canonical":null,"VideoComments":[],"Subtitles":[{"Start":"00:00.000 ","End":"00:03.329","Text":"This is a more theoretical exercise,"},{"Start":"00:03.329 ","End":"00:08.400","Text":"not the kind you would expect to get on an exam. I hope not."},{"Start":"00:08.400 ","End":"00:13.950","Text":"We\u0027re given that B is equal to P inverse AP."},{"Start":"00:13.950 ","End":"00:18.090","Text":"Notice that this means that B is similar to A."},{"Start":"00:18.090 ","End":"00:22.980","Text":"We have to show that if we want to raise B to the power of n,"},{"Start":"00:22.980 ","End":"00:26.640","Text":"all we have to do is take the A that\u0027s sandwiched here"},{"Start":"00:26.640 ","End":"00:29.240","Text":"and raise it to the power of n."},{"Start":"00:29.240 ","End":"00:32.194","Text":"n is a natural number,"},{"Start":"00:32.194 ","End":"00:34.025","Text":"1, 2, 3, 4, etc."},{"Start":"00:34.025 ","End":"00:38.150","Text":"I\u0027m going to give an informal and a more formal solution."},{"Start":"00:38.150 ","End":"00:40.130","Text":"First, the informal proofs,"},{"Start":"00:40.130 ","End":"00:45.150","Text":"I\u0027m just writing again that B is P inverse AP."},{"Start":"00:46.030 ","End":"00:51.300","Text":"B^n means I would just write P^ minus 1 AP n times,"},{"Start":"00:51.300 ","End":"01:02.734","Text":"1, 2, 3, however many I need, just to take n of these factors of P^ minus 1 AP."},{"Start":"01:02.734 ","End":"01:09.660","Text":"Now, notice that they come in a way"},{"Start":"01:09.660 ","End":"01:12.930","Text":"that there\u0027s P and P^ minus 1 following each other."},{"Start":"01:12.930 ","End":"01:14.480","Text":"By the associative law,"},{"Start":"01:14.480 ","End":"01:16.550","Text":"I can group the product anyway I want,"},{"Start":"01:16.550 ","End":"01:19.670","Text":"so this P cancels with this P inverse,"},{"Start":"01:19.670 ","End":"01:23.680","Text":"and this P with this P inverse, and so on."},{"Start":"01:23.680 ","End":"01:35.010","Text":"All these middle bits cancel and all we\u0027re left with is P^ minus 1, AAAA n times."},{"Start":"01:35.010 ","End":"01:40.890","Text":"AAAA n times is just A^n."},{"Start":"01:40.890 ","End":"01:43.935","Text":"That proves it. 
That was the informal proof."},{"Start":"01:43.935 ","End":"01:48.230","Text":"Next, I\u0027m going to give a formal proof"},{"Start":"01:48.230 ","End":"01:57.630","Text":"and we\u0027ll use the technique of proof by induction on n."},{"Start":"01:57.630 ","End":"01:59.430","Text":"For induction, there\u0027s 2 parts."},{"Start":"01:59.430 ","End":"02:02.890","Text":"We show that the thing is true for n equals 1"},{"Start":"02:02.890 ","End":"02:07.205","Text":"and then we show the induction step that if it\u0027s true for a particular n,"},{"Start":"02:07.205 ","End":"02:09.410","Text":"then it\u0027s also true for the following n."},{"Start":"02:09.410 ","End":"02:14.250","Text":"The first part, n equals 1."},{"Start":"02:14.900 ","End":"02:17.475","Text":"Let\u0027s see if I can go back."},{"Start":"02:17.475 ","End":"02:20.225","Text":"Here\u0027s the thing we have to prove."},{"Start":"02:20.225 ","End":"02:21.650","Text":"Put n equals 1."},{"Start":"02:21.650 ","End":"02:27.845","Text":"We want to check if B^1 equal to P inverse A^1P."},{"Start":"02:27.845 ","End":"02:31.490","Text":"Yeah, you just throw the 1 out and this is what was given,"},{"Start":"02:31.490 ","End":"02:33.530","Text":"B equals P inverse AP."},{"Start":"02:33.530 ","End":"02:36.545","Text":"Yeah, that\u0027s trivial."},{"Start":"02:36.545 ","End":"02:40.625","Text":"Now, next we have to do the induction step where we"},{"Start":"02:40.625 ","End":"02:46.040","Text":"assume that the claim is true for a particular n."},{"Start":"02:46.040 ","End":"02:50.675","Text":"In other words, we suppose that this is true."},{"Start":"02:50.675 ","End":"02:54.297","Text":"We have to show it\u0027s true for the following n."},{"Start":"02:54.297 ","End":"02:56.960","Text":"[inaudible] replace n by n plus 1,"},{"Start":"02:56.960 ","End":"02:59.870","Text":"we have to show that this is equal to this."},{"Start":"02:59.870 ","End":"03:02.570","Text":"We would note to have n plus 1 in place of the n."},{"Start":"03:02.570 ","End":"03:07.080","Text":"Given this, we have to show that this is true."},{"Start":"03:07.240 ","End":"03:10.040","Text":"Ready? Here\u0027s the proof."},{"Start":"03:10.040 ","End":"03:17.370","Text":"Certainly, I can write B^n plus 1 as B times B^n."},{"Start":"03:17.370 ","End":"03:21.020","Text":"Now by the induction hypothesis, this for n,"},{"Start":"03:21.020 ","End":"03:25.700","Text":"it\u0027s true, B^n equals P inverse A^nP."},{"Start":"03:25.700 ","End":"03:27.860","Text":"That\u0027s this."},{"Start":"03:27.860 ","End":"03:38.075","Text":"But recall that we were given that B is P^ minus 1 AP, this here."},{"Start":"03:38.075 ","End":"03:40.975","Text":"That was given."},{"Start":"03:40.975 ","End":"03:44.445","Text":"I can put this and copy this."},{"Start":"03:44.445 ","End":"03:50.205","Text":"Now look, the P cancels with the P minus 1,"},{"Start":"03:50.205 ","End":"03:53.450","Text":"and I guess we could have put an extra step in the middle"},{"Start":"03:53.450 ","End":"03:57.260","Text":"that we have AA^n in the middle,"},{"Start":"03:57.260 ","End":"04:03.380","Text":"P inverse and then P and then AA^n is A^n plus 1."},{"Start":"04:03.380 ","End":"04:06.835","Text":"This is what we had to show."},{"Start":"04:06.835 ","End":"04:08.870","Text":"We\u0027ve proved both parts,"},{"Start":"04:08.870 ","End":"04:14.130","Text":"the n equals 1 and the induction step. 
We\u0027re done."}],"ID":10121},{"Watched":false,"Name":"Exercise 3","Duration":"4m 12s","ChapterTopicVideoID":13380,"CourseChapterTopicPlaylistID":7321,"HasSubtitles":true,"ThumbnailPath":null,"UploadDate":null,"DurationForVideoObject":null,"Description":null,"MetaTitle":null,"MetaDescription":null,"Canonical":null,"VideoComments":[],"Subtitles":[{"Start":"00:00.000 ","End":"00:03.930","Text":"Yet another exercise in similar matrices."},{"Start":"00:03.930 ","End":"00:07.920","Text":"This time we have an n by n matrix A."},{"Start":"00:07.920 ","End":"00:12.240","Text":"We\u0027re given that A is similar to 4A,"},{"Start":"00:12.240 ","End":"00:19.100","Text":"we have to prove that A is not invertible and then in part b we\u0027re"},{"Start":"00:19.100 ","End":"00:27.640","Text":"given a matrix A like so and we have to show that A is similar to 4A."},{"Start":"00:27.640 ","End":"00:30.295","Text":"Let\u0027s start with part a."},{"Start":"00:30.295 ","End":"00:35.060","Text":"Now, similar matrices have equal determinants."},{"Start":"00:35.060 ","End":"00:36.710","Text":"So if this is similar to this,"},{"Start":"00:36.710 ","End":"00:40.265","Text":"then the determinant of this equals the determinant of this."},{"Start":"00:40.265 ","End":"00:44.585","Text":"By the rules of the determinants, we can take the 4 out,"},{"Start":"00:44.585 ","End":"00:46.730","Text":"but we have to first raise it to the power of"},{"Start":"00:46.730 ","End":"00:51.395","Text":"n. Then we can bring this to the other side,"},{"Start":"00:51.395 ","End":"00:54.890","Text":"take determinant of A outside the brackets and we"},{"Start":"00:54.890 ","End":"01:00.455","Text":"have the scalar 1 minus 4 to the n is equal to 0."},{"Start":"01:00.455 ","End":"01:04.050","Text":"Now, this cannot be 0."},{"Start":"01:04.660 ","End":"01:07.610","Text":"If n is 0, it would be 0,"},{"Start":"01:07.610 ","End":"01:11.930","Text":"but n is not 0 because n is going to be at least 1."},{"Start":"01:11.930 ","End":"01:13.670","Text":"So 1 minus 4n is not 0."},{"Start":"01:13.670 ","End":"01:18.280","Text":"So we can divide both sides by it and get that the determinant of A is"},{"Start":"01:18.280 ","End":"01:24.005","Text":"0 and having a determinant of 0 is equivalent to being non-invertible."},{"Start":"01:24.005 ","End":"01:26.140","Text":"So that\u0027s part a."},{"Start":"01:26.140 ","End":"01:28.455","Text":"Now in part b,"},{"Start":"01:28.455 ","End":"01:34.920","Text":"we were given a certain A and we had to show that it\u0027s similar to 4A."},{"Start":"01:35.000 ","End":"01:39.410","Text":"This here was A, and this here is 4A."},{"Start":"01:39.410 ","End":"01:44.090","Text":"We have to find P such that P inverse times this"},{"Start":"01:44.090 ","End":"01:47.965","Text":"times P equals this and that will show that they are similar."},{"Start":"01:47.965 ","End":"01:50.940","Text":"Let\u0027s bring the P over to the other side,"},{"Start":"01:50.940 ","End":"01:53.900","Text":"I multiply both sides on the left by P,"},{"Start":"01:53.900 ","End":"01:56.790","Text":"and we\u0027ve got this which is more convenient."},{"Start":"01:56.900 ","End":"02:01.700","Text":"Let P equal x, y, z, t."},{"Start":"02:01.700 ","End":"02:05.435","Text":"So we\u0027ve got this equation now."},{"Start":"02:05.435 ","End":"02:08.240","Text":"You know how to multiply matrices."},{"Start":"02:08.240 ","End":"02:12.350","Text":"For example, this first row with this first column gives us 0,"},{"Start":"02:12.350 
","End":"02:16.010","Text":"x plus 1z is z, and so on for the rest and here,"},{"Start":"02:16.010 ","End":"02:17.675","Text":"x times y, sorry,"},{"Start":"02:17.675 ","End":"02:22.235","Text":"the row x-y with the column 00 gives us 0 and etc."},{"Start":"02:22.235 ","End":"02:24.350","Text":"This is the equation we get."},{"Start":"02:24.350 ","End":"02:28.940","Text":"So each element is going to be equal,"},{"Start":"02:28.940 ","End":"02:30.260","Text":"z is going to be equal to 0."},{"Start":"02:30.260 ","End":"02:35.050","Text":"We\u0027ve got 4 equalities because 0 equals 0 doesn\u0027t give us anything."},{"Start":"02:35.050 ","End":"02:40.850","Text":"So we get 3 equations, this, this, and this."},{"Start":"02:42.040 ","End":"02:45.590","Text":"There\u0027s more than 1 solution for this."},{"Start":"02:45.590 ","End":"02:48.305","Text":"The last equation doesn\u0027t give us anything,"},{"Start":"02:48.305 ","End":"02:51.110","Text":"just gives us z equals 0."},{"Start":"02:51.110 ","End":"02:53.360","Text":"Here we know that t is 4x,"},{"Start":"02:53.360 ","End":"02:58.580","Text":"but we don\u0027t know what x is and y is completely free."},{"Start":"02:58.580 ","End":"03:02.465","Text":"So really, we could let x and y be anything."},{"Start":"03:02.465 ","End":"03:05.870","Text":"The thing is I want P to be invertible."},{"Start":"03:05.870 ","End":"03:08.840","Text":"If I let x and y, for example, both 0,"},{"Start":"03:08.840 ","End":"03:10.820","Text":"then everything will come out 0."},{"Start":"03:10.820 ","End":"03:12.770","Text":"What we can do, for example,"},{"Start":"03:12.770 ","End":"03:18.600","Text":"is take x equals 1 and y equals 0."},{"Start":"03:18.600 ","End":"03:21.530","Text":"Like I said, it\u0027s not the only choice,"},{"Start":"03:21.530 ","End":"03:25.700","Text":"but if I mean z is 0, and once we have x and y,"},{"Start":"03:25.700 ","End":"03:28.430","Text":"then we can compute t because t is 4x,"},{"Start":"03:28.430 ","End":"03:30.695","Text":"there could be other combinations that work."},{"Start":"03:30.695 ","End":"03:35.820","Text":"This 1 gives us, that P is equal to x, y, z, t."},{"Start":"03:35.820 ","End":"03:40.745","Text":"So it\u0027s 1,0,0,4, and certainly this is invertible."},{"Start":"03:40.745 ","End":"03:46.265","Text":"So to summarize, we found an invertible P such that P inverse AP is 4A,"},{"Start":"03:46.265 ","End":"03:49.310","Text":"and that proves that A and 4A are similar."},{"Start":"03:49.310 ","End":"03:54.500","Text":"Notice that this is completely consistent with part a."},{"Start":"03:54.500 ","End":"03:59.270","Text":"Part a says whenever this happens that A is not invertible,"},{"Start":"03:59.270 ","End":"04:08.000","Text":"and indeed A is not invertible, whereas A is 0,1,0,0,"},{"Start":"04:08.000 ","End":"04:12.600","Text":"is not invertible, so we\u0027re all all right, and we\u0027re done."}],"ID":14021},{"Watched":false,"Name":"Exercise 4","Duration":"4m 32s","ChapterTopicVideoID":13381,"CourseChapterTopicPlaylistID":7321,"HasSubtitles":true,"ThumbnailPath":null,"UploadDate":null,"DurationForVideoObject":null,"Description":null,"MetaTitle":null,"MetaDescription":null,"Canonical":null,"VideoComments":[],"Subtitles":[{"Start":"00:00.000 ","End":"00:04.530","Text":"In this exercise, we\u0027re given the real 3 by 3 matrix A as follows."},{"Start":"00:04.530 ","End":"00:08.220","Text":"Notice that it has 2 parameters, a and b in it."},{"Start":"00:08.220 ","End":"00:11.490","Text":"Constants parameters. 
The question is,"},{"Start":"00:11.490 ","End":"00:16.860","Text":"can we find a and b to make A similar to B,"},{"Start":"00:16.860 ","End":"00:19.300","Text":"which is given here?"},{"Start":"00:19.300 ","End":"00:26.070","Text":"To help us, we know that similar matrices have certain things in common."},{"Start":"00:26.070 ","End":"00:30.030","Text":"They have the same trace, they have the same determinant,"},{"Start":"00:30.030 ","End":"00:33.255","Text":"and they have the same eigenvalues."},{"Start":"00:33.255 ","End":"00:35.280","Text":"So let\u0027s see if this works out here,"},{"Start":"00:35.280 ","End":"00:39.780","Text":"then that\u0027s going to be the constraints that will give us equations."},{"Start":"00:39.780 ","End":"00:43.730","Text":"First of all, we\u0027ll talk about the same trace."},{"Start":"00:43.730 ","End":"00:47.915","Text":"The trace of b is 2 plus 7 plus 6."},{"Start":"00:47.915 ","End":"00:53.010","Text":"I think there\u0027s a typo here, it should be a 16 here."},{"Start":"00:53.090 ","End":"00:55.485","Text":"Yeah. Fix that."},{"Start":"00:55.485 ","End":"00:59.675","Text":"That\u0027s 25 a, we\u0027ve scrolled off. Yeah, there it is."},{"Start":"00:59.675 ","End":"01:05.530","Text":"A trace of a is a plus 3 plus negative 5,"},{"Start":"01:05.530 ","End":"01:10.425","Text":"and that gives us a minus 2."},{"Start":"01:10.425 ","End":"01:18.110","Text":"We compare a minus 2 is 25, that gives us that a is 27."},{"Start":"01:18.110 ","End":"01:21.350","Text":"Now, let\u0027s go for the determinant."},{"Start":"01:21.350 ","End":"01:26.540","Text":"Wait, before that we want to revise a because we had little a here,"},{"Start":"01:26.540 ","End":"01:29.360","Text":"but we know that\u0027s 27, so now we can revise a."},{"Start":"01:29.360 ","End":"01:32.795","Text":"Now we want to compare the determinants."},{"Start":"01:32.795 ","End":"01:37.385","Text":"For the determinant of a is this determinant and just to make it easier,"},{"Start":"01:37.385 ","End":"01:44.255","Text":"what we do is subtract the second column from the third column,"},{"Start":"01:44.255 ","End":"01:46.800","Text":"b minus b is 0."},{"Start":"01:47.110 ","End":"01:50.090","Text":"Maybe we should subtract it the other way around."},{"Start":"01:50.090 ","End":"01:54.760","Text":"Yeah, it\u0027s the third minus the second and we\u0027ll get a plus 3 here."},{"Start":"01:54.760 ","End":"01:58.680","Text":"Now, we expand along the first row."},{"Start":"01:58.680 ","End":"02:02.060","Text":"You\u0027ve got 27 times this 2 by 2 determinant,"},{"Start":"02:02.060 ","End":"02:07.690","Text":"then minus b times this 2 by 2 determinant."},{"Start":"02:07.690 ","End":"02:10.760","Text":"Anyway, this is what we get, 27 plus b."},{"Start":"02:10.760 ","End":"02:14.930","Text":"I\u0027ll spare you the details of the determinant of b"},{"Start":"02:14.930 ","End":"02:18.995","Text":"because just a waste of time to do that here."},{"Start":"02:18.995 ","End":"02:22.285","Text":"It comes out 84, I make it."},{"Start":"02:22.285 ","End":"02:28.160","Text":"We equate 27 plus b is 84 and bring the 27 over to the other side."},{"Start":"02:28.160 ","End":"02:30.800","Text":"That gives us that b is 57."},{"Start":"02:30.800 ","End":"02:37.430","Text":"If we substitute b is 57 in here, then we\u0027ve got that A is this."},{"Start":"02:37.430 ","End":"02:44.090","Text":"We can\u0027t stop here because there is a third condition about the same eigenvalues."},{"Start":"02:44.090 ","End":"02:48.395","Text":"We 
might reach a contradiction if it doesn\u0027t work out."},{"Start":"02:48.395 ","End":"02:50.780","Text":"If A exists, it\u0027s this,"},{"Start":"02:50.780 ","End":"02:54.205","Text":"but we still have to check the same eigenvalues."},{"Start":"02:54.205 ","End":"02:56.750","Text":"I\u0027ll do that on a clean page."},{"Start":"02:56.750 ","End":"02:59.030","Text":"Let\u0027s start with the matrix b."},{"Start":"02:59.030 ","End":"03:03.200","Text":"Its characteristic polynomial is the determinant of xI"},{"Start":"03:03.200 ","End":"03:07.890","Text":"minus b and this is a triangular matrix,"},{"Start":"03:07.890 ","End":"03:13.300","Text":"all these are zeros, so the determinant is just the product of these, which is this."},{"Start":"03:13.300 ","End":"03:18.635","Text":"The eigenvalues are going to be 2 and 7 and 6, those 3."},{"Start":"03:18.635 ","End":"03:21.970","Text":"Now let\u0027s see what happens with a."},{"Start":"03:21.970 ","End":"03:25.580","Text":"Only we don\u0027t actually have to find the eigenvalues for a,"},{"Start":"03:25.580 ","End":"03:29.170","Text":"we can try them 1 at a time."},{"Start":"03:29.170 ","End":"03:33.555","Text":"Let\u0027s say we check if 2 is an eigenvalue of a."},{"Start":"03:33.555 ","End":"03:41.815","Text":"The characteristic polynomial of a is this, the determinant of xI minus a."},{"Start":"03:41.815 ","End":"03:46.425","Text":"Like I said, we\u0027re not going to do the computation,"},{"Start":"03:46.425 ","End":"03:50.180","Text":"we\u0027re just going to check if these are eigenvalues."},{"Start":"03:50.180 ","End":"03:52.715","Text":"As soon as we get 1 that isn\u0027t, we can stop."},{"Start":"03:52.715 ","End":"03:54.155","Text":"Let\u0027s start with the 2."},{"Start":"03:54.155 ","End":"03:56.440","Text":"Plug in x equals 2,"},{"Start":"03:56.440 ","End":"04:00.080","Text":"and we get P of 2 when x equals 2 here,"},{"Start":"04:00.080 ","End":"04:02.419","Text":"2 minus 27 is minus 25,"},{"Start":"04:02.419 ","End":"04:05.795","Text":"and so on and so on, and if you compute this determinant,"},{"Start":"04:05.795 ","End":"04:09.020","Text":"it comes out to be not 0."},{"Start":"04:09.020 ","End":"04:11.370","Text":"If it\u0027s not 0,"},{"Start":"04:11.370 ","End":"04:16.309","Text":"then 2 is not a solution to the characteristic equation."},{"Start":"04:16.309 ","End":"04:18.865","Text":"So 2 is not an eigenvalue,"},{"Start":"04:18.865 ","End":"04:24.140","Text":"and the conclusion from all this is that the matrices are not similar."},{"Start":"04:24.140 ","End":"04:28.410","Text":"It satisfied conditions 1 and 2 but not 3."},{"Start":"04:29.340 ","End":"04:32.420","Text":"That\u0027s it. 
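A side note, not part of the clip: the three checks used in this exercise (same trace, same determinant, same eigenvalues) are only necessary conditions for similarity, and they are easy to try numerically. The sketch below uses made-up matrices as stand-ins for the A and B shown on screen.

```python
# A minimal sketch (not from the clip): necessary-condition checks for similarity.
# The matrices are made-up stand-ins, not the A and B from the exercise.
import numpy as np

def similarity_invariants_match(A, B, tol=1e-9):
    """Compare trace, determinant, and the sorted eigenvalues of A and B.
    A mismatch rules out similarity; matching all three does not prove it."""
    same_trace = abs(np.trace(A) - np.trace(B)) < tol
    same_det = abs(np.linalg.det(A) - np.linalg.det(B)) < tol
    eigs_A = np.sort_complex(np.linalg.eigvals(A))
    eigs_B = np.sort_complex(np.linalg.eigvals(B))
    same_eigs = np.allclose(eigs_A, eigs_B, atol=tol)
    return same_trace, same_det, same_eigs

B = np.array([[2.0, 5.0, 1.0],
              [0.0, 7.0, 3.0],
              [0.0, 0.0, 6.0]])   # triangular, so its eigenvalues sit on the diagonal
A = np.array([[1.0, 2.0, 0.0],
              [0.0, 2.0, 1.0],
              [1.0, 0.0, 6.0]])   # a candidate matrix to test against B
print(similarity_invariants_match(A, B))
```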
We\u0027re done."}],"ID":14022},{"Watched":false,"Name":"Exercise 5 - Parts a-c","Duration":"4m 21s","ChapterTopicVideoID":13382,"CourseChapterTopicPlaylistID":7321,"HasSubtitles":true,"ThumbnailPath":null,"UploadDate":null,"DurationForVideoObject":null,"Description":null,"MetaTitle":null,"MetaDescription":null,"Canonical":null,"VideoComments":[],"Subtitles":[{"Start":"00:00.000 ","End":"00:05.925","Text":"In this exercise, we\u0027re given 3 square matrices of order n,"},{"Start":"00:05.925 ","End":"00:09.540","Text":"meaning size n by n, and they\u0027re A, B, and C."},{"Start":"00:09.540 ","End":"00:12.390","Text":"We have to prove 3 things."},{"Start":"00:12.390 ","End":"00:18.705","Text":"First of all, that A is similar to A, to itself."},{"Start":"00:18.705 ","End":"00:24.810","Text":"Secondly, if A is similar to B, then B is similar to A."},{"Start":"00:24.810 ","End":"00:29.730","Text":"The third part, if A is similar to B and B is similar to C,"},{"Start":"00:29.730 ","End":"00:32.280","Text":"then A is similar to C."},{"Start":"00:32.280 ","End":"00:36.270","Text":"I\u0027d like to remind you what similar means"},{"Start":"00:36.270 ","End":"00:37.650","Text":"and I want to use A and B,"},{"Start":"00:37.650 ","End":"00:38.730","Text":"use M and N."},{"Start":"00:38.730 ","End":"00:45.920","Text":"M is similar to N if there is an invertible matrix P"},{"Start":"00:45.920 ","End":"00:49.340","Text":"such that P inverse MP equals N."},{"Start":"00:49.340 ","End":"00:53.570","Text":"Part a is very short,"},{"Start":"00:53.570 ","End":"00:56.330","Text":"very easy to prove, A is similar to itself."},{"Start":"00:56.330 ","End":"01:01.180","Text":"The trick is just to choose P as the identity matrix."},{"Start":"01:01.180 ","End":"01:05.405","Text":"If I plug that in the definition and let P equal I,"},{"Start":"01:05.405 ","End":"01:07.820","Text":"I inverse AI is just A,"},{"Start":"01:07.820 ","End":"01:10.580","Text":"so A is similar to A."},{"Start":"01:10.580 ","End":"01:15.140","Text":"That was easy. 
To prove the next,"},{"Start":"01:15.140 ","End":"01:18.710","Text":"I\u0027ll start off with the first proposition that A is similar to B,"},{"Start":"01:18.710 ","End":"01:24.480","Text":"and I\u0027ll keep going until we reach that B is similar to A."},{"Start":"01:24.710 ","End":"01:30.865","Text":"There is an invertible P such that P inverse AP is B."},{"Start":"01:30.865 ","End":"01:33.830","Text":"Now I\u0027d like to bring the Ps to the other side."},{"Start":"01:33.830 ","End":"01:39.560","Text":"What I\u0027m going to do is multiply on the left by P and on the right by P inverse,"},{"Start":"01:39.560 ","End":"01:43.430","Text":"on the left by P, on the right by P inverse."},{"Start":"01:43.430 ","End":"01:47.090","Text":"The P, P inverse cancels, in here also,"},{"Start":"01:47.090 ","End":"01:52.525","Text":"and so we\u0027re left with A equals PBP inverse."},{"Start":"01:52.525 ","End":"01:58.550","Text":"Now, I can write P as the inverse of P inverse,"},{"Start":"01:58.550 ","End":"02:01.009","Text":"because the inverse of the inverse is itself."},{"Start":"02:01.009 ","End":"02:04.935","Text":"Here, I just put this in brackets."},{"Start":"02:04.935 ","End":"02:08.840","Text":"Now what I\u0027m going to do to tidy this up is to"},{"Start":"02:08.840 ","End":"02:13.955","Text":"rename P inverse and call it P tilde, for example,"},{"Start":"02:13.955 ","End":"02:21.650","Text":"so that A is equal to this P tilde inverse,"},{"Start":"02:21.650 ","End":"02:24.745","Text":"and then B and then P tilde."},{"Start":"02:24.745 ","End":"02:28.070","Text":"This is exactly what we\u0027re looking for,"},{"Start":"02:28.070 ","End":"02:30.800","Text":"we don\u0027t have P, we have a different P, it\u0027s P tilde,"},{"Start":"02:30.800 ","End":"02:36.770","Text":"but this is exactly the definition of B similar to A."},{"Start":"02:36.770 ","End":"02:41.590","Text":"Perhaps what I should\u0027ve done is written this equality the other way around,"},{"Start":"02:41.590 ","End":"02:51.340","Text":"that P tilde inverse BP tilde equals A and then it looks more like our definition."},{"Start":"02:51.440 ","End":"02:54.345","Text":"That solves part b."},{"Start":"02:54.345 ","End":"02:56.925","Text":"Now there\u0027s a part c, remember?"},{"Start":"02:56.925 ","End":"03:00.320","Text":"In this part, we\u0027re given 2 things, that A is similar to B"},{"Start":"03:00.320 ","End":"03:01.940","Text":"and B is similar to C."},{"Start":"03:01.940 ","End":"03:05.000","Text":"Our task is to develop this until we"},{"Start":"03:05.000 ","End":"03:07.460","Text":"get to A similar to C."},{"Start":"03:07.460 ","End":"03:11.550","Text":"The first thing I do is"},{"Start":"03:11.550 ","End":"03:16.940","Text":"to write what it means for A to be similar to B"},{"Start":"03:16.940 ","End":"03:19.850","Text":"means there exists a P such that this is true."},{"Start":"03:19.850 ","End":"03:25.770","Text":"I can\u0027t use the same letter again with B and C, I\u0027m using letter Q."},{"Start":"03:25.820 ","End":"03:28.470","Text":"Now here, I didn\u0027t do anything,"},{"Start":"03:28.470 ","End":"03:32.125","Text":"I just added a bit of coloring and just showing you my strategy,"},{"Start":"03:32.125 ","End":"03:35.290","Text":"B is equal to this and I have a B here."},{"Start":"03:35.290 ","End":"03:39.070","Text":"What I\u0027m going to do is substitute this for B in"},{"Start":"03:39.070 ","End":"03:43.300","Text":"this and that substitution will produce this line."},{"Start":"03:43.300 
","End":"03:47.600","Text":"You see here is the P minus 1_AP, that was B."},{"Start":"03:47.930 ","End":"03:52.900","Text":"Now I\u0027m just grouping PQ in brackets and note that Q inverse,"},{"Start":"03:52.900 ","End":"03:56.080","Text":"P inverse is the same as PQ inverse,"},{"Start":"03:56.080 ","End":"03:57.820","Text":"you have to change the order."},{"Start":"03:57.820 ","End":"04:03.889","Text":"Now if I let this PQ be some new letter,"},{"Start":"04:03.889 ","End":"04:07.050","Text":"[inaudible] R, PQR."},{"Start":"04:07.050 ","End":"04:08.880","Text":"Let R equal PQ,"},{"Start":"04:08.880 ","End":"04:12.780","Text":"then we get that R inverse AR is C,"},{"Start":"04:12.780 ","End":"04:17.745","Text":"and that means that A is similar to C,"},{"Start":"04:17.745 ","End":"04:21.640","Text":"and then we\u0027re done."}],"ID":14023},{"Watched":false,"Name":"Exercise 5 - Parts d-e","Duration":"3m 2s","ChapterTopicVideoID":13383,"CourseChapterTopicPlaylistID":7321,"HasSubtitles":true,"ThumbnailPath":null,"UploadDate":null,"DurationForVideoObject":null,"Description":null,"MetaTitle":null,"MetaDescription":null,"Canonical":null,"VideoComments":[],"Subtitles":[{"Start":"00:00.000 ","End":"00:04.350","Text":"Here\u0027s another exercise dealing with similar matrices."},{"Start":"00:04.350 ","End":"00:09.090","Text":"In both parts A and B, we have A similar to B."},{"Start":"00:09.090 ","End":"00:15.030","Text":"First we have to prove that if they\u0027re both invertible,"},{"Start":"00:15.030 ","End":"00:18.795","Text":"then the inverses are also similar."},{"Start":"00:18.795 ","End":"00:23.760","Text":"In the other part, we have to show that raising them to the power of"},{"Start":"00:23.760 ","End":"00:30.430","Text":"some natural number k will leave them both also similar."},{"Start":"00:31.520 ","End":"00:34.260","Text":"If A is similar to B,"},{"Start":"00:34.260 ","End":"00:37.440","Text":"then we know that the definition of this is that there is"},{"Start":"00:37.440 ","End":"00:42.040","Text":"some invertible P such that this holds."},{"Start":"00:42.260 ","End":"00:47.035","Text":"Our goal is to reach something similar to this,"},{"Start":"00:47.035 ","End":"00:50.050","Text":"but with A inverse and B inverse."},{"Start":"00:50.050 ","End":"00:54.835","Text":"Let\u0027s start by taking the inverse of both sides of the equation."},{"Start":"00:54.835 ","End":"00:58.750","Text":"I just put the inverse sign here and here."},{"Start":"00:58.750 ","End":"01:02.920","Text":"Remember that when you take the inverse of a product,"},{"Start":"01:02.920 ","End":"01:04.770","Text":"it could be more than 2 things,"},{"Start":"01:04.770 ","End":"01:09.000","Text":"you just reverse the order besides taking the inverse."},{"Start":"01:09.000 ","End":"01:14.050","Text":"What this gives us on the left is we take the inverse of P,"},{"Start":"01:14.050 ","End":"01:17.410","Text":"then the inverse of A, and then the inverse of P inverse,"},{"Start":"01:17.410 ","End":"01:20.000","Text":"so it\u0027s P inverse inverse."},{"Start":"01:20.270 ","End":"01:25.364","Text":"Since the inverse of the inverse of P is just P itself,"},{"Start":"01:25.364 ","End":"01:31.200","Text":"we get to here and that\u0027s what we needed basically."},{"Start":"01:31.200 ","End":"01:36.000","Text":"This shows that A inverse is similar to B inverse and we even use"},{"Start":"01:36.000 ","End":"01:43.620","Text":"the same P. 
That\u0027s part A, let\u0027s get on to part B."},{"Start":"01:43.620 ","End":"01:45.620","Text":"Here, I\u0027m repeating the line"},{"Start":"01:45.620 ","End":"01:48.550","Text":"and what it means to say that A is similar to B."},{"Start":"01:48.550 ","End":"01:51.880","Text":"Whereas previously we took the inverse of both sides,"},{"Start":"01:51.880 ","End":"01:54.065","Text":"this time we\u0027re raising both sides to the power of k."},{"Start":"01:54.065 ","End":"02:01.705","Text":"At first I just write a k on each side."},{"Start":"02:01.705 ","End":"02:04.730","Text":"Now, to the power of k it just means I take"},{"Start":"02:04.730 ","End":"02:09.095","Text":"a lot of products of the same thing k times."},{"Start":"02:09.095 ","End":"02:13.010","Text":"Well, if k is small, it\u0027ll be less, but anyway."},{"Start":"02:13.010 ","End":"02:15.070","Text":"That\u0027s what we get."},{"Start":"02:15.070 ","End":"02:18.140","Text":"Then we\u0027re going to open the brackets and the Ps"},{"Start":"02:18.140 ","End":"02:22.670","Text":"are going to cancel with the inverse of Ps everywhere."},{"Start":"02:22.690 ","End":"02:27.980","Text":"I\u0027m just regrouping now so that this P goes with this P inverse,"},{"Start":"02:27.980 ","End":"02:31.270","Text":"this P with this P inverse, and so on."},{"Start":"02:31.270 ","End":"02:35.765","Text":"Each of these P, P inverse is the identity."},{"Start":"02:35.765 ","End":"02:38.355","Text":"This is what we get."},{"Start":"02:38.355 ","End":"02:43.880","Text":"Of course, there\u0027s going to be k of these As"},{"Start":"02:43.880 ","End":"02:47.200","Text":"is because there were k factors here."},{"Start":"02:47.200 ","End":"02:52.050","Text":"This just boils down to P inverse A to the kP and B to the k."},{"Start":"02:52.050 ","End":"02:56.360","Text":"That just means that A to the k is similar"},{"Start":"02:56.360 ","End":"02:58.550","Text":"to B to the k and the same P as"},{"Start":"02:58.550 ","End":"03:03.390","Text":"the original works for this also. That\u0027s it."}],"ID":14024},{"Watched":false,"Name":"Exercise 5 - Parts f-g","Duration":"4m 19s","ChapterTopicVideoID":13384,"CourseChapterTopicPlaylistID":7321,"HasSubtitles":true,"ThumbnailPath":null,"UploadDate":null,"DurationForVideoObject":null,"Description":null,"MetaTitle":null,"MetaDescription":null,"Canonical":null,"VideoComments":[],"Subtitles":[{"Start":"00:00.000 ","End":"00:05.775","Text":"In this exercise, we have 2 things to prove related to similar matrices."},{"Start":"00:05.775 ","End":"00:06.930","Text":"In the 1st 1,"},{"Start":"00:06.930 ","End":"00:12.600","Text":"we\u0027re given that a is similar to b and q is a polynomial."},{"Start":"00:12.600 ","End":"00:17.955","Text":"Our task is to prove that q of a is similar to q of b."},{"Start":"00:17.955 ","End":"00:19.950","Text":"Then the second part,"},{"Start":"00:19.950 ","End":"00:22.050","Text":"also given a similar to b,"},{"Start":"00:22.050 ","End":"00:29.200","Text":"we have to show that their transposes are also similar. I start with part a."},{"Start":"00:29.390 ","End":"00:35.100","Text":"This is just spelling out what it means for a to be similar to b and the some"},{"Start":"00:35.100 ","End":"00:40.945","Text":"invertible matrix p. 
There\u0027s a computation we\u0027ve done many times,"},{"Start":"00:40.945 ","End":"00:45.020","Text":"which is that if we take b to the power of n,"},{"Start":"00:45.020 ","End":"00:49.640","Text":"we just have to take to the power of n in the middle."},{"Start":"00:49.640 ","End":"00:52.870","Text":"If you remember, if we string a lot of these along the p,"},{"Start":"00:52.870 ","End":"00:57.385","Text":"p inverse cancel all the way in the middle and we\u0027re left with this."},{"Start":"00:57.385 ","End":"01:00.890","Text":"Now suppose q has some degree k,"},{"Start":"01:00.890 ","End":"01:04.235","Text":"and then we can write it out in increasing order,"},{"Start":"01:04.235 ","End":"01:07.765","Text":"increasing powers of x, like so."},{"Start":"01:07.765 ","End":"01:12.210","Text":"Our task is to prove that q of a is similar to q of b,"},{"Start":"01:12.210 ","End":"01:13.400","Text":"and if we spell it out,"},{"Start":"01:13.400 ","End":"01:14.630","Text":"this is what we get."},{"Start":"01:14.630 ","End":"01:21.980","Text":"Don\u0027t forget that the constant has to be accompanied by the identity matrix I."},{"Start":"01:21.980 ","End":"01:29.995","Text":"You can\u0027t just leave a constant because these are all matrices. Here\u0027s the proof."},{"Start":"01:29.995 ","End":"01:33.230","Text":"Let\u0027s take this p minus 1 and p,"},{"Start":"01:33.230 ","End":"01:36.065","Text":"and instead of sandwiching it around a,"},{"Start":"01:36.065 ","End":"01:39.905","Text":"we\u0027ll sandwich it around q of a, and see what we get."},{"Start":"01:39.905 ","End":"01:42.860","Text":"Just first, substitute q of a,"},{"Start":"01:42.860 ","End":"01:44.620","Text":"which is this,"},{"Start":"01:44.620 ","End":"01:47.740","Text":"and then I\u0027m doing 2 steps in 1,"},{"Start":"01:47.740 ","End":"01:54.530","Text":"I\u0027m using linearity so I can put p minus 1 p around each of the term separately."},{"Start":"01:54.530 ","End":"01:59.510","Text":"But the constants also can get pulled out upfront."},{"Start":"01:59.510 ","End":"02:02.880","Text":"I guess I also omitted the I here,"},{"Start":"02:02.880 ","End":"02:06.100","Text":"so we did several steps in 1."},{"Start":"02:09.920 ","End":"02:14.365","Text":"Here it is, yeah, I\u0027m using this formula here now"},{"Start":"02:14.365 ","End":"02:19.105","Text":"that p inverse a to the something p is b to that something."},{"Start":"02:19.105 ","End":"02:23.520","Text":"For example, p inverse a squared,"},{"Start":"02:23.520 ","End":"02:25.395","Text":"p is b squared,"},{"Start":"02:25.395 ","End":"02:30.505","Text":"and the same thing for a to the k gives us b to the k. 
But if you look at this,"},{"Start":"02:30.505 ","End":"02:34.630","Text":"this is exactly q of b."},{"Start":"02:34.630 ","End":"02:36.610","Text":"It\u0027s the polynomial q,"},{"Start":"02:36.610 ","End":"02:40.910","Text":"just replacing x by b here."},{"Start":"02:40.910 ","End":"02:46.735","Text":"I should have written it that because this is equal to this,"},{"Start":"02:46.735 ","End":"02:55.500","Text":"it follows that q of a is similar to q of b,"},{"Start":"02:55.500 ","End":"03:01.965","Text":"using the same p as we used for showing that a was similar to b."},{"Start":"03:01.965 ","End":"03:04.455","Text":"Now part b,"},{"Start":"03:04.455 ","End":"03:08.610","Text":"once again, a is similar to b,"},{"Start":"03:08.610 ","End":"03:11.800","Text":"so this equation holds."},{"Start":"03:11.990 ","End":"03:15.710","Text":"This time we\u0027ll take the transpose of both sides"},{"Start":"03:15.710 ","End":"03:19.255","Text":"of this because we want to show that the transposes are similar."},{"Start":"03:19.255 ","End":"03:22.880","Text":"The transpose of this equal to the transpose of this,"},{"Start":"03:22.880 ","End":"03:24.335","Text":"and that\u0027s what I\u0027ve written here."},{"Start":"03:24.335 ","End":"03:28.160","Text":"I want to remind you that when you take the transpose of a product,"},{"Start":"03:28.160 ","End":"03:31.440","Text":"you have to invert the order."},{"Start":"03:31.700 ","End":"03:35.780","Text":"The left-hand side here becomes first p transpose,"},{"Start":"03:35.780 ","End":"03:39.725","Text":"then a transpose, then p inverse transpose."},{"Start":"03:39.725 ","End":"03:46.895","Text":"Now it\u0027s known that the transpose of the inverse is the inverse of the transpose."},{"Start":"03:46.895 ","End":"03:50.300","Text":"Now, my next step might seem strange."},{"Start":"03:50.300 ","End":"03:55.805","Text":"I\u0027m writing p transpose as p transpose inverse inverse."},{"Start":"03:55.805 ","End":"04:00.215","Text":"If I let this p transpose inverse,"},{"Start":"04:00.215 ","End":"04:04.154","Text":"I\u0027ll call it p something, p tilde."},{"Start":"04:04.154 ","End":"04:06.570","Text":"Then this is p tilde inverse,"},{"Start":"04:06.570 ","End":"04:09.240","Text":"a transpose p tilde."},{"Start":"04:09.240 ","End":"04:11.300","Text":"When we have this equation,"},{"Start":"04:11.300 ","End":"04:20.010","Text":"it shows that a transpose is similar to b transpose and we\u0027re done."}],"ID":14025},{"Watched":false,"Name":"Exercise 5 - Parts h-i","Duration":"2m 31s","ChapterTopicVideoID":13385,"CourseChapterTopicPlaylistID":7321,"HasSubtitles":true,"ThumbnailPath":null,"UploadDate":null,"DurationForVideoObject":null,"Description":null,"MetaTitle":null,"MetaDescription":null,"Canonical":null,"VideoComments":[],"Subtitles":[{"Start":"00:00.000 ","End":"00:07.005","Text":"In this exercise, we\u0027re given 2 similar N by N matrices."},{"Start":"00:07.005 ","End":"00:09.120","Text":"We have to prove 2 things,"},{"Start":"00:09.120 ","End":"00:11.910","Text":"that they have the same rank,"},{"Start":"00:11.910 ","End":"00:13.980","Text":"and that they have the same nullity."},{"Start":"00:13.980 ","End":"00:18.435","Text":"We\u0027re given a hint that the rank of"},{"Start":"00:18.435 ","End":"00:26.115","Text":"a product is less than or equal to the rank of either 1 of the factors."},{"Start":"00:26.115 ","End":"00:32.840","Text":"Remember, A and B are similar matrices we can write P inverse AP is"},{"Start":"00:32.840 ","End":"00:39.275","Text":"equal 
to B for some invertible P. I\u0027m going to use the hint."},{"Start":"00:39.275 ","End":"00:42.170","Text":"The hint obviously applies to a product of"},{"Start":"00:42.170 ","End":"00:47.630","Text":"more than 2 matrices every time you multiply, and this is for square matrices."},{"Start":"00:47.630 ","End":"00:50.450","Text":"If I keep multiplying by another matrix,"},{"Start":"00:50.450 ","End":"00:52.510","Text":"the rank can only get smaller."},{"Start":"00:52.510 ","End":"00:56.690","Text":"Here, the rank of P inverse AP."},{"Start":"00:56.690 ","End":"01:00.600","Text":"First of all, it\u0027s less than or equal to the rank of AP."},{"Start":"01:01.160 ","End":"01:05.615","Text":"Therefore, it\u0027s less than or equal to the rank of A; each time I can"},{"Start":"01:05.615 ","End":"01:10.430","Text":"drop a matrix, the rank can only grow larger."},{"Start":"01:10.430 ","End":"01:15.140","Text":"Rank of B is less than or equal to the rank of A."},{"Start":"01:15.140 ","End":"01:19.430","Text":"Without repeating the computation, just by symmetry, if A is similar to B,"},{"Start":"01:19.430 ","End":"01:21.620","Text":"then B is similar to A, so I could also"},{"Start":"01:21.620 ","End":"01:24.920","Text":"get that rank of A is less than or equal to rank of B."},{"Start":"01:24.920 ","End":"01:30.455","Text":"When something is less than or equal to something and vice versa,"},{"Start":"01:30.455 ","End":"01:32.900","Text":"then those 2 quantities are equal."},{"Start":"01:32.900 ","End":"01:37.295","Text":"I wrote it formally: if A is less than or equal to B and B is less than or equal to A,"},{"Start":"01:37.295 ","End":"01:39.395","Text":"then A equals B."},{"Start":"01:39.395 ","End":"01:42.425","Text":"Very often used in mathematics."},{"Start":"01:42.425 ","End":"01:45.730","Text":"Prove equality by a double inequality."},{"Start":"01:45.730 ","End":"01:47.650","Text":"For part B,"},{"Start":"01:47.650 ","End":"01:51.150","Text":"I\u0027m going to use the rank-nullity theorem."},{"Start":"01:51.310 ","End":"01:54.995","Text":"If you don\u0027t remember it, then go and look it up."},{"Start":"01:54.995 ","End":"01:57.830","Text":"Anyway, for a square matrix,"},{"Start":"01:57.830 ","End":"02:03.695","Text":"the rank plus the nullity is equal to the dimension."},{"Start":"02:03.695 ","End":"02:08.435","Text":"This is true for A and it\u0027s also true for B."},{"Start":"02:08.435 ","End":"02:11.330","Text":"Then all we need now is a computation."},{"Start":"02:11.330 ","End":"02:16.735","Text":"The nullity of A from this is N minus the rank of A."},{"Start":"02:16.735 ","End":"02:18.510","Text":"But from part A,"},{"Start":"02:18.510 ","End":"02:23.595","Text":"rank of A equals rank B. 
I can write that this equals N minus rank of B."},{"Start":"02:23.595 ","End":"02:25.490","Text":"Again by the rank-nullity theorem,"},{"Start":"02:25.490 ","End":"02:27.205","Text":"this is equal to the nullity of B."},{"Start":"02:27.205 ","End":"02:32.030","Text":"Look, nullity of A equals nullity of B, and we\u0027re done."}],"ID":14026},{"Watched":false,"Name":"Exercise 6","Duration":"6m 30s","ChapterTopicVideoID":24868,"CourseChapterTopicPlaylistID":7321,"HasSubtitles":true,"ThumbnailPath":null,"UploadDate":null,"DurationForVideoObject":null,"Description":null,"MetaTitle":null,"MetaDescription":null,"Canonical":null,"VideoComments":[],"Subtitles":[{"Start":"00:00.000 ","End":"00:03.720","Text":"In this exercise, there are 4 true-false questions,"},{"Start":"00:03.720 ","End":"00:06.990","Text":"all about square real matrices."},{"Start":"00:06.990 ","End":"00:09.780","Text":"We\u0027ll read each 1 as we come to it."},{"Start":"00:09.780 ","End":"00:12.975","Text":"In part a, true or false."},{"Start":"00:12.975 ","End":"00:18.810","Text":"If two 3 by 3 matrices have the same characteristic polynomial, then they\u0027re similar."},{"Start":"00:18.810 ","End":"00:21.300","Text":"It turns out that this is false."},{"Start":"00:21.300 ","End":"00:23.750","Text":"Here is a counterexample."},{"Start":"00:23.750 ","End":"00:28.730","Text":"This is A and this is D. Now I won\u0027t do the computation."},{"Start":"00:28.730 ","End":"00:29.945","Text":"I\u0027ll leave that to you,"},{"Start":"00:29.945 ","End":"00:32.795","Text":"but they both have the characteristic polynomial"},{"Start":"00:32.795 ","End":"00:35.915","Text":"x minus 1 squared times x minus 2."},{"Start":"00:35.915 ","End":"00:37.730","Text":"Well, it\u0027s clear for this 1,"},{"Start":"00:37.730 ","End":"00:39.230","Text":"this is a diagonal matrix,"},{"Start":"00:39.230 ","End":"00:40.810","Text":"so it\u0027s x minus 1, x minus 1,"},{"Start":"00:40.810 ","End":"00:42.100","Text":"x minus 2,"},{"Start":"00:42.100 ","End":"00:44.645","Text":"and you can check that it is here also."},{"Start":"00:44.645 ","End":"00:46.370","Text":"But they are not similar."},{"Start":"00:46.370 ","End":"00:50.700","Text":"Why not? This is diagonal, and if they were similar,"},{"Start":"00:50.700 ","End":"00:53.465","Text":"then A would be diagonalizable."},{"Start":"00:53.465 ","End":"00:55.460","Text":"But in a previous exercise,"},{"Start":"00:55.460 ","End":"00:57.275","Text":"we showed that it isn\u0027t."},{"Start":"00:57.275 ","End":"01:00.790","Text":"Now in case you missed that previous exercise,"},{"Start":"01:00.790 ","End":"01:03.515","Text":"here\u0027s a screenshot from it."},{"Start":"01:03.515 ","End":"01:10.160","Text":"This is the characteristic matrix and we plug in x equals 1,"},{"Start":"01:10.160 ","End":"01:14.560","Text":"that\u0027s 1 of the eigenvalues and we want to compute the multiplicities."},{"Start":"01:14.560 ","End":"01:16.260","Text":"If you do that,"},{"Start":"01:16.260 ","End":"01:20.655","Text":"you get this and we need the kernel of this matrix,"},{"Start":"01:20.655 ","End":"01:21.950","Text":"it\u0027s like the null space."},{"Start":"01:21.950 ","End":"01:25.220","Text":"It\u0027s a solution to the following system of equations."},{"Start":"01:25.220 ","End":"01:28.310","Text":"This middle 1 is 0, so there\u0027s only 2 equations."},{"Start":"01:28.310 ","End":"01:30.200","Text":"It gives us that y is 0,"},{"Start":"01:30.200 ","End":"01:34.460","Text":"z is 0 and that means that we could take, for 
example,"},{"Start":"01:34.460 ","End":"01:37.100","Text":"x equals 1 and get that"},{"Start":"01:37.100 ","End":"01:43.700","Text":"the eigenspace for x equals 1 is spanned by a single vector, 1, 0,"},{"Start":"01:43.700 ","End":"01:48.490","Text":"0, and any event its dimension is 1,"},{"Start":"01:48.490 ","End":"01:54.560","Text":"so the geometric multiplicity is 1 and the algebraic multiplicity is 2."},{"Start":"01:54.560 ","End":"01:56.675","Text":"Since 2 is not equal to 1,"},{"Start":"01:56.675 ","End":"02:00.800","Text":"then it\u0027s not diagonalizable. That was a."},{"Start":"02:00.800 ","End":"02:07.775","Text":"B asks us if 2 3 by 3 matrices have the same minimal polynomial than they are similar."},{"Start":"02:07.775 ","End":"02:12.195","Text":"It\u0027s like part a except the word minimal instead of the word characteristic."},{"Start":"02:12.195 ","End":"02:15.180","Text":"This also turns out to be false."},{"Start":"02:15.180 ","End":"02:18.240","Text":"Here is a counterexample."},{"Start":"02:18.240 ","End":"02:21.425","Text":"If you compute the minimal polynomials,"},{"Start":"02:21.425 ","End":"02:22.670","Text":"and I won\u0027t do the work for you,"},{"Start":"02:22.670 ","End":"02:27.720","Text":"but they both come out to be x minus 1, x minus 2."},{"Start":"02:28.410 ","End":"02:32.650","Text":"The characteristic polynomial for A comes out to be"},{"Start":"02:32.650 ","End":"02:36.865","Text":"this and the characteristic polynomial for B is this,"},{"Start":"02:36.865 ","End":"02:37.990","Text":"and they\u0027re not the same."},{"Start":"02:37.990 ","End":"02:43.220","Text":"Here the x minus 1 is squared and here the x minus 2 is squared."},{"Start":"02:43.790 ","End":"02:46.960","Text":"If they don\u0027t have the same characteristic polynomial,"},{"Start":"02:46.960 ","End":"02:48.445","Text":"then they\u0027re not similar."},{"Start":"02:48.445 ","End":"02:50.200","Text":"Now in our third attempt,"},{"Start":"02:50.200 ","End":"02:53.170","Text":"let\u0027s try to change the requirement that they have"},{"Start":"02:53.170 ","End":"02:57.640","Text":"the same characteristic polynomial and the same minimal polynomial,"},{"Start":"02:57.640 ","End":"02:59.410","Text":"then are they similar?"},{"Start":"02:59.410 ","End":"03:02.205","Text":"Well, the answer is still no."},{"Start":"03:02.205 ","End":"03:04.925","Text":"We need a 4 by 4 counterexample."},{"Start":"03:04.925 ","End":"03:08.135","Text":"It\u0027s actually true for 3 by 3."},{"Start":"03:08.135 ","End":"03:12.875","Text":"The example is that it\u0027s all 0s except for a 1 here and here."},{"Start":"03:12.875 ","End":"03:14.790","Text":"That\u0027s for A, and for B,"},{"Start":"03:14.790 ","End":"03:16.805","Text":"we just have a 1 here."},{"Start":"03:16.805 ","End":"03:19.550","Text":"Computation for both of them."},{"Start":"03:19.550 ","End":"03:27.275","Text":"You get the polynomial is x^4th and the minimal polynomial is x squared."},{"Start":"03:27.275 ","End":"03:29.840","Text":"You can actually check that each 1 of these,"},{"Start":"03:29.840 ","End":"03:33.080","Text":"if you square it, you get the 0 matrix."},{"Start":"03:33.080 ","End":"03:37.295","Text":"These are the same and these are the same and they\u0027re not similar"},{"Start":"03:37.295 ","End":"03:42.810","Text":"because the rank of A is 2 and the rank of B is 1."},{"Start":"03:42.810 ","End":"03:45.735","Text":"Similar matrices have the same rank."},{"Start":"03:45.735 ","End":"03:51.630","Text":"That\u0027s part c. 
Part d asks us if the following 2 matrices,"},{"Start":"03:51.630 ","End":"03:53.715","Text":"A and B, are similar."},{"Start":"03:53.715 ","End":"03:58.430","Text":"We\u0027re given a hint that 3 by 3 matrices do have"},{"Start":"03:58.430 ","End":"04:00.890","Text":"the property that they\u0027re similar if and only if they"},{"Start":"04:00.890 ","End":"04:03.800","Text":"have the same characteristic and minimal polynomials."},{"Start":"04:03.800 ","End":"04:09.880","Text":"Let\u0027s compute the minimal for this and this and the characteristic for this and this."},{"Start":"04:09.880 ","End":"04:13.595","Text":"I\u0027ll do 1 of the computations at the end."},{"Start":"04:13.595 ","End":"04:16.295","Text":"Let\u0027s just get the result first."},{"Start":"04:16.295 ","End":"04:19.890","Text":"These characteristics are both x squared,"},{"Start":"04:19.890 ","End":"04:23.945","Text":"x minus 3 and the minimal for both is x,"},{"Start":"04:23.945 ","End":"04:29.015","Text":"x minus 3, which are the same and because it\u0027s 3 by 3, they\u0027re similar."},{"Start":"04:29.015 ","End":"04:31.985","Text":"Now it said that compute 1 of the 4, this 1 here."},{"Start":"04:31.985 ","End":"04:37.355","Text":"Yeah, that\u0027s just a conclusion and now here\u0027s the computation for this 1."},{"Start":"04:37.355 ","End":"04:42.445","Text":"We need the determinant of xI minus b."},{"Start":"04:42.445 ","End":"04:45.120","Text":"B is just all 1s,"},{"Start":"04:45.120 ","End":"04:50.400","Text":"so we take diagonal with x\u0027s and then subtract 1 everywhere."},{"Start":"04:50.400 ","End":"04:52.790","Text":"We\u0027ll do some row operations."},{"Start":"04:52.790 ","End":"04:56.790","Text":"We\u0027ll subtract row 1 from row 2,"},{"Start":"04:56.790 ","End":"04:58.770","Text":"and put it into row 2."},{"Start":"04:58.770 ","End":"05:03.025","Text":"We\u0027ll subtract R_1 also from R_3,"},{"Start":"05:03.025 ","End":"05:05.030","Text":"and we\u0027ll put it in row 3."},{"Start":"05:05.030 ","End":"05:06.680","Text":"First row stays the same."},{"Start":"05:06.680 ","End":"05:08.930","Text":"This minus this gives us this,"},{"Start":"05:08.930 ","End":"05:11.765","Text":"and this minus this gives us this."},{"Start":"05:11.765 ","End":"05:14.035","Text":"Now column operations."},{"Start":"05:14.035 ","End":"05:17.915","Text":"Add the first 2 columns and put them in the first column,"},{"Start":"05:17.915 ","End":"05:20.710","Text":"and then add the third column to the first column."},{"Start":"05:20.710 ","End":"05:25.175","Text":"Really we just add these 2 columns and subtract from the first."},{"Start":"05:25.175 ","End":"05:27.595","Text":"Minus 1 and minus 1 is minus 2."},{"Start":"05:27.595 ","End":"05:29.105","Text":"Sorry, we add it to the first."},{"Start":"05:29.105 ","End":"05:30.905","Text":"It gives us x minus 3."},{"Start":"05:30.905 ","End":"05:33.950","Text":"This and this is x added to this gives us 0."},{"Start":"05:33.950 ","End":"05:37.790","Text":"This and this is x added to this gives us 0."},{"Start":"05:37.790 ","End":"05:41.845","Text":"These 2 are the same but here we have 2 zeros."},{"Start":"05:41.845 ","End":"05:46.760","Text":"Then we can compute the determinant by expanding from the first row and"},{"Start":"05:46.760 ","End":"05:51.760","Text":"only this 1 will give us something which will be the product of these minus 1,"},{"Start":"05:51.760 ","End":"05:53.590","Text":"it\u0027s minor 0, 0,"},{"Start":"05:53.590 ","End":"05:55.490","Text":"0, x is 
0."},{"Start":"05:55.490 ","End":"05:57.230","Text":"Similarly, for this 1,"},{"Start":"05:57.230 ","End":"06:00.895","Text":"we just get x squared, x minus 3."},{"Start":"06:00.895 ","End":"06:05.300","Text":"That was the computation that we did here."},{"Start":"06:05.300 ","End":"06:07.855","Text":"Maybe I\u0027ll just repeat the main point again."},{"Start":"06:07.855 ","End":"06:10.855","Text":"We showed that these 2 matrices,"},{"Start":"06:10.855 ","End":"06:15.915","Text":"A and B had the same characteristic polynomial and the same minimal polynomial."},{"Start":"06:15.915 ","End":"06:17.745","Text":"We use the theorem,"},{"Start":"06:17.745 ","End":"06:19.420","Text":"the 3 by 3 matrices,"},{"Start":"06:19.420 ","End":"06:22.025","Text":"to conclude that they are similar."},{"Start":"06:22.025 ","End":"06:23.480","Text":"Because in the 3 by 3 case,"},{"Start":"06:23.480 ","End":"06:30.360","Text":"it\u0027s an if and only if as opposed to part c where we had a 4 by 4. That\u0027s it."}],"ID":25781},{"Watched":false,"Name":"Exercise 7","Duration":"4m 26s","ChapterTopicVideoID":24869,"CourseChapterTopicPlaylistID":7321,"HasSubtitles":true,"ThumbnailPath":null,"UploadDate":null,"DurationForVideoObject":null,"Description":null,"MetaTitle":null,"MetaDescription":null,"Canonical":null,"VideoComments":[],"Subtitles":[{"Start":"00:00.000 ","End":"00:02.710","Text":"In this exercise, we have a square 3 by"},{"Start":"00:02.710 ","End":"00:09.420","Text":"3 matrix A over the reals and it has 3 eigenvalues, 0, 1, and 2."},{"Start":"00:09.420 ","End":"00:14.580","Text":"We have 6 things to compute or to say why they can\u0027t be computed."},{"Start":"00:14.580 ","End":"00:17.955","Text":"Rank of A, dimension of kernel of A,"},{"Start":"00:17.955 ","End":"00:22.690","Text":"trace of A and we\u0027ll read the rest as we come to them."},{"Start":"00:22.690 ","End":"00:26.900","Text":"Let\u0027s start with noticing that because A has"},{"Start":"00:26.900 ","End":"00:33.230","Text":"3 distinct eigenvalues and it\u0027s an order 3 square matrix,"},{"Start":"00:33.230 ","End":"00:37.640","Text":"then it\u0027s similar to the diagonal matrix with 0, 1,"},{"Start":"00:37.640 ","End":"00:39.760","Text":"and 2 on the diagonal."},{"Start":"00:39.760 ","End":"00:45.860","Text":"This will really help us because certain things are preserved under similarity."},{"Start":"00:45.860 ","End":"00:49.040","Text":"If 2 matrices are similar then they have the same rank,"},{"Start":"00:49.040 ","End":"00:50.480","Text":"and they have the same nullity,"},{"Start":"00:50.480 ","End":"00:51.845","Text":"they have the same trace,"},{"Start":"00:51.845 ","End":"00:53.240","Text":"they even have the same determinant,"},{"Start":"00:53.240 ","End":"00:55.385","Text":"but we don\u0027t need that here."},{"Start":"00:55.385 ","End":"00:58.820","Text":"In part a, the rank of A is going to be the same as"},{"Start":"00:58.820 ","End":"00:59.840","Text":"the rank of D."},{"Start":"00:59.840 ","End":"01:02.015","Text":"The rank of D, just look at it,"},{"Start":"01:02.015 ","End":"01:06.375","Text":"it has 2 linearly independent rows. 
It\u0027s 2."},{"Start":"01:06.375 ","End":"01:11.250","Text":"The dimension of the kernel or the nullity is going to"},{"Start":"01:11.250 ","End":"01:16.020","Text":"be by the rank nullity theorem N minus the rank which is 3,"},{"Start":"01:16.020 ","End":"01:18.615","Text":"minus 2, which is 1."},{"Start":"01:18.615 ","End":"01:22.580","Text":"The trace of A, same as the trace of D."},{"Start":"01:22.580 ","End":"01:26.045","Text":"We just add up the diagonal 0 plus 1 plus 2 is 3."},{"Start":"01:26.045 ","End":"01:28.610","Text":"That\u0027s true for D, so it\u0027s true for A."},{"Start":"01:28.610 ","End":"01:31.475","Text":"Already, we\u0027ve done the first 3."},{"Start":"01:31.475 ","End":"01:39.880","Text":"The next 1 is we have to figure out the determinant of A transpose times A."},{"Start":"01:39.880 ","End":"01:44.860","Text":"A is not invertible because 0 is an eigenvalue,"},{"Start":"01:44.860 ","End":"01:47.210","Text":"so its determinant is 0."},{"Start":"01:47.210 ","End":"01:53.120","Text":"The determinant of A transpose times A is equal to the determinant is multiplicative."},{"Start":"01:53.120 ","End":"01:54.860","Text":"Meaning if you take a product,"},{"Start":"01:54.860 ","End":"01:57.890","Text":"we can just convert that to a product of determinants."},{"Start":"01:57.890 ","End":"02:00.410","Text":"Determinant of this time determinant of A."},{"Start":"02:00.410 ","End":"02:04.305","Text":"Since this is 0, this is 0."},{"Start":"02:04.305 ","End":"02:09.530","Text":"Next part e, we want the eigenvalues of A transpose A."},{"Start":"02:09.530 ","End":"02:13.115","Text":"It turns out that we can\u0027t say,"},{"Start":"02:13.115 ","End":"02:16.640","Text":"because I can give you an example of 2 matrices,"},{"Start":"02:16.640 ","End":"02:20.780","Text":"3 by 3, each of them having eigenvalues 0, 1, 2,"},{"Start":"02:20.780 ","End":"02:25.415","Text":"but having a different set of eigenvalues for A transpose A."},{"Start":"02:25.415 ","End":"02:27.565","Text":"These are the 2 examples."},{"Start":"02:27.565 ","End":"02:34.085","Text":"A_1 is this, and A_2 similar just has an extra 1 here."},{"Start":"02:34.085 ","End":"02:39.495","Text":"I computed the product for you and the eigenvalues is 0, 1, 4."},{"Start":"02:39.495 ","End":"02:42.710","Text":"This has eigenvalue 0, 2, 4."},{"Start":"02:42.710 ","End":"02:45.230","Text":"Well, it\u0027s not the same set of eigenvalues,"},{"Start":"02:45.230 ","End":"02:49.590","Text":"which means that we can\u0027t say in general what they are."},{"Start":"02:49.590 ","End":"02:51.185","Text":"The last part f,"},{"Start":"02:51.185 ","End":"02:53.690","Text":"we have to compute the eigenvalues of"},{"Start":"02:53.690 ","End":"02:58.555","Text":"this expression for A square plus 10A plus I inverse."},{"Start":"02:58.555 ","End":"03:01.960","Text":"If it has an inverse, and we\u0027ll see."},{"Start":"03:01.960 ","End":"03:07.445","Text":"Let p of x be the polynomial for x squared plus 10x plus 1."},{"Start":"03:07.445 ","End":"03:11.335","Text":"This just came from looking at this like a polynomial."},{"Start":"03:11.335 ","End":"03:16.030","Text":"If you let B be p of A,"},{"Start":"03:16.030 ","End":"03:20.090","Text":"which is exactly what\u0027s written inside the brackets here,"},{"Start":"03:20.090 ","End":"03:21.665","Text":"that p of A,"},{"Start":"03:21.665 ","End":"03:25.465","Text":"you just plug A into this polynomial."},{"Start":"03:25.465 ","End":"03:30.335","Text":"Now, we\u0027ve seen before that when you take 
the polynomial of a matrix,"},{"Start":"03:30.335 ","End":"03:33.260","Text":"the result has the same eigenvectors,"},{"Start":"03:33.260 ","End":"03:38.845","Text":"but the eigenvalues are p of the old eigenvalues."},{"Start":"03:38.845 ","End":"03:41.120","Text":"In this case there will be p of 0,"},{"Start":"03:41.120 ","End":"03:43.505","Text":"p of 1, and p of 2."},{"Start":"03:43.505 ","End":"03:49.880","Text":"For example, p of 1 would be 4 times 1 squared plus 10 times 1 plus 1, which comes out to 15,"},{"Start":"03:49.880 ","End":"03:51.845","Text":"and so on for the rest of them."},{"Start":"03:51.845 ","End":"03:57.655","Text":"These are the 3 eigenvalues and 0 is not among them."},{"Start":"03:57.655 ","End":"03:59.580","Text":"B is invertible."},{"Start":"03:59.580 ","End":"04:03.365","Text":"It makes sense to take the inverse here."},{"Start":"04:03.365 ","End":"04:06.964","Text":"Our matrix, which is B inverse:"},{"Start":"04:06.964 ","End":"04:10.790","Text":"it\u0027s a property of inverses that they also have"},{"Start":"04:10.790 ","End":"04:14.600","Text":"the same eigenvectors, with the inverses of the eigenvalues,"},{"Start":"04:14.600 ","End":"04:16.055","Text":"assuming they\u0027re not 0."},{"Start":"04:16.055 ","End":"04:23.085","Text":"In this case, we have 1/1, 1/15, and 1/37."},{"Start":"04:23.085 ","End":"04:26.650","Text":"That\u0027s the answer, and we\u0027re done."}],"ID":25782},{"Watched":false,"Name":"Exercise 8","Duration":"1m 9s","ChapterTopicVideoID":24866,"CourseChapterTopicPlaylistID":7321,"HasSubtitles":true,"ThumbnailPath":null,"UploadDate":null,"DurationForVideoObject":null,"Description":null,"MetaTitle":null,"MetaDescription":null,"Canonical":null,"VideoComments":[],"Subtitles":[{"Start":"00:00.000 ","End":"00:03.630","Text":"In this exercise, we have to prove that if A and B"},{"Start":"00:03.630 ","End":"00:06.150","Text":"are similar matrices,"},{"Start":"00:06.150 ","End":"00:08.910","Text":"then they have the same minimal polynomial."},{"Start":"00:08.910 ","End":"00:16.335","Text":"Now we have to remember a proposition that if A and B are similar and p is a polynomial,"},{"Start":"00:16.335 ","End":"00:19.665","Text":"then p of A is similar to p of B."},{"Start":"00:19.665 ","End":"00:28.320","Text":"Using that, we conclude that p of A is 0 if and only if p of B is 0."},{"Start":"00:28.320 ","End":"00:34.175","Text":"This is because the zero matrix is the only matrix that\u0027s similar to the zero matrix."},{"Start":"00:34.175 ","End":"00:35.930","Text":"If p of A is 0,"},{"Start":"00:35.930 ","End":"00:37.670","Text":"then p of B is similar to 0,"},{"Start":"00:37.670 ","End":"00:40.490","Text":"so it is 0, and vice versa."},{"Start":"00:40.490 ","End":"00:43.205","Text":"Now, A and B"},{"Start":"00:43.205 ","End":"00:45.575","Text":"are annihilated by the same set of polynomials."},{"Start":"00:45.575 ","End":"00:52.670","Text":"Polynomial p applied to A is 0 if and only if p applied to B is 0."},{"Start":"00:52.670 ","End":"00:56.930","Text":"Then in particular, the monic one with"},{"Start":"00:56.930 ","End":"00:59.630","Text":"least degree is also the same"},{"Start":"00:59.630 ","End":"01:02.599","Text":"for both, because we choose it from the same set of polynomials,"},{"Start":"01:02.599 ","End":"01:04.610","Text":"the smallest, lowest degree monic one."},{"Start":"01:04.610 ","End":"01:07.295","Text":"So they have the same minimal polynomial."},{"Start":"01:07.295 ","End":"01:10.170","Text":"That\u0027s all there is to it."}],"ID":25779}],"Thumbnail":null,"ID":7321},{"Name":"Linear 
Transformation\u0027s Eigenvalues and Diagonalization","TopicPlaylistFirstVideoID":0,"Duration":null,"Videos":[{"Watched":false,"Name":"Introduction","Duration":"9m 3s","ChapterTopicVideoID":24890,"CourseChapterTopicPlaylistID":118361,"HasSubtitles":true,"ThumbnailPath":null,"UploadDate":null,"DurationForVideoObject":null,"Description":null,"MetaTitle":null,"MetaDescription":null,"Canonical":null,"VideoComments":[],"Subtitles":[{"Start":"00:00.000 ","End":"00:03.840","Text":"We\u0027re starting a new call it subtopic,"},{"Start":"00:03.840 ","End":"00:07.170","Text":"we\u0027re still in eigenvalues and eigenvectors,"},{"Start":"00:07.170 ","End":"00:11.580","Text":"but so far we\u0027ve talked about eigenvalues and eigenvectors of"},{"Start":"00:11.580 ","End":"00:14.640","Text":"a matrix and now we\u0027re going to talk about"},{"Start":"00:14.640 ","End":"00:18.270","Text":"eigenvalues and eigenvectors of a linear transformation."},{"Start":"00:18.270 ","End":"00:22.865","Text":"I\u0027m going to assume that you\u0027re familiar with the case of a matrix."},{"Start":"00:22.865 ","End":"00:26.255","Text":"If you\u0027re not, you should go back and study that,"},{"Start":"00:26.255 ","End":"00:28.145","Text":"but for this particular clip,"},{"Start":"00:28.145 ","End":"00:32.835","Text":"you can get by without this. That\u0027s up to you."},{"Start":"00:32.835 ","End":"00:37.610","Text":"Like I said, we\u0027re going to apply these now to a more general situation of"},{"Start":"00:37.610 ","End":"00:41.780","Text":"a linear transformation from a vector space to itself,"},{"Start":"00:41.780 ","End":"00:46.875","Text":"but we\u0027re going to only be concerned with finite dimensional vector spaces."},{"Start":"00:46.875 ","End":"00:49.290","Text":"That\u0027s why dimension n doesn\u0027t matter what n is,"},{"Start":"00:49.290 ","End":"00:51.720","Text":"but the point is it\u0027s finite."},{"Start":"00:51.720 ","End":"00:54.920","Text":"The field is usually the real,"},{"Start":"00:54.920 ","End":"00:56.465","Text":"sometimes the complex,"},{"Start":"00:56.465 ","End":"00:58.265","Text":"but I start with an example."},{"Start":"00:58.265 ","End":"01:03.385","Text":"Let\u0027s take transformation T from R^3 to itself,"},{"Start":"01:03.385 ","End":"01:06.270","Text":"it takes x, y, z to minus x plus 3y,"},{"Start":"01:06.270 ","End":"01:10.120","Text":"3x minus y, minus 2x minus 2y plus 6z."},{"Start":"01:10.220 ","End":"01:15.310","Text":"I\u0027ll leave you to mentally note why this is really linear."},{"Start":"01:15.310 ","End":"01:18.035","Text":"Let\u0027s compute Tv,"},{"Start":"01:18.035 ","End":"01:20.160","Text":"sometimes I put the brackets around the v,"},{"Start":"01:20.160 ","End":"01:23.865","Text":"sometimes not, for a few v in R^3."},{"Start":"01:23.865 ","End":"01:26.680","Text":"If we apply it to minus 1, 1, 0,"},{"Start":"01:26.680 ","End":"01:30.805","Text":"say we\u0027ll get plus 1 plus 3,"},{"Start":"01:30.805 ","End":"01:36.050","Text":"that\u0027s 4 and then minus 3, minus 4."},{"Start":"01:36.050 ","End":"01:42.400","Text":"Here we have 2 minus 2 plus 6 times 0 is 0."},{"Start":"01:42.400 ","End":"01:46.150","Text":"Here are 3 more not going to do the computation."},{"Start":"01:46.150 ","End":"01:48.700","Text":"Now, there is something interesting to note,"},{"Start":"01:48.700 ","End":"01:50.500","Text":"and in some of these cases,"},{"Start":"01:50.500 ","End":"01:54.460","Text":"T takes a vector to a multiple of itself."},{"Start":"01:54.460 
","End":"01:56.500","Text":"That would be in this case,"},{"Start":"01:56.500 ","End":"01:58.195","Text":"this case and this case,"},{"Start":"01:58.195 ","End":"02:00.010","Text":"T of minus 1, 1,"},{"Start":"02:00.010 ","End":"02:02.480","Text":"0 is 4 minus 4,0,"},{"Start":"02:02.480 ","End":"02:06.660","Text":"which is exactly minus 4 times the original v,"},{"Start":"02:06.660 ","End":"02:08.619","Text":"this is v. Similarly,"},{"Start":"02:08.619 ","End":"02:11.065","Text":"in this case here,"},{"Start":"02:11.065 ","End":"02:14.110","Text":"we just take this and double it, multiply by 2."},{"Start":"02:14.110 ","End":"02:17.110","Text":"Then the last case take 0,"},{"Start":"02:17.110 ","End":"02:21.065","Text":"0, 1 to 6 times 0, 0, 1."},{"Start":"02:21.065 ","End":"02:25.940","Text":"In each of these cases, we have the original vector multiplied by some scalar."},{"Start":"02:25.940 ","End":"02:28.560","Text":"This leads us to a definition,"},{"Start":"02:28.560 ","End":"02:31.625","Text":"let V be a vector space over a field and T,"},{"Start":"02:31.625 ","End":"02:36.740","Text":"a linear transformation from V to itself than a non-zero vector."},{"Start":"02:36.740 ","End":"02:42.455","Text":"It\u0027s important that it should be non-zero is called an eigenvector of"},{"Start":"02:42.455 ","End":"02:51.025","Text":"T. If T of v is K times v for some k in the field scalar."},{"Start":"02:51.025 ","End":"02:55.665","Text":"There\u0027s a name for this K it\u0027s called the eigenvalue."},{"Start":"02:55.665 ","End":"03:00.350","Text":"Each eigenvector has a certain corresponding eigenvalue."},{"Start":"03:00.350 ","End":"03:04.350","Text":"It\u0027s that multiple that we talked about here,"},{"Start":"03:04.350 ","End":"03:07.065","Text":"the scalar multiple is the eigenvalue."},{"Start":"03:07.065 ","End":"03:09.230","Text":"In the example above,"},{"Start":"03:09.230 ","End":"03:10.875","Text":"like this 1,"},{"Start":"03:10.875 ","End":"03:13.555","Text":"if v is minus 1, 1, 0,"},{"Start":"03:13.555 ","End":"03:21.760","Text":"then it\u0027s an eigenvector of T and it corresponds to the eigenvalue minus 4."},{"Start":"03:21.760 ","End":"03:24.640","Text":"Now some remarks. First of all,"},{"Start":"03:24.640 ","End":"03:28.390","Text":"it\u0027s possible for an eigenvalue to be 0."},{"Start":"03:28.390 ","End":"03:30.880","Text":"The eigenvector can\u0027t be 0,"},{"Start":"03:30.880 ","End":"03:34.270","Text":"but the eigenvalue can be and here\u0027s an example."},{"Start":"03:34.270 ","End":"03:35.710","Text":"Let\u0027s take this T,"},{"Start":"03:35.710 ","End":"03:38.830","Text":"this formula and then as an eigenvector,"},{"Start":"03:38.830 ","End":"03:40.870","Text":"take 1, 0, 0,"},{"Start":"03:40.870 ","End":"03:44.954","Text":"apply T to it and we get 0, 0, 0,"},{"Start":"03:44.954 ","End":"03:52.320","Text":"which is 0 times v. 
The vector is not 0 vector but the eigenvalue is."},{"Start":"03:52.320 ","End":"03:55.880","Text":"Just note because sometimes people get confused about that."},{"Start":"03:55.880 ","End":"04:01.835","Text":"Secondly, if we have an eigenvector of T with eigenvalue Lambda,"},{"Start":"04:01.835 ","End":"04:09.005","Text":"then we can find other eigenvectors just by multiplying v by a non-zero scalar."},{"Start":"04:09.005 ","End":"04:14.185","Text":"For example, 3v is also an eigenvector with the same Lambda."},{"Start":"04:14.185 ","End":"04:18.450","Text":"Let\u0027s show why 3v is an eigenvector,"},{"Start":"04:18.450 ","End":"04:21.030","Text":"T times 3v, T is linear,"},{"Start":"04:21.030 ","End":"04:24.945","Text":"let\u0027s pull a 3 out in front, 3 times Tv."},{"Start":"04:24.945 ","End":"04:26.990","Text":"Tv is Lambda v,"},{"Start":"04:26.990 ","End":"04:29.630","Text":"and then scalar times scalar times vector,"},{"Start":"04:29.630 ","End":"04:35.015","Text":"you can multiply the 2 scalars and then you can also take 1 of the scale is out again."},{"Start":"04:35.015 ","End":"04:38.520","Text":"T times this is Lambda times this."},{"Start":"04:38.520 ","End":"04:44.975","Text":"This is the 3v is also an eigenvector and it\u0027s also not zero, v is not 0."},{"Start":"04:44.975 ","End":"04:49.655","Text":"Number 3, an eigenvalue of a linear transformation"},{"Start":"04:49.655 ","End":"04:53.990","Text":"could have more than 1 linearly independent eigenvectors."},{"Start":"04:53.990 ","End":"04:58.420","Text":"The reason I say linearly independent is that we now it can have more than 1"},{"Start":"04:58.420 ","End":"05:04.100","Text":"because 3v and 2v and minus 7v and all that are eigenvectors,"},{"Start":"05:04.100 ","End":"05:07.610","Text":"but besides multiples, in other words,"},{"Start":"05:07.610 ","End":"05:13.880","Text":"you could find other ones that are not dependent on this given eigenvector."},{"Start":"05:13.880 ","End":"05:19.820","Text":"As an example, let\u0027s take this transformation by this formula."},{"Start":"05:19.820 ","End":"05:22.905","Text":"Then T times 1,0,1,"},{"Start":"05:22.905 ","End":"05:26.765","Text":"check the computation comes out to be 3,0,3,"},{"Start":"05:26.765 ","End":"05:33.905","Text":"which is 3 times v. This is an eigenvector with eigenvalue 3."},{"Start":"05:33.905 ","End":"05:36.880","Text":"Then try 1,1,0,"},{"Start":"05:36.880 ","End":"05:39.405","Text":"call that w say,"},{"Start":"05:39.405 ","End":"05:41.790","Text":"that comes out to be 3,3,0."},{"Start":"05:41.790 ","End":"05:44.140","Text":"Check the computation."},{"Start":"05:44.780 ","End":"05:49.245","Text":"This is exactly 3 times this,"},{"Start":"05:49.245 ","End":"05:54.990","Text":"T times w is 3 times w. 
Clearly these 2,"},{"Start":"05:54.990 ","End":"05:59.130","Text":"v and w, are not linearly dependent."},{"Start":"05:59.130 ","End":"06:02.570","Text":"Just from the fact that, say, in the 3rd component,"},{"Start":"06:02.570 ","End":"06:05.135","Text":"this is non-zero and this is 0,"},{"Start":"06:05.135 ","End":"06:10.730","Text":"there\u0027s no multiple of w that will give you v, so they\u0027re obviously not dependent."},{"Start":"06:10.730 ","End":"06:12.785","Text":"Now the 4th and last remark,"},{"Start":"06:12.785 ","End":"06:19.755","Text":"a linear transformation in general may be of the form T from some vector space to itself."},{"Start":"06:19.755 ","End":"06:24.665","Text":"It needn\u0027t be some R^n or C^n or anything like that."},{"Start":"06:24.665 ","End":"06:30.590","Text":"Now, such a T can also have eigenvalues and eigenvectors."},{"Start":"06:30.590 ","End":"06:38.110","Text":"For example, let\u0027s say we have the space of 2-by-2 real matrices."},{"Start":"06:38.110 ","End":"06:39.855","Text":"This is not R^n,"},{"Start":"06:39.855 ","End":"06:43.090","Text":"these are 2-by-2 matrices."},{"Start":"06:43.460 ","End":"06:49.250","Text":"Let\u0027s suppose we give the transformation T by T of"},{"Start":"06:49.250 ","End":"06:57.390","Text":"such a matrix means multiplying it on the left by this 1, 1, 1, 1."},{"Start":"06:57.700 ","End":"07:01.985","Text":"In that case, I claim that 1, 0, 1,"},{"Start":"07:01.985 ","End":"07:06.485","Text":"0 is an eigenvector of T with eigenvalue 2."},{"Start":"07:06.485 ","End":"07:12.470","Text":"Let\u0027s see, T times this vector is 1,"},{"Start":"07:12.470 ","End":"07:14.944","Text":"1, 1, 1 times this vector,"},{"Start":"07:14.944 ","End":"07:17.960","Text":"which comes out to be 2,0, 2,0,"},{"Start":"07:17.960 ","End":"07:26.410","Text":"which is exactly twice the original matrix considered as a vector."},{"Start":"07:26.570 ","End":"07:36.210","Text":"Eigenvectors and eigenvalues exist besides just R^n or C^n or whatever."},{"Start":"07:37.070 ","End":"07:41.570","Text":"I want to give another example of a more general vector space."},{"Start":"07:41.570 ","End":"07:46.765","Text":"Let\u0027s take the space of polynomials of degree 2 or less."},{"Start":"07:46.765 ","End":"07:52.380","Text":"Polynomials of degree less than or equal to 2, all over the reals."},{"Start":"07:53.200 ","End":"07:57.800","Text":"Let\u0027s say it\u0027s given by: T of a polynomial"},{"Start":"07:57.800 ","End":"08:01.530","Text":"is the derivative of that polynomial."},{"Start":"08:01.960 ","End":"08:12.455","Text":"Now, it is linear because T of a constant times a function or a polynomial,"},{"Start":"08:12.455 ","End":"08:16.115","Text":"the constant just comes out of the derivative."},{"Start":"08:16.115 ","End":"08:21.280","Text":"The derivative of a sum is the sum of the derivatives, so the derivative is linear."},{"Start":"08:21.280 ","End":"08:24.870","Text":"Now, the polynomial 1,"},{"Start":"08:24.870 ","End":"08:26.805","Text":"belongs to this space."},{"Start":"08:26.805 ","End":"08:30.380","Text":"This is an eigenvector with eigenvalue 0."},{"Start":"08:30.380 ","End":"08:35.930","Text":"Because T of the polynomial 1 is the derivative of 1,"},{"Start":"08:35.930 ","End":"08:42.530","Text":"which is 0, which is scalar 0 times polynomial 1."},{"Start":"08:42.530 ","End":"08:45.780","Text":"Tv is 0v."},{"Start":"08:46.100 ","End":"08:56.200","Text":"That\u0027s an example, like in Number 1 above, that 0 can be an 
eigenvalue."},{"Start":"08:57.800 ","End":"09:03.240","Text":"With this example, we\u0027re done with this intro clip."}],"ID":25803},{"Watched":false,"Name":"Computing Linear Transformation\u0027s Eigenvalues","Duration":"7m 57s","ChapterTopicVideoID":24892,"CourseChapterTopicPlaylistID":118361,"HasSubtitles":true,"ThumbnailPath":null,"UploadDate":null,"DurationForVideoObject":null,"Description":null,"MetaTitle":null,"MetaDescription":null,"Canonical":null,"VideoComments":[],"Subtitles":[{"Start":"00:00.000 ","End":"00:03.030","Text":"This clip follows the introduction to"},{"Start":"00:03.030 ","End":"00:06.120","Text":"eigenvalues and eigenvectors of linear transformations."},{"Start":"00:06.120 ","End":"00:11.235","Text":"Here we\u0027ll discuss the practicalities of how to find them."},{"Start":"00:11.235 ","End":"00:15.810","Text":"Let\u0027s continue the example we had before of"},{"Start":"00:15.810 ","End":"00:21.585","Text":"a transformation from R^3 to R^3 given by this formula."},{"Start":"00:21.585 ","End":"00:23.925","Text":"In the previous clip in the intro,"},{"Start":"00:23.925 ","End":"00:30.945","Text":"we showed that these 3 vectors are eigenvectors with corresponding eigenvalues."},{"Start":"00:30.945 ","End":"00:34.200","Text":"But we didn\u0027t say how we got them,"},{"Start":"00:34.200 ","End":"00:35.955","Text":"I just pulled them out of thin air,"},{"Start":"00:35.955 ","End":"00:38.705","Text":"so here we\u0027re going to discuss how to find them."},{"Start":"00:38.705 ","End":"00:44.135","Text":"In this case, our vector spaces of the format to the n with n equals 3."},{"Start":"00:44.135 ","End":"00:50.435","Text":"It\u0027s easier than the general case which we\u0027ll tackle subsequently."},{"Start":"00:50.435 ","End":"00:53.990","Text":"Let\u0027s first of all discuss the R\u0027s to the end case."},{"Start":"00:53.990 ","End":"01:01.715","Text":"Step 1 is to find the matrix that represents the transformation in the standard basis."},{"Start":"01:01.715 ","End":"01:03.740","Text":"We can write it as T of x, y,"},{"Start":"01:03.740 ","End":"01:06.350","Text":"z, is this matrix times x,"},{"Start":"01:06.350 ","End":"01:09.110","Text":"y, z. I\u0027ve got these coefficients,"},{"Start":"01:09.110 ","End":"01:11.210","Text":"for example, let\u0027s take the middle row."},{"Start":"01:11.210 ","End":"01:19.799","Text":"3 minus 1, 0 comes from 3 minus 1 and there is no z."},{"Start":"01:19.799 ","End":"01:27.455","Text":"We know how to do that. 
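A minimal sketch of Step 1 in code: column j of the standard-basis matrix is T(e_j). The first two components of T below follow the formula discussed in the clip (-x+3y and 3x-y); the third component, -2x-2y+6z, is reconstructed from the later computation clip, so treat it as an assumption.

```python
import numpy as np

# Column j of the standard-basis matrix of T is T(e_j).
# First two components follow the clip; the third is an assumption
# reconstructed from the later computation clip.
def T(v):
    x, y, z = v
    return np.array([-x + 3*y, 3*x - y, -2*x - 2*y + 6*z])

E = np.eye(3)
A = np.column_stack([T(E[:, j]) for j in range(3)])   # [T]_E, column by column
print(A)

# For R^n, the matrix and the transformation share eigenvalues/eigenvectors:
print(np.round(np.linalg.eig(A)[0]))   # 6, 2, -4 up to ordering
```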
This is the representation with respect to the standard basis."},{"Start":"01:27.455 ","End":"01:34.280","Text":"The matrix with respect to the standard basis is this matrix here."},{"Start":"01:34.280 ","End":"01:38.240","Text":"Now the technique which works for RN is to take"},{"Start":"01:38.240 ","End":"01:44.600","Text":"this matrix and find the eigenvalues and eigenvectors of the matrix."},{"Start":"01:44.600 ","End":"01:46.845","Text":"That\u0027s what Step 2 is."},{"Start":"01:46.845 ","End":"01:52.115","Text":"We get the following eigenvalues and vectors,"},{"Start":"01:52.115 ","End":"01:56.330","Text":"and I\u0027ll do the computations in a separate clip,"},{"Start":"01:56.330 ","End":"01:58.810","Text":"the following or the one after."},{"Start":"01:58.810 ","End":"02:01.115","Text":"Then in the case of R_n,"},{"Start":"02:01.115 ","End":"02:07.780","Text":"these are exactly the same eigenvalues and eigenvectors as those of the transformation."},{"Start":"02:07.780 ","End":"02:12.310","Text":"The matrix and the transformation have the same eigenvalues and vectors."},{"Start":"02:12.310 ","End":"02:15.070","Text":"Might ask, what\u0027s the logic behind this?"},{"Start":"02:15.070 ","End":"02:17.665","Text":"T of x, y, z,"},{"Start":"02:17.665 ","End":"02:21.279","Text":"which is coordinates in standard basis,"},{"Start":"02:21.279 ","End":"02:25.425","Text":"is given by if you look at the formula minus x plus 3y,"},{"Start":"02:25.425 ","End":"02:29.985","Text":"just copying the comp1nts and writing it in column vector form."},{"Start":"02:29.985 ","End":"02:36.780","Text":"T of x, y, z is the same as the product of this with x, y, z."},{"Start":"02:36.780 ","End":"02:41.865","Text":"This is the transformation matrix with respect to E, the standard basis."},{"Start":"02:41.865 ","End":"02:43.970","Text":"If T of x, y,"},{"Start":"02:43.970 ","End":"02:47.780","Text":"z is some scalar times x, y, z,"},{"Start":"02:47.780 ","End":"02:51.410","Text":"then so is the matrix applied to x, y,"},{"Start":"02:51.410 ","End":"02:53.180","Text":"z equal to lambda of x, y,"},{"Start":"02:53.180 ","End":"02:55.990","Text":"z, so if and only if it\u0027s the same thing."},{"Start":"02:55.990 ","End":"03:01.235","Text":"The eigenvalues and eigenvectors are the same for the transformation as for the matrix."},{"Start":"03:01.235 ","End":"03:04.475","Text":"Now the more general case."},{"Start":"03:04.475 ","End":"03:10.020","Text":"Suppose the linear transformation may not be."},{"Start":"03:10.850 ","End":"03:18.380","Text":"For example, suppose it\u0027s of the form T from M_n of our 2 itself, what is this?"},{"Start":"03:18.380 ","End":"03:22.070","Text":"This is the n by n matrices over the reals."},{"Start":"03:22.070 ","End":"03:24.065","Text":"It could be of the form,"},{"Start":"03:24.065 ","End":"03:27.785","Text":"that\u0027s another example, P_n of R to P_n of R,"},{"Start":"03:27.785 ","End":"03:33.890","Text":"where this is the polynomials of degree less than or equal to n over the reals."},{"Start":"03:33.890 ","End":"03:36.980","Text":"But it could also be more general from V to V,"},{"Start":"03:36.980 ","End":"03:40.159","Text":"where V is some n-dimensional vector space."},{"Start":"03:40.159 ","End":"03:45.340","Text":"We\u0027re not going to work with infinite dimensional spaces, just finite."},{"Start":"03:45.340 ","End":"03:49.655","Text":"The eigenvectors that we obtain with this method,"},{"Start":"03:49.655 ","End":"03:51.815","Text":"the eigenvectors for the 
matrix,"},{"Start":"03:51.815 ","End":"03:53.360","Text":"not the final answer,"},{"Start":"03:53.360 ","End":"03:58.600","Text":"they are just the coordinate vectors of the eigenvectors that we need."},{"Start":"03:58.600 ","End":"04:00.900","Text":"That doesn\u0027t explain it very well,"},{"Start":"04:00.900 ","End":"04:03.950","Text":"so following 2 examples will show how we"},{"Start":"04:03.950 ","End":"04:07.925","Text":"build the actual eigenvectors from these coordinate vectors,"},{"Start":"04:07.925 ","End":"04:10.925","Text":"which are the eigenvectors of the matrix."},{"Start":"04:10.925 ","End":"04:13.055","Text":"Here\u0027s the first example,"},{"Start":"04:13.055 ","End":"04:15.260","Text":"T applied to x, y,"},{"Start":"04:15.260 ","End":"04:17.750","Text":"z. T is the matrix 1, 1, 1,"},{"Start":"04:17.750 ","End":"04:20.570","Text":"1 times the vector,"},{"Start":"04:20.570 ","End":"04:22.760","Text":"which is really a matrix x, y, z,"},{"Start":"04:22.760 ","End":"04:29.990","Text":"t. Standard basis for 2-by-2 matrices over R is the following."},{"Start":"04:29.990 ","End":"04:34.850","Text":"Notice that the 1 appears in each of the positions and in this order."},{"Start":"04:34.850 ","End":"04:38.555","Text":"Now T, say apply it to the first 1,"},{"Start":"04:38.555 ","End":"04:43.070","Text":"if you multiply this matrix by this matrix,"},{"Start":"04:43.070 ","End":"04:44.615","Text":"we get this 1."},{"Start":"04:44.615 ","End":"04:47.630","Text":"This has coordinates 1,"},{"Start":"04:47.630 ","End":"04:51.440","Text":"0, 1, 0 with respect to the standard basis."},{"Start":"04:51.440 ","End":"04:52.940","Text":"We didn\u0027t have to do all this,"},{"Start":"04:52.940 ","End":"04:54.425","Text":"you could just say 1,"},{"Start":"04:54.425 ","End":"04:55.880","Text":"0, 1,"},{"Start":"04:55.880 ","End":"04:59.515","Text":"0, and similarly for the other 3."},{"Start":"04:59.515 ","End":"05:05.275","Text":"The matrix for T in this basis is the following."},{"Start":"05:05.275 ","End":"05:11.450","Text":"We have the following eigenvectors and the computations are not here"},{"Start":"05:11.450 ","End":"05:19.205","Text":"because this was 1 of the previous exercises in the section on Eigenvectors of matrices."},{"Start":"05:19.205 ","End":"05:20.810","Text":"We\u0027ve done this already, if not,"},{"Start":"05:20.810 ","End":"05:22.610","Text":"you\u0027ll just have to trust me for it."},{"Start":"05:22.610 ","End":"05:28.435","Text":"We have the following for eigenvectors with eigenvalues, 0 and 2."},{"Start":"05:28.435 ","End":"05:34.220","Text":"2 for each 1 and 2 for eigenvalue 0 and 1 and 2 for eigenvalue 2."},{"Start":"05:34.220 ","End":"05:38.120","Text":"Now these are not the answer because they\u0027re not 2-by-2 matrices,"},{"Start":"05:38.120 ","End":"05:43.040","Text":"they\u0027re just coordinate vectors for the eigenvectors of M_2 of R. 
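A short sketch of Example 1: the 4-by-4 matrix representing T in the basis E11, E12, E21, E22 is built column by column from the coordinate vectors described in the clip, and its eigenvalues come out as 0 and 2, each twice.

```python
import numpy as np

# Representing matrix of T(X) = M @ X on 2x2 matrices, in the standard basis
# E11, E12, E21, E22 (row-major order, as in the clip).  Column j holds the
# coordinates of T applied to the j-th basis matrix.
M = np.array([[1, 1],
              [1, 1]])

basis = [np.eye(1, 4, k).reshape(2, 2) for k in range(4)]   # E11, E12, E21, E22
B = np.column_stack([(M @ E).flatten() for E in basis])     # 4x4 matrix of T

print(B)
print(np.round(np.linalg.eig(B)[0]))   # eigenvalues 0 and 2, each twice
```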
What you"},{"Start":"05:43.040 ","End":"05:49.485","Text":"get is you take the coordinate vectors and multiply,"},{"Start":"05:49.485 ","End":"05:50.925","Text":"I\u0027ll show you in a moment."},{"Start":"05:50.925 ","End":"05:57.140","Text":"What we get are the following eigenvectors, which are matrices."},{"Start":"05:57.140 ","End":"05:59.990","Text":"We got this 1, for example,"},{"Start":"05:59.990 ","End":"06:07.490","Text":"by taking 0 times this plus minus 1 times the second basis element,"},{"Start":"06:07.490 ","End":"06:08.660","Text":"0 times this,1,"},{"Start":"06:08.660 ","End":"06:11.375","Text":"times this, and that\u0027s how we do it."},{"Start":"06:11.375 ","End":"06:14.360","Text":"For the other 3, it\u0027s a similar process."},{"Start":"06:14.360 ","End":"06:16.730","Text":"Let\u0027s move on to the second example."},{"Start":"06:16.730 ","End":"06:22.720","Text":"The second example is polynomials of degree less than or equal to 2 over the reals."},{"Start":"06:22.720 ","End":"06:26.720","Text":"T of a polynomial is what you get when"},{"Start":"06:26.720 ","End":"06:30.050","Text":"you substitute x plus 1 instead of x for the polynomial,"},{"Start":"06:30.050 ","End":"06:32.225","Text":"that\u0027s just in words what this says."},{"Start":"06:32.225 ","End":"06:34.370","Text":"I often make a mental note,"},{"Start":"06:34.370 ","End":"06:36.080","Text":"is this really linear?"},{"Start":"06:36.080 ","End":"06:37.380","Text":"I\u0027ll leave you to check that."},{"Start":"06:37.380 ","End":"06:40.340","Text":"The standard basis for the polynomials of"},{"Start":"06:40.340 ","End":"06:43.625","Text":"degree less than or equal to 2 is the following."},{"Start":"06:43.625 ","End":"06:46.135","Text":"Degree 0, degree 1, degree 2."},{"Start":"06:46.135 ","End":"06:48.375","Text":"Let\u0027s see what T does to each."},{"Start":"06:48.375 ","End":"06:51.450","Text":"To substitute x plus 1 instead of x in 1,"},{"Start":"06:51.450 ","End":"06:52.640","Text":"there is no way to substitute,"},{"Start":"06:52.640 ","End":"06:53.795","Text":"so it\u0027s just itself."},{"Start":"06:53.795 ","End":"06:59.110","Text":"T of x is x plus 1 and T of x squared is x plus 1 squared, which is this."},{"Start":"06:59.110 ","End":"07:01.790","Text":"The matrix has the following basis,"},{"Start":"07:01.790 ","End":"07:05.030","Text":"just taking the coefficients from here,1-to-1,"},{"Start":"07:05.030 ","End":"07:07.750","Text":"but notice that it\u0027s transposed here."},{"Start":"07:07.750 ","End":"07:12.030","Text":"Now, this matrix has only one eigenvector."},{"Start":"07:12.030 ","End":"07:16.640","Text":"Again, this was done in a previous exercise for eigenvectors of matrices,"},{"Start":"07:16.640 ","End":"07:18.200","Text":"and it\u0027s this 1."},{"Start":"07:18.200 ","End":"07:21.290","Text":"What we do is we convert"},{"Start":"07:21.290 ","End":"07:27.630","Text":"the coordinate vector into an actual polynomial by applying it to the standard basis,"},{"Start":"07:27.630 ","End":"07:33.590","Text":"so it\u0027s 1 times the polynomial 1 plus 0 times polynomial x"},{"Start":"07:33.590 ","End":"07:35.840","Text":"plus 0 times polynomial x squared."},{"Start":"07:35.840 ","End":"07:38.455","Text":"What we get as the polynomial 1."},{"Start":"07:38.455 ","End":"07:41.570","Text":"This is our eigenvector, the polynomial."},{"Start":"07:41.570 ","End":"07:42.870","Text":"Let\u0027s just check it,"},{"Start":"07:42.870 ","End":"07:45.590","Text":"if you apply T to 1, we get 
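Example 2 can be reproduced symbolically; this sketch builds the matrix of T(p(x)) = p(x+1) in the basis 1, x, x^2 and confirms there is a single independent eigenvector, whose coordinate vector (1, 0, 0) is the constant polynomial 1.

```python
import sympy as sp

x = sp.symbols('x')
basis = [sp.Integer(1), x, x**2]          # standard basis of P2(R)

def T(p):                                  # T(p(x)) = p(x+1)
    return sp.expand(p.subs(x, x + 1))

# Column j of the representing matrix = coordinates of T(basis[j]) w.r.t. 1, x, x^2.
cols = []
for b in basis:
    c = sp.Poly(T(b), x).all_coeffs()[::-1]   # ascending: [const, x, x^2]
    cols.append(list(c) + [0] * (3 - len(c)))
A = sp.Matrix(cols).T
print(A)                  # Matrix([[1, 1, 1], [0, 1, 2], [0, 0, 1]])

print(A.eigenvects())     # one eigenvalue (1) and only one independent eigenvector
# That eigenvector, (1, 0, 0), is the coordinate vector of the constant
# polynomial 1, and indeed T(1) = 1 = 1*1.
```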
1,"},{"Start":"07:45.590 ","End":"07:52.090","Text":"the polynomial, which is just the scalar 1 times our polynomial 1."},{"Start":"07:52.550 ","End":"07:57.690","Text":"This second example concludes this clip."}],"ID":25805},{"Watched":false,"Name":"Computation - Example","Duration":"9m 10s","ChapterTopicVideoID":24891,"CourseChapterTopicPlaylistID":118361,"HasSubtitles":true,"ThumbnailPath":null,"UploadDate":null,"DurationForVideoObject":null,"Description":null,"MetaTitle":null,"MetaDescription":null,"Canonical":null,"VideoComments":[],"Subtitles":[{"Start":"00:00.000 ","End":"00:06.810","Text":"In this clip, we\u0027ll complete the computations that we omitted in the previous clip."},{"Start":"00:06.810 ","End":"00:08.415","Text":"Maybe a bit boring,"},{"Start":"00:08.415 ","End":"00:12.555","Text":"but it is important and you need to know the technique."},{"Start":"00:12.555 ","End":"00:16.890","Text":"We\u0027ll present the computation in the form of an exercise."},{"Start":"00:16.890 ","End":"00:18.885","Text":"This is the matrix we had."},{"Start":"00:18.885 ","End":"00:21.900","Text":"We just have to find the eigenvalues and eigenvectors,"},{"Start":"00:21.900 ","End":"00:23.759","Text":"but I did it in steps."},{"Start":"00:23.759 ","End":"00:28.200","Text":"We, first of all, find the characteristic matrix or characteristic polynomial,"},{"Start":"00:28.200 ","End":"00:33.275","Text":"then the eigenvalues, the algebraic multiplicity of each."},{"Start":"00:33.275 ","End":"00:37.925","Text":"Then for each eigenvalue find its eigenspace and eigenvectors,"},{"Start":"00:37.925 ","End":"00:40.675","Text":"and the geometric multiplicity."},{"Start":"00:40.675 ","End":"00:44.405","Text":"Finally, determine if a is diagonalizable."},{"Start":"00:44.405 ","End":"00:46.400","Text":"What\u0027s in this color is more than what we"},{"Start":"00:46.400 ","End":"00:48.905","Text":"need for the computations, but we\u0027ll do it anyway."},{"Start":"00:48.905 ","End":"00:53.050","Text":"Let\u0027s start with the first 1 to find the characteristic matrix of A."},{"Start":"00:53.050 ","End":"00:57.535","Text":"That\u0027s simply x_I minus A or lambda I minus A."},{"Start":"00:57.535 ","End":"00:59.540","Text":"This is what we get,"},{"Start":"00:59.540 ","End":"01:02.090","Text":"and that\u0027s the answer to part A."},{"Start":"01:02.090 ","End":"01:04.610","Text":"Part B, the characteristic polynomial is simply"},{"Start":"01:04.610 ","End":"01:08.180","Text":"the determinant of the characteristic matrix."},{"Start":"01:08.180 ","End":"01:10.835","Text":"Determinant you can write this way or with bars."},{"Start":"01:10.835 ","End":"01:13.475","Text":"This is what we have to compute,"},{"Start":"01:13.475 ","End":"01:17.675","Text":"and we\u0027re going to compute it by the first column."},{"Start":"01:17.675 ","End":"01:20.925","Text":"These are 2 0s, so all we get is this."},{"Start":"01:20.925 ","End":"01:22.670","Text":"This is a plus, plus,"},{"Start":"01:22.670 ","End":"01:25.175","Text":"minus, plus, minus, plus."},{"Start":"01:25.175 ","End":"01:32.300","Text":"Then we have a 2 by 2 so it\u0027s x minus 6 times the determinant here,"},{"Start":"01:32.300 ","End":"01:36.775","Text":"which is x plus 1 squared minus minus 3 squared."},{"Start":"01:36.775 ","End":"01:41.240","Text":"Straightforward algebra, the polynomial comes out to be x minus 6,"},{"Start":"01:41.240 ","End":"01:43.370","Text":"x minus 2, x plus 4."},{"Start":"01:43.370 ","End":"01:49.340","Text":"Next, the eigenvalues of 
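A sketch of parts (a) and (b) with sympy. The 3-by-3 matrix is not printed in the transcript, so the one below is reconstructed from the stated results (eigenvalues 6, 2, -4 with eigenvectors (0,0,1), (1,1,1), (-1,1,0)) and should be treated as an assumption.

```python
import sympy as sp

# Reconstructed from the clip's stated eigenvalues/eigenvectors; an assumption.
A = sp.Matrix([[-1,  3, 0],
               [ 3, -1, 0],
               [-2, -2, 6]])

x = sp.symbols('x')
char_matrix = x * sp.eye(3) - A            # characteristic matrix  xI - A
p = sp.expand(char_matrix.det())           # characteristic polynomial
print(p)                                   # x**3 - 4*x**2 - 20*x + 48
print(sp.factor(p))                        # (x - 6)*(x - 2)*(x + 4)
```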
the matrix A and the multiplicity of each,"},{"Start":"01:49.340 ","End":"01:52.460","Text":"we just take the characteristic polynomial, set it to 0."},{"Start":"01:52.460 ","End":"01:55.090","Text":"This is also called the characteristic equation,"},{"Start":"01:55.090 ","End":"02:00.415","Text":"which is this and so the roots are 6 and minus 4."},{"Start":"02:00.415 ","End":"02:02.000","Text":"These are the eigenvalues."},{"Start":"02:02.000 ","End":"02:03.500","Text":"Just by the way,"},{"Start":"02:03.500 ","End":"02:05.870","Text":"there\u0027s a name for this set of eigenvalues."},{"Start":"02:05.870 ","End":"02:11.570","Text":"It\u0027s called the spectrum of the matrix and is sometimes denoted with a sigma."},{"Start":"02:11.570 ","End":"02:16.069","Text":"Now, in general, if x equals a is a 0,"},{"Start":"02:16.069 ","End":"02:18.890","Text":"and in our case, a will be 1 of these."},{"Start":"02:18.890 ","End":"02:23.270","Text":"Then if it\u0027s a 0 of order k of the polynomial,"},{"Start":"02:23.270 ","End":"02:29.480","Text":"then we say it\u0027s an eigenvalue with algebraic multiplicity k. To spell it out,"},{"Start":"02:29.480 ","End":"02:33.920","Text":"if we factorize p into separate x minus a to the something,"},{"Start":"02:33.920 ","End":"02:37.070","Text":"x minus something else to the something else, and so on."},{"Start":"02:37.070 ","End":"02:40.820","Text":"Then x equals a has algebraic multiplicity k,"},{"Start":"02:40.820 ","End":"02:42.290","Text":"it\u0027s the power here."},{"Start":"02:42.290 ","End":"02:43.625","Text":"But in our case,"},{"Start":"02:43.625 ","End":"02:45.550","Text":"these are all 1."},{"Start":"02:45.550 ","End":"02:47.580","Text":"All the eigenvalues,"},{"Start":"02:47.580 ","End":"02:50.675","Text":"all 3 of them of algebraic multiplicity 1."},{"Start":"02:50.675 ","End":"02:53.390","Text":"There\u0027s a notation for algebraic multiplicity."},{"Start":"02:53.390 ","End":"02:55.054","Text":"Sometimes a letter Mu,"},{"Start":"02:55.054 ","End":"02:58.175","Text":"the Greek letter is used for that."},{"Start":"02:58.175 ","End":"03:00.110","Text":"Mu 6 is 1,"},{"Start":"03:00.110 ","End":"03:04.100","Text":"mu 2 is 1, and mu minus 4 is 1."},{"Start":"03:04.100 ","End":"03:11.690","Text":"Now we\u0027re going to find the eigenspace of each eigenvalue and the geometric multiplicity,"},{"Start":"03:11.690 ","End":"03:14.350","Text":"which is the dimension of the eigenspace."},{"Start":"03:14.350 ","End":"03:16.500","Text":"Start off with x equals 6,"},{"Start":"03:16.500 ","End":"03:18.060","Text":"we have 3 of these."},{"Start":"03:18.060 ","End":"03:20.135","Text":"Let\u0027s go through quickly."},{"Start":"03:20.135 ","End":"03:22.414","Text":"This is the characteristic matrix,"},{"Start":"03:22.414 ","End":"03:26.860","Text":"substitute the eigenvalue, and this is what we get,"},{"Start":"03:26.860 ","End":"03:30.095","Text":"and this is considered as a system of"},{"Start":"03:30.095 ","End":"03:34.505","Text":"linear equations in 3 unknowns though z doesn\u0027t appear."},{"Start":"03:34.505 ","End":"03:37.215","Text":"Now to some row operations here,"},{"Start":"03:37.215 ","End":"03:39.820","Text":"I want to get a 0 here and here."},{"Start":"03:39.820 ","End":"03:49.450","Text":"7 times second row plus 3 times the first row into the second row will give us a 0 here."},{"Start":"03:49.450 ","End":"03:52.820","Text":"Similarly, we\u0027ll take here,"},{"Start":"03:52.820 ","End":"03:55.880","Text":"I don\u0027t know why it took a 
combination of second or third rows."},{"Start":"03:55.880 ","End":"03:59.000","Text":"We usually do a combination of first and third, doesn\u0027t matter."},{"Start":"03:59.000 ","End":"04:04.070","Text":"We still take twice this plus 3 times this and put it in the last row,"},{"Start":"04:04.070 ","End":"04:08.720","Text":"so twice 7 plus 3 times 2 will give us 20."},{"Start":"04:08.720 ","End":"04:12.139","Text":"That\u0027s this 20. Just like earlier,"},{"Start":"04:12.139 ","End":"04:18.120","Text":"7 of these which is 49,"},{"Start":"04:18.120 ","End":"04:21.530","Text":"plus 3 of these is minus 9 gives us the 40."},{"Start":"04:21.530 ","End":"04:25.265","Text":"Next, what we can do is another row operation."},{"Start":"04:25.265 ","End":"04:30.380","Text":"We can subtract twice this from the last row or the other way around,"},{"Start":"04:30.380 ","End":"04:37.880","Text":"a twice this minus this will give us 0 here."},{"Start":"04:37.880 ","End":"04:44.360","Text":"For this, what we\u0027ll do is take this row and just divide it by 40,"},{"Start":"04:44.360 ","End":"04:46.325","Text":"and that will give us a 1 here."},{"Start":"04:46.325 ","End":"04:51.815","Text":"Now that\u0027s the echelon form and this is the corresponding system of equations."},{"Start":"04:51.815 ","End":"04:55.800","Text":"Z is the free variable or Z if you\u0027re in England."},{"Start":"04:55.800 ","End":"05:00.770","Text":"We\u0027ll let Z equal 1 and then we can compute the others."},{"Start":"05:00.770 ","End":"05:05.405","Text":"Y is 0, X comes out to be 0 because if Y is 0,"},{"Start":"05:05.405 ","End":"05:06.695","Text":"then 7_X is 0,"},{"Start":"05:06.695 ","End":"05:07.990","Text":"X is 0,"},{"Start":"05:07.990 ","End":"05:13.735","Text":"and that gives us the eigenvector 0, 0, 1."},{"Start":"05:13.735 ","End":"05:17.115","Text":"The eigenspace is simply the span,"},{"Start":"05:17.115 ","End":"05:19.390","Text":"all multiples of this vector,"},{"Start":"05:19.390 ","End":"05:21.560","Text":"and the dimension of this space,"},{"Start":"05:21.560 ","End":"05:25.475","Text":"since it has 1 basis element is 1."},{"Start":"05:25.475 ","End":"05:28.365","Text":"The dimension of E_6 is 1."},{"Start":"05:28.365 ","End":"05:30.810","Text":"Like I said, this is a gamma notation."},{"Start":"05:30.810 ","End":"05:33.740","Text":"Let\u0027s go on to the next eigenvalue."},{"Start":"05:33.740 ","End":"05:38.120","Text":"This time x equals 2 substituting for the characteristic matrix,"},{"Start":"05:38.120 ","End":"05:39.725","Text":"this is what we get."},{"Start":"05:39.725 ","End":"05:46.430","Text":"Now we\u0027ll do row operations on this at the first and the second row and we\u0027ll get 0,"},{"Start":"05:46.430 ","End":"05:48.155","Text":"0, 0, that\u0027s this."},{"Start":"05:48.155 ","End":"05:52.460","Text":"If we take 3 times this row,"},{"Start":"05:52.460 ","End":"05:54.350","Text":"minus twice this row,"},{"Start":"05:54.350 ","End":"05:56.090","Text":"3 times 2 is 6,"},{"Start":"05:56.090 ","End":"05:57.800","Text":"minus twice 3 is 0,"},{"Start":"05:57.800 ","End":"05:59.510","Text":"3 times 2 is 6,"},{"Start":"05:59.510 ","End":"06:04.685","Text":"minus, minus twice 3 is 6 plus 6 is 12."},{"Start":"06:04.685 ","End":"06:09.740","Text":"Now we can simplify by dividing the top row by 3 we get this,"},{"Start":"06:09.740 ","End":"06:11.330","Text":"divide this row by 12,"},{"Start":"06:11.330 ","End":"06:15.645","Text":"we get this, eliminate the middle row."},{"Start":"06:15.645 
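The row reduction above amounts to computing the null space of the characteristic matrix at x = 6; here is a sketch with sympy (same reconstructed matrix as before, so an assumption). The same call with 2 and -4 reproduces the eigenvectors found next.

```python
import sympy as sp

# Same reconstructed matrix as above (an assumption, not printed in the transcript).
A = sp.Matrix([[-1,  3, 0],
               [ 3, -1, 0],
               [-2, -2, 6]])

for lam in (6, 2, -4):
    E = (lam * sp.eye(3) - A).nullspace()       # basis of the eigenspace E_lam
    print(lam, [tuple(v) for v in E], "gamma =", len(E))
# 6  -> span{(0, 0, 1)},  geometric multiplicity 1
# 2  -> span{(1, 1, 1)},  geometric multiplicity 1
# -4 -> span{(-1, 1, 0)}, geometric multiplicity 1
```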
","End":"06:17.520","Text":"We have X minus Y equals 0."},{"Start":"06:17.520 ","End":"06:20.209","Text":"Y minus Z is 0."},{"Start":"06:20.209 ","End":"06:22.070","Text":"C is the free variable,"},{"Start":"06:22.070 ","End":"06:23.560","Text":"let it be 1."},{"Start":"06:23.560 ","End":"06:26.820","Text":"From it we determine that Y is 1 and from Y is 1,"},{"Start":"06:26.820 ","End":"06:28.440","Text":"we get x equals 1."},{"Start":"06:28.440 ","End":"06:32.240","Text":"So 1, 1, 1 is the eigenvector for value"},{"Start":"06:32.240 ","End":"06:35.960","Text":"2 and the eigenspace is the span of this and as before,"},{"Start":"06:35.960 ","End":"06:37.775","Text":"it has dimension 1."},{"Start":"06:37.775 ","End":"06:42.290","Text":"The geometric multiplicity is 1,"},{"Start":"06:42.290 ","End":"06:45.575","Text":"and we call that Gamma for eigenvalue 2."},{"Start":"06:45.575 ","End":"06:49.265","Text":"The third eigenvalue minus 4."},{"Start":"06:49.265 ","End":"06:52.640","Text":"Plug that into the matrix, we get this."},{"Start":"06:52.640 ","End":"06:55.145","Text":"This is the corresponding equation."},{"Start":"06:55.145 ","End":"06:57.800","Text":"Now some row operations,"},{"Start":"06:57.800 ","End":"07:02.105","Text":"obviously we can subtract the first from the second."},{"Start":"07:02.105 ","End":"07:05.995","Text":"Anyway, go through it quicker this time."},{"Start":"07:05.995 ","End":"07:13.295","Text":"After row operations, we get this divide here by 3 and here by 30."},{"Start":"07:13.295 ","End":"07:19.895","Text":"We have this, eliminate the middle row altogether and we have this system of equations."},{"Start":"07:19.895 ","End":"07:26.940","Text":"Why is the free variable that y equals 1 and then z is 0,"},{"Start":"07:26.940 ","End":"07:29.145","Text":"x is minus 1."},{"Start":"07:29.145 ","End":"07:34.110","Text":"This is our eigenvector and the eigenspace,"},{"Start":"07:34.110 ","End":"07:36.855","Text":"so minus 4 is the span of this."},{"Start":"07:36.855 ","End":"07:40.250","Text":"Again, the geometric multiplicity is 1 which we"},{"Start":"07:40.250 ","End":"07:44.180","Text":"denote with a Gamma and the subscript is the eigenvalue."},{"Start":"07:44.180 ","End":"07:48.380","Text":"That was part D. 
Now we just have E remaining."},{"Start":"07:48.380 ","End":"07:51.080","Text":"We have to decide if A is diagonalizable."},{"Start":"07:51.080 ","End":"07:52.355","Text":"The answer is yes."},{"Start":"07:52.355 ","End":"07:54.080","Text":"I\u0027ll show you in 2 ways."},{"Start":"07:54.080 ","End":"07:59.480","Text":"The first way is to take the sum of the algebraic multiplicities, it\u0027s 3."},{"Start":"07:59.480 ","End":"08:01.340","Text":"It\u0027s the full amount."},{"Start":"08:01.340 ","End":"08:02.850","Text":"We\u0027re in dimension 3,"},{"Start":"08:02.850 ","End":"08:04.260","Text":"that\u0027s first of all."},{"Start":"08:04.260 ","End":"08:07.460","Text":"The second thing is that the algebraic and geometric multiplicities"},{"Start":"08:07.460 ","End":"08:10.945","Text":"are the same for each of the eigenvalues there 1."},{"Start":"08:10.945 ","End":"08:13.860","Text":"For 6, they\u0027re both 1, for 2,"},{"Start":"08:13.860 ","End":"08:17.025","Text":"they\u0027re both 1, and a minus 4, they\u0027re both 1."},{"Start":"08:17.025 ","End":"08:21.710","Text":"This together with the fact that the sum of them all is 3,"},{"Start":"08:21.710 ","End":"08:24.560","Text":"so there\u0027s a theorem that if this holds,"},{"Start":"08:24.560 ","End":"08:27.515","Text":"then the matrix is diagonalizable."},{"Start":"08:27.515 ","End":"08:30.070","Text":"The other way of doing it is to say that"},{"Start":"08:30.070 ","End":"08:34.070","Text":"the 3 vectors that we had are linearly independent."},{"Start":"08:34.070 ","End":"08:37.445","Text":"Eigenvectors from different eigenvalues are always"},{"Start":"08:37.445 ","End":"08:42.260","Text":"linearly independent is a theorem that it\u0027s actually not hard to prove."},{"Start":"08:42.260 ","End":"08:46.790","Text":"These 3 eigenvectors are a basis of"},{"Start":"08:46.790 ","End":"08:52.375","Text":"R^3 because the linearly independent and the 3 of them, the full amount."},{"Start":"08:52.375 ","End":"08:54.990","Text":"By another theorem, A is diagonalizable."},{"Start":"08:54.990 ","End":"08:56.990","Text":"That other theorem is that there\u0027s an if,"},{"Start":"08:56.990 ","End":"09:01.040","Text":"and only if condition that we have a basis consisting of eigenvectors."},{"Start":"09:01.040 ","End":"09:04.330","Text":"This would be a basis of eigenvectors."},{"Start":"09:04.330 ","End":"09:07.840","Text":"That finishes the computations."},{"Start":"09:07.840 ","End":"09:10.500","Text":"This clip finished."}],"ID":25804},{"Watched":false,"Name":"Diagonalisable Linear Transformations","Duration":"6m 6s","ChapterTopicVideoID":24878,"CourseChapterTopicPlaylistID":118361,"HasSubtitles":true,"ThumbnailPath":null,"UploadDate":null,"DurationForVideoObject":null,"Description":null,"MetaTitle":null,"MetaDescription":null,"Canonical":null,"VideoComments":[],"Subtitles":[{"Start":"00:00.000 ","End":"00:02.310","Text":"In this clip, we\u0027ll talk about"},{"Start":"00:02.310 ","End":"00:06.180","Text":"diagonalizable linear transformations as opposed"},{"Start":"00:06.180 ","End":"00:10.515","Text":"to diagonalizable matrices which you\u0027ve already studied."},{"Start":"00:10.515 ","End":"00:15.270","Text":"Now definition, a linear transformation T from a vector space to"},{"Start":"00:15.270 ","End":"00:21.090","Text":"itself it\u0027s called diagonalizable if 1 of the following conditions hold."},{"Start":"00:21.090 ","End":"00:23.355","Text":"These will be 2 equivalent conditions."},{"Start":"00:23.355 ","End":"00:27.495","Text":"The 
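Part (e) can be confirmed by putting the three eigenvectors side by side as the columns of a matrix P and checking that P is invertible and that P^(-1)AP is diagonal; again A is the reconstruction used above, so an assumption.

```python
import sympy as sp

A = sp.Matrix([[-1,  3, 0],          # reconstructed matrix, as above (assumption)
               [ 3, -1, 0],
               [-2, -2, 6]])

# Columns are the eigenvectors for 6, 2, -4; eigenvectors for distinct
# eigenvalues are linearly independent, so P should be invertible.
P = sp.Matrix([[0, 1, -1],
               [0, 1,  1],
               [1, 1,  0]])
print(P.det())               # nonzero, so the columns form a basis of R^3
print(P.inv() * A * P)       # diag(6, 2, -4): A is diagonalizable
```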
first 1 is that there is a basis of V"},{"Start":"00:27.495 ","End":"00:32.925","Text":"such that the matrix of T with respect to the basis B,"},{"Start":"00:32.925 ","End":"00:35.535","Text":"is a diagonal matrix."},{"Start":"00:35.535 ","End":"00:40.525","Text":"I should have said we assume that V is finite dimensional."},{"Start":"00:40.525 ","End":"00:43.460","Text":"The second condition equivalent to the first,"},{"Start":"00:43.460 ","End":"00:46.415","Text":"is that for any basis B of V,"},{"Start":"00:46.415 ","End":"00:50.150","Text":"the matrix of T with respect to B is diagonalizable."},{"Start":"00:50.150 ","End":"00:53.250","Text":"Note the difference here, there is,"},{"Start":"00:53.250 ","End":"00:59.435","Text":"just means there exists at least such 1 base is that it\u0027s actually diagonal."},{"Start":"00:59.435 ","End":"01:07.490","Text":"Here it\u0027s for any but the conditions we care is diagonalizable, not necessarily diagonal."},{"Start":"01:07.490 ","End":"01:15.185","Text":"Now an example, let\u0027s take T from R^3 to itself given as follows."},{"Start":"01:15.185 ","End":"01:16.850","Text":"We\u0027re familiar with this."},{"Start":"01:16.850 ","End":"01:19.820","Text":"We showed earlier that the matrix corresponding to"},{"Start":"01:19.820 ","End":"01:22.985","Text":"T with respect to the standard basis is this,"},{"Start":"01:22.985 ","End":"01:28.770","Text":"and it\u0027s diagonalizable we showed in the computation\u0027s clip."},{"Start":"01:31.250 ","End":"01:34.745","Text":"I meant 2 hear this condition."},{"Start":"01:34.745 ","End":"01:36.500","Text":"Yeah, got it backwards."},{"Start":"01:36.500 ","End":"01:41.180","Text":"We also found that it has eigenvalues 6,"},{"Start":"01:41.180 ","End":"01:46.405","Text":"2, and minus 4 and the following are the corresponding eigenvectors."},{"Start":"01:46.405 ","End":"01:51.705","Text":"With respect to this space is the transformation is 6, 2,"},{"Start":"01:51.705 ","End":"01:57.900","Text":"minus 4 on the diagonal so T is diagonalizable by,"},{"Start":"01:57.900 ","End":"02:01.110","Text":"this should be a 1, not 2 of course."},{"Start":"02:01.110 ","End":"02:07.790","Text":"As useful proposition in this regard that T is diagonalizable if and"},{"Start":"02:07.790 ","End":"02:14.360","Text":"only if V has a basis consisting of eigenvectors of T. 
Now an example,"},{"Start":"02:14.360 ","End":"02:20.840","Text":"let\u0027s consider the linear transformation from 2 by 2 matrices over the real to itself."},{"Start":"02:20.840 ","End":"02:22.010","Text":"We\u0027ve seen this before."},{"Start":"02:22.010 ","End":"02:28.910","Text":"We define T of a matrix as left multiplication by this matrix 1 1, 1 1."},{"Start":"02:28.910 ","End":"02:36.660","Text":"The standard basis to remind you of the space are these 4 matrices."},{"Start":"02:36.660 ","End":"02:40.880","Text":"For example, the first basis element,"},{"Start":"02:40.880 ","End":"02:42.695","Text":"if you apply T to it,"},{"Start":"02:42.695 ","End":"02:45.890","Text":"we get 1 0, 1 0."},{"Start":"02:45.890 ","End":"02:49.670","Text":"If you multiply this matrix by this matrix,"},{"Start":"02:49.670 ","End":"02:52.280","Text":"we get this and if you break it up into components,"},{"Start":"02:52.280 ","End":"02:57.020","Text":"we\u0027re going to get 1 0, 1 0."},{"Start":"02:57.020 ","End":"03:02.470","Text":"Similarly for the other 3 so the matrix comes out to be,"},{"Start":"03:02.470 ","End":"03:04.050","Text":"the first 1 we get 1 0,"},{"Start":"03:04.050 ","End":"03:06.815","Text":"1 0 and for the others we get similar."},{"Start":"03:06.815 ","End":"03:14.480","Text":"In another exercise, we showed that this matrix has 4 eigenvectors,"},{"Start":"03:14.480 ","End":"03:16.870","Text":"which are the following."},{"Start":"03:16.870 ","End":"03:20.310","Text":"This 2 for lambda equals 0,"},{"Start":"03:20.310 ","End":"03:23.550","Text":"and there\u0027s 2 of them for Lambda equals 2."},{"Start":"03:23.550 ","End":"03:29.195","Text":"But do you want the eigenvectors for the 2 by 2 matrices?"},{"Start":"03:29.195 ","End":"03:33.050","Text":"These are just coordinate vectors for this space."},{"Start":"03:33.050 ","End":"03:35.000","Text":"What we do is we build them."},{"Start":"03:35.000 ","End":"03:40.340","Text":"We take these as coordinates and this times the first basis vector,"},{"Start":"03:40.340 ","End":"03:42.860","Text":"plus this times the second basis vector."},{"Start":"03:42.860 ","End":"03:46.130","Text":"Then we get this or in practical terms,"},{"Start":"03:46.130 ","End":"03:49.445","Text":"you could just fill it in from top left to bottom right,"},{"Start":"03:49.445 ","End":"03:52.325","Text":"the 0 here, minus 1 here, 0, 1."},{"Start":"03:52.325 ","End":"03:54.800","Text":"If you do the long computation,"},{"Start":"03:54.800 ","End":"03:57.560","Text":"then this is how you would do it,"},{"Start":"03:57.560 ","End":"04:00.735","Text":"0 times this, minus 1 times this, so on."},{"Start":"04:00.735 ","End":"04:05.230","Text":"These are for eigenvectors of the space."},{"Start":"04:05.230 ","End":"04:08.065","Text":"The matrices are also vectors."},{"Start":"04:08.065 ","End":"04:10.255","Text":"Is it diagonalizable?"},{"Start":"04:10.255 ","End":"04:16.480","Text":"Yes, because there are 4 linearly independent eigenvectors and 4 is the full amount,"},{"Start":"04:16.480 ","End":"04:19.375","Text":"the maximum, because we are in a 4 dimensional space."},{"Start":"04:19.375 ","End":"04:22.735","Text":"If we have that many linearly independent eigenvectors,"},{"Start":"04:22.735 ","End":"04:29.155","Text":"then it\u0027s diagonalizable by that proposition that we just showed."},{"Start":"04:29.155 ","End":"04:32.170","Text":"Again, I\u0027ll show you the use of this proposition."},{"Start":"04:32.170 ","End":"04:37.825","Text":"In another example, we\u0027ll 
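A sketch of this diagonalizability argument. Only the eigenvector matrix [[0,-1],[0,1]] is spelled out in the clip; the other three below are one possible choice of representatives consistent with eigenvalues 0 and 2, so treat them as an assumption.

```python
import numpy as np

M = np.array([[1, 1],
              [1, 1]])

# Eigenvector matrices for T(X) = M @ X.  Only [[0,-1],[0,1]] is spelled out in
# the clip; the other three are one possible choice of representatives.
eigvecs = [
    (0, np.array([[ 0, -1], [0, 1]])),
    (0, np.array([[-1,  0], [1, 0]])),
    (2, np.array([[ 1,  0], [1, 0]])),
    (2, np.array([[ 0,  1], [0, 1]])),
]
for lam, X in eigvecs:
    assert np.array_equal(M @ X, lam * X)        # each really is an eigenvector

# Their coordinate vectors are linearly independent (rank 4), so they form a
# basis of the 4-dimensional space and T is diagonalizable.
coords = np.column_stack([X.flatten() for _, X in eigvecs])
print(np.linalg.matrix_rank(coords))             # 4
```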
take the space to actually have 3 dimensions."},{"Start":"04:37.825 ","End":"04:43.809","Text":"P_2 of R, the polynomials of degree less than or equal to 2 over the reals."},{"Start":"04:43.809 ","End":"04:45.490","Text":"We\u0027ve seen this before."},{"Start":"04:45.490 ","End":"04:48.250","Text":"T sends a polynomial p of x to"},{"Start":"04:48.250 ","End":"04:52.235","Text":"a polynomial that you get when you substitute x plus 1 instead of x."},{"Start":"04:52.235 ","End":"04:55.960","Text":"The standard basis, we know is 1 x x"},{"Start":"04:55.960 ","End":"05:00.100","Text":"squared T of each of the basis elements is as follows."},{"Start":"05:00.100 ","End":"05:03.610","Text":"T of x squared you would get by plugging in x plus 1"},{"Start":"05:03.610 ","End":"05:08.390","Text":"instead of x and getting x plus 1 squared is x squared plus 2x plus 1."},{"Start":"05:08.550 ","End":"05:13.000","Text":"The matrix, we take the coefficients from these,"},{"Start":"05:13.000 ","End":"05:15.160","Text":"but in columns like this 1 is, 1,"},{"Start":"05:15.160 ","End":"05:18.220","Text":"2, 1 so that\u0027s the 1, 2,1."},{"Start":"05:18.220 ","End":"05:20.800","Text":"Here we have 1,"},{"Start":"05:20.800 ","End":"05:23.110","Text":"1 and then 0 x squared,"},{"Start":"05:23.110 ","End":"05:25.190","Text":"so 1, 1, 0 and so on."},{"Start":"05:25.190 ","End":"05:32.570","Text":"In another exercise, we showed that this has only 1 eigenvector, which is 1, 0,"},{"Start":"05:32.570 ","End":"05:37.580","Text":"0, and this corresponds to 1 plus 0 x plus 0 x squared,"},{"Start":"05:37.580 ","End":"05:40.905","Text":"which gives us the polynomial 1."},{"Start":"05:40.905 ","End":"05:43.490","Text":"It is really an eigenvector."},{"Start":"05:43.490 ","End":"05:46.265","Text":"You can check, we\u0027ve seen this before also."},{"Start":"05:46.265 ","End":"05:53.870","Text":"Now we only have 1 eigenvector and for the proposition we need the full amount of 3."},{"Start":"05:53.870 ","End":"05:55.790","Text":"Are we considerably short?"},{"Start":"05:55.790 ","End":"05:58.835","Text":"Even if we had 2 of them wouldn\u0027t be enough. We need 3."},{"Start":"05:58.835 ","End":"06:02.195","Text":"No, it\u0027s not diagonalizable."},{"Start":"06:02.195 ","End":"06:07.200","Text":"That\u0027s the end of this example and the end of the clip."}],"ID":25791},{"Watched":false,"Name":"Exercise 1","Duration":"7m 44s","ChapterTopicVideoID":24879,"CourseChapterTopicPlaylistID":118361,"HasSubtitles":true,"ThumbnailPath":null,"UploadDate":null,"DurationForVideoObject":null,"Description":null,"MetaTitle":null,"MetaDescription":null,"Canonical":null,"VideoComments":[],"Subtitles":[{"Start":"00:00.000 ","End":"00:04.320","Text":"In this exercise, P belongs to this,"},{"Start":"00:04.320 ","End":"00:09.420","Text":"which is the set of 2 by 2 matrices over the reals,"},{"Start":"00:09.420 ","End":"00:12.060","Text":"and it\u0027s a vector space over the reals."},{"Start":"00:12.060 ","End":"00:14.685","Text":"A matrix can be seen as a vector."},{"Start":"00:14.685 ","End":"00:17.895","Text":"Now we can define a linear transformation,"},{"Start":"00:17.895 ","End":"00:20.285","Text":"T or T_P,"},{"Start":"00:20.285 ","End":"00:24.965","Text":"because it depends on P from this space to itself,"},{"Start":"00:24.965 ","End":"00:29.085","Text":"as follows, T_P of X is PX."},{"Start":"00:29.085 ","End":"00:34.410","Text":"What we do is left multiply by matrix P. 
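For the polynomial example, sympy reaches the same verdict directly from the matrix given in the clip:

```python
import sympy as sp

# Matrix of T(p(x)) = p(x+1) on P2(R) in the basis 1, x, x^2 (from the clip).
A = sp.Matrix([[1, 1, 1],
               [0, 1, 2],
               [0, 0, 1]])
print(A.is_diagonalizable())   # False: only one independent eigenvector, 3 needed
```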
Remember we have a member of this,"},{"Start":"00:34.410 ","End":"00:39.885","Text":"we left multiply it by P and we get an outcome which is another 2 by 2 matrix."},{"Start":"00:39.885 ","End":"00:42.225","Text":"There are 2 questions,"},{"Start":"00:42.225 ","End":"00:47.825","Text":"but I added another 1 because many of these exercises don\u0027t bother to say,"},{"Start":"00:47.825 ","End":"00:49.610","Text":"to check that it\u0027s linear."},{"Start":"00:49.610 ","End":"00:52.900","Text":"They say, define a linear transformation first,"},{"Start":"00:52.900 ","End":"00:54.695","Text":"but really it should be checked."},{"Start":"00:54.695 ","End":"00:57.950","Text":"Now I\u0027m not going to do it in all the exercises but I\u0027m going to do it here,"},{"Start":"00:57.950 ","End":"01:00.350","Text":"and usually you should at least mentally try to see"},{"Start":"01:00.350 ","End":"01:03.580","Text":"why this really is a linear transformation."},{"Start":"01:03.580 ","End":"01:08.325","Text":"There are 2 properties for linearity;"},{"Start":"01:08.325 ","End":"01:14.590","Text":"it has to preserve addition and multiplication with a scalar or a constant."},{"Start":"01:14.590 ","End":"01:21.490","Text":"Let\u0027s see, T_P of the sum X plus Y is P times X plus Y,"},{"Start":"01:21.490 ","End":"01:28.920","Text":"by the definition that T_P is just multiplication by P. By the distributive rule,"},{"Start":"01:28.920 ","End":"01:32.579","Text":"this is PX plus PY, and by definition,"},{"Start":"01:32.579 ","End":"01:38.250","Text":"PX is T_P of X and PY is T_P of Y."},{"Start":"01:38.250 ","End":"01:40.530","Text":"We really do get additivity,"},{"Start":"01:40.530 ","End":"01:43.510","Text":"T_P of the sum is the sum of the T_Ps,"},{"Start":"01:43.510 ","End":"01:48.320","Text":"multiplication by a scalar that it comes out in front, let\u0027s see."},{"Start":"01:48.320 ","End":"01:52.520","Text":"T_P of aX is P times aX,"},{"Start":"01:52.520 ","End":"01:54.655","Text":"the scalar comes in front,"},{"Start":"01:54.655 ","End":"01:57.120","Text":"and this is just a times PX,"},{"Start":"01:57.120 ","End":"01:58.560","Text":"which is T_P of X."},{"Start":"01:58.560 ","End":"02:01.730","Text":"It\u0027s as if we just took the a and put it in front."},{"Start":"02:01.730 ","End":"02:03.845","Text":"Now that we know that it\u0027s linear,"},{"Start":"02:03.845 ","End":"02:06.560","Text":"let\u0027s get onto the first question."},{"Start":"02:06.560 ","End":"02:12.390","Text":"Find W, where W is the subset"},{"Start":"02:12.390 ","End":"02:18.755","Text":"of the space consisting of all those P for which this particular A,"},{"Start":"02:18.755 ","End":"02:23.120","Text":"1, 1, 0, 1 is an eigenvector of T_P,"},{"Start":"02:23.120 ","End":"02:27.560","Text":"and we have to find W. If P is any 2 by 2 matrix,"},{"Start":"02:27.560 ","End":"02:31.220","Text":"we can give it letters or values a,"},{"Start":"02:31.220 ","End":"02:33.685","Text":"b, c, d for real numbers."},{"Start":"02:33.685 ","End":"02:35.230","Text":"Suppose this P,"},{"Start":"02:35.230 ","End":"02:37.390","Text":"which is like so is in W,"},{"Start":"02:37.390 ","End":"02:42.230","Text":"let\u0027s see if we can characterize or find what\u0027s the restriction or conditions on a, b, c,"},{"Start":"02:42.230 ","End":"02:46.105","Text":"d. 
By definition of W,"},{"Start":"02:46.105 ","End":"02:49.395","Text":"A is an eigenvector of T_P,"},{"Start":"02:49.395 ","End":"02:55.575","Text":"and that means that T_P of A is some Lambda times A."},{"Start":"02:55.575 ","End":"03:01.740","Text":"This is equal to PA. We have PA equals Lambda A,"},{"Start":"03:01.740 ","End":"03:08.680","Text":"and that gives us matrix P times A is scalar Lambda times A."},{"Start":"03:09.340 ","End":"03:13.085","Text":"On the left-hand side, if we multiply out,"},{"Start":"03:13.085 ","End":"03:14.420","Text":"we get this,"},{"Start":"03:14.420 ","End":"03:15.800","Text":"so I leave you to check that,"},{"Start":"03:15.800 ","End":"03:17.120","Text":"and on the right we get Lambda,"},{"Start":"03:17.120 ","End":"03:19.255","Text":"Lambda, 0, Lambda."},{"Start":"03:19.255 ","End":"03:23.630","Text":"These 2 are equal, they\u0027re equal in every position,"},{"Start":"03:23.630 ","End":"03:27.225","Text":"so what we get is that a is equal to Lambda,"},{"Start":"03:27.225 ","End":"03:30.300","Text":"a plus b is Lambda, c is 0,"},{"Start":"03:30.300 ","End":"03:35.100","Text":"c plus d is Lambda, 4 equations."},{"Start":"03:35.100 ","End":"03:39.620","Text":"We get the following conditions that c is 0,"},{"Start":"03:39.620 ","End":"03:42.500","Text":"d is Lambda, a is Lambda, and b is 0."},{"Start":"03:42.500 ","End":"03:46.550","Text":"Lambda is free to be whatever it wants and we have now a,"},{"Start":"03:46.550 ","End":"03:48.905","Text":"b, c, and d in terms of Lambda,"},{"Start":"03:48.905 ","End":"03:50.300","Text":"which means that P,"},{"Start":"03:50.300 ","End":"03:51.310","Text":"which is a, b,"},{"Start":"03:51.310 ","End":"03:53.020","Text":"c, d as a matrix,"},{"Start":"03:53.020 ","End":"03:55.680","Text":"is a is Lambda, b is 0,"},{"Start":"03:55.680 ","End":"03:57.540","Text":"c is 0, d is Lambda,"},{"Start":"03:57.540 ","End":"03:59.670","Text":"Lambda, 0, 0, Lambda."},{"Start":"03:59.670 ","End":"04:05.450","Text":"That means that P belongs to the set of all Lambda,"},{"Start":"04:05.450 ","End":"04:07.970","Text":"0, 0, Lambda where Lambda is real."},{"Start":"04:07.970 ","End":"04:09.590","Text":"Just review what we did."},{"Start":"04:09.590 ","End":"04:13.300","Text":"We said that if P is in W,"},{"Start":"04:13.300 ","End":"04:16.820","Text":"then P is in this set."},{"Start":"04:16.820 ","End":"04:24.125","Text":"Now I\u0027m going to show the reverse so that this really is W. Everything in W is in here."},{"Start":"04:24.125 ","End":"04:29.175","Text":"Meanwhile, we\u0027ve only shown that W is contained in this set,"},{"Start":"04:29.175 ","End":"04:32.930","Text":"now we\u0027ll show the reverse containment that anything in"},{"Start":"04:32.930 ","End":"04:37.495","Text":"here is also in W. 
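The entry-by-entry equations solved above can be handed to sympy; a short sketch (a, b, c, d and lam are just the symbols used in the clip):

```python
import sympy as sp

a, b, c, d, lam = sp.symbols('a b c d lam')
P = sp.Matrix([[a, b], [c, d]])
A = sp.Matrix([[1, 1], [0, 1]])

# Entry-by-entry equations of P*A = lam*A
eqs = [sp.Eq(lhs, rhs) for lhs, rhs in zip(P * A, lam * A)]
print(sp.solve(eqs, [a, b, c, d]))   # {a: lam, b: 0, c: 0, d: lam}  ->  P = lam*I
```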
If P is in this set,"},{"Start":"04:37.495 ","End":"04:40.850","Text":"that means that P is some Lambda, 0, 0,"},{"Start":"04:40.850 ","End":"04:44.365","Text":"Lambda for some Lambda which is real,"},{"Start":"04:44.365 ","End":"04:49.475","Text":"and this happens to be Lambda times the identity matrix 1, 0, 0 1."},{"Start":"04:49.475 ","End":"04:54.900","Text":"PA is, since P is Lambda I is Lambda IA,"},{"Start":"04:54.900 ","End":"04:56.640","Text":"which is Lambda IA,"},{"Start":"04:56.640 ","End":"04:58.080","Text":"which is Lambda A."},{"Start":"04:58.080 ","End":"05:00.840","Text":"If PA is Lambda A,"},{"Start":"05:00.840 ","End":"05:04.620","Text":"this thing is T_P of A,"},{"Start":"05:04.620 ","End":"05:06.705","Text":"it\u0027s equal to Lambda A,"},{"Start":"05:06.705 ","End":"05:11.320","Text":"and that means that A is an eigenvector of T_P,"},{"Start":"05:11.320 ","End":"05:13.685","Text":"and specifically with eigenvalue Lambda,"},{"Start":"05:13.685 ","End":"05:15.760","Text":"but it\u0027s an eigenvector."},{"Start":"05:15.760 ","End":"05:23.450","Text":"That means that P belongs to W. That\u0027s the defining property of W,"},{"Start":"05:23.450 ","End":"05:30.615","Text":"is all those P such that T_P has A as an eigenvector."},{"Start":"05:30.615 ","End":"05:33.510","Text":"We\u0027ve proved both directions,"},{"Start":"05:33.510 ","End":"05:39.990","Text":"and that means that we have equality that W is equal to the set of all Lambda,"},{"Start":"05:39.990 ","End":"05:41.445","Text":"0, 0, Lambda,"},{"Start":"05:41.445 ","End":"05:43.365","Text":"where Lambda is a real number."},{"Start":"05:43.365 ","End":"05:47.460","Text":"If you like, it\u0027s the set of all Lambda times I."},{"Start":"05:47.460 ","End":"05:49.320","Text":"Now that was Part 1,"},{"Start":"05:49.320 ","End":"05:52.380","Text":"we still have Part 2 to prove that W is"},{"Start":"05:52.380 ","End":"05:58.810","Text":"a subspace of this set of 2 by 2 matrices over r,"},{"Start":"05:58.810 ","End":"06:00.980","Text":"and we have to find a basis."},{"Start":"06:00.980 ","End":"06:04.100","Text":"Now, W is the set of"},{"Start":"06:04.100 ","End":"06:09.800","Text":"all multiples of the identity matrix Lambda times the identity or Lambda,"},{"Start":"06:09.800 ","End":"06:11.435","Text":"0, 0, Lambda."},{"Start":"06:11.435 ","End":"06:14.705","Text":"Subspace has to satisfy 3 properties."},{"Start":"06:14.705 ","End":"06:16.925","Text":"You have to show that it\u0027s non-empty,"},{"Start":"06:16.925 ","End":"06:21.740","Text":"that it\u0027s closed under addition and closed under scalar multiplication."},{"Start":"06:21.740 ","End":"06:25.775","Text":"It\u0027s non-empty because it contains the 0 matrix."},{"Start":"06:25.775 ","End":"06:29.630","Text":"All you have to do is choose Lambda equals 0 here,"},{"Start":"06:29.630 ","End":"06:32.185","Text":"and we get the 0 matrix."},{"Start":"06:32.185 ","End":"06:34.230","Text":"Closed under addition,"},{"Start":"06:34.230 ","End":"06:36.235","Text":"suppose we have 2 of these,"},{"Start":"06:36.235 ","End":"06:37.745","Text":"I have to choose a different letter,"},{"Start":"06:37.745 ","End":"06:41.575","Text":"Lambda times this plus Mu times this."},{"Start":"06:41.575 ","End":"06:43.820","Text":"Altogether is Lambda plus Mu,"},{"Start":"06:43.820 ","End":"06:45.470","Text":"which is still another scalar,"},{"Start":"06:45.470 ","End":"06:47.090","Text":"times the identity matrix,"},{"Start":"06:47.090 ","End":"06:51.220","Text":"so that\u0027s also in W. 
Scalar multiplication,"},{"Start":"06:51.220 ","End":"06:56.750","Text":"if Lambda times the identity is in W and you multiply it by another scalar Mu,"},{"Start":"06:56.750 ","End":"07:02.000","Text":"it\u0027s as if we took this and multiply it by a single scalar Mu times Lambda."},{"Start":"07:02.000 ","End":"07:03.650","Text":"It satisfies those,"},{"Start":"07:03.650 ","End":"07:05.090","Text":"so it\u0027s a subspace."},{"Start":"07:05.090 ","End":"07:07.145","Text":"Now, how do we find the basis?"},{"Start":"07:07.145 ","End":"07:09.340","Text":"The basis should be fairly obvious."},{"Start":"07:09.340 ","End":"07:13.550","Text":"Since everything is Lambda times the identity matrix,"},{"Start":"07:13.550 ","End":"07:17.945","Text":"we can just take the identity matrix seen as a vector,"},{"Start":"07:17.945 ","End":"07:25.145","Text":"as a basis for W. Basis means it spans and it\u0027s linearly independent."},{"Start":"07:25.145 ","End":"07:33.665","Text":"Well, it spans because the span of a single vector is all multiples Lambda times it,"},{"Start":"07:33.665 ","End":"07:35.525","Text":"which is what we have,"},{"Start":"07:35.525 ","End":"07:42.035","Text":"and linearly independent, because 1 non-zero vector is always linearly independent."},{"Start":"07:42.035 ","End":"07:45.030","Text":"That\u0027s it, we\u0027re done."}],"ID":25792},{"Watched":false,"Name":"Exercise 2","Duration":"1m 52s","ChapterTopicVideoID":24880,"CourseChapterTopicPlaylistID":118361,"HasSubtitles":true,"ThumbnailPath":null,"UploadDate":null,"DurationForVideoObject":null,"Description":null,"MetaTitle":null,"MetaDescription":null,"Canonical":null,"VideoComments":[],"Subtitles":[{"Start":"00:00.000 ","End":"00:04.245","Text":"In this exercise, we have a linear transformation T,"},{"Start":"00:04.245 ","End":"00:11.670","Text":"and it goes from the vector space of 10 by 10 real matrices to itself."},{"Start":"00:11.670 ","End":"00:15.900","Text":"It\u0027s given by T of X is P times X,"},{"Start":"00:15.900 ","End":"00:20.910","Text":"where P is a particular 10 by 10 real matrix."},{"Start":"00:20.910 ","End":"00:24.990","Text":"Now, suppose that A in this same space is"},{"Start":"00:24.990 ","End":"00:30.690","Text":"invertible and it\u0027s an eigenvector of T with eigenvalue 4."},{"Start":"00:30.690 ","End":"00:35.870","Text":"We have to compute the determinant of P. 
Let\u0027s start."},{"Start":"00:35.870 ","End":"00:42.110","Text":"What does it mean that A is an eigenvector of T with eigenvalue 4?"},{"Start":"00:42.110 ","End":"00:45.280","Text":"It means that when you apply T to A,"},{"Start":"00:45.280 ","End":"00:47.505","Text":"you get 4 times A."},{"Start":"00:47.505 ","End":"00:49.740","Text":"We know what T of A is,"},{"Start":"00:49.740 ","End":"00:52.275","Text":"because in general, T of X as PX,"},{"Start":"00:52.275 ","End":"00:54.479","Text":"so T of A is PA,"},{"Start":"00:54.479 ","End":"00:57.120","Text":"so P times A is 4A."},{"Start":"00:57.120 ","End":"01:00.290","Text":"Now take the determinant of both sides."},{"Start":"01:00.290 ","End":"01:02.570","Text":"The determinant is multiplicative,"},{"Start":"01:02.570 ","End":"01:05.240","Text":"so determinant of P times the determinant of A,"},{"Start":"01:05.240 ","End":"01:09.770","Text":"and another property of determinants of N by N matrices,"},{"Start":"01:09.770 ","End":"01:11.270","Text":"when you take a scalar out,"},{"Start":"01:11.270 ","End":"01:13.160","Text":"it comes to the power of N,"},{"Start":"01:13.160 ","End":"01:14.715","Text":"which in this case is 10."},{"Start":"01:14.715 ","End":"01:16.625","Text":"This is what we have now,"},{"Start":"01:16.625 ","End":"01:22.895","Text":"and we can then cancel the determinant of A from both sides."},{"Start":"01:22.895 ","End":"01:29.350","Text":"This is not 0 because we\u0027re given that this is invertible."},{"Start":"01:29.350 ","End":"01:33.260","Text":"Note that, because otherwise you wouldn\u0027t be able to divide by it,"},{"Start":"01:33.260 ","End":"01:38.675","Text":"and that gives us that the determinant of P is 4 to the 10th."},{"Start":"01:38.675 ","End":"01:41.600","Text":"If you want a numerical answer,"},{"Start":"01:41.600 ","End":"01:43.770","Text":"this is what it is,"},{"Start":"01:43.770 ","End":"01:45.610","Text":"4 to the 10th."},{"Start":"01:45.610 ","End":"01:49.445","Text":"This eigenvalue to the power of the 10 from here,"},{"Start":"01:49.445 ","End":"01:52.560","Text":"so it comes out, and we\u0027re done."}],"ID":25793},{"Watched":false,"Name":"Exercise 3","Duration":"46s","ChapterTopicVideoID":24881,"CourseChapterTopicPlaylistID":118361,"HasSubtitles":true,"ThumbnailPath":null,"UploadDate":null,"DurationForVideoObject":null,"Description":null,"MetaTitle":null,"MetaDescription":null,"Canonical":null,"VideoComments":[],"Subtitles":[{"Start":"00:00.000 ","End":"00:07.080","Text":"In this exercise, T is a linear transformation from this space to itself. 
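A quick numerical sanity check of the determinant fact used here, det(kA) = k^n * det(A) for n = 10, together with the final value:

```python
import numpy as np

# det(P) * det(A) = det(4A) = 4**10 * det(A), and det(A) != 0, so det(P) = 4**10.
print(4 ** 10)                       # 1048576

# Numerical illustration of det(kA) = k**n * det(A) for n = 10:
A = np.random.rand(10, 10)           # generically invertible
lhs = np.linalg.det(4 * A)
rhs = 4 ** 10 * np.linalg.det(A)
print(np.isclose(lhs, rhs))          # True
```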
What is this?"},{"Start":"00:07.080 ","End":"00:11.175","Text":"This is the 2 by 3 real matrices."},{"Start":"00:11.175 ","End":"00:13.530","Text":"We want to find all those T,"},{"Start":"00:13.530 ","End":"00:20.415","Text":"such that this particular matrix is an eigenvector corresponding to eigenvalue 4."},{"Start":"00:20.415 ","End":"00:23.010","Text":"Notice it says find a linear transformation,"},{"Start":"00:23.010 ","End":"00:24.825","Text":"so there could be more than 1 answer."},{"Start":"00:24.825 ","End":"00:29.340","Text":"We want T of this to equal 4 times this."},{"Start":"00:29.340 ","End":"00:33.585","Text":"Now the most obvious choice of T is to take,"},{"Start":"00:33.585 ","End":"00:37.070","Text":"for any x, T of x is 4 times x."},{"Start":"00:37.070 ","End":"00:43.370","Text":"That would be a linear transformation and it would certainly satisfy this."},{"Start":"00:43.370 ","End":"00:46.650","Text":"This is the answer and we\u0027re done."}],"ID":25794},{"Watched":false,"Name":"Exercise 4 - preparation","Duration":"7m 3s","ChapterTopicVideoID":24882,"CourseChapterTopicPlaylistID":118361,"HasSubtitles":true,"ThumbnailPath":null,"UploadDate":null,"DurationForVideoObject":null,"Description":null,"MetaTitle":null,"MetaDescription":null,"Canonical":null,"VideoComments":[],"Subtitles":[{"Start":"00:00.000 ","End":"00:03.150","Text":"In this exercise, we have this matrix, a,"},{"Start":"00:03.150 ","End":"00:06.255","Text":"which is 3 by 3 over the reals"},{"Start":"00:06.255 ","End":"00:12.270","Text":"and we\u0027re asked to find the eigenvalues and eigenvectors of a."},{"Start":"00:12.270 ","End":"00:18.450","Text":"Start by finding the characteristic matrix xi minus a,"},{"Start":"00:18.450 ","End":"00:25.070","Text":"which some books do the other way around as a minus xi but makes no difference really."},{"Start":"00:25.070 ","End":"00:28.730","Text":"What we get is this is xi, this is minus a."},{"Start":"00:28.730 ","End":"00:31.325","Text":"Do the subtraction, we get this."},{"Start":"00:31.325 ","End":"00:33.530","Text":"That\u0027s the characteristic matrix."},{"Start":"00:33.530 ","End":"00:36.170","Text":"Next we need the characteristic polynomial,"},{"Start":"00:36.170 ","End":"00:39.605","Text":"which is the determinant of the characteristic matrix."},{"Start":"00:39.605 ","End":"00:42.695","Text":"It\u0027s just what we have here but with bars."},{"Start":"00:42.695 ","End":"00:47.460","Text":"We\u0027re going to expand by the 1st row."},{"Start":"00:47.460 ","End":"00:51.710","Text":"What we get using the method of minors and co-factors,"},{"Start":"00:51.710 ","End":"00:56.389","Text":"you have x minus 4 times the determinant"},{"Start":"00:56.389 ","End":"01:00.740","Text":"of what\u0027s left and remember that the checkerboard pattern of plus,"},{"Start":"01:00.740 ","End":"01:05.470","Text":"minus, plus, so the middle term gets this minus here."},{"Start":"01:05.470 ","End":"01:08.890","Text":"1 times, cross this and this."},{"Start":"01:08.890 ","End":"01:11.394","Text":"You\u0027ve seen this all before."},{"Start":"01:11.394 ","End":"01:13.134","Text":"See what we get."},{"Start":"01:13.134 ","End":"01:16.225","Text":"I\u0027m going to leave you to follow this is just straightforward algebra."},{"Start":"01:16.225 ","End":"01:19.505","Text":"We get this polynomial of Degree 3."},{"Start":"01:19.505 ","End":"01:22.060","Text":"That\u0027s the characteristic polynomial."},{"Start":"01:22.060 ","End":"01:26.650","Text":"What we want next are the 
eigenvalues which we get by"},{"Start":"01:26.650 ","End":"01:32.440","Text":"setting the characteristic polynomial to 0 and finding the solutions."},{"Start":"01:32.440 ","End":"01:38.260","Text":"This happens to have all whole numbers and there"},{"Start":"01:38.260 ","End":"01:45.205","Text":"is a way of looking for whole numbers solutions when the leading coefficient is 1."},{"Start":"01:45.205 ","End":"01:47.935","Text":"Doesn\u0027t always work but it will in this case."},{"Start":"01:47.935 ","End":"01:50.120","Text":"We take the free term,"},{"Start":"01:50.120 ","End":"01:51.990","Text":"the constant, 18,"},{"Start":"01:51.990 ","End":"01:54.085","Text":"doesn\u0027t matter the plus or minus,"},{"Start":"01:54.085 ","End":"01:59.930","Text":"and then we take all the factors of the numbers, whole numbers,"},{"Start":"01:59.930 ","End":"02:06.050","Text":"that divide into 18 and we get quite a lot of them plus or minus 1,"},{"Start":"02:06.050 ","End":"02:09.110","Text":"2, 3, 6, 9, and 18."},{"Start":"02:09.110 ","End":"02:17.870","Text":"Then you just try them successively to plug into here and see which ones actually work."},{"Start":"02:17.870 ","End":"02:22.250","Text":"Turns out that 2 and 3 work."},{"Start":"02:22.250 ","End":"02:25.250","Text":"I\u0027m not going to show you all the failed attempts but you"},{"Start":"02:25.250 ","End":"02:29.855","Text":"could plug in 2 and 3 and see that these really are routes."},{"Start":"02:29.855 ","End":"02:33.770","Text":"You might say, \"It\u0027s a cubic and we only have 2 roots.\""},{"Start":"02:33.770 ","End":"02:40.320","Text":"It usually means that one of these is a double root."},{"Start":"02:40.320 ","End":"02:42.590","Text":"It could be that there was a third root,"},{"Start":"02:42.590 ","End":"02:43.640","Text":"that\u0027s not a whole number,"},{"Start":"02:43.640 ","End":"02:45.920","Text":"but for various reasons that can\u0027t be."},{"Start":"02:45.920 ","End":"02:50.585","Text":"How do we tell which of these is a double root or multiple root?"},{"Start":"02:50.585 ","End":"02:56.690","Text":"We just differentiate the polynomial and see which one of these,"},{"Start":"02:56.690 ","End":"03:00.190","Text":"if any, make the derivative 0."},{"Start":"03:00.190 ","End":"03:03.950","Text":"It turns out if you plug in x equals 3 here,"},{"Start":"03:03.950 ","End":"03:06.830","Text":"then it gives you 0."},{"Start":"03:06.830 ","End":"03:11.690","Text":"3 times 9 minus 16 times 3 plus"},{"Start":"03:11.690 ","End":"03:18.735","Text":"21 so we get 27 plus 21 minus 48, anyway, is 0."},{"Start":"03:18.735 ","End":"03:21.965","Text":"That means that 3 is a double root."},{"Start":"03:21.965 ","End":"03:24.480","Text":"Next, the eigenvectors."},{"Start":"03:24.480 ","End":"03:27.080","Text":"We take each eigenvalue separately."},{"Start":"03:27.080 ","End":"03:29.915","Text":"First of all, we\u0027ll try x equals 2,"},{"Start":"03:29.915 ","End":"03:33.260","Text":"which has an algebraic multiplicity of 1."},{"Start":"03:33.260 ","End":"03:38.870","Text":"The other one has an algebraic multiplicity of 2 but let\u0027s start with this one."},{"Start":"03:38.870 ","End":"03:44.870","Text":"We take the characteristic polynomial and substitute x equals"},{"Start":"03:44.870 ","End":"03:51.000","Text":"2 in it and what we get is this matrix."},{"Start":"03:51.000 ","End":"03:53.660","Text":"Then this is like a system of equations."},{"Start":"03:53.660 ","End":"03:59.240","Text":"We want the nullspace or the solution space for this 
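A sketch of the whole-number root search. The characteristic polynomial itself is not printed in the transcript; from the stated roots (2, and a double root at 3) it should be (x-2)(x-3)^2 = x^3 - 8x^2 + 21x - 18, which also matches the derivative 3x^2 - 16x + 21 quoted in the clip, but treat the reconstruction as an assumption.

```python
# Reconstructed characteristic polynomial (an assumption, see the note above).
def p(x):
    return x**3 - 8*x**2 + 21*x - 18

def dp(x):
    return 3*x**2 - 16*x + 21          # its derivative, as quoted in the clip

# Candidate whole-number roots: divisors of the constant term 18, both signs.
candidates = [d * s for d in (1, 2, 3, 6, 9, 18) for s in (1, -1)]
roots = [c for c in candidates if p(c) == 0]
print(roots)                              # [2, 3]
print([r for r in roots if dp(r) == 0])   # [3]  ->  3 is the double root
```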
matrix."},{"Start":"03:59.240 ","End":"04:04.730","Text":"It\u0027s like a system of linear equations in 3 unknowns."},{"Start":"04:04.730 ","End":"04:06.380","Text":"Instead of solving this,"},{"Start":"04:06.380 ","End":"04:09.860","Text":"we do some raw operations on this to simplify it."},{"Start":"04:09.860 ","End":"04:17.625","Text":"If we subtract twice the 2nd row from the 1st row"},{"Start":"04:17.625 ","End":"04:21.470","Text":"and put that in the 2nd row and also take"},{"Start":"04:21.470 ","End":"04:26.270","Text":"this row minus twice the 3rd row and put it in the 3rd row,"},{"Start":"04:26.270 ","End":"04:28.580","Text":"what we\u0027ll get is this."},{"Start":"04:28.580 ","End":"04:33.680","Text":"We were trying to make 0 the entries below the minus 2."},{"Start":"04:33.680 ","End":"04:37.955","Text":"Next thing we can do is add these 2,"},{"Start":"04:37.955 ","End":"04:40.480","Text":"we want to get another 0 here."},{"Start":"04:40.480 ","End":"04:47.615","Text":"This is already in echelon form and the equation corresponding is this."},{"Start":"04:47.615 ","End":"04:53.210","Text":"The leading ones are the bound ones that are dependent."},{"Start":"04:53.210 ","End":"04:55.465","Text":"The free one is z."},{"Start":"04:55.465 ","End":"04:57.480","Text":"If z is 1,"},{"Start":"04:57.480 ","End":"04:59.670","Text":"usually we substitute 1,"},{"Start":"04:59.670 ","End":"05:06.390","Text":"then we get that y is 1 from here and then d is 1 and y is 1."},{"Start":"05:06.390 ","End":"05:09.270","Text":"From here we get that x is 1."},{"Start":"05:09.270 ","End":"05:16.130","Text":"What we get is an eigenvector corresponding to the Eigenvalue 2,"},{"Start":"05:16.130 ","End":"05:17.900","Text":"which is 1, 1, 1."},{"Start":"05:17.900 ","End":"05:22.505","Text":"This is actually a basis eigenvector for the eigenspace."},{"Start":"05:22.505 ","End":"05:27.035","Text":"But we\u0027ll just say this is the eigenvector for the Eigenvalue 2 even though,"},{"Start":"05:27.035 ","End":"05:30.595","Text":"really, a whole subspace that\u0027s spanned by this."},{"Start":"05:30.595 ","End":"05:33.919","Text":"Next on to the Eigenvalue 3."},{"Start":"05:33.919 ","End":"05:39.020","Text":"Once again, the characteristic matrix plug in x equals 3."},{"Start":"05:39.020 ","End":"05:44.180","Text":"We get this corresponds to this system of equations."},{"Start":"05:44.180 ","End":"05:50.535","Text":"Subtract the 1st row from the 2nd row and also from the 3rd row."},{"Start":"05:50.535 ","End":"05:54.040","Text":"That makes these 2 rows 0."},{"Start":"05:54.040 ","End":"06:01.699","Text":"The system of equations is just 1 equation and y and z are free."},{"Start":"06:01.699 ","End":"06:07.890","Text":"What we do is we alternately set 1 of them to 1 and the rest to 0."},{"Start":"06:07.890 ","End":"06:10.529","Text":"We have y equals 0,"},{"Start":"06:10.529 ","End":"06:12.270","Text":"z equals 1,"},{"Start":"06:12.270 ","End":"06:14.950","Text":"and that gives us x equals 1,"},{"Start":"06:14.950 ","End":"06:16.460","Text":"and then vice versa."},{"Start":"06:16.460 ","End":"06:21.720","Text":"y is 1, z is 0 also gives x equals 1."},{"Start":"06:21.720 ","End":"06:28.365","Text":"We have 2 eigenvectors for x equals 3 or for Eigenvalue 3."},{"Start":"06:28.365 ","End":"06:32.145","Text":"We have 1, 0, 1 and 1, 1, 0."},{"Start":"06:32.145 ","End":"06:37.939","Text":"Together, these span the eigenspace for Eigenvalue 3."},{"Start":"06:37.939 ","End":"06:39.740","Text":"Perhaps I\u0027ll just make a 
note of that,"},{"Start":"06:39.740 ","End":"06:44.280","Text":"that the eigenspaces for Eigenvalue 2,"},{"Start":"06:44.280 ","End":"06:46.020","Text":"what we had above was 1, 1,"},{"Start":"06:46.020 ","End":"06:49.480","Text":"1 and we really want the span of that."}],"ID":25795},{"Watched":false,"Name":"Exercise 4","Duration":"3m 51s","ChapterTopicVideoID":24883,"CourseChapterTopicPlaylistID":118361,"HasSubtitles":true,"ThumbnailPath":null,"UploadDate":null,"DurationForVideoObject":null,"Description":null,"MetaTitle":null,"MetaDescription":null,"Canonical":null,"VideoComments":[],"Subtitles":[{"Start":"00:00.000 ","End":"00:05.715","Text":"In this exercise, we have a transformation from R3 to itself,"},{"Start":"00:05.715 ","End":"00:09.240","Text":"linear transformation described as follows."},{"Start":"00:09.240 ","End":"00:14.175","Text":"We\u0027re using row notation for vectors rather than column."},{"Start":"00:14.175 ","End":"00:18.630","Text":"What we have to do is to find the eigenvalues of t"},{"Start":"00:18.630 ","End":"00:23.460","Text":"for each eigenvalue to find a basis of its eigenspace."},{"Start":"00:23.460 ","End":"00:27.270","Text":"In other words, to find the eigenvectors for this eigenvalue,"},{"Start":"00:27.270 ","End":"00:33.150","Text":"and to decide, finally, if this t is diagonalizable,"},{"Start":"00:33.150 ","End":"00:36.900","Text":"and I\u0027ll just mention, just mentally check that this really is"},{"Start":"00:36.900 ","End":"00:39.770","Text":"a linear transformation because each of"},{"Start":"00:39.770 ","End":"00:44.660","Text":"these 3 components is the linear form in x, y, and z,"},{"Start":"00:44.660 ","End":"00:46.685","Text":"ax plus b, y plus cz."},{"Start":"00:46.685 ","End":"00:51.940","Text":"So let\u0027s start parts a and b, we\u0027ll do together."},{"Start":"00:51.940 ","End":"00:54.200","Text":"We do it like in the tutorial,"},{"Start":"00:54.200 ","End":"00:58.310","Text":"which basically says that we just find the matrix for"},{"Start":"00:58.310 ","End":"01:00.920","Text":"this transformation with the standard basis and"},{"Start":"01:00.920 ","End":"01:04.325","Text":"then work with the matrix rather than a transformation."},{"Start":"01:04.325 ","End":"01:09.065","Text":"We find the matrix that represents T in the standard basis."},{"Start":"01:09.065 ","End":"01:13.625","Text":"The standard basis is 1,0,0, 0,1,0, 0,0,1."},{"Start":"01:13.625 ","End":"01:16.700","Text":"T of 1,0,0 is this."},{"Start":"01:16.700 ","End":"01:21.495","Text":"You just plug in here, we get these 3 vectors,"},{"Start":"01:21.495 ","End":"01:28.610","Text":"and so the matrix of t with respect to this basis is these numbers,"},{"Start":"01:28.610 ","End":"01:31.890","Text":"but in column form."},{"Start":"01:31.890 ","End":"01:33.365","Text":"In other words, this row,"},{"Start":"01:33.365 ","End":"01:36.170","Text":"first row is the first column here and so on."},{"Start":"01:36.170 ","End":"01:37.730","Text":"There\u0027s another way of doing this."},{"Start":"01:37.730 ","End":"01:40.650","Text":"You could also just look at what t is,"},{"Start":"01:40.650 ","End":"01:43.100","Text":"and I\u0027m using the coloring to help you."},{"Start":"01:43.100 ","End":"01:47.345","Text":"We take here the 4 minus 1, minus 1 and put that here,"},{"Start":"01:47.345 ","End":"01:48.440","Text":"the x, y, z."},{"Start":"01:48.440 ","End":"01:54.265","Text":"Similarly, 1, 2, minus 1, so on, both ways to do it."},{"Start":"01:54.265 ","End":"02:00.290","Text":"Step 
2 is to find the eigenvalues and eigenvectors of the matrix."},{"Start":"02:00.290 ","End":"02:02.059","Text":"Instead of the transformation,"},{"Start":"02:02.059 ","End":"02:05.560","Text":"we take the matrix, and it comes out the same thing."},{"Start":"02:05.560 ","End":"02:08.570","Text":"This was done in the previous exercise."},{"Start":"02:08.570 ","End":"02:11.660","Text":"We got the eigenvalues 2 and 3."},{"Start":"02:11.660 ","End":"02:16.865","Text":"By the way, the algebraic multiplicities were 1 and 2 respectively."},{"Start":"02:16.865 ","End":"02:22.250","Text":"For eigenvalue 2, we got the eigenspace spanned by 1, 1,"},{"Start":"02:22.250 ","End":"02:24.980","Text":"1, and for eigenvalue 3,"},{"Start":"02:24.980 ","End":"02:28.865","Text":"we got the space spanned by 1,0,1 and 1,1,0."},{"Start":"02:28.865 ","End":"02:32.210","Text":"Otherwise, these are the eigenvectors or"},{"Start":"02:32.210 ","End":"02:36.080","Text":"the possible choice of eigenvectors for this matrix,"},{"Start":"02:36.080 ","End":"02:38.240","Text":"and what happens is that these are"},{"Start":"02:38.240 ","End":"02:41.540","Text":"the same eigenvectors and eigenvalues."},{"Start":"02:41.540 ","End":"02:43.820","Text":"As for the transformation, the matrix"},{"Start":"02:43.820 ","End":"02:47.795","Text":"and the transformation have the same eigenvalues and eigenvectors."},{"Start":"02:47.795 ","End":"02:52.730","Text":"For part c, we want to know if the transformation is diagonalizable."},{"Start":"02:52.730 ","End":"02:56.075","Text":"The 3 eigenvectors are linearly independent."},{"Start":"02:56.075 ","End":"02:59.810","Text":"You always get linearly independent when you take a basis"},{"Start":"02:59.810 ","End":"03:02.420","Text":"for each of the eigenspaces, and you put them altogether,"},{"Start":"03:02.420 ","End":"03:06.080","Text":"they always are linearly independent and there\u0027s 3 of them, and that\u0027s important."},{"Start":"03:06.080 ","End":"03:08.540","Text":"So we have 3 linearly independent eigenvectors,"},{"Start":"03:08.540 ","End":"03:12.075","Text":"and so they\u0027re a basis of the space;"},{"Start":"03:12.075 ","End":"03:16.065","Text":"and that means that the matrix is diagonalizable."},{"Start":"03:16.065 ","End":"03:18.890","Text":"Because the matrix is diagonalizable,"},{"Start":"03:18.890 ","End":"03:22.655","Text":"so is the transformation that\u0027s by definition."},{"Start":"03:22.655 ","End":"03:28.195","Text":"Transformation is diagonalizable if and only if the matrix according to any basis."},{"Start":"03:28.195 ","End":"03:31.100","Text":"As a matter of fact, just for interest\u0027s sake,"},{"Start":"03:31.100 ","End":"03:36.155","Text":"if you took the representation of t with respect to this basis,"},{"Start":"03:36.155 ","End":"03:40.190","Text":"then you would get exactly the diagonal matrix with 2,"},{"Start":"03:40.190 ","End":"03:42.095","Text":"3, 3 on the diagonal."},{"Start":"03:42.095 ","End":"03:45.500","Text":"Because if you apply t to this, you get twice this,"},{"Start":"03:45.500 ","End":"03:47.915","Text":"and here and here we get 3 times these."},{"Start":"03:47.915 ","End":"03:51.900","Text":"That\u0027s just a by the way, and we are done."}],"ID":25796},{"Watched":false,"Name":"Exercise 5 - preparation","Duration":"6m 
5s","ChapterTopicVideoID":24884,"CourseChapterTopicPlaylistID":118361,"HasSubtitles":true,"ThumbnailPath":null,"UploadDate":null,"DurationForVideoObject":null,"Description":null,"MetaTitle":null,"MetaDescription":null,"Canonical":null,"VideoComments":[],"Subtitles":[{"Start":"00:00.000 ","End":"00:05.985","Text":"In this exercise, we have a 3 by 3 real matrix a as follows,"},{"Start":"00:05.985 ","End":"00:07.875","Text":"and there are 6 parts."},{"Start":"00:07.875 ","End":"00:10.650","Text":"I\u0027ll read each part as I come to it."},{"Start":"00:10.650 ","End":"00:18.795","Text":"Let\u0027s start with part a to find the characteristic matrix of a that\u0027s xI minus a."},{"Start":"00:18.795 ","End":"00:21.540","Text":"Here\u0027s xI, here\u0027s a,"},{"Start":"00:21.540 ","End":"00:24.555","Text":"subtract them, this is the matrix we get."},{"Start":"00:24.555 ","End":"00:28.200","Text":"The next part is the characteristic polynomial,"},{"Start":"00:28.200 ","End":"00:32.145","Text":"and that\u0027s just the determinant of the characteristic matrix."},{"Start":"00:32.145 ","End":"00:39.390","Text":"What we get is the determinant of this and they\u0027re going to expand by the second column."},{"Start":"00:39.390 ","End":"00:41.650","Text":"Notice there\u0027s is a 0 here and a 0 here,"},{"Start":"00:41.650 ","End":"00:47.630","Text":"so we just have this x minus 1 times the determinant of what\u0027s left."},{"Start":"00:47.630 ","End":"00:51.110","Text":"This is a plus in the checkerboard pattern,"},{"Start":"00:51.110 ","End":"00:53.630","Text":"plus, minus, plus, minus plus."},{"Start":"00:53.630 ","End":"00:59.870","Text":"What we get is this x minus 1 and then this times this minus,"},{"Start":"00:59.870 ","End":"01:02.300","Text":"this times this, this is minus,"},{"Start":"01:02.300 ","End":"01:04.940","Text":"minus minus 1, so it\u0027s minus."},{"Start":"01:04.940 ","End":"01:09.665","Text":"What we get is x minus 1 x squared minus 2x,"},{"Start":"01:09.665 ","End":"01:11.470","Text":"if you factorize it,"},{"Start":"01:11.470 ","End":"01:12.785","Text":"take x out of here,"},{"Start":"01:12.785 ","End":"01:15.110","Text":"we get x, x minus 1, x minus 2."},{"Start":"01:15.110 ","End":"01:20.815","Text":"Next, you want to find the eigenvalues and the algebraic multiplicity of each."},{"Start":"01:20.815 ","End":"01:24.815","Text":"The eigenvalues of the solutions of the characteristic equation."},{"Start":"01:24.815 ","End":"01:28.565","Text":"We set the characteristic polynomial to 0."},{"Start":"01:28.565 ","End":"01:30.590","Text":"We get this equals 0,"},{"Start":"01:30.590 ","End":"01:33.010","Text":"so x is 0, 1, or 2."},{"Start":"01:33.010 ","End":"01:40.295","Text":"In general, if we have a as a 0 of order k of the characteristic polynomial,"},{"Start":"01:40.295 ","End":"01:45.980","Text":"then it\u0027s an eigenvalue with algebraic multiplicity k. 
In other words,"},{"Start":"01:45.980 ","End":"01:49.910","Text":"if p of x is x minus a to the k times et cetera,"},{"Start":"01:49.910 ","End":"01:55.565","Text":"then x equals a has algebraic multiplicity k as an eigenvalue."},{"Start":"01:55.565 ","End":"01:58.070","Text":"Now in our case we have x,"},{"Start":"01:58.070 ","End":"02:00.500","Text":"which is x minus Note to the 1,"},{"Start":"02:00.500 ","End":"02:02.090","Text":"x minus 1 to the 1,"},{"Start":"02:02.090 ","End":"02:03.785","Text":"x minus 2 to the 1."},{"Start":"02:03.785 ","End":"02:08.290","Text":"All 3 eigenvalues have algebraic multiplicity 1."},{"Start":"02:08.290 ","End":"02:13.580","Text":"Notation sometimes use a letter Mu for algebraic multiplicity,"},{"Start":"02:13.580 ","End":"02:17.465","Text":"so mu for 0, for 1 and for 2 is 1."},{"Start":"02:17.465 ","End":"02:24.110","Text":"Next we want the eigenspaces and geometric multiplicity of each eigenvalue."},{"Start":"02:24.110 ","End":"02:28.970","Text":"The eigenspace for an eigenvalue Lambda is a solution space or"},{"Start":"02:28.970 ","End":"02:35.330","Text":"the kernel of the characteristic matrix after we substitute Lambda instead of X."},{"Start":"02:35.330 ","End":"02:41.435","Text":"For the case of 0, we take this characteristic matrix,"},{"Start":"02:41.435 ","End":"02:47.420","Text":"plug in x equals 0 and this is what we get,"},{"Start":"02:47.420 ","End":"02:53.750","Text":"which corresponds to the system of linear equations, this 1."},{"Start":"02:53.750 ","End":"02:58.760","Text":"We can simplify this matrix with row operations."},{"Start":"02:58.760 ","End":"03:06.060","Text":"Subtract this row from the first row and we negate the second row just for convenience,"},{"Start":"03:06.060 ","End":"03:08.700","Text":"I\u0027m going to negate the first row also,"},{"Start":"03:08.700 ","End":"03:10.050","Text":"just make it pluses."},{"Start":"03:10.050 ","End":"03:11.600","Text":"We now have just 1,"},{"Start":"03:11.600 ","End":"03:13.955","Text":"1, and 1 here and the rest 0s."},{"Start":"03:13.955 ","End":"03:17.720","Text":"This is the system we have and in this case,"},{"Start":"03:17.720 ","End":"03:22.605","Text":"z is the free variable and x and y are dependent."},{"Start":"03:22.605 ","End":"03:24.920","Text":"We can let z equal 1,"},{"Start":"03:24.920 ","End":"03:26.540","Text":"which is what we usually do,"},{"Start":"03:26.540 ","End":"03:27.800","Text":"could be any non-zero,"},{"Start":"03:27.800 ","End":"03:29.345","Text":"but typically it\u0027s 1,"},{"Start":"03:29.345 ","End":"03:36.680","Text":"and then from this we can get that x is minus 1 and regardless of X and Z, Y is 0,"},{"Start":"03:36.680 ","End":"03:44.720","Text":"which gives us that an eigenvector for eigenvalue 0 is minus 1,"},{"Start":"03:44.720 ","End":"03:47.535","Text":"0, 1,"},{"Start":"03:47.535 ","End":"03:48.990","Text":"and that\u0027s an eigenvector."},{"Start":"03:48.990 ","End":"03:52.730","Text":"The eigenspace is the span of the eigenvector,"},{"Start":"03:52.730 ","End":"03:55.910","Text":"the eigenspace for 0."},{"Start":"03:55.910 ","End":"04:01.180","Text":"The geometric multiplicity is the dimension of the eigenspace,"},{"Start":"04:01.180 ","End":"04:04.845","Text":"sometimes you select a Gamma for geometric."},{"Start":"04:04.845 ","End":"04:06.800","Text":"Gamma 0 is 1."},{"Start":"04:06.800 ","End":"04:10.730","Text":"Next we\u0027ll take the eigenvalue x equals 1 characteristic matrix,"},{"Start":"04:10.730 
","End":"04:12.515","Text":"substitute x equals 1,"},{"Start":"04:12.515 ","End":"04:15.320","Text":"get a system of linear equations."},{"Start":"04:15.320 ","End":"04:21.904","Text":"Y is independent, so let it equal 1 and X is 0 and Z is 0."},{"Start":"04:21.904 ","End":"04:32.165","Text":"So 0, 1, 0 is an eigenvector for eigenvalue 1 and the eigenspace is the span of this."},{"Start":"04:32.165 ","End":"04:36.595","Text":"The geometric multiplicity again is 1."},{"Start":"04:36.595 ","End":"04:39.210","Text":"For eigenvalue 2,"},{"Start":"04:39.210 ","End":"04:43.265","Text":"similarly, let X equals 2, get this matrix."},{"Start":"04:43.265 ","End":"04:45.860","Text":"This is the system of equations."},{"Start":"04:45.860 ","End":"04:49.670","Text":"If we add the first row to the last row,"},{"Start":"04:49.670 ","End":"04:53.215","Text":"then we get 0 in the last row."},{"Start":"04:53.215 ","End":"04:56.805","Text":"This is the system of equations we get."},{"Start":"04:56.805 ","End":"05:01.790","Text":"Here z is the free variable that we can set it to 1."},{"Start":"05:01.790 ","End":"05:06.140","Text":"X is 1, Y is 0, which gives us 1,"},{"Start":"05:06.140 ","End":"05:10.450","Text":"0, 1 as the eigenvector for eigenvalue 2."},{"Start":"05:10.450 ","End":"05:12.410","Text":"This is the eigenspace,"},{"Start":"05:12.410 ","End":"05:14.410","Text":"and it also has dimension 1,"},{"Start":"05:14.410 ","End":"05:18.180","Text":"so the geometric multiplicity is 1."},{"Start":"05:18.180 ","End":"05:24.020","Text":"It turns out that all the geometric multiplicities are 1 for all 3 of the eigenvalues."},{"Start":"05:24.020 ","End":"05:27.815","Text":"Next, we want a set of eigenvectors."},{"Start":"05:27.815 ","End":"05:31.625","Text":"Just pick eigenvectors for each eigenspace,"},{"Start":"05:31.625 ","End":"05:33.575","Text":"take a basis for each,"},{"Start":"05:33.575 ","End":"05:36.380","Text":"and then put them all together."},{"Start":"05:36.380 ","End":"05:39.235","Text":"These are the 3 vectors that we found,"},{"Start":"05:39.235 ","End":"05:42.935","Text":"these are the eigenvectors of a."},{"Start":"05:42.935 ","End":"05:47.450","Text":"Next is to show that the matrix a is diagonalizable."},{"Start":"05:47.450 ","End":"05:50.150","Text":"These eigenvectors come from different eigenvalues."},{"Start":"05:50.150 ","End":"05:54.050","Text":"They are linearly independent and so we have 3,"},{"Start":"05:54.050 ","End":"05:58.580","Text":"which is a maximal number of linearly independent eigenvectors."},{"Start":"05:58.580 ","End":"06:01.250","Text":"That means that a is diagonalizable."},{"Start":"06:01.250 ","End":"06:03.890","Text":"That\u0027s part f and the last part,"},{"Start":"06:03.890 ","End":"06:06.330","Text":"so we are done."}],"ID":25797},{"Watched":false,"Name":"Exercise 5 contiued","Duration":"5m 10s","ChapterTopicVideoID":24885,"CourseChapterTopicPlaylistID":118361,"HasSubtitles":true,"ThumbnailPath":null,"UploadDate":null,"DurationForVideoObject":null,"Description":null,"MetaTitle":null,"MetaDescription":null,"Canonical":null,"VideoComments":[],"Subtitles":[{"Start":"00:00.000 ","End":"00:06.105","Text":"This is a continuation of the previous clip where we did part A to F,"},{"Start":"00:06.105 ","End":"00:08.550","Text":"and we had a matrix A,"},{"Start":"00:08.550 ","End":"00:12.765","Text":"and Part F was to show that A is diagonalizable."},{"Start":"00:12.765 ","End":"00:18.110","Text":"Now part G is to actually diagonalize A."},{"Start":"00:18.110 
","End":"00:20.860","Text":"Now, what does it mean to diagonalize a matrix?"},{"Start":"00:20.860 ","End":"00:24.000","Text":"It means to find 2 other matrices,"},{"Start":"00:24.000 ","End":"00:25.410","Text":"P and D,"},{"Start":"00:25.410 ","End":"00:31.185","Text":"where D is diagonal and P is invertible such that the following holds."},{"Start":"00:31.185 ","End":"00:34.515","Text":"Sometimes written as AP equals PD."},{"Start":"00:34.515 ","End":"00:38.010","Text":"Recall that we found 3 eigenvalues, 0, 1,"},{"Start":"00:38.010 ","End":"00:39.470","Text":"and 2, and for each,"},{"Start":"00:39.470 ","End":"00:42.800","Text":"we found an eigenvector as shown here."},{"Start":"00:42.800 ","End":"00:48.550","Text":"From this, what we do is we construct D and P. Now I\u0027ll show you how I got to this."},{"Start":"00:48.550 ","End":"00:50.480","Text":"You put these in a certain order,"},{"Start":"00:50.480 ","End":"00:53.415","Text":"let\u0027s say this order 0,1, and 2,"},{"Start":"00:53.415 ","End":"01:00.300","Text":"and then you put these eigenvalues along the diagonal of D. That\u0027s the 0,"},{"Start":"01:00.300 ","End":"01:01.985","Text":"the 1, and the 2 here,"},{"Start":"01:01.985 ","End":"01:03.750","Text":"and everything else is 0s,"},{"Start":"01:03.750 ","End":"01:08.030","Text":"that\u0027s how we get D. We get P by"},{"Start":"01:08.030 ","End":"01:13.250","Text":"taking these vectors and treating them as column vectors and putting them here,"},{"Start":"01:13.250 ","End":"01:14.750","Text":"here, and here."},{"Start":"01:14.750 ","End":"01:16.655","Text":"That gives us P,"},{"Start":"01:16.655 ","End":"01:19.640","Text":"just make sure to keep the order,"},{"Start":"01:19.640 ","End":"01:23.325","Text":"not to switch them around once you\u0027ve decided on the order."},{"Start":"01:23.325 ","End":"01:26.945","Text":"Let\u0027s just check that it really works."},{"Start":"01:26.945 ","End":"01:30.975","Text":"That AP equals PD,"},{"Start":"01:30.975 ","End":"01:35.000","Text":"that\u0027s the equivalent to this P and move it to the other side."},{"Start":"01:35.000 ","End":"01:37.850","Text":"That\u0027s the AP is this times this."},{"Start":"01:37.850 ","End":"01:40.950","Text":"Well, I won\u0027t check."}],"ID":25798},{"Watched":false,"Name":"Exercise 5","Duration":"5m 17s","ChapterTopicVideoID":24886,"CourseChapterTopicPlaylistID":118361,"HasSubtitles":true,"ThumbnailPath":null,"UploadDate":null,"DurationForVideoObject":null,"Description":null,"MetaTitle":null,"MetaDescription":null,"Canonical":null,"VideoComments":[],"Subtitles":[{"Start":"00:00.080 ","End":"00:05.670","Text":"In this exercise, T is a linear transformation from"},{"Start":"00:05.670 ","End":"00:09.915","Text":"R^3 to itself, described as follows."},{"Start":"00:09.915 ","End":"00:13.860","Text":"You should mentally check that this really is linear."},{"Start":"00:13.860 ","End":"00:16.200","Text":"We have 3 things to do,"},{"Start":"00:16.200 ","End":"00:19.890","Text":"to find the eigenvalues and eigenvectors of T,"},{"Start":"00:19.890 ","End":"00:28.890","Text":"to show that T is diagonalizable and to compute T^2021,"},{"Start":"00:28.890 ","End":"00:32.755","Text":"or specifically what it does to a vector x, y, z."},{"Start":"00:32.755 ","End":"00:34.955","Text":"We\u0027ll start with part a,"},{"Start":"00:34.955 ","End":"00:37.970","Text":"which is the eigenvalues and eigenvectors."},{"Start":"00:37.970 ","End":"00:44.045","Text":"First of all, we want the matrix for t in the standard 
basis."},{"Start":"00:44.045 ","End":"00:48.170","Text":"If we do that, and we\u0027ve seen this before,"},{"Start":"00:48.170 ","End":"00:51.680","Text":"we get the following matrix, which we could also get"},{"Start":"00:51.680 ","End":"00:56.460","Text":"directly from here by taking this first row as 1,"},{"Start":"00:56.460 ","End":"01:01.050","Text":"0, 1, then 0, 1, 0, and then other 1,0,1."},{"Start":"01:01.050 ","End":"01:02.900","Text":"We do it from here, it\u0027s rows."},{"Start":"01:02.900 ","End":"01:04.475","Text":"If you do it from this,"},{"Start":"01:04.475 ","End":"01:06.530","Text":"you have to put them in column."},{"Start":"01:06.530 ","End":"01:09.800","Text":"Matrix is symmetric, so it doesn\u0027t matter in this case,"},{"Start":"01:09.800 ","End":"01:11.530","Text":"but in general it does."},{"Start":"01:11.530 ","End":"01:16.550","Text":"Then the second step, we use the computations for the matrix,"},{"Start":"01:16.550 ","End":"01:25.630","Text":"let\u0027s call it A which is like I said the representation of T in the standard basis."},{"Start":"01:25.790 ","End":"01:28.205","Text":"I\u0027m just quoting the results."},{"Start":"01:28.205 ","End":"01:31.255","Text":"See the previous clip or 2."},{"Start":"01:31.255 ","End":"01:36.630","Text":"The eigenvalues are 0,1 and 2 and the corresponding eigenvectors,"},{"Start":"01:36.630 ","End":"01:38.430","Text":"well, what\u0027s written here,"},{"Start":"01:38.430 ","End":"01:43.820","Text":"and the answer to part a is that these are the same for T as they"},{"Start":"01:43.820 ","End":"01:49.940","Text":"are for the representation of T. The eigenvalues, of course,"},{"Start":"01:49.940 ","End":"01:53.120","Text":"would be the same in any basis."},{"Start":"01:53.120 ","End":"01:57.485","Text":"Now, we have to show that T is diagonalizable."},{"Start":"01:57.485 ","End":"02:03.290","Text":"We showed in the previous clips that the matrix A,"},{"Start":"02:03.290 ","End":"02:06.815","Text":"this 1, is diagonalizable."},{"Start":"02:06.815 ","End":"02:12.500","Text":"By definition of diagonalizable for a transformation,"},{"Start":"02:12.500 ","End":"02:20.270","Text":"it\u0027s true if and only if the matrix with respect to any basis is diagonalizable."},{"Start":"02:20.270 ","End":"02:22.860","Text":"It is diagonalizable."},{"Start":"02:22.860 ","End":"02:25.610","Text":"Then we\u0027re going to make this computation."},{"Start":"02:25.610 ","End":"02:29.645","Text":"Again, we\u0027re going to use the results of the previous clips."},{"Start":"02:29.645 ","End":"02:38.100","Text":"If we take the matrix T^2021 with respect to the standard basis,"},{"Start":"02:38.100 ","End":"02:44.240","Text":"it\u0027s like for us taking the matrix and then raising it to the power of 2021."},{"Start":"02:44.240 ","End":"02:49.220","Text":"This is because the property that if you multiply transformations"},{"Start":"02:49.220 ","End":"02:53.740","Text":"or rather compose them, it\u0027s equivalent to multiplying matrices."},{"Start":"02:53.740 ","End":"02:56.055","Text":"Now, this we called A."},{"Start":"02:56.055 ","End":"03:02.410","Text":"It\u0027s A^2021 and we have that result in the previous clip, it\u0027s this."},{"Start":"03:02.410 ","End":"03:08.780","Text":"All we have to do is to take the matrix from here and apply it to a vector x,"},{"Start":"03:08.780 ","End":"03:13.340","Text":"y, z, and we get this, and that\u0027s the answer."},{"Start":"03:13.340 ","End":"03:17.060","Text":"But don\u0027t go yet if you want to see 
an alternative solution"},{"Start":"03:17.060 ","End":"03:21.755","Text":"without matrices to see how we could get this result."},{"Start":"03:21.755 ","End":"03:30.800","Text":"You might guess that we can, in general, say that if it was n here and not 2021,"},{"Start":"03:30.800 ","End":"03:32.750","Text":"we could figure out the formula."},{"Start":"03:32.750 ","End":"03:39.525","Text":"You\u0027d probably figure out that T^n of x, y, z involves 2 to the power n minus 1."},{"Start":"03:39.525 ","End":"03:43.730","Text":"The way to get from 2021 to 2020 is to subtract 1,"},{"Start":"03:43.730 ","End":"03:46.110","Text":"that seems to be the rule."},{"Start":"03:46.400 ","End":"03:51.455","Text":"We could guess this and then substitute n equals 2021."},{"Start":"03:51.455 ","End":"03:53.810","Text":"This is provable by induction."},{"Start":"03:53.810 ","End":"03:57.200","Text":"For n equals 1, it just comes out to be the definition"},{"Start":"03:57.200 ","End":"04:00.050","Text":"because it says here x plus z, y,"},{"Start":"04:00.050 ","End":"04:04.475","Text":"x plus z, and 2 to the n minus 1 is 1, so you can throw it out."},{"Start":"04:04.475 ","End":"04:06.760","Text":"The induction step,"},{"Start":"04:06.760 ","End":"04:13.055","Text":"to apply T n plus 1 times is like to apply it n times and then another time."},{"Start":"04:13.055 ","End":"04:20.585","Text":"We have T of this and then by the induction step,"},{"Start":"04:20.585 ","End":"04:26.975","Text":"this is equal to this, just like copying it from here."},{"Start":"04:26.975 ","End":"04:31.570","Text":"Call the first component a, the second b, the third c."},{"Start":"04:31.570 ","End":"04:38.430","Text":"Then we get T of a, b, c is a plus c, b, a plus c."},{"Start":"04:38.430 ","End":"04:41.130","Text":"We can\u0027t reuse x, y, and z, so we use a, b,"},{"Start":"04:41.130 ","End":"04:47.565","Text":"c. 
Now, if you multiply 2 times 2^n minus 1, it\u0027s 2^n."},{"Start":"04:47.565 ","End":"04:49.935","Text":"This becomes this."},{"Start":"04:49.935 ","End":"04:52.430","Text":"It isn\u0027t clear is exactly what we wanted."},{"Start":"04:52.430 ","End":"04:57.575","Text":"It\u0027s what happens if you replace n with n plus 1 here,"},{"Start":"04:57.575 ","End":"04:59.805","Text":"we want n plus 1 minus 1."},{"Start":"04:59.805 ","End":"05:03.610","Text":"It\u0027s just a question of noting that n is n plus 1 minus 1."},{"Start":"05:03.610 ","End":"05:11.255","Text":"Finally, like I said, you just substitute here n equals 2021 and we get this,"},{"Start":"05:11.255 ","End":"05:14.405","Text":"which is the same as what we got before."},{"Start":"05:14.405 ","End":"05:17.780","Text":"Now we\u0027re done with the second type."}],"ID":25799},{"Watched":false,"Name":"Exercise 6","Duration":"9m 42s","ChapterTopicVideoID":24887,"CourseChapterTopicPlaylistID":118361,"HasSubtitles":true,"ThumbnailPath":null,"UploadDate":null,"DurationForVideoObject":null,"Description":null,"MetaTitle":null,"MetaDescription":null,"Canonical":null,"VideoComments":[],"Subtitles":[{"Start":"00:00.000 ","End":"00:03.765","Text":"In this exercise, we have a linear transformation,"},{"Start":"00:03.765 ","End":"00:11.325","Text":"T from the space of 2 by 2 real matrices to itself given as follows,"},{"Start":"00:11.325 ","End":"00:12.780","Text":"and, of course, it\u0027s linear."},{"Start":"00:12.780 ","End":"00:16.905","Text":"I like to make a mental note that it really is linear."},{"Start":"00:16.905 ","End":"00:18.855","Text":"We have 3 things to do."},{"Start":"00:18.855 ","End":"00:24.440","Text":"To find the representation matrix with respect to the standard basis,"},{"Start":"00:24.440 ","End":"00:28.580","Text":"then to find the eigenvalues and eigenvectors of T,"},{"Start":"00:28.580 ","End":"00:32.330","Text":"and to show that T is diagonalizable."},{"Start":"00:32.330 ","End":"00:35.330","Text":"Then to compute T^10."},{"Start":"00:35.330 ","End":"00:40.760","Text":"I want to remind you that these are 2 by 2 matrices,"},{"Start":"00:40.760 ","End":"00:43.820","Text":"but you consider each matrix also as a vector,"},{"Start":"00:43.820 ","End":"00:46.270","Text":"as an element in a vector space."},{"Start":"00:46.270 ","End":"00:49.190","Text":"When we compute a representation matrix,"},{"Start":"00:49.190 ","End":"00:50.690","Text":"it will be a 4 by 4,"},{"Start":"00:50.690 ","End":"00:52.265","Text":"not a 2 by 2."},{"Start":"00:52.265 ","End":"00:53.990","Text":"That\u0027s just a preliminary."},{"Start":"00:53.990 ","End":"00:57.410","Text":"Let\u0027s start with the first one to find the representation matrix."},{"Start":"00:57.410 ","End":"00:59.210","Text":"What is the standard basis?"},{"Start":"00:59.210 ","End":"01:04.250","Text":"Standard basis is 1 in each of the 4 places,"},{"Start":"01:04.250 ","End":"01:08.435","Text":"but we take it in order from top-left down to bottom-right."},{"Start":"01:08.435 ","End":"01:10.759","Text":"That\u0027s the standard basis."},{"Start":"01:10.759 ","End":"01:17.440","Text":"Now let\u0027s compute what the transformation does to each member of the standard basis."},{"Start":"01:17.440 ","End":"01:21.230","Text":"Te1 is by the definition,"},{"Start":"01:21.230 ","End":"01:23.870","Text":"this matrix times e1,"},{"Start":"01:23.870 ","End":"01:25.640","Text":"which comes out to be this."},{"Start":"01:25.640 ","End":"01:29.375","Text":"Now we write this as the 
linear combination of these 4,"},{"Start":"01:29.375 ","End":"01:33.435","Text":"including the 0\u0027s when we don\u0027t need."},{"Start":"01:33.435 ","End":"01:40.110","Text":"This one means that it\u0027s 1 time e_ and this one means 1 time e3,"},{"Start":"01:40.110 ","End":"01:42.825","Text":"and we fill in for e2 and e4."},{"Start":"01:42.825 ","End":"01:46.900","Text":"Similarly, for the other 3."},{"Start":"01:47.210 ","End":"01:52.280","Text":"The reason I\u0027ve put some coloring here is to remind you that we get"},{"Start":"01:52.280 ","End":"01:57.430","Text":"the matrix by taking these entries and putting them in the columns."},{"Start":"01:57.430 ","End":"01:59.370","Text":"This is a symmetric matrix,"},{"Start":"01:59.370 ","End":"02:00.970","Text":"so I wanted to emphasize that."},{"Start":"02:00.970 ","End":"02:04.955","Text":"In this case, you could get it wrong and still get it right if you see what I mean."},{"Start":"02:04.955 ","End":"02:09.290","Text":"Now we want to find the eigenvalues and eigenvectors of T,"},{"Start":"02:09.290 ","End":"02:17.700","Text":"so we want the characteristic matrix of A which is xI minus A, which is this."},{"Start":"02:17.700 ","End":"02:20.645","Text":"Then the characteristic polynomial,"},{"Start":"02:20.645 ","End":"02:23.760","Text":"which is the determinant of this."},{"Start":"02:25.030 ","End":"02:29.525","Text":"We\u0027re going to expand by the first row."},{"Start":"02:29.525 ","End":"02:31.040","Text":"That\u0027s why I\u0027ve colored it."},{"Start":"02:31.040 ","End":"02:35.865","Text":"We\u0027ll get 2 times this times the determinant."},{"Start":"02:35.865 ","End":"02:37.890","Text":"Well, you know all this already."},{"Start":"02:37.890 ","End":"02:45.875","Text":"We take x minus 1 times this and then minus 1 times this determinant."},{"Start":"02:45.875 ","End":"02:50.520","Text":"This will expand by the middle row or middle column,"},{"Start":"02:50.520 ","End":"02:51.735","Text":"comes out the same."},{"Start":"02:51.735 ","End":"02:54.615","Text":"This will expand by,"},{"Start":"02:54.615 ","End":"02:58.655","Text":"again, the first row, the middle column."},{"Start":"02:58.655 ","End":"02:59.900","Text":"Whichever way you do it,"},{"Start":"02:59.900 ","End":"03:01.400","Text":"this is the expression you get."},{"Start":"03:01.400 ","End":"03:06.035","Text":"We get the determinant of this, this, this, this."},{"Start":"03:06.035 ","End":"03:13.700","Text":"Here it\u0027s x minus 1 times x minus 1 minus 1 times minus 1,"},{"Start":"03:13.700 ","End":"03:15.630","Text":"which is minus 1,"},{"Start":"03:15.630 ","End":"03:17.030","Text":"minus, minus, minus."},{"Start":"03:17.030 ","End":"03:19.540","Text":"Anyway, gloss over this."},{"Start":"03:19.540 ","End":"03:22.065","Text":"We\u0027ve done so many of these."},{"Start":"03:22.065 ","End":"03:25.160","Text":"This is standard we get x squared minus 2x squared."},{"Start":"03:25.160 ","End":"03:26.750","Text":"This is x, x minus 2,"},{"Start":"03:26.750 ","End":"03:29.570","Text":"so it\u0027s x squared, x minus 2 squared."},{"Start":"03:29.570 ","End":"03:37.560","Text":"Then we set this to 0 to find the eigenvalues and we get 2 solutions,"},{"Start":"03:37.560 ","End":"03:42.350","Text":"0 and 2, but each of them has algebraic multiplicity 2."},{"Start":"03:42.350 ","End":"03:44.510","Text":"We may or may not need that fact."},{"Start":"03:44.510 ","End":"03:48.225","Text":"Now, the eigenvectors for x equals 0,"},{"Start":"03:48.225 
","End":"03:55.250","Text":"we want the solution space of basically xI minus A when you substitute x equals 0,"},{"Start":"03:55.250 ","End":"03:57.665","Text":"which is just like minus A,"},{"Start":"03:57.665 ","End":"04:02.165","Text":"and that gives us this system of equations."},{"Start":"04:02.165 ","End":"04:05.840","Text":"The second 2 equations are identical to the first 2 equations,"},{"Start":"04:05.840 ","End":"04:07.460","Text":"so we can throw those out,"},{"Start":"04:07.460 ","End":"04:10.490","Text":"or you could do row operations and get 2 rows of 0\u0027s,"},{"Start":"04:10.490 ","End":"04:11.890","Text":"but you still throw them out."},{"Start":"04:11.890 ","End":"04:15.885","Text":"This is this. Z and T are the free variables,"},{"Start":"04:15.885 ","End":"04:20.254","Text":"so we alternately assign one of them to be 1 and the other to be 0."},{"Start":"04:20.254 ","End":"04:23.400","Text":"If we do it one way, we get this."},{"Start":"04:23.400 ","End":"04:25.085","Text":"The other way we get this."},{"Start":"04:25.085 ","End":"04:28.095","Text":"That gives us 2 eigenvectors."},{"Start":"04:28.095 ","End":"04:33.475","Text":"Remember, the vectors here are 2 by 2 matrices in this vector space."},{"Start":"04:33.475 ","End":"04:36.215","Text":"See, we have to take them in order, X,"},{"Start":"04:36.215 ","End":"04:43.830","Text":"Y, Z, T. For the other one,"},{"Start":"04:43.830 ","End":"04:49.540","Text":"it\u0027s going to be minus 1, 0, 1, 0."},{"Start":"04:49.850 ","End":"04:54.039","Text":"That\u0027s for eigenvalue 0."},{"Start":"04:54.039 ","End":"04:59.550","Text":"Now let\u0027s move on to the next eigenvalue, which is 2."},{"Start":"05:02.240 ","End":"05:07.305","Text":"For x equals 2, we want the solution space of 2I minus A."},{"Start":"05:07.305 ","End":"05:11.450","Text":"Usually, I take the characteristic matrix and substitute 2,"},{"Start":"05:11.450 ","End":"05:13.145","Text":"especially what I\u0027ve done here,"},{"Start":"05:13.145 ","End":"05:16.060","Text":"and normally would get 4 equations."},{"Start":"05:16.060 ","End":"05:19.745","Text":"Unlike the previous case where x is 0,"},{"Start":"05:19.745 ","End":"05:22.830","Text":"the last 2 rows are not the same as the first 2 rows,"},{"Start":"05:22.830 ","End":"05:26.980","Text":"but if you multiply them by a minus, they are the same."},{"Start":"05:26.980 ","End":"05:31.580","Text":"We can already just skip the last 2 because it\u0027s the same as these 2 equations,"},{"Start":"05:31.580 ","End":"05:32.765","Text":"but with a minus."},{"Start":"05:32.765 ","End":"05:35.195","Text":"As above, Z and T are free,"},{"Start":"05:35.195 ","End":"05:37.160","Text":"so one time where Z equals 0,"},{"Start":"05:37.160 ","End":"05:39.820","Text":"T equals 1, and vice versa."},{"Start":"05:39.820 ","End":"05:43.335","Text":"We find Y and X."},{"Start":"05:43.335 ","End":"05:50.220","Text":"From this we get the 2 eigenvectors for eigenvalue 2."},{"Start":"05:50.220 ","End":"05:53.870","Text":"Just putting them in the right order, you order X,"},{"Start":"05:53.870 ","End":"05:57.655","Text":"Y, Z, T, we get this and this."},{"Start":"05:57.655 ","End":"06:03.500","Text":"Note that the geometric multiplicity for both is also 2."},{"Start":"06:03.500 ","End":"06:06.730","Text":"That means it\u0027s going to be diagonalizable."},{"Start":"06:06.730 ","End":"06:08.910","Text":"Coming up to part c,"},{"Start":"06:08.910 ","End":"06:10.925","Text":"that\u0027s basically what it is."},{"Start":"06:10.925 
","End":"06:13.940","Text":"The other way of seeing this is to say yes,"},{"Start":"06:13.940 ","End":"06:19.180","Text":"we got 4 linearly independent eigenvectors in a space of dimension 4."},{"Start":"06:19.180 ","End":"06:24.030","Text":"We got the maximum number of eigenvectors, so it\u0027s diagonalizable."},{"Start":"06:24.030 ","End":"06:25.325","Text":"Or like I said,"},{"Start":"06:25.325 ","End":"06:29.660","Text":"the algebraic multiplicity and the geometric multiplicities are the same,"},{"Start":"06:29.660 ","End":"06:33.750","Text":"and the algebraic multiplicities add up to 4."},{"Start":"06:34.280 ","End":"06:42.120","Text":"D, the last part is to compute T^10 working with representations."},{"Start":"06:42.120 ","End":"06:49.260","Text":"This space of matrices is being represented as 4-dimensional real space."},{"Start":"06:49.260 ","End":"06:53.510","Text":"We can represent it by the column vector."},{"Start":"06:53.510 ","End":"06:55.550","Text":"T is represented by A,"},{"Start":"06:55.550 ","End":"06:58.555","Text":"which is a 4 by 4 matrix."},{"Start":"06:58.555 ","End":"07:02.805","Text":"Now, we figure out A^10 and then we can get T^10."},{"Start":"07:02.805 ","End":"07:07.385","Text":"We apply A^10 to this and then put it back into the square format."},{"Start":"07:07.385 ","End":"07:08.835","Text":"We diagonalize A."},{"Start":"07:08.835 ","End":"07:11.430","Text":"Remember, when we take a matrix to a high power,"},{"Start":"07:11.430 ","End":"07:14.135","Text":"if we can diagonalize it, it\u0027s much easier."},{"Start":"07:14.135 ","End":"07:20.660","Text":"We want to find P and D such that AP equals PD or alternatively like so."},{"Start":"07:20.660 ","End":"07:24.770","Text":"We have that the eigenvectors for x equals"},{"Start":"07:24.770 ","End":"07:29.135","Text":"0 are these 2 and for x equals 2 we have these 2."},{"Start":"07:29.135 ","End":"07:34.455","Text":"From this, we can construct the diagonal is 0,"},{"Start":"07:34.455 ","End":"07:38.835","Text":"0, 2, 2 on the diagonal."},{"Start":"07:38.835 ","End":"07:43.850","Text":"Here we take these with the column vector representation."},{"Start":"07:43.850 ","End":"07:45.320","Text":"This, this, this,"},{"Start":"07:45.320 ","End":"07:46.520","Text":"and this in order are this,"},{"Start":"07:46.520 ","End":"07:48.760","Text":"this, this and this."},{"Start":"07:48.760 ","End":"07:53.020","Text":"We have D and P. 
Now,"},{"Start":"07:53.620 ","End":"07:57.755","Text":"P inverse is this."},{"Start":"07:57.755 ","End":"08:00.140","Text":"You can check that it really is an inverse."},{"Start":"08:00.140 ","End":"08:03.250","Text":"If you multiply P inverse by P,"},{"Start":"08:03.250 ","End":"08:04.670","Text":"leave the half aside,"},{"Start":"08:04.670 ","End":"08:06.935","Text":"multiply this matrix by this matrix,"},{"Start":"08:06.935 ","End":"08:10.355","Text":"you\u0027ll get a matrix which is just 2\u0027s along the diagonal,"},{"Start":"08:10.355 ","End":"08:13.399","Text":"and then the half will make it the identity matrix."},{"Start":"08:13.399 ","End":"08:15.115","Text":"You could check that."},{"Start":"08:15.115 ","End":"08:17.580","Text":"Now we\u0027ll compute A^10."},{"Start":"08:17.580 ","End":"08:21.795","Text":"This will be PD^10 P inverse."},{"Start":"08:21.795 ","End":"08:26.845","Text":"D^10 is just D where we raise the diagonal to the power of 10."},{"Start":"08:26.845 ","End":"08:29.320","Text":"0, 0, 2^10, 2^10."},{"Start":"08:29.320 ","End":"08:31.229","Text":"Here\u0027s P, here\u0027s P inverse."},{"Start":"08:31.229 ","End":"08:35.400","Text":"Now, this is just 0, 0, 2^10, 2^10,"},{"Start":"08:35.400 ","End":"08:41.220","Text":"but combined with the half will make it 2^9,"},{"Start":"08:41.220 ","End":"08:43.740","Text":"2^9, 0^9, 0^9."},{"Start":"08:43.740 ","End":"08:49.990","Text":"Then multiply this by this, we get this."},{"Start":"08:51.140 ","End":"08:54.620","Text":"Easy computation shows that we get this."},{"Start":"08:54.620 ","End":"08:57.110","Text":"Could leave it like that but,"},{"Start":"08:57.110 ","End":"08:58.235","Text":"I like to tidy it."},{"Start":"08:58.235 ","End":"09:00.620","Text":"Everything involve 2^9."},{"Start":"09:00.620 ","End":"09:05.300","Text":"You could take 2^9 out times this matrix."},{"Start":"09:05.300 ","End":"09:07.950","Text":"We\u0027re not done yet."},{"Start":"09:08.170 ","End":"09:16.099","Text":"T^10 of a typical matrix is going to represent this as a column."},{"Start":"09:16.099 ","End":"09:20.780","Text":"Take this here, multiply by it."},{"Start":"09:20.780 ","End":"09:23.390","Text":"You just multiply with this matrix by this column,"},{"Start":"09:23.390 ","End":"09:26.175","Text":"we get this, so 2^9 stays."},{"Start":"09:26.175 ","End":"09:28.915","Text":"T^10 of this,"},{"Start":"09:28.915 ","End":"09:33.170","Text":"if you put it back into the matrix representation,"},{"Start":"09:33.170 ","End":"09:37.070","Text":"you have X plus Z and then Y plus T, and then X plus Z,"},{"Start":"09:37.070 ","End":"09:43.350","Text":"Y plus T. This is the answer and we\u0027re done."}],"ID":25800},{"Watched":false,"Name":"Exercise 7","Duration":"4m 32s","ChapterTopicVideoID":24888,"CourseChapterTopicPlaylistID":118361,"HasSubtitles":true,"ThumbnailPath":null,"UploadDate":null,"DurationForVideoObject":null,"Description":null,"MetaTitle":null,"MetaDescription":null,"Canonical":null,"VideoComments":[],"Subtitles":[{"Start":"00:00.000 ","End":"00:07.335","Text":"In this exercise, T is a linear transformation from P_2 of R to P_2 of R,"},{"Start":"00:07.335 ","End":"00:11.070","Text":"given as follows, T of p of x."},{"Start":"00:11.070 ","End":"00:16.020","Text":"You know what? 
Let me stop and remind you that P_2 of R is"},{"Start":"00:16.020 ","End":"00:22.020","Text":"the space of polynomials of degree 2 or less over the reals."},{"Start":"00:22.020 ","End":"00:25.245","Text":"So a typical element is a polynomial p of x."},{"Start":"00:25.245 ","End":"00:28.050","Text":"What T sends it to is the same polynomial,"},{"Start":"00:28.050 ","End":"00:32.745","Text":"but with x plus 1 substituted instead of x."},{"Start":"00:32.745 ","End":"00:37.935","Text":"We have to find the representation matrix of T in the standard basis,"},{"Start":"00:37.935 ","End":"00:41.510","Text":"to find the eigenvalues and eigenvectors of T,"},{"Start":"00:41.510 ","End":"00:45.625","Text":"and finally, to show that T isn\u0027t diagonalizable."},{"Start":"00:45.625 ","End":"00:49.790","Text":"We\u0027ll start with the representation matrix."},{"Start":"00:49.790 ","End":"00:52.670","Text":"Just copying the definition."},{"Start":"00:52.670 ","End":"00:54.965","Text":"Now if you want this more explicitly,"},{"Start":"00:54.965 ","End":"01:00.260","Text":"a general polynomial of degree 2 or less is a plus bx plus cx squared."},{"Start":"01:00.260 ","End":"01:03.935","Text":"If you replace x by x plus 1 here and here,"},{"Start":"01:03.935 ","End":"01:08.210","Text":"we get this. The standard basis is,"},{"Start":"01:08.210 ","End":"01:09.230","Text":"of course, 1,"},{"Start":"01:09.230 ","End":"01:10.370","Text":"x, x squared."},{"Start":"01:10.370 ","End":"01:13.400","Text":"It\u0027s just the most obvious thing to take as a standard basis."},{"Start":"01:13.400 ","End":"01:17.525","Text":"Let\u0027s see what T does to each basis member."},{"Start":"01:17.525 ","End":"01:22.640","Text":"It takes 1 to 1 because there\u0027s nowhere to substitute x."},{"Start":"01:22.640 ","End":"01:26.045","Text":"Well, you could use this formula with b and c equal to 0."},{"Start":"01:26.045 ","End":"01:30.595","Text":"For T of x, we could use this formula with a is 0,"},{"Start":"01:30.595 ","End":"01:31.620","Text":"b is 1,"},{"Start":"01:31.620 ","End":"01:36.285","Text":"c is 0, and then b is 1. 
We just get x plus 1."},{"Start":"01:36.285 ","End":"01:39.680","Text":"In the third case we have a is 0,"},{"Start":"01:39.680 ","End":"01:41.820","Text":"b is 0, c is 1."},{"Start":"01:41.820 ","End":"01:43.050","Text":"We get x plus 1 squared,"},{"Start":"01:43.050 ","End":"01:45.475","Text":"which is x squared plus 2x plus 1."},{"Start":"01:45.475 ","End":"01:49.700","Text":"Well, you could just do it directly: substitute x plus 1 instead of x."},{"Start":"01:49.700 ","End":"01:56.540","Text":"Now, write each of these as a linear combination of these and this is what we get."},{"Start":"01:56.540 ","End":"02:02.060","Text":"Now we take these coefficients and put them into a matrix."},{"Start":"02:02.060 ","End":"02:07.400","Text":"But rows become columns, we need to transpose, so that the last row,"},{"Start":"02:07.400 ","End":"02:10.235","Text":"1, 2, 1, is the last column."},{"Start":"02:10.235 ","End":"02:11.870","Text":"That\u0027s part a."},{"Start":"02:11.870 ","End":"02:13.910","Text":"Now part b, to find the eigenvalues and"},{"Start":"02:13.910 ","End":"02:18.875","Text":"eigenvectors. Start with the characteristic matrix; we\u0027ll use Lambda instead of x,"},{"Start":"02:18.875 ","End":"02:20.815","Text":"doesn\u0027t matter what letter is used."},{"Start":"02:20.815 ","End":"02:23.870","Text":"The characteristic polynomial is the determinant"},{"Start":"02:23.870 ","End":"02:27.959","Text":"of the characteristic matrix so it\u0027s this."},{"Start":"02:28.210 ","End":"02:32.435","Text":"Since this is an upper triangular matrix,"},{"Start":"02:32.435 ","End":"02:37.235","Text":"the determinant you can get just by multiplying the elements along the diagonal."},{"Start":"02:37.235 ","End":"02:39.440","Text":"It\u0027s Lambda minus 1, Lambda minus 1,"},{"Start":"02:39.440 ","End":"02:40.520","Text":"Lambda minus 1,"},{"Start":"02:40.520 ","End":"02:42.665","Text":"or Lambda minus 1 cubed."},{"Start":"02:42.665 ","End":"02:48.185","Text":"The eigenvalues are the solutions of the characteristic equation,"},{"Start":"02:48.185 ","End":"02:50.665","Text":"which is setting this to 0."},{"Start":"02:50.665 ","End":"02:53.570","Text":"There\u0027s only one, Lambda equals 1."},{"Start":"02:53.570 ","End":"02:55.355","Text":"It\u0027s a triple root,"},{"Start":"02:55.355 ","End":"02:58.460","Text":"meaning it has algebraic multiplicity 3,"},{"Start":"02:58.460 ","End":"03:00.005","Text":"but still it\u0027s the only one."},{"Start":"03:00.005 ","End":"03:02.320","Text":"Now for the eigenvectors,"},{"Start":"03:02.320 ","End":"03:05.000","Text":"we substitute Lambda equals 1,"},{"Start":"03:05.000 ","End":"03:07.080","Text":"that\u0027s the only eigenvalue,"},{"Start":"03:07.080 ","End":"03:10.335","Text":"in here, and that gives us this,"},{"Start":"03:10.335 ","End":"03:17.405","Text":"we want the kernel of this or the solution space of the system of equations."},{"Start":"03:17.405 ","End":"03:20.665","Text":"The last one we don\u0027t need because of zeros."},{"Start":"03:20.665 ","End":"03:28.295","Text":"Just make it all plus, so y plus z equals 0 and from here 2z equals 0."},{"Start":"03:28.295 ","End":"03:30.115","Text":"X doesn\u0027t appear,"},{"Start":"03:30.115 ","End":"03:32.405","Text":"so x is a free variable,"},{"Start":"03:32.405 ","End":"03:37.460","Text":"let it equal 1 and get an eigenvector that way; anything non-zero works, but typically 1."},{"Start":"03:37.460 ","End":"03:39.335","Text":"Then x is 1,"},{"Start":"03:39.335 ","End":"03:43.225","Text":"z is 0, and therefore y is 
0."},{"Start":"03:43.225 ","End":"03:45.740","Text":"What we have is not the vector 1,"},{"Start":"03:45.740 ","End":"03:48.170","Text":"0, 0, we need the polynomial."},{"Start":"03:48.170 ","End":"03:49.700","Text":"The 1, 0, 0,"},{"Start":"03:49.700 ","End":"03:51.860","Text":"go in front of the basis."},{"Start":"03:51.860 ","End":"03:54.125","Text":"We get this linear combination,"},{"Start":"03:54.125 ","End":"03:56.840","Text":"and we get the polynomial 1."},{"Start":"03:56.840 ","End":"04:00.455","Text":"That\u0027s the eigenvector for our eigenvalue 1."},{"Start":"04:00.455 ","End":"04:05.495","Text":"Now part C is to show that T is not diagonalizable."},{"Start":"04:05.495 ","End":"04:07.295","Text":"If it was diagonalizable,"},{"Start":"04:07.295 ","End":"04:09.680","Text":"we would have the maximum number,"},{"Start":"04:09.680 ","End":"04:12.890","Text":"which is three linearly independent eigenvectors,"},{"Start":"04:12.890 ","End":"04:16.250","Text":"but we only have one, we\u0027re missing two."},{"Start":"04:16.250 ","End":"04:19.480","Text":"No it\u0027s not diagonalizable."},{"Start":"04:19.480 ","End":"04:25.205","Text":"Since the representation matrix is not diagonalizable,"},{"Start":"04:25.205 ","End":"04:27.065","Text":"neither is the transformation."},{"Start":"04:27.065 ","End":"04:28.835","Text":"It\u0027s if and only if."},{"Start":"04:28.835 ","End":"04:32.400","Text":"That was the last part and so we\u0027re done."}],"ID":25801},{"Watched":false,"Name":"Exercise 8","Duration":"3m 41s","ChapterTopicVideoID":24889,"CourseChapterTopicPlaylistID":118361,"HasSubtitles":true,"ThumbnailPath":null,"UploadDate":null,"DurationForVideoObject":null,"Description":null,"MetaTitle":null,"MetaDescription":null,"Canonical":null,"VideoComments":[],"Subtitles":[{"Start":"00:00.000 ","End":"00:01.590","Text":"In this exeercise,"},{"Start":"00:01.590 ","End":"00:04.215","Text":"V is a vector space of dimension,"},{"Start":"00:04.215 ","End":"00:06.210","Text":"n, over a field,"},{"Start":"00:06.210 ","End":"00:09.210","Text":"F. We don\u0027t know what n is,"},{"Start":"00:09.210 ","End":"00:14.100","Text":"but the point is to say that it\u0027s a finite-dimensional vector space."},{"Start":"00:14.100 ","End":"00:19.259","Text":"T is a linear transformation from V to itself."},{"Start":"00:19.259 ","End":"00:29.505","Text":"We have to prove that T is invertible if and only if all its eigenvalues are non-zero."},{"Start":"00:29.505 ","End":"00:35.430","Text":"Secondly, to prove that if T is invertible,"},{"Start":"00:35.430 ","End":"00:38.730","Text":"then T inverse, which of course exists,"},{"Start":"00:38.730 ","End":"00:44.390","Text":"has the same eigenvectors as T. 
There\u0027s another part of this."},{"Start":"00:44.390 ","End":"00:49.610","Text":"How are the eigenvalues of T inverse related to the eigenvalues of T?"},{"Start":"00:49.610 ","End":"00:51.410","Text":"It\u0027s really 3 parts."},{"Start":"00:51.410 ","End":"00:53.330","Text":"The first part, well,"},{"Start":"00:53.330 ","End":"00:55.630","Text":"it\u0027s logically equivalent,"},{"Start":"00:55.630 ","End":"00:58.675","Text":"what we call in logic the contra-positive,"},{"Start":"00:58.675 ","End":"01:02.570","Text":"that this is not true if and only if this is not true."},{"Start":"01:02.570 ","End":"01:09.205","Text":"T is non-invertible if and only if it has 0 as an eigenvalue."},{"Start":"01:09.205 ","End":"01:14.910","Text":"We\u0027ll start from the right: 0 is an eigenvalue of T,"},{"Start":"01:14.910 ","End":"01:23.340","Text":"and this is true if and only if Tv is 0 for some non-zero vector, it\u0027s 0v,"},{"Start":"01:23.340 ","End":"01:29.260","Text":"which is 0, and this is true if and only if the kernel of T is nontrivial,"},{"Start":"01:29.260 ","End":"01:33.860","Text":"meaning this v is in the kernel when it\u0027s not 0."},{"Start":"01:33.860 ","End":"01:38.010","Text":"This means that T is not invertible."},{"Start":"01:38.010 ","End":"01:43.090","Text":"There is a theorem that T is invertible if and only"},{"Start":"01:43.090 ","End":"01:48.625","Text":"if the kernel of T consists of solely the 0 elements."},{"Start":"01:48.625 ","End":"01:50.470","Text":"For this theorem to hold,"},{"Start":"01:50.470 ","End":"01:54.565","Text":"it\u0027s important that the dimension of V is finite,"},{"Start":"01:54.565 ","End":"01:59.055","Text":"and that\u0027s why in the beginning we said dimension n. Now,"},{"Start":"01:59.055 ","End":"02:01.015","Text":"let\u0027s go to part B,"},{"Start":"02:01.015 ","End":"02:05.110","Text":"where we have to prove that if T is invertible,"},{"Start":"02:05.110 ","End":"02:09.230","Text":"then its inverse has the same eigenvectors."},{"Start":"02:09.440 ","End":"02:14.090","Text":"Let v be a non-zero eigenvector,"},{"Start":"02:14.090 ","End":"02:16.970","Text":"by definition of eigenvectors it\u0027s non-zero,"},{"Start":"02:16.970 ","End":"02:21.100","Text":"corresponding to the eigenvalue Lambda."},{"Start":"02:21.100 ","End":"02:27.170","Text":"What does that mean? That Tv equals Lambda v. But because of what it says here,"},{"Start":"02:27.170 ","End":"02:29.270","Text":"Lambda is not 0. Let\u0027s see."},{"Start":"02:29.270 ","End":"02:33.694","Text":"If lambda was 0, then Tv would equal 0,"},{"Start":"02:33.694 ","End":"02:37.660","Text":"which means that v is in the kernel."},{"Start":"02:37.660 ","End":"02:39.755","Text":"But v is not 0,"},{"Start":"02:39.755 ","End":"02:43.070","Text":"and the kernel of T consists solely of 0,"},{"Start":"02:43.070 ","End":"02:44.630","Text":"if T is invertible,"},{"Start":"02:44.630 ","End":"02:47.680","Text":"so Lambda is not 0."},{"Start":"02:47.680 ","End":"02:56.090","Text":"If Lambda is not 0, then we can take its inverse on the T. 
If T sends v to Lambda v,"},{"Start":"02:56.090 ","End":"02:59.525","Text":"then T inverse sends Lambda v back to v,"},{"Start":"02:59.525 ","End":"03:01.340","Text":"that\u0027s the definition of the inverse."},{"Start":"03:01.340 ","End":"03:03.485","Text":"Now, T inverse is also linear,"},{"Start":"03:03.485 ","End":"03:07.310","Text":"so we can take the lambda outside the transformation,"},{"Start":"03:07.310 ","End":"03:13.380","Text":"so Lambda T inverse of v is v. Now notice that Lambda is not 0,"},{"Start":"03:13.380 ","End":"03:14.730","Text":"I\u0027m just reminding you,"},{"Start":"03:14.730 ","End":"03:18.465","Text":"so we can divide both sides by Lambda,"},{"Start":"03:18.465 ","End":"03:26.045","Text":"and get T inverse v is 1 over lambda times v. What does this mean?"},{"Start":"03:26.045 ","End":"03:29.060","Text":"That v is an eigenvector of T inverse,"},{"Start":"03:29.060 ","End":"03:32.000","Text":"but the eigenvalue is the reciprocal of"},{"Start":"03:32.000 ","End":"03:36.680","Text":"the eigenvalue that it was for T. That\u0027s just a by the way,"},{"Start":"03:36.680 ","End":"03:38.855","Text":"which is important, but it wasn\u0027t asked for."},{"Start":"03:38.855 ","End":"03:41.250","Text":"Anyway, we are done."}],"ID":25802}],"Thumbnail":null,"ID":118361}]