Talk:Adjugate matrix
This article is rated B-class on Wikipedia's content assessment scale.
adjoint is not used?
The adjugate has sometimes been called the "adjoint", but that terminology is ambiguous and is not used in Wikipedia. Today, "adjoint" normally refers to the conjugate transpose.
A more generic example is this: given a matrix A, its adjoint is ...
The article is clear to me: "adjugate" resolves an ambiguity in the literature. But the literature is full of ambiguities! None of my references mention the word "adjugate" at all: they unabashedly use the word "adjoint" without worrying about ambiguities. I'm all for fixing our language: it is just another tool after all. But it would be much more helpful to the cause if several references that recommend this particular word to fix the ambiguity in the literature were cited. As it is, I'm not yet convinced that the literature recommends "adjugate". Does it? Where?
Cjfsyntropy (talk) 21:55, 21 April 2010 (UTC)
transpose required
After the cofactor break-up is done as shown in the picture, you need to take the transpose of that entire matrix to get the real adjoint; this is not mentioned in the picture or the text.
I'm a mere student using this site to help with coursework, so take this with a pinch of salt. But when you say
"Today, "adjoint" normally refers to the conjugate transpose."
Might I suggest that you mean the conjugate transpose of the cofactor matrix?
Please ignore if I'm wrong or if you feel this is implied.
ta,
Th
--I don't think they mean the conjugate transpose of the cofactor matrix. The cofactor matrix is only involved in the *classical adjoint* or adjugate, whereas the *adjoint* is precisely the conjugate transpose of the matrix (i.e. take the complex conjugate of each element, then transpose). 74.104.1.193 06:08, 4 February 2007 (UTC) Jordan
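A minimal SymPy sketch of that point, with an arbitrarily chosen matrix: the adjugate (classical adjoint) is the transpose of the cofactor matrix, which is the step the comments above say is easy to miss.

```python
from sympy import Matrix

A = Matrix([[1, 2, 3],
            [4, 5, 6],
            [7, 8, 10]])

C = A.cofactor_matrix()          # the matrix of cofactors, before any transpose
print(C.T == A.adjugate())       # True: the adjugate is the transpose of C
print(A * A.adjugate() == A.det() * Matrix.eye(3))   # True: A adj(A) = det(A) I
```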
correction
The last adjugate example, the 3 by 3 with A subscripts, is incorrect; the result needs to be transposed.
Fixed. TooMuchMath 19:14, 28 January 2006 (UTC)
q(A)
I read "If p(t) = det(A - tI) is the characteristic polynomial of A and we define the polynomial q(t) = (p(0) - p(t))/t, then adj(A) = q(A)", but q(A) = (p(0) - p(A))/A and p(A) = det(A - AI) = 0. Maybe you mean q_A(t)? --151.28.36.120 07:43, 27 September 2006 (UTC)
There was actually no problem here. I clarified this in the text by noting that the standard way to understand q(A), with q a polynomial, is as the sum q_0 + q_1 A + ... + q_n A^n, where the q_j are the coefficients of q(t). You are correct that p(A) = 0, but this doesn't imply q(A) = 0; rather, from t q(t) = p(0) - p(t) we get A q(A) = p(0) I - p(A) = det(A) I, so q(A) = det(A) A^(-1) = adj(A) (at least when A is invertible)! (Incidentally, the proof that p(A) = 0 because det(A - A I) = 0 is incorrect, as there is a priori no reason that p(A) = det(A - A I) I. To explain: for an arbitrary matrix B one defines p(B) = p_0 + p_1 B + ... + p_n B^n, with the p_j the coefficients of p(t) = det(A - t I). And indeed, it is not necessarily true that p(B) = det(A - B) I! A simple example is
A = ( 0 1 // 0 0 ) and B = ( 0 1 // 1 0 ), where // marks a new row.
Then p(t) = det(A - tI) = t^2, so p(B) = B^2 = I. However, det(A - B) = 0!) --Jhschenker 16:30, 18 October 2006 (UTC)
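A minimal NumPy sketch of this construction (the helper name and the test matrix are arbitrary choices made here): build p(t) = det(A - tI), form q(t) = (p(0) - p(t))/t, and evaluate q at A by substituting powers of A for powers of t.

```python
import numpy as np

def adjugate_via_char_poly(A):
    """Return q(A), where p(t) = det(A - tI) and q(t) = (p(0) - p(t))/t."""
    n = A.shape[0]
    # np.poly(A) gives the coefficients of det(tI - A), highest power first;
    # det(A - tI) = (-1)^n det(tI - A), so flip the sign when n is odd.
    p_desc = (-1) ** n * np.poly(A)     # p_n, p_{n-1}, ..., p_0
    p = p_desc[::-1]                    # p_0, p_1, ..., p_n
    q = -p[1:]                          # q(t) = (p(0) - p(t))/t has q_j = -p_{j+1}
    result = np.zeros((n, n))
    A_power = np.eye(n)
    for q_j in q:                       # q(A) = q_0 I + q_1 A + ... + q_{n-1} A^{n-1}
        result += q_j * A_power
        A_power = A_power @ A
    return result

A = np.array([[1.0, 2.0],
              [3.0, 4.0]])
print(adjugate_via_char_poly(A))            # [[ 4. -2.] [-3.  1.]]
print(np.linalg.det(A) * np.linalg.inv(A))  # same result, since this A is invertible
```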
What is the derivative with respect to A of adj(A)?
Please give the formula using both matrix notation and index notation.
Umm... What exactly do you want? You can take the derivative of each element of a matrix, and matrices can be used as linear transformations of polynomials to produce the derivative. There is the Wronskian matrix. However, taking the derivative with respect to an entire matrix is not something you can do.--Cronholm144 05:02, 13 July 2007 (UTC)
Yes you can... The result will be a tensor-valued function. In other words, adj' will be a function which takes a matrix (a 2-tensor, i.e. an n x n array of numbers) as input and gives a 4-tensor (an n x n x n x n array) as its output. This is because adj is a function from 2-tensors to 2-tensors. In general, the derivative of a function from j-tensors to k-tensors is a function from j-tensors to (j+k)-tensors. Anyway, sorry I'm at a loss for what adj' actually is :) --Wikimorphism 02:20, 13 August 2007 (UTC)
- I think the OP means something like
- But this is valid only for invertible A, and I have no idea how to find a more general formula. David 09:10, 16 May 2008 (UTC)
I think the formula may be:
--Liuyifourfire (talk) 17:34, 22 March 2009 (UTC)
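The displayed formulas above did not survive here, but one natural guess for invertible A comes from adj(A) = det(A) A^(-1) and the product rule: d adj(A)[H] = det(A) (tr(A^(-1)H) A^(-1) - A^(-1) H A^(-1)). A NumPy sketch (function names invented for illustration) checking this directional derivative against a central finite difference:

```python
import numpy as np

def adj(A):
    """Adjugate via det(A) * inv(A); valid only for invertible A."""
    return np.linalg.det(A) * np.linalg.inv(A)

def d_adj(A, H):
    """Directional derivative of adj at invertible A in direction H,
    from the product rule applied to det(A) * inv(A)."""
    Ainv = np.linalg.inv(A)
    return np.linalg.det(A) * (np.trace(Ainv @ H) * Ainv - Ainv @ H @ Ainv)

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 3))          # almost surely invertible
H = rng.standard_normal((3, 3))          # an arbitrary direction
eps = 1e-6
fd = (adj(A + eps * H) - adj(A - eps * H)) / (2 * eps)   # central difference
print(np.allclose(fd, d_adj(A, H), atol=1e-5))           # expect True
```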
Multiplicative property of the adjugate
- from page Wikipedia:Reference desk/Mathematics
Hi! I'm looking for a proof of the identity adj(AB) = adj(B) adj(A).
The article Adjugate matrix says nothing about the proof, and the proof is not contained in the reference 'Gilbert Strang: Linear Algebra and its Applications'. When A and B are invertible the proof is easy, but otherwise? I can't treat cofactors well. Would you be so kind and help me to find a real reference or a proof? Thanks, Mozó (talk) 14:21, 15 December 2008 (UTC)
- For real or complex square matrices you may get it by continuity, because invertible matrices are dense--PMajer (talk) 19:57, 15 December 2008 (UTC)
- Uuuuh, what a good idea, thanks! I should have thought about it :) or how to say it in English :S But, what about the engineering courses? I'd like to show them directly that trace(adj(A)) is the 2nd scalar invariant of the (classical) tensor A, that is, for every orthogonal transformation O (and for orthonormal bases) trace adj(A) = trace adj(O^T A O). And (I realised that) I need the identity above. Mozó (talk) 22:23, 15 December 2008 (UTC)
- Yeah, trying to explain topology to a bunch of engineers is probably a bad idea! If you're only interested in a particular number of dimensions, then you could do it as a direct calculation (or, rather, set it as an exercise - it will be a horrible mess!). There is probably a better way I'm just not thinking of, though. --Tango (talk) 23:04, 15 December 2008 (UTC)
- Actually, we do teach topology to engineers (especially the argument above), when we find the continuous solutions of the functional equation |f| = e^x (or |f(x)| = |x|), so PMajer's idea could work. And of course proving the identity manually (term by term) for homework may cause bad feelings about math :) Mozó (talk) 07:17, 16 December 2008 (UTC)
- Here's a way that avoids topology, though it probably doesn't qualify as "direct". Each element of adj(AB) − adj(B) adj(A) is a polynomial in the elements of A and B and is zero whenever A and B are both invertible. There are infinitely many invertible n×n matrices over the reals, so the polynomials must be identically zero over the reals and hence over any commutative ring, since their construction is independent of the ring. -- BenRG (talk) 08:28, 16 December 2008 (UTC)
Here's a fairly direct proof. Let B_1, ..., B_n be the rows of B, and A_1, ..., A_n the columns of A. Now examine the i,j entry of each side of the matrix identity. Each side is a function that:
- does not depend on the row B_j or the column A_i;
- is an alternating multilinear form with respect to the remaining rows of B;
- is an alternating multilinear form with respect to the remaining columns of A.
Thus it is enough to check equality when A and B are both permutation matrices. (Proof: Fix i, j. Because each of the two quantities is linear with respect to each row of B/column of A other than B_j and A_i, you can assume each of these is one of the standard basis vectors, and that none are repeated. Then set B_j (resp. A_i) to be the remaining standard basis vector, since this doesn't affect the i,j entry.) Now check that if the identity is true for A, B, then it remains true when two consecutive columns of A (resp. rows of B) are exchanged. (This results from the fact that if you exchange, say, two consecutive rows of C, this has the effect of exchanging the corresponding columns of adj(C) and multiplying it by -1.) This reduces the problem to checking the identity when A, B are both the identity matrix. 67.150.253.60 (talk) 13:04, 16 December 2008 (UTC)
- This is obviously "the right way" to do the problem. I had wanted to say something along the lines that it is true as a formal identity in the polynomial ring where the entries of A and B had been adjoined and that passing to a suitable transcendental field extension then implies the result. But that was too complicated. Good answer, siℓℓy rabbit (talk) 20:01, 16 December 2008 (UTC)
- Just another option, in case your students don't like extension of polynomial identities in several variables: prove it first for A, B invertible as suggested by others above. Now let A be arbitrary and B be invertible. All but a finite number of the matrices A + tI are invertible. The identity to be proved is a single-variable polynomial one true for most t, and therefore for all t. Now let A, B be arbitrary and do the same thing with B + sI. 67.150.246.75 (talk) 03:22, 17 December 2008 (UTC) —Preceding unsigned comment added by Mozó (talk • contribs)
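A quick numerical sanity check of the identity adj(AB) = adj(B) adj(A), with the adjugate computed from cofactors so that singular matrices are covered as well (a NumPy sketch; the helper name and the test matrices are arbitrary):

```python
import numpy as np

def adjugate(A):
    """Adjugate from cofactors: adj(A)[i, j] = (-1)^(i+j) * det(A with row j
    and column i removed). Works for singular A, unlike det(A) * inv(A)."""
    n = A.shape[0]
    adj = np.empty((n, n))
    for i in range(n):
        for j in range(n):
            minor = np.delete(np.delete(A, j, axis=0), i, axis=1)
            adj[i, j] = (-1) ** (i + j) * np.linalg.det(minor)
    return adj

rng = np.random.default_rng(1)
A = rng.integers(-3, 4, (3, 3)).astype(float)
B = rng.integers(-3, 4, (3, 3)).astype(float)
B[2] = B[0] + B[1]                     # make B singular on purpose
print(np.allclose(adjugate(A @ B), adjugate(B) @ adjugate(A)))   # expect True
```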
3x3 example
If you follow the "3x3 numeric matrix" example it doesn't make sense. The definition of adj(A) (the big matrix with 9 cofactors) clearly shows the bottom middle entry as -det(1 3 4 6), which would be -det(-3 -5 -1 -2) in the numeric example. This conflicts with the claim that the submatrix is (-3 2 3 -4). There seems to be some confusion over whether the answer is transposed or not. I don't really care, but the Wikipedia page shouldn't contradict itself. Either change the example so it fits the definition of adj(A) as given above, or transpose the definition of adj(A) so it matches the example. 131.215.143.14 (talk) 23:54, 26 August 2009 (UTC)
Was all quite wrong - Fixed.
Stormcloud51090 (talk) 07:05, 16 June 2010 (UTC)
Fixed?
The problem with the internal contradiction mentioned above is still there. I fail to see how it's been fixed. I made the corrections once so that it did not conflict with the 1-9 matrix in the example above but apparently someone "fixed" it so that it now contradicts itself again. Someone needs to find a definition and stick by it, and make sure that definition is followed throughout the article.
Elevent 2010-08-13 —Preceding unsigned comment added by 83.251.231.9 (talk) 08:38, 13 August 2010 (UTC)
Left & right Adj?
The article assumes that the matrix is square. This does not appear to be a requirement. That is, SVD handles non-square matrices, so left- and right-pseudo-inverses are easily defined for non-square matrices. So also for adj, correct?
What is the definition for a left-adj or a right-adj? (FWIW, it is my guess that much of the complicated explanation in terms of cofactors will become trivial when expressed in terms of the SVD -- but I don't know, since I can't find any useful (non-functor) definition of left/right-adj.) Jmacwiki (talk) 21:00, 7 July 2012 (UTC)
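On the pseudo-inverse part of the premise only (the left/right adjugate itself seems to remain an open question here), a NumPy sketch of how the SVD of a tall matrix yields a left pseudo-inverse:

```python
import numpy as np

rng = np.random.default_rng(2)
M = rng.standard_normal((5, 3))        # tall, almost surely of full column rank

U, s, Vt = np.linalg.svd(M, full_matrices=False)
M_pinv = Vt.T @ np.diag(1.0 / s) @ U.T # Moore-Penrose pseudo-inverse from the SVD

print(np.allclose(M_pinv @ M, np.eye(3)))       # True: a left inverse of M
print(np.allclose(M_pinv, np.linalg.pinv(M)))   # agrees with NumPy's pinv
```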
Formula correct
This edit marked the formula below dubious:
The formula is correct, provided det(A) is a unit. Here's a proof:
Cancel one factor of det(A) from each side to get the result. I've edited the article accordingly. 75.76.162.89 (talk) 08:56, 11 August 2012 (UTC)
Examples: 2x2 generic matrix
Should
hence that adj(adj) = A.
be
hence that adj(adj(A)) = A.
?
--98.111.232.64 (talk) 19:54, 18 April 2017 (UTC)
- Of course you are right. It was part of the Feb 16 vandalism spasm. Fixed it. Thanks. Cuzkatzimhut (talk) 21:38, 18 April 2017 (UTC)
Recent edits
Recent edits apparently add nothing to the article; instead they introduce grammatical issues and do not conform with Wikipedia standards, see for instance the reference additions. I have notified the author about referencing on his talk page before. He also does not give any edit summary in his edits. prokaryotes (talk) 16:56, 13 August 2015 (UTC)
- Can you be more explicit? I strongly believe this is a retaliatory move on scalar field theory edits and Fermi's golden rule edits. Is it not? Cuzkatzimhut (talk) 16:58, 13 August 2015 (UTC)
- Small correction, you have added 3 links, the rest should be self-explanatory. I looked over some of your recent edits and it appears that most are not beneficial to the project. As I have pointed out to you now several times, your references are lacking (not inline), your additions are often too technical - also mentioned already to you. If you think I edit war or retaliate over whatever you think it is, then you are wrong with your assumptions. prokaryotes (talk) 17:04, 13 August 2015 (UTC)
- My additions are technical since this is a technical issue of matrix theory. The three classic texts on the subject I added support and organize all aspects of the improvements made. If you were more detailed about how you'd like things improved, I could be accommodating. What, exactly, do you wish to know? HTML-template upgrades of in-text TeX are the mandate of WP, given the diversity of platforms and systems it is accessed from. I need not assume anything: I observe your documented rollback of 3,400 and 730 characters, in 2 articles, respectively, within 10 mins. It is not about what I think, assume, or observe, however. Cuzkatzimhut (talk) 18:36, 13 August 2015 (UTC)
- Changes such as "so that" (not correct), "whis" (slang), or using a longer See also link are not what keeps things simple. Keep it simple and don't change things which do not require attention. And add an edit summary when you change something. Do not change content unless you have something to add, or a correction. Looking at how long you have been on Wikipedia, your edit history suggests that the issues mentioned are not new to you. prokaryotes (talk) 18:49, 13 August 2015 (UTC)
- Thank you for the "whis": a typo, corrected. "So that" is standard usage. You are free to condense longer see also links--I believe immediacy trumps brevity, here. "Issues" you are conjuring up may, in fact, cut both ways. Yes, I have found my stride over the years, effulgently. Convince me why my technical edits could help the reader with their absence. Cuzkatzimhut (talk) 19:04, 13 August 2015 (UTC)
- Notice that i will bring this to admin attention if you repeat your edit behavior, also read WP:CIR. prokaryotes (talk) 19:40, 13 August 2015 (UTC)
- Update: Thanks for adding the references now to the correct spots. prokaryotes (talk) 19:43, 13 August 2015 (UTC)
- Glad to be of service to the reader. Let me know of the technical issues with the adjugate troubling you. Cuzkatzimhut (talk) 19:59, 13 August 2015 (UTC)
Now, your, frankly, bizarre editing of my message to fracture my grammar by deforming my adverb "effulgently" into a noun, "effulgence", makes me wonder about your etiquette practices, beyond unwarranted grammatical prestidigitation. Please do not do this again: putting words in other people's mouths is a very, very bad habit, and, yes, do by all means invite the administrators to inspect today's massive deletions. The 39 watchers of this page may wish to opine. Cuzkatzimhut (talk) 20:10, 13 August 2015 (UTC)
- Editing other people's posts is a big no-no. It is not entirely uncommon among people with strong opinions and weak arguments.
- Other affected articles in the edit spree by User:Prokaryotes include Fermi's golden rule and Scalar field theory. The edits cannot be described as helpful or even being aimed at improving the articles. YohanN7 (talk) 20:25, 13 August 2015 (UTC)
- Re effulgence, I was looking up the word on Google; no idea why this was saved in the edit summary, a mistake on my part. prokaryotes (talk) 20:32, 13 August 2015 (UTC)
Assessment comment
The comment(s) below were originally left at Talk:Adjugate matrix/Comments, and are posted here for posterity. Following several discussions in past years, these subpages are now deprecated. The comments may be irrelevant or outdated; if so, please feel free to remove this section.
Needs references. Geometry guy 01:08, 15 June 2007 (UTC)
This is a rather technical topic, e.g. this article has no or few incoming links. Hence low priority. Arcfrk 06:00, 28 June 2007 (UTC)
Last edited at 06:00, 28 June 2007 (UTC). Substituted at 01:43, 5 May 2016 (UTC)
How does the adjugate show that a zero determinant implies singularity?
The article claims that the adjugate provides a proof that if the determinant of a matrix is zero then it is singular. But it is unclear to me why that should be the case. The adjugate of a non-zero matrix is sometimes zero. --Svennik (talk) 12:14, 5 July 2020 (UTC)
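That last remark can be checked concretely: a rank-one matrix is non-zero, yet every 2x2 minor, and hence the whole adjugate, vanishes (a SymPy sketch with an arbitrarily chosen matrix):

```python
from sympy import Matrix

A = Matrix([[1, 2, 3],
            [2, 4, 6],
            [3, 6, 9]])     # non-zero, but rank 1, so every 2x2 minor is 0
print(A.adjugate())          # the 3x3 zero matrix
```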
Adjugate of the 1 x 1 matrix 0
The article says the adjugate of a 1x1 matrix is the 1x1 identity matrix [1], except that by convention when the matrix is [0], for which we define the adjugate to be zero. I haven't seen this convention and I don't see what it's supposed to achieve. We don't need this convention to make A adj(A) = adj(A) A = det(A) I hold for 1x1 matrices A. Further, this convention makes the adjugate discontinuous for 1x1 matrices: it's [1] except for the matrix [0], and [0] for the matrix [0].
Update: someone has changed the article to delete this strange convention about the adjugate of the 1x1 matrix [0]. John Baez (talk) 10:49, 12 November 2022 (UTC)
- I made the correction. There is no need for a convention: the determinant of a 0x0 matrix is 1, for several good reasons, notably its definition using permutations (the symmetric group on 0 letters has one element, its signature is 1, and an empty product equals 1), its definition using the exterior algebra (the 0th exterior power is the ring), the formula for the determinant of block-diagonal matrices (it equals the product of the two determinants), etc.
- Moreover, that bizarre convention made the formula discontinuous, while the correct one is polynomial (constant…) like it is in all other cases. ACL-FMD (talk) 10:52, 12 November 2022 (UTC)
- Additionally, I changed the property 'adj(0) = 0' to specify that it doesn't hold in the n=1 case. Oscar Cunningham (talk) 12:57, 12 November 2022 (UTC)
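For completeness, a tiny SymPy check that the fundamental identity needs no special convention in the 1x1 case (using adj([0]) = [1], as argued above):

```python
from sympy import Matrix, eye

a = Matrix([[0]])                       # the 1x1 matrix [0]
adj_a = Matrix([[1]])                   # adj of any 1x1 matrix is [1]
print(a * adj_a == a.det() * eye(1))    # True: A adj(A) = det(A) I still holds
```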