This article is rated Start-class on Wikipedia's content assessment scale. It is of interest to the following WikiProjects:
Shouldn't one use inner product notation here rather than bra-ket notation?
I.e. ⟨x, Ay⟩ rather than ⟨x|A|y⟩.
PJ.de.Bruin 00:49, 5 Jul 2004 (UTC)
At least one concrete example would be nice! Otherwise it seems A* = A^{-1}, as in A*A = AA* = I... anyone?
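A minimal NumPy sketch of such a concrete example (the matrices and vectors below are arbitrary choices for illustration): in finite dimensions the adjoint is the conjugate transpose, which satisfies ⟨Ax, y⟩ = ⟨x, A*y⟩ for every pair of vectors, but equals the inverse only when A happens to be unitary.

```python
import numpy as np

# An arbitrary complex 2x2 matrix, chosen only for illustration.
A = np.array([[1 + 2j, 3j],
              [0, 4 - 1j]])

A_star = A.conj().T  # in finite dimensions, the adjoint is the conjugate transpose

# The defining property <Ax, y> = <x, A*y> holds for every pair of vectors.
# (np.vdot conjugates its first argument, matching the inner product
# that is conjugate-linear in the first slot.)
x = np.array([1j, 2.0])
y = np.array([3.0, -1j])
lhs = np.vdot(A @ x, y)       # <Ax, y>
rhs = np.vdot(x, A_star @ y)  # <x, A*y>
print(np.isclose(lhs, rhs))   # True

# A*A = I would require A to be unitary; for this A it fails:
print(np.allclose(A_star @ A, np.eye(2)))  # False

# For a unitary matrix (here a rotation), the adjoint IS the inverse:
U = np.array([[0, -1], [1, 0]], dtype=complex)
print(np.allclose(U.conj().T @ U, np.eye(2)))  # True
```

So A* = A^{-1} is a special property of unitary operators, not part of the definition of the adjoint.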
The distinction is not made clear. An operator A is self-adjoint if A* = A, which in particular requires Dom(A*) = Dom(A).
Hermiticity does not necessarily guarantee the latter statement.
Dirc 20:52, 14 February 2007 (UTC)[reply]
Hmm, the article says:
I don't really see how this follows from the properties above it. Can someone provide a simple proof for this?
This page explains what adjoints are, but just reading this page it isn't clear why it might be useful to define "adjoint". —Ben FrantzDale 01:18, 9 April 2007 (UTC)
Sometimes a solution to a problem involving an adjoint converts into a solution to the original problem, as for example in the case of some second-order differential equations.[reply]
Adjoints of operators generalize conjugate transposes of square matrices to (possibly) infinite-dimensional situations. If one thinks of operators on a Hilbert space as "generalized complex numbers", then the adjoint of an operator plays the role of the complex conjugate of a complex number.
I could not easily find a link from this article to the adjoint of a general bounded linear operator, nor is it covered in this article. —Preceding unsigned comment added by 131.111.8.103 (talk) 16:46, 14 February 2008 (UTC)[reply]
The article states that one can prove the existence of the adjoint operator using the Riesz representation theorem for the dual of Hilbert spaces. While this is certainly true, it glosses over a nice, short, instructive proof which I'd like to add to the article. If there are no objections I'll add a section that contains a proof of the existence of adjoint operators using sesquilinear forms, as in the book by Kreyszig. Compsonheir (talk) 20:35, 27 April 2009 (UTC)[reply]
Do so, I'm interested in that proof. Also I will change the first line: we are implicitly talking about bounded operators. In the unbounded case one has to say something about the domain. — Preceding unsigned comment added by Noix07 (talk • contribs) 15:14, 6 December 2013 (UTC)[reply]
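For reference, the Riesz-representation argument alluded to above can be sketched as follows (this is the standard textbook proof for bounded operators):

```latex
Let $T \colon H \to H$ be a bounded linear operator on a Hilbert space $H$.
Fix $y \in H$. The map
\[
  x \mapsto \langle Tx, y \rangle
\]
is a bounded linear functional on $H$, since
$|\langle Tx, y \rangle| \le \|T\|\,\|x\|\,\|y\|$.
By the Riesz representation theorem there is a unique $z \in H$ with
\[
  \langle Tx, y \rangle = \langle x, z \rangle \quad \text{for all } x \in H.
\]
Define $T^* y := z$. Uniqueness of $z$ makes $T^*$ well defined, and a short
computation shows that $T^*$ is linear and bounded with $\|T^*\| = \|T\|$.
```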
The notation was A, A* in matrix notation. I changed them to linear maps, so it reads T*(x), T(x), etc. —Preceding unsigned comment added by Negi(afk) (talk • contribs) 06:29, 28 April 2009 (UTC)[reply]
This article is the same as Conjugate transpose. :) Ladsgroup بحث 11:57, 30 January 2011 (UTC)[reply]
No, this article is not the same as Conjugate transpose. A matrix is a matrix, and an operator is an operator. Of course there is a relation between those objects (and indeed an isomorphism between the spaces containing them), but both articles should definitely remain separate in my opinion. --Vilietha (talk) 16:32, 11 March 2011 (UTC).[reply]
I agree. I encounter discussions of operators in quantum mechanics that go back and forth between mention of operators, as such, and matrix representations. BUT operators can be considered quite apart from the matrix representations, and I take for granted manipulations of other representations of the operators, and extensions that cannot be handled so easily by matrices. And I am opposed to the appalling proliferation of articles on interrelated topics and am in favour of many mergers elsewhere. Michael P. Barnett (talk) 03:54, 1 May 2011 (UTC)[reply]
No merge. In order to do this, don't we need to "broaden" the definition of a matrix to include infinite-dimensional matrices? Even then this only allows us to consider spaces isomorphic to L2, and the matrix representation only converges for bounded operators (right? Maybe my memory is faulty). So I don't agree with the merge. In fact, I don't recall ever reading anything that referred to the adjoint of an operator on a Hilbert space as the conjugate transpose. Anyway, keeping the articles separate helps to keep a distinction between linear algebra and operator theory. There is enough confusion between the two among undergraduate math students as it is. --129.69.206.184 (talk) 12:56, 22 December 2011 (UTC)[reply]
Please could someone provide an accessible reference to clarify the statement posted above that "Hermiticity and self-adjointness are synonymous. (the notation A* = A implicitly implies Dom(A) = Dom(A*), usually.)" In particular, could someone please explain the usage of "usually" in this context. Clear specification of the nature of objects represented by symbols, and of the circumstances needed for a statement containing these to be true in mathematical discourse, is not an unreasonable request. Michael P. Barnett (talk) 04:05, 1 May 2011 (UTC)[reply]
Can someone please either change the symbol or else define it? With the prevalence of ambiguous notation concerning conjugates and adjoints etc. (A, A*, A†, ...), it is rather unclear to the casual reader what is meant by it. In particular, is it related to the raised version? 93.96.22.178 (talk) 17:14, 15 July 2013 (UTC) Erikpan[reply]
Is the terminology "self-adjoint" only defined for complex Hilbert spaces? The article is written this way. But I think it is equally valid for real Hilbert spaces.
178.38.97.101 (talk)22:35, 17 May 2015 (UTC)[reply]
Another very strange thing about this article: isn't the adjoint also defined when the two spaces are distinct? That is, A* can be defined for a linear operator A : H1 → H2.
This holds in both finite and infinite dimensions, and for both bounded and unbounded operators. In the case of unbounded operators, A must be densely defined for A* to exist. See unbounded operator.
This seems like a serious omission. How can I flag an article as having too narrow a scope? 178.38.97.101 (talk) 23:17, 17 May 2015 (UTC)[reply]
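The distinct-spaces case raised above is easy to illustrate in finite dimensions (a minimal NumPy sketch; the matrix and vectors are arbitrary choices): a map A : H1 → H2 is a rectangular matrix, and its adjoint A* : H2 → H1 is the conjugate transpose, satisfying ⟨Ax, y⟩ in H2 equals ⟨x, A*y⟩ in H1.

```python
import numpy as np

# A maps H1 = C^3 into H2 = C^2, so its matrix is 2x3 and its adjoint
# A* : H2 -> H1 is the 3x2 conjugate transpose.
A = np.array([[1, 2j, 0],
              [1 - 1j, 0, 3]])
A_star = A.conj().T

x = np.array([1.0, 1j, 2.0])  # x in H1
y = np.array([2j, 1.0])       # y in H2

# <Ax, y>_{H2} = <x, A*y>_{H1}  (np.vdot conjugates its first argument)
print(np.isclose(np.vdot(A @ x, y), np.vdot(x, A_star @ y)))  # True
```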
I will try to draft a generalization for operators between Banach spaces and distinct Hilbert spaces tomorrow. Valiantrider (talk) 21:11, 26 December 2015 (UTC)[reply]
It is somewhat different for operators (linear transformations) than for matrices and bilinear/sesquilinear forms. That is perhaps why it is confusing.
For operators: A is called symmetric if ⟨Ax, y⟩ = ⟨x, Ay⟩ for all x, y in the domain of A.
By way of comparison, the use of "symmetric" is different for matrices and for bilinear/sesquilinear forms than it is for operators.
For bilinear/sesquilinear forms: B is called symmetric if B(x, y) = B(y, x) for all x, y.
178.38.97.101 (talk)23:56, 17 May 2015 (UTC)[reply]
I included the definition of adjoint operators for operators between normed spaces from Brezis' book on functional analysis. Please check; I'm new to this. I will probably include the case where both spaces are Hilbert spaces as an example tomorrow. Also, the intro needs some rewriting (to change the scope from Hilbert to normed spaces), and maybe some section titles do too. Valiantrider (talk) 21:58, 26 December 2015 (UTC)[reply]
I included an informal definition to ease into the notion of adjoint operators and mentioned the "mixed" case (operator from Hilbert to Banach), which is especially interesting when one considers, for example, A as the inclusion operator from some Hilbert space which is a proper subset of a Banach space (as in Cameron-Martin spaces). I got most of the input from Brezis' book. Will probably include some examples soon. Valiantrider (talk) 21:46, 27 December 2015 (UTC)[reply]
Since this is all about inner products, wouldn't it be possible to start the "informal definition" section with a simple example involving finite-dimensional vectors and matrices? — Preceding unsigned comment added by 62.80.108.37 (talk) 16:58, 6 February 2019 (UTC)[reply]
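Such a finite-dimensional example might look like the following NumPy sketch (matrices and vectors chosen arbitrarily): in the real case the adjoint is just the transpose, while in the complex case the conjugate transpose is needed to move the matrix across the inner product.

```python
import numpy as np

# Real case: the adjoint of a real matrix is its transpose, and the
# defining identity  (Ax) . y = x . (A^T y)  can be checked directly.
A = np.array([[1.0, 2.0],
              [3.0, 4.0]])
x = np.array([1.0, -1.0])
y = np.array([0.5, 2.0])
print(np.isclose((A @ x) @ y, x @ (A.T @ y)))  # True

# Complex case: the transpose alone is not enough; the conjugate transpose
# is what satisfies <Bu, v> = <u, B*v> for the complex inner product.
B = np.array([[1j, 2], [0, 1 + 1j]])
u = np.array([1.0, 1j])
v = np.array([1.0, 1.0])
print(np.isclose(np.vdot(B @ u, v), np.vdot(u, B.conj().T @ v)))  # True
print(np.isclose(np.vdot(B @ u, v), np.vdot(u, B.T @ v)))         # False
```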