Please use this identifier to cite or link to this item: https://hdl.handle.net/10316/7709
Title: An interlacing theorem for matrices whose graph is a given tree
Authors: Fonseca, C. M. da 
Issue Date: 2006
Citation: Journal of Mathematical Sciences. 139:4 (2006) 6823-6830
Abstract: Let A and B be n×n matrices. For an index set S ⊆ {1, …, n}, denote by A(S) the principal submatrix that lies in the rows and columns indexed by S. Denote by S' the complement of S and define η(A, B) = Σ_S det A(S) det B(S'), where the summation is over all subsets of {1, …, n} and, by convention, det A(∅) = det B(∅) = 1. C. R. Johnson conjectured that if A and B are Hermitian and A is positive semidefinite, then the polynomial η(λA, −B) has only real roots. G. Rublein and R. B. Bapat proved that this is true for n ≤ 3. Bapat also proved this result for any n with the condition that both A and B are tridiagonal. In this paper, we generalize some little-known results concerning the characteristic polynomials and adjacency matrices of trees to matrices whose graph is a given tree and prove the conjecture for any n under the additional assumption that both A and B are matrices whose graph is a tree.
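The quantity η(A, B) from the abstract can be evaluated directly from its definition by summing over all 2^n subsets. The sketch below is illustrative only (it is not code from the paper) and uses NumPy; the function name `eta` is our own choice.

```python
# Illustrative brute-force evaluation of eta(A, B) = sum_S det A(S) * det B(S'),
# where S ranges over all subsets of {0, ..., n-1} and S' is its complement.
# By convention, the determinant of an empty principal submatrix is 1.
from itertools import combinations
import numpy as np

def eta(A, B):
    n = A.shape[0]
    total = 0.0
    idx = range(n)
    for k in range(n + 1):
        for S in combinations(idx, k):
            Sc = [i for i in idx if i not in S]
            # np.ix_ extracts the principal submatrix on the given index set
            det_A = np.linalg.det(A[np.ix_(S, S)]) if S else 1.0
            det_B = np.linalg.det(B[np.ix_(Sc, Sc)]) if Sc else 1.0
            total += det_A * det_B
    return total
```

For example, with A = B = I (the n×n identity), every principal minor is 1, so η(I, I) = 2^n.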
URI: https://hdl.handle.net/10316/7709
DOI: 10.1007/s10958-006-0394-1
Rights: openAccess
Appears in Collections:FCTUC Matemática - Artigos em Revistas Internacionais

Files in This Item:
File: obra.pdf (133.92 kB, Adobe PDF)
