What is a scalar?
Define a vector.
What is a matrix?
What is a tensor?
How do you represent a vector geometrically?
What is the difference between a row vector and a column vector?
What is a diagonal matrix?
What is an identity matrix?
What is a zero matrix?
What is a symmetric matrix?
What is a skew-symmetric matrix?
Define an orthogonal matrix.
What is a triangular matrix?
What is the trace of a matrix?
What does it mean for a matrix to be invertible?
What is matrix addition?
What is scalar multiplication?
Define matrix multiplication.
When is matrix multiplication possible?
Is matrix multiplication commutative?
What is the associative property of matrix multiplication?
What is the distributive property?
What is the transpose of a matrix?
What are the properties of matrix transpose?
What is the inverse of a matrix?
What are the conditions for a matrix to have an inverse?
What is a singular matrix?
What is an orthogonal matrix?
How do you compute the determinant of a 2x2 matrix?
How do you compute the determinant of a 3x3 matrix?
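A quick NumPy sketch for the two determinant questions above; the matrices are made-up examples, and `np.linalg.det` handles the general case:

```python
import numpy as np

A2 = np.array([[3.0, 1.0],
               [2.0, 4.0]])
# 2x2 rule: ad - bc = 3*4 - 1*2 = 10
det2 = np.linalg.det(A2)

A3 = np.array([[1.0, 2.0, 3.0],
               [0.0, 1.0, 4.0],
               [5.0, 6.0, 0.0]])
# 3x3 via cofactor (Laplace) expansion along the first row:
# 1*(1*0 - 4*6) - 2*(0*0 - 4*5) + 3*(0*6 - 1*5) = -24 + 40 - 15 = 1
det3 = np.linalg.det(A3)
```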
What is the dot product of two vectors?
What is the cross product?
What does it mean for two vectors to be orthogonal?
What is the angle between two vectors?
What is the projection of one vector onto another?
What is a unit vector?
What is a basis vector?
How do you normalize a vector?
What is a vector space?
What is a subspace?
What is linear independence?
What is linear dependence?
How do you check if vectors are linearly independent?
What is the span of a set of vectors?
What is a basis of a vector space?
What is the dimension of a vector space?
Give an example of a dependent set of vectors.
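One way to answer the independence-check questions above: stack the vectors as rows and compare the matrix rank to the number of vectors. A minimal sketch with an illustrative dependent set:

```python
import numpy as np

v1 = np.array([1.0, 0.0, 0.0])
v2 = np.array([0.0, 1.0, 0.0])
v3 = np.array([1.0, 1.0, 0.0])   # v3 = v1 + v2, so the set is dependent

M = np.vstack([v1, v2, v3])
# Independent iff the rank equals the number of vectors
independent = np.linalg.matrix_rank(M) == M.shape[0]
```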
What is the rank of a matrix?
How do you compute the rank?
What does full rank mean?
What is the null space (kernel) of a matrix?
How do you find the nullity of a matrix?
What is the column space?
What is the row space?
What is the orthogonal complement?
What is an eigenvalue?
What is an eigenvector?
How do you compute eigenvalues?
What is the characteristic polynomial?
How do you find eigenvectors?
What is geometric multiplicity?
What is algebraic multiplicity?
What are some properties of eigenvalues?
Why are eigenvalues important in machine learning?
What is the spectral theorem?
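The eigenvalue questions above can be checked numerically. A small sketch with a symmetric example matrix, so `np.linalg.eigh` applies and the eigenvalues are real:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])      # symmetric example matrix
# Characteristic polynomial: (2 - l)^2 - 1 = 0  ->  l = 1 or l = 3
vals, vecs = np.linalg.eigh(A)  # eigh is for symmetric/Hermitian matrices
# Each column v of `vecs` satisfies A @ v = lambda * v
```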
What is LU decomposition?
What is QR decomposition?
What is SVD (Singular Value Decomposition)?
What is the purpose of matrix decomposition?
What is the Cholesky decomposition?
What is the difference between LU and QR?
What is the Gram-Schmidt process?
How is SVD used in data compression?
What is the Jordan Normal Form?
What is the Schur decomposition?
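A short sketch of the thin SVD and rank-k truncation used in data compression, on a random example matrix; the truncation is the best rank-k approximation by the Eckart-Young theorem:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((6, 4))                   # random example matrix
U, s, Vt = np.linalg.svd(A, full_matrices=False)  # thin SVD: U is 6x4

k = 2                                             # keep top-2 singular values
A_k = U[:, :k] @ np.diag(s[:k]) @ Vt[:k, :]       # rank-2 approximation
```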
What is the determinant of a matrix?
What is the physical meaning of a determinant?
How do you find the inverse of a 2x2 matrix?
How do you find the inverse using row reduction?
What is the adjugate (classical adjoint) of a matrix?
How is the determinant used to check invertibility?
What is a cofactor?
What is the Laplace expansion?
Can a non-square matrix be inverted?
What is a pseudoinverse?
What is the Moore-Penrose pseudoinverse?
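A minimal illustration of the Moore-Penrose pseudoinverse for a non-square matrix, which has no ordinary inverse:

```python
import numpy as np

# A tall matrix has no ordinary inverse; the pseudoinverse A+ still
# exists and satisfies A @ A+ @ A == A.
A = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])
A_pinv = np.linalg.pinv(A)   # shape (2, 3)
```

Because this example has full column rank, the pseudoinverse also acts as a left inverse.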
What is a block matrix?
What is the Kronecker product?
What is the Hadamard product?
What is matrix exponentiation?
What is a condition number?
What is a rank-deficient matrix?
What is a positive definite matrix?
What is a diagonalizable matrix?
What is matrix similarity?
What is a vector norm?
What is the L1 norm?
What is the L2 norm?
What is the infinity norm?
What is the Frobenius norm?
How are norms used in machine learning?
What is the distance between two vectors?
How is cosine similarity calculated?
What is the relationship between norm and distance?
How does normalization affect vector norms?
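The norm, distance, and similarity questions above in one small NumPy sketch (vector values are illustrative):

```python
import numpy as np

v = np.array([3.0, 4.0])
l1 = np.linalg.norm(v, 1)          # L1:   |3| + |4|  = 7
l2 = np.linalg.norm(v)             # L2:   sqrt(9+16) = 5
linf = np.linalg.norm(v, np.inf)   # Linf: max(3, 4)  = 4

u = v / l2                         # normalization: unit vector, same direction

w = np.array([4.0, 3.0])
cos_sim = v @ w / (np.linalg.norm(v) * np.linalg.norm(w))  # 24/25 = 0.96
dist = np.linalg.norm(v - w)       # distance = norm of the difference
```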
What is an orthogonal set of vectors?
What is orthonormality?
How do you orthogonalize a set of vectors?
What is the projection matrix?
How do you project a vector onto a subspace?
What is the geometric interpretation of a projection?
When is a projection matrix idempotent?
What is the Gram matrix?
What is the orthogonal projection theorem?
How is projection used in least squares?
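A sketch of the projection-matrix construction P = A (A^T A)^(-1) A^T discussed above, assuming A has full column rank; the example columns are made up:

```python
import numpy as np

# Projection onto the column space of A (assumes full column rank)
A = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [0.0, 0.0]])
P = A @ np.linalg.inv(A.T @ A) @ A.T

b = np.array([1.0, 2.0, 3.0])
p = P @ b   # nearest point to b in the span of A's columns
```

Note that P is symmetric and idempotent (P @ P == P), which is the defining test for an orthogonal projection matrix.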
What is the least squares solution?
Why is least squares used in linear regression?
What is the normal equation?
How do you derive the least squares estimator?
What happens if the design matrix is not full rank?
What is the role of the pseudoinverse in least squares?
What is the residual vector?
How do you minimize the residual?
What is the cost function in linear regression?
What is the relationship between projection and least squares?
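The normal equation and NumPy's SVD-based solver can be compared directly; a minimal sketch on a made-up exact-fit dataset:

```python
import numpy as np

# Fit y ~ c0 + c1*x; the design matrix X gets a column of ones for the intercept.
x = np.array([0.0, 1.0, 2.0, 3.0])
y = np.array([1.0, 3.0, 5.0, 7.0])   # made-up data lying exactly on y = 1 + 2x
X = np.column_stack([np.ones_like(x), x])

beta_ne = np.linalg.solve(X.T @ X, X.T @ y)      # normal equation
beta_ls, *_ = np.linalg.lstsq(X, y, rcond=None)  # SVD-based, more stable
residual = y - X @ beta_ls                       # residual vector
```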
What is a linear transformation?
What is the matrix representation of a transformation?
What is a standard basis?
What is a change of basis?
How do you change a vector to a new basis?
What is a transition matrix?
How are linear transformations represented in different bases?
What is the role of similarity transformations?
What is the canonical form?
What is the matrix of a reflection or rotation?
What is a symmetric matrix?
What is a skew-symmetric matrix?
What is a positive definite matrix?
What is a positive semi-definite matrix?
How do you test for positive definiteness?
What are the properties of symmetric matrices?
What are applications of positive definite matrices in ML?
What is an idempotent matrix?
What is a nilpotent matrix?
What is the Cayley-Hamilton Theorem?
What is Gaussian elimination?
What is Gauss-Jordan elimination?
What is row echelon form?
What is reduced row echelon form?
What is pivoting?
What are leading and free variables?
What is backward substitution?
What is forward substitution?
What is the computational complexity of matrix multiplication?
What is sparse matrix representation?
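The elimination steps asked about above (pivoting, forward elimination, backward substitution) can be sketched as a small routine; this is an illustrative implementation, not production code:

```python
import numpy as np

def gaussian_solve(A, b):
    """Solve Ax = b by Gaussian elimination with partial pivoting."""
    A = A.astype(float).copy()
    b = b.astype(float).copy()
    n = len(b)
    for k in range(n - 1):
        # Partial pivoting: bring the largest remaining pivot to row k
        p = k + np.argmax(np.abs(A[k:, k]))
        A[[k, p]], b[[k, p]] = A[[p, k]], b[[p, k]]
        # Forward elimination: zero out the entries below the pivot
        for i in range(k + 1, n):
            m = A[i, k] / A[k, k]
            A[i, k:] -= m * A[k, k:]
            b[i] -= m * b[k]
    # Backward substitution on the upper-triangular system
    x = np.zeros(n)
    for i in range(n - 1, -1, -1):
        x[i] = (b[i] - A[i, i + 1:] @ x[i + 1:]) / A[i, i]
    return x

A = np.array([[2.0, 1.0, -1.0],
              [-3.0, -1.0, 2.0],
              [-2.0, 1.0, 2.0]])
b = np.array([8.0, -11.0, -3.0])
x = gaussian_solve(A, b)   # known solution: (2, 3, -1)
```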
What is SVD used for in NLP?
How is PCA related to eigenvectors?
What is the Eckart-Young theorem?
What is the thin SVD?
What is truncated SVD?
What is the application of QR decomposition in ML?
How does Cholesky compare to LU?
What is the Householder transformation?
What is the Givens rotation?
What is the role of decomposition in solving linear systems?
What is a dual space?
What is a linear functional?
What is the relationship between dual basis and basis?
How are linear maps between duals represented?
What is reflexivity in linear algebra?
How are eigenvectors used in spectral clustering?
What is the Laplacian matrix in graph theory?
What are principal components?
How is PCA implemented using eigen decomposition?
How does dimensionality reduction work in PCA?
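A minimal sketch of PCA via eigendecomposition of the covariance matrix, as asked above (random data for illustration):

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.standard_normal((100, 3))   # toy data: 100 samples, 3 features
X = X - X.mean(axis=0)              # center each feature

C = X.T @ X / (X.shape[0] - 1)      # sample covariance matrix (3 x 3)
vals, vecs = np.linalg.eigh(C)      # eigh: symmetric, ascending eigenvalues
order = np.argsort(vals)[::-1]      # largest variance first
components = vecs[:, order]         # principal components as columns

Z = X @ components[:, :2]           # reduce 3 features to 2
```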
What is a vector subspace?
What is the annihilator of a subspace?
What is the quotient space?
What is the rank-nullity theorem?
What is a bilinear form?
What is a quadratic form?
How do you diagonalize a quadratic form?
What is matrix congruence?
What is orthogonal diagonalization?
What is a linear operator?
Why is linear algebra essential in machine learning?
How is linear algebra used in neural networks?
What is the role of dot product in attention mechanisms?
How are matrices used in image processing?
What is the Jacobian matrix in deep learning?
What is the Hessian matrix?
How is SVD used in recommendation systems?
How are tensors used in deep learning?
What is the shape of input data for neural networks?
How does dimensionality reduction improve performance?
What is a tensor?
What is the rank of a tensor?
How are tensors represented?
What are tensor contractions?
What is tensor decomposition?
Can a matrix have more than one inverse?
Can a non-square matrix be orthogonal?
Can a set of linearly independent vectors form a basis?
Is every orthonormal set linearly independent?
Can two vectors be orthogonal but not linearly independent?
What is the geometric meaning of a determinant?
How is the rank related to dimensionality?
How do transformations affect shapes and dimensions?
What is shearing?
What is scaling in linear transformations?
Difference between rank and dimension?
Compare dot product and cross product.
Difference between linear map and affine map?
Compare eigen decomposition and SVD.
Difference between orthogonal and orthonormal?
True or False: All orthogonal matrices are invertible.
True or False: A matrix with zero determinant is invertible.
True or False: Eigenvalues can be complex.
True or False: A matrix can be diagonalized only if it's square.
True or False: The transpose of a symmetric matrix is symmetric.
Describe how a matrix transforms a vector.
Explain the intuition behind the dot product.
Describe how matrix rank affects system solutions.
Explain the steps to solve a linear system using Gaussian elimination.
Describe why PCA reduces noise.
How do you compute a matrix inverse in Python (NumPy)?
How do you compute eigenvalues in NumPy?
How do you use SVD in scikit-learn?
How do you create a projection matrix using NumPy?
How do you find the rank of a matrix programmatically?
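Hedged one-liners for the NumPy questions above; the example matrix is arbitrary, and scikit-learn's `TruncatedSVD` is noted in a comment rather than imported:

```python
import numpy as np

A = np.array([[4.0, 2.0],
              [1.0, 3.0]])   # arbitrary example matrix

A_inv = np.linalg.inv(A)               # matrix inverse
eigvals = np.linalg.eigvals(A)         # eigenvalues only
U, s, Vt = np.linalg.svd(A)            # singular value decomposition
rank = np.linalg.matrix_rank(A)        # numerical rank
P = A @ np.linalg.inv(A.T @ A) @ A.T   # projection onto col(A); identity
                                       # here, since A is invertible

# scikit-learn wraps SVD-based reduction, e.g.:
# from sklearn.decomposition import TruncatedSVD
# Z = TruncatedSVD(n_components=1).fit_transform(A)
```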
Given 3 vectors, check if they are linearly independent.
Find the eigenvalues of a 2x2 matrix manually.
Solve a 3x3 linear system using matrix inversion.
Given a matrix, compute its rank.
Reduce a matrix to row echelon form.
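Two of the exercises above worked through in NumPy: eigenvalues of a 2x2 matrix via the characteristic polynomial, and a 3x3 system solved by inversion (example numbers are made up):

```python
import numpy as np

# Eigenvalues of a 2x2 matrix "manually": solve l^2 - trace*l + det = 0
A = np.array([[4.0, 1.0],
              [2.0, 3.0]])
tr, det = np.trace(A), np.linalg.det(A)
disc = np.sqrt(tr ** 2 - 4 * det)
lam1, lam2 = (tr + disc) / 2, (tr - disc) / 2   # quadratic formula

# Solve a 3x3 system by inversion (fine for tiny systems;
# np.linalg.solve is preferred numerically)
B = np.array([[1.0, 2.0, 0.0],
              [0.0, 1.0, 1.0],
              [2.0, 0.0, 1.0]])
rhs = np.array([3.0, 2.0, 3.0])
x = np.linalg.inv(B) @ rhs
```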
What is the determinant of an identity matrix?
What is the transpose of a diagonal matrix?
What is the rank of a zero matrix?
What is the nullity of an invertible matrix?
What is the dimension of R³?
Why are orthogonal matrices preferred in numerical computations?
Why is SVD preferred over eigen decomposition in some cases?
How is linear algebra related to convolution in CNNs?
How do singular matrices affect training in ML models?
What matrix operations are most common in backpropagation?
What is a Vandermonde matrix?
What is the companion matrix?
What is a Toeplitz matrix?
What is a circulant matrix?
What are real-world examples of linear algebra in AI?