r/LinearAlgebra

Taking linear algebra next semester

I am taking linear algebra for the first time next semester. It's a 400-level class; the school offers a 200-level one, but I am required to take the 400-level version. I've never taken a linear algebra class before, so I don't really know what to expect.

reddit.com
u/camgame00 — 14 hours ago

Resources for learning advanced linear algebra

Hi everyone! I'm interested in linear algebra, and I have already done an intro course. I want to ask whether any of you know some resources for advanced linear algebra, such as change of basis, dual spaces and dual bases, tensor products, the spectral theorem, and the trace.

I know it seems like rather a lot, but I truly do want to learn as much as possible about these subjects. So far 3Blue1Brown hasn't helped me, and I was wondering whether there was anything else out there beyond maybe Strang. Thank you all in advance!

reddit.com
u/Cheap_Anywhere_6929 — 1 day ago
🔥 Hot ▲ 77 r/LinearAlgebra

Application of Linear Algebra: The Gram–Schmidt Process

This post is not about proof-based linear algebra in pure mathematics.
I would like to make it clear in advance that the scope here is limited to linear algebra as a bridge course for applications in finite-dimensional Hilbert spaces, such as quantum mechanics and quantum computing / quantum information theory.

If we have linearly independent vectors, the Gram–Schmidt process is a method for constructing new vectors that are mutually orthogonal while still spanning the same space.
In other words, it is a process for finding an easier basis to work with without changing the space itself.
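A minimal sketch of the process in plain Python (my own illustration, not from the original post; no libraries assumed): each new vector has its projections onto the previously built vectors subtracted away.

```python
# Classical Gram-Schmidt: turn linearly independent vectors into mutually
# orthogonal ones that still span the same space. Plain Python lists.

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def gram_schmidt(vectors):
    ortho = []
    for v in vectors:
        # Subtract the projection of v onto every vector built so far.
        w = list(v)
        for u in ortho:
            coeff = dot(v, u) / dot(u, u)
            w = [wi - coeff * ui for wi, ui in zip(w, u)]
        ortho.append(w)
    return ortho

basis = [[1.0, 1.0, 0.0], [1.0, 0.0, 1.0], [0.0, 1.0, 1.0]]
q = gram_schmidt(basis)
print(round(dot(q[0], q[1]), 10))   # 0.0: pairwise orthogonal
print(round(dot(q[0], q[2]), 10))   # 0.0
print(round(dot(q[1], q[2]), 10))   # 0.0
```

Dividing each output vector by its own length afterwards gives an orthonormal basis, which is the version usually wanted in the quantum-mechanics setting.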

I hope this helps.

u/TROSE9025 — 1 day ago

Learn Linear Algebra the new way

I’m developing a new way of learning linear algebra. Traditionally we watched lectures and solved problems in a textbook, but with the advent of AI this can be made more effective. Just to be clear, this doesn’t replace any of that, but it makes you more productive and provides a fun experience.

I’m using a combination of GitHub notes, NotebookLM and SageMath / Python with this.

This is free to use for all. No catch, this is my contribution to learning. If you find it useful, star my repo.

https://github.com/prashantkul/learn-linear-algebra

reddit.com

Linear algebra over the summer

I don’t have much planned for the summer, and I’d like to use that time productively rather than let it go to waste. I’ve been considering taking Linear Algebra during a 7-week summer session while working around 15–20 hours per week.

Do you think this would be manageable? How rigorous is the course typically over the summer, and about how many hours per day should I expect to dedicate to studying?

Thank you.

reddit.com
u/Forsaken-Device-2859 — 2 days ago
▲ 19 r/LinearAlgebra+1 crossposts

Any good course on Linear Algebra on Coursera, EDX or Udemy?

Hi, I'm looking for a really good course designed for learning Linear Algebra. I've been searching on Coursera Plus, edX, and Udemy. What do you suggest?

reddit.com
u/Lebdim45 — 3 days ago
🔥 Hot ▲ 172 r/LinearAlgebra+7 crossposts

[Off-Site] Three normals to a parabola hide a centroid that cannot leave the axis.

I've been thinking about a classical result in conic geometry that I think deserves more attention.

Take the parabola x² = 4ay. From any point Q = (h, k) inside the evolute, you can draw exactly three normals to the curve. Each normal meets the parabola at a foot, giving you three points, and those three points form a triangle.

The theorem: the centroid of that triangle always lies on the axis of the parabola.

The proof comes down to one beautiful observation. When you substitute Q into the normal equation x + ty = 2at + at³, you get the cubic

at³ + (2a − k)t − h = 0

There is no t² term. By Vieta's formulas, the sum of the roots is zero: t₁ + t₂ + t₃ = 0. Since the x-coordinate of the centroid is (2a/3)(t₁ + t₂ + t₃), it vanishes identically.

What's even nicer: the y-coordinate of the centroid works out to 2(k − 2a)/3 — it depends only on k, the height of Q. The horizontal position h disappears entirely. So if you slide Q left and right at fixed height, the centroid doesn't move at all. That's what the GIF shows.
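A quick numeric sanity check (my own sketch, not from the video): instead of solving the cubic, pick the three parameters first, force their sum to zero, and recover h and k from Vieta's formulas.

```python
import random

# Verify: for x^2 = 4ay, the three normal feet (2a*t_i, a*t_i^2) from a point
# Q = (h, k) have their centroid on the parabola's axis, at height 2(k - 2a)/3.
# Trick: pick t1, t2 freely and set t3 = -(t1 + t2) (the missing t^2 term),
# then read h and k off the cubic a*t^3 + (2a - k)*t - h = 0 via Vieta.

random.seed(0)
a = 1.5
t1, t2 = random.uniform(-2, 2), random.uniform(-2, 2)
t3 = -(t1 + t2)                       # sum of roots is zero by construction

e2 = t1 * t2 + t1 * t3 + t2 * t3      # sum of pairwise products = (2a - k)/a
e3 = t1 * t2 * t3                     # product of roots = h/a
k = 2 * a - a * e2
h = a * e3

feet = [(2 * a * t, a * t * t) for t in (t1, t2, t3)]
cx = sum(x for x, _ in feet) / 3
cy = sum(y for _, y in feet) / 3

print(abs(cx) < 1e-12)                         # True: centroid on the axis
print(abs(cy - 2 * (k - 2 * a) / 3) < 1e-12)   # True: height depends only on k
```

Changing the seed (i.e. sliding Q around) never moves cx off zero, which is the whole theorem in one line of Vieta.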

Here’s the full video:

https://youtu.be/BT8ByfN1SNo?si=Fo8Ii_KQOvWwSzjn

u/Nomadic_Seth — 5 days ago

Check my understanding of vector spaces

So I just started learning linear algebra on my own very recently. Right now I'm learning about vector spaces and just wanted a sanity check on what vector spaces are. To my understanding, a vector space is a collection of objects called vectors, and vectors in this context are essentially anything that we can scale by a constant and add together (they must also follow eight other axioms related to this, but that's the gist). And all possible linear combinations of these vectors must remain within this defined space. Is this a correct understanding (or at least approximately)? My other question is what exactly is meant by "all combinations of the vectors must remain within this space". I understand it intuitively, but how do we define the boundaries of this space? Or are the boundaries described by the combinations of the vectors within it?
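A tiny illustration of the closure idea (my own example, not part of the original question): the line y = 2x through the origin is closed under addition and scaling, while the shifted line y = 2x + 1 is not, so only the first can be a subspace of R².

```python
# Closure check: a subset of R^2 is only a candidate (sub)space if sums
# and scalar multiples of its members stay inside it.

def on_line_through_origin(v):      # W = {(x, 2x)}: passes through 0
    x, y = v
    return y == 2 * x

def on_shifted_line(v):             # W' = {(x, 2x + 1)}: misses the origin
    x, y = v
    return y == 2 * x + 1

u, v = (1, 2), (3, 6)               # both on y = 2x
s = (u[0] + v[0], u[1] + v[1])      # their sum (4, 8)
print(on_line_through_origin(s))    # True: the sum stays in the set

p, q = (0, 1), (1, 3)               # both on y = 2x + 1
t = (p[0] + q[0], p[1] + q[1])      # their sum (1, 4)
print(on_shifted_line(t))           # False: the sum leaves the set
```

So there is no "boundary" in the geometric sense: the set either absorbs every combination of its members or it isn't a vector space at all.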

reddit.com
u/RocketsAndRobots77 — 4 days ago
▲ 4 r/LinearAlgebra+1 crossposts

Rotation in 3D space: Y-axis

I'm really struggling to make a simple rotation about the axis that has no angle attached to it (neither theta nor phi). OK, I'm not good at linear algebra, but could you please help me?

axes = ThreeDAxes(
    x_range=[0,6,1],
    y_range=[0,6,1],
    z_range=[0,6,1],
    x_length=6,
    y_length=6,
    z_length=6,
)

self.set_camera_orientation(zoom=0.6)
labels = axes.get_axis_labels("p'", "q", "v")

I need to change the view from the default (q–p', i.e. y–x with y vertical) to q–v, i.e. y–z.
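I can't reconstruct the full scene from the snippet, but the underlying math of a y-axis rotation is just the standard 3×3 matrix; a stdlib sketch (right-handed convention, my own illustration):

```python
import math

# Rotation about the y-axis by angle theta (right-handed convention):
#   x' =  cos(theta) * x + sin(theta) * z
#   y' =  y                      (the y coordinate is untouched)
#   z' = -sin(theta) * x + cos(theta) * z

def rotate_y(v, theta):
    x, y, z = v
    c, s = math.cos(theta), math.sin(theta)
    return (c * x + s * z, y, -s * x + c * z)

# A quarter turn sends the +x direction to -z (and +z to +x).
v = rotate_y((1.0, 0.0, 0.0), math.pi / 2)
print([round(c, 10) for c in v])   # [0.0, 0.0, -1.0]
```

In Manim itself, if memory serves, `Rotate(mobject, angle, axis=UP)` should perform this same rotation about the y-axis, and `set_camera_orientation(phi=..., theta=...)` changes which plane you're looking at.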

reddit.com
u/Tau-Ceti-25 — 3 days ago
▲ 21 r/LinearAlgebra+1 crossposts

mod 5 mod 13. 8K image

An iteration from a modulo 5 CA used as the initial condition for a modulo 13 CA. Pretty tight geometric structure.

u/protofield — 3 days ago
🔥 Hot ▲ 210 r/LinearAlgebra+1 crossposts

The Hessian

Let H ∈ M_{2}(ℝ) be symmetric (by Clairaut), Q(x)=xᵀHx

left: z = Q(x)

right: Q(θ) = h(θ)ᵀHh(θ), h(θ) = (cosθ, sinθ)

i.e. restriction of Q to the unit circle (curvature by direction)

classification:

eigenvectors = principal directions

eigenvalues = values of Q along them
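A small numeric illustration of that classification (my own sketch with a made-up H, not the one in the plots): along a unit eigenvector, Q returns exactly the eigenvalue.

```python
import math

# Q(x) = x^T H x restricted to the unit circle h(theta) = (cos t, sin t).
# For H = [[2, 1], [1, 2]] the eigenpairs are lambda = 3 along (1, 1)/sqrt(2)
# and lambda = 1 along (1, -1)/sqrt(2).

H = [[2.0, 1.0], [1.0, 2.0]]

def Q(x):
    (a, b), (c, d) = H
    return a * x[0] ** 2 + (b + c) * x[0] * x[1] + d * x[1] ** 2

r = 1 / math.sqrt(2)
print(round(Q((r, r)), 10))    # 3.0: largest eigenvalue, max of Q on the circle
print(round(Q((r, -r)), 10))   # 1.0: smallest eigenvalue, min of Q on the circle
```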

more: MathNotes

u/CantorClosure — 6 days ago
🔥 Hot ▲ 61 r/LinearAlgebra

Any books like this, homies? Math books that give examples with images, real examples representing real-world phenomena, and that don't put you to sleep within two pages.

u/Background-Reach7499 — 5 days ago

I'm struggling to identify how to comprehensively write a proof

I feel like I've overwritten here, but at the same time I'm unsure whether my answer is even correct. Could someone comment on the accuracy of my proof, please?

u/_metaladder_ — 6 days ago
▲ 15 r/LinearAlgebra+3 crossposts

ndatafusion: linear algebra and ML for DataFusion, powered by nabled

Hello r/rust! I just released ndatafusion 🪐

The goal is to make Apache DataFusion a much more natural place for linear algebra and ML-style workloads.

ndatafusion brings numerical capabilities into DataFusion through explicit Arrow/DataFusion contracts, powered under the hood by nabled, my other Rust crate for linalg/ML workloads.

I built it because I wanted a cleaner answer to a pretty basic problem: if you’re already using Arrow and DataFusion, you shouldn’t have to leave that ecosystem the moment you need vector or matrix-oriented computation.

As far as I know, there still isn’t much in the Rust/DataFusion ecosystem aimed directly at this layer, which is part of why I wanted to get it out.

Links:

Would especially appreciate feedback from people working on Rust query engines, Arrow-native systems, SQL extensions, or ML/data infra.

reddit.com
u/moneymachinegoesbing — 4 days ago

Need help (yt videos, notes, ANY resource)

I am struggling in this class and desperately need an A, and full marks on my final to score said grade. Please help a fellow student out, for the love of God 🙏🏻🙏🏻🙏🏻

reddit.com
u/Flaky_Importance1378 — 3 days ago
🔥 Hot ▲ 142 r/LinearAlgebra

Linear Algebra: What Is the Inner Product, and Where Is It Used?

The inner product is one of the most important ideas in linear algebra, especially in many applied fields.

It measures, in a broad sense, how much two vectors overlap.

Its meaning is interpreted a little differently depending on the field, but what is common is that it helps define the structure of a vector space.

In quantum mechanics, the inner product is closely connected to normalization, probability, and unitary transformations.

Here, I try to connect these ideas step by step through Dirac’s bra-ket notation, geometric meaning, and matrix representation.
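As a minimal stdlib illustration of that quantum-mechanics connection (my own sketch, not from the original article): the complex inner product ⟨u|v⟩ = Σᵢ ūᵢvᵢ gives norms, normalization, and Born-rule probabilities.

```python
import math

# Complex inner product <u|v> = sum(conj(u_i) * v_i); the bra conjugates.
def inner(u, v):
    return sum(a.conjugate() * b for a, b in zip(u, v))

# Normalize a state so that <psi|psi> = 1.
def normalize(psi):
    n = math.sqrt(inner(psi, psi).real)
    return [c / n for c in psi]

psi = normalize([1 + 1j, 1 - 1j])             # unnormalized state in C^2
print(round(inner(psi, psi).real, 10))        # 1.0: normalized

# Born rule: probability of measuring basis state |i> is |<i|psi>|^2.
basis = [[1, 0], [0, 1]]
probs = [abs(inner(e, psi)) ** 2 for e in basis]
print(round(sum(probs), 10))                  # 1.0: probabilities sum to 1
```

Unitary transformations are exactly the maps that preserve this inner product, which is why they preserve total probability.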

By Taeryeon.


u/TROSE9025 — 7 days ago

Is the decomposition of a vector dependent on the inner product space?

Currently taking linear algebra and I am dealing with inner product spaces, specifically with orthogonal and orthonormal bases. For the Gram–Schmidt process, I understand everything except literally the first step: decomposing the vector.

What I understand: the standard basis, let's say in R^2, is a basis that is orthonormal. Any vector within the space can be decomposed into its corresponding x and y components using r·cos(theta) and r·sin(theta) respectively.

But how does this work if the basis isn't:

(1) unit orthogonal

and

(2) standard

Additionally, does the first step of the Gram–Schmidt process involve a decomposition, and if it does, am I thinking of it properly?

I am not looking for anything formal at all.

please try and keep it simple if you can.

thank you very much!
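Not from the post, but a concrete stdlib illustration of the difference being asked about: with an orthogonal basis the coefficients come straight from inner products, cᵢ = ⟨v, uᵢ⟩/⟨uᵢ, uᵢ⟩, while for a general basis you have to solve a linear system instead.

```python
# Decomposing v in two different bases of R^2.

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

v = [3.0, 1.0]

# Orthogonal (not unit) basis: coefficients come from projections alone.
u1, u2 = [1.0, 1.0], [1.0, -1.0]
c1 = dot(v, u1) / dot(u1, u1)        # 2.0
c2 = dot(v, u2) / dot(u2, u2)        # 1.0
print([c1 * a + c2 * b for a, b in zip(u1, u2)])   # [3.0, 1.0]: recovers v

# Non-orthogonal basis: projection coefficients alone fail; instead solve
#   x * w1 + y * w2 = v   (here by Cramer's rule for the 2x2 case).
w1, w2 = [1.0, 0.0], [1.0, 1.0]
det = w1[0] * w2[1] - w2[0] * w1[1]
x = (v[0] * w2[1] - w2[0] * v[1]) / det
y = (w1[0] * v[1] - v[0] * w1[1]) / det
print([x, y])                        # [2.0, 1.0]: v = 2*w1 + 1*w2
```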

reddit.com
u/wbld — 12 days ago

Symbolic matrix analyzer (exact eigenvalues, diagonalization, structure detection)

I’ve been playing around with a symbolic matrix analyzer that goes a bit beyond the usual numeric tools.

It handles things like:

  • exact eigenvalues/eigenvectors (with parameters like β, γ, etc.)
  • symbolic diagonalization
  • recognizing structures (e.g. Hadamard, Pauli, Lorentz boosts)
  • clean factorization of expressions instead of messy outputs

Might be useful if you’re working with parametric matrices or teaching concepts where numeric approximations get in the way:

https://www.dubiumlabs.com/en/mathematics/symbolic-matrix-analyzer

Curious how it compares to what you usually use.
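Nothing to do with your tool specifically, but for readers without a CAS: a symmetric 2×2 matrix has exact eigenvalues (a+d)/2 ± √(((a−d)/2)² + b²), which stdlib fractions can keep exact whenever the discriminant is a perfect square of a rational (my own sketch, covering only that rational case):

```python
from fractions import Fraction as F
import math

# Exact eigenvalues of a symmetric 2x2 matrix [[a, b], [b, d]]:
#   lambda = (a + d)/2 +- sqrt(((a - d)/2)^2 + b^2)
def sym2x2_eigenvalues(a, b, d):
    a, b, d = F(a), F(b), F(d)
    mean = (a + d) / 2
    disc = ((a - d) / 2) ** 2 + b ** 2
    # Stay exact when the discriminant is a perfect square of a rational.
    root = F(math.isqrt(disc.numerator), math.isqrt(disc.denominator))
    if root * root != disc:
        raise ValueError("irrational eigenvalues; needs a symbolic sqrt")
    return mean - root, mean + root

print(sym2x2_eigenvalues(2, 1, 2))    # (Fraction(1, 1), Fraction(3, 1))
```

The irrational and parametric cases are exactly where a symbolic tool like yours earns its keep.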

u/dubiumlb — 10 days ago

column space basis question

For part (b)(i), I used the method of transposing A, row-reducing Aᵀ, and taking the non-zero rows of RREF(Aᵀ) as the basis vectors. Is that correct or not? Part (ii) is below as well.

Part (i) — bases NOT necessarily from A:

Row space basis → take non-zero rows of RREF(A): { (1, 0, −1), (0, 1, 0) }

Column space basis → transpose A, row-reduce Aᵀ, take non-zero rows: { (1, 0, −5, −3), (0, 1, 2, 1) }

Part (ii) — bases that ARE rows/columns of A:

Row space basis → take the pivot rows (rows 1 & 2) from the original A: { (1, 2, −1), (1, 9, −1) }

Column space basis → RREF says cols 1 & 2 are pivots → take those columns from original A: { (1, 1, −3, −2), (2, 9, 8, 3) }

This is basically my answer, but my teacher marked it wrong, so kindly let me know.

https://preview.redd.it/fvsckc9dbeug1.png?width=1170&format=png&auto=webp&s=9b907a100d9e6f89e9b448b1fd4727b1ae35eb08
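The attached image isn't visible here, but the quoted rows and columns pin down a candidate A (the RREF row (1, 0, −1) forces the third column to be −1 times the first). Under that reconstruction, an exact rank check over the rationals says both claimed bases span the right spaces:

```python
from fractions import Fraction

def rank(rows):
    # Gaussian elimination over the rationals (exact, no rounding).
    m = [[Fraction(x) for x in r] for r in rows]
    r = 0
    for c in range(len(m[0])):
        piv = next((i for i in range(r, len(m)) if m[i][c] != 0), None)
        if piv is None:
            continue
        m[r], m[piv] = m[piv], m[r]
        for i in range(len(m)):
            if i != r and m[i][c] != 0:
                f = m[i][c] / m[r][c]
                m[i] = [a - f * b for a, b in zip(m[i], m[r])]
        r += 1
    return r

# A reconstructed from the rows/columns quoted above (an assumption:
# the third column is -1 times the first, as row space basis (1, 0, -1) implies).
A = [[1, 2, -1],
     [1, 9, -1],
     [-3, 8, 3],
     [-2, 3, 2]]
AT = list(map(list, zip(*A)))

row_basis = [[1, 0, -1], [0, 1, 0]]
col_basis = [[1, 0, -5, -3], [0, 1, 2, 1]]

# Same span <=> stacking the claimed basis vectors doesn't raise the rank.
print(rank(A) == rank(A + row_basis) == 2)     # True: row space checks out
print(rank(AT) == rank(AT + col_basis) == 2)   # True: column space checks out
```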

reddit.com
u/Kind-Ant-8221 — 11 days ago