Stathol wrote:
For some reason I was thinking you were in the Austin area now. I don't know anything about UTRGV. If they've only semi-recently come under the UT system umbrella, things might be different there. UTD is really heavily pushing for that whole "tier 1 university, STEM master race" kind of thing.
It's only in the last... 5 years? More or less.
Diamondeye wrote:
one of the kids next to me in the class tonight recommended a book on data structures
Quote:
Let me guess: CLRS ("Introduction to Algorithms" by Cormen, Leiserson, Rivest, and Stein)? It's a good reference book, but I'm not a huge fan of it in terms of learning algorithms unless you are an abstract thinker by nature. Most people aren't; certainly I'm not. That's mostly what made advanced algorithms and graph algorithms difficult, at least for me. Add to that these truths: "there are no algorithms for devising algorithms" and "there are no algorithms for proving algorithms". Well, there are automated proving systems, but these are proof validators not proof inventors. I.e. if you can convince the Coq Proof Assistant that your algorithm satisfies proposition P, then it really does satisfy proposition P.
Anyway, complaints about CLRS aside, I don't really have a good book recommendation for learning algorithms, either. I got a lot of mileage out of YouTube, frankly. Some of the lectures and demos you'll find there are far more cogent than anything you'll get in a book or college lecture, in all honesty.
The book is called Data Structures and Algorithm Analysis in C++ by Mark Allen Weiss. All of the professor's examples are in C++. I'll keep that other one you mentioned in mind, though, for the future.
Diamondeye wrote:
Since you've mentioned this a couple times -- I wouldn't worry about linear algebra. Like, at all. I'm not trying to talk you out of learning it (knowledge is good for the sake of knowledge, etc., etc.), but from a purely pragmatic point of view, if you know how to do basic matrix multiplication, you're probably good to go for anything you'll encounter in grad school -- maybe even including Linear Programming, if they even offer that as a separate course. We got into Linear Programming a little in the Graph Algorithms course (CS6381) because Primal/Dual algorithms have a lot of applications in that context, but even that was nothing more complicated than matrix multiplication. Other than that, the only time linear algebra has come up at all was in undergrad algorithms. The matrix chain multiplication problem is a common example for dynamic programming. I can't vouch for machine learning because I haven't taken that. I've got a few hours left and that might be one of the things I fill in to finish.
The professor has mentioned it a few times. Also, on the first homework he had this problem:
Code:
<description of Hadamard matrices; H_k refers to a Hadamard matrix of order k > 0>
Show that if v is a column vector of length n = 2^k, then the matrix-vector product H_k v [sorry about the awkward typing of subscripts] can be calculated using O(n log n) operations. Assume that all numbers involved are small enough that basic arithmetic operations like addition and multiplication take unit time.
My answer was:
(I apologize if anything is unclear; the board wiped out the superscripts and subscripts that I had in the Word document)
Quote:
Code:
For any Hadamard matrix H_k (k > 0), the recursive block structure is:

      | H_{k-1}   H_{k-1} |
H_k = |                   |
      | H_{k-1}  -H_{k-1} |

Split the column vector v of length n = 2^k into halves v_1 and v_2, each of length n/2. Then:

H_k v = ( H_{k-1} v_1 + H_{k-1} v_2 ,  H_{k-1} v_1 - H_{k-1} v_2 )

so each of the two products H_{k-1} v_1 and H_{k-1} v_2 is computed once and reused, giving two recursive subproblems of size n/2 plus O(n) additions and subtractions to combine them.

For T(n) = aT(n/b) + cn^r: a = 2 (recursive products per level), b = 2 (each subproblem is half the size), r = 1 (combining is linear in n).
For 2T(n/2) + cn: log_b a = 1, r = 1.
If log_b a = r, then T(n) = O(n^r log n).
log_b a = 1 = r, therefore T(n) = O(n log n).
This all was, indeed, in the unit on dynamic programming.