Jon Krohn is Chief Data Scientist at the machine learning company untapt. He authored the book Deep Learning Illustrated (Addison-Wesley, 2020), an instant #1 bestseller that has been translated into six languages. Jon is renowned for his compelling lectures, which he offers in person at Columbia University and New York University, as well as online via O'Reilly, YouTube, and the Super Data Science Podcast. Jon holds a PhD from Oxford and has been publishing on machine learning in leading academic journals since 2010; his papers have been cited over a thousand times.

Intermediate

Appreciate the role of algebra in machine and deep learning

Understand the fundamentals of linear algebra, a ubiquitous approach for solving for unknowns within high-dimensional spaces

Develop a geometric intuition of what's going on beneath the hood of machine learning algorithms, including those used for deep learning

Be able to more intimately grasp the details of machine learning papers as well as all of the other subjects that underlie ML, including calculus, statistics, and optimization algorithms

Manipulate tensors of all dimensionalities, including scalars, vectors, and matrices, in all of the leading Python tensor libraries: NumPy, TensorFlow, and PyTorch

Reduce the dimensionality of complex spaces down to their most informative elements with techniques such as eigendecomposition (eigenvectors and eigenvalues), singular value decomposition, and principal component analysis

Who Should Take This Course

Users of high-level software libraries (e.g., scikit-learn, Keras, TensorFlow) to train or deploy machine learning algorithms who would now like to understand the fundamentals underlying the abstractions, enabling them to expand their capabilities

Software developers who would like to develop a firm foundation for the deployment of machine learning algorithms into production systems

Data scientists who would like to reinforce their understanding of the subjects at the core of their professional discipline

Data analysts or AI enthusiasts who would like to become a data scientist or data/ML engineer and are keen to deeply understand from the ground up the field they're entering (very wise!)

Course Requirements

Mathematics: Familiarity with secondary school-level mathematics will make the course easier to follow. If you are comfortable dealing with quantitative information, such as understanding charts and rearranging simple equations, then you should be well prepared to follow along with all of the mathematics.

Programming: All code demos are in Python, so experience with it or another object-oriented programming language would be helpful for following along with the hands-on examples.

Lesson 1: Orientation to Linear Algebra

In Lesson 1, Jon starts with a definition of linear algebra. He then shows you how to use it to solve for unknowns in a system of linear equations. Next, he discusses why linear algebra is so crucial in modern machine learning, including deep learning. Finally, he finishes up with a brief history of algebra and some comprehension exercises.
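The kind of problem Lesson 1 tackles, solving for unknowns in a system of linear equations, can be sketched in a few lines of NumPy (the specific equations here are illustrative, not taken from the course):

```python
import numpy as np

# System of two linear equations in two unknowns:
#   4x + 2y = 10
#    x + 3y = 10
A = np.array([[4., 2.],
              [1., 3.]])   # coefficient matrix
b = np.array([10., 10.])   # right-hand side

x = np.linalg.solve(A, b)  # solve Ax = b for the unknowns
print(x)  # [1. 3.]
```

Substituting x = 1, y = 3 back into both equations confirms the solution.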

Lesson 2: Data Structures for Algebra

Lesson 2 focuses on tensors, the fundamental data structure of linear algebra. Jon starts off with zero-dimensional scalar tensors. Then, he covers one-dimensional vector tensors, including the topics of transposition, norms, and unit vectors, as well as the bases for orthogonal and orthonormal vectors. The lesson wraps up with two-dimensional matrix tensors, higher-dimensional n-dimensional tensors, and a few exercises.
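The tensor ranks and vector operations the lesson covers can be sketched in NumPy (the values below are arbitrary examples):

```python
import numpy as np

x = np.array(25)            # scalar: a rank-0 tensor
v = np.array([3., 4.])      # vector: a rank-1 tensor
M = np.array([[1., 2.],
              [3., 4.]])    # matrix: a rank-2 tensor

norm = np.linalg.norm(v)    # Euclidean (L2) norm: sqrt(3^2 + 4^2) = 5
unit = v / norm             # unit vector: same direction, length 1
print(norm, np.linalg.norm(unit))  # 5.0 1.0
```

The same tensors can be created with `tf.constant` in TensorFlow or `torch.tensor` in PyTorch; the rank and norm concepts carry over unchanged.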

Lesson 3: Common Tensor Operations

Lesson 3 is about common tensor operations, including transposition, basic tensor arithmetic, reduction, and the dot product. It finishes up with exercises.
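Each of these operations is a one-liner in NumPy; a minimal sketch with illustrative values:

```python
import numpy as np

A = np.array([[1, 2],
              [3, 4]])
v = np.array([1, 2])
w = np.array([3, 4])

A_T = A.T               # transposition: rows become columns
B = A + 2               # basic tensor arithmetic, applied element-wise
total = A.sum()         # reduction: collapse a tensor to a single value
dot = np.dot(v, w)      # dot product: 1*3 + 2*4 = 11
```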

Lesson 4: Solving Linear Systems

In Lesson 4 you take a brief break from hands-on code demos to learn how to solve systems of linear equations by hand. The focus is on substitution and elimination strategies. The lesson finishes with exercises to reinforce those concepts.
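A hand-worked elimination-and-substitution example (with illustrative equations, not the course's own) can be verified against NumPy's solver:

```python
import numpy as np

# System:  x + y = 6
#         2x - y = 3
# Elimination: add the two equations to cancel y -> 3x = 9 -> x = 3
# Substitution: plug x = 3 back into the first  -> y = 6 - 3 = 3
x, y = 3.0, 3.0

# Verify the hand solution against NumPy
A = np.array([[1., 1.],
              [2., -1.]])
b = np.array([6., 3.])
assert np.allclose(np.linalg.solve(A, b), [x, y])
```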

Lesson 5: Matrix Multiplication

Lesson 5 is about matrix multiplication. Matrix-by-vector multiplication is covered first, followed by matrix-by-matrix multiplication. Next, the concepts of symmetric and identity matrices are discussed, followed by exercises that show their relevance to matrix multiplication. Finally, Jon wraps up with an explanation of the critical role of matrix multiplication in machine learning and deep learning applications.
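A short NumPy sketch of these ideas, using arbitrary example matrices:

```python
import numpy as np

A = np.array([[1, 2],
              [3, 4]])
v = np.array([1, 1])

Av = A @ v              # matrix-by-vector multiplication: [3, 7]
AA = A @ A              # matrix-by-matrix multiplication

I = np.eye(2)           # identity matrix: multiplying by it changes nothing
assert np.allclose(A @ I, A)

S = A + A.T             # one simple way to construct a symmetric matrix
assert np.array_equal(S, S.T)
```

In machine learning, a fully connected neural-network layer is exactly a matrix-by-matrix multiplication of weights against a batch of inputs, which is why this operation is so central.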

Lesson 6: Special Matrices and Matrix Operations

Lesson 6 covers a number of special matrices and special matrix operations that are essential to machine learning. These include the Frobenius norm, matrix inversion, diagonal matrices, orthogonal matrices, and the trace operator.
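Each of these special matrices and operations has a direct NumPy counterpart; a minimal sketch with illustrative values:

```python
import numpy as np

A = np.array([[3., 0.],
              [0., 4.]])

fro = np.linalg.norm(A)        # Frobenius norm: sqrt(9 + 16) = 5
A_inv = np.linalg.inv(A)       # matrix inversion: A @ A_inv is the identity
assert np.allclose(A @ A_inv, np.eye(2))

D = np.diag([3., 4.])          # diagonal matrix built from a vector
tr = np.trace(A)               # trace operator: sum of diagonal entries = 7

Q = np.array([[0., 1.],        # orthogonal matrix: its transpose is its inverse
              [1., 0.]])
assert np.allclose(Q.T @ Q, np.eye(2))
```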

Lesson 7: Eigenvectors and Eigenvalues

Lesson 7 begins with Jon discussing what the eigenconcept is all about. He follows this with some exercises to warm you up for playing around with eigenvectors in Python, including high-dimensional eigenvectors.
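The defining property, that an eigenvector only gets scaled (by its eigenvalue) when the matrix is applied to it, can be checked directly in NumPy; the matrix below is an arbitrary example:

```python
import numpy as np

A = np.array([[2., 0.],
              [0., 3.]])

lambdas, V = np.linalg.eig(A)   # eigenvalues and eigenvectors (as columns of V)

# Each eigenvector v satisfies A @ v = lambda * v
for lam, v in zip(lambdas, V.T):
    assert np.allclose(A @ v, lam * v)
print(lambdas)  # [2. 3.]
```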

Lesson 8: Matrix Determinants and Eigendecomposition

Jon begins Lesson 8 by illustrating how to calculate the determinant of a 2 x 2 matrix as well as the determinant of larger matrices. This prepares you for being able to work on some exercises on determinants on your own. The second half of the lesson discusses the relationship between determinants and eigenvalues and provides an overview of the broad range of eigendecomposition applications in the real world.
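Both halves of the lesson, the 2 x 2 determinant formula and its link to eigenvalues, can be sketched in NumPy with an illustrative matrix:

```python
import numpy as np

A = np.array([[4., 2.],
              [1., 3.]])

# 2x2 determinant by hand: ad - bc = 4*3 - 2*1 = 10
det = np.linalg.det(A)
assert np.isclose(det, 10.)

# Relationship to eigenvalues: the determinant equals their product
lambdas, _ = np.linalg.eig(A)
assert np.isclose(np.prod(lambdas), det)
```

Here the eigenvalues work out to 5 and 2, whose product matches the determinant of 10.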

Lesson 9: Machine Learning with Linear Algebra

In Lesson 9 Jon helps you tie together many of the concepts you have been introduced to previously to power many useful machine learning applications. You learn to use singular value decomposition to compress a media file, the Moore-Penrose pseudoinverse to fit a regression, and principal component analysis to break down a dataset into its most influential components. Finally, Jon provides you with resources for your further study of linear algebra.
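A compact NumPy sketch of all three applications, using small synthetic data rather than the course's media file or dataset:

```python
import numpy as np

rng = np.random.default_rng(42)

# 1) SVD: keeping only the top singular value gives a low-rank
#    approximation, the core idea behind SVD-based media compression
A = rng.standard_normal((6, 4))
U, s, Vt = np.linalg.svd(A, full_matrices=False)
A_rank1 = s[0] * np.outer(U[:, 0], Vt[0])

# 2) Moore-Penrose pseudoinverse: fit a regression w = pinv(X) @ y
X = np.array([[1., 1.],
              [1., 2.],
              [1., 3.]])      # first column of ones models the intercept
y = np.array([2., 4., 6.])    # data generated by y = 2x exactly
w = np.linalg.pinv(X) @ y     # recovers intercept 0, slope 2
assert np.allclose(w, [0., 2.])

# 3) PCA in brief: the eigenvectors of the data's covariance matrix
#    are the principal components, ordered by explained variance
data = rng.standard_normal((100, 2))
evals, evecs = np.linalg.eigh(np.cov(data.T))
```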
