Help with this notation -- some sort of norm?


Homework Help Overview

The discussion revolves around understanding specific mathematical notation related to norms, particularly the squared 2-norm in the context of machine learning and least squares estimation. The original poster seeks clarification on various symbols encountered while self-studying the subject.

Discussion Character

  • Conceptual clarification, Assumption checking

Approaches and Questions Raised

  • Participants explore the implications of minimizing the squared euclidean length of a vector and question why this approach is valid. The original poster also raises inquiries about specific symbols, such as the meaning of a "1 with a vertical line through its back."

Discussion Status

Participants are actively engaging with the notation and concepts, with some offering guidance on resources for better understanding. There is an acknowledgment of the variability in notation across different texts, which prompts further exploration of definitions and meanings.

Contextual Notes

The original poster has a background in electrical engineering but is encountering challenges with mathematical symbols in machine learning. There is a mention of the need for consistent notation in learning materials.

SELFMADE:
I need help understanding this notation. What does it mean?

The square of the 2-norm?

1. Homework Statement

[attached image: an expression of the form ##\min_{\mathbf w} \|\mathbf{y - Xw}\|_2^2##]


Thanks
 
Yes. Say ##\mathbf b = \mathbf{y - Xw}##, and assume ##\mathbf b## is real valued for this example. Say you want to minimize the euclidean length of ##\mathbf b##. You write that as ##\min \big(\mathbf b^T \mathbf b\big)^\frac{1}{2}##. But square roots are unpleasant to work with, so you then recognize that if you minimize the squared euclidean length of ##\mathbf b##, then that must also minimize the euclidean length of ##\mathbf b##. (Why must this be the case?) Hence you recover the problem above, which reads as ##\min \big(\mathbf b^T \mathbf b\big)##.
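A quick numerical sketch of the monotone-transform point above, using numpy and hypothetical made-up data: because the square root is strictly increasing, minimizing ##\big(\mathbf b^T \mathbf b\big)^{1/2}## and minimizing ##\mathbf b^T \mathbf b## pick out the same ##\mathbf w##.

```python
import numpy as np

# Hypothetical 1-D least-squares problem: b(w) = y - x*w for a scalar w.
x = np.array([1.0, 2.0, 3.0])
y = np.array([2.0, 4.1, 5.9])

ws = np.linspace(-5.0, 5.0, 10001)                     # grid of candidate w values
sq = np.array([np.sum((y - x * w) ** 2) for w in ws])  # squared length b^T b
ln = np.sqrt(sq)                                       # euclidean length (b^T b)^(1/2)

# sqrt is strictly increasing on [0, inf), so both objectives
# attain their minimum at the same w on the grid.
print(ws[np.argmin(sq)] == ws[np.argmin(ln)])  # prints True
```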

What you have there is the setup for the Normal Equations and for ordinary least squares estimation. There are two approaches to deriving the solution of an over-determined system of equations: one involves calculus, the other involves wielding orthogonality. Both approaches are worth understanding and thinking about.
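The calculus route can be sketched as follows: setting the gradient of ##(\mathbf{y - Xw})^T(\mathbf{y - Xw})## to zero gives the Normal Equations ##\mathbf X^T \mathbf X \mathbf w = \mathbf X^T \mathbf y##. A minimal check with numpy and hypothetical synthetic data (the matrix sizes and noise level are invented for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical over-determined system: 10 equations, 2 unknowns.
X = rng.normal(size=(10, 2))
w_true = np.array([1.5, -0.7])
y = X @ w_true + 0.1 * rng.normal(size=10)

# Calculus route: solve the Normal Equations  X^T X w = X^T y.
w_normal = np.linalg.solve(X.T @ X, X.T @ y)

# Library route: np.linalg.lstsq minimizes the same ||y - Xw||_2^2.
w_lstsq, *_ = np.linalg.lstsq(X, y, rcond=None)

print(np.allclose(w_normal, w_lstsq))  # prints True
```

In practice `lstsq` (QR/SVD based) is preferred over forming ##\mathbf X^T \mathbf X## explicitly, since the latter squares the condition number; the Normal Equations are mainly a tool for understanding.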
 
Thank you for your reply. So my hunch was right.

I am learning Machine Learning by myself. I have BSEE but I am encountering many symbols/notations that I don't understand.

For example, what does the "1 with a vertical line through its back" mean?

As far as I know, E stands for expected value.

[attached image: an expression involving ##\mathbb E## and a blackboard-style ##1##]


Thanks
 
SELFMADE said:
For example, what does the "1 with a vertical line through its back" mean?

My recommendation is to find a good text like "Learning From Data" and learn from that text (plus its e-chapters). (The book is quite cheap at $30 in the US, though, due to peculiarities with licensing, around $100 in Canada.) There is an associated free course with the same title at work.caltech.edu, and also on the iTunes store.

More to the point: a good text will have an appendix that lists and defines all the notation that it uses. Unfortunately notation is not standardized or uniform between authors.

A ##\mathbf 1## will tend to mean a ones vector, an indicator function, or sometimes even the identity matrix. Here it is an indicator function. I personally prefer ##\mathbf 1## for the ones vector, ##\mathbf I## for the identity matrix, and ##\mathbb I(Y = 1)## for an indicator function, but the fundamental issue is the non-homogeneity of notation in the field. Again, my solution is that you can homogenize things when starting off by picking one good source to learn from that has its own consistent notation. (Then, once you've mastered that one source, you can much more easily infer or guess other people's notation as you branch out.)
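To make the indicator concrete, here is a minimal numpy sketch with hypothetical Bernoulli data (the probability and sample size are invented for illustration). The indicator ##\mathbb 1(Y = 1)## is 1 when the event holds and 0 otherwise, and the key identity connecting it to the E you already know is ##\mathbb E[\mathbb 1(Y = 1)] = P(Y = 1)##.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical Bernoulli labels: Y = 1 with probability p = 0.3.
p = 0.3
Y = (rng.random(100_000) < p).astype(int)

# The indicator 1(Y == 1) is 1 where the condition holds, 0 otherwise.
ind = (Y == 1).astype(int)

# E[1(Y = 1)] = P(Y = 1), so the sample mean of the indicator
# should sit close to p.
print(abs(ind.mean() - p) < 0.01)  # prints True
```

This "expectation of an indicator equals a probability" trick is why the notation shows up constantly in machine learning texts, e.g. when writing classification error as an expectation.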

Good luck.
 