Can I Optimize X for a Given Level Curve with Partial Derivatives?

  • Context: Graduate 
  • Thread starter: hotvette
  • Tags: Optimization
SUMMARY

The discussion focuses on finding the extreme values of x on a given level curve f(x,y) = c of a function z = f(x,y). The user already solves the problem numerically with a combination of Newton's method and Brent's method, and asks whether Lagrange multipliers would be more elegant. The Lagrange approach reduces to the same two equations the user started from, ∂f/∂y = 0 and f − c = 0, but Newton's method applied to that system turns out to be badly conditioned. Ultimately, the user concludes that the original approach may be the most practical.

PREREQUISITES
  • Understanding of partial derivatives and optimization techniques
  • Familiarity with Newton's method and Brent's method for root-finding
  • Knowledge of Lagrange multipliers for constrained optimization
  • Basic concepts of level curves in multivariable calculus
NEXT STEPS
  • Explore advanced techniques in nonlinear optimization
  • Study the application of Lagrange multipliers in depth
  • Learn about conditioning in numerical methods and how to address it
  • Investigate alternative optimization algorithms for constrained problems
USEFUL FOR

Mathematicians, engineers, and data scientists involved in optimization problems, particularly those dealing with multivariable functions and constrained optimization scenarios.

hotvette (Homework Helper)
I have a function z=f(x,y) that is reasonably well behaved (single global maximum). I can readily compute the value of z as well as partials of z with respect to x and y. I can also quite easily find the maximum.

The challenge is to find the maximum and minimum values of x for which f(x,y) = c, where c is a given constant. In other words, I'm trying to find the extreme values of x on a given level curve. I also know that the extreme values of x occur along a ridge (i.e. where ∂f/∂y = 0). In a way, this is a backwards constrained optimization problem: the value of the function is constrained and the goal is to find the extreme values of the variables.

Currently, I'm using a combination of Newton's method and Brent's method to successfully solve the problem, but I'm wondering if there might be a more elegant and mathematically rigorous approach. Any suggestions?
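For concreteness, here is one way such a Newton/Brent combination might look. This is only a sketch: the actual function in the thread is not given, so a toy quadratic with a single global maximum stands in for it, and SciPy's `brentq` does the bracketing root-finding. For a trial x, Brent's method locates the ridge point y(x) where ∂f/∂y = 0; a second root-find in x then enforces the level-curve constraint along the ridge.

```python
from scipy.optimize import brentq

# Toy stand-in for the thread's (unspecified) function:
# a quadratic with a single global maximum at the origin.
def f(x, y):
    return 10 - x**2 - x*y - 2*y**2

def fy(x, y):
    return -x - 4*y   # df/dy for the toy function

def ridge_y(x):
    """For a trial x, find y on the ridge (df/dy = 0) with Brent's method."""
    return brentq(lambda y: fy(x, y), -100.0, 100.0)

def extreme_x(c, lo, hi):
    """Root-find f(x, y(x)) - c = 0 in x along the ridge."""
    return brentq(lambda x: f(x, ridge_y(x)) - c, lo, hi)

# Rightmost point of the level curve f = 3 (analytically sqrt(8) here)
x_max = extreme_x(3.0, 0.0, 10.0)
print(x_max, ridge_y(x_max))
```

The brackets passed to `brentq` are problem-specific and would need to be chosen from whatever is known about the real f.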
 
Why not just use Lagrange multipliers, optimizing the function g(x,y)=x with the constraint equation f(x,y)-c=0?
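Spelling out the suggestion: with g(x,y) = x and the constraint f(x,y) = c, the stationarity condition ∇g = λ∇f reads

```latex
(1,\, 0) = \lambda \left( \frac{\partial f}{\partial x},\, \frac{\partial f}{\partial y} \right)
\quad\Longrightarrow\quad
\lambda \frac{\partial f}{\partial x} = 1, \qquad
\lambda \frac{\partial f}{\partial y} = 0 .
```

The first equation forces λ ≠ 0, so the second gives ∂f/∂y = 0; together with f(x,y) − c = 0, that is a two-by-two (generally nonlinear) system for the extreme points of x on the level curve — exactly the ridge condition already noted above.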
 
Wow, I didn't realize it was really that simple. It turns out the optimization exercise gets me right back to what I already knew: ∂f/∂y = 0 and f − c = 0, two (in this case nonlinear) equations in two unknowns. Straightforward on the surface. Then it comes down to choosing how to solve the two equations. Newton's method reveals a very badly conditioned system that I can't seem to get around, so it appears my original approach may indeed be the most practical. Thanks for helping make the entire picture clearer.
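For reference, a minimal sketch of Newton's method on this two-equation system, again on a hypothetical quadratic f (the thread's real function is not given), with the condition number of the Jacobian printed at each step so the conditioning can be monitored:

```python
import numpy as np

# Hypothetical test function with a single global maximum at the origin:
#   f(x, y) = 10 - x^2 - x*y - 2*y^2
def f(x, y):
    return 10 - x**2 - x*y - 2*y**2

def F(v, c):
    """System from the Lagrange conditions: df/dy = 0 and f - c = 0."""
    x, y = v
    return np.array([-x - 4*y,          # df/dy
                     f(x, y) - c])      # level-curve constraint

def J(v, c):
    """Jacobian of F (rows: df/dy and f - c; columns: d/dx and d/dy)."""
    x, y = v
    return np.array([[-1.0,       -4.0],
                     [-2*x - y,   -x - 4*y]])

def newton(v0, c, tol=1e-12, maxit=50):
    v = np.asarray(v0, dtype=float)
    for _ in range(maxit):
        Jv = J(v, c)
        # Watch the conditioning of the linearized system at each iterate
        print(f"cond(J) = {np.linalg.cond(Jv):.3e}")
        step = np.linalg.solve(Jv, -F(v, c))
        v = v + step
        if np.linalg.norm(step) < tol:
            break
    return v

x, y = newton([2.0, -1.0], c=3.0)
print(x, y)   # analytic answer for this toy f: x = sqrt(8), y = -x/4
```

For this well-behaved toy function the Jacobian stays well conditioned; for the real f, a large `cond(J)` near the solution would explain the trouble described above, and scaling the variables or equations is the usual first remedy.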
 
