How can I find a solution for c and d for all real values of w?

  • Context: Undergrad 
  • Thread starter: x86
  • Tags: Integer
SUMMARY

The discussion centers on solving the equation $$w = \frac{ab - d}{c - a - b}$$ for variables `c` and `d` across all real values of `w`. The user derives that for the specific cases `w = 0` and `w = 1`, the solutions are `d = ab` and `c = a + b`. However, the user is unsure how to prove that these solutions remain valid for all $$w \in (-\infty, +\infty)$$, particularly since setting `c = a + b` makes the denominator zero, which is mathematically invalid.

PREREQUISITES
  • Understanding of algebraic manipulation and solving equations
  • Familiarity with real number properties and integer values
  • Knowledge of functions and their domains
  • Basic concepts of limits and continuity in mathematics
NEXT STEPS
  • Explore the implications of setting denominators to zero in algebraic equations
  • Study the concept of functions and their behavior over different domains
  • Investigate the properties of real numbers and integer solutions in equations
  • Learn about the conditions under which equations can be solved for multiple variables
USEFUL FOR

Mathematics students, educators, and anyone interested in algebraic problem-solving and the behavior of equations across different values.

x86
$$w = \frac{ab - d}{c - a - b}$$

I have to solve the above equation for variables `c` and `d`, where `w` can be any real number: $$w \in (-\infty, +\infty)$$

If we set `w = 0` and then `w = 1`, we can solve for `c` and `d`. Setting `w = 0` gives
$$0 = ab - d$$
$$d = ab$$
and setting `w = 1` (numerator equal to denominator) gives
$$c - a - b = ab - d = 0$$
$$c = a + b$$

Now I can substitute these values to check the solution for `w = 1`:
$$c - a - b = ab - d$$
Substituting `c = a + b`: $$a + b - a - b = ab - d$$
$$0 = ab - d$$
$$d = ab$$

I know that my solution is true for both `w = 0` and `w = 1`, but how can I prove that it is true for all $$w \in (-\infty, +\infty)$$?

I've tried this:

$$w(c - a - b) = (ab - d)$$
$$w(a + b - a - b) = ab - d$$
$$0 = ab - d$$

$$ab = d$$

But is this really an acceptable way of arriving at the solution? I am very confused. I've shown that the equations I found earlier (when I set `w = 1` and `w = 0`) hold for arbitrary `w` by substituting them back into the original equation.
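The sticking point can be seen numerically. A minimal sketch (with arbitrary sample values for `a` and `b`, chosen only for illustration): substituting the derived `c` and `d` back into the formula for `w` zeroes both the numerator and the denominator, so `w` becomes the undefined expression 0/0 rather than "any value".

```python
# Minimal sketch with hypothetical sample values: plug the derived
# solution c = a + b, d = a*b back into w = (ab - d)/(c - a - b).
a, b = 3, 5          # arbitrary sample values, for illustration only
c = a + b            # derived from the w = 1 case
d = a * b            # derived from the w = 0 case
numerator = a * b - d
denominator = c - a - b
print(numerator, denominator)   # 0 0 -> w = 0/0 is undefined
```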
 
There are no fixed c, d such that the equation is true for more than one w. If you know a, b, c, d, you can calculate w; it cannot take more than one value.

c = a + b is impossible; this would make the denominator zero.

Either you are trying something impossible, or it is unclear what you want to do.
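The point above can be checked symbolically. A sketch assuming sympy is available: imposing the `w = 0` and `w = 1` conditions simultaneously forces the unique candidate `c = a + b`, `d = ab`, which is exactly the pair that makes the denominator vanish, so no valid fixed `(c, d)` works for two values of `w`.

```python
# Sketch (assuming sympy is installed): solve the w = 0 and w = 1
# conditions simultaneously for c and d.
import sympy as sp

a, b, c, d = sp.symbols('a b c d')

sols = sp.solve(
    [sp.Eq(a*b - d, 0),            # w = 0: the numerator must vanish
     sp.Eq(a*b - d, c - a - b)],   # w = 1: numerator equals denominator
    [c, d], dict=True)
print(sols)
# The only candidate is c = a + b, d = a*b, which zeroes the
# denominator -- so no valid fixed (c, d) exists for both w values.
```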