A businessman has the option of investing his money in two plans. Plan A guarantees that each dollar invested will earn 70 cents a year hence, and Plan B guarantees that each dollar invested will earn $2.00 two years hence. In Plan B, only investments for periods that are multiples of 2 years are allowed. How should he invest $100,000 to maximize the earnings at the end of 3 years? Formulate the problem as a linear programming model.
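Here is a sketch of how I think the model could be set up (not definitive; I'm assuming a dollar in Plan A returns $1.70 after one year, a dollar in Plan B returns $3.00 after two years, and that any cash returned at the start of a year may be reinvested). The variable names are my own: x_{Aj} is the amount placed in Plan A at the start of year j, and x_{Bj} the amount placed in Plan B at the start of year j.

```latex
\documentclass{article}
\usepackage{amsmath}
\begin{document}
% Assumed decision variables: x_{A1}, x_{A2}, x_{A3} = dollars placed in Plan A
% at the start of years 1, 2, 3; x_{B1}, x_{B2} = dollars placed in Plan B at
% the start of years 1, 2. A Plan B investment started in year 3 would mature
% after the 3-year horizon, so it is omitted.
\begin{align*}
\text{maximize}\quad   & 1.7\,x_{A3} + 3.0\,x_{B2} \\
\text{subject to}\quad & x_{A1} + x_{B1} \le 100{,}000
    && \text{(cash available at the start of year 1)}\\
& x_{A2} + x_{B2} \le 1.7\,x_{A1}
    && \text{(cash returned at the start of year 2)}\\
& x_{A3} \le 1.7\,x_{A2} + 3.0\,x_{B1}
    && \text{(cash returned at the start of year 3)}\\
& x_{A1},\, x_{A2},\, x_{A3},\, x_{B1},\, x_{B2} \ge 0.
\end{align*}
\end{document}
```

Maximizing the cash in hand at the end of year 3 should be equivalent to maximizing the earnings, since the initial $100,000 is fixed, but I may be missing something.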
Any suggestions?