How can I optimize numerical approximation with fewer samples?

SUMMARY

The discussion focuses on optimizing numerical approximation using fewer samples, specifically targeting a reduction to 20 samples for effective function approximation. Participants suggest methods such as dividing the independent variable domain into equal-width segments and selecting endpoints as sample points. The conversation highlights the tension between the desire for fewer samples and the statistical principle that larger sample sizes enhance prediction accuracy. Tools mentioned for implementation include MATLAB and Excel.

PREREQUISITES
  • Understanding of numerical approximation techniques
  • Familiarity with MATLAB for computational analysis
  • Basic knowledge of Excel for data manipulation
  • Concept of linear approximation in statistics
NEXT STEPS
  • Research methods for sample selection in numerical approximation
  • Learn MATLAB functions for optimizing sample sizes
  • Explore Excel's capabilities for handling large datasets
  • Investigate statistical principles behind sample size determination
USEFUL FOR

This discussion is beneficial for data analysts, statisticians, and engineers looking to enhance their skills in numerical approximation and sample optimization techniques.

galc81
Hi all,
I have a problem that I would like to solve, perhaps with MATLAB or Excel.
I have a set of numerical samples, and using linear approximation I have a function. Now I want to use fewer samples, for example only 20, and I want to find the set of samples that approximates the function best.
Thanks a lot.
 
galc81 said:
Hi all,
I have a problem that I would like to solve, perhaps with MATLAB or Excel.
I have a set of numerical samples, and using linear approximation I have a function. Now I want to use fewer samples, for example only 20, and I want to find the set of samples that approximates the function best.
Thanks a lot.

Well, I suppose there are a number of ways you could choose your samples: break the independent variable's domain into 19 equal-width intervals and use their endpoints (20 of them) as your sample points. You may not have data at exactly those points, of course, so you could pick the data points closest to them.
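If it helps, here is a minimal MATLAB sketch of that idea. It assumes your data are stored in vectors x and y (those names, and the example data, are just placeholders for illustration, not your actual dataset):

```matlab
% Example data: 500 noisy samples of a known function (placeholder only).
x = linspace(0, 10, 500);
y = sin(x) + 0.05*randn(size(x));

nKeep   = 20;                               % desired number of samples
targets = linspace(min(x), max(x), nKeep);  % 20 equally spaced endpoints
                                            % of 19 equal-width intervals

% For each endpoint, keep the data point whose x-value is closest.
idx = zeros(1, nKeep);
for k = 1:nKeep
    [~, idx(k)] = min(abs(x - targets(k)));
end
idx = unique(idx);                          % drop duplicates, if any

xSub = x(idx);                              % reduced sample set
ySub = y(idx);

% Compare the piecewise-linear approximation built from the reduced set
% against the full data.
yApprox = interp1(xSub, ySub, x, 'linear');
maxErr  = max(abs(yApprox - y));
fprintf('Maximum absolute error with %d samples: %.4f\n', numel(idx), maxErr);
```

This is only one heuristic; you could also try placing more of the 20 points where the function curves sharply, since that is where a piecewise-linear fit loses the most accuracy.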

The big question in my mind is this: from a statistics perspective, you usually want the largest sample size you can get, because your predictive and descriptive power improve as the sample grows. So why do you want to reduce your sample size? Excel, for example, can crunch through quite a large dataset.
 
