Hello there! This is my first time posting, but I'm a long-time reader, and I couldn't find a more appropriate sub-forum. I've just been hired to compile and organize data for an arctic hydrology research project in Fairbanks, AK. The group is studying climate change and has recorded a few years' worth of data: 150 temperature sensors in a "glacial polygon" (that's what the researcher called it), spaced about 1 meter apart, each recording once every hour. I'm scared to calculate how many "values" that adds up to...

My employer has said she isn't picky about how the data is compiled. She just wants to start by finding anomalies and spotting general trends. I'll basically be helping her enter data and make graphs. She plans to request a "final product," and I can go about getting there however I see fit.

I'm familiar with these programs: Excel, Mathematica, MATLAB, Microlab, and Datastudio. Microlab and Datastudio were used in my lower-level Chemistry and Physics labs. I've never used Mathematica or MATLAB for generating graphs, but I've heard they work well for that. She only knows Excel, which is fairly easy to use. I'm comfortable with it, but it doesn't seem as versatile as the subject-specific programs I mentioned.

I'm only a second-year undergraduate in chemistry. I have lab experience from first- and second-year science courses and one summer job, but not much else. I essentially know nothing except how to work (which can be enough sometimes).

Can anyone recommend programs that are commonly used for large datasets like this? Actually, any advice at all is welcome. I just want to do this small job well, and I'm willing to put in the time to learn new tools and techniques. Thank you!

Note: I can provide much more information, but this post is already long.
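For a sense of scale, here is a quick back-of-envelope sketch of the dataset size. The year count is a placeholder assumption, since the post only says "a few years":

```python
# Rough estimate of total readings from the setup described above.
SENSORS = 150           # temperature sensors in the polygon
READINGS_PER_DAY = 24   # one reading per hour
DAYS_PER_YEAR = 365
YEARS = 3               # assumption: "a few years" of data

total_values = SENSORS * READINGS_PER_DAY * DAYS_PER_YEAR * YEARS
print(f"{total_values:,} readings")  # 3,942,000 readings
```

A few million rows is well beyond comfortable Excel territory (Excel caps out at 1,048,576 rows per sheet), which is one concrete reason to consider a tool like MATLAB or a scripting language for this job.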