# Functions in C to calculate hours, minutes, seconds from milliseconds input

1. Apr 28, 2016

### DiamondV

1. The problem statement, all variables and given/known data
Write three functions `int get_hour(int timestamp)`, `int get_min(int timestamp)`, and `int get_second(int timestamp)` which respectively return the hour of the day, the minute of the hour, and the second of the minute from a value given as a parameter in milliseconds.
Example they give us is:
int timestamps = 1324561223; would output 07:56:01

The online system we use works like this: we write the actual function, and the online compiler is coded to automatically run our code with predefined inputs that it inserts as parameters, then print the result.

2. Relevant equations

3. The attempt at a solution
Heres my code:
Code (Text):

```c
#include <stdio.h>
#include <stdlib.h>
#include <math.h>

int get_hour(int timestamp){
    int hour;
    hour = (timestamp/(1000*60*60));
    return hour;
}

int get_min(int timestamp){
    int min;
    min = ((timestamp%3600000)/60000);
    return min;
}

int get_second(int timestamp){
    int second;
    second = (((timestamp%3600000)%60000)/1000);
    return second;
}
```
I've been at this question for quite some time. There's something wrong with my hours: I keep getting 367:56:01 for int timestamps = 1324561223;.
If you compare this with the example I gave above, the minutes and seconds are correct, but something is wrong with the hours, and I have no idea why they get 07 hours. I googled it and even that said 367 hours. I thought maybe this has something to do with there being 24 hours in a day, but 367/24 is 15 (integer division), which is still not the right answer.

Why are my hours not right?

Last edited by a moderator: Apr 28, 2016
2. Apr 28, 2016

### Staff: Mentor

The problem statement says: "...return the hour of the day, the minute of the hour, and the second of the minute".

So the hour should be in the range 0..23. That's for a 24-hour clock. You might be dealing with a 12-hour clock where a.m. and p.m. hours are expected. Try that.

3. Apr 28, 2016

### SteamKing

Staff Emeritus
All timestamp schemes have a "zero day" from which time is counted. For example, the timestamp on Unix systems starts at 00:00:00 UTC on January 1, 1970. It's not clear if your timestamp is based on the Unix system or something else.

https://en.wikipedia.org/wiki/Unix_time

4. Apr 28, 2016

### DiamondV

They also gave another example of
int timestamps = 321635432; with correct output being 17:20:35
So it's definitely a 24-hour clock.

For that input my code output 89:20:35.

I tried this code as well, with no luck:

```c
int get_hour(int timestamp){
    int hour;
    hour = timestamp/3600000;
    return hour;
}

int get_min(int timestamp){
    int min;
    min = timestamp/60000;
    return min;
}

int get_second(int timestamp){
    int second;
    second = timestamp/1000;
    return second;
}
```

5. Apr 28, 2016

### Staff: Mentor

Going back to your first example and working it out by hand:

$h = \left\lfloor \frac{1324561223}{1000 \cdot 60 \cdot 60} \right\rfloor = 367$

Now, 367/24 = 15 + 7/24, i.e. 367 mod 24 = 7. So it looks like you need to take h modulo 24.