Hello,

> We have been trying to work out how the Gryphon time relates to real time.

'Gryphon time' is the number of 10 microsecond units since midnight, January
the first, 1970. This is quite a big number, so in gryphon data messages,
events etc., it gets truncated to the least significant 32 bits. You can get
the complete 64 bit time stamp by using the gryphon CMD_GET_TIME command (see
the manual), and with this, seeing as the most significant 32 bits only change
about once every 12 hours, you can convert the 32 bit timestamp in the gryphon
data message header into a full 64 bit value (you just need some sort of code
to handle the occasional wrapping; see the P.S. below for a sketch). Once
you've derived the 64 bit value, time calculations all become very simple,
assuming your compiler supports 64 bit arithmetic (gcc under unix does of
course; the type you use is just "unsigned long long").

For efficiency's sake, _if_ the client code is running locally on the gryphon
that's producing the traffic, instead of using CMD_GET_TIME you can just use a
function similar to the following that uses the gettimeofday() system call (a
"system call" is just like a library function; some programmers won't even be
aware there's a difference between the two):

unsigned long long gryphongettimell()
{
    unsigned long long ull;
    struct timeval tv;

    gettimeofday(&tv, NULL);
    /* 100,000 units of 10 microseconds per second */
    ull = ((unsigned long long)tv.tv_sec) * 100000ULL;
    ull += (unsigned long long)(tv.tv_usec / 10);
    return ull;
}

Note I've ignored any error conditions from gettimeofday() (handle those as
mentioned in the man page), but the above demonstrates the calculation you
need to do. If you just want the 32 bit time stamp you can do something like:

unsigned long gryphongettimel()
{
    struct timeval tv;

    gettimeofday(&tv, NULL);
    /* do the arithmetic in 64 bits, then keep only the least significant */
    /* 32 bits (tv_sec * 100000 would overflow a 32 bit long) */
    return (unsigned long)((((unsigned long long)tv.tv_sec) * 100000ULL
                            + (unsigned long long)(tv.tv_usec / 10))
                           & 0xFFFFFFFFULL);
}

If you want to convert the time into a human readable ascii string you can use
a unix library function like ctime(), which just needs the number of seconds
since the same 1970 epoch, which you can get just by dividing the 64 bit
gryphon timestamp by 100,000. The following program demonstrates some of this:

#include <stdio.h>
#include <stdlib.h>
#include <sys/time.h>
#include <time.h>

int main()
{
    struct timeval tv;
    unsigned long long ull;
    unsigned long ul;
    time_t tt;  /* time_t is just an integer type of some sort */

    if (gettimeofday(&tv, NULL)) {
        perror("gettimeofday");
        exit(1);
    }

    ull = ((unsigned long long)tv.tv_sec) * 100000ULL;
    ull += (unsigned long long)(tv.tv_usec / 10);
    printf("64 bit gryphon time: %llu decimal, 0x%llx hex\n", ull, ull);

    ul = (unsigned long)(ull & 0xFFFFFFFFULL);
    printf("32 bit gryphon time: %lu decimal, 0x%lx hex\n", ul, ul);

    /* note ull has been obtained with gettimeofday(2), but it could have */
    /* been obtained with a gryphon CMD_GET_TIME and this would still work */
    /* (ctime()'s result already ends in a newline): */
    tt = (time_t)(ull / 100000ULL);
    printf("human readable time: %s", ctime(&tt));

    return 0;
}

Note that I'd normally suggest using CMD_GET_TIME and not gettimeofday(), so
that the code doesn't have to be run locally etc., _except_ I realise that for
applications with difficult timing issues, not having the latency that
CMD_GET_TIME has could be a big help. Also, I've ignored any timezone issues
here, although if I remember correctly on gryphon we've just been keeping the
timezone set to 'Greenwich Mean Time' always to avoid complicating things, so
you should probably be ok regarding 'the time as just the time'.

Hope this answers all your questions.

Thanks,
Robin
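
P.S. For the 'occasional wrapping' I mentioned above, here's a minimal sketch
of how you might reconstruct the full 64 bit value from a 32 bit message
timestamp. It assumes you have a full 64 bit reference time obtained within
roughly six hours of the message (via CMD_GET_TIME, or gryphongettimell()
above if you're local); the function name 'gryphonexpandtime' is just mine
for illustration, not anything in the gryphon API:

unsigned long long gryphonexpandtime(unsigned long ts32,
                                     unsigned long long ref)
{
    /* candidate value: the reference's most significant 32 bits glued */
    /* onto the message's least significant 32 bits */
    unsigned long long t = (ref & 0xFFFFFFFF00000000ULL)
                         | ((unsigned long long)ts32 & 0xFFFFFFFFULL);

    /* if the candidate is more than half a wrap (about 6 hours) away */
    /* from the reference, the 32 bit counter wrapped between the two; */
    /* adjust by one full wrap in the appropriate direction */
    if (t > ref && (t - ref) > 0x80000000ULL)
        t -= 0x100000000ULL;  /* message predates a wrap the reference saw */
    else if (ref > t && (ref - t) > 0x80000000ULL)
        t += 0x100000000ULL;  /* message follows a wrap the reference missed */

    return t;
}

As long as the reference is refreshed more often than about every six hours,
the candidate and the true value can never legitimately be more than half a
wrap apart, which is what makes the adjustment safe.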