I was considering hijacking an existing thread on these but I thought I may as well start a new thread.
I'm playing around with some AB1805 real-time clocks from Abracon. Before I send a board out to the PCB fab I thought I'd try to prototype the device and make sure I can get what I want from it: a reasonably accurate RTC. I've soldered it to a QFN16 breakout board, hooked it up to a Moteino and messed around with the alarm functionality until it behaved. I'm also using the 32 kHz crystal recommended in the datasheet, the ABS07-120-something.
Close up:
I've got it set to alarm every minute (matching 0 seconds, 0 hundredths) and it seems to work. The problem is that it seems to be way off in terms of accuracy. I've set the alarm match interrupt to generate a pulse on FOUT, which is fed into the Moteino. When the Moteino detects it going low, it transmits a few bytes over the USART (115200 baud) to a computer, which timestamps them. Here are a few minutes of output:
2018:03:31:01:04:35:304,*
2018:03:31:01:05:35:296,*
2018:03:31:01:06:35:287,*
2018:03:31:01:07:35:277,*
2018:03:31:01:08:35:254,*
2018:03:31:01:09:35:238,*
2018:03:31:01:10:35:255,*
2018:03:31:01:11:35:231,*
2018:03:31:01:12:35:230,*
2018:03:31:01:13:35:205,*
It's running fast, gaining 10-15 ms per minute. By my rubbish maths that's about 680 ms per hour. I'd expect more like 35-70 ms per hour if the crystal were +10 or +20 ppm. Is this similar to other people's experiences? I know there are calibration registers, but I didn't expect to have to use them to this extent.
(I also know that this isn't the most accurate way to time this sort of thing. That said, the computer is running NTP, and if I do the same test with a DS3234 I get basically no drift at all.)