Data persistence error after sudden power cut


I’ve created an image for the Raspberry Pi that reads a sensor and saves the value to a text file located in /data. The Python script repeats these steps in an infinite loop.

The problem comes when I mount this SD card in another computer and try to read the file. If I stop the container or power down the Pi from the command line, the file is complete; however, if I remove the power cable, the file is either empty or incomplete. Since this is a battery-powered application, this has to be fixed.

Does anyone know what could be going on? Do I need to flush or sync the writes to the data volume?

Just in case you want to take a look at the sources:

Kind regards

Hi @crespum
Thanks for sharing your code along with this detailed description of the issue.

I see that the GPS class keeps a different file for each day. Can you confirm whether the data loss only affects the current logfile, or whether it also affects files from previous days?

As a first step, as you mentioned, you could try adding an f.flush() or even an os.fsync() call, as the Python docs suggest.
On the other hand, since your GPS.dump() just opens the file using a with statement, which closes the file right after the single operation finishes, I would expect all the buffers to have already been flushed.
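To make the suggestion concrete, here is a minimal sketch of a dump function that flushes Python’s buffer and then asks the OS to commit the data to disk. The function name, file name, and record format are assumptions for illustration, since the actual GPS.dump() code isn’t quoted here:

```python
import os

def dump(path, lines):
    """Write records to `path`, forcing them onto the storage device."""
    with open(path, "w") as f:
        f.write("\n".join(lines) + "\n")
        f.flush()              # push Python's internal buffer to the OS
        os.fsync(f.fileno())   # ask the OS to commit its page cache to disk
```

Note that closing the file (which the with statement does) only flushes Python’s buffer to the OS; without the os.fsync() the data may still sit in the kernel’s page cache when the power is cut.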
Do you have any metrics on how much time GPS.dump() might need to complete on a production device?

As far as I can tell, your loop rewrites the file from scratch, is that correct?
That not only might reduce the lifespan of your storage media, but also increases the chance that the file gets corrupted if there is a sudden power loss.
Ideally you should only append the new records to the file instead of rewriting its whole content.
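Appending keeps the already-written records intact even if the final write is interrupted. A minimal sketch, with the function name and record format as assumptions:

```python
def append_record(path, record):
    """Append a single new record rather than rewriting the whole file."""
    with open(path, "a") as f:   # "a" opens in append mode
        f.write(record + "\n")
```

With this approach a power cut can at worst truncate the last line, instead of leaving the entire file empty.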

An easier approach that would also confirm our suspicions could be to always write the new data into a secondary file and then swap it with the primary one. For example, you could always dump your data to a file named gps_log_%Y%m%d_next.json, and then, once that’s flushed, rename the files so that the _next one becomes the proper gps_log_%Y%m%d.json.
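The write-then-rename idea above can be sketched like this. The naming scheme follows the suggestion; the record format and helper name are assumptions:

```python
import os

def atomic_dump(path, lines):
    """Write to a `_next` sibling file, sync it, then rename it over `path`."""
    tmp = path.replace(".json", "_next.json")  # assumes a .json suffix
    with open(tmp, "w") as f:
        f.write("\n".join(lines) + "\n")
        f.flush()
        os.fsync(f.fileno())   # make sure the new file is on disk first
    os.replace(tmp, path)      # rename is atomic on POSIX filesystems
```

Because the rename only happens after the new file is safely on disk, a power cut leaves you with either the old complete file or the new complete file, never a half-written one.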

Hope that helps.

Kind regards,

EDIT: Something I haven’t mentioned is that before cutting the power I print the file over ssh with cat, and it’s definitely not empty; its size increases as expected.

Thanks for the answer @thgreasi. I’m going to respond under your lines:

Only the current file being edited is empty. The rest of the files are okay.

I’ve tried adding f.flush() and it didn’t help. I’ll try os.fsync() today and let you know.

No, not really, but I’ve tried to write every 5 seconds instead of every 1 second and the problem remains. I guess if it were a timing problem, 5 seconds to write and flush the contents should be more than enough.

I’m aware of this, I just wanted to run a quick demo. I’ve also tried to open the file for appending data but the behavior is similar. In this case, the file is not completely empty but it’s still missing some lines.

Indeed. I am just trying to understand first if I’m missing something.

Thanks again!


Another suggestion, to add to what my colleague has already said, is that you could use a hardware solution for your problem. There are Uninterruptible Power Supplies available for the Raspberry Pi which are designed for exactly this situation. For example, check out this one from Pi Supply. There are more of these on Tindie, for example this one. I think the problem you are facing is something that can best be solved with a hardware solution like this.

It looks like it’s been fixed now by adding the following lines to the with statement:
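Presumably something along these lines, i.e. the flush/fsync pair suggested earlier in the thread (the file name and written value here are assumptions):

```python
import os

# Hypothetical reconstruction of the fix: flush Python's buffer and
# fsync the file descriptor before the with block closes the file.
with open("gps_log.json", "w") as f:  # file name is an assumption
    f.write("42.0\n")                 # placeholder sensor value
    f.flush()
    os.fsync(f.fileno())
```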


I’ll be running more tests soon.


Hey there! I believe that the flush/fsync approach will mitigate, but not completely fix, the issue. Let us know how it goes, and whether you get the chance to try a hardware approach as well.

It’s already a battery-powered device, but I was experiencing the same issue when the batteries drained. In any case, my goal was just to write a getting-started tutorial about using balenaOS and how to interact with the hardware from within Docker containers.

Thanks for everyone’s help. If you want to check out the post, visit

Hey @crespum, thanks for sharing your project and that blog post. Looks nice.