Google Cloud IoT Core is shutting down; Alternative configuration options?

Hi

Balena does a great job at provisioning and managing devices. What we’re missing in our project is a better way of providing service configurations (e.g. file-based configurations per running container), because the env variables have obvious limitations.

We started using GCP IoT Core combined with Balena for that reason. Now Google is shutting it down. Is anybody using similar services that would do the job? Or is Balena working on something in that direction? :slight_smile:

Functional needs:

  • Fleet-wide configuration files for specific app containers
  • Ability to update the configuration via API per fleet and per device

GCP IoT Core also provided an MQTT server, which would be nice in addition.

Regards,
ada

2 Likes

Balena is focused on the management of devices and their containers - we don’t have plans to get into the data aspect of that. Services like Microsoft’s Azure IoT and Amazon’s AWS IoT are pretty well known when it comes to providing those capabilities. In fact, some of my colleagues here at Balena have written some great connector services for these data platforms.

Like you, I would love to hear what other tools and services the Balena community is using for their data needs :slight_smile:

1 Like

Thanks for bringing this up, I was just logging in to start a discussion about this. I’m hopeful that the industry leaders in the balena community that are using IoT Core can all come together and brainstorm on possible “alternatives”, as they put it.

I can mention what my team and I came up with to address the functionality gap you’ll have with no more IoT Core, @ada. There are a couple of components; I’ll outline them here:

  • All device configurations are stored on-device in a simple database, and reflected in our cloud database.
  • We have REST API endpoints set up on all of our devices that accept instructions for updating device configurations.
  • Instructions to update these configurations originate from a central app and are sent into a pub/sub queue, where a cloud function attempts to deliver the instruction to the device. If the device communication fails, as it often does, the message is put back in the pub/sub queue. These retries back off exponentially, with an ultimate max-retries parameter per message.
  • ** Device config updates are then sent via MQTT through IoT Core to another Pub/Sub topic that feeds into Dataflow, which injects our messages into BigQuery tables. This part, of course, is what we have to re-engineer now that IoT Core will go away.

I’m wondering if anyone has any suggestions on how to replace point ** above? My current thinking is to basically re-engineer an MQTT interface that pushes messages into our Pub/Sub. Any thoughts or suggestions on other MQTT implementations? (Staying on Google Cloud - I’m not about to move my entire infra to AWS or Azure because of this.)
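
In case it helps the discussion, here is a minimal sketch of that MQTT-to-Pub/Sub bridge idea in Python. It assumes the paho-mqtt client and the google-cloud-pubsub library, and the `devices/<id>/events` topic layout is my own assumption modelled on IoT Core, not anything official:

```python
def mqtt_topic_to_attrs(topic):
    """Map an IoT-Core-style MQTT topic ('devices/<id>/events')
    to Pub/Sub message attributes. The topic layout is an assumption."""
    parts = topic.split("/")
    if len(parts) >= 3 and parts[0] == "devices":
        return {"device_id": parts[1], "subfolder": "/".join(parts[2:])}
    return {"device_id": "unknown", "subfolder": topic}

def run_bridge(mqtt_host, project_id, pubsub_topic):
    # Third-party deps kept inside the function so the pure helper
    # above stays importable without them installed.
    import paho.mqtt.client as mqtt
    from google.cloud import pubsub_v1

    publisher = pubsub_v1.PublisherClient()
    topic_path = publisher.topic_path(project_id, pubsub_topic)

    def on_message(client, userdata, msg):
        # Forward every MQTT message into Pub/Sub, keeping the
        # original topic information as message attributes.
        publisher.publish(topic_path, msg.payload, **mqtt_topic_to_attrs(msg.topic))

    client = mqtt.Client()  # paho-mqtt 1.x style constructor; 2.x needs a CallbackAPIVersion
    client.on_message = on_message
    client.connect(mqtt_host)
    client.subscribe("devices/+/events")
    client.loop_forever()
```

You’d still need auth on the broker side, which is exactly the device-registry part IoT Core used to handle.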

Thanks!

1 Like

@anujdeshpande Thanks for the feedback! I’ll definitely look into AWS IoT. Even though everything else we build lives in GCP, these IoT functions can be extracted.

Another option I’m considering is something out of the mobile application world for automated device data synchronization, such as the Firebase Realtime Database (yes, Google again…, but AWS and others have similar offerings).

This might not work for many IoT applications where the devices are short on resources, but I guess most Balena users have more powerful devices anyway.

@richard.galvez Thanks for the outline of your setup. Let me see if I understand it correctly: you are using the MQTT “feedback channel” to update your central database (BigQuery) after the device has committed the change locally? In that case I’m guessing you could also replace the whole setup with something like Firebase.

Our devices currently just pull the configuration via HTTP from IoT Core when they boot up. The central administration only modifies the cloud version of the configuration. We could also just build a REST API to pull the configs, but I believe that would also mean building a device registry, key exchange and authentication.

1 Like

Thanks @ada, I’ll be looking into Firebase, thanks for the heads up. Your description is right, btw.

I was googling about this today and found a Reddit thread where they’re discussing this. A company called ClearBlade is apparently offering a “one-click” migration to their system: https://www.clearblade.com/wp-content/uploads/2022/08/ClearBlade-Google-IoT-Core-Migration_Website.pdf

Here’s the reddit thread: https://www.reddit.com/r/googlecloud/comments/wp93ss/legal_notice_iot_core_will_be_discontinued_on_aug/

1 Like

Hi @ada @richard.galvez ,

As a GCP IoT Core config-management drop-in replacement you could use the <application|service|device>_environment_variables. I’m wondering which limitations you see, @ada - can you please share more insights?

A drop-in solution (although without the versioning/rollback of an IoT Core config) would be using a device_environment_variable and writing a base64-encoded JSON object into it.
The consumer on the device just needs to decode the base64, and a package or library for that should be widely available for most implementations.
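
To illustrate, a minimal consumer-side sketch in Python (the variable name `MY_SERVICE_CONFIG` is just a placeholder, not a balena convention):

```python
import base64
import json
import os

def load_config_from_env(var_name, default=None):
    """Decode a base64-encoded JSON config from an environment
    variable. Returns `default` if the variable is unset."""
    raw = os.environ.get(var_name)
    if raw is None:
        return default
    return json.loads(base64.b64decode(raw))

# Producer side: what you would store in the balena env variable.
config = {"mqtt_host": "localhost", "interval_s": 30}
encoded = base64.b64encode(json.dumps(config).encode()).decode()

os.environ["MY_SERVICE_CONFIG"] = encoded  # simulate balena injecting it
assert load_config_from_env("MY_SERVICE_CONFIG") == config
```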

You may use the balena-sdk or the balena API to update the variable.
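
For the API route, a sketch using only the Python standard library. It assumes the v6 OData endpoint for `device_environment_variable` and a balenaCloud API token; the device id and variable name below are placeholders, and the actual network call is left commented out:

```python
import json
import urllib.request

# Assumed balenaCloud endpoint for creating a device env variable.
BALENA_API = "https://api.balena-cloud.com/v6/device_environment_variable"

def build_request(device_id, name, value, api_token):
    """Build a POST request that creates a device environment
    variable via the balena API."""
    body = json.dumps({"device": device_id, "name": name, "value": value}).encode()
    return urllib.request.Request(
        BALENA_API,
        data=body,
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {api_token}",
        },
        method="POST",
    )

if __name__ == "__main__":
    # Fill in a real device id and API token before sending.
    req = build_request(1234567, "MY_SERVICE_CONFIG", "eyJhIjogMX0=", "<token>")
    # urllib.request.urlopen(req)  # uncomment to actually send
```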

Please be aware that using the same env var name at multiple levels will overwrite the content, and the more specific level will win.

Please share your thoughts, we highly appreciate it!

Best regards
Harald

1 Like

How much data can one env variable hold?

Hello @ada
I’ve researched the GCP IoT Core config limit of 64 KiB (see “Devices, configurations, and states” in the Cloud IoT Core documentation). This amount of data can be stored in balenaCloud or openBalena environment variables, even encoded as base64, which adds around 33-35% overhead.

A colleague pointed out that updating an environment variable will restart the affected service on the device. Hence, if you need a resilient config-update mechanism, you may need to add a separate config service which is updated via a device_service_environment_variable, restarts, and updates a file on a volume shared across services. The other services then need to check this config file for changes.
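
A minimal sketch of what that config service could do, assuming a volume mounted into all containers (the paths and the mtime-based change check are my own choices, not a balena mechanism):

```python
import json
import os
import tempfile

def write_config_atomically(config, path):
    """Write the config file via an atomic rename so readers in
    other containers never see a half-written file."""
    directory = os.path.dirname(path) or "."
    fd, tmp = tempfile.mkstemp(dir=directory)
    with os.fdopen(fd, "w") as f:
        json.dump(config, f)
    os.replace(tmp, path)  # atomic on POSIX filesystems

def config_changed(path, last_mtime):
    """Cheap change check for the consuming services: compare the
    file's mtime against the last one they saw."""
    mtime = os.stat(path).st_mtime
    return (mtime != last_mtime), mtime
```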

Another approach would be making the config service a local message broker, such as an MQTT broker image from Docker Hub.
The config service creates a config topic on the MQTT broker with the retain flag enabled, so that the config is retained until it is updated explicitly.
All services can install an MQTT CLI client (e.g. the mosquitto-clients Debian package) and subscribe to this local config topic.
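
A sketch of the retained-publish side in Python, assuming the paho-mqtt client; the topic name `fleet/config` is a placeholder:

```python
import json

CONFIG_TOPIC = "fleet/config"  # topic name is an assumption

def encode_config(config):
    """Serialize the config payload deterministically for the retained topic."""
    return json.dumps(config, sort_keys=True).encode()

def publish_retained_config(config, host="localhost"):
    # paho-mqtt kept inside the function so the helper above
    # works without the dependency installed.
    import paho.mqtt.client as mqtt

    client = mqtt.Client()  # paho-mqtt 1.x style constructor
    client.connect(host)
    # retain=True: the broker replays this message to every new
    # subscriber, so late-starting services still get the config.
    client.publish(CONFIG_TOPIC, encode_config(config), qos=1, retain=True)
    client.disconnect()
```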

Please let us know when you need help to use the balena platform to replace the GCP IoT Core config mechanism.

1 Like

Thanks @fisehara. OK, 64 kB is a good start. I was actually never very happy about that limit with GCP IoT Core in the first place.

At least with Balena we would get more in that case, because you can have multiple variables :). BUT it’s still a bit clumsy in terms of managing a lot of data. Something along the lines of a database synchronization might make more sense anyway.

Your suggestion of a config service is helpful. The auto-restart is otherwise a real issue… Is there no way to disable it? (Actually a long-standing question I’ve had in general…)

Right, that’s a good option also. We have a Redis container running for that purpose which is nice because it gives you the message broker AND the config data store.

@ada
I’ve tested it just now and found that 96 KiB plus base64 overhead, roughly 128 KiB, is doable.
More than this exceeds the environment argument-list limit when restarting the container on balenaOS.

@ada, @richard.galvez, @fisehara: balena’s Cloud Relay block provides a layer of indirection on the device between your data service and the cloud provider, based on a local MQTT broker. Harald also described an environment-variable-backed config service that would work similarly.

If your devices have the resources, then attaching services to a local broker gives you a lot of resilience and flexibility in the face of change. Your applications rely only on the API of the local MQTT-based services.

Thanks @kb2ma and @fisehara for the advice. Technically speaking, the alternative solution to what I laid out at the beginning is the proposed use of env variables and the balena API.

A more complete alternative to GCP IoT Core is switching to AWS/Azure with the help of your Cloud Relay service.

Speaking for my own project, I’m not sure yet if using a different cloud provider for this makes sense, because the data has to go back to GCP eventually. Maybe we’ll set up the MQTT server manually and adapt provisioning. Or use a third-party option like ClearBlade.

@ada would it make sense to move away from MQTT towards something else that GCP supports? Maybe something based on HTTP. It would be custom work, I think, but it would allow you to stick to Google services through and through. I don’t think a shift from MQTT to HTTP would be a stretch for devices capable of running balenaOS.

Hello @ada @richard.galvez, we are curious to learn how you solved this. Do you have any insights to share with the community?

Looking forward to learning more!

@ada have you looked into AWS IoT Core? I’ve been using it for a while now and it’s quite powerful, even though their documentation is pretty awful.

They have a feature called Jobs which might be exactly what you’re looking for.

Combine it with some Lambda functions and it could support some very advanced use cases.

1 Like

Digging out this old thread again, since I missed the previous replies. If it’s of any interest, what we ended up doing is:

  1. Writing our own “config” service. Balena env variables configure it, and the service downloads larger config files from GCP buckets.

  2. Publishing real-time data directly to GCP Pub/Sub from that service, instead of using MQTT or similar.
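
For anyone curious, the rough shape of step 1. The `gs://` URI parsing is generic; the download call assumes the google-cloud-storage client plus GCP credentials on the device, and the `CONFIG_GCS_URI` env variable name is our own convention:

```python
import os

def parse_gcs_uri(uri):
    """Split 'gs://bucket/path/to/file' into (bucket, object path)."""
    if not uri.startswith("gs://"):
        raise ValueError(f"not a GCS URI: {uri}")
    bucket, _, blob = uri[len("gs://"):].partition("/")
    if not bucket or not blob:
        raise ValueError(f"missing bucket or object in: {uri}")
    return bucket, blob

def download_config(dest_path):
    # CONFIG_GCS_URI is set per fleet/device via balena env variables.
    # google-cloud-storage is imported lazily so the parser above
    # works without the dependency installed.
    from google.cloud import storage

    bucket_name, blob_name = parse_gcs_uri(os.environ["CONFIG_GCS_URI"])
    client = storage.Client()
    client.bucket(bucket_name).blob(blob_name).download_to_filename(dest_path)
```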

We would still like to try the Firebase Realtime Database or Firestore approach at some point, since it handles two-way sync automatically. But right now the above does what we need.