Fastest way to dev on local device

Is there a faster way to do local dev than making changes in my IDE and then pushing to the local device to test? Is there no way to somehow mount my local folder on the local device?

Thanks,
Brice

Hi @brice, I believe that using the resin CLI would be what you are looking for:

https://docs.resin.io/tools/cli/#sync-uuid-

Is that what you were looking for?

@floion thanks, it does help but I was looking for something where I could leverage things like hot reload / auto rebuild with watch folders.

I guess you could use inotify in a script to watch your directory and then trigger a resin sync to your device.
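
Something along these lines might work as a starting point (an untested sketch: inotifywait comes from the inotify-tools package, DEVICE_UUID is a placeholder, and any source/destination options for resin sync depend on your app, see the docs linked above):

DEVICE_UUID="<your-device-uuid>"
# Re-run the sync every time something in the current directory changes
while inotifywait -r -e modify,create,delete,move .; do
    resin sync "$DEVICE_UUID"
done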

Also, just a heads-up in case you did not know about it: there is another development mode you could use, local mode. I’m not sure if it fits better for you than the inotify + resin sync approach:

Is there an update on this?

resin sync takes 30s, and that is too long for efficient development. Normally I develop by auto-syncing files to the Pi and restarting the node process in the ssh terminal of the device. That takes less than 4s.

Is there a way to do this on a local development build? Or are there other development options that take less than 5s to test whether code is working?
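
Roughly, the sync-and-restart loop I mean looks like this (rsync is just one way to do the copy; the host, paths and entry point below are placeholders):

# Push the local working copy to the Pi, then restart the node process over ssh
rsync -az ./ pi@raspberrypi.local:/home/pi/app/
ssh pi@raspberrypi.local 'pkill -f "node index.js" || true; cd /home/pi/app && nohup node index.js > app.log 2>&1 &'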

@brett resin sync is different from resin local.
Did you check the link that florin pasted above?
Resin local is a lot faster and might be a good match for your development needs.

I can’t find a clear tutorial on how to use the resin local push command.
It first complains that there is no Dockerfile; when I create one by removing the .template extension, I get different errors.
And if I google it, I find only open GitHub tickets complaining about the issue.
So I’m a bit lost on that route.

Hi @brett, what does it complain about in the second case, when using a Dockerfile? In that case the FROM statement has to use a specific, valid image name; in practice that means replacing %%RESIN_MACHINE_NAME%% with your machine name, e.g. raspberrypi3 if you are using an RPi3, or whatever is appropriate for your device (see our list of images, for example: https://docs.resin.io/runtime/resin-base-images/ ). We are working on making the CLI work with Dockerfile.template files as well, but it’s in progress (GitHub issue).
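
For example, a quick way to generate the Dockerfile from the template (raspberrypi3 here is just an example machine name; substitute the one for your device type):

sed 's/%%RESIN_MACHINE_NAME%%/raspberrypi3/' Dockerfile.template > Dockerfile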

Most of the guidance on how to use it is in resin local push --help, which has some explanation and examples.

Hi, thank you for the help so far, but it still isn’t working.
My application works if I push to resin and if I use resin sync.

I copied the Dockerfile template and replaced the image name as described above. But if I run "resin local push deviceuuid -s ." it starts by pulling the image but fails during building. That is weird, since it works for the cloud build and the sync build.

Logs below

$ sudo resin local push mydeviceuuid.local -s .
* Building..
- Stopping and Removing any previous 'myappname' container
- Building new 'myappname' image
Step 1/11 : FROM resin/raspberrypi3-node:slim
slim: Pulling from resin/raspberrypi3-node
5ec7d30a9a8c: Already exists
2c5f1d9670fa: Already exists
effee8c83224: Already exists
9c72b15ef569: Already exists
8b21541bbf83: Already exists
5f6f3e4aab84: Already exists
3c415431ffe3: Already exists
47eb08f01e22: Already exists
bb35d33eae84: Already exists
f6e2c0f91fd9: Already exists
38b3af3b1080: Already exists
fa61be08855e: Already exists
987bb75b0817: Already exists
eef675aada05: Already exists
38d9d717e243: Already exists
ee86a4d62ca2: Already exists
08be2fa1c4f0: Already exists
 ---> b00e0fd9fb82
Step 2/11 : RUN apt-get update && apt-get install -yq     automake libtool git build-essential python
 ---> Running in 2287816093df
Get:1 http://security.debian.org jessie/updates InRelease [63.1 kB]
Get:2 http://archive.raspbian.org jessie InRelease [14.9 kB]
Get:3 http://archive.raspberrypi.org jessie InRelease [22.9 kB]
Ign http://deb.debian.org jessie InRelease
Get:4 http://deb.debian.org jessie-updates InRelease [145 kB]
Get:5 http://deb.debian.org jessie Release.gpg [2373 B]
Get:6 http://deb.debian.org jessie Release [148 kB]
Get:7 http://security.debian.org jessie/updates/main armhf Packages [532 kB]
Get:8 http://archive.raspbian.org jessie/main armhf Packages [13.3 MB]
Get:9 http://archive.raspberrypi.org jessie/main armhf Packages [170 kB]
rdt push failed. Error: The command '/bin/sh -c apt-get update && apt-get install -yq     automake libtool git build-essential python' returned a non-zero code: 137 Error: The command '/bin/sh -c apt-get update && apt-get install -yq     automake libtool git build-essential python' returned a non-zero code: 137
    at Stream.<anonymous> (/usr/local/lib/node_modules/resin-cli/node_modules/resin-sync/build/docker-utils.js:145:23)
    at emitOne (events.js:77:13)
    at Stream.emit (events.js:169:7)
    at drain (/usr/local/lib/node_modules/resin-cli/node_modules/through/index.js:36:16)
    at Stream.stream.queue.stream.push (/usr/local/lib/node_modules/resin-cli/node_modules/through/index.js:45:5)
    at Parser.parser.onToken (/usr/local/lib/node_modules/resin-cli/node_modules/resin-sync/node_modules/JSONStream/index.js:132:18)
    at Parser.proto.write (/usr/local/lib/node_modules/resin-cli/node_modules/resin-sync/node_modules/jsonparse/jsonparse.js:134:34)
    at Stream.<anonymous> (/usr/local/lib/node_modules/resin-cli/node_modules/resin-sync/node_modules/JSONStream/index.js:23:12)
    at Stream.stream.write (/usr/local/lib/node_modules/resin-cli/node_modules/through/index.js:26:11)
    at IncomingMessage.ondata (_stream_readable.js:528:20)
    at emitOne (events.js:77:13)
    at IncomingMessage.emit (events.js:169:7)
    at readableAddChunk (_stream_readable.js:146:16)
    at IncomingMessage.Readable.push (_stream_readable.js:110:10)
    at HTTPParser.parserOnBody (_http_common.js:109:22)
    at Socket.socketOnData (_http_client.js:305:20)
    at emitOne (events.js:77:13)
    at Socket.emit (events.js:169:7)
    at readableAddChunk (_stream_readable.js:146:16)
    at Socket.Readable.push (_stream_readable.js:110:10)
    at TCP.onread (net.js:523:20)

Hi, can you please make sure that you have used the Enable Local Mode action on the device, according to the docs?
Also, please confirm whether you are using a dev Resin OS image on your device.

We have seen these issues before when the supervisor running on the device is not switched into Local Mode as mentioned: it then clears out user containers and images it doesn’t know about, which is where the cryptic “returned a non-zero code: 137” has come from in the cases we’ve seen so far. In that case, the error would happen at a different stage on each push.

After switching into Local Mode, the supervisor’s behaviour adapts and it won’t clean out containers it doesn’t know about.

The device was in local mode when the error occurred.

resin local push is, well, awfully slow, since the image is built on the Raspberry Pi, not on your local machine.

The other way around would be to somehow build the image on your x86 machine and then push it onto the device locally. Here is a great article explaining how to build ARM images on x86. However, resin currently doesn’t have a way to push an image built on an x86 machine directly in local mode. Here is the GitHub issue related to this topic.
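
For reference, building an ARM image on an x86 machine generally relies on QEMU user-mode emulation. A rough, untested sketch (the image tag is a placeholder, and as noted above there is currently no supported way to push the result to a device in local mode):

# Register QEMU binfmt handlers so ARM binaries can run during the build (Linux hosts)
docker run --rm --privileged multiarch/qemu-user-static:register --reset
# Build the ARM image locally; "myapp-armhf" is just an example tag
docker build -t myapp-armhf .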


But there is another side to this argument…

I tried doing a resin build on my machine, and I have to admit, it was quite slow compared to the resin servers.

The resin servers fly compared to my local machine, probably because they do the builds on fast ARM servers.

Has anyone else tried a resin build on their x86 machine? How was the experience?


resin sync is fast enough for small code changes.


I do, however, find it slower than the traditional “write and run” cycle. For now, I have resorted to making my applications runnable on x86.

For example, my python app uses the keyboard library to emulate GPIO when it’s running on x86.

try:
    import RPi.GPIO as GPIO  # only importable on the Pi
except (RuntimeError, ImportError):
    # On x86 there is no GPIO, so fall back to a keyboard shortcut
    import keyboard

    def is_pressed():
        return keyboard.is_pressed('ctrl+alt+r')
else:
    # On the Pi, read the real button on GPIO 18 (active low, pull-up enabled)
    GPIO.setmode(GPIO.BCM)
    GPIO.setup(18, GPIO.IN, pull_up_down=GPIO.PUD_UP)

    def is_pressed():
        return not GPIO.input(18)

This is the fastest solution to local development I have found.

@devxpy the first time you deploy your code on a device with resin local push, it is expected to take some time to complete, since it needs to fetch the build toolchain and then build the whole container from scratch on the device (as you said).
On the other hand, subsequent pushes shouldn’t take that long if you use multi-stage builds for your app, since most of the time, thanks to layer caching, only the top layer(s) will need to be rebuilt. Of course, that build time also depends on the size of your application itself. Does it take too long even on subsequent builds?

See: https://resin.io/blog/multi-stage-docker-builds-for-tiny-iot-images/

I’m aware of the caching, it works as expected.

However,

It takes a long time for the RPi to install dependencies from pip, so even with caching it’s still very slow every time I update my dependencies.

Similarly, APT dependencies take quite some time to install.

(And god help you if you have to manually build something because pre-built binaries aren’t available.)

P.S. I won’t say it’s an issue with resin, per se. I have used plain old Raspbian in the past and it’s comparable in terms of performance.

Fully agree with @devxpy on this. I also do the more serious development on my laptop and then deploy on devices for testing and tuning.

My project is based on the Pi Zero W, and because of that it is terribly slow, even code syncs. After working on this project for over a year I have come to a couple of different optimizations and workflow changes to make development as fast as possible. Hopefully this will help someone in the future!

First: Use the Docker cache

Second: If using a Raspberry Pi, use PiWheels! It’s awesome! No more compiling numpy for days on end. No more accidentally breaking the cache and having to twiddle thumbs for a few hours during the workday.
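
For instance, pointing pip at the piwheels index pulls pre-built ARM wheels instead of compiling from source (numpy here is just an example package):

pip3 install numpy --extra-index-url https://www.piwheels.org/simple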

Third: Use base images. When building a base image on your computer, you have these lines at the start and end of the base image Dockerfile:

RUN [ "cross-build-start" ]
RUN [ "cross-build-end" ]

which I think means it’s cross-compiling on my computer. You should do all apt-get install and pip install commands in the base image. If I am just testing a package, I only install that one package in the standard Dockerfile used in the push, or I use PiWheels, so it’s much faster.
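
A rough sketch of that base-image workflow, assuming you push the base image to a registry of your own (all image and registry names below are placeholders):

# Build the heavy base image (apt-get/pip dependencies) once on your x86 machine;
# the resin base images wrap the build in cross-build-start/cross-build-end so the
# ARM packages install under QEMU emulation (your build machine may need QEMU binfmt support).
docker build -f Dockerfile.base -t myregistry/my-rpi-base:latest .
docker push myregistry/my-rpi-base:latest
# The Dockerfile you actually push to the device then just starts FROM that base image
# and only adds the code you are currently iterating on.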

Last: Use sshfs! I’ve been using this command for mounting cloud servers to my host machine, and it’s pretty convenient. It uses ssh to mount remote filesystems into a folder on your machine. Install sshfs on your Balena device, and mount your PC’s code folder on the Balena device.

Workflow Without sshfs:

balena local push

Wait for it to push; once it’s done you can kill it and ssh in:

ctrl-C
balena local ssh
tail -f /data/my_application_log.log

Each push takes 10-20 seconds or so, then you ssh in, have to select which container to ssh into, and finally open up the log.

Workflow with sshfs:
SSH into the device twice: one of the windows will be used for viewing the debug log, and one will be used to restart your program. The benefit, however, is that these ssh sessions do not get killed on each push.

balena local ssh
sshfs myname@myip:/home/code/ /usr/src/app/code
python3 /usr/src/app/code/app.py
balena local ssh
tail -f /data/my_application_log.log

This configuration lets me edit code on my computer and immediately run it on my Pi Zero W device. It saves me a good 30 seconds per code change and helps keep my focus, since I am not tempted to switch tasks in the meantime.

I work with a totally different approach:

First, I run a NUC image locally as a virtual machine, as mentioned here. I do all the development that isn’t hardware specific. Most of the time I am able to connect my hardware-related stuff over some sort of USB converter. Building and compiling is quicker for x64, and I imagine the local sync will work fine too. On top of this, you don’t have to carry any additional devices around.
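
As an illustration only, one way to boot a downloaded development image in a VM with QEMU (the file name and resource flags are placeholders; depending on the image you may also need UEFI firmware such as OVMF):

qemu-system-x86_64 -m 1024 -drive file=resin-intel-nuc-dev.img,format=raw -net nic -net user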

Secondly, when hardware truly gets involved, I switch to a local dev image and do all the debugging on the device itself, copying code over from my main IDE to the device just using vim. It takes three commands to delete the complete contents, paste the new code, and save it.

In my experience, installing dependencies and compiling is always quicker when deploying the code through resin than installing it locally. That said, for this to work it is important to get the caching right. For instance, a multi-file copy command breaks all subsequent caching even if only one file is changed, so I always copy my pip requirements over first, install them, and then copy all other files.