Mar 18 2020

A Simple Tool: Memento

I love simple tools. They are easy to write, maintain, and deploy. Memento is one of these tools. I created Memento back in 2017, and it is still humming along on a DigitalOcean server despite my having ignored its Docker container for three years. I’m a huge fan of Docker deploys, but that post is for another day.

Before describing the script, it’s probably best to cover the problem statement first.

My life exists in the vimwiki. Without the vimwiki, I would be completely lost. It’s the best place to aggregate all of my data: calendar events, completed Trello items (for reporting to project managers), the weather, etc. It feels like a personal secretary.

Funneling everything into one place has its downsides, too. One of them is interacting from a smartphone or tablet. The files live on my laptop’s file system, and the laptop is normally powered down when I’m on the go, so the files go cold. How do I write from my phone in this scenario? I could use iPhone Notes, but I usually forget to transfer those notes over to my vimwiki. Ideally, I could pipe data straight into my vimwiki files.

The solution: Push data from the smart phone to AWS SQS. Then when I happen to return to my laptop, pull from the queue and append to the vimwiki.

More specifically, a Flask web server sits in the middle: it accepts requests and pipes the text into AWS. The smartphone talks to this server so that the client isn’t responsible for connecting to AWS directly. This web service is what I call Memento.

On my mobile device, I have Shortcuts send user input to Memento. This way I don’t even have to write a front end.
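On the wire, the Shortcut’s request boils down to a small JSON POST. A sketch with curl — the MEMENTO_URL variable is a stand-in for wherever the Flask app happens to be deployed:

```shell
# The Shortcut's "Get Contents of URL" action amounts to this request.
# MEMENTO_URL is hypothetical; point it at your own deployment.
payload='{"text": "pick up dry cleaning"}'
if [ -n "$MEMENTO_URL" ]; then
    curl -s -X POST "$MEMENTO_URL/sqs" \
        -H 'Content-Type: application/json' \
        -d "$payload"
else
    echo "MEMENTO_URL not set; would send: $payload"
fi
```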

This cloud service can be summarized as:

# `sqs` is a boto3 SQS client and `queue_url` the target queue,
# both configured at application start.
@application.route('/sqs', methods=['POST'])
def sqs_create():
    text = flask.request.get_json()['text']
    sqs.send_message(QueueUrl=queue_url, MessageBody=text)
    return flask.Response(text, mimetype='text/plain')

And then on my laptop I fetch from this queue with:

import os

import boto3

sqs = boto3.resource('sqs', region_name='us-east-1')
queue_url = os.environ['AWS_SQS_URL']

has_messages = True
while has_messages:
    response = sqs.meta.client.receive_message(
        QueueUrl=queue_url, MaxNumberOfMessages=10)
    has_messages = 'Messages' in response

    for message in response.get('Messages', []):
        print(message['Body'])  # hand the text off to the wiki
        sqs.meta.client.delete_message(
            QueueUrl=queue_url,
            ReceiptHandle=message['ReceiptHandle'])

There’s no real need to go through the web server to pull the data back down. Since the laptop is already the “control center”, I might as well interact with AWS directly. It’s a pretty straightforward script.
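Once the message bodies are pulled down, appending them to the wiki is only a few more lines. A minimal sketch — the diary/YYYY-MM-DD.wiki layout is an assumption, so adjust the paths to your own vimwiki setup:

```python
import datetime
from pathlib import Path


def append_to_wiki(messages, wiki_dir):
    """Append fetched Memento messages to today's vimwiki diary page.

    The diary/YYYY-MM-DD.wiki layout is a guess at a typical vimwiki
    setup, not Memento's actual structure.
    """
    today = datetime.date.today().isoformat()
    page = Path(wiki_dir) / "diary" / f"{today}.wiki"
    page.parent.mkdir(parents=True, exist_ok=True)
    with page.open("a") as f:
        for text in messages:
            f.write(f"- {text}\n")
    return page
```

Each queued note lands as a list item on today’s page, so a day’s captures all end up in one place.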

To fetch, I just run the script above with python.

Since I got tired of running that command, I have a parent shell script called ./a. It pulls Memento messages, Trello tasks, calendar events, and a couple of other housekeeping items like notmuch new. I run ./a every morning.
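Roughly, ./a can be sketched like this. The per-task script names are made up, and each step skips gracefully when the script or tool isn’t present:

```shell
#!/bin/sh
# Sketch of a morning aggregator like ./a; the .py names are hypothetical.
sync_all() {
    for task in memento.py trello.py calendar.py; do
        if [ -f "$task" ]; then
            python3 "$task"
        else
            echo "skipping $task (not found)"
        fi
    done
    # Pick up any new mail, if notmuch is installed.
    if command -v notmuch >/dev/null 2>&1; then
        notmuch new || true
    fi
}
sync_all
```

One command, one coffee, and the wiki is current for the day.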