Using the Toggl Track API to automate a daily ritual with Python
I have a real habit of over-engineering automation tools. This is yet another semi-useless automation project, but this time using Python! Before we dive into the specifics, here is a bit of context…
In my current team’s form of agile, we post a short message at the end of the day summarising where we got to with our tickets before signing off. This acts as cover for our teammates in other timezones ahead of the next day’s stand-up.
Our east coast folks get to read the west coast summaries when we log in in the morning, and our west coast folks read ours as we log off, roughly after their lunch. I find these particularly useful for bridging the gap in progress or project updates between dailies. These messages normally follow the format of a short bullet list of updates, with context or links to tickets and progress as needed.
I also have to track my time (like many other developers). For this I use Toggl Track: it has a super sleek UI, a baked-in Pomodoro timer in the desktop app, and simple project assignment that lets me group my time entries by ticket super easily. Tie in their iOS app and an extension for my Stream Deck, and I have no excuse not to keep an eye on my time! Still, I’m not amazing at remembering to log my hours. My current engineering role at Points is actually the first development job where I’ve had actual timesheets, and I just haven’t built up that habit yet. But I had an idea on how to help: generate my daily summaries from my Toggl time-tracking entries!
Automate with Toggl API
It turns out Toggl Track has a REST API, and buried in their v8 docs are details of their /time_entries endpoint: provide a start and end date, and the API responds with the entries that match, a daily summary if you will.
I put together a quick Python script that uses requests to calculate exact ISO 8601 start and end dates for the day and send a GET request to that endpoint. The API responds with a JSON array of entries looking something like this:
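Something along these lines does the trick. This is a minimal sketch rather than the full script: it assumes the v8 /time_entries endpoint with HTTP basic auth (your API token as the username, the literal string api_token as the password), the TOGGL_API_TOKEN environment variable is my placeholder, and the entry fields in the comments are illustrative.

```python
import os
from datetime import datetime, time, timezone

import requests

# Assumption: the personal Toggl API token lives in an environment variable.
API_TOKEN = os.environ["TOGGL_API_TOKEN"]


def todays_entries():
    """Fetch today's time entries from the Toggl Track v8 API."""
    today = datetime.now(timezone.utc).date()
    start = datetime.combine(today, time.min, tzinfo=timezone.utc)
    end = datetime.combine(today, time.max, tzinfo=timezone.utc)
    response = requests.get(
        "https://api.track.toggl.com/api/v8/time_entries",
        params={"start_date": start.isoformat(), "end_date": end.isoformat()},
        # v8 basic auth: API token as the username, literal "api_token" as the password.
        auth=(API_TOKEN, "api_token"),
    )
    response.raise_for_status()
    # Each entry comes back as a dict along the lines of:
    # {"id": 1234, "description": "ABC-123 fix the widget",
    #  "start": "2021-02-03T14:00:00+00:00", "duration": 2700, ...}
    return response.json()
```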
I tend to reuse the same description to track work on each ticket which lets me group them together. This saves me creating a new Toggl project for each one as they are a bit cumbersome to manage. For larger or longer term projects I do create them, but they are few and far between.
Processing my entries
So we’ve got a request to the API, with our credentials and our start and end dates as parameters. Time to start manipulating the output! I start by combining entries with duplicate descriptions, adding their durations together. I then sort the resulting dict by duration so my most time-consuming (and, I’d like to think, most valuable) work is at the top.
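Here’s a rough sketch of that grouping and sorting step, assuming the entry shape above (durations are reported in seconds, and a still-running entry has a negative duration, so those get skipped):

```python
from collections import defaultdict


def summarise(entries):
    """Combine entries that share a description and sort by total duration."""
    totals = defaultdict(int)
    for entry in entries:
        duration = entry.get("duration", 0)
        if duration > 0:  # running entries report a negative duration; ignore them
            totals[entry.get("description", "(no description)")] += duration
    # Most time-consuming work first, so the biggest items lead the summary.
    return dict(sorted(totals.items(), key=lambda item: item[1], reverse=True))
```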
Rendering
From here I create an output string to match what I normally post in Slack: a title line with an emoji, each entry starting with a bullet, and then a closing line.
One new thing I’ve picked up while doing this is the fun \N{...} syntax on line 77. This is an escape sequence for named characters in the Unicode database, so here we’re using the Studio Microphone emoji name, which results in the corresponding emoji being rendered.
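A stripped-down version of the rendering looks something like this; the exact title and sign-off wording here is illustrative rather than what I actually post:

```python
def render(summary):
    """Build the Slack-ready message: a title line, a bullet per entry, a sign-off."""
    lines = ["\N{STUDIO MICROPHONE} End of day update:"]
    # \N{BULLET} renders the • character, keeping everything in named escapes.
    lines += [f"\N{BULLET} {description}" for description in summary]
    lines.append("More tomorrow!")
    return "\n".join(lines)
```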
I also implemented pyperclip at the very end to copy the output, bullets and all, to my clipboard so it’s nice and easy to share in Slack. This is then all wired up in app.py like so:
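Something along these lines, reusing the helpers sketched above; the toggl_summary module name is just an assumption about where they might live:

```python
#!/usr/bin/env python3
"""Pull today's Toggl entries, build the summary, and copy it to the clipboard."""
import pyperclip

# Assumption: the helpers sketched above live in a local toggl_summary.py.
from toggl_summary import todays_entries, summarise, render

if __name__ == "__main__":
    message = render(summarise(todays_entries()))
    pyperclip.copy(message)  # ready to paste straight into Slack
    print(message)
```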
As you can see on the first line of app.py, we’re setting the Python interpreter using a shebang (#!), which allows me to run the file from the command line without the python prefix, after making it executable with chmod +x app.py.
I made it available from anywhere on my command line by creating a symlink to the script in /usr/local/bin named eod.
Check it out on GitHub: https://github.com/jamesrwilliams/eod