Albin Larsson: Blog

Culture, Climate, and Code

Cloudflare Worker to resolve URLs

1st March 2024

The other day, I needed to resolve a w.wiki URL from a client-side application. However, the UrlShortener MediaWiki extension does not provide an API for resolving URLs (T358049), and client-side applications can’t simply follow the redirects themselves due to CORS.

To unblock myself, I decided to write a generic Cloudflare Worker to resolve URLs, as it is a common task and I always end up dealing with the same edge cases, such as content negotiation and URL fragments. I will update the code below as I need to handle more cases.

// Created by Albin Larsson and made available under Creative Commons Zero
addEventListener('fetch', event => {
  event.respondWith(handleRequest(event.request));
});

async function handleRequest(request) {
  const urlParam = new URL(request.url).searchParams.get('url');

  if (!urlParam) {
    return new Response('URL parameter is missing.', { status: 400 });
  }

  const corsHeaders = {
    'Access-Control-Allow-Origin': '*',
    'Access-Control-Allow-Methods': 'GET, HEAD, OPTIONS',
  };

  // manually follow redirects to obtain the final URL from the Location header,
  // as the client-side-only fragment won't be part of the URL object otherwise
  let finalUrl = urlParam;
  let response;

  do {
    // pretend to be human by requesting text/html to prevent default content negotiation
    response = await fetch(finalUrl, { redirect: 'manual', headers: { 'Accept': 'text/html' } });

    if (response.status >= 300 && response.status < 400) {
      const redirectUrl = new URL(response.headers.get('location'), finalUrl);
      finalUrl = redirectUrl.href;
    }
  } while (response.status >= 300 && response.status < 400);

  return new Response(finalUrl, {
    headers: {
      ...corsHeaders
    }
  });
}
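
With the worker deployed, a client-side application can resolve a short URL with a single request. Here is a minimal sketch of the client side, assuming the worker is reachable at the hypothetical hostname resolve.example.workers.dev:

// a hypothetical w.wiki short URL
const shortUrl = 'https://w.wiki/6';
// replace resolve.example.workers.dev with your own worker's hostname
const endpoint = 'https://resolve.example.workers.dev/?url=' + encodeURIComponent(shortUrl);

// the worker returns the final URL as plain text
const finalUrl = await fetch(endpoint).then(response => response.text());
console.log(finalUrl);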

Using Python virtual environments through Just

21st February 2024

I have gotten quite fond of Just lately, largely thanks to how it forces you into the habit of creating structured documentation for the various commands and scripts you end up writing.

When adding a Justfile to a Python/Django project the other day, I found myself in a situation where I wanted to make sure that all commands ran in a virtual environment. However, because Just runs each line in a separate shell, it is not possible to activate the virtual environment on one line and then run the command on the next.
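
For illustration, a hypothetical recipe like this will not do what you might expect, since the two lines run in separate shells:

# broken: the activation happens in one shell,
# and the command on the next line runs in a fresh one
test:
  . .venv/bin/activate
  python manage.py test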

The only (sane) way I found to solve this was to prefix each command with the path to the virtual environment’s Python or Pip binary. This is not ideal, but it’s likely that you and your collaborators will have settled on a naming convention for the virtual environment directory anyway.

Here is a full example of a Justfile from one of my Django projects:

# load .env file
set dotenv-load

@_default:
  just --list

# setup virtual environment, install dependencies, and run migrations
setup:
  python3 -m venv .venv
  ./.venv/bin/pip install -r requirements.txt
  ./.venv/bin/python -Wa manage.py migrate

run:
  ./.venv/bin/python -Wa manage.py runserver

test:
  ./.venv/bin/python -Wa manage.py test

# virtual environment wrapper for manage.py
manage *COMMAND:
  ./.venv/bin/python manage.py {{COMMAND}}
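
With the {{COMMAND}} wrapper in place, everything runs through the virtual environment without ever activating it, for example:

just setup                  # create the environment and run migrations
just test                   # run the test suite
just manage createsuperuser # forward any Django command to manage.py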

Ensuring VS Code can watch Snowman projects for file changes

15th February 2024

Recently, VS Code and VS Codium have been throwing the following error at me when working with Snowman projects:

Visual Studio Code is unable to watch for file changes in this large workspace

It turns out that VS Code tries to watch all the files in the .snowman directory and its subdirectories. No wonder it’s complaining; there are a lot of files in there!

Adding .snowman to the files.watcherExclude setting in the VS Code settings solved the issue across all my Snowman workspaces.
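
For reference, this is roughly what the relevant part of settings.json looks like; the glob pattern follows the same convention as VS Code’s default watcher excludes:

{
  // keep the file watcher away from Snowman's cache directory
  "files.watcherExclude": {
    "**/.snowman/**": true
  }
}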

Now, if you do want to watch the .snowman directory for changes, one thing you can do is try to reduce the number of files in there by deleting old cache data with the following Snowman command:

snowman cache --invalidate

This can be a good practice every now and then anyway, to keep the folder from growing larger and larger.

Building and deploying Snowman sites with Gitlab Pages

14th February 2024

I have previously written about how to build Snowman sites on Github Actions. Yesterday I had to figure out not only how to build Snowman sites on Gitlab Pages but also how to deploy them. Not only was the Gitlab CI/CD configuration a joy compared to Github Actions, but it integrates so well with the Gitlab Pages service that any Snowman site should be able to build and deploy with the following .gitlab-ci.yml configuration:

# The Docker image that will be used to build your app
image: debian:bookworm
# Functions that should be executed before the build script is run
before_script:
  - apt-get update && apt-get install --yes ca-certificates wget
  - wget
    "https://github.com/glaciers-in-archives/snowman/releases/download/0.5.0/snowman-linux-amd64"
    -O snowman && chmod +x snowman
pages:
  script:
    - ./snowman build
  publish: site
  artifacts:
    paths:
      # The folder that contains the files to be exposed at the Page URL
      - site

  rules:
    # This ensures that only pushes to the default branch will trigger
    # a pages deploy
    - if: $CI_COMMIT_REF_NAME == $CI_DEFAULT_BRANCH

Now, if you want to build a site that uses local RDF files, you would need to set up Oxigraph or another SPARQL service, just like in the Github Actions example. I haven’t needed that yet, so I leave that exercise to the reader.
