2024-06-13 12:50 UTC: Server migration is now complete. Feel free to raise a GitHub issue if you think things are not working as expected.

Forced photometry is now available from the Southern Telescopes (El Sauce, Chile and Sutherland, South Africa). Please be aware that the difference imaging template south of -50 degrees declination was changed during commissioning, so you may get an unexpected discontinuity in your target's difference lightcurve.

The forced photometry server has a REST API that makes it easy to script requests. In this example, we'll use Python.

First, let's import some useful packages and set the API URL:

import io
import re
import sys
import time

import pandas as pd
import requests

BASEURL = "https://fallingstar-data.com/forcedphot"

Next, we need to obtain a secret token from our username and password. This step normally only has to be done once, unless a token reset is specifically requested (e.g., if it accidentally becomes compromised).

resp = requests.post(url=f"{BASEURL}/api-token-auth/", data={
    'username': "__my_username__", 'password': "__my_password__"})

if resp.status_code == 200:
    token = resp.json()['token']
    print(f'Your token is {token}')
    headers = {'Authorization': f'Token {token}', 'Accept': 'application/json'}
else:
    print(f'ERROR {resp.status_code}')
    print(resp.json())

Next, submit an RA and Dec coordinate to the server to obtain a task URL for checking the status. Note that requests may be throttled if we make too many in a short time.

task_url = None
while not task_url:
    with requests.Session() as s:
        resp = s.post(f"{BASEURL}/queue/", headers=headers, data={
            'ra': 44, 'dec': 22, 'mjd_min': 59248.})

        if resp.status_code == 201:  # successfully queued
            task_url = resp.json()['url']
            print(f'The task URL is {task_url}')
        elif resp.status_code == 429:  # throttled
            message = resp.json()["detail"]
            print(f'{resp.status_code} {message}')
            t_sec = re.findall(r'available in (\d+) seconds', message)
            t_min = re.findall(r'available in (\d+) minutes', message)
            if t_sec:
                waittime = int(t_sec[0])
            elif t_min:
                waittime = int(t_min[0]) * 60
            else:
                waittime = 10
            print(f'Waiting {waittime} seconds')
            time.sleep(waittime)
        else:
            print(f'ERROR {resp.status_code}')
            print(resp.json())
            sys.exit()

Now we can check whether the task has completed and, if so, retrieve the results. Be aware that the result URL is intended for short-term use and may expire in the future.

result_url = None
while not result_url:
    with requests.Session() as s:
        resp = s.get(task_url, headers=headers)

        if resp.status_code == 200:  # HTTP OK
            if resp.json()['finishtimestamp']:
                result_url = resp.json()['result_url']
                print(f"Task is complete with results available at {result_url}")
                break
            elif resp.json()['starttimestamp']:
                print(f"Task is running (started at {resp.json()['starttimestamp']})")
            else:
                print("Waiting for job to start. Checking again in 10 seconds...")
            time.sleep(10)
        else:
            print(f'ERROR {resp.status_code}')
            print(resp.json())
            sys.exit()

with requests.Session() as s:
    textdata = s.get(result_url, headers=headers).text

    # if we'll be making a lot of requests, keep the web queue from being
    # cluttered (and reduce server storage usage) by sending a delete operation
    # s.delete(task_url, headers=headers).json()

The raw text can be parsed into a pandas DataFrame for easy manipulation.

dfresult = pd.read_csv(io.StringIO(textdata.replace("###", "")), sep=r"\s+")
print(dfresult)
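
The DataFrame makes further manipulation straightforward. As a minimal sketch, the snippet below splits the measurements by filter and computes an inverse-variance weighted mean flux per filter; the column names used here (uJy, duJy and the filter column F) are assumptions about the output format, so check the header of your own result file before relying on them.

# Assumed columns: uJy and duJy (flux and uncertainty in microjanskys), F (filter)
for filt, group in dfresult.groupby('F'):
    # inverse-variance weighted mean flux
    weights = 1.0 / group['duJy'] ** 2
    meanflux = (group['uJy'] * weights).sum() / weights.sum()
    print(f"Filter {filt}: {len(group)} points, weighted mean flux {meanflux:.1f} uJy")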

A complete Python example script for the API can be downloaded from apiexample.py.

For plotting the output text files, David Young has provided a Python plot script.
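
If you only need a quick look at the lightcurve, a minimal matplotlib sketch along the following lines may also do; as above, the MJD, uJy, duJy and F column names are assumptions about the output format.

import matplotlib.pyplot as plt

# one errorbar series per filter, flux in microjanskys against MJD
fig, ax = plt.subplots()
for filt, group in dfresult.groupby('F'):
    ax.errorbar(group['MJD'], group['uJy'], yerr=group['duJy'], fmt='.', label=f'filter {filt}')
ax.set_xlabel('MJD')
ax.set_ylabel('Flux (uJy)')
ax.legend()
plt.show()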

If you notice any issues with the data or this website, please report an issue on GitHub or, for urgent or security matters, email Luke Shingles. This is an open-source volunteer project, and feature requests may be considered as time allows.