Bulk removing Slack files

Removing your Slack files when you’re out of storage is a real pain in the ass and time-consuming. On the free plan it’s even harder, because you only get 5 GB of storage and you’ll have to do it frequently.

Normally you have to remove your files one by one, but an article I came across last week changed this for me: it showed how to remove 100 files at a time without losing my most recent files.

At work we even created a slash command for it that everyone on our team can run the moment we reach our storage limit.

The only thing you have to do to make this work with a slash command is add a token parameter to the script, so the same script can be used for multiple users.

import argparse
import calendar
from datetime import datetime, timedelta

import requests

parser = argparse.ArgumentParser()
parser.add_argument("token", help="Slack API token of the user whose files should be deleted")
args = parser.parse_args()

_token = args.token
_domain = "thesedays"

if __name__ == '__main__':
    while True:
        # files.list returns at most 100 files per call, so keep
        # asking until nothing matching the filter is left.
        files_list_url = 'https://slack.com/api/files.list'
        # Only list files older than 30 days, so recent files survive.
        date = str(calendar.timegm(
            (datetime.now() - timedelta(days=30)).utctimetuple()))
        data = {"token": _token, "ts_to": date}
        response = requests.post(files_list_url, data=data)
        if len(response.json()["files"]) == 0:
            break
        for f in response.json()["files"]:
            print("Deleting file " + f["name"] + "...")
            timestamp = str(calendar.timegm(datetime.now().utctimetuple()))
            # The same (undocumented) endpoint the Slack web client calls.
            delete_url = ("https://" + _domain +
                          ".slack.com/api/files.delete?t=" + timestamp)
            requests.post(delete_url, data={
                "token": _token,
                "file": f["id"],
                "set_active": "true",
                "_attempts": "1"})
    print("DONE!")

Now, when the slash command calls your API, you just run the Python script with the token that was passed along, and it starts deleting the files of the user who entered the command.
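
Here is a minimal sketch of what that endpoint could look like. It assumes Flask, a hypothetical USER_TOKENS mapping from Slack user IDs to the tokens you collected from your teammates, and that the script above is saved as delete_slack_files.py:

import subprocess
from flask import Flask, request

app = Flask(__name__)

# Hypothetical: collect each teammate's token once and store it here
# (or in a database) keyed by their Slack user ID.
USER_TOKENS = {
    "U024BE7LH": "xoxp-...",
}

@app.route("/cleanup", methods=["POST"])
def cleanup():
    # Slack sends the ID of the user who typed the command as form data.
    user_id = request.form["user_id"]
    token = USER_TOKENS.get(user_id)
    if token is None:
        return "Sorry, no token on file for you yet."
    # Kick off the deletion in the background so we can answer within
    # Slack's three-second timeout for slash command responses.
    subprocess.Popen(["python", "delete_slack_files.py", token])
    return "Deleting your files older than 30 days..."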

It has already saved me a lot of time. Big thanks to the original author of the script: Santiago L. Valdarrama