
I noticed my home server's SSD was running out of space, and it turned out to be my Jellyfin Docker container, which wasn't correctly clearing its transcode directory in /var/lib/jellyfin/transcodes.

I simply created a new directory on my media hard drive and bind-mounted the above-mentioned directory to it. Now Jellyfin has over 1 TB of free space to clutter, in theory. To prevent that, I created a cron job that deletes old files in case Jellyfin doesn't:

@daily /usr/bin/find /path/to/transcodes -mtime +1 -delete
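For the bind mount itself, the details depend on your setup: with Docker you'd map the container's transcode directory to the new location via a volume, while on a bare-metal install an /etc/fstab entry along these lines would do it (paths here are placeholders, adjust to your own):

/mnt/media/jellyfin-transcodes /var/lib/jellyfin/transcodes none bind 0 0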

Easy!

[-] Novi@sh.itjust.works 36 points 8 months ago
[-] dataprolet@lemmy.dbzer0.com 3 points 8 months ago

I have like a dozen people using my Jellyfin, and sometimes 3-4 people watch something at the same time, which results in a lot of transcoding data. At the moment my transcoding directory (which is cleaned every 24 hours) is almost 8 GB. I don't have the RAM to do this.

[-] domi@lemmy.secnd.me 9 points 8 months ago

Starting with 10.9 you can enable segment deletion, so files are cleaned up while the transcode is still running.

[-] passepartout@feddit.de 3 points 8 months ago

I'm so looking forward to this. When I tried to use a tmpfs/ramdisk, transcoding would simply stop because there was no space left.

[-] dataprolet@lemmy.dbzer0.com 1 points 8 months ago

Version 10.9 isn't even released yet, right?

[-] domi@lemmy.secnd.me 3 points 8 months ago

Nope, release target is mid-April currently.

[-] LunchEnjoyer@lemmy.world 3 points 8 months ago

How is this done? If you don't mind sharing 🤗

[-] Novi@sh.itjust.works 5 points 8 months ago

tmpfs is the filesystem you are looking for. You can mount it like any other filesystem in /etc/fstab.

tmpfs /path/to/transcode/dir tmpfs defaults 0 0
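Once that line is in /etc/fstab you can activate it without rebooting (assuming the mount point directory already exists):

mount /path/to/transcode/dir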

[-] Shadow@lemmy.ca 4 points 8 months ago* (last edited 8 months ago)

You can just point it at /dev/shm as the transcoding folder, for a quick and dirty approach.

Otherwise you'd mount a tmpfs disk.
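For the /dev/shm route, a minimal sketch (the subdirectory name is just an example, and the transcode path option lives in Jellyfin's playback/transcoding settings):

mkdir /dev/shm/jellyfin-transcodes

Then point Jellyfin's transcode path at that directory.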

[-] agent_flounder@lemmy.world 1 points 8 months ago

I will have to try that once my RAM upgrade gets here.

[-] Novi@sh.itjust.works 3 points 8 months ago

You can restrict the size of the ramdisk so you don't end up killing processes. A large amount of RAM is not mandatory.
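For example, a size cap can go straight into the fstab entry via the size mount option (the 4G here is just an example value):

tmpfs /path/to/transcode/dir tmpfs defaults,size=4G 0 0

An already mounted tmpfs can also be resized on the fly with mount -o remount,size=4G /path/to/transcode/dir.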

[-] agent_flounder@lemmy.world 1 points 8 months ago

Good to know. Well, I have 16 GB now, so that should give me plenty to spare.
