this post was submitted on 19 Dec 2025
-21 points (34.3% liked)

Selfhosted


A place to share alternatives to popular online services that can be self-hosted without giving up privacy or locking you into a service you don't control.

Rules:

  1. Be civil: we're here to support and learn from one another. Insults won't be tolerated. Flame wars are frowned upon.

  2. No spam posting.

  3. Posts have to be centered around self-hosting. There are other communities for discussing hardware or home computing. If it's not obvious why your post topic revolves around selfhosting, please include details to make it clear.

  4. Don't duplicate the full text of your blog or github here. Just post the link for folks to click.

  5. Submission headline should match the article title (don’t cherry-pick information from the title to fit your agenda).

  6. No trolling.

  7. No low-effort posts. This is subjective and will largely be determined by the community member reports.


Documentation for nanogram available here for awhile

Edit: ~~Don't be ungrateful~~ Be nice pls. I put a lot of time, effort, and my own money into making this. I'm choosing to share it freely :)

Yes, I get help from LLMs. Review the code if you think it's unsafe, or just move on and don't use it. Happy to answer any technical questions.

Edit 2: Expanded source code for termux version here.

Edit 3: Expanded source for pi version here

[–] savvywolf@pawb.social 2 points 5 days ago (1 children)

Had a quick skim and found this little guy:

# ---------- Protected media route ----------
@app.route('/img/<path:name>')
@login_required
def media(name):
    db = SessionLocal()
    try:
        me = current_user(db)
        # Find the post with this image
        post = db.query(Post).filter_by(image_path=name).first()
        if post:
            # Check visibility
            can_view = post.user_id == me.id or db.query(UserVisibility).filter_by(
                owner_id=post.user_id, viewer_id=me.id
            ).first() is not None
            if not can_view:
                abort(403)
        return send_from_directory(UPLOAD_DIR, os.path.basename(name))
    finally:
        db.close()

I've not read through everything, but there are some security concerns that jump out to me from just this function. Hopefully you can enlighten me on them.

Firstly, what is stopping a logged in user from accessing any image that, for whatever reason, doesn't have an associated post for it?

Secondly, the return codes for "the image doesn't exist" (404) and "the image exists but you can't access it" (403) look to be different. This means that a logged in user can check whether a given filename (e.g. "epstein_and_trump_cuddling.jpg") has been uploaded or not by any user.

Both of these look to be pretty bad security issues, especially for a project touting its ability to protect from nationstates. Am I missing something?
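
For reference, here's a rough sketch of how the route could close both gaps, reusing the names from the snippet above (untested, not a drop-in fix):

# Sketch only: app, login_required, SessionLocal, current_user, Post,
# UserVisibility, UPLOAD_DIR and the imports come from the snippet above.
@app.route('/img/<path:name>')
@login_required
def media(name):
    db = SessionLocal()
    try:
        me = current_user(db)
        post = db.query(Post).filter_by(image_path=name).first()
        if post is None:
            # No post owns this file, so refuse to serve it at all
            abort(404)
        can_view = post.user_id == me.id or db.query(UserVisibility).filter_by(
            owner_id=post.user_id, viewer_id=me.id
        ).first() is not None
        if not can_view:
            # Same status as "not found", so existence can't be probed
            abort(404)
        return send_from_directory(UPLOAD_DIR, os.path.basename(name))
    finally:
        db.close()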

[–] hereforawhile@lemmy.ml -1 points 5 days ago* (last edited 5 days ago) (1 children)
  1. Quite the opposite: I don't tout its ability against nation states in the README.

  2. There are two checks before someone on the server can view a post. First, are you a valid user? Second, did the person sharing the photo give you access to view their posts? If both are true, you can see the post. Also, on upload to the server the image gets compressed and stripped of all metadata, including the file name (roughly as sketched below), so no, they couldn't check a file name. Each photo is given a randomly generated filename.

Edit:

  3. There can't be any posts without images attached. There will always be a post and an image (unless it's a 1-1 DM or group chat, which has its own rules for access).
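
Roughly, the compress-and-store step works like this (simplified sketch with a hypothetical signature, not the exact code):

# Simplified sketch of the compress-and-store step (hypothetical signature, not the exact code)
import os
import uuid
from PIL import Image

def compress_and_store(file_storage, upload_dir, quality=80):
    # Re-encoding through Pillow drops EXIF/metadata and the original filename
    img = Image.open(file_storage).convert("RGB")
    name = f"{uuid.uuid4().hex}.jpg"  # random server-side name, nothing user-supplied
    img.save(os.path.join(upload_dir, name), "JPEG", quality=quality)
    return name
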
[–] savvywolf@pawb.social 2 points 4 days ago (1 children)
  1. You list "Activist/journalist secure communication" as a use case. Not all countries have freedom of the press.

  2. Looks like you name images based on a random UUID, so that should protect against filename attacks. But if you do have a filename, you can tell whether the image has been uploaded or not.

Also, it looks like all uploads are converted to JPG, regardless of whether the original image was a JPG (or even an image) or not. Don't do that.

  3. Can you point to where in the code this invariant is enforced?
[–] hereforawhile@lemmy.ml 0 points 4 days ago* (last edited 4 days ago) (1 children)
  1. You list "Activist/journalist secure communication" as a use case. Not all countries have freedom of the press.

Is that an inaccurate claim? It should provide the means to organize and communicate securely... to the extent Tor is secure, and if you're using the official Tor Browser, web crypto can be utilized for group and 1-1s for an additional layer of encryption. I thought it was a fine claim. It should be able to handle quite a few people messaging all at once on the Pi variant.

  2. Looks like you name images based on a random UUID, so that should protect against filename attacks. But if you do have a filename, you can tell whether the image has been uploaded or not.

How would you ever discover a filename?

If you did have a filename and the exact URL to the image, you would need to be logged in as a valid user, and the person who shared the photo would have needed to allow access to their profile.

Even if you have the correct link, if those two conditions aren't satisfied you will not be able to view it.

Also, it looks like all uploads are converted to JPG, regardless of whether the original image was a JPG (or even an image) or not. Don't do that.

This was a design choice to have consistency in filetypes. What's the downside? All browsers will support displaying a jpg.

  3. Can you point to where in the code this invariant is enforced?

Which part are you talking about? The image compression is handled by the compress-and-store function.

The "API reference" in the README goes into further specifics on how this works with Flask.

Everything except the login page and the registration link sits behind these two checks; see def login, where the @login_required logic applied to each of the app routes is defined.
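
Roughly, that gate is just a decorator along these lines (illustrative sketch of the common Flask pattern, not the app's exact code):

# Illustrative sketch of a session-based login_required decorator (not the exact app code)
from functools import wraps
from flask import session, redirect, url_for

def login_required(view):
    @wraps(view)
    def wrapped(*args, **kwargs):
        if "user_id" not in session:
            # Unauthenticated requests get bounced to the login page
            return redirect(url_for("login"))
        return view(*args, **kwargs)
    return wrapped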

[–] savvywolf@pawb.social 2 points 3 days ago (1 children)

to the extent Tor is secure

Tor doesn't automatically secure your app. If your social media instance has 1000 users on it, and one user gets compromised, then the other 999 users shouldn't have any interactions outside of that user leaked.

web crypto can be utilized for group and 1-1s for an additional layer of encryption

Are file uploads encrypted?

How would you ever discover a filename?

Maybe you have a data leak. Maybe they send the filename in plaintext somewhere. Maybe they take advantage of the fact that UUIDs might be deterministic. But if I may flip the question... Why does an inaccessible post even need to return 403 anyway? It just functions as a big footgun that may cause any other exploits to behave worse.

Even if you have the correct link, if those two conditions aren't satisfied you will not be able to view it.

But you can determine its existence or not through the status code.

This was a design choice to have consistency in filetypes. What’s the downside? All browsers will support displaying a jpg.

GIFs will lose any animation, PNGs will lose quality. Also, as far as I can tell, there's nothing stopping a malicious user from uploading a non-image file.

Which part are you talking about?

There are two steps to making a post: upload and store the image, and add the post to the database. There are also similar steps to deleting a post: removing the image upload and removing the post from the database. Are both of these operations atomic?

Everything except the login page and the registration link sits behind these two checks; see def login, where the @login_required logic applied to each of the app routes is defined.

It's not that hard for a sufficiently motivated adversary to get an account on a sufficiently large instance. You need to ensure that one user account being compromised doesn't result in information leakage from unrelated accounts.

This discussion stems from issues I found in just one function. You're making a product which requires a very high level of security. You need to understand how to write secure code, and your LLM won't be able to do it for you.

I don't want to discourage you from programming in general, but making a very secure social media site is a rather complex undertaking for someone new to programming.

[–] hereforawhile@lemmy.ml 1 points 3 days ago (1 children)

First, thanks for replying; I appreciate the feedback and the thoughtful replies.

If your social media instance has 1000 users on it, and one user gets compromised, then the other 999 users shouldn't have any interactions outside of that user leaked.

If I intended to use this for mission-critical communications or something, maybe I would add and enforce two-factor-authenticated logins. That could mitigate this concern to some extent. Or use Tor's built-in authenticated onion service mechanism and distribute the credentials to users. This thing was never intended to scale to that size, though.

But this is pretty much the case for any platform, yeah? If you gain access, you gain access?

Posts from users who did not allow sharing with the compromised account would remain private, and conversations outside of the compromised account would remain private. AND, let's say you had a hint that an account was compromised and you were using web crypto: resetting your password would break the encryption of all conversation history... OR anyone engaged in a sensitive conversation could remotely wipe their conversations.

Are file uploads encrypted?

File uploads are encrypted in transit from the client to the server but not encrypted on the server. Anyone needing anything further would already know how to encrypt a file and can handle that manually. The main reason is that it's a heavy operation. My use case is sending a PDF of an already public news article or something, so I didn't feel implementing it was important.

But if I may flip the question... Why does an inaccessible post even need to return 403 anyway? It just functions as a big footgun that may cause any other exploits to behave worse.

That's a fair question. I could see how it could be used to probe the server or something. The thing is, you would only get that different 403 response if you were logged in. If you were logged out, you would get the same response whether you checked a valid UUID or a non-UUID, so I'm not sure what an attacker is learning.

But you can determine its existence or not through the status code.

You get the small benefit of knowing whether a file exists only if you have valid credentials. If you don't have credentials, you're going to get bounced to the login screen no matter what string you try, with no feedback.

Gifs will lose any animation, pngs will lose quality. Also, as far as I can tell, there's nothing stopping a malicious user uploading a non-image file.

Again, this is a design choice; I don't want GIFs. There are filetype checks on line 350 of the app. Only PNG, WebP, and JPEG are allowed.

One of the main design goals was to keep this lightweight. That's why I'm only displaying 10 photos before a new page is created. I am extremely happy with the performance of the image compression. The compression amount is tunable, however, if you want higher quality.

The server can ingest an 8 MB photo and compress it down to 100-500 KB, and it still looks totally fine to me. I was most amazed by this function. Plus, I like that I'm able to archive all these family moments in a really small footprint: over 250 photos take only about 40 MB.

There are two steps to making a post: upload and store the image, and add the post to the database. There are also similar steps to deleting a post: removing the image upload and removing the post from the database. Are both of these operations atomic?

Yes, deleting is atomic. It should leave no trace in the DB, and it really removes the file from the server's file directory. All comments and likes associated with the post are also wiped.

It's not that hard for a sufficiently motivated adversary to get an account on a sufficiently large instance. You need to ensure that one user account being compromised doesn't result in information leakage from unrelated accounts.

My current built-in security features are as follows:

  • invites can only be generated by the server manager

  • ability for the server manager to delete and wipe accounts.

  • ability to rotate your onion address. This cuts off all access to the service. The server operator would need to redistribute the onion address.

  • users have control of any data they have sent to the server... i.e. real deletion rights that really delete things.

  • any new invitee to the server has zero access to any accounts. Each user already in the instance needs to manually allow access to all their posts.

[–] savvywolf@pawb.social 1 points 2 days ago

Two-factor authentication won't help here. You have to build your app on the assumption that an attacker has a valid login and credentials, and therefore restrict them to only the information they have permission to see.

File uploads are encrypted in transit from the client to the server but not encrypted on the server.

Usually when people talk about e2e encrypted messaging they mean that everything is encrypted. That includes images and text content. The server should not be able to read any contents of any message sent through it.

Again, this is a design choice; I don't want GIFs.

Why? Sending memes is a core part of any social media experience.

There are filetype checks on line 350 of the app.

Line 350 in both files doesn't seem to contain any filetype checks. I assume you mean file.content_type. That may not be accurate to the actual file uploaded; it can be spoofed.
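
Checking the actual bytes instead is cheap with Pillow; a sketch with a hypothetical helper:

# Sketch: validate an upload by decoding it instead of trusting the declared content type
from PIL import Image, UnidentifiedImageError

def is_valid_image(file_storage):
    try:
        Image.open(file_storage).verify()  # raises if the bytes aren't a decodable image
        file_storage.seek(0)               # rewind so the file can still be saved afterwards
        return True
    except (UnidentifiedImageError, OSError):
        return False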

Yes, deleting is atomic.

        # Delete the associated message if it exists
        if chat_file.message_id:
            msg = db.get(Message, chat_file.message_id)
            if msg:
                db.delete(msg)
        ---> Here
        # Delete file from disk
        file_path = os.path.join(CHAT_FILES_DIR, file_uuid)
        if os.path.exists(file_path):
            os.remove(file_path)

If the application crashes/closes at the indicated point, then you will delete the message from the database but still have the image on the server. If this is an image served from /img/whatever, it would have no checks beyond a login check.
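
One way to shrink that window is to commit the database delete first and only then touch the disk, plus a periodic sweep for leftovers. A sketch using the names above (it assumes the surrounding code also deletes the chat_file row, and that a ChatFile model exists):

# Sketch, same names as above; assumes the surrounding code also deletes chat_file.
# Worst case after a crash is an orphaned file on disk, which a periodic sweep
# (delete anything in CHAT_FILES_DIR with no matching ChatFile row) can clean up.
if chat_file.message_id:
    msg = db.get(Message, chat_file.message_id)
    if msg:
        db.delete(msg)
db.delete(chat_file)
db.commit()  # the database delete is durable from here on

file_path = os.path.join(CHAT_FILES_DIR, file_uuid)
try:
    os.remove(file_path)
except FileNotFoundError:
    pass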

[–] CypherColt@sh.itjust.works 4 points 6 days ago (1 children)

I've been experimenting with vibe coding for a few months. I recommend you do some beginner tutorials on coding, at least Python.

Then, using what you have learned, have your AI vibe coding assistant refactor your code into something manageable.

Just because it works doesn't mean it's ready for others to use. If you want to vibe code an app for yourself and it works, that's fine. But this is... Python code, in a bash script? You need to clean it up and make it more professional before you share it.

[–] hereforawhile@lemmy.ml -4 points 6 days ago* (last edited 6 days ago)

Look at the edits: both versions have been broken apart into standalone projects with all their parts split out.

Expanded source for pi version here

Expanded source code for termux version here.

[–] sem@piefed.blahaj.zone 12 points 1 week ago* (last edited 1 week ago) (1 children)

Don't be mad when ppl don't like LLM code. You can release something for free, but calling people ungrateful for not liking it seems a bit... entitled.

[–] smiletolerantly@awful.systems 10 points 1 week ago

Almost 9k lines of Python in a bash script. Lmao. No.

[–] 6nk06@sh.itjust.works 7 points 1 week ago (2 children)

Why put all the Python code in the script?

[–] Ghoelian@piefed.social 23 points 1 week ago* (last edited 1 week ago) (3 children)

Apparently it was all AI-generated and the author doesn't actually know how to program. Just look at their responses in the .ml cross-post; that's not someone whose software I would trust.

[–] 6nk06@sh.itjust.works 14 points 1 week ago (3 children)

And it's worse because they hide the Python code, which means that they can't use tools like uv or ruff to check that everything works properly. I don't understand why people do this.

[–] CameronDev@programming.dev 9 points 1 week ago (1 children)

I don't understand why people do this

Charitably: AI-turbocharged Dunning-Kruger.

Less charitably: malware delivery.

There is no good reason why they couldn't have a normal source tree that they pack into a single shell script in CI.

[–] hereforawhile@lemmy.ml -4 points 1 week ago (1 children)
[–] CameronDev@programming.dev 9 points 1 week ago (1 children)

Sorry, but a photo of a directory structure is not a source tree.

Your git repo consists of 4 files: a readme, a licence, and two packed shell scripts.

If you have an actual published source repo, link people to it.

[–] Starfighter@discuss.tchncs.de 5 points 1 week ago* (last edited 1 week ago) (2 children)

Also, the normal and RPi versions are two completely independent implementations of the same software. So now the LLMs have twice the maintenance load.

I didn't diff the two files but even the startup and control code appears to be custom for each version.

[–] hereforawhile@lemmy.ml 0 points 1 week ago

Better? https://gitlab.com/here_forawhile/nanogram-termux

Dedicated expanded pi version coming later.

[–] hereforawhile@lemmy.ml -1 points 1 week ago

They are different environments, so a lot of changes are needed for it to work on a Pi.

The core app and features mirror each other.

[–] hereforawhile@lemmy.ml -2 points 1 week ago* (last edited 1 week ago)

What do you mean?

[–] cypherpunks@lemmy.ml 5 points 1 week ago (1 children)

look at their responses in the .ml cross-post,

that post is now deleted, but you can see their modlog here