Cooper8

joined 1 month ago
[–] Cooper8@feddit.online 5 points 1 week ago (4 children)

Follow-up question: is there a company that is not part of the new lighting cartel making screw-in bulbs that actually last 10+ years?

[–] Cooper8@feddit.online 1 points 1 week ago (1 children)

Yes, this is a great discussion. Generally speaking, I can see how local-first posting by users, mirrored by relays under E2EE, helps solve some of the downsides of instance-based federation; however, it seems like the actual implementation makes or breaks the utility.

One concept came up for me while reading the comments on Lobsters: asynchronous/intermittent pushes and pulls by users to/from the relays, combined with inconsistent relay behavior, inevitably leads to incomplete, non-consensus, or out-of-date data access (something federation also suffers from).

My idea is that relays, both standard and specialized, could host a dedicated encrypted log for each user/key that has posted to them (potentially within a time limit, or with approval). The log would hold only a sequential identifier of the user's most recent activity (counted since the first event by that key) plus a unique identifier for the event the activity relates to (so an edit would reference the UID of the post being edited, for example, and a new message in an ongoing thread would reference the UID of the thread and the UID of the message). Limit this log to very few entries, say between 1 and 10, and replace it every time it is updated; the file stays very small, and the pushed update from the user/key would also be very small.

This way a user could push activity-log updates to a broader set of hosts/relays than the actual content/event was sent to, while keeping the cache/data burden on the broader network down. Ideally this would mean that not only relays but also users following the user/key could hold the log (enabling gossip without a large cache burden). Unlike a blockchain, where the ledgers would need to cross-sync with each other and seek consensus on larger chunks of data, here the reader of the log can always default to the most recent sequential identifier, and that identifier is generated by the publishing key/user.

This way timestamp variance isn't an issue, and at login a user can pull the logs for all users/keys they follow from relays or peers they follow and determine how many events each user/key has posted since they last pulled updates. The client can then crawl the relays for the actual events with sequential identifiers in that range and stop crawling once they are found.

One issue I see with this sort of system is deleted events, so perhaps the log would also need to include a list of the sequential identifiers of events deleted within a given time period.
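To make the idea concrete, here is a rough sketch of what I'm picturing (all names and structures are hypothetical, not any existing protocol):

```python
from dataclasses import dataclass, field

@dataclass
class LogEntry:
    seq: int        # sequential counter since the key's first event
    event_id: str   # unique ID of the event this activity refers to

@dataclass
class ActivityLog:
    pubkey: str
    entries: list[LogEntry] = field(default_factory=list)   # capped, newest last
    deleted_seqs: list[int] = field(default_factory=list)   # recently deleted events
    max_entries: int = 10

    def record(self, seq: int, event_id: str) -> None:
        """Append the newest activity and drop anything beyond the cap."""
        self.entries.append(LogEntry(seq, event_id))
        self.entries = self.entries[-self.max_entries:]

def missing_seqs(log: ActivityLog, last_seen_seq: int) -> list[int]:
    """Sequential identifiers the client still needs to crawl relays for,
    skipping events the log reports as deleted."""
    latest = max((e.seq for e in log.entries), default=last_seen_seq)
    return [s for s in range(last_seen_seq + 1, latest + 1)
            if s not in log.deleted_seqs]

# e.g. the client last saw seq 41; the freshest copy of the log (from any
# relay or followed peer) says the key is at seq 45 and seq 43 was deleted,
# so crawl relays for events 42, 44, 45 and stop once they are found.
```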

[–] Cooper8@feddit.online 2 points 1 week ago (1 children)

My understanding is that the content is essentially self-hosted, so content removed from relays still exists on the posting user's client and can be accessed directly, much like a website publishing an RSS feed. So saying it "still exists on the network" is technically true, but only in the same way you would say that about, say, BitTorrent or the open web. What people host/post is present raw; what is amplified/"curated"/relayed is filtered. Client settings/config set the default and custom ways a user interacts with content, like a browser that can run an adblocker or not.

In principle this seems like a decent solution, but I can see why different users prefer different protocols: deferring to moderation takes the burden of vetting inbound content off the user. The same can be achieved via relays, but the culture of "curation" seems weaker because there is less pressure from the userbase to optimize it, since users are not solely reliant on any one relay. An odd network effect, but a truly invested curation/admin team could just as easily build a well-"curated" relay as a well-moderated instance.

[–] Cooper8@feddit.online 2 points 1 week ago

This is helpful, thanks. Seems like Ditto is an interesting middle ground.

[–] Cooper8@feddit.online 2 points 1 week ago

Maybe Peergos? I'm not sure; it seems to match up on goals, but IDK about execution: https://github.com/Peergos/Peergos?tab=readme-ov-file

[–] Cooper8@feddit.online 2 points 1 week ago

Sounds a bit like Plebbit, though that is more about P2P "communities/boards" that a user starts and others can post to, rather than a microblog-type platform. Unfortunately it also has a strong association with crypto communities, as its founders come from that scene. It is built on IPFS.

What you describe also has similarities to Secure Scuttlebutt and its successor PZP, which are both unfortunately abandoned but laid a lot of the groundwork for asynchronous, encryption-key-based networks.

[–] Cooper8@feddit.online 7 points 1 week ago

Vlogbrothers ftw

[–] Cooper8@feddit.online 2 points 1 week ago* (last edited 1 week ago)

Client-side curation effectively sounds like whitelisting: if you follow only the curated feed and the curated feed re-signs all events posted by selected keys, that's a whitelist. It seems like a decent solution for casual users, so long as they can find trusted curators, and clients that make those curated feeds easy to discover and subscribe to. What client is best for this currently?

On the flip side, if those "curators" were able to export and import lists of keys to automatically exclude from feeds, that would be very useful for the curators who have to sort events and new users, manually or automatically, to build their feeds. Is that feature currently available? Eliminating known bot accounts from feeds seems like the minimum viable feature set for new curators in the current state of play.
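Roughly what I'm picturing for the curator side, as a purely hypothetical sketch (not any existing client's API; the event shape and file format are assumptions):

```python
# Hypothetical curation filter: a follow/whitelist of keys plus an importable
# exclusion list (e.g. known bot keys). Events are assumed to carry a
# "pubkey" field; this is not modeled on any particular client.
def load_key_list(path: str) -> set[str]:
    """Import a shared key list: plain text, one key per line."""
    with open(path) as f:
        return {line.strip() for line in f if line.strip()}

def curate(events: list[dict], followed: set[str], excluded: set[str]) -> list[dict]:
    """Keep only events from followed keys, minus anything on the exclusion list."""
    return [e for e in events
            if e["pubkey"] in followed and e["pubkey"] not in excluded]
```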

[–] Cooper8@feddit.online 1 points 2 weeks ago (19 children)

So there are no moderation tools or whitelists/blacklists?

[–] Cooper8@feddit.online 3 points 2 weeks ago (2 children)

What do you prefer about Nostr?

[–] Cooper8@feddit.online 8 points 2 weeks ago

No, it is encrypted at the user level but doesn't use a blockchain. Some servers have a Bitcoin-based tipping feature, but that isn't core to the protocol.

 

I have seen some critical views on Nostr in discussions of decentralized networks, but most seem to be focused on culture, not function.

What are the functional / protocol differences that make you prefer ActivityPub over Nostr?

 

"Apertus: a fully open, transparent, multilingual language model

EPFL, ETH Zurich and the Swiss National Supercomputing Centre (CSCS) released Apertus 2 September, Switzerland’s first large-scale, open, multilingual language model — a milestone in generative AI for transparency and diversity.

Researchers from EPFL, ETH Zurich and CSCS have developed the large language model Apertus – it is one of the largest open LLMs and a basic technology on which others can build.

In brief Researchers at EPFL, ETH Zurich and CSCS have developed Apertus, a fully open Large Language Model (LLM) – one of the largest of its kind. As a foundational technology, Apertus enables innovation and strengthens AI expertise across research, society and industry by allowing others to build upon it. Apertus is currently available through strategic partner Swisscom, the AI platform Hugging Face, and the Public AI network. ...

The model is named Apertus – Latin for “open” – highlighting its distinctive feature: the entire development process, including its architecture, model weights, and training data and recipes, is openly accessible and fully documented.

AI researchers, professionals, and experienced enthusiasts can either access the model through the strategic partner Swisscom or download it from Hugging Face – a platform for AI models and applications – and deploy it for their own projects. Apertus is freely available in two sizes – featuring 8 billion and 70 billion parameters, the smaller model being more appropriate for individual usage. Both models are released under a permissive open-source license, allowing use in education and research as well as broad societal and commercial applications. ...

Trained on 15 trillion tokens across more than 1,000 languages – 40% of the data is non-English – Apertus includes many languages that have so far been underrepresented in LLMs, such as Swiss German, Romansh, and many others. ...

Furthermore, for people outside of Switzerland, the Public AI Inference Utility will make Apertus accessible as part of a global movement for public AI. "Currently, Apertus is the leading public AI model: a model built by public institutions, for the public interest. It is our best proof yet that AI can be a form of public infrastructure like highways, water, or electricity," says Joshua Tan, Lead Maintainer of the Public AI Inference Utility."
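For anyone curious about trying it locally, the Hugging Face route presumably looks like the usual transformers workflow. A rough, untested sketch (the repo ID is my assumption from the announcement, so check the actual model page before running):

```python
# Untested sketch: loading the 8B Apertus model via Hugging Face transformers.
# The repo ID below is an assumption; verify it on huggingface.co first.
# Even the 8B variant needs a reasonably capable GPU (or a lot of patience on CPU).
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "swiss-ai/Apertus-8B-2509"  # assumed repo name

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

prompt = "Translate 'open source' into Romansh:"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=100)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```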

 

"Apertus: a fully open, transparent, multilingual language model

EPFL, ETH Zurich and the Swiss National Supercomputing Centre (CSCS) released Apertus 2 September, Switzerland’s first large-scale, open, multilingual language model — a milestone in generative AI for transparency and diversity.

Researchers from EPFL, ETH Zurich and CSCS have developed the large language model Apertus – it is one of the largest open LLMs and a basic technology on which others can build.

In brief Researchers at EPFL, ETH Zurich and CSCS have developed Apertus, a fully open Large Language Model (LLM) – one of the largest of its kind. As a foundational technology, Apertus enables innovation and strengthens AI expertise across research, society and industry by allowing others to build upon it. Apertus is currently available through strategic partner Swisscom, the AI platform Hugging Face, and the Public AI network. ...

The model is named Apertus – Latin for “open” – highlighting its distinctive feature: the entire development process, including its architecture, model weights, and training data and recipes, is openly accessible and fully documented.

AI researchers, professionals, and experienced enthusiasts can either access the model through the strategic partner Swisscom or download it from Hugging Face – a platform for AI models and applications – and deploy it for their own projects. Apertus is freely available in two sizes – featuring 8 billion and 70 billion parameters, the smaller model being more appropriate for individual usage. Both models are released under a permissive open-source license, allowing use in education and research as well as broad societal and commercial applications. ...

Trained on 15 trillion tokens across more than 1,000 languages – 40% of the data is non-English – Apertus includes many languages that have so far been underrepresented in LLMs, such as Swiss German, Romansh, and many others. ...

Furthermore, for people outside of Switzerland, the external pagePublic AI Inference Utility will make Apertus accessible as part of a global movement for public AI. "Currently, Apertus is the leading public AI model: a model built by public institutions, for the public interest. It is our best proof yet that AI can be a form of public infrastructure like highways, water, or electricity," says Joshua Tan, Lead Maintainer of the Public AI Inference Utility."

 

I have been looking into setting up a secure home/small business server and hardening my local network, and I came across this Kickstarter, which is currently floundering, likely because its campaign page is way too technical without enough fluff for the uninformed out there (like myself, to some extent). For reference, I work in small industry and have some interest in implementing more IoT; I also want to self-host more of my media, probably via Jellyfin, plus an IndieWeb site and possibly some AI automation via n8n.

That said, from what I can tell it seems like a really great device for my use case, combining a multiband Wi-Fi 7 gateway with a built-in NAS and upgradeable compute modules. As a bonus, it is a German company, so I'm a bit less worried about back doors than with some of the generic Chinese manufacturers out there. However, I haven't run a server of my own before and am not sure what to make of the hardware specifications.

What I can't suss out is how secure this actually is, how technical my background needs to be to set it up effectively, and whether the price is good for the hardware. Any help?

 

I have been looking into setting up a secure home/small business server and hardening my local network, and I came across this Kickstarter, which is currently floundering, likely because its campaign page is way too technical without enough fluff for the uninformed out there (like myself, to some extent). For reference, I work in small industry and have some interest in implementing more IoT.

That said, from what I can tell it seems like a really great device for my use case, combining a multiband Wi-Fi 7 gateway with a built-in NAS and upgradeable compute modules. As a bonus, it is a German company, so I'm a bit less worried about back doors than with some of the generic Chinese manufacturers out there.

What I can't suss out is how secure this actually is, how technical my background needs to be to set it up effectively, and whether the price is good for the hardware. Any help?

15
submitted 1 month ago* (last edited 1 month ago) by Cooper8@feddit.online to c/privacy@lemmy.ml
 

I have been looking into setting up a secure home server and hardening my local network, and I came across this Kickstarter, which is currently floundering, likely because its campaign page is way too technical without enough fluff for the uninformed out there (like myself, to some extent).

That said, from what I can tell it seems like a really great device for my use case, combining a multiband Wi-Fi 7 gateway with a built-in NAS and upgradeable compute modules. As a bonus, it is a German company, so I'm a bit less worried about back doors than with some of the generic Chinese manufacturers out there.

What I can't suss out is how secure this actually is, how technical my background needs to be to set it up effectively, and whether the price is good for the hardware. Any help?
