Support for this instance is greatly appreciated at https://sdf.org/support

1
2
Coding for a Finite World (yoric.github.io)
submitted 2 hours ago by thomask to c/permacomputing
2
2
submitted 2 hours ago by space_turtle to c/sdfarc

Hi

I guess it fits here: I was a bit frustrated by my FT-450D because it lacked a waterfall display back when I was only a listener. When I began the hobby I was on 11 meters, listening on 10 to 12m, and didn't want to spend more than that, but I already knew that hacking your hardware is part of the fun...

Now it's fixed, more like hacked ;) https://toobnix.org/w/qkkH3wsSNC4cGT4Sccsaj4

Using an RTL-SDR that was sitting on the shelf, and a 60€ IF card adapter, it's a whole new thing!

Maybe this will encourage you to upgrade this nice and affordable transceiver, use it more, and pass your exam ;-)

Kind regards F4IPZ

(edit: this content was moved outside of Google/YT/Reddit, now on SDF community systems <3 )

3
37
submitted 9 hours ago by pieguy to c/bun_alert_system
4
28
submitted 2 days ago* (last edited 2 days ago) by evenwicht to c/sustainabletech

MAFF (a shit-show, unsustained)

Firefox used to have an in-house format called MAFF (Mozilla Archive File Format), which boiled down to a zip file containing the HTML and a tree of media. I saved several web pages that way. It worked well. Then Mozilla dropped the ball and completely abandoned their own format. WTF. They did not even give people a MAFF→MHTML conversion tool. Just abandoned people, while failing to grasp the meaning and purpose of archival. Firefox today has no replacement. No MHTML. The choices are:

  • HTML only
  • HTML complete (not as a single file, but a tree of files)

MHTML (shit-show due to non-portable browser-dependency)

Chromium-based browsers can save a whole web page to a single MHTML file. Seems like a good move, but if you open a Chromium-generated MHTML file in Firefox, you just get an ASCII dump of the raw contents: MIME headers that resemble a fake email, with the body encoded (mostly base64). So that’s a show-stopper.

Exceptionally portable approach: a plugin adds a right-click option called “Save page WE” (available in both Firefox and Chromium). That extension produces an MHTML file that both Chromium and Firefox can open.

PDF (lossy)

Saving or printing a web page to PDF mostly guarantees that the content and its presentation can be reproduced reasonably well into the future. The problem is that PDF inherently forces the content onto a fixed width that matches a physical paper geometry (A4, US letter, etc). So you lose some data: the information about how to re-render it on different devices with different widths. You might save on A4 paper and later need to print to US letter paper, which comes out a bit sloppy and messy.

PDF+MHTML hybrid

First use Firefox with the “Save page WE” plugin to produce an MHTML file. But relying on this alone is foolish considering how unstable HTML specs still are in 2024, with a duopoly of browser makers doing whatever the fuck they want - abusing their power. So you should also print the webpage to a PDF file. The PDF ensures you have a reliable way to reproduce the content in the future. Then embed the MHTML file in the PDF (PDF is a container format). Use this command:

$ pdfattach webpage.pdf webpage.mhtml webpage_with_HTML.pdf

The PDF will just work as you expect a PDF to, but you also have the option to extract the MHTML file with pdfdetach -saveall webpage_with_HTML.pdf if the need arises to re-render the content on a different device.

The downside is duplication. Every image has one copy stored in the MHTML file and another copy stored separately in the PDF next to it. So it’s shitty from a storage space standpoint. The other downside is plugin dependency. Mozilla has proven that browser extensions are unsustainable by kicking some of them out of their protectionist official repository and making it painful for exiled projects to reach their users. And a plugin is simply less likely to be maintained than a browser builtin function.

We need to evolve

What we need is a way to save the webpage as a sprawled-out tree of files the way Firefox does, then a way to stuff that whole tree into a PDF, while also producing a PDF vector graphic that references those embedded images. I think it’s theoretically possible, but no such tool exists. PDF has no concept of directories AFAIK, so the HTML tree would likely have to be flattened before stuffing it into the PDF.

Are there other approaches I have overlooked? I’m not up to speed on all the ereader formats, but I think they are made for variable widths. So saving a webpage to an ereader format of some kind might be more sensible than PDF, if possible.

(update) The goals

  1. Capture the webpage as a static snapshot in time which requires no network to render. Must have a simple and stable format whereby future viewers are unlikely to change their treatment of the archive. PDF comes close to this.
  2. Record the raw original web content in a non-lossy way. This is to enable re-rendering the content on different devices with different widths. Future-proofing the raw content is likely impossible because we cannot stop the unstable web standards from changing, but capturing a timestamp and the web browser’s user-agent string would facilitate installing the original browser. A snapshot of the audio, video, and code (JavaScript) which makes the page dynamic is also needed, both for forensic purposes (suitable for court) and to faithfully reproduce the dynamic elements if needed. This is to faithfully capture what’s more of an application than a document. wget -m possibly satisfies this, but it is perhaps tricky to capture 3rd-party JS without recursing too far into other links.
  3. A raw code-free (thus partially lossy) snapshot for offline rendering is also needed, in case goal 1 leads to a width-constrained format. Save page WE and WebScrapBook apparently satisfy this.
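For goal 2, a wget invocation along these lines might work; the domain names are placeholders, and the recursion depth cap is one way to keep 3rd-party fetches from spidering the whole web. The --warc-file option additionally records the raw HTTP traffic with timestamps, which helps the forensic angle:

```shell
# Mirror the page plus everything needed to render it, but cap recursion
# so links to third-party sites don't pull in half the web
wget --mirror --page-requisites --convert-links \
     --span-hosts --domains example.com,cdn.example.com \
     --level=2 --warc-file=snapshot \
     'https://example.com/article'
```

Note that --level given after --mirror overrides the unlimited depth that --mirror implies.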

PDF satisfies goal 1; wget satisfies goal 2; maff/mhtml satisfies goal 3. There is likely no single format that does all of the above, AFAIK. But I still need to explore these suggestions.

5
27
submitted 1 day ago by TokenEffort@sh.itjust.works to c/rant

She never let me grow because of some dumb misdiagnosis and I could have grown like everyone else but I was in an institution that made me stagnate. I wasn't allowed outside. I wasn't allowed to SHOWER until I was 11 because "I don't know better" but I was showering at FOUR before I was misdiagnosed. I'm 26 and living the tween years I never got to experience. I never had family, just bullies and abusers. The institution forced me to be friends with hurtful people and dissolved my boundaries. I never got to grow as a kid and even today I can't even be an adult. Being an adult is a joke because of the MISdiagnosis. Being a kid was a joke because the imaginary disorder made a CHILD be CHILDISH. I want to fix that woman's mistake and die now. That woman should have NEVER had kids and if she really insisted, she should have killed me if she didn't want a "special" kid that I wouldn't have been if she gave me a chance at life.

6
11
NetBSD on a ROCK64 Board (simonevellei.com)
submitted 1 day ago by jaypatelani@lemmy.ml to c/bsd
7
116
conference of spikes (lemmy.sdf.org)
submitted 2 days ago by pmjv to c/unix_surrealism
8
6
submitted 22 hours ago by M33 to c/opendata

FYI France has this open data website... https://www.data.gouv.fr/en/datasets/

9
4
NetBSD on a ROCK64 Board (simonevellei.com)
submitted 1 day ago by jaypatelani@lemmy.ml to c/netbsd
10
256
submitted 3 days ago* (last edited 3 days ago) by pmjv to c/unix_surrealism
11
70
submitted 3 days ago* (last edited 1 day ago) by fratermus to c/houseless

{Edit: it was based on my usual dried beans recipe}

This was enough for dinner that night and lunch the next day. So divide these costs in half to get the per-meal cost:

  • 170Wh of energy
  • $1.11, if my math is right:
    • ~$0.24 for onion, at $0.49/lb at the local mercado.
    • 180g of small red beans. These were free because someone had them but couldn't/wouldn't cook them in their rig...
    • ~$0.01 for 5g of salt
    • ~$0.03 for a dried chili pepper from a bulk bag, again from the mercado
    • $0.83 for 1/2lb of Louisiana smoked sausages (hot) from the manager's discount bin

Pressed the Beans button on the 3qt Instant Pot and forgot about it until it was done. Served with cold beer....

12
9
submitted 2 days ago by webghost0101@sopuli.xyz to c/funhole
13
12
submitted 2 days ago by fratermus to c/houseless

Each winter brings a flurry (ha!) of first-timers wondering how to keep warm. I'm not giving advice in the above post, just explaining how I do it.

14
0
submitted 1 day ago by oldbill to c/swampies
15
14
submitted 2 days ago by ciferecaNinjo@fedia.io to c/beneficial_bots

In Belgium real estate listings mostly omit addresses. This makes things extremely annoying for consumers looking to either buy or rent, because they are forced to engage with the landlord/seller just to find out the address. Very time-wasting. You must register on a site and disclose your email address, then wait for someone to reply with the address (and often they don't, or they want to speak on the phone and hear your voice -- which can go badly if you don’t speak the local language)†.

The published listings tend to disclose only the approximate neighborhood the dwelling is in (useless for my needs, because you have no way of knowing whether it’s near a relevant tram stop). But there are some exceptions: maybe ~5-10% of listings have an address. I decided to ignore the majority of listings and only consider those with an address. That meant, in order to get a decent number of choices, I had to scrape every single real estate site covering my city to harvest just the listings with addresses.

Then I used a geocoding API to convert the addresses to GPS coords. From there, I scraped the public transport websites. For every address in my city, the tool would grab all weekday public transport routes from every GPS fix, including trams and transfer times. Then it calculated the walk time on both ends, to and from the tram stops on every route, to derive the shortest door-to-door time.
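The scoring itself is simple once the data is scraped; all numbers here are made up for illustration:

```shell
# door-to-door = walk to stop + ride (incl. transfer waits) + walk from stop
walk_to=6; ride=18; transfers=4; walk_from=5
echo "door-to-door: $(( walk_to + ride + transfers + walk_from )) min"
# prints: door-to-door: 33 min
```

The hard part was never this arithmetic; it was scraping reliable ride and transfer times out of the transit sites.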

I also wanted to be within a certain cycling time of the center of the city, to ensure I didn't end up too far out. That was calculated using an API as well.

The tool also accounted for the usual filters, like budget. I ended up selecting the dwelling with the shortest commute that didn't violate the proximity-to-center constraint.

The only problem with my approach was that one listing used a fake address. My tool trusted the addresses, and some jackass published bogus info that led me to a place that was occupied and unavailable. When I called to say “where are you”, he said “down the street.. I gave an address that was close but incorrect”.. WTF. It was far enough off to screw up the public transport option.

Anyway, this would have been impossible without scraping all those websites. I had freedom and power that’s denied to all the other consumers trapped in the UIs of the real estate sites. But the next time I need a dwelling, the tool will certainly be broken, given how rapidly websites change and how increasingly anti-bot they have become. I think I built that tool during the last moment in time that the web was relatively open access.

Everyone is generally forced to look for a place close to work. But close in straight-line distance does not translate into a short tram commute, because the routes are chaotic. You could be fairly close yet need 2 or 3 transfers. One interesting thing I noticed: a dwelling on the complete opposite side of the city was reasonable because it was close to a train station with no transfers needed. Trains are the fastest, with far fewer stops. There are also express buses (fewer stops) and normal buses. So intuition is too inaccurate.

† The point of contact is often a real estate agent or property manager with many listings. So if you call or write to ask for the addresses of many listings, the same person sees all your requests and ignores them all, because they assume you are not serious. They think: what kind of person looks all over the place.. surely they only want to see one or two neighborhoods. So this bullshit blocks consumers from searching for a place to live in a way that accounts for public transport schedules. They want to force you to choose where to live based on everything except the address.

16
461
submitted 5 days ago by pmjv to c/unix_surrealism
17
25
🅱️igga peas (lemmy.world)
submitted 3 days ago by NickwithaC@lemmy.world to c/theyknew
18
50
submitted 4 days ago by pieguy to c/bun_alert_system
19
210
HACKED BRAIN #3 (lemmy.sdf.org)
submitted 5 days ago by pmjv to c/unix_surrealism
20
27
submitted 4 days ago by fratermus to c/houseless

The article talks about camping but also describes full-timers. There is also a kind of Green Book angle that might be useful.

21
8
submitted 3 days ago by fratermus to c/houseless

I am fully aware most folks won’t have the interest, space, or power to run a bread machine. But for me it’s cheap fun.

Did I burn down the van? Did I kill my batteries? Did it make actual bread?

Another silly experiment in a week or so.

22
6
submitted 3 days ago by theluddite@lemmy.ml to c/permacomputing
23
125
endorsement (lemmy.sdf.org)
submitted 1 week ago by pmjv to c/funhole
24
57
We're Back! (self.sdfpubnix)
submitted 1 week ago by scroll_responsibly to c/sdfpubnix

What were you up to during the outage?!

25
18
submitted 5 days ago by pmjv to c/links2

SDF Chatter

4,784 readers
191 users here now
founded 1 year ago
ADMINS
SDF