this post was submitted on 25 Mar 2026
26 points (96.4% liked)

Linux

There is a webcomic called Strong Female Protagonist that I want to preserve (in case the website is ever lost), but I'm not sure how.

The image you see above is not a webpage of the site but rather a drop-down-like menu. There is a web crawler called WFDownloader (I am running the Windows .exe inside Bottles) that can grab images and follow links, grabbing images "N" pages down, but since this is a drop-down menu I am not sure it will work.

There is also the issue of organizing the images: WFDownloader doesn't have options for that.

What I am thinking is: somehow translate the HTML for the drop-down menu into separate XML files based on issues/titles, run a script to download the images, name each image after its own hyperlink, and put each issue in its own folder. Later on I can create a stitched-up version of each issue.
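A minimal stdlib sketch of that plan (the menu markup, URLs, and issue labels here are all hypothetical; the real site's HTML will differ, and the actual image download is left as a commented-out step):

```python
from html.parser import HTMLParser
from urllib.parse import urlparse

# Hypothetical drop-down markup; inspect the real page source and adapt.
SAMPLE_MENU = """
<select>
  <optgroup label="Issue 1">
    <option value="https://example.com/issue-1/page-1">Page 1</option>
    <option value="https://example.com/issue-1/page-2">Page 2</option>
  </optgroup>
  <optgroup label="Issue 2">
    <option value="https://example.com/issue-2/page-1">Page 1</option>
  </optgroup>
</select>
"""

class MenuParser(HTMLParser):
    """Collect page URLs from a drop-down menu, grouped by issue label."""
    def __init__(self):
        super().__init__()
        self.issues = {}      # issue label -> list of page URLs
        self.current = None   # optgroup we are currently inside
    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "optgroup":
            self.current = attrs.get("label", "misc")
            self.issues.setdefault(self.current, [])
        elif tag == "option" and self.current and "value" in attrs:
            self.issues[self.current].append(attrs["value"])

def filename_for(url):
    # Name each image after the last path segment of its hyperlink.
    return urlparse(url).path.strip("/").split("/")[-1] + ".png"

parser = MenuParser()
parser.feed(SAMPLE_MENU)
for issue, urls in parser.issues.items():
    # os.makedirs(issue, exist_ok=True)   # one folder per issue
    for url in urls:
        print(f"{issue}/{filename_for(url)}")
        # The option values likely point at comic *pages*, not the image
        # files themselves, so the real script would fetch each page,
        # find the <img> URL, then:
        # urllib.request.urlretrieve(img_url, os.path.join(issue, filename_for(url)))
```

This only builds the issue-to-URL mapping and the planned filenames; it avoids any site-specific assumptions beyond the made-up sample markup.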

[–] ergonomic_importer@piefed.ca -2 points 1 day ago (1 children)

robots.txt would probably put a stop to that

[–] eldavi@lemmy.ml 3 points 21 hours ago (1 children)

I was asking OP if they used a manual approach, which wouldn't be affected by something like robots.txt.
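To illustrate the point: robots.txt is purely advisory, and a crawler has to opt in to checking it. Here's a small stdlib sketch (the robots.txt contents and URLs are made up) showing what such a check looks like, and why a manual one-off download never triggers it:

```python
from urllib import robotparser

# Parse a hypothetical robots.txt that asks bots to skip /comic/.
rp = robotparser.RobotFileParser()
rp.parse([
    "User-agent: *",
    "Disallow: /comic/",
])

print(rp.can_fetch("*", "https://example.com/comic/page-1"))  # False
print(rp.can_fetch("*", "https://example.com/about"))         # True

# A manual save ("Save page as" in a browser, or a single urllib request)
# never runs a check like this, so robots.txt can't stop it.
```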

[–] cactus_head@programming.dev 2 points 20 hours ago* (last edited 20 hours ago)

Haven't used curl or wget; I have yet to start using the command line (outside of solving some Linux issue or organizing family photos), but I'm open to learning.