[-] fool@programming.dev 12 points 1 day ago* (last edited 1 day ago)

As a compulsion, I watch YT tutorials at breakneck speed: 2.5x-3x.

YouTube tutorials can be pretty low in information density. The important pointer in a sentence only comes along every 5 seconds or more ("The thing is, like, if you're trying to do this, or this, do X first" -- mostly predictable, low-content words), and the first third of a YouTube video is often useless. Of course, denser videos get slowed back to normal and have clips replayed.

Internally, this stems from a nervousness about wasting time (oops), and it hurts my head if I do it too long (but it looks cool before the headache sets in B) )

[-] fool@programming.dev 1 points 1 day ago* (last edited 1 day ago)

I wish I had a copy of a copy ability that copies copy abilities, just in case

[-] fool@programming.dev 1 points 3 days ago* (last edited 3 days ago)

If you put the dollar into an S&P 500 tracker beginning in 1928, you'd end up with about 1·(1.1)^96 ≈ 9 thousand dollars today (w/ dividends).

9k a day times 365 days a year is 3.2 million a year. Or you can invest THOSE EARNINGS into ETFs at the present day again -- by 5 years you'll have made 16 mil principal + 5.4 mil interest. At that rate, it'll take 35 years to be a billionaire.

Oooooor you can just continually dunk the magic daily dollar in Bitcoins instead

[-] fool@programming.dev 9 points 3 days ago

I use FlorisBoard for pure functionality.

  • I've been working with an odd app where you type at length but the message sometimes gets eaten. So I follow everything up with instant hits of the select-all + copy buttons
  • I find the < and > arrow keys much more ergonomic for navigating text than holding space to nudge the cursor around, especially for long strings (webpage forms lol)
  • The private clipboard is real, measurable peace of mind. I had a heart attack when a private SSH key got autosuggested on the stock keyboard. Not sure whether that data ever gets uploaded anywhere (over TLS or otherwise), e.g. for autocorrect training.

I do switch back to the stock keyboard for emoji search though.

[-] fool@programming.dev 5 points 4 days ago

My carrier-pigeon wifi isn't even loading the image, but I can see the top of Ihwa's head. Therefore I upvote

[-] fool@programming.dev 4 points 4 days ago

Conky? A window with stats on it?

I'm pretty sure you can spin up anything of the sort with ags. Takes some legwork though (a bit of TypeScript or Lua). Then just set the layer to something near the bottom and put it on every monitor, skabing, shaboom.

[-] fool@programming.dev 35 points 1 week ago

A brave, vulnerably nuanced answer. Suspicious... what are you planning?

18
submitted 1 week ago by fool@programming.dev to c/asklemmy@lemmy.ml

I was doing some "algorithm surfing" (i.e. VPN+private tab+click enough youtube videos on a topic=temporarily immersed in someone else's rabbit hole). In a patriotism rabbit hole, I found this video about a fearless teenager defending himself and his father against police misconduct with knowledge of Utah law.

Question: how can a layperson possibly know that much about the law to rival a cop's situational power like that?

I'm already familiar with shutting up (I vaguely remember there being a way funnier video but I can't find it)

but I think not shutting up, and instead sheer CYA, was instrumental to that kid and his dad winning the counterlawsuit. And being friendly has turned a speeding ticket into a warning for me (anecdotal evidence)... once...

Apologies if this question is too American. Also please don't hit me with another All Cops Are Benzene or something -- I could use a usable answer ^ .^

53

When you cryptsetup luksFormat, LUKS2 defaults its key derivation to argon2id, a competition-winning, GPU-resistant, multi-core, memory-hard algorithm thingy. Only problem is every bootloader seems to support only pbkdf2 instead :3
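
For anyone poking at their own header, checking which KDF each keyslot uses -- plus the downgrade-one-keyslot workaround I'm trying to avoid -- looks roughly like this (a sketch; the device path and keyslot number are placeholders):

# show the PBKDF per keyslot (look for "PBKDF: argon2id" vs "PBKDF: pbkdf2")
cryptsetup luksDump /dev/nvme0n1p2

# the format-time choice, if starting fresh
cryptsetup luksFormat --type luks2 --pbkdf argon2id /dev/nvme0n1p2

# the workaround: convert one keyslot to pbkdf2 so GRUB can open it
cryptsetup luksConvertKey --key-slot 1 --pbkdf pbkdf2 /dev/nvme0n1p2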

  • GRUB had an argon2id support patch in the works. Buuut it stalled because a version-pinned dependency gained argon2id support upstream, and GRUB now wants to update lib x to update lib y to update lib z to finally bump said dependency (2 years later... I'm here D: )
  • systemd-boot is simple and doesn't support argon2id
  • efistub, i.e. making the kernel boot itself (i think?), necessitates secure boot and I'm not sure that's the best way to do this (Ventoy can bypass secure boot with MOKMANAGER funkin' anyway, can't it?)
  • Raspberry Pi's bootloader might support argon2id? idk

Not to be deterred, I tried manually patching GRUB (tried via the AUR on a USB install, then with Portage), but I don't think the patches still apply to current GRUB. (Attempted against whatever the AUR package uses, then Gentoo's grub-2.12-r4, then Gentoo's grub-2.12-r5, then git-cloning and checking out older versions manually, then patching the earliest 2.12 archive.org tarball lol. All failed with "couldn't find disk"-esque issues.)

Does anyone have this working at or after Nov 2024? And better yet, am I missing something obvious ¯\_(ᵕ—ᴗ—)_/¯

Threat model: Avoiding a twopointfouristan prank, but also just screwing around for fun (◡‿◡✿)

66
submitted 4 weeks ago* (last edited 4 weeks ago) by fool@programming.dev to c/linux@lemmy.ml

Perhaps dumb questions inbound ;)

I use Arch because I'm strapped for time and my system is always moving.

  • 2 minutes to install something? AUR probably has it.

  • Ten minutes of free time to look for a software that fits a new need? Try random AUR things (auditing PKGBUILDs is just twenty seconds or so).

  • If I need a tiny patch, I'll just add a sed or a patch file to the PKGBUILD -- see the sketch after this list. (Super easy, you barely need to learn any syntax cuz it's intuitive shell.)

  • make && make install/meson blahblah usually just works.

  • Wiki does the thinking for me if I need something special (e.g. hw video acceleration)
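
The "tiny patch" trick above is basically this -- a made-up mini PKGBUILD, so the package name, URL, and the fix itself are placeholders:

# PKGBUILD (hypothetical package, trimmed to the relevant bits)
pkgname=somepkg
pkgver=1.0
pkgrel=1
arch=('x86_64')
source=("$pkgname-$pkgver.tar.gz::https://example.com/$pkgname-$pkgver.tar.gz"
        "my-fix.patch")
sha256sums=('SKIP' 'SKIP')

prepare() {
  cd "$pkgname-$pkgver"
  patch -Np1 -i "$srcdir/my-fix.patch"                # a proper patch file...
  sed -i 's/old_default/new_default/' src/config.h    # ...or a one-off sed
}

build() {
  cd "$pkgname-$pkgver"
  make
}

package() {
  cd "$pkgname-$pkgver"
  make DESTDIR="$pkgdir" install
}

Then makepkg -si builds and installs it like any other AUR package.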

Buuuut update surprises can be a pain (e.g. Pipewire explodes Saturday evening) and declarative rollbackable immutability sounds really freakin' AWESOME, so I'm considering NixOS for my new laptop (old one's webcam broke). So I ask:

  • How much can I grok in a week?
    • I need to know Nixlang, right? I have a ton of dotfiles and random homemade cpp commands in ~/.local/bin that I use daily
  • How quick is it to make a derivation?
    • I make install a lot, do I need to declare that due to non-FHS? Can I boilerplate the whole thing with someone else's make install and ctrl+c ctrl+v? How does genAI fare? (Lemmy hates word guess bots, I know)
  • How quick is it to install something new and random?
    • Do I just use nix-shell if I need something asap? Do I need to make a derivation for all my programs? e.g. do I need to declare a Hyprland plugin I'm test-running?
  • How long do you research a new package for?
    • On Gentoo I always looked up USE flags (NOO my time); on Arch I just audit the PKGBUILD and test-run it (20 seconds); on Ubuntu I had to find the relevant PPA (2 minutes). What's it like for Nix?
  • Can you set up dev environments quickly or do you need to write a ton of configs?
    • I hear python can be annoying. Do C++/Android Studio have header file/etc. issues?
  • What maintenance ouchies do you run into? How long to rectify?
  • Do I need to finagle on my own to have /boot encrypted?
    • I boot via: unencrypted EFI GRUB asks for the LUKS password -> decrypts /boot, which holds a keyfile -> decrypts and mounts the btrfs root partition (rough sketch of the GRUB-side pieces after this list). But lots of guides don't do it this way
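
For reference, the GRUB-side half of that setup on Arch is roughly the following (a from-memory sketch -- UUIDs and paths are placeholders, and the keyfile plumbing is simplified):

# /etc/default/grub -- lets grub-install embed LUKS support so GRUB can prompt before /boot is readable
GRUB_ENABLE_CRYPTODISK=y
GRUB_CMDLINE_LINUX="cryptdevice=UUID=<root-luks-uuid>:cryptroot root=/dev/mapper/cryptroot"

# /etc/mkinitcpio.conf -- bake the keyfile into the initramfs so the passphrase is only typed once
FILES=(/crypto_keyfile.bin)

The question is what that dance maps to in a NixOS config.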

Thanks for bearing with me ദ്ദി(。•̀ヮ<)~✩‧₊

29
submitted 1 month ago by fool@programming.dev to c/asklemmy@lemmy.ml

I know, I know, mostly just undergrads care about undergrad prestige (except resumé bots on LinkedIn scanning for "MIT") but I'm curious about the average Lemming, who might lie less often than Redditors and probably isn't a hyper outlier. Though I still expect selection and response bias :3

Let me start with my own wall of anecdotes.

  1. An old American embedded systems mentor I once had had like two master's degrees, but in his words,

Just get a Bachelor's and a good internship. If the company will let you do it on their dime, then get the Master's.

So the college-then-job thing wasn't quite cause-then-effect.

  2. Another friend said, "All of the higher-ups in the chip engineering dept I'm gunning for have a PhD. Wanna contribute meaningfully? Probably gotta have one too" (somewhere in the entirety of Asia, exact details hidden for privacy). So grad school matters more in that case.

  3. My old econ teacher told me that if you want a job where undergrad is just a stepping stone (e.g. pre-law, pre-med), then your undergrad "prestige" mostly doesn't matter. And saving 50k on undergrad student loans to dump into matching the S&P instead is a cheat code at age 18, worth far more than "initial salary". ~not~ ~financial~ ~advice~ ~lol~ In this case, the undergrad's role in "getting your job" isn't even that important.

  4. An acquaintance of mine pipelined from Cornell to DeepMind. There, prestige and its opportunities probably/definitely/maybe had an effect.

  5. A second acquaintance says his Canadian public school (iirc) only mildly helped him, so he went all-in on building his own networks outside of school to get into AI (is he a hustler bro or something?). So he dodged the question of college choice mattering.

  6. A Harvard acquaintance says both their dad and granddad agreed that going to Harvard played into getting their positions. (No need to believe me. I forgot what positions those were tho -- finance/big business probably.)

  7. The managers and managers-of-managers my parents knew often had only community/state school undergrads, sometimes with MBAs.

  8. I don't care about CEOs. All outliers anyway.

So what have you empirically found? And where? (inb4 "American elite school obsession bad" and "CS is skill-based, not school-based, thread over" -- heard all of that already)

You can be vague if needed c:

64
submitted 1 month ago by fool@programming.dev to c/asklemmy@lemmy.ml

It can be a small skill.

The last thing I learned to do was whistle. Never could whistle my whole life, and tutorials and friends never could help me.

So, for the last month or two, I just sort of made the blow shape then spam-tried different "tongue configurations" so to speak -- whenever I had free time. Monkey-at-a-typewriter type shit. It was more an absentminded thing than a practice investment.

Probably looked dumb as hell making blow noises. Felt dumb too ("what? you can't whistle? just watch"), but I kept at it like a really really low-investment... dare I attract self-help gurus... habit.

Eventually I made a pitch, then I could shift the pitch up a little, then five pitches, then Liebestraum, then the range of a tenth or so. Skadoosh. Still doing it now lol.

(Make of this what you will: if I'd gone the musician route my brain told me to, I would've gotten bored after 1 minute of major scales. When I was stuck at only five pitches, I had way more longevity whistle-blowing cartoonish Tom-and-Jerry-running-around chromaticisms than failing the "fa" in "do re mi fa".)

So, Lemmings: What was the last skill you learned? And further, what was the context/way in which you learned it?

44
submitted 1 month ago* (last edited 1 month ago) by fool@programming.dev to c/linux@lemmy.ml

I had a teeny pet project using GNU assembly that was going to target two platforms.

Instead of keeping my handwritten worst-practices Makefile I decided to try GNU Autotools for the educated reasons of:

  • Text scrolling by looks pretty
  • Vague memories of ./configure && make && make install tarballs

I got hit with mysterious macro errors, "recompile with -fPIE" errors (didn't need that before?), autotools trying to run gcc on a .o file with the same options as an .s file, "no rule for all:", and other things a noob would run into. (I don't need a bugfix, since my handspun Makefile is "working on my machine", uname -m and all.) So there's a bit of a learning curve here, steepened by old documentation ~and~ ~more~ ~quietly,~ ~genAI~ ~being~ ~shittier~ ~than~ ~normal~ ~in~ ~this~ ~department~
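
For reference, the minimal shape I thought I was supposed to end up with is something like this (untested against my real tree; project and file names are placeholders):

# configure.ac
AC_INIT([tinyproj], [0.1])
AM_INIT_AUTOMAKE([foreign])
AC_PROG_CC       # gcc still drives assembling and linking
AM_PROG_AS       # teach automake about .s/.S sources
AC_CONFIG_FILES([Makefile])
AC_OUTPUT

# Makefile.am
bin_PROGRAMS = tinyproj
tinyproj_SOURCES = main.s
# add tinyproj_LDFLAGS = -nostdlib if the entry point is _start rather than main

Then autoreconf -fi && ./configure && make is supposed to do the rest.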

With this I ask:

Do people still use Autotools for non-legacy stuff? If not, what do people choose for a new project's build system and why?

edit: trimmed an aside

33

A bookmarklet is a bookmark whose URL is JavaScript code instead of a site. It might be, for example,

javascript:document.querySelector('video').playbackRate = Number(prompt("speed")) || 1; void(0)

// formatted version:
javascript:
document
  .querySelector('video')
  .playbackRate = 
  Number(prompt("speed")) || 1; 
void(0)

so that if you click the bookmark, it sets the speed of the video to whatever you want (e.g. 3.7).

You could also run this directly in the URL bar (in some cases -- I think desktop Chrome does that), or you can simply type alert() into the dev console (desktop Firefox prefers this for security reasons).

Is running my own arbitrary JS like this a thing on mobile? I'm on Android but I'm not sure if Brave disabled it -- I vaguely remember it working once, but it doesn't anymore. No luck on Firefox either. Maybe there's a workaround?

31
submitted 1 month ago* (last edited 1 month ago) by fool@programming.dev to c/linux@programming.dev

edit: solved lol

When I run Cheese, the built-in webcam light flashes for an instant then stops. Even when Cheese opens correctly, it never shows the webcam feed. Cheese worked before the update. (The webcam fails in Zoom too D: )

setup

Arch Linux, kernel 6.11.5-arch-11, Hyprland v0.44.1 (pipewire 1.2.6) on a hybrid Nvidia+Intel Lenovo laptop.

$ lsusb
Bus 001 Device 001: ID 1d6b:0002 Linux Foundation 2.0 root hub
Bus 001 Device 002: ID 25a7:fa23 Areson Technology Corp 2.4G Receiver
Bus 001 Device 004: ID 8087:0a2b Intel Corp. Bluetooth wireless interface
Bus 001 Device 005: ID 138a:0094 Validity Sensors, Inc. 
Bus 001 Device 033: ID 13d3:5673 IMC Networks EasyCamera
Bus 002 Device 001: ID 1d6b:0003 Linux Foundation 3.0 root hub

$ v4l2-ctl --list-devices                                         
EasyCamera: EasyCamera (usb-0000:00:14.0-5):
	/dev/video0
	/dev/video1
	/dev/media0

packages

I'm up-to-date.

$ yay -Q |egrep -i 'gstreamer|video|cam|media|mtp'
fswebcam 20200725-1
gnome-video-effects 1:0.6.0-2
gst-plugin-libcamera 0.3.2-1
gstreamer 1.24.8-1
gstreamer-vaapi 1.24.8-1
guvcview 2.1.0-4
guvcview-common 2.1.0-4
gvfs-mtp 1.56.1-1
haskell-http-media 0.8.1.1-18
intel-media-driver 24.3.3-1
libcamera 0.3.2-1
libcamera-ipa 0.3.2-1
libmtp 1.1.21-2
media-player-info 26-1
perl-lwp-mediatypes 6.04-6
pipewire-libcamera 1:1.2.6-1
# qt omitted
xf86-video-intel 1:2.99.917+923+gb74b67f0-2

tried apps

Tried cheese and fswebcam, among a few others (logs are too long to fit)

$ cheese

(cheese:53011): Gdk-WARNING **: 11:41:45.977: Native Windows taller than 65535 pixels are not supported
[0:56:26.030577084] [53011] ERROR IPAModule ipa_module.cpp:171 Symbol ipaModuleInfo not found
[0:56:26.030591748] [53011] ERROR IPAModule ipa_module.cpp:291 v4l2-compat.so: IPA module has no valid info
[0:56:26.030607425] [53011]  INFO Camera camera_manager.cpp:325 libcamera v0.3.2

(cheese:53011): GStreamer-CRITICAL **: 11:41:46.096: gst_structure_get_value: assertion 'structure != NULL' failed
[0:57:26.270746293] [53011] ERROR IPAModule ipa_module.cpp:171 Symbol ipaModuleInfo not found
[0:57:26.270790988] [53011] ERROR IPAModule ipa_module.cpp:291 v4l2-compat.so: IPA module has no valid info
[0:57:26.270850259] [53011]  INFO Camera camera_manager.cpp:325 libcamera v0.3.2
[0:57:26.432615229] [53061]  INFO Camera camera.cpp:1197 configuring streams: (0) 640x480-MJPEG
[0:57:26.449271361] [53574] ERROR V4L2 v4l2_videodevice.cpp:1931 /dev/video0[450:cap]: Failed to start streaming: Protocol error
[0:57:26.449379043] [53574] ERROR V4L2 v4l2_videodevice.cpp:1266 /dev/video0[450:cap]: Unable to request 0 buffers: No such device

(cheese:53011): cheese-WARNING **: 11:42:46.537: Failed to start the camera: Protocol error: ../libcamera/src/gstreamer/gstlibcamerasrc.cpp(680): gst_libcamera_src_task_enter (): /GstCameraBin:camerabin/GstWrapperCameraBinSrc:camera_source/GstBin:bin37/GstLibcameraSrc:libcamerasrc0:
Camera.start() failed with error code -71

(cheese:53011): Clutter-CRITICAL **: 11:42:48.119: Unable to create dummy onscreen: No foreign surface, and wl_shell unsupported by the compositor
$ fswebcam -v
***
Opening /dev/video0...
Trying source module v4l2...
/dev/video0 opened.
src_v4l2_get_capability,91: /dev/video0 information:
src_v4l2_get_capability,92: cap.driver: "uvcvideo"
src_v4l2_get_capability,93: cap.card: "EasyCamera: EasyCamera"
src_v4l2_get_capability,94: cap.bus_info: "usb-0000:00:14.0-5"
src_v4l2_get_capability,95: cap.capabilities=0x84A00001
src_v4l2_get_capability,96: - VIDEO_CAPTURE
src_v4l2_get_capability,107: - STREAMING
No input was specified, using the first.
src_v4l2_set_input,185: /dev/video0: Input 0 information:
src_v4l2_set_input,186: name = "Camera 1"
src_v4l2_set_input,187: type = 00000002
src_v4l2_set_input,189: - CAMERA
src_v4l2_set_input,190: audioset = 00000000
src_v4l2_set_input,191: tuner = 00000000
src_v4l2_set_input,192: status = 00000000
src_v4l2_set_pix_format,523: Device offers the following V4L2 pixel formats:
src_v4l2_set_pix_format,532: 0: [0x47504A4D] 'MJPG' (Motion-JPEG)
src_v4l2_set_pix_format,532: 1: [0x56595559] 'YUYV' (YUYV 4:2:2)
Using palette MJPEG
Adjusting resolution from 384x288 to 424x240.
src_v4l2_set_mmap,675: mmap information:
src_v4l2_set_mmap,676: frames=4
src_v4l2_set_mmap,725: 0 length=203520
src_v4l2_set_mmap,725: 1 length=203520
src_v4l2_set_mmap,725: 2 length=203520
src_v4l2_set_mmap,725: 3 length=203520
Error starting stream.
VIDIOC_STREAMON: Protocol error
Unable to use mmap. Using read instead.
Unable to use read.

logs

$ journalctl -b-0
# some stuff removed for post character limit
pipewire[1028]: spa.v4l2: '/dev/video0' VIDIOC_STREAMON: Protocol error
pipewire[1028]: pw.node: (v4l2_input.pci-0000_00_14.0-usb-0_5_1.0-78) suspended -> error (Start error: Protocol error)
kernel: usb 1-5: USB disconnect, device number 50
pipewire[1028]: spa.v4l2: VIDIOC_REQBUFS: No such device
kernel: usb 1-5: new high-speed USB device number 51 using xhci_hcd
kernel: usb 1-5: New USB device found, idVendor=13d3, idProduct=5673, bcdDevice=16.04
kernel: usb 1-5: New USB device strings: Mfr=3, Product=1, SerialNumber=2
kernel: usb 1-5: Product: EasyCamera
kernel: usb 1-5: Manufacturer: AzureWave
kernel: usb 1-5: SerialNumber: 0001
kernel: usb 1-5: Found UVC 1.00 device EasyCamera (13d3:5673)
mtp-probe[66565]: checking bus 1, device 51: "/sys/devices/pci0000:00/0000:00:14.0/usb1/1-5"
mtp-probe[66565]: bus: 1, device: 51 was not an MTP device
mtp-probe[66599]: checking bus 1, device 51: "/sys/devices/pci0000:00/0000:00:14.0/usb1/1-5"
mtp-probe[66599]: bus: 1, device: 51 was not an MTP device
kernel: usb 1-5: USB disconnect, device number 51
kernel: usb 1-5: new high-speed USB device number 52 using xhci_hcd
kernel: usb 1-5: New USB device found, idVendor=13d3, idProduct=5673, bcdDevice=16.04
kernel: usb 1-5: New USB device strings: Mfr=3, Product=1, SerialNumber=2
kernel: usb 1-5: Product: EasyCamera
kernel: usb 1-5: Manufacturer: AzureWave
kernel: usb 1-5: SerialNumber: 0001
kernel: usb 1-5: Found UVC 1.00 device EasyCamera (13d3:5673)
$ sudo dmesg
[ 5248.449913] usb 1-5: USB disconnect, device number 50
[ 5248.842621] usb 1-5: new high-speed USB device number 51 using xhci_hcd
[ 5249.025592] usb 1-5: New USB device found, idVendor=13d3, idProduct=5673, bcdDevice=16.04
[ 5249.025612] usb 1-5: New USB device strings: Mfr=3, Product=1, SerialNumber=2
[ 5249.025620] usb 1-5: Product: EasyCamera
[ 5249.025626] usb 1-5: Manufacturer: AzureWave
[ 5249.025632] usb 1-5: SerialNumber: 0001
[ 5249.030816] usb 1-5: Found UVC 1.00 device EasyCamera (13d3:5673)
[ 5259.873533] usb 1-5: USB disconnect, device number 51
[ 5260.268988] usb 1-5: new high-speed USB device number 52 using xhci_hcd
[ 5260.454354] usb 1-5: New USB device found, idVendor=13d3, idProduct=5673, bcdDevice=16.04
[ 5260.454371] usb 1-5: New USB device strings: Mfr=3, Product=1, SerialNumber=2
[ 5260.454378] usb 1-5: Product: EasyCamera
[ 5260.454384] usb 1-5: Manufacturer: AzureWave
[ 5260.454389] usb 1-5: SerialNumber: 0001
[ 5260.460370] usb 1-5: Found UVC 1.00 device EasyCamera (13d3:5673)

Any help appreciated! ^_^

Solution: It fixed itself after like 15 power cycles. Easy peasy
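
Addendum for future searchers: if it comes back, a softer reset to try before 15 power cycles (no promises it would have worked here) is reloading the UVC driver or re-binding the device:

# reload the webcam driver
sudo modprobe -r uvcvideo && sudo modprobe uvcvideo

# or re-bind just this device (1-5 is the port from the dmesg output above)
echo 1-5 | sudo tee /sys/bus/usb/drivers/usb/unbind
echo 1-5 | sudo tee /sys/bus/usb/drivers/usb/bind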

[-] fool@programming.dev 59 points 1 month ago

consider the following

[-] fool@programming.dev 52 points 1 month ago
[-] fool@programming.dev 54 points 1 month ago

neuron activation

[-] fool@programming.dev 165 points 1 month ago

duck girl

i haven't drawn a person since grade school but this post has given me the courage

120
kdesu (programming.dev)
submitted 2 months ago* (last edited 2 months ago) by fool@programming.dev to c/linuxmemes@lemmy.world
45
submitted 2 months ago* (last edited 2 months ago) by fool@programming.dev to c/nostupidquestions@lemmy.world

This site is so cool!

             />  フ
            |  _ _| 
          /` ミ_xノ 
         /     |
        /  ヽ   ノ
        │  | | |
   / ̄|   | | |
   ( ̄  ヽ__ヽ_)__)
    \二)

But how do people make these? I searched online and the best I could find were small Japanese communities still using MS Gothic (which is metrically incompatible with Arial/more-used fonts) and halfhearted JPG-to-ASCII-bitmap converters.

Further, how do people manage these? I'd imagine something like an emoji search, but these millionfold emoticons don't have names; the other alternatives are the "I've got a meme for that" infinite-camera-roll scroll, or searching them up every time.

⠀/\_/\
(˶ᵔ ᵕ ᵔ˶) thanks lol
/ >🌷<~⁠♡

[-] fool@programming.dev 42 points 2 months ago

-1 accuracy point ( ◞ ﹏ ◟)

Linux 4.5-rc5 had efivarfs fixed to stop "rm -rf /" from bricking UEFI motherboards -- so maybe someone can try it out? :]

554
of=/dev/sda (programming.dev)
77
submitted 2 months ago by fool@programming.dev to c/asklemmy@lemmy.ml

I saw a post recently about someone setting up parental controls -- screentime, blocked sites, etc. -- and it made me wonder.

In my childhood, my free time was very flexible. Within this low-pressure flexibility I was naturally curious, in all directions -- that meant watching both brainteaser videos and Gmod brainrot. I had little exposure to video games other than Minecraft, which ran poorly on my machine, so I tended to surf Flash games and YouTube.

Strikingly, while watching a brainteaser video, tiny me had a thought:

I'm glad my dad doesn't make me watch educational videos like the other kids in school have to.

For some reason, I wanted to hold onto that thought so I could "remember what my thought process was as a child", so the memory has stuck with me.

Onto the meat: if I had had capped screentime, like a timer I could see, and had known I was being watched in some way, I'd have felt pressure. For example,

10 minutes left. Oh no. I didn't have fun yet. I didn't have fun yet!!

Oh no, I'm gonna get in so much trouble for watching another YTP...

and maybe that pressure wouldn't have made me into the independent, curious kid who grew into the person I am now. Maybe it would've made me fearful or suspicious instead. I was suspicious once, when one of my parents said "I can see what you browse from the other room" -- so I ran the scientific method to verify whether they could. (I wrote "HI MOM" in Paint and watched whether her expression changed.)

So what about now? Were we too free, and is it now our job to rein in the next generation? I said "butthead" often. I loved asdfmovie, though my parents probably wouldn't have. I watched SpingeBill YTPs (at least it wasn't corporatized YouTube Kids).

Or, differently: do we watch our kids without them knowing? Write a keylogger? Or just pull router logs? Do we police them, panopticon-style, for their own good?

Or do we completely forgo this? Take an Adventure Playground approach?

Of course, I don't expect a one-size-fits-all answer. Where do you stand, and why?


fool

joined 1 year ago