Kissaki

joined 2 years ago
[–] Kissaki@programming.dev 2 points 1 month ago* (last edited 1 month ago)

Notice: sr.ht is currently in alpha, and the quality of the service may reflect that.

Are these all different services? Seems like quite a hassle. Like a split of project resources.

An alpha classification doesn't spark confidence in using it productively and for significant projects.

[–] Kissaki@programming.dev 2 points 1 month ago (1 children)

Unfortunately, I find the need to have an account in order to contribute to projects a deal breaker. It causes too much friction for no real gain. Email based workflows will always reign supreme. It’s the OG of code contributions.

After opening with a need to be open-minded, this seems quite close-minded. Sure, it's their article. Still, I was hoping for a more neutral and substantiated advocacy and description.

I certainly didn't feel like it answered [all] my questions and concerns in multiple sections.

[–] Kissaki@programming.dev 2 points 1 month ago

I somewhat like the idea of being able to submit issues directly via email. It does come at a cost in spam classification and prevention, though. An account is easy to use as an additional confidence metric. Email, not so much, or only with significantly more complexity in relating data and ensuring continuity of the source.

An account is a very obvious way to build a reputation. If you see a new GitHub account submitting a PR vs someone who has contributed for a long time, to significant projects in the same technology, you may approach the reviews quite differently. It is, at least, a very useful and simple way to classify authors and patch submitters.

What does SourceHut provide in this regard? To what degree does it verify incoming emails' authenticity, sender source, and continuity of the sending host? To what degree does it relate information by email address? I assume it does not.

[–] Kissaki@programming.dev 1 points 1 month ago* (last edited 1 month ago)

Additionally, the total size of "non-promoted" content, that is repositories that are for personal use (e.g. "my website", "my dotfiles") as well as private repositories, should not exceed 100 MiB

🤔 made me explore; there are no paid tiers, and the FAQ explains intentions:

In many cases, yes, but please read on. Our goal is to support Free Content, and we do not act as a private hosting for everyone! However, if we see that you contribute to Free Software / Content and the ecosystem, we allow up to 100 MB of private content for your convenience. Further exceptions are spelled out in our Terms of Service:


I've always seen Codeberg as a hosting platform much like GitHub and GitLab. But I see now it's a much more deliberate and specific effort and platform. And "personal use" [only] is not part of that.

[–] Kissaki@programming.dev 5 points 1 month ago* (last edited 1 month ago)

Supporting soft subs is a complex topic, though. Three formats, font embedding, positioning, and animations. It's a ton of effort, and anything less than "full featureset support" will mean they don't render how you designed them in your full-featured editor and local media player. And there will be differences and bugs, at least for a while. I suspect font rendering with various fonts in a media-rendering context will have its own set of issues.

I also think it'd be nice, but I can totally see how it may not make sense technically (complexity with its burdens vs need) or economically.

Browsers are already absurdly complex though so… maybe? :P

[–] Kissaki@programming.dev 1 points 2 months ago* (last edited 2 months ago)

RE: phabricator…I don’t know what that service is or is for, so I can’t comment if there’s any proof therein.

The "how to submit a patch" section documents that that's where they accept patches. And they do their reviews and change iterations there. By necessity, that also means hosting/having the repos.


That's confusing to me.

They only accept patches on Phabricator and have the sources there, but suggest using GitHub, and then Phabricator again to submit the changes?

I can only imagine it's to lower the barrier to entry, because GitHub is better known. But this just seems like a confusing mess to me, without clear wording of intentions and separation of concerns [in their docs, not your post or comment here].

[–] Kissaki@programming.dev 5 points 2 months ago (1 children)

When I searched the page for the text "github", I did not find anything. But searching in the inspector, to also cover URLs, I found:

Firefox and related code is stored in our git repository.

Which makes it all the more confusing. Stored there, but patches only elsewhere?

Really, for a "moved their sources" claim I'd prefer some form of announcement or docs that describe this.

[–] Kissaki@programming.dev 8 points 2 months ago

These changes will apply to operations like cloning repositories over HTTPS, anonymously interacting with our REST APIs, and downloading files from raw.githubusercontent.com.

[–] Kissaki@programming.dev 80 points 2 months ago* (last edited 2 months ago) (6 children)

That's a read-only mirror, not a "move onto GitHub".

PRs get automatically closed, referring to the contrib docs.

[–] Kissaki@programming.dev 23 points 2 months ago

Lenard Flören, a Germany-based art director at an advertising agency, said he quickly realized that trying to create his dream fitness app with one lengthy prompt would lead to a plethora of bugs that “neither ChatGPT nor my clueless self had any chance of solving.”

If everyone can create programs, and everyone fails, maybe it'll bring increased appreciation for development, and for good development and good products? One could hope. I guess the worst offenders won't even try it themselves either way. The services are not that accessible.

[–] Kissaki@programming.dev 5 points 2 months ago* (last edited 2 months ago)

I've aired my frustration about the terminology previously; anyway, I'm trying to accept the terminology under an interpretation where it could make some sense:

You tell the AI the "vibe" of what you want the result to have, and it does that - but of course it's not necessarily that simple. You may end up doing prompt engineering, multiple iterations, trial and error, etc.

When we tried such a product at my workplace, generating a web app prototype in React seemed viable and reasonable, possibly good for prototyping and demonstration. We also tried a Blazor app, and it utterly failed, I suspect because of less training data for it and a much more complex mixture of technologies.

[–] Kissaki@programming.dev 1 points 2 months ago* (last edited 2 months ago) (1 children)

, but it works reliably well. It takes a second or two to be redirected to the site you’re visiting.

Do you mean it works reliably well in letting users through, or in blocking AI?

Do you have sources or more information about the effectiveness of it in blocking AI? What else it blocks as collateral damage would also be interesting.

/edit: Clicking through some links (specifically canine.tools), I have to say: it may also be effective in annoying me personally, and in eventually making me leave those websites. Similar to consent dialogs where you have to go into the settings and save your opt-outs. But it's a barrier and user-opposing functionality.

I certainly don't see it as an unambiguously or exclusively good and effective thing.

 

This year, we are introducing updates in the HTTP space, new HttpClientFactory APIs, .NET Framework compatibility improvements, and more.

 

This post only applies if you’re using ASP.NET Core on .NET Framework.

ASP.NET Core users on .NET Framework should update to the latest ASP.NET Core 2.3 release to stay in support. This update enables ASP.NET Core 2.2 users to update to a supported version by doing a NuGet package upgrade instead of a downgrade. ASP.NET Core 2.1 users updating to ASP.NET Core 2.3 should experience no change in behavior as the packages contain the exact same code. ASP.NET Core 2.2 users may need to remove any dependencies on ASP.NET Core 2.2 specific changes. Any future servicing fixes for ASP.NET Core on .NET Framework will be based on ASP.NET Core 2.3.
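
In practice, that upgrade path should amount to a package version bump. A hedged sketch (Microsoft.AspNetCore.Mvc is only an example of an ASP.NET Core 2.x package; substitute whatever 2.2 packages the project actually references):

    # bump an ASP.NET Core 2.x package reference to the 2.3 servicing release
    dotnet add package Microsoft.AspNetCore.Mvc --version 2.3.0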

Microsoft making changes for something five years out of support (the 2.2 version).

lol at every instance of ASP.NET becoming a link here

 

Nushell is a powerful shell and scripting language with strong typing, querying, and piping functionalities.

This release adds runtime pipeline input type checking, several new commands and operators, and various other miscellaneous improvements.

42
I Stopped Using Matrix - Tatsumoto (tatsumoto.neocities.org)
submitted 5 months ago* (last edited 5 months ago) by Kissaki@programming.dev to c/opensource@programming.dev
 

What ultimately pushed me to leave Matrix was discovering that my homeserver's admin was using my account without my consent.

In an encrypted room even with fully verified members, a compromised or hostile home server can still take over the room by impersonating an admin. That admin (or even a newly minted user) can then send events or listen on the conversations.

…, I've decided to move my conversations over to SimpleX.

For the past few months, the Matrix community has been largely inactive (despite having over 5,000 members), while the Telegram community has remained much more vibrant. This is disappointing given that I have been a strong advocate for using Matrix and have promoted it widely. For some reason, people are not moving to Matrix at the rate I had hoped.

 

GitHub repo

Examples

> (15 kg/m) * 7cm
# (((15 * kg) / m)) * 7 * cm
out = 1050 * g
> 1 |> cos |> log
# 1 |> cos |> log
out = -0.6156264703860141
> display dev
# Display mode: dev (Developer)
>>> 1.5
# 1.5
out = 1.5
    # IEEE 754 - double - 64-bit
    #
    = 0x_3FF80000_00000000
    = 0x____3____F____F____8____0____0____0____0____0____0____0____0____0____0____0____0
    #    seee eeee eeee ffff ffff ffff ffff ffff ffff ffff ffff ffff ffff ffff ffff ffff
    = 0b_0011_1111_1111_1000_0000_0000_0000_0000_0000_0000_0000_0000_0000_0000_0000_0000
    #   63                48                  32                  16                   0
    #
    # sign    exponent              |-------------------- fraction --------------------|
    =   1 * 2 ^ (1023 - 1023) * 0b1.1000000000000000000000000000000000000000000000000000
 

I track and version my Nushell environment and configuration in a public repository.

I added a GitHub Actions workflow that tests these files. That ensures a more defined environment and explicit prerequisites/assumptions, given that they have to be set up in the workflow configuration. Since I mostly work on Windows but set up the CI to run on Linux/Ubuntu, it also ensures platform neutrality.


Since Nushell version 0.101.0, there's no need for a default, base env.nu or config.nu, and the nu binary can be called with only the custom, minimal env and config files.

The nu binary offers --env-config and --config parameters.

I noticed that when using them, errors do not lead to error exit codes; nu will continue execution and report success despite env or config not loading [correctly]. (Bug Ticket #14745)
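
For illustration, the CI step boils down to invocations like this (a minimal sketch, assuming env.nu and config.nu sit at the repository root and a recent nu is on the runner's PATH):

    # load only the versioned files instead of any default config, then run a trivial command
    nu --env-config env.nu --config config.nu --commands "version"
    # per the bug above, a failed env/config load may still exit with code 0,
    # so the workflow can't rely on the exit code alone to catch load errors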


Do you version your environment configuration? Only locally, or with a hosted repository as a backup or to share? Do you run automated tests on it?

 


azureedge.net dotnet CDN URLs will cease to work sometime next year, after January 15th.

One of two Azure CDN providers was Edgio, which filed for bankruptcy. CDN migration is in progress.


We expect that most users will not be directly affected, however, it is critical that you validate if you are affected and to watch for downtime or other kinds of breakage.

We maintain multiple Content Delivery Network (CDN) instances for delivering .NET builds. Some end in azureedge.net. These domains are hosted by edg.io, which will soon cease operations due to bankruptcy. We are required to migrate to a new CDN and will be using new domains going forward.

Affected domains:

  • dotnetcli.azureedge.net
  • dotnetbuilds.azureedge.net

Unaffected domains:

  • dotnet.microsoft.com
  • download.visualstudio.microsoft.com

  • Update dotnetcli.azureedge.net to builds.dotnet.microsoft.com
  • Update dotnetcli.blob.core.windows.net to builds.dotnet.microsoft.com

We also noticed that there is a lot of use of our storage account: dotnetcli.blob.core.windows.net. Please also search for it. The storage account is unaffected, however, it would be much better for everyone if you used our new CDN. It will deliver better performance.
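
As a rough sketch of that search (assuming ripgrep is available; any text search works):

    # list files that still reference the retiring CDN domains or the storage account
    rg --files-with-matches 'azureedge\.net|dotnetcli\.blob\.core\.windows\.net'
    # then point those references at builds.dotnet.microsoft.com, per the mapping above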

 
  • Simplified Startup Configuration
  • path self
  • chunk-by (see the sketch after this list)
  • term query
  • merge deep
  • WASM support (again)
  • sys net includes new columns mac and ip
  • raw string pattern matching
  • dates can now be added to durations
  • additional explore keybinds
  • version in startup banner
  • input --default
  • PowerShell script invocation on Windows
  • new introspection tools
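
A rough sketch of two of the additions, based on my reading of the release notes (exact output formatting may differ):

    > [1 3 5 2 4 1] | chunk-by {|it| $it mod 2 }
    # consecutive elements for which the closure returns the same value form one chunk:
    # [[1, 3, 5], [2, 4], [1]]
    > {a: {x: 1}} | merge deep {a: {y: 2}}
    # nested records are merged recursively instead of being replaced:
    # {a: {x: 1, y: 2}}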

Breaking Changes:

  • ++ operator, stricter command signature parsing (resolves silent parse errors)
  • group-by now supports "groupers" (multiple criteria)
  • timeit
  • sys cpu
  • from csv and from tsv
  • std/iter scan
  • completion sorting in custom completers, import module naming with normalization
  • display_output hook
  • du flag changes
  • Code specific environment variables updated during source