Kissaki

joined 2 years ago
[–] Kissaki@programming.dev 7 points 1 month ago

Marketing-speak, not saying much at all. Not even a hint at what they "discovered", what they plan to change, or what they plan to do. There's no acknowledgement of previous issues, which makes me read "working with the incredible global community" as pure marketing-speak too.

[–] Kissaki@programming.dev 13 points 1 month ago

Roman @rtsisyk revoked GitHub owner permissions from Alexander @biodranik and Viktor @vng and granted such permissions to the community contributor @pastk. This triggered GitHub's automatic "sanctions" check and the whole GitHub OM organization was automatically archived and admin access was blocked until OM's appeal was reviewed. It was unknown whether and when GitHub would review Organic Maps' appeal and unblock the repositories, so 2 weeks later the project migrated to the self-hosted git.omaps.dev/organicmaps instance, using the free and open source software forge Forgejo.

What the fuck? GitHub blocking the account because an automated security evaluation triggered (probably a good thing), but no review for over two weeks (obviously a very bad thing)?

[–] Kissaki@programming.dev 1 points 1 month ago

What are you referring to? The reasons to fork, what a fork/forking process is, or what it means for this project?

Contributors disagreed with how the project was run and controlled, so they committed to running their own project based on the other one, with more collaborative ownership and governance.

[–] Kissaki@programming.dev 5 points 1 month ago

I also want locally deleted files to be deleted on the server.

Sometimes I even move files around (I believe in directory structure) and again, git deals with this perfectly. If it weren’t for the lossless-to-lossy caveat.

It would be perfect if my script could recognize that just like git does, instead of deleting and reuploading the same file to a different location.

If you were to use Git, deleted files get deleted in the working copy, but not in history. The content is still there, taking up disk space, although it won't be transmitted again.

I'd look at existing backup and file sync solutions. They may have what you want.

For an implementation, I would work with an index. If you store paths + file size + content checksum you can match files under different paths. If you compare local index and remote you could identify file moves and do the move on the remote site too.
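A minimal sketch of that index idea in Python (the function names and index layout are my own, purely illustrative): build a per-side index of path → (size, checksum), then treat a local file whose path is absent from the remote but whose content identity matches an existing remote file as a move.

```python
import hashlib
import os

def build_index(root):
    """Map each file's relative path to its content identity (size, sha256)."""
    index = {}
    for dirpath, _dirs, files in os.walk(root):
        for name in files:
            path = os.path.join(dirpath, name)
            with open(path, "rb") as f:
                digest = hashlib.sha256(f.read()).hexdigest()
            rel = os.path.relpath(path, root)
            index[rel] = (os.path.getsize(path), digest)
    return index

def detect_moves(local, remote):
    """Return (old_remote_path, new_local_path) pairs: same content, new path."""
    # Invert the remote index: content identity -> path.
    by_content = {identity: path for path, identity in remote.items()}
    moves = []
    for path, identity in local.items():
        if path not in remote and identity in by_content:
            moves.append((by_content[identity], path))
    return moves
```

Each detected pair can then be executed as a server-side move instead of a delete-plus-reupload. Note this simple inversion assumes content identities are unique; duplicated files would need extra handling.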

[–] Kissaki@programming.dev 3 points 1 month ago

Your git repo might get very big after some time. Especially if you move files.

Moving files does not noticeably increase git repo size. The files are stored as blob objects. Changing their path does not duplicate them.
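This is because Git is content-addressed: a blob's object ID is a hash of its content (plus a small header), so the path plays no part in it. A small Python sketch of how Git computes a blob ID:

```python
import hashlib

def git_blob_hash(content: bytes) -> str:
    """Compute the object ID Git assigns to a blob: sha1 of header + content."""
    header = b"blob %d\x00" % len(content)
    return hashlib.sha1(header + content).hexdigest()

# The ID depends only on the content, so a moved or renamed file
# resolves to the same blob object -- nothing is duplicated.
```

You can confirm this against `git hash-object` on any file: moving the file and hashing it again yields the same ID.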

[–] Kissaki@programming.dev 0 points 1 month ago* (last edited 1 month ago)

Can you be more specific? What, in what they present, is a bad use of AI?

[–] Kissaki@programming.dev 0 points 1 month ago* (last edited 1 month ago)

What makes you think anyone blindly trusted it?

They pointed out how it was almost correct, and the two places they had to correct. Obviously, they verified it.

There, and at other times, they talked about similar approaches: generating a starting point rather than "blindly trusting" anything.

[–] Kissaki@programming.dev 5 points 1 month ago

They look at North Korea, Russia, and then the EU, and see no difference between regulation/protection and censorship.

Uncomfortable to read this on a .gov website.

I guess just like it's within the EU's right to reject and criminalize some things (and some people may no longer travel to the EU, or face arrest), the visa conditions are within the US's discretion. The argumentation and policy are just so backward, in my eyes. Regulation does not oppose free speech; it enables it.

[–] Kissaki@programming.dev 1 points 1 month ago* (last edited 1 month ago) (1 children)

The linked URL is not a website; it returns content-type application/activity+json.

I assume it is merely temporarily broken.

[–] Kissaki@programming.dev 4 points 1 month ago (1 children)

I'm gonna file a complaint…

[–] Kissaki@programming.dev 4 points 1 month ago

Is there new content available for download?

We'll send the list straight to your inbox.

Please let me read the article before showing me subscription onboarding popups.

Still no idea what they refer to by downloads - what I would even subscribe to.

[–] Kissaki@programming.dev 4 points 1 month ago

Blazor allows JavaScript-like interactions, allows the developer to write in C#, but gets rendered server-side

Blazor can also compile .NET to WebAssembly and run it in the web browser.

 

My home PC is still on Windows 10 22H2, while my work machine is on Windows 11 23H2, and, to no surprise, neither machine reproduced the issue – Skimmer spawned on the water just fine, creating one via script and putting CJ in a driver’s seat worked too.

That said, I also asked a few people who upgraded to 24H2 to test this on their machines and they all hit this bug.

I have a likely explanation for why Rockstar made this specific mistake in the data to begin with – in Vice City, Skimmer was defined as a boat, and therefore did not have those values defined by design! When in San Andreas they changed Skimmer’s vehicle type to a plane, someone forgot to add those now-required extra parameters. Since this game seldom verifies the completeness of its data, this mistake simply slipped under the radar.
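The failure mode described here can be sketched in Python (purely illustrative; the game is native code, and the field names here are invented): a parser that fills in only the fields present on a data line, silently leaving the rest at whatever value was already there, much like reading uninitialized memory.

```python
def parse_vehicle_line(line, prior=None):
    """Fill named fields from a comma-separated data line. Fields missing
    from the line are silently left at whatever `prior` held -- a stand-in
    for uninitialized memory in a native parser that never validates
    field count."""
    field_names = ["model", "type", "wheel_scale_front", "wheel_scale_rear"]
    values = dict(prior or {})  # stale values stand in for uninitialized memory
    for name, raw in zip(field_names, line.split(",")):
        values[name] = raw.strip()
    return values

# A boat-style line omits the wheel-scale fields; no error is raised,
# and the entry keeps whatever happened to be in memory before.
stale = {"wheel_scale_front": "garbage", "wheel_scale_rear": "garbage"}
entry = parse_vehicle_line("SKIMMER, plane", prior=stale)
```

As long as the leftover values happened to be harmless, the game appeared to work; change what the "uninitialized" memory contains, and the bug surfaces.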

What made the game work fine despite this issue for over twenty years, before a new update to Windows 11 suddenly challenged this status quo?

 


Theia IDE is compatible with VS Code APIs and can install and use VS Code extensions. It also has additional APIs for customizations not available in VS Code.

Have you tried Theia IDE? Any assessments or experiences to share?

 

Abstract:

When a website is accessed, a connection is made using HTTPS to ensure that it ends with the website owner and that subsequent data traffic is secured. However, no further assurances can be given to a user. It is therefore a matter of trust that the site is secure and treats the information exchanged faithfully. This puts users at risk of interacting with insecure or even fraudulent systems. With the availability of confidential computing, which makes execution contexts secure from external access and remotely attestable, this situation can be fundamentally improved.

In this paper, we propose browser-based site attestation that allows users to validate advanced security properties when accessing a website secured by confidential computing. This includes data handling policies such as the data provided being processed only during the visit and not stored or forwarded. Or informs the user that the accessed site has been audited by a security company and that the audited state is still intact. This is achieved by integrating remote attestation capabilities directly into a commodity browser and enforcing user-managed attestation rules.

Some excerpts:

Such a secured context is encrypted at all times, but is decrypted within the CPU only when the context is about to be executed. Thus, code and data are now also protected from unwanted access during execution. In order to validate that confidential computing applies to a secured context, remote attestation must be performed. During this process, a request is sent to a secured context, which in turn requests an attestation report from a Hardware Root of Trust (HRoT) local to the platform.
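The attestation flow described in that excerpt can be sketched in simplified form (my own illustration, not the paper's implementation): the verifier sends a fresh nonce, the secured context returns a report containing a code measurement bound to that nonce and signed by the platform's root of trust, and the verifier checks both the signature and the measurement against its policy. Here an HMAC with a shared key stands in for the HRoT's signing key.

```python
import hashlib
import hmac

HROT_KEY = b"platform-root-of-trust-key"  # stand-in for the HRoT's signing key

def issue_report(code: bytes, nonce: bytes):
    """Secured context side: measure the code and sign (measurement + nonce)."""
    measurement = hashlib.sha256(code).digest()
    signature = hmac.new(HROT_KEY, measurement + nonce, hashlib.sha256).digest()
    return measurement, signature

def verify_report(measurement, signature, nonce, expected_measurement):
    """Verifier side: check the signature and compare against policy.
    The nonce binds the report to this request, preventing replay."""
    expected_sig = hmac.new(HROT_KEY, measurement + nonce, hashlib.sha256).digest()
    return (hmac.compare_digest(signature, expected_sig)
            and measurement == expected_measurement)
```

In real confidential-computing attestation the report is signed with an asymmetric hardware-held key and verified against the vendor's certificate chain; the HMAC here only illustrates the request/report/verify shape.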

We argue that end users could also benefit greatly from the extended guarantees of confidential computing when accessing a secured website. However, there are two main obstacles: First, there is no standardized way for users to detect a secured context and perform remote attestation. Second, if remote attestation is enabled, users must be able to interpret an attestation result to decide whether the remote site is trustworthy.

In this paper, we present site attestation, which takes advantage of confidential computing to improve trust and security when surfing the Web.

7 CONCLUSION

Today, when accessing websites, users have to trust that the remote system is secure, respects data protection laws, and is benevolent. With the availability of confidential computing, remote execution contexts can be secured from external access and become attestable. Site attestation proposes to secure websites through confidential computing and perform remote attestation with trustworthiness policies while surfing the Web, reducing the need to blindly rely on the website’s reputation.

GitHub repo with Nginx, httperf, and Firefox code

 

For those familiar with Git terminology:

The simplest way to assemble a triangular workflow is to set the branch’s merge key to a different branch name, like so:

[branch "branch"]
   remote = origin
   merge = refs/heads/default

This will result in the branch's pullRef being origin/default, but its pushRef being origin/branch, as shown in Figure 9.

Working with triangular forks requires a bit more customization than triangular branches because we are dealing with multiple remotes. […]

 

Explicit Assembly References are stand-alone assemblies directly referenced in your project. They are not pulled in through NuGet packages, project references, or the Global Assembly Cache (GAC). These assemblies often represent legacy .NET Framework components, especially those compiled for 32-bit, which are not easily upgraded to modern .NET and may exist outside of package management.

Until now, the Toolbox in the Windows Forms designer only displayed controls sourced from NuGet packages or project references.
