I think there's context missing from that story. Diagrams do not trigger disgust. At best, demanding diagrams for trivial tasks, where they add no value and achieve nothing beyond overhead, can lead managers to frown upon them.
No. Context is whatever makes sense to provide to a consumer to help them debug it or respond to it
So it's both optional and unspecified. This means it can't be parsed or relied upon, especially by consumers. It's useless.
the same basic idea as in the rfc under details.
No, it isn't. Contrary to your ad-hoc format, RFC 9457 specifies exactly the data type of detail and what its purpose is. This allows third parties to reliably consume resources that comply with RFC 9457, while your ad-hoc format leaves clients no option other than to ignore it.
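To make that concrete, here's a minimal sketch of a client consuming an RFC 9457 problem document. The member names (type, title, status, detail, instance) come straight from the RFC; the payload values themselves are invented for illustration:

```python
import json

# Example RFC 9457 problem document (media type application/problem+json).
# The member names are defined by the RFC; the values here are invented.
body = """{
  "type": "https://example.com/probs/out-of-credit",
  "title": "You do not have enough credit.",
  "status": 403,
  "detail": "Your current balance is 30, but that costs 50.",
  "instance": "/account/12345/msgs/abc"
}"""

problem = json.loads(body)

# Because the RFC pins down each member's type and meaning, a generic
# client can rely on them: "detail" is always a human-readable string
# explaining this specific occurrence of the problem.
print(problem["title"])   # human-readable summary of the problem type
print(problem["detail"])  # human-readable detail for this occurrence
```

A generic client library can be written once against these members and work across every compliant API, which is exactly what an unspecified, optional field cannot offer.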
IMO, it can’t easily be generalized. Some APIs may have context to provide, others may not.
It doesn't matter what services can produce. What matters is whether clients can consume it. Your ad-hoc format fails to specify this field, which is also optional, and thus leaves clients no option other than to ignore it. It's unusable.
Success is something that you can sniff for after deserializing, as IIRC the Fetch API will not throw except for network errors, even in the event of a 4XX or 5XX.
What the Fetch API does or does not do is irrelevant. The responsibility of putting together a response and generating the resource shipped with it lies exclusively with your service. If it outputs a resource that is unable to tell clients what went on, that's a problem caused both by how your service is designed and by the ad-hoc format it outputs.
The main takeaway is that RFC 9457 is well specified and covers the basic use cases, while your ad-hoc format is broken by design. So when you describe the RFC as "overwrought", you're actually describing the half-baked approach you took.
I like old reddit. This project is a reminder that it's highly likely those bastards will start working on making it unusable, if not outright ending it.
Your format looks half-baked and not thought all the way through. Take for instance the success bool. What info does this add that error_code and the request's own status code don't already convey? And what's the point of context if it is both unspecified and optional?
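To illustrate the redundancy, a hedged sketch (the helper name and sample codes are invented): a client can derive success from the HTTP status code alone, so a separate success boolean in the body adds nothing.

```python
def is_success(status_code: int) -> bool:
    # HTTP already encodes success: any 2xx status is a success,
    # so a separate `success` boolean in the response body is redundant.
    return 200 <= status_code < 300

# The status code alone answers the question the `success` field
# was trying to answer.
assert is_success(200)
assert is_success(204)
assert not is_success(404)
assert not is_success(500)
```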
I'm not sure you understand that what a union does or does not do is completely irrelevant and beside the point. Python's protocols add support for structural subtyping, and enable both runtime and build-time type checks without requiring major code changes. Do you understand what that means?
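Here's a minimal sketch of what that means in practice (the class and function names are invented for illustration):

```python
from typing import Protocol, runtime_checkable

@runtime_checkable
class Quacks(Protocol):
    def quack(self) -> str: ...

# Duck is structurally a Quacks even though it never names the
# protocol, inherits from it, or registers with it.
class Duck:
    def quack(self) -> str:
        return "quack"

def make_noise(animal: Quacks) -> str:
    # Static checkers (mypy, pyright) validate callers against the
    # protocol's structure at build time; no base class required.
    return animal.quack()

# @runtime_checkable additionally enables structural isinstance() checks.
assert isinstance(Duck(), Quacks)
print(make_noise(Duck()))  # quack
```

The point is that the check is structural: any object with a matching `quack` method passes, with no changes to its class.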
I'm going to play devil's advocate for a moment.
following best practices we laid out in our internal documentation
Are you absolutely sure those "best practices" are relevant or meaningful?
I once worked with a junior dev who cared about "best practices" only because the doc was a hastily whipped-up document that specified nothing beyond coding style, like whether spaces should appear before or after things. That junior dev proceeded to cite their own "best practices" doc with an almost religious fervor in everyone else's pull requests. That stopped the very moment I made a linter available to the project, though mind you the junior dev refused to run it.
What's the actual purpose of your "best practices" doc? Does it add any value whatsoever? Or is it just fuel for grandstanding and petty office politics?
his code works mind you,
Sounds like the senior dev is doing the job he was paid to do. Are you doing the same?
It’s weird because I literally went through most of the same training in company with him on best practices and TDD, but he just seems to ignore it.
Perhaps his job is to deliver value instead of wasting time with nonsense that serves no purpose. What do you think?
I am sceptical. I’d understood vtable lookups to be extremely cheap on modern architecture; usually cheaper than the if / switch statement you’d have to write as an alternative if you weren’t using inheritance.
They are relatively cheap, but cheap does not mean free, and some references mention a performance hit that can be as high as 7%.
And for really performance-sensitive code, you would never have used virtualized classes anyway.
Yeah, because of this performance hit.
One of the cool things about this article is that it points out how adding 'final' can magically get rid of that performance hit.
What? The fact that Instagram accounts were used to seed Threads' userbase is straight out of the announcement of Instagram's new text feature called... Threads.
https://about.instagram.com/blog/announcements/threads-instagram-text-feature
It’s amazing how often people celebrate some new feature in a language and I’m like: TypeScript has been doing this for years now.
Whatever another programming language supports or does is entirely irrelevant if what you're working with is Python.
I already explained it to you: protocols apply to types you do not own or control, let alone can add a base class to.
You just specify a protocol, declare that a function requires it, and from then on you can pass anything to it as-is and your calls will be validated.
(...) what, pray tell, is the point of protocols, other than backward compatibility with historical fragile ducks (at the cost of future backwards compatibility)?
Got both of those wrong. The point of protocols is to have a way to catch duck typing errors by adding a single definition of a duck. This is not something that applies only to backward compatibility, nor does it affect backward compatibility.
Why are people afraid of using real base classes?
You're missing the whole point of protocols. The point of a protocol is that you want duck typing, not inheritance. Those are entirely different things, and with protocols you only need to specify a single protocol and use it in a function definition, and it automatically validates each and every object passed to it without having to touch its definition or add a base class.
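For instance, a hedged sketch (the protocol and function names are invented) of a protocol validating objects whose definitions you cannot touch, such as stdlib builtins:

```python
from typing import Protocol, runtime_checkable

@runtime_checkable
class HasLength(Protocol):
    def __len__(self) -> int: ...

def describe(container: HasLength) -> str:
    # Any object with __len__ qualifies: builtins, third-party types,
    # your own classes. None of them needs a base class added.
    return f"{len(container)} item(s)"

# list and str come from the stdlib; we never modified them, yet both
# satisfy the protocol purely by structure.
assert isinstance([1, 2, 3], HasLength)
assert isinstance("abc", HasLength)
print(describe([1, 2, 3]))  # 3 item(s)
```

With a base class you'd have to edit or wrap every one of those types; with a protocol you declare the shape once and every conforming object passes.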
Perfecting what you have often leads to a completely different language. See C vs C with Classes, which ended up becoming C++.
There is absolutely no problem with creating new languages. These are often designed with specific features in mind, and the success cases often offer features that are in high demand. Take for instance node.js, and how its event loop makes it a near-ideal platform for network-heavy applications that run on a single thread.