In that case, whoosh me. Just wanted to make sure nobody takes that as actual advice.
squaresinger
Same here: my employer only has to start worrying once I put my current position into my LinkedIn profile.
I agree with your final take, but why would you want to take frontend tickets if you can also do backend work?
Nope. Hallucinations are not a cool thing. They are a bug, not a feature. The term itself is also far from cool or positive. Or would you think it was cool if a human had hallucinations?
Hallucinations mean something specific in the context of AI. It's a technical term, same as "putting an app into a sandbox" doesn't literally mean that you pour sand into your phone.
Human hallucinations and AI hallucinations are very different concepts caused by very different things.
590–620 nm. Identical to orange.
The difference between brown and orange is brightness, and since the eye automatically adjusts for brightness, brightness levels only become apparent in context.
Light becomes a darker variant if there's brighter light around and vice versa. Shine brown/orange light into a dark room, and it will appear orange. Shine the same light into a brighter context, and it will be brown.
It's exactly the same thing as e.g. dark blue or light blue. Both share the exact same wavelength, and their brightness becomes apparent in context.
If you've ever been to a cinema and you saw anything brown or orange on screen, you have seen the effect. If you have ever seen a dim conventional light bulb in a bright room, you have seen it too.
Brown has just as much a wavelength as orange, because it's the same color.
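The "brown is dark orange" claim can be checked numerically: convert both colors to HSV and the hue comes out identical, with only the value (brightness) differing. A minimal sketch using the standard library's colorsys; the RGB triples are illustrative picks, not from the comment above:

```python
# Sketch: "brown" is just orange at a lower brightness (value in HSV).
import colorsys

orange = (255, 165, 0)                         # a typical orange
brown = tuple(round(c * 0.4) for c in orange)  # the same light, dimmed: (102, 66, 0), a brown

def hue_deg(rgb):
    """Hue angle in degrees from an 8-bit RGB triple."""
    r, g, b = (c / 255 for c in rgb)
    return colorsys.rgb_to_hsv(r, g, b)[0] * 360

print(hue_deg(orange))  # ~38.8 degrees
print(hue_deg(brown))   # ~38.8 degrees -- identical hue, only the brightness differs
```

Scaling all three channels by the same factor is exactly what "the same light in a darker context" does: hue and saturation stay fixed, only value drops.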
Amazon Games does it too.
Pokemon games also routinely come close.
That's what you've got emoji for.
Terrible idea for a few reasons.
- The example in the OP does not need anything but the country, and GPS coordinates are less efficient than ISO country codes.
- GPS coordinates don't map 1:1 to countries or even street addresses. There are infinite different coordinates for each address, and it's very non-trivial to match one to another. Comparing whether two records with country codes are in the same country is trivial. Doing the same with two GPS coordinates is very difficult.
- GPS coordinates might be more exact than accurate. This is a surprisingly common issue: you start out only needing a country, so you put some arbitrary GPS position (e.g. the center of the country) into the GPS coordinates. Later a new requirement arises that means you now need street addresses. Now all old entries point to some random house in the middle of the country, and there's no easy way to differentiate these false locations from real ones.
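The first two points can be sketched in a few lines (a toy example, not from the thread; the record fields and the 6371 km mean Earth radius are my own choices): with ISO codes the same-country check is a string comparison, while with raw coordinates the best you can do without full reverse geocoding is a distance threshold, which says nothing about borders.

```python
import math

# With ISO 3166 country codes, "same country?" is a trivial string comparison.
record_a = {"country": "DE"}
record_b = {"country": "DE"}
same_country = record_a["country"] == record_b["country"]  # True

# With GPS coordinates, even two points geocoded for the SAME address
# differ (rooftop vs. entrance, different geocoders, rounding), so you
# fall back to a distance check -- which cannot answer the country question.
def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometres between two lat/lon points."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * 6371 * math.asin(math.sqrt(a))

# Two points under 1 km apart can straddle a border, while two points
# 1000 km apart can share a country, so no threshold gets this right.
print(same_country)
print(haversine_km(50.0, 6.0, 50.0, 6.01))  # well under 1 km apart
```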
I guess you meant that as a joke, but people are really doing this and it leads to actual problems.
I saw a news report a while ago about something like that being done in a database for people with outstanding debt. If the address of the debtor wasn't known, they just put "US" in the form, and the program automatically entered the centre of the US as the coordinates.
Sucks for the family that lives there because they constantly get threatening mail and even house visits from angry lenders who want their money back. People even vandalized their house and car because they believed that their debtors lived in that house.
You did not read your source. Some quotes you apparently missed:
Scraping to violate the public’s privacy is bad, actually.
Scraping to alienate creative workers’ labor is bad, actually.
Please read your source before posting it and claiming it says something it doesn't actually say.
Now why does Doctorow distinguish between good scraping and bad scraping, and even between good LLM training and bad LLM training in his post?
Because the good applications are actually covered by fair use while the bad parts aren't.
Because fair use isn't actually about what is done (scraping, LLM training, ...) but about who does it (researchers, non-profit vs. companies, for-profit) and for what purpose (research, critique, teaching, news reporting vs. making a profit by putting original copyright owners out of work).
That's the whole point of fair use. It's even in the name. It's about the use, and the use needs to be fair. It's not called "Allowed techniques, don't care if it's fair".
It's not anthropomorphizing, it's how new terms are created.
Pretty much every new term ever draws on already existing terms.
A car is called a car because that term was first used for streetcars, before that for passenger train cars, before that for cargo train cars, before that for a chariot, and originally for a two-wheeled Celtic war chariot. Not a lot of modern cars have two wheels and a horse.
A plane is called a plane because it's short for airplane, which derives from aeroplane, a term that meant the wing of an airplane and before that denoted the shell casings of a beetle's wings. And not a lot of modern planes are actually made of beetle wing shell casings.
You can do the same for almost all modern terms. Every term derives from a term that denotes something similar, often in another domain.
Same with AI hallucinations. Nobody with half an education would think that the cause, effect, and expression of AI hallucinations are the same as for humans. OpenAI doesn't feed ChatGPT hallucinogens. It's just a technical term that means something vaguely related to what the term originally meant for humans, same as "plane" and "beetle wing shell casing".