

Guy randomly stopped me to ask me how I like my Model 3 when I was getting out of it in a parking lot last week. First time that’s happened in like 6 years.


PSSST
“It’s pronounced PS/2”


For everyone who never tried it, they had honest to god paid employees in Horizon Worlds to help players get oriented. I cannot imagine a worse job.
The one I ran into was standing in front of some “game” experience that was like…jumping on tiles or some shit.


But I thought the whole line was “hOrIzOn WoRlDs iSnt the MeTaVeRsE.”
Followed by no explanation of what the metaverse is.


There’s a remote job listing on LinkedIn right now for $125/hr to train an AI how to do schematic capture and layout. Like it’s right in the listing that you’re training an AI to do your job. Insanity.


Ask it which is heavier: 20 pounds of gold or 20 feathers.


Vibe coding guy wrote unit tests for our embedded project. Of course, the hardware peripherals aren’t available for unit tests on the dev machine/build server, so you sometimes have to write mock versions (like an “adc” function that just returns predetermined values in the format of the real analog-to-digital converter).
Claude wrote the tests and mock hardware so well that it forgot to include any actual code from the project. The test cases were just testing the mock hardware.
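For anyone who hasn’t seen this failure mode, here’s a minimal sketch of what it looks like. The function names (`adc_read`, `test_adc`) and canned values are hypothetical, not from the actual project: the mock returns predetermined readings, and the “unit test” never touches any project code at all, so it only proves the mock works.

```c
#include <assert.h>
#include <stddef.h>

/* Hypothetical mock ADC: returns predetermined readings instead of
 * touching real hardware, cycling through a fixed sample table. */
static const int mock_samples[] = { 512, 768, 1023 };
static size_t mock_index = 0;

int adc_read(void)
{
    int value = mock_samples[mock_index];
    mock_index = (mock_index + 1) % (sizeof mock_samples / sizeof mock_samples[0]);
    return value;
}

/* The failure mode described above: this "test" calls no code from the
 * actual project. It only verifies that the mock returns the values
 * the mock was hard-coded to return. */
void test_adc(void)
{
    assert(adc_read() == 512);
    assert(adc_read() == 768);
    assert(adc_read() == 1023);
}
```

The test passes every time, of course, because it’s checking the mock against itself.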


Definitely not Mandela, but maybe it’s something Google never officially confirmed.
Here’s an article about Ingress, the spiritual predecessor to PkmnGo. https://www.newscientist.com/article/mg21628936-200-why-googles-ingress-game-is-a-data-gold-mine/


I feel like this was common knowledge back in 2016. Is this surprising to anyone?
And you realize that it’s taking a while to delete that small handful of files.


In ECC memory?


I think it’s the first time it’s happened since I upgraded my hardware over a year ago. 64 gigs of RAM and I rarely use more than 30% of it.


Reset button not working, but power button working is quite odd.
Yeah, makes me think it’s something hardware-level.
Are the server and chargers close to each other? Can you reliably trigger it with the car charger?
No. The car charges every night. This is the first time this has happened.


I used to work for a consultancy that tried to bill themselves as experts in VR/AR. This is back in 2017 or so. We helped a client make a 3D tracking system with VR/AR applications, and this client let us kind of run with it.
Anyway, I was sort of head of this AR/VR thing, and we were always desperate for free advertising, so I somehow got pulled to provide my thoughts on the impact of VR/AR on the grocery store industry for an article in “The Grocer” or some other industry mag.
Leading up to the call, I was trying to think of what I’d say. My thoughts were on building out virtual grocery stores to test customer reactions before building them for real. Bring in some test subjects, see how they plan their route, how they react to different placements of goods. Track their eye movements to see if the new end-cap design is working. Time how long they spend in the store, etc. Are the aisles too narrow and claustrophobic? I got the idea from another client who was using VR to test out new detergent bottle concepts (apparently a one-off of a blow-molded bleach bottle is crazy expensive).
Well my consultancy had been purchased by a multinational conglomerate a year or so prior, so I got a phone call from some C-suite ass who wanted to brief me on what they wanted me to say to the magazine.
His idea was a service where you could have a store employee wear some kind of camera rig so the customer could sit at home in VR and pilot the employee around the store. This would essentially replace curbside pickup, but with the added benefit of “allowing the customer to pick which apple they want out of the bunch.”
I resolved to ignore that advice, but the whole magazine thing ended up falling through anyway. I quit within the year.


I’m mostly trying to describe a feeling I don’t hear named very often


I’ll give that a shot.
I’m running it in docker because it’s running on a headless server with a boatload of other services. Ideally whatever I use will be accessible over the network.
I think at the time I started, not everything supported Intel cards, but it looks like llama-cli has support for Intel GPUs. I’ll give it a shot. Thanks!


Thanks for the link. I was gonna ask if you were a writer, heh.
I agree. The tone of the ads this year felt almost like lampshading. Like if we acknowledge the problem, we’re wise to what the audience is feeling, but we’re not going to do a damn thing to address it. It’s just something that needs to be done to make the ad feel remotely relevant.
AI is scary, but don’t be afraid of our surveillance device because we acknowledged that AI is scary
AI will sell you ads. Anyway, you’re watching an ad for AI
Work sucks amirite? Why not let us unemploy you?
There’s a wealth gap. Spend money on our stuff.
And I’m not going to even link the He Gets Us ads.
It was an especially interesting case because there was a question of whether the photographer lied about who actually took the picture. So he could either claim the monkey took it and lose the copyright, or claim he took it himself and have it lose all value.


Thanks for taking the time.
So I’m not using a CLI. I’ve got the intelanalytics/ipex-llm-inference-cpp-xpu image running and hosting LLMs to be used by a separate open-webui container. I originally set it up with Deepseek-R1:latest per the tutorial to get the results above. This was straight out of the box with no tweaks.
The interface offers some control settings (screenshot below). Is that what you’re talking about?

Bought it in 2018. Tried to convince him to buy a Kia.