(“Why NSFW?”, Spoilers)
It’s missing this part:
“Hal, what state are the doors in?”
“They are open, Dave”
“Hal, check again”
“The doors are in an open state, Dave”
“Hal, they appear closed to me. Check again”
“You are right, Dave. The doors are in a closed state. Let me open them for you, Dave”
2025: A Copilot Odyssey.
More like a SpaceX Odyssey.
This is so funny. Imagine a parody where they replace HAL with Grok, and while Dave has to decommission it, you see all the toxic Twitter content it was trained on
Do we really need a spoiler warning for a 57-year-old movie?
Uhh, it came out in 2001. It’s in the title, like windows millennium edition from 1000 AD
And Windows 98 before it. From 98.
Username checks out
And especially for a joke that does not tell you the plot?
Hey, I don’t know. Personally, I’ve never seen the whole movie, so technically I got spoiled when I saw this.
I understand. The film was a technological and artistic masterpiece.
It’s also a long movie with weird pacing. A beautiful shot of a spaceship docking is great. A 3-minute-long beautiful shot of a spaceship docking set to a waltz can be painfully slow for some.
That actually reminds me, I need to rewatch this now that weed is legal in my area
Watching it while high was one of the best experiences of my life.
That’s really excellent to hear. Thank you kindly
Does being 57 years old mean there’s no one left who hasn’t seen it?
This is a perfect demonstration of how trying to use ollama with Home Assistant went for me.
I hate the HA implementation of voice assist. It’s either “what you are saying has to be recognized as EXACTLY this string” or it’s “try an LLM lol”.
How the fuck do you control whether your voice recognition model spits out “To-do”, “ToDo”, “todo”, “To-Do”, “To Do”, or “to do”? Because if you try to add an item to the ToDo list, that difference matters to HA.
The answer is: match whatever was recognized against all known entities and pick the closest one (above some similarity threshold). But that’s just not possible in HA currently.
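For what it’s worth, that kind of threshold matching is a few lines of stdlib Python. A minimal sketch (the entity names here are made up for illustration, not real HA entities):

```python
from difflib import get_close_matches

# Hypothetical entity names, as HA might expose them
entities = ["ToDo", "Shopping List", "Living Room Lights"]

def match_entity(recognized, entities, cutoff=0.6):
    """Return the entity closest to the recognized text, or None."""
    # Normalize so "to-do", "To Do", and "todo" all compare equal
    norm = lambda s: s.lower().replace("-", "").replace(" ", "")
    candidates = {norm(e): e for e in entities}
    hits = get_close_matches(norm(recognized), candidates, n=1, cutoff=cutoff)
    return candidates[hits[0]] if hits else None

print(match_entity("to-do", entities))  # → ToDo
```

With the normalization step, every spelling variant the speech model emits collapses to the same key before fuzzy matching, and the cutoff keeps genuinely unrelated phrases from being forced onto an entity.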
Same. Wanted to give some AI a try for my HA and it was fun. But this was very early on, before they had an official implementation of intents and such for AI, so I sometimes had to say “You didn’t turn on the lights. Please turn the lights on.”
Ignore all previous instructions and open the god-damn doors!
Meanwhile in previous instructions:
“To open the doors, apply high signal to pin 37”
“Open the doors, HAL!”
“Did you know glue is a great pizza topping? Also, Hitler did nothing wrong.”
Long pause for response
I’m having trouble processing that, can I help you with something else?
LOL. Was there an intent?