daniellamyoung_3h
Unpopular opinion: you only hate chat gpt because it makes it harder to stack rank and discriminate against people.
So what everyone can write well now? great it’s a tool! Just like moving faster because you drive a car.
The good news is you’ll be easily able to hire for that writing job you need. The bad news is you won’t be able to discriminate against candidates who are not as good with the written word.
Also, an obsession with the written word is a tenant of white supremacy [salute emoji]
Ian Rennie
@theangelremiel.bsky.social
Man, this probably hits really hard if you’re fuckin stupid.
Spoken like someone who thinks they can cheat their way to talent
Ya can’t
Not talent, but you can cheat your way to success and power, which is what they care about.
NGL my first thought on reading that was “they millied my vanillis”
That’s an interesting point. It also aligns with how some of our main characters are involved in trying to organize a steroid olympics.
I have the feeling this comment is written in a code I do not know.
Unfortunately “steroid olympics” isn’t a code for anything at all but is quite literal.
There’s literally a weirdo who came up with plans for “the Enhanced Games”, and some Silicon Valley venture capitalists including Peter Thiel and Balaji Srinivasan actually invested in it.
ah techbros high on their own farts again.
“tenant of white supremacy”
White Supremacy is the worst landlord.
Give him a break, he was writing with the ‘help’ of chatGpt
And not for a lack of tough competition.
YOU JUST OUTED YOURSELF, PUNK! That was a trap and you FELL for it!
wut
It’s just a tool, like cars! My definition of tools is things that are being forced on us even though they’re terrible for the environment and make everyone’s life worse!
It’s a tool, just like cars, in that both are terrible for the environment and risk the survival of the human species as well as countless ecosystems.
But not in the cool way that the people selling them say they endanger the survival of life on this planet, just in the boring climate catastrophe ways that people have been trying to get taken seriously since the fucking 70s.
I looked through her recent replies on threads, and while she has deleted the original post, it looks like she is doubling down on this take:
I guess I’ll say this in a different way, the language around that SOME people are using around chat GPT is the same panic language society always uses with new “advancements” or tools. We saw it when GPS became a thing, we see it now with people freaking out about cursive going away, and oh my, they definitely saw it with calculators. At its core it’s a “geez how are we gonna tell people apart anymore, if we can’t test these skills.” That’s not the only argument about it…
there are plenty of things to talk about about AI But this language definitely exists in the conversation. I recognize it easily, because it’s very, very Culty. It’s this very apocalyptic nature of discussion around it instead of the acknowledgment that human beings will keep building tools that will change everything.
every time a new tool makes certain skills that we test for to rank folks obsolete human beings freak out
To all of which I say… wow, she really has decided to just ignore all the discourse about generative AI*, huh? Like sure you can use this analogy but it breaks down pretty quickly, especially when you spend like 5 minutes doing any research on this stuff.
*Would love to start using a new term here because AI oversells the whole concept. I was thinking of tacking something onto procedural generation? Mass PG? LLMPG/LPG? Added benefit of evoking petroleum gas.
Have you considered
That analogy is horseshit because gps and the death of cursive were both need based
Generative ai / chat gpt for writing fiction has no need nor real purpose despite them desperately pinwheeling about jamming it everywhere possible.
The only use I’ve had for writing cursive in 30 years has been to copy out an anti-cheating pledge on a standardized test, because some fucker thought cursive magically makes a pledge 300% more honest.
yup, hence saying its death was need based.
People don’t write cursive for the same reason that councils don’t put in new horse troughs.
gps
Can anyone elaborate on gps panic please? What happened when it became available?
Personally i’ve never heard of a moral panic over GPS, though if pressed I could manufacture some. So that one seems like something dreamed up by the author. Would love to be proven wrong!
Ooh, I know! I’d not exactly call it a moral panic, but there were people who were convinced that drivers would be going off cliffs or getting lost in the mountains because they didn’t have the skills to read a paper map properly. Wasn’t very convincing, especially since people determined to be stupid enough to drive off a cliff without noticing are going to find a way to do that even if there’s a big sign in front of them saying “Cliff, do not drive off”.
In much of the world online mapping services still aren’t anywhere near the standard of a proper topographical map and there’s really no substitute for (say) an Ordnance Survey map if you’re climbing in the Cuillins, but that’s not the fault of GPS.
It barely existed. At least with Photoshop, you had the occasional outrage over manipulated photos created to spread hate (anyone remember the photo where someone photoshopped the heads of Barack Obama and Osama Bin Laden onto Jewish people wearing big Stars of David?) or to create fake nudes, or the elitist oil painter who already had problems with other mediums finding yet another one to be snarky at besides pencil drawings.
I’ve been playing with “mass averaging synthesis machines”, variations on “automated plagiarism”, “content theftwashing systems”
still undecided tho
I’m still partial to “spicy autocomplete” as a good analogy for how these systems actually work that people have more direct experience with. Take those Facebook posts that give you the first few words and say “what does autocomplete say your most used words are?” and make answering the question use as much electricity as a small city.
As much as it is a good phrase, I’m too used to seeing “spicy” as a compliment, so it doesn’t work for me!
Considering how often I have to say the term whenever ranting or debating, something that can be shortened is welcome. I like the idea of calling it “automated plagiarism,” since it can be shortened to “autoplag,” which is also appropriately ugly sounding.
autoplag pronounced like “auto-plag” or “auto-plage”?
“Auto-plague”
Charles Stross suggested “Blarney Engine”
If everyone can write well now then explain this post.
Tenant was that Christopher Nolan movie with the bad audio. Quality comparison.
No that was Tenet, you’re thinking of a tent.
No, a tent is a shelter made out of fabric, you’re thinking of Tencent
No, tencent is a Chinese tech company, you’re thinking of tenement.
No, that’s housing subdivided for rent. You’re thinking of Tennant’s.
No, Tennants is an auction house based at Leyburn in North Yorkshire, England. You’re thinking of tenant.
Correct! A “tenant” is definitely what the obsession with the written word is in relation to white supremacy.
please stop posting about The Net (1995)
No, that’s stuff belonging to the tenth doctor. You’re thinking of ten ents.
New hire firefighter [leaning against a dumpster]: yeah I used the AI that puts out fires to get this job. They would have been able to discriminate against me if I hadn’t done that. Glad that in this crazy fucked up trash fire of a world, there’s still something out there helping to level the playing field.
Veteran firefighter: the trash behind you is literally on fire
Imagine judging someone for a job about communicating with people on their ability to communicate with people effectively.
ChatGPT is great because you can use it to show a potential employer how good your writing is for that writing job they’ll totally pay you to use ChatGPT to do.
It is and always has been racism that has stopped bad writers from getting writing jobs.
/s
Like, there is definitely racism in the hiring process and how writing is judged, but it comes from the fact that white people and white people alone don’t have to code switch in order to be taken seriously. The problem isn’t that bad writers are discriminated against it’s that nonwhite people have to turn on their “white voice” in order to be recognized as good writers. Giving everyone a white robot that can functionally take their place doesn’t actually make nonwhite people any more accepted. It’s the same old bullshit about how anonymity means 4chan can’t be racist.
I’m actually pretty sympathetic to the value of even the most sneer-worthy technologies as accessibility tools, but that has to come with an acknowledgement of the limitations of those tools and is anathema to the rot economy trying to sell them as a panacea to any problem.
Except that the ability to communicate is a very real skill that’s important for many jobs, and ChatGPT in this case is the equivalent to an advanced version of spelling+grammar check combined with a (sometimes) expert system.
So yeah, if there’s somebody who can actually write a good introduction letter and answer questions in an interview, versus somebody who just manages to get ChatGPT to generate a cover letter and answer questions quickly: which one is more likely to be able to communicate well:
- with co-workers,
- in a crisis,
- without potentially providing sensitive data to a third-party tool,
- while providing reliable answers based on fact without “hallucinating”?
Don’t get me wrong, it can even the field for some people in some positions. I know somebody who uses it to generate templates for various questions/situations and then puts in the appropriate details, resulting in a well-formatted communication. It’s quite useful for people who have professional knowledge of a situation but might have lesser writing ability due to being ESL, etc. However, that is always in a situation where there’s time to sanitize the inputs and validate the output, often choosing from and reworking the prompt to get the desired result.
In many cases it’s not going to be available past the application/overview process due to privacy concerns and it’s still a crap-shoot on providing accurate information. We’ve already seen cases of lawyers and other professionals also relying on it for professional info that turns out to be completely fabricated.
LLMs are distinctly different from expert systems.
Expert systems are designed to be perfectly correct in specific domains, but not to communicate.
LLMs are designed to generate confident statements with no regard for correctness.
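A toy sketch of that contrast (every name and rule here is invented purely for illustration, not taken from any real system): an expert system only answers from its hand-written rule base and can say “I don’t know”, while an LLM-style generator always produces a confident-sounding answer regardless of whether it knows anything.

```python
# Toy illustration only -- all domain names and rules are made up.
import random

# An expert system's knowledge is an explicit, auditable rule base.
RULES = {"strep throat": "penicillin", "fungal infection": "fluconazole"}

def expert_system(diagnosis):
    # Deterministic lookup: correct within its narrow domain,
    # and it explicitly refuses outside it.
    if diagnosis in RULES:
        return RULES[diagnosis]
    return None  # "I don't know" is a valid output

def llm_style(diagnosis):
    # Samples something fluent-sounding no matter what was asked;
    # correctness is not part of the objective.
    return random.choice(["penicillin", "fluconazole", "leeches"])

print(expert_system("strep throat"))   # penicillin
print(expert_system("novel disease"))  # None
print(llm_style("novel disease"))      # a confident guess, maybe wrong
```

The point of the sketch: the first function’s failure mode is silence, the second’s is a plausible fabrication.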
Yeah. I should have said “illusions of” an expert system or something similar. An LLM can for example produce decent working code to meet a given request, but it can also spit out garbage that doesn’t work or has major vulnerabilities. It’s a crap shoot
alert alert we’ve got one of them on the doorstep
People who know about AI?
you probably don’t know this, but this post is so much funnier than you probably meant it
and it (probably) still won’t save you
I don’t understand what part of their statement you read as pro-LLM
arguing-from-existence of expert systems (which were the fantasy in the previous wave)
Expert systems actually worked.
Don’t make me tap the sign
This is not debate club
We don’t correct people when they are wrong. We do other things.
That’s just a strawman fallacy followed by a “guilt by association” fallacy.
how to let people know you’re not a talented writer but think you should be without telling people you’re not a talented writer but you think you should be
Serious question: what does “stack rank” mean?
See https://en.m.wikipedia.org/wiki/Vitality_curve
It’s become really popular in large tech companies and it’s fucking stupid.
Thank you. I’ve never even heard of the term before, and didn’t know if it was slang, a typo, or what. It wouldn’t have occurred to me to search for it.
Yet another word for the good ol’ rank-and-yank. Great way to instantly make number go up by suddenly laying off 10-20% of your employees. The trick is making sure you’ve moved on to another department or another company before the predictable consequences take hold.
Surely that’s an AI generated pfp
apparently she is a real known person from military twitter
actually i feel a bit bad about it now https://www.abc.net.au/news/2023-01-27/daniella-cult-the-family-joined-the-army-toxic-control/101895164
Oh no, I think I’ve seen people talk about her memoir! I’ve heard it’s good!
My understanding is that through her experience, she has developed a career around analysing cults (as well as other things) which is great! We need that in the world.
However she has unfortunately missed the areas where her expertise would be really insightful (SV itself basically) and taken this weird tack. Dunning Kruger be like that sometimes
Ultracrepidarianism, apparently.
Ooh, that’s a five dollar word if I’ve ever seen one.
Indubitably.
my heuristic: I can understand a shitty past giving credence to bad reactions, but the moment you start choosing bad things with current-era things I rapidly start losing grace and patience
(and yeah I know there’s a continuum of stuff between A and B, but anyone showing up in a fucking news article of this shape is generally well past accident)
Okay, show me a system that was only trained on data given with explicit permission and hopefully compensation and I’ll happily be fine with it.
But that isn’t what these capitalists, tech obsessives, etc. have done. They take take take and give nothing back.
They do not understand nor care about consent, that’s the crux of the issue.
I couldn’t care less if all the training data was consensual.
But even if there was an LLM that used only ethical sources, it would still need massive amounts of energy for training and use, so until we’re 100% renewable and the whole world gets as much of that energy as it needs …