- cross-posted to:
- [email protected]
Fuckin’ HR nerds can’t even be bothered to do their own jobs and the rest of us have to suffer for it.
I don’t blame HR for wanting to make their jobs easier, everyone else does that too.
I blame HR for not taking the time to think about the problems that could arise from having a computer judge whether a person will be a good fit at the company, which is a famously difficult task even if you’re intimately familiar with the role you’re filling and all the people your potential hire will be working with.
Exactly. My boss and I will sometimes use AI to help write code and for other small tasks, but we always check to make sure it’s right, or tweak it, before using it.
AI is a tool, and like any tool you can use it incorrectly. Fine if you wanna use automation, but that automation better be damn near perfect if you’re using it in production, and you’d better have checks in place to ensure it’s doing it right.
Yeah, but it doesn’t solve all of our problems perfectly right now so it’s obviously a total and complete waste of money! …/s
Ah yes very good point, we should just throw it out /s
Yeah, not a fan of that either. It’s early, but the tech is cool and useful for some stuff at least. I’m having lots of fun running it locally for audio transcription and creating summaries for my D&D campaign. (Though I can’t seem to get it to work for longer recordings atm, still messing with it.)
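For anyone curious what that local setup can look like, here’s a minimal sketch assuming the open-source openai-whisper package and ffmpeg are installed; the model size, segment length, and file names are just placeholders:

```python
# Minimal sketch of local transcription for a long session recording.
# Assumes the open-source "openai-whisper" package and ffmpeg are installed;
# model size, segment length, and file names are placeholders.
import whisper

model = whisper.load_model("small")  # smaller models run fine on modest hardware

# For very long recordings, splitting the audio up front keeps memory usage down, e.g.:
#   ffmpeg -i session.mp3 -f segment -segment_time 600 -c copy part_%03d.mp3
result = model.transcribe("part_000.mp3")
print(result["text"])  # feed this text to whatever you use for summarization
```

Transcribing and then summarizing one chunk at a time sidesteps most of the trouble with longer sessions.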
Most companies are really just using it wrong or are shoving it into stuff it doesn’t need to be in.
I have a friend who uses it to make outlines for stories (with tone, setting, general plot arc, writing style) and then fills in each block of the outline with a well-developed story, iterating all the way through until they’re happy with it… with his kids, as a bedtime exercise. It’s a pretty awesome exercise where the family spends time together, learns about creativity and how to structure a story, and ultimately comes out the back end with a memory they can share and a story they can pass around to friends and family.
If you spend the time to learn how to use it as a tool, there is a ton of value in it, even in its current iteration. It’s a tool in the early stages of development. As it matures as a product, there are a lot of gains to be made.
The social media zeitgeist wants to push it as a terrible thing that isn’t living up to its hype… but it really just shows a lack of creativity and understanding of how to use new tools. Every story I see about how AI is a waste of resources and won’t ever amount to anything just makes me cringe a little bit and shows how disconnected media is from reality. If you can’t find value in a tool, maybe look inward to see if you have some growing to do rather than lashing out at the tool for not solving all of your problems.
That is adorable. What a nice way to have fun with your kids.
And I agree, if you know how to use it or learn how to use it, there is absolutely value to be had.
Disagree. If you’re half-assing the job you’re supposed to be doing because it’s easier on you, you don’t deserve a pass. Especially when it’s negatively affecting other people like this does.
Also, generally, screw HR. They exist just to protect the rich they work for and they’ll have you fired in an instant if it means helping out the company in any way.
At this rate it would be easier to just do hiring by lottery
When I went to a career center they said to basically copy-paste the job description in this way. Not surprised to see something like this works, too.
This shows a “grandma email forward” level understanding of how hiring software systems work
Yeah, most résumé analysis is done through an ATS (applicant tracking system), not raw ChatGPT. Those have their own issues, but they aren’t just LLMs processing your résumés.
This is usually true but small HR departments without fancy tools will often just run that shit through ChatGPT using their personal Google accounts as a login. They don’t give a fuck about privacy from my experience.
Your average small-time HR person is either basically the admin for an entire company or some office grunt who just fell into the job. I’m really unsurprised by this; as an organizational psychologist, I’ve basically seen it all in terms of terrible practice.
I worked in HR, and I got hired at one job after reviewing an employee timesheet. They didn’t obfuscate any private information or names, just handed it to a prospective employee over a lunch interview. The shit they did there was crazy.
Hilarious labour law violation 😂
You know, I’m pretty confident that absolutely no one uses ChatGPT to screen applicants. How would that even work? It doesn’t know anything about the requirements of the job.
They were using software to screen applicants long before GPT was on the scene.
Neither do most recruiters in technical fields, lol
How would that even work? It doesn’t know anything about the requirements of the job
I rest my case.
Wouldn’t that text just disappear when you turn it into a pdf?
No. Why would it?
Depends on how you do it, I suppose. If you use the print-to-PDF dialog, for example, it would.
My understanding is that most printers will ignore any white in a document, but the actual code for the white text or white image is still sent. So yeah, the PDF would contain the white text.
After all, some printers print on non-white materials and so do have white ink, so they need that information sent over, and the PDF doesn’t know ahead of time what type of printer it’s going to.
It depends on the method used to turn it into a PDF, because PDF is a horrendous mess of a standard.
It is my understanding that a PDF can contain anything from PostScript-like markup of text and page elements to a raster image, depending on how it was generated. It is possible to preserve white-on-white text through a PDF export, if it is configured right.
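If anyone wants to check what a screening pipeline would actually see, here’s a minimal sketch assuming the pypdf package and a hypothetical resume.pdf; text extraction reads the page content stream and ignores fill color, so white-on-white text shows up in the output whenever the export kept it as real text rather than rasterizing the page:

```python
# Minimal sketch: check whether "invisible" white text survived a PDF export.
# Assumes the pypdf package is installed; "resume.pdf" is a placeholder path.
from pypdf import PdfReader

reader = PdfReader("resume.pdf")
for page in reader.pages:
    # extract_text() pulls text from the page's content stream regardless of
    # fill color, so white-on-white text appears here if it was exported as
    # actual text (a rasterized export would drop it from this layer).
    print(page.extract_text())
```

If the hidden text shows up there, any ATS or LLM pipeline that extracts text the same way will see it too.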