this post was submitted on 13 Apr 2026
Not The Onion
So, I don't have a comment on Zuckerberg specifically, but a practice adopted by a number of companies whose product can reasonably be used by their own employees is to have employees actually use the thing. Doing so makes them aware of what needs to change, and more invested in changing it. In general, a company is better off at filling user needs if employees actually use whatever they make, especially employees in a position to make decisions about how it works.
https://en.wikipedia.org/wiki/Eating_your_own_dog_food
Not everywhere I've worked has done that, but the places where it was applicable tried to, including one that handed out free hardware when necessary to use the product, as well as having the company itself use the products where possible. I think it's generally a good idea; it makes the people at the company who are in a position to improve things very aware of pain points.
If Meta is trying (and I have no idea whether this is actually the case) to position AI models they make to act in a "contact the company" role, they might want to have their employees actually doing that themselves.
Meta's in a strange business for that philosophy because, well... 99% of their income is ads. They model and engage users in order to sell ads, and that's not a product employees can meaningfully dogfood.
And their "AI" situation is murky. They've actually used machine learning internally for a long, long time, but the recent rush to productize AI more directly has been... mixed.
They had a really good open weights LLM division, and built an interesting ecosystem around those "Llama" models. Small/medium businesses helped expand them. Meta employees interacted with other open source projects, too, and posted their own experiments. It was great! And a prime example of "eating your own dog food."
...But that lab had one failed experiment, so Zuckerberg killed the whole thing. As Zuck tends to do.
And now they have some new division which, from my perspective in the tinkerer community, I would bluntly describe as "a clash of Tech Bro egos." It's generous to call experiments like an "AI CEO" an attempt to test their own product; it more closely resembles Zuckerberg's pattern of frantically, nervously chasing something in the nebulous hope it goes viral the way Facebook did.