Endmaker

joined 1 year ago
[–] Endmaker@ani.social 35 points 2 weeks ago* (last edited 2 weeks ago) (7 children)

This concierge approach is nothing new; it existed even before LLMs were a thing.

One of my undergrad computer science courses was Human-Computer Interaction, in which we learnt about user experience (UX) concepts.

One of the things we learnt was to validate our ideas quickly and cheaply before putting a lot of time, effort and money into building the thing.

To do so, we can build prototypes. The early versions may be low-fidelity (lofi) and scrappy. The later, high-fidelity (hifi) ones would mimic the functionality of the actual product, and may even appear to work to end users when in reality it could just be manual effort behind the scenes.

The example given during lecture is the development of a ticketing system. To test the idea out, one could simply get a dude to sit in the "machine" and give out slips of paper.

Anyway, I am explaining all this because it seems like a surprise to those without the same educational background. Long story short, what this startup did is completely normal in the realm of software.

We may have better tools like Figma to simulate browser / mobile frontend experiences, but nothing is stopping us from going back to the basics and doing it this way.

[–] Endmaker@ani.social 93 points 1 month ago (2 children)

Randall has known Binod for more than 16 years. A self-described conservative constitutionalist Christian, he blames Joe Biden for Binod’s deportation. “Biden was completely responsible for it. Trump wasn’t,” he said. In Randall’s view, Trump was “just fixing the border”, cleaning up after Biden’s failure to control immigration, and Binod was collateral damage.

🤯

[–] Endmaker@ani.social 17 points 1 month ago* (last edited 1 month ago) (3 children)

That's the joke, no? That they are obviously all different people, but we treat them as one person - with the superhero name Floridaman - who's responsible for all the weird shit.

[–] Endmaker@ani.social 3 points 1 month ago

I thought the general consensus is that there's a tech bubble. The only issue is that the market can stay irrational for a long time. I guess Burry is confident that the bubble will pop in the near future then?

IMO Tesla is another one that's overvalued.

[–] Endmaker@ani.social 9 points 2 months ago* (last edited 2 months ago) (1 children)

????

Perhaps I'm too dumb to understand it, but I don't see how your comment is relevant to the article.

[–] Endmaker@ani.social 89 points 2 months ago* (last edited 2 months ago) (7 children)

Anjum was absent from the operating room for eight minutes and the patient came to no harm.

[–] Endmaker@ani.social 1 points 3 months ago (1 children)

it's that there aren't even any realistic theoretical applications.

Here's the neat thing about research: the researcher themselves may not even know the kind of outcomes their research would bring about in the future.

It is not necessarily a known unknown in which we work towards a theoretical application; it could very well be an unknown unknown.

[–] Endmaker@ani.social 4 points 3 months ago* (last edited 3 months ago) (3 children)

OpenCerts

An easy way for employers to verify that your certifications are authentic.


Tangentially, a lot of scientists do research on topics that do not see application in everyday life immediately.

I can't think of any examples off the top of my head, but I remember reading articles on how some research bears fruit - with huge impacts - only decades later.

To stop research into a topic because there is no practical application now is short-sighted IMO.

[–] Endmaker@ani.social 11 points 3 months ago* (last edited 3 months ago) (1 children)

"People" as in actual humans? None, I think.

If I block those with frequent trash takes, I won't get to downvote them next time. Other reasonable people would then see their future posts with higher scores.

[–] Endmaker@ani.social 71 points 4 months ago* (last edited 4 months ago) (1 children)
[–] Endmaker@ani.social 6 points 4 months ago* (last edited 4 months ago) (2 children)

Someone with the expertise should correct me if I am wrong; it's been 4-5 years since I learnt about NPUs during my internship so I am very rusty:

You don't even need a GPU if all you want to do is run - i.e. perform inference with - a neural network (abbreviating it to NN). Just a CPU would do if the NN is sufficiently lightweight. The GPU is only needed to speed up the training of NNs.
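To make the point concrete, here's a minimal sketch of CPU-only inference: a tiny two-layer MLP implemented in plain NumPy. The weights are random stand-ins (not a trained model) and all the names are hypothetical - it just shows that a lightweight NN forward pass is nothing more than a few matrix multiplies any CPU can do.

```python
# Hypothetical sketch: inference of a tiny MLP on CPU with plain NumPy.
# Weights are random stand-ins, not a trained model.
import numpy as np

rng = np.random.default_rng(0)

# A 2-layer MLP: 8 inputs -> 16 hidden units -> 3 output scores
W1 = rng.standard_normal((8, 16))
b1 = np.zeros(16)
W2 = rng.standard_normal((16, 3))
b2 = np.zeros(3)

def forward(x):
    h = np.maximum(x @ W1 + b1, 0.0)  # ReLU activation
    return h @ W2 + b2                # raw output scores

x = rng.standard_normal(8)
print(forward(x).shape)  # (3,)
```

Training is where you'd really want a GPU; a forward pass like this runs fine on commodity CPUs.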

The thing is, the CPU is a general-purpose processor, so it won't be able to run the NN optimally / as efficiently as possible. Imagine you want to do something that requires the NN and as a result, you can't do anything else on your phone / laptop (this won't be a problem for desktops with GPUs though).

Where an NPU really shines is when there are performance constraints on the model: when it has to be fast (to be specific: run in real time), lightweight and memory-efficient. Use cases include mobile computing and IoT.
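One trick behind that efficiency is quantization: NPUs typically run models with low-precision integer arithmetic instead of float32. A rough sketch of the idea (simple symmetric per-tensor int8 quantization - my own illustrative example, not any particular NPU's scheme):

```python
# Hypothetical sketch: post-training int8 quantization of a weight matrix,
# the kind of trick that lets NPUs trade a little accuracy for large
# savings in memory and compute. Symmetric per-tensor scheme, nothing fancy.
import numpy as np

def quantize_int8(w):
    scale = np.abs(w).max() / 127.0          # map the largest weight to 127
    q = np.round(w / scale).astype(np.int8)  # 4x smaller than float32
    return q, scale

def dequantize(q, scale):
    return q.astype(np.float32) * scale

w = np.random.default_rng(1).standard_normal((16, 16)).astype(np.float32)
q, scale = quantize_int8(w)
w_hat = dequantize(q, scale)
print(np.abs(w - w_hat).max())  # small reconstruction error
```

Each weight now costs 1 byte instead of 4, and integer multiply-accumulate units are much cheaper in silicon and power than float ones - exactly what you want on a phone or earbud.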

In fact, there's news about live translation on Apple AirPods. I think this may be the perfect scenario for using NPUs - ideally housed within the earphones directly, but if not, within the phone.

Disclaimer: I am only familiar with NPUs in the context of "old-school" convolutional neural networks (boy, tech moves so quickly). I am not familiar with NPUs for transformers - and LLMs by extension - but I won't be surprised if NPUs have been adapted to work with them.
