tal

joined 2 years ago
[–] tal@lemmy.today 2 points 3 months ago

One, if it got lucky.

[–] tal@lemmy.today 9 points 3 months ago

An April MIT study found AI Large Language Models (LLMs) encourage delusional thinking, likely due to their tendency to flatter and agree with users rather than pushing back or providing objective information.

If all it takes to get someone to believe something is to flatter them and agree with them, it does kind of explain how people manage to sell people on all kinds of crazy things.

[–] tal@lemmy.today 1 points 3 months ago* (last edited 3 months ago)

I found that while the lemmy_server process starts successfully and shows "Starting HTTP server at 0.0.0.0:8536" in logs, nothing is actually listening on port 8536.

Does:

# netstat -ntap|grep 8536

...show anything bound to the port?

I'm not sure how you determined that it's not binding to the port, but that's how I'd check.

There isn't much that should stop a process from listening on a port over 1024 unless another process is already listening on it.
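
For what it's worth, if netstat isn't installed, ss or lsof can answer the same question; a quick sketch, using the 8536 port from your log line:

# ss -ntlp|grep 8536
# lsof -nP -iTCP:8536 -sTCP:LISTEN

The first lists TCP sockets in LISTEN state along with the owning process; the second shows any process holding a listening socket on 8536. If those all come back empty, the server genuinely isn't bound, whatever the log says.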

[–] tal@lemmy.today 2 points 3 months ago* (last edited 3 months ago)

Thinking in terms of water for cooling, looks like it's on the Ruhr rather than the coast.

[–] tal@lemmy.today 1 points 3 months ago (1 children)

I don't know why you want a BIOS update.

If you're wanting VT-x support, looks like it's present.

https://mobilespecs.net/laptop/Toshiba/Toshiba_SATELLITE_M100-221.html

Processor Model: T5600

https://www.techpowerup.com/cpu-specs/core-2-duo-t5600.c378

VT-x

https://superuser.com/questions/1584771/what-is-difference-between-vmx-and-vt-x

The CPU flag for VT-x capability is "vmx"; in Linux, this can be checked via /proc/cpuinfo
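
If you'd rather check on the machine itself than trust the spec sheets, a quick check from a Linux shell on the laptop would be something like:

$ grep -cw vmx /proc/cpuinfo
$ lscpu | grep -i virtualization

The first prints a nonzero count (one per logical CPU) if the vmx flag is exposed; the second should report VT-x on that CPU. If both come up empty on a T5600, the likely explanation is that virtualization is disabled in the BIOS rather than missing from the silicon.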

[–] tal@lemmy.today 2 points 3 months ago (1 children)

I'd assume so, but more importantly, for both, there's a cryptographic signature being performed by the card. The credentials never leave the card -- there's a private key on the card, and what goes out is a signature on the transaction, which is useless for doing other transactions.
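
As a rough illustration of the principle -- this is plain openssl on a PC, not the actual EMV card protocol, and the filenames and transaction string are made up:

$ openssl ecparam -genkey -name prime256v1 -noout -out card_key.pem    # "on-card" private key
$ openssl ec -in card_key.pem -pubout -out card_pub.pem                # public key the bank knows
$ echo "pay merchant 12.34 nonce=8f3a" > txn.txt                       # the transaction to authorize
$ openssl dgst -sha256 -sign card_key.pem -out txn.sig txn.txt         # what the card computes
$ openssl dgst -sha256 -verify card_pub.pem -signature txn.sig txn.txt # what the terminal/bank checks
Verified OK

The verifier only ever sees card_pub.pem and txn.sig; card_key.pem never has to leave the "card", and the signature is bound to that one transaction string, so capturing it doesn't let anyone authorize a different one.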

[–] tal@lemmy.today 36 points 3 months ago* (last edited 3 months ago) (4 children)

In most respects, CRTs were technically worse, but a lot of video game art was designed around their characteristics, optimized for them, and thus can appear better on a CRT. We can -- and do -- try to emulate CRT quirks on LCDs/LEDs to varying degrees, but we're always going to be just approaching what a CRT looked like.

https://wackoid.com/game/10-pictures-that-show-why-crt-tvs-are-better-for-gaming/

(Embedded comparison screenshots from the link above: Final Fantasy 6, Castlevania: Symphony of the Night, Final Fantasy 7.)

Things like blurriness are a limitation in the fidelity of the display, true enough, but in the context of material optimized for that display, they can be a positive rather than a negative.

[–] tal@lemmy.today 9 points 3 months ago (1 children)
  • Lawrence of Arabia.
[–] tal@lemmy.today 10 points 3 months ago* (last edited 3 months ago)

I was going to ask whether she would vote against herself if her opponent is male, but it looks like she's ahead of the ball on that one:

https://cactuspolitics.com/2025/08/mylie-biggs-wants-to-lead-the-state-but-says-women-should-stay-home/

She criticized "modern feminism" and its impact, specifically mentioning concerns "starting with women's right to vote."

EDIT: Ah, looks like they included that quote in the submitted article too.

[–] tal@lemmy.today 1 points 3 months ago (1 children)

Well, there's certainly that. But even then, I'd think that a lot of videos could be made to be more concise. I was actually wondering whether YouTube creators get paid based on the amount of time they have people watch, since that'd explain drawing things out. My impression, from what I could dig up in a brief skim, is that they're indirectly linked -- apparently, YouTube shows ads periodically, and the more ads shown, the more revenue the creator gets. So there would be some level of incentive to stretch videos out.

[–] tal@lemmy.today 1 points 3 months ago (1 children)

-the opening the port process makes sense. It seems like if I have a backend on my rig, I’m going to need to open a port to access that backend from a front end of a phone device.

Yes. Or even if you run a Web-accessible front-end on the LLM PC -- the Web browser on the phone needs to reach the Web frontend on the PC.

Or possibly even access that same backend on the phone device via a mirror?

Well, the term wouldn't be a mirror. In your shoes, it's not what I would do, because introducing some third host not on your network to the equation is another thing to break. But, okay, hypothetically, I guess that doing that would be an option. *thinks* There might be some service out there that permits two devices to connect to each other, though I'm not personally aware of one. And, say you got a virtual private server for $10 a month or whatever the going rate is, yeah, that could be set up to do this -- you could use it as an intermediate host, do SSH tunneling from both the PC and the phone of the sort that another user in this thread mentioned. I guess that that'd let you reach the PC from other places, if that's something that you want to do, though it's not the only way to accomplish that.

But...I think that that's most-likely going to add more complexity. The only scenario where that would truly be necessary is if the wireless access point -- which I assume your ISP has provided -- absolutely does not permit the LLM PC and the phone to communicate at all on the WiFi network, which I think is very unlikely, and even then, I'd probably just get a second wireless access point in that scenario, put the PC and the phone on it.
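
For completeness, that VPS arrangement would look something like this -- the hostname is made up, and 11434 assumes ollama's default port:

On the LLM PC:

$ ssh -N -R 11434:localhost:11434 you@vps.example.com    # expose the PC's ollama port on the VPS

On the phone, in Termux:

$ ssh -N -L 11434:localhost:11434 you@vps.example.com    # pull that VPS port back down to the phone

With both tunnels up, the phone talks to http://localhost:11434 and the traffic flows phone → VPS → PC.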

In general, I don't think that trying to connect the two machines on your home network via a machine out on the Internet somewhere is a great idea. More moving parts, more things to break, and if you lose Internet connectivity, you lose the ability to have them talk to each other.

-it seems like it would be easier if I could connect to the rig via an android phone instead of an iPhone. My end goal is to use Linux but I’m not ready for that step. Seems like android would be an adequate stepping stone to move to, especially if we have to go thru all this trouble with iPhone. Shall we try on the android instead? If not I’ll follow the directions you put above and report back on Saturday.

If you have an Android phone available, that would probably be easier from my standpoint, because I can replicate the environment; I have an Android phone available here. But it's not really the phone where setup is the issue. It's going to be the LLM PC and potentially the wireless access point that require any configuration changes to make ollama reachable from the phone; the phone doesn't need anything other than Reins installed and an endpoint set, or just a Web browser pointed at the correct URL. Mostly, what matters to me is that the phone be able to talk to the PC -- to open a TCP connection to it -- and having diagnostic tools on the phone helps establish that. I don't have to guess how the diagnostic tools work in Termux on Android, because I can use them myself locally.
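
As a concrete example of the "correct URL" -- the address here is a placeholder for whatever the LLM PC's LAN IP actually is, and 11434 is ollama's default port:

$ curl http://192.168.1.50:11434/api/tags    # lists the models ollama has installed

Loading the same URL in the phone's Web browser should return something rather than time out, which is really all we're trying to establish at this stage.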

I wouldn't suggest going out and buying an Android phone to do just that, though. I mean...this is a one-off diagnostic task, just trying to understand why the phone isn't able to reach the LLM PC. If you can open a connection from the Android phone to the LLM PC, then you should also be able to open a connection from the iOS phone to the LLM PC. If you do have one already available, though, then yeah, my preference would be if you could install Termux on it for the diagnostic tools rather than install iSH on the iOS device. It should still be possible to get the LLM PC reachable on the iOS device either way.

I don't mind trying to diagnose connectivity on the iOS device. Just keep in mind that I may have to guess a bit as to what the behavior is, because I can't actually try the device here, so we may potentially have a few extra rounds of back-and-forth.

If you do want to use an Android phone, then just put the phone on the WiFi network, install Termux, open Termux, use Termux's package manager to install a telnet client (pkg install inetutils), and then try the telnet command I mentioned and report back what error, if any, you get when trying to open a connection to the LLM PC -- hopefully it'll be one of the above three outcomes.
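
Concretely, in Termux that'd be along the lines of the following -- the IP is a placeholder for the LLM PC's actual address, 11434 assumes ollama's default port, and inetutils is my assumption of where Termux packages its telnet client:

$ pkg install inetutils          # provides the telnet client
$ telnet 192.168.1.50 11434      # try to open a TCP connection to the PC

"Connected to ..." means the network path is fine and anything left to fix is app configuration; "Connection refused" means the PC answered but nothing is listening on that port; a long hang ending in a timeout usually points at the access point or a firewall on the PC dropping the traffic.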

[–] tal@lemmy.today 8 points 3 months ago* (last edited 3 months ago) (1 children)

Deregulation might give some amount of an edge, but I really don't think that in 2025, the major limitation on deployment of AI systems is overbearing regulation. Rather, it's that the systems haven't had sufficient R&D and still need further technical development.

I doubt that the government can do a whole lot to try to improve the rate of R&D. Maybe research grants, but I think that industry already has plenty of capital available in the US. Maybe work visas for people doing R&D work on AI.
