tal

joined 2 years ago
[–] tal@lemmy.today 5 points 8 months ago (3 children)

The government will allow pubs in England and Wales to close at 1am on 9 May to allow drinkers to continue celebrating into the early hours.

Wait...pubs over all of England and Wales can't stay open until 1 normally?

kagis

Hmmm.

Apparently, pubs in the UK typically stop serving alcohol earlier than in the US. TIL.

The standard deadline is 11 PM, but licenses can be granted that run longer:

https://en.wikipedia.org/wiki/Alcohol_licensing_laws_of_the_United_Kingdom

Until the 2003 Act came into force on 24 November 2005,[27] permitted hours were a standard legal constraint: for example, serving alcohol after 23:00 meant that a licensing extension had to exist—either permanent (as for nightclubs, for example), or by special application from the licensee concerned for a particular occasion. There was also a customary general derogation permitting a modest extension on particular dates, such as New Year's Eve and some other Public Holidays. Licensees did not need to apply for these and could take advantage of them if they wished without any formality. Now, permitted hours are theoretically continuous: it is possible for a premises licence to be held which allows 24-hour opening, and indeed some do exist.

Most licensed premises do not go this far, but many applied for licences in 2005 that allowed them longer opening hours than before. However, as in the past, there is no obligation for licensees to use all the time permitted to them. Premises that still close (for commercial reasons) at 23:00 during most of the week may well have licences permitting them to remain open longer, perhaps for several hours. Staying open after 23:00 on the spur of the moment is therefore legal at such premises if the licensee decides to do so. The service of alcohol must still cease when the licence closing time arrives. Only the holder of the comparatively rare true "24-hour" licence has complete freedom in this respect.

https://pos.toasttab.com/blog/on-the-line/last-call-for-alcohol-by-state

According to this, the earliest average last call time in the US is in Georgia, at 11:45 PM.

Most states are 1 AM or 2 AM.

Alaska runs until 5 AM.

[–] tal@lemmy.today 17 points 8 months ago (6 children)

The EV maker on Jan. 2 reported its first decline in annual deliveries last year, and analysts expect sales to fall again this year for several reasons, including damage to the brand reputation by Chief Executive Elon Musk's close work with U.S. President Donald Trump and support of far-right European politicians.

https://electrek.co/2025/03/28/most-americans-would-not-consider-buying-tesla-new-poll/

A new poll shows that 67% of Americans would not consider buying or leasing a Tesla vehicle – with most of them citing Elon Musk as the reason why.

The results from this survey clearly point toward Tesla having an Elon Musk problem.

If he were removed from the equation, Tesla would have much better chances in the US market.

[–] tal@lemmy.today 9 points 8 months ago* (last edited 8 months ago) (4 children)

Solar is booming out here in the sticks.

I was gonna say that people in Florida hit hurricanes a lot and also need some kind of local power generation (be it gasoline or whatever) to help mitigate outages from those.

But according to this, in 2021 (not a big hurricane year, admittedly):

https://generatordecision.com/states-with-the-most-least-reliable-power-grids/

Florida had the second-most-reliable power grid of any state in the US, with an average of 80 minutes of downtime per user per year, or 99.98% uptime.
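As a sanity check on those figures (a quick back-of-the-envelope calculation, not from the linked source's methodology):

```python
# Back-of-the-envelope check: 80 minutes of downtime per user per year,
# expressed as an uptime percentage.
MINUTES_PER_YEAR = 365 * 24 * 60  # 525,600 minutes

downtime_minutes = 80
uptime = 1 - downtime_minutes / MINUTES_PER_YEAR
print(f"{uptime:.2%}")  # about 99.98%
```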

EDIT: It's kind of amazing how California manages to have almost the most expensive-in-the-US and fairly unreliable electricity.

EDIT2: They even comment on Florida and hurricanes:

Florida scores well in all three power grid reliability categories.

These are impressive statistics considering this state has to deal with so many hurricanes.

[–] tal@lemmy.today 4 points 8 months ago* (last edited 8 months ago)

no matter how much you “love” your AI girlfriend she will never truly love you back because she can’t think or feel, and fundamentally isn’t real.

On one hand, yeah, current generative AIs don't have anything that approximates that as a mechanism. I would expect that to start being built in the future, though.

Of course, even then, one could always assert that any feelings in any mental model, no matter how sophisticated, aren't "real". I think that Dijkstra had a point as to the pointlessness of our arguments about the semantics of mechanisms of the mind, that it's more-interesting to focus on the outcomes:

"The question of whether a computer can think is no more interesting than the question of whether a submarine can swim."


Edsger Dijkstra

[–] tal@lemmy.today 1 points 8 months ago* (last edited 8 months ago) (1 children)

Will more VRAM solve the problem of not retaining context?

IIRC -- I ran KoboldAI with 24GB of VRAM, so wasn't super-constrained -- there are some limits on the number of tokens that can be sent as a prompt imposed by VRAM, which I did not hit. However, there are also some imposed by the software; you can only increase the number of tokens that get fed in so far, regardless of VRAM. More VRAM does let you use larger, more "knowledgeable" models, as well as put more layers on a given GPU.

I'm not sure whether those are purely-arbitrary, to try to keep performance reasonable, or if there are other technical issues with very large prompts.

It definitely isn't capable of keeping the entire previous conversation (once you get one of any length) as an input to generating a new response, though.

EDIT: I think that last I looked at KoboldAI (I haven't run it recently), the highest token count per prompt one could use was 2048, and this seems to mesh with that:

https://www.reddit.com/r/KoboldAI/comments/yo31hj/can_i_get_some_clarification_on_some_things_that/

The 2048 token limit of KoboldAI is set by pyTorch, and not system memory or vram or the model itself

So basically, each response is being generated looking at a maximum of 2048 words for knowledge about the conversation and your characters and world. Other knowledge has to come from the model, which can be trained on a ton of (for sex chatbots) erotic text and literature, but that's unchanging; it doesn't bring any more knowledge as regards your particular conversation or environment or characters that you've created.
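To make that concrete, here's a rough sketch of what a fixed context window does to a long conversation. This approximates tokens with whitespace-split words; real tokenizers and KoboldAI's actual internals differ.

```python
# Rough sketch of a fixed context window: only the newest messages that fit
# within the token budget get sent with each prompt; everything older is
# invisible to the model. Words stand in for tokens here.

def build_prompt(history: list[str], new_message: str, max_tokens: int = 2048) -> str:
    """Keep the newest history messages that fit within max_tokens (word-count proxy)."""
    kept: list[str] = []
    budget = max_tokens - len(new_message.split())
    for message in reversed(history):  # walk backward from the newest message
        cost = len(message.split())
        if cost > budget:
            break  # this message and everything older than it is dropped
        kept.append(message)
        budget -= cost
    return "\n".join([*reversed(kept), new_message])
```

Anything that falls off the front of that window simply never reaches the model, which is why details from early in a conversation get forgotten.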

[–] tal@lemmy.today 2 points 8 months ago* (last edited 8 months ago) (4 children)

I've run Kobold AI on local hardware, and it has some erotic models. From my fairly quick skim of character.ai's syntax, I think that KoboldAI has more-powerful options for creating worlds and triggers. KoboldAI can split layers across all available GPUs and your CPU, so if you've got the electricity and the power supply and the room cooling and are willing to blow the requisite money on multiple GPUs, you can probably make it respond about as arbitrarily-quickly as you want.

But more-broadly, I'm not particularly impressed with what I've seen of sex chatbots in 2025. They have limited ability to use conversation tokens from earlier in the conversation in generating each new message, which means that as a conversation progresses, it increasingly doesn't take into account content earlier in the conversation. It's possible to get into loops, or forget facts about characters or the environment that were present earlier in a conversation.

Maybe someone could make some kind of system to try to summarize and condense material from earlier in the conversation or something, but...meh.

As generating pornography goes, I think that image generation is a lot more viable.

EDIT:

KoboldAI has the ability to prefix the current prompt with a given sentence if the prompt contains a term that matches, which permits dumping information about a character into each prompt. For example, for the prompt "I asked Jessica to go to the store", one could have a trigger that matches on "Jessica" and contains "Jessica is a 35-year-old policewoman". That'd permit providing static context about the world. I think that maybe what would need to happen is to have a second automated process trying in the background to summarize and condense information from earlier in the conversation about important prompt words, and then writing new triggers attached to important prompt terms, so that each prompt is sent with a bunch of relevant information. Manually writing static data to add context faces some fundamental limits.
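A minimal sketch of that trigger idea; the data, function name, and structure here are mine for illustration, not KoboldAI's actual implementation or syntax:

```python
# Minimal sketch of keyword-triggered "world info": if a keyword appears in
# the prompt, its lore entry gets prepended, so the model sees that static
# context with every response. Entries and keys are illustrative.

WORLD_INFO = {
    "Jessica": "Jessica is a 35-year-old policewoman.",
    "the store": "The store is a small corner grocery on Elm Street.",
}

def expand_prompt(prompt: str) -> str:
    """Prepend every lore entry whose keyword appears in the prompt."""
    triggered = [entry for key, entry in WORLD_INFO.items() if key in prompt]
    return "\n".join(triggered + [prompt])

print(expand_prompt("I asked Jessica to go to the store."))
```

The second, automated process would then amount to something that writes new WORLD_INFO entries by summarizing recent conversation, instead of relying on hand-written ones.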

[–] tal@lemmy.today 2 points 8 months ago* (last edited 8 months ago)

I can't imagine running a non-local sex chatbot unless you've got a private off-site server somewhere that you're using. I mean, forget governments, the company operating the thing is going to be harvesting what it can. Do you really want to be sending a log of your sex chats to some company to make whatever money they can with the thing?

EDIT: Well, maybe if they had some kind of subscription service (an alternate way to make money) and a no-log, no-profile policy.

[–] tal@lemmy.today 5 points 8 months ago* (last edited 8 months ago) (1 children)

So, IIRC that's basically what happened in California some years back: California put a lot of restrictions on coal generation, and Nevada ramped up coal generation and sold the power to California.

Texas, however, has a fairly-unique situation. The federal government doesn't generally have authority to regulate trade internal to states, but does (via the Commerce Clause) have authority to regulate commerce that crosses state lines. They've leveraged that into a lot of regulatory authority over state power grids: if a state has a power grid that crosses state lines, then it's subject to federal regulation, which affects all sorts of things interior to the state. Texas decided that it wasn't going to be subject to that, so it refused to connect its power grid to those of other states (though there was one rogue operator that did so until it was discovered and the rest of the Texas power industry made it disconnect; Planet Money had a podcast on it a while back). So you can't just generate power across the border and then provide it to Texas consumers. If Texas changes the viability of a form of power generation in Texas, it changes what the Texas power consumer market has on offer.

https://en.wikipedia.org/wiki/Texas_Interconnection

I'd guess that the same situation probably also applies to Hawaii and Alaska, though in their case, it'd be one imposed by geographic necessity rather than wanting to avoid federal regulation.

[–] tal@lemmy.today 17 points 8 months ago* (last edited 8 months ago) (1 children)

Setting aside whether it's a good or bad idea on its own, if the Trump administration is going to have heavy tariffs on solar panels and batteries out of China, my guess is that deploying solar right now is probably not economically viable, or at least considerably less so than it has been.

https://www.yahoo.com/news/china-dominates-solar-trump-tariffs-133600511.html

China dominates solar. Trump tariffs target China. For US solar industry, that means higher costs

https://www.energypolicy.columbia.edu/qa-solar-tariffs-and-the-us-energy-transition/

The US has taken aggressive actions to diminish the role of Chinese producers in solar supply chains. The costs of solar modules are already two to three times higher[22] in the US than those in Europe. A recent study in Nature[23] estimates that cutting China out of supply chains increases solar module prices 20 to 30 percent compared to a scenario with globalized supply chains. US climate goals are premised on the strategy of making solar and other clean energy technologies cheap; all else equal, more expensive solar makes those targets more difficult to achieve.

https://www.bloomberg.com/news/articles/2025-04-09/trump-tariffs-threaten-spread-of-big-batteries-on-us-power-grid

Trump Tariffs Threaten Spread of Big Batteries on Power Grid

President Donald Trump’s trade war threatens to slow down a fast-growing technology that’s key to the clean-power transition and preventing blackouts — big batteries.

Energy storage devices large enough to feed the electric grid have been spreading across the US, with deployments surging 33% last year. Officials in California and Texas credit them with helping prevent blackouts during heat waves, when electricity demand soars, and integrating variable solar and wind power onto the grid. But despite efforts by former President Joe Biden to build a domestic supply chain, the US still relies heavily on imported lithium-ion batteries — with 69% of the imports made in China, according to the BloombergNEF research provider.

Whether or not Texas adds additional barriers on top of that may not matter all that much.

[–] tal@lemmy.today 7 points 9 months ago (1 children)

And an adult sized (please, dear Eothas, let it be adult sized…) android is not something you can hide in a sock drawer.

Not a sock connoisseur, I see.
