kuberoot

joined 2 years ago
[–] kuberoot@discuss.tchncs.de 2 points 2 days ago (1 children)

I haven't properly tried Satisfactory. I tried the demo back when it first came out, spent half an hour running around collecting leaves to feed into a power generator, and bricked my game trying to put it into borderless mode or something... And then I switched to Linux, the game was an Epic exclusive despite promises otherwise, and I passed.

I got the impression it has a tedious early game, that having a prebuilt map might make replaying less fun, and that it sadly takes a very point-to-point, purpose-specific-device approach to logistics. I also like Factorio's performance: it's really lightweight on the GPU and well optimized on the CPU (though with the entire map and tons of individual entities loaded at all times, there's only so much you can do), which I imagine a modern 3D game like Satisfactory can't match.

I don't want to rant too much about it, but I think Factorio's splitter taking in and putting out two belts is brilliant. There are only a few types of logistics, but they are versatile and nuanced. Being able to sideload items onto an underground belt lets you filter a belt by side, and the mechanics of belt sides and how they interact with inserters let you create compact designs or maximize throughput if you put the time in. There's no dedicated buffer machine, no separate splitters and mergers; all the neat things you can build come together out of component parts in an organic way.

I will also mention that I like to try to plan ahead specifically to avoid starting over, but when rebuilding is necessary (and when laying a rail network) robots are a must-have.

On the topic of the DLC... If you're not drawn in by the base game, it might be best to pass on it, but they did a good job giving each planet some interesting unique challenges, including organic items that spoil after a certain amount of time. There are plenty of straight content-expansion mods, big and popular ones, but Space Age mixed up the gameplay quite a bit.

All in all... Yeah, different people, different tastes. I'm currently doing a second playthrough of Space Age with friends, but one of them might've been felled by Gleba. If you want some more unsolicited gaming takes, I can recommend Mindustry and Outer Wilds ;D

[–] kuberoot@discuss.tchncs.de 3 points 2 days ago (3 children)

it's also very shallow

You take that back!

In all seriousness, if you're talking about something like the fact that all machines are functionally doing the same thing, that's kinda fair, but there's a lot of complexity in all the options available, made even greater by the DLC and mods. Just the logistics of getting items to the right places can be approached in many different ways, each with its own upsides and downsides, and I love all the emergent mechanics that come from belts having two sides and splitters handling two belts.

It's not a game for everyone, but calling Factorio shallow seems really odd. If anything, I feel like it allows you to explore its mechanics deeply, instead of having a breadth of shallow mechanics that don't leave anything to be discovered.

[–] kuberoot@discuss.tchncs.de 3 points 2 weeks ago

Not the same person and cba to get a timestamp right now, but it's the 80% rule: the electrical hardware isn't designed to deliver its rated amperage continuously for hours on end, so for car charging you're apparently supposed to limit the draw to 80% of the rating. Now, 80% of 50 is 40, not 42, so I'm not sure whether the 80% isn't a precise figure or there's a mistake here, but it roughly checks out.

[–] kuberoot@discuss.tchncs.de 4 points 3 weeks ago

I wouldn't blame the kid too much, he may have grown up to be a good person... But the teacher was abusing her position without verifying the accusations, and nobody else intervened?

[–] kuberoot@discuss.tchncs.de 1 points 4 weeks ago

I had the impression the cloud was about the opposite: detaching your server software from physical machines you manage and instead paying a company to provide more abstracted services, with the ideal being high scalability through images that can be deployed en masse, independent of where they're hosted and on what hardware. Paying for "storage" instead of renting a machine with specific hardware and software, for example.

[–] kuberoot@discuss.tchncs.de 2 points 1 month ago (1 children)

Sounds like something out of a dream, could she never have said it?

[–] kuberoot@discuss.tchncs.de 10 points 1 month ago

Yes, Apple should allow that, and Sony should allow that. Your "gotcha" seems pretty stupid, because "allow" doesn't mean "facilitate": it's not Apple's responsibility to make those things work on their devices, but Apple is going out of their way to prevent individuals from making those things happen on their own.

[–] kuberoot@discuss.tchncs.de 1 points 1 month ago (1 children)

If you license your project under the GPL, and somebody submits code (say, through a pull request) that ends up in the project, you are now also bound by the GPL with respect to their contribution, meaning you too have to publish the source of any derivatives.

The way to avoid that is something like a CLA, requiring every contributor to sign an agreement giving you special rights to their code, so you can ignore the GPL in relation to the code they wrote. This works, but it's obviously exploitative: you take rights to contributions while giving less back.

It also means that if somebody forks the project, you can't pull in their changes (unless you can meet the GPL's terms, of course), unlike with MIT, where by default everybody can make their own versions, public or private, for any purpose.

Though it's worth noting that if you license your code under MIT, a fork can still add the GPL on top, which means that if you wanted to pull in their changes you'd be bound by both licenses, and thus by the GPL's terms. I believe this is by design in the GPL, to give open source an edge, though it can be a bit of a dick move when done to a good project, since it lets the GPL fork pull in changes from the MIT versions without giving anything back.

[–] kuberoot@discuss.tchncs.de 2 points 1 month ago (1 children)

Not necessarily: if they have "magic tech", they could be uploading a virus that rapidly spreads across the entire internet, making every machine broadcast its data through electromagnetic waves or something like that, then picking up all those transmissions with said magic tech.

It would still take longer just to read the data off all the storage, but theoretically it wouldn't be limited to DSL speeds.

[–] kuberoot@discuss.tchncs.de 6 points 1 month ago

NFTs try to introduce artificial scarcity

Just want to add to that: NFTs aren't inherently about artificial scarcity; they could also be used to track ownership of rights or real-life items without a central authority that everybody needs to trust.

Of course, cryptobros immediately went to pushing them as an investment scheme, and the actual implementations are slow, inefficient, and downright expensive to use. I don't think anybody has managed to make NFTs actually useful, but I imagine the original creators weren't looking to create... Whatever this is.

[–] kuberoot@discuss.tchncs.de 1 points 2 months ago

Git exposes a lot of internals through odd commands, and you can indeed manage synchronization by sending changes over email (that's still how Linux kernel development works).
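A minimal sketch of that email-style workflow, using git's built-in patch commands (the patch file names are just what git generates by default):

```
# turn the last three commits into mailable patch files
git format-patch -3

# on the receiving repository, apply them as real commits
git am *.patch
```

There's also git send-email for mailing the patches directly from the command line.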

Bonus fun fact: there's a git bundle command that "dumps" the repository into a single file that can be interacted with as a remote. So if you're ever working with a local repository and want to put it on a server over SSH or something like that, you can just create a bundle, scp it over, and clone from it on the server.
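A minimal sketch of that workflow (the host and paths are made up):

```
# pack every ref in the repository into a single file
git bundle create project.bundle --all

# copy it to the server (hypothetical host and path)
scp project.bundle user@server:/srv/git/

# on the server, clone from the bundle as if it were a remote
git clone /srv/git/project.bundle project
```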

[–] kuberoot@discuss.tchncs.de 4 points 2 months ago (3 children)

Fundamentally, the repository you have on GitHub is the same thing as the repository you have on your computer when you clone it. Pulling and pushing are shorthands for synchronizing commits between the two repositories, but you could also synchronize them directly with somebody else who cloned the repository. As somebody mentioned, you can also just host the same repository on two servers, and push to both of them.
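A sketch of the two-server setup (the remote name and URL are made up):

```
# add a second remote alongside the default "origin"
git remote add backup ssh://user@backup-server/srv/git/project.git

# push the same branch to both hosts to keep them in sync
git push origin main
git push backup main
```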

The issue is that git doesn't include convenient features like issues, pull requests, CI, wikis, etc., and by extension, those aren't included in your local repository, so if GitHub takes them down, you don't have a copy.

An extra fun fact is that git can be considered a blockchain. It's a distributed ledger of immutable commits, each one representing a change in state relative to the previous one. Everybody who clones a repository gets a copy of its entire history and fast-forwards through the changes to calculate the current state.
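You can even see the chain directly: every commit object embeds the hash of its parent, so rewriting any point in history changes every hash that comes after it (the hashes below are placeholders):

```
# print the raw commit object at HEAD
git cat-file -p HEAD
# tree 9c3e...      <- snapshot of the project at this commit
# parent 1f7a...    <- hash of the previous commit, forming the chain
# author ...
# committer ...
```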
