[–] r00ty@kbin.life 6 points 9 hours ago

Anyone running a webserver and looking at their logs will know AI is being trained on EVERYTHING. There are so many crawlers for AI that are literally ripping the internet wholesale. Reddit just got in on charging the AI companies for access to freely contributed content. For everyone else, they're just outright stealing it.
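If you want to put numbers on it, here's a minimal Go sketch that tallies the well-known AI crawler user agents in a webserver access log (the access.log path and the bot list are assumptions; adjust both for your own setup):

```go
package main

import (
	"bufio"
	"fmt"
	"os"
	"strings"
)

// Substrings of well-known AI crawler user agents (non-exhaustive).
var bots = []string{"GPTBot", "CCBot", "ClaudeBot", "Google-Extended", "Bytespider", "Amazonbot"}

func main() {
	f, err := os.Open("access.log") // assumed path; point this at your real log
	if err != nil {
		panic(err)
	}
	defer f.Close()

	counts := make(map[string]int)
	sc := bufio.NewScanner(f)
	for sc.Scan() {
		line := sc.Text()
		for _, b := range bots {
			if strings.Contains(line, b) {
				counts[b]++
			}
		}
	}
	for b, n := range counts {
		fmt.Printf("%-16s %d\n", b, n)
	}
}
```

A few days of logs is usually enough to make the point.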

[–] r00ty@kbin.life 8 points 1 day ago* (last edited 1 day ago)

It's specifically implied? :P

[–] r00ty@kbin.life 2 points 1 day ago

Hah, we used to have some of those AUI to 10Base2 transceivers back in the day in the office. Definitely had one on the IBM RS/6000 220 box.

[–] r00ty@kbin.life 7 points 1 day ago

"it goes up to eleven"

[–] r00ty@kbin.life 14 points 1 day ago

Yep, less overhead! This is the way.

[–] r00ty@kbin.life 2 points 2 days ago

I would have thought so, but I think it depends on how thin the skin of the pipe is. I would also have expected a breaker to trip under that much load. But given that it did happen, I'd not be surprised if there are bypasses and/or broken breakers.

When we moved into the house we're in now, the RCD (GFCI) didn't work at all. I pressed test, nothing. Had the electrician over to change it. He tested the actual actuation using earth leakage. Nothing. So, faults can happen too.

I want to be wrong, though. Because that's a pretty bad state to get into, I think.

[–] r00ty@kbin.life 12 points 2 days ago (2 children)

The only way that immediately springs to mind is very unlikely to happen, because it requires multiple faults/mistakes:

1: The chassis of one of the two units became live (connected to "hot" for you Americans) but was also not grounded in any way.
2: The chassis of the other WAS grounded, completing a circuit for the current to flow.
3: There was no RCD (GFCI or whatever you guys call it) on the circuit.

In this way, that pipe would be the only thing connecting the two devices, and its resistance would generate a huge amount of heat (just like an incandescent bulb or a heating element does by design).
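Back-of-envelope on the heat, with numbers that are pure assumptions rather than measurements: the power dissipated in the pipe is P = I²R, so a 30 A fault current through a thin section with 0.5 Ω of resistance gives 30² × 0.5 = 450 W. That's a small heating element's worth of power concentrated in the metal itself; the thinner the wall, the higher the local resistance and the hotter that spot gets.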

Probably other possibilities, but it's just the first thing I could think of that could potentially produce this result. But, that's a lot of safety features to have either failed or simply not been in place for this to be possible. So, frankly, I hope I'm totally wrong.

[–] r00ty@kbin.life 2 points 3 days ago

When I was talking about memory, I was thinking more about how it is accessed. For example, exactly which operations are atomic on a given architecture and which are not; depending on things like byte alignment, that can cause unexpected interactions in multi-core work. Also, how to make the most of your CPU cache (there's a small demonstration of that below). These kinds of things.
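To make the cache point concrete, here's a minimal Go sketch of false sharing: two goroutines hammering counters that share a cache line, versus counters padded apart (the 64-byte line size is an assumption; it varies by CPU):

```go
package main

import (
	"fmt"
	"sync"
	"sync/atomic"
	"time"
)

const iterations = 20_000_000

// Both counters within the same 16 bytes: almost certainly one cache line.
type shared struct {
	a, b int64
}

// Counters padded apart, assuming 64-byte cache lines.
type padded struct {
	a int64
	_ [56]byte // push b onto the next cache line
	b int64
}

// bench runs two goroutines that atomically increment the two counters in parallel.
func bench(a, b *int64) time.Duration {
	start := time.Now()
	var wg sync.WaitGroup
	wg.Add(2)
	go func() {
		defer wg.Done()
		for i := 0; i < iterations; i++ {
			atomic.AddInt64(a, 1)
		}
	}()
	go func() {
		defer wg.Done()
		for i := 0; i < iterations; i++ {
			atomic.AddInt64(b, 1)
		}
	}()
	wg.Wait()
	return time.Since(start)
}

func main() {
	var s shared
	var p padded
	fmt.Println("same cache line:     ", bench(&s.a, &s.b))
	fmt.Println("separate cache lines:", bench(&p.a, &p.b))
}
```

On most multi-core machines the padded version is noticeably faster, because the cores stop invalidating each other's cache line on every write.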

[–] r00ty@kbin.life 3 points 3 days ago (1 children)

I'd agree that there's a lot more abstraction involved today. But my main point isn't that people should know everything; it's that a basic understanding of how even a simple microcontroller works would be helpful.

Where I work, people often come to me with weird problems, and the way I solve them is usually based on a low-level understanding of what's really happening when the code runs.

[–] r00ty@kbin.life 12 points 3 days ago (14 children)

I've always found this weird. I think to be a good software developer it helps to know what's happening under the hood when you take an action. It certainly helps when you want to optimize memory access for speed etc.

I genuinely do know both sides of the coin. But the majority of my fellow developers at work most certainly have no clue how computers work under the hood, or how networking works, for example.

I find it weird because being good at software development requires an understanding of the underlying systems. By that I don't mean following what computer-science methodology tells you; I mean having an idea of the best way to translate an idea into a logical solution that can be applied in any programming language, and, most importantly, knowing how to optimize that solution, in terms of memory access for example. If you write software that sends or receives network packets, it certainly helps to understand how that works, at least enough to choose the best protocols (a rough sketch below).

But, it is definitely true.
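On the protocol point, a minimal Go sketch of the trade-off as it shows up in the API (the address and payload are made up): UDP hands you cheap, unordered datagrams; TCP costs a handshake but gives you an ordered, retransmitted byte stream.

```go
package main

import (
	"fmt"
	"net"
)

func main() {
	payload := []byte("hello") // made-up payload

	// UDP: connectionless. Dial only binds a destination; Write sends one
	// datagram, with no handshake, ordering or retransmission. It may simply vanish.
	udp, err := net.Dial("udp", "127.0.0.1:9000") // made-up address
	if err == nil {
		udp.Write(payload)
		udp.Close()
	}

	// TCP: connection-oriented. Dial performs the three-way handshake (a full
	// round trip before any data), in exchange for ordering and retransmission.
	tcp, err := net.Dial("tcp", "127.0.0.1:9000")
	if err != nil {
		fmt.Println("tcp dial failed (nothing listening):", err)
		return
	}
	defer tcp.Close()
	tcp.Write(payload)
}
```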

[–] r00ty@kbin.life 1 points 5 days ago (1 children)

The problem with wifi is that things go downhill quickly once you have too many stations online. Even if they're not actively browsing, the normal amount of chatter on a network will often slow things right down. It would need to be split into smaller wifi networks linked somehow, and that means someone needs to be in a central location that is easily traced.

In theory, I guess someone with a very fast connection could run a layer 2 VPN, and then you could all run a routing protocol over that network, accessed over the internet.
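Something like an OpenVPN TAP server would cover the layer 2 part. A minimal sketch of the server config (the subnet, port and file names are assumptions, and you'd still have to create the bridge and certificates separately):

```
# OpenVPN layer 2 (TAP) server, bridged onto an existing br0
dev tap0
proto udp
port 1194
# hand clients addresses from the bridge's subnet (10.99.0.0/24 is made up)
server-bridge 10.99.0.1 255.255.255.0 10.99.0.50 10.99.0.100
ca ca.crt
cert server.crt
key server.key
dh dh.pem
keepalive 10 60
```

Clients connecting with `dev tap` then sit on the same broadcast domain as the server's bridge, so a routing protocol works across it as if it were one LAN.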

Lots of ways to do it really. Wifi alone is probably the worst, though.

[–] r00ty@kbin.life 2 points 5 days ago

In fact, forget the internet!
