this post was submitted on 11 May 2024

Technology


This is the official technology community of Lemmy.ml for all news related to creation and use of technology, and to facilitate civil, meaningful discussion around it.


[–] delirious_owl@discuss.online 1 points 6 months ago

How much of that is dead pixels?

[–] Evil_Shrubbery@lemm.ee 1 points 6 months ago

Yes, humans kinda brute-forced intelligence with current assets - made it bigger (with some birthing issues) & more power hungry (with some cooling issues), but it mostly works.

[–] Lojcs@lemm.ee 1 points 6 months ago (1 children)

So a 4K movie is 100 GB? For a 2-hour movie that works out to about 110 Mbps. Insane bitrate even for H.264 imo
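That figure is easy to sanity-check (assuming a 100 GB file, a 2-hour runtime, and decimal gigabytes, as is conventional for disc capacities):

```python
# Back-of-napkin bitrate: stream a 100 GB file evenly over a 2-hour runtime.
size_gb = 100                    # assumed file/disc size, decimal gigabytes
runtime_s = 2 * 3600             # 2 hours in seconds

bits = size_gb * 1e9 * 8         # gigabytes -> bits
mbps = bits / runtime_s / 1e6    # bits per second -> megabits per second
print(f"{mbps:.0f} Mbps")        # ~111 Mbps
```

So roughly 111 Mbps, which matches the ~110 Mbps estimate above.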

[–] Tempo@lemmy.ml 1 points 6 months ago

4K Blu-rays encoded in H.265 are usually on 100 GB discs, so I can see where they're coming from.

[–] riplin@lemm.ee 0 points 6 months ago (1 children)

That’s capturing everything. Ultimately you need only a tiny fraction of that data to emulate the human brain.

Numenta is working on a brain model that recreates functional sections of the brain. Their approach is different, though: instead of just aggregating vast amounts of data, they are trying to understand the components and how they work together.

[–] remotelove@lemmy.ca 0 points 6 months ago (1 children)

Ultimately you need only a tiny fraction of that data to emulate the human brain.

I am curious how that conclusion was formed as we have only recently discovered many new types of functional brain cells.

While I am not saying this is the case, that statement sounds like it was based on the "we only use 10% of our brain" myth, so that is why I am trying to get clarification.

[–] biscuitswalrus@aussie.zone 1 points 6 months ago

They took imaging scans. By that logic: I just took a picture of a 1 MB memory chip, and omg, my picture is 4 GB in RAW. The RAM module that chip was on must hold dozens of GB!

[–] chirospasm@lemmy.ml 0 points 6 months ago* (last edited 6 months ago) (2 children)

"We did the back-of-napkin math on what ramping up this experiment to the entire brain would cost, and the scale is impossibly large — 1.6 zettabytes of storage costing $50 billion and spanning 140 acres, making it the largest data center on the planet."

Look at what they need to mimic just a fraction of our power.
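The quoted zettabyte figure is roughly reproducible. A hedged sketch, assuming the widely reported ~1.4 PB of data for the 1 mm³ sample and a typical adult brain volume of roughly 1.15 million mm³ (both inputs are assumptions here, not from the quote):

```python
# Back-of-napkin scale-up: per-mm^3 data volume times whole-brain volume.
sample_bytes = 1.4e15        # ~1.4 petabytes reported for the 1 mm^3 sample
brain_volume_mm3 = 1.15e6    # assumed adult brain volume, ~1.15 million mm^3

total_bytes = sample_bytes * brain_volume_mm3
print(f"{total_bytes / 1e21:.1f} ZB")  # ~1.6 zettabytes
```

Multiplying out gives about 1.6 zettabytes, consistent with the article's estimate.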

[–] SirEDCaLot@lemmy.today 1 points 6 months ago

In fairness, the scan required such astronomical resources because of how they were scanning it. They took the cubic-millimeter chunk and cut it into 5,000 super-thin slices, then did extremely high-detail scans of each slice. That's why they needed AI: to piece those flat layers back together into some sort of 3D structure.

Once they have the 3D structure, the scans are useless and can be deleted.

In time it should be possible to scan the tissue and get the 3D structure without such extreme data use.

[–] zagaberoo@beehaw.org 0 points 6 months ago (1 children)

And the whole human body, brain and all, can run on ~100 watts. Truly astounding.