

Current AI is a dead end. It's less intelligent than a fly brain, which, by the way, can solve mazes with just 160,000 neurons, remembers where the food is hours after discovering it, is self-replicating, and is a general intelligence.
I mean no harm.


Initially, x86 CPUs didn't have an FPU. It cost extra and was delivered as a separate chip (the x87 coprocessor).
Later came the GPU, which is just an overgrown SIMD FPU.
An NPU is a specialized GPU that operates on low-precision floating-point numbers and mostly does matrix multiply-and-accumulate operations.
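
To make that concrete, here is a minimal sketch of the multiply-and-accumulate kernel this kind of hardware spends its time on. It's plain scalar C for illustration only (the function name `matmul_accumulate` is mine, and real accelerators run this on fp16/bf16/int8 tiles in parallel units, not scalar float):

```c
#include <stddef.h>

/* Sketch of the core NPU workload: C = A * B + C.
 * A is M x K, B is K x N, C is M x N, all row-major.
 * Scalar float stands in for the low-precision formats real chips use. */
void matmul_accumulate(const float *A, const float *B, float *C,
                       size_t M, size_t N, size_t K)
{
    for (size_t i = 0; i < M; i++) {
        for (size_t j = 0; j < N; j++) {
            float acc = C[i * N + j];                /* accumulate into existing C */
            for (size_t k = 0; k < K; k++) {
                acc += A[i * K + k] * B[k * N + j];  /* multiply-and-accumulate */
            }
            C[i * N + j] = acc;
        }
    }
}
```

Essentially everything an "AI accelerator" does is this loop, tiled and parallelized.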
There is zero "neural" processing going on here. That would mean a chip operating on bursts of encoded analog spikes, within a power budget of about 20 W, able to adjust itself online, on the fly, without a few datacenters spending an excessive amount of energy just to update the model's weights.
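
For contrast, here is a hypothetical sketch of what "adjusting itself on the fly" could look like: a leaky integrate-and-fire neuron that communicates in spikes and updates its own weights online with a simple Hebbian-style rule. The names (`Neuron`, `neuron_step`) and the learning rule are mine, purely for illustration, not a model of any real chip:

```c
#include <stdio.h>

#define N_INPUTS 4

/* Hypothetical leaky integrate-and-fire neuron with online weight updates. */
typedef struct {
    float weights[N_INPUTS];
    float potential;   /* membrane potential */
    float leak;        /* per-step decay factor */
    float threshold;   /* spike threshold */
    float lr;          /* online learning rate */
} Neuron;

/* One timestep: integrate input spikes, fire if the threshold is crossed,
 * then reinforce the weights of the inputs that were active at spike time. */
int neuron_step(Neuron *n, const int spikes_in[N_INPUTS])
{
    n->potential *= n->leak;
    for (int i = 0; i < N_INPUTS; i++)
        if (spikes_in[i])
            n->potential += n->weights[i];

    if (n->potential < n->threshold)
        return 0;                           /* no output spike */

    n->potential = 0.0f;                    /* reset after firing */
    for (int i = 0; i < N_INPUTS; i++)      /* Hebbian-style update */
        if (spikes_in[i])
            n->weights[i] += n->lr * (1.0f - n->weights[i]);
    return 1;                               /* output spike */
}

int main(void)
{
    Neuron n = { {0.3f, 0.3f, 0.3f, 0.3f}, 0.0f, 0.9f, 1.0f, 0.05f };
    const int pattern[N_INPUTS] = {1, 1, 0, 1};

    for (int t = 0; t < 10; t++)
        printf("t=%d spike=%d w0=%.3f\n", t, neuron_step(&n, pattern), n.weights[0]);
    return 0;
}
```

The point of the contrast: the weights change locally, per spike, in the same loop that does the inference, rather than in a separate training run somewhere else.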
I'm dumbfounded that we are even considering overseas services; there is a literal datacenter boom going on right now. There are plenty of local providers, including massive ones like Google and Hetzner.
AWS, or the cloud in general, is in practice a money-extortion service for very large customers.