Neural nets are typically implemented in C or C++; frameworks like Torch or TensorFlow then build on top of that, exposing higher-level APIs to languages like Python (most commonly) or JavaScript.
There are some other nn implementations in Rust, C++, etc.
You seem like a good person and I wish Lemmy had a way to follow people. Need more positivity in the feed. Keep it up, friend.
I don’t care. Will always be Nome. Fuck Pedo Stallman’s preference.
Costco’s soft-serve is way better than McD’s and actually is cheap.
I know more humans that fit that description than language models.
No. Strictly and technically speaking, LLMs absolutely fall under the category of AI. You’re thinking of AGI, which is a subset of AI, and which LLMs will be a necessary but insufficient component of.
I’m an AI Engineer; I’ve taken to, in my circles, calling AI “Algorithmic Intelligence” rather than “Artificial Intelligence.” It’s a far more fitting term for what is happening. But until the Yanns and Ngs and Hintons of the field start calling it that, we’re stuck with it.
Bullshit. Developers never make mistakes. N.E.V.R.
No, but it’s basically a “I can use it to build my billion-dollar business and keep the profits if I want” license. The only real catch is that if I decide to modify the code and distribute it, I’m required by the license to share those changes with whoever gets the modified version. There’s nothing in the GPL that stops me from being a downstream freeloader, and I can stay on whatever version I like—no one’s forcing me to update to newer ones with terms I don’t agree with. Forking and modifying for my own needs is totally fine, as long as I slap the same GPL on the changes if I hand them out.
You can scan before the encryption step. It defeats the purpose of the encryption, since only the privileged actor gets plaintext while everyone downstream gets encrypted bytes, but technically it’s possible.
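Not that it’s a good idea, but here’s a minimal sketch of what “scan before encrypt” looks like, assuming a hypothetical scan_for_prohibited_content() check and using Python’s cryptography package (Fernet) for the encryption step:

```python
from cryptography.fernet import Fernet

def scan_for_prohibited_content(plaintext: bytes) -> bool:
    # Hypothetical stand-in for whatever check the privileged actor
    # runs against the plaintext before it ever gets encrypted.
    return b"forbidden" in plaintext

def send_message(plaintext: bytes, key: bytes) -> bytes:
    if scan_for_prohibited_content(plaintext):
        raise ValueError("message flagged before encryption")
    # Everyone downstream of this call only ever sees ciphertext.
    return Fernet(key).encrypt(plaintext)

key = Fernet.generate_key()
ciphertext = send_message(b"hello there", key)
```

The point is just that whoever runs the scan sees the plaintext; everything after the encrypt() call only ever sees ciphertext.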
It’s only a matter of time until a vulnerability in that privileged access is found and silently exploited by a nefarious monkey, and that’s precisely why adding backdoors should never be done.
I’d say it actually goes further. We have plenty of evidence that simply measuring a phenomenon changes the phenomenon. From a quantum mechanics perspective we say things like “measuring the phenomenon collapses its wave function to a single state.”
When a quantum system is measured, its wave function, which represents a superposition of multiple potential outcomes, collapses to a single definite state corresponding to the result of the measurement.
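In standard Dirac notation that’s just (nothing beyond what’s stated above):

$$|\psi\rangle = \sum_i c_i\,|i\rangle, \qquad P(\text{outcome } i) = |c_i|^2,$$

and after a measurement that returns outcome i, the system is left in the single state |i⟩.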
All macroscopic phenomena comprise nanoscopic quantum phenomena.
Super fucking weird to think about. The classic undergrad physics experiment is the double-slit experiment— particles like electrons create an interference pattern when unobserved, acting like waves and passing through both slits at once. However, when we measure which slit a particle goes through, this wave-like behavior disappears, and the particle behaves as if it went through only one slit. This shows that measurement collapses the particle’s wave function from multiple possibilities into a single, definite state.
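For the two-slit case that superposition is (again, just the textbook sketch):

$$|\psi\rangle = \tfrac{1}{\sqrt{2}}\bigl(|\text{left slit}\rangle + |\text{right slit}\rangle\bigr),$$

and the interference pattern comes from the cross term between the two paths. A which-slit measurement forces the state into |left slit⟩ or |right slit⟩ (probability 1/2 each), the cross term vanishes, and with it the interference.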
Similarly, despite being depicted as such in early exposures to chemistry, electrons don’t “orbit” the nucleus like planets do their stars—rather they have regions around the nucleus in which they are more probably found. These misleadingly named “orbitals” vary in shape.
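For example, for hydrogen’s ground state (the 1s “orbital”), the radial probability density is

$$P(r) = \frac{4}{a_0^3}\, r^2\, e^{-2r/a_0},$$

which peaks at the Bohr radius a₀ ≈ 0.529 Å: the electron is most likely to be found around that distance from the nucleus, but it isn’t circling on a track there.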
Finally, we have the Heisenberg Uncertainty Principle, which says we can’t simultaneously know both a particle’s momentum and its position with arbitrary precision: the more precisely we pin down one, the more uncertain the other becomes, in part because the act of measuring (observing) the particle disturbs it.
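Quantitatively:

$$\Delta x\,\Delta p \;\geq\; \frac{\hbar}{2},$$

so the more tightly you pin down the position, the larger the spread in momentum has to be, and vice versa.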
Here’s a macroscopic example of how measuring/observing things changes the thing. When you measure the temperature of an object with a thermometer, the object transmits thermal energy to (or receives it from) the thermometer, because the thermometer has to come into contact and reach thermal equilibrium with the object. The object’s total energy has now changed; even if the change is trivial, it’s non-zero. Measuring/observing the object in this way has changed it.
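You can even put a number on it: once they reach equilibrium, the shared final temperature is the heat-capacity-weighted average

$$T_f = \frac{C_{\text{obj}}\,T_{\text{obj}} + C_{\text{therm}}\,T_{\text{therm}}}{C_{\text{obj}} + C_{\text{therm}}},$$

so unless the thermometer happened to start at exactly the object’s temperature, the object ends the measurement at a different temperature (and energy) than it started with.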
omg it goes deeper. I love physics. Classical mechanics models work well when we want to explain and predict macroscopic, limited chain-of-events phenomena. We can predict with high confidence how much force and energy a 2000 kg car traveling at 100 km/h will impart to a stationary object when they collide, assuming a perfectly inelastic collision, spherical cows, etc. But no classical model lets us predict, with any confidence, how the air molecules displaced by this collision in Nuremberg, Germany will create tornadoes six months later in Wichita, Kansas, USA. That’s the butterfly effect.
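The first kind of prediction really is that easy; for the 2000 kg car at 100 km/h (about 27.8 m/s) the kinetic energy delivered to the collision is

$$E_k = \tfrac{1}{2} m v^2 = \tfrac{1}{2}(2000\ \text{kg})(27.8\ \text{m/s})^2 \approx 7.7\times 10^{5}\ \text{J}.$$

The second kind is hopeless, because tiny errors in the initial conditions grow exponentially; that sensitivity is the butterfly effect.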
Ultimately, this interplay between measurement and outcome highlights a fundamental truth in both quantum mechanics and chaos theory: the universe resists prediction at every scale. The behavior of subatomic particles is genuinely probabilistic and influenced by the act of observation, while the butterfly effect shows that in complex classical systems tiny changes can snowball into huge consequences. Both underscore the limits of our predictive models, whether they pertain to the quantum realm or the macroscopic world.
The notion that our universe is perfectly causal, to the point that you could predict exactly when and where a specific atom will decay, is pretty much debunked at this point. Not that living in a probabilistic, quantum-physics universe is any fucking easier to comprehend, but them’s be the cards we were dealt.
Might be the only job that’s left after StarNet takes over.
Can’t wait for Nintendo to sue Microsoft because VS Code can be used to edit save files.
If I want to wear my sunglasses while I’m watching a movie in the cinema because I have a light-sensitivity condition—usage of the sunglasses alters my perception of the film without changing the permanent media storage of the film—am I cheating and subject to copyright infringement action?
Check your pocket for the fourth time. Might actually be there this time.
Stop giving me Thermo nightmares; I lived through that shit already, I don’t need to sleep through it too.
How are you planning on handling the induced phase shifts due to the rapid polarity reversals that occur in the transgravitational electron flux arrays? I mean, this is a nonstarter if you can’t get that to work—the electropositron fields are going to decay too quickly to be useful otherwise and the quite-expensive phosphokinesis-generator will be wasted.
We’re looking at this from opposite sides of the same coin.
The NN graph is written at a high level in Python using frameworks (PyTorch, TensorFlow—man I really don’t miss TF after jumping to Torch :) ).
But the calculations don’t execute in the Python interpreter—sure, you could write it to do so, but it would be sloooow. The actual network of calculations happens within the framework internals, which are C++. Then, depending on the hardware you want to run it on, you go down to BLAS or CUDA, etc., all of which are written in low-level languages like Fortran or C.
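A minimal sketch of that split using PyTorch (the layer sizes here are made up for illustration):

```python
import torch
import torch.nn as nn

# Defined at a high level in Python...
model = nn.Sequential(
    nn.Linear(784, 128),
    nn.ReLU(),
    nn.Linear(128, 10),
)
x = torch.randn(32, 784)

# ...but the actual matrix multiplies run in the framework's C++ internals,
# dispatched to CPU BLAS kernels or CUDA kernels depending on the device.
device = "cuda" if torch.cuda.is_available() else "cpu"
y = model.to(device)(x.to(device))
print(y.shape)  # torch.Size([32, 10])
```

None of the heavy math in that forward pass happens in Python itself; the Python side just describes the graph and hands tensors to compiled kernels.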
NumPy fits in all throughout this stack, and its performance-critical pieces are mostly implemented in C.
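Same story with NumPy; a one-liner like this never loops over elements in Python, it hands the whole multiply to the underlying compiled BLAS routine:

```python
import numpy as np

a = np.random.rand(1024, 1024)
b = np.random.rand(1024, 1024)
# The @ operator dispatches to the BLAS matrix-multiply under the hood;
# no Python-level loop ever touches the individual elements.
c = a @ b
print(c.shape)  # (1024, 1024)
```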
Any way you slice it: the point of the post I was responding to is that AI IS CODE. No two ways about that. It’s also the weights and biases and activations of the models that have been trained.