• 0 Posts
  • 45 Comments
Joined 1 year ago
Cake day: June 30th, 2023

  • We’re looking at this from opposite sides of the same coin.

    The NN graph is written at a high level in Python using frameworks (PyTorch, TensorFlow; man, I really don’t miss TF after jumping to Torch :) ).

    But the calculations don’t execute in the Python interpreter. Sure, you could write it to do so, but it would be sloooow. The actual network of calculations happens inside the framework internals, which are C++. Then, depending on the hardware you want to run it on, you go down to BLAS or CUDA, etc., all of which are written in low-level languages like Fortran or C.
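
    A quick way to feel that split (a minimal sketch, assuming PyTorch is installed; the size and timings are illustrative):

    ```python
    import time

    import torch

    n = 128  # kept small so the pure-Python version finishes in reasonable time

    a = torch.rand(n, n)
    b = torch.rand(n, n)

    t0 = time.perf_counter()
    c = a @ b  # dispatched to compiled C++/BLAS (or CUDA) kernels
    print(f"torch matmul: {time.perf_counter() - t0:.5f} s")

    t0 = time.perf_counter()
    # the same math, executed step by step by the Python interpreter
    c_py = [[sum(a[i, k].item() * b[k, j].item() for k in range(n))
             for j in range(n)]
            for i in range(n)]
    print(f"pure Python:  {time.perf_counter() - t0:.5f} s")
    ```

    On a typical machine the interpreter version loses by orders of magnitude, which is exactly why the frameworks push the real work down the stack.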

    NumPy fits in all throughout this stack, and its performant pieces are mostly implemented in C.

    Any way you slice it: the post I was responding to argued that AI IS CODE, and there are no two ways about that. It’s also the weights, biases, and activations of the models that have been trained.

  • No. Strictly and technically speaking, LLMs absolutely fall under the category of AI. You’re thinking of AGI, which is a subset of AI, and which LLMs will be a necessary but insufficient component of.

    I’m an AI Engineer; in my circles, I’ve taken to calling AI “Algorithmic Intelligence” rather than “Artificial Intelligence.” It’s a far more fitting term for what is actually happening. But until the Yanns and Ngs and Hintons of the field start calling it that, we’re stuck with the old name.

    I’d say it actually goes further. We have plenty of evidence that simply measuring a phenomenon changes the phenomenon. From a quantum mechanics perspective we say things like “measuring the phenomenon collapses its wave function to a single state.”

    When a quantum system is measured, its wave function, which represents a superposition of multiple potential outcomes, collapses to a single definite state corresponding to the result of the measurement.

    All macroscopic phenomena comprise nanoscopic quantum phenomena.

    Super fucking weird to think about. The classic undergrad physics experiment is the double-slit experiment: particles like electrons create an interference pattern when unobserved, acting like waves and passing through both slits at once. However, when we measure which slit a particle goes through, this wave-like behavior disappears, and the particle behaves as if it went through only one slit. This shows that measurement collapses the particle’s wave function from multiple possibilities into a single, definite state.
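
    You can even see the logic in a toy model. Here’s a sketch of the far-field two-slit intensity, assuming NumPy; the wavelength and slit separation are made-up illustrative values, and the single-slit envelope is ignored:

    ```python
    import numpy as np

    # Normalized two-slit interference:
    # intensity ~ cos^2(pi * d * sin(theta) / wavelength)
    wavelength = 500e-9                   # m, green light (illustrative)
    d = 50e-6                             # m, slit separation (illustrative)
    theta = np.linspace(-0.02, 0.02, 9)   # screen angles, radians

    unobserved = np.cos(np.pi * d * np.sin(theta) / wavelength) ** 2
    observed = np.full_like(theta, 0.5)   # which-slit measured: fringes wash out

    for t, u, o in zip(theta, unobserved, observed):
        print(f"theta={t:+.4f} rad  fringes={u:.2f}  no fringes={o:.2f}")
    ```

    The cos² fringes are the superposition at work; the flat line is what’s left once the which-slit measurement collapses it.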

    Similarly, despite how they’re depicted in early exposures to chemistry, electrons don’t “orbit” the nucleus the way planets orbit their stars. Rather, there are regions around the nucleus where an electron is more probably found, and these misleadingly named “orbitals” vary in shape.
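
    For concreteness, here’s the simplest case, hydrogen’s 1s orbital, treated as a probability cloud rather than an orbit (a sketch assuming NumPy):

    ```python
    import numpy as np

    a0 = 5.29e-11                       # Bohr radius, m
    r = np.linspace(0.1, 5.0, 50) * a0  # sample distances from the nucleus

    # 1s radial probability density: P(r) = (4 r^2 / a0^3) * exp(-2 r / a0)
    P = (4 * r**2 / a0**3) * np.exp(-2 * r / a0)

    # The density peaks at r = a0: the electron is most probably found there,
    # but there is no orbit, just a distribution over all radii.
    print(f"most probable radius: {r[np.argmax(P)] / a0:.2f} * a0")
    ```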

    Finally, we have the Heisenberg Uncertainty Principle, which states that we cannot simultaneously pin down both a particle’s momentum and its position: the more precisely we measure one, the less precisely we can know the other. And this trade-off isn’t just our instruments disturbing the particle; it’s baked into the particle’s wave nature itself.
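
    In symbols, the standard textbook form, with Δx and Δp the uncertainties in position and momentum and ℏ the reduced Planck constant:

    $$ \Delta x \,\Delta p \;\ge\; \frac{\hbar}{2} $$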

    Here’s a macroscopic example of how measuring/observing a thing changes the thing. When you measure an object’s temperature with a thermometer, the object transfers thermal energy to the thermometer, or receives thermal energy from it, because the thermometer has to be in thermal contact with the object and reach equilibrium with it. The object’s total energy has now changed; the change may be trivial, but it is non-zero. Measuring/observing the object in this way has changed it.
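
    A back-of-the-envelope version with a lumped heat-capacity model (all numbers illustrative):

    ```python
    # The thermometer equilibrates with the object, so the object's
    # temperature after the measurement differs from before it.
    C_obj = 836.0   # J/K, roughly 200 g of water
    C_th = 2.0      # J/K, a small glass thermometer (illustrative)
    T_obj = 60.0    # deg C, object before measurement
    T_th = 20.0     # deg C, thermometer starts at room temperature

    # Energy conservation: C_obj * (T_obj - T_eq) = C_th * (T_eq - T_th)
    T_eq = (C_obj * T_obj + C_th * T_th) / (C_obj + C_th)

    print(f"equilibrium reading: {T_eq:.2f} C")           # ~59.90
    print(f"object cooled by:    {T_obj - T_eq:.3f} K")   # tiny, but non-zero
    ```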

    omg it goes deeper. I love physics. Classical mechanics models work well when we want to explain and predict macroscopic, limited chain-of-events phenomena. We can predict with high confidence how much momentum and energy a 2000 kg car traveling at 100 km/h will transfer to a stationary object when they collide, assuming a perfectly inelastic collision, spherical cows, etc. But no classical model can predict, with any confidence, how the air molecules displaced by this collision in Nuremberg, Germany will help spawn tornadoes six months later in Wichita, Kansas, USA. That’s the butterfly effect.
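
    The classical side really is that tractable. Here are the numbers for that collision, assuming the car comes to a dead stop:

    ```python
    m = 2000.0     # kg, car mass
    v = 100 / 3.6  # 100 km/h in m/s, ~27.8

    p = m * v            # momentum carried into the collision, kg*m/s
    ke = 0.5 * m * v**2  # kinetic energy dissipated in a dead stop, J

    print(f"momentum: {p:,.0f} kg*m/s")      # ~55,556
    print(f"energy:   {ke / 1000:,.0f} kJ")  # ~772 kJ
    ```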

    Ultimately, this interplay between measurement and outcome highlights a fundamental truth in both quantum mechanics and chaos theory: the universe resists prediction at every scale, though for different reasons. Subatomic particles are indeterminate in principle, their behavior influenced by the act of observation itself; chaotic systems are deterministic but so sensitive to initial conditions that tiny changes cascade into enormous consequences, making them unpredictable in practice. This intertwining of uncertainty and complexity underscores the limits of our predictive models, whether they describe the quantum realm or the macroscopic world.