• u_tamtam · 1 year ago

This is the definition of complexity, isn’t it? The point here is that we can’t scale up our understanding at the small level to make sense of the bigger picture. Having worked with (much simpler) artificial neural networks myself, I think it’s very much correct and to the point to say that “we don’t know how it works”. I would even go further and claim that we will never fully know how it works: the weights in the network in essence form structures that do what they do, and that we can recognize by analogy (e.g. logic gates, contour extractors, …), but this is an anthropomorphic approximation which moreover only holds in a certain range of values / under a certain set of conditions (see the sketch below). Had we a formal definition of what the weights represent, we would then be dealing with a (much simpler and more efficient) algorithm in the traditional sense, with cleanly delineated and rigorously defined specialized functions.
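
To make the “anthropomorphic approximation” point concrete, here’s a minimal sketch (hand-picked weights, purely illustrative): a single sigmoid neuron that reads as an AND gate on inputs from {0, 1}, but whose gate-like behaviour evaporates as soon as the inputs leave that range.

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# Hand-picked weights: on inputs drawn from {0, 1}, this single neuron
# behaves like an AND gate (output close to 0 or close to 1).
w1, w2, b = 10.0, 10.0, -15.0

def neuron(x1, x2):
    return sigmoid(w1 * x1 + w2 * x2 + b)

# The "logic gate" reading holds on the intended domain...
for x1, x2 in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    print(f"AND({x1}, {x2}) ~ {neuron(x1, x2):.3f}")
# AND(0, 0) ~ 0.000 ... AND(1, 1) ~ 0.993

# ...but breaks down outside the range the analogy was built on:
print(neuron(0.8, 0.8))  # ~0.731, neither clearly 0 nor 1
print(neuron(3.0, 0.0))  # ~1.000, "true" even though one input is "false"
```

The weights really do compute *something* everywhere; “AND gate” is just our label for what they do over a narrow slice of input space.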