1. 9
  1.  

  2. 4

    It’s hard to take this seriously with that All Seeing Eye sitting at the top. Anatomy of an AI System (2018) has a more grounded take on the reality of contemporary “AI products”.

    1. 3

      Well, they are two very different perspectives. Pasquinelli offers an epistemological perspective, while Anatomy of an AI System is more a deconstruction of power structures in the material processes of AI production. I wouldn’t compare them directly, especially if you want to accuse Pasquinelli of not being grounded, since he writes theory.

      1. 5

        deconstruction of power structures in the material processes of AI production

        I’m sorry, but these sound like reverse buzzwords.

        1. 2

          Ah, the subtexts of zzub-words! Bargain jargon, deeply (learnedly?) discounted. Quite a mash-up.

          This post has me grinning ear to ear. It’s so much fun, I’m going to have to read it carefully, on my own time. I really can’t tell how seriously to take it, but I am going to do my best. Thanks for sharing, @chobeat !

          1. 1

            Why? They are very precise and specific terms.

            1. 1

              I understood zge to mean something like “buzzkill”. Made sense to me; Marxist jargon isn’t everybody’s cup of tea.

              1. 1

                Well, it’s not really Marxist. Different fields in philosophy and sociology use both “power structure” and “deconstruction”. “Deconstruction” is used and abused mostly in critical theory, for sure, but it’s not like they invented the term. I hope “material process” is not jargon now. Or have people started to believe that workers are just bots working in the digital space?

                1. 1

                  Fair enough, and bait politely declined. I believe that, in the US at least, most people are exposed to these terms, if at all, in a (sometimes implicitly) politicized context. Anyway, “jargon” doesn’t mean “nonsense”, it just means localized (and often opaque) terminology.

              2. 1

                That’s why I said “reverse” ^^

                But still: just as in marketing everything is “blockchain-based” when they mean they are using a non-standard database, or “machine learning” when they are applying basic statistical methods, I feel that on the one hand a term like “deconstruction of power structures” is (doubly) overused, while “material processes of AI production” mixes two different ideas (the “material process of production” and “AI”) in a way I have grown to expect from marketing and desperate academics.

                I’m actually particularly confused by the phrase “material processes of AI production”. While I agree that the terms are precise and specific, it’s their combination that irritates me. How is “AI” produced, any more than automation or the division of labour is produced materially? I haven’t finished reading the article yet; I’m just saying the phrase seems more obfuscating than illuminating.

                1. 2

                  while “material processes of AI production” mixes two different idea Why are they different? I’m a machine learning engineer and my RSI is very material. AI is not produced in the sky.

                  AI, both as a technological solution and as a social construct (i.e. the cost of maintaining the delusion that this stuff is magical and solves problems by itself), costs labor and has material externalities: pollution from its energy demand, exploitation of third-world workers for data labeling/crowdwork and the consequent reshaping of many economies and lives, impact on the hardware supply chain from the increased demand for specialized and non-specialized hardware and devices, and the material maintenance and protection of data centers, with their impact on local economies and environments. AI is extremely material, like all the software we produce. Just because a very narrow part of the product is purely digital doesn’t mean that the production process is ethereal and immaterial.

                  That’s without considering the structure necessary to support technical workers in their eccentric and demanding lifestyle, but that is not specific to AI, since it’s a trait of every form of software production.

                  1. 1

                    (you seem to have missed a newline)

                    I agree with what you say, except maybe for the choice of words, which reminds me of “buzzword” language. I forgot to add a “seems”, as in “seems to mix two different ideas”. The reason for that is that AI appears as an academic, mathematical or scientific endeavour (at least to some eyes, mine for example), while production on a material level is a far more fundamental aspect, but also in a different way. I really didn’t expect such a detailed answer, which I thank you for.

                    1. 2

                      The reason for that is that AI appears as an academic, mathematical or scientific endeavour (at least to some eyes, mine for example), while production on a material level is a far more fundamental aspect, but also in a different way. I really didn’t expect such a detailed answer, which I thank you for.

                      The whole article (and his work at large) is aimed at challenging this illusion. If you are interested in how AI (as an idea, more than in its practical applications) is socially perceived and potentially used to harm or abuse other humans, this paper will probably blow your mind: https://papers.ssrn.com/sol3/papers.cfm?abstract_id=3078224

            2. 1

              Fair enough. They both use a detailed diagram as a visual hook, which is why they look similar at a surface level.

              1. 1

                Pasquinelli loves this stuff; he has made others: http://kim.hfg-karlsruhe.de/evolution-of-machine-intelligence/

          2. 2

            Can anybody explain what on earth is going on in this thread?