Yes, artefacts are tools. But they can function as feedback functions, in which case they simply connect our outputs (like hammer blows) to the variables we control (the height of the nailhead above the surface), or as control systems, in which case they keep controlled variables at their specified reference states (I discuss this below).
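The distinction can be made concrete with a toy loop (an illustrative sketch of my own, not code from either of us; the gain and the linear "blow lowers nailhead" rule are made-up assumptions). The feedback function here is nothing more than the causal link from output back to the controlled variable:

```python
# Minimal sketch of one PCT control loop: hammering a nail.
# The "feedback function" is just the line marked below -- a causal
# connection from output to the controlled variable, not a controller.

def run_loop(reference=0.0, cv=10.0, gain=0.5, steps=50):
    """Drive the controlled variable (nailhead height) toward the reference."""
    for _ in range(steps):
        error = cv - reference      # perceived height minus reference
        output = gain * error       # control system's output (blow strength)
        cv = cv - output            # feedback function: each blow lowers the nail
    return cv

final_height = run_loop()   # approaches the reference (0.0)
```

Only the `error`/`output` lines do any controlling; the feedback function line merely transmits the effect.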
>>>> Agreed. But don’t underestimate the role of CVs in the design of feedback functions, and the role of conscious awareness in choosing how they are used; this is the essence of trickery and sabotage, but also of urban design and advertising, IMO. For example, if I design and build fun slides for kids then I am supporting my CV of praise and financial success, and supporting their CV of maximising fun. However, if the slide goes into a pool of crocodiles that can’t be seen from the top, and I know this and the kids don’t, then the feedback function is not simply a connection; it is the means to achieve a goal that I may want but they don’t. It is facilitating manipulation. We need to think about AI and digital media in this way - it looks like the feedback function we want from the outside - but once you get into it, it is a pool full of crocodiles!
This explains the “C” and the “G” (“giant” when C is large, I presume) in GVC. But why the “V”? Is collective control always virtual? Or is GVC a special case of collective control that is done by gigantic (G) rather than small collectives, virtually (V) rather than really?
>>> I am not wedded to the G, and the V is relative to the point of view of the modeller of the system. My CV of writing a sentence is virtual from the perspective of the neurons in my brain that are controlling various electrophysiological variables.
What is the difference between an intersubjective perception and a non-intersubjective perception? Aren’t all perceptions just subjective?
>>>> A concept is intersubjective because it exists in minds only and we use words to try to indicate this concept to one another. A perception of a chair is subjective because it is a perception of something physical.
This makes it sound like feedback functions can do the “drawing into” – controlling – and that sophisticated feedback functions can do it even better. But, of course, feedback functions don’t control unless, as in this case, they are control systems themselves.
>>> I would say that some feedback functions don’t need to be control systems in order to ‘draw into’. A ditch is not a control system but I can still fall into it, especially if you hide it under some fake grass you cheeky fellow…
The algorithms developed by social media companies are designed to control for sustained hits (the controlled perception) by varying the content that they show to the user of social media (their output). The algorithms control the user’s output (hits) by producing content that disturbs variables the user controls, such as the content in the “rabbit hole”.
This is the same approach to behavior control that is used in the rubber band demo. But in this case it is a person (E), not algorithms, controlling the behavior of another person (S). E controls S’s finger movements by disturbing the position of the knot that S is controlling. The feedback function that connects E’s output (E’s finger movements) to E’s controlled variable (S’s finger movements) is the knotted rubber bands. In this case, it is clearly E and not the feedback function that is controlling S’s finger movements.
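A quick simulation shows the point (an illustrative sketch of my own; the gains and the linear rubber-band geometry, with the knot midway between the two fingers, are made-up simplifications). The knotted bands are a passive feedback function, yet E uses them to place S's finger wherever E wants:

```python
# Toy rubber band demo: two coupled control loops, one passive
# feedback function (the bands put the knot midway between fingers).
# S controls knot position (reference 0, over the dot); E controls
# S's finger position (reference e_ref) by disturbing the knot.

def rubber_band_demo(e_ref=2.0, steps=200, gs=0.5, ge=0.2):
    """Return (S's final finger position, final knot position)."""
    s_finger, e_finger = 0.0, 0.0
    for _ in range(steps):
        knot = (s_finger + e_finger) / 2.0   # feedback function: the bands
        s_finger -= gs * (knot - 0.0)        # S keeps the knot over the dot
        e_finger -= ge * (e_ref - s_finger)  # E pulls so S's compensation lands on e_ref
    knot = (s_finger + e_finger) / 2.0
    return s_finger, knot

s_final, knot_final = rubber_band_demo()
# S's finger ends up at e_ref (2.0) while the knot stays over the dot (0.0)
```

Notice that the bands never "decide" anything: S ends up at E's reference because S's own control of the knot, routed through that passive connection, does the work.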
I’m sure you know all this
>>> Yep. But I think you are underestimating the power of the form of feedback functions; their role complements that of disturbances by making some CVs much more likely to meet and stay at their RVs than others.
but I think it would have improved your presentation enormously if you had made it clear that a feedback function, such as a social media algorithm, only controls (“drawing in” users) if it is a control system. Otherwise a feedback function is just a causal connection between the control system and the variable it controls.
>>> Again, you are underplaying the importance and form of a ‘causal’ connection, especially if that feedback function has been purposively designed by a control system (person). A nuclear bomb is not a control system but it can make quite a mess.
And I don’t think it’s appropriate to call the controlling done by social media algorithms “coercion” inasmuch as there is no physical force used when the algorithm fails to get a user “drawn in”. I would call the controlling done by social media algorithms “manipulation”. Coercion is what we call the controlling done by the POS fascists who are currently running the once great USA.
>>> Yep, ‘coercion’ seemed to work well in the title, but I explain that mostly this is social manipulation at various points in the talk. However, there are many situations where social manipulation can be much more damaging than coercion, depending on the degree of threat used for coercion, and the outcomes desired by the coercer or manipulator.
I think that consciousness (in the PCT sense) comes into play here only if the user is suffering in some way from being controlled by the algorithm.
>>> That is the point of my talk: to help people who get addicted. But I still don’t agree. Conscious reflection can pre-empt suffering, and it can also be used to improve relative wellbeing by attending proportionately to all needs/goals/CVs.
I think what you are advocating for here is more appropriately called “education”. Education does involve consciousness to the extent that learning is involved.
>>>> Exactly.
But education is learning that occurs prior to being in situations where a person is being manipulated (by things like advertising, propaganda or social media). It gives users the ability to notice that they are being controlled and to decide whether there is something better to do with their time.
>>> Absolutely.
Of course, fascists don’t care for education – especially in the humanities – so it will be tough to implement this solution to the social media problem here in the US. But such education would be nice to have once sanity returns, which it surely will, eventually.
>>> Hurray!