I’ve explored a number of Instagram accounts for research purposes. Each tends to present its own content in reels. For the TallDevin account, I intentionally shared some images with potentially inflammatory language or hashtags, specifically an image of Bud Light-branded socks. I included a remark that what people do in socks is disgusting and added #budlight.
I’ve discovered my current batch of reels consists of:
- “Christians” and “conservatives” making anti-LGBT statements + “pro-life” propaganda.
- “Anti-vax” conspiracies about the COVID vaccines.
- Flat earth conspiracy and jokes about those beliefs.
- A strange woman who gets on all fours and barks at people to scare them.
- A woman farting in public places.
- Machines in factories.
- Cats and other small animals.
Inflammatory content can be fascinating, and it’s wild to see young people trying to grow their audiences through bigotry. I’m not sure how this usually works out, but I don’t plan to post hateful content. I suspect the algorithm is tied more closely to hashtags than to the meaning or intent of the post. Hate should not be necessary.
Imagine feeling inspired to make content in reaction to what you saw in reels. It’s not a leap, in my opinion. If you perceive that a certain variety of hateful post is successful, you might try the same method.
On some level, I’ve entered a scene from the movie Idiocracy. Something about guys getting hit in the balls? I guess the algorithm believes this is the content people want to see, so people naturally feel inclined to make more of that kind of stupid content. The human part makes sense, too: they want views, likes, and comments.
On traditional media platforms, a group of “elites” decided what was worth airing. And once they discovered what worked, experimentation usually took the form of subtle tweaks over time. A team of professionals focused on quality, producing a clear image with accurate color, sharp focus, and good sound. There was a common consensus that technical quality was important to maintaining viewership.
Yet modern, non-traditional Internet celebrities may find these technical quality concerns irrelevant to their subscribers. Entertainers like PewDiePie, who has 111 million YouTube subscribers, thrive regardless. I find the audio irritating, but his fans might disagree. And I do confess, I sometimes watch things because they are popular.
I was trained in the “craft” of traditional media. And while technical quality is important to the process, I also don’t want it to distract from the goal of using AI to supplement the content. If AI generates body parts, a background, or a fake pet, I want it to be somewhat obvious, because the purpose is not to deceive but to amuse.