IT and Business Insights for SMB Solution Providers

Disinformation: Smarter Every Day

I know this is long. And it might be disturbing. But please take the time to read this and watch the linked videos. Disinformation in our online community is an extremely important topic. I've been meaning to blog about this for a long time.

One of my favorite channels on YouTube is Smarter Every Day with Destin Sandlin. If you're nerdy (and I know you are), check it out. He is literally a rocket scientist. He loves to play with super high speed video, chase the space station, and explain all kinds of things from a scientific perspective. 

His channel is https://www.youtube.com/channel/UC6107grRI4m0o2-emgoDnAA. I have no idea why he doesn't have a custom URL for that, but he has nine million subscribers and doesn't need my advice.

One of the projects he's taken on in 2019-2020 is analysis of why and how disinformation spreads on social media. He started with YouTube, chasing some weird stuff he noticed on Twitter.

With just a little investigation, he found massive manipulations of YouTube that were cleverly designed to get around YouTube's automated algorithms. He shows some amazing techniques that bots use to keep getting around YouTube's security.

Remember: The basic goal is for evil people to grab a few minutes of your time. When attention is the most important currency on the Internet, this grab matters. 

I highly recommend that you spend some time educating yourself on what's happening and how the evil robots are doing this. One clear lesson stands out: A massive amount of the engagement you see online is 100% fake. Fake videos are created by the dozens. Then they are loaded onto YouTube channels that have been purchased. Next, millions of fake likes and clicks are purchased on the gray/dark Internet. All these videos point to one another in their metadata.

Goal #1 appears to be hosting ads and getting paid ad revenue . . . but at this point virtually all of the "views" are from paid click farms.

Goal #2 is that one of the nearly-identical videos eventually beats the YouTube algorithm and is served up to real human beings. At this point, hyper-partisan people begin viewing and commenting. That's engagement, and YouTube rewards it.

Twitter has a similar story. Malicious people are absolutely attacking Twitter (and all the social media) in order to either drive behavior or simply create discord.

The challenge for the social media giants is to balance usability and security. And maybe security means authenticity. At some level, we assume that "popular" on social media means something. But automated accounts allow a great deal of content to be created without humans being involved.

That's not all bad. Bots can monitor volcano activity, weather, traffic, and so forth. So automation can be good and provide information people want. But, obviously, bots can be created FAST, in massive numbers. As of April 2019, Twitter was examining ten million new accounts per week - and 75% of them were removed! Obviously, those numbers are higher now.

There is clearly a war going on here. Artificial intelligence is used heavily on both sides. As a result, many (MANY) fake accounts are created and operate for some period of time before they are caught and closed down. In that window, an account might make a few thousand or a few million impressions.

One important piece of news: The good guys (including really smart people at NATO) are setting up accounts to attract robots so they can examine the behavior of evil robots. This is actual Spy-vs-Spy stuff.

Facebook is probably the worst platform when it comes to divisiveness and social disruption. Facebook is under attack from many fronts. They have famously outsourced a lot of their content filtering to specialized companies. And the crap that people want to post is extremely alarming: murder videos, animal torture, etc. The content is so disturbing that many of these human editors develop mental and emotional problems due to their exposure to this stuff.

I am grateful they keep this stuff off Facebook, and I'm sorry that someone is traumatized so I don't have to be.

The other war Facebook is fighting is the political and social war. Russia (the nation) and several other nation states are actively trying to get Americans and Europeans to fight amongst themselves. They have basically manipulated the entire populations of these countries to create the era of extremism we find ourselves in.

Trying NOT to contribute to extremism is a full-time job at Facebook. Most notably, Facebook has changed their ad policies to eliminate the worst of this. But a lot of the horrible stuff is posted as content rather than ads. Facebook has also created a great deal of transparency about ads: You can actually see every ad and who paid for it.

Facebook deleted over one million fake accounts per DAY as of April 2019. Of course it's more now. But many accounts are created by real people . . . and maintained at a minimal level until they are sold to evil companies. So "real" accounts become fake accounts.

Reddit is another massive home for disinformation. Reddit is very different from the other social media due to its forum format, built around individuals, communities, and moderators. This format makes coordinated attacks a little harder to execute, but the essentially text-based format makes it simple to create massive numbers of posts with bots. So it's all a numbers game.

Like Twitter, Reddit places a lot of emphasis on what's popular in order to share that content more widely. But they both want that popularity to be real (from real humans, not hired hands or robots).

Reddit nuked 944 trolls after the 2016 election. Since then, they have created a massive transparency area inside Reddit. In other words, you can go to r/technology and see posts with links to known troll accounts. The goal is to help users educate themselves about what these accounts look like. 

Rather than take down the fake accounts, Reddit simply calls them out and labels them. This allows researchers to analyze the behavior of the trolls. What you see over time is that conversations "go troll" when issues become simplified (no longer complex) and more toxic or aggressive. The attack strategy on Reddit is all about diverting conversations from honest and interactive to manipulated and divisive.

I'm not trying to summarize Destin's work. He goes into extreme detail and posts many, many links to resources and publications. I would encourage you to watch these videos and then check out all the fun stuff he does on his channel. 

Personally, I love complicated discussions. I would rather be involved in a six-sided discussion than a two-sided conversation. On top of all that, I can honestly say that the book Thinking, Fast and Slow by Daniel Kahneman will give you a really clear picture of why the imagery and the very nature of social media are so effective - even when they're evil and manipulative.

Again, I am asking you to put in some time here. Kahneman's book is 500 pages. It's all about how our brains work, and how we cannot control a lot of what our brains do. So we have to approach all social media with our attention set to high. Sadly, that means it cannot be a relaxing place to spend time.

Ironically, if you simply relax and enjoy social media, you will find yourself being manipulated and fed angry lies and trolling. That's not ultimately relaxing.

Here are the important videos. I listed the first one in the series last because it has the best advice about how to actually analyze your news feed.

YouTube

https://www.youtube.com/watch?v=1PGm8LslEb4 (about 20 minutes)

Twitter

https://www.youtube.com/watch?v=V-1RhQ1uuQ4 (about 30 minutes)

Facebook

https://www.youtube.com/watch?v=FY_NtO7SIrY (about 22 minutes)

Reddit

https://www.youtube.com/watch?v=soYkEqDp760 (about 26 minutes)

Why Your Newsfeed Sucks

https://www.youtube.com/watch?v=MUiYglgGbos (about 12 minutes)

All of your social media are literally under attack. And, as far as we know, there is no winning. The best we can hope for is to manage the attack, or the effects of the attack.

I hope you spend a little time raising your awareness.

I'm sad to say, I don't know where this is all going. But I believe we can make social media mostly good if we all put in some effort.

Take care my friends.

:-)


About the Author

Karl W. Palachuk is a technology consultant, author, speaker, trainer, and coach. He is the author of fifteen books, including Managed Services in a Month and The Network Documentation Workbook. He has built several successful businesses, including two managed services companies. Karl is a frequent trainer and speaker in the SMB community. His popular blog can be found at SmallBizThoughts.com. He has more than twenty years' experience as an I.T. professional and serves on advisory panels for several hardware and software companies.

ChannelPro SMB Magazine