
Big news from Facebook today, and it’s a doozy. The headline & the lede:

Facebook Bans White Nationalism and White Separatism

After a civil rights backlash, Facebook will now treat white nationalism and separatism the same as white supremacy, and will direct users who try to post that content to a nonprofit that helps people leave hate groups.

I’d like to thank Joseph Cox for bringing this news to my attention on Twitter. You can find his thread here.

My take on this: atta’corp, Facebook! I’ve been bellyaching at you for so long, it’s nice to finally see you do the right thing.

But as my 6yo would say: Geez Louise, Facebook. Took ya long enough.

Didja ever think the real world would look like a SharePoint Governance project run amok? I sure didn’t. I thought there were Pros. In the room. Somewhere.

But there weren’t. Yet we know what the Pro version of a Facebook or SharePoint looks like, don’t we?

Yes we do. You see, at work we -we free humans engaged in a cooperative profit-seeking endeavor under the banner of an LLC or LP- wouldn’t allow white supremacy themes on our screens. The Googlers sure don’t when they’re at work. We know that thanks to Googlers who have left Google’s employ and written up stories about their work screens.

And at home, you and I wouldn’t allow it on our screens either. Why? Because we’re adults with a sense of morals and purpose, and we love life, not death. I don’t care who you are, I got that in common with you.

We love life. We have loved ones. Start there. That’s what the technologists would call “First Principles,” so let’s push the “First Principles” reset button and say: We Love Life, and We Have Loved Ones.

I don’t know if there’s a “2nd Principles,” but step two should be: put Mein Kampf & the other grab bag of nihilist shit back in a dusty corner of the library, or in the museum with the relics of the other vanquished foes from the worst parts of our 19th & 20th centuries. That’s what you’d do at home. That’s what we’d do at work. Remove it from the indexes. Censor the hell out of it. Make it hard to find.

Right. So Step 2 is don’t let it breathe. That’s the last thing you or I want for this world we’re in, for ourselves & our loved ones. For the New Zealand dead -and in their memories- don’t let it breathe.

Now, Step 3 is more complicated. It kinda goes to the subtle point I’m making as I compose my words on your screen. Step 3 is to understand yourself and others in relationship to the screen, and the folks who make the screen show us stuff. Step 3 then is to stop thinking of yourselves as Facebook’s Community, and to get pissed when you see “Our Facebook Community.” Because you’re not Facebook’s community. You’re you. A free human with hopes, dreams, fears and love. You’re not the property nor the subject of an unaccountable corporation worth more than $450 billion that employs only about 30,000 people and contributes tremendous damage to our global society while all value accrues to its shareholders. That’s Step 3. You’re you. Just understand that.

[Image: “Our Community”]

[Image: self/other Venn diagram]

Step 4 on the path to righting this horrible SharePoint Governance project run amok is to understand yourself and your relationship to the other you’re reading on your screen. Your task is to think about your self-interests, and the interests of the others. And then your task is to find something we in the public biz like to call “common ground” or “common interests.” Such things might include, but are not limited to, Maslow’s Hierarchy of Needs. Safety. Food. Shelter.

[Image: self/other Venn diagram]

Step 5 on the road to SharePoint Governance reform is a little out there, but roll with me, fam. Step 5 is this: understand that what many call “the nationalization of our politics” is actually the capture of them. By the people who control the screens. Just mull on that for a bit; we don’t have to tackle that one until we’re done with Steps 1-4.

That’s all for now, fam, but I just want you to remember this: it’s only here, in this screen, where we’re up against a force that controls the screen -a force informed by libertarian politics and incented by dollar bills- and it’s only here where we get confused. Only here.

Peace to you and yours, fam. Stay lit! ☮

Writing in 1989, moral philosopher Sissela Bok tells us:

Imagine a society, no matter how ideal in other respects, where word and gesture could never be counted upon. Questions asked, answers given, information exchanged—all would be worthless. Were all statements randomly truthful or deceptive, action and choice would be undermined from the outset. There must be a minimal degree of trust in communication for language and action to be more than stabs in the dark. This is why some level of truthfulness has always been seen as essential to human society, no matter how deficient the observance of other moral principles. Even the devils themselves, as Samuel Johnson said, do not lie to one another, since the society of Hell could not subsist without truth any more than others.

When I look at my screen 30 years later, I see this effect -this collapse- all the time. And the only instance of it I’m comfortable, as a moral agent, talking about is Baby Getting Cheesed.

Last Saturday I had a moment of pause, so I looked at my screen. The screen showed me an adult person taking a slice of yellow cheese and tossing it on a surprised baby’s face. It made a wet plop sound as it stuck to the startled baby’s face. The video ended.

I didn’t feel upset or outraged by the act itself. It was, in a way, cute, and it tugged my dad heartstrings. I remember my son at that age. I wouldn’t have tossed a cheese slice on his face, but I played little games with him, like pretending I’d eat his foot, just so I could get a laugh or smile out of him. The cheese-slice-on-baby-face schtick was odd, but it was also endearing in a way.

What bothered me most about the baby getting cheesed was that someone -or perhaps the algorithm itself- had decided to put it on my screen. To get me to consume it. To please me and get me to share it. I realized instantly why the baby getting cheesed had upset me: I was staring a moral hazard in the face.

In general, tossing cheese slices onto the faces of babies is a Bad Thing. It’s not something you or I, as moral adults, would encourage. It’s not something we’d do in our own homes. It’s not something we’d do to our friends’ children, our grandchildren, our niece or nephew. It’s not something any baby care book would recommend. You’d be hard-pressed to find a parenting or caregiver expert who’d tell you that throwing a cheese slice on a baby’s face was a Good Thing To Do. Yet here I was, looking at my screen, reading a piece that voiced great hilarity and mirth at the baby getting cheesed. The video had been viewed 8 million times, and dozens of copycat videos had been made, the writer told me. Most replicas were made by parents. Like me. The whole thing had “gone viral,” in the words of the privatized commons.

That horrified me. So I asked myself: why had the baby getting cheesed gone viral?

I’m no behavioral scientist; my credentials in science, law, and sociology are pitiful. What I do know a lot about is how people use technology, and what might motivate the ways they use it. And I know how to use my sense of morality in public and private spaces.

Knowing that most people, in the privacy of their homes or out in public with their child, would elect not to throw cheese on their babies’ faces, or to celebrate that others had done so, I realized that the behavior I was seeing on my screen was being induced by something. Encouraged by an unseen hand. By some perverse economic logic at work there, in my screen.

It was being encouraged by the app itself. In my case, that app was Twitter. But it doesn’t really matter. All the apps encourage sharing. They live and die by what we share. And they reward us for sharing. In Twitter’s case, the reward is a valueless form of currency: a like, or a retweet, or maybe a reply. All of these things are bundled up and renamed from what they were (verbs signifying actions performed on an item of information) into something new: engagement.

Engagement is the coin of the realm of our screens. It’s the engine celebrated by the bit-tycoons and those who write about them for a living. It’s the core economic logic in our screens. To keep us engaged. To further that engagement. To take more of our attention. To ✨razzle-dazzle us with pleasing animations and unique experiences.

And also, to get us to do things we wouldn’t normally do. 

Notice the deception therein. As people, as normal moral beings in a real physical place, we’d probably not cheese the baby’s face, and, more than that, we’d probably condemn or shun others who did. We sure as hell would not yield to a corporation asking us to throw cheese on our baby’s face, film it, and then put it on screens all over the world.

But in the deceptive hall of mirrors that is social -where sharing is effortless and the twin to the moral hazards it produces- we do exactly that. We grab a real slice of yellow cheese from the fridge, toss it on the baby’s face, then upload the video. For nothing and no reason at all except to accrue a meaningless currency.

To top it all off, the original cheese video -supposedly posted by a brother of the baby- was itself a deception. It had been downloaded and stolen from Facebook. Again: why? To perform. To steal a little authenticity for the purpose of accruing likes.

I think we’re in dangerous territory here. My sense is that this unvirtuous cycle could devolve very quickly into chaos. We’re seeing more and more bad actors use these exploitative software systems to amplify -and indeed induce- bad behavior. The same thing happened with the Momo hoax, which is no longer a hoax but a very real self-harm thing frightening parents of 3rd graders at my kid’s school. These patterns look similar to the ones that preceded violence in Myanmar and India. And that’s frightening.

Most importantly, we can’t depend on any of these apps to regulate or modify the inducement logic behind the behavior their users exhibit. The app makers benefit from inducing certain behaviors in us. We should have learned that lesson as far back as 2016. We should have learned it in 2017 and 2018, especially after violence took people’s lives in Myanmar. But the app makers have no interest in fixing this, and there’s no reason to trust them to fix it; they’ve let us down too many times already. We’ve seen them spread lies and apologize for the consequences while engagement keeps rising. In fact, engagement forces the opposite logic on these businesses: don’t fix it. Let it spread. We’re making money, so who cares?

Bok, writing with moral clarity and force, warns us again:

A society, then, whose members were unable to distinguish truthful messages from deceptive ones, would collapse. But even before such a general collapse, individual choice and survival would be imperiled. The search for food and shelter could depend on no expectations from others. A warning that a well was poisoned or a plea for help in an accident would come to be ignored unless independent confirmation could be found. All our choices depend on our estimates of what is the case; these estimates must in turn often rely on information from others. Lies distort this information and therefore our situation as we perceive it, as well as our choices. A lie, in Hartmann’s words, “injures the deceived person in his life; it leads him astray.”